
[BUG] Writing Parquet map(map) column can not set the outer key as non-null. #9129

Closed
res-life opened this issue Aug 29, 2023 · 1 comment · Fixed by #9147

res-life commented Aug 29, 2023

Describe the bug
Writing a map(map) column fails with the following error:

key column can not be nullable

SchemaUtils.writerOptionsFromField cannot handle a map(map) column.
SchemaUtils uses structBuilder to simulate mapBuilder, but it does not mark the outer key as non-null; the inner key is handled correctly. In Parquet, a map is encoded as a repeated key_value group whose key field is required, so the key must be non-nullable at every nesting level, including when the key is itself a map.
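
For reference, Spark never allows null map keys, so for a map(map) column the outer key (the inner map) must also be declared non-null in the writer options. A minimal sketch of the column type involved, assuming the default schema inference for the repro below:

import org.apache.spark.sql.types._

// Type of column c1 below: a map whose keys are themselves maps.
// Map keys are non-nullable at both levels, so the generated writer
// options must mark the outer key (the inner map) as non-null too.
val c1Type = MapType(
  keyType = MapType(IntegerType, IntegerType, valueContainsNull = false),
  valueType = IntegerType,
  valueContainsNull = false)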

Steps/Code to reproduce bug

val data = Seq(
  (Map(Map(111->111, 112->112) -> 1, Map(121->121, 122->122) -> 2), 1),
  (Map(Map(211->111, 212->112) -> 1, Map(221->121, 222->122) -> 2), 2)
)
val df = spark.createDataFrame(data).toDF("c1", "c2")
df.write.parquet("/tmp/a.parquet")

Error:

Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (chongg-pc executor driver): java.lang.IllegalArgumentException: key column can not be nullable
	at ai.rapids.cudf.ColumnWriterOptions.mapColumn(ColumnWriterOptions.java:530)
	at com.nvidia.spark.rapids.SchemaUtils$.writerOptionsFromField(SchemaUtils.scala:298)
	at com.nvidia.spark.rapids.SchemaUtils$.$anonfun$writerOptionsFromSchema$1(SchemaUtils.scala:329)
	at scala.collection.Iterator.foreach(Iterator.scala:943)
	at scala.collection.Iterator.foreach$(Iterator.scala:943)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
	at org.apache.spark.sql.types.StructType.foreach(StructType.scala:102)
	at com.nvidia.spark.rapids.SchemaUtils$.writerOptionsFromSchema(SchemaUtils.scala:327)
	at com.nvidia.spark.rapids.GpuParquetWriter.<init>(GpuParquetFileFormat.scala:374)
	at com.nvidia.spark.rapids.GpuParquetFileFormat$$anon$1.newInstance(GpuParquetFileFormat.scala:287)
	at org.apache.spark.sql.rapids.GpuSingleDirectoryDataWriter.newOutputWriter(GpuFileFormatDataWriter.scala:235)
	at org.apache.spark.sql.rapids.GpuSingleDirectoryDataWriter.<init>(GpuFileFormatDataWriter.scala:217)
	at org.apache.spark.sql.rapids.GpuFileFormatWriter$.executeTask(GpuFileFormatWriter.scala:326)
	at org.apache.spark.sql.rapids.GpuFileFormatWriter$.$anonfun$write$15(GpuFileFormatWriter.scala:266)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:136)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Environment details
Branch 23.10

@res-life res-life added bug Something isn't working ? - Needs Triage Need team to review and classify labels Aug 29, 2023
@sameerz sameerz removed the ? - Needs Triage Need team to review and classify label Aug 30, 2023

sameerz commented Aug 30, 2023

Related to rapidsai/cudf#14003
