
[BUG] join_test.py::test_struct_self_join[IGNORE_ORDER({'local': True})] failed in spark330 #4455

Closed · Fixed by #4458

pxLi opened this issue Jan 5, 2022 · 0 comments

Labels: bug (Something isn't working)

pxLi (Collaborator) commented on Jan 5, 2022:

Describe the bug
pyspark.sql.utils.IllegalArgumentException: Part of the plan is not columnar class org.apache.spark.sql.execution.datasources.v2.ShowTablesExec

```
22/01/05 00:21:37 ERROR GpuOverrideUtil: Encountered an exception applying GPU overrides java.lang.IllegalArgumentException: Part of the plan is not columnar class org.apache.spark.sql.execution.datasources.v2.ShowTablesExec
ShowTables [namespace#66, tableName#67, isTemporary#68], V2SessionCatalog(spark_catalog), [default]

java.lang.IllegalArgumentException: Part of the plan is not columnar class org.apache.spark.sql.execution.datasources.v2.ShowTablesExec
ShowTables [namespace#66, tableName#67, isTemporary#68], V2SessionCatalog(spark_catalog), [default]
	at com.nvidia.spark.rapids.GpuTransitionOverrides.assertIsOnTheGpu(GpuTransitionOverrides.scala:503) ~[spark330/:?]
	at com.nvidia.spark.rapids.GpuTransitionOverrides.$anonfun$apply$3(GpuTransitionOverrides.scala:569) ~[spark330/:?]
	at com.nvidia.spark.rapids.GpuOverrides$.logDuration(GpuOverrides.scala:456) ~[spark3xx-common/:?]
	at com.nvidia.spark.rapids.GpuTransitionOverrides.$anonfun$apply$1(GpuTransitionOverrides.scala:549) ~[spark330/:?]
	at com.nvidia.spark.rapids.GpuOverrideUtil$.$anonfun$tryOverride$1(GpuOverrides.scala:3944) ~[spark3xx-common/:?]
	at com.nvidia.spark.rapids.GpuTransitionOverrides.apply(GpuTransitionOverrides.scala:580) ~[spark330/:?]
	at com.nvidia.spark.rapids.GpuTransitionOverrides.apply(GpuTransitionOverrides.scala:37) ~[spark330/:?]
	at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$apply$2(Columnar.scala:555) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$apply$2$adapted(Columnar.scala:555) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62) ~[scala-library-2.12.15.jar:?]
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55) ~[scala-library-2.12.15.jar:?]
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49) ~[scala-library-2.12.15.jar:?]
	at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:555) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:514) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution$.$anonfun$prepareForExecution$1(QueryExecution.scala:452) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126) ~[scala-library-2.12.15.jar:?]
	at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122) ~[scala-library-2.12.15.jar:?]
	at scala.collection.immutable.List.foldLeft(List.scala:91) ~[scala-library-2.12.15.jar:?]
	at org.apache.spark.sql.execution.QueryExecution$.prepareForExecution(QueryExecution.scala:451) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executedPlan$2(QueryExecution.scala:170) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:196) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:196) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executedPlan$1(QueryExecution.scala:170) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.withCteMap(QueryExecution.scala:73) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:163) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:163) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.simpleString(QueryExecution.scala:214) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$explainString(QueryExecution.scala:259) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.explainString(QueryExecution.scala:228) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:103) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:95) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:110) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:482) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:83) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:482) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:458) ~[spark-catalyst_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:220) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:622) ~[spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779) [spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:617) [spark-sql_2.12-3.3.0-SNAPSHOT.jar:3.3.0-SNAPSHOT]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_312]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_312]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_312]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_312]
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) [py4j-0.10.9.3.jar:?]
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) [py4j-0.10.9.3.jar:?]
	at py4j.Gateway.invoke(Gateway.java:282) [py4j-0.10.9.3.jar:?]
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) [py4j-0.10.9.3.jar:?]
	at py4j.commands.CallCommand.execute(CallCommand.java:79) [py4j-0.10.9.3.jar:?]
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182) [py4j-0.10.9.3.jar:?]
	at py4j.ClientServerConnection.run(ClientServerConnection.java:106) [py4j-0.10.9.3.jar:?]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_312]
```
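For context, `assertIsOnTheGpu` only throws under the plugin's test mode, which the integration tests enable so that any exec left on the CPU fails the run; `ShowTablesExec` (the v2 plan for `SHOW TABLES` hit by the teardown) has no GPU replacement, so it has to be tolerated rather than translated. A hypothetical sketch of how a session could be configured, assuming the plugin's documented `spark.rapids.sql.test.enabled` / `spark.rapids.sql.test.allowedNonGpu` settings; the actual fix landed in #4458 and may differ:

```python
# Illustrative only: requires the RAPIDS Accelerator jars on the classpath.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
    .config("spark.plugins", "com.nvidia.spark.SQLPlugin")
    # Test mode: assert that every exec in the plan runs on the GPU ...
    .config("spark.rapids.sql.test.enabled", "true")
    # ... except execs explicitly allow-listed to stay on the CPU.
    .config("spark.rapids.sql.test.allowedNonGpu", "ShowTablesExec")
    .getOrCreate())
```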


```
../../src/main/python/join_test.py::test_struct_self_join[IGNORE_ORDER({'local': True})] ERROR [100%]

==================================== ERRORS ====================================
__________________ ERROR at teardown of test_struct_self_join __________________

request = <SubRequest 'spark_tmp_table_factory' for <Function test_struct_self_join>>

    @pytest.fixture
    def spark_tmp_table_factory(request):
        base_id = 'tmp_table_{}'.format(random.randint(0, 1000000))
        yield TmpTableFactory(base_id)
        sp = get_spark_i_know_what_i_am_doing()
>       tables = sp.sql("SHOW TABLES".format(base_id)).collect()

../../src/main/python/conftest.py:267:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/home/jenkins/agent/workspace/jenkins-rapids_it-3.3.x-SNAPSHOT-dev-github-21/jars/spark-3.3.0-SNAPSHOT-bin-hadoop3.2/python/lib/pyspark.zip/pyspark/sql/session.py:1037: in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
/home/jenkins/agent/workspace/jenkins-rapids_it-3.3.x-SNAPSHOT-dev-github-21/jars/spark-3.3.0-SNAPSHOT-bin-hadoop3.2/python/lib/py4j-0.10.9.3-src.zip/py4j/java_gateway.py:1321: in __call__
    return_value = get_return_value(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

a = ('xro377', <py4j.clientserver.JavaClient object at 0x7f25aac2efd0>, 'o66', 'sql')
kw = {}, converted = IllegalArgumentException()

    def deco(*a: Any, **kw: Any) -> Any:
        try:
            return f(*a, **kw)
        except Py4JJavaError as e:
            converted = convert_exception(e.java_exception)
            if not isinstance(converted, UnknownException):
                # Hide where the exception came from that shows a non-Pythonic
                # JVM exception message.
>               raise converted from None
E               pyspark.sql.utils.IllegalArgumentException: Part of the plan is not columnar class org.apache.spark.sql.execution.datasources.v2.ShowTablesExec
E               ShowTables [namespace#66, tableName#67, isTemporary#68], V2SessionCatalog(spark_catalog), [default]
```
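Incidentally, the teardown builds its query as `"SHOW TABLES".format(base_id)`. Since the format string contains no `{}` replacement field, `base_id` is silently ignored and the statement sent to Spark is a plain `SHOW TABLES`, which is why the failure does not depend on the temporary table name. A quick check of that `str.format` behavior:

```python
# str.format with no replacement fields accepts and ignores extra
# positional arguments, so the table id never reaches the query string.
base_id = 'tmp_table_42'  # example value; the fixture uses a random suffix
query = "SHOW TABLES".format(base_id)
assert query == "SHOW TABLES"
print(query)  # prints "SHOW TABLES"
```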
On Jan 5, 2022, pxLi added the "bug" (Something isn't working) and "? - Needs Triage" (Need team to review and classify) labels; GaryShen2008 removed the "? - Needs Triage" label the same day.