
[BUG] div_by_zero test is failing on Spark 330 on 22.06 #5717

Closed
firestarman opened this issue Jun 2, 2022 · 0 comments · Fixed by #5718
Labels: bug (Something isn't working), P0 (Must have for release)

Comments

@firestarman (Collaborator)

FAILED ../../src/main/python/arithmetic_ops_test.py::test_div_by_zero_ansi[1/0]
FAILED ../../src/main/python/arithmetic_ops_test.py::test_div_by_zero_ansi[a/0]
FAILED ../../src/main/python/arithmetic_ops_test.py::test_div_by_zero_ansi[a/b]
../../src/main/python/arithmetic_ops_test.py::test_div_by_zero_ansi[a/b] 22/06/02 02:34:22 ERROR Executor: Exception in task 3.0 in stage 1.0 (TID 7)
org.apache.spark.SparkArithmeticException: Division by zero. To return NULL instead, use `try_divide`. If necessary set "spark.sql.ansi.enabled" to "false" (except for ANSI interval type) to bypass this error.
== SQL(line 1, position 1) ==
a/b
^^^

        at org.apache.spark.sql.errors.QueryExecutionErrors$.divideByZeroError(QueryExecutionErrors.scala:184) ~[spark-catalyst_2.12-3.3.1-SNAPSHOT.jar:3.3.1-SNAPSHOT]
        at org.apache.spark.sql.errors.QueryExecutionErrors.divideByZeroError(QueryExecutionErrors.scala) ~[spark-catalyst_2.12-3.3.1-SNAPSHOT.jar:3.3.1-SNAPSHOT]
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source) ~[?:?]
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43) ~[spark-sql_2.12-3.3.1-SNAPSHOT.jar:3.3.1-SNAPSHOT]
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:760) ~[spark-sql_2.12-3.3.1-SNAPSHOT.jar:3.3.1-SNAPSHOT]
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:364) ~[spark-sql_2.12-3.3.1-SNAPSHOT.jar:3.3.1-SNAPSHOT]
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:890) ~[spark-core_2.12-3.3.1-SNAPSHOT.jar:3.3.1-SNAPSHOT]
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:890) ~[spark-core_2.12-3.3.1-SNAPSHOT.jar:3.3.1-SNAPSHOT]
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.12-3.3.1-SNAPSHOT.jar:3.3.1-SNAPSHOT]

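For reference, a minimal sketch (not part of this issue; it assumes a local PySpark 3.3.x session) of how ANSI mode turns a divide-by-zero into this SparkArithmeticException, and of the `try_divide` alternative the error message points to:

```python
# Minimal sketch (assumption, not from the issue): reproduce the ANSI
# divide-by-zero error on a local Spark 3.3.x session.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")
    .config("spark.sql.ansi.enabled", "true")  # ANSI mode raises on division by zero
    .getOrCreate()
)

df = spark.createDataFrame([(1, 0)], ["a", "b"])

# With ANSI enabled this raises org.apache.spark.SparkArithmeticException:
# "Division by zero. To return NULL instead, use `try_divide` ..."
try:
    df.selectExpr("a/b").collect()
except Exception as e:
    print(type(e).__name__, e)

# The workaround suggested by the error message: try_divide returns NULL
# instead of raising (available as a SQL function since Spark 3.2).
df.selectExpr("try_divide(a, b)").show()
```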
@firestarman added the "bug (Something isn't working)" and "? - Needs Triage (Need team to review and classify)" labels on Jun 2, 2022
@firestarman changed the title from "[BUG] div_by_zero test is failing on Spark 330" to "[BUG] div_by_zero test is failing on Spark 330 on 22.06" on Jun 2, 2022
@firestarman added the "P0 (Must have for release)" label and removed the "? - Needs Triage (Need team to review and classify)" label on Jun 2, 2022