
[BUG] map_test ansi failed in spark330 #4564

Closed
pxLi opened this issue Jan 19, 2022 · 1 comment · Fixed by #4690
Labels: bug (Something isn't working), P0 (Must have for release), spark 3.3+

Comments

pxLi (Collaborator) commented Jan 19, 2022

Describe the bug

14:36:54  FAILED ../../src/main/python/map_test.py::test_simple_get_map_value_ansi_fail[Map(String(not_null),String)]
14:36:54  FAILED ../../src/main/python/map_test.py::test_map_element_at_ansi_fail[Map(String(not_null),String)]
14:36:54  =================================== FAILURES ===================================
14:36:54  ______ test_simple_get_map_value_ansi_fail[Map(String(not_null),String)] _______
14:36:54  
14:36:54  data_gen = Map(String(not_null),String)
14:36:54  
14:36:54      @pytest.mark.skipif(is_before_spark_311(), reason="Only in Spark 3.1.1 + ANSI mode, map key throws on no such element")
14:36:54      @pytest.mark.parametrize('data_gen', [simple_string_to_string_map_gen], ids=idfn)
14:36:54      def test_simple_get_map_value_ansi_fail(data_gen):
14:36:54  >       assert_gpu_and_cpu_error(
14:36:54                  lambda spark: unary_op_df(spark, data_gen).selectExpr(
14:36:54                      'a["NOT_FOUND"]').collect(),
14:36:54                      conf={'spark.sql.ansi.enabled':True,
14:36:54                            'spark.sql.legacy.allowNegativeScaleOfDecimal': True},
14:36:54                      error_message='java.util.NoSuchElementException')
14:36:54  
14:36:54  ../../src/main/python/map_test.py:148: 
14:36:54  _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
14:36:54  ../../src/main/python/asserts.py:569: in assert_gpu_and_cpu_error
14:36:54      assert_py4j_exception(lambda: with_cpu_session(df_fun, conf), error_message)
14:36:54  _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
14:36:54  
14:36:54  func = <function assert_gpu_and_cpu_error.<locals>.<lambda> at 0x7f091df18a60>
14:36:54  error_message = 'java.util.NoSuchElementException'
14:36:54  
14:36:54      def assert_py4j_exception(func, error_message):
14:36:54          """
14:36:54          Assert that a specific Java exception is thrown
14:36:54          :param func: a function to be verified
14:36:54          :param error_message: a string such as the one produce by java.lang.Exception.toString
14:36:54          :return: Assertion failure if no exception matching error_message has occurred.
14:36:54          """
14:36:54          with pytest.raises(Py4JJavaError) as py4jError:
14:36:54              func()
14:36:54  >       assert error_message in str(py4jError.value.java_exception)
14:36:54  E       AssertionError
14:36:54  
14:36:54  ../../src/main/python/asserts.py:558: AssertionError
14:36:54  _________ test_map_element_at_ansi_fail[Map(String(not_null),String)] __________
14:36:54  
14:36:54  data_gen = Map(String(not_null),String)
14:36:54  
14:36:54      @pytest.mark.skipif(is_before_spark_311(), reason="Only in Spark 3.1.1 + ANSI mode, map key throws on no such element")
14:36:54      @pytest.mark.parametrize('data_gen', [simple_string_to_string_map_gen], ids=idfn)
14:36:54      def test_map_element_at_ansi_fail(data_gen):
14:36:54  >       assert_gpu_and_cpu_error(
14:36:54                  lambda spark: unary_op_df(spark, data_gen).selectExpr(
14:36:54                      'element_at(a, "NOT_FOUND")').collect(),
14:36:54                      conf={'spark.sql.ansi.enabled':True,
14:36:54                            'spark.sql.legacy.allowNegativeScaleOfDecimal': True},
14:36:54                      error_message='java.util.NoSuchElementException')
14:36:54  
14:36:54  ../../src/main/python/map_test.py:179: 
14:36:54  _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
14:36:54  ../../src/main/python/asserts.py:569: in assert_gpu_and_cpu_error
14:36:54      assert_py4j_exception(lambda: with_cpu_session(df_fun, conf), error_message)
14:36:54  _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
14:36:54  
14:36:54  func = <function assert_gpu_and_cpu_error.<locals>.<lambda> at 0x7f091d3ba160>
14:36:54  error_message = 'java.util.NoSuchElementException'
14:36:54  
14:36:54      def assert_py4j_exception(func, error_message):
14:36:54          """
14:36:54          Assert that a specific Java exception is thrown
14:36:54          :param func: a function to be verified
14:36:54          :param error_message: a string such as the one produce by java.lang.Exception.toString
14:36:54          :return: Assertion failure if no exception matching error_message has occurred.
14:36:54          """
14:36:54          with pytest.raises(Py4JJavaError) as py4jError:
14:36:54              func()
14:36:54  >       assert error_message in str(py4jError.value.java_exception)
14:36:54  E       AssertionError
14:36:54  
14:36:54  ../../src/main/python/asserts.py:558: AssertionError
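
Both failures happen on the CPU side of the comparison (the assertion fires inside `with_cpu_session`), which suggests that vanilla Spark 3.3.0 no longer raises an exception whose string contains `java.util.NoSuchElementException` for a missing map key under ANSI mode. A minimal standalone sketch (not part of the test suite; session setup and app name are illustrative) to inspect what 3.3.0 actually raises:

```python
# Repro sketch: run against a local Spark 3.3.0 with ANSI mode enabled and
# print whatever exception (if any) a missing map key lookup produces.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("map-ansi-repro")  # illustrative name
         .config("spark.sql.ansi.enabled", "true")
         .getOrCreate())

# Single-row DataFrame with a map<string,string> column, mirroring the
# Map(String(not_null),String) data gen used by the failing tests.
df = spark.createDataFrame([({"k": "v"},)], "a map<string,string>")

try:
    # Same expression as test_simple_get_map_value_ansi_fail
    rows = df.selectExpr('a["NOT_FOUND"]').collect()
    print("no exception raised, returned:", rows)
except Exception as e:
    # On Spark 3.1.x/3.2.x this contained java.util.NoSuchElementException;
    # the output here shows what 3.3.0 reports instead.
    print(type(e).__name__, e)
```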
pxLi added the labels "bug" (Something isn't working) and "? - Needs Triage" (Need team to review and classify) on Jan 19, 2022
sameerz added the labels "spark 3.3+" and "P0" (Must have for release) and removed "? - Needs Triage" on Jan 24, 2022
amahussein (Collaborator) commented:

SPARK-37750 added a config that controls whether an exception is thrown when an element does not exist. In this issue I will only fix the expected exception, and leave support for the new config to be done in #4668.
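
For reference, a hedged sketch of one way the expected message could be version-gated in map_test.py. The actual change landed in #4690 and may look different; this assumes a version helper analogous to the `is_before_spark_311` already used by these tests (here called `is_before_spark_330`), and the 3.3.0-side expected text is a placeholder to verify against the real Spark 3.3.0 exception:

```python
@pytest.mark.skipif(is_before_spark_311(), reason="Only in Spark 3.1.1 + ANSI mode, map key throws on no such element")
@pytest.mark.parametrize('data_gen', [simple_string_to_string_map_gen], ids=idfn)
def test_simple_get_map_value_ansi_fail(data_gen):
    # Before Spark 3.3.0 the missing-key error contained
    # java.util.NoSuchElementException; the 3.3.0 branch below is a
    # PLACEHOLDER and must be checked against the actual 3.3.0 message.
    message = 'java.util.NoSuchElementException' if is_before_spark_330() \
        else 'NoSuchElementException'  # placeholder, verify on Spark 3.3.0
    assert_gpu_and_cpu_error(
        lambda spark: unary_op_df(spark, data_gen).selectExpr(
            'a["NOT_FOUND"]').collect(),
        conf={'spark.sql.ansi.enabled': True,
              'spark.sql.legacy.allowNegativeScaleOfDecimal': True},
        error_message=message)
```

The same gating would apply to test_map_element_at_ansi_fail, which checks the identical error string via `element_at(a, "NOT_FOUND")`.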
