[BUG] test_non_empty_ctas fails on yarn #3476
Comments
This failed again after kicking off the build a second time.
This looks like the same stacktrace as reported at apache/spark#26619 (comment). The stacktrace shows that a Hive 3.x shim is getting loaded, but it looks like a Hive 2.x jar is being used underneath it, which goes about as well as one would expect.
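The mismatch described above can be probed directly: the class named in the stacktrace, `LoadTableDesc$LoadFileType`, exists in Hive 3.x but not Hive 2.x, so attempting to load it tells you which jar generation is actually on the classpath. This is a diagnostic sketch (the class name comes from the stacktrace; `ShimCheck` is a hypothetical helper, not part of the plugin):

```java
// Sketch: probe the runtime classpath for a Hive 3.x-only class to detect
// a Hive 2.x jar sitting underneath a Hive 3.x shim.
public class ShimCheck {
    /** Returns true if the Hive 3.x-only class is loadable on the current classpath. */
    static boolean hasHive3Classes() {
        try {
            // This inner enum was introduced in Hive 3.x; Hive 2.x jars lack it.
            Class.forName("org.apache.hadoop.hive.ql.plan.LoadTableDesc$LoadFileType");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        if (hasHive3Classes()) {
            System.out.println("Hive 3.x classes present");
        } else {
            System.out.println("Hive 3.x class missing: a Hive 2.x (or no) hive-exec jar is on the classpath");
        }
    }
}
```

Running this inside the same JVM (or with the same classpath) as the failing executors would confirm whether the cluster is shipping an old hive-exec jar.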
Seeing some test failures on ucx egx yarn after this:

[2021-09-17T08:09:26.387Z] integration_tests/src/main/python/parquet_write_test.py::test_non_empty_ctas[True][ALLOW_NON_GPU(DataWritingCommandExec,HiveTableScanExec)] FAILED [ 99%]
[2021-09-17T08:09:26.388Z] E : java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.plan.LoadTableDesc$LoadFileType
Actually, the nightly didn't push last night, so reclosing this to see if it fails again.
This is still happening. |
Note that the failure occurs while the test is performing a CPU-only table write. There are no GPU operations being performed, so it seems like the Spark YARN cluster is misconfigured somehow to cause this type of error.
Describe the bug
06:31:10 integration_tests/src/main/python/parquet_write_test.py::test_non_empty_ctas[True][ALLOW_NON_GPU(DataWritingCommandExec,HiveTableScanExec)] FAILED [ 99%]
06:31:10 integration_tests/src/main/python/parquet_write_test.py::test_non_empty_ctas[False][ALLOW_NON_GPU(DataWritingCommandExec,HiveTableScanExec)] FAILED [100%]