
Fix tests failing on Spark 3.1 #557

Merged
merged 1 commit into NVIDIA:branch-0.2 from fix-310-tests on Aug 14, 2020
Conversation

@jlowe (Member) commented Aug 13, 2020

This fixes a couple of Scala tests that were failing on Spark 3.1.0-SNAPSHOT.

One of the tests relied on a ProjectExec being present in the query plan, and Spark 3.1.0 optimizes it out. I added an explicit project after the filter to work around it.
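
As a rough sketch of that workaround (not the PR's actual diff; the DataFrame and column names here are made up for illustration), an explicit projection that computes a new expression after the filter gives the planner a ProjectExec to keep in the physical plan:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // Hypothetical input: a single "longs" column, standing in for the test data.
    val df = Seq(1L, 2L, 10L).toDF("longs")

    // A bare filter (or a trivial select of the same columns) can be collapsed by
    // newer Spark versions, leaving no ProjectExec for the test to find.
    val filtered = df.filter($"longs" < 5)

    // An explicit project that computes something new is kept as a ProjectExec,
    // which a plan-inspection test can then assert on.
    val projected = filtered.selectExpr("longs", "longs + 1 AS longs_plus_one")
    projected.explain()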

The other test failed because a query that doesn't work on Spark 3.0 now works on Spark 3.1. Updated the test to check the Spark version before deciding which kind of test to run.

Signed-off-by: Jason Lowe <jlowe@nvidia.com>
@jlowe jlowe added test Only impacts tests build Related to CI / CD or cleanly building labels Aug 13, 2020
@jlowe jlowe added this to the Aug 3 - Aug 14 milestone Aug 13, 2020
@jlowe jlowe self-assigned this Aug 13, 2020
@jlowe (Member, Author) commented Aug 13, 2020

build

@revans2 revans2 merged commit 48e7ed7 into NVIDIA:branch-0.2 Aug 14, 2020
A collaborator commented on this excerpt from the updated test (partial diff context):

    longsFromCSVDf, conf = floatAggConf) {
      frame => val res = frame.selectExpr("avg(distinct longs) filter (where longs < 5)")
      res
    if (spark.SPARK_VERSION_SHORT < "3.1.0") {
This is fine, but we had added a function to the shim layer to get the version so that we could handle different shim layers in addition to Apache Spark versions. One test is using it already, so it would be nice to stay consistent (see StringFallbackSuite):

  val isValidTestForSparkVersion = ShimLoader.getSparkShims.getSparkShimVersion match {
    case SparkShimVersion(major, minor, _) => major == 3 && minor == 0
    case DatabricksShimVersion(major, minor, _) => major == 3 && minor == 0
    case _ => true
  }
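
For reference, a minimal sketch of how such a flag could gate the two variants of the test; the branch bodies are placeholders, since the actual test helpers aren't shown in this thread:

    // Sketch only: isValidTestForSparkVersion (from the snippet above) is true on
    // Spark/Databricks 3.0.x shims (and, via the `case _`, any unrecognized shim),
    // where the filter-on-distinct-aggregate query is expected to be rejected.
    if (isValidTestForSparkVersion) {
      // Spark 3.0.x: register the variant that expects the query to fail.
      println("running the expect-failure variant")
    } else {
      // Spark 3.1.0+: register the variant that runs the query and compares results.
      println("running the compare-results variant")
    }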

@jlowe jlowe deleted the fix-310-tests branch October 16, 2020 20:30
nartal1 pushed a commit to nartal1/spark-rapids that referenced this pull request Jun 9, 2021
Signed-off-by: Jason Lowe <jlowe@nvidia.com>
tgravescs pushed a commit to tgravescs/spark-rapids that referenced this pull request Nov 30, 2023
…IDIA#557)

Signed-off-by: spark-rapids automation <70000568+nvauto@users.noreply.github.com>