Initial definition for Spark 4.0.0 shim #10725
```diff
@@ -810,6 +810,7 @@
         351
     </noSnapshot.buildvers>
     <snapshot.buildvers>
+        400
     </snapshot.buildvers>
     <databricks.buildvers>
         330db,
```

**Reviewer:** Need to regenerate the scala2.13 pom.

**Author:** Done. I was curious why we need to commit the 2.13 pom when it can be generated when needed. In other words, isn't this a lot like generating shims for different versions of Spark?

**Reviewer:** It's committed for convenience. Developers can point their IDE directly at the scala2.13 pom, or build right after pulling the source. If it required manual generation, switching branches in your local repo would be fraught with problems if you forgot to re-generate the scala2.13 pom after moving to a new commit, something that would be very easy to forget.
**Reviewer:** This should actually only be in `snapshotScala213.buildvers` at the moment. Spark 4.0.0 does not support Scala 2.12, so the CI cannot actually build the shim under the default Scala version.
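Since the 400 shim only compiles under Scala 2.13, a local build would have to go through the committed scala2.13 pom rather than the default one. A rough sketch of such an invocation, following the `-Dbuildver` convention visible in this project (the exact goals and flags here are an assumption, not a documented command):

```
# Hypothetical sketch: build only the Spark 4.0.0 shim against the
# committed Scala 2.13 pom; -Dbuildver selects the shim to build.
mvn -f scala2.13/pom.xml -Dbuildver=400 -DskipTests package
```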
**Author:** You are right, and that was my initial thought as well, but as you have noted it needs to be added to `all.buildvers` for the 2.12 pom.xml. At this point I have added it to `all.buildvers` for both Scala 2.12 and 2.13, but as things get clearer closer to the release of Spark 4.0.0, we will have to ignore the `all.buildvers` check for the Scala 2.12 build while keeping the check for the Scala 2.13 build.
**Reviewer:** I don't understand why 400 was added to `snapshot.buildvers`. That's not what we want, right? 400 is not ready to be built. We want 400 in `all.buildvers` but not in any definition of what is buildable. We need 400 declared as a shim, but not one that builds yet. Therefore I would expect the change to be more like this:
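The snippet the reviewer attached did not survive this export. Based on the comment, though, the expected shape would be roughly the following (a sketch only: the surrounding properties, their exact contents, and the indentation are assumptions based on the diff above):

```xml
<!-- Sketch of the layout the reviewer describes: 400 is declared in
     all.buildvers but kept out of snapshot.buildvers, so the shim
     exists as a definition without being buildable yet. -->
<all.buildvers>
    351,
    400
</all.buildvers>
<snapshot.buildvers>
    <!-- 400 intentionally absent until it is ready to build -->
</snapshot.buildvers>
```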
**Author:** Hmm, probably something for a future PR, but we should consider how to organize this for future 40x shims that will only build under Scala 2.13. These future shims (400, 401, etc.) should be able to live in `all.buildvers`, and the build system should be able to handle them as well. Maybe put them in another section that is shared with the `*Scala213.buildvers` sections?
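The invariant the thread converges on, every buildable shim must be declared in `all.buildvers`, while 2.13-only shims must never be buildable under Scala 2.12, can be sketched as a small consistency check. The property names mirror the pom.xml diff above; the `scala213_only` grouping and the check function are hypothetical illustrations, not part of the actual build:

```python
# Hypothetical sketch of the buildvers consistency check discussed above.
# Property names (all.buildvers, snapshot.buildvers, ...) mirror the
# pom.xml diff; the scala213_only grouping is an assumed future section.

all_buildvers = {"351", "400"}   # every declared shim, buildable or not
no_snapshot = {"351"}            # buildable shims for released Spark versions
snapshot = set()                 # buildable snapshot shims (400 kept out)
scala213_only = {"400"}          # shims that only build under Scala 2.13

def check_buildvers(scala_version: str) -> None:
    """Every buildable shim must be declared in all.buildvers, and
    Scala-2.13-only shims must not be buildable under Scala 2.12."""
    buildable = no_snapshot | snapshot
    missing = buildable - all_buildvers
    assert not missing, f"buildable shims not declared: {missing}"
    if scala_version == "2.12":
        leaked = buildable & scala213_only
        assert not leaked, f"2.13-only shims in the 2.12 build: {leaked}"

check_buildvers("2.12")  # 400 is declared but not buildable, so this passes
check_buildvers("2.13")
```

With this layout, adding 400 back into `snapshot.buildvers` (or a future `snapshotScala213.buildvers`) would make it buildable without touching the declaration in `all.buildvers`.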