Shim layer support for Spark 3.0.0 Databricks #442
Conversation
build
I'm going to see if I can get the ci-dev env going; otherwise this build won't pass. I could also change to exclude Databricks by default until we have it working, but let me ping the DevOps folks from NGC to see if I can get a dev env going in a reasonable amount of time.
build
The ci-dev build passed and pushed one version of the jar to URM.
The changes look good to me, I just wanted to be sure I understood everything that you wanted to happen. So is this the final version? Do we intend to push a version of the Databricks shim to sonatype and have the final build pull that in for shading?
Good question, ideally yes we push it to sonatype. I'll turn exclude-databricks on by default for now and then follow up with a different issue for pushing to sonatype.
Note: still updating the Jenkins files.
build
build |
* Add in Databricks shim
Add in support for Spark 3.0.0 Databricks. This has to be built on a Databricks host, so the default build doesn't build it, but it will try to pull the artifact from sonatype. That will obviously be broken until we push once. I did add an option to skip pulling it in via the -Pexclude-databricks option.
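For reference, a sketch of the two build modes this implies; the Maven goals shown are illustrative, only the -Pexclude-databricks profile is from this change:

```shell
# Default build: skips building the Databricks shim locally but tries to
# pull the Databricks artifact from sonatype (broken until the first push).
mvn clean package

# Opt out of the Databricks shim (and the sonatype pull) via the new profile.
mvn clean package -Pexclude-databricks
```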
Unfortunately our dev-ci seems to be messed up again, so I have not been able to test fully end to end with deployment to URM. If it fails I will have to do a follow-up or work with the team to get the dev environment working again.
For Databricks, the shim does require the joins and then FileSourceScanExec, so I ended up copying those files again. Hopefully we can do the follow-on to combine those somehow; see issue #412.
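To illustrate what the shim layer is doing here, a minimal sketch of the version-dispatch pattern, with the Databricks shim extending the Apache Spark 3.0.0 shim and overriding only the operators that differ; every class and method name below is hypothetical and only shows the idea, not the actual plugin API:

```scala
// Minimal sketch of a shim-layer dispatch; names are hypothetical.
trait SparkShims {
  def getSparkShimVersion: String
}

class Spark300Shims extends SparkShims {
  override def getSparkShimVersion: String = "3.0.0"
}

// The Databricks shim extends the upstream 3.0.0 shim and would override only
// what differs (the join execs and FileSourceScanExec).
class Spark300DatabricksShims extends Spark300Shims {
  override def getSparkShimVersion: String = "3.0.0-databricks"
}

object ShimLoader {
  // Databricks reports a vendor-specific version string, so the loader has to
  // distinguish it from upstream Apache Spark 3.0.0.
  def getSparkShims(sparkVersion: String): SparkShims = sparkVersion match {
    case v if v.contains("databricks") => new Spark300DatabricksShims()
    case v if v.startsWith("3.0.0")    => new Spark300Shims()
    case v => throw new IllegalArgumentException(s"Unsupported Spark version: $v")
  }
}
```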