
[BUG] Dynamic partition counts for task write stats may be incorrect #1010

Open
jlowe opened this issue Oct 22, 2020 · 0 comments
Labels
bug (Something isn't working), Spark 3.1+ (Bugs only related to Spark 3.1 or higher), SQL (part of the SQL/Dataframe plugin)

Comments

jlowe (Member) commented Oct 22, 2020

Describe the bug
To fix #1006 we stopped using BasicWriteTaskStats and used our own version that matches the original schema from before the Spark 3.1.0 change. However, that Spark 3.1.0 change was made to fix a bug, specifically SPARK-32978, so we should include a similar fix.

Steps/Code to reproduce bug
See SPARK-32978
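If the Spark 3.1.0 change works the way I recall (each task reports the partition values it wrote, and the driver deduplicates them instead of summing per-task counts), then a self-contained sketch of the same idea is below. All names here (TaskWriteStats, aggregateJobStats) are made up for illustration and are not the plugin's or Spark's actual classes:

```scala
// Sketch of a SPARK-32978-style fix: rather than each task reporting a partition
// *count* (which double-counts a partition touched by multiple tasks when summed
// on the driver), each task reports the partition *values* it wrote and the
// driver takes the distinct count.

// Per-task write statistics: partitions written by this task, plus file stats.
case class TaskWriteStats(
    partitions: Seq[String], // partition values written by this task
    numFiles: Int,
    numBytes: Long,
    numRows: Long)

object WriteStatsExample {
  // Driver-side aggregation across all tasks in the job.
  def aggregateJobStats(taskStats: Seq[TaskWriteStats]): (Int, Int, Long, Long) = {
    // Correct: count distinct partitions across all tasks.
    // The buggy pre-fix behavior would be: taskStats.map(_.partitions.size).sum
    val numPartitions = taskStats.flatMap(_.partitions).distinct.size
    val numFiles = taskStats.map(_.numFiles).sum
    val numBytes = taskStats.map(_.numBytes).sum
    val numRows = taskStats.map(_.numRows).sum
    (numPartitions, numFiles, numBytes, numRows)
  }

  def main(args: Array[String]): Unit = {
    // Two tasks both write to partition "date=2020-10-22"; summing per-task
    // counts would report 3 partitions, but only 2 distinct partitions exist.
    val stats = Seq(
      TaskWriteStats(Seq("date=2020-10-22"), numFiles = 1, numBytes = 100L, numRows = 10L),
      TaskWriteStats(Seq("date=2020-10-22", "date=2020-10-23"), 2, 250L, 20L))
    println(aggregateJobStats(stats)) // (2, 3, 350, 30)
  }
}
```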

jlowe added the bug, ? - Needs Triage, and SQL labels on Oct 22, 2020
sameerz added the Spark 3.1+ label and removed the ? - Needs Triage label on Oct 27, 2020