[FEA] Avoid going over memory and size limitations in GpuExpression evaluation #7257

Open
1 of 6 tasks
revans2 opened this issue Dec 5, 2022 · 0 comments
Labels
epic: Issue that encompasses a significant feature or body of work
reliability: Features to improve reliability or bugs that severely impact the reliability of the plugin

Comments

revans2 (Collaborator) commented Dec 5, 2022

Is your feature request related to a problem? Please describe.
This is related to #7252, but it tackles the problem for Expressions rather than for SparkPlan exec nodes. Most of this work is only needed when the input batch has to be split into multiple output batches; that can happen, but in practice it is not that common.
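
To make the splitting idea concrete, here is a minimal sketch of what "split the input batch into multiple output batches" could look like for a single expression. The `estimateOutputBytesPerRow`, `sliceBatch`, and `evalOnBatch` helpers are hypothetical stand-ins, not the plugin's actual APIs; only `ColumnarBatch` is Spark's real class.

```scala
import org.apache.spark.sql.vectorized.ColumnarBatch

// Rough sketch of split-and-evaluate, NOT the plugin's actual implementation.
object SplitEvalSketch {
  // Assumed per-output-batch budget; a real limit would come from config.
  val targetOutputBytes: Long = 1L << 30

  def splitAndEval(
      input: ColumnarBatch,
      estimateOutputBytesPerRow: ColumnarBatch => Long, // hypothetical helper
      sliceBatch: (ColumnarBatch, Int, Int) => ColumnarBatch, // hypothetical helper
      evalOnBatch: ColumnarBatch => ColumnarBatch // hypothetical GpuExpression hook
  ): Seq[ColumnarBatch] = {
    val totalRows = input.numRows()
    val bytesPerRow = math.max(1L, estimateOutputBytesPerRow(input))
    // Largest row count whose estimated output still fits in the budget
    // (at least 1 row so we always make progress; an empty input yields
    // no output slices).
    val rowsPerSlice =
      math.max(1L, math.min(targetOutputBytes / bytesPerRow, totalRows.toLong)).toInt
    (0 until totalRows by rowsPerSlice).map { start =>
      val numRows = math.min(rowsPerSlice, totalRows - start)
      val slice = sliceBatch(input, start, numRows)
      try {
        // Evaluate on a slice small enough that intermediate and output
        // allocations stay under the memory target.
        evalOnBatch(slice)
      } finally {
        slice.close() // assumes the slice owns its columns
      }
    }
  }
}
```

The key point of the sketch is that the slice size is driven by an estimate of the output size rather than the input size, so expressions whose output is much larger than their input would be fed correspondingly smaller slices.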
