[BUG] Rounding past the size that can be stored in a type produces incorrect results #4273
@revans2 Hi, could you share how to reproduce this? I tried several cases via spark-shell and CUDF seemed to work fine.
Hi @revans2, I failed to find corner cases where the GPU and Spark produce different results for int round and float round. For int round, could you show some examples? Thanks.
It happens when you round past what the type can theoretically hold, e.g. a byte rounded with scale = -3. This is why it is a real corner case. No one should ever do it, because it makes no sense to, but we still want to cover all of the possibilities. While you are at it, there is some Decimal code that fixes/works around this same issue for us: spark-rapids/sql-plugin/src/main/scala/com/nvidia/spark/rapids/DecimalUtil.scala, lines 87 to 94 in b91318d.
I overthought things when I wrote it, or perhaps didn't think enough. It will always return 0, ignoring the overflow, so it might be nice to simplify it and just always return 0s in that case instead of casting to truncate followed by rounding.
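Below is a minimal Scala sketch of the simplification being suggested, not the actual DecimalUtil code; `maxDigits`, `roundIntegral`, and all other names here are hypothetical:

```scala
import java.math.{BigDecimal => JBigDecimal, RoundingMode}

// Hypothetical helper: decimal digits of the widest value an integral
// type can hold, e.g. Byte -> 3 (127), Short -> 5, Int -> 10, Long -> 19.
def maxDigits(bits: Int): Int = bits match {
  case 8  => 3
  case 16 => 5
  case 32 => 10
  case 64 => 19
  case _  => sys.error(s"unsupported width: $bits")
}

// Sketch of "just always return 0": if the negative scale wipes out every
// digit the type can represent, skip the truncate-then-round path entirely.
// This returns 0 even where true rounding would overflow (e.g. Long.MaxValue
// at scale -19), matching the behavior described in the comment above. The
// else branch can still overflow for values near the type's max, which is
// the residual integer overflow issue mentioned in this thread.
def roundIntegral(value: Long, scale: Int, bits: Int): Long =
  if (scale <= -maxDigits(bits)) 0L
  else new JBigDecimal(value).setScale(scale, RoundingMode.HALF_UP).longValueExact()
```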
There are some odd corner cases when rounding integers and floating point values. If you try to round past what the type can hold, Spark will typically produce 0 every time. For integers there are still some possible overflow issues, so it is not 100% that way. For CUDF, with integers we can get numbers that are way off, and for floating point values we can get back a NaN. This is a real corner case, though, because we only support a constant scale value, so someone would have to get things rather messed up.
We have fixed this for Decimal, though.
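A hypothetical spark-shell repro sketch of the two corner cases described above; the values and scales are illustrative, not taken from the report:

```scala
import org.apache.spark.sql.functions.{col, round}
import spark.implicits._

// A Byte holds -128..127, so rounding to the nearest 1000 (scale -3) can
// never yield a nonzero byte; CPU Spark produces 0, while per this report
// the GPU path could return values that are way off.
Seq(100.toByte, -100.toByte).toDF("b")
  .select(round(col("b"), -3)).show()

// 10^50 is beyond Float's range (~3.4e38), so CPU Spark again produces 0,
// while per this report CUDF could return NaN instead.
Seq(1.5f, -2.5f).toDF("f")
  .select(round(col("f"), -50)).show()
```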