My project uses RAFT brute-force kNN, and I noticed a drop in precision when I upgraded to the latest RAFT. I moved to cuVS but still see cosine similarities that are off by more than 1e-04 compared to a dot-product calculation on the CPU. Is this a bad use case for cuVS?
I had this issue before when I was using cuML, and `two_pass_precision` fixed it; unfortunately, that option also suffered from a different correctness bug (rapidsai/cuml#5569). Would appreciate any suggestions.
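For reference, here is roughly what my comparison looks like (a minimal sketch with synthetic data, assuming the cuVS Python `brute_force` API; the sizes `n`, `d`, `k` are illustrative, not my actual workload):

```python
import numpy as np
import cupy as cp
from cuvs.neighbors import brute_force

rng = np.random.default_rng(42)
n, d, k = 10_000, 256, 10
dataset = rng.random((n, d), dtype=np.float32)
queries = rng.random((100, d), dtype=np.float32)

# GPU: brute-force search with cosine distance (cuVS returns 1 - similarity)
index = brute_force.build(cp.asarray(dataset), metric="cosine")
dist_gpu, idx_gpu = brute_force.search(index, cp.asarray(queries), k)
dist_gpu = cp.asarray(dist_gpu).get()
idx_gpu = cp.asarray(idx_gpu).get()

# CPU reference in float64: cosine distance = 1 - (q . x) / (|q| |x|)
q64 = queries.astype(np.float64)
x64 = dataset.astype(np.float64)
sims = (q64 @ x64.T) / (
    np.linalg.norm(q64, axis=1, keepdims=True) * np.linalg.norm(x64, axis=1)
)
dist_cpu = 1.0 - np.take_along_axis(sims, idx_gpu, axis=1)

# Compare the GPU distances against the CPU reference for the same neighbors
print("max |gpu - cpu|:", np.abs(dist_gpu - dist_cpu).max())
```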
Thanks for creating an issue about this @phact. Can you share a little more info about how you are using this? Are you using the Python API? What precision is your data (float or double)? If it's not too hard to provide a trivial reproducible example, that would be helpful.
Sometimes this can be caused by multiple sources of small precision errors throughout the computational stack: the inner product can accumulate small errors, and follow-on arithmetic can make things slightly worse. We will work to get this fixed if you can help us understand more.
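To illustrate the accumulation point, here is a minimal sketch (plain NumPy, synthetic data, illustrative sizes only) showing how a single-precision inner product can already drift from a double-precision reference before any follow-on arithmetic like normalization or the final `1 - similarity` step:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.random(1_000_000, dtype=np.float32)
b = rng.random(1_000_000, dtype=np.float32)

# Inner product accumulated in float32 vs. a float64 reference
dot32 = np.dot(a, b)
dot64 = np.dot(a.astype(np.float64), b.astype(np.float64))

print(f"float32:        {dot32:.6f}")
print(f"float64:        {dot64:.6f}")
print(f"relative error: {abs(dot32 - dot64) / dot64:.2e}")
```

Each later stage (computing norms, dividing by them, subtracting from 1) adds its own rounding on top of this, so the end-to-end difference against a float64 CPU reference can be noticeably larger than the inner-product error alone.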