Fix divide-by-zero, deprecation, and FutureWarnings #236
Conversation
`np.float` and `np.int` have been deprecated; they are replaced with the built-in `float` and `int` respectively.
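As a quick illustration of the replacement (a minimal sketch, not code from this PR), the builtins behave identically when used as dtype arguments:

```python
import numpy as np

# np.float and np.int were deprecated in NumPy 1.20 (and later removed);
# the built-in float and int give the same dtypes when passed to NumPy.
a = np.zeros(3, dtype=float)  # was: dtype=np.float
b = np.zeros(3, dtype=int)    # was: dtype=np.int
print(a.dtype, b.dtype)
```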
`sklearn` added `random_state` as an input to `AffinityPropagation` in 0.23. To preserve the old behavior, `random_state=0` was added, as this was the previously hard-coded value according to the docs here: `https://scikit-learn.org/stable/modules/generated/sklearn.cluster.AffinityPropagation.html`
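For context, a minimal sketch of pinning the seed (toy data, not from this repo):

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

# Illustrative toy data: two well-separated pairs of points.
X = np.array([[1.0, 1.0], [1.1, 1.0], [8.0, 8.0], [8.0, 8.1]])

# random_state=0 reproduces the value that was hard-coded before 0.23,
# so results match the pre-0.23 behavior.
ap = AffinityPropagation(random_state=0).fit(X)
print(ap.labels_)
```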
Division by 0 in the clustering and visualization code seems to happen in the normal course of operation and should result in `inf`, so the warnings are silenced at those points.
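The silencing can be done locally with `np.errstate`; a sketch of the general pattern (not necessarily the exact code in this PR):

```python
import numpy as np

# Suppress only the divide-by-zero warning inside this block;
# NumPy still returns inf for x / 0 with floating-point inputs.
with np.errstate(divide="ignore"):
    result = np.array([1.0, 2.0]) / 0.0
print(result)  # [inf inf]
```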
Codecov Report

```
@@            Coverage Diff             @@
##           master     #236      +/-   ##
==========================================
+ Coverage   80.27%   80.32%   +0.04%
==========================================
  Files          11       11
  Lines         862      864       +2
  Branches      189      189
==========================================
+ Hits          692      694       +2
  Misses        138      138
  Partials       32       32
```

Continue to review full report at Codecov.
Cool, thanks. I wonder why plotlyviz had non-Unix line endings... Can you make any future PRs off of non-master branches? It's okay for this one.
Sure, is there a particular branch you would like me to use?
Oh, you know what, I totally misread it. I thought this was coming from your fork's master branch. Long day, sorry. PRs against this repo's master are preferred, as you've been doing.
Currently the test suite generates a handful of warnings. These include `np.float` and `np.int` deprecation warnings and `AffinityPropagation` API changes. This pull request aims to fix these warnings. It changes all occurrences of `np.int` and `np.float` to `int` and `float` respectively, adds `random_state=0` to `AffinityPropagation` to keep the previous behavior, and silences division-by-0 warnings where the behavior `x / 0 = inf` seems to be the expected result. It does not fix the invalid-division warning, as I don't understand the function which raises it well enough to tell whether `NaN` is expected behavior or the data has to be cleansed somehow.
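For reference, the "invalid" warning mentioned above comes from a different floating-point condition than the "divide" one: `0 / 0` yields `NaN`, while `x / 0` for nonzero `x` yields `inf`. A minimal sketch of the distinction:

```python
import numpy as np

# "divide" covers x / 0 for x != 0 (result: inf);
# "invalid" covers 0 / 0 (result: nan).
with np.errstate(divide="ignore", invalid="ignore"):
    out = np.array([0.0, 1.0]) / np.array([0.0, 0.0])
print(out)  # [nan inf]
```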