
Set use_blas64 to false to fix MKL.jl interaction numpy problem #35

Merged · 2 commits · Oct 13, 2020

Conversation

fgerick
Contributor

@fgerick fgerick commented Mar 6, 2020

Same as the other PR, updated to the latest master.

@carstenbauer
Member

Would be good to link to the other PR.

@aminya aminya mentioned this pull request Mar 6, 2020
@KristofferC KristofferC mentioned this pull request Mar 12, 2020
@aminya
Contributor

aminya commented Mar 12, 2020

This builds now with the new version. I don't see any reason for not merging this.

@carstenbauer
Member

carstenbauer commented Mar 13, 2020

What are the side effects of USE_BLAS64=false again? (apart from fixing the numpy interplay)

UPDATE: Oh, I asked this before. Anyways, I guess this 32-bit "limitation" should be mentioned in the README if this gets merged, no?
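For reference, a minimal sketch of what USE_BLAS64 controls and where the 32-bit "limitation" comes from (the constant names mirror Julia's generated build_h.jl, but this is illustrative, not the actual source):

```julia
# Illustrative sketch, not MKL.jl's actual code: USE_BLAS64 decides the
# integer width Julia uses when calling into BLAS/LAPACK.
const USE_BLAS64 = false              # LP64 interface, matching numpy's MKL build

# Julia derives its BlasInt type from this constant:
const BlasInt = USE_BLAS64 ? Int64 : Int32

# The practical limitation: with LP64, every matrix dimension and stride
# passed to BLAS/LAPACK must fit in an Int32.
max_dim = typemax(BlasInt)            # 2147483647 when USE_BLAS64 == false
```

So the cost of USE_BLAS64=false is a cap of typemax(Int32) on any single BLAS/LAPACK dimension, which is the point worth documenting in the README.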

@fgerick
Contributor Author

fgerick commented Mar 13, 2020

Is it easy to include a flag in the build for the individual user to choose?

src/install.jl Outdated

@assert libblas_idx !== nothing && liblapack_idx !== nothing

lines[libblas_idx] = "const libblas_name = $(repr(name))"
lines[liblapack_idx] = "const liblapack_name = $(repr(name))"
if useblas64_idx !== nothing
lines[useblas64_idx] = "const USE_BLAS64 = false"
Contributor

This will set it to use 32 bit integers even when reverting to normal sysimage? Anyway, I can fix it up.

Contributor

Bump. @KristofferC Not sure I understand this - but it would be nice to get this PR merged.

@carstenbauer
Member

carstenbauer commented Mar 13, 2020

Is it easy to include a flag in the build for the individual user to choose?

It's probably fine to simply make it the default. But I think it's worth mentioning in the README.md (or somewhere else?).

@aminya
Contributor

aminya commented May 26, 2020

Some time ago I did some comparison between BLAS 64 and 32, and the performance gain of BLAS 64 was noticeable. I think BLAS 32 should not be the default behavior, but only be enabled by an option.

@ViralBShah
Contributor

Some time ago I did some comparison between BLAS 64 and 32, and the performance gain of BLAS 64 was noticeable. I think BLAS 32 should not be the default behavior, but only be enabled by an option.

It's a bit odd. Can you share the benchmarks?

@ViralBShah
Contributor

ViralBShah commented May 30, 2020

Even if there is a performance issue, we probably still need to ship 32-bit as the default, given the number of crash complaints we receive. We should file the performance issue with Intel.

@ViralBShah
Contributor

What is holding this PR from being merged?

@ViralBShah ViralBShah changed the title use_blas64 false to fix MKL.jl interaction numpy problem Set use_blas64 to false to fix MKL.jl interaction numpy problem May 30, 2020
@ViralBShah
Contributor

@stevengj If we merge this - I assume it will solve all the issues we are having on the python side.

@stevengj
Member

Probably.

@aminya
Contributor

aminya commented May 30, 2020

@ViralBShah I created an issue to remind myself to do some benchmarks
#50

If switching to the 64-bit version were a matter of a special flag or option, that would definitely be easy enough for everyone (like me) who does not use Python and wants the 64-bit version.

@fgerick
Contributor Author

fgerick commented Jun 13, 2020

I have added the option to build MKL.jl with the 64-bit version of MKL by setting ENV["USE_BLAS64"] = true before building. Also, the install script now reverts the alterations it made during installation. I added a short note to the README, but maybe someone wants to say something about performance issues...
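A hypothetical sketch of how such an opt-in could look in the install script (variable names and the patched lines are assumed for illustration, not the exact install.jl code):

```julia
# Illustrative sketch: read the opt-in flag from the environment and rewrite
# the USE_BLAS64 line of the sysimage source accordingly.
use_blas64 = lowercase(get(ENV, "USE_BLAS64", "false")) == "true"

# Stand-in for the lines of build_h.jl being patched:
lines = ["const libblas_name = \"libmkl_rt\"", "const USE_BLAS64 = true"]

useblas64_idx = findfirst(l -> startswith(l, "const USE_BLAS64"), lines)
if useblas64_idx !== nothing
    lines[useblas64_idx] = "const USE_BLAS64 = $(use_blas64)"
end
```

The default stays false (the numpy-compatible LP64 build), and only users who explicitly export USE_BLAS64=true get the ILP64 interface.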

@fgerick
Contributor Author

fgerick commented Jun 13, 2020

I guess once this is merged, the comment in the README about "frequently encountered compatibility issues" will already be outdated...

@fgerick
Contributor Author

fgerick commented Jul 6, 2020

Is there anything left to be done here? I'd be happy to see this move on and make MKL with Julia more end-user friendly. I'm still avoiding MKL because of Arpack.jl, but I'm hoping that if MKL.jl has fewer struggles, steps will be taken there as well.

@ViralBShah
Contributor

We need @andreasnoack to look at this - whenever he has a moment.

@ViralBShah
Contributor

@KristofferC Do you understand this well enough to know if it will work right? It would be nice to use MKL as 32-bit by default so as to not cause trouble with python.

@andreasnoack andreasnoack reopened this Oct 12, 2020
@ViralBShah
Contributor

Failing on nightly.

@andreasnoack
Member

@KristofferC Is the failure on nightly known?

@ViralBShah
Contributor

https://travis-ci.org/github/JuliaComputing/MKL.jl/jobs/735062657#L262

ERROR: ArgumentError: Package Artifacts not found in current path:
- Run `import Pkg; Pkg.add("Artifacts")` to install the Artifacts package.

@andreasnoack
Member

I suppose it's an issue with PackageCompiler on 1.6? I'll merge, as tests are passing on 1.5.

@andreasnoack andreasnoack merged commit 84ae4d4 into JuliaLinearAlgebra:master Oct 13, 2020
@KristofferC
Contributor

KristofferC commented Oct 13, 2020

How will this work with MKLSparse? Will we lose the ability to do Int64 sparse matmuls? I'm not really sure how this interacts with the sparse MKL part.

@andreasnoack
Member

How will this work with MKLSparse? Will we lose the ability to do Int64 sparse matmuls?

@fgerick Would you be able to test this?

@KristofferC
Contributor

It seems MKLSparse uses BlasInt at compile time: https://github.com/JuliaSparse/MKLSparse.jl/blob/6060cd08a2586b275c95aac650c379ff48130ef7/src/BLAS/level_2_3/generator.jl#L36

So I guess that means it would be restricted to Int32 (and silently fall back to the native Julia implementation for Int64)?
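A simplified sketch of that dispatch behavior (assumed method names, not MKLSparse's actual code): methods generated against BlasInt only match Int32-indexed sparse matrices when USE_BLAS64 is false, so Julia's default Int64-indexed sparse matrices silently fall through to the generic method:

```julia
# Illustrative sketch of BlasInt-restricted dispatch, not MKLSparse's code.
using SparseArrays

const BlasInt = Int32                 # what USE_BLAS64 = false implies

# Hypothetical names: one method typed against BlasInt, one generic fallback.
spmul(A::SparseMatrixCSC{Float64,BlasInt}, x) = "MKL path"
spmul(A::SparseMatrixCSC, x) = "generic Julia path"

A64 = sprand(4, 4, 0.5)                            # default Int64 indices
A32 = convert(SparseMatrixCSC{Float64,Int32}, A64) # opt in by converting

spmul(A64, ones(4))                   # -> "generic Julia path"
spmul(A32, ones(4))                   # -> "MKL path"
```

Converting the index type to Int32 is one way a user could still reach the MKL-backed methods under this setup.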

@ViralBShah
Contributor

Yes, presumably. I think that is also the behaviour we need by default, otherwise it leads to all those crashes in matplotlib.

@KristofferC
Contributor

KristofferC commented Oct 13, 2020

Yeah, all I'm saying is that right now, if someone installs this, it most likely effectively turns off MKLSparse, since everyone is using the default Int64 for their sparse matrices. And I guess many people that use MKLSparse use MKL (and vice versa).

@ViralBShah
Contributor

Oh I didn't quite understand that. Good to know. Is there a way to opt into the Int64 version?

7 participants