
Plan to merge three similar packages #36

Closed
yuehhua opened this issue Feb 14, 2022 · 14 comments · Fixed by #38

Comments

@yuehhua
Collaborator

yuehhua commented Feb 14, 2022

Since this project is mature, is there any plan to move to FluxML?

@foldfelis
Contributor

Yes, sure, is there anything I need to do before I transfer the project?

@yuehhua
Collaborator Author

yuehhua commented Feb 14, 2022

@DhairyaLGandhi Could you help with this?

@foldfelis
Contributor

Hi @DhairyaLGandhi, sorry for the late reply. I don't know why I didn't receive the notification of the FluxML invitation.
I am currently discussing the future plan with @ChrisRackauckas. Since both NeuralOperators.jl and OperatorLearning.jl provide solutions to the same problem, it might be a good idea to combine our efforts.

@ChrisRackauckas
Member

There's a third one too: https://github.com/CliMA/OperatorFlux.jl @bischtob @charleskawczynski, and pulling in @pzimbrod. There's nothing wrong with having multiple repos of course, but I think finding the most useful ideas of each repo and folding them together to get top-notch performance would be more useful in the end.

Note that one of the plans I have for where OperatorLearning can go relates to its use of small neural networks in many cases: we have a fully pre-cached ML library (right now called SimpleChains, being made public fairly soon) that is extremely fast for this small-NN case by being fully non-allocating and using LoopVectorization directly. At least for the DeepONet parts, that can be very useful.

I hope to get a GSoC project and some Julia Lab students going on this as well.
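
(For illustration only, not from this thread: a minimal sketch of the small-network case described above, written against the `SimpleChain`/`TurboDense`/`static`/`init_params` API that SimpleChains.jl later shipped with, which may differ from the pre-release version mentioned here. The trunk-net role, layer sizes, and variable names are assumptions, not code from any of the three packages.)

```julia
# Hedged sketch: a small MLP of the kind a DeepONet trunk net might use,
# built with SimpleChains.jl. Sizes and names are illustrative assumptions.
using SimpleChains

trunk = SimpleChain(
    static(1),                 # statically-known input dimension (one query coordinate)
    TurboDense(tanh, 32),
    TurboDense(tanh, 32),
    TurboDense(identity, 16),  # 16 outputs to pair with a branch net
)

p = SimpleChains.init_params(trunk)  # flat parameter vector
x = rand(Float32, 1, 64)             # a batch of 64 query points (columns are samples)
y = trunk(x, p)                      # forward evaluation
```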

@bischtob

I am down to work together on one repo, @ChrisRackauckas! It would be fantastic to collaborate.

@pzimbrod
Member

I would definitely chime in on the idea of combining our efforts. I think there are still enough features and optimizations yet to be implemented, and I'd love to see that field grow further in Julia, so why not work on it together 🙂

@yuehhua
Collaborator Author

yuehhua commented Feb 18, 2022

@ChrisRackauckas I am also interested in contributing to this work. There is also ongoing work on a multi-wavelet operator. My direction is bringing neural operators, GNNs, and geometric deep learning together. If there is anything I can help with, just let me know.

@ChrisRackauckas
Member

Should we have an "internal conference" demoing the 3 packages to try to figure out the differences? The best time I can think of would be 6pm EST (midnight in Germany, 7am in Taiwan, 3pm on the west coast; I think that might be the only time where no one is in the dreaded 1am-5am range?). 15 minutes each, no slides or anything, just demo the package and figure out what to do? Monday the 21st would work for me. Otherwise I think we'll end up in a bit of a deadlock.

@yuehhua
Collaborator Author

yuehhua commented Feb 18, 2022

The 21st at 6pm EST? That would be the 22nd at 7am in Taiwan. It works for me.

yuehhua changed the title from "Plan to move to FluxML" to "Plan to merge three similar packages" on Feb 18, 2022
@foldfelis
Contributor

Works for me as well.

@pzimbrod
Member

That would be okay for me as well

@bischtob

Monday 21st works for me.

@bischtob

Where should we meet? Zoom?

@ChrisRackauckas
Member

Invites went out if I have your email. Otherwise just message me your email to get you on there.
