New README for FastAI #14

Merged: 6 commits, Dec 10, 2020

README.md: 16 changes (11 additions & 5 deletions)
@@ -5,10 +5,16 @@

![Logo](https://raw.githubusercontent.com/opus111/FastAI.jl/master/fastai-julia-logo.png)

This code is inspired by [fastai](https://github.com/fastai/fastai/blob/master/fastai/), but differs in implementation in several ways. Most importantly, the original Python code makes heavy use of side-effects where the `Learner` holds different state variables, and other objects access and modify them.
FastAI.jl is inspired by [fastai](https://github.com/fastai/fastai/blob/master/fastai/), and is a repository of best practices for deep learning with Flux. Its goal is to enable the creation of state-of-the-art models while freeing the developer from implementing most of the sub-components. FastAI.jl allows you to design, train, and deliver models that compete with the best in class, using only a few lines of code.

This has been replaced by a more functional design. The state is now transmitted via arguments to `Callbacks` which may then pass them on to `Metrics`.
FastAI.jl contains thorough documentation, examples, and tutorials, but does not contain the source of the core components. It is an umbrella package combining the functionality of specialized packages (a rough sketch of how some of these pieces fit together follows the list). These packages include:
- [Flux.jl](https://github.com/FluxML/Flux.jl): 100% pure-Julia Deep Learning stack. Provides lightweight abstractions on top of Julia's core GPU and AD support.
- [FluxTraining.jl](https://github.com/lorenzoh/FluxTraining.jl): Easily customizable training loops, a large library of useful metrics, and many useful utilities (such as logging).
- [DataLoaders.jl](https://github.com/lorenzoh/DataLoaders.jl): Multi-threaded data loading built on MLDataPattern.jl (similar to PyTorch's `DataLoader`).
- [MLDataPattern.jl](https://github.com/JuliaML/MLDataPattern.jl): Utility package for subsetting, partitioning, iterating, and resampling of Machine Learning datasets.
- [MLDatasets.jl](https://github.com/JuliaML/MLDatasets.jl): A community effort to provide a common interface for accessing common Machine Learning (ML) datasets.
- [DataAugmentation.jl](https://github.com/lorenzoh/DataAugmentation.jl): Utilities for augmenting image data.
- [Metalhead.jl](https://github.com/FluxML/Metalhead.jl): Computer vision models for Flux.
- [Transformers.jl](https://github.com/chengchingwen/Transformers.jl): NLP transformer-based models for Flux.
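
To give a feel for the boilerplate these packages are meant to absorb, here is a rough, hypothetical sketch of an MNIST training script written against plain Flux.jl and MLDatasets.jl only. The model, hyperparameters, and hand-written loop are illustrative assumptions, not code from this package; in FastAI.jl the manual minibatching and training loop below would be handled by components such as DataLoaders.jl and FluxTraining.jl.

```julia
# Hypothetical sketch: a hand-rolled MNIST classifier using only Flux.jl and
# MLDatasets.jl. The manual minibatching and training loop below is the kind of
# boilerplate the umbrella packages (DataLoaders.jl, FluxTraining.jl) take over.
using Flux, MLDatasets
using Flux: onehotbatch, logitcrossentropy

# Load the MNIST training split (28x28 images and integer labels 0-9).
xtrain, ytrain = MLDatasets.MNIST.traindata(Float32)
xtrain = reshape(xtrain, 28 * 28, :)      # flatten each image into a 784-vector
ytrain = onehotbatch(ytrain, 0:9)         # one-hot encode the labels

# A small multi-layer perceptron built with Flux.jl.
model = Chain(Dense(28 * 28, 128, relu), Dense(128, 10))
loss(x, y) = logitcrossentropy(model(x), y)

# Hand-written minibatch training loop.
opt = ADAM()
ps = Flux.params(model)
for epoch in 1:5
    for idx in Iterators.partition(1:size(xtrain, 2), 128)
        x, y = xtrain[:, idx], ytrain[:, idx]
        gs = Flux.gradient(() -> loss(x, y), ps)
        Flux.update!(opt, ps, gs)
    end
end
```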

*Note*: this is a package in development. Expect breaking changes for the foreseeable future, but we want you to test out the package by following the documentation. Any contributions are welcome via PRs/issues.

Much of the documentation has been copied from the original Python, and modified where appropriate.
*Note*: this is a package in development. Expect major breaking changes for the foreseeable future, but we are very interested in meeting the needs of the community, so all comments and contributions are welcome via PRs/issues.