This is a slight refactor of TensorFlow's date-conversion-attention example. All the credit goes to the TF team and the people who built the model.
I've just refactored things (in a way that makes more sense to me) while learning the attention model.
- I've reorganized the file structure
- I've dropped the frontend part as I'm only interested in the model
- I'm using Jest instead of Jasmine
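
The model's core building block is dot-product attention. As a dependency-free sketch of the idea (not the TF.js implementation itself), attention scores a query against each key, softmaxes the scores into weights, and returns the weighted sum of the values:

```javascript
// Numerically stable softmax over an array of scores.
function softmax(xs) {
  const max = Math.max(...xs);
  const exps = xs.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// Dot product of two equal-length vectors.
function dot(a, b) {
  return a.reduce((acc, x, i) => acc + x * b[i], 0);
}

// Single-query dot-product attention.
// query: [d], keys: [n][d], values: [n][d] -> attended vector: [d]
function attend(query, keys, values) {
  const scores = keys.map((k) => dot(query, k));
  const weights = softmax(scores);
  // Weighted sum of the value vectors.
  return values[0].map((_, j) =>
    weights.reduce((acc, w, i) => acc + w * values[i][j], 0)
  );
}

// A query aligned with the first key attends almost entirely to the first value.
const out = attend([1, 0], [[10, 0], [0, 10]], [[1, 0], [0, 1]]);
console.log(out); // close to [1, 0]
```

The TF.js example builds this same mechanism out of tensor ops inside an encoder-decoder, but the per-step arithmetic is the one shown above.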
`npm run train`
- Train the model

`npm run test`
- Run unit tests

`npm run flow`
- One execution of the model (with `apply()`) over an actual input
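
These commands map to a `scripts` section in `package.json` along these lines (the entry-point file names here are hypothetical, not the example's actual paths):

```json
{
  "scripts": {
    "train": "node train.js",
    "test": "jest",
    "flow": "node flow.js"
  }
}
```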