Fix mamba integration by making it a variant of outlines.models.transformers #32

Draft

lapp0 wants to merge 12 commits into transformers-use-logits-processor from fix-mamba-integration

Conversation

@lapp0 (Owner) commented Jun 14, 2024

Fixes dottxt-ai#808

Rendered docs: https://github.com/lapp0/outlines/blob/fix-mamba-integration/docs/reference/models/transformers.md#alternative-model-classes

Problem

  • Per the above issue, Mamba doesn't use logits processors; it uses SequenceGenerator. It should instead use SequenceGeneratorAdapter and have logits processors manage the automata (see the sketch after this list).
  • models.mamba doesn't work at all in main.
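
For context, here is a minimal, generic sketch of what a logits-processor-driven approach looks like with `transformers`. It uses a hand-rolled allow-list rather than outlines' actual automata, so the class and attribute names are illustrative assumptions, not code from this PR:

```python
import torch
import transformers

class MaskingLogitsProcessor(transformers.LogitsProcessor):
    """Illustrative only: mask out tokens that the automaton disallows."""

    def __init__(self, allowed_token_ids):
        # In outlines, the allowed token ids would be produced by the
        # structured-generation automaton at each step.
        self.allowed_token_ids = allowed_token_ids

    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor) -> torch.FloatTensor:
        mask = torch.full_like(scores, float("-inf"))
        mask[:, self.allowed_token_ids] = 0
        return scores + mask

# Such a processor is passed to model.generate(...) via
# logits_processor=transformers.LogitsProcessorList([...]).
```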

Solution

Update models.transformers to accept a model_class argument, allowing model types beyond AutoModelForCausalLM. Make models.mamba simply a variant of models.transformers that passes model_class=transformers.MambaForCausalLM. This leaves models.mamba requiring nearly zero maintenance.
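
A minimal sketch of how models.mamba could be expressed under this approach; the wrapper below and the checkpoint name are assumptions for illustration, not the PR's exact code:

```python
import transformers
from outlines import models

def mamba(model_name: str, **kwargs):
    # Sketch only: forward everything to models.transformers and just swap
    # in the Mamba model class, as described above.
    return models.transformers(
        model_name,
        model_class=transformers.MambaForCausalLM,
        **kwargs,
    )

# Example usage with a hypothetical checkpoint name.
model = mamba("state-spaces/mamba-130m-hf")
```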

Unrelated work

Additionally, Zach on Discord requested T5-based structured generation. I tested it with model_class=transformers.AutoModelForSeq2SeqLM and it works with no additional changes. The only changes I made for this are adding a model_t5 fixture to the models tested in test_generate.py and documenting it in docs/reference/models/transformers.md.
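
A brief usage sketch of the Seq2Seq path described above; the checkpoint name and regex pattern are illustrative assumptions, not taken from the PR:

```python
import transformers
from outlines import generate, models

# Sketch: load a T5 model through the same model_class hook and run
# structured generation on top of it.
model = models.transformers(
    "google/flan-t5-small",
    model_class=transformers.AutoModelForSeq2SeqLM,
)
generator = generate.regex(model, r"(yes|no)")
print(generator("Is water wet? Answer yes or no:"))
```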

@lapp0 force-pushed the fix-mamba-integration branch 3 times, most recently from 84ea1eb to 99813cf on June 14, 2024 22:07
@lapp0 force-pushed the transformers-use-logits-processor branch 17 times, most recently from 3123f3a to 9b513af on June 16, 2024 21:50
@lapp0 force-pushed the fix-mamba-integration branch 2 times, most recently from 783acf6 to 68ea867 on June 17, 2024 03:35
@lapp0 force-pushed the transformers-use-logits-processor branch 8 times, most recently from 872b9c6 to 5ce23f7 on June 17, 2024 18:10
@lapp0 force-pushed the transformers-use-logits-processor branch 7 times, most recently from 32319df to 7d43bbd on July 3, 2024 14:42
@lapp0 marked this pull request as draft on July 15, 2024 09:06
@lapp0 force-pushed the fix-mamba-integration branch 7 times, most recently from 60449d4 to 75dc370 on July 15, 2024 23:14