
Add high level llama-cpp-python logits processor #540

Closed
dtiarks opened this issue Jan 15, 2024 · 2 comments · Fixed by #556
Labels: enhancement, transformers (Linked to the `transformers` integration)

Comments

dtiarks (Contributor) commented Jan 15, 2024

Implement an outlines-based logits processor for llama-cpp-python, following the interface described in the docs: https://llama-cpp-python.readthedocs.io/en/latest/api-reference/#llama_cpp.LogitsProcessor
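For illustration, the llama-cpp-python `LogitsProcessor` interface is just a callable taking `(input_ids, scores)` arrays and returning adjusted scores. A minimal sketch of a processor in that shape (a hypothetical example that masks all but a fixed set of token ids, not the actual outlines implementation) could look like:

```python
import numpy as np


class AllowedTokensLogitsProcessor:
    """Sketch of a processor matching llama-cpp-python's
    LogitsProcessor signature:
        __call__(input_ids, scores) -> scores
    It sets the logit of every token outside `allowed_token_ids`
    to -inf, so sampling can only pick allowed tokens.
    (Illustrative only; an outlines-based processor would derive
    the allowed set from the FSM state instead of a fixed list.)
    """

    def __init__(self, allowed_token_ids):
        self.allowed = np.array(sorted(allowed_token_ids), dtype=np.intc)

    def __call__(self, input_ids, scores):
        # Build an additive mask: 0 for allowed tokens, -inf otherwise.
        mask = np.full_like(scores, -np.inf)
        mask[self.allowed] = 0.0
        return scores + mask
```

Such a processor would presumably be passed to generation via the `logits_processor` argument of `Llama.__call__` / `create_completion`.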

rlouf added the enhancement and transformers labels Jan 16, 2024
rlouf (Member) commented Jan 16, 2024

This would be a much better way to integrate with llama.cpp since it would require less maintenance on our end.

dtiarks (Contributor, Author) commented Jan 16, 2024

Yeah, I guess this is also true for transformers and others (👀 TensorRT-LLM).
