Add support for remote models (OpenAI, ...) #5
Comments
It would be great if Ava supported a remote OpenAI-compatible API. This would allow us to reuse the server and avoid loading the model multiple times when we are using it in a different app.
It would be really great if Ava supported using an already running Ollama instance via its API!
Yes, this is in the works, but not finished yet.
Just a small update: the UI part has been rewritten. What's missing is an `/api/chat/completions` endpoint which will just wrap what we currently do client-side; then, if we are using a remote endpoint, we can simply proxy to it.
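The wrap-or-proxy idea above could be sketched roughly as follows. This is only an illustration, not Ava's actual code: the `BackendConfig` shape and `resolveCompletionsUrl` helper are hypothetical, and it assumes the remote server exposes an OpenAI-compatible `/v1/chat/completions` route (as Ollama and the OpenAI API do).

```typescript
// Hypothetical config: where chat-completion requests should be served from.
interface BackendConfig {
  // Base URL of a remote OpenAI-compatible server (e.g. a running Ollama
  // instance). When undefined, the local in-process model handles requests.
  remoteUrl?: string;
}

// Decide the target URL for a chat-completion request: either our own
// local wrapper endpoint, or the remote server we proxy to unchanged.
function resolveCompletionsUrl(cfg: BackendConfig): string {
  if (cfg.remoteUrl) {
    // Remote case: forward the request body as-is to the remote server's
    // OpenAI-compatible chat-completions route.
    return new URL("/v1/chat/completions", cfg.remoteUrl).toString();
  }
  // Local case: the app's own endpoint wraps the client-side logic.
  return "/api/chat/completions";
}
```

With a helper like this, the server handler only needs one code path: it resolves the URL from config and forwards the request, so the rest of the app never needs to know whether the model is local or remote.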