BUG about API base URL #86
I was having a similar issue with Ollama and came here looking for a solution. It turned out I hadn't set up Cross-Origin Resource Sharing (CORS) per the BMO chatbot's GitHub documentation: CORS Setup For All OS.
I'm having this problem on Linux, but the CORS setup seems to be tailored to macOS specifically. Are you both on Mac?

Edit: I made some progress, as editing this line in ollama.service allows me to populate models now: [Service] However, whenever I try to type in the BMO chat, it fails: Ollama sends back a 400 Bad Request. I'm looking at the HTTP traffic in Wireshark...
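For anyone else on Linux, the usual way to apply such a change is a systemd drop-in for the Ollama unit. This is a sketch, assuming a systemd-managed install; `OLLAMA_ORIGINS` and `OLLAMA_HOST` are Ollama's documented environment variables, but the permissive values below are illustrative and should be tightened for real deployments:

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# Sketch of a drop-in override for a systemd-managed Ollama service.
[Service]
# Allow cross-origin requests (e.g. from the Obsidian app); "*" is permissive.
Environment="OLLAMA_ORIGINS=*"
# Listen on all interfaces instead of only 127.0.0.1.
Environment="OLLAMA_HOST=0.0.0.0"
```

After saving the file, run `systemctl daemon-reload` and `systemctl restart ollama` so the new environment takes effect.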
@skloeckner did you find a solution? I am also running Open WebUI on a local server and trying to figure out how to connect both services.

EDIT: nvm, just expose the Ollama API outside the container stack.
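For a containerized setup like the one described above, exposing the Ollama API typically means publishing its port from the compose stack. A minimal sketch, assuming the stock `ollama/ollama` image and the default port 11434 (service name and origins value are examples):

```yaml
# Sketch: publish the Ollama API from a compose stack so external
# clients (e.g. the Obsidian plugin) can reach it on the host.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # expose the Ollama API on the host
    environment:
      - OLLAMA_ORIGINS=*     # allow cross-origin requests; tighten as needed
```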
I wanted to bring to your attention a couple of issues I've encountered while using your service:
The other model service providers I use have their own API base URLs.
When I input my API base URL, the system fails to detect the model. It would be greatly appreciated if you could implement an option to manually enter the model selection.
Additionally, when I manually fill in the details following the documentation, certain UI elements disappear, forcing me to reinstall the plugin.
Thank you for considering these improvements.
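As a quick sanity check for the model-detection problem above, it can help to verify the API base URL outside the plugin first. A sketch, assuming the default Ollama address and a hypothetical OpenAI-compatible provider URL (both are examples; substitute your own base URL and key):

```shell
# Ollama lists its installed models at /api/tags
# (http://localhost:11434 is Ollama's default address).
curl -s http://localhost:11434/api/tags

# An OpenAI-compatible provider exposes its model list at /v1/models instead;
# the host and $API_KEY here are placeholders.
curl -s -H "Authorization: Bearer $API_KEY" https://example-provider.com/v1/models
```

If these requests fail or return an error, the plugin has no way to detect models either, which points at the base URL or CORS configuration rather than the plugin itself.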