BUG about API base URL #86

Open
Aexuss opened this issue May 29, 2024 · 3 comments

Comments

@Aexuss

Aexuss commented May 29, 2024

I wanted to bring to your attention a couple of issues I've encountered while using your service:
The other model service providers I use have their own API base URLs.
When I enter my API base URL, the system fails to detect the model. It would be greatly appreciated if you could add an option to enter the model name manually.
(screenshot attached)

Additionally, when I manually fill in the details following the documentation, certain UI elements disappear, forcing me to reinstall the plugin.
(two screenshots attached)
Thank you for considering these improvements.

@Vhugomald

I was having a similar issue with Ollama and came here trying to find a solution. It turned out I hadn't set up Cross-Origin Resource Sharing (CORS) per the BMO chatbot's GitHub documentation: CORS Setup For All OS
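
For what it's worth, a rough sketch of what that setup boils down to on macOS (the app://obsidian.md* origin matches what the plugin docs and the Linux fix below use; exact values may differ for your install):

launchctl setenv OLLAMA_ORIGINS "app://obsidian.md*"
# then quit and relaunch the Ollama app so it picks up the new environment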

@skloeckner

skloeckner commented Jun 13, 2024

I'm having this problem on Linux, but the CORS setup seems to be tailored to or focused on macOS specifically. Are you both on a Mac?

Edit: I made some progress; adding this line to ollama.service lets the models populate now:

[Service]
Environment="OLLAMA_ORIGINS=app://obsidian.md*"
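
Roughly, applying that on a systemd-based distro looks like this (assuming Ollama was installed as the default ollama.service unit):

sudo systemctl edit ollama.service   # opens a drop-in override; paste the [Service] block above
sudo systemctl daemon-reload         # pick up the override
sudo systemctl restart ollama        # restart so OLLAMA_ORIGINS takes effect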

However, whenever I try to type in the BMO chat, it fails; Ollama sends back a 400 Bad Request.
I also run Open WebUI, and I'm using the same base URL in BMO.

I'm looking at the HTTP traffic in Wireshark...

@Szymok

Szymok commented Jul 1, 2024

@skloeckner did you find a solution? I'm also running Open WebUI on a local server and trying to figure out how to connect the two services.

EDIT:

Never mind, I just exposed the Ollama API outside the container stack:

ports:
  - ${OLLAMA_WEBAPI_PORT-11434}:11434
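
In context, a minimal compose sketch of that (the service and image names here are assumptions, not necessarily the exact stack from this thread):

services:
  ollama:
    image: ollama/ollama
    ports:
      - ${OLLAMA_WEBAPI_PORT-11434}:11434

After docker compose up -d, BMO's base URL can point at http://<host>:11434 directly instead of going through the Open WebUI container.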
