
Support custom openAI endpoints for AI selfhosters #256

Open
1 task done
DirtWolfPunk opened this issue Sep 25, 2024 · 1 comment

@DirtWolfPunk

🔖 Feature description

Add configuration support for custom OpenAI-compatible endpoints, so that tools like Ollama or LocalAI can be used instead of the proprietary OpenAI API.

🎤 Why is this feature needed?

I prefer to self-host when possible and have been very keen on local-only AI tools.

✌️ How do you aim to achieve this?

I want this feature to add a new configuration variable for the OpenAI API endpoint.
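As a rough sketch of what such a configuration variable could look like: the snippet below resolves the chat-completions URL from a hypothetical `OPENAI_BASE_URL` environment variable, falling back to the official API. The variable name and defaults are assumptions for illustration, not part of this project's actual configuration.

```python
import os

def resolve_openai_endpoint(path: str = "/chat/completions") -> str:
    """Build the full endpoint URL from a configurable base.

    OPENAI_BASE_URL is a hypothetical config variable. It could point at an
    OpenAI-compatible server such as Ollama (http://localhost:11434/v1) or
    LocalAI (http://localhost:8080/v1); it defaults to the official API.
    """
    base = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
    # Normalize so a trailing slash in the configured base doesn't double up.
    return base.rstrip("/") + path
```

Because both Ollama and LocalAI expose OpenAI-compatible `/v1` routes, swapping the base URL like this is typically all a client needs to target a self-hosted backend.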

🔄️ Additional Information

No response

👀 Have you spent some time to check if this feature request has been raised before?

  • I checked and didn't find a similar issue

Are you willing to submit PR?

None

@DirtWolfPunk DirtWolfPunk added the type: feature-request New feature or request label Sep 25, 2024
@nevo-david
Contributor

Not possible at the moment with CopilotKit.
Tagging @ataibarkai and @arielweinberger for this :)
