
Add support for custom models #1

Open
transitive-bullshit opened this issue Nov 15, 2023 · 2 comments
Labels
enhancement New feature or request good first issue Good for newcomers help wanted Extra attention is needed

Comments

@transitive-bullshit
Owner

transitive-bullshit commented Nov 15, 2023

Currently, the model is hard-coded to use the OpenAI chat completion API, but it wouldn't be very difficult to support custom LLMs or external model providers.

The only real constraint is that custom models need to support function calling (and ideally parallel tool calling) using OpenAI's tool_calls format.

Will consider implementing this depending on how much love this issue gets.
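For reference, an assistant message in OpenAI's tool_calls format looks roughly like this. The type shapes below are a minimal sketch of the Chat Completions response; the call ID and function name are illustrative only:

```typescript
// Sketch of an assistant message using OpenAI's tool_calls format.
// Note that `arguments` is a JSON-encoded string, not an object.
type ToolCall = {
  id: string
  type: 'function'
  function: { name: string; arguments: string }
}

type AssistantMessage = {
  role: 'assistant'
  content: string | null
  tool_calls?: ToolCall[]
}

// Parallel tool calling means the model may return several entries in
// tool_calls at once; a custom model would need to produce this shape.
const message: AssistantMessage = {
  role: 'assistant',
  content: null,
  tool_calls: [
    {
      id: 'call_abc123', // illustrative ID
      type: 'function',
      function: { name: 'get_weather', arguments: '{"location":"Dublin"}' }
    }
  ]
}
```

Any adapter for a custom model would translate that model's native function-call output into this message shape.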

@transitive-bullshit transitive-bullshit added enhancement New feature or request help wanted Extra attention is needed good first issue Good for newcomers labels Nov 15, 2023
@ChuckJonas

Seems like this could be done using one of the function-calling models on Hugging Face.

Most of the work would revolve around converting the openai.chat.completions payload into a prompt that the model likes.

Looking at the example notebook, the format seems to be something like this:

<s> <FUNCTIONS>{
    "function": "search_bing",
    "description": "Search the web for content on Bing. This allows users to search online/the internet/the web for content.",
    "arguments": [
        {
            "name": "query",
            "type": "string",
            "description": "The search query string"
        }
    ]
}</FUNCTIONS>

[INST] Search bing for the tallest mountain in Ireland [/INST]
</s>

I was able to run this using a Hugging Face endpoint with Llama-2-7b-chat-hf-function-calling-v2, and it seemed to work OK.
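The conversion could be sketched like this. Everything here is an assumption: the type shapes and the helper name `toLlamaFunctionPrompt` are made up for illustration, and only the latest user message is handled (multi-turn history would need alternating [INST] ... [/INST] blocks):

```typescript
// Sketch: convert an OpenAI-style functions + messages payload into the
// <FUNCTIONS>...[INST]...[/INST] prompt format shown above. Type shapes
// and helper name are hypothetical, not from any library.
type FunctionSpec = {
  function: string
  description: string
  arguments: Array<{ name: string; type: string; description: string }>
}

type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string }

function toLlamaFunctionPrompt(
  functions: FunctionSpec[],
  messages: ChatMessage[]
): string {
  // Serialize each function spec as pretty-printed JSON, as in the notebook.
  const fns = functions.map((f) => JSON.stringify(f, null, 4)).join('\n')
  // Only the most recent user message is inlined here.
  const lastUser = messages.filter((m) => m.role === 'user').at(-1)
  return `<s> <FUNCTIONS>${fns}</FUNCTIONS>\n\n[INST] ${
    lastUser?.content ?? ''
  } [/INST]\n</s>`
}
```

With the search_bing spec and the Bing query from the example above, this emits roughly the prompt shown earlier.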

@praneybehl

Possibly add support for Mistral API? Ref: https://docs.mistral.ai/platform/overview/
