mihirhbhatt/LLM_SELECTOR


LLM_Selector

Auto-select the best LLM for your use case or query.

Clone this repo to your local drive to get started.

This code selects the best model for your specific question.

FLOW +>

Step 1: User Asks a Question

Step 2: A local LLM checks the user input to find the best model for that particular query

Step 3: The user query is sent to the chosen model

Step 4: The chosen LLM runs and returns the output
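The steps above can be sketched in plain Python. Note this is a simplified illustration, not the repo's actual code: the repo uses a local LLM for step 2, while here a keyword heuristic stands in for it, and the model names are hypothetical examples.

```python
# Simplified sketch of the selection flow above.
# NOTE: the real repo uses a local LLM to route queries (step 2);
# a keyword heuristic stands in for it here, and the model names
# ("codellama", "wizard-math", "llama3") are illustrative assumptions.

def select_model(question: str) -> str:
    """Pick an Ollama model name based on the question's topic (step 2)."""
    q = question.lower()
    if any(k in q for k in ("code", "python", "bug", "function")):
        return "codellama"    # code-oriented model
    if any(k in q for k in ("math", "solve", "equation")):
        return "wizard-math"  # math-oriented model
    return "llama3"           # general-purpose default

def answer(question: str) -> str:
    model = select_model(question)  # step 2: choose a model
    # Steps 3-4 would forward the query to the chosen model, e.g. via
    # LangChain's Ollama integration:
    #   from langchain_community.llms import Ollama
    #   return Ollama(model=model).invoke(question)
    return f"[{model}] would answer: {question}"

print(answer("Write a Python function to reverse a list"))
```

The real scripts replace both the heuristic router and the stubbed answer with calls to locally running Ollama models.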

CODE EXPLANATION

main.py +> Basic implementation of LangChain for Ollama

main_working.py +> Gives a terminal experience (this is the full code you can run in your terminal using "python main_working.py")

main_stream.py +> Gives a web UI experience using Streamlit (to run this code, type "streamlit run main_stream.py" in your terminal)

requirements.txt +> Install these requirements before running any of the code above (run "pip install -r requirements.txt" in your terminal)

REQUIREMENTS:

  1. You need to have Ollama running on your system
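Since every script depends on a local Ollama server, a quick way to check it is up is to hit Ollama's HTTP API (by default served at http://localhost:11434; the /api/tags route lists installed models). This helper is an illustrative addition, not part of the repo:

```python
# Quick check that a local Ollama server is reachable.
# NOTE: this helper is not part of the repo; it assumes Ollama's
# default endpoint (http://localhost:11434) and its /api/tags route.
import urllib.request
import urllib.error

def ollama_is_running(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    if ollama_is_running():
        print("Ollama is up; you can run the scripts above.")
    else:
        print("Start Ollama first (e.g. run `ollama serve`).")
```

Running this before `python main_working.py` or `streamlit run main_stream.py` saves a confusing connection error later.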
