From 2c995dadc68bfd88c74dbcbc93b69595531c549c Mon Sep 17 00:00:00 2001
From: perpendicularai
Date: Tue, 13 Aug 2024 19:53:24 -0700
Subject: [PATCH] Updated README

---
 README.md | 16 ++++++++++++++++
 1 file changed, 16 insertions(+)

diff --git a/README.md b/README.md
index 2e568c5c8..9d756839a 100644
--- a/README.md
+++ b/README.md
@@ -21,6 +21,7 @@ This package provides:
   - [LangChain compatibility](https://python.langchain.com/docs/integrations/llms/llamacpp)
   - [LlamaIndex compatibility](https://docs.llamaindex.ai/en/stable/examples/llm/llama_2_llama_cpp.html)
   - Chat with kernel memory
+  - LlamaCpp Python UI
 - OpenAI compatible web server
   - [Local Copilot replacement](https://llama-cpp-python.readthedocs.io/en/latest/server/#code-completion)
   - [Function Calling support](https://llama-cpp-python.readthedocs.io/en/latest/server/#function-calling)
@@ -825,6 +826,21 @@ Check out the [examples folder](examples/low_level_api) for more examples of usi
 Documentation is available via [https://llama-cpp-python.readthedocs.io/](https://llama-cpp-python.readthedocs.io/).
 If you find any issues with the documentation, please open an issue or submit a PR.
 
+## UIs
+### SeKernel_for_LLM_UI
+A UI for the SeKernel_for_LLM module.
+
+#### How to
+- Clone the repo.
+- Ensure that you have llama-cpp-python installed and running.
+- Add your model to the `kernel.py` script.
+- Launch the UI by running `python sekernel_ui.py`.
+  - Please note: only internet-connected chat is supported. If you have the skills, you can check out the `plugins.py` module to add more functionality to your UI.
+
+#### Demo
+https://github.com/user-attachments/assets/a6e75136-bd3f-4960-8791-6f83094f2123
+
+
 ## Development
 
 This package is under active development and I welcome any contributions.
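
For context on the "Add your model to the `kernel.py` script" step above: the sketch below shows one plausible shape for that hook, assuming the UI loads the model through llama-cpp-python's standard `Llama` class. The layout of `kernel.py` itself is an assumption and the model path and settings are placeholders; only `Llama(...)` and `create_chat_completion(...)` are the actual llama-cpp-python API.

```python
# Hypothetical sketch only: the real kernel.py in SeKernel_for_LLM_UI may
# differ. Llama and create_chat_completion() are the standard
# llama-cpp-python API; the model path and settings are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/your-model.gguf",  # point this at your local GGUF model
    n_ctx=4096,                             # context window size
    n_gpu_layers=-1,                        # offload all layers to GPU if available
)

def chat(messages: list[dict]) -> str:
    """Run one chat turn and return the assistant's reply text."""
    response = llm.create_chat_completion(messages=messages)
    return response["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat([{"role": "user", "content": "Hello!"}]))
```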