
Error when testing the LLM #22

Closed
pdalaya opened this issue Jan 14, 2024 · 9 comments · Fixed by #27
Comments


pdalaya commented Jan 14, 2024

Hello,
The server loads fine and starts. However, when I make a request, I get an error.

llama.cpp/ggml.c:9067: assert(!isnan(x)) failed (cosmoaddr2line /C/Users/userName/Documents/Personal Projects/Flickr/Assets/StreamingAssets/llamafile-server.exe 4c54f9 4e3982 4e7379 5c9afc 5e7b93)
UnityEngine.Debug:LogError (object)
LLMUnity.LLM:DebugLog (string,bool) (at ./Library/PackageCache/ai.undream.llmunity@bb974c6551/Runtime/LLM.cs:125)
LLMUnity.LLM:DebugLogError (string) (at ./Library/PackageCache/ai.undream.llmunity@bb974c6551/Runtime/LLM.cs:131)
LLMUnity.LLMUnitySetup/<>c__DisplayClass0_0:<CreateProcess>b__1 (object,System.Diagnostics.DataReceivedEventArgs) (at ./Library/PackageCache/ai.undream.llmunity@bb974c6551/Runtime/LLMUnitySetup.cs:41)
System.Threading._ThreadPoolWaitCallback:PerformWaitCallback ()

[screenshot of the error attached]

Do you know what the issue is? I tested with the samples.

@amakropoulos (Collaborator)

Thanks for submitting the issue 👏!
This would be related to the llamafile library that we are using for the LLM server.
What OS are you using?
Could you send me the command that is printed in the Debug Log? It starts with Server command:.


pdalaya commented Jan 15, 2024

Hello,
I am on Windows.
I tried again this morning, but now I get another error as soon as I hit Play:

Win32Exception: ApplicationName='C:/Users/userName/Documents/Personal Projects/Flickr/Assets/StreamingAssets/llamafile-server.exe', CommandLine=' --port 13333 -m "C:/Users/userName/Documents/Personal Projects/Flickr/Assets/StreamingAssets/mistral-7b-instruct-v0.2.Q4_K_M.gguf" -c 512 -b 512 --log-disable --nobrowser -np 1', CurrentDirectory='', Native error= The system cannot find the file specified.

System.Diagnostics.Process.StartWithCreateProcess (System.Diagnostics.ProcessStartInfo startInfo) (at <d054a9182977441aa432503a474315ba>:0)
System.Diagnostics.Process.Start () (at <d054a9182977441aa432503a474315ba>:0)
(wrapper remoting-invoke-with-check) System.Diagnostics.Process.Start()
LLMUnity.LLMUnitySetup.CreateProcess (System.String command, System.String commandArgs, LLMUnity.Callback`1[T] outputCallback, LLMUnity.Callback`1[T] errorCallback, System.Collections.Generic.List`1[T] environment, System.Boolean redirectOutput, System.Boolean redirectError) (at ./Library/PackageCache/ai.undream.llmunity@bb974c6551/Runtime/LLMUnitySetup.cs:42)
LLMUnity.LLM.StartLLMServer () (at ./Library/PackageCache/ai.undream.llmunity@bb974c6551/Runtime/LLM.cs:186)
LLMUnity.LLM.OnEnable () (at ./Library/PackageCache/ai.undream.llmunity@bb974c6551/Runtime/LLM.cs:109)
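
For context, this Win32Exception comes from System.Diagnostics.Process being unable to locate the executable named in ApplicationName. Below is a minimal standalone sketch, not the LLMUnity source, using an illustrative path together with the arguments from the log above, to show how the same failure mode appears:

```csharp
// Minimal sketch (not the LLMUnity implementation): Process.Start raises
// Win32Exception ("The system cannot find the file specified") when the
// executable given in FileName does not exist. Path and arguments are illustrative.
using System;
using System.ComponentModel;
using System.Diagnostics;

class LaunchSketch
{
    static void Main()
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "Assets/StreamingAssets/llamafile-server.exe",  // missing file -> Win32Exception
            Arguments = "--port 13333 -m \"mistral-7b-instruct-v0.2.Q4_K_M.gguf\" -c 512 -b 512 --log-disable --nobrowser -np 1",
            UseShellExecute = false,
            RedirectStandardOutput = true,
            RedirectStandardError = true
        };

        try
        {
            using (var process = Process.Start(startInfo))
            {
                process.WaitForExit();
            }
        }
        catch (Win32Exception e)
        {
            // This is the error shown in the trace above.
            Console.Error.WriteLine("Could not start llamafile-server: " + e.Message);
        }
    }
}
```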

@amakropoulos (Collaborator)

This means that LLMUnity can't find the llamafile-server.exe.
Have you changed the StreamingAssets folder?
If you close the project and open it again, it should automatically re-download the required files.
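
To confirm whether the binary is actually there, a quick sketch (not part of LLMUnity; the script name is hypothetical, but "llamafile-server.exe" is the file name from the error above) that logs the result from inside Unity:

```csharp
// Sketch: attach to any GameObject to check whether the server binary is present
// in StreamingAssets. "CheckServerBinary" is a hypothetical helper name.
using System.IO;
using UnityEngine;

public class CheckServerBinary : MonoBehaviour
{
    void Start()
    {
        string serverPath = Path.Combine(Application.streamingAssetsPath, "llamafile-server.exe");
        if (File.Exists(serverPath))
            Debug.Log("Server binary found: " + serverPath);
        else
            Debug.LogError("Server binary missing: " + serverPath +
                           " (antivirus quarantine or a modified StreamingAssets folder are common causes)");
    }
}
```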


pdalaya commented Jan 15, 2024

Oh yeah, you are right. Actually my antivirus is deleting the server.
Where can I download it manually?

@amakropoulos (Collaborator)

You can download it from here into the Assets/StreamingAssets folder and rename it to llamafile-server.exe.

@amakropoulos (Collaborator)

Regarding the issue:
Could you download the latest version of llamafile from here into the Assets/StreamingAssets folder, rename it to llamafile-server.exe, and let me know if it works for you?
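
If the antivirus keeps removing the file, one option is to copy the manually downloaded binary into StreamingAssets from a small editor-side script. This is a sketch under the assumption that the release has already been downloaded somewhere on disk; the helper name and source path are placeholders, not part of LLMUnity:

```csharp
// Sketch: copy a manually downloaded llamafile binary into Assets/StreamingAssets
// under the name LLMUnity expects. "InstallLlamafileServer" and the source path
// are placeholders for illustration.
using System.IO;
using UnityEngine;

public static class InstallLlamafileServer
{
    public static void Install(string downloadedFile)
    {
        string targetDir = Application.streamingAssetsPath;   // Assets/StreamingAssets in the editor
        Directory.CreateDirectory(targetDir);
        string target = Path.Combine(targetDir, "llamafile-server.exe");
        File.Copy(downloadedFile, target, true);
        Debug.Log("Installed " + downloadedFile + " as " + target);
    }
}

// Example usage with a placeholder path:
// InstallLlamafileServer.Install(@"C:\Users\userName\Downloads\llamafile-server-download");
```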


pdalaya commented Jan 16, 2024

Thanks, it works! Amazing, can't wait to mess around with this. Awesome!

@amakropoulos (Collaborator)

Perfect! This will be merged with the upcoming release 🙂

@amakropoulos (Collaborator)

@pdalaya this is now merged in v1.0.2!
