Inference server down when trying to load a model for a Code Generation Application #1557
I thought the problem was that I was running two services on the machine. But even after stopping the custom service, I cannot get the inference server for the code generator app to come up. I think this is the problem:
jeffmaury added a commit to jeffmaury/ai-lab that referenced this issue on Aug 16, 2024: Fixes containers#1557 Signed-off-by: Jeff MAURY <jmaury@redhat.com>
jeffmaury added a commit to jeffmaury/ai-lab that referenced this issue on Aug 16, 2024: …1558) Fixes containers#1557 Signed-off-by: Jeff MAURY <jmaury@redhat.com>
jeffmaury added a commit that referenced this issue on Aug 16, 2024
Verified with version 1.2.3 on Mac OS 14 M2 with a libkrun machine with GPU enabled...
With just CPU, it also worked.
Bug description
Inference server went down while trying to load a model.
Even while the inference server is not running, everything looks fine in the running app view of AI Lab. The app shows green; it should at least show red/orange to indicate that the app actually won't work. Loading the app at its url:port does not load either.
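To confirm the mismatch between the green status and the dead endpoint, one way is to probe the app's HTTP port directly rather than trusting the status indicator. This is a minimal, hypothetical sketch (not part of AI Lab itself) assuming the app is published on a known `url:port`; any HTTP response, even an error status, counts as "up", while a refused connection or timeout counts as "down":

```python
import urllib.request
import urllib.error


def is_endpoint_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP request to `url` gets any response at all."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server answered with an error status (404, 500, ...),
        # so the process is reachable and listening.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: not reachable.
        return False
```

For the symptom described above, `is_endpoint_up("http://localhost:PORT/")` returning False while the UI still shows green would demonstrate the stale status indicator.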
Operating system
Mac OS 14 M2
Installation Method
from ghcr.io/containers/podman-desktop-extension-ai-lab container image
Version
next (development version)
Steps to reproduce
Actual result: Code generator app's inference server went down
Relevant log output
No response
Additional context
No response