πŸ”¨ Scaffold for refactor #2340

Conversation

ashwinvaidya17 (Collaborator)

πŸ“ Description

  • LLaVA runs locally through Ollama (see the sketch below)
    [screenshot attached]
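
For context, a minimal sketch of how LLaVA might be queried locally through Ollama's Python client (illustrative only, not part of this PR; it assumes the ollama package is installed, the llava model has been pulled, and the prompt and image path are made up):

import ollama  # assumes a running local Ollama server

# Ask LLaVA whether the image looks anomalous (hypothetical prompt).
response = ollama.chat(
    model="llava",
    messages=[
        {
            "role": "user",
            "content": "Does this image contain a defect? Answer YES or NO.",
            "images": ["path/to/test_image.png"],  # illustrative path
        },
    ],
)
print(response["message"]["content"])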

✨ Changes

Select what type of change your PR is:

  • 🐞 Bug fix (non-breaking change which fixes an issue)
  • πŸ”¨ Refactor (non-breaking change which refactors the code base)
  • πŸš€ New feature (non-breaking change which adds functionality)
  • πŸ’₯ Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • πŸ“š Documentation update
  • πŸ”’ Security update

βœ… Checklist

Before you submit your pull request, please make sure you have completed the following steps:

  • πŸ“‹ I have summarized my changes in the CHANGELOG and followed the guidelines for my type of change (skip for minor changes, documentation updates, and test enhancements).
  • πŸ“š I have made the necessary updates to the documentation (if applicable).
  • πŸ§ͺ I have written tests that support my changes and prove that my fix is effective or my feature works (if applicable).

For more information about code review checklists, see the Code Review Checklist.

Signed-off-by: Ashwin Vaidya <ashwinnitinvaidya@gmail.com>

samet-akcay (Contributor) left a comment:

Only two comments. Looking clean.

class Vlm(AnomalyModule):
    """Visual anomaly model."""

    def __init__(self, backend: VLMBackend = VLMBackend.OLLAMA, api_key: str | None = None, k_shot: int = 3) -> None:

samet-akcay (Contributor):

Quite minor, but it would be good if backend could be a str as well.
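
A minimal sketch of how that could look (illustrative only; it assumes VLMBackend is a str-valued enum whose values match the strings users would pass, e.g. "ollama"):

def __init__(self, backend: VLMBackend | str = VLMBackend.OLLAMA, api_key: str | None = None, k_shot: int = 3) -> None:
    # Coerce plain strings like "ollama" to the enum; VLMBackend(...)
    # raises ValueError for values that are not enum members.
    self.backend = VLMBackend(backend) if isinstance(backend, str) else backend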

ashwinvaidya17 (Collaborator, Author):

I was thinking of an alternative: instead of choosing the LLM provider, users select the model, and the right provider is picked based on it.
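
A rough sketch of that idea (the mapping, the model names, and the OPENAI member are hypothetical, not part of this PR):

# Hypothetical model-name -> provider mapping.
MODEL_TO_BACKEND = {
    "llava": VLMBackend.OLLAMA,
    "gpt-4o": VLMBackend.OPENAI,  # assumed enum member, for illustration
}

def resolve_backend(model_name: str) -> VLMBackend:
    """Pick the provider that serves the requested model."""
    try:
        return MODEL_TO_BACKEND[model_name]
    except KeyError:
        msg = f"Unsupported model: {model_name}"
        raise ValueError(msg) from None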

@@ -24,6 +24,7 @@
 from .rkde import Rkde
 from .stfpm import Stfpm
 from .uflow import Uflow
+from .vlm import Vlm

samet-akcay (Contributor):

Maybe VlmAd, as Vlm might sound quite generic?

Signed-off-by: Ashwin Vaidya <ashwinnitinvaidya@gmail.com>
@ashwinvaidya17 ashwinvaidya17 merged commit 660acf1 into openvinotoolkit:feature/vlm_anomaly Oct 4, 2024
4 of 5 checks passed