
🐞 Fix ollama #2519

Merged — 6 commits merged on Jan 20, 2025
2 changes: 1 addition & 1 deletion — pyproject.toml

@@ -58,7 +58,7 @@ core = [
     "open-clip-torch>=2.23.0,<2.26.1",
 ]
 openvino = ["openvino>=2024.0", "nncf>=2.10.0", "onnx>=1.16.0"]
-vlm = ["ollama<0.4.0", "openai", "python-dotenv","transformers"]
+vlm = ["ollama>=0.4.0", "openai", "python-dotenv","transformers"]
 loggers = [
     "comet-ml>=3.31.7",
     "gradio>=4",
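The dependency bump from `ollama<0.4.0` to `ollama>=0.4.0` is the heart of the fix: the 0.4 release of the Python client removed the private `_encode_image` helper and introduced the public `Image` type, so anomalib has to pin to one side of the break. A minimal sketch of a version gate for that boundary (the `is_at_least` helper is hypothetical, not part of anomalib; real projects would usually reach for `packaging.version` instead):

```python
def is_at_least(installed: str, required: tuple[int, int]) -> bool:
    """Compare a 'major.minor.patch' version string against (major, minor).

    Hypothetical helper for illustrating the ollama 0.4 API break.
    """
    major, minor = (int(part) for part in installed.split(".")[:2])
    return (major, minor) >= required


# ollama>=0.4.0 exposes the public `ollama.Image` type used by this PR
assert is_at_least("0.4.0", (0, 4))
# ollama<0.4.0 only shipped the private `ollama._client._encode_image`
assert not is_at_least("0.3.9", (0, 4))
```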
7 changes: 3 additions & 4 deletions — src/anomalib/models/image/vlm_ad/backends/ollama.py

@@ -44,8 +44,7 @@
 from .base import Backend

 if module_available("ollama"):
-    from ollama import chat
-    from ollama._client import _encode_image
+    from ollama import Image, chat
 else:
     chat = None

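The hunk above keeps anomalib's optional-dependency guard: `ollama` is only imported when present, and the new code pulls the public `Image` type instead of the removed private helper. A self-contained sketch of that guard pattern, with `module_available` re-implemented via the standard library (anomalib's own helper lives elsewhere in the package):

```python
from importlib.util import find_spec


def module_available(name: str) -> bool:
    """Minimal stand-in for anomalib's module_available helper."""
    return find_spec(name) is not None


if module_available("ollama"):
    # ollama>=0.4.0: Image is a public, documented type
    from ollama import Image, chat
else:
    # Defer the failure: callers raise a readable ImportError later
    chat = None
```

Setting `chat = None` rather than raising at import time lets the backend report a friendly install hint only when someone actually tries to use it.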
@@ -101,7 +100,7 @@
         Args:
             image (str | Path): Path to the reference image file
         """
-        self._ref_images_encoded.append(_encode_image(image))
+        self._ref_images_encoded.append(Image(value=image))

Codecov warning (codecov/patch): added line src/anomalib/models/image/vlm_ad/backends/ollama.py#L103 was not covered by tests.

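The change above swaps eager encoding (`_encode_image(image)`, a private helper removed in ollama 0.4) for wrapping the path in the public `Image` model, which the client serializes when the request is sent. A runnable sketch of the new append pattern, using a stub `Image` dataclass so the example does not require ollama to be installed (the real `ollama.Image` is a pydantic model with the same `value` field):

```python
from __future__ import annotations

from dataclasses import dataclass, field
from pathlib import Path


@dataclass
class Image:
    """Stub mirroring ollama.Image (ollama>=0.4.0): wraps a path, bytes,
    or base64 string; encoding is deferred to the client library."""

    value: str | bytes | Path


@dataclass
class OllamaBackend:
    """Sketch of the backend's reference-image bookkeeping only."""

    _ref_images_encoded: list[Image] = field(default_factory=list)

    def add_reference_image(self, image: str | Path) -> None:
        # New API: store an Image wrapper instead of a base64 string
        self._ref_images_encoded.append(Image(value=image))


backend = OllamaBackend()
backend.add_reference_image("good_sample.png")
```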
     @property
     def num_reference_images(self) -> int:

@@ -144,7 +143,7 @@
         if not chat:
             msg = "Ollama is not installed. Please install it using `pip install ollama`."
             raise ImportError(msg)
-        image_encoded = _encode_image(image)
+        image_encoded = Image(value=image)

Codecov warning (codecov/patch): added line src/anomalib/models/image/vlm_ad/backends/ollama.py#L146 was not covered by tests.

         messages = []

         # few-shot
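After wrapping the query image, the backend assembles a `messages` list for `chat`, attaching reference images for the few-shot case before the query. A hedged sketch of that assembly (the prompt strings and the `build_messages` helper are illustrative, not anomalib's actual code; the `{"role", "content", "images"}` message shape follows the Ollama chat API):

```python
def build_messages(prompt: str, image, ref_images: list) -> list[dict]:
    """Build an Ollama chat payload; few-shot references go first."""
    messages = []
    if ref_images:  # few-shot: show normal samples before the query
        messages.append({
            "role": "user",
            "content": "These are reference images of normal samples.",
            "images": ref_images,
        })
    messages.append({"role": "user", "content": prompt, "images": [image]})
    return messages


msgs = build_messages("Is this image anomalous?", "query.png", ["ref.png"])
```

With an empty `ref_images` list the same helper degrades cleanly to the zero-shot case: a single user message carrying only the query image.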