refactor: Update get_llm function to accept optional metadata parameter
The get_llm function in ava_mosaic_ai/__init__.py now accepts an optional metadata parameter, which is forwarded to the LLMFactory constructor. This gives callers more flexibility to customize the LLMFactory when it is initialized.
pyrotank41 committed Sep 28, 2024
1 parent 29c3eb6 commit 714e518
Showing 1 changed file with 3 additions and 2 deletions.
ava_mosaic_ai/__init__.py (3 additions & 2 deletions)
```diff
@@ -1,5 +1,6 @@
+from typing import Optional
 from ava_mosaic_ai.llm_factory import LLMFactory
 
 
-def get_llm(provider: str) -> LLMFactory:
-    return LLMFactory(provider)
+def get_llm(provider: str, metadata: Optional[dict]) -> LLMFactory:
+    return LLMFactory(provider, metadata=metadata)
```
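A minimal sketch of how the updated signature can be used. The LLMFactory stand-in below is hypothetical (only for illustration; the real class lives in the ava_mosaic_ai package), and it assumes a default of None so the metadata parameter is genuinely optional, which also keeps existing get_llm(provider) call sites working:

```python
from typing import Optional


class LLMFactory:
    """Hypothetical stand-in for ava_mosaic_ai.llm_factory.LLMFactory."""

    def __init__(self, provider: str, metadata: Optional[dict] = None):
        self.provider = provider
        # Normalize missing metadata to an empty dict for convenience.
        self.metadata = metadata or {}


def get_llm(provider: str, metadata: Optional[dict] = None) -> LLMFactory:
    # metadata, when supplied, is forwarded to the LLMFactory constructor.
    return LLMFactory(provider, metadata=metadata)


llm = get_llm("openai", metadata={"request_id": "abc-123"})
print(llm.provider, llm.metadata)
```

Giving metadata a None default means the keyword stays purely opt-in: callers that never used it see no change, while callers that need per-instance context can attach it at creation time.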
