
use llm-foundry for ICL metrics #287

Merged: 8 commits, Jul 23, 2024
open_lm/utils/llm_foundry_wrapper.py (13 changes: 6 additions & 7 deletions)
@@ -4,14 +4,14 @@
 """Implements a Hugging Causal LM wrapped inside a :class:`.ComposerModel`."""
 
 from typing import Mapping, Union
 
-from composer.metrics.nlp import (
+from llmfoundry.eval.metrics.nlp import (
     InContextLearningLMAccuracy,
     InContextLearningLMExpectedCalibrationError,
     InContextLearningMCExpectedCalibrationError,
     InContextLearningMultipleChoiceAccuracy,
-    InContextLearningQAAccuracy,
-    InContextLearningCodeEvalAccuracy,
+    InContextLearningGenerationExactMatchAccuracy,
+)
+from composer.metrics.nlp import (
     LanguageCrossEntropy,
     LanguagePerplexity,
 )
@@ -33,10 +33,9 @@
     LanguagePerplexity(),
     InContextLearningLMAccuracy(),
     InContextLearningMultipleChoiceAccuracy(),
-    InContextLearningQAAccuracy(),
+    InContextLearningGenerationExactMatchAccuracy(),
     InContextLearningLMExpectedCalibrationError(),
     InContextLearningMCExpectedCalibrationError(),
-    InContextLearningCodeEvalAccuracy(),
 ]


@@ -51,4 +50,4 @@ def __init__(self, model, tokenizer):
         )
 
     def generate(self, input_ids=None, inputs_embeds=None, **kwargs):
-        return super().generate(input_ids=input_ids, **kwargs)
\ No newline at end of file
+        return super().generate(input_ids=input_ids, **kwargs)
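For orientation, here is a minimal sketch of what the import block and the evaluation-metric list look like once this change is applied, reassembled from the diff above. The module-level name `eval_metrics` and the indentation are assumptions for illustration; only the class names and their source modules are taken from the diff itself.

# Minimal sketch, reassembled from the diff above. The variable name
# `eval_metrics` is illustrative; the imports and metric classes are
# taken from the change itself.
from llmfoundry.eval.metrics.nlp import (
    InContextLearningLMAccuracy,
    InContextLearningLMExpectedCalibrationError,
    InContextLearningMCExpectedCalibrationError,
    InContextLearningMultipleChoiceAccuracy,
    InContextLearningGenerationExactMatchAccuracy,
)
from composer.metrics.nlp import (
    LanguageCrossEntropy,
    LanguagePerplexity,
)

# ICL metrics now come from llm-foundry; the plain language-modeling metrics
# stay in Composer. InContextLearningQAAccuracy is replaced by the llm-foundry
# InContextLearningGenerationExactMatchAccuracy, and
# InContextLearningCodeEvalAccuracy is dropped from the metric list.
eval_metrics = [
    LanguageCrossEntropy(),
    LanguagePerplexity(),
    InContextLearningLMAccuracy(),
    InContextLearningMultipleChoiceAccuracy(),
    InContextLearningGenerationExactMatchAccuracy(),
    InContextLearningLMExpectedCalibrationError(),
    InContextLearningMCExpectedCalibrationError(),
]

Running this sketch requires an llm-foundry release that ships llmfoundry.eval.metrics.nlp (the post-migration layout); on older versions these ICL metric classes still live in composer.metrics.nlp.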