
fill = pipeline('fill-mask', model='tamil_bert', tokenizer='tamil_bert') #14

Open
apkbala107 opened this issue Dec 12, 2021 · 1 comment


@apkbala107


```
ValueError                                Traceback (most recent call last)
/tmp/ipykernel_3356/3905977959.py in <module>
----> 1 fill = pipeline('fill-mask', model='tamil_bert', tokenizer='tamil_bert')

~/.local/lib/python3.8/site-packages/transformers/pipelines/__init__.py in pipeline(task, model, config, tokenizer, feature_extractor, framework, revision, use_fast, use_auth_token, model_kwargs, **kwargs)
    452         # Will load the correct model if possible
    453         model_classes = {"tf": targeted_task["tf"], "pt": targeted_task["pt"]}
--> 454         framework, model = infer_framework_load_model(
    455             model,
    456             model_classes=model_classes,

~/.local/lib/python3.8/site-packages/transformers/pipelines/base.py in infer_framework_load_model(model, config, model_classes, task, framework, **model_kwargs)
    156
    157     if isinstance(model, str):
--> 158         raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
    159
    160     framework = "tf" if model.__class__.__name__.startswith("TF") else "pt"

ValueError: Could not load model tamil_bert with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForMaskedLM'>,).
```

@apkbala107 (Author)

I am getting the above error while using my model. Could you help me with that?
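This error usually means that `transformers` could not find a loadable checkpoint named `tamil_bert`: neither a directory by that name saved with `save_pretrained()` relative to the working directory, nor a repo with that id on the Hugging Face Hub. A minimal sketch of a sanity check before calling `pipeline()` (the helper name `looks_like_checkpoint` is my own, not part of the library):

```python
import os

def looks_like_checkpoint(path):
    """Rough check: a directory written by save_pretrained() always
    contains a config.json alongside the weights file."""
    return os.path.isdir(path) and os.path.isfile(os.path.join(path, "config.json"))

# If this is False, pipeline() treats 'tamil_bert' as a Hub repo id
# and raises the ValueError above when no such repo can be loaded:
#
# if looks_like_checkpoint("tamil_bert"):
#     fill = pipeline("fill-mask", model="tamil_bert", tokenizer="tamil_bert")
```

Note also that the error message lists only `AutoModelForMaskedLM` (the PyTorch class), which suggests TensorFlow is not installed in this environment; if the checkpoint was saved in TensorFlow format only, installing the matching backend (or re-saving the model in PyTorch format) would be another thing to check.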
