Yanmi01/mini-SabiYarn
Pretraining a foundational model after SabiYarn

This repository mimics SabiYarn, a large language model pre-trained on major Nigerian languages: Pidgin, Hausa, Igbo, Yoruba, and English. SabiYarn uses the GPT-J architecture and is pre-trained on a larger dataset.

By contrast, the model in this repository uses the GPT-2 architecture and is pre-trained on smaller Hausa, Yoruba, Igbo, and English datasets.
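One step such a multilingual pretraining setup needs is combining the per-language datasets into a single shuffled corpus before tokenization. The repository's actual data pipeline is not shown here; the following is a minimal sketch under that assumption, with hypothetical placeholder documents for each language.

```python
import random

def mix_corpora(corpora: dict[str, list[str]], seed: int = 0) -> list[str]:
    """Flatten per-language document lists into one corpus and shuffle it,
    so no single language dominates a contiguous stretch of training data."""
    rng = random.Random(seed)  # fixed seed for a reproducible shuffle
    mixed = [doc for docs in corpora.values() for doc in docs]
    rng.shuffle(mixed)
    return mixed

# Tiny illustrative corpora (hypothetical sentences, not the repo's data).
corpora = {
    "hausa": ["Ina kwana?"],
    "yoruba": ["Bawo ni?"],
    "igbo": ["Kedu?"],
    "english": ["How are you?"],
}
mixed = mix_corpora(corpora)
print(len(mixed))  # 4 documents, one per language here
```

In a real pipeline the lists would hold far more documents, and one might also weight languages by sampling probability rather than simple concatenation, to balance unequal corpus sizes.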
