Actually, I've created a new conda environment named maskllm and cloned this repo. Inside MaskLLM, I executed the scripts shown in the screenshot below:
How can I verify whether Megatron was installed properly or not?
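One quick check (a minimal sketch; the package names `megatron` and `megatron.core` are assumed from Megatron-LM's usual layout) is to ask Python whether the packages resolve from the current environment:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` resolves to an importable module."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. `megatron`) is itself missing.
        return False

# Package names assumed from Megatron-LM's layout:
for mod in ("megatron", "megatron.core"):
    print(mod, "OK" if module_available(mod) else "NOT FOUND")
```

If both report OK, the checkout is importable from that environment; if not, actually importing the package will usually raise an error naming the missing dependency.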
I also tried to install Megatron without Docker, but it's a bit complicated since there are other dependencies like transformer_engine. I think using the official pre-built Docker image can solve this problem.
I've successfully completed the steps up to downloading the Llama models in HF format by running the script below:
And I got this folder structure:
Then I ran the command below to convert the HF checkpoint to Megatron format:
Unfortunately, I'm getting the errors below:
I've tried adding the Megatron path to my environment variables, but it didn't work. Could you please help me fix this issue?
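For what it's worth, an environment variable exported in one shell doesn't always reach the Python process that runs the conversion. One workaround (a sketch, assuming Megatron-LM is cloned at `~/Megatron-LM`; adjust the path to your clone) is to prepend the checkout to `sys.path` before any `megatron` import:

```python
import os
import sys

# Hypothetical location of the Megatron-LM checkout; adjust to your clone.
MEGATRON_DIR = os.path.expanduser("~/Megatron-LM")

# Prepend so this checkout takes priority over any other installed copy.
if MEGATRON_DIR not in sys.path:
    sys.path.insert(0, MEGATRON_DIR)

print(sys.path[0])  # should show the Megatron-LM path
```

The equivalent shell-side fix is to set `PYTHONPATH` to the checkout directory in the same shell that launches the conversion command, so the variable is inherited by that Python process.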