Hi,
I am trying to finetune Falcon with only the int8 engine, but it fails with the error below. Any idea?
I am also unable to fine-tune the base Falcon model; that run fails with a CUDA out-of-memory error.
Details:
Machine: g5.48xlarge EC2 instance (8 GPUs, 22 GB each)
xTuring: 0.1.5
Torch: 2.0.1+cu117
Code:

from xturing.datasets.instruction_dataset import InstructionDataset
from xturing.models import BaseModel
import os

# Load the Alpaca instruction dataset
instruction_dataset = InstructionDataset("alpaca_data")
# Initialize the int8 Falcon model
model = BaseModel.create("falcon_int8")
# Finetune the model on the instruction dataset
model.finetune(dataset=instruction_dataset)
# Save the finetuned weights
model.save("falcon_weights_int8")

error:

RuntimeError: DistributedDataParallel is not needed when a module doesn't have any parameter that requires a gradient.
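For reference, here is a minimal sketch of the same script pinned to a single GPU. It rests on the assumption that setting CUDA_VISIBLE_DEVICES before the model is created keeps the trainer on one device and away from the DistributedDataParallel wrapper; this is unverified as a workaround for the error above.

# Unverified workaround sketch: hide all but one GPU before torch/xturing
# touch CUDA, so the trainer has no multi-GPU setup to wrap in DDP.
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # assumption: a single-GPU run avoids the DDP path

from xturing.datasets.instruction_dataset import InstructionDataset
from xturing.models import BaseModel

instruction_dataset = InstructionDataset("alpaca_data")
model = BaseModel.create("falcon_int8")
model.finetune(dataset=instruction_dataset)
model.save("falcon_weights_int8")

Even if this sidesteps the multi-GPU path, the question raised by the error message itself, namely why the falcon_int8 engine ends up with no parameter that requires a gradient, would still remain.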