I'm using the same code as the notebook, but with a cache_dir passed to the LlamaForCausalLM.from_pretrained call:
train_config = TrainConfig(model_name="mlfoundations/tabula-8b", context_length=8192)
tokenizer_config = TokenizerConfig()
serializer_config = SerializerConfig()
# Load the configuration
config = AutoConfig.from_pretrained(train_config.model_name)
# Set the torch_dtype to bfloat16 which matches TabuLa train/eval setup
config.torch_dtype = torch.bfloat16
# Device setup
self.device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")
# Load model and tokenizer
self.model = LlamaForCausalLM.from_pretrained(
    train_config.model_name, device_map="auto", config=config, cache_dir=MODELS_PATH
).to(self.device)
self.tokenizer = AutoTokenizer.from_pretrained(train_config.model_name, cache_dir=MODELS_PATH)
self.serializer = get_serializer(serializer_config)
# Prepare tokenizer
self.tokenizer, self.model = prepare_tokenizer(
    self.model,
    tokenizer=self.tokenizer,
    pretrained_model_name_or_path=train_config.model_name,
    model_max_length=train_config.context_length,
    use_fast_tokenizer=tokenizer_config.use_fast_tokenizer,
    serializer_tokens_embed_fn=tokenizer_config.serializer_tokens_embed_fn,
    serializer_tokens=self.serializer.special_tokens
    if tokenizer_config.add_serializer_tokens
    else None,
)
# Initialize inference model
self.inference_model = InferenceModel(model=self.model, tokenizer=self.tokenizer, serializer=self.serializer)
This results in an error on the call to:
self.model = LlamaForCausalLM.from_pretrained(
    train_config.model_name, device_map="auto", config=config, cache_dir=MODELS_PATH
).to(self.device)
resource_tracker: There appear to be 1 leaked semaphore objects to clean up at shutdown
The error happens after the model download finishes, and then the program crashes.
I'm running the code on a Mac with the MPS backend.
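For comparison, here is a minimal sketch of loading the model onto MPS without device_map="auto", in case combining device_map="auto" with the explicit .to(self.device) is related to the crash. This is an assumption, not a confirmed fix, and MODELS_PATH is a placeholder for the local cache directory used above:

import torch
from transformers import AutoTokenizer, LlamaForCausalLM

MODELS_PATH = "./models"  # placeholder cache directory, matching the snippet above

device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# Load in bfloat16 and move the whole model to MPS explicitly, without device_map="auto"
model = LlamaForCausalLM.from_pretrained(
    "mlfoundations/tabula-8b",
    torch_dtype=torch.bfloat16,
    cache_dir=MODELS_PATH,
).to(device)
tokenizer = AutoTokenizer.from_pretrained("mlfoundations/tabula-8b", cache_dir=MODELS_PATH)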