How to customise the model path for model_1200000 and model-00001-of-00002? #707

Open · 4 tasks done
sparrowtraveler opened this issue Jan 10, 2025 · 4 comments
Labels: question (Further information is requested)

Comments

sparrowtraveler commented Jan 10, 2025
Checks

  • This template is only for questions, not feature requests or bug reports.
  • I have thoroughly reviewed the project documentation and read the related paper(s).
  • I have searched existing issues, including closed ones, and found no similar questions.
  • I confirm that I am using English to submit this report in order to facilitate communication.

Question details

[screenshot of the downloaded model files]
How do I customise the model paths for model_1200000 and model-00001-of-00002 in F5-TTS? Do I need to modify the infer_gradio.py file? Can you tell me the exact way to modify the path code?

sparrowtraveler added the question label on Jan 10, 2025
SWivid (Owner) commented Jan 10, 2025
> How do I customise the model paths for model_1200000 and model-00001-of-00002 in F5-TTS? Do I need to modify the infer_gradio.py file? Can you tell me the exact way to modify the path code?

what do you mean by model-00001-of-00002 here?

sparrowtraveler (Author) commented:
It is from the infer_gradio.py flow: this model was downloaded when I first started the voice chat. What I mean is that I want to move the F5-TTS models to another disk to reduce the storage used on the system disk. How can I customise the model path? Do I modify the infer_gradio.py file? Can you give me more details?

SWivid (Owner) commented Jan 10, 2025

ckpt_path=str(...)

It is a str, so you can just replace it with your local path.
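
For example (a hedged sketch, not verbatim from the repo: the default in infer_gradio.py resolves the checkpoint through the Hugging Face cache, and the Windows path below is hypothetical):

# Default (roughly): checkpoint resolved via the Hugging Face cache, e.g.
# ckpt_path = str(cached_path("hf://SWivid/F5-TTS/F5TTS_Base/model_1200000.safetensors"))

# After moving the file to another disk, point ckpt_path straight at it:
ckpt_path = "D:/models/F5-TTS/model_1200000.safetensors"  # hypothetical local path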

model-00001-of-00002 is the chat model (look into the .from_pretrained() call below to see how you could load it from a local path instead):

if not USING_SPACES:
    load_chat_model_btn = gr.Button("Load Chat Model", variant="primary")
    chat_interface_container = gr.Column(visible=False)

    @gpu_decorator
    def load_chat_model():
        global chat_model_state, chat_tokenizer_state
        if chat_model_state is None:
            show_info = gr.Info
            show_info("Loading chat model...")
            model_name = "Qwen/Qwen2.5-3B-Instruct"
            chat_model_state = AutoModelForCausalLM.from_pretrained(
                model_name, torch_dtype="auto", device_map="auto"
            )
            chat_tokenizer_state = AutoTokenizer.from_pretrained(model_name)
            show_info("Chat model loaded.")
        return gr.update(visible=False), gr.update(visible=True)

    load_chat_model_btn.click(load_chat_model, outputs=[load_chat_model_btn, chat_interface_container])
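
To keep the chat model shards (model-00001-of-00002.safetensors, model-00002-of-00002.safetensors) off the system disk, one option is a sketch like the following, assuming you have already downloaded Qwen2.5-3B-Instruct to a folder of your choice (the directory path is hypothetical):

from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical folder on another drive containing the downloaded model files
model_name = "D:/models/Qwen2.5-3B-Instruct"

# from_pretrained() accepts a local directory in place of a hub model id
chat_model_state = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype="auto", device_map="auto"
)
chat_tokenizer_state = AutoTokenizer.from_pretrained(model_name)

Alternatively, setting the HF_HOME environment variable before launching relocates the whole Hugging Face cache (and thus future downloads) to another drive.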

sparrowtraveler (Author) commented:
> It is a str, so you can just replace it with your local path.

Thanks. I'll try it.
