
Quantized model in int8 or int4? #8

Open
nne998 opened this issue Jan 24, 2024 · 1 comment

nne998 commented Jan 24, 2024

Hi,

Could you please provide a quantized model in int8 or int4, so that it can be served on a 16GB GPU?

Thanks.
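
Until an official quantized checkpoint ships, one common interim workaround is quantizing at load time with bitsandbytes. A minimal sketch, assuming the released weights are a standard Hugging Face causal LM checkpoint; the model ID below is a placeholder, not this project's actual checkpoint:

```python
# Sketch: load fp16 weights with on-the-fly 4-bit quantization.
# Requires: pip install transformers accelerate bitsandbytes
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "your-org/your-model"  # hypothetical; substitute this repo's released checkpoint

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # int4 weights; use load_in_8bit=True for int8 instead
    bnb_4bit_compute_dtype=torch.float16,  # run matmuls in fp16
    bnb_4bit_quant_type="nf4",             # NormalFloat4 quantization
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers on available GPU memory automatically
)
```

Loading in 4-bit typically cuts weight memory to roughly a quarter of fp16, which is what makes a 16GB card plausible for models that would otherwise need ~30GB+ in half precision.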

nne998 changed the title from "quantized model in int8 or int4?" to "Quantized model in int8 or int4?" on Jan 24, 2024
waxnkw (Collaborator) commented Jan 25, 2024

Sorry, I am rushing to meet a deadline at the moment. I will consider adding one later.
