Is it possible to connect the plugin to a locally running model (Refact-1.6b, starcoder, etc.) via the oobabooga-webui or koboldcpp API?
If so, how? Or can local models only be used as described here?
I had something else in mind. It doesn't matter whether the model runs locally on GPU or CPU; what matters is that the plugin can work with a local model that is not limited to running in a Docker container under WSL. Why do that when oobabooga already exists and can run local models in a variety of formats? Refact also launches fine in oobabooga, but it's not clear how to connect the plugin to it via the API.
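For anyone investigating this: both backends expose plain HTTP APIs that can be tested independently of the plugin. Below is a minimal sketch of querying them directly. It assumes text-generation-webui was started with `--api` (OpenAI-compatible endpoint on its default port 5000) and koboldcpp is on its default port 5001; these are backend endpoints, not plugin settings, and whether the plugin can be pointed at them is exactly the open question here.

```python
# Minimal sketch: probing the two local backends directly over HTTP.
# Ports and hosts are the defaults; adjust for your setup.
import requests

# oobabooga / text-generation-webui: OpenAI-compatible completions endpoint
# (available when the server is launched with --api)
resp = requests.post(
    "http://127.0.0.1:5000/v1/completions",
    json={"prompt": "def fib(n):", "max_tokens": 64, "temperature": 0.2},
    timeout=60,
)
print(resp.json()["choices"][0]["text"])

# koboldcpp: its native generate endpoint
resp = requests.post(
    "http://127.0.0.1:5001/api/v1/generate",
    json={"prompt": "def fib(n):", "max_length": 64, "temperature": 0.2},
    timeout=60,
)
print(resp.json()["results"][0]["text"])
```

If both of these return completions, the remaining question is purely whether the plugin's endpoint URL can be redirected to one of them.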