[DOCS][BUG] Cannot run the examples following vllm-integration-v0.2.md and vllm-integration.md #124
@alogfans Hi, would you like to help me run vllm + mooncake? Any help will be appreciated!

The Dockerfile is contributed by the community. We will provide a Docker image to make deployment more convenient in the near future.
I noticed the two vllm node logs you provided. It seems there is a problem during the initialization of Transfer Engine (the output is prior to the section you provided). You can use …
@alogfans Thanks for your reply! I have verified that the two nodes are connectable. The key issue is: why does the producer not record the metadata key into etcd?
According to the logs, both the prefill and the decode node tried to get segment "192.168.1.208:13003" (i.e. the decode machine in the json file). The decode node retrieves segment information locally, without needing etcd, so the problem is likely that the decode node failed to create its segment (transport init failed, memory allocation failed, ...). You can try printing verbose logs on the decode node.
You have to install mooncake manually by following the build doc. The package is not on PyPI yet.
Help wanted!
I tried to run the vllm and mooncake examples following vllm-integration-v0.2.md and vllm-integration.md, but I failed. I suspect something is wrong both on my side and in the documents. Here is what I did and what I encountered.

1. Running the vllm image with the mooncake kv_connector config fails: `mooncake_vllm_adaptor` is missing. I ran the vllm Docker image from Docker Hub with some mooncake kv_connector config, but there is no `mooncake_vllm_adaptor` in the vllm container. Knowing I should install it, I tried `pip install mooncake_vllm_adaptor`, but that failed too.
2. I had to construct a Docker image that can run vllm with the mooncake connector and `mooncake_vllm_adaptor`. I searched Docker Hub hoping to find an image maintained by the mooncake community, but could not find one. So I built an image from Mooncake/Dockerfile, but it contains no vllm and lacks some necessary dependencies. I could not build vllm from source because of those missing dependencies, but I could install the latest vllm with `pip install vllm` plus whatever dependencies I discovered on each attempt. In any case, after many attempts I now have a container that contains both mooncake and vllm.

3. Cannot run the examples successfully following vllm-integration-v0.2.md and vllm-integration.md.
Following vllm-integration-v0.2.md in the Docker container, I started the first node, the second node, and proxy_server.py (copied from vllm-integration.md) on the first node, with a proxy_server.py process on each node. The only etcd-related log lines are `mooncake/ram`, `EtcdStoragePlugin: set: key=`, and `EtcdStoragePlugin: unable to set`. My guess is that none of the vllm servers asked `EtcdStoragePlugin` to create the key `mooncake/ram/192.168.1.208:13003`.
4. Failed to run following vllm-integration.md in the Docker container: mooncake is not used at all. I set the VLLM_HOST_IP, VLLM_HOST_PORT, MASTER_ADDR, and MASTER_PORT environment variable values (in fact I don't understand these variables well). I cannot find "mooncake" in any logs, so I guess setting the given VLLM_DISTRIBUTED_KV_ROLE is not enough to enable mooncake.
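For reference, a sketch of the environment variables named above with placeholder values. These are guesses at intent, not verified working settings; the actual values, ports, and role names must be taken from vllm-integration.md:

```shell
# Placeholder values only -- consult vllm-integration.md for the
# authoritative settings. All addresses/ports below are assumptions.
export VLLM_HOST_IP=192.168.1.208        # this node's IP (from the thread)
export VLLM_HOST_PORT=13003              # port appearing in the segment key above
export MASTER_ADDR=192.168.1.208         # master node address (placeholder)
export MASTER_PORT=54321                 # any free port (placeholder)
export VLLM_DISTRIBUTED_KV_ROLE=producer # role differs per node (e.g. consumer)
```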