
Asyncio Event Loop issue #612

Open

EternalDeiwos opened this issue Jan 20, 2025 · 0 comments

EternalDeiwos commented Jan 20, 2025

I am following this example and attempting to create a Docker container for the service to run in.

I have made a few modifications, specifically using external services for both the LLM and the embedding model. The script works well on my Mac, but when I copy the same script into the container and rebuild what should be an identical environment, I get an error: RuntimeError: this event loop is already running.

I am vaguely aware of existing issues like #51 and #15, and their suggested workaround of using nest_asyncio. I have tried the following snippet, but it didn't help.

import nest_asyncio
nest_asyncio.apply()

Any ideas what could be wrong?
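For what it's worth, the error itself is easy to reproduce outside my setup: run_until_complete() refuses to run on a loop that is already running, which is exactly what happens when a synchronous wrapper is called from inside an async server handler. A minimal sketch (names are illustrative, not from my actual script):

```python
import asyncio

async def inner():
    return "ok"

async def handler():
    # This mirrors what a sync wrapper does: grab the current loop and
    # block on it. Inside an already-running loop (e.g. a uvicorn worker),
    # run_until_complete() raises RuntimeError.
    loop = asyncio.get_event_loop()
    return loop.run_until_complete(inner())

raised = False
try:
    asyncio.run(handler())
except RuntimeError:
    raised = True
```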


requirements.txt
#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
#    pip-compile --strip-extras requirements.in
#
aioboto3==13.3.0
    # via -r requirements.in
aiobotocore==2.16.0
    # via aioboto3
aiofiles==24.1.0
    # via aioboto3
aiohappyeyeballs==2.4.4
    # via aiohttp
aiohttp==3.11.11
    # via aiobotocore
aioitertools==0.12.0
    # via aiobotocore
aiosignal==1.3.2
    # via aiohttp
annotated-types==0.7.0
    # via pydantic
anyio==4.8.0
    # via
    #   httpx
    #   openai
    #   starlette
    #   watchfiles
attrs==24.3.0
    # via aiohttp
boto3==1.35.81
    # via aiobotocore
botocore==1.35.81
    # via
    #   aiobotocore
    #   boto3
    #   s3transfer
certifi==2024.12.14
    # via
    #   httpcore
    #   httpx
    #   requests
charset-normalizer==3.4.1
    # via requests
click==8.1.8
    # via
    #   rich-toolkit
    #   typer
    #   uvicorn
distro==1.9.0
    # via openai
dnspython==2.7.0
    # via email-validator
email-validator==2.2.0
    # via fastapi
fastapi==0.115.6
    # via -r requirements.in
fastapi-cli==0.0.7
    # via fastapi
filelock==3.16.1
    # via
    #   huggingface-hub
    #   torch
    #   transformers
frozenlist==1.5.0
    # via
    #   aiohttp
    #   aiosignal
fsspec==2024.12.0
    # via
    #   huggingface-hub
    #   torch
h11==0.14.0
    # via
    #   httpcore
    #   uvicorn
httpcore==1.0.7
    # via httpx
httptools==0.6.4
    # via uvicorn
httpx==0.27.2
    # via
    #   fastapi
    #   langsmith
    #   ollama
    #   openai
huggingface-hub==0.27.1
    # via
    #   tokenizers
    #   transformers
idna==3.10
    # via
    #   anyio
    #   email-validator
    #   httpx
    #   requests
    #   yarl
jinja2==3.1.5
    # via
    #   fastapi
    #   torch
jiter==0.8.2
    # via openai
jmespath==1.0.1
    # via
    #   boto3
    #   botocore
jsonpatch==1.33
    # via langchain-core
jsonpointer==3.0.0
    # via jsonpatch
langchain-core==0.3.29
    # via langchain-text-splitters
langchain-text-splitters==0.3.5
    # via -r requirements.in
langsmith==0.2.10
    # via langchain-core
lightrag-hku==1.1.1
    # via -r requirements.in
markdown-it-py==3.0.0
    # via rich
markupsafe==3.0.2
    # via jinja2
mdurl==0.1.2
    # via markdown-it-py
mpmath==1.3.0
    # via sympy
multidict==6.1.0
    # via
    #   aiohttp
    #   yarl
nano-vectordb==0.0.4.3
    # via -r requirements.in
nest-asyncio==1.6.0
    # via -r requirements.in
networkx==3.4.2
    # via torch
numpy==1.26.4
    # via
    #   -r requirements.in
    #   nano-vectordb
    #   transformers
ollama==0.4.6
    # via -r requirements.in
openai==1.59.7
    # via -r requirements.in
orjson==3.10.14
    # via langsmith
packaging==24.2
    # via
    #   huggingface-hub
    #   langchain-core
    #   transformers
propcache==0.2.1
    # via
    #   aiohttp
    #   yarl
pydantic==2.10.5
    # via
    #   -r requirements.in
    #   fastapi
    #   langchain-core
    #   langsmith
    #   ollama
    #   openai
pydantic-core==2.27.2
    # via pydantic
pygments==2.19.1
    # via rich
python-dateutil==2.9.0.post0
    # via botocore
python-dotenv==1.0.1
    # via
    #   -r requirements.in
    #   uvicorn
python-multipart==0.0.20
    # via fastapi
pyyaml==6.0.2
    # via
    #   huggingface-hub
    #   langchain-core
    #   transformers
    #   uvicorn
regex==2024.11.6
    # via
    #   tiktoken
    #   transformers
requests==2.32.3
    # via
    #   huggingface-hub
    #   langsmith
    #   requests-toolbelt
    #   tiktoken
    #   transformers
requests-toolbelt==1.0.0
    # via langsmith
rich==13.9.4
    # via
    #   rich-toolkit
    #   typer
rich-toolkit==0.13.2
    # via fastapi-cli
s3transfer==0.10.4
    # via boto3
safetensors==0.5.2
    # via transformers
shellingham==1.5.4
    # via typer
six==1.17.0
    # via python-dateutil
sniffio==1.3.1
    # via
    #   anyio
    #   httpx
    #   openai
starlette==0.41.3
    # via fastapi
sympy==1.13.1
    # via torch
tenacity==9.0.0
    # via
    #   -r requirements.in
    #   langchain-core
tiktoken==0.7.0
    # via -r requirements.in
tokenizers==0.21.0
    # via transformers
torch==2.5.1
    # via -r requirements.in
tqdm==4.67.1
    # via
    #   huggingface-hub
    #   openai
    #   transformers
transformers==4.48.0
    # via -r requirements.in
typer==0.15.1
    # via fastapi-cli
typing-extensions==4.12.2
    # via
    #   anyio
    #   fastapi
    #   huggingface-hub
    #   langchain-core
    #   openai
    #   pydantic
    #   pydantic-core
    #   rich-toolkit
    #   torch
    #   typer
urllib3==2.3.0
    # via
    #   botocore
    #   requests
uvicorn==0.34.0
    # via
    #   -r requirements.in
    #   fastapi
    #   fastapi-cli
uvloop==0.21.0
    # via uvicorn
watchfiles==1.0.4
    # via uvicorn
websockets==14.2
    # via uvicorn
wrapt==1.17.2
    # via aiobotocore
yarl==1.18.3
    # via aiohttp

# The following packages are considered to be unsafe in a requirements file:
# setuptools
Log
lightrag-1  | INFO:     Uvicorn running on http://0.0.0.0:8020 (Press CTRL+C to quit)
lightrag-1  | INFO:     172.24.0.2:50304 - "POST /ollama/api/chat HTTP/1.1" 500 Internal Server Error
lightrag-1  | ERROR:    Exception in ASGI application
lightrag-1  | Traceback (most recent call last):
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/httptools_impl.py", line 409, in run_asgi
lightrag-1  |     result = await app(  # type: ignore[func-returns-value]
lightrag-1  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
lightrag-1  |     return await self.app(scope, receive, send)
lightrag-1  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
lightrag-1  |     await super().__call__(scope, receive, send)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
lightrag-1  |     await self.middleware_stack(scope, receive, send)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 187, in __call__
lightrag-1  |     raise exc
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 165, in __call__
lightrag-1  |     await self.app(scope, receive, _send)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
lightrag-1  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
lightrag-1  |     raise exc
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
lightrag-1  |     await app(scope, receive, sender)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 715, in __call__
lightrag-1  |     await self.middleware_stack(scope, receive, send)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 735, in app
lightrag-1  |     await route.handle(scope, receive, send)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 288, in handle
lightrag-1  |     await self.app(scope, receive, send)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 76, in app
lightrag-1  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
lightrag-1  |     raise exc
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
lightrag-1  |     await app(scope, receive, sender)
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 73, in app
lightrag-1  |     response = await f(request)
lightrag-1  |                ^^^^^^^^^^^^^^^^
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/fastapi/routing.py", line 301, in app
lightrag-1  |     raw_response = await run_endpoint_function(
lightrag-1  |                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
lightrag-1  |     return await dependant.call(**values)
lightrag-1  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
lightrag-1  |   File "/app/server.py", line 128, in ollama_chat
lightrag-1  |     resp = rag.query(
lightrag-1  |            ^^^^^^^^^^
lightrag-1  |   File "/usr/local/lib/python3.12/site-packages/lightrag/lightrag.py", line 695, in query
lightrag-1  |     return loop.run_until_complete(self.aquery(query, param))
lightrag-1  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
lightrag-1  |   File "uvloop/loop.pyx", line 1512, in uvloop.loop.Loop.run_until_complete
lightrag-1  |   File "uvloop/loop.pyx", line 1505, in uvloop.loop.Loop.run_until_complete
lightrag-1  |   File "uvloop/loop.pyx", line 1379, in uvloop.loop.Loop.run_forever
lightrag-1  |   File "uvloop/loop.pyx", line 520, in uvloop.loop.Loop._run
lightrag-1  | RuntimeError: this event loop is already running.
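Reading the traceback: rag.query() blocks on loop.run_until_complete(self.aquery(query, param)) (lightrag.py line 695), and the running loop is uvloop's (the uvloop/loop.pyx frames). As far as I understand, nest_asyncio only patches the pure-Python asyncio loop, so it cannot help when uvicorn is running on uvloop, which might explain why the snippet above had no effect. If that's right, one workaround would be to await the async API directly from the async endpoint instead of going through the sync wrapper. A self-contained sketch of that pattern, with stand-in names:

```python
import asyncio

async def aquery(q):
    # Stand-in for an async backend call such as rag.aquery().
    await asyncio.sleep(0)
    return f"answer:{q}"

async def endpoint():
    # Inside an already-running server loop, await the coroutine directly
    # rather than wrapping it in loop.run_until_complete().
    return await aquery("hello")

result = asyncio.run(endpoint())
```

In the real handler that would mean something like resp = await rag.aquery(query, param) in ollama_chat, assuming aquery is usable as a public API (the traceback suggests rag.query is just a sync wrapper around it).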