
[BUG]: Unable to create/run experiments #795

Closed
Assignees
Labels
bug Something isn't working

Comments

@khaledosman
Contributor

khaledosman commented Feb 4, 2025

Description

While running Lumigator for the first time yesterday, I wasn't able to create an experiment from the UI, and while testing the API through Swagger at localhost:8000/docs, the GET /experiments and GET /jobs endpoints fail as well.

When following the Docker logs for the backend, this is what I see when trying to create an experiment. It looks like it requires a /mzai/lumigator/jobs/evaluator folder to exist, or perhaps it couldn't create it on first run when it's not there.

And this is the error I see in the logs when trying to hit the endpoints or open Lumigator afterwards; it looks like it's unable to find the job.

Reproduction

Steps to reproduce:

  1. Start with a fresh Docker container that has no previous jobs or experiments
  2. Use the sample dataset
  3. Try to create an experiment
  4. Check the logs, refresh the page, or try to use the GET endpoints for /experiments or /jobs
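The steps above can also be scripted against the local backend instead of going through the UI. A minimal sketch, assuming the port from the Swagger URL in the report and a simplified payload (field names are taken from the UI payload quoted further down, but trimmed; this is not Lumigator's exact schema):

```python
# Hypothetical reproduction script for this issue; endpoint paths come from
# the logs above, payload fields are a simplified subset of the UI request.
import json
import urllib.request

BASE = "http://localhost:8000/api/v1"


def build_experiment_payload(dataset_id: str) -> dict:
    """Assemble a minimal experiment request (field names assumed)."""
    return {
        "name": "test",
        "description": "repro for #795",
        "dataset": dataset_id,
        "max_samples": 3,
        "model": "hf://facebook/bart-large-cnn",
    }


def post_json(url: str, payload: dict) -> int:
    """POST the payload as JSON and return the HTTP status code."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # raises on a 5xx response
        return resp.status


# Example (requires a running backend, so not executed here):
# post_json(f"{BASE}/experiments/", build_experiment_payload(
#     "ac8bc755-cd62-45b3-b080-bc150ece978a"))
```

On the broken image, the POST reproduces the 500 shown in the log output below.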

Relevant log output

First error when creating experiment: https://pastebin.com/99Ueqnfa

INFO:     172.18.0.6:53416 - "POST /api/v1/experiments HTTP/1.1" 307 Temporary Redirect
 2025-02-04 09:32:24 | INFO     | backend.services.jobs:create_job:336 - sending config_params...{'job_id': UUID('0ef6cb54-6e4a-4fec-b09b-5c997bcc9b79'), 'job_name': 'test - 2', 'model_uri': 'oai://gpt-4o-mini', 'dataset_path': 's3://lumigator-storage/datasets/b74814e6-b214-4014-89fb-763678f3446b/dialogsum_exc.csv', 'max_samples': 0, 'storage_path': 's3://lumigator-storage/jobs/results/', 'model_url': 'https://api.openai.com/v1', 'system_prompt': 'You are a helpful assistant, expert in text summarization. For every prompt you receive, provide a summary of its contents in at most two sentences.', 'skip_inference': False}
 2025-02-04 09:32:24 | INFO     | backend.services.jobs:create_job:344 - template...('{{\n    "name": "{job_name}/{job_id}",\n    "model": {{\n        "inference": {{\n            "base_url": "{model_url}",\n            "engine": "{model_uri}",\n            "system_prompt": "{system_prompt}",\n            "max_retries": 3\n        }}\n    }},\n    "dataset": {{ "path": "{dataset_path}" }},\n    "evaluation": {{\n        "metrics": ["rouge", "meteor", "bertscore"],\n        "max_samples": {max_samples},\n        "return_input_data": true,\n        "return_predictions": true,\n        "storage_path": "{storage_path}",\n        "skip_inference": "{skip_inference}"\n    }}\n}}', <JobType.EVALUATION: 'evaluate'>, 'oai://gpt-4o-mini')
 2025-02-04 09:32:24 | INFO     | backend.services.jobs:_validate_config:250 - Validation for job type JobType.EVALUATION not yet supported.
 2025-02-04 09:32:24 | INFO     | backend.services.jobs:create_job:386 - runtime env setup...
 2025-02-04 09:32:24 | INFO     | backend.services.jobs:create_job:387 - {'pip': '/mzai/lumigator/jobs/evaluator/requirements.txt', 'working_dir': '/mzai/lumigator/jobs/evaluator', 'env_vars': {'MZAI_JOB_ID': '0ef6cb54-6e4a-4fec-b09b-5c997bcc9b79'}}
 2025-02-04 09:32:24 | INFO     | backend.services.jobs:create_job:392 - Submitting {job_type} Ray job...
 2025-02-04 09:32:24 | INFO     | backend.ray_submit.submission:submit_ray_job:37 - Submitting if [ `arch` = "aarch64" ]; then export LD_PRELOAD=$VIRTUAL_ENV/lib/python3.11/site-packages/scikit_learn.libs/libgomp-d22c30c5.so.1.0.0; fi; python -m evaluator evaluate huggingface --config '{
     "name": "test - 2/0ef6cb54-6e4a-4fec-b09b-5c997bcc9b79",
     "model": {
         "inference": {
             "base_url": "https://api.openai.com/v1",
             "engine": "oai://gpt-4o-mini",
             "system_prompt": "You are a helpful assistant, expert in text summarization. For every prompt you receive, provide a summary of its contents in at most two sentences.",
             "max_retries": 3
         }
     },
     "dataset": { "path": "s3://lumigator-storage/datasets/b74814e6-b214-4014-89fb-763678f3446b/dialogsum_exc.csv" },
     "evaluation": {
         "metrics": ["rouge", "meteor", "bertscore"],
         "max_samples": 0,
         "return_input_data": true,
         "return_predictions": true,
         "storage_path": "s3://lumigator-storage/jobs/results/",
         "skip_inference": "False"
     }
 }'...{'pip': '/mzai/lumigator/jobs/evaluator/requirements.txt', 'working_dir': '/mzai/lumigator/jobs/evaluator', 'env_vars': {'MZAI_JOB_ID': '0ef6cb54-6e4a-4fec-b09b-5c997bcc9b79'}}
 INFO:     172.18.0.6:53430 - "POST /api/v1/experiments/ HTTP/1.1" 500 Internal Server Error
 ERROR:    Exception in ASGI application
 Traceback (most recent call last):
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/ray/_private/runtime_env/working_dir.py", line 65, in upload_working_dir_if_needed
     working_dir_uri = get_uri_for_directory(working_dir, excludes=excludes)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/ray/_private/runtime_env/packaging.py", line 482, in get_uri_for_directory
     raise ValueError(f"directory {directory} must be an existing directory")
 ValueError: directory /mzai/lumigator/jobs/evaluator must be an existing directory
 
 During handling of the above exception, another exception occurred:
 
 Traceback (most recent call last):
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 412, in run_asgi
     result = await app(  # type: ignore[func-returns-value]
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
     return await self.app(scope, receive, send)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
     await super().__call__(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/applications.py", line 113, in __call__
     await self.middleware_stack(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
     raise exc
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
     await self.app(scope, receive, _send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/middleware/cors.py", line 93, in __call__
     await self.simple_response(scope, receive, send, request_headers=headers)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/middleware/cors.py", line 144, in simple_response
     await self.app(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
     raise exc
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
     await app(scope, receive, sender)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 715, in __call__
     await self.middleware_stack(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 735, in app
     await route.handle(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
     await self.app(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
     raise exc
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
     await app(scope, receive, sender)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
     response = await f(request)
                ^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
     raw_response = await run_endpoint_function(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/fastapi/routing.py", line 214, in run_endpoint_function
     return await run_in_threadpool(dependant.call, **values)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/concurrency.py", line 39, in run_in_threadpool
     return await anyio.to_thread.run_sync(func, *args)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
     return await get_async_backend().run_sync_in_worker_thread(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
     return await future
            ^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 943, in run
     result = context.run(func, *args)
              ^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/backend/api/routes/experiments.py", line 24, in create_experiment
     return service.create_job(JobEvalCreate.model_validate(request.model_dump()), background_tasks)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/backend/services/jobs.py", line 393, in create_job
     submit_ray_job(self.ray_client, entrypoint)
   File "/mzai/lumigator/python/mzai/backend/backend/ray_submit/submission.py", line 40, in submit_ray_job
     return client.submit_job(
            ^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/ray/dashboard/modules/job/sdk.py", line 214, in submit_job
     self._upload_working_dir_if_needed(runtime_env)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/ray/dashboard/modules/dashboard_sdk.py", line 398, in _upload_working_dir_if_needed
     upload_working_dir_if_needed(runtime_env, upload_fn=_upload_fn)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/ray/_private/runtime_env/working_dir.py", line 69, in upload_working_dir_if_needed
     raise ValueError(
 ValueError: directory /mzai/lumigator/jobs/evaluator must be an existing directory or a zip package
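The traceback above shows Ray rejecting the submission because the runtime_env working_dir doesn't exist inside the backend container. A guard along these lines (function name and structure are hypothetical, not Lumigator's actual code) would fail fast with a clearer error before ever calling submit_job:

```python
import os


def validate_runtime_env(runtime_env: dict) -> None:
    """Fail early, before Ray submission, if the working_dir is missing.

    A missing directory here usually means the backend image and the
    docker-compose paths are out of sync, as happened in this issue.
    """
    working_dir = runtime_env.get("working_dir")
    if working_dir and not os.path.isdir(working_dir):
        raise FileNotFoundError(
            f"runtime_env working_dir {working_dir!r} does not exist; "
            "check that the backend image matches the compose file paths"
        )


# The runtime_env from the logs above would trip the guard on the broken image:
env = {
    "pip": "/mzai/lumigator/jobs/evaluator/requirements.txt",
    "working_dir": "/mzai/lumigator/jobs/evaluator",
    "env_vars": {"MZAI_JOB_ID": "0ef6cb54-6e4a-4fec-b09b-5c997bcc9b79"},
}
# validate_runtime_env(env)  # raises FileNotFoundError when the dir is absent
```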

Second error when opening the page or using the GET endpoints https://pastebin.com/ZJtFv1ua

2025-02-04 09:32:24 | INFO     | backend.services.jobs:get_job:423 - Obtaining info for job c5b17dc3-adbe-443a-af63-1a4fe12aa5fd: test - sample dataset
 INFO:     172.18.0.6:53444 - "GET /api/v1/jobs/ HTTP/1.1" 500 Internal Server Error
 ERROR:    Exception in ASGI application
 Traceback (most recent call last):
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 412, in run_asgi
     result = await app(  # type: ignore[func-returns-value]
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
     return await self.app(scope, receive, send)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
     await super().__call__(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/applications.py", line 113, in __call__
     await self.middleware_stack(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
     raise exc
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
     await self.app(scope, receive, _send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/middleware/cors.py", line 85, in __call__
     await self.app(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
     raise exc
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
     await app(scope, receive, sender)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 715, in __call__
     await self.middleware_stack(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 735, in app
     await route.handle(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
     await self.app(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
     raise exc
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
     await app(scope, receive, sender)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
     response = await f(request)
                ^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
     raw_response = await run_endpoint_function(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/fastapi/routing.py", line 214, in run_endpoint_function
     return await run_in_threadpool(dependant.call, **values)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/starlette/concurrency.py", line 39, in run_in_threadpool
     return await anyio.to_thread.run_sync(func, *args)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
     return await get_async_backend().run_sync_in_worker_thread(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
     return await future
            ^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 943, in run
     result = context.run(func, *args)
              ^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/backend/api/routes/jobs.py", line 122, in list_jobs
     jobs = service.list_jobs(skip, limit)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/backend/services/jobs.py", line 447, in list_jobs
     items=[self.get_job(record.id) for record in records],
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/backend/services/jobs.py", line 447, in <listcomp>
     items=[self.get_job(record.id) for record in records],
            ^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/backend/services/jobs.py", line 429, in get_job
     job_status = self.ray_client.get_job_status(job_id)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/ray/dashboard/modules/job/sdk.py", line 424, in get_job_status
     return self.get_job_info(job_id).status
            ^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/ray/dashboard/modules/job/sdk.py", line 359, in get_job_info
     self._raise_error(r)
   File "/mzai/lumigator/python/mzai/backend/.venv/lib/python3.11/site-packages/ray/dashboard/modules/dashboard_sdk.py", line 283, in _raise_error
     raise RuntimeError(
 RuntimeError: Request failed with status code 404: Job c5b17dc3-adbe-443a-af63-1a4fe12aa5fd does not exist.
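The second traceback shows that a single stale database record, pointing at a job Ray no longer knows about, turns the whole GET /jobs listing into a 500: Ray's JobSubmissionClient raises RuntimeError on the 404. One way the listing could degrade gracefully is to catch that error per job; a sketch with hypothetical names, not Lumigator's actual code:

```python
def safe_get_job_status(ray_client, job_id: str) -> str:
    """Return the Ray job status, or 'UNKNOWN' if Ray no longer has the job.

    The Ray JobSubmissionClient raises RuntimeError on a 404 response,
    which is what turns one stale DB record into a 500 for the listing.
    """
    try:
        return str(ray_client.get_job_status(job_id))
    except RuntimeError:
        return "UNKNOWN"


# A stub client demonstrating both paths, using the job id from the logs:
class FakeRayClient:
    def get_job_status(self, job_id):
        if job_id == "c5b17dc3-adbe-443a-af63-1a4fe12aa5fd":
            raise RuntimeError(
                f"Request failed with status code 404: Job {job_id} does not exist."
            )
        return "SUCCEEDED"
```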

Expected behavior

An experiment is created successfully, and the GET endpoints don't fail.

System Info

macOS
main branch latest commit: 0825694

Have you searched for similar issues before submitting this one?

  • Yes, I have searched for similar issues
@khaledosman khaledosman added the bug Something isn't working label Feb 4, 2025
@agpituk
Contributor

agpituk commented Feb 4, 2025

Just managed to reproduce the above. This is the payload from the UI when creating an experiment:

curl 'http://localhost/api/v1/experiments/' -X POST \
  -H 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:134.0) Gecko/20100101 Firefox/134.0' \
  -H 'Accept: application/json, text/plain, */*' \
  -H 'Accept-Language: en-US,en;q=0.5' \
  -H 'Accept-Encoding: gzip, deflate, br, zstd' \
  -H 'Content-Type: application/json' \
  -H 'Referer: http://localhost/experiments' \
  -H 'Origin: http://localhost' \
  -H 'Connection: keep-alive' \
  -H 'Sec-Fetch-Dest: empty' \
  -H 'Sec-Fetch-Mode: cors' \
  -H 'Sec-Fetch-Site: same-origin' \
  -H 'Priority: u=0' \
  --data-raw '{
  "name": "test",
  "description": "test",
  "models": [
    {
      "name": "facebook/bart-large-cnn",
      "uri": "hf://facebook/bart-large-cnn",
      "website_url": "https://huggingface.co/facebook/bart-large-cnn",
      "description": "BART is a large-sized model fine-tuned on the CNN Daily Mail dataset.",
      "requirements": [],
      "info": {
        "parameter_count": "406M",
        "tensor_type": "F32",
        "model_size": "1.63GB"
      },
      "tasks": [
        {
          "summarization": {
            "max_length": 142,
            "min_length": 56,
            "length_penalty": 2,
            "early_stopping": true,
            "no_repeat_ngram_size": 3,
            "num_beams": 4
          }
        }
      ]
    }
  ],
  "dataset": "ac8bc755-cd62-45b3-b080-bc150ece978a",
  "max_samples": 3,
  "model": "hf://facebook/bart-large-cnn"
}'

@njbrake
Contributor

njbrake commented Feb 4, 2025

@khaledosman Thanks for filing! Can you provide exactly what command you ran to start the containers?

@njbrake njbrake self-assigned this Feb 4, 2025
@khaledosman
Contributor Author

I ran make run-lumigator. This issue is now fixed in the latest Docker image.

I opened a follow-up PR to make sure users get the latest version of the Docker image after any potential hotfixes in the future, without needing to manually remove or rebuild the image.

@njbrake
Contributor

njbrake commented Feb 4, 2025

Cool! Just to make sure I'm following: there was an issue with an old version of Lumigator, and since the new image wasn't on your system, the make command was using some older image even though you were on the latest commit of the main branch?

@khaledosman
Contributor Author

No, it was an issue with the MVP version in the main branch, which was just fixed with the latest Docker push a couple of hours ago, AFAIK.

After the fix was patched into the latest Docker image, however, I still needed to remove my old images and rebuild in order to pull the latest version: the image was pushed to the same mvp tag, so Docker couldn't tell it needed to re-pull. Hence the other PR.

@njbrake
Contributor

njbrake commented Feb 4, 2025

Thanks! Can you provide a link to what the patch was? I'm having a hard time finding what commit fixed the error that was reported in this bug report.

@agpituk
Contributor

agpituk commented Feb 4, 2025

Hello there! I want to recap and make sure we understand why this happened.
Currently, the start-lumigator command doesn't point to the latest tag but to the mvp tag. The backend Docker image was created before the folder reorganization in #717. In #717 we changed docker-compose to use the new paths, but the mvp backend image still had the folders organized the old way, making it impossible to find the evaluator when launching an experiment on Ray.

The fix itself was to overwrite the old Docker image tagged for the MVP with the latest version from main (we couldn't bump versions, or the old image would have stayed broken forever).

This raises the question of whether Lumigator should use the latest tag or not, but this bug, as reported, is resolved now!
@njbrake and @khaledosman, if you are happy with this, feel free to close the bug (I'm leaving it open in case you have any more questions).

@njbrake
Contributor

njbrake commented Feb 4, 2025

Makes total sense! Thanks for the explanation! I'll go ahead and close this ticket for us.
