
[REF] Refactor startup/shutdown events to lifespan events #407

Merged
11 commits into neurobagel:main, Feb 20, 2025

Conversation

@KhushiRajurkar (Contributor) commented Feb 4, 2025

Changes Made

  • Replaced deprecated startup/shutdown events with lifespan events in main.py.
  • Improved app initialization and cleanup using FastAPI’s modern event handling.
  • Ensured environment variable validation occurs properly during startup.

Issue Reference

Closes #152

Notes

  • The application runs successfully inside the Docker container.
  • Some test failures (5 total) appear unrelated to my changes, mostly concerning authentication and connection issues.
  • This PR has been opened as a draft for troubleshooting with maintainers.

Changes proposed in this pull request

✅ Updated FastAPI event handling to lifespan events.
✅ Ensured vocabulary loading and cleanup occur within the app lifecycle.
✅ Maintained compatibility with existing API functionality.

Summary by Sourcery

Migrates the application from deprecated startup/shutdown events to FastAPI lifespan events. This change improves the management of the application lifecycle, including initialization and cleanup tasks such as environment variable validation and vocabulary loading.

Bug Fixes:

  • Fixed an issue where environment variable validation was not occurring properly during startup by moving the validation logic to the lifespan event handler.

Enhancements:

  • Improved application initialization and cleanup by using FastAPI's lifespan events for managing the application lifecycle.

…espan events

* Replaced deprecated startup and shutdown events with lifespan events.
* Improved environment variable validation and vocabulary loading during app lifecycle.
* Ensured compatibility with FastAPI’s modern event handling approach.

sourcery-ai bot commented Feb 4, 2025

Reviewer's Guide by Sourcery

This pull request refactors the application's startup and shutdown processes by replacing the deprecated startup/shutdown events with lifespan events. It also centralizes environment variable validation, authentication checks, and vocabulary loading/cleanup within the application lifecycle.
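
For orientation, the refactor described here follows FastAPI's lifespan pattern, which looks roughly like the minimal sketch below. The four helper names are taken from the sequence diagrams that follow; their bodies are stubbed here, and the actual wiring in app/main.py may differ.

```python
# Minimal sketch of the lifespan pattern; helper bodies are stubs and the
# real implementations live in app/main.py.
from contextlib import asynccontextmanager

from fastapi import FastAPI


def validate_environment_variables(): ...
def check_client_id(): ...
def initialize_vocabularies(app: FastAPI): ...
def cleanup_temp_vocab_dir(app: FastAPI): ...


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Code before `yield` replaces the deprecated @app.on_event("startup") handlers
    validate_environment_variables()
    check_client_id()
    initialize_vocabularies(app)
    yield
    # Code after `yield` replaces the deprecated @app.on_event("shutdown") handlers
    cleanup_temp_vocab_dir(app)


app = FastAPI(lifespan=lifespan)
```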

Sequence diagram for application startup with lifespan events

sequenceDiagram
  participant FastAPI App
  participant validate_environment_variables
  participant check_client_id
  participant initialize_vocabularies

  activate FastAPI App
  FastAPI App->validate_environment_variables: Call
  activate validate_environment_variables
  validate_environment_variables-->>FastAPI App: Return
  deactivate validate_environment_variables
  FastAPI App->check_client_id: Call
  activate check_client_id
  check_client_id-->>FastAPI App: Return
  deactivate check_client_id
  FastAPI App->initialize_vocabularies: Call
  activate initialize_vocabularies
  initialize_vocabularies-->>FastAPI App: Return
  deactivate initialize_vocabularies
  deactivate FastAPI App

Sequence diagram for application shutdown with lifespan events

sequenceDiagram
  participant FastAPI App
  participant cleanup_temp_vocab_dir

  activate FastAPI App
  FastAPI App->cleanup_temp_vocab_dir: Call
  activate cleanup_temp_vocab_dir
  cleanup_temp_vocab_dir-->>FastAPI App: Return
  deactivate cleanup_temp_vocab_dir
  deactivate FastAPI App

File-Level Changes

Change: Replaced the deprecated startup and shutdown events with lifespan events to manage the application's lifecycle.
  • Introduced a lifespan function to handle startup and shutdown events.
  • Removed the individual startup and shutdown event handlers.
  • Registered the lifespan function with the FastAPI app.
Files: app/main.py

Change: Consolidated environment variable validation and authentication checks into the startup phase.
  • Created a function to validate the presence of required environment variables.
  • Moved the authentication check to the startup phase.
  • The application will now raise an error if the environment variables are not set.
  • The application will now raise an error if authentication is enabled but the client ID is not set.
Files: app/main.py, app/api/security.py, tests/test_security.py

Change: Centralized the creation and cleanup of temporary directories for vocabulary lookups within the application lifecycle (see the sketch below).
  • Created a function to initialize vocabulary term lookups and store them in a temporary directory.
  • The temporary directory is stored in the application state.
  • The temporary directory is cleaned up when the application shuts down.
Files: app/main.py
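
To make the last change concrete: a hedged sketch of what the vocabulary setup and cleanup pair could look like, reusing the helper names from the diagrams above (the actual implementation in app/main.py may differ).

```python
import tempfile

from fastapi import FastAPI


def initialize_vocabularies(app: FastAPI):
    # Create a temporary directory at startup and keep its handle on the
    # application state so other parts of the app (and shutdown) can reach it
    app.state.vocab_dir = tempfile.TemporaryDirectory()
    # ... write vocabulary term lookup files into app.state.vocab_dir.name ...


def cleanup_temp_vocab_dir(app: FastAPI):
    # Called in the shutdown half of the lifespan; removes the directory
    # and everything inside it
    app.state.vocab_dir.cleanup()
```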

Assessment against linked issues

Issue #152: Replace deprecated startup events with lifespan events.
Issue #152: Update startup checks for defined credentials and allowed origins to follow the new lifespan syntax.


@github-actions bot added the _community label on Feb 4, 2025
@alyssadai self-requested a review on February 6, 2025 16:16
@alyssadai (Contributor) left a comment

Hi @KhushiRajurkar,

Thanks for your PR! The CI tests reproduced 2/5 of the failures you mentioned encountering in the issue: https://github.com/neurobagel/api/actions/runs/13127168535/job/36625878480?pr=407

These are due to your code changes breaking assumptions of our existing unit tests of the app startup checks, which can be fixed by updating either the tests or your code - see my detailed comments below.

As for the other 3 test failures you reported, these are integration tests (marked with @pytest.mark.integration, see example here), which means that you need a synthetic Neurobagel node running in order to run the tests. The httpx.ConnectError exceptions you encountered suggest that in the integration tests, requests involving the synthetic node are failing, so the node may not have been set up correctly in your local environment.

More details about the integration test setup are in our README, but actually, I'd recommend simply running the test suite using:

pytest

which should skip the integration tests by default. If you're developing in a Docker container as opposed to a normal Python environment, you may need extra steps to ensure that the tests can reach the test node (which is its own Docker Compose stack). This may not be worthwhile for the purposes of your PR, which shouldn't affect the integration tests. As a sanity check, you can see that the same integration tests which are failing for you locally are also running and passing in the CI tests on this PR :)
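
For readers unfamiliar with marker-based test selection: one common way a repo makes a bare pytest run skip @pytest.mark.integration tests by default is a conftest.py hook pair like the sketch below. This shows the general pattern, not necessarily this repo's exact setup.

```python
# conftest.py: one common pattern for opt-in integration tests;
# this repo's actual configuration may differ.
import pytest


def pytest_addoption(parser):
    # Integration tests only run when this flag is passed explicitly
    parser.addoption(
        "--integration",
        action="store_true",
        default=False,
        help="also run tests marked with @pytest.mark.integration",
    )


def pytest_collection_modifyitems(config, items):
    if config.getoption("--integration"):
        return  # flag given: run the full suite, including integration tests
    skip_marker = pytest.mark.skip(
        reason="integration test: needs a running synthetic node (use --integration)"
    )
    for item in items:
        if "integration" in item.keywords:
            item.add_marker(skip_marker)
```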

I've left some other comments below, including suggestions for more detail in the new docstrings. Have a look and let me know if you have questions.

@KhushiRajurkar (Contributor, Author)

Hello @alyssadai,

Thank you for the detailed review and guidance!
Apologies for these oversights and the delay in responding—I was reviewing the feedback to understand the required changes.
I’ll carefully review the file and make the necessary fixes, including restoring and refactoring the docstrings, correcting the spacing issue in the error message, and ensuring proper formatting with pre-commit. I’ll push the updates soon and re-run the tests to verify everything.
Thank you!

@alyssadai (Contributor)

Sounds great @KhushiRajurkar - please mark your PR as ready to review once you have made the changes!

@alyssadai (Contributor)

Hi @KhushiRajurkar, just checking in: do you have any questions about addressing the review comments?

Please feel free to let us know if you no longer have capacity to work on this issue, so that one of the maintainers can take over for your PR.

@KhushiRajurkar (Contributor, Author)


Hello @alyssadai,
Apologies for the delay in updating. I made the required changes about 5 days ago but encountered issues while running pytest and pre-commit. I wanted to ensure code consistency before pushing the changes.

I believe the issue might be related to my local system environment. As a workaround, I had planned to run the tests in Docker as a final approach before pushing the changes.

I have modified the following files:

  • main.py
  • security.py
  • test_security.py

I’ll proceed with the Docker approach and push the updates soon. Please let me know if there are any alternative suggestions or if you’d like me to take a different approach.

Thank you!

@alyssadai (Contributor) commented Feb 17, 2025

Hi @KhushiRajurkar,

I wouldn't recommend running the tests in Docker, as that will be difficult for us to replicate on our end (our CI tests run just in a bare Python environment).

Instead, I'd suggest first committing and pushing your current changes to this branch (our CI tests will run on them so we can see directly in this PR if there are any tests still failing), and then setting up a fresh Python virtual environment locally to ensure that you have the correct dependencies and linters as defined for this repo.

For example, from the root of this repo:

python -m venv venv  # ensure you have Python 3.10+
source venv/bin/activate  # activate the new environment so pip installs into it
pip install -r requirements.txt

# following https://github.com/neurobagel/api?tab=readme-ov-file#testing
git submodule init
git submodule update

# following https://neurobagel.org/contributing/CONTRIBUTING/#follow-repository-code-style
pre-commit install
pre-commit run

Then, you can try running the tests again using the command pytest (don't worry about the integration tests / setting up Docker Compose).

Let me know if that works. For us to better track progress on your PR, I'd encourage you to always push your changes first!

@KhushiRajurkar (Contributor, Author)


Hello @alyssadai
Thank you for your guidance! I will commit and push my changes within 24 hours (as I am currently traveling) and will avoid using Docker for testing.

Regarding pytest, I had encountered an issue while running the command:
pip install -r requirements.txt
I was unable to install the dependencies, which prevented me from running pytest in the virtual environment. I'll attempt to resolve this after pushing my changes.

Thank you for your patience and support!

- Restored previously deleted docstrings
- Fixed spacing issues in the `validate_environment_variables` error message
- Changed raised error from `ValueError` to `RuntimeError` in `security.py` (see the sketch after these commit notes)
- Updated corresponding assertion in `test_security.py`
Added a comment that was previously deleted
Revert deletions of comments in `overridden_swagger` and `overridden_redoc`
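
As a rough illustration of the ValueError-to-RuntimeError change and the matching assertion update described in these commits (the helper, environment variable names, and error message below are assumptions for the sketch, not the repo's exact code):

```python
# Illustrative sketch only: names and the error message are assumed.
import os

import pytest


def check_client_id():
    # Auth enabled but no client ID configured: fail fast at startup,
    # now raising RuntimeError instead of ValueError
    auth_enabled = os.environ.get("NB_ENABLE_AUTH", "false").lower() == "true"
    if auth_enabled and not os.environ.get("NB_QUERY_CLIENT_ID"):
        raise RuntimeError("Authentication is enabled but no client ID is set.")


def test_check_client_id_raises(monkeypatch):
    monkeypatch.setenv("NB_ENABLE_AUTH", "true")
    monkeypatch.delenv("NB_QUERY_CLIENT_ID", raising=False)
    # Assertion updated to expect the new exception type
    with pytest.raises(RuntimeError):
        check_client_id()
```
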
@KhushiRajurkar (Contributor, Author)

Hello @alyssadai, I've committed and pushed the changes in PR #407. Please review my changes and provide your valuable feedback.

Thank you!

@alyssadai marked this pull request as ready for review on February 18, 2025 18:14
codecov bot commented Feb 18, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.24%. Comparing base (0073503) to head (913bfa8).
Report is 8 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #407      +/-   ##
==========================================
+ Coverage   97.10%   97.24%   +0.13%     
==========================================
  Files          24       24              
  Lines         830      835       +5     
==========================================
+ Hits          806      812       +6     
+ Misses         24       23       -1     


Revert deletion of docstring in the `root` function
@alyssadai (Contributor) left a comment

Hi @KhushiRajurkar, thanks for the changes! The tests are now passing 🎉

I left a few final minor suggestions below. I also notice that the pre-commit check is still failing, which I suspect is related to your difficulties installing the dependencies locally.

Since this PR is close to ready, for the sake of being able to merge it soon (we have a couple PRs being worked on in parallel, and we don't want them to get too out of sync and run into merge conflicts), I will apply the remaining changes directly and run pre-commit locally so that we do not merge with the failing check.

> Regarding pytest, I had encountered an issue while running the command:
> pip install -r requirements.txt
> I was unable to install the dependencies, which prevented me from running pytest in the virtual environment. I'll attempt to resolve this after pushing my changes.

If you are still having trouble with the dependencies, could you please open an issue or reach out to us on Discord with the exact error message you encountered, so we can better help you debug it? Without seeing the error, it is hard to know whether the issue is related to your Python installation or something else about your dev setup.

@alyssadai self-requested a review on February 20, 2025 04:04
@alyssadai (Contributor) left a comment

Ready to merge! 🧑‍🍳 Thanks again @KhushiRajurkar for the contribution!

As mentioned above, please feel free to open an issue or message us directly on Discord for further help debugging the issue you encountered with installing the requirements.

@alyssadai merged commit bc90678 into neurobagel:main on Feb 20, 2025
7 checks passed
@alyssadai added the pr-internal label on Feb 20, 2025
@alyssadai changed the title from "Refactor startup/shutdown events to lifespan events" to "[REF] Refactor startup/shutdown events to lifespan events" on Feb 20, 2025
@KhushiRajurkar (Contributor, Author)


Hello @alyssadai,
Thank you so much for your guidance and your patience! I'll take a closer look at the dependency issue and reach out on Discord if I need further assistance. 😊
Thank you!

@KhushiRajurkar (Contributor, Author)


Thank you, @alyssadai, for the opportunity to contribute to Neurobagel and for your guidance and support throughout this PR! 🙇🏻‍♀️ This experience has helped me grow as a programmer, and I’m grateful for the chance to learn and improve!

Labels
  • _community: [BOT ONLY] PR label for community contributions. Used for tracking
  • pr-internal: Non-user-facing code improvement, will increment patch version when merged (0.0.+1)
Projects
Status: Review - Done
Development

Successfully merging this pull request may close these issues.

Switch deprecated startup events to lifespan events
2 participants