Merge pull request #39 from uncbiag/package
Add package configuration
thewtex authored Aug 30, 2023
2 parents 5a4a377 + 5a2390d commit b1b26c7
Showing 24 changed files with 243 additions and 64 deletions.
12 changes: 7 additions & 5 deletions .github/workflows/github-hosted-action.yml
@@ -9,14 +9,16 @@ jobs:
max-parallel: 5

steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v3

- name: Set up Python 3.8
uses: actions/setup-python@v2
uses: actions/setup-python@v3
with:
python-version: 3.8
- name: Install dependencies
run: |
pip install -r requirements.txt pytest
- name: Test with unittest
pip install -e .
pip install pytest
- name: Test with pytest
run: |
python -m unittest -k CPU
pytest
44 changes: 34 additions & 10 deletions README.md
@@ -1,6 +1,25 @@
# OAI Analysis 2

[<img src="https://github.com/uncbiag/OAI_analysis_2/actions/workflows/github-hosted-action.yml/badge.svg">](https://github.com/uncbiag/OAI_analysis_2/actions)
[![PyPI - Version](https://img.shields.io/pypi/v/oai-analysis.svg)](https://pypi.org/project/oai-analysis)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/oai-analysis.svg)](https://pypi.org/project/oai-analysis)

# OAI Analysis 2
**Table of Contents**

- [Installation](#installation)
- [Introduction](#introduction)
- [Development](#development)
- [Citation](#citation)
- [Acknowledgements](#acknowledgements)
- [License](#license)

## Installation

```console
pip install oai-analysis
```
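
A minimal usage sketch after installation, mirroring the `SegmentationDemo` notebook touched in this commit; the image path below is illustrative and not part of the package:

```python
import itk

from oai_analysis.analysis_object import AnalysisObject

# Load a knee MRI volume (illustrative path).
test_volume = itk.imread("path/to/knee_mri.nii.gz")

# Segment femoral cartilage (FC) and tibial cartilage (TC).
obj = AnalysisObject()
FC, TC = obj.segment(test_volume)
```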

## Introduction

This repository contains open-source analysis approaches for the [Osteoarthritis Initiative (OAI)](https://nda.nih.gov/oai/) magnetic resonance image (MRI) data.
The analysis code is largely written in Python with the help of [ITK](http://itk.org) and [VTK](http://vtk.org) for data I/O and mesh processing
@@ -32,17 +51,18 @@ We are currently working on the following features which should be available in
2. **Workflow management**: Whereas *OAI Analysis* used custom code to avoid recomputing results, we are switching to [Dagster](https://dagster.io/) to manage data dependencies in *OAI Analysis 2* (a minimal sketch follows this list).
3. **Distribution of analysis results**: Because we plan to distribute not only code but also analysis results (such as segmentations, meshes, and thickness maps), we plan to support data access via the [Interplanetary File System (IPFS)](https://ipfs.io/).
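
As a sketch of what a Dagster-based workflow could look like: the asset names and bodies below are purely illustrative and not part of this repository; the point is that Dagster infers data dependencies from asset function arguments, which replaces the custom recomputation-avoidance code:

```python
from dagster import Definitions, asset


@asset
def cartilage_segmentations():
    # Illustrative placeholder: run the segmentation step and return its outputs.
    ...


@asset
def thickness_maps(cartilage_segmentations):
    # Illustrative placeholder: depends on cartilage_segmentations, so Dagster
    # can track the dependency and avoid needless recomputation.
    ...


defs = Definitions(assets=[cartilage_segmentations, thickness_maps])
```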

### Installation of dependencies and testing
## Development

Contributions are appreciated and welcome.

```
git clone https://github.com/uncbiag/oai_analysis_2
cd oai_analysis_2
pip install -r requirements.txt
python -m unittest -v discover
git clone https://github.com/uncbiag/OAI_analysis_2
cd OAI_analysis_2
pip install -e .
pip install pytest
pytest
```

Currently, this should report that the segmentation test passes and the registration test fails.
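
To run only one of these suites, pytest's `-k` expression filter can be used; the keyword below is illustrative and depends on the actual test names:

```
pytest -k segmentation
```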

To view the demo notebooks:
```
cd notebooks
@@ -51,7 +71,7 @@ jupyter notebook

upload test data to https://data.kitware.com/#collection/6205586c4acac99f42957ac3/folder/620559344acac99f42957d63

### Related manuscripts
## Citation

While we used the following stationary velocity field registration approach available in [easyReg](https://github.com/uncbiag/easyreg) for *OAI Analysis*
[[paper]](https://biag.cs.unc.edu/publication/dblp-confcvpr-shen-hxn-19/)
@@ -101,7 +121,11 @@ Results obtained by the *OAI Analysis* pipeline can be found in this manuscript
}
```

### Acknowledgements
## Acknowledgements

This work was developed with support in part from the National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS)
under award numbers [1R44AR074375](https://reporter.nih.gov/search/Naf5qSR3eUStFkMfGm6KpQ/project-details/9777582) and [1R01AR072013](https://reporter.nih.gov/search/eE7eB34dVUGoY1nLF3kZNA/project-details/9368542).

## License

`oai-analysis` is distributed under the terms of the [Apache-2.0](https://spdx.org/licenses/Apache-2.0.html) license.
8 changes: 4 additions & 4 deletions notebooks/DaskComputation.ipynb
@@ -54,10 +54,10 @@
"import sys\n",
"sys.path.append(\"./OAI_analysis_2/\")\n",
"\n",
"import oai_analysis_2\n",
"from oai_analysis_2 import mesh_processing as mp\n",
"from oai_analysis_2 import utils\n",
"from oai_analysis_2.analysis_object import AnalysisObject\n",
"import oai_analysis\n",
"from oai_analysis import mesh_processing as mp\n",
"from oai_analysis import utils\n",
"from oai_analysis.analysis_object import AnalysisObject\n",
"from dask import delayed, compute, visualize\n",
"from dask.distributed import Client, progress, LocalCluster"
]
4 changes: 2 additions & 2 deletions notebooks/DaskComputationCoiled.ipynb
@@ -33,8 +33,8 @@
"import random\n",
"import itk\n",
"import vtk\n",
"import oai_analysis_2\n",
"from oai_analysis_2 import dask_processing as dp"
"import oai_analysis\n",
"from oai_analysis import dask_processing as dp"
]
},
{
6 changes: 3 additions & 3 deletions notebooks/FullDemo.ipynb
@@ -49,9 +49,9 @@
"# Remove this once the pip package is available\n",
"import sys\n",
"sys.path.append(\"./OAI_analysis_2/\")\n",
"import oai_analysis_2\n",
"import oai_analysis_2.mesh_processing as mp\n",
"from oai_analysis_2.analysis_object import AnalysisObject\n",
"import oai_analysis\n",
"import oai_analysis.mesh_processing as mp\n",
"from oai_analysis.analysis_object import AnalysisObject\n",
"\n",
"# To enable running the itkwidgets window on colab\n",
"from google.colab import output\n",
2 changes: 1 addition & 1 deletion notebooks/SegmentationDemo.ipynb
@@ -59,7 +59,7 @@
"metadata": {},
"outputs": [],
"source": [
"from oai_analysis_2.analysis_object import AnalysisObject\n",
"from oai_analysis.analysis_object import AnalysisObject\n",
"obj = AnalysisObject()\n",
"FC, TC = obj.segment(test_volume)"
]
4 changes: 4 additions & 0 deletions oai_analysis/__about__.py
@@ -0,0 +1,4 @@
# SPDX-FileCopyrightText: 2023-present OAI Analysis Development Team
#
# SPDX-License-Identifier: Apache-2.0
__version__ = "2.0.0"
2 changes: 1 addition & 1 deletion oai_analysis_2/__init__.py → oai_analysis/__init__.py
@@ -1,5 +1,5 @@
__version__ = '0.1'

#from .analysis_object import AnalysisObject
#from oai_analysis_2 import mesh_processing
#from oai_analysis import mesh_processing

@@ -1,8 +1,8 @@
import itk
import torch
from oai_analysis_2 import utils
import oai_analysis_2.segmentation.segmenter
import oai_analysis_2.registration
from oai_analysis import utils
import oai_analysis.segmentation.segmenter
import oai_analysis.registration

import os

@@ -25,17 +25,17 @@ def __init__(self):
output_prob=True,
output_itk=True,
)
self.segmenter = oai_analysis_2.segmentation.segmenter.Segmenter3DInPatchClassWise(
self.segmenter = oai_analysis.segmentation.segmenter.Segmenter3DInPatchClassWise(
mode="pred", config=segmenter_config
)

## Initialize registerer
#self.registerer = oai_analysis_2.registration.AVSM_Registration(
#self.registerer = oai_analysis.registration.AVSM_Registration(
# ckpoint_path=os.path.join(utils.get_data_dir(), "pre_trained_registration_model"),
# config_path =os.path.join(utils.get_data_dir(), "avsm_settings")
#

self.registerer = oai_analysis_2.registration.ICON_Registration()
self.registerer = oai_analysis.registration.ICON_Registration()

## Load Atlas
self.atlas_image = itk.imread(os.path.join(utils.get_data_dir(), "atlas_60_LEFT_baseline_NMI/atlas_image.nii.gz"))
@@ -115,7 +115,7 @@ def deform_probmap_delayed(phi_AB, image_A, image_B, prob, image_type="FC"):
def get_thickness(warped_image, mesh_type):
import itk
import numpy as np
from oai_analysis_2 import mesh_processing as mp
from oai_analysis import mesh_processing as mp

distance_inner, _ = mp.get_thickness_mesh(warped_image, mesh_type=mesh_type)
distance_inner_itk = mp.get_itk_mesh(distance_inner)
@@ -126,12 +126,12 @@ def get_thickness(warped_image, mesh_type):
def segment_method(image_A):
import itk
from itk import IntensityWindowingImageFilter as IntensityWindowingImageFilter
import oai_analysis_2
import oai_analysis
import torch
import os
from os.path import exists
from oai_analysis_2 import utils
from oai_analysis_2.segmentation import segmenter
from oai_analysis import utils
from oai_analysis.segmentation import segmenter
import numpy as np
import urllib.request
import gc
@@ -167,7 +167,7 @@ def segment_method(image_A):
output_itk=True,
)

segmenter = oai_analysis_2.segmentation.segmenter.Segmenter3DInPatchClassWise(
segmenter = oai_analysis.segmentation.segmenter.Segmenter3DInPatchClassWise(
mode="pred", config=segmenter_config
)

File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
167 changes: 167 additions & 0 deletions pyproject.toml
@@ -0,0 +1,167 @@
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "oai-analysis"
dynamic = ["version"]
description = 'Image analysis for the Osteoarthritis Initiative (OAI) knee magnetic resonance image (MRI) dataset.'
readme = "README.md"
requires-python = ">=3.8"
license = "MIT"
keywords = []
authors = [
{ name = "Marc Niethammer", email = "[email protected]" },
{ name = "Hastings Greer", email = "[email protected]" },
{ name = "Matt McCormick", email = "[email protected]" },
{ name = "Pranjal Sahu", email = "[email protected]" },
]
classifiers = [
"Development Status :: 4 - Beta",
"Programming Language :: Python",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
]
dependencies = [
"torch",
"itk==5.3",
"trimesh",
"vtk",
"icon_registration==0.3.4",
"girder_client",
"scikit-image",
]

[project.urls]
Documentation = "https://github.com/uncbiag/OAI_analysis_2#readme"
Issues = "https://github.com/uncbiag/OAI_analysis_2/issues"
Source = "https://github.com/uncbiag/OAI_analysis_2"

[tool.hatch.version]
path = "oai_analysis/__about__.py"

[tool.hatch.envs.default]
dependencies = [
"coverage[toml]>=6.5",
"pytest",
]
[tool.hatch.envs.default.scripts]
test = "pytest {args:tests}"
test-cov = "coverage run -m pytest {args:tests}"
cov-report = [
"- coverage combine",
"coverage report",
]
cov = [
"test-cov",
"cov-report",
]

[[tool.hatch.envs.all.matrix]]
python = ["3.7", "3.8", "3.9", "3.10", "3.11"]

[tool.hatch.envs.lint]
detached = true
dependencies = [
"black>=23.1.0",
"mypy>=1.0.0",
"ruff>=0.0.243",
]
[tool.hatch.envs.lint.scripts]
typing = "mypy --install-types --non-interactive {args:oai_analysis tests}"
style = [
"ruff {args:.}",
"black --check --diff {args:.}",
]
fmt = [
"black {args:.}",
"ruff --fix {args:.}",
"style",
]
all = [
"style",
"typing",
]

[tool.black]
target-version = ["py37"]
line-length = 120
skip-string-normalization = true

[tool.ruff]
target-version = "py37"
line-length = 120
select = [
"A",
"ARG",
"B",
"C",
"DTZ",
"E",
"EM",
"F",
"FBT",
"I",
"ICN",
"ISC",
"N",
"PLC",
"PLE",
"PLR",
"PLW",
"Q",
"RUF",
"S",
"T",
"TID",
"UP",
"W",
"YTT",
]
ignore = [
# Allow non-abstract empty methods in abstract base classes
"B027",
# Allow boolean positional values in function calls, like `dict.get(... True)`
"FBT003",
# Ignore checks for possible passwords
"S105", "S106", "S107",
# Ignore complexity
"C901", "PLR0911", "PLR0912", "PLR0913", "PLR0915",
]
unfixable = [
# Don't touch unused imports
"F401",
]

[tool.ruff.isort]
known-first-party = ["oai_analysis"]

[tool.ruff.flake8-tidy-imports]
ban-relative-imports = "all"

[tool.ruff.per-file-ignores]
# Tests can use magic values, assertions, and relative imports
"tests/**/*" = ["PLR2004", "S101", "TID252"]

[tool.coverage.run]
source_pkgs = ["oai_analysis", "tests"]
branch = true
parallel = true
omit = [
"oai_analysis/__about__.py",
]

[tool.coverage.paths]
oai_analysis = ["oai_analysis", "*/oai-analysis/oai_analysis"]
tests = ["tests", "*/oai-analysis/tests"]

[tool.coverage.report]
exclude_lines = [
"no cov",
"if __name__ == .__main__.:",
"if TYPE_CHECKING:",
]
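
As a usage note, the Hatch environment scripts defined above are typically invoked via `hatch run`; the commands below reflect standard Hatch usage rather than anything specific to this repository:

```console
hatch run test        # "test" script in the default environment (pytest)
hatch run cov         # tests with coverage plus a report
hatch run lint:style  # "style" script in the "lint" environment
```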
7 changes: 0 additions & 7 deletions requirements.txt

This file was deleted.
