Commit

Merge branch 'main' into update-metadata
tsalo authored Jan 23, 2025
2 parents 7f53c29 + bc269f4 commit c480b98
Showing 11 changed files with 80 additions and 46 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -1,3 +1,4 @@
docs/generated/
cubids/_version.py
*.DS_Store

8 changes: 4 additions & 4 deletions CONTRIBUTING.rst
@@ -63,11 +63,11 @@ Ready to contribute? Here's how to set up `cubids` for local development.

$ git clone git@github.com:your_name_here/cubids.git

- 3. Install your local copy into a virtualenv.
-   Assuming you have virtualenvwrapper installed,
-   this is how you set up your fork for local development::
+ 3. Install your local copy into a miniforge environment.
+   This is how you set up your fork for local development::

- $ mkvirtualenv cubids
+ $ mamba create -n cubids python=3.12
+ $ mamba activate cubids
  $ cd cubids/
  $ python setup.py develop
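A quick sanity check on the development install (a sketch, not part of this commit; it assumes the ``cubids`` console script and a ``cubids.__version__`` attribute are exposed once the environment is set up)::

    $ cubids --help
    $ python -c "import cubids; print(cubids.__version__)"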

Empty file added cubids/data/references.bib
Empty file.
1 change: 1 addition & 0 deletions cubids/metadata_merge.py
@@ -304,6 +304,7 @@ def group_by_acquisition_sets(files_tsv, output_prefix, acq_group_level, is_long
"""Find unique sets of Key/Param groups across subjects.
This writes out the following files:
- <output_prefix>_AcqGrouping.tsv: A tsv with the mapping of subject/session to
acquisition group.
- <output_prefix>_AcqGrouping.json: A data dictionary for the AcqGrouping.tsv.
42 changes: 21 additions & 21 deletions docs/api.rst
@@ -25,17 +25,17 @@ API
:toctree: generated/
:template: function.rst

- cubids.workflows.validate
- cubids.workflows.bids_sidecar_merge
- cubids.workflows.group
- cubids.workflows.apply
- cubids.workflows.datalad_save
- cubids.workflows.undo
- cubids.workflows.copy_exemplars
- cubids.workflows.add_nifti_info
- cubids.workflows.purge
- cubids.workflows.remove_metadata_fields
- cubids.workflows.print_metadata_fields
+ workflows.validate
+ workflows.bids_sidecar_merge
+ workflows.group
+ workflows.apply
+ workflows.datalad_save
+ workflows.undo
+ workflows.copy_exemplars
+ workflows.add_nifti_info
+ workflows.purge
+ workflows.remove_metadata_fields
+ workflows.print_metadata_fields


**********************************************
@@ -48,11 +48,11 @@ API
:toctree: generated/
:template: function.rst

- cubids.metadata_merge.check_merging_operations
- cubids.metadata_merge.merge_without_overwrite
- cubids.metadata_merge.merge_json_into_json
- cubids.metadata_merge.get_acq_dictionary
- cubids.metadata_merge.group_by_acquisition_sets
+ metadata_merge.check_merging_operations
+ metadata_merge.merge_without_overwrite
+ metadata_merge.merge_json_into_json
+ metadata_merge.get_acq_dictionary
+ metadata_merge.group_by_acquisition_sets


***********************************
@@ -65,8 +65,8 @@ API
:toctree: generated/
:template: function.rst

- cubids.validator.build_validator_call
- cubids.validator.build_subject_paths
- cubids.validator.run_validator
- cubids.validator.parse_validator_output
- cubids.validator.get_val_dictionary
+ validator.build_validator_call
+ validator.build_subject_paths
+ validator.run_validator
+ validator.parse_validator_output
+ validator.get_val_dictionary
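Dropping the ``cubids.`` prefix from these entries only works if the page sets the module context somewhere above the autosummary blocks. A minimal sketch of how that typically looks (an assumption about unshown parts of ``docs/api.rst``, not text taken from this commit)::

    .. currentmodule:: cubids

    .. autosummary::
       :toctree: generated/
       :template: function.rst

       validator.build_validator_call
       validator.run_validator

With ``currentmodule`` set, Sphinx resolves ``validator.run_validator`` as ``cubids.validator.run_validator`` when generating the stub pages.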
13 changes: 5 additions & 8 deletions docs/conf.py
@@ -73,7 +73,7 @@
# You can specify multiple suffixes as a list of strings:
#
# source_suffix = ['.rst', '.md']
- source_suffix = ".rst"
+ source_suffix = {".rst": "restructuredtext"}

# The master toctree document.
master_doc = "index"
@@ -171,13 +171,10 @@
_python_doc_base = "https://docs.python.org/" + _python_version_str
intersphinx_mapping = {
"python": (_python_doc_base, None),
"numpy": ("https://numpy.org/doc/stable/", (None, "./_intersphinx/numpy-objects.inv")),
"scipy": (
"https://docs.scipy.org/doc/scipy/reference",
(None, "./_intersphinx/scipy-objects.inv"),
),
"sklearn": ("https://scikit-learn.org/stable", (None, "./_intersphinx/sklearn-objects.inv")),
"matplotlib": ("https://matplotlib.org/", (None, "https://matplotlib.org/objects.inv")),
"numpy": ("https://numpy.org/doc/stable/", None),
"scipy": ("https://docs.scipy.org/doc/scipy/reference/", None),
"sklearn": ("https://scikit-learn.org/stable", None),
"matplotlib": ("https://matplotlib.org/stable/", None),
"pandas": ("https://pandas.pydata.org/pandas-docs/stable/", None),
"pybids": ("https://bids-standard.github.io/pybids/", None),
"nibabel": ("https://nipy.org/nibabel/", None),
1 change: 1 addition & 0 deletions docs/contributing.rst
@@ -0,0 +1 @@
.. include:: ../CONTRIBUTING.rst
29 changes: 29 additions & 0 deletions docs/faq.rst
@@ -0,0 +1,29 @@
==========================
Frequently Asked Questions
==========================


--------------------------------------
Does CuBIDS work on all BIDS datasets?
--------------------------------------

CuBIDS relies on many hardcoded rules and data types,
so it may not work on all BIDS datasets.
Some datatypes, such as EEG or iEEG, are not yet supported,
nor are some configurations of supported datatypes, such as multi-echo fMRI.

If you encounter a problem, please open an issue on the CuBIDS GitHub repository.


-----------------------------------------------------
How do the developers determine what features to add?
-----------------------------------------------------

CuBIDS is primarily developed to curate the large-scale datasets used by the PennLINC team.
This means that we will naturally prioritize features that are useful to us.
However, we are always open to suggestions and contributions from the community,
and will of course consider features that do not directly benefit us.

If you want to request support for a new modality or niche data feature,
please open an issue on the CuBIDS GitHub repository.
We are more likely to add support for a new feature if you can point us toward a dataset that we can use to test it.
5 changes: 2 additions & 3 deletions docs/index.rst
@@ -13,8 +13,7 @@ Contents
usage
cli
example
- ../CONTRIBUTING
- ../AUTHORS
- ../HISTORY
+ contributing
+ faq
glossary
api
24 changes: 15 additions & 9 deletions docs/installation.rst
@@ -8,16 +8,15 @@ Installation

.. note::
We **strongly recommend** using ``CuBIDS`` with environment management.
- For this, we recommend `miniconda <https://docs.conda.io/en/latest/miniconda.html>`_
- (`miniforge <https://github.com/conda-forge/miniforge>`_ for M1 Chip Mac Machines).
+ For this, we recommend `miniforge <https://github.com/conda-forge/miniforge>`_.

- Once you've installed conda,
- initialize a new conda environment (for example, named ``cubids``) as follows:
+ Once you've installed mamba,
+ initialize a new mamba environment (for example, named ``cubids``) as follows:

  .. code-block:: console
- $ conda create -n cubids python=3.12 pip
- $ conda activate cubids
+ $ mamba create -n cubids python=3.12 pip
+ $ mamba activate cubids
You are now ready to install CuBIDS.
You can do so in one of two ways.
@@ -29,8 +28,7 @@ To obtain ``CuBIDS`` locally, we can use ``pip`` to download our software from t
$ pip install CuBIDS
- Alternatively,
- you can clone the source code for ``CuBIDS`` from our `GitHub repository`_ using the following command:
+ Alternatively, you can clone the source code for ``CuBIDS`` from our `GitHub repository`_ using the following command:

.. code-block:: console
@@ -43,13 +41,14 @@ Once you have a copy of the source, you can install it with:
$ cd CuBIDS
$ pip install -e .
We will now need to install some dependencies of ``CuBIDS``.
To do this, we first must install deno to run `bids-validator`.
We can accomplish this using the following command:

.. code-block:: console
- $ conda install deno
+ $ mamba install deno
The new schema-based ``bids-validator`` doesn't need to be installed
and will be invoked automatically when `cubids validate` is called
@@ -75,6 +74,13 @@ and will be invoked automatically when `cubids validate` is called
For more information, you can read: https://bids-validator.readthedocs.io/en/latest/user_guide/command-line.html
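To confirm the validator toolchain is wired up before running it on a real dataset (a sketch, not part of this commit; it assumes ``deno`` and the ``cubids`` entry point are on your ``PATH``)::

    $ deno --version
    $ cubids validate --help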

.. tip::
If you want to modify the CuBIDS codebase
(e.g., if you are looking to contribute to CuBIDS),
please follow the installation instructions in
`our contributing guidelines <https://github.com/PennLINC/CuBIDS/blob/main/CONTRIBUTING.rst>`_.


We also recommend using ``CuBIDS`` with the optional ``DataLad`` version control capabilities.
We use ``DataLad`` throughout our walkthrough of the CuBIDS Workflow on
:doc:`the Example Walkthrough page <example>`.
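One way to add the optional ``DataLad`` support inside the same environment (a sketch, not part of this commit; it assumes the conda-forge channel provides ``datalad`` and its ``git-annex`` dependency for your platform)::

    $ mamba install datalad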
2 changes: 1 addition & 1 deletion docs/usage.rst
@@ -110,7 +110,7 @@ but it keeps track of every file's assignment to Entity and Parameter Groups.
.. _acqgrouptsv:

Modifying Entity and Parameter Group Assignments
- ---------------------------------------------
+ ------------------------------------------------

Sometimes we see that there are important differences in acquisition parameters within an Entity Set.
If these differences impact how a pipeline will process the data,
