Iss25 (#43)
* Ouranos ESPO-G6-R2 script + new capabilities

This script introduces new features to the tool, including the
capability to process climate datasets, such as those consisting of
multiple models, submodels (those with specific configuration sets),
ensemble members, and multiple scenarios (SSPs). The parent calling
script is in charge of the parallelization scheme, if needed.

With this script, several issues related to the current deficiencies
of datatool can be resolved simultaneously.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Fixing short usage and comments

* Adding new parallelization schemes

Multiple parallelization schemes are added, so the package not only
submits array jobs based on the given date range and chunking scheme,
but also considers submitting jobs based on various models, ensemble
members, and scenarios. These new parallelization schemes mostly apply
to climate datasets, but not exclusively.

This commit aims to save time for the user and speed up the processing
of datasets.

This commit resolves issue #25 on the remote GitHub repository.
Furthermore, it adds the ESPO dataset to the list of datasets.

Moreover, a new option is implemented to show the list of currently
available datasets to the users.
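The scheme above effectively turns the user's `--model`, `--ensemble`, and `--scenario` selections into a Cartesian product of jobs. A minimal, self-contained sketch of that expansion (the model, ensemble, and scenario names below are made-up examples, not datatool's actual values):

```shell
#!/bin/bash
# Sketch: expand comma-separated model/ensemble/scenario selections into
# individual job specifications, mimicking the parallelization idea
# described in the commit message. Illustrative only.

models="GFDL-ESM4,IPSL-CM6A-LR"
ensembles="r1i1p1f1"
scenarios="ssp245,ssp585"

# split the comma-separated option values into arrays
IFS=',' read -ra model_arr <<< "$models"
IFS=',' read -ra ens_arr <<< "$ensembles"
IFS=',' read -ra scn_arr <<< "$scenarios"

# one job per (model, ensemble, scenario) combination; in datatool each
# combination would additionally be chunked over the requested date range
for m in "${model_arr[@]}"; do
  for e in "${ens_arr[@]}"; do
    for s in "${scn_arr[@]}"; do
      echo "job: model=$m ensemble=$e scenario=$s"
    done
  done
done
```

With the example values above, four jobs are generated rather than one, which is the time saving the commit describes.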

Signed-off-by: Kasra Keshavarz <[email protected]>

* Separating dataset information from the main extract-dataset script

This is meant to clearly organize the information provided inside the
package. The new file lists all the available datasets and the keywords
that users can provide to the `--dataset` option. Previously, this
information was part of the main usage message (`--help`) of the main
script.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Adding GDDP-NEX-CMIP6 info

* Fixing DOI value for ab-gov dataset

* Adding NASA GDDP-NEX-CMIP6 script address

* ESPO-G6-R2 data processing example

* Multiple minor modifications

1. the `function` keyword is added to make the style compatible with
   Google's recommendations,
2. required arguments and options are revised alongside the relevant
   comments,
3. typos are fixed

Signed-off-by: Kasra Keshavarz <[email protected]>

* AB Government Climate Dataset Script

The script deals with the Climate Dataset produced by the Alberta
Government. The dataset is not public yet and is planned to be made
available soon.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Adding variable list for various elevation levels

Since some hydrological models can use near-surface or 40m-level data,
the necessary lists of variables for both levels are added.

Furthermore, a link to the official website for the dataset is added for
further clarity.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Path to the dataset for rpp-kshook allocation is updated

Since multiple HPCs are now used for the workflows, it is important to
have consistent datasets synchronized regularly. Therefore, this commit
attempts to reflect these efforts by creating consistent paths for
various HPCs/allocations.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Bumping version to v0.5.0

* Addressing issues #39, #37, #36, #35, #34, and #25

In this commit, the following are addressed:
 * Correcting paths for the local scripts,
 * Renaming scripts to reflect the owner of the script for further
   clarification,
 * Adding parallelization schemes based on model, ensemble, and scenario,
 * Adding gcc/9.3.0 as the reference clib for the modules loaded to
   prevent mismatch between various environments defined on the HPCs,
 * Assuring EPSG:4326 is considered for the input shapefile if there is
   no CRS defined,
 * Getting rid of \t characters in the help messages,
 * Correcting short help message to be more informative,
 * Adding function declarations to follow Google’s shell scripting
   guidelines,
 * Assuring --account=STR is described in the help message.
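The EPSG:4326 fallback listed above can be illustrated with a minimal sketch; the `.prj`-sidecar check below is an assumption chosen for illustration, not datatool's exact implementation:

```shell
#!/bin/bash
# Sketch: fall back to EPSG:4326 when a shapefile ships without a .prj
# sidecar (i.e., no CRS defined). Illustrative assumption only.

shapefile="basin.shp"            # hypothetical input name for the demo
prj_file="${shapefile%.shp}.prj"

touch "$shapefile"               # stand-in shapefile with no .prj next to it

if [ -f "$prj_file" ]; then
  crs="existing"                 # honour the CRS shipped with the shapefile
else
  crs="EPSG:4326"                # assume geographic lat/lon when undefined
fi
echo "Using CRS: $crs"           # prints "Using CRS: EPSG:4326"

rm -f "$shapefile"               # clean up the demo file
```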

Signed-off-by: Kasra Keshavarz <[email protected]>

* Assuring compatibility of the style with Google's shell scripting guidelines

* Organizing the assets directory

Various files within this directory are categorized to be more
informative for the users/devs.

Signed-off-by: Kasra Keshavarz <[email protected]>

* README file for ab-gov dataset

The README file for this dataset is added, offering necessary
information for the users.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Minor structural changes

This commit assures all dataset scripts follow the convention of
<institute>-<dataset-name> under the `scripts` path.

Furthermore, necessary adjustments to the styles of the scripts have
been implemented, including:
  * adding `--model`, `--scenario`, and `--ensemble` options, if missing,
    for compatibility with the main caller script, as these options are
    given to the script by the `extract-dataset.sh` script,
  * assuring the scripting style follows Google's shell scripting
    guidelines,
  * properly adjusting the paths to externally called scripts after
    modifications to the structure of datatool's `assets` directory, and
  * minor changes to the source code to assure compatibility with
    v0.5.0 of datatool.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Tracking LICENSE of eccc-rdrs

* Tracking eccc-rdrs script

* Tracking GWF-NCAR CONUS-I script

* Documentation for NASA's NEX-GDDP-CMIP6 dataset

This commit addresses issue #27 by describing NASA's NEX-GDDP-CMIP6
dataset and the relevant scripts for it. Furthermore, it provides the
necessary information to enable users to use `datatool` for extracting
subsets of the dataset for any temporal and spatial extent.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Script for NASA's NEX-GDDP-CMIP6 dataset

This commit addresses issue #27 and provides scripts to extract subsets
from NASA's NEX-GDDP-CMIP6 dataset. The script is capable of working
with the various models, scenarios, ensemble members, and variables
offered by this dataset.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Adding Ouranos ESPO-G6-R2 Dataset Script

This commit addresses issue #34 and adds a script to process this
dataset, which contains multiple GCM model outputs, including various
sub-models, scenarios, ensemble members, and variables.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Documenting Ouranos ESPO-G6-R2 Dataset script

Necessary information to use `datatool` for this script is provided to
the user via the README.md file.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Updating changelog for v0.5.0

* Adding a section for WIP directories

* Restructuring script directory

With the growing number of scripts, this commit tries to restructure
this directory to provide more clarity and organization for the users.

Signed-off-by: Kasra Keshavarz <[email protected]>

* Updates to the documentations

The help message has been revised to provide more information to the
users. This includes the values provided to `--lon-lims`, which must be
within the [-180, +180] limits. This had not been mentioned to the
users before and could have caused confusion, as there are multiple
conventions for describing longitudes.

Furthermore, the list of datasets on the main page of the repository has
been updated to reflect the most up-to-date list.
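Because several longitude conventions exist, a small helper converting from the 0-360 convention to the [-180, +180] range expected by `--lon-lims` can be sketched as follows (an illustrative utility, not part of datatool):

```shell
#!/bin/bash
# Convert a longitude given in the 0-360 convention to the [-180, +180]
# convention expected by the --lon-lims option.
to_signed_lon() {
  local lon="$1"
  # values above 180 wrap around to the western (negative) hemisphere
  awk -v l="$lon" 'BEGIN { if (l > 180) l -= 360; print l }'
}

to_signed_lon 245   # prints -115
to_signed_lon 63    # prints 63
```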

Signed-off-by: Kasra Keshavarz <[email protected]>

* Upgrading style of warning message

* Upgrading style of warning message

* Updating link addresses for CONUS I & II

* Updating link address to ERA5 dataset

* Removing dead link for the Ouranos MRCC5 dataset for now

---------

Signed-off-by: Kasra Keshavarz <[email protected]>
kasra-keshavarz authored Mar 6, 2024
1 parent 5bfb062 commit 66140ec
Showing 51 changed files with 2,631 additions and 2,027 deletions.
3 changes: 3 additions & 0 deletions .gitignore
@@ -3,3 +3,6 @@
.ipynb_checkpoints
.DS_Store
*.swp

# WIP folders
scripts/ouranos-crcm5-cmip6/
15 changes: 15 additions & 0 deletions CHANGELOG
@@ -1,5 +1,20 @@
Changelog
=========
[v0.5.0] - March 5th, 2024
# Improvements
* `DATASETS` file now describes all the datasets available in the script
* new parallelization schemes are introduced using models, scenarios,
  and ensemble members
* the `assets` directory is now more organized, separating the common
  NCL and bash scripts needed
* the style of the scripts is updated (not completely) to be more
  compatible with Google's shell scripting style guidelines
* documentation has been updated
# Datasets
* Ouranos ESPO-G6-R2 CMIP6 script added (~9TBs)
* NASA GDDP-NEX-CMIP6 script added (~37TBs)
* Alberta Government's CMIP6 script added (~0.1TBs)

[v0.4.1] - September 21st, 2023
# Fixed
* minor bug fixes
80 changes: 80 additions & 0 deletions DATASETS
@@ -0,0 +1,80 @@
|--------------------------------|-----------|---------------------------|
| DATASET NAME | keyword | DOI |
|--------------------------------|-----------|---------------------------|
|1. NCAR-GWF WRF CONUS I | conus_i | 10.1007/s00382-016-3327-9 |
|2. NCAR-GWF WRF CONUS II | conus_ii | 10.5065/49SN-8E08 |
|3. ECMWF ERA5 | era5 | 10.24381/cds.adbb2d47 |
|4. ECCC RDRSv2.1 | rdrs | 10.5194/hess-25-4917-2021 |
|5. CCRN CanRCM4-WFDEI-GEM-CaPA | canrcm4_g | 10.5194/essd-12-629-2020 |
|6. WFDEI-GEM-CaPA | wfdei_g | 10.20383/101.0111 |
|7. ORNL Daymet | daymet | 10.3334/ORNLDAAC/2129 |
|8. Alberta Government           | ab-gov    | 10.5194/hess-23-5151-2019 |
| 8.1. BCC-CSM2-MR | | ditto |
| 8.2. CNRM-CM6-1 | | ditto |
| 8.3. EC-Earth3-Veg | | ditto |
| 8.4. GFDL-CM4 | | ditto |
| 8.5. GFDL-ESM4 | | ditto |
| 8.6. IPSL-CM6A-LR | | ditto |
| 8.7. MRI-ESM2-0 | | ditto |
| 8.8. Hybrid-observation | | ditto |
|9. Ouranos ESPO-G6-R2 |espo-r6-r2 |10.1038/s41597-023-02855-z |
| 9.1. AS-RCEC | | ditto |
| 9.2. BCC | | ditto |
| 9.3. CAS | | ditto |
| 9.4. CCCma | | ditto |
| 9.5. CMCC | | ditto |
| 9.6. CNRM-CERFACS | | ditto |
| 9.7. CSIRO | | ditto |
| 9.8. CSIRO-ARCCSS | | ditto |
| 9.9. DKRZ | | ditto |
| 9.10. EC-Earth-Con | | ditto |
| 9.11. INM | | ditto |
| 9.12. IPS | | ditto |
| 9.13. MIROC | | ditto |
| 9.14. MOHC | | ditto |
| 9.15. MPI-M | | ditto |
| 9.16. MRI | | ditto |
| 9.17. NCC | | ditto |
| 9.18. NIMS-KMA | | ditto |
| 9.19. NOAA-GFDL | | ditto |
| 9.20. NUIST | | ditto |
|10. Ouranos MRCC5-CMIP6 |crcm5-cmip6| TBD |
| 10.1. CanESM5 | | TBD |
| 10.2. MPI-ESM1-2-LR | | TBD |
|11. NASA GDDP-NEX-CMIP6 | gddp-nex |10.1038/s41597-022-01393-4 |
| 11.0. ACCESS-CM2 | | ditto |
| 11.1. ACCESS-ESM1-5 | | ditto |
| 11.2. BCC-CSM2-MR | | ditto |
| 11.3. CanESM5 | | ditto |
| 11.4. CESM2 | | ditto |
| 11.5. CESM2-WACCM | | ditto |
| 11.6. CMCC-CM2-SR5 | | ditto |
| 11.7. CMCC-ESM2 | | ditto |
| 11.8. CNRM-CM6-1 | | ditto |
| 11.9. CNRM-ESM2-1 | | ditto |
| 11.10. EC-Earth3 | | ditto |
| 11.11. EC-Earth3-Veg-LR | | ditto |
| 11.12. FGOALS-g3 | | ditto |
| 11.13. GFDL-CM4 | | ditto |
| 11.14. GFDL-CM4_gr2 | | ditto |
| 11.15. GFDL-ESM4 | | ditto |
| 11.16. GISS-E2-1-G | | ditto |
| 11.17. HadGEM3-GC31-LL | | ditto |
| 11.18. HadGEM3-GC31-MM | | ditto |
| 11.19. IITM-ESM | | ditto |
| 11.20. INM-CM4-8 | | ditto |
| 11.21. INM-CM5-0 | | ditto |
| 11.22. IPSL-CM6A-LR | | ditto |
| 11.23. KACE-1-0-G | | ditto |
| 11.24. KIOST-ESM | | ditto |
| 11.25. MIROC6 | | ditto |
| 11.26. MIROC-ES2L | | ditto |
| 11.27. MPI-ESM1-2-HR | | ditto |
| 11.28. MPI-ESM1-2-LR | | ditto |
| 11.29. MRI-ESM2-0 | | ditto |
| 11.30. NESM3 | | ditto |
| 11.31. NorESM2-LM | | ditto |
| 11.32. NorESM2-MM | | ditto |
| 11.33. TaiESM1 | | ditto |
| 11.34. UKESM1-0-LL | | ditto |
|--------------------------------|-----------|---------------------------|
118 changes: 65 additions & 53 deletions README.md
@@ -4,73 +4,79 @@ This repository contains scripts to process meteorological datasets in NetCDF fi
```console
Usage:
extract-dataset [options...]

Script options:
-d, --dataset Meteorological forcing dataset of interest
-i, --dataset-dir=DIR The source path of the dataset file(s)
-v, --variable=var1[,var2[...]] Variables to process
-o, --output-dir=DIR Writes processed files to DIR
-s, --start-date=DATE The start date of the data
-e, --end-date=DATE The end date of the data
-l, --lat-lims=REAL,REAL Latitude's upper and lower bounds
-n, --lon-lims=REAL,REAL Longitude's upper and lower bounds
-a, --shape-file=PATH Path to the ESRI shapefile; optional
-m, --ensemble=ens1,[ens2[...]] Ensemble members to process; optional
Leave empty to extract all ensemble members
-j, --submit-job Submit the data extraction process as a job
on the SLURM system; optional
-k, --no-chunk No parallelization, recommended for small domains
-p, --prefix=STR Prefix prepended to the output files
-b, --parsable Parsable SLURM message mainly used
for chained job submissions
-c, --cache=DIR Path of the cache directory; optional
-E, [email protected] E-mail user when job starts, ends, or
fails; optional
-u, --account Digital Research Alliance of Canada's sponsor's
account name; optional, defaults to 'rpp-kshook'
-V, --version Show version
-h, --help Show this screen and exit
-d, --dataset Meteorological forcing dataset of interest
-i, --dataset-dir=DIR The source path of the dataset file(s)
-v, --variable=var1[,var2[...]] Variables to process
-o, --output-dir=DIR Writes processed files to DIR
-s, --start-date=DATE The start date of the data
-e, --end-date=DATE The end date of the data
-l, --lat-lims=REAL,REAL Latitude's upper and lower bounds;
optional; within the [-90, +90] limits
-n, --lon-lims=REAL,REAL Longitude's upper and lower bounds;
optional; within the [-180, +180] limits
-a, --shape-file=PATH Path to the ESRI shapefile; optional
-m, --ensemble=ens1,[ens2,[...]] Ensemble members to process; optional
Leave empty to extract all ensemble members
-M, --model=model1,[model2,[...]] Models that are part of a dataset,
only applicable to climate datasets, optional
-S, --scenario=scn1,[scn2,[...]] Climate scenarios to process, only applicable
to climate datasets, optional
-j, --submit-job Submit the data extraction process as a job
on the SLURM system; optional
-k, --no-chunk No parallelization, recommended for small domains
-p, --prefix=STR Prefix prepended to the output files
-b, --parsable Parsable SLURM message mainly used
for chained job submissions
-c, --cache=DIR Path of the cache directory; optional
-E, [email protected] E-mail user when job starts, ends, or
fails; optional
-u, --account Digital Research Alliance of Canada's sponsor's
account name; optional, defaults to 'rpp-kshook'
-L, --list-datasets List all the available datasets and the
corresponding keywords for '--dataset' option
-V, --version Show version
-h, --help Show this screen and exit

```
# Available Datasets
|# |Dataset |Time Scale |DOI |Description |
|--|--------------------------|--------------------------------|-------------------------|--------------------------------------|
|1 |WRF-CONUS I (control) |Hourly (Oct 2000 - Dec 2013) |10.1007/s00382-016-3327-9|[link](./scripts/conus_i) |
|2 |WRF-CONUS II (control)[^1]|Hourly (Jan 1995 - Dec 2015) |10.5065/49SN-8E08 |[link](./scripts/conus_ii) |
|3 |ERA5[^2] |Hourly (Jan 1950 - Dec 2020) |10.24381/cds.adbb2d47 and [link](https://cds.climate.copernicus.eu/cdsapp#!/dataset/reanalysis-era5-single-levels-preliminary-back-extension?tab=overview)|[link](./scripts/era5)|
|4 |RDRS v2.1 |Hourly (Jan 1980 - Dec 2018) |10.5194/hess-25-4917-2021|[link](./scripts/rdrs) |
|5 |CanRCM4-WFDEI-GEM-CaPA |3-Hourly (Jan 1951 - Dec 2100) |10.5194/essd-12-629-2020 |[link](./scripts/canrcm4_wfdei_gem_capa)|
|6 |WFDEI-GEM-CaPA            |3-Hourly (Jan 1979 - Dec 2016)  |10.20383/101.0111        |[link](./scripts/wfdei_gem_capa) |
|7 |Daymet |Daily (Jan 1980 - Dec 2022)[^3] |10.3334/ORNLDAAC/2129 |[link](./scripts/daymet) |
|8 |BCC-CSM2-MR |Daily (Jan 1950 - Dec 2100)[^4] |*TBD* |[link](./scripts/bcc-csm2-mr) |
|9 |CNRM-CM6-1 |Daily (Jan 1950 - Dec 2100)[^4] |*TBD* |[link](./scripts/cnrm-cm6-1) |
|10|EC-Earth3-Veg |Daily (Jan 1950 - Dec 2100)[^4] |*TBD* |[link](./scripts/ec-earth3-veg) |
|11|GDFL-CM4 |Daily (Jan 1950 - Dec 2100)[^4] |*TBD* |[link](./scripts/gdfl-cm4) |
|12|GDFL-ESM4 |Daily (Jan 1950 - Dec 2100)[^4] |*TBD* |[link](./scripts/gdfl-esm4) |
|13|IPSL-CM6A-LR |Daily (Jan 1950 - Dec 2100)[^4] |*TBD* |[link](./scripts/ipsl-cm6a-lr) |
|14|MRI-ESM2-0 |Daily (Jan 1950 - Dec 2100)[^4] |*TBD* |[link](./scripts/mri-esm2-0) |
|15|Hybrid Observation(AB Gov)|Daily (Jan 1950 - Dec 2019)[^4] |10.5194/hess-23-5151-2019|[link](./scripts/hybrid_obs) |
|# |Dataset |Time Period |DOI |Description |
|--|---------------------------|--------------------------------|--------------------------|-------------------------------------|
|1 |GWF-NCAR WRF-CONUS I |Hourly (Oct 2000 - Dec 2013) |10.1007/s00382-016-3327-9 |[link](./scripts/gwf-ncar-conus_i) |
|2 |GWF-NCAR WRF-CONUS II[^1] |Hourly (Jan 1995 - Dec 2015) |10.5065/49SN-8E08 |[link](./scripts/gwf-ncar-conus_ii) |
|3 |ECMWF ERA5[^2] |Hourly (Jan 1950 - Dec 2020) |10.24381/cds.adbb2d47 and [link](https://cds.climate.copernicus.eu/cdsapp#!/dataset/reanalysis-era5-single-levels-preliminary-back-extension?tab=overview)|[link](./scripts/ecmwf-era5)|
|4 |ECCC RDRSv2.1 |Hourly (Jan 1980 - Dec 2018) |10.5194/hess-25-4917-2021 |[link](./scripts/eccc-rdrs) |
|5 |CCRN CanRCM4-WFDEI-GEM-CaPA|3-Hourly (Jan 1951 - Dec 2100) |10.5194/essd-12-629-2020 |[link](./scripts/ccrn-canrcm4_wfdei_gem_capa)|
|6 |CCRN WFDEI-GEM-CaPA |3-Hourly (Jan 1979 - Dec 2016) |10.20383/101.0111 |[link](./scripts/ccrn-wfdei_gem_capa)|
|7 |ORNL Daymet |Daily (Jan 1980 - Dec 2022)[^3] |10.3334/ORNLDAAC/2129 |[link](./scripts/ornl-daymet) |
|8 |Alberta Gov Climate Dataset|Daily (Jan 1950 - Dec 2100)     |10.5194/hess-23-5151-2019 |[link](./scripts/ab-gov)             |
|9 |Ouranos ESPO-G6-R2 |Daily (Jan 1950 - Dec 2100) |10.1038/s41597-023-02855-z|[link](./scripts/ouranos-espo-g6-r2) |
|10|Ouranos MRCC5-CMIP6        |Hourly (Jan 1950 - Dec 2100)    |TBD                       |link                                 |
|11|NASA NEX-GDDP-CMIP6 |Daily (Jan 1950 - Dec 2100) |10.1038/s41597-022-01393-4|[link](./scripts/nasa-nex-gddp-cmip6)|

[^1]: For access to the files on Graham cluster, please contact [Stephen O'Hearn](mailto:[email protected]).
[^2]: ERA5 data from 1950-1979 are based on [ERA5 preliminary extenion](https://cds.climate.copernicus.eu/cdsapp#!/dataset/reanalysis-era5-single-levels-preliminary-back-extension?tab=overview) and 1979 onwards are based on [ERA5 1979-present](https://doi.org/10.24381/cds.adbb2d47).
[^3]: For the Puerto Rico domain of the dataset, data are available from January 1950 until December 2022.
[^4]: Data is not publicly available yet. DOI is to be determined once the relevant paper is published.

# General Example
As an example, follow the code block below. Please remember that you MUST have access to Graham cluster with Digital Research Alliance of Canada (DRA) and have access to `CONUS I` model outputs. Also, remember to generate a [Personal Access Token](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token) with GitHub in advance. Enter the following codes in your Graham shell as a test case:
As an example, follow the code block below. Please remember that you MUST have access to Digital Research Alliance of Canada (DRA) clusters (specifically `Graham`) and have access to `RDRSv2.1` model outputs. Also, remember to generate a [Personal Access Token](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token) with GitHub in advance. Enter the following codes in your Graham shell as a test case:

```console
foo@bar:~$ git clone https://github.com/kasra-keshavarz/datatool # clone the repository
foo@bar:~$ cd ./datatool/ # move to the repository's directory
foo@bar:~$ ./extract-dataset.sh -h # view the usage message
foo@bar:~$ ./extract-dataset.sh --dataset=CONUS1 \
--dataset-dir="/project/rpp-kshook/Model_Output/WRF/CONUS/CTRL" \
--output-dir="$HOME/scratch/conus_i_output/" \
--start-date="2001-01-01 00:00:00" \
--end-date="2001-12-31 23:00:00" \
--lat-lims=49,51 \
--lon-lims=-117,-115 \
--variable=T2,PREC_ACC_NC,Q2,ACSWDNB,ACLWDNB,U10,V10,PSFC \
--prefix="conus_i";
foo@bar:~$ ./extract-dataset.sh \
--dataset="rdrs" \
--dataset-dir="/project/rpp-kshook/Climate_Forcing_Data/meteorological-data/rdrsv2.1" \
--output-dir="$HOME/scratch/rdrs_outputs/" \
--start-date="2001-01-01 00:00:00" \
--end-date="2001-12-31 23:00:00" \
--lat-lims=49,51 \
--lon-lims=-117,-115 \
--variable="RDRS_v2.1_A_PR0_SFC,RDRS_v2.1_P_HU_09944" \
--prefix="testing_";
```
See the [examples](./examples) directory for real-world scripts for each meteorological dataset included in this repository.

@@ -80,10 +86,16 @@ only in cases where jobs are submitted to clusters' schedulers. If
processing is not submitted as a job, then the logs are printed on screen.

# New Datasets
If you are considering any new dataset to be added to the data repository, and subsequently the associated scripts added here, you can open a new ticket on the **Issues** tab of the current repository. Or, you can make a [Pull Request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request) on this repository with your own script.
If you are considering any new dataset to be added to the data
repository, and subsequently the associated scripts added here,
you can open a new ticket on the **Issues** tab of the current
repository. Or, you can make a
[Pull Request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request)
on this repository with your own script.

# Support
Please open a new ticket on the **Issues** tab of the current repository in case of any issues.
Please open a new ticket on the **Issues** tab of this repository for
support.

# License
Meteorological Data Processing Workflow - datatool <br>
2 changes: 1 addition & 1 deletion VERSION
@@ -1 +1 @@
0.4.2-dev
0.5.0
28 changes: 28 additions & 0 deletions assets/bash_scripts/extract_subdir_level.sh
@@ -0,0 +1,28 @@
#!/bin/bash

# Input comma-separated string
root_path="$1"
input_string="$2"

# Split the input string by comma
IFS=',' read -ra directories <<< "$input_string"

# Initialize an empty string to store results
result_string=""

# Iterate over each directory
for dir in "${directories[@]}"; do
# Find subdirectories
IFS=' ' read -ra subdirs <<< "$(find "$root_path/$dir" -mindepth 1 -maxdepth 1 -type d -printf "%f ")"

# Prepend each subdirectory with its original value from input_string
for subdir in "${subdirs[@]}"; do
result_string+="$dir/${subdir##*/},"
done
done

# Remove the trailing comma, if any
result_string=${result_string%,}

echo "$result_string"
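As a quick sanity check, the helper's expansion logic can be exercised against a throwaway directory tree (the model and scenario names below are placeholders, not actual dataset directories):

```shell
#!/bin/bash
# Demo of the subdirectory-expansion logic in extract_subdir_level.sh,
# run against a temporary directory tree; ModelA/ModelB and the
# scenario names are placeholders only.
set -e

root="$(mktemp -d)"
mkdir -p "$root/ModelA/ssp245" "$root/ModelA/ssp585" "$root/ModelB/historical"

result_string=""
IFS=',' read -ra directories <<< "ModelA,ModelB"
for dir in "${directories[@]}"; do
  # list first-level subdirectories of each requested directory
  while IFS= read -r subdir; do
    result_string+="$dir/$subdir,"
  done < <(find "$root/$dir" -mindepth 1 -maxdepth 1 -type d -printf "%f\n" | sort)
done
result_string=${result_string%,}

echo "$result_string"   # prints ModelA/ssp245,ModelA/ssp585,ModelB/historical
rm -rf "$root"
```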

File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
