Commit 13c4f85

Merge pull request #53 from ASFHyP3/develop

Release 0.4.0

jhkennedy authored Jan 22, 2021
2 parents 1704018 + 7e3be52 commit 13c4f85

Showing 26 changed files with 892 additions and 324 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/static-analysis.yml
@@ -10,7 +10,7 @@ jobs:

       - uses: actions/setup-python@v1
         with:
-          python-version: 3.7
+          python-version: 3.8

- name: Install dependencies
run: |
@@ -43,7 +43,7 @@ jobs:

       - uses: actions/setup-python@v1
         with:
-          python-version: 3.7
+          python-version: 3.8

- name: Install dependencies
run: |
25 changes: 12 additions & 13 deletions .github/workflows/test-and-build.yml
@@ -11,11 +11,12 @@ on:
- develop

 env:
-  HYP3_REGISTRY: 626226570674.dkr.ecr.us-east-1.amazonaws.com
-  S3_PYPI_HOST: hyp3-pypi.s3-website-us-east-1.amazonaws.com
-  AWS_REGION: us-east-1
-  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
-  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
+  HYP3_REGISTRY: 845172464411.dkr.ecr.us-west-2.amazonaws.com
+  S3_PYPI_HOST: hyp3-pypi-west.s3-website-us-west-2.amazonaws.com
+  S3_PYPI_BUCKET: hyp3-pypi-west
+  AWS_REGION: us-west-2
+  AWS_ACCESS_KEY_ID: ${{ secrets.V2_AWS_ACCESS_KEY_ID }}
+  AWS_SECRET_ACCESS_KEY: ${{ secrets.V2_AWS_SECRET_ACCESS_KEY }}

jobs:
pytest:
@@ -26,7 +27,7 @@ jobs:
       - uses: conda-incubator/setup-miniconda@v2
         with:
           auto-update-conda: true
-          python-version: 3.7
+          python-version: 3.8
           activate-environment: hyp3-autorift
           environment-file: conda-env.yml

@@ -39,9 +40,8 @@ jobs:
       - name: Safety analysis of conda environment
         shell: bash -l {0}
         run: |
-          # Ignore Safety vulerability #38264, GDAL < 3.1, because GAMMA binaries are built against GDAL 2.*
-          python -m pip freeze | safety check --full-report -i 38264 --stdin
-          conda list --export | awk -F '=' '/^[^#]/ {print $1 "==" $2}' | safety check --full-report -i 38264 --stdin
+          python -m pip freeze | safety check --full-report --stdin
+          conda list --export | awk -F '=' '/^[^#]/ {print $1 "==" $2}' | safety check --full-report --stdin
package:
@@ -128,9 +128,8 @@ jobs:
           export SDIST_VERSION=$(python setup.py --version)
           echo "::set-output name=version::${SDIST_VERSION}"
           python setup.py sdist bdist_wheel
-          echo "Uploading version ${SDIST_VERSION} to S3-PyPI"
-          s3pypi --bucket hyp3-pypi --force --verbose
+          echo "Uploading version ${SDIST_VERSION} to ${S3_PYPI_BUCKET}"
+          s3pypi --bucket ${S3_PYPI_BUCKET} --private --force --verbose
dockerize:
runs-on: ubuntu-latest
@@ -152,7 +151,7 @@ jobs:
           export SDIST_VERSION=${{ needs.package.outputs.SDIST_VERSION }}
           export CI_JOB_TIMESTAMP=$(date --utc --rfc-3339=seconds)
           echo "Building ${HYP3_REGISTRY}/${GITHUB_REPOSITORY##*/}:${SDIST_VERSION/+/_}"
-          docker pull ${HYP3_REGISTRY}/${GITHUB_REPOSITORY##*/}:test
+          docker pull ${HYP3_REGISTRY}/${GITHUB_REPOSITORY##*/}:test || true
           docker build --no-cache \
             -t ${HYP3_REGISTRY}/${GITHUB_REPOSITORY##*/}:${SDIST_VERSION/+/_} \
             --label org.opencontainers.image.created="${CI_JOB_TIMESTAMP}" \
23 changes: 23 additions & 0 deletions CHANGELOG.md
@@ -6,6 +6,29 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.4.0](https://github.com/ASFHyP3/hyp3-autorift/compare/v0.3.3...v0.4.0)

**HyP3 v1 is no longer supported as of this release.**

### Added
* Added support for global processing (previously only Greenland and Antarctica)
by pointing at the new autoRIFT parameter files provided by JPL
* Added support for processing Landsat-8 Collection 2 scene pairs
* Added example documentation for submitting autoRIFT jobs via the [HyP3 SDK](docs/sdk_example.ipynb)
or [HyP3 API](docs/api_example.md)

### Changed
* Sentinel-2 support now targets level-1c products instead of level-2a products to
  remove the baked-in slope correction
* The `hyp3_autorift` entrypoint now kicks off HyP3 v2 processing (options have changed; see `--help`)

### Fixed
* 1/2 pixel offset in the netCDF file due to GDAL and netCDF using different pixel reference points

### Removed
* The `autorift` entrypoint and HyP3 v1 support have been removed
* The `hyp3_autorift_v2` entrypoint has been removed (now just `hyp3_autorift`)

## [0.3.1](https://github.com/ASFHyP3/hyp3-autorift/compare/v0.3.0...v0.3.1)

### Changed
Expand Down
9 changes: 4 additions & 5 deletions conda-env.yml
@@ -8,6 +8,10 @@ dependencies:
   - python=3.8
   - pip
   # For packaging, and testing
+  - flake8
+  - flake8-import-order
+  - flake8-blind-except
+  - flake8-builtins
   - pillow
   - pytest
   - pytest-console-scripts
@@ -25,15 +29,10 @@ dependencies:
   - matplotlib-base
   - netCDF4
   - numpy
-  - psycopg2  # missing hyp3proclib dep
   - requests
   - scikit-image  # missing autoRIFT dep
   - scipy
   - pip:
       # for packaging and testing
       - s3pypi
       - safety
-      # For running
-      - --trusted-host hyp3-pypi.s3-website-us-east-1.amazonaws.com
-        --extra-index-url http://hyp3-pypi.s3-website-us-east-1.amazonaws.com
-      - hyp3proclib>=1.0.1,<2
162 changes: 162 additions & 0 deletions docs/api_example.md
@@ -0,0 +1,162 @@
# Using the HyP3 API for autoRIFT

AutoRIFT's HyP3 API is built on [OpenAPI](https://www.openapis.org/) and
[Swagger](https://swagger.io/), and is available at:

https://hyp3-autorift.asf.alaska.edu/ui

In order to use the API, you'll need an `asf-urs` session cookie, which you can get
by [signing in to Vertex](https://search.asf.alaska.edu/#/).

![vertex sign in](imgs/vertex-sign-in.png)

## Confirm you are authenticated

To confirm you are authenticated, you can run a `GET` request to our `/user` endpoint.
Select the blue `GET` button next to `/user` and click the `Try it out` button:
![GET /user try](imgs/get_user_try.png)

Then, execute the request and look at the response:
![GET /user execute](imgs/get_user_execute.png)

If you get a `Code 200`, you should see a JSON dictionary of your user information.
If you get a `Code 401`, you are not currently authenticated.
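
If you'd rather check from a script than from the Swagger UI, the same request can
be made with Python's `requests` library. This is a minimal sketch, assuming you've
copied your `asf-urs` cookie value out of your browser's developer tools:

```python
import requests

API_URL = 'https://hyp3-autorift.asf.alaska.edu'

session = requests.Session()
# Placeholder value -- paste your actual `asf-urs` cookie here
session.cookies.set('asf-urs', 'PASTE_YOUR_COOKIE_VALUE_HERE')

response = session.get(f'{API_URL}/user')
if response.status_code == 200:
    print(response.json())  # a JSON dictionary of your user information
elif response.status_code == 401:
    print('Not authenticated; sign in to Vertex and copy a fresh cookie')
```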

## Submitting jobs

Jobs are submitted through the API by providing a JSON payload with a list of job
definitions. A minimal job list for a single Sentinel-1 autoRIFT job would look like:

```json
{
  "jobs": [
    {
      "job_type": "AUTORIFT",
      "name": "s1-example",
      "job_parameters": {
        "granules": [
          "S1A_IW_SLC__1SSH_20170221T204710_20170221T204737_015387_0193F6_AB07",
          "S1B_IW_SLC__1SSH_20170227T204628_20170227T204655_004491_007D11_6654"
        ]
      }
    }
  ]
}
```

The job list may contain up to 200 job definitions.

### Sentinel-1, Sentinel-2, and Landsat-8

For each supported satellite mission, the granule (scene) pairs to process are
provided by ID:
* Sentinel-1: [ESA granule ID](https://sentinel.esa.int/web/sentinel/user-guides/sentinel-1-sar/naming-conventions)
* Sentinel-2: [ESA granule ID](https://sentinel.esa.int/web/sentinel/user-guides/sentinel-2-msi/naming-convention)
*or* [Element 84 Earth Search ID](https://registry.opendata.aws/sentinel-2/)
* Landsat-8 Collection 2: [USGS scene ID](https://www.usgs.gov/faqs/what-naming-convention-landsat-collection-2-level-1-and-level-2-scenes?qt-news_science_products=0#qt-news_science_products)

To submit an example set of jobs including all supported missions, you could write a job list like:

```json
{
  "jobs": [
    {
      "name": "s1-example",
      "job_parameters": {
        "granules": [
          "S1A_IW_SLC__1SSH_20170221T204710_20170221T204737_015387_0193F6_AB07",
          "S1B_IW_SLC__1SSH_20170227T204628_20170227T204655_004491_007D11_6654"
        ]
      },
      "job_type": "AUTORIFT"
    },
    {
      "name": "s2-esa-example",
      "job_parameters": {
        "granules": [
          "S2B_MSIL1C_20200612T150759_N0209_R025_T22WEB_20200612T184700",
          "S2A_MSIL1C_20200627T150921_N0209_R025_T22WEB_20200627T170912"
        ]
      },
      "job_type": "AUTORIFT"
    },
    {
      "name": "s2-cog-example",
      "job_parameters": {
        "granules": [
          "S2B_22WEB_20200612_0_L1C",
          "S2A_22WEB_20200627_0_L1C"
        ]
      },
      "job_type": "AUTORIFT"
    },
    {
      "name": "l8-example",
      "job_parameters": {
        "granules": [
          "LC08_L1TP_009011_20200703_20200913_02_T1",
          "LC08_L1TP_009011_20200820_20200905_02_T1"
        ]
      },
      "job_type": "AUTORIFT"
    }
  ]
}
```

With your JSON jobs definition, you can `POST` to the `/jobs` endpoint to
submit the jobs.

1. click the green `POST` button next to `/jobs`
2. click `Try it out` on the right
3. paste your jobs definition into the `Request body`
4. click `execute`

![POST /jobs execute](imgs/post_jobs_execute.png)

If your jobs were submitted successfully you should see a `Code 200` response and
JSON response of your job list, with some additional job attributes filled in.
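
The same submission can also be scripted. As a minimal sketch, reusing the
authenticated `requests` session from the sketch above and the minimal
Sentinel-1 job list:

```python
jobs = {
    'jobs': [
        {
            'job_type': 'AUTORIFT',
            'name': 's1-example',
            'job_parameters': {
                'granules': [
                    'S1A_IW_SLC__1SSH_20170221T204710_20170221T204737_015387_0193F6_AB07',
                    'S1B_IW_SLC__1SSH_20170227T204628_20170227T204655_004491_007D11_6654',
                ],
            },
        },
    ],
}

response = session.post(f'{API_URL}/jobs', json=jobs)
response.raise_for_status()
print(response.json())  # your job list, with additional attributes filled in
```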

## Querying jobs

You can `GET` job information from the `/jobs` endpoint. You may provide query
parameters to filter which jobs are returned:
![GET /jobs query](imgs/get_jobs_query.png)

For the examples above, you can get the job that was submitted with Sentinel-2 COG IDs by
searching for `name=s2-cog-example`. If you provide *no* query parameters, you'll get a
JSON response with a jobs list for every job you've submitted.
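
As a sketch, here is that same query from the authenticated session above,
assuming the response wraps the list in a `jobs` key as shown below:

```python
response = session.get(f'{API_URL}/jobs', params={'name': 's2-cog-example'})
response.raise_for_status()
for job in response.json()['jobs']:
    print(job['job_id'], job['status_code'])
```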

Within the jobs list, a complete job dictionary will look like:
```json
{
  "browse_images": [
    "https://hyp3-autorift-contentbucket-102baltr3ibfm.s3.us-west-2.amazonaws.com/0c8d6dfc-a909-43b7-ae80-b1ee6acff9e7/S1BA_20170112T090955_20170118T091036_HHP007_VEL240_A_2CB6.png"
  ],
  "expiration_time": "2021-04-27T00:00:00+00:00",
  "files": [
    {
      "filename": "S1BA_20170112T090955_20170118T091036_HHP007_VEL240_A_2CB6.nc",
      "size": 6574604,
      "url": "https://hyp3-autorift-contentbucket-102baltr3ibfm.s3.us-west-2.amazonaws.com/0c8d6dfc-a909-43b7-ae80-b1ee6acff9e7/S1BA_20170112T090955_20170118T091036_HHP007_VEL240_A_2CB6.nc"
    }
  ],
  "job_id": "0c8d6dfc-a909-43b7-ae80-b1ee6acff9e7",
  "job_parameters": {
    "granules": [
      "S1A_IW_SLC__1SSH_20170118T091036_20170118T091104_014884_01846D_01C5",
      "S1B_IW_SLC__1SSH_20170112T090955_20170112T091023_003813_0068DC_C750"
    ]
  },
  "job_type": "AUTORIFT",
  "name": "GIS-random-200-A526",
  "request_time": "2020-10-28T00:55:35+00:00",
  "status_code": "SUCCEEDED",
  "thumbnail_images": [
    "https://hyp3-autorift-contentbucket-102baltr3ibfm.s3.us-west-2.amazonaws.com/0c8d6dfc-a909-43b7-ae80-b1ee6acff9e7/S1BA_20170112T090955_20170118T091036_HHP007_VEL240_A_2CB6_thumb.png"
  ],
  "user_id": "MY_EDL_USERNAME"
}
```

Importantly, the `files` block provides download links for the product files.
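
As a final sketch, each product file can be fetched directly; the URLs are
pre-signed, so no additional authentication should be needed. Here `job` is one
dictionary from the jobs list above:

```python
from pathlib import Path

for product in job['files']:
    download = session.get(product['url'])
    download.raise_for_status()
    # Write the product to the current directory under its original name
    Path(product['filename']).write_bytes(download.content)
    print(f"saved {product['filename']} ({product['size']} bytes expected)")
```
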
Binary file added docs/imgs/get_jobs_query.png
Binary file added docs/imgs/get_user_execute.png
Binary file added docs/imgs/get_user_try.png
Binary file added docs/imgs/post_jobs_execute.png
Binary file added docs/imgs/vertex-sign-in.png