diff --git a/CHANGELOG.md b/CHANGELOG.md index bf38a8ea..9c60d4c4 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -7,6 +7,10 @@ and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/) and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+## [0.17.0]
+### Changed
+* In preparation for a major update, the Sentinel-1 processing workflow has been isolated to a new `hyp3_autorift.s1_isce2` module.
+
 ## [0.16.0]
 ### Fixed
 * `hyp3_autorift` will no longer attempt to crop files with no valid data
diff --git a/README.md b/README.md index 4f4b26c5..faf8db15 100644 --- a/README.md +++ b/README.md @@ -1,3 +1,185 @@
-# HyP3 autoRIFT
-A HyP3 plugin for feature tracking processing with AutoRIFT-ISCE
+# HyP3 autoRIFT Plugin
+
+[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.4037015.svg)](https://doi.org/10.5281/zenodo.4037015)
+
+The HyP3-autoRIFT plugin provides a set of workflows for feature tracking processing with the [autonomous Repeat Image Feature Tracking (autoRIFT)](https://github.com/nasa-jpl/autoRIFT) software package. This plugin is part of the [Alaska Satellite Facility's](https://asf.alaska.edu) larger HyP3 (Hybrid Pluggable Processing Pipeline) system, a batch processing pipeline designed for on-demand processing of remote sensing data. For more information on HyP3, see the [Background](#background) section.
+
+## Installation
+
+1. Ensure that conda is installed on your system (we recommend using [mambaforge](https://github.com/conda-forge/miniforge#mambaforge) to reduce setup times).
+2. Clone the `hyp3-autorift` repository and navigate to the root directory of this project
+   ```bash
+   git clone https://github.com/ASFHyP3/hyp3-autorift.git
+   cd hyp3-autorift
+   ```
+3. Create and activate your Python environment
+   ```bash
+
+   mamba env create -f environment.yml
+   mamba activate hyp3-autorift
+   ```
+4. Finally, install a development version of HyP3 autoRIFT
+   ```bash
+   python -m pip install -e .
+   ```
+
+## Usage
+
+The HyP3-autoRIFT plugin provides workflows (accessible directly in Python or via a CLI) that can be used to process SAR or optical data with autoRIFT. HyP3-autoRIFT can process data from these satellite missions:
+* SAR:
+  * Sentinel-1
+* Optical:
+  * Sentinel-2
+  * Landsat 4, 5, 7, 8, 9
+
+To see all available workflows, run:
+```
+python -m hyp3_autorift ++help
+```
+
+### `hyp3_autorift` workflow
+
+The `hyp3_autorift` workflow performs dense feature tracking between two images using autoRIFT. You can run this workflow by selecting the `hyp3_autorift` process:
+```
+python -m hyp3_autorift ++process hyp3_autorift [WORKFLOW_ARGS]
+```
+or by using the `hyp3_autorift` console script:
+```
+hyp3_autorift [WORKFLOW_ARGS]
+```
+For example:
+
+```
+hyp3_autorift \
+  "S2B_MSIL1C_20200612T150759_N0209_R025_T22WEB_20200612T184700" \
+  "S2A_MSIL1C_20200627T150921_N0209_R025_T22WEB_20200627T170912"
+```
+
+This command will run autoRIFT for a pair of Sentinel-2 images.
+
+> [!IMPORTANT]
+> Credentials are necessary to access Landsat and Sentinel-1 data. See the [Credentials](#credentials) section for more information.
+
+For all options available to this workflow, see the help documentation:
+```
+hyp3_autorift --help
+```
+
+### Credentials
+
+Depending on the mission being processed, some workflows will need you to provide credentials. Generally, credentials are provided via environment variables, but some may be provided as command-line arguments or via a `.netrc` file.
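+
+For example, processing a Sentinel-1 pair requires the Earthdata Login and CDSE credentials described below to be set in your environment before invoking the workflow; a minimal sketch using example Sentinel-1 granules:
+```
+export EARTHDATA_USERNAME=your-edl-username
+export EARTHDATA_PASSWORD=your-edl-password
+export ESA_USERNAME=your-esa-username
+export ESA_PASSWORD=your-esa-password
+hyp3_autorift \
+  "S1A_IW_SLC__1SSH_20170221T204710_20170221T204737_015387_0193F6_AB07" \
+  "S1B_IW_SLC__1SSH_20170227T204628_20170227T204655_004491_007D11_6654"
+```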
+
+#### AWS Credentials
+
+To process Landsat images, you must provide AWS credentials because the data is hosted by USGS in a "requester pays" bucket. To provide AWS credentials, you can either use an AWS profile specified in your `~/.aws/credentials` by exporting:
+```
+export AWS_PROFILE=your-profile
+```
+or by exporting credential environment variables:
+```
+export AWS_ACCESS_KEY_ID=your-id
+export AWS_SECRET_ACCESS_KEY=your-key
+export AWS_SESSION_TOKEN=your-token  # optional; for when using temporary credentials
+```
+
+For more information, please see the AWS documentation on providing credentials via environment variables or shared configuration files.
+
+#### NASA Earthdata Login and ESA Copernicus Data Space Ecosystem (CDSE)
+
+To process Sentinel-1 images, you must provide Earthdata Login credentials and ESA Copernicus Data Space Ecosystem (CDSE) credentials in order to download input data.
+* If you do not already have an Earthdata account, you can sign up [here](https://urs.earthdata.nasa.gov/home).
+* If you do not already have a CDSE account, you can sign up [here](https://dataspace.copernicus.eu).
+
+For Earthdata Login and CDSE, you can provide credentials by exporting environment variables:
+```
+export EARTHDATA_USERNAME=your-edl-username
+export EARTHDATA_PASSWORD=your-edl-password
+export ESA_USERNAME=your-esa-username
+export ESA_PASSWORD=your-esa-password
+```
+or via your [`~/.netrc` file](https://everything.curl.dev/usingcurl/netrc), which should contain lines like these two:
+```
+machine urs.earthdata.nasa.gov login your-edl-username password your-edl-password
+machine dataspace.copernicus.eu login your-esa-username password your-esa-password
+```
+
+> [!TIP]
+> Your `~/.netrc` file should only be readable by your user; otherwise, you'll receive a "net access too permissive" error. To fix, run:
+> ```
+> chmod 0600 ~/.netrc
+> ```
+
+### Docker Container
+
+The ultimate goal of this project is to create a Docker container that can run autoRIFT workflows within a HyP3 deployment. To run the current version of the project's container, use this command:
+```
+docker run -it --rm \
+    -e AWS_ACCESS_KEY_ID=[YOUR_KEY] \
+    -e AWS_SECRET_ACCESS_KEY=[YOUR_SECRET] \
+    -e EARTHDATA_USERNAME=[YOUR_USERNAME_HERE] \
+    -e EARTHDATA_PASSWORD=[YOUR_PASSWORD_HERE] \
+    -e ESA_USERNAME=[YOUR_USERNAME_HERE] \
+    -e ESA_PASSWORD=[YOUR_PASSWORD_HERE] \
+    ghcr.io/asfhyp3/hyp3-autorift:latest \
+    ++process hyp3_autorift \
+    [WORKFLOW_ARGS]
+```
+
+> [!TIP]
+> You can use [`docker run --env-file`](https://docs.docker.com/reference/cli/docker/container/run/#env) to capture all the necessary environment variables in a single file.
+
+#### Docker Outputs
+
+To retain hyp3_autorift output files when running via Docker, there are two recommended approaches:
+
+1. Use a volume mount
+
+   Add the `-w /tmp -v ${PWD}:/tmp` flags after `docker run`; `-w` changes the working directory inside the container to `/tmp` and `-v` mounts your current working directory to `/tmp` inside the container so that hyp3_autorift outputs are preserved locally. You can replace `${PWD}` with any valid path.
+
+2. Copy outputs to a remote AWS S3 Bucket
+
+   Append the `--bucket` and `--bucket-prefix` options to `[WORKFLOW_ARGS]` so that the final output files are uploaded to AWS S3. This also requires that AWS credentials with permission to write to the bucket are available to the running container.
For example, to write outputs to a hypothetical bucket `s3://hypothetical-bucket/test-run/` (the `AWS_SESSION_TOKEN` line is only needed when using temporary credentials):
+
+   ```
+   docker run -it --rm \
+       -e AWS_ACCESS_KEY_ID=[YOUR_KEY] \
+       -e AWS_SECRET_ACCESS_KEY=[YOUR_SECRET] \
+       -e AWS_SESSION_TOKEN=[YOUR_TOKEN] \
+       -e EARTHDATA_USERNAME=[YOUR_USERNAME_HERE] \
+       -e EARTHDATA_PASSWORD=[YOUR_PASSWORD_HERE] \
+       -e ESA_USERNAME=[YOUR_USERNAME_HERE] \
+       -e ESA_PASSWORD=[YOUR_PASSWORD_HERE] \
+       ghcr.io/asfhyp3/hyp3-autorift:latest \
+       ++process hyp3_autorift \
+       [WORKFLOW_ARGS] \
+       --bucket "hypothetical-bucket" \
+       --bucket-prefix "test-run"
+   ```
+
+## Background
+HyP3 is broken into two components: the cloud architecture/API that manages the processing of HyP3 workflows, and Docker container plugins that contain scientific workflows that produce new science products from a variety of data sources (see the figure below for the full HyP3 architecture).
+
+![Cloud Architecture](images/arch_here.jpg)
+
+The cloud infrastructure-as-code for HyP3 can be found in the main [HyP3 repository](https://github.com/asfhyp3/hyp3), while this repository contains a plugin that can be used for feature tracking processing with autoRIFT.
+
+## License
+The HyP3-autoRIFT plugin is licensed under the BSD 3-Clause license. See the LICENSE file for more details.
+
+## Code of conduct
+We strive to create a welcoming and inclusive community for all contributors to HyP3-autoRIFT. As such, all contributors to this project are expected to adhere to our code of conduct.
+
+Please see `CODE_OF_CONDUCT.md` for the full code of conduct text.
+
+## Contributing
+Contributions to the HyP3-autoRIFT plugin are welcome! If you would like to contribute, please submit a pull request on the GitHub repository.
+
+## Contact Us
+Want to talk about HyP3-autoRIFT? We would love to hear from you!
+
+Found a bug? Want to request a feature?
+[open an issue](https://github.com/ASFHyP3/hyp3-autorift/issues/new)
+
+General questions? Suggestions? Or just want to talk to the team?
+[chat with us on gitter](https://gitter.im/ASFHyP3/community)
diff --git a/docs/api_example.md b/docs/api_example.md deleted file mode 100644 index 7d479685..00000000 --- a/docs/api_example.md +++ /dev/null @@ -1,162 +0,0 @@
-# Using the HyP3 API for autoRIFT
-
-AutoRIFT's HyP3 API is built on [OpenAPI](https://www.openapis.org/) and
-[Swagger](https://swagger.io/) and available at:
-
-https://hyp3-autorift.asf.alaska.edu/ui
-
-In order to use the API, you'll need a `asf-urs` session cookie, which you can get
-by [signing in to Vertex](https://search.asf.alaska.edu/#/)
-
-![vetex sign in](imgs/vertex-sign-in.png)
-
-### Confirm you are authenticated
-
-To confirm you are authenticated, you can run a `GET` request to our `/user` endpoint.
-Select the blue `GET` button next to `/user` and click the `Try it out` button
-![GET /user try](imgs/get_user_try.png)
-
-Then, execute the request and look at the response
-![GET /user execute](imgs/get_user_execute.png)
-
-If you get a `Code 200` you should see a JSON dictionary of your user information.
-If you get a `Code 401` you are not currently authenticated.
-
-## Submitting jobs
-
-Jobs are submitted through the API by providing a JSON payload with a list of job
-definitions.
A minimal job list for a single Sentinel-1 autoRIFT job would look like: - -```json -{ - "jobs": [ - { - "job_type": "AUTORIFT", - "name": "s1-example", - "job_parameters": { - "granules": [ - "S1A_IW_SLC__1SSH_20170221T204710_20170221T204737_015387_0193F6_AB07", - "S1B_IW_SLC__1SSH_20170227T204628_20170227T204655_004491_007D11_6654" - ] - } - } - ] -} -``` - -The job list may contain up to 200 job definitions. - -### Sentinel-1, Sentinel-2, and Landsat-8 - -For each supported satellite mission, the granule (scene) pairs to process are -provided by ID: -* Sentinel-1: [ESA granule ID](https://sentinel.esa.int/web/sentinel/user-guides/sentinel-1-sar/naming-conventions) -* Sentinel-2: [ESA granule ID](https://sentinel.esa.int/web/sentinel/user-guides/sentinel-2-msi/naming-convention) - *or* [Element 84 Earth Search ID](https://registry.opendata.aws/sentinel-2/) -* Landsat-8 Collection 2: [USGS scene ID](https://www.usgs.gov/faqs/what-naming-convention-landsat-collection-2-level-1-and-level-2-scenes?qt-news_science_products=0#qt-news_science_products) - -To submit an example set of jobs including all supported missions, you could write a job list like: - -```json -{ - "jobs": [ - { - "name": "s1-example", - "job_parameters": { - "granules": [ - "S1A_IW_SLC__1SSH_20170221T204710_20170221T204737_015387_0193F6_AB07", - "S1B_IW_SLC__1SSH_20170227T204628_20170227T204655_004491_007D11_6654" - ] - }, - "job_type": "AUTORIFT" - }, - { - "name": "s2-esa-example", - "job_parameters": { - "granules": [ - "S2B_MSIL1C_20200612T150759_N0209_R025_T22WEB_20200612T184700", - "S2A_MSIL1C_20200627T150921_N0209_R025_T22WEB_20200627T170912" - ] - }, - "job_type": "AUTORIFT" - }, - { - "name": "s2-cog-example", - "job_parameters": { - "granules": [ - "S2B_22WEB_20200612_0_L1C", - "S2A_22WEB_20200627_0_L1C" - ] - }, - "job_type": "AUTORIFT" - } - { - "name": "l8-example", - "job_parameters": { - "granules": [ - "LC08_L1TP_009011_20200703_20200913_02_T1", - "LC08_L1TP_009011_20200820_20200905_02_T1" - ] - }, - "job_type": "AUTORIFT" - } - ] -} -``` - -With your JSON jobs definition, you can `POST` to the `/jobs` endpoint to -submit the jobs. - -1. click the green `POST` button next to `/jobs` -2. click `Try it out` on the right -3. paste your jobs definition into the `Request body` -4. click `execute` - -![POST /jobs execute](imgs/post_jobs_execute.png) - -If your jobs were submitted successfully you should see a `Code 200` response and -JSON response of your job list, with some additional job attributes filled in. - -## Querying jobs - -You can `GET` job information from the `/jobs` endpoint. You may provide query -parameters to filter jobs which jobs are returned: -![GET /jobs query](imgs/get_jobs_query.png) - -For our above examples, you can get the job that was submitted with Sentinel-2 COG IDs by -searching for `name=s2-cog-example`. If you provide *no* query parameters, you'll get a -JSON response with a jobs list for every job you've submitted. 
- -Within the jobs list, a complete job dictionary will look like: -```JSON -{ - "browse_images": [ - "https://hyp3-autorift-contentbucket-102baltr3ibfm.s3.us-west-2.amazonaws.com/0c8d6dfc-a909-43b7-ae80-b1ee6acff9e7/S1BA_20170112T090955_20170118T091036_HHP007_VEL240_A_2CB6.png" - ], - "expiration_time": "2021-04-27T00:00:00+00:00", - "files": [ - { - "filename": "S1BA_20170112T090955_20170118T091036_HHP007_VEL240_A_2CB6.nc", - "size": 6574604, - "url": "https://hyp3-autorift-contentbucket-102baltr3ibfm.s3.us-west-2.amazonaws.com/0c8d6dfc-a909-43b7-ae80-b1ee6acff9e7/S1BA_20170112T090955_20170118T091036_HHP007_VEL240_A_2CB6.nc" - } - ], - "job_id": "0c8d6dfc-a909-43b7-ae80-b1ee6acff9e7", - "job_parameters": { - "granules": [ - "S1A_IW_SLC__1SSH_20170118T091036_20170118T091104_014884_01846D_01C5", - "S1B_IW_SLC__1SSH_20170112T090955_20170112T091023_003813_0068DC_C750" - ] - }, - "job_type": "AUTORIFT", - "name": "GIS-random-200-A526", - "request_time": "2020-10-28T00:55:35+00:00", - "status_code": "SUCCEEDED", - "thumbnail_images": [ - "https://hyp3-autorift-contentbucket-102baltr3ibfm.s3.us-west-2.amazonaws.com/0c8d6dfc-a909-43b7-ae80-b1ee6acff9e7/S1BA_20170112T090955_20170118T091036_HHP007_VEL240_A_2CB6_thumb.png" - ], - "user_id": "MY_EDL_USERNAME" -} -``` - -Importantly, the `files` block provides download links for the product files. \ No newline at end of file diff --git a/docs/imgs/get_jobs_query.png b/docs/imgs/get_jobs_query.png deleted file mode 100644 index f20dec4c..00000000 Binary files a/docs/imgs/get_jobs_query.png and /dev/null differ diff --git a/docs/imgs/get_user_execute.png b/docs/imgs/get_user_execute.png deleted file mode 100644 index ba985b87..00000000 Binary files a/docs/imgs/get_user_execute.png and /dev/null differ diff --git a/docs/imgs/get_user_try.png b/docs/imgs/get_user_try.png deleted file mode 100644 index 7f60131e..00000000 Binary files a/docs/imgs/get_user_try.png and /dev/null differ diff --git a/docs/imgs/post_jobs_execute.png b/docs/imgs/post_jobs_execute.png deleted file mode 100644 index 1f020266..00000000 Binary files a/docs/imgs/post_jobs_execute.png and /dev/null differ diff --git a/docs/imgs/vertex-sign-in.png b/docs/imgs/vertex-sign-in.png deleted file mode 100644 index bab1b61e..00000000 Binary files a/docs/imgs/vertex-sign-in.png and /dev/null differ diff --git a/docs/sdk_example.ipynb b/docs/sdk_example.ipynb deleted file mode 100644 index 8146e7fd..00000000 --- a/docs/sdk_example.ipynb +++ /dev/null @@ -1,319 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": { - "collapsed": true - }, - "source": [ - "# Using the Python SDK for autoRIFT\n", - "\n", - "HyP3's Python SDK `hyp3_sdk` provides a convenience wrapper around the HyP3 API and HyP3 jobs.\n", - "\n", - "\n", - "The HyP3 SDK can be installed using [Anaconda/Miniconda](https://docs.conda.io/projects/conda/en/latest/user-guide/install/download.html#anaconda-or-miniconda)\n", - " (recommended) via [`conda`](https://anaconda.org/conda-forge/hyp3_sdk):\n", - "\n", - "```\n", - "conda install -c conda-forge hyp3_sdk\n", - "```\n", - "\n", - "Or using [`pip`](https://pypi.org/project/hyp3-sdk/):\n", - "\n", - "```\n", - "python -m pip install hyp3_sdk\n", - "```\n", - "\n", - "Full documentation of the SDK can be found in the [HyP3 documentation](https://hyp3-docs.asf.alaska.edu/using/sdk/)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# initial setup\n", - "import hyp3_sdk as sdk\n", - "\n", - "AUTORIFT_API 
= 'https://hyp3-autorift.asf.alaska.edu/'" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Authenticating to the API\n", - "\n", - "The SDK will attempt to pull your [NASA Earthdata Login](https://urs.earthdata.nasa.gov/) credentials out of `~/.netrc`\n", - "by default, or you can pass your credentials in directly" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# .netrc\n", - "hyp3 = sdk.HyP3(AUTORIFT_API)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# or enter your credentials\n", - "from getpass import getpass\n", - "username = 'MY_EDL_USERNAME'\n", - "password = getpass() # will prompt for a password\n", - "hyp3 = sdk.HyP3(AUTORIFT_API, username=username, password=password)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Submitting jobs\n", - "\n", - "AutoRIFT jobs can be submitted using the `hyp3.submit_autorift_job()` method.\n", - "\n", - "### Sentinel-1\n", - "\n", - "Sentinel-1 jobs are submitted using the [ESA granule ID](https://sentinel.esa.int/web/sentinel/user-guides/sentinel-1-sar/naming-conventions)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "s1_pairs = [\n", - " ('S1A_IW_SLC__1SSH_20170221T204710_20170221T204737_015387_0193F6_AB07',\n", - " 'S1B_IW_SLC__1SSH_20170227T204628_20170227T204655_004491_007D11_6654'),\n", - " ('S1B_IW_SLC__1SDH_20180821T204618_20180821T204645_012366_016CC2_59A7',\n", - " 'S1B_IW_SLC__1SDH_20180809T204617_20180809T204644_012191_01674F_9345'),\n", - " ('S1A_IW_SLC__1SSH_20151214T080202_20151214T080229_009035_00CF68_780C',\n", - " 'S1A_IW_SLC__1SSH_20151120T080202_20151120T080229_008685_00C5A7_105E'),\n", - " ('S1A_IW_SLC__1SSH_20150909T162413_20150909T162443_007640_00A97D_922B',\n", - " 'S1A_IW_SLC__1SSH_20150828T162412_20150828T162431_007465_00A4AF_DC3E'),\n", - "]\n", - "\n", - "s1_jobs = sdk.Batch()\n", - "for g1, g2 in s1_pairs:\n", - " s1_jobs += hyp3.submit_autorift_job(g1, g2, name='s1-example')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Here we've given each job the name `s1-example`, which we can use later to search for these jobs.\n", - "\n", - "### Sentinel-2\n", - "\n", - "Seninel-2 jobs can be submitted using either the [ESA granule ID](https://sentinel.esa.int/web/sentinel/user-guides/sentinel-2-msi/naming-convention)\n", - "or the [Element 84 Earth Search ID](https://registry.opendata.aws/sentinel-2/)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "s2_pairs = [\n", - " # Can be either ESA granule IDs\n", - " ('S2B_MSIL1C_20200612T150759_N0209_R025_T22WEB_20200612T184700',\n", - " 'S2A_MSIL1C_20200627T150921_N0209_R025_T22WEB_20200627T170912'),\n", - " # or Element 84 Earth Search ID\n", - " ('S2B_22WEB_20200612_0_L1C', 'S2A_22WEB_20200627_0_L1C'),\n", - "]\n", - "\n", - "s2_jobs = sdk.Batch()\n", - "for g1, g2 in s2_pairs:\n", - " s2_jobs += hyp3.submit_autorift_job(g1, g2, name='s2-example')\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Landsat-8 Collection 2\n", - "\n", - "Landsat-8 Collection 2 jobs are submitted using the [USGS scene ID](https://www.usgs.gov/faqs/what-naming-convention-landsat-collection-2-level-1-and-level-2-scenes?qt-news_science_products=0#qt-news_science_products)" - ] - }, - { - "cell_type": "code", - 
"execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "l8_pairs = [\n", - " ('LC08_L1TP_009011_20200703_20200913_02_T1',\n", - " 'LC08_L1TP_009011_20200820_20200905_02_T1'),\n", - "]\n", - "\n", - "l8_jobs = sdk.Batch()\n", - "for g1, g2 in l8_pairs:\n", - " l8_jobs += hyp3.submit_autorift_job(g1, g2, name='l8-example')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Monitoring jobs\n", - "\n", - "One jobs are submitted, you can either watch the jobs until they finish" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "s1_jobs = hyp3.watch(s1_jobs)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "which will require you to keep the cell/terminal running, or you can come back later and search for jobs" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "s1_jobs = hyp3.find_jobs(name='s1-example')\n", - "s1_jobs = hyp3.watch(s1_jobs)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Downloading files\n", - "\n", - "Batches are collections of jobs. They provide a snapshot of the job status when the job was created or last\n", - "refreshed. To get updated information on a batch" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "s1_jobs = hyp3.refresh(s1_jobs)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "`hyp3.watch()` will return a refreshed batch once the batch has completed.\n", - "\n", - "Batches can be added together" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "print(f'Number of Jobs:\\n S1:{len(s1_jobs)}\\n S2:{len(s2_jobs)}\\n L8:{len(l8_jobs)}')\n", - "all_jobs = s1_jobs + s2_jobs + l8_jobs\n", - "print(f'Total number of Jobs: {len(all_jobs)}')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You can check if every job is complete and if every job was successful" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "all_jobs.complete()\n", - "all_jobs.succeeded()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "and filter jobs by status" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "succeeded_jobs = all_jobs.filter_jobs(succeeded=True, running=False, failed=False)\n", - "failed_jobs = all_jobs.filter_jobs(succeeded=False, running=False, failed=True)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You can download the files for all successful jobs" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "file_list = succeeded_jobs.download_files('./')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "*Note: only succeeded jobs will have files to download.*" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.8.6" - } - }, - "nbformat": 4, - "nbformat_minor": 1 -} \ No newline at end of 
file diff --git a/src/hyp3_autorift/geometry.py b/src/hyp3_autorift/geometry.py index ceed000f..27e29a0f 100644 --- a/src/hyp3_autorift/geometry.py +++ b/src/hyp3_autorift/geometry.py @@ -1,93 +1,14 @@ """Geometry routines for working Geogrid""" import logging -import os from typing import Tuple -import isce # noqa: F401 -import isceobj -import numpy as np -from contrib.demUtils import createDemStitcher -from contrib.geo_autoRIFT.geogrid import Geogrid -from isceobj.Orbit.Orbit import Orbit -from isceobj.Sensor.TOPS.Sentinel1 import Sentinel1 -from osgeo import gdal from osgeo import ogr from osgeo import osr log = logging.getLogger(__name__) -def bounding_box(safe, priority='reference', polarization='hh', orbits='Orbits', epsg=4326): - """Determine the geometric bounding box of a Sentinel-1 image - - :param safe: Path to the Sentinel-1 SAFE zip archive - :param priority: Image priority, either 'reference' (default) or 'secondary' - :param polarization: Image polarization (default: 'hh') - :param orbits: Path to the orbital files (default: './Orbits') - :param epsg: Projection EPSG code (default: 4326) - - :return: lat_limits (list), lon_limits (list) - lat_limits: list containing the [minimum, maximum] latitudes - lat_limits: list containing the [minimum, maximum] longitudes - """ - frames = [] - for swath in range(1, 4): - rdr = Sentinel1() - rdr.configure() - rdr.safe = [os.path.abspath(safe)] - rdr.output = priority - rdr.orbitDir = os.path.abspath(orbits) - rdr.auxDir = os.path.abspath(orbits) - rdr.swathNumber = swath - rdr.polarization = polarization - rdr.parse() - frames.append(rdr.product) - - first_burst = frames[0].bursts[0] - sensing_start = min([x.sensingStart for x in frames]) - sensing_stop = max([x.sensingStop for x in frames]) - starting_range = min([x.startingRange for x in frames]) - far_range = max([x.farRange for x in frames]) - range_pixel_size = first_burst.rangePixelSize - prf = 1.0 / first_burst.azimuthTimeInterval - - orb = Orbit() - orb.configure() - - for state_vector in first_burst.orbit: - orb.addStateVector(state_vector) - - for frame in frames: - for burst in frame.bursts: - for state_vector in burst.orbit: - if state_vector.time < orb.minTime or state_vector.time > orb.maxTime: - orb.addStateVector(state_vector) - - obj = Geogrid() - obj.configure() - - obj.startingRange = starting_range - obj.rangePixelSize = range_pixel_size - obj.sensingStart = sensing_start - obj.prf = prf - obj.lookSide = -1 - obj.numberOfLines = int(np.round((sensing_stop - sensing_start).total_seconds() * prf)) - obj.numberOfSamples = int(np.round((far_range - starting_range)/range_pixel_size)) - obj.orbit = orb - obj.epsg = epsg - - obj.determineBbox() - - lat_limits = obj._xlim - lon_limits = obj._ylim - - log.info(f'Latitude limits [min, max]: {lat_limits}') - log.info(f'Longitude limits [min, max]: {lon_limits}') - - return lat_limits, lon_limits - - def polygon_from_bbox(x_limits: Tuple[float, float], y_limits: Tuple[float, float], epsg_code: int = 4326) -> ogr.Geometry: ring = ogr.Geometry(ogr.wkbLinearRing) @@ -143,43 +64,3 @@ def fix(n): fixed = ogr.Geometry(ogr.wkbPoint) fixed.AddPoint_2D(fix(point.GetX()), fix(point.GetY())) return fixed - - -def prep_isce_dem(input_dem, lat_limits, lon_limits, isce_dem=None): - if isce_dem is None: - seamstress = createDemStitcher() - isce_dem = seamstress.defaultName([*lat_limits, *lon_limits]) - - isce_dem = os.path.abspath(isce_dem + '.wgs84') - log.info(f'ISCE dem is: {isce_dem}') - - in_ds = gdal.OpenShared(input_dem, 
gdal.GA_ReadOnly) - warp_options = gdal.WarpOptions( - format='ENVI', outputType=gdal.GDT_Int16, resampleAlg='cubic', - xRes=0.001, yRes=0.001, dstSRS='EPSG:4326', dstNodata=0, - outputBounds=[lon_limits[0], lat_limits[0], lon_limits[1], lat_limits[1]] - ) - gdal.Warp(isce_dem, in_ds, options=warp_options) - - del in_ds - - isce_ds = gdal.Open(isce_dem, gdal.GA_ReadOnly) - isce_trans = isce_ds.GetGeoTransform() - - img = isceobj.createDemImage() - img.width = isce_ds.RasterXSize - img.length = isce_ds.RasterYSize - img.bands = 1 - img.dataType = 'SHORT' - img.scheme = 'BIL' - img.setAccessMode('READ') - img.filename = isce_dem - - img.firstLongitude = isce_trans[0] + 0.5 * isce_trans[1] - img.deltaLongitude = isce_trans[1] - - img.firstLatitude = isce_trans[3] + 0.5 * isce_trans[5] - img.deltaLatitude = isce_trans[5] - img.renderHdr() - - return isce_dem diff --git a/src/hyp3_autorift/process.py b/src/hyp3_autorift/process.py index ab41bad3..38f5d9f8 100644 --- a/src/hyp3_autorift/process.py +++ b/src/hyp3_autorift/process.py @@ -17,10 +17,7 @@ import numpy as np import requests from hyp3lib.aws import upload_file_to_s3 -from hyp3lib.fetch import download_file -from hyp3lib.get_orb import downloadSentinelOrbitFile from hyp3lib.image import create_thumbnail -from hyp3lib.scene import get_download_url from netCDF4 import Dataset from osgeo import gdal @@ -205,15 +202,6 @@ def get_platform(scene: str) -> str: raise NotImplementedError(f'autoRIFT processing not available for this platform. {scene}') -def get_s1_primary_polarization(granule_name): - polarization = granule_name[14:16] - if polarization in ['SV', 'DV']: - return 'vv' - if polarization in ['SH', 'DH']: - return 'hh' - raise ValueError(f'Cannot determine co-polarization of granule {granule_name}') - - def create_filtered_filepath(path: str) -> str: parent = (Path.cwd() / 'filtered').resolve() parent.mkdir(exist_ok=True) @@ -349,8 +337,6 @@ def process( secondary: str, parameter_file: str = DEFAULT_PARAMETER_FILE, naming_scheme: Literal['ITS_LIVE_OD', 'ITS_LIVE_PROD'] = 'ITS_LIVE_OD', - esa_username: Optional[str] = None, - esa_password: Optional[str] = None, ) -> Tuple[Path, Path, Path]: """Process a Sentinel-1, Sentinel-2, or Landsat-8 image pair @@ -363,43 +349,20 @@ def process( Returns: the autoRIFT product file, browse image, and thumbnail image """ - orbits = None - polarization = None reference_path = None secondary_path = None reference_metadata = None secondary_metadata = None reference_zero_path = None secondary_zero_path = None - reference_state_vec = None - secondary_state_vec = None - lat_limits, lon_limits = None, None platform = get_platform(reference) - if platform == 'S1': - for scene in [reference, secondary]: - scene_url = get_download_url(scene) - download_file(scene_url, chunk_size=5242880) - - orbits = Path('Orbits').resolve() - orbits.mkdir(parents=True, exist_ok=True) - - if (esa_username is None) or (esa_password is None): - esa_username, esa_password = utils.get_esa_credentials() - reference_state_vec, reference_provider = downloadSentinelOrbitFile( - reference, directory=str(orbits), esa_credentials=(esa_username, esa_password) - ) - log.info(f'Downloaded orbit file {reference_state_vec} from {reference_provider}') - secondary_state_vec, secondary_provider = downloadSentinelOrbitFile( - secondary, directory=str(orbits), esa_credentials=(esa_username, esa_password) - ) - log.info(f'Downloaded orbit file {secondary_state_vec} from {secondary_provider}') - - polarization = 
get_s1_primary_polarization(reference) - lat_limits, lon_limits = geometry.bounding_box(f'{reference}.zip', polarization=polarization, orbits=orbits) + if platform == 'S1': + from hyp3_autorift.s1_isce2 import process_sentinel1_with_isce2 + netcdf_file = process_sentinel1_with_isce2(reference, secondary, parameter_file) - elif platform == 'S2': + else: # Set config and env for new CXX threads in Geogrid/autoRIFT gdal.SetConfigOption('GDAL_DISABLE_READDIR_ON_OPEN', 'EMPTY_DIR') os.environ['GDAL_DISABLE_READDIR_ON_OPEN'] = 'EMPTY_DIR' @@ -407,94 +370,51 @@ def process( gdal.SetConfigOption('AWS_REGION', 'us-west-2') os.environ['AWS_REGION'] = 'us-west-2' - reference_metadata = get_s2_metadata(reference) - secondary_metadata = get_s2_metadata(secondary) - reference_path = reference_metadata['path'] - secondary_path = secondary_metadata['path'] - bbox = reference_metadata['bbox'] - lat_limits = (bbox[1], bbox[3]) - lon_limits = (bbox[0], bbox[2]) - - elif 'L' in platform: - # Set config and env for new CXX threads in Geogrid/autoRIFT - gdal.SetConfigOption('GDAL_DISABLE_READDIR_ON_OPEN', 'EMPTY_DIR') - os.environ['GDAL_DISABLE_READDIR_ON_OPEN'] = 'EMPTY_DIR' + if platform == 'S2': + reference_metadata = get_s2_metadata(reference) + reference_path = reference_metadata['path'] - gdal.SetConfigOption('AWS_REGION', 'us-west-2') - os.environ['AWS_REGION'] = 'us-west-2' + secondary_metadata = get_s2_metadata(secondary) + secondary_path = secondary_metadata['path'] - gdal.SetConfigOption('AWS_REQUEST_PAYER', 'requester') - os.environ['AWS_REQUEST_PAYER'] = 'requester' + elif 'L' in platform: + # Set config and env for new CXX threads in Geogrid/autoRIFT + gdal.SetConfigOption('AWS_REQUEST_PAYER', 'requester') + os.environ['AWS_REQUEST_PAYER'] = 'requester' - reference_metadata = get_lc2_metadata(reference) - reference_path = get_lc2_path(reference_metadata) + reference_metadata = get_lc2_metadata(reference) + reference_path = get_lc2_path(reference_metadata) - secondary_metadata = get_lc2_metadata(secondary) - secondary_path = get_lc2_path(secondary_metadata) + secondary_metadata = get_lc2_metadata(secondary) + secondary_path = get_lc2_path(secondary_metadata) - filter_platform = min([platform, get_platform(secondary)]) - if filter_platform in ('L4', 'L5', 'L7'): - # Log path here before we transform it - log.info(f'Reference scene path: {reference_path}') - log.info(f'Secondary scene path: {secondary_path}') - reference_path, reference_zero_path, secondary_path, secondary_zero_path = \ - apply_landsat_filtering(reference_path, secondary_path) + filter_platform = min([platform, get_platform(secondary)]) + if filter_platform in ('L4', 'L5', 'L7'): + # Log path here before we transform it + log.info(f'Reference scene path: {reference_path}') + log.info(f'Secondary scene path: {secondary_path}') + reference_path, reference_zero_path, secondary_path, secondary_zero_path = \ + apply_landsat_filtering(reference_path, secondary_path) - if reference_metadata['properties']['proj:epsg'] != secondary_metadata['properties']['proj:epsg']: - log.info('Reference and secondary projections are different! Reprojecting.') + if reference_metadata['properties']['proj:epsg'] != secondary_metadata['properties']['proj:epsg']: + log.info('Reference and secondary projections are different! 
Reprojecting.') - # Reproject zero masks if necessary - if reference_zero_path and secondary_zero_path: - _, _ = utils.ensure_same_projection(reference_zero_path, secondary_zero_path) + # Reproject zero masks if necessary + if reference_zero_path and secondary_zero_path: + _, _ = utils.ensure_same_projection(reference_zero_path, secondary_zero_path) - reference_path, secondary_path = utils.ensure_same_projection(reference_path, secondary_path) + reference_path, secondary_path = utils.ensure_same_projection(reference_path, secondary_path) bbox = reference_metadata['bbox'] lat_limits = (bbox[1], bbox[3]) lon_limits = (bbox[0], bbox[2]) - log.info(f'Reference scene path: {reference_path}') - log.info(f'Secondary scene path: {secondary_path}') - - scene_poly = geometry.polygon_from_bbox(x_limits=lat_limits, y_limits=lon_limits) - parameter_info = utils.find_jpl_parameter_info(scene_poly, parameter_file) - - if platform == 'S1': - isce_dem = geometry.prep_isce_dem(parameter_info['geogrid']['dem'], lat_limits, lon_limits) + log.info(f'Reference scene path: {reference_path}') + log.info(f'Secondary scene path: {secondary_path}') - utils.format_tops_xml(reference, secondary, polarization, isce_dem, orbits) + scene_poly = geometry.polygon_from_bbox(x_limits=lat_limits, y_limits=lon_limits) + parameter_info = utils.find_jpl_parameter_info(scene_poly, parameter_file) - import isce # noqa - from topsApp import TopsInSAR - insar = TopsInSAR(name='topsApp', cmdline=['topsApp.xml', '--end=mergebursts']) - insar.configure() - insar.run() - - reference_path = os.path.join(os.getcwd(), 'merged', 'reference.slc.full') - secondary_path = os.path.join(os.getcwd(), 'merged', 'secondary.slc.full') - - for slc in [reference_path, secondary_path]: - gdal.Translate(slc, f'{slc}.vrt', format='ENVI') - - from hyp3_autorift.vend.testGeogrid_ISCE import (loadMetadata, - runGeogrid) - meta_r = loadMetadata('fine_coreg') - meta_s = loadMetadata('secondary') - geogrid_info = runGeogrid(meta_r, meta_s, epsg=parameter_info['epsg'], **parameter_info['geogrid']) - - # NOTE: After Geogrid is run, all drivers are no longer registered. - # I've got no idea why, or if there are other affects... - gdal.AllRegister() - - from hyp3_autorift.vend.testautoRIFT_ISCE import \ - generateAutoriftProduct - netcdf_file = generateAutoriftProduct( - reference_path, secondary_path, nc_sensor=platform, optical_flag=False, ncname=None, - geogrid_run_info=geogrid_info, **parameter_info['autorift'], - parameter_file=DEFAULT_PARAMETER_FILE.replace('/vsicurl/', ''), - ) - - else: from hyp3_autorift.vend.testGeogridOptical import ( coregisterLoadMetadata, runGeogrid) meta_r, meta_s = coregisterLoadMetadata( @@ -552,8 +472,6 @@ def main(): parser.add_argument('--publish-bucket', default='', help='Additionally, publish products to this bucket. Necessary credentials must be provided ' 'via the `PUBLISH_ACCESS_KEY_ID` and `PUBLISH_SECRET_ACCESS_KEY` environment variables.') - parser.add_argument('--esa-username', default=None, help="Username for ESA's Copernicus Data Space Ecosystem") - parser.add_argument('--esa-password', default=None, help="Password for ESA's Copernicus Data Space Ecosystem") parser.add_argument('--parameter-file', default=DEFAULT_PARAMETER_FILE, help='Shapefile for determining the correct search parameters by geographic location. 
' 'Path to shapefile must be understood by GDAL') diff --git a/src/hyp3_autorift/s1_correction.py b/src/hyp3_autorift/s1_correction.py index bb7d6c98..dc5995a1 100644 --- a/src/hyp3_autorift/s1_correction.py +++ b/src/hyp3_autorift/s1_correction.py @@ -1,64 +1,13 @@ import argparse -import copy import logging -from datetime import timedelta from pathlib import Path -from typing import Optional from hyp3lib.aws import upload_file_to_s3 -from hyp3lib.fetch import download_file -from hyp3lib.get_orb import downloadSentinelOrbitFile -from hyp3lib.scene import get_download_url -from hyp3_autorift import geometry, utils -from hyp3_autorift.process import DEFAULT_PARAMETER_FILE, get_s1_primary_polarization -from hyp3_autorift.utils import get_esa_credentials -from hyp3_autorift.vend.testGeogrid_ISCE import loadParsedata, runGeogrid -log = logging.getLogger(__name__) - - -def generate_correction_data( - scene: str, - buffer: int = 0, - parameter_file: str = DEFAULT_PARAMETER_FILE, - esa_username: Optional[str] = None, - esa_password: Optional[str] = None, -): - scene_path = Path(f'{scene}.zip') - if not scene_path.exists(): - scene_url = get_download_url(scene) - scene_path = download_file(scene_url, chunk_size=5242880) - - orbits = Path('Orbits').resolve() - orbits.mkdir(parents=True, exist_ok=True) - - if (esa_username is None) or (esa_password is None): - esa_username, esa_password = get_esa_credentials() - - state_vec, oribit_provider = downloadSentinelOrbitFile( - scene, directory=str(orbits), esa_credentials=(esa_username, esa_password) - ) - log.info(f'Downloaded orbit file {state_vec} from {oribit_provider}') +from hyp3_autorift.process import DEFAULT_PARAMETER_FILE +from hyp3_autorift.s1_isce2 import generate_correction_data - polarization = get_s1_primary_polarization(scene) - lat_limits, lon_limits = geometry.bounding_box(f'{scene}.zip', polarization=polarization, orbits=orbits) - - scene_poly = geometry.polygon_from_bbox(x_limits=lat_limits, y_limits=lon_limits) - parameter_info = utils.find_jpl_parameter_info(scene_poly, parameter_file) - - isce_dem = geometry.prep_isce_dem(parameter_info['geogrid']['dem'], lat_limits, lon_limits) - utils.format_tops_xml(scene, scene, polarization, isce_dem, orbits) - - reference_meta = loadParsedata(str(scene_path), orbit_dir=orbits, aux_dir=orbits, buffer=buffer) - - secondary_meta = copy.deepcopy(reference_meta) - spoof_dt = timedelta(days=1) - secondary_meta.sensingStart += spoof_dt - secondary_meta.sensingStop += spoof_dt - - geogrid_info = runGeogrid(reference_meta, secondary_meta, epsg=parameter_info['epsg'], **parameter_info['geogrid']) - - return geogrid_info +log = logging.getLogger(__name__) def main(): diff --git a/src/hyp3_autorift/s1_isce2.py b/src/hyp3_autorift/s1_isce2.py new file mode 100644 index 00000000..c1b2ee97 --- /dev/null +++ b/src/hyp3_autorift/s1_isce2.py @@ -0,0 +1,336 @@ +import copy +import logging +import os +import sys +import textwrap +from datetime import timedelta +from pathlib import Path +from typing import Optional + +import numpy as np +from hyp3lib.fetch import download_file +from hyp3lib.get_orb import downloadSentinelOrbitFile +from hyp3lib.scene import get_download_url +from osgeo import gdal + +from hyp3_autorift import geometry, utils +from hyp3_autorift.process import DEFAULT_PARAMETER_FILE +from hyp3_autorift.utils import get_esa_credentials + +log = logging.getLogger(__name__) + + +def get_s1_primary_polarization(granule_name): + polarization = granule_name[14:16] + if polarization in ['SV', 'DV']: + 
return 'vv' + if polarization in ['SH', 'DH']: + return 'hh' + raise ValueError(f'Cannot determine co-polarization of granule {granule_name}') + + +def process_sentinel1_with_isce2(reference, secondary, parameter_file): + import isce # noqa + from topsApp import TopsInSAR + from hyp3_autorift.vend.testGeogrid_ISCE import loadMetadata, runGeogrid + from hyp3_autorift.vend.testautoRIFT_ISCE import generateAutoriftProduct + + for scene in [reference, secondary]: + scene_url = get_download_url(scene) + download_file(scene_url, chunk_size=5242880) + + orbits = Path('Orbits').resolve() + orbits.mkdir(parents=True, exist_ok=True) + + esa_username, esa_password = utils.get_esa_credentials() + + reference_state_vec, reference_provider = downloadSentinelOrbitFile( + reference, directory=str(orbits), esa_credentials=(esa_username, esa_password) + ) + log.info(f'Downloaded orbit file {reference_state_vec} from {reference_provider}') + + secondary_state_vec, secondary_provider = downloadSentinelOrbitFile( + secondary, directory=str(orbits), esa_credentials=(esa_username, esa_password) + ) + log.info(f'Downloaded orbit file {secondary_state_vec} from {secondary_provider}') + + polarization = get_s1_primary_polarization(reference) + lat_limits, lon_limits = bounding_box(f'{reference}.zip', polarization=polarization, orbits=str(orbits)) + + scene_poly = geometry.polygon_from_bbox(x_limits=lat_limits, y_limits=lon_limits) + parameter_info = utils.find_jpl_parameter_info(scene_poly, parameter_file) + + isce_dem = prep_isce_dem(parameter_info['geogrid']['dem'], lat_limits, lon_limits) + format_tops_xml(reference, secondary, polarization, isce_dem, orbits) + + insar = TopsInSAR(name='topsApp', cmdline=['topsApp.xml', '--end=mergebursts']) + insar.configure() + insar.run() + + reference_path = os.path.join(os.getcwd(), 'merged', 'reference.slc.full') + secondary_path = os.path.join(os.getcwd(), 'merged', 'secondary.slc.full') + + for slc in [reference_path, secondary_path]: + gdal.Translate(slc, f'{slc}.vrt', format='ENVI') + + meta_r = loadMetadata('fine_coreg') + meta_s = loadMetadata('secondary') + geogrid_info = runGeogrid(meta_r, meta_s, epsg=parameter_info['epsg'], **parameter_info['geogrid']) + + # NOTE: After Geogrid is run, all drivers are no longer registered. + # I've got no idea why, or if there are other affects... 
+ gdal.AllRegister() + + netcdf_file = generateAutoriftProduct( + reference_path, secondary_path, nc_sensor='S1', optical_flag=False, ncname=None, + geogrid_run_info=geogrid_info, **parameter_info['autorift'], + parameter_file=parameter_file.replace('/vsicurl/', ''), + ) + return netcdf_file + + +def generate_correction_data( + scene: str, + buffer: int = 0, + parameter_file: str = DEFAULT_PARAMETER_FILE, + esa_username: Optional[str] = None, + esa_password: Optional[str] = None, +): + from hyp3_autorift.vend.testGeogrid_ISCE import loadParsedata, runGeogrid + scene_path = Path(f'{scene}.zip') + if not scene_path.exists(): + scene_url = get_download_url(scene) + scene_path = download_file(scene_url, chunk_size=5242880) + + orbits = Path('Orbits').resolve() + orbits.mkdir(parents=True, exist_ok=True) + + if (esa_username is None) or (esa_password is None): + esa_username, esa_password = get_esa_credentials() + + state_vec, oribit_provider = downloadSentinelOrbitFile( + scene, directory=str(orbits), esa_credentials=(esa_username, esa_password) + ) + log.info(f'Downloaded orbit file {state_vec} from {oribit_provider}') + + polarization = get_s1_primary_polarization(scene) + lat_limits, lon_limits = bounding_box(f'{scene}.zip', polarization=polarization, orbits=str(orbits)) + + scene_poly = geometry.polygon_from_bbox(x_limits=lat_limits, y_limits=lon_limits) + parameter_info = utils.find_jpl_parameter_info(scene_poly, parameter_file) + + isce_dem = prep_isce_dem(parameter_info['geogrid']['dem'], lat_limits, lon_limits) + format_tops_xml(scene, scene, polarization, isce_dem, orbits) + + reference_meta = loadParsedata(str(scene_path), orbit_dir=orbits, aux_dir=orbits, buffer=buffer) + + secondary_meta = copy.deepcopy(reference_meta) + spoof_dt = timedelta(days=1) + secondary_meta.sensingStart += spoof_dt + secondary_meta.sensingStop += spoof_dt + + geogrid_info = runGeogrid(reference_meta, secondary_meta, epsg=parameter_info['epsg'], **parameter_info['geogrid']) + + return geogrid_info + + +class SysArgvManager: + """Context manager to clear and reset sys.argv + + A bug in the ISCE2 Application class causes sys.argv to always be parsed when + no options are proved, even when setting `cmdline=[]`, preventing programmatic use. 
+ """ + def __init__(self): + self.argv = sys.argv.copy() + + def __enter__(self): + sys.argv = sys.argv[:1] + + def __exit__(self, exc_type, exc_val, exc_tb): + sys.argv = self.argv + + +def get_topsinsar_config(): + from isce.applications.topsApp import TopsInSAR + with SysArgvManager(): + insar = TopsInSAR(name="topsApp") + insar.configure() + + config_data = {} + for name in ['reference', 'secondary']: + scene = insar.__getattribute__(name) + + sensing_times = [] + for swath in range(1, 4): + scene.configure() + scene.swathNumber = swath + scene.parse() + sensing_times.append( + (scene.product.sensingStart, scene.product.sensingStop) + ) + + sensing_start = min([sensing_time[0] for sensing_time in sensing_times]) + sensing_stop = max([sensing_time[1] for sensing_time in sensing_times]) + + sensing_dt = (sensing_stop - sensing_start) / 2 + sensing_start + + config_data[f'{name}_filename'] = Path(scene.safe[0]).name + config_data[f'{name}_dt'] = sensing_dt.strftime("%Y%m%dT%H:%M:%S.%f").rstrip('0') + + return config_data + + +def format_tops_xml(reference, secondary, polarization, dem, orbits, xml_file='topsApp.xml'): + xml_template = f""" + + + + {orbits} + {orbits} + reference + ['{reference}.zip'] + {polarization} + + + {orbits} + {orbits} + secondary + ['{secondary}.zip'] + {polarization} + + {dem} + False + True + False + False + False + 32 + 32 + 51 + 51 + 32 + 32 + + + """ + + with open(xml_file, 'w') as f: + f.write(textwrap.dedent(xml_template)) + + +def bounding_box(safe, priority='reference', polarization='hh', orbits='Orbits', epsg=4326): + """Determine the geometric bounding box of a Sentinel-1 image + + :param safe: Path to the Sentinel-1 SAFE zip archive + :param priority: Image priority, either 'reference' (default) or 'secondary' + :param polarization: Image polarization (default: 'hh') + :param orbits: Path to the orbital files (default: './Orbits') + :param epsg: Projection EPSG code (default: 4326) + + :return: lat_limits (list), lon_limits (list) + lat_limits: list containing the [minimum, maximum] latitudes + lat_limits: list containing the [minimum, maximum] longitudes + """ + import isce # noqa: F401 + from contrib.geo_autoRIFT.geogrid import Geogrid + from isceobj.Orbit.Orbit import Orbit + from isceobj.Sensor.TOPS.Sentinel1 import Sentinel1 + frames = [] + for swath in range(1, 4): + rdr = Sentinel1() + rdr.configure() + rdr.safe = [os.path.abspath(safe)] + rdr.output = priority + rdr.orbitDir = os.path.abspath(orbits) + rdr.auxDir = os.path.abspath(orbits) + rdr.swathNumber = swath + rdr.polarization = polarization + rdr.parse() + frames.append(rdr.product) + + first_burst = frames[0].bursts[0] + sensing_start = min([x.sensingStart for x in frames]) + sensing_stop = max([x.sensingStop for x in frames]) + starting_range = min([x.startingRange for x in frames]) + far_range = max([x.farRange for x in frames]) + range_pixel_size = first_burst.rangePixelSize + prf = 1.0 / first_burst.azimuthTimeInterval + + orb = Orbit() + orb.configure() + + for state_vector in first_burst.orbit: + orb.addStateVector(state_vector) + + for frame in frames: + for burst in frame.bursts: + for state_vector in burst.orbit: + if state_vector.time < orb.minTime or state_vector.time > orb.maxTime: + orb.addStateVector(state_vector) + + obj = Geogrid() + obj.configure() + + obj.startingRange = starting_range + obj.rangePixelSize = range_pixel_size + obj.sensingStart = sensing_start + obj.prf = prf + obj.lookSide = -1 + obj.numberOfLines = int(np.round((sensing_stop - 
sensing_start).total_seconds() * prf)) + obj.numberOfSamples = int(np.round((far_range - starting_range)/range_pixel_size)) + obj.orbit = orb + obj.epsg = epsg + + obj.determineBbox() + + lat_limits = obj._xlim + lon_limits = obj._ylim + + log.info(f'Latitude limits [min, max]: {lat_limits}') + log.info(f'Longitude limits [min, max]: {lon_limits}') + + return lat_limits, lon_limits + + +def prep_isce_dem(input_dem, lat_limits, lon_limits, isce_dem=None): + import isce # noqa: F401 + import isceobj + from contrib.demUtils import createDemStitcher + + if isce_dem is None: + seamstress = createDemStitcher() + isce_dem = seamstress.defaultName([*lat_limits, *lon_limits]) + + isce_dem = os.path.abspath(isce_dem + '.wgs84') + log.info(f'ISCE dem is: {isce_dem}') + + in_ds = gdal.OpenShared(input_dem, gdal.GA_ReadOnly) + warp_options = gdal.WarpOptions( + format='ENVI', outputType=gdal.GDT_Int16, resampleAlg='cubic', + xRes=0.001, yRes=0.001, dstSRS='EPSG:4326', dstNodata=0, + outputBounds=[lon_limits[0], lat_limits[0], lon_limits[1], lat_limits[1]] + ) + gdal.Warp(isce_dem, in_ds, options=warp_options) + + del in_ds + + isce_ds = gdal.Open(isce_dem, gdal.GA_ReadOnly) + isce_trans = isce_ds.GetGeoTransform() + + img = isceobj.createDemImage() + img.width = isce_ds.RasterXSize + img.length = isce_ds.RasterYSize + img.bands = 1 + img.dataType = 'SHORT' + img.scheme = 'BIL' + img.setAccessMode('READ') + img.filename = isce_dem + + img.firstLongitude = isce_trans[0] + 0.5 * isce_trans[1] + img.deltaLongitude = isce_trans[1] + + img.firstLatitude = isce_trans[3] + 0.5 * isce_trans[5] + img.deltaLatitude = isce_trans[5] + img.renderHdr() + + return isce_dem diff --git a/src/hyp3_autorift/utils.py b/src/hyp3_autorift/utils.py index 19f3e1ed..a36a80a4 100644 --- a/src/hyp3_autorift/utils.py +++ b/src/hyp3_autorift/utils.py @@ -3,8 +3,6 @@ import logging import netrc import os -import sys -import textwrap from pathlib import Path from platform import system from typing import Tuple, Union @@ -122,90 +120,6 @@ def find_jpl_parameter_info(polygon: ogr.Geometry, parameter_file: str) -> dict: return parameter_info -def format_tops_xml(reference, secondary, polarization, dem, orbits, xml_file='topsApp.xml'): - xml_template = f""" - - - - {orbits} - {orbits} - reference - ['{reference}.zip'] - {polarization} - - - {orbits} - {orbits} - secondary - ['{secondary}.zip'] - {polarization} - - {dem} - False - True - False - False - False - 32 - 32 - 51 - 51 - 32 - 32 - - - """ - - with open(xml_file, 'w') as f: - f.write(textwrap.dedent(xml_template)) - - -class SysArgvManager: - """Context manager to clear and reset sys.argv - - A bug in the ISCE2 Application class causes sys.argv to always be parsed when - no options are proved, even when setting `cmdline=[]`, preventing programmatic use. 
- """ - def __init__(self): - self.argv = sys.argv.copy() - - def __enter__(self): - sys.argv = sys.argv[:1] - - def __exit__(self, exc_type, exc_val, exc_tb): - sys.argv = self.argv - - -def get_topsinsar_config(): - from isce.applications.topsApp import TopsInSAR - with SysArgvManager(): - insar = TopsInSAR(name="topsApp") - insar.configure() - - config_data = {} - for name in ['reference', 'secondary']: - scene = insar.__getattribute__(name) - - sensing_times = [] - for swath in range(1, 4): - scene.configure() - scene.swathNumber = swath - scene.parse() - sensing_times.append( - (scene.product.sensingStart, scene.product.sensingStop) - ) - - sensing_start = min([sensing_time[0] for sensing_time in sensing_times]) - sensing_stop = max([sensing_time[1] for sensing_time in sensing_times]) - - sensing_dt = (sensing_stop - sensing_start) / 2 + sensing_start - - config_data[f'{name}_filename'] = Path(scene.safe[0]).name - config_data[f'{name}_dt'] = sensing_dt.strftime("%Y%m%dT%H:%M:%S.%f").rstrip('0') - - return config_data - - def load_geospatial(infile: str, band: int = 1): ds = gdal.Open(infile, gdal.GA_ReadOnly) diff --git a/src/hyp3_autorift/vend/CHANGES.diff b/src/hyp3_autorift/vend/CHANGES.diff index 0424d0ae..2ddf2a3a 100644 --- a/src/hyp3_autorift/vend/CHANGES.diff +++ b/src/hyp3_autorift/vend/CHANGES.diff @@ -134,7 +134,7 @@ - slave_filename = conts['slave_filename'][0] - master_dt = conts['master_dt'][0] - slave_dt = conts['slave_dt'][0] -+ from hyp3_autorift.utils import get_topsinsar_config ++ from hyp3_autorift.s1_isce2 import get_topsinsar_config + conts = get_topsinsar_config() + master_filename = conts['reference_filename'] + slave_filename = conts['secondary_filename'] diff --git a/src/hyp3_autorift/vend/README.md b/src/hyp3_autorift/vend/README.md index 6fd4f4a3..196e1b72 100644 --- a/src/hyp3_autorift/vend/README.md +++ b/src/hyp3_autorift/vend/README.md @@ -24,7 +24,7 @@ Changes, as listed in `CHANGES.diff`, were done to: * use the full Sentinel-2 COG id in the output netCDF product filename to ensure unique names **Note:** The `topsinsar_filename.py` included here is not used, but retained for reference. -We've replaced it with `hyp3_autorift.utils.get_topsinsar_config`. +We've replaced it with `hyp3_autorift.s1_isce2.get_topsinsar_config`. 
## Additional Patches diff --git a/src/hyp3_autorift/vend/testautoRIFT_ISCE.py b/src/hyp3_autorift/vend/testautoRIFT_ISCE.py index 1632ee0d..f382a489 100755 --- a/src/hyp3_autorift/vend/testautoRIFT_ISCE.py +++ b/src/hyp3_autorift/vend/testautoRIFT_ISCE.py @@ -864,7 +864,7 @@ def generateAutoriftProduct(indir_m, indir_s, grid_location, init_offset, search dt = geogrid_run_info['dt'] epsg = geogrid_run_info['epsg'] - from hyp3_autorift.utils import get_topsinsar_config + from hyp3_autorift.s1_isce2 import get_topsinsar_config conts = get_topsinsar_config() master_filename = conts['reference_filename'] slave_filename = conts['secondary_filename'] diff --git a/tests/test_process.py b/tests/test_process.py index 58e02a08..e6801354 100644 --- a/tests/test_process.py +++ b/tests/test_process.py @@ -286,26 +286,6 @@ def test_get_datetime(): process.get_datetime('S3_adsflafjladsf') -def test_get_s1_primary_polarization(): - assert process.get_s1_primary_polarization( - 'S1B_WV_SLC__1SSV_20200923T184541_20200923T185150_023506_02CA71_AABB') == 'vv' - assert process.get_s1_primary_polarization( - 'S1B_IW_GRDH_1SDV_20200924T092954_20200924T093026_023515_02CABC_6C62') == 'vv' - assert process.get_s1_primary_polarization( - 'S1B_IW_GRDH_1SSH_20200924T112903_20200924T112932_023516_02CAC7_D003') == 'hh' - assert process.get_s1_primary_polarization( - 'S1B_IW_OCN__2SDH_20200924T090450_20200924T090519_023515_02CAB8_917B') == 'hh' - - with pytest.raises(ValueError): - process.get_s1_primary_polarization('S1A_EW_GRDM_1SHH_20150513T080355_20150513T080455_005900_007994_35D2') - with pytest.raises(ValueError): - process.get_s1_primary_polarization('S1A_EW_GRDM_1SHV_20150509T230833_20150509T230912_005851_00787D_3BE5') - with pytest.raises(ValueError): - process.get_s1_primary_polarization('S1A_IW_SLC__1SVH_20150706T015744_20150706T015814_006684_008EF7_9B69') - with pytest.raises(ValueError): - process.get_s1_primary_polarization('S1A_IW_GRDH_1SVV_20150706T015720_20150706T015749_006684_008EF7_54BA') - - def test_apply_landsat_filtering(monkeypatch): def mock_apply_filter_function(scene, _): if process.get_platform(scene) < 'L7': diff --git a/tests/test_s1_isce2.py b/tests/test_s1_isce2.py new file mode 100644 index 00000000..084253fa --- /dev/null +++ b/tests/test_s1_isce2.py @@ -0,0 +1,23 @@ +import pytest + +from hyp3_autorift import s1_isce2 + + +def test_get_s1_primary_polarization(): + assert s1_isce2.get_s1_primary_polarization( + 'S1B_WV_SLC__1SSV_20200923T184541_20200923T185150_023506_02CA71_AABB') == 'vv' + assert s1_isce2.get_s1_primary_polarization( + 'S1B_IW_GRDH_1SDV_20200924T092954_20200924T093026_023515_02CABC_6C62') == 'vv' + assert s1_isce2.get_s1_primary_polarization( + 'S1B_IW_GRDH_1SSH_20200924T112903_20200924T112932_023516_02CAC7_D003') == 'hh' + assert s1_isce2.get_s1_primary_polarization( + 'S1B_IW_OCN__2SDH_20200924T090450_20200924T090519_023515_02CAB8_917B') == 'hh' + + with pytest.raises(ValueError): + s1_isce2.get_s1_primary_polarization('S1A_EW_GRDM_1SHH_20150513T080355_20150513T080455_005900_007994_35D2') + with pytest.raises(ValueError): + s1_isce2.get_s1_primary_polarization('S1A_EW_GRDM_1SHV_20150509T230833_20150509T230912_005851_00787D_3BE5') + with pytest.raises(ValueError): + s1_isce2.get_s1_primary_polarization('S1A_IW_SLC__1SVH_20150706T015744_20150706T015814_006684_008EF7_9B69') + with pytest.raises(ValueError): + s1_isce2.get_s1_primary_polarization('S1A_IW_GRDH_1SVV_20150706T015720_20150706T015749_006684_008EF7_54BA')
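A minimal sketch of invoking the relocated Sentinel-1 entry point directly from Python, assuming an ISCE2-capable environment and the Earthdata Login/CDSE credentials described in the README (the granule IDs are the Sentinel-1 examples used elsewhere in this project):

```python
# Sketch: drive the new hyp3_autorift.s1_isce2 module directly.
# Assumes EARTHDATA_*/ESA_* credentials are available and ISCE2 is installed.
from hyp3_autorift.process import DEFAULT_PARAMETER_FILE
from hyp3_autorift.s1_isce2 import process_sentinel1_with_isce2

netcdf_file = process_sentinel1_with_isce2(
    reference='S1A_IW_SLC__1SSH_20170221T204710_20170221T204737_015387_0193F6_AB07',
    secondary='S1B_IW_SLC__1SSH_20170227T204628_20170227T204655_004491_007D11_6654',
    parameter_file=DEFAULT_PARAMETER_FILE,
)
print(netcdf_file)  # path to the autoRIFT netCDF product
```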