Functionality to Delete a Duplicate Request #213

Open
AndrewPlayer3 opened this issue Apr 24, 2023 · 5 comments
Labels: enhancement (New feature or request)

Comments

@AndrewPlayer3 (Contributor)

When a user creates an accidental duplicate request, both requests are processed. This wastes resources and uses up the user's job quota.

To save resources, the duplicate request should be deleted, but only if it is truly identical, i.e. it has both the same job name and the same request parameters. Jobs that share only the name, or only the request, should not be deleted.

Currently, there is no functionality that allows a user to delete a request, so there is no alternative.
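For context, the "identical name and request" criterion can be approximated client-side today by checking existing jobs before submitting. A minimal sketch, assuming the hyp3_sdk `HyP3.find_jobs()`, `Job.job_parameters`, and `submit_rtc_job()` interfaces; the project and granule names below are hypothetical:

```python
import hyp3_sdk

hyp3 = hyp3_sdk.HyP3()  # authenticate with Earthdata Login (e.g. via ~/.netrc)

def submit_if_new(granule: str, name: str):
    """Submit an RTC job unless an identical request (same name and granule) is already pending or running."""
    for job in hyp3.find_jobs(name=name):
        same_request = job.job_parameters.get('granules') == [granule]
        still_active = job.status_code in ('PENDING', 'RUNNING')
        if same_request and still_active:
            print(f'Skipping duplicate request for {granule}')
            return job
    return hyp3.submit_rtc_job(granule, name=name)

# hypothetical granule and project name, for illustration only
submit_if_new('S1A_EXAMPLE_GRANULE', name='my-project')
```

This only avoids re-submitting while a matching job is still active; server-side delete/cancel functionality would still be needed for jobs that have already started.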

AndrewPlayer3 added the enhancement label Apr 24, 2023
@asjohnston-asf (Member) commented Apr 24, 2023

There are occasional cases where a user might want to re-run a job with the exact same inputs and expect updated output:

  • after a new version of the job plugin has been released
  • after new, more precise Sentinel-1 orbit files are made available

Providing an option to cancel in-progress jobs makes sense as a way to give the end user the most control in situations where they may have unintentionally submitted duplicate jobs, submitted jobs with the wrong parameters, or submitted jobs in error.

@jhkennedy (Contributor)

Likewise, they may just want to:

  • reprocess a scene or pair whose products have expired and been lost

So I agree; I'd prefer a cancel/delete method as well.

@markbroich

Hi ASF team, I hope you are well. I was wondering whether you have had a chance to address this issue? Thank you for considering my questions.
I submitted a few images for processing last night, and they eventually finished. I just reran my code, and it looks like the same images are being processed again. I may be misunderstanding.

Thank you for your work and greetings from hot and humid NC. Regards, Mark

@markbroich

Hi ASF team, I now reduce my processing request by excluding what I have already copied to S3. I still think it would be good to prevent users from duplicate processing of identical images with the exact same specs (and orbit file). My 2c. Thanks for your work. Mark
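For reference, that kind of filtering can look roughly like the following. A minimal sketch using boto3; the bucket name, prefix, product-key convention, and granule list are all hypothetical:

```python
import boto3

BUCKET = 'my-products-bucket'   # hypothetical bucket where finished products are copied
PREFIX = 'hyp3-products/'       # hypothetical key prefix

# collect the keys that are already in S3
s3 = boto3.client('s3')
already_copied = set()
for page in s3.get_paginator('list_objects_v2').paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get('Contents', []):
        already_copied.add(obj['Key'])

# hypothetical candidate granules from an earlier search
candidate_granules = ['S1A_EXAMPLE_GRANULE_1', 'S1A_EXAMPLE_GRANULE_2']

# only submit granules whose expected product key is not already present
granules_to_submit = [g for g in candidate_granules
                      if f'{PREFIX}{g}.zip' not in already_copied]
```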

@jhkennedy (Contributor) commented Aug 29, 2023

Hi @markbroich, thank you for the feedback! Because there are legitimate use cases for re-processing the exact same job, we do not expect to add de-duplication functionality to HyP3.

However, monitoring an area (processing new acquisitions as they arrive) is a common application that needs de-duplication, so we've published two tutorials that show how to de-duplicate searches when monitoring an area.
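In rough terms, the de-duplication boils down to comparing the scene names returned by a search with the granules of jobs already submitted under a project name. A minimal sketch (not the exact tutorial code), assuming asf_search and hyp3_sdk, with a placeholder project name, AOI, and job type:

```python
import asf_search as asf
import hyp3_sdk

PROJECT_NAME = 'my-monitoring-project'  # placeholder project name

hyp3 = hyp3_sdk.HyP3()

# scenes currently available over the area of interest (placeholder AOI)
search_results = asf.geo_search(
    platform=asf.PLATFORM.SENTINEL1,
    processingLevel=asf.PRODUCT_TYPE.SLC,
    intersectsWith='POINT(-147.7 64.8)',
)
all_scenes = {result.properties['sceneName'] for result in search_results}

# granules already submitted under this project name
submitted = hyp3.find_jobs(name=PROJECT_NAME)
processed_scenes = {granule
                    for job in submitted
                    for granule in job.job_parameters.get('granules', [])}

# submit only the scenes that haven't been requested yet
for scene in all_scenes - processed_scenes:
    hyp3.submit_rtc_job(scene, name=PROJECT_NAME)
```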

Does this help?
