
[Help]: LOS Direction Displacement Values Are Too Large #170

Open
oguzhannysr opened this issue Sep 24, 2024 · 19 comments
Comments

@oguzhannysr

oguzhannysr commented Sep 24, 2024

@AlexeyPechnikov, Hello Alexey. As seen in the image, these are LOS-direction deformations from an SBAS analysis. The site is a landfill, so large deformation due to activity is normal. However, the values are extremely high for such a short time period. What could be causing this? There are no atmospheric problems, because those have been eliminated. How consistent would it be if I applied Gaussian filtering? In my previous experiments with the notebooks I also obtained very high deformation values. When I tried the same region with LiCSBAS, the maximum cumulative LOS value I got for 2016-2024 was 160 mm, while the deformation over 4 months with PyGMTSAR was 238 mm.

image

@AlexeyPechnikov
Owner

First, comparing different intervals is not a good idea due to the potential influence of seasonal factors. Second, it is crucial to remove orbital ramps, atmospheric phase delays, and tidal effects to avoid overestimating displacements. Gaussian filtering can indeed be used to eliminate these effects, but the filter wavelength (and corresponding sigma) must be carefully adjusted to preserve the actual deformation.
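
A hedged illustration of the ramp-removal idea (plain NumPy on an xarray grid with y/x coordinates; a generic sketch, not PyGMTSAR's own detrending routine): a planar orbital ramp can be estimated by least squares and subtracted from the unwrapped phase.

import numpy as np
import xarray as xr

def remove_ramp(unwrap: xr.DataArray) -> xr.DataArray:
    # Fit a plane a*x + b*y + c to the valid unwrapped-phase pixels and subtract it.
    yy, xx = np.meshgrid(unwrap.y.values, unwrap.x.values, indexing='ij')
    valid = np.isfinite(unwrap.values)
    A = np.column_stack([xx[valid], yy[valid], np.ones(valid.sum())])
    coeffs, *_ = np.linalg.lstsq(A, unwrap.values[valid], rcond=None)
    ramp = coeffs[0] * xx + coeffs[1] * yy + coeffs[2]
    return unwrap - ramp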

@oguzhannysr
Author

@AlexeyPechnikov, What exactly is the filter wavelength? Is it the one you mentioned in the code below? I set it to 30 because of the spatial resolution.
sbas.compute_interferogram_multilook(baseline_pairs, 'intf_mlook', wavelength=30, weight=sbas.psfunction())

@oguzhannysr
Author

@AlexeyPechnikov, Also, I get an error like this in the last section, even though the area is actually quite small. I tried restarting the client and it didn't work. Could I overcome this problem by using Colab Pro? Is it related to that?

image

@oguzhannysr
Author

@AlexeyPechnikov, For example, I tried narrowing the date range a little more and it works now, but even for a small area it takes almost 30 minutes to export each date. How can I speed this up?

image

@AlexeyPechnikov
Owner

What exactly is the filter wavelength? Is it the one you mentioned in the code below? I set it to 30 because of the spatial resolution.
sbas.compute_interferogram_multilook(baseline_pairs, 'intf_mlook', wavelength=30, weight=sbas.psfunction())

This code is for interferogram creation with Gaussian filtering, but I am talking about Gaussian detrending after unwrapping.
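
A minimal sketch of that detrending idea (the variable names, the ~10 km cut-off, and the assumption that Stack.gaussian takes a grid plus a wavelength like the other filtering calls in this thread are all illustrative):

# Estimate the long-wavelength component (ramps, atmosphere) of the unwrapped phase
# and subtract it, keeping deformation at scales well below the cut-off.
trend = sbas.gaussian(unwrap, wavelength=10000)
unwrap_detrend = unwrap - trend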

Also, I get an error like this in the last section, even though the area is actually quite small.

Check whether you have materialized grids or lazy ones; in the latter case, a long lazy pipeline can fail even on high-RAM hosts.
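
Assuming the grids are xarray objects, as elsewhere in PyGMTSAR, a quick check looks like this: a lazy grid is backed by a Dask array and reports chunk sizes, while a materialized one wraps a plain NumPy array.

import dask.array as da

# 'grid' stands for any intermediate result, e.g. the unwrapped phase.
print(isinstance(grid.data, da.Array))  # True for a lazy (Dask-backed) grid
print(grid.chunks)                      # None for materialized data, chunk sizes otherwise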

@oguzhannysr
Author

@AlexeyPechnikov, Alexey, regarding the wavelength selection, I think you are talking about the line from the Imperial Valley example below. Would it make sense to set the wavelength to 100 based on my 30-meter spatial resolution?

# Gaussian filtering 400m cut-off wavelength with multilooking 1x4 on Sentinel-1 intensity
intensity = sbas.multilooking(np.square(np.abs(data)), wavelength=400, coarsen=(1,4))

How can I check whether the grids are materialized or lazy? I don't know Dask.

@AlexeyPechnikov
Owner

Absolutely not; that is not a 'Stack.gaussian' function call.

Use 'sync' functions for every step, as in the large dataset examples, if you are not sure.
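
For example, a hedged sketch based on the sync_cube call quoted later in this thread (the variable and storage names are placeholders): materializing each step keeps the Dask graph short, so later steps read stored grids instead of recomputing the whole pipeline.

# Compute the result, store it to disk, and reopen it as a materialized grid.
stl_sbas = sbas.sync_cube(stl_sbas, 'stl_sbas')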

@oguzhannysr
Author

oguzhannysr commented Oct 7, 2024

First, comparing different intervals is not a good idea due to the potential influence of seasonal factors.

@AlexeyPechnikov, I don't understand what you mean here. I want to measure the cumulative deformation since 2016; would that be seasonally inaccurate?

@AlexeyPechnikov
Owner

As mentioned above:

while the deformation over 4 months with PyGMTSAR was 238 mm

This time frame is too short for cases with significant seasonal changes. However, the magnitude of the change is large enough that it is likely due to unremoved tidal effects or atmospheric phase delays.

@oguzhannysr
Author

oguzhannysr commented Oct 8, 2024

@AlexeyPechnikov, Thank you, Alexey. I still haven't solved my problem with Dask; how can I speed up the export process? Google Colab stops after a certain period of time. Would running PyGMTSAR with a Colab GPU help, in terms of RAM?

Since these lines take a long time, I run my notebook with them commented out. Is this why the export takes so long?

# optionally, materialize to disk and open
#stl_sbas = sbas.sync_cube(stl_sbas, 'stl_sbas')

@AlexeyPechnikov
Owner

how can I speed up the export process?

You can export materialized datasets that are synced to disk.
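
A hedged sketch with placeholder names, assuming the displacement cube is an xarray object as elsewhere in PyGMTSAR: sync it to disk first, then export the materialized result rather than the lazy pipeline.

disp_sbas = sbas.sync_cube(disp_sbas, 'disp_sbas')  # materialize before exporting
disp_sbas.to_netcdf('disp_sbas.nc')                 # standard xarray export of the stored data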

@oguzhannysr
Author

oguzhannysr commented Oct 8, 2024

@AlexeyPechnikov, Alexey, I obtained results for both orbits over the same area, but I couldn't decide which one to use. What is the correct way to determine this? I don't know much about the terrain. I am trying to analyze the runway and taxiways in the middle, and I'm investigating which parts have subsided or risen more.

Descending:
image

Ascending:
image

Airport:
image

Airport (descending):
image

Airport (ascending):
image

@AlexeyPechnikov
Owner

The results look noisy. Have you performed atmospheric corrections and other processing steps? You can combine both orbits to derive vertical and horizontal displacements.
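
A hedged sketch of the standard decomposition, in plain NumPy rather than a PyGMTSAR call: with LOS displacement taken positive toward the satellite and the north component neglected (Sentinel-1 is nearly insensitive to it), the ascending and descending LOS values at each pixel can be inverted for east-west and vertical motion. The incidence angles and headings below are typical values, not taken from this dataset, and signs must follow the processor's convention.

import numpy as np

theta_asc, theta_dsc = np.radians(39.0), np.radians(39.0)    # incidence angles (illustrative)
alpha_asc, alpha_dsc = np.radians(-10.0), np.radians(190.0)  # track headings from north (illustrative)

# d_los = -sin(theta)*cos(alpha)*d_east + cos(theta)*d_up  (north term dropped)
A = np.array([[-np.sin(theta_asc) * np.cos(alpha_asc), np.cos(theta_asc)],
              [-np.sin(theta_dsc) * np.cos(alpha_dsc), np.cos(theta_dsc)]])

los_asc, los_dsc = 12.0, -20.0                   # example LOS displacements, mm
d_east, d_up = np.linalg.solve(A, [los_asc, los_dsc])
print(d_east, d_up)

In practice both stacks must first be resampled to a common grid and referenced to the same stable point.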

@oguzhannysr
Author

@AlexeyPechnikov, Yes, I followed the steps in the otmanbozdagh notebook; I think I applied it in the turbulent atmospheric effects section. I'm still undecided about which orbit's result to use. How can I separate vertical and horizontal displacements? Is this possible in PyGMTSAR? I couldn't find any examples.

@oguzhannysr
Author

@AlexeyPechnikov, Alexey, I'm waiting for your comments and help...

@AlexeyPechnikov
Owner

If you’ve applied turbulent atmospheric correction and still see noisy results, it usually means your interferograms are too noisy. You can filter out the noisiest ones or unwrap only the best-correlated pixels to achieve numerical stability. There are two documents on my Patreon that may help in selecting the proper processing parameters and estimating accuracy: “Baseline Networks for PS and SBAS Analyses, 2024” and “Residuals of Topographic Phase and Constant Phase Delays.”
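
For instance, a hedged sketch (the 'corr' stack with a 'pair' dimension and the 0.25 cut-off are assumptions): rank the interferogram pairs by their spatially averaged coherence and keep only the better-correlated ones before the SBAS inversion.

corr_mean = corr.mean(['y', 'x'])                              # one average coherence value per pair
good_pairs = corr_mean.where(corr_mean > 0.25, drop=True).pair # keep only the better-correlated pairs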

@oguzhannysr
Author

@AlexeyPechnikov, What threshold should I apply to the coherence value to select these pixels? Is there a threshold you recommend, or does PyGMTSAR provide a way to determine it?

@AlexeyPechnikov
Owner

Use the stack correlation map to estimate the correlation threshold, as demonstrated in the provided examples.
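
A hedged sketch of that idea (again assuming a coherence stack 'corr' with a 'pair' dimension; the 0.3 value is only an example to be read off the map or histogram): average the coherence over all pairs and mask pixels below the chosen threshold before unwrapping.

corr_stack = corr.mean('pair')   # stack-average coherence per pixel
mask = corr_stack > 0.3          # True for pixels reliable enough to unwrap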

@oguzhannysr
Author

@AlexeyPechnikov, Alexey, I have a different question: how reliable would the PSI method be over an actively operating landfill? Also, is PSI available in PyGMTSAR?
