Merge pull request #733 from DHI/more_examples
Documentation examples
ecomodeller authored Oct 17, 2024
2 parents 7ec68b2 + 4400e40 commit aaaeedb
Showing 10 changed files with 298 additions and 8 deletions.
13 changes: 12 additions & 1 deletion docs/_quarto.yml
Original file line number Diff line number Diff line change
@@ -49,12 +49,20 @@ website:
- section: Examples
href: examples/index.qmd
contents:
- section: Dfs0
href: examples/dfs0/index.qmd
contents:
- examples/dfs0/cmems_insitu.qmd
- section: Dfs2
href: examples/dfs2/index.qmd
contents:
- examples/dfs2/bathy.qmd
- examples/dfs2/gfs.qmd
- examples/Dfsu-2D-interpolation.qmd
- section: Dfsu
href: examples/dfsu/index.qmd
contents:
- examples/dfsu/spatial_interpolation.qmd
- examples/dfsu/merge_subdomains.qmd
- examples/Time-interpolation.qmd
- examples/Generic.qmd
- text: Design philosophy
@@ -164,3 +172,6 @@ format:
html:
theme: cosmo
toc: true
ipynb:
theme: cosmo
toc: true
104 changes: 104 additions & 0 deletions docs/examples/dfs0/cmems_insitu.qmd
@@ -0,0 +1,104 @@
---
title: Dfs0 - CMEMS *in-situ* data
jupyter: python3
---



```{python}
import pandas as pd
import xarray as xr
import mikeio
```

```{python}
fino = xr.open_dataset("../../data/NO_TS_MO_FINO1_202209.nc")
fino
```

CMEMS *in-situ* data is provided in a standardised format, the [OceanSITES time-series data format](http://www.oceansites.org/docs/oceansites_data_format_reference_manual.pdf).

> "*The OceanSITES programme is the global network of open-ocean sustained time series
sites, called ocean reference stations, being implemented by an international partnership of
researchers and agencies. OceanSITES provides fixed-point time series of various physical,
biogeochemical, ecosystem and atmospheric variables at locations around the globe, from
the atmosphere and sea surface to the seafloor. The program’s objective is to build and
maintain a multidisciplinary global network for a broad range of research and operational
applications including climate, carbon, and ecosystem variability and forecasting and ocean
state validation*"

List the available variables to find the ones we want to extract:

```{python}
data = [
    {
        "name": fino[var].name,
        "standard_name": fino[var].standard_name,
        "units": fino[var].units,
    }
    for var in fino.data_vars
    if hasattr(fino[var], "units")
]
pd.DataFrame(data)
```

The data have a DEPTH dimension, even though each variable is only measured at a single level; the measurement depth does not vary in time, although the format allows it to.

E.g. temperature (TEMP) is available at level 1 (0.5 m):

```{python}
fino.DEPH.plot.line(x="TIME")
```

```{python}
fino['TEMP'].plot.line("-^",x='TIME')
```

```{python}
fino['VHM0'].plot.line("-^",x='TIME')
```

Wave data are only available at the surface.

```{python}
fino[['VHM0','VTZA','VPED']].isel(DEPTH=0)
```

```{python}
df = fino[['VHM0','VTZA','VPED']].isel(DEPTH=0).to_dataframe()
```

The three wave parameters are stored on the same (concurrent) timesteps.

```{python}
df[['VHM0','VTZA','VPED']].head()
```

```{python}
df[['VHM0','VTZA']].plot(style='+')
```

Convert the wave height data to a mikeio dataset.

```{python}
ds = mikeio.from_pandas(
    df[["VHM0"]].dropna(), items=mikeio.ItemInfo(mikeio.EUMType.Significant_wave_height)
)
ds
```

Store the results in Dfs0 format.

```{python}
ds.to_dfs("FINO1_VHM0.dfs0")
```

Read the file again to verify the result:

```{python}
ds = mikeio.read("FINO1_VHM0.dfs0")
ds
```


7 changes: 7 additions & 0 deletions docs/examples/dfs0/index.qmd
@@ -0,0 +1,7 @@
---
title: Dfs0 examples
---

A collection of specific examples of working with dfs0 files. For a general introduction to dfs0 see the [user guide](../../user-guide/dfs0.qmd) and the [API reference](../../api/#dfs).

* [CMEMS *In-situ* data](cmems_insitu.qmd)
2 changes: 2 additions & 0 deletions docs/examples/dfs2/index.qmd
@@ -2,6 +2,8 @@
title: Dfs2 examples
---

A collection of specific examples of working with dfs2 files. For a general introduction to dfs2 see the [user guide](../../user-guide/dfs2.qmd) and the [API reference](../../api/#dfs).

* [Bathymetry](bathy.qmd)
* [Meteo data](gfs.qmd)

9 changes: 9 additions & 0 deletions docs/examples/dfsu/index.qmd
@@ -0,0 +1,9 @@
---
title: Dfsu examples
---

A collection of specific examples of working with dfsu files. For a general introduction to dfsu see the [user guide](../../user-guide/dfsu.qmd) and the [API reference](../../api/#dfs).


* [2D spatial interpolation](spatial_interpolation.qmd)
* [Merging subdomain dfsu files](merge_subdomains.qmd)
153 changes: 153 additions & 0 deletions docs/examples/dfsu/merge_subdomains.qmd
@@ -0,0 +1,153 @@
---
title: Merging subdomain dfsu files
jupyter: python3
---

During a parallel simulation, MIKE commonly splits the domain into subdomains and writes one result file per subdomain, with a `_p#` suffix. This script merges dfsu files of this type into a single file.

Note: the implementation below assumes a 2D dfsu file. For a 3D dfsu file, the script needs to be modified accordingly.


## Import libraries

```{python}
import mikeio
import numpy as np
from mikeio.spatial import GeometryFM2D
```

```{python}
# (optional) check first file, items etc.
mikeio.open("../../data/SimA_HD_p0.dfsu")
```

## Choose items to process

```{python}
# choose items to process (when in doubt look at one of the files you want to process with mikeio.open)
items = ["Surface elevation", "Current speed", "Current direction"]
```

## Read files

Option A: automatically find all files with the `_p#` suffix

```{python}
import glob
import os
import re

basename = "../../data/SimA_HD"  # common prefix of the subdomain dfsu files

def find_dfsu_files(basename):
    pattern = f"{basename}_p*.dfsu"
    files = glob.glob(pattern)
    if not files:
        raise ValueError(f"No files found matching the pattern: {pattern}")
    # sort numerically on the subdomain number, so that e.g. _p10 comes after _p2
    return sorted(files, key=lambda f: int(re.search(r"_p(\d+)\.dfsu$", f).group(1)))

dfs_files = find_dfsu_files(basename)
print(f"Found {len(dfs_files)} files:")
for file in dfs_files:
    print(f"  - {os.path.basename(file)}")

dfs_list = [mikeio.read(file, items=items) for file in dfs_files]
```

Option B: manually select files

```{python}
# List of input dfsu files
dfs_files = [
    "../../data/SimA_HD_p0.dfsu",
    "../../data/SimA_HD_p1.dfsu",
    "../../data/SimA_HD_p2.dfsu",
    "../../data/SimA_HD_p3.dfsu",
]

# read all dfsu files
dfs_list = [mikeio.read(file, items=items) for file in dfs_files]
```
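Either way, the merging below assumes that all subdomain files share the same time axis; if the output frequencies differed, the concatenation would silently misalign. A minimal sketch of such a check, using synthetic pandas time axes in place of the dfsu files (the helper name `check_common_time_axis` is illustrative, not part of mikeio):

```{python}
import pandas as pd

def check_common_time_axis(time_axes):
    """Return the shared time axis, or raise if the subdomains disagree."""
    reference = time_axes[0]
    for i, t in enumerate(time_axes[1:], start=1):
        if len(t) != len(reference) or not (t == reference).all():
            raise ValueError(f"Subdomain {i} has a different time axis than subdomain 0")
    return reference

# synthetic stand-ins for [dfs[items[0]].time for dfs in dfs_list]
axes = [pd.date_range("2024-01-01", periods=3, freq="h") for _ in range(4)]
check_common_time_axis(axes)
```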

## Extract data of all subdomains

```{python}
# Create a dictionary to store data for each item
data_dict = {item: [] for item in items}

# Get time steps (assuming all files have the same time steps)
time_steps = dfs_list[0][items[0]].time

# loop over items and time steps and concatenate data from all subdomains
for item in items:
    for i in range(len(time_steps)):
        # Extract and combine data for the current time step from all subdomains
        combined_data = np.concatenate([dfs[item].values[i, :] for dfs in dfs_list])
        data_dict[item].append(combined_data)
    # Convert the list to a numpy array
    data_dict[item] = np.array(data_dict[item])

# Prepare merged data
merged_data = np.array([data_dict[item] for item in items])
```
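The resulting `merged_data` array has shape `(n_items, n_timesteps, total_n_elements)`, where the element count is the sum over all subdomains. A minimal sketch of this bookkeeping with synthetic numpy arrays, so no dfsu files are required (all names here are illustrative):

```{python}
import numpy as np

n_timesteps = 3
element_counts = [5, 7, 4]  # elements per subdomain
items_demo = ["Surface elevation", "Current speed"]

# one (n_timesteps, n_elements) array per item per subdomain
subdomains = [
    {item: np.random.rand(n_timesteps, n) for item in items_demo}
    for n in element_counts
]

merged = np.array([
    np.concatenate([sub[item] for sub in subdomains], axis=1)
    for item in items_demo
])
print(merged.shape)  # (n_items, n_timesteps, sum of element counts)
```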

## Merge geometry of all subdomains

```{python}
geometries = [dfs.geometry for dfs in dfs_list]

combined_node_coordinates = []
combined_element_table = []
node_offset = 0

# loop through geometries to combine nodes and elements of all subdomains
for geom in geometries:
    current_node_coordinates = geom.node_coordinates
    current_element_table = geom.element_table
    combined_node_coordinates.extend(current_node_coordinates)
    # shift node indices so they refer to positions in the combined node list
    adjusted_element_table = [element + node_offset for element in current_element_table]
    combined_element_table.extend(adjusted_element_table)
    node_offset += len(current_node_coordinates)

combined_node_coordinates = np.array(combined_node_coordinates)
combined_element_table = np.array(combined_element_table, dtype=object)
projection = geometries[0]._projstr

# create combined geometry
combined_geometry = GeometryFM2D(
    node_coordinates=combined_node_coordinates,
    element_table=combined_element_table,
    projection=projection,
)
```

```{python}
combined_geometry.plot()
```
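The crucial step in the geometry merge is the `node_offset` shift: each subdomain's element table holds node indices local to that subdomain, so they must be re-indexed into the combined node list. A minimal illustration with two toy element tables (sizes and values are made up):

```{python}
import numpy as np

# two subdomains: the first has 4 nodes, the second 3
element_table_a = [np.array([0, 1, 2]), np.array([1, 2, 3])]  # indices into 4 nodes
element_table_b = [np.array([0, 1, 2])]                       # indices into 3 nodes

combined = []
node_offset = 0
for table, n_nodes in [(element_table_a, 4), (element_table_b, 3)]:
    combined.extend([element + node_offset for element in table])
    node_offset += n_nodes

print([e.tolist() for e in combined])  # [[0, 1, 2], [1, 2, 3], [4, 5, 6]]
```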

## Merge everything into dataset

```{python}
ds_out = mikeio.Dataset(
    data=merged_data,  # n_items, n_timesteps, n_elements
    items=items,
    time=time_steps,
    geometry=combined_geometry,
)
```

```{python}
ds_out[items[0]].isel(time=0).plot()  # plot the first time step of the first item
```

## Write output to single file

```{python}
output_file = "area_merged.dfsu"
ds_out.to_dfs(output_file)
```

@@ -10,7 +10,7 @@ import mikeio
```

```{python}
ds = mikeio.read("../data/wind_north_sea.dfsu", items="Wind speed")
ds = mikeio.read("../../data/wind_north_sea.dfsu", items="Wind speed")
ds
```

@@ -84,13 +84,13 @@ with rasterio.open(
```

![](../images/dfsu_grid_interp_tiff.png)
![](../../images/dfsu_grid_interp_tiff.png)

# Interpolate to other mesh
Interpolate the data from this coarse mesh onto a finer resolution mesh

```{python}
msh = mikeio.Mesh('../data/north_sea_2.mesh')
msh = mikeio.Mesh("../../data/north_sea_2.mesh")
msh
```

@@ -143,7 +143,7 @@ from mikeio._interpolation import get_idw_interpolant
```

```{python}
dfs = mikeio.open("../data/wind_north_sea.dfsu")
dfs = mikeio.open("../../data/wind_north_sea.dfsu")
```

```{python}
10 changes: 7 additions & 3 deletions notebooks/README.md
@@ -1,4 +1,8 @@
📢Example notebooks are moving to a new repo:
https://github.com/DHI/mikeio-examples
The notebooks in this folder are considered legacy and are being replaced by the documentation available at <https://dhi.github.io/mikeio>.

The contents are moving to either the [user guide](https://dhi.github.io/mikeio/user-guide/getting-started.html) or the [examples](https://dhi.github.io/mikeio/examples/) section of the documentation.

On each page in the documentation, there is a link to download as a Jupyter notebook. ![download as jupyter](download_jupyter.png)

If you find a notebook here that you would like to see in the documentation, please let us know by creating an issue in the [issue tracker](https://github.com/DHI/mikeio/issues).

🔍 Search for examples here <https://dhi.github.io/mikeio-examples/search.html>
Binary file added notebooks/download_jupyter.png
Binary file added tests/testdata/NO_TS_MO_FINO1_202209.nc
Binary file not shown.
