diff --git a/docs/_quarto.yml b/docs/_quarto.yml
index 28277f3f5..469daf756 100644
--- a/docs/_quarto.yml
+++ b/docs/_quarto.yml
@@ -49,12 +49,20 @@ website:
     - section: Examples
       href: examples/index.qmd
       contents:
+      - section: Dfs0
+        href: examples/dfs0/index.qmd
+        contents:
+        - examples/dfs0/cmems_insitu.qmd
       - section: Dfs2
         href: examples/dfs2/index.qmd
         contents:
         - examples/dfs2/bathy.qmd
         - examples/dfs2/gfs.qmd
-      - examples/Dfsu-2D-interpolation.qmd
+      - section: Dfsu
+        href: examples/dfsu/index.qmd
+        contents:
+        - examples/dfsu/spatial_interpolation.qmd
+        - examples/dfsu/merge_subdomains.qmd
       - examples/Time-interpolation.qmd
       - examples/Generic.qmd
     - text: Design philosophy
@@ -164,3 +172,6 @@ format:
   html:
     theme: cosmo
     toc: true
+  ipynb:
+    theme: cosmo
+    toc: true
diff --git a/docs/examples/dfs0/cmems_insitu.qmd b/docs/examples/dfs0/cmems_insitu.qmd
new file mode 100644
index 000000000..10db421ee
--- /dev/null
+++ b/docs/examples/dfs0/cmems_insitu.qmd
@@ -0,0 +1,104 @@
+---
+title: Dfs0 - CMEMS *in-situ* data
+jupyter: python3
+---
+
+
+
+```{python}
+import pandas as pd
+import xarray as xr
+import mikeio
+```
+
+```{python}
+fino = xr.open_dataset("../../data/NO_TS_MO_FINO1_202209.nc")
+fino
+```
+
+CMEMS *in-situ* data is provided in a standardised format, the [OceanSITES time-series data format](http://www.oceansites.org/docs/oceansites_data_format_reference_manual.pdf).
+
+> "*The OceanSITES programme is the global network of open-ocean sustained time series
+sites, called ocean reference stations, being implemented by an international partnership of
+researchers and agencies. OceanSITES provides fixed-point time series of various physical,
+biogeochemical, ecosystem and atmospheric variables at locations around the globe, from
+the atmosphere and sea surface to the seafloor. The program’s objective is to build and
+maintain a multidisciplinary global network for a broad range of research and operational
+applications including climate, carbon, and ecosystem variability and forecasting and ocean
+state validation*"
+
+Find out which variables we are interested in extracting:
+
+```{python}
+data = [
+    {
+        "name": fino[var].name,
+        "standard_name": fino[var].standard_name,
+        "units": fino[var].units,
+    }
+    for var in fino.data_vars
+    if hasattr(fino[var], "units")
+]
+
+pd.DataFrame(data)
+```
+
+The data have a DEPTH dimension, even though each variable is only measured at a single level, and the measurement depth does not vary in time (although the format allows for it).
+
+E.g. temperature (TEMP) is available at level 1 (0.5 m).
+
+```{python}
+fino.DEPH.plot.line(x="TIME")
+```
+
+```{python}
+fino["TEMP"].plot.line("-^", x="TIME")
+```
+
+```{python}
+fino["VHM0"].plot.line("-^", x="TIME")
+```
+
+Wave data are only available at the surface.
+
+```{python}
+fino[["VHM0", "VTZA", "VPED"]].isel(DEPTH=0)
+```
+
+```{python}
+df = fino[["VHM0", "VTZA", "VPED"]].isel(DEPTH=0).to_dataframe()
+```
+
+The variables are stored on concurrent timesteps.
+
+```{python}
+df[["VHM0", "VTZA", "VPED"]].head()
+```
+
+```{python}
+df[["VHM0", "VTZA"]].plot(style="+")
+```
+
+Convert the wave height data to a mikeio dataset.
+
+```{python}
+ds = mikeio.from_pandas(
+    df[["VHM0"]].dropna(), items=mikeio.ItemInfo(mikeio.EUMType.Significant_wave_height)
+)
+ds
+```
+
+Store the results in Dfs0 format.
+
+```{python}
+ds.to_dfs("FINO1_VHM0.dfs0")
+```
+
+Read the file again to check...
+
+```{python}
+ds = mikeio.read("FINO1_VHM0.dfs0")
+ds
+```
+
+
diff --git a/docs/examples/dfs0/index.qmd b/docs/examples/dfs0/index.qmd
new file mode 100644
index 000000000..7284d0d73
--- /dev/null
+++ b/docs/examples/dfs0/index.qmd
@@ -0,0 +1,7 @@
+---
+title: Dfs0 examples
+---
+
+A collection of specific examples of working with dfs0 files. For a general introduction to dfs0, see the [user guide](../../user-guide/dfs0.qmd) and the [API reference](../../api/#dfs).
+
+* [CMEMS *in-situ* data](cmems_insitu.qmd)
\ No newline at end of file
diff --git a/docs/examples/dfs2/index.qmd b/docs/examples/dfs2/index.qmd
index d9b9adbef..c5323375b 100644
--- a/docs/examples/dfs2/index.qmd
+++ b/docs/examples/dfs2/index.qmd
@@ -2,6 +2,8 @@
 title: Dfs2 examples
 ---
 
+A collection of specific examples of working with dfs2 files. For a general introduction to dfs2, see the [user guide](../../user-guide/dfs2.qmd) and the [API reference](../../api/#dfs).
+
 * [Bathymetry](bathy.qmd)
 * [Meteo data](gfs.qmd)
 
diff --git a/docs/examples/dfsu/index.qmd b/docs/examples/dfsu/index.qmd
new file mode 100644
index 000000000..dad1c0049
--- /dev/null
+++ b/docs/examples/dfsu/index.qmd
@@ -0,0 +1,9 @@
+---
+title: Dfsu examples
+---
+
+A collection of specific examples of working with dfsu files. For a general introduction to dfsu, see the [user guide](../../user-guide/dfsu.qmd) and the [API reference](../../api/#dfs).
+
+
+* [2D spatial interpolation](spatial_interpolation.qmd)
+* [Merging subdomain dfsu files](merge_subdomains.qmd)
\ No newline at end of file
diff --git a/docs/examples/dfsu/merge_subdomains.qmd b/docs/examples/dfsu/merge_subdomains.qmd
new file mode 100644
index 000000000..84d8c8090
--- /dev/null
+++ b/docs/examples/dfsu/merge_subdomains.qmd
@@ -0,0 +1,153 @@
+---
+title: Merging subdomain dfsu files
+jupyter: python3
+---
+
+During a simulation, MIKE commonly decomposes the domain into subdomains and writes one result file per subdomain with a p_# suffix. This script merges dfsu files of this type into a single file.
+
+Note: The implementation below considers a 2D dfsu file. For a 3D dfsu file, the script needs to be modified accordingly.
+
+
+## Import libraries
+
+```{python}
+import mikeio
+import numpy as np
+from mikeio.spatial import GeometryFM2D
+```
+
+```{python}
+# (optional) check first file, items etc.
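+# mikeio.open reads only the file header (items, geometry, number of time steps)
+# without loading any data, so it is a cheap way to inspect a subdomain file first.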
+mikeio.open("../../data/SimA_HD_p0.dfsu")
+```
+
+## Choose items to process
+
+```{python}
+# choose items to process (when in doubt, inspect one of the files you want to process with mikeio.open)
+items = ["Surface elevation", "Current speed", "Current direction"]
+```
+
+## Read files
+
+Option A: automatically find all files with the _p# suffix
+
+```{python}
+import glob
+import os
+
+basename = "../../data/SimA_HD"  # basename of the dfsu files
+
+
+def find_dfsu_files(basename):
+    pattern = f"{basename}_p*.dfsu"
+    files = sorted(glob.glob(pattern))
+    if not files:
+        raise ValueError(f"No files found matching the pattern: {pattern}")
+    return files
+
+
+dfs_files = find_dfsu_files(basename)
+print(f"Found {len(dfs_files)} files:")
+for file in dfs_files:
+    print(f" - {os.path.basename(file)}")
+
+dfs_list = [mikeio.read(file, items=items) for file in dfs_files]
+```
+
+Option B: manually select files
+
+```{python}
+# List of input dfsu files
+dfs_files = [
+    "../../data/SimA_HD_p0.dfsu",
+    "../../data/SimA_HD_p1.dfsu",
+    "../../data/SimA_HD_p2.dfsu",
+    "../../data/SimA_HD_p3.dfsu",
+]
+
+# read all dfsu files
+dfs_list = [mikeio.read(file, items=items) for file in dfs_files]
+```
+
+## Extract data of all subdomains
+
+```{python}
+# Create a dictionary to store data for each item
+data_dict = {item: [] for item in items}
+
+# Get time steps (assuming all files have the same time steps)
+time_steps = dfs_list[0][items[0]].time
+
+# loop over items and time steps and concatenate data from all subdomains
+for item in items:
+    for i in range(len(time_steps)):
+        # Extract and combine data for the current time step from all subdomains
+        combined_data = np.concatenate([dfs[item].values[i, :] for dfs in dfs_list])
+        data_dict[item].append(combined_data)
+
+    # Convert the list to a numpy array
+    data_dict[item] = np.array(data_dict[item])
+
+# Prepare merged data
+merged_data = np.array([data_dict[item] for item in items])
+```
+
+## Merge geometry of all subdomains
+
+```{python}
+geometries = [dfs.geometry for dfs in dfs_list]
+
+combined_node_coordinates = []
+combined_element_table = []
+node_offset = 0
+
+# loop through geometries to combine nodes and elements of all subdomains
+for geom in geometries:
+    current_node_coordinates = geom.node_coordinates
+    current_element_table = geom.element_table
+
+    combined_node_coordinates.extend(current_node_coordinates)
+    adjusted_element_table = [element + node_offset for element in current_element_table]
+    combined_element_table.extend(adjusted_element_table)
+
+    node_offset += len(current_node_coordinates)
+
+combined_node_coordinates = np.array(combined_node_coordinates)
+combined_element_table = np.array(combined_element_table, dtype=object)
+projection = geometries[0]._projstr
+
+# create combined geometry
+combined_geometry = GeometryFM2D(
+    node_coordinates=combined_node_coordinates,
+    element_table=combined_element_table,
+    projection=projection
+)
+```
+
+```{python}
+combined_geometry.plot()
+```
+
+## Merge everything into a dataset
+
+```{python}
+ds_out = mikeio.Dataset(
+    data=merged_data,  # n_items, timesteps, n_elements
+    items=items,
+    time=time_steps,
+    geometry=combined_geometry
+)
+```
+
+```{python}
+ds_out[items[0]].sel(time=1).plot()  # plot the first item at a single time step
+```
+
+## Write output to a single file
+
+```{python}
+output_file = "area_merged.dfsu"
+ds_out.to_dfs(output_file)
+```
+
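A minimal follow-up sketch (assuming the script above has just been run in the same session, so `area_merged.dfsu` exists in the working directory and `combined_geometry` is still defined): open the merged file again and confirm that the items and the combined geometry were written as expected.

```python
import mikeio

# Open the merged file; this reads only the header (items, geometry, time axis)
dfs = mikeio.open("area_merged.dfsu")
print(dfs)

# The element count should match the combined geometry built above
assert dfs.geometry.n_elements == combined_geometry.n_elements
```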
diff --git a/docs/examples/Dfsu-2D-interpolation.qmd b/docs/examples/dfsu/spatial_interpolation.qmd
similarity index 94%
rename from docs/examples/Dfsu-2D-interpolation.qmd
rename to docs/examples/dfsu/spatial_interpolation.qmd
index 10b5c48e8..98bf8fb44 100644
--- a/docs/examples/Dfsu-2D-interpolation.qmd
+++ b/docs/examples/dfsu/spatial_interpolation.qmd
@@ -10,7 +10,7 @@ import mikeio
 ```
 
 ```{python}
-ds = mikeio.read("../data/wind_north_sea.dfsu", items="Wind speed")
+ds = mikeio.read("../../data/wind_north_sea.dfsu", items="Wind speed")
 ds
 ```
 
@@ -84,13 +84,13 @@ with rasterio.open(
 ```
 
-![](../images/dfsu_grid_interp_tiff.png)
+![](../../images/dfsu_grid_interp_tiff.png)
 
 # Interpolate to other mesh
 
 Interpolate the data from this coarse mesh onto a finer resolution mesh
 
 ```{python}
-msh = mikeio.Mesh('../data/north_sea_2.mesh')
+msh = mikeio.Mesh("../../data/north_sea_2.mesh")
 msh
 ```
 
@@ -143,7 +143,7 @@ from mikeio._interpolation import get_idw_interpolant
 ```
 
 ```{python}
-dfs = mikeio.open("../data/wind_north_sea.dfsu")
+dfs = mikeio.open("../../data/wind_north_sea.dfsu")
 ```
 
 ```{python}
diff --git a/notebooks/README.md b/notebooks/README.md
index 61b4b8c4c..16ae2fbf6 100644
--- a/notebooks/README.md
+++ b/notebooks/README.md
@@ -1,4 +1,8 @@
-📢Example notebooks are moving to a new repo:
-https://github.com/DHI/mikeio-examples
+The notebooks in this folder are considered legacy and are being replaced by the documentation available at <https://dhi.github.io/mikeio/>.
+
+The contents are moving to either the [user guide](https://dhi.github.io/mikeio/user-guide/getting-started.html) or the [examples](https://dhi.github.io/mikeio/examples/) section of the documentation.
+
+On each page in the documentation, there is a link to download the page as a Jupyter notebook. ![download as jupyter](download_jupyter.png)
+
+If you find a notebook here that you would like to see in the documentation, please let us know by creating an issue in the [issue tracker](https://github.com/DHI/mikeio/issues).
 
-🔍 Search for examples here
diff --git a/notebooks/download_jupyter.png b/notebooks/download_jupyter.png
new file mode 100644
index 000000000..c8cddcde7
Binary files /dev/null and b/notebooks/download_jupyter.png differ
diff --git a/tests/testdata/NO_TS_MO_FINO1_202209.nc b/tests/testdata/NO_TS_MO_FINO1_202209.nc
new file mode 100755
index 000000000..d11e774cc
Binary files /dev/null and b/tests/testdata/NO_TS_MO_FINO1_202209.nc differ