
Dev echowave #32

Merged (15 commits, Oct 13, 2024)
8 changes: 4 additions & 4 deletions .github/workflows/tests.yml
@@ -18,8 +18,8 @@ jobs:
shell: bash -l {0}
strategy:
matrix:
os: ["ubuntu", "macos", "windows"]
python-version: ['3.9', '3.10', '3.11' ]
os: ["ubuntu", "windows"]
python-version: ['3.10', '3.11', '3.12' ]
steps:
- uses: actions/checkout@v2
- uses: conda-incubator/setup-miniconda@v2
@@ -40,7 +40,7 @@ jobs:
python setup.py bdist_wheel

- name: Upload wheels
if: matrix.python-version == '3.10' && matrix.os == 'ubuntu'
if: matrix.python-version == '3.12' && matrix.os == 'ubuntu'
uses: actions/upload-artifact@v4
with:
name: wheels
@@ -57,7 +57,7 @@ jobs:
name: wheels
- uses: conda-incubator/setup-miniconda@v2
with:
python-version: '3.10'
python-version: '3.12'
miniforge-version: latest
miniforge-variant: Miniforge3
- name: Publish to PyPi
121 changes: 96 additions & 25 deletions docs/index.rst
@@ -6,17 +6,24 @@
Welcome to metocean-api's documentation!
========================================

**metocean-api** is a Python tool to extract time series of metocean data from global/regional/coastal hindcasts/reanalysis.

The package contains functions to extract time series to csv-format from:
* `NORA3`_ hindcast dataset
* `ERA5`_ reanalysis dataset

.. _NORA3: https://marine.met.no/node/19
.. _ERA5: https://doi.org/10.24381/cds.adbb2d47
**metocean-api** is a Python tool designed to extract time series of metocean (meteorological and oceanographic) data from a variety of sources, including global, regional, and coastal hindcasts and reanalyses.
The extracted data can be saved in CSV or netCDF format for further analysis and usage.
Refer to the section below for more information about the available datasets and variables.

Installing **metocean-api**
=============================================
Quick installation
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
.. code-block:: bash

$ pip install metocean-api

or

.. code-block:: bash

$ conda install -c conda-forge metocean-api

Alternative 1: Using Mambaforge (alternative to Miniconda)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

@@ -65,65 +72,125 @@ This section documents the **ts-module**. The ts-object is initialized with the
start_time='2000-01-01', end_time='2000-03-31' ,
product='NORA3_wind_wave')


Available Datasets in metocean-api
=====================================

Several options for **product** are available. Please check the data catalog for the time coverage:

* For wind NORA3 hourly data in 10, 20, 50, 100, 250, 500, 750m (Nordic Area): product='NORA3_wind_sub'
* For wind NORA3 hourly data in 10, 20, 50, 100, 250, 500, 750m (Nordic Area) developed by MET Norway:

.. code-block:: python

product='NORA3_wind_sub'

Dataset: https://thredds.met.no/thredds/catalog/nora3_subset_atmos/wind_hourly/catalog.html

* For atmospheric (pressure, temperature, precipitation, humidity, radiation) NORA3 hourly surface data (Nordic Area): product='NORA3_atm_sub'
* For atmospheric (pressure, temperature, precipitation, humidity, radiation) NORA3 hourly surface data (Nordic Area) developed by MET Norway:

.. code-block:: python

product='NORA3_atm_sub'

Dataset: https://thredds.met.no/thredds/catalog/nora3_subset_atmos/atm_hourly/catalog.html

* For SST and atmospheric (wind, temperature, relative humidity, TKE, air density) NORA3 3-hourly data in 50, 100, 150, 200, 300m (Nordic Area): product='NORA3_atm3hr_sub'
* For SST and atmospheric (wind, temperature, relative humidity, TKE, air density) NORA3 3-hourly data in 50, 100, 150, 200, 300m (Nordic Area) developed by MET Norway:

.. code-block:: python

product='NORA3_atm3hr_sub'

Dataset: https://thredds.met.no/thredds/catalog/nora3_subset_atmos/atm_3hourly/catalog.html

* For wave NORA3 sub data (Nordic Seas): product='NORA3_wave_sub'
* For wave NORA3 sub data (Nordic Seas) developed by MET Norway:

.. code-block:: python

product='NORA3_wave_sub'

Dataset: https://thredds.met.no/thredds/catalog/nora3_subset_wave/wave_tser/catalog.html

* For combined wind and wave NORA3 sub data (Nordic Seas): product='NORA3_wind_wave'
* For combined wind and wave NORA3 sub data (Nordic Seas) developed by MET Norway:

.. code-block:: python

product='NORA3_wind_wave'

* For wave NORA3 data (Nordic Seas + Arctic): product='NORA3_wave'
* For wave NORA3 data (Nordic Seas + Arctic) developed by MET Norway:

.. code-block:: python

product='NORA3_wave'

Dataset: https://thredds.met.no/thredds/catalog/windsurfer/mywavewam3km_files/catalog.html

* For sea level NORA3 data (Nordic Seas): product='NORA3_stormsurge'
* For sea level NORA3 data (Nordic Seas) developed by MET Norway:

.. code-block:: python

product='NORA3_stormsurge'

Dataset: https://thredds.met.no/thredds/catalog/stormrisk/catalog.html

* For coastal wave NORA3 data: product='NORAC_wave'
* For coastal wave NORA3 data developed by MET Norway:

.. code-block:: python

product='NORAC_wave'

Dataset: https://thredds.met.no/thredds/catalog/norac_wave/field/catalog.html

* For ocean data (sea level, temperature, currents, salinity over depth) Norkyst800 data (from 2016-09-14 to today): product='NORKYST800'
* For ocean data (sea level, temperature, currents, salinity over depth) Norkyst800 data (from 2016-09-14 to today) developed by MET Norway:

.. code-block:: python

product='NORKYST800'

Dataset: https://thredds.met.no/thredds/fou-hi/norkyst800v2.html

* For ocean data (sea level, temperature, currents, salinity over depth) NorkystDA data (for 2017-2018): product='NorkystDA_zdepth' or product='NorkystDA_surface' (for only surface data)
* For ocean data (sea level, temperature, currents, salinity over depth) NorkystDA data (for 2017-2018) developed by MET Norway:

.. code-block:: python

product='NorkystDA_zdepth'    # full depth profiles
# or, for surface data only:
product='NorkystDA_surface'

Dataset: https://thredds.met.no/thredds/catalog/nora3_subset_ocean/catalog.html

* For global reanalysis ERA5 (wind and waves): product='ERA5'
* For global reanalysis ERA5 (wind and waves) developed by ECMWF:

.. code-block:: python

product='ERA5'

The user needs to configure the *CDS API key* as described at https://cds.climate.copernicus.eu/api-how-to.
Dataset: https://doi.org/10.24381/cds.adbb2d47
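The *CDS API key* is typically stored in a ``~/.cdsapirc`` file in the user's home directory; a sketch of the expected layout, with placeholder values (consult the link above for the current URL and your personal key):

```text
url: https://cds.climate.copernicus.eu/api/v2
key: <UID>:<API-key>
```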

* For global reanalysis/historical GTSM (storm surge, tidal elevation, total water level): product='GTSM'
* For Global Tide and Surge Model (storm surge, tidal elevation, total water level) developed by Deltares:

.. code-block:: python

product='GTSM'

The user needs to configure the *CDS API key* as described at https://cds.climate.copernicus.eu/api-how-to.

Dataset: https://doi.org/10.24381/cds.a6d42d60

* For ECHOWAVE (European COasts High Resolution Ocean WAVEs Hindcast) developed by Marine Renewable Energies Lab (MREL), TU Delft:

.. code-block:: python

product='ECHOWAVE'

Dataset: https://doi.org/10.4121/f359cd0f-d135-416c-9118-e79dccba57b9.v1, Publication: https://doi.org/10.1016/j.renene.2024.121391

* For wave buoy observations (Statens vegvesen - E39): product='E39_letter_location_wave', e.g.,

.. code-block:: python

product='E39_B_Sulafjorden_wave'

Dataset: https://thredds.met.no/thredds/catalog/obs/buoy-svv-e39/catalog.html

Import data
=====================================
Import data from the server into the **ts-object** and save it as csv:

.. code-block:: python
@@ -146,6 +213,10 @@ To import data from a local csv-file to **ts-object**:
.. image:: ts.data0.png
:width: 900


Combine csv-files
=====================================

To combine several csv-files produced by **metocean-api**:

.. code-block:: python
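Under the hood the combine step amounts to a concatenation along the time index; a self-contained pandas sketch with synthetic files (the file and column names here are illustrative, not the package's actual csv layout):

```python
import pandas as pd

# Write two small synthetic csv files standing in for metocean-api output
for year in (2000, 2001):
    idx = pd.date_range(f"{year}-01-01", periods=3, freq="h")
    pd.DataFrame({"hs": [1.0, 1.2, 1.1]}, index=idx).to_csv(f"nora3_{year}.csv")

# Read them back and combine into one continuous time series
frames = [
    pd.read_csv(f"nora3_{y}.csv", index_col=0, parse_dates=True)
    for y in (2000, 2001)
]
combined = pd.concat(frames).sort_index()
combined.to_csv("nora3_combined.csv")
print(len(combined))
# 6
```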
9 changes: 5 additions & 4 deletions examples/example_import_NORA3.py
@@ -2,8 +2,9 @@


# Define TimeSeries-object
df_ts = ts.TimeSeries(lon=3.73, lat=64.60,start_time='1990-01-01', end_time='2020-12-31' , product='NORA3_wind_wave')
#df_ts = ts.TimeSeries(lon=1.320, lat=53.324,start_time='2021-01-01', end_time='2021-01-15' , product='NORA3_wind_sub')
df_ts = ts.TimeSeries(lon=6.727, lat=65.064,start_time='2024-01-31', end_time='2024-02-01' , product='NORA3_wave')
#df_ts = ts.TimeSeries(lon=3.098, lat=52.48,start_time='2017-01-19', end_time='2017-02-20', product='ECHOWAVE')
#df_ts = ts.TimeSeries(lon=1.320, lat=53.324,start_time='2021-01-14', end_time='2021-01-15' , product='NORA3_wind_sub')
#df_ts = ts.TimeSeries(lon=1.320, lat=53.324,start_time='2021-01-01', end_time='2021-03-31' , product='NORA3_wave_sub')
#df_ts = ts.TimeSeries(lon=1.320, lat=53.324,start_time='2000-01-01', end_time='2001-03-31' , product='NORA3_stormsurge')
#df_ts = ts.TimeSeries(lon=1.320, lat=53.324,start_time='2021-01-01', end_time='2021-03-31' , product='NORA3_atm_sub')
@@ -12,10 +13,10 @@
#df_ts = ts.TimeSeries(lon=3.73, lat=64.60,start_time='2017-01-19', end_time='2017-01-20' , product='NorkystDA_zdepth')
#df_ts = ts.TimeSeries(lon=3.73, lat=64.60,start_time='2017-01-19', end_time='2017-01-20' , product='NorkystDA_surface')


# Import data from thredds.met.no and save it as csv
df_ts.import_data(save_csv=True)

# Load data from a local csv-file
#df_ts.load_data(local_file=df_ts.datafile)
df_ts.load_data(local_file=df_ts.datafile)


27 changes: 22 additions & 5 deletions metocean_api/ts/aux_funcs.py
@@ -127,19 +127,27 @@ def get_url_info(product, date):
infile = 'https://thredds.met.no/thredds/dodsC/nora3_subset_ocean/zdepth/{}/ocean_zdepth_2_4km-{}.nc'.format(date.strftime('%Y/%m'), date.strftime('%Y%m%d'))
x_coor_str = 'x'
y_coor_str = 'y'
elif product == 'ECHOWAVE':
infile = 'https://opendap.4tu.nl/thredds/dodsC/data2/djht/f359cd0f-d135-416c-9118-e79dccba57b9/1/{}/TU-MREL_EU_ATL-2M_{}.nc'.format(date.strftime('%Y'),date.strftime('%Y%m'))
x_coor_str = 'longitude'
y_coor_str = 'latitude'
else:
raise ValueError(f'Product not found {product}')
print(infile)
return x_coor_str, y_coor_str, infile

def get_date_list(product, start_date, end_date):
from datetime import datetime
if product == 'NORA3_wave' or product == 'ERA5' or product.startswith('NorkystDA'):
return pd.date_range(start=start_date , end=end_date, freq='D')
elif product == 'NORA3_wave_sub':
return pd.date_range(start=start_date , end=end_date, freq='MS')
return pd.date_range(start=datetime.strptime(start_date, '%Y-%m-%d').strftime('%Y-%m') , end=datetime.strptime(end_date, '%Y-%m-%d').strftime('%Y-%m'), freq='MS')
elif product == 'ECHOWAVE':
return pd.date_range(start=datetime.strptime(start_date, '%Y-%m-%d').strftime('%Y-%m') , end=datetime.strptime(end_date, '%Y-%m-%d').strftime('%Y-%m'), freq='MS')
elif product == 'NORA3_wind_sub' or product == 'NORA3_atm_sub' or product == 'NORA3_atm3hr_sub':
return pd.date_range(start=start_date , end=end_date, freq='MS')
return pd.date_range(start=datetime.strptime(start_date, '%Y-%m-%d').strftime('%Y-%m') , end=datetime.strptime(end_date, '%Y-%m-%d').strftime('%Y-%m'), freq='MS')
elif product == 'NORAC_wave':
return pd.date_range(start=start_date , end=end_date, freq='MS')
return pd.date_range(start=datetime.strptime(start_date, '%Y-%m-%d').strftime('%Y-%m') , end=datetime.strptime(end_date, '%Y-%m-%d').strftime('%Y-%m'), freq='MS')
elif product == 'NORA3_stormsurge':
return pd.date_range(start=start_date , end=end_date, freq='YS')
elif product == 'NORKYST800':
@@ -150,7 +158,9 @@ def get_date_list(product, start_date, end_date):

def drop_variables(product: str):
if product == 'NORA3_wave':
return ['projection_ob_tran','longitude','latitude']
drop_var = ['projection_ob_tran','longitude','latitude']
elif product == 'ECHOWAVE':
drop_var = ['longitude','latitude']
elif product == 'NORA3_wave_sub':
return ['longitude','latitude','rlat','rlon']
elif product == 'NORA3_wind_sub':
@@ -206,6 +216,12 @@ def get_near_coord(infile, lon, lat, product):
lat_near = ds.lat_rho.sel(eta_rho=eta_rho, xi_rho=xi_rho).values[0][0]
x_coor = eta_rho
y_coor = xi_rho
elif product=='ECHOWAVE':
ds_point = ds.sel(longitude=lon,latitude=lat,method='nearest')
lon_near = ds_point.longitude.values
lat_near = ds_point.latitude.values
x_coor = lon_near
y_coor = lat_near
elif product=='NORKYST800':
x, y = find_nearest_cartCoord(ds.lon, ds.lat, lon, lat)
lon_near = ds.lon.sel(Y=y, X=x).values[0][0]
@@ -223,7 +239,8 @@
print('Found nearest: lon.='+str(lon_near)+',lat.=' + str(lat_near))
return x_coor, y_coor, lon_near, lat_near
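The ECHOWAVE branch of get_near_coord above leans on xarray's nearest-neighbour selection along regular longitude/latitude coordinates; a self-contained sketch on a toy grid (the variable `hs` is just a stand-in):

```python
import numpy as np
import xarray as xr

# Toy dataset on a regular 1-degree lon/lat grid
ds = xr.Dataset(
    {"hs": (("latitude", "longitude"), np.zeros((5, 5)))},
    coords={
        "latitude": np.linspace(50.0, 54.0, 5),
        "longitude": np.linspace(0.0, 4.0, 5),
    },
)

# Nearest grid point to the requested position, as in the ECHOWAVE branch
ds_point = ds.sel(longitude=2.2, latitude=52.7, method="nearest")
lon_near = float(ds_point.longitude)
lat_near = float(ds_point.latitude)
print(lon_near, lat_near)
# 2.0 53.0
```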

def create_dataframe(product,ds: xr.Dataset, lon_near, lat_near,outfile,variable, start_time, end_time, save_csv=True,save_nc=True, height=None, depth = None):

def create_dataframe(product,ds, lon_near, lat_near,outfile,variable, start_time, end_time, save_csv=True,save_nc=True, height=None, depth = None):
if product=='NORA3_wind_sub':
for i in range(len(height)):
variable_height = [k + '_'+str(height[i])+'m' for k in variable]
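The monthly (`freq='MS'`) branches added to get_date_list above first truncate both endpoints to year-month, so a start or end date in the middle of a month still pulls in that month's file; a small standalone sketch of that behaviour:

```python
from datetime import datetime

import pandas as pd

start_date, end_date = '2021-01-14', '2021-03-20'

# Truncate both endpoints to 'YYYY-MM' and build month-start stamps,
# mirroring the monthly branches of get_date_list
dates = pd.date_range(
    start=datetime.strptime(start_date, '%Y-%m-%d').strftime('%Y-%m'),
    end=datetime.strptime(end_date, '%Y-%m-%d').strftime('%Y-%m'),
    freq='MS',
)
print(list(dates.strftime('%Y-%m-%d')))
# ['2021-01-01', '2021-02-01', '2021-03-01']
```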
47 changes: 23 additions & 24 deletions metocean_api/ts/read_metno.py
@@ -48,15 +48,6 @@ def NORAC_ts(ts: TimeSeries, save_csv = False, save_nc = False, use_cache =False

return df


def __clean_cache(tempfiles):
for tmpfile in tempfiles:
try:
os.remove(tmpfile)
except PermissionError:
print(f"Skipping deletion of {tmpfile} due to PermissionError")


def NORA3_wind_wave_ts(ts: TimeSeries, save_csv = False, save_nc = False, use_cache =False):
"""
Extract time series of the nearest grid point (lon, lat) from
@@ -400,21 +391,6 @@ def NorkystDA_zdepth_ts(ts: TimeSeries, save_csv = False,save_nc = False, use_ca

return df

def __tempfile_dir(product,lon,lat, date_list,dirName):
tempfile = [None] *len(date_list)
# Create directory
try:
# Create target Directory
os.mkdir(dirName)
print("Directory " , dirName , " Created ")
except FileExistsError:
print("Directory " , dirName , " already exists")

for i in range(len(date_list)):
tempfile[i] = str(Path(dirName+"/"+product+"_"+"lon"+str(lon)+"lat"+str(lat)+"_"+date_list.strftime('%Y%m%d')[i]+".nc"))

return tempfile


def OBS_E39(ts: TimeSeries, save_csv = False, save_nc = False, use_cache =False):
"""
@@ -452,3 +428,26 @@ def OBS_E39(ts: TimeSeries, save_csv = False, save_nc = False, use_cache =False)
print('Data saved at: ' +ts.datafile)

return df


def __clean_cache(tempfiles):
for tmpfile in tempfiles:
try:
os.remove(tmpfile)
except PermissionError:
print(f"Skipping deletion of {tmpfile} due to PermissionError")

def __tempfile_dir(product,lon,lat, date_list,dirName):
tempfile = [None] *len(date_list)
# Create directory
try:
# Create target Directory
os.mkdir(dirName)
print("Directory " , dirName , " Created ")
except FileExistsError:
print("Directory " , dirName , " already exists")

for i in range(len(date_list)):
tempfile[i] = str(Path(dirName+"/"+product+"_"+"lon"+str(lon)+"lat"+str(lat)+"_"+date_list.strftime('%Y%m%d')[i]+".nc"))

return tempfile