Reformat processor descriptions #35

Open
wants to merge 7 commits into master
2 changes: 1 addition & 1 deletion .travis.yml
@@ -20,7 +20,7 @@ install:
- pip install -r dev-requirements.txt
- pip install -e .
- git clone -b 2.4.x https://github.com/GeoNode/geonode.git
- cp local_settings.py geonode/geonode/.
- cp local_settings.py.template geonode/geonode/local_settings.py
- pip install -e geonode

script:
50 changes: 25 additions & 25 deletions dataqs/airnow/airnow.py
@@ -50,31 +50,31 @@ class AirNowGRIB2HourlyProcessor(GeoDataMosaicProcessor):
"airnow_aqi_combined"]
img_patterns = ["", "_pm25", "_combined"]
layer_titles = ["Ozone", "PM25", "Combined Ozone & PM25"]
description = """
U.S. Environmental Protection Agencys (EPA) nationwide, voluntary program,
AirNow(www.airnow.gov), provides real-time air quality data and forecasts to
protect public health across the United States, Canada, and parts of Mexico.
AirNowreceives real-time ozone and PM2.5data from over 2,000 monitors and
collects air quality forecasts for more than 300 cities.

As part of the Global Earth Observation System of Systems (GEOSS)
(www.epa.gov/geoss) program, the AirNow API system broadens access to AirNowdata
and data products. AirNow API produces data products in several standard data
formats and makes them available via FTP and web services. This document
describes the GRIB2 file formats.

All data provided by AirNow API are made possible by the efforts of more than
120 local, state, tribal, provincial, and federal government agencies
(www.airnow.gov/index.cfm?action=airnow.partnerslist). These data are not fully
verified or validated and should be considered preliminary and subject to
change. Data and information reported to AirNow from federal, state, local, and
tribal agencies are for the express purpose of reporting and forecasting the
Air Quality Index (AQI). As such, they should not be used to formulate or
support regulation, trends, guidance, or any other government or public
decision making. Official regulatory air quality data must be obtained from
EPA’s Air Quality System (AQS) (www.epa.gov/ttn/airs/airsaqs). See the AirNow
Data Exchange Guidelines at http://airnowapi.org/docs/DataUseGuidelines.pdf.
"""
description = (
u"U.S. Environmental Protection Agency's (EPA) nationwide, voluntary "
u"program, AirNow(www.airnow.gov), provides real-time air quality data "
u"and forecasts to protect public health across the United States, "
u" Canada, and parts of Mexico. AirNow receives real-time ozone and "
u"PM2.5 data from over 2000 monitors and collects air quality forecasts"
u" for more than 300 cities.\n\nAs part of the Global Earth Observation"
u" System of Systems (GEOSS)(www.epa.gov/geoss) program, the AirNow API"
u" system broadens access to AirNowdata and data products. AirNow API "
u"produces data products in several standard data formats and makes "
u"them available via FTP and web services. This documen describes the "
u"GRIB2 file formats. All data provided by AirNow API are made "
u"possible by the efforts of more than 120 local, state, tribal, "
u"provincial, and federal government agencies (www.airnow.gov/index.cfm"
u"?action=airnow.partnerslist). These data are not fully verified or "
u"validated and should be considered preliminary and subject to change."
u" Data and information reported to AirNow from federal, state, local, "
u"and tribal agencies are for the express purpose of reporting and"
u" forecasting the Air Quality Index (AQI). As such, they should not be"
u" used to formulate or support regulation, trends, guidance, or any "
u"other government or public decision making. Official regulatory air "
u"quality data must be obtained from EPA’s Air Quality System (AQS) "
u"(www.epa.gov/ttn/airs/airsaqs). See the AirNow Data Exchange "
u"Guidelines at http://airnowapi.org/docs/DataUseGuidelines.pdf."
)

def download(self, auth_account=AIRNOW_ACCOUNT, days=1):
"""
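A note on the pattern used throughout this PR: adjacent string literals inside parentheses are concatenated by Python at compile time with no separator, so each fragment must end (or the next must begin) with a space. A minimal illustration of the pitfall (the strings here are illustrative only, not from the processors):

# Adjacent string literals are joined with nothing in between, so every
# fragment has to carry its own separating space.
good = (
    u"NASA-funded "
    u"experimental system")
bad = (
    u"NASA-funded"
    u"experimental system")
print(good)  # NASA-funded experimental system
print(bad)   # NASA-fundedexperimental system  <- words run together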
109 changes: 55 additions & 54 deletions dataqs/cmap/cmap.py
@@ -5,10 +5,10 @@
import shutil
from datetime import date
from ftplib import FTP

from time import sleep
from dateutil.relativedelta import relativedelta
from dataqs.processor_base import GeoDataMosaicProcessor, GS_DATA_DIR, \
GS_TMP_DIR
GS_TMP_DIR, RSYNC_WAIT_TIME
from dataqs.helpers import get_band_count, gdal_translate, \
nc_convert, style_exists, cdo_fixlng

@@ -30,57 +30,54 @@ class CMAPProcessor(GeoDataMosaicProcessor):
layer_name = 'cmap'
bounds = '-178.75,178.75,-88.75,88.75'
title = 'CPC Merged Analysis of Precipitation, 1979/01 - {}'
abstract = """The CPC Merged Analysis of Precipitation ("CMAP") is a
technique which produces pentad and monthly analyses of global precipitation
in which observations from raingauges are merged with precipitation estimates
from several satellite-based algorithms (infrared and microwave). The analyses
are are on a 2.5 x 2.5 degree latitude/longitude grid and extend back to 1979.
These data are comparable (but should not be confused with) similarly combined
analyses by the Global Precipitation Climatology Project which are described in
Huffman et al (1997).

It is important to note that the input data sources to make these analyses are
not constant throughout the period of record. For example, SSM/I (passive
microwave - scattering and emission) data became available in July of 1987;
prior to that the only microwave-derived estimates available are from the MSU
algorithm (Spencer 1993) which is emission-based thus precipitation estimates
are avaialble only over oceanic areas. Furthermore, high temporal resolution IR
data from geostationary satellites (every 3-hr) became available during 1986;
prior to that, estimates from the OPI technique (Xie and Arkin 1997) are used
based on OLR from polar orbiting satellites.

The merging technique is thoroughly described in Xie and Arkin (1997). Briefly,
the methodology is a two-step process. First, the random error is reduced by
linearly combining the satellite estimates using the maximum likelihood method,
in which case the linear combination coefficients are inversely proportional to
the square of the local random error of the individual data sources. Over global
land areas the random error is defined for each time period and grid location
by comparing the data source with the raingauge analysis over the surrounding
area. Over oceans, the random error is defined by comparing the data sources
with the raingauge observations over the Pacific atolls. Bias is reduced when
the data sources are blended in the second step using the blending technique of
Reynolds (1988). Here the data output from step 1 is used to define the "shape"
of the precipitation field and the rain gauge data are used to constrain the
amplitude.

Monthly and pentad CMAP estimates back to the 1979 are available from CPC ftp
server.

References:

Huffman, G. J. and co-authors, 1997: The Global Precipitation Climatology
Project (GPCP) combined data set. Bull. Amer. Meteor. Soc., 78, 5-20.

Reynolds, R. W., 1988: A real-time global sea surface temperature analysis. J.
Climate, 1, 75-86.

Spencer, R. W., 1993: Global oceanic precipitation from the MSU during 1979-91
and comparisons to other climatologies. J. Climate, 6, 1301-1326.

Xie P., and P. A. Arkin, 1996: Global precipitation: a 17-year monthly analysis
based on gauge observations, satellite estimates, and numerical model outputs.
Bull. Amer. Meteor. Soc., 78, 2539-2558.
"""
abstract = (
"The CPC Merged Analysis of Precipitation ('CMAP') is a technique which"
" produces pentad and monthly analyses of global precipitation in which"
" observations from raingauges are merged with precipitation estimates "
"from several satellite-based algorithms (infrared and microwave). The "
"analyses are on a 2.5 x 2.5 degree latitude/longitude grid and extend "
"back to 1979.\n\nThese data are comparable (but should not be confused"
" with) similarly combined analyses by the Global Precipitation "
"Climatology Project which are described in Huffman et al (1997).\n\n"
"It is important to note that the input data sources to make these "
"analyses are not constant throughout the period of record. For example"
", SSM/I (passive microwave - scattering and emission) data became "
"available in July of 1987; prior to that the only microwave-derived "
"estimates available are from the MSU algorithm (Spencer 1993) which is"
" emission-based thus precipitation estimates are avaialble only over "
" oceanic areas. Furthermore, high temporal resolution IR data from "
"geostationary satellites (every 3-hr) became available during 1986;"
" prior to that, estimates from the OPI technique (Xie and Arkin 1997) "
"are used based on OLR from polar orbiting satellites.\n\nThe merging "
"technique is thoroughly described in Xie and Arkin (1997). Briefly, "
"the methodology is a two-step process. First, the random error is "
"reduced by linearly combining the satellite estimates using the "
"maximum likelihood method, in which case the linear combination "
"coefficients are inversely proportional to the square of the local "
"random error of the individual data sources. Over global land areas "
"the random error is defined for each time period and grid location by "
"comparing the data source with the raingauge analysis over the "
"surrounding area. Over oceans, the random error is defined by "
"comparing the data sources with the raingauge observations over the "
"Pacific atolls. Bias is reduced when the data sources are blended in "
"the second step using the blending technique of Reynolds (1988). Here "
"the data output from step 1 is used to define the \"shape\" of the "
"precipitation field and the rain gauge data are used to constrain the "
"amplitude.\n\nMonthly and pentad CMAP estimates back to the 1979 are "
"available from CPC ftp server.\n\nSource: "
"http://www.esrl.noaa.gov/psd/data/gridded/data.cmap.html\n\nRaw data "
"file: ftp://ftp.cdc.noaa.gov/Datasets/cmap/enh/precip.mon.mean.nc"
"\n\nReferences:\n\nHuffman, G. J. and "
"co-authors, 1997: The Global Precipitation Climatology Project (GPCP) "
"combined data set. Bull. Amer. Meteor. Soc., 78, 5-20.\n\nReynolds, R."
" W., 1988: A real-time global sea surface temperature analysis. J. "
"Climate, 1, 75-86.\n\nSpencer, R. W., 1993: Global oceanic "
"precipitation from the MSU during 1979-91 and comparisons to other "
"climatologies. J. Climate, 6, 1301-1326.\n\nXie P., and P. A. Arkin, "
"1996: Global precipitation: a 17-year monthly analysis based on gauge "
"observations, satellite estimates, and numerical model outputs. Bull. "
"Amer. Meteor. Soc., 78, 2539-2558."
)
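The first merging step described in this abstract is an inverse-error-variance (maximum likelihood) weighted average. A rough sketch of that step only, under the stated assumption that weights are inversely proportional to the squared local random error; this is illustrative and not the actual CMAP implementation:

def combine_estimates(estimates, random_errors):
    # Maximum-likelihood linear combination: weight each satellite estimate
    # by 1 / (local random error)**2, then normalize.
    weights = [1.0 / (err ** 2) for err in random_errors]
    return sum(w * e for w, e in zip(weights, estimates)) / sum(weights)

# Two hypothetical sources for one grid cell (estimate in mm, error in mm):
print(combine_estimates([120.0, 100.0], [20.0, 10.0]))  # 104.0, nearer the lower-error source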

def download(self, url, tmp_dir=GS_TMP_DIR, filename=None):
if not filename:
@@ -122,6 +119,7 @@ def run(self):
cdf_file = self.convert(os.path.join(self.tmp_dir, ncfile))
bands = get_band_count(cdf_file)
img_list = self.get_mosaic_filenames(self.layer_name)
dst_files = []
for band in range(1, bands + 1):
band_date = re.sub('[\-\.]+', '', self.get_date(band).isoformat())
img_name = '{}_{}T000000000Z.tif'.format(self.layer_name, band_date)
@@ -136,7 +134,10 @@
os.makedirs(dst_dir)
if dst_file.endswith('.tif'):
shutil.move(os.path.join(self.tmp_dir, band_tif), dst_file)
self.post_geoserver(dst_file, self.layer_name)
dst_files.append(dst_file)
sleep(RSYNC_WAIT_TIME * 2)
for dst_file in dst_files:
self.post_geoserver(dst_file, self.layer_name, sleeptime=0)

if not style_exists(self.layer_name):
with open(os.path.join(script_dir,
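The run() change above batches the GeoServer updates: every granule GeoTIFF is moved into place first, the processor waits once for rsync, and only then registers each file with no per-file sleep. A simplified sketch of that pattern, assuming the RSYNC_WAIT_TIME constant and the post_geoserver signature shown in this diff:

from time import sleep

def publish_granules(processor, dst_files, wait_time):
    # Wait once for all files to reach the GeoServer data directory via
    # rsync, then register every granule without an additional sleep.
    sleep(wait_time * 2)
    for dst_file in dst_files:
        processor.post_geoserver(dst_file, processor.layer_name, sleeptime=0)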
5 changes: 0 additions & 5 deletions dataqs/cmap/resources/cmap.sld
@@ -9,11 +9,6 @@
<sld:Rule>
<sld:RasterSymbolizer>
<Opacity>1.0</Opacity>
<ChannelSelection>
<GrayChannel>
<SourceChannelName>{latest_band}</SourceChannelName>
</GrayChannel>
</ChannelSelection>
<ColorMap extended="true">
<sld:ColorMapEntry color="#2b83ba" label="0 mm" opacity="1.0" quantity="0"/>
<sld:ColorMapEntry color="#74b6ad" label="10 mm" opacity="1.0" quantity="10"/>
17 changes: 10 additions & 7 deletions dataqs/forecastio/forecastio_air.py
@@ -41,13 +41,16 @@ class ForecastIOAirTempProcessor(GeoDataMosaicProcessor):
prefix = "forecast_io_airtemp"
base_url = "http://maps.forecast.io/temperature/"
layer_name = "forecast_io_airtemp"
description = """Project Quicksilver is an experimental new data product
that attempts to create the world's highest resolution real-time map of global
(near-surface) air temperature.\n\n
It is generated using the same source data models that power Forecast.io,
combined with a sophisticated microclimate model that adjusts the temperatures
based on the effects of elevation, terrain, proximity to water, foliage cover,
and other factors.\n\nSource: http://blog.forecast.io/project-quicksilver/"""
description = (
"Project Quicksilver is an experimental new data product that attempts "
"to create the world's highest resolution real-time map of global "
"(near-surface) air temperature.\n\nIt is generated using the same "
"source data models that power Forecast.io, combined with a "
"sophisticated microclimate model that adjusts the temperatures based "
"on the effects of elevation, terrain, proximity to water, foliage "
"cover, and other factors.\n\nSource: "
"http://blog.forecast.io/project-quicksilver/"
)

def parse_name(self, img_date):
imgstrtime = img_date.strftime("%Y-%m-%d %H:00")
30 changes: 17 additions & 13 deletions dataqs/gdacs/gdacs.py
@@ -39,19 +39,23 @@ class GDACSProcessor(GeoDataProcessor):
prefix = "gdacs_alerts"
layer_title = 'Flood, Quake, Cyclone Alerts - GDACS'
params = {}
base_url = \
"http://www.gdacs.org/rss.aspx?profile=ARCHIVE&fromarchive=true&" + \
"from={}&to={}&alertlevel=&country=&eventtype=EQ,TC,FL&map=true"
description = """GDACS (Global Disaster and Alert Coordination System) is a
collaboration platform for organisations providing information on humanitarian
disasters. From a technical point of view, GDACS links information of all
participating organisations using a variety of systems to have a harmonized list
of data sources.In 2011, the GDACS platform was completely revised to collect,
store and distribute resources explicitly by events. The system matches
information from all organisations (by translating unique identifiers), and make
these resources available for GDACS users and developers in the form of GDACS
Platform Services. The GDACS RSS feed automatically include a list of available
resources.\n\nSource: http://www.gdacs.org/resources.aspx"""
base_url = (
"http://www.gdacs.org/rss.aspx?profile=ARCHIVE&fromarchive=true&"
"from={}&to={}&alertlevel=&country=&eventtype=EQ,TC,FL&map=true")
description = (
"GDACS (Global Disaster and Alert Coordination System) is a "
"collaboration platform for organisations providing information on "
"humanitarian disasters. From a technical point of view, GDACS links "
"information of all participating organisations using a variety of "
"systems to have a harmonized list of data sources.In 2011, the GDACS "
"platform was completely revised to collect, store and distribute "
"resources explicitly by events. The system matches information from "
"all organisations (by translating unique identifiers), and make these "
"resources available for GDACS users and developers in the form of "
"GDACS Platform Services. The GDACS RSS feed automatically include a "
"list of available resources.\n\nSource: "
"http://www.gdacs.org/resources.aspx"
)

def __init__(self, *args, **kwargs):
for key in kwargs.keys():
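For reference, the archive URL above is filled in with a start and an end date; the ISO date format used here is an assumption, since the processor's params setup is not part of this diff:

from datetime import date, timedelta

base_url = (
    "http://www.gdacs.org/rss.aspx?profile=ARCHIVE&fromarchive=true&"
    "from={}&to={}&alertlevel=&country=&eventtype=EQ,TC,FL&map=true")

edate = date.today()
sdate = edate - timedelta(days=1)
print(base_url.format(sdate.isoformat(), edate.isoformat()))
# ...&from=YYYY-MM-DD&to=YYYY-MM-DD&alertlevel=&country=&eventtype=EQ,TC,FL&map=true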
4 changes: 2 additions & 2 deletions dataqs/gdacs/tests.py
@@ -55,7 +55,7 @@ def test_download(self):
body=response)
rssfile = self.processor.download(self.processor.base_url.format(
self.processor.params['sdate'], self.processor.params['edate']),
self.processor.prefix + ".rss")
filename=self.processor.prefix + ".rss")
rsspath = os.path.join(
self.processor.tmp_dir, rssfile)
self.assertTrue(os.path.exists(rsspath))
@@ -74,7 +74,7 @@ def test_cleanup(self):
body=response)
rssfile = self.processor.download(self.processor.base_url.format(
self.processor.params['sdate'], self.processor.params['edate']),
self.processor.prefix + ".rss")
filename=self.processor.prefix + ".rss")
rsspath = os.path.join(
self.processor.tmp_dir, rssfile)
self.assertTrue(os.path.exists(rsspath))
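The switch to a keyword argument matters because, assuming GDACSProcessor.download takes a signature like CMAPProcessor.download shown earlier (url, tmp_dir=GS_TMP_DIR, filename=None), a second positional argument binds to tmp_dir rather than filename. A toy sketch of the difference:

class Downloader(object):
    def download(self, url, tmp_dir="/tmp", filename=None):
        return tmp_dir, filename

d = Downloader()
print(d.download("http://example.com/feed", "gdacs_alerts.rss"))
# ('gdacs_alerts.rss', None)  -- the name silently lands in tmp_dir
print(d.download("http://example.com/feed", filename="gdacs_alerts.rss"))
# ('/tmp', 'gdacs_alerts.rss')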
27 changes: 15 additions & 12 deletions dataqs/gfms/gfms.py
@@ -52,18 +52,21 @@ class GFMSProcessor(GeoDataProcessor):
layer_future = "gfms_latest"
layer_current = "gfms_current"
prefix = 'Flood_byStor_'
description = u"""The GFMS (Global Flood Management System) is a NASA-funded
experimental system using real-time TRMM Multi-satellite Precipitation Analysis
(TMPA) precipitation information as input to a quasi-global (50°N - 50°S)
hydrological runoff and routing model running on a 1/8th degree latitude /
longitude grid. Flood detection/intensity estimates are based on 13 years of
retrospective model runs with TMPA input, with flood thresholds derived for
each grid location using surface water storage statistics (95th percentile plus
parameters related to basin hydrologic characteristics). Streamflow,surface
water storage,inundation variables are also calculated at 1km resolution.
In addition, the latest maps of instantaneous precipitation and totals from the
last day, three days and seven days are displayed.
\n\nSource: http://eagle1.umd.edu/flood/"""
description = (
u"The GFMS (Global Flood Management System) is a NASA-funded"
u"experimental system using real-time TRMM Multi-satellite "
u"Precipitation Analysis (TMPA) precipitation information as input to a"
u"quasi-global (50°N - 50°S) hydrological runoff and routing model "
u"running on a 1/8th degree latitude /longitude grid. Flood detection/"
u"intensity estimates are based on 13 years of retrospective model runs"
u"with TMPA input, with flood thresholds derived for each grid location"
u" using surface water storage statistics (95th percentile plus "
u"parameters related to basin hydrologic characteristics). Streamflow,"
u" surface water storage,inundation variables are also calculated at 1"
u"km resolution. In addition, the latest maps of instantaneous "
u"precipitation and totals from the last day, three days and seven days"
u" are displayed.\n\nSource: http://eagle1.umd.edu/flood/"
)

def get_latest_future(self):
"""