
Precipitation Validation
Hydrology Training Workshop, University of Hamburg
Chris Kidd …and many others…

Overview
• Precipitation characteristics
• Surface measurements: gauges, radar
• Validation: case study – the European IPWG site
• Experiences – other analysis
• Results – statistical dependency
• Conclusions
Hydrology Training Workshop: University of Hamburg, 12-14 October 2010

Why? – essentially to improve estimates
(2008 floods; 2009 floods)

UK Midlands: 20 July 2007

Precipitation Characteristics
• The 'modal' instantaneous precipitation value is zero
• Rain intensities are skewed towards zero: at middle to high latitudes, heavily so!
• Spatial/temporal accumulations will 'normalise' the data
• 1 mm of rain ≡ 1 l m⁻² or 1 kg m⁻² (or 1000 t km⁻²)
[Figure: occurrence vs accumulation distributions]
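The unit equivalence in the last bullet can be checked with a few lines of arithmetic. This is an illustrative sketch; the function name is my own:

```python
def rain_mass_kg(depth_mm, area_m2):
    """Mass of rainfall of a given depth (mm) over a given area (m^2).

    1 mm of depth over 1 m^2 is 1 litre of water, i.e. ~1 kg.
    """
    depth_m = depth_mm / 1000.0
    volume_m3 = depth_m * area_m2
    return volume_m3 * 1000.0  # water density ~1000 kg/m^3

print(rain_mass_kg(1.0, 1.0))    # 1 mm over 1 m^2: ~1 kg
print(rain_mass_kg(1.0, 1.0e6))  # 1 mm over 1 km^2: ~1e6 kg, i.e. 1000 t
```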

Surface measurement
• Clee Hill radars (C-band vs ATC)
• Micro rain radar
• 0.2 mm/tip gauge
• ARG100 gauge (0.1 mm/tip)
• Young's gauge

Conventional measurements
Gauge data (rain/snow)
• Simple measurements of accumulations
• Quantitative sampling (tipping bucket gauges etc.)
• But: point measurements, under-catch errors, etc.
Radar systems
• Backscatter from hydrometeors (rain/snow/hail)
• Spatial measurements
• Potential to discriminate between precipitation types
Precipitation is highly variable both temporally and spatially: measurements need to be representative

Conventional Observations
• ~20,000 rain gauges
• Radar duplicates rain-gauge coverage
Precipitation is highly variable both temporally and spatially. Measurements need to be representative.

Variance explained by nearest station (Jürgen Grieser)
Variance based upon monthly data: shorter periods = lower explained variance

What is truth? Co-located 8 gauges / 4 MRRs

1st gauge…

2nd gauge…

2 more gauges

All gauges

plus the MRR…

Radar vs Gauge measurements
Cumulative rainfall: radar vs gauge is reasonable – but not quite 1:1
10 June 2009: 40 mm in 30 mins (MRR 24.1 GHz; tipping-bucket gauge)
Tipping bucket gauges provide quantised measurements (0.1 or 0.2 mm/tip)
MRR critical for light rainfall
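The quantisation noted above follows directly from how a tipping-bucket gauge works: rainfall is only registered in whole tips. A minimal sketch (the function name is my own; the 40 mm in 30 mins event is from the slide):

```python
def tips_to_rate_mm_per_h(n_tips, tip_size_mm, interval_min):
    """Rain rate implied by n_tips bucket tips over a sampling interval."""
    return n_tips * tip_size_mm / (interval_min / 60.0)

# The 10 June 2009 event: 40 mm in 30 mins needs 200 tips of a 0.2 mm/tip gauge
print(tips_to_rate_mm_per_h(200, 0.2, 30))  # ~80 mm/h

# Quantisation: over a 15-min window a 0.2 mm/tip gauge cannot resolve
# rates below one tip (~0.8 mm/h) - hence the MRR for light rainfall
print(tips_to_rate_mm_per_h(1, 0.2, 15))    # ~0.8 mm/h
```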

Clee-Hill ATC radar and C-band; Chilbolton C-band; University of Helsinki C-band

National network radars: Doppler, dual polarised; 100/210 km range

Gauge data; radar (daily integrated); radar vs gauge data

Helsinki Testbed (FMI, Helsinki)
• Cold season – surface issues & mixed-phase precipitation to surface
• Circles: 4 operational Doppler weather radars (FMI & EMHI), 1 dual-pol radar + 1 vertically pointing C-band radar for research (Vaisala & UH), 2 vertically pointing POSS radars
• Dots: 80 gauges
• Big diamonds: FD12P optical scatterometers
• Triangles: ultrasonic snow depth
• Squares: weighing gauges

Ground validation – IPWG synergies
Criteria | GV program | IPWG
Type of validation | Priority on physical, also statistical | Has focused on descriptive and statistical
Source of validation data | Arranged for and collected by principal investigators | Doesn't request; IPWG participants free to contribute
Source of observational data | Specific satellite-based data products; participants provide products directly to validation groups |
Types of validation data | Gauge, radars and specialist instrumentation, diverse in specific locations | Conventional gauge and/or radar networks, usually part of a national network
Types of observational data | Single-sensor, instantaneous, full-resolution products | Blended satellite sensor products, time/area-averaged datasets
GV = Ground Validation. After Turk & Arkin, BAMS 2008.
Both approaches are complementary

Summary: surface measurements
Representativeness of surface measurements:
• Over land: generally good, but variable
• Over oceans: virtually non-existent
Measurement issues:
• Physical collection interferes with the measurement (e.g. wind effects, frozen precip, etc.)
• Radar: imprecise backscatter-to-rainfall relationship (also clutter, range effects, etc.)
Satellites offer consistent, regular measurements, global coverage and real-time delivery of data

Satellite Data Sets

Observation availability
Spectral region | Availability | Cycle (current) | Res.*
Visible | Since start of satellite era | Geostationary 15/30 mins; polar orbiters 6-hourly | 250 m+
Infrared | Shortly after start of satellite era; calibrated since ~1979 | Geostationary 15/30 mins; polar orbiters 6-hourly | 1 km+
Passive microwave | Experimental 1972/1975; uncalibrated since 1978; calibrated since 1987 | Polar orbiters 6-hourly + low-Earth orbiter (TMI) | 4 km+
Active microwave | 13.8 GHz since 1997 (radar); 94 GHz since 2006 | Low-Earth orbiter (PR); polar orbiter (CloudSat) | 4 km / 1.5 km
* Resolutions vary greatly with scan angle, frequency, sensor etc.

Satellite observational scales
Observations made nominally at 1 km/15 mins; estimates possible at 1 km/1 min but inaccurate
[Figure: observation scales – precipitation systems vs Earth-resources satellites (Ikonos, Spot, Landsat, MODIS); Vis/IR/MW; LEO and GEO, from rapid scan and 15 minutes to 3 hours]
Precipitation products generally available at 0.25 degree daily, or 0.25 degree 3-hourly
Accuracy of satellite precipitation estimates improves with temporal/spatial averaging

LEO vs GEO satellite observations
Low-Earth Orbit sensors: SSM/I and TRMM
Geostationary sensors: Meteosat / MSG

Observations to Products
Data inputs: visible, infrared, passive MW, active MW; model outputs
Observations → retrievals → products
Resolutions (time/space): from instantaneous, full resolution to monthly/seasonal climate resolution
Products: climatology, agriculture/crops, meteorology, hydrology

Global precipitation data sets
Many different products at different spatial/temporal resolutions … and formats!

Setting up a validation site

IPWG European validation
• Radar used as 'ground truth'
• Composite of radars over UK, France, Germany, Belgium and Netherlands
• Nominal 5 km resolution
• Equal-area polar-stereographic projection
• Data and product ingest
• Near real-time
• Statistical and graphical output (SGI/Irix; f77/netpbm)

Processing setup
Perceived requirements:
• Daily inter-comparison → 00Z-24Z (also 06Z, 09Z, 12Z)
• 0.25 degree resolution → 25 km resolution
• Real-time → near real-time, dependent upon product
• Validation data → radar data (gauge being added later)
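Mapping a nominal 5 km radar grid to the 25 km (~0.25 degree) comparison resolution is, in the simplest case, a block average. A sketch with NumPy, assuming grid dimensions divide evenly (the real processing uses look-up tables and a polar-stereographic projection):

```python
import numpy as np

def block_average(field, factor):
    """Average a 2-D field over factor x factor blocks, e.g. 5 km -> 25 km."""
    ny, nx = field.shape
    if ny % factor or nx % factor:
        raise ValueError("grid dimensions must divide evenly by factor")
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

# Toy skewed "rain field" on a 100x100 grid of 5 km pixels
radar_5km = np.random.default_rng(0).gamma(0.5, 2.0, size=(100, 100))
radar_25km = block_average(radar_5km, 5)  # shape (20, 20)
```

Block averaging conserves the domain-mean rainfall, which is exactly the property a 0.25 degree comparison needs.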

Processing Schedule
01Z: global IR
02Z: SSM/I data (GPI, FDA)
03Z-05Z: European radar data, ECMWF, PMIR, 3B4x, CICS data
Statistics at 20 km
22Z: EUMETSAT MPE
→ web pages

Processing system
Initial setup: setting of dates; cleaning out old/decayed data
Acquiring data: searching existing data; listing missing data; creation of .netrc file; ftp data sources
Remapping of data: … to regional grid or 5 km PSG projection …
Results generation: statistical analysis; graphical output
Web pages: generate HTML files; copying to server

Processing checks (pseudocode)
  set d0 = today
  foreach day (d0 … d0-31): dn = dn+1
  foreach datasource (s0 … sn):
    foreach product (p1 … pn):
      foreach day (d0 … d0-31):
        if product for day does not exist: add to .netrc file
    ftp datasource (4 K)
  foreach product & day:
    remap to PSG using LUTs & standardise format
    standardise filename
  foreach product & day:
    generate statistics
    generate plots
  foreach product & day:
    generate HTML files
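The "check the last 31 days, list what is missing, then fetch" loop on this slide can be sketched in a few lines. The directory layout, product names and filename convention below are assumptions for illustration only:

```python
import datetime as dt
from pathlib import Path

# Hypothetical products and data root - not the real site's names
PRODUCTS = ["pmir", "mpe", "3b4x"]
DATA_DIR = Path("data")

def missing_files(days_back=31):
    """List product files absent for each of the last days_back+1 days.

    In the real system each missing file would be appended to the
    .netrc-driven ftp fetch list.
    """
    today = dt.date.today()
    missing = []
    for product in PRODUCTS:
        for d in range(days_back + 1):
            day = today - dt.timedelta(days=d)
            f = DATA_DIR / product / f"{product}_{day:%Y%m%d}.dat"
            if not f.exists():
                missing.append(f)
    return missing
```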

Processing checks (annotated)
• Sets up list of past dates/days – usually okay; sometimes needs tweaking
• Checks for a product's results – okay if no results, but not if bad data
• FTP runs several times – macros on 4 K buffer limit
• Prepares products into common format – usually okay…
• Generates outputs – okay if there is rain…
• Generates raw HTML – occasional issues with server
Automated systems they are NOT!


"Standard" layout
• Validation data | precipitation product
• Occurrence comparison | accumulation comparison
• Contingency tables: POD/FAR/HSS
• Scatterplot | descriptive statistics | cumulative distribution
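The contingency-table scores named above are standard verification statistics. A sketch of how they follow from daily rain/no-rain counts (hits, misses, false alarms, correct negatives):

```python
def skill_scores(hits, misses, false_alarms, correct_negatives):
    """POD, FAR and Heidke skill score from a 2x2 rain/no-rain contingency table."""
    n = hits + misses + false_alarms + correct_negatives
    pod = hits / (hits + misses)                 # fraction of observed rain detected
    far = false_alarms / (hits + false_alarms)   # fraction of detections that were wrong
    # Number of correct outcomes expected by chance, for the Heidke skill score
    expected = ((hits + misses) * (hits + false_alarms) +
                (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
    hss = (hits + correct_negatives - expected) / (n - expected)
    return pod, far, hss
```

POD and FAR range over [0, 1]; HSS is 1 for a perfect product and 0 for one no better than chance.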

PMIR results: Europe, 2009-01-11

PMIR results: Australia, 2008-12-25

Results: snow problems

Results: rain extent

IPWG inter-comparison regions
Near real-time intercomparison of model & satellite estimates vs radar/gauge
IPWG – International Precipitation Working Group (WMO/CGMS)

Monthly and seasonal validation
Monthly and seasonal diagnostic validation summaries

Validation resolution
At full resolution the correlation of estimated rain is low; averaging over time and space improves the picture.
Fine-scale data is generated so users get to decide on the averaging strategy.
[Figure: correlation vs averaging scale (3-hour, 5-day, month); VAR vs HQ (mm/hr), Feb. 2002, 30°N-S; Huffman 2/10]

Resolution vs statistics
Statistical performance can be improved just by smoothing the data!
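This effect is easy to reproduce: pixel-scale noise averages out much faster than the underlying large-scale rain signal, so correlation rises with the averaging scale. A toy demonstration with synthetic fields (not real radar data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Smooth large-scale "truth" on a 100x100 grid
x = np.linspace(0, 4 * np.pi, 100)
truth = np.outer(np.sin(x) + 1, np.cos(x) + 1)

# "Estimate": truth plus heavy pixel-scale noise
estimate = truth + rng.normal(0.0, 1.0, truth.shape)

def block_mean(f, k):
    """Average a square field over k x k blocks."""
    n = f.shape[0] // k * k
    f = f[:n, :n]
    return f.reshape(n // k, k, n // k, k).mean(axis=(1, 3))

def corr(a, b):
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

r_full = corr(truth, estimate)                          # full resolution
r_10 = corr(block_mean(truth, 10), block_mean(estimate, 10))  # 10x coarser
print(r_full, r_10)  # correlation rises after averaging
```

Nothing about the estimate improved; only the comparison scale changed, which is exactly the caution this slide raises.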

Validation through Hydrology
Bacchiglione (1200 km²) and Posina (116 km²) – Anagnostou & Hossain
PMIR: 4 km/30 min; 3B42RT: 1 deg/3 hr
[Figure: response at 0.5, 1, 2, 4, 8 and 16 km resolution; high: 57.9, low: 1.6]
Applications are resolution critical

Instantaneous analysis
• AMSR precipitation product (v10)
• Instantaneous radar (3 × 5 scans averaged to 15 mins)
• 5 km resolution averaged to 50 × 50 km
• Regions of interest: NSea, Atlantic, France, Germany, UK
• January 2005 - September 2009

Mean rainfall (mm/d), 2005-2009: Radar vs AMSR
[Maps; colour scale: 0, 0.1, 0.3, 0.5, 1, 2, 4, 8 mm/d]

Mean rain rate (mm/h) by date (year/month)
Overall – the current AMSR rain product underestimates rainfall

Regional breakdown: ratios
NSea = 0.8370; Atlantic = 0.8033; UK = 0.3424; France = 0.3956; Germany = 0.4271
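These are bias ratios – total estimated rainfall over total observed rainfall – so a value near 1 indicates an unbiased product and the UK's 0.34 means roughly a third of the radar-observed rainfall is recovered. A sketch (function name is my own):

```python
def bias_ratio(estimated_mm, observed_mm):
    """Total estimated rainfall divided by total observed.

    1.0 = unbiased; < 1.0 = the product underestimates
    (e.g. ~0.34 over the UK on this slide).
    """
    return sum(estimated_mm) / sum(observed_mm)

print(bias_ratio([1.0, 2.0], [2.0, 4.0]))  # 0.5
```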

Current high-resolution studies
• Inter-comparison of 3-hourly, 0.25 degree precipitation estimates over UK/NW Europe
• 5 mainstream PEHRPP algorithms: CMORPH, CPCMMW, NRLGEO, PERSIANN, 3B42RT (v5)
• Surface reference data sets: 3-hourly gauge, interpolated gauge and NIMROD European radar

3-hourly/0.25 degree data availability
Radar, gauges, PERSIANN, NRLBLD, HYDROE, CPCMW, CMORPH, 3B42 v5

Precipitation totals (2005-2009, by season: DJF, MAM, JJA, SON)
[Maps for Radar, CMORPH, CPCMMW, NRLBLD, PERSIANN, 3B42 v5; scale 0-1500 mm/season]

3-hourly radar vs product (2007)
[Correlation and ratio maps for 3B42RT, CMORPH, CPCMMW, NRLBLD, PERSIANN, ECMWF; ratio scale 0.0-10.0, correlation scale -1.0 to +1.0]

3-hourly, 0.25 degree summary
• Correlations are generally good, although with seasonal variations:
  - CMORPH produces the highest correlations in JJA
  - ECMWF produces the highest correlations in DJF
• Quantification of precipitation is poor: typically <50% of 'true' rainfall as identified by radar and gauge
• Temporal matching of product and surface data

Factors to consider…
Processing issues:
• Grid box vs grid point
• Instantaneous vs accumulation (i.e. ±1.5 hours or 00-03)
• Data resolutions (temporal & spatial)
• Data units (storage resolution vs retrieval resolution)
• Formats (I*2; I*4; R*4) and units (mm h⁻¹; mm d⁻¹; kg d⁻¹)
• Filename and date/time conventions (end, start, period)
• W-E (180°E/0°E & E-W) and N-S (or S-N) layout
Statistical analysis is dependent upon the rainfall:
• intensity, extent and patterns
• temporal resolution
• spatial resolution
All these are inter-related and pose a multi-dimensional problem that cannot currently be adequately resolved
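The formats-and-units bullets hide real pitfalls: products often store rain rates as scaled integers (I*2) with a missing-data flag, and mixing mm/h with mm/d silently wrecks statistics. A hypothetical decoding sketch – the scale factor, flag value and variable names are assumptions, not any specific product's convention:

```python
import numpy as np

SCALE = 0.01    # assumed: stored value 1 == 0.01 mm/h
MISSING = -999  # assumed missing-data flag

raw = np.array([-999, 0, 150, 1234], dtype=np.int16)       # I*2 storage
rate_mmh = np.where(raw == MISSING, np.nan, raw * SCALE)   # decode to mm/h
accum_mmd = rate_mmh * 24.0  # mm/h -> mm/d, valid only for a constant rate
```

Comparing `rate_mmh` against a product delivered in mm/d without this conversion would produce a factor-of-24 "bias" that has nothing to do with the algorithm.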

Statistics: blame it on the weather!
• Movement: is the movement perpendicular to or along the rain band?
• Intensity: what is the range of values within the rain area?
• Size/variability: what is the size and variability of the rain area(s)?
• Type of cloud/rain; sensor field-of-view
Statistical success has as much to do with the meteorology as with the algorithm's ability…

Precipitation Validation Summary
• What are your key requirements to make the best of your own limited resources?
• What are the requirements of your user community?
• What are the requirements of the algorithm/product providers?
• What sources of data are available to you – both satellite product and surface data?
• Should you go beyond basic, daily regional comparisons (instantaneous-seasonal)?

Contacts
Chris Kidd: C.Kidd@bham.ac.uk, University of Birmingham
(from 01/01/11 try Chris.[email protected])
International Precipitation Working Group – main web page: http://www.isac.cnr.it/~ipwg