Natural Hazards [NH]

NH43A
 Poster Hall (Moscone South)
 Thursday
 1340

Extreme Natural Hazards: Risk Assessment, Forecasting, and Decision Support I Posters


Presiding:  H Plag, Reno; B D Malamud, Geography, King's College London, London, United Kingdom; I Zaliapin, Mathematics and Statistics, University of Nevada, Reno; A Ismail-Zadeh, Geophysical Institute, Karlsruhe University, Karlsruhe, Germany; K T Johnson, Honolulu; J A Orcutt, Scripps Institution of Oceanography, La Jolla; E Sztein, Board on Int'l Scientific Organizations, National Academy of Sciences, Washington

NH43A-1277 Poster

Natural Hazard Mitigation through Water Augmentation Strategies to Provide Additional Snow Pack for Water Supply and Hydropower Generation in Drought-Stressed Alps/Mountains

Matthews, D   (HYDROMETDSS@COMCAST.NET), Hydromet DSS, LLC, Silverthorne, CO, United States
Brilly, M   (mbrilly@fgg.uni-lj.si), Civil Engineering, Univ. of Ljubljana, Ljubljana, Slovenia

Climate variability and change are clearly stressing water supplies in high alpine regions of the Earth. These recent long-term natural hazards present critical challenges to policy makers and water managers. This paper addresses strategies that use enhanced scientific methods to mitigate the problem. Recent rapid depletion of glaciers and intense droughts throughout the world have created a need to reexamine modern water augmentation technologies for enhancing snow pack in mountainous regions. Today's reliance on clean, efficient hydroelectric power in the Alps and the Rocky Mountains creates a critical need for sustainable snow packs and high-elevation water supplies throughout the year, and hence a need to make natural cloud systems more efficient precipitators during the cold season through anthropogenic weather modification techniques. The Bureau of Reclamation, US Department of the Interior, spent over $39M on research from 1963 to 1990 to develop the scientific basis for snow pack augmentation in the headwaters of the Colorado, American, and Columbia River Basins in the western United States, and, through USAID, in the High Atlas Mountains of Morocco. This paper presents a brief summary of the research findings and shows that, even during drought conditions, potential exists for significant, cost-effective enhancement of water supplies. Examples of ground-based propane and AgI seeding generators, and of cloud physics studies of supercooled cloud droplets and ice crystal characteristics that indicate seeding potential, will be shown. Hypothetical analyses of seeding potential in the 17 western states from Montana to California will be presented, based on observed SNOTEL snow water equivalent measurements and distributed by elevation and observed winter precipitation. Early studies indicated that 5 to 20% increases in snow pack were possible if winter storm systems were seeded effectively. If this potential were realized under the drought conditions observed in 2003, over 1.08 million acre-feet (1.33 x 10^9 m^3) of additional water could be captured by seeding efficiently and effectively in just 10 storms. Recent projects sponsored by the National Science Foundation, NOAA, and the States of Wyoming, Utah and Nevada, and conducted by the National Center for Atmospheric Research, will be discussed briefly. Examples of conditions in extreme droughts of the Western United States will be presented that show potential to mitigate droughts in these regions through cloud seeding. Implications for American and European hydropower generation and sustainable water supplies will be discussed.

http://hydrometdss.org
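
A worked check of the volume figure quoted in the abstract, using the standard acre-foot-to-cubic-metre conversion factor (the only number introduced here):

```python
# Worked check: 1.08 million acre-feet expressed in cubic metres,
# using the standard conversion 1 acre-foot = 1233.48 m^3.
ACRE_FOOT_M3 = 1233.48

acre_feet = 1.08e6
volume_m3 = acre_feet * ACRE_FOOT_M3
print(f"{volume_m3:.3g} m^3")   # ~1.33e9 m^3, matching the abstract
```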

NH43A-1278 Poster

Cartographic Design in Flood Risk Mapping - A Challenge for Communication and Stakeholder Involvement

Fuchs, S   (sven.fuchs@boku.ac.at), Institute of Mountain Risk Engineering, University of Natural Resources and Applied Life Sciences, Vienna, Austria
Serrhini, K   (kmal.serrhini@univ-tours.fr), Département Génie de l’Aménagement, École Polytechnique de l’Université de Tours, Tours, France
Dorner, W   (wolfgang.dorner@fh-deggendorf.de), University of Applied Sciences, Deggendorf, Germany

In order to mitigate flood hazards and to minimise associated losses, technical protection measures have been increasingly supplemented by non-technical mitigation, i.e., land-use planning activities. This is commonly done by creating maps which indicate hazard-prone areas by means of different cartographic symbols, such as colour, size, shape, and typography. Hazard and risk mapping is the accepted procedure for communicating potential threats to stakeholders, and is therefore required in the European Member States in order to meet the demands of the European Flood Risk Directive. However, available information is sparse concerning the impact of such maps on different stakeholders, i.e., specialists in flood risk management, politicians, and affected citizens. This lack of information stems from a traditional approach to map production which does not take into account specific end-user needs. In order to overcome this shortage, the current study used a circular approach in which feedback mechanisms originating from the different perception patterns of end users were considered. Different sets of small-scale as well as large-scale risk maps were presented to different groups of test persons in order to (1) study reading behaviour as well as understanding and (2) deduce the most attractive components that are essential for target-oriented communication of cartographic information. To this end, the method of eye tracking was applied using a video-oculography technique. This resulted in a suggested map template which fulfils the requirement to serve as an efficient communication tool for specialists and practitioners in hazard and risk mapping as well as for laypersons. The results of this study will enable public authorities who are responsible for flood mitigation to (1) improve their flood risk maps, (2) enhance flood risk awareness, and thereby (3) create more disaster-resilient communities.

NH43A-1279 Poster

Floodplain Management Strategies for Flood Attenuation in the River Po

Brath, A   (armando.brath@unibo.it), DISTART, University of Bologna, Bologna, Italy
Castellarin, A   (attilio.castellarin@unibo.it), DISTART, University of Bologna, Bologna, Italy
Di Baldassarre, G   (g.dibaldassarre@unesco-ihe.org), Institute for Water Education (UNESCO-IHE), Delft, Netherlands

This paper analyses the effects of different floodplain management policies on flood hazard, using a 350 km reach of the River Po (Italy) as a case study. The River Po is the longest Italian river, and the largest in terms of streamflow. The middle-lower Po flows east for some 350 km through the Pianura Padana (Po Valley), a very important agricultural region and the industrial heart of Northern Italy. This portion of the river consists of a main channel (200-500 m wide) and a floodplain (overall width from 200 m to 5 km) confined by two continuous artificial embankments. The floodplains are densely cultivated, and a significant portion of these areas is protected against frequent flooding by a system of minor dikes, which significantly impacts the hydraulic behaviour of the middle-lower Po during major flood events. This study aims at investigating the effects of adopting different floodplain management strategies (e.g., raising, lowering or removing the minor dike system) on the hydrodynamics of the middle-lower Po and, in particular, on flood-risk mitigation. This is a crucial task for institutions and public bodies in charge of formulating robust flood risk management strategies for the Po River. Furthermore, the results of the study are of interest to other European public bodies managing large river basins, in the light of the recent Directive 2007/60/EC on the assessment and management of flood risks (European Parliament, 2007). The analysis is performed by means of a quasi-2D hydraulic model, developed on the basis of a laser-scanning DTM and a large amount of calibration data recorded during the significant flood event of October 2000.

NH43A-1280 Poster

Climate Change and Famine: Implications for Remote Sensing Applications to Enhance Food Security

Underwood, L W (lauren.w.underwood@nasa.gov), Applied Sciences, SSAI, Stennis, MS, United States
Brown, M E (molly.brown@nasa.gov), Goddard Space Flight Center, NASA, Greenbelt, MD, United States
Ross, K W (Kenton.W.Ross@nasa.gov), Applied Sciences, SSAI, Stennis, MS, United States

Agriculture and climate are tightly linked, and climate change is transforming that linkage in ways that are not broadly understood. Increasing global mean temperatures and extreme weather events are expected to have a profound effect on future crop production and food availability, especially considering the persistent effects that current climatic variability has on food insecurity today. Over the next several decades, projected changes in weather patterns pose a serious threat to food security, particularly in semi-arid tropical regions that are already food insecure. These changes are amplifying the need for expanded decision support tools and earlier early warning, so that decision makers will have longer time horizons for planning and preparedness. Our research is helping evaluate which remote sensing data will be most useful in meeting this need. Multiple national and international organizations have created decision support tools that summarize information about food security status in key regions. These include the U.N. Food and Agriculture Organization's Global Information and Early Warning System (GIEWS), the U.S. Department of Agriculture's CropExplorer, and the U.S. Agency for International Development's Famine Early Warning Systems Network (FEWS NET). FEWS NET early warning of agricultural production declines that may affect food security is characterized by weekly weather hazard assessments, and relies upon vegetation, temperature and rainfall data derived from remote sensing to identify abnormal weather-related conditions. Previously published research utilized a questionnaire to elicit inputs from professionals who use Earth science data to address FEWS NET's institutional needs. This work identified that rainfall and vegetation products are valued as data that provide actionable food security information. The questionnaire also led to key findings regarding planned FEWS NET enhancements, and indicated that the focus of a recent NASA-funded project on developing new seasonal forecast products was well placed: planned forecast products for rainfall and vegetation would be considered useful. To assess whether these weather-related forecasting data products are sufficient to meet early warning needs, a follow-up questionnaire will be disseminated to the same professionals to evaluate whether the enhancements have improved their ability to make actionable food security decisions in light of uncertainties related to the effects of climate change. If proven useful, these insights would support the conclusion that weather-related biophysical forecasts can play a crucial role in providing earlier estimates of weather-related agricultural production deficits. Other survey questions will address how users anticipate climate change will affect food security, and what other food security decision support tools might be useful in changing agroclimatic regimes. Understanding the impacts global climate change has on agriculture may better equip decision makers to confront the effects upon already fragile food security situations, anticipate actions required to mitigate further food insecurity, and provide earlier early warning. It is hoped that dependable forecasts can play an important role in an emerging global food security "system of systems".

NH43A-1281 Poster

A 2000-year flood record from annually laminated sediments of Lake Mondsee (European Alps, Upper Austria)

Swierczynski, T   (swier@gfz-potsdam.de), 5.2 Climate Dynamics and Landscape Evolution, German Research Centre for Geosciences, Potsdam, Germany
Lauterbach, S   (slauter@gfz-potsdam.de), 5.2 Climate Dynamics and Landscape Evolution, German Research Centre for Geosciences, Potsdam, Germany
Brauer, A   (brau@gfz-potsdam.de), 5.2 Climate Dynamics and Landscape Evolution, German Research Centre for Geosciences, Potsdam, Germany
Merz, B   (Bruno.Merz@gfz-potsdam.de), 5.4 Hydrology, German Research Centre for Geosciences, Potsdam, Germany

The magnitude and frequency of extreme flood events are much debated in relation to global warming and a possible intensification of the water cycle. The Alpine region is especially sensitive to climatic changes, providing the opportunity to study the triggering mechanisms of floods in a changing environment within human habitats. Geoarchives such as lakes provide long continuous time series (>10,000 years) containing detailed information about environmental conditions and climatic events of the past. The sediments of the pre-alpine Lake Mondsee, which cover the Holocene, are annually laminated, with intercalated allochthonous deposits signifying floods. We characterised these allochthonous 'detrital' flood layers using sedimentological, mineralogical and geophysical methods (microfacies analysis, X-ray diffraction, magnetic susceptibility, X-ray fluorescence scanning) and identified different flood periods within the last two millennia. Nine extreme floods are recorded in the lake sediments of the last 100 years, correlating well with instrumental and historical data. In order to better understand the flood-generating processes, we are furthermore calibrating the sedimentological record against available hydrological data.

NH43A-1282 Poster

High Variability in Coseismic Slip and Extreme Local Tsunami Runup

Geist, E L (egeist@usgs.gov), US Geological Survey, Menlo Park, CA, United States

High variability in coseismic slip for inter-plate thrust earthquakes at subduction zones is examined in the context of local tsunami runup. The tsunami emanating from the M=9.2, 2004 Sumatra-Andaman earthquake highlighted this problem, with extreme runup along the northwestern shoreline of Sumatra tied to localized high slip at the southern end of the rupture zone, near the oceanic trench. Other tsunamigenic inter-plate thrust earthquakes, such as the M=8.6, 1957 Andreanof earthquake, have also exhibited high slip fluctuations, resulting in localized areas of high slip. Statistical models of coseismic slip are a critical component of assessing local tsunami hazards. The standard stochastic slip model is first characterized by a power-law decay in the wavenumber spectrum [Andrews, 1980]. The slip distribution is then defined by convolving the corresponding auto-correlation function with Gaussian-distributed random variables. Alternatively, a modified version of the stochastic slip model uses random variables specified according to a broader family of Lévy α-stable probability distributions, and is better able to account for the observed slip variability of onshore earthquakes [Lavallée et al., 2006]. I examine the slip distributions of several tsunamigenic, inter-plate thrust earthquakes, derived from the inversion of seismic and tsunami waveforms, to determine whether there is a clear preference for using the Lévy α-stable stochastic slip model (for α < 2). Because the spatial resolution of slip inversions for offshore earthquakes is low, one-point statistics are primarily used in the comparison with the stochastic slip models, rather than attempting to estimate both the spectral decay parameter and the parameters of the probability law. In addition, the effect on local tsunami runup is discussed, particularly with respect to the shoaling amplification during propagation dictated by Green's Law. The component of the tsunami wavefield generated by high slip along the up-dip portion of the inter-plate thrust (near the oceanic trench) will be amplified during shoaling to a greater extent than equivalent high slip along the down-dip portion of the fault. Therefore, estimation of maximum local tsunami runup is especially sensitive to high fluctuations of slip in the dip direction.
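
A minimal one-dimensional sketch of the stochastic slip construction described above: a power-law wavenumber spectrum whose random coefficients are either Gaussian (the standard model) or Lévy α-stable (the heavy-tailed variant), together with the Green's Law shoaling factor. All parameter values here (n, decay, alpha, the example depths) are illustrative assumptions, not the study's values:

```python
# 1-D sketch of a stochastic slip model: power-law spectral decay with
# Gaussian (alpha = 2) or heavy-tailed Levy alpha-stable (alpha < 2)
# random coefficients.
import numpy as np
from scipy.stats import levy_stable

def stochastic_slip(n=256, decay=2.0, alpha=2.0, seed=0):
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n)                      # wavenumbers
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-decay / 2.0)           # power-law spectral decay
    if alpha < 2.0:
        coef = levy_stable.rvs(alpha, 0.0, size=k.size, random_state=rng)
    else:
        coef = rng.standard_normal(k.size)
    phase = rng.uniform(0.0, 2.0 * np.pi, size=k.size)
    slip = np.fft.irfft(amp * coef * np.exp(1j * phase), n=n)
    return slip - slip.min()                    # shift so slip is non-negative

def greens_law_amplification(h_source, h_shore):
    """Green's Law: wave amplitude scales as (h_source / h_shore) ** 0.25."""
    return (h_source / h_shore) ** 0.25

print(stochastic_slip(alpha=1.5)[:4], greens_law_amplification(4000.0, 10.0))
```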

NH43A-1283 Poster

Using Landslide Movement to Differentiate Landslide Hazard and Risk

Dellow, G D (g.dellow@gns.cri.nz), GNS Science, Lower Hutt, New Zealand

Landslide time-series records (a landslide catalogue) are necessary to determine the frequency component of the landslide magnitude/frequency relationships required to calculate the probability of a landslide occurring (the landslide hazard). Fast landslides, defined as landslides with a total cumulative movement of > 1.0 m in 24 hours, often have a single movement episode and get a single entry in a landslide catalogue. Slow landslides, defined by a cumulative displacement of < 0.1 m in 24 hours, often have multiple movement episodes through time, which can result in multiple entries in a landslide catalogue. If a landslide catalogue combining both fast and slow landslide movements is analysed to determine the frequency of landsliding, the resulting landslide frequency will be dominated by the frequency of the small episodic movements of slow landslides. As landslide frequency is required to derive landslide hazard, this can lead to an inappropriate determination of landslide hazard. Slow landslides are usually pre-existing landslides, and it is often possible to identify them using geomorphic criteria prior to movement being observed. A map or inventory of pre-existing landslides can be used to identify potential slow landslides, as slow landslides will be a subset of the landslide inventory population (some inventory landslides are the 'preserved' remains of fast landslides). Slow landslide movement is site-specific, and information on individual landslide behaviour over time is required to determine the probability of movement occurring. For slow landslides, the landslide hazard is the probability of movement occurring at a specific site. Fast landslides are usually first-time landslides, and the specific sites at which they may occur are seldom identifiable prior to failure. The probability of fast landslide movement is potentially treatable using probabilistic techniques to spatially differentiate landslide hazard. For fast landslides, calculating the landslide hazard requires determining the probability of landslide movement occurring anywhere. Differentiating landslides by movement parameters recognises that different datasets are required to determine the hazard (the probability of movement occurring) for fast and slow landslides. A landslide catalogue is a required input for determining fast-landslide hazard, while a landslide inventory can be used to identify potential slow landslides, which must then be examined individually to determine their hazard or probability of movement. Differences in the frequency of landslide movement for fast and slow landslides separate a landslide population into two subsets with different risk profiles. Landslides causing fatalities in New Zealand (over 350 deaths from 50 individual landslides) have all been fast landslides. Fast landslides are a risk to both life (life safety) and infrastructure (dollar cost). Small episodic movements of slow landslides are not a direct life-safety risk (no such deaths or injuries are recorded in New Zealand), although they often damage infrastructure (dollar cost).
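
The velocity thresholds quoted above translate into a trivial classification rule. In this sketch the thresholds come from the abstract, while the handling of the intermediate range (which the abstract leaves undefined) is an assumption:

```python
# Velocity-based split of landslide movement episodes, using the abstract's
# 24-hour cumulative displacement thresholds.
def classify_landslide(displacement_m_per_day: float) -> str:
    if displacement_m_per_day > 1.0:
        return "fast"      # usually first-time failures; single catalogue entry
    if displacement_m_per_day < 0.1:
        return "slow"      # pre-existing landslides; episodic, site-specific
    return "intermediate"  # range not defined in the abstract

episodes = [2.5, 0.02, 0.5]
print([classify_landslide(v) for v in episodes])  # ['fast', 'slow', 'intermediate']
```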

NH43A-1284 Poster

Validation of Volcanic Ash Forecasting Performed by the Washington Volcanic Ash Advisory Center

Salemi, A   (tony.salemi@noaa.gov), Satellite Analysis Branch, NOAA/NESDIS, Camp Springs, MD, United States
Hanna, J   (jay.hanna@noaa.gov), Satellite Analysis Branch, NOAA/NESDIS, Camp Springs, MD, United States

In support of NOAA's mission to protect life and property, the Satellite Analysis Branch (SAB) uses satellite imagery to monitor volcanic eruptions and track volcanic ash. The Washington Volcanic Ash Advisory Center (VAAC) was established in late 1997 through an agreement with the International Civil Aviation Organization (ICAO). A volcanic ash advisory (VAA) is issued every 6 hours while an eruption is occurring. Information about the current location and height of the volcanic ash, as well as any pertinent meteorological information, is contained within the VAA. In addition, when ash is detected in satellite imagery, 6-, 12- and 18-hour forecasts of ash height and location are provided. This information is garnered from many sources, including Meteorological Watch Offices (MWOs), pilot reports (PIREPs), model forecast winds, radiosondes and volcano observatories. The Washington VAAC has performed a validation of its 6-, 12- and 18-hour airborne volcanic ash forecasts issued since October 2007. The volcanic ash forecasts are viewed dichotomously (yes/no), with the frequencies of yes and no events placed into a contingency table. A large variety of categorical statistics useful in describing forecast performance is then computed from the resulting contingency table.

www.ssd.noaa.gov/VAAC
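
As an illustration of the dichotomous verification described above, the sketch below computes a few standard categorical scores from a 2x2 contingency table. The abstract does not say which statistics were used; POD, FAR, CSI, bias and proportion correct are conventional choices, and the counts are invented:

```python
# Categorical verification scores from a 2x2 contingency table:
# hits a, false alarms b, misses c, correct nulls d.
def categorical_scores(a, b, c, d):
    return {
        "POD": a / (a + c),                  # probability of detection
        "FAR": b / (a + b),                  # false-alarm ratio
        "CSI": a / (a + b + c),              # critical success index
        "bias": (a + b) / (a + c),           # frequency bias
        "PC": (a + d) / (a + b + c + d),     # proportion correct
    }

print(categorical_scores(a=42, b=8, c=6, d=120))
```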

NH43A-1285 Poster

Uniting Mandelbrot’s Noah and Joseph Effects in Toy Models of Natural Hazard Time Series

Credgington, D   (nww62@yahoo.co.uk), British Antarctic Survey, Cambridge, United Kingdom
Watkins, N W (nww@bas.ac.uk), British Antarctic Survey, Cambridge, United Kingdom
Chapman, S C (S.C.Chapman@warwick.ac.uk), CFSA, University of Warwick, Coventry, United Kingdom
Rosenberg, S J (sam.joe.rosenberg@googlemail.com), British Antarctic Survey, Cambridge, United Kingdom
Sanchez, R   (rsanchez@fis.uc3m.es), Oak Ridge National Laboratory, Oak Ridge, TN, United States

The forecasting of extreme events is a highly topical, cross-disciplinary problem. One aspect which is potentially tractable, even when the events themselves are stochastic, is the probability of a "burst" of a given size and duration, defined as the area between a time series and a constant threshold. Many natural time series depart from the simplest, Brownian, case, and in the 1960s Mandelbrot developed the use of fractals to describe these departures. In particular, he proposed two kinds of fractal model to capture the way in which natural data are often persistent in time (his "Joseph effect", common in hydrology and exemplified by fractional Brownian motion) and/or prone to heavy-tailed jumps (the "Noah effect", typical of economic index time series, for which he gave Levy flights as an exemplar). Much of the earlier modelling, however, has emphasised one of the Noah and Joseph parameters (the tail exponent mu, or one derived from the temporal behaviour, such as the power spectral exponent beta) at the other's expense. I will describe work [1] in which we applied a simple self-affine stable model, linear fractional stable motion (LFSM), which unifies both effects, to better describe natural data, in this case from space physics. I will show how we have resolved some contradictions seen in earlier work, where purely Joseph or purely Noah descriptions had been sought. I will also show recent work [2] using numerical simulations of LFSM and simple analytic scaling arguments to study the problem of the area between a fractional Levy model time series and a threshold. [1] Watkins et al., Space Science Reviews [2005] [2] Watkins et al., Physical Review E [2009]
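
The "burst" measure defined above, the area between a time series and a constant threshold, is straightforward to compute over each contiguous excursion. In this sketch the fractional Lévy input is replaced by a plain Brownian toy series purely for illustration; the threshold value is arbitrary:

```python
# Burst sizes: area of each contiguous excursion of a series above a threshold.
import numpy as np

def burst_areas(x, threshold):
    areas, current = [], 0.0
    for xi in x:
        if xi > threshold:
            current += xi - threshold   # accumulate area over the excursion
        elif current > 0.0:
            areas.append(current)       # excursion ended
            current = 0.0
    if current > 0.0:
        areas.append(current)
    return areas

rng = np.random.default_rng(1)
series = np.cumsum(rng.standard_normal(10_000))  # Brownian toy series
print(len(burst_areas(series, threshold=20.0)))
```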

NH43A-1286 Poster

Numerical Modeling to Support Floodplain Mapping in Coastal Areas

Cydzik, K   (kcydzik@exponent.com), Exponent, Irvine, CA, United States
Shrestha, P L (pshrestha@exponent.com), Exponent, Irvine, CA, United States
Hamilton, D   (dhamilton@exponent.com), Exponent, Irvine, CA, United States
Rezakhani, M   (mrezakhani@exponent.com), Exponent, Phoenix, AZ, United States
Scheffner, N   (cht@canufly.net), Computational Hydraulics and Transport, Edwards, MS, United States
Lenaburg, R   (Raymond.Lenaburg@dhs.gov), FEMA, Oakland, CA, United States

A hurricane-induced flood mapping study was conducted for the State of Hawaii encompassing the six major Hawaiian Islands: Hawaii, Kauai, Lanai, Maui, Molokai, and Oahu. The objective of the study was to use numerical methods to compute storm surge frequency relationships using the Empirical Simulation Technique (EST). This paper describes the EST methodology. Ultimately, the storm surge frequency data and water surface elevations determined through the modeling effort define coastal inundation areas used to revise Flood Insurance Rate Maps (FIRMs). Such information guides coastal development and highlights flood risks in coastal areas. To perform a realistic storm surge analysis, historical events impacting the islands in the study area were selected from the National Hurricane Center's Eastern and Central North Pacific Basin hurricane database. The database consists of hurricanes, tropical storms, and tropical depressions impacting the Hawaiian Islands from 1949 through 2005, and includes records of the latitude, longitude, maximum wind speed, and, often, the central pressure of the eye of the storm. For this study, candidate events were selected based on two criteria: storms were required to pass within 200 nautical miles of at least two of the islands, with maximum winds at that point of at least tropical storm strength (39 mph). Of the 794 storm events in the database, 11 events met these criteria and were used to generate wind and pressure fields for the modeling effort. An assumption of the EST analysis is that each of the 11 events has an equal probability of impacting the islands within the 200 nautical mile ellipse. Therefore, the 11 events were translated by one radius-to-maximum-winds across the ellipse so that each event impacted each island, generating 102 impacting events. The hypothetical events were used to generate wind and pressure fields for input to the ADvanced CIRCulation (ADCIRC) long-wave hydrodynamic model to compute storm surge at defined transect points. This database of storm surges was then input to the EST, a statistical model simulating life-cycle sequences of cyclic but non-deterministic multi-parameter systems such as storm events and their corresponding environmental impacts. The EST uses "bootstrap" resampling-with-replacement, nearest-neighbor random walk interpolation, and a subsequent smoothing technique, whereby random sampling of a finite-length database creates a larger database. The new database contains events more severe than those in the historical database. The EST model generates N simulations of a T-year sequence of events. The only assumptions in the simulation are that the simulated events are similar in behavior and magnitude to historical events and that the frequency of storm events in the future will remain the same as in the past. The simulated life-cycle hurricanes are used to generate mean value frequency estimates with standard deviation confidence limits. These results are combined with frequency-indexed wave run-up and setup to generate revised FIRMs for the six Hawaiian Islands.
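
A highly simplified, one-dimensional sketch of the EST resampling idea described above: bootstrap draws from the historical responses plus a random-walk step toward a neighbouring event. The real technique operates on multi-parameter storm descriptors with smoothing; the function, surge values and simulation length below are invented:

```python
# EST-style life-cycle simulation, reduced to one dimension for illustration.
import numpy as np

def est_simulate(surges, n_years, events_per_year, rng):
    surges = np.sort(np.asarray(surges, dtype=float))
    out = []
    for _ in range(n_years * events_per_year):
        i = rng.integers(len(surges))                       # bootstrap draw
        j = min(max(i + rng.choice([-1, 1]), 0), len(surges) - 1)
        w = rng.random()                                    # walk toward a neighbour
        out.append((1 - w) * surges[i] + w * surges[j])
    return np.array(out)

rng = np.random.default_rng(7)
historical = [0.4, 0.6, 0.9, 1.3, 1.8, 2.6]                 # made-up surge heights (m)
sim = est_simulate(historical, n_years=100, events_per_year=2, rng=rng)
print(sim.max(), np.percentile(sim, 99))
```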

NH43A-1287 Poster

Land Fire Impacts Assessment on the Rice Watershed, California, 2007

Zahraei, A   (szahraei@uci.edu), Civil and Environmental Engineering Department, University of California-Irvine, Irvine, CA, United States
Imam, B   (bimam@uci.edu), Civil and Environmental Engineering Department, University of California-Irvine, Irvine, CA, United States
Sorooshian, S   (soroosh@uci.edu), Civil and Environmental Engineering Department, University of California-Irvine, Irvine, CA, United States

Burn impact assessment is a key factor in post-fire disaster management. For example, assessing wildfire impacts on vegetation is an important component of improving the prediction of the hydrologic and ecologic impacts of wildfires within the affected watershed. Many studies have analyzed satellite-derived indices of vegetation vigor as indicators of burning effects. This poster reports a study in which Landsat (TM) data were used to compute three indices, NDVI, MSAVI and NBR, which are commonly used in assessing wildfire impacts. The study focused on the Rice watershed in southern California, which was affected by a major wildfire in the 2007 fire season. A series of Landsat images was used to evaluate these indices before and after the wildfire. Comparison between the three indices reveals that the effects of the fire were not very prominent in the satellite observations, due to the length of time separating the fire from the next available Landsat scene. Such separation may include a period of vegetation recovery. However, when compared with scenes from the previous year, for the same season, post-fire vegetation shows marked differences from pre-fire conditions. The ability of NDVI, MSAVI and NBR to monitor post-fire impacts on vegetation is further evaluated by comparing precipitation patterns in 2006 and 2007, which may shed more light on whether the marked differences in these indices are due to dry/wet differences or to the impacts of fire. NDVI shows more reliability and better representation of both the long-term and short-term impacts of wildfire.
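
The three indices named above have standard band-ratio definitions. The sketch below uses the usual formulas for NDVI, MSAVI (the closed-form MSAVI2 variant) and NBR applied to Landsat TM red, near-infrared and shortwave-infrared reflectances; the reflectance values are invented:

```python
# Standard burn/vegetation indices from Landsat TM reflectance bands.
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def msavi(nir, red):
    # Modified Soil-Adjusted Vegetation Index (closed-form MSAVI2)
    return (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

def nbr(nir, swir):
    return (nir - swir) / (nir + swir)

red, nir, swir = np.array([0.08]), np.array([0.35]), np.array([0.20])
print(ndvi(nir, red), msavi(nir, red), nbr(nir, swir))
```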

NH43A-1288 Poster

Future flood risk estimation in the Rhine basin

Linde, A H (aline.te.linde@ivm.vu.nl), Institute for Environmental Studies, VU University, Amsterdam, Netherlands
Bubeck, P   (philip.bubeck@ivm.vu.nl), Institute for Environmental Studies, VU University, Amsterdam, Netherlands
Moel, H D (hans.de.moel@ivm.vu.nl), Institute for Environmental Studies, VU University, Amsterdam, Netherlands

The Rhine basin is densely populated, and the Rhine is economically the most important river in Western Europe. Currently, more than 10 million people live in areas at risk of flooding, especially in the German and Dutch parts of the Rhine basin. Floods have caused casualties and severe damage on a regular basis, with two recent extreme events in 1993 and 1995 (1800 and 3500 million USD in direct damage, respectively). Flood risk is expected to increase due to climate change and socio-economic development in flood-prone areas. This requires a better understanding of future flood risk developments and of the effect of adaptation strategies for reducing these risks. Flood risk assessment is ideally done on a basin-wide scale, an approach also stimulated in Europe by the EU Flood Directive. Previous studies in the Rhine basin focused on current flood risk (e.g., the Rhine Atlas). Few studies have examined the potential future increase in disaster losses using a comprehensive approach to climate and socio-economic change. In this paper, we develop a uniform flood risk methodology and estimate future flood risk in 2050 for the entire basin in a scenario study.

NH43A-1289 Poster

A new scoring method for evaluating the performance of earthquake forecasts and predictions

Zhuang, J   (zhuangjc@ism.ac.jp), Institute of Statistical Mathematics, Tokyo, Japan

This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. A fair scoring scheme should reward success in a way that is compatible with the risk taken. Suppose that we have a reference model, usually the Poisson model in ordinary cases or the Omori-Utsu formula in the case of forecasting aftershocks, which gives a probability p0 that at least 1 event occurs in a given space-time-magnitude window. The forecaster, like a gambler who starts with a certain number of reputation points, bets 1 reputation point on ``Yes'' or ``No'' according to his forecast, or bets nothing if he makes an NA-prediction. If the forecaster bets 1 reputation point on ``Yes'' and loses, the number of his reputation points is reduced by 1; if his forecast is successful, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on ``Yes''. In this way, if the reference model is correct, the expected return that he gains from the bet is 0. The rule also applies to probability forecasts. Suppose that p is the occurrence probability of an earthquake given by the forecaster. We can regard the forecaster as splitting 1 reputation point by betting p on ``Yes'' and 1-p on ``No''. In this way, the forecaster's expected pay-off under the reference model is still 0. From the viewpoints of both the reference model and the forecaster, the rule for reward and punishment is fair. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass over the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and the reference model is the Poisson model.
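
The betting rule above translates directly into code. A minimal sketch follows; the ``Yes'' payoff (1-p0)/p0 is as stated in the abstract, while the ``No'' payoff p0/(1-p0) is inferred here from the same zero-expectation argument by symmetry, and the example numbers are invented:

```python
# Gambling-score bookkeeping. For each space-time-magnitude window the
# reference model supplies p0; a probability forecast p splits one
# reputation point between "Yes" (stake p) and "No" (stake 1 - p).
# A binary forecast is the special case p = 1 or p = 0.
def gambling_score(bets):
    """bets: list of (p0, forecast_prob, event_occurred) tuples."""
    score = 0.0
    for p0, p, occurred in bets:
        if occurred:
            score += p * (1 - p0) / p0 - (1 - p)   # "Yes" stake paid at (1-p0)/p0
        else:
            score += (1 - p) * p0 / (1 - p0) - p   # "No" stake paid at p0/(1-p0)
    return score

print(gambling_score([(0.05, 0.6, True), (0.05, 0.1, False)]))
```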

NH43A-1290 Poster

Flood risk management for flooding currents using satellite imagery in the Nakdong basin, Korea

Kwak, Y   (kwak_yj@graduate.chiba-u.jp), Chiba University, Chiba City, Japan
Park, J   (amon@rsch.tuis.ac.jp), Information Systems, Tokyo University of Information Sciences, Chiba, Japan
Kondoh, A   (kondoh@faculty.chiba-u.jp), Chiba University, Chiba City, Japan

Recently, floods have increased due to rapid urbanization and human activity in lowland areas, so river flood control through functional embankments and their maintenance is essential for safety. Floods also occur in Korea every year, caused by heavy rains and typhoons. However, due to a lack of hydrologic, hydraulic and geomorphic data, high-magnitude floods are often difficult to characterize in terms of the distribution of stream power and discharge per unit boundary area, and flood occurrence cannot at present be predicted accurately from existing flood risk information and simulation data. In this study, we describe the characteristics of the Nakdong river system, the second largest basin in South Korea, which is surrounded by mountain ranges; most floods there are caused by overflow and levee breaks along the Nakdong river. To set up the inundation model, the authors first assume that extreme floods occur only within the maximum overflow at peak value. Next, the authors determine the inundation area most at risk (Yangsan stream) among the 13 confluence points in the Nakdong basin. The purpose of this study is to extract not only GIS-based flood risk factors but also the relationship between the main stream and its tributaries, which is analyzed to generate multiple criteria such as an inundation vulnerability index (IVI), flow capacity (FC), and potential flow (PF). Flood risk is also related to water depth and drainage. The weighted inundation model is recalculated to produce a risk ranking, and raster calculation then identifies inundation areas in detail. In order to verify the flood risk, the authors apply the method to data from previously occurring floods and to satellite imagery. Using correlation analysis of the flood risk factors, the authors develop a methodology for determining a flood risk index using spatial analysis and image analysis in the Nakdong river basin. As a result, the authors determine the location of the area most at risk among the 13 confluence points between the main stream and its tributaries, where tributary overflow exceeds the discharge capacity of the downstream channel of the Nakdong river.



Location of the study area in the Nakdong basin, South Korea
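
A weighted raster overlay of the kind this abstract describes can be sketched as follows; the criterion grids (named after the abstract's IVI, FC and PF) and the weights are invented for the example:

```python
# Weighted overlay of normalized criterion rasters into a single risk grid.
import numpy as np

def weighted_overlay(layers, weights):
    stack = np.stack([lyr / lyr.max() for lyr in layers])  # normalize to 0-1
    w = np.asarray(weights, dtype=float)
    return np.tensordot(w / w.sum(), stack, axes=1)

rng = np.random.default_rng(0)
ivi, fc, pf = (rng.random((4, 4)) for _ in range(3))       # made-up criterion grids
risk = weighted_overlay([ivi, fc, pf], weights=[0.5, 0.3, 0.2])
print(risk.round(2))
```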

NH43A-1291 Poster

How Robust are Science-Based Disaster Preparedness Strategies? Lessons from Western Sumatra (Invited)

Shannon, R   (shannon-r@email.ulster.ac.uk), Environmental Science, University of Ulster, Coleraine, United Kingdom
McCloskey, J   (j.mccloskey@ulster.ac.uk), Environmental Science, University of Ulster, Coleraine, United Kingdom
McDowell, S   (sp.mcdowell@ulster.ac.uk), Environmental Science, University of Ulster, Coleraine, United Kingdom

Forecasts indicate that the next megathrust earthquake off the western coast of Sumatra, which may occur in the near future, will likely be tsunamigenic and could be more devastating than the 2004 event. Hundreds of simulations of potential earthquakes and their tsunamis show that, while the earthquake is fundamentally unpredictable, many scenarios would see dangerous inundation of low-lying areas along the west coast of Sumatra; the cities of Padang and Bengkulu, broadside-on to the areas of highest seismic potential, have a combined population of over one million. Understanding how the science of unpredictable, high-probability events is absorbed by society is essential for the development of effective mitigation and preparedness campaigns. A five-month field investigation conducted in Padang and Bengkulu aimed to conceptualise the main issues driving risk perception of tsunami hazard, and to explore its influence upon preparedness. Of specific interest was the role of scientifically quantified hazard information in shaping risk perception and hazard preparedness. Target populations were adult community members (n=270) and senior high school students (n=90). Preliminary findings indicate that scientific knowledge of the earthquake and tsunami threat amongst respondents in both cities is good. However, the relationship between respondents' hazard knowledge, risk perception, and the adoption of preparedness measures was often non-linear and is susceptible to the negative effects of unscientific forecasts disseminated by government and mass media. Evidence suggests that 'mystic' predictions, often portrayed in the media as being scientific, have been readily absorbed by the public; when these fail to materialise, the credibility of authentic science and scientists plummets. As a result, the level of sustainable earthquake and tsunami preparedness measures adopted by those living in tsunami-threatened areas can be detrimentally impacted. It is imperative that the internationally accredited science of high-probability, unpredictable natural hazards prevails within public consciousness in western Sumatra, despite the frequent circulation of unsubstantiated predictions and claims relating to these events. While the management of this information ultimately lies with government, the recent past has demonstrated a need for scientists to become more proactive in ensuring their work is accepted as a foremost source of knowledge used to guide accurate risk perceptions and stimulate the adoption of appropriate preparedness measures.

NH43A-1292 Poster

Experimental Study of the Effect of the Initial Spectrum Width on the Statistics of Random Wave Groups

Shemer, L   (shemer@eng.tau.ac.il), School of Mechanical Engineering, Tel-Aviv University, Tel-Aviv, Israel
Sergeeva, A   (a.sergeeva@hydro.appl.sci-nnov.ru), Institute of Applied Physics, RAS, Nizhny Novgorod, Russian Federation

The statistics of a random water wave field determine the probability of appearance of extremely high (freak) waves. This probability is strongly related to the spectral characteristics of the wave field. Laboratory investigation of the spatial variation of random wave-field statistics for various initial conditions is thus of substantial practical importance. Unidirectional nonlinear random wave groups are investigated experimentally in the 300 m long Large Wave Channel (GWK) in Hannover, Germany, the biggest facility of its kind in Europe. Numerous realizations of a wave field with a prescribed frequency power spectrum, yet randomly distributed initial phases for each harmonic, were generated by a computer-controlled piston-type wavemaker. Several initial spectral shapes with identical dominant wave length but different width were considered. For each spectral shape, the total duration of sampling over all realizations was long enough to yield a sufficient sample size for reliable statistics. Throughout the experiments, an effort was made to retain the characteristic wave height, and thus the degree of nonlinearity, of the wave field. The spatial evolution of numerous statistical wave field parameters (skewness, kurtosis and probability distributions) is studied using about 25 wave gauges distributed along the tank. It is found that, depending on the initial spectral shape, the frequency spectrum of the wave field may undergo significant modification in the course of its evolution along the tank; the values of all statistical wave parameters are strongly related to the local spectral width. A sample of the measured wave height probability functions (scaled by the variance of the surface elevation) is plotted in Fig. 1 for the initially narrow rectangular spectrum. The results in Fig. 1 resemble findings obtained in [1] for an initial Gaussian spectral shape. The probability of large waves notably surpasses that predicted by the Rayleigh distribution and is highest at a distance of about 100 m. Acknowledgement: This study was carried out in the framework of the EC-supported project "Transnational access to large-scale tests in the Large Wave Channel (GWK) of Forschungszentrum Küste" (Contract HYDRALAB III - No. 022441). [1] L. Shemer and A. Sergeeva, J. Geophys. Res. Oceans 114, C01015 (2009).



Figure 1. Variation along the tank of the measured wave height distribution for the rectangular initial spectral shape; carrier wave period T0 = 1.5 s.
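
The Rayleigh benchmark against which the measured distributions are compared has a closed form. A minimal sketch, assuming the standard narrow-band scaling in which wave height is normalised by the standard deviation of surface elevation (the values of h are arbitrary):

```python
# Rayleigh exceedance probability for wave heights scaled by the standard
# deviation sigma of surface elevation: P(H > h) = exp(-h^2 / (8 sigma^2)).
import numpy as np

def rayleigh_exceedance(h_scaled):
    return np.exp(-np.asarray(h_scaled) ** 2 / 8.0)

for h in (4.0, 6.0, 8.0):   # h = H / sigma
    print(h, rayleigh_exceedance(h))
```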

NH43A-1293 Poster

Observing Natural Hazards: Tsunami, Hurricane, and El Niño Observations from the NDBC Ocean Observing System of Systems

O'Neil, K   (kathleen.oneil@noaa.gov), NOAA's National Data Buoy Center, Stennis Space Center, MS, United States
Bouchard, R   (richard.bouchard@noaa.gov), NOAA's National Data Buoy Center, Stennis Space Center, MS, United States
Burnett, W H (bill.burnett@noaa.gov), NOAA's National Data Buoy Center, Stennis Space Center, MS, United States
Aldrich, C   (charles.aldrich@noaa.gov), NOAA's National Data Buoy Center, Stennis Space Center, MS, United States

The National Oceanic and Atmospheric Administration's (NOAA) National Data Buoy Center (NDBC) operates and maintains the NDBC Ocean Observing System of Systems (NOOSS), comprising three networks that provide critical information before, during, and after extreme hazard events such as tsunamis, hurricanes, and El Niños. While each system has its own mission, they share the requirement to remain on station in remote areas of the ocean and to provide reliable and accurate observations. After the 2004 Sumatran tsunami, NOAA expanded its network of tsunameters from six in the Pacific Ocean to a network of 39 stations providing information to Tsunami Warning Centers, enabling faster and more accurate tsunami warnings for coastal communities in the Pacific, Atlantic, Caribbean, and the Gulf of Mexico. The tsunameter measurements are used to detect the amplitude and period of tsunamis, and the data can be assimilated into models for predicting the impact of tsunamis on coastal communities. The network has been used for the detection of tsunamis generated by earthquakes, including the 2006 and 2007 Kuril Islands, 2007 Peru, and 2007 Solomon Islands events, and most recently the 2009 Dusky Sound, New Zealand earthquake. In August 2009, NOAA adjusted its 2009 Atlantic Hurricane Seasonal Outlook from above-normal to near- or below-normal activity, primarily due to a strengthening El Niño. A key component in the detection of that El Niño was the Tropical Atmosphere Ocean (TAO) array operated by NDBC. TAO provides real-time data for improved detection, understanding, and prediction of El Niño and La Niña. The 55-buoy TAO array spans the central and eastern equatorial Pacific, providing real-time and post-deployment recovery data to support climate analysis and forecasts. Although, in this case, El Niño moderates hurricane activity in the tropical Atlantic, the alternate manifestation, La Niña, typically enhances it. The various phases of the El Niño-Southern Oscillation can result in extreme hazards such as floods and landslides, droughts and wildfires, and fish kills and other biological impacts. For almost 40 years, NDBC has operated and maintained a network of buoys and coastal automated stations for meteorological and oceanographic observations that support real-time weather analysis, forecasting, and warnings. The US National Hurricane Center (NHC) uses the observations from the buoys to detect the position and intensity of tropical cyclones and the extent of their extreme winds and seas. Since 2006, NHC has cited over 100 instances of using buoy data in its Forecast Discussions or Public Advisories. Data are also used in reconstructing and analyzing the extent of devastation from land-falling hurricanes. The unprecedented devastation caused by the rising waters of 2005's Hurricane Katrina was attributed to the waves generated and reported by the NDBC buoys in the Gulf of Mexico, superimposed upon the storm surge at landfall. The three constituent systems of the NOOSS comprise a network of more than 250 observing stations providing real-time and archived data for forecasters, scientists, and disaster management officials.

http://www.ndbc.noaa.gov/

NH43A-1294 Poster

A 450-year history of extreme floods in annually laminated sediment from pre-alpine Lake Ammersee (Southern Germany)

Czymzik, M   (markus@gfz-potsdam.de), 5.2, GFZ Helmholtz Centre Potsdam, Potsdam, Germany
Brauer, A   (brau@gfz-potsdam.de), 5.2, GFZ Helmholtz Centre Potsdam, Potsdam, Germany
Plessen, B   (birgit@gfz-potsdam.de), 5.2, GFZ Helmholtz Centre Potsdam, Potsdam, Germany
Dulski, P   (dulski@gfz-potsdam.de), 5.2, GFZ Helmholtz Centre Potsdam, Potsdam, Germany
von Grafenstein, U   (Ulrich.von-Grafenstein@lsce.ipsl.fr), Laboratoire des Sciences du Climat et de l’Environnement, Gif-sur-Yvette, France

Forecasting extreme events and their impacts on the human habitat requires a comprehensive understanding of the underlying physical processes and recurrence intervals. Since instrumental time series rarely exceed a century, geo-archives are adequate tools for examining such events on longer time scales. In particular, lakes with annually laminated (varved) sediments provide continuous high-resolution records of climate and environmental variability. Flood-triggered fluxes of detrital catchment material into these lakes provide long flood time series that can be precisely dated by counting annual layers. The pre-alpine Lake Ammersee (Southern Germany) is an ideal site for reconstructing long time series of flood frequencies because its annually laminated sediment profile allows precise dating and reliable detection of even microscopic layers through their sedimentological and geochemical characteristics. Furthermore, instrumental time series of local precipitation and runoff can be used for calibrating the palaeo-record. The existing high-resolution Holocene palaeotemperature reconstruction derived from ostracods in Lake Ammersee sediments facilitates the discussion of changes in flood frequency patterns in relation to changes in larger-scale climate boundary conditions. A novel methodological approach combining micro-facies analyses, high-resolution element scanning and stable isotope measurements allowed us to reconstruct a 450-year time series of detrital layers in two varved sediment cores from Lake Ammersee located 1.8 km apart. The seasonal occurrence of each layer was determined from its micro-stratigraphic position within a varve. The comparison of our record with measured runoff data from the main tributary, the River Ammer, for the last 73 years, together with the proximal-distal pattern of detrital layer thickness towards the Ammer river mouth, confirms the interpretation of these layers as flood-triggered. To better understand the effects of precipitation characteristics on major runoff events, we compared our flood layer record with continuous daily precipitation data from the Meteorological Observatory Hohenpeißenberg back to AD 1880. To investigate the role of atmospheric circulation patterns in flood frequencies, we compared our record with high-resolution atmospheric pressure field reconstructions of the last 450 years.

czymzik@gfz-potsdam.de

NH43A-1295 Poster

Geoelectrical Tomography as an Operative Tool for the Emergency Management of Landslides: An Application in the Basilicata Region, Italy

Colangelo, G   (gerardo.colangelo@regione.basilicata.it), Department of Infrastructure, Civil Protection, Potenza, Italy
Lapenna, V   (lapenna@imaa.cnr.it), CNR, IMAA, Potenza, Italy
Loperte, A   (loperte@imaa.cnr.it), CNR, IMAA, Potenza, Italy
Perrone, A   (perrone@imaa.cnr.it), CNR, IMAA, Potenza, Italy

A new approach has been applied to investigate several landslides of recent genesis in the Basilicata region (southern Italy); in particular, a geophysical technique has been used to study the landslide bodies. The electrical resistivity tomography (ERT) method has been applied to obtain information about the deep characteristics of the landslide bodies. The high resolution of the 2D ERTs made it possible to locate the probable sliding surfaces of the landslide bodies. The tomograms also highlighted areas characterized by high water content; the increase in saturation degree and pore pressure in these areas could have weakened the slopes. The information obtained from such indirect surveys appears to be particularly useful for end users involved in risk management. In particular, considering the cycle of landslide emergencies, the data can make a valid contribution during the post-event phase, which mainly concerns damage assessment. Indeed, only a correct assessment of the damage and a precise geometric reconstruction of the landslide body can direct the intervention actions of the end users. The results represent a valid cognitive support for choosing the most appropriate technical solution for strengthening the slopes, and an example of best practice for cooperation between research (IMAA-CNR) and field emergency response (Basilicata Civil Protection).

NH43A-1296 Poster

Radon and Helium as productive tools for earthquake precursory and fault delineation studies in NW Himalayas, India: An overview

Bajwa, B   (bsbajwa@excite.com), Physics, Guru Nanak Dev University, Amritsar, India
Mahajan, S   (mahajansandeep21@gmail.com), Physics, Guru Nanak Dev University, Amritsar, India
Walia, V   (vivekwalia@redifmail.com), National Center for Research on Earthquake Engineering, Taipei, Taiwan
Kumar, A   (aruphy2004@yahoo.com), Physics, Guru Nanak Dev University, Amritsar, India
Singh, S   (surinder_s9151@yahoo.com), Physics, Guru Nanak Dev University, Amritsar, India
Yang, T F (tyyang@ntu.edu.tw), Geosciences, National Taiwan University, Taipei, Taiwan

To determine the role of radon and helium as productive tools for fault delineation and earthquake precursory studies, continuous measurements are being made in soil-gas and groundwater in the NW Himalayas, India. The area under study is seismically active and falls in the high seismic zones IV and V of the Seismic Map of India. The NW Himalayas are tectonically active due to the northward movement of the Indian plate towards the Eurasian plate, and the frequent occurrence of small-magnitude earthquakes indicates that the area is under unusually high stress and strain. The temporal variations of the radon concentration in soil-gas and groundwater are continuously monitored at three stations, viz. Amritsar (Zone IV), Dharamsala (Zone V) and Dalhousie (Zone IV), using Barasol probes (Algade, France) and a RAD7 (Durridge, USA), respectively. The radon anomalies in the data are correlated with micro-seismic events recorded along the Main Boundary Thrust (MBT) and Main Central Thrust (MCT) of the NW Himalayas within the grid (28-34° North, 72-79° East). The anomalous changes in radon concentration before an event suggest that continuous radon monitoring in a grid pattern can serve as a productive tool in earthquake prediction studies. The MCT and MBT are associated with the evolution of the Himalayan orogeny. Besides the longitudinal lineaments, several transverse lineaments occur as faults and fractures trending normal or oblique to the Himalayan trend. With this in view, geochemical soil-gas surveys have been conducted in the NW Himalayas. For the present investigation, soil-gas samples were collected in sample bags at depths of about 0.7-1.0 m using a hollow steel probe. The collected samples were analyzed for radon and helium using an RTM 2100 and a Helium Leak Sniff Detector, respectively. The data analysis clearly reveals anomalous values of subsurface gases along the faults and lineaments.

NH43A-1297 Poster

Development of web-based services for a novel ensemble flood forecasting and risk assessment system

He, Y   (yi.he@kcl.ac.uk), Geography, King's College London, London, United Kingdom
Manful, D Y (desmond.manful@kcl.ac.uk), Geography, King's College London, London, United Kingdom
Cloke, H L (hannah.cloke@kcl.ac.uk), Geography, King's College London, London, United Kingdom
Wetterhall, F   (fredrik.wetterhall@kcl.ac.uk), Geography, King's College London, London, United Kingdom
Li, Z   (zjli@hhu.edu.cn), Hydrology and Water Resources, Hohai University, Nanjing, China
Bao, H   (icehot@hhu.edu.cn), Hydrology and Water Resources, Hohai University, Nanjing, China
Pappenberger, F   (florian.pappenberger@ecmwf.int), European Centre for Medium-Range Weather Forecasts (ECMWF), Reading, United Kingdom
Wesner, S   (wesner@hlrs.de), Applications & Visualization, High Performance Computing Center Stuttgart, Stuttgart, Germany
Schubert, L   (schubert@hlrs.de), Department Intelligent Service Infrastructures, High Performance Computing Center Stuttgart, Stuttgart, Germany
Yang, L   (Liqun.yang@kcl.ac.uk), Geography, King's College London, London, United Kingdom
Hu, Y   (jxxbc@163.com), Hydrological Bureau of Anhui Province, Hefei, China

Flooding is a widespread and devastating natural disaster worldwide. Floods that took place in the last decade in China ranked among the worst recorded floods worldwide in terms of human fatalities and economic losses (Munich Re-Insurance). Rapid economic development and population expansion into low-lying flood plains have worsened the situation. Current conventional flood prediction systems in China are suited neither to the perceptible climate variability nor to the rapid pace of urbanization sweeping the country. Flood prediction, from short-term (a few hours) to medium-term (a few days), needs to be revisited and adapted to changing socio-economic and hydro-climatic realities. The state of the art requires the implementation of multiple numerical weather prediction systems. The availability of twelve global ensemble weather prediction systems through the 'THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a good opportunity for an effective state-of-the-art early forecasting system. A prototype of a Novel Flood Early Warning System (NEWS) using the TIGGE database is tested in the Huai River basin in east-central China. It is the first early flood warning system in China that uses the massive TIGGE database, cascaded with river catchment models (the Xinanjiang hydrologic model and a 1-D hydraulic model), to predict river discharge and flood inundation. The NEWS algorithm is also designed to provide web-based services to a broad spectrum of end-users. The latter presents challenges, as databases and proprietary codes reside in different locations and converge at dissimilar times. NEWS will thus make use of a ready-to-run grid system that makes distributed computing and data resources available in a seamless and secure way. The ability to run on different operating systems and to provide an interface that is accessible to a broad spectrum of end-users is an additional requirement. The aim is to achieve robust interoperability through strong security and workflow capabilities. A physical network diagram and a workflow scheme of all the models, codes and databases used to realise the NEWS algorithm are presented. They constitute a first step in the development of a platform for providing real-time flood forecasting services on the web to mitigate 21st-century weather phenomena.
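
The cascade described above (ensemble precipitation driving a rainfall-runoff model whose discharge feeds a hydraulic model) can be sketched schematically. The placeholder functions below merely stand in for the Xinanjiang and 1-D hydraulic models; the runoff coefficient, rating-curve constants and 0.25 m flood threshold are invented:

```python
# Schematic ensemble forecast cascade: precipitation -> runoff -> stage.
from statistics import mean

def runoff_model(precip_mm, coefficient=0.6):
    """Placeholder rainfall-runoff step: discharge from total rainfall."""
    return coefficient * sum(precip_mm)

def hydraulic_model(discharge):
    """Placeholder rating curve: river stage (m) from discharge."""
    return 0.01 * discharge ** 0.8

def cascade(ensemble_precip, flood_stage_m=0.25):
    stages = [hydraulic_model(runoff_model(m)) for m in ensemble_precip]
    p_exceed = sum(s > flood_stage_m for s in stages) / len(stages)
    return stages, p_exceed

ensemble = [[12, 30, 8], [40, 55, 20], [5, 10, 2]]   # mm per time step, per member
stages, p_flood = cascade(ensemble)
print(mean(stages), p_flood)
```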

NH43A-1298 Poster

Ice jam flooding: a location prediction model

Collins, H A (geographerH@gmail.com), Geography, State University of New York at Buffalo, Buffalo, NY, United States

Flooding created by ice jamming is a climatically dependent natural hazard that frequently affects cold regions with disastrous results. Basic known physical characteristics that combine in the landscape to create an ice jam flood are modeled for the Cattaraugus Creek watershed, located in western New York State. Terrain analysis of topographic and built-environment features is conducted using Geographic Information Systems in order to predict the locations of ice jam flooding events. The purpose of this modeling is to establish a broadly applicable watershed-scale model for predicting the probable locations of ice jam flooding.
[Figure: locations of historic ice jam flooding events]
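
A hedged sketch of the kind of raster overlay such a GIS analysis might perform (the layers, weights, and thresholds below are hypothetical and synthetic, not values from the study):

    # Hypothetical raster-overlay scoring of ice jam susceptibility.
    # A real analysis would use DEM- and survey-derived layers; these are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    shape = (100, 100)                              # grid cells over the watershed
    channel_slope = rng.uniform(0.0, 0.05, shape)   # m/m
    channel_width = rng.uniform(5.0, 60.0, shape)   # m
    near_structure = rng.random(shape) < 0.02       # built-environment obstruction

    # Ice jams favor low-gradient, constricted reaches and obstructions.
    score = ((channel_slope < 0.005).astype(int)
             + (channel_width < 15.0).astype(int)
             + near_structure.astype(int))
    likely = score >= 2                             # simple threshold rule
    print(f"{likely.sum()} cells flagged as probable ice jam locations")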

NH43A-1299 Poster

Analysis of Prediction Uncertainty of Vs30 in Southern California

Thompson, M   (thomp518@unr.nevada.edu), Nevada Seismological Laboratory, University of Nevada, Reno, NV, United States
Louie, J N (louie@seismo.unr.edu), Nevada Seismological Laboratory, University of Nevada, Reno, NV, United States
Dhar, M S (mahesh@seismo.unr.edu), Nevada Seismological Laboratory, University of Nevada, Reno, NV, United States
Pancha, A   (pancha@seismo.unr.edu), Optim SDS, Reno, NV, United States
Pullammanappallil, S   (satish@optimsoftware.com), Optim SDS, Reno, NV, United States
Yong, A K (yong@usgs.gov), Earthquake Hazards Team, U.S. Geological Survey, Pasadena, CA, United States

We have chosen 391 direct Vs30 measurements, made by the University of Nevada, Reno and Optim or drawn from the Next Generation Attenuation database, to evaluate uncertainties in, and the likelihood of, potentially hazardous soil classes. We also divide the 391 direct Vs30 measurements into “soil,” “basin,” or “rock” categories, based on the site-class map assembled by Wills and others (2000 BSSA), to evaluate the change in uncertainty in subsets of the accumulated data. Statistical analysis shows that the dataset closely fits a log-normal distribution. Using the kriging method with a theoretical variogram, we create several first-order maps of Vs30, its spatial variance, and the likelihood of NEHRP soil classes D and E. The maps created from the log-transformed Vs30 data show >60% likelihood of NEHRP site class E in limited parts of the Los Angeles Basin and in the Imperial Valley. As expected, NEHRP site class D shows >80% probability in these and many additional areas. The RMS error of the Vs30 maps, which ranges from 58 m/s to 90 m/s across the subsets and the complete set, is smallest for the “soil” subset. For the “basin” subset, the kriged spatial variance is at a minimum for the gridded area. The subset with the highest fractal dimension, kriged spatial variance, and RMS error is the “rock” subset, which we attribute to its relatively small size. The fractal dimension, which can be related to correlation length, is smallest for the complete data set. We believe that additional direct measurements, and a denser distribution of measurements, will decrease these prediction uncertainties.

www.seismo.unr.edu/hazsurv
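
As a minimal sketch of how site-class likelihoods follow from a log-normal Vs30 model (the kriged mean and standard deviation of ln(Vs30) below are hypothetical; 180 and 360 m/s are the standard NEHRP E/D class boundaries):

    # Probability of NEHRP site classes at one grid node, given kriged
    # estimates of the mean and std. dev. of ln(Vs30); values illustrative.
    import math

    def phi(z):
        """Standard normal CDF."""
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    mu, sigma = math.log(220.0), 0.35                            # hypothetical
    p_class_E = phi((math.log(180.0) - mu) / sigma)              # Vs30 < 180 m/s
    p_class_D = phi((math.log(360.0) - mu) / sigma) - p_class_E  # 180-360 m/s
    print(f"P(class E) = {p_class_E:.2f}, P(class D) = {p_class_D:.2f}")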

NH43A-1300 Poster

Climate Change and Sea Level Rise: A Challenge to Science and Society

Plag, H   (hpplag@unr.edu), Nevada Bureau of Mines and Geology and Seismological Laboratory, University of Nevada, Reno, Reno, NV, United States

Society is challenged by the risk of an anticipated rise of coastal Local Sea Level (LSL) as a consequence of future global warming. Many low-lying, often subsiding, and densely populated coastal areas are at risk of increased inundation, with potentially devastating consequences for the global economy, society, and environment. Faced with a trade-off between imposing the very high costs of coastal protection and adaptation upon today's national economies and leaving the costs of potential major disasters to future generations, governments and decision makers need scientific support for the development of mitigation and adaptation strategies for the coastal zone. Low-frequency to secular changes in LSL are the result of many interacting Earth system processes. The complexity of the Earth system makes it difficult to predict Global Sea Level (GSL) rise and, even more so, LSL changes over the next 100 to 200 years. Humans have re-engineered the planet and changed major features of the Earth's surface and atmosphere, thus ruling out extrapolation of past and current changes into the future as a reasonable approach. The risk of rapid changes in ocean circulation and ice sheet mass balance introduces the possibility of unexpected changes. Therefore, science is challenged with understanding and constraining the full range of plausible future LSL trajectories and with providing useful support for informed decisions. In the face of largely unpredictable future sea level changes, monitoring of the relevant processes and development of a forecasting service on realistic time scales are crucial for decision support. Forecasting and "early warning" for LSL rise would have to aim at decadal time scales, giving coastal managers sufficient time to react if the onset of rapid changes requires an immediate response. The social, environmental, and economic risks associated with potentially large and rapid LSL changes are enormous. Therefore, in light of the current uncertainties and the unpredictable nature of some of the forcing processes for LSL changes, the focus of scientific decision support may have to shift from projections of LSL trajectories on century time scales to the development of models and monitoring systems for a forecasting service on decadal time scales. The requirements for such an LSL forecasting service and the current obstacles will be discussed.

NH43A-1301 Poster

Does Tropical Cyclone Modification Make Sense? A Decision-Analytic Assessment

Klima, K   (kklima@andrew.cmu.edu), Carnegie Mellon University, Pittsburgh, PA, United States
Morgan, M G (gm5d@andrew.cmu.edu), Carnegie Mellon University, Pittsburgh, PA, United States
Grossmann, I   (irisg@andrew.cmu.edu), Carnegie Mellon University, Pittsburgh, PA, United States

Since the demise of Project Stormfury in 1983, little attention has been devoted to the possibility of intentionally modifying tropical cyclones (TC). However, following Hurricane Katrina and three other Category 5 hurricanes (Emily, Rita, and Wilma), which together resulted in at least 2,280 deaths and over $120 billion in damages (Blake et al., 2007), the U.S. Department of Homeland Security (DHS) has recently begun to support an effort to identify and evaluate hurricane mitigation strategies through Project HURRMIT (http://www.ofcm.noaa.gov/ihc09/Presentations/Session10/s10-01Woodley.ppt). Using a decision-analytic framing and FEMA's HAZUS-MH MR3 damage model (http://www.fema.gov/plan/prevent/hazus/), this paper asks how sure one must be that an intervention will reduce TC damages before choosing to undertake a program of modification. The analysis is formulated in probabilistic terms and assesses net benefits. In contrast to a much earlier application of decision analysis to TC modification (Howard et al., 1972), this work uses census data on the value of property at risk, and prior distributions on changing storm behavior based on data from hurricanes approaching the east coast of Florida since 1953. Even before considering both the issues of liability that may arise from the fact that a modified storm is no longer "an act of God" and the unforeseen environmental consequences, our results suggest that while TC modification techniques will likely alter TC behavior, one would have to be significantly more confident of the predictability and effectiveness of modification methods before their use could be justified. This work is supported by the Climate Decision Making Center through a cooperative agreement between the National Science Foundation (SES-0345798) and Carnegie Mellon University.
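
A highly simplified sketch of the decision-analytic logic (all numbers are hypothetical placeholders; the actual study rests on the HAZUS-MH damage model and empirical priors on storm behavior):

    # Stylized expected-net-benefit test for a storm-modification decision.
    expected_damage = 20e9     # expected unmodified TC damage, dollars (hypothetical)
    effectiveness = 0.10       # fractional damage reduction if seeding works
    p_works = 0.50             # confidence that modification works as intended
    program_cost = 50e6        # cost of mounting the modification effort
    expected_liability = 1e9   # exposure if a modified storm is no longer "an act of God"

    expected_benefit = p_works * effectiveness * expected_damage
    net = expected_benefit - program_cost - expected_liability
    print(f"Expected net benefit: ${net / 1e9:.2f}B ->",
          "modify" if net > 0 else "do not modify")

With these placeholder values, the expected benefit does not cover cost and liability, mirroring the abstract's conclusion that much greater confidence in the predictability and effectiveness of modification would be needed to justify its use.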

NH43A-1302 Poster

The Role of Earth Science in Oregon’s Tsunami Preparedness (Invited)

Priest, G R (george.priest@dogami.state.or.us), Newport Coastal Field Office, Oregon Dept. of Geology and Mineral Industries, Newport, OR, United States

Earth science played a critical role in understanding the scope of Oregon’s tsunami hazard. When, in the early 1990s, earth scientists communicated to stakeholders the seriousness of the threat posed by local Cascadia subduction zone tsunamis, tsunami preparedness began to rise in priority at all levels of government. Hard field evidence in the form of prehistoric tsunami deposits was a critical component in making the hazard “real” to local governments. State-produced tsunami inundation maps derived from numerical simulations gave decision makers and educators reliable tools to illustrate the spatial scope of the hazard. These maps allowed local cities to plan for evacuation and empowered the State of Oregon to begin “hard” mitigation by limiting new construction of critical facilities seaward of a regulatory inundation line. “Entering” and “Leaving” tsunami hazard zone signs were placed along the Oregon Coast Highway where it dips below this inundation line as a means of raising awareness among both the local and transient populations. When detailed inundation studies and derivative evacuation maps were produced for individual communities, State scientists sought advice from local officials at every stage, giving them ownership of the final products. This sense of ownership gave decision makers much greater confidence in the maps and turned many skeptics into passionate advocates. This network of advocates has, over time, resulted in local jurisdictions taking substantive preparedness actions such as replacing critical evacuation bridges, starting networks of emergency response volunteers, and moving critical structures like schools and fire stations. One area where earth science has some difficulty is communicating probability and uncertainty. For example, the State of Oregon is currently producing new maps that depict the uncertainty of tsunami flooding from a future Cascadia subduction zone earthquake. These maps show a range of inundation lines that reflect the relative confidence level (percentage) that a local Cascadia tsunami will NOT exceed each line. In the first of these studies, at Cannon Beach, Oregon (Priest et al., 2009), the 90th percentile flood level was only about half to two-thirds as high as the 99th percentile. On the northern Oregon coast Cascadia recurrence is ~500 years, so a percentile map depicts the spatial uncertainty of inundation for that event. A Cascadia tsunami approximating the 99th percentile confidence level is no doubt a rare event, but how rare we really do not know; offshore turbidite data suggest that only one of these extreme events may have occurred in the last 10,000 years. When the map and underlying data were presented to local officials, they had some difficulty understanding how to use the information. Erring on the side of caution, they chose the 99th percentile line for evacuation planning, but this decision greatly limited the available evacuation sites. Cost may make a similarly conservative decision inappropriate for building codes or for the design of vertical evacuation structures. REFERENCE: Priest, G.R.; Goldfinger, C.; Wang, K.; Witter, R.C.; Zhang, Y.; Baptista, A.M. (2009). Tsunami hazard assessment of the northern Oregon coast: a multi-deterministic approach tested at Cannon Beach, Clatsop County, Oregon. Oregon Dept. of Geology and Mineral Industries Special Paper 41.
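
A minimal sketch of how such percentile non-exceedance levels can be derived from an ensemble of simulations (the run-up values below are synthetic, not the Cannon Beach results):

    # Percentile (non-exceedance) flood levels from an ensemble of simulated
    # tsunami run-ups at one coastal site; the ensemble here is synthetic.
    import numpy as np

    rng = np.random.default_rng(7)
    runups_m = rng.lognormal(mean=2.0, sigma=0.5, size=200)

    p90, p99 = np.percentile(runups_m, [90, 99])
    print(f"90% confidence of non-exceedance: {p90:.1f} m")
    print(f"99% confidence of non-exceedance: {p99:.1f} m")
    print(f"p90/p99 ratio: {p90 / p99:.2f}")   # cf. the ~0.5-0.67 ratio at Cannon Beach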

NH43A-1303 Poster [WITHDRAWN]

Natural Hazard Assessment and Communication in the Central United States

Wang, Z   (zmwang@uky.edu), Kentucky Geological Survey, University of Kentucky, Lexington, KY, United States
Lynch, M J (mike.lynch@uky.edu), Kentucky Geological Survey, University of Kentucky, Lexington, KY, United States

In the central United States, natural hazards such as floods, tornados, ice storms, droughts, and earthquakes result in significant damage and loss of life every year. For example, the February 5-6, 2008, tornado outbreak touched down in nine states (including Alabama, Arkansas, Illinois, Indiana, Kentucky, Mississippi, Missouri, and Tennessee), killing 57, injuring 350, and causing more than $1.0 billion in damages. The January 2009 ice storm struck Arkansas, Illinois, Indiana, Kentucky, Missouri, Ohio, Tennessee, and West Virginia, killing 36 and causing more than $1.0 billion in damages. Developing an effective policy for mitigating these natural hazards in the central United States is a great challenge for society, and such a policy starts with a good assessment of the hazards. Scientists play a key role in assessing natural hazards and, therefore, in the development of effective mitigation policy. It is critical for scientists to clearly define, quantify, and communicate hazard assessments to end-users, including the associated uncertainties, which are a key factor in policy decision making. Otherwise, end-users will have difficulty understanding and using the information provided. For example, ground motion hazard maps with 2, 5, and 10 percent probabilities of exceedance (PE) in 50 years have been produced for the central United States for seismic hazard mitigation purposes. End-users have had difficulty understanding and using these maps, however, which has led to either indecision or ineffective policy for seismic hazard mitigation in many communities in the central United States.
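
For readers unfamiliar with the probability-of-exceedance convention, the standard Poisson-model conversion from a probability p of exceedance in t years to a mean return period is T = -t / ln(1 - p), as the short sketch below shows (the three PE levels are those named in the abstract):

    # Convert "probability p of exceedance in t years" to a mean return
    # period under the usual Poisson assumption: T = -t / ln(1 - p).
    import math

    def return_period(p, t=50.0):
        return -t / math.log(1.0 - p)

    for p in (0.02, 0.05, 0.10):
        print(f"{p:.0%} PE in 50 yr -> ~{return_period(p):,.0f}-yr return period")
    # 2% -> ~2,475 yr; 5% -> ~975 yr; 10% -> ~475 yr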

NH43A-1304 Poster

Towards an Integrated Approach for Reduction of Seismic Loss

Askan, A   (aaskan@andrew.cmu.edu), Civil Engineering and Earthquake Engineering Research Center, Middle East Technical Univ, Ankara, Turkey
Ugurhan, B   (ugurhan@metu.edu.tr), Civil Engineering and Earthquake Engineering Research Center, Middle East Technical Univ, Ankara, Turkey
Erberik, M A (altug@metu.edu.tr), Civil Engineering and Earthquake Engineering Research Center, Middle East Technical Univ, Ankara, Turkey
Yucemen, S   (yucemen@metu.edu.tr), Civil Engineering and Earthquake Engineering Research Center, Middle East Technical Univ, Ankara, Turkey

Earthquakes are among the most destructive natural hazards worldwide. By applying several well-established principles, however, it is possible to identify and reduce the resulting social and economic losses. Evaluation of regional seismicity is an essential first step in mitigating potential seismic losses. Existing methodologies for seismic risk estimation involve the following key steps: seismic hazard estimation, site response analyses, and building vulnerability assessment. Seismic hazard estimation techniques traditionally employ existing ground motion prediction equations. However, in regions of high seismicity with sparse seismic recordings, it is essential to employ scenario-based seismic hazard quantification based on ground motion simulations, along with region-specific site response and building vulnerability estimation. In this study, we present our initial attempts towards an integrated, region-specific seismic loss estimation approach that takes geophysical, geotechnical, and structural information into account. We present results in the form of spatial distributions of ground motion intensity parameters, ground motion prediction equations, and building fragility curves based on finite-fault simulations of large earthquakes on the North Anatolian Fault zone (Turkey).
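
As a hedged illustration of the building-vulnerability component (lognormal fragility curves are the common functional form in this field; the median capacity and dispersion below are hypothetical, not values from this study):

    # Lognormal fragility curve: probability that a structure reaches or
    # exceeds a damage state given a ground-motion intensity measure (IM):
    #   P(DS >= ds | IM = x) = Phi(ln(x / theta) / beta)
    import math

    def fragility(im, theta, beta):
        """theta: median capacity; beta: lognormal dispersion."""
        return 0.5 * (1.0 + math.erf(math.log(im / theta) / (beta * math.sqrt(2.0))))

    theta, beta = 0.45, 0.6   # hypothetical median PGA (g) and dispersion
    for pga in (0.1, 0.3, 0.5, 0.8):
        print(f"PGA = {pga:.1f} g -> P(damage) = {fragility(pga, theta, beta):.2f}")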

NH43A-1305 Poster

ARkStorm: A West Coast Storm Scenario

Cox, D A (dacox@usgs.gov), USGS, Sacramento, CA, United States
Jones, L M (jones@usgs.gov), USGS, Pasadena, CA, United States
Ralph, F M (marty.ralph@noaa.gov), Environmental Systems Research Laboratory, NOAA, Boulder, CO, United States
Dettinger, M D (mddettin@usgs.gov), USGS, La Jolla, CA, United States
Porter, K   (keith@cohen-porter.net), University of Colorado, Boulder, CO, United States
Perry, S C (scperry@usgs.gov), USGS, Pasadena, CA, United States
Barnard, P L (pbarnard@usgs.gov), USGS, Santa Cruz, CA, United States
Hoover, D   (dhoover@usgs.gov), USGS, Santa Cruz, CA, United States
Wills, C J (cwills@consrv.ca.gov), California Geological Survey, Sacramento, CA, United States
Stock, J D (jstock@usgs.gov), USGS, Menlo Park, CA, United States
Croyle, W   (wcroyle@water.ca.gov), California Department of Water Resources, Sacramento, CA, United States
Ferris, J C (jcferris@usgs.gov), USGS, Sacramento, CA, United States
Plumlee, G S (gplumlee@usgs.gov), USGS, Denver, CO, United States
Alpers, C N (cnalpers@usgs.gov), USGS, Sacramento, CA, United States
Miller, M   (mitchell.miller@oes.ca.gov), California Emergency Management Agency, Sacramento, CA, United States
Wein, A   (awein@usgs.gov), USGS, Menlo Park, CA, United States
Rose, A   (adam.rose@usc.edu), University of Southern California, Los Angeles, CA, United States
Done, J   (done@ucar.edu), National Center for Atmospheric Research, Boulder, CO, United States
Topping, K   (KenTopping@aol.com), California State Polytechnic University, San Luis Obispo, CA, United States

The United States Geological Survey (USGS) Multi-Hazards Demonstration Project (MHDP) is preparing a new emergency-preparedness scenario, called ARkStorm, to address massive U.S. West Coast storms analogous to those that devastated California in 1861-62. Storms of this magnitude are projected to become more frequent and intense as a result of climate change. The MHDP has assembled experts from the National Oceanic and Atmospheric Administration (NOAA), USGS, Scripps Institution of Oceanography, the State of California, the California Geological Survey, the University of Colorado, the National Center for Atmospheric Research, and other organizations to design a large, but scientifically plausible, hypothetical scenario storm that will provide emergency responders, resource managers, and the public a realistic assessment of what is historically possible. The ARkStorm scenario is patterned on the 1861-1862 historical events but uses modern modeling methods and data from large storms in 1969 and 1986. The ARkStorm draws heat and moisture from the tropical Pacific, forming Atmospheric Rivers (ARs) that grow in size, gain speed, and, with a ferocity equal to hurricanes, slam into the U.S. West Coast for several weeks. Using sophisticated weather models, expert analysis, and data on precipitation, snowlines, wind, and pressure, the modelers will characterize the resulting floods, landslides, and coastal erosion and inundation. These hazards will then be translated into infrastructural, environmental, agricultural, social, and economic impacts. Consideration will be given to catastrophic disruptions to water supplies resulting from impacts on groundwater pumping, seawater intrusion, water supply degradation, and land subsidence. Possible climate-change effects that could exacerbate the problems will also be evaluated. In contrast to the recent U.S. East and Gulf Coast hurricanes, only recently have scientific and technological advances documented the ferocity and strength of possible future West Coast storms. A task of ARkStorm is to elevate the visibility of the very real threats to human life, property, and ecosystems posed by extreme storms on the U.S. West Coast. This enhanced visibility will help increase the preparedness of the emergency management community and the public for such storms. ARkStorm is scheduled to be completed by September 2010 and will be the basis of a statewide emergency response drill, Golden Guardian, led by the California Emergency Management Agency in 2011.

NH43A-1306 Poster

Hazard Science in Support of Community Resiliency: The Response of the Multi Hazards Demonstration Project to the 2009 Station Fire in Los Angeles County

Jones, L M (jones@usgs.gov), Multi Hazards, U.S. Geological Survey, Pasadena, CA, United States
Bawden, G W (gbawden@usgs.gov), Multi Hazards, U.S. Geological Survey, Pasadena, CA, United States
Bowers, J   (jcbowers@usgs.gov), California Water Science Center, U.S. Geological Survey, San Diego, CA, United States
Cannon, S   (cannon@usgs.gov), Geologic Hazards Science Center, U.S. Geological Survey, Golden, CO, United States
Cox, D A (dacox@usgs.gov), Multi Hazards, U.S. Geological Survey, Pasadena, CA, United States
Fisher, R   (rfisher@usgs.gov), Western Ecological Research Center, U.S. Geological Survey, Sacramento, CA, United States
Keeley, J   (jon_keeley@usgs.gov), Western Ecological Research Center, U.S. Geological Survey, Sacramento, CA, United States
Perry, S C (scperry@usgs.gov), Multi Hazards, U.S. Geological Survey, Pasadena, CA, United States
Plumlee, G S (gplumlee@usgs.gov), Minerals Science Center, U.S. Geological Survey, Denver, CO, United States
Wood, N J (nwood@usgs.gov), Cascades Volcano Observatory, U.S. Geological Survey, Vancouver, WA, United States

The “Station” fire, the largest fire in the history of Los Angeles County in southern California, began on August 26, 2009, and as of the abstract deadline had burned over 150,000 acres of the Angeles National Forest. This fire creates both a demand and an opportunity for hazards science to be used by the communities directly hit by the fire, as well as those downstream of possible postfire impacts. The Multi Hazards Demonstration Project of the USGS is deploying several types of scientific response, including 1) evaluation of potential debris-flow hazards and associated risk, 2) monitoring of physical conditions in burned areas and the hydrologic response to rainstorms, 3) increased streamflow monitoring, 4) analysis of ash and of potential groundwater contamination, 5) ecosystem response and endangered species rescue, and 6) lidar data acquisition for evaluation of biomass loss, detailed mapping of the physical processes that lead to debris-flow generation, and other geologic investigations. The Multi Hazards Demonstration Project is working with the southern California community to use the resulting information to better manage the social consequences of the fire and its secondary hazards. In particular, we are working with Los Angeles County to determine what information they need to prioritize recovery efforts. For instance, maps of debris-flow potential can help identify the highest-priority areas for debris-flow mitigation efforts. These same maps, together with ecosystem studies, will help land managers determine whether individuals from endangered species should be moved to zoos or other refuges during the rainy months. The ash analysis will help water managers prevent contamination of water supplies. Plans are just beginning for a public information campaign with Los Angeles County, to be underway in December, about the risk posed by potential debris flows. Activities from the fire response will support the development of the Wildfire Scenario in 2011, which will examine the implications of land-use decisions for the frequency of fires in southern California.