An experimental environmental watch system, the Observatorio Andino (OA), has been implemented in Venezuela, Colombia, Ecuador, Peru, Bolivia, and Chile over the past two years. The OA is a collaborative regional network that aims to monitor several environmental variables and develop accurate forecasts based on different scientific tools. Its overall goal is to improve risk assessments, set up early warning systems, support decision-making processes, and provide intuitive, easily understood spatial maps to end users. The initiative works under the scientific and logistic coordination of the Centro de Modelado Científico (CMC) at Zulia University, Venezuela, and the Centro Internacional para la Investigación del Fenómeno El Niño (CIIFEN), and is operated at the local level by the National Weather Services (NWSs) of the six Andean nations listed above. The OA provides several freely available model outputs, including meteorological and hydrological forecasts; drought, fire, and flood indices; ecosystem dynamics; climate and health applications; and five-day high-resolution oceanographic predictions for the eastern Pacific. This article briefly describes the current products, methodologies, and dynamical and statistical modeling outputs provided by the OA, and discusses how these tools have been assembled into a coordinated scientific watch and forecast system for each country and for the region as a whole. Our experience over the past two years suggests that this initiative could significantly improve current decision-making processes in Andean countries. Comment: 20 pages, 5 figures, sent to BAMS
Determining how El Niño and its impacts may change over the next 10 to 100 years remains a difficult scientific challenge. Ocean–atmosphere coupled general circulation models (CGCMs) are routinely used both to analyze El Niño mechanisms and teleconnections and to predict its evolution on a broad range of time scales, from seasonal to centennial. The ability of these models to simulate El Niño as an emergent property has improved markedly over the last few years. Nevertheless, the diversity of model simulations of present-day El Niño indicates current limitations in our ability to model this climate phenomenon and to anticipate changes in its characteristics. A review of the several factors that contribute to this diversity, as well as potential means to improve the simulation of El Niño, is presented.
The international field campaign called the Convective and Orographically-induced Precipitation Study (COPS) took place from June to August 2007 in southwestern Germany/eastern France. The overarching goal of COPS is to advance the quality of forecasts of orographically-induced convective precipitation by four-dimensional observations and modeling of its life cycle. COPS was endorsed as one of the Research and Development Projects of the World Weather Research Program (WWRP), and combines the efforts of institutions and scientists from eight countries. A strong collaboration between instrument principal investigators and experts on mesoscale modeling has been established within COPS. In order to study the relative importance of large-scale and small-scale forcing leading to convection initiation in low mountains, COPS is coordinated with a one-year General Observations Period in central Europe, the WWRP Forecast Demonstration Project MAP D-PHASE, and the first summertime European THORPEX Regional Campaign. Furthermore, the Atmospheric Radiation Measurement program Mobile Facility operated in the central COPS observing region for nine months in 2007. The article describes the scientific preparation of this project and the design of the observation systems. COPS will rest on three pillars: a unique synergy of observing systems, the next-generation high-resolution mesoscale models with improved model physics, and advanced data assimilation and ensemble prediction systems. These tools will be used to separate and to quantify errors in quantitative precipitation forecasting as well as to study the predictability of convective precipitation.
The collective representation within global models of aerosol, cloud, precipitation, and their radiative properties remains unsatisfactory. These processes constitute the largest source of uncertainty in predictions of climatic change and hamper the ability of numerical weather prediction models to forecast high-impact weather events. The joint European Space Agency (ESA)–Japan Aerospace Exploration Agency (JAXA) Earth Clouds, Aerosol and Radiation Explorer (EarthCARE) satellite mission, scheduled for launch in 2018, will help to resolve these weaknesses by providing global profiles of cloud, aerosol, precipitation, and associated radiative properties inferred from a combination of measurements made by its collocated active and passive sensors. EarthCARE will improve our understanding of cloud and aerosol processes by extending the invaluable dataset acquired by the A-Train satellites CloudSat, Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), and Aqua. Specifically, EarthCARE’s cloud profiling radar, with 7 dB more sensitivity than CloudSat, will detect more thin clouds, and its Doppler capability will provide novel information on convection and on precipitating ice particle and raindrop fall speeds. EarthCARE’s 355-nm high-spectral-resolution lidar will measure cloud and aerosol extinction and optical depth directly and accurately. Combining this with backscatter and polarization information should lead to an unprecedented ability to identify aerosol type. The multispectral imager will provide a context for, and the ability to construct, the cloud and aerosol distribution in 3D domains around the narrow 2D retrieved cross section. The consistency of the retrievals will be assessed to within a target of ±10 W m–2 on the (10 km)2 scale by comparing the multiview broadband radiometer observations to the top-of-atmosphere fluxes estimated by 3D radiative transfer models acting on retrieved 3D domains.
Climate Data Records (CDRs) of Essential Climate Variables (ECVs) as defined by the Global Climate Observing System (GCOS) derived from satellite instruments help to characterize the main components of the Earth system, to identify the state and evolution of its processes, and to constrain the budgets of key cycles of water, carbon and energy. The Climate Change Initiative (CCI) of the European Space Agency (ESA) coordinates the derivation of CDRs for 21 GCOS ECVs. The combined use of multiple ECVs for Earth system science applications requires consistency between and across their respective CDRs. As a comprehensive definition for multi-ECV consistency is missing so far, this study proposes defining consistency on three levels: (1) consistency in format and metadata to facilitate their synergetic use (technical level); (2) consistency in assumptions and auxiliary datasets to minimize incompatibilities among datasets (retrieval level); and (3) consistency between combined or multiple CDRs within their estimated uncertainties or physical constraints (scientific level). Analysing consistency between CDRs of multiple quantities is a challenging task and requires coordination between different observational communities, which is facilitated by the CCI program. The inter-dependencies of the satellite-based CDRs derived within the CCI program are analysed to identify where consistency considerations are most important. The study also summarizes measures taken in CCI to ensure consistency on the technical level, and develops a concept for assessing consistency on the retrieval and scientific levels in the light of underlying physical knowledge. Finally, this study presents the current status of consistency between the CCI CDRs and future efforts needed to further improve it.
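The scientific-level consistency described above (agreement of combined CDRs within their estimated uncertainties) can be sketched as a simple statistical test. This is an illustrative check assuming independent 1-sigma uncertainties, not the CCI procedure itself:

```python
import math

def consistent(x1, u1, x2, u2, k=2.0):
    """Check whether two CDR estimates of the same quantity agree within
    their combined uncertainties (u1, u2 assumed independent, 1-sigma).
    k=2 corresponds to a ~95% coverage factor."""
    return abs(x1 - x2) <= k * math.sqrt(u1**2 + u2**2)

# Two hypothetical sea surface temperature retrievals (values illustrative):
# difference 0.5 K vs allowed 2*sqrt(0.2^2 + 0.3^2) ~ 0.72 K
print(consistent(288.4, 0.2, 288.9, 0.3))
```

In practice such a test would be applied per grid cell and per time step, with correlated-error terms added where the retrievals share auxiliary datasets.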
The National Weather Service (NWS) is responsible for issuing public warnings for all hazardous weather events across the United States. Advances in technology and basic scientific research over the years have allowed for significant improvements in this assignment. But while the NWS continues to focus much of its strategic planning toward improved warnings, most of those associated with the process are aware that there are a number of steps beyond increased accuracy to make their warnings effective. The premise of the present study is that NWS forecasters can benefit from knowing more about their emergency management counterparts, including a general overview of the nature of that community, along with characteristics that might influence collaboration. Multivariate analyses of variance (MANOVA) were used to test cross-sectional differences between all variables. Results indicate significant differences between age groups when comparing salaries with education, years worked in the response field, years worked as an emergency manager, and first-responder training.
Water in the soil—both its amount (soil moisture) and its state (freeze/thaw)—plays a key role in water and energy cycles, in weather and climate, and in the carbon cycle. Additionally, soil moisture touches upon human lives in a number of ways—from the ravages of flooding to the needs for monitoring agricultural and hydrologic droughts. Because of their relevance to weather, climate, science, and society, accurate and timely measurements of soil moisture and freeze/thaw state with global coverage are critically important.
The transition from the Advanced Very High Resolution Radiometer (AVHRR)/2 to AVHRR/3 on NOAA polar orbiters involved switching daylight operations from the 3.7-µm to the 1.6-µm wave band, while retaining 3.7 µm for nighttime operations. Investigations of the daylight applicability of the two channels suggest that the 1.6-µm wave band is not the better choice for daylight operations, at least for cloud applications. The 3.7-µm wave band is much less affected by surface contamination and measures more faithfully and unambiguously the particle effective radius near cloud tops. The 1.6-µm radiation penetrates deeper into the cloud, supplying a signal integrated throughout the inner portions of the cloud, including a surface contribution. Therefore, synergetic use of the two wave bands can provide better retrievals of cloud microstructure and precipitation than either channel alone. When one channel must be selected for the AVHRR/3, however, 3.7 µm performs much better for these applications. Both wave bands identify equally well microphysical features in the anvils of severe storms. For other applications, such as detection of ice and snow over vegetated surfaces and of desert dust aerosols, the 1.6-µm wave band presents no clear advantage over 3.7 µm, except that it can be used directly as is, whereas the 3.7-µm wave band must be corrected for thermal emission and water vapor absorption. Alternatively, the Moderate Resolution Imaging Spectroradiometer (MODIS) can be used for applications involving relatively slowly changing surface properties, while the AVHRR is prioritized for faster-varying atmospheric applications. Finally, the 3.7-µm wave band is more effective in detecting fog, fires, and hot spots. All these factors need to be considered by the operators of AVHRR/3 in making a justifiable choice of channels for the maximum benefit of the user community.
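The thermal-emission correction mentioned for the 3.7-µm wave band can be sketched with the Planck function: the emitted component is estimated from the 11-µm brightness temperature and subtracted from the observed radiance. This is a simplified illustration, assuming unit emissivity and the same emitting temperature in both bands, and it omits the water vapor correction the abstract also notes:

```python
import math

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0
    return a / b

def solar_component_37(observed_radiance_37, bt_11um, emissivity=1.0):
    """Estimate the solar-reflected part of the 3.7-um signal by subtracting
    the thermal emission implied by the 11-um brightness temperature
    (illustrative sketch, not an operational AVHRR calibration)."""
    thermal = emissivity * planck_radiance(3.7e-6, bt_11um)
    return observed_radiance_37 - thermal
```

At night the observed 3.7-µm signal is purely thermal, which is why no such correction is needed for nighttime operations.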
Critical reviews of forecasts of ENSO conditions, based on a set of 15 dynamical and statistical models, are given for the 1997-98 El Niño event and the initial stages of the 1998-99 La Niña. While many of the models forecast some degree of warming one to two seasons prior to the onset of the El Niño in boreal spring of 1997, none predicted its strength until the event was already becoming very strong in late spring. Neither the dynamical nor the statistical models, as groups, performed significantly better than the other during this episode. The best performing statistical and dynamical models forecast SST anomalies of about +1°C (vs 2.5°-3°C observed) in the Niño 3.4 region prior to any observed positive anomalies. The most comprehensive dynamical models performed better than the simple dynamical models. Once the El Niño had developed in mid-1997, a larger set of models was able to forecast its peak in late 1997 and its dissipation and reversal to cold conditions in late spring/early summer 1998. Overall, however, skill for these recent two years does not appear greater than that found over an earlier (1982-93) period. In both cases, median model correlation skill averaged over lead times of one to three seasons is near or just above 0.6. Because ENSO extremes usually develop in boreal spring or early summer and persist through the following winter, forecasting impact tendencies in extratropical North America for winter (when impacts are most pronounced) at 5 months of lead time is not difficult, requiring only good observations of the summer ENSO state and knowledge of the winter teleconnections. Because of the strength of the 1997-98 El Niño and the consequent skill of 5-month lead forecasts of U.S. winter 1997-98 impacts, the success of these forecasts was noticed to an unprecedented extent by the general public.
However, forecasting impacts in austral winter that occur simultaneously with the initial appearance of an ENSO extreme (e.g., in Chile, Uruguay, Kiribati, Ecuador, and Peru) requires forecasting the boreal spring/summer onset of ENSO events themselves at several months of lead time. This latter task is formidable, as evidenced by the fact that formal announcements of an El Niño did not occur until May, leaving little time for users in the above regions to prepare. Verbal summaries of ENSO forecasts issued to users worldwide during the 1997-98 El Niño event contained ambiguities. To address the needs for forecasts to be expressed verbally for nontechnical users and also to be precise enough for meaningful utility and verification, a simple numerically based verbal classification system for describing ENSO-related forecasts is presented.
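The correlation skill quoted above (near or just above 0.6) is a Pearson correlation between forecast and observed anomalies. A minimal sketch of the metric, with made-up illustrative anomaly values rather than the verification data of the study:

```python
import math

def correlation_skill(forecasts, observations):
    """Pearson correlation between forecast and observed anomalies,
    the skill measure quoted (~0.6) for the model set."""
    n = len(forecasts)
    mf = sum(forecasts) / n
    mo = sum(observations) / n
    cov = sum((f - mf) * (o - mo) for f, o in zip(forecasts, observations))
    vf = sum((f - mf) ** 2 for f in forecasts)
    vo = sum((o - mo) ** 2 for o in observations)
    return cov / math.sqrt(vf * vo)

# Hypothetical Nino-3.4 SST anomalies (degC), forecast vs observed
skill = correlation_skill([0.5, 1.0, 1.5, 2.0], [0.4, 1.2, 1.3, 2.5])
```

Note that a high correlation can coexist with a large amplitude error, which is exactly the situation described for 1997, when models forecast about +1°C against 2.5°-3°C observed.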
Results are presented from a pilot study of the evolution of large-scale hydrologic processes associated with the first transition of the Asian summer monsoon, in conjunction with the launching of the South China Sea Monsoon Experiment (SCSMEX) in May 1998. SCSMEX is a major international field experiment to study the water and energy cycles of the Asian monsoon region, with the aim of better understanding and improved prediction of the onset, maintenance, and variability of the monsoon of southern China, Southeast Asia, and the western Pacific region. In this paper, the utility of reliable satellite data in revealing characteristics of the South China Sea (SCS) monsoon is emphasized. Using a combination of satellite-estimated rainfall, moisture, surface wind, and sea surface temperature, the authors present some interesting and hitherto unknown features in large-scale atmospheric and oceanic hydrologic processes associated with the fluctuation of the SCS monsoon. Results show that, climatologically, the SCS monsoon occurs during mid-May, when a major convection zone shifts from the eastern Indian Ocean-southern Indochina to the SCS. Simultaneous with the SCS monsoon onset is the development of a moist tongue and frontal rainband emanating from the northern SCS, across southern China and the East China Sea to southern Japan, as well as the enhancement of equatorial convection in the western Pacific ITCZ. Analysis of the satellite-derived moisture and rainfall shows that the onset of the SCS monsoon during 1997 was preceded by the development of eastward-propagating supercloud clusters over the Indian Ocean. The satellite data also reveal a strong onset vortex over the SCS and large-scale cooling and warming patterns over the Indian Ocean and western Pacific. These features signal a major shift of the large-scale hydrologic cycle in the ocean-atmosphere system, which underpins the SCS monsoon onset. The paper concludes with a brief discussion of the observational platform of SCSMEX and a call for the use of satellite data, field observations, and models for comprehensive studies of the monsoon.
Computational Fluid Dynamics (CFD) model simulations of urban boundary layers have improved so that they are useful in many types of flow and dispersion analyses. The study described here is intended to assist in planning emergency response activities related to releases of chemical or biological agents into the atmosphere in large cities such as New York City. Five CFD models (CFD-Urban, FLACS, FEM3MP, FEFLO-Urban, and Fluent-Urban) have been applied by five independent groups to the same 3-D building data and geographic domain in Manhattan, using approximately the same wind input conditions. Wind flow observations are available from the Madison Square Garden March 2005 (MSG05) field experiment. It is seen from the many side-by-side comparison plots that the CFD model simulations of near-surface wind fields generally agree with each other and with field observations, within typical atmospheric uncertainties of a factor of two. The qualitative results shown here suggest, for example, that transport of a release at street level in a large city could reach a few blocks in the upwind and crosswind directions. There are still key differences seen among the models for certain parts of the domain. Further quantitative examinations of differences among the models and the observations are necessary to understand causal relationships.
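The "factor of two" agreement criterion mentioned above is commonly quantified with the FAC2 metric, the fraction of model/observation pairs whose ratio lies between 0.5 and 2. The sketch below is an illustrative implementation with made-up wind speeds, not the evaluation code used in the study:

```python
def fac2(modeled, observed):
    """Fraction of model/observation pairs agreeing within a factor of two,
    a standard acceptance criterion in urban dispersion model evaluation."""
    hits = sum(1 for m, o in zip(modeled, observed)
               if o > 0 and 0.5 <= m / o <= 2.0)
    return hits / len(modeled)

# Illustrative near-surface wind speeds (m/s): model vs observation
print(fac2([2.1, 3.0, 0.8, 5.5], [2.0, 1.2, 1.0, 4.0]))  # prints 0.75
```

FAC2 is robust to outliers and to the strong spatial variability of urban wind fields, which is why a factor-of-two band, rather than an absolute error, is the usual yardstick in this setting.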
Aircraft emissions impact the atmosphere in a variety of ways, including enhancing greenhouse gases, especially water vapor and carbon dioxide, in the upper troposphere and lower stratosphere, forming persistent contrails, and altering the distributions of reactive chemical species, which change the oxidative capacity of the atmosphere. This paper summarizes some recent findings related to the impacts of aircraft exhaust on the chemistry of the upper troposphere and lower stratosphere (UTLS). Of particular note are the improvements in our understanding of production of nitrogen oxides (NOx = NO + NO2) by lightning and of the influence of long-range transport on background abundances of reactive species. Studies have also identified gaps in our knowledge, including the behavior of HOx (OH and HO2) species at high NOx and discrepancies in measurements of water vapor in the relatively dry UTLS. Lack of detailed observations of species, such as the halogens chlorine and bromine, limits our ability to assess the role of heterogeneous chemistry on UTLS chemistry with or without the influence of aircraft exhaust. Recommendations for studies that address these issues are presented.
Instructors at The Pennsylvania State University report on their experience of developing a course in radar meteorology in conjunction with a field experiment utilizing the Doppler On Wheels (DOW) radars maintained by the Center for Severe Weather Research. The field experiment, coined the Pennsylvania Area Mobile Radar Experiment (PAMREX), was inextricably wedded to the classroom experience of the students. The unusual hands-on learning experience provided a firsthand taste of what research is all about, particularly field research, while also including fairly traditional lectures on the theoretical aspects of radar.
A new field of study, “decadal prediction,” is emerging in climate science. Decadal prediction lies between seasonal/interannual forecasting and longer-term climate change projections, and focuses on time-evolving regional climate conditions over the next 10–30 yr. Numerous assessments of climate information user needs have identified this time scale as being important to infrastructure planners, water resource managers, and many others. It is central to the information portfolio required to adapt effectively to and through climatic changes. At least three factors influence time-evolving regional climate at the decadal time scale: 1) climate change commitment (further warming as the coupled climate system comes into adjustment with increases of greenhouse gases that have already occurred), 2) external forcing, particularly from future increases of greenhouse gases and recovery of the ozone hole, and 3) internally generated variability. Some decadal prediction skill has been demonstrated to arise from the first two of these factors, and there is evidence that initialized coupled climate models can capture mechanisms of internally generated decadal climate variations, thus increasing predictive skill globally and particularly regionally. Several methods have been proposed for initializing global coupled climate models for decadal predictions, all of which involve global time-evolving three-dimensional ocean data, including temperature and salinity. An experimental framework to address decadal predictability/prediction is described in this paper and has been incorporated into the coordinated Coupled Model Intercomparison Project phase 5 (CMIP5) experiments, some of which will be assessed for the IPCC Fifth Assessment Report (AR5). These experiments will likely guide work in this emerging field over the next 5 yr.
Nobody anticipated that El Niño would be weak and prolonged in 1992, but brief and intense in 1997/98. Why are various El Niño episodes so different, and so difficult to predict? The answer involves the important role played by random atmospheric disturbances (such as westerly wind bursts) in sustaining the weakly damped Southern Oscillation, whose complementary warm and cold phases are, respectively, El Niño and La Niña. As in the case of a damped pendulum sustained by modest blows at random times, so the predictability of El Niño is limited, not by the amplification of errors in initial conditions as in the case of weather, but mainly by atmospheric disturbances interacting with the Southern Oscillation. Given the statistics of the wind fluctuations, the probability distribution function of future sea surface temperature fluctuations in the eastern equatorial Pacific can be determined by means of an ensemble of calculations with a coupled ocean-atmosphere model. Each member of the ensemble starts from the same initial conditions and has, superimposed, a different realization of the noise. Such a prediction, made at the end of 1996, would have assigned a higher likelihood to a moderate event than to the extremely strong event that actually occurred in 1997. (The rapid succession of several westerly wind bursts in early 1997 was a relatively rare phenomenon.) In late 2001, conditions were similar to those in 1996, which suggested a relatively high probability of El Niño appearing in 2002. Whether the event will be weak or intense depends on the random disturbances that materialize during the year.
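The damped-pendulum analogy and the noise-ensemble forecast idea can be sketched as a stochastically kicked damped oscillator: all ensemble members share initial conditions and differ only in the noise realization, and the spread of final states approximates the forecast PDF. This is a conceptual toy, not the coupled ocean-atmosphere model of the abstract, and every parameter is illustrative:

```python
import math
import random

def ensemble_spread(n_members=200, n_steps=480, dt=0.1, seed=1):
    """Ensemble of damped harmonic oscillators driven by random kicks,
    the conceptual analogue of the Southern Oscillation sustained by
    westerly wind bursts.  Members share initial conditions and differ
    only in the noise realization (semi-implicit Euler integration)."""
    rng = random.Random(seed)
    gamma, omega, noise = 0.05, 2 * math.pi / 4.0, 0.3  # damping, period, kick size
    finals = []
    for _ in range(n_members):
        x, v = 0.5, 0.0  # identical initial 'SST anomaly' and tendency
        for _ in range(n_steps):
            a = -2 * gamma * v - omega**2 * x
            v += (a + noise * rng.gauss(0, 1) / math.sqrt(dt)) * dt
            x += v * dt
        finals.append(x)
    return finals  # sample from the forecast PDF at the final time

vals = ensemble_spread()
mean = sum(vals) / len(vals)
spread = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
```

With weak damping, the ensemble spread at long lead times is set mainly by the noise statistics rather than by the initial conditions, which is the abstract's central point about the limits of El Niño predictability.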
The fourth workshop of the International Precipitation Working Group (IPWG) focused on quantitative precipitation estimates (QPE) from satellite observations and model analyses. Two working groups conducted sessions on topical areas including validation, applications, research, and new technologies. The first session of the workshop focused on international projects and satellite programs. Other presentations focused on new precipitation products and datasets, such as those of the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT) Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF). The second day of the workshop concentrated on the development and refinement of new algorithms and their applications, including the improved utilization of current data sources and the exploitation of new datasets. The routine generation of satellite precipitation products for the user community was highlighted, particularly the need to mainstream satellite precipitation products into operational use.
The Yellow and East China Seas are shallow, marginal seas surrounded by mainland China to the west and Korea and Japan to the east. More than 75% of the area of this region is shallower than 100 m; thus, it is among the most productive continental shelf regions in the world. Heo et al. (2012) showed that warming in the East China and Yellow Seas has been more rapid than global warming. Therefore, even a small environmental change can have an ecological impact on the region and an economic influence on the people living along its coasts.
The Arctic is one chapter from the State of the Climate in 2020 annual report and is available from https://doi.org/10.1175/BAMS-D-21-0086.1. Compiled by NOAA’s National Centers for Environmental Information, State of the Climate in 2020 is based on contributions from scientists from around the world. It provides a detailed update on global climate indicators, notable weather events, and other data collected by environmental monitoring stations and instruments located on land, water, ice, and in space. Tundra greenness has been monitored by Earth-observing satellites since 1982, and a growing constellation of spaceborne sensors provides increasingly detailed observations of Arctic ecosystems.
Historical documentary sources in the Canary Islands have been used to construct cereal production series for the period 1595–1836. The cereal growth period in this region covers essentially the rainy season, making these crops adequate to characterize the annual precipitation. A proxy for the Islands' rainfall based on the historical series of wheat and barley production has been constructed and assessed by using two independent series of dry and wet years. The spectral analysis of the crop production reveals strongly nonstationary behavior. This fact, along with the direct comparison with several reconstructed and instrumental North Atlantic Oscillation series, suggests the potential use of the reconstructed precipitation as a proxy for this climatic oscillation during preinstrumental times.
This is an abridged version of the full-length article that is available online (10.1175/BAMS-84-8-García)
CMIP6 simulations suggest that anthropogenic greenhouse gas forcing has at least doubled the likelihood of prolonged droughts like that of 2015-19 over the South African Western Cape, with large cancellation due to other anthropogenic effects.
The COVID-19 pandemic has brought dynamics of compound hazards and risk-response feedbacks to the fore of hydrometeorological hazard preparedness and response. For example, lockdowns implemented to slow the hazard of COVID-19 transmission have the potential unintended side effect of isolating or demobilizing people in the face of an incoming compounding hydrometeorological hazard, while conventional hydrometeorological hazard responses—group evacuations, shelters, community cooling centers, etc.—potentially exacerbate disease transmission. Indeed, widespread climate extremes in the summer of 2021, including major heat waves in the Northern Hemisphere, widespread flooding, and extensive forest fires, all required that emergency responders and health systems balance acute risks of disease and severe climatic hazards, and to do so in the context of economic stresses associated with both. Effective management of such hazards requires coordination across hydrometeorological services, disaster preparedness and response organizations, emergency health services, and vulnerable communities. At times it also requires that standard recommendations for responding to COVID-19 or a hydrometeorological hazard be revised, sometimes acutely, and that these revisions be communicated effectively to diverse audiences.
Dynamical downscaling of coarse-resolution wind fields and an open-source economic loss model are used to build a modeling chain that simulates historical windstorms and their impacts at local scales and in complex terrain. The potential of the method for future applications is illustrated for a high-impact foehn storm in the Swiss Alps that occurred on 15 February 1925.
The Twentieth Century Reanalysis, a three-dimensional reanalysis of the global atmospheric state since 1871, sets the boundary conditions. The Weather Research and Forecasting (WRF) Model then conducts the dynamical downscaling of weather elements to 3-km horizontal resolution. As a result, the local wind regime during the foehn storm is captured. In fact, the wind field is consistent with sparse wind observations and with the wind flow concept held by meteorologists of that time. Finally, the wind loss model computes potential losses per aggregated area as a function of maximum wind speeds and local asset values. Historical damage reports correspond well to the spatial distribution of the modeled losses on the leeward side of the Alpine range.
Hence, it is now possible to extensively explore the damage potential of historical weather extremes in time periods that have traditionally been the province of environmental historians. The dynamical downscaling and loss modeling chain provides a physically consistent method to assess extreme weather phenomena at high resolutions and to estimate the potential harm from past natural hazards in today's or a future world. Further refinements and extensions of the method into the past may well change our perspectives on historical extreme events and their socio-economic impacts.
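The loss step of the modeling chain, losses per area as a function of maximum wind speed and local asset value, can be sketched with a cubic excess-over-threshold formulation, a common choice in windstorm loss modeling. The specific open-source model of the study is not reproduced here, and the threshold and scaling constant below are illustrative, not calibrated values:

```python
def storm_loss(max_wind, threshold_wind, asset_value, scale=1e-5):
    """Cubic excess-over-threshold wind loss: no damage below the local
    threshold, rapidly growing losses above it (parameters illustrative)."""
    if max_wind <= threshold_wind:
        return 0.0
    return scale * asset_value * ((max_wind / threshold_wind) - 1.0) ** 3

# Aggregate losses over grid cells: (max wind in m/s, local asset value)
cells = [(35.0, 2.0e8), (28.0, 5.0e8), (22.0, 1.0e9)]
total = sum(storm_loss(v, 25.0, a) for v, a in cells)
```

The cubic dependence reflects the empirical finding that wind damage rises far faster than linearly with gust speed, so the spatial pattern of modeled losses is dominated by the cells with the strongest downscaled winds.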
Several large-scale climate patterns influenced climate conditions and weather patterns across the globe during 2010. The transition from a warm El Niño phase at the beginning of the year to a cool La Niña phase by July contributed to many notable events, ranging from record wetness across much of Australia to historically low Eastern Pacific basin and near-record high North Atlantic basin hurricane activity. The remaining five main hurricane basins experienced below- to well-below-normal tropical cyclone activity. The negative phase of the Arctic Oscillation was a major driver of Northern Hemisphere temperature patterns during the 2009/10 winter and again in late 2010. It contributed to record snowfall and unusually low temperatures over much of northern Eurasia and parts of the United States, while bringing above-normal temperatures to the high northern latitudes. The February Arctic Oscillation Index value was the most negative since records began in 1950.
The 2010 average global land and ocean surface temperature was among the two warmest in the record. The Arctic continued to warm at about twice the rate of lower latitudes. The eastern and tropical Pacific Ocean cooled about 1°C from 2009 to 2010, reflecting the transition from the 2009/10 El Niño to the 2010/11 La Niña. Ocean heat fluxes contributed to warm sea surface temperature anomalies in the North Atlantic and the tropical Indian and western Pacific Oceans. Global integrals of upper ocean heat content for the past several years have reached values consistently higher than for all prior times in the record, demonstrating the dominant role of the ocean in the Earth's energy budget. Deep and abyssal waters of Antarctic origin have also trended warmer on average since the early 1990s. Lower tropospheric temperatures typically lag ENSO surface fluctuations by two to four months; thus the 2010 temperature was dominated by the warm phase El Niño conditions that occurred during the latter half of 2009 and early 2010, and was the second warmest on record. The stratosphere continued to be anomalously cool.
Annual global precipitation over land areas was about five percent above normal. Precipitation over the ocean was below normal after a wet year in 2009. Overall, saltier (higher evaporation) regions of the ocean surface continue to be anomalously salty, and fresher (higher precipitation) regions continue to be anomalously fresh. This salinity pattern, which has held since at least 2004, suggests an intensification of the hydrological cycle.
Sea ice conditions in the Arctic were significantly different than those in the Antarctic during the year. The annual minimum ice extent in the Arctic—reached in September—was the third lowest on record since 1979. In the Antarctic, zonally averaged sea ice extent reached an all-time record maximum from mid-June through late August and again from mid-November through early December. Corresponding record positive Southern Hemisphere Annular Mode Indices influenced the Antarctic sea ice extents.
Greenland glaciers lost more mass in 2010 than in any other year of the decade-long record. The Greenland Ice Sheet lost a record amount of mass, as the melt rate was the highest since at least 1958, and the area and duration of melting were greater than in any year since at least 1978. High summer air temperatures and a longer melt season also caused a continued increase in the rate of ice mass loss from small glaciers and ice caps in the Canadian Arctic. Coastal sites in Alaska show continuous permafrost warming, and sites in Alaska, Canada, and Russia indicate greater warming in relatively cold permafrost than in warm permafrost within the same geographical area. With regional differences, permafrost temperatures are now up to 2°C warmer than they were 20 to 30 years ago. Preliminary data indicate a high probability that 2010 will be the 20th consecutive year in which alpine glaciers have lost mass.
Atmospheric greenhouse gas concentrations continued to rise, and ozone-depleting substances continued to decrease. Carbon dioxide increased by 2.60 ppm in 2010, a rate above both the 2009 rate and the 1980–2010 average. The global ocean carbon dioxide uptake for the 2009 transition period from La Niña to El Niño conditions, the most recent period for which analyzed data are available, is estimated to be similar to the long-term average. The 2010 Antarctic ozone hole was among the smallest 20% observed since 1990, a result of warmer-than-average temperatures in the Antarctic stratosphere during austral winter, between mid-July and early September.
Extreme rainfall over southeast Australia and its teleconnection with ENSO were examined in observations and a selection of CMIP5 models. In observations, the magnitude of anomalously cool SSTs in the Niño-3.4 region has a far greater effect on Rx5day (the maximum consecutive five-day rainfall total) in southeast Australia than the magnitude of anomalously warm SSTs. Five CMIP5 models were selected as the focus of the study because they reproduce aspects of ENSO variability and an asymmetric ENSO–extreme rainfall relationship. Using these models, little evidence was found of significant change in the ENSO–extreme rainfall relationship between 1861–90 and 1976–2005. The PDFs of Rx5day values in La Niña seasons likewise show no significant differences between the same periods. Interannual variability related to ENSO has played a greater role than any long-term trend in the magnitude of extreme rainfall events in southeast Australia over the period 1861–2005.
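Rx5day, the extreme-rainfall index used above, is the maximum consecutive five-day precipitation total in a given season or year. A minimal sketch of its computation, assuming daily rainfall is available as a plain NumPy array (the function name and sample values below are illustrative):

```python
import numpy as np

def rx5day(daily_precip):
    """Maximum consecutive 5-day precipitation total (Rx5day).

    daily_precip: 1-D array of daily rainfall for one season or year.
    """
    p = np.asarray(daily_precip, dtype=float)
    if p.size < 5:
        raise ValueError("need at least 5 daily values")
    # All rolling 5-day sums via a cumulative-sum difference.
    c = np.concatenate(([0.0], np.cumsum(p)))
    five_day_totals = c[5:] - c[:-5]
    return five_day_totals.max()

# Example: a wet burst of 10 + 20 + 30 + 20 + 10 mm gives Rx5day = 90 mm.
season = np.array([0, 2, 0, 10, 20, 30, 20, 10, 0, 1])
print(rx5day(season))  # -> 90.0
```

In a seasonal analysis like the one described, this function would be applied per season per grid point, and the resulting Rx5day values stratified by the sign and magnitude of the Niño-3.4 SST anomaly.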
Data from field surveys are discussed that measured the evolution of coastal residents' risk perceptions and preparation plans as two hurricanes, Isaac and Sandy, approached the United States during the 2012 hurricane season. Surveyed residents overestimated the probability that their homes would be affected by hurricane-force winds, yet displayed limited concern over this prospect. The same residents underestimated the threat posed by flooding, including those living adjacent to bodies of water. The surveys revealed that residents were nevertheless more aware of a storm's maximum winds than of its flood potential. Specifically, when respondents were asked to report what they believed Isaac's and Sandy's maximum winds and predicted maximum storm surges to be, they were much better at the former than the latter. Particularly notable was the tendency to underestimate the relative threat posed by water in Isaac and Sandy even among those for whom that threat should have been most salient.