EO (Earth Observation) Topics on Climate Change

Landslide Threats in Near Real-Time During Heavy Rains

February/March 2018: For the first time, scientists can look at landslide threats anywhere around the world in near real-time, thanks to satellite data and a new model developed by NASA. The model, developed at NASA/GSFC (Goddard Space Flight Center) in Greenbelt, Maryland, estimates potential landslide activity triggered by rainfall. Rainfall is the most widespread trigger of landslides around the world. If conditions beneath Earth's surface are already unstable, heavy rains act as the last straw that causes mud, rocks or debris — or all combined — to move rapidly down mountains and hillsides. 48)

The model is designed to increase our understanding of where and when landslide hazards are present and improve estimates of long-term patterns. A global analysis of landslides over the past 15 years using the new open source Landslide Hazard Assessment for Situational Awareness model was published in a study released online on March 22 in the journal Earth's Future. 49)

Determining where, when, and how landslide hazards may vary and affect people at the global scale is fundamental to formulating mitigation strategies, appropriate and timely responses, and robust recovery plans. While monitoring systems exist for other hazards, no such system exists for landslides. A near-global LHASA (Landslide Hazard Assessment model for Situational Awareness) has been developed to provide an indication of potential landslide activity at the global scale every 30 minutes. This model uses surface susceptibility and satellite rainfall data to issue "nowcasts" of moderate or high landslide hazard. This research describes the global LHASA currently running in near real-time and discusses the performance and potential applications of this system. LHASA is intended to provide situational awareness of landslide hazards in near real-time. This system can also leverage nearly two decades of satellite precipitation data to better understand long-term trends in potential landslide activity.

"Landslides can cause widespread destruction and fatalities, but we really don't have a complete sense of where and when landslides may be happening to inform disaster response and mitigation," said Dalia Kirschbaum, a landslide expert at Goddard and co-author of the study. "This model helps pinpoint the time, location and severity of potential landslide hazards in near real-time all over the globe. Nothing has been done like this before."

The model estimates potential landslide activity by first identifying areas with heavy, persistent and recent precipitation. Rainfall estimates are provided by a multi-satellite product developed by NASA using the NASA/JAXA (Japan Aerospace Exploration Agency) GPM (Global Precipitation Measurement) mission, which provides precipitation estimates around the world every 30 minutes. The model checks whether the GPM rainfall accumulated over the preceding seven days exceeds a critical rainfall threshold.

In places where precipitation is unusually high, the model then uses a susceptibility map to determine if the area is prone to landslides. This global susceptibility map is developed using five features that play an important role in landslide activity: if roads have been built nearby, if trees have been removed or burned, if a major tectonic fault is nearby, if the local bedrock is weak and if the hillsides are steep.

If the susceptibility map shows the area with heavy rainfall is vulnerable, the model produces a "nowcast" identifying the area as having a high or moderate likelihood of landslide activity. The model produces new nowcasts every 30 minutes.
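The decision logic described above reduces to a rainfall-threshold check followed by a susceptibility lookup. The following minimal sketch illustrates that logic only; the threshold value, the antecedent-rainfall weighting and the susceptibility classes are placeholder assumptions, not the operational LHASA values.

```python
# Minimal sketch of a LHASA-style nowcast decision (illustrative only).
# The rainfall threshold, antecedent weighting and susceptibility classes
# below are placeholder assumptions, not the operational NASA values.
import numpy as np

def nowcast(rain_last_7_days_mm, susceptibility, threshold_mm=100.0):
    """Return 'high', 'moderate' or None for one grid cell.

    rain_last_7_days_mm : list of 7 daily rainfall totals (most recent last)
    susceptibility      : 'low', 'moderate' or 'high' from the static map
    """
    # Weight recent rainfall more heavily than older rainfall (assumption).
    weights = np.linspace(0.4, 1.0, 7)
    antecedent_rain = float(np.sum(weights * np.asarray(rain_last_7_days_mm)))

    if antecedent_rain < threshold_mm or susceptibility == 'low':
        return None                      # no nowcast issued
    if susceptibility == 'high':
        return 'high'                    # high likelihood of landslide activity
    return 'moderate'

# Example: a week of heavy rain over a highly susceptible cell
print(nowcast([5, 10, 20, 30, 40, 60, 80], 'high'))   # -> 'high'
```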

Figure 27: This animation shows the potential landslide activity by month averaged over the last 15 years as evaluated by NASA's Landslide Hazard Assessment for Situational Awareness (LHASA) model. Here, you can see landslide trends across the world (image credit: NASA/GSFC / Scientific Visualization Studio)

The study shows long-term trends when the model's output was compared to landslide databases dating back to 2007. The team's analysis showed a global "landslide season" with a peak in the number of landslides in July and August, most likely associated with the Asian monsoon and tropical cyclone seasons in the Atlantic and Pacific oceans.

"The model has been able to help us understand immediate potential landslide hazards in a matter of minutes," said Thomas Stanley, landslide expert with the Universities Space Research Association at Goddard and co-author of the study. "It also can be used to retroactively look at how potential landslide activity varies on the global scale seasonally, annually or even on decadal scales in a way that hasn't been possible before."

 


 

Study of Antarctic ice loss

February 20, 2018: A NASA study based on an innovative technique for crunching torrents of satellite data provides the clearest picture yet of changes in Antarctic ice flow into the ocean. The findings confirm accelerating ice losses from the West Antarctic Ice Sheet and reveal surprisingly steady rates of flow from its much larger neighbor to the east. 50)

The computer-vision technique crunched data from hundreds of thousands of NASA-USGS (U.S. Geological Survey) Landsat satellite images to produce a high-precision picture of changes in ice-sheet motion.

The new work provides a baseline for future measurement of Antarctic ice changes and can be used to validate numerical ice sheet models that are necessary to make projections of sea level. It also opens the door to faster processing of massive amounts of data.

"We're entering a new age," said the study's lead author, cryospheric researcher Alex Gardner of NASA's Jet Propulsion Laboratory in Pasadena, California. "When I began working on this project three years ago, there was a single map of ice sheet flow that was made using data collected over 10 years, and it was revolutionary when it was published back in 2011. Now we can map ice flow over nearly the entire continent, every year. With these new data, we can begin to unravel the mechanisms by which the ice flow is speeding up or slowing down in response to changing environmental conditions."

The innovative approach by Gardner and his international team of scientists largely confirms earlier findings, though with a few unexpected twists. - Among the most significant: a previously unmeasured acceleration of glacier flow into Antarctica's Getz Ice Shelf, on the southwestern part of the continent — likely a result of ice-shelf thinning.

Speeding up in the west, steady flow in the east: The research, published in the journal "The Cryosphere," also identified the fastest speed-up of Antarctic glaciers during the seven-year study period. The glaciers feeding Marguerite Bay, on the western Antarctic Peninsula, increased their rate of flow by 400 to 800 m/year, probably in response to ocean warming. 51)

Perhaps the research team's biggest discovery, however, was the steady flow of the East Antarctic Ice Sheet. During the study period, from 2008 to 2015, the sheet had essentially no change in its rate of ice discharge — ice flow into the ocean. While previous research inferred a high level of stability for the ice sheet based on measurements of volume and gravitational change, the lack of any significant change in ice discharge had never been measured directly.


Figure 28: The speed of Antarctic ice flow, derived from Landsat imagery over a seven-year period (image credit: NASA)

The study also confirmed that the flow of West Antarctica's Thwaites and Pine Island glaciers into the ocean continues to accelerate, though the rate of acceleration is slowing.

In all, the study found an overall ice discharge for the Antarctic continent of 1,929 gigatons per year in 2015, with an uncertainty of plus or minus 40 gigatons. That represents an increase of 36 gigatons per year, plus or minus 15, since 2008. A gigaton is one billion tons (10⁹ tons).

The study found that ice flow from West Antarctica — the Amundsen Sea sector, the Getz Ice Shelf and Marguerite Bay on the western Antarctic Peninsula — accounted for 89 percent of the increase.

Computer vision: The science team developed software that processed hundreds of thousands of pairs of images of Antarctic glacier movement from Landsat-7 and Landsat-8, captured from 2013 to 2015. These were compared to earlier radar satellite measurements of ice flow to reveal changes since 2008.

"We're applying computer vision techniques that allow us to rapidly search for matching features between two images, revealing complex patterns of surface motion," Gardner said.

Instead of researchers comparing small sets of very high-quality images from a limited region to look for subtle changes, the novelty of the new software is that it can track features across hundreds of thousands of images/year — even those of varying quality or obscured by clouds — over an entire continent. "We can now automatically generate maps of ice flow annually — a whole year — to see what the whole continent is doing," Gardner said.
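A hedged sketch of the core operation behind such feature tracking is shown below: normalized cross-correlation of a small template from one image against a search window in a later image, with the offset of the correlation peak giving the displacement in pixels. The window sizes, search range and synthetic test data are illustrative assumptions, not parameters of the actual NASA/JPL processing chain.

```python
# Illustrative feature-tracking step (not the actual NASA/JPL processing chain):
# find the displacement of a small image patch between two co-registered scenes
# by locating the peak of the normalized cross-correlation surface.
import numpy as np

def track_patch(img1, img2, row, col, half=16, search=8):
    """Return (drow, dcol) displacement in pixels of the patch centered
    at (row, col) in img1, searched within +/- `search` pixels in img2."""
    tpl = img1[row-half:row+half, col-half:col+half].astype(float)
    tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-9)

    best, best_off = -np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            win = img2[row+dr-half:row+dr+half, col+dc-half:col+dc+half].astype(float)
            win = (win - win.mean()) / (win.std() + 1e-9)
            score = float(np.mean(tpl * win))      # normalized cross-correlation
            if score > best:
                best, best_off = score, (dr, dc)
    return best_off

# Example with synthetic data: img2 is img1 shifted by (3, -2) pixels.
rng = np.random.default_rng(0)
img1 = rng.random((200, 200))
img2 = np.roll(np.roll(img1, 3, axis=0), -2, axis=1)
print(track_patch(img1, img2, 100, 100))   # -> (3, -2)
```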

The new Antarctic baseline should help ice sheet modelers better estimate the continent's contribution to future sea level rise. "We'll be able to use this information to target field campaigns, and understand the processes causing these changes," Gardner said. "Over the next decade, all this is going to lead to rapid improvement in our knowledge of how ice sheets respond to changes in ocean and atmospheric conditions, knowledge that will ultimately help to inform projections of sea level change."

 


 

Seismic footprint study to track Hurricanes and Typhoons

February 15, 2018: Climatologists are often asked, "Is climate change making hurricanes stronger?" but they can't give a definitive answer because the global hurricane record only goes back to the dawn of the satellite era. But now, an intersection of disciplines—seismology, atmospheric sciences, and oceanography—offers an untapped data source: the continuous seismic record, which dates back to the early 20th century.

An international team of researchers has found a new way to identify the movement and intensity of hurricanes, typhoons and other tropical cyclones by tracking the way they shake the seafloor, as recorded on seismometers on islands and near the coast. After looking at 13 years of data from the northwest Pacific Ocean, they have found statistically significant correlations between seismic data and storms. Their work was published Feb. 15 in the journal Earth and Planetary Science Letters. 52) 53)

The group of experts was assembled by Princeton University's Lucia Gualtieri, a postdoctoral research associate in geosciences, and Salvatore Pascale, an associate research scholar in atmospheric and oceanic sciences.

Most people associate seismology with earthquakes, said Gualtieri, but the vast majority of the seismic record shows low-intensity movements from a different source: the oceans. "A seismogram is basically the movement of the ground. It records earthquakes, because an earthquake makes the ground shake. But it also records all the tiny other movements," from passing trains to hurricanes. "Typhoons show up very well in the record," she said.

Because there is no way to know when an earthquake will hit, seismometers run constantly, always poised to record an earthquake's dramatic arrival. In between these earth-shaking events, they track the background rumbling of the planet. Until about 20 years ago, geophysicists dismissed this low-intensity rumbling as noise, Gualtieri said.

"What is noise? Noise is a signal we don't understand," said Pascale, who is also an associate research scientist at the National and Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory.

Just as astronomers have discovered that the static between radio stations gives us information about the cosmic background, seismologists have discovered that the low-level "noise" recorded by seismograms is the signature of wind-driven ocean storms, the cumulative effect of waves crashing on beaches all over the planet or colliding with each other in the open sea.

One ocean wave acting alone is not strong enough to generate a seismic signature at the frequencies she was examining, explained Gualtieri, because typical ocean waves only affect the upper few feet of the sea. "The particle motion decays exponentially with depth, so at the seafloor you don't see anything," she said. "The main mechanism to generate seismic abnormalities from a typhoon is to have two ocean waves interacting with each other." When two waves collide, they generate vertical pressure that can reach the seafloor and jiggle a nearby seismometer.

When a storm is large enough—and storms classified as hurricanes or typhoons are—it will leave a seismic record lasting several days. Previous researchers have successfully traced individual large storms on a seismogram, but Gualtieri came at the question from the opposite side: can a seismogram find any large storm in the area?


Figure 29: Lucia Gualtieri, a postdoctoral researcher in geosciences at Princeton University, superimposed an image of the seismogram recording a tropical cyclone above a satellite image showing the storm moving across the northwest Pacific Ocean. Gualtieri and her colleagues have found a way to track the movement and intensity of typhoons and hurricanes by looking at seismic data, which has the potential to extend the global hurricane record by decades and allow a more definitive answer to the question, "Are hurricanes getting stronger?" (image credit: Photo illustration by Lucia Gualtieri, satellite image courtesy of NASA/NOAA)

Gualtieri and her colleagues found a statistically significant agreement between the occurrence of tropical cyclones and large-amplitude, long-lasting seismic signals with short periods, between three and seven seconds, called "secondary microseisms." They were also able to calculate the typhoons' strength from these "secondary microseisms," or tiny fluctuations, which they successfully correlated to the observed intensity of the storms.

In short, the seismic record had enough data to identify when typhoons happened and how strong they were (Figure 29).
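A minimal sketch of the kind of processing involved is given below: band-pass a seismogram to the 3 to 7 second secondary-microseism band, form its envelope, and compare the envelope amplitude with a storm-intensity time series. The sampling rate, filter design and synthetic signals are illustrative assumptions; the published analysis is considerably more elaborate.

```python
# Illustrative microseism processing (assumptions throughout; the published
# analysis is far more sophisticated): isolate the 3-7 s secondary-microseism
# band, form its envelope, and correlate it with storm intensity.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1.0                                  # assumed sampling rate: 1 Hz
t = np.arange(0, 10 * 86400, 1 / fs)      # ten days of data

# Synthetic "storm": microseism amplitude that grows and decays around day 5
storm_intensity = np.exp(-((t / 86400 - 5.0) ** 2) / 2.0)
seismogram = storm_intensity * np.sin(2 * np.pi * t / 5.0)      # 5 s period
seismogram += 0.1 * np.random.default_rng(1).standard_normal(t.size)

# Band-pass to the secondary-microseism band (periods of 3-7 s)
b, a = butter(4, [1 / 7.0, 1 / 3.0], btype="band", fs=fs)
microseism = filtfilt(b, a, seismogram)

# Envelope via the analytic signal, averaged to hourly resolution
envelope = np.abs(hilbert(microseism))
hourly = envelope[: envelope.size // 3600 * 3600].reshape(-1, 3600).mean(axis=1)
storm_hourly = storm_intensity[: hourly.size * 3600].reshape(-1, 3600).mean(axis=1)

print(np.corrcoef(hourly, storm_hourly)[0, 1])   # close to 1 for this toy case
```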

So far, the researchers have focused on the ocean off the coast of Asia because of its powerful typhoons and good network of seismic stations. Their next steps include refining their method and examining other storm basins, starting with the Caribbean and the East Pacific.

And then they will tackle the historic seismic record: "When we have a very defined method and have applied this method to all these other regions, we want to start to go back in time," said Gualtieri.

While global storm information goes back only to the early days of the satellite era, in the late 1960s and early 1970s, the first modern seismograms were created in the 1880s. Unfortunately, the oldest records exist only on paper, and few historical records have been digitized.

"If all this data can be made available, we could have records going back more than a century, and then we could try to see any trend or change in intensity of tropical cyclones over a century or more," said Pascale. "It's very difficult to establish trends in the intensity of tropical cyclones—to see the impact of global warming. Models and theories suggest that they should become more intense, but it's important to find observational evidence."

"This new technique, if it can be shown to be valid across all tropical-cyclone prone basins, effectively lengthens the satellite era," said Morgan O'Neill, a T. C. Chamberlin Postdoctoral Fellow in geosciences at the University of Chicago who was not involved in this research. "It extends the period of time over which we have global coverage of tropical cyclone occurrence and intensity," she said.

The researchers' ability to correlate seismic data with storm intensity is vital, said Allison Wing, an assistant professor of earth, ocean and atmospheric science at Florida State University, who was not involved in this research. "When it comes to understanding tropical cyclones—what controls their variability and their response to climate and climate change—having more data is better, in particular data that can tell us about intensity, which their method seems to do. ... It helps us constrain the range of variability that hurricane intensity can have."

This connection between storms and seismicity began when Gualtieri decided to play with hurricane data in her free time, she said. But when she superimposed the hurricane data over the seismic data, she knew she was on to something. "I said, 'Wow, there's something more than just play. Let's contact someone who can help.'"

Her research team ultimately grew to include a second seismologist, two atmospheric scientists and a statistician. "The most challenging part was establishing communications with scientists coming from different backgrounds," said Pascale. "Often, in different fields in science, we speak different dialects, different scientific dialects." Once they developed a "shared dialect," he said, they began to make exciting discoveries. "This is how science evolves," said Pascale. "Historically, it's always been like that. Disciplines first evolve within their own kingdom, then a new field is born."

 


 

New Study Finds Sea Level Rise Accelerating

February 13, 2018: The rate of global sea level rise has been accelerating in recent decades, rather than increasing steadily, according to a new study based on 25 years of NASA and European satellite data. 54) 55) 56)

This acceleration, driven mainly by increased melting in Greenland and Antarctica, has the potential to double the total sea level rise projected for 2100 when compared to projections that assume a constant rate of sea level rise, according to lead author Steve Nerem. Nerem is a professor of Aerospace Engineering Sciences at the University of Colorado Boulder, a fellow at Colorado's CIRES (Cooperative Institute for Research in Environmental Sciences), and a member of NASA's Sea Level Change team.

If the rate of ocean rise continues to change at this pace, sea level will rise 65 cm by 2100 — enough to cause significant problems for coastal cities, according to the new assessment by Nerem and colleagues from NASA/GSFC (Goddard Space Flight Center) in Greenbelt, Maryland; CU Boulder; the University of South Florida in Tampa; and Old Dominion University in Norfolk, Virginia. The team, driven to understand and better predict Earth's response to a warming world, published their work Feb. 12 in the journal PNAS (Proceedings of the National Academy of Sciences). 57)

"This is almost certainly a conservative estimate," Nerem said. "Our extrapolation assumes that the sea level continues to change in the future as it has over the last 25 years. Given the large changes we are seeing in the ice sheets today, that's not likely."


Figure 30: NASA Scientific Visualization Studio image by Kel Elkins, using data from Jason-1, Jason-2, and TOPEX/Poseidon. Story by Katie Weeman, CIRES, and Patrick Lynch, NASA GSFC. Edited by Mike Carlowicz.

Rising concentrations of greenhouse gases in Earth's atmosphere increase the temperature of air and water, which causes sea level to rise in two ways. First, warmer water expands, and this "thermal expansion" of the ocean has contributed about half of the 7 cm of global mean sea level rise we've seen over the last 25 years, Nerem said. Second, melting land ice flows into the ocean, also increasing sea level across the globe.

These increases were measured using satellite altimeter measurements since 1992, including the Topex/Poseidon, Jason-1, Jason-2 and Jason-3 satellite missions, which have been jointly managed by multiple agencies, including NASA, CNES (Centre National d'Etudes Spatiales), EUMETSAT (European Organisation for the Exploitation of Meteorological Satellites), and NOAA (National Oceanic and Atmospheric Administration). NASA's Jet Propulsion Laboratory in Pasadena, California, manages the U.S. portion of these missions for NASA's Science Mission Directorate. The rate of sea level rise in the satellite era has risen from about 2.5 mm/ year in the 1990s to about 3.4 mm/year today.

"The Topex/Poseidon/Jason altimetry missions have been essentially providing the equivalent of a global network of nearly half a million accurate tide gauges, providing sea surface height information every 10 days for over 25 years," said Brian Beckley, of NASA Goddard, second author on the new paper and lead of a team that processes altimetry observations into a global sea level data record. "As this climate data record approaches three decades, the fingerprints of Greenland and Antarctic land-based ice loss are now being revealed in the global and regional mean sea level estimates."

Satellite altimetry has shown that GMSL (Global Mean Sea Level) has been rising at a rate of ~3 ± 0.4 mm/y since 1993. Using the altimeter record coupled with careful consideration of interannual and decadal variability as well as potential instrument errors, we show that this rate is accelerating at 0.084 ± 0.025 mm/y², which agrees well with climate model projections. If sea level continues to change at this rate and acceleration, sea-level rise by 2100 (~65 cm) will be more than double the amount if the rate was constant at 3 mm/y.

Table 2: Significance of global sea level rise
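
As a rough check, the ~65 cm projection can be reproduced from the quoted rate and acceleration with a simple quadratic extrapolation. The sketch below assumes the rate applies at the approximate midpoint of the altimetry record (taken here as 2005); that reference epoch is an assumption made for this illustration, not a detail stated above.

```python
# Quadratic extrapolation of GMSL to 2100 using the rate and acceleration
# quoted above. Referencing the rate to 2005 (the approximate midpoint of the
# 1993-2017 altimetry record) is an assumption made for this illustration.
rate = 3.0           # mm/year at the 2005 reference epoch
accel = 0.084        # mm/year^2
years = 2100 - 2005  # 95 years of extrapolation

rise_mm = rate * years + 0.5 * accel * years ** 2
rise_constant_mm = rate * years               # if the rate stayed at 3 mm/year

print(round(rise_mm / 10), "cm with acceleration")           # ~66 cm
print(round(rise_constant_mm / 10), "cm at constant rate")   # ~28 cm
```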

 


 

Ozone layer not recovering in lower latitudes, despite ozone hole healing at the poles

February 8, 2018: The ozone layer - which protects us from harmful ultraviolet radiation - is recovering at the poles, but unexpected decreases in part of the atmosphere may be preventing recovery at lower latitudes. Global ozone has been declining since the 1970s owing to certain man-made chemicals. Since these were banned, parts of the layer have been recovering, particularly at the poles. 58)

However, the new result, published in the EGU (European Geosciences Union) journal Atmospheric Chemistry and Physics, finds that the bottom part of the ozone layer at more populated latitudes is not recovering. The cause is currently unknown. 59)

Ozone is a substance that forms in the stratosphere - the region of the atmosphere between about 10 and 50 km altitude, above the troposphere that we live in. It is produced in tropical latitudes and distributed around the globe. A large portion of the resulting ozone layer resides in the lower part of the stratosphere. The ozone layer absorbs much of the UV radiation from the Sun, which, if it reaches the Earth's surface, can cause damage to DNA in plants, animals and humans.

In the 1970s, it was recognized that chemicals called CFCs (Chlorofluorocarbons), used for example in refrigeration and aerosols, were destroying ozone in the stratosphere. The effect was worst in the Antarctic, where an ozone 'hole' formed.

In 1987, the Montreal Protocol, an international treaty, was agreed, which led to the phase-out of CFCs and, recently, the first signs of recovery of the Antarctic ozone layer. The upper stratosphere at lower latitudes is also showing clear signs of recovery, proving the Montreal Protocol is working well.

However, despite this success, scientists have evidence that stratospheric ozone is likely not recovering at lower latitudes, between 60º N and 60º S, due to unexpected decreases in ozone in the lower part of the stratosphere.

Study co-author Professor Joanna Haigh, Co-Director of the Grantham Institute for Climate Change and the Environment at Imperial College London, said: "Ozone has been seriously declining globally since the 1980s, but while the banning of CFCs is leading to a recovery at the poles, the same does not appear to be true for the lower latitudes. The potential for harm in lower latitudes may actually be worse than at the poles. The decreases in ozone are less than we saw at the poles before the Montreal Protocol was enacted, but UV radiation is more intense in these regions and more people live there."

The cause of this decline is not certain, although the authors suggest a couple of possibilities. One is that climate change is altering the pattern of atmospheric circulation, causing more ozone to be carried away from the tropics.

The other possibility is that very short-lived substances (VSLSs), which contain chlorine and bromine, could be destroying ozone in the lower stratosphere. VSLSs include chemicals used as solvents, paint strippers, and as degreasing agents. One is even used in the production of an ozone-friendly replacement for CFCs.

Dr William Ball from ETH Zürich [Eidgenoessische Technische Hochschule, Zürich (Swiss Federal Institute of Technology, Zürich)] and PMOD/WRC [Physikalisch-Meteorologisches Observatorium Davos, World Radiation Center (Switzerland)], who led the analysis, said: "The finding of declining low-latitude ozone is surprising, since our current best atmospheric circulation models do not predict this effect. Very short-lived substances could be the missing factor in these models."

It was thought that very short-lived substances would not persist long enough in the atmosphere to reach the height of the stratosphere and affect ozone, but more research may be needed.

To conduct the analysis, the team developed new algorithms to combine the efforts of multiple international teams that have worked to connect data from different satellite missions since 1985 and create a robust, long time series.

William Ball said: "The study is an example of the concerted international effort to monitor and understand what is happening with the ozone layer; many people and organizations prepared the underlying data, without which the analysis would not have been possible."

Although individual datasets had previously hinted at a decline, the application of advanced merging techniques and time series analysis has revealed a longer term trend of ozone decrease in the stratosphere at lower altitudes and latitudes.

The researchers say the focus now should be on getting more precise data on the ozone decline, and determining what the cause most likely is, for example by looking for the presence of VSLSs in the stratosphere.

Dr Justin Alsing from the Flatiron Institute in New York, who took on a major role in developing and implementing the statistical technique used to combine the data, said: "This research was only possible because of a great deal of cross-disciplinary collaboration. My field is normally cosmology, but the technique we developed can be used in any science looking at complex datasets."
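The dynamical linear modelling used in the paper is beyond a short example, but the basic step of fitting a post-1998 trend to a merged ozone anomaly series can be sketched as follows. The synthetic data and the ordinary least-squares fit are illustrative assumptions only.

```python
# Illustrative post-1998 trend fit on a synthetic lower-stratospheric ozone
# anomaly series. The real study uses dynamical linear modelling on merged
# satellite composites; this is only the simplest possible analogue.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1998, 2017)                      # 1998-2016, as in the study
true_trend = -0.2                                  # assumed % per year
anomaly = true_trend * (years - 1998) + rng.normal(0, 0.5, years.size)

# Ordinary least-squares trend and its 1-sigma uncertainty
A = np.vstack([years - 1998, np.ones(years.size)]).T
coef, res, _, _ = np.linalg.lstsq(A, anomaly, rcond=None)
dof = years.size - 2
sigma2 = res[0] / dof
cov = sigma2 * np.linalg.inv(A.T @ A)
trend, trend_err = coef[0], np.sqrt(cov[0, 0])

print(f"trend = {trend:.2f} +/- {trend_err:.2f} % per year")
```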

Following the successful implementation of the Montreal Protocol, total column ozone stabilized at the end of the 1990s, but the search for the first signs of recovery in total column ozone integrated between 60º S and 60º N has not yet been successful. The lower stratosphere, below 24 km (~32 hPa), contains a large fraction of the total column ozone and is a region of large natural variability that has previously inhibited detection of significant trends. With longer time series, improved composites, and integration of the lower stratospheric column, we can now detect statistically significant trends in this region. We find that the negative ozone trend within the lower stratosphere between 1998 and 2016 is the main reason why a statistically significant recovery in total column ozone has remained elusive. Our main findings are as follows:

• We further confirm the findings of other studies that the Montreal Protocol is successfully reducing the impact of halogenated ozone-depleting substances, as indicated by the highly probable recovery (>95 %) measured in upper stratospheric regions (1–10 hPa or 32–48 km).

• Lower stratospheric ozone (between 147 and 32 hPa (13–24 km) at mid-latitudes, or 100 and 32 hPa (17–24 km) at tropical latitudes) has continued to decrease since 1998 between 60º S and 60º N, with a probability of 99% in two of the three analyzed datasets and 87% in the third.

• The main stratospheric dataset considered indicates a highly probable (95 %) decrease in the ozone layer since 1998, i.e. in stratospheric ozone (between 147 and 1 hPa (13–48 km) at mid-latitudes, or 100 and 1 hPa (17–48 km) at tropical latitudes) integrated over latitudes 60º S–60º N – the other composites support this result when considering the associated caveats of each.

• There is no significant change in total column ozone between 1998 and 2016, which includes both tropospheric ozone and the stratospheric ozone layer – indeed no change is the most probable result indicated, which our findings imply is a consequence of increasing tropospheric ozone, together with the slowed rate of decrease in stratospheric ozone following the Montreal Protocol.

• State-of-the-art models, nudged to have historical atmospheric dynamics as realistic as possible, do not reproduce these observed decreases in lower stratospheric ozone.

Table 3: Summary of the published paper (Ref. 59)

 


 

Heat loss from Earth's interior triggers Greenland's ice sheet slide towards the sea

January 30, 2018: In North-East Greenland, researchers have measured the loss of heat that comes up from the interior of the Earth. This enormous area is a geothermal 'hot spot' that melts the ice sheet from below and triggers the sliding of glaciers towards the sea. The melting is more intense and proceeds faster than any model has previously predicted. 60)

As reported in the journal Scientific Reports, researchers from the Arctic Research Center, Aarhus University (Aarhus, Denmark), and the Greenland Institute of Natural Resources (Nuuk, Greenland) present results that, for the first time, show that the deep bottom water of the north-eastern Greenland fjords is being warmed up by heat gradually lost from the Earth's interior. And the researchers point out that this heat loss triggers the sliding of glaciers from the ice sheet towards the sea. 61)

Icelandic conditions: "North-East Greenland has several hot springs where the water becomes up to 60 degrees warm and, like Iceland, the area has abundant underground geothermal activity," explains Professor Søren Rysgaard, who headed the investigations.

For more than ten years (2005-2015), the researchers have measured the temperature and salinity in the fjord Young Sound, located at Daneborg, north of Scoresbysund, which has many hot springs, and south of the glacier Nioghalvfjerdsfjorden, which melts rapidly and is connected to the North-East Greenland Ice Stream (NEGIS).

By focusing on an isolated basin in the fjord with a depth range between 200 and 340 m, the researchers have measured how the deep water is heated over a ten-year period. Based on the extensive data, researchers have estimated that the loss of heat from the Earth's interior to the fjord is about 100 mW/m². This corresponds to a 2 MW wind turbine sending electricity to a large heater at the bottom of the fjord all year round.
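The comparison with a 2 MW wind turbine follows from simple arithmetic once a basin area is assumed. The ~20 km² area used below is an assumption made for this illustration, since the article quotes only the flux and the turbine analogy.

```python
# Back-of-envelope check of the "2 MW wind turbine" comparison.
# The ~20 km2 basin area is an assumption made for this illustration.
heat_flux = 0.100          # W/m2 (100 mW/m2 from the Earth's interior)
basin_area = 20e6          # m2 (assumed ~20 km2 deep basin in Young Sound)

power_watts = heat_flux * basin_area
print(power_watts / 1e6, "MW")   # -> 2.0 MW
```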

Heat from the Earth's interior — an important influence: It is not easy to measure the geothermal heat flux — heat emanating from the Earth's interior — below a glacier, but within the area there are several large glaciers connected directly to the ice sheet. If the Earth releases heat to a fjord, heat also seeps up to the bottom part of the glaciers. This means that the glaciers melt from below and thus slide more easily over the terrain on which they sit when moving to the sea.

"It is a combination of higher temperatures in the air and the sea, precipitation from above, local dynamics of the ice sheet and heat loss from the Earth's interior that determines the mass loss from the Greenland ice sheet," explains Søren Rysgaard.

The researchers expect that the new discoveries will improve the models of ice sheet dynamics, allowing better predictions of the stability of the Greenland ice sheet, its melting and the resulting global water rise.


Figure 31: Geothermal vent localities and ice surface speeds (2008–2009) for Greenland. Shown are geothermal vent localities on land with temperatures >10ºC, boreholes, hydrothermal vent complexes offshore and the present study site. Reconstructed geothermal anomalies (contours in inserted box). Ice drilling localities are indicated by CC, NGRIP, GRIP and Dye (image credit: Research Team of Aarhus University)

 


 

Dust on Snow Controls Springtime River Rise

January 23, 2018: A new study has found that dust, not spring warmth, controls the pace of spring snowmelt that feeds the headwaters of the Colorado River. Contrary to conventional wisdom, the amount of dust on the mountain snowpack controls how fast the Colorado Basin's rivers rise in the spring regardless of air temperature, with more dust correlated with faster spring runoff and higher peak flows. 62)

The finding is valuable for western water managers and advances our understanding of how freshwater resources, in the form of snow and ice, will respond to warming temperatures in the future. By improving knowledge of what controls the melting of snow, it improves understanding of the controls on how much solar heat Earth reflects back into space and how much it absorbs — an important factor in studies of weather and climate.

When snow gets covered by a layer of windblown dust or soot, the dark topcoat increases the amount of heat the snow absorbs from sunlight. Tom Painter of NASA's Jet Propulsion Laboratory in Pasadena, California, has been researching the consequences of dust on snowmelt worldwide. This is the first study to focus on which has a stronger influence on spring runoff: warmer air temperatures or a coating of dust on the snow.

Windblown dust has increased in the U.S. Southwest as a result of changing climate patterns and human land-use decisions. With rainfall decreasing and more disturbances of the land, protective crusts on soil are removed and more bare soil is exposed. Winter and spring winds pick up the dusty soil and drop it on the Colorado Rockies to the northeast. Historical lake sediment analyses show there is currently an annual average of five to seven times more dust falling on the Rocky Mountain snowpack than there was before the mid-1800s.

Painter and colleagues looked at data on air temperature and dust in a mountain basin in southwestern Colorado from 2005 to 2014, and streamflow from three major tributary rivers that carry snowmelt from these mountains to the Colorado River. The Colorado River's basin spans about 246,000 square miles (637,000 km²) in parts of seven western states.

The researchers found that the effects of dust dominated the pace of the spring runoff even in years with unusually warm spring air temperatures. Conversely, there was almost no statistical correlation between air temperature and the pace of runoff.
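A minimal sketch of this kind of comparison is given below: compute the correlation of the spring runoff rise with dust loading and with air temperature over a ten-year record. The numbers are synthetic, and the simple Pearson correlation is an illustrative stand-in for the study's statistical analysis.

```python
# Illustrative comparison of the two candidate controls on runoff timing:
# dust loading vs. spring air temperature. The numbers are synthetic;
# the study itself uses ten years of basin observations (2005-2014).
import numpy as np

rng = np.random.default_rng(7)
n_years = 10
dust = rng.uniform(5, 50, n_years)               # end-of-season dust, g/m2 (assumed)
temperature = rng.normal(8, 1.5, n_years)        # mean spring temperature, deg C

# Make the synthetic runoff ramp rate depend strongly on dust, weakly on T
runoff_rise = 0.8 * dust + 0.5 * temperature + rng.normal(0, 3, n_years)

r_dust = np.corrcoef(dust, runoff_rise)[0, 1]
r_temp = np.corrcoef(temperature, runoff_rise)[0, 1]
print(f"correlation with dust: {r_dust:.2f}, with temperature: {r_temp:.2f}")
```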

"We found that when it's clean, the rise to the peak streamflow is slower, and generally you get a smaller peak." Painter said. "When the snowpack is really dusty, water just blasts out of the mountains." The finding runs contrary to the widely held assumption that spring air temperature determines the likelihood of flooding.

Coauthor McKenzie Skiles, an assistant professor in the University of Utah Department of Geography, said that while the impacts of dust in the air, such as reduced air quality, are well known, the impacts of the dust once it's been deposited on the land surface are not as well understood. "Given the reliance of the western U.S. on the natural snow reservoir, and the Colorado River in particular, it is critical to evaluate the impact of increasing dust deposition on the mountain snowpack," she said.


Figure 32: A coating of dust on snow speeds the pace of snowmelt in the spring (image credit: NASA)

Painter pointed out that the new finding doesn't mean air temperatures in the region can be ignored in considering streamflows and flooding, especially in the future. "As air temperature continues to climb, it's going to have more influence," he said. Temperature controls whether precipitation falls as snow or as rain, for example, so ultimately it controls how much snow there is to melt. But, he said, "temperature is unlikely to control the variability in snowmelt rates. That will still be controlled by how dirty or clean the snowpack is."

Skiles noted, "Dust on snow does not only impact the mountains that make up the headwaters of Colorado River. Surface darkening has been observed in mountain ranges all over the world, including the Alps and the Himalaya. What we learn about the role of dust deposition for snowmelt timing and intensity here in the western U.S. has global implications for improved snowmelt forecasting and management of snow water resources."

The study, titled "Variation in rising limb of Colorado River snowmelt runoff hydrograph controlled by dust radiative forcing in snow," was published today in the journal Geophysical Research Letters. Coauthors are from the University of Utah, Salt Lake City; University of Colorado, Boulder; and University of California, Santa Barbara. 63)

 


 

Study of Extreme Wintertime Arctic Warm Event

January 16, 2018: In the winter of 2015/16, something happened that had never before been seen on this scale: at the end of December, temperatures rose above zero degrees Celsius for several days in parts of the Arctic. Temperatures of up to eight degrees were registered north of Svalbard. Temperatures this high have not been recorded in the winter half of the year since the beginning of systematic measurements at the end of the 1970s. As a result of this unusual warmth, the sea ice began to melt. 64)

"We heard about this from the media," says Heini Wernli, Professor of Atmospheric Dynamics at ETH Zurich. The news aroused his scientific curiosity, and a team led by his then doctoral student Hanin Binder investigated the issue. In December 2017, they published their analysis of this exceptional event in the journal Geophysical Research Letters. 65)

The researchers show in their paper how these unusual temperatures arose: three different air currents met over the North Sea between Scotland and southern Norway, carrying warm air northwards at high speed as though on a "highway" (Figure 33).

One air current originated in the Sahara and brought near-surface warm air with it. To begin with, the temperature of this air was about 20º Celsius. While it cooled off on its way to the Arctic, it was still above zero when it arrived. "It's extremely rare for warm, near-surface subtropical air to be transported as far as the Arctic," says Binder.

The second air current originated in the Arctic itself, a fact that astonished the scientists. To begin with, this air was very cold. However, the air mass – which also lay close to the ground – moved towards the south along a curved path and, while above the Atlantic, was warmed significantly by the heat flux from the ocean before joining the subtropical air current.

The third warm air current started as a cold air mass in the upper troposphere, from an altitude above 5 km. These air masses were carried from west to east and descended in a stationary high-pressure area over Scandinavia. Compression thereby warmed the originally cold air, before it entered the "highway to the Arctic".


Figure 33: Schematic illustration of the unusual processes that led to the Arctic warm event (warm air highway), image credit: Sandro Bösch / ETH Zurich

Poleward warm air transport: This highway of air currents was made possible by a particular constellation of pressure systems over northern Europe. During the period in question, intense low-pressure systems developed over Iceland while an extremely stable high-pressure area formed over Scandinavia. This created a kind of funnel above the North Sea, between Scotland and southern Norway, which channelled the various air currents and steered them northwards to the Arctic.

This highway lasted approximately a week. The pressure systems then decayed and the Arctic returned to its typical frozen winter state. However, the warm period sufficed to reduce the thickness of the sea ice in parts of the Arctic by 30 cm – during a period in which ice usually becomes thicker and more widespread.

"These weather conditions and their effect on the sea ice were really exceptional," says Binder. The researchers were not able to identify a direct link to global warming. "We only carried out an analysis of a single event; we didn't research the long-term climate aspects" emphasizes Binder.

However, the melting of Arctic sea ice during summer is a different story. The long-term trend is clear: the minimum extent and thickness of the sea ice in late summer has been shrinking continually since the end of the 1970s. Sea ice melted particularly severely in 2007 and 2012 – a fact which climate researchers have thus far been unable to fully explain. Along with Lukas Papritz from the University of Bergen, Wernli investigated the causes of these outliers.

According to their research, the severe melting in the aforementioned years was caused by stable high-pressure systems that formed repeatedly throughout the summer months. Under these cloud-free weather conditions, the high level of direct sunlight – the sun shines 24 hours a day at this time of year – particularly intensified the melting of the sea ice.

The extreme event was the result of a very unusual large-scale flow configuration in early winter 2015/2016 that came along with overall anomalously warm conditions in Europe (National Oceanic and Atmospheric Administration, 2016) and other regional extremes, for example, flooding in the UK. 66) In this study (Ref. 65), we focus on the Arctic. At the North Pole, buoys measured maximum surface temperatures of -0.8ºC on 30 December 67), and at the Svalbard airport station values of 8.7ºC were observed, the warmest temperatures ever recorded at that station between November and April (The Norwegian Meteorological Institute, 2016). According to operational analyses from the ECMWF (European Center for Medium-Range Weather Forecasts), the maximum 2 m temperature (T2m) north of 82ºN reached values larger than 0ºC during three short episodes between 29 December 2015 and 4 January 2016 — almost 30 K above the winter climatological mean in this region (Figure 34a). They occurred in the Eurasian Arctic sector in the region around Svalbard and over the Kara Sea (purple contour in Figure 34b) and were the highest winter values since 1979 (Figure 34c). The warm event led to a thinning of the sea ice by more than 30 cm in the Barents and Kara Seas, and contributed to the record low Northern Hemisphere sea ice extent observed in January and February 2016 (National Snow and Ice Data Center, 2016).
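The ranking idea behind Figure 34c can be sketched in a few lines: count how many climatological winter values fall below the observed maximum temperature. The climatology below is synthetic; the study ranks the observation against 13,232 six-hourly ERA-Interim winter values.

```python
# Illustrative ranking of an observed 2 m temperature against a winter
# climatology, mimicking the idea behind Figure 34c. The climatology here is
# synthetic; the study uses 13,232 six-hourly ERA-Interim winter values.
import numpy as np

rng = np.random.default_rng(3)
climatology = rng.normal(-30.0, 5.0, 13232)    # synthetic winter T2m values, deg C
observed_max = 0.5                              # observed maximum T2m, deg C

rank = int(np.sum(climatology >= observed_max)) + 1   # rank 1 = warmest on record
percentile = 100.0 * np.mean(climatology < observed_max)
print(f"rank {rank} among {climatology.size} winter values, {percentile:.1f}th percentile")
```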


Figure 34: Illustration of the Arctic warm event and its extremeness. (a) Temporal evolution of the domain maximum (red) and mean (blue) T2m (ºC) between 20 December 2015 and 10 January 2016 at latitudes ≥82ºN and between 120ºW and 120ºE, derived from operational analyses. Also shown are the domain mean December–February 1979–2014 climatological mean T2m (black), and the corresponding ±1 standard deviation envelope (grey) from ERA-Interim reanalysis data. (b) Maximum T2m (ºC) between 00 UTC 30 December 2015 and 18 UTC 4 January 2016 from operational analyses, with the purple contour highlighting the regions ≥82ºN with maximum T2m ≥ 0ºC. (c) Rank of maximum T2m shown in Figure 34b among all 6-hourly values in winter 1979–2014 in the ERA-Interim reanalyses (consisting of a total of 13,232 values), image credit: study team

 


 

Long-Term Warming Trend Continued in 2017

January 18, 2018: Earth's global surface temperatures in 2017 ranked as the second warmest since 1880, according to an analysis by NASA. Continuing the planet's long-term warming trend, globally averaged temperatures in 2017 were 1.62º Fahrenheit (0.90º Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA's GISS (Goddard Institute for Space Studies) in New York. That is second only to global temperatures in 2016. 68)

In a separate, independent analysis, scientists at NOAA (National Oceanic and Atmospheric Administration) concluded that 2017 was the third-warmest year in their record. The minor difference in rankings is due to the different methods used by the two agencies to analyze global temperatures, although over the long-term the agencies' records remain in strong agreement. Both analyses show that the five warmest years on record all have taken place since 2010.

Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. Taking this into account, NASA estimates that 2017's global mean change is accurate to within 0.1º Fahrenheit, with a 95 percent certainty level.

"Despite colder than average temperatures in any one part of the world, temperatures over the planet as a whole continue the rapid warming trend we've seen over the last 40 years," said GISS Director Gavin Schmidt.

The planet's average surface temperature has risen about 2 degrees Fahrenheit (a little more than 1 degree Celsius) during the last century or so, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere. Last year was the third consecutive year in which global temperatures were more than 1.8 degrees Fahrenheit (1 degree Celsius) above late nineteenth-century levels.

Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature. A warming El Niño event was in effect for most of 2015 and the first third of 2016. Even without an El Niño event – and with a La Niña starting in the later months of 2017 – last year's temperatures ranked between 2015 and 2016 in NASA's records.


Figure 35: This map shows Earth's average global temperature from 2013 to 2017, as compared to a baseline average from 1951 to 1980, according to an analysis by NASA's Goddard Institute for Space Studies. Yellows, oranges, and reds show regions warmer than the baseline (image credit: NASA's Scientific Visualization Studio)

In an analysis where the effects of the recent El Niño and La Niña patterns were statistically removed from the record, 2017 would have been the warmest year on record.

Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2017 annual mean temperature for the contiguous 48 United States was the third warmest on record.

Warming trends are strongest in the Arctic regions, where 2017 saw the continued loss of sea ice.

NASA's temperature analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations.

These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980.
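A minimal sketch of an anomaly calculation of this kind is shown below: subtract a 1951 to 1980 baseline and average with cosine-of-latitude weights. The gridded values are synthetic, and the real GISTEMP analysis additionally handles station spacing and urban adjustment as described above.

```python
# Illustrative anomaly calculation against a 1951-1980 baseline with
# area (cosine-of-latitude) weighting. The gridded data are synthetic; the
# actual GISTEMP analysis additionally handles station spacing and urban effects.
import numpy as np

rng = np.random.default_rng(11)
lats = np.arange(-87.5, 90, 5.0)                       # 5-degree latitude bands
weights = np.cos(np.deg2rad(lats))
weights /= weights.sum()

baseline = rng.normal(14.0, 1.0, lats.size)            # 1951-1980 mean T per band
year_2017 = baseline + rng.normal(0.9, 0.2, lats.size) # synthetic 2017 temperatures

anomaly = np.sum(weights * (year_2017 - baseline))
print(f"global mean anomaly: {anomaly:.2f} deg C relative to 1951-1980")
```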

NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth's polar regions and global temperatures. The full 2017 surface temperature data set and the complete methodology used to make the temperature calculation are available.

GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.

NASA uses the unique vantage point of space to better understand Earth as an interconnected system. The agency also uses airborne and ground-based monitoring, and develops new ways to observe and study Earth with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

 


 

Study of Antarctic Ozone Hole Recovery

January 5, 2018: For the first time, scientists have shown through direct observations of the ozone hole by an instrument on NASA's Aura mission, that levels of ozone-destroying chlorine are declining, resulting in less ozone depletion. Measurements show that the decline in chlorine, resulting from an international ban on chlorine-containing human-produced chemicals called chlorofluorocarbons (CFCs), has resulted in about 20 percent less ozone depletion during the Antarctic winter than there was in 2005 — the first year that measurements of chlorine and ozone during the Antarctic winter were made by the Aura satellite. 69)

- "We see very clearly that chlorine from CFCs is going down in the ozone hole, and that less ozone depletion is occurring because of it," said lead author Susan Strahan, an atmospheric scientist from NASA's Goddard Space Flight Center in Greenbelt, Maryland. The study was published in the journal Geophysical Research Letters. 70)

- CFCs are long-lived chemical compounds that eventually rise into the stratosphere, where they are broken apart by the Sun's ultraviolet radiation, releasing chlorine atoms that go on to destroy ozone molecules. Stratospheric ozone protects life on the planet by absorbing potentially harmful ultraviolet radiation that can cause skin cancer and cataracts, suppress immune systems and damage plant life.

- Two years after the discovery of the Antarctic ozone hole in 1985, nations of the world signed the Montreal Protocol on Substances that Deplete the Ozone Layer, which regulated ozone-depleting compounds. Later amendments to the Montreal Protocol completely phased out production of CFCs.

- Past studies have used statistical analyses of changes in the ozone hole's size to argue that ozone depletion is decreasing. This study is the first to use measurements of the chemical composition inside the ozone hole to confirm that not only is ozone depletion decreasing, but that the decrease is caused by the decline in CFCs.

- The Antarctic ozone hole forms during September in the Southern Hemisphere's winter as the returning Sun's rays catalyze ozone destruction cycles involving chlorine and bromine that come primarily from CFCs. To determine how ozone and other chemicals have changed year to year, scientists used data from JPL's MLS (Microwave Limb Sounder) aboard the Aura satellite, which has been making measurements continuously around the globe since mid-2004. While many satellite instruments require sunlight to measure atmospheric trace gases, MLS measures microwave emissions and, as a result, can measure trace gases over Antarctica during the key time of year: the dark southern winter, when the stratospheric weather is quiet and temperatures are low and stable.

Figure 36: Using measurements from NASA's Aura satellite, scientists studied chlorine within the Antarctic ozone hole over the last several years, watching as the amount slowly decreased (image credit: NASA/GSFC, Katy Mersmann)

The change in ozone levels above Antarctica from the beginning to the end of southern winter — early July to mid-September — was computed daily from MLS measurements every year from 2005 to 2016. "During this period, Antarctic temperatures are always very low, so the rate of ozone destruction depends mostly on how much chlorine there is," Strahan said. "This is when we want to measure ozone loss."

They found that ozone loss is decreasing, but they needed to know whether a decrease in CFCs was responsible. When ozone destruction is ongoing, chlorine is found in many molecular forms, most of which are not measured. But after chlorine has destroyed nearly all the available ozone, it reacts instead with methane to form hydrochloric acid, a gas measured by MLS. "By around mid-October, all the chlorine compounds are conveniently converted into one gas, so by measuring hydrochloric acid we have a good measurement of the total chlorine," Strahan said.

Nitrous oxide is a long-lived gas that behaves just like CFCs in much of the stratosphere. The CFCs are declining at the surface but nitrous oxide is not. If CFCs in the stratosphere are decreasing, then over time, less chlorine should be measured for a given value of nitrous oxide. By comparing MLS measurements of hydrochloric acid and nitrous oxide each year, they determined that the total chlorine levels were declining on average by about 0.8 percent annually.
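The fractional decline can be illustrated with a toy version of that comparison: take the HCl value at a fixed N2O level for each year and fit an exponential trend. The mixing ratios below are synthetic assumptions; only the fitting idea is carried over from the study.

```python
# Illustrative estimate of the fractional chlorine decline. For each year we
# take the HCl value at a fixed N2O level (synthetic here) and fit an
# exponential trend; the study does this with MLS HCl and N2O measurements.
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(2005, 2017)
true_decline = -0.008                                   # -0.8 % per year
hcl = 3.0e-9 * np.exp(true_decline * (years - 2005))    # mixing ratio at fixed N2O
hcl *= 1 + rng.normal(0, 0.002, years.size)             # measurement scatter

slope = np.polyfit(years - 2005, np.log(hcl), 1)[0]
print(f"chlorine decline: {100 * slope:.2f} % per year")  # ~ -0.8
```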

The 20 percent decrease in ozone depletion during the winter months from 2005 to 2016 as determined from MLS ozone measurements was expected. "This is very close to what our model predicts we should see for this amount of chlorine decline," Strahan said. "This gives us confidence that the decrease in ozone depletion through mid-September shown by MLS data is due to declining levels of chlorine coming from CFCs. But we're not yet seeing a clear decrease in the size of the ozone hole because that's controlled mainly by temperature after mid-September, which varies a lot from year to year."

Looking forward, the Antarctic ozone hole should continue to recover gradually as CFCs leave the atmosphere, but complete recovery will take decades. "CFCs have lifetimes from 50 to 100 years, so they linger in the atmosphere for a very long time," said Anne Douglass, a fellow atmospheric scientist at Goddard and the study's co-author. "As far as the ozone hole being gone, we're looking at 2060 or 2080. And even then there might still be a small hole."

 


 

Study solves a conflict in the post-2006 atmospheric methane budget concentrations

January 2, 2018: A new NASA-led study has solved a puzzle involving the recent rise in atmospheric methane, a potent greenhouse gas, with a new calculation of emissions from global fires. The new study resolves what looked like irreconcilable differences in explanations for the increase. 71)

Methane emissions have been rising sharply since 2006. Different research teams have produced viable estimates for two known sources of the increase: emissions from the oil and gas industry, and microbial production in wet tropical environments like marshes and rice paddies. But when these estimates were added to estimates of other sources, the sum was considerably more than the observed increase. In fact, each new estimate was large enough to explain the whole increase by itself.

John Worden of NASA's Jet Propulsion Laboratory in Pasadena, California, and colleagues focused on fires because they're also changing globally. The area burned each year decreased about 12 percent between the early 2000s and the more recent period of 2007 to 2014, according to a new study using observations by NASA's MODIS (Moderate Resolution Imaging Spectroradiometer) satellite instrument. The logical assumption would be that methane emissions from fires have decreased by about the same percentage. Using satellite measurements of methane and carbon monoxide, Worden's team found the real decrease in methane emissions was almost twice as much as that assumption would suggest.

When the research team subtracted this large decrease from the sum of all emissions, the methane budget balanced correctly, with room for both fossil fuel and wetland increases. The research is published in the journal Nature Communications. 72)

Most methane molecules in the atmosphere don't have identifying features that reveal their origin. Tracking down their sources is a detective job involving multiple lines of evidence: measurements of other gases, chemical analyses, isotopic signatures, observations of land use, and more. "A fun thing about this study was combining all this different evidence to piece this puzzle together," Worden said.

Carbon isotopes in the methane molecules are one clue. Of the three methane sources examined in the new study, emissions from fires contain the largest percentage of heavy carbon isotopes, microbial emissions have the smallest, and fossil fuel emissions are in between. Another clue is ethane, which (like methane) is a component of natural gas. An increase in atmospheric ethane indicates increasing fossil fuel sources. Fires emit carbon monoxide as well as methane, and measurements of that gas are a final clue.

Worden's team used carbon monoxide and methane data from the MOPITT (Measurements of Pollution in the Troposphere) instrument on NASA's Terra satellite and the TES (Tropospheric Emission Spectrometer) instrument on NASA's Aura to quantify fire emissions of methane. The results show these emissions have been decreasing much more rapidly than expected.

Combining isotopic evidence from ground surface measurements with the newly calculated fire emissions, the team showed that about 17 teragrams per year of the increase is due to fossil fuels, another 12 is from wetlands or rice farming, while fires are decreasing by about 4 teragrams per year. The three numbers combine to a net emissions increase of ~25 Tg of CH4 per year, the same as the observed increase.

The magnitude of the global CH4 masses involved is illustrated by the unit: 1 Tg (1 teragram) = 10¹² g = 1,000,000 tons. Methane emissions are increasing by about 25 Tg/year, out of a total budget of currently ~550 Tg/year.
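The budget closure can be checked with simple arithmetic. The following is a minimal sketch using the rounded figures quoted above; it is not the study's code, and the values carry the uncertainties discussed in the paper.

# Rough closure check of the post-2006 methane budget (values in Tg CH4/year,
# rounded from the figures quoted above; illustrative only).
fossil_fuel_increase = 17.0   # rise attributed to fossil fuel sources
wetland_rice_increase = 12.0  # rise attributed to wetlands and rice farming
fire_decrease = 4.0           # decline in biomass-burning emissions

net_increase = fossil_fuel_increase + wetland_rice_increase - fire_decrease
observed_increase = 25.0      # observed growth in total emissions
total_budget = 550.0          # approximate total annual CH4 emissions

print(f"Net change from the three terms: {net_increase:+.0f} Tg CH4/yr")
print(f"Observed increase:               {observed_increase:+.0f} Tg CH4/yr")
print(f"Share of the ~{total_budget:.0f} Tg/yr budget: {100 * net_increase / total_budget:.1f}%")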

Worden's coauthors are at the NCAR (National Center for Atmospheric Research), Boulder, Colorado; and the Netherlands Institute for Space Research and University of Utrecht, both in Utrecht, the Netherlands.

Figure 37: This time series was created using data from the MODIS instruments onboard NASA's Terra and Aqua satellites. The burned area is estimated by applying an algorithm that detects rapid changes in visible and infrared surface reflectance imagery. Fires typically darken the surface in the visible part of the electromagnetic spectrum, and brighten the surface in several wavelength bands in the shortwave infrared that are sensitive to the surface water content of vegetation (image credit: NASA/GSFC/SVS)

Legend to Figure 37: Thermal emissions from actively burning fires also are measured by MODIS and are used to improve the burned area estimates in croplands and other areas where the fire sizes are relatively small. This animation portrays burned area between September 2000 and August 2015 as a percent of the 1/4 degree grid cell that was burned each month. The values on the color bar are on a log scale, so the regions shown in blue and green shades indicate small burned areas while those in red and orange represent a larger percent of the region burned. Beneath the burned area, the seasonal Blue Marble landcover shows the advance and retreat of snow in the northern hemisphere.

Trend in CH4 emissions from fires. Figure 38 shows the time series of CH4 emissions that were obtained from GFEDv4s (Global Fire Emissions Database, version 4s) and top-down estimates based on CO emission estimates and GFED4s-based emission ratios. The CO-based fire CH4 emissions estimates amount to 14.8 ± 3.8 Tg CH4 per year for the 2001–2007 time period and 11.1 ± 3 Tg CH4 per year for the 2008–2014 time period, with a 3.7 ± 1.4 Tg CH4 per year decrease between the two time periods. The mean burnt area (a priori)-based estimate from GFED4s is slightly larger and shows a slightly smaller decrease (2.3 Tg CH4 per year) in fire emissions after 2007 relative to the 2001–2006 time period. The range of uncertainties (shown as blue error bars in Figure 38) is determined by the uncertainty in top-down CO emission estimates that are derived empirically using the approaches discussed in the Methods. The red shading describes the range of uncertainty stemming from uncertainties in CH4/CO emission factors (Methods). By assuming temporally constant sector-specific CH4/CO emission factors, we find that mean 2001–2014 emissions average to 12.9 ± 3.3 Tg CH4 per year, and the decrease averages to 3.7 ± 1.4 Tg CH4 per year for 2008–2014, relative to 2001–2007. This decrease is largely accounted for by a 2.9 ± 1.2 Tg CH4 per year decrease during 2006–2008, which is primarily attributable to a biomass burning decrease in Indonesia and South America.


Figure 38: Trend of methane emissions from biomass burning. Expected methane emissions from fires based on the Global Fire Emissions Database (black) and the CO emissions plus CH4/CO ratios shown here (red). The range of uncertainties in blue is due to the calculated errors from the CO emissions estimate and the shaded red describes the range of error from uncertainties in the CH4/CO emission factors (image credit: Methane Study Team)
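The CO-scaling approach behind the red curve can be illustrated with a toy calculation: fire CH4 is obtained by multiplying top-down CO fire emissions by a CH4/CO emission ratio, and the two periods are then compared. In the sketch below, the emission ratio and the yearly CO totals are placeholders rather than GFED4s or study values.

# Toy version of the CO-scaling approach: CH4_fire = CO_fire x (CH4/CO emission ratio).
# The ratio and the CO emission totals below are illustrative placeholders, not the
# GFED4s-based values used in the study.
import statistics

ch4_per_co = 0.09  # assumed mass-based CH4/CO emission ratio (placeholder)

co_fire_tg = {     # hypothetical top-down CO fire emissions (Tg CO per year)
    2001: 170, 2002: 168, 2003: 172, 2004: 165, 2005: 169, 2006: 160, 2007: 158,
    2008: 140, 2009: 138, 2010: 142, 2011: 136, 2012: 134, 2013: 130, 2014: 128,
}

ch4_fire_tg = {year: co * ch4_per_co for year, co in co_fire_tg.items()}

early = statistics.mean(v for y, v in ch4_fire_tg.items() if y <= 2007)
late = statistics.mean(v for y, v in ch4_fire_tg.items() if y >= 2008)
print(f"Mean 2001-2007 fire CH4: {early:.1f} Tg/yr")
print(f"Mean 2008-2014 fire CH4: {late:.1f} Tg/yr")
print(f"Decrease between the two periods: {early - late:.1f} Tg/yr")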

 


 

Industrial-age doubling of snow accumulation in the Alaska Range linked to tropical ocean warming

December 19, 2017: Snowfall on a major summit in North America's highest mountain range has more than doubled since the beginning of the Industrial Age, according to a study from Dartmouth College, the University of Maine, and the University of New Hampshire. The research not only finds a dramatic increase in snowfall, it further explains connections in the global climate system by attributing the record accumulation to warmer waters thousands of miles away in the tropical Pacific and Indian Oceans. 73)

The study demonstrates that modern snowfall in the iconic Alaska Range is unprecedented for at least the past 1200 years and far exceeds normal variability. "We were shocked when we first saw how much snowfall has increased," said Erich Osterberg, an assistant professor of Earth sciences at Dartmouth College and principal investigator for the research. "We had to check and double-check our results to make sure of the findings. Dramatic increases in temperature and air pollution in modern times have been well established in science, but now we're also seeing dramatic increases in regional precipitation with climate change."

According to the research, wintertime snowfall has increased 117 percent since the mid-19th century in southcentral Alaska in the United States. Summer snows also showed a significant increase of 49 percent over the same period of less than two hundred years.

The research, appearing in Scientific Reports, is based on analysis of two ice cores (each 208 m long) collected from the Mount Hunter summit plateau (62°56'N, 151°5'W, 3900 m) in Denali National Park, Alaska. A high snow accumulation rate (1.15 m water equivalent [w.e.] average since 1900) and infrequent surface melt (<0.5% of the core is composed of refrozen melt layers and lenses) at the Mt. Hunter drill site preserve robust seasonal oscillations of several chemical parameters (Na, Ca, Mg, NH4+, MSA (methanesulfonic acid), δ18O, liquid conductivity, dust concentration), facilitating annual layer counting back to 800 CE (Common Era, Figure 39). According to the authors, accumulation records in the separate samples taken from just below the summit of the mountain known as "Denali's Child" are in nearly complete agreement. 74)


Figure 39: Annual layer counting in the Mt. Hunter ice core. (A) Three chemical series exhibiting annual layers are shown at a representative depth of core: Mg (black), δ18O (blue) and MSA (red). Each vertical dotted line represents the depth of Jan. 1st in a δ18O trough and just below a Mg peak. The distance between each vertical dotted line represents one year's snow accumulation (before thinning correction). The position of these years was selected three times by three independent researchers. We delineate summer (May-August) and winter (September-April) seasons by recording the late summer-fall peak positions of MSA (purple circles) and the spring peak positions of Mg (orange circles).
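In essence, the layer counting illustrated in Figure 39 locates recurring seasonal extrema in the chemical series measured along depth. The following is a minimal sketch of that idea using synthetic data and an automated peak finder; the actual study relied on three independent analysts and several co-registered species rather than a single automated pass.

# Minimal sketch of annual layer counting: find recurring seasonal peaks in one
# chemical series measured along depth. Synthetic data, not the Mt. Hunter record.
import numpy as np
from scipy.signal import find_peaks

depth = np.linspace(0, 50, 2000)              # depth in meters (synthetic core section)
layer_thickness = 1.15                        # assumed ~1.15 m w.e. of snow per year
mg = 1 + 0.5 * np.cos(2 * np.pi * depth / layer_thickness)     # spring Mg peaks
mg += 0.1 * np.random.default_rng(0).normal(size=depth.size)   # measurement noise

# Each Mg maximum is treated as one spring horizon, i.e. one annual layer.
peaks, _ = find_peaks(mg, distance=20, prominence=0.3)
print(f"Counted {len(peaks)} annual layers in {depth[-1]:.0f} m of core")
print(f"Mean layer thickness: {np.mean(np.diff(depth[peaks])):.2f} m")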

The annually resolved Denali snow accumulation record (Figure 40) indicates that the post-1950 precipitation increase in the Alaskan weather station records began well before the 20th century, circa 1840 CE.

"It is now glaringly clear from our ice core record that modern snowfall rates in Alaska are much higher than natural rates before the Industrial Revolution," said Dominic Winski, a research assistant at Dartmouth and the lead author of the report. "This increase in precipitation is also apparent in weather station data from the past 50 years, but ice cores show the scale of the change well above natural conditions."

Once the researchers established snowfall rates, they set out to identify why precipitation has increased so rapidly in such a short amount of time. Scientific models predict as much as a 2 percent increase in global precipitation per degree of warming because warmer air holds more moisture, but this could not account for most of the dramatic increases in Denali snowfall over the studied period.
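The mismatch can be made explicit with a back-of-the-envelope comparison between the thermodynamic scaling quoted above (roughly 2 percent more precipitation per degree of warming) and the observed change. The regional warming value below is an illustrative assumption, not a figure from the study.

# Back-of-the-envelope comparison: "warmer air holds more moisture" scaling versus
# the observed snowfall increase. The warming value is an illustrative assumption.
scaling_per_deg_c = 0.02       # ~2% more precipitation per degree of warming
assumed_warming_deg_c = 1.5    # assumed regional warming since the mid-19th century

thermodynamic_increase_pct = ((1 + scaling_per_deg_c) ** assumed_warming_deg_c - 1) * 100
observed_winter_increase_pct = 117.0   # from the ice-core record

print(f"Expected from warmer air alone: ~{thermodynamic_increase_pct:.0f}%")
print(f"Observed wintertime increase:    {observed_winter_increase_pct:.0f}%")
# The large gap points to a dynamical driver, i.e. the strengthened Aleutian Low.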


Figure 40: The Mt. Hunter accumulation record. Annual (light gray line) and 21-year smoothed (black line) accumulation time series from the year 810 CE (Common Era) to present, constrained by 21-year smoothed error envelopes (blue shading) inclusive of stochastic, peak position and layer-thinning model uncertainties, including the total uncertainty range among all four modeling approaches. The inset shows seasonal trends in accumulation since 1867 with 21-year running means (bold lines). Snowfall accumulating between September and April (blue) has more than doubled, with a faster rise since 1976. Summer accumulation (April to August; red) remained comparatively stable except for a baseline shift between 1909 and 1925 (image credit: Dartmouth College, Dominic Winski)

The research suggests that warming tropical oceans have caused a strengthening of the Aleutian Low pressure system with its northward flow of warm, moist air, driving most of the snowfall increases. Previous research has linked the warming tropical ocean temperatures to higher greenhouse gas concentrations.

The analysis includes a series of dramatic graphs that demonstrate extreme shifts in precipitation and reinforce the global climate connections that link snowfall in the high reaches of the North American continent with warm tropical waters. As noted in the paper (Ref. 74), this same atmospheric connection accounts for a decrease in Hawaiian precipitation.

"Everywhere we look in the North Pacific, we're seeing this same fingerprint from warming tropical oceans. One result is that wintertime climate in the North Pacific is very different than it was 200 years ago. This doesn't just affect Alaska, but Hawaii and the entire Pacific Northwest are impacted as well," said Winski.

The research builds on a recent study using the same ice cores that showed that an intensification of winter storm activity in Alaska and Northwestern Canada, driven by the strengthening Aleutian Low, started in 1740 and is unprecedented in magnitude and duration over the past millennium. The new record shows the result of that increase in Aleutian Low storm activity on snow accumulation.

For this analysis, researchers were able to segment the ice core records by seasons and years using markers like magnesium from spring dust to separate winter snow from summer snow. To account for snow layers getting squeezed and thinned under their own weight, the researchers applied four separate equations used in other studies, and in all cases the corrected record shows at least a doubling of snowfall.
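The study's four thinning equations are not reproduced in this article. As an illustration of what such a correction does, the sketch below uses only the simplest common choice, a Nye-type approximation in which an annual layer at depth z retains a fraction (H - z)/H of its original thickness; the depth and thickness values are illustrative, not the Mt. Hunter site parameters.

# Minimal sketch of a layer-thinning correction using the classic Nye approximation.
# The study applied four separate thinning models; this is not one of its equations,
# only an illustration of the idea.
def nye_corrected_accumulation(measured_thickness_m, depth_m, ice_thickness_m):
    """Return the original (pre-thinning) annual accumulation for one layer."""
    remaining_fraction = (ice_thickness_m - depth_m) / ice_thickness_m
    return measured_thickness_m / remaining_fraction

# Example: a 0.6 m thick layer found at 150 m depth in a 270 m thick glacier
# (hypothetical numbers) corresponds to roughly 1.35 m of original accumulation.
print(f"{nye_corrected_accumulation(0.6, 150.0, 270.0):.2f} m of original accumulation")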

According to the paper, while numerous snow accumulation records exist, "to our knowledge, no other alpine ice core accumulation record has been developed with such a thorough characterization of the thinning regime or uncertainties; all of the thinning models produce a robust increase in accumulation since the mid-19th century above late-Holocene background values."

The researchers note that the findings imply that regions that are sensitive to warming tropical ocean waters may continue to experience rain and snowfall variability well outside the natural range of the past millennium.

"Climate change can impact specific regions in much more extreme ways than global averages indicate because of unexpected responses from features like the Aleutian Low," said Osterberg. "The Mount Hunter record captures the dramatic changes that can occur when you get a double whammy from climate change – warming air combined with more storms from warming ocean temperatures."

However, the researchers also note that the regional findings do not necessarily mean that the same level of snowfall increases will occur elsewhere throughout the mid- and high latitudes.

"Scientists keep discovering that on a regional basis, climate change is full of surprises. We need to understand these changes better to help communities prepare for what will come with even more carbon dioxide pollution in the air," said Osterberg.

As part of the analysis, the authors suggest that current climate models underestimate the sensitivity of North Pacific atmospheric connections to warming tropical ocean temperatures. They argue that refining the way the modeled atmosphere responds to tropical ocean temperatures may improve rain and snowfall predictions in a warming world.

This research was supported by the NSF (National Science Foundation) Paleoclimate Program (P2C2).

 


 

Arctic sea ice loss could dry out California

December 2017: Arctic sea ice loss of the magnitude expected in the next few decades could impact California's rainfall and exacerbate future droughts, according to new research led by LLNL (Lawrence Livermore National Laboratory) scientists. 75)


Figure 41: Extent of Arctic sea ice in September 2016 versus the 1981-2010 average minimum extent (gold line). Through satellite images, researchers have observed a steep decline in the average extent of Arctic sea ice for every month of the year (image credit: NASA)

The dramatic loss of Arctic sea ice cover observed over the satellite era is expected to continue throughout the 21st century. Over the next few decades, the Arctic Ocean is projected to become ice-free during the summer. A new study by Ivana Cvijanovic and colleagues from LLNL and the University of California, Berkeley, shows that substantial loss of Arctic sea ice could have significant far-field effects, and is likely to impact the amount of precipitation California receives. The research appears in the Dec. 5 edition of Nature Communications. 76)

The study identifies a new link between Arctic sea ice loss and the development of an atmospheric ridging system in the North Pacific. This atmospheric feature also played a central role in the 2012-2016 California drought and is known for steering precipitation-rich storms northward, into Alaska and Canada, and away from California. The team found that sea ice changes can lead to convection changes over the tropical Pacific. These convection changes can in turn drive the formation of an atmospheric ridge in the North Pacific, resulting in significant drying over California.

"On average, when considering the 20-year mean, we find a 10-15 percent decrease in California's rainfall. However, some individual years could become much drier, and others wetter," Cvijanovic said.

The study does not attribute the 2012-2016 drought to Arctic sea ice loss. However, the simulations indicate that the sea-ice driven precipitation changes resemble the global rainfall patterns observed during that drought, leaving the possibility that Arctic sea-ice loss could have played a role in the recent drought.

"The recent California drought appears to be a good illustration of what the sea-ice driven precipitation decline could look like," she explained.

California's winter precipitation has decreased over the last two decades, with the 2012-2016 drought being one of the most severe on record. The impacts of reduced rainfall have been intensified by high temperatures that have enhanced evaporation. Several studies suggest that recent Californian droughts have a manmade component arising from increased temperatures, with the likelihood of such warming-enhanced droughts expected to increase in the future.


Figure 42: Schematics of the teleconnection through which Arctic sea-ice changes drive precipitation decrease over California. Arctic sea-ice loss induced high-latitude changes first propagate into the tropics, triggering tropical circulation and convection responses. Decreased convection and decreased upper level divergence in the tropical Pacific then drive a northward propagating Rossby wavetrain, with anticyclonic flow forming in the North Pacific. This ridge is responsible for steering the wet tropical air masses away from California (image credit: LLNL, Kathy Seibert)

"Our study identifies one more pathway by which human activities could affect the occurrence of future droughts over California — through human-induced Arctic sea ice decline," Cvijanovic said. "While more research should be done, we should be aware that an increasing number of studies, including this one, suggest that the loss of the Arctic sea ice cover is not only a problem for remote Arctic communities, but could affect millions of people worldwide. Arctic sea ice loss could affect us, right here in California."

Other co-authors on the study include Benjamin Santer, Celine Bonfils, Donald Lucas and Susan Zimmerman from LLNL and John Chiang from the University of California, Berkeley.

The research is funded by the DOE (Department of Energy) Office of Science. Cvijanovic and Bonfils were funded by the DOE Early Career Research Program Award, and Lucas is funded by the DOE Office of Science through the SciDAC project on Multiscale Methods for Accurate, Efficient and Scale-Aware Models of the Earth System.

 


 

Increasing Wildfires in the boreal forests of northern Canada and Alaska due to Lightning

December 2017: Wildfires in the boreal forests of northern Canada and Alaska have been increasing in frequency and the amount of area burned, and the drivers of large fire years are still poorly understood. But recent NASA-funded research offers at least one possible cause: more lightning. As global warming continues, lightning storms and warmer conditions are expected to spread farther north, meaning fire could significantly alter the landscape over time. 77)

A record number of lightning-ignited fires burned in Canada's Northwest Territories in 2014 and in Alaska in 2015. Scientist Sander Veraverbeke (Vrije Universiteit Amsterdam and University of California, Irvine) and colleagues examined data from satellites and from ground-based lightning networks to see if they could figure out why those seasons were so bad.

The team found that the majority of fires in their study areas in 2014 and 2015 were ignited by lightning storms, as opposed to human activity. That is natural, given the remoteness of the region, but it also points to more frequent lightning strikes in an area not known for as many thunderstorms as the tropics or temperate regions. Looking at longer trends, the researchers found that lightning-ignited fires in the region have been increasing by 2 to 5 percent per year since 1975, a trend that is consistent with climate change. The study was published in July 2017 in the journal Nature Climate Change. 78)
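A 2 to 5 percent per year increase is the kind of rate obtained by fitting a log-linear (exponential) trend to annual ignition counts. The sketch below illustrates that fitting step on synthetic data; it does not use the Alaska Fire Emissions Database or the lightning-network records from the study.

# Minimal sketch of estimating an annual growth rate in lightning-ignited fires by
# fitting a log-linear trend. The ignition counts are synthetic, not study data.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1975, 2016)
assumed_rate = 0.03                                      # assume 3%/yr underlying growth
ignitions = 100 * (1 + assumed_rate) ** (years - 1975)   # idealized exponential trend
ignitions *= rng.lognormal(sigma=0.2, size=years.size)   # year-to-year variability

# Ordinary least squares on log(ignitions) recovers the exponential growth rate.
slope, _ = np.polyfit(years - 1975, np.log(ignitions), 1)
print(f"Estimated growth rate: {100 * (np.exp(slope) - 1):.1f}% per year")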

"We found that it is not just a matter of more burning with higher temperatures. The reality is more complex," Veraverbeke said. "Higher temperatures also spur more thunderstorms. Lightning from these thunderstorms is what has been igniting many more fires in these recent extreme events."

The map of Figure 43 shows the location and ignition source (lightning or human caused) for forest fires in interior Alaska in 2015. The map of Figure 44 shows the timing of the fires (June, July, or August) within the inset box. Both maps are based on data from the Veraverbeke study, which combined observations from the Alaska Fire Emissions Database, computer models, and fire observations from the MODIS (Moderate Resolution Imaging Spectroradiometer) instruments on NASA's Terra and Aqua satellites.

The fire season in the far north has typically peaked in July, after the spring thaw and the melting of winter snow. As global temperatures continue to rise, especially in the polar regions, thawing and warming tend to happen earlier in the spring and summer and at a more extensive level than in the past. The warmer weather also leads to more atmospheric instability, bringing more thunderstorms. The researchers asserted in the paper that "extreme fire years result when high levels of lightning ignition early in the growing season are followed by persistent warm and dry conditions that accelerate fire spread later in midsummer."


Figure 43: Location and ignition source (lightning or human caused) for forest fires in interior Alaska in 2015, acquired with MODIS on Terra and Aqua and in situ measurements (image credit: NASA Earth Observatory, maps and charts by Jesse Allen using data provided by Sander Veraverbeke (Vrije Universiteit). Story by Mike Carlowicz (NASA Earth Observatory), Alan Buis (Jet Propulsion Laboratory), and Brian Bell (University of California, Irvine))


Figure 44: The timing of the fires (June, July, or August) within the inset box, using data from the Veraverbeke study (image credit: NASA Earth Observatory and Lightning Study)

Brendan Rogers of the Woods Hole Research Center said these trends are likely to continue. "We expect an increasing number of thunderstorms, and hence fires, across the high latitudes in the coming decades as a result of climate change."

The researchers also found that wildfires are creeping farther north, closer to the transition zone between boreal forests and Arctic tundra. Together, these areas include at least 30 percent of the world's tree cover and 35 percent of its stored soil carbon.


Figure 45: Ignition density in the Northwest Territories (x 10⁻⁵ ignitions/km²) acquired in the timeframe 1975-2015 (image credit: NASA Earth Observatory and Lightning Study)


Figure 46: Ignition density in Alaska (x 10⁻⁵ ignitions/km²) acquired in the timeframe 1975-2015 (image credit: NASA Earth Observatory and Lightning Study)

"In these high-latitude ecosystems, permafrost soils store large amounts of carbon that become vulnerable after fires pass through," said James Randerson of UC Irvine. "Exposed mineral soils after tundra fires also provide favorable seedbeds for trees migrating north under a warmer climate."

"Taken together, we discovered a complex feedback loop between climate, lightning, fires, carbon and forests that may quickly alter northern landscapes," Veraverbeke said. "A better understanding of these relationships is critical to better predict future influences from climate on fires and from fires on climate."

Study co-author Charles Miller of NASA/JPL (Jet Propulsion Laboratory) added that while data from the lightning networks were critical to this study, it is challenging to use these data for trend detection because of continuing network upgrades. "A spaceborne sensor that provides high northern latitude lightning data would be a major step forward."

 


 

Global Carbon Budget 2017

November 2017: Following three years of no growth, global GHG (Greenhouse Gas) emissions from human activities are projected to increase by 2% by the end of 2017, according to the nongovernmental organization GCP (Global Carbon Project). The increase, to a record 37 billion tons of carbon dioxide equivalent, dashed hopes in the environmental community that CO2 emissions from human activity might have plateaued and begun turning downward. 79)

In a set of three reports published 13 November, GCP said the biggest cause of the increase is the 3.5% growth in China, the world's largest emitter of greenhouse gases. The country experienced higher energy demand, particularly from industry, and a decline in hydroelectric power due to sparse rainfall. 80) 81) 82)

In addition, the decade-long trend in emissions reductions by the US and the European Union, the second- and third-largest emitters, respectively, appears to have slowed this year. The EU's output hasn't declined appreciably since 2015. The US output declined by 0.4%, compared with a 1.2% average annual reduction during the previous 10 years. Coal consumption in the US inched up 0.5%, its first increase in five years.

India, the fourth-largest greenhouse gas emitter, limited its growth to 2% this year, compared with a 6% jump in 2016. Emissions from all other countries increased 2.3% from 2016, to 15.1 gigatons (Figure 47).


Figure 47: The world's four largest carbon dioxide emitters—China, the US, the European Union, and India—account for about 60% of global emissions. Although those countries have made strides recently, their emissions and those globally (expected year-to-year percent change and error bars shown under each country) will probably tick upward in 2017 (image credit: Global Carbon Project, CC BY 4.0)
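The projected uptick follows from applying each region's expected growth rate to its prior-year emissions. The sketch below uses the growth rates quoted in this section together with rough, illustrative 2016 baselines, not the Global Carbon Project's exact figures.

# Arithmetic behind the projected 2017 uptick: apply each growth rate to prior-year
# emissions. The 2016 baselines are rough illustrative values in Gt CO2, not the
# Global Carbon Project's exact numbers.
growth = {"China": 0.035, "USA": -0.004, "EU": 0.0, "India": 0.02, "Rest of world": 0.023}
baseline_2016 = {"China": 10.0, "USA": 5.3, "EU": 3.5, "India": 2.5, "Rest of world": 14.8}

projected_2017 = {region: baseline_2016[region] * (1 + growth[region]) for region in growth}
total_2016 = sum(baseline_2016.values())
total_2017 = sum(projected_2017.values())
print(f"Projected global change: {100 * (total_2017 / total_2016 - 1):+.1f}% "
      f"({total_2016:.1f} -> {total_2017:.1f} Gt CO2)")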

Despite the 2014–16 hiatus in global emissions growth, CO2 has continued to accumulate in the atmosphere at a faster pace than at any time during the 50 years that measurements have been kept. The elevated global temperatures resulting from the 2015–16 El Niño diminished the capacity of terrestrial ecosystems to take up CO2 from the atmosphere, the GCP reports said.

Corinne Le Quéré of the University of East Anglia (Norwich, UK), lead author of the principal report (Ref. 80) that was published in Earth System Science Data, said in an email that she expects emissions to plateau or grow slightly in the coming years. But they are unlikely to return to the 3% growth levels that were seen regularly in the decade that ended in 2010.

Kelly Levin of the nonprofit WRI (World Resources Institute) cautions against reading too much into a single year's data but also warns about the perilous big picture. "To have a chance of transforming the economy in time to stay below 2 °C, global GHG emissions must peak by 2020," she says. WRI's analysis, and another by the UNEP (United Nations Environment Program), predict on the basis of current trends and treaty commitments that the peak in global emissions won't occur until after 2030. At that point, the probability of limiting global warming to 2 °C could be as low as 50%, even with accelerated national reduction commitments, rapid abandonment of fossil fuel use, and deployment of carbon-removal technologies whose feasibility hasn't yet been demonstrated.

The 2 °C mark is thought by most climate scientists to be the threshold below which the worst impacts of climate change can be avoided. The 2015 Paris climate agreement set an "aspirational" goal of limiting temperature increase to 1.5 °C.

The WRI analysis says the number of countries whose emissions have peaked or are committed to peak will increase from 49 in 2010 to 53 by 2020 and to 57 by 2030. Those countries accounted for 36% of world greenhouse gas emissions in 2010 and will represent 60% of the total in 2030, when China has committed to peak its output.

Despite last year's emissions increase, China's coal consumption this year is still about 8% below its record 2013 high. The Chinese government has projected a near-doubling of the nation's solar energy production over the next two years, to 213 GW. China's nonfossil energy sources make up 14.3% of overall energy production, up by one percentage point in less than a year.

 


 

Study of Global Light Pollution at Night

November 22, 2017: They were supposed to bring about an energy revolution, but the popularity of LED (Light-Emitting Diode) lights is driving an increase in light pollution worldwide, with dire consequences for human and animal health, researchers said in their study. Five years of advanced satellite images show that there is more artificial light at night across the globe, and that light at night is getting brighter. The rate of growth is approximately two percent each year in both the amount of area lit and the radiance of the light. 83) 84) 85)

An international team of scientists reported the results of a landmark study of global light pollution and the rise of LED outdoor lighting technology. The study finds both light pollution and energy consumption by lighting steadily increasing over much of the planet. The findings also challenge the assumption that increases in the energy efficiency of outdoor lighting technologies necessarily lead to an overall decrease in global energy consumption.

The team, led by Christopher Kyba of the GFZ (German Research Center for Geosciences) in Potsdam, Germany, analyzed five years of images from the Suomi NPP (Suomi National Polar-orbiting Partnership) satellite, operated jointly by NASA and NOAA (National Oceanic and Atmospheric Administration). The data show gains of 2% per year in both the amount of the Earth's surface that is artificially lit at night and the quantity of light emitted by outdoor lighting. Increases were seen almost everywhere the team looked, with some of the largest gains in regions that were previously unlit.

"Light is growing most rapidly in places that didn't have a lot of light to start with," Kyba noted. "That means that the fastest rates of increase are occurring in places that so far hadn't been very strongly affected by light pollution."

The results reported today confirm suggestions in earlier research based on data obtained with U.S. Department of Defense meteorological satellite measurements (DMSP series) going back to the 1970s. However, the better sensitivity of Suomi's cameras to light on the night side of Earth and significantly improved ground resolution led to more robust conclusions about the changing illumination of the world at night.

The study is among the first to examine the effects, as seen from space, of the ongoing worldwide transition to LED lighting. Kyba's team found that the energy saving effects of LED lighting on country-level energy budgets are lower than expected from the increase in the efficiency of LEDs compared to older lamps.


Figure 48: Infographic showing the number of countries experiencing various rates of change of night lights during 2012-2016 (image credit: Kyba and the Study Team)

Environmental Gains Unrealized: LED lighting requires significantly less electricity to yield the same quantity of light as older lighting technologies. Proponents of LED lighting have argued that the high energy efficiency of LEDs would contribute to slowing overall global energy demand, given that outdoor lighting accounts for a significant fraction of the nighttime energy budget of the typical world city.

The team tested this idea by comparing changes in nighttime lighting seen from Earth orbit to changes in countries' GDP (Gross Domestic Product) – a measure of their overall economic output – during the same time period. They concluded that financial savings from the improved energy efficiency of outdoor lighting appear to be invested into the deployment of more lights. As a consequence, the expected large reductions in global energy consumption for outdoor lighting have not been realized.

Kyba expects that the upward global trend in use of outdoor lighting will continue, bringing a host of negative environmental consequences. "There is a potential for the solid-state lighting revolution to save energy and reduce light pollution," he added, "but only if we don't spend the savings on new light".

IDA (International Dark-Sky Association) has campaigned for the last 30 years to bring attention to the known and suspected hazards associated with the use of artificial light at night. IDA Executive Director J. Scott Feierabend pointed out repercussions including harm to wildlife, threats to human wellbeing, and potentially compromised public safety. IDA drew public attention to concerns associated with the strong blue light emissions of LED lighting as early as 2010.

"Today's announcement validates the message IDA has communicated for years," Feierabend explained. "We hope that the results further sound the alarm about the many unintended consequences of the unchecked use of artificial light at night."

Satellite imagery: The VIIRS (Visible Infrared Imaging Radiometer Suite) DNB (Day-Night Band) of the Suomi NPP mission started observations in 2012, just as outdoor use of LED lighting began in earnest. This sensor provides the first-ever global calibrated nighttime radiance measurements in a spectral band of 500 to 900 nm, which is close to the visible band, with a much higher radiometric sensitivity than the DMSP series, and at a spatial resolution of ~750 m. This improved spatial resolution allows neighborhood (rather than city or national) scale changes in lighting to be investigated for the first time.

The cloud-free DNB data show that over the period of 2012–2016, both lit area and the radiance of previously lit areas increased in most countries (Figure 49) in the 500–900 nm range, with global increases of 2.2% per year for lit area and 2.2% per year for the brightness of continuously lit areas. Overall, the radiance of areas lit above 5 nW cm⁻² sr⁻¹ increased by 1.8% per year. These factors decreased in very few countries, including several experiencing warfare. They were also stable in only a few countries, interestingly including some of the world's brightest (for example, Italy, Netherlands, Spain, and the United States). With few exceptions, growth in lighting occurred throughout South America, Africa, and Asia. Because the analysis of lit area and total radiance is not subject to a stability criterion, transient lights such as wildfires can cause large fluctuations.

Australia experienced a major decrease in lit area from 2012 to 2016 for this reason (Figures 49A and 50). However, fire-lit areas failed the stability test and were therefore not included in the radiance change analysis (Figure 49B). A small number of countries have "no data" because of either their extreme latitude (Iceland) or the lack of observed stable lights above 5 nW cm⁻² sr⁻¹ in the cloud-free composite (for example, Central African Republic).


Figure 49: Geographic patterns in changes in artificial lighting. Changes are shown as an annual rate for both lit area (A) and radiance of stably lit areas (B). Annual rates are calculated from the change over the four-year period, that is, (A2016/A2012)^(1/4), where A2016 is the lit area observed in 2016 (image credit: Study Team)
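The annualized rate in the caption is simply the fourth root of the four-year ratio. A minimal sketch follows; the lit-area values are placeholders.

# Annualized growth rate from a four-year change, as used for Figure 49:
# rate = (A_2016 / A_2012) ** (1/4) - 1. The lit-area values are placeholders.
def annual_rate(area_2012, area_2016, n_years=4):
    return (area_2016 / area_2012) ** (1 / n_years) - 1

lit_area_2012_km2 = 100_000   # hypothetical lit area of a country in 2012
lit_area_2016_km2 = 109_100   # hypothetical lit area of the same country in 2016

rate = annual_rate(lit_area_2012_km2, lit_area_2016_km2)
print(f"Annual growth in lit area: {100 * rate:.1f}% per year")   # ~2.2%/yr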

 


Figure 50: Absolute change in lit area from 2012 to 2016. Pixels increasing in area are shown as red, pixels decreasing in area are shown as blue, and pixels with no change in area are shown as yellow. Each pixel has a near-equal area of 6000 ± 35 km². To ease interpretation, the color scale cuts off at 200 km², but some pixels had changes of up to ±2000 km² (image credit: Study Team)

Comparisons of the VIIRS data with photographs taken from aboard the ISS (International Space Station) show that the instrument on Suomi-NPP sometimes records a dimming of some cities even though these cities are in fact just as bright or even more brightly lit. The reason is that the sensor can't "see" light at wavelengths below 500 nm, i.e. blue light. When cities replace orange lamps with white LED lights that emit considerable radiation below 500 nm, VIIRS mistakes the change for a decrease. In short: The Earth's night-time surface brightness and especially the skyglow over cities is increasing, probably even in the cases where the satellite detects less radiation. 86)
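This apparent dimming can be reproduced with a simple two-band bookkeeping exercise: if part of a city's light output shifts from wavelengths the DNB can see (500-900 nm) into blue wavelengths it cannot, the detected radiance falls even while the total emitted light rises. The band fractions below are illustrative assumptions, not measured lamp spectra.

# Two-band bookkeeping of the DNB "blind spot". The fractions of lamp output that
# fall inside the DNB band (500-900 nm) are illustrative assumptions, not measured
# spectra for sodium or LED lamps.
def dnb_detected(total_light, fraction_in_dnb_band):
    return total_light * fraction_in_dnb_band

# Before: orange high-pressure sodium lighting, nearly all output above 500 nm.
before = dnb_detected(total_light=100.0, fraction_in_dnb_band=0.95)
# After: the city switches to white LEDs and adds 10% more light overall, but a
# larger share of the output is blue light below 500 nm, invisible to the DNB.
after = dnb_detected(total_light=110.0, fraction_in_dnb_band=0.75)

print(f"DNB-band radiance: {before:.0f} -> {after:.0f} (apparent dimming)")
print("Total emitted light: 100 -> 110 (actual increase)")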

There is, however, hope that things will change for the better. Christopher Kyba says: "Other studies and the experience of cities like Tucson, Arizona, show that well designed LED lamps allow a two-third or more decrease of light emission without any noticeable effect for human perception." Kyba's earlier work has shown that the light emission per capita in the United States of America is 3 to 5 times higher than that in Germany. Kyba sees this as a sign that prosperity, safety, and security can be achieved with conservative light use. "There is a potential for the solid state lighting revolution to save energy and reduce light pollution," adds Kyba, "but only if we don't spend the savings on new light."