Aid workers are getting vital details about the trail of destruction left by Typhoon Haiyan, including damage to individual streets and buildings, from online maps developed with contributions from Red Cross staff and volunteers across the world.
A British Red Cross team is using the latest satellite photos and reports from a range of sources to help update maps that could save lives in the aftermath of the disaster.
The work, in partnership with the American Red Cross, is part of the Humanitarian OpenStreetMap Team project – an interactive map that can be viewed and edited by anyone with an internet connection.
The British Red Cross mapping team has 17 members and has been active since 2010, but in the wake of the disaster is focusing its efforts on a single task for the first time.
How the mapping process works
Using satellite images taken after the typhoon, the team is updating maps of the worst affected areas with information about damage to buildings. By looking at recent pictures to find out whether structures are partly or totally destroyed, they can give aid workers information about blocked roads or the scale of destruction in a particular neighbourhood.
Other details mapped include the number of people reported missing in different areas and the location of Red Cross aid workers. Sources include figures from the Red Cross and other organisations like the UN. The maps are ‘layered’ so users can choose to see the information that’s relevant to them.
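The "layered" approach can be sketched in a few lines: each layer is an independent GeoJSON FeatureCollection, and a viewer (such as one built on OpenLayers) can show or hide each layer without touching the others. The layer names, coordinates and properties below are illustrative only, not the team's actual schema.

```python
import json

# Each map "layer" is its own GeoJSON FeatureCollection; a web viewer
# toggles layers independently, so users see only what's relevant to them.
def make_layer(name, features):
    return {"type": "FeatureCollection", "name": name, "features": features}

damage_layer = make_layer("building-damage", [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [125.004, 11.243]},
     "properties": {"status": "destroyed"}},  # hypothetical report
])

aid_layer = make_layer("aid-workers", [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [124.975, 11.230]},
     "properties": {"team": "assessment"}},  # hypothetical report
])

# A user chooses which layers to display.
visible = {"building-damage"}
shown = [layer for layer in (damage_layer, aid_layer) if layer["name"] in visible]
print(json.dumps([l["name"] for l in shown]))
```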
Updates are made every hour of every day, so the maps are a constantly evolving source of data. In a fast-changing situation like the aftermath of a typhoon, up-to-date information can make a huge difference to the work of those on the ground.
Andrew Braye, who leads the British Red Cross’ involvement in the project, said the organisation is using this kind of mapping more and more. It’s made possible by new technology and the growth of online collaboration tools such as OpenStreetMap and Google Docs. These let individuals and organisations work together to create new online tools that can be used by everyone.
The team has been using its mapping expertise to support the British Red Cross for three years – helping everyone from aid workers planning trips abroad to volunteers dealing with emergencies in the UK.
As well as Red Cross staff, its members include digital volunteers recruited through adverts on the internet and events linked to digital mapping. The team usually works on a wide range of projects – but for the first time has come together to focus on a single task.
Volunteer Johnny Henshall has recently completed a Masters in geographical information systems. He said: “It’s a chance for me to use the skills I’ve just got. This is what I want to be doing. It’s amazing to be able to help – there aren’t many opportunities to do this for a humanitarian organisation.”
Andrew says the team’s “revolving door” of volunteers brings in people with highly specialised skills who would normally cost a huge amount to employ. But many are willing to give their time for a few weeks or months between paid contracts.
Could you join the team?
If you have some spare time and know how to use PostgreSQL, PostGIS, GeoServer and OpenLayers, the team would like to hear from you. Email them for more information.
Scientists use climate models to predict how Earth’s climate will change. Climate models are computer programs with mathematical equations. They are programmed to simulate past climate as accurately as possible. This gives scientists some confidence in a climate model’s ability to predict the future.
Climate models predict that Earth’s average temperature will keep rising over the next 100 years or so. There may be a year or years where Earth’s average temperature is steady or even falls. But the overall trend is expected to be up.
Earth’s average temperature is expected to rise even if the amount of greenhouse gases in the atmosphere decreases. But the rise would be less than if greenhouse gas amounts remain the same or increase.
Take a look at this amazing visualization that explains how we’ve treated the Earth over the past 100 years and what the next 100 might hold.
Led by Dr. Matthew Hansen at the University of Maryland, researchers have built the first detailed maps of the world's forests from 2000 to 2012, documenting and quantifying forest landscape changes such as fires, tornadoes, disease and logging.
The most significant findings were that the overall rate of tropical deforestation is increasing, and global forests have experienced a net loss of 1.5M sq km during 2000-2012 due to both natural (disturbance) and human causes. That’s a loss of forested land comparable in size to the entire state of Alaska.
A little more than 300,000 square miles of forest was established or replanted worldwide between 2000 and 2012. Unfortunately, almost 900,000 square miles was destroyed during the same time period — logged, ravaged by fire, or attacked by insects.
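Those two figures are consistent with the 1.5 million square kilometre net loss quoted above, as a quick back-of-the-envelope conversion shows (using 1 square mile ≈ 2.59 square kilometres):

```python
SQ_MI_TO_SQ_KM = 2.58999

gained_sq_mi = 300_000   # forest established or replanted, 2000-2012
lost_sq_mi = 900_000     # forest destroyed over the same period

net_loss_sq_km = (lost_sq_mi - gained_sq_mi) * SQ_MI_TO_SQ_KM
print(f"Net loss: {net_loss_sq_km / 1e6:.2f} million sq km")
```

The result, about 1.55 million square kilometres, matches the study's reported net loss to within rounding.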
Those are the main conclusions of a study that examined hundreds of thousands of images snapped by the U.S. government’s Landsat satellites. Academic researchers partnered with Google staff to produce stunning maps displaying the world’s forests and areas that have been deforested or reforested since 2000. Those maps were used to produce the following short videos:
About a third of the deforestation occurred in the tropics, and half of that was in South America. Logging and clearing of land for farming were responsible for much of the loss. Hearteningly, the researchers found that deforestation has been slowing down in Brazil, where worldwide concerns about the loss of the Amazon have helped spur domestic efforts to save the rainforest. But that slowdown was offset by increasing losses in other countries.
“Although Brazilian gross forest loss is the second highest globally, other countries, including Malaysia, Cambodia, Cote d’Ivoire, Tanzania, Argentina, and Paraguay, experienced a greater percentage of loss of forest cover,” the scientists wrote in the paper, published Thursday in Science. “Given consensus on the value of natural forests to the Earth system, Brazil’s policy intervention is an example of how awareness of forest valuation can reverse decades of previous wide-spread deforestation.”
The tropics lost more forest cover during the study period than any other region. The second-worst hit were the boreal forests of spruce, fir, and larch in and around the Arctic, with fire the leading cause. Previous research has shown that these forests are burning at a rate not seen in at least 10,000 years, with climate change increasing temperatures and drying out the landscape.
That wasn’t the only worrisome climate-related finding in the new paper. The mountains of the American West are losing forests due not only to logging, but also because of fire and disease — with mountain pine bark beetles marching up mountains as temperatures warm, feasting on banquets of ill-prepared pines.
The loss of forests is making it even more difficult for the Earth to suck back up all the carbon dioxide that we’re pumping into its atmosphere.
- The world is still losing its forests, and these beautiful satellite maps tally the toll by John Upton
- Global Forest Change, University of Maryland
- High-Resolution Global Maps of 21st-Century Forest Cover Change, Science
Compare pre- and post-event imagery from Astrium to explore damage caused by Typhoon Haiyan/Yolanda.
Recently NASA reported that this year’s maximum wintertime extent of Antarctic sea ice was the largest on record, even greater than the previous year’s record.
This is understandably at odds with the public’s perception of how polar ice should respond to a warming climate, given the dramatic headlines of severe decline in Arctic summertime extent. But the “paradox of Antarctic sea ice” has been on climate scientists’ minds for some time.
Continental v. sea ice
First off, sea ice is different to the “continental ice” associated with polar ice caps, glaciers, ice shelves and icebergs. Continental ice is formed by the gradual deposition, build up and compaction of snow, resulting in ice that is hundreds to thousands of metres thick, storing and releasing freshwater that influences global sea-level over thousands of years.
Sea ice, though equally important to the climate system, is completely different. It is the thin layer (typically 1-2m) of ice that forms on the surface of the ocean when the latter is sufficiently cooled by the atmosphere.
From there sea ice can move with the winds and currents, continuing to grow both by freezing and through collisions (between the floes that make up the ice cover). When the atmosphere, and/or ocean is suitably warm again, such as in spring or if the sea ice has moved sufficiently towards the equator, then the sea ice melts again.
Antarctic v. Arctic
Secondly, we need to understand that the Arctic and Antarctic climate systems are very different, particularly when it comes to sea ice.
In the Arctic, sea ice forms in an ocean roughly centred on the North Pole that is surrounded by continents. A relatively large (though diminishing) proportion of the ice persists over multiple years before ultimately departing for warmer latitudes through exit points such as Fram Strait between Greenland and Svalbard.
In the south, on the other hand, sea ice forms outwards from the continental Antarctic Ice Sheet, where it is exposed to and strongly influenced by the winds and waters of the Southern Ocean. Here, there is a much stronger seasonal ebb and flow to sea ice coverage as over 80% of the sea ice area grows each autumn-winter and decays each spring-summer. This annual expansion-contraction from about 4 to 19 million square kms is one of the greatest seasonal changes on the Earth’s surface.
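The scale of that seasonal cycle is easy to verify from the numbers quoted; the fraction comes out just under the article's "over 80%" once you allow for rounding of the 4 and 19 million endpoints.

```python
summer_min = 4.0   # million sq km, Antarctic sea ice area at summer minimum
winter_max = 19.0  # million sq km, at winter maximum

seasonal_ice = winter_max - summer_min
fraction_seasonal = seasonal_ice / winter_max
print(f"{seasonal_ice:.0f} million sq km grows and decays each year "
      f"({fraction_seasonal:.0%} of the winter cover)")
```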
Area v. volume
Finally, we need to remember that “extent” or “areal coverage” is only one of the ways in which we monitor and study sea ice.
Sea ice turns out to be a very complex and variable medium that is very difficult to observe over large scales. It is also constantly moving and restructuring. Until we achieve the “holy grail” of monitoring total sea ice volume from space and how it changes over time (and there are great steps towards this with the European Space Agency’s environmental research satellite CryoSat-II), we are limited to interpreting its global behaviour through area.
What happened this winter?
This winter, the maximum total Antarctic sea ice extent was reported to be 19.47 million square kilometres, which is 3.6% above the winter average calculated from 1981 to 2010. This continues a trend that is weakly positive and remains in stark contrast to the decline in Arctic summer sea ice extent (2013 was 18% below the mean from 1981-2010).
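From the two figures quoted, the 1981-2010 winter average itself can be recovered with simple arithmetic:

```python
max_extent = 19.47        # million sq km, reported Antarctic winter maximum
pct_above_mean = 3.6      # percent above the 1981-2010 winter average

baseline = max_extent / (1 + pct_above_mean / 100)
anomaly = max_extent - baseline
print(f"Implied 1981-2010 winter mean: {baseline:.2f} million sq km "
      f"(anomaly of about {anomaly:.2f} million sq km)")
```

That puts the long-term winter mean at roughly 18.8 million square kilometres, so the record maximum exceeds it by well under a million square kilometres, underscoring how mild the "net" expansion is.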
To further complicate this picture, we find this net increase actually masks strong declines in particular regions around Antarctica, such as in the Bellingshausen Sea, which are on par or greater than those in the Arctic.
So while there is much greater attention given to the Arctic decline and the prediction of “ice-free summers” at the North Pole this century, Antarctic climate scientists still have their work cut out to understand the regional declines amidst the mild “net” expansion occurring in the southern hemisphere.
Here are some of the leading hypotheses currently being explored through a combination of satellite remote sensing, fieldwork in Antarctica and numerical model simulations – to help explain the increasing trend in overall Antarctic sea ice coverage:
- Increased westerly winds around the Southern Ocean, linked to changes in the large-scale atmospheric circulation related to ozone depletion, will drive greater northward movement, and hence greater extent, of Antarctic sea ice.
- Increased precipitation, in the form of either rain or snow, will increase the density stratification between the upper and middle layers of the Southern Ocean. This might reduce the oceanic heat transfer from the relatively warm waters below the surface layer, thereby enhancing conditions at the surface for sea ice formation.
- Similarly, a freshening of the surface layers from this precipitation would also raise the local freezing point, making it easier for sea ice to form.
- Another potential source of cooling and freshening in the upper ocean around Antarctica is increased melting of Antarctic continental ice, through ocean/ice shelf interaction and iceberg decay.
- The observed changes in sea ice extent could be influenced by a combination of all these factors and still fall within the bounds of natural variability.
The take-home message is that while the increase in total Antarctic sea ice area is relatively minor compared with the decline in the Arctic, it masks the fact that some regions are in strong decline. Given the complex interactions of winds and currents driving patterns of sea ice variability and change in the Southern Ocean climate system, this is not unexpected.
But it is still fascinating to study.
Most of the world’s electrical power is generated by utilizing non-renewable energy resources such as coal or uranium. While each material has a long and productive history of powering electrical plants, they also pose environmental challenges that defy easy comparison. Only by examining the total lifetime risks of the coal and uranium used in energy plants can it be determined which is better for the environment.
Coal-fired electric power plants emit massive amounts of greenhouse gases and other harmful pollutants into the atmosphere on a daily basis. Among the worst offenders are sulfur dioxide, which contributes to the formation of acid rain; nitrogen oxides, which combine with VOCs to form smog; and toxic compounds of mercury. That’s beyond the tonnage of carbon dioxide emissions that contribute directly to climate change. Burning coal releases over two pounds of carbon dioxide into the atmosphere for every kilowatt-hour of electricity it creates (See References 1, 2).
Greenhouse Gas Effect of Nuclear Power Plants
Nuclear power plants emit no carbon dioxide, sulfur dioxide, nitrogen oxides, mercury, or other toxic gases. A properly managed facility does not directly contribute to atmospheric climate change; the broad cooling towers characteristic of nuclear plants emit water vapor. Some coastal plants, however, discharge heated water back to lakes and seas, and this heat eventually contributes to surface warming. Raising water temperature in this way may also alter the way carbon dioxide is exchanged with the air by ocean bodies, leading to major shifts in weather patterns such as hurricanes (See References 1, 3, 4).
A typical coal-burning power plant creates over 300,000 tons of waste ash and sludge each year. That residue forms a toxic mess with pollutants such as arsenic, cadmium, chromium and mercury (See Reference 5). A typical nuclear power plant generates 20 metric tons of radioactive waste annually. This material must be isolated, transported and stored in remote locations for hundreds of years. Exposure to high levels of radiation is deadly to people and animals (See Reference 6).
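The contrast in waste mass is striking when put side by side. The comparison below treats short tons and metric tons as roughly interchangeable, which is adequate for an order-of-magnitude view:

```python
coal_waste_tons = 300_000    # ash and sludge per typical coal plant per year
nuclear_waste_tons = 20      # radioactive waste per typical nuclear plant per year

ratio = coal_waste_tons / nuclear_waste_tons
print(f"A coal plant produces ~{ratio:,.0f}x more waste by mass each year")
```

By mass, the coal plant's annual residue is on the order of fifteen thousand times larger, though of course the two waste streams pose very different kinds of hazard.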
While a nuclear power plant is completely safe under ideal conditions, the failure of a poorly designed facility in Chernobyl led to the world’s largest single eco-disaster. The failure of the Fukushima nuclear power plants following a series of earthquakes and tsunamis demonstrated that even well designed nuclear energy systems are not risk-free. Frightening as those episodes may seem, however, the danger of climate change caused by greenhouse gas emissions may be more urgent — and thus make nuclear a better choice than coal for the environment.
- U.S. Environmental Protection Agency: Air Emissions
- U.S. Energy Information Administration: Carbon Dioxide Emissions from the Generation of Electric Power in the United States
- Marian Koshland Science Museum: Global Warming Facts and Our Future: Ocean Circulation
- U.S. Environmental Protection Agency: Nuclear Energy
- Union of Concerned Scientists: Coal Power: Wastes Generated
- Nuclear Energy Institute: Nuclear Waste: Amounts and On-Site Storage
Source: M. Matthews, homeguides.sfgate.com
Human-caused climate change and air pollution remain major global-scale problems and are both due mostly to fossil fuel burning. Mitigation efforts for both of these problems should be undertaken concurrently in order to maximize effectiveness. Such efforts can be accomplished largely with currently available low-carbon and carbon-free alternative energy sources like nuclear power and renewables, as well as energy efficiency improvements.
Figure 1. Cumulative net deaths prevented assuming nuclear power replaces fossil fuels. The top panel (a) shows results for the historical period in our study (1971-2009), with mean values (labeled) and ranges for the baseline historical scenario. The middle (b) and bottom (c) panels show results for the high-end and low-end projections, respectively, of nuclear power supply estimated by the IAEA (ref. 4) for the period 2010-2050. Error bars reflect the ranges for the fossil fuel mortality factors listed in Table 1 of our paper. The larger columns in panels (b) and (c) reflect the all-coal case and are labeled with their mean values, while the smaller columns reflect the all-gas case; values for the latter are not shown because they are all simply a factor of about 10 lower (reflecting the order-of-magnitude difference between the mortality factors for coal and gas). Countries/regions are arranged in descending order of CO2 emissions in recent years. FSU15=15 countries of the Former Soviet Union and OECD=Organization for Economic Cooperation and Development.
In a recently published paper (ref. 1), we provide an objective, long-term, quantitative analysis of the effects of nuclear power on human health (mortality) and the environment (climate). Several previous scientific papers have quantified global-scale greenhouse gas (GHG) emissions avoided by nuclear power, but to our knowledge, ours is the first to quantify avoided human deaths as well as avoided GHG emissions on global, regional, and national scales.
The paper demonstrates that without nuclear power, it will be even harder to mitigate human-caused climate change and air pollution. This is fundamentally because historical energy production data reveal that if nuclear power never existed, the energy it supplied almost certainly would have been supplied by fossil fuels instead (overwhelmingly coal), which cause much higher air pollution-related mortality and GHG emissions per unit energy produced (ref. 2).
Using historical electricity production data and mortality and emission factors from the peer-reviewed scientific literature, we found that despite the three major nuclear accidents the world has experienced, nuclear power prevented an average of over 1.8 million net deaths worldwide between 1971-2009 (see Fig. 1). This amounts to at least hundreds and more likely thousands of times more deaths than it caused. An average of 76,000 deaths were avoided annually between 2000 and 2009 (see Fig. 2), with a range of 19,000-300,000 per year.
Likewise, we calculated that nuclear power prevented an average of 64 gigatonnes of CO2-equivalent (GtCO2-eq) net GHG emissions globally between 1971-2009 (see Fig. 3). This is about 15 times more emissions than it caused. It is equivalent to the past 35 years of CO2 emissions from coal burning in the U.S. or 17 years in China (ref. 3) — i.e., historical nuclear energy production has prevented the building of hundreds of large coal-fired power plants.
To compute potential future effects, we started with the projected nuclear energy supply for 2010-2050 from an assessment made by the UN International Atomic Energy Agency that takes into account the effects of the Fukushima accident (ref. 4). We assume that the projected nuclear energy is canceled and replaced entirely by energy from either coal or natural gas. We calculate that this nuclear phaseout scenario leads to an average of 420,000-7 million deaths and 80-240 GtCO2-eq emissions globally (the high-end values reflect the all-coal case; see Figs. 1 and 3). This emissions range corresponds to 16-48% of the “allowable” cumulative CO2 emissions between 2012-2050 if the world chooses to aim for a target atmospheric CO2 concentration of 350 ppm by around the end of this century (ref. 5). In other words, projected nuclear power could reduce the CO2 mitigation burden for meeting this target by as much as 16-48%.
The largest uncertainties and limitations of our analysis stem from the assumed values for impacts per unit electric energy produced. However, we emphasize that our results for both prevented mortality and prevented GHG emissions could be substantial underestimates. This is because (among other reasons) our mortality and emission factors are based on analysis of Europe and the US (respectively), and thus neglect the fact that fatal air pollution and GHG emissions from power plants in developing countries are on average substantially higher per unit energy produced than in developed countries.
Our findings also have important implications for large-scale “fuel switching” to natural gas from coal or from nuclear. Although natural gas burning emits less fatal pollutants and GHGs than coal burning, it is far deadlier than nuclear power, causing about 40 times more deaths per unit electric energy produced (ref. 2).
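The "about 40 times" figure follows directly from per-unit-energy mortality factors of the kind tabulated in ref. 2. The specific numbers below are illustrative assumptions chosen to be consistent with that stated ratio, not values quoted from the paper:

```python
# Illustrative mortality factors (deaths per TWh of electricity generated),
# of the kind tabulated in Markandya & Wilkinson (2007). These particular
# values are assumptions for this sketch, not the paper's exact figures.
deaths_per_twh = {"coal": 25.0, "gas": 2.8, "nuclear": 0.07}

ratio_gas_nuclear = deaths_per_twh["gas"] / deaths_per_twh["nuclear"]
print(f"Gas vs nuclear: ~{ratio_gas_nuclear:.0f}x more deaths per TWh")

# The same bookkeeping behind the avoided-deaths estimates: for a
# hypothetical 100 TWh that nuclear supplies instead of gas,
energy_twh = 100
avoided = energy_twh * (deaths_per_twh["gas"] - deaths_per_twh["nuclear"])
print(f"Net deaths avoided over {energy_twh} TWh: ~{avoided:.0f}")
```

The study's headline numbers come from applying this kind of calculation, with literature-based factors, to decades of country-level electricity production data.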
Also, such fuel switching is practically guaranteed to worsen the climate problem for several reasons. First, carbon capture and storage is an immature technology and is therefore unlikely to constrain the resulting GHG emissions in the necessary time frame. Second, electricity infrastructure generally has a long lifetime (e.g., fossil fuel power plants typically operate for up to ~50 years). Third, potentially usable natural gas resources (especially unconventional ones like shale gas) are enormous, containing many hundreds to thousands of gigatonnes of carbon (based on ref. 6). For perspective, the atmosphere currently contains ~830 GtC, of which ~200 GtC are from industrial-era fossil fuel burning.
We conclude that nuclear energy — despite posing several challenges, as do all energy sources (ref. 7) — needs to be retained and significantly expanded in order to avoid or minimize the devastating impacts of unabated climate change and air pollution caused by fossil fuel burning.
1. Kharecha, P.A., and J.E. Hansen, 2013: Prevented mortality and greenhouse gas emissions from historical and projected nuclear power. Environ. Sci. Technol., in press, doi:10.1021/es3051197.
2. Markandya, A., and P. Wilkinson, 2007: Electricity generation and health. Lancet, 370, 979-990, doi: 10.1016/S0140-6736(07)61253-7.
3. Boden, T. A., G. Marland, R.J. Andres, 2012: Global, Regional, and National Fossil-Fuel CO2 Emissions. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tenn., U.S.A., doi:10.3334/CDIAC/00001_V2012.
4. International Atomic Energy Agency, 2011: Energy, Electricity and Nuclear Power Estimates for the Period up to 2050: 2011 Edition. IAEA Reference Data Series 1/31. Available at http://www-pub.iaea.org/MTCD/Publications/PDF/RDS1_31.pdf
5. Hansen, J., P. Kharecha, Mki. Sato, V. Masson-Delmotte, et al., 2013: Scientific prescription to avoid dangerous climate change to protect young people, future generations, and nature. PLOS One, submitted.
6. GEA, 2012: Global Energy Assessment — Toward a Sustainable Future. Cambridge University Press, Cambridge, UK and New York, NY, USA and the International Institute for Applied Systems Analysis, Laxenburg, Austria. Available at http://www.globalenergyassessment.org.
7. Kharecha, P.A., C.F. Kutscher, J.E. Hansen, and E. Mazria, 2010: Options for near-term phaseout of CO2 emissions from coal use in the United States. Environ. Sci. Technol., 44, 4050-4062, doi:10.1021/es903884a.
Carbon dioxide (CO2) is an important heat-trapping (greenhouse) gas, which is released through human activities such as deforestation and burning fossil fuels, as well as natural processes such as respiration and volcanic eruptions. The chart on the left shows the CO2 levels in the Earth’s atmosphere during the last three glacial cycles, as reconstructed from ice cores. The chart on the right shows CO2 levels in recent years, corrected for average seasonal cycles.
Thirty-five years ago, a scientist named John H. Mercer issued a warning. By then it was already becoming clear that human emissions would warm the earth, and Dr. Mercer had begun thinking deeply about the consequences.
His paper, in the journal Nature, was titled “West Antarctic Ice Sheet and CO2 Greenhouse Effect: A Threat of Disaster.” In it, Dr. Mercer pointed out the unusual topography of the ice sheet sitting over the western part of Antarctica. Much of it is below sea level, in a sort of bowl, and he said that a climatic warming could cause the whole thing to degrade rapidly on a geologic time scale, leading to a possible rise in sea level of 16 feet.
While it is clear by now that we are in the early stages of what is likely to be a substantial rise in sea level, we still do not know if Dr. Mercer was right about a dangerous instability that could cause that rise to happen rapidly, in geologic time. We may be getting closer to figuring that out.
An intriguing new paper comes from Michael J. O’Leary of Curtin University in Australia and five colleagues scattered around the world. Dr. O’Leary has spent more than a decade exploring the remote western coast of Australia, considered one of the best places in the world to study sea levels of the past.
The paper, published July 28 in Nature Geoscience, focuses on a warm period in the earth’s history that preceded the most recent ice age. In that epoch, sometimes called the Eemian, the planetary temperature was similar to levels we may see in coming decades as a result of human emissions, so it is considered a possible indicator of things to come.
Examining elevated fossil beaches and coral reefs along more than a thousand miles of coast, Dr. O’Leary’s group confirmed something we pretty much already knew. In the warmer world of the Eemian, sea level stabilized for several thousand years at about 10 to 12 feet above modern sea level.
The interesting part is what happened after that. Dr. O’Leary’s group found what they consider to be compelling evidence that near the end of the Eemian, sea level jumped by another 17 feet or so, to settle at close to 30 feet above the modern level, before beginning to fall as the ice age set in.
In an interview, Dr. O’Leary told me he was confident that the 17-foot jump happened in less than a thousand years — how much less, he cannot be sure.
This finding is something of a vindication for one member of the team, a North Carolina field geologist, Paul J. Hearty. He had argued for decades that the rock record suggested a jump of this sort, but only recently have measurement and modeling techniques reached the level of precision needed to nail the case.
We have to see if their results withstand critical scrutiny. A sea-level scientist not involved in the work, Andrea Dutton of the University of Florida, said the paper had failed to disclose enough detailed information about the field sites to allow her to judge the overall conclusion. But if the work does hold up, the implications are profound. The only possible explanation for such a large, rapid jump in sea level is the catastrophic collapse of a polar ice sheet, on either Greenland or Antarctica.
Dr. O’Leary is not prepared to say which; figuring that out is the group’s next project. But a 17-foot rise in less than a thousand years, a geologic instant, has to mean that one or both ice sheets contain some instability that can be set off by a warmer climate.
That, of course, augurs poorly for humans. Scientists at Stanford calculated recently that human emissions are causing the climate to change many times faster than at any point since the dinosaurs died out. We are pushing the climate system so hard that, if the ice sheets do have a threshold of some kind, we stand a good chance of exceeding it.
Another recent paper, by Anders Levermann of the Potsdam Institute for Climate Impact Research in Germany and a half-dozen colleagues, implies that even if emissions were to stop tomorrow, we have probably locked in several feet of sea level rise over the long term.
Benjamin Strauss and his colleagues at Climate Central, an independent group of scientists and journalists in Princeton that reports climate research, translated the Levermann results into graphical form, and showed the difference it could make if we launched an aggressive program to control emissions. By 2100, their calculations suggest, continuing on our current path would mean locking in a long-term sea level rise of 23 feet, but aggressive emission cuts could limit that to seven feet.
If you are the mayor of Miami or of a beach town in New Jersey, you may be asking yourself: Exactly how long is all this going to take to play out?
On that crucial point, alas, our science is still nearly blind. Scientists can look at the rocks and see indisputable evidence of jumps in sea level, and they can associate those with relatively modest increases in global temperature. But the nature of the evidence is such that it is hard to tell the difference between something that happened in a thousand years and something that happened in a hundred.
On the human time scale, of course, that is all the difference in the world. If sea level is going to rise by, say, 30 feet over several thousand years, that is quite a lot of time to adjust — to pull back from the beaches, to reinforce major cities, and to develop technologies to help us cope.
But if sea level is capable of rising several feet per century, as Dr. O’Leary’s paper would seem to imply and as many other scientists believe, then babies being born now could live to see the early stages of a global calamity.
As global climate change becomes more evident, NASA’s satellite program for studying and monitoring our home planet becomes increasingly important. There are two key areas in which NASA satellite measurements can contribute:
1. Monitoring changes in the Earth’s climate. Global climate change occurs slowly relative to weather and even to the change of seasons throughout the year. Changes known to be related to global climate change–increased atmospheric carbon dioxide and other greenhouse gases (including water vapor), sea level rise, and the melting of Arctic sea ice and the Greenland and Antarctic ice sheets–are so gradual that it takes many years, even decades, to characterize and quantify them.
Given that satellite missions typically last on the order of three to 10 years, NASA often needs to consider launching copies of some instruments, as current versions age and fail. Continuity in our satellite observations is important for maintaining long records of key climate indicators, such as those listed above. Having long and continuous records of these is critical for monitoring the effects of climate change, helping determine how we can best adapt to them, and assessing whether measures to limit its effects are working as expected.
2. Improving our understanding of key global climate change processes. Simply monitoring some of the climate change indicators listed above doesn’t provide enough information for scientists to fully understand and characterize the problem and its consequences.
For example, only observing sea level rise doesn’t illuminate all the key processes involved in determining the rate at which it is rising; these include the melting of ice sheets and glaciers, the warming of the ocean, the continents’ and shorelines’ slow response to ice sheet melting, and so on. Similarly, it is critical to understand how water vapor and clouds respond to climate change, as these help determine the amount of future warming that might be expected to result from increasing the amount of greenhouse gases in the atmosphere.
Knowing “how the Earth’s climate works” is vital to making projections of future warming and the associated impacts using very sophisticated computer models of the Earth’s climate. For such projections to be useful, they have to accurately represent the Earth’s climate system.
Thus, some of NASA’s satellite program focuses on developing new observations to illuminate how the Earth’s climate system works and to reduce uncertainties in global models used for climate projection.
The above is from Dr. Duane Waliser, who specializes in climate dynamics and modeling. He is the chief scientist of JPL’s Earth Science and Technology Directorate and an adjunct professor in the Atmospheric and Oceanic Sciences Department at UCLA.