Findings

Weather forecast

Kevin Lewis

January 28, 2015

Does the Environment Still Matter? Daily Temperature and Income in the United States

Tatyana Deryugina & Solomon Hsiang
NBER Working Paper, December 2014

Abstract:
It is widely hypothesized that incomes in wealthy countries are insulated from environmental conditions because individuals have the resources needed to adapt to their environment. We test this idea in the wealthiest economy in human history. Using within-county variation in weather, we estimate the effect of daily temperature on annual income in United States counties over a 40-year period. We find that this single environmental parameter continues to play a large role in overall economic performance: the productivity of individual days declines roughly 1.7% for each 1°C (1.8°F) increase in daily average temperature above 15°C (59°F). A weekday above 30°C (86°F) costs an average county $20 per person. Hot weekends have little effect. These estimates are net of many forms of adaptation, such as factor reallocation, defensive investments, transfers, and price changes. Because the effect of temperature has not changed since 1969, we infer that recent uptake or innovation in adaptation measures has been limited. The non-linearity of the effect on different components of income suggests that temperature matters because it reduces the productivity of the economy's basic elements, such as workers and crops. If counties could choose daily temperatures to maximize output, rather than accepting their geographically determined endowment, we estimate that annual income growth would rise by 1.7 percentage points. Applying our estimates to a distribution of "business as usual" climate change projections indicates that warmer daily temperatures will lower annual growth by 0.06–0.16 percentage points in the United States unless populations engage in new forms of adaptation.
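
The abstract's headline numbers make for a simple back-of-the-envelope damage function. Below is a minimal sketch in Python; the 1.7%-per-°C slope and 15°C threshold come from the abstract, while the piecewise-linear form is an illustrative assumption, not the authors' estimated specification.

```python
def daily_productivity_loss(temp_c: float) -> float:
    """Fractional loss of one day's productivity at a given daily average
    temperature, using the abstract's headline numbers: roughly 1.7% lost
    per 1 degree C above 15 C, and no loss below that threshold. The
    piecewise-linear form is an illustrative assumption."""
    THRESHOLD_C = 15.0       # temperature above which losses begin (abstract)
    LOSS_PER_DEGREE = 0.017  # ~1.7% of daily output per degree C (abstract)
    return max(0.0, (temp_c - THRESHOLD_C) * LOSS_PER_DEGREE)

# Under this stylized reading, a 30 C (86 F) day loses roughly
# (30 - 15) * 1.7% = 25.5% of its output.
print(f"{daily_productivity_loss(30.0):.1%}")  # 25.5%
```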

---------------------

Who Loses Under Power Plant Cap-and-Trade Programs?

Mark Curtis
NBER Working Paper, December 2014

Abstract:
This paper tests how a major cap-and-trade program, known as the NOx Budget Trading Program (NBP), impacted labor markets in the regions where it was implemented. The cap-and-trade program dramatically decreased levels of NOx emissions and added substantial costs to energy producers. Using a triple-differences approach that takes advantage of the geographic and time variation of the program as well as variation in industry energy-intensity levels, I examine how employment dynamics changed in manufacturing industries whose production process requires high levels of energy. After accounting for a variety of flexible state, county and industry trends, I find that employment in the manufacturing sector dropped by 1.3% as a result of the NBP. Young workers experienced the largest employment declines and earnings of newly hired workers fell after the regulation began. Employment declines are shown to have occurred primarily through decreased hiring rates rather than increased separation rates, thus mitigating the impact on incumbent workers.

---------------------

Temperature impacts on economic growth warrant stringent mitigation policy

Frances Moore & Delavane Diaz
Nature Climate Change, forthcoming

Abstract:
Integrated assessment models compare the costs of greenhouse gas mitigation with damages from climate change to evaluate the social welfare implications of climate policy proposals and inform optimal emissions reduction trajectories. However, these models have been criticized for lacking a strong empirical basis for their damage functions, which do little to alter assumptions of sustained gross domestic product (GDP) growth, even under extreme temperature scenarios. We implement empirical estimates of temperature effects on GDP growth rates in the DICE model through two pathways, total factor productivity growth and capital depreciation. This damage specification, even under optimistic adaptation assumptions, substantially slows GDP growth in poor regions but has more modest effects in rich countries. Optimal climate policy in this model stabilizes global temperature change below 2 °C by eliminating emissions in the near future and implies a social cost of carbon several times larger than previous estimates. A sensitivity analysis shows that the magnitude of climate change impacts on economic growth, the rate of adaptation, and the dynamic interaction between damages and GDP are three critical uncertainties requiring further research. In particular, optimal mitigation rates are much lower if countries become less sensitive to climate change impacts as they develop, making this a major source of uncertainty and an important subject for future research.

---------------------

Optimal Learning on Climate Change: Why Climate Skeptics Should Reduce Emissions

Sweder van Wijnbergen & Tim Willems
Journal of Environmental Economics and Management, March 2015, Pages 17–33

Abstract:
Climate skeptics typically argue that the possibility that global warming is exogenous implies that we should not take additional action towards reducing emissions until we know what drives warming. This paper, however, shows that even climate skeptics have an incentive to reduce emissions: such a directional change generates information on the causes of global warming. Since the optimal policy depends upon these causes, they are valuable to know. Although increasing emissions would also generate information, that option is inferior due to its irreversibility. We show that optimality can even imply that climate skeptics should actually argue for lower emissions than believers.

---------------------

Transition to Clean Technology

Daron Acemoglu et al.
NBER Working Paper, December 2014

Abstract:
We develop a microeconomic model of endogenous growth where clean and dirty technologies compete in production and innovation — in the sense that research can be directed to either clean or dirty technologies. If dirty technologies are more advanced to start with, the potential transition to clean technology can be difficult both because clean research must climb several rungs to catch up with dirty technology and because this gap discourages research effort directed towards clean technologies. Carbon taxes and research subsidies may nonetheless encourage production and innovation in clean technologies, though the transition will typically be slow. We characterize certain general properties of the transition path from dirty to clean technology. We then estimate the model using a combination of regression analysis on the relationship between R&D and patents, and simulated method of moments using microdata on employment, production, R&D, firm growth, entry and exit from the US energy sector. The model's quantitative implications match a range of moments not targeted in the estimation quite well. We then characterize the optimal policy path implied by the model and our estimates. Optimal policy makes heavy use of research subsidies as well as carbon taxes. We use the model to evaluate the welfare consequences of a range of alternative policies.

---------------------

From the extreme to the mean: Acceleration and tipping points of coastal inundation from sea level rise

William Sweet & Joseph Park
Earth's Future, December 2014, Pages 579–600

Abstract:
Relative sea level rise (RSLR) has driven large increases in annual water level exceedances (duration and frequency) above minor (nuisance level) coastal flooding elevation thresholds established by the National Weather Service (NWS) at U.S. tide gauges over the last half-century. For threshold levels below 0.5 m above high tide, the rates of annual exceedances are accelerating along the U.S. East and Gulf Coasts, primarily from evolution of tidal water level distributions to higher elevations impinging on the flood threshold. These accelerations are quantified in terms of the local RSLR rate and tidal range through multiple regression analysis. Along the U.S. West Coast, annual exceedance rates are increasing linearly, complicated by sharp punctuations in RSLR anomalies during El Niño Southern Oscillation (ENSO) phases, and we account for annual exceedance variability along the U.S. West and East Coasts from ENSO forcing. Projections of annual exceedances above local NWS nuisance levels at U.S. tide gauges are estimated by shifting probability estimates of daily maximum water levels over a contemporary 5-year period following the probabilistic RSLR projections of Kopp et al. (2014) for representative concentration pathways (RCP) 2.6, 4.5, and 8.5. We suggest a tipping point for coastal inundation (30 days per year with a threshold exceedance) based on the evolution of exceedance probabilities. Under forcing associated with the local-median projections of RSLR, the majority of locations surpass the tipping point over the next several decades, regardless of the specific RCP.

---------------------

Evidence for a wavier jet stream in response to rapid Arctic warming

Jennifer Francis & Stephen Vavrus
Environmental Research Letters, January 2015

Abstract:
New metrics and evidence are presented that support a linkage between rapid Arctic warming, relative to Northern hemisphere mid-latitudes, and more frequent high-amplitude (wavy) jet-stream configurations that favor persistent weather patterns. We find robust relationships among seasonal and regional patterns of weaker poleward thickness gradients, weaker zonal upper-level winds, and a more meridional flow direction. These results suggest that as the Arctic continues to warm faster than elsewhere in response to rising greenhouse-gas concentrations, the frequency of extreme weather events caused by persistent jet-stream patterns will increase.

---------------------

Dramatically increasing chance of extremely hot summers since the 2003 European heatwave

Nikolaos Christidis, Gareth Jones & Peter Stott
Nature Climate Change, January 2015, Pages 46–50

Abstract:
Socio-economic stress from the unequivocal warming of the global climate system could be mostly felt by societies through weather and climate extremes. The vulnerability of European citizens was made evident during the summer heatwave of 2003 when the heat-related death toll ran into tens of thousands. Human influence at least doubled the chances of the event according to the first formal event attribution study, which also made the ominous forecast that severe heatwaves could become commonplace by the 2040s. Here we investigate how the likelihood of having another extremely hot summer in one of the worst affected parts of Europe has changed ten years after the original study was published, given an observed summer temperature increase of 0.81 K since then. Our analysis benefits from the availability of new observations and data from several new models. Using a previously employed temperature threshold to define extremely hot summers, we find that events that would occur twice a century in the early 2000s are now expected to occur twice a decade. For the more extreme threshold observed in 2003, the return time reduces from thousands of years in the late twentieth century to about a hundred years in little over a decade.
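
The shift in likelihood is easiest to see as annual exceedance probabilities, since a return period of T years corresponds to a probability of roughly 1/T per year. Here is a quick sketch using the abstract's round numbers; 1,000 years stands in for "thousands of years" as an order-of-magnitude assumption.

```python
# Return period T (years) -> annual exceedance probability ~ 1/T.
# Figures are the abstract's round numbers; 1,000 years is an
# order-of-magnitude stand-in for "thousands of years".
events = [
    ("hot summer, early 2000s (twice a century)", 50),
    ("hot summer, a decade later (twice a decade)", 5),
    ("2003-level summer, late twentieth century", 1000),
    ("2003-level summer, roughly a decade later", 100),
]
for label, return_period_years in events:
    print(f"{label}: annual probability ~{1 / return_period_years:.1%}")
```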

---------------------

Inferring Carbon Abatement Costs in Electricity Markets: A Revealed Preference Approach using the Shale Revolution

Joseph Cullen & Erin Mansur
NBER Working Paper, December 2014

Abstract:
This paper examines how much carbon emissions from the electricity industry would decrease in response to a carbon price. We show how both carbon prices and cheap natural gas reduce, in a nearly identical manner, the historic cost advantage of coal-fired power plants. The shale revolution has resulted in unprecedented variation in natural gas prices that we use to estimate the short-run price elasticity of abatement. Our estimates imply that a price of $10 ($60) per ton of carbon dioxide would reduce emissions by 4% (10%). Furthermore, carbon prices are much more effective at reducing emissions when natural gas prices are low. In contrast, modest carbon prices have negligible effects when gas prices are at levels seen prior to the shale revolution.

---------------------

The geographical distribution of fossil fuels unused when limiting global warming to 2 °C

Christophe McGlade & Paul Ekins
Nature, 8 January 2015, Pages 187–190

Abstract:
Policy makers have generally agreed that the average global temperature rise caused by greenhouse gas emissions should not exceed 2 °C above the average global temperature of pre-industrial times. It has been estimated that to have at least a 50 per cent chance of keeping warming below 2 °C throughout the twenty-first century, the cumulative carbon emissions between 2011 and 2050 need to be limited to around 1,100 gigatonnes of carbon dioxide (Gt CO2). However, the greenhouse gas emissions contained in present estimates of global fossil fuel reserves are around three times higher than this, and so the unabated use of all current fossil fuel reserves is incompatible with a warming limit of 2 °C. Here we use a single integrated assessment model that contains estimates of the quantities, locations and nature of the world’s oil, gas and coal reserves and resources, and which is shown to be consistent with a wide variety of modelling approaches with different assumptions, to explore the implications of this emissions limit for fossil fuel production in different regions. Our results suggest that, globally, a third of oil reserves, half of gas reserves and over 80 per cent of current coal reserves should remain unused from 2010 to 2050 in order to meet the target of 2 °C. We show that development of resources in the Arctic and any increase in unconventional oil production are incommensurate with efforts to limit average global warming to 2 °C. Our results show that policy makers’ instincts to exploit rapidly and completely their territorial fossil fuels are, in aggregate, inconsistent with their commitments to this temperature limit. Implementation of this policy commitment would also render unnecessary continued substantial expenditure on fossil fuel exploration, because any new discoveries could not lead to increased aggregate production.

---------------------

The Effect of Framing and Normative Messages in Building Support for Climate Policies

Mark Hurlstone et al.
PLoS ONE, December 2014

Abstract:
Deep cuts in greenhouse gas emissions are required to mitigate climate change. However, there is low willingness amongst the public to prioritise climate policies for reducing emissions. Here we show that the extent to which Australians are prepared to reduce their country's CO2 emissions is greater when the costs to future national income are framed as a “foregone-gain” — incomes rise in the future but not by as much as in the absence of emission cuts — rather than as a “loss” — incomes decrease relative to the baseline expected future levels (Studies 1 & 2). The provision of a normative message identifying Australia as one of the world's largest CO2 emitters did not increase the amount by which individuals were prepared to reduce emissions (Study 1), whereas a normative message revealing the emission policy preferences of other Australians did (Study 2). The results suggest that framing the costs of reducing emissions as a smaller increase in future income and communicating normative information about others' emission policy preferences are effective methods for leveraging public support for emission cuts.

---------------------

Global Sea Ice Coverage from Satellite Data: Annual Cycle and 35-Yr Trends

Claire Parkinson
Journal of Climate, December 2014, Pages 9377–9382

Abstract:
Well-established satellite-derived Arctic and Antarctic sea ice extents are combined to create the global picture of sea ice extents and their changes over the 35-yr period 1979–2013. Results yield a global annual sea ice cycle more in line with the high-amplitude Antarctic annual cycle than the lower-amplitude Arctic annual cycle, but trends more in line with the high-magnitude negative Arctic trends than the lower-magnitude positive Antarctic trends. Globally, monthly sea ice extent reaches a minimum in February and a maximum generally in October or November. All 12 months show negative trends over the 35-yr period, with the largest-magnitude monthly trend being the September trend, at −68,200 ± 10,500 km² yr⁻¹ (−2.62% ± 0.40% decade⁻¹), and the yearly average trend being −35,000 ± 5,900 km² yr⁻¹ (−1.47% ± 0.25% decade⁻¹).
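
Because the percentage trends are quoted per decade and the absolute trends per year, each pair implies a baseline extent, and checking that implied baseline is a useful sanity test of the units. The back-calculation below is mine, not a figure from the paper.

```python
# Implied baseline extent:
#   baseline = (trend per year * 10) / (percent per decade / 100)
trends = [
    ("September", -68_200, -2.62),       # km^2/yr, % per decade (abstract)
    ("Yearly average", -35_000, -1.47),
]
for name, trend_km2_per_yr, pct_per_decade in trends:
    baseline_km2 = trend_km2_per_yr * 10 / (pct_per_decade / 100)
    print(f"{name}: implied baseline ~{baseline_km2 / 1e6:.1f} million km^2")
# September ~26.0 and yearly average ~23.8 million km^2, plausible values
# for combined global (Arctic + Antarctic) sea ice extent.
```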

---------------------

Probabilistic reanalysis of twentieth-century sea-level rise

Carling Hay et al.
Nature, 22 January 2015, Pages 481–484

Abstract:
Estimating and accounting for twentieth-century global mean sea-level (GMSL) rise is critical to characterizing current and future human-induced sea-level change. Several previous analyses of tide gauge records — employing different methods to accommodate the spatial sparsity and temporal incompleteness of the data and to constrain the geometry of long-term sea-level change — have concluded that GMSL rose over the twentieth century at a mean rate of 1.6 to 1.9 millimetres per year. Efforts to account for this rate by summing estimates of individual contributions from glacier and ice-sheet mass loss, ocean thermal expansion, and changes in land water storage fall significantly short in the period before 1990. The failure to close the budget of GMSL during this period has led to suggestions that several contributions may have been systematically underestimated. However, the extent to which the limitations of tide gauge analyses have affected estimates of the GMSL rate of change is unclear. Here we revisit estimates of twentieth-century GMSL rise using probabilistic techniques and find a rate of GMSL rise from 1901 to 1990 of 1.2 ± 0.2 millimetres per year (90% confidence interval). Based on individual contributions tabulated in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change, this estimate closes the twentieth-century sea-level budget. Our analysis, which combines tide gauge records with physics-based and model-derived geometries of the various contributing signals, also indicates that GMSL rose at a rate of 3.0 ± 0.7 millimetres per year between 1993 and 2010, consistent with prior estimates from tide gauge records. The increase in rate relative to the 1901–90 trend is accordingly larger than previously thought; this revision may affect some projections of future sea-level rise.

---------------------

Twentieth-century shifts in forest structure in California: Denser forests, smaller trees, and increased dominance of oaks

Patrick McIntyre et al.
Proceedings of the National Academy of Sciences, forthcoming

Abstract:
We document changes in forest structure between historical (1930s) and contemporary (2000s) surveys of California vegetation through comparisons of tree abundance and size across the state and within several ecoregions. Across California, tree density in forested regions increased by 30% between the two time periods, whereas forest biomass in the same regions declined, as indicated by a 19% reduction in basal area. These changes reflect a demographic shift in forest structure: larger trees (>61 cm diameter at breast height) have declined, whereas smaller trees (<30 cm) have increased. Large tree declines were found in all surveyed regions of California, whereas small tree increases were found in every region except the south and central coast. Large tree declines were more severe in areas experiencing greater increases in climatic water deficit since the 1930s, based on a hydrologic model of water balance for historical climates through the 20th century. Forest composition in California in the last century has also shifted toward increased dominance by oaks relative to pines, a pattern consistent with warming and increased water stress, and also with paleohistoric shifts in vegetation in California over the last 150,000 y.

---------------------

Exploring the impact of permitting and local regulatory processes on residential solar prices in the United States

Jesse Burkhardt et al.
Energy Policy, March 2015, Pages 102–112

Abstract:
This article statistically isolates the impacts of city-level permitting and other local regulatory processes on residential solar photovoltaic (PV) prices in the United States. We combine data from two “scoring” mechanisms that independently capture local regulatory process efficiency with the largest dataset of installed PV prices in the United States. We find that variations in local permitting procedures can lead to differences in average residential PV prices of approximately $0.18/W between the jurisdictions with the least-favorable and most-favorable permitting procedures. Between jurisdictions with scores across the middle 90% of the range (i.e., 5th percentile to 95th percentile), the difference is $0.14/W, equivalent to a $700 (2.2%) difference in system costs for a typical 5-kW residential PV installation. When considering variations not only in permitting practices but also in other local regulatory procedures, price differences grow to $0.64–$0.93/W between the least-favorable and most-favorable jurisdictions. Between jurisdictions with scores across the middle 90% of the range, the difference is equivalent to a price impact of at least $2500 (8%) for a typical 5-kW residential PV installation. These results highlight the magnitude of cost reduction that might be expected from streamlining local regulatory regimes.
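
The per-watt differences convert to system-level dollars by simple multiplication against the 5-kW (5,000 W) benchmark system the abstract uses; the sketch below reproduces that arithmetic. The last two rows extend it to the wider all-regulation range, which the abstract reports only in per-watt terms.

```python
# Dollar impact of a per-watt price gap on the abstract's typical system.
SYSTEM_SIZE_W = 5_000  # 5-kW residential PV installation (from abstract)

price_gaps_usd_per_w = [
    ("permitting only, middle 90% of scores", 0.14),
    ("permitting only, full range", 0.18),
    ("all local regulation, low end of range", 0.64),
    ("all local regulation, high end of range", 0.93),
]
for label, usd_per_watt in price_gaps_usd_per_w:
    print(f"{label}: ~${usd_per_watt * SYSTEM_SIZE_W:,.0f} per system")
# 0.14 $/W * 5,000 W = $700, matching the abstract's figure; the
# all-regulation range implies roughly $3,200 to $4,650 per system.
```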

---------------------

The American public’s preference for preparation for the possible effects of global warming: Impact of communication strategies

Bo MacInnis et al.
Climatic Change, January 2015, Pages 17–33

Abstract:
Experiments embedded in surveys of nationally representative samples of American adults assessed whether attitudes toward preparation for the possible effects of global warming varied depending on who endorsed such efforts, the stated purpose of preparation, the consequences of global warming targeted in a preparation message, and the words used to describe preparation and its alternative. Collapsing across all experiments, most (74%) Americans preferred preparing for possible consequences of global warming. The experimental manipulations produced statistically significant variation in this percentage, but in ways inconsistent with a series of perspectives that yield predictions about this variation. Preference for preparation was not greater when it was described in more familiar or simpler terms: preference was greatest when the effort was described as a way to “increase preparedness” and least when it was described as a way to “increase resilience.” Nor was preference greater when efforts were said to be focused on people’s health rather than on people and the environment generally or on coastal ecosystems in particular, or when preparation was endorsed by more generally trusted groups: preference was highest when no one explicitly endorsed preparation or when it was endorsed by government officials or university researchers, and it declined when religious leaders or business leaders endorsed it. Thus, these experiments illustrate the value of empirical testing to gauge the impact of variation in descriptions of policy options in this arena and show how communication approaches may have influenced public opinion in the past.

---------------------

Natural Hazards and Residential Mobility: General Patterns and Racially Unequal Outcomes in the United States

James Elliott
Social Forces, forthcoming

Abstract:
This study conducts a nationwide, locally comparative analysis of the extent to which natural hazards contribute to residential mobility in the United States and how this influence varies for racial and ethnic minorities. Analyses combine census data on households with data from thousands of recorded natural hazards during the late 1990s. Findings affirm that natural hazards are common throughout the country; that associated property damage correlates positively with increases in residential mobility for all groups; that these increases are particularly noticeable among racial and ethnic minorities because of preexisting inequalities in mobility; and that areas with more costly damage tend to pull as well as push migrants, especially Latinos and Asians. Implications for existing theory, methods, and policy are discussed.

---------------------

Achieving California's 80% greenhouse gas reduction target in 2050: Technology, policy and scenario analysis using CA-TIMES energy economic systems model

Christopher Yang et al.
Energy Policy, February 2015, Pages 118–130

Abstract:
The CA-TIMES optimization model of the California Energy System (v1.5) is used to understand how California can meet the 2050 targets for greenhouse gas (GHG) emissions (80% below 1990 levels). This model represents energy supply and demand sectors in California and simulates the technology and resource requirements needed to meet projected energy service demands. The model includes assumptions on policy constraints, as well as technology and resource costs and availability. Multiple scenarios are developed to analyze the changes and investments in low-carbon electricity generation, alternative fuels and advanced vehicles in transportation, resource utilization, and efficiency improvements across many sectors. Results show that major energy transformations are needed but that achieving the 80% reduction goal for California is possible at reasonable average carbon reduction cost ($9 to $124/tonne CO2e at 4% discount rate) relative to a baseline scenario. Availability of low-carbon resources such as nuclear power, carbon capture and sequestration (CCS), biofuels, wind and solar generation, and demand reduction all serve to lower the mitigation costs, but CCS is a key technology for achieving the lowest mitigation costs.

---------------------

Modeling very large-fire occurrences over the continental United States from weather and climate forcing

R. Barbero et al.
Environmental Research Letters, December 2014

Abstract:
Very large fires (VLFs) have widespread impacts on ecosystems, air quality, and fire suppression resources, and in many regions account for a majority of total area burned. Empirical generalized linear models of the largest fires (>5000 ha) across the contiguous United States (US) were developed at ~60 km spatial and weekly temporal resolutions using solely atmospheric predictors. Climate–fire relationships on interannual timescales were evident: wetter-than-normal conditions in the previous growing season enhanced VLF probability in rangeland systems, while concurrent long-term drought enhanced VLF probability in forested systems. Information at sub-seasonal timescales further refined these relationships, with short-term fire weather a significant predictor in rangelands and fire-danger indices linked to dead fuel moisture a significant predictor in forested lands. The models captured the observed spatial and temporal variability, including the interannual variability of VLF occurrences within most ecoregions. Furthermore, the models captured the observed increase in VLF occurrences across parts of the southwestern and southeastern US from 1984 to 2010, suggesting that, irrespective of changes in fuels and land management, climatic factors have become more favorable for VLF occurrence over the past three decades in some regions. Our modeling framework provides a basis for simulations of future VLF occurrences from climate projections.
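
For readers who want a concrete picture of the modeling framework, here is a minimal sketch of a binomial (logistic) GLM for weekly fire occurrence, assuming numpy and statsmodels. The data and predictor names are hypothetical placeholders, not the paper's actual covariates or resolution.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical weekly records for one grid cell: a binary very-large-fire
# indicator plus two stand-in atmospheric predictors.
rng = np.random.default_rng(0)
n_weeks = 500
X = np.column_stack([
    rng.normal(size=n_weeks),  # stand-in: short-term fire-weather index
    rng.normal(size=n_weeks),  # stand-in: long-term drought index
])
logit_p = -4.0 + 0.8 * X[:, 0] + 0.6 * X[:, 1]   # rare-event baseline
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))  # simulated VLF occurrence

# Binomial GLM with a logit link: the generalized-linear-model form used
# for a rare binary outcome such as weekly VLF occurrence.
fit = sm.GLM(y, sm.add_constant(X), family=sm.families.Binomial()).fit()
print(fit.summary())
```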

