The Climate CIRCulator is brought to you by The Pacific Northwest Climate Impacts Research Consortium (CIRC) and The Oregon Climate Change Research Institute (OCCRI).

Tree damage, timber blowdown, and forest debris in Clatsop and Tillamook counties from winter storm 2007. (Oregon Department of Forestry)

Coastal Hazards and Hydrology
Reconstructing the Great Coastal Gale of 2007

The Northwest’s current drought makes it difficult to recall times when storms were many and snow was plentiful. But the winter of 2007-2008 was just such a time. A series of storms in early December 2007 blanketed low-elevation ski resorts with deep powder and pummeled the coastline with hurricane-strength winds and surging seas, resulting in flooding that closed highways and inundated homes.
The 2007 coastal disturbance is the focus of a recent study by Oregon State University researchers Tiffany Cheng and David Hill. (Hill and his research were profiled in the April 2015 issue of the CIRCulator.) The paper’s third author, Wolf Read, works at the University of British Columbia. 
In their paper, Cheng and colleagues examine the factors behind the flooding that occurred on December 2nd and 3rd, 2007, in Tillamook Bay, an estuary on Oregon’s north coast. To get a glimpse of the intense, two-day flooding—dubbed the “Great Coastal Gale of 2007”—the researchers developed a computer model to recreate the event. Their goal: to evaluate the relative influences on flooding in the estuary, including surges from wind-driven water within the bay, river flows, and offshore breaking waves. (Each influence was assessed on its own, excluding the influence of the tides.) 
The researchers first looked at the possibility that the storm’s 100-mile-per-hour winds may have been responsible for bay flooding. However, upon further analysis the team found the heavy winds had only a mixed impact on flooding in the bay. For example, winds blowing from the south pushed water from the southern portion of the bay to the north, thereby reducing flooding in the south. But as the researchers discovered, this only explained a small percentage of flooding in the north of the bay. 
The authors next looked at the possibility that bay flooding may have been driven by heavy precipitation creating high streamflows from the coastal mountains. When they removed the streamflow component from their simulation, the authors found that water levels declined by approximately 20 percent, a significant but still incomplete explanation for the extreme flooding during the storm. The main driver of flooding had to be something else, the researchers concluded.
By removing from their simulations the influence of the offshore breaking waves that drove water through the bay’s inlet, Cheng and colleagues discovered that simulated water levels dropped by nearly two feet, explaining most of the abnormally high water levels in the bay. In other words, offshore waves (which measured in excess of 30 feet during the storm’s peak) were the smoking gun for flooding in Tillamook Bay. What’s more, timing was everything, and things could have been much worse. Streamflow contributions to the bay were relatively small, the researchers note, because they peaked just before the waves reached their maximum height during the storm. Had those same streamflows peaked a day later, flooding from the Great Coastal Gale of 2007 might have been that much greater.
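The study’s attribution strategy—removing one forcing at a time from the simulation and measuring how much the simulated water level drops—can be sketched with a toy model. The contributions below are made-up numbers chosen only to mirror the qualitative result; the real analysis used a full hydrodynamic simulation of the bay, not this additive simplification.

```python
# Illustrative sketch of "remove one forcing at a time" attribution.
# All contribution values are hypothetical, not the study's results.

def simulate_water_level(winds=True, streamflow=True, waves=True):
    """Toy stand-in for a hydrodynamic model: peak water level (meters)
    above normal, given which forcings are switched on."""
    level = 0.0
    if winds:
        level += 0.05   # hypothetical small wind-setup contribution
    if streamflow:
        level += 0.20   # hypothetical river-runoff contribution
    if waves:
        level += 0.60   # hypothetical wave-driven contribution (~2 ft)
    return level

full = simulate_water_level()
for name, kwargs in [("winds", {"winds": False}),
                     ("streamflow", {"streamflow": False}),
                     ("waves", {"waves": False})]:
    drop = full - simulate_water_level(**kwargs)
    print(f"Removing {name}: level falls {drop:.2f} m "
          f"({100 * drop / full:.0f}% of total)")
```

Because the toy model is additive, each drop equals that forcing’s contribution exactly; in a real nonlinear simulation the drops need not sum to the total, which is part of why the authors ran each experiment separately.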

Cheng, T.K., D.F. Hill, and W. Read (2015) The contributions to storm tides in Pacific Northwest estuaries: Tillamook Bay, Oregon, and the December 2007 storm. Journal of Coastal Research, 31(3), 723–734.

Drought in California (USDA)

California Drought
Taking Issue with 1,200-Year Drought Study

Much of the Western United States is now in the grip of moderate to severe drought. According to some, California is experiencing its most severe drought in over a millennium. But just how good is that claim?
This is one of the questions NOAA researchers Henry Diaz and Eugene Wahl hope to answer in their recent paper in the Journal of Climate. Part of Diaz and Wahl’s inquiry is in response to the headline-grabbing work of Daniel Griffin and Kevin Anchukaitis.

Griffin and Anchukaitis compared tree-ring samples gathered in Central and Southern California over the last three years against ring measurements taken from trees that lived in those areas from the years 800 to 2006. (See “What We’re Learning from California’s Drought” in the February 2015 issue of the CIRCulator.) From their analysis the researchers concluded that while drought was common in the records—the two noted some 37 droughts in the tree ring data—California’s current drought was the worst in over 1,200 years.

Griffin and Anchukaitis’ conclusions are based on the assumption that tree ring records can act as proxies for actual weather station measurements. Since weather station records—much to every climatologist’s dismay—do not run much beyond a century, reconstructing paleoclimate requires getting creative with the available measures, such as tree ring data. Researchers reconstruct paleoclimate by applying a variety of statistical methods and by carefully choosing which tree ring series to include in order to emphasize the weather variable of interest. (Some trees may respond more to temperature, others to precipitation, depending on species and location.) Here’s where Diaz and Wahl’s newer study comes into the picture.

Diaz and Wahl used a previous reconstruction of streamflow in medium-sized river basins across the West. They argue that because streamflow can account for precipitation over an entire basin, the method will be less sensitive to local fluctuations in precipitation and soil moisture, and particularly to the tendency for local soil moisture anomalies to persist and influence tree growth—and hence data—in a second year. Their study used reconstructed, gridded precipitation over the continental U.S. for the period 1571-1977, and focused on the average over the entire California-Nevada region.

Reconstructions in hand, Diaz and Wahl ranked the precipitation anomalies averaged over California-Nevada: 1977, the last year in the time series and a water year roughly as dry as 2014, came in at 10th place. But they also note that 2014 is “comparable” to a few other dry years. From this the researchers concluded that “events of this magnitude occur on an approximately 50-year average time scale, and thus 2014 is clearly not unprecedented.” (Note: their 50-year “average time scale” is not the same as a “return period” calculated by fitting an extreme value distribution to the data.) Similarly, for 3-year total precipitation they find other examples that were roughly as dry, and state, again, that 2014 is “not unprecedented.”
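The distinction between an “average time scale” in this counting sense and a formal return period can be illustrated with a quick sketch. The series below is random noise standing in for a precipitation reconstruction; every number here is hypothetical and none comes from the paper.

```python
import numpy as np

# Hypothetical 400-year record of precipitation anomalies (random noise,
# not Diaz and Wahl's actual reconstruction).
rng = np.random.default_rng(0)
n_years = 400
anomaly = rng.normal(size=n_years)

# A hypothetical "2014-magnitude" dryness threshold (most negative = driest).
threshold = anomaly.min() * 0.8

# Average time scale, counting-style: record length divided by the number
# of years at least this dry. A formal return period would instead come
# from fitting an extreme value distribution to the series.
n_events = int(np.sum(anomaly <= threshold))
avg_time_scale = n_years / n_events
print(f"{n_events} comparable years in {n_years}: one per ~{avg_time_scale:.0f} years")
```

The counting approach depends heavily on how “comparable” is defined and on the handful of events that happen to fall in the record, which is why it can differ from a fitted return period.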
In Our Editorial Opinion: The results of Diaz and Wahl’s analysis contrast with (though don’t necessarily contradict) those of Griffin and Anchukaitis. However, paleoclimate reconstructions are always challenging, and these studies illustrate that different methods can produce very different results. Diaz and Wahl’s own Table 2, which ranks 3-year precipitation totals, shows that 2012-14 is the lowest in the instrumental record, but also that their precipitation reconstruction comes up with a very different ranking for the driest years in the 1895-1977 period common to both the tree ring and instrumental records. In other words, Diaz and Wahl have shown only that 2014, or 2012-14, might not be hands-down the driest 1- or 3-year period since 1571.

Diaz, H.F., and E.R. Wahl (2015) Recent California Water Year Precipitation Deficits: A 440-Year Perspective. J. Climate, 28, 4637–4652.

Griffin, D., and K.J. Anchukaitis (2014) How unusual is the 2012–2014 California drought? Geophys. Res. Lett., 41, 9017–9023. doi:10.1002/2014GL062433

The Columbia River Gorge near Dallesport, Washington. (Creative Commons)

New Hydrologic Method 
A New Approach to Tracking Hydrologic Responses to Climate Change

Rising temperatures and changing precipitation patterns are leading to major shifts in when and how streamflows occur. But deciphering how climate change might affect any given watershed can be both time-consuming and expensive. Now a new study focusing on the Pacific Northwest offers a quick, inexpensive alternative to traditional climate downscaling and hydrologic modeling.
The method is called the sensitivity approach. The brainchild of CIRC and OCCRI researcher Julie Vano, the sensitivity approach is the subject of a recent paper written by Vano and CIRC researchers Bart Nijssen and Dennis Lettenmaier in Water Resources Research. Essentially, the approach tries to determine how sensitive runoff is to incremental changes in temperature and precipitation during different seasons.
Watersheds in the Northwest are typically classified as rain-dominant, snow-dominant, or transitional (a mix of both rain and snow), based on their predominant type of precipitation.
Rain-dominant watersheds receive mostly rainfall during the cool season, making streamflow highest in the winter. Snow-dominant watersheds receive mostly snow, which remains stored in mountain snowpack until it melts during the spring and summer when peak streamflow occurs. Transitional watersheds receive both rain and snow so they tend to have two streamflow peaks: one in winter and the other in spring.
Using the sensitivity approach, Vano and colleagues identified watersheds in the Pacific Northwest that are most likely to experience streamflow seasonality changes.

The team found that transitional, or mixed, watersheds that receive a substantial portion of streamflow from spring snowmelt are most sensitive to warming during the cool season, as they are likely to experience increased cool-season runoff and decreased warm-season runoff. Rain-dominated watersheds are less sensitive to warming because there is little snow to melt. And snow-dominated watersheds are also less sensitive because they are likely to remain cold enough that warming does not affect the timing of snowmelt. Here, step by step, is how the researchers used the sensitivity approach to get their results:
  1. They ran a hydrologic model using baseline historical temperature and precipitation data (taken from the Columbia Basin Climate Change Scenarios Project). This produced a series of simulated streamflows.
  2. Then they ran the model again, but with temperatures 0.1°Celsius (0.2°F) higher every day of the year, producing still more streamflows.
  3. Next, they compared streamflows from step 2 with the baseline streamflows from step 1 to determine how much runoff changes at each location and in each month occurred in response to the 0.1°C temperature increase.
  4. They repeated steps one through three, but this time changed the temperature only during the cool season or warm season, or fall, winter, spring, or summer.
  5. They then repeated the entire process by imposing a precipitation increase of one percent while keeping temperature the same.
  6. And finally, they multiplied the resulting temperature and precipitation sensitivities for each location and month by the projected change in temperature or precipitation that came directly from Global Climate Models (no downscaling necessary) to approximate future changes in streamflow.
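The six steps above amount to estimating per-unit sensitivities from small perturbations and then scaling them by projected changes. Here is a minimal sketch of that logic, with a toy stand-in for the hydrologic model and made-up GCM deltas; none of the numbers comes from the paper.

```python
import numpy as np

def run_hydro_model(temp_offset=0.0, precip_scale=1.0):
    """Toy monthly runoff (mm), Jan-Dec: warming shifts flow from warm
    season to cool season, more precipitation raises flow everywhere.
    Purely illustrative, not the actual hydrologic model."""
    base = np.array([80, 90, 100, 110, 90, 60, 30, 20, 20, 40, 60, 70], float)
    shift = temp_offset * np.array([5, 5, 4, 2, -2, -4, -3, -2, -1, 0, 2, 4], float)
    return (base + shift) * precip_scale

baseline = run_hydro_model()

# Steps 1-3: small perturbations give per-unit sensitivities per month.
dq_dT = (run_hydro_model(temp_offset=0.1) - baseline) / 0.1    # mm per deg C
dq_dP = (run_hydro_model(precip_scale=1.01) - baseline) / 1.0  # mm per % precip

# Step 6: scale sensitivities by projected changes taken straight from a
# GCM (the +2 C and +5% values here are assumed for illustration).
delta_T, delta_P = 2.0, 5.0
projected = baseline + dq_dT * delta_T + dq_dP * delta_P
print(np.round(projected, 1))
```

The key design idea is visible in the last line: once the sensitivities are computed, any number of GCM scenarios can be evaluated with simple arithmetic, with no downscaling and no further hydrologic model runs.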
So just how well does the sensitivity approach measure up to other methods? The resulting projected monthly hydrographs are strikingly similar to those generated using the “full-simulation” approach in which Global Climate Model data is first downscaled and then run through a hydrological model.
These results suggest that Vano’s method can capture the projected seasonality shift quickly (not to mention inexpensively) compared to the full-simulation method, but there are a couple of caveats. The approach only works so long as the runoff sensitivity to small changes (e.g., 0.1°C or nearly 0.2°F) is the same as the sensitivity to large changes (e.g., 3°C or 5.4°F), and so long as changes in individual months or seasons add up to the projected annual changes. It’s no surprise, then, that the sensitivity approach works best when considering near-term change (roughly 30 years) and in locations with small sensitivities. What’s more, while the approach quickly provides insights into the nature of average changes in the hydrograph, it does not provide daily flow sequences as the full-simulation approach does. (Subsequent work by Vano and other OCCRI and CIRC colleagues, yet to be published, explored sensitivities to large temperature changes.) 
Editorial Note: Water managers work with the timing of streamflow to both reduce flood risks and store water for future use. Future shifts in streamflow timing could force managers to rethink their current operating rules. Shifts in timing and runoff are also the main concern for the parties involved in the Columbia River Treaty 2014/2024 Review. This research helps to identify locations in the Columbia River Basin most sensitive to climate changes.
Vano, J. A., B. Nijssen, and D. P. Lettenmaier (2015), Seasonal hydrologic responses to climate change in the Pacific Northwest, Water Resour. Res., 51, 1959–1976. doi:10.1002/2014WR01590

The upper part of the March 22, 2014, landslide in northwest Washington as it appeared on March 27 (Jonathan Godt, USGS)

Precipitation and the Oso Landslide
Setting the Climatic Stage for the Oso Landslide

Last year’s massive landslide near the town of Oso, Washington, resulted in the loss of 43 lives, making it the deadliest such tragedy in the United States in decades. Now a new study is helping unravel the unusual climatic events leading up to the March 22, 2014, disaster.

The study, published in the American Meteorological Society’s Journal of Hydrometeorology, is the work of University of Washington’s Brian Henn and colleagues, including former CIRC researcher Dennis Lettenmaier. The researchers’ goal was not to investigate the mechanism (or mechanisms) that triggered the landslide. Instead Henn and his colleagues’ observations and analyses set the climatic stage, so to speak, for the events that ultimately brought down the hillside.

Setting the Stage: The 2014 water year (October 1, 2013 to September 30, 2014) began with an unseasonably dry spell. Then the pattern abruptly changed, and precipitation arrived with a vengeance. From mid-February to late March 2014, rain and snow were so heavy they nearly made up for what hadn’t arrived in the fall. The result was an anomalously wet late winter.

That’s the stage. Enter the loaded weapon: a mass of soil—16 billion kilograms (18 million tons), or roughly three times the mass of the Great Pyramid of Giza—made of glacial till and sand, precariously perched atop the 190-meter-tall (623-foot) bluff overlooking the Washington community.

The heavy rains in the late winter super-saturated the soil. Somewhat like a sponge, the more water the soil soaked up, the heavier it got, until the entire mass was sent sliding. (The exact trigger—if there was a single trigger—of the slide is still largely unknown.)

Henn and colleagues’ analysis looked at soil moisture and precipitation accumulations as the possible contributors to the slide. Using the Variable Infiltration Capacity Macroscale Hydrologic Model to reconstruct conditions at the time of the landslide, the researchers calculated that the local soil moisture was unseasonably high due to the heavy precipitation in the weeks leading up to the landslide. In fact, six days before the slide, the bluff’s soil moisture was calculated as having a 43-year return period (meaning soil moisture that high has roughly a 1-in-43, or about 2.3 percent, chance of occurring in any given water year). Precipitation accumulations ending on the day of the landslide were also—not surprisingly—exceptionally high, racking up return periods as long as 88 years for 3-week accumulations.

Note On Research Methods: The researchers reached their conclusions by gathering data from several precipitation gauges located near the site of the landslide—two primary gauges located 3 km (about 2 miles) and 18 km (11 miles) away, and nine other stations located within 80 km (50 miles). The data were broken down into 28 different accumulation windows ranging from 1 day to 10 years, all ending on the date of the landslide. The historical accumulations for each window were fitted with a probability distribution, and the fitted distributions were then used to estimate how probable the 2014 accumulations leading up to the landslide were. While the previous few days were wet, they weren’t exceptionally so; it was the precipitation totals in the 1 to 6 weeks before the slide that were exceptionally high compared to similar time periods. Soil moisture reconstructions were taken from the University of Washington’s Drought Monitoring System for the Pacific Northwest, an effort supported by CIRC.
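The fit-then-estimate step can be sketched as follows. The gauge record below is randomly generated stand-in data, and the Gumbel distribution is just one common choice for annual-maximum precipitation; the paper’s actual distributions and fitting procedure may differ.

```python
import numpy as np
from scipy import stats

# Hypothetical 60-year record of annual-maximum 3-week precipitation (mm);
# random stand-in data, not actual gauge measurements.
rng = np.random.default_rng(42)
annual_max_3wk = rng.gamma(shape=8.0, scale=25.0, size=60)

# Fit a probability distribution to the historical accumulations.
# Gumbel is a common choice for annual maxima (assumed here).
loc, scale = stats.gumbel_r.fit(annual_max_3wk)

# Estimate the exceedance probability of a hypothetical large accumulation,
# here taken as the 90th percentile of the record.
observed = np.percentile(annual_max_3wk, 90)
p_exceed = stats.gumbel_r.sf(observed, loc=loc, scale=scale)
return_period = 1.0 / p_exceed  # average years between events this large
print(f"Estimated return period: ~{return_period:.0f} years")
```

Repeating this for each of the 28 accumulation windows, as the authors did, yields one return-period estimate per window, which is how a single storm can rack up an 88-year return period at one time scale while being unexceptional at another.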
Henn, B., Q. Cao, D.P. Lettenmaier, C.S. Magirl, C. Mass, J.B. Bower, M. St. Laurent, Y. Mao, and S. Perica (2015) Hydroclimatic Conditions Preceding the March 2014 Oso Landslide. J. Hydrometeor, 16, 1243–1249.

A spider unit being deployed by helicopter to help study the landslide that occurred in northwest Washington on March 22, 2014. This photo was taken on April 1, 2014. (Jonathan Godt, USGS)

The Climate CIRCulator is brought to you by The Pacific Northwest Climate Impacts Research Consortium (CIRC). CIRC delivers science, information, and tools to decision makers responsible for the management of resources and services in a changing climate. Our team consists of experts from Oregon State University, the University of Oregon, the University of Idaho, and the University of Washington. CIRC is funded by the National Oceanic and Atmospheric Administration (NOAA) and housed in the Oregon Climate Change Research Institute (OCCRI) at Oregon State University. The OCCRI brochure can be downloaded here.

The Climate CIRCulator, July/August, 2015, Issue 7.
Copyright © 2012
The PNW Climate Impacts Research Consortium.
All rights reserved.