Frontloading the science in anticipation of environmental disasters

Dr. Usha Varanasi

A recent paper by Center for Urban Waters Distinguished Scholar in Residence Usha Varanasi discusses the decline in America’s baseline ability to use science to plan for and assess highly likely environmental disasters, such as oil spills.

This article first appeared in the May 2012 issue of the journal Fisheries, published by the American Fisheries Society. It is reprinted with permission of Fisheries and the author. Dr. Varanasi is an adviser to the Puget Sound Institute.

By Usha Varanasi

It is a given that every 3–5 years our coasts and oceans will experience an environmental crisis of epic proportions. In responding to these crises (specifically, major oil spills) we frantically gather thousands of samples from impacted areas as oil mixes with water, moves, and degrades. Under these dynamic conditions, we try to analyze as many of these samples as possible to generate hundreds of data reports to calm a concerned public and occasionally develop new methods on the fly (Krahn et al. 1988; Varanasi 1989; Field et al. 1999). While attempting to advance our knowledge under daunting odds, we try to publish reports on lessons learned and identify gaps in knowledge that must be filled to avoid or better manage the next catastrophe (Field et al. 1999). This is not the ideal situation in which to develop a thoughtful, strategic, and comprehensive knowledge base. In addition, the potential for litigation can hinder progress and publication of new research funded by the litigating parties. Once the fervor and frenzy over the catastrophe subsides and the crisis moves off the center stage, funding and momentum are cut short and our institutional memories fade.

As a scientist and former science director¹ who worked under these conditions over several decades, I have wondered often whether this is unavoidable. Perhaps it is naïveté on my part, but we ought to get better at responding to each subsequent disaster by continually expanding our knowledge base. Unfortunately, this is not always the case.

I believe that we must stop thinking about doing good and relevant science “if and when” disasters happen. They are inevitable, and we must ensure that robust scientific inquiry is
conducted and financially supported between disasters when there are opportunities to expand our scientific and technological capacity while improving communication tools so that we are better prepared before the next big one hits us rather than scrambling and often “reinventing the wheel.”

In addition to the lack of consistent funding for frontloading of science, we found that during environmental catastrophes ranging from the Exxon Valdez oil spill in Alaska to more recent events such as Hurricane Katrina and the Deepwater Horizon oil spill in the Gulf of Mexico, there was no standardized and robust baseline information available about the state of the environment where the spill or storm might cause serious damage (Dickhoff et al. 2007; Hom et al. 2008). In the past this lack of baseline (prestorm or prespill) data on levels of chemical contaminants in seafood from the affected regions severely hampered our ability to determine what additional impact was caused by the storm or spill. In the past 30 years there have been attempts to mount systematic nationwide monitoring of fish and shellfish to determine baseline levels of chemical contaminants, such as polycyclic aromatic compounds derived from fossil fuels, and associated diseases. But most of these programs have not been sustained or face severe budget shortfalls. It is clear that we need a systematic effort to collect and report such information in a user-friendly manner, and we need a dramatically different approach to generate revenue and strategies for these programs.

To provide consistent funding for these scientific activities, we need to think of a different model than the usual federal or state funding sources, which currently are being severely cut. Revenue generation to provide a comprehensive and targeted scientific underpinning, so necessary for thoughtful management of our waters, needs to be tied to exploration and extraction of resources from the very same waters. Considerable investment is made by the industry to develop improved extraction technologies and remediation methods. The research is usually conducted under the umbrella of proprietary work. This is understandable. Nonetheless, each time a permit is granted for drilling for oil in public waters, significant moneys (fees) should be contributed upfront to a national trust fund that could be independently managed by a public/private coalition.² The funds should be made available to scientists from all sectors to investigate potential impacts of an oil spill on human health and safety as well as impacts on ocean life before a crisis happens. Scenario building and development of a long-term strategy of remediation (if the disaster does happen) should be studied and debated in the open, and relevant, new methods should be developed and validated beforehand so they can be standardized and used confidently to measure contamination of seafood and to assess biological effects of toxic contaminants from a spill.

Targeted baseline monitoring should be conducted to determine levels of contaminants in waters, sediment, and organisms from the site where drilling is proposed, along the transport routes of the oil via pipeline and tankers, and in the region where trajectories of potential spills are predicted should a disaster occur. Such an approach to ensure that science is planned and conducted in anticipation of the inevitable next crisis would enable us to better protect the public and marine life by making environmental decisions wisely.

My experience with major environmental catastrophes such as the Exxon Valdez and Deepwater Horizon oil spills has shown that despite the best efforts of dedicated scientists
and agency staff, serious decisions had to be made with insufficient scientific information while the nation responded to deep concerns over human safety, seafood contamination, damage to marine life, and economic losses. For example, hot water cleaning was used on beaches during the Exxon Valdez spill to remove visible oil slicks from the rocks, which actually further damaged fragile marine ecosystems and pushed oil beneath the surface where degradation was considerably slower and the oil still persists after two decades.

During the Deepwater Horizon incident, massive amounts of dispersant were used to break up the oil mass in an attempt to protect shorelines and the public. However, to the best of my knowledge, scientists in the public sector (specifically NOAA scientists) did not have timely information about the composition of the dispersant used and there were insufficient scientific methods or knowledge to measure uptake of dispersant components
or the effects of emulsified oil on marine life. How such a large-scale use of dispersant (1.8 million gallons) would affect the ecosystem of this region was unknown. Consequently,
methods to measure dispersant components had to be developed under crisis conditions, and very few studies could be conducted during this dynamic situation to assess the impact of dispersant and oil emulsions on marine life, especially on the early developmental stages known to be sensitive to low levels of toxicants. It should be common wisdom that during any environmental crisis, we should apply standardized and established methods and known processes rather than frantically developing new technology and approaches, because time will be of the essence and validated methods will generate confidence.

It may be said that each crisis is different and we can never be truly prepared. Nonetheless, my experience in responding to environmental catastrophes has shown that when science was frontloaded—that is, when we were equipped with validated, standardized methods and protocols—we were better able to apply this knowledge successfully to aid agencies and communities affected by the oil spill. For example, in the mid-1980s we had developed considerable knowledge about the uptake, metabolism, and effects of toxic hydrocarbons in fish and invertebrates in marine waters (Varanasi 1989). Our research showed that polycyclic aromatic compounds associated with fossil fuels do not accumulate in vertebrates (e.g., fish, marine mammals) because of their efficient metabolism in the liver and rapid excretion of hydrocarbon by-products (metabolites) in bile. This scientific knowledge led to the development of rapid screening methods to detect hydrocarbon exposure (Krahn et al. 1988). Our research also showed that, in contrast to fish and mammals, invertebrates (e.g., shellfish, molluscs) tend to bioaccumulate toxic hydrocarbons due to inefficient metabolism of hydrocarbons.
This scientific knowledge and established methods to measure polycyclic aromatic hydrocarbons and metabolites were extensively used during the Exxon Valdez oil spill in Alaska to provide timely information to the affected community about the degree of contamination of the seafood (Field et al. 1999). Availability of rapid screening methods allowed scientists to analyze large numbers of seafood samples and communicate
their results broadly and with confidence. Though the cultural differences, and the severe distress experienced by communities facing the risk that spilled oil posed to their harvests, presented a grave challenge that should not be minimized, scientists working on seafood safety during this crisis had a platform from which to assist. This was possible, at least for my team, because National Oceanic and Atmospheric Administration (NOAA) managers in the field communicated well with state agencies and worked directly with affected communities (Field et al. 1999). Regrettably, funding and support for such scientific inquiry slowly declined, so that we were not able to expand the knowledge base to determine uptake, metabolism, and excretion of toxic compounds when emulsified with various dispersants before new oil spills, notably the Deepwater Horizon incident.

Conducting strategic and comprehensive scientific inquiry, including hypothesis testing, is not possible during an intense crisis, because scientists are often faced with having to answer fragmented “questions of the day,” and answers are often managed as a public relations issue by diverse parties. Much valuable time is lost for the scientists who have the knowledge, the broad understanding of the strengths and limitations of the methods, and the expertise to best interpret and communicate the results.

Though knowledge gained from an earlier spill is not always directly transferable, owing to differences in the physical properties of the oil, the location and causes of the disaster, and the cultural differences of affected communities, having established methods and processes adapted from earlier spills, along with new knowledge developed and validated between crises, can be extremely valuable. Hence, scientists' time will be better used applying well-established methods and communicating scientific results clearly to a wide range of audiences in an independent, accurate, reliable, and consistent manner that rebuilds confidence in seafood advisories and avoids hasty assessment of ecological damage in the politically charged and media-intensive atmosphere that follows an environmental crisis.

In conclusion,

  • Our responses ought to improve with each new disaster, but they rarely do, despite progress in science, technology, and communications.
  • Sociopolitical and financial factors, especially when the disaster is caused by humans, hamper frank and honest dialogue and open investigations. It is imperative that scientists are objective, thoughtful, independent, and effective spokespersons.
  • When science is not frontloaded, conducting good and objective science during a crisis, being responsive to public fear and distress, and clearly communicating scientific findings in response to environmental disasters become serious challenges.
  • It is time to try a different model for funding scientific underpinning that includes targeted baseline information, standardized and validated methods, and research
    and development to improve opportunity for new and innovative techniques and approaches. (Note: Establishment of this trust fund to frontload science and strategies does not replace the funds that are allocated or adjudicated for mounting an environmental response and natural resource damage assessments when a disaster happens.)
  • Lastly, there should be consistency in regulatory criteria on allowable limits for consumption of contaminated seafood so there is no confusion in the public’s mind
    with regard to what is safe to consume. A reservoir of authoritative and objective scientific information needs to be created that is accessible to all interested parties
    and the public at large.

Such a frontloading of science will provide strong factual underpinning before making decisions that could forever alter our coastlines and ocean life.

Footnotes

  1. Varanasi was the Science and Research Director of NOAA’s Northwest Fisheries Science Center in Seattle, WA.
  2. I have deliberately not discussed my thoughts in this article about the specifics on how this trust fund should/could be managed, etc., because I did not want to lose focus from the concept that we need to establish such a fund.

Acknowledgments

I sincerely thank the following colleagues who kindly read and commented on this article: Denis Hayes, Bullitt Foundation; William H. Rodgers Jr., University of Washington Law School; and Tom Hom and John E. Stein, Northwest Fisheries Science Center, NOAA.

References

Dickhoff, W.W., T.K. Collier, and U. Varanasi. 2007. The seafood “dilemma”—a way forward. Fisheries 32(5):244–246.

Field, L.J., J.A. Fall, T.S. Nighswander, N. Peacock, and U. Varanasi, editors. 1999. Evaluating and communicating subsistence seafood safety in a cross-cultural context: lessons learned from the Exxon Valdez oil spill. Society of Environmental Toxicology and Chemistry, Pensacola, Florida.

Hom, T., T.K. Collier, M.M. Krahn, M.S. Strom, G.M. Ylitalo, W.B. Nilsson, R.N. Paranjype, and U. Varanasi. 2008. Assessing seafood safety in the aftermath of Hurricane Katrina. American Fisheries Society Symposium 64:73–93.

Krahn, M.M., C.A. Wigren, R.W. Pearce, L.K. Moore, R.G. Bogar, W.D. MacLeod Jr., S. Chan, and D.W. Brown. 1988. Standard analytical procedures of the NOAA National Analytical Facility, 1988: new HPLC cleanup and revised extraction procedures for organic contaminants. U.S. Department of Commerce, NOAA Technical Memo NMFS F/NWC-153, Seattle, WA.

Varanasi, U., editor. 1989. Metabolism of polycyclic aromatic hydrocarbons in the aquatic environment. CRC Press, Boca Raton, Florida.
