Build a bridge out of ‘er

12 03 2011

Apologies to Monty Python for my poor attempt to make the over-used expression ‘bridging the gap’ humorous.

Today’s guest post comes from across the Pacific Ocean. Dr. Sara Maxwell is a postdoctoral fellow with the Marine Conservation Biology Institute and the University of California Santa Cruz Long Marine Laboratory. She was kind enough to contribute to ConservationBytes.com on an issue I’ve covered before in various forms – making conservation research relevant for conservation action.

© R. Arlettaz

In a catalyzing article titled “From publications to public actions: when conservation biologists bridge the gap between research and implementation” in the November 2010 issue of BioScience, Raphaël Arlettaz1 and his colleagues Michael Schaub2, Jérôme Fournier3, Thomas Reichlin2, Antoine Sierro4, James Watson5 and Veronika Braunisch2 explore the reasons why our hard work as conservation biologists does not reach the implementation phase. This article strongly resonated with my colleague Kiki Jenkins6 and me, Sara Maxwell. It prompted a series of letters published in BioScience, and now we join together, along with Jeffrey Camm7, Guillaume Chapron8, Liana Joseph9 and Rudi Suchant10, to synthesize our ideas and present them to the larger conservation community via ConservationBytes.

The article that sparked the discussion

In their article, Arlettaz and colleagues highlight some of the main roadblocks to implementing conservation research. The main reasons are that:

  1. The research done by conservation biologists does not lend itself well to implementation, i.e., as a community we often focus on the wrong questions or address them in ways that do not lead to practical applications for practitioners;
  2. The outcomes of conservation biologists’ research tend not to reach practitioners and so fail to be put into action;
  3. When we successfully align and collaborate with practitioners, there is a lack of economic or political support to make the changes that need to happen; or
  4. Conservation biologists do not commit to engaging themselves in the implementation of their recommendations due to a lack of reward structure for this behaviour and the conflicting roles of academia and conservation.

Arlettaz and colleagues illustrate how to overcome these roadblocks using a case study of their own work on the endangered hoopoe (Upupa epops) in Switzerland, showing how they followed through the recommendations of their work to implementation and had a direct impact on species recovery. They highlight means by which other conservation biologists can do the same.

Their approach – and the outcomes – should resonate strongly with all conservation biologists, not just for its reminders of our ‘wins’, but also of the times we feel we have failed. Who among us hasn’t poured what feels like heart and soul into a project we believe in, only to have the outputs of our work stymied by a government agency that puts politics above science? Who hasn’t felt the sting of knowing that while our careers have advanced, the work we have published – for all of its potential impact – will likely go no farther than the pages of a journal? This is because we lack the time and resources to go beyond the printed page, and because the academic system is not structured to allow – or reward – such efforts. And this is despite the fact that it was the promise of action stemming from our research that inspired us to enter conservation biology in the first place.

A response

Despite these roadblocks, like Arlettaz and colleagues, Jenkins and Maxwell are attempting careers in which they both engage in academic research and inspire effective change – and they look forward with hope that the academic system will change its system of rewards, and that they can help to drive that change.

In their letter, they identified a bottom-up approach to influencing the system of rewards within academia so that it reflects research impacts and not just publications. While they believe that peer-reviewed research must absolutely be maintained, they agree with Arlettaz and colleagues that it is only one dimension of effective conservation science. Just as it can be more expedient and effective for conservation scientists to conduct conservation actions themselves, Jenkins and Maxwell believe that conservation scientists should also begin grassroots change that brings implementation and impacts to the academic forefront, especially in innovative departments or progressive institutions. This might involve: encouraging search and promotion committees to ask for statements of implementation, suggesting academic managers give rewards for implementation or research impacts, or simply including a section on research impacts in your own CV or annual activities reports. A rewards system does not need to be established by new rules; all that is required is a common currency. As more individuals start trading this currency, the broader its recognition and acceptance throughout academia.

The reply

In Arlettaz and colleagues’ reply, however, they doubted that this would be sufficient to overcome the immense research-implementation divide prevailing in biodiversity conservation, which partly stems from the practices currently ruling research institutions. They put forth a system of structured organizational change that might be more appropriate in those instances. The reward system for conservation scientists based in academia is heavily focused on publications in peer-reviewed journals, all of which favour demonstrated novelty and scientific progress.

Along with medicine, conservation biology differs from other disciplines among the life sciences in that it is mission-driven. The consequential trade-off that conservation scientists face when ensuring that their scientific evidence is employed by policy-makers and conservation practitioners is ignored by almost all research institutions in assessing academics for employment, promotion or judging suitability for awarding grants. Until aspects around career evaluation are changed systematically, those conservation scientists who dare to move their ideas into the policy or management world will receive little support or recognition from their institutions.

They suggest that one idea for overcoming this is to develop a system of accreditation that rewards the full spectrum of roles that conservation biologists play. This is not totally novel: engineers have an accreditation system based on patents, and in the UK the government will use such a system to judge the merit of science (Gilbert 2010). They further suggest rules broadly recognised by academia, including indices for biodiversity conservation impact similar to the traditional metrics estimating publication output (i.e., journal impact factor, citation counts and h-indices).
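To make the analogy concrete, here is a minimal sketch of how the traditional h-index – one of the publication metrics the authors propose to mirror – is computed. The proposed conservation-impact analogue would, hypothetically, substitute documented management actions influenced by each paper for citation counts; that substitution is our illustration, not a metric defined by the authors.

```python
def h_index(citations):
    """Classic h-index: the largest h such that the author has
    at least h papers, each cited at least h times.

    A hypothetical conservation-impact analogue could pass counts of
    documented conservation actions per paper instead of citations.
    """
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this paper still clears the threshold
            h = rank
        else:
            break  # papers are sorted, so no later one can qualify
    return h

# Five papers cited 10, 8, 5, 4 and 3 times give an h-index of 4:
# four papers each have at least 4 citations.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The point of the sketch is that such indices are trivially computable once the underlying counts exist; the hard institutional problem the authors raise is agreeing on what to count.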

In fact, such a tool is in development. In recent correspondence in Nature, Thomas Niederkrotenthaler, Thomas Dorner and Manfred Maier of the Medical University of Vienna, Austria, describe what they call the ‘Societal Impact Tool’. The goal of the tool is to evaluate research based on factors such as ‘the aim of the published investigation; the extent to which the authors attempt to translate their scientific findings into societal action; and the level, status and target group of this translation.’ Their tool is simple and straightforward, and could easily be adopted by both academic departments and journals.

Arlettaz and colleagues also suggested a second idea by which journals can help to strengthen the links between good science and better practice by rewarding science that is relevant and useful. This can be done by way of new scientific journals, or sections in existing conservation journals, that preferentially publish results that are not simply novel and scientifically rigorous but are proven to be useful for conservation in practice. The authors would provide a letter of support from practitioners demonstrating that the work is of practical importance*, similar to the traditional approach of engineers for progressing relevant operations research (e.g., the journal Interfaces). Journals may also systematically ask practitioners to act as reviewers to judge the applicability of results.

The Journal of Applied Ecology recently launched one such initiative where they are giving a voice to practitioners to help span the divide between implementation and science (Hulme 2011). These concepts would tighten the collaboration between conservation scientists and practitioners, optimally from the very start of a research project as shrewdly suggested by Jenkins and Maxwell, and promote new implementation pathways where there is no ‘established implementation process’ by which to ‘escort recommendations’.

In summary…

At the end of the day, however, it is up to us as conservation biologists to make our own impact. It is up to us to prioritise, strategise and streamline our efforts to have the most influence, while following through on the tasks that allow us the opportunity to continue our work within the academic system where we have ultimately chosen to be. This means that we must put implementation at the forefront of our research planning – before we even know what our recommendations are – by beginning early to build trusting relationships with implementers, managers and/or local NGOs working on the ground.

This will additionally allow us to garner valuable insights into how to make our work most effective for them. We must also build implementation into our research budgets. This may involve anticipating the time or travel costs associated with ‘actively escorting’ our recommendations through available avenues, or budgeting for professional translation of work – whether it be into another language or for the general public – to reach our target audience. The key is that this ‘establishes implementation as more than an afterthought that is conducted on piecemealed time and funds, but instead gives this important piece of conservation science a prominent and tangible place in research design and funding.’

Reader, what do you think?

Through this series of articles, we’ve suggested six ways to improve the use of good science to guide better conservation:

  • Using practical case studies to demonstrate the best science-based approaches to conservation;
  • Search and promotion committees requesting statements of implementation;
  • Academic managers rewarding staff for implementation or research impacts;
  • Stating research impacts on CVs or annual activities reports;
  • A new accreditation system to reward science that influences practice; and
  • Journals, or sections of journals, that require demonstrated proof of impact.

This brings us to you, the ConservationBytes.com reader. What do you think? How can we as conservation biologists be more effective? How can we change the system? Do you know of examples where the system is already changing, either successfully or unsuccessfully? Should it even be changed? We look forward to your thoughts and comments, and thank Corey Bradshaw for putting forth this blog and excellent forum to move our field forward**.

*CJAB: We tried this in Conservation Letters, but it proved nearly impossible to implement. I personally do not think this is tractable.

**CJAB: Why, thank you – and, you are all welcome ;-)

  1. Raphaël Arlettaz (raphael.arlettaz@iee.unibe.ch) is a professor at the University of Bern, Switzerland, where he leads the Institute of Ecology and Evolution and the Department of Conservation Biology, and is director of the Valais Field Station of the Swiss Ornithological Institute.
  2. Veronika Braunisch, Thomas S. Reichlin, and Michael Schaub are postdoctoral research fellows at the Department of Conservation Biology, at the University of Bern. Veronika Braunisch also works as a research fellow at the Forest Research Institute of Baden-Wuerttemberg, in Germany together with Rudi Suchant, who leads the Wildlife Ecology group at this institute. Michael Schaub is also a research group leader at the Swiss Ornithological Institute.
  3. Jérôme Fournier is with an environment consultancy company.
  4. Antoine Sierro works at the Valais Field Station of the Swiss Ornithological Institute.
  5. James E. M. Watson is a postdoctoral research fellow at The Ecology Centre, at the University of Queensland in Brisbane, Australia.
  6. Kiki Jenkins is a David H. Smith Conservation Fellow at the University of Washington School of Marine Affairs.
  7. Jeffrey Camm is Professor of Quantitative Analysis, CoB Research Fellow and Head of the Department of Quantitative Analysis and Operations Management at the University of Cincinnati.
  8. Guillaume Chapron is assistant professor at the Grimsö Wildlife Research Station, part of the Swedish University of Agricultural Sciences.
  9. Liana Joseph is a David H. Smith Conservation Fellow at Wildlife Conservation Society in New York.
  10. Rudi Suchant is a research fellow at Forest Research Institute of Baden-Wuerttemberg in Germany.


7 responses

18 03 2011
Alex Diment

Here’s a list of key readings that I put together for a session on this topic during the SCB meeting in Beijing 2009.
Hope it is useful to you all.
Any additions or updates welcome.

http://alex.diment.org/IC/Improving_Conservation_Publishing.html

17 03 2011
Simone Vincenzi

Kimberly, I don’t follow the logical passage from here:

It seems that a key point often overlooked in this discussion is the value of hypothesis-driven research. From a scientific perspective, ‘good’ research should answer a fundamental question about the mechanisms operating within a particular study system.

to here:

Observational studies may be what’s needed to address a particular conservation issue, but they generally don’t fulfill our scholarly objectives.

One of the big problems I see historically in ecology is the (past, but still in vogue nonetheless) reductionist approach and an excess of wrong models (which theoretically work in a vacuum with ideal and not existent species, at least in this world).

But I think that you intended as observational studies something more like poorly controlled, anecdote-based studies.

17 03 2011
Kimberly Terrell

It seems that a key point often overlooked in this discussion is the value of hypothesis-driven research. From a scientific perspective, ‘good’ research should answer a fundamental question about the mechanisms operating within a particular study system. Observational studies may be what’s needed to address a particular conservation issue, but they generally don’t fulfill our scholarly objectives. I contend that striving to understand the natural world is as noble an ambition as the effort to preserve it, and, in fact, the two goals are complementary. Knowledge of the intricate processes that sustain organisms and ecosystems will inspire conservation efforts, whereas a biodiverse planet will provide endless opportunities for scientific discovery. The challenge for conservation scientists lies in the ability to become a respected and influential member of both communities – the scholarly and the applied.

16 03 2011
Simone Vincenzi

I theoretically agree that conservation ecology, being one of the applied disciplines of ecology, should present (in journals) works with a real and effective applicable conservation component, but it will be very hard to change the usual and normal directions followed by Editors (i.e., largely favouring more theoretical works, even in conservation, sometimes looking for greater generalities). There is a strong tendency, especially in big journals and I could cite many examples, to prefer theoretical and never-in-this-world applicable works.
I’m close to submitting a paper where I describe the (very) applicable potential of the quantitative results of a long-term conservation experiment. Shooting high or shooting low, this is the question.

16 03 2011
bryan wallace

While folks in academia might have issues ‘building bridges’ between research and implementation, many of us in the conservation NGO community live at this interface. So it’s important to emphasize that the debate here is from an academia-centric point of view, and figuring out ways to implement research is not a universal challenge to conservation science.

With that in mind, however, I think that this dichotomy between conservation science generated in academia and that generated and/or implemented by NGOs (and, of course, entities/agencies charged with natural resource management) is healthy, as long as effective cross-talk and strategic partnerships are cultivated. Academics have the freedom to be creative in research pursuits, which is important to be sure we’re thinking about ‘the next big issue in conservation,’ whereas research conducted by on-the-ground folks and conservation NGOs is necessarily and pragmatically constrained to what is needed for resource management and to meet applied conservation goals.

In sum, these worlds are quite complementary, but only if folks in each are willing to reach out to folks in the other.

15 03 2011
Matt Hayward

I agree entirely with the disparity between what’s published in the conservation literature and what information is needed by managers. The conservation NGO I work for has opted to write its own research proposals to send to academics around Australia in order to ensure the research occurring on our sanctuaries answers management problems we are having. To now it has been a top-down process with academics effectively dictating what is important to research and I think this has led to a whole range of conservation fads that are attractive to journals but irrelevant to practitioners.

I think Australia suffers from this problem more than other developed nations. Why? Our academics are competing with European and American scientists to publish in the top-ranked journals and so cannot afford to publish the simple autecological/natural history studies that we ultimately need to get synthesis or predictive ability. However, these older continents benefited from a long period of natural history investigation before Australia was discovered (and we’ve largely lost the traditional knowledge that would have solved this problem). The government conservation agencies are now the organisations who research the simple but important natural history aspects of our biodiversity; however, these often place little emphasis on publishing. Even Australia’s conservation-bent journals are making it harder to publish the autecological studies we need as they move to a more international focus (Wildlife Research and Austral Ecology leap to mind).

Perhaps only time will tell whether our strategy will breach the implementation gap.

14 03 2011
MJ

Thanks for the excellent post. I am a practitioner who doesn’t have time to read lots of journals so hadn’t come across that article in BioScience. (Tho I did note an editorial in Oryx along similar lines last year.) I think you have answered half of the question very well, and my only specific query is why did you not elevate your suggestion that journals should get their submissions reviewed by more practitioners as one of your main conclusions. This would probably be the easiest thing to do: even better, they should be more fully represented on editorial boards. And by practitioners I mean on-the-ground folks, not heads of global NGOs.

But, I think, what your post hasn’t really fully addressed is the choice of research topic. Conservation is about so much more than biology, especially in the tropics. Very rarely is the biggest problem lack of biological knowledge. People who care passionately about tropical conservation should be seeking much wider skill sets, and considering problems more holistically (see http://bottomupthinking.wordpress.com/2011/02/09/jack-of-all-trades-master-of-one/ for more of my thoughts on this). Conservation researchers therefore need to look outside their narrow fields of expertise if they are going to identify the most useful research topics and thus have a greater impact on the ground. Moving conservation groups out of the biological sciences faculty and into some more cross-disciplinary department might be a good start.

MJ

ps. I’m sure this must annoy the **** out of academics too, so why oh why can journals not set up a system which does not require us to pre-format a paper to their exact requirements before sending for initial consideration??? That’s a lot of wasted effort just for a myopic editor (presumably they’re all myopic, except for those which accept one’s papers!) to reject your submission as not of interest to their readers.
