Wednesday, 2 March 2016

Systematic reviews - valuable evidence-based tool, or just rubbish?

I'm delighted to revive this blog from the depths of apathy and inactivity (and promise to do better from now on) with a guest post from my colleague, Avtar Natt.  Avtar loves data.  Analysing it, cross-referencing it, and (especially) making scary-looking bell curves and Excel spreadsheets with it. He has applied his laser-like focus to the world of systematic reviews.  Here's his take on them (we would love it if any readers out there would like to write a guest response post):

 __________
I don’t have a background in health, so my exposure to systematic reviews is limited. I do, however, have experience from my time as an information scientist in using machines to find things. I did it enough times and under enough pressure that I got pretty good at it. It is with this background that I have come to find systematic reviews a ‘thing’ of interest in my new guise as an Academic Liaison Librarian.


Those familiar with information science or the science of science will know of a chap called Derek de Solla Price, who is regarded as the father of Scientometrics. He wrote a book in 1963 called Little Science, Big Science, which I am pretty sure anyone with an interest in Scientometrics will have cited at some point. Within the book, Price proposed the idea of the exponential growth of science: every 15 years or so, all of science (as measured by scientific papers) doubles. This sort of growth is non-linear and more like the growth pattern of a population. The distinctive part of Price’s hypothesis is that science, or more specifically a scientific field, will eventually saturate. Visually, the exponential growth of science slows down and the curve becomes more like a logistic curve (see here).
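To make the shape of that claim concrete, here is a minimal sketch of the arithmetic. The 15-year doubling period is Price's figure; the starting count of 1,000 papers and the saturation ceiling of 100,000 are invented purely for illustration:

```python
import math

DOUBLING_PERIOD = 15                       # years for the literature to double (Price's figure)
GROWTH_RATE = math.log(2) / DOUBLING_PERIOD

def exponential(n0, years):
    """Pure exponential growth: the literature doubles every DOUBLING_PERIOD years."""
    return n0 * math.exp(GROWTH_RATE * years)

def logistic(n0, years, ceiling):
    """Logistic growth: starts off exponential, then flattens as it approaches the ceiling."""
    return ceiling / (1 + ((ceiling - n0) / n0) * math.exp(-GROWTH_RATE * years))

# Illustrative numbers only: 1,000 papers at year 0, an assumed ceiling of 100,000.
for t in range(0, 151, 30):
    print(f"year {t:3d}:  exponential {exponential(1000, t):>12,.0f}   "
          f"logistic {logistic(1000, t, 100_000):>10,.0f}")
```

Both curves look identical at first; the logistic one then slows down and levels off near the ceiling, which is the saturation Price had in mind.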

People with my background will acknowledge that Price isn’t Nostradamus and that in the 1960s paradigms like sociological functionalism were more in vogue. The interesting thing is how, 50 years later, this common-sense picture of an idea slowly gaining momentum, growing rapidly and then finally slowing down still fits scholarly communication and the behaviour of scientists. After all, moving on to new things is a key part of scientific progress. It’s all these thoughts that juggle around in my head when systematic reviews come to mind.

When I add in other bits of information I have read and what I classify as tacit knowledge, I have several thoughts about systematic reviews that I would like to share:

1. Classificatory Language – When the words systematic and review are put together, the phrase seems to take on a life of its own. A lot of papers and student dissertations have systematic review in the title, but do they all mean the same thing? Scientists (and the students they end up supervising) often produce papers that sit somewhere between acting systematically and a full-blown Cochrane Review. Perhaps a future solution could be some sort of double compound noun, where a word placed before “systematic review” is used to classify and differentiate.

2. Fudging but not cheating – You already know the data you want or the papers you want to use; you want it to look scientific and be persuasive, but you do not want to be overwhelmed. On top of all of this, you want to swat away anyone accusing you (or your data) of bias. These are the processes and thoughts that run through a systematic reviewer’s mind when they fudge their data or retrieval.

3. Publish or Perish – This is where I start sympathising with scientists. They typically have enough on their plate in a modern university, so forming a research team or Matthew Effecting their students into co-authoring papers must feel like light at the end of the tunnel. At least they can fulfil work commitments as well as be research active.

4. Impact & Open Access – There are a few posts on the LSE Impact Blog that advocate systematic reviews and the whole shtick of publishing shorter, better, faster and free. Open access and PLOS One instantly come to mind, and at their core the proliferation and emancipation of journals are a good thing. The unintended consequence, however, is that at a time of ‘impact’, when scientists range from Baby Boomers to Millennials, new scholarly norms could be forming that incorporate systematic reviews and massively inflate their presence.

5. No slowing down – This ties in with the point above and the exponential growth hypothesis. I did a search on Web of Science for papers with “systematic review*” in the title. From 1970 to 2014 I found just over 43,000 papers, with 50% of them coming from the period 2012–2014 (a toy version of this tally is sketched after the list below). I return again to publications like PLOS One, but also to the suggestion that at this point in time there is no slowing down of systematic review papers. Students become scientists.  Students (in health especially) get acquainted with systematic review papers. The students that have become scientists then have students that need to get acquainted with systematic review papers. And on and on we go…

6. Coping Methodology – To avoid being accused of ignorance, I would like to emphasise the value and suitability of the systematic review for health practitioners.  They make so much sense. As opposed to quantitative methods like citation analysis, at least the papers themselves are being read. To cope with exponential growth and modern working practices, systematic reviews are a solution, and they look quite nice on any reference list. As a business librarian, I am seeing more and more of my master’s students follow the systematic review method for their dissertations. Again, this makes sense, and a supervisor can help frame the work and, with hard work, guide the student towards a good grade. Which leads on to…

7. Habitus – This is my final point, where I am thinking about the student when they break through and become a scientist. Their papers are quanta of knowledge, exponentially growing and redefining what is acceptable when it comes to information retrieval and scientific method. There is good and bad in everything, and perhaps this is all a matter of discourse, but I would like to encounter more acknowledgment of the dark arts surrounding systematic reviewing and the impact they can have on scholarly communication as a whole.
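For point 5 above, here is a rough sketch of the sort of tally involved. The per-year counts are invented placeholders, chosen only to reproduce the headline figures (roughly 43,000 papers in total, half of them from 2012–2014); the real exercise would use a year-by-year export from Web of Science for records with “systematic review*” in the title:

```python
# Hypothetical per-year counts, NOT the actual Web of Science data.
yearly_counts = {
    2010: 2_800,
    2011: 3_400,
    2012: 5_900,
    2013: 7_200,
    2014: 8_400,
}
# Assume all years from 1970 to 2009 sum to 15,300 for this illustration.
earlier_total = 15_300

total = sum(yearly_counts.values()) + earlier_total
recent = sum(n for year, n in yearly_counts.items() if 2012 <= year <= 2014)

print(f"Total 1970-2014: {total:,}")
print(f"2012-2014: {recent:,} ({recent / total:.0%} of the total)")
```

With these made-up numbers the script reports 43,000 papers in total and 50% of them from 2012–2014, which is the proportion quoted above; the point is simply how lopsided the recent share is.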

Avtar Natt
January 2016