Research Methods for Information Research
7. Beyond research methods
7.3 Finding out what research methods were used
“Books beat the Beatles when it comes to reducing stress” or, more prosaically, “Even six minutes of reading can reduce stress levels by more than two thirds, according to a new study”. For me, these two quotes about a recent research study highlight a possible media shift in research reporting and a more specific cause for concern about research methods reporting. Reports on this work appeared in the Telegraph and Daily Mail in the UK and by early April 2009 had rapidly flowed around the world via the Times of India, Marie Claire and a host of news portals. As always, my research methods perspective prompted several questions: who says this? What sort of study was carried out, and on what scale? What methods were used? All that the prominent news reports told us was the name of the institution that had performed the study (The Mind Lab, or Mindlab International)20 and, in some reports, the fact that the participants were volunteers – but how many volunteers, and how representative were they of the wider stressed-out community?
How could I answer these questions? The traditional answer, of course, is that all this information is included in the formal research report or in the academic journal article setting out the prior coverage of the research theme, the issue addressed, the methods used, and the principal findings and conclusions. Even if a newspaper or broadcaster picks up on such a report, there is a fair chance that the approach used, if not the precise methods, will be included in the account. There is a stronger chance that a source will be given for the report being recycled and, even without precise details, it should be fairly easy to run the original report to earth by visiting the website of the organisation that carried out the research. In this case, as with several recent media reports that I have followed up, it gradually transpired that there were no such academic reports. The study was commissioned by Galaxy (which one?) and conducted by Mind Lab International, one of the burgeoning group of university-generated independent research consultancies, this time based at the University of Sussex.
Interestingly, the first and more melodramatic of the opening quotes above is the title of the consultancy report itself, in the version sent to me by its co-author, the cognitive neuropsychologist Dr David Lewis. It emerged that the study involved 16 volunteers, all keen readers, and entailed monitoring heart rate and skin conductance during a set of ten alternating activities (five stress-producing, five stress-reducing), ranging from playing a game in which participants received a mild electric shock through to reading a favourite book. The results were certainly interesting and could serve as the basis for a preliminary pilot study leading to a much larger-scale research study. Such a study could take account of reluctant readers; of younger people (the average age of the participants was 34), who might be more likely to relax by playing computer games (without the electric shocks!) or by means other than those tested; and of variability in stress levels and in responsiveness to any stress reduction.
The problem here (and in other recent cases of media reporting of small-scale consultancy research) is not that the central claim made was unfounded. Rather, without the traditional trappings of academic research reporting, the media are unable to contextualise their versions of the central claim and, in turn, the uncommitted reader has no basis on which to assess the merits of the story, leading to inevitable distortion and exaggeration. More generally, the growth in university-derived research consultancies, and a trend towards awarding research contracts to management consultancy companies, may be fundamentally distorting the academic research reporting process. Commenting on this perceived trend, Prof. Dave Nicholas claimed that:
“Because consultancy-based research work tends not to result in peer-reviewed publication this generally resulted in a loss to the UK LIS research literature.”21
This problem does not begin and end with consultants and their reporting habits. There is growing evidence that academic researchers are also beginning to change their publication habits. The growth of the Internet, and the enhanced possibilities for communicating with academic peers that it has created, is now leading to a short-circuiting of the traditional (and slow) academic research publication process. For example, a recent survey of scientific, technical and medical information specialists working in academia and government organizations by 2collab9 reports that more than 50% of respondents believe that social networking will play a key role in shaping the future of research. Unsurprisingly, perhaps, the survey also reports that the most frequently used research-based social bookmarking and file sharing site (at 20% of respondents) is 2collab. And yes, I have deliberately chosen a non-academic research source to cite. If you are keen enough you will be able to find out how well this research was conducted and precisely what methods were used.
Meanwhile, researchers in some fields, such as theoretical physics, report that they have largely abandoned traditional publication because of the pace of development within their discipline. Sharing draft reports with other specialist researchers, as a prelude or alternative to formal submission to the peer review process and publication in the appropriate academic journal, does not in itself threaten our access to information about the research methods used. In some cases, the informal feedback that follows posting draft papers for peer scrutiny may even result in a clearer exposition of the research methods used. However, sloppy publication of research is more likely to occur without the discipline imposed by the academic review process. In particular, details of research methods are more likely to disappear: they can be a chore to write (compared with the interesting results being reported), and there must be a temptation to fudge this aspect of reporting, particularly if you are aware of limitations in your choice or use of research methods.
21. Nicholas, D. ‘Research’ in Bowman, J.H. (ed.) British Librarianship and Information Work 2001–2005. Aldershot, Hants: Ashgate, 2007. ↩