
Peer Review and Innovation

Lecturers on my wife's Physiotherapy course told students that peer-reviewed journals, such as the British Medical Journal, were the gold standard.

But in New Scientist, 23 February 2008, Donald Braben argues that we are seriously deluded if we think peer review can lead to innovation. Peer review might work for the mainstream, he writes, but it excludes radical research. This chimes with an exchange between Checkland and Jackson that I read while researching a Masters dissertation.

Checkland's Soft Systems Methodology is a way of finding solutions to problems that cannot easily be defined and might only be sensed as a vague feeling that all is not well. Predictably, Checkland suggests defining the problem and then "identifying feasible and desirable changes". Part of this identification is for the interested parties to generate options, and it was here, as I recall, that Michael A. Jackson argued that group dynamics made the methodology normative rather than radical. Groups tend to fall into hierarchical working, he suggested, often with one particular individual or group of individuals dominating; this meant that radical solutions would often be rejected by those supporting the status quo.

The same limitation may apply to multi-disciplinary review. In a previous posting I wrote about a presentation by Prof Berg, in which he argued that computers should support standardised pathways of care, continually enhanced by review. On the face of it this sounds reasonable. Indeed, the idea is not new: I was proposing it at least 15 years before Berg, and I doubt I was the first.

But Berg argued that the review would generate innovation. I doubted it, and what I have read and heard about the dynamics of multi-disciplinary working supports my scepticism.

Nor is that the end of the story. Peer review may be part of the future of medical practice, but only part. Wikipedia quotes Drummond Rennie of the Journal of the American Medical Association:

"There seems to be no study too fragmented, no hypothesis too trivial, no literature too biased or too egotistical, no design too warped, no methodology too bungled, no presentation of results too inaccurate, too obscure, and too contradictory, no analysis too self-serving, no argument too circular, no conclusions too trifling or too unjustified, and no grammar and syntax too offensive for a paper to end up in print."
