The discussion about reputation metrics in science is dragging on. By now everybody knows the standard indicators (publications, impact factor, citations, ...), everybody uses them, everybody criticises them - and everybody ignores them when necessary. Metrics-bashing has become a ritual (performed while boasting about one's own Hirsch-factor). Something has to happen.
Now.
(It won't.)
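For readers who have escaped it so far: the Hirsch-factor (h-index) mentioned above is the archetype of such a one-number condensation - the largest h such that h of your papers have at least h citations each. A minimal sketch in Python, with made-up citation counts, shows how little arithmetic it takes to shrink a career to a single integer:

```python
def h_index(citations):
    """Return the largest h such that h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the bar
        else:
            break     # sorted descending, so no later paper can
    return h

# A hypothetical publication record: one citation count per paper.
print(h_index([42, 17, 9, 6, 6, 3, 1, 0]))  # -> 5
```

That is the whole trick - a sort and a loop.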
While researching new metrics can earn you a living, the output, quite frankly, can bore you to tears. The same folks who were unable to show how scientific excellence maps onto numbers now open the floodgates. They spread their concept of 'excellence by Excel' from research to knowledge transfer to impact on society - ever expanding the food chain of things to be tagged.
Get real!
What societal impact does a scientific result have? The discovery of superconductivity? Research on the linguistics of micro-languages? Try putting a number on the societal impact of any result: good luck!
The science community is feeling the grip of the bureaucrats while science funding follows the mirage of 'efficiency'. It looks as if everybody is fooled into submission. You know the line: 'I believe it is crap, but since everybody is doing it, so should we' - heard from scientists and bureaucrats alike. So they all play those 'boredgames'.
The science bureaucrats are the ones who need computable numbers to rank, judge, praise or dismiss science and scientists - because, it seems, they so deeply mistrust the concept of science and the peer-review system. How could they understand the predominant working principle of curiosity-driven self-exploitation that powers any real scientist?
Since many of them can't distinguish potatoes from horse droppings, they need the science landscape mapped to a score sheet to create their impressive set of poo-charts - umm, pie charts.
This age-old approach to reputation metrics looks so impressively objective. But make no mistake: no matter what numbers they compile, the best ones are the fallout of a peer's opinion:
publications? - Referees have seen the paper and commented on it
citations? - Scientists quote what they have learned to be important and trustworthy
PhD theses? - A number of scientists were involved over years
Reputation metrics as we know them - the compilation of indicators - are nothing but the condensate of the peer review that scientists justifiably rely on and that bureaucrats are so scared of. 'Objectivity' is a sweet deception, and honesty about that would be a good start.