I came across a tweet the other day where someone was complaining
that their paper kept getting bounced from journals because its findings weren’t
“surprising”. Many people piled into the replies to insist that, in this bright
new age of empirical rigour, boring findings are important too. I want
to defend the editors. “Surprising” is not the ideal word. They should have used “uninteresting”,
or “doesn’t push the needle”, or “this finding doesn’t matter enough to be
published in this journal”. We pursue far too much research that is of low
marginal value. It should be published if it is technically sound, but it
should not be published in A journals.
I would think that in an ideal world, there would be some
big questions that animate research. Things like “why do some economies grow
while others do not?” or “what is the education production function?”. Each of
these grand questions has smaller questions nested within it, like “does secret
balloting improve growth by reducing government corruption?” Some of these
nested questions will have pretty obvious hypothetical answers, but a degree of
doubt and no empirical evidence. Others won’t even be clear at the hypothesis
level of analysis. What they will all share, however, is that they push the
needle on some question that is of importance to society; a question that
people care about.
For some reason, which I suspect is the current obsession with
causal identification in economics, we are publishing reams
of material that answers questions nobody gives a fuck about. An example: I
remember reading a paper in public economics where someone looked at the causal
effect on faculty quality of firing all the Jews (Nazis were used as exogenous
variation). Unsurprisingly, it was
negative. The author noted in the literature review that nobody had written on
this question before. I wonder why? Because the answer is obvious. Were there
deans somewhere wondering whether they could replace all their top academics
with PhD students and get the same results? No! The identification was clear,
so I think this paper should be published, just not in the Journal of
Political Economy!
Some people might say that the effect size of sacking a
quarter of your faculty is not obvious. Sure, but neither is it made clear by
these studies; there are too many concerns stemming from external validity, for one. Now
these people would respond, “well then you just need to do more replications”.
Oh, right, so now not only do we need to publish one valueless paper we need to
publish dozens?
This brings me to a key issue, which is costs and benefits.
A typical academic publishes maybe 2 papers per year and teaches 2 courses. Let’s
be generous and say that they take a month of leave and only work 40 hours
a week. Let’s be more generous and say that they’re only paid $100/hour (they
are paid more than that). So they have 11 months of work, they spend say 5
months of that teaching (generous again), leaving 6 months for research, which
amounts to 3 months per paper. There are 12 weeks in 3 months, 40 hours per
week, so that’s about $48 000 per paper.
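The back-of-envelope arithmetic above can be laid out explicitly. Every figure here is one of the deliberately generous assumptions from the text, not real salary or workload data:

```python
# Back-of-envelope cost per paper, using the generous assumptions above.
HOURLY_RATE = 100       # dollars per hour (a deliberate underestimate)
WORK_MONTHS = 11        # 12 months minus a month of leave
TEACHING_MONTHS = 5     # generous allowance for two courses
PAPERS_PER_YEAR = 2
HOURS_PER_WEEK = 40
WEEKS_PER_MONTH = 4     # 12 weeks in 3 months, as in the text

research_months = WORK_MONTHS - TEACHING_MONTHS       # 6 months for research
months_per_paper = research_months / PAPERS_PER_YEAR  # 3 months per paper
hours_per_paper = months_per_paper * WEEKS_PER_MONTH * HOURS_PER_WEEK
cost_per_paper = hours_per_paper * HOURLY_RATE

print(f"${cost_per_paper:,.0f} per paper")  # $48,000 per paper
```

Loosen any one of these assumptions (a realistic hourly rate, less teaching, more leave) and the price per paper only goes up.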
Do you think that if you went to the Australian taxpayer and
said “hey, for the bargain price of $48 000, I can tell you whether arbitrarily
sacking part of your faculty and replacing it with marginal professors results
in declining quality; do you want to fund me?” that they would say yes? Of
course they wouldn’t!
A lot of the marginal papers that I come across are driven by
two things:
1) PhD students who want to demonstrate technical skills rather than ideas, so
that they can go work as postdocs for someone else who does have ideas.
2) The tendency at present to find a data set that nobody has used and then
think about what questions it can be used for, rather than coming up with a
really useful question and a tight hypothesis and then finding an ingenious
way to test it.
These two things are related in that neither suits people
who have questions. We are increasingly recruiting people to research jobs who
are technical workhorses not creative thinkers. We are consequently getting a proliferation
of empirical observations, but not a lot of coherent bodies of interesting knowledge. This
has been OK for a while because we had so much theory left over from the
pre-data era to test with our new data. But now we are running out of good
theory. For the economists out there, it’s like operating at the extremes of
your production isoquant: the efficient mix of inputs is part theory and part
empirics, but instead we are using almost all empirics.
Now one thing that really upsets me about the current state
of play is that there are all these empirical workhorses who want to be “researchers”
applying for academic jobs. But academic research is different to research in
general. Academic research is meant to be about fundamental knowledge—about pushing
the needle. If you don’t have questions and you are incapable of creative
theorising, you shouldn’t be an academic. There are lots of research jobs elsewhere.
I don’t say this purely out of bitterness at the extent of competition on the
job market. I say it out of a genuine feeling of tragedy that governments, NGOs
and other organisations that have heaps of really important, immediate-impact
questions (and the necessary data!) can’t find researchers with the skills they need
because everyone is trying to get an academic job where they will study whether higher crime rates
increase stress or some other such banality. Go work for government!
There are lots of problems in the current academic publishing
system. Not publishing failures to replicate is one. Not publishing refutations
of hypotheses is a huge one. Frothing to publish really unexpected results is a
staggeringly pernicious one that fed directly into psychology’s replication
crisis. But not publishing boring results is not a problem. There are plenty of B
journals that should take this work, which should mostly be done by minor academics
and PhD students who don’t want academic jobs. The A* journals should only be
taking papers that genuinely push the frontiers of our knowledge in significant
ways.