Top News, Articles, and Interviews in Philosophy

'Risky Research' Redux

I'm looking forward to participating in 1DaySooner's Zoom panel discussion on 'What is the Upper Limit of Risk in Clinical Trials?' next week (May 4th, @6pm ET) -- you can register here if you're interested in attending.

My basic view is that there is no absolute upper limit: given informed consent, the risk just needs to be proportionate, i.e. outweighed by the social value of the information gained from the research.

Indeed, this strikes me as entirely straightforward. There are two key values that public policy should be guided by: beneficence (promoting the overall good) and autonomy (respecting individuals' choices about their own lives). Conflicts between the two values can be morally tricky. But if both of these values point in the same direction, as they do in the case of valuable research involving willing volunteers, then it really should be a no-brainer. There's just no good reason to engage in anti-beneficent paternalism. So: let's please stop doing that!

I think that's the simplest case for "risky research". In my paper with Peter Singer, we additionally proposed a principle of risk parity, according to which "if it is permissible to expose some members of society (e.g. health workers or the economically vulnerable) to a certain level of ex ante risk in order to minimize overall harm from the virus, then it is permissible to expose fully informed volunteers to a comparable level of risk in the context of promising research [More]

Follow Decision Theory!

Back in January, I wrote that there's no such thing as "following the science" -- that scientists and medical experts "aren't experts in ethical or rational decision-making. Their expertise merely concerns the descriptive facts, providing the essential inputs to rational decision-making, but not what to do with those inputs."

It's worth additionally emphasizing that this question of how to convert information into rational decisions is not something about which academic experts are entirely at sea. On the contrary, there's a well-developed academic subfield of decision theory which tells us how to balance considerations of risk and reward in a rational manner. The key concept here is expected value, which involves multiplying (the value of) each possible outcome by its probability. For example, we know that (all else equal) we should not accept a 50% chance of causing 10 extra deaths for the sake of a 1% chance of averting 100 deaths, for the latter's expected value (one death averted) does not outweigh the former's expected cost (five extra deaths).

Now, my central complaint throughout the pandemic has been that policy-makers and institutions like the FDA (and their European equivalents) have evidently not been guided by any sort of cost-benefit analysis or the most basic principles of decision theory. As Govind Persad put it, withholding vaccines during a pandemic is like withdrawing the service ladder from a subway tunnel for a safety [More]
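The expected-value comparison above can be spelled out in a few lines. A minimal sketch (the 50%/10-death and 1%/100-death figures are the post's own hypothetical numbers, not real data):

```python
# Expected value: multiply (the value of) each possible outcome by its probability.
def expected_deaths(probability: float, deaths: float) -> float:
    """Expected number of deaths from a single-outcome gamble."""
    return probability * deaths

# The post's hypothetical: a 50% chance of causing 10 extra deaths,
# versus a 1% chance of averting 100 deaths.
expected_cost = expected_deaths(0.50, 10)      # 5.0 expected extra deaths
expected_benefit = expected_deaths(0.01, 100)  # 1.0 expected death averted

# Rational choice: accept the gamble only if expected benefit outweighs expected cost.
accept = expected_benefit > expected_cost
print(accept)  # False
```

Since one expected death averted does not outweigh five expected extra deaths, the gamble is rejected, exactly as the prose argues.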

Imagining an Alternative Pandemic Response

I received my first shot of the Moderna vaccine yesterday -- which naturally got me thinking about how this should've been accessible much, much sooner. I don't think anyone's particularly happy about the way that our pandemic response played out, but there's probably a fair bit of variation in what people think should've been done differently. What alternative history of COVID-19 do you wistfully yearn after? Here's mine (imagining that these lessons were taken on board from the start)...

In early Feb 2020, with Covid declared a "global health emergency" by the WHO, American scientists prioritize preparing a low-dose "challenge strain" of the virus in case it is needed for emergency immunity research.

By early March, as the seriousness of the pandemic becomes clear, the emergency research protocols are approved by the president, and hundreds of volunteers enlisted (mostly young and healthy, but also some terminal patients and elderly altruists who want to help produce a better world for their great-grandkids).

A strict (but temporary) lockdown is implemented for the last two weeks of March, to buy time while the nation waits on the results of the emergency immunity research. "Immunity passports" grant lockdown exceptions to those who have already recovered from the illness. Those in possession of immunity passports are highly favoured for "essential work" to minimize transmission risk to others. There are reports of occasional "pox parties" [More]

Against Anti-Beneficent Paternalism

In a previous post, I argued that "undue inducement" worries are typically deeply misguided, and that banning good compensation is contrary to the interests of the very people that it's intended to help. In this post, I want to raise a different objection: that even if allowing and/or incentivizing beneficent actions (such as kidney donation, or challenge trial participation) would "induce" some people to perform beneficent acts that they might later regret (or that they wouldn't have agreed to if thinking more clearly), it may nonetheless be the case that banning this would be morally even worse.

First: I grant that it is absolutely a pro-tanto moral cost if someone makes a personally-regrettable decision. But a question that is rarely asked is: how great of a moral cost is this? How does it compare to the moral costs of status-quo harms (e.g. people dying for lack of a kidney transplant, or lack of a promptly-developed Covid vaccine) that are relieved by the transaction, or even just the costs to other participants who truly wish to participate (some of whom may benefit greatly from being well-compensated)?

As a general rule, it seems to me that we should not intervene to prevent people from performing beneficent acts (acts that help others more than they harm the agent themselves). Reasonable people can dispute the conditions under which individuals might be forced to sacrifice their own interests to better promote the general good. But [More]

There's No Such Thing as "Following the Science"

Ezra Klein quotes a Harvard epidemiologist's criticism of the FDA for blocking rapid at-home Covid tests: "They are inadvertently killing people by not following the science."

I agree with the spirit of the criticism (and was heartened to read that Biden's surgeon general nominee agrees that the FDA has been "too conservative"), but it's worth clarifying that the FDA's failure here is fundamentally ethical, not scientific.

It's a popular rhetorical move to present one's preferred policies as being backed by the authority of science. It immediately puts one's critics on the back foot: who are they to question science, after all? But it's also misleading. Science doesn't recommend policies for us to follow, for the simple reason that science merely tells us what is the case, and cannot by itself answer normative questions about what ought to be done.

Whether we realize it or not, we use normative bridging principles to cross the is/ought gap. If some such principle is implicitly presupposed in a context, it might then seem as though the scientific claim alone suffices to yield a policy recommendation. Opponents of the policy may then try to undermine our scientific knowledge in order to muddy the waters (cf. climate and covid "skeptics"). Perhaps such silliness could be decried as a failure to "follow the science". But such a framing risks reinforcing the mistaken impression that the science alone determines what should be [More]

The Risk of Excessive Conservatism

In 'Lessons from the Pandemic', I summarized what I took to be some of the biggest mistakes of the pandemic response, and tried to give a sense of the scale of the potential damage done, along with some concrete suggestions for how we might have done vastly better. Some readers (e.g. here) seemed of the opinion that only those with "authority" should express such opinions, which I obviously disagree with. But to better address such readers, it may help to bracket any particular empirical details or examples and focus instead on the most general overarching claim of my post: that excessive conservatism risks immense harm in a pandemic.

One doesn't need a medical degree to see that this more modest (yet still important) claim is true. For it does not require us to establish that some unconventional pandemic policy truly would be much better; it suffices to note that an unconventional pandemic policy easily could be much better -- i.e., there's a non-trivial probability of this -- and since excessive conservatism would dismiss such unconventional proposals out of hand, such conservatism poses a significant risk of immense harm. Since it is worth guarding against significant risks of immense harm, it is worth guarding against excessive conservatism in a pandemic.

To turn this into a more pointed critique of the medical/policy establishment (and elite public opinion), we can simply observe that there is no evidence that said [More]

Epistemic Calibration Bias and Blame-Aversion

People typically treat having an importantly false belief as much more problematic than failing to have an importantly true belief. They're more concerned about being overconfident than being underconfident in their credences. But why? Is such an epistemic asymmetry warranted?

I'm dubious. The ideal is to be epistemically well-calibrated: to have just the degree of confidence in an important proposition that is warranted by your evidence, such that in the long run exactly X% of your "X% confident" beliefs turn out to be true -- no more and no less. Moreover, it seems to me that we should be equally concerned about miscalibration in either direction. If we are underconfident (or withhold judgment entirely) when our evidence strongly supports some important truth, that's just as bad, epistemically speaking, as being correspondingly overconfident.

In thinking about this, it's important to distinguish two dimensions of confidence: what we might call credal value and robustness. To see how these come apart, note that I might have weak evidence that something is very probable. My credence in the proposition should then be high -- for now -- but I should regard this credal value as tentative, or likely to change (in an unknown direction) in the face of further evidence. "Bold beliefs, weakly held," to put the idea in slogan form.

This distinction carries over, in obvious fashion, to expected-value [More]
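The calibration ideal described above ("exactly X% of your 'X% confident' beliefs turn out to be true") can be made operational: group past judgments by stated confidence and compare each group's hit rate to that confidence level, penalizing deviation in either direction. A minimal sketch, with an invented track record for illustration:

```python
from collections import defaultdict

def calibration_report(judgments):
    """Given (stated_confidence, turned_out_true) pairs, return the
    observed hit rate at each stated confidence level.

    Well-calibrated means the hit rate at confidence c is (close to) c.
    A hit rate below c indicates overconfidence; above c, underconfidence --
    and on the symmetric view argued for here, both are equally bad.
    """
    buckets = defaultdict(list)
    for confidence, outcome in judgments:
        buckets[confidence].append(outcome)
    return {c: sum(outcomes) / len(outcomes) for c, outcomes in buckets.items()}

# Hypothetical track record: four "80% confident" beliefs, three of which were true.
report = calibration_report([(0.8, True), (0.8, True), (0.8, True), (0.8, False)])
print(report)  # {0.8: 0.75} -- slightly overconfident at the 80% level
```

Note that this only captures the credal-value dimension; the robustness dimension (how much further evidence would move the credence) is not measured by hit rates at all.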

Scale and Symmetry in Covid Debates

One curious feature of some public debate about Covid policy is when people object to a disliked policy proposal by appealing to a consideration that counts at least as much against the alternative. Here I'll just highlight a couple of especially striking examples of this: scale and unknown risks.

(1) Scale: Back when people were debating whether society's response might end up being worse than the disease, it wasn't unusual to see health boosters emphasize the sheer scale of the health costs that would be incurred along the path to herd immunity through natural infection. "Even a fatality rate of just 0.01% for younger adults would translate into thousands of deaths across that population." That kind of thing.

Which invites the obvious response: Yes, the scale of a pandemic makes the policy stakes really high! For example, if you lower everyone's quality of life by an average of 1/3 for a year, that translates into more than 100 million life-years of equivalent value lost in the US alone (cf. estimated health gains of a few million life-years from covid prevention measures).

Of course, that's just a made-up illustration. Maybe average quality of life has not declined so much. But the essential point remains that any non-trivial cost imposed across an entire population results in massive total damages. And it's not hard to see how the indirect costs of the pandemic could add up here. There are nearly 50 million [More]
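The back-of-the-envelope arithmetic behind the "more than 100 million life-years" figure is simple to check. A sketch (the 1/3 quality-of-life decline is the post's own admittedly made-up assumption, and the US population figure is a round approximation):

```python
us_population = 330_000_000   # approximate US population
quality_of_life_loss = 1 / 3  # hypothetical average decline, per the post
duration_years = 1

# Each person losing 1/3 of their quality of life for one year is treated as
# equivalent to losing 1/3 of a life-year; summing across the population:
equivalent_life_years_lost = us_population * quality_of_life_loss * duration_years
print(f"{equivalent_life_years_lost:,.0f}")  # 110,000,000
```

Roughly 110 million equivalent life-years lost, against the cited estimate of a few million life-years gained from prevention measures, which is the asymmetry the post is pointing at.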

Latest News

Here are some of the things going on in philosophy and the humanities.


Philosopher Spotlight

Conversations with philosophers, professional and non-professional alike.
Visit our podcast section for more interviews and conversations.

Interview with Dr. Robert McKim
  • on Religious Diversity
  • Professor of Religion and Professor of Philosophy
  • Focuses on Philosophy of Religion
  • Ph.D. Yale

Interview with Dr. Alvin Plantinga
  • on Where the Conflict Really Lies
  • Emeritus Professor of Philosophy (UND)
  • Focuses on Epistemology, Metaphysics, Philosophy of Religion
  • Ph.D. Yale

Interview with Dr. Peter Boghossian
  • on faith as a cognitive sickness
  • Teaches Philosophy at Portland State University (Oregon)
  • Focuses on atheism and critical thinking
  • Has a passion for teaching in prisons



21 years in publication
