Showing posts with label paternalism.

Monday, September 29, 2025

Lying and epistemic utility

Epistemic utility is the value of one’s beliefs or credences matching the truth.

Suppose your credences and mine differ. Then I am going to think that my credences better match the truth: this is automatic if I am measuring epistemic utilities using a strictly proper scoring rule, since by my own lights my credences uniquely maximize expected accuracy. But that means that benevolence with respect to epistemic utilities gives me a reason to shift your credences to be closer to mine.
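This can be checked numerically for one strictly proper rule, the Brier score. The sketch below is only illustrative (the Brier score as the measure of epistemic utility and the function name are my assumptions, not the post's):

```python
def expected_brier(p, q):
    """Expected Brier accuracy, 1 - (truth - q)**2, of a credence q,
    computed from the standpoint of someone whose own credence is p."""
    return p * (1 - (1 - q) ** 2) + (1 - p) * (1 - q ** 2)

# By my lights (p = 0.5), my own credence scores at least as well as
# any other credence on a grid, and strictly better than yours (0.4):
assert expected_brier(0.5, 0.5) > expected_brier(0.5, 0.4)
assert all(expected_brier(0.5, 0.5) >= expected_brier(0.5, q / 100)
           for q in range(101))
```

The same check goes through for any p, which is just what strict propriety guarantees.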

At this point, there are honest and dishonest ways to proceed. The honest way is to share all my relevant evidence with you. Suppose I have done that. And you’ve reciprocated. And we still differ in credences. If we’re rational Bayesian agents, that’s presumably due to a difference in prior probabilities. What can I do, then, if the honest ways are exhausted?

I can lie! Suppose your credence that there was once life on Mars is 0.4 and mine is 0.5. So I tell you that I read that a recent experiment provided a little bit of evidence in favor of there once having been life on Mars, even though I read no such thing. That boosts your credence that there was once life on Mars. (Granted, it also boosts your credence in the falsehood that there was such a recent experiment. But, plausibly, getting right whether there was once life on Mars gets much more weight in a reasonable person’s epistemic utilities than getting right what recent experiments have found.)
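The parenthetical trade-off can be made numerical under illustrative assumptions (the Brier score as the accuracy measure, and made-up credence shifts: the lie nudges your Mars credence from 0.4 to 0.45, and your credence in the fictitious experiment from 0.05 to 0.9):

```python
def brier(q, truth):
    """Brier accuracy of credence q given the truth value (1 or 0)."""
    return 1 - (truth - q) ** 2

def expected_brier(p, q):
    """Expected Brier accuracy of credence q from the standpoint of credence p."""
    return p * brier(q, 1) + (1 - p) * brier(q, 0)

# By my lights (credence 0.5 in life on Mars), the lie nudges your
# Mars credence from 0.4 to 0.45, a small expected gain:
mars_gain = expected_brier(0.5, 0.45) - expected_brier(0.5, 0.4)

# But I know the experiment claim is false, and the lie moves your
# credence in it from 0.05 to 0.9, a large loss (negative number):
exp_loss = brier(0.9, 0) - brier(0.05, 0)

# The lie raises your total weighted epistemic utility, by my lights,
# only if the Mars question gets enough weight relative to the
# experiment question:
weight_needed = -exp_loss / mars_gain
assert mars_gain > 0 and exp_loss < 0
assert weight_needed > 100
```

On these numbers the Mars question would need over a hundred times the weight of the experiment question before the lie improved your expected epistemic utility by my lights, so the "much more weight" caveat is doing real work.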

We often think of lying as an offense against truth. But in these kinds of cases, the lies are aimed precisely at moving the other towards truth. And they’re still wrong.

Thus, it seems that striving to maximize others’ epistemic utility is the wrong way to think of our shared epistemic life.

Maximizing others’ epistemic utility seems to lead to a really bad picture of our shared epistemic life. Should we, then, think of striving to maximize our own epistemic utility as the right approach to one’s individual epistemic life? Perhaps. For maybe what is apt to go wrong in maximizing others’ epistemic utility is paternalism, and paternalism is rarely a problem in one’s own case.

Friday, February 21, 2025

Bayesianism and epistemic paternalism

Suppose that your priors for some hypothesis H are 3/4 while my priors for it are 1/2. I now find some piece of evidence E for H which raises my credence in H to 3/4 and would raise yours above 3/4. If my concern is for your epistemic good, should I reveal this evidence E?

Here is an interesting reason for a negative answer. For any strictly proper (accuracy) scoring rule, my expected value for the score of your credence is uniquely maximized when your credence equals my own, 3/4. I assume your epistemic utility is governed by a strictly proper scoring rule. So the expected epistemic utility, by my lights, of your credence is maximized when your credence is 3/4. But if I reveal E to you, your credence will go above 3/4. So I shouldn’t reveal it.
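A numerical check, assuming the Brier score and an illustrative post-evidence credence of 0.85 (that figure is my assumption, not from the post):

```python
def expected_brier(p, q):
    """Expected Brier accuracy of credence q from the standpoint of credence p."""
    return p * (1 - (1 - q) ** 2) + (1 - p) * (1 - q ** 2)

my_credence = 0.75
after_E = 0.85  # illustrative: where revealing E would push your credence

# A grid search confirms that, by my lights, 0.75 is the unique maximizer,
# so your pre-evidence credence of 3/4 already does best:
best = max((q / 100 for q in range(101)),
           key=lambda q: expected_brier(my_credence, q))
assert best == my_credence
assert expected_brier(my_credence, my_credence) > expected_brier(my_credence, after_E)
```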

This is epistemic paternalism. So, it seems, expected epistemic utility maximization (which I take it has to employ a strictly proper scoring rule) forces one to adopt epistemic paternalism. This is not a happy conclusion for expected epistemic utility maximization.

Thursday, February 15, 2024

Technology and dignitary harms

In contemporary ethics, paternalism is seen as really bad. On the other hand, in contemporary technology practice, paternalism is extremely widely practiced, especially in the name of security: all sorts of things are made very difficult to unlock, with the main official justification being that if users unlock the things, they open themselves to malware. As someone who always wants to tweak technology to work better for him, I keep on running up against this: I spend a lot of time fighting against software that wants to protect me from my own stupidity. (The latest was Microsoft’s lockdown on direct access to HID data from mice and keyboards when I wanted to remap how my laptop’s touchpad works. Before this, because Chromecasts do not make root access available, to get my TV’s remote control fully working with my Chromecast I had to make a hardware dongle sitting between the TV and the Chromecast, instead of simply reading the CEC system device on the Chromecast and injecting appropriate keystrokes.)

One might draw one of two conclusions:

  1. Paternalism is not bad.

  2. Contemporary technology practice is ethically really bad in respect of locking things down.

I think both conclusions would be exaggerated. I suspect the truth is that paternalism is not quite as difficult to justify as contemporary ethics makes it out to be, and that contemporary technology practice is not really bad, but just a little bad in the respect in question, even if that “a little bad” is very annoying to hacker types like me.

Here is another thought. While the official line on a lot of the locking down of hardware and software is that it is for the good of the user, in the name of security, it is likely that often another reason is that walled gardens are seen as profitable in a variety of ways. We think of a profit motive as crass. But at least it’s not paternalistic. Is crass better than paternalistic? On first thought, surely not: paternalism seeks the good of the customer, while profit-seeking does not. On second thought, it shows more respect for the customer to have a wall around the garden in order to be able to charge admission rather than in order to control the details of the customer’s aesthetic experience for the customer’s own good (you will have a better experience if you start by these oak trees, so we put the gate there and erect a wall preventing you from starting anywhere else). One does have a right to seek reasonable compensation for one’s labor.

The considerations of the last paragraph suggest that the special harm of paternalistic behavior is a dignitary harm. There is no greater non-dignitary harm to me when I am prevented from rooting my device for paternalistic reasons than when I am prevented from doing so for profit reasons, but the dignitary harm is greater in the paternalistic case.

There is, however, an interesting species of dignitary harm that sometimes occurs in profit-motivated technological lockdowns. Some of these lockdowns are motivated by protecting content-creator profits from user piracy. This, too, is annoying. (For instance, when having trouble with one of our TV’s HDMI ports, I tried to solve the difficulty by using an EDID buffer device, but then I could no longer use our Blu-Ray player with that port because of digital-rights management issues.) And here there is a dignitary harm, too. For while paternalistic lockdowns are based on the presumption that lots of users are stupid, copyright lockdowns are based on the presumption that lots of users are immoral.

Objectively, it is worse to be treated as immoral than as stupid: the objective dignitary harm is greater. (But oddly I tend to find myself more annoyed when I am thought stupid than when I am thought immoral. I suppose that is a vice in me.) This suggests that in terms of difficulty of justification of technological lockdowns with respect to dignitary harms, the ordering of motives would be:

  1. Copyright-protection (hardest to justify, with biggest dignitary harm to the user).

  2. Paternalism (somewhat smaller dignitary harm to the user).

  3. Other profit motives (easiest to justify, with no dignitary harm to the user).

Friday, March 17, 2023

Paternalistically enhancing autonomy

Sometimes you are about to tell someone something, and they say: “I don’t want to hear about it.” Yet in some cases, the thing one wanted to tell them is actually rationally relevant to a decision they need to make, and without the information their decision will be less truly theirs.

Imagine, for instance, you have a friend who needs an organ transplant and is planning to travel to China to get the organ transplant. You start to tell them that you read that China engages (or at least recently engaged) in forced organ harvesting among executed prisoners, but they try to shut you up. Yet you keep on speaking. In doing so, you are being paternalistic, but your paternalism enables them to make a more truly informed, and hence autonomous, decision.

It sounds strange to think of paternalism as supporting autonomy, but if we think of autonomy in a Kantian way as tied to genuine rationality, rather than in a shallow desire-fulfillment way, then we will realize that a person can (e.g., through deliberate ignorance) act against their own autonomy, and there may be room for a healthy paternalism in restoring them to autonomy against their own desires. This kind of thing should be rare (except in the case of literal parents!), but it is also the kind of thing friends need to do for friends at times.

Monday, April 27, 2009

Evangelization, love and union

One important reason for evangelization is the beneficence aspect of love: Christians would like everyone to share in Christ's gift of new life. But a second reason for evangelization is love's striving for union. Christians would like to be united with neighbor, and thus would like to be members of the body of Christ together with the neighbor. The second reason presupposes the first. For, sometimes, the beneficence aspect of love holds us back from a union that would be harmful to the other—thus, if our company is noxious to someone we are in love with, we should sacrifice ourselves and stay away. However, the union in the body of Christ is beneficial for our neighbor, and hence the beneficence aspect of love does not hold us back here.

Is this paternalistic? Could be. But there is nothing wrong with a proper paternalism.