jack: (Default)
[personal profile] jack
Utilitarianism is a very useful way of thinking, I think an advance over my previous conceptions. It might even be an ideal (say to incorporate into a non-sapient robot). It may be a reasonable (but imho flawed) representation of my ultimate moral aims. But I have a number of problems:

0. As normally stated, it's a bit unclear whether intentions or results matter. I normally interpret it as doing whatever is most likely to improve happiness (mathematically: choosing the action with the greatest expected happiness).

1. It generally doesn't help in thought experiments like Simont's.

2. How do you quantify happiness? And if you can, how do you compare it between people? If two of us rate our happiness on a scale of 1-10, and our ratings have different distributions, can you put them on a common scale?
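To illustrate the rescaling problem (a hypothetical sketch; the names and ratings are made up), one standard statistical dodge is to standardise each person's ratings against their own mean and spread, but notice that this already smuggles in an assumption:

```python
from statistics import mean, stdev

# Hypothetical self-reported happiness ratings on a 1-10 scale.
# Alice uses the whole scale; Bob clusters everything around 7.
alice = [2, 4, 5, 6, 9]
bob = [6, 7, 7, 7, 8]

def standardise(ratings):
    """Rescale ratings to z-scores: each person's own average becomes 0."""
    m, s = mean(ratings), stdev(ratings)
    return [(r - m) / s for r in ratings]

# After standardising, Alice's 9 and Bob's 8 both come out as "well
# above that person's own average" -- but that treats their internal
# scales as differing only by a shift and a stretch, which is exactly
# the interpersonal comparison being questioned.
print(standardise(alice))
print(standardise(bob))
```

The code answers "how happy is this, for them?" but cannot, even in principle, say whose 10 represents more happiness.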

3. Do you take into account other people's feelings, like their satisfaction at helping someone else? Or just your own? Do you consider contributions to far-future happiness, e.g. not dropping litter sets a good example, meaning fewer people overall drop litter? Obviously you should, but I defy you to know how much weight to put on that.

If you use these arguments you can fit utilitarianism to any situation, but it doesn't actually help you decide, because you have to know the answer already in order to know how much weight to put on the more intangible benefits.

5. I don't like the ruthlessness. According to the statement, if arresting one (innocent) person as an example reduces crime and makes life better for lots of people, that's good. Possibly it's bad in the long term, but you can't show that. And yet I don't agree with doing it.

6. What about death? Or, for that matter, not being born? You don't then experience any unhappiness, so is it bad? How much so? Is it different for different people?

7. What about animals? Do they get the same consideration as humans? Any at all? I say somewhere in between, but how do you quantify that?

8. Anything I do in this country is probably irrelevant goodness-wise compared to getting a £40k job and sending *all* of it to Africa. But most people don't; should a moral system help them choose between the other options, or just stay like a broken compass pointing south-east?

Date: 2006-08-02 09:06 pm (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
I was thinking more of a matter of policy, as a hypothetical example. Is forcing a large pain on one person justified by the happiness it brings to other people?

I might hope I would be selfless enough to volunteer, but I don't think I want to force someone else to. But by a utilitarian argument you should.

Date: 2006-08-03 10:44 am (UTC)
From: [identity profile] douglas-reay.livejournal.com
There are two similar questions. One where you knowingly (and unnecessarily) force significant pain upon a single chosen innocent individual without their consent, to create a lesser happiness but for a larger number of people. That's throwing the Christians to the lions in order to entertain the masses.

And the second where it is probabilistic. Such as having a shoot-to-kill policy on suspected bombers, knowing that X percent of the time you will shoot dead an innocent shopper in the subway, and Y percent of the time you will save hundreds of commuters from horrific injuries.

The latter isn't nice, but one can live with it, given a sufficiently extreme reward ratio.
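The probabilistic case reduces to expected-value arithmetic (a toy sketch; the probabilities and harm figures below are invented for illustration, since the original X and Y are deliberately left unspecified):

```python
# Toy expected-utility comparison for the shoot-to-kill policy.
# All numbers are invented for illustration; harms are in arbitrary
# negative "utility" units.
p_innocent = 0.05          # X%: shoot an innocent shopper
p_bomber = 0.95            # Y%: stop a real bomber
harm_innocent_death = -1000.0
harm_per_injury = -100.0
injuries_prevented = 200   # "hundreds of commuters"

# Expected harm caused by the policy vs expected harm it averts.
expected_harm_with_policy = p_innocent * harm_innocent_death
expected_harm_without = p_bomber * harm_per_injury * injuries_prevented

# A utilitarian endorses the policy only when the expected harm with
# it is smaller in magnitude than the expected harm without it: the
# "sufficiently extreme reward ratio" in the text.
print(expected_harm_with_policy)   # -50.0
print(expected_harm_without)       # -19000.0
print(expected_harm_with_policy > expected_harm_without)  # True
```

With these made-up figures the ratio is extreme enough that the calculation endorses the policy; shrink Y or the number of injuries prevented and the conclusion flips.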

The former is barbaric, and when legislating, you have to decide the effects on society. A case in point is the way newspapers crucify innocent celebrities in order to entertain millions of readers.

Date: 2006-08-04 05:00 pm (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
I agree, but may I concentrate on the part of the post which I think illustrates the point I was trying to make?

"The former is barbaric, and when legislating, you have to decide the effects on society."

Hypothetically, if there are no other effects (approximately no damage to society from people finding out, no great sense of outrage from the victim knowing, etc.), then, when comparing the suffering imposed on someone through no choice of their own to the lesser pleasure given to many people, do you or do you not take into account the fact that you're imposing it?

I'll leave the various other details for the next reply.

Date: 2006-08-04 05:01 pm (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
PS. I think I do. And I admit that my moral system, such as it is, is a big mess of compromises. But that's a reason I don't like utilitarianism as stated.