jack: (Default)
[personal profile] jack
Diax's Rake

Diax's Rake says "Don't believe something simply because you want it to be true". It's from Anathem -- I'm not sure if there's a real-world name?

It sounds obvious, but in fact I keep coming across it in contexts where I hadn't realised "believing in something because I wanted to" was what people were doing.

For instance, the most common argument that there's an absolute standard of morality seems to be "But if there weren't, it would be really terrible. Blah blah blah Hitler." But that's an argument for why it would be undesirable to live in a world without one; it offers no reason other than sheer optimism to think we actually live in a world with one.

But another case seems to be free will. Why do people think we have free will? It seems like the most common argument is "But if we didn't, it would be terrible! Our lives would be pointless, and we wouldn't be able to philosophically justify prison sentences." But again, that seems to be "we WANT to have free will", not "here's a reason to think it's LIKELY we have free will".

Free Will

However, that's somewhat misleading. I feel like at some point society started toying with the idea that neither "we have free will" nor "we don't have free will" makes any seriously falsifiable assertion, even in principle.

At which point, some people said "Look, our future actions are basically predetermined by the physics of our minds. 'free will' is basically a meaningless concept."

And others said, "No, wait. Look at what we associate with 'free will': rights, responsibilities, choices, law, etc, etc. We do have all that, we don't care if it's predetermined or not. I think 'not having free will' is basically a meaningless concept."

And the thing is, THEY'RE BOTH RIGHT. "Free will" being meaningless and "not having free will" being meaningless are exactly the same statement; they just SOUND like they're opposed. They're only somewhat opposed: the two sides agree on how the world works, but disagree on whether "free will" is an appropriate description of it.

And arguing about "should we use this word or not" is almost always pointless, with people regressing to assuming that they're still arguing for the concept they used to associate with the word, without recognising that the other people don't disagree; they're just doing the same thing.

Many people who know more about philosophy than me seem to self-define as compatibilists (the idea that free will and determinism aren't contradictory?). If someone says they're a compatibilist, I generally find I completely agree with how they say the universe works. But I don't understand the assertion that free will exists. Is there a basis for that? Isn't it just pandering to people who have a really intense intuition that free will is a well-defined concept that exists, at the expense of alienating people who at some point became convinced it doesn't?

Appeal to consequences

Date: 2012-06-18 10:59 am (UTC)
pseudomonas: per bend sinister azure and or a chameleon counterchanged (Default)
From: [personal profile] pseudomonas
I think that's more a cognitive bias than a fallacy. Runs close to wishful thinking, is that the real-world name you were after?

Re: Appeal to consequences

Date: 2012-06-18 12:00 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
So what you're saying is that it's not called "Diax's Rake" because that would be difficult to explain?

Date: 2012-06-18 12:35 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
I reason about this from a perspective of symmetries and levels of abstraction.

Firstly, symmetry. Symmetry is fundamental to the physical universe, right down to time being differentiated from space by the past-future asymmetry.

I perceive an asymmetry between my consciousness and the world of my senses. This I regard as my sole metaphysical axiom: I feel myself to be something that has feelings, which pretty much defines consciousness as it's normally understood — yet I couldn't prove to you that I'm conscious.

Having accepted my consciousness axiom, symmetry strongly suggests that other people are conscious, too.

I see determinism as a concept pertaining to physics, and free will as a concept pertaining to metaphysics. I am conscious of having free will, yet we know what will happen when a weight is dangled from a spring, and the neurones in my brain are as susceptible to determinism and/or uncertainty principles and chaos theory as anything else.

So then there's AI. Supposing I were placed in front of a terminal with two chat windows, one connected to you and one connected to a supercomputer that was perfectly emulating your mind. Is the emulator conscious? How could science ever tell us? Is there any moral difference between them? I can't tell.

It boils down to why we accept the Golden Rule, and I'd say it's a combination of instinctive empathy, conditioned social contract and rational symmetry arguments about interpersonal interactions. People without the empathy are psychopaths, people without the social contract are sociopaths and people without the rational symmetry are illogical or short-sighted. Many people would add that people who claim a metaphysical justification for disobeying the Golden Rule are evil.

Opinions seem to differ on which aspect or aspects of that consideration constitute "morality".

Date: 2012-06-18 01:36 pm (UTC)
ptc24: (Default)
From: [personal profile] ptc24
A thought... does free will imply explicable-but-unpredictable behaviour?

If your behaviour is inexplicable, then it seems more like randomness, or even madness, than free will. If your behaviour can be predicted in advance, then a) that looks like determinism and b) people can use that prediction to manipulate you, raising doubts that your behaviour is your own.

Date: 2012-06-18 02:05 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
As I say, I regard my "free will" as a metaphysical construct. It is by definition not falsifiable, and inherently difficult to pin down in formal language.

And then it's only a good working assumption on my part that your "free will" is anything at all like mine. Or even that it exists. Possibly I'm talking to the emulator.

But in a wishy-washy nebulous casual sense, if you fan a deck and ask me to "pick a card, any card", I can do so without consciously feeling any compulsion or constraint.

Date: 2012-06-18 04:07 pm (UTC)
gareth_rees: (Default)
From: [personal profile] gareth_rees
time being differentiated from space by the past-future asymmetry

Are you sure about that?

Time is different from space because of the minus sign in the space-time metric (s² = x² + y² + z² − t²). The past is different from the future for thermodynamic reasons (the universe has lower entropy in the past and higher entropy in the future).

It is not remotely clear to me that these two facts are connected.
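(One way to see the gap: the interval s² = x² + y² + z² − t² contains only t², so it is unchanged by sending t to −t. The minus sign distinguishes the time direction from the space directions, but it is blind to which way along time is "past" and which is "future"; that asymmetry has to come from the entropy gradient.)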

Date: 2012-06-18 04:57 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
Sorry — more exactly time being different from space in that it has a past-future asymmetry.

Date: 2012-06-18 07:39 pm (UTC)
andrewducker: (Default)
From: [personal profile] andrewducker
[embedded image: a Dilbert strip featuring Dogbert, on determinism and blame]

Date: 2012-06-18 08:38 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
By my thinking as elaborated above, Dogbert is perpetrating a fallacious mixture of reasoning about the physical and metaphysical. While the laws of physics are waggling neurons that stimulate nerve cells to make my body take actions that are rationally attributed to me, my consciousness is making free choices for which it might later be blamed. There is no contradiction.

Fortunately, reconfiguring a constituent part of the universe, conditioning a rational system not to repeat an action and punishing a free-willed consciousness as a deterrent against blameworthy behaviour tend to be the same activity viewed at different levels of abstraction. It's only in fairly complex grey areas that discrepancies become apparent.

Date: 2012-06-18 08:42 pm (UTC)
andrewducker: (Default)
From: [personal profile] andrewducker
"my consciousness is making free choices for which it might later be blamed"

I have no idea what this means. Seriously. It means nothing to me. Can you explain in other words that might work better for me?

Date: 2012-06-18 09:04 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
To link back to a different part of what I said up there: in a wishy-washy nebulous casual sense, if you fan a deck and ask me to "pick a card, any card", I can do so without consciously feeling any compulsion or constraint.

Similarly, if you show me someone standing at the edge of a high cliff I can exercise free will in whether or not to give them a shove.

Suppose I do. In the realm of the physical I can decide that was an irrational thing for me to have done because it's disadvantageous to live in a world where more people are shoved off cliffs, because people will shun me for it, or whatever. Also, society has deemed it illegal.

But then there's the empathic argument that I wouldn't like it if someone did that to me, therefore what I did was "bad".

You can't prove to me that the person standing on the cliff edge is anything more than an automaton, that it's capable of liking or disliking. Similarly, if I see you push someone over a cliff edge, I can't know you were a conscious being exhibiting free will.

If a conscious being makes a free decision to push someone off a cliff, they're to blame for it. If an automaton pushes someone off a cliff, no such concept pertains. That, to me, is the difference, and the nature of blame.

Date: 2012-06-18 09:08 pm (UTC)
andrewducker: (Default)
From: [personal profile] andrewducker
"I can do so without consciously feeling any compulsion or constraint."

So by "My consciousness is making free choices for which it might later be blamed" you mean "I don't feel constrained".

That sounds more to do with feelings than with the existence of anything. If you want to talk about whether you _feel_ free, then that seems to me to be in a different class to whether you _are_ free.

"You can't prove to me that the person standing on the cliff edge is anything more than an automaton, that it's capable of liking or disliking."

You seem to be saying that automatons cannot like things. Are you saying that "liking" is connected to whether our actions are deterministic?

Date: 2012-06-18 09:29 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
No! I'm saying that the debate about determinism versus quantum indeterminacy and the debate about simplicity versus chaos and emergence are physical debates, whereas there is a conscious/unconscious dichotomy that is metaphysical and not susceptible to physical analysis.

A conscious entity can feel, can like, can choose, can be blamed. An unconscious entity, an automaton, cannot.

To try and link the issue of whether or not I have free will with the issue of whether or not the universe is deterministic is level confusion.

Date: 2012-06-18 09:32 pm (UTC)
andrewducker: (Default)
From: [personal profile] andrewducker
A conscious entity can feel, can like, can choose, can be blamed. An unconscious entity, an automaton, cannot.

Nope, still not getting it - what has feeling and liking got to do with free will?

Date: 2012-06-18 09:55 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
That they are aspects of consciousness.

And, therefore, that they are all metaphysical. I know I do them, but I also know I can't prove that to you.

The best I can say, I have already said — that symmetry arguments suggest to me a good working assumption is that other humans are similarly conscious. (That reasoning breaks down in a way that is problematic when considering non-humans.)

Date: 2012-06-19 07:34 am (UTC)
andrewducker: (Default)
From: [personal profile] andrewducker
I don't believe that one has to be conscious of oneself to like things. One would have to be conscious to be aware that one liked things, of course.

And I don't see what consciousness has to do with free will, of course.

Date: 2012-06-19 08:14 am (UTC)
ptc24: (Default)
From: [personal profile] ptc24
I think there's a useful analogy (maybe it's more literal than that) with NP-completeness here. It may only be a coincidence that the N stands for "nondeterministic".

I think there are some good diagonalisation-style arguments for us having limited abilities to predict ourselves and our peers[1]; in order to imagine full prediction and still preserve determinism, you'd have to imagine some implausible coercive force that prevented us from acting on our predictions, like people in a consistent-history time-travel story who have seen the future and can't quite bring themselves to change it for some unfathomable reason. We can't transcend those limits on prediction by adding more computational power, because the extra computational power would make us harder to predict.

[1] If you could predict your peers, you could predict them predicting you.
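To make the diagonal construction concrete, here's a minimal sketch in Python. Everything in it is made up for illustration (the names, the two choices, the fixed guess); the only assumption is that the agent can run any would-be predictor of itself as an ordinary function.

```python
# A toy diagonalisation: an agent that consults a predictor about itself
# and then does the opposite of whatever the predictor says.

def contrary_agent(predictor):
    """Ask the predictor what contrary_agent will do, then do the other thing."""
    predicted = predictor(contrary_agent)
    return "refrain" if predicted == "act" else "act"

def some_predictor(agent):
    """Any predictor the agent can actually run and wait for. (A predictor that
    tried to simulate contrary_agent exactly would recurse forever rather than
    return an answer.)"""
    return "act"

print(some_predictor(contrary_agent))   # prediction: "act"
print(contrary_agent(some_predictor))   # actual choice: "refrain"
```

Whatever answer the predictor manages to give, the agent's actual behaviour differs from it; and throwing more computational power at the predictor just gives the agent more to diagonalise against.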

Date: 2012-06-19 10:46 am (UTC)
ptc24: (Default)
From: [personal profile] ptc24
It seems that interaction is required, too, to remove free will, if you try defining it this way. If they can predict you, they can manipulate you; if they're rational beings (according to some concepts of rationality) it's very hard for them not to manipulate you.

There might conceivably be free-will-preserving ethical codes for agents with strong knowledge, letting them interact with you without destroying your free will.

I once came up with this idea of playing Go vs God, and playing vs the devil (apparently in the Go scene it's the done thing to think about divine opponents, although possibly with a less Judeo-Christian concept of the divine). In the analogy, God will treat you with courtesy, playing as if you were a perfect being who could implement the minimax algorithm to the end of the game. The Devil, OTOH, would predict what you would actually do, and play trick moves in order to take advantage of your weaknesses. You'd need a stronger handicap against the Devil than against God.
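A toy sketch of that difference, for what it's worth. This isn't Go: the two-move game, the payoffs, and the model of the fallible player are all invented for illustration; the only point is that a courteous minimax opponent and an exploitative predicting opponent choose differently.

```python
# Opponent moves first, "you" reply, payoffs are from the opponent's point of view.
GAME = {
    "solid": {"correct": 0, "blunder": 1},   # sound move: gains little even if you slip
    "trick": {"correct": -2, "blunder": 5},  # trick move: refuted by correct play
}

def best_reply(opp_move):
    """A perfect player minimises the opponent's payoff."""
    return min(GAME[opp_move], key=GAME[opp_move].get)

def actual_reply(opp_move):
    """Invented model of your real, fallible play: you fall for the trick."""
    return "blunder" if opp_move == "trick" else "correct"

# "God" plays as if you will find the best reply; "the Devil" predicts what you
# will actually do and plays to exploit it.
god_move = max(GAME, key=lambda m: GAME[m][best_reply(m)])
devil_move = max(GAME, key=lambda m: GAME[m][actual_reply(m)])

print(god_move)    # "solid" -- trick moves lose against perfect play
print(devil_move)  # "trick" -- but win against the predicted, fallible you
```

Hence the stronger handicap against the Devil: the trick move only pays off because the opponent can predict you well enough to know you won't refute it.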

Date: 2012-06-19 03:04 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
And yet there's still that niggling metaphysical aspect of the situation. The bit that means questions like "what happens to us after we die" have instinctive meaning, even if they lack good answers.

Date: 2012-06-19 05:24 pm (UTC)
gerald_duck: (Default)
From: [personal profile] gerald_duck
Hmm.

Firstly, as mentioned passim, I personally treat "free will" as a wholly metaphysical concept, a manifestation of consciousness rather than of mind or brain. In that sense, I have free will.

Secondly, I feel that in mainstream philosophical thought that's always been the crux of free will, it's just that it often extends into (or at least informs) physical concepts of indeterminacy.

Thirdly, the definition of free will I use meshes well with metaphysical concepts of morality and culpability and places sharp focus on various questions surrounding tissue cultivation, cloning, genetic modification, animal intelligence, artificial intelligence, etc. Seeing the problem's not nearly as useful as seeing the solution, but it's at least a start.

I think I see why a lot of people are confused in this area, but I don't think I myself am.

Date: 2012-06-20 11:15 am (UTC)
simont: A picture of me in 2016 (Default)
From: [personal profile] simont
I think "I am conscious of having free will" is the bit I'm not sure of. People experience feeling something that we describe as "free will".

In a previous LJ discussion I used the following example of a subjective experience of free will, and the more I think about it, the more I think it's a good and illustrative one.

Scenario: your employer makes some demand of you that you're not very happy with. (Nothing morally wrong, just not very nice. Let's say, assigning you to spend months working on something rather more boring than what you thought you signed up for.) You grudgingly accept this demand, for whatever reason, but as you walk away you mutter (or think) to yourself "I don't have to put up with this sort of thing, you know, I could tell you to stuff your job."

That, it seems to me, is a quintessential 'experience of free will': a clear subjective awareness that your decision in some sense 'did not have' to go the way it did, but 'could have' gone another way.

But the interesting thing about it, to me, is that it's fairly clearly an experience of the compatibilist sense of free will. Because if I imagine myself in that situation and mentally mutter that sentence in more detail, I find it ends with "I could tell you to stuff your job if I felt strongly enough about it." That is, there's no implication that I might have made the same choice differently starting from the same mental state; I'm acknowledging that my choice was a function of my mental state, and what I'm saying is that in a different mental state I might have responded to the same situation by making a different choice. (And, in particular, that my employer should beware of putting me in this situation repeatedly, since my mental state might be less forgiving the next time.)

So all I'm saying, and feeling, with my 'experience of free will' in this example is that I'm not being externally coerced to the extent of being unable to act on that choice. (If I were forcibly enslaved rather than employed, I'd have muttered something very different under my breath.) And that's compatibilism!

If anyone has an example of what a subjective experience of free will might feel like that's not compatibilist in this way, I'd be interested to hear it.
Edited (it surely cannot be necessary to say 'here' this many times in one paragraph) Date: 2012-06-20 12:56 pm (UTC)

Date: 2012-06-25 10:21 am (UTC)
naath: (Default)
From: [personal profile] naath
I don't have free will; I blame people for their actions even though they don't have any either, because my blaming them is pre-determined.

I'm not sure I really see the problem there :-p