jack: (Default)
[personal profile] jack
Can you remember when you first learnt/understood that a negative number times a negative number gave a positive number?

Did you think that makes sense? Did you think it should be negative, or were you just not sure?

I'm sure I'd learnt the rule before then, but I remember working out an explanation for why it makes sense (imagine you give out numbers of merits or demerits to people; taking back a number of demerits is the same as giving that number of merits). Before then, I didn't see why it should be positive, though I don't think I thought it should be negative; I just didn't understand why.
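To put rough numbers on that (an illustrative choice of values, nothing more): count each demerit as -3 points, say, and count taking things back as giving a negative number of them. Taking back 2 demerits is then giving -2 of a -3-point thing, and (-2)*(-3) = +6, which is exactly the +6 you'd get from handing out two 3-point merits.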

Conversely, some people think the other way round; see the book Negative Math: How Mathematical Rules Can Be Positively Bent. I haven't read all of it, so I don't know if it's a good explanation of why and how mathematical rules are chosen, or if the author is smoking bad weed. It could be either.

I do know it made me queasy. He seems to think minus times minus giving minus is more obvious, and perhaps should be standard, and that mathematics is a conspiracy against this.

Date: 2006-09-15 01:10 pm (UTC)
liv: cartoon of me with long plait, teapot and purple outfit (Default)
From: [personal profile] liv
In my early teens, I think. I accepted it was a mathematical convention. It's similar, in my mind, to the idea that raising something to the power of 0 gives you 1. On its own, it seems kind of arbitrary, but it makes the whole system behave in a consistent way. Representing complex numbers using vectors for the real and imaginary parts retrospectively made the multiplying negative numbers thing seem more solidly sensible rather than just something that mathematicians had all agreed on for no reason. But yeah, the idea of taking away negatives being equivalent to adding positives is a simpler and probably better explanation.
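(Roughly: as a vector, multiplying by a negative real number is a stretch plus a rotation by 180 degrees, so multiplying by two negatives rotates by 360 degrees in total and you land back pointing in the positive direction, e.g. (-1)*(-1) = 1.)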

simpler and probably better explanation.

Date: 2006-09-15 01:25 pm (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
I'd be inclined to say I've thought of four *sorts* of justification for adding a mathematical definition:

(1) It extends mathematical laws consistently
(2) The most common interpretations remain true
(3) It produces useful results
(4) Everyone else does it

Having more of them is definitely better, though sometimes just one is enough. Here, it makes arithmetic consistent, makes bookkeeping continue to work, allows you to solve equations, and is universal, so -*-=+ is backed up pretty well.
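For (1) in particular, the usual sketch goes through the distributive law: 0 = (-1)*0 = (-1)*(1 + (-1)) = (-1)*1 + (-1)*(-1) = -1 + (-1)*(-1), so (-1)*(-1) is forced to be +1 if distributivity is to keep working.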

x^0=1 is similar. 0^0=1 is dodgier, because you have to choose to extend *either* 0^x=0 *or* x^0=1 *or* 0=/=1 but *not* all three :)
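(Spelled out: extending 0^x=0 down to x=0 would give 0^0=0, extending x^0=1 would give 0^0=1, and keeping both would force 0=1.)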

Re: simpler and probably better explanation.

Date: 2006-09-15 02:52 pm (UTC)
From: [identity profile] ilanin.livejournal.com
I've always considered x^0=1 to be synonymous with the statement x/x = 1, which is probably because that's how I was taught it.
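Spelled out, that's x^0 = x^(1-1) = x^1/x^1 = x/x = 1, which of course only works for x =/= 0, tying back into the 0^0 awkwardness above.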