jack: (Default)
[personal profile] jack
Can you remember when you first learnt/understood that a negative number times a negative number gave a positive number?

Did you think that made sense? Did you think it should be negative, or were you just not sure?

I'm sure I'd learnt the rule before then, but I remember working out an explanation for why it makes sense (imagine you give out numbers of merits or demerits to people; taking back a number of demerits is the same as giving that number of merits). Before then, I didn't see why it should be positive, though I don't think I thought it should be negative; I just didn't understand why.
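(To make that concrete, with numbers of my own choosing: taking back 3 demerits on 4 separate occasions leaves someone 12 merits better off, i.e. (-4)*(-3) = +12.)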

Conversely, some people think the other way round; see Negative Math: How Mathematical Rules Can Be Positively Bent. I haven't read all of his book, so I don't know if it's a good explanation of why and how mathematical rules are chosen, or if he's smoking bad weed. It could be either.

I do know it made me queasy. He seems to think minus times minus giving minus is more obvious, and perhaps should be standard, and that mathematics is a conspiracy against this.

For the record

Date: 2006-09-15 11:23 am (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
1. It might confuse most people too much, but the idea that mathematical rules are somewhat arbitrary and chosen to be useful is a great lesson, necessary for training mathematicians, and I agree it should be mentioned to children.

2. There are many reasons I like the normal way. It makes balance sheets calculable. It makes algebraic rules most consistent.

3. But I don't know if another way could be useful for some things or not. Apparently he constructs an algebra such as he describes. I thought I'd worked out what it must be, but now I'm not sure. I think addition works the same way and the positive numbers work the same way, but something nasty happens to distributivity with negative numbers (else you get contradictions).
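(A quick sketch of where I think the contradiction comes from, my own working rather than his: keep distributivity, additive inverses and 1 as the multiplicative identity, and you get 0 = (-1)*0 = (-1)*(1 + (-1)) = (-1)*1 + (-1)*(-1) = -1 + (-1)*(-1), which forces (-1)*(-1) = +1. So an algebra with (-1)*(-1) = -1 has to drop one of those ingredients, and distributivity over negative numbers looks like the natural casualty.)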

Date: 2006-09-15 11:59 am (UTC)
From: [identity profile] cornute.livejournal.com
My teacher explained it this way.

You're a photographer and a friend of yours asks for your help with some photos. She's losing 3 pounds a week and wants a pictorial record, one picture a week.

After a few weeks, she gets discouraged, and says "I don't feel like anything has changed!"

"Look!" you say, "Here are the last 5 weeks' pictures. If we go through them backwards, week to week, we can see you gain 15 pounds!*"

This is perfectly intuitive in a story, and it was then that the teacher explained that it demonstrates "negative 3 times negative 5 equals positive 15."
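(Putting the story into symbols, my framing rather than the teacher's: change = rate * time = (-3 pounds per week) * (-5 weeks) = +15 pounds. Stepping backwards through the pictures makes the time negative, and the weight change comes out positive.)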

*this line is given for the purposes of mathematics only. Do not actually use this line on any friends who are attempting to lose weight.

Date: 2006-09-15 01:01 pm (UTC)
ext_8103: (Default)
From: [identity profile] ewx.livejournal.com

Although I agree you can come up with systems where (-1)(-1)=-1 (the two-element field, for instance), that's arguably cheating, as not all the symbols mean what the naive reader thinks they mean.

His argument in the excerpt on Amazon that there's no everyday basis for negative numbers seems pretty hopeless to me. Sure, you can't take apples from an empty box, but anyone who's ever lent or borrowed money has implicitly found an everyday interpretation of negative numbers.

(I read in Wikipedia's article on complex numbers that negative numbers weren't considered to be on a sound footing in the 17th century. Perhaps they should have asked a moneylender...)

Negative bases are fun, allowing you to write both positive and negative numbers without having to use signs.
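A minimal sketch of that in Python, using base -2 (the function name and the choice of base are mine, purely for illustration):

```python
def to_negabinary(n: int) -> str:
    """Write an integer in base -2; every integer, positive or negative, gets a sign-free digit string."""
    if n == 0:
        return "0"
    digits = []
    while n != 0:
        n, r = divmod(n, -2)
        if r < 0:        # divmod can hand back a negative remainder for a negative base,
            r += 2       # so shift it into {0, 1} and compensate in the quotient
            n += 1
        digits.append(str(r))
    return "".join(reversed(digits))
```

So to_negabinary(7) gives "11011" (16 - 8 - 2 + 1) and to_negabinary(-2) gives "10" (-2 + 0), neither needing a minus sign.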

Date: 2006-09-15 01:10 pm (UTC)
liv: cartoon of me with long plait, teapot and purple outfit (Default)
From: [personal profile] liv
In my early teens, I think. I accepted it was a mathematical convention. It's similar, in my mind, to the idea that raising something to the power of 0 gives you 1. On its own, it seems kind of arbitrary, but it makes the whole system behave in a consistent way. Representing complex numbers using vectors for the real and imaginary parts retrospectively made the multiplying-negative-numbers thing seem more solidly sensible, rather than just something that mathematicians had all agreed on for no reason. But yeah, the idea of taking away negatives being equivalent to adding positives is a simpler and probably better explanation.
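(One way to spell that out, my gloss on the vector picture: multiplying by -1 is a half-turn of the complex plane, so doing it twice is a full turn, i.e. (-1)*(-1) = e^(i*pi) * e^(i*pi) = e^(2*i*pi) = 1.)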

Date: 2006-09-15 01:19 pm (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
His argument in the excerpt on Amazon that there's no everyday basis for negative numbers seems pretty hopeless to me.

Oh yes! However, despite doubts, I haven't managed to rule out that there might be an algebra which has most of the properties of normal algebra, has (-1)(-1)=-1, and has some different interpretation in the real world which is more useful in *some* situation.

Negative bases are fun, allowing you to write both positive and negative numbers without having to use signs.

Oh yes.

simpler and probably better explanation.

Date: 2006-09-15 01:25 pm (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
I'd be inclined to say I've thought of four *sorts* of justification for adding a mathematical definition:

(1) It extends mathematical laws consistently
(2) The most common interpretations remain true
(3) It produces useful results
(4) Everyone else does it

Having more of them is definitely better, though sometimes just one is enough. Here, it makes arithmetic consistent, makes bookkeeping continue to work, allows you to solve equations, and is universal, so -*-=+ is backed up pretty well.

x^0=1 is similar. 0^0=1 is dodgier, because you can keep *either* 0^x=0 *or* x^0=1 *or* 0=/=1, but *not* all three :)
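(Spelling the clash out: if 0^0 obeyed both rules at once, you'd get 0 = 0^0 = 1, so one of the three has to go.)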

Date: 2006-09-15 01:31 pm (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
Ahem. That is, what I think I was trying to say is that he definitely seems to have an agenda, and it seems unlikely that such an algebra could have any decent interpretation.

However, I haven't quite proved that anything with those properties collapses under its own weight, and I feel obliged to reserve judgement, because sometimes throwing out an axiom produces something that seems daft but turns out to be useful in some other way, e.g. non-Euclidean geometry, quaternions, or infinitesimals.

Re: simpler and probably better explanation.

Date: 2006-09-15 02:52 pm (UTC)
From: [identity profile] ilanin.livejournal.com
I've always considered x^0=1 to be synonymous with the statement x/x = 1, which is probably because that's how I was taught it.
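(Making that explicit: x^0 = x^(1-1) = x^1 / x^1 = x/x = 1, at least for x != 0.)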