jack: (Default)
In a manful effort to remember which is which, I looked these words up *again*.

It looks like "synecdoche" means using a part to represent the whole, eg. "how many heads" in a herd of cattle, or "how many bums" in a theatre, or "nice wheels" referring to a whole car. But it is also used for the reverse, using a whole to represent a part, eg. "what does Brussels think" referring to the European parliament.

I couldn't tell why the second meaning was included: whether the first meaning came first and people then started using it both ways round, or something else. Nor whether only the first meaning is "correct" and the second is a mistake, or whether both are equally accepted.

Apparently "metonymy" means "using a closely related concept to represent a thing". Eg. using "suits" for "lawyers" or "businesspeople", or "the pen is mightier than the sword" to mean "the written word is mightier than force of arms".

So the real difference between "synecdoche" and "metonymy" is different history and connotations, which I don't really understand. But in terms of literal meaning, the only difference is "using a part to represent the whole" vs "using one concept to represent another".

But, obviously, human pattern matching means that if synecdoche is mostly used in the "part for a whole" sense, then "whole for a part" becomes the most common use of metonymy, even if metonymy could be used for other things.

Can anyone fill in the gaps here?
jack: (Default)
http://blog.plover.com/math/partial-function.html

Mark Dominus writes that Ranjit Bhatnagar proposed the notion of "non-standard adjectives". I'm sure I've talked about this before, but I can't find the link.

The idea is that a "big diamond" is a sort of diamond, but a "fake diamond" is not a diamond -- it's something like a diamond, but isn't one. Likewise, something which is "definitely complete" is complete, but something which is "partially complete" is not complete. A non-standard adjective signals that the noun applies in some way other than the usual one.

An example that came up recently was "nearly unique". I still stand by "very unique" for the normal English meaning of "unique in more ways, or out of a bigger set" (I agree it's meaningless if the set in which it's unique is specified, but that's really rare in normal language rather than maths). But I said that "nearly unique" should be acceptable to everyone, in that it means something that isn't unique, but is close to it.

However, I've a feeling I said something recently that came across as wrong, or misleading, or hurtful, and I wanted to quote this explanation to say that that's not what I meant, but I can't remember what it was. Sorry :(
jack: (Default)
I thought this was settled in my mind, but now I've looked up the plural of "octopus" on the internet again, and this time it says Linnaeus made the word up, we can use whatever plural we like, and "octopi" has been common since the beginning even if it wasn't grammatically faithful at the time...?

Maybe we should ban saying any of "octopuses" "octopi" and "octopodes" are any more correct than each other and let people use whichever they like??
jack: (Default)
My impression of "or" in English is that it can mean either "inclusive or" or "exclusive or" depending on context.

Example #1:

"Are you now or have you ever been a member of a communist party?"
"Would you like milk or sugar in your tea?"
"Don't trust someone if they're incompetent or malicious."

Obviously someone who IS NOW and also HAS BEEN IN THE PAST is supposed to answer "yes", not "no". You're allowed to ask for milk AND sugar. Incompetent malicious people are not automatically trustworthy.

Example #2:

"Would you like dinner now or would you like to freshen up first?"
"Eat in or take away?"
"Would you like the free gift or the cash equivalent?"

It would make no sense to ask for both.

And some of those questions have exactly the same form, it's just that you're supposed to know from context that having both is normal, or having both isn't being offered, or isn't possible.

However, I know some people say "in English, 'or' means 'inclusive or', even if sometimes common sense/politeness/physics stops you having both options at once" and some people say "in English, 'or' means 'exclusive or' but people sometimes use it sloppily and we know what they mean".

However, I can't really see the difference between those and "it can mean either depending on context". Is there any evidence that one of those is a superior description of the "default" interpretation (not just which is more common, but which meaning people understand you really intend if you emphasise the word)?
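For what it's worth, the two candidate readings correspond to two standard logical connectives, which disagree only in the case where both halves are true. A quick sketch (my own illustration; the function names are mine, not standard terminology):

```python
# The two logical readings of "A or B". They agree everywhere except
# when A and B are both true.
def inclusive_or(a, b):
    return a or b   # true if at least one of a, b holds

def exclusive_or(a, b):
    return a != b   # true if exactly one of a, b holds

print(inclusive_or(True, True))   # True: "now AND in the past" still answers yes
print(exclusive_or(True, True))   # False: "eat in AND take away" isn't on offer
```

The debate is then whether English "or" defaults to the first function, the second, or neither.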
jack: (Default)
If you know that "PIN" stands for "Personal Identification Number", saying "PIN number" naturally sounds exactly as nonsensical as saying "personal identification number number".

However, I disagree with that assessment.

I'm not exactly sure why, but some possible reasons include:

1. I don't think "PIN" really means "personal identification number" any more. I think it refers to a new, specific complete concept. Something like "a low-entropy password used in conjunction with another means of identification". I think a PIN made of four letters would still be a lot more like a "PIN" than a "PIS". It's a fallacy to assume that ALL words automatically mean whatever they meant when they were first used, even if another meaning has been standard for 1200 years. PIN is much more recent, but if the new meaning is prevalent and useful and unambiguous, I don't think it's useful to cling to the old meaning.

2. "PIN" is short, and could be misheard. In the specific case of "please enter your PIN", no-one ever says anything else, so it doesn't matter; but in general I find that saying a one-syllable word very loudly doesn't clear up any possible confusion, whereas "PIN number" is completely clear.

3. Being pedantic and knowledgeable about etymology is a stylistic choice, one I'm naturally at home with. But I think it's a mistake to think it's automatically better than learning language by hearing people use it. Geeks are often marginalised in many circumstances, and assert their identity with this sort of grammar pedantry (which is a good thing). But it can lead to thinking that people who are not pedantic about grammar are less good, and feed into class prejudice that people who say "10 items or less" are somehow bad people :(

4. I'm stubborn. I express pedantry by choosing usages and using them specifically[1], even if other people disagree. I think there's nothing wrong with splitting infinitives[2], so I do so, even if the people I'm disagreeing with are themselves persecuted grammar pedants.

5. Avoiding redundancy isn't the be-all and end-all of language. I'm often non-redundant because I think it's funny, or because compscis habitually express things with precision and little redundancy[3]. But language is naturally redundant for good reason. There's no reason to strip redundancy out unless ABSOLUTELY NECESSARY, when it's often harmless and often useful. Clarity and unambiguity are more important.

But I'm not sure I'm right. Are there reasons I'm missing to be more pedantic?

[1] I also make mistakes, especially with spelling.
[2] Not doing so is a legitimate stylistic choice, even if I don't think it always makes sentences sound better, but not one that's BETTER than mine.
[3] Eg "Did you receive this email?" "Mu"
jack: (Default)
Some sentences are ambiguous without a serial comma. Eg. "To my parents, Ayn Rand and God."

Some sentences are ambiguous with a serial comma. Eg. "To my mother, Ayn Rand, and God."

I imagine there are cases where knowing that the author always uses the same style makes it not ambiguous (or makes it ambiguous?) but I can't be bothered to think of any.

My question is: why do people smugly insist on promulgating cartoons that say "I prefer X; Y has a humorous misinterpretation in at least one circumstance; THEREFORE I WIN HAHAHAHAHA!"? I hate it when people say every side has pros and cons with NO effort to weight them, but I hate it EVEN MORE when people assume that every opinion with a flaw is automatically worthless and all their own opinions are uniformly perfect. I just hadn't noticed how often it happens.

Am I missing something? Is there some reason the ambiguities of one style don't count, or are so incredibly rare it's sensible to just ignore them and viciously mock anyone who notices them?

Postscript

For the record, I'm not sure what I personally do, I don't care much. I think I usually do without, but use commas if the list includes "A and B" or "A or B" in the middle, and rephrase or use semicolons or separate sentences if necessary.
jack: (Default)
http://languagelog.ldc.upenn.edu/nll/?p=4655#more-4655

Apparently, it's now become common to refer to an individual piece of electronic mail as "an email"</irony>. And to an individual piece of spam as "a spam". And to an individual blog post as "a blog".

I'm not sure how to find peace between my inner descriptivist who says "duh, obviously" and my inner prescriptivist who says "AGH! MY EYES! KILL IT WITH FIRE FROM ORBIT! IT BURNS! IT BURNS! ZE GOGGLES, ZEY DO NOTHING!"
jack: (Default)
The horse raced past; the barn fell.
The horse, raced past the barn, fell.
The horse raced past the barn demonic.
The horse raced past the cleared highland (the one with the barn on, not the other one).
The horse raced past the barn, Fell.
jack: (Default)
Many of us grow up with rules that may later become outdated. When is it better to stick with them, and when is it better to abandon them?

1. If it doesn't take any time to think about now you've absorbed the habit, and it's not obviously harmful, just keep ignoring the question of whether it's outdated.

2. If it DOES take time, or it COULD be harmful, or if it increasingly generates friction with other people who have the opposite instinct, it may be worth considering if it's still valid or not and making a positive distinction to stick with the rule, or abandon it.

I find it satisfying to decide, and lose a nagging sense that I'm "not quite doing the right thing" whatever I do. For instance, IIRC, two friends had a disagreement about whether lightbulbs on dimmer switches still drew maximum power when they were dimmed (or maybe it was whether to turn the light off when they left the room, or something else). But rather than go on nagging each other and attributing the different habits to intransigence rather than the same lingering sense of obligation to a different internalised rule, they looked up the answer. (IIRC, dimmer switches used to waste the energy, now they don't.)

3. Does it provide a small potential benefit? Then you might as well keep it.

If you always do small to medium arithmetic problems in your head, then it's probably worth keeping the habit, because sometimes it's useful, and if you're good at it, it's as reliable as using a calculator (just with different sorts of errors).

If you don't, it's probably not worth learning. Everyone would benefit from being able to do "6x7", or "20% of $150" or "to the nearest hundred, what's 27*54" in their head. But the benefits of calculating "67*43" or "345/72 to 2dp" are small. It's easy to learn, and learning it is nice, but it's probably not actually going to save you much time if you normally have a computer handy.

4. Does it provide an illusory benefit that's likely to make you keep the rule even at the expense of clarity? This is the difficult one, but it's best to let the rule go.

In the old days, computers did integer arithmetic much faster than floating point arithmetic. On a modern computer, this isn't really true any more. Using integers is still often useful, because you can guarantee exact answers, but if what you really mean is floating point numbers, you should understand and use floating point numbers.
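A minimal illustration of the exactness point (my example, not from the original post), assuming standard IEEE-754 doubles:

```python
# 0.1 has no exact binary representation, so repeated addition drifts:
tenths = sum(0.1 for _ in range(10))
print(tenths == 1.0)   # False: accumulated rounding error

# Working in integers (eg. pence instead of pounds) stays exact:
pence = sum(10 for _ in range(10))
print(pence == 100)    # True
```

So the surviving reason to prefer integers is exactness, not speed.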

Now, software engineers are having similar introspection about more modern rules. For instance, most people still keep to rules of thumb that may be invalid when computers have large memory caches.

The difficulty is deciding whether the benefit is real or not. If you have a lot of effort invested in something, it's always tempting to think the rule is a benefit, even if a small one. And the knowledge is always good, even if not immediately useful. But sometimes you need to decide whether you should keep doing it or not.

Controversially, I think this category includes mixing greek and latin roots. What benefit does avoiding this provide? The benefit is that if there were people in Britain who spoke ancient greek or latin, but didn't even know the common English-adopted prefixes of the other, they could understand some non-mixed words. But there are no such people. It's still somewhat elegant to keep the borrowings separate, because it compartmentalises words, which is NEATER whether or not it's ever used. But this has to be balanced against the benefit of mixing roots, which is that it often gives useful words. Now, while I appreciate both benefits, I think the usefulness of words like "television" definitely outweighs the other.

It's possible there's some other benefit to keeping the old rule, but if you want to say that, I think you have to know what it is. It's no longer enough to simply say "it's correct", "it's normal", or "arbitrary rules are good because they let us measure people's progress in learning"; you have to make a positive decision about what you are gaining.
jack: (Default)
Valid normal English usage that differs from technical vocabulary: a higher form of pedantry

Often the same normal English word has one (or more) specific technical uses. Either because the technical term came first, and was adopted for analogous (but not necessarily identical) lay uses. Or, at least as commonly, because the English (or other vernacular) word came first, and it was adopted for technical use (for instance, in maths the meaning of "circle" is more specific than the English use, but no adult, not even the most proprietary maladjusted mathematician, objects to the normal English use).

In the rare specific case where previously the technical term was the only usage, and it had a specific meaning, which is distorted by how some people use it, then I think people used to the technical term are correct to request it be used correctly. Obviously language evolves naturally, and it's normally best to accept how it does so, but it's also correct to participate in the change, and sometimes to make value judgements. For instance, "light-year" means "the distance light travels in a year in a vacuum[1]". Some people use it to mean "a really long time". Even if this is understandable and comprehensible, I think it's fair to say that it's wrong, and language is more useful and more clear if we DON'T do this.

However, in most of the other cases, when the lay use is a normal part of English, and especially where the lay use of the word came first, I don't think people have any justification whatsoever for insisting only the technical definition is correct.

Some people are surprised that I think this. "You're pedantic" they say, "surely you approve of all strict rules[2], correct or not?" Well, no. I don't. Firstly, pedantry is not, actually, the highest calling in my life :) Secondly, denying that circle does (and should) have a normal English usage is incorrect -- and the one thing that is death to successful pedantry is being incorrect. Appreciating a complex, intricate and subtle interaction of different and sometimes implicit rules with different meanings, and applying them ALL correctly, is a HIGHER form of pedantry.

[1] Although the official definition may vary depending on which of the units we have defined fundamentally, I think this is the _core_ meaning, and people are correct to use it that way whichever of the time, distance, and speed units are not fundamental.

[2] The lust for rules

I was somewhat glib about people always wanting to find underlying rules. In fact, it's a very human tendency. People -- from all walks of life -- are prone to seizing on some distinction and assuming it's "correct".

This urge is very natural, and very healthy insofar as it lets us form useful generalisations.

However, it has the unfortunate side-effect of making us insist there are rules where there aren't.

People insist that momentous happenings (eg. assassinations) must have complicated causes, not just chance.

People want to believe that there is an international committee that decides how the number of horse's legs touching the ground in a statue relates to the manner of death of the rider. They believe it so strongly they're shocked to see statues that don't follow it.

People want to assume that "E. E. Cummings" should be lowercased because he lowercased many other things but (if I recall my information correctly) he didn't lowercase his name.

People want to assume that you should not split an infinitive. People can become very hostile on this point, because they feel it's the primary bastion in the skirmish for correctness. People have been very, very hostile to Star Trek for saying "to boldly go". I'm sure Star Trek didn't think it through very far. But as it happens, there is no benefit to society in avoiding split infinitives (except as a very crude sort of exclusionary test, giving people arbitrary hoops to see who learns them best), and while some educated people say not to, the people who know the most about it (copyeditors and linguists) approve of it.

I do it myself. When I first learnt that a fox was a "vulpes" and a cube was a "cuboid", I insisted on the technical meaning so hard I effectively denied that "fox" and "cube" were equally correct descriptions. However, in my defence, I was about eight. It's good insofar as it represents learning, but bad insofar as it's taken as a universal truth.

So inferring rules is natural. It's just that sometimes it's wrong even though it's natural.

"Technically, that's not true"

But what really gets me is when someone applies "technically" to a sentence which contains no technical definitions of words. If I use "A or B" in the normal English meaning[1], and someone says "technically that means only A or B, but not both", then they're WRONG because they incorrectly assumed people use logical-or when actually they use normal-English-or. But at least their sentence has specific (even if false) meaning.

But sometimes people prepend "technical" to a sentence to make a distinction that is -- as far as I can tell -- completely non-existent. And it annoys me because I can't see how it can EVER be correct.

[1] which is inclusive or exclusive depending on context
jack: (Default)
Via MegaMole, http://megamole.livejournal.com/641176.html,
http://www.telegraph.co.uk/technology/news/6995354/Sarcasm-punctuation-mark-aims-to-put-an-end-to-email-confusion.html, a company proposes a proprietary punctuation mark to indicate sarcasm. It's like the comedy buffet of the decade.
jack: (Default)
Would it be worth having an "unlikely" mark for a spelling checker? For those occasions when it's entirely possible you did mean "licens (n, ceremonial greek fireplace)"[1], but it's overwhelmingly more likely you meant the common word "license"?

It would have to be visually represented as not requiring you to get rid of it, but merely drawing your attention to it, if that's possible. On the other hand, you could argue that's what spelling checkers ought to do anyway, although they are not normally treated like that. At least "not in dictionary" is a clear judgement, even if the writing is correct (a grammar "checker's" results really are all suggestions).

Come to think of it, maybe there's an uncomfortable parallel with compiler warnings?
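As a sketch of how an "unlikely" mark might work (entirely my own toy construction; the word list and frequency counts are made up): flag a word that is in the dictionary, but that has a one-edit neighbour which is vastly more frequent.

```python
# Toy "unlikely word" checker: flag a valid word when an edit-distance-1
# neighbour is vastly more common. FREQ is a made-up frequency table.
FREQ = {"license": 100_000, "licens": 2}  # hypothetical counts

def edits1(word):
    """All strings one deletion, insertion, or substitution away."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    inserts = [a + c + b for a, b in splits for c in letters]
    subs = [a + c + b[1:] for a, b in splits if b for c in letters]
    return set(deletes + inserts + subs)

def unlikely(word, ratio=1000):
    """Word is in the dictionary, but a close neighbour is >ratio x commoner."""
    if word not in FREQ:
        return False  # a plain misspelling, handled the normal way
    best = max((FREQ.get(w, 0) for w in edits1(word)), default=0)
    return best > FREQ[word] * ratio

print(unlikely("licens"))   # True: "license" is overwhelmingly likelier
```

The point of `ratio` is that the mark only fires when the alternative is overwhelmingly likelier, so it would nag far less often than a plain "not in dictionary".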

[1] Warning, not an actual definition.
[2] I think perhaps the "one-liner" tag, as misleading as it has become, represents the nearest I ever come to a single contained thought, rather than a fifteen interrelated ones.

Pronouns

May. 30th, 2008 02:22 pm
jack: (Default)
"When John was a woman, [he/she/they] said '...' " Which pronoun do you prefer? (That is, "he" is appropriate for John now, "she" would be appropriate for what John was then, and "they" would specify the ambiguity.)

"The things God or Jesus [was/were] recorded as saying are ..." Which pronoun do you prefer? (That is, do you treat them as two separate people (were)? Or one person (was)? :))

Obviously both are arbitrary, and I think both sufficiently specialised that most people wouldn't mind which you used, I just wondered if anyone had a strong opinion :)
jack: (Default)
I'm sure you all know the history of gender-neutral pronouns. And most people think the question is mostly settled, although they don't agree on what it's settled in favour of :)

However, it occurs to me some reluctance might come from the fact that although I have a little voice in my head saying "Women and men are the same. Gender neutral is good" I have a great big klaxon blaring "ALL INFORMATION IS GOOD! LEARN THINGS! BE INFORMED! COMMUNICATE FULLY! INF. ORM. ATION. GOOD." :)

That is, apart from not being aesthetically fond of most of the choices of gender-neutral pronouns, I'm not fond of the fact that the word choice is deliberately less informative. If you're talking about a genuinely neutral (eg. hypothetical) or ambiguous person, or you don't know, there's no information lost, but I still only use the pronouns where I have good reason.

But today a friend made another reference to the concept of "Geek as gender" and something occurred to me so obvious I couldn't believe it hadn't before.

What if we had two or more pronouns that drew *different* demarcations? We already have special pronouns for royalty and gods. ("Her Royal Highness's" etc and "His" etc).

You could adopt the archaic second-person model and have "te" (pronounced with a long e), "tis" and "ter" and "ve", "vis" and "ver" for intimate acquaintances and others. Or for social acquaintances and work acquaintances.

Or have different pronouns for different groups, which people can adopt as whatever they feel like identifying as in a certain context. (Of course, you shouldn't identify solely as one thing, but most people are happy to identify as one thing primarily but others as well.) Perhaps two sets would be most common ("he" and "she" or some other division), but someone might borrow the Sindarin or Quenya pronouns from Tolkien and use them when affectionately referring to people from the Tolkien society.

Of course, now we near the Chinese problem of having too many, and having to decide when meeting someone whether to use the very formal or the extremely formal version of their pronoun.

But on the other hand, it seems more positive, as choosing to use such a pronoun doesn't sound like "my gender isn't important to me" but "this other aspect of our acquaintance is more important". And if you have a good reason to use other pronouns, it's not so jarring when someone does.

I'm afraid I haven't thought this out in detail, but I thought it was a lovely idea.

Italics

Feb. 22nd, 2008 11:14 am
jack: (Default)
I'm fairly sure I use bold, italic and *starred* in subtly different ways. I'm often fascinated by subtle distinctions like these, such as the subtleties of choosing between two related words when translating.

The thing is, I can't put my finger on what the differences might be. (The nearest I've come is observing that starring can star two consecutive words separately, or as part of a phrase, which is occasionally useful.) Does anyone recognise a difference in themselves?

I was reminded of this by the idea in html that you have a semantic tag, eg. em for emphasis, and a mapping from that to display, where the mapping can be overridden. Which is definitely the right way, but not yet universal. And part of the reason I'm slow in adopting it is that, having acquired subtle distinctions, I don't like losing them, even if it would make sense. After all, if I can't explain them, I can't persuade anyone to make a tag for them :)

A tag for citation, a common usage of italic, makes sense, as often that is represented in a specific way on different web-pages, or in different ways in a piece of text (eg. in theory, nested cite tags might helpfully do different things). And I can certainly live with only one form of textual emphasis; I use none at all in formal writing. But I don't like to :) And I have a nagging feeling that if I write em, then someone who hasn't configured their web browser might see it as bold[1], and think I was shouting, whereas I'd only meant italic, and it's quite different :)

[1] That is a difference, that bold stands out of the page a lot more, whereas italic doesn't. So both serve some function. I am confused with google chat because it renders starred text as bold, and I use stars both for actions (*hug*) which should be bold and emphasis (I *did* say that) which shouldn't, at least in my writing :)
jack: (Default)
Simon and I were discussing, amongst other things, a term for non-mutants in X-Men.

(If you don't know: in X-Men, there's an X-gene which gives people various invariably useful, if often inconvenient, special powers. These people are called Mutants, and everyone else is referred to as Human by exclusion, although "human" would by any sane definition include both groups.)

Peter won, in my opinion, with a suggestion in the pub last night: "wild type", which in biology means "members of a species not having interesting mutations" (very roughly; someone give a more precise definition below).

But it got me thinking. What do the following terms all have in common:

Human (as in non-mutant)
Carnivore/Omnivore (as in non-vegetarian)
Neurotypical (as in non-autistic)
Heteronormative
Cis (as opposed to trans- or trans-gender)
Atheist

They all define everyone apart from members of a specific group, and hence don't really have any cohesion within themselves. And so the terms can be used literally, but most are generally used with either a grin or a sneer, admitting non-X doesn't just mean non-X, but "what I find annoying about non-X people, particularly their opinions of X people" and "let's see how they like being labelled". To Magneto, "human" is an insult.

I don't know if it's relevant, but I think no-one ever means vegans when they say "non-vegetarians" :) (So a term meaning sometimes-meat-eating is actually more accurate.)

"Wild type", apart from sounding a hell of a lot cooler than "human" and lacking existing prejudice, seems to do a nice job of describing a default state, without implying anything about it as a whole. Of course, it's probably too obscure a term to catch on, but I like it.

I know sometimes it can be difficult to decide which is a group and which isn't. For instance, traditionally religion-X might consider people not of religion X to have more in common than not (and to some extent be right, if religion X is true). But I was enchanted by the analogy between atheist and neurotypical, etc. I'm sure it says something (though I'm not yet sure what).
jack: (Default)
Vocabulary quiz: Free Rice

This vocabulary test is really great. Freerice, unlike a traditional vocab test, has a large dictionary and measures the hardness of words by how many people get them right. This means it's amazingly good at finding words right on the boundary of what you know: ones you've heard used somewhere but can't quite place, rather than words anyone knows, or common long words, or words so obscure only people in specialist fields know them, or words simply famous for being obscure.

This gives an unfair advantage in the quiz to people who know words from having heard them used in books and conversations with articulate people, rather than people who read dictionaries for fun :) So it suits me.

The questions are tailored to your responses, after a few minutes finding their own level at a category of words you know 3/4 of. I could play all day.
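The level-finding behaviour described above can be sketched as a "1-up / 3-down" staircase (an assumption on my part, not Freerice's published algorithm): difficulty rises one step on a right answer and falls three on a wrong one, so it settles where you answer about 3/4 correctly (at equilibrium p x 1 = (1 - p) x 3, i.e. p = 0.75).

```python
# Toy adaptive staircase: +1 level when right, -3 levels when wrong.
# At equilibrium the expected movement is zero, which happens exactly
# when the player answers 75% of questions correctly.
def next_level(level, correct, step=1):
    return level + step if correct else level - 3 * step

# Answering 3 of 4 correctly leaves the level where it started:
level = 20
for answer in [True, True, False, True]:
    level = next_level(level, answer)
print(level)  # 20
```

This is why such quizzes feel like they home in on words you half-know: the mechanism actively hunts for the difficulty where you're right about three times in four.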

Also, it (apparently) raises advertising revenue, which it donates to charity.

This morning, I got up to level 45/50 before starting to get them wrong. What do you get?

Geography quiz: Statetris

There've been many geography quizzes going round that ask if you can place countries in Europe, but this one is so cool because it's Tetris!

Each country falls from the north, and you have to manoeuvre it into position. There are some clues: for instance, it has to be one of the ones on the bottom.

In the easy level the country is labelled. In intermediate, it isn't. In hard, it needs to be rotated to the correct orientation.

It's really satisfying because when you've completed all-but-one, the last country is Russia, which is really big, and really obvious, so it's such a release to be able to plonk it down with a thunderous crash to finish the level.

Tipping

Oct. 2nd, 2007 01:59 pm
jack: (Default)
Are there any standards of journalistic integrity? Should there be? Lies about people (eg. so-and-so is a fraudster) are covered by libel laws. Dangerous lies (eg. "giant asteroid heading for earth, loot now!") by various laws.

But plain old lies? Obviously it would be a disaster[1] to require people to prove every statement they make; freedom of speech would evaporate instantly. But should it be possible to prevent people spreading lies that don't immediately harm anyone?

What about textbooks asserting that atoms are indivisible? Printing lies about someone whose reputation is already ruined? Printing lies about history?


Would an acceptable compromise be not to *ban* people printing unpalatable lies, but allow, under some circumstances, an injunction making them mark it "government certified lie"? Then, if you believe the reasoning, you can trust the source anyway. But if you don't know, you are alerted to be doubtful.

That rings true both for so-called "government certified lies" that I think are false -- eg. holocaust denial -- and those I think are true -- eg. evolution.

[1] Pun.
[2] Someone should fill me in on the details. I know enough that "CABAL" may have been used an amusing acronym, but was apparently a pre-existing word first, and most acronyms weren't. But enough not to report either as truth without checking some sources.
jack: (Default)
[1]: I wasn't sure if I could get away with "conundra" but it felt right in context. I looked it up on etymonline (was it you, pippa, or you, vyvyan who linked me to that? either way I love both of you :)), and found:

1596, Oxford University slang for "pedant," also "whim," etc., later (1790) "riddle, puzzle," also spelled quonundrum; the sort of ponderous pseudo-Latin word that was once the height of humor in learned circles.

So I think I'm justified in using a plural as pretentious as I can contrive, hah :)
jack: (Default)
Have I asked this before? I know I was *going* to ask it.

How do you use "gross" and "net"? In the context of tax and weight, they are well defined, meaning "before tax" and "after tax", and "with packaging and lorries" and "without" respectively respectively.

I had gained the impression that "net" meant "resultant", and correspondingly assumed "gross" meant "before modifications".

And then I saw the weight example, and was told that "gross" simply meant the larger, the one with the extras, and "net" the one without.

Then I saw someone describe the weight example from the point of view of the people wanting the end product, when "resultant" would be a good description after all.

Etymologically it seems "gross" came from "big" and "net" came from "neat" (in latin). I'm not sure of their later path.

I'm sure I've heard "net effect" to mean "resultant effect, the effect remaining when everything else has cancelled out" and want to use it in that sense, but is that a valid usage?

I couldn't find it discussed anywhere.
