[personal profile] jack
I recently read a parable about a physicist who had just discovered Newton's laws of motion, and excitedly ran to the tower of Pisa to drop a hammer and a feather off it, proclaiming that both would hit the ground in exactly the time he had calculated. And all the locals scoffed, and said they were sure that hammers fall faster than feathers.

That didn't literally happen, but things like it happen all the time. Who was right? Well, both. In most circumstances, the locals had experience (of air resistance) that the physicist had forgotten or not noticed, and they predicted the right answer. But if you tried it on the moon[1], or almost anywhere else in the universe, the non-physicists wouldn't know the answer or would get it wrong.

You need both experience and theory to make predictions. If you're dealing with something new, you need theory to tell you what to expect. But when you have the luxury of trying something multiple times, experience can often tell you something about the specific conditions that you wouldn't have thought of.

My question is, does the same apply to moral questions? Is the right answer that usually you need virtue-based ethics ("practice being compassionate") or rule-based ethics ("don't kill") because they work, but you need to periodically re-evaluate your virtues or rules with a consequentialist metric to see if there's anything big you're missing ("I know we're all used to not thinking about it, but has anyone else noticed that slavery is evil?")?

Is there a standard name for that?

[1] http://www.youtube.com/watch?v=5C5_dOEyAfk
