A standard warning is to avoid basing your actions on something being true simply because you want it to be true, not because you have any reason to think it is. For example, everyone agrees that "The ship can't be sinking, that would be terrible" is understandable if there's nothing you can do about it, but catastrophic if you're in charge of planning for lifeboats. "There must be meaning in the universe, because otherwise it would be terrible" sounds very seductive, but it isn't conclusive, for exactly the same reason!
However, a related but subtly different case is something you want to believe because the effects of believing it are good. If believing X makes you or other people act in good ways, you are massively incentivised to want to believe X, to hope X is true, and even to avoid any doubts about X, even if they're reasonable.
I think this is understandable and possibly wise, even though it leads to a sort of double-think: maintaining the illusion that X, while simultaneously evaluating the truth of X and investigating alternatives to X, without ever admitting that's what you're doing.
However, I had a startling moment when I realised that many supposed rationalists (including me) tend to believe, like an article of faith, that it's better to make decisions based on true information. Yes, I think that's generally true. But if, hypothetically, I presented a celebrated atheist rationalist with evidence convincing in their eyes that believing a popular religion actually WAS much more beneficial to humanity, would they go ahead and change their mind? In almost all cases, probably not.
That sounded understandable, but I realised it was exactly the same process many people use to justify a belief in heaven: true or not, it's desirable that it be true, and it's beneficial to believe it, so let's not rock the boat. Putting those two next to each other made me suddenly very uncertain.