Padawanbater2
This is a question I've been asking religious people recently. I'm sure you guys will pick up on the not-so-subtle undertones, but I'm wondering: what is more important to you, knowing the truth, even if it might be terrible, or believing something is true just because it makes you feel good?
Discuss.