The truth or a lie?
In your experience, do you think people like to live in denial? Or is the truth too painful for them to confront? Isn't living a lie also painful? Either way, we have to confront reality, right? Which is the better option for you?
How does it affect you as an individual when you're confronted with something you know, deep down, you're in denial about?
How does it make you feel when you see those around you living a lie?
I personally find living in the truth rewarding, because there's always something we can learn with our eyes wide open. It makes us stronger, right?