I’d like to hear opinions: is it better to always be honest, or is it sometimes better to outright lie? I’m asking about people in your circle; with strangers, it’s obviously sometimes easier to just make something up.
The reason I’m asking is that the truth sometimes hurts.
My take is that as long as I don’t use insulting words or unnecessarily harsh language, it’s not my fault if what I’m saying hurts other people, provided it’s the truth. In the long run, telling the truth is always the best option.
A third option is to not bring up the matter of concern at all. Is that a bad solution as well?
What have your experiences in life been with this so far? And am I wrong in my opinion?
You have to lie to protect yourself in an authoritarian world.