I think Americans are raised to lack confidence, even though we all value it. It's that thing we all want to touch but are taught not to.

Confidence: the feeling or belief that one can rely on someone or something; firm trust.

Is this really what Americans instill in one another? Culturally, we all start out hearing something more along the lines of 'ask me before you do this,' 'when you're older you'll understand,' 'listen to your boss,' 'you're not supposed to do that.'

Inevitably, we all feel an internal struggle to flip the switch: to lead by example, be intrepid, do what we love, be fearless. Sometimes we associate this with 'finding ourselves,' but I wonder if, more broadly, it's one of those American qualities that we all sort of move toward.

Americanism should be more than what's on your passport. If there are certain qualities that we all value as Americans, shouldn't the moments we start to embody them be recognized as especially important?

Shouldn't they be identified as 'American' and spoken about in a way that unifies all Americans?

Brandon Cohn