Is being free the opposite of being in control?

Control is a well-defined and understood concept in American culture. There's physical control, of course, but the more interesting Americanism is the idea that we "control our own destiny," whatever that fluffy phrase means to each of us.

Freedom is America, and yet we all have our own definition of what it means. For some it means not being controlled by others; for others, it means being in control yourself. It has physical, emotional, and spiritual connotations.

It's nebulous and changes over time, and yet it is the foundation of our country.

Perhaps America is like a company that forgot about its brand strategy division. What words embody American culture, and what do these words mean to each of us?

How do they unify us? What's the consistent thread of these core concepts that runs through every American? How does this make us empathetic toward one another, and how can we use this empathy to increase the kindness we share?

Brandon Cohn