When did politics become culture in the United States?

Posted by T.Collins Logan on
I think “politics becoming culture” — in the sense of supplanting culture — has been with the U.S. since its inception. The self-liberation from British tyranny was chiefly a cultural act, clothed in a loud and flashy political veneer. And although, on and off over the decades, the U.S. has developed various regional subcultures that outshine its political currents for a time, these subcultures ultimately intersect and merge with the political narrative. U.S. politics is like the Borg in this regard. Perhaps this is a feature of democracy itself — or of a democracy that has always been steeped in media, commercialism, and commoditization — where so much can be determined at a national level. Everything ultimately becomes political, because politics impacts nearly everything in our lives.
My 2 cents.