Tuesday, September 5, 2023

What the Left Did to Our Country

Victor Davis Hanson
American Greatness

In the last 20 years, the Left has boasted that it has gained control of most of America's institutions of power and influence—the corporate boardroom, media, Silicon Valley, Wall Street, the administrative state, academia, foundations, social media, entertainment, professional sports, and Hollywood.

With such support, between 2009 and 2017, Barack Obama was empowered to transform the Democratic Party from its middle-class roots and class concerns into the party of the bicoastal rich and the subsidized poor—a party obsessed with big money, race, a new intolerant green religion, and dividing the country into a binary of oppressors and oppressed…
