The Moral And Cultural Decline Of America?

With the recent tragic shooting in Nashville, Tennessee, in which a transgender activist killed faculty and students at a Christian school, a new survey conducted by The Wall Street Journal offers an indication as to why such events, along with other tragedies like the plight of the January 6th prisoners, are occurring.

The results of this survey demonstrate the importance of fighting the culture wars, which the political Right has been lax in engaging but the Left is determined to win. It is one thing to spend time discussing taxes and economics; it is quite another to neglect the conscience, or soul, of a nation.

The spread of woke ideology through many of America’s cultural institutions was a planned effort that has paid the Left dividends, since outlets like the media, entertainment, the courts, and even corporations were, by and large, counted on to uphold or even reinforce the values of the United States. That is no longer the case, especially given the numerous instances of leftist terrorists not being prosecuted, or receiving kid-glove treatment, for threatening, assaulting, intimidating, and even murdering their opponents.

Patriotism, religion, and family life have been frowned upon, especially since the COVID-19 pandemic. American history, along with Western Civilization overall, is condemned. What happened during the pandemic has even robbed many Americans of the desire for community involvement. All of these things help define the kind of country we live in, and that is a shame.

The results of the survey are deeply concerning, and, hopefully, it is not too late for these trends to be reversed.