The Theft Of America’s Soul: Blowing The Lid Off The Lies That Are Destroying Our Country

    In a country with so many diverse people, beliefs, and ideas, it is hard to find common ground. Most people would agree that America is a country built on freedom, liberty, and justice for all.

    America has been historically known for its morality, ethics, and integrity in all areas of life: social, spiritual, and political.

    However, in recent years there has been a serious decline in American morality. Not only is there less respect for life in general, but also less respect for one another and for our country as a whole.

    There has been an increase in lies being spread throughout the country that are tearing apart the very fabric of America. These lies are being used to gain power and control over the people by those who want to see the nation ruined.

    This article examines some of the most pervasive lies being told in America today and how they are destroying the nation.

    The lies about guns

    Guns have been a part of American history since the nation’s founding. Yes, there have been gun regulations in the past, but never has this country experienced the kind of anti-gun fervor that it does now.

    Today, it seems like every other person is carrying a firearm. You can’t go to a restaurant or bar without seeing someone with a gun case on their back checking a pistol at the door.

    Why is this? How did guns become so prevalent?

    The main reason lies in the hands of politicians. In recent years, legislators have introduced bill after bill designed to restrict gun ownership and confiscate existing firearms. This has led many people to purchase more firearms before they can be taken away, much as civilians in Nazi-occupied Europe did during World War II.

    The lie about climate change

    In the past few decades, a lie has overtaken large parts of the country. This isn’t just any lie, but a whopper of historic proportions.

    It’s not just that vast numbers of people believe this lie. It’s that they’ve made it part of their identity. They’re convinced that those who reject it are morally inferior and, worse, are threats to the common good and to the nation itself.

    This lie is about climate change. More specifically, it’s that human activity is responsible for global warming and climate disruption.

    That’s why it’s not enough to say “I believe in climate change.” You have to go further and declare: “I believe in human-caused climate change.” Otherwise, you could be publicly shamed as a heretic or a traitor.

    Clearly, there’s something very powerful pushing people to insist on this point: It’s an integral part of their identity.

    The lie about capitalism

    The mainstream narrative about capitalism is that it’s a system that rewards hard work and investment, and that poor people are kept poor by systemic bias.

    This is a lie.

    In fact, the richest members of our society have gotten rich not by investing in good ideas or by working harder, but by exploiting loopholes in the system and getting subsidies from the government.

    Many of them have also gotten rich at the expense of poor people, both American and foreign. And most of them have gotten rich not by creating new value, but by grabbing what other people have created and selling it at a higher price.

    Capitalism is fundamentally flawed because its incentives don’t always reward good actions. In fact, they often reward bad ones, like taking money from taxpayers to give to CEOs, in the name of helping companies “compete.”

    The lie about socialism

    In America, the word “socialist” has become synonymous with evil tyranny, dictatorship, and oppression.

    This couldn’t be further from the truth. In fact, American history is filled with examples of socialist policies and reforms that have made our country a better place for everyone.

    American socialists believe in equal rights and opportunities for all people. They believe that wealth and income should be distributed fairly among all people. And they believe that the government should play a role in regulating business and protecting citizens from corporate greed.

    There are many forms of socialism, but these basic principles hold true for most socialists. Of course, plenty of trolls on Twitter use the label “socialist” as a smear, but most real-life socialists I’ve met just want to make society better for everyone.

    The truth is that America has always been influenced by socialist thought — and it’s time we recognized it.

    The lie about immigrants

    In the minds of too many Americans, immigrants are all supposed to look, talk, and act a certain way. They’re supposed to come from a specific country or region, and they’re supposed to conform to certain stereotypes about what kind of work they do and what their values are.

    That’s why so many people were surprised when a young woman from Afghanistan was awarded citizenship after she passed a citizenship test. The video of her being awarded citizenship went viral, with millions of views.

    What people didn’t realize was that she received special consideration because of her circumstances as a refugee. She wasn’t given citizenship simply because she passed a test; what she had been through in her life was taken into account.

    The average immigrant doesn’t get the same treatment, and that’s part of the problem. People think all immigrants are bad people when most of them aren’t — they’re just average people trying to make ends meet and provide for their families.

    The lie about religion

    One of the biggest lies our country tells is that religion is a good thing: that faith in God and adherence to religious doctrine are good for individuals and for society.

    But in fact, religion is a destructive force that has divided us and kept us from solving our greatest challenges as a nation and as a society.

    And yet, despite all the evidence to the contrary, we continue to promote religion as an institution and even give it special privileges in our government.

    This has to stop, and I’m going to tell you why. But first, let’s take a quick look at the role religion has played – and continues to play – in America.

    The lies that have been told about religion can be divided into two broad categories: what it is and what it does. Let’s take a closer look at both of these lies.

    The lies about history

    There are many lies about history that Americans believe. Many of these lies are used to divide people and promote stereotypes.

    They serve to promote a particular political ideology or agenda, and they can sometimes be dangerous.

    Some of these lies include the beliefs that white people invented technology, that white people invented democracy, and that America has been a free country for everyone since its founding. All of these are distorted views of history.

    America is often viewed as an entirely white country, which is false. America is a richly diverse country with many ethnic groups, each with its own culture and history. Many immigrants came to America to find freedom, which is one of the reasons it became such a wealthy country.

    The problem is that when people don’t learn about America’s rich heritage and diversity, they begin to believe these lies.

    The lies about culture

    There are a number of lies about American culture that are being pushed by the Left. Many of them have to do with race, and they’re designed to stir up fear and resentment between groups.

    For example, the Left continuously pushes the notion that white people – particularly white men – are seeking to oppress people of color through systemic racism and discrimination.

    This is categorically false, as statistics on income, employment, and education from trustworthy sources show.

    The truth is that people of color have better opportunities in America than they do in their native countries because of the wealth this country provides. It is why there is such a high influx of immigrants every year: America is the land of opportunity!

    The other lies about culture have to do with sexual morality. The Left very strongly advocates for what they call sexual liberation: sex without commitment or restrictions; sex for purely carnal pleasure; sex that often involves partners who are not necessarily compatible or communicative; and sex that sometimes involves substances that dull one’s senses or judgment.
