Is America Becoming a Godless Culture?

Many people believe the United States is becoming a "godless" culture. The argument is that we are losing our moral values because religion is no longer an important part of our lives. This is the view of most fundamentalist Christians and right-wing believers. I vehemently disagree with this argument, because there are many ways to practice religion. We have many ethnic populations in our country with many different religions, and therefore many beliefs that differ from Christianity and other monotheistic faiths. Our country is not becoming godless; there are simply many ways to practice one's faith.

Those who believe the United States is becoming a godless culture point to the fact that there is more violent crime, underage sex, immoral sex, and divorce in our country than ever before in history. Divorce happens in 50% of marriages, gays and lesbians are seeking marriage rights, young people receive sex education and encouragement to use condoms, and HIV is rampant. In this view, all these sins and abominations are caused by a godless culture: our morals are declining because there is nothing binding us to be "morally good" or to abide by the Bible. HIV is seen as a disease meant to rid our population of the undesirables, God striking down at us for our sinful ways. According to this view, those of us who do not believe in Jesus Christ as our savior will even be branded after the world ends in a fiery Armageddon. Fundamentalist Christians do not believe there is any way to practice faith and religion other than Christianity.

I do not believe our country is becoming a godless culture. One important aspect missing from the godless-culture theory is the fact that many different religions are practiced in this country every day. People who migrate to this country bring their own religions with them. Yes, they want to assimilate into our country, bu...