Why is the United States so Christian?
Let's face it: most of the "believers" who come here bellowing about how Christian they are are obviously American. Yet when do we actually see any of this Christianity? It doesn't appear to be reflected in our culture. Popular music, movies, literature, etc. contain almost no mention of Christianity. We live in the most rapaciously capitalist society on earth. No universal healthcare; even children aren't covered. Workers' rights are virtually nonexistent. We're not even legally entitled to paid holidays. We're the most heavily armed country on earth (odd for a country with "In God We Trust" written on its money; maybe we should change it to "In Nukes We Trust"). I could make a list as long as your arm, but I'm sure you get the picture.
So what is it that you think makes America Christian, besides saying that it is?
American Christians generally believe that the country was founded on Judeo-Christian principles. This is somewhat true, in that the country was settled by Christians, ideas from Christian philosophers were woven into our Constitution, and modern capitalism grew out of the Protestant work ethic. Then again, it's pretty clear that America's founders favored secularism and intentionally set up a secular system of government, so much of this is wishful thinking on the part of American Christians.
The majority of Americans are Christian, and many of them are fond of the I Say So Rule ("America is Christian because we say so!"). American politicians often harp on Christianity to get elected. Christian special interest groups are very outspoken and get a lot of press. Americans like the idea of being Christian, so that they don't have to act Christian. These all combine to make "Christian" and "American" synonymous.
So how does this strike most of you so-called Christians? Is this about in line with what most of you believe? So I ask you again: why is the United States so Christian, in YOUR own words? Or for that matter, why are YOU a Christian?