Is America a "Christian nation"?

What I mean by this is: if you are not Christian, are you not American? For example, if you are Muslim and live in America, are you at odds with traditional American values (as if America suddenly had an official religion)?


Answers (5)

0
Marvele
Yes, but not in the sense you have described. America (as we know it) was built on Christian values and by Christian people in power, and Christianity remains one of the main religions in America. Other than that, though, no, I don't think so.
on February 23
0
BasicCactus
No. America is very diverse, but there are still lots of people who are racist or think their religion is the only valid one. Lots of people in the U.S. are also irreligious. It doesn't matter which religion it is; this happens to all of them, even Christians.

But I believe that everyone has the right to think what they want.
on February 21
0
DarkFlame1000
In America you can follow whatever religion you want. I think most religious people here are Christian, so it can seem that way, but no, America is not solely a Christian nation.
on February 21
0
neefflove
I don't think anyone says that... well, no educated person does.
on February 21
-2
Whisker_Queer
No, America isn't about religion, but some people are letting Christians overthrow the government.
on February 21