Is America a "Christian nation"?

What I mean by this is: if you are not Christian, are you not American? And if you are, say, Muslim and live in America, does that mean you don't share traditional American values (because suddenly America has a religion?)


Answers (8)

0
phanpy
It used to be, yes, hence the "one nation, under God" line in our pledge. We now have freedom of religion, but not necessarily freedom from religion.
on February 19, 2018
0
Huisfle
"Freedom of religion," remember? Jeez, what sort of memories do you Americans have?
on February 19, 2018
    All_hail_Melon_King (in reply)
    The British said Americans were a Christian country. Not us.
    on March 03, 2018
0
moemoehearts
It's really not anymore.
on February 19, 2018
0
All_hail_Melon_King
It's a diverse nation: Jews, Christians, etc.
on February 18, 2018
0
Marvele
Yes, but not in the sense you have expanded upon. America (the way we know it) was built upon Christian values and by Christian people in power, and Christianity is one of the main religions in America. But other than that, no, I don't think so.
on February 23, 2016
0
BasicCactus
No. America is very diverse, but there are still lots of people who are racist or think their religion is the only religion. Lots of people in the U.S. are sacrilegious. It doesn't matter what kind of religion it is; it happens to everyone, even Christians.

But I believe that everyone has the right to think what they want.
on February 21, 2016
0
DarkFlame1000
In America you can have whatever religion you want. I think most religious people here are Christian, so it can seem that way, but no, America is not solely a Christian nation.
on February 21, 2016
-2
phannie
No, America isn't about religion, though some people are letting Christians dominate the government.
on February 21, 2016