Was The United States Ever A Christian Nation?

I came across this on Kurt Willems’ blog, The Pangea Blog, and thought it was relevant to my recent post The End of Christian America?

Perhaps America never was a Christian nation in the first place… thoughts?

