Tuesday, August 16, 2016

Was America founded on Christian principles?

This question is more complex than it may at first appear, and historians have mixed opinions as to whether the answer is "yes" or "no."

On the one hand, most of the people who came to America to establish the original Thirteen Colonies came from England, a monarchy whose king or queen was the head of a state Christian church, the Church of England. Almost all of the original colonists identified as Christians and came from Christian backgrounds. Early state governments encouraged Christianity, and most of the original universities in the United States were founded as Christian institutions. There is also a clear reference to a Creator in the Declaration of Independence:


We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.


However, the United States Constitution, the document that is the foundation of the American government, establishes a separation of church and state. It makes no provision for a government-sponsored church, and Article VI explicitly prohibits any religious test for holding federal office. Additionally, the First Amendment guarantees the free exercise of religion and does not restrict that freedom to Christianity. It is also worth noting that even if we consider the freedoms guaranteed by the Bill of Rights to be based on Christian principles, at the time of the country's founding those freedoms applied in practice only to adult white males, not to women, black people, Native Americans, or other minorities.
Sources:

https://www.cnn.com/2015/07/02/living/america-christian-nation/index.html

https://www.heritage.org/political-process/report/did-america-have-christian-founding

