Thursday, August 10, 2017

How has the term "American" changed from the founding of the nation through today?

Generally, the term has expanded to mean "a citizen or inhabitant of the United States of America." At the nation's founding, people did not commonly refer to themselves as "Americans"; before then, the word usually denoted Native Americans. After the Revolution, it appeared in formal documents and in politicians' speeches, but most citizens imagined themselves in far more local terms and seldom had cause to describe themselves this way. Even with the rise of nationalist sentiment after the War of 1812, the term was used more often by foreign observers (e.g., Alexis de Tocqueville) to describe the country's people than by Americans themselves.
In the pre-Civil War era, nativists used the term to refer to white, non-Irish Americans like themselves, and like-minded people used it for similar reasons in the aftermath of World War I. Today, the word's use as a demonym has become controversial because it tends to reflect American exceptionalism; after all, there are many other countries on the North and South American continents.
https://slate.com/news-and-politics/2013/08/america-the-continent-vs-america-the-country.html

