Sunday, August 25, 2019

Was the United States ever part of the British Empire?

The American colonies were a vital part of the British Empire, supplying Britain with both people and goods. After the Revolutionary War, however, the United States ceased to be part of the empire in the way other colonies remained; while Britain's other colonies and later dominions came to its aid in various wars, the United States did not.
Commercially, the United States was treated as a favored former member of the British Empire, and the two countries traded freely after the American Revolution with few exceptions. In the buildup to the War of 1812, Britain impressed American sailors whom it claimed were deserters from British ships; this practice helped push the United States to reassert its independence in the War of 1812. The United States and Britain jointly occupied the Oregon Territory until diplomats divided it during the Polk administration, with the portion south of the 49th parallel going to the United States. Members of the British royal family visited the United States during the reign of Queen Victoria to much American fanfare. Although the United States was a member of the British Empire only during its colonial days, many Americans still feel close ties to the mother country.
