Race relations are becoming increasingly important in our society. Despite this growing importance, the question "Is our society racist?" is still commonly debated. After investigating this subject, it is evident to me that American society is racist.

There are various definitions of both racism and society, and it is important to define these terms clearly when addressing such a controversial and emotional issue. Merriam-Webster defines 'racism' as "a belief that race is the primary determinant of human traits and capacities and that racial differences produce an inherent superiority of a particular race" or as "racial prejudice or discrimination." The latter of these two definitions is the one I will use as my working definition. Merriam-Webster defines the second term, 'society,' as "a community, nation, or broad grouping of people having common traditions, institutions, and collective activities and interests." My working definition for this paper, however, is "the American population as a whole." When a topic as substantial as racism in society is discussed, all Americans must be taken into consideration.

Under these definitions, racial influences have played a large part in shaping the America of today. Racism, as defined above, has existed since American independence in 1776. In fact, racism was accepted and often treated as a normal way of life. An obvious sign of this was the growth of slavery in the new country. Violence against black slaves was only one part of the racism occurring in America; there was also heavy fighting between those seeking to expand America's territory and the Native Americans already settled on those lands. Hundreds of thousands of people died because of racial hostility between these groups. Racism continued into the mid-1860s