Historically the U.S. has been a "white" country, and I think it was founded as such. I don't think the founding fathers would have imagined in their wildest dreams that the U.S. would eventually become the diverse, multiracial country it is now. In fact, they would have opposed it: according to some early American historians, one of the reasons the founding fathers and early presidents were isolationists, opposed to expanding the U.S. into other territories, was that they didn't want to absorb "lesser" beings. Now, 200 years later, America has accepted that all men are equal, but it wasn't until the Civil Rights movement that this was more widely embraced. It wasn't until the 1980s or so that American culture changed and it became unacceptable to hold the views of the forefathers. That's also when America started to change demographically. At what point, in your view, did America stop being a "white" nation?