So Native Americans lived here before white people got here.
Then whites came in and settled.
They brought black people over well before the U.S. even existed as a nation.
As the nation expanded, it took over land that belonged to Mexico and was populated by Mexicans.
I really want someone to ask these people when the nation as we now know it was ever a white nation. There is literally no point in the nation's entire history where only white people lived in or contributed to it.
Any black person who puts the blame for the stamping out of black progress more on left-wing policies than on the rise of the KKK and similar groups and the systemic racism put in place should be slapped.