Been seeing a lot about how the government passes shitty laws, lots of mass shootings, and health care that's expensive as hell. I come from a developing nation and we were always told how America is great and whatnot. Are all states in America bad?
Furthermore, what part of the country are you living in that leads you to believe we are less racist than other countries!? Our racism has defined our country ever since it was created.
Seriously, I'm curious what part of the country you live in. It must be sheltered from most American realities.
If you think I'm denying racism, you read that comment incredibly wrong. I'm saying in comparison to other countries.
Your point was that we’re better because we talk about it.
All over the country, legislatures are banning books and curricula that even mention racism. These aren't isolated incidents, either.
I agree with you on the exaggeration of racism.
I don't think you bring valid points as to why the United States is better than it's portrayed, but I nonetheless believe so myself.
I look Asian. I experienced more racism traveling through Europe for 6 weeks than I have living 15 years in the US (primarily Atlanta and Jacksonville).