the US is unreal like girls cant wear shorts to school, you can literally lose your job for being gay, and unarmed black children are brutally murdered on the regular but old white ppl r still like “what a beautiful country. i can freely carry a gun for no reason and some of our mountains look like presidents. god bless”
it isn’t all like that. of course the media and other people will hype up the negative things about a place, bagging on it, to create attention and conversation. America has its perks too, don’t forget.
Nowhere on this earth is perfect. Nowhere is all good, nowhere is all bad.
I mean, do you really want to sit here and have a shit-talking competition over which country is the best? Because don’t even get me started on Australia, some European countries, or places in Asia.
I wish ignorant people would just keep their mouths shut.