Every time I read news about the USA, it becomes harder to believe the stories could be true. There are so many bad things happening; surely such a wealthy and powerful country could do something about all that injustice. It can't all be real. It must be propaganda designed to make them look bad.
But whose propaganda? The obvious candidate would be Russia. However, it seems American mega-corporations control most of the media. Why would they paint such a grim picture of their own country? Then again, capital doesn't really care as long as it sells. Or maybe the Russians are behind it after all, pulling the strings of these companies behind the scenes, drawing attention away from their war. Could it really be, for example, that Americans kill each other with guns about 53 times per day on average (2020 numbers), while the Russian army is barely keeping up with those numbers in Ukraine?
It's hard to tell what's true anymore, when everything you see could be a tampered image crafted for some sinister purpose. What can you trust but your own first-hand experiences?