For those here in, say, their 60s or 70s: when you were in your 30s or 40s, did you have the feeling that the world was a fucked up place? So much has been going on since I entered adulthood in the early 2000s, and I feel like it's getting more and more intense. It's never ending.
Is it unique? Or has it always been this way?


You’ve been told a racist lie. Native Americans – especially those in the forested parts of the continent – had plenty of agriculture, and at its peak in the 12th century, Cahokia (the largest city we know of north of Mesoamerica) may have had a larger population than London or Paris did at the time.
What actually happened was that the natives caught Old World diseases from the earliest explorers and colonists, which set off a continent-wide pandemic so virulent that, by the time European settlers really started showing up in earnest a few decades later, something like 90–99% of the native population was dead and their towns had been reclaimed by nature.