Discussion about this post

Paul Lyren:

Kyla, your approach to this question of modern economics is just genius. Like Jimmy, meet people where they are. Making this understandable and actionable is crazy difficult to be sure, but we have to start with understanding it. I wish every member of Congress, every Senator, and really everyone would/could read this objectively (as much as anyone really can) and see that this problem of our own making is also fixable, if we can simply sit down, admit it is a problem, break it down into its component parts, and start fixing them. My biggest takeaway is a certain level of hope that we actually can confront this next wave of Anthropocene evolution and do it smart. It puts me in mind of the Amish. They are not Luddites, as many people think; they just consider which technologies and practices they want to let into their community. We need a similar approach and should recognize that components of AI are truly amazing (medical, space, virology - targeted disciplines), but unfortunately we are confronted with 10 different bots and options every day that no one really wants, plus the combined specter of it taking your job. Keep it up and thanks!

Ben Ahmetagic:

Many of your newsletters have certainly influenced this: the more I've read and thought about these problems, the more I keep coming back to the solution needing to start with tuning down algorithmically curated content. We are all SO mad, all the time, in fight-or-flight response every time we open our phones, because the algorithms know anger drives engagement and makes us FEEL, even if we feel bad. Feeling bad also uses so much bandwidth. I remember a time when I would open social media (then chronologically sorted), scroll until I saw a post I had already seen, and then close my phone and move on. I used to think I was "caught up," like you were done reading the paper for the day. We underestimate the addiction of scrolling to the next thing. The timeline for when things really started falling apart coincides pretty well with the point where you stopped having quite as much say in what came across your desk (the algorithms know what you like better than you do, after all). Ironically, that is when serious echo-chamber formation actually accelerated, and it has compounded at an alarming rate ever since.

