In this episode, Barrett and I talk about what we’ve learned from the eight amazing guests who have been a part of the series. My key takeaway list is way longer than this, but here are a few:
- The earth-shaking swirl of social media, AI, wearable tech, bio tech, etc. is not getting ready to happen — it’s happening.
- I wish we had been in a more stable, tethered place before the maelstrom hit. For me, it can feel like navigating the worst fun house — on my knees.
- The best way I can capture what I’ve come to understand about most social media platforms is “technology that exploits human vulnerability for commercial gain.” Not all platforms are created equal, but many use algorithms to tap into tender places so that we stay on long enough to be served ads. I like Dr. William Brady’s idea of algorithm transparency on all posts: “You’re seeing this because . . .”
- The interview with award-winning New York Times journalists Jennifer Valentino-DeVries and Michael H. Keller shines a light into some of the darkest places on social media and the influencer economy. It made me feel protective of young people, especially young girls, and it made me think about my own choices when I tell myself I can’t get off the platforms. At some point we all know social media is not good for us, yet we remain committed to the scroll.
- Dr. Joy Buolamwini, Dr. S. Craig Watkins, and Amy Webb taught me a ton about how easily AI can scale inequality and injustice. My new question is: Who is at the table when we build technology? Love the engineers and computational math folks, AND scoot over for the ethicists and people with diverse lived experiences who will be deeply impacted, the humanists, the activists . . .
- I felt a lot of hope around the good AI can do after talking to Lisa Gevelber. I especially love how over-burdened teachers can use AI to better meet students where they are with custom lesson plans, etc. I also loved learning about the pivotal role libraries play in closing the digital divide. PROTECT THE LIBRARIES AND LIBRARIANS!
- One thing that struck me is how each of these scholars and experts reframed AI. Dr. Watkins said it’s really “augmented intelligence” — because people are and should always be centered. Esther Perel said AI is often “artificial intimacy” — so powerful. And when it comes to the way our IP and many of our books are being used as data to feed these big models, Dr. Joy said, “It’s not generative, it’s regurgitated.”
There are many reasons I wanted to do this series, including my own curiosity and struggle. Another reason is the new research I’m doing to better understand what non-engineering skills are going to be required for daring leadership in this new environment. I’m probably 60–70% done with developing this theory. More to come!
For now I’ll say that the ability to productively manage uncertainty is going to be key. The hard part is that we’re absolutely not wired for uncertainty. During this entire series I kept thinking about Jon Kabat-Zinn’s definition of overwhelm: “That our lives are somehow unfolding faster than the human nervous system and psyche are able to manage well.”
Navigating uncertainty while caring for our minds, bodies, and souls is not a new challenge. But the velocity of change and the disembodied nature of tech may make it very different. I’m learning some very powerful and surprising tools from people who do that well — I can’t wait to share them with y’all! Until then, join me in some very serious deep breathing!