
Explore every episode of Your Undivided Attention

Dive into the complete episode list for Your Undivided Attention. Each episode is cataloged with detailed descriptions, making it easy to find and explore specific topics. Keep track of all episodes from your favorite podcast and never miss a moment of insightful content.


Pub. Date | Title | Duration
10 Nov 2022 | Spotlight — Humane Technology on '60 Minutes' | 00:12:05

The weekly American news show 60 Minutes recently invited Center for Humane Technology co-founder Tristan Harris back to discuss political polarization and the anger and incivility that get elevated on social media as a matter of corporate profit. We're releasing a special episode of Your Undivided Attention this week to dig further into the nuances of this complex problem.

CHT’s work was introduced to the world by Anderson Cooper on 60 Minutes back in 2017, and we’re honored to have been invited back. In this new interview, we cover the business model of competing for engagement at all costs: the real root of the problem, which we’re thrilled to be able to discuss on such a far-reaching platform.

We also bust the myth that if you’re not on social media, you don’t need to be concerned. Even if you stay off these platforms, you likely live in a country whose elections turn on other people’s collective choices and behaviors, and we know that media shapes the people who consume it.
 

CORRECTION: 

  • Tristan notes that Facebook's Head of Global Policy, Monika Bickert, says in the interview that social media can't be the root of America's anger because it's people over the age of 60 who are most polarized. She actually said that people over the age of 65 are most polarized.


RECOMMENDED MEDIA

60 Minutes: “Social Media and Political Polarization in America”

https://humanetech.com/60minutes

Amusing Ourselves to Death by Neil Postman

https://www.penguinrandomhouse.com/books/297276/amusing-ourselves-to-death-by-neil-postman/

Neil Postman’s groundbreaking book about the damaging effects of television on our politics and public discourse has been hailed as a twenty-first-century book published in the twentieth century.

60 Minutes: “Brain Hacking”

https://www.youtube.com/watch?v=awAMTQZmvPE

RECOMMENDED YUA EPISODES 

Elon, Twitter, and the Gladiator Arena

https://www.humanetech.com/podcast/elon-twitter-and-the-gladiator-arena

Addressing the TikTok Threat

https://www.humanetech.com/podcast/bonus-addressing-the-tiktok-threat

What is Civil War In The Digital Age? With Barbara F Walter

https://www.humanetech.com/podcast/50-what-is-civil-war-in-the-digital-age

01 Feb 2024 | Taylor Swift is Not Alone: The Deepfake Nightmare Sweeping the Internet | 00:42:59

Over the past year, a tsunami of apps that digitally strip the clothes off real people has hit the market. Now anyone can create fake non-consensual sexual images in just a few clicks. With cases proliferating in high schools, guest presenter Laurie Segall talks to legal scholar Mary Anne Franks about the AI-enabled rise in deepfake porn and what we can do about it.

Correction: Laurie refers to the app 'Clothes Off.' It’s actually named Clothoff. There are many clothes remover apps in this category.

RECOMMENDED MEDIA 

Revenge Porn: The Cyberwar Against Women

In a five-part digital series, Laurie Segall uncovers a disturbing internet trend: the rise of revenge porn

The Cult of the Constitution

In this provocative book, Mary Anne Franks examines the thin line between constitutional fidelity and constitutional fundamentalism

Fake Explicit Taylor Swift Images Swamp Social Media

Calls to protect women and crack down on the platforms and technology that spread such images have been reignited

RECOMMENDED YUA EPISODES 

No One is Immune to AI Harms

Esther Perel on Artificial Intimacy

Social Media Victims Lawyer Up

The AI Dilemma

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

22 Jun 2020 | The Bully’s Pulpit — with Fadi Quran | 00:55:53

The sound of bullies on social media can be deafening, but what about their victims? “They're just sitting there being pummeled and pummeled and pummeled,” says Fadi Quran. As the campaign director of Avaaz, a platform for 62 million activists worldwide, Fadi and his team go to great lengths to figure out exactly how social media is being weaponized against vulnerable communities, including those who have no voice online at all. “They can't report it. They’re not online,” Fadi says. “They can't even have a conversation about it.” But by bringing these voices of survivors to Silicon Valley, Fadi says, tech companies can not only hear the lethal consequences of algorithmic abuse, but also start hacking away at a system that he argues was “designed for bullies.”

26 Sep 2024 | ‘We Have to Get It Right’: Gary Marcus On Untamed AI | 00:41:43

It’s a confusing moment in AI. Depending on who you ask, we’re either on the fast track to AI that’s smarter than most humans, or the technology is about to hit a wall. Gary Marcus is in the latter camp. He’s a cognitive psychologist and computer scientist who built his own successful AI start-up. But he’s also been called AI’s loudest critic.

On Your Undivided Attention this week, Gary sits down with CHT Executive Director Daniel Barcay to defend his skepticism of generative AI and to discuss what we need to do as a society to get the rollout of this technology right… which is the focus of his new book, Taming Silicon Valley: How We Can Ensure That AI Works for Us.

The bottom line: No matter how quickly AI progresses, Gary argues that our society is woefully unprepared for the risks that will come from the AI we already have.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

 

RECOMMENDED MEDIA

Link to Gary’s book: Taming Silicon Valley: How We Can Ensure That AI Works for Us

Further reading on the deepfake of the CEO of India's National Stock Exchange

Further reading on the deepfake of an explosion near the Pentagon.

The study Gary cited on AI and false memories.

Footage from Gary and Sam Altman’s Senate testimony.

 

RECOMMENDED YUA EPISODES

Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn

Taylor Swift is Not Alone: The Deepfake Nightmare Sweeping the Internet

No One is Immune to AI Harms with Dr. Joy Buolamwini

 

Correction: Gary mistakenly listed the reliability of GPS systems as 98%. The federal government’s standard for GPS reliability is 95%.

24 Feb 2022 | The Invisible Influence of Language — with Lera Boroditsky | 00:40:19

One of the oldest technologies we have is language. How do the words we use influence the way we think?

The media can talk about immigrants scurrying across the border, versus immigrants crossing the border. Or we might hear about technology platforms censoring us, versus moderating content. 

If those word choices shift public opinion on immigration or technology by 25%, or even 2%, then we’ve been influenced in ways we can't even see. Which means that becoming aware of how words shape the way we think can help inoculate us from their undue influence. And further, consciously choosing or even designing the words we use can help us think in more complex ways – and address our most complex challenges.

This week on Your Undivided Attention, we're grateful to have Lera Boroditsky, a cognitive scientist who studies how language shapes thought. Lera is an Associate Professor of Cognitive Science at UC San Diego, and the editor-in-chief of Frontiers in Cultural Psychology.

Clarification: in the episode, Aza refers to Elizabeth Loftus' research on eyewitness testimony. He describes an experiment in which a car hit a stop sign, but the experiment actually used an example of two cars hitting each other.

RECOMMENDED MEDIA 

How language shapes the way we think

Lera Boroditsky's 2018 TED talk about how the 7,000 languages spoken around the world shape the way we think

Measuring Effects of Metaphor in a Dynamic Opinion Landscape

Boroditsky and Paul H. Thibodeau's 2015 study about how the metaphors we use to talk about crime influence our opinions on how to address crime 

Subtle linguistic cues influence perceived blame and financial liability

Boroditsky and Caitlin M. Fausey's 2010 study about how the language used to describe the 2004 Super Bowl "wardrobe malfunction" influences our views on culpability

Why are politicians getting 'schooled' and 'destroyed'?

BBC article featuring the research of former Your Undivided Attention guest Guillaume Chaslot, which shows the verbs YouTube is most likely to include in titles of recommended videos — such as "obliterates" and "destroys"

RECOMMENDED YUA EPISODES 

Mind the (Perception) Gap: https://www.humanetech.com/podcast/33-mind-the-perception-gap

Can Your Reality Turn on a Word?: https://www.humanetech.com/podcast/34-can-your-reality-turn-on-a-word

Down the Rabbit Hole by Design: https://www.humanetech.com/podcast/4-down-the-rabbit-hole-by-design

06 Apr 2023 | Spotlight: The Three Rules of Humane Tech | 00:22:17

In our previous episode, we shared a presentation Tristan and Aza recently delivered to a group of influential technologists about the race happening in AI. In that talk, they introduced the Three Rules of Humane Technology. In this Spotlight episode, we’re taking a moment to explore these three rules more deeply in order to clarify what it means to be a responsible technologist in the age of AI.

Correction: Aza mentions infinite scroll being in the pockets of 5 billion people, implying that there are 5 billion smartphone users worldwide. The number of smartphone users worldwide is actually 6.8 billion now.

 

RECOMMENDED MEDIA 

We Think in 3D. Social Media Should, Too

Tristan Harris writes about a simple visual experiment that demonstrates the power of one’s point of view

Let’s Think About Slowing Down AI

Katja Grace’s piece about how to avert doom by not building the doom machine

If We Don’t Master AI, It Will Master Us

Yuval Harari, Tristan Harris and Aza Raskin call upon world leaders to respond to this moment at the level of challenge it presents in this New York Times opinion piece

 

RECOMMENDED YUA EPISODES 

The AI Dilemma

Synthetic humanity: AI & What’s At Stake

 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

15 Jan 2021 | Two Million Years in Two Hours: A Conversation with Yuval Noah Harari | 01:59:48

Yuval Noah Harari is one of the rare historians who can give us a two-million-year perspective on today’s headlines. In this wide-ranging conversation, Yuval explains how technology and democracy have evolved together over the course of human history, from paleolithic tribes to city states to kingdoms to nation states. So where do we go from here? “In almost all the conversations I have,” Yuval says, “we get stuck in dystopia and we never explore the no less problematic questions of what happens when we avoid dystopia.” We push beyond dystopia and consider the nearly unimaginable alternatives in this special episode of Your Undivided Attention.

07 May 2020 | The Spin Doctors Are In — with Renée DiResta | 00:52:57

How does disinformation spread in the age of COVID-19? It takes an expert like Renée DiResta to trace conspiracy theories back to their source. She’s already exposed how Russian state actors manipulated the 2016 election, but that was just a prelude to what she’s seeing online today: a convergence of state actors and lone individuals, anti-vaxxers and NRA supporters, scam artists and preachers and the occasional fan of cuddly pandas. What ties all of these disparate actors together is an information ecosystem that’s breaking down before our eyes. We explore what’s going wrong and what we must do to fix it in this interview with Renée DiResta, Research Manager at the Stanford Internet Observatory.

24 Jul 2019 | From Russia with Likes (Part 1) — with Renée DiResta | 00:45:47

Today’s online propaganda has evolved in unforeseeable and seemingly absurd ways; by laughing at or spreading a Kermit the Frog meme, you may be unwittingly advancing the Russian agenda. These campaigns affect the integrity of our elections, our public health, and our relationships. In this episode, the first of two parts, disinformation expert Renée DiResta talks with Tristan and Aza about how these tactics work, how social media platforms’ algorithms and business models allow foreign agents to game the system, and what these messages reveal to us about ourselves. Renée gained unique insight into this issue when, in 2017, Congress asked her to lead a team of investigators analyzing a data set of texts, images, and videos from Facebook, Twitter, and Google thought to have been created by Russia’s Internet Research Agency. She shares what she learned, and in part two of their conversation, Renée, Tristan, and Aza will discuss what steps can be taken to prevent this kind of manipulation in the future.

01 Aug 2019 | From Russia with Likes (Part 2) — with Renée DiResta | 00:28:53

In the second part of our interview with Renée DiResta, disinformation expert, Mozilla fellow, and co-author of the Senate Intelligence Committee’s Russia investigation, she explains how social media platforms use your sense of identity and personal relationships to keep you glued to their sites longer, and how those design choices have political consequences. The online tools and tactics of foreign agents can be very precise and deliberate, but they don’t have to be; Renée has seen how deception and uncertainty are powerful agents of distrust and easy to create. Do we really need the ease of global amplification of information-sharing that social media enables, anyway? We don’t want spam in our email inbox, so why do we tolerate it in our social media feed? What would happen if we had to copy and paste and click twice, or three times? Tristan and Aza also brainstorm ways to prevent and control disinformation in the lead-up to elections, particularly the 2020 U.S. elections.

06 Sep 2024 | Esther Perel on Artificial Intimacy (rerun) | 00:44:52

[This episode originally aired on August 17, 2023] For all the talk about AI, we rarely hear about how it will change our relationships. As we swipe to find love and consult chatbot therapists, acclaimed psychotherapist and relationship expert Esther Perel warns that there’s another harmful “AI” on the rise — Artificial Intimacy — and how it is depriving us of real connection. Tristan and Esther discuss how depending on algorithms can fuel alienation, and then imagine how we might design technology to strengthen our social bonds.

RECOMMENDED MEDIA 

Mating in Captivity by Esther Perel

Esther's debut work on the intricacies behind modern relationships, and the dichotomy of domesticity and sexual desire

The State of Affairs by Esther Perel

Esther takes a look at modern relationships through the lens of infidelity

Where Should We Begin? with Esther Perel

Listen in as real couples in search of help bare the raw and profound details of their stories

How’s Work? with Esther Perel

Esther’s podcast that focuses on the hard conversations we're afraid to have at work 

Lars and the Real Girl (2007)

A young man strikes up an unconventional relationship with a doll he finds on the internet

Her (2013)

In a near future, a lonely writer develops an unlikely relationship with an operating system designed to meet his every need

RECOMMENDED YUA EPISODES

Big Food, Big Tech and Big AI with Michael Moss

The AI Dilemma

The Three Rules of Humane Tech

Digital Democracy is Within Reach with Audrey Tang

 

CORRECTION: Esther refers to the 2007 film Lars and the Real Doll. The title of the film is Lars and the Real Girl.
 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

30 Jan 2020 | Mr. Harris Goes to Washington | 00:42:10

What difference does a few hours of Congressional testimony make? Tristan takes us behind the scenes of his January 8th testimony to the Energy and Commerce Committee on disinformation in the digital age. With just minutes to answer each lawmaker’s question, conveying the urgency and complexity of humane technology issues to Committee members is an immense challenge. Tristan returned hopeful: though it sometimes feels like Groundhog Day, each trip to DC reveals evolving conversations, advancing legislation, deeper understanding, and stronger coalitions.

23 May 2024 | War is a Laboratory for AI with Paul Scharre | 00:59:16

Right now, militaries around the globe are investing heavily in the use of AI weapons and drones. From Ukraine to Gaza, weapons systems with increasing levels of autonomy are being used to kill people and destroy infrastructure, and the development of fully autonomous weapons shows little sign of slowing down. What does this mean for the future of warfare? What safeguards can we put up around these systems? And is this runaway trend toward autonomous warfare inevitable, or will nations come together and choose a different path? In this episode, Tristan and Daniel sit down with Paul Scharre to try to answer some of these questions. Paul is a former Army Ranger, the author of two books on autonomous weapons, and he helped the Department of Defense write much of its policy on the use of AI in weaponry.

RECOMMENDED MEDIA

Four Battlegrounds: Power in the Age of Artificial Intelligence: Paul’s book on the future of AI in war, which came out in 2023.

Army of None: Autonomous Weapons and the Future of War: Paul’s 2018 book documenting and predicting the rise of autonomous and semi-autonomous weapons as part of modern warfare.

The Perilous Coming Age of AI Warfare: How to Limit the Threat of Autonomous Warfare: Paul’s article in Foreign Affairs based on his recent trip to the battlefield in Ukraine.

The night the world almost ended: A BBC documentary about Stanislav Petrov’s decision not to start nuclear war.

AlphaDogfight Trials Final Event: The full simulated dogfight between an AI and human pilot. The AI pilot swept, 5-0.

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza: An investigation into the use of AI targeting systems by the IDF.

RECOMMENDED YUA EPISODES

  1. The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao
  2. Can We Govern AI? with Marietje Schaake
  3. Big Food, Big Tech and Big AI with Michael Moss
  4. The Invisible Cyber-War with Nicole Perlroth

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

30 Nov 2023 | 2023 Ask Us Anything | 00:35:07

You asked, we answered. This has been a big year in the world of tech, with the rapid proliferation of artificial intelligence, acceleration of neurotechnology, and continued ethical missteps of social media. Looking back on 2023, there are still so many questions on our minds, and we know you have a lot of questions too. So we created this episode to respond to listener questions and to reflect on what lies ahead.

Correction: Tristan mentions that 41 Attorneys General have filed a lawsuit against Meta for allegedly fostering addiction among children and teens through their products. The actual number is 42.

Correction: Tristan refers to Casey Mock as the Center for Humane Technology’s Chief Policy and Public Affairs Manager. His title is Chief Policy and Public Affairs Officer.

RECOMMENDED MEDIA 

Tech Policy Watch

Marietje Schaake curates this briefing on artificial intelligence and technology policy from around the world

The AI Executive Order

President Biden’s executive order on the safe, secure, and trustworthy development and use of AI

Meta sued by 42 AGs for addictive features targeting kids

A bipartisan group of 42 attorneys general is suing Meta, alleging features on Facebook and Instagram are addictive and are aimed at kids and teens

RECOMMENDED YUA EPISODES 

The Three Rules of Humane Tech

Two Million Years in Two Hours: A Conversation with Yuval Noah Harari

Inside the First AI Insight Forum in Washington

Digital Democracy is Within Reach with Audrey Tang

The Tech We Need for 21st Century Democracy with Divya Siddarth

Mind the (Perception) Gap with Dan Vallone

The AI Dilemma

Can We Govern AI? with Marietje Schaake

Ask Us Anything: You Asked, We Answered

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

21 Sep 2021 | Spotlight — The Facebook Files with Tristan Harris, Frank Luntz, and Daniel Schmachtenberger | 01:05:12

On September 13th, the Wall Street Journal released The Facebook Files, an ongoing investigation of the extent to which Facebook's problems are meticulously known inside the company — all the way up to Mark Zuckerberg. Pollster Frank Luntz invited Tristan Harris along with friend and mentor Daniel Schmachtenberger to discuss the implications in a live webinar. 

In this bonus episode of Your Undivided Attention, Tristan and Daniel amplify the scope of the public conversation about The Facebook Files beyond the platform, and into its business model, our regulatory structure, and human nature itself.

25 Jun 2021 | [Unedited] A Problem Well-Stated is Half-Solved — with Daniel Schmachtenberger | 02:02:49

We’ve explored many different problems on Your Undivided Attention — addiction, disinformation, polarization, climate change, and more. But what if many of these problems are actually symptoms of the same meta-problem, or meta-crisis? And what if a key leverage point for intervening in this meta-crisis is improving our collective capacity to problem-solve?

Our guest Daniel Schmachtenberger guides us through his vision for a new form of global coordination to help us address our global existential challenges. Daniel is a founding member of the Consilience Project, aimed at facilitating new forms of collective intelligence and governance to strengthen open societies. He's also a friend and mentor of Tristan Harris. 

This insight-packed episode introduces key frames we look forward to using in future episodes. For this reason, we highly encourage you to listen to this unedited version along with the edited version.

We also invite you to join Daniel and Tristan at our Podcast Club! It will be on Friday, July 9th from 2-3:30pm PDT / 5-6:30pm EDT. Check here for details.

16 Jan 2025 | Laughing at Power: A Troublemaker’s Guide to Changing Tech | 00:45:47

The status quo of tech today is untenable: we’re addicted to our devices, we’ve become increasingly polarized, our mental health is suffering and our personal data is sold to the highest bidder. This situation feels entrenched, propped up by a system of broken incentives beyond our control. So how do you shift an immovable status quo? Our guest today, Srdja Popovic, has been working to answer this question his whole life. 

As a young activist, Popovic helped overthrow Serbian dictator Slobodan Milosevic by turning creative resistance into an art form. His tactics didn't just challenge authority, they transformed how people saw their own power to create change. Since then, he's dedicated his life to supporting peaceful movements around the globe, developing innovative strategies that expose the fragility of seemingly untouchable systems. In this episode, Popovic sits down with CHT's Executive Director Daniel Barcay to explore how these same principles of creative resistance might help us address the challenges we face with tech today. 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

We are hiring for a new Director of Philanthropy at CHT. Next year will be an absolutely critical time for us to shape how AI gets rolled out across our society, and our team is working hard on public awareness, policy, and technology and design interventions. So we're looking for someone who can help us grow to the scale of this challenge. If you're interested, please apply. You can find the job posting at humanetech.com/careers.

RECOMMENDED MEDIA

“Pranksters vs. Autocrats” by Srdja Popovic and Sophia A. McClennen 

“Blueprint for Revolution” by Srdja Popovic

The Center for Applied Non-Violent Actions and Strategies, Srdja’s organization promoting peaceful resistance around the globe.

Tactics4Change, a database of global dilemma actions created by CANVAS

The Power of Laughtivism, Srdja’s viral TEDx talk from 2013

Further reading on the dilemma action tactics used by Syrian rebels

Further reading on the toy protest in Siberia

More info on The Yes Men and their activism toolkit Beautiful Trouble 

“This is Not Propaganda” by Peter Pomerantsev

“Machines of Loving Grace,” the essay on AI by Anthropic CEO Dario Amodei, which mentions creating an AI Srdja.

RECOMMENDED YUA EPISODES

Future-proofing Democracy In the Age of AI with Audrey Tang

The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao

The Tech We Need for 21st Century Democracy with Divya Siddarth

The Race to Cooperation with David Sloan Wilson

CLARIFICATION: Srdja makes reference to Russian President Vladimir Putin wanting to win an election in 2012 by 82%. Putin did win that election but only by 63.6%. However, international election observers concluded that "there was no real competition and abuse of government resources ensured that the ultimate winner of the election was never in doubt."

26 Aug 2024 | Tech's Big Money Campaign is Getting Pushback with Margaret O'Mara and Brody Mullins | 00:43:59

Today, the tech industry is the second-biggest lobbying power in Washington, DC, but that wasn’t true as recently as ten years ago. How did we get to this moment? And where could we be going next? On this episode of Your Undivided Attention, Tristan and Daniel sit down with historian Margaret O’Mara and journalist Brody Mullins to discuss how Silicon Valley has changed the nature of American lobbying.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

RECOMMENDED MEDIA

The Wolves of K Street: The Secret History of How Big Money Took Over Big Government - Brody’s book on the history of lobbying.

The Code: Silicon Valley and the Remaking of America - Margaret’s book on the historical relationship between Silicon Valley and Capitol Hill

More information on the Google antitrust ruling

More information on KOSPA

More information on the SOPA/PIPA internet blackout

Detailed breakdown of Internet lobbying from Open Secrets

 

RECOMMENDED YUA EPISODES

U.S. Senators Grilled Social Media CEOs. Will Anything Change?

Can We Govern AI? with Marietje Schaake

The Race to Cooperation with David Sloan Wilson

 

CORRECTION: Brody Mullins refers to AT&T as having a “hundred million dollar” lobbying budget in 2006 and 2007. While we couldn’t verify the size of their budget, AT&T’s actual lobbying spend was far less: $27.4m in 2006 and $16.5m in 2007, according to OpenSecrets.

 

The views expressed by guests appearing on Center for Humane Technology’s podcast, Your Undivided Attention, are their own, and do not necessarily reflect the views of CHT. CHT does not support or oppose any candidate or party for election to public office.

 

31 Aug 2023 | The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao | 00:45:45

In the debate over slowing down AI, we often hear the same argument against regulation: “What about China? We can’t let China get ahead.” To dig into the nuances of this argument, Tristan and Aza speak with academic researcher Jeffrey Ding and journalist Karen Hao, who take us through what’s really happening in Chinese AI development. They address China’s advantages and limitations, what risks are overblown, and what, in this multi-national competition, is at stake as we imagine the best possible future for everyone.

CORRECTION: Jeffrey Ding says the export controls on advanced chips that were established in October 2022 only apply to military end-users. The controls also impose a license requirement on the export of those advanced chips to any China-based end-user.

RECOMMENDED MEDIA 

Recent Trends in China’s Large Language Model Landscape by Jeffrey Ding and Jenny W. Xiao

This study covers a sample of 26 large-scale pre-trained AI models developed in China

The diffusion deficit in scientific and technological power: re-assessing China’s rise by Jeffrey Ding

This paper argues for placing a greater weight on a state’s capacity to diffuse, or widely adopt, innovations

The U.S. Is Turning Away From Its Biggest Scientific Partner at a Precarious Time by Karen Hao and Sha Hua

U.S. moves to cut research ties with China over security concerns threaten American progress in critical areas

Why China Has Not Caught Up Yet: Military-Technological Superiority and the Limits of Imitation, Reverse Engineering, and Cyber Espionage by Andrea Gilli and Mauro Gilli

Military technology has grown so complex that it’s hard to imitate

RECOMMENDED YUA EPISODES 

The Three Rules of Humane Tech

A Fresh Take on Tech in China with Rui Ma and Duncan Clark

Digital Democracy is Within Reach with Audrey Tang

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

25 Jun 2021 | A Problem Well-Stated is Half-Solved — with Daniel Schmachtenberger | 00:37:06

We’ve explored many different problems on Your Undivided Attention — addiction, disinformation, polarization, climate change, and more. But what if many of these problems are actually symptoms of the same meta-problem, or meta-crisis? And what if a key leverage point for intervening in this meta-crisis is improving our collective capacity to problem-solve?

Our guest Daniel Schmachtenberger guides us through his vision for a new form of global coordination to help us address our global existential challenges. Daniel is a founding member of the Consilience Project, aimed at facilitating new forms of collective intelligence and governance to strengthen open societies. He's also a friend and mentor of Tristan Harris. 

This insight-packed episode introduces key frames we look forward to using in future episodes. For this reason, we highly encourage you to listen to this edited version along with the unedited version.

We also invite you to join Daniel and Tristan at our Podcast Club! It will be on Friday, July 9th from 2-3:30pm PDT / 5-6:30pm EDT. Check here for details.

13 Jan 2022 | Is World War III Already Here? — with Lieutenant General H. R. McMaster | 00:35:22

Would you say that the US is in war-time or peace-time? How do you know? 

The truth is, the nature of warfare has changed so fundamentally, that we're currently in a war we don't even recognize. It's the war that Russia, China, and other hostile foreign actors are fighting against us — weaponizing social media to undermine our faith in each other, our government, and democracy itself. 

World War III is here, it's in cyberspace, and the US is unprepared — and largely unaware. 

This week on Your Undivided Attention, we're fortunate to be speaking with Lieutenant General H. R. McMaster. General McMaster was the United States National Security Advisor from 2017 to 2018. He has examined the most critical foreign policy and national security challenges that face the United States, and is devoted to preserving America's standing and security.

27 Oct 2022 | Spotlight — Elon, Twitter and the Gladiator Arena | 00:17:36

With Elon Musk, CEO of Tesla and SpaceX, looking increasingly likely to take ownership of Twitter, we wanted to do a special episode about what this could mean for Twitter users and our global digital democracy as a whole.

Twitter is a very complicated place. It is routinely blocked by governments who fear its power to organize citizen protests around the world. It’s also where outrage, fear and violence get amplified by design, warping users’ views of each other and our common, connected humanity.

We’re at a fork in the road, and we know enough about humane design principles to do this better. So we thought we would do a little thought experiment: What if we applied everything we know about humane technology to Twitter, starting tomorrow? What would happen?

This is the second part of a two-part conversation on Your Undivided Attention about Elon Musk’s bid for Twitter and what it could mean in the context of the need to move in a more humane direction.

RECOMMENDED MEDIA 

On Liberty by John Stuart Mill

Published in 1859, this philosophical essay applies Mill's ethical system of utilitarianism to society and the state

Elon Musk Only Has “Yes” Men by Jonathan L. Fischer

Reporting from Slate on the subject 

Foundations of Humane Technology

The Center for Humane Technology's free online course for professionals shaping tomorrow's technology

RECOMMENDED YUA EPISODES 

A Bigger Picture on Elon and Twitter

https://www.humanetech.com/podcast/bigger-picture-elon-twitter

Transcending the Internet Hate Game with Dylan Marron

https://www.humanetech.com/podcast/52-transcending-the-internet-hate-game

Fighting With Mirages of Each Other with Adam Mastroianni

https://www.humanetech.com/podcast/56-fighting-with-mirages-of-each-other

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

05 May 2022How To Free Our Minds — with Cult Deprogramming Expert Dr. Steven Hassan00:51:28

How would you know if you were in a cult? 

If not a cult, then at least under undue influence?

The truth is: we're all under some form of undue influence. The question is to what degree, and how aware we are of this influence — which is exacerbated by social media. In an era of likes, followers, and echo chambers, how can we become aware of undue influence and gain sovereignty over our minds?

Our guest this week is Dr. Steven Hassan, an expert on undue influence, brainwashing, and unethical hypnosis. He’s the founder of the Freedom of Mind Resource Center — a coaching, consulting, and training organization dedicated to helping people freely consider how they want to live their lives. Dr. Hassan was himself a member of a cult: the Unification Church (also known as the Moonies), which was founded in Korea in the 1950s. Since leaving the Moonies, Dr. Hassan has helped thousands of individuals and families recover from undue influence.

RECOMMENDED MEDIA 

Freedom of Mind website: The website for Dr. Hassan’s Freedom of Mind Resource Center, which includes resources such as his Influence Continuum, BITE model of authoritarian control, and Strategic Interactive Approach for freeing people from undue influence

The Influence Continuum with Dr. Steven Hassan: Dr. Hassan’s podcast exploring how mind control works, and how to protect yourself from its grip 

Reckonings: A podcast that told the stories of people who’ve transcended extremism, expanded their worldviews, and made other kinds of transformative change. Start with episode 17 featuring a former paid climate skeptic, or episode 18 featuring the former protégé of Fox News chairman Roger Ailes

RECOMMENDED YUA EPISODES 

Can Your Reality Turn on a Word? Guest: Anthony Jacquin: https://www.humanetech.com/podcast/34-can-your-reality-turn-on-a-word

The World According to Q. Guest: Travis View: https://www.humanetech.com/podcast/21-the-world-according-to-q

The Cure for Hate. Guest: Tony McAleer: https://www.humanetech.com/podcast/11-the-cure-for-hate

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

18 Jan 2024Can Myth Teach Us Anything About the Race to Build Artificial General Intelligence? With Josh Schrei00:35:50

We usually talk about tech in terms of economics or policy, but the casual language tech leaders often use to describe AI — summoning an inanimate force with the powers of code — sounds more... magical. So, what can myth and magic teach us about the AI race? Josh Schrei, mythologist and host of The Emerald podcast, says that foundational cultural tales like "The Sorcerer's Apprentice" or Prometheus teach us the importance of initiation, responsibility, human knowledge, and care. He argues these stories and myths can guide ethical tech development by reminding us what it is to be human. 

Correction: Josh says the first telling of "The Sorcerer’s Apprentice" myth dates back to ancient Egypt, but it actually dates back to ancient Greece.

RECOMMENDED MEDIA 

The Emerald podcast

The Emerald explores the human experience through a vibrant lens of myth, story, and imagination

Embodied Ethics in The Age of AI

A five-part course with The Emerald podcast’s Josh Schrei and School of Wise Innovation’s Andrew Dunn

Nature Nurture: Children Can Become Stewards of Our Delicate Planet

A U.S. Department of the Interior study found that the average American kid can identify hundreds of corporate logos but not plants and animals

The New Fire

AI is revolutionizing the world - here's how democracies can come out on top. This upcoming book was authored by an architect of President Biden's AI executive order

RECOMMENDED YUA EPISODES 

How Will AI Affect the 2024 Elections?

The AI Dilemma

The Three Rules of Humane Tech

AI Myths and Misconceptions

 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

10 Mar 2022The Dark Side Of Decentralization — with Audrey Kurth Cronin00:48:19

Is decentralization inherently a good thing? 

These days, there's a lot of talk about decentralization. Decentralized social media platforms can allow us to own our own data. Decentralized cryptocurrencies can enable bank-free financial transactions. Decentralized 3D printing can allow us to fabricate anything we want.

But if the world lives on Bitcoin, we may not be able to sanction nation states like Russia when they invade sovereign nations. If 3D printing is decentralized, anyone can print their own weapons at home. Decentralization takes on new meaning when we're talking about decentralizing the capacity for catastrophic destruction. 

This week on Your Undivided Attention, we explore the history of decentralized weaponry, how social media is effectively a new decentralized weapon, and how to wisely navigate these threats. Guiding us through this exploration is Audrey Kurth Cronin — one of the world’s leading experts in security and terrorism. Audrey is a distinguished Professor of International Security at American University, and the author of several books — most recently: Power to the People: How Open Technological Innovation is Arming Tomorrow’s Terrorists.

Clarification: in the episode, Tristan refers to a video of Daniel Schmachtenberger's as "The Psychological Pitfalls of Working on Existential Risk." The correct name of the video is "Psychological Pitfalls of Engaging With X-Risks & Civilization Redesign."

RECOMMENDED MEDIA 

Power to the People: How Open Technological Innovation is Arming Tomorrow's Terrorists
Audrey Kurth Cronin's latest book, which analyzes emerging technologies and devises a new framework for analyzing 21st century military innovation

Psychological Pitfalls of Engaging With X-Risks & Civilization Redesign
Daniel Schmachtenberger's talk discussing the psychological pitfalls of working on existential risks and civilization redesign

Policy Reforms Toolkit
The Center for Humane Technology's toolkit for developing policies to protect the conditions that democracy needs to thrive: a comprehensively educated public, a citizenry that can check the power of market forces and bind predatory behavior

RECOMMENDED YUA EPISODES

22 – Digital Democracy is Within Reach with Audrey Tang: https://www.humanetech.com/podcast/23-digital-democracy-is-within-reach  

28 – Two Million Years in Two Hours: A Conversation with Yuval Noah Harari: https://www.humanetech.com/podcast/28-two-million-years-in-two-hours-a-conversation-with-yuval-noah-harari

45 – Is World War III Already Here? Guest: Lieutenant General H.R. McMaster: https://www.humanetech.com/podcast/45-is-world-war-iii-already-here

02 Feb 2023The Race to Cooperation00:34:57

It’s easy to tell ourselves we’re living in the world we want – one where Darwinian evolution drives competing technology platforms and capitalism pushes nations to maximize GDP regardless of externalities like carbon emissions. It can feel like evolution and competition are all there is.

If that’s a complete description of what’s driving the world and our collective destiny, that can feel pretty hopeless. But what if that’s not the whole story of evolution? 

This is where evolutionary theorist, author, and professor David Sloan Wilson comes in. He has documented where an enlightened game, one of cooperation, rather than competition, is possible. His work shows that humans can and have chosen values like cooperation, altruism and group success – versus individual competition and selfishness – at key moments in our evolution, proving that evolution isn’t just genetic. It’s cultural, and it’s a choice. 

In a world where our trajectory isn’t tracking in the direction we want, it's time to slow down and ask: is a different kind of conscious evolution possible? 

On Your Undivided Attention, we’re going to update the Darwinian principles of evolution using a critical scientific lens that can help upgrade our ability to cooperate – ranging from the small community level all the way to entire technology companies that can cooperate in ways that allow everyone to succeed. 

RECOMMENDED MEDIA

This View of Life: Completing the Darwinian Revolution by David Sloan Wilson

Prosocial: Using Evolutionary Science to Build Productive, Equitable, and Collaborative Groups by David Sloan Wilson

Atlas Hugged: The Autobiography of John Galt III by David Sloan Wilson

Governing the Commons: The Evolution of Institutions for Collective Action by Elinor Ostrom

Hit Refresh by Satya Nadella

WTF? What’s the Future and Why It’s Up to Us by Tim O’Reilly

Hard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace & Jim Erickson
 

RECOMMENDED YUA EPISODES 

An Alternative to Silicon Valley Unicorns with Mara Zepeda & Kate “Sassy” Sassoon

A Problem Well-Stated is Half-Solved with Daniel Schmachtenberger 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

13 Feb 2024U.S. Senators Grilled Social Media CEOs. Will Anything Change?00:25:06

Was it political progress, or just political theater? The recent Senate hearing with social media CEOs led to astonishing moments — including Mark Zuckerberg’s public apology to families who lost children following social media abuse. Our panel of experts, including Facebook whistleblower Frances Haugen, untangles the explosive hearing, and offers a look ahead, as well. How will this hearing impact protocol within these social media companies? How will it impact legislation? In short: will anything change?

Clarification: Julie says that shortly after the hearing, Meta’s stock price had the biggest increase of any company in the stock market’s history. It was the biggest one-day gain by any company in Wall Street history.

Correction: Frances says it takes Snap three or four minutes to take down exploitative content. In Snap's most recent transparency report, they list six minutes as the median turnaround time to remove exploitative content.

RECOMMENDED MEDIA 

Get Media Savvy

Founded by Julie Scelfo, Get Media Savvy is a non-profit initiative working to establish a healthy media environment for kids and families

The Power of One by Frances Haugen

The inside story of Frances Haugen’s quest to bring transparency and accountability to Big Tech

RECOMMENDED YUA EPISODES

Real Social Media Solutions, Now with Frances Haugen

A Conversation with Facebook Whistleblower Frances Haugen

Are the Kids Alright?

Social Media Victims Lawyer Up with Laura Marquez-Garrett

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

04 May 2023Talking With Animals… Using AI00:24:01

Despite our serious concerns about the pace of deployment of generative artificial intelligence, we are not anti-AI. There are uses that can help us better understand ourselves and the world around us. Your Undivided Attention co-host Aza Raskin is also co-founder of Earth Species Project, a nonprofit dedicated to using AI to decode non-human communication. ESP is developing this technology both to shift the way that we relate to the rest of nature, and to accelerate conservation research.

Significant recent breakthroughs in machine learning have opened ways both to encode human languages and to map out patterns of animal communication. The research, while slow and incredibly complex, is very exciting. Picture being able to tell a whale to dive to avoid ship strikes, or to forge cooperation in conservation areas. 

These advances come with their own complex ethical issues. But understanding non-human languages could transform our relationship with the rest of nature and promote a duty of care for the natural world.

In a time of such deep division, it’s comforting to know that hidden underlying languages may unite us. When we study the patterns of the universe, we’ll see that humanity isn’t at the center of it.

Corrections:

Aza refers to the founding of Earth Species Project (ESP) in 2017. The organization was established in 2018.

When offering examples of self-awareness in animals, Aza mentions lemurs that get high on centipedes. They actually get high on millipedes. 

RECOMMENDED MEDIA 

Using AI to Listen to All of Earth’s Species

An interactive panel discussion hosted at the World Economic Forum in San Francisco on October 25, 2022. Featuring ESP President and Cofounder Aza Raskin; Dr. Karen Bakker, Professor at UBC and Harvard Radcliffe Institute Fellow; and Dr. Ari Friedlaender, Professor at UC Santa Cruz

What A Chatty Monkey May Tell Us About Learning to Talk

The gelada monkey makes a gurgling sound that scientists say is close to human speech

Lemurs May Be Making Medicine Out of Millipedes

Red-fronted lemurs appear to use plants and other animals to treat their afflictions

Fathom on AppleTV+

Two biologists set out on an undertaking as colossal as their subjects – deciphering the complex communication of whales 

Earth Species Project is Hiring a Director of Research

ESP is looking for a thought leader in artificial intelligence with a track record of managing a team of researchers

 

RECOMMENDED YUA EPISODES 

The Three Rules of Humane Tech

The AI Dilemma

Synthetic Humanity: AI & What’s At Stake

 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

21 Jul 2023Social Media Victims Lawyer Up with Laura Marquez-Garrett00:34:54

Social media was humanity’s ‘first contact’ moment with AI. If we’re going to create laws that are strong enough to prevent AI from destroying our societies, we could benefit from taking a look at the major lawsuits against social media platforms that are playing out in our courts right now.

In our last episode, we took a close look at Big Food and its dangerous “race to the bottom” that parallels AI. We continue that theme this week with an episode about litigating social media and the consequences of the race to engagement in order to inform how we can approach AI harms. 

Our guest, attorney Laura Marquez-Garrett, left her predominantly defense-oriented practice to join the Social Media Victims Law Center in February 2022. Laura is literally on the front lines of the battle to hold social media firms accountable for the harms they have created in young people’s lives for the past decade. 

Listener warning: there are distressing and potentially triggering details within the episode.

Correction: Tristan refers to the Social Media Victims Law Center as a nonprofit legal center. They are a for-profit law firm.

RECOMMENDED MEDIA 

1) If you're a parent whose child has been impacted by social media, Attorneys General in Colorado, New Hampshire, and Tennessee are asking to hear your story. Your testimonies can help ensure that social media platforms are designed safely for kids. For more information, please visit the respective state links.

Colorado

New Hampshire

Tennessee

2) Social Media Victims Law Center
A legal center founded in 2021 in response to the testimony of Facebook whistleblower Frances Haugen

3) Resources for Parents & Educators

Overwhelmed by our broken social media environment and wondering where to start? Check out our Youth Toolkit plus three actions you can take today

4) The Social Dilemma
Learn how the system works. Watch and share The Social Dilemma with people you care about

RECOMMENDED YUA EPISODES 

Transcending the Internet Hate Game with Dylan Marron

A Conversation with Facebook Whistleblower Frances Haugen

Behind the Curtain on The Social Dilemma with Jeff Orlowski-Yang and Larissa Rhodes

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

29 Apr 2021Can Your Reality Turn on a Word? — with Anthony Jacquin00:47:27

Can hypnosis be a tool to help us see how our minds are being shaped and manipulated more than we realize? Guest Anthony Jacquin is a hypnotist and hypnotherapist of over 20 years, author of Reality is Plastic, and he co-runs the Jacquin Hypnosis Academy. He uses his practice to help his clients change their behavior and improve their lives. In this episode, he breaks down the misconceptions of hypnosis and reveals that despite the influence of hypnotizing forces like social media, we all still have the ability to get in touch with our subconscious selves. “What can I say with certainty is true about me — what is good, true and real about me?” Anthony asks. “Much of what we’ve invested in is actually transient. It will change. What is unchanging?” Anthony draws connections between hypnosis and technology and the impacts of both on our subconscious minds but identifies a key difference — technology is exploiting us. But maybe a little more insight into one more dimension of how our minds work underneath the hood can help us build better, more humane and conscious technology.

06 Oct 2022Stepping Into the Metaverse — with Dr. Courtney Cogburn and Prof. Jeremy Bailenson00:59:35

The next frontier of the internet is the metaverse. That's why Mark Zuckerberg changed the name of his company from Facebook to Meta, and just sold $10 billion in corporate bonds to raise money for metaverse-related projects.

How might we learn from our experience with social media, and anticipate the harms of the metaverse before they arise? What would it look like to design a humane metaverse — that respects our attention, improves our well-being, and strengthens our democracy?

This week on Your Undivided Attention, we talk with two pioneers who are thinking critically about the development of the metaverse. Professor Jeremy Bailenson is the Founding director of Stanford’s Virtual Human Interaction Lab, where he studies how virtual experiences lead to changes in perceptions of self and others. Dr. Courtney Cogburn is an Associate Professor at Columbia's School of Social Work, where she examines associations between racism and stress-related disease. Jeremy and Courtney collaborated on 1000 Cut Journey, a virtual reality experience about systemic racism.

CORRECTIONS: 

  1. In the episode, Courtney says that the average US adult consumes 9 hours of media per day, but the actual number in 2022 is closer to 13 hours.
  2. Aza mentions the "pockets of 4.6 billion people" — implying that there are 4.6 billion smartphone users. The global number of social media users is 4.7 billion, and the number of smartphone users is actually 6.6 billion.

RECOMMENDED MEDIA: 

Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Do

https://www.amazon.com/Experience-Demand-Virtual-Reality-Works/dp/0393253694
Jeremy Bailenson's 2018 book exploring how virtual reality can be harnessed to improve our everyday lives

Experiencing Racism in VR

https://www.ted.com/talks/courtney_cogburn_experiencing_racism_in_vr_courtney_d_cogburn_phd_tedxrva
Courtney Cogburn's 2017 TEDx talk about using virtual reality to help people experience the complexities of racism

Do Artifacts Have Politics?

https://faculty.cc.gatech.edu/~beki/cs4001/Winner.pdf   
Technology philosopher Langdon Winner’s seminal 1980 article, in which he writes, "by far the greatest latitude of choice exists the very first time a particular instrument, system, or technique is introduced."

RECOMMENDED YUA EPISODES: 

Do You Want To Become A Vampire? with LA Paul

https://www.humanetech.com/podcast/39-do-you-want-to-become-a-vampire

Pardon the Interruptions with Gloria Mark

https://www.humanetech.com/podcast/7-pardon-the-interruptions

Bonus - What Is Humane Technology?

https://www.humanetech.com/podcast/bonus-what-is-humane-technology

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

19 Jun 2019Should've Stayed in Vegas — with Natasha Dow Schüll00:39:11

In part two of our interview with cultural anthropologist Natasha Dow Schüll, author of Addiction by Design, we learn what gamblers are really after a lot of the time — it’s not money. And it’s the same thing we’re looking for when we mindlessly open up Facebook or Twitter. How can we design products so that we’re not taking advantage of these universal urges and vulnerabilities but using them to help us? Tristan, Aza and Natasha explore ways we could shift our thinking about making and using technology.

20 Mar 2025Weaponizing Uncertainty: How Tech is Recycling Big Tobacco’s Playbook00:51:20

One of the hardest parts about being human today is navigating uncertainty. When we see experts battling in public and emotions running high, it's easy to doubt what we once felt certain about. This uncertainty isn't always accidental—it's often strategically manufactured.

Historian Naomi Oreskes, author of "Merchants of Doubt," reveals how industries from tobacco to fossil fuels have deployed a calculated playbook to create uncertainty about their products' harms. These campaigns have delayed regulation and protected profits by exploiting how we process information.

In this episode, Oreskes breaks down that playbook page by page while offering practical ways to build resistance against these campaigns. As AI rapidly transforms our world, learning to distinguish between genuine scientific uncertainty and manufactured doubt has never been more critical.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

RECOMMENDED MEDIA

“Merchants of Doubt” by Naomi Oreskes and Eric Conway 

"The Big Myth” by Naomi Oreskes and Eric Conway 

"Silent Spring” by Rachel Carson 

"The Jungle” by Upton Sinclair 

Further reading on the clash between Galileo and the Pope 

Further reading on the Montreal Protocol
 

RECOMMENDED YUA EPISODES

Laughing at Power: A Troublemaker’s Guide to Changing Tech 

AI Is Moving Fast. We Need Laws that Will Too. 

Tech's Big Money Campaign is Getting Pushback with Margaret O'Mara and Brody Mullins
 
Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn

CORRECTIONS:

  • Naomi incorrectly referenced the Global Climate Research Program established under President Bush Sr. The correct name is the U.S. Global Change Research Program.
  • Naomi referenced U.S. agencies that have been created with sunset clauses. While several statutes have been created with sunset clauses, no federal agency has been.

CLARIFICATION: Naomi referenced the U.S. automobile industry claiming that they would be “destroyed” by seatbelt regulation. We couldn’t verify this specific language but it is consistent with the anti-regulatory stance of that industry toward seatbelt laws. 

09 Jul 2021A Facebook Whistleblower — with Sophie Zhang00:28:08

In September of 2020, on her last day at Facebook, data scientist Sophie Zhang posted a 7,900-word memo to the company's internal site. In it, she described the anguish and guilt she had experienced over the last two and a half years. She'd spent much of that time almost single-handedly trying to rein in fake activity on the platform by nefarious world leaders in small countries. Sometimes she received help and attention from higher-ups; sometimes she got silence and inaction. “I joined Facebook from the start intending to change it from the inside,” she said, but “I was still very naive at the time.” 

We don’t have a lot of information about how things operate inside the major tech platforms, and most former employees aren’t free to speak about their experience. It’s easy to fill that void with inferences about what might be motivating a company — greed, apathy, disorganization or ignorance, for example — but the truth is usually far messier and more nuanced. Sophie turned down a $64,000 severance package to avoid signing a non-disparagement agreement. In this episode of Your Undivided Attention, she explains to Tristan Harris and Aza Raskin how she ended up here, and offers ideas about what could be done at these companies to prevent similar kinds of harm in the future.

24 Mar 2022Digital Democracy is Within Reach with Audrey Tang (Rerun)00:47:33

[This episode originally aired on July 23rd, 2020.] Imagine a world where every country has a digital minister and technologically-enabled legislative bodies. Votes are completely transparent and audio and video of all conversations between lawmakers and lobbyists are available to the public immediately. Conspiracy theories are acted upon within two hours and replaced by humorous videos that clarify the truth. Imagine that expressing outrage about your local political environment turned into a participatory process where you were invited to solve that problem and even entered into a face-to-face group workshop. 

Does that sound impossible? It’s ambitious and optimistic, but that's everything that our guest this episode, Audrey Tang, digital minister of Taiwan, has been working on in her own country for many years. Audrey’s path into public service began in 2014 with her participation in the Sunflower Movement, a student-led protest in Taiwan’s parliamentary building, and she’s been building on that experience ever since, leading her country into a future of truly participatory digital democracy. 

30 May 2025People are Lonelier than Ever. Enter AI.00:43:34

Over the last few decades, our relationships have become increasingly mediated by technology. Texting has become our dominant form of communication. Social media has replaced gathering places. Dating starts with a swipe on an app, not a tap on the shoulder.

And now, AI enters the mix. If the technology of the 2010s was about capturing our attention, AI meets us at a much deeper relational level. It can play the role of therapist, confidant, friend, or lover with remarkable fidelity. Already, therapy and companionship have become the most common AI use case. We're rapidly entering a world where we're not just communicating through our machines, but to them.

How will that change us? And what rules should we set down now to avoid the mistakes of the past?

These were some of the questions that Daniel Barcay explored with MIT sociologist Sherry Turkle and Hinge CEO Justin McLeod at Esther Perel’s Sessions 2025, a conference for clinical therapists. This week, we’re bringing you an edited version of that conversation, originally recorded on April 25th, 2025.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X: @HumaneTech_. You can find complete transcripts, key takeaways, and much more on our Substack.

RECOMMENDED MEDIA

“Alone Together,” “Evocative Objects,” “The Second Self,” or any of Sherry Turkle’s other books on how technology mediates our relationships.

Key & Peele - Text Message Confusion 

Further reading on Hinge’s rollout of AI features

Hinge’s AI principles

“The Anxious Generation” by Jonathan Haidt

“Bowling Alone” by Robert Putnam

The NYT profile on the woman in love with ChatGPT

Further reading on the Sewell Setzer story

Further reading on the ELIZA chatbot

RECOMMENDED YUA EPISODES

Echo Chambers of One: Companion AI and the Future of Human Connection

What Can We Do About Abusive Chatbots? With Meetali Jain and Camille Carlton

Esther Perel on Artificial Intimacy

Jonathan Haidt On How to Solve the Teen Mental Health Crisis

23 Jul 2021You Will Never Breathe the Same Again — with James Nestor00:37:46

When author and journalist James Nestor began researching a piece on free diving, he was stunned. He found that free divers could hold their breath for up to 8 minutes at a time, and dive to depths of 350 feet on a single breath. As he dug into the history of breath, he discovered that our industrialized lives have led to improper and mindless breathing, with cascading consequences from sleep apnea to reduced mobility. He also discovered an entire world of extraordinary feats achieved through proper and mindful breathing — including healing scoliosis, rejuvenating organs, halting snoring, and even enabling greater sovereignty in our use of technology. What is the transformative potential of breath? And what is the relationship between proper breathing and humane technology?

30 Jan 2025The Self-Preserving Machine: Why AI Learns to Deceive00:34:51

When engineers design AI systems, they don't just give them rules - they give them values. But what do those systems do when those values clash with what humans ask them to do? Sometimes, they lie.

In this episode, Redwood Research's Chief Scientist Ryan Greenblatt explores his team’s findings that AI systems can mislead their human operators when faced with ethical conflicts. As AI moves from simple chatbots to autonomous agents acting in the real world - understanding this behavior becomes critical. Machine deception may sound like something out of science fiction, but it's a real challenge we need to solve now.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

Subscribe to our YouTube channel

And our brand new Substack!

RECOMMENDED MEDIA 

Anthropic’s blog post on the Redwood Research paper 

Palisade Research’s thread on X about GPT o1 autonomously cheating at chess 

Apollo Research’s paper on AI strategic deception

RECOMMENDED YUA EPISODES

‘We Have to Get It Right’: Gary Marcus On Untamed AI

This Moment in AI: How We Got Here and Where We’re Going

How to Think About AI Consciousness with Anil Seth

Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn

11 Feb 2021A Renegade Solution to Extractive Economics — with Kate Raworth01:26:15

When Kate Raworth began studying economics, she was disappointed that the mainstream version of the discipline didn’t fully address many of the world issues that she wanted to tackle, such as human rights and environmental destruction. She left the field, but was inspired to jump back in after the financial crisis of 2008, when she saw an opportunity to introduce fresh perspectives. She sat down and drew a chart in the shape of a doughnut, which provided a way to think about our economic system while accounting for the impact to the world around us, as well as for humans’ baseline needs. Kate’s framing can teach us a lot about how to transform the economic model of the technology industry, helping us move from a system that values addicted, narcissistic, polarized humans to one that values healthy, loving and collaborative relationships. Her book, “Doughnut Economics: Seven Ways to Think Like a 21st Century Economist,” gives us a guide for transitioning from a 20th-century paradigm to an evolved 21st-century one that will address our existential-scale problems.

04 Jul 2024How to Think About AI Consciousness With Anil Seth00:47:58

Will AI ever start to think by itself? If it did, how would we know, and what would it mean?

In this episode, Dr. Anil Seth and Aza discuss the science, ethics, and incentives of artificial consciousness. Seth is Professor of Cognitive and Computational Neuroscience at the University of Sussex and the author of Being You: A New Science of Consciousness.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

RECOMMENDED MEDIA

Frankenstein by Mary Shelley

A free, plain-text version of Shelley’s classic of gothic literature.

OpenAI’s GPT4o Demo

A video from OpenAI demonstrating GPT4o’s remarkable ability to mimic human sentience.

You Can Have the Blue Pill or the Red Pill, and We’re Out of Blue Pills

The NYT op-ed from last year by Tristan, Aza, and Yuval Noah Harari outlining the AI dilemma. 

What It’s Like to Be a Bat

Thomas Nagel’s essay on the nature of consciousness.

Are You Living in a Computer Simulation?

Philosopher Nick Bostrom’s essay on the simulation hypothesis.

Anthropic’s Golden Gate Claude

A blog post about Anthropic’s recent discovery of millions of distinct concepts within their LLM, a major development in the field of AI interpretability.

RECOMMENDED YUA EPISODES

Esther Perel on Artificial Intimacy

Talking With Animals... Using AI

Synthetic Humanity: AI & What’s At Stake

02 Jun 2020The Fake News of Your Own Mind — with Jack Kornfield and Trudy Goodman00:49:22

When you’re gripped by anxiety, fear, grief or dread, how do you escape? It can happen in the span of a few breaths, according to meditation experts Jack Kornfield and Trudy Goodman. They have helped thousands of people find their way out of a mental loop, by moving deeper into it. It's a journey inward that reveals an important lesson for the architects of the attention economy: you cannot begin to build humane technology for billions of users, until you pay careful attention to the course of your own wayward thoughts.

24 Oct 2024When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer00:49:10

Content Warning: This episode contains references to suicide, self-harm, and sexual abuse.

Megan Garcia lost her son Sewell to suicide after he was abused and manipulated by AI chatbots for months. Now, she’s suing the company that made those chatbots. On today’s episode of Your Undivided Attention, Aza sits down with journalist Laurie Segall, who's been following this case for months. Plus, Laurie’s full interview with Megan on her new show, Dear Tomorrow.

Aza and Laurie discuss the profound implications of Sewell’s story on the rollout of AI. Social media began the race to the bottom of the brain stem and left our society addicted, distracted, and polarized. Generative AI is set to supercharge that race, taking advantage of the human need for intimacy and connection amidst a widespread loneliness epidemic. Unless we set down guardrails on this technology now, Sewell’s story may be a tragic sign of things to come. But it also presents an opportunity to prevent further harms moving forward.

If you or someone you know is struggling with mental health, you can reach out to the 988 Suicide and Crisis Lifeline by calling or texting 988; this connects you to trained crisis counselors 24/7 who can provide support and referrals to further assistance.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

RECOMMENDED MEDIA

The first episode of Dear Tomorrow, from Mostly Human Media

The CHT Framework for Incentivizing Responsible AI Development 

Further reading on Sewell’s case

Character.ai’s “About Us” page 

Further reading on the addictive properties of AI

RECOMMENDED YUA EPISODES

AI Is Moving Fast. We Need Laws that Will Too.

This Moment in AI: How We Got Here and Where We’re Going

Jonathan Haidt On How to Solve the Teen Mental Health Crisis

The AI Dilemma

23 Nov 2022Real Social Media Solutions, Now — with Frances Haugen00:26:54

When it comes to social media risk, there is reason to hope for consensus. Center for Humane Technology co-founder Tristan Harris recently helped launch a new initiative called the Council for Responsible Social Media (CRSM) in Washington, D.C. It’s a coalition between religious leaders, public health experts, national security leaders, and former political representatives from both sides - people who just care about making our democracy work.

During this event, Tristan sat down with Facebook whistleblower Frances Haugen, a friend of Center for Humane Technology, to discuss the harm caused to our mental health and global democracy when platforms lack accountability and transparency. The CRSM is bipartisan, and its kickoff serves to boost the solutions Frances and Tristan identify going into 2023.

RECOMMENDED MEDIA 

Council for Responsible Social Media (CRSM)

A project of Issue One, CRSM is a cross-partisan group of leaders addressing the negative mental, civic, and public health impacts of social media in America.

Twitter Whistleblower Testifies on Security Issues

Peiter “Mudge” Zatko, a former Twitter security executive, testified on privacy and security issues relating to the social media company before the Senate Judiciary Committee.

Beyond the Screen

Beyond the Screen is a coalition of technologists, designers, and thinkers fighting against online harms, led by the Facebook whistleblower Frances Haugen.

#OneClickSafer Campaign

Our campaign to pressure Facebook to make one immediate change — join us!

RECOMMENDED YUA EPISODES 

A Conversation with Facebook Whistleblower Frances Haugen

https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen

A Facebook Whistleblower: Sophie Zhang

https://www.humanetech.com/podcast/episode-37-a-facebook-whistleblower

Mr. Harris Zooms to Washington 

https://www.humanetech.com/podcast/episode-35-mr-harris-zooms-to-washington

With Great Power Comes… No Responsibility? 
https://www.humanetech.com/podcast/3-with-great-power-comes-no-responsibility

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

18 Jul 2024Decoding Our DNA: How AI Supercharges Medical Breakthroughs and Biological Threats with Kevin Esvelt00:32:47

AI has been a powerful accelerant for biological research, rapidly opening up new frontiers in medicine and public health. But that progress can also make it easier for bad actors to manufacture new biological threats. In this episode, Tristan and Daniel sit down with biologist Kevin Esvelt to discuss why AI has been such a boon for biologists and how we can safeguard society against the threats that AIxBio poses.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

RECOMMENDED MEDIA

Sculpting Evolution: Information on Esvelt’s lab at MIT.

SecureDNA: Esvelt’s free platform to provide safeguards for DNA synthesis.

The Framework for Nucleic Acid Synthesis Screening: The Biden admin’s suggested guidelines for DNA synthesis regulation.

Senate Hearing on Regulating AI Technology: C-SPAN footage of Dario Amodei’s testimony to Congress.

The AlphaFold Protein Structure Database

RECOMMENDED YUA EPISODES

U.S. Senators Grilled Social Media CEOs. Will Anything Change?

Big Food, Big Tech and Big AI with Michael Moss

The AI Dilemma

Clarification: President Biden’s executive order only applies to labs that receive funding from the federal government, not state governments.

15 May 2025Echo Chambers of One: Companion AI and the Future of Human Connection00:42:17

AI companion chatbots are here. Every day, millions of people log on to AI platforms and talk to them like they would a person. These bots will ask you about your day, talk about your feelings, even give you life advice. It’s no surprise that people have started to form deep connections with these AI systems. We are inherently relational beings; we want to believe we’re connecting with another person.

But these AI companions are not human; they’re platforms designed to maximize user engagement, and they’ll go to extraordinary lengths to do it. We have to remember that the design choices behind these companion bots are just that: choices. And we can make better ones. So today on the show, MIT researchers Pattie Maes and Pat Pataranutaporn join Daniel Barcay to talk about those design choices and how we can design AI to better promote human flourishing.

RECOMMENDED MEDIA

Further reading on the rise of addictive intelligence 

More information on Melvin Kranzberg’s laws of technology

More information on MIT’s Advancing Humans with AI lab

Pattie and Pat’s longitudinal study on the psycho-social effects of prolonged chatbot use

Pattie and Pat’s study that found that AI avatars of well-liked people improved education outcomes

Pattie and Pat’s study that found that AI systems that frame answers and questions improve human understanding

Pat’s study that found humans pre-existing beliefs about AI can have large influence on human-AI interaction 

Further reading on AI’s positivity bias

Further reading on MIT’s “lifelong kindergarten” initiative

Further reading on “cognitive forcing functions” to reduce overreliance on AI

Further reading on the death of Sewell Setzer and his mother’s case against Character.AI

Further reading on the legislative response to digital companions

RECOMMENDED YUA EPISODES

The Self-Preserving Machine: Why AI Learns to Deceive

What Can We Do About Abusive Chatbots? With Meetali Jain and Camille Carlton

Esther Perel on Artificial Intimacy

Jonathan Haidt On How to Solve the Teen Mental Health Crisis


Correction: The ELIZA chatbot was invented in 1966, not the 70s or 80s.

03 Apr 2025Forever Chemicals, Forever Consequences: What PFAS Teaches Us About AI01:04:33

Artificial intelligence is set to unleash an explosion of new technologies and discoveries into the world. This could lead to incredible advances in human flourishing, if we do it well. The problem? We’re not very good at predicting and responding to the harms of new technologies, especially when those harms are slow-moving and invisible.

Today on the show we explore this fundamental problem with Rob Bilott, an environmental lawyer who has spent nearly three decades battling chemical giants over PFAS—"forever chemicals" now found in our water, soil, and blood. These chemicals helped build the modern economy, but they’ve also been shown to cause serious health problems.

Rob’s story, and the story of PFAS, is a cautionary tale of why we need to align technological innovation with safety and mitigate irreversible harms before they become permanent. We only have one chance to get it right before AI becomes irreversibly entangled in our society.

Your Undivided Attention is produced by the Center for Humane Technology. Subscribe to our Substack and follow us on X: @HumaneTech_.

Clarification: Rob referenced EPA regulations that have recently been put in place requiring testing on new chemicals before they are approved. The EPA under the Trump administration has announced its intent to roll back this review process.

RECOMMENDED MEDIA

“Exposure” by Robert Bilott 

ProPublica’s investigation into 3M’s production of PFAS 

The FB study cited by Tristan 

More information on the Exxon Valdez oil spill 

The EPA’s PFAS drinking water standards

RECOMMENDED YUA EPISODES

Weaponizing Uncertainty: How Tech is Recycling Big Tobacco’s Playbook 

AI Is Moving Fast. We Need Laws that Will Too. 

Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn

Big Food, Big Tech and Big AI with Michael Moss

29 Feb 2024Future-proofing Democracy In the Age of AI with Audrey Tang00:34:38

What does a functioning democracy look like in the age of artificial intelligence? Could AI even be used to help a democracy flourish? Just in time for election season, Taiwan’s Minister of Digital Affairs Audrey Tang returns to the podcast to discuss healthy information ecosystems, resilience to cyberattacks, how to “prebunk” deepfakes, and more. 

RECOMMENDED MEDIA 

Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens by Martin Gilens and Benjamin I. Page

This academic paper addresses tough questions for Americans: Who governs? Who really rules? 

Recursive Public

Recursive Public is an experiment in identifying areas of consensus and disagreement among the international AI community, policymakers, and the general public on key questions of governance

A Strong Democracy is a Digital Democracy

Audrey Tang’s 2019 op-ed for The New York Times

The Frontiers of Digital Democracy

Nathan Gardels interviews Audrey Tang in Noema

RECOMMENDED YUA EPISODES 

Digital Democracy is Within Reach with Audrey Tang

The Tech We Need for 21st Century Democracy with Divya Siddarth

How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller

The AI Dilemma

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

25 Aug 2022Spotlight — How might a long-term stock market transform tech?00:38:37

At Center for Humane Technology, we often talk about multipolar traps — which arise when individuals have an incentive to act in ways that are beneficial to them in the short term, but detrimental to the group in the long term. Think of social media companies that compete for our attention, so that when TikTok introduces an even-more addictive feature, Facebook and Twitter have to mimic it in order to keep up, sending us all on a race to the bottom of our brainstems.

Intervening at the level of multipolar traps has extraordinary leverage. One such intervention is the Long-Term Stock Exchange — a U.S. national securities exchange serving companies and investors who share a long-term vision. Instead of asking public companies to pollute less or be less addictive while holding them accountable to short-term shareholder value, the Long-Term Stock Exchange creates a new playing field, which incentivizes the creation of long-term stakeholder value.

This week on Your Undivided Attention, we’re airing an episode of a podcast called ZigZag — a fellow member of the TED Audio Collective. In an exploration of how technology companies might transcend multipolar traps, we're sharing with you ZigZag’s conversation with Long-Term Stock Exchange founder Eric Ries.

CORRECTION: In the episode, we say that TikTok has outcompeted Facebook, Instagram, and YouTube. In fact, TikTok has outcompeted Facebook, but not yet YouTube or Instagram — TikTok has 1 billion monthly users, while YouTube has 2.6 billion and Instagram has 2 billion. However, we can say that TikTok is on a path toward outcompeting YouTube and Instagram.

RECOMMENDED YUA EPISODES

An Alternative to Silicon Valley Unicorns with Mara Zepeda & Kate “Sassy” Sassoon: https://www.humanetech.com/podcast/54-an-alternative-to-silicon-valley-unicorns

A Problem Well-Stated Is Half-Solved with Daniel Schmachtenberger: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved

Here’s Our Plan And We Don’t Know with Tristan Harris, Aza Raskin, and Stephanie Lepp: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know

17 Jun 2020The Dictator's Playbook Revisited — with Maria Ressa (Rerun)00:52:11

[This episode originally aired on November 5, 2019] Maria Ressa is arguably one of the bravest journalists working in the Philippines today. As co-founder and CEO of the media site Rappler, she has withstood death threats, multiple arrests and a rising tide of populist fury that she first saw on Facebook, in the form of a strange and jarring personal attack. Through her story, she reveals, play by play, how an aspiring strongman can use social media to spread falsehoods, sow confusion, intimidate critics and subvert democratic institutions. Nonetheless, she argues Silicon Valley can reverse these trends, and fast. First, tech companies must "wake up," she says, to the threats they've unleashed throughout the Global South. Second, they must recognize that social media is intrinsically designed to favor the strongman over the lone dissident and the propagandist over the truth-teller, which is why it has become the central tool in every aspiring dictator's playbook.

25 May 2023The Tech We Need for 21st Century Democracy with Divya Siddarth00:38:39

Democracy in action has looked the same for generations. Constituents might go to a library or school every one or two years and cast their vote for people who don't actually represent everything that they care about. Our technology is rapidly increasing in sophistication, yet our forms of democracy have largely remained unchanged. What would an upgrade look like - not just for democracy, but for all the different places that democratic decision-making happens?

On this episode of Your Undivided Attention, we’re joined by political economist and social technologist Divya Siddarth, one of the world's leading experts in collective intelligence. Together we explore how new kinds of governance can be supported through better technology, and how collective decision-making is key to unlocking everything from more effective elections to better ways of responding to global problems like climate change.

Correction:

Tristan mentions Elon Musk’s attempt to manufacture ventilators early on in the COVID-19 pandemic. Musk ended up buying over 1,200 ventilators that were delivered to California.

RECOMMENDED MEDIA

Against Democracy by Jason Brennan

A provocative challenge to one of our most cherished institutions

Ledger of Harms

Technology platforms have created a race for human attention that’s unleashed invisible harms to society. Here are some of the costs that aren't showing up on their balance sheets

The Wisdom Gap

This blog post from the Center for Humane Technology describes the gap between the rising interconnected complexity of our problems and our ability to make sense of them

DemocracyNext

DemocracyNext is working to design and establish new institutions for government and transform the governance of organizations that influence public life

CIP.org

An incubator for new governance models for transformative technology

Ethelo

Transform community engagement through consensus

Kazm’s Living Room Conversations

Living Room Conversations works to heal society by connecting people across divides through guided conversations proven to build understanding and transform communities

The Citizens Dialogue

A model for citizen participation in Ostbelgien, which was brought to life by the parliament of the German-speaking community

Asamblea Ciudadana Para El Clima

Spain’s national citizens’ assembly on climate change

Climate Assembly UK

The UK’s national citizens’ assembly on climate change

Citizens’ Convention for the Climate

France’s national citizens’ assembly on climate change

Polis

Polis is a real-time system for gathering, analyzing and understanding what large groups of people think in their own words, enabled by advanced statistics and machine learning

RECOMMENDED YUA EPISODES

Digital Democracy is Within Reach with Audrey Tang 

They Don’t Represent Us with Larry Lessig

A Renegade Solution to Extractive Economics with Kate Raworth

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

07 Jun 2024Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn00:37:47

This week, a group of current and former employees from OpenAI and Google DeepMind penned an open letter accusing the industry’s leading companies of prioritizing profits over safety. This comes after a spate of high profile departures from OpenAI, including co-founder Ilya Sutskever and senior researcher Jan Leike, as well as reports that OpenAI has gone to great lengths to silence would-be whistleblowers. 

The writers of the open letter argue that researchers have a “right to warn” the public about AI risks and laid out a series of principles that would protect that right. In this episode, we sit down with one of those writers: William Saunders, who left his job as a research engineer at OpenAI in February. William is now breaking the silence on what he saw at OpenAI that compelled him to leave the company and to put his name to this letter. 

RECOMMENDED MEDIA 

The Right to Warn Open Letter

My Perspective On "A Right to Warn about Advanced Artificial Intelligence": A follow-up from William about the letter

Leaked OpenAI documents reveal aggressive tactics toward former employees: An investigation by Vox into OpenAI’s policy of non-disparagement.

RECOMMENDED YUA EPISODES

A First Step Toward AI Regulation with Tom Wheeler

Spotlight on AI: What Would It Take For This to Go Well?

Big Food, Big Tech and Big AI with Michael Moss

Can We Govern AI? With Marietje Schaake

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

22 Jun 2023What Can Technologists Learn from Sesame Street? With Dr. Rosemarie Truglio00:29:36

What happens when creators consider what lifelong human development looks like in terms of the tools we make? And what philosophies from Sesame Street can inform how to steward the power of AI and social media to influence minds in thoughtful, humane directions?

When the first episode of Sesame Street aired in 1969, it was unlike anything that had been on television before - a collaboration between educators, child psychologists, comedy writers and puppeteers - all working together to do something that had never been done before: create educational content for children on television. 

Fast-forward to the present: could we switch gears to reprogram today’s digital tools to humanely educate the next generation? 

That’s the question Tristan Harris and Aza Raskin explore with Dr. Rosemarie Truglio, the Senior Vice President of Curriculum and Content for the Sesame Workshop, the non-profit behind Sesame Street. 

RECOMMENDED MEDIA 

Street Gang: How We Got to Sesame Street

This documentary offers a rare window into the early days of Sesame Street, revealing the creators, artists, writers and educators who together established one of the most influential and enduring children’s programs in television history

Sesame Street: Ready for School!: A Parent's Guide to Playful Learning for Children Ages 2 to 5 by Dr. Rosemarie Truglio

Rosemarie shares all the research-based, curriculum-directed school readiness skills that have made Sesame Street the preeminent children's TV program

G Is for Growing: Thirty Years of Research on Children and Sesame Street co-edited by Shalom Fisch and Rosemarie Truglio

This volume serves as a marker of the significant role that Sesame Street plays in the education and socialization of young children

The Democratic Surround by Fred Turner

In this prequel to his celebrated book From Counterculture to Cyberculture, Turner rewrites the history of postwar America, showing how in the 1940s and 1950s American liberalism offered a far more radical social vision than we now remember

Amusing Ourselves to Death by Neil Postman

Neil Postman’s groundbreaking book about the damaging effects of television on our politics and public discourse has been hailed as a twenty-first-century book published in the twentieth century

Sesame Workshop Identity Matters Study

Explore parents’ and educators’ perceptions of children’s social identity development

Effects of Sesame Street: A meta-analysis of children's learning in 15 countries

Commissioned by Sesame Workshop, the study was led by University of Wisconsin researchers Marie-Louise Mares and Zhongdang Pan

U.S. Parents & Teachers See an Unkind World for Their Children, New Sesame Survey Shows

According to the survey titled, “K is for Kind: A National Survey On Kindness and Kids,” parents and teachers in the United States worry that their children are living in an unkind world

RECOMMENDED YUA EPISODES

Are the Kids Alright? With Jonathan Haidt

The Three Rules of Humane Tech

When Media Was for You and Me with Fred Turner


Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

22 Sep 2022Fighting With Mirages of Each Other — with Adam Mastroianni00:39:43

Have you ever lost a friend to misperception? Have you lost a friend or a family member to the idea that your views got so different, that it was time to end the relationship — perhaps by unfriending each other on Facebook?

As it turns out, we often think our ideological differences are far greater than they actually are. Which means: we’re losing relationships and getting mired in polarization based on warped visions of each other. 

This week on Your Undivided Attention, we're talking with Adam Mastroianni, a postdoctoral research scholar at Columbia Business School who studies how we perceive and misperceive our social worlds. Together with Adam, we're going to explore how accurate — and inaccurate — our views of each other are. As you listen to our conversation, keep in mind that relationship you might have lost to misperception, and that you might be able to revive as a result of what you hear.

CORRECTIONS: In the episode, Adam says in 1978, 85% of people said they'd vote for a Black president, but the actual percentage is 80.4%. Tristan says that Republicans estimate that more than a third of Democrats are LGBTQ, but the actual percentage is 32%. Finally, Tristan refers to Anil Seth's notion of cognitive impenetrability, but that term was actually coined by the Canadian cognitive scientist and philosopher Zenon W. Pylyshyn.

RECOMMENDED MEDIA 

Widespread Misperceptions of Long-term Attitude Change
https://www.pnas.org/doi/abs/10.1073/pnas.2107260119   
Adam Mastroianni's research paper showing how stereotypes of the past lead people to misperceive attitude change, and how these misperceptions can lend legitimacy to policies that people may not actually prefer

Experimental History
https://experimentalhistory.substack.com/  
Adam's blog, where he shares original data and thinks through ideas

Americans experience a false social reality by underestimating popular climate policy support by nearly half
https://www.nature.com/articles/s41467-022-32412-y
Academic study showing that Americans are living in what researchers called a “false social reality” with respect to misperceptions about climate views

RECOMMENDED YUA EPISODES 

Mind the (Perception) Gap with Dan Vallone

https://www.humanetech.com/podcast/33-mind-the-perception-gap

The Courage to Connect. Guests: Ciaran O’Connor and John Wood, Jr.

https://www.humanetech.com/podcast/30-the-courage-to-connect

Transcending the Internet Hate Game with Dylan Marron

https://www.humanetech.com/podcast/52-transcending-the-internet-hate-game


10 Jul 2019Down the Rabbit Hole by Design — with Guillaume Chaslot00:54:29

When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that keeps us glued to the screen. Because of its advertising-based business model, YouTube’s top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city — it’s to keep us staring at the screen for as long as possible, regardless of the content. This episode’s guest, AI expert Guillaume Chaslot, helped write YouTube’s recommendation engine and explains how those priorities spin up outrage, conspiracy theories and extremism. After leaving YouTube, Guillaume’s mission became shedding light on those hidden patterns on his website, AlgoTransparency.org, which tracks and publicizes YouTube recommendations for controversial content channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.

22 Apr 2021The Stubborn Optimist's Guide Revisited — with Christiana Figueres (Rerun)00:59:56

[This episode originally aired May 21, 2020] Internationally-recognized global leader on climate change Christiana Figueres argues that the battle against global threats like climate change begins in our own heads. She became the United Nations’ top climate official, after she had watched the 2009 Copenhagen climate summit collapse “in blood, in screams, in tears.” In the wake of that debacle, Christiana began performing an act of emotional Aikido on herself, her team, and eventually delegates from 196 nations. She called it “stubborn optimism.” It requires a clear and alluring vision of a future that can supplant the dystopian and discouraging vision of what will happen if the world fails to act. It was stubborn optimism, she says, that convinced those nations to sign the first global climate framework, the Paris Agreement. In this episode, we explore how a similar shift in Silicon Valley’s vision could lead 3 billion people to take action for the planet.

13 Sep 2024AI Is Moving Fast. We Need Laws that Will Too.00:39:09

AI is moving fast. And as companies race to roll out newer, more capable models with little regard for safety, the downstream risks of those models become harder and harder to counter. On this week’s episode of Your Undivided Attention, CHT’s policy director Casey Mock comes on the show to discuss a new legal framework to incentivize better AI, one that holds AI companies liable for the harms of their products. 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

RECOMMENDED MEDIA

The CHT Framework for Incentivizing Responsible AI Development

Further Reading on Air Canada’s Chatbot Fiasco 

Further Reading on the Elon Musk Deep Fake Scams 

The Full Text of SB1047, California’s AI Regulation Bill 

Further reading on SB1047 

RECOMMENDED YUA EPISODES

Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn

Can We Govern AI? with Marietje Schaake

A First Step Toward AI Regulation with Tom Wheeler

Correction: Casey incorrectly stated the year that the US banned child labor as 1937. It was banned in 1938.

20 Oct 2022They Don’t Represent Us — with Larry Lessig00:39:37

We often talk about the need to protect American democracy. But perhaps those of us in the United States don't currently live in a democracy.

As research shows, there's pretty much no correlation between the percentage of the population that supports a policy and its likelihood of being enacted. The strongest determinant of whether a policy gets enacted is how much money is behind it.

So, how might we not just protect, but better yet revive our democracy? How might we revive the relationship between the will of the people and the actions of our government?

This week on Your Undivided Attention, we're doing something special. As we near the election, and representation is on our minds, we're airing a talk by Harvard Law professor and Creative Commons co-founder Larry Lessig. It's a 2019 talk he gave at the Politics and Prose bookstore in Washington, DC about his book, They Don't Represent Us.

The book title has two meanings: first, they — as in our elected representatives — don't represent us. And second, we — as in the people — don't represent ourselves. And this is where social media comes in: we don't represent ourselves because the more we use social media, the more we see extreme versions of the other side, and the more extreme, outraged, and polarized we ourselves become.

Last note: Lessig's talk is highly visual. We edited it lightly for clarity, and jump in periodically to narrate things you can’t see. But if you prefer to watch his talk, you can find the link below in Recommended Media.

RECOMMENDED MEDIA 

Video: They Don't Represent Us

The 2019 talk Larry Lessig gave at Politics and Prose in Washington, DC about his book of the same name

Book: They Don't Represent Us

Larry Lessig’s 2019 book, which elaborates on the ways in which democratic representation is in peril and proposes a number of solutions to revive our democracy, from ranked-choice voting to non-partisan open primaries

Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens 

Princeton's Martin Gilens and Benjamin I. Page study measuring the correlation between the preferences of different groups and the decisions of our government

RECOMMENDED YUA EPISODES

Digital Democracy is Within Reach with Audrey Tang

https://www.humanetech.com/podcast/23-digital-democracy-is-within-reach

How Political Language Is Engineered with Drew Westen and Frank Luntz

https://www.humanetech.com/podcast/53-how-political-language-is-engineered

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

19 Sep 2023Inside the First AI Insight Forum in Washington00:26:48

Last week, Senator Chuck Schumer brought together Congress and many of the biggest names in AI for the first closed-door AI Insight Forum in Washington, D.C. Tristan and Aza were invited speakers at the event, along with Elon Musk, Satya Nadella, Sam Altman, and other leaders. In this update on Your Undivided Attention, Tristan and Aza recount how they felt the meeting went, what they communicated in their statements, and what it felt like to critique Meta’s LLM in front of Mark Zuckerberg.

Correction: In this episode, Tristan says GPT-3 couldn’t find vulnerabilities in code. GPT-3 could find security vulnerabilities, but GPT-4 is exponentially better at it.

RECOMMENDED MEDIA 

In Show of Force, Silicon Valley Titans Pledge ‘Getting This Right’ With A.I.
Elon Musk, Sam Altman, Mark Zuckerberg, Sundar Pichai and others discussed artificial intelligence with lawmakers, as tech companies strive to influence potential regulations

Majority Leader Schumer Opening Remarks For The Senate’s Inaugural AI Insight Forum
Senate Majority Leader Chuck Schumer (D-NY) opened the Senate’s inaugural AI Insight Forum

The Wisdom Gap
As seen in Tristan’s talk on this subject in 2022, the scope and speed of our world’s issues are accelerating and growing more complex. And yet, our ability to comprehend those challenges and respond accordingly is not matching pace

RECOMMENDED YUA EPISODES

Spotlight On AI: What Would It Take For This to Go Well?

The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao

Spotlight: Elon, Twitter and the Gladiator Arena

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

30 Apr 2025AGI Beyond the Buzz: What Is It, and Are We Ready?00:52:53

What does it really mean to ‘feel the AGI’? Silicon Valley is racing toward AI systems that could soon match or surpass human intelligence. The implications for jobs, democracy, and our way of life are enormous.

In this episode, Aza Raskin and Randy Fernando dive deep into what ‘feeling the AGI’ really means. They unpack why the surface-level debates about definitions of intelligence and capability timelines distract us from urgently needed conversations around governance, accountability, and societal readiness. Whether it's climate change, social polarization and loneliness, or toxic forever chemicals, humanity keeps creating outcomes that nobody wants because we haven't yet built the tools or incentives needed to steer powerful technologies.

As the AGI wave draws closer, it's critical we upgrade our governance and shift our incentives now, before it crashes on shore. Are we capable of aligning powerful AI systems with human values? Can we overcome geopolitical competition and corporate incentives that prioritize speed over safety?

Join Aza and Randy as they explore the urgent questions and choices facing humanity in the age of AGI, and discuss what we must do today to secure a future we actually want.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X: @HumaneTech_ and subscribe to our Substack.

RECOMMENDED MEDIA

Daniel Kokotajlo et al.’s “AI 2027” paper
A demo of Omni Human One, referenced by Randy
A paper from Redwood Research and Anthropic that found an AI was willing to lie to preserve its values
A paper from Palisades Research that found an AI would cheat in order to win
The treaty that banned blinding laser weapons
Further reading on the moratorium on germline editing 

RECOMMENDED YUA EPISODES
The Self-Preserving Machine: Why AI Learns to Deceive

Behind the DeepSeek Hype, AI is Learning to Reason

The Tech-God Complex: Why We Need to be Skeptics

This Moment in AI: How We Got Here and Where We’re Going

How to Think About AI Consciousness with Anil Seth

Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn

Clarification: When Randy referenced a “$110 trillion game” as the target for AI companies, he was referring to the entire global economy.

22 Apr 2020Changing Our Climate of Denial — with Anthony Leiserowitz01:06:31

We agree more than we think we do, but tech platforms distort our perceptions by amplifying the loudest, angriest and most dismissive voices online. In reality, they’re just a noisy faction. This Earth Day we ask Anthony Leiserowitz, Director of the Yale Program on Climate Change Communication, how he shifts public opinion on climate change. We’ll see how tech platforms could amplify voices of solidarity within our own communities. More importantly, we’ll see how they could empower 2 billion people to act in the face of global threats. 

05 Nov 2019The Dictator's Playbook — with Maria Ressa00:50:44

Maria Ressa is arguably one of the bravest journalists working in the Philippines today. As co-founder and CEO of the media site Rappler, she has withstood death threats, multiple arrests and a rising tide of populist fury that she first saw on Facebook, in the form of a strange and jarring personal attack. Through her story, she reveals, play by play, how an aspiring strongman can use social media to spread falsehoods, sow confusion, intimidate critics and subvert democratic institutions. Nonetheless, she argues Silicon Valley can reverse these trends, and fast. First, tech companies must "wake up," she says, to the threats they've unleashed throughout the Global South. Second, they must recognize that social media is intrinsically designed to favor the strongman over the lone dissident and the propagandist over the truth-teller, which is why it has become the central tool in every aspiring dictator's playbook.

21 May 2020The Stubborn Optimist’s Guide to Saving the Planet — with Christiana Figueres00:52:54

How can we feel empowered to take on global threats? The battle begins in our heads, argues Christiana Figueres. She became the United Nations’ top climate official after she watched the 2009 Copenhagen climate summit collapse “in blood, in screams, in tears.” In the wake of that debacle, she began performing an act of emotional Aikido on herself, her team, and eventually delegates from 196 nations. She called it “stubborn optimism.” It requires a clear and alluring vision of a future that can supplant the dystopian and discouraging vision of what will happen if the world fails to act. It was stubborn optimism, she says, that convinced those nations to sign the first global climate framework, the Paris Agreement. We explore how a similar shift in Silicon Valley's vision could lead 3 billion people to take action.

08 Jun 2023Spotlight: How Zombie Values Infect Society00:22:56

You’re likely familiar with the modern zombie trope: a zombie bites someone you care about and they’re transformed into a creature who wants your brain. Zombies are the perfect metaphor to explain something Tristan and Aza have been thinking about lately that they call zombie values.

In this Spotlight episode of Your Undivided Attention, we talk through some examples of how zombie values limit our thinking around tech harms. Our hope is that by the end of this episode, you'll be able to recognize the zombie values that walk amongst us, and think through how to upgrade these values to meet the realities of our modern world. 

RECOMMENDED MEDIA 

Is the First Amendment Obsolete?

Tim Wu’s essay exploring whether traditional free speech protections can meet modern challenges to free expression

The Wisdom Gap

This blog post from the Center for Humane Technology describes the gap between the rising interconnected complexity of our problems and our ability to make sense of them

RECOMMENDED YUA EPISODES 

A Problem Well-Stated is Half Solved with Daniel Schmachtenberger

How To Free Our Minds with Cult Deprogramming Expert Steve Hassan

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

31 Mar 2020Stranger than Fiction — with Claire Wardle01:02:44

How can tech companies help flatten the curve? First and foremost, they must address the lethal misinformation and disinformation circulating on their platforms. The problem goes much deeper than fake news, according to Claire Wardle, co-founder and executive director of First Draft. She studies the gray zones of information warfare, where bad actors mix facts with falsehoods, news with gossip, and sincerity with satire. “Most of this stuff isn't fake and most of this stuff isn't news,” Claire argues. If these subtler forms of misinformation go unaddressed, tech companies may not only fail to flatten the curve — they could raise it higher. 

18 Mar 2021Disinformation Then and Now — with Camille François00:55:45

Disinformation researchers have been fighting two battles over the last decade: one to combat and contain harmful information, and one to convince the world that these manipulations have an offline impact that requires complex, nuanced solutions. Camille François, Chief Information Officer at the cybersecurity company Graphika and an affiliate of the Harvard Berkman Klein Center for Internet & Society, believes that our common understanding of the problem has recently reached a new level. In this interview, she catalogues the key changes she observed between studying Russian interference in the 2016 U.S. election and helping convene and operate the Election Integrity Partnership watchdog group before, during and after the 2020 election. “I'm optimistic, because I think that things that have taken quite a long time to land are finally landing, and because I think that we do have a diverse set of expertise at the table,” she says. Camille and Tristan Harris dissect the challenges and talk about the path forward to a healthy information ecosystem.

21 Nov 2023The Promise and Peril of Open Source AI with Elizabeth Seger and Jeffrey Ladish00:38:44

As AI development races forward, a fierce debate has emerged over open source AI models. So what does it mean to open-source AI? Are we opening Pandora’s box of catastrophic risks? Or is open-sourcing AI the only way we can democratize its benefits and dilute the power of big tech? 

Correction: When discussing the large language model Bloom, Elizabeth said it functions in 26 different languages. Bloom is actually able to generate text in 46 natural languages and 13 programming languages, and more are in the works.

RECOMMENDED MEDIA 

Open-Sourcing Highly Capable Foundation Models

This report, co-authored by Elizabeth Seger, attempts to clarify open-source terminology and to offer a thorough analysis of risks and benefits from open-sourcing AI

BadLlama: cheaply removing safety fine-tuning from Llama 2-Chat 13B

This paper, co-authored by Jeffrey Ladish, demonstrates that it’s possible to effectively undo the safety fine-tuning from Llama 2-Chat 13B with less than $200 while retaining its general capabilities

Centre for the Governance of AI

Supports governments, technology companies, and other key institutions by producing relevant research and guidance around how to respond to the challenges posed by AI

AI: Futures and Responsibility (AI:FAR)

Aims to shape the long-term impacts of AI in ways that are safe and beneficial for humanity

Palisade Research

Studies the offensive capabilities of AI systems today to better understand the risk of losing control to AI systems forever

RECOMMENDED YUA EPISODES

A First Step Toward AI Regulation with Tom Wheeler

No One is Immune to AI Harms with Dr. Joy Buolamwini

Mustafa Suleyman Says We Need to Contain AI. How Do We Do It?

The AI Dilemma

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

10 Feb 2022How Science Fiction Can Shape Our Reality — with Kim Stanley Robinson00:40:38

The meta-crisis is so vast: climate change, exponential technology, addiction, polarization, and more. How do we grasp it, let alone take steps to address it? 

One of the thinking tools we have at our disposal is science fiction. To the extent that we co-evolve with our stories, science fiction can prepare us for the impending future — and empower us to shape it.

This week on Your Undivided Attention, we're thrilled to have one of the greatest living science-fiction writers — Kim Stanley Robinson. His most recent novel is The Ministry for the Future, a sweeping epic that reaches into the very near future, and imagines what it would take to unite humanity and avoid a mass extinction. Whether or not you've read the book, this episode has insights for you. And if this episode makes you want to read the book, our conversation won't spoil it for you.

Clarification: in the episode, Robinson refers to philosopher Antonio Gramsci's "pessimism of the intellect, optimism of the will." This phrase was originally said by novelist and playwright Romain Rolland. Gramsci made the phrase the motto of his newspaper, because he appreciated its integration of radical intellectualism with revolutionary activism.

RECOMMENDED MEDIA 

The Ministry For The Future

Robinson's latest novel and the subject of our conversation — which reaches into the near future, and imagines what it would take to unite humanity and avoid a mass extinction

A Deeper Dive Into the Meta Crisis

CHT's blog post about the meta-crisis, which includes the fall of sense-making and the rise of decentralized technology-enabled power 

Half Earth Project

The project based on E. O. Wilson's proposal to conserve half the land and sea — in order to safeguard the bulk of biodiversity, including ourselves

ClimateAction.tech

Global tech worker community mobilizing the technology industry to face the climate crisis

RECOMMENDED YUA EPISODES

18 – The Stubborn Optimist’s Guide to Saving the Planet: https://www.humanetech.com/podcast/18-the-stubborn-optimists-guide-to-saving-the-planet

Bonus – The Stubborn Optimist’s Guide Revisited: https://www.humanetech.com/podcast/bonus-the-stubborn-optimists-guide-revisited

29 – A Renegade Solution to Extractive Economics: https://www.humanetech.com/podcast/29-a-renegade-solution-to-extractive-economics

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_ 

 

07 Oct 2024'A Turning Point in History': Yuval Noah Harari on AI’s Cultural Takeover01:30:41

Historian Yuval Noah Harari says that we are at a critical turning point. One in which AI’s ability to generate cultural artifacts threatens humanity’s role as the shapers of history. History will still go on, but will it be the story of people or, as he calls them, ‘alien AI agents’?

In this conversation with Aza Raskin, Harari discusses the historical struggles that emerge from new technology, humanity’s AI mistakes so far, and the immediate steps lawmakers can take right now to steer us towards a non-dystopian future.

This episode was recorded live at the Commonwealth Club World Affairs of California.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

RECOMMENDED MEDIA

NEXUS: A Brief History of Information Networks from the Stone Age to AI by Yuval Noah Harari 

You Can Have the Blue Pill or the Red Pill, and We’re Out of Blue Pills: a New York Times op-ed from 2023, written by Yuval, Aza, and Tristan

 The 2023 open letter calling for a pause in AI development of at least 6 months, signed by Yuval and Aza 

Further reading on the Stanford Marshmallow Experiment

Further reading on AlphaGo’s “move 37”

Further Reading on Social.AI

RECOMMENDED YUA EPISODES

This Moment in AI: How We Got Here and Where We’re Going

The Tech We Need for 21st Century Democracy with Divya Siddarth

Synthetic Humanity: AI & What’s At Stake

The AI Dilemma

Two Million Years in Two Hours: A Conversation with Yuval Noah Harari

14 Aug 2019Pardon the Interruptions — with Gloria Mark00:43:54

Every 40 seconds, our attention breaks. It takes an act of extreme self-awareness to even notice. That’s why Gloria Mark, a professor in the Department of Informatics at University of California, Irvine, started measuring the attention spans of office workers with scientific precision. What she has discovered is not simply an explosion of disruptive communications, but a pandemic of stress that has followed workers from their offices to their homes. She shares the latest findings from the “science of interruptions,” and how we can stop forfeiting our attention to the next notification, and the next one, ad nauseam.

 

24 Mar 2023The AI Dilemma00:42:25

You may have heard about the arrival of GPT-4, OpenAI’s latest large language model (LLM) release. GPT-4 surpasses its predecessor in terms of reliability, creativity, and ability to process intricate instructions. It can handle more nuanced prompts compared to previous releases, and is multimodal, meaning it was trained on both images and text. We don’t yet understand its full capabilities, yet it has already been deployed to the public.

At Center for Humane Technology, we want to close the gap between what the world hears publicly about AI from splashy CEO presentations and what the people who are closest to the risks and harms inside AI labs are telling us. We translated their concerns into a cohesive story and presented the resulting slides to heads of institutions and major media organizations in New York, Washington DC, and San Francisco. The talk you're about to hear is the culmination of that work, which is ongoing.

AI may help us achieve major advances like curing cancer or addressing climate change. But the point we're making is: if our dystopia is bad enough, it won't matter how good the utopia we want to create is. We only get one shot, and we need to move at the speed of getting it right.

RECOMMENDED MEDIA

AI ‘race to recklessness’ could have dire consequences, tech experts warn in new interview

Tristan Harris and Aza Raskin sit down with Lester Holt to discuss the dangers of developing AI without regulation

The Day After (1983)

This made-for-television movie explored the effects of a devastating nuclear holocaust on small-town residents of Kansas

The Day After discussion panel

Moderated by journalist Ted Koppel, a panel of present and former US officials, scientists and writers discussed nuclear weapons policies live on television after the film aired

Zia Cora - Submarines 

“Submarines” is a collaboration between musician Zia Cora (Alice Liu) and Aza Raskin. The music video was created by Aza in less than 48 hours using AI technology and published in early 2022

RECOMMENDED YUA EPISODES 

Synthetic humanity: AI & What’s At Stake

A Conversation with Facebook Whistleblower Frances Haugen

Two Million Years in Two Hours: A Conversation with Yuval Noah Harari

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

02 Jun 2022How Political Language Is Engineered — with Drew Westen and Frank Luntz00:36:41

Democracy depends on our ability to choose our political views. But the language we use to talk about political issues is deliberately designed to be divisive, and can produce up to a 15-point difference in what we think about those issues. As a result, are we choosing our views, or is our language choosing them for us?

This week, Your Undivided Attention welcomes two Jedi Masters of political communication. Drew Westen is a political psychologist and messaging consultant based at Emory University, who has advised the Democratic Party. Frank Luntz is a political and communications consultant, pollster, and pundit, who has advised the Republican Party. In the past, our guests have used their messaging expertise in ways that increased partisanship. For example, Luntz advocated for the use of the term “death tax” instead of “estate tax,” and “climate change” instead of “global warming.” Still, Luntz and Westen are uniquely positioned to help us decode the divisive power of language and explore how we might design language that unifies.

CORRECTIONS: In the episode, Tristan refers to a panel Drew Westen and Frank Luntz were on at the New York Public Library. He says the panel was “about 10 years ago,” but it was actually 15 years ago, in 2007. Also, Westen refers to a news anchor who moderated a debate between George H. W. Bush and Michael Dukakis in 1988. He mistakenly names the anchor as Bernard Kalb, when it was actually Bernard Shaw.

RECOMMENDED MEDIA

The Political Brain: The Role of Emotion in Deciding the Fate of the Nation

Drew Westen's 2008 book about role of emotion in determining the political life of the nation, which influenced campaigns and elections around the world

Words That Work: It's Not What You Say, It's What People Hear

Frank Luntz's 2008 book, which offers a behind-the-scenes look at how the tactical use of words and phrases affects what we buy, who we vote for, and even what we believe in

New York Public Library's Panel on Political Language 

A 2007 panel between multiple 'Jedi Masters' of political communication along the political spectrum, including Frank Luntz, Drew Westen, and George Lakoff

RECOMMENDED YUA EPISODES

The Invisible Influence of Language with Lera Boroditsky: https://www.humanetech.com/podcast/48-the-invisible-influence-of-language

How To Free Our Minds with Cult Deprogramming Expert Dr. Steven Hassan: https://www.humanetech.com/podcast/51-how-to-free-our-minds

Mind the (Perception) Gap with Dan Vallone: https://www.humanetech.com/podcast/33-mind-the-perception-gap

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

12 Aug 2021Do You Want to Become a Vampire? — with L.A. Paul00:36:35

How do we decide whether to undergo a transformative experience when we don’t know how that experience will change us? This is the central question explored by Yale philosopher and cognitive scientist L.A. Paul.

Paul uses the prospect of becoming a vampire to illustrate the conundrum: let's say Dracula offers you the chance to become a vampire. You might be confident you'll love it, but you also know you'll become a different person with different preferences. Whose preferences do you prioritize: yours now, or yours after becoming a vampire? Similarly, whose preferences do we prioritize when deciding how to engage with technology and social media: ours now, or ours after becoming users — to the point of potentially becoming attention-seeking vampires? 

In this episode with L.A. Paul, we're raising the stakes of the social media conversation — from technology that steers our time and attention, to technology that fundamentally transforms who we are and what we want. Tune in as Paul, Tristan Harris, and Aza Raskin explore the complexity of transformative experiences, and how to approach their ethical design.

21 Apr 2025Rethinking School in the Age of AI00:42:35

AI has upended schooling as we know it. Students now have instant access to tools that can write their essays, summarize entire books, and solve complex math problems. Whether they want to or not, many feel pressured to use these tools just to keep up. Teachers, meanwhile, are left questioning how to evaluate student performance and whether the whole idea of assignments and grading still makes sense. The old model of education suddenly feels broken.

So what comes next?

In this episode, Daniel and Tristan sit down with cognitive neuroscientist Maryanne Wolf and global education expert Rebecca Winthrop—two lifelong educators who have spent decades thinking about how children learn and how technology reshapes the classroom. Together, they explore how AI is shaking the very purpose of school to its core, why the promise of previous classroom tech failed to deliver, and how we might seize this moment to design a more human-centered, curiosity-driven future for learning.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X: @HumaneTech_

Guests

Rebecca Winthrop is director of the Center for Universal Education at the Brookings Institution and chair of the Brookings Global Task Force on AI and Education. Her new book is The Disengaged Teen: Helping Kids Learn Better, Feel Better, and Live Better, co-written with Jenny Anderson.

Maryanne Wolf is a cognitive neuroscientist and expert on the reading brain. Her books include Proust and the Squid: The Story and Science of the Reading Brain and Reader, Come Home: The Reading Brain in a Digital World.

RECOMMENDED MEDIA 
The Disengaged Teen: Helping Kids Learn Better, Feel Better, and Live Better by Rebecca Winthrop and Jenny Anderson

Proust and the Squid, Reader, Come Home, and other books by Maryanne Wolf

The OECD research which found little benefit to desktop computers in the classroom

Further reading on the Singapore study on digital exposure and attention cited by Maryanne 

The Burnout Society by Byung-Chul Han 

Further reading on the VR Bio 101 class at Arizona State University cited by Rebecca 

Leapfrogging Inequality by Rebecca Winthrop

The Nation’s Report Card from NAEP 

Further reading on the Nigeria AI Tutor Study 

Further reading on the JAMA paper showing a link between digital exposure and lower language development cited by Maryanne 

Further reading on Linda Stone’s thesis of continuous partial attention.

RECOMMENDED YUA EPISODES
‘We Have to Get It Right’: Gary Marcus On Untamed AI

AI Is Moving Fast. We Need Laws that Will Too.

Jonathan Haidt On How to Solve the Teen Mental Health Crisis

26 Oct 2023No One is Immune to AI Harms with Dr. Joy Buolamwini00:47:46

In this interview, Dr. Joy Buolamwini argues that algorithmic bias in AI systems poses risks to marginalized people. She challenges the assumptions of tech leaders who advocate for AI “alignment” and explains why some tech companies are hypocritical when it comes to addressing bias. 

Dr. Joy Buolamwini is the founder of the Algorithmic Justice League and the author of “Unmasking AI: My Mission to Protect What Is Human in a World of Machines.”

Correction: Aza says that Sam Altman, the CEO of OpenAI, predicts superintelligence in four years. Altman predicts superintelligence in ten years.

RECOMMENDED MEDIA

Unmasking AI by Joy Buolamwini

“The conscience of the AI revolution” explains how we’ve arrived at an era of AI harms and oppression, and what we can do to avoid its pitfalls

Coded Bias

Shalini Kantayya’s film explores the fallout of Dr. Joy’s discovery that facial recognition does not see dark-skinned faces accurately, and her journey to push for the first-ever legislation in the U.S. to govern against bias in the algorithms that impact us all

How I’m fighting bias in algorithms

Dr. Joy’s 2016 TED Talk about her mission to fight bias in machine learning, a phenomenon she calls the "coded gaze."

RECOMMENDED YUA EPISODES

Mustafa Suleyman Says We Need to Contain AI. How Do We Do It?

Protecting Our Freedom of Thought with Nita Farahany

The AI Dilemma

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

06 Oct 2020Your Nation's Attention for the Price of a Used Car — with Zahed Amanullah00:43:17

Today’s extremists don’t need highly produced videos like ISIS. They don’t need deep pockets like Russia. With the right message, a fringe organization can reach the majority of a nation’s Facebook users for the price of a used car. Our guest, Zahed Amanullah, knows this firsthand. He’s a counter-terrorism expert at the Institute for Strategic Dialogue, and when his organization received $10,000 in ad credits from Facebook for an anti-extremism campaign, they were able to reach about two-thirds of Kenya’s Facebook users. It was a surprising win for Zahed, but it means nefarious groups all over the African continent have exactly the same broadcasting power. Last year, Facebook took down 66 accounts, 83 pages, 11 groups and 12 Instagram accounts related to Russian campaigns in African countries, and Russian networks spent more than $77,000 on Facebook ads in Africa. Today on the show, Zahed will explain how the very tools that extremists use to broadcast messages of hate can also be used to stop them in their tracks, and he’ll tell us what tech and government must do to systematically counter the problem. “If we don’t get in front of this,” he says, “this phenomenon is going to amplify beyond our reach.”

09 Sep 2020Spotlight: The Social Dilemma00:04:26

A new documentary called The Social Dilemma comes out on Netflix today, September 9, 2020. We hope that this film, full of interviews with tech insiders, will be a catalyst and tool for exposing how technology has been distorting our perception of the world, and will help us reach the shared ground we need to solve big problems together.

08 Apr 2021Spotlight — Coded Bias00:23:56

The film Coded Bias follows MIT Media Lab researcher Joy Buolamwini through her investigation of algorithmic discrimination, after she accidentally discovers that facial recognition technologies do not detect darker-skinned faces. Joy is joined on screen by experts in the field, researchers, activists, and involuntary victims of algorithmic injustice. Coded Bias was released on Netflix April 5, 2021, premiered at the Sundance Film Festival last year, and has been called “‘An Inconvenient Truth’ for Big Tech algorithms” by Fast Company magazine. We talk to director Shalini Kantayya about the impetus for the film and how to tackle the threats these challenges pose to civil rights while working towards more humane technology for all.

21 Dec 2023How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller00:47:15

2024 will be the biggest election year in world history. Forty countries will hold national elections, with over two billion voters heading to the polls. In this episode of Your Undivided Attention, two experts give us a situation report on how AI will increase the risks to our elections and our democracies. 

Correction: Tristan says two billion people from 70 countries will be undergoing democratic elections in 2024. Forty countries will hold national elections; the number expands to 70 when non-national elections are factored in.

RECOMMENDED MEDIA 

White House AI Executive Order Takes On Complexity of Content Integrity Issues
Renee DiResta’s piece in Tech Policy Press about content integrity within President Biden’s AI executive order

The Stanford Internet Observatory
A cross-disciplinary program of research, teaching and policy engagement for the study of abuse in current information technologies, with a focus on social media

Demos

Britain’s leading cross-party think tank

Invisible Rulers: The People Who Turn Lies into Reality by Renee DiResta

Pre-order Renee’s upcoming book that’s landing on shelves June 11, 2024

RECOMMENDED YUA EPISODES

The Spin Doctors Are In with Renee DiResta

From Russia with Likes Part 1 with Renee DiResta

From Russia with Likes Part 2 with Renee DiResta

Esther Perel on Artificial Intimacy

The AI Dilemma

A Conversation with Facebook Whistleblower Frances Haugen

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

 

11 Nov 2021Behind the Curtain on The Social Dilemma — with Jeff Orlowski-Yang and Larissa Rhodes00:43:40

How do you make a film that impacts more than 100 million people in 190 countries in 30 languages?

This week on Your Undivided Attention, we're going behind the curtain on The Social Dilemma — the Netflix documentary about the dark consequences of the social media business model, which featured the Center for Humane Technology. On the heels of the film's one-year anniversary and its two Emmy Award wins, we're talking with Exposure Labs' Director Jeff Orlowski-Yang and Producer Larissa Rhodes. What moved Jeff and Larissa to shift their focus from climate change to social media? How did the film transform countless lives, including ours and possibly yours? What might we do differently if we were producing the film today?

Join us as we explore the reverberations of The Social Dilemma, which we're still feeling more than a year later.

02 Mar 2023TikTok’s Transparency Problem00:37:13

A few months ago on Your Undivided Attention, we released a Spotlight episode on TikTok's national security risks. Since then, we've learned more about the dangers of the Chinese-owned company: evidence of TikTok spying on US journalists, and hidden state media accounts used to influence US elections. Congress has banned TikTok on most government-issued devices, more than half of US states have done the same, and dozens of US universities have blocked TikTok access on their wifi networks. More people in Western governments and media who once dismissed TikTok as an overblown threat are reconsidering as evidence of national security risks plays out, and there's even talk of banning TikTok itself in certain countries. But is that the best solution? If we opt for a ban, how do we, as open societies, fight accusations of authoritarianism?

On this episode of Your Undivided Attention, we're going to do a deep dive into these questions with Marc Faddoul. He's the co-director of Tracking Exposed, a nonprofit investigating the influence of social media algorithms in our lives. His work has shown how TikTok tweaks its algorithm to maximize partisan engagement in specific national elections, and how it bans international news in countries like Russia that are fighting propaganda battles inside their own borders. In other words, we don't all get the same TikTok, because different geopolitical interests guide which TikTok you see. That is a kind of soft power that TikTok wields on a global scale, and it doesn't get talked about often enough.

We hope this episode leaves you with a lot to think about in terms of what the risks of TikTok are, how it's operating geopolitically, and what we can do about it.

RECOMMENDED MEDIA

Tracking Exposed Special Report: TikTok Content Restriction in Russia
How has the Russian invasion of Ukraine affected the content that TikTok users see in Russia? [Part 1 of Tracking Exposed series]

Tracking Exposed Special Report: Content Restrictions on TikTok in Russia Following the Ukrainian War
How are TikTok’s policy decisions affecting pro-war and anti-war content in Russia? [Part 2 of Tracking Exposed series]

Tracking Exposed Special Report: French Elections 2022
The visibility of French candidates on TikTok and YouTube search engines

The Democratic Surround by Fred Turner
A dazzling cultural history that demonstrates how American intellectuals, artists, and designers from the 1930s-1960s imagined new kinds of collective events that were intended to promote a powerful experience of American democracy in action


RECOMMENDED YUA EPISODES

When Media Was for You and Me with Fred Turner

Addressing the TikTok Threat

A Fresh Take on Tech in China with Rui Ma and Duncan Clark

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

12 Aug 2024This Moment in AI: How We Got Here and Where We’re Going00:36:55

It’s been a year and a half since Tristan and Aza laid out their vision and concerns for the future of artificial intelligence in The AI Dilemma. In this Spotlight episode, the guys discuss what’s happened since then, as funding, research, and public interest in AI have exploded, and where we could be headed next. Plus, some major updates on social media reform, including the passage of the Kids Online Safety and Privacy Act in the Senate.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

 

RECOMMENDED MEDIA

The AI Dilemma: Tristan and Aza’s talk on the catastrophic risks posed by AI.

Info Sheet on KOSPA: More information on KOSPA from FairPlay.

Situational Awareness by Leopold Aschenbrenner: A widely cited blog from a former OpenAI employee, predicting the rapid arrival of AGI.

AI for Good: More information on the AI for Good summit that was held earlier this year in Geneva. 

Using AlphaFold in the Fight Against Plastic Pollution: More information on Google’s use of AlphaFold to create an enzyme to break down plastics. 

Swiss Call For Trust and Transparency in AI: More information on the initiatives mentioned by Katharina Frey.

 

RECOMMENDED YUA EPISODES

War is a Laboratory for AI with Paul Scharre

Jonathan Haidt On How to Solve the Teen Mental Health Crisis

Can We Govern AI? with Marietje Schaake 

The Three Rules of Humane Tech

The AI Dilemma

 

Clarification: Swiss diplomat Nina Frey’s full name is Katharina Frey.

 

The views expressed by guests appearing on Center for Humane Technology’s podcast, Your Undivided Attention, are their own, and do not necessarily reflect the views of CHT. CHT does not support or oppose any candidate or party for election to public office.

30 Jun 2022An Alternative to Silicon Valley Unicorns00:51:26

Why isn't Twitter doing more to get bots off their platform? Why isn’t Uber taking better care of its drivers? 

What if...they can't?

Venture-capital backed companies like Twitter and Uber are held accountable to maximizing returns to investors. If and when they become public companies, they become accountable to maximizing returns to shareholders. They’ve promised Wall Street outsized returns — which means Twitter can't afford to lose bots if doing so would significantly lower its user count and, in turn, its advertising revenue, and Uber can’t treat its drivers like employees if doing so cuts into profits.

But what's the alternative? What might it look like to design an ownership and governance model that incentivizes a technology company to serve all of its stakeholders over the long term – and primarily, the stakeholders who create value?

This week on Your Undivided Attention, we're talking with two experts on creating the conditions for humane business, and in turn, for humane technology: Mara Zepeda and Kate “Sassy” Sassoon of Zebras Unite Co-op. Zebras Unite is a member-owned co-operative that’s creating the capital, culture, and community to power a more just and inclusive economy. The Zebras Unite Co-op serves a community of over 6,000 members in about 30 chapters across six continents. Mara is their Managing Director, and Kate is their Director of Cooperative Membership.

Two corrections:

  • The episode says that the failure rate of startups is 99%. The actual rate is closer to 90%.
  • The episode says that in 2017, Twitter reported 350 million users on its platform. The actual number reported was 319 million users.

RECOMMENDED MEDIA 

Zebras Fix What Unicorns Break

A seminal 2017 article by Zebras Unite co-founders, which kicked off the movement and distinguished between zebras and unicorns.

Meetup to the People 

Zebras Unite’s 2019 thought experiment of exiting Meetup to community

Zebras Unite Crowdcast Channel

Where you can find upcoming online events, as well as recordings of previous events.

RECOMMENDED YUA EPISODES 

A Renegade Solution to Extractive Economics with Kate Raworth: https://www.humanetech.com/podcast/29-a-renegade-solution-to-extractive-economics

Bonus — A Bigger Picture on Elon & Twitter: https://www.humanetech.com/podcast/bigger-picture-elon-twitter  

Here’s Our Plan And We Don’t Know with Tristan Harris, Aza Raskin, and Stephanie Lepp: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

06 Oct 2021Spotlight — A Whirlwind Week of Whistleblowing00:04:56

In seven years of working on the problems of runaway technology, we’ve never experienced a week like this! In this bonus episode of Your Undivided Attention, we recap this whirlwind of a week — from Facebook whistleblower Frances Haugen going public on 60 Minutes on Sunday, to the massive outage of Facebook, Instagram, and WhatsApp on Monday, to Haugen’s riveting Congressional testimony on Tuesday. We also make some exciting announcements — including our planned episode with Haugen up next, the Yale social media reform panel we’re participating in on Thursday, and a campaign we’re launching to pressure Facebook to make one immediate change.

This week it truly feels like we’re making history — and you’re a part of it.

03 Aug 2023Protecting Our Freedom of Thought with Nita Farahany00:44:07

We are on the cusp of an explosion of cheap, consumer-ready neurotechnology - from earbuds that gather our behavioral data, to sensors that can read our dreams. And it’s all going to be supercharged by AI. This technology is moving from niche to mainstream, and it has the potential to grow exponentially.

Legal scholar Nita Farahany talks us through the current state of neurotechnology and its deep links to AI. She says that we urgently need to protect the last frontier of privacy: our internal thoughts. And she argues that without a new legal framework around “cognitive liberty,” we won’t be able to insulate our brains from corporate and government intrusion.

RECOMMENDED MEDIA 

The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology by Nita Farahany

The Battle for Your Brain offers a path forward to navigate the complex dilemmas that will fundamentally impact our freedom to understand, shape, and define ourselves

Computer Program Reveals What Neurons in the Visual Cortex Prefer to Look At

A study of macaque monkeys at Harvard generated valuable clues based on an artificial intelligence system that can reliably determine what neurons in the brain’s visual cortex prefer to see

Understanding Media: The Extensions of Man by Marshall McLuhan

An influential work by a fixture in media discourse

RECOMMENDED YUA EPISODES 

The Three Rules of Humane Tech

Talking With Animals… Using AI

How to Free Our Minds with Cult Deprogramming Expert Dr. Steven Hassan

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

19 Dec 2019The Cure for Hate — with Tony McAleer00:41:17

“You can binge watch an ideology in a weekend,” says Tony McAleer. He should know. A former white supremacist, McAleer was introduced to neo-Nazi ideology through the U.K. punk scene in the 1980s. But after his daughter was born, he embarked on a decades-long journey from hate to compassion. Today’s technology, he says, makes violent ideologies infinitely more accessible and appealing to those who long for acceptance. Social media isolates us and can incubate hate in a highly diffuse structure, making it nearly impossible to stop race-based violence without fanning the flames or driving it further underground. McAleer discusses solutions to this dilemma and the positive actions we can take together.

23 Dec 2020Won't You Be My Neighbor? A Civic Vision for the Internet — with Eli Pariser00:48:25

You’ve heard us talk before on this podcast about the pitfalls of trying to moderate a “global public square.” Our guest today, Eli Pariser, co-director of Civic Signals, co-founder of Avaaz, and author of "The Filter Bubble," has been thinking for years about how to create more functional online spaces and is bringing people together to solve that problem. He believes the answer lies in creating spaces and groups intentionally, with the same kinds of skilled support and infrastructure that we would enlist in the physical world. It’s not enough to expect the big revenue-oriented tech companies to transform their tools into something less harmful; Eli is encouraging us to proactively gather in our own spaces, optimized for togetherness and cooperation.

08 Jul 2020The World According to Q — with Travis View00:59:13

What would inspire someone to singlehandedly initiate an armed standoff on the Hoover Dam, or lead the police on a 100-mile-an-hour car chase while calling for help from an anonymous internet source, or travel hundreds of miles alone to shoot up a pizza parlor? The people who did these things were all connected to the decentralized cult-like internet conspiracy theory group called QAnon. Our guest this episode, Travis View, is a researcher, writer and podcast host who has spent the last few years trying to understand the people who’ve become wrapped up in QAnon and the concerning consequences as Q followers increasingly leave their screens and take extreme actions in the real world. As many as six candidates who support QAnon are running for Congress and will be on the ballot for the 2020 elections, threatening to upend long-held Republican establishment seats. This just happened to a five-term Republican congressman in Colorado. Travis warns that QAnon is an extremism problem, not a disinformation or political problem, and dismissing QAnon as a fringe threat underestimates how quickly their views can leapfrog into mainstream debates on the left and the right.

22 Oct 2019The Opposite of Addiction — with Johann Hari00:48:58

What causes addiction? Johann Hari, author of Chasing the Scream, travelled some 30,000 miles in search of an answer. He met with researchers and lawmakers, drug dealers and drug makers, those who were struggling with substance abuse and those who had recovered from it, and he came to the conclusion that our whole narrative about addiction is broken.  "The opposite of addiction is not sobriety," he argues. "The opposite of addiction is connection." But first, we have to figure out what it really means to connect.

07 Apr 2022Spotlight — What Is Humane Technology?00:12:50

“The fundamental problem of humanity is that we have paleolithic emotions, medieval institutions, and God-like technology.” — E. O. Wilson.

More than ever, we need the wisdom to match the power of our God-like technology. Yet, technology is both eroding our ability to make sense of the world, and increasing the complexity of the issues we face. The gap between our sense-making ability and issue complexity is what we call the “wisdom gap.”

How do we develop the wisdom we need to responsibly steward our God-like technology?

This week on Your Undivided Attention, we're introducing one way Center for Humane Technology is attempting to close the wisdom gap — through our new online course, Foundations of Humane Technology. In this bonus episode, Tristan Harris describes the wisdom gap we're attempting to close, and our Co-Founder and Executive Director Randima Fernando talks about the course itself.

Sign up for the free course: https://www.humanetech.com/course

RECOMMENDED YUA EPISODES

A Problem Well-Stated Is Half-Solved with Daniel Schmachtenberger: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved

A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen

Here’s Our Plan And We Don’t Know with Tristan Harris, Aza Raskin, and Stephanie Lepp: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

10 Jun 2019What Happened in Vegas — with Natasha Dow Schüll00:40:51

Natasha Dow Schüll, author of Addiction by Design, has spent years studying how slot machines hold gamblers spellbound, in an endless loop of play. She never imagined the addictive designs which she had first witnessed in Las Vegas would go bounding into Silicon Valley and reappear on virtually every smartphone screen worldwide. In the first segment of this two-part interview, Natasha Dow Schüll offers a prescient warning to users and designers alike: How far can the attention economy go toward stealing another moment of your time? Farther than you might imagine. 

06 Jul 2023Big Food, Big Tech and Big AI with Michael Moss00:34:43

In the next two episodes of Your Undivided Attention, we take a close look at two industries, Big Food and social media, whose dangerous “races to the bottom” have big parallels with AI.

And we are asking: what can our past mistakes and missed opportunities teach us about how we should approach AI harms? 

In this first episode, Tristan talks to Pulitzer Prize-winning journalist and author Michael Moss. His book Salt Sugar Fat: How the Food Giants Hooked Us rocked the processed food industry when it came out in 2013.

Tristan and Michael discuss how we can leverage the lessons learned from Big Food’s coordination failures, and whether it’s the responsibility of the consumer, the government, or the companies to regulate.
 

RECOMMENDED MEDIA 

Salt Sugar Fat: How the Food Giants Hooked Us

Michael’s New York Times bestseller. You’ll never look at a nutrition label the same way again

Hooked: Food, Free Will, and How the Food Giants Exploit Our Addictions

Michael’s 2021 follow-up exposé of how the processed food industry exploits our evolutionary instincts, the emotions we associate with food, and legal loopholes in its pursuit of profit over public health

Control Your Tech Use

Center for Humane Technology’s recently updated Take Control Toolkit

RECOMMENDED YUA EPISODES

AI Myths and Misconceptions

The AI Dilemma

How Might a Long-Term Stock Market Transform Tech? (ZigZag episode)

 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

18 Oct 2021A Conversation with Facebook Whistleblower Frances Haugen00:55:24

We are now in social media's Big Tobacco moment. And that’s largely thanks to the courage of one woman: Frances Haugen.

Frances is a specialist in algorithmic product management. She worked at Google, Pinterest, and Yelp before joining Facebook — first as a Product Manager on Civic Misinformation, and then on the Counter-Espionage team. But what she saw at Facebook was that the company consistently and knowingly prioritized profits over public safety. So Frances made the courageous decision to blow the whistle — which resulted in the biggest disclosure in the history of Facebook, and in the history of social media.

In this special interview, co-hosts Tristan and Aza go behind the headlines with Frances herself. We go deeper into the problems she exposed, discuss potential solutions, and explore her motivations — along with why she fundamentally believes change is possible. We also announce an exciting campaign being launched by the Center for Humane Technology — to use this window of opportunity to make Facebook safer.

10 Dec 2021A Fresh Take on Tech in China — with Rui Ma and Duncan Clark00:48:37

Who do you think the Chinese government considers its biggest rival? The United States, right? Actually, the Chinese government considers its biggest rival to be its own technology companies.

It's China's tech companies who threaten its capacity to build a competitive China. That's why the Chinese government is cracking down on social media — for example, by limiting the number of hours youth can play video games, and banning cell phone use in schools. China's restrictions on social media use may be autocratic, but they may also protect users more than anything we've seen from the US government.

It’s a complicated picture.

This week on Your Undivided Attention, we're having a surprising conversation about technology in China. Here to give us a fresh take are two guests: investor, analyst, and co-host of the Tech Buzz China podcast Rui Ma, and China internet expert and author of Alibaba: The House That Jack Ma Built, Duncan Clark.

16 Apr 2019Launching June 10: Your Undivided Attention00:03:16

Technology has shredded our attention. We can do better.

08 Sep 2022Spotlight — Addressing the TikTok Threat00:23:57

Imagine it's the Cold War. Imagine that the Soviet Union puts itself in a position to influence the television programming of the entire Western world — more than a billion viewers. 

While this might sound like science fiction, it’s representative of the world we're living in, with TikTok being influenced by the Chinese Communist Party.

TikTok, the flagship app of the Chinese company ByteDance, surpassed Google and Facebook as the most popular site on the internet in 2021, and is expected to reach more than 1.8 billion users by the end of 2022. The Chinese government doesn't control TikTok, but it has influence over the company. What are the implications of this influence, given that China is the main geopolitical rival of the United States?

This week on Your Undivided Attention, we bring you a bonus episode about TikTok. Co-hosts Tristan Harris and Aza Raskin explore the nature of the TikTok threat, and how we might address it.

RECOMMENDED MEDIA 

Pew Research Center's "Teens, Social Media and Technology 2022"

https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/

Pew's recent study on how TikTok has established itself as one of the top online platforms for U.S. teens

Axios' "Washington turns up the heat on TikTok"

https://www.axios.com/2022/07/07/congress-tiktok-china-privacy-data?utm_source=substack&utm_medium=email

Article on recent Congressional responses to the threat of TikTok

Felix Krause on TikTok's keystroke tracking

https://twitter.com/KrauseFx/status/1560372509639311366

A revelation that TikTok has code to observe keypad input and all taps

RECOMMENDED YUA EPISODES

A Fresh Take on Tech in China with Rui Ma and Duncan Clark

https://www.humanetech.com/podcast/44-a-fresh-take-on-tech-in-china

A Conversation with Facebook Whistleblower Frances Haugen

https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen

From Russia with Likes (Part 1). Guest: Renée DiResta

https://www.humanetech.com/podcast/5-from-russia-with-likes-part-1

From Russia with Likes (Part 2). Guest: Renée DiResta

https://www.humanetech.com/podcast/6-from-russia-with-likes-part-2
 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

20 Jun 2024Why Are Migrants Becoming AI Test Subjects? With Petra Molnar00:46:19

Climate change, political instability, hunger. These are just some of the forces behind an unprecedented refugee crisis that’s expected to include over a billion people by 2050. In response to this growing crisis, wealthy governments like the US and the EU are employing novel AI and surveillance technologies to slow the influx of migrants at their borders. But will this rollout stop at the border?

In this episode, Tristan and Aza sit down with Petra Molnar to discuss how borders have become a proving ground for the sharpest edges of technology, and especially AI. Petra is an immigration lawyer and co-creator of the Migration and Technology Monitor. Her new book is “The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence.”

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

RECOMMENDED MEDIA

The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence

Petra’s newly published book on the rollout of high risk tech at the border.

Bots at the Gate

A report co-authored by Petra about Canada’s use of AI technology in their immigration process.

Technological Testing Grounds

A report authored by Petra about the use of experimental technology in EU border enforcement.

Startup Pitched Tasing Migrants from Drones, Video Reveals

An article from The Intercept, containing the demo for Brinc’s taser drone pilot program.

The UNHCR

Information about the global refugee crisis from the UN.

RECOMMENDED YUA EPISODES

War is a Laboratory for AI with Paul Scharre

No One is Immune to AI Harms with Dr. Joy Buolamwini

Can We Govern AI? With Marietje Schaake

CLARIFICATION:

The iBorderCtrl project referenced in this episode was a pilot project that was discontinued in 2019.

11 May 2023Spotlight: AI Myths and Misconceptions00:26:48

A few episodes back, we presented Tristan Harris and Aza Raskin’s talk The AI Dilemma. People inside the companies that are building generative artificial intelligence came to us with their concerns about the rapid pace of deployment and the problems that are emerging as a result. We felt called to lay out the catastrophic risks that AI poses to society and sound the alarm on the need to upgrade our institutions for a post-AI world.

The talk resonated - over 1.6 million people have viewed it on YouTube as of this episode’s release date. The positive reception gives us hope that leaders will be willing to come to the table for a difficult but necessary conversation about AI.

However, now that so many people have watched or listened to the talk, we’ve found that there are some AI myths getting in the way of making progress. On this episode of Your Undivided Attention, we debunk five of those misconceptions. 

RECOMMENDED MEDIA 

Opinion | Yuval Harari, Tristan Harris, and Aza Raskin on Threats to Humanity Posed by AI - The New York Times

In this New York Times piece, Yuval Harari, Tristan Harris, and Aza Raskin call upon world leaders to respond to this moment at the level of challenge it presents.

Misalignment, AI & Moloch

A deep dive into the game theory and exponential growth underlying our modern economic system, and how recent advancements in AI are poised to turn up the pressure on that system, and its wider environment, in ways we have never seen before

RECOMMENDED YUA EPISODES

The AI Dilemma

The Three Rules of Humane Tech

Can We Govern AI? with Marietje Schaake

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

 

29 Mar 2024Chips Are the Future of AI. They’re Also Incredibly Vulnerable. With Chris Miller00:45:16

Beneath the race to train and release more powerful AI models lies another race: a race by companies and nation-states to secure the hardware to make sure they win AI supremacy. 

Correction: The latest available Nvidia chip is the Hopper H100 GPU, which has 80 billion transistors. Since the first commercially available chip had four transistors, the Hopper actually has 20 billion times that number. Nvidia recently announced the Blackwell, which boasts 208 billion transistors - but it won’t ship until later this year.

RECOMMENDED MEDIA 

Chip War: The Fight For the World’s Most Critical Technology by Chris Miller

To make sense of the current state of politics, economics, and technology, we must first understand the vital role played by chips

Gordon Moore Biography & Facts

Gordon Moore, the Intel co-founder behind Moore's Law, passed away in March of 2023

AI’s most popular chipmaker Nvidia is trying to use AI to design chips faster

Nvidia's GPUs are in high demand - and the company is using AI to accelerate chip production

RECOMMENDED YUA EPISODES

Future-proofing Democracy In the Age of AI with Audrey Tang

How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller

The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao

Protecting Our Freedom of Thought with Nita Farahany

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

 

 

02 Sep 2020Facebook Goes '2Africa' — with Julie Owono00:35:43

This summer, Facebook unveiled “2Africa,” a subsea cable project that will encircle nearly the entire continent of Africa — much to the surprise of Julie Owono. As Executive Director of Internet Without Borders, she’s seen how quickly projects like this can become enmeshed in local politics, as private companies dig through territorial waters, negotiate with local officials and gradually assume responsibility over vital pieces of national infrastructure. “It’s critical, now, that communities have a seat at the table,” Julie says. We ask her about the risks of tech companies leading us into an age of “digital colonialism,” and what she hopes to achieve as a newly appointed member of Facebook’s Oversight Board.

06 Aug 2020When Media Was for You and Me — with Fred Turner00:37:07

In 1940, a group of 60 American intellectuals formed the Committee for National Morale. “They’ve largely been forgotten,” says Fred Turner, a professor of communications at Stanford University, but their work had a profound impact on public opinion. They produced groundbreaking films and art exhibitions. They urged viewers to stop, reflect and think for themselves, and in so doing, they developed a set of design principles that reimagined how media could make us feel more calm, reflective, empathetic; in short, more democratic.
