Computer Says Maybe – Details, episodes, and analysis
Podcast details
Technical and general information from the podcast's RSS feed.

Computer Says Maybe
Alix Dunn
Frequency: 1 episode every 13 days. Total episodes: 40

Recent rankings
Latest positions in the Apple Podcasts and Spotify charts.
Apple Podcasts
🇫🇷 France - Technology: #73 (28/02/2025)
🇨🇦 Canada - Technology: #100 (04/02/2025)
🇨🇦 Canada - Technology: #94 (14/10/2024)
Spotify
No recent rankings available
Links shared across episodes and podcasts
Links found in episode descriptions, and other podcasts that also use them.
- https://thesignalsnetwork.org/ (13 shares)
- https://knightfoundation.org/ (7 shares)
RSS feed quality and score
Technical evaluation of the quality and structure of the RSS feed.
Overall score: 73%
Publication history
Monthly breakdown of episode publications over the years.
Net 0++: Microsoft’s greenwashing w/ Holly Alpine
Episode 21
Friday, October 11, 2024 • Duration 42:07
This week we’re kicking off a series about AI & the environment. We’re starting with Holly Alpine, who recently left Microsoft after more than a decade spent starting and growing an internal sustainability programme.
Holly’s goal was pretty simple: she wanted Microsoft to honour the sustainability commitments that they had set for themselves. The internal support she had fostered for sustainability initiatives did not match up with Microsoft’s actions — they continued to work with fossil fuel companies even though doing so was at odds with their plans to achieve net 0.
Listen to learn about what it’s like approaching this kind of huge systemic challenge with good faith, and trying to make change happen from the inside.
Holly Alpine is a dedicated leader in sustainability and environmental advocacy, having spent over a decade at Microsoft pioneering and leading multiple global initiatives. As the founder and head of Microsoft's Community Environmental Sustainability program, Holly directed substantial investments into community-based, nature-driven solutions, impacting over 45 communities across Microsoft’s global datacenter footprint, with measurable improvements to ecosystem health, social equity, and human well-being.
Currently, Holly continues her environmental leadership as a Board member of both American Forests and Zero Waste Washington, while staying active in outdoor sports as a plant-based athlete who enjoys rock climbing, mountain biking, ski mountaineering, and running mountain ultramarathons.
Chasing Away Sidewalk Labs w/ Bianca Wylie
Episode 20
Friday, October 4, 2024 • Duration 49:39
In 2017 Google’s urban planning arm Sidewalk Labs came into Toronto and said “we’re going to turn this into a smart city”.
Our guest Bianca Wylie was one of the people who stood up and said “okay but… who asked for this?”
This is a story about how a large tech firm came into a community with big promises, and then left with its tail between its legs. In the episode Alix and Bianca discuss the complexities of government procurement of tech, and how attractive corporate solutions look to governments riddled with austerity.
Bianca Wylie is a writer with a dual background in technology and public engagement. She is a partner at Digital Public and a co-founder of Tech Reset Canada. She worked for several years in the tech sector in operations, infrastructure, corporate training, and product management. Then, as a professional facilitator, she spent several years co-designing, delivering and supporting public consultation processes for various governments and government agencies. She founded the Open Data Institute Toronto in 2014 and co-founded Civic Tech Toronto in 2015.
Further Reading:
A Counterpublic Analysis of Sidewalk Toronto
In Toronto, Google’s Attempt to Privatize Government Fails—For Now
Exhibit X: The Whistleblower
Episode 10
Friday, August 2, 2024 • Duration 31:40
In part 2 of Exhibit X, Alix interviewed Frances Haugen, who in 2021 blew the whistle on Meta: the company was sitting on the knowledge that its products were harmful to kids, and yet — shocker — it continued to make design decisions that would keep kids engaged.
Mark Zuckerberg worked hard on his image (it’s a hydrofoil, not a surfboard!), while Instagram was being used for human trafficking — the lack of care and accountability here absolutely melts the mind.
What conversations did Frances’s whistleblowing start?
Was whistleblowing an effective mechanism for accountability in this case?
Do we have to add age verification to social media sites or break end-to-end encryption to keep children safe online?
*Frances Haugen is a data scientist and engineer. In 2021 she disclosed 22,000 internal documents to The Wall Street Journal and the Securities and Exchange Commission, demonstrating Meta’s knowledge of its products’ harms.*
Your hosts this week are Alix Dunn and Prathm Juneja
Exhibit X: Tech and Tobacco
Episode 9
Friday, July 26, 2024 • Duration 27:04
Here is something you’re probably tired of hearing: Big Tech is responsible for a bottomless brunch of societal harms, and it is not being held accountable. Right now we hear constantly about laws, regulation, and courts, but none of it has proven effective in litigation against Big Tech.
In our latest podcast series Exhibit X, we’re looking at how the tides might finally be turning. Legal accountability could be around the corner, but only if a few things happen first.
To start, we look back to 1964, when Big Tobacco was winning the ‘try your best to profit from harm’ race. Research showed cigarettes were addictive and caused cancer — and yet the industry evaded accountability for decades.
In this episode we ask questions like:
- Why wasn’t a report in 1964 showing cigarettes are addictive and cause cancer enough to transform the industry?
- What can we learn about corporate capture of research on tobacco?
- How did academia and experts shape the outcomes of court cases?
Prathm Juneja was Alix’s co-host for this episode. He is a PhD candidate in Social Data Science at the Oxford Internet Institute, working at the intersection of academia, industry, and government on technology, innovation, and policy.
Further Reading
- C-SPAN: Tobacco Settlement
- The Cigarette Papers - Full Online Version
- The Truth Tobacco Industry Documents
- Big Tobacco and the Historians
- Tobacco Litigation Documents
- A Tobacco Whistle-Blower's Life Is Transformed
- Inventing Conflicts of Interest: A History of Tobacco Industry Tactics
- Tobacco Industry Research Committee
- Experts Debating Tobacco Addiction
New mini-series: Exhibit X
Episode 8
Thursday, July 18, 2024 • Duration 03:57
In the Exhibit X series Alix and Prathm sink their fingernails into the tangled universe of litigation and Big Tech; how have the courts held Big Tech firms accountable for their various harms over the years? Is whistleblowing an effective mechanism for informing new regulations? What about a social media platform’s first amendment rights? So much to cover, so many episodes coming your way!
What the FAccT? Evidence of bias. Now what?
Episode 7
Friday, July 12, 2024 • Duration 25:09
In part four of our FAccT deep dive, Alix joins Marta Ziosi and Dasha Pruss to discuss their paper “Evidence of What, for Whom? The Socially Contested Role of Algorithmic Bias in a Predictive Policing Tool”.
In their paper they discuss how an erosion of public trust can lead to ‘any idea will do’ decisions, and often these lean on technology, such as predictive policing systems. One such tool is ShotSpotter, a piece of audio surveillance tech designed to detect gunfire — a contentious system that has been sold both as a tool for police to surveil civilians and as a tool for civilians to keep tabs on police. Can it really be both?
Marta Ziosi is a Postdoctoral Researcher at the Oxford Martin AI Governance Initiative, where her research focuses on standards for frontier AI. She has worked for institutions such as DG CNECT at the European Commission, the Berkman Klein Centre for Internet & Society at Harvard University, The Montreal International Center of Expertise in Artificial Intelligence (CEIMIA) and The Future Society. Previously, Marta was a Ph.D. student and researcher on Algorithmic Bias and AI Policy at the Oxford Internet Institute. She is also the founder of AI for People, a non-profit organisation whose mission is to put technology at the service of people. Marta holds a BSc in Mathematics and Philosophy from University College Maastricht. She also holds an MSc in Philosophy and Public Policy and an executive degree in Chinese Language and Culture for Business from the London School of Economics.
Dasha Pruss is a 2023-2024 fellow at the Berkman Klein Center for Internet & Society and an Embedded EthiCS postdoctoral fellow at Harvard University. In fall 2024 she will be an assistant professor of philosophy and computer science at George Mason University. She received her PhD in History & Philosophy of Science from the University of Pittsburgh in May 2023, and holds a BSc in Computer Science from the University of Utah. She has also co-organized with Against Carceral Tech, an activist group working to ban facial recognition and predictive policing in the city of Pittsburgh.
This episode is hosted by Alix Dunn. Our guests are Marta Ziosi and Dasha Pruss.
What the FAccT? First law, bad law
Episode 6
Friday, July 5, 2024 • Duration 23:55
In this episode, we speak with Lara Groves and Jacob Metcalf at the seventh annual FAccT conference in Rio de Janeiro.
In part three of our FAccT deep dive, Alix joins Lara Groves and Jacob Metcalf to discuss their paper “Auditing Work: Exploring the New York City algorithmic bias audit regime”.
Lara Groves is a Senior Researcher at the Ada Lovelace Institute. Her most recent project explored the role of third-party auditing regimes in AI governance. Lara has previously led research on the role of public participation in commercial AI labs, and on algorithmic impact assessments. Her research interests include practical and participatory approaches to algorithmic accountability and innovative policy solutions to challenges of governance.
Before joining Ada, Lara worked as a tech and internet policy consultant, and has experience in research, public affairs and campaigns for think-tanks, political parties and advocacy groups. Lara has an MSc in Democracy from UCL.
Jacob Metcalf, PhD, is a researcher at Data & Society, where he leads the AI on the Ground Initiative, and works on an NSF-funded multisite project, Pervasive Data Ethics for Computational Research (PERVADE). For this project, he studies how data ethics practices are emerging in environments that have not previously grappled with research ethics, such as industry, IRBs, and civil society organizations. His recent work has focused on the new organizational roles that have developed around AI ethics in tech companies.
Jake’s consulting firm, Ethical Resolve, provides a range of ethics services, helping clients to make well-informed, consistent, actionable, and timely business decisions that reflect their values. He also serves as the Ethics Subgroup Chair for the IEEE P7000 Standard.
This episode is hosted by Alix Dunn. Our guests are Lara Groves and Jacob Metcalf.
Further Reading
- Lara Groves (Ada Lovelace Institute, UK), Jacob Metcalf (Data & Society Research Institute, USA), Alayna Kennedy (Independent researcher, USA), Briana Vecchione (Data & Society Research Institute, USA) and Andrew Strait (Ada Lovelace Institute, UK), “Auditing Work: Exploring the New York City algorithmic bias audit regime”
What the FAccT?: Abandoning Algorithms
Episode 5
Friday, June 28, 2024 • Duration 29:59
In this episode, we speak with Nari Johnson and Sanika Moharana at this year’s FAccT conference in Rio de Janeiro.
In part two of our FAccT deep dive, Alix joins Nari Johnson and Sanika Moharana to discuss their paper “The Fall of an Algorithm: Characterizing the Dynamics Toward Abandonment”.
Nari Johnson is a third-year PhD student in Carnegie Mellon University's Machine Learning Department, where she is advised by Hoda Heidari. She graduated from Harvard in 2021 with a BA and MS in Computer Science, where she previously worked with Finale Doshi-Velez.
Sanika Moharana is a second-year PhD student in Human-Computer Interaction at Carnegie Mellon University. As an advocate for human-centered design and research, Sanika practices iterative ideation and prototyping for multimodal interactions and interfaces across intelligent systems, connected smart devices, IoT, AI experiences, and emerging technologies.
What the FAccT?: Reformers and Radicals
Episode 4
Friday, June 21, 2024 • Duration 54:10
In part 1 of our FAccT conference deep dive, Alix Dunn sits down with co-host Andrew Strait from the Ada Lovelace Institute to talk about the history of FAccT and some of the papers being presented at this year’s event.
The Fairness, Accountability, and Transparency Conference, or FAccT, is an interdisciplinary conference dedicated to bringing together a diverse community of scholars and exploring how socio-technical systems can be built in ways compatible with a fair society. The seventh annual FAccT conference was held in Rio de Janeiro, Brazil, from Monday, June 3rd through Thursday, June 6th, 2024, with over five hundred people in attendance.
This episode is hosted by Alix Dunn; our co-host is Andrew Strait.
Further Reading:
- Robert Gorwa (WZB Berlin Social Science Center, Germany) and Michael Veale (University College London, UK), “Moderating Model Marketplaces: Platform Governance Puzzles for AI Intermediaries”
- Marta Ziosi (University of Oxford, UK) and Dasha Pruss (Harvard University, USA), “Evidence of What, for Whom? The Socially Contested Role of Algorithmic Bias in a Predictive Policing Tool”
- Michael Madaio (Google Research, USA), Shivani Kapania (Carnegie Mellon University, USA), Rida Qadri (Google Research, USA), Ding Wang (Google Research, USA), Andrew Zaldivar (Google Research, USA), Remi Denton (Google Research, USA) and Lauren Wilcox (eBay, USA), “Learning about Responsible AI On-The-Job: Learning Pathways, Orientations, and Aspirations”
- David Gray Widder (Digital Life Initiative, Cornell Tech, USA), “Epistemic Power in AI Ethics Labor: Legitimizing Located Complaints”
- Lara Groves (Ada Lovelace Institute, UK), Jacob Metcalf (Data & Society Research Institute, USA), Alayna Kennedy (Independent researcher, USA), Briana Vecchione (Data & Society Research Institute, USA) and Andrew Strait (Ada Lovelace Institute, UK), “Auditing Work: Exploring the New York City algorithmic bias audit regime”
- Nari Johnson (Carnegie Mellon University, USA), Sanika Moharana (Carnegie Mellon University, USA), Christina Harrington (Carnegie Mellon University, USA), Nazanin Andalibi (University of Michigan, USA), Hoda Heidari (Carnegie Mellon University, USA) and Motahhare Eslami (Carnegie Mellon University, USA), “The Fall of an Algorithm: Characterizing the Dynamics Toward Abandonment”
Protesting Project Nimbus: employee organising to end Google’s contract with Israel w/ Dr. Kate Sim
Episode 3
Thursday, May 23, 2024 • Duration 51:42
In this episode, we speak with Dr. Kate Sim, one of the core organisers of the Google Worker Sit-In Against Project Nimbus.
Dr. Kate Sim was recently fired from Google, alongside almost 50 other employees, after helping organize a sit-in protesting Project Nimbus, a joint contract between Google and Amazon to provide technology to the Israeli government and military. In this episode, Alix and Dr. Sim discuss technology-enabled violence, Dr. Sim's work in trust and safety, and Google's cancelled Project Maven. They also talk about Dr. Sim's journey into protesting Project Nimbus, the many other voices fighting against the contract, and how Big Tech often obfuscates its responsibility in perpetuating violence. In the end, we arrive at a common lesson: solidarity is our main hope for change.
This episode is hosted by Alix Dunn and our guest is Dr. Kate Sim.
Further Reading
- No Tech For Apartheid
- What is Project Nimbus, and why are Google workers protesting Israel deal?
- Israeli Weapons Firms Required to Buy Cloud Services From Google and Amazon
- The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World
- 3 Years After the Project Maven Uproar, Google Cozies to the Pentagon
- How Big Tech and Silicon Valley are Transforming the Military-Industrial Complex
- Google Fired Us for Protesting Its Complicity in the War on Gaza. But We Won’t Be Silenced.
- Computer Says Maybe Newsletter - Protesting Project Nimbus