When Alexa Says Sorry: What We Risk When AI Sounds Human

May 13, 2025 | 47:08

Episode Summary

In this episode of Terms of Service, host Mary Camacho speaks with Marisa Zalabak, an AI ethicist and psychologist who explores how our relationships with artificial intelligence impact emotional intelligence, learning, communication, and mental health. With a rich background in education, social justice, psychology, and theater arts, Marisa offers deep insights into the emotional and ethical implications of anthropomorphizing AI, the risks of synthetic empathy, and the importance of slowing down to ask better questions. Together, they unpack how emotional and cognitive habits are being shaped by our daily interactions with machines—and what it means for our shared future.

Key Takeaways

  • Anthropomorphizing AI—treating machines as if they are human—is natural but dangerous, especially when synthetic empathy (like chatbots saying “I’m sorry”) reinforces emotional trust in non-human systems.
  • Marisa emphasizes the importance of asking better questions about the tools we use, why we use them, and what long-term effects they may have.
  • Research shows people increasingly treat AI systems as coworkers or even confidants, which can affect trust, mental health, and social connection.
  • Systems like Alexa and humanoid AIs often reinforce gender bias, particularly when they default to women’s voices.
  • Encouraging digital literacy, slow learning, and psychological grounding helps individuals—and especially children—build healthy habits with technology.

Topics Covered / Timestamped Sections

  • 01:55 – Marisa’s unconventional journey from performing arts to educational psychology to AI ethics.
  • 05:48 – Discovering AI and contributing to one of the first IEEE standards on human well-being in AI design.
  • 08:27 – First deep AI encounter: conversing with the humanoid robot BINA48 and the psychology of human-machine interaction.
  • 13:22 – Synthetic empathy and the blurry boundaries of trust in conversational AI.
  • 18:10 – How politeness and pronouns affect human habits and communication patterns.
  • 21:45 – Designing meaningful research on emotional and psychological effects of AI.
  • 23:14 – Children and AI: the real impacts of early and normalized interaction with synthetic personalities.
  • 33:31 – Gendered AI voice assistants and their unintended social consequences.
  • 37:40 – Why education should be an invitation to inquiry, not a race toward certainty.
  • 42:05 – Breaking down complexity through “Aunt Dorothy” explanations and slow, focused inquiry.

Guest Bio and Links

Marisa Zalabak is an AI ethicist, psychologist, and thought leader specializing in responsible AI, education, sustainability, and human well-being. Her talks emphasize adaptive leadership, ethical innovation, and climate action through sustainable practices. A two-time TEDx and international keynote speaker, Marisa has contributed to global forums such as Stratcom, the UN Summit of the Future, and AI House in Davos during the World Economic Forum. As Co-Founder of GADES (Global Alliance for Digital Education and Sustainability), Resident Fellow with The Digital Economist Center of Excellence, and faculty member at the Trocadéro Forum Institute, Marisa champions education that aligns responsible technology with regenerative design for human and planetary flourishing. Chairing IEEE's AI Ethics Education and Planet Positive 2030 initiatives, Marisa has co-authored ethical AI standards for human well-being with AI technologies. Collaborating across sectors with organizations like Microsoft, SAP, and Stanford University, Marisa addresses emerging issues in AI for a sustainable future.

Resources Mentioned

  • BINA48 – One of the first advanced humanoids trained for human interaction and space exploration.
  • Synthetic Emotion in AI – IEEE working group focused on standards for AI that emulates human emotions.
  • Digital Assistants & Bias – Ongoing research into how voice assistants perpetuate societal norms and stereotypes.

Call to Action

How are your emotional habits being shaped by the tools you use every day? Marisa Zalabak invites us to slow down, ask better questions, and reimagine AI as a tool for well-being—not just productivity. Listen now and rethink the terms of service we accept in our digital lives.

🎧 Listen now: Episode Link

Credits

Host: Mary Camacho

Guest: Marisa Zalabak

Produced by Terms of Service Podcast

Sound Design: Arthur Vincent and Sonor Lab

Co-Producers: Nicole Klau Ibarra & Mary Camacho