Femtech's Reckoning: Privacy, Power, and Protection in Health Technology with Soribel Feliz

January 29, 2026 | 51:50

In this episode of Terms of Service, host Mary Camacho speaks with Soribel Feliz—AI governance and tech policy advisor—about the dangerous gaps between what health technology promises and what privacy law actually protects. Drawing from her experience advising on AI and emerging tech across the U.S. Senate, federal government, and Big Tech, Soribel examines how femtech and wellness apps claim to empower women while selling their most intimate data and leaving them vulnerable to law enforcement.

This conversation starts with a moment at a pitch competition: a pregnancy app founder dismissed a question about law enforcement access with "we're HIPAA compliant" and turned away. That refusal to engage with hard questions captures the problem. With 210 pregnant women facing criminal charges built in part on data from apps that promised empowerment, this episode asks: what would it take to build health technology that actually protects the people who use it?

Key Takeaways

  • HIPAA compliance doesn't mean privacy protection. Most consumer health apps aren't covered entities and can sell your data freely.
  • Your health data is being sold without meaningful consent. Period tracking apps sold location data to anti-abortion organizations to target women visiting Planned Parenthood.
  • Pregnancy loss can become criminal evidence. In the first year after Dobbs, 210 pregnant women faced criminal charges in which app data served as evidence—HIPAA offered zero protection.
  • Compliance ≠ actual protection. Checking regulatory boxes doesn't mean users are safe. Founders and investors must ask harder questions.
  • Algorithms are personal. From hiring discrimination to insurance denials, AI systems make intimate decisions about people's lives with little transparency.

Topics Covered / Timestamped Sections

  • 02:40 – The pitch competition moment: when "HIPAA compliant" became a shield against accountability
  • 04:31 – What HIPAA actually does and doesn't do
  • 08:49 – Period tracker data sold to Wisconsin Right to Life for anti-abortion targeting
  • 16:30 – Why consumer health apps aren't covered by HIPAA
  • 20:15 – Law enforcement access and pregnancy loss as criminal evidence
  • 32:18 – ChatGPT Health and the risks of sharing complete medical records
  • 37:50 – What Soribel learned in the Senate, at Meta, and Microsoft
  • 43:45 – Algorithms are personal: Workday's hiring discrimination lawsuit
  • 46:23 – Advice for founders: put your money where your mouth is
  • 50:26 – Follow Soribel's work on LinkedIn, Substack, and YouTube

Guest Bio and Links

Soribel Feliz – AI governance and tech policy advisor with experience advising on AI and emerging tech across the U.S. Senate, federal government, and Big Tech. She focuses on how AI systems create legal, ethical, and operational risk, especially in health tech and femtech, and how organizations can govern AI responsibly at scale. Her work highlights where privacy law and AI governance fall short and why robust governance frameworks matter now.

Further Reading / Related Episodes

  • Soribel's "Femtech Reckoning" series on Substack

Call to Action

What does it mean to build health technology that actually protects the people who use it? Soribel Feliz offers a clear-eyed examination of where femtech is failing—and what it would take for founders, investors, and policymakers to ask the hard questions.

Credits

Host: Mary Camacho

Guest: Soribel Feliz

Produced by Terms of Service Podcast

Sound Design: Arthur Vincent and Sonor Lab

Co-Producers: Nicole Klau Ibarra & Mary Camacho