SOC Blog 1: Data & Society Databites Talk
Published on:
What Black Maternal Health Teaches Us About Data Ethics
Video: Databite 160: Black Maternal Health is in Crisis. Can Technology Help?
Why I Selected This Video
I chose this Databite because I was interested in how technology and healthcare intersect, especially in relation to Black maternal health. I had never really thought about data privacy in the context of healthcare, yet it seems to be a huge problem that isn't talked about often. Many people tend to trust hospitals without a second thought, assuming all of our information is in good hands, but in reality, is it?
Who Are the Speakers?
The conversation was hosted by Joan Mukogosi, a research analyst at Data & Society who studies the social implications of data and automation. She guided the discussion and tied the ethical questions to broader debates about trust and technology.
The guests were Dr. Mary Fleming and Ijeoma Uche. Dr. Fleming is a practicing physician in Baltimore and Louisville, and the co-founder and Chief Medical Officer at Cayaba Care, a maternal health startup in Philadelphia. She also directs a leadership development program at Harvard’s School of Public Health, where she focuses on advancing equity in healthcare.
Ijeoma Uche is the co-founder of Birth By Us, a postpartum and pregnancy app that empowers women of color to shape their own birthing experiences while giving providers better insights into maternal health needs.
Dr. Fleming brought a clinical perspective to the conversation, while Ijeoma brought one grounded in tech innovation.
What Was Discussed
The discussion focused on how data-driven maternity care impacts Black women and birth. Joan introduced the discussion by referencing her research that identified three major forms of data collection Black patients experience:
- Electronic health records
- Medicaid enrollment data
- “FemTech” devices
She explained that these tools are meant to improve healthcare but also put patient data at risk of exposure. Joan stressed that these forms of data collection create vulnerability in a system with few privacy protections in place.
Where and Why It Matters
This conversation is happening in cultural, technological, and policy contexts. The U.S. continues to face a Black maternal health crisis, where Black women are far more likely to die or suffer complications during childbirth than white women. At the same time, the healthcare industry is rapidly digitizing and collecting more patient data than ever before.
The crises of racial health disparities and data privacy are colliding. As Dr. Fleming explained, regional policy differences and healthcare system design can make care inconsistent and fragmented. Ijeoma added that without diverse datasets and teams, algorithms can reproduce the same biases that already exist in the system.
How Speakers Responded
Dr. Fleming emphasized being mindful of how healthcare policies and data practices vary by region. She also pointed out that improving maternal health requires looking beyond the clinic and observing how data moves between systems and who has access to it.
Ijeoma focused on balance. She argued that digital health tools shouldn’t replace in-person care, but rather bridge gaps in the system. Her company, Birth By Us, aims to make data work for patients rather than against them, helping providers understand cultural context while empowering mothers with digestible, accessible information. She also stressed the importance of teaching patients about their own data, so they can make informed decisions without fear.
So What?
This talk made me think about how technology that's designed to "help" can sometimes deepen inequities if it isn't built with care. As someone studying computer science, I found this particularly relevant because the algorithms we write can have real effects on people's lives… scaryyy. Listening to Ijeoma describe how bias can creep into AI models reminded me that diverse teams and ethical design aren't just nice to have; they're necessities. "W" St. Olaf for making CS majors take ethics.
It also connected to my own experience: I've used fitness and wellness trackers that collect so much data without explaining where it goes. If I, a rather tech-savvy person, find that concerning, it's easy to imagine how much more complicated and risky it becomes in healthcare settings.
New Question
If digital health tools are built on biased data, can they ever truly serve marginalized communities?
As someone who will likely, well hopefully, be working in the tech industry soon enough, I chose this question because biased data is often overlooked and it's tough to analyze without an ethical standpoint. We briefly discussed in class an example of a woman asking AI to make her look more professional, and it made her paler. I mention this because biased systems are everywhere. In the context of healthcare, however, such systems can make the difference between life and death. Yes, AI making a woman look paler is messed up, but that kind of bias can lead to much bigger disparities.
