
Healthcare's Situationship with AI Won't Be Seeing Its Official Stage for a While

  • Writer: Momoko Monzur
  • 2 days ago
  • 2 min read


Like any novel technology, AI is imagined to be limitless in its possibilities, with investment firms and startups jumping at the chance to brand their initiative ‘the first.’ Health startups often treat the new tech like a database, using it as a faster route to exactly the kind of information a clinician may be looking for.

However, all AI is built on human intelligence, and if it is averaged to reflect the typical human mind, it will also reflect the average human error. Take gender bias: out of cultural normalization and social skepticism, healthcare workers continue to dismiss the pain of women, people of color, and other underrepresented groups, and models trained on their records inherit that habit.

Google’s Gemma carries this bias into the digital sphere. In Sam Rickman’s experiment, the LLM summarized 617 patient cases from various backgrounds. The results showed a much more direct overview of the patient’s condition when the patient was male than when she was female, with less attention given to reflecting women’s disabilities and levels of pain.

AI is also reliant on the datasets its users give it. Often, those datasets reflect only the groups of people who are able to get care in the first place, so they skew toward conditions that largely affect people of a certain socioeconomic status. When weighing risk factors for different medical and mental conditions, AI has been unable to fully assess large, diverse datasets. In those cases, the program couldn’t match the comprehensive risk assessment that current healthcare professionals achieve through heavy collaboration and careful record-keeping.

Though AI may prove efficient, it still can’t handle the full breadth of diversity that humanity has to offer. If AI were applied to healthcare as it currently stands, the results would be catastrophic: large-scale misdiagnoses, low patient satisfaction, and a continuing decline of trust in the healthcare system.

The computational model also lacks something that human-to-human interaction always promises. The knowledge built through experience and hospital culture is specific to the clinicians of that establishment. These individuals understand their community’s culture like second nature, having lived in the same area as their patients. That familiarity passes from one generation of practitioners to another through verbal communication, not through a painstakingly kept dataset. Until developers can find a way to gather the data they need while keeping the efficiency and pace of the hospital uninterrupted, AI will most likely not become the fulfilling assistant that its creators and investors want it to be.


© 2025 OvaCare Equity Project.
