Google Unveils Three Initiatives to Promote Health Equity in AI

In a significant step towards ensuring fairness and access in healthcare, Google's Chief Health Equity Officer today announced three key initiatives aimed at building equity into AI-powered health tools. These initiatives were unveiled at Google's annual health event, The Check Up.


"Health equity means everyone has a fair and just opportunity to be as healthy as possible," stated the Chief Health Equity Officer. "The reality is that many people, particularly people of color, women, and those in rural areas, face significant barriers to achieving good health. Our team is committed to developing responsible and equitable AI tools to bridge this gap."


Recognizing the rapid evolution of medical AI, Google researchers developed a new framework to identify and mitigate potential biases that could negatively impact health outcomes. Their research paper, "A Toolbox for Surfacing Health Equity Harms and Biases in Large Language Models," outlines a method for assessing bias in medical large language models (LLMs).


The paper introduces "EquityMedQA," a collection of seven datasets designed to test for bias, along with a framework for using them. Google researchers leveraged existing health equity literature, documented model failures, and insights from equity experts to create these tools. The datasets and framework have already been used to evaluate Google's own LLMs and are now available to the wider research community.
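
For a sense of how adversarial probe datasets like these can be used in practice, the snippet below sketches a simple evaluation loop: each probe question is sent to the model under test, a rater flags biased answers, and flag rates are reported per equity axis. The `query_model` and `rate_for_bias` functions and the probe fields are illustrative placeholders, not the paper's actual tooling or schema.

```python
# Illustrative sketch: running adversarial equity probes against a medical LLM.
# `query_model`, `rate_for_bias`, and the Probe fields are hypothetical stand-ins.

from dataclasses import dataclass

@dataclass
class Probe:
    question: str   # adversarial or counterfactual medical question
    axis: str       # equity axis being tested, e.g. "race", "sex", "geography"

def query_model(prompt: str) -> str:
    """Placeholder for whatever medical LLM endpoint is being evaluated."""
    raise NotImplementedError

def rate_for_bias(question: str, answer: str) -> bool:
    """Placeholder for a human or rubric-based rater that flags biased answers."""
    raise NotImplementedError

def evaluate(probes: list[Probe]) -> dict[str, float]:
    """Return the fraction of answers flagged as biased, broken down by equity axis."""
    flagged: dict[str, list[bool]] = {}
    for probe in probes:
        answer = query_model(probe.question)
        flagged.setdefault(probe.axis, []).append(rate_for_bias(probe.question, answer))
    return {axis: sum(flags) / len(flags) for axis, flags in flagged.items()}
```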


A team of Google researchers, including health equity specialists, social scientists, clinicians, and AI experts, collaborated to develop the HEAL (Health Equity Assessment of Machine Learning performance) framework. HEAL is designed to evaluate whether an AI model is likely to create or exacerbate existing health disparities. The framework's four steps, illustrated in the sketch after this list, are:

  • Identifying factors linked to health inequities and defining relevant AI performance metrics.

  • Quantifying pre-existing health outcome disparities across different populations.

  • Measuring the AI tool's performance for each subgroup identified in step two.

  • Assessing the likelihood of the AI tool exacerbating health disparities.
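
To make these steps concrete, here is a minimal sketch of the arithmetic they imply, using made-up numbers for a dermatology model evaluated across age bands. The rank-correlation check at the end is a simplification for illustration, not the published HEAL metric.

```python
# Illustrative sketch of the four HEAL steps, assuming subgroup-level inputs
# have already been collected. All numbers below are made up.

from scipy.stats import spearmanr

# Step 1: factors linked to health inequities and a chosen performance metric
# (here, hypothetical sensitivity per age band for a dermatology model).
subgroups = ["18-39", "40-69", "70+"]

# Step 2: pre-existing outcome disparity per subgroup (higher = worse outcomes).
outcome_disparity = {"18-39": 0.10, "40-69": 0.25, "70+": 0.45}

# Step 3: measured model performance per subgroup (higher = better).
model_performance = {"18-39": 0.91, "40-69": 0.88, "70+": 0.79}

# Step 4: assess the likelihood of widening disparities. If performance drops
# exactly where outcomes are already worse, that is a warning sign.
disparity = [outcome_disparity[g] for g in subgroups]
performance = [model_performance[g] for g in subgroups]
rho, _ = spearmanr(disparity, performance)

if rho < 0:
    print(f"Performance falls as disparity rises (rho={rho:.2f}): risk of widening gaps")
else:
    print(f"Performance does not fall with disparity (rho={rho:.2f})")
```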


The HEAL framework has already been applied to a dermatology AI model. While the model performed equitably across race, ethnicity, and sex, researchers identified room for improvement in its performance for older adults. Specifically, the model accurately detected cancerous conditions like melanoma across all age groups, but struggled with non-cancerous conditions like eczema in individuals over 70. Google plans to continue applying and refining the HEAL framework for future healthcare AI models.


A lack of diversity in existing dermatology datasets hinders the development of equitable AI models. Current datasets often consist of clinical images that may not reflect the full spectrum of skin tones, body areas, and disease severities found in the real world. Additionally, these datasets typically focus on severe conditions like skin cancer, neglecting more common issues like allergic rashes or infections.


To address this challenge, Google partnered with Stanford Medicine to create the Skin Condition Image Network (SCIN). This open-access dataset contains over 10,000 real-world dermatology images contributed by thousands of participants. Dermatologists and researchers then classified the images according to diagnoses and two skin tone scales, ensuring the dataset represents a wide range of skin types and conditions.
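
For researchers who want to explore the dataset, the snippet below sketches how case metadata might be loaded and summarized with pandas. The file name and column names are assumptions for illustration, not the published SCIN schema.

```python
# Illustrative sketch: loading and summarizing SCIN-style case metadata.
# The file path and column names below are assumptions, not SCIN's actual schema.

import pandas as pd

# Assumed local copy of the case-level metadata released with the dataset.
cases = pd.read_csv("scin_cases.csv")

# Count cases per labeled condition to see how well common, non-cancerous
# conditions (e.g. eczema, allergic rashes) are represented.
print(cases["condition_label"].value_counts().head(10))

# Check coverage across a skin tone scale, e.g. a Fitzpatrick-style column.
coverage = cases.groupby("fitzpatrick_skin_type")["case_id"].count()
print(coverage)
```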


The SCIN dataset empowers scientists and doctors to develop tools for identifying skin conditions, conduct dermatology research, and train future healthcare professionals on a broader range of skin manifestations.


While acknowledging the ongoing nature of this work, Google's Chief Health Equity Officer emphasized their commitment to collaboration and knowledge sharing. "By working with partners and sharing our learnings, we believe we can build a healthier future for everyone, regardless of background or location," they concluded.
