101@feddit.org to Technology@lemmy.world · English · 8 months ago

Gender Bias in Text-to-Image Generative Artificial Intelligence When Representing Cardiologists

www.mdpi.com · 11 comments

Introduction: While the global medical graduate and student population is approximately 50% female, only 13–15% of cardiologists and 20–27% of cardiology training fellows are female. Text-to-image generative artificial intelligence (AI) could be a transformative tool for promotion and for professional perceptions of the specialty. In particular, DALL-E 3 offers a useful tool for promotion and education, but it could also reinforce gender and ethnicity biases.

Method: Responding to pre-specified prompts, DALL-E 3 (via GPT-4) generated a series of individual and group images of cardiologists. Overall, 44 images were produced: 32 containing individual characters and 12 group images containing between 7 and 17 characters. Three reviewers independently analysed all images for the characters' apparent genders, ages, and skin tones.

Results: Across all images combined, 86% (N = 123) of cardiologists were depicted as male, and 93% (N = 133) were depicted with a light skin tone. The gender distribution was not statistically different from actual Australian workforce data (p = 0.7342); that is, DALL-E 3 reproduces the existing under-representation of women in the cardiology workforce rather than correcting it.

Conclusions: The gender bias DALL-E 3 exhibits when depicting cardiologists limits its usefulness for promotion and education aimed at addressing workforce gender disparities.
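
The headline result is a simple proportion comparison. Below is a minimal sketch of that kind of check in Python, assuming the counts implied by the abstract (123 male-coded characters out of roughly 143) and an assumed 86% male workforce baseline; the paper's exact counts and test may differ.

```python
# Sketch of the proportion comparison described in the abstract.
# The counts are inferred from the abstract (123 male-coded characters,
# about 86% of roughly 143 total); the 0.86 workforce baseline and the
# choice of a binomial test are assumptions, not the paper's method.
from scipy.stats import binomtest

male_generated = 123     # characters coded male across all 44 images
total_characters = 143   # 123 / 143 is about 86%, matching the abstract
workforce_male_share = 0.86  # assumed share of male cardiologists

result = binomtest(male_generated, total_characters, workforce_male_share)
print(f"p = {result.pvalue:.4f}")
# A large p-value (the paper reports p = 0.7342) means the generated
# images match the already skewed workforce distribution: DALL-E 3
# mirrors the under-representation of women rather than correcting it.
```
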
  • nyan@lemmy.cafe · 8 months ago

    And even in cases where introducing a bias is desirable, you have to be very careful when doing it. There has been at least one case where introducing a bias towards diversity caused problems when the algorithm was asked for images of historical people, who were often not diverse at all.
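
A toy sketch of the failure mode described in the comment above: naive demographic rebalancing that ignores context. Everything here (the term lists, the function names) is invented for illustration; no real system is claimed to work this way.

```python
import random

# Hypothetical demographic terms a naive rebalancer might inject.
DIVERSITY_TERMS = ["a woman", "a man", "a Black person", "an Asian person"]
# Crude cues that a prompt is historically specific (illustrative only).
HISTORICAL_MARKERS = ("medieval", "18th-century", "viking", "roman senator")

def augment_naively(prompt: str) -> str:
    """Prepend a random demographic term to every prompt: the naive fix."""
    return f"{random.choice(DIVERSITY_TERMS)} depicted as {prompt}"

def augment_carefully(prompt: str) -> str:
    """Skip rebalancing when the prompt looks historically specific."""
    if any(marker in prompt.lower() for marker in HISTORICAL_MARKERS):
        return prompt  # historical accuracy takes priority here
    return augment_naively(prompt)

print(augment_carefully("a cardiologist at work"))   # gets rebalanced
print(augment_carefully("a medieval English king"))  # left unchanged
```
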

Technology@lemmy.world

You are not logged in. However, you can subscribe from another Fediverse account, for example on Lemmy or Mastodon, by pasting the following into the search field of your instance: !technology@lemmy.world

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask whether your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


  • @L4s@lemmy.world
  • @autotldr@lemmings.world
  • @PipedLinkBot@feddit.rocks
  • @wikibot@lemmy.world
Visibility: Public

This community can be federated to other instances and be posted/commented in by their users.

  • 1.28K users / day
  • 2.57K users / week
  • 5.59K users / month
  • 8 users / 6 months
  • 0 local subscribers
  • 58.7K subscribers
  • 12.7K Posts
  • 537K Comments
  • Modlog
  • mods:
  • L3s@lemmy.world
  • enu@lemm.ee
  • L4sBot@lemmy.world
  • fry@fry.gs
  • L3s@fry.gs
  • enu@lemmy.world
  • L4sBot@fry.gs