UNESCO Report Exposes Gender, Ethnic Bias in Leading AI Models
March 7, 2024
A UNESCO report uncovers gender and ethnic biases in AI models such as GPT-3.5 and Llama 2, which link high-status jobs to men and domestic roles to women.
The study highlights the negative portrayal of ethnic minorities and gay individuals in AI-generated content.
The report calls for increased collaboration, transparency, and regulation, including involvement from private companies, to combat AI biases.
UNESCO's Recommendation on the Ethics of Artificial Intelligence, adopted in 2021, advocates for gender equality in AI and is intended to guide future AI development.
UNESCO Director-General urges action from both governments and corporations to address AI biases and ensure diversity in AI development.
Summary based on 12 sources
Sources

FRANCE 24 • Mar 7, 2024
AI tools by OpenAI and Meta generate sexist content, UNESCO warns
International Business Times • Mar 7, 2024
AI Tools Generate Sexist Content, Warns UN
Mirage News • Mar 7, 2024
UNESCO Study: Generative AI Fuels Concerning Gender Stereotypes
TechCentral.ie • Mar 7, 2024
UNESCO study shows generative AI likely to propagate negative stereotypes