Globaltechoutlook


Resolving Gender Issues in Wikipedia, Meta’s New AI is at Work

AI researcher Angela Fan from Meta is addressing gender bias in Wikipedia biographies

On the internet, as in the real world, women are drastically under-represented compared to men. Just 20% of Wikipedia biographies are about women, and that share drops even further for women from intersectional groups. AI models, in turn, do not cover everyone in the world equally. To help close this gap, AI researcher Angela Fan of Meta is using AI to write rough drafts of Wikipedia-style biographies of women.


Gender Bias in Wikipedia Biographies

AI researcher Angela Fan from Meta is building an open-source AI system that gathers sources and writes first drafts of Wikipedia-style biographies, complete with citations. The work was recently disclosed in a research paper titled “Generating Full-Length Wikipedia Biographies: The Impact of Gender Bias on the Retrieval-Based Generation of Women Biographies”.

Meta is also releasing the dataset used to evaluate the model, covering 1,527 biographies of women. Given a subject, the model searches for information and drafts a Wikipedia-style entry, including citations. The system could one day help Wikipedia editors create thousands of accurate, compelling biography entries for important people who are currently missing from the site. The accompanying dataset focuses not just on women in general but specifically on women in science and women located in Asia and Africa.

The natural language processing community has long worked on combating gender bias in areas such as abusive-language detection and machine translation. This work highlights three key challenges in text generation: how to gather relevant evidence, how to structure that information into well-formed text, and how to ensure that the generated text is factually correct. Finally, Angela Fan emphasized that the model and dataset are just one small step in the process of righting long-standing, inherent bias against women.
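The three challenges above map onto a retrieval-then-generate pipeline. The toy sketch below illustrates that shape only; the function names, scoring method, and corpus are invented for illustration and are not Meta's actual system, model, or data.

```python
# Hypothetical sketch of a retrieval-based biography drafter, illustrating the
# three challenges: gathering evidence, structuring it into text, and keeping
# every sentence attributable to a source. All names and data are invented.

def retrieve(query, corpus, k=2):
    """Toy evidence gathering: rank documents by word overlap with the query."""
    query_words = set(query.lower().split())
    ranked = sorted(
        corpus,
        key=lambda doc: -len(query_words & set(doc["text"].lower().split())),
    )
    return ranked[:k]

def draft_biography(name, corpus):
    """Structure the retrieved evidence into a cited, Wikipedia-style draft."""
    evidence = retrieve(name, corpus)
    # Each sentence carries an inline citation marker back to its source,
    # so a human editor can verify factual correctness before publishing.
    body = " ".join(f'{doc["text"]} [{i + 1}]' for i, doc in enumerate(evidence))
    refs = "\n".join(f'[{i + 1}] {doc["source"]}' for i, doc in enumerate(evidence))
    return body + "\n\nReferences:\n" + refs

# Invented mini-corpus standing in for web search results.
corpus = [
    {"text": "Jane Doe is a physicist known for work on superconductors.",
     "source": "example.org/jane-doe-profile"},
    {"text": "Jane Doe received the Example Prize in 2019.",
     "source": "example.org/prize-2019"},
    {"text": "Unrelated article about weather patterns.",
     "source": "example.org/weather"},
]

print(draft_biography("Jane Doe", corpus))
```

The citation markers are the key design point: because every generated sentence is tied to a retrieved document, a human editor can check each claim, which is how such drafts stay a starting point rather than an unverifiable final text.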