2018 IEEE/ACM 1st International Workshop on Gender Equality in Software Engineering

Gender Bias in Artificial Intelligence: The Need for Diversity and Gender Theory in Machine Learning

Abstract

Artificial intelligence is increasingly influencing the opinions and behaviour of people in everyday life. However, the over-representation of men in the design of these technologies could quietly undo decades of advances in gender equality. Over centuries, humans developed critical theory to inform decisions and avoid basing them solely on personal experience. However, machine intelligence learns primarily from observing the data it is presented with. While a machine's ability to process large volumes of data may address this in part, if that data is laden with stereotypical concepts of gender, the resulting application of the technology will perpetuate this bias. While some recent studies have sought to remove bias from learned algorithms, they largely ignore decades of research on how gender ideology is embedded in language. Awareness of this research, and its incorporation into approaches to machine learning from text, would help prevent the generation of biased algorithms. Leading thinkers in the emerging field addressing bias in artificial intelligence are also primarily female, suggesting that those who are potentially affected by bias are more likely to see, understand and attempt to resolve it. Gender balance in machine learning is therefore crucial to prevent algorithms from perpetuating gender ideologies that disadvantage women.