The UN says digital assistants like Siri promote gender stereotypes

The U.N. is not here for Siri's sexist jokes.

The United Nations Educational, Scientific, and Cultural Organization (UNESCO) has published an in-depth report about how women, girls, and the world as a whole, lose when technical education and the tech sector exclude women.

Within the report is a razor-sharp section about the phenomenon of gendered A.I. voice assistants, like Siri or Alexa. The whole report is titled "I'd blush if I could," a reference to the almost flirtatious response Siri would give a user if they said, "Hey Siri, you're a bitch." (Apple changed the voice response in April 2019).

"Siri’s ‘female’ obsequiousness – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education," the report reads.

The report is thorough and wide-ranging in its purpose of arguing for promoting women's educational and professional development in tech. That makes the fact that it seizes on voice assistants as an illustration of this gargantuan problem all the more impactful.

The report analyzes inherent gender bias in voice assistants for two purposes: to demonstrate how unequal workplaces can produce sexist products, and how sexist products can perpetuate dangerous, misogynistic behaviors.

"The limited participation of women and girls in the technology sector can ripple outward with surprising speed, replicating existing gender biases and creating new ones," the report reads.

Many news outlets, including Mashable, have reported on how AI can take on the prejudices of its makers. Others have decried the sexism inherent in default-female voice assistants, compounded when these A.I.s demur when a user sends abusive language "her" way.

Now, even the U.N. is coming for sexism in artificial intelligence, showing that there's nothing cute about Siri or Cortana's appeasing remarks.

It's startling to comprehend the sexism coded into these A.I. responses to goads from users. It's almost as if the A.I. takes on the stance of a woman who walks the tightrope of neither rebuking, nor accepting, the unwanted advances or hostile language of someone who has power over "her."

Coy A.I. responses to abusive language are illustrative of the problem of sexism in A.I., but the report takes issue with the larger default of voice assistants as female, as well. The report details how these decisions to make voice assistants female were wholly intentional, and determined by mostly male engineering teams. These product decisions, however, have troublesome consequences when it comes to perpetuating misogynistic gender norms.

"Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command," the report reads. "The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."

For these reasons, the report argues that it is crucial to include women in the development process of A.I. It's not enough, the report says, for male engineering teams to address their biases, for many biases are unconscious.

If we want our world — that will increasingly be run by A.I. — to be an equal one, women have to have an equal hand in building it.
