Listing by keyword "gender bias"
1 - 2 of 2
- Conference paper: Effects of Smart Virtual Assistants' Gender and Language (Mensch und Computer 2019 - Tagungsband, 2019). Habler, Florian; Schwind, Valentin; Henze, Niels.
  Abstract: Smart virtual assistants (SVAs) are becoming increasingly popular. Prominent SVAs, including Siri, Alexa, and Cortana, have female-gendered names and voices, which has raised the concern that combining female-gendered voices with submissive language amplifies gender stereotypes. We investigated the effect of gendered voices and the language used on the perception of SVAs. We asked participants to assess the performance, personality, and user experience of an SVA while we controlled the gender of the voice and the attributed status of the language. We show that low-status language is preferred, but the voice's gender has a much smaller effect. Using low-status language and female-gendered voices might be acceptable, but solely combining low-status language with female-gendered voices is not.
- Text document: Measuring Gender Bias in German Language Generation (INFORMATIK 2022, 2022). Kraft, Angelie; Zorn, Hans-Peter; Fecht, Pascal; Simon, Judith; Biemann, Chris; Usbeck, Ricardo.
  Abstract: Most existing methods for measuring social bias in natural language generation are specified for English language models. In this work, we developed a German regard classifier based on a newly crowd-sourced dataset. Our model matches the test-set accuracy of the original English version. With the classifier, we measured binary gender bias in two large language models. The results indicate a positive bias toward female subjects for a German version of GPT-2 and similar tendencies for GPT-3. Yet, upon qualitative analysis, we found that positive regard partly corresponds to sexist stereotypes. Our findings suggest that the regard classifier should not be used as a single measure but should instead be combined with more qualitative analyses.