Listing by author "Haug, Saskia"
Results 1–3 of 3
- Conference paper: Accelerating Deductive Coding of Qualitative Data: An Experimental Study on the Applicability of Crowdsourcing (Mensch und Computer 2021 - Tagungsband, 2021). Haug, Saskia; Rietz, Tim; Mädche, Alexander. While qualitative research can produce a rich understanding of people's minds, it requires an essential and strenuous data annotation process known as coding. Coding can be repetitive and time-consuming, particularly for large datasets. Crowdsourcing provides flexible access to workers all around the world; however, researchers remain doubtful about its applicability for coding. In this study, we present an interactive coding system to support crowdsourced deductive coding of semi-structured qualitative data. Through an empirical evaluation on Amazon Mechanical Turk, we assess both the quality and the reliability of crowd support for coding. Our results show that non-expert coders provide reliable results using our system. The crowd reached a substantial agreement of up to 91% with the coding provided by experts (see the agreement sketch after this list). Our results indicate that crowdsourced coding is an applicable strategy for accelerating a strenuous task. Additionally, we present implications of crowdsourcing for reducing biases in the interpretation of qualitative data.
- Workshop paper: The Human-in-the-loop CrowdSurfer Concept: Providing User-centered AI Support to Crowdworkers for Improved Working Conditions and Task Outcomes (Mensch und Computer 2023 - Workshopband, 2023). Benke, Ivo; Haug, Saskia; Maedche, Alexander. We present our vision of a human-in-the-loop crowdsourcing system, the Human-in-the-loop (HitL-) CrowdSurfer concept. It allows crowdworkers to leverage the capabilities of LLMs on the one hand and, on the other hand, lets them contribute their human skills to the work outcome. With the HitL-CrowdSurfer, crowdworkers can surf the internet and complete microtasks such as providing design feedback, working on creativity tasks, or writing alt-tags, and they are assisted in developing answers to the respective microtasks.
- Conference paper: Scalable Design Evaluation for Everyone! Designing Configuration Systems for Crowd-Feedback Request Generation (Mensch und Computer 2023 - Tagungsband, 2023). Haug, Saskia; Sommerrock, Sophia; Benke, Ivo; Maedche, Alexander. Design evaluation is an important step during software development to ensure that users' requirements are met. Crowd feedback is an effective approach to tackling the scalability issues of traditional design evaluation methods. However, crowd-feedback systems are usually developed for a fixed use case, and designers lack the knowledge to build individual crowd-feedback systems themselves; consequently, such systems are rarely applied in practice. To address this challenge, we propose the design of a configuration system that supports designers in creating individual crowd-feedback requests. Based on expert interviews (N=14) and an exploratory literature review, we derive four design rationales for such configuration systems and propose a prototypical configuration system instantiation (an illustrative request-configuration sketch follows this list). We evaluate this instantiation in exploratory focus groups (N=10). The results show that feedback requesters appreciate guidance, but the configuration system needs to balance complexity and flexibility. With our research, we contribute a generalizable concept that supports feedback requesters in creating individualized crowd-feedback requests, enabling scalable design evaluation for everyone.
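
The first entry reports up to 91% agreement between crowd and expert coders. The abstract does not specify the exact metric or aggregation scheme, so the following is only a minimal sketch: it assumes simple percentage agreement over a fixed deductive codebook, with crowd labels aggregated by majority vote. Function and variable names are illustrative, not from the paper.

```python
# Hedged sketch: percentage agreement between majority-voted crowd codes and
# expert codes; the study's actual metric and aggregation may differ.
from collections import Counter


def majority_vote(labels: list[str]) -> str:
    """Return the most frequent crowd label assigned to a single item."""
    return Counter(labels).most_common(1)[0][0]


def percent_agreement(crowd_labels: list[list[str]], expert_labels: list[str]) -> float:
    """Share of items where the aggregated crowd code matches the expert code."""
    matches = sum(
        majority_vote(crowd) == expert
        for crowd, expert in zip(crowd_labels, expert_labels)
    )
    return matches / len(expert_labels)


if __name__ == "__main__":
    crowd = [
        ["usability", "usability", "performance"],   # 3 crowd codes for item 1
        ["performance", "performance", "performance"],  # 3 crowd codes for item 2
    ]
    expert = ["usability", "performance"]
    print(f"Agreement: {percent_agreement(crowd, expert):.0%}")  # Agreement: 100%
```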
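The third entry proposes a configuration system for crowd-feedback requests but does not publish a schema in the abstract. The data structure below is therefore purely an assumption about what a designer-configurable feedback request might contain; all field names are hypothetical.

```python
# Hedged sketch: a possible shape for a configurable crowd-feedback request.
# The fields are assumptions for illustration, not the paper's actual design.
from dataclasses import dataclass, field


@dataclass
class FeedbackRequest:
    """A designer-configured request for crowd feedback on a design artifact."""
    artifact_url: str                 # link to the prototype or mockup under evaluation
    guiding_questions: list[str]      # questions shown to crowd feedback providers
    feedback_dimensions: list[str] = field(
        default_factory=lambda: ["usability", "visual design", "content"]
    )
    responses_requested: int = 20     # target number of crowd responses
    allow_free_text: bool = True      # allow free-text comments in addition to ratings


if __name__ == "__main__":
    request = FeedbackRequest(
        artifact_url="https://example.org/prototype",
        guiding_questions=["Is the navigation easy to follow?"],
    )
    print(request)
```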