Listing by author "Funk, Markus"
1 - 8 of 8
- Workshop paper: Brain Painting: Action Paintings based on BCI-Input (Mensch & Computer 2013 - Workshopband, 2013)
  Funk, Markus; Raschke, Michael
  We introduce roboPix, a robot that paints action paintings from a user's thoughts using a Brain-Computer Interface (BCI). The BCI provides signals that encompass the user's recognized thoughts and the user's level of excitement. These signals command the movement of the robot's arm, which spreads the paint on the canvas. Our system combines explicit and implicit signals to personalize and affect the created painting. Furthermore, we implemented a feedback loop to re-engage the user with the system after they lose focus. The system creates a modern-art representation of the user's excitement and thoughts at the moment of creation.
- Conference paper: MediaBrain: Annotating Videos based on Brain-Computer Interaction (Mensch & Computer 2012: interaktiv informiert – allgegenwärtig und allumfassend!?, 2012)
  Sahami Shirazi, Alireza; Funk, Markus; Pfleiderer, Florian; Glück, Hendrik; Schmidt, Albrecht
  Adding notes to time segments on a video timeline makes it easier to search, find, and play back important segments of the video. Various approaches have been explored to annotate videos (semi-)automatically and to summarize them. In this research we investigate the feasibility of implicitly annotating videos based on brain signals retrieved from a Brain-Computer Interface (BCI) headset. The signals provided by the BCI can reveal different information such as brain activities, facial expressions, or the user's level of excitement. This information correlates with the scenes the user watches in a video. Thus, it can be used for annotating a video and automatically generating a summary. To this end, we developed an annotation tool called MediaBrain and conducted a user study. The results reveal that it is possible to annotate a video and select a set of highlights based on the excitement information.
- Conference paper: Teachyverse: Collaborative E-Learning in Virtual Reality Lecture Halls (Mensch und Computer 2019 - Tagungsband, 2019)
  Marky, Karola; Müller, Florian; Funk, Markus; Geiß, Alexander; Günther, Sebastian; Schmitz, Martin; Riemann, Jan; Mühlhäuser, Max
  Over the last decades, E-learning has gained considerable popularity and has enabled students to learn in front of their computers using Internet-based learning systems rather than physically attending lectures. However, such E-learning systems differ from traditional learning in that they do not fully immerse the student in the learning environment. Thus, we propose Teachyverse, an immersive VR lecture hall that combines E-learning, traditional learning, and remote collaboration. Teachyverse immerses the student in a virtual lecture hall. A proof-of-concept study shows that students perceive lectures in Teachyverse as fun and would like to use Teachyverse as a further E-learning option.
- Workshop paper: Towards an Optimal Viewpoint in Third-Person out-of-body Experiences (Mensch und Computer 2015 – Proceedings, 2015)
  Boldt, Robin; Hoppe, Matthias; Kosch, Thomas; Funk, Markus; Knierim, Pascal; Pfleging, Bastian; Henze, Niels
  Human vision is subject to natural constraints; field of view, perceived wavelength, angular resolution, and perspective are just a few examples. Combining a head-mounted display with a camera can overcome some of these limitations. We investigate the field of view created by presenting a third-person view to the wearer of the head-mounted display. We propose an automatic camera positioning that may provide a better overview to the user than manual positioning.
- Conference paper: Virtual and Augmented Reality in Everyday Context (VARECo) (Mensch und Computer 2018 - Workshopband, 2018)
  Weyers, Benjamin; Zielasko, Daniel; Pfeiffer, Thies; Funk, Markus
  In the last decade, Virtual and Augmented Reality (VR/AR) hardware entered the consumer market, which broadened the potential use of VR and AR in everyday contexts. By everyday context we mean situations, environments, or workflows that are part of everyday life. Examples include the use of VR for data analysis as part of a data analyst's workflow, or the use of AR for supporting spatially dispersed workers in, e.g., the maintenance of technical systems. To support research and development in this area, we propose a one-day workshop on VR and AR in Everyday Context (VARECo), bringing together interested researchers and practitioners to discuss current and future work in this research domain. The workshop will be split into two parts: the presentation of submitted and reviewed papers, and an interactive part discussing current and future research topics in the field. The latter will be guided towards a common journal publication as a follow-up activity.
- Conference paper: Visualizing Locations for a Search Engine for the Physical World (Mensch & Computer 2014 - Tagungsband, 2014)
  Funk, Markus; Boldt, Robin; Eisele, Marcus; Yalcin, Taha; Henze, Niels; Schmidt, Albrecht
- Conference paper: We should start thinking about Privacy Implications of Sonic Input in Everyday Augmented Reality! (Mensch und Computer 2018 - Workshopband, 2018)
  Wolf, Katrin; Marky, Karola; Funk, Markus
  Evolution in technology causes privacy issues, which are currently under intense discussion. Much attention is given to smart cameras, the Internet of Things, and the Internet in general, while sonic AR systems are overlooked. Many users, for example, cover their laptop cameras with physical layers, but it seems that no attention is drawn to the sonic hardware, which can be hacked just like cameras. In this position paper, we highlight everyday situations that are prone to cause privacy problems through sonic AR. We then look at current proposals to protect users from camera-caused privacy violations as examples and discuss how they could be adapted to prevent the misuse of sonic information. We conclude by stating that the current privacy discussion overlooks sonic AR, although this is a channel across which even more detailed, and hence more sensitive, information can be communicated and misused.
- Workshop paper: Workshop on Virtual and Augmented Reality in Everyday Context (VARECo) (Mensch und Computer 2019 - Workshopband, 2019)
  Weyers, Benjamin; Zielasko, Daniel; Kulik, Alexander; Langbehn, Eike; Funk, Markus
  The workshop is meant as a forum for researchers and practitioners interested in investigating open questions in the use of VR and AR technology in everyday contexts. It aims at the presentation and discussion of current results and research questions in this area. The workshop comprises a paper session for the presentation of current and ongoing work, as well as an interactive session in which open research questions in this context will be identified and discussed.