Listing by author "Le, Huy Viet"
1 - 2 of 2
- MuC: Long paper (talks)
  Automatic Classification of Mobile Phone Contacts (Mensch & Computer 2013: Interaktive Vielfalt, 2013)
  Sahami Shirazi, Alireza; Le, Huy Viet; Henze, Niels; Schmidt, Albrecht
  Current smartphones have virtually unlimited space to store contact information. Users typically have dozens or even hundreds of contacts in their address book. The number of contacts can make it difficult to find particular contacts in the linear list provided by current phones. Grouping contacts eases the retrieval of particular contacts and also enables sharing content with specific groups. Previous work, however, shows that users are not willing to manually categorize their contacts. In this paper we investigate the automatic classification of contacts in phones' contact lists, using the user's communication history. Potential contact groups were determined in an online survey with 82 participants. We collected the call and SMS communication history from 20 additional participants. Using the collected data we trained a machine-learning algorithm that correctly classified 59.2% of the contacts. In a pilot study in which we asked participants to review the results of the classifier, we found that 73.6% of the reviewed contacts were considered correctly classified. We provide directions to further improve the performance and argue that the current results can already ease the manual classification of mobile phone contacts.
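The classification idea from the abstract above can be sketched as follows. This is a hypothetical illustration only: the feature set (calls per week, SMS per week, share of evening calls), the group labels, and the tiny nearest-centroid classifier are assumptions for demonstration, not the authors' actual model or data.

```python
# Illustrative sketch: grouping contacts from simple communication-history
# features with a nearest-centroid classifier. All features, labels, and
# numbers are invented for the example.
from collections import defaultdict
import math

def centroid(rows):
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(len(rows[0])))

def train(samples):
    # samples: list of (feature_tuple, label)
    by_label = defaultdict(list)
    for feats, label in samples:
        by_label[label].append(feats)
    return {label: centroid(rows) for label, rows in by_label.items()}

def classify(model, feats):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda label: dist(model[label], feats))

# Features per contact: (calls per week, SMS per week, share of evening calls)
training = [
    ((9.0, 2.0, 0.8), "family"),
    ((7.0, 1.0, 0.7), "family"),
    ((3.0, 0.5, 0.1), "work"),
    ((4.0, 1.0, 0.0), "work"),
    ((2.0, 8.0, 0.5), "friends"),
    ((1.0, 6.0, 0.6), "friends"),
]
model = train(training)
# A contact called often, mostly in the evening:
print(classify(model, (8.0, 1.5, 0.75)))  # → family
```

A real system would, as the abstract notes, combine such automatic suggestions with manual review, since roughly a quarter of the automatically assigned groups were judged incorrect by participants.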
- Conference paper
  KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning (Mensch und Computer 2019 - Tagungsband, 2019)
  Schweigert, Robin; Leusmann, Jan; Hagenmayer, Simon; Weiß, Maximilian; Le, Huy Viet; Mayer, Sven; Bulling, Andreas
  While mobile devices have become essential for social communication and have paved the way for work on the go, their interactive capabilities are still limited to simple touch input. A promising enhancement for touch interaction is knuckle input, but recognizing knuckle gestures robustly and accurately remains challenging. We present a method to differentiate between 17 finger and knuckle gestures based on a long short-term memory (LSTM) machine learning model. Furthermore, we introduce an open-source approach that is ready to deploy on commodity touch-based devices. The model was trained on a new dataset that we collected in a mobile interaction study with 18 participants. We show that our method can achieve an accuracy of 86.8% in recognizing one of the 17 gestures and an accuracy of 94.6% in differentiating between finger and knuckle. In our evaluation study, we validated our models and found that the LSTM gesture recognizer achieved an accuracy of 88.6%. We show that KnuckleTouch can be used to improve input expressiveness and to provide shortcuts to frequently used functions.
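To illustrate the LSTM idea mentioned in the abstract above, the sketch below runs a single LSTM cell over a short sequence of touch-sensor frames, folding the sequence into one fixed-size state vector that a classifier head could then map to a gesture class. The weights are random and the frame values invented; this is a minimal pure-Python illustration of the mechanism, not the trained KnuckleTouch model.

```python
# Minimal LSTM forward pass in pure Python: a sequence of touch frames
# is summarized into a fixed-size hidden state. Weights are random; this
# is not the authors' trained model.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    def __init__(self, input_size, hidden_size, seed=0):
        rng = random.Random(seed)
        self.hidden_size = hidden_size
        n = input_size + hidden_size
        # Four gates (input i, forget f, cell candidate g, output o),
        # each with one weight row per hidden unit plus a bias.
        self.W = {g: [[rng.uniform(-0.1, 0.1) for _ in range(n)]
                      for _ in range(hidden_size)] for g in "ifgo"}
        self.b = {g: [0.0] * hidden_size for g in "ifgo"}

    def step(self, x, h, c):
        z = x + h  # concatenate current input and previous hidden state
        def gate(name, act):
            return [act(sum(w * v for w, v in zip(row, z)) + b)
                    for row, b in zip(self.W[name], self.b[name])]
        i = gate("i", sigmoid)
        f = gate("f", sigmoid)
        g = gate("g", math.tanh)
        o = gate("o", sigmoid)
        c = [fj * cj + ij * gj for ij, fj, gj, cj in zip(i, f, g, c)]
        h = [oj * math.tanh(cj) for oj, cj in zip(o, c)]
        return h, c

# A toy "gesture": 5 frames of 4 capacitance-like readings each.
frames = [[0.2, 0.9, 0.1, 0.0], [0.3, 0.8, 0.2, 0.1],
          [0.5, 0.5, 0.4, 0.2], [0.7, 0.3, 0.6, 0.3],
          [0.9, 0.1, 0.8, 0.4]]
cell = LSTMCell(input_size=4, hidden_size=8)
h = [0.0] * 8
c = [0.0] * 8
for frame in frames:
    h, c = cell.step(frame, h, c)
# h now summarizes the whole sequence; a small classifier head would map
# it to one of the 17 gesture classes (or the finger/knuckle decision).
print(len(h))  # → 8
```

In practice such a model would be trained on labeled touch sequences and deployed with a framework like TensorFlow Lite rather than hand-rolled, but the per-timestep gate computation is the same.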