- March 9, 2017
- 17:00
- APB/E023
- English
Abstract
Mobile user interfaces rely heavily on small screens and keyboards. These can be hard to operate on the move, which limits the applications and services we can use. This talk will look at the possibility of moving away from these kinds of interactions to ones better suited to mobile devices and their dynamic contexts of use, where users need to be able to look where they are going, carry shopping bags and hold on to children at the same time as using their phones. Multimodal (gestural, audio and haptic) interactions give us new ways to use our devices that can be eyes- and hands-free, and allow users to interact in a ‘head up’ way. These new interactions will enable new services, applications and devices that fit better into our daily lives and allow us to do a whole host of new things. I will discuss some of the work we are doing on input using gestures made with the fingers, wrist and head, along with work on output using 3D sound and haptic displays, in mobile applications such as text entry and navigation. I will also discuss some of the issues of social acceptability of these new interfaces; we have to be careful that the new ways we want people to use devices are socially appropriate and don’t make us feel embarrassed or awkward.
Vita
Stephen Brewster is a Professor of Human-Computer Interaction in the School of Computing Science at the University of Glasgow, UK. He runs the internationally leading Multimodal Interaction Group. His research focuses on multimodal HCI: the use of multiple sensory modalities and control mechanisms (particularly hearing, touch and gesture) to create a rich, natural interaction between human and computer. His work has a strong experimental focus, applying perceptual research to practical situations. A long-term theme has been haptics, starting with force feedback and, more recently, tactile displays. He has authored over 350 papers and is a member of the CHI Academy.