In interventional radiology, catheters or needles are inserted into the patient's body through small incisions and navigated to the target structure without the need for open surgery. During such a procedure, the physician has no direct view of the applicator and therefore relies on imaging modalities such as X-ray or magnetic resonance imaging. Adjusting live images and accessing planning data or previously acquired images is necessary during the course of an intervention. In clinical practice, touchscreens, buttons or joysticks covered in plastic sheeting are used to interact with computers. Alternatively, an assistant is instructed verbally or by gestures to act as a proxy user. Neither approach is optimal: proxy-user interaction is prone to misunderstandings and depends on the experience of the assistant, while plastic-draped conventional input devices are cumbersome and may be difficult to operate, since medical tasks are likely to keep one or both hands occupied during an intervention. Direct interaction, however, is important to gain a deeper understanding of the data.

Benjamin Hatscher

The presented work investigates alternative input methods that allow physicians to interact with computer systems in sterile environments. Touchless hand gesture input is proposed and compared to proxy-user interaction for MRI-guided needle interventions. For scenarios in which the hands are occupied, hands-free input methods for image manipulation are investigated. Gaze, feet, voice commands and body movements are employed as sole input channels, as complements or as alternatives to one another. Because direct, potentially concurrent image manipulation might influence the primary manual task, the suitability of different hands-free input methods while performing a manual task is examined. Finally, passive input methods derived from natural user behavior and aimed at lowering workload are proposed.