Raphael Menges
Researcher at Analytic Computing
Hi, I am Raphael Menges, researcher at Analytic Computing at the University of Stuttgart, Germany. My research interests cover the Web, computer graphics, computer vision, machine learning, and eye tracking.

Jobs

September 2021 - Today

  1. Project Lead, University of Stuttgart, Analytic Computing
    Semanux Spin-Off Web
    Semanux combines hands-free interaction with artificial intelligence to make the Web inclusive for people with motor impairment.
    Operating Time: September 2021 - February 2023
    Funding: EXIST Transfer of Research

April 2021 - August 2021

  1. Scientific Employee, University of Stuttgart, Analytic Computing
    UDeco Research Project Web
    User-friendliness is fundamental to the success of Web applications, and user studies help identify problems with Web sites. However, the evaluation of such studies by usability experts is time-consuming, highly subjective, and hard to comprehend, so user studies are often skipped and many systems remain hard to use. We are developing UDeco, a platform for a usability data ecosystem that collects and processes usability data and knowledge from user studies in order to automate the evaluation of usability studies by means of machine learning and data mining.
    Partner: EYEVIDO GmbH
    Operating Time: January 2021 - December 2022
    Funding: KMU-innovativ program by the Federal Ministry of Education and Research of Germany

December 2016 - March 2021

  1. Project Lead, University of Koblenz, Institute for Web Science and Technologies
    GazeMining Research Project Web Sheet
    The aim of the research project GazeMining is to capture Web sessions semantically and thus obtain a complete picture of visual content, perception, and interaction. The log streams of usability tests are evaluated using data mining. The analysis and interpretation of the data collected in this way are made possible by a user-friendly presentation as well as semi-automatic and automatic analysis procedures.
    Partner: EYEVIDO GmbH
    Operating Time: January 2018 - August 2020
    Funding: KMU-innovativ program by the Federal Ministry of Education and Research of Germany
  2. Scientific Employee, University of Koblenz, Institute for Web Science and Technologies
    MAMEM Research Project Web Video GitHub
    MAMEM's goal is to integrate people with motor disabilities back into society by increasing their potential for communication and exchange in leisure (e.g., social networks) and non-leisure contexts (e.g., the workplace). To this end, MAMEM delivers the technology to enable interface channels that can be controlled through eye movements and mental commands. This is accomplished by extending the core API of current operating systems with advanced function calls, appropriate for accessing the signals captured by an eye tracker, an EEG recorder, and bio-measurement sensors. Pattern recognition and tracking algorithms are then employed to jointly translate these signals into meaningful control and enable a set of novel paradigms for multimodal interaction. These paradigms allow for low-level (e.g., move a mouse), meso-level (e.g., tick a box), and high-level (e.g., select n-out-of-m items) control of interface applications through eyes and mind. A set of persuasive design principles, together with profiles modeling the users' (dis-)abilities, is also employed for designing adapted interfaces for disabled users. MAMEM engages three different cohorts of disabled users (i.e., people with Parkinson's disease, muscular disorders, and tetraplegia) who are asked to test a set of prototype applications dealing with multimedia authoring and management. MAMEM's final objective is to assess the impact of this technology in making these people more socially integrated by, for instance, becoming more active in sharing content through social networks and communicating with their friends and family.
    Partners: CERTH - Centre for Research & Technology Hellas, EB Neuro S.p.A (EBN), SMI GmbH, Eindhoven University of Technology (TUe), Muscular Dystrophy Association (MDA) Hellas, Auth - School of Medicine, and Sheba Medical Center (SMC)
    Operating Time: May 2015 - July 2018
    Funding: EU Project Horizon 2020 - The EU Framework Programme for Research and Innovation

March 2014 - November 2016

  1. Student Assistant, University of Koblenz, Institute for Web Science and Technologies
    Software development in Java and C++

November 2013 - February 2015

  1. Student Assistant, University of Koblenz, Institute for Web Science and Technologies
    Correction of assignments for the course Algorithms and Data Structures

Academia

February 2021

  1. Dr. rer. nat., University of Koblenz, Institute for Web Science and Technologies
    Grade: magna cum laude (very good)
    Improving Usability and Accessibility of the Web with Eye Tracking PDF
    The Web is an essential component of moving our society to the digital age. We use it for communication, shopping, and doing our work. Most user interaction in the Web happens with Web page interfaces. Thus, the usability and accessibility of Web page interfaces are relevant areas of research to make the Web more useful. Eye tracking is a tool that can be helpful in both areas, performing usability testing and improving accessibility. It can be used to understand users' attention on Web pages and to support usability experts in their decision-making process. Moreover, eye tracking can be used as an input method to control an interface. This is especially useful for people with motor impairment, who cannot use traditional input devices like mouse and keyboard. However, interfaces on Web pages become more and more complex due to dynamics, i.e., changing contents like animated menus and photo carousels. We need general approaches to comprehend dynamics on Web pages, allowing for efficient usability analysis and enjoyable interaction with eye tracking. In the first part of this thesis, we report our work on improving gaze-based analysis of dynamic Web pages. Eye tracking can be used to collect the gaze signals of users, who browse a Web site and its pages. The gaze signals show a usability expert what parts in the Web page interface have been read, glanced at, or skipped. The aggregation of gaze signals allows a usability expert insight into the users' attention on a high-level, before looking into individual behavior. For this, all gaze signals must be aligned to the interface as experienced by the users. However, the user experience is heavily influenced by changing contents, as these may cover a substantial portion of the screen. We delineate unique states in Web page interfaces including changing contents, such that gaze signals from multiple users can be aggregated correctly. 
In the second part of this thesis, we report our work on improving the gaze-based interaction with dynamic Web pages. Eye tracking can be used to retrieve gaze signals while a user operates a computer. The gaze signals may be interpreted as input controlling an interface. Nowadays, eye tracking as an input method is mostly used to emulate mouse and keyboard functionality, hindering an enjoyable user experience. There exist a few Web browser prototypes that directly interpret gaze signals for control, but they do not work on dynamic Web pages. We have developed a method to extract interaction elements like hyperlinks and text inputs efficiently on Web pages, including changing contents. We adapt the interaction with those elements for eye tracking as the input method, such that a user can conveniently browse the Web hands-free. Both parts of this thesis conclude with user-centered evaluations of our methods, assessing the improvements in the user experience for usability experts and people with motor impairment, respectively.
    Examiner and Supervisor: Prof. Dr. Steffen Staab
    Further Examiners: Prof. Dr. Andreas Bulling and Prof. Dr.-Ing. Dietrich Paulus
    Chair of PhD Board: Prof. Dr. Jan Jürjens
    Chair of PhD Commission: Prof. Dr. Harald F.O. von Korflesch

October 2016

  1. Master of Science, University of Koblenz, Computational Visualistics Programme
    Overall Grade: 1.1 (very good)
    Visualization of Molecule Surface Dynamics PDF Slides GitHub
    Grade: 1.0 (very good)
    The surface of a molecule holds important information about its interaction behavior with other molecules. Amino acid residues with different properties change their position within the molecule over time: some rise up to the surface and contribute to potential bindings, while others descend back into the molecular structure. Surface extraction algorithms are discussed, and a highly parallel implementation of the most appropriate one is proposed. Layers of atoms are extracted by an iterative application of the algorithm, which allows residues to be tracked in their movement within the molecule with respect to their distance to the surface or core. Sampling of the surface is utilized to approximate further values of interest, like surface area. Novel visualization methods are presented to support scientists in the inspection of simulated molecule foldings. Atoms are colored according to their movement activity, or an arbitrary group of atoms can be highlighted and analyzed. The proximity of residues to surface or core can be calculated over simulation time and allows conclusions about their contribution.
    Supervisors: Prof. Dr. Stefan Müller and Nils Lichtenberg, M.Sc.

March 2014

  1. Bachelor of Science, University of Koblenz, Computational Visualistics Programme
    Overall Grade: 1.4 (very good)
    Interactive Ray-Casting of Volume Data (German) PDF Slides GitHub Video 1 Video 2
    Grade: 1.0 (very good)
    This thesis covers the mathematical background of ray-casting as well as an exemplary implementation on graphics processing units, using a modern programming interface. The implementation is embedded within an editor, which enables the user to activate optimizations of the algorithm. Techniques like transfer functions and local illumination are available for a more realistic visualization of materials. Moreover, the user interface gives access to features like importing volumes, lets the user define a custom transfer function, offers controls to adjust rendering parameters, and allows further techniques to be activated, which are also discussed in this thesis. The benefit of each technique is measured, whether visual or in terms of performance.
    Supervisors: Prof. Dr. Stefan Müller and Gerrit Lochmann, M.Sc.

March 2011

  1. Abitur, Wilhelm-Hofmann-Gymnasium, St. Goarshausen
    General qualification for university entrance. Main subjects were Physics, Maths, and English.
    Overall Grade: 1.5 (very good)

Teaching

2016 - Today

  1. Lecturer, University of Koblenz, Institute for Web Science and Technologies
    Machine Learning and Data Mining Web
    The course Machine Learning and Data Mining (MLDM) covers the fundamentals of machine learning and data mining. It provides an overview of a variety of MLDM topics and related areas such as clustering and classification.
    Collaborators: Zeyd Boukhers and Tjitze Rienstra
  2. Lecturer, University of Koblenz, Institute for Web Science and Technologies
    Proseminar "Eye Tracking" (German) Web
    Other Supervisor: Matthias Thimm
  3. Tutor, University of Koblenz, Institute for Web Science and Technologies
    Machine Learning and Data Mining Winter 17/18 Winter 18/19 Winter 19/20
    The master programme course Machine Learning and Data Mining covers the fundamentals of machine learning and data mining. It provides an overview of a variety of MLDM topics and related areas such as optimization and deep learning.
    Lecturers: Steffen Staab and Zeyd Boukhers
    Collaborators: Qusai Ramadan, Akram Sadat Hosseini, and Mahdi Bohlouli
  4. Supervisor, University of Koblenz, Institute for Web Science and Technologies
    Research Lab: Eye Tracking in Word Processing Web
    Exploring the future of word processing with eye tracking.
    Other Supervisor: Matthias Thimm
  5. Supervisor, University of Koblenz, Institute for Web Science and Technologies
    Research Lab: Eye Tracking Visualization Platform Web GitHub
    The platform was created during a research lab project that set the goal to find new ways of visualizing eye tracking data.
    Other Supervisor: Chandan Kumar
  6. Supervisor, University of Koblenz, Institute for Web Science and Technologies
    Research Lab: GazeTheWeb - Watch GitHub
    YouTube application controlled with gaze and processing of gaze and EEG sensor data, part of the EU-funded research project MAMEM.
    Other Supervisors: Chandan Kumar and Korok Sengupta
  7. Supervisor, University of Koblenz, Institute for Web Science and Technologies
    Research Lab: GazeTheWeb - Tweet GitHub
    Twitter application controlled with gaze, part of the EU-funded research project MAMEM.
    Other Supervisors: Chandan Kumar and Korok Sengupta
  8. Thesis Supervisor, University of Koblenz, Institute for Web Science and Technologies
    Master Thesis: Intelligent Mapping of Eye-Tracking Gaze-Data on Fixed Web Page Elements PDF GitHub
    Student: Hanadi Tamimi
    Other Supervisors: Steffen Staab and Christoph Schaefer
    Bachelor Thesis: Visualization of Transitions on Web Sites for Usability Studies with Eye Tracking (German) PDF GitHub
    Student: Christian Brozmann
    Other Supervisor: Steffen Staab
    Bachelor Thesis: Optical Text Recognition in the Web (German) PDF GitHub
    Student: Christopher Dreide
    Other Supervisor: Steffen Staab
    Bachelor Thesis: Shot Detection in Screencasts of Web Browsing with Convolutional Neural Networks PDF GitHub
    Student: Daniel Vossen
    Other Supervisor: Steffen Staab
    Bachelor Thesis: Semantical Classification of Icons
    Student: Pierre Krapf
    Other Supervisor: Steffen Staab

2012 - 2015

  1. Tutor, University of Koblenz, Institute of Arts, Digital Media Group
    Introduction to Blender Game Engine (German) Slides 1 Slides 2
    Creation of games using the Blender Game Engine.
  2. Tutor, University of Koblenz, Institute of Arts, Digital Media Group
    Introduction to Unreal Development Kit (German) Slides
    Level building, materials, visual programming, and particles.
  3. Tutor, University of Koblenz, Institute of Arts, Digital Media Group
    Introduction to Game Modeling (German) Slides 1 Slides 2 Slides 3 Slides 4 Slides 5
    Modeling, sculpting, painting, and animation for games.

Software

Today

  1. GazeTheWeb
    C++
    OpenGL
    JavaScript
    Chromium Embedded Framework
    Google Firebase
    Gaze-controlled Web browser, part of the EU-funded research project MAMEM. GazeTheWeb effectively supports all common browsing operations like search, navigation, and bookmarks. GazeTheWeb is based on a Chromium-powered framework, comprising Web extraction to classify interactive elements and the application of gaze interaction paradigms to represent these elements.
    Collaborators: Daniel Müller, Christopher Dreide, Chandan Kumar, and Steffen Staab
    3rd place at Digital Imagination Challenge
  2. WeST Homepage Web Slides
    Jekyll
    JavaScript
    Docker
    Python
    Homepage of the Institute for Web Science and Technologies at the University of Koblenz. The entire content is organized in a Git repository as markdown and HTML code. A push to the master branch triggers a continuous integration pipeline, which executes the Jekyll Page Generator within a Docker container and deploys the generated HTML pages to the Web server.
    Collaborators: Philipp Töws, Adrian Skubella, Danienne Wete, and Daniel Janke

2020

  1. Visual Stimuli Discovery GitHub
    C++
    Python
    JavaScript
    OpenCV
    sklearn
    Tesseract
    Shogun ML
    Qt
    The framework of visual stimuli discovery contains the tools and scripts required to process video and interaction recordings into stimulus shots and visual stimuli.
    Collaborator: Christoph Schaefer

2018

  1. eyeGUI
    C++
    OpenGL
    PortAudio
    FreeType 2
    User interface library for eye-tracking input, using C++11 and OpenGL 3.3. eyeGUI supports the development of interactive eye-controlled applications and covers many significant aspects, like rendering, layout, dynamic modification of content, support of graphics, and animation.

2016

  1. Voxel Cone Tracing GitHub
    C++
    OpenGL
    CUDA
    Final project for the course ‘Realtime Rendering’. A polygonal scene is voxelized in real-time through geometry and pixel shading. The voxel grid is transferred to CUDA, where Voxel Cone Tracing is implemented to compute ambient occlusion and global illumination. My part of the project was the efficient voxelization of the scene.
    Collaborators: Fabian Meyer, Milan Dilberovic, and Nils Höhner
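    The project performs voxelization on the GPU via geometry and pixel shading; the idea can be sketched on the CPU as dense point sampling of each triangle into a boolean grid. Everything below (grid resolution, sampling density, the helper name) is an illustrative assumption, not the project's code:

    ```python
    def voxelize_triangle(grid, v0, v1, v2, samples=32):
        """Mark voxels touched by a triangle via dense barycentric point
        sampling -- a CPU stand-in for the GPU rasterization pass."""
        n = len(grid)
        for i in range(samples + 1):
            for j in range(samples + 1 - i):
                a, b = i / samples, j / samples
                # barycentric point on the triangle, then mapped to voxel indices
                p = [v0[k] + a * (v1[k] - v0[k]) + b * (v2[k] - v0[k]) for k in range(3)]
                x, y, z = (min(max(int(c * n), 0), n - 1) for c in p)
                grid[x][y][z] = True

    # 16^3 boolean grid, one triangle lying in the z = 0.5 plane of the unit cube
    grid = [[[False] * 16 for _ in range(16)] for _ in range(16)]
    voxelize_triangle(grid, (0.1, 0.1, 0.5), (0.9, 0.1, 0.5), (0.5, 0.9, 0.5))
    ```

    The GPU version instead lets the rasterizer generate the sample points, with a geometry shader projecting each triangle along its dominant axis.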

2015

  1. Beer Heater GitHub
    C++
    OpenGL
    Compute Shaders
    Project about simulating air flow and heat distribution for the course ‘Animation and Simulation’ at the University of Koblenz in the summer term 2015. The simulation is executed as highly parallel compute shader passes.
    Collaborator: Nils Höhner
  2. Schau genau! GitHub Video Web
    Java
    JMonkey
    Blender
    Schau genau! was designed for the State Horticultural Show Landau 2015 as an arcade box game, using only gaze and one buzzer as input. Nearly 3000 sessions were played during the summer without any downtime. I used Java with the jMonkey engine as the programming framework and Blender to create the assets.
    Collaborator: Kevin Schmidt
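    The heat-distribution part of the Beer Heater simulation above amounts to a diffusion update per time step, where each compute shader invocation writes one grid cell. A minimal CPU sketch of that update, with illustrative grid size, diffusion coefficient, and boundary handling:

    ```python
    def diffuse(heat, alpha=0.1):
        """One explicit finite-difference diffusion step on a square 2D grid
        with insulating borders; on the GPU, each compute shader invocation
        would evaluate the inner expression for exactly one cell."""
        n = len(heat)
        def at(i, j):  # clamp to the border -> no heat leaves the grid
            return heat[min(max(i, 0), n - 1)][min(max(j, 0), n - 1)]
        return [[heat[i][j] + alpha * (at(i - 1, j) + at(i + 1, j) +
                                       at(i, j - 1) + at(i, j + 1) - 4.0 * heat[i][j])
                 for j in range(n)] for i in range(n)]

    heat = [[0.0] * 32 for _ in range(32)]
    heat[16][16] = 100.0  # a single hot spot
    for _ in range(50):
        heat = diffuse(heat)
    ```

    With the clamped (insulating) border, the total heat stays constant while the hot spot flattens out; the explicit scheme is stable as long as alpha stays at or below 0.25.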

2014

  1. Voraca GitHub
    C++
    OpenGL
    A versatile tool to visualize volumes with GPU ray-casting, written as part of my bachelor thesis. It allows loading arbitrary volume data sets with density information. A transfer function can be adjusted to map density values to color, opacity, and shading attributes like specular reflection. The ray-casting has been written as a pixel shader and improved through stochastic jittering, early ray termination, and empty space skipping.
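    At its core, such ray-casting performs front-to-back alpha compositing of transfer-function-mapped density samples along each ray, stopping early once the accumulated opacity saturates. A minimal single-ray CPU sketch, with an illustrative transfer function and step size (the tool itself runs this per pixel in a shader):

    ```python
    import math

    def cast_ray(volume, origin, direction, step=0.01, opacity_threshold=0.99):
        """Front-to-back compositing of density samples along one ray through
        the unit cube, with early ray termination once opacity saturates."""
        n = len(volume)
        color = alpha = 0.0
        t = 0.0
        while t < math.sqrt(3):  # longest possible ray through the unit cube
            p = [o + t * d for o, d in zip(origin, direction)]
            if all(0.0 <= c < 1.0 for c in p):
                x, y, z = (int(c * n) for c in p)
                density = volume[x][y][z]
                # illustrative transfer function: density maps to gray value and opacity
                sample_alpha = min(density * step * 10.0, 1.0)
                color += (1.0 - alpha) * sample_alpha * density
                alpha += (1.0 - alpha) * sample_alpha
                if alpha >= opacity_threshold:
                    break  # early ray termination
            t += step
        return color, alpha

    # homogeneous dense volume: the ray saturates and terminates early
    volume = [[[1.0] * 8 for _ in range(8)] for _ in range(8)]
    color, alpha = cast_ray(volume, (0.5, 0.5, 0.0), (0.0, 0.0, 1.0))
    ```

    Stochastic jittering would offset `t` by a small random amount per ray to hide stepping artifacts, and empty space skipping would advance `t` in larger strides through regions known to contain zero density.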

Arts

2011 - 2016

  1. Artist, University of Koblenz, Institute of Arts, Digital Media Group
    Voxelmania. Effects of Videogames on Service Robots Video Web
    We are proud to present our short film "Voxelmania. Effects of Videogames on Service Robots". The main actor is LISA, our star and an award-winning service robot in various international contests (supported by team homer at the University of Koblenz-Landau, Campus Koblenz). In a crossover of dream and reality, LISA explores the world of video games.
    Course: Open space course of summer term 2016 by Markus Lohoff
    Collaborators: Julien Rodewald, Markus Lohoff, and 15 further students
  2. Artist, University of Koblenz, Institute of Computational Visualistics, Computer Graphics Group
    Ship happens! - A 3ds Max Movie Video
    A small lighthouse is short of sleep...
    Course: 3ds Max course of winter term 2013/2014 by Sebastian Pohl
    Collaborators: Adrian Derstroff, Raphael Heidrich, Dominik Cremer and Saner Demirel
  3. Artist, University of Koblenz, Institute of Arts, Digital Media Group
    Steoreo - A Blender Movie Video Web
    What benefits for visual perception does having two eyes bring? Which sensations can be artificially generated? To find out, two robots have to work through an experimental series that highlights some aspects of the problem.
    Course: Aspects of image design of summer term 2011 by Markus Lohoff
    Collaborators: Arend Buchacher and Michael Taenzer

2008 - 2014

  1. Artist, Personal Activity
    JustHotAir - A puzzle-action-game for Windows Web Microsoft Store Video 1 Video 2 Video 3
    Little creatures need your help! Save them by kicking them into the lit hole, making use of the environment to need fewer kicks. Each kick increases the creature's inner pressure: one kick over the limit and it explodes like a balloon filled with too much hot air.
    Collaborators: Andre Taulien and Michael Taenzer
    10,000 downloads in the Windows Phone Store
  2. Artist, Personal Activity
    Pulsedrive - A music-driven racing game Web Video 1 Video 2
    Collaborator: Andre Taulien
  3. Artist, Personal Activity
    Beyond Jupiter - A role-playing hack and slash game Web Game Design
    Collaborator: Andre Taulien
  4. Artist, Personal Activity
    Demos for the [w]tech engine by Andre Taulien Web Video 1 Video 2 Video 3 Video 4 Video 5
    Collaborator: Andre Taulien
  5. Artist, Personal Activity
    Tre - Last Life Web Video 1 Video 2
    Total conversion of Unreal Tournament 3
    Collaborators: Andre Taulien and Mr.Tom
    $1 Million Intel "Make Something Unreal" Contest Phase 4 Finalist
  6. Artist, Personal Activity
    Tre - The Spreading Web Video
    Total conversion of Unreal Tournament 3
    Collaborator: Andre Taulien

Publications

I have published my research at international conferences, like ACM ETRA and ACM WWW, and in journals, like ACM Transactions on Computer-Human Interaction. I have also reviewed submissions for ACM ETRA, ACM UIST, and ACM CHI.

2021

  1. Hedeshy, R., Kumar, C., Menges, R., and Staab, S. 2021. Hummer: Text Entry by Gaze and Hum. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery.
    PDF DOI ISBN Video BibTeX

2020

  1. Hedeshy, R., Kumar, C., Menges, R., and Staab, S. 2020. GIUPlayer: A Gaze Immersive YouTube Player Enabling Eye Control and Attention Analysis. ACM Symposium on Eye Tracking Research and Applications, Association for Computing Machinery.
    PDF DOI ISBN BibTeX
  2. Kumar, C., Menges, R., Sengupta, K., and Staab, S. 2020. Eye tracking for Interaction: Evaluation Methods. In: S. Nikolopoulos, C. Kumar and I. Kompatsiaris, eds., Signal Processing to Drive Human-Computer Interaction: EEG and eye-controlled interfaces. Institution of Engineering and Technology, 117–144.
    DOI ISBN BibTeX
  3. Menges, R., Kumar, C., and Staab, S. 2020. Eye tracking for Interaction: Adapting Multimedia Interfaces. In: S. Nikolopoulos, C. Kumar and I. Kompatsiaris, eds., Signal Processing to Drive Human-Computer Interaction: EEG and eye-controlled interfaces. Institution of Engineering and Technology, 83–116.
    DOI ISBN BibTeX
  4. Menges, R., Kramer, S., Hill, S., Nisslmueller, M., Kumar, C., and Staab, S. 2020. A Visualization Tool for Eye Tracking Data Analysis in the Web. ACM Symposium on Eye Tracking Research and Applications, Association for Computing Machinery.
    PDF DOI ISBN BibTeX

2019

  1. Menges, R., Kumar, C., and Staab, S. 2019. Improving User Experience of Eye Tracking-Based Interaction: Introspecting and Adapting Interfaces. ACM Trans. Comput.-Hum. Interact. 26, 6, 37:1–37:46.
    Eye tracking systems have greatly improved in recent years, being a viable and affordable option as digital communication channel, especially for people lacking fine motor skills. Using eye tracking as an input method is challenging due to accuracy and ambiguity issues, and therefore research in eye gaze interaction is mainly focused on better pointing and typing methods. However, these methods eventually need to be assimilated to enable users to control application interfaces. A common approach to employ eye tracking for controlling application interfaces is to emulate mouse and keyboard functionality. We argue that the emulation approach incurs unnecessary interaction and visual overhead for users, aggravating the entire experience of gaze-based computer access. We discuss how the knowledge about the interface semantics can help reducing the interaction and visual overhead to improve the user experience. Thus, we propose the efficient introspection of interfaces to retrieve the interface semantics and adapt the interaction with eye gaze. We have developed a Web browser, GazeTheWeb, that introspects Web page interfaces and adapts both the browser interface and the interaction elements on Web pages for gaze input. In a summative lab study with 20 participants, GazeTheWeb allowed the participants to accomplish information search and browsing tasks significantly faster than an emulation approach. Additional feasibility tests of GazeTheWeb in lab and home environment showcase its effectiveness in accomplishing daily Web browsing activities and adapting large variety of modern Web pages to suffice the interaction for people with motor impairment.
    PDF DOI BibTeX
  2. Kumar, C., Akbari, D., Menges, R., MacKenzie, S., and Staab, S. 2019. TouchGazePath: Multimodal Interaction with Touch and Gaze Path for Secure Yet Efficient PIN Entry. 2019 International Conference on Multimodal Interaction, ACM, 329–338.
    PDF DOI ISBN Video BibTeX
  3. Sengupta, K., Menges, R., Kumar, C., and Staab, S. 2019. Impact of Variable Positioning of Text Prediction in Gaze-based Text Entry. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, ACM, 74:1–74:9.
    PDF DOI ISBN BibTeX

2018

  1. Lichtenberg, N., Menges, R., Ageev, V., et al. 2018. Analyzing Residue Surface Proximity to Interpret Molecular Dynamics. Computer Graphics Forum.
    DOI BibTeX
  2. Menges, R., Tamimi, H., Kumar, C., Walber, T., Schaefer, C., and Staab, S. 2018. Enhanced Representation of Web Pages for Usability Analysis with Eye Tracking. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, ACM, 18:1–18:9.
    Eye tracking as a tool to quantify user attention plays a major role in research and application design. For Web page usability, it has become a prominent measure to assess which sections of a Web page are read, glanced or skipped. Such assessments primarily depend on the mapping of gaze data to a Web page representation. However, current representation methods, a virtual screenshot of the Web page or a video recording of the complete interaction session, suffer either from accuracy or scalability issues. We present a method that identifies fixed elements on Web pages and combines user viewport screenshots in relation to fixed elements for an enhanced representation of the page. We conducted an experiment with 10 participants and the results signify that analysis with our method is more efficient than a video recording, which is an essential criterion for large scale Web studies.
    Best Video Award (for accompanying video)
    PDF DOI ISBN Video Slides BibTeX
  3. Sengupta, K., Ke, M., Menges, R., Kumar, C., and Staab, S. 2018. Hands-free Web Browsing: Enriching the User Experience with Gaze and Voice Modality. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, ACM, 88:1–88:3.
    PDF DOI ISBN Video BibTeX

2017

  1. Kumar, C., Menges, R., and Staab, S. 2017. Assessing the Usability of Gaze-Adapted Interface against Conventional Eye-Based Input Emulation. 2017 IEEE 30th International Symposium on Computer-Based Medical Systems (CBMS), IEEE, 793–798.
    PDF DOI Video Slides BibTeX
  2. Sengupta, K., Sun, J., Menges, R., Kumar, C., and Staab, S. 2017. Analyzing the Impact of Cognitive Load in Evaluating Gaze-Based Typing. 2017 IEEE 30th International Symposium on Computer-Based Medical Systems (CBMS), IEEE, 787–792.
    Best Student Paper Award
    PDF DOI BibTeX
  3. Kumar, C., Menges, R., Müller, D., and Staab, S. 2017. Chromium Based Framework to Include Gaze Interaction in Web Browser. Proceedings of the 26th International Conference on World Wide Web Companion, International World Wide Web Conferences Steering Committee, 219–223.
    Honourable Mention
    PDF DOI ISBN BibTeX
  4. Menges, R., Kumar, C., Müller, D., and Sengupta, K. 2017. GazeTheWeb: A Gaze-Controlled Web Browser. Proceedings of the 14th Web for All Conference on The Future of Accessible Work, ACM, 25:1–25:2.
    Judges Award at TPG Accessibility Challenge
    PDF DOI ISBN Video BibTeX
  5. Menges, R., Kumar, C., Wechselberger, U., Schaefer, C., Walber, T., and Staab, S. 2017. Schau genau! A Gaze-Controlled 3D Game for Entertainment and Education. Journal of Eye Movement Research, 220.
    PDF Video BibTeX
  6. Sengupta, K., Menges, R., Kumar, C., and Staab, S. 2017. GazeTheKey: Interactive Keys to Integrate Word Predictions for Gaze-based Text Entry. Proceedings of the 22Nd International Conference on Intelligent User Interfaces Companion, ACM, 121–124.
    PDF DOI ISBN Video BibTeX

2016

  1. Kumar, C., Menges, R., and Staab, S. 2016. Eye-Controlled Interfaces for Multimedia Interaction. IEEE MultiMedia 23, 4, 6–13.
    DOI BibTeX
  2. Menges, R., Kumar, C., Sengupta, K., and Staab, S. 2016. eyeGUI: A Novel Framework for Eye-Controlled User Interfaces. Proceedings of the 9th Nordic Conference on Human-Computer Interaction, ACM, 121:1–121:6.
    PDF DOI ISBN Video BibTeX

2014

  1. Schaefer, C., Kuich, M., Menges, R., Schmidt, K., and Walber, T. 2014. Schau genau! - an Eye Tracking Game With a Purpose. 1st Workshop on the Applications for Gaze in Games at CHI Play 2014.
    PDF Video BibTeX