Intelligent Gaze Controlled Interface for Limited Mobility Environment
Abstract
Eye movement-based Human Computer Interaction (HCI) has become a significant area of research in computer science. Gaze-based computing devices allow a computer to be controlled using only eye movements, which can be critical when a person is unable to use a mouse or keyboard. Advances in eye tracking have focused on interactive communication and control tools based on gaze tracking, including eye typing, computer control, and gaming. Users with Severe Speech and Motor Impairment (SSMI) often communicate through a communication chart using eye gaze or limited hand movements, and caregivers interpret their communication intent. My research focuses on developing gaze-based accessible interfaces for users with limited mobility. Limited mobility impairment refers to the inability of a person to use one or more extremities, or a lack of strength to walk or lift objects. It can be of two types: situational impairment and physical impairment. Situational impairment is the inability to access computers as a result of the user's situation; examples include noise, insufficient lighting, distractions, and concurrent tasks that occupy the eyes or hands. Physical impairment, on the other hand, is a disability that limits a person's physical capacity to move, coordinate actions, or perform physical activities. People with SSMI do not have control over their body movements and require assistance in almost all activities, such as eating, playing, and communication. Yet eye movements may be the only movements under their voluntary control. These users depend on electronic interfaces for communication, learning, and many other daily activities. However, these interfaces are often designed under the assumption that people with SSMI and their able-bodied counterparts share the same preferences and ease of use for different screen regions. Thus, creating accessible electronic interfaces for these users is one of the most pressing problems today. My research investigates HCI issues in limited mobility environments in order to develop intelligent eye gaze-controlled interfaces. In particular, I aim to design and develop intelligent gaze-controlled accessible systems for people with SSMI, and have further explored the use of gaze-based interfaces for situationally impaired users.