Story
GazeTheWeb has been evaluated as part of the MAMEM project with three clinical cohorts in Athens, Thessaloniki, and Tel Aviv, across two trial phases. In the first trial phase, in February 2017, 18 participants with motor impairment successfully performed dictated tasks on the World Wide Web.
The second phase took place in spring 2018, where 30 participants with motor impairment operated GazeTheWeb at their homes on their own behalf for one month. The system allowed the participants to browse the World Wide Web, communicate, access entertainment, and retrieve information.
Impact
If you use our software as part of your own research, please be kind and cite our publication:
@article{tochiGazeTheWeb,
  author = {Menges, Raphael and Kumar, Chandan and Staab, Steffen},
  title = {Improving User Experience of Eye Tracking-Based Interaction: Introspecting and Adapting Interfaces},
  journal = {ACM Trans. Comput.-Hum. Interact.},
  issue_date = {October 2019},
  volume = {26},
  number = {6},
  month = nov,
  year = {2019},
  issn = {1073-0516},
  pages = {37:1--37:46},
  articleno = {37},
  numpages = {46},
  url = {http://doi.acm.org/10.1145/3338844},
  doi = {10.1145/3338844},
  acmid = {3338844},
  publisher = {ACM},
  address = {New York, NY, USA},
  keywords = {Eye tracking, GazeTheWeb, Web accessibility, gaze interaction experience, gaze-based emulation, gaze-controlled interface, interface semantics, introspection},
}