ETRA 2024: Glasgow, UK
- Mohamed Khamis, Yusuke Sugano, Ludwig Sidenmark: Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, ETRA 2024, Glasgow, United Kingdom, June 4-7, 2024. ACM 2024
ETRA 2024 Short Papers
- Süleyman Özdel, Yao Rong, Berat Mert Albaba, Yen-Ling Kuo, Xi Wang, Enkelejda Kasneci: A Transformer-Based Model for the Prediction of Human Gaze Behavior on Videos. 1:1-1:6
- Ruhi Bhanap, Klaus Oberauer, Agnes Rosner: Are eye movements and covert shifts of attention functional for memory retrieval? 2:1-2:7
- Nina A. Gehrer, Jennifer Svaldi, Andrew T. Duchowski: Assessment of body-related attention processes via mobile eye tracking: A pilot study to validate an automated analysis pipeline. 3:1-3:7
- Gonzalo Garde, José María Armendariz, Ruben Beruete Cerezo, Rafael Cabeza, Arantxa Villanueva: Beyond Basic Tuning: Exploring Discrepancies in User and Setup Calibration for Gaze Estimation. 4:1-4:8
- Björn Rene Severitt, Patrizia Lenhart, Benedikt Werner Hosp, Nora Jane Castner, Siegfried Wahl: Communication breakdown: Gaze-based prediction of system error for AI-assisted robotic arm simulated in VR. 5:1-5:7
- Soumil Chugh, Juntao Ye, Yuqi Fu, Moshe Eizenman: CSA-CNN: A Contrastive Self-Attention Neural Network for Pupil Segmentation in Eye Gaze Tracking. 6:1-6:7
- Samantha Aziz, Dillon J. Lohr, Lee Friedman, Oleg Komogortsev: Evaluation of Eye Tracking Signal Quality for Virtual Reality Applications: A Case Study in the Meta Quest Pro. 7:1-7:8
- Simone Mentasti, Francesco Lattari, Riccardo Santambrogio, Gianmario Careddu, Matteo Matteucci: Event-based eye tracking for smart eyewear. 8:1-8:7
- Naila Ayala, Claudia Alexandra Martin Calderon, Nikhil Sharma, Elizabeth L. Irving, Shi Cao, Suzanne K. Kearns, Ewa Niechwiej-Szwedo: Examining the Utility of Blink Rate as a Proxy for Cognitive Load in Flight Simulation. 9:1-9:6
- David Dembinsky, Ko Watanabe, Andreas Dengel, Shoya Ishimaru: Eye Movement in a Controlled Dialogue Setting. 10:1-10:7
- Florian Weidner, Jakob Hartbrich, Stephanie Arévalo Arboleda, Christian Kunert, Christian Schneiderwind, Chenyao Diao, Christoph Gerhardt, Tatiana Surdu, Wolfgang Broll, Stephan Werner, Alexander Raake: Eyes on the Narrative: Exploring the Impact of Visual Realism and Audio Presentation on Gaze Behavior in AR Storytelling. 11:1-11:7
- Süleyman Özdel, Yao Rong, Berat Mert Albaba, Yen-Ling Kuo, Xi Wang, Enkelejda Kasneci: Gaze-Guided Graph Neural Network for Action Anticipation Conditioned on Intention. 12:1-12:9
- Maurice Koch, Nelusa Pathmanathan, Daniel Weiskopf, Kuno Kurzhals: How Deep Is Your Gaze? Leveraging Distance in Image-Based Gaze Analysis. 13:1-13:7
- Jaeyoon Lee, Hanseob Kim, Gerard Jounghyun Kim: Keep Your Eyes on the Target: Enhancing Immersion and Usability by Designing Natural Object Throwing with Gaze-based Targeting. 14:1-14:7
- Ziang Wu, Xianta Jiang, Jingjing Zheng, Bin Zheng, M. Stella Atkins: Measuring Motor Task Difficulty using Low/High Index of Pupillary Activity. 15:1-15:6
- Yavuz Inal, Frode Strand Volden, Camilla Carlsen, Sarah Hjelmtveit: My Eyes Don't Consent! Exploring Visual Attention in Cookie Consent Interfaces. 16:1-16:7
- Roksana Sadeghi, Ryan Ressmeyer, Jacob Yates, Jorge Otero-Millan: Open Iris - An Open Source Framework for Video-Based Eye-Tracking Research and Development. 17:1-17:7
- Cristina Rovira-Gay, Clara Mestre, Marc Argilés, Jaume Pujol: Saccade Characteristics during Fusional Vergence Tests as a Function of Vergence Demand. 18:1-18:6
- Yao Wang, Qi Dai, Mihai Bâce, Karsten Klein, Andreas Bulling: Saliency3D: A 3D Saliency Dataset Collected on Screen. 19:1-19:6
- Mehedi Hasan Raju, Lee Friedman, Dillon J. Lohr, Oleg Komogortsev: Signal vs Noise in Eye-tracking Data: Biometric Implications and Identity Information Across Frequencies. 20:1-20:7
- Anish S. Narkar, Brendan David-John: Swap It Like Its Hot: Segmentation-based spoof attacks on eye-tracking images. 21:1-21:7
- Sonali Aatrai, Sandhya Gayatri Prabhala, Saurabh Sharma, Rajlakshmi Guha: The Eyes Have It: Exploring the Connection Between Domain-Knowledge and Perception. 22:1-22:5
- Philipp Stark, Alexander J. Jung, Jens-Uwe Hahn, Enkelejda Kasneci, Richard Göllner: Using Gaze Transition Entropy to Detect Classroom Discourse in a Virtual Reality Classroom. 23:1-23:11
ETRA 2024 Doctoral Symposium
- Michaela Vojtechovska: Beyond Evaluation: Utilizing Gaze-Based Interactions for Digital Cartography. 24:1-24:3
- Izaskun Cia: Evaluating Subject Behavior During Ingestion: A Portable Eye-Tracking Approach. 25:1-25:3
- Iga Szwoch: Joint Attention on the Future: Pro-Ecological Attitudes Change In Collaboration. 26:1-26:3
- Joan Goset: Methods based on eye tracking to assess cognitive impairment. 27:1-27:3
- Jarod P. Hartley: Task Classification using Eye Movements and Graph Neural Networks. 28:1-28:3
- Anna Warchol-Jakubowska: Using eye tracking to enhance the efficiency and safety of tram drivers - designing visual attention training. 29:1-29:3
ETRA 2024 Late-Breaking Work
- Rasha Sameer Moustafa, Siyuan Chen, Minoru Nakayama, Frederick Shic, Matt J. Dunn: A Brief Introduction to ISCET - The International Society for Clinical Eye Tracking. 30:1-30:3
- Nicole Dalia Cilia, Andrea Pietro Arena, Antonino Barbera, Emanuele Borrello, Giuseppe Capizzi, Salvatore Cappello, Luigi Pio Faletra, Stefano Incardone, Salvatore Sorce: A pilot study on gaze and mouse data for user identification. 31:1-31:3
- Wolfgang Fuhl, Dennis Grüneberg, Abdullah Yalvac: An Evaluation of Gaze-Based Person Identification with Different Stimuli. 32:1-32:3
- Wolfgang Fuhl: An Eyelid Simulator for Zero Shot Eyelid Segmentation. 33:1-33:3
- Mohamed Amine Kerkouri, Marouane Tliba, Aladine Chetouani, Alessandro Bruno: AVAtt: Art Visual Attention dataset for diverse painting styles. 34:1-34:3
- Minoru Nakayama, Qian (Chayn) Sun, Jianhong Cecilia Xia: Car driving temporal cognitive workload estimation using features of eye tracking. 35:1-35:3
- Gustavo Juantorena, Waleska Berrios, Maria Cecilia Fernández, Agustín Ibáñez, Agustín Petroni, Juan E. Kamienkowski: Enhancing Cognitive Assessment: Integrating Hand and Eye Tracking in the Digital Trail-Making Test for Mild Cognitive Impairment. 36:1-36:3
- Zekun Wu, Anna Maria Feit: Enhancing User Gaze Prediction in Monitoring Tasks: The Role of Visual Highlights. 37:1-37:3
- Hong Gao, Enkelejda Kasneci: Exploring Eye Tracking as a Measure for Cognitive Load Detection in VR Locomotion. 38:1-38:3
- Joan Goset, Flors Vinuela-Navarro, Clara Mestre, Mikel Aldaba, Meritxell Vilaseca: Eye movements during the performance of the neuropsychological Trail Making Test. 39:1-39:2
- Yannick Sauer, Rajat Agarwala, Patrizia Lenhart, Regine Lendway, Björn Severitt, Alexander Neugebauer, Benedikt Werner Hosp, Nora Jane Castner, Siegfried Wahl: Eye tracking data set of academics making an omelette: An egg-breaking work. 40:1-40:3
- Kendra K. Noneman, J. Patrick Mayo: Gaze Decoding with Sensory and Motor Cortical Activity. 41:1-41:3
- Wolfgang Fuhl, Gazmend Hyseni: Gaze-based Assessment of Expertise in Chess. 42:1-42:3
- Ryo Yasuda, Minoru Nakayama: Impact of reward expectation on pupillary change during an adaptive two player card game. 43:1-43:2
- Junichi Nagasawa, Yuichi Nakata, Mamoru Hiroe, Yujia Zheng, Yutaka Kawaguchi, Yuji Maegawa, Naoki Hojo, Tetsuya Takiguchi, Minoru Nakayama, Maki Uchimura, Yuma Sonoda, Hisatomo Kowa, Takashi Nagamatsu: Attempts on detecting Alzheimer's disease by fine-tuning pre-trained model with Gaze Data. 44:1-44:3
- Marouane Tliba, Mohamed Amine Kerkouri, Aladine Chetouani, Alessandro Bruno, Mohammed El Hassouni, Arzu Çöltekin: Perceptual Evaluation of Masked AutoEncoder Emergent Properties Through Eye-Tracking-Based Policy. 45:1-45:3
- Dominik Szczepaniak, Monika Harvey, Fani Deligianni: Predictive Modelling of Cognitive Workload in VR: An Eye-Tracking Approach. 46:1-46:3
- Deborah N. Jakobi, Daniel G. Krakowczyk, Lena A. Jäger: Reporting Eye-Tracking Data Quality: Towards a New Standard. 47:1-47:3
- Enkeleda Thaqi, Mohamed Omar Mantawy, Enkelejda Kasneci: SARA: Smart AI Reading Assistant for Reading Comprehension. 48:1-48:3
- Yui Atarashi, Kaisei Yokoyama, Buntarou Shizuki: Investigation of Unconscious Gaze and Head Direction in Videoconference - Turning off Microphone/Camera by Unconscious Gaze and Head Direction -. 49:1-49:3
- Ruijie Wang, Reece Bush-Evans, Emily Arden-Close, John McAlaney, Sarah Hodge, Elvira Bolat, Sarah Thomas, Keith Phalp: Usability of Responsible Gambling Information on Gambling Operators' Websites: A Webcam-Based Eye Tracking Study. 50:1-50:3
- Yannick Sauer, Björn Severitt, Rajat Agarwala, Siegfried Wahl: Using mobile eye tracking for gaze- and head-contingent vision simulations. 51:1-51:3
- Anatolii Evdokimov, Catherine Finegan-Dollak, Arryn Robbins: WEyeDS: A desktop webcam dataset for gaze estimation. 52:1-52:2
- Wolfgang Fuhl: Zero Shot Learning in Pupil Detection. 53:1-53:3
COGAIN 2024
- Youn-Soo Park, Roberto Manduchi: A Functional Usability Analysis of Appearance-Based Gaze Tracking for Accessibility. 54:1-54:7
- Pawel Kasprowski, Grzegorz Zurek, Roman Olejniczak: A novel diagnostic tool utilizing eye tracking technology to allow objective assessment of patients' cognitive functions. 55:1-55:3
- Junichi Nagasawa, Yuichi Nakata, Mamoru Hiroe, Yujia Zheng, Yutaka Kawaguchi, Yuji Maegawa, Naoki Hojo, Tetsuya Takiguchi, Minoru Nakayama, Maki Uchimura, Yuma Sonoda, Hisatomo Kowa, Takashi Nagamatsu: Attempts on detecting Alzheimer's disease by fine-tuning pre-trained model with Gaze Data. 56:1-56:3
- Nuno Estalagem, Augusto Esteves: Between Wearable and Spatial Computing: Exploring Four Interaction Techniques at the Intersection of Smartwatches and Head-mounted Displays. 57:1-57:7
- Mayu Akata, Yoshiki Nishikawa, Toshiya Isomoto, Buntarou Shizuki: Estimating 'Happy' Based on Eye-Behavior Collected from HMD. 58:1-58:2
- Noha Mokhtar, Augusto Esteves: Hand Me This: Exploring the Effects of Gaze-driven Animations and Hand Representations in Users' Sense of Presence and Embodiment. 59:1-59:7
- Rasha Sameer Moustafa, Siyuan Chen, Minoru Nakayama, Frederick Shic, Matt J. Dunn: Introducing ISCET - The International Society for Clinical Eye Tracking. 60:1-60:4
- Are Dæhlen, Ilona Heldal, Jozsef Katona: Linking Data from Eye-Tracking and Serious Games to NDD Characteristics: A Bibliometric Study. 61:1-61:8
- SaiKiran Kumar Tedla, I. Scott MacKenzie, Michael Brown: LookToFocus: Image Focus via Eye Tracking. 62:1-62:7
- Ajoy Savio Fernandes, T. Scott Murdison, Immo Schuetz, Oleg Komogortsev, Michael J. Proulx: The Effect of Degraded Eye Tracking Accuracy on Interactions in VR. 63:1-63:7
- Alejandro Garcia De La Santa Ramos, Javier Muguerza Rivero, David Lopez, Unai Elordi, Luis Unzueta, Arantxa Villanueva: Unsupervised data labeling and incremental cross-domain training for enhanced hybrid eye gaze estimation. 64:1-64:7
EduEYE 2024 Session 1
- Wunmin Jang, Hong Gao, Tilman Michaeli, Enkelejda Kasneci: Exploring Communication Dynamics: Eye-tracking Analysis in Pair Programming of Computer Science Education. 65:1-65:7
- Jürgen Horst Mottok, Florian Hauser, Lisa Grabinger, Timur Ezer, Fabian Engl: An Educational Perspective on Eye Tracking in Engineering Sciences. 66:1-66:7
- Tobias Appel, Luzia Leifheit: Fluid Intelligence and Mental Effort during Block Programming: What the Eyes Can Tell Us. 67:1-67:6
EduEYE 2024 Session 2
- Prasanth Chandran, Yifeng Huang, Jeremy Munsell, Brian Howatt, Brayden Wallace, Lindsey Wilson, Sidney D'Mello, Minh Hoai, N. Sanjay Rebello, Lester C. Loschky: Characterizing Learners' Complex Attentional States During Online Multimedia Learning Using Eye-tracking, Egocentric Camera, Webcam, and Retrospective recalls. 68:1-68:7
- Are Dæhlen, Ilona Heldal, Abdul Rehman, Qasim Ali, Jozsef Katona, Attila Kovari, Teodor Stefanut, Paula Da Costa Ferreira, Cristina A. Costescu: Towards More Accurate Help: Informing Teachers how to Support NDD Children by Serious Games and Eye Tracking Technologies. 69:1-69:7
- Dominik Bittner, Vamsi Krishna Nadimpalli, Lisa Grabinger, Timur Ezer, Florian Hauser, Jürgen Horst Mottok: Uncovering Learning Styles through Eye Tracking and Artificial Intelligence. 70:1-70:7
- Muhammad Arief Nugroho, Maman Abdurohman, Bayu Erfianto, Mahmud Dwi Sulistiyo: Meticulous Acquisition System for Tracking User's Natural Kinetics (MAS TUNK): An Approach in Eye Tracking Dataset Collection for Neural Network Training. 71:1-71:8
EMIP 2024
- Florian Hauser, Lisa Grabinger, Timur Ezer, Jürgen Horst Mottok, Hans Gruber: Analyzing and Interpreting Eye Movements in C++: Using Holistic Models of Image Perception. 72:1-72:7
- Sven Hüsing, Sören Sparmann, Carsten Schulte, Mario Bolte: Identifying K-12 Students' Approaches to Using Worked Examples for Epistemic Programming. 73:1-73:7
- Wudao Yang, Unaizah Obaidellah: Attention Dynamics in Programming: Eye Gaze Patterns of High- vs. Low-Ability Novice Coders. 74:1-74:6
ETVIS 2024 Session 1: Eye Tracking Studies
- Nelusa Pathmanathan, Kuno Kurzhals: Investigating the Gap: Gaze and Movement Analysis in Immersive Environments. 75:1-75:7
- Daniel Klötzl, Tim Krake, Frank Heyen, Michael Becher, Maurice Koch, Daniel Weiskopf, Kuno Kurzhals: NMF-Based Analysis of Mobile Eye-Tracking Data. 76:1-76:9
- Sita Aukje Vriend, Sandeep Vidyapu, Amer Rama, Kun-Ting Chen, Daniel Weiskopf: Which Experimental Design is Better Suited for VQA Tasks?: Eye Tracking Study on Cognitive Load, Performance, and Gaze Allocations. 77:1-77:7
ETVIS 2024 Session 2: Maps and Text
- Taiki Kodomaru, Minoru Nakayama: Analysis of eye movement scan-paths and their image features while searching for routes on metro maps. 78:1-78:7
- Stanislav Popelka, Jiri Kominek, Michaela Vojtechovska: Exploring Geological Map Usability Through Sequence Chart Visualization. 79:1-79:7
- Franziska Huth, Maurice Koch, Miriam Awad-Mohammed, Daniel Weiskopf, Kuno Kurzhals: Eye Tracking on Text Reading with Visual Enhancements. 80:1-80:7
MultiplEYE 2024
- Daniel Hienert, Heiko Schmidt, Thomas Krämer, Dagmar Kern: EyeLiveMetrics: Real-time Analysis of Online Reading with Eye Tracking. 81:1-81:7
- Seongsil Heo, Roberto Manduchi, Susana Chung: Reading with Screen Magnification: Eye Movement Analysis Using Compensated Gaze Tracks. 82:1-82:6
- Kristina Cergol, Marijan Palmovic: The role of stress in silent reading. 83:1-83:5
- Nicol Dostálová, Petra Pátková Dansová, Stanislav Jezek, Michaela Vojtechovska, Cenek Sasinka: Towards the Intervention of Dyslexia: a Complex Concept of Dyslexia Intervention System using Eye-Tracking. 84:1-84:6
- Pawel Kasprowski: Utilizing Gaze Self Similarity Plots to Recognize Dyslexia when Reading. 85:1-85:5
PETMEI 2024
- Aiswariya Milan K, Joseph Amudha, George Ghinea: Automated Insight Tool: Analyzing Eye Tracking Data of Expert and Novice Radiologists During Optic Disc Detection Task. 86:1-86:7
- Susanne Hindennach, Lei Shi, Andreas Bulling: Explaining Disagreement in Visual Question Answering Using Eye Tracking. 87:1-87:7
- Jesse W. Grootjen, Henrike Weingärtner, Sven Mayer: Investigating the Effects of Eye-Tracking Interpolation Methods on Model Performance of LSTM. 88:1-88:6
- Charlie S. Burlingham, Naveen Sendhilnathan, Xiuyun Wu, T. Scott Murdison, Michael J. Proulx: Real-World Scanpaths Exhibit Long-Term Temporal Dependencies: Considerations for Contextual AI for AR Applications. 89:1-89:7
- Daniele Maria Crafa, Susanna Di Giacomo, Dario Natali, Carlo Ettore Fiorini, Marco Carminati: Towards Invisible Eye Tracking with Lens-Coupled Lateral Photodetectors. 90:1-90:7
PLEY 2024
- Minxin Cheng, Leanne Chukoskie: Effects of Visual Cluttering in Virtual Reality on Visuomotor Integration in Autistic Individuals. 91:1-91:7
- Peter A. Smith, Matt Dombrowski, Calvin MacDonald, Courtney Williams, Maanya Pradeep, Elizabeth Barnum, Viviana P. Rivera, John Sparkman, Albert Manero: Initial Evaluation of a Hybrid eye tracking and Electromyography Training Game for Hands-Free Wheelchair Use. 92:1-92:8
- Prasetia Utama Putra, Fumihiro Kano: Quantifying the Effect of Anticipatory Eye Movement on Successful Ball Hitting Using Fine-Scale Tracking and SHAP-Analysis. 93:1-93:9