Project 3


Gaze-based evaluation of functional vision in activities of daily life

Status of position: filled


ESR: Andrea Ghiani


Why? When faced with one clear task, where people look at each instant is characteristic of the task that they are performing. This holds for everyday tasks such as walking along a path [1] or making breakfast [2], as well as for particularly challenging tasks such as hitting a fast-approaching ball [3]. At each moment, people look where they expect to find the most relevant information for the task. However, constantly looking exactly where one would normally look is not essential [4]. This is fortunate, because in daily life it is not always clear where the most relevant information is to be found, and often multiple tasks have to be performed simultaneously. We anticipate that various visual impairments will influence gaze patterns under such circumstances. You will therefore examine the relationship between visual impairments and gaze while people perform such common daily tasks under natural circumstances with many distractions.


How? You will use a state-of-the-art eye tracker (e.g., Pupil Invisible) to measure eye movements while participants perform various daily tasks in their normal environment. An important feature of such eye trackers is that participants can walk around freely (gaze is recorded on a mobile phone) while hardly noticing that the device is being worn. A large part of the work will consist of developing methods to characterize where people normally look under such circumstances. You will evaluate the variability in gaze both within individuals and between individuals for a large variety of tasks. You will then use this information to examine how various visual impairments affect gaze under similar circumstances. You will not just look for differences, but also evaluate to what extent people with visual impairments compensate for their visual deficiencies by adjusting their gaze.
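One way to make "variability in gaze within and between individuals" concrete is with a simple dispersion measure over gaze coordinates. The sketch below is only illustrative (the function name, the two-person sample data, and the units are hypothetical, not part of the project): it summarizes within-individual variability as the spread of one person's gaze samples around their own mean, and between-individual variability as the spread of the individual means.

```python
import numpy as np

def gaze_dispersion(gaze_xy):
    """RMS distance of gaze samples from their centroid,
    in the same units as the input (e.g., degrees of visual angle)."""
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    centroid = gaze_xy.mean(axis=0)
    return float(np.sqrt(((gaze_xy - centroid) ** 2).sum(axis=1).mean()))

# Hypothetical (x, y) gaze samples for two people doing the same task.
person_a = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
person_b = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]

# Within-individual variability: spread of each person's own samples.
within_a = gaze_dispersion(person_a)
within_b = gaze_dispersion(person_b)

# Between-individual variability: spread of the individual centroids.
centroids = [np.mean(person_a, axis=0), np.mean(person_b, axis=0)]
between = gaze_dispersion(centroids)
```

In practice the analysis would, of course, be far richer (gaze relative to task-relevant objects in a moving scene rather than fixed screen coordinates), but a scalar dispersion of this kind illustrates the sort of quantity that can be compared within and between individuals.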



Where? You will be located at the Vrije Universiteit (Amsterdam, NL), at the Faculty of Behavioural and Movement Sciences, under the supervision of Prof. Eli Brenner. Brenner studies vision, how it is used to guide our actions, and many other aspects of perception and action. Some studies will be done in collaboration with experts on low vision from within the OptiVist network (Koninklijke Visio and the International Paralympic Committee).


What can you expect to learn and experience? You will learn how to record and analyse eye movements using the latest technology. This will include evaluating eye movements both when people perform certain selected tasks and as they go about their daily life. You will experience the intricacies of evaluating eye movements without restricting the participants’ mobility, as well as the complications that arise when interpreting differences in such unconstrained behaviour between groups.


Who are we looking for? Much of the work will consist of developing methods to evaluate where a person is looking in their ever-changing environment. We are therefore looking for someone who has an interest in studying human (gaze) behaviour and who enjoys developing software to analyse data. Some experience in programming, ideally in Python, MATLAB or C/C++, is required.


References

  1. Matthis JS, Yates JL, Hayhoe MM (2018) Gaze and the control of foot placement when walking in natural terrain. Current Biology 28(8):1224-1233.e5. https://doi.org/10.1016/j.cub.2018.03.008
  2. Land MF, Hayhoe M (2001) In what ways do eye movements contribute to everyday activities? Vision Research 41(25-26):3559-3565. https://doi.org/10.1016/S0042-6989(01)00102-X
  3. Mann DL, Nakamoto H, Logt N, Sikkink L, Brenner E (2019) Predictive eye movements when hitting a bouncing ball. Journal of Vision 19(14):28, 1-21. https://doi.org/10.1167/19.14.28
  4. Cámara C, López-Moliner J, Brenner E, de la Malla C (2020) Looking away from a moving target does not disrupt the way in which the movement toward the target is guided. Journal of Vision 20(5):5. https://doi.org/10.1167/jov.20.5.5

Project output


No output yet.
