SAFA ANDAÇ

Assessment of functional vision for daily life locomotion and navigation using VR
This project, under the supervision of Prof. M.B. Hoffmann, is located in the Visual Processing Lab at OVGU. It aims to develop tools that determine the effect of visual impairment on daily-life locomotion, and specifically on navigation, using VR paradigms.
Personal Background:
- BS in Computer Engineering
- MA in Cognitive Science
- Data Structures and Algorithms
- Advanced proficiency in C, C++17, C#, Python, Unity, and MATLAB
- Working knowledge of other programming languages
- Design of Experimental Paradigms
Personal Interest(s):
A devoted Vim user, a chess player, and a competitive programmer
Aim of the project:
The objectives of my project are to 1) determine the effect of visual impairment on functional vision during daily locomotion, specifically on gait control, body movements and navigation, 2) use visually guided locomotion abilities to determine functional vision, and 3) establish links between locomotion abilities and functional vision capacities and relate these to quality of life (QoL).
Current activities:
To date, I have analyzed outcome measures such as travel time, pointing time, and distance error from locomotion data obtained from 14 glaucoma patients and 15 controls, using VR equipment and a navigation paradigm implemented in a virtual environment.
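As a minimal sketch of how such outcome measures can be derived, the snippet below computes travel time and distance error from a single trial's trajectory. The data format (timestamped 2-D positions) and the target location are illustrative assumptions, not the project's actual data structures.

```python
import math

# Hypothetical trial record: (t in s, x in m, z in m) head positions
# sampled during one navigation trial, plus the true target location.
# Field layout is an illustrative assumption.
trajectory = [(0.0, 0.0, 0.0), (1.0, 0.4, 0.1), (2.0, 1.1, 0.3), (3.5, 1.9, 0.6)]
target = (2.0, 0.5)

def travel_time(traj):
    """Elapsed time from the first to the last sample of the trial."""
    return traj[-1][0] - traj[0][0]

def distance_error(traj, tgt):
    """Euclidean distance between the final position and the target."""
    _, x, z = traj[-1]
    return math.hypot(x - tgt[0], z - tgt[1])

print(travel_time(trajectory))                       # 3.5
print(round(distance_error(trajectory, target), 3))  # 0.141
```

Pointing time would be computed analogously from the timestamps of the pointing phase of a trial.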
Furthermore, I have started my cross-sector secondment at the Pattern Recognition Company. In collaboration with ESR4 (Yaxin), I am developing deep-learning-based methods to investigate differences and similarities between glaucoma patients and controls using the locomotion data from the first study.
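To illustrate the underlying idea of separating the two groups by their locomotion features, here is a deliberately simplified nearest-centroid baseline on two summary features (travel time in s, distance error in m). The feature values are invented, and this toy rule stands in for the actual deep-learning pipeline.

```python
# Toy group-separation baseline; values and the nearest-centroid rule
# are illustrative stand-ins, not the project's deep-learning method.
patients = [(42.0, 1.8), (39.5, 2.1), (45.2, 1.6)]
controls = [(28.1, 0.7), (30.4, 0.9), (26.8, 0.5)]

def centroid(samples):
    """Per-feature mean of a list of feature tuples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def classify(sample, c_pat, c_con):
    """Assign the sample to the closer group centroid (squared distance)."""
    d_pat = sum((a - b) ** 2 for a, b in zip(sample, c_pat))
    d_con = sum((a - b) ** 2 for a, b in zip(sample, c_con))
    return "patient" if d_pat < d_con else "control"

c_pat, c_con = centroid(patients), centroid(controls)
print(classify((41.0, 1.9), c_pat, c_con))  # patient
print(classify((27.5, 0.6), c_pat, c_con))  # control
```

A deep-learning model replaces the hand-picked summary features with representations learned directly from the raw locomotion time series.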
Future directions:
In future work, I plan to use machine learning techniques to determine how the locomotion abilities of visually impaired individuals relate to their quality of life.
Project output:
No output yet.