Testing the benefits of gaze-assisted sampling (using eye tracking) in a mobility task with simulated phosphene vision

neuralcodinglab/SPVGaze

Experimental code for VR phosphene simulations with gaze-contingent image processing

This repository contains the experimental scripts for the publication: de Ruyter van Steveninck, J., Nipshagen, M., van Gerven, M., Güçlü, U., Güçlüturk, Y., & van Wezel, R. (2024). Gaze-contingent processing improves mobility, scene recognition and visual search in simulated head-steered prosthetic vision. Journal of Neural Engineering, 21(2), Article 026037. https://doi.org/10.1088/1741-2552/ad357d

Note that this repository contains only the experimental scripts. The data analysis code can be found here: https://github.com/neuralcodinglab/SPVGazeAnalysis/tree/main

Remarks

The code is specific to our experimental setup, which used the HTC VIVE Pro Eye headset and 3D virtual environments designed by ArchVizPRO studios (these must be purchased from the Unity Asset Store: https://assetstore.unity.com/publishers/13358).

The following libraries are excluded from the upload and need to be added locally:

The branches 'experiment' and 'experiment 2' contain the code used for the simulated prosthetic vision experiments: Experiment 1 tested mobility (obstacle avoidance) and Experiment 2 tested orientation (scene recognition and visual search) with simulated prosthetic vision.
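For readers unfamiliar with the core idea, gaze-contingent sampling means the image region fed into the phosphene simulation follows the measured gaze point rather than the head direction. The sketch below is a minimal, illustrative NumPy version of that concept only; it is not the repository's Unity/C# code, and the function name and parameters (window size `fov`, phosphene `grid` resolution) are assumptions chosen for clarity.

```python
import numpy as np

def gaze_contingent_phosphenes(image, gaze_xy, fov=64, grid=16):
    """Crop a square window around the gaze point and downsample it
    to a coarse grid of phosphene brightness values.

    image   : 2D grayscale array (H x W)
    gaze_xy : (x, y) gaze position in image pixels
    fov     : side length of the sampled window, in pixels
    grid    : phosphene grid resolution (grid x grid output)
    """
    h, w = image.shape
    half = fov // 2
    x, y = gaze_xy
    # Clamp the window so it stays fully inside the image.
    x = min(max(x, half), w - half)
    y = min(max(y, half), h - half)
    patch = image[y - half:y + half, x - half:x + half]
    # Average-pool the patch into grid x grid phosphene intensities.
    cell = fov // grid
    return patch.reshape(grid, cell, grid, cell).mean(axis=(1, 3))
```

In a head-steered condition, the same function would instead be called with a fixed, head-centered window, which is what makes the gaze-contingent and head-steered conditions directly comparable.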
