Eye Tracker

Link to software [1]

Eye tracker: here (96,264 KB)

GUI: here (273 KB)


Our software is based on the open-source GazeParser C and Python libraries [2], which can be controlled via Python. Our system incorporates a number of additions:

  1. A GUI to quickly start the tracker and set up a recording file.

  2. Modification of acquisition settings (pupil threshold, corneal reflection threshold, etc.) on the fly with GUI sliders.

  3. Extensive calibration features, including a user-guided calibration procedure for minimally-cooperative subjects such as animals or infants.

  4. Ability to re-use a calibration, facilitating psychophysical studies for cooperative and minimally cooperative subjects (i.e., restarting acquisition after a break or separate session).

  5. Quadratic transformation of the eye-to-screen mapping for more uniform accuracy across the screen.

  6. A scene monitor for monitoring gaze position on the presented scene.

  7. Recording of external stimulus events together with the eye timestamps.

  8. Free recording of eye position in the absence of a stimulus at the click of a button.
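To illustrate item 5, a quadratic eye-to-screen mapping can be fitted by least squares over the calibration points. The exact polynomial terms used by our software are not specified here; the sketch below assumes a full second-order polynomial (constant, linear, cross, and squared terms), and the function names are illustrative, not part of the GazeParser API.

```python
import numpy as np

def _design(raw):
    """Second-order design matrix for raw eye coordinates (N, 2)."""
    x, y = raw[:, 0], raw[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_quadratic_mapping(raw, screen):
    """Least-squares fit of a quadratic eye-to-screen mapping.

    raw    : (N, 2) raw eye positions at the calibration points
    screen : (N, 2) known screen coordinates of those points
    Returns a (6, 2) coefficient matrix C such that _design(raw) @ C
    approximates the screen positions. Needs at least 6 well-spread
    calibration points.
    """
    coeffs, *_ = np.linalg.lstsq(_design(raw), screen, rcond=None)
    return coeffs

def apply_mapping(coeffs, raw):
    """Map raw eye positions to screen coordinates with fitted coeffs."""
    return _design(raw) @ coeffs
```

Compared with the affine (first-order) mapping, the squared terms let the fit absorb smooth distortions toward the screen edges, which is what yields the more uniform accuracy across the screen.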

[1] Farivar R & Michaud-Landry D (2016) Construction and Operation of a High-Speed, High-Precision Eye Tracker for Tight Stimulus Synchronization and Real-Time Gaze Monitoring in Human and Animal Subjects. Front. Syst. Neurosci. 10:73. doi: 10.3389/fnsys.2016.00073

[2] Sogo, H. (2013). GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis. Behav. Res. Methods 45, 684–695. doi: 10.3758/s13428-012-0286-x


Good continuity discrimination

Contour elements can follow along a curve and display “good” continuity, or they can be systematically shifted, disrupting the good continuity of the contour they form. This Gestalt property of the contour can be detected by the early visual system, and detection becomes more difficult as the curvature of the contour flattens. When characterizing visual perception, however, sensitivity is not the only limiting factor that has to be taken into account. We applied the equivalent noise method, with the Linear Amplifier Model, to infer internal noise and efficiency from participants’ performance: contours were presented under three levels of added orientation noise, randomly shifting the elements of both the good- and the bad-continuity contours. Below is an illustration of typical psychometric performance for stimuli at the corresponding levels of external noise. In this task, the participant chooses which of the four contours presented has good continuity. For simplicity, the lower-right contour is the good-continuity contour in all of the stimuli shown here.
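The equivalent noise analysis can be sketched as follows. Under the Linear Amplifier Model, the squared orientation threshold is linear in the squared external noise, T² = (σ_int² + σ_ext²)/η, so internal noise σ_int and efficiency η can be recovered from a straight-line fit to thresholds measured at the three external-noise levels. This is a minimal sketch of that relationship, not the fitting procedure used in the study; the function name is illustrative.

```python
import numpy as np

def fit_lam(sigma_ext, thresholds):
    """Fit the Linear Amplifier Model (LAM),

        T(sigma_ext)**2 = (sigma_int**2 + sigma_ext**2) / efficiency,

    by noting that squared threshold is linear in squared external
    noise: T**2 = a + b * sigma_ext**2, with efficiency = 1/b and
    sigma_int = sqrt(a/b).

    sigma_ext  : external orientation-noise SDs (one per noise level)
    thresholds : measured orientation thresholds at those levels
    Returns (sigma_int, efficiency).
    """
    b, a = np.polyfit(np.asarray(sigma_ext) ** 2,
                      np.asarray(thresholds) ** 2, 1)
    return np.sqrt(a / b), 1.0 / b
```

Internal noise sets the flat portion of the threshold-versus-noise curve at low external noise, while efficiency sets how steeply thresholds rise once external noise dominates.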

These stimuli were originally created by Alex Baldwin [1] and were presented to mild traumatic brain injury patients by Tatiana Ruiz.

[1] Baldwin AS, Fu M, Farivar R & Hess RF (2017) The equivalent internal orientation and position noise for contour integration. Sci. Rep. 7:13048, pp. 1–12.

Depth-cues in the visual system

Stimuli used in an fMRI study on the representation of depth-cues in the visual system. These stimuli were created by Hassan Akhavein.

Objects were defined by isolated depth-cues (shading, texture, and structure from motion).

Examples of objects defined by shading and texture

Shaded Body
Texture Body
Shaded Face
Texture Face
Shaded Fruit
Texture Fruit

Examples of control stimuli

Control stimuli that preserve the mean depth of the object, generated from the depth-map of the object using steerable

Control Face
Control Fruit
Examples of structure from motion