Eye Tracking Glasses ETV

Wearable, lightweight eye tracking glasses

Synchronize eye tracking data & video with physiological data.

Track gaze on displays showing movies or images, including phones and monitors (portrait or landscape)

Real-time AI tracking of multiple objects


To fully appreciate the power and versatility of the ETVision glasses we recommend a live demo. Please use the demo request form to arrange a site visit and system demonstration.

Precise, binocular point of gaze at 180 Hz

Synchronize eye tracking data & video with physiological data

Use ETVision (ETV) wearable eye tracking glasses for Gaze, Pupillometry, Saccade, and Fixation data—and synchronize with physiological signals in AcqKnowledge for a comprehensive study.
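At its core, this synchronization amounts to aligning two sample streams recorded at different rates by timestamp. The sketch below is purely illustrative (AcqKnowledge performs the alignment internally; the rates and field layout here are assumptions, not the actual ETVision or AcqKnowledge data format):

```python
# Illustrative sketch: aligning 180 Hz gaze samples with a 1000 Hz
# physiology channel by nearest-timestamp lookup. Rates and data
# layout are assumptions, not the actual ETVision/AcqKnowledge format.
from bisect import bisect_left

def nearest_sample(timestamps, t):
    """Return the index of the timestamp closest to t (timestamps sorted)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# One second of gaze timestamps (180 Hz) and physiology timestamps (1000 Hz)
gaze_t = [n / 180.0 for n in range(180)]
physio_t = [n / 1000.0 for n in range(1000)]

# For each gaze sample, find the index of the matching physiology sample
matches = [nearest_sample(physio_t, t) for t in gaze_t]
print(matches[:3])  # first few aligned indices
```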

The ETVision (ETV) wearable unit resembles an eyeglasses frame that can be worn by itself or over the participant’s prescription eyeglasses. It contains miniature “eye cameras” that view each eye, a “scene camera” that views the scene in front of the participant, and a microphone. An optional visor can be mounted to the front of the headgear.

  • High-definition Scene Image
  • Fast and simple setup with single-point calibration
  • Displays left eye, right eye, and scene image
  • Score feedback gives investigators confidence in the quality of gaze data
  • Automatic vergence correction for accuracy at varying distances
  • Constant real-time feedback shows point of gaze and the system’s feature-recognition performance
  • Two-way audio allows investigator and participant to speak with each other during task performance

Recording video and audio data to an SD card for later processing on your existing PC allows the participant to be completely untethered while data recording takes place. The optional “Argus Science ETPhone” app can calculate, display, and record gaze on a mobile device or record to an SD card. Review the Eye Tracking Glasses Overview (pdf).

System options:

EYE-ETV-01 system includes glasses hardware and analysis software plus an AcqKnowledge license for Eye Tracking to import Argus data and synchronize with multiple physiological measures for a comprehensive research study.

EYE-ETV-00 is a stand-alone system with glasses hardware and analysis software but no license for AcqKnowledge integration.

Details

Eye-ETV Glasses Details

The left and right bottom sections of the frame each include a panel containing a camera and a pair of LEDs. The panels rotate within an outer enclosure so that it is possible to adjust the camera vertical aiming direction with respect to the optics frame. Although these panels are usually best left in their standard position, it may be necessary to use this adjustment if facial structure makes the optics frames sit in an unusual position or angle on the face.

The frame comes with two nosepieces that can be interchanged to adjust the vertical position of the entire frame with respect to the face; each nosepiece may be flexed to fit different nose profiles as required. To remove a nosepiece, simply pull the nosepiece horizontally away from the frame. Attach a nosepiece by pressing the two pins on the nosepiece into the mating holes in the frame. The frame can also be worn with no nosepiece to position the frame at the lowest possible position on the face.

A microphone is located in the part of the frame that sits just above the participant’s eyebrows.

An optional visor attaches and detaches from the frame by mating small slots at the upper corners of the visor with small hooks on the frame.

The scene camera lens protrudes from the frame just above the nosepiece. Scene camera focus is adjusted by rotating the lens. An M2, hex head, nylon tip setscrew, located at the bottom of the lens assembly, controls the “tightness” of the focus adjustment, and should be tight enough to prevent unintentional rotation of the lens.

An HDMI-type cable extends from the right temple clamp (the piece that extends from the temple to the top of the ear). It connects to the frame with a micro HDMI connector in the temple clamp, but is intended to be disconnected only if it is necessary to replace the cable. The other end of the cable has a standard “type-A-to-micro” HDMI connector and connects to the ETVision Controller.

The small Controller (slightly larger than a smartphone) easily fastens to an adjustable belt or can be held by the participant (or near the participant) in some other way. The Controller holds a micro SD card for video and audio data recording, and includes a rechargeable battery. It can also be powered directly from a DC power supply.

ETVision–Minimum Computer Specifications

To achieve the expected ETVision performance, the associated laptop must meet or exceed these specifications:

CPU: Intel i7-8750H
OS: Win10 Pro 64-bit
Memory: 16 GB DDR4 RAM
Wireless: 802.11ac
Graphics*: NVIDIA GeForce RTX™ 2060 – 6GB GDDR6 Memory
Features: 10/100/1000 LAN, 2X USB3, SD Card Reader, Audio (Mic & Speaker)

*The GPU is the most critical component in the specification—it should meet or exceed RTX™ 2060 performance. The RTX 2070, RTX 2080, and the RTX™ 30 series (RTX 3060, RTX 3070, and RTX 3080) are all GPUs that also have at least 6 GB of memory; models with 4 GB or less do not deliver acceptable performance.


StimTrac Module ETAnalysis Licensed Add-on Available

Add the EYE-ETV-STIMTRAC Stimulus Tracking Module for exciting new data analysis technology for eye tracking on mobile devices and computers! The new Stimulus Tracking analysis software module is the latest option in the powerful ETAnalysis suite for Argus Science eye tracking glasses. This software further automates and expedites data analysis for applications where gaze falls on a stimulus device, making the process quick and nearly automatic!

This optional StimTrac licensed feature adds revolutionary functionality to the ETV software that comes with the EYE-ETV eye tracking glasses. It allows tracking of a single display and works with the ETRemote (supplied with the Argus glasses) to present movies or images on the screen and sync with the data collection. This add-on also includes two sets of 4 vinyl fiducials.


Optional ETPhone App

The “Argus Science ETPhone” app can calculate, display, and record gaze on the mobile device. There is one difference to be aware of: gaze recorded to the mobile device via ETPhone is recorded at 30 Hz, whereas gaze recorded live to a PC or to the SD card in the Controller is recorded at 180 Hz. You can also start SD card recording via ETPhone; that recording is captured at 180 Hz in the .emv file. The free app is available on Google Play (Android) or the App Store (iOS). Learn more: Instructional Video | ETPhone App Guide
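The practical impact of this rate difference is easy to quantify. As a minimal sketch (the trial duration is an assumed example, and this is simple arithmetic, not Argus code):

```python
# Sketch of the sample-count difference between the recording modes
# described above: 30 Hz via ETPhone to the mobile device vs. 180 Hz
# to a PC or the Controller's SD card. Duration is illustrative.
def samples(rate_hz, duration_s):
    """Number of gaze samples captured at rate_hz over duration_s seconds."""
    return rate_hz * duration_s

trial_s = 60  # a hypothetical one-minute trial
print(samples(30, trial_s))   # 1800 samples recorded on the mobile device
print(samples(180, trial_s))  # 10800 samples in the .emv file
```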


Eye Tracking AI MOT License

AI Multiple Object Tracking (MOT) Licensed Feature Add-on for Argus Eye Tracking Glasses

This new AI-powered feature for ETVision eye tracking glasses uses an AI model to recognize and detect multiple objects that appear in the head-mounted scene camera image. The objects are detected and tracked in real time, and the system flags periods of gaze engagement with each object.

This license provides real-time object tracking using AI object recognition for existing Argus eye tracking glasses users. Use the ETAnalysis package to auto-train objects using its moving areas of interest capability. Training different views of an object manually could take many hours, but with ETAnalysis these AI objects are trained automatically, saving time! Stream out this new capability with new data fields that identify the object that gaze is engaged with.

  • Track AI objects live in ETVision
  • Report Gaze & Fixations on each AI Object in real time
  • Train objects for AI recognition with automated process using ETAnalysis Moving Area of Interest capability
  • Auto-detect trained AI objects for all participants
  • Generate real-time Bar Plots by AI object engagement: Total Time, Fixation Time, Time to First Fixation, and Pupil Size
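The bar-plot metrics listed above can be understood as simple aggregations over per-object fixation records. A hypothetical sketch follows; the record format and field names are assumptions for illustration, not ETVision’s actual output:

```python
# Hypothetical fixation records: (object_name, start_time_s, duration_s).
# This layout is illustrative only, not ETVision's actual data fields.
fixations = [
    ("mirror", 0.5, 0.30),
    ("display", 1.2, 0.45),
    ("mirror", 2.0, 0.25),
]

def fixation_time(records, obj):
    """Total fixation time on one object (the 'Fixation Time' bar)."""
    return sum(d for name, _, d in records if name == obj)

def time_to_first_fixation(records, obj):
    """Onset of the earliest fixation on the object, or None if never fixated."""
    starts = [s for name, s, _ in records if name == obj]
    return min(starts) if starts else None

print(fixation_time(fixations, "mirror"))            # 0.55
print(time_to_first_fixation(fixations, "display"))  # 1.2
```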

ETVision AI Object Detection and AI Model Training

AI feature recognition is supported by ETVision, version 1.0.7.2 or above. The AI training procedure is supported by ETAnalysis version 1.1.0.3 or above with the AIObjectTraining option included.

The standard AI model in ETVision is trained to recognize a set of common objects. This optional AIObjectTraining add-on to the ETAnalysis application enables users to easily create custom AI models for ETVision that will recognize objects specific to their requirements and environment.

In the video below, ETVision has been loaded with a custom AI model trained to recognize the information display, instrument cluster, rearview mirror, left sideview mirror, and right sideview mirror. A recognition box tracks each of these objects in real time and the system flags periods of gaze engagement with each object. Real-time bar plots show cumulative fixation time on each object. Alternatively, the bar plots can show Total Time, Number of Fixations, Average Fixation Duration, Time to First Fixation, or Pupil Size.

In the video below, the custom AI model has been trained to recognize the objects in a golf putting drill. Object boundaries are red for the five golf balls, green for the club, and purple for the hole, and the participant’s gaze is shown as a yellow circular cursor. Object engagement is indicated at the top of the scene image, and at the same time the live bar plots for gaze time, fixation time, and pupil size relative to each object are updated for immediate feedback.