WebGazer, Githubbed here, is designed to show site designers what grabs visitors' attention.
The designers explain that they've created a self-calibrating eye-tracking model that “trains a mapping between the features of the eye and positions on the screen.”
Instead of requiring an explicit calibration step, the software uses clicks and cursor movements to calibrate its predictions against the screen regions the user is looking at.
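The idea of calibrating from clicks can be sketched as a regression problem: each click pairs a feature vector extracted from the eyes with a known screen coordinate, and a least-squares fit maps features to positions. The sketch below uses ridge regression in plain JavaScript; the feature values and variable names are illustrative assumptions, not WebGazer's actual internals.

```javascript
// Solve (X^T X + lambda*I) w = X^T y by Gaussian elimination with
// partial pivoting -- a small ridge-regression fit.
function ridge(X, y, lambda) {
  const d = X[0].length;
  // Normal equations: A = X^T X + lambda*I, b = X^T y
  const A = Array.from({ length: d }, (_, i) =>
    Array.from({ length: d }, (_, j) =>
      X.reduce((s, row) => s + row[i] * row[j], 0) + (i === j ? lambda : 0)
    )
  );
  const b = Array.from({ length: d }, (_, i) =>
    X.reduce((s, row, k) => s + row[i] * y[k], 0)
  );
  for (let col = 0; col < d; col++) {
    let p = col;
    for (let r = col + 1; r < d; r++)
      if (Math.abs(A[r][col]) > Math.abs(A[p][col])) p = r;
    [A[col], A[p]] = [A[p], A[col]];
    [b[col], b[p]] = [b[p], b[col]];
    for (let r = col + 1; r < d; r++) {
      const f = A[r][col] / A[col][col];
      for (let c = col; c < d; c++) A[r][c] -= f * A[col][c];
      b[r] -= f * b[col];
    }
  }
  const w = new Array(d).fill(0);
  for (let i = d - 1; i >= 0; i--) {
    let s = b[i];
    for (let j = i + 1; j < d; j++) s -= A[i][j] * w[j];
    w[i] = s / A[i][i];
  }
  return w;
}

// Each "click" pairs a hypothetical eye-feature vector (here a bias term
// plus one scalar feature) with the x coordinate the user clicked.
const clicks = [
  { features: [1, 0.2], x: 100 },
  { features: [1, 0.5], x: 400 },
  { features: [1, 0.9], x: 800 },
];
const X = clicks.map(c => c.features);
const yx = clicks.map(c => c.x);
const wx = ridge(X, yx, 1e-6);

// Prediction is a dot product of the features with the learned weights.
const predictX = f => f.reduce((s, v, i) => s + v * wx[i], 0);
console.log(Math.round(predictX([1, 0.7]))); // 600
```

In the real library the features are pixel patches around the detected eyes and there is a second fit for the y coordinate, but the mechanism is the same: every interaction adds a labelled training point, so accuracy improves as the user browses.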
Rather than sending video back to a server, the WebGazer.js library runs entirely in the client browser, and it “runs only if the user consents in giving access to their webcam”. According to the documentation, that consent may also extend to local storage, if the WebGazer operator wants to retain data between sessions.
The key bits of the software are a tracker module, which “controls how eyes are detected”, and a regression module, which handles learning and predictions.
These are designed to be swapped out, the developers explain, so that WebGazer can be improved or extended.
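That swappable design can be sketched as a pipeline holding pluggable tracker and regression parts; the names and interfaces below are illustrative, not WebGazer's actual module API.

```javascript
// Toy gaze pipeline with interchangeable parts: the tracker turns a video
// frame into eye features, the regression turns features into a screen
// position, and either can be replaced without touching the other.
function makePipeline() {
  let tracker = null;
  let regression = null;
  return {
    setTracker(t) { tracker = t; return this; },
    setRegression(r) { regression = r; return this; },
    predict(frame) { return regression.predict(tracker.detect(frame)); },
  };
}

// A stub tracker that "detects" a feature vector from a frame object.
const stubTracker = { detect: frame => [1, frame.brightness] };

// Two regressions implementing the same predict(features) interface.
const linearRegression = { predict: f => ({ x: 1000 * f[1] - 100, y: 0 }) };
const constantRegression = { predict: () => ({ x: 512, y: 384 }) };

const pipeline = makePipeline()
  .setTracker(stubTracker)
  .setRegression(linearRegression);
console.log(pipeline.predict({ brightness: 0.5 }).x); // 400

// Swapping the regression changes behaviour without touching the tracker.
pipeline.setRegression(constantRegression);
console.log(pipeline.predict({ brightness: 0.5 }).x); // 512
```

The payoff of this layout is that a better eye detector or a smarter learning method can be dropped in independently, which is exactly the extensibility the developers describe.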
The Brown University boffins who developed it (PhD candidate Alexandra Papoutsaki, assistant professor Jeff Huang and undergrad James Laskey) collaborated with Georgia Tech associate professor James Hays. ®