[AI] EyePoint: Look and click

vishnu ramchandani vishnuhappy at yahoo.com
Tue Jan 22 23:23:38 EST 2008


EyePoint: Look and click

Computer scientists are developing a system that would
let you click on a desired place on your screen simply
by looking at it and pressing a key, eliminating the
need for a mouse or similar pointing devices

IDG News Service

The keyboard and mouse have long been the dominant
forms of input on computer systems. Eye-gaze tracking
as a form of input was primarily developed for
disabled users who are unable to make normal use of a
keyboard and pointing device. 

[Image caption: Look-Press-Look-Release: the user first looks at the desired target and presses a hotkey, which brings up a magnified view of the region the user was looking in; the user then looks again at the target in the magnified view and releases the hotkey.]

However, with the increasing accuracy and decreasing
cost of eye-gaze tracking systems, it will soon be
practical for able-bodied users to use gaze as
a form of input in addition to a keyboard and mouse,
say a group of US computer scientists. 

The GUIDe (Gaze-enhanced User Interface Design)
project at Stanford University explores how gaze
information can be effectively used as an input in
addition
to keyboard and mouse. 

“Human beings look with their eyes. When they want to
point, either on the computer or in real life, they
look before they point. Therefore, using eye
gaze as a way of interacting with a computer seems
like a natural extension of our human abilities,” the
researchers say in their study. “However, to date,
research has suggested that using eye gaze for any
active control task is not a good idea.”

“Our approach uses gaze as an input without
overloading the visual channel and enables
interactions which feel natural and easy to use,” they
add.

WHAT IT DOES

The EyePoint system provides a practical gaze-based
solution for everyday pointing and selection using a
combination of gaze and keyboard. It works by
using a simple look-press-look-release cycle.

To use EyePoint, the user simply looks at the target
on the screen and presses a hotkey for the desired
action – single-click, double-click, etc.

The system displays a magnified view of the region the
user was looking at. The user looks at the target
again in the magnified view and releases the hotkey.
This results in the appropriate action being performed
on the target.
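
To make the cycle concrete, here is a rough Python
sketch of how a look-press-look-release handler might
be wired up. It is illustrative only: the hotkey
bindings and the helper functions (current_gaze_point,
show_magnified_view, perform_mouse_action) are
stand-ins, not part of EyePoint.

# Hypothetical sketch of a look-press-look-release cycle (not EyePoint's code).
# The eye tracker, magnified view and mouse calls are stubbed out for illustration.

HOTKEY_ACTIONS = {"F7": "single-click", "F8": "double-click", "F9": "right-click"}

def current_gaze_point():
    """Stand-in for the eye tracker: returns the gaze point in screen pixels."""
    return (640, 400)

def show_magnified_view(center):
    """Stand-in: capture the region around `center` and display it zoomed in."""
    print("magnified view shown around", center)

def perform_mouse_action(action, point):
    """Stand-in: move the cursor to `point` and issue the requested click."""
    print(action, "performed at", point)

pending_action = None

def on_hotkey_press(key):
    global pending_action
    if key in HOTKEY_ACTIONS:
        pending_action = HOTKEY_ACTIONS[key]
        show_magnified_view(current_gaze_point())   # first look triggers the zoom

def on_hotkey_release(key):
    global pending_action
    if pending_action:
        # Second look: in the real system this gaze point is mapped back from
        # the magnified view to screen coordinates (see HOW IT WORKS below).
        perform_mouse_action(pending_action, current_gaze_point())
        pending_action = None

on_hotkey_press("F7")    # look, press ...
on_hotkey_release("F7")  # ... look again, release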

[Image caption: Pressing and holding down a hotkey brings up the magnified view; the user looks at the target again and releases the hotkey to perform the equivalent mouse action.]

The region around the user’s initial gaze point is
presented in the magnified view with a grid of orange
dots overlaid. These orange dots are called focus
points and may aid in focusing the user’s gaze at a
point within the target. Focusing at a point reduces
the jitter and improves the accuracy of the system.
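
As a small illustration, a grid of focus points over
the magnified view could be generated along these
lines; the 40-pixel spacing and the function name are
assumptions, since the article does not give
EyePoint's actual layout.

def focus_point_grid(width, height, spacing=40):
    """Hypothetical layout for the overlaid focus points; the 40-pixel
    spacing is an assumed value, not EyePoint's actual parameter."""
    return [(x, y)
            for y in range(spacing // 2, height, spacing)
            for x in range(spacing // 2, width, spacing)]

# Example: dot positions for a 480 x 480 pixel magnified view.
dots = focus_point_grid(480, 480)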

Single-click, double-click and right-click actions are
performed as soon as the user releases the key.
Click-and-drag is a two-step interaction. The user
first selects the starting point for the click and
drag with one hotkey and then the destination with
another hotkey.

The user can also abort an action by looking away,
anywhere outside of the zoomed region, and releasing
the hotkey, or by pressing the ‘Esc’ key on the keyboard.

HOW IT WORKS

The eye tracker constantly tracks the user’s
eye-movements, using complex algorithms to help filter
the gaze data. It determines the user’s gaze using
low-cost, commercial cameras to track the location of
the centre of the pupil.

When the user presses and holds one of the
action-specific hotkeys on the keyboard, the system
uses the key press as a trigger to capture the region
of the screen extending around 60 pixels in all four
directions from the estimated gaze point.

The system then magnifies the captured region of the
screen. The resulting image is shown to the user at a
location centred at the previously estimated
gaze point, but offset to remain within screen
boundaries. 
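
The geometry described above can be sketched as
follows. The 60-pixel capture radius comes from the
article; the 3x magnification factor and the function
names are assumptions made for illustration.

# Sketch of the capture and placement geometry. The 60-pixel radius comes
# from the article; the 3x magnification factor is an assumption.

CAPTURE_RADIUS = 60
MAGNIFICATION = 3

def capture_rect(gaze_x, gaze_y, screen_w, screen_h):
    """Region of the screen to grab around the estimated gaze point."""
    left = max(0, gaze_x - CAPTURE_RADIUS)
    top = max(0, gaze_y - CAPTURE_RADIUS)
    right = min(screen_w, gaze_x + CAPTURE_RADIUS)
    bottom = min(screen_h, gaze_y + CAPTURE_RADIUS)
    return left, top, right, bottom

def magnified_view_origin(gaze_x, gaze_y, screen_w, screen_h):
    """Top-left corner of the zoomed view: centred on the gaze point, then
    shifted so that the whole view stays within the screen boundaries."""
    view_size = 2 * CAPTURE_RADIUS * MAGNIFICATION
    x = min(max(0, gaze_x - view_size // 2), screen_w - view_size)
    y = min(max(0, gaze_y - view_size // 2), screen_h - view_size)
    return x, y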

EyePoint then uses a second gaze fixation in the
magnified view for greater accuracy.

The user looks at the desired target in the magnified
view and releases the hotkey. The eye gaze is recorded
when the hotkey is released. Since the view
has been magnified, the resulting eye-gaze is more
accurate. 

The cursor then moves to the desired location and the
action corresponding to the hotkey (single-click,
right-click, etc) is executed. 
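
A hypothetical sketch of that final step, converting
the gaze point recorded in the magnified view back to
the original screen coordinates before the click is
issued. The magnification factor and parameter names
are assumptions, not taken from EyePoint.

def resolve_target(release_gaze, view_origin, capture_origin, magnification=3):
    """Map a gaze point inside the magnified view back to screen coordinates.

    release_gaze   -- (x, y) gaze point when the hotkey was released
    view_origin    -- (x, y) top-left corner of the magnified view on screen
    capture_origin -- (x, y) top-left corner of the originally captured region
    """
    gx, gy = release_gaze
    vx, vy = view_origin
    cx, cy = capture_origin
    # Scale the offset inside the zoomed image back down to screen pixels.
    return (cx + (gx - vx) / magnification,
            cy + (gy - vy) / magnification)

# Example: a release gaze 120 pixels into the zoomed view maps to a point
# 40 pixels into the original captured region.
target = resolve_target((520, 320), view_origin=(400, 200), capture_origin=(580, 340))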

“Users strongly preferred the experience of using
gaze-based pointing over the mouse even though they
had years of experience with the mouse,” the
researchers
say.

The GUIDe team is also working on two similar
technologies. EyeExposé renders all the open windows
into thumbnails on the screen – much like the Exposé
feature on Apple computers – thus allowing users to
easily switch between applications. EyeScroll tracks
the user’s gaze and automatically scrolls down when
the gaze drops below the middle of the screen.
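
A toy version of the EyeScroll idea might look like
this; the 40-pixel scroll step and the helper name are
assumptions, not details from the GUIDe project.

def maybe_scroll(gaze_y, screen_height, scroll_by):
    """Toy version of the EyeScroll idea; the 40-pixel step is an assumption."""
    if gaze_y > screen_height / 2:   # gaze has dropped below the middle
        scroll_by(40)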






