Cellular phones are becoming more powerful
by the day, and users are now provided with tools comparable to
those available to desktop computer users. In many cases the
limiting factor is no longer computational power, memory size, or
battery life; increasingly, it is the input interface.
Touch-screens greatly simplify menu navigation, but the vast
majority of phones on the market are not equipped with them.
Cameras, however, are widespread and are usually placed on the
side opposite the screen: this allows the user to look at the
screen while pointing the camera at a nearby surface. The surface
can then act as a virtual touch-screen, and the user can control
the on-screen pointer by moving a finger across it.
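The mapping from finger motion on the surface to pointer motion on the screen can be sketched as follows — a minimal sketch assuming fingertip positions have already been detected in consecutive camera frames (the function name, gain, and screen dimensions are illustrative, not taken from any of the systems below):

```python
def update_pointer(pointer, prev_tip, curr_tip, gain=2.0,
                   screen_w=320, screen_h=240):
    """Move the on-screen pointer by the fingertip's displacement
    between two camera frames, scaled by a sensitivity gain and
    clamped to the screen bounds. Fingertip detection itself is
    assumed to happen elsewhere."""
    dx = (curr_tip[0] - prev_tip[0]) * gain
    dy = (curr_tip[1] - prev_tip[1]) * gain
    x = min(max(pointer[0] + dx, 0), screen_w - 1)
    y = min(max(pointer[1] + dy, 0), screen_h - 1)
    return (x, y)
```

Relative (displacement-based) mapping of this kind tolerates the camera drifting over the surface, much like a mouse does.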
Distributed: 11/02/07; Reviewed by: James Skorupski (11/04/07), Gillian Smith
Distributed: 10/22/07; Reviewed by: James Skorupski (10/31/07), Gillian
Smith (10/25/07), Adam Smith (10/31/07)
Figures (Distributed: 11/16/07; Reviewed by: James Skorupski (11/18/07), Gillian
Smith (---), Adam Smith (---))
 Rogers, W.A., O'Brien, M.A., McLaughlin,
Selection and Design of Input Devices for Assistive Technologies.
9th International Conference on Control, Automation,
Robotics and Vision (ICARCV), pp. 1-6 (2006).
authors discuss various options for user interfaces and how well
they apply to applications in assistive technologies.
Some of the devices include touchpads, virtual keyboards, and
 Synaptics Inc.
Screens Improve Handheld Human Interface. Synaptics Inc., pp.
authors discuss touchpad technologies that use a capacitive layer
in hardware, allowing more accurate detection of where the
stylus or finger has pressed.
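One common way such a capacitive grid achieves sub-electrode accuracy is a signal-weighted centroid over the electrode readings; the sketch below illustrates that idea only (names and behaviour are assumptions, not Synaptics' actual method):

```python
def touch_position(grid):
    """Estimate a touch location from a grid of capacitance
    readings by taking the signal-weighted centroid; interpolating
    between electrodes is what gives sub-electrode accuracy.
    Returns None when no signal is present."""
    total = sx = sy = 0.0
    for y, row in enumerate(grid):
        for x, c in enumerate(row):
            total += c
            sx += c * x
            sy += c * y
    if total == 0:
        return None
    return (sx / total, sy / total)
```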
 Starner, T., Mann, S., Rhodes, B., Levine,
Augmented Reality Through Wearable Computing. Presence-Cambridge
Massachusetts, Vol. 6, No. 4, pp. 386-398 (1997).
authors present a head-worn apparatus with goggles through which
the user sees information overlaid on what the user is actually
seeing in the world. The user's finger is used to navigate the
menus on the displayed image. The idea is that the interface can
display information the user needs, such as a relevant reference
paper.
Dominguez, S., Keaton, T., Sayed, A.H.
A Robust Finger Tracking Method for Multimodal Wearable Computer
Interfacing. IEEE Transactions on Multimedia, Vol. 8, No. 5, pp.
authors use a wearable computer interface that is worn on the
head and viewed through goggles. They combine finger-based and
audio-based commands. Their application lets the user specify an
area for segmentation by outlining the object in the world with a
finger.
Wigdor, D., Forlines, C., Baudisch, P., Barnwell, J., Shen, C.
LucidTouch: A See-Through Mobile Device. Symposium on User
Interface Software and Technology (UIST '07), Proceedings of,
Newport, Rhode Island, pp. 269-278 (2007).
authors simulate the perception of a see-through screen so that
the user can see their finger movements taking place behind the
device. The fingers are tracked and detected in order to navigate
through the applications; a camera mounted behind the device
records the movements.
Letessier, J., Berard, F.
Visual Tracking of Bare Fingers for Interactive Surfaces.
Symposium on User Interface Software and Technology (UIST'04),
Proceedings of, Vol. 6, No.2, Santa Fe, New Mexico, pp. 119-122
authors use the difference image for segmentation and filtering
techniques to detect fingers based on color information.
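The combination of a difference image with a colour test can be sketched as follows; the RGB thresholds and frame representation here are illustrative assumptions, not the paper's values:

```python
def detect_moving_skin(prev_frame, frame, diff_thresh=30):
    """Return coordinates of pixels that both changed since the
    previous frame (difference image) and fall inside a crude RGB
    skin-colour range. Frames are rows of (r, g, b) tuples."""
    def changed(a, b):
        return sum(abs(a[i] - b[i]) for i in range(3)) > diff_thresh

    def skin(p):
        r, g, b = p
        return r > 95 and g > 40 and b > 20 and r > g and r > b

    hits = []
    for y, (prow, row) in enumerate(zip(prev_frame, frame)):
        for x, (pp, p) in enumerate(zip(prow, row)):
            if changed(pp, p) and skin(p):
                hits.append((x, y))
    return hits
```

Requiring both motion and colour suppresses static skin-coloured background as well as moving non-skin objects.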
 Oka, K., Sato, Y., Koike, H.
Tracking of Multiple Fingertips and Gesture Recognition for
Augmented Desk Interface Systems. Proceedings of the Fifth IEEE
International Conference on Automatic Face and Gesture Recognition
authors use infrared images for tracking and detection.
They are able to detect fingers in cluttered backgrounds and
under varying lighting conditions.
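The robustness to clutter and lighting comes from the signal being thermal: skin emits in a narrow temperature band, so a fixed intensity band segments the hand regardless of visible illumination. A toy sketch (the band values are illustrative, not the paper's calibration):

```python
def segment_hand_ir(ir_frame, low=30, high=34):
    """Mark infrared pixels whose intensity falls in the band
    emitted at skin temperature (values here stand in for a
    calibrated range). Visible-light clutter and shadows do not
    appear in the thermal image, so no colour model is needed."""
    return [[1 if low <= v <= high else 0 for v in row]
            for row in ir_frame]
```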
 Sato, Y., Kobayashi, Y., Koike, H.
Fast Tracking of Hands and Fingertips in Infrared Images for
Augmented Desk Interface. IEEE International Conference on
Automatic Face and Gesture Recognition, Proceedings of, pp. 462
authors use an infrared camera to detect finger movements,
letting the user work on a physical desk while also interfacing
with a computer.
 Wellner, P.
The DigitalDesk Calculator: Tactile Manipulation on a Desk Top
Display. Symposium on User Interface Software and Technology
(UIST '91), Proceedings of, pp. 27-33 (1991).
authors use a camera and a projector to interface with a computer
while working on a physical desk. The camera reads the finger
movements while the projector displays the output.
 Bencheikh-el-hocine, M., Bouzenada, M., Batouche, M.C.
A New Method of Finger Tracking Applied to the Magic Board. IEEE
International Conference on Industrial Technology (ICIT),
authors use finger detection to interface with a game of chess.
The actions of picking up a piece and moving it are detected and
displayed on the screen.
 Lockton, R., Fitzgibbon, A.W.
Real-time gesture recognition using deterministic boosting.
British Machine Vision Conference (BMVC), Proceedings of,
authors use boosting algorithms (e.g., AdaBoost) to recognize
gestures. Their goal is to detect sign language gestures.
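Boosting of this kind can be illustrated with a tiny AdaBoost over threshold "stumps" on scalar features — a toy sketch of the general algorithm, not the paper's deterministic variant:

```python
import math

def train_adaboost(samples, labels, rounds=5):
    """AdaBoost with one-feature threshold stumps. Each round picks
    the stump with lowest weighted error, weights it by its
    accuracy (alpha), and re-weights the samples it got wrong."""
    n = len(samples)
    w = [1.0 / n] * n
    classifiers = []  # (threshold, polarity, alpha)
    for _ in range(rounds):
        best = None
        for t in samples:
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, samples, labels)
                          if (pol if x >= t else -pol) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)  # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        classifiers.append((t, pol, alpha))
        # Increase weight on misclassified samples, then normalize.
        w = [wi * math.exp(-alpha * y * (pol if x >= t else -pol))
             for wi, x, y in zip(w, samples, labels)]
        s = sum(w)
        w = [wi / s for wi in w]
    return classifiers

def predict(classifiers, x):
    """Sign of the alpha-weighted vote of all stumps."""
    score = sum(a * (pol if x >= t else -pol)
                for t, pol, a in classifiers)
    return 1 if score >= 0 else -1
```

In the gesture setting, each weak classifier would test one image feature; the boosted combination yields a fast, accurate gesture label.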
 Wu, A., Shah, M., da Vitoria Lobo, N.
Virtual 3D Blackboard: 3D Finger Tracking using a Single Camera.
Automatic Face and Gesture Recognition, Proceedings of, pp. 536-543
authors detect the arm and use the arm's outline to determine
the most likely location of the finger for their finger tracking.
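The idea of taking the silhouette's extremal point as the likely fingertip can be sketched as follows (a hypothetical helper, not the authors' exact method):

```python
def fingertip(outline, wrist):
    """Pick the outline point farthest from the wrist as the most
    likely fingertip -- a simple stand-in for using the arm's
    silhouette to localise the finger. 'outline' is a list of
    (x, y) points along the arm's contour."""
    return max(outline,
               key=lambda p: (p[0] - wrist[0]) ** 2
                             + (p[1] - wrist[1]) ** 2)
```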
Dorfmuller-Ulhaas, K., Schmalstieg, D.
Finger tracking for interaction in augmented environments. IEEE
and ACM International Symposium on Augmented Reality, Proceedings
of, pp. 55-64 (2001).
authors do finger tracking by making use of a specialized glove
with markers on it, allowing for a more constrained solution.
 Perrin, S., Ishikawa, M.
Quantized Features for Gesture Recognition Using High Speed Vision
Camera. XVI Brazilian Symposium on Computer Graphics and Image
Processing (SIBGRAPI '03), Proceedings of, pp.383-390 (2003).
authors use a high-speed vision camera that detects the hand
location quickly. Their goal is to merge gesture detection
with voice commands.
 Microsoft Corp. <http://www.microsoft.com/surface/>
Microsoft Surface is a tabletop surface that can interface with
email, mobile devices, and the internet. It also has the
potential to provide a new dimension of socializing tools, such
as a picture-sharing interface at a coffee shop.
LAST UPDATED: DEC. 09, 2007