Virtual Touch-Pad

Team Members: Orazio Gallo and Sonia Arteaga

Fall '07 CMPS 290b

ABSTRACT:

Cellular phones are becoming more powerful by the day, and users are now provided with tools comparable to those available on desktop computers. In many cases the limiting factor is no longer computational power, memory size, or battery life, but rather the input interface. Touch-screens greatly simplify menu navigation, but the vast majority of phones on the market are not equipped with them. Cameras, however, are widespread and are usually placed on the opposite side of the screen: this allows the user to look at the screen while pointing the camera towards some surface. That surface can then become a virtual touch-screen, and the user can control the pointer on the screen by moving a finger on it.
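As a rough illustration of the detection step, here is a minimal sketch of fingertip localization via skin-color thresholding. The threshold values and the "topmost skin pixel" heuristic are illustrative assumptions, not the algorithm actually implemented in this project.

```python
def is_skin(r, g, b):
    # Crude skin-color test in RGB space (hypothetical thresholds;
    # a real detector would likely use a different color model).
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def find_fingertip(frame):
    """Return (row, col) of the topmost skin-colored pixel, or None.

    `frame` is a list of rows; each pixel is an (r, g, b) tuple.
    The topmost skin pixel approximates the fingertip when the
    finger points 'up' in the camera image.
    """
    for row_idx, row in enumerate(frame):
        for col_idx, (r, g, b) in enumerate(row):
            if is_skin(r, g, b):
                return (row_idx, col_idx)
    return None

# Tiny synthetic 4x4 frame: dark background, one skin-colored "finger".
BG, SKIN = (30, 30, 30), (200, 120, 90)
frame = [
    [BG, BG, BG, BG],
    [BG, BG, SKIN, BG],
    [BG, SKIN, SKIN, BG],
    [BG, SKIN, SKIN, BG],
]
print(find_fingertip(frame))  # (1, 2)
```

In practice the per-pixel test would be combined with tracking across frames to reject spurious skin-colored background pixels.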

 

PROJECT COMPONENTS:

  • Implementation of the finger detection and tracker algorithm

  • Implementation of a simple application to show proof of concept

  • Finger Mouse Evaluation
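For the proof-of-concept application, one step a pointer-control loop needs is mapping the detected fingertip from camera coordinates to screen coordinates. A minimal sketch, assuming a simple linear rescaling (the resolutions and the mapping itself are hypothetical, not the project's actual implementation):

```python
def to_screen(tip, cam_size, screen_size):
    """Map a fingertip position in camera coordinates to screen coordinates.

    `tip` is (row, col) in the camera frame; `cam_size` and `screen_size`
    are (height, width) pairs. The camera frame is simply rescaled to the
    screen resolution.
    """
    row, col = tip
    cam_h, cam_w = cam_size
    scr_h, scr_w = screen_size
    return (row * scr_h // cam_h, col * scr_w // cam_w)

# A fingertip at the center of a 120x160 camera frame maps to the
# center of a 240x320 screen.
print(to_screen((60, 80), (120, 160), (240, 320)))  # (120, 160)
```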

PROJECT REPORT LINKS:

  • Introduction (Distributed: 11/02/07  Reviewed by: James Skorupski (11/04/07), Gillian Smith (11/05/07), Adam Smith (11/05/07))

  • Related Work  (Distributed: 10/22/07  Reviewed by: James Skorupski(10/31/07), Gillian Smith(10/25/07), Adam Smith (10/31/07))

  • Figures (Distributed: 11/16/07  Reviewed by: James Skorupski(11/18/07), Gillian Smith(---), Adam Smith (---))

  • Full Paper

REFERENCES:

[1] Rogers, W.A., O'Brien, M.A., McLaughlin, A.C. Selection and Design of Input Devices for Assistive Technologies. 9th International Conference on Control, Automation, Robotics and Vision (ICARCV), pp. 1-6 (2006).

Description: The authors discuss various options for user interfaces and how well they apply to assistive technologies.  The devices covered include touchpads, virtual keyboards, and voice commands.

[2] Synaptics Inc. New Touch Screens Improve Handheld Human Interface. Synaptics Inc., pp. 1-6 (2001).

Description: The authors discuss touchpad technologies using a capacitive layer in hardware that allows for more accurate detection of where the stylus or finger has pressed.

[3] Starner, T., Mann, S., Rhodes, B., Levine, J., et al. Augmented Reality Through Wearable Computing. Presence: Teleoperators and Virtual Environments, Vol. 6, No. 4, pp. 386-398 (1997).

Description: The authors describe a head-worn apparatus with goggles through which the user sees information overlaid on the real world.  The user's finger is used to navigate the various menus on the displayed image.  The idea is that the interface can display information the user needs, such as a relevant reference paper.

[4] Dominguez, S., Keaton, T., Sayed, A.H. A Robust Finger Tracking Method for Multimodal Wearable Computer Interfacing. IEEE Transactions on Multimedia, Vol. 8, No. 5, pp. 956-972 (2006).

Description: The authors use a head-worn wearable computer whose interface is viewed through goggles, and they combine finger-based and audio commands.  In their application, the user specifies an area for segmentation by outlining an object in the world with a finger.

[5] Wigdor, D., Forlines, C., Baudisch, P., Barnwell, J., Shen, C. LucidTouch: A See-Through Mobile Device. Proceedings of the Symposium on User Interface Software and Technology (UIST '07), Newport, Rhode Island, pp. 269-278 (2007).

Description: The authors simulate the perception of a see-through screen so that the user can see their finger movements taking place behind the device.  The fingers are detected and tracked in order to navigate through the applications.  A camera mounted behind the device records the movements.

[6] Letessier, J., Berard, F. Visual Tracking of Bare Fingers for Interactive Surfaces. Proceedings of the Symposium on User Interface Software and Technology (UIST '04), Vol. 6, No. 2, Santa Fe, New Mexico, pp. 119-122 (2004).

Description: The authors use the difference image for segmentation and filtering techniques to detect fingers based on color information.
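The difference-image idea mentioned above can be sketched as simple frame differencing. The threshold value and the grayscale representation are assumptions for illustration, not the parameters used in [6]:

```python
def difference_mask(prev, curr, thresh=25):
    """Binary mask of pixels whose grayscale change exceeds `thresh`.

    `prev` and `curr` are grayscale frames (lists of rows of ints).
    Pixels that changed between frames are candidate finger pixels;
    a color-based filter can then refine this mask.
    """
    return [
        [1 if abs(c - p) > thresh else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]

prev = [[10, 10, 10],
        [10, 10, 10]]
curr = [[10, 10, 10],
        [10, 90, 10]]
print(difference_mask(prev, curr))  # [[0, 0, 0], [0, 1, 0]]
```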

[7] Oka, K., Sato, Y., Koike, H. Real-Time Tracking of Multiple Fingertips and Gesture Recognition for Augmented Desk Interface Systems. Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition (FGR), pp. 411-416 (2002).

Description: The authors use infrared images for tracking and detection.  They are able to detect fingers against cluttered backgrounds and under varying lighting conditions.

[8] Sato, Y., Kobayashi, Y., Koike, H. Fast Tracking of Hands and Fingertips in Infrared Images for Augmented Desk Interface. Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, p. 462 (2000).

Description: The authors use an infrared camera to detect finger movements, letting the user work on a physical desk while also interfacing with a computer.

[9] Wellner, P. The DigitalDesk Calculator: Tactile Manipulation on a Desk Top Display. Proceedings of the Symposium on User Interface Software and Technology (UIST '91), pp. 27-33 (1991).

Description: The authors use a camera and a projector to interface with a computer while working on a physical desk.  The camera reads in the finger movements while the projector displays output information.

[10] Bencheikh-el-hocine, M., Bouzenada, M., Batouche, M.C. A New Method of Finger Tracking Applied to the Magic Board. IEEE International Conference on Industrial Technology (ICIT), pp. 1046-1051 (2004).

Description: The authors use finger detection to interface with a game of chess.  The actions of picking up a piece and moving it are detected and displayed on the screen.

[11] Lockton, R., Fitzgibbon, A.W. Real-Time Gesture Recognition Using Deterministic Boosting. Proceedings of the British Machine Vision Conference (BMVC) (2002).

Description: The authors use boosting algorithms (e.g., AdaBoost) to recognize gestures.  Their goal is to detect sign-language gestures.

[12] Wu, A., Shah, M., da Vitoria Lobo, N. A Virtual 3D Blackboard: 3D Finger Tracking Using a Single Camera. Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, pp. 536-543 (2000).

Description: The authors detect the arm and use its outline to determine the most likely fingertip location for their finger detection.

[13] Dorfmuller-Ulhaas, K., Schmalstieg, D. Finger Tracking for Interaction in Augmented Environments. Proceedings of the IEEE and ACM International Symposium on Augmented Reality, pp. 55-64 (2001).

Description: The authors do finger tracking by making use of a specialized glove with markers on it, allowing for a more constrained solution.

[14] Perrin, S., Ishikawa, M. Quantized Features for Gesture Recognition Using High Speed Vision Camera. Proceedings of the XVI Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI '03), pp. 383-390 (2003).

Description: The authors use a high-speed vision camera that detects the hand location quickly.  Their goal is to merge gesture detection with voice commands.

[15] Microsoft Corp. Microsoft Surface. <http://www.microsoft.com/surface/>

Description: Microsoft Surface is a tabletop touch surface that interfaces with email, mobile devices, and the internet.  It also has the potential to provide a new dimension of social tools, such as a picture-sharing interface at a coffee shop.

 

LAST UPDATED: DEC. 09, 2007