Number of the records: 1
Natural user interface as a supplement of the holographic Raman tweezers
SYSNO ASEP 0434981
Document Type C - Proceedings Paper (int. conf.)
R&D Document Type Conference Paper
Title Natural user interface as a supplement of the holographic Raman tweezers
Author(s) Tomori, Z. (SK)
Kaňka, Jan (UPT-D) RID, SAI
Kesa, P. (SK)
Jákl, Petr (UPT-D) RID, ORCID, SAI
Šerý, Mojmír (UPT-D) RID, SAI
Bernatová, Silvie (UPT-D) RID, SAI
Antalík, M. (SK)
Zemánek, Pavel (UPT-D) RID, SAI, ORCID
Number of authors 8
Source Title Optical Trapping and Optical Micromanipulation XI (Proceedings of SPIE 9164). - Bellingham : SPIE, 2014 - ISSN 0277-786X - ISBN 9781628411911
Pages 91642p:1-7
Number of pages 7
Publication form Online - E
Action Optical Trapping and Optical Micromanipulation /11./
Event date 17.08.2014-21.08.2014
Event location San Diego
Country US - United States
Event type WRD
Language eng - English
Country US - United States
Keywords Holography ; Interfaces ; Cameras ; Eye ; Particles ; Sensors ; Speech recognition ; Gesture recognition ; Tablets ; Software
Subject RIV BH - Optics, Masers, Lasers
R&D Projects LO1212 GA MŠMT - Ministry of Education, Youth and Sports (MEYS)
Institutional support UPT-D - RVO:68081731
UT WOS 000349300600056
DOI 10.1117/12.2061024
Annotation Holographic Raman tweezers (HRT) manipulate microobjects by controlling the positions of multiple optical traps via a mouse or joystick. Several recent attempts have exploited touch tablets, 2D cameras, or the Kinect game console instead. We proposed a multimodal “Natural User Interface” (NUI) approach integrating hand tracking, gesture recognition, eye tracking, and speech recognition. For this purpose we exploited the low-cost “Leap Motion” and “MyGaze” sensors and a simple speech recognition program, “Tazti”. We developed our own NUI software that processes the signals from the sensors and sends control commands to the HRT, which in turn controls the positions of the trapping beams, the micropositioning stage, and the Raman spectra acquisition system. The system offers various modes of operation suited to specific tasks. Virtual tools (called “pin” and “tweezers”) for manipulating particles are displayed in a transparent “overlay” window above the live camera image. The eye tracker identifies the position of the observed particle and uses it for autofocus.
Laser trap manipulation navigated by the dominant hand can be combined with gesture recognition of the secondary hand. Speech command recognition is useful when both hands are busy. The proposed methods make manual control of the HRT more efficient and also provide a good platform for its future semi-automated and fully automated operation.
Workplace Institute of Scientific Instruments
Contact Martina Šillerová, sillerova@ISIBrno.Cz, Tel.: 541 514 178
Year of Publishing 2015
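The annotation describes a layer that merges events from several sensors (hand tracker, eye tracker, speech recognizer) into one stream of control commands for the HRT. A minimal sketch of that dispatch idea, in Python, is shown below; all names, event kinds, and command strings are hypothetical illustrations, not the authors' actual software or the real Leap Motion/MyGaze/Tazti APIs.

```python
from dataclasses import dataclass

@dataclass
class SensorEvent:
    """One event from any input modality (names are illustrative)."""
    source: str   # "hand", "gaze", or "speech"
    kind: str     # e.g. "move", "fixation", "command"
    payload: dict

def dispatch(event: SensorEvent) -> str:
    """Translate a sensor event into a hypothetical HRT command string."""
    if event.source == "hand" and event.kind == "move":
        # Dominant-hand position repositions the selected optical trap.
        x, y = event.payload["x"], event.payload["y"]
        return f"TRAP MOVE {x:.1f} {y:.1f}"
    if event.source == "gaze" and event.kind == "fixation":
        # Gaze fixation marks the observed particle, used for autofocus.
        x, y = event.payload["x"], event.payload["y"]
        return f"AUTOFOCUS {x:.1f} {y:.1f}"
    if event.source == "speech" and event.kind == "command":
        # Spoken commands cover actions when both hands are busy.
        return event.payload["text"].upper()
    return "NOOP"
```

In this sketch each modality maps to a different command family, so hand tracking, gaze, and speech can be active simultaneously without conflicting, mirroring the multimodal combination described in the abstract.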