Number of the records: 1  

Natural user interface as a supplement of the holographic Raman tweezers

  1.
    SYSNO ASEP: 0434981
    Document Type: C - Proceedings Paper (int. conf.)
    R&D Document Type: Conference Paper
    Title: Natural user interface as a supplement of the holographic Raman tweezers
    Author(s): Tomori, Z. (SK)
    Kaňka, Jan (UPT-D) RID, SAI
    Kesa, P. (SK)
    Jákl, Petr (UPT-D) RID, ORCID, SAI
    Šerý, Mojmír (UPT-D) RID, SAI
    Bernatová, Silvie (UPT-D) RID, SAI
    Antalík, M. (SK)
    Zemánek, Pavel (UPT-D) RID, SAI, ORCID
    Number of authors: 8
    Source Title: Optical Trapping and Optical Micromanipulation XI (Proceedings of SPIE 9164). - Bellingham : SPIE, 2014 - ISSN 0277-786X - ISBN 9781628411911
    Pages: 91642p:1-7
    Number of pages: 7
    Publication form: Online - E
    Action: Optical Trapping and Optical Micromanipulation /11./
    Event date: 17.08.2014-21.08.2014
    Event location: San Diego
    Country: US - United States
    Event type: WRD
    Language: eng - English
    Country: US - United States
    Keywords: Holography ; Interfaces ; Cameras ; Eye ; Particles ; Sensors ; Speech recognition ; Gesture recognition ; Tablets ; Software
    Subject RIV: BH - Optics, Masers, Lasers
    R&D Projects: LO1212 GA MŠMT - Ministry of Education, Youth and Sports (MEYS)
    Institutional support: UPT-D - RVO:68081731
    UT WOS: 000349300600056
    DOI: 10.1117/12.2061024
    Annotation: Holographic Raman tweezers (HRT) manipulate microobjects by controlling the positions of multiple optical traps via a mouse or joystick. Several recent attempts have exploited touch tablets, 2D cameras, or the Kinect game console instead. We proposed a multimodal “Natural User Interface” (NUI) approach integrating hand tracking, gesture recognition, eye tracking, and speech recognition. For this purpose we exploited the low-cost “Leap Motion” and “MyGaze” sensors and a simple speech recognition program, “Tazti”. We developed our own NUI software, which processes the signals from the sensors and sends control commands to the HRT; the HRT in turn controls the positions of the trapping beams, the micropositioning stage, and the Raman spectra acquisition system. The system allows various modes of operation appropriate for specific tasks. Virtual tools (called “pin” and “tweezers”) used to manipulate particles are displayed in a transparent “overlay” window above the live camera image. The eye tracker identifies the position of the observed particle and uses it for autofocus. Laser trap manipulation navigated by the dominant hand can be combined with gesture recognition of the secondary hand. Speech command recognition is useful when both hands are busy. The proposed methods make manual control of the HRT more efficient, and they also provide a good platform for future semi-automated and fully automated operation.
    Workplace: Institute of Scientific Instruments
    Contact: Martina Šillerová, sillerova@ISIBrno.Cz, Tel.: 541 514 178
    Year of Publishing: 2015
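The annotation describes a pipeline in which NUI software translates events from hand-, gaze-, and speech-tracking sensors into control commands for the HRT. A minimal sketch of such a dispatcher is shown below; all class and method names (HRTController, NUIDispatcher, move_trap, etc.) are illustrative assumptions, not the authors' actual software or the real Leap Motion, MyGaze, or Tazti APIs.

```python
# Hypothetical sketch: routing multimodal NUI events to HRT control commands.
from dataclasses import dataclass, field


@dataclass
class HRTController:
    """Collects the commands that would be sent to the tweezers hardware."""
    log: list = field(default_factory=list)

    def move_trap(self, trap_id: int, x: float, y: float) -> None:
        # Reposition one holographic optical trap.
        self.log.append(("move_trap", trap_id, x, y))

    def autofocus_at(self, x: float, y: float) -> None:
        # Focus on the particle at the given image coordinates.
        self.log.append(("autofocus", x, y))

    def acquire_spectrum(self) -> None:
        # Trigger Raman spectrum acquisition.
        self.log.append(("acquire_spectrum",))


class NUIDispatcher:
    """Maps one handler to each input modality described in the annotation."""

    def __init__(self, hrt: HRTController) -> None:
        self.hrt = hrt

    def on_hand_move(self, trap_id: int, x: float, y: float) -> None:
        # Dominant-hand position (e.g. from a hand-tracking frame) steers a trap.
        self.hrt.move_trap(trap_id, x, y)

    def on_gaze(self, x: float, y: float) -> None:
        # Eye tracker supplies the observed particle's position for autofocus.
        self.hrt.autofocus_at(x, y)

    def on_speech(self, command: str) -> None:
        # Speech commands cover actions when both hands are busy.
        if command == "acquire":
            self.hrt.acquire_spectrum()


hrt = HRTController()
nui = NUIDispatcher(hrt)
nui.on_hand_move(0, 12.5, -3.0)
nui.on_gaze(10.0, 4.0)
nui.on_speech("acquire")
```

The separation of a sensor-facing dispatcher from a hardware-facing controller mirrors the annotation's description of standalone NUI software that merely sends commands to the HRT, which would make swapping sensors or adding modalities straightforward.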