GestureLaser and GestureLaser Car: Development of an embodied space to support remote instruction

Publisher

Kluwer Academic Publishers, Dordrecht, The Netherlands

Abstract

When designing systems that support remote instruction on physical tasks in the real world, one must consider four requirements: 1) participants must be able to take appropriate positions; 2) they must be able to see and show gestures; 3) they must be able to organize the arrangement of bodies, tools, and gestural expression sequentially and interactively; and 4) the instructor must be able to give instructions to more than one operator at a time. GestureLaser and GestureLaser Car are systems we have developed in an attempt to satisfy these requirements. GestureLaser is a remote-controlled laser pointer that allows an instructor to show gestural expressions referring to real-world objects from a distance. GestureLaser Car is a remote-controlled vehicle on which the GestureLaser can be mounted. Experiments with this combination indicate that it satisfies the four requirements reasonably well and can be used effectively to give remote instruction. Following a comparison of the GestureLaser system with existing systems, some implications for the design of embodied spaces are described.

Description

Yamazaki, Keiichi; Yamazaki, Akiko; Kuzuoka, Hideaki; Oyama, Shinya; Kato, Hiroshi; Suzuki, Hideyuki; Miki, Hiroyuki (1999): GestureLaser and GestureLaser Car: Development of an embodied space to support remote instruction. ECSCW 1999: Proceedings of the Sixth European Conference on Computer Supported Cooperative Work. DOI: 10.1007/0-306-47316-X_13. Kluwer Academic Publishers, Dordrecht, The Netherlands. ISBN: 978-0-306-47316-6. pp. 239-258. Full Papers. Copenhagen, Denmark. 12–16 September 1999


Number of citations to item: 5

  • Takeshi Tsujimura, Yoshihiro Minato, Kiyotaka Izumi (2013): Shape recognition of laser beam trace for human–robot interface, In: Pattern Recognition Letters 34(15), doi:10.1016/j.patrec.2013.03.023
  • Ryotaro Kuriya, Takeshi Tsujimura, Kiyotaka Izumi (2015): Augmented reality robot navigation using infrared marker, In: 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), doi:10.1109/roman.2015.7333607
  • Nobuchika Sakata, Yuuki Takano, Shogo Nishida (2014): Remote Collaboration with Spatial AR Support, In: Lecture Notes in Computer Science, doi:10.1007/978-3-319-07230-2_15
  • Nobuchika Sakata, Tomoyuki Kobayashi, Shogo Nishida (2013): Communication Analysis of Remote Collaboration System with Arm Scaling Function, In: Lecture Notes in Computer Science, doi:10.1007/978-3-642-39330-3_40
  • Takeshi Tsujimura, Kiyotaka Izumi (2016): Active spatial interface projecting luminescent augmented reality marker, In: 2016 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), doi:10.1109/mfi.2016.7849460
Please note: Providing information about citations is only possible thanks to the open metadata APIs provided by crossref.org and opencitations.net. These lists may be incomplete due to unavailable citation data. Source: opencitations.net, crossref.org