GestureLaser and GestureLaser Car: Development of an embodied space to support remote instruction
dc.contributor.author | Yamazaki, Keiichi | |
dc.contributor.author | Yamazaki, Akiko | |
dc.contributor.author | Kuzuoka, Hideaki | |
dc.contributor.author | Oyama, Shinya | |
dc.contributor.author | Kato, Hiroshi | |
dc.contributor.author | Suzuki, Hideyuki | |
dc.contributor.author | Miki, Hiroyuki | |
dc.date.accessioned | 2017-04-15T11:51:10Z | |
dc.date.available | 2017-04-15T11:51:10Z | |
dc.date.issued | 1999 | |
dc.description.abstract | When designing systems that support remote instruction on physical tasks in the real world, one must consider four requirements: 1) participants must be able to take appropriate positions; 2) they must be able to see and show gestures; 3) they must be able to organize the arrangement of bodies and tools and gestural expression sequentially and interactively; and 4) the instructor must be able to give instructions to more than one operator at a time. GestureLaser and GestureLaser Car are systems we have developed in an attempt to satisfy these requirements. GestureLaser is a remote-controlled laser pointer that allows an instructor to show gestural expressions referring to real-world objects from a distance. GestureLaser Car is a remote-controlled vehicle on which the GestureLaser can be mounted. Experiments with this combination indicate that it satisfies the four requirements reasonably well and can be used effectively to give remote instruction. Following a comparison of the GestureLaser system with existing systems, some implications for the design of embodied spaces are described. | |
dc.identifier.doi | 10.1007/0-306-47316-X_13 | |
dc.identifier.isbn | 978-0-306-47316-6 | |
dc.language.iso | en | |
dc.publisher | Kluwer Academic Publishers, Dordrecht, The Netherlands | |
dc.relation.ispartof | ECSCW 1999: Proceedings of the Sixth European Conference on Computer Supported Cooperative Work | |
dc.relation.ispartofseries | ECSCW | |
dc.title | GestureLaser and GestureLaser Car: Development of an embodied space to support remote instruction | |
dc.type | Text | |
gi.citation.endPage | 258 | |
gi.citation.startPage | 239 | |
gi.citations.count | 5 | |
gi.citations.element | Takeshi Tsujimura, Yoshihiro Minato, Kiyotaka Izumi (2013): Shape recognition of laser beam trace for human–robot interface, In: Pattern Recognition Letters 15(34), doi:10.1016/j.patrec.2013.03.023 | |
gi.citations.element | Ryotaro Kuriya, Takeshi Tsujimura, Kiyotaka Izumi (2015): Augmented reality robot navigation using infrared marker, In: 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), doi:10.1109/roman.2015.7333607 | |
gi.citations.element | Nobuchika Sakata, Yuuki Takano, Shogo Nishida (2014): Remote Collaboration with Spatial AR Support, In: Lecture Notes in Computer Science, doi:10.1007/978-3-319-07230-2_15 | |
gi.citations.element | Nobuchika Sakata, Tomoyuki Kobayashi, Shogo Nishida (2013): Communication Analysis of Remote Collaboration System with Arm Scaling Function, In: Lecture Notes in Computer Science, doi:10.1007/978-3-642-39330-3_40 | |
gi.citations.element | Takeshi Tsujimura, Kiyotaka Izumi (2016): Active spatial interface projecting luminescent augmented reality marker, In: 2016 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), doi:10.1109/mfi.2016.7849460 | |
gi.conference.date | 12–16 September 1999 | |
gi.conference.location | Copenhagen, Denmark | |
gi.conference.sessiontitle | Full Papers |