---
layout: default
title: team JSK Web page
---
We will participate in OPL with two robots, HSR and AERO, customized with our own sensor boards and other components.
| HSR | AERO |
| --- | --- |
We have recently developed a new self-made sensor board as the culmination of our laboratory's previous research on creating and using sensor boards [1,2]. The board is equipped with proximity sensors, a force sensor, a microphone, and an IMU, and we expect it to contribute greatly to improving the robots' manipulation skills in a variety of tasks.
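As a rough illustration of how such multimodal fingertip sensing can support manipulation, the sketch below classifies a simple grasp phase from one board reading. All class names, field names, and thresholds here are hypothetical, not the actual board's interface:

```python
from dataclasses import dataclass

# Hypothetical reading from a fingertip sensor board; the field names and
# units are illustrative only and do not reflect the real board's API.
@dataclass
class BoardReading:
    proximity_m: float   # distance to the nearest surface, in meters
    force_n: float       # normal force on the fingertip, in newtons

def grasp_phase(reading: BoardReading,
                near_thresh_m: float = 0.02,
                contact_thresh_n: float = 0.5) -> str:
    """Classify a simple grasp phase from a single reading.

    The proximity sensor signals an approaching object before contact,
    while the force sensor confirms that contact has been made.
    """
    if reading.force_n >= contact_thresh_n:
        return "contact"
    if reading.proximity_m <= near_thresh_m:
        return "pre-contact"
    return "approach"
```

Combining pre-contact proximity with post-contact force in this way is one reason an all-around-sensing fingertip can help with grasping in cluttered or occluded spaces [1].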
Life support robots that coexist with humans must be able to interpret verbal instructions and perform the tasks humans want done in real environments. In addition, just as humans use language to acquire concepts and to carry out advanced reasoning, robots should be able to use language to approximate abilities that humans take for granted but that remain difficult for robots.
Robots must also collect data while performing daily life support tasks, and learn from that data to improve task execution. This is a very challenging problem. Since learning elements come in two types, global elements that are effective across multiple environments and local elements tied to the unique circumstances of a particular environment, we believe it is important to cleanly separate the two and improve the system at each level.
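The separation described above can be sketched as a small data structure in which per-environment ("local") learned parameters shadow environment-independent ("global") ones. This is a minimal illustrative sketch; the class and parameter names are hypothetical and do not describe an actual JSK system:

```python
# Hypothetical store that keeps globally effective learned parameters
# separate from per-environment overrides.
class SkillStore:
    def __init__(self):
        self.global_params = {}   # effective across multiple environments
        self.local_params = {}    # overrides keyed by environment id

    def update(self, key, value, env=None):
        """Record a learned value; pass env to make it environment-local."""
        if env is None:
            self.global_params[key] = value
        else:
            self.local_params.setdefault(env, {})[key] = value

    def lookup(self, key, env):
        """A value learned locally for this environment shadows the global one."""
        local = self.local_params.get(env, {})
        return local.get(key, self.global_params.get(key))
```

Keeping the two layers separate lets globally useful improvements transfer to new environments while environment-specific quirks stay contained, in the spirit of maintaining local rules via long-term memory [4].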
[1] Naoya Yamaguchi, Shun Hasegawa, Masaki Murooka, Kei Okada, and Masayuki Inaba. Selective grasp in occluded space by all-around proximity perceptible finger. Robotics and Autonomous Systems, 127:103464, 2020.
[2] Tasuku Makabe, Naoki Hiraoka, Shintaro Noda, Tomoki Anzai, Kohei Kimura, Mirai Hattori, Hiroya Sato, Fumihito Sugai, Yohei Kakiuchi, Kei Okada, et al. Design and development for humanoid-vehicle transformer platform with plastic resin structure and distributed redundant sensors. In 2022 International Conference on Robotics and Automation (ICRA), pages 8526–8532. IEEE, 2022.
[3] Iori Yanaokura, Naoki Wake, Kazuhiro Sasabuchi, Riku Arakawa, Kei Okada, Jun Takamatsu, Masayuki Inaba, and Katsushi Ikeuchi. A multimodal learning-from-observation towards all-at-once robot teaching using task cohesion. In 2022 IEEE/SICE International Symposium on System Integration (SII), pages 367–374. IEEE, 2022.
[4] Yuki Furuta, Kei Okada, Yohei Kakiuchi, and Masayuki Inaba. An everyday robotic system that maintains local rules using semantic map based on long-term episodic memory. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 1–7. IEEE, 2018.
We have developed and use the following open-source robot software in our laboratory:
- [EusLisp](https://github.com/euslisp/EusLisp): An Integrated Programming System for the Research on Intelligent Robots
- [scikit-robot](https://github.com/iory/scikit-robot): A Flexible Framework for Robot Control in Python
- [jsk_recognition](https://github.com/jsk-ros-pkg/jsk_recognition): A Stack of Recognition Packages for ROS Used in the JSK Lab
- [jsk_visualization](https://github.com/jsk-ros-pkg/jsk_visualization): A Stack of Visualization Packages for ROS Used in the JSK Lab
- Naoaki Kanazawa
- Yoshiki Obinata
- Soonhyo Kim
- Iori Yanokura
- Shingo Kitagawa
- Kento Kawaharazuka
Although the team members have since changed and it is no longer a separate team, JSK participated in RoboCup@Home 2017 in Nagoya. The JSK team, called JSK@Home, advanced to the finals with a strong performance, scoring the highest of all teams in the speech and person recognition test.
Our team has not yet participated in a local RoboCup tournament; we will take part in the Japanese domestic tournament to be held in March 2023.