TY - GEN
T1 - Target following with a vision sway compensation for robotic fish Fibo
AU - Na, Ki In
AU - Jeong, In Bae
AU - Han, Seungbeom
AU - Kim, Jong Hwan
PY - 2011
Y1 - 2011
N2 - Since the heading direction of a robotic fish continuously changes while it is swimming, surrounding information may not be easily obtained from a front camera mounted on the robotic fish's head. It is also difficult to distinguish the heading direction, i.e., the direction of the robotic fish's head, from the swimming direction, i.e., the direction in which the robotic fish is moving. Therefore, this paper proposes a gyro sensor-based vision sway compensation module to obtain surrounding information from a camera mounted on a swaying robotic fish's head. First, the time at which the heading direction and the swimming direction coincide is detected. At this time, the swimming direction image is periodically captured and updated using a gyro sensor and a front camera. Then, the input images are continuously transformed to their relative locations. This transformation is performed by image stitching, which consists of speeded up robust features (SURF) and random sample consensus (RANSAC), comparing the input images with the captured swimming direction image. With the developed module, it is possible to obtain surrounding information from the transformed input images as well as the relative location of the target object. The effectiveness of the proposed gyro sensor-based vision sway compensation module is demonstrated through a real target-following experiment using the robotic fish "Fibo II," developed in the RIT Lab, KAIST.
AB - Since the heading direction of a robotic fish continuously changes while it is swimming, surrounding information may not be easily obtained from a front camera mounted on the robotic fish's head. It is also difficult to distinguish the heading direction, i.e., the direction of the robotic fish's head, from the swimming direction, i.e., the direction in which the robotic fish is moving. Therefore, this paper proposes a gyro sensor-based vision sway compensation module to obtain surrounding information from a camera mounted on a swaying robotic fish's head. First, the time at which the heading direction and the swimming direction coincide is detected. At this time, the swimming direction image is periodically captured and updated using a gyro sensor and a front camera. Then, the input images are continuously transformed to their relative locations. This transformation is performed by image stitching, which consists of speeded up robust features (SURF) and random sample consensus (RANSAC), comparing the input images with the captured swimming direction image. With the developed module, it is possible to obtain surrounding information from the transformed input images as well as the relative location of the target object. The effectiveness of the proposed gyro sensor-based vision sway compensation module is demonstrated through a real target-following experiment using the robotic fish "Fibo II," developed in the RIT Lab, KAIST.
UR - https://www.scopus.com/pages/publications/84860745396
U2 - 10.1109/ROBIO.2011.6181604
DO - 10.1109/ROBIO.2011.6181604
M3 - Conference contribution
AN - SCOPUS:84860745396
SN - 9781457721373
T3 - 2011 IEEE International Conference on Robotics and Biomimetics, ROBIO 2011
SP - 2114
EP - 2119
BT - 2011 IEEE International Conference on Robotics and Biomimetics, ROBIO 2011
T2 - 2011 IEEE International Conference on Robotics and Biomimetics, ROBIO 2011
Y2 - 7 December 2011 through 11 December 2011
ER -