ABSTRACT
This paper presents an effective approach to estimating the fixed internal and varying external camera parameters in real-time experiments using 2D-3D point correspondences. An image is acquired at each time step, and a pose estimation algorithm then determines the camera pose with respect to the object. Because the camera is mounted on the tool in an eye-in-hand configuration, a simple homogeneous transformation between the camera and the end-effector yields the position of the manipulator end-effector. The paper focuses on determining the pose accurately and on the issues that arise in real time. The contribution of this paper is twofold: first, the camera pose parameters are recovered easily and accurately from 2D-3D point correspondences; second, experiments on real images are conducted and show good results.
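The transformation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the matrix names, the 4x4 layout, and the numeric offsets are assumptions introduced for the example. Once the camera pose with respect to the object has been recovered from the 2D-3D correspondences, the end-effector pose follows from a single homogeneous transform composition.

```python
import numpy as np

def homogeneous(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical values: the camera pose w.r.t. the object (output of the
# pose estimation step) and the fixed camera-to-end-effector transform
# (known from the eye-in-hand mounting).
T_obj_cam = homogeneous(np.eye(3), [0.0, 0.0, 0.5])   # camera 0.5 m along the object z-axis
T_cam_ee  = homogeneous(np.eye(3), [0.0, 0.02, 0.0])  # small fixed offset on the tool

# The end-effector pose w.r.t. the object is one matrix product.
T_obj_ee = T_obj_cam @ T_cam_ee
```

Because the camera-to-end-effector transform is fixed, it is computed once offline; at each time step only the camera pose changes and the product above is re-evaluated.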
Index Terms
- Vision-based system for camera tracking in eye-in-hand configuration