Visual servoing considering sensing dynamics and robot dynamics

Cong Wang, Chung Y. Lin, Masayoshi Tomizuka

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Scopus citations


Real-time visual servoing is necessary yet challenging for many desirable applications of vision-guided industrial robots. The difficulty comes from the limited sampling rate and response time of the machine vision systems typically equipped on industrial robots; these factors are referred to as the dynamics of visual sensing. In addition, robot dynamics should be fully considered when designing the control law. Addressing both aspects, this paper presents a control scheme for visual servoing. A dual-rate adaptive tracking filter is presented to compensate for the visual sensing dynamics. Based on the compensated vision feedback, the techniques of multi-surface sliding control and dynamic surface control are used to formulate a two-layer control law for target tracking, in which the system kinematics and dynamics are decoupled and handled by the two layers respectively. The proposed method is validated through experiments on a SCARA robot.
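The dual-rate idea in the abstract can be illustrated with a small sketch. This is not the authors' implementation: it is a plain dual-rate Kalman tracking filter with a constant-velocity target model, where prediction runs at a fast control rate and correction happens only when a slow vision sample arrives, compensating for the low sampling rate of visual sensing. All rates, noise covariances, and the model itself are illustrative assumptions (the paper's filter is additionally adaptive).

```python
import numpy as np

def make_cv_model(dt):
    """Constant-velocity model for state [position, velocity] (assumed)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])          # state transition
    Q = np.array([[dt**4 / 4, dt**3 / 2],
                  [dt**3 / 2, dt**2]]) * 1e-2       # process noise (assumed)
    return F, Q

def dual_rate_track(vision_samples, ctrl_dt=1e-3, vision_dt=1 / 30):
    """Predict at the control rate; correct at the (slower) vision rate."""
    H = np.array([[1.0, 0.0]])   # vision measures position only
    R = np.array([[1e-4]])       # measurement noise (assumed)
    x = np.zeros((2, 1))
    P = np.eye(2)
    F, Q = make_cv_model(ctrl_dt)
    steps_per_frame = int(round(vision_dt / ctrl_dt))
    estimates = []
    for z in vision_samples:
        # Fast loop: time updates at the control rate between vision frames,
        # so the controller always has a fresh state estimate.
        for _ in range(steps_per_frame):
            x = F @ x
            P = F @ P @ F.T + Q
            estimates.append(x.copy())
        # Slow loop: measurement update when a new vision frame arrives.
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    return x, estimates

# Track a target moving at 0.5 m/s, observed by a 30 Hz camera for 1 s.
t = np.arange(1, 31) / 30
x_final, est = dual_rate_track(0.5 * t)
```

Between frames the filter extrapolates the target motion, which is what lets the fast control layer run on vision feedback that arrives an order of magnitude more slowly.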

Original language: English (US)
Title of host publication: 6th IFAC Symposium on Mechatronic Systems, MECH 2013
Publisher: IFAC Secretariat
Number of pages: 8
ISBN (Print): 9783902823311
State: Published - 2013
Externally published: Yes
Event: 6th IFAC Symposium on Mechatronic Systems, MECH 2013 - Hangzhou, China
Duration: Apr 10 2013 - Apr 12 2013

Publication series

Name: IFAC Proceedings Volumes (IFAC-PapersOnline)
ISSN (Print): 1474-6670



All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering


Keywords

  • Adaptive Kalman filter
  • Dynamic surface control
  • Multi-surface sliding control
  • Tracking filter
  • Visual sensing dynamics
  • Visual servoing


