Neuromorphic Eye-in-Hand Visual Servoing

Rajkumar Muthusamy, Abdulla Ayyad, Mohamad Halwani, Dewald Swart, Dongming Gan, Lakmal Seneviratne, Yahya Zweiri

Research output: Contribution to journal › Article › peer-review

18 Scopus citations

Abstract

Robotic vision plays a major role in applications ranging from factory automation to service robots. However, conventional frame-based cameras limit continuous visual feedback due to their low sampling rate, poor performance in low-light conditions, and the redundant data they produce for real-time image processing, especially in high-speed tasks. Neuromorphic event-based vision is a recent technology that offers human-like vision capabilities, such as observing dynamic changes asynchronously at high temporal resolution (1 μs) with low latency and a wide dynamic range. In this paper, for the first time, we present a purely event-based visual servoing method using a neuromorphic camera in an eye-in-hand configuration for the grasping pipeline of a robotic manipulator. We devise three surface layers of active events to directly process the incoming event stream generated by relative motion. A purely event-based approach is used to detect corner features, localize them robustly using heatmaps, and generate virtual features for tracking and grasp alignment. Based on this visual feedback, the motion of the robot is controlled so that the upcoming event features converge to the desired events in spatio-temporal space. The controller switches its operation such that it explores the workspace, reaches the target object, and achieves a stable grasp. The event-based visual servoing (EBVS) method is comprehensively studied and validated experimentally using a commercial robot manipulator in an eye-in-hand configuration for both static and dynamic targets. Experimental results show superior performance of the EBVS method over frame-based vision, especially in high-speed operation and poor lighting conditions. As such, EBVS overcomes the motion blur, lighting, and exposure-timing issues that affect conventional frame-based visual servoing methods.
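The abstract describes the pipeline only at a high level. A minimal sketch of the two underlying ideas, assuming a simple timestamp-based surface of active events and a standard image-based visual servoing law v = -λ L⁺ (s - s*) on point features, might look as follows; the class and function names, the four-feature setup, and the fixed depth Z are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Surface of Active Events (SAE): per-pixel map of the latest event timestamp.
# A minimal assumed form; the paper uses three such surface layers, whose exact
# construction is not specified in the abstract.
class SurfaceOfActiveEvents:
    def __init__(self, height, width):
        self.sae = np.zeros((height, width), dtype=np.float64)

    def update(self, x, y, t):
        """Record the timestamp of the newest event at pixel (x, y)."""
        self.sae[y, x] = t

    def recent_mask(self, t_now, horizon=0.01):
        """Pixels that fired within the last `horizon` seconds (active events)."""
        return (t_now - self.sae) < horizon


# Classical image-based visual servoing (IBVS) on point features.
def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of one normalized point feature at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])


def ibvs_velocity(features, desired, Z, lam=0.5):
    """Camera twist v = -lambda * L^+ * (s - s*), stacking all point features."""
    L = np.vstack([interaction_matrix(x, y, Z) for x, y in features])
    error = (np.asarray(features) - np.asarray(desired)).reshape(-1)
    return -lam * np.linalg.pinv(L) @ error


# Illustrative usage: four tracked (virtual) features driven toward a centered square.
if __name__ == "__main__":
    current = [(-0.12, -0.08), (0.15, -0.09), (0.14, 0.11), (-0.11, 0.10)]
    desired = [(-0.10, -0.10), (0.10, -0.10), (0.10, 0.10), (-0.10, 0.10)]
    v = ibvs_velocity(current, desired, Z=0.5)
    print("camera twist [vx, vy, vz, wx, wy, wz]:", np.round(v, 4))
```

In an event-driven setting, the tracked features would be refreshed asynchronously from the event stream (e.g., via the SAE) rather than at a fixed frame rate, while the servoing law above remains the conventional point-feature formulation.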

Original language: British English
Article number: 9395430
Pages (from-to): 55853-55870
Number of pages: 18
Journal: IEEE Access
Volume: 9
DOIs
State: Published - 2021

Keywords

  • event camera
  • event-based visual servoing
  • neuromorphic vision sensor
  • neuromorphic vision-based robot control
  • pick and place task
  • robotic manipulator
  • robotic vision
  • vacuum gripper
