LI Jinjian, HU Quan. Visual Servo of Space Manipulators Driven by Event Camera[J]. Aerospace Control and Application, 2023, 49(3): 28-35. DOI: 10.3969/j.issn.1674-1579.2023.03.004

Visual Servo of Space Manipulators Driven by Event Camera

  • Event cameras are a novel type of neuromorphic visual sensor offering high temporal resolution and a wide dynamic range, which makes them well suited to computer vision and robotics. This article addresses the high measurement delay and large data volume of the traditional visual systems used on space manipulators by exploring perception and control methods based on event cameras. Because the asynchronous event stream produced by an event camera differs fundamentally from the frame output of a traditional camera, new algorithms designed specifically for event data are required. This paper presents a fast circular-feature detection and tracking algorithm based on iteratively reweighted fitting. Building on this algorithm, a visual servo method for manipulators adapted to circular features is proposed. Experimental results show that the proposed detection and tracking algorithm achieves a high success rate and significantly faster detection than traditional algorithms, while the event camera's high-speed feature feedback enables more precise servo motion of the manipulator.