Haar Feature Object Recognition and Manipulation

December 20, 2016

This project is "Image-Processing-Based Object Recognition and Manipulation with a 5-DOF Smart Robotic Arm through a Smartphone Interface Using Human Intent Sensing"; its image processing pipeline combines a Haar cascade classifier with an affine transformation.

This project developed an image-processing-based object recognition and manipulation system for a 5-DOF smart robotic arm, operated through a smartphone interface and informed by sensing of the human user's intent. The system consists of a camera, two microcontrollers, a mobile phone, an iOS simulator running on a computer, and the robotic arm. The user selects the object to be manipulated through the computer or mobile phone interface; the system then recognizes the targeted object in the video frames captured by the camera, using a separate classifier for each object type, and measures the distance between the object and the manipulator. After image processing, the computer or mobile phone sends the position information to the Raspberry Pi, which applies the inverse kinematics equations to compute the angle of each joint of the robotic arm and forwards these angles to the ArbotiX-M, which converts the angular positions into electrical signals that drive the robot. In addition, the robotic arm can pick and place an object when triggered by the user's walking steps, measured by the accelerometer in the mobile phone: a step count exceeding a threshold indicates the user's intent to receive service from the robot.
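Object recognition in the video stream is built on Haar cascade classifiers. Below is a minimal OpenCV sketch of how one target object might be detected per frame; the cascade file name, camera index, and detector parameters are illustrative assumptions, and the mapping from the detected pixel location to a physical distance (via the affine transformation from camera calibration) is only indicated by a comment.

```python
import cv2

# Assumption: a Haar cascade XML trained for the target object (one file per object class).
CASCADE_PATH = "target_object_cascade.xml"

cascade = cv2.CascadeClassifier(CASCADE_PATH)
cap = cv2.VideoCapture(0)  # camera providing the video frames

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect instances of the target object; parameters are illustrative defaults.
    detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in detections:
        cx, cy = x + w // 2, y + h // 2  # object center in pixel coordinates
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # The pixel offset of (cx, cy) relative to the manipulator's reference point
        # would then be converted to a physical distance using the affine
        # transformation obtained from camera calibration.
    cv2.imshow("detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # quit with 'q'
        break

cap.release()
cv2.destroyAllWindows()
```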
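On the Raspberry Pi, the inverse kinematics step turns the target position into joint angles. The full 5-DOF solution for this arm is not reproduced here; the sketch below only illustrates the idea on a two-link planar arm with hypothetical link lengths.

```python
import math

# Hypothetical link lengths in centimeters; the real 5-DOF arm's geometry differs.
L1, L2 = 15.0, 12.0

def planar_ik(x, y):
    """Two-link planar inverse kinematics (one of the two possible solutions).

    Returns the shoulder and elbow angles, in radians, that place the
    end of the second link at the point (x, y) in the arm's plane.
    """
    r2 = x * x + y * y
    cos_elbow = (r2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))  # clamp against numerical error
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

# Example: reach a point 20 cm forward and 10 cm above the shoulder joint.
print(planar_ik(20.0, 10.0))
```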
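The Raspberry Pi then hands the computed joint angles to the ArbotiX-M. The actual link between the two boards is not described here; the sketch below merely assumes a pyserial connection and a simple comma-separated message format, both of which are assumptions rather than the project's real protocol.

```python
import serial  # pyserial

# Assumed serial connection settings; the actual port and baud rate depend on the wiring.
PORT = "/dev/ttyUSB0"
BAUD = 115200

def send_joint_angles(angles_deg):
    """Send five joint angles (in degrees) as one comma-separated, newline-terminated line."""
    line = ",".join(f"{a:.1f}" for a in angles_deg) + "\n"
    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        ser.write(line.encode("ascii"))

if __name__ == "__main__":
    # Example: a hypothetical joint configuration for the 5-DOF arm.
    send_joint_angles([90.0, 45.0, 30.0, 60.0, 0.0])
```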
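Finally, the intent-sensing trigger counts the user's walking steps from the phone's accelerometer and fires once a threshold is exceeded. The sketch below is a generic threshold-crossing step counter over accelerometer magnitudes; the threshold and the trigger count are assumed values, not the app's actual parameters.

```python
import math

STEP_THRESHOLD = 1.2   # assumed acceleration-magnitude threshold, in g
TRIGGER_STEPS = 10     # assumed number of steps that signals the user's intent

def count_steps(samples):
    """Count steps as upward crossings of the magnitude threshold.

    `samples` is an iterable of (ax, ay, az) accelerometer readings in g.
    """
    steps = 0
    above = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > STEP_THRESHOLD and not above:
            steps += 1
            above = True
        elif mag < STEP_THRESHOLD:
            above = False
    return steps

# Illustrative readings: two spikes above the threshold, i.e. two counted steps.
readings = [(0.0, 0.0, 1.0), (0.3, 0.2, 1.4), (0.0, 0.0, 1.0),
            (0.2, 0.3, 1.5), (0.0, 0.0, 1.0)]
if count_steps(readings) > TRIGGER_STEPS:
    pass  # here the robot's pick-and-place routine would be triggered
```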
