Robotic handling technologies have been commonplace in the manufacturing industries for years. Think of an automobile production line where manufactured parts are usually very consistent, structurally rigid, and have known geometries. All these attributes make robotic handling fairly straightforward.
But this is not the case in production environments made up mostly of non-uniform, naturally flexible parts like poultry products. As a result, automated handling of natural products lags far behind that of manufactured parts.
Here, researchers in the Georgia Tech Research Institute’s Agricultural Technology Research Program are exploring the development of better robotic imaging and sensing technologies and more dexterous grippers and manipulators for poultry handling tasks.
“We believe available commercial off-the-shelf hardware, particularly robot arms and grippers, is capable of properly handling and manipulating natural products,” says Dr. Ai-Ping Hu, project director and senior research engineer. However, hardware is just the beginning.
Hu and fellow researchers need to train the robotic arms and grippers to adjust to variable objects the way a human hand does. So, the team chose to demonstrate the processing task of loading bird front-halves from a stationary pile onto moving deboning cone lines. The first step was to observe human workers performing the task. The team learned that workers will often first lift a bird in an arbitrary way (to separate it from the pile), place it briefly on a flat surface (slightly re-orienting it), and then re-grab it in a standardized way before placing it onto the moving cone.
With knowledge of how humans perform the task, researchers then developed an experimental platform along with imaging and sensing algorithms. The main components include a six-degrees-of-freedom KUKA food-grade industrial robot, a three-finger adaptive robot gripper by Robotiq, and a Microsoft Kinect 3D sensor (see figure). The Robotiq gripper (hand) is affixed to the KUKA robot’s arm, while the Kinect sensor provides 3D recognition of the bird, first locating it within the pile and then determining how it sits in the gripper.
Next, using the experimental platform and an articulated 3D-printed bird front half — composed of a separate wing and body connected by a mechanical joint that simulates a bird shoulder — the team deconstructed the human approach into five steps:
1. Take a 3D image of a bird pile.
2. Use image processing to determine which wing should be grasped.
3. Use the robot gripper to approach and grasp the identified wing and lift the bird to an intermediate position.
4. Take another 3D image of the bird to determine its orientation in the robot gripper.
5. Determine the optimal path for the robot gripper to place the bird onto a moving cone facing a preferred direction.
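The five steps above form a sense-plan-act pipeline. The sketch below illustrates its structure in Python; every function name, heuristic, and data value here is a hypothetical stand-in for the team's actual imaging and planning algorithms, not their code.

```python
# Illustrative sketch of the five-step pick-and-place pipeline.
# All names and heuristics are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Grasp:
    wing: str        # which wing to grasp: "left" or "right"
    position: tuple  # approximate 3D coordinates of the wing (x, y, z)

def capture_depth_image(scene):
    """Step 1: stand-in for a Kinect 3D snapshot of the bird pile."""
    return scene  # a real system would return a depth frame or point cloud

def select_wing(depth_image):
    """Step 2: choose which wing to grasp from the depth image."""
    # Toy heuristic: pick the wing whose recorded height (z) is greatest,
    # i.e. the one nearest the top of the pile.
    wing, pos = max(depth_image.items(), key=lambda kv: kv[1][2])
    return Grasp(wing=wing, position=pos)

def lift_to_intermediate(grasp):
    """Step 3: grasp the chosen wing and lift to a staging pose."""
    x, y, _ = grasp.position
    return Grasp(wing=grasp.wing, position=(x, y, 0.5))  # fixed staging height

def estimate_orientation(grasp):
    """Step 4: re-image the bird to estimate its pose in the gripper."""
    return 0.0 if grasp.wing == "left" else 180.0  # toy yaw estimate, degrees

def plan_cone_placement(grasp, yaw):
    """Step 5: compute a placement motion so the bird faces the cone."""
    correction = (360.0 - yaw) % 360.0
    return {"rotate_deg": correction, "target": "moving cone"}

# Toy scene: two wings with (x, y, z) positions; the left wing sits higher.
scene = {"left": (0.20, 0.10, 0.30), "right": (0.25, 0.12, 0.22)}
grasp = select_wing(capture_depth_image(scene))
staged = lift_to_intermediate(grasp)
yaw = estimate_orientation(staged)
plan = plan_cone_placement(staged, yaw)
print(grasp.wing, plan["rotate_deg"])  # → left 0.0
```

In a real system, steps 2 and 4 would run genuine point-cloud processing and steps 3 and 5 would issue motion commands to the robot controller; the sketch only shows how the stages hand data to one another.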
The result is a dynamic motion imparted from the robot gripper, through the wing, to the bird’s body.
“Because a bird is not a rigid object, it won’t simply follow the movements of the robot gripper. The team’s robotic task is to design the gripper motion to act through the wing and shoulder joint to get the bird to move in a desired way onto the cone. As an analogy, we like to refer to the ball-in-cup game where a ball is attached to a cup by a string. The player has to move the cup to use the string to force the ball into it,” explains Hu.
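The ball-in-cup analogy can be made concrete with a minimal simulation: a point mass hanging from a moving pivot (the gripper) by a string of fixed length behaves like a pendulum whose pivot accelerates, so moving the pivot imparts a swing to the mass. This is purely illustrative and is not the team's dynamics model.

```python
# Minimal sketch of the ball-in-cup idea: a pendulum whose pivot
# (the gripper) accelerates horizontally. This is an illustration
# of indirect dynamic control, not the team's actual model.
import math

def simulate_swing(pivot_accel, length=0.3, steps=2000, dt=0.001, g=9.81):
    """Integrate a pendulum with a horizontally accelerating pivot.

    pivot_accel: constant horizontal pivot acceleration in m/s^2.
    Returns the final swing angle in radians (0 = hanging straight down).
    """
    theta, omega = 0.0, 0.0
    for _ in range(steps):
        # Pivot acceleration enters the equation of motion like a
        # sideways component of gravity acting on the hanging mass.
        alpha = (-g * math.sin(theta) - pivot_accel * math.cos(theta)) / length
        omega += alpha * dt
        theta += omega * dt
    return theta

# A stationary pivot leaves the mass hanging; an accelerating pivot swings it.
print(simulate_swing(0.0))        # → 0.0
print(simulate_swing(2.0) != 0.0) # → True
```

Designing the gripper motion then amounts to choosing the pivot trajectory so the hanging load (the bird's body, coupled through the wing and shoulder joint) ends up where it should, just as the cup's motion steers the ball.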
The team is using the articulated 3D-printed bird to research and design robot motions and plans to begin testing on real birds next year. The intention is that the articulated bird will capture the important dynamic features of real birds, even though it is not a perfect model.
Hu believes that, if the approach succeeds with real birds, the robot technology will find numerous applications in the food processing and agricultural sectors, where interaction with naturally flexible, non-uniform products is an everyday challenge.