A helping hand for the design of robotic manipulators


Typically, a robotics expert can spend months designing a custom manipulator manually, mostly through trial and error. Each iteration may require new parts that must be designed and tested from scratch. In contrast, this new pipeline requires no manual assembly or specialized knowledge.

Much like building with digital LEGOs, a designer uses the interface to assemble a robotic manipulator from a set of modular components that are guaranteed to be manufacturable. The user can adjust the palm and fingers of the robotic hand, adapting it to a specific task, and then easily integrate touch sensors into the final design.

Once the design is complete, the software automatically generates the 3D printing and machine knitting files needed to manufacture the manipulator. The touch sensors are incorporated into a knitted glove that fits snugly over the robotic hand. These sensors allow the manipulator to perform complex tasks, such as picking up delicate objects or using tools.

“One of the coolest things about this pipeline is that it makes design accessible to a general audience. Rather than spending months or years working on a design and investing a lot of money in prototypes, you can have a working prototype in minutes,” says lead author Lara Zlokapa, who will be graduating with her master’s degree in mechanical engineering this spring.

Joining Zlokapa on the paper are her advisers: Pulkit Agrawal, a professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL), and Wojciech Matusik, a professor of electrical engineering and computer science. Other co-authors include CSAIL graduate students Yiyue Luo and Jie Xu, mechanical engineer Michael Foshey, and Kui Wu, a principal researcher at Tencent America. The research is being presented at the International Conference on Robotics and Automation.

Thinking about modularity
Before starting work on the pipeline, Zlokapa paused to consider the concept of modularity. She wanted to create enough components that users could mix and match them flexibly, but not so many that users would be overwhelmed with choices.

She thought creatively about component functions, rather than shapes, and came up with around 15 parts that can be combined to create trillions of unique manipulators.

The researchers then focused on creating an intuitive interface in which the user mixes and matches components in a 3D design space. A set of production rules, known as a graph grammar, controls how users can combine parts based on how each component, such as a joint or a finger, fits together with the others.

“If we think of this as a LEGO kit where you have different building blocks that you can put together, then the grammar might be something like ‘red blocks can only go on blue blocks’ and ‘blue blocks can’t go on top of green blocks.’ Graph grammar is what allows us to make sure that every design is valid, which means it makes physical sense and you can make it,” she explains.
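To make the idea concrete, here is a minimal sketch of how such grammar rules could be checked in code. The component names and the rule table below are illustrative assumptions, not the actual grammar from the paper:

```python
# Hypothetical graph-grammar-style validity check, in the spirit of
# "red blocks can only go on blue blocks". The component types and
# allowed attachments here are invented for illustration.

# Allowed child types for each parent component type (assumed rule set).
RULES = {
    "palm": {"joint"},               # a finger must start with a joint on the palm
    "joint": {"link", "fingertip"},
    "link": {"joint", "fingertip"},
    "fingertip": set(),              # fingertips are terminal
}

def is_valid(design):
    """Check that every parent->child edge in a design obeys the rules.

    `design` is a list of (parent_type, child_type) edges.
    """
    return all(child in RULES.get(parent, set()) for parent, child in design)

# A two-segment finger: palm -> joint -> link -> fingertip
print(is_valid([("palm", "joint"), ("joint", "link"), ("link", "fingertip")]))  # True
# Invalid: a link attached directly to the palm
print(is_valid([("palm", "link")]))  # False
```

In this toy version, a design is valid only if every connection appears in the rule table, which is what guarantees that every assembled structure "makes physical sense."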

Once the user has created the structure of the manipulator, they can deform the components to customize it for a specific task. For example, the manipulator may need fingers with finer tips to manipulate office scissors or curved fingers capable of gripping bottles.

During this deformation step, the software surrounds each component with a digital cage. Users stretch or bend the components by dragging the corners of each cage. The system automatically limits these movements to ensure that parts always connect properly and the finished design remains manufacturable.
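The cage idea can be sketched with a simple interpolation: each point of a component is expressed in the cage's local coordinates, so dragging the cage corners moves the point with them. The 2D bilinear version below is an illustrative simplification (the actual system works on 3D components):

```python
# Illustrative cage-based deformation (2D bilinear case, for brevity).
# A point is stored in cage coordinates (u, v) in [0,1]^2 and mapped to
# world space by interpolating the four (possibly dragged) cage corners.

def deform(point_uv, corners):
    """Map cage coordinates (u, v) to world space.

    `corners` is [p00, p10, p01, p11], each an (x, y) pair.
    """
    u, v = point_uv
    (x00, y00), (x10, y10), (x01, y01), (x11, y11) = corners
    x = (1-u)*(1-v)*x00 + u*(1-v)*x10 + (1-u)*v*x01 + u*v*x11
    y = (1-u)*(1-v)*y00 + u*(1-v)*y10 + (1-u)*v*y01 + u*v*y11
    return (x, y)

# Undeformed unit cage: the point stays put.
print(deform((0.5, 0.5), [(0, 0), (1, 0), (0, 1), (1, 1)]))  # (0.5, 0.5)
# Stretch the cage to twice its width: the point shifts with it.
print(deform((0.5, 0.5), [(0, 0), (2, 0), (0, 1), (2, 1)]))  # (1.0, 0.5)
```

Limiting how far the corners may be dragged, as the software does, keeps every deformed part attachable and printable.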

Fit like a glove
After personalization, the user identifies the locations of the touch sensors. These sensors are embedded in a knitted glove that fits securely around the 3D printed robotic manipulator. The glove is made of two layers of fabric, one that contains horizontal piezoelectric fibers and another with vertical fibers. The piezoelectric material produces an electrical signal when pressed. Touch sensors are formed at the intersection of the horizontal and vertical piezoelectric fibers; they convert pressure stimuli into electrical signals that can be measured.
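A rough sketch of how such a crossed-fiber grid localizes a touch: a press raises the signal on one horizontal fiber (row) and one vertical fiber (column), and intersecting the active lines gives the sensor position. The function and threshold below are assumptions for illustration, not the authors' readout code:

```python
# Illustrative readout for a crossed-fiber piezoelectric grid: each touch
# sensor sits where a horizontal fiber crosses a vertical one, so a press
# is localized by intersecting the active row and column signals.

def locate_presses(row_signals, col_signals, threshold=0.5):
    """Return (row, col) pairs where both fiber signals exceed `threshold`."""
    active_rows = [r for r, v in enumerate(row_signals) if v > threshold]
    active_cols = [c for c, v in enumerate(col_signals) if v > threshold]
    return [(r, c) for r in active_rows for c in active_cols]

# A press near row 2, column 1 of a 4x3 grid:
print(locate_presses([0.0, 0.1, 0.9, 0.0], [0.05, 0.8, 0.1]))  # [(2, 1)]
```

This row-plus-column scheme is what lets a grid of R + C fiber lines address R × C distinct sensing points.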

“We used gloves because they’re easy to fit, easy to replace, and easy to remove if we need to fix anything inside,” Zlokapa says.

Additionally, a glove lets the user cover the entire hand with touch sensors, rather than embedding them only in the palm or fingertips, as is the case with those other robotic manipulators that have touch sensors at all.

Once the design interface was complete, the researchers produced custom manipulators for four complex tasks: picking up an egg, cutting paper with scissors, pouring water from a bottle, and screwing on a wing nut. The wing nut manipulator, for example, had an elongated, offset finger, which prevented the finger from colliding with the nut as it turned. This successful design required only two iterations.

The egg-gripping manipulator never broke or dropped an egg during testing, and the paper-cutting manipulator could use a wider range of scissors than any existing robotic hand the team could find in the literature.

But while testing the manipulators, the researchers discovered that the sensors produced a lot of noise, due to the uneven weave of the knitted fibers, which impaired their accuracy. They are now working on more reliable sensors that could improve the manipulator's performance.

The researchers also want to explore the use of additional automation. Since graph grammar rules are written in a way that a computer can understand, algorithms could search the design space to determine optimal configurations for a task-specific robotic hand. With autonomous manufacturing, the entire prototyping process could be done without human intervention, Zlokapa says.

“Now that we have a way for a computer to explore this design space, we can work on answering the question, ‘Is the human hand the optimal shape for performing everyday tasks?’ Maybe there is a better shape? Or maybe we want more or fewer fingers, or fingers pointing in different directions? This research doesn’t fully answer that question, but it is a step in that direction,” she says.

This work was supported, in part, by the Toyota Research Institute, the Defense Advanced Research Projects Agency, and an Amazon Robotics Research Award.
