Farm Robots Navigate With SonicBoom’s Sound-Based Sensing

This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

Agricultural robots could help farmers harvest food under tough environmental conditions, especially as temperatures continue to rise. However, creating affordable robotic arms that can gracefully and accurately navigate the thick network of branches and trunks of plants can be challenging.

In a recent study, researchers developed a sensing system called SonicBoom, which allows autonomous robots to use sound to sense the objects they touch. The approach, which localizes or “feels” the objects a robot encounters with centimeter-level precision, is described in a study published 2 June in IEEE Robotics and Automation Letters.

Moonyoung (Mark) Lee is a fifth-year Ph.D. student at Carnegie Mellon University’s Robotics Institute who was involved in developing SonicBoom. He notes that many autonomous robots currently rely on a collection of tiny camera-based tactile sensors. Minicameras beneath a protective gel pack that lines the robot’s surface let the sensors visually estimate the gel’s deformation to gain tactile information. However, this approach isn’t ideal in agricultural settings, where branches are likely to occlude the visual sensors. What’s more, camera-based sensors can be expensive and could be easily damaged in this context.

Another option is pressure sensors, Lee notes, but these would need to cover much of the surface area of the robot in order to effectively sense when it comes into contact with branches. “Imagine covering the entire robot arm surface with that kind of [sensor]. It would be expensive,” he says.

Instead, Lee and his colleagues are proposing a completely different approach that relies on sound for sensing. The system involves an array of contact microphones, which detect physical touch as sound signals that propagate through solid materials.

How Does SonicBoom Work?

When a robotic arm touches a branch, the resulting sound waves travel down the robotic arm until they encounter the array of contact microphones. Tiny differences in sound-wave properties (such as signal intensity and phase) across the array of microphones are used to localize where the sound originated, and thus the point of contact.
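
The study’s localization is learned from data rather than computed in closed form, but the underlying intuition can be sketched with a toy amplitude-based estimate: microphones nearer the tap pick up a stronger signal, so their positions should count for more. The microphone positions, the number of microphones, and the weighting scheme below are illustrative assumptions, not values from the paper.

```python
# Toy sketch of amplitude-based contact localization along a robot arm.
# Assumed values throughout; the actual SonicBoom system learns this mapping.
import numpy as np

# Assumed microphone positions along the arm, in centimeters from the base.
MIC_POSITIONS_CM = np.array([5.0, 20.0, 35.0, 50.0])

def estimate_contact_cm(rms_amplitudes: np.ndarray) -> float:
    """Estimate the contact point as an amplitude-weighted average of
    microphone positions: mics closer to the tap receive stronger signals,
    so they pull the estimate toward themselves."""
    weights = rms_amplitudes / rms_amplitudes.sum()
    return float(np.dot(weights, MIC_POSITIONS_CM))

# Example reading in which the tap lands nearest the second microphone.
print(estimate_contact_cm(np.array([0.2, 0.9, 0.4, 0.1])))
```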

In this video, see SonicBoom in action during laboratory testing.

Lee notes that this approach allows microphones to be embedded deeper in the robotic arm. This means they are less prone to damage compared to traditional visual sensors on the exterior of a robotic arm. “The contact microphones can be easily protected from very harsh, abrasive contacts,” he explains.

The approach also needs only a small handful of microphones dispersed along the robotic arm, rather than the many visual or pressure sensors that would have to densely coat its surface.

To help SonicBoom better localize points of contact, the researchers used an AI model trained on data collected by tapping the robotic arm more than 18,000 times with a wooden rod. As a result, SonicBoom was able to localize contact on the robotic arm with an average error of just 0.43 centimeters for objects it was trained to detect. It could also localize contact with novel objects, for example ones made of plastic or aluminum, with an error of 2.22 cm.
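
As a rough illustration of that training setup, the sketch below pairs audio clips from a small microphone array with known tap positions and fits an off-the-shelf regressor on simple spectral features. The feature choice, model, array size, and placeholder data are all assumptions standing in for, rather than reproducing, the pipeline described in the paper.

```python
# Hedged sketch: supervised regression from multi-microphone audio clips to
# the contact position along the arm. All shapes, features, and the model
# are assumptions for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor

NUM_MICS, NUM_SAMPLES = 4, 1024  # assumed array size and clip length

def features(clips: np.ndarray) -> np.ndarray:
    """Per-microphone log-magnitude spectra, flattened into one feature
    vector per tap (clips has shape [num_taps, NUM_MICS, NUM_SAMPLES])."""
    spectra = np.abs(np.fft.rfft(clips, axis=-1))
    return np.log1p(spectra).reshape(clips.shape[0], -1)

# Placeholder random data standing in for the ~18,000 recorded taps and
# their measured contact positions (in centimeters along the arm).
rng = np.random.default_rng(0)
clips = rng.normal(size=(200, NUM_MICS, NUM_SAMPLES))
positions_cm = rng.uniform(0.0, 50.0, size=200)

model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=300)
model.fit(features(clips), positions_cm)
predicted_cm = model.predict(features(clips[:5]))
```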

In a subsequent study pending publication, Lee and his colleagues are using new data to train SonicBoom to identify what kind of object it is encountering, for example a leaf, branch, or trunk.

“With SonicBoom, you can blindly tap around and know where the [contact happens], but at the end of the day, for the robot, the really important information is: Can I keep pushing, or am I hitting a strong trunk and should rethink how to move my arm?” he explains.

Of note, SonicBoom has yet to be tested in real-world agricultural settings, Lee says.
