Vegebot robot applies machine learning to harvest lettuce

Vegebot, a vegetable-picking robot, uses machine learning to identify and harvest a commonplace, but challenging, agricultural crop.

A team at the University of Cambridge initially trained Vegebot to recognize and harvest iceberg lettuce in the laboratory. It has now been successfully tested in a variety of field conditions in cooperation with G’s Growers, a local fruit and vegetable co-operative.

Although the prototype is nowhere near as fast or efficient as a human worker, it demonstrates how the use of robotics in agriculture might be expanded, even for crops like iceberg lettuce which are particularly challenging to harvest mechanically. The researchers published their results in The Journal of Field Robotics.

Crops such as potatoes and wheat have been harvested mechanically at scale for decades, but many other crops have to date resisted automation. Iceberg lettuce is one such crop. Although it is the most common type of lettuce grown in the U.K., iceberg is easily damaged and grows relatively flat to the ground, presenting a challenge for robotic harvesters.

“Every field is different, every lettuce is different,” said co-author Simon Birrell from Cambridge’s Department of Engineering. “But if we can make a robotic harvester work with iceberg lettuce, we could also make it work with many other crops.”

“At the moment, harvesting is the only part of the lettuce life cycle that is done manually, and it’s very physically demanding,” said co-author Julia Cai, who worked on the computer vision components of the Vegebot while she was an undergraduate student in the lab of Dr Fumiya Iida.

The Vegebot first identifies the “target” crop within its field of vision, then determines whether a particular lettuce is healthy and ready to be harvested. Finally, it cuts the lettuce from the rest of the plant without crushing it so that it is “supermarket ready.”

“For a human, the entire process takes a couple of seconds, but it’s a really challenging problem for a robot,” said co-author Josie Hughes.

Vegebot designed for lettuce-picking challenge

The Vegebot has two main components: a computer vision system and a cutting system. The overhead camera on the Vegebot takes an image of the lettuce field and first identifies all the lettuces in the image. Then for each lettuce, the robot classifies whether it should be harvested or not. A lettuce might be rejected because it’s not yet mature, or it might have a disease that could spread to other lettuces in the harvest.
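The paper's actual code is not published; as a rough illustration, the detect-then-classify pipeline described above might be structured like the sketch below. Every name, threshold, and data structure here is a hypothetical stand-in (a real system would run trained vision models on camera images, not canned detections).

```python
# Hypothetical sketch of Vegebot's two-stage vision pipeline: stage 1 finds
# every lettuce in an overhead image; stage 2 decides, per lettuce, whether
# it is mature and healthy enough to harvest. All values are illustrative.

from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    x: int                # bounding-box centre, pixels
    y: int
    diameter_px: int      # apparent head size, a proxy for maturity
    disease_score: float  # 0.0 = healthy, 1.0 = clearly diseased


def detect_lettuces(scene) -> List[Detection]:
    """Stage 1: localisation. A real system would run a trained object
    detector on the camera image; this stub just returns canned detections."""
    return scene


def should_harvest(d: Detection,
                   min_diameter_px: int = 120,
                   max_disease: float = 0.3) -> bool:
    """Stage 2: classification. Reject immature or diseased heads."""
    return d.diameter_px >= min_diameter_px and d.disease_score <= max_disease


def plan_harvest(scene) -> List[Detection]:
    return [d for d in detect_lettuces(scene) if should_harvest(d)]


# Example "field": one harvestable head, one immature, one diseased.
field = [
    Detection(x=100, y=80, diameter_px=150, disease_score=0.05),  # harvest
    Detection(x=300, y=90, diameter_px=90,  disease_score=0.02),  # too small
    Detection(x=500, y=85, diameter_px=160, disease_score=0.70),  # diseased
]
targets = plan_harvest(field)
```

The two-stage split mirrors the article's description: rejecting a head for immaturity and rejecting it for disease are separate decisions applied to every detection.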

Vegebot uses machine vision to identify heads of iceberg lettuce. Credit: University of Cambridge

The researchers developed and trained a machine learning algorithm on example images of lettuces. Once the Vegebot could recognize healthy lettuce in the lab, the team then trained it in the field, in a variety of weather conditions, on thousands of real lettuce heads.

A second camera on the Vegebot is positioned near the cutting blade and helps ensure a smooth cut. The researchers were also able to adjust the pressure in the robot’s gripping arm so that it held the lettuce firmly enough not to drop it, but not so firmly as to crush it. The grip force can be adjusted for other crops.
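The grip-tuning behaviour described above can be pictured as a simple search: raise pressure in small steps until a slip sensor reports a secure hold, while never exceeding a crop-specific crush limit. This is a toy sketch under assumed numbers, not the team's actual control code.

```python
# Illustrative grip-pressure tuning: step the pressure up until the object
# no longer slips, but refuse any pressure at or above the crush limit.
# All pressures (in "kPa") and crop profiles are made-up assumptions.

def find_grip_pressure(holds_at, crush_limit, step=0.5):
    """holds_at: minimum pressure at which the object stops slipping.
    Returns a safe holding pressure, or None if the object cannot be
    held without crushing it."""
    pressure = 0.0
    while pressure < crush_limit:
        pressure += step
        if pressure >= holds_at:      # simulated slip sensor: secure grip
            return pressure
    return None                        # no safe pressure exists


# Iceberg lettuce: holds at 2.0, crushes above 4.0 -> safe grip found.
lettuce = find_grip_pressure(holds_at=2.0, crush_limit=4.0)

# A more fragile item that needs 3.5 to hold but crushes above 3.0
# cannot be gripped safely at all.
strawberry = find_grip_pressure(holds_at=3.5, crush_limit=3.0)
```

The per-crop parameters are the point: the same controller serves different crops by swapping in a different slip threshold and crush limit.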

“We wanted to develop approaches that weren’t necessarily specific to iceberg lettuce, so that they can be used for other types of above-ground crops,” said Iida, who leads the team behind the research.

In the future, robotic harvesters could help address problems with labor shortages in agriculture. They could also help reduce food waste. At the moment, each field is typically harvested once, and any unripe vegetables or fruits are discarded.

However, a robotic harvester could be trained to pick only ripe vegetables, and since it could harvest around the clock, it could perform multiple passes on the same field, returning at a later date to harvest the vegetables that were unripe during previous passes.

“We’re also collecting lots of data about lettuce, which could be used to improve efficiency, such as which fields have the highest yields,” said Hughes. “We’ve still got to speed our Vegebot up to the point where it could compete with a human, but we think robots have lots of potential in agri-tech.”

Iida’s group at Cambridge is also part of the world’s first Centre for Doctoral Training (CDT) in agri-food robotics. In collaboration with researchers at the University of Lincoln and the University of East Anglia, the Cambridge researchers will train the next generation of specialists in robotics and autonomous systems for application in the agri-tech sector. The Engineering and Physical Sciences Research Council (EPSRC) has awarded £6.6 million ($8.26 million U.S.) for the new CDT, which will support at least 50 Ph.D. students.

The post Vegebot robot applies machine learning to harvest lettuce appeared first on The Robot Report.

Hank robot from Cambridge Consultants offers sensitive grip to industrial challenges

Robotics developers have taken a variety of approaches to try to equal human dexterity. Cambridge Consultants today unveiled Hank, a robot with flexible robotic fingers inspired by the human hand. Hank uses a pioneering sensory system embedded in its pneumatic fingers, providing a sophisticated sense of touch and slip. It is intended to emulate the human ability to hold and grip delicate objects using just the right amount of pressure.

Cambridge Consultants stated that Hank could have valuable applications in agriculture and warehouse automation, where the ability to pick small, irregular, and delicate items has been a “grand challenge” for those industries.

Picking under pressure

While warehouse automation has taken great strides in the past decade, today’s robots cannot match human dexterity when picking diverse individual items from larger containers, said Cambridge Consultants. E-commerce giants are under pressure to deliver more quickly and at lower cost, but they still require human operators for tasks that can be both difficult and tedious.

“The logistics industry relies heavily on human labor to perform warehouse picking and packing and has to deal with issues of staff retention and shortages,” said Bruce Ackman, logistics commercial lead at Cambridge Consultants. “Automation of this part of the logistics chain lags behind the large-scale automation seen elsewhere.”

Giving a robot additional human-like senses lets it feel and orient its grip around an object, applying just enough force, and adjust or abandon the grip if the object slips. Other robots with articulated arms used in warehouse automation tend to require complex grasping algorithms, costly sensing devices, and vision sensors to accurately position the end effector (fingers) and grasp an object.


Hank uses sensors for a soft touch

Hank uses soft robotic fingers controlled by airflows that can flex the finger and apply force. The fingers are controlled individually in response to the touch sensors. This means that the end effector does not require millimeter-accurate positioning to grasp an object. Like human fingers, they close until they “feel” the object, said Cambridge Consultants.

With the ability to locate an object, adjust its overall position, and then grasp that object, Hank can apply increased force if a slip is detected and immediately flag a mishandled pick if the object is dropped.
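Hank's sensor and control interfaces are not public, but the behaviour described, close until touch, tighten on slip, flag a drop, can be sketched as a small event loop. The event names, force values, and function signature below are invented for illustration.

```python
# Toy sketch of a close-until-touch / tighten-on-slip grasp loop, modelled
# on the behaviour the article describes for Hank's pneumatic fingers.
# Sensor events are simulated strings; real readings would be continuous.

def grip_object(events, touch_force=1.0, slip_boost=0.5, max_force=3.0):
    """events: sequence of sensor readings ('touch', 'slip', 'drop', 'ok').
    Returns (final_force, status)."""
    force = 0.0
    for e in events:
        if e == "touch":
            force = touch_force            # fingers closed until contact
        elif e == "slip" and force < max_force:
            force += slip_boost            # tighten just enough on slip
        elif e == "drop":
            return force, "mishandled"     # instant awareness of a failed pick
    return force, "held"


# A pick that slips once, then settles.
result = grip_object(["touch", "slip", "ok"])
```

Because each finger reacts to its own touch sensor, this kind of loop does not need millimetre-accurate positioning of the end effector, which is the contrast the article draws with vision-guided grippers.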

Cambridge Consultants claimed that Hank moves a step beyond legacy approaches to this challenge, which tend to rely on pinchers and suction appendages to grasp items, limiting the number and type of objects they can pick and pack.

“Hank’s world-leading sensory system is a game changer for the logistics industry, making actions such as robotic bin picking and end-to-end automated order fulfillment possible,” said Ackman. “Adding a sense of touch and slip, generated by a single, low-cost sensor, means that Hank’s fingers could bring new efficiencies to giant distribution centers.”

Molded from silicone, Hank’s fingers are hollow and its novel sensors are embedded during molding, with an air chamber running up the center. The finger surface is flexible, food-safe, and cleanable. As a low-cost consumable, the fingers can simply be replaced if they become damaged or worn.

With offices in Cambridge in the U.K.; Boston, Mass.; and Singapore, Cambridge Consultants develops breakthrough products, creates and licenses intellectual property, and provides business and technology consulting services for clients worldwide. It is part of Altran, a global leader in engineering and research and development services. For more than 35 years, Altran has provided design expertise in the automotive, aerospace, defense, industrial, and electronics sectors, among others.

Mamut mobile robot automates data collection for farmers

Agriculture is a $5 trillion industry, and it’s ripe for automation. Cambridge Consultants today announced Mamut, an autonomous robot that explores crop fields, capturing data on health and yield at the level of individual plants and on a massive scale. By automating data capture, Mamut gives growers regular, precise and actionable information on their crops, enabling them to predict and optimize yields.

Agriculture is under pressure to increase efficiencies, producing greater yields with fewer inputs and less labor. To meet these demands, growers need precise information on crop growth and health throughout the growing season. Automation of the data collection process is essential to providing growers with information at scale.

Existing large-scale monitoring approaches use drones, which cannot capture information from beneath the crop canopy. Attempts to use ground-based monitoring have been limited by the requirement for additional infrastructure, such as cabling or radio beacons.

Mamut is an AI-powered autonomous robotic platform. Equipped with an array of sensors, Mamut maps and navigates its surroundings without the need for GPS or fixed radio infrastructure. As it travels the rows of a field, orchard or vineyard, cameras capture detailed crop data at the plant level, enabling accurate predictions of yield and crop health.

Mamut integrates stereo cameras, LIDAR, an inertial measurement unit (IMU), a compass, wheel odometers and an on-board AI system that fuses the multiple sensor data inputs. This sophisticated blend of technologies enables Mamut to know where it is and how to navigate through a new environment, in real time.
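Cambridge Consultants has not published Mamut's fusion algorithm. As a minimal illustration of GPS-free position estimation from two of the sensors listed, the dead-reckoning sketch below combines wheel-odometer distance with compass heading; a real SLAM system would additionally correct the accumulated drift using lidar and camera landmarks.

```python
# Simplified dead reckoning: integrate (distance, heading) pairs from wheel
# odometers and a compass into an (east, north) position estimate, with no
# GPS or fixed radio infrastructure. Illustrative only; real fusion (e.g.
# an EKF or full SLAM) also handles sensor noise and drift.

import math


def dead_reckon(steps, x=0.0, y=0.0):
    """steps: list of (distance_m, heading_deg) pairs, heading measured
    clockwise from north. Returns the estimated (east, north) position
    in metres."""
    for dist, heading in steps:
        rad = math.radians(heading)
        x += dist * math.sin(rad)   # east component
        y += dist * math.cos(rad)   # north component
    return x, y


# Drive 10 m north along a crop row, then 2 m east toward the next row.
pos = dead_reckon([(10.0, 0.0), (2.0, 90.0)])
```

Wheel odometry alone drifts with wheel slip on soft ground, which is why Mamut cross-checks it against the IMU, lidar, and stereo cameras.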

“Mamut is a practical application of AI, meeting a real and pressing need, particularly for growers of specialty crops where failure carries a high cost,” said Niall Mottram, Head of Agritech, Cambridge Consultants. “AI systems are already being used to understand crop conditions, yield predictions and to enable weed identification, but our autonomous robotic platform can collect valuable and granular data below the canopy, where drones cannot see.

“This data enables farmers to treat each plant in their vineyard, orchard or field individually, and on the scale of massive industrial farming, optimizing yields and producing more output with less input.”

Mamut’s simultaneous localization and mapping (SLAM) capability, which lets the robot react to and learn from unstructured routes in real time, was developed in navigation trials through the twists and turns of a 12-acre maize maze at Skylark Garden Centre and at Mackleapple’s orchard, both in Cambridgeshire, U.K.

The Robot Report named Augean Robotics one of its 10 robotics startups to watch in 2019. Augean Robotics makes Burro, an autonomous mobile robot that follows people on a farm, moving up to 500 lbs of cargo around to free up workers to perform more valuable tasks. Burro can learn the routes it takes and re-run them autonomously. Augean is currently working with fresh fruit farmers. In December 2018, Augean took home top honors at the FBN Farmers Startup Competition by winning the Judge’s Choice Award.

