



Content Filed Under: Agriculture


Agriculture Shortcomings Put Machine Vision to the Test

POSTED 05/26/2017  | By: Winn Hardin, Contributing Editor

On a global scale, the agriculture sector faces a tough row to hoe. According to the World Bank, the planet needs to produce at least 50% more food to feed 9 billion people by 2050. That means a hike in surface water usage in a sector that already consumes 70% of the world’s accessible freshwater supply. Meanwhile, in the United States, produce growers have cut back on production because they can’t find enough workers to tend their crops. The projected ongoing labor shortage could also raise food prices.

The use of machine vision in the air and on the ground can alleviate these obstacles by continuously monitoring crop and soil conditions, assessing food quality, and guiding robots and tractors to automate labor-intensive tasks. While machine vision has been deployed in farming for two decades, advances in microprocessors and high-speed data transmission are giving farmers unprecedented insights into their operations.

Eye in the Sky
To gather information about their crops from the air, farmers have traditionally relied on earth observation data from multispectral sensors aboard satellites and piloted aircraft. The visible and infrared images highlight differences between healthy and distressed vegetation, and improvements in sensor technology have resulted in better quality imagery. But more agricultural producers are turning to camera-equipped drones because, compared to satellites, they get closer to the fields, produce higher resolution images, give farmers operational control, and are affordable.
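The contrast between healthy and distressed vegetation in those visible and infrared bands is commonly summarized with a vegetation index such as NDVI (normalized difference vegetation index). As a minimal sketch of how such an index is computed from two co-registered bands (NDVI is a standard remote-sensing index, not one the article attributes to any particular vendor):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Healthy vegetation reflects strongly in the near-infrared and absorbs
    red light, so values near +1 indicate vigorous plants, while bare soil
    sits near 0. `nir` and `red` are same-shape reflectance arrays; `eps`
    guards against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)
```

Mapping this index over a field's imagery yields the kind of crop-stress map the article describes: patches whose index drops below their neighbors' warrant a closer look.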

Taken together, multispectral images help farmers detect crop stress and other anomalies, estimate harvest yields, and more. However, hyperspectral imaging (HSI) is gaining traction in agriculture — particularly aboard drones — for its ability to differentiate properties that multispectral sensors cannot.

“Red, green, blue, and infrared data points are pretty wide and give very coarse information on plant health,” says Adam Stern, senior scientist at Resonon Inc. (Bozeman, Montana), which develops hyperspectral imaging systems. “But with hyperspectral imaging, we have 250 or 500 data points. This high-precision data allows for powerful statistical analyses leading to more sensitive and accurate classification.”

HSI can identify different plant species, a useful tool in detecting noxious weeds. It can also alert farmers to early signs of crop stress by distinguishing different health conditions in the same species more precisely than multispectral imaging.
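Distinguishing species or stress conditions in a hyperspectral cube is often framed as per-pixel classification against known reference spectra. A minimal sketch using the classic spectral angle mapper (an illustrative technique chosen for simplicity, not necessarily Resonon's method; array shapes are assumptions):

```python
import numpy as np

def spectral_angles(cube, refs):
    """Angle (radians) between each pixel spectrum and each reference spectrum.

    cube: (H, W, B) hyperspectral image with B spectral bands.
    refs: (K, B) library of K reference spectra (e.g., crop vs. weed).
    Returns an (H, W, K) array of angles; smaller means more similar.
    """
    cube_n = cube / np.linalg.norm(cube, axis=-1, keepdims=True)
    refs_n = refs / np.linalg.norm(refs, axis=-1, keepdims=True)
    cos = np.clip(np.einsum("hwb,kb->hwk", cube_n, refs_n), -1.0, 1.0)
    return np.arccos(cos)

def classify(cube, refs):
    """Label each pixel with the index of its closest reference spectrum."""
    return spectral_angles(cube, refs).argmin(axis=-1)
```

Because the angle compares spectral *shape* rather than absolute brightness, this kind of classifier is somewhat tolerant of the uneven outdoor illumination Stern mentions below, though real deployments need far more robust calibration.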

According to Stern, HSI technology for advanced crop management is on the cusp of leaving the lab to become commercially viable, but a few barriers remain. For example, scientists are studying how HSI can enable variable rate applications. This process dispenses herbicide, pesticides, or nutrients exactly where they’re needed, rather than over an entire field. Such targeted crop management means that farmers use fewer chemicals — a boon to the environment and the bottom line. But for variable rate application to become standardized, more research is necessary to correlate spectral features with biophysical processes such as plant appearance, lack of specific nutrients, or parasite presence.

Stern also notes that work remains to adapt HSI technology from the controlled illumination of the laboratory to the variable lighting conditions of the outdoors.

Until recently, computers could not handle the massive data sets produced by hyperspectral imaging, which hampered the technology’s adoption. But thanks to improvements in computational power, storage, and communication bandwidth, hyperspectral imaging is accessible to more users.

Intelligence in the Field
Of course, data collection is meaningless without the ability to analyze the information. As an integrator, Prolucid Technologies (Mississauga, Ontario) takes machine vision images and other data points like GPS location to create a robust platform for data analytics.

“In agriculture, we build distributed connected systems across large geographical regions,” says Prolucid CEO Darcy Bachert. In addition to collecting the data and pulling it into the cloud, Prolucid develops algorithms to make the data intelligent. “We provide very simple output decision points that make it easy for the end user to make better decisions.” 

As part of this process, Prolucid has prioritized machine learning in the software it develops in order to help farmers adapt to new or unexpected conditions. “We write algorithms that are designed to look for something new or something that hasn’t been seen before and flag that for the user,” says Nick Stupich, Certified Vision Professional and embedded developer for Prolucid. “We have found that beneficial on top of the regular vision inspection systems because machine learning will save the customer a lot of time and help identify trends.”
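A common way to "flag something that hasn't been seen before" is to track running statistics of a measurement and raise an alert when a new reading falls far outside them. The sketch below uses Welford's online algorithm with a sigma threshold; this is a generic novelty-detection pattern, not Prolucid's actual algorithm, and the threshold is an assumed value:

```python
import math

class NoveltyFlagger:
    """Flags readings far from the running mean of everything seen so far.

    Uses Welford's numerically stable online mean/variance update, so no
    history needs to be stored -- suitable for streams from field sensors.
    """

    def __init__(self, threshold_sigmas=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the mean
        self.threshold = threshold_sigmas

    def observe(self, x):
        """Return True if x is anomalous relative to history, then fold it in."""
        novel = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) > self.threshold * std:
                novel = True
        # Welford update: incorporate x into the running statistics.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return novel
```

In practice the "reading" would be a feature extracted from the vision system (a color histogram distance, a defect score) rather than a raw scalar, and flagged items would be routed to the user for review, as Stupich describes.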

Meanwhile, Back on Earth…
In the field, vision guidance has been used on tractors since the mid-1990s. Today, farm vehicles are moving toward full autonomy as they adopt the same camera, radar, and lidar technology deployed in self-driving cars. While full implementation of autonomous tractors is still years away, vision-guided robots have found a more immediate home among the crops.

Vision Robotics (San Diego, California) entered the agricultural market in 2004 with a feasibility study for harvesting oranges, but robotic picking technology hadn’t caught up to machine vision at that point. Instead, Vision Robotics transformed the concept into a crop load estimation system that used cameras to count Granny Smith apples on trees.

In 2011, the company developed the vision-guided VR Lettuce Thinner. When growing lettuce, farmers intentionally plant seeds close together because not all of them will be viable. Vision Robotics’ system, which affixes to the back of the tractor, automates the thinning of these over-planted seedlings — typically a labor-intensive process. 

“This year we've gotten a lot more interest because worker shortages have become a bigger problem, at least in California,” says Tony Koselka, a founder of Vision Robotics.

The VR Lettuce Thinner features a modular design in which an enclosed hood containing a camera, lighting, and spraying system is positioned over each row of lettuce. The USB 2.0 camera from IDS captures images at 20 fps as the tractor moves down the row, and the system identifies which seedlings to keep based on parameters adjusted by the grower. The sprayer extends 15 inches behind the camera, dispensing different agrochemicals to kill the unwanted plants and feed the ones being kept.
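With the sprayer 15 inches behind the camera, the controller has a short, speed-dependent window between imaging a seedling and actuating the spray. A back-of-the-envelope sketch of that timing (the 2 mph ground speed is an assumed figure for illustration; only the 15-inch offset and 20 fps rate come from the article):

```python
def spray_delay_s(offset_in=15.0, speed_mph=2.0):
    """Seconds between a seedling passing the camera and reaching the sprayer.

    offset_in: camera-to-sprayer distance (15 in, per the article).
    speed_mph: tractor ground speed -- an assumed value, not from the article.
    """
    inches_per_second = speed_mph * 5280 * 12 / 3600  # mph -> in/s
    return offset_in / inches_per_second

def frames_in_transit(delay_s, fps=20):
    """Whole frames captured while a seedling travels camera -> sprayer."""
    return int(delay_s * fps)
```

At 2 mph the tractor covers about 35 inches per second, so the system has under half a second, and only a handful of frames, in which to decide each seedling's fate before the nozzle passes over it.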

Because growing lettuce is a nearly year-round endeavor, so too is the need to thin the crop. “In theory, the growers want to be planting 5 to 10 acres every day, which means thinning 5 to 10 acres every day,” Koselka says. “If it is used optimally, which is 10 hours a day and six days a week, the lettuce thinner would pay for itself in less than six months. It is much cheaper than hand labor, and the performance is better.” The system also can withstand temperatures up to 125°F — a necessity for the hot summers in California and Arizona, where the majority of the U.S.’s lettuce is grown.

In addition, Vision Robotics has developed a prototype of an autonomous grapevine pruner, in which a self-driving tractor pulls the pruning system behind it. The front camera takes pictures of the grapevine every inch of travel; the images are analyzed to generate a model that determines cut points. The tractor stops every 18 inches to accommodate precision cutting.

Feeding the Future
Despite the demonstrated benefits of machine vision in agriculture, some farmers are reluctant to adopt the technology. “Farming is an inherently risky proposition due to weather and other issues beyond the grower’s control,” says Vision Robotics’ Koselka. “Because of this, many are reluctant to change their practices, even for potential improvements from what they have used successfully in the past.”

The next generation of farmers may bridge that gap. “We have found that people in their twenties or early thirties are comfortable with computers and excited to do farming in a different way,” Koselka says. “But the decision-makers aren't quite there yet.” 

Machine vision, in tandem with machine learning, artificial intelligence, and data analysis, will continue to help farmers maximize crop output and minimize the environmental impact — two critical objectives in a world with a swelling population.

“If you look at the amount of people on planet earth, resources are going to get a lot more strained,” says Prolucid’s Bachert. “We’re going to have to figure out ways of increasing yields, dealing with pests, making corn grow in a desert. Those things are only going to happen by applying these technologies. Otherwise, we are going to have huge problems as a society in a few decades.”
