From Smartphone to Smart Farm: How AI Can Maximize Global Crop Yield

Researchers from IBM Research Brazil have built a prototype, the AgroPad, which enables real-time, on-location chemical analysis of a soil or water sample using AI. Samples are placed on the circle on the front of the AgroPad. (Image courtesy of IBM Research.)

Ever thought about using artificial intelligence (AI) to optimize your fertilizer? If it’s starting to feel like there’s no problem in the world that AI can’t tackle, you can now add farming to the list.

While “fertilizer optimization” can sound extraneous or gimmicky, it is anything but. Agriculture accounts for more than 70 percent of the world’s annual freshwater use, and small farms produce nearly 80 percent of the food consumed in the developing world. With a global food crisis looming, maximizing every plot of land’s potential is essential.

If farmers with smartphones were equipped with the tools to perform environmental analysis on their own in the field, it could make a real difference in crop yield. That will only happen, though, if a researcher comes along with a piece of software smart enough to make it possible.

Mathias Steiner wants to be that researcher. He and his team at IBM Research Brazil have developed a prototype, the AgroPad, for monitoring five chemical parameters in arable soil and water. The AgroPad is simple: it consists of a strip of test paper and a machine learning-driven app that can stand in for a laboratory technician.

Here’s how it works: The farmer puts a drop of soil or water on the test strip. The five indicators change color based on the levels of pH, nitrogen dioxide, aluminum, magnesium and chlorine present in the sample. The farmer’s smartphone then performs machine vision on the strip, runs the results through a machine learning algorithm and spits out a chemical read. The app makes a recommendation to the farmer for fertilizer adjustments that will help optimize the crop’s growth. As a final touch, the data is uploaded to the cloud along with all the other chemical reads from the area, so interested parties can track broader soil and water trends across the region.
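To make that pipeline concrete, below is a minimal sketch in Python of how a strip read could work, assuming the photo has already been cropped into one patch per indicator pad. The calibration table, the nearest-color interpolation and the recommendation rule are illustrative placeholders, not IBM's model.

```python
# Minimal sketch of the strip-reading pipeline described above (not IBM's code).
# Assumptions: the strip image has already been cropped so each of the five
# indicator pads is a small RGB patch, and a calibration table maps reference
# colors to known values for each parameter.

import numpy as np

# Hypothetical calibration data: reference RGB colors and the value each
# color corresponds to, per parameter.
CALIBRATION = {
    "pH": {
        "colors": np.array([[230, 80, 60], [200, 160, 60], [80, 180, 90]], float),
        "values": np.array([4.0, 6.0, 8.0]),
    },
    # ... similar entries for nitrogen dioxide, aluminum, magnesium, chlorine ...
}


def mean_color(patch: np.ndarray) -> np.ndarray:
    """Average RGB color of one indicator pad (patch is H x W x 3)."""
    return patch.reshape(-1, 3).mean(axis=0)


def estimate_value(parameter: str, patch: np.ndarray) -> float:
    """Map the pad's color to a value by interpolating between the two
    nearest calibration colors, a stand-in for the trained model."""
    cal = CALIBRATION[parameter]
    color = mean_color(patch)
    dists = np.linalg.norm(cal["colors"] - color, axis=1)
    i, j = np.argsort(dists)[:2]
    w = dists[j] / (dists[i] + dists[j] + 1e-9)  # closer color gets more weight
    return float(w * cal["values"][i] + (1 - w) * cal["values"][j])


def recommend(ph: float) -> str:
    """Toy recommendation rule; real advice would depend on crop and region."""
    if ph < 5.5:
        return "Soil is acidic: consider applying lime before fertilizing."
    if ph > 7.5:
        return "Soil is alkaline: consider sulfur or acidifying fertilizer."
    return "pH is in a typical range: no adjustment needed."


if __name__ == "__main__":
    fake_patch = np.full((20, 20, 3), [210, 140, 65], dtype=float)  # stand-in photo crop
    ph = estimate_value("pH", fake_patch)
    print(f"Estimated pH: {ph:.1f} -> {recommend(ph)}")
```

In the real app, the interpolation step would be replaced by the trained machine learning model, and the recommendation would account for the crop, the region and all five parameters rather than pH alone.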

The cloud-computing aspect here is nifty, but the ability to analyze soil samples for chemical composition is not exactly new technology. And just because the app uses machine learning doesn’t mean its results are necessarily better than what a laboratory could deliver. Even so, Steiner believes there are advantages to being able to perform a five-parameter chemical test on-location in under 10 seconds.

“In a lab analysis, you might be able to increase the precision of the actual test result, but what we see as a benefit of our technology is that you get a robust reliable result almost in-time, in the field, when you need it,” Steiner said. “The problem is when we are shipping the samples to a lab that is far away. You might need to wait for weeks in order to get a result that might be of higher accuracy but oftentimes that [level of accuracy] is not needed.”

Steiner also implied that environmental analysis performed by a laboratory can be prohibitively expensive for the smaller farmer.

“Because you don’t need a trained lab technician to perform this test, anyone can do it. It’s very simple to perform,” he said.

Why does an app that only reads five parameters off a strip of paper require machine vision? The answer has to do with making the app accessible to anyone. If you’ve ever tried to take consistent photos with your smartphone camera, you’ll know that lighting conditions strongly influence how they turn out. Whether you’re indoors, outdoors, in full sunlight or under an overcast sky, the light affects what your pictures look like. Not everyone is a professional photographer, and smartphones have an annoying habit of adjusting camera settings on your behalf to compensate for these changes, often quite poorly.

IBM Research developed the AgroPad to use machine vision to correct for those changes in light quality, which would otherwise deliver a completely different read of the colors on the test strip. Thanks to that correction, a farmer with even the worst of cellphone cameras can use the app and achieve the same level of accuracy in the chemical analysis as someone with the latest iPhone, and, according to Steiner, a read that is even more accurate than the human eye.
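Steiner’s team has not published the details of that correction step, but the core idea can be illustrated with a simple reference-patch white balance: if the card carries a patch of known color near the indicators, the app can estimate how the lighting has tinted the photo and rescale the color channels before reading the pads. The function and patch layout below are assumptions for illustration only.

```python
# Minimal sketch of lighting compensation, assuming the card carries a
# reference patch of known color (here, white) near the indicators.
# This illustrates the idea; it is not IBM's actual vision pipeline.

import numpy as np


def normalize_colors(image: np.ndarray, ref_region: tuple) -> np.ndarray:
    """Rescale each RGB channel so the reference patch reads as neutral white.

    image:      H x W x 3 float array, values in 0..255
    ref_region: (row_slice, col_slice) locating the known-white patch
    """
    rows, cols = ref_region
    observed_white = image[rows, cols].reshape(-1, 3).mean(axis=0)
    # Per-channel gain that maps the observed patch back to pure white.
    gains = 255.0 / np.clip(observed_white, 1.0, None)
    return np.clip(image * gains, 0, 255)


if __name__ == "__main__":
    # Simulate a photo taken under a warm light source that slightly
    # suppresses the green channel and strongly suppresses the blue channel.
    true_pad = np.full((10, 10, 3), [200, 160, 60], dtype=float)  # indicator color
    tint = np.array([1.0, 0.9, 0.75])
    photo = np.zeros((20, 20, 3))
    photo[:10, :10] = true_pad * tint                    # indicator pad, tinted
    photo[10:, 10:] = np.array([255, 255, 255]) * tint   # white reference, tinted

    corrected = normalize_colors(photo, (slice(10, 20), slice(10, 20)))
    print("Before correction:", photo[:10, :10].reshape(-1, 3).mean(axis=0))
    print("After correction: ", corrected[:10, :10].reshape(-1, 3).mean(axis=0))
```

A production pipeline would likely use a learned correction that also handles shadows and mixed lighting, but the principle of anchoring the read to a known reference is the same.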

With clever use of cloud computing and machine vision, the AgroPad is a leap toward the Smart Farm, but a step short of full automation. Farmers still need to perform the analysis manually, even if it takes under 10 seconds. Steiner is hopeful that the simplicity and convenience of the AgroPad can change agriculture for the better.

“We can see that companies or larger organizations could take advantage of this,” he said. “For example, a chemical or fertilizer company could hand out these AgroPads to farmers to determine fertilizer concentrations in order to manage product supply.”

He added that organizations in the business of environmental monitoring could also benefit from the AgroPad.