
Trippy Satellite Imagery Could Forecast Food Crises Before They Happen

New machine learning algorithm from Descartes Labs is like ‘Minority Report,’ but for corn.
This mosaic represents all the different corn fields in a portion of Iowa. The boundaries of each corn field were determined through machine analysis of a satellite image. Image: Descartes Labs

In the late 18th century, an economist named Thomas Malthus argued that the increasing standards of living seen around the globe were unsustainable because they would drive a population boom that would outpace food production. Critics of Malthus' argument, however, have long held that he ignored technological advancement (among other factors), which could offset population growth by making food production more efficient.

Yet now that the World Bank projects we could lose 25 percent of crop yields to climate change even as we need to produce 50 percent more food to keep pace with a growing global population, it's hard not to see some validity in Malthus' logic. The burden now falls on technology to stave off the food crises forecast for the coming decades, and thankfully there is no shortage of ideas on this front.

The latest comes from Descartes Labs, a company using machine learning to analyze satellite imagery and forecast food supplies months ahead of the current methods employed by the US government, a technique that could help spot food crises before they happen.

On any given day, Descartes collects roughly 5 terabytes of images from NASA, ESA, and commercial satellites, and combines them with other relevant data such as weather forecasts and agricultural commodity prices. Fed into the company's proprietary machine learning platform, this data lets the platform track and forecast the food supply with uncanny accuracy.
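To make that pipeline concrete, here is a minimal sketch, with entirely invented column names and numbers, of the kind of daily table such a system might assemble: imagery-derived statistics joined with weather and commodity prices. Descartes Labs' actual platform is proprietary, so none of this reflects its real schema or code.

```python
# Hypothetical sketch of the daily feature assembly described above:
# imagery-derived vegetation statistics joined with weather and
# commodity-price data before being fed to a forecasting model.
# All column names and values are invented for illustration.
import numpy as np
import pandas as pd

days = pd.date_range("2015-05-01", periods=10, freq="D")

# Stand-in for per-county statistics extracted from satellite imagery.
imagery = pd.DataFrame({
    "date": days,
    "county_fips": "19153",                   # Polk County, Iowa
    "mean_ndvi": np.linspace(0.3, 0.7, 10),   # fields greening up over the season
})

# Stand-ins for the weather and market inputs mentioned in the article.
weather = pd.DataFrame({"date": days, "precip_mm": np.random.default_rng(0).gamma(2.0, 3.0, 10)})
prices = pd.DataFrame({"date": days, "corn_usd_per_bushel": 3.6 + 0.01 * np.arange(10)})

# One joined table per day per county: the kind of input a yield model could consume.
features = imagery.merge(weather, on="date").merge(prices, on="date")
print(features.head())
```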

First, the Descartes platform must process the satellite imagery from these different sources to make it uniform so that the algorithm can extract data from the images. Initially, the researchers at Descartes Labs trained their machine learning platform on a petabyte's worth of satellite images using a supercomputer with 30,000 processor cores (for comparison, the fastest supercomputer in the world has more than 3 million cores).
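The details of that preprocessing step aren't public, but the general idea of resampling scenes from different sensors onto a common grid and a common reflectance scale can be sketched roughly like this (the sensor stand-ins, scale factors, and grid size below are all assumptions):

```python
# Minimal sketch of "making imagery uniform": resample rasters from
# different sensors onto one grid and one 0-1 reflectance scale.
# The real Descartes pipeline is proprietary; everything here is illustrative.
import numpy as np

def to_common_grid(band: np.ndarray, target_shape: tuple) -> np.ndarray:
    """Nearest-neighbor resample a 2-D band to target_shape."""
    rows = np.arange(target_shape[0]) * band.shape[0] // target_shape[0]
    cols = np.arange(target_shape[1]) * band.shape[1] // target_shape[1]
    return band[np.ix_(rows, cols)]

def to_reflectance(dn: np.ndarray, scale: float) -> np.ndarray:
    """Convert raw digital numbers to 0-1 reflectance with a sensor-specific scale."""
    return np.clip(dn.astype(float) * scale, 0.0, 1.0)

# Two made-up scenes at different native resolutions and encodings.
modis_like = np.random.default_rng(1).integers(0, 10000, size=(120, 120))      # 0-10000 DN
landsat_like = np.random.default_rng(2).integers(0, 65535, size=(1000, 1000))  # 16-bit DN

common = np.stack([
    to_common_grid(to_reflectance(modis_like, 1e-4), (500, 500)),
    to_common_grid(to_reflectance(landsat_like, 1 / 65535), (500, 500)),
])
print(common.shape)  # (2, 500, 500): both scenes now share a grid and scale
```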

After running thousands of different models on the best ways to interpret this image data, the algorithm was able to differentiate individual corn fields on its own. In fact, that brightly colored mosaic above is an image of cornfields in Iowa, with the boundary of each individual cornfield determined by the machine.
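One generic way to pull individual fields out of a classified image is connected-component labeling on a crop mask. The toy sketch below shows the idea; it is not a description of Descartes Labs' actual method.

```python
# Illustrative only: find separate fields as connected components of a
# "is this pixel corn?" mask. A generic technique, not Descartes Labs' code.
import numpy as np
from scipy import ndimage

# Toy 8x8 crop mask containing two separate fields.
corn_mask = np.zeros((8, 8), dtype=bool)
corn_mask[1:4, 1:4] = True     # field one
corn_mask[5:8, 4:8] = True     # field two

labels, n_fields = ndimage.label(corn_mask)
print(n_fields)   # 2 distinct fields
print(labels)     # each field gets its own integer label
```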

But that's just the beginning.

Once it has drawn the boundary lines of each crop field, the algorithm can tell whether that plot of land is being used to grow corn or soy based on how light reflects off the surface of the field. With the crop type identified, the algorithm can then efficiently monitor production levels in a given area, whether that's an individual county, a state, or the whole country.
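As a rough illustration of that classification step, and not of Descartes Labs' actual model, a toy nearest-centroid rule on a pair of made-up reflectance values looks like this:

```python
# Toy crop classifier: assign each field to the crop whose spectral
# "signature" it most resembles. Centroid values and features (near-infrared
# and red reflectance) are invented for illustration.
import numpy as np

CENTROIDS = {"corn": np.array([0.45, 0.06]), "soy": np.array([0.38, 0.09])}

def classify_field(mean_reflectance: np.ndarray) -> str:
    """Return the crop whose spectral centroid is closest to this field's mean reflectance."""
    return min(CENTROIDS, key=lambda crop: np.linalg.norm(mean_reflectance - CENTROIDS[crop]))

fields = {"field_a": np.array([0.44, 0.07]), "field_b": np.array([0.37, 0.10])}
for name, refl in fields.items():
    print(name, "->", classify_field(refl))

# Once each field is labeled, per-county production can be estimated by
# summing labeled field areas and applying a yield model.
```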

Corn harvest. Image: United Soybean Board/Flickr

"Corn are these little factories that absorb or reflect certain kinds of light based on where they're at in their growing life cycle," Mark Johnson, CEO of Descartes Labs, told Motherboard. "Basically what we're looking at is how much energy is being absorbed to create corn. We don't have agronomists on our team—we have a bunch of physicists."

For the initial test of the machine learning algorithm, Descartes Labs forecast US corn production on August 6, 2015. According to the company, its estimate came within 1.9 percent of the final US Department of Agriculture figure for corn yield, and it arrived almost five months before the USDA report was released in January of this year.
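That 1.9 percent is a relative error against the USDA's final figure. With placeholder numbers (the actual production totals aren't quoted here), the calculation is simply:

```python
# Relative error of a forecast against the final reported value.
# The numbers below are placeholders; only the formula is the point.
usda_final = 100.0        # final reported production, arbitrary units
descartes_forecast = 98.1
relative_error = abs(descartes_forecast - usda_final) / usda_final * 100
print(f"{relative_error:.1f}%")  # 1.9%
```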

For Descartes Labs, timeliness is everything. Analyzing crop yields half a year after the fact is too late to stave off food crises; that requires accurate crop analysis in near real time. So far, the Descartes platform has only been applied to corn and soy yields in the United States, but the algorithm has been able to produce hyper-detailed weekly forecasts for each of the country's 3,114 counties. By comparison, the USDA's National Agricultural Statistics Service only releases monthly reports on a state-by-state basis.
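The practical advantage of that granularity is that coarser views can always be recovered from finer ones, but never the reverse. A quick sketch with invented numbers:

```python
# Weekly, county-level forecasts can be rolled up into the monthly,
# state-level view an agency publishes; the opposite is impossible.
# All values below are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
weeks = pd.date_range("2015-06-07", periods=8, freq="W")
counties = ["19153", "19013", "17031"]   # FIPS: two Iowa counties, one Illinois

weekly = pd.DataFrame(
    [(w, c, "IA" if c.startswith("19") else "IL", rng.uniform(5, 15))
     for w in weeks for c in counties],
    columns=["week", "county_fips", "state", "million_bushels"],
)

# Aggregate weekly county forecasts into a monthly, state-level view.
monthly_state = (
    weekly
    .assign(month=weekly["week"].dt.to_period("M"))
    .groupby(["month", "state"])["million_bushels"]
    .mean()
)
print(monthly_state)
```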

"Our theory is observe every field in the country every day, rather than what the USDA does which is take surveys from thousands of farmers every month," Johnson said. "If you're only reporting every month, a lot changes in that time, like weather forecasts. Knowing those things as they change is really important."

For Johnson and his colleagues at Descartes, the ability to have such fine-grained data on agricultural production helps make the food supply chain more efficient. As they add more geospatial data to their already robust database of Earth imagery (totaling about 3 petabytes, by Johnson's estimates), these models will get even more accurate.

Right now, Descartes Labs bases its models on mid-resolution satellite data, in which each pixel represents about 250 meters on the ground, not quite enough to really observe what is happening in an individual cornfield. But as higher-resolution satellites come online, such as those from Planet Labs, which will be able to image at a resolution on the order of 5 meters, the data will get even better.
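A bit of back-of-the-envelope arithmetic shows why resolution matters so much; the field size here is an assumed, typical value rather than anything from the company:

```python
# Ground area covered by one pixel at 250 m versus 5 m resolution,
# compared against an assumed 25-hectare corn field.
field_ha = 25.0
for gsd_m in (250, 5):
    pixel_ha = (gsd_m * gsd_m) / 10_000   # 1 hectare = 10,000 square meters
    print(f"{gsd_m:>3} m pixels: {pixel_ha:g} ha each, ~{field_ha / pixel_ha:,.0f} per field")
# 250 m pixels: 6.25 ha each, ~4 per field
#   5 m pixels: 0.0025 ha each, ~10,000 per field
```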

"Compared to the measurements a farmer would have walking through the field, he is always going to get a better number compared to satellites," Johnson said. "At least for the time being."