
This Company Built AI to Detect Modern Slavery

Can machine learning stop forced labor?

Ankita Rao


Child labor in Sri Lanka. Image: International Labour Organization

By now you probably know that the people who make our clothes, chocolate, and diamond rings often suffer in the process. Forced labor, meaning work done against a person's will, affects an estimated 20.9 million people across the world, in multiple industries, even though almost every country has a law prohibiting this modern slavery.

With growing consumer consciousness and stricter regulations putting pressure on companies to clean up their act, one company has devised a machine learning system it says will sift through data and locate forced labor in the manufacturing process.

SAP Ariba, a for-profit software and IT company based in California, works with millions of large and small companies to streamline their supply chains: the systems through which their products are sourced, made, and delivered. Its clients include companies in fashion and food, but also technology.

"The fact that every country in the world has made forced labor illegal, the acknowledgement alone has made companies realize they cannot take this lightly," said Padmini Ranganathan, the vice president of products and innovation at Ariba.

The AI flags forced labor risks in a demo.

The artificial intelligence program Ranganathan and her team created works much like a risk-analysis tool. With the support of nonprofit organizations focused on fair labor practices, it uses hundreds of data points to flag possible forced labor violations.

Some of this data is reported by independent auditors, who are meant to inspect factories or farms periodically, or the workers themselves, through anonymous hotlines. Other data comes from media or aggregation—possibly revealed by whistleblowers or advocates.

Ranganathan walked me through a demo of the AI, which will launch later this year in collaboration with Made in a Free World, which focuses on fair labor. In a simple interface, the risk of forced labor is flagged for the company, and mapped by color. The nonprofits help input information outlining the risks in countries like Ghana or Bangladesh, where they say workers are vulnerable to abuse.
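To make the idea concrete, here is a minimal toy sketch of this kind of risk flagging. It is not SAP Ariba's actual system; the signal names, weights, and color thresholds are all hypothetical, invented for illustration of how disparate data points might be combined into a color-coded risk flag.

```python
# Toy illustration of supply-chain risk flagging (hypothetical, not
# SAP Ariba's implementation): combine audit, hotline, and media
# signals into a single score, then map it to a traffic-light color.

def risk_score(signals):
    """Weighted average of signal values, each assumed to be in [0, 1]."""
    weights = {"audit": 0.4, "hotline": 0.35, "media": 0.25}  # assumed weights
    return sum(weights[k] * signals.get(k, 0.0) for k in weights)

def risk_color(score):
    """Map a risk score to the kind of color coding shown in the demo."""
    if score >= 0.7:
        return "red"
    if score >= 0.4:
        return "yellow"
    return "green"

# Example: a supplier with poor audit results, many hotline reports,
# and some media coverage scores 0.76 and is flagged red.
supplier = {"audit": 0.8, "hotline": 0.9, "media": 0.5}
print(risk_color(risk_score(supplier)))  # prints "red"
```

A real system would of course weight and normalize far more inputs, but the basic shape of the output, a per-supplier score rendered as a color on a map, is what the demo shows.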

"My personal ambition is to make that accountability part of the status quo," she told me.

The AI uses existing company data and other inputs to locate risk.

But machine learning is notorious for adopting the biases of humans it relies on for information.

"The vast majority of data around supply chain systems is garbage," said Kohl Gill, the founder and CEO of LaborVoices, a company that gathers data directly from thousands of workers in countries like Bangladesh and Turkey to support safe work environments.

Gill said workers don't always trust or use anonymous hotlines or speak to auditors. If they've tried these methods in the past without success, they might feel like they're "shouting into a void," he said. And since much of the data collected about workers by employers is proprietary, and not available to anyone outside the company, getting a clear picture of factory conditions is difficult.

"If you're using audits, self-reported data, hotline data—you're lost," he said. LaborVoices has tried to address this by working outside the company to gather information in laborers' communities, in exchange for information about work opportunities.

Ranganathan is hoping her system at SAP Ariba will tackle these issues. She said the fact that machine learning can flag and report labor abuse in real time, instead of the usual months- or years-long analysis, could help companies take more immediate action. And by creating an open API (making the program publicly available) later this year, she wants SAP Ariba to be at the forefront of cleaning up supply chains.

A map demonstrating higher risk countries.

Even so, artificial intelligence can't address the root causes of forced labor in each country. And reported data isn't always reliable.

"When you get these forced labor reports back, it's not just sitting in an email," she said. "It's urgent—companies don't want to get fined while they mitigate that risk."