Apple reportedly threatened to boot Facebook from the App Store over human trafficking concerns


Apple threatened to remove Facebook from its App Store after a report about an online slave market.
The BBC in 2019 reported that human traffickers were using Facebook's services to sell domestic workers.
The Wall Street Journal reports that Facebook knew about the practice even before Apple made its threat.

Apple threatened to kick Facebook off its App Store after a 2019 BBC report detailed how human traffickers were using Facebook to sell victims, according to The Wall Street Journal.

The paper viewed company documents showing that a Facebook investigation team was tracking a human-trafficking market in the Middle East whose organizers were using Facebook's services. What appeared to be employment agencies were advertising domestic workers who could be supplied against their will, per the Journal.

The BBC published a sweeping undercover investigation of the practice, prompting Apple to threaten to remove Facebook from its store, the paper said.

An internal memo shows that Facebook was aware of the practice even before then: In a report dated 2019, a Facebook researcher asked, "was this issue known to Facebook before BBC inquiry and Apple escalation?" per the Journal.

Underneath the question reads, "Yes. Throughout 2018 and H1 2019 we conducted the global Understanding Exercise in order to fully understand how domestic servitude manifests on our platform across its entire life cycle: recruitment, facilitation, and exploitation."

Apple and Facebook did not immediately respond to requests for comment.

The Wall Street Journal on Thursday also reported that Facebook's AI content-moderation systems cannot detect most of the languages used on the platform, a capability the company needs if it is going to monitor content in the foreign markets where it has expanded.

The paper found that many human moderators don't speak the languages used in those markets, leaving a blind spot in the company's efforts to crack down on harmful content.

One result was that drug cartels and human traffickers used the platform to conduct their business, per the Journal.


