When a survivor panellist at Amazon’s Tech Against Trafficking Summit in Seattle last year explained how data sharing with law enforcement had put their family members at greater risk of trafficking, they were met with the following callous response from a director at an anti-trafficking organization: “shit happens”.
Tech giants’ focus on developing tools of mass surveillance and gathering swathes of data from the public, which is then shared with law enforcement to ostensibly tackle human trafficking, is a deeply concerning trend. The zest for this approach amongst anti-trafficking professionals is equally so.
Ineffective at best, deeply harmful at worst
Tools like the Amazon Ring – camera-enabled doorbells – and artificial intelligence were lauded by law enforcement and anti-trafficking professionals alike at the Tech Against Trafficking Summit as effective trafficking prevention strategies. ‘Safety over privacy’ was the oft-repeated mantra, a dangerous maxim given that data collected on human trafficking is “limited and notoriously inaccurate”, not to mention the ingrained biases which may be present in huge datasets.
The deprioritization of people’s privacy in favor of increasingly invasive tech tools in the name of anti-trafficking is worrying, particularly as voices of people with lived experience present at the Summit were summarily dismissed when they raised the consequences – ineffective at best, deeply harmful at worst – that could be unleashed with these tools.
The Amazon Ring, for example, has been dubbed “the largest civilian surveillance network the US has ever seen”, made all the more sinister as Amazon has given around one in 10 police departments across the country the ability to access this recorded content.
Journalist and child trafficking survivor Sabra Boyd attended the Summit and shares her perspectives in openDemocracy, breaking down how big tech’s drive to collect and sell people’s data is putting trafficking survivors in danger.
It is counterintuitive to expect companies to ensure trafficking survivors’ safety when their business models profit from the sale of users’ data – especially when they’re developing tools for law enforcement. There are also good reasons why most trafficking survivors do not trust the police to effectively protect them. Apart from concerns around immigration status, some survivors I know were trafficked by law enforcement officers. Others’ traffickers were protected by police. From this vantage point, these tools are like giving a serpent a second set of fangs while telling you to trust the snake with your life.
Tech won’t solve trafficking
While technology has the potential to play a helpful role in supporting trafficking survivors, what we know for certain is that holistic approaches are needed to identify and implement truly effective anti-trafficking strategies.
Survivors and allies have been sounding the alarm on what effective anti-trafficking measures look like for decades. Ensuring survivors are supported to access housing, healthcare, and education. Protection from criminalization for crimes they were forced to commit as a result of their exploitation, and expunging criminal records so that survivors can rebuild their lives and find jobs. Protection from detention and deportation. These are some of the much-needed interventions that could be complemented with tech innovations that would serve the interests of survivors, not law enforcement.
Surviving the physical abuses of being trafficked is only the first step in not dying. Every trafficking survivor I know lives with chronic illness or disability. If the tech industry actually cared about trafficking survivors, they would build apps to make life easier for people living with disabilities. They would build apps that help survivors communicate more easily. They would build apps that protect me from my trafficker, and make it easier for me to whistleblow when he enjoys impunity from law enforcement.