Consent apps don’t stop sexual violence, so stop trying to make them

Yesterday, NSW Police Commissioner Mick Fuller suggested technology should be part of the solution to growing concerns over sexual assault. He encouraged a serious discussion about using a digital app to record positive sexual consent.

In our research, we looked at a wide range of mobile applications and artificial intelligence (AI) chatbots that have been used to attempt to tackle sexual violence over the past decade. We found that these apps had many limitations and unintended consequences.

How apps are used to fight sexual abuse

Apps aimed at responding to sexual harassment and assault have been circulating for at least a decade. With the support of government initiatives, such as the Obama administration’s 2011 Apps Against Abuse challenge, and global organizations, such as UN Women, they have been implemented in corporate environments, universities and mental health services.

These applications are not limited to documenting consent. Many are designed to offer emergency help, information, and ways for survivors of sexual violence to speak out and gather evidence against perpetrators. Promoters often present these technologies as empowerment tools that support women through accessible and anonymous reporting channels.

In the case of the proposed consent app, critics noted that efforts to time-stamp consent ignore the fact that consent can be withdrawn at any time. In addition, a person may consent under pressure, out of fear of repercussions, or while intoxicated.

If a person indicates consent at some point but circumstances change, the record could be used to discredit their claims.

How digital apps fail to tackle sexual violence

Using apps won’t solve many long-standing problems with common responses to sexual violence. Research indicates that security apps often reinforce rape myths, such as the idea that sexual assault is most often perpetrated by strangers. In fact, the vast majority of rapes are committed by people the victims already know.

Usually marketed to women, these applications collect user data through persistent cookies and location-based tracking. Even “anonymized” data can often be re-identified.

Digital tools can also facilitate violence. Abusive partners can use them for cyberstalking, giving them constant access to victims. Apps designed to encourage survivors to report violence raise similar concerns, as they fail to address the power imbalances that lead authorities to discredit survivors’ accounts.

Apps don’t change the big picture

Introducing an app does not in itself change the larger landscape in which sexual violence cases are handled.

The high-profile sexual abuse scandal involving Larry Nassar, a former USA Gymnastics and Michigan State University doctor who was convicted of a series of sexual offenses after being accused by more than 350 young women and girls, led to reforms including the SafeSport app.

The app received 1,800 reports of sexual misconduct or abuse within a year of its introduction. However, due to a lack of funding, these reports could not be properly investigated, undermining the organization’s promises to enforce penalties for sexual misconduct.


Read more: Anti-rape devices may have their uses, but they don’t solve the ultimate problem


Poor implementation and cost-saving measures compromise user safety. In Canada and the United States, the hospitality industry is deploying smart panic buttons to 1.2 million hotel and casino employees. This is a response to widespread sexual violence: a union survey found that 58% of hotel employees had been sexually harassed by a guest and 65% of casino workers had experienced unwanted contact.

Employers are now required by law to provide panic buttons, but they are turning to cheap, substandard devices, raising safety concerns. The law does not prevent them from using these devices to monitor the movements of their employees.

Who owns the data?

Even if implemented as intended, apps raise questions about data protection. They collect large amounts of sensitive data, which is stored in digital databases and on cloud servers vulnerable to cyberattacks.


Read more: The sad truth: tech companies track and misuse our data, and there’s little we can do


The data can be owned by private companies, which can sell it to other organizations, allowing authorities to circumvent privacy laws. Last month, it was revealed that U.S. Immigration and Customs Enforcement had purchased access to the Thomson Reuters CLEAR database, which contains information on 400 million people whose data the agency could not legally collect on its own.

In short, apps do not protect victims or their data.

Why we need to take this ‘bad idea’ seriously

Fuller, the NSW Police Commissioner, admitted his recommendation might be a bad idea. His suggestion rests on the premise that the key issue to resolve is ensuring consent is clearly communicated. This misunderstands the nature of sexual violence, which is rooted in unequal power relations.

In practice, a consent app is unlikely to protect victims. Research shows that data collected through new forms of investigation often yields evidence that is used against victims’ wishes.

There are other reasons why the consent app is a bad idea. It perpetuates flawed assumptions about the ability of technology to “fix” societal harm. Consent, violence and accountability are not data problems. These complex issues require cultural and structural responses, not just quantifiable, time-stamped data.

This article by Kathryn Henne, Professor and Director, School of Regulation and Global Governance, Australian National University; Jenna Imad Harb, PhD student, Australian National University; and Renee M. Shelby, Postdoctoral Fellow, Sexualities Project, Northwestern University, is republished from The Conversation under a Creative Commons license. Read the original article.

Published March 22, 2021 – 12:18 UTC

