A new Twitter study has revealed that female journalists and politicians receive an abusive tweet every 30 seconds.
The study, carried out jointly by Amnesty International and the artificial intelligence software company Element AI, was based on a crowdsourcing project — code-named Troll Patrol — that examined 288,000 tweets sent to female politicians and journalists in the US and UK during 2017.
Amnesty International, which considers online abuse of women a human rights violation, said Tuesday that more than 6,500 volunteers, ages 18 to 70, from 150 countries participated in the project, logging 2,500 hours analyzing the tweets.
The 778 women targeted by the tweets spanned both the conservative and liberal ends of the political spectrum, and the journalists among them worked for an equally diverse range of media publications.
Based on the Troll Patrol’s work, Element AI determined that 1.1 million abusive or otherwise problematic tweets were sent to the women involved in the study.
With 525,600 minutes in a year, 1.1 million tweets works out to roughly two per minute — meaning, on average, one abusive tweet is directed at a woman on Twitter every 30 seconds.
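The arithmetic behind the headline figure is easy to check directly. A minimal sketch, using only the totals the study reports (1.1 million tweets across one year):

```python
# Rough check of the study's headline rate, using its reported totals.
abusive_tweets = 1_100_000        # abusive or problematic tweets identified by Element AI
minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a (non-leap) year

tweets_per_minute = abusive_tweets / minutes_per_year
seconds_between_tweets = 60 / tweets_per_minute

print(f"{tweets_per_minute:.2f} tweets per minute")       # ≈ 2.09
print(f"one every {seconds_between_tweets:.0f} seconds")  # ≈ 29
```

The exact result is one tweet roughly every 29 seconds, which the study rounds to the "every 30 seconds" figure in its headline.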
‘We found that, although abuse is targeted at women across the political spectrum, women of colour were much more likely to be impacted, and black women are disproportionately targeted,’ Milena Marin, Senior Advisor for Tactical Research at Amnesty International, said in a statement posted on Amnesty’s website.
‘Twitter’s failure to crack down on this problem means it is contributing to the silencing of already marginalized voices.’
Compared with white women, the study found that black women were 84 per cent more likely to receive abusive tweets, while Latinx women were 81 per cent more likely to be threatened with physical violence. Meanwhile, Asian women were 70 per cent more likely to receive ethnic, racial or religious slurs.
The study classified as abusive any content that violated Twitter’s own rules of conduct, including tweets that threatened or promoted violence against people based on race, gender, religion or other categories.
Problematic content was said to be ‘not as intense as an abusive tweet’ and was defined as content that is hurtful or hostile, especially if repeated to a person multiple times.
Examples of abusive or problematic tweets Amnesty called out included a woman being told she is ‘dumb, DUMB and DUMBER’ and to ‘Go back to Cuba,’ as well as another woman who received a tweet stating, ‘I would rather hit you in the face with a large sledgehammer you white hating racist bitch.’
Twitter, according to Amnesty, responded by asking the human rights organization to clarify its definition of the term ‘problematic’ while also stating that it was necessary to protect freedom of expression and ensure ‘policies are clearly and narrowly drafted.’
Despite pointing out Twitter’s apparent inability to prevent such rampant trolling of women on its platform, Marin said that the study and Troll Patrol weren’t ‘about policing Twitter or forcing it to remove content. We are asking it to be more transparent, and we hope that the findings from Troll Patrol will compel it to make that change.’
Still, the study’s results meant that Amnesty now has ‘the data to back up what women have long been telling us – that Twitter is a place where racism, misogyny and homophobia are allowed to flourish basically unchecked.’
In an updated statement to Amnesty, which was provided to WIRED, Twitter’s legal, policy, and trust and safety lead, Vijaya Gadde, wrote that Twitter remains ‘committed to expanding our transparency reporting to better inform people about the actions we take under the Twitter rules.’
Gadde also thanked Amnesty for its recommendations, among them that ‘Twitter should publicly share comprehensive and meaningful information about the nature and levels of violence and abuse against women, as well as other groups, on the platform, and how they respond to it,’ and that it should improve its reporting mechanisms to ‘ensure consistent application and better response to complaints of violence and abuse.’