
Facebook to revisit policy after online campaign

  • 29-05-2013 3:25pm
    #1
    Registered Users, Registered Users 2 Posts: 8,427 ✭✭✭


    http://www.independent.ie/business/technology/how-three-women-took-on-sexist-facebook-and-won-29306599.html
    WELL Facebook has caved.

    It took just seven days for the tech titan to bow to the thousands of people protesting online (and via Facebook – oh the irony) and agree to change its user guidelines and the way it moderates photos and posts which celebrate rape and violence against women. Having toiled as a tech hack for years, enduring an endless stream of bland comments from Facebook executives toeing the corporate line, I can tell you this is no small moment.


    Facebook’s PR team’s de facto position is usually “no comment” when asked about the service; alternatively, they make some kind of reference to its ever-evolving terms and conditions.

    For three women: Laura Bates, the British 26-year-old founder of the Everyday Sexism Project; US writer Soraya Chemaly; and Jaclyn Friedman, the American creator of Women, Action & the Media, to make Facebook change its mind about anything, never mind something as large as its moderation technique and what it considers hate speech, in only one week is something quite remarkable indeed.

    It would be all too easy to dismiss their protestations as three women “who just don’t have a sense of humour” about a few bits of “laddish banter” shared in the digital microcosm that is Facebook. Or to tell them to change their Facebook friendship groups.

    However, this catalogue of images and posts, which the women confronted Facebook with, falls far outside that tent. I can’t publish them here on The Telegraph’s website, which is family friendly (that should tell you something in itself). But the one at the top of this story – which was ‘allowed’ by Facebook’s team of outsourced moderators after it was rightly reported – should give you, dear reader, enough of a bitter flavour of what they were on about.


    These images, routinely uploaded by individuals and groups around the world, depict sexual violence, domestic violence and rape, and are often accompanied by little ‘jokes’.


    Until today, Facebook – which disallows any equivalent hate images or messages broadcasting homophobia, racism or anti-Semitism – had been blithely letting these images be shared, ‘liked’ and commented upon across its network for years.



    As Laura Bates tells me: “These incredibly graphic and upsetting images were popping up everywhere. They were becoming prolific. We had parents get in touch telling us their children had seen images of little girls with black eyes in their News Feeds. They were appearing in Facebook Groups – which had nothing to do with women or violence – such as an Atheist Group.


    “What was happening was a normalisation of these types of images across Facebook. And while Facebook executives kept telling us that the site had to allow people freedom of speech – what they didn’t account for was how these images were stifling other women’s freedom of expression – as they left the site distressed and speechless.”



    And what made no sense at all was why Facebook was treating this form of hate speech differently to other forms of hate speech it proactively bans – especially once alerted to it.


    In the seven days since the three women came together to launch this campaign, some 15 major advertisers have suspended their Facebook marketing campaigns - while 5,000 emails and 60,000 tweets (using the hashtag #FBRape) have been posted pushing the social network to act.



    And today – victory was all theirs.


    It is good that they are revisiting how their policy is implemented; I know people got fed up of reporting images and no action being taken.

    Have you ever reported an image or group on facebook?
    Would you be more inclined to now?


Comments

  • Closed Accounts Posts: 7,484 ✭✭✭username123


    Morag wrote: »
    Have you ever reported an image or group on facebook?

    I simply "unfollow" things that I dont want to see in my newsfeed. I cant say Ive been subjected to many things that I would consider reporting - the most graphic were probably images posted after the Boston bombings. Usually the stuff that annoys me are pictures of babies with tumours on their faces or these fake stories about some unfortunate who had to sell his goat etc.. Obviously if I saw something criminal I would report it. But Im far more inclined to just disassociate myself with the source of any nonsense I dont agree with.


  • Closed Accounts Posts: 6,154 ✭✭✭Dolbert


    I've reported many pages of rape jokes/ domestic violence 'memes', but more often than not been told that it didn't violate the terms of service. Up until now, images like this were apparently fine and dandy, while pages about breastfeeding and images of mastectomy scars were not. If you think pages about rape jokes and gendered violence are rare on FB think again, there are thousands of them.


  • Moderators, Social & Fun Moderators, Regional East Moderators, Regional North West Moderators Posts: 12,523 Mod ✭✭✭✭miamee


    I think I would be more likely to report anything untoward, sexist or violent that I saw on Facebook after this. However, I have never come across any of the images that were being reported. I must be very lucky in who I am friends with.


  • Registered Users, Registered Users 2 Posts: 1,135 ✭✭✭starling


    Delighted this campaign worked so well and so quickly!
    I've reported stuff to Facebook before, albeit stuff that's not quite as extreme as the kind of images and pages that were focused on in this particular campaign.
    I've also reported the more overtly misogynist comments you see on various pages, with the thinking that "Facebook claims it would not allow that kind of comment/hate speech against POC, religious groups, sexual orientations etc, so why is it okay for women?"
    I've rarely had a positive outcome from the actual FB response. Actually, I think never would be accurate. Often the poster will remove their comment. Admittedly this is something I'm not 100% au fait with - did the poster get told "your comment has been reported" and decide on their own to delete it, or what was the actual process?

    Very often though I will get the feedback "we reviewed this and it doesn't violate our community guidelines", even if it is something that, when applied to a different group, would reasonably count as "hate speech."
    (I know that it's not always appropriate to put "women" in the same category as "Muslims" or "LGBT people" or whatever, but in certain circumstances you can take something a person says about "teh wimmins" and think: if you replace "women" with "POC" in that sentence, most people would say you were being racist - so maybe you should not be saying it, or at least be challenged on it.)

    IMHO the FB feedback/reporting process is not very transparent and can be pretty frustrating. But then they don't always seem to have the interests or preferences of the actual users (as a whole, not just women) at heart in other areas either.

    I could be cynical, but I imagined that FB had stuff like "We don't tolerate racism or homophobia etc" because it was conceivable that under US law it could face legal repercussions for appearing to tolerate such things, whereas it had less of a legal concern wrt misogyny. That may not be true.

    That said, I can certainly understand anyone who prefers to ignore stuff rather than report it, and we have to choose the course of action that is best for us.
    Edit: Besides which, there are only so many hours in the day. I once saw a picture of two men who had fallen asleep in a car in a state of undress, and reported a whole bunch of comments that were pretty nasty and homophobic, but an hour later literally 1000s of new comments had gone up, all saying nothing but "durty f*gs". It feels pretty pointless to do anything at that stage :(

    Plus of course I think FB is pretty sh1t in general and I only use it for certain limited things that it does best.

