
So "X" - nothing to see here. Elon's in control - Part XXX


Comments

  • Registered Users, Registered Users 2 Posts: 7,456 ✭✭✭HalloweenJack


    Genuine question: who verifies community notes? Can any user post one? Or do they have to come from blue ticks?

    Because if there is no means of verifying community notes then they could easily be used to contribute to disinformation and are clearly open to abuse.

    Such a lax approach to the truth is a concern, especially when it comes to issues surrounding elections.



  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


Election interference from foreign countries is a matter of national security, not of social media censorship and moderation.



  • Registered Users, Registered Users 2 Posts: 893 ✭✭✭timetogo1




  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


No, not X, nor any other social media platform that is used as a tool. Governments should regulate the acts of interference, not the tools. Foreign interference in the democratic process is a hostile foreign action which needs to have much more serious consequences than fines and bans.



  • Registered Users, Registered Users 2 Posts: 41,057 ✭✭✭✭ohnonotgmail


Hardly surprising that some people are in favour of a system that allows more lies and disinformation.



  • Registered Users, Registered Users 2 Posts: 6,486 ✭✭✭eightieschewbaccy


Let's try a non-election-level one: the Sandy Hook conspiracy about crisis actors etc. led to parents moving state because their lives were at risk. They were already scarred after the events. Should proponents of such conspiracy theories be given free rein on the platform? Opening individual legal cases against each user to remove them from the platform is unrealistic.



  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


    Again, for death threats and other forms of violence and harassment there need to be criminal proceedings, not only social media bans.



  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


I'm not in favour of lies and disinformation; I'm opposed to having a call-centre-level employee decide what counts as lies and disinformation.



  • Registered Users, Registered Users 2 Posts: 41,057 ✭✭✭✭ohnonotgmail


You don't seem to be in favour of any Twitter employees fact-checking.



  • Registered Users, Registered Users 2 Posts: 41,057 ✭✭✭✭ohnonotgmail


And for other lies and disinformation that don't reach a criminal standard, you are happy just to let them be?



  • Moderators, Category Moderators, Entertainment Moderators Posts: 36,711 CMod ✭✭✭✭pixelburp


All smelling faintly of warmed-up libertarianism at this stage.

And as anyone who has been stalked, harassed in real life, or had former teachers appear on their premises without permission will tell you, legal or criminal proceedings don't make the perpetrators run off and hide. It's not some press-button, get-response workflow that makes the pain stop.

Social media is a unique system in that it allows immediate, direct and intimate access to a person's being (for lack of a better word). That's the entire point of the abuse: that they can get in your face. Going to the police to report 1,000 death threats? No, let's at least keep it simple: 10. What are the police gonna do, that famously understaffed institution chronically incapable of helping even with "modest" domestic crimes like rape? Log into `@police` and DM the bully? Don't do it again, TrutherMaga7632. Or else we'll do absolutely nothing, cos we have 0 jurisdiction!

Here's a wild idea: let the media outlet have a system to help genuine victims turn off the tap of abuse; it's not a sacrifice of some higher purpose of free speech. It's a basic, pain-free bit of empathy for those whose experience on social media is "not great".



  • Registered Users, Registered Users 2 Posts: 37,187 ✭✭✭✭odyssey06


What are these serious consequences, then, that would stop the interference? Declaration of war? Trade sanctions?

    How would that stop the interference in a global online world? How do you regulate the acts of interference?

    What utter nonsense.

    "To follow knowledge like a sinking star..." (Tennyson's Ulysses)



  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


Yes, trade sanctions and diplomatic sanctions. Let's actually name the country that does interfere: it's Russia. And now it's facing trade and diplomatic sanctions, not officially for election interference, but the message is clear: the world has had enough of their shite. So it can be done, and it must be done, but governments need to take these actions, not social media.



  • Registered Users, Registered Users 2 Posts: 37,187 ✭✭✭✭odyssey06


    How would that stop them from doing what they were doing?

    So instead of social media doing something, you now want to put obligations on trading companies. So how is that the government taking the actions? How is that any different from the government putting obligations on social media?

    Makes no sense.

    "To follow knowledge like a sinking star..." (Tennyson's Ulysses)



  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


    Yes. No one is forcing anyone to read lies and disinformation, and it's dangerous to come to depend on a third party to do the thinking and filtering.



  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


What actually makes no sense is to create obligations and consequences only for the tool, and not for the perpetrator.



  • Registered Users, Registered Users 2 Posts: 26,011 ✭✭✭✭pjohnson


No one is forcing them, sure, but generally it's people of limited intellectual ability who see the rubbish and believe it due to their own issues.

It's far easier to just remove the rubbish, unless you realise you can try to monetize it by being the only place refusing to remove it.



  • Registered Users, Registered Users 2 Posts: 41,057 ✭✭✭✭ohnonotgmail




  • Registered Users, Registered Users 2 Posts: 41,057 ✭✭✭✭ohnonotgmail


Why doesn't it make sense? Why can there not be obligations and consequences on both?



  • Registered Users, Registered Users 2 Posts: 37,187 ✭✭✭✭odyssey06


    Why? Why does that make no sense?

If you have a lever / point of action within your legal domain that will shut off the problem, then of course you use it.

You were OK with creating obligations and consequences for the tool when it was companies trading with Russia; the sanctions there may harm those companies too, as they lose a sales outlet.

The 'perpetrator' isn't someone you can get your hands on and bring to court - it is a nebulous global network and, behind that shell game, a state actor.

    "To follow knowledge like a sinking star..." (Tennyson's Ulysses)



  • Moderators, Category Moderators, Entertainment Moderators Posts: 36,711 CMod ✭✭✭✭pixelburp


Something something free speech, take responsibility, report to the police, slippery slope. Like I said, warmed-up libertarianism making sure private enterprise has no responsibility to at least help those in trouble. God forbid Saint Musk show a modicum of empathy towards those who use his tool but find themselves targeted by the ásshole demographic.

I've asked this question before and have yet to see a sensible explanation of what the Gardaí are expected to do if I received abuse from, say, a dozen teenagers in Akron, Ohio. Should a task force be sent there to wag their fingers and ask nicely to stop sending me DMs? Oh, and that's assuming the abuser is dumb enough to share any breadcrumbs as to their location.

It's never quite clear how this magical 'report to the police' is supposed to just make it better, instantly. The pivot, I guess, will then become blaming the victim: just put down your phone, Twitter's not real life, etc. etc.



  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


    I think the proper solution is to educate and trust people more, not less.



  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


"what the Gardaí are expected to do if I received abuse from, say, a dozen teenagers in Akron, Ohio"

    Again, regulation of abuse and harassment is not the same as deciding on your behalf what is and what isn't misinformation.



  • Registered Users, Registered Users 2 Posts: 19,233 ✭✭✭✭VinLieger




  • Registered Users, Registered Users 2 Posts: 6,486 ✭✭✭eightieschewbaccy


Thing is, the social media sites that don't attempt some kind of moderation are the more niche and unsuccessful ones. Twitter is effectively the only major social media site that isn't adopting the policy... The same platform can't get advertisers due to its attitude to these issues, be it racism, anti-semitism, disinformation or conspiracy theories. So this discounting of the issues, which you might view as genius on Musk's part, doesn't look like the smartest move overall. Similar to all of Musk's moves on the platform.



  • Moderators, Category Moderators, Entertainment Moderators Posts: 36,711 CMod ✭✭✭✭pixelburp


    Not answering the question. Exactly as I said. Stick to the example please.

Your response to the removal of basic empathetic moderation from Twitter was that victims of abuse et al. report it to the Gardaí; this is your idea, so you tell me: what can the Gardaí do that's equivalent to Twitter's former moderation policy? Have you ever actually rung the Gardaí, BTW, or taken legal action? Neither is quick, nor, depending on the issue, effective.



  • Registered Users, Registered Users 2 Posts: 6,486 ✭✭✭eightieschewbaccy


Sure, why should the families of mass shootings be afforded basic decency? This stuff is also still frequently posted on there, but that's all fine...





  • Moderators, Category Moderators, Entertainment Moderators Posts: 36,711 CMod ✭✭✭✭pixelburp


Just this week we've seen that GB News spat, the end result being a journalist (female, of course) receiving an avalanche of directed online abuse 'cos ... lemme check the notes here ... she was a female news journalist who Laurence Fox didn't wanna shag and was too woke. Hahaha. What a cow.

She should just ring the police then, get it all sorted. Or just put up with it: her fault for being female and a journalist and using Twitter and having Woke opinions anyway. Though given the issue is with GB News, there's a good chance its 12 viewers and fans are all based in the same English village, so yeah, the police probably could arrest them all for harassment.



  • Registered Users, Registered Users 2 Posts: 6,906 ✭✭✭Cordell


OK, to the example: it depends on the abuse. If they are just calling you names, block them and that's the end of it; move on, it's just words from insignificant twats across the world. If there is an actual threat of violence, they won't be in Ohio and X won't be able to protect you; otherwise, again, it's just words from insignificant twats across the world. Yes, X can block them, but that won't change anything.

But I never referred to abuse; I never implied that abuse should go free. My point was about misinformation, not abuse, and this is where I have an issue with having someone decide on others' behalf what is true and what isn't.



  • Registered Users, Registered Users 2 Posts: 7,456 ✭✭✭HalloweenJack


It's a question of fact-checking. You can't 'decide' what's misinformation; it either is or it isn't. What's true can be proved; if it can't be proved, it shouldn't be posted.

Your wording reveals your belief that these moderation teams are acting as sinister gatekeepers who control the narrative, when in reality they would be ensuring accuracy in the debate taking place on their platforms.

Due to libel laws, a lot of traditional media outlets have fact-checking departments, because they can get into a lot of trouble otherwise. It's not always sufficiently thorough, but a lot more is done to ensure that stories aren't just plucked out of thin air and have something to support them.

    People in positions of responsibility and with an audience to influence public debate (say people looking to get elected or those who report the news) should be held accountable for what they say and should be required to provide evidence of their claims.

I don't see why social media should be exempt in this case, and passing the torch to the authorities is just giving them more work that would require more specialisation. Also, given you think people in this area have biases, what's to say the authorities wouldn't 'decide' what is or isn't misinformation?

I agree with another post of yours that people should be educated and trusted more, and having social media challenge and deny misinformation is one way of ensuring that the people using it are trusted sources.


