So "X" - nothing to see here. Elon's in control - Part XXX


Comments

  • Registered Users, Registered Users 2, Paid Member Posts: 36,643 ✭✭✭✭Penn


    "I'm not going to investigate this any further, I trust Louis and Linus enough and if they sided with him he must be in the right."

    And people wonder how misinformation spreads so easily on the Internet...

    Also, is the Linus you're referring to the guy from Linus Tech Tips, who recently had to issue an apology for not doing enough research on products they were reviewing and getting a lot of stuff wrong in their reviews?




  • Registered Users, Registered Users 2 Posts: 37,427 ✭✭✭✭odyssey06


    "Not going to investigate further"

    "Must be right"

    Is that an echo I hear?

    It is certainly not engaging with the evidence or counter arguments.

    You have no idea who made the decision. What does calling them irrelevant even mean in this context?

    If they are irrelevant, why are you talking about them?

    You have provided zero evidence to support your claims. They are without foundation or credibility.

    "To follow knowledge like a sinking star..." (Tennyson's Ulysses)



  • Registered Users, Registered Users 2 Posts: 26,653 ✭✭✭✭pjohnson


    Don't Ministry of Truth his stories with actual facts and reality.



  • Registered Users, Registered Users 2 Posts: 7,142 ✭✭✭Cordell


    Yes, that Linus. If he, who's exposed to that level of scrutiny, can get things wrong, imagine how often, and how badly, those low-level employees get things wrong. And how many people had their channel deleted just because someone saw a kitty meowing the wrong way - they did it to Louis Rossmann in case you missed it.



  • Registered Users, Registered Users 2 Posts: 6,127 ✭✭✭silliussoddius


    But it’s Dinesh, the powerhouse behind 2,000 Mules.



  • Registered Users, Registered Users 2, Paid Member Posts: 36,643 ✭✭✭✭Penn


    I didn't miss the Louis Rossmann thing. I ignored it because I'd never heard of him and, like the 3D printer guy who you casually forgot to mention posts videos about 3D printing guns, I assumed you were leaving out relevant information.

    So fine, I spent two minutes googling. And what I found was that a bunch of his videos were removed one day for breaches of the rules, including two videos of his that just featured his cat meowing. Including this one:

    This video on YouTube, which I've just linked from YouTube, was removed from YouTube. Can't believe this YouTube video which is still on YouTube was removed from the YouTube account which is also still on YouTube.

    Unless of course, it was removed at the time by mistake, and reinstated upon appeal/review. That would explain how the YouTube video is still on YouTube and hasn't been removed on YouTube.

    Look, YouTube's moderation sucks ass. Nobody is saying otherwise. It can be over-reactive, and so much of it is automated that it's open to huge abuse with regard to copyright claims etc. Yet it can be completely blind to some of the horrendous and dangerous stuff, particularly as it relates to children and teens.

    But as a company they still enforce the rules they set, and if a guy is showing how to 3D print guns and they ban his account for it, maybe he should have printed literally everything else in the world except guns.



  • Registered Users, Registered Users 2 Posts: 7,142 ✭✭✭Cordell


    This video on YouTube, which I've just linked from YouTube, was removed from YouTube. Can't believe this YouTube video which is still on YouTube was removed from the YouTube account which is also still on YouTube.

    No need for this; I said flagged, not removed.

    Look, YouTube's moderation sucks ass. Nobody is saying otherwise. It can be over-reactive, and so much of it is automated that it's open to huge abuse with regard to copyright claims etc. Yet it can be completely blind to some of the horrendous and dangerous stuff, particularly as it relates to children and teens.

    That's my point. Moderation sucks ass because it's being delegated to low-level employees who either can't be bothered or don't have the capacity to do it properly. The same kind of employees who would be tasked with fact checking.



  • Registered Users, Registered Users 2, Paid Member Posts: 2,912 ✭✭✭nachouser


    ...



  • Registered Users, Registered Users 2 Posts: 17,180 ✭✭✭✭Grayson


    I used to work for eBay. They have loads of rules about what can't be on the site, and sometimes they get it wrong; I've seen times when the rules were misapplied. For example, it's against the rules to use a celebrity's name to sell something: you can't sell a pair of jeans and show Brad Pitt wearing a similar pair. This meant that I once saw a celebrity item removed incorrectly. It was the actual clothing the celebrity wore, being sold in a charity auction, and someone removed it. It was reinstated.

    The way items are reviewed now is by machine learning. This can remove hundreds of thousands, if not millions, of items in a month. Those items are generally breaking policy, but some may also be illegal. Of the items that are removed, a random selection is sent to a human team who review them to make sure that the program is removing them correctly. That same team also deals with appeals. Because of this, hundreds of thousands of items, many of which are dangerous and illegal, are removed.

    I know Facebook do something similar, but they also have teams dedicated to suicide, terrorism, child abuse etc.

    I'd imagine that YouTube videos are handled the same way. About 3.7 million videos are uploaded every day - that's over 200k hours of video. It's impossible to have people review them all; I'd imagine that even just reviewing reports, it would be impossible to eyeball all the videos. I know that depending on your size, you get different levels of moderation. The largest accounts have account managers, and reports are sent to them first for review. Smaller, but still large, accounts would have managers who review the appeals and deal directly with the account holder.

    But smaller accounts would not have that; they would have material removed by a system, not a person. And a person would look at the appeal.

    And there are some fantastic moderation tools being developed to find fake news. Once a story/post/tweet is flagged as fake, the system can locate similar stories and flag them. They can be removed, reviewed, or just have a warning attached along with a link to a factual story.

    The thing is that Twitter got rid of most of their moderation teams, so there's next to no moderation on the platform. Whereas before people could report posts that were harassing, hateful or just misleading, that's not possible any more; there's no-one to look at them. And as imperfect as these moderation policies are across all platforms, at least they're there. I can tell you that although you might see material on them that is bad, they're still removing millions of items that are far worse. Twitter doesn't care anymore.
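    Roughly, the sample-and-audit flow looks something like the sketch below - a rough Python illustration with made-up names, thresholds and rates, not eBay's or YouTube's actual code. The model auto-removes anything it scores above a threshold, a random slice of those removals goes to a human queue so someone can check the model's accuracy, and appeals land in the same queue.

    ```python
    import random

    REMOVE_THRESHOLD = 0.9   # assumed score above which an item is auto-removed
    AUDIT_RATE = 0.01        # assumed fraction of removals sampled for human audit

    def violation_score(item):
        """Placeholder for the ML model; returns a policy-violation score in [0, 1]."""
        return item.get("violation_score", 0.0)

    def moderate(items):
        """Auto-remove high-scoring items; queue a random sample of removals for humans."""
        removed, human_review_queue = [], []
        for item in items:
            if violation_score(item) >= REMOVE_THRESHOLD:
                removed.append(item)
                if random.random() < AUDIT_RATE:
                    human_review_queue.append(("audit", item))
        return removed, human_review_queue

    def appeal(item, human_review_queue):
        """Appeals always get human eyes - they go into the same review queue."""
        human_review_queue.append(("appeal", item))
    ```

    The random audit is the only thing measuring how often the model gets it wrong, which is exactly where false positives like the cat videos show up.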



  • Registered Users, Registered Users 2 Posts: 19,332 ✭✭✭✭VinLieger


    From everything I've read, Linus isn't supporting him per se; he's simply agreeing he deserves an explanation. However, YouTube are notorious for not giving explanations or reviewing decisions, and that's not new behavior, so imo if you use the platform that's the risk you take.



  • Registered Users, Registered Users 2, Paid Member Posts: 36,643 ✭✭✭✭Penn


    You didn't say flagged, you said:

    "And how many people had their channel deleted just because someone saw a kitty meowing the wrong way - they did it to Louis Rossmann in case you missed it."



  • Registered Users, Registered Users 2 Posts: 7,142 ✭✭✭Cordell


    You're missing the point, again.



  • Registered Users, Registered Users 2, Paid Member Posts: 36,643 ✭✭✭✭Penn


    You're misrepresenting the point, by trying to put forward the idea that actions were taken against his account because of a video of a cat meowing, when really it was a case of that video getting caught up with a bunch of other videos they were actioning.

    Just like you neglected to mention that the 3D print guy was printing guns in his videos (and then circumvented the ban by posting those videos on a friend's account for a year), and that's why actions were taken against his account.

    You're twisting narratives and inventing your own boogeymen (low-level untrained neckbeards) to justify your point, and your point seems to be that moderation isn't good enough, so there should be less moderation, when the reality is that less moderation leads to far worse issues.

    So are you arguing for less moderation, or better moderation? Because better moderation means more moderators, more supervisors, more training and more oversight. However none of that will ever fully eliminate mistakes.



  • Registered Users, Registered Users 2 Posts: 7,142 ✭✭✭Cordell


    when really it was a case of that video getting caught up with a bunch of other videos they were actioning

    That's even worse. The point is, moderation teams are made up of people who will sometimes flag videos of cats meowing - that's the level of their intellectual abilities. That's why we're better off without them.



  • Registered Users, Registered Users 2, Paid Member Posts: 36,643 ✭✭✭✭Penn


    Okay, so you're in favour of less moderation.

    So what happens when videos do need to be removed? Who deals with them?



  • Registered Users, Registered Users 2 Posts: 3,786 ✭✭✭McFly85


    The idea that content moderators are some sort of arbiters of truth is fanciful. They are people employed to remove videos that they believe have violated the ToS, that’s all.

    And leaving everything up and letting the users decide what they believe to be true is an awful idea. People generally have a bias towards one person or another, but facts are always going to be the most important thing in deciding whether to believe someone's statement.

    You see it a lot with conspiracy theorists - the idea that they have the ability to innately know what the truth is, lots of patting themselves on the back for not being sheep and believing what the mainstream media says - and having a platform that does not fact check or remove false statements makes it incredibly easy to exploit these people.



  • Registered Users, Registered Users 2 Posts: 19,332 ✭✭✭✭VinLieger


    So unless a system is absolutely perfect with no possibility of faults whatsoever we should remove it?



  • Registered Users, Registered Users 2, Paid Member Posts: 36,643 ✭✭✭✭Penn


    Plus, when it comes to things like election or vaccine/public health misinformation, the arbiters of what content gets removed are not the individual moderators, or even the moderation team. Such policies would be judged in line with what official sources say. I.e. if someone is spreading a video saying you don't have to be registered to vote, you can just show up with a bill from Netflix with your name on it, the "truth" isn't decided by a YouTube moderator scratching his neckbeard and pondering whether that's right or not; the "truth" is the official election rules.

    So for big-ticket items like voting laws, emergency public health notices etc., the companies are going to go by the recognised official sources, if for no other reason than to protect the company itself from possible liability.

    Moderators are like a HR department in many ways; they're there to protect the company's interests more than anything else.



  • Registered Users, Registered Users 2 Posts: 7,142 ✭✭✭Cordell


    No, what I was saying, and what started this whole debate, was that we should not let these people decide what is misinformation and what isn't, because they won't be able to do so. I was asked for proof; there you have it: they can't even get cat videos right. They can and will make arbitrary decisions, and they will let their own bias guide them, because they don't know any better.



  • Registered Users, Registered Users 2 Posts: 8,111 ✭✭✭Christy42


    That wasn't a misinformation issue though. That was taking too much down when someone was teaching people to make guns.

    No matter what, you still have that type of moderation.


    A lie will go around the world before the truth gets its shoes on. If you allow random lies, then it will just become a conspiracy forum, as the sheer weight of conspiracy posts will outnumber the truthful ones and most regular people will just stop bothering.

    You will always have 4chan, which is how this will inevitably go if moderation is not used.



  • Registered Users, Registered Users 2 Posts: 3,670 ✭✭✭francois




  • Registered Users, Registered Users 2 Posts: 6,127 ✭✭✭silliussoddius


    It's not even a moral case; people like that thrive on seeing the other side get worked up. People have made entire careers out of it. Stop giving him attention, be the bigger person and walk away, and leave him to his chum-bucket echo chamber.



  • Registered Users, Registered Users 2 Posts: 19,332 ✭✭✭✭VinLieger


    Except your argument for what actually happened to those videos and channels effectively boils down to: since the system isn't perfect, they should scrap the whole system. The enemy of good is perfection.



  • Registered Users, Registered Users 2 Posts: 33,451 ✭✭✭✭AndrewJRenko


    Post edited by AndrewJRenko on


  • Registered Users, Registered Users 2 Posts: 6,898 ✭✭✭eightieschewbaccy


    I'm sure the resident defender will justify paying people for getting lots of views on violent attacks.



  • Registered Users, Registered Users 2 Posts: 14,562 ✭✭✭✭hotmail.com


    Twitter has been doing that for ages.

    Despite Musk taking ownership and changing the name, the site still appears amazingly popular among the cohort that likes Twitter.



  • Registered Users, Registered Users 2 Posts: 6,898 ✭✭✭eightieschewbaccy


    I would strongly suspect engagement on the platform has fallen off a cliff.


    Twitter also hasn't been paying users for viral tweets all along; it's a recent thing. Previously, sharing violent videos of crimes would have got a user banned fairly lively; in this case they're actively making money off the tweets.



  • Registered Users, Registered Users 2 Posts: 14,562 ✭✭✭✭hotmail.com


    I've been on Twitter for years. They have much looser rules than Facebook and Instagram in terms of violence and nudity/porn.

    It's a more adult site that doesn't interest children or younger people in general.

    Regarding charges - it's a private company losing money. It's trying to arrest the decline and become profitable.



  • Registered Users, Registered Users 2 Posts: 33,451 ✭✭✭✭AndrewJRenko


    The free speech absolutist strikes again





  • Registered Users, Registered Users 2 Posts: 61,272 ✭✭✭✭Agent Coulson


    The Department of Truth seems to work when Elon isn't happy.

    Elon's not prepared to let people make up their own minds on these allegations, so he deletes content and bans users from his free speech platform.



