d'EU wants to scan WhatsApp messages for kiddy porn


Comments

  • Registered Users, Registered Users 2 Posts: 1,382 ✭✭✭FFVII


    Who is they? Like, what sleepy bollox thinks up this stuff? He's doing it to himself, his family, his kids. Then the sleepy bollox dies and the powers are there for govs/corporations to use; they've made things worse. And then it backfires every now and then and is used on the sleepy bolloxs.


    And you left out sex with escorts from the OP. Used a few times there to get rid of problem sleepy bolloxs in other areas.



  • Registered Users, Registered Users 2 Posts: 11,437 ✭✭✭✭EmmetSpiceland


    Is that some sort of “code”? I don’t follow.

    I never, really, understand the pushback from this sort of thing. It sounds like a great idea. Anything to lessen the scourge of child abuse is a good thing, in my book.

    Taking out the consumer is just as important as getting the providers and creators. They are all links in a dreadful chain.

    I’m sure people are worried about what else they might look for, or uncover, but I really don’t have time for a defence of ‘yes I was doing something illegal but how they found out wasn’t fair’.

    They should go for the platforms, as well. Get them to clean up what they’re hosting.

    “It is not blood that makes you Irish but a willingness to be part of the Irish nation” - Thomas Davis



  • Registered Users, Registered Users 2 Posts: 6,292 ✭✭✭Ubbquittious


    No it's a sh1t idea. Let them find some other way to go after kiddy fiddlers in a way that doesn't affect me or my privacy. I don't want some jumped-up little sh1ts called Rupert & Tarquin from GCHQ looking through my messages, they can take a run and jump!

    Better to leave the EU than let those cnuts take even more freedom off us.



  • Registered Users, Registered Users 2 Posts: 13,439 ✭✭✭✭Purple Mountain


    I know this is AH but I thought an adult would have more respect than to use flippant terms like kiddy fiddlers/kiddie porn.

    Call it out like Emmett above. It's child sexual abuse.

    And they're not fiddlers. They're abusers.

    I've no problem with my privacy being infringed if it stops children being victims of these despicable crimes.

    To thine own self be true



  • Registered Users, Registered Users 2 Posts: 4,620 ✭✭✭enfant terrible


    Why do you call it "kiddy porn" instead of its real name child abuse videos?



  • Registered Users, Registered Users 2 Posts: 2,294 ✭✭✭YellowFeather


    If you put a message somewhere digitally, it is no longer private in reality. And, the people who may look at the message - they don’t care about its content in general.

    For want of a better example, look at the Depp vs Heard case. All kinds of things are coming out in court because of messages. They are not private. And if your company gets sued, and you are considered as perhaps(!) being involved in the issue, your work phone, laptop, camera, whatever equipment can be taken and analysed. If it’s a criminal case, all of your stuff can be taken, but that’s a different topic.

    So forget about Rupert and Tarquin looking through your messages and having a good giggle down the pub. These guys have seen thousands of weird messages. This is about child protection from horrendous things. I think that’s more important.

    Edited to add - nobody is going to read all of your messages in this context. Even if they were made available, they would be filtered for specific content.



  • Registered Users, Registered Users 2 Posts: 6,412 ✭✭✭Jequ0n


    Ah sure. They’ll only be asking for child porn material to be flagged and totally ignore any other illegal content... At least they get some buy in from the white knights who endorse an idiotic idea thinking this will end any child exploitation.

    Post edited by Jequ0n on


  • Registered Users, Registered Users 2 Posts: 9,034 ✭✭✭Ficheall


    How do they scan for things like this? Google Photos occasionally sends me memories folders - or whatever they're called - like "Rainbows", "Beaches" etc., after presumably having had some AI filter through all of my photos - I can't imagine a person has the mind-numbing task of skimming through them. (I'm somewhat dreading/looking forward to the day they send me an album of "Poops" based on photos from a couple of months when I was having bowel issues...)

    Gmail also must scan your email to offer those predicted responses etc, again with an AI.

    All of this scanned info must be stored somewhere, right?

    And the AI to detect CP... While it may be easy for us to recognise it, I'd imagine it's a trickier thing for an AI to be trained for?



  • Registered Users, Registered Users 2 Posts: 206 ✭✭Amenhotep


    How about the Gardai visiting your home and searching through all your belongings?

    If you have nothing to hide you shouldn't mind...


    The WhatsApp thing doesn't bother me so much, we are using its service in the cloud. What does bug me, however, is that Windows and macOS will be scanning local drives for this stuff - again, if it were online photo storage like Flickr/Picasa, fair enough, but a line is crossed when they scan your local personal devices.

    Also, what are they using to gauge what's CP material? What's the machine learning algorithm?

    Where are all these sample images stored? You don't think some employees with less morals will be copying some of this data?

    I have photos of my kids swimming on a beach, their bare sandy bums exposed - is that CP? Who decides?

    It's no surprise that the people for this are also the little authoritarians looking for "hate speech" legislation...


    dangerous dangerous people.



  • Registered Users, Registered Users 2 Posts: 461 ✭✭HerrKapitan


    It's just a front for scanning your phone and eroding EVERYBODY'S rights, innocent or guilty. They've picked an excuse that shocks everyone so that they would go along with it.

    Why not scan for terrorism or drug trafficking before now?



  • Registered Users, Registered Users 2 Posts: 68,317 ✭✭✭✭seamus


    @Ubbquittious wrote:

    Let them find some other way to go after kiddy fiddlers in a way that doesn't affect me or my privacy. 

    You realise why this is a paradox, right? It's like saying, "Revenue should find out some other way to catch tax evaders without me having to tell them how much I earned last year".

    Absolute personal privacy is incompatible with maintaining a society. There is always a balance that has to be struck between the two. Libertarian unicorns of being able to do whatever you want with zero state interference fall at the first hurdle when you realise that includes protecting paedophiles, human traffickers and other vile scum.

    In this case though, I'm not sure how much value there is in it. If you scan stuff on legit apps, you push abusers into the dark web. Which is where most of them will be anyway. There's the argument that "it shouldn't be facilitated on mainstream apps", which is valid. But social norms will for the most part keep it out anyway. Nobody will tolerate receiving abuse images over normal channels, and I'm guessing paedophiles don't have big open groups with 150 members called, "Subscribe here for sexy children".

    I have no problem with cloud service providers being obligated to scan for dodgy material once it has been uploaded to their service, or with creating an obligation in law for anyone who provides cloud services to police what's put on their servers. It's important, though, to include protections to limit the scope of what they may search for. Child porn and violence are one thing - no bother there with Google Photos picking them up.

    But how about seditious material? Idle debate about a coup? How do we stop future lawmakers from expanding these powers to find and arrest political dissidents?

    This EU initiative will likely die as the technical impossibility / pointlessness of scanning encrypted content is driven home. But it's still a discussion that needs having.



  • Registered Users, Registered Users 2 Posts: 7,964 ✭✭✭growleaves


    'Absolute personal privacy is incompatible with maintaining a society. There is always a balance that has to be struck between the two'

    Well the balance being struck is that anyone who cares about their privacy ought not to use electronic communications for anything of importance. Because sending an email, a 'private' message etc. is like sending a postcard.

    It's nothing new anyway. The Garda Detective Unit can access your phone and go through it. There was a piece about it on Journal.ie years ago, along with allegations that some Gardai were abusing this power.

    Guarantee you many boardsies are being checked out from time to time by garda surveillance units for having 'politically extreme' opinions, and not just 'rabid right-wing' boardsies but anyone who sticks their oar in on any sensitive topic. They know your name and have probably gone through your phone once or twice at least.



  • Registered Users, Registered Users 2 Posts: 6,292 ✭✭✭Ubbquittious


    If you send someone a message and it's properly end-to-end encrypted only the recipient can betray you. I won't forget about Rupert and Tarquin because they are only human and if they see something funny they will giggle. Right now they have to take actual equipment off you and they can only see what you haven't already deleted. What they are proposing is that they can look through everything from the comfort of their office chair at a central location.



  • Registered Users, Registered Users 2 Posts: 2,108 ✭✭✭CGI_Livia_Soprano
    Holding tyrants to the fire


    You think Rupert and Tarquin will "giggle" at pictures of "kiddy porn?"



  • Registered Users, Registered Users 2 Posts: 7,964 ✭✭✭growleaves


    'So forget about Rupert and Tarquin looking through your messages and having a good giggle down the pub. These guys have seen thousands of weird messages.'

    Who are Rupert and Tarquin?

    Guards have already been accused of abusing surveillance powers.

    Say your girlfriend sends you pictures of herself topless. Does it matter if a guard has a **** over the photos?

    You're not addressing privacy concerns at all. You're just saying that privacy doesn't matter and to get over it.

    Suppose they need to see inside your bedroom at all times to ensure that criminals don't get away with crimes. Is that okay? Do you have something to hide? Maybe you're the criminal etc., etc.



  • Registered Users, Registered Users 2 Posts: 7,755 ✭✭✭MrMusician18


    Indeed, what is the difference between images of child exploitation and an image of your child playing in the bath?

    In Ireland you can be convicted of possessing images of child exploitation even if you receive them in an unsolicited manner and subsequently delete them.



  • Registered Users, Registered Users 2 Posts: 68,317 ✭✭✭✭seamus


    From a scanning point of view, there's very little difference. Any scanning software typically looks for an abundance of skin tones and such, though this software is getting better all the time and using machine learning to better identify not only people in images, but their rough ages and what they're actually doing.

    From a legal point of view, there's a big difference though. Merely being in possession of pictures of a naked child is not illegal; it would need to be proven in court that the purpose for which someone possesses these images is illegal.

    It also has to be remembered that it's not and will never be a case that some computer flashes an alert with your name on it and suddenly officers are kicking in your doors and confiscating your equipment. Any images flagged by software are reviewed by a human to sift between irrelevant images and potentially iffy images.

    Any "potentially iffy" images are then further reviewed to assess whether this clearly requires further follow-up, or if it's just some sketchy image that has been incidentally retrieved or cached.

    Can you give a source for this claim that you can be convicted of possession of something you don't possess? As far as I know, if you receive illegal material unsolicited and then destroy it immediately, you cannot be charged with possession, though you may be liable for prosecution under child protection laws and destruction of evidence.



  • Registered Users, Registered Users 2 Posts: 7,014 ✭✭✭Allinall


    And yet you put up ridiculous posts here for everyone to giggle at?



  • Posts: 0 [Deleted User]


    "They" can already see everything you're seeing through your own eyes due to the Covid microchips we're all now implanted with.



  • Registered Users, Registered Users 2 Posts: 7,380 ✭✭✭timmyntc


    These kinds of powers are just the start, once they get a foot in the door they will then use the existing legislation to justify further breaches of your privacy.

    Shur don't we already scan for CSAM material - scanning your messages for "extremist content" is just an extension of that.

    They will keep pushing and pushing for more and more surveillance on everything you do, say and have until you have no privacy left. Privacy is an outdated concept anyway; if you have something you don't want in public then it's obviously immoral and wrong.



  • Registered Users, Registered Users 2 Posts: 2,122 ✭✭✭eggy81




  • Registered Users, Registered Users 2 Posts: 9,211 ✭✭✭Royale with Cheese


    Some people actually think this is a good idea? We have end-to-end encryption currently and people are happy to give that up because something something paedos? That episode of Brass Eye springs to mind. This isn't about child porn, it's about using potentially anything in the end-to-end encrypted messages you currently send to incriminate you. Facebook will have free rein over your WhatsApp conversations on top.



  • Registered Users, Registered Users 2 Posts: 8,454 ✭✭✭ceadaoin.


    It's not "porn". It's abuse. Images of child abuse. "Kiddy porn"? What a disgusting, minimizing and downright offensive term.



  • Registered Users, Registered Users 2 Posts: 13,062 ✭✭✭✭TheValeyard


    You cannot be found guilty if you receive unsolicited illegal material. Otherwise that would be a great way to take out potential enemies. It needs to be proven that you deliberately set out to receive and view such abhorrent material.

    All eyes on Kursk. Slava Ukraini.



  • Registered Users, Registered Users 2 Posts: 13,062 ✭✭✭✭TheValeyard


    It's worrying that some here are happy to surrender freedoms to stop the online boogeyman. Interpol have various means and clever ways to catch these predators. Giving governments the right to scan your messages is a dangerous slippery slope. We are not China.

    All eyes on Kursk. Slava Ukraini.



  • Posts: 1,010 ✭✭✭ [Deleted User]


    Scan away. If you are using any sort of electronic communication to do illegal stuff you are an idiot and deserve to be caught.



  • Registered Users, Registered Users 2 Posts: 6,292 ✭✭✭Ubbquittious


    It is a well established term, everyone knows what it means. Putting a different name on it won't make it any better



  • Registered Users, Registered Users 2 Posts: 9,138 ✭✭✭Gregor Samsa


    Apple's technology, which wasn't actually rolled out in the end, didn't scan the actual photos as such, and AI didn't have to be trained for it.

    Each photo uploaded to their iCloud system was to be "hashed" - turned into a unique string of letters and numbers representing the image, using a specific algorithm. There's already a database of hashes of known child abuse images provided by the National Center for Missing & Exploited Children and other child protection organisations. Apple would compare the hash of the uploaded photo to the database to see if there was a match. It would therefore only detect existing and confirmed child abuse images - it wouldn't do anything in relation to pics of your kids in the bath, or even brand new child abuse images. What was novel about it was that usually an edited photo (even one that's just been resized) would have a different hash to the original, but Apple found a way to recognise the hash of an edited photo as relating to the hash of the original (a rough sketch of the matching step is below). The advantage of using hashes is that no one has to look at the actual photos of abuse, and the technology doesn't have to be "fed" actual child abuse images to learn from, as AI would need.

    People got their knickers in a twist that Apple were snooping on photos, when all they were doing was comparing a string of meaningless characters. But that’s the way the world works - you don’t have to actually understand something to object to it.
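
    For anyone curious what that matching step looks like in practice, here is a minimal sketch in Python. It is purely illustrative: the database is an empty placeholder, and it uses a plain SHA-256 file hash, whereas Apple's NeuralHash was a perceptual hash designed to survive resizing and re-encoding.

    ```python
    import hashlib

    # Placeholder for the set of hashes of known, confirmed abuse images
    # (in reality derived from NCMEC and similar organisations).
    KNOWN_BAD_HASHES: set[str] = set()

    def file_hash(path: str) -> str:
        """Hash the raw bytes of a file. An exact hash like this only catches
        byte-identical copies; a perceptual hash is needed to catch edited ones."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def matches_known_image(path: str) -> bool:
        # Only hash strings are compared; nobody looks at the photo itself
        # unless a match gets flagged for review.
        return file_hash(path) in KNOWN_BAD_HASHES
    ```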



  • Registered Users, Registered Users 2 Posts: 6,292 ✭✭✭Ubbquittious


    It wasn't just stuff you uploaded to iCloud though. There was a local element to it as well that would go through the files stored on the iDevice itself and hash those, so if you had some offending images planted on your phone and just stored locally, you could potentially still get a visit from the bhoys in blue.



  • Registered Users, Registered Users 2 Posts: 16 AttractiveAndSingle


    Why do we need privacy at all in your opinion?



  • Registered Users, Registered Users 2 Posts: 11,437 ✭✭✭✭EmmetSpiceland


    Post - “snail mail” - is private. But if someone sends something illegal through An Post, and they scan it, they can withhold the item, or arrest the recipient on collection. Is that an invasion of privacy?

    Someone can buy drugs from a dealer on a street corner. If a cop sees this they can go up and arrest them. Should the cop mind their own “business”? Is that an invasion of privacy?

    We did well in this country making paedophilia socially unacceptable in 1994. But just because people started to view these acts as “wrong” didn’t mean they just went away. There are just as many predators out there as there were then. I just don’t think sacrificing a small amount of online privacy is a big deal if it means catching, or discouraging, child abusers.

    “It is not blood that makes you Irish but a willingness to be part of the Irish nation” - Thomas Davis



  • Registered Users, Registered Users 2 Posts: 7,964 ✭✭✭growleaves


    Paedophilia was socially acceptable in 1993? What are you talking about? What happened in 1994?



  • Registered Users, Registered Users 2 Posts: 5,744 ✭✭✭kleefarr


    Maybe they should take a leaf out of their own book. From what I have seen recently, it's those in power and their pals that are doing the disgusting things to kids.

    Could post a link but it would probably be taken down.



  • Registered Users, Registered Users 2 Posts: 2,294 ✭✭✭YellowFeather


    Interesting, but how would they calculate old hashes? Trial and error?



  • Registered Users, Registered Users 2 Posts: 7,755 ✭✭✭MrMusician18


    An Post don't open letters and read the messages to check their contents - which is what this proposal is.

    It also presumes everyone is a criminal until cleared.



  • Registered Users, Registered Users 2 Posts: 1,194 ✭✭✭Jarhead_Tendler


    No problem with this.



  • Registered Users, Registered Users 2 Posts: 68,317 ✭✭✭✭seamus


    You mean how would they trace it back? Certain hashing algorithms don't produce completely random outputs. When input A looks like input B, then hash A may also look like hash B. This allows you to compare hashes and have a certain level of confidence about whether the inputs are similar.

    We tend to think about hashing algorithms in terms of encryption, and being able to do this with a hashed password would be bad. But hashing is used for lots of other things, and being able to compare data based on their hashes rather than having to do a full comparison has lots of good applications.

    There are also ways to compute hashes that don't necessarily take every byte of data into the hash. For images, for example, you can calculate a number for the general palette within the picture to get an idea of the tones of it. This is how, if you do an image search for "green", Google finds a load of green images.

    You can also take samples from various parts of the picture, calculate the tone in those individual regions and then combine them to create a larger hash. This is one way to get more accurate image comparisons because it's less susceptible to manipulations like adding borders or reducing the resolution of the image.
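
    A rough sketch of the kind of region-based hash described above, in Python using the Pillow imaging library (my choice for the sketch). This is a simple "average hash", for illustration only; real perceptual-hash systems are considerably more sophisticated.

    ```python
    from PIL import Image  # pip install Pillow

    def average_hash(path: str, grid: int = 8) -> int:
        """Shrink the image to a grid x grid greyscale thumbnail, then record
        whether each cell is brighter or darker than the overall average.
        Visually similar images produce similar bit patterns, unlike a
        cryptographic hash where any change scrambles the whole output."""
        img = Image.open(path).convert("L").resize((grid, grid))
        pixels = list(img.getdata())
        average = sum(pixels) / len(pixels)
        bits = 0
        for i, value in enumerate(pixels):
            if value > average:
                bits |= 1 << i
        return bits

    def hamming_distance(a: int, b: int) -> int:
        """Number of differing bits between two hashes; a small distance
        suggests the underlying images are similar."""
        return bin(a ^ b).count("1")
    ```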



  • Registered Users, Registered Users 2 Posts: 2,294 ✭✭✭YellowFeather


    Thank you seamus and Gregor! Yep, I had often used hashes to compare data, but never thought of, or seen, them being used to identify similar data. I suppose it's because we always need an exact match in my line of work.

    Very interesting! Appreciated.



  • Registered Users, Registered Users 2 Posts: 7,380 ✭✭✭timmyntc


    Hashing functions rarely identify "similar" data; if they do, then the function is retaining too much information about the file in the first place.

    Also, most hashing functions can still generate collisions from totally different data - hash collisions can and do happen, albeit rarely (see the sketch below).

    So what then? I get a hash collision against one of the CSAM hashes they have in a database, and now they have the right to imprison me? To surveil all my communications? Seize my phone?

    A typical court warrant for any of that would fail instantly with that kind of tenuous evidence of a hash match, so why should we let govts legislate for this directly? They're effectively lowering the standard of evidence required to get electronic surveillance in place on anyone.


    A society who would give up freedom for a little security deserves neither.
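
    To make that collision worry concrete: perceptual matching usually comes down to a bit-difference threshold, so unrelated images can occasionally land inside it. A toy sketch in Python (the threshold value is invented purely for illustration):

    ```python
    MATCH_THRESHOLD = 10  # maximum differing bits to count as a "match"; made-up value

    def bit_distance(a: int, b: int) -> int:
        """Count how many bits differ between two perceptual hashes."""
        return bin(a ^ b).count("1")

    def is_flagged(photo_hash: int, database_hashes: list[int]) -> bool:
        """A photo is flagged if it falls within MATCH_THRESHOLD bits of any
        database entry. Across millions of scanned photos, even a tiny
        per-photo false-positive rate will produce some spurious flags, which
        is why a match is a trigger for review rather than proof of anything."""
        return any(bit_distance(photo_hash, h) <= MATCH_THRESHOLD
                   for h in database_hashes)
    ```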



  • Registered Users, Registered Users 2 Posts: 2,294 ✭✭✭YellowFeather


    If there was a hashing collision, the police won't rock up at your door and throw you in jail. There would potentially be an investigation (as I said, I'm not familiar with this similarity-hashing process and would need to research it).

    We all have our own views, but I’d, personally, hand over my phone, laptop, passwords, backup info, email info whatever if it would help to protect one child from a predator.

    To reiterate, nobody would look at all of your data and nor would they care to.



  • Registered Users, Registered Users 2 Posts: 492 ✭✭Dublinandy3


    The question is simply: would you be willing to give up some privacy for the protection of others? That question can be asked in a multitude of ways; in this instance, it's via WhatsApp. Personally, I would be willing, but I know others would never be.

    For issues such as this, or any issue, there will never be 100% agreement which is why politicians are not popular. They have to weigh up what they believe is public opinion and go with what the consensus wants. Sometimes it'll suit a particular individual, sometimes it won't.

    I'm guessing if this was implemented then it wouldn't suit the OP due to their own views on privacy but would suit me, and vice versa.



  • Registered Users, Registered Users 2 Posts: 9,138 ✭✭✭Gregor Samsa


    @timmyntc "A society who would give up freedom for a little security deserves neither."

    Of course this statement makes no sense. Every society gives up freedom for security. We don't allow drivers the freedom to drive on whatever side of the road they want, as we want to drive in the security of not facing random head-on collisions. That's not controversial at all.

    I assume you're attempting to invoke the spirit of Ben Franklin, but you've got the literal words, the spirit and the context of the quote wrong. The actual Ben Franklin quote is "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety".

    Franklin's quote had nothing to do with personal liberty. It was made in a letter sent in relation to attempts by the Pennsylvania General Assembly to tax the landowning Penn family and use that money to pay for defences during the French and Indian War. The Penn family were urging the Governor to veto the tax plan, and instead they'd provide a limited lump sum to be spent on defence. The "essential liberty" Franklin spoke of was the ability of the legislature to collect taxes. The "little temporary safety" was the small and inadequate amount of defence benefit the lump sum would provide.

    Far from Franklin's quote being an absolutist defence of personal liberty, it was actually a defence of a government's ability to raise taxes from individual landowners to provide collective safety for the entire population. And he was saying that people like the Governor didn't deserve either liberty or safety if they were willing to bend to the lobbying of powerful landowners.



  • Registered Users, Registered Users 2 Posts: 206 ✭✭Amenhotep


    If this is true, it's a lot better so, but I do wonder how the heck they could relate hashes of edited photos back to the original - hash functions are one-way, and a tiny, tiny edit gives a COMPLETELY different hash.

    For this I think they have some AI ... and therefore need a stash of imagery for the machine learning ...
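
    That intuition about a tiny edit changing the whole hash does hold for ordinary cryptographic hashes, which is easy to demonstrate in Python (illustrative only - the input strings stand in for image files):

    ```python
    import hashlib

    original = hashlib.sha256(b"holiday photo, version 1").hexdigest()
    edited   = hashlib.sha256(b"holiday photo, version 2").hexdigest()  # one character changed

    print(original)
    print(edited)
    # The two digests share essentially nothing in common - the "avalanche effect".
    # That is why Apple's scheme relied on a perceptual hash (as described in the
    # posts above) rather than a cryptographic one like SHA-256.
    ```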



  • Registered Users, Registered Users 2 Posts: 9,138 ✭✭✭Gregor Samsa




  • Registered Users, Registered Users 2 Posts: 206 ✭✭Amenhotep


    Thanks, interesting reading.

    Again, I've no issues with them doing this on their own iCloud servers, it's when they start scanning local devices that annoys me.

    Amazed at the people here defending it, saying when a child's security is at stake the right to privacy disappears?! Seriously?

    By the same logic you could argue to have cameras on in your homes at all times to "keep the kids safe" ...

    Why should we all be guilty until proven innocent? It should be the opposite.



  • Registered Users, Registered Users 2 Posts: 206 ✭✭Amenhotep


    I also had a photo of my son on a beach flagged by Flickr once. The photo was completely innocent - he was in his shorts, giving the up-yours to the camera... nothing in the background. When I manually unflagged it, I guess a human somewhere checked it and gave it the OK, but there is some sort of AI/machine learning scanning going on - not just the hashing stuff.



  • Registered Users, Registered Users 2 Posts: 5,148 ✭✭✭rom


    As someone in the know about how this all works: the ask here is that a hash of an image (a unique value for every file) is matched against a database, a bit like Google's image matching, which is a partial match on an image. I have no issue with this and I am 1000% pro privacy, but I am informed. Not allowing a file that is 100% child porn to go over WhatsApp is a good thing. Come back to me when someone is actually wronged by this. A file hash is unique. I am not going to take a picture that hashes the same as a child porn image - it's impossible.

    Basically, having an issue with this is the same as having an issue with someone providing 100% proof that they won the lotto. It can't be refuted. If you have sent child porn it is black and white: there is a database of these hashes, and any check against it is against the metadata and not the content. It is not the case that a human needs to verify it and it turns out to be a picture of your wife instead. It is a 100% child porn match, with no risk of failure on the hash.



  • Registered Users, Registered Users 2 Posts: 6,292 ✭✭✭Ubbquittious


    They still need to subvert control of your device somewhat to make it work or else compromise the end-to-end encryption



  • Registered Users, Registered Users 2 Posts: 8,184 ✭✭✭riclad


    This new law would apply to all digital messaging apps - Facebook Messenger, iMessage, Discord, Telegram, WhatsApp. Basically it asks all messaging apps to scan for illegal content. Right now they use encryption, i.e. most apps can't read your messages: they know user 1 sent a message to user 2, but they don't know the text of the message, which might contain links to certain illegal websites. Apps like WhatsApp and Telegram use encryption so only you and the person who receives the message can read it. This is important in countries like Russia and Iran, where human rights protestors, activists and minority groups need to be able to have privacy from government surveillance. Right now, if you use certain apps, you know hackers or some random person can't read your messages.

    Once data is collected it tends to become a target for hackers. Do you want someone reading all the 2FA (two-factor) passwords or PIN codes that Google and your bank send you, so that they can hack your data and gain access to your email and your bank account? Do you think that no one should have a right to privacy just because a tiny proportion of people wish to view illegal content?

    You can see more info on this at www.techdirt.com

    I think this would apply to any messages, including SMS text messages.

    If you look into the detail of this law, it is similar to data retention laws passed in Hong Kong, China and Russia, e.g. the government will have the right to examine all messages sent by EU citizens if they are suspected of illegal activity.

    Right now Apple iMessage, Facebook WhatsApp and Telegram in America can choose to encrypt all messages on both devices if they want to, so only the sender and receiver can read the data sent.

    If the message is looked at, it'll look like a random string of numbers as it's not in plain text form (see the sketch at the end of this post).

    You are simply wrong if you think no one has any privacy online; that depends on what apps they use, how they use them, and whether they use two-factor PIN codes to log in to online accounts.

    This is especially important now, when Texas is bringing in laws that will make abortion illegal, and it could also be made illegal to help or give a woman any information as to how to travel outside Texas to get a medical procedure that is legal in another state.

    So if you were sending messages to someone in Texas you would use an app like Telegram that is encrypted.

    So in the future it will be important that people can have some degree of privacy and can send a message without it being read by another agency.

    Like, for instance, in some American states it's legal to buy or sell cannabis and in others it's not.

    What Apple was doing was scanning images on the iPhone to see if they matched known illegal content.

    If the new law comes into force, what WhatsApp, Facebook etc. would have to do is weaken encryption or turn it off completely so as to be able to scan all messages in case they might contain illegal content.

    One problem is this will make it easier for hackers: if they can get access to your phone they might be able to read all messages and the 2FA codes and PIN codes that Google or your bank sends you, and then gain access to your bank account and read your emails in order, for instance, to hack into your bitcoin account.

    In some EU countries abortion is illegal unless the woman's life is in danger, and you could maybe be arrested for helping a woman to travel to another country to obtain an abortion.

    So if you think that no EU citizen deserves privacy when sending a message to another person, I don't agree with you.

    I think police in some parts of America have already said they will be collecting digital data in order to stop women getting abortions if Roe v Wade is struck down and abortion is made illegal.
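
    For what "only the sender and receiver can read it" means concretely, here is a minimal sketch using the PyNaCl library's public-key box (my choice of library for the sketch). It is illustrative only; WhatsApp's actual Signal protocol adds key ratcheting, forward secrecy and much more on top of this basic idea.

    ```python
    from nacl.public import PrivateKey, Box  # pip install pynacl

    # Each party generates a keypair; only the public halves are ever shared.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts to Bob: anyone intercepting this sees only random-looking bytes.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at 8")
    print(ciphertext.hex())

    # Only Bob, holding his private key, can recover the plaintext.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
    print(plaintext)  # b'meet at 8'
    ```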



  • Registered Users, Registered Users 2 Posts: 8,184 ✭✭✭riclad


    Many minority groups - e.g. LGBT rights groups, religious minorities, human rights activists, lawyers, environmental protestors - use Telegram and WhatsApp in countries like Iran and Russia to communicate and avoid surveillance. If this law comes into force it'll basically make end-to-end encryption illegal in Europe, which would be a disaster for the right to privacy and human rights.

    I understand sometimes criminals use motorbikes or cars to commit crime; I think it would be ridiculous to make it illegal, for instance, for any person under the age of 30 to drive a car or use a motorbike.


