
Apple to scan your iPhone for child porn


Comments

  • Registered Users, Registered Users 2 Posts: 7,516 ✭✭✭Outkast_IRE


    The brand that has superior privacy and data protection as one of its main selling points is going to start scanning your phone for illegal material?

    Whilst the theory behind it is fine, it's the precedent it sets that is disturbing.



  • Registered Users, Registered Users 2 Posts: 2,081 ✭✭✭GetWithIt


    And how exactly will they be training that model?



  • Registered Users, Registered Users 2 Posts: 11,424 ✭✭✭✭EmmetSpiceland


    I mean, whatever about scanning for any other, potential, illegal “activity”, would anyone really have a problem with this one?

    “It is not blood that makes you Irish but a willingness to be part of the Irish nation” - Thomas Davis



  • Registered Users, Registered Users 2 Posts: 13,119 ✭✭✭✭Flinty997


    My wife asked why I speak so softly in the house. I said in case Mark Zuckerberg was listening.

    She laughed...I laughed...

    Alexa laughed ...

    Siri laughed....

    Facebook said it wasn't funny..



  • Registered Users, Registered Users 2 Posts: 2,588 ✭✭✭ahnowbrowncow


    Why won't someone think of the children? This is the definition of a slippery slope: violate the privacy of all to catch a few.

    Surely there are better ways to catch them. Here's a wild idea: maybe we could start by giving those that are caught some prison time rather than the slaps on the wrist they currently get.

    And you'd want to be very naive to think that governments and intelligence agencies won't use this technology for more than just the stated reason.

    What's to stop China from implementing this on all its citizens' phones to catch people with images of Tiananmen Square?



  • Registered Users, Registered Users 2 Posts: 461 ✭✭HerrKapitan


    This could be a Trojan horse, a foot in the door. It's something that could be introduced with good intentions.

    Then it will be brought in to monitor spending habits, health status, etc. Yes, forms of that exist already, but this might not be voluntary.



  • Registered Users, Registered Users 2 Posts: 3,637 ✭✭✭thomil


    I read through some of the documentation on a more reliable website than the one linked in the OP. Basically, this new initiative takes a two-pronged approach, with both prongs based around iCloud.


    Part 1 revolves around ensuring that children aren't exposed to explicit imagery via Messages. This utilises the Family Sharing option of iCloud. If someone who is registered as a child in Family Sharing gets sent an explicit image, it will be analyzed by the OS and blurred. The user will then receive a pop-up telling them why it has been blurred. If the child proceeds to open the image anyway, a notification will be sent to the parents, something the child will also be notified about. The analysis is done on-device using the machine learning abilities built into iOS and macOS.

    Part 2 will scan images stored in iCloud Photos. To do this, they will compare the hash values of the photos in the iCloud Photo library with hashes of known child abuse imagery, so the comparison won't use the images as such. If a matching image hash is detected, a so-called "Safety Voucher" is created and sent to Apple. This "voucher" is encrypted, and Apple will need a certain number of these vouchers before it can decrypt the content and alert the authorities if required. If the number of vouchers remains below this threshold, no action will be taken. It seems to me as if this measure is designed to reduce false positives.

    The comparison is done on-device but will only look at images that are either already uploaded to iCloud Photos or are about to be uploaded. Other parts of the operating system are not touched.
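    To make the Part 2 flow concrete, here's a rough Python sketch of the match-and-threshold logic. Everything in it is illustrative: the real system uses NeuralHash plus private set intersection and threshold secret sharing rather than plain SHA-256 lookups, and the threshold value below is an assumption, not a figure from the documentation.

```python
# Rough sketch of the on-device match + server-side threshold described
# above. Hash scheme, database contents and threshold are all stand-ins.
import hashlib

# Stand-in for the known-CSAM hash database shipped to the device.
KNOWN_HASHES = {hashlib.sha256(b"known-image-%d" % i).hexdigest()
                for i in range(3)}

VOUCHER_THRESHOLD = 30  # assumption; the docs don't publish the exact number


def scan_before_upload(image_bytes: bytes, vouchers: list) -> None:
    """On-device: runs only on photos headed for iCloud Photos."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        # In the real design this is an *encrypted* safety voucher that
        # Apple cannot read until the threshold below is reached.
        vouchers.append(digest)


def apple_can_review(vouchers: list) -> bool:
    """Server side: below the threshold, no action is taken."""
    return len(vouchers) >= VOUCHER_THRESHOLD
```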

    Here's a link to the technical documentation Apple has released:

    CSAM Detection - Technical Summary (apple.com)

    Technical_Assessment_of_CSAM_Detection_Benny_Pinkas.pdf (apple.com)

    Good luck trying to figure me out. I haven't managed that myself yet!



  • Registered Users, Registered Users 2 Posts: 237 ✭✭RulesOfNature


    What about parents with bath pictures of their own children? I know a lot of families that have photos of their own kids.

    More regulation is never good. Just watch when the Gardaí come along to an innocent household: take their children away, freeze their assets, invade their privacy, detain the children, demolish their reputation (because who wants to be friends with a pedo?), get them fired from their jobs, and access all their computers to look for 'child porn' that the AI told them exists there. It's for your own safety, of course.


    And the best part is, they claim they're gonna have a live team review the photos pinged by the AI. Which means that an unknown group of strangers (probably from India) is going to be looking at naked photos of your own children.


    Again, this is just for your own safety.



  • Registered Users, Registered Users 2 Posts: 967 ✭✭✭Burt Renaults


    Parents with bath pictures of their own children should be jailed. Especially if they wheel them out at their 21st birthday.



  • Registered Users, Registered Users 2 Posts: 8,184 ✭✭✭riclad



    Some people may have different views of what content is legal or appropriate to photograph, depending on what culture or country they live in.

    The problem is Apple says user privacy is important, but this sets a very bad precedent. E.g. how long till the government says every tech company must scan for pirated TV shows, music, illegal content, or political content?

    Many countries have laws against promoting LGBT content or free-speech viewpoints.

    Once you put a backdoor in online phone storage, governments like Russia, Iran and Hungary will be demanding access to it to look for photos of political protests, or free-speech activists, or simply journalism that is critical of the government.

    We have seen with the NSO malware that many governments will target the phones of journalists, political activists and politicians with software that has access to all the data on the phone.

    Many messaging apps encrypt all data so that only the user has the password, so all messages are private and can't be accessed or read by anyone apart from the phone owner.

    And of course Google may be under pressure to do the same thing, which may mean innocent people who use Android will have less security and privacy, especially in countries like Hungary where there's little respect for human rights and political free speech.



  • Registered Users, Registered Users 2 Posts: 3,218 ✭✭✭patnor1011


    And of course you have no idea what you're talking about when you say that there is little respect for human rights and free political speech in Hungary. Have you ever been to Hungary?



  • Registered Users, Registered Users 2 Posts: 150 ✭✭Gary Scrod


    The regular posters of the 'sh1t thread' have something to fear. I'd strongly suspect they share photos of themselves wallowing in their latest faecal deposits. Sick chuntz.



  • Registered Users, Registered Users 2 Posts: 2,807 ✭✭✭ShatterAlan



    Yes. How about you give the authorities tacit permission to enter your house at any time of the day or night, unannounced, under the pretext of searching for child porn?



  • Posts: 3,689 ✭✭✭ [Deleted User]


    In that case, I'm sure Apple will cover the iPod Touch on this operation.



  • Registered Users, Registered Users 2 Posts: 13,041 ✭✭✭✭TheValeyard


    Dangerous technology that is not really going to be used in the West. The idea of protecting children by identifying pedos is just a demonstration of the technology for potential buyers in China, Russia and the Middle East, for more nefarious purposes.

    All eyes on Kursk. Slava Ukraini.



  • Registered Users, Registered Users 2 Posts: 1,543 ✭✭✭KildareP


    Be under no illusion, Android scans all of your photos as a matter of routine already - that's how it can automatically group them by person using facial recognition, by location, etc.

    And while the likes of WhatsApp encrypts all of your message exchanges end-to-end, if any party to a conversation has turned on either WhatsApp backup or the option to automatically add all media to the camera roll, then that content is no longer end-to-end encrypted and can be read outside of WhatsApp, including by this automated scanning that Apple propose.


    On Android, if you use Photos, Drive or Docs, or back your device up automatically to your Google account, then their T&Cs effectively already cover them for what Apple are now proposing to do on the iCloud side of things:

    Google Terms of Service – Privacy & Terms – Google

    Purpose

    This license is for the limited purpose of:

    • operating and improving the services, which means allowing the services to work as designed and creating new features and functionalities. This includes using automated systems and algorithms to analyze your content:
      • for spam, malware, and illegal content
      • to recognize patterns in data, such as determining when to suggest a new album in Google Photos to keep related photos together
      • to customize our services for you, such as providing recommendations and personalized search results, content, and ads (which you can change or turn off in Ads Settings)
    • This analysis occurs as the content is sent, received, and when it is stored.

    And that doesn't even take account what your device manufacturer - Samsung, Huawei, OnePlus, etc. - might be doing on top of that using their own platforms and services.


    Now, where Apple appear (at least at first glance) to be going one step further is by also scanning what's locally on the device itself against a known database of abuse images, even if that content never goes onto one of their cloud platforms.

    I say at first glance, because iOS carries out a lot of processing tasks on the device which other platforms (like Android) offload to the cloud. Once any content that needs to be processed hits Google's cloud, even if just for processing and not for permanent storage, then the above T&Cs come into play.



  • Registered Users, Registered Users 2 Posts: 15,037 ✭✭✭✭Kintarō Hattori


    In regards to the first part of your reply? Why? Plenty of parents such as myself would have a few bath pictures of their kids.



  • Registered Users, Registered Users 2 Posts: 40,637 ✭✭✭✭ohnonotgmail


    Now, where Apple appear (at least at first glance) to be going one step further is by also scanning what's locally on the device itself against a known database of abuse images, even if that content never goes onto one of their cloud platforms.

    The summary of what Apple are doing, posted by somebody else, seems to say that they are only scanning data in iCloud.



  • Posts: 1,263 ✭✭✭ [Deleted User]


    It's a tough one. 100% they should scan for known child porn images (at the very least) so that law enforcement can follow up and do what's necessary. But I think this should be done covertly. No need for the world to know. Just do it... and support laws that enable them to do this.

    At the same time, this type of access will be used for nefarious purposes, political suppression, monitoring of dissidents, etc. What seems unfair to me is that some corporations and governments want us to give up all pretense that we have privacy, but they don't give us transparency in return. How is that data used? Who has access to it? Etc., etc. It's not a fair transaction.

    Plus, now that this is known about iPhones, I assume pedos will go more underground and start using some other tech. Another reason to just do it and shut up about it until it's done.

    All that said, it is frustrating to know that there are child predators out there, and that we have the technology and human resources to scan devices to find them and apparently, this is not being done or is being done in a way that is not legally actionable.



  • Registered Users, Registered Users 2 Posts: 1,543 ✭✭✭KildareP


    There will be scanning happening in Messages, too. Anyway, it seems that these changes are going to be for the US only (at least for now).

    Child Safety - Apple

    All of the mentions of the features are asterisked and the footnote at the bottom of the page states "Features available in the U.S."



  • Registered Users, Registered Users 2 Posts: 12,046 ✭✭✭✭L'prof


    Hospital weighing scales photos here too. Although I can claim that I was coerced into taking those by the nurse in the delivery room



  • Registered Users, Registered Users 2 Posts: 35,587 ✭✭✭✭o1s1n
    Master of the Universe


    Have to wonder if this algorithm will incorrectly pick up on classical art as being child porn? E.g. a religious old lady with a classical painting of baby Jesus as her wallpaper (who in many such paintings happens to be naked).

    Will also be interesting to see if it flags drawings; if so, there are a lot of people in the US who are into some questionable Japanese content...



  • Registered Users, Registered Users 2 Posts: 6,291 ✭✭✭Ubbquittious


    That is even more Big Brother. Are you out of your fecking mind?



  • Registered Users, Registered Users 2 Posts: 3,249 ✭✭✭TomSweeney


    Really good point!!

    Also, what about pics of your own kids on the beach? Surely this model will raise lots of false flags.

    Don't like the idea of people getting their doors kicked down by mistake.



  • Registered Users, Registered Users 2 Posts: 6,291 ✭✭✭Ubbquittious


    I think I will go back to a Sailfish OS phone. I don't want the government/mega corps looking through my files.



  • Posts: 1,263 ✭✭✭ [Deleted User]


    Probably. But when I evaluate the competing claims of 'Big Brother' and 'child predators that need to be removed from society', I come down on the side of the latter. The difference between China's in-your-face monitoring of its citizenry and the Western approach is simply that China is overt about it and we are covert about it. Covert operations need to be brought into the light. Privacy is dead.

    Also, how is it possible to run covert stings on child predators without some degree of snooping and Big Brother-type shenanigans? Those people need to be caught... are we all supposed to throw our hands up in the air and go 'oh well, privacy, yadda yadda, big brother'?



  • Registered Users, Registered Users 2 Posts: 3,249 ✭✭✭TomSweeney


    Also, presumably if iOS starts this, Windows will too...

    So will this just be cloud images, or will it scan the local C drive?

    Terrifying concept really, since I have so many Blu-ray rips there (some legally my own, others not exactly).

    Where does it stop?

    Using Linux as a next step perhaps...



  • Registered Users, Registered Users 2 Posts: 6,412 ✭✭✭Jequ0n


    Admittedly an excellent tactic to sell the concept. Everyone who is opposed to the idea will be branded as someone who supports or endorses child porn.



  • Registered Users, Registered Users 2 Posts: 68,317 ✭✭✭✭seamus


    Both of these.

    It will be trained on real user data, scanning people's phones. "Potential" suspect images will be uploaded and shown to a real human being for them to decide if each is a good or a bad image. While the system is in training, this will include porn your mates sent on WhatsApp, pictures of your kids in the bath and at the beach, and, not forgetting, intimate pictures that you and your partner(s) send each other.

    The privacy breach in this is enormous and immediate.

    But ultimately, as said, this is not mainly being trialled for use in the West. There is some limited espionage/spying usefulness in this for security agencies, but the main draw here is for authoritarian regimes.

    Apple are proving to the likes of China and Russia that they can force-install software on users' phones which can then scan the data on the phone searching for banned materials, and searching for photographs of persons of interest, government dissidents especially. This could easily extend to having the front camera take photos at random and scan the resulting picture for a known face before discarding the image.



  • Registered Users, Registered Users 2 Posts: 3,249 ✭✭✭TomSweeney


    Exactly. I was chatting to a friend about this some weeks back because, interestingly, I was convinced that Apple are already doing this with hashes of MP3 files that are "known torrents" - long story, must post another question on that.

    Anyway, we got to the inevitable "one day Windows/Apple etc. will scan local devices for CP", and he didn't have a problem with it. He said, again, that if you are against it, it could only be because you have something to hide.

    The only thing that kind of convinced him was the scenario of cops knocking on his door at random, asking to search his house for illegal materials. Surely he would be all for that, and if he wasn't it must mean he has something to hide??? Right????


    Wrong!



  • Registered Users, Registered Users 2 Posts: 3,249 ✭✭✭TomSweeney


    Like my post above, how would you feel about police knocking on your door to search your house for CSAM?

    Surely you have nothing to hide, so let them in, right?



  • Registered Users, Registered Users 2 Posts: 26,197 ✭✭✭✭Strumms


    Nobody has a problem with people using covert tactics to uncover child predators and child porn users, but the people, and the only people, who should do this work are the Gardaí... or relevant law enforcement. Not big business...

    How would you feel if you woke up and found unexpected people, not known to you, in your front garden looking at and taking photos of your tax/insurance discs, or a drone doing the same, and on challenging those people they identified themselves as Apple employees who were invading your privacy... because they have a cause to do with road safety or whatever?



  • Registered Users, Registered Users 2 Posts: 11,424 ✭✭✭✭EmmetSpiceland


    That’s not really the same though, T. What you’re equating is closer to a cop, physically, taking your phone and going through it.

    This is more like a helicopter flying over your house with a heat sensor looking for a “grow house”. Would you have a problem with the police sweeping your house from a height?

    No one wants their phone taken off them, or having police in their house, because that is an inconvenience. Having your phone “scanned” for images, or videos, of child abuse without your knowledge is something most, normal, people wouldn’t have an issue with, as it doesn’t interfere with their day.

    “It is not blood that makes you Irish but a willingness to be part of the Irish nation” - Thomas Davis



  • Registered Users, Registered Users 2 Posts: 3,249 ✭✭✭TomSweeney


    He/him/his

    “When you're used to privilege, equality feels like oppression”.

    #bekind


    Why am I not surprised, given this signature, that you are for this?

    It's authoritarian whatever way you look at it... and the issue here is false positives, like mentioned before: photos of my own children on the beach, in the pool, wherever... and no, I wouldn't want the police scanning my house looking for a "grow house". They could think my terrariums are growing spots and, again, kick down my doors and cause an "inconvenience"...



  • Registered Users, Registered Users 2 Posts: 9,373 ✭✭✭FourFourRED


    What about reading up about what this actually means in practice before jumping to conclusions?

    "Apple does not learn anything about images that do not match the known CSAM database."

    https://www.theverge.com/2021/8/5/22611721/apple-csam-child-abuse-scanning-hash-system-ncmec



  • Registered Users, Registered Users 2 Posts: 11,424 ✭✭✭✭EmmetSpiceland


    I don’t see what my “sig” has to do with this. I’m fairly sure the police would be able to differentiate between a terrarium and a, large scale, “grow house”.

    I mean, it really just comes down to the fact that if you have nothing to hide, you have nothing to worry about. You can move to Android products to get around it, but how long do you think it will be until this becomes “standard” across all platforms?

    Hopefully, these measures will become a massive hurdle for child abuse and, eventually, human trafficking.

    “It is not blood that makes you Irish but a willingness to be part of the Irish nation” - Thomas Davis



  • Posts: 3,801 ✭✭✭ [Deleted User]


    Facebook and Google have been doing this for years, apparently.




  • Registered Users, Registered Users 2 Posts: 4,633 ✭✭✭FishOnABike


    "First they came for the socialists, and I did not speak out—because I was not a socialist.

    Then they came for the trade unionists, and I did not speak out— because I was not a trade unionist.

    Then they came for the Jews, and I did not speak out—because I was not a Jew.

    Then they came for me—and there was no one left to speak for me."

    Martin Niemöller



  • Posts: 3,801 ✭✭✭ [Deleted User]


    No. All uploads. The local drives are Apple's attempt at privacy, apparently.



  • Posts: 3,801 ✭✭✭ [Deleted User]


    False positives aren’t an issue. This is hash matching. Can’t happen. They are looking for exact replicas. Well, that’s what they say.



  • Registered Users, Registered Users 2 Posts: 40 rrudden


    They aren't looking at the actual contents of the image but at the hash, or digital fingerprint (which is just a series of numbers). Every image has a unique hash. The hashes are compared to a database of known hashes of child porn. Now, they say they can identify images which have been slightly altered, which implies that the hash does not need to be an exact match to flag up. A threshold number of flags needs to be reached before the authorities become involved. A small number of flags could indeed just be images that generate similar hashes, but if hundreds of hashes are matching the child porn database then that's an issue.
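    For what it's worth, that "slightly altered" part usually means a perceptual hash compared by Hamming distance rather than an exact match. Here's a toy Python version of that flag-counting logic; the hash values, distance bound and threshold are all made up for illustration.

```python
# Toy near-match flagging: compare 64-bit perceptual hashes by Hamming
# distance, then count flags against a threshold before anyone acts.

KNOWN_BAD_HASHES = {0xF0F0AABB12345678, 0x0123456789ABCDEF}  # made up
MAX_BIT_DISTANCE = 5   # tolerate small alterations (crop, re-encode)
FLAG_THRESHOLD = 30    # flags needed before it's treated as an issue


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")


def count_flags(library_hashes):
    """How many photos in a library sit 'near' a known-bad hash."""
    return sum(
        1 for h in library_hashes
        if any(hamming(h, bad) <= MAX_BIT_DISTANCE for bad in KNOWN_BAD_HASHES)
    )


# One bit away from a known hash: a single flag, well below the threshold.
print(count_flags([0xF0F0AABB12345679]) >= FLAG_THRESHOLD)  # False
```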



  • Registered Users, Registered Users 2 Posts: 3,660 ✭✭✭pah


    Most likely they will also be using PhotoDNA hashing, which can match images that have been slightly altered by cropping or by changing some detail in the images.


    https://en.wikipedia.org/wiki/PhotoDNA
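    The actual PhotoDNA algorithm isn't public, but the general idea behind perceptual hashes is simple enough to show. Here's a toy "average hash" in the same spirit (needs Pillow; this is not PhotoDNA itself): minor edits flip only a few of the 64 bits, so near-duplicates stay within a small Hamming distance of each other.

```python
# Toy "average hash": fingerprint an 8x8 grayscale thumbnail so that
# re-encoding or small edits change only a few bits. Illustrative only;
# Microsoft has not published the real PhotoDNA algorithm.

from PIL import Image  # pip install pillow


def average_hash(path: str, size: int = 8) -> int:
    """64-bit fingerprint: each bit records whether a pixel in the
    grayscale thumbnail is brighter than the thumbnail's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits
```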



  • Registered Users, Registered Users 2 Posts: 3,249 ✭✭✭TomSweeney


    Hmmm, well this certainly changes things on the false-flag front. I still don't like it though; it sets a dangerous precedent.




  • Registered Users, Registered Users 2 Posts: 2,807 ✭✭✭ShatterAlan



    Why should the entire population have their privacy violated because the powers that be want to try and catch purveyors and consumers of child pornography? They want to snoop on people, but they can't just come right out and say it, so they make you an offer you can't refuse, i.e. it's to find child porn. Double-edged sword. Give up your right to privacy OR be labelled a kiddie-porn sympathizer/apologist/consumer.


    If it was announced that anyone's house could and would be entered without notice at any time, day or night, to search for drugs, would you be OK with that? Would you fcuk!


    If you suspect my phone or other electronic devices to contain child pornography or my house to contain drugs and/or other contraband then follow the protocol. Get a search warrant. Otherwise, p1ss off.




  • Registered Users, Registered Users 2 Posts: 4,957 ✭✭✭kirk.


    Nothing worse than child porn

    Steve Jobs should go for it



  • Registered Users, Registered Users 2 Posts: 2,807 ✭✭✭ShatterAlan


    The government have the tools they need to catch those engaged in child pornography. They don't need, nor should they be given, blanket surveillance. They need a warrant to wiretap a conversation. They should also require a warrant to search property.



  • Registered Users, Registered Users 2 Posts: 967 ✭✭✭Burt Renaults


    To preemptively stop them from doing what I referred to in the second part of my reply.


