
Google hashing


Comments

  • Registered Users Posts: 5,112 ✭✭✭Blowfish


    It seems a little pointless, as it's really only going to prevent the 'low level' stuff. It may stop the odd 'consumer' from viewing material, but at the 'producer' level they'll be technically savvy enough to know that, for a start, hashing is ridiculously easy to get around, and secondly, sharing anything via Google is stupid if you want to keep it completely private.
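    To make the 'easy to get around' point concrete, here's a rough sketch (the bytes below are made-up stand-ins for a real image file, not anything Google actually does): an exact-match hash such as SHA-256 changes completely when even a single bit of the file changes, so re-saving or trivially editing a file is enough to dodge it.

    import hashlib

    original = b"\xff\xd8\xff\xe0" + b"made-up image bytes"    # stand-in for a real JPEG
    tweaked = bytearray(original)
    tweaked[10] ^= 0x01                                        # flip a single bit

    print(hashlib.sha256(original).hexdigest())
    print(hashlib.sha256(bytes(tweaked)).hexdigest())          # a completely different digest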


  • Registered Users Posts: 3,131 ✭✭✭Dermot Illogical


    Blowfish wrote: »
    It seems a little pointless, as it's really only going to prevent the 'low level' stuff. It may stop the odd 'consumer' from viewing material, but at the 'producer' level they'll be technically savvy enough to know that, for a start, hashing is ridiculously easy to get around, and secondly, sharing anything via Google is stupid if you want to keep it completely private.

    It's definitely only going to catch the technically illiterate, but I suspect that's most of the customer base that accounts for law enforcement requests. On that basis it possibly makes business sense.

    If they go beyond hashing it might get interesting, but there are greater privacy issues with that. Some of the current crop of forensic tools are combining hashing with image analysis so they can check the content of unknown images for similarity to previously hashed ones. Altering the file doesn't defeat that method and it's not a great leap for a service provider to make once they've already started down that road.
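    To illustrate that, here's a rough sketch of the idea behind similarity ('perceptual') hashing, assuming a toy 3x3 grayscale grid rather than a real image, and a made-up average-hash rather than any vendor's actual algorithm: small alterations barely move the fingerprint, so the match survives.

    def average_hash(pixels):
        """Bit string: 1 where a pixel is brighter than the image's mean brightness."""
        mean = sum(pixels) / len(pixels)
        return "".join("1" if p > mean else "0" for p in pixels)

    def hamming(a, b):
        """Number of differing bits between two equal-length fingerprints."""
        return sum(x != y for x, y in zip(a, b))

    known = [10, 200, 30, 180, 20, 210, 40, 190, 15]      # grid from a known, previously hashed image
    altered = [12, 198, 33, 182, 22, 207, 41, 188, 17]    # same image, slightly re-encoded

    print(hamming(average_hash(known), average_hash(altered)))   # 0: the small edits don't break the match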

    I also wonder at what stage they will move on to intellectual property. It's inevitable someone will push for it.


  • Closed Accounts Posts: 1,260 ✭✭✭Rucking_Fetard


    Google defends child porn tip-offs to police
    Google defended its policy of electronically monitoring its users' content for child sexual abuse after it tipped off police in Texas to a child pornography suspect.

    Houston restaurant worker John Henry Skillern, 41, was arrested Thursday following a cyber-tip that Google had passed along via the National Center for Missing and Exploited Children (NCMEC), based outside Washington.

    "He was trying to get around getting caught, he was trying to keep it inside his email," said detective David Nettles of the Houston Metro Internet Crimes Against Children Taskforce.

    "I can't see that information, I can't see that photo -- but Google can," he told Houston television station KHOU, which first reported the story.

    It's common knowledge that the world's leading Internet service, like its rivals, tracks users' online behavior in order to fine-tune its advertising services.

    But the Texas case prompted concerns about the degree to which Google might be giving information about its users' conduct to law enforcement agencies.

    "The story seems like a simple one with a happy outcome -- a bad man did a crime and got caught," blogged John Hawes, chief of operations at Virus Bulletin, a cyber security consultancy.

    "However, there will of course be some who see it as yet another sign of how the twin Big Brothers of state agencies and corporate behemoths have nothing better to do than delve into the private lives of all and sundry, looking for dirt," he said.

    In an email to AFP, a Google spokesperson said Monday: "Sadly, all Internet companies have to deal with child sexual abuse.

    "It’s why Google actively removes illegal imagery from our services -- including search and Gmail -- and immediately reports abuse to the NCMEC."

    The NCMEC operates the CyberTipline, through which Internet service providers can relay information about suspect online child sexual abuse on to police departments.

    "Each child sexual abuse image is given a unique digital fingerprint which enables our systems to identify those pictures, including in Gmail," added the spokesperson, who did not disclose technical details about the process.

    "It is important to remember that we only use this technology to identify child sexual abuse imagery -- not other email content that could be associated with criminal activity (for example using email to plot a burglary).”

    In a separate email to AFP, the NCMEC said federal law requires Internet service providers to report suspected child porn to the CyberTipline.

    "NCMEC makes all CyberTipline reports available to appropriate law-enforcement agencies for review and possible investigation," it said.

    On its website Monday, KHOU described Skillern as a registered sex offender, convicted 20 years ago of sexually assaulting an eight-year-old boy.

    Investigators who raided his home allegedly found child porn on his phone and tablet device, as well as cellphone videos of children visiting the Denny's family restaurant where he worked as a cook.

    Skillern has been charged with one count of possession of child pornography and one count of promotion of child pornography. He remains in custody on a $200,000 bond, KHOU said.

    Google's online set of "program policies" for its Gmail service, with more than 400 million users worldwide, includes "a zero-tolerance policy against child sexual abuse imagery."

    "If we become aware of such content, we will report it to the appropriate authorities and may take disciplinary action, including termination, against the Google accounts of those involved," it states.

    Last year, Google's chief legal officer David Drummond, writing in Britain's Daily Telegraph newspaper, acknowledged Google had created technology to "trawl" for known images of child sex abuse.

    "We can then quickly remove them and report their existence to the authorities," he said.


  • Closed Accounts Posts: 1,260 ✭✭✭Rucking_Fetard


    Microsoft now.
    The PhotoDNA scanning software creates a unique signature for each image using data about the pixels of the image. Those signatures can then be tracked and matched, allowing Microsoft as well as Google, Facebook, Twitter and others to detect flagged photos.

    No employees of any of the technology companies have to look at the images; the systems rely solely on the “DNA” of the image to compare matches.

    The PhotoDNA system is also used to prevent child abuse images appearing in search results.
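    As a hedged sketch of the matching step only (PhotoDNA's actual signature algorithm is proprietary, so plain SHA-256 digests stand in for real signatures here): compute a signature for each upload and check it against a set of signatures for previously flagged images.

    import hashlib

    def signature(image_bytes: bytes) -> str:
        # Stand-in for a PhotoDNA-style signature; real systems use robust perceptual features.
        return hashlib.sha256(image_bytes).hexdigest()

    flagged = {signature(b"known flagged image bytes")}    # hypothetical blocklist of known signatures

    def check_upload(image_bytes: bytes) -> bool:
        # True if the upload matches a previously flagged image.
        return signature(image_bytes) in flagged

    print(check_upload(b"known flagged image bytes"))      # True  -> reported (e.g. to NCMEC)
    print(check_upload(b"an ordinary holiday photo"))      # False -> ignored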


  • Moderators, Technology & Internet Moderators Posts: 10,339 Mod ✭✭✭✭LoLth


    Good. More tools like this, plus proper, responsible monitoring, is exactly what we need. Perfect secrecy = ineffectual law enforcement; indiscriminate monitoring = complete lack of privacy. We need something in the middle ground that can be legislated for. Automated "speed camera" style monitoring that ignores anything it doesn't need to pay attention to is a logical compromise.

