
With Big Data Comes Big Responsibility

  • 09-07-2014 8:19pm
    #1
    Closed Accounts Posts: 1,260 ✭✭✭Rucking_Fetard


    Good read about all the companies that know so much about us and what they're gonna do with all that data we give them without a second thought.

    http://om.co/2014/07/08/with-big-data-comes-big-responsibility/
    “You should presume that someday, we will be able to make machines that can reason, think and do things better than we can,” Google co-founder Sergey Brin said in a conversation with Khosla Ventures founder Vinod Khosla. To someone as smart as Brin, that comment is as normal as sipping on his super-green juice, but to someone who is not from this landmass we call Silicon Valley, or part of the tech-set, that comment is about the futility of their future.

    And more often than not, the reality of Silicon Valley giants, who are really the gatekeepers of the future, is increasingly in conflict with the reality of the real world. What heightens that conflict: the opaque and often tone-deaf responses from companies big and small.


    Silicon Valley (both the idea and the landmass) means that we always try to live in the future. We imagine what the future looks like and then we try to build it. Sometimes that future delights us and we embrace it wholeheartedly, as with iPhones and Android-based smartphones. And sometimes that future seems so dystopian that society is scared and unnerved by the unknown.

    That Uncanny Feeling


    Facebook’s emotional experiments are an example of that future. Sara Watson, a fellow at the Berkman Center for Internet and Society, in an essay about data and advertising, brought up the 1970s concept of the Uncanny Valley, aka “the unsettling feeling some technology gives us.” Watson continues in her essay: “Technologies that are simultaneously familiar and alien evoke a sense of dread. In the same way, when our data doesn’t match our understanding of ourselves, the uncanny emerges.”


    That uncanny feeling is what we are confronted with in Facebook’s emotional manipulation through algorithms. It is not necessarily because of the experiment, but because of what the experiment portends. It is the future where machines manipulate our wants and our desires and preempt our needs and emotions. We are scared because we will lose the illusion that we are making the decisions that run our lives. There is no coming back once we cross the threshold.


    Facebook’s emotion-driven-engagement experiments are a tiny glimpse of what really awaits us: a data-driven and algorithmic future, where machines make decisions on our behalf, nudging us into making decisions. As I pointed out in my recent FastCompany magazine column, the new machine age is already underway, unseen by us. “It is not really just a human world,” said Sean Gourley, cofounder and CTO of Quid, who points out that our connected world is producing so much data that it is beyond human cognitive abilities, and machines are going to be part of making sense of it all.

    So the real question is: what will we do, and what should we, the technology industry and we the people, do? From my perspective, we need to start with the raw material of this algorithmic future: data. Whether it is the billions of photos that carry a payload of emotions, relationships and location data, or status updates announcing the arrival of a new one, or those searches for discount Prada shoes, or a look-up about a medical condition, there is someone somewhere vacuuming up our data droppings and turning them into fodder for their money machine.

    For Sale, Our Data


    Forbes tells us that even seemingly benign apps like Google-owned Waze, Moovit or Strava are selling our activity and behavior data to someone somewhere. Sure, they aren’t selling any specific person’s information, but who is to say that they won’t do it in the future, or won’t use the data collected differently? I am actually amazed that cities are willing to trade data that impacts their citizenry, such as photos from traffic cameras, to a privately owned company (in this case, Google) without so much as a debate. I am sure a new parking lot gets more attention from the legislators.


    Further down in the story, a Waze spokesperson remarked that the company can tell what speeds you drove from a “point a” to a “point b.” What if they sell that data to an insurance company, which then uses that information to raise insurance premiums?
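
    For a sense of how little data that takes, here is a minimal sketch of the computation, assuming nothing more than two timestamped GPS fixes; the coordinates and timings are hypothetical, not anything Waze has published:

        from math import radians, sin, cos, asin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            # Great-circle distance in km between two lat/lon points
            lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
            a = sin((lat2 - lat1) / 2) ** 2 \
                + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
            return 2 * 6371 * asin(sqrt(a))

        # Two hypothetical fixes: (lat, lon, seconds since start of trip)
        point_a = (53.3498, -6.2603, 0)
        point_b = (53.4264, -6.2499, 600)  # ten minutes later

        dist_km = haversine_km(point_a[0], point_a[1], point_b[0], point_b[1])
        hours = (point_b[2] - point_a[2]) / 3600
        print(f"average speed: {dist_km / hours:.0f} km/h")

    An insurer buying that stream would not need anything fancier to flag habitual speeding.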


    Did you know, at the time of signing up for Strava, that the lovable cycling and running activity tracker shares real-time user data and sells it to municipalities for 80 cents a year? In what universe does it make sense for the company to do that without asking, and to have a company spokesperson blatantly admit to a Forbes reporter that the default is opt-in, a malaise popularized by Facebook? Because not doing so would mean actually explaining to people what they intend to do with all that personal information.


    And to be honest, that is the crux of the problem: we, the citizens, don’t really know what these data-hoarding companies, big and small, are really going to do with all the data they have about us in their databases. How does a big company like Google use the data that resides in its various different databases (Nest, DropCam, Waze, Android, Google Maps, Google Mail and Google Search) in tandem?


    A few weeks ago, when reading The New York Times interview with Google co-founder and CEO Larry Page, I kept hoping that the interviewer would really dig deeper into Google’s stance on privacy, data gathering and what it plans to do with all the information it is gathering about us. The what and why of Google’s grand vision for the data it collects is an important issue, and it would be nice to know what Google intends to really do with it.
    The reality is that, with all this information out there about me, we should have a talk about things such as our rights as citizens over that data. I am not saying let’s all go back to the villages and caves, but instead, why not have a conversation (one that is not hysterical, but also not dismissive) about these issues around data, expectations of privacy and transparency?
    When Facebook released its Home app, I was unsettled by it, mostly because it took away any notion of privacy. That post just might have been about Google or Amazon or Apple as well. Data from GPS sensors is enough to quickly deduce your home location, work location and sleep patterns. Add data from the Waze app or Google Maps, and Google can figure out what route you take. From the data from accelerometers and gyroscopes, a company can deduce some physical ailments. New sensor processors can add even more human-like abilities to our phones.
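
    To make the GPS point concrete, here is a toy sketch of how home and work fall out of raw location fixes: bucket the fixes by hour of day and take the centroid of the night-time and office-hour clusters. All data and thresholds here are hypothetical:

        from statistics import mean

        # Hypothetical (hour_of_day, lat, lon) fixes logged from one phone
        fixes = [
            (2, 53.3501, -6.2610), (3, 53.3499, -6.2605),    # night
            (10, 53.3438, -6.2546), (15, 53.3440, -6.2544),  # office hours
            (23, 53.3500, -6.2608),
        ]

        def centroid(points):
            # Average the latitudes and longitudes (fine at city scale)
            return (mean(p[1] for p in points), mean(p[2] for p in points))

        home = centroid([f for f in fixes if f[0] < 6 or f[0] >= 22])
        work = centroid([f for f in fixes if 9 <= f[0] < 18])
        print("likely home:", home)
        print("likely work:", work)

    A real pipeline would cluster properly and smooth over weeks of data, but the idea is no deeper than this.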


    Look, I am actually delighted about the possibilities of what can happen with all that data and all those sensors. I can’t wait for the future of better medicine to arrive. I also can’t wait for Google Cars to become commonplace. What bothers me is that all these changes are happening with nary a thought about their impact on our society. If we as an industry are change agents and want to talk about an age of abundance in 50 years, we can’t ignore that the next 50 years might mean a tear in our social fabric.
    It is important for us to talk about the societal impact of what Google is doing, or what Facebook can do with all the data. If it can influence emotions (for increased engagement), can it compromise the political process? What’s more, today Facebook has built a facial recognition system that trumps that of the FBI. Think about that for a minute.


    Can we trust these Medici of modern times to regulate themselves and do the right thing? How long before the pressure of Wall Street and its incessant quarterly demands makes Facebook or Google go to unthinkable places? These are the issues of our times, something I had initially discussed in my posts about data Darwinism and its impact on society.

    Automation Ahead


    Automation of our society is going to cause displacement, no different from the mechanization of our society in the past. There were no protections then, but hopefully, a century later, we are smarter about dealing with the pending change. People look at Uber and the issues around it as specific to a single company. That is not true: drones, driverless cars, dynamic pricing of vital services and privatization of vital civic services are all part of the change driven by automation and computer-driven efficiencies. Just as computers made corporations efficient (a euphemism for employing fewer people and making more money), our society is getting more “efficient,” thanks to the machines.


    “There is an increasing realization of the pain brought about by all these changes, especially the number of industries being disrupted and the many jobs that have been lost and will never come back. We hope that, as in the past, new industries will give rise to exciting new jobs,” writes Irving Wladawsky-Berger, a veteran technologist who worked for IBM. “But no one knows for sure. It’s important that we collaborate across disciplines (technology, business, social sciences, humanities) to better understand and anticipate where the journey might take us.”

    And that is exactly the problem: no one really wants to take that humanistic approach. The hybridization of man and machine has begun in earnest. Google, Facebook and Amazon know that and are quite far ahead of the rest of the world. Take Facebook, for instance, knowing how to manipulate our emotions based on the information it surfaces for us. How about Amazon’s future ability to predict our commercial needs?


    What about those new voice processing chips inside smartphones that will constantly listen to what is happening in the world around us and help create magical experiences for us? What about the data collected from the other sensors on our phones? What are the rules around the privacy of that information? Who is making those rules?


    John Foreman, a data scientist at MailChimp, in an eloquent essay, pointed out that “humans are bad at discerning the value of their data” and that the “personal data just appears out of nowhere, exhaust out of life’s tailpipe” and thus we are willing to trade it for something that seems less valuable. Foreman’s argument points out the futility — we are trading our freedoms in the data age for some minor gains.


    In March 2013, in his keynote at Gigaom’s Structure Data conference, Quid’s Gourley estimated that it costs Facebook $1.20 a year per user to generate over $6 per year in revenues. We are willing to trade our data for less than what it costs to get a cup of coffee at Starbucks. “Our past data betrays our future actions, and rather than put us in a police state, corporations have realized that if they say just the right thing, we’ll put the chains on ourselves,” Foreman writes. “In the hands of machine learning models, we become nothing more than a ball of probabilistic mechanisms to be manipulated with carefully designed inputs that lead to anticipated output.”

Moral Imperative


    Many of these technologies will indeed make it easier for us to live in the future, but what about the side effects and the impact of these technologies on our society, its fabric and the economy at large? It is rather irresponsible that we are not pushing back by asking tougher questions of the companies that are likely to dominate our future, because if we don’t, we will fail to have a proper public discourse, and we will deserve the bleak future we fear the most.


    The sad part is that the legislators and the judicial bodies of our nations are woefully under-equipped to deal with the monumental change that we as a society are experiencing. In a way, I feel Silicon Valley and the companies that control the future need to step back, become self-accountable and develop a moral imperative. My good friend and Stanford d.school professor Reilly Brennan points out that it is all about consumer trust. The concept of Waze working with municipal groups should in theory be a good thing, but we are all highly skeptical and suspicious of the motives of data collectors.


    Like I said, a lack of clarity around data intentions is to blame. And the only way I see to overcome that challenge is for companies themselves to come up with a clear, coherent and transparent approach to data. Instead of an arcane Terms of Service, we need plain and simple Terms of Trust. To paraphrase Peter “Spiderman” Parker’s Uncle Ben: with big data comes big responsibility. The question is: will the gatekeepers of the future rise to the challenge?


Comments

  • Closed Accounts Posts: 8,061 ✭✭✭keith16


    I was a bit nervous at first about enabling location sharing on my phone.

    But Google offers an amazing free service, so who cares if they know where I took a piss last week.

    In any case, that data on its own (my data) isn't even remotely interesting, nor will it ever be. So I don't feel as tho my privacy is compromised as such.

    I guess the real moral question is when the likes of FB start manipulating user data to try and control people's moods etc.


  • Registered Users, Registered Users 2 Posts: 8,219 ✭✭✭Calina


    keith16 wrote: »
    I was a bit nervous at first about enabling location sharing on my phone.

    But Google offers an amazing free service, so who cares if they know where I took a piss last week.

    In any case, that data on its own (my data) isn't even remotely interesting, nor will it ever be. So I don't feel as tho my privacy is compromised as such.

    I guess the real moral question is when the likes of FB start manipulating user data to try and control people's moods etc.

    Well, they already tried that and the fallout was fairly negative.

    I'm not a fan of location sharing in general. I don't mind it to some extent for something like Waze, but it is definitely something I don't switch on for Facebook, for example. I think ultimately it is something that you deal with on a case-by-case, application-by-application basis.

    The other problem is if it ever becomes interesting. E.g., the US Supreme Court has ruled that police now need a warrant to access data on your phone. Now, I'm not suggesting for one minute that you will ever turn out to be a major criminal or anything like that, but the truth is that information collected about you, even with your consent, may not always be used in your direct interest.


  • Registered Users, Registered Users 2 Posts: 8,219 ✭✭✭Calina


    This dropped into my Twitter feed from Hilary Mason today.
    Imagine getting a call from your doctor if you let your gym membership lapse, make a habit of buying candy bars at the checkout counter, or begin shopping at plus-size clothing stores. For patients of Carolinas HealthCare System, which operates the largest group of medical centers in North and South Carolina, such a day could be sooner than they think. Carolinas HealthCare, which runs more than 900 care centers, including hospitals, nursing homes, doctors’ offices, and surgical centers, has begun plugging consumer data on 2 million people into algorithms designed to identify high-risk patients so that doctors can intervene before they get sick. The company purchases the data from brokers who cull public records, store loyalty program transactions, and credit card purchases.

    Information on consumer spending can provide a more complete picture than the glimpse doctors get during an office visit or through lab results, says Michael Dulin, chief clinical officer for analytics and outcomes research at Carolinas HealthCare. The Charlotte-based hospital chain is placing its data into predictive models that give risk scores to patients. Within two years, Dulin plans to regularly distribute those scores to doctors and nurses who can then reach out to high-risk patients and suggest changes before they fall ill. “What we are looking to find are people before they end up in trouble,” says Dulin, who is a practicing physician.

    I'm not really sure I like the idea of health care providers getting data from store loyalty programs and credit card purchases.
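
    For anyone wondering what "plugging consumer data into algorithms" actually looks like, here is a toy sketch of that kind of risk score, assuming a simple logistic model; the features and weights are made up for illustration, not anything Carolinas HealthCare has published:

        from math import exp

        # Hypothetical per-patient features bought from data brokers
        patient = {
            "gym_membership_active": 0,      # lapsed membership
            "candy_purchases_per_month": 8,  # from loyalty-card data
            "plus_size_purchases": 1,        # from credit-card data
        }

        # Made-up weights; a real model would fit these to outcomes data
        weights = {
            "gym_membership_active": -1.5,
            "candy_purchases_per_month": 0.2,
            "plus_size_purchases": 0.8,
        }
        bias = -1.0

        z = bias + sum(weights[k] * v for k, v in patient.items())
        risk = 1 / (1 + exp(-z))  # squash to a 0-1 risk score
        print(f"risk score: {risk:.2f}")

    The unsettling part is not the maths, it is where the inputs come from.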


  • Closed Accounts Posts: 1,260 ✭✭✭Rucking_Fetard


    Calina wrote: »
    This dropped into my Twitter feed from Hilary Mason today.



    I'm not really sure I like the idea of health care providers getting data from store loyalty programs and credit card purchases.
    ye, posted that in Info Sec last wk.

    Google can't data mine health data at present; imagine the collective data they'd have if they could.

    Companies mentioned in the article:
    Acxiom
    LexisNexis Group



  • Registered Users, Registered Users 2 Posts: 7,521 ✭✭✭jmcc


    I just don't trust the people in Google. They have shown themselves to be completely untrustworthy with that Google Maps/WiFi data gathering issue and the counterfeit drugs advertising for which Google was fined $500 million. ( http://www.forbes.com/sites/robertlenzner/2014/01/17/how-a-500-million-fine-paid-by-google-for-selling-illegal-drugs-on-the-web-is-used-for-retiremernt-benefits-rhode-island-police/ )

    Regards...jmcc

