
The Future of Privacy: Trade-offs in the Age of Context

By John McClurg

The confluence of social media, digital mobile devices, sensors, and location-based technology is generating unprecedented volumes of information about society and individuals. A few years ago, for example, a study found that an analysis of a person’s Facebook likes yields a more accurate personality assessment than one made by friends and family.

Armed with such insights, digital devices and services can anticipate what we’ll need next and serve us better than a butler or an executive assistant, according to Age of Context authors Robert Scoble and Shel Israel.

Of course, such benefits don’t come without trade-offs. A Pew Research report, The Future of Privacy, explores these changes, the growing monetization of digital encounters, and the shifting relationship of citizens and their governments.

As people increasingly value the contextual richness that highly personalized technology brings to life — Scoble and Israel’s Age of Context — the concept of privacy, particularly in the minds of Linksters or Generation Z, is drastically evolving. And as people willingly share more personal information — on social media, with location-based services, and elsewhere — restricting that data to authorized uses becomes more critical.

Classically-Conceived Privacy vs. Contextual Richness

By classically-conceived privacy, we mean the norm of only a few years back, when sharing any information online immediately raised red flags for everyone, not just the paranoid. But now we see that the trade-off between classically-conceived privacy and contextual richness will continue to evolve, just as the advent of the Internet and digital media changed the concept of property ownership and copyright protection.

With the Internet and digital media came the ability to make unlimited copies without depriving the original owner of use, which forced a significant expansion and retooling of the legal protections for intellectual property and copyrights.

Similarly, we must adapt the way we protect privacy rights in the era of big data analytics, artificial intelligence, and machine learning, which collect ever-larger data sets from myriad sources.

Today we grant specific permissions for the use of our information, covering both personal and aggregate use, when we agree to privacy policies on social media and other digital services. But as big data analytics grows, spawning secondary and tertiary uses downstream from the primary data collectors, it may become impossible to seek permission from all vested parties. Data collectors may ultimately have to be accountable for how your data is used, regardless of the permissions they obtain up front.

The Future of Privacy

One solution to this data issue will be to embed access controls into data itself at the point of creation. With such self-aware and self-protecting data, organizations can ensure that data flows securely to the right people — and only the right people — at the right time and in the right location.
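To make the idea concrete, here is a minimal sketch, in Python, of data that carries its own access policy. The `seal` and `access` helpers are hypothetical names for illustration; an HMAC signature binds the policy to the payload so neither can be altered in transit. A production system would use real encryption and managed keys rather than releasing plaintext after a policy check.

```python
import hashlib
import hmac
import json

SECRET = b"demo-key"  # illustration only; real systems use managed keys


def seal(payload: str, policy: dict) -> dict:
    """Bundle the payload with its access policy and sign both,
    so the policy travels with the data and cannot be stripped off."""
    body = json.dumps({"payload": payload, "policy": policy}, sort_keys=True)
    tag = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}


def access(sealed: dict, requester: str, location: str) -> str:
    """Release the payload only if the embedded policy authorizes
    this requester at this location."""
    expected = hmac.new(SECRET, sealed["body"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sealed["tag"]):
        raise PermissionError("data or policy has been tampered with")
    record = json.loads(sealed["body"])
    policy = record["policy"]
    if requester not in policy["allowed_users"]:
        raise PermissionError(f"{requester} is not authorized")
    if location not in policy["allowed_locations"]:
        raise PermissionError(f"access denied from {location}")
    return record["payload"]


sealed = seal(
    "guest prefers 7am breakfast",
    {"allowed_users": ["concierge"], "allowed_locations": ["hotel"]},
)
print(access(sealed, "concierge", "hotel"))  # authorized: payload released
```

The key design point is that the policy is inseparable from the data: wherever the sealed record flows downstream, the same checks travel with it, rather than depending on each recipient's perimeter controls.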

Enjoying the fruits of our ever-more connected world requires the free flow of data to people, places and “things” — yet only the ones we authorize. When you’re staying at your favorite hotel, and your room service breakfast arrives fifteen minutes early — because the traffic en route to your morning meeting is snarled and the concierge knows you’ll need a cab early — the benefits of sharing your preferences and schedule with the hotel are clear.

Yet you want only trusted partners and service providers to have access to such data. Developing the necessary security and an accountability model for organizations that put personalized information and big data to use may take some time.

But if we do our jobs correctly, as we leverage the prowess of artificial intelligence and machine learning, the benefits of our hyper-connected world should always outweigh the risks.


John McClurg
Cylance VP & Ambassador-At-Large
