The confluence of social media, mobile devices, sensors and location-based technology is generating unprecedented volumes of information about society and individuals. A recent study found that analyzing a person’s Facebook likes yields a more accurate personality assessment than judgments made by friends and family. Armed with such insights, digital devices and services can anticipate what we’ll need next and serve us better than a butler or an executive assistant, according to Age of Context authors Robert Scoble and Shel Israel.
Of course, such benefits don’t come without trade-offs. A Pew Research report, The Future of Privacy, explored these changes, along with the growing monetization of digital encounters and the shifting relationship between citizens and their governments. As people, particularly millennials, increasingly value the contextual richness that highly personalized technology brings to life, the concept of privacy is evolving toward a new normal. And as people willingly share more personal information – on social media, with location-based services and elsewhere – securing that data for strictly authorized uses becomes more challenging.
The trade-off between privacy and “contextual richness” will continue to evolve, just as the advent of the Internet and digital media changed the concept of property ownership and copyright protection. The ability to make unlimited copies of a work – without depriving the original owner of its use – forced a significant expansion and retooling of the legal protections for intellectual property and copyrights. So too must the way we protect privacy rights evolve in the era of big data analytics, with its collection of ever-larger data sets from myriad sources.
Today we grant specific permissions for the use of our information – both personal and aggregate – when we agree to the privacy policies of social media and other digital services, whether or not we have read them. But as big data analytics grows, bolstered by the strength of AI – spawning secondary and tertiary uses downstream from the primary data collectors – it may become impossible to seek permission from all vested parties. Data collectors may ultimately have to be accountable for how our data is used, regardless of the permissions they obtain up front. One solution would be to embed access controls into the data itself at the point of creation. With such self-aware, self-protecting data, organizations can ensure that information flows securely to the right people – and only the right people – at the right time and in the right location.
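To make that last idea concrete, here is a minimal sketch, in Python, of what self-protecting data might look like: a hypothetical DataEnvelope that bundles a payload with an embedded AccessPolicy naming who may read it, when, and from where. The class and field names are illustrative assumptions rather than any existing product or standard, and a production system would enforce such a policy cryptographically (for example, with attribute-based encryption) rather than with a simple runtime check.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import FrozenSet, Optional

@dataclass(frozen=True)
class AccessPolicy:
    """Access rules embedded with the data at the point of creation."""
    allowed_principals: FrozenSet[str]   # who may read the payload
    not_before: datetime                 # earliest permitted access time
    not_after: datetime                  # latest permitted access time
    allowed_locations: FrozenSet[str]    # coarse location labels, e.g. "hotel-premises"

@dataclass(frozen=True)
class DataEnvelope:
    """A payload that carries its own access controls (self-protecting data)."""
    payload: bytes
    policy: AccessPolicy

    def open(self, principal: str, location: str,
             when: Optional[datetime] = None) -> bytes:
        """Release the payload only if the request satisfies the embedded policy."""
        when = when or datetime.now(timezone.utc)
        if principal not in self.policy.allowed_principals:
            raise PermissionError(f"{principal} is not an authorized recipient")
        if not (self.policy.not_before <= when <= self.policy.not_after):
            raise PermissionError("request falls outside the permitted time window")
        if location not in self.policy.allowed_locations:
            raise PermissionError(f"access from {location} is not permitted")
        return self.payload

# Illustrative use: a guest's preferences, readable only by the hotel's
# concierge service, only during the stay, and only from the hotel's systems.
envelope = DataEnvelope(
    payload=b"breakfast at 7:00, black coffee, cab for the 8:30 meeting",
    policy=AccessPolicy(
        allowed_principals=frozenset({"concierge-service"}),
        not_before=datetime(2015, 6, 1, tzinfo=timezone.utc),
        not_after=datetime(2015, 6, 5, tzinfo=timezone.utc),
        allowed_locations=frozenset({"hotel-premises"}),
    ),
)

print(envelope.open("concierge-service", "hotel-premises",
                    when=datetime(2015, 6, 2, 6, 45, tzinfo=timezone.utc)))

In this sketch the data refuses to reveal itself unless the requester, the time and the place all match the policy it was created with; any downstream holder of the envelope inherits the same constraints.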
Enjoying the full fruits of our connected world requires the free flow of data to people, places and “things” – ideally only the flows we authorize. When you stay at your favorite hotel and your room service breakfast arrives 15 minutes early – because traffic en route to your morning meeting is snarled and the concierge knows you’ll need an earlier cab – the benefits of sharing your preferences and schedule with the hotel are clear. Yet you only want trusted partners and service providers to have access to such data. Developing the necessary security and an accountability model for organizations – public and private – that put personalized information and big data to use may take some time, and the effort is complicated by past wounds – the kind Shakespeare had in mind when he wrote that “he jests at scars that never felt a wound.” But if we do our jobs correctly, the benefits of our hyper-connected world should always outweigh the risks. Recent exchanges between private sector corporations and various branches of law enforcement and the intelligence community reflect how difficult striking that balance can be. Fortunately, our past is replete with instances in which we have successfully navigated such potentially hazardous shoals.
Next month we will examine one such instance which, notwithstanding the substantial hand-wringing and consternation that preceded it, offers an encouraging message to those now grappling with how to find a measured path forward.