Has Google ever guessed what you were going to search for long before you finished typing it out, even before you gave it enough information to really be able to make that guess? It’s uncanny at first, but it quickly becomes something you not only expect but appreciate. Because that’s what we want our digital tools and technologies to be: instruments that guess what we want and give it to us as soon as, or even better, before we ask for it. But are these tools giving us the information we are looking for, or are they providing us with the answers they think we want, even if that information is not actually what we should be receiving? And just as importantly: How do these technologies know what we are looking for and what kind of answers we prefer? And who controls, interprets, and protects that information and that process?
As I write this, Google is in the process of rolling out a new type of technology that has the potential to change our lives, our interactions, and our society in a fundamental way. That technology, obscurely named “Glass”, is designed to add a digital layer to our everyday lives, removing the abstraction of the screen by superimposing web-based services and capabilities onto the real world we see in front of us. Glass is a computer in the shape of glasses, providing a heads-up display akin to what you see in video games but designed for everyday life. The stuff of dreams made reality. The tech world is, not surprisingly, raving about this new leap in technological advancement. Wearable computers have long been the Holy Grail for tech enthusiasts, and the potential inherent in this technology has long been a favored topic among science fiction writers and technologists alike. Used for good, the technology Glass represents could be of tremendous value and benefit to us all. I can think of thousands of situations where Glass could be useful, essential, even required. And that is undoubtedly the intention of its creators: to make life easier, better, more enjoyable. But whatever its intention, this technology could easily end up augmenting our reality and our lives in a very real way that makes Orwell’s dystopian predictions of Big Brother look like a rosy fairytale. And the alarming part is we wouldn’t notice it happening, because it already is.
The Map of You and Me
Take a step back and think about how you use the web today. No longer just an information hub, the web has become the medium on which we conduct a large percentage of our communication. In the past you probably used Google mainly for search, but today you likely use it for your email, chat, social networking, video consumption, and more. And Google is but one of many vendors for search and web services. Facebook, Microsoft, Twitter, Pinterest — all of these services have been adopted into our everyday lives under the auspices of making our lives simpler and more informed. But what happens behind the scenes? How is it that these services are so good at guessing what we want and serving our social, informational, and entertainment needs? It’s because every time we use one of these services, that service in turn gathers, stores, and interprets information about us and our behaviours. And the more information is gathered and analyzed, the better the algorithms get and the better the services get at predicting our behaviour. Every email you send, every tweet or Facebook update, every Foursquare check-in, every watched YouTube video, comment on Google+, or simple text search in a search engine becomes part of a personality profile. And every future action on these services is impacted by this profile. If this was happening in the real world we would be alarmed. When Target started profiling its customers and was able to predict a customer’s pregnancy before her family knew, it sparked outrage. But our online services have been doing this for years and have eased us into it, so that rather than questioning what is going on, we not only accept it but expect it. We have implicitly allowed large data mining corporations to start the biggest mapping of human behaviour ever undertaken, and done so without asking questions about why they are doing it and what this information is and will be used for.
On the face of it all this may seem to be OK. If a personality profile means the services you use online can predict what you are doing and simplify your life accordingly, what’s wrong with that? The problem is that the main purpose of these services is not to help you but to keep you using the service, and to be influenced by it and by things like advertising in the process. So instead of providing you with the information you are looking for, they provide you with the information they think you will like the most, so that you return the next time you want information. When you make a search on a search engine or open Facebook, you are not presented with an accurate picture of the online world. Instead what you get is a carefully crafted image skewed to match your biases and preferences, whether they be social, religious, ethnic, or political. A conservative Christian white male will be presented with vastly different search results from those of a liberal atheist Asian female when entering queries regarding politics, religion, or ethnicity. And the search results they get will usually be ones that provide positive reinforcement to their views and ideals. This phenomenon has been called the Filter Bubble, and it is something we as a society need to take a long hard look at.
In a nutshell, filter bubbles are web-based worldwide echo chambers that isolate ideas and protect their inhabitants from opposing or dissenting views. As a result, when a person with extreme ideas goes to the web, he will find endless support for his ideas, even if those ideas are groundless, misinformed, and largely discarded by society as a whole. In a worst-case scenario this informational bias can lead to a person becoming radicalized and a danger to society. In the last few years we have seen several instances where the filter bubble is likely to have played a part: In the USA a large portion of the populace believes in one of many unfounded and debunked conspiracy theories about President Barack Obama — that he is a Muslim, that he is not a US citizen, that he is a terrorist, and so on. In Norway an ultra-nationalist right-wing terrorist killed 77 people in an attempt to quash a political party he was convinced was trying to convert the country to Islam. And in the wake of the Newtown massacre that saw 26 killed, so-called “Truthers” used the web to promote a conspiracy theory that the attack was a hoax perpetrated by the government to bring forth stricter gun control laws. The common thread that binds these and other such instances together is that the ideas are perpetuated on the web and spread among like-minded people. And once they are caught in a filter bubble, they only find information that reinforces and strengthens these ideas. Google and other service providers claim they are taking steps to prevent this type of extremist bubble effect, but the principle of the filter bubble lies at the core of their services and will more likely get further entrenched than dismantled.
Your Lifestream, Controlled
Looking into his crystal ball, technologist David Gelernter is now predicting we are moving towards a future in which predictive search and input is coupled with real-time streaming of information, producing a personalized information stream presented to us at all times. Considering the current bias in online information delivery, and the ever-escalating data mining of our everyday lives, this is an alarming proposition at the best of times. When you add Google Glass and the inevitable Apple variety of the same product, it becomes a nightmare Orwell would have thought too unrealistic to write, even as fiction. Consider a world in which a significant percentage of the population wore Glass or an equivalent product. They would be wired to the web and its services 24/7/365 and would send and receive a constant stream of information. At the other end all that information would get stored, parsed, analyzed, and used to guide the users through their lives. There are tremendous security and privacy issues here, many of which are addressed in Mark Hurst’s The Google Glass future no one is talking about, but to me the more alarming aspect is the potential this technology gives large corporations, clever marketers, and even governments to influence and control our behaviour.
If you take a look at your life today, you can see how much influence search and social sharing have on your decisions and your opinions. And these influences are already heavily curated to move you towards certain products, attitudes, and behaviours. For now this is based on your interactions with computers, tablets, and smartphones. Now imagine what happens when you start wearing a device that provides this same type of information to you at all times. No longer abstracted to an external screen, but added to your regular field of vision. And while you are consuming the carefully curated and controlled information fed to you, the device is recording your every move, every interaction, and every word spoken.
Brave New World of Glass
On a server somewhere there is a file with your name on it containing more information about you than you have on yourself. The server can predict your every move with impressive accuracy; it knows where you are, where you are going, and who you are interacting with. And at every turn in your life it will use this information to try to influence your decisions and your actions. This is not science fiction, nor the future. This is happening today, right now, as you are reading this and considering who to share it with. Tomorrow it will be right in front of your eyes, changing your reality. Big Brother could be so lucky…