DW: You start your latest book, The Internet of Us, with a vision of the near future dominated by neuromedia – which would give us instant access to facts, downloaded directly to our brains. It sounds kind of nice, doesn’t it?
Michael P. Lynch: Well, it would be nice in many, many ways. It would make our lives much more convenient, in the same way our smartphones make our lives much more convenient. And neuromedia would allow us to coordinate activities even more: if we were able to think to one another via digital means, those coordinated activities would allow us to come up with new ways of doing things – not just mundane things, but new kinds of art and coordinated endeavours like medicine.
On the other hand, coordinated activities like war would also be easier if we were able to communicate and access all this information faster than the speed of tweet. So whether it would be nice depends on what we would do with it.
Deeper understanding
You say there is a difference between having facts at our fingertips and actually remembering those facts, let alone truly grasping them as real understanding and knowledge. Why do we need to be thinking about this now?
Well, the wonderful thing about our devices is that they do give us access to all sorts of information, good and bad. If we concentrate on the good information for a moment, that means they allow us to access “discrete packets” of facts, or lists of facts, which are incredibly useful on a piecemeal basis. But [these] lists of facts – and I call them lists only for shorthand – don’t give us the hows and the whys. To understand something is to see how the facts fit together, to connect the dots. If you think about the sort of deeper knowledge a historian has, it is not just knowing that certain things happened on such and such a date, but understanding why and how they happened.
And deeper knowledge gives us a deeper understanding of ourselves, which in turn influences our interactions with other people. We need to know more than a person’s name, age…
Exactly. When we understand each other, and ourselves, in this deeper way, we understand how and why we do things. We all want that. We want to know why we do the things we do. That’s the quest for self-understanding.
There is so much information available about people online that they themselves post – like most other people in the world, I curate a self-image of myself online. But that curated self-image isn’t telling you what the deeper person is all about.
Issue of trust
Getting a better understanding of ourselves and each other is about whether we can trust each other. But aren’t we being misled into believing that correlations of data are more important than the deeper understanding – the grey areas of data, the bits we don’t openly see?
First, you’re right. Trust is a crucial aspect of any human transaction, whether that’s an emotional transaction or one of information. If you tell me something, I want to take it as useful and true.
The correlations and big data analytics give us predictions. But they don’t tell us why those predictions are trustworthy. And that root explanation is where trust is grounded.
On the other hand, on this issue of grey areas, sometimes trust is most needed in precisely those cases where we don’t have all the information. That brings up a slightly different issue, which is that no matter how much predictive power data analytics is going to give us, it is never going to resolve all of these grey areas. The reason for that is, as I said before, that it’s not going to explain why the predictions are accurate. And without understanding that, we’re not going to understand the causal basis of the prediction, or why the world works in the way the predictions take it to work.
One of the other issues you raise is the filtering of information by large tech firms. Doesn’t filtering pre-date Silicon Valley? Take libraries. Even libraries have to make decisions on which books they can afford to buy in, based on what they think is important for the local community. So that’s filtering too.
You’re right, filtering is always taking place. Even organisms – animals – have to filter out a lot of noise to be able to track predators or prey. Filtering is a way in which cognitive mechanisms tend to work. However, the difference now is that the filtering is happening as a result of processes that are opaque to us. Or they are opaque if we’re not paying attention. When we do a Google search, the links that come up first are the product of machine-generated algorithms and mathematical techniques that are not really available to us. But we do know this: they are the product of our preferences. So what you see on the internet as the result of a search is the result of filtering that’s being done by millions and millions of users, including yourself.
The ever-changing library
When you move through the world of the internet now it is like walking down a row of library books that are constantly changing to reflect what you are interested in and what millions of other people are interested in. As a result they also change what you’re interested in. And if you’re not paying attention to how the internet works, that is something you might ignore, or not notice, or not care about, and yet it has a tremendous impact on which book you’re going to be able to take down off the shelf.
Back at the start of your book, you make the point that you’re not anti-technology.
Absolutely not.
A lot of writers writing about the internet and technology make that point these days. It’s like a standard disclaimer to what otherwise becomes a book of critique about the way tech is going. And to me it suggests either that you’re trying to take the wind out of the sails of the Silicon Valley luminaries, or that you feel you have to say this for the record but the battle is actually already lost. Is that the situation we’re in?
Information technology is meant to get us to knowledge. That’s why the book focuses on knowledge. But relying exclusively on digital means to get knowledge – even falling in love with them – can blind us to their bad consequences and also blind us to the value of other ways of getting to where we want to go. So it’s not about the technology itself. On the other hand, I will concede this: if you think there is no way of separating the technology from our use of the technology, then the question of whether I’m for or against it becomes more complicated.
Michael P. Lynch is professor of philosophy at the University of Connecticut, where he also heads the Humanities Institute. “The Internet of Us: Knowing More and Understanding Less in the Age of Big Data” is published by W. W. Norton.