“[D]eep learning systems are ‘already pushing their way into real-world applications. Some help drive services inside Google and other Internet giants, helping to identify faces in photos, recognize commands spoken into smartphones, and so much more’. If deep learning networks systematically classify the world’s patterns in ways that are at variance with our ordinary human classifications, and if those networks are lodged in the workings of the technology that organizes and shapes our cognitive lives, then those lives will be organized and shaped by those variant classifications.
However, surely we will notice this divergence, I hear you say. It is here that a second point becomes relevant. What if the networks in question are fluidly and expertly integrated into our everyday activities, such that they are transparent in use? Imagine such networks operating as part of a cognitive-assistant-style wearable that classifies situations and transmits the results via an optical head-mounted display… such behaviour-guiding technology, even though it enhances cognitive performance, and even though it is operating in cognition central, could be transparent. On some occasions, no doubt, its variant classifications of the world would lead to mismatches to which the human user will be sensitive before anything detrimental occurs. However, it seems just as likely that subtle changes in one’s engagement with the world—changes that, for example, have potentially damaging social consequences for how one classifies others…
The final aspect of this worrying scenario comes to light once one realizes that [such technologies] are at least on the way to being correctly treated as genuine parts of the user’s own cognitive architecture… that unconsciously guide my behaviour… be part of what I unconsciously believe to be the case, and thus presumably will have the same status as my more familiar, internally realized unconscious beliefs when it comes to any moral judgments that are made about my resulting thoughts and actions.”
– Michael Wheeler
When a skilled person uses some technology without difficulty, she is no longer consciously aware of the technology. It disappears from her awareness in the same way that, in the flow of writing, she does not notice the pen in her hand. This lack of active awareness of the technology in use is labelled transparency.
Continue reading “Should smart technologies be transparent when used?”
“The difference between a computer programmer and a user is much less like that between a mechanic and a driver than it is like the difference between a driver and a passenger. If you choose to be a passenger, then you must trust that your driver is taking you where you want to go. Or that he’s even telling you the truth about what’s out there. You’re like Miss Daisy, getting driven from place to place. Only the car has no windows and if the driver tells you there’s only one supermarket in the county, you have to believe him. The more you live like that, the more dependent on the driver you become, and the more tempting it is for the driver to exploit his advantage.”
– Douglas Rushkoff
Rushkoff may appear to be making an argument commonly made by those who see the world through the prism of their own particular expertise. The mathematician will argue that we should all learn to think more mathematically; the scientist, that we should be more scientific in our world view. The artist, philosopher, psychologist, and so on, in turn might argue that in order to make sense of things, you need to be able to understand the world through the tools of their particular craft. But does Rushkoff’s claim have more to it in this particular technological age? Is there a special need to learn programming, in the same way we accept that everyone should learn some maths, and how to read and write? To answer this question, I consider what is distinctive about learning to program compared with other technical endeavours, and examine one way in which it might change how we comprehend the world.
Continue reading “Should you learn to program?”
“How we use technologies is shaped by the games and forms of life that are already in place “before” we use them. There is already a “grammar” of technology. Of course there is also a “grammar” in the sense of “syntax”: specific rules how to put together different parts for instance, or specific operating instructions. But there is also a grammar in a wider, more social and cultural sense: there are already particular activities and ways we do things, there are already games, and the technologies are part of those games and their use is shaped by the games.”
– Mark Coeckelbergh
Continue reading “How is technology use shaped by the wider culture?”
“We should notice the force, effect, and consequences of inventions, which are nowhere more conspicuous than in those three which were unknown to the ancients… printing, gun powder, and the compass. For these three have changed the appearance and state of the whole world… innumerable changes have been thence derived, so that no empire… appears to have exercised a greater power and influence on human affairs than these mechanical discoveries.”
– Francis Bacon
The idea that science is a “theory-producing machine” came to be challenged by philosophers of science such as Karl Popper and Thomas Kuhn. A growing, if still controversial, view sees science as situated within social, political and constructivist contexts. Even amongst scientific realists (and I tentatively include myself in that camp), it is recognised that science itself is technologically embodied:
“Without instruments and laboratories, there was no science.” (p. 7)
Continue reading “How does technology enable scientific discovery?”
“Dr. Frankenstein’s crime was not that he invented a creature through some combination of hubris and high technology, but rather that he abandoned the creature to itself. When Dr. Frankenstein meets his creation on a glacier in the Alps, the monster claims that it was not born a monster, but that it became a criminal only after being left alone by his horrified creator, who fled the laboratory once the horrible thing twitched to life. “Remember, I am thy creature,” the monster protests, “I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed… I was benevolent and good; misery made me a fiend. Make me happy, and I shall again be virtuous.””
– Bruno Latour
Continue reading “What care do we owe for the world we have made?”
“Thus we shall never experience our relationship to the essence of technology so long as we merely conceive and push forward the technological, put up with it, or evade it. Everywhere we remain unfree and chained to technology, whether we passionately affirm or deny it. But we are delivered over to it in the worst possible way when we regard it as something neutral; for this conception of it, to which today we particularly like to do homage, makes us utterly blind to the essence of technology.”
– Martin Heidegger
A prevalent misunderstanding is that technology is merely a tool: a means to an end, neutral in value. Heidegger wants to free us from this misunderstanding, to bring to our attention our attitude towards, and relationship with, the technology we have at hand and in use. That we accept and use technology without this attentiveness, often without being aware that we are using it at all, is what makes us unfree.
Continue reading “How do we free our relationship with technology?”
“From the early days of manned space travel comes a story that exemplifies what is most fascinating about the human encounter with modern technology. Orbiting the earth aboard Friendship 7 in February 1962, astronaut John Glenn noticed something odd. His view of the planet was virtually unique in human experience; only Soviet pilots Yuri Gagarin and Gherman Titov had preceded him in orbital flight. Yet as he watched the continents and oceans moving beneath him, Glenn began to feel that he had seen it all before. Months of simulated space shots in sophisticated training machines and centrifuges had affected his ability to respond. In the words of chronicler Tom Wolfe, “The world demanded awe, because this was a voyage through the stars. But he couldn’t feel it. The backdrop of the event, the stage, the environment, the true orbit … was not the vast reaches of the universe. It was the simulators. Who could possibly understand this?” Synthetic conditions generated in the training center had begun to seem more “real” than the actual experience.”
– Langdon Winner
Continue reading “What happens when the virtual feels more real than the actual?”