Monday, April 6, 2009

Some Hope for the Future of Computing

I think it might be a good idea to put Mark Weiser and Jean Baudrillard in a room together. While the latter names the computer screen, or more accurately the entirety of simulated reality, as the harbinger of the end of actual reality, the former seems unimpressed by our level of technological achievement. Weiser outlines an impressive array of devices and processes that describe the way we might experience ubiquitous computing in the future. More important than his specific ideas, though, is the overarching image of technology he describes. It's a view that would put a lot more people at ease.


The potential for oversaturation and overstimulation associated with digital culture is a popular subject both in the classroom and around the dinner table. What many fail to recognize in these vague, overgeneralized conversations is Weiser's exact point: our species has integrated many technologies seamlessly into our lives. Language is a great example. While it may complicate things, language is an integral part of humanity. One may argue that a friend talks too much, but no one would say that we as a culture are 'spending too much time on language.' Yet language was probably quite clunky at first, requiring a great deal of focus and thought to form the right words. Weiser sees computers as similarly clunky in their current state. We engage with them so totally because we haven't figured out how to fit them into our lives such that we don't notice them. Ultimately his vision is of a world where computing is a means to an end.


Early on, there was an image of computing and the Internet as tools, ways of helping us communicate and process information. While that's still the case, digital life has increasingly become an end unto itself. Even blogging, which you could call a method of communication, is really more about digital life than it is about sharing information, particularly when you consider the vast numbers of people without Internet access in the world. Somewhere between the dot-com bubble and Web 2.0, there was a shift (for me, anyway) where computers went from being something I used to do other things to being something I did for its own sake. I'm lucky to be able to receive an education about such things, because it helps both to curb that craving for stimulation and to understand that, dangerous though it may be, I am experiencing only one step in the evolution of computing.


The fact of the matter is, this may be a transitional stage that we have to go through in order to understand this new technology. My language analogy starts to break down here, because I don't know if there was a time in which humans were worried that this newfangled 'talking' was taking away from real life, like gathering berries, or hunting mammoths. Still, I think there is reason to believe that before digital technology and simulated reality cause us to completely retreat into ourselves and become little more than hyperstimulated brains, we may just shrink our computers and tuck them away where we don't notice them.

