We’re often consumed by what will change over the next few years, but taking a longer view can be more interesting and instructive. Some of the world’s smartest people are hard at work developing amazing things that will dramatically change the way we interact with technology and each other over the next decade. Here are a few of them:
1. Virtually free and unlimited bandwidth and storage: This is so important that Chris Anderson (of “Long Tail” fame) just wrote an entire book about it: “Free”. Storage and bandwidth, already fairly substantial, are improving in performance and cost efficiency even faster than processor speed is. The marginal cost is falling to practically zero.
The trend is significant because it changes usage drastically. What is scarce is conserved, but what is abundant is used freely. The days of waiting those agonizing few minutes for web pages to load are already long past. When we can trade full-length movies as easily as we can download web pages, and store as many as we want, the way we interact with media will change. The restriction won’t be download time, but real time (just as rock bands today give away their music but charge for performances).
As people become able to collaborate on media like they do on a PowerPoint presentation, we can expect to see a lot more “peer to peer” stuff with people creating and sharing back and forth – Marshall McLuhan style. If you think about where storytelling, music, and other performance media would fit into a “Global Village” – a virtual campfire without any boundaries – you can imagine all sorts of primordial things that people will want to do digitally: sing-alongs, treasure hunts, mythmaking, etc.
2. Quantum Computing and Encryption: In order for technology to advance, computers need to get faster and cheaper. That’s been done successfully for 50 years, but now the ability to continue progress is threatened. If we can’t make chips faster, we can’t move forward. This is a big problem and researchers are working overtime to find an amazing solution! This one is a little bit hard to explain, so stay with me here…
Current chip design is based on transistors, which were developed at Bell Laboratories in 1947. They replaced vacuum tubes in electronic devices and were much smaller and more efficient. Transistors could have two positions, which could be translated into binary code (ones and zeros). By stringing transistors together, it became possible to combine enough ones and zeros to represent something intelligible. Everything computers can do today is limited by how many ones and zeros the computer chips can generate.
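To make the idea concrete, here is a minimal sketch of what “translating ones and zeros into something intelligible” looks like: each transistor holds one of two states, and a run of eight states (one byte) is enough to encode a character. (The `to_bits` helper is purely illustrative, not anything a real chip runs.)

```python
# Each transistor is on (1) or off (0). Eight such states, one byte,
# are enough to encode a single character of text.
def to_bits(text):
    """Encode a string as the ones and zeros a chip actually stores."""
    return " ".join(f"{ord(ch):08b}" for ch in text)

print(to_bits("Hi"))  # -> 01001000 01101001
```

Everything richer a computer does, from spreadsheets to video, is ultimately layered on top of encodings like this one.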
Eventually, William Shockley, the “father of the transistor”, left Bell Labs to form Shockley Semiconductor. He was long on intelligence (and was a recipient of the Nobel Prize) but short on personal qualities, and key members of his staff left to start Fairchild Semiconductor. It was there that the silicon-based integrated circuit (what we now call the computer chip) was invented in 1958. The key people at Fairchild would eventually form Intel.
Chips today are essentially like those of 1958, with one important difference: they have a lot more transistors. The transistors are basically drawn (etched) into silicon, and the smaller they can be drawn, the more transistors can fit onto a single chip and the faster the processor speed (which means that the computer is both cheaper and more powerful). Current advances in chip design, for the most part, involve drawing the circuits smaller and smaller, but that’s about to change, because it won’t be possible to draw circuits smaller than a few atoms (and we’re approaching that limit).
We’re running out of space (measured in nanometers, or billionths of a meter) to fit transistors into. The present “etching” technology is coming to the end of its usefulness (current chip design already utilizes spaces just a few dozen atoms across). But what if we weren’t limited by ones and zeros anymore?
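A rough back-of-the-envelope calculation shows why the shrinking has to end. Assuming a silicon atomic spacing of roughly 0.235 nanometers (an approximation used only for this sketch), we can estimate how many atoms span a circuit feature at various sizes:

```python
# Rough estimate of how many silicon atoms fit across a chip feature.
# Assumes an atomic spacing of ~0.235 nm (an approximation).
SI_ATOM_SPACING_NM = 0.235

def atoms_across(feature_nm):
    """Roughly how many silicon atoms span a feature of the given width."""
    return round(feature_nm / SI_ATOM_SPACING_NM)

for node in (90, 45, 22, 5):
    print(f"{node} nm feature: about {atoms_across(node)} atoms wide")
```

Once a feature is only a couple of dozen atoms wide, there is simply nothing smaller left to draw with; the “etch it smaller” strategy runs into physics itself.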
The most likely solution is quantum computing, which goes beyond binary by using electrons’ energy states to record information. “Qubits” (quantum bits) will replace bits, and processor speed will increase exponentially. We wouldn’t be limited to ones and zeros, but could utilize a potentially infinite range of values between one and zero! There are still a lot of problems to be worked out, but there is a wide and intensive effort to solve them.
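The source of the exponential speed-up can be sketched classically. Where a classical bit is 0 or 1, a qubit is described by a pair of amplitudes whose squares sum to one, and n qubits together require 2 to the power of n amplitudes to describe. A minimal simulation (just lists of numbers, not real quantum hardware) makes the blow-up visible:

```python
import math

# A classical bit is 0 or 1. A qubit is a pair of amplitudes (a, b)
# with a**2 + b**2 == 1, a continuum between "all zero" and "all one".
# Describing n qubits takes 2**n amplitudes -- the source of the power.
def equal_superposition(n):
    """State vector of n qubits with every outcome equally weighted."""
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

state = equal_superposition(3)
print(len(state))                 # 8 amplitudes for just 3 qubits
print(sum(a * a for a in state))  # outcome probabilities sum to 1
```

Thirty qubits would already need over a billion amplitudes to simulate classically, which is exactly why a machine that manipulates them natively would be so powerful.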
Another important aspect is that it’s possible that computers will become so powerful that current modes of encryption will become obsolete, thanks to an algorithm developed by Peter Shor at MIT. This is a potential nightmare scenario. It would have enormous consequences for e-commerce and any kind of electronic payment (to say nothing of the military significance). We won’t have to deal with this for at least a decade, and so I’m sure it will get worked out. (One possibility, as outlandish as it sounds, could be teleported bandwidth). However, both the strength of quantum processors and the solution to the encryption problem will probably change the way we think about a lot of things.
3. Exploiting synergies between technologies and appliances. Way back in 1987, Nobel Prize-winning economist Robert Solow famously said, “You can see the computer age everywhere but in the productivity statistics.” Of course, soon afterwards productivity rose to levels not seen for 30 years, but the point is that it takes time for technology to be used effectively.
Up until the mid-to-late ’90s, the internet was in its infancy and people didn’t do things like collaborate on documents. Even e-mail wasn’t pervasive yet. It took time not only for people to adopt technologies into their daily routines but also for technologists to figure out how to make things work together. It was much like the early 20th century, when cars needed roads and gas stations to really change the way people live. Economists call this the “Productivity Paradox.”
We’re living in a unique time because, although we are still learning how to use the new technologies of ten years ago (i.e. social media), in the meantime important, paradigm-shifting technologies are being developed (i.e. quantum computing). The most exciting things we can expect to see over the next decade will be based on how we learn to use and combine the technologies that have just recently become available, and how those technologies become further enabled by greatly increased processor speed, bandwidth and storage.
One of the amazing possibilities is Augmented Reality, based on the following three existing technologies:
– RFID: Radio Frequency Identification is a virtual “bar code” that can track objects and locate them in space.
– High Definition Satellite Imagery: Widely available commercial satellite imagery makes available real-time images of the earth that previously were available only to governments. It’s only a matter of time before the really cool military-grade technology goes mainstream.
– Holograms and Virtual Environments: With an expensive set of goggles, it is possible to walk through a virtual environment as if it were real. Moreover, as anybody who watched Barack Obama get elected on CNN knows, holographic technology has come a long way, with 3-D volumetric displays becoming more viable. (See video below)
Imagine a primitive version of Star Trek’s “Holodeck.” By combining those three technologies, it is only a question of processing power and bandwidth before it is possible to change what it means to “watch” a performance. We may be able to not only attend a global event locally, but also interact with the event as it interacts with us, in real time.
These are just a few of the possibilities; there are many more. As far as we’ve come, it’s still “early days” and there are still a lot of exciting surprises in store for us…
SOURCE: Digital Tonto