This is the End

YouTube – Live.

Cutting the cord is now becoming a reality.  Netflix is producing a TV show, YouTube is going live, and what do you want to bet that Hulu follows suit within the next two months?  This trend is not going away, and the corporate maniacs who think that they can control how and when people get access to the media they want to watch or listen to are wrong.

They know it, and they can’t fight it, but they will anyway. In the process, their greed will damage the very infrastructure that is supporting them right now. They don’t want to actually provide interesting things for people to watch or read; they just want to wrap it all in an iron cage so people have to pay through the nose to get it, just like the diamond industry.

In Like a Lion

One of the most powerful developments in recent years has been the creation of “cloud computing.” Folks familiar with the technology know that it’s essentially doing for your computer what email services like Gmail and Yahoo! have done for your communication–they’ve taken your messages, contacts, and other personal information and stored it on secure servers across the nation to make it easily retrievable in the case of an emergency or hardware failure. Instead of relying on a single storage point (your home PC, for example) to store all of your communication, Google, Yahoo, and dozens of other websites offer to handle all of those tasks in exchange for showing you advertising or using some non-identifiable information to craft better algorithms.

For most people, the immediate benefit of these systems was apparent. Access your mail anywhere, store contacts somewhere that won’t be affected in the case of a system crash or loss of a single device (like a phone), and integrate these services with your web browsing. Easy, and powerful. The systems that provide these services have evolved significantly since those early days, now allowing entire operating systems to essentially run through your broadband connection, piping only the data necessary for input while massive supercomputers handle all of the processing.
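For the code-minded, here’s a toy sketch of that thin-client idea. To be clear, this is my own illustration and not any real service: `remote_render` stands in for whatever a hypothetical data center would do, and in real life there would be a network hop between the two functions instead of a direct call.

```python
import json


def remote_render(request_json: str) -> str:
    """Pretend data-center side: receives a tiny input payload, does the heavy work."""
    request = json.loads(request_json)
    # Imagine expensive processing here (rendering a document, running a query,
    # transcoding video). We just uppercase the text to keep the sketch runnable.
    result = request["text"].upper()
    return json.dumps({"rendered": result})


def thin_client(user_input: str) -> str:
    """Device side: ships only the input upstream and displays whatever comes back."""
    request_json = json.dumps({"text": user_input})   # small payload going up
    response_json = remote_render(request_json)       # a network hop in real life
    return json.loads(response_json)["rendered"]      # small payload coming down


if __name__ == "__main__":
    print(thin_client("hello from the cloud"))
```

The point is just how little has to travel in either direction: a few bytes of input go up, a few bytes of result come down, and all the heavy lifting happens somewhere far away.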

That all sounds fine and good, but what does it mean for you?

Cloud computing, so named because of its pseudo-omnipresence, changes the role of computers significantly. They no longer exist as a single point of storage for all your information. Instead, the computer is more of a gateway, a portal to your data that is stored in massive servers. One analogy I can draw is that of a dry cleaner. With the old model of computing, it was as though you were standing at the front of a dry cleaning factory trying to look for a specific shirt. You might not even know where the shirt was located, but you’d still have to find it yourself. With the advent of search, that process was trimmed a bit: you tell someone else what to look for and where to look, and they find the shirt.

Now, with cloud computing, we see that yet another layer of interaction is slowly melting away. We’re doing away with the fetching entirely. You don’t even really need to know where you’ve stored your data, you just need to run a search, and you can pull down results from the stuff you have stored locally on your computer as well as the files floating up with the sun and moon. We are no longer limited by how much space is on our devices, how much storage we can buy. The only limiting factor is the infrastructure that connects all these devices together. Some people have asked me, almost accusingly, “Well what happens if the network goes down? What then, huh?”

If the entire United States suddenly experiences a simultaneous and catastrophic shutdown of all of its network infrastructure, we will have much bigger things to worry about than listening to our music or accessing the documents on our cloud folder. That’s akin to asking what would happen if all paper in the United States suddenly caught fire. I don’t want to hypothesize about the events or circumstances that would need to exist in order to facilitate such a terrible reality, but, assuming it was both spontaneous and total, I doubt anyone would be worried about their fourth grade diary.

Digression. Apologies.
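Back to the search idea for a moment. Here’s another toy sketch, this time of “search once, fetch from everywhere.” The cloud side is just a dictionary I made up to stand in for a hosted index; none of these names belong to any real product’s API.

```python
from pathlib import Path


def search_local(query, root="."):
    """Match the query against file names under a local folder."""
    return [str(p) for p in Path(root).rglob("*") if query.lower() in p.name.lower()]


def search_cloud(query, cloud_index):
    """Match the query against a stand-in 'cloud' index of {file name: location}."""
    return [loc for name, loc in cloud_index.items() if query.lower() in name.lower()]


def search_everywhere(query, cloud_index):
    """Ask once; where the data actually lives is somebody else's problem."""
    return search_local(query) + search_cloud(query, cloud_index)


if __name__ == "__main__":
    fake_cloud = {
        "vacation_photos.zip": "cloud://some-bucket/vacation_photos.zip",
        "fourth_grade_diary.txt": "cloud://some-bucket/fourth_grade_diary.txt",
    }
    for hit in search_everywhere("diary", fake_cloud):
        print(hit)
```

In a real setup, `search_cloud` would hit a hosted service’s API rather than a dictionary, but the shape of the thing is the same: one query, many backends, one merged list of results.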

In recent news, we’ve heard rumblings of Apple’s new iOS 5 being cloud-based, a total overhaul of the OS. I can’t even begin to fathom what that means. The OS seems just fine as it is, but the cloud is where it’s at these days, and that darn data center that’s been occupying so many of my thoughts and predictions seems like the perfect use of all those massive petaflops (or whatever they use to measure data centers of that magnitude). It all seems to be coming together now.

What we will start to see is more unity across Apple’s various OS products. Remember back in 2007, when Steve was asked what kind of OS the iPhone was running? Does anyone remember his response? Let’s recap, shall we?

Jobs admitted that Apple is a new player in the cell phone business, saying “We’re newcomers. People have forgotten more than we know about this.” Jobs noted that the operating system to run the iPhone — Mac OS X itself — has been in development for more than a decade (its roots lie in NeXT’s Nextstep operating system). Mossberg suggested that the iPhone doesn’t have the entire operating system on it, but Jobs protested.

“Yes it does. The entire OS is gigabytes, but it’s data. We don’t need desktop patterns, sound files. If you take out the data, the OS isn’t that huge. It’s got real OS X, real Safari, real desktop e-mail. And we can take Safari and put a different user interface on it, to work with the multitouch screen. And if you don’t own a browser, you can’t do that,” said Jobs.

This shift is not overnight, and it is not a new direction for Mac OS. Once Apple began work on the iPad, they started planning for this shift, possibly even before that. I seem to remember some folks discussing the origins of the iPhone, how it was actually rooted in an experimental side project that Steve Jobs somehow got a look at and recognized as brilliant, and that said side project was actually more akin to the iPad than the iPhone. At any rate, it looks to me as though Apple has been planning this shift for years, possibly even the better part of a decade. I believe that Apple designed iOS with unification in mind all along, seeing a desire to create a powerful OS for new mobile devices that hadn’t even been developed yet. It seems fairly obvious when you look at their last “Back to the Mac” event, and even more glaringly obvious when you see something like this coming out of Gizmodo.

Adobe demonstrated Photoshop for iPad yesterday. Not a sub-product like Photoshop Express, but the real Photoshop, with a new skin. Sure, it doesn’t have some of the advanced print and web publishing oriented features of the desktop behemoth. But it has everything you need, from layers compositing—including a 3D mode to show people how they work—to what appeared to be non-destructive adjustment layers, levels, color controls, and all the features I use every day in the desktop Photoshop. From the little we have seen, the application was fast and smooth.

I believe Apple has succeeded in ushering in a new age already; I can’t wait to see them throw the doors wide open to a future we’ve only dreamed of.


Close Encounters of the HiFi Kind

Bust out your b-boy skillz

One of the most exciting things about being a technophile is the reactions I get to experience from friends and family members regarding new technology and its place in their lives. For some members of my immediate family, technology is something to be shunned or, at best, regarded cautiously. The intersection between life and technology seldom occurs and, when it does, it is typically relegated to the living room TV or family computer for just a few moments.

The general distrust of technology is not unique to my family, however. As phones have increasingly taken on more characteristics of computers, many of my friends have opted for lower-tech, less-capable devices that offer the illusion of simplicity and security¹. There seems to be a general trend toward devices that are intentionally simpler or less advanced than the iPhones and Androids of today, and it seems to go hand-in-hand with a trend that was very prevalent in early-90s consumer electronics: blinky things.

This isn’t a joke or an attempt to poke fun at things that blink and glow; it’s an observation about the level of interaction that most people have with their technology, and the way that technology is designed today versus twenty years ago. Currently, almost everything we see in the mainstream consumer electronics space is geared towards user-friendliness and maximum functionality. We see device after device being introduced into the marketplace with the same glass face, the same general form factors, the same trend away from confusing buttons and towards devices that shift and morph as the user invokes different commands and demands different functionality from the device.

A close friend of mine was discussing his experiences in Japan in the early 1990s, when the country was leading the world in technological advancements in the consumer electronics space. His defining memory of the era was of blinking lights. He told me about his friends who would go shopping for electronics, looking expressly for the devices and gadgets that had the most blinky lights on them. Contrast that with the devices of today, which have few, if any, lights at all (save for the screen).

I believe that this shift in the visual appearance of devices also has a great deal to do with the intended usage of devices and the sea change we see occurring in mainstream media in general. In a recent discussion I had (referenced here as well), I argued that media consumption is moving away from all-you-can-eat cable packages with huge bills and towards selective, pay-for-what-you-watch models. This means that people have to go out and find what they want to watch in order to actually watch anything, which means that the consumption of media must be intentional. This is incredibly important when we look at how these new devices fit into our lives.

My father picked up an iPad recently (it was off, but plugged in and charging) and said something interesting. “How do you know it’s charging?” he asked. “There’s nothing blinking on here.” He’s right, of course, but that simple statement illustrates the difference between current-gen devices and last-gen technology. In previous generations of electronics, devices were ambient, non-interactive, and representative. The stereo represented music, the typewriter represented writing. These gadgets were single-function, specialized devices. They were large and expensive, and sometimes required some sort of technical training in order to learn how to operate them. The trend in recent years, however, has been away from single-function devices like stereos, typewriters, and cassette players. The shift has been decidedly towards convergence devices whose role in day-to-day activities is not clearly defined because it is so amorphous.

In the early 90s, a person could glance over at his or her stereo and be greeted by an array of lights and digits that portrayed all sorts of information which varied by model and type of stereo. This information, however, was specific to the gadget and usage case thereof. In that scenario, a person would have any number of different devices to display very specific pieces of information. Thermometers, clocks, typewriters, stereos, and more have all been replaced by multi-function devices that are becoming more and more ubiquitous, and some people feel threatened by that. Gone are the blinkenlights, gone is the specialized knowledge required to operate the machinery, gone is the sense of self that is then inevitably tied to the gadget. Instead, we see inherently mutable devices with no single purpose taking center stage. Suddenly all the gadgets that people have been hoarding over the years are rendered useless or unnecessary, and the owner of said devices suffers a bit of an identity crisis. Should we decide to keep the devices, we clutter our lives with junk. Should we decide to pitch them, we admit defeat to the tides of change.

This, however, is not as bad as it may sound. A shift away from clearly defined objects means that our sense of self becomes tied to ideas instead, tied to our interactions with technology, not the technology itself. We come to think more critically, more abstractly. What are we looking for? How do we find the information we seek? Is this information important? How should we process and/or internalize this information?

Ultimately, a shift in the type of technologies that our lives revolve around signals a shift in our self-awareness. When you think about it, another analogy comes to mind, one that I discussed recently vis-à-vis the transition Apple is making with their new data center.

Let’s get existential, shall we? Let’s get right into it. Here it is: our sense of self, our identity, by being disassociated from things, now lives…wait for it…”in the cloud.”

Bet you thought you’d never see the day, huh?

¹ One of the arguments I hear most often from my paranoid friends and family members is “What if you lose your phone?” or “What if someone steals your phone?” I actually faced that exact scenario recently and discovered some very interesting things about security and vulnerability that will undoubtedly raise some eyebrows. I’ll describe that story in detail soon.