Embarrassed

There’s something strange that’s been happening in the world of tech as hotly anticipated products (primarily of the Apple variety) near launch: the world finds out about them long before they’re unveiled.

I find the entire phenomenon strange. When kids are young and looking forward to a hot new toy, they sometimes try to approximate its presence in their lives by creating an ersatz model to stand in until they can actually touch, hold, and use the real thing. Oddly enough, this is happening with increasing frequency with the iPhone. The tech world is so hungry for anything iPhone that outlets will contract graphic designers to create 3D renderings of the new gadgets, and even go so far as to build full physical mockups.

The noise is deafening.

Post after post featuring blurry component photos hits the interwebs, and the tech press gobbles them up like bacon-stuffed donuts. Most folks don’t follow tech blogs, don’t have a pressing desire to know the internal layout of new gadgets, and feel no need to seek this stuff out. They read what falls into their lap and, usually, are saner for it.

Then the device hits, and it elicits “yawns” from the peanut gallery because they’ve already seen it all. They make sweeping (often literally global) statements about the reception of the product, about the excitement it’s generated, etc. Their actions are, again, childish, just like the kid whose favorite team gets eliminated from the playoffs really early and starts claiming that no one likes [insert sport here] anymore, anyway.

Ultimately, they’re embarrassed.

Who wouldn’t be? Their phones are either knock-offs or fakes. The real deal is just that, and consumers know the difference. Companies will try to illustrate how their products “stack up” against Apple’s iPad, or iPhone, or whatever, but it ultimately just makes them look, again, juvenile. I can make a checklist that makes me look like the best human being ever compared to random people on the street. I could create a checklist of the features of a raw potato and compare it to all the features of a slice of deep-dish Chicago pizza, but comparing those two things would make no sense. “Grows in the ground”, “Has eyes”, “Will sprout if placed in water” are all “features” of the potato that the pizza doesn’t have, but who really cares? I’ll take the pizza, thankyouverymuch.

Which leads me back to my point. The leaked specs, the feature parity, the checklists, etc. are all meaningless in the face of true user experience and the whole package.

A guy I know had his iPhone run over by a car. It was absolutely destroyed, which was sad for him. He was contemplating purchasing a replacement, but decided to wait it out until his contract was up for renewal so he could purchase a new iPhone 4S. In the meantime, someone gave him a Motorola Droid RAZR (or whatever it’s called…these things have the weirdest names). He ditched the Droid in favor of an iPhone 3G. You read that right. He disliked the Droid user experience so much that he went with a molasses-slow (comparatively) phone, simply because the overall user experience was so superior. When you’re on the losing team, shouting really loudly and making a lot of noise is still fun, sure, but it doesn’t win you ball games. Just ask Cubs fans.

At any rate, it’s clear that people are jazzed about the iPhone 5, and all these “yawn” reactions are just the tech news equivalent of Cubs fans getting uppity. People will choose good design and a fluid, beautiful user experience over checklists and noise.

As they say, it doesn’t take a genius.


Singing in the Rain

When the MacBook Air came out last year with its super-sexy new design and blazing fast SSD, I knew I was in trouble. It’s hard for me to resist the siren call of a new Apple product, but it’s even harder when the thing looks and performs as well as that li’l guy. I was even looking to upgrade my Mac Mini, and saw that as the perfect opportunity to dive into something portable. Since that day, I’ve had to fight off the urge to buy one nearly every single day.

Then I realize that I have an amazing iPad 2, and the conversation with myself ends. I don’t need a laptop; I already have an incredible machine. Sure, there are shortcomings, and there are certain incompatibilities here and there that make things difficult and/or frustrating, but by and large the experience is incredible, and very freeing. I have something with me at all times that I can use for *gasp* serious work (almost every blog post I’ve ever written has been with the help of an iPad, and all of my grad school papers come from this tiny beast) as well as for having fun and playing games. Truth be told, this is the best computer I’ve ever owned, and the reason is baked into the OS.

What a glorious feeling!

A while back, I went to the Apple store to ask the friendly folks there some questions about the MacBook Air, to see if I should choose that over the Mac Mini. I came away with this realization: if you already have an iPad, skip the MacBook Air, and if you already have a MacBook Air, skip the iPad. They’re pretty close in form and function, anyway (despite one being a “laptop” and one being a tablet). The reason I say that comes down to the use case. People buy a MacBook Air because they need a computer that is:

  1. Portable
  2. Fast
  3. Long-lasting
  4. Simple
  5. Equipped with a full keyboard

The MacBook Air is that machine, among other things. So is the iPad, however, and I’ve found that the pseudo-multitasking of the iPad is far preferable to me when I’m working because I know that the apps won’t crash, won’t interfere with anything else, and won’t start to bog down. They’re lean, simple, and engage me physically, which is exactly what I need when I’m writing. The MacBook Air is essentially redundant…except that it runs the full MacOS, instead of iOS. This seems great, until you start trying to manage multiple media libraries, apps, save files, etc. Then it gets to be more of a pain to work with MacOS than with an iOS device. But wait…the new version of MacOS, Lion, looks and behaves a LOT like iOS, doesn’t it? I mean…Apple expressly talked about the similarities in their “Back to the Mac” event. So then there’s this:

Most people had dismissed that rumor due to the compatibility issues that would be introduced with such a transition. Another major issue is that while ARM processors are more power efficient, they presently offer significantly lower performance than their Intel counterparts.

Sure, an ARM-based A5 wouldn’t make sense running MacOS…but what about iOS? Let’s even blow it up a bit and look further down the road a year or two. Let’s focus on a time in the not-too-distant future when iOS and MacOS start to merge, when the distinctions between the various Apple OSs start to become blurry. Then, ARM chips would make sense. They sip power, and (currently) iOS sings on those chips. It’s built for exactly that type of chipset. The two work in perfect synergy, and you can bet that Apple is spending a lot of time making sure that, when it’s time to make that jump, they’ve gotten the whole machine tuned and tweaked so the transition is beautiful. If you look at it that way, it makes a whole lot more sense to be using ARM-based chips for your supermodel MacBook Air, while the MacBook Pros would still run Intel chips due to their more “Pro” nature. I’m willing to bet dollars to donuts that most people are going to start shifting away from MacOS “Classic” and will absolutely love the new look and feel of Lion. Who knows, maybe the MacOS “Classic” look and feel will persist, while everything else will run some new version of iOS that is fully scalable across any hardware, much like HP is planning to do with their new version of WebOS.

There’s also this little nugget:

Although not mentioned in the most recent rumor, one of the largest features may be over-the-air updates that would finally make iOS independent of a computer for all but backup and local media syncing.

So…like a “real” computer? Can you see it? Can you see how the walls are disintegrating? The distinction between a “mobile” OS and a “desktop” OS is not as clear now, and I think the lines will continue to blur.

And this, too:

Talk of Apple using Nuance voice commands in iOS was already supported recently by code mentions in Lion. Most also presume that Apple’s cloud music service may play an integral role in the new mobile software.

So we can infer here that iOS and Lion are very closely related (it doesn’t take a rocket scientist to figure that one out; Apple said so), but the fact that they share code is telling of Apple’s long-term strategy, and of the strategies of several major players out there (Google, Microsoft, natch).

The jump between what we see in our hands and on our laps and desks today and what we will be seeing over the next few years will be immense, and it will change what every single person recognizes as a computer.

Mind the gap.


Close Encounters of the HiFi Kind

Bust out your b-boy skillz

One of the most exciting things about being a technophile is the reactions I get to see from friends and family members regarding new technology and its place in their lives. For some members of my immediate family, technology is something to be shunned or, at best, regarded cautiously. The intersection between life and technology seldom occurs and, when it does, it is typically relegated to the living room TV or family computer for just a few moments.

The general distrust of technology is not unique to my family, however. As phones have increasingly taken on more characteristics of computers, many of my friends have opted for lower-tech, less-capable devices that offer the illusion of simplicity and security¹. There seems to be a broader trend, too, towards devices that are intentionally simpler or less advanced than the iPhones and Androids of today. This seems to go hand-in-hand with a trend that was very prevalent in early-90s consumer electronics: blinky things.

This isn’t a joke or an attempt to poke fun at things that blink and glow; it’s an observation about the level of interaction that most people have with their technology, and the way technology is designed today versus twenty years ago. Currently, almost everything we see in the mainstream consumer electronics space is geared towards user-friendliness and maximum functionality. We see device after device being introduced into the marketplace with the same glass face, the same general form factors, the same trend away from confusing buttons and towards devices that shift and morph as the user invokes different commands and demands different functionality from the device.

A close friend of mine was telling me about his experiences in Japan in the early 1990s, when Japan was leading the world in technological advancements in the consumer electronics space. His defining memory of the era was of blinking lights. He told me about his friends who would go shopping for electronics, looking expressly for the devices and gadgets that had the most blinky lights on them. Contrast that with the devices of today, which have few, if any, lights at all (save for the screen).

I believe that this shift in the visual appearance of devices also has a great deal to do with the intended usage of devices and the sea change we see occurring in mainstream media in general. In a recent discussion I had (referenced here as well), I argued that media consumption is moving away from all-you-can-eat cable packages with huge bills and towards selective, pay-for-what-you-watch models. This means that people have to go out and find what they want to watch in order to actually watch anything, which means that the consumption of media must be intentional. This is incredibly important when we look at how these new devices fit into our lives.

My father picked up an iPad recently (it was off, but plugged in and charging) and said something interesting. “How do you know it’s charging?” he asked. “There’s nothing blinking on here.” He’s right, of course, but that simple statement illustrates the difference between current-gen devices and last-gen technology. In previous generations of electronics, devices were ambient, non-interactive, and representative. The stereo represented music, the typewriter represented writing. These gadgets were single-function, specialized devices. They were large and expensive, and sometimes required some sort of technical training in order to learn how to operate them. The trend in recent years, however, has been away from single-function devices like stereos, typewriters, and cassette players. The shift has been decidedly towards convergence devices whose role in day-to-day activities is not clearly defined because it is so amorphous.

In the early 90s, a person could glance over at his or her stereo and be greeted by an array of lights and digits that portrayed all sorts of information which varied by model and type of stereo. This information, however, was specific to the gadget and usage case thereof. In that scenario, a person would have any number of different devices to display very specific pieces of information. Thermometers, clocks, typewriters, stereos, and more have all been replaced by multi-function devices that are becoming more and more ubiquitous, and some people feel threatened by that. Gone are the blinkenlights, gone is the specialized knowledge required to operate the machinery, gone is the sense of self that is then inevitably tied to the gadget. Instead, we see inherently mutable devices with no single purpose taking center stage. Suddenly all the gadgets that people have been hoarding over the years are rendered useless or unnecessary, and the owner of said devices suffers a bit of an identity crisis. Should we decide to keep the devices, we clutter our lives with junk. Should we decide to pitch them, we admit defeat to the tides of change.

This, however, is not as bad as it may sound. A shift away from clearly defined objects means that our sense of self becomes tied to ideas instead, tied to our interactions with technology, not the technology itself. We come to think more critically, more abstractly. What are we looking for? How do we find the information we seek? Is this information important? How should we process and/or internalize this information?

Ultimately, a shift in the type of technologies that our lives revolve around signals a shift in our self-awareness. When you think about it, another analogy comes to mind, one that I discussed recently vis-à-vis the transition Apple is making with their new data center.

Let’s get existential, shall we? Let’s get right into it. Here it is: our sense of self, our identity, by being disassociated from things, now lives…wait for it…”in the cloud.”

Bet you thought you’d never see the day, huh?

¹ One of the arguments I hear most often from my paranoid friends/family members is “What if you lose your phone?” or “What if someone steals your phone?” I actually faced that exact scenario recently and discovered some very interesting things about security and vulnerability that will undoubtedly raise some eyebrows. I’ll describe that story in detail soon.


The Living Room Takeover

A tiny black monolith of wonder.

A long time ago, in a convention hall not so very far away, Apple introduced a product intended to revolutionize the living room viewing experience. It was one of the neatest things that most people had seen happen to the TV in a while, and there were a lot of people who were impressed by what it was capable of. It was loaded with all sorts of storage (a lot for the time, at least), and offered a novel way to get your media from your computer onto your TV. The sad thing is, it didn’t really take off the way other Apple products did. People liked it well enough, and it sold decently, but it wasn’t the hot ticket item that people were scrambling to pick up. That honor generally belongs to the iPhone and, now, the iPad. It was a little too pricey for what it offered, and most people probably felt like the Apple TV was a sideline player.

Fast forward to September 2010, and we see a renewed focus in Apple’s efforts; their so-called “hobby” suddenly has a brand-new face, has lost a ton of weight, and can do basically the same stuff without all the baggage. More, actually. Some people still asked “Why?” but for $99, it was hard to argue against it. Those people (myself included) just went ahead and picked one up to find out what all the fuss was about.

I can tell you right off the bat that I love my Apple TV, but not for the reasons one might expect. I don’t love it because it makes watching movies really enjoyable (it does) or because my family can see all the new pictures I just imported from my camera on the TV, or because I can stream that awesome YouTube video I’m watching right to the TV seamlessly. All those things are great, sure, but what really got me excited is what the little black box represents.

Some folks have already jury-rigged a console experience into the iPad/iPhone/Apple TV. Even before that, however, before the 2nd generation Apple TV rolled out, there were reports that it would run some version of iOS. Ultimately, iOS under the hood really only exists in order to open the door to apps. With apps come developers, innovation (and, depending on the level of the APIs, usually some griping), and new software ecosystems. With iOS under the hood, we will eventually enjoy apps that talk to each other seamlessly, network invisibly, and build off of each other in synergy. That’s what got me excited.

John Gruber has a great take on the whole thing:

I think I see what Apple is trying to do with the App Store, and the potential upside for the company is tremendous. They’re carving out a new territory between the game consoles (tight control over content and experience) and computers (large number of titles, open to development from anyone). Think of the iPhone and iPad as app consoles. (Consider too, the possibility of an all-new iPhone OS-based Apple TV. TV apps! Using iPhones and iPads as controllers.)

So, basically what I just said.

The key here is that Apple would be competing against veritable giants in this space, companies that have years and years of experience creating behemoth machines that are designed for lifespans that fill the better part of a decade.  These consoles are powerful, multi-role devices that have also taken on increasing cultural significance as gaming moves more and more into mainstream culture.  Contrast that to Apple’s predictable and consistent release cycle, which, on the one hand, allows them to react quickly to shifts in the marketplace but, on the other hand, sometimes leaves customers feeling alienated.

While I tend to side more with the stability and development cycle that is characteristic of current-gen consoles, Apple’s move into this space may also spur more innovation and force the current trifecta (Sony, Nintendo, and Microsoft) to think of things that Apple hasn’t.  Sony’s current offerings (PS3, PSP) are great, but lack synergy.  If there’s anything that Apple can nail, it’s synergy, and those big three will have to work hard to integrate their home consoles with other services and devices if they want to offer the consumer some more value.  Developers’ successes in the phone space have translated smoothly from the mobile to the living room space (see Angry Birds and Dungeon Hunter), and Apple sees itself uniquely positioned to make use of that transition.

Think about it: if a developer crafts a successful, top-selling title for iOS, Apple wants to make sure that the player who wants to enjoy that same experience in their living room with three of their friends can do just that.  Apple doesn’t want that developer transitioning to another platform.  Apple doesn’t want people spending their money on other people’s hardware, either.  Why buy the PS4 or XBox 720, four controllers, and whatever other magic peripherals they have for the primary purpose of playing games when instead a person can purchase an Apple TV and iOS devices for the whole family, and be simultaneously purchasing a game console and input devices?  Let’s take it a step further.  Ever heard of OnLive?  Ever seen their game console?  Does that seem familiar to you?  OnLive’s servers stream games from the cloud to your TV.  You can play super high-quality games over a broadband connection.  Apple just built a mammoth data center, purportedly for iTunes and MobileMe.  Let’s think a little further, here.  Apple is also focused more on social now than they ever were, and it wouldn’t seem too far-fetched to use Apple’s newly-introduced Game Center to pull all their iOS users together into a platform not unlike PSN or XBox Live.  Add to that all the success that more casual titles have seen, and it seems elementary that Apple would take steps in this direction.

I don’t know what gaming in Apple’s ecosystem will look or feel like, but I have a strong suspicion that the war for the living room is just heating up.


Eyes, Like Lightning

Luxo's distant relative.

As the Apple news starts reaching a fever pitch, mostly surrounding the very imminent launch of the iPad 2 and perhaps some other things that Apple has up its sleeve, my attention is drawn to the unsung and curiously short-lived news regarding the MacBook Pro refresh and this new Thunderbolt port.  I’ve been reading some other news throughout the day and it appears that there are some things that folks have missed so far, things that signal a great future for Apple and the personal computer industry as a whole.

One of the first of these important bits is, of course, Thunderbolt.  Great technology, lots of bandwidth, high transfer speeds, etc.  Generally a good thing.  The article I linked to has a lot of great information regarding the specs and capabilities of this new transfer protocol.  This, by itself, tells us that Apple’s current-gen displays will look gorgeous, that they’ll be able to run free while also co-existing nicely with other peripherals (hard drives, cameras, iOS devices).  Good news, but what caught my eye was this little tidbit that came up a little while ago about Lion’s support for a desktop Retina Display.

But one particularly interesting under-the-hood change that we’ve learned about is an evolution of Mac OS X’s “resolution independence” features. Resolution independence has been a long talked about feature that would eventually provide support for high DPI (dots per inch) displays. While there has been the beginnings of support for it starting in Mac OS X Tiger (10.4) and into Mac OS X Snow Leopard (10.6), full support was never realized.

This is something that I was very interested in when it first rolled out, but never really saw the fruits of.  With Thunderbolt (née Light Peak) technology, very high-resolution displays will become the norm.  The incredible transfer speeds required to display all those juicy pixels are now present in Thunderbolt, and Apple has a way to get all those ports out there now, into the hands of the exact folks (photographers, filmmakers, journalists, designers, etc.) who would soak themselves with drool over a double-resolution display.  Target audience, check.
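
If “resolution independence” sounds abstract, here is roughly how the idea cashes out in code.  This is only an illustrative sketch in modern Swift/AppKit (nothing that shipped in the Lion era, and the specific numbers are hypothetical): interface geometry is measured in points, and each display reports a backing scale factor that says how many physical pixels those points map onto.

    import AppKit

    // Illustrative sketch of "resolution independence": UI geometry is in points,
    // and each screen reports a backing scale factor mapping points to pixels.
    if let screen = NSScreen.main {
        let scale = screen.backingScaleFactor        // e.g. 2.0 on a HiDPI display, 1.0 otherwise
        let pointSize = screen.frame.size            // screen size measured in points
        let pixelSize = CGSize(width: pointSize.width * scale,
                               height: pointSize.height * scale)
        // A 100-point square on a 2x display actually covers 200x200 physical pixels.
        print("Points: \(pointSize), pixels: \(pixelSize), scale factor: \(scale)")
    }

The point is that the interface stays the same physical size while the pixel count (and the bandwidth needed to push all those pixels) climbs with the scale factor, which is exactly where Thunderbolt comes in.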

Then there’s this.  Didn’t really happen, did it?  Apple, however, has a history of releasing things that they (or Mr. Jobs, more specifically) expressly deny.  So now we’ve got super high-res screens on the fuzzy horizon (Just over there!  Can you see them?) powered by a transfer technology that will ensure that people using them don’t go cross-eyed or have their retinas burned out by anything other than sweet, innumerable pixels.  Suddenly, all those touchscreen iMacs that were never supposed to exist look like they could.

Let’s also talk about how Lion fits into this.

Apple’s new OS is a powerful statement for user-friendliness without sacrificing power.  Lion, being designed with all kinds of iOS conventions baked in, seems oddly reminiscent of a hardware/software duet that mysteriously disappeared right before the launch of the iPhone some years ago.  The keyboard, developed by a company called “FingerWorks,” was capacitive (if I remember correctly) and allowed for multiple fingers on the board simultaneously.  I was going to buy one for my 12″ PowerBook G4, when suddenly the device was nowhere to be found.  The company’s website stated that they had been acquired by Apple, and I started telling my friends to get ready for something huge.

I’m not sure where to find these videos anymore, but FingerWorks’ instructional videos on their pages look oddly like what I’m seeing in Apple’s own marketing material for Lion.  While Lion isn’t expressly a touchscreen OS, Apple will undoubtedly start adjusting their future plans to be able to create computers that have that capability.  Even though that isn’t what today’s Macs are designed for, the older paradigm of keyboard percussion and mouse gymnastics will shift one day, and I’ll bet dollars to donuts that Apple wants to be at that bleeding edge.

While perhaps not quite where we’re headed, the idea of a beautiful, completely interactive table that syncs with the phone/camera/device that you lay on top of it (NFC, anyone?) and allows us to interact with our information naturally is a science fiction dream, and Apple’s vision is putting it within reach once again.

UPDATED: A couple folks asked about the title of the post.  The title is actually a line from a poem about Aikido written by a martial artist; the next line is “Throw, like thunder.”  Seeing as how the post is about Thunderbolt, I thought I’d add something related in there.  Sorta esoteric, but fun.