When the MacBook Air came out last year with its super-sexy new design and blazing fast SSD, I knew I was in trouble. It’s hard for me to resist the siren call of a new Apple product, but it’s even harder when the thing looks and performs as well as that li’l guy. I was even looking to upgrade my Mac Mini, and saw that as the perfect opportunity to dive into something portable. Since that day, I’ve had to fight off the urge to buy one nearly every single day.
Then I realize that I have an amazing iPad 2, and the conversation with myself ends. I don’t need a laptop; I already have an incredible machine. Sure, there are shortcomings, and there are certain incompatibilities here and there that make it difficult and/or frustrating, but by and large the experience is incredible, and very freeing. I have something with me at all times that I can use for *gasp* serious work (almost every blog post I’ve ever written has been with the help of an iPad, and all of my grad school papers come from this tiny beast) as well as for having fun and playing games. Truth be told, this is the best computer I’ve ever owned, and the reason is baked into the OS.
A while back, I went to the Apple store to ask the friendly folks there some questions about the MacBook Air, to see if I should choose that over the Mac Mini. I came away with this realization: if you already have an iPad, skip the MacBook Air, and if you already have a MacBook Air, skip the iPad. They’re pretty close in form and function, anyway (despite one being a “laptop” and one being a tablet). The reason I say that is the use case. People buy a MacBook Air because they need a computer that is:
- Equipped with a full keyboard
The MacBook Air is that machine, among other things. So is the iPad, however, and I’ve found that the pseudo-multitasking of the iPad is far preferable when I’m working because I know that the apps won’t crash, won’t interfere with anything else, and won’t start to bog down. They’re lean, simple, and engage me physically, which is exactly what I need when I’m writing. The MacBook Air is essentially redundant…except that it runs the full MacOS, instead of iOS. This seems great, until you start trying to manage multiple media libraries, apps, save files, etc. Then it gets to be more of a pain to work with MacOS than an iOS device. But wait…the new version of MacOS, Lion, looks and behaves a LOT like iOS, doesn’t it? I mean…Apple expressly talked about the similarities in their “Back to the Mac” event. So then there’s this:
Most people had dismissed that rumor due to the compatibility issues that would be introduced with such a transition. Another major issue is that while ARM processors are more power efficient, they presently offer significantly lower performance than their Intel counterparts.
Sure, an ARM-based A5 wouldn’t make sense running MacOS…but what about iOS? Let’s even blow it up a bit and look further down the road a year or two. Let’s focus on a time in the not-too-distant future when iOS and MacOS start to merge, when the distinctions between the various Apple OSs start to become blurry. Then, ARM chips would make sense. They sip power, and (currently) iOS sings on those chips. It’s built for exactly that type of chipset. The two work in perfect synergy, and you can bet that Apple is spending a lot of time making sure that, when it’s time to make that jump, they’ve gotten the whole machine tuned and tweaked so the transition is beautiful. If you look at it that way, it makes a whole lot more sense to be using ARM-based chips for your supermodel MacBook Air, while the MacBook Pros would still run Intel chips due to their more “Pro” nature. I’m willing to bet dollars to donuts that most people are going to start shifting away from MacOS “Classic” and will absolutely love the new look and feel of Lion. Who knows, maybe the Mac OS “Classic” look and feel will persist, while everything else will run some new version of iOS that is fully scalable across any hardware, much like HP is planning to do with their new version of WebOS.
There’s also this little nugget:
Although not mentioned in the most recent rumor, one of the largest features may be over-the-air updates that would finally make iOS independent of a computer for all but backup and local media syncing.
So…like a “real” computer? Can you see it? Can you see how the walls are disintegrating? The distinction between a “mobile” OS and a “desktop” OS is not as clear now, and I think the lines will continue to blur.
And this, too:
Talk of Apple using Nuance voice commands in iOS was already supported recently by code mentions in Lion. Most also presume that Apple’s cloud music service may play an integral role in the new mobile software.
So we can infer here that iOS and Lion are very closely related (doesn’t take a rocket scientist to figure that one out, Apple said so), but that they share code is telling of Apple’s long-term strategy, and the strategies of several major players out there (Google, Microsoft, natch).
The jump from what we see in our hands and on our laps and desks to what we will be seeing over the next few years will be immense, and it will change what every single person recognizes as a computer.
Mind the gap.
One of the most exciting things about being a technophile is the reactions I get to experience from friends and family members regarding new technology and its place in their lives. For some members of my immediate family, technology is something to be shunned or, at best, regarded cautiously. The intersection between life and technology seldom occurs and, when it does, the intersection is typically relegated to the living room TV or family computer for just a few moments.
The general distrust of technology is not unique to my family, however. As phones have increasingly taken on more characteristics of computers, many of my friends have opted for lower-tech, less-capable devices that offer the illusion of simplicity and security¹. This trend toward devices that are intentionally simpler or less advanced than the iPhones and Androids of today seems to go hand-in-hand with a trend that was very prevalent in the early 90s in consumer electronics: blinky things.
This isn’t a joke or an attempt to poke fun at things that blink and glow; it’s an observation about the level of interaction that most people have with their technology, and the way that technology is designed today vs. twenty years ago. Currently, almost everything we see in the mainstream consumer electronics space is being geared towards user-friendliness and maximum functionality. We see device after device being introduced into the marketplace with the same glass face, the same general form factors, the same trend away from confusing buttons and towards devices that shift and morph as the user invokes different commands and demands different functionality from the device.
A close friend of mine was discussing his experiences in Japan in the early 1990s, when the country was leading the world in technological advancements in the consumer electronics space. His defining memory of the era was of blinking lights. He told me about his friends who would go shopping for electronics, looking expressly for the devices and gadgets that had the most blinky lights on them. Contrast that with the devices of today, which have few, if any, lights at all (save for the screen).
I believe that this shift in the visual appearance of devices also has a great deal to do with the intended usage of devices and the sea change we see occurring in mainstream media in general. In a recent discussion I had (referenced here as well), I argued that media consumption is moving away from all-you-can-eat huge cable bills and more towards selective, pay-for-what-you-watch models. This means that people have to go out and find what they want to watch in order to actually watch anything, which means that the consumption of media must be intentional. This is incredibly important when we look at how these new devices fit into our lives.
My father picked up an iPad recently (it was off, but plugged in and charging) and said something interesting. “How do you know it’s charging?” he asked. “There’s nothing blinking on here.” He’s right, of course, but that simple statement illustrates the difference between current-gen devices and last-gen technology. In previous generations of electronics, devices were ambient, non-interactive, and representative. The stereo represented music, the typewriter represented writing. These gadgets were single-function, specialized devices. They were large and expensive, and sometimes required some sort of technical training in order to learn how to operate them. The trend in recent years, however, has been away from single-function devices like stereos, typewriters, and cassette players. The shift has been decidedly towards convergence devices whose role in day-to-day activities is not clearly defined because it is so amorphous.
In the early 90s, a person could glance over at his or her stereo and be greeted by an array of lights and digits that portrayed all sorts of information which varied by model and type of stereo. This information, however, was specific to the gadget and usage case thereof. In that scenario, a person would have any number of different devices to display very specific pieces of information. Thermometers, clocks, typewriters, stereos, and more have all been replaced by multi-function devices that are becoming more and more ubiquitous, and some people feel threatened by that. Gone are the blinkenlights, gone is the specialized knowledge required to operate the machinery, gone is the sense of self that is then inevitably tied to the gadget. Instead, we see inherently mutable devices with no single purpose taking center stage. Suddenly all the gadgets that people have been hoarding over the years are rendered useless or unnecessary, and the owner of said devices suffers a bit of an identity crisis. Should we decide to keep the devices, we clutter our lives with junk. Should we decide to pitch them, we admit defeat to the tides of change.
This, however, is not as bad as it may sound. A shift away from clearly defined objects means that our sense of self becomes tied to ideas instead, tied to our interactions with technology, not the technology itself. We come to think more critically, more abstractly. What are we looking for? How do we find the information we seek? Is this information important? How should we process and/or internalize this information?
Ultimately, a shift in the type of technologies that our lives revolve around signals a shift in our self-awareness. When you think about it, another analogy comes to mind, one that I discussed recently vis-à-vis the transition Apple is making with their new data center.
Let’s get existential, shall we? Let’s get right into it. Here it is: our sense of self, our identity, by being disassociated from things, now lives…wait for it…”in the cloud.”
Bet you thought you’d never see the day, huh?
¹ One of the most often-heard arguments I have heard from my paranoid friends/family members is “What if you lose your phone?” or “What if someone steals your phone?” I actually faced that exact scenario recently and discovered some very interesting things about security and vulnerability that will undoubtedly raise some eyebrows. I’ll describe that story in detail soon.
Came across an interesting post on TUAW today:
Some advantages of the newly integrated suite of server administrative software include a guided setup process for configuring a Mac as a server; “local and remote administration – for users and groups, push notifications, file sharing, calendaring, mail, contacts, chat, Time Machine, VPN, web, and wiki services – all in one place”; “simple, profile-based setup and management for Mac OS X Lion, iPhone, iPad, and iPod touch devices” with Profile Manager; Wiki Server 3, designed to make it “even easier to collaborate, share, and exchange information”; and WebDAV services that give iPad users “the ability to [wirelessly] access, copy, and share documents on the server from applications such as Keynote, Numbers, and Pages.”
What we’re seeing is a paradigm shift in home computer usage. More and more people are shifting away from traditional desktop configurations for their everyday computing and adopting the iPad as their primary means of getting access to the information they want. This is as inevitable as it is surprising. Inevitable, because mobile computers have increasingly become the focal point of the technology world; surprising, because it happened so fast and so definitively. I need more than the fingers on my hands to count the number of people I know who use the iPad as their primary computer. As these devices become more powerful and ever more portable, that number will increase.
iPad sales have also been staggering, especially when compared to other manufacturers (HP, Samsung), and the device has captured huge percentages of the market (even markets that don’t really belong to it). Hence, people are starting to wonder whether it makes sense to own a computer at all if this sort of thing becomes the norm.
Unfortunately, the iPad still needs to sync to something, and this something is quickly changing into less of a computing device and more of a server. The fact that Lion (Mac OS 10.7) will essentially allow any Mac owner to function as a server is quite interesting, and I believe it shows Apple’s future plans under the surface.
Apple likes Mac OS, and believes that it will survive for a long, long time. I agree with this, but I believe that the Mac OS will shift subtly away from its current place as the OS that people see to the OS that works under the surface. It’s a powerful statement about the future roles of the “computer” and “user.” In Apple’s future, the “computer” should be invisible, providing a means for people to access what they need, when they need it. The “user” simply gets access to what he or she wants through one of the many pipelines that transfer his or her data.
This is a trend that I have been participating in for a while, through apps like Simplify (RIP) and now Audiogalaxy, LogMeIn, and Air Sharing. The whole idea is that my iPad serves as a window/portal to everything that I may need.
Introducing a “server” option to a standard install of Mac OS Lion is Apple telling the world that soon, the computer they have sitting in the den will grow wings and live in the cloud.