This is the future we were promised, folks. Well…it’s getting there, at least. Check back in a couple years when this is beamed right into our eyeballs with connected contact lenses.
Mobile is the future. No one doubts that, and those that do are clearly riding their tiny rafts toward the inevitable plummet off the edge of the waterfall.
What is interesting is how these different mobile OS choices are defined (e.g. available apps, number of users, types of users, user engagement, developers, just to name a few), and what those definitions mean for the larger mobile landscape.
Many people argue for the benefits of iOS over Android, and vice versa, but I think the choice most people make to go with one operating system or the other isn’t driven by some core ethos or belief in how a mobile operating system should behave. It’s driven by far simpler forces: popular culture, how much money is in their wallet, and what feels right.
When it comes to the tablet space, iOS is the clear winner, having scooped up both the lion’s share of the market and of customer satisfaction. I find it somewhat painful to watch owners of most other tablet devices struggle with basic functionality; I’m left with the feeling that someone, somewhere has done them a disservice by pushing a device into their hands that did not fit their needs.
Where things start to blur, however, is when people start looking to devices outside of the mobile device market, things like connected TVs, appliances, and other gadgets. A person who isn’t fond of Apple can, ostensibly, purchase a Roku box for streaming content to their TV, but how well does that really integrate with a person’s home theater setup if they have iOS devices? How about Android? What about Linux? The trick here is that there are some devices that work well together, and some that don’t. Did you happen to buy one of those early Google TVs? How’s that working out for you? Sorry there aren’t more of them out there, turns out people didn’t like them very much. Sad.
The Apple TV is an iOS device, however, and I think it fills a key role in Apple’s connected living room idea. I’ve talked about this in past posts, as well, but something that many people don’t take into account is the fact that the Apple TV runs iOS, but in a form that isn’t immediately recognizable to most people.
Apple has created a chain of interconnected devices which, on their own, may seem unremarkable. Start linking them together, however, and they become far stronger and more capable than they were on their own.
I’ll end with a little story. I recently spent half an hour on the phone with a man, trying to help him compose and reply to an email on his new Android phone. I felt sorry for him. He had never owned a smartphone before, and he was having a very difficult time using the device. For whatever reason, data was not enabled on his phone, and he had to find the setting to turn it on before he could actually send the email. He was very frustrated, and it was clear that he wasn’t feeling confident. He had been told that this device was very “user friendly” and that it “just worked”, but his experience demonstrated otherwise. That same night, I had some friends over, many of whom are involved in some sort of music production or performance, or who simply have great taste in music. They were sharing their favorite tracks and videos on my TV, all from their phones, all without having to fiddle with a remote or web browser. They were laughing and talking, able to discuss and converse without needing to configure anything. They just tapped the AirPlay button and sent the media to the Apple TV. Zero configuration, zero setup.
From where I was sitting, it looked like magic.
So, Apple has unveiled iOS 7 to much discussion, hand-wringing, and cheers. There are lots of things that I feel that Apple is promising to do right with this release, and a number of things that we will, of course, need to see to believe.
One of the most prevalent activities that iOS users engage in is photo sharing. Apple’s recently released “every day” video showcases the power of the iPhone as a camera. iPhone users know that their iPhone is probably one of the best cameras they’ve ever owned, and the millions of pictures snapped daily underscore that.
I was intrigued by the keynote’s handling of photos, namely the application of filters and the introduction of “Shared Photo Streams” into iCloud. To be honest, I felt that this was a feature sorely missing from iOS and iCloud for a long time. The idea of a photo stream curated by a single person is fine if one of your friends happens to be a professional photographer or something like that, but most situations in which people are snapping photos tend to be social, with multiple people desiring to both view and (most likely) contribute to an album of the event. The trick lies in determining the canonical center of the stream. Who “owns” the photo stream? When people contribute to the photo stream, are they adding to a single user’s photos, or are they, in effect, “copying” that stream to their own photo collection and then adding to it, which can then be seen by other parties? Or, are the photos stored on Apple’s servers, where multiple parties “own” photos, can add to the stream, and then define who else “owns” those photos? “Own”, here, is an operative word, since ownership of the photos is tough to nail down, in this case.
I’ve always wondered how something like this would work, but it’s a problem that Apple absolutely has to tackle in order to stay relevant. As people add more and more photos of their lives to their devices, the storage of those photos becomes of paramount importance, followed closely by how people identify and integrate those photos into their identity. What has become increasingly obvious is that people don’t just craft an identity that is tied to a mobile device; they create a digital identity that the mobile device allows them to access. In order for these technologies to be relevant, they have to allow people to share photos and feel comfortable storing them in a way that is non-destructive and still allows them to reference past events with ease. It’s clear that Apple is now moving towards more meaningful photo sharing, but it remains to be seen whether they can take this idea and use it to deliver the type of interconnectivity that people implicitly ask for.
One of the things that Apple did not address, and something I’ve heard from people who have recently switched away from iOS as their primary mobile platform, is that iOS hamstrings users by not allowing them to easily pass data between apps. While I agree with some parts of this argument, I can see Apple’s stance on the idea of inter-app data sharing. The scenario I often hear from heavy Android users is that things like taking notes, or saving PDFs from one app to another, are easier on Android. I don’t agree, because I do those very same things every single day on iOS and, ever since Apple started allowing custom URL schemes to pass data from one app to another, I’ve never had an issue. Still, I think I understand the argument: Android allows a freer exchange of data between apps via a more-or-less centralized file system.
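For anyone who hasn’t played with URL schemes, the mechanism is easy to picture: the sending app packs its payload into a URL registered by the receiving app, and the OS routes it over. Here’s a rough Python sketch of that round trip – the `notesapp://` scheme and its parameters are entirely made up:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_share_url(scheme: str, action: str, **params) -> str:
    """Encode a payload into a custom-scheme URL, e.g. notesapp://create?text=..."""
    return f"{scheme}://{action}?{urlencode(params)}"

def handle_share_url(url: str) -> dict:
    """What the receiving app does with the URL the OS hands it."""
    parsed = urlparse(url)
    return {
        "action": parsed.netloc,  # the host part carries the action in this sketch
        "params": {k: v[0] for k, v in parse_qs(parsed.query).items()},
    }

url = build_share_url("notesapp", "create", text="Meeting notes", tag="work")
payload = handle_share_url(url)  # the "receiving app" unpacks it
```

On iOS the real thing goes through `openURL:` and a scheme declared in the receiving app’s Info.plist, but the encode/route/decode shape is the same.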
One thing that we did see in the WWDC keynote, however, is the introduction of a new tagging feature in OS X 10.9, which, I believe, is going to be Apple’s eventual answer to the file system. Instead of files being stored on the device, in a folder, they’ll be stored in iCloud, accessible as clusters of files related to a specific idea. This is finally the kind of intelligent organization that Palm’s WebOS got right. Ultimately, people don’t really organize their data by app; they organize it by idea or topic, which is a far cry from having data “live” in an app.
I think the ultimate goal is to enable a user to cluster files together around a central theme or project they may be working on, and to make that cluster available as an item in an app that keeps track of and syncs tags across platforms. Presumably, the user could open the app, see all of their tag groups, and (possibly using a Photos-style pinch-to-spread gesture) see all of the files in a given group. Tapping on a file would open a list of corresponding apps capable of handling that type of file. Interestingly enough, this may also allow Apple to put a little more control in the user’s hands by letting the user pick which app would be the default handler of that file type. In this manner, people don’t necessarily have to know where to look for their files; they need only open the “Tags” app, find the group they want, and tap the file they want to work on in that group. The OS then passes that file to whatever helper application the user has selected as the default, and they’re off to the races. A system like this wouldn’t satisfy every Android lover’s desire for a true file system, but Apple wouldn’t need to – the average user would see this as a new feature, and customers on the fence may see it as a tipping point.
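To make the idea concrete, here’s a toy model of what a hypothetical “Tags” app might keep track of – every name here is invented for illustration, not anything Apple has announced:

```python
from collections import defaultdict
from pathlib import PurePath

class TagStore:
    """Toy model of a hypothetical "Tags" app: files cluster under tags,
    and the user picks a default handler app per file type."""

    def __init__(self):
        self.groups = defaultdict(set)   # tag -> set of file names
        self.default_handler = {}        # file extension -> app name

    def tag(self, filename, *tags):
        for t in tags:
            self.groups[t].add(filename)

    def files_for(self, tag):
        return sorted(self.groups[tag])

    def set_default(self, extension, app):
        self.default_handler[extension] = app

    def open_file(self, filename):
        ext = PurePath(filename).suffix
        # No default chosen? The OS would show a picker instead.
        return self.default_handler.get(ext, "<ask user>")

store = TagStore()
store.tag("budget.pdf", "home-reno")
store.tag("kitchen.sketch", "home-reno")
store.set_default(".pdf", "PDFReader")
```

The point is that tag groups, not folders, become the unit of organization, and that little default-handler table is where the user gets the extra bit of control.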
This one is weird to me, but I like the way Apple has addressed it in the update, with the WebOS-style “cards” interface heavily informing this part of the OS. The ability to see live updates of each app, or at least the current status of each open application as the user left it, is another way Apple reaches parity with Android – and does it better. I’ve seen Android’s task-switching waterfall, and it has always felt too sterile to be enjoyable to use, though I believe that’s more a fault of the OS design language as a whole than of that specific part of the interface.
There have been a not insignificant number of words spoken about the changes Apple has made to the look of the stock app icons in iOS 7. To be honest, I feel like this whole discussion is completely moot. App icons are incredibly important, to be sure – they are the way a user identifies your application in the sea of other apps on their phone – but they are somewhat arbitrary. They need to be well-designed, but there is a certain “minimum effective dose” that allows most people to identify the app they’re looking for and associate it with the task they’re looking to accomplish.
When Apple made the choice to redesign the stock app icons, the folks behind Apple’s design choices exposed their design process as well as the grid-based layout system that informed the icon designs. There were comments made by graphic designers about how Apple’s layout choices were half-baked or wrong, and other comments that discussed how the color choices were catering to a younger generation, or to the aesthetic biases of cultures in new and emerging markets. Regardless of the reason behind the choices, I can’t help but relegate all of this commentary to the trash heap for the same simple reason: all of these comments are about a subjective experience. Of course Jony Ive wants to create an experience that is beautiful, familiar, approachable, friendly, and functional…but there are so many ways to accomplish this, and all of the commentary comes from a single data point in the universe. Even assuming that all of these designers and amateur critics were able to ascertain some objective truth about these designs that was universally applicable, they all have differing opinions – some of them conflicting – and it must thus follow that they’re either all right or all wrong. I’m clearly in the latter camp. People are going to take a look at the icons and freak out because they’re different, and then everything will go back to normal and everything will be fine because, in truth, app icons only matter as pointers to something a user wants to accomplish. Once users draw new associations in their minds, they’ll be fine.
The Little Things
There are, of course, things that Apple hasn’t mentioned or brought up, most likely because they simply didn’t have enough time to do so, but I feel like I should mention them here for the sake of completeness.
While I know that not all of these things will be addressed (or even should be addressed) because of the focus that Apple is trying to maintain with iOS, there are some things that venture into that grey area between the worlds of Mac OS and iOS. The first of these is the way the OS (and many apps) handle external keyboards. Safari, for instance, is able to handle a “Tab” keystroke, but does not recognize Command+L to put the cursor in the address bar, or Command+W to close an open tab. These aren’t necessarily “shortcomings” of the OS, but neither do they enhance the user experience. I’ve never thought to myself “Boy, am I sure glad they left out those keystrokes! My life is so much easier!” With this type of behavior, I’m not sure if the omission is intentional or not. Apple is a very intentional company, but something like this feels like an oversight as opposed to a deliberate design decision.
Naturally, when people see new OS announcements from Apple, they assume that new hardware is going to follow closely behind. Something that I heard recently was that Apple’s new design, while beautiful on all current iOS devices, absolutely sings and looks right at home on the new devices that Apple has lined up for the fall. What these devices are is anyone’s guess, but I don’t think anyone would lose betting on a new iPhone. New iPad minis, iPads, and possibly iPod Touch units may also be in the works, but it isn’t completely clear yet exactly how these things will take shape, and what sort of changes we can expect. I love looking forward, but I don’t “do” rumors, so I’m not going to waste any time on speculating about what Apple is working on.
Ultimately, the new iOS version that Apple has introduced to the world looks great and, based on what I’ve heard, feels amazing. I have no desire to start ripping on an OS that’s in beta, nor do I have the desire to laud it. While it’s exciting to see a refresh to the world’s most important mobile OS, the proof will be in the pudding once it’s been finalized and released.
There’s something strange that’s been happening in the world of tech as hotly anticipated products (primarily of the Apple variety) near launch: the world finds out about them long before they’re unveiled.
I think the entire phenomenon is so strange. When kids are young and looking forward to a hot new toy, they sometimes try to approximate its presence in their lives by creating an ersatz model to take the place of the real thing until they can actually touch, hold, and use the real thing. Strangely, this is happening with increasing frequency to the iPhone. The tech world is so hungry for anything iPhone that they will contract graphic designers to create 3D models of the new gadgets, and even go so far as to build full physical models.
The noise is deafening.
Post after post featuring blurry component photos hits the interwebs, and the tech press gobbles them up like bacon-stuffed donuts. Most folks don’t follow tech blogs, don’t really have a pressing desire to know the internal layout of new gadgets, and feel no need to seek this stuff out. They read what falls into their lap and, usually, are saner and better off because of it.
Then the device hits, and it elicits “yawns” from the peanut gallery because they’ve already seen it all. They make sweeping (often literally global) statements about the reception of the product, about the excitement it’s generated, etc. Their actions are, again, childish, just like the kid whose favorite team gets eliminated from the playoffs really early and starts claiming that no one likes [insert sport here] anymore, anyway.
Ultimately, they’re embarrassed.
Who wouldn’t be? Their phones are either knock-offs or faked. The real deal is just that, and consumers know the difference. Companies will try to illustrate how their products “stack up” against Apple’s iPad, or iPhone, or whatever, but it ultimately just makes them look, again, juvenile. I can make a checklist that makes me look like the best human being ever compared to random people on the street. I could create a checklist of the features of a raw, uncooked potato, and compare it to all the features of a slice of deep-dish Chicago pizza, but comparing those two things would make no sense. “Grows in the ground”, “Has eyes”, “Will sprout if placed in water” are all “features” of the potato that the pizza doesn’t have, but who really cares? I’ll take the pizza thankyouverymuch.
Which leads me back to my point. The leaked specs, the feature parity, the checklists, etc. are all meaningless in the face of true user experience and the whole package.
A guy I know had his iPhone run over by a car. It was absolutely destroyed, which was sad for him. He was contemplating purchasing a replacement, but decided to wait it out until his contract was up for renewal so he could purchase a new iPhone 4S. In the meantime, someone gave him a Motorola Droid RAZR (or whatever it’s called…these things have the weirdest names). He ditched the Droid in favor of an iPhone 3G. You read that right. He disliked the Droid user experience so much that he went with a molasses-slow (comparatively) phone, simply because the overall user experience was so superior. When you’re on the losing team, shouting really loudly and making a lot of noise is still fun, sure, but it doesn’t win you ball games. Just ask Cubs fans.
At any rate, it’s clear that people are jazzed about the iPhone 5, and all these “yawn” reactions are just the tech news equivalent of Cubs fans getting uppity. People will choose good design and a fluid, beautiful user experience over checklists and noise.
As they say, it doesn’t take a genius.
I’ve been reading a great deal in the past few months about all of the new Nexus phones that have been coming out recently: reviews by people who have used iPhones and tried to switch but failed, reviews by avid Android users who love them, and plenty from people somewhere in between. I’ve heard arguments as to why certain operating systems have more of a future and why certain phones are objectively better, and I just stand somewhere in the middle, looking at all of this with a bit of a quizzical look on my face. I’m not trying to take sides here, but I believe that Apple’s position in this market is much better for one main reason: NFC.
While it’s true that Google’s Nexus phones have had NFC built in for some time, it has been clear that the feature has been little more than a bullet point in a presentation, there to build some buzz and give Android pundits something to hold over Apple’s head. I thought the inclusion of NFC in the first round of Nexus phones was half-baked, mostly because I looked around at the places I visited every single day and saw literally nothing that used NFC in a way that was available for public interaction. The only usage for NFC that I’ve seen implemented anywhere was in the TouchPad. We all know how that went.
The key here is this.
If users wave an NFC-equipped iPhone at an NFC-equipped Mac (they need to be in close proximity to interact), the Mac will load all of their applications, settings, and data. It will be as though they are sitting at their own machine at home or work. When the user leaves, and the NFC-equipped iPhone is out of range, the host machine returns to its previous state.
This is huge, and with Bluetooth coming back in a big (or perhaps little, as in low-power) way, this may be even more effective.
“The usual idea is that you would use NFC to set up the link between the two devices and then do an automatic hand over to a different protocol for doing the actual transfer of data – eg Bluetooth, Wi-Fi, TransferJet etc – and that’s what I imagine would be happening here,” she said.
The above comes from analyst Sarah Clark of SJB Research.
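The handover she describes is simple to model: the tap itself carries only tiny pairing records, and the two devices then agree on the fastest transport they both support for the real data. Here’s a sketch – the carrier rankings are illustrative, not taken from any spec:

```python
def nfc_handover(initiator_transports, target_transports):
    """Model of an NFC connection handover: NFC sets up the link,
    then the bulk transfer moves to the best shared transport."""
    # Ordered fastest-first; this ordering is an assumption for illustration.
    preference = ["wifi", "bluetooth"]
    shared = [t for t in preference
              if t in initiator_transports and t in target_transports]
    if not shared:
        # No common fast carrier: nothing to hand over to.
        return {"carrier": "nfc", "note": "no common carrier; stay on NFC"}
    return {"carrier": shared[0], "note": "bulk transfer handed over"}

# A phone with Wi-Fi and Bluetooth taps a device that only speaks Bluetooth.
link = nfc_handover({"wifi", "bluetooth"}, {"bluetooth"})
```

The tap is just the handshake; everything heavy rides on whichever carrier wins.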
This idea still has so much potential. As Steve Jobs said when he unveiled iCloud, Apple is demoting the computer to just another device, one that accesses your data on Apple’s servers in North Carolina somewhere. With the computer being just a gateway to your computing state anywhere, any device can theoretically access that saved state and allow the user to resume their previous session wherever they are.
Let’s also look at another piece to the puzzle: Apple TV. We don’t know what Apple is planning for this theoretical Apple TV later this year, but let’s take a look at the Apple TV in its current incarnation, the tiny little black box that, quite frankly, is a little Wünderdevice.
For starters, you can now do this. I think that’s a pretty big deal. So the Apple TV, in its current state, can run iOS apps. It can access iCloud. It can play music and movies, and also allows a compatible device to mirror its display through a Wi-Fi connection. Let’s talk about that for a moment, as well.
If you haven’t already, check out Gameloft’s Modern Combat 3. It’s basically a Modern Warfare clone, but it has one killer feature: the ability to mirror the game on an Apple TV, which turns the iOS device you’re holding into a controller and puts the game on the big screen. I tried this on my iPad and was amazed with the results. This is truly something that game developers need to be looking at, but it’s something that regular developers need to be looking at, as well. Think about it: if a device that is mirroring its display to an Apple TV can show different content on the device than on the TV, a word processing app could essentially turn the tablet into a wireless keyboard while the main workspace is displayed on the TV. The iPad or iPhone (or both!) could display a suite of controls or “function” keys, or function as pointing devices, or really anything you can think of. The idea of a “technology appliance” holds even more water here, since these devices can be used synergistically to create an effect that a single device is technically capable of on its own, but that works better when spread across several devices. Look at Keynote, for instance. With an iPad and iPhone, a person can run an entire professional presentation with no bulky equipment and a minimum of technical prowess.
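The word-processor idea boils down to one shared document state with two renderings: controls on the handheld, the full workspace on the TV. A toy model of that split (all names invented for illustration):

```python
class MirroredEditor:
    """Toy second-screen model: the handheld accepts input and shows
    controls; the TV shows the workspace. Both views read one shared state."""

    def __init__(self):
        self.text = []

    # Input side: what the handheld sends over the wireless link.
    def key(self, ch):
        self.text.append(ch)

    # Two renderings of the same underlying state.
    def handheld_view(self):
        return {"mode": "keyboard+controls",
                "last_key": self.text[-1] if self.text else None}

    def tv_view(self):
        return {"mode": "workspace", "document": "".join(self.text)}

ed = MirroredEditor()
for ch in "Hi":
    ed.key(ch)
```

One state, two screens – the handheld never needs to draw the document at all.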
In the context of the aforementioned connection to an Apple TV, this capability becomes even more important, since it allows the TV to function like a traditional “desktop”, but without the bulk of wiring, an extra device to draw power, and connections to set up. NFC handles everything, and the bulk of the transfer can then take place over Wi-Fi, Bluetooth, or some other protocol that is standard in Apple devices.
And this, my friends, is why Apple is positioned so much more powerfully in this market than any Android device manufacturer. While other manufacturers will essentially be playing catch up with all of this anyway, they will also have to contend with consumers who will be presented with each manufacturer’s take on this idea. Where Samsung may offer one type of connectivity, Asus may not, since it doesn’t have a TV of its own, but LG might. The consumer will stand in front of his TV and scratch his head wondering why his Motorola Xoom isn’t connecting to his Samsung TV, while his neighbor with an iPad and Apple TV is able to transition from room to room in his house without missing a beat.
The aftermath of this whole shebang would be the equivalent of a Destruction Derby, with all of these companies vying for the consumer dollar, blowing themselves to bits and waging a war of attrition while Apple’s devices still lead the way due to their simplicity and interoperability. The next thing that will happen is that these other manufacturers will start listing even more specs on their TVs, things like gigabytes of RAM, processor speeds, and core counts. The consumer will look at all this and once again scratch his or her head in confusion. The Apple TV will say something like “Best-in-Class Picture Quality, Siri, and [catchy Apple-fied name for NFC connections]. Say Hello to Apple TV.”
It’ll sell like gangbusters, and we’re all going to want one. Of course we will, it’s going to represent the future of computing. Can we even call it that anymore? No, not really, it doesn’t feel right, and in this one (admittedly long-shot) future, “computing” isn’t a thing. You just pick what you want or need to do, and you use well-designed, simple hardware to do it.
Looking at the state of mobile technology today, it’s clear that the tablet form factor is the flavor of the week. A decade ago, however, the future of mobility looked a lot less like a clipboard and a lot more like a wristwatch.
For years, people were focused on wearing their computers. What is a thin, rectangular window to endless content now was a wrist-mounted portal to information then. The problem that designers always ended up getting stuck on, however, was the interface.
Designers tried to tackle this in a wearable computer concept, but the end result is still a mashup of the ideas of the last few decades and the fancy swirly graphics of today. The input method in the aforementioned concept (a swing-out keyboard? really?) is kludgy, at best, and the whole thing looks, well…huge. Would anyone actually wear that? No, no they wouldn’t because that sort of thing is a fashion nightmare.
Then there’s this one. Ouch. Really? I mean, sure this is military technology, so we’re not looking for haute couture here, but…I mean…really? This just won’t do.
The problem is that the input method for all of these concepts still involves directly interacting with the device, touching buttons, or tapping the screen with a tiny stylus. All of these options are unacceptable when it comes to wearable computing. A person cannot have devices oozing out of every pore and orifice just to get at a Wikipedia article. What they need is a device that is intuitive and simple, something that “just works”.
This is where it gets difficult.
Apple has already developed a powerful, revolutionary computing interface powered by speech. They call it Siri, and I’m sure that most people are familiar with it at this point. If not, the link should tell you everything you need to know. The bottom line is that it’s intuitive, and allows a person to perform almost every task they usually need a computer for with little more than a functional set of vocal cords. This powerful computing interface, however, requires a persistent connection to the internet to be able to send your voice to Siri and to receive Siri’s reply. Furthermore, access to Siri’s beautiful mind is limited solely to owners of Apple’s iPhone 4S, at the moment.
Here’s where it gets interesting.
Apple designs hardware. They also design software and build empires on their intuitive, simple interfaces. Siri is about as simple as you can get, but not everyone has the ability to talk to Siri, and there may be those who simply don’t want to purchase a new phone for the privilege. What if, however, access to Siri could be granted by wearing a watch? Apple’s design team could surely design a beautiful watch. What if this watch was actually a computer, however? Or, perhaps not a computer, but rather a gateway to this magical, intuitive, almost infinitely powerful computer? Follow me, child, the path to this potential future is an interesting one.
Apple has been doing a lot of work behind the scenes, as it usually does. It’s been chugging away at the internal components of the iPhone 4S, upgrading a little-loved part of the phone that may actually end up being the key to this whole new ecosystem that Apple has developed: Bluetooth 4.0. The main thing about the new Bluetooth 4.0 specification is that it allows for a very low-power state, which keeps certain communication avenues open while allowing others to close. This versatility means that a wrist-mounted “computer” doesn’t actually need to do any processing of its own, but requires a connection to a device that can. Furthermore, while previous iPhone models may not sport the swanky new Bluetooth 4.0-compatible chips, they can still perform admirably with normal Bluetooth connections. This opens up the possibility for previous iPhone models to access Siri through a special piece of hardware that piggybacks off of the existing iPhone data connection through Bluetooth in much the same manner a headset would.
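The gateway idea is easy to sketch: the wearable does no processing of its own, it just hops a question over Bluetooth to the phone, which relays it over the data connection and hands the answer back. A toy model with a stubbed-out assistant (everything here – names, canned answers – is hypothetical):

```python
def assistant_stub(question: str) -> str:
    """Stand-in for the remote voice service; canned answers for the sketch."""
    answers = {"what time is it?": "It's 3 o'clock."}
    return answers.get(question.lower(), "Sorry, I didn't catch that.")

class Phone:
    """The paired handset: holds the data connection and relays queries."""
    def relay(self, question: str) -> str:
        return assistant_stub(question)   # in reality: over the cellular link

class Watch:
    """The wearable gateway: no local smarts, just a Bluetooth hop
    to the phone, much like a headset."""
    def __init__(self, phone: Phone):
        self.phone = phone

    def ask(self, spoken: str) -> str:
        return self.phone.relay(spoken)   # in reality: over Bluetooth

reply = Watch(Phone()).ask("What time is it?")
```

The watch in this sketch is pure pass-through, which is exactly why it could stay tiny and sip power.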
The end result is that a person will be able to talk to Siri, but do so without any sort of visual feedback. Ultimately, this is the sort of interaction that Apple is going for anyway. The device doesn’t need a screen (but may have one like the iPod Nano) because the interface is completely invisible. Much like the iPod Shuffle’s tiny form factor that can still communicate with the user, the new “wearable computer” does not have to be anything more than a gateway. The magic of the iPod Shuffle is that it feels like it’s so much bigger. The power of the new wearable computer is not that it is super fast and spec’ed to the gills. The power is that it feels like the world is no more than a question away.
Dick Tracy would be jealous.
Navigating the latter half of this year of my life has been treacherous. Let’s leave it at that.
Apple’s new iPhone 4S continues to sell well, while competition from other phone manufacturers remains steady. In recent news, the focus has shifted away from the iPad and the tablet space, back to the iPhone vis-à-vis the competition. Competitors need to put out a real alternative to Siri, which really won’t happen, because Siri is a thing, an entity unto itself, one that everyone has his or her own personal experience with. I just don’t think any voice interface will do. The experience has to be wholly natural, use no jargon or “commands”, and needs to integrate into the OS in a way that is basically ubiquitous. Good luck to everyone on designing that.
The iPad is also the only game in town when it comes to an authentic tablet experience. Yet, news surrounding the iPad and the tablet market has quieted of late. There was a tablet sporting nVidia’s new Tegra processor that was released and supposed to kill the iPad. I…maybe I missed that one? Doesn’t look like anyone’s favorite fruit-flavored tablet friend is in the crosshairs, so perhaps this new “Transformer Prime” is just waiting for Shia? Dunno.
Furthermore, other tablets that I’ve seen miss the point, entirely. Have you seen the Kindle Fire? Ouch. I was traveling at the time that it was released, but there was nothing about it that made me want to use it. I picked one up at a kiosk, hoping to walk away from the experience with my eyebrows still raised. Upon taking my seat at the gate, I found my eyebrows in their normal resting position. Mission failed, Amazon.
The problem is that all these companies are all playing catch-up, trying to create value in a market that is valued on features that Apple defines. It’s a tough market to be in, and it’s getting more crowded every day.
I remember when MP3 players were all the rage, and some of my friends got them. I was excited to get one too, but never more so than when the iPod came out. I could never find the money to plunk down on one of those beasts, but I yearned. I also wouldn’t accept imitations. My father, never one for paying full price, would buy lots of off-brand MP3 players and say “Look! Just like an iPod!” He was wrong, of course. They weren’t. My first iPod was an iPod Shuffle, which was, despite its unorthodox appearance, an iPod. Next came the iPod Nano, which my sister also received. We moved up, we enjoyed, and we haven’t looked back. I believe the same can be said of the experience many people are having or will have with the vast majority of tablets out there…they’re not the iPad. Nothing else is.