So, Apple has unveiled iOS 7 to much discussion, hand-wringing, and cheers. There are lots of things that I feel that Apple is promising to do right with this release, and a number of things that we will, of course, need to see to believe.
One of the most prevalent activities iOS users engage in is photo sharing. Apple’s recently released “every day” video showcases the power of the iPhone as a camera. iPhone users know that their iPhone is probably one of the best cameras they’ve owned, and the millions of pictures snapped daily underscore that.
I was intrigued by the keynote’s handling of photos, namely the application of filters and the introduction of “Shared Photo Streams” in iCloud. To be honest, this is a feature that has been sorely missing from iOS and iCloud for a long time. The idea of a photo stream curated by a single person is fine if one of your friends happens to be a professional photographer, but most situations in which people snap photos are social, with multiple people wanting both to view and (most likely) to contribute to an album of the event. The trick lies in determining the canonical center of the stream. Who “owns” the photo stream? When people contribute to it, are they adding to a single user’s photos, or are they, in effect, “copying” that stream to their own photo collection and then adding to it, where it can then be seen by other parties? Or are the photos stored on Apple’s servers, where multiple parties “own” photos, can add to the stream, and can define who else “owns” them? “Own” is the operative word here, because ownership of the photos is tough to nail down.
I’ve always wondered how something like this would work, but it’s a problem that Apple absolutely has to tackle in order to stay relevant. As people add more and more photos of their lives to their devices, the storage of said photos becomes of paramount importance, followed closely by how people identify and integrate those photos into their identity. What has become increasingly obvious is that people don’t just craft an identity that is tied to a mobile device; they create a digital identity that the mobile device allows them to access. In order for these technologies to be relevant, they have to allow people to share photos and feel comfortable storing them in a way that is non-destructive and still allows them to reference past events with ease. It’s clear that Apple is now moving towards more meaningful photo sharing, but it remains to be seen whether they can take this idea and deliver the kind of interconnectivity that people implicitly ask for.
One of the things that Apple did not address, and something I’ve heard from people who have recently switched away from iOS as their primary mobile platform, is that iOS hamstrings users by not allowing them to easily pass data between apps. While I agree with some parts of this argument, I can also see Apple’s stance on inter-app data sharing. The scenario I often hear from heavy Android users is that things like taking notes, or saving PDFs from one app to another, are easier on Android. I don’t agree, because I do those very same things every single day on iOS and, ever since Apple started allowing custom URL schemes to pass data from one app to another, I’ve never had an issue with it. That said, I understand the argument: Android allows a freer exchange of data between apps via a more-or-less centralized file system.
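For anyone unfamiliar with the mechanism, it’s dead simple: the sending app packs its payload into the query string of a custom-scheme URL and asks the OS to open it, and whichever app registered that scheme unpacks the payload on the other side. Here’s a rough sketch of the URL mechanics in Python; the “drafts” scheme and the x-callback-url convention below are illustrative examples, not anything iOS itself mandates.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_callback_url(scheme, action, payload):
    """Pack a payload into a custom-scheme URL, the mechanism iOS apps
    use to hand small pieces of data to one another."""
    return f"{scheme}://x-callback-url/{action}?{urlencode(payload)}"

def read_callback_url(url):
    """Unpack the host, action, and payload on the receiving side."""
    parsed = urlparse(url)
    action = parsed.path.lstrip("/")
    payload = {key: values[0] for key, values in parse_qs(parsed.query).items()}
    return parsed.netloc, action, payload
```

On iOS itself the “open” step is the OS’s job, of course; the sketch just shows why a plain URL is enough to carry structured data between sandboxed apps.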
One thing that we saw in the WWDC keynote, however, is the introduction of a new tagging feature in OS X 10.9, which, I believe, is going to be Apple’s eventual answer to the file system. Instead of files being stored on the device, in a folder, they’ll be stored in iCloud, accessible as clusters of files related to a specific idea. This is the kind of intelligent organization that Palm’s webOS got right. Ultimately, people don’t really organize their data by app; they organize it by idea or topic, which is a far cry from having data “live” in an app.
I think the ultimate goal is to enable a user to cluster files together around a central theme or project they may be working on, and make that cluster available as an item in an app that keeps track of and syncs tags across platforms. Ostensibly, the user could open the app, see all of their tag groups, and (possibly using a Photos-style pinch-to-spread gesture) see all of the files in each group. Tapping on a file would open a list of corresponding apps capable of handling that type of file. Interestingly enough, this may also allow Apple to put a little more control in the user’s hands by letting them pick which app should be the default handler for each file type. In this manner, people don’t necessarily have to know where to look for their files; they need only open the “Tags” app, find the group they want to work on, and tap the file they want in that group. The OS then passes that file to whatever helper application the user has selected as the default, and they’re off to the races. A system like this wouldn’t satisfy every Android lover’s desire for a true file system, but Apple wouldn’t need it to: the average user would see this as a new feature, and customers on the fence may see it as a tipping point.
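To make that concrete, here’s a toy model of the hypothetical “Tags” app, rendered as Python. Nothing here is a real Apple API; it’s purely my speculation as code: tags map to clusters of files, and each file type carries a user-chosen default handler.

```python
from collections import defaultdict

class TagStore:
    """Toy model of the speculative "Tags" app: files are grouped by tag
    rather than by location, and each file type has a user-chosen
    default handler app. (Illustrative only, not a real Apple API.)"""
    def __init__(self):
        self.tags = defaultdict(set)   # tag name -> set of file names
        self.handlers = {}             # file extension -> default app

    def tag_file(self, filename, *tags):
        # A file can live in any number of tag groups at once.
        for tag in tags:
            self.tags[tag].add(filename)

    def set_default_handler(self, extension, app):
        # The user, not the OS, decides which app opens this file type.
        self.handlers[extension] = app

    def files_for(self, tag):
        return sorted(self.tags[tag])

    def open_file(self, filename):
        # Hand the file to the default handler, or ask if none is set.
        extension = filename.rsplit(".", 1)[-1]
        return self.handlers.get(extension, "ask-user")
```

The point of the sketch is the lookup order: the user thinks “project”, not “folder”, and the OS resolves “which app?” only at the last moment.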
This one is weird to me, but I like the way Apple has addressed it in the update, with the webOS-style “cards” interface heavily informing this part of the OS. The ability to see live updates of each app, or at least the current status of each open application as the user left it, is another way Apple brings parity with Android but does it better. I’ve seen Android’s task-switching waterfall, and it has always felt too sterile to be enjoyable to use, though I believe that’s more a fault of the OS design language as a whole than of that specific part of the interface.
There have been a not insignificant number of words spoken about the changes Apple has made to the look of the stock app icons in iOS 7. To be honest, I feel like this whole discussion is completely moot. App icons are incredibly important, to be sure – they are the way a user identifies your application in the sea of other apps on their phone – but they are somewhat arbitrary. They need to be well-designed, but there is a certain “minimum effective dose” that allows most people to identify the app they’re looking for and associate it with the task they’re looking to accomplish.
When Apple chose to redesign the stock app icons, the folks behind the new look exposed their design process as well as the grid-based layout system that informed the icon designs. Graphic designers commented on how Apple’s layout choices were half-baked or wrong, and other comments discussed how the color choices were catering to a younger generation, or to the aesthetic biases of cultures in new and emerging markets. Regardless of the reasons behind the choices, I can’t help but relegate all of this commentary to the trash heap for one simple reason: all of these comments are about a subjective experience. Of course Jony Ive wants to create an experience that is beautiful, familiar, approachable, friendly, and functional…but there are so many ways to accomplish this, and each piece of commentary comes from a single data point in the universe. Even assuming that all of these designers and amateur critics were able to ascertain some objective truth about these designs that was universally applicable, they all have differing opinions – some of them conflicting – so it must follow that they’re either all right or all wrong. I’m clearly in the latter camp. People are going to take a look at the icons and freak out because they’re different, and then everything will go back to normal, because, in truth, app icons only matter as pointers to something a user wants to accomplish. Once users draw new associations in their minds, they’ll be fine.
The Little Things
There are, of course, things that Apple hasn’t mentioned or brought up, most likely because they simply didn’t have enough time to do so, but I feel like I should mention them here for the sake of completeness.
While I know that not all of these things will be addressed (or even should be addressed) because of the focus Apple is trying to maintain with iOS, there are some things that venture into the grey area between the worlds of Mac OS and iOS. The first of these is the way the OS (and many apps) handle external keyboards. Safari, for instance, is able to handle a “Tab” keystroke, but does not recognize Command+L to put the cursor in the address bar, or Command+W to close an open tab. These aren’t necessarily “shortcomings” of the OS, but neither do they enhance the user experience. I’ve never thought to myself, “Boy, am I sure glad they left out those keystrokes! My life is so much easier!” With this type of behavior, I’m not sure if the omission is intentional or not. Apple is a very intentional company, but something like this feels like an oversight rather than a deliberate design decision.
Naturally, when people see new OS announcements from Apple, they assume that new hardware is going to follow closely behind. Something that I heard recently was that Apple’s new design, while beautiful on all current iOS devices, absolutely sings and looks right at home on the new devices that Apple has lined up for the fall. What these devices are is anyone’s guess, but I don’t think anyone would lose betting on a new iPhone. New iPad minis, iPads, and possibly iPod Touch units may also be in the works, but it isn’t completely clear yet exactly how these things will take shape, and what sort of changes we can expect. I love looking forward, but I don’t “do” rumors, so I’m not going to waste any time on speculating about what Apple is working on.
Ultimately, the new iOS version that Apple has introduced to the world looks great and, based on what I’ve heard, feels amazing. I have no desire to start ripping on an OS that’s in beta, nor do I have the desire to laud it. While it’s exciting to see a refresh of the world’s most important mobile OS, the proof will be in the pudding once it’s been finalized and released.
I’ve been reading a significant amount of backlash against the iPad mini event, focusing specifically on the lamentable lack of the “one more thing” moments of old. The typical banter has something to do with leaks coming from places that Apple has a hard time monitoring (China), even though it does everything it can to keep things hush-hush in a world in which money talks, and loudly. My main point of contention with this sentiment is that it implies that Apple can’t keep any more secrets about its new products.
I think that’s a silly idea.
Consider, for a moment, the scale of manufacturing that has to be brought to bear to produce Apple’s products at the volume we’re currently seeing. It is massive, and it requires the coordinated efforts of literally millions of people. From product inception through design, fabrication, and manufacture, those people take care of everything from sourcing raw materials to shipping the finished device to your doorstep. Truth be told, their job isn’t even over when you have the product in your hands; they still have to support it and continue developing new software. The human life energy devoted to the manufacture and support of a single iPad is immense.
Now consider the original iPhone, introduced in January of 2007 but released in June of the same year. That’s a six-month gap from introduction to purchase. In contrast, the iPhone 5 was revealed on September 12, went on pre-sale two days later, and was available at retail nine days after the introduction, on September 21st. The implication is that Apple’s manufacturing machine has to be at work for months before the device truly sees the light of day. In short, more human beings (see above) are aware the device exists, for more time, before the general public can purchase it.
With the original iPhone, Apple had the luxury of producing prototypes and testing them in relative seclusion. Apple no longer has that luxury because it works on some of the tightest schedules a person can conceive of.
Think about it: if Apple wanted to prototype a totally new product using in-house fabrication today, they could do it. They could show a working device to a room full of awed spectators who had no idea such a thing existed, but they wouldn’t be able to put it in your hands until months later, and that isn’t something Apple wants to do; they want you to make a decision and strike while the iron is hot.
So when you’re done watching the reveal of a new Apple product from another Apple device that’s barely a month old, remember that things weren’t always this way. You can’t manufacture your cake and be surprised by it, too.
While my posts haven’t been coming fast and furious lately, I’ve been watching the tech landscape recently and have seen some interesting shifts in where I believe a lot of things are heading.
Whither the iPod Nano?
This has been a perennial issue for me. When the iPhone 4S (the phone everyone expected to be the iPhone 5) was released, people did two things:
1. Thought that it was an inferior phone because the character “5” was not in the title
2. Forgot about everything else for a little while.
I, however, did not forget about the iPod nano. On the contrary, I began to think more about it, mostly from the perspective of “How can Apple make use of this new Bluetooth 4.0 thing?” While Bluetooth may not be very important to many people in the world, or may simply be synonymous with “headset”, Bluetooth makes possible a great many things that people basically don’t take advantage of. Case in point: a friend of mine just saw me typing this blog post on a wireless Bluetooth keyboard and said, “Wow, a wireless keyboard? I didn’t even know they made those.” Naturally, he’s a little behind the times (friar, vow of poverty), but that doesn’t stop the concept from being foreign to many people. An iPad-toting client of mine didn’t know that Bluetooth could be used to connect an iPad to a wireless keyboard, either (see “headset” equivocation above).
At any rate, that’s where we are: Bluetooth has effectively been relegated to another name for “headset”.
The iPod Nano has the opportunity to become something so far beyond what it is right now. It can be a gateway to the information stored on an iPhone, a supplement to an iPad (remote control, keyfob, microphone, etc.), and, possibly even more importantly, a front-end for Siri. Naturally, the iPod Nano’s screen isn’t designed for displaying large amounts of information, but that doesn’t preclude it from being an information portal.
When talk of an “iPad Mini” started swirling about, I immediately started thinking about Steve Jobs’s whole “people don’t like these ‘tweener’ sizes for tablets” statement. Whenever he said something like that, you knew a product wasn’t very far away. The issue for Apple wasn’t creating a product in that size, but rather timing their entry into that size category. One of the things I’ve noticed about a great deal of the other 7″ (ish) tablets on the market is that they lack anything truly compelling for me. I wouldn’t want a Kindle or Kindle Fire because its primary purpose is to read books purchased through Amazon.
The Nexus 7 was almost enough to get me on board until I used one. “Why would I spend any money on this?” I found myself asking over and over. The only truly compelling thing that I saw in the Nexus 7 was the NFC capability, but even that was a stretch. I need a product like that to be an iPad, but smaller, capable of all the things my iPad is capable of. I’m sure there are many people in the same boat.
I’ve been using the iPad to take notes, draw, read, and write since its introduction to the market. People tried to tell me that it wouldn’t be capable of much, and I would just quietly continue working, nodding as I continued to accomplish goals I set out for myself from the comfort of a tablet that I could use comfortably all day.
I knew there was one problem, though: it was too big (and not by much) for me to carry in my hoodie pocket. There were times that I only wanted to carry my tablet with me and nothing else, lack of charging equipment and extra tubes for my bike being reasonable things to forego in favor of a tablet that could slip easily into my back pocket. My iPad was literally a half inch too big, and I resigned myself to carrying the things I needed in addition to my wundertablet.
It was a hard life, I know, but I made it through. Thanks for your concern.
Now, however, I feel like Apple is going to make a lot of people happy by creating a device that is perfectly capable of an absolutely ludicrous number of things (vis-à-vis other tablets), yet still has an extremely portable form factor (as though the iPad weren’t portable enough).
Here’s the thing, though: Apple needed to time this whole thing. Releasing a 7″ (ish) tablet shortly after the iPad would have been great, and people would have really liked it, sure, but it wouldn’t have had the same impact that I believe it will have now. By releasing an “iPad Mini” now, Apple has allowed all the trash to sift itself out. Plenty of other companies have brought “me too” devices to market, and each has captured some small part of the iPad experience that people love, but left even more behind. Other companies thought that, if they could only get that 7″ tablet to market first, they would rule that space. The issue with that type of thinking is that it leads to sloppiness. Should this “iPad Mini” be released soon, it will be released with the entire weight of Apple behind it. It will have access to the iTunes Store; it will have access to the App Store. All the apps people have already purchased will be available on their device from day one. Their contacts and calendars will be synced through iCloud, and, while the same can be said for any Android tablet in that form factor, a person toting both Android and Apple devices would have to manage two devices with two different stores to shop from, two places to store their media, and no convenient way to slosh purchases around between devices.
With a device having a smaller screen size and profile, Apple will be making their signature store/device integration available in an even more portable form factor. The market will respond, and it will respond favorably.
Keep Your Friends Close
The last thing that I haven’t been hearing much about recently is NFC. Samsung released the Galaxy S III to a mediocre amount of fanfare, touting all of this NFC magic…but I have yet to see anything really interesting come of it. I love the idea of NFC, but, like the Nexus 7, I see no one using it. I don’t see stores with NFC tags on their doors, or restaurants with NFC tags on the tables to let patrons silence their phones and join the wifi with a single tap. None of this is real, and I have a sneaking suspicion it’s because Samsung has no idea what it’s doing. It puts products on the market that have checkboxes in all the right places, but no real-world application of any of the things those boxes relate to. Great job, Sammie, your phone has NFC! Does that honestly play a role in most people’s buying decisions? No, no it doesn’t. A friend of mine recently purchased a new GSIII and, when asked about the NFC feature, had no idea what I was talking about.
Truth be told, I’m not sure NFC will ever be a truly compelling technology, but I believe that, if it is, Apple will do it right. They’ll do it right because they’re really the only company that can make something as obscure as NFC relevant enough to matter to the world. When the world’s most valuable company throws its weight behind something, you’re pretty safe betting that people are going to pay attention.
All of this assumes a few things:
1. Apple is releasing a new iPod Nano.
2. Apple is releasing an “iPad Mini”.
3. The aforementioned products, in addition to the new iPhone, will contain NFC technology.
Those are a lot of assumptions, but they all seem to make sense. I’m not one to start making assumptions and thinking that I’ve got it all right, but, based on what I’ve been seeing and, perhaps even more importantly, what I haven’t been seeing, I believe that all of these things are very close to reality.
I haven’t even touched on the possible integration with a refresh of the Apple TV, but I think that all those things are around the corner, as well.
It’s gonna be a helluva September.
I’ve been reading a great deal in the past few months about all of the new Nexus phones that have come out recently: reviews by people who have used iPhones and tried to switch but failed, reviews by avid Android users who love them, and most people somewhere in between. I’ve heard arguments as to why certain operating systems have more of a future and certain phones are objectively better, and I really just stand somewhere in the middle, looking at all of this with a bit of a quizzical look on my face. I’m not trying to take sides here, but I believe Apple’s position in this market is much better for one main reason: NFC.
While it’s true that Google’s Nexus phones have had NFC built-in for some time, it has been clear that the feature has been little more than a bullet point in a presentation in order to build some buzz and give Android pundits something to hold over Apple’s head. I thought the inclusion of NFC in the first round of Nexus phones to be half-baked, mostly because I looked around at the places I visited every single day and saw literally nothing that used NFC in a way that was available for public interaction. The only usage for NFC that I’ve seen implemented anywhere was in the TouchPad. We all know how that went.
The key here is this.
If users wave an NFC-equipped iPhone at an NFC-equipped Mac (the two need to be in close proximity to interact), the Mac will load all of their applications, settings, and data. It will be as though they were sitting at their own machine at home or work. When the user leaves, and the NFC-equipped iPhone is out of range, the host machine returns to its previous state.
This is huge, and with Bluetooth coming back in a big (or perhaps little, as in low-power) way, this may be even more effective.
“The usual idea is that you would use NFC to set up the link between the two devices and then do an automatic hand over to a different protocol for doing the actual transfer of data – eg Bluetooth, Wi-Fi, TransferJet etc – and that’s what I imagine would be happening here,” she said.
That quote comes from analyst Sarah Clark of SJB Research.
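The handover pattern Clark describes is simple enough to sketch: the NFC tap only carries a tiny “which protocol, and where to find me” record, and the heavy lifting moves to the faster link. The following is a toy model in Python; the carrier names and record fields are illustrative, not a real NFC Forum API.

```python
def nfc_handover(initiator_carriers, target):
    """Model of NFC 'connection handover': the tap exchanges only a small
    record naming the carriers the target supports and how to reach it;
    the actual data then moves over the best shared carrier.
    (Illustrative sketch, not a real NFC Forum API.)"""
    # Step 1 (over NFC): read the target's handover record.
    record = {"carriers": target["carriers"], "address": target["address"]}
    # Step 2: pick the fastest mutually supported carrier for the bulk transfer.
    for carrier in ("wifi", "bluetooth"):  # ordered by preference
        if carrier in initiator_carriers and carrier in record["carriers"]:
            return {"carrier": carrier, "peer": record["address"]}
    return None  # no shared carrier; the tap alone can't set up a link
```

So when a phone that speaks both Wi-Fi and Bluetooth taps a box that only speaks Bluetooth, the tap itself negotiates the Bluetooth session; NFC never has to carry the data.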
This idea still has so much potential. As Steve Jobs said when he unveiled iCloud, Apple is demoting the computer to just another device, one that accesses your data on Apple’s servers in North Carolina somewhere. With the computer reduced to a gateway to your computing state, any device can theoretically access that saved state and let you resume your previous session wherever you are.
Let’s also look at another piece of the puzzle: Apple TV. We don’t know what Apple is planning for this theoretical Apple TV later this year, but let’s take a look at the Apple TV in its current incarnation, the tiny little black box that, quite frankly, is a little Wunderdevice.
For starters, you can now do this. I think that’s a pretty big deal. So the Apple TV, in its current state, can run iOS apps. It can access iCloud. It can play music and movies, and also allows a compatible device to mirror its display through a Wi-Fi connection. Let’s talk about that for a moment, as well.
If you haven’t already, check out Gameloft’s Modern Combat 3. It’s basically a Modern Warfare clone, but it has one killer feature: the ability to mirror the game on an Apple TV, which turns the iOS device you’re holding into a controller and puts the game on the big screen. I tried this with my iPad and was amazed by the results. This is truly something that game developers need to be looking at, but regular developers should be paying attention, too. Think about it: if a device that is mirroring its display output to an Apple TV can show different content on the device than on the TV, a word processing app could essentially turn the tablet into a wireless keyboard while the main workspace is displayed on the TV. The iPad or iPhone (or both!) could display a suite of controls or “function” keys, or function as pointing devices, or really anything you can think of. The idea of a “technology appliance” holds even more water here, since these devices can be used together to create an effect that one device on its own is technically capable of, but that works better when spread across several devices. Look at Keynote, for instance: with an iPad and an iPhone, a person can run an entire professional presentation with no bulky equipment and a minimum of technical prowess.
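The two-screen idea boils down to one routing decision, which is easy to model. Here’s a toy sketch in Python; it’s purely illustrative (a real app would use iOS’s external-display support rather than anything named like this): when the big screen is attached, the hand-held shows only the controls and the workspace jumps to the TV.

```python
class SecondScreenApp:
    """Toy model of the split-UI idea: with an external display attached,
    the hand-held device shows only the controls and the big screen gets
    the workspace; otherwise everything collapses onto the device.
    (Illustrative only; names are hypothetical.)"""
    def __init__(self):
        self.external_connected = False

    def device_view(self):
        # The device becomes a pure input surface once a TV is available.
        return "controls" if self.external_connected else "controls+workspace"

    def external_view(self):
        # The TV gets the workspace, and nothing else to clutter it.
        return "workspace" if self.external_connected else None
```

The interesting part is that nothing about the app changes when the TV appears; the same views simply get routed to different screens.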
In the context of the aforementioned connection to an Apple TV, this capability becomes even more important, since it allows the TV to function like a traditional “desktop”, but without the bulk of wiring, an extra device to draw power, and connections to set up. NFC handles the handshake, and the bulk of the transfer then takes place over Wi-Fi, Bluetooth, or some other protocol that is standard in Apple devices.
And this, my friends, is why Apple is positioned so much more powerfully in this market than any Android device manufacturer. While other manufacturers will essentially be playing catch up with all of this anyway, they will also have to contend with consumers who will be presented with each manufacturer’s take on this idea. Where Samsung may offer one type of connectivity, Asus may not, since it doesn’t have a TV of its own, but LG might. The consumer will stand in front of his TV and scratch his head wondering why his Motorola Xoom isn’t connecting to his Samsung TV, while his neighbor with an iPad and Apple TV is able to transition from room to room in his house without missing a beat.
The aftermath of this whole shebang would be the equivalent of a demolition derby, with all of these companies vying for the consumer dollar, blowing themselves to bits and waging a war of attrition while Apple’s devices still lead the way due to their simplicity and interoperability. The next thing that will happen is that these other manufacturers will start listing even more specs on their TVs, things like gigs of RAM, processor speeds, and core counts. The consumer will look at all this and once again scratch his or her head in confusion. The Apple TV will say something like “Best-in-Class Picture Quality, Siri, and [catchy Apple-fied name for NFC connections]. Say Hello to Apple TV.”
It’ll sell like gangbusters, and we’re all going to want one. Of course we will; it’s going to represent the future of computing. Can we even call it that anymore? Not really. It doesn’t feel right, because in this one (admittedly long-shot) future, “computing” isn’t a thing. You just pick what you want or need to do, and you use well-designed, simple hardware to do it.
Looking at the state of mobile technology today, it’s clear that the tablet form factor is the flavor of the week. A decade ago, however, the future of mobility looked a lot less like a clipboard and a lot more like a wristwatch.
For years, people were focused on wearing their computers. What is a thin, rectangular window to endless content now was a wrist-mounted portal to information then. The problem that designers always ended up getting stuck on, however, was the interface.
Designers tried to tackle this in a wearable computer concept, but the end result is still a mashup of the ideas of the last few decades and the fancy swirly graphics of today. The input method in the aforementioned concept (a swing-out keyboard? really?) is kludgy, at best, and the whole thing looks, well…huge. Would anyone actually wear that? No, no they wouldn’t, because that sort of thing is a fashion nightmare.
Then there’s this one. Ouch. Really? I mean, sure this is military technology, so we’re not looking for haute couture here, but…I mean…really? This just won’t do.
The problem is that the input method for all of these concepts still involves directly interacting with the device, touching buttons, or tapping the screen with a tiny stylus. All of these options are unacceptable when it comes to wearable computing. A person cannot have devices oozing out of every pore and orifice just to get at a Wikipedia article. What they need is a device that is intuitive and simple, something that “just works”.
This is where it gets difficult.
Apple has already developed a powerful, revolutionary computing interface powered by speech. They call it Siri, and I’m sure most people are familiar with it at this point. If not, the link should tell you everything you need to know. The bottom line is that it’s intuitive, and it allows a person to perform almost every task they usually need a computer for with little more than a functional set of vocal cords. This powerful computing interface, however, requires a persistent connection to the internet to send your voice to Siri and to receive Siri’s reply. Furthermore, access to Siri’s beautiful mind is, at the moment, limited solely to owners of Apple’s iPhone 4S.
Here’s where it gets interesting.
Apple designs hardware. They also design software and build empires on their intuitive, simple interfaces. Siri is about as simple as you can get, but not everyone has the ability to talk to Siri, and there may be those who simply don’t want to purchase a new phone for the privilege. What if, however, access to Siri could be granted by wearing a watch? Apple’s design team could surely design a beautiful watch. What if this watch was actually a computer, however? Or, perhaps not a computer, but rather a gateway to this magical, intuitive, almost infinitely powerful computer? Follow me, child, the path to this potential future is an interesting one.
Apple has been doing a lot of work behind the scenes, as it usually does. It has been chugging away at the internal components of the iPhone 4S, upgrading a little-loved part of the phone that may actually end up being the key to this whole new ecosystem: Bluetooth 4.0. The main thing about the Bluetooth 4.0 specification is that it allows for a very low-power state, which keeps certain communication avenues open while allowing others to close. This versatility means that a wrist-mounted “computer” doesn’t actually need to do any processing of its own; it only needs a connection to a device that can. Furthermore, while previous iPhone models may not sport the swanky new Bluetooth 4.0-compatible chips, they can still perform admirably over ordinary Bluetooth connections. This opens up the possibility for previous iPhone models to access Siri through a special piece of hardware that piggybacks on the iPhone’s existing data connection over Bluetooth, in much the same manner a headset would.
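The gateway idea is simple enough to sketch. In this toy Python model (every name here is hypothetical; this is not any real Apple API), the watch holds no smarts at all: it forwards the user’s query to whatever paired phone is in range, and the phone’s network connection does the actual work.

```python
class Phone:
    """Stand-in for the paired handset that owns the data connection."""
    def __init__(self, connected=True):
        self.connected = connected

    def relay(self, query):
        # Stand-in for: ship the audio to the assistant's servers,
        # wait, and hand the spoken reply back down the Bluetooth link.
        return f"reply to: {query}"

class SiriGateway:
    """Toy model of the wrist-worn relay: no local processing, just a
    low-power Bluetooth hop to a phone that can reach the network."""
    def __init__(self, phone):
        self.phone = phone

    def ask(self, query):
        if not self.phone.connected:
            return "no paired phone in range"
        return self.phone.relay(query)
```

The whole value of the watch, in this picture, is that it is the cheapest possible front door to a much bigger machine somewhere else.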
The end result is that a person will be able to talk to Siri, but do so without any sort of visual feedback. Ultimately, this is the sort of interaction that Apple is going for anyway. The device doesn’t need a screen (but may have one like the iPod Nano) because the interface is completely invisible. Much like the iPod Shuffle’s tiny form factor that can still communicate with the user, the new “wearable computer” does not have to be anything more than a gateway. The magic of the iPod Shuffle is that it feels like it’s so much bigger. The power of the new wearable computer is not that it is super fast and spec’ed to the gills. The power is that it feels like the world is no more than a question away.
Dick Tracy would be jealous.
Apple announced the iPhone 4S yesterday, much to the chagrin of the internet. Well…perhaps not to the chagrin of the internet, but everyone was expecting something called the “iPhone 5”, and Apple announced an absolutely amazing piece of kit they’re calling the “iPhone 4S”.
There was a lot of backlash, from what I understand, which seems…silly? I think that’s probably the best word to use right now. Silly.
See, the iPhone 5 was supposed to have all these amazing features, like a dual-core A5 processor, a higher-resolution camera, and image stabilization when shooting video. It was supposed to do all these amazing things with even better battery life, too. What a product! Yet, what we got was…the…wait let me check on this…we got the iPhone 4S…thing…with a dual-core A5 processor, higher-resolution camera, image stabilization, and something called “Siri”. Ok? But this silly piece of hardware is…well just look at it! It looks the same as the iPhone 4! And it’s called the iPhone “4S”. PEOPLE can you hear what I’m saying? It has a four in the name. Four is not five, my dearies. This is clearly a disappointment.
Let’s talk about what’s NOT in the iPhone 4S:
I think that about covers it.
Seriously, though, the next iPhone is revolutionary. Not because it looks like an iPod touch, but because it’s basically an iPad 2 in the palm of your hand.
I don’t think it’s time for a chassis redesign, and I’m glad they stuck with the iPhone 4’s slick glass and steel thing. There’s so much more in there, and all it will take for people to understand the beauty of the iPhone’s new guts is moseying down to their local crystal palace (aka Apple Store) and fiddling with the thing for five minutes, in which time they’ll realize that they can be twice as productive with this new pocket computer than they are with their current one. Game, set, and match.
This won’t work. I’m not saying that it never will, but I don’t believe this is something Apple’s frameworks even allow; Apple disallows it by design. The whole idea of a phone that does the “bidding” of another company, or simply becomes a platform for another company’s ideas, values, and way of thinking, is absurd. Google might allow it because they’d find a way to monetize it, but can you imagine that? I mean, actually take a minute to imagine a Google Ads-ridden Facebook interface shoehorned onto an Android phone running some forked version of the OS. Jesus, it hurts to even think about. What a horrible, mind-destroying user experience that would be.