As the Apple news starts reaching a fever pitch, mostly surrounding the very imminent launch of the iPad 2 and perhaps some other things that Apple has up its sleeve, my attention is drawn to the unsung and curiously short-lived news regarding the MacBook Pro refresh and this new Thunderbolt port. I’ve been reading some other news throughout the day and it appears that there are some things that folks have missed so far, things that signal a great future for Apple and the personal computer industry as a whole.
One of the first of these important bits is, of course, Thunderbolt. Great technology, lots of bandwidth, high transfer speeds, etc. Generally a good thing. The article I linked to has a lot of great information regarding the specs and capabilities of this new transfer protocol. This, by itself, tells us that Apple’s current-gen displays will look gorgeous, that they’ll be able to run free while also co-existing nicely with other peripherals (hard drives, cameras, iOS devices). Good news, but what caught my eye was this little tidbit that came up a little while ago about Lion’s support for a desktop Retina Display.
But one particularly interesting under-the-hood change that we’ve learned about is an evolution of Mac OS X’s “resolution independence” features. Resolution independence has been a long-discussed feature that would eventually provide support for high-DPI (dots per inch) displays. While the beginnings of support appeared in Mac OS X Tiger (10.4) and carried through Mac OS X Snow Leopard (10.6), full support was never realized.
This is something that I was very interested in when it first rolled out, but never really saw the fruits of. With Thunderbolt (née Light Peak) technology, very high-resolution displays will become the norm. The incredible transfer speeds required to display all those juicy pixels are now present in Thunderbolt, and Apple has a way to get all those ports out there now, into the hands of the exact folks (photographers, filmmakers, journalists, designers, etc.) who would soak themselves with drool over a double-resolution display. Target audience, check.
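A quick back-of-envelope calculation shows why those transfer speeds matter. Using illustrative resolutions (the actual panels Apple might ship are my assumption, not anything announced) and the commonly cited figure of 10 Gbit/s per Thunderbolt channel, the raw pixel bandwidth of a double-resolution display sits right at the edge of what the port can carry:

```python
# Back-of-envelope: uncompressed display bandwidth vs. Thunderbolt's
# 10 Gbit/s per channel. Resolutions here are illustrative guesses.

def display_gbps(width, height, bits_per_pixel=24, refresh_hz=60):
    """Raw (uncompressed, ignoring blanking) bandwidth in Gbit/s."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

THUNDERBOLT_CHANNEL_GBPS = 10  # one of Thunderbolt's two channels

for name, w, h in [
    ("27-inch Cinema Display (2560x1440)", 2560, 1440),
    ("Double-res 15-inch (2880x1800)",     2880, 1800),
    ("Double-res 27-inch (5120x2880)",     5120, 2880),
]:
    gbps = display_gbps(w, h)
    verdict = "fits in" if gbps <= THUNDERBOLT_CHANNEL_GBPS else "exceeds"
    print(f"{name}: {gbps:.1f} Gbit/s ({verdict} one 10 Gbit/s channel)")
```

A double-resolution notebook panel squeaks under the 10 Gbit/s line (~7.5 Gbit/s); a double-resolution 27-inch desktop display would not fit in a single channel uncompressed, which is exactly why bandwidth keeps being the gating factor for these screens.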
Then there’s this. Didn’t really happen, did it? Apple, however, has a history of releasing things that they (or Mr. Jobs, more specifically) expressly deny. So now we’ve got super high-res screens on the fuzzy horizon (Just over there! Can you see them?) powered by a transfer technology that will ensure that people using them don’t go cross-eyed or have their retinas burned out by anything other than sweet, innumerable pixels. Suddenly, all those touchscreen iMacs that were never supposed to be look like they can be.
Let’s also talk about how Lion fits into this.
Apple’s new OS is a powerful statement for user-friendliness without the expense of power. Lion, being designed with all kinds of iOS conventions baked in, seems oddly reminiscent of a hardware/software duet that mysteriously disappeared right before the launch of the iPhone some years ago. The keyboard, developed by a company called “FingerWorks,” was a capacitive (if I remember correctly) keyboard that allowed for multiple fingers on the board simultaneously. I was going to buy one for my 12″ PowerBook G4, when suddenly the device was nowhere to be found. The company’s website stated that they had been acquired by Apple, and I started telling my friends to get ready for something huge.
I’m not sure where to find these videos anymore, but FingerWorks’ instructional videos on their pages look oddly like what I’m seeing in Apple’s own marketing material for Lion. While not expressly a touchscreen OS, Apple will undoubtedly start adjusting their future plans to be able to create computers that have that capability. Even though it isn’t what they’re designed for now, the older paradigm of keyboard percussion and mouse gymnastics will shift one day, and I’ll bet dollars to donuts that Apple wants to be at that bleeding edge.
While perhaps not quite where we’re headed, the idea of a beautiful, completely interactive table that syncs with the phone/camera/device that you lay on top of it (NFC, anyone?) and allows us to interact with our information naturally is a science fiction dream, and Apple’s vision is putting it within reach once again.
UPDATED: A couple folks asked about the title of the post. The title is actually a line from a poem written by a martial artist about Aikido, the next line is “Throw, like thunder.” Seeing as how the post is about Thunderbolt, I thought I’d add something related in there. Sorta esoteric, but fun.
Apple also made a small, but very meaningful change to their iOS app store, namely the shift to a button labeled “Install.”
While this may appear on the surface to be merely cosmetic, looking deeper reveals a lot of information in light of all the movement Apple has been making recently in building out the data center and rolling out the tall ladders for cloud (or pseudo-cloud) computing. AppleInsider discusses the physical processes that are beginning to facilitate this.
What we see here is a blurring of the lines between local and cloud storage. If a button is labeled “Install,” it implies that the app is close at hand, just a tap away in order to be in front of us and usable.
Consider the language Apple uses when downloading and installing apps from the App Store. While the app is being downloaded, the user sees “Loading…” below the app, creating the impression that the app is not being fetched from some far-away place, but that the app is being unwrapped, that it’s simply starting up for the first time. As the process continues, “Loading…” changes to “Installing…,” which further increases the similarity to a locally-stored app. Shortly thereafter, the app is ready, and the user can go to town.
Displaying “Install” in the app store, instead of the app’s price, puts the user at ease that they already own this piece of software, that Apple is keeping track and taking care of all of their software for them, and that they have their own personal software vault from which any app they own is accessible to them at any time.
Think about that change in juxtaposition to the old way of computing, when installing a program meant loading a physical disc into a tray and transferring the data onto a computer. Think about the programs that actually required that the disc stay in the tray. This is a distinct and marked shift away from that type of application and media, a shift toward user-friendliness, toward ease of use.
Once again, this is good technology. The computer gets out of the way, and we are able to engage our information more quickly, without a break in thought, without losing ourselves to the process. We are able to focus, explore, create. We are able to be more human.
Came across an interesting post on TUAW today:
Some advantages of the newly integrated suite of server administrative software include a guided setup process for configuring a Mac as a server; “local and remote administration – for users and groups, push notifications, file sharing, calendaring, mail, contacts, chat, Time Machine, VPN, web, and wiki services – all in one place”; “simple, profile-based setup and management for Mac OS X Lion, iPhone, iPad, and iPod touch devices” with Profile Manager; Wiki Server 3, designed to make it “even easier to collaborate, share, and exchange information”; and WebDAV services that give iPad users “the ability to [wirelessly] access, copy, and share documents on the server from applications such as Keynote, Numbers, and Pages.”
What we’re seeing is a paradigm shift in home computer usage. More and more people are shifting away from traditional desktop configurations for their everyday computing and adopting the iPad as their primary method of getting access to the information they want. This is as inevitable as it is surprising. Inevitable, because mobile computers have increasingly become the focal point of the technology world; surprising, because it happened so fast and so definitively. I need more than the fingers on my hands to count the number of people I know who use the iPad as their primary computer. As these devices become more powerful and ever more portable, that number will increase.
iPad sales have also been staggering, especially when compared to those of other manufacturers (HP, Samsung), and the device has captured huge percentages of the market (even markets that don’t really belong to it). Hence, people are starting to wonder whether it makes sense to own a computer at all if this sort of thing becomes the norm.
Unfortunately, the iPad still needs to sync to something, and that something is quickly changing into less of a computing device and more of a server. The fact that Lion (Mac OS 10.7) will essentially allow any Mac to function as a server is quite interesting, and I believe it hints at Apple’s plans under the surface.
Apple likes Mac OS, and believes that it will survive for a long, long time. I agree with this, but I believe that the Mac OS will shift subtly away from its current place as the OS that people see to the OS that works under the surface. It’s a powerful statement about the future roles of the “computer” and “user.” In Apple’s future, the “computer” should be invisible, providing a means for people to access what they need, when they need it. The “user” simply gets access to what he or she wants through one of the many pipelines that transfer his or her data.
This is a trend that I have been participating in for a while, through apps like Simplify (RIP) and now Audiogalaxy, LogMeIn, and Air Sharing. The whole idea is that my iPad serves as a window/portal to everything that I may need.
Introducing a “server” option to a standard install of Mac OS Lion is Apple telling the world that soon, the computer they have sitting in the den will grow wings and live in the cloud.
This is exactly what so many people are afraid of with subscriptions. Let’s pause for a moment and look at some other examples of where people thought Apple’s App Store was going to be some sort of godawful den of iniquity (granted, with the initial onslaught of fart apps, that was a distinct possibility).
We’ve got this example about the 99-cent apps that people are so afraid of. Articulate and accurate, if I do say so myself.
Then there’s some of this fear-mongering that hit the interwebs right after the opening of the app store.
Can we see a pattern here? Everyone panicked when apps were selling for $0.99, and now the dollar apps are the way to financial independence. Develop an amazing app, sell it for a dollar to a million people, and you’re set for the next few years. Sell it to five million, and you can buy a house. Subscriptions don’t need to be expensive, folks.
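The arithmetic behind that claim is simple enough to sketch. Assuming the App Store’s standard 70/30 split (Apple keeps 30%) and a $0.99 price, the developer’s take at the sales volumes mentioned above works out like this:

```python
# Rough math on "dollar app" economics, assuming the App Store's
# standard 70/30 revenue split and a $0.99 price point.

APP_PRICE = 0.99
DEVELOPER_SHARE = 0.70  # Apple keeps the other 30%

def developer_revenue(units_sold, price=APP_PRICE, share=DEVELOPER_SHARE):
    """Gross revenue to the developer after Apple's cut."""
    return units_sold * price * share

for units in (1_000_000, 5_000_000):
    print(f"{units:,} sales -> ${developer_revenue(units):,.0f} to the developer")
```

A million sales at a dollar nets roughly $693,000 after Apple’s cut; five million nets about $3.5 million. Not bad for a “cheap” app, and the same math applies to a cheap subscription.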
What will start to happen is a dramatic shift in how people actually start to read, how they get their media. The current model is, well, free. RSS dominated for a while, and Twitter started stealing some of RSS’ thunder. Some folks (like me) still like RSS, but I recognize that it’s not the only way of getting news out there. There’s a whole wide world of content that is waiting to be discovered and digested, and people who (up until now) had no way of feeling comfortable breaking into that world can suddenly have access to it in a very easily accessible manner.
Here’s the clincher. Ready?
Publishers are upset by their sudden restrictions. Just last month it was OK for all these subscription-based or -focused publishers to make all sorts of money off of their customers. There were no transportation fees, no raw material fees, none of that. Just what they paid their developers and writers, but they were already paying that. They had a low-overhead way of distributing their product to lots of people. There were, however, two problems with this model: a) the previous subscription process was cumbersome at best and user-hostile at worst; b) very few publications actually had subscriptions, and people were confused by them (Go to the site? Register? What happens when I’m on my iPad? Do I have to go through Safari?). Again, not optimal. Furthermore, there were (and are) lots of people who want to get paid for what they write, and carving out a place for oneself in the world of publishing demands a not-insignificant amount of research, hard work, and do-overs. The world of writing is a tough one to make it in, mostly because the signal-to-noise ratio is getting lower every day (as I wrote about in this post), and publishers want to make sure they’re paying for quality.
What if, however, it were easy to publish, build a subscriber base, and make a name for yourself? What if there was a system in place that exposed your work to millions of people who are already invested in a thriving digital ecosystem? People who are used to and demand curated, well-researched news sources? Perhaps people with a little more green in their wallets? Or perhaps people who are moving up in the world?
This model is not for the established and entrenched giants of publishing, who will attempt to nickel and dime their subscribers and who are too anachronistic to develop truly compelling and groundbreaking digital publications. This model is for the new media, for the folks who want to reach as many subscribers as they can with their good ideas; for the new media consumers, the folks who want those same good ideas but don’t want to load their minds down with ads about “weird belly fat tips” and think that maybe a dollar is a good price to pay for a month’s worth of strong-voiced columns. This is throwing open the gates to the publishing world and finally making it accessible to all the guys and gals with the good ideas but not enough time to eat or sleep and for SURE not look for an agent.
This is what disruption looks like.
So there’s this.
And then there’s this.
And you know what this tells me? Everyone is just upset that they didn’t think of it first. If you’re going against the rules, and then somebody comes in and says, “Hey man, you can’t do that anymore,” you should not respond with, “But we’ve been doing this all along.” You are only going to look like a moron.
Came across this article on the (admittedly grotesque) Gizmodo this morning, and thought I’d chime in.
Hipstamatic generates an atmosphere, an aesthetic that ostensibly doesn’t exist in reality. Our vision only tends to resemble 1970s photography when our minds are lubricated with pharmaceutical enhancements, after all. Is it photojournalism when an image is deliberately changed to heighten or affect mood that we literally can’t see with our eyes for the sake of aesthetics and emotion? Is the definition of reality here merely confined to the collection of objects depicted in the photograph?
Staring at the photo in question, “A Grunt’s Life,” I can see how the photographer—the person who was there, documenting a moment in time—can reasonably argue that his Hipstamatic print more accurately depicts the feeling of what it was like to be there than if he had simply taken a conventional, straightforward photograph. A photo that, from a certain point of view, is perhaps more truthful.
Here’s the deal: as technology advances, what once required a highly developed and specialized skill set will eventually have assistive software/hardware developed for it that mimics and/or replaces the majority of the skills in that set. Shooting, processing, developing, and printing a photo like this now would take a lot of knowledge and access to resources that most people are unaware even exist. Hipstamatic removes most of these obstacles and enables “average” people without these skills to create in a manner similar to a person with said skills.
I have this discussion with people all the time. There’s a sort of nostalgia that creeps around whenever technology starts to change the way people create. When writing and publishing something was a long, difficult, and laborious process, only people who were willing to invest a great deal of time into that process had their work published. The same goes for photography, painting, filmmaking…basically almost any type of creation had a “price of entry,” if you will. A long time ago, anything that was created was vetted to make sure that it was something that was worth creating.
As I said above, the evolution and widespread adoption of previously all-but-unattainable (due to cost prohibition, licensing, etc.) technology by mainstream culture has placed tools for creation in the hands of folks who do not possess the highly specialized and developed skill set that their “artist” counterparts possess (artist being a term to denote anyone who has devoted a significant amount of time to the development and refinement of a set of skills). Despite this discrepancy in investments of time and energy, there are a not insignificant number of people who have a high degree of innate artistic talent and are able to create a “product” that is similar to the “product” created by the “artists.”
This is usually where all hell breaks loose. Lots of folks decry the use of these new technologies as “cheating,” in a way. “If anyone can do it, it isn’t art anymore!” they cry, “and they have no training!” The death knell of photography has been sounded!
I’ve had this exact same discussion on the topic of writing and the impact that the internet has had on “good” writing. Many people are of the opinion that people are “getting dumber” or that our literature is “in decline,” when the reality is that there is simply more of everything. There are more people taking pictures, writing, making movies, and creating than ever before. The fact that a phone can take a picture today that looks better than pictures looked thirty years ago is just a testament to the progressive iteration that takes place in technology. Nobody ever hanged a photographer because he or she didn’t know how to build a camera.
What we’re getting at is that the creation of anything is getting easier, and more people are doing it than ever before. That, ladies and gentlemen, is a wonderful, beautiful thing. People get their ideas out there. I’d like to say that all those ideas are gems of knowledge and insight, but not all of them are, and that’s OK. What we have is a much larger body of knowledge to draw from, and the tools we’re using to pull data are evolving rapidly. Sure, the “overall” quality of the work is declining, but that’s only because there’s so much more out there. That says nothing about the unrelenting and constant creation of high-quality stuff. If more people have access to good technology, then more good stuff gets out into the world. That’s where tech is supposed to fit in: it’s supposed to remove barriers to that sort of engagement with the world, the kind that usually only comes with, again, those highly developed, highly specialized skill sets.
So, when I see something that says “OH NO HE USED HIPSTAMATIC” I usually put the earmuffs on. There’s simply no place for that anymore. If people say that the photography isn’t real because the app adds things to the frame that weren’t there, then you’re going to have to chase after all the filters, all the lensbabies, all the grease-smeared lenses that are out there. Those aren’t “real” in the same way that the virtual lenses in Hipstamatic aren’t real. Or are they? In the first scenario, someone is taking a physical object and changing the light before it hits the film; in the latter, a person is applying a modification to the image after it has already been taken, but the end result is basically the same.
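To make the “end result is basically the same” point concrete, here’s a minimal sketch of what a post-capture filter does: a per-pixel transform on RGB values, the digital analogue of a physical filter changing the light before it hits the film. (This is purely illustrative; Hipstamatic’s actual processing is my assumption to be far more involved, with vignettes, textures, and so on.)

```python
# A minimal per-pixel "faded 1970s" filter, applied after capture.
# Illustrative only -- not Hipstamatic's actual algorithm.

def warm_fade(pixel):
    """Boost reds, mute blues, and lift shadows for a faded, warm look."""
    r, g, b = pixel
    r = min(255, int(r * 1.10) + 15)  # warm up the reds
    g = min(255, int(g * 1.00) + 10)  # lift the midtones slightly
    b = min(255, int(b * 0.85) + 5)   # pull back the blues
    return (r, g, b)

def apply_filter(image, fn):
    """Apply a per-pixel transform to an image (a list of rows of RGB tuples)."""
    return [[fn(px) for px in row] for row in image]

# A single neutral-gray pixel comes out warmer and slightly lifted:
neutral_gray = [[(128, 128, 128)]]
print(apply_filter(neutral_gray, warm_fade))  # → [[(155, 138, 113)]]
```

Whether that transform happens in glass in front of the lens or in arithmetic after the shutter, the pixels that reach the viewer are shifted the same way.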
The most important line in the blockquote above is the one about feeling. In Tim O’Brien’s The Things They Carried, one of the main points of the novel is to illustrate the importance of the “story truth” vs. the “real” truth. Ultimately, it is how we experience things that is important, and how we convey that experience to others is critical.
As technology improves and our ability to convey our thoughts and feelings moves beyond having a specialized set of skills, we will find that the number of brilliant people will skyrocket. It may seem small, but Hipstamatic is just one of the first steps along that path.
When people speed down highways and sidestreets, they’re often banking on the kindness of the local law enforcement to look the other way, letting the folks driving five to ten miles-per-hour over the limit squeak by. People are thankful for that, in a way, but they also come to expect that sort of leniency everywhere. Small towns that rely on speed traps for revenue won’t be so forgiving of the lead-footed among us, and some folks get mad about that. They have no right, of course; they were speeding, but that doesn’t stop them from being upset.
Enter Apple, who made news once again this morning with their apparent “rejection” of Sony’s Reader app. People were all up in arms about this move earlier (prematurely, if you ask me), and only slightly changed their tone as more details began to emerge. Then, we were treated to this tasty morsel:
“We have not changed our developer terms or guidelines,” Apple spokesperson, Trudy Muller, told The Loop. “We are now requiring that if an app offers customers the ability to purchase books outside of the app, that the same option is also available to customers from within the app with in-app purchase.”
Now it’s getting tasty. Apple is now reeling it in. They created their “walled garden” and, despite all the protests and ballyhoo about the whole kit and caboodle being “closed,” people absolutely loved to play in it. Everyone wants in on a piece of the iOS pie; now that they’ve bitten, Apple is bringing them in to play by the rules that were so clearly laid down at the outset. Basically, this:
Except now it’s apparently choosing a different way to actually enforce those terms, which makes the report seem accurate after all.
So now we have an interesting predicament. Apple is effectively charging other companies for the content that these other companies’ customers have enjoyed on Apple’s hardware using Apple’s operating system, marketed through Apple’s App Store. Seems almost fair, doesn’t it? It seems as though Sony is more than a little upset about the whole thing, but I think John Gruber has a good take on it:
My guess is that Sony is getting hurt because they were late to the game. Amazon’s Kindle app precedes the existence of Apple’s in-app purchasing API. I thoroughly doubt Apple is going to pull the Kindle (or Nook) app from the App Store, but I’ll bet they’re already in discussions with Amazon (and Barnes & Noble) about how these apps need to change going forward. It’s easier to reject Sony’s app as a first step toward the application of new rules because Sony’s app is brand-new — Apple isn’t taking anything away from users that was previously available to them.
Sour grapes, indeed. But there’s more here, as usual. This could spark a price war in the ongoing struggle between Apple, Amazon, Barnes & Noble, and (to a lesser degree) Borders. If Amazon wants to keep its margins, it’s going to have to jack up the prices (still one of the aces that Amazon holds) or end up losing profits to Apple. If Amazon exercises this option, people will undoubtedly be upset because that competitive pricing that Kindle owners are so famous for flaunting will have disappeared. If Amazon does nothing and allows in-app purchasing at the current price points, Apple still gets a cut of each book sold and suddenly has a whole lot more money to throw at publishers to get their books into the iBookstore (if they even want to do that).
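The squeeze on Amazon is easy to see with some hypothetical numbers. Assuming (and these terms are my illustration, not reported figures) that the retailer pays the publisher 70% of an ebook’s list price and Apple takes a 30% cut of any in-app sale, the retailer’s margin on an unchanged price vanishes entirely:

```python
# Hypothetical ebook margin math. The 70% publisher share and 30%
# Apple cut are illustrative assumptions, not actual contract terms.

def retailer_margin(list_price, publisher_share=0.70, apple_cut=0.0):
    """What's left for the retailer after the publisher and Apple are paid."""
    to_publisher = list_price * publisher_share
    to_apple = list_price * apple_cut
    return list_price - to_publisher - to_apple

price = 9.99
print(f"Margin on a web sale:    ${retailer_margin(price):.2f}")
print(f"Margin on an in-app sale: ${retailer_margin(price, apple_cut=0.30):.2f}")
```

Under these assumed terms, a ~$3.00 margin on the web drops to roughly zero in-app, which is exactly why the choices are “raise prices” or “eat the loss.”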
Ultimately, there’s a lot of interesting stuff here hinting at Apple’s future posturing and at the continued role publishing will play in hardware sales and long-term sales strategy.