Apple announced the iPhone 4S yesterday, much to the chagrin of the internet. Well…perhaps not to the chagrin of the internet, but everyone was expecting something called the “iPhone 5”, and Apple announced an absolutely amazing piece of kit they’re calling the “iPhone 4S”.
There was a lot of backlash, from what I understand, which seems…silly? I think that’s probably the best word to use right now. Silly.
See, the iPhone 5 was supposed to have all these amazing features, like a dual-core A5 processor, a higher-resolution camera, and image stabilization when shooting video. It was supposed to do all these amazing things with even better battery life, too. What a product! Yet, what we got was…the…wait let me check on this…we got the iPhone 4S…thing…with a dual-core A5 processor, higher-resolution camera, image stabilization, and something called “Siri”. Ok? But this silly piece of hardware is…well just look at it! It looks the same as the iPhone 4! And it’s called the iPhone “4S”. PEOPLE can you hear what I’m saying? It has a four in the name. Four is not five, my dearies. This is clearly a disappointment.
Let’s talk about what’s NOT in the iPhone 4S:
I think that about covers it.
Seriously, though, the next iPhone is revolutionary. Not because it looks like an iPod touch, but because it’s basically an iPad 2 in the palm of your hand.
I don’t think it’s time for a chassis redesign, and I’m glad they stuck with the iPhone 4’s slick glass and steel thing. There’s so much more in there, and all it will take for people to understand the beauty of the iPhone’s new guts is moseying down to their local crystal palace (aka Apple Store) and fiddling with the thing for five minutes, in which time they’ll realize that they can be twice as productive with this new pocket computer than they are with their current one. Game, set, and match.
Came across this article on the (admittedly grotesque) Gizmodo this morning, and thought I’d chime in.
Hipstamatic generates an atmosphere, an aesthetic that ostensibly doesn’t exist in reality. Our vision only tends to resemble 1970s photography when our minds are lubricated with pharmaceutical enhancements, after all. Is it photojournalism when an image is deliberately altered, for the sake of aesthetics and emotion, to convey a mood we literally can’t see with our own eyes? Is the definition of reality here merely confined to the collection of objects depicted in the photograph?
Staring at the photo in question, “A Grunt’s Life,” I can see how the photographer—the person who was there, documenting a moment in time—can reasonably argue that his Hipstamatic print more accurately depicts the feeling of what it was like to be there than if he had simply taken a conventional, straightforward photograph. A photo that, from a certain point of view, is perhaps more truthful.
Here’s the deal: as technology advances, what once required a highly developed and specialized skill set will eventually have assistive software/hardware developed for it that mimics and/or replaces the majority of the skills in that set. Shooting, processing, developing, and printing a photo like this now would take a lot of knowledge and access to resources that most people are unaware even exist. Hipstamatic removes most of these obstacles and enables “average” people without these skills to create in a manner similar to a person with said skills.
I have this discussion with people all the time. There’s a sort of nostalgia that creeps around whenever technology starts to change the way people create. When writing and publishing something was a long, difficult, and laborious process, only people who were willing to invest a great deal of time into that process had their work published. The same goes for photography, painting, filmmaking…basically almost any type of creation had a “price of entry,” if you will. A long time ago, anything that was created was vetted to make sure it was worth creating.
As I said above, the evolution and widespread adoption of previously all-but-unattainable (due to cost prohibition, licensing, etc.) technology by mainstream culture has placed tools for creation in the hands of folks who do not possess the highly specialized and developed skill set that their “artist” counterparts possess (artist being a term to denote anyone who has devoted a significant amount of time to the development and refinement of a set of skills). Despite this discrepancy in investments of time and energy, there are a not insignificant number of people who have a high degree of innate artistic talent and are able to create a “product” that is similar to the “product” created by the “artists.”
This is usually where all hell breaks loose. Lots of folks decry the use of these new technologies as “cheating,” in a way. “If anyone can do it, it isn’t art anymore!” they cry, “and they have no training!” The death knell of photography has been sounded!
I’ve had this exact same discussion on the topic of writing and the impact that the internet has had on “good” writing. Many people are of the opinion that people are “getting dumber” or that our literature is “in decline,” when the reality is there is simply more of everything. There are more people taking pictures, writing, making movies, and creating than ever before. The fact that a phone can take a picture today that looks better than pictures looked thirty years ago is just a testament to the progressive iteration that takes place in technology. Nobody ever hung a photographer because he or she didn’t know how to build a camera.
What we’re getting at is that the creation of anything is getting easier, and more people are doing it than ever before. That, ladies and gentlemen, is a wonderful, beautiful thing. People get their ideas out there. I’d like to say that all those ideas are gems of knowledge and insight, but not all of them are, and that’s OK. What we have is a much larger body of knowledge to draw from, and the tools we’re using to pull data are evolving rapidly. Sure, the “overall” quality of the work is declining, but that’s only because there’s so much more out there. That says nothing about the unrelenting, constant creation of high-quality stuff. If more people have access to good technology, then more good stuff gets out into the world. That’s where tech is supposed to fit in: it’s supposed to remove barriers to the sort of engagement with the world that usually only comes with, again, those highly developed, highly specialized skill sets.
So, when I see something that says “OH NO HE USED HIPSTAMATIC,” I usually put the earmuffs on. There’s simply no place for that anymore. If you say the photography isn’t real because the app adds things to the frame that weren’t there, then you’re going to have to chase after all the filters, all the lensbabies, all the grease-smeared lenses out there. Those aren’t “real” in the same way that the virtual lenses in Hipstamatic aren’t real. Or are they? In the first scenario, someone takes a physical object and changes the light before it hits the film; in the latter, a person applies a modification to the image after it has already been captured. The end result is basically the same.
The most important line in the blockquote above is the one about feeling. In Tim O’Brien’s The Things They Carried, one of the main points of the novel is to illustrate the importance of the “story truth” vs. the “real” truth. Ultimately, it is how we experience things that is important, and how we convey that experience to others is critical.
As technology improves and our ability to convey our thoughts and feelings moves beyond having a specialized set of skills, we will find that the number of brilliant people will skyrocket. It may seem small, but Hipstamatic is just one of the first steps along that path.
Everyone is talking about the Gizmodo iPhone 4 feature story. They’re talking about the possible ramifications of Gizmodo paying for known stolen property, of the moral issues of making the Apple employee’s identity public. That’s all fine and good, but what about the camera? The front-facing one, I mean.
Europe has had front-facing video cameras forever, and their phone networks have been capable of sustaining two-way video calls for quite some time, as well. The idea never took off here, except among the mobile crowd with cheap netbooks tied to expensive data plans. Now this phone will have a front-facing camera and a rear camera that will (presumably) add even more weight to its use as a true convergence device.
I feel this signals even more than that. Sure, it’s great that the phone is able to do so much, has a crazy high-resolution screen, and oodles of storage. But I’m not talking about that; I’m talking about the backbone that allows it to do all these whiz-bang tricks: AT&T.
If there’s a front-facing camera on this phone, it might be limited to use when the phone is in range of wifi. That’s fine, and wouldn’t be so bad…but what if this camera were finally AT&T’s way of saying, “Yup, we’re ready for you now”?
Their coverage and ability to handle high-load areas in places like New York and San Francisco may be lacking, but coverage here in Chicago is great.
Bring on the video calls.
Everyone is so torn up about the iPad’s lack of a camera, when they should really be thinking much bigger.

Apple has always languished in the business space, instead targeting students, creative professionals, and the education sector. The iPad marks the first time Apple will be making a product that, from its outset, can be used in the legal, medical, and business fields. In most of these occupations, cameras are a big no-no, due to the sensitivity of patient/client information. This is huge. These types of professionals will finally be able to use a device in a form factor that is conducive to their work and versatile enough to be used anywhere.

Add to that the fact that software developers are scrambling to produce iPad apps for these under-penetrated markets, and you’ve set the stage for…wait for it…magic.