While my posts haven’t been coming fast and furious lately, I’ve been watching the tech landscape recently and have seen some interesting shifts in where I believe a lot of things are heading.
Whither the iPod Nano?
This has been a perennial issue for me. When the iPhone 4S (the phone everyone expected to be the iPhone 5) was released, people did two things:
1. Thought that it was an inferior phone because the character “5” was not in the name
2. Forgot about everything else for a little while.
I, however, did not forget about the iPod Nano. Quite the opposite: I began to think more about it, mostly from the perspective of “How can Apple make use of this new Bluetooth 4.0 thing?” While Bluetooth may not be very important to many people in the world, or may be synonymous with “headset”, Bluetooth makes possible a great many things that people simply don’t take advantage of. Case in point: a friend of mine just saw me typing this blog post on a wireless Bluetooth keyboard and said “Wow, a wireless keyboard? I didn’t even know they made those.” Naturally, he’s a little behind the times (friar, vow of poverty), but that doesn’t stop the concept from being foreign to many people. An iPad-toting client of mine didn’t know that Bluetooth could be used to connect an iPad to a wireless keyboard, either (see “headset” equivocation above).
At any rate, that’s where we are: Bluetooth has effectively been relegated to another name for “headset.”
The iPod Nano has the opportunity to become something so far beyond what it is right now. It can be a gateway to the information stored on an iPhone, a supplement to an iPad (remote control, keyfob, microphone, etc.), and, possibly even more importantly, a front-end for Siri. Naturally, the iPod Nano’s screen isn’t designed for displaying large amounts of information, but that doesn’t preclude it from being an information portal.
When talk of an “iPad Mini” started swirling about, I immediately started thinking about the whole Steve Jobs “people don’t like these ‘tweener’ sizes for tablets” statement. Whenever he said something like that, you knew that a product wasn’t very far away. The issue for Apple wasn’t creating a product in that size, but rather timing their entry into that size category. One of the things that I’ve noticed about a great deal of the other 7″ (ish) tablets on the market is that they lack anything truly compelling for me. I wouldn’t want a Kindle/Kindle Fire because its primary purpose is to read books purchased through Amazon.
The Nexus 7 was almost enough to get me on board until I used one. “Why would I spend any money on this?” I found myself asking over and over. The only truly compelling thing that I saw in the Nexus 7 was the NFC capability, but even that was a stretch. I need a product like that to be an iPad, but smaller, capable of all the things my iPad is capable of. I’m sure there are many people in the same boat.
I’ve been using the iPad to take notes, draw, read, and write since its introduction to the market. People tried to tell me that it wouldn’t be capable of much, and I would just quietly continue working, nodding as I continued to accomplish goals I set out for myself from the comfort of a tablet that I could use comfortably all day.
I knew there was one problem, though: it was too big (and not by much) for me to carry in my hoodie pocket. There were times that I only wanted to carry my tablet with me and nothing else, lack of charging equipment and extra tubes for my bike being reasonable things to forego in favor of a tablet that could slip easily into my back pocket. My iPad was literally a half inch too big, and I resigned myself to carrying the things I needed in addition to my wundertablet.
It was a hard life, I know, but I made it through. Thanks for your concern.
Now, however, I feel like Apple is going to make a lot of people happy by creating a device that is perfectly capable of an absolutely ludicrous number of things (vis-à-vis other tablets), yet still has an extremely portable form factor (as though the iPad wasn’t portable enough).
Here’s the thing, though: Apple needed to time this whole thing. Releasing a 7″ (ish) tablet shortly after the iPad would have been great, and people would have really liked it, sure, but it wouldn’t have had the same impact that I believe it will have now. By releasing an “iPad Mini” now, Apple has allowed all the trash to sift itself out. Plenty of other companies have brought “me too” devices to market, and each has captured some small part of the iPad experience that people love, but left even more behind. Other companies thought that, if they could only have gotten that 7″ tablet to market first, they would have ruled that space. The issue with that type of thinking is that it leads to sloppiness. Should this “iPad Mini” be released soon, it will be released with the entire weight of Apple behind it. It will have access to the iTunes Store, and it will have access to the App Store. All the apps that people have already purchased will be available on their device from day one. Their contacts and calendars will be synced through iCloud, and, while the same can be said for any Android tablet in that form factor, a person toting both Android and Apple devices would have to manage two devices with two different stores to shop from, two places to store their media, and no convenient way to slosh purchases around between devices.
With a device having a smaller screen size and profile, Apple will be making their signature store/device integration available in an even more portable form factor. The market will respond, and it will respond favorably.
Keep Your Friends Close
The last thing that I haven’t been hearing much about recently is NFC. Samsung released the Galaxy S III to a mediocre amount of fanfare, touting all of this NFC magic…but I have yet to see anything really interesting come out of it. I love the idea of NFC, but, like the Nexus 7, I see no one using it. I don’t see any stores with NFC tags on their doors, no restaurants with NFC tags on the tables to allow patrons to silence their phones and join their wifi with a single tap. None of this is real because I have a sneaking suspicion that Samsung has no idea what it’s doing. They put products on the market that have checkboxes in all the right places, but no real-world application of any of the things that those boxes relate to. Great job, Sammie, your phone has NFC! Does that honestly play a role in most people’s buying decisions? No, no it doesn’t. A friend of mine recently purchased a new GSIII and, when asked about the NFC feature, had no idea what I was talking about.
Truth be told, I’m not sure NFC will ever be a truly compelling technology, but I believe that, if it is, Apple will do it right. They’ll do it right because they’re really the only company that can make something as obscure as NFC relevant enough to matter to the world. When the world’s most valuable company throws its weight behind something, you’re pretty safe betting that people are going to pay attention.
All of this assumes a few things:
1. Apple is releasing a new iPod Nano.
2. Apple is releasing an “iPad Mini”.
3. The aforementioned products, in addition to the new iPhone, will contain NFC technology.
That’s a great many assumptions, but they all seem to make sense. I’m not one to start making assumptions and thinking that I’ve got it all right, but, based on what I’ve been seeing and, perhaps even more importantly, what I haven’t been seeing, I believe that all of these things are very close to reality.
I haven’t even touched on the possible integration with a refresh of the Apple TV, but I think that all those things are around the corner, as well.
It’s gonna be a helluva September
When Siri was unveiled with the introduction of the iPhone 4S, there were a lot of very intrigued, very happy people. Already, in my usage of Siri with my new iPhone 4S, I find myself pleasantly surprised with the things I’m able to do, and how easy Siri makes so many of the things I’m used to doing. Naturally, there are some shortcomings. Since I use an unlocked 4S with the T-Mobile network, I’m relegated to EDGE when not on wi-fi (how was this speed ever acceptable?), and communication with Siri is woefully slow. I wish I had the scratch to pull off an AT&T subscription, but I just don’t right now.
This got me thinking, however. Since the 4S relies on a persistent, high-speed network to deliver results to the user, what happens when a person has a slow connection, or is in a wireless dead zone? The ability for Siri to function as an interface diminishes dramatically, leaving a person only able to interact with the data that is already on his or her phone. While this normally would not be a problem, anyone looking for Siri functionality in a wireless dead zone is going to be frustrated, period. Naturally, the last thing Apple wants is unhappy customers, so what can Apple do to circumvent this situation?
I found the answer in the iPod Shuffle.
This little device, as many know, is what one might call one of Apple’s lesser-loved projects. At the time of its inception, it filled a necessary void–that of a low-cost music player bearing the iconic Apple logo and “iPod” name. It was my first iPod, and, I’d wager, the first iPod for many others, as well. The problem with the iPod Shuffle, now, is it lacks features. It isn’t relevant anymore. When the shuffle was introduced, MP3 players, including the iPod Classic, were large and relatively bulky, and their battery life left something to be desired. The Shuffle had long battery life, was capable of syncing with iTunes, and offered people an interesting alternative to the blue-hued screens and click wheels of their larger cousins. The storage was all flash, which meant that it wasn’t prone to hard drive failures in the same way the iPod Classic was, and that it could play all day on a single charge.
Since the Shuffle lacked a screen, however, there was no way for a user to really know what was about to play. Apple solved this with their “VoiceOver” feature, which was able to announce the name of the playing track or playlist, or the remaining battery life. In order to do this, however, the user needs to give up some storage space on their device to make room for the VoiceOver data. For some, this is an easy tradeoff, since it adds a sense of depth to the diminutive device. Tuck that in the back of your mind for a moment.
It was recently discovered that the iPhone 4S contains a dedicated sound-processing chip that enables it to better separate your voice from background noise, which increases its ability to recognize what you’re saying before sending that data off to Siri for processing and language recognition. All this data being sent to Siri means that there are a great many sound snippets that Apple has at its disposal to refine and improve its voice recognition and accuracy. The more people use Siri, the better it gets, and the better it gets, the more people use it. Eventually, I believe, Apple will be able to “distill” certain Siri queries down to their core components, picking out speech patterns and pulling user voices away from background noise more easily. Furthermore, Apple will be able to condense certain components of Siri down far enough to include that functionality on devices that don’t have a persistent wireless connection, and significantly speed up Siri queries on devices that do. Naturally, looking up restaurants on Yelp or finding out data from Wolfram is going to require a connection to the internet, but things like setting reminders, calendar appointments, taking notes, and playing music can all (theoretically) be done locally, without a persistent data connection. This would allow Apple to install Siri on all of its devices. When the device has a wireless connection, it would be able to upload usage statistics, and download changes to the onboard Siri database while doing its nightly iCloud backup.
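The split imagined above, simple intents handled on the device and everything else deferred to the network, can be sketched as a tiny dispatcher. Everything here is hypothetical and illustrative; these are not Apple’s actual intent names or APIs, and real speech recognition is obviously far more involved than keyword matching.

```python
# Hypothetical sketch of a hybrid voice-command dispatcher: simple intents
# (reminders, alarms, music, notes) are handled on-device; everything else
# needs a round-trip to the server. All names are invented for illustration.

LOCAL_INTENTS = {"set_reminder", "set_alarm", "play_music", "take_note"}

def classify(utterance):
    """Toy intent classifier: keyword matching stands in for real
    speech and language recognition."""
    text = utterance.lower()
    if "remind" in text:
        return "set_reminder"
    if "alarm" in text:
        return "set_alarm"
    if "play" in text:
        return "play_music"
    if "note" in text:
        return "take_note"
    return "web_query"  # restaurants, Wolfram lookups, etc.

def dispatch(utterance, online):
    """Route a command locally when possible; otherwise use (or wait
    for) the network."""
    intent = classify(utterance)
    if intent in LOCAL_INTENTS:
        return f"handled locally: {intent}"
    if online:
        return f"sent to server: {intent}"
    return "queued until a connection is available"
```

With a dispatcher like this, `dispatch("Remind me to buy tubes", online=False)` succeeds even in a wireless dead zone, while a restaurant search in the same dead zone would simply wait for connectivity.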
Naturally, the user might have to sacrifice some storage space, but it would allow even the iPod shuffle to become a “personal computer”, with the ability to store notes, read emails, and access a user’s information in the cloud when a connection becomes available. Who knows? Apple may even negotiate a wireless deal with service providers that allow all its devices to connect to a Kindle WhisperNet-style “SiriNet” for free, for the purposes of communicating with the Siri servers.
Until we have ubiquitous worldwide wireless coverage, we can talk to the little Siri in our Shuffle.
So the big announcement is iCloud, iOS 5, and Lion. These are all good things, and probably make clever use of a new, powerful back-end that will hopefully be a major part of Apple’s strategy going forward. One of the interesting things to see will be how Apple prices this “new” service, if it’s going to be considered “new” at all.
I agree with what TUAW has to say about Apple’s paid vs. free options being a part of its iCloud (née MobileMe) plans. I can’t imagine that Apple would ignore the vast potential in this market. There’s just no way that any company in their right mind would ignore the power that a uniting backbone would have in its ecosystem.
It’s been a perennial rumor that Apple will stop charging $99/year for much of its MobileMe service. The rumors have always suggested Apple will offer basic services (like email and over-the-air device syncing) for free, while paying subscribers will have access to things like website hosting, online photo galleries, storage options through iDisk, and now potentially wireless streaming of music via the rumored iCloud service.
Then there’s this article by AppleInsider that offers up another possible interpretation, namely that Apple will be introducing a “tiered” pricing model for its new iCloud service based on the user’s operating system. I don’t think this is going to happen, since tiered pricing is uncharacteristic of Apple.
That price tag may remain for users who do not make the upgrade to Lion, or for Windows users. But it is expected that the cloud services will become free to Mac users who run the latest version of Mac OS X.
My opinion is that Apple will introduce some kind of free option. Just about every big tech player out there offers some sort of free email option, and that’s by design. By pulling people into your ecosystem, you grab mindshare and envelop them in whatever “culture” your product or service suite represents.
There’s also the increasing awareness of what email addresses mean. A person with an “@me.com” email address is telling the world “I probably own a Mac or iOS device, and have the ability to view whatever files you’d like to email me or access just about any site you send my way.” This is important in today’s business world, where the data itself is less important than the connections it represents. A business owner isn’t going to say, “Hey, can you send me that file as a Keynote? I have an iPhone.” No, they’re just going to be able to open it because they have an iPhone. Offering customers even more integration, stability, and ease-of-use would be a huge selling point for Apple, and will also pave the way for their future plans for FaceTime (which I believe Apple will push heavily as a replacement for phone calls in the coming years).
Exciting stuff, can’t wait for the Keynote.
One of the most powerful developments in recent years has been the creation of “cloud computing.” Folks familiar with the technology know that it’s essentially doing for your computer what email services like Gmail and Yahoo! have done for your communication–they’ve taken your messages, contacts, and other personal information and stored it on secure servers across the nation to make it easily retrievable in the case of an emergency or hardware failure. Instead of relying on a single storage point (your home PC, for example) to store all of your communication, Google, Yahoo, and dozens of other websites offer to handle all of those tasks in exchange for showing you advertising or using some non-identifiable information to craft better algorithms.
For most people, the immediate benefit of these systems was apparent. Access your mail anywhere, store contacts somewhere that won’t be affected in the case of a system crash or loss of a single device (like a phone), and integrate these services with your web browsing. Easy, and powerful. The systems that provided these services have long since evolved significantly, now allowing entire operating systems to essentially run through your broadband connection, piping only the data necessary for input and allowing massive supercomputers to handle all of the processing.
That all sounds fine and good, but what does it mean for you?
Cloud computing, so named because of its pseudo-omnipresence, changes the role of computers significantly. They no longer exist as a single point of storage for all your information. Instead, the computer is more of a gateway, a portal to your data that is stored in massive servers. One analogy I can draw is that of a dry cleaner. With the old model of computing, it was as though you were standing at the front of a dry cleaning factory trying to look for a specific shirt. You might not even know where the shirt was located, but you’d still have to find it yourself. With the advent of search, that process was trimmed a bit: you tell someone else what to look for and where to look, and they find the shirt.
Now, with cloud computing, we see that yet another layer of interaction is slowly melting away. We’re doing away with the fetching entirely. You don’t even really need to know where you’ve stored your data, you just need to run a search, and you can pull down results from the stuff you have stored locally on your computer as well as the files floating up with the sun and moon. We are no longer limited by how much space is on our devices, how much storage we can buy. The only limiting factor is the infrastructure that connects all these devices together. Some people have asked me, almost accusingly, “Well what happens if the network goes down? What then, huh?”
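The idea above, that you run one search and never need to know where a file lives, can be sketched as a query that fans out to a local index and a stubbed cloud index and merges the hits. The file names and the origin tags are invented purely for illustration.

```python
# Illustrative sketch of "location-blind" search: one query hits both a
# local index and a (stubbed) cloud index, and the merged result list
# hides where each file actually lives. All data here is made up.

LOCAL_FILES = ["trip-notes.txt", "budget.xls", "blue-shirt-photo.jpg"]
CLOUD_FILES = ["budget-2011.xls", "band-demo.mp3", "trip-photos.zip"]

def search(query):
    """Return matching file names from both stores. The origin tag is
    included only for demonstration; a real UI wouldn't need it."""
    q = query.lower()
    hits = [(name, "local") for name in LOCAL_FILES if q in name.lower()]
    hits += [(name, "cloud") for name in CLOUD_FILES if q in name.lower()]
    return sorted(hits)

# One query, results from both stores, no "where" required of the user.
print(search("budget"))
```

The caller never specifies where to look, which is exactly the layer of interaction that melts away.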
If the entire United States suddenly experiences a simultaneous and catastrophic shutdown of all of its network infrastructure, we will have much bigger things to worry about than listening to our music or accessing the documents on our cloud folder. That’s akin to asking what would happen if all paper in the United States suddenly caught fire. I don’t want to hypothesize about the events or circumstances that would need to exist in order to facilitate such a terrible reality, but, assuming it was both spontaneous and total, I doubt anyone would be worried about their fourth grade diary.
In recent news, we’ve heard rumblings of Apple’s new iOS 5 being cloud-based, a total overhaul of the OS. I can’t even begin to fathom what that means. The OS seems just fine as it is, but the cloud is where it’s at these days, and that darn data center that’s been occupying so many of my thoughts and predictions seems like the perfect use of all those massive petaflops (or whatever they use to measure data centers of that magnitude). It all seems to be coming together now.
What we will start to see is more unity across Apple’s various OS products. Remember back in 2007, when Steve was asked what kind of OS the iPhone was running? Does anyone remember his response? Let’s recap, shall we?
Jobs admitted that Apple is a new player in the cell phone business, saying “We’re newcomers. People have forgotten more than we know about this.” Jobs noted that the operating system to run the iPhone — Mac OS X itself — has been in development for more than a decade (its roots lie in NeXT’s Nextstep operating system). Mossberg suggested that the iPhone doesn’t have the entire operating system on it, but Jobs protested.
“Yes it does. The entire OS is gigabytes, but it’s data. We don’t need desktop patterns, sound files. If you take out the data, the OS isn’t that huge. It’s got real OS X, real Safari, real desktop e-mail. And we can take Safari and put a different user interface on it, to work with the multitouch screen. And if you don’t own a browser, you can’t do that,” said Jobs.
This shift is not overnight, and it is not a new direction for Mac OS. Once Apple began work on the iPad, they started planning for this shift, possibly even before that. I seem to remember some folks discussing the origins of the iPhone, how it was actually rooted in an experimental side project that Steve Jobs somehow got a look at and recognized as brilliant, and that said side project was actually more akin to the iPad than the iPhone. At any rate, it looks to me as though Apple has been planning this shift for years, possibly even the better part of a decade. I believe that Apple designed iOS with unification in mind all along, seeing a desire to create a powerful OS for new mobile devices that hadn’t even been developed yet. It seems fairly obvious when you look at their last “Back to the Mac” event, and even more glaringly obvious when you see something like this coming out of Gizmodo.
Adobe demonstrated Photoshop for iPad yesterday. Not a sub-product like Photoshop Express, but the real Photoshop, with a new skin. Sure, it doesn’t have some of the advanced print and web publishing oriented features of the desktop behemoth. But it has everything you need, from layers compositing—including a 3D mode to show people how they work—to what appeared to be non-destructive adjust layers, levels, color controls, and all the features I use every day in the desktop Photoshop. From the little we have seen, the application was fast and smooth.
I believe Apple has succeeded in ushering in a new age already; I can’t wait to see them throw the doors wide open to a future we’ve only dreamed of.
One of the most exciting things about being a technophile is the reactions I get to experience from friends and family members regarding new technology and its place in their lives. For some members of my immediate family, technology is something to be shunned or, at best, regarded cautiously. The intersection between life and technology seldom occurs and, when it does, the intersection is typically relegated to the living room TV or family computer for just a few moments.
The general distrust of technology is not unique to my family, however. As phones have increasingly taken on more characteristics of computers, many of my friends have opted for lower-tech, less-capable devices that offer the illusion of simplicity and security¹. There seems to be a general trend toward devices that are intentionally simpler or less advanced than the iPhones and Androids of today. This seems to go hand-in-hand with a trend that was very prevalent in the early 90s in consumer electronics: blinky things.
This isn’t a joke or intended to poke fun at things that blink and glow, it’s an observation about the level of interaction that most people have with their technology, and the way that technology is designed today vs. twenty years ago. Currently, almost everything we see in the mainstream consumer electronics space is being geared towards user-friendliness and maximum functionality. We see device after device being introduced into the marketplace with the same glass face, the same general form factors, the same trend away from confusing buttons and towards devices that shift and morph as the user invokes different commands and demands different functionality from the device.
A close friend of mine was discussing his experiences in Japan in the early 1990s, when Japan was leading the world in technological advancements in the consumer electronics space. His defining memory of the era was of blinking lights. He told me about his friends who would go shopping for electronics, looking expressly for the devices and gadgets that had the most blinky lights on them. Contrast that with the devices of today, which have few, if any, lights at all (save for the screen).
I believe that this shift in the visual appearance of devices also has a great deal to do with the intended usage of devices and the sea change we see occurring in mainstream media in general. In a recent discussion I had (referenced here as well), I argued that media consumption is moving away from the all-you-can-eat huge cable bills and more towards selective, pay-for-what-you-watch models. This means that people have to go out and find what they want to watch in order to actually watch anything, which means that the consumption of media must be intentional. This is incredibly important when we look at how these new devices fit into our lives.
My father picked up an iPad recently (it was off, but plugged in and charging) and said something interesting. “How do you know it’s charging?” he asked. “There’s nothing blinking on here.” He’s right, of course, but that simple statement illustrates the difference between current-gen devices and last-gen technology. In previous generations of electronics, devices were ambient, non-interactive, and representative. The stereo represented music, the typewriter represented writing. These gadgets were single-function, specialized devices. They were large and expensive, and sometimes required some sort of technical training in order to learn how to operate them. The trend in recent years, however, has been away from single-function devices like stereos, typewriters, and cassette players. The shift has been decidedly towards convergence devices whose role in day-to-day activities is not clearly defined because it is so amorphous.
In the early 90s, a person could glance over at his or her stereo and be greeted by an array of lights and digits that portrayed all sorts of information which varied by model and type of stereo. This information, however, was specific to the gadget and usage case thereof. In that scenario, a person would have any number of different devices to display very specific pieces of information. Thermometers, clocks, typewriters, stereos, and more have all been replaced by multi-function devices that are becoming more and more ubiquitous, and some people feel threatened by that. Gone are the blinkenlights, gone is the specialized knowledge required to operate the machinery, gone is the sense of self that is then inevitably tied to the gadget. Instead, we see inherently mutable devices with no single purpose taking center stage. Suddenly all the gadgets that people have been hoarding over the years are rendered useless or unnecessary, and the owner of said devices suffers a bit of an identity crisis. Should we decide to keep the devices, we clutter our lives with junk. Should we decide to pitch them, we admit defeat to the tides of change.
This, however, is not as bad as it may sound. A shift away from clearly defined objects means that our sense of self becomes tied to ideas instead, tied to our interactions with technology, not the technology itself. We come to think more critically, more abstractly. What are we looking for? How do we find the information we seek? Is this information important? How should we process and/or internalize this information?
Ultimately, a shift in the type of technologies that our lives revolve around signals a shift in our self-awareness. When you think about it, another analogy comes to mind, one that I discussed recently vis-à-vis the transition Apple is making with their new data center.
Let’s get existential, shall we? Let’s get right into it. Here it is: our sense of self, our identity, by being disassociated from things, now lives…wait for it…”in the cloud.”
Bet you thought you’d never see the day, huh?
¹ One of the most often-heard arguments I have heard from my paranoid friends/family members is “What if you lose your phone?” or “What if someone steals your phone?” I actually faced that exact scenario recently and discovered some very interesting things about security and vulnerability that will undoubtedly raise some eyebrows. I’ll describe that story in detail soon.
Apple also made a small, but very meaningful change to their iOS app store, namely the shift to a button labeled “Install.”
While this may appear on the surface to be merely cosmetic, looking deeper reveals a lot in light of all the movement Apple has been making recently in building out the data center and rolling out the tall ladders for cloud (or pseudo-cloud) computing. AppleInsider discusses the physical processes that are beginning to facilitate this.
What we see here is a blurring of the lines between local and cloud storage. If a button is labeled “Install,” it implies that the app is close at hand, just a tap away in order to be in front of us and usable.
Consider the language Apple uses when downloading and installing apps from the App Store. While the app is being downloaded, the user sees “Loading…” below the app, creating the impression that the app is not being fetched from some far-away place, but that the app is being unwrapped, that it’s simply starting up for the first time. As the process continues, “Loading…” changes to “Installing…,” which further increases the similarity to a locally-stored app. Shortly thereafter, the app is ready, and the user can go to town.
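The label sequence described above can be modeled as a simple fixed state progression. This is a toy sketch; the labels mirror the ones a user sees, but the class and transition logic are invented purely for illustration.

```python
# Toy model of the status labels a user sees during an App Store
# download: "Loading…" -> "Installing…" -> ready to use. The class and
# its logic are invented for illustration, not Apple's implementation.

STATES = ["Loading…", "Installing…", "Ready"]

class AppDownload:
    def __init__(self, name):
        self.name = name
        self.index = 0  # start at "Loading…"

    @property
    def label(self):
        """The status string currently shown beneath the app icon."""
        return STATES[self.index]

    def advance(self):
        """Move to the next phase, stopping at the final state."""
        if self.index < len(STATES) - 1:
            self.index += 1
        return self.label
```

Each label describes the app as already close at hand; nothing in the sequence ever says “downloading from a remote server,” which is precisely the impression the wording is designed to create.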
Displaying “Install” in the app store, instead of the app’s price, puts the user at ease that they already own this piece of software, that Apple is keeping track and taking care of all of their software for them, and that they have their own personal software vault from which any app they own is accessible to them at any time.
Think about that change in juxtaposition with the old way of computing, when installing a program meant loading a physical disc into a tray and transferring the data onto a computer. Think about the programs that actually required that the disc be in the tray. This is a distinct and marked shift away from that type of application and media, a shift toward user-friendliness, toward ease-of-use.
Once again, this is good technology. The computer gets out of the way, and we are able to engage with our information more quickly, without a break in thought, without losing ourselves to the process. We are able to focus, explore, create. We are able to be more human.
Came across an interesting post on TUAW today:
Some advantages of the newly integrated suite of server administrative software include a guided setup process for configuring a Mac as a server; “local and remote administration – for users and groups, push notifications, file sharing, calendaring, mail, contacts, chat, Time Machine, VPN, web, and wiki services – all in one place”; “simple, profile-based setup and management for Mac OS X Lion, iPhone, iPad, and iPod touch devices” with Profile Manager; Wiki Server 3, designed to make it “even easier to collaborate, share, and exchange information”; and WebDAV services that give iPad users “the ability to [wirelessly] access, copy, and share documents on the server from applications such as Keynote, Numbers, and Pages.”
What we’re seeing is a paradigm shift in home computer usage. More and more people are shifting away from traditional desktop configurations for their everyday computing and adopting the iPad as their primary method of getting access to the information they want. This is as inevitable as it is surprising. Inevitable, because mobile computers have increasingly become the focal point of the technology world; surprising, because it happened so fast and so definitively. I need more than the fingers on my hands to count the number of people who use the iPad as their primary computer. As these devices become more powerful and ever more portable, that number will increase.
iPad sales have also been staggering, especially when compared to other manufacturers (HP, Samsung), and the device has captured huge percentages of the market (even markets that don’t really belong to it). Hence, people are starting to wonder whether it makes sense to even own a computer if this sort of thing becomes the norm.
Unfortunately, the iPad still needs to sync to something, and that something is quickly changing from a computing device into more of a server. The fact that Lion (Mac OS 10.7) will essentially allow any Mac to function as a server is quite interesting, and I believe it hints at Apple’s future plans under the surface.
Apple likes Mac OS, and believes that it will survive for a long, long time. I agree with this, but I believe that the Mac OS will shift subtly away from its current place as the OS that people see to the OS that works under the surface. It’s a powerful statement about the future roles of the “computer” and “user.” In Apple’s future, the “computer” should be invisible, providing a means for people to access what they need, when they need it. The “user” simply gets access to what he or she wants through one of the many pipelines that transfer his or her data.
This is a trend that I have been participating in for a while, through apps like Simplify (RIP) and now Audiogalaxy, LogMeIn, and Air Sharing. The whole idea is that my iPad serves as a window/portal to everything that I may need.
Introducing a “server” option to a standard install of Mac OS Lion is Apple telling the world that soon, the computer they have sitting in the den will grow wings and live in the cloud.
Gizmodo did a week-long series of posts relating to memory and how the transition to social networking, cloud storage, and a more digital lifestyle has affected our ability to remember things, both positively and negatively.
I often joke that I, like David Bowie, have “the memory of a tiny goldfish.” What this often leads to is my forgetting important things like birthdays, phone numbers, and previous engagements, despite my best attempts to keep them in mind and present.
Another side-effect is my increasing inability to remember my life, past events and experiences. Sure, there are formative events, important parts of my life that I do indeed remember, and these will undoubtedly be clear to me for many years, but there are far more events, people, and places that blur together unintelligibly. I lose track of the who’s and the what’s. I get taken by the moment, unable to free myself from what is happening right now. While some people are unable to free themselves from the past, I cannot seem to find my way back to it.
In some cases, this is a good thing, a GREAT thing. I’m sure we’ve all had those moments we would rather not remember, experiences we’d rather forget. Twenty years ago, that may have been possible. Without a persistent digital memory, our past dangled above the abyss of oblivion. If I wanted to forget something, I stopped thinking about it. I burned the pictures, the letters, the drawings. Physical things held meaning, and their destruction was cathartic. Now, however, our lives are transitioning away from fickle physicality and into immortal ones and zeroes. Tiny bits of information define who we are now, and maintaining those bits and bytes from now until eternity is most likely inevitable. Vast data centers will store our information for…well…forever. Our demographics will be used as part of research and studies done by mega-Internet firms. Our pictures will remain tagged long after we have passed on. In a sense, we are now immortal. This immortality, however, brings with it another thing to consider.
In years past, a person engaged in illegal or immoral activity could hide his or her tracks relatively easily by being mindful of his or her physical space. He or she could walk away from an old life and back onto the straight and narrow. Now, mistakes stay with you. Email exchanges, instant messenger conversations, and posts on forums persist and remain accessible for many years after they have lost their relevance. They may no longer be important, but they still exist and are accessible. Can you say that about your notes from high school? Pictures from graduation? How about that wedding you went to? Digital storage and cloud computing make all of this possible.
But what if you want to forget? You can’t. You’ll run a search for something in your inbox, and you’ll be served up an email from a painful time in your life, potentially years ago. Maybe you haven’t thought about it for years, and now there it is, staring you in the face, a reminder of a past you may have tried to forget. In the physical world, the chance that mistakes will literally come back is slim. We can put things behind us, move away, physically destroy our past. In the digital world, we cannot. Just because you deleted that email doesn’t mean the other person did, and those pictures on Facebook, despite being untagged, still exist on their servers in someone else’s profile.
All of this begs us to make one simple change to our lives: live honestly.
We cannot do things the right way each time…we are human, after all. But as our digital worlds collide with the physical world, we are given the opportunity to live our lives more truly, to line up our intentions with our actions and live with purpose.
The next time you find yourself in a situation that may not be entirely characteristic of the person you have been trying to be, think of how you would like to be remembered. Chances are, someone is tagging you when the night is through, and they won’t stop to consider the ramifications of that red Solo cup on your future career. Do you want to have a job when all is said and done? Or do you want to be remembered as “that guy who went nuts on the pool table wearing a lampshade as a hat?” If it’s the latter, then go right ahead, lampshade guy.
But if you think for even one second that this is something you might not want future generations of Americans to read, don’t write it. As we move toward ubiquitous image, video, and sound capture, we will have to become increasingly aware of the weight of our thoughts, words, and actions. So let’s all pull our pants back up and clean up our lives. Does that mean our Facebook pages may become dull and boring? Maybe. Does it mean that our lives will be lived more intentionally and meaningfully? Absolutely.