I think Iowa has the right idea:
Unlike other schools that plop computers on a student’s desk and walk away, Carver did away with traditional paper-based learning and actively used the laptops in a new digital curriculum.
I’ve seen these non-traditional, “progressive” methods pushed on students, and they’re usually awful, mostly because the administration doesn’t get technology. They think they can throw some iPads or netbooks at students and everything will be hunky-dory. Typically, that fails so miserably it’s not even funny. Teachers end up spending 10-15 minutes per class period just troubleshooting tech problems they are ill-prepared for or have no patience with. Most of these teachers were over a decade into their careers, some nearing two decades, and some nearing retirement.
Across the board, the issue was that they were given little to no guidance as to how to integrate these new technologies into their classroom. Furthermore, a digital classroom needs to have a curriculum that moves quickly and takes advantage of the technology so that the teacher is engaging students, creating opportunities for them to think and synthesize information. Without that, progress becomes an illusion.
The rumors of an Apple-branded HDTV have been around for a long time (although perhaps not as long as the rumors of the iconic iPhone). For many reasons and for many years, this didn’t make sense. An Apple-branded phone once seemed ludicrous because so many other companies controlled the market in terms of handset design, technology, carrier availability, and the like. Apple had no leverage; they were just getting their feet under them after a near-inevitable downfall, and they weren’t seen as competitive in the marketplace due to the highly exclusive nature of their products. Then they started designing their own hardware, coupled it with some amazing software, and all that changed.
Now, the world looks to Apple for guidance on just about everything.
Now, we’re seeing the same thing with TVs, and it smacks of webOS.
One of the big announcements to come out of HP’s Think Beyond event was that webOS will ship on every PC and laptop, and on some printers, that HP sells by the end of this year. We have pondered what that will do to the scale of webOS and how HP would implement it.
To be honest, I don’t know how this is going to play out, but it looks like these companies want to get their OS into everything in your home. I think the idea here is to have a network of appliances, devices, and screens that are discoverable and OS-aware, meaning that they can sniff out other devices/appliances on the network and interface with them. A person might be able to control his or her washing machine with a phone, or monitor the state of the vegetables in the refrigerator by glancing at a widget in the dock of a tablet, or activate a Roomba to clean the floors while he or she is away. The more devices run your flavor of OS, the more is possible on the network. Naturally, this might also lead to Skynet, but whatever.
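That discovery idea can be sketched in miniature. Everything below is invented (a real implementation would use something like multicast DNS for discovery), but it shows the shape of a network of OS-aware devices that announce their capabilities and can be driven by any controller:

```python
class Device:
    """A hypothetical OS-aware appliance that announces what it can do."""

    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)

    def handle(self, command):
        if command not in self.capabilities:
            raise ValueError(f"{self.name} cannot {command}")
        return f"{self.name}: {command} ok"


class HomeNetwork:
    """Shared registry standing in for real network discovery."""

    def __init__(self):
        self.devices = []

    def announce(self, device):
        self.devices.append(device)

    def find(self, capability):
        # A phone app can ask for anything it can drive, regardless of vendor.
        return [d for d in self.devices if capability in d.capabilities]


net = HomeNetwork()
net.announce(Device("washer", ["start", "stop", "status"]))
net.announce(Device("fridge", ["status"]))

startable = net.find("start")
print([d.name for d in startable])  # -> ['washer']
```

The point of the sketch: the more devices speak the same protocol, the more a single controller can do, which is exactly the scale argument for putting one OS on everything.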
It’s the “home of the future”, and it started with your phone.
There are lots of approaches to mobile OS development these days. Some are closed, some are open, some are really stable, and others trade stability for customizability. There’s one OS that I’ve always admired for its cleanliness, integration, and overall user experience, and that’s webOS. The underdog of the “Mobile OS Wars”, webOS did a lot of really neat things that its competitors simply didn’t want to do, or didn’t want to risk because they had too much on the table. Palm, which developed webOS, had little to lose, and they bet most of it on webOS. Some could argue that they lost that fight, but I think HP’s recent acquisition of webOS was a phenomenal idea and a great long-term strategy. I get what happened there, and HP makes a good argument as to why they did it.
HP currently sells a lot of computers, possibly more than any other company, but they’ve long lacked any real brand identity. They’ve been plagued by the same problem as just about every other computer company: reliance on Microsoft. When netbooks took the stage just prior to the tablet revolution, a lot of companies, HP included, tried to get on board with custom Linux builds that were easy to put together and load because they carried no licensing fees. The problem was that the OS conventions didn’t quite carry over, and the software people expected to run, well…it didn’t. For the computer-savvy individual, this didn’t matter much. For the older folks looking to get their first computer because it was “cute”, netbooks were a disaster.
So, HP’s acquisition of Palm and all of its resources, webOS included, was smart. This is going to allow HP to finally create an OS that can be associated synonymously with the HP brand. I imagine that webOS will eventually have its name changed as well, but it’s sticking for now.
What I also applaud is HP’s apparent commitment to bringing webOS to everything, possibly including refrigerators. This is where things start to get really interesting, because these devices can (theoretically) be aware of each other and work together right out of the box (in much the same way Apple is building iOS into other things), or gain new features down the road that weren’t available at release.
The other interesting aspect of HP’s decision is the recognition that their new OS isn’t just about layering a “touch-friendly” shell over an ugly set of insides; it’s about designing a unique experience from the ground up using an OS that can scale to fit everything from phones to high-powered desktop boxes. Eventually, if the developer community is rich enough, people will realize that these custom OS builds are actually the way people want to work, not some of this crud.
Being a gamer, and reading game-related news, I was a little surprised at this article from Rock, Paper, Shotgun, which talked about a shift in policy on EA’s part regarding the marketing and sales of games on Steam:
HMM. This demonstrates incredible confidence in EA’s own brands, but the key back foot they’re on is that they don’t have any other publishers they can bring on board. What would change everything in the war against Steam is if the other major publishers launched their own Origin-like services and restricted their download sales to those. I won’t be at all surprised if that happens, as a few are quietly building the infrastructure – THQ have a store, Ubisoft have that uPlay thing, Blizzard obviously sell their own digital stuff direct… You could even see Call of Duty: Elite as heading vaguely in that direction.
It feels like this is a trend that’s moving very quickly. When we see artists, developers, and others selling their own stuff without a store or aggregation service to market their wares for them, we enter into a different kind of relationship with the creator: it’s more one-to-one, as opposed to one separated by the rift of the store.
Before, people would go to a single place to find stuff. This method of curation led people to associate their buying and their consumption with a place, a store, an entity somewhat divorced from the source of the goods. That association is misleading, and it can be frustrating for customers because they don’t necessarily know where their stuff comes from. It also robs people of creativity and imagination.
Now, with the proliferation of creators on the internet, there’s an increasing emphasis on discovery. That means that people need to be more self-aware and understand their wants and likes more. It also means that the creators have to have more clout since no one is doing their marketing for them. Either that, or a lot of really awesome relationships to build on.
This reminds me of Trent Reznor’s recent push into digital publishing:
Like a more magnanimous Radiohead, Reznor’s called into question the major-label reserve clause for established, profitable musicians by not just coming up with a new way to monetize music, but just giving it away for free, no strings attached. Instead of “tip-jar,” it’s “this one’s on me.”
and, of course, there’s always the “original” self-released album:
This is a hint of things to come. Over time more artists will decide to self-release music in this fashion, thus creating long, staggered release windows that place serious fans first and more casual fans further back in line. Traditional retail must wait in line, too. That means service companies that provide the tools and expertise for the online self-release of albums will benefit from this self-release strategy while the second wave of consumers are left to retailers.
What remains to be seen is whether self-publishing will win out over a curated experience like the various “App Stores” cropping up all over the place. Clearly, if a developer or creator wants all the money, they’re going to have to sell it themselves. If they want maximum exposure, they have to give a little of that up to be on one of these stores. This will be interesting to watch, for sure. Will we see increasing fragmentation or consolidation? Or, quite possibly, some strange hybrid of both?
Then it must be one, right?
When Apple released its new AppleTV, I asked one of the questions that I should really learn to stop asking with Apple products: “So what?”
What I forget constantly is that Apple figures all of its new products into a wonderful long-term strategy that is often hard to decipher but beautiful to watch unfold. The AppleTV was one of those devices that didn’t really have a place in my heart until I started using it, and it’s become even more incredible with the recent unveiling of iOS 5.
When I started using my iPhone, I discovered a little jailbreak-only app that allowed me to mirror my iPhone’s screen on my TV, which allowed me to do things that were (at the time) not possible, like pump music from the iPod app out to the TV and show photos (without creating a slideshow). “It’s like having a computer in your pocket,” one of my friends remarked at the time.
This is heating up now, and I think the barely-mentioned screen mirroring over AirPlay is going to be one of the most life-changing things I’ll experience this side of 2000. I already use my iPad for just about everything in my life, and the Mac Mini sitting just below my TV does very little. Sure, it has some apps installed for design purposes, but they’re all secondary to the writing and creation that I do on a daily basis on my iPad. Sometimes (rarely), I feel like it would be nice to be able to throw what I’m doing on a larger screen and just lean back a little, see the whole thing take shape in front of me. Why is this not a computer, again? Currently, I can do that (sort of) with my iPad via an HDMI cable…but that isn’t really ideal because it means that I have to jockey with cables, and risk damaging ports when things get inevitably jerked around or flexed in strange ways.1
Now, I can dock my iPad, throw the bluetooth keyboard on a desk, and type as much as my little fingers can type without a second thought. This is huge. It means that I can go to a friend’s house and mirror my display on his or her TV without any setup, without digging around behind the TV to find the HDMI port, and without any cables to lose or forget. Just make sure the friend has an AppleTV (which has enough value proposition on its own) and I’m set. It means that businesses don’t have to worry about sales pitches going wrong due to configuration issues anymore. It means that we’re one step closer to a shared classroom where people can contribute anything they’re reading to a discussion without having to be an IT professional to do it.
One step closer to living in the future. Oh wait, we’re already there.
1 I’ve always had problems with dongles and adapter cords. For some reason, the cables always break or fray internally, and the whole thing fails within a few months of normal use. I like to avoid them whenever possible.
I’ve had the pleasure of taking part in a couple of Groupon deals, and I’m really happy I did. I was able to get some delicious food for relatively cheap, and I’ve seen lots of good stuff on the site for meager amounts of money. This is both a good and bad thing. It’s great for me, because I eat food, and I like it when food is both good and cheap. I like supporting local business by eating at non-chain restaurants and cafés, and I thought I could do both through Groupon. What I am discovering, however, is that Groupon can actually be very damaging. This is bad.
From a recent post on TechCrunch:
Groupon can clearly deliver customers. But in order to know if it makes financial sense as a customer acquisition tool, merchants need to know two key numbers:
- The proportion of Groupon customers who are already their customers
- How often new customers come back.
That second metric is key. I’ve seen a lot of businesses draw record crowds of customers after running a Groupon deal, but I’ve wondered how many of those customers will actually come back. A packed house of people whom I know have Groupons waiting in their pockets has always reminded me of a swarm of locusts: they consume everything they see and just move on to the next cheap thing. I’m not so sure I want to be a part of that.
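Those two numbers are enough for a back-of-the-envelope check on whether a deal pays off as a customer-acquisition tool. Every figure below is made up purely for illustration:

```python
# Rough economics of a hypothetical Groupon run. All numbers are invented.
deal_price = 20.0          # customer pays $20 for $40 worth of food
merchant_share = 0.5       # merchant keeps roughly half the deal price
retail_value = 40.0        # value of food the merchant must honor
coupons_sold = 500

revenue = coupons_sold * deal_price * merchant_share   # what the merchant receives
giveaway = coupons_sold * retail_value - revenue       # retail value given up

already_customers = 0.40   # fraction who would have come anyway
return_rate = 0.15         # fraction of genuinely new customers who come back

new_repeat_customers = coupons_sold * (1 - already_customers) * return_rate
cost_per_repeat = giveaway / new_repeat_customers

print(round(new_repeat_customers), round(cost_per_repeat, 2))
```

Under these made-up assumptions, 500 coupons yield only 45 new repeat customers, at a steep effective cost each; the deal only makes sense if those repeat customers are worth that much over time.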
Then there’s this whole idea on the back end:
Why is Groupon not merely a tech-bubble datum but a Ponzi scheme? Simple: Groupon has found that you can get local merchants to try anything once if it brings them new customers. A few local merchants in Chicago get them started, and Groupon shows good revenues. In fact, Groupon immediately remits half of those “revenues” back to the local merchant — they were never Groupon revenues in any meaningful sense of the word. But, optically, Groupon revenues look high — which they use to raise a financing round at a high valuation. Then they use the proceeds to hire vast armies of salespeople to dig deeper into Chicago’s local merchant community and repeat the trick in other cities.
This is so bad! I had never considered this take, but it really does make total sense. I’m not a big investor, but when I do put my money in companies, I’d like to know that they’re actually good companies and not killing off local businesses, which have a hard enough time surviving on their own.
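The revenue optics the quote describes are easy to reproduce with invented numbers: gross billings look big even though roughly half goes straight back out to the merchant.

```python
# Gross billings vs. net revenue, per the quoted critique. Numbers invented.
coupons_sold = 1000
deal_price = 20.0

gross_billings = coupons_sold * deal_price     # the headline-friendly figure
merchant_remit = gross_billings * 0.5          # immediately paid back to the merchant
net_revenue = gross_billings - merchant_remit  # what Groupon actually keeps

print(gross_billings, net_revenue)  # -> 20000.0 10000.0
```

Reporting the first number instead of the second is exactly the optical trick the quote is describing.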
We all vote with our dollars every single day, and it’s a sad day when we tell small, local businesses that we won’t buy from them unless they gut themselves in front of us. I refuse to be a part of that, and I think you should, too.
The announcement of Windows 8 and its subsequent demos were interesting for many reasons, not the least of which was its inevitable comparison to Lion, iOS, and Android. To be fair, Android should generally be left out of this comparison since it doesn’t have a true desktop operating system (yet), but periodic comparisons have to be made.
Windows 8’s user interface, at least the touch portion, looks good. I like the clean, muted-color aesthetic, and the transitions between apps look natural and pleasing (similar to the Star Trek LCARS aesthetic, but a little bit sharper). The way that Microsoft is pushing this thing, however, is silly to me. Recently, I wrote about Apple playing “catch-up” with this release, but something I failed to address was the simplicity of the whole experience, and this is ultimately the most important aspect of the entire OS.
I can wax poetic about the history of the personal computer and its role in our lives, the changes that personal computing has brought to our lives and how we experience the world around us, but it’s unnecessary. We all know that the face of personal computing is changing rapidly and being redefined constantly. Instead, let’s ask a fundamental question:
Why are computers becoming “simpler”?
I know lots of people who lament the “over-simplification” of today’s computers. In their opinion, computers should require lots of specialized knowledge and time to learn. They don’t understand that all the layers of schlock the operating system puts between a user and the task they wish to accomplish are unnecessary and silly. In order to build a spreadsheet in Microsoft Excel, for instance, I need to understand Microsoft Excel conventions and jargon on top of Microsoft Windows conventions and jargon. This is difficult, and it gets in the way of actually accomplishing things. Learning the tool takes up more time and energy than using it. This is bad technology, but it has come to be accepted.
Companies that design software continue to simplify it so that it gets out of the way of the user’s intentions and goals, so that it helps them accomplish what they need to accomplish. This is good technology, and has difficulty gaining traction because the software almost looks like a toy compared to the byzantine and grotesque software hydras that have become commonplace in the world.
I remember using my Palm Treo 700w, the Windows Mobile version of the popular Palm smartphone. It ran a custom build of Windows Mobile tuned for the Palm “experience”, which meant it worked better than stock Windows Mobile. I thought it was really great, but what really blew me away was a software shell by SPB called “SPB Mobile Shell”, a layer over the standard Windows Mobile OS that looked far better than the standard home screen. This was right around the time the iPhone was introduced, incidentally. I tried using this shell for a while because it was “touch-friendly” (where have we heard that before?), but I ultimately gave up because, under all the gloss and shine, the phone was still running a really ugly, really unusable OS designed for styli.
This is what Windows 8 is, and it’s flawed. The key is to change what the OS is at its heart, change the way it interacts with the user, change the way the OS feels. If the user gets the opportunity to “peek under the hood” of the OS, he or she will see the ugliest, most confusing parts of the system laid bare. The gears, cogs, oil, chugging engines…everything. In Windows 8, this is still confusing, ugly, and overcomplicated. In the Apple world, it is simpler, easier, uncomplicated. This is where Windows will fail. If I start to use an app that hasn’t been designed for the new “touch-friendly” shell (which is essentially all Windows 8 looks to be), it fires up in its old, byzantine, bloated-hydra form. With iOS and the future of MacOS, this isn’t even an option. If an app hasn’t been designed to be used full-screen, it doesn’t matter; it’s still usable in its beautiful, native form.
This brings me around to the initial question of “Why are computers becoming simpler?”
The computer-savvy elite that used to be the only folks for whom computers were intelligible are no longer the only ones who can use computers, and computers are becoming simpler because that’s what we all want. Even programmers, developers, and coders want everyone to use computers. Everyone! They want their software to shine on beautiful hardware, too, and we’re seeing this happen from the world’s most innovative companies. Microsoft, however, doesn’t seem to get it. They want their software to be archaic, opaque, and impenetrable when it comes to interacting with the user.
Maybe that’s why they keep losing.
I’ve been trying to digest the Apple news over the past few days in a way that would be meaningful, and it’s been difficult. Amidst all of the noise regarding unrevealed iOS 5 features, unrevealed Lion features, unicorns flying and granting wishes, and the future of all three, I was able to come up with a coherent thought that I think captures what I actually think about the future of mobile.
When Apple started getting serious about iOS, Google also started getting really serious about Android, and the divide that grew between the two has been significant. A lot of people get Android phones now because they’re “just like iPhones”, until they realize that their Android-powered device can’t do X (very rarely do I ever run into a situation that’s the other way around), or needs 20 steps to do Y. A few people get Android-powered phones because they want to do things that they “can’t” do with an iPhone. There will always be things that Android devices can do that iOS devices can’t, and vice versa, but that’s not the key metric here. What we have to be concerned about is whether those things actually make sense and are “doable” by the majority of users. In my opinion, they’re not. Most people don’t have the ability or the desire to root their phones, don’t want to dig into firmware files, don’t want to jailbreak their devices, don’t want to do all the stuff that advanced users (who tend to be the most vocal) use as ammunition against the competing platform. In the end, most users want to pick up the phone, send a few texts, make a few calls, hop on Facebook, and have fun doing it. Oh, and play games. That tends to be about it. Does this make me upset? Sure. I tend to use my stuff a little more, but hey, it’s not my phone.
As mentioned in the past, Apple is doing some neat stuff with their product reveals as of late. Apple is telling people how they work. This is important because yeah, it’s about the user experience (UX), but the reason you’ve got such a killer experience is because of all this hardware underneath, because of this glass, because of this epic battery. Apple is communicating that there’s a lot that goes into the design and production of each device, and that should make you feel good. You should look at all this stuff and feel like they made it for you, to fit your lifestyle, your aesthetics, your pocketbook.
So, that brings us to now. Apple unveils all these new things that are a part of its new iOS, and some people1 looked at all that and had a very meh response, saying that this release was more of a parity release, that it wasn’t really breaking any new ground. I continued to look at this iOS release, however, and I think I figured out why I feel so excited about it. Whenever Apple has released a new product or new version of their OS, Android users have always held it over Apple users’ heads that they’ve been able to do this for months or years or millennia or whatever. Now, they can’t do that. Now, a person deciding between iOS and Android is going to have to choose between The Real Thing and a knockoff. This is where we’re at, folks.
People used to walk into a store and have the sales associate give them a weighted assessment of iOS vs. Android which probably included that ridiculous “open” buzzword in there somewhere. What does “open” mean for the end user?2 I’ll let that one percolate for a bit.
Ultimately, “open” is just a word, a marketing tactic that has no meaning for the customer, for the actual user of the product. “Open” is only meaningful to the developer (and marginally, at that). For the customer, it’s meaningless, but it sounds good, like you’re sticking it to the man or something. For the baby boomer generation, this is great because they used to stick it to the man, and maybe it makes them feel good. But let’s extrapolate that out a little bit. Let’s say a person hears “open” and buys the Android phone because they think it farts rainbows or something. Now they think that everything they do is better, and the perceived benefits of using an “open” phone start to shine through. Until they see something running iOS. All of the things they thought were so great are also clearly on iOS, but they look better, respond better, feel better. Where’s “open” now? Where’s Android now? It’s just another cheap imitator.
A new iPad owner will be able to pop the top on their new iPad and start using it right away as his or her primary computer. There will be little to no configuration, and all iOS devices will be kept in sync. Apps will use iCloud, people will love the experience, and the whole thing will grow on its own. The Apple club is getting bigger, and the cost of entry is dropping like a rock. As highlighted by other writers, Apple is re-stating its devotion to being a hardware company, a mobile devices company, not a software company. Sure, Apple writes software, but only because its software sings on its devices.
For any other company, a software release that brings in features that others have had as “standard” for a little while would be “just” playing catch-up; for Apple, which designs software that is already powerful to the nth degree, “catching up” means creating almost unstoppable inertia.
1 I’m counting myself among those people.
2 I’ve been in carrier stores before, and listening to these floor guys try to explain it to the customer is hilarious. Listen in sometime and you’ll see what I mean.
Sometime back around 2005, Apple released this operating system called Tiger. It represented a huge leap over the previous operating system under the hood, but also represented a palpable shift in Apple’s software development that looked like it took cues from other places. At the time, I was using an app called Konfabulator to view “widgets” that existed on an invisible layer that I could invoke with a keystroke. It was really cool at the time, and I used it to store things like an iTunes player, weather, movie showtimes, and directories for local restaurants (even though I probably shouldn’t have been going out to eat as much). I loved it, and I thought, “Man, this is a really cool piece of software, but it doesn’t feel ‘at home’ on my Mac.” These thoughts would crop up periodically, but not often, and they didn’t get in the way of using the software. This software would eventually be acquired by Yahoo, and, like most things that Yahoo acquires, there isn’t much more to say about it now.
I also used a couple other pieces of software interchangeably called “LaunchBar” and “Quicksilver”, both of which were system-wide apps invoked by a simultaneous press of the Command and Spacebar keys. With Quicksilver, a customizable window would pop up in the center of the screen and, as the user started typing, would drill down through an indexed list of results to what the user wanted. Quicksilver was really powerful, and could be customized to run Google searches, play songs (by album, artist, playlist, etc.), locate images or albums, mail messages, and so on. LaunchBar was similar, but it lived in the top-right corner of the screen, a tiny bar that would drop down when invoked (using the same keystroke). Over time, I came to like Quicksilver more, and used it almost exclusively.
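That drill-down-as-you-type behavior is, at its core, subsequence matching against an indexed list: each keystroke narrows the results to items whose names contain the typed characters in order. A toy sketch (the app names are just examples):

```python
def matches(query, candidate):
    """True if the query's characters appear, in order, inside the candidate."""
    it = iter(candidate.lower())
    # `ch in it` consumes the iterator up to the match, enforcing order.
    return all(ch in it for ch in query.lower())


def drill_down(query, index):
    """Filter an indexed list the way a launcher narrows results as you type."""
    return [item for item in index if matches(query, item)]


index = ["Safari", "System Preferences", "iTunes", "Quicksilver"]
print(drill_down("sp", index))  # -> ['System Preferences']
print(drill_down("qs", index))  # -> ['Quicksilver']
```

Real launchers layer ranking, abbreviation learning, and plugin actions on top, but the narrowing loop is this simple at heart.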
Why am I writing about this obsolete software? Because of the manner in which it was cast into oblivion.
All of these pieces of software somehow found their features absorbed into and assimilated by MacOS. When one looks at MacOS now, features like “Dashboard”, “Spaces”, “Spotlight”, and “Exposé” are built right into the OS. There are even dedicated keys for these features on current Apple keyboards. Spotlight was also more than just an app launcher or search tool; it was a powerful, system-wide indexing engine that took everything on your computer and made it easy to find. That’s a huge win for people looking to ditch byzantine, overly-complicated file systems. When delivering the Tiger keynote, Steve Jobs said that these features were “built-in, not bolted-on.” And, while I agreed with him that this was a better way for most people to experience this software, I couldn’t help but feel awful for the developers who poured a lot of time and effort into making it, only to have it basically ganked by Apple. I was doing just fine with the software as-is, but there were lots of other folks out there who didn’t have it, people for whom Spotlight and Dashboard were going to be revolutionary. I knew that, too.
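A system-wide index like Spotlight’s is, at heart, an inverted index: map every token to the files that contain it, so a search is a single lookup instead of a scan of the whole disk. A toy version, with invented file contents:

```python
from collections import defaultdict


def build_index(files):
    """Map each word to the set of file names containing it (an inverted index)."""
    index = defaultdict(set)
    for name, text in files.items():
        for word in text.lower().split():
            index[word].add(name)
    return index


files = {
    "notes.txt": "meeting notes for the Tiger launch",
    "recipe.txt": "tiger prawn curry recipe",
    "todo.txt": "buy a new keyboard",
}
index = build_index(files)

print(sorted(index["tiger"]))  # -> ['notes.txt', 'recipe.txt']
```

The real engine also indexes metadata and updates incrementally as files change, but the lookup structure is the same idea.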
Now, we’re seeing this again, and I’m not sure how to feel. The first time I saw this happen, I had tuned my machine to my tastes with software that modified the way the system worked so that it suited me perfectly; there was almost nothing stock about it. This time, however, I’m not one of the pioneering souls on the bleeding edge of the jailbreak world. Rather, I’m in that big group of people who are waiting for Apple to release new stuff so they can get at what a daring few have had for a while. I’ll discuss a few of these features here.
Apple’s new Notification system is ripped directly from the jailbreak community’s developers. The concept was illustrated in a video not long ago by Andreas Hellqvist. Then we got wind that he got snapped up by Apple to develop their notifications. I’m not saying this is bad. In fact, good for him! He was hired by Apple because he had done some really great work. I’m saying that this is interesting because of the way Apple took cues from the non-Apple-sanctioned developer community, who always push hardware and software to their limits in order to make something unique and fun.
Safari now has tabs (yay!) and an integrated “Read Later” list. If this sounds familiar, you might already be a user of Marco Arment’s “Instapaper” service and app. It’s really fantastic, and it has been a staple of my iOS experience for a while…but only because there were very few better solutions. Apple’s own MobileMe service was less than stellar at syncing information between my iOS devices and my home computer (admittedly, it wasn’t awful), but it was cumbersome and not Apple-like in its fluidity and simplicity. It needed to be just…better. Instapaper filled that void on iOS, and lots of apps started including hooks for Instapaper so users could quickly send stuff over without too much fuss, which made everything feel more connected, like there was a synergy there. Now, I really don’t think I’m going to be doing much with Instapaper anymore, and that makes me feel a little sad.
In an interview with Marco, he talked about Instapaper’s functionality extending beyond the whole “read it later” thing, and how Apple building that functionality into its iOS devices isn’t really a threat, since there’s so much more there. After seeing what iOS 5 will have built-in, I’m not sure that many people will want to spring for extra services that aren’t tightly integrated or explicitly woven into the iOS experience. Actually, I don’t even think it’s a question of ponying up a few extra bucks for an app that’s really effective, I think that most people just won’t care.
This cropped up about a year ago, and made its rounds through various news sites. I’m not sure how many people still actively use this through the various iOS updates, but I know that the idea of wireless syncing appeals to a lot of people.
Walk through the door, take off your shoes and grab an apple to nosh on, and by the time you’re ready to settle down, just about anything that has needed to get to your device is there, synced and good to go. This sounds so great, but it reminds me of the early days of Bluetooth, using Jonas Salling’s “Salling Clicker” to control my old PowerBook G4’s music playback and volume. I also used to sync my Nokia 6600 using iSync over Bluetooth as well, and it all happened automatically. I’d walk through the door, head toward the fridge, and my Mac would pick up the Bluetooth signal and start playing music exactly where it had left off when I walked out the door.
What I’m saying is that these “new” technologies and features aren’t new at all, they’re updated versions of old ideas that used to be cobbled together from various bits of software that were made by all sorts of developers. The “newness” comes from being baked into the device on an OS level. Again, not bolted-on anymore.
There’s more, too. “Reminders”? Please, it’s a to-do list with geofencing. Hardly groundbreaking. Other apps (TextFree, Google Voice, WhatsApp) already do what iMessage does, so that’s not new, either.
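A geofenced reminder really does reduce to a distance check: fire when the device’s coordinates fall within some radius of a saved point. A rough sketch using the haversine formula; the coordinates, radius, and reminder fields are all invented for illustration:

```python
import math


def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (haversine)."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def should_fire(reminder, here, radius_m=100):
    """Trigger the reminder when the device is within radius_m of its location."""
    lat, lon = here
    return distance_m(reminder["lat"], reminder["lon"], lat, lon) <= radius_m


grocery = {"text": "buy milk", "lat": 41.5868, "lon": -93.6250}
print(should_fire(grocery, (41.5868, -93.6250)))  # at the store -> True
print(should_fire(grocery, (41.6000, -93.6250)))  # ~1.5 km away -> False
```

The hard part in practice isn’t the math; it’s getting the location fixes without draining the battery, which is why OS-level integration matters.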
So where’s the innovation? If you look at everything Apple’s done over the past few years, all the “progress” they’ve made and “innovation” they’ve done, you’ll come up with a long list of stuff that they didn’t really think of. So why does this company consistently do well if all they do is essentially take other people’s ideas?
I think the answer to that is this: they put all those little things together in a way that no one else can in a package that is an absolute joy to use. It doesn’t matter that I was able to control my Mac with my Nokia 6600 five years ago. What matters is that the same action feels a thousand times better now and is easier to the nth degree. Add to that the fact that there are hundreds of millions of these devices out there, and you can see that these small “pseudo-innovations” gain a whole lot of inertia.
There’s still more that I’d like to see from Apple, but I think they’ve done a stellar job with this one so far. Can’t wait to see where they go next.
On the verge of the announcement of iCloud, Lion, unicorns, iOS 5, and magic fairy dust, there’s an issue I wanted to dig into a little bit, something that the increasing use of cloud services is going to come into direct contact with: bandwidth.
I came across this article on TechCrunch and felt like I was in some sort of twilight zone. Here are a couple highlights:
The report encourages mobile networks to offer integrated rate plans, while “providing a wide range of segmented postpaid and postpaid tariffs”. It also puts stress on the potential for escalating revenues in the cloud, machine-to-machine, and mobile financial services where networks can leverage their existing assets.
The cause of the impending crunch? The report indicates that the dire situation is caused by market saturation and falling average revenue per user, which is causing core revenues to flatline, while the cost of handling mobile data traffic is skyrocketing. As one might expect from the growing use of smartphones (Nielsen predicts there may be 100 million by the end of the year), cellular data traffic doubled in 2010. And it’s expected to increase thirteen-fold by 2015.
If you’re half as confused as I am, then you’re plenty confused. What is all this jibba jabba? The rhetoric pitched to us by data providers (I’m lumping everyone together here, not just the mobile providers) is that “Bandwidth is getting more expensive, guys! You’re using a lot of data, and that’s really expensive!” I don’t buy it.

The thing is, all these internet providers, whether they’re mobile or otherwise, have seen this coming for a long time. The writing was on the wall back when WAP was the only way to get up-to-the-minute weather and MOVIE TIMES IN YOUR AREA. All these carriers and ISPs knew that people were hungry for more information, and so they started developing new ways to deliver bits and bytes to people’s eyeballs. The problem is that, as the number of subscribers grows, they aren’t making as much per person because people are getting wiser.

Coffee shops offering free wifi and libraries with computer access have become the de facto standard. Even grocery stores offer free wifi these days. As in, I walked into a local Jewel and was greeted with a giant sign telling me that the whole store was blanketed in sweet, sweet 802.11b/g. That’s fantastic! I got the same thing when I walked into Nordstrom’s. If offering internet access were so expensive, you can bet that these businesses wouldn’t be offering it for free to anyone who wants it. “Come in and get you some interwebs here” has become this decade’s “FREE TOASTER” promotion. The toasters, as we all know, were cheap. Dirt cheap. People felt like they were getting a good deal because, hey, free toaster. Who doesn’t like toast? I don’t know of anyone. Who doesn’t like free wifi? *crickets*
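It’s worth pausing on how aggressive that thirteen-fold projection really is. A quick back-of-the-envelope sketch (assuming the growth is spread evenly across the five years from 2010 to 2015) shows the compound annual rate it implies:

```python
# Implied compound annual growth rate if mobile data traffic
# grows thirteen-fold between 2010 and 2015 (five years).
growth_factor = 13
years = 5

annual_rate = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {annual_rate:.0%}")
```

That works out to roughly 67% growth per year, every year, for five years running. Whether the infrastructure costs actually scale at anything like that rate is exactly the question the carriers would rather we not ask.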
Then there’s this regarding a recent dig into internet pricing in Canada:
Assuming an inflated cost of 10 cents per gigabyte, it means that Bell, Shaw and Rogers are charging consumers between 10 and 50 times what it costs them to deliver data. This on top of their regular monthly Internet pricing! While I agree that heavy users should be prepared to pay more once they have reached their bandwidth caps, a fair price would be much closer to 10 cents per GB than the inflated $1-to-$5-per-gigabyte charge sanctioned by the CRTC.
The argument that the exponential growth in Internet usage is the primary reason for higher prices is a seductive one. However, it ignores the fact that the technology that drives the Internet has become more powerful and much cheaper in the past decade.
So, exactly what I’ve been saying. Our infrastructure is evolving incredibly rapidly, and companies continue to use outdated arguments and backwards-facing data to justify rate hikes and consumer-hostile practices. And this isn’t an abstract problem. I talked to a friend recently about mobile internet usage, and how astronomical my usage is each month; he couldn’t understand it. Read my previous post “The Cap’s the Limit” on one of the possible models for Apple’s “music locker” service, and you can see how this is incredibly consumer-hostile. Not on Apple’s part, mind you, on the part of the carriers and ISPs. They’re the ones that see the future of media being cloud-centric, and they want to sink their teeth into more money, despite the fact that they’re already making plenty. Correction: they want more profit. Even so, I have a hard time believing that ISPs will be hurting from the profit angle.
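The markup figures in that Canadian pricing piece are easy to verify. A quick sketch, taking the quoted numbers at face value (the $0.10-per-GB delivery cost is the article’s estimate, not something I’ve measured):

```python
# Sanity-check the markup figures from the Canadian pricing piece.
# Assumed inputs, taken straight from the quoted article:
cost_per_gb = 0.10                       # estimated delivery cost, $/GB
overage_low, overage_high = 1.00, 5.00   # CRTC-sanctioned overage charges, $/GB

markup_low = overage_low / cost_per_gb
markup_high = overage_high / cost_per_gb

print(f"Markup range: {markup_low:.0f}x to {markup_high:.0f}x")
```

That’s the 10-to-50-times figure, straight from the division. And remember, that overage charge sits on top of the regular monthly bill, which presumably already covers the cost of the pipe.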
At any rate, this is going to come to a head sooner rather than later, and I hope we have some strong government officials on our side to help us fight off the behemoth businesses that clearly see themselves as big enough to make whatever rules they want at the expense of the consumer.