For a while now, I’ve been using an app called Audiogalaxy to reach my music library at home, giving me access to over 100 gigs of music to supplement whatever tracks I have synced to my iPhone/iPad. It’s fantastic, mostly because I know two things:
- I have music on my iPhone that I can listen to anywhere, regardless of whether I have a data connection or not.
- I can, with a data connection, get access to my huge music library.
The recently-uncovered Apple patent application is simultaneously awesome and horrific for a few reasons, all of which have to do with that second point.
One of the most explosive and formative things to happen to America recently is the widespread adoption of mobile data and internet usage. As I’ve discussed before, the mobile telecom providers have used this to push their agendas and create an awful dystopian future that the American wireless subscriber is going to end up paying dearly for. It’s going to be ugly, folks. Get ready for a future of as-yet-unwritten, disgusting rates driven by AT&T’s greed.
If you think this reaction is a bit overblown, let’s dissect the groundwork that needs to be in place for a person to listen to music with Apple’s new system. A person would need:
- A computer running iTunes (for syncing purposes). This is pretty much standard, and shouldn’t come as a surprise to anyone.
- An iOS device with a data connection. Not everyone wants to or can run a persistent data connection. iPod touch devices are reliant on wifi, and people with the lower-tier AT&T or Verizon data plans (250 MB for $15.00/month, in AT&T’s case) may not be comfortable with a service that sucks up data every time they want to listen to a song.
- Possibly: the above computer with a persistent connection to the internet. This is a variable, and the future is hazy here. Depending on how the whole “Music Locker” thing will work, or how the Mac OS X Lion Server features are structured, this may or may not be necessary. We’ll see.
Let’s assume that a person has an iPhone on AT&T with the $15.00/month, 250 MB data plan. We don’t know how much of each song will be synced to the iOS device, but let’s assume it’s about 30% of each song, to allow ample buffering time. We could then “fit” roughly three times as many songs on the iOS device, thanks to the reduced footprint of each song in the device’s memory. The remaining portion of each song would be pulled from a cloud. I say “a” cloud because it’s unclear whether that cloud will be the individual’s computer or this “Music Locker” service.

Let’s assume it will come from the person’s computer, so as not to incur any additional fees (yet). The computer will have to be on in order to access the library data, which means extra power demand and a load on the person’s home internet usage (we’re also assuming home internet is capped, which, despite some companies claiming their data is “unlimited,” is most likely the case). The data usage through a home internet connection is most likely insignificant, especially relative to a theoretical cap of 50–250 GB. Relative to a mobile connection with a 250 MB cap, however, the proposed data usage is significant: a day’s worth of listening could potentially eat up all of a person’s monthly data before they have to pony up for the higher $25.00/month, 2 GB plan.
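To make the back-of-envelope math concrete, here’s a quick sketch. The ~4 MB song size and the 30/70 local/streamed split are my assumptions (the patent doesn’t specify either), so treat the numbers as illustrative, not as anything Apple has announced:

```python
# Back-of-envelope math for the hybrid sync/stream scheme described above.
# Assumptions (not from Apple's patent): ~4 MB per full song (roughly a
# 4-minute track at 128 kbps), 30% of each song synced locally, the rest
# streamed over the air every time the song is played.

SONG_MB = 4.0          # average full song size, MB (assumption)
LOCAL_FRACTION = 0.30  # portion synced to the device
STREAMED_MB = SONG_MB * (1 - LOCAL_FRACTION)  # pulled over the air per play

DATA_CAP_MB = 250.0    # AT&T's $15/month DataPlus cap

# How many song plays before music alone exhausts the monthly cap?
plays_to_cap = DATA_CAP_MB / STREAMED_MB
print(f"~{STREAMED_MB:.1f} MB streamed per play")
print(f"~{plays_to_cap:.0f} plays to hit the 250 MB cap")
```

Under these assumptions, fewer than a hundred plays exhausts the whole month’s allowance — a day or two of steady listening.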
Did you catch that? Let’s look at it again.
The folks who want to use this feature will be streaming data every single time they listen to music. The amount of data involved is unclear, but I predict that listening for a prolonged period (even a few hours a day) will cut deeply into or completely use up a person’s data for the month (again, assuming the cheaper $15.00/month, 250 MB plan). Even on a 2 GB plan, monthly usage can quickly skyrocket, shooting people dangerously close to the ceiling of their plan. I use about 1.5 GB/month right now with occasional use of Audiogalaxy to get at my home library. If I switched to a model that used data every single time I played a song, I’d find myself breaking that 2 GB barrier on a monthly basis, which would cost me more money.
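A rough sketch of how quickly per-play streaming adds up over a month, assuming a 128 kbps stream and the 70% remote portion from above (both assumptions on my part, not figures from Apple or AT&T):

```python
# Rough monthly-usage estimate for streaming the unsynced 70% of each song.
# Bitrate and listening habits are assumptions, not measured figures.

BITRATE_KBPS = 128        # assumed stream bitrate
STREAMED_FRACTION = 0.70  # portion of each song pulled over the air
HOURS_PER_DAY = 2         # a modest daily listening habit
DAYS_PER_MONTH = 30

mb_per_hour = BITRATE_KBPS / 8 * 3600 / 1024  # kbps -> MB per hour
monthly_mb = mb_per_hour * STREAMED_FRACTION * HOURS_PER_DAY * DAYS_PER_MONTH
print(f"~{monthly_mb / 1024:.1f} GB/month from music alone")
```

Even this modest two-hours-a-day habit lands above the 2 GB tier before any browsing, email, or maps.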
AT&T and Verizon made a long-term move here, and we’re staring it in the face right now. Back when AT&T first introduced tiered data pricing, I could see the act was predatory. More and more services are being pushed online, to the cloud, and so forth; what AT&T did was squeeze the pipes before the water started flowing. Netflix is growing in popularity and capability, and its long-term dominance of the mobile media marketplace (I love alliteration!), while not guaranteed, is just shy of certain. How are we going to watch movies on our mobile devices if we’re being pinched on bandwidth to do so? How will companies innovate if they know they’ll be dealing with hamstrung devices? People are going to be paying for subscription services and for the bandwidth it takes to use them: a double whammy. The outlook doesn’t look good.
Boy am I glad I got that unlimited Clear iSpot subscription while it was still around.
so i had a conversation with a friend of mine the other day regarding all this hullabaloo with google’s forays into the fiber market and how they’d like to bring superfast 1000 Mbit/s data into the home. i think that’s a great idea…but there are some shortcomings to that plan (which i’m sure google is thinking about).
even if they’re thinking about it, i’m still gonna talk about it.
so google said something recently about how it wants to make the web faster. i think that’s a great idea. now, they’ve moved beyond the theoretical “let’s try real hard to make stuff more efficient” into the “let’s just GO FAST” realm. i’m not sure if i think that’s the best way for them to be using their might.
the internet is a vast sea of stuff, right? getting access to this stuff takes bandwidth, and having all this data served up to your eyeballs and earholes is what so many telcos make their money off of. doing the same thing is what makes google a ton of money, as well. this leads them to the interesting position of having a distinct interest in making sure lots and lots and lots of data gets into your head as quickly as possible. basically every time you use the internet, you’re making google some money, so it makes sense that they’d want you to use it MOAR.
they also have the right idea in serving up data instead of creating more programs and applications. we’ve had fast computers for a while, and they keep getting faster. the problem is that they always feel slow, since the code being written for them is trying to take advantage of the new horsepower. you get more complex code, more operations per second, and the overall experience doesn’t change, even after you shell out tons of cash for a new rig to browse the internet. this is really bad. we get locked into a cycle of buying new stuff just so we can run an upgraded version of the same program we had last week, only now it does more, so it needs more power.
at what point do we hit saturation?
google says now.
really, we don’t need more powerful programs and applications, we need more data. this is important, since the applications we have now can do everything we need them to do, if we can just get them the data fast enough. you can also leverage the power of supercomputing clusters around the country to take care of calculations and operations that would make your dream machine at home cry: pass them huge chunks of raw data, tell them “here, do something with this,” and they’ll say, “ok!”
all that is so awesome! but…it’s sorta limited in the same way the current internet is sorta limited. right now, true broadband is basically phone booth-style access: you go home, or to work, or to a coffee shop, and your internet is fast in those places because they have landline connections to the ISPs. if you want mobile internet, you suffer through “3g” service from your mobile provider, or go with someone like clear or sprint for 4g. in most cases, both of these “solutions” are really stopgap measures, since they don’t provide the coverage a truly mobile solution does. sure, i could walk into a clear store and walk out with the ability to log onto my gmail from anywhere in chicago…but what if i wanted to visit some friends in wisconsin? what if i had to drive to southern illinois for work? i’d be out of luck. not truly mobile, and not truly broadband, but somewhere in between, really.
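just to put rough numbers on that gap (the speeds below are ballpark guesses for circa-2011 service tiers, not measurements), here’s how long a standard-def movie would take to pull down at each level:

```python
# rough time to pull a ~700 MB standard-def movie at different connection
# speeds. the speed figures are ballpark assumptions for circa-2011 tiers.

FILE_MB = 700  # approximate size of a standard-definition movie

speeds_mbps = {
    "3g (typical)": 1,
    "4g (clear/sprint)": 6,
    "cable broadband": 20,
    "google fiber": 1000,
}

for name, mbps in speeds_mbps.items():
    seconds = FILE_MB * 8 / mbps  # MB -> megabits, divided by megabits/sec
    print(f"{name:>18}: {seconds / 60:.1f} min")
```

the fiber number is absurdly good, but you only get it standing next to the wall jack — which is exactly the phone-booth problem.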
this is where google should be focusing. the current state of this data fetching is unreliable because our infrastructure lacks consistency. i may be able to get great reception when i’m at home, but i’d rather have great reception when i go to my doctor’s office on the fourth floor of a small office building. is that too much to ask? how about if i’m on the subway? at a mall?
this is where the future needs to be. it’s one thing to have a person at home, browsing at lightning fast speeds, but it’s another to be able to have a similar experience while walking down the street checking stocks or watching a movie. at some point, a person hits their limit of how much data they can absorb simultaneously. even right now, i’m not trying to load 20+ pages simultaneously. loading one or two as i think of new ideas is pretty common, but by the time i’m done typing in the query for the second page, the first has already loaded. granted, my usage may not be typical, but it’s not so far out of left field that one could call me a “power user.”
so google, if you’re listening, focus on the mobile space (like you said you would). forget fiber, give me the ability to access your pages from everywhere, and i think we’ll have a mutually beneficial relationship.