Like “Eagle Eye,” But Nicer


Last year, a number of industry powerhouses gathered at a conference to discuss the future of MEMS (microelectromechanical systems): devices that will be integrated into all aspects of our lives, gather formerly impossible-to-gather data, and generally open doors to a future that we can only imagine (and possibly aren’t equipped to imagine yet). When I read about these advancements and dreams, I couldn’t help but think about the incessant march of progress toward the world of tomorrow.

I recently read an interview with the founder of Foursquare, who was talking about recent advances in smartphone technology and how his business (and others) could leverage not-yet-available sensing technology to create a device that acts more as a companion than a communication device, something that could keep a person aware of his or her surroundings through better use of the device’s built-in sensors.

“If you think of the phone as a bunch of sensors stuck in this device connected to the network, how can I walk around the city and have the phone come alive and remind me, ‘Oh, this is a place you should go to lunch’ or ‘this is the place you read an article about six months ago’?”

The thing is, this really isn’t anything new. Intel CEO Paul Otellini has been talking about this type of integration for years, and we see the idea crop up again here through Intel CTO Justin Rattner’s use of the phrase “context-aware computing.”

The thing is, people get sort of jittery when ideas like this start popping up in the news. No one really knows what the future will look like, but they do know that they don’t want some Orwellian “Big Brother” watching every move they make and cataloguing all of their habits. The problem with that sort of paranoia, however, is that it’s already happening, and we’re actually glad about it. I’m no conspiracy theorist, and I don’t subscribe to the whole “Internet privacy” thing; I know it’s an illusion. My point is that I would really like a device that’s totally integrated into my life. I would like my phone to pipe up and say, “Hey Paul, I know you like gardens, and there’s a really neat garden around the corner from here.” Does that violate my privacy? No, not really, because I told my phone to let me know about things like that. Does that open the door for someone (or something) to track my habits, movements, and preferences?

Not any more than they do right now. The point here, without sounding too kooky, is that in order for devices to reach the next level of usefulness, a deeper level of integration into our lives, we will need to realize that all the data we’ve been providing to corporations, marketing agencies, and *gasp* the government can actually be used to make our lives a little more pleasant. That sounds kinda nice to me.



The Living Room Takeover

A tiny black monolith of wonder.

A long time ago, in a convention hall not so very far away, Apple introduced a product intended to revolutionize the living room viewing experience. It was one of the neatest things that most people had seen happen to the TV in a while, and there were a lot of people who were impressed by what it was capable of. It was loaded with all sorts of storage (a lot for the time, at least), and offered a novel way to get your media from your computer onto your TV. The sad thing is, it didn’t really take off the way other Apple products did. People liked it well enough, and it sold decently, but it wasn’t the hot ticket item that people were scrambling to pick up. That honor generally belongs to the iPhone, and now, the iPad. It was a little too pricey for what it offered, and most people probably felt like the Apple TV was a sideline player.

Fast forward to September 2010, and we see a renewed focus in Apple’s efforts; their so-called “hobby” suddenly has a brand-new face, has lost a ton of weight, and can do basically the same stuff without all the baggage. More, actually. Some people still asked “Why?”, but for $99 it was hard to argue against it. Those people (myself included) just went ahead and picked one up to find out what all the fuss was about.

I can tell you right off the bat that I love my Apple TV, but not for the reasons one might expect. I don’t love it because it makes watching movies really enjoyable (it does) or because my family can see all the new pictures I just imported from my camera on the TV, or because I can stream that awesome YouTube video I’m watching right to the TV seamlessly. All those things are great, sure, but what really got me excited is what the little black box represents.

Some folks have already jury-rigged a console experience into the iPad/iPhone/Apple TV. Even before that, however, before the 2nd generation Apple TV rolled out, there were reports that it would run some version of iOS. Ultimately, iOS under the hood really only exists in order to open the door to apps. With apps come developers, innovation (and, depending on the level of the APIs, usually some griping), and new software ecosystems. With iOS under the hood, we will eventually enjoy apps that talk to each other seamlessly, network invisibly, and build off of each other in synergy. That’s what got me excited.

John Gruber has a great take on the whole thing:

I think I see what Apple is trying to do with the App Store, and the potential upside for the company is tremendous. They’re carving out a new territory between the game consoles (tight control over content and experience) and computers (large number of titles, open to development from anyone). Think of the iPhone and iPad as app consoles. (Consider too, the possibility of an all-new iPhone OS-based Apple TV. TV apps! Using iPhones and iPads as controllers.)

So, basically what I just said.

The key here is that Apple would be competing against veritable giants in this space, companies with years and years of experience creating behemoth machines designed for lifespans that fill the better part of a decade.  These consoles are powerful, multi-role devices that have also taken on increasing cultural significance as gaming moves further into mainstream culture.  Contrast that with Apple’s predictable and consistent release cycle, which, on the one hand, allows the company to react quickly to shifts in the marketplace but, on the other hand, sometimes leaves customers feeling alienated.

While I tend to side more with the stability and long development cycle characteristic of current-gen consoles, Apple’s move into this space may also spur more innovation and force the current trifecta (Sony, Nintendo, and Microsoft) to think of things that Apple hasn’t.  Sony’s current offerings (PS3, PSP) are great, but lack synergy.  If there’s anything Apple can nail, it’s synergy, and those big three will have to work hard to integrate their home consoles with other services and devices if they want to offer the consumer more value.  Developers’ successes have translated smoothly from the mobile space to the living room (see Angry Birds and Dungeon Hunter), and Apple sees itself uniquely positioned to make use of that transition.

Think about it: if a developer crafts a successful, top-selling title for iOS, Apple wants to make sure that a player who wants to enjoy that same experience in the living room with three friends can do just that.  Apple doesn’t want that developer transitioning to another platform, and it doesn’t want people spending their money on other people’s hardware, either.  Why buy the PS4 or Xbox 720, four controllers, and whatever other magic peripherals they offer for the primary purpose of playing games, when a person can instead purchase an Apple TV and iOS devices for the whole family, and be simultaneously buying a game console and its input devices?

Let’s take it a step further.  Ever heard of OnLive?  Ever seen their game console?  Does that seem familiar to you?  OnLive’s servers stream games from the cloud to your TV, so you can play super high-quality games over a broadband connection.  Apple just built a mammoth data center, purportedly for iTunes and MobileMe.  Let’s think a little further, here.  Apple is also more focused on social than ever, and it wouldn’t seem too far-fetched to use Apple’s newly-introduced Game Center to pull all their iOS users together into a platform not unlike PSN or Xbox Live.  Add to that all the success that more casual titles have seen, and it seems elementary that Apple would take steps in this direction.

I don’t know what gaming in Apple’s ecosystem will look or feel like, but I have a strong suspicion that the war for the living room is just heating up.


Tweeting Twilight

For a recent assignment in one of my classes, I was tasked with uncovering and exploring an issue trending in the discussion of Young Adult Literature.  I could have found plenty of topics relating to the overuse of certain character archetypes or the efficacy of having a profit-driven publishing industry decide what is best for kids to read (books are written for girls because more girls are reading.  You’d think that if someone wrote a book for guys, more guys would read?  Pish posh, that doesn’t make us money).  Instead, I decided to do what I do best: look at recent trends in technology and articulate their effects on society.  I love looking at the evolution of tech and the way it’s been changing our world, and I’m exploring more and more ways of using it to the benefit of kids in the classroom.  I also happen to love books and reading the exciting stories in YAL.

My initial idea was good, but limited.  There are plenty of folks out there already exploring the integration of social media and the modern classroom, and I’d be lying if I said I wasn’t already considering the effect that Twitter will have on Shakespeare.  There are, however, better ways to use these phenomena of social networking and social media to increase literacy and involvement in literature.  We always think of “technology” as shiny, expensive objects mostly intended for a specific audience.  The fact is that “technology” is everywhere.  Understanding how the mind works, how people react to different social stimuli, how societies react to changing world conditions: all of these are technologies that we can leverage to help kids read.  In this case, in this post, I’m not concerned with the latest Apple product, but rather with the utilization of our collective human experience to create a better English classroom.

Recently, I had the pleasure of meeting Josh Elder, author of Mail Order Ninja, hearing him speak about the use of comics and graphic novels in the classroom, and grilling him about the possible perils and pleasures of putting this unique form of literature in front of this country’s young minds.  Josh makes some good points, and I’d like to focus on just a few for the purpose of my arguments here.  Josh opened by establishing the graphic novel’s place in the landscape of literature, namely that graphic novels and comics are the landscape.  Prose, in his view, is a wholly subsumed subsidiary of the experience of a comic.  If you add pictures to words, it becomes a comic.  If you remove words from a comic, you still have…a comic.  This is important because we are used to reading certain kinds of literature in certain ways.  Comics and graphic novels demand new skills from us, a new way of digesting information.  The world of pictures and text is also one that gives us the ability to give the gift of literature to a much wider audience.  Authors want that, teachers want that, and more people want that every day.  Why, then, does literature have to be confined to prose?

There are kids who may have wanted to read at one point, but are now living in a state of fear.  These kids started out with their classes, learned their alphabet, began to piece some sentences together, and, at some point, hit a wall.  In some cases, these kids may have even missed the whole alphabet thing.  A friend of mine had the opportunity to participate in City Year not so very long ago, and would tell me stories about his experiences.  He told me some heartbreaking stories of kids who desperately wanted to read so that they could feel better about themselves, feel like they were moving forward and learning something.  Sometimes these kids could barely read, falling behind in simple texts and books far below their grade level.  In some cases, they were even having trouble identifying letters of the alphabet.  One story he told me involved a student who could only recognize two letters.  When I consider my upbringing, and the stress my parents placed on getting a good education, this story is absolutely astonishing to me.  Two letters.  How can a person find any measure of happiness when they are constantly bombarded by symbols and signs they simply cannot recognize?  Is that a quality life?  It’s no wonder that so many kids become violent when they’re literally assaulted every day with reminders of their own inadequacy.

Interestingly enough, there are things they can recognize, but mainstream culture tells us that these things have no value when it comes to education.  They can feel music, understand movies and the plots contained therein.  With a little bit of digging, I’m sure they’d be able to identify and articulate abstract concepts that the intelligentsia believe themselves to have a monopoly on.  Movies, music, and comic books/graphic novels communicate in a language that we do not have to learn.  They can be a way for us to understand things that we have no first-hand experience with, no empirical evidence of.  The theory of multiple intelligences tells us that people can learn in a variety of ways, and that there are many ways to teach any type of subject matter.  A good teacher needs to recognize this.  We, as a society, still hammer home this idea that literacy only happens one way – with prose.  If a student has a difficult time understanding what they’re reading, or if they reach a point in their education where reading becomes more of a stressor than a means of conveying information, we need to find a way to teach this student and make sure he or she understands what he or she is learning.  If educators (and I place myself in this category) do not find a way to teach this student, we have failed.

Let’s bridge this over to the tech space.  What browser are you using right now?  I can guarantee you that there’s someone near you right now who is using a different browser, yet they can view this information the same way you can.  Underneath each and every single web page is a mountain of code, a language that you most likely have never learned, may not recognize, and may never even have seen.  Yet you’re looking at this language expressed in a way that you can digest.  Are you tracking me here?  The web is insanely complicated, and developers are constantly striving to simplify the way we interact with it.  They’re trying to see what we want to do, not giving us another hurdle to overcome.  What’s important to these developers is that you receive what they’re putting out into the world.  That was the entire purpose of language, of literacy, of printing books.  Somehow, though, we got stuck on the idea that the written (or printed) word was where the buck stopped.  Our world is packed with so many forms of communication, and more are being discovered all the time.  Developers are scrambling over each other to be the first to utilize these new technologies to deliver content to the end user.

Someone please explain to me why we’re not taking the same approach to education.

There are kids in classrooms who are staring at pages in books the same way you’d stare at the nearly infinite amount of code running the page you’re reading right now, thinking to themselves, “I wonder what this all means?  I wonder what it would look like if I could see it?”  They know there’s something there, and they want access to it!  There’s something in the way, though: this singular approach to literacy that we have adopted as a society.  We know this, we understand it, but by constantly perpetuating the same memes in education, we’re telling them, “Look, this just isn’t for you.”

It seems counterproductive, doesn’t it?  Let’s fix it.