I think Iowa has the right idea:
Unlike other schools that plop computers on a student’s desk and walk away, Carver did away with traditional paper-based learning and actively used the laptops in a new digital curriculum
I’ve seen these non-traditional, “progressive” methods pushed on students, and they’re usually awful, mostly because the administration doesn’t get technology. They think they can just throw some iPads or netbooks at students and everything will be hunky-dory. Typically, that fails so miserably it’s not even funny. Teachers have to spend 10-15 minutes per class period just troubleshooting tech problems that they’re ill-prepared for or have no patience for. The majority of these teachers were over a decade into their careers, some nearing two decades, and some nearing retirement.
Across the board, the issue was that they were given little to no guidance as to how to integrate these new technologies into their classroom. Furthermore, a digital classroom needs to have a curriculum that moves quickly and takes advantage of the technology so that the teacher is engaging students, creating opportunities for them to think and synthesize information. Without that, progress becomes an illusion.
The rumors of an Apple-branded HDTV have been around for a long time (although perhaps not as long as the rumors of the iconic iPhone). For many reasons and for many years, this didn’t make sense. An Apple-branded phone once seemed ludicrous, too, since so many other companies controlled the market in terms of handset design, technology, carrier availability, etc. Apple had no leverage; they were just getting their feet under them after an almost-inevitable downfall, and the highly exclusive nature of their products kept them from being seen as competitive in the marketplace. Then they started designing their own hardware, coupled it with some amazing software, and all of that changed.
Now, the world looks to Apple for guidance on just about everything.
Now, we’re seeing the same thing with TVs, and it smacks of WebOS.
One of the big announcements that came out of the HP Think Beyond event was that webOS will be shipping on every PC and laptop they sell, and on some printers, by the end of this year. We have pondered what that will do to the scale of webOS and how HP would implement it.
To be honest, I don’t know how this is going to play out, but it looks like these companies want to get their OS into everything in your home. I think the idea here is to have a network of appliances, devices, and screens that are discoverable and OS-aware, meaning that they can sniff out other devices/appliances on the network and interface with them. A person might be able to control his or her washing machine with a phone, or monitor the state of the vegetables in the refrigerator by glancing at a widget in the dock of a tablet, or activate a Roomba to clean the floors while he or she is away. The more devices run your flavor of OS, the more is possible on the network. Naturally, this might also lead to Skynet, but whatever.
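To make the idea concrete, here’s a toy sketch of what “discoverable, OS-aware” devices might look like. Everything in it is hypothetical (the class names, the device types, the commands); it’s not any real HP or Apple API, just an illustration of one device sniffing out its peers on a shared network and sending them commands:

```python
# Toy model of a home network where devices announce themselves and
# any peer can discover them and interface with them. The discovery
# layer stands in for something like mDNS/Bonjour on a real network.

class HomeNetwork:
    """Stand-in for the local network's discovery layer."""
    def __init__(self):
        self.devices = {}

    def announce(self, device):
        # A device joining the network registers itself.
        self.devices[device.name] = device

    def discover(self, kind=None):
        # Any peer can ask "what's out there?", optionally filtered by type.
        return [d for d in self.devices.values()
                if kind is None or d.kind == kind]

class Device:
    def __init__(self, name, kind, network):
        self.name, self.kind = name, kind
        network.announce(self)

    def send(self, command):
        # In a real system this would be a network call; here it just echoes.
        return f"{self.name}: {command} acknowledged"

net = HomeNetwork()
Device("washer", "appliance", net)
Device("fridge", "appliance", net)
phone = Device("phone", "handset", net)

# The phone "sniffs out" the appliances and starts a wash cycle remotely.
appliances = net.discover(kind="appliance")
print([d.name for d in appliances])           # ['washer', 'fridge']
print(appliances[0].send("start wash cycle"))
```

The interesting part is that the phone needs no prior configuration: it just asks the network what’s there. That’s the whole pitch of a shared OS across appliances.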
It’s the “home of the future”, and it started with your phone.
There are lots of approaches to mobile OS development these days. Some folks are closed, some are open, some are really stable, others risk stability for customizability, etc. There’s one OS that I’ve always admired for its cleanliness, integration, and overall user experience, and that’s WebOS. The underdog of the “Mobile OS Wars”, WebOS did a lot of really neat things that its competitors simply didn’t want to do or didn’t want to risk because they had too much on the table. Palm, which developed WebOS, had little to lose, and they bet most of it on WebOS. Some could argue that they lost that fight, but I think HP’s recent acquisition of WebOS was a phenomenal idea and a great long-term strategy. I get what happened there, and HP makes a good argument as to why they did it.
HP currently sells a lot of computers, possibly more than any other company, but they’ve long lacked any real brand identity. They’ve been plagued by the same problem just about every other computer company has had: reliance on Microsoft. When netbooks took the stage just prior to the tablet revolution, a lot of companies, HP included, tried to get on board with custom Linux builds that were easy to put together and load because they included no licensing fees. The problem was that the OS conventions didn’t quite carry over, and the software that people were expecting to run, well…it didn’t. For the computer-savvy individual, this didn’t matter much. For the older folks looking to get their first computer because it was “cute”, netbooks were a disaster.
So, HP’s acquisition of Palm and all of its resources, WebOS included, was smart. This is going to allow HP to finally create an OS that can be associated synonymously with the HP brand. I imagine that WebOS will eventually have its name changed as well, but it’s sticking for now.
What I also applaud is HP’s apparent commitment to bringing WebOS to everything, possibly including refrigerators. This is where things start to get really interesting, because these devices can (theoretically) be aware of each other and work together right out of the box (in much the same way Apple is building iOS into other things), or gain new features down the road that weren’t available at release.
The other interesting aspect of HP’s decision is the recognition that their new OS isn’t just a “touch-friendly” shell layered over an ugly set of insides; it’s a unique experience designed from the ground up on an OS that can scale to fit everything from phones to high-powered desktop boxes. Eventually, if the developer community is rich enough, people will realize that these custom OS builds are actually the way people want to work, not some of this crud.
Being a gamer who reads game-related news, I was a little surprised at this article from Rock, Paper, Shotgun, which talked about a shift in policy on EA’s part regarding the marketing and sales of games on Steam:
HMM. This demonstrates incredible confidence in EA’s own brands, but the key back foot they’re on is that they don’t have any other publishers they can bring on board. What would change everything in the war against Steam is if the other major publishers launched their own Origin-like services and restricted their download sales to those. I won’t be at all surprised if that happens, as a few are quietly building the infrastructure – THQ have a store, Ubisoft have that uPlay thing, Blizzard obviously sell their own digital stuff direct… You could even see Call of Duty: Elite as heading vaguely in that direction.
It feels like this is a trend that’s moving very quickly. When we see artists, developers, etc. selling their own stuff without a store or aggregation service to market their wares for them, we enter into a different kind of relationship with the creator: it’s more one-to-one, as opposed to one separated by the rift of the store.
Before, people would go to a single place to find stuff. This method of curation led people to associate their buying and their consumption with a place, a store, an entity somewhat divorced from the source of the goods. This association is misleading, and can be frustrating for a customer because they don’t necessarily know where their stuff is from. It also robs people of creativity and imagination.
Now, with the proliferation of creators on the internet, there’s an increasing emphasis on discovery. That means that people need to be more self-aware and understand their wants and likes more. It also means that the creators have to have more clout since no one is doing their marketing for them. Either that, or a lot of really awesome relationships to build on.
This reminds me of Trent Reznor’s recent push into digital publishing:
Like a more magnanimous Radiohead, Reznor’s called into question the major-label reserve clause for established, profitable musicians by not just coming up with a new way to monetize music, but just giving it away for free, no strings attached. Instead of “tip-jar,” it’s “this one’s on me.”
and, of course, there’s always the “original” self-released album:
This is a hint of things to come. Over time more artists will decide to self-release music in this fashion, thus creating long, staggered release windows that place serious fans first and more casual fans further back in line. Traditional retail must wait in line, too. That means service companies that provide the tools and expertise for the online self-release of albums will benefit from this self-release strategy while the second wave of consumers are left to retailers.
What remains to be seen is if self-publishing will win out over a curated experience like the various “App Stores” that are cropping up all over the place. Clearly, if a developer or creator of something wants all the money, they’re going to have to sell it themselves. If they want maximum exposure, they have to give a little of that up to be on one of these stores. This will be interesting to watch, for sure. Will we see increasing fragmentation or consolidation? Or, still possible, some strange hybrid of both.
Then it must be one, right?
When Apple released its new AppleTV, I asked one of the questions that I should really learn to stop asking with Apple products: “So what?”
What I forget constantly is that Apple figures all of its new products into a wonderful long-term strategy that is often hard to decipher but beautiful to watch unfold. The AppleTV was one of those devices that didn’t really have a place in my heart until I started using it, and it’s become even more incredible with the recent unveiling of iOS 5.
When I started using my iPhone, I discovered a little jailbreak-only app that allowed me to mirror my iPhone’s screen on my TV, which allowed me to do things that were (at the time) not possible, like pump music from the iPod app out to the TV and show photos (without creating a slideshow). “It’s like having a computer in your pocket,” one of my friends remarked at the time.
This is heating up now, and I think the barely-mentioned screen mirroring over AirPlay is going to be one of the most life-changing things I’ll experience this side of 2000. I already use my iPad for just about everything in my life, and the Mac Mini sitting just below my TV does very little. Sure, it has some apps installed for design purposes, but they’re all secondary to the writing and creation that I do on a daily basis on my iPad. Sometimes (rarely), I feel like it would be nice to be able to throw what I’m doing on a larger screen and just lean back a little, see the whole thing take shape in front of me. Why is this not a computer, again? Currently, I can do that (sort of) with my iPad via an HDMI cable…but that isn’t really ideal because it means that I have to jockey with cables, and risk damaging ports when things get inevitably jerked around or flexed in strange ways.1
Now, I can dock my iPad, throw the bluetooth keyboard on a desk, and type as much as my little fingers can type without a second thought. This is huge. It means that I can go to a friend’s house, and mirror my display on his or her TV without any setup, without digging around behind the TV to find his or her HDMI port, and without any cables to lose or forget. Just make sure the friend has an AppleTV (which has enough of a value proposition on its own) and I’m set. It means that businesses don’t have to worry about sales pitches going wrong due to configuration issues anymore. It means that we’re one step closer to a shared classroom where people can contribute anything they’re reading to a discussion without having to be an IT professional to do it.
One step closer to living in the future. Oh wait, we’re already there.
1 I’ve always had problems with dongles or adapter cords. For some reason, the cables always break or fray internally, and the whole thing fails within a few months of normal use. I like to avoid them whenever possible.
I’ve had the pleasure of taking part in a couple of Groupon deals, and I’m really happy I did. I was able to get some delicious food for relatively cheap, and I’ve seen lots of good stuff on the site for meager amounts of money. This is both a good and bad thing. It’s great for me, because I eat food, and I like it when food is both good and cheap. I like supporting local business by eating at non-chain restaurants and cafés, and I thought I could do both through Groupon. What I am discovering, however, is that Groupon can actually be very damaging. This is bad.
From a recent post on TechCrunch:
Groupon can clearly deliver customers. But in order to know if it makes financial sense as a customer acquisition tool, merchants need to know two key numbers:
- The proportion of Groupon customers who are already their customers
- How often new customers come back.
That second metric is key. I’ve seen a lot of businesses have record drives of customers after running a Groupon deal, but I’ve wondered how many of those customers will actually come back. I’ve always thought that a packed house of people whom I know have Groupons waiting in their pockets resembles a swarm of locusts: they consume everything they see and just move on to the next cheap thing. I’m not so sure I want to be a part of that.
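To put rough numbers on those two metrics, here’s some back-of-the-envelope math. Every figure below is hypothetical (the deal size, the 50% Groupon cut, the redemption counts); it just shows how expensive a “record drive of customers” can actually be for the merchant:

```python
# Hypothetical Groupon deal: customers pay $20 for $40 worth of goods.
# Groupon keeps roughly half the $20, so the merchant collects $10
# against $40 of goods delivered per redemption.

def merchant_take(deal_price, groupon_cut=0.5):
    """What the merchant actually receives per deal sold."""
    return deal_price * (1 - groupon_cut)

def cost_per_new_customer(redemptions, existing_fraction,
                          face_value, deal_price):
    """Up-front cost of each genuinely NEW customer, counting the
    discounted goods given away to everyone (regulars included)."""
    new_customers = redemptions * (1 - existing_fraction)
    subsidy_per_redemption = face_value - merchant_take(deal_price)
    return (redemptions * subsidy_per_redemption) / new_customers

# Say 500 deals get redeemed, and 40% of redeemers were already regulars:
cost = cost_per_new_customer(redemptions=500, existing_fraction=0.4,
                             face_value=40, deal_price=20)
print(round(cost, 2))  # 50.0 -> each new face cost $50 in discounted goods
```

If that $50-per-head customer never comes back at full price, the deal was a straight loss, which is exactly why the repeat-visit rate is the number that decides everything.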
Then there’s this whole idea on the back end:
Why is Groupon not merely a tech-bubble datum but a Ponzi scheme? Simple: Groupon has found that you can get local merchants to try anything once if it brings them new customers. A few local merchants in Chicago get them started, and Groupon shows good revenues. In fact, Groupon immediately remits half of those “revenues” back to the local merchant — they were never Groupon revenues in any meaningful sense of the word. But, optically, Groupon revenues look high — which they use to raise a financing round at a high valuation. Then they use the proceeds to hire vast armies of salespeople to dig deeper into Chicago’s local merchant community and repeat the trick in other cities.
This is so bad! I never considered this take, but it really does make total sense. I’m not a big investor, and when I do put my money in companies, I’d like to know that they’re actually good companies and not killing off local businesses, which have a hard enough time surviving on their own.
We all vote with our dollars every single day, and it’s a sad day when we tell small, local businesses that we won’t buy from them unless they gut themselves in front of us. I refuse to be a part of that, and I think you should, too.
The announcement of Windows 8 and its subsequent demos were interesting for many reasons, not the least of which was its inevitable comparison to Lion, iOS, and Android. To be fair, Android should generally be left out of this comparison since it doesn’t have a true desktop operating system (yet), but the comparison gets made periodically anyway.
Windows 8’s user interface, at least the touch portion, looks good. I like the clean, muted-color aesthetic, and the transitions between apps look natural and pleasing (similar to the Star Trek LCARS aesthetic, but a little bit sharper). The way that Microsoft is pushing this thing, however, is silly to me. Recently, I wrote about Apple playing “catch-up” with this release, but something I failed to address was the simplicity of the whole experience, and this is ultimately the most important aspect of the entire OS.
I can wax poetic about the history of the personal computer and its role in our lives, the changes that personal computing has brought to our lives and how we experience the world around us, but it’s unnecessary. We all know that the face of personal computing is changing rapidly and being redefined constantly. Instead, let’s ask a fundamental question:
Why are computers becoming “simpler”?
I know lots of people who lament the “over-simplification” of today’s computers. In the opinion of many of these folks, computers should require lots of specialized knowledge and time to learn. They don’t understand that all the layers of schlock the operating system puts between a user and the task that they wish to accomplish are unnecessary and silly. In order to build a spreadsheet in Microsoft Excel, for instance, I need to understand Microsoft Excel conventions and jargon on top of Microsoft Windows conventions and jargon. This is difficult, and it gets in the way of actually accomplishing things. Learning the tool takes more time and energy than using it. This is bad technology, but it has come to be accepted.
Companies that design software continue to simplify it so that it gets out of the way of the user’s intentions and goals, so that it helps them accomplish what they need to accomplish. This is good technology, and has difficulty gaining traction because the software almost looks like a toy compared to the byzantine and grotesque software hydras that have become commonplace in the world.
I remember using my Palm Treo 700w, the Windows Mobile version of the popular Palm smartphone. It had a custom build of the Windows Mobile software that was tuned for the Palm “experience”, which meant that it worked better than standard Windows Mobile. I thought it was really great, but what really blew me away was a software shell (a layer over the standard Windows Mobile OS that looked way better than the standard home screen) by SPB called “SPB Mobile Shell”. This was right around the time the iPhone was introduced, incidentally. I tried using this OS shell for a while because it was “touch-friendly” (where have we heard that before?), but ultimately gave up because the phone was still using a really ugly, really unusable OS designed for styli under all the gloss and shine.
This is what Windows 8 is, and it’s flawed. The key is to change what the OS is at its heart, change the way it interacts with the user, change the way the OS feels. If the user gets the opportunity to “peek under the hood” of the OS, he or she will see the ugliest, most confusing parts of the system laid bare. The gears, cogs, oil, chugging engines…everything. In Windows 8, this is still confusing, ugly, overcomplicated. In the Apple world, this is simpler, easier, uncomplicated. This is where Windows will fail. If I start to use an app that hasn’t been designed for the new “touch-friendly” shell (which is essentially all Windows 8 looks to be), then it fires up in its old, byzantine, bloated-hydra form. With iOS and the future of MacOS, this isn’t even an option. If an app hasn’t been designed to be used full-screen, it doesn’t matter; it’s still usable in its beautiful, native form.
This brings me around to the initial question of “Why are computers becoming simpler?”
The computer-savvy elite that used to be the only folks for whom computers were intelligible are no longer the only ones who can use computers, and computers are becoming simpler because that’s what we all want. Even programmers, developers, and coders want everyone to use computers. Everyone! They want their software to shine on beautiful hardware, too, and we’re seeing this happen from the world’s most innovative companies. Microsoft, however, doesn’t seem to get it. They want their software to be archaic, opaque, and impenetrable when it comes to interacting with the user.
Maybe that’s why they keep losing.