So, Apple has unveiled iOS 7 to much discussion, hand-wringing, and cheers. There are lots of things that I feel that Apple is promising to do right with this release, and a number of things that we will, of course, need to see to believe.
One of the most prevalent activities that iOS users engage in is photo sharing. Apple’s recently released “every day” video showcases the power of the iPhone as a camera. iPhone users know that their iPhone is probably one of the best cameras they’ve owned, and the millions of pictures snapped daily underscore that.
I was intrigued by the keynote’s handling of photos, namely the application of filters and the introduction of “Shared Photo Streams” in iCloud. To be honest, this is a feature that has been sorely missing from iOS and iCloud for a long time. The idea of a photo stream curated by a single person is fine if one of your friends happens to be a professional photographer, but most situations in which people are snapping photos tend to be social, with multiple people wanting to both view and (most likely) contribute to an album of the event. The trick lies in determining the canonical center of the stream. Who “owns” the photo stream? When people contribute to the photo stream, are they adding to a single user’s photos, or are they, in effect, “copying” that stream to their own photo collection and then adding to it, where it can then be seen by other parties? Or are the photos stored on Apple’s servers, where multiple parties “own” photos, can add to the stream, and can then define who else “owns” them? “Own” is the operative word here, since ownership of the photos is genuinely tough to nail down in this case.
I’ve always wondered how something like this would work, but it’s a problem that Apple absolutely has to tackle in order to stay relevant. As people add more and more photos of their lives to their devices, the storage of those photos becomes of paramount importance, followed closely by how people identify and integrate those photos into their identity. What has become increasingly obvious is that people don’t just craft an identity that is tied to a mobile device; they create a digital identity that the mobile device allows them to access. In order for these technologies to be relevant, they have to allow people to share photos and feel comfortable about storing them in a way that is non-destructive and still allows them to reference past events with ease. It’s clear that Apple is now moving towards more meaningful photo sharing, but it remains to be seen whether they can take this idea and use it to deliver the kind of interconnectivity that people implicitly ask for.
One of the things that Apple did not address – and something I’ve heard from people who have recently switched away from iOS as their primary mobile platform – is that iOS hamstrings users by not allowing them to easily pass data between apps. While I agree with some parts of this argument, I can also see Apple’s stance on inter-app data sharing. The scenario I often hear from heavy Android users is that things like taking notes or saving PDFs from one app to another are easier on Android. I don’t agree, because I do those very same things every single day on iOS and, ever since Apple started allowing custom URL schemes to pass data from one app to another, I have never had an issue with it. That said, I do understand the heart of the complaint – that Android allows a freer exchange of data between apps via a more-or-less centralized file system.
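For the curious, the URL-scheme handoff mentioned above boils down to the sending app packing its data into a query string on a custom URL and the receiving app (which registered that scheme) unpacking it. Here is a toy Python model of that round trip – the scheme, action, and app names are all invented, and the shape is loosely inspired by the community x-callback-url convention, not anything Apple documents:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_app_url(scheme: str, action: str, payload: dict) -> str:
    """Sender: encode a payload into a custom URL another app can open."""
    return f"{scheme}://x-callback-url/{action}?{urlencode(payload)}"

def handle_app_url(url: str):
    """Receiver: recover the requested action and the payload."""
    parsed = urlparse(url)
    action = parsed.path.lstrip("/")
    payload = {k: v[0] for k, v in parse_qs(parsed.query).items()}
    return action, payload

# One app asks a (hypothetical) notes app to create a note:
url = build_app_url("notesapp", "create", {"title": "WWDC", "body": "iOS 7 notes"})
action, payload = handle_app_url(url)
```

The point is that the payload survives the hop intact – which is exactly why, in practice, URL schemes cover the note-taking and PDF-passing scenarios people cite.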
One thing that we did see in the WWDC keynote, however, is the introduction of a new tagging feature in OS X 10.9 Mavericks, which, I believe, is going to be Apple’s eventual answer to the file system. Instead of files being stored on the device, in a folder, they’ll be stored in iCloud, accessible as clusters of files related to a specific idea. This is the kind of intelligent organization that Palm’s webOS got right. Ultimately, people don’t really organize their data by app; they organize it by idea or topic, which is a far cry from having data “live” in an app.
I think the ultimate goal is to enable a user to cluster files together around a central theme or project they may be working on, and make that cluster available as an item in an app that keeps track of and syncs tags across platforms. Ostensibly, the user could open the app, see all of their tag groups, and (possibly using a Photos-style pinch-to-spread gesture) see all of the files in that tag group. Tapping on a file would open a list of corresponding apps capable of handling that type of file. Interestingly enough, this may also allow Apple to put a little more control in the user’s hands by letting the user pick which app would be the default handler for each file type. In this manner, people don’t necessarily have to know where to look for their files; they need only open the “Tags” app, find the group they want, and tap the file they want to work on in that group. The OS then passes that file to whatever helper application the user has selected as the default, and they’re off to the races. A system like this wouldn’t satisfy every Android lover’s desire for a true file system, but Apple wouldn’t need it to – the average user would see this as a new feature, and customers on the fence may see it as a tipping point.
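The workflow sketched above is simple enough to model in a few lines: tags group files, and the user picks a default handler per file type. This is purely a thought experiment in Python – the “Tags” app, the handler names, and the behavior are all my invention, not anything Apple has described:

```python
from collections import defaultdict

class TagStore:
    """Toy model of tag-based file organization: files live in flat,
    iCloud-like storage; tags group them; the user picks default handlers."""
    def __init__(self):
        self.tags = defaultdict(set)   # tag name -> set of file names
        self.default_handlers = {}     # file extension -> handler app name

    def tag_file(self, filename, *tags):
        for t in tags:
            self.tags[t].add(filename)

    def files_for(self, tag):
        """The 'open a tag group' step: list everything under one idea."""
        return sorted(self.tags[tag])

    def open_file(self, filename):
        """The 'tap a file' step: hand off to the user's chosen app."""
        ext = filename.rsplit(".", 1)[-1]
        app = self.default_handlers.get(ext, "<ask user>")
        return f"opening {filename} with {app}"

store = TagStore()
store.default_handlers["pdf"] = "PDF Reader"
store.tag_file("spec.pdf", "Project X")
store.tag_file("notes.txt", "Project X")
```

Notice that nothing here needs a visible folder hierarchy – the tag is the only organizing principle the user ever touches, which is the whole appeal.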
Multitasking is a weird one for me, but I like the way Apple has addressed it in this update, with a webOS-style “cards” interface heavily informing this part of the OS. The ability to see live updates of each app – or at least the current status of each open application as the user left it – is another way Apple brings parity with Android, but does it better. I’ve seen Android’s task-switching waterfall, and it has always felt too sterile to be enjoyable to use, although I believe that’s more a fault of the OS design language as a whole than of that specific part of the interface.
There have been a not insignificant number of words spoken about the changes Apple has made to the look of the stock app icons in iOS 7. To be honest, I feel like this whole discussion is completely moot. App icons are incredibly important, to be sure – they are the way a user identifies your application in the sea of other apps on their phone – but they are somewhat arbitrary. They need to be well-designed, but there is a certain “minimum effective dose” that allows most people to identify the app they’re looking for and associate it with the task they’re looking to accomplish.
When Apple made the choice to redesign the stock app icons, the folks behind Apple’s design choices exposed their design process as well as the grid-based layout system that informed the icon designs. There were comments made by graphic designers about how Apple’s layout choices were half-baked or wrong, and other comments that discussed how the color choices were catering to a younger generation, or to the aesthetic biases of cultures in new and emerging markets. Regardless of the reason behind the choices, I can’t help but relegate all of this commentary to the trash heap for the same simple reason: all of these comments are about a subjective experience. Of course Jony Ive wants to create an experience that is beautiful, familiar, approachable, friendly, and functional…but there are so many ways to accomplish this, and all of the commentary comes from a single data point in the universe. Even assuming that all of these designers and amateur critics were able to ascertain some objective truth about these designs that was universally applicable, they all have differing opinions – some of them conflicting – and it must thus follow that they’re either all right or all wrong. I’m clearly in the latter camp. People are going to take a look at the icons and freak out because they’re different, and then everything will go back to normal and everything will be fine because, in truth, app icons only matter as pointers to something a user wants to accomplish. Once users draw new associations in their minds, they’ll be fine.
The Little Things
There are, of course, things that Apple hasn’t mentioned or brought up, most likely because they simply didn’t have enough time to do so, but I feel like I should mention them here for the sake of completeness.
While I know that not all of these things will be addressed (or even should be addressed) because of the focus that Apple is trying to maintain with iOS, there are some things that venture into the grey area between the worlds of Mac OS and iOS. The first of these is the way the OS (and many apps) handle external keyboards. Safari, for instance, is able to handle a “Tab” keystroke, but does not recognize Command+L to put the cursor in the address bar, or Command+W to close an open tab. These aren’t necessarily “shortcomings” of the OS, but neither do they enhance the user experience. I’ve never thought to myself, “Boy, am I sure glad they left out those keystrokes! My life is so much easier!” With this type of behavior, I’m not sure if the omission is intentional or not. Apple is a very intentional company, but something like this feels like an oversight rather than a deliberate design decision.
Naturally, when people see new OS announcements from Apple, they assume that new hardware is going to follow closely behind. Something that I heard recently was that Apple’s new design, while beautiful on all current iOS devices, absolutely sings and looks right at home on the new devices that Apple has lined up for the fall. What these devices are is anyone’s guess, but I don’t think anyone would lose betting on a new iPhone. New iPad minis, iPads, and possibly iPod Touch units may also be in the works, but it isn’t completely clear yet exactly how these things will take shape, and what sort of changes we can expect. I love looking forward, but I don’t “do” rumors, so I’m not going to waste any time on speculating about what Apple is working on.
Ultimately, the new iOS version that Apple has introduced to the world looks great and, based on what I’ve heard, feels amazing. I have no desire to start ripping on an OS that’s in beta, nor do I have the desire to laud it. While it’s exciting to see a refresh to the world’s most important mobile OS, the proof will be in the pudding once it’s been finalized and released.
Something that I heard a significant amount of with the release of the new iPad mini, as well as the new iPhone, was this idea that Apple can’t have any more “Just one more thing” moments, mostly due to its inability to mask the movements of its supply chains. Truth be told, it’s difficult to build an economy of that scale without drawing someone’s attention. The eyes of the world are on Apple right now (as well as on Google and Microsoft, of course), and it’s clear that the world is poring over Apple’s supply chains in the hopes of predicting Apple’s next move based on fluctuations in part orders and the like. The idea is that by scrutinizing Apple’s suppliers – looking at the parts coming out of the various manufacturers around the world and being assembled in China – analysts will be able to stay one step ahead of Apple’s next “big thing”.
Here’s where I disagree with that idea, however. While analysts may be able to look at Apple’s current supply chains and see where they’re headed with their current products, they can’t find what they don’t know to look for in the first place. We all know that Apple produces smartphones, tablets, laptops, desktops, and displays (among other things). Here’s the thing about those products, though: while they’re currently “predictable”, they weren’t always that way. No one really saw the iPhone coming, and no one really knew what was up with the iPad before it was the iPad. One might look at those examples and say, “Well, sure, but we saw the iPad mini coming, and we’re able to predict the new iPhones before they’re out…” and so on. Of course people can predict those things, because they know what to look for. Analysts have their eyes fixed on display shipments (and the sizes of those displays), processors, logic boards, and more. They’re looking for all the things that make up the current generation of Apple products and, since they have a pretty good idea of how those things fit together right now, they can make some pretty good guesses and “predict” the next Apple product.
Let’s look at this another way, though. Let’s say an applesauce manufacturer orders a lot of sugar and a lot of apples. It wouldn’t take a genius to figure out that they’re making applesauce, and there’s the rub. Analysts look at the current state of Apple products and say, “Hey look! They’re making applesauce! I’m so smart!” Except they’re really not. What they’re doing is putting the words on the page together to form a complete sentence, and they’re screaming from the rooftops that they’re literate. While that’s a great accomplishment, it’s only the first step to being able to truly analyze information and synthesize some new ideas.
Apple’s ability to have “Just one more thing” moments hasn’t diminished in the slightest. Their ability to innovate isn’t waning at all, in my opinion. The true innovation will appear where people aren’t looking, and will manifest itself in a way that people aren’t expecting, utilizing components that people aren’t expecting to see. Or, alternatively, they’ll take components that people understand and put them together in an unpredictable or disruptive way.
We can’t know how those things will take shape, because we don’t know what to look for yet. But I’d be willing to bet that, when it does happen, people will be just as surprised as ever, and it will make all of the so-called analysts look like third-graders trying to read Chaucer.
I’ve been reading a significant amount of backlash against the iPad mini event, focusing specifically on the lamentable lack of the “one more thing” moments of old. The typical banter has something to do with leaks coming from places that Apple has a hard time monitoring (China), and that it does everything it can to keep things hush-hush in a world in which money talks, and loudly. My main point of contention with this sentiment is that it implies that Apple can’t keep any more secrets about its new products.
I think that’s a silly idea.
Consider, for a moment, the scale of manufacturing that has to be brought to bear in order to produce products for Apple at the volume we are currently seeing. It has to be massive, requiring the coordinated efforts of literally millions of people. From product inception through design, fabrication, and manufacture, those people take care of everything from sourcing raw materials to shipping the finished product to your doorstep. Truth be told, their job isn’t even over when you have the product in your hands; they still have to support it and continue developing new software. The human life energy devoted to the manufacture and support of a single iPad is immense.
With that in mind, consider the original iPhone, introduced in January of 2007 but released in June of the same year. That’s a six-month gap from introduction to purchase. In contrast, the iPhone 5 was revealed on September 12, went on pre-sale two days later, and was available for retail purchase one full week after the introduction, on September 19. The implication is that Apple’s manufacturing machine has to be at work for months before the device truly sees the light of day. In short, more human beings (see above) are aware the device exists, for more time, before the general public can purchase it.
With the original iPhone, Apple had the luxury of producing prototypes and testing them in relative seclusion. Apple no longer has that luxury because it works on some of the tightest schedules a person can conceive of.
Think about it: if Apple wanted to prototype a totally new product using in-house fabrication today, they could do it. They could show a working device to a room full of awed spectators who had no idea such a thing existed, but they wouldn’t be able to put it in your hands until months later, and that isn’t something Apple wants to do – they want you to make a decision and strike while the iron is hot.
So when you’re done watching the reveal of a new Apple product from another Apple device that’s barely a month old, remember that things weren’t always this way. You can’t manufacture your cake and be surprised by it, too.
I’ve been reading a great deal in the past few months about the new Nexus phones that have been coming out: reviews by people who have used iPhones and tried to switch but failed, reviews by avid Android users who love them, and reviews by people who are somewhere in between. I’ve heard arguments as to why certain operating systems have more of a future and why certain phones are objectively better, and I really just stand somewhere in the middle, looking at all of this with a bit of a quizzical look on my face. I’m not trying to take sides here, but I believe that Apple’s position in this market is much better for one main reason: NFC.
While it’s true that Google’s Nexus phones have had NFC built in for some time, it has been clear that the feature has been little more than a bullet point in a presentation, there to build some buzz and give Android pundits something to hold over Apple’s head. I thought the inclusion of NFC in the first round of Nexus phones was half-baked, mostly because I looked around at the places I visited every single day and saw literally nothing that used NFC in a way that was available for public interaction. The only usage of NFC that I’ve seen implemented anywhere was in the HP TouchPad. We all know how that went.
The key here is this.
If users wave an NFC-equipped iPhone at an NFC-equipped Mac (they need to be in close proximity to interact), the Mac will load all of their applications, settings, and data. It will be as though they are sitting at their own machine at home or work. When the user leaves, and the NFC-equipped iPhone is out of range, the host machine returns to its previous state.
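The rumored behavior described above amounts to a simple state swap on the host machine. Here is a toy Python sketch of that idea – the class, the session strings, and the event names are all invented for illustration; nothing here reflects how Apple would actually implement it:

```python
class HostMachine:
    """Toy model of the rumored NFC handoff: a phone coming into range
    swaps in the visitor's session; the phone leaving restores the host."""
    def __init__(self):
        self.session = "local user's session"
        self._saved = None

    def phone_in_range(self, visiting_session: str):
        self._saved = self.session        # stash the host's own state
        self.session = visiting_session   # load the visitor's apps/settings/data

    def phone_out_of_range(self):
        if self._saved is not None:       # ignore spurious 'leave' events
            self.session = self._saved
            self._saved = None

mac = HostMachine()
mac.phone_in_range("alice's apps, settings, and data")
```

The interesting design question is the one the toy dodges: where the visiting session actually lives (on the phone, or in iCloud with the phone acting only as a key), which is exactly the thread picked up below.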
This is huge, and with Bluetooth coming back in a big (or perhaps little, as in low-power) way, this may be even more effective.
“The usual idea is that you would use NFC to set up the link between the two devices and then do an automatic hand over to a different protocol for doing the actual transfer of data – eg Bluetooth, Wi-Fi, TransferJet etc – and that’s what I imagine would be happening here,” she said.
The above comes from analyst Sarah Clark of SJB Research.
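The handover Clark describes – NFC to establish the link, a faster radio for the payload – can be sketched as a tiny capability negotiation. This is a toy Python model; the protocol names and the preference ordering are my assumptions, not part of any NFC specification:

```python
# Preference order is an assumption: fastest mutually supported transport wins.
PREFERENCE = ("wifi", "bluetooth")

def negotiate_transport(host_protocols, guest_protocols):
    """The NFC tap exchanges capability lists; pick the bulk-data transport."""
    shared = set(host_protocols) & set(guest_protocols)
    for proto in PREFERENCE:
        if proto in shared:
            return proto
    return "nfc"  # fall back to the slow in-band NFC channel
```

The appeal of the scheme is that the user never sees any of this – the tap is the whole pairing UI, and the devices quietly agree on the fastest link they share.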
This idea still has so much potential. As Steve Jobs said when he unveiled iCloud, Apple is demoting the computer to just another device – one that accesses your data on its servers in North Carolina somewhere. With the computer being just a gateway to your computing state, any device can theoretically access this saved state and allow the user to resume a previous session wherever they are.
Let’s also look at another piece of the puzzle: the Apple TV. We don’t know what Apple is planning for a theoretical new Apple TV later this year, but let’s take a look at the Apple TV in its current incarnation – the tiny little black box that, quite frankly, is a little wonder-device.
For starters, you can now do this. I think that’s a pretty big deal. So the Apple TV, in its current state, can run iOS apps. It can access iCloud. It can play music and movies, and also allows a compatible device to mirror its display through a Wi-Fi connection. Let’s talk about that for a moment, as well.
If you haven’t already, check out Gameloft’s Modern Combat 3. It’s basically a Modern Warfare clone, but it has one killer feature: the ability to mirror the game on an Apple TV, which turns the iOS device you’re holding into a controller and puts the game on the big screen. I tried this on my iPad and was amazed by the results. This is truly something that game developers need to be looking at, but it’s something regular developers need to be looking at as well. Think about it – if a device that is mirroring its display output to an Apple TV can show different content on the device than on the TV, a word processing app could essentially turn the tablet into a wireless keyboard while the main workspace is displayed on the TV. The iPad or iPhone (or both!) could display a suite of controls or “function” keys, function as a pointing device, or really do anything you can think of. The idea of a “technology appliance” holds even more water here, since these devices can be used together to create an effect that one device on its own is technically capable of, but that works better when spread across several devices. Look at Keynote, for instance. With an iPad and an iPhone, a person can run an entire professional presentation with no bulky equipment and a minimum of technical prowess.
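The split imagined above – controls on the handheld, workspace on the TV – really comes down to one app rendering two different views of the same shared state. A toy Python sketch makes the shape obvious (everything here is invented; it models the idea, not any real AirPlay API):

```python
class SecondScreenEditor:
    """Toy second-screen model: the mirrored TV shows the document,
    the handheld shows controls, and both render from one shared state."""
    def __init__(self):
        self.lines = []

    def type_line(self, text: str):
        self.lines.append(text)      # input arrives from the handheld

    def render_tv(self) -> str:
        return "\n".join(self.lines) # main workspace on the big screen

    def render_device(self) -> str:
        return "[keyboard] [function keys] [formatting controls]"

editor = SecondScreenEditor()
editor.type_line("Chapter 1")
```

Once the two renderers are decoupled like this, the handheld view can be anything – a keyboard, a game controller, a presenter remote – without the TV view changing at all.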
In the context of the aforementioned connection to an Apple TV, this capability becomes even more important, since it allows the TV to function like a traditional “desktop”, but without the bulk of wiring, an extra device to draw power, and connections to set up. NFC handles everything, and the bulk of the transfer can then take place over Wi-Fi, Bluetooth, or some other protocol that is standard in Apple devices.
And this, my friends, is why Apple is positioned so much more powerfully in this market than any Android device manufacturer. While other manufacturers will essentially be playing catch up with all of this anyway, they will also have to contend with consumers who will be presented with each manufacturer’s take on this idea. Where Samsung may offer one type of connectivity, Asus may not, since it doesn’t have a TV of its own, but LG might. The consumer will stand in front of his TV and scratch his head wondering why his Motorola Xoom isn’t connecting to his Samsung TV, while his neighbor with an iPad and Apple TV is able to transition from room to room in his house without missing a beat.
The aftermath of this whole shebang would be the equivalent of a demolition derby, with all of these companies vying for the consumer dollar, blowing themselves to bits and waging a war of attrition while Apple’s devices lead the way thanks to their simplicity and interoperability. The next thing that will happen is that these other manufacturers will start listing even more specs on their TVs – things like gigabytes of RAM, processor speeds, and core counts. The consumer will look at all this and once again scratch his or her head in confusion. The Apple TV will say something like “Best-in-Class Picture Quality, Siri, and [catchy Apple-fied name for NFC connections]. Say Hello to Apple TV.”
It’ll sell like gangbusters, and we’re all going to want one. Of course we will – it’s going to represent the future of computing. Can we even call it that anymore? Not really; it doesn’t feel right, because in this one (admittedly long-shot) future, “computing” isn’t a thing. You just pick what you want or need to do, and you use well-designed, simple hardware to do it.
The rumors of an Apple-branded HDTV have been around for a long time (although perhaps not as long as the rumors of an Apple phone). For many reasons and for many years, this didn’t make sense. Having an Apple-branded phone once seemed ludicrous, since so many other companies controlled the market in terms of handset design, technology, carrier availability, and so on. Apple had no leverage; they were just getting their feet under them after recovering from a downfall that once seemed inevitable, and they weren’t seen as competitive in the marketplace due to the highly exclusive nature of their products. Then they started designing their own hardware, coupled it with some amazing software, and all of that changed.
Now, the world looks to Apple for guidance on just about everything.
Now, we’re seeing the same thing with TVs, and it smacks of webOS.
One of the big announcements that came out of the HP Think Beyond event was that webOS will be shipping on every PC, laptop, and some printers that they sell by the end of this year. We have pondered what that will do to the scale of webOS and how HP would implement it.
To be honest, I don’t know how this is going to play out, but it looks like these companies want to get their OS into everything in your home. I think the idea here is to have a network of appliances, devices, and screens that are discoverable and OS-aware, meaning that they can sniff out other devices/appliances on the network and interface with them. A person might be able to control his or her washing machine with a phone, or monitor the state of the vegetables in the refrigerator by glancing at a widget in the dock of a tablet, or activate a Roomba to clean the floors while he or she is away. The more devices run your flavor of OS, the more is possible on the network. Naturally, this might also lead to Skynet, but whatever.
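The “OS-aware” network imagined above is essentially capability-based discovery: appliances announce what they can do, and any device on the network can find peers by capability. A toy Python sketch, with every appliance name and capability invented for illustration:

```python
class HomeNetwork:
    """Toy discovery registry: appliances announce capabilities, and any
    device on the network can look up peers by what they can do."""
    def __init__(self):
        self.devices = {}  # appliance name -> set of capabilities

    def announce(self, name: str, capabilities):
        self.devices[name] = set(capabilities)

    def find(self, capability: str):
        return sorted(n for n, caps in self.devices.items() if capability in caps)

net = HomeNetwork()
net.announce("washer", {"start", "status"})
net.announce("fridge", {"status", "inventory"})
net.announce("roomba", {"start", "status", "dock"})
```

This is also why the “more devices run your flavor of OS, the more is possible” point holds: each new appliance that announces itself enlarges what every other device on the network can do.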
It’s the “home of the future”, and it started with your phone.
Then it must be one, right?
When Apple released its new Apple TV, I asked one of the questions that I should really learn to stop asking about Apple products: “So what?”
What I constantly forget is that Apple figures all of its new products into a wonderful long-term strategy that is often hard to decipher but beautiful to watch unfold. The Apple TV was one of those devices that didn’t really have a place in my heart until I started using it, and it’s become even more incredible with the recent unveiling of iOS 5.
When I started using my iPhone, I discovered a little jailbreak-only app that allowed me to mirror my iPhone’s screen on my TV, which allowed me to do things that were (at the time) not possible, like pump music from the iPod app out to the TV and show photos (without creating a slideshow). “It’s like having a computer in your pocket,” one of my friends remarked at the time.
This is heating up now, and I think the barely-mentioned screen mirroring over AirPlay is going to be one of the most life-changing things I’ll experience this side of 2000. I already use my iPad for just about everything in my life, and the Mac Mini sitting just below my TV does very little. Sure, it has some apps installed for design purposes, but they’re all secondary to the writing and creation that I do on a daily basis on my iPad. Sometimes (rarely), I feel like it would be nice to be able to throw what I’m doing on a larger screen and just lean back a little, see the whole thing take shape in front of me. Why is this not a computer, again? Currently, I can do that (sort of) with my iPad via an HDMI cable…but that isn’t really ideal because it means that I have to jockey with cables, and risk damaging ports when things get inevitably jerked around or flexed in strange ways.1
Now, I can dock my iPad, throw the Bluetooth keyboard on a desk, and type as much as my little fingers can type without a second thought. This is huge. It means that I can go to a friend’s house and mirror my display on his or her TV without any setup, without digging around behind the TV to find the HDMI port, and without any cables to lose or forget. Just make sure the friend has an Apple TV (which has enough of a value proposition on its own) and I’m set. It means that businesses don’t have to worry about sales pitches going wrong due to configuration issues anymore. It means that we’re one step closer to a shared classroom where people can contribute anything they’re reading to a discussion without having to be an IT professional to do it.
One step closer to living in the future. Oh wait, we’re already there.
1 I’ve always had problems with dongles and adapter cords. For some reason, the cables always break or fray internally, and the whole thing fails within a few months of normal use. I like to avoid them whenever possible.
I’ve been trying to digest the Apple news over the past few days in a way that would be meaningful, and it’s been difficult. Amidst all of the noise regarding unrevealed iOS 5 features, unrevealed Lion features, unicorns flying and granting wishes, and the future of all three, I was able to come up with a coherent thought that I think captures what I actually think about the future of mobile.
When Apple started getting serious about iOS, Google also started getting really serious about Android, and the divide that grew between the two has been significant. A lot of people get Android phones now because they’re “just like iPhones”, until they realize that their Android-powered device can’t do X (very rarely do I ever run into a situation that’s the other way around), or needs 20 steps to do Y. A few people get Android-powered phones because they want to do things that they “can’t” do with an iPhone. There will always be things that Android devices can do that iOS devices can’t, and vice versa, but that’s not the key metric here. What we have to be concerned with is whether those things actually make sense and are “doable” by the majority of users. In my opinion, they’re not. Most people don’t have the ability or desire to root their phones, don’t want to dig into firmware files, don’t want to jailbreak their devices, don’t want to do all the stuff that advanced users (who tend to be the most vocal) use as ammunition against the competing platform. In the end, most users want to pick up the phone, send a few texts, make a few calls, hop on Facebook, and have fun doing it. Oh, and play games. That tends to be about it. Does this make me upset? Sure. I tend to use my stuff a little more, but hey, it’s not my phone.
As mentioned in the past, Apple is doing some neat stuff with their product reveals as of late. Apple is telling people how they work. This is important because yeah, it’s about the user experience (UX), but the reason you’ve got such a killer experience is because of all this hardware underneath, because of this glass, because of this epic battery. Apple is communicating that there’s a lot that goes into the design and production of each device, and that should make you feel good. You should look at all this stuff and feel like they made it for you, to fit your lifestyle, your aesthetics, your pocketbook.
So, that brings us to now. Apple unveils all these new things that are a part of its new iOS, and some people1 looked at all that and had a very meh response, saying that this release was more of a parity release, that it wasn’t really breaking any new ground. I continued to look at this iOS release, however, and I think I figured out why I feel so excited about it. Whenever Apple has released a new product or new version of their OS, Android users have always held it over Apple users’ heads that they’ve been able to do this for months or years or millennia or whatever. Now, they can’t do that. Now, a person deciding between iOS and Android is going to have to choose between The Real Thing and a knockoff. This is where we’re at, folks.
People used to walk into a store and have the sales associate give them a weighted assessment of iOS vs. Android which probably included that ridiculous “open” buzzword in there somewhere. What does “open” mean for the end user?2 I’ll let that one percolate for a bit.
Ultimately, “open” is just a word, a marketing tactic that has no meaning for the customer, for the actual user of the product. “Open” is only meaningful to the developer (and marginally, at that). For the customer, it’s meaningless, but it sounds good, like you’re sticking it to the man or something. For the baby boomer generation, this is great because they used to stick it to the man, and maybe it makes them feel good. But let’s extrapolate that out a little bit. Let’s say a person hears “open” and buys the Android phone because they think it farts rainbows or something. Now they think that everything they do is better, the perceived benefits of using an “open” phone start to shine through. Until they see something running iOS. All of the things they thought were so great are also clearly on iOS, but look better, respond better, feel better. Where’s “open” now? Where’s Android now? It’s just another cheap imitator.
A new iPad owner will be able to pop the top on their new iPad and start using it right away as their primary computer. There will be little to no configuration, and all iOS devices will be kept in sync. Apps will use iCloud, people will love the experience, and the whole thing will grow on its own. The Apple club is getting bigger, and the cost of entry is dropping like a rock. As highlighted by other writers, Apple is re-stating its devotion to being a hardware company – a mobile devices company – not a software company. Sure, Apple writes software, but only because its software sings on its devices.
For any other company, a software release that brings in features that others have had as “standard” for a little while would be “just” playing catch-up; for Apple, which designs software that is already powerful to the nth degree, “catching up” means creating almost unstoppable inertia.
1 I’m counting myself among those people.
2 I’ve been in carrier stores before, and listening to these floor guys try to explain it to the customer is hilarious. Listen in sometime and you’ll see what I mean.