The Windows [8] Store (and other App Store comments)

The world got a pretty good explanation of things yesterday, and as far as my development is concerned, I’m sold on the idea. I’m not sure which apps I’ll be porting or when (I still have nothing on the Mac App Store). What I specifically find intriguing is the 80/20 vs. 70/30 split, with the break point at USD 25,000. That’s a pretty small number. But there is one area in which I’m disappointed, and I hope I can be proven wrong.

Designed for discovery
Ensuring the visibility of apps and the efficiency and fluidity of app discovery became the fundamental building block of our Store design. We use minimal chrome so apps shine through, and complement the apps with a series of way-finding and promotion mechanisms—search, category browse, ranking lists, editorial curation — to help people find great apps.

Windows Store is designed for easy app discovery

We designed the landing page to push compelling apps to the surface. We use categories to help organize the apps—the latest, most popular, and fast rising apps all have dedicated lists surfaced here. You’ll see personalized app recommendations and also topic pages that promote apps related to editorial themes, helping surface what would otherwise be hidden gems.

Navigation is simple and consistent with the model of Windows 8. Built-in search supports directed discovery, fluid panning moves you through the categories, and category filters help locate the most relevant apps.

What they describe as “discovery” is no different from how Apple does it, and the App Store has decidedly poor discovery. The problem with both is that people buy things BECAUSE they’re in the top whatever, and those things thus stay at that position, or at least in that bracket of ten.

There’s also the problem of “featured” sections. These are highly subjective and can probably be bought to some extent. For example, Apple often features big-name apps that are terrible just so they can say “Look! iOS has Citrix support”. Apple or MS can also stuff these sections with their own apps, as Apple did when Final Cut Pro X came out.

A few things all app stores need to do

  1. A very obvious setting to show/hide already purchased apps
  2. A way to hide apps for any reason whatsoever (purchased not through app store, sick of seeing something you’ll never buy, etc)
  3. Keep games out of the top overall rankings
  4. Differentiate between daily (current) and all time in lists.
  5. Keep lists long. If you use Apple’s XML feed, the lists go up to 400, but only 200 show in iTunes. Nothing wrong with a “next” button, is there?

That’s obviously not going to fix much, but basically what I think would help would be to make the “Genius”-like features more up-front instead of a feature you have to choose to use. I think hiding apps and making ranking lists more relative would help a lot, though. I don’t have too many apps to organize, and even I find top-10 lists populated almost entirely with things I’ve already purchased.

ARM and x86

  • Apple switched from IBM to Intel, not PPC to x86
  • Apple sells more ARM devices than x86 devices
  • ARM is open for licensing and Intel can become Apple’s ARM supplier.
  • OS X and iOS have no significant amount of hand-written machine code, and thanks to OpenCL and Accelerate, neither do third-party apps.

In 2004, all processors sucked. Intel’s best Pentium 4s, as well as IBM’s G5s, used so much power and made so much heat that neither was suitable for legitimately portable laptops. Instead, Intel had to make low-powered (and weak) Centrino-era processors or risk losing the market to AMD (which had already figured out how to make low-powered x86 chips). Similarly, Apple had to stick with miniaturized G4s. Intel had a trick up its sleeve, though: the Core processors. They benched slightly slower than Pentiums and G5s, but did it with way less power consumption. IBM had no such trick. A chip similar to the G5 is still used in the Xbox 360, and it took until this year’s release of the Slim model for it to shrink into something that maybe could’ve worked in a laptop. This wasn’t about architecture, it was about engineering, and Intel won while IBM failed to innovate. P.A. Semi later proved there was nothing wrong with PPC and that better chips could be made, but it was too late. Luckily, Apple bought them.

The reason I mention that Apple sells more iOS devices than Macs is to give perspective to the argument that Apple needs someone as big as Intel to make its processors. Obviously it doesn’t if it can get even more from Samsung. When Intel said Apple’s threat to move to ARM was a wake-up call, they meant it: Apple is one of Intel’s biggest customers, but Apple doesn’t need Intel.

But that doesn’t mean a failure to innovate on x86 would result in a divorce from Intel. Legally speaking, IBM could not have offered to make x86 chips for Apple because Intel has really tough licensing. AMD only exists as a result of antitrust. But ARM can be licensed by anyone, including Intel, so there’s no reason Intel can’t be the one who designs (or at least fabricates) the A6.

Finally, when Steve said OS X had been living a secret double life with x86 builds, what he didn’t mention was why it was so easy to do. OS 9 had roots going back to the days before compilers were good enough, so it had a lot of hand-written machine code doing simple things. By the time NeXT was writing what became OS X, compilers were trustworthy enough to compile a whole operating system. In fact, most Linux distros today can be compiled on anything with GCC.

And that brings me to OpenCL… back in the G4 days, developers got their first taste of hardware acceleration outside of graphics. The G4 could operate on multiple numbers (vectors) simultaneously on a single core if you wrote special code to do so. They called it the Velocity Engine. Unfortunately, that special code was machine code: machine code for the G4 (and G5). It wouldn’t work on G3s or on x86. Intel has SSE4 for similar purposes. What’s nice about OpenCL is that you can now write vector arithmetic in C and GCC will turn it into SSE4 for you, or ARM code, or whatever. Write once, compile anywhere. It’s even technically possible for the compiler to emit non-accelerated code to run on unsupported hardware, although as far as I know this is not universally offered.

Today we have a similar situation to 2005. We have one set of chips that are fast and power hungry, and another that are slow but sip power at a much more favorable ratio. The ratio always wins. Raw speed isn’t good enough, it’s performance per watt that matters.

Windows 8 Pessimism?

Really?

“Windows 8 will be largely irrelevant to the users of traditional PCs, and we expect effectively no upgrade activity from Windows 7 to Windows 8 in that form factor,”

I’m going to ignore my rational side, which says these “analysts” are just trolling for clicks, and pretend they’re morons instead.

Have these people watched the keynote? Or even downloaded the DP? Are Metro apps just a layer that won’t interest the masses with traditional PCs? As I mentioned in my last Windows 8 post, Metro scales beautifully across screen sizes, and to say it’s only for touch input is naive. Metro apps are better compared to Apple’s full-screen apps, because that’s really all that’s different about them. You can click or touch or use the keyboard; that’s up to the developer. In fact, the Metro labyrinth game MS includes is played with arrow keys. You can bet Angry Birds will be a Metro app. And people will want these Metro apps. Why?

Because there will be an App Store, and it will not be on Windows 7. App stores are lowering the IT know-how needed to install things, and if MS can pull it off without multi-step wizards it will be huge. But analysts are no longer “impressed” by app stores, as they’ve essentially become a minimum standard thanks to Apple’s success. So maybe it won’t be impressive, but it’s not there now, and when it is, a lot of dollars will move from consumers to developers whether the press gives a damn or not.

If you’re following Building Windows 8 you’ll see that MS is actually concentrating on a lot of aspects of Windows. To say people won’t be interested is like saying Vista users weren’t interested in Windows 7. Why would people want to shell out money for what was viewed as a mere “fix” of Vista? Maybe it’s because the type of people who upgrade are the type who ALWAYS upgrade, and the types who won’t didn’t upgrade to Windows 7 either. Like Windows 7 before it, Windows 8 has a SMALLER footprint than its predecessor and lower system requirements. That alone makes the upgrade worthwhile if you’re still clinging to a 32-bit machine.

Given the short lifespan of modern very cheap PCs, there are a lot of Vista, XP, and even some Windows 7 machines almost ready to blink off with a BSOD for the last time, and their users will get Windows 8 on new “traditional” PCs whether they like it or not. This probably represents the largest number of “new” users, so whether or not MS can convince people to pay for Windows by itself probably doesn’t affect the bottom line too much.

Screen Sizes that Don’t Matter

Two things today: The sizes of the “real” AppleTV are supposed to be 32″-55″ (inclusive, i.e., multiple sizes) and iPhone betas have drivers for 1280×720 and 1440×800.

I’ll start with the iPhone one: since the iPhone 4 (technically the iPad 1), iOS has been capable of operating in “multiple monitors” extended mode (not mirrored), just like a MacBook. A4 devices could do at most 720p, and A5 devices can do 1080p. If you’re a developer, you know that when you plug in another display (whether over AirPlay, HDMI, VGA, or even the component or composite adapters) you actually get an NSArray of the available resolutions. Connect my native-1080p Dell to an iPad and it’ll return 720×480, 640×480, 800×600, 1024×768, 1280×1024, 1280×720, and 1920×1080. (In iOS 5 and later you also get overscanning options.) Connect the same display to my MacBook and I get a lot more, because the MacBook lists everything while iOS trims the list to the modes that are most likely to be useful. So a poorly translated “finding” is likely just code related to external display support. Furthermore, nothing has a resolution of 1440×800; the 15″ MacBook Pros and the 13″ Air have 1440×900. And 1440×800 is actually wider than 16:9, so it’s just a nonsense number from an unreliable (and too-many-times-translated) source.

Regarding the iTV screen sizes and reports of being priced at “double” comparable sets…

The 27″ Cinema Display has the best 27″ panel around. The same panel is in the Dell U2711 27″ IPS display. Apple’s costs $999, Dell’s $1099. There are some differences: more ports on the Dell, LED backlighting on the Apple… but in reality, the so-called “Apple Tax” is dead. In fact, the new “Apple Tax” is actually regressive and gives Apple products a LOWER price for comparable (or better) products, because Apple can outsell everyone and thus take advantage of economies of scale.

Maybe the reason Sony is taking a loss on all of their TVs is that they have dozens of models with different panels, and thus a small number of purchases from suppliers for any individual panel, and thus a unit price too high to profit from. So maybe Apple will have high prices compared to sets with TN panels at the same size, but compared to sets with the same LED-lit IPS panel, Apple’s will have a very similar, if not slightly lower, cost, and they WILL make a profit.

Now that the Flash wars are over, Apple and Adobe need to be friends again

My 2003 self can’t believe I’m saying this, but right now Premiere Pro is the professional single-user non-linear editor to use. InDesign and everything they got from Macromedia are giant steaming piles of shit, but their big four, Photoshop, After Effects, Premiere, and Illustrator, are still without legitimate competition. Final Cut Pro used to be better, as did Shake, but those days are gone. Things like Pixelmator and Paint Shop Pro are nothing close to Photoshop if you’re actually a power user, and the same goes for Illustrator.

In 2001, Mac users had a problem. Photoshop was for OS 9 but worked OK in Classic. Premiere did not. Final Cut Pro wasn’t ready for prime time until 3.0. Personally, once I started getting to know OS X I was hooked. I wasn’t going to dual boot. I could tolerate Photoshop 5 under Classic, but my pro video was done in Premiere 5 under Windows 2000. Compared to its predecessors, Windows 2000 was stable, lightweight, and even kinda pretty. It was the Windows 7 of its day.

With the crippling of Final Cut Pro, Apple’s pro users are once again at Adobe’s mercy. When a job needs to get done, they’re probably more likely to be loyal to the application than to the OS, even if begrudgingly. Sure, Photoshop has merely iterated rather than innovated over the years, and the same is true of Premiere, but for now we’re without legitimate alternatives and it sucks.

Was there almost a Final Cut Pro 8?

Updated
Apple released Logic 9 on the App Store AS IS, with a price change and with the 19GB of extras moved to an in-app download. Clearly they learned that now isn’t the time to reinvent anything; as Steve put it back in ’97, “sometimes it’s 10% better but usually it’s 50% worse”. I hope they will finish Final Cut Pro 8, add it to the App Store, and drop Final Cut Pro X to $99 or something.

So today we heard that a 64bit “evolutionary” Final Cut Pro 8 was in fact almost ready but someone decided to go in the “revolutionary” Final Cut Pro X direction.

I gave Final Cut Pro X time, patience, the benefit of the doubt, and even a pass on not opening old projects, and I still find it horrible. Here’s what I specifically hate about it:

  • Even when turning off previewing, scrubbing, waveforms, etc., timelines are still “choppy” to navigate, and the shrinkable ProKit scrollbar was much better than the current crap.
  • Multi-monitor support is a joke. I used 4 displays in Final Cut Pro 7. I can only use two in X and I can’t really decide how to lay them out.
  • The iMovie style quick selection is a pane in the ass. I’ll trim my clips with precision thank you, just let me put the whole damn thing on the timeline please.

In summary, the UI of Final Cut Pro X is terrible. The only good thing it has is the magnetic timeline versus static tracks.

Do you know what was wrong with Final Cut Pro 7?

  • Only rendered on 4 Cores
  • Only rendered on CPU
  • Antialiasing sucked / was nonexistent
  • Unsupported import formats

In other words, all of Final Cut Pro 7’s problems were under the hood, and most of them stemmed from it being built on Carbon, which Apple chose not to give 64-bit, Grand Central Dispatch, or OpenCL support.

I’ve always maintained an attitude of knowing skills rather than specific pieces of software. As such, I’m down for any non-linear editor. Maybe Final Cut Pro X.2 or XI or whatever will blow me away. I still feel like Apple made the wrong decision and left actual professionals high and dry. It’s already clear the Mac Pro won’t be upgraded, we’ll never get our TowerMac, and the 15″ MacBook Pro is about to go Air. If Apple continues to move too fast toward the “Post PC” era, it’ll accidentally create a bunch of former Mac users who buy Dell boxes, or at least VMware, to get by.

Note: My Final Cut machine was a dual quad-core 2.8GHz 2008 Mac Pro with two NVIDIA 8800s and 20GB of RAM.

Windows 8 on a Touch Screen

Today I got my Cyber Friday Dell ST2220T, which, for those of you who don’t read Dell model numbers, is a 22″ 1080p IPS display with optical touch support for Windows 8. I’ve always been happy with the versatility of Dell displays and their price points, and this is no exception. But this isn’t about the display; it’s about using Windows 8 on a touch surface rather than with mouse and keyboard.

To get a few things out of the way: yes, Windows 8 is a developer preview, so sometimes things went a little awry and a re-login (not a whole reboot) was needed. Also, I was using my MacBook Pro in Boot Camp and had to manually install drivers. FYI, it’s a 2.26GHz Core 2 Duo 13″ MacBook Pro with 8GB RAM, Mac OS X on an SSD, and Windows on a 120GB partition of a 500GB 5400rpm internal HDD. I removed the optical drive. Since Windows 8 is in beta and Boot Camp doesn’t even support it, I’m not going to write about any glitches because they’re obviously my fault.

  • Metro does look sane at 1024×768 AND 1920×1080. It’s neither uselessly cramped at lower resolutions nor awkwardly sparse at higher ones.
  • Metro lets you scroll the normal “tablet” way by dragging inside a view or the “classic” way by dragging the scrollbars. The scrollbars are more subtle than in previous versions of Windows without being nonexistent like Lion’s.
  • The onscreen keyboard for Metro Apps appears as needed, like iOS etc, but for some reason the number pad is in the “phone” orientation rather than the “number pad” orientation. Since it was a computer, not a phone, I expected the top row to be “7 8 9” not “1 2 3”.
  • “Off Screen” gestures (where you drag from the edge of the screen) are too necessary. Maybe if you’re taught that they exist you’ll be fine but that’s not very “Grandma Friendly”. They also seem inconsistent. When I write my iOS apps I design them all to be operated without a manual. “Pinch to Zoom” is intuitive. Swipe Up for the Address Bar is not. Like the WebOS “Flick” or the iOS Four Finger Swipe between apps, these kinds of things should only be shortcuts and never the only way to do things.
  • Metro is completely devoid of right clicks. This is great for touch screens. But regular Windows underneath it is still full of them. Yes, they’re shortcuts, but they’re really big ones. To change the desktop background in Windows 7, you can right click the desktop and change it. Or you can press Start and start typing “Background” or “Desktop” into the Spotlight-like search. In Windows 8, without right clicking, you have to go to the Control Panel, then “more”, then Appearance or something. I’ve read that “holding” a touch makes a right click, but I couldn’t get it to work. Maybe drivers, who knows. Either way, being a Windows veteran, I found it far too annoying to use the “normal” stuff without it. I suggest Windows steal UIPopoverController from iOS to replace contextual menus with more useful, more versatile views, and let them be triggered by “normal” tapping on something.

Right now, Metro and Windows are at a very awkward distance from each other. It’s most like two high school relationship virgins who “like” like each other but are both too shy to ever make a move. They sit next to each other in class, are lab partners, maybe even walk home together, but they don’t hold hands, they don’t kiss, they don’t flirt, though they desperately want to. Windows and Metro need to be honest with each other about their feelings and either become an item or decide to be just friends who share the same PC.

When I’m using Visual Studio or any non-Metro app with a keyboard and mouse, accidentally hitting the Windows key because I wanted to search or run or launch something triggers a homicidal fit of rage when I’m presented with the Metro screen, which by the way is ASS to navigate with arrow keys or a mouse. If two-finger horizontal scrolling on my MacBook worked, that’d be different, but that’s not gonna happen any time soon. It will be many a year before proprietary business apps (and anything that uses Citrix) are ready for Metro, so maybe some people would just like to forget it exists at all.

Similarly, when not using Metro apps, you need a mouse and keyboard. While the lack of mouseover events (the same omission that helped kill Flash on touch devices) isn’t too big a deal, the keyboard not appearing automatically is. Leaving the onscreen keyboard up at all times at a usable size wastes space. As it is now, you “can’t” use non-Metro apps any better than on current Windows 7 tablets.

And that’s where the hormonal adolescents come back into the picture. If they’re not going to date and we’re going to call them “Windows” and “Metro” separately, then we need a way to completely partition them. Doing so would require the entire control panel and all MS apps to run in Metro. Ideally they will date and we’ll call the whole thing “Windows 8”.

Is iAd as good as dead?

It’s supposed to be privately negotiated between Apple and advertisers, but we know that it’s something like $2.00 per click and $0.02 per impression (if not clicked) for what we’ll call “Blue Chip” ads. And on the surface, 60% of that going to developers sounds fantastic. But there are some problems. Apple assumes it can reinvent how often consumers click ads because iAds are fun; AdMob ads are not fun, so users always ignore them. Simple, right? Here’s why iAd click-through rates stay at 1% and developers aren’t making money.

The first is that we’re used to ignoring ads. We’re not going to think “hey, that’s the iAd badge, this ought to be fun!”; the ad just gets ignored.

There are about 30 Blue Chip ads, so it’s unlikely a user is seeing the McRib ad for the first time in your app. You can only expect to make anything if your app is a user’s only experience with iAds, and even then, most of us already know the brands and their products and don’t need an ad telling us about AT&T’s data plans.

Since everyone has already seen all of the Blue Chip ads in some app or another, users tend to only care about the developer ads. But developer ads don’t have animated banners and don’t go anywhere fun, just to an App Store link (one that doesn’t close your app, but with multitasking does that even make a difference?). Developer ads look static just like AdMob ads, so users are already trained to ignore them. I know I do. That would be OK, except that they’re FREE. Yes, FREE. If a user is presented with a developer ad and doesn’t tap it, you have 1 impression and 0 revenue. There’s only money for click-throughs, and it’s $0.25 compared to that $2.00.

No matter whose phone I pick up, I find that each app shows about 5 ads to each person. Auto Adjust tends to only get the AT&T Blue Chip ad and ads for apps in the “Utilities” and “Productivity” categories. The Huffington Post app seems to only get games and the Heineken Blue Chip ad.

But HuffPo also uses AdMob to fill unfilled iAd impressions, which as mentioned earlier, continues to train users NOT to click on Ads.

So to wrap up: users don’t click ads for a lot of reasons, and that’s not canceled out by any decent payout for mere impressions. Apparently it’s typical for eCPM (estimated cost per thousand impressions) to be less than $10.00, or one cent per impression. Think about that for a minute. You’ve sold the bottom portion of your app for practically free. And the reason you did is that people aren’t going to click that ad anyway. Think about that too. Who exactly profits in this business model? Does anyone? Does anything get accomplished other than annoying users with distracting ads that make a lot of HTTP requests and sometimes crash the app themselves?

Can anything fix iAds? Maybe. Here are a few ideas…

  • The Blue Chip ads are great, leave that as it is. Try to find more advertisers.
  • Make developers make REAL interactive ads.
  • Use 50% revenue sharing for developer ads.
  • Do not allow free apps to advertise using iAds. They’re just noise.

Considering that Apple often fails to enforce a lot of its policies (spamming the App Store, apps that mainly advertise real products but have no functionality themselves, HTML wrapped as an app, etc.), I doubt there’s much hope for iAds.

Supporting iOS 3.1.3

Being the elitist/purist that I am, I look down my nose at people who aren’t using iOS 5, Lion, etc and usually blame them when stuff I write doesn’t work. There is, however, one exception to this rule.

When Apple releases a new OS, some older hardware usually gets shafted. With Lion, it was anything that didn’t have a 64-bit Intel chip in it. In real-world terms, that means any Mac that came out before the summer between my sophomore and junior years of college, i.e., 2006. As a developer, the idea of nice clean 64-bit-only binaries sounds great. The Macs I have from before then can barely browse, so no big loss. Sure, it sucks if you got one of those first Intel machines with the Core Duo, but Apple’s market share was a “rounding error” back then, so that’s a small number of people.

Not so with iOS

While I doubt there are many people with an original iPhone out there, people still use the 3G and the first two generations of iPod touch. They’re still great MP3 players. Unfortunately, 1st-gen devices are marooned at 3.1.3 and 2nd-gen at 4.1. Among supported devices, iOS 5 adoption is only around 40%, and an alarming 12% on iPods.

Since Auto Adjust started way back in the day, there isn’t too much critical code that requires newer iOS versions, but one new feature did. In iOS 5, Apple finally gave developers access to screen brightness. This is a big deal. Most people are taught early on that “you can increase battery life by lowering your screen brightness”. While technically correct, this often leads people to use photography apps incorrectly because they have no idea how bright their photos already are; over-brightening and over-saturating is usually the result. I decided that rather than hide the feature when it’s not available, the app would nag users to set their brightness using the Settings app.

Missing features are one thing; the ifs and availability tests are annoying but doable. The reason iDecorate, by contrast, requires iOS 5 is that iOS 5 requires a device with 256MB of RAM. In my testing, iDecorate was barely usable on my worst-case iPod touch 1 with iOS 3.1.3. I could get back some drawing performance by dumbing down the image resolution, but scrolling through the number of stamps was simply unwieldy. If anyone remembers, back then Safari would often show unloaded squares if you scrolled too fast; not because of network speeds but because the hardware couldn’t keep up. I decided this wasn’t the performance I wanted iDecorate to have and instead made it iOS 5 only.

So why not iOS 4 if the hardware is the same? Because people should update if their hardware is capable. Maybe seeing apps start to demand it will help them get over the annoyance of not having their device for 10 minutes, although I must say that the 5.0.0 to 5.0.1 update was a very painless experience.

Poor WebOS

HP just reported huge losses, but reports indicate that the fire sale of the TouchPad propelled it to number two among tablets. It’s still a far cry from Apple, but it beat out all the Android tablets. I can see why. Palm designed the tablet version of WebOS to operate much like the iPad version of iOS, so the experience is instantly familiar to iOS users. I’ll admit I only tried them in stores, but as an iOS user I could figure out how to do things on the TouchPad. On the Honeycomb tablets I could not. They felt like using cellphones in the 2000s, where you couldn’t remember if it was Settings > Sounds > Ringtone or Tools > Phone > Ringtone and you basically had to look everywhere to do anything. WebOS made sense to me. The card paradigm is different from iOS, but it makes sense.

So it’s really unfortunate how things are going at HP regarding WebOS. They have a few choices.

They could license WebOS and stop producing hardware
This may result in fragmentation and preinstalled bloatware. But more importantly, it could lead to the driver nightmares and delayed upgrades that Android users are so willing to put up with. WebOS is almost as grandma-friendly as iOS; it’s definitely parent-friendly at least. I don’t think the added complexity of getting your software updates from Samsung or HTC rather than HP, and accidentally bricking your device, is worth whatever business sense this makes. After all, Apple could’ve licensed OS X easily (Michael Dell said he was interested), but that might’ve brought a lot of substandard hardware (and thus experiences) into the mix. If you ever tried to use that Core Solo Mac mini with the GMA 950, you’ll know what I’m talking about.

HP could sell WebOS
I can’t say I have faith in this because the only likely buyer is Oracle and I hate them. Their acquisition of Sun was a reminder that the Java age had in fact closed by 2006. Based on Oracle’s business model (products too complicated to not pay for ongoing support) I don’t see this as a good fit, even with Mark Hurd.

I also don’t think any international companies would be a good fit. If you haven’t noticed, American capitalism is a lot different from capitalism elsewhere. With the exception of RIM, the operating systems and ecosystems other companies build for come out of the US. Call me patriotic, call me elitist, whatever, but I don’t think Sony or Samsung has that good old American drive needed to make an operating system that’s more than status quo.

RIM needs to tear everything up and start all over but they’re too proud so they’ll never be smart enough to use WebOS. They’re also likely financially unable at this point.

HP could keep WebOS
But HP will never figure it out. WebOS is fun. HP is servers and printer bloatware.

So what would I like to see?
It’s unrealistic, but I would like to see WebOS become the true open source operating system that Android only claims to be. Android is about as open as Mac OS X: you can get the kernel, X11, GCC and all that good stuff, but to get anything recognizable (like Cocoa is to Mac OS X) you need to be in bed with Google. So I would like to see WebOS become the responsibility of the WebKit team. WebKit is, after all, most of it anyway. Being truly open (no deals to sign with anyone) would also open up a new world of smart devices. Imagine what having an HTML5 browser on the front of your refrigerator could do for your cooking. But maybe I’m getting ahead of myself.

Think of all the screens you see during the day whose underlying OS you never think about. What come immediately to my mind are NJ Transit timetables and ticket machines. I’ve seen them BSOD (and I have a pic I’ll upload when I get home). Imagine if instead of Windows they ran WebOS and had proper capacitive touch screens that were actually responsive. I’m not saying you’d be able to flick the ticket app away and arbitrarily browse, but it would be an improvement. And of course, once you turn all those things into web apps, that opens up the possibility of just releasing them. “See this screen on your phone: http://shorturl/somehash”.

Imagine the same for maps in the mall. Write one HTML5 app and run it on a vertical 46″ plasma with or without a touch layer, and offer the URL so you can do the same on whatever tablet you have with you. Maybe add in mall WiFi to assist with that…