Pointing at the TV
By now it’s generally agreed that iTV is coming on Wednesday and is likely to be iOS-based.
Many folks have assumed that “running iOS” means “running iPad apps” directly, or iPad-style apps via another App Store. This raises a lot of questions about the interaction model: how do you manipulate an app that’s beyond your reach? If we expect any new iOS device to run existing apps from smaller screens, we run into the “focus” problem: if you can’t touch directly, you need context for the “noun” you are about to “verb” with the next tap.
There are a few ways to address focus.
A directional controller (d-pad or gestural touch surface) can navigate a straightforward, rectilinear menu interface as most TV interfaces (including Apple TV) do today. Or that same controller could move focus between more arbitrary active regions, as with many DVD menus. Jon Bell and Dan Wineman are excited about gestural touch surfaces and their potential here. After all, the Remote app for controlling your Apple TV from an iPhone has a similar touch surface approach for navigating menus.
I’m not convinced. Even though it is common to confound gestural touch surfaces and direct touch UI, this is still an indirect focus controller. I cannot imagine Apple adopting a discretely-shifting-focus UI akin to DVD menus, and the best alternative seems to be introducing a cursor for arbitrary focus. Once you’re using a directional controller (gestural or not) to control a cursor on a screen (decoupled from the controller surface)… well, that’s a pointer. You might as well have a mouse.
A more widget-like App Store isn’t out of the question, though, for very simple apps with simple interactions. But my hopes are for greater interaction between iOS devices.
It’s a common complaint that we cannot stream audio from our iPads to AirTunes; we could see that enabled along with a video streaming API. I imagine content owners are the only obstacle to the SDK enabling the Netflix app to send video to your Apple TV. Such an interface could mature into a very compelling platform for passive display, whether for displaying Keynote presentations or dashboards for multiplayer games.
Coordination from the other direction would also be sensible. If any connected iOS “remote” could display interactive meta-content, we’d finally have a chance at the sorts of compelling cross-screen experiences I’ve seen mocked up too many times. Apple already has a format for that, which could let us browse through supplemental content on our personal screen while watching media on the big one.
With time, I imagine we’ll see this sort of coordination, even if it’s disabled for “professional” content. This week, maybe we just get a scroll wheel.