The future of user interfaces is in our hands, literally. Both hands. A lot of people are saying the way forward is multi-touch input on the screens of devices like the iPhone. I say, don't forget about the Apple Remote, which is here right now!
When I worked at IBM Canada, I attended a talk given by Bill Buxton, who was doing UI research at Alias|Wavefront at the time. Bill is heavily into multi-touch interfaces. He described some gestural interfaces and two-handed input techniques employed by artists at Alias, and that idea really struck a chord with me and has stayed in my mind ever since.
Fast forward to 2007. In photography, I'm losing patience with the tradeoffs between DSLRs and point-and-shoots, and have started using a two-handed, two-camera shooting technique for combined landscape/wildlife outings. At home, I have an iMac with an Apple Remote, and have started using the remote with Mira, a driver that lets the remote control any application.
At first, the remote seems like a good way to cut down on eye and wrist strain; it replaces the mouse for dead-simple interaction that shouldn't require hunching over the keyboard. I set up Firefox shortcuts so the remote can boost the font size, page up / page down, and close the current window. That way, I can open a bunch of tabs and then sit back and read 'em at a comfortable distance on the 24" screen. I set up shortcuts for image-viewing applications to go full-screen, step forward/back, and delete the current picture. Then it's much simpler to cull the bad pictures from a picture-taking trip.
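(The exact layout is up to you, since Mira will map any button to any keystroke. One arrangement that would do the trick: + and - sending Cmd-+ and Cmd-- to change the text size, the left and right buttons sending Shift-Space and Space to page up and down, and Menu sending Cmd-W to close the current tab.)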
But as I ponder my most laborious interactive tasks, I realize that the remote can supplement the mouse, the two working in concert. In Photoshop, I can make the remote flag, delete, launch, and handle other common operations that usually involve double-clicks, multiple keystrokes, or widely spaced menus and icons. I still need the mouse to select the photo to act upon. Mouse in one hand, remote in the other, I'm a whirling dervish of photographic productivity.
It would be even better if Mira could adapt its remote functions based on which Photoshop window was open: in the Layers palette, do this and this; in the Levels dialog, do that and that. I'll suggest that to the fine folks at Twisted Melon. Another multi-use sort of app is iTunes: half the time I want the usual Play/Pause/Next controls, and half the time I want to assign ratings or edit metadata using shortcuts.
Other challenges await. A lot of programming work already involves cursoring up and down and copying and pasting. Maybe I can come up with a set of mappings for OS X's Terminal. I'm sure Mail.app will go faster when one hand selects a message and the other chooses from half a dozen actions on that message.
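For Terminal, a first stab might be + and - sending the up and down arrows to walk through command history, left and right sending Cmd-C and Cmd-V for copy and paste, and the center button sending Return. Pure guesswork until I try it, but it's a start.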
Now that I think about it, possibly the finest days of multi-handed input for the average person came in the '80s with the Commodore 64. I remember using a light pen, a Koala Pad, and game controllers with spinning knobs, all in the course of a day. Today, the parallels are a Wacom tablet, that glowing knob/button whose name I forget, and the Apple Remote. Ah, if only the infrared sensor could record position like a light pen!
Friday, April 13, 2007