Who moved my menus?

ANYONE who has grown up using a mouse or a trackpad will understand the concept of menus.

In a computer application, menus give us access to commands or functions that a program can carry out. For example, in a word processing program, you could execute a search and replace command by going to the Edit menu and clicking on “Find & Replace.”

Menus are easy to use in two senses: 1) They can display the complete range of commands available; and 2) They make selecting these commands easy. Just point and click and you’re on your way. The approach is tried and tested – and more than 30 years old.

Menus, in fact, are the third component of WIMP – windows, icons, menus and pointing device – a graphical approach to computing first introduced by the Xerox Palo Alto Research Center in the early 1970s. These early efforts at creating a graphical user interface or GUI (pronounced gooey) later became the basis for the Apple Macintosh, which popularized point-and-click and drag-and-drop computing. WIMP was also widely adopted by Microsoft’s Windows operating system and by the X Window System (X11) interfaces used in Linux.

Today, menus have become such a central part of the computing experience that it is difficult to imagine what a desktop would be without them. Yet this is exactly what Ubuntu founder Mark Shuttleworth is asking us to do, starting with the upcoming release of Ubuntu Linux, which will seek to replace traditional menus with a heads-up display or HUD. The HUD, he says, will control applications as well as the system.

Think of the HUD as a souped-up search box that can anticipate what we want from the first letters that we type into it, much like Google now “guesses” what we want to find as we start to type. If I were working with a word processor, for example, I could invoke the HUD and begin to type “FI” and the system would be smart enough to offer me “Find & Replace.”

In Ubuntu 12.04, the HUD is a smart look-ahead search through the app and system (indicator) menus, Shuttleworth says.

“It’s smart, because it can do things like fuzzy matching, and it can learn what you usually do so it can prioritise the things you use often,” Shuttleworth writes in his blog. “It covers the focused app (because that’s where you probably want to act) as well as system functionality; you can change IM state, or go offline in Skype, all through the HUD, without changing focus, because those apps all talk to the indicator system. When you’ve been using it for a little while it seems like it’s reading your mind, in a good way.”
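The idea is easy to illustrate. Here is a minimal sketch, in Python, of what such a look-ahead matcher might do: a simple subsequence fuzzy match over menu commands, ranked by how often the user has invoked each one. The command names and usage counts are hypothetical; this is not Ubuntu’s actual implementation.

```python
def fuzzy_match(query, command):
    """True if the letters of `query` appear in order within `command`."""
    letters = iter(command.lower())
    return all(ch in letters for ch in query.lower())

# Hypothetical menu commands with usage counts the matcher has "learned".
COMMANDS = {
    "Find & Replace": 12,
    "File > Save As": 3,
    "Insert Figure": 1,
    "Format Paragraph": 0,
}

def hud_suggest(query):
    """Return matching commands, most frequently used first."""
    hits = [c for c in COMMANDS if fuzzy_match(query, c)]
    return sorted(hits, key=lambda c: -COMMANDS[c])

print(hud_suggest("fi"))
# "Find & Replace" outranks "File > Save As" because it is used more often
```

Typing “fi” surfaces every command containing those letters in order, with the most habitual choice at the top, which is roughly the “reading your mind” effect Shuttleworth describes.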

Shuttleworth acknowledges that there are some things that traditional menus do well.

“They are always in the same place (top of the window or screen). They are organized in a way that’s quite easy to describe over the phone, or in a text book (“click the Edit->Preferences menu”), [and] they are pretty fast to read since they are generally arranged in tight vertical columns.”

On the other hand, menus become difficult to use when they are nested, and require users to remember the tree structure, he says. They are also generally more difficult to use from the keyboard, and force developers to make arbitrary choices about where menu items appear (should Preferences be in Edit or in Tools or in Options?).

“The HUD solves many of these issues, by connecting users directly to what they want,” Shuttleworth says. “It’s a ‘vocabulary UI’, or VUI, and closer to the way users think. ‘I told the application to…’ is common user paraphrasing for ‘I clicked the menu to…’. The tree is no longer important, what’s important is the efficiency of the match between what the user says, and the commands we offer up for invocation.”

If all this sounds pretty drastic, don’t worry. The traditional menus will still be around in the next version of Ubuntu. You’ll just have the option to jump to what you want faster with the HUD.

Never one to aim low, Shuttleworth says the natural next step is to get the HUD to accept voice commands.

“Searching is fast and familiar, especially once we integrate voice recognition, gesture and touch,” he writes. “We want to make it easy to talk to any application, and for any application to respond to your voice.”

For some reason, this vision brings to mind the following exchange between astronaut Dave Bowman and the computer HAL in the 1968 sci-fi classic, 2001: A Space Odyssey:

Dave Bowman: Hello, HAL. Do you read me, HAL?

HAL: Affirmative, Dave. I read you.

Dave Bowman: Open the pod bay doors, HAL.

HAL: I’m sorry, Dave. I’m afraid I can’t do that.
