Look, ma, no hands!


Oct 1st, 2002

I have been trying a novel method of "writing" without using a conventional QWERTY keyboard. I would like to have been able to say that this column was written with this intriguing method but, alas, the system isn't yet fully developed and I did not have the time to become as proficient as I am with a traditional keyboard. I am not a touch typist and I do not use all 10 fingers but I generally move right along—with ample pauses for querying my muse.

The text-entry system, as the two physicists who invented it prefer to describe it, is called "Dasher," probably because the user "dashes" after letters as sentences are composed. The inventors, David MacKay and David Ward, are researchers in the famed Cavendish Laboratory of Cambridge University. MacKay's original inspiration came when he was struggling with a tiny conventional keyboard attached to a personal digital assistant (PDA). That set him thinking: "how could we make a smaller PDA, given that the limiting thing is the keyboard?"

MacKay and Ward believed that they could make something much more efficient than a regular keyboard. A single digit, MacKay points out, is capable of fine, continuous movements, as opposed to the clumsy pushing down of keys. If such a pointing system could be combined with a predictive language model, then it would be theoretically possible to write at 24 characters per second, according to MacKay. And if the pointing system were an eye-tracker that used movement of the eye to replace mouse movement, then true no-hands text entry would be possible.
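The reasoning behind a figure like MacKay's can be sketched as a back-of-envelope calculation: writing speed is roughly the information rate of the pointing gesture divided by the entropy of the text under the language model. The specific numbers below (24 bits per second for the pointer, about 1 bit per character for well-modeled English) are my illustrative assumptions, not figures from the column or the Nature paper:

```python
# Back-of-envelope estimate of no-keyboard writing speed.
# ASSUMPTIONS (illustrative, not from the source): a steered pointer
# conveys ~24 bits/s, and a good language model compresses English
# to ~1 bit per character.
pointer_bits_per_second = 24.0
bits_per_character = 1.0

# Characters per second = information supplied / information needed per char.
chars_per_second = pointer_bits_per_second / bits_per_character

# Convert to words per minute using the usual 5-characters-per-word convention.
words_per_minute = chars_per_second * 60 / 5

print(chars_per_second)   # characters per second
print(words_per_minute)   # words per minute
```

The point of the arithmetic is that the better the language model predicts the next character, the fewer bits the user must supply by pointing, so the same gesture rate yields faster writing.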

The authors described the Dasher system in a recent issue of Nature ("Fast Hands-free Writing by Gaze Direction," Nature 418, p. 838, Aug. 22, 2002; www.nature.com). It's quite difficult to describe Dasher in words—the simple animated demos on the researchers' web site make the process much easier to grasp. The user is first presented with the letters of the alphabet arranged vertically at the right of the screen, each letter in a colored box. When the user moves the pointer toward a particular letter (by hand with a mouse, or by eye-tracker), the letter moves to the left and its box expands to present the letters that Dasher judges most likely to follow. For example, if the first letter is "h," then the vowels "a, e, i, o, u" pop out from the right and the user points to the desired second letter. If "e" is chosen, the most probable next letters present themselves for the third position. Thus the word "hello" might be formed. You can download free prototype versions of Dasher from the researchers' web site (www.inference.phy.cam.ac.uk/dasher). Free software for head tracking using an inexpensive web camera can be found at www.mousevision.com.
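The key layout rule described above—likely letters get bigger boxes, so they are easier targets—can be sketched in a few lines. Everything here is illustrative: the bigram probabilities are invented numbers standing in for Dasher's real language model, and `layout_boxes` is a hypothetical helper, not part of the actual Dasher code:

```python
# Minimal sketch of Dasher's box-sizing idea: each candidate next
# letter gets a vertical box whose height is proportional to the
# language model's probability for that letter.
# ASSUMPTION: a toy table of invented conditional probabilities
# P(next letter | previous letter) stands in for the real model.
BIGRAM = {
    "h": {"e": 0.5, "a": 0.2, "i": 0.15, "o": 0.1, "u": 0.05},
}

def layout_boxes(prev_char, screen_height=480):
    """Return (letter, box_height) pairs for the next column of boxes,
    most probable (tallest) first."""
    probs = BIGRAM[prev_char]
    total = sum(probs.values())
    ranked = sorted(probs.items(), key=lambda kv: -kv[1])
    return [(letter, screen_height * p / total) for letter, p in ranked]

for letter, height in layout_boxes("h"):
    print(letter, round(height))  # "e" gets half the screen; "u" a sliver
```

With these toy numbers, "e" after "h" occupies half the 480-pixel column while "u" gets a sliver—exactly the effect that lets a user dash through probable text quickly.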

After about 10 to 15 minutes of quite difficult concentration, I could "write" at about 10 words a minute, considerably slower than my keyboard typing rate. The authors claim that speeds of 15 to 25 words per minute are possible after an hour or so of practice. One of the researchers, David Ward, can write at 39 words per minute. That may be near the limit unless the language model is enhanced.

The version I used produced underscores instead of word spaces, and capitalization was rather clumsy (there was no equivalent of a "shift" key). Nor did the system offer punctuation or numerals. Consequently, I produced strings of words that resembled the lower-case writings of the poetic cockroach Archy, who wrote by jumping on the keys of the manual typewriter belonging to humorist Don Marquis.

MacKay and Ward are determined to keep the Dasher software in the "open source" mode, like Linux and X Windows, both programs that the researchers used in designing and developing Dasher. I'm looking forward to trying Version 3 for Windows, which should be available by the time you read this column.


ATD Online Editorial Director
jbairstow@pennwell.com
