The beat goes on

March 1, 2008
Vision-based music synthesizer uses pattern recognition.
Conard Holton

When Icelandic singer/performer Björk took a machine-vision-based synthesizer on her recent Volta tour, her group created a melodic blend of electronic music and image processing. A somewhat similar experience was available this past November to any engineer visiting the Allied Vision Technologies booth at VISION 2007 in Stuttgart, Germany, where the instrument was available to budding pop stars.

The Reactable synthesizer, developed by researchers at Pompeu Fabra University (Barcelona, Spain), advances the electronic music synthesizer first demonstrated by Robert Moog in 1964. To “play” the instrument, a musician moves Plexiglas objects representing the elements of a classical modular synthesizer across a round translucent tabletop, while a camera below continuously analyzes the surface, tracking the nature, position, and orientation of the objects so as to affect the sound.

The Plexiglas objects perform different functions based on their geometric shape. For example, square elements generate basic tones, while round objects act as sound filters that modulate those tones. The symbol on each element determines the type of basic tone or filter, and the spatial relationship of the objects to each other determines the extent to which one element affects another.
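The article doesn't describe the Reactable's internal data model, but the rules above (shape determines function, symbol determines type, proximity determines coupling) can be sketched as a hypothetical structure; all names here are illustrative, not from the actual Reactable software:

```python
from dataclasses import dataclass

# Hypothetical model of a tracked table object; field names are
# illustrative assumptions, not the Reactable's actual API.
@dataclass
class TableObject:
    shape: str    # "square" = tone generator, "round" = sound filter
    symbol: str   # fiducial symbol: selects the specific tone/filter type
    x: float      # position on the tabletop (normalized units)
    y: float

def coupling(a: TableObject, b: TableObject, max_dist: float = 0.5) -> float:
    """Spatial relationship determines how strongly one element
    affects another; here modeled as a linear falloff with distance
    (an assumption -- the real mapping is not published)."""
    d = ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
    return max(0.0, 1.0 - d / max_dist)
```

Two nearby objects would then be linked with a strength proportional to `coupling()`, while objects farther apart than `max_dist` stay independent.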

To meet the system requirements for object recognition, a special set of symbols known as “amoebae” was developed to encode each individual Plexiglas object. To recognize each object, images of the table are captured with a Guppy FireWire camera from Allied Vision Technologies (Stadtroda, Germany) and converted to a binary black-and-white image using an adaptive thresholding algorithm.
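The article doesn't specify which adaptive thresholding variant is used; a common choice, sketched below with NumPy, is local-mean thresholding, where each pixel is compared against the mean of its surrounding block (computed quickly with a summed-area table). The parameters `block` and `offset` are illustrative assumptions:

```python
import numpy as np

def adaptive_threshold(gray, block=15, offset=10):
    """Binarize a grayscale image by comparing each pixel to the mean
    of its local block x block neighborhood.  Pixels darker than
    (local mean - offset) become black (0); the rest become white (255).
    A sketch of local-mean adaptive thresholding, not the Reactable's
    actual implementation."""
    h, w = gray.shape
    pad = block // 2
    padded = np.pad(gray.astype(np.float64), pad, mode="edge")
    # Summed-area table (zero row/column prepended) for fast box sums.
    sat = padded.cumsum(axis=0).cumsum(axis=1)
    sat = np.pad(sat, ((1, 0), (1, 0)))
    local_sum = (sat[block:, block:] - sat[:-block, block:]
                 - sat[block:, :-block] + sat[:-block, :-block])
    local_mean = local_sum / (block * block)
    return np.where(gray < local_mean - offset, 0, 255).astype(np.uint8)
```

Unlike a single global threshold, this keeps the dark fiducial symbols cleanly separated from the background even when the translucent tabletop is unevenly lit.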

The Reactable synthesizer projects markings onto the surface of the table to make the instrument easy to operate. These confirm to the musician that an object has been recognized by the system and provide additional information about the status of the generated tone and its interaction with neighboring objects. The artist can thus see the connections and a dynamic graphic presentation of the generated sound waves on the table, and can change individual sound parameters by touching the projected information with a finger.

After capture by the camera, the image of each Plexiglas object is analyzed and its position and orientation measured. The corresponding sound information is then generated and sent to the PC's speakers as an audio signal, as well as being projected graphically onto the tabletop.
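The article doesn't detail the pose math. One common approach in fiducial tracking, sketched hypothetically below, takes the marker's position from the centroid of all its blob points and its orientation from the angle of the vector between the centroids of its black and white sub-regions (the asymmetric "amoeba" pattern makes that vector well defined):

```python
import math

def marker_pose(black_points, white_points):
    """Estimate a fiducial marker's position and orientation.
    Position: midpoint of the black- and white-region centroids.
    Orientation: angle (radians, from the +x axis) of the vector
    from the black centroid to the white centroid.
    Inputs are lists of (x, y) tuples; this is an illustrative
    sketch, not the Reactable's actual tracking code."""
    def centroid(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

    bx, by = centroid(black_points)
    wx, wy = centroid(white_points)
    position = ((bx + wx) / 2.0, (by + wy) / 2.0)
    angle = math.atan2(wy - by, wx - bx)
    return position, angle
```

Rotating a physical object on the table shifts the black/white centroid vector, so the returned angle tracks the rotation the musician uses to adjust frequency or filter settings.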

Showtime

At the VISION 2007 show, Günter Geiger, a senior researcher at Pompeu Fabra University, demonstrated how each object synthesized sound. First, an object representing a sine-tone generator was placed on the table. By rotating the object, the frequency of the generated sound could be modulated. Next, an object representing a bandpass filter was placed on the Reactable table. After the PC recognized the object, its bandpass function was linked with the sine generator and displayed graphically on the table. Rotating the object changed the resonant frequency of the filter. In a similar manner, a third object, in this case an audio delay, could be placed in the pipeline and its delay length modified. The result is a graphical, real-world representation of the sound path, built from physical objects. Other objects, such as low-pass filters, oscillators, and saw-tooth generators, can be added to the table.
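The demonstrated chain (sine generator, then bandpass filter, then delay) can be sketched as software modules; this is a minimal illustrative signal chain, not the Reactable's actual synthesis engine, with the bandpass implemented as a standard biquad (RBJ cookbook, constant 0 dB peak gain):

```python
import math

SR = 44100  # sample rate in Hz (an assumption)

def sine(freq, seconds):
    """Sine-tone generator; rotating the table object maps to freq."""
    n = int(SR * seconds)
    return [math.sin(2 * math.pi * freq * i / SR) for i in range(n)]

def bandpass(samples, center, q=5.0):
    """Biquad band-pass filter; rotating the filter object maps to
    the resonant frequency `center`."""
    w0 = 2 * math.pi * center / SR
    alpha = math.sin(w0) / (2 * q)
    b0, b1, b2 = alpha, 0.0, -alpha
    a0, a1, a2 = 1 + alpha, -2 * math.cos(w0), 1 - alpha
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = (b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2) / a0
        out.append(y)
        x2, x1 = x1, x
        y2, y1 = y1, y
    return out

def delay(samples, seconds, feedback=0.5):
    """Audio delay: mixes in a delayed, attenuated copy of the signal."""
    d = int(SR * seconds)
    out = list(samples)
    for i in range(d, len(out)):
        out[i] += feedback * out[i - d]
    return out

# Chain the modules as the objects are chained on the table:
signal = delay(bandpass(sine(440.0, 0.1), center=440.0), seconds=0.02)
```

Placing or removing a physical object then corresponds to inserting or dropping one of these stages in the chain, which is what the tabletop graphics visualize.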

“With a camera and a constant bandwidth, we can monitor the entire surface, regardless of how many objects are on the table. In addition, image processing determines not only the position of the objects but also their orientation (rotation) and can recognize human fingers on the tabletop. With this, the optical solution offers many more fine-tuning possibilities and nuances than alternative technologies and places almost no limits on musical creativity,” says Sergi Jordà, director of the project.

Image transfer and processing speed were critical in the design of the system. “With the digital FireWire cameras, the data rate allows 60 frames/s to be captured and processed in real time,” explains Jordà. The camera was equipped with a near-infrared filter to decrease sensitivity to ambient light.

Prototypes have proved the system can function reliably in a live concert, as Björk demonstrated. Visit http://mtg.upf.edu/reactable to see the Reactable in action.

About the Author

Conard Holton | Editor at Large

Conard Holton has 25 years of science and technology editing and writing experience. He was formerly a staff member and consultant for government agencies such as the New York State Energy Research and Development Authority and the International Atomic Energy Agency, and engineering companies such as Bechtel. He joined Laser Focus World in 1997 as senior editor, becoming editor in chief of WDM Solutions, which he founded in 1999. In 2003 he joined Vision Systems Design as editor in chief, while continuing as contributing editor at Laser Focus World. Conard became editor in chief of Laser Focus World in August 2011, a role in which he served through August 2018. He then served as Editor at Large for Laser Focus World and Co-Chair of the Lasers & Photonics Marketplace Seminar from August 2018 through January 2022. He received his B.A. from the University of Pennsylvania, with additional studies at the Colorado School of Mines and Medill School of Journalism at Northwestern University.

