A smartphone app developed at Drexel University deconstructs music into discernible elements like tone, intensity, and rhythm, facilitating a fuller way of experiencing a live performance. Drexel's Expressive and Creative Interaction Technologies (ExCITe) Center uses the “Science of Jazz” app to translate some of what’s measurable about music into visual form.
Microphones capture the sound, and the app transforms it into images in real time: one showing how sound waves reach different parts of the concert hall, another approximating which notes the musicians are playing on their instruments, and another depicting the pitch and intensity of each instrument.
Dr. Youngmoo Kim, ExCITe’s director, is behind the app, which he says makes a live concert more educational and meaningful. Limited to the iPhone, and used so far only at jazz performances, here is how it works:
In these excerpts of Meridee Duddleston’s interview, Dr. Kim describes the synergy between art and science and paints a word picture of how the “Science of Jazz” iPhone app works. The app was first demonstrated during a jazz concert at the Philadelphia Science Festival in 2012 and was further refined for another concert in 2013.