A smartphone app developed at Drexel University deconstructs music into discernible elements like tone, intensity, and rhythm, facilitating a fuller way of experiencing a live performance. Drexel's Expressive and Creative Interaction Technologies (ExCITe) Center uses the "Science of Jazz" app to translate some of what's measurable about music into visual form.
Microphones capture sound, and the app transforms it into images in real time: one showing how sound waves reach different parts of a concert hall, another approximating which notes musicians are playing on their instruments, and another depicting the pitch and intensity of each instrument.
Dr. Youngmoo Kim, ExCITe's director, is behind the app, which he says makes a live concert more educational and meaningful. Limited to the iPhone, and used only for jazz performances so far, here is how it works:
In these excerpts of Meridee Duddleston's interview with Dr. Kim, he describes the synergy between art and science and creates a word picture of how the "Science of Jazz" iPhone app works. It was first demonstrated during a jazz concert at the Philadelphia Science Festival in 2012 and was further refined for another concert in 2013.