By Daniel Choi B.M. ’12

Science and technology have always been integral to the evolution of music. Many musicians today associate the term “music technology” with sound engineering, and even sound engineering can mean varying depths of insight into the technical and scientific aspects of sound. However, science and technology in music are not limited to sound engineers using fancy audio gear to record or mix music, whether in the studio or in live environments. The influence of technology on music also extends to the way instruments are designed, the way music is composed and arranged, and the way sound is reproduced in any given environment.

I began my studies at Berklee with a simple goal: to learn to express my musical thoughts through writing and arranging. As my understanding of theoretical concepts was put into practice, I learned that the sonic qualities of music can heavily affect my intended expression, regardless of the composition and arrangement. In addition to my studies in contemporary writing and production, I consequently declared a second major, music production and engineering, in hopes of understanding the production process and learning how to sculpt the sonic aspects of my music.

Toward the end of my studies at Berklee, my craving to understand why my music sounds the way it does grew even stronger. I wanted to understand how the tools I was using to sculpt the sonic character of my music actually worked. While I did not envision myself becoming a developer of audio equipment, I simply wanted to learn how reverb plugins worked and what they were doing to recreate and emulate spatial depth in music. I wanted answers to why adjusting my equalizers actually affected time and phase. I needed to know, scientifically, why music that sounded decent in the studio sounded dreadful through a sound system in a reverberant venue.

Continuing my studies with a major in architectural acoustics at a polytechnic graduate school has provided many of those answers. I learned not only how sound behaves within a space, but also the foundations of digital signal processing and psychoacoustics. Through this education I began to pursue a career in architectural acoustics, and the choices I make in my music creation process have changed vastly as well. Putting the science of music into practice seems to be a lifelong learning journey, but all of the creative and artistic thinking behind my music now rests on a technical foundation. No, I do not think about the wave equation and Newton’s second law every time I apply a reverb plugin in my mix, nor do I deeply contemplate psychoacoustic frequency masking when minding low interval limits in my arrangements. However, understanding the technical aspects of my musical choices provides confidence and justification instead of always answering with “because it sounds good if I do it this way.”


Daniel Choi B.M. ’12 is a multi-talented professional with nine years of experience on a wide array of projects spanning performing arts, worship, education, and corporate work. His capabilities span data collection, analysis, facility assessments, research, design, and onsite inspections. Daniel is a graduate of RPI with a master’s degree in architectural acoustics and also holds an undergraduate degree in music production and engineering with a minor in acoustics and electronics. He was a sound engineer for various institutional and community-based performing arts productions. He plays bass, guitar, piano, and drums, and brings a musician’s ear to his work. Daniel has lived all over the world and speaks English, German, Korean, and Russian.