Music and Audio Computing Lab

AI DJ


Main Contributor: Taejun Kim

Once upon a time, humans discovered how to record music, and during the Great Depression radio stations began playing records instead of hiring live musicians. Despite musicians' resistance, playing records became the standard for radio. Soon, people in clubs started dancing to records instead of live music. Recorded music is not only an inexpensive way to listen to music; it also has a charm of its own: music of many different styles can be played.

Nowadays, dancing to recorded music curated and mixed by DJs is a central format in dance culture. However, how DJs play music has never been explored thoroughly in a data-driven way, nor has knowledge from statistical analysis been used to build an automatic DJ system.

The AI DJ project has collected, at a large scale, music records and DJ mixes that DJs played in past shows (e.g., dance festivals and podcasts). Our goal is to understand patterns in DJ techniques and to use the distilled knowledge to build automatic DJ systems that can help human DJs and listeners.



DJ Mix Analysis

The goal of DJ mix analysis is to understand patterns in DJ techniques statistically, answering questions such as:

  1. How do DJs select music records? Are the genres/moods of the selected records consistent? Do they consider the keys of the records? (A small key/tempo compatibility sketch follows this list.)
  2. In which order do they play music?
  3. How much do they modify tempo/key to keep the music flowing seamlessly without losing the original character of individual records?
  4. How do they make a smooth transition from the previous track to the next?
  5. How do they control DJ mixers?
  6. Do they consider musical structures?
  7. ...and more!
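
As a small illustration of the kind of heuristics behind the first and third questions above, the sketch below checks whether two candidate records are harmonically and tempo-compatible. It uses the Camelot-wheel notation that DJs commonly use for keys (e.g. "8A" for A minor, "8B" for C major); the ±1-step rule and the 8% tempo threshold are generic rules of thumb given for illustration, not findings of this project.

    def camelot_compatible(key_a: str, key_b: str) -> bool:
        """Two Camelot keys mix harmonically if they share the same wheel position
        (e.g. relative major/minor) or sit one step apart with the same mode (A/B)."""
        num_a, mode_a = int(key_a[:-1]), key_a[-1].upper()
        num_b, mode_b = int(key_b[:-1]), key_b[-1].upper()
        if num_a == num_b:
            return True
        step = min((num_a - num_b) % 12, (num_b - num_a) % 12)
        return step == 1 and mode_a == mode_b

    def tempo_compatible(bpm_a: float, bpm_b: float, max_stretch: float = 0.08) -> bool:
        """Whether two records can be beat-matched within roughly a +/-8% tempo change."""
        return abs(bpm_a - bpm_b) / max(bpm_a, bpm_b) <= max_stretch

    # Example: an A-minor record at 126 BPM and an E-minor record at 124 BPM
    # print(camelot_compatible("8A", "9A"), tempo_compatible(126, 124))  # True True
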
In more detail, the DJ mix analysis includes tasks such as:
  1. Track Identification recognizes which musical tracks are played by DJs.
  2. Mix-to-track Alignment aligns the identified original tracks to the DJ mixes for further analysis (see the alignment sketch following this list).
  3. Cue Point Extraction finds the time positions at which the tracks start and stop playing in the DJ mixes.
  4. Transition Analysis explains which audio effects are applied to the previous and next tracks, and how, to produce a smooth transition.
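
As a concrete illustration of the mix-to-track alignment step, the minimal sketch below locates one identified track inside a mix using subsequence DTW over chroma features (via librosa). The file names, feature choice, and hop size are illustrative assumptions; the published analysis uses its own feature pipeline and constraints.

    import numpy as np
    import librosa

    def align_track_to_mix(track_path, mix_path, sr=22050, hop=4096):
        """Roughly locate where one identified track is played inside a DJ mix.

        Subsequence DTW matches the whole track (X) against the best-fitting
        region of the mix (Y). A coarse hop keeps the DTW cost matrix small
        enough for long mixes.
        """
        track, _ = librosa.load(track_path, sr=sr)
        mix, _ = librosa.load(mix_path, sr=sr)

        # Chroma is fairly robust to the level and EQ changes a DJ applies.
        chroma_track = librosa.feature.chroma_cqt(y=track, sr=sr, hop_length=hop)
        chroma_mix = librosa.feature.chroma_cqt(y=mix, sr=sr, hop_length=hop)

        _, wp = librosa.sequence.dtw(X=chroma_track, Y=chroma_mix,
                                     subseq=True, metric='cosine')
        wp = np.asarray(wp)[::-1]  # backtracked path, reversed to run start -> end
        start_sec = wp[0, 1] * hop / sr
        end_sec = wp[-1, 1] * hop / sr
        return start_sec, end_sec, wp

    # Usage (hypothetical file names):
    # start, end, path = align_track_to_mix("original_track.wav", "dj_mix.wav")
    # print(f"The track plays roughly from {start:.0f}s to {end:.0f}s of the mix.")
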
The image below is a screen capture of our LIVE WEB DEMO of transition analysis. The curves show how the DJ controlled the EQ knobs on a DJ mixer. You can listen to the individual EQ-applied tracks and the reconstructed mix, and compare them to the original DJ mix.
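
The sketch below shows, under simplifying assumptions, how such per-band gain curves could be recovered: the outgoing track, the incoming track, and the mix are assumed to be already time-aligned over the transition region, the spectrum is split into three rough bands, and a plain non-negative least-squares fit with piecewise-constant gains stands in for the smoothness-constrained convex optimization used in the actual analysis. The band edges, frame size, and window length are illustrative choices.

    import numpy as np
    import librosa
    from scipy.optimize import nnls

    # Rough low/mid/high band edges in Hz, mimicking the three EQ knobs on a DJ mixer.
    BANDS = [(20, 250), (250, 2500), (2500, 11000)]

    def band_energies(y, sr, n_fft=4096, hop=4096):
        """Frame-wise energy of each band, from a power spectrogram."""
        S = np.abs(librosa.stft(y, n_fft=n_fft, hop_length=hop)) ** 2
        freqs = librosa.fft_frequencies(sr=sr, n_fft=n_fft)
        return np.stack([S[(freqs >= lo) & (freqs < hi)].sum(axis=0) for lo, hi in BANDS])

    def estimate_gain_curves(prev_seg, next_seg, mix_seg, sr, win=8):
        """Estimate step-wise per-band gain curves for the two tracks in a transition.

        prev_seg, next_seg, mix_seg: audio of the outgoing track, the incoming track,
        and the mix over the same (already time-aligned) transition region. Within each
        window of `win` frames the gains are held constant, so each band reduces to a
        small overdetermined non-negative least-squares problem.
        """
        E_prev, E_next, E_mix = (band_energies(x, sr) for x in (prev_seg, next_seg, mix_seg))
        n = min(E_prev.shape[1], E_next.shape[1], E_mix.shape[1])
        curves = []
        for b in range(len(BANDS)):
            g_prev, g_next = [], []
            for start in range(0, n, win):
                sl = slice(start, min(start + win, n))
                A = np.stack([E_prev[b, sl], E_next[b, sl]], axis=1)  # (frames, 2)
                g, _ = nnls(A, E_mix[b, sl])
                g_prev.append(g[0])
                g_next.append(g[1])
            curves.append((np.array(g_prev), np.array(g_next)))
        return curves  # one (outgoing, incoming) pair of gain curves per band
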

The videos below were presented at ISMIR 2020 and NIME 2021:


Related Publications

  • Reverse-Engineering The Transition Regions of Real-World DJ Mixes using Sub-band Analysis with Convex Optimization
    Taejun Kim, Yi-Hsuan Yang, and Juhan Nam
    Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2021 [paper] [code] [demo] [video]
  • A Computational Analysis of Real-World DJ Mixes using Mix-To-Track Subsequence Alignment
    Taejun Kim, Minsuk Choi, Evan Sacks, Yi-Hsuan Yang, and Juhan Nam
    Proceedings of the 21st International Society for Music Information Retrieval Conference (ISMIR), 2020 [paper] [code] [poster] [video]



Automatic DJ System

Using the knowledge and data extracted from the DJ mix analysis, we build automatic DJ systems that help people who want to learn DJing or simply enjoy music. Music recommendation and sequencing have mostly been studied using user and tag data, whereas we study them using data from human DJs. Furthermore, we model systems that automatically apply audio effects when the previous and next tracks are given. This topic is currently ongoing.
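
As one hypothetical building block of such a system, the sketch below tempo-matches the incoming track to the outgoing one and renders a plain equal-power crossfade. The fade length, the lack of beat-phase alignment, and the absence of learned EQ moves or cue points are all simplifications meant to show the shape of the problem, not our actual system.

    import numpy as np
    import librosa

    def simple_transition(prev_path, next_path, fade_sec=16.0, sr=44100):
        """Render a naive transition: tempo-match the incoming track, then apply an
        equal-power crossfade over the last `fade_sec` seconds of the outgoing one."""
        prev_y, _ = librosa.load(prev_path, sr=sr)
        next_y, _ = librosa.load(next_path, sr=sr)

        # Match the incoming track's tempo to the outgoing one, as a DJ's sync/pitch fader would.
        tempo_prev, _ = librosa.beat.beat_track(y=prev_y, sr=sr)
        tempo_next, _ = librosa.beat.beat_track(y=next_y, sr=sr)
        rate = float(np.atleast_1d(tempo_prev)[0]) / float(np.atleast_1d(tempo_next)[0])
        next_y = librosa.effects.time_stretch(next_y, rate=rate)

        # Equal-power crossfade (assumes both tracks are longer than the fade).
        n_fade = int(fade_sec * sr)
        t = np.linspace(0.0, 1.0, n_fade)
        overlap = prev_y[-n_fade:] * np.cos(t * np.pi / 2) + next_y[:n_fade] * np.sin(t * np.pi / 2)
        return np.concatenate([prev_y[:-n_fade], overlap, next_y[n_fade:]])

    # Usage (hypothetical file names):
    # mix = simple_transition("outgoing.wav", "incoming.wav")
    # import soundfile as sf; sf.write("transition.wav", mix, 44100)
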