
1. How are things working? Did you hit any bumps in the road? Is your plan for your project essentially the same, or are you going to go in a new direction? Tell me about your plan for the last two weeks of class until the project is due.

     During the past few weeks, we kept making progress on our music visualization project. Overall, our plan for the project is essentially the same: the visualization is driven by the volume and pitch of the music. After some struggle to improve the detection algorithms, the volume and pitch detection results are now satisfactory. The only thing we decided to change is the visualization method. Initially we planned to generate a mesh grid in 3D space, where each new pitch in the music would launch a wave whose peak sits at the grid point corresponding to that pitch. However, this method requires the detection results to be ideal (consecutive steps in the time-frequency plot), and a single outlier in the detection results can make “weird things” happen. This was a real bump in our road, so we switched to a new visualization method: before the synchronized visualization starts, we draw a 2D trajectory corresponding to the pitch variations, and then a colored circle moves along that trajectory. The size of the circle corresponds to the instantaneous volume, and the height and color of the circle correspond to the frequency. This method is working quite well.
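As a rough illustration of this volume/pitch-to-circle mapping, here is a Python sketch (our project code itself is in Matlab; the pitch range, radius range, and color scheme below are placeholder assumptions, not our actual parameters):

```python
import numpy as np

def circle_params(volume, pitch, pitch_min=80.0, pitch_max=1000.0,
                  r_min=0.05, r_max=0.5):
    """Map instantaneous volume and pitch to the moving circle's
    radius, height, and RGB color."""
    # Radius grows linearly with volume (assumed normalized to [0, 1]).
    radius = r_min + (r_max - r_min) * np.clip(volume, 0.0, 1.0)
    # Height and color both follow the pitch's position (log scale)
    # within an assumed plausible pitch range.
    t = np.clip((np.log(pitch) - np.log(pitch_min))
                / (np.log(pitch_max) - np.log(pitch_min)), 0.0, 1.0)
    height = t
    color = (t, 0.2, 1.0 - t)  # low pitches blue, high pitches red
    return radius, height, color
```

At each animation frame, the circle is redrawn at the current trajectory point with these parameters.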

Plan for last two weeks:

     Our major task for the next two weeks will be developing a better visualization method in 3D space instead of 2D. Our idea is to replace the 2D trajectory with a wave-shaped trajectory in 3D space and replace the circle with a colored ball. The main challenge will be finding a mathematical expression for the center and radius of the ball so that the ball slides smoothly along the 3D trajectory. If we still have time after implementing this visualization method, we will spend more time improving our detection algorithms.
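One way to get a smooth slide is to parameterize the trajectory by playback time and interpolate the detected pitch contour between frames. A minimal Python sketch under that assumption (the wave shape and axis assignments are illustrative, not settled design choices):

```python
import numpy as np

def ball_center(t, times, pitches, amp=1.0):
    """Center of the ball at playback time t on a wave-shaped 3D trajectory:
    x advances with time, y follows the interpolated pitch contour, and
    z adds a sinusoidal undulation to give the wave shape."""
    pitch = np.interp(t, times, pitches)  # smooth pitch between detected frames
    return np.array([t, pitch, amp * np.sin(2.0 * np.pi * t)])
```

Because np.interp is piecewise linear, the center moves continuously; the ball's radius can reuse the same volume mapping as in the 2D version.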

 

2. What in-class DSP tools are you incorporating into your project?

FFT:

We applied the FFT to the windowed music signal (or the cepstrum-biased signal) in order to track the pitch of the music.
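As a simplified sketch of the idea (in Python with NumPy rather than our Matlab code, and reduced to picking the single largest spectral peak):

```python
import numpy as np

def fft_pitch(frame, fs):
    """Estimate the pitch of one frame as the frequency of the largest
    magnitude peak in the FFT of the windowed frame."""
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    return freqs[1 + np.argmax(spectrum[1:])]  # skip the DC bin
```

The frequency resolution is fs / len(frame), so longer frames give finer pitch estimates at the cost of time resolution.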

Filter:

We designed two filters for our project based on what we learned in class. The first is a high-pass filter applied to the original music signal; we used the Matlab filter design tool introduced in class to help with the design. The second is a moving median filter that removes outliers from the pitch-detection results.
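A Python approximation of both filters (a Butterworth design stands in here for the filter we built with the Matlab tool; the order and the 60 Hz cutoff are placeholder values):

```python
import numpy as np
from scipy.signal import butter, filtfilt, medfilt

def preprocess(signal, fs, cutoff_hz=60.0):
    """High-pass the raw audio to suppress low-frequency content
    before pitch detection (zero-phase, so peaks are not shifted)."""
    b, a = butter(4, cutoff_hz / (fs / 2.0), btype="high")
    return filtfilt(b, a, signal)

def clean_pitch_track(pitches, width=5):
    """Moving median filter: removes single-frame outliers from the
    pitch-detection results while preserving genuine note changes."""
    return medfilt(pitches, kernel_size=width)
```

The median filter is what lets a single bad pitch estimate disappear instead of producing a visible glitch in the visualization.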

3. What out-of-class DSP tools are you incorporating?

Autocorrelation:

The autocorrelation function measures how strongly a signal correlates with a delayed version of itself. A periodic signal, such as a sinusoid, correlates strongly with itself when delayed by its fundamental period. Therefore, by finding the peak of the autocorrelation function over a suitable range of delays, we can roughly estimate the fundamental period of the signal.
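In code, this amounts to searching the autocorrelation for its largest peak over the lags that correspond to plausible pitches; a Python sketch (the 80-1000 Hz search range is an assumed default, not our tuned value):

```python
import numpy as np

def autocorr_pitch(frame, fs, fmin=80.0, fmax=1000.0):
    """Estimate the fundamental frequency from the autocorrelation peak,
    searched only over lags corresponding to pitches in [fmin, fmax]."""
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)  # candidate periods in samples
    lag = lo + np.argmax(ac[lo:hi])
    return fs / lag
```

Restricting the lag range avoids the trivial peak at zero lag and octave errors at very long lags.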

Cepstrum:

The cepstrum method is one of the methods we implemented for pitch detection. The cepstrum is computed by taking the inverse FFT of the log-magnitude spectrum of the windowed signal. The pitch can then be determined by finding the peak of the cepstrum within a limited quefrency range.
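A compact Python sketch of cepstral pitch detection (the quefrency search range is an illustrative assumption, as is the small constant added to avoid log of zero):

```python
import numpy as np

def cepstrum_pitch(frame, fs, fmin=80.0, fmax=1000.0):
    """Pitch via the real cepstrum: the quefrency of the largest cepstral
    peak in the plausible range gives the fundamental period."""
    windowed = frame * np.hanning(len(frame))
    log_mag = np.log(np.abs(np.fft.fft(windowed)) + 1e-12)  # avoid log(0)
    ceps = np.fft.ifft(log_mag).real
    lo, hi = int(fs / fmax), int(fs / fmin)  # candidate periods in samples
    quefrency = lo + np.argmax(ceps[lo:hi])
    return fs / quefrency
```

Because the cepstrum looks for the periodic spacing of harmonics in the log spectrum, it tends to be more robust on harmonic-rich music than picking a single FFT peak.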

4. What is the coolest / most fun thing you've done on your project so far?

     The coolest thing we’ve done so far is applying different filters with different parameters to our music signal and seeing how they work. Each time we applied a new filter, we could listen to the filtered music and hear how things changed. This really helped us develop some intuition for filter design. Another fun thing we’ve done is visualizing the music: trying different visualization methods is really fun. Sometimes our visualization went “crazy” because of some silly mistakes we made, and those crazy results gave us a good laugh.
