Some major additions have been made. First, a menu system has been implemented so that the user can choose between different songs. In addition, some new visualisations have been added that the user can switch between at their discretion. An example can be seen in the video below.
The user can switch between the three types of background animation (horizontal bars, vertical bars, or none) with the arrow keys. The user can also switch between different color modes via their corresponding keys: ‘B’ for black bars and ‘E’ for enabling “Epilepsy mode”. The menu for switching zones can be shown or hidden with the Z key.
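As a rough sketch of how this kind of input handling might look in Unity, the controls above could be wired up in an `Update` loop. The enum values and field names below are illustrative, not the project's actual code.

```csharp
using UnityEngine;

// Illustrative sketch of the key bindings described above.
public class VisualizerControls : MonoBehaviour
{
    enum Background { HorizontalBars, VerticalBars, None }

    Background background = Background.HorizontalBars;
    bool blackBars;
    bool epilepsyMode;
    bool menuVisible;

    void Update()
    {
        // Arrow keys cycle through the three background animations.
        if (Input.GetKeyDown(KeyCode.RightArrow))
            background = (Background)(((int)background + 1) % 3);
        if (Input.GetKeyDown(KeyCode.LeftArrow))
            background = (Background)(((int)background + 2) % 3);

        // Letter keys toggle the color modes and the zone menu.
        if (Input.GetKeyDown(KeyCode.B)) blackBars = !blackBars;
        if (Input.GetKeyDown(KeyCode.E)) epilepsyMode = !epilepsyMode;
        if (Input.GetKeyDown(KeyCode.Z)) menuVisible = !menuVisible;
    }
}
```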
A link to the project can be found here.
Some additional visualisations have been implemented that are shown behind the existing polygon. These have the goal of further accentuating different aspects of the song. An example can be seen in the video below.
Each of the horizontal bars represents a frequency of the played audio. The color is a complement of the polygon’s current color, derived by an XOR operation.
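The post does not spell out the operands of the XOR; one plausible sketch is XOR-ing each 8-bit channel with 0xFF, which yields the inverse of the polygon’s color:

```csharp
using UnityEngine;

// Illustrative sketch: deriving a complementary bar color from the
// polygon's current color by XOR-ing each channel with 0xFF.
public static class BarColor
{
    public static Color32 Complement(Color32 polygonColor)
    {
        return new Color32(
            (byte)(polygonColor.r ^ 0xFF),
            (byte)(polygonColor.g ^ 0xFF),
            (byte)(polygonColor.b ^ 0xFF),
            polygonColor.a); // keep the alpha channel unchanged
    }
}
```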
We have now integrated the music with the NoiseBall project and added interactions between them. The result can be seen in the video below.
The radius of the edges depends on the lowest frequencies of the song, i.e. the bass. The spinning of the polygon and the radius expansion depend on the higher frequencies.
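A minimal sketch of such a mapping, assuming `spectrum[]` holds the current FFT magnitudes with the lowest frequencies first; the bin ranges and scaling factors below are illustrative, not the project’s actual values:

```csharp
using UnityEngine;

// Sketch: bass drives the radius, higher frequencies drive the spin.
public class PolygonDeformer : MonoBehaviour
{
    public float[] spectrum = new float[512]; // filled elsewhere each frame
    public float baseRadius = 1f;
    public float spinSpeed = 90f;

    void Update()
    {
        // Average the lowest bins: the bass controls the radius.
        float bass = 0f;
        for (int i = 0; i < 8; i++) bass += spectrum[i];
        bass /= 8f;

        // Average some higher bins: they control rotation and expansion.
        float treble = 0f;
        for (int i = 64; i < 128; i++) treble += spectrum[i];
        treble /= 64f;

        transform.localScale = Vector3.one * (baseRadius + bass * 10f);
        transform.Rotate(0f, 0f, spinSpeed * treble * Time.deltaTime);
    }
}
```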
We have managed to gather some initial spectrum data from the mp3s, making it possible to start trying out different ideas for visualisations. We constructed a simple bar visualisation to get a feel for how the data is distributed.
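A simple bar visualisation like the one described could be sketched with Unity’s built-in FFT via `AudioSource.GetSpectrumData`; the bar setup and scaling here are illustrative:

```csharp
using UnityEngine;

// Sketch: read spectrum data each frame and scale one bar per bin.
[RequireComponent(typeof(AudioSource))]
public class SpectrumBars : MonoBehaviour
{
    public Transform[] bars; // one cube per frequency bin, assigned in the editor
    readonly float[] spectrum = new float[256];

    void Update()
    {
        GetComponent<AudioSource>().GetSpectrumData(
            spectrum, 0, FFTWindow.BlackmanHarris);

        for (int i = 0; i < bars.Length && i < spectrum.Length; i++)
        {
            // Scale each bar's height by the magnitude of its bin.
            Vector3 s = bars[i].localScale;
            s.y = Mathf.Max(0.01f, spectrum[i] * 50f);
            bars[i].localScale = s;
        }
    }
}
```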
The next step is to try out different mappings between data and visualisations.
We have decided to split up work responsibilities in this manner:
Music data retrieval: Marcus Heine
Graphical components: Ludwig Sidenmark, Jonathan Golan
Initially Ludwig will focus on vertex movement and Jonathan will focus on the shaders.
Today Ludwig and Jonathan sat down and looked at some previously made projects, more specifically NoiseBall, in order to understand how they work and how they can help our project, focusing on its graphical aspects. We dug even deeper and found some additional sources that may be helpful.
Idea: The main project idea is to visualize music through a polygon that is deformed by the data retrieved from the music played.
Implementation: The project will take an mp3 file as input. Our current ideas for deformations are stretching the surface, rotating, changing the size of the polygon, and changing the color. The plan is to use Unity as the platform, with C#. For processing and retrieving music data we plan to use the NAudio library, which has extensive support for working with audio files in code. As for the polygon, we were inspired by the GitHub repository NoiseBall by Keijiro Takahashi. Depending on the time we have left, further effects may be added.
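To give an idea of the planned NAudio usage, reading float samples from an mp3 could look roughly like this; the file name and buffer size are placeholders:

```csharp
using NAudio.Wave;

// Sketch: decode an mp3 into 32-bit float PCM samples with NAudio.
class Mp3Samples
{
    static void Main()
    {
        using (var reader = new AudioFileReader("song.mp3"))
        {
            var buffer = new float[1024];
            int read;
            while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
            {
                // buffer[0..read) now holds decoded samples, ready to
                // be fed into an FFT for spectrum analysis.
            }
        }
    }
}
```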
Evaluation: If there is enough time, we also plan to do a small case study to see whether users feel that the visual effects match the audio.