Combining real-time Texture analysis and audio customization with Unity3D

In this article, we will combine two fields that generally don’t come together: Texture analysis and audio customization.

This article was initially published on my Medium Page.

In my previous article, I built a sound-to-color synesthesia application with Unity3D. In this short article, we will do the exact opposite and build a color-to-sound synesthesia application.

Are you ready to “hear” images? 🖼👂

Building the app with Unity

We will re-use the main elements of my previous article to build our new app. We need the following Unity objects: 3 AudioSources, a Camera, a Plane, and a WebCamTexture.

The architecture of our app

Texture Analysis with Unity3D 📈

We are going to determine the dominant color channel (red, green, or blue) of the video stream coming from our device. To achieve this, we will iterate over every pixel of the WebCamTexture and calculate the average of each color channel. So we have:
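Here is a minimal sketch of that analysis; the class and field names are my own, not the author's original code, and the WebCamTexture setup is assumed to happen in the same script:

```csharp
using UnityEngine;

// Computes the average value of each color channel of the live camera feed.
public class ColorAnalyzer : MonoBehaviour
{
    WebCamTexture webcamTexture;

    // Channel averages, between 0 and 1.
    public float redAverage, greenAverage, blueAverage;

    void Start()
    {
        webcamTexture = new WebCamTexture();
        webcamTexture.Play();
    }

    void Update()
    {
        Color[] pixels = webcamTexture.GetPixels();

        float r = 0f, g = 0f, b = 0f;
        foreach (Color pixel in pixels)
        {
            r += pixel.r;
            g += pixel.g;
            b += pixel.b;
        }

        redAverage = r / pixels.Length;
        greenAverage = g / pixels.Length;
        blueAverage = b / pixels.Length;
    }
}
```

Reading every pixel on every frame is expensive on a phone; sampling only every n-th pixel is a possible optimization.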

The AudioSources 🔈

Now we need a loopable song with 3 distinct tracks; each track will represent a color channel of the WebCamTexture. I first looked on Freesound, but unfortunately, I was not able to find anything relevant.

Then, I discovered Soundtrap and made a great hip-hop beat with it (please don’t laugh, I felt like Dr. Dre for a couple of minutes). I used 3 tracks from Soundtrap’s extensive sound library: drums, piano, and flute.

Overview of Soundtrap studio

We will now create 3 AudioSources, one per track. Our hip-hop song is loopable and will play automatically when the app starts, so we check the “Loop” and “Play On Awake” options for each track.

AudioSource settings

Note: Unity tends to add a few milliseconds of silence at the beginning of .mp3 files, which makes them “unloopable”. To avoid this issue, we will use .ogg files.

Here is an example of the same song in two different formats, .mp3 and .ogg. The silence added at the beginning of the .mp3 file can clearly be observed:

.mp3 file // .ogg file

Great! We can now update each track with the value of its respective color channel:
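A possible way to drive the three tracks, assuming the channel averages computed above and AudioSources assigned in the Inspector (the mapping of channels to instruments is an assumption):

```csharp
using UnityEngine;

// Drives the volume of each track with the average of its color channel.
public class TrackController : MonoBehaviour
{
    public AudioSource drumsSource;   // red channel
    public AudioSource pianoSource;   // green channel
    public AudioSource fluteSource;   // blue channel

    public ColorAnalyzer analyzer;    // the script computing the channel averages

    void Update()
    {
        // AudioSource.volume expects a value between 0 and 1, like our averages.
        drumsSource.volume = analyzer.redAverage;
        pianoSource.volume = analyzer.greenAverage;
        fluteSource.volume = analyzer.blueAverage;
    }
}
```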

The User Interface 👤

First, we create a Canvas component. Don’t forget to select “Scale With Screen Size” in the Canvas Scaler; this keeps the UI elements at a consistent size across screen resolutions.

Canvas settings

Then, we create 3 fancy bars as Images with a semi-transparent background. The bar filler is another Image with the type “Filled”.

Image settings

It looks good! We will now update each bar with the value of its respective color channel using the Fill Amount parameter:
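A short sketch of the bar update, assuming the filler Images (type “Filled”) are assigned in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Updates each bar's fill with the average of its color channel.
public class ChannelBars : MonoBehaviour
{
    public Image redBar, greenBar, blueBar;   // Image type set to "Filled"
    public ColorAnalyzer analyzer;            // the script computing the channel averages

    void Update()
    {
        // fillAmount expects a value between 0 and 1.
        redBar.fillAmount = analyzer.redAverage;
        greenBar.fillAmount = analyzer.greenAverage;
        blueBar.fillAmount = analyzer.blueAverage;
    }
}
```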

Final result

After building the application on an old LG phone, here is the final result on video:

Off-topic: about my oldie phone

Maybe you are wondering: “Why is this guy working with an old LG phone? There are plenty of much better Android phones!” Oh yes, there are. This is the worst phone I have ever had: it is slow (even after a reset, Android does not run correctly), it has poor screen resolution and audio quality, the user experience is awful (for some unknown reason the volume buttons are on the back, which makes the phone very uncomfortable to use), and the finish is, well, very basic (rough plastic).

LG Magna
LG Magna phone.

The answer is simple: if the Unity applications I create for my articles run decently on the LG Magna, they will run perfectly on any modern Android phone.

Closing thoughts

This article showed you how to make a very simple Unity app involving 3D objects and a device camera. We also had the opportunity to learn about the R, G, B color channels and Texture analysis in general.

All the code in this article has been tested using Unity 2020.3.17 and Visual Studio Community 2019. The mobile device I used to run the Unity app is an LG Magna with Android Lollipop.

You can download the full Chromesthesia2 Unity Package specially designed for this article.

A special thanks to Gianca Chavest for designing the awesome illustration.

Combining real-time audio analysis and shaders customization with Unity3D

What if our hearing and vision were connected? Imagine that you could see differently according to the sounds that reach your ears. Sounds crazy, right? Well, it actually exists, and it is called chromesthesia.

In this article, we will recreate the chromesthesia phenomenon with Unity3D combining two fields that generally don’t come together: audio analysis and shaders.

Are you ready to “see” sounds? 🔈 👀

A hint of audio theory

Sounds that reach our ears vibrate at a certain rate, called the frequency, which is expressed in hertz (Hz). The hearing range for humans, also known as the acoustic range, is approximately 20 Hz to 20,000 Hz. Sounds outside this range are infrasounds (below) and ultrasounds (above).

After capturing a sound with a recording device, we notice that the corresponding audio signal is not clean, and it is hard to determine which frequencies compose it. Thanks to signal processing and the Fast Fourier Transform (FFT) algorithm, it is possible to separate the main frequencies of a sound from the rest (ambient noise or interference).

Audio signal oscillations // After signal processing with Fast Fourier Transform

Building the app with Unity

Do you remember my previous article? We will re-use the main elements to build our new app. We need the following Unity objects: a Microphone, an AudioSource, a Mixer, a Shader, a Camera, a Plane, and a WebCamTexture.

The architecture of our app

The Microphone 🎤

First, we look for an available Microphone on the device where the Unity app is running:
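One way this lookup could be written, assuming it lives in the Start method of a MonoBehaviour:

```csharp
// Pick the first microphone the device exposes, if any.
string micName = Microphone.devices.Length > 0 ? Microphone.devices[0] : null;
if (micName == null)
{
    Debug.LogWarning("No microphone found on this device.");
}
```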

The AudioSource and the Mixer 🔈

Once we have the Microphone, we apply its audio stream to an AudioSource object and finally turn it on:
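A hedged sketch of that step; the one-second looping clip length is an assumption, and `micName` comes from the previous snippet:

```csharp
// Record continuously from the microphone into a looping 1-second AudioClip,
// then play it through the AudioSource.
AudioSource audioSource = GetComponent<AudioSource>();
audioSource.clip = Microphone.Start(micName, true, 1, AudioSettings.outputSampleRate);
audioSource.loop = true;

// Wait until the microphone has actually started recording before playing.
while (Microphone.GetPosition(micName) <= 0) { }
audioSource.Play();
```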

Now, the problem is that if the AudioSource plays through the speakers, it will create a feedback loop between the Microphone and the speakers, and the result will be an awful reverberation. So we route the AudioSource through an Audio Mixer whose volume is set to -80 dB (effectively muted):
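A possible way to wire this up, assuming an Audio Mixer group whose volume slider is set to -80 dB and which is assigned in the Inspector (this fragment needs `using UnityEngine.Audio;` at the top of the file):

```csharp
// "muteGroup" is assumed to be an AudioMixerGroup whose volume is set to -80 dB
// in the Audio Mixer asset. Routing the AudioSource through it silences the
// speakers while GetSpectrumData still receives the signal.
[SerializeField] AudioMixerGroup muteGroup;

void SilenceOutput()
{
    GetComponent<AudioSource>().outputAudioMixerGroup = muteGroup;
}
```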

Audio Analysis with Unity3D 📈

According to Unity documentation, the GetSpectrumData function applies the FFT algorithm to the current sound played by an AudioSource and returns a spectrum array of samples. Each sample represents a range of frequencies and its related amplitude value. This range can be calculated in the following way:

hertzPerSample = sampleRate / 2 / spectrumSize

Note that:

  • The sample rate is given by AudioSettings.outputSampleRate (generally 48 kHz)
  • The FFT only covers frequencies up to half the sample rate (the Nyquist frequency), which is why we divide the sample rate by 2
  • In this article, we will deal with musical pitches, so we use the maximum allowed spectrum size (8192) to get smaller samples and more precise frequencies.

➡ Calculating with the previous values gives us 48,000 / 2 / 8192 ≈ 2.93 Hz per sample.

Let’s initialize our variables:
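A possible set of fields, following the values discussed above (the names are mine, not the author's):

```csharp
// Fields of the analysis script.
const int spectrumSize = 8192;            // maximum size accepted by GetSpectrumData
const float amplitudeThreshold = 0.01f;   // samples below this are treated as noise

float[] spectrum;
float hertzPerSample;

void Start()
{
    spectrum = new float[spectrumSize];
    int sampleRate = AudioSettings.outputSampleRate;   // generally 48000 Hz
    hertzPerSample = sampleRate / 2f / spectrumSize;   // ≈ 2.93 Hz per sample
}
```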

Note: I found that samples with an amplitude below 0.01 are irrelevant (noise).

Now we create a Peak class to store data from the spectrum:
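A minimal version of such a class (the exact fields in the original may differ):

```csharp
// Stores one spectrum sample that stands out from the noise.
public class Peak
{
    public float frequency;   // center frequency of the sample, in Hz
    public float amplitude;   // amplitude returned by GetSpectrumData

    public Peak(float frequency, float amplitude)
    {
        this.frequency = frequency;
        this.amplitude = amplitude;
    }
}
```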

We retrieve the data and store it in a Peak list:
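A sketch of that retrieval, assuming the fields defined above, `using System.Collections.Generic;` at the top of the file, and a Blackman-Harris FFT window (the window choice is an assumption):

```csharp
// Reads the spectrum of the AudioSource and keeps the samples above the noise threshold.
List<Peak> GetPeaks()
{
    GetComponent<AudioSource>().GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

    var peaks = new List<Peak>();
    for (int i = 0; i < spectrum.Length; i++)
    {
        if (spectrum[i] >= amplitudeThreshold)
        {
            // Sample i covers a band centered around i * hertzPerSample.
            peaks.Add(new Peak(i * hertzPerSample, spectrum[i]));
        }
    }
    return peaks;
}
```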

After playing an A4 pitch on my wife’s piano, I obtain 3 peaks with the following values:

Graphical representation:

It looks good! If we interpolate those values, we obtain 439.7 Hz, which is pretty close to 440 Hz, the frequency of the A4 pitch.
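One way such an interpolation could look is an amplitude-weighted average of the peak frequencies; the exact method used in the original article may differ:

```csharp
// Estimates the played frequency as an amplitude-weighted average of the peaks.
float InterpolateFrequency(List<Peak> peaks)
{
    float weightedSum = 0f, amplitudeSum = 0f;
    foreach (Peak peak in peaks)
    {
        weightedSum += peak.frequency * peak.amplitude;
        amplitudeSum += peak.amplitude;
    }
    return amplitudeSum > 0f ? weightedSum / amplitudeSum : 0f;
}
```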

The Shader 🖼

We will create a shader with a cool pixelated effect. To achieve it, we divide the texture into blocks of identical size; each pixel of a block takes the value of the block’s center pixel.

Then we update our shader with the number of blocks that the image or texture will contain according to the interpolated frequency previously calculated:
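On the C# side, this update could look like the following sketch; the shader property name `_Blocks` and the frequency-to-blocks mapping are placeholders, not the article’s actual values:

```csharp
public Renderer planeRenderer;   // the Renderer showing the shader

// Maps the interpolated frequency to a number of blocks and pushes it to the shader.
void UpdateShader(float frequency)
{
    // Assumed mapping: higher frequencies produce more (and therefore smaller) blocks.
    float blocks = Mathf.Clamp(frequency / 10f, 8f, 256f);
    planeRenderer.material.SetFloat("_Blocks", blocks);
}
```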

The Camera 🎥

We will use a generic Unity Camera. Before starting with the device camera implementation, let’s try the pixel shader we have just created with a simple picture (SpriteRenderer).

It works! Let’s compare our result with Adobe’s Mosaic filter:

Original image from Pixabay // With Mosaic filter // With our shader

Well, it’s a pretty good result! Ours is just a little bit rougher than the Mosaic filter.

The Plane

A default Plane in Unity is usually used as a floor component, so we play with the rotation parameters to place the Plane in front of the Camera.

Note: if you are going to work with a webcam or a frontal device camera, you will need to invert the x scale.

The WebCamTexture

The WebCamTexture is a special texture that captures live video from a device camera (or from a webcam if you are working on a laptop) and renders it in Unity space. We create a new WebCamTexture and apply it to the Plane.
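A minimal sketch of that setup, assuming the Plane’s Renderer is assigned in the Inspector:

```csharp
using UnityEngine;

// Starts the device camera and shows the live feed on the Plane.
public class CameraFeed : MonoBehaviour
{
    public Renderer planeRenderer;
    WebCamTexture webcamTexture;

    void Start()
    {
        webcamTexture = new WebCamTexture();
        planeRenderer.material.mainTexture = webcamTexture;
        webcamTexture.Play();
    }
}
```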

Final result

After building the application on an old LG phone, here is the final result on video:

Closing thoughts

This article showed you how to make a very simple Unity app involving 3D objects, shaders, a device camera, and a microphone. We also had the opportunity to learn about frequencies and audio analysis in general.

All the code in this article has been tested using Unity 2020.3.17 and Visual Studio Community 2019. The mobile device I used to run the Unity app is an LG Magna with Android Lollipop. My wife’s piano is a Yamaha Clavinova CLP-645.

You can download the full Chromesthesia Unity Package specially designed for this article.

A special thanks to Gianca Chavest for helping me edit the video and designing the awesome illustration.
