Combining real-time audio analysis and shaders customization with Unity3D
What if our hearing and vision were connected? Imagine that you could see differently according to the sounds that reach your ears. Sounds crazy, right? Well, it actually exists, and it is called chromesthesia.

In this article, we will recreate the chromesthesia phenomenon with Unity3D, combining two fields that generally don’t come together: audio analysis and shaders.

Are you ready to “see” sounds? 🔈 👀

A hint of audio theory

Sounds that reach our ears are vibrations at a certain rate, called frequency, which is expressed in hertz (Hz). The hearing range for humans, also known as the acoustic range, is approximately 20 Hz to 20,000 Hz. Sounds below this range are infrasounds; sounds above it are ultrasounds.

After capturing a sound with a recording device, we notice that the raw audio signal is not clean, and it is hard to determine which frequencies compose it. Thanks to signal processing and the Fast Fourier Transform (FFT) algorithm, we can separate the main frequencies of a sound from the rest (ambient noise or interference).

Audio signal oscillations // After signal processing with Fast Fourier Transform

Building the app with Unity

Do you remember my previous article? We will re-use the main elements to build our new app. We need the following Unity objects: a Microphone, an AudioSource, a Mixer, a Shader, a Camera, a Plane, and a WebCamTexture.

The architecture of our app

The Microphone 🎤

First, we look for an available Microphone on the device where the Unity app is running:
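The original snippet isn’t embedded in this archive, so here is a minimal sketch of the idea (the class and variable names are mine):

```csharp
using UnityEngine;

public class ChromesthesiaApp : MonoBehaviour
{
    string micName;

    void Start()
    {
        // Microphone.devices lists every capture device Unity can see.
        if (Microphone.devices.Length == 0)
        {
            Debug.LogError("No microphone found on this device.");
            return;
        }

        // Take the first available microphone.
        micName = Microphone.devices[0];
        Debug.Log("Using microphone: " + micName);
    }
}
```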

The AudioSource and the Mixer 🔈

Now that we have the Microphone, we feed its audio stream into an AudioSource object and start playing it:
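Continuing in the same script, something along these lines (a sketch, not the article’s exact code):

```csharp
AudioSource audioSource;

void StartMicrophone()
{
    audioSource = GetComponent<AudioSource>();

    // Record continuously into a looping 10-second clip at 48 kHz.
    audioSource.clip = Microphone.Start(micName, true, 10, 48000);
    audioSource.loop = true;

    // Wait until the microphone actually starts delivering samples.
    while (Microphone.GetPosition(micName) <= 0) { }

    audioSource.Play();
}
```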

Now, the problem is: if the AudioSource plays out loud, it will create a feedback loop between the Microphone and the speakers, and the result will be an awful reverberation. So we attach an Audio Mixer whose attenuation is set to the minimum (-80 dB, i.e. silence) to the AudioSource:
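The mixer asset itself is configured in the editor (attenuation slider at -80 dB); in code, we only route the AudioSource through it. A sketch, assuming a mixer group assigned in the Inspector:

```csharp
using UnityEngine.Audio;

public AudioMixerGroup silentGroup; // attenuation set to -80 dB in the editor

void SilenceOutput()
{
    // GetSpectrumData still receives the signal upstream of the mixer,
    // but nothing reaches the speakers.
    audioSource.outputAudioMixerGroup = silentGroup;
}
```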

Audio Analysis with Unity3D 📈

According to the Unity documentation, the GetSpectrumData function applies the FFT algorithm to the sound currently played by an AudioSource and returns a spectrum array of samples. Each sample covers a range of frequencies and carries an amplitude value. The width of this range can be calculated in the following way:

hertzPerSample = sampleRate / 2 / spectrumSize

Note that:

  • The sample rate is given by the AudioSettings.outputSampleRate property (generally 48 kHz)
  • By the Nyquist theorem, a sampled signal only carries frequencies up to half its sample rate, which is why we divide the sample rate by 2
  • In this article, we will deal with musical pitches, so we use the maximum allowed spectrum size (8192) to get smaller samples and more precise frequencies.

➡ Calculating with the previous values gives us 48,000 / 2 / 8192 ≈ 2.93 Hz per sample.

Let’s initialize our variables:
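A sketch of that initialization, assuming everything lives in the same MonoBehaviour (the constant names are mine):

```csharp
const int SpectrumSize = 8192;          // maximum size accepted by GetSpectrumData
const float AmplitudeThreshold = 0.01f; // samples below this are treated as noise

float[] spectrum = new float[SpectrumSize];
float hertzPerSample;

void InitAnalysis()
{
    // Width of each spectrum sample in Hz, as derived above.
    hertzPerSample = AudioSettings.outputSampleRate / 2f / SpectrumSize;
}
```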

Note: I found that samples with an amplitude below 0.01 are irrelevant (noise)

Now we create a Peak class to store data from the spectrum:
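A minimal version of such a class could look like this:

```csharp
public class Peak
{
    public int index;       // position of the sample in the spectrum array
    public float amplitude; // amplitude of the sample
    public float frequency; // index * hertzPerSample, in Hz

    public Peak(int index, float amplitude, float frequency)
    {
        this.index = index;
        this.amplitude = amplitude;
        this.frequency = frequency;
    }
}
```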

We retrieve the data and store it in a Peak list:
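A sketch of the retrieval step (it needs `using System.Collections.Generic;`; the FFT window choice is my own):

```csharp
List<Peak> peaks = new List<Peak>();

void Update()
{
    // Apply the FFT to the sound currently played by the AudioSource.
    audioSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

    peaks.Clear();
    for (int i = 0; i < SpectrumSize; i++)
    {
        // Keep only the samples that stand out from the noise floor.
        if (spectrum[i] > AmplitudeThreshold)
        {
            peaks.Add(new Peak(i, spectrum[i], i * hertzPerSample));
        }
    }
}
```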

After playing an A4 pitch on my wife’s piano, I obtain 3 peaks with the following values:

Graphical representation:

It looks good! If we interpolate those values, we obtain 439.7 Hz, which is pretty close to 440 Hz, the frequency of the A4 pitch.
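The article’s exact interpolation isn’t shown here; one reasonable choice is an amplitude-weighted average of the peak frequencies:

```csharp
float InterpolateFrequency(List<Peak> peaks)
{
    float weightedSum = 0f, totalAmplitude = 0f;
    foreach (Peak p in peaks)
    {
        weightedSum += p.frequency * p.amplitude;
        totalAmplitude += p.amplitude;
    }
    // Peaks with higher amplitude pull the estimate toward themselves.
    return totalAmplitude > 0f ? weightedSum / totalAmplitude : 0f;
}
```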

The Shader 🖼

We will create a shader with a cool pixelated effect. To achieve it, we divide the texture into blocks of identical size; every pixel of a block then takes the color of the block’s center pixel.
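The original shader isn’t preserved in this archive; a minimal version of the effect could look like this (the `_BlockCount` property name is mine):

```shaderlab
Shader "Custom/Pixelate"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _BlockCount ("Block Count", Float) = 64
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float _BlockCount;

            fixed4 frag(v2f_img i) : SV_Target
            {
                // Snap each UV to the center of its block, so every pixel
                // of a block samples the same texel.
                float2 blockUV = (floor(i.uv * _BlockCount) + 0.5) / _BlockCount;
                return tex2D(_MainTex, blockUV);
            }
            ENDCG
        }
    }
}
```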

Then we update our shader with the number of blocks that the image or texture will contain, according to the interpolated frequency previously calculated:
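How exactly the frequency maps to a block count is a design choice; the mapping below is illustrative, not the article’s exact formula:

```csharp
public Material pixelateMaterial; // material using the pixelate shader above

void UpdatePixelation(float frequency)
{
    // Higher pitch -> more blocks -> finer pixelation (clamped to a sane range).
    float blockCount = Mathf.Clamp(frequency / 4f, 8f, 256f);
    pixelateMaterial.SetFloat("_BlockCount", blockCount);
}
```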

The Camera 🎥

We will use a generic Unity Camera. Before starting with the device camera implementation, let’s try the pixel shader we have just created with a simple picture (SpriteRenderer).

It works! Let’s compare our result with Adobe’s Mosaic filter:

Original image from Pixabay // With Mosaic filter // With our shader

Well, it’s a pretty good result! Ours is just a little bit rougher than the Mosaic filter.

The Plane

A default Plane in Unity is usually used as a floor component, so we play with the rotation parameters to place the Plane in front of the Camera.

Note: if you are going to work with a webcam or a frontal device camera, you will need to invert the x scale.
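In code, that could look like this (the rotation values are examples; adjust them to your scene):

```csharp
// A default Plane faces up, so rotate it to face the Camera.
plane.transform.rotation = Quaternion.Euler(90f, 0f, 0f);

// For a webcam or front-facing camera, mirror the image horizontally.
Vector3 s = plane.transform.localScale;
plane.transform.localScale = new Vector3(-s.x, s.y, s.z);
```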

The WebCamTexture

The WebCamTexture is a special texture that captures live video from a device camera (or from a webcam if you are working with a laptop) and renders it in Unity space. We create a new WebCamTexture and apply it to the Plane:
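A minimal sketch of that step:

```csharp
WebCamTexture webcamTexture = new WebCamTexture();

// Use the live camera feed as the Plane's main texture.
plane.GetComponent<Renderer>().material.mainTexture = webcamTexture;
webcamTexture.Play();
```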

Final result

After building the application on an old LG phone, here is the final result on video:

Closing thoughts

This article showed you how to make a very simple Unity app involving 3D objects, shaders, a device camera, and a microphone. We also had the opportunity to learn about frequencies and audio analysis in general.

All the code in this article has been tested using Unity 2020.3.17 and Visual Studio Community 2019. The mobile device I used to run the Unity app is an LG Magna with Android Lollipop. My wife’s piano is a Yamaha Clavinova CLP-645.

You can download the full Chromesthesia Unity Package specially designed for this article.

A special thanks to Gianca Chavest for helping me edit the video and designing the awesome illustration.

Building a dog’s vision application with Unity3D
This article was initially published on my Medium Page.

Did you know that our four-legged friends see the world in a different way than us? In this article, we will recreate with Unity3D what would be a dog’s vision and simulate it with a mobile device.

Are you ready? 🐶

How do dogs see?

Studies suggest that dogs are dichromats; their vision is very similar to human deuteranopia (red-green color blindness). That means they can only perceive two pure spectral colors, blue and yellow, and have difficulty distinguishing between red and green.

How I see my daughter’s toys // How my dogs see them

Dogs also have poor visual acuity (they don’t perceive faraway objects with clarity) but excellent motion detection. That can explain some strange behavior, like when your dog does not recognize you until you are near, yet can detect a cat walking two blocks away (and go crazy about it).

In this article, we will focus on the color singularity of dogs’ vision.

Building the app with Unity

Achieving this is pretty simple: we only need 4 main elements: a Camera, a Shader, a Plane, and a WebCamTexture. The Shader will act as a color filter for the Camera, and the Plane will be a support for the WebCamTexture.

The visual architecture of our application

The Shader

First, we write a shader with an RGB color transformation:
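The shader itself isn’t preserved in this archive; a minimal version could look like this, with the three matrix rows exposed as vectors set from C# (the property names are mine):

```shaderlab
Shader "Custom/ColorTransform"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            // Rows of the 3x3 RGB transformation matrix, set from C#.
            float4 _RedRow, _GreenRow, _BlueRow;

            fixed4 frag(v2f_img i) : SV_Target
            {
                fixed4 c = tex2D(_MainTex, i.uv);
                // New color = matrix * old color, one dot product per row.
                fixed3 rgb = fixed3(
                    dot(c.rgb, _RedRow.rgb),
                    dot(c.rgb, _GreenRow.rgb),
                    dot(c.rgb, _BlueRow.rgb));
                return fixed4(rgb, c.a);
            }
            ENDCG
        }
    }
}
```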

According to Martin Krzywinski’s incredible work, this is the RGB transformation matrix of a deuteranopia vision:
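The matrix itself didn’t survive in this archive. For reference, a commonly cited deuteranopia simulation matrix (not necessarily the exact values used in the original post) is:

```
R'   0.625  0.375  0.000     R
G' = 0.700  0.300  0.000  x  G
B'   0.000  0.300  0.700     B
```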

OK, so we write a C# script to load the shader we created. We pass the previous matrix values through the OnRenderImage function to render the deuteranopia visual effect:
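A sketch of that script, using the commonly cited matrix values above:

```csharp
using UnityEngine;

public class DeuteranopiaFilter : MonoBehaviour
{
    public Shader colorTransformShader;
    Material material;

    void Start()
    {
        material = new Material(colorTransformShader);

        // Rows of the deuteranopia matrix (see above; values are assumptions).
        material.SetVector("_RedRow",   new Vector4(0.625f, 0.375f, 0f, 0f));
        material.SetVector("_GreenRow", new Vector4(0.700f, 0.300f, 0f, 0f));
        material.SetVector("_BlueRow",  new Vector4(0f, 0.300f, 0.700f, 0f));
    }

    // Called after the Camera renders a frame; lets us post-process the image.
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        Graphics.Blit(source, destination, material);
    }
}
```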

The Camera

We will use a generic Unity Camera and attach the script we have just created to it. Before starting with the device camera implementation, let’s try our deuteranopia shader with a simple picture (SpriteRenderer).

How my dogs see me.

It works!

Adobe developed a powerful color tool that allows simulating color blindness. Let’s compare our result with Adobe’s tool:

Normal vision // Deuteranopia vision with Adobe Proof Colors // The result we obtained with Unity

Well, this is a pretty good result! We got a deuteranopia vision very close to Adobe’s tool; ours is maybe a little bit greener.

The Plane

A default Plane in Unity is usually used as a floor component, so we play with the rotation parameters in order to place the Plane in front of the Camera.

Note: if you are going to work with a webcam or a frontal device camera, you will need to invert the x scale.

The WebCamTexture

The WebCamTexture is a special texture that allows capturing a live video from a device camera (or from a webcam if you are working with a laptop) and rendering it in Unity space.

We create a script in order to access the user’s device camera. We first have to request authorization from the user with the RequestUserAuthorization function. Then we create a new WebCamTexture and apply it to the Plane.
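A minimal sketch of such a script:

```csharp
using System.Collections;
using UnityEngine;

public class CameraFeed : MonoBehaviour
{
    IEnumerator Start()
    {
        // Ask the user for permission to use the camera.
        yield return Application.RequestUserAuthorization(UserAuthorization.WebCam);

        if (Application.HasUserAuthorization(UserAuthorization.WebCam))
        {
            // Render the live camera feed on this object (the Plane).
            WebCamTexture webcamTexture = new WebCamTexture();
            GetComponent<Renderer>().material.mainTexture = webcamTexture;
            webcamTexture.Play();
        }
    }
}
```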

Then we attach the script to the Plane:

Final Result

After building the application on an old LG phone, here is the final result on video:

Closing thoughts

This article showed you how to make a very simple Unity app involving 3D objects, shaders, user requests, and device cameras. We also had the opportunity to learn about dogs’ vision and deuteranopia in general, and figured out that buying a red or orange ball for our dogs is a very bad idea 🔴 🐶

If you want to go further with Unity3D and color blindness, you can check the wonderful work of Alan Zucconi.

All the code in this article has been tested using Unity 2020.3.17 and Visual Studio Community 2019. The mobile device I used to run the Unity app is an LG Magna with Android Lollipop.

You can download the full DogVision Unity Package specially designed for this article.

A special thanks to Gianca Chavest for designing the awesome illustration.
