Everybody knows that over-limiting during mastering causes plenty of audio-quality problems, and questions about how clipping affects a track come up all the time. But the place where audio damage becomes most evident is normalization. Only people who have mixed and mastered for a long time notice the subtle changes that normalization introduces.
When you normalize an audio track, the loss of quality may not seem obvious. Only when you look at different aspects of the track, such as dynamic range, nominal LUFS, and loss of detail above 0 dB, does it become noticeable that normalization in fact damages audio quality in more ways than one.
Normalization is not the most popular effect for increasing the overall volume of a track, especially in the modern era of mixing and mastering. Limiting and clipping are far more common, because they can push a song louder than normalization can. It's important to understand how to use each of these techniques to avoid damaging the track.
Yes, normalization does affect audio quality. The dynamic range and LUFS metrics will be altered, and the overall RMS value will suffer when you try to raise the volume of a track with normalization. Some of these factors might seem insignificant, but they all contribute to the overall quality of the track.
Just knowing that normalization affects audio quality is one thing; understanding how it affects your tracks will show you how to manipulate this effect and use it to your advantage. In this article I will walk you through the different ways normalization affects an audio track, so let's get started.
Dynamic range changes with normalization
Dynamics is a word that gets thrown around a lot, because of how valuable dynamics are for conveying emotion in an audio track. Dynamics refer to the quiet and loud parts within a song; these two sections make up the heart of a track. This is one of the reasons you should be very careful when working with dynamics: a small change can easily ruin the whole track.
The best way to approach dynamic range is to understand how it interacts with normalization. When you normalize a track with wide dynamic range, the first thing that happens is that it loses quality in the emotion it tries to convey. The other problem is that the whole track ends up hitting the 0 dB mark consistently. This is one of the biggest issues with normalization.
It takes away the luxury of the dynamic range of an audio track. You end up with a track that hits hard from beginning to end; simply put, it sounds flat. To avoid this problem, use normalization wisely and cautiously: apply it only where it makes sense and is absolutely necessary, and keep it out of places where it shouldn't be used.
A good example is an already mixed song with an RMS level of -2 dB. For a track like this, using normalization to push all the elements up is the only way to raise the volume; a limiter would cause clipping and end up ruining the entire track. Used creatively, normalization can yield excellent results in your mixing projects.
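To make the -2 dB example concrete, here is a minimal Python sketch of how RMS level and the gain needed to raise it are usually computed. The function names are my own illustrations, not from any particular DAW or library:

```python
import math

def rms_db(samples):
    """RMS level in dBFS for samples in the range -1.0..1.0."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def gain_to_target(current_db, target_db):
    """Linear gain factor that moves current_db up (or down) to target_db."""
    return 10 ** ((target_db - current_db) / 20)

# A mix sitting at -2 dB RMS has only 2 dB of headroom left:
# pushing it to 0 dB RMS needs a gain of 10**(2/20), roughly 1.26x.
gain = gain_to_target(-2.0, 0.0)
```

Because normalization applies this one gain factor to every sample, anything already near the ceiling gets pushed right up against 0 dB.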
LUFS change with normalization
The biggest effect normalization has is on a track's LUFS. Normalization makes the difference between the loud and quiet parts razor thin, which means an exported track will show an almost flat waveform. When limiting is then applied to bring the song to, say, -14 LUFS, it suffers a loss of detail, because there is practically zero dynamic range left in the track.
Fortunately, there is a remedy. You can mix the track to -6 dB and work from there in mastering. If you unfortunately only have the mixed-down version of the song for mastering, there is not much you can do other than make the track sound as good as it can, at the maximum loudness it can achieve with minimal limiting.
Detail lost in the limiting stage cannot be recovered. This is why normalization is not considered an essential effect for adding loudness to a track, and it is one of the reasons normalization has fallen out of favor in modern production software.
Detail above 0 dB
This is a problem encountered everywhere: no matter how careful you are in mixing, you will probably make this mistake at least once. It happens when you lose detail in the top end of the song because the volume was too high when the track was sent for mastering.
The loss of detail in a normalized track can be extreme, because it has a flatter profile than a song that hasn't been normalized. As mentioned in the previous section, raising the LUFS before sending a track for mastering results in more detail pushed above 0 dB.
There are multiple ways to counteract this. The first is to make sure your mixing reference line sits at -12 dB at the loudest point. This provides enough headroom for any song to be mixed and mastered properly in the studio.
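A quick way to enforce that reference line is to check the peak level before export. A minimal sketch, where the -12 dB ceiling and the function names are illustrative assumptions, not a standard API:

```python
import math

HEADROOM_CEILING_DB = -12.0  # hypothetical mixing reference: loudest peak at -12 dBFS

def peak_db(samples):
    """Peak level in dBFS for samples in the range -1.0..1.0."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak)

def exceeds_reference(samples, ceiling_db=HEADROOM_CEILING_DB):
    """True when the mix is louder than the reference line and needs trimming."""
    return peak_db(samples) > ceiling_db
```

Running this on a pre-master bounce tells you whether the mix still has the headroom the mastering stage needs.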
The ultimate goal of any mastering engineer is to get the song ready for commercial release. If the song is not at the right volume level and is losing detail, normalization shouldn't be involved in the project. The same is true of many other effects: if it's not useful, remove it.
This is why I always advise keeping effects to a minimum and letting the instruments express themselves.
Should I normalize my samples?
Yes, normalizing your samples gives you more control when mixing them alongside the loudest sounds. If your project has many low-volume samples, it's better to normalize them so they sit level with the loud samples in the mix. This ensures you don't have to overdo limiting at the final stage of the project to sharpen the sound.
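As a sketch of what levelling quiet samples looks like in practice, here is simple peak normalization to a shared target. The -1 dBFS target and the sample values are illustrative, not taken from any specific sampler:

```python
def normalize_peak(samples, target_db=-1.0):
    """Scale a sample so its peak sits at target_db dBFS."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return list(samples)  # silence: nothing to scale
    gain = 10 ** (target_db / 20) / peak
    return [s * gain for s in samples]

quiet_hat = [0.05, -0.04, 0.03]
loud_kick = [0.9, -0.8, 0.7]
# Bring both to the same -1 dBFS peak so the quiet sample doesn't
# disappear under the loud one before mixing even starts.
levelled = [normalize_peak(s) for s in (quiet_hat, loud_kick)]
```

Since each sample gets its own gain factor, their relative internal dynamics are untouched; only their peaks are matched.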
Should you normalize bouncing?
No, you shouldn't normalize a bounced track; it will do more harm than good. It reduces the quality of the track and disturbs the volume levels set for streaming platforms, which can lead to strange artifacts once the track is normalized again by the streaming service.
When should you normalize?
The only time you should normalize is when you don't have the option to limit the track but need to increase its overall volume without clipping. Outside that scenario, normalizing a track is unnecessary, as there are better effects available to achieve the same result.
Does YouTube normalize audio?
Yes, YouTube normalizes the loudness of uploaded audio, turning down tracks that are louder than its playback target (around -14 LUFS) so that everything plays back at a consistent level on all systems. Other platforms such as Spotify and Apple Music apply similar loudness normalization to the tracks uploaded to them.
What does normalize mean?
Normalization, in the sense used here, means reducing the gap between the low and high sounds in a track by raising the overall volume. This reduction in dynamic range makes the track sound louder and keeps it clear even when there are whispers in the song. The technique is used to make tracks sound loud even on old hardware systems.
Even though normalization has been around music production and mixing for a long time, there are far superior methods that perform the same function more effectively. This is one of the reasons its use is diminishing.
If you are learning to mix, you should still learn this effect, as it will help you shape sounds in different ways when the volume of a track isn't under your control.