On 4/11/25 9:13 PM, shawn wrote:
> On Fri, 11 Apr 2025 22:14:10 -0400, Rhino
> <no_offline_contact@example.com> wrote:
>
>> On 2025-04-11 3:50 PM, BTR1701 wrote:
>>> On Apr 11, 2025 at 3:38:33 AM PDT, "Rhino"
>>> <no_offline_contact@example.com> wrote:
>>>
>>>> On 2025-04-11 4:13 AM, Adam H. Kerman wrote:
>>>>>
>>>>> https://www.youtube.com/watch?v=BDJF4lR3_eg
>>>>
>>>> Just as pitch-correction alters the pitch of notes to make them
>>>> "perfect", quantization alters the placement of notes to make them
>>>> rhythmically "perfect".
>>>
>>> Rhythmically perfect is not desirable. It makes the music sound computerized,
>>> like Commander Data is playing it.
>>>
>> Exactly. Beato illustrates that in the cited videos.
>>
>>> Humans are not perfect so music that is perfect sounds inhuman. That's why my
>>> music software has a feature called "Human Playback", which purposely
>>> introduces minor inaccuracies in both rhythm and pitch, making it sound a lot
>>> less digital and cold.
That is the mentality of post-production: first ruin a
recording by processing, then "fix" that broken recording
with more processing, in an attempt to conceal the bad
work already done.
>> Good on the developers for adding that ability. I wonder if the bits
>> which are "de-quantized" are chosen at random or if some other method is
>> employed?
>
> They appear to be chosen at random but it's not enough to duplicate
> humans. It just makes the sound less mechanical until you look at it
> with sound analysis as you can see in some of the other analysis
> videos that the Wings of Pegasus singer did. He showed how the vocals
> are still being forced to fit a perfect note with just a little
> variation added in which isn't what a human vocal sounds like.
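shawn's point above can be sketched in a few lines: snap a sung frequency
to the nearest equal-tempered semitone, then add a little random jitter.
The jitter makes the result less obviously robotic, but the pitch still
hovers around the grid instead of drifting the way a real voice does.
(A toy illustration only, not the actual algorithm of any autotune
product; the 5-cent jitter figure is my own assumption.)

```python
import math
import random

A4 = 440.0  # reference pitch in Hz

def snap_to_semitone(freq_hz):
    """Snap a frequency to the nearest equal-tempered semitone."""
    n = round(12 * math.log2(freq_hz / A4))
    return A4 * 2 ** (n / 12)

def humanize(freq_hz, cents_jitter=5):
    """Add small random jitter (in cents) around the snapped pitch --
    the kind of 'variation' described above: still centred on the grid."""
    snapped = snap_to_semitone(freq_hz)
    cents = random.uniform(-cents_jitter, cents_jitter)
    return snapped * 2 ** (cents / 1200)
```

A sung 450 Hz, for example, snaps to exactly 440 Hz, and the jittered
version never strays more than a few cents from it.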
It seems that autotune / pitch correction turns the voice
into a keyboard, which is not what singing sounds like.
Greta Van Fleet, for instance, on their first two albums
(which is all I've heard), has some songs with awful
keyboard-like vocals, caused entirely by post-production,
not by the singing as performed, which I can only guess
sounded better.
Another song that is impossible to sing with pitch
correction is the Beatles song Yesterday: the first
vocal note does not comply with the sheet music but is
the "same note" as the succeeding note, only at a
slightly higher pitch, so pitch correction would flatten
it, wrongly repeating a note at the same pitch.
The sheet music, and covers based on it, falsely sing
the first note a semitone higher, as in the movie
Yesterday (2019).
Related routine post-production, like dynamic
compression, muffles the loud notes, so the song no
longer has loud notes, drums, or vocals.
Then they proudly say that the result is loud!
In noise reduction, it is the quiet sounds that are
hurt the most, especially with stuff like fade-outs.
So to sum up:
Compression muffles the loud notes,
Noise Reduction muffles the quiet notes,
Beat Detector muffles the rhythm,
Pitch Correction/Autotune muffles the singer.
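To make the "compression muffles the loud notes" line concrete, here is
a minimal hard-knee compressor sketch (the threshold and ratio values
are arbitrary assumptions): anything above the threshold is scaled down,
so the loudest peaks are exactly what gets flattened.

```python
def compress(sample, threshold=0.5, ratio=4.0):
    """Hard-knee dynamic range compression of one sample in [-1, 1]:
    the excursion above the threshold is divided by `ratio`."""
    mag = abs(sample)
    if mag <= threshold:
        return sample            # quiet material passes untouched
    out = threshold + (mag - threshold) / ratio
    return out if sample >= 0 else -out
```

A full-scale peak (1.0) comes out at 0.625: the loud notes are muffled,
and then makeup gain is applied so the whole thing can be called loud.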
>>>> For instance, if the drummer's high hat hits are
>>>> a little bit off the beat, quantization can shift them to be exactly on
>>>> the beat. This too makes the playing sound mechanical. I've heard
>>>> several musicians bemoan the (over)use of quantization just as "Fil"
>>>> bemoans the use of pitch-correction in this video.
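The hi-hat example above amounts to snapping onset times to a grid;
a minimal sketch (the tempo and grid resolution here are arbitrary
examples, not values from any of the videos):

```python
def quantize(onsets, bpm=120, subdivision=2):
    """Snap note onsets (in seconds) to the nearest grid point.
    Grid spacing = one beat (60/bpm) divided by `subdivision`."""
    grid = 60.0 / bpm / subdivision
    return [round(t / grid) * grid for t in onsets]
```

A slightly late eighth note at 0.26 s gets pulled back to exactly
0.25 s, dead on the grid, which is precisely what makes the playing
sound mechanical.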
>>>>
>>>> This video is a brief explanation of quantization without much of the
>>>> philosophizing about whether it is good or bad. I'm sure there are other
>>>> videos that examine the issue more thoroughly.
>>>>
>>>>
>>>> https://www.youtube.com/watch?v=68LtY2aATl0 [7 minutes]
>>>>
>>>> Note: He makes several uses of the term "DAW" (spelled out as individual
>>>> letters) without explaining it. DAW stands for Digital Audio
>>>> Workstation, essentially the software you use to do the recording,
>>>> pitch-correction, quantization etc.
>>>>
>>>> For what it's worth, I've often heard analysts note that bands like The
>>>> Beatles, the Doors, and Led Zeppelin definitely speed up and slow down
>>>> perceptibly in some of their well-known recordings, even though they had
>>>> fine drummers. If quantization had existed and been used when those
>>>> recordings were made, we might well have found those songs somehow less
>>>> impressive....
>>>>
>>>> This short by Rick Beato gives an example of a quantized Led Zeppelin
>>>> groove versus the original:
>>>>
>>>>
>>>> https://www.youtube.com/shorts/a7dTRgc0Mn4
>>>>
>>>> The short is an excerpt from this longer video:
>>>>
>>>>
>>>> https://www.youtube.com/watch?v=hT4fFolyZYU [10 minutes]
>>>>
>>>> When Beato says that the original tempo is 170 BPM, he means 170 beats
>>>> per minute.
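For the arithmetic: at 170 beats per minute, each beat lasts 60/170 of a
second, i.e. about 0.353 s:

```python
bpm = 170                     # the tempo Beato cites for the original groove
beat_seconds = 60.0 / bpm     # seconds per beat
print(f"{beat_seconds:.3f} s per beat")  # prints "0.353 s per beat"
```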