https://www.youtube.com/watch?v=BDJF4lR3_eg
On Fri, 11 Apr 2025 08:13:40 -0000 (UTC), "Adam H. Kerman"
<ahk@chinet.com> wrote:
https://www.youtube.com/watch?v=BDJF4lR3_eg
He's done a number of those analyses showing just how much modern music
gets modified. Either pitch correction or compression of
the audio happens all the time.
On 2025-04-11 4:13 AM, Adam H. Kerman wrote:
https://www.youtube.com/watch?v=BDJF4lR3_eg
Just as pitch-correction alters the pitch of notes to make them
"perfect", quantization alters the placement of notes to make them rhythmically "perfect".
For instance, if the drummer's hi-hat hits are
a little bit off the beat, quantization can shift them to be exactly on
the beat. This too makes the playing sound mechanical. I've heard
several musicians bemoan the (over)use of quantization just as "Fil"
bemoans the use of pitch-correction in this video.
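The snapping that quantization performs can be sketched in a few lines of Python. This is just a toy illustration of the idea; the grid size and note times below are made up for the example, not taken from any actual DAW:

```python
# Toy sketch of rhythmic quantization: each note onset (in beats)
# is snapped to the nearest grid line. grid=0.25 means a
# sixteenth-note grid in 4/4 time.
def quantize(onsets, grid=0.25):
    return [round(t / grid) * grid for t in onsets]

# A slightly sloppy hi-hat pattern: every hit lands a little early or late.
played = [0.02, 0.27, 0.49, 0.76, 1.01]
print(quantize(played))  # -> [0.0, 0.25, 0.5, 0.75, 1.0]
```

After quantization every hit sits exactly on the grid, which is precisely what makes the result sound mechanical.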
This video is a brief explanation of quantization without much of the philosophizing about whether it is good or bad. I'm sure there are other videos that examine the issue more thoroughly.
https://www.youtube.com/watch?v=68LtY2aATl0 [7 minutes]
Note: He uses the term "DAW" (spelled out as individual letters) several times without explaining it. DAW stands for Digital Audio
Workstation, essentially the software you use to do the recording, pitch-correction, quantization, etc.
For what it's worth, I've often heard analysts note that bands like The Beatles, the Doors, and Led Zeppelin definitely speed up and slow down perceptibly in some of their well-known recordings, even though they had
fine drummers. If quantization had existed and been used when those recordings were made, we might well have found those songs somehow less impressive....
This short by Rick Beato gives an example of a quantized Led Zeppelin
groove versus the original:
https://www.youtube.com/shorts/a7dTRgc0Mn4
The short is an excerpt from this longer video:
https://www.youtube.com/watch?v=hT4fFolyZYU [10 minutes]
When Beato says that the original tempo is 170 BPM, he means 170 beats
per minute.
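That conversion is just 60 seconds divided by the tempo, so at 170 BPM each beat lasts about 0.353 seconds. A quick check in Python:

```python
# One beat at a given tempo lasts 60 seconds divided by the BPM.
def seconds_per_beat(bpm):
    return 60.0 / bpm

print(round(seconds_per_beat(170), 3))  # -> 0.353
```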
On Apr 11, 2025 at 3:38:33 AM PDT, "Rhino" <no_offline_contact@example.com> wrote:
On 2025-04-11 4:13 AM, Adam H. Kerman wrote:
https://www.youtube.com/watch?v=BDJF4lR3_eg
Just as pitch-correction alters the pitch of notes to make them
"perfect", quantization alters the placement of notes to make them
rhythmically "perfect".
Rhythmically perfect is not desirable. It makes the music sound computerized, like Commander Data is playing it.
Humans are not perfect, so music that is perfect sounds inhuman. That's why my music software has a feature called "Human Playback", which purposely introduces minor inaccuracies in both rhythm and pitch, making it sound a lot less digital and cold.
On 2025-04-11 3:50 PM, BTR1701 wrote:
Rhythmically perfect is not desirable. It makes the music sound computerized, like Commander Data is playing it.
Exactly. Beato illustrates that in the cited videos.
Humans are not perfect, so music that is perfect sounds inhuman. That's why my music software has a feature called "Human Playback", which purposely introduces minor inaccuracies in both rhythm and pitch, making it sound a lot less digital and cold.
Good on the developers for adding that ability. I wonder if the bits which are "de-quantized" are chosen at random or if some other method is employed?
On Apr 11, 2025 at 1:13:40 AM PDT, "Adam H. Kerman" <ahk@chinet.com> wrote:
https://www.youtube.com/watch?v=BDJF4lR3_eg
On Sunday, Chris Siddall demo'd a new AI system called Cantai that can read lyrics in a score and sing them during playback. The realism was stunning. Up till now, you'd enter the lyrics in the score and just have to accept that while the software can mimic a human voice with reasonable accuracy, it couldn't actually read and sing the lyrics, so all you'd get would be an "ah" or "oh" on each note rather than the actual word. Now it can sing the actual words and do it with not only an amazing degree of accuracy but with actual musicality. Doesn't sound at all like a computer. Sounds like an actual person singing.
https://www.youtube.com/watch?v=J-mTezdPLGI&t=2408s
On 4/14/2025 3:07 PM, BTR1701 wrote:
Here's a short story from '53. Fairly quick read...
https://elateachers.weebly.com/uploads/2/7/0/1/27012625/virtuoso.pdf
On Fri, 11 Apr 2025 08:13:40 -0000 (UTC), "Adam H. Kerman"
<ahk@chinet.com> wrote:
https://www.youtube.com/watch?v=BDJF4lR3_eg
Here is another perfect example of the mechanical sound of pitch-corrected vocals. In this case they have a trio singing, but apparently they were recorded separately, then pitch-corrected, and then put together to sound like they were singing together. It sounded off to me from the very start, and then Fil starts pointing out the mistakes.
The video is supposedly of three people singing together in a
bathroom.
https://www.youtube.com/watch?v=mMslQam1jRA
On Apr 11, 2025 at 2:05:52 AM PDT, "shawn" <nanoflower@notforg.m.a.i.l.com> wrote:
On Fri, 11 Apr 2025 08:13:40 -0000 (UTC), "Adam H. Kerman"
<ahk@chinet.com> wrote:
https://www.youtube.com/watch?v=BDJF4lR3_eg
He's done a number of those analyses showing just how much modern music
gets modified. Either pitch correction or compression of
the audio happens all the time.
I've noticed this on recordings going back to the 80s. When I play along with film soundtracks, I often have to push the tuning slide in and make my trumpet
almost an inch shorter than it normally is because the recording is so sharp, indicating it's been sped up from how it actually sounded in the studio when it was recorded.
On Fri, 11 Apr 2025 22:14:10 -0400, Rhino <no_offline_contact@example.com> wrote:
Good on the developers for adding that ability. I wonder if the bits which are "de-quantized" are chosen at random or if some other method is employed?
They appear to be chosen at random, but it's not enough to duplicate humans. It just makes the sound less mechanical until you look at it with sound analysis, as you can see in some of the other analysis videos that the Wings of Pegasus singer did. He showed how the vocals are still being forced to fit a perfect note with just a little variation added in, which isn't what a human vocal sounds like.
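That pattern, small random offsets layered on top of a perfectly quantized grid, can be sketched in Python. This is a hypothetical illustration of the general idea, not the actual "Human Playback" algorithm:

```python
import random

# Hypothetical "humanize" pass: start from perfectly quantized onsets
# (in seconds) and add a small random offset to each one.
def humanize(onsets, max_jitter=0.010, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [t + rng.uniform(-max_jitter, max_jitter) for t in onsets]

grid = [0.0, 0.25, 0.5, 0.75, 1.0]
jittered = humanize(grid)

# Every onset stays within 10 ms of the grid: the "perfect note with a
# little variation added" pattern that sound analysis can still detect.
# A real drummer drifts, rushes, and drags over longer spans instead.
assert all(abs(j - g) <= 0.010 for j, g in zip(jittered, grid))
```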
On 4/11/2025 12:47 PM, BTR1701 wrote:
I grew up watching a 1982 musical called "The Pirate Movie" starring
Kristy McNichol. I think I saw an interview with her once where she commented she couldn't sing and it was all done with computers. I never noticed it at the time, but now I can kind of tell there's some auto-tune happening. But until I heard her say that, it never crossed my
mind they had auto-tune in the 80s.
Another favorite musical is "Josie and the Pussycats." I love the movie
and I love the soundtrack. I very recently got to meet Rachael Leigh
Cook at a convention, and I asked her about her singing in the movie.
She said she can't sing at all, and it was all done with computers. That
one sort of surprised me, since I never noticed the auto-tune in her voice.