Update: It turns out my videoElement did not contain the true video. Even though the YouTube video was playing, videoElement did not receive any events such as play or timeupdate (or any of the video data). I thought MediaElement.js connected videoElement to the true video element inside the YouTube iframe embed, but it only seems to proxy it. Replacing it with a "true" non-YouTube video results in timeDomainDataArray containing non-zero values.
I have a MediaElementAudioSourceNode obtained via:
audioCtx.createMediaElementSource(videoElement)
How do I get a buffer of audio data from this that is suitable for input to music-tempo?
Pass ... MusicTempo [a] buffer ... in the following format: non-interleaved IEEE754 32-bit linear PCM with a nominal range between -1 and +1[. T]hat is, a [32-bit] floating point buffer, with each [sample] between -1.0 and 1.0. This format is used in the AudioBuffer interface of Web Audio API.
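For reference, producing a buffer in that format is straightforward when the audio is available as a plain file (which, per the update above, is not the case for a YouTube embed). This is only a rough sketch, with a hypothetical URL, and the MusicTempo usage taken from its docs:
// Rough sketch (not working code from this project): decode an audio file into an
// AudioBuffer and pass one channel's Float32Array to music-tempo.
import MusicTempo from 'music-tempo'; // assumes a bundler that handles the package's CommonJS export

async function tempoFromFile(url) {
  const audioCtx = new AudioContext();
  const response = await fetch(url); // e.g. 'audio.mp3', a hypothetical URL
  const audioBuffer = await audioCtx.decodeAudioData(await response.arrayBuffer());
  const channelData = audioBuffer.getChannelData(0); // non-interleaved 32-bit float PCM, each sample in [-1, 1]
  const mt = new MusicTempo(channelData); // per the music-tempo docs quoted above
  return { tempo: mt.tempo, beats: mt.beats };
}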
I tried the following code, but timeDomainDataArray is filled with all 0 values:
let audioCtx = new AudioContext();
let source = audioCtx.createMediaElementSource(videoElement);
const analyser = audioCtx.createAnalyser();

// Read a snapshot of the analyser's time-domain data
const timeDomainDataArray = new Float32Array(analyser.fftSize);
analyser.getFloatTimeDomainData(timeDomainDataArray);

// Wire up the graph: source -> analyser, source -> speakers
source.connect(analyser);
source.connect(audioCtx.destination);

console.log({ source });
console.table(timeDomainDataArray);
(The code above was run inside a button click-handler, after manually starting video playback.)
I believe this is similar to code from another project, but I can't see how that code differs from mine.
The ultimate goal is to detect the beats in a YouTube video so they can be visualized.
Maybe your video element hasn't been played yet. Try getting the analyser data after the media has started playing, e.g.:
let videoElement = document.getElementById('videoElement')
let audioCtx, source, analyser

videoElement.addEventListener('play', () => {
  // Build the audio graph only once; calling createMediaElementSource twice
  // on the same element throws an InvalidStateError
  if (audioCtx) return
  audioCtx = new AudioContext()
  source = audioCtx.createMediaElementSource(videoElement)
  analyser = audioCtx.createAnalyser()
  source.connect(analyser)
  source.connect(audioCtx.destination)
})

videoElement.addEventListener('timeupdate', () => {
  if (!analyser) return
  const timeDomainDataArray = new Float32Array(analyser.fftSize)
  analyser.getFloatTimeDomainData(timeDomainDataArray)
  console.log({ source })
  console.log(timeDomainDataArray)
})
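Note that the analyser only gives you a short snapshot at each timeupdate. If you need a single buffer covering the whole track for music-tempo, one rough sketch (assuming audioCtx, source and videoElement from the snippet above, and MusicTempo imported from the music-tempo package) is to tap the source with a ScriptProcessorNode (deprecated, but still widely supported) and concatenate the captured chunks:
// Call this once after audioCtx and source exist, e.g. at the end of the 'play' handler above
function startCapture() {
  const chunks = []
  const recorder = audioCtx.createScriptProcessor(4096, 1, 1)
  recorder.onaudioprocess = (e) => {
    // Copy the samples; the event's buffer is reused between callbacks
    chunks.push(new Float32Array(e.inputBuffer.getChannelData(0)))
  }
  source.connect(recorder)
  recorder.connect(audioCtx.destination)

  videoElement.addEventListener('ended', () => {
    // Flatten the chunks into one non-interleaved 32-bit float PCM buffer
    const pcm = new Float32Array(chunks.reduce((n, c) => n + c.length, 0))
    let offset = 0
    for (const c of chunks) {
      pcm.set(c, offset)
      offset += c.length
    }
    const mt = new MusicTempo(pcm) // constructor from the music-tempo package
    console.log(mt.tempo, mt.beats)
  })
}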