Calculating the Average Amplitude of an Audio File Using FFT in JavaScript

I am working on a project in which I want to find the average amplitude of the audio data in any given AAC file. I currently read the file as an ArrayBuffer and pass it into a Uint8Array.

var dataArray = new Uint8Array(buffer);
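As a quick sanity check on this step, a Uint8Array is just a byte view over the buffer, so its length equals the buffer's byte length and every element is an unsigned 8-bit value. A small hand-made buffer stands in for the file contents here:

```javascript
// Hand-made 8-byte buffer standing in for the file's ArrayBuffer.
var buffer = new ArrayBuffer(8);
var dataArray = new Uint8Array(buffer);

// Each element is an unsigned byte in the range 0..255.
dataArray[0] = 200;
// Values outside the range wrap modulo 256: 300 is stored as 44.
dataArray[1] = 300;
```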

Then I set up two arrays, one real (containing the audio data) and one imaginary (containing all zeros), and pass them into an FFT. Before the transform, the audio data is copied into a plain array so that its values are no longer treated as unsigned 8-bit integers.

var realArray = [/* audio data here */];
var imagArray = [0, 0, 0, 0, 0, 0, ...];
transform(realArray, imagArray);
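A minimal sketch of this setup, assuming `dataArray` is the Uint8Array of raw bytes (a hypothetical 4-byte buffer stands in for real data here). Subtracting 128, the midpoint of the 0..255 range, re-centres the unsigned bytes around zero so the FFT does not see a large DC offset; the linked library's `transform` would then be called on the two arrays:

```javascript
// Hypothetical 4-byte buffer standing in for the real audio data.
var dataArray = new Uint8Array([128, 130, 126, 128]);

// Re-centre the unsigned bytes around zero (128 is the midpoint
// of the 0..255 range) before handing them to the FFT.
var realArray = Array.from(dataArray, function (b) { return b - 128; });

// Imaginary part starts as all zeros.
var imagArray = new Array(realArray.length).fill(0);

// transform(realArray, imagArray); // the linked FFT would be called here
```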

I then loop over the arrays from 0 to N/2, where N is the size of the initial buffer containing the raw audio data, and calculate the magnitude of each frequency bin. Finally, I divide the sum of these magnitudes by N/2.
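The averaging step described above can be sketched as follows, where the magnitude of bin k is sqrt(re[k]^2 + im[k]^2) and bins 0 through N/2 - 1 are summed (the function name `averageMagnitude` is my own, not from the question):

```javascript
// Average the magnitudes of the first N/2 frequency bins,
// where realArray and imagArray are the FFT outputs.
function averageMagnitude(realArray, imagArray) {
  var N = realArray.length;
  var sum = 0;
  for (var k = 0; k < N / 2; k++) {
    sum += Math.sqrt(realArray[k] * realArray[k] +
                     imagArray[k] * imagArray[k]);
  }
  return sum / (N / 2);
}
```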

The problem is that, on some occasions, audio played at a lower intensity yields a higher value than audio played at a higher intensity. Is my approach correct for achieving my goal, or is there a better way of going about it? Thanks.

Note: for those interested, the FFT implementation being used is available here in several languages: FFT. I am passing the middle 2^20 bytes of the audio file into the FFT and then doing my calculations.


You really don't need to use an FFT for this. Parseval's theorem essentially says that the energy in the time domain is equal to the energy in the frequency domain, so the FFT step is redundant: you can just calculate the amplitude directly in the time domain. Typically this is done by calculating the RMS value of the signal over a chosen time window (the length of this window depends on what you are trying to achieve).
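A minimal sketch of the time-domain RMS suggested above, assuming the samples in the window are already centred around zero (e.g. decoded PCM values, not raw unsigned bytes); the function name `rms` is my own:

```javascript
// Root-mean-square of one window of zero-centred samples:
// square each sample, average, then take the square root.
function rms(samples) {
  var sumSquares = 0;
  for (var i = 0; i < samples.length; i++) {
    sumSquares += samples[i] * samples[i];
  }
  return Math.sqrt(sumSquares / samples.length);
}
```

A louder signal has larger sample values and therefore a larger RMS, which is the monotonic loudness ordering the questioner was expecting.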
