audio · integral · normalize · discrete

How to normalize samples of an ongoing cumulative sum?


For simplicity, let's assume we have the function sin(x) and have calculated 1000 samples of it, with values between -1 and 1. We can plot those samples. In the next step we want to plot the integral of sin(x), which would be -cos(x) + C. I can approximate the integral with my existing samples like this:

y[n] = x[n] + y[n-1]
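As a minimal sketch (the sample count and frequency are my own assumptions), the recurrence above is just a running sum over the sine samples:

```python
import math

# 1000 samples covering one full period of sin (an assumed setup)
N = 1000
x = [math.sin(2 * math.pi * n / N) for n in range(N)]

# y[n] = x[n] + y[n-1], with y[-1] = 0
y = []
acc = 0.0
for sample in x:
    acc += sample
    y.append(acc)
```

Note that this plain sum approximates the integral only up to a constant step-width factor; for the normalization question here, that scale factor doesn't matter.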

Because it's a cumulative sum, its values grow beyond that range, so we need to normalize it to get samples between -1 and 1 on the y axis.

y = 2 * ( (x - min(x)) / (max(x) - min(x)) ) - 1
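A small sketch of this min-max normalization as a helper function (the name `normalize` is my own):

```python
def normalize(samples):
    """Map samples linearly into [-1, 1] using their own min and max."""
    lo, hi = min(samples), max(samples)
    return [2 * (s - lo) / (hi - lo) - 1 for s in samples]
```

For example, `normalize([0, 5, 10])` maps the smallest value to -1, the midpoint to 0, and the largest to 1.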

To normalize we need a maximum and a minimum.

Now we want to calculate the next 1000 samples of sin(x) and continue the integral. Because it's a cumulative sum, we may get a new maximum, which means we would need to re-normalize all 2000 samples.

Now my question basically is:

How can I normalize samples in this context without knowing the global maximum and minimum in advance? How can I avoid re-normalizing all previous samples whenever a new set of samples brings a new maximum or minimum?


Solution

  • I've found a solution :)

    I also want to mention: this is about periodic functions like sine, so the maximum and minimum should always be the same, right?

    In one special case this isn't true:

    If your samples don't contain a full period of the function (and therefore miss its global maximum and minimum), the range computed from them is wrong. This can happen when you choose a very low frequency.

    What you can do:

    Make sure the first block of samples covers at least one full period of the function. The minimum and maximum of the cumulative sum then only need to be calculated once at the beginning, and every later block can be normalized with those same values.
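The idea above can be sketched as follows, assuming the first block spans exactly one period of the sine (the helper names `cumsum` and `normalize_with` are mine):

```python
import math

def cumsum(xs, start=0.0):
    """Running sum y[n] = x[n] + y[n-1], continuing from `start`."""
    out, acc = [], start
    for v in xs:
        acc += v
        out.append(acc)
    return out

def normalize_with(lo, hi, samples):
    """Normalize using a fixed, precomputed range instead of the block's own."""
    return [2 * (s - lo) / (hi - lo) - 1 for s in samples]

# First block: one full period -> contains the global min/max of the running sum
N = 1000
block1 = [math.sin(2 * math.pi * n / N) for n in range(N)]
y1 = cumsum(block1)
lo, hi = min(y1), max(y1)   # calculated once at the beginning

# Later blocks reuse lo/hi, so earlier samples never need re-normalizing
block2 = [math.sin(2 * math.pi * n / N) for n in range(N, 2 * N)]
y2 = cumsum(block2, start=y1[-1])
norm2 = normalize_with(lo, hi, y2)
```

Because sine is periodic and its sum over a full period is zero, the running sum of every later block stays inside the range found in the first block, so `norm2` remains within [-1, 1].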