I'm very new to R and EEG signal processing, so please excuse me if the answer to this question is obvious.
I'm trying to apply a Butterworth filter to an EEG signal to extract the alpha band. When I run the filter, the resulting signal looks very strange and not at all what I expected, with an unusually large peak at the beginning of the time frame. I tried both eegfilter and bwfilter to see whether the problem was in my code, but there was very little difference between the two when I plot the results. I'm at a loss to explain the outcome and would be grateful if someone could explain this peculiar result to me.
Here is an example from the data I'm looking at: https://ufile.io/1ji48wg6
The sampling rate is 512 Hz.
I want to extract the alpha band, i.e. frequencies between 8 and 12 Hz.
library(eegkit)
# single-channel recording, one column named "EEG"
mturk <- read.csv("EEG_alpha.csv", header = TRUE, sep = ",")
# band-pass Butterworth filter for the alpha band (8-12 Hz) at 512 Hz
mturk.but <- eegfilter(mturk, Fs = 512, lower = 8, upper = 12, method = "butter", order = 4)
plot(mturk.but)
Here is a picture of the data when plotted. The leftmost panel is the raw data, the central panel is the result of applying a Butterworth filter with eegfilter, and the right panel is the result of applying a Butterworth filter with bwfilter.
[Plots of the data when the filters are applied]
Header and first rows of the dataset:
EEG
-8438.876837
-8442.718979
-8441.877183
-8439.974768
-8443.436883
-8448.900711
-8452.433874
-8441.616546
It seems that the eegfilter and bwfilter functions effectively pad the data with zeros in front (i.e. the filter starts from a zero state) before filtering, and only normalise afterwards. Because the raw EEG sits at a large DC offset (around -8440), that padding looks like a step at the very first sample, so you end up with something akin to a Dirac impulse at the start of the processed data, taking the filtered signal from its raw state:
To this once it has been filtered:
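A quick way to check that the DC offset, and not the data itself, is what causes the spike is to filter a flat signal sitting at roughly the same level (about -8440, as in the file above). This is only a sketch, assuming the same eegfilter call as in the question:

library(eegkit)
# a flat "signal" at the recording's DC level: 2 seconds at 512 Hz,
# containing no alpha activity at all
flat <- rep(-8440, 2 * 512)
# filtering it still produces a large transient near the start, which is
# consistent with the filter treating the offset as a step from zero
flat.but <- eegfilter(flat, Fs = 512, lower = 8, upper = 12, method = "butter", order = 4)
plot(flat.but, type = "l")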
However, if you normalise the data to zero before applying the filter, by subtracting the first value of the time series from every value, no Dirac-like artifact occurs:
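A minimal sketch of that workaround, assuming the same file and column name ("EEG") as above and eegkit's eegfilter:

library(eegkit)
mturk <- read.csv("EEG_alpha.csv", header = TRUE)
# remove the DC offset before filtering by subtracting the first sample
# (subtracting mean(mturk$EEG) instead works just as well)
eeg.centred <- mturk$EEG - mturk$EEG[1]
# band-pass for the alpha band (8-12 Hz) at 512 Hz, as before
mturk.but <- eegfilter(eeg.centred, Fs = 512, lower = 8, upper = 12, method = "butter", order = 4)
plot(mturk.but, type = "l")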