I found from various online sources that the time complexity of DTW is quadratic. On the other hand, I also found that standard kNN has linear time complexity. However, when pairing them together, does kNN-DTW have quadratic or cubic time complexity?
In essence, does the time complexity of kNN depend solely on the metric used? I have not found any clear answer to this.
You need to be careful here. Let's say you have n time series of length l in your 'training' set (let's call it that, even though you are not really training with kNN). Computing the DTW between a pair of time series has an asymptotic complexity of O(l * m), where m is your maximum warping window. Since m <= l, O(l^2) also holds. (Although there are asymptotically more efficient algorithms, I don't think they are actually faster in practice in most cases, see here.) Classifying a time series using kNN requires you to compute the distance between that time series and all time series in the training set, which means n comparisons, linear with respect to n.
So your final complexity is in O(l * m * n), or O(l^2 * n). In words: the complexity is quadratic with respect to the time series length and linear with respect to the number of training examples.