This page last changed on Sep 06, 2008 by iank@bearcave.com.
Time Series Graphs (red: original tick data time series, blue: wavelet filtered time series)

Wavelet functions can be attractive ways to build time series filters. I have discussed using wavelets to filter histogram distributions here.

The Haar wavelet has problems for time series smoothing because it uses a rectangular basis. The Daubechies wavelets are also problematic because they can leave artifacts when some of the wavelet coefficient bands are set to zero during smoothing. Of the wavelet functions I'm currently aware of, the linear interpolation wavelet is the most attractive choice.

Methods for smoothing a time series include:

1. using a "denoising" algorithm
2. zeroing out selected wavelet coefficient bands
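The linear interpolation wavelet can be written compactly with the lifting scheme: each odd element is predicted as the midpoint of its two even neighbors, and the last odd element, which has no right neighbor, is extrapolated. This is my own minimal Python sketch (the function names are mine, not from any library):

```python
def lin_forward(signal):
    """Linear interpolation wavelet transform (lifting scheme,
    predict-only).  len(signal) must be a power of two.  Returns
    (approximation, detail bands ordered finest first)."""
    approx = list(signal)
    bands = []
    while len(approx) > 1:
        even, odd = approx[0::2], approx[1::2]
        detail = []
        for i, o in enumerate(odd):
            if i + 1 < len(even):
                # Interior point: predict as the midpoint of the
                # two neighboring even elements.
                p = (even[i] + even[i + 1]) / 2.0
            elif i > 0:
                # No right neighbor: extrapolate the line through
                # the last two even elements -- this is the
                # interpolated end point discussed below.
                p = even[i] + (even[i] - even[i - 1]) / 2.0
            else:
                p = even[i]
            detail.append(o - p)
        bands.append(detail)
        approx = even
    return approx, bands


def lin_inverse(approx, bands):
    """Exact inverse of lin_forward (lifting steps are invertible)."""
    s = list(approx)
    for detail in reversed(bands):
        even, odd = s, []
        for i, d in enumerate(detail):
            if i + 1 < len(even):
                p = (even[i] + even[i + 1]) / 2.0
            elif i > 0:
                p = even[i] + (even[i] - even[i - 1]) / 2.0
            else:
                p = even[i]
            odd.append(d + p)
        s = [v for pair in zip(even, odd) for v in pair]
    return s
```

For data that lies on a straight line, every detail coefficient except the single coarsest one is zero, which is what makes this basis attractive for smoothing roughly piecewise-linear data.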

### Problems with Interpolated End Points

As with many wavelet algorithms, the linear interpolation algorithm has problems at the end of the time series: the last point is interpolated, which can leave it with a lot of error relative to the time series. This is shown in the time series below, a day of tick data (Goldman Sachs, GS, on July 10, 2008). The wavelet transform has been calculated on 16K elements of the tick data time series, and the highest four bands of wavelet coefficients (the 8K, 4K, 2K, and 1K bands) have been thresholded. The end of the graph is shown below:

A detail of this time series and the result of wavelet threshold filtering is shown below:

When a wavelet filter is used to filter a streaming time series, a window is used; here the window is 256 elements long. The window is filled with 256 time series elements and the wavelet transform is calculated. The wavelet coefficients are then filtered, and the inverse wavelet transform is performed on the filtered coefficients. The last element of the wavelet window would normally become the first element of the filtered time series. When a new element is added, the wavelet filter is applied again and the leftmost element becomes the next element in the filtered time series.

The problem with the linear interpolation wavelet is that the end element is interpolated and can have a lot of error: it tends to overestimate the movement at the end of the time series. This is shown in the time series below. Note that the movements in the original time series are emphasized in the wavelet filtered time series. For market models, where we want smoothing, this is the opposite of what we're trying to achieve.

The problem with the interpolated end point can be avoided by using an element farther from the end: for a time series of N elements, we use element N-4. This introduces lag into the result, but for a tick data time series the lag is only four ticks and in most cases will not matter.
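The streaming loop with this fix can be sketched as follows. This is my own Python sketch, not the code behind the graphs: a small 16-element window stands in for the 256-element one, the two finest coefficient bands are zeroed, and the output point is taken four elements back from the end of the filtered window.

```python
from collections import deque

def lin_forward(signal):
    """Linear interpolation wavelet (lifting); length must be a
    power of two.  Returns (approximation, bands finest-first)."""
    approx, bands = list(signal), []
    while len(approx) > 1:
        even, odd = approx[0::2], approx[1::2]
        detail = []
        for i, o in enumerate(odd):
            if i + 1 < len(even):
                p = (even[i] + even[i + 1]) / 2.0            # midpoint predict
            elif i > 0:
                p = even[i] + (even[i] - even[i - 1]) / 2.0  # end extrapolation
            else:
                p = even[i]
            detail.append(o - p)
        bands.append(detail)
        approx = even
    return approx, bands

def lin_inverse(approx, bands):
    """Exact inverse of lin_forward."""
    s = list(approx)
    for detail in reversed(bands):
        even, odd = s, []
        for i, d in enumerate(detail):
            if i + 1 < len(even):
                p = (even[i] + even[i + 1]) / 2.0
            elif i > 0:
                p = even[i] + (even[i] - even[i - 1]) / 2.0
            else:
                p = even[i]
            odd.append(d + p)
        s = [v for pair in zip(even, odd) for v in pair]
    return s

def smooth_window(window, n_zero=2):
    """Zero the n_zero finest detail bands and reconstruct."""
    approx, bands = lin_forward(window)
    for band in bands[:n_zero]:
        band[:] = [0.0] * len(band)
    return lin_inverse(approx, bands)

def stream_filter(ticks, width=16, lag=4):
    """For each full window, emit the element `lag` points back from
    the end of the smoothed window, avoiding the interpolated end
    point at the cost of a `lag`-tick delay."""
    window, out = deque(maxlen=width), []
    for t in ticks:
        window.append(t)
        if len(window) == width:
            out.append(smooth_window(list(window))[-lag])
    return out
```

For data lying on a line the filter passes the input through unchanged (the zeroed detail bands are already zero), which makes the lag easy to see: each output point is the input point four ticks back.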

Although the literature on wavelet signal processing seems to recommend threshold noise filtering, in many cases it appears to be less effective than simply zeroing out one or more high frequency bands. Here the wavelet transform is calculated and all of the coefficients in the selected bands are set to zero. For the 256 element window, the 128 and 64 element bands are set to zero.
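As a concrete sketch of the zero-band approach (the lifting transform is inlined so the example stands alone; the function names are my own): for a 256-element window the detail bands hold 128, 64, 32, ..., 1 coefficients, and the filter simply clears the two finest bands before the inverse transform.

```python
import math

def lin_forward(signal):
    """Linear interpolation wavelet (lifting); power-of-two length.
    Returns (approximation, detail bands ordered finest first)."""
    approx, bands = list(signal), []
    while len(approx) > 1:
        even, odd = approx[0::2], approx[1::2]
        detail = []
        for i, o in enumerate(odd):
            if i + 1 < len(even):
                p = (even[i] + even[i + 1]) / 2.0            # midpoint predict
            elif i > 0:
                p = even[i] + (even[i] - even[i - 1]) / 2.0  # end extrapolation
            else:
                p = even[i]
            detail.append(o - p)
        bands.append(detail)
        approx = even
    return approx, bands

def lin_inverse(approx, bands):
    """Exact inverse of lin_forward."""
    s = list(approx)
    for detail in reversed(bands):
        even, odd = s, []
        for i, d in enumerate(detail):
            if i + 1 < len(even):
                p = (even[i] + even[i + 1]) / 2.0
            elif i > 0:
                p = even[i] + (even[i] - even[i - 1]) / 2.0
            else:
                p = even[i]
            odd.append(d + p)
        s = [v for pair in zip(even, odd) for v in pair]
    return s

def zero_band_filter(window, n_zero=2):
    """Zero the n_zero highest-frequency bands (the 128 and 64
    element bands for a 256 element window) and reconstruct."""
    approx, bands = lin_forward(window)
    for band in bands[:n_zero]:
        band[:] = [0.0] * len(band)
    return lin_inverse(approx, bands)

# A 256-element synthetic curve: slow trend plus a faster wiggle.
window = [math.sin(i / 40.0) + 0.2 * math.sin(i / 3.0) for i in range(256)]
smoothed = zero_band_filter(window)
```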

The result of a streaming wavelet filter using this zero band technique is shown below:

### Experiments

A variety of variations on the linear wavelet filter were tried. None of them made any real difference over the simple 128 element wavelet filter with the two highest bands zeroed out. Yes, I know, I said 256 above, but it doesn't actually make much difference, and 128 is faster than 256.

• Longer wavelet buffer lengths (512 and 1024)
• Zeroing out more bands (say a 1024 element buffer with the three highest bands zeroed out). This just results in more noise.
• Feeding one wavelet filter into another: filter the data and reconstruct the signal, then filter the reconstructed signal again. It doesn't make any difference.