If you can get good results without filtering, by all means do it that way!

Filtering was a big issue in the 1980s, when data came from hand-digitized 16 mm film and was very noisy. Filtering was necessary, and the results were sensitive to how it was done. Old-timers like me have trouble letting go of that mindset.

Nevertheless, I always try to set the filter cutoff as high as possible. Some years ago we had arm motion data from throwing, and I was surprised to see that the angular accelerations were clearly visible above the noise without any filtering. We should be open-minded, as long as motion and force data are treated the same way.

The need for filtering depends on the noise level and sampling rate of the data, and on the amplitude of the true acceleration signal.

Accelerations are normally calculated by finite differences. If the noise level in the marker data is P and the sampling frequency is F, the noise level in the velocity data is sqrt(2)*P*F (sqrt(2) rather than 2 because the two position samples in each difference have independent random errors). Differentiating again amplifies the noise by roughly another factor of sqrt(2)*F, giving about 2*P*F^2 in the acceleration. (Strictly, successive velocity samples share a position sample and are correlated, so the exact factor for a repeated first difference is sqrt(6) = 2.4 rather than 2, but 2*P*F^2 is a reasonable round estimate.)
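This noise propagation is easy to verify numerically. Here is a minimal sketch; the 0.5 mm noise level and 200 Hz rate are just example values, and the stationary-marker setup is an assumption chosen so that everything computed is pure noise:

```python
import numpy as np

rng = np.random.default_rng(0)
P = 0.0005   # marker noise standard deviation (m), i.e. 0.5 mm
F = 200.0    # sampling frequency (Hz)
n = 200_000  # number of samples

# A stationary marker: the measured position is pure noise, so any
# velocity or acceleration computed from it is entirely noise.
x = rng.normal(0.0, P, n)

v = np.diff(x) * F   # first difference -> velocity noise
a = np.diff(v) * F   # second difference -> acceleration noise

# The measured acceleration noise comes out slightly above the 2*P*F^2
# estimate, because successive velocity samples share a position sample.
print(f"velocity noise:     {np.std(v):.3f} m/s   (sqrt(2)*P*F  = {np.sqrt(2)*P*F:.3f})")
print(f"acceleration noise: {np.std(a):.1f} m/s^2 (2*P*F^2 = {2*P*F**2:.1f})")
```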

Let's say we are looking at running. If the marker data has a noise level of 0.5 mm and is sampled at 200 Hz, the accelerations have a noise level of 2 x 0.0005 x 200^2 = 40 m/s^2. The true acceleration of lower extremity body segments is on the order of 5 g [1], so the noise level is about 80% of the signal. Filtering seems necessary in this case, and this will be discovered by the procedure I always use: start with a very high filter frequency (for both markers and GRF), examine the variables of interest (joint moments), and repeat the analysis while gradually lowering the filter frequency until the signal-to-noise ratio is acceptable. This is subjective, and the criterion depends on what you want to do with the results. If a peak moment value is needed, it will always be overestimated when there is noise. But this will be the same in all subjects and all conditions, so it may not affect a statistical analysis.
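The cutoff-lowering procedure can be sketched as follows. This is only an illustration, not the exact recipe: the synthetic 2 Hz "marker" trajectory, the second-order Butterworth filter, and the list of cutoffs are all assumptions, and acceleration noise stands in for the joint moments one would really inspect:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 200.0                       # sampling frequency (Hz)
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(1)

# Hypothetical marker trajectory: a 2 Hz movement of 5 cm amplitude
# plus 0.5 mm of measurement noise.
x = 0.05 * np.sin(2 * np.pi * 2 * t) + rng.normal(0.0, 0.0005, t.size)

def acceleration(signal, fs):
    """Acceleration by second finite difference."""
    return np.diff(signal, n=2) * fs**2

# Start with a very high cutoff and gradually lower it, inspecting the
# variable of interest at each step.
noise_by_fc = {}
for fc in [90, 60, 40, 20, 10]:
    b, a = butter(2, fc / (fs / 2))   # 2nd-order low-pass Butterworth
    xf = filtfilt(b, a, x)            # zero-lag (forward-backward) filtering
    noise_by_fc[fc] = np.std(acceleration(xf, fs))
    print(f"cutoff {fc:3d} Hz -> acceleration std {noise_by_fc[fc]:6.1f} m/s^2")
```

In this synthetic case the acceleration variability drops steeply at first and then levels off once the cutoff approaches the movement frequency; the point where it levels off is where one would stop lowering the cutoff.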

I agree about gap filling for marker data: it should not be necessary. Unless too many markers drop out, inverse kinematic analysis will still work with missing markers. Virtual markers use single-segment kinematic models; full IK can tolerate even more marker dropout and is more robust (but requires joint models with fewer than 6 DOF).


[1] Lafortune MA (1991) Three-dimensional acceleration of the tibia during walking and running. J Biomech 24:877-886.