Summary - Smoothing/Filtering of Running-Kinematics data

This topic is closed.

    Hello again to all BIOMCH-L subs,

    Last week I posted a query regarding SMOOTHING/FILTERING of RUNNING-
    KINEMATICS data (see the original posting below).

    I got many replies, and I thank EVERYBODY who responded.

    I think I'm still going to use Winter's residual analysis to choose the
    cut-off frequency, which, as you can see in the replies, is the most
    widely used approach.

    AGAIN - thank you ALL very much !!!

    David.

    > Hello to all BIOMCH-L subscribers,
    >
    > I'm a master's student working on research regarding the influence of
    > fatigue on running kinematics.
    > We are using a video camera (50 Hz sampling frequency) to collect 2D
    > kinematic data of running subjects.
    >
    > Each subject has a different running speed and as a result should have a
    > different cut-off frequency for the filter (and the same goes for each marker).
    >
    > I used residual analysis (as in D.A. Winter's book), but this has to be
    > done for each subject and marker, which is a huge amount of work:
    > ~20 subjects * 5 markers * 4 tests * 3 (at least) trials per test =
    > = 1200 (!!!!) residual analyses...
    >
    > So can anyone suggest a systematic, shorter and simpler way to smooth the
    > kinematic data ???
    >
    > Thanks in advance (of course I'll post a summary of the replies).
    >
    > David.


    From: Daniel Zlatnik
    ~~~~~~~~~~~~~~~~~~~~
    I was not able to completely understand what you want to do, but I can tell
    you what I did for my purposes, and perhaps you will find something useful.

    I built a dynamic model of human bipedal gait. The model is used to compute
    the joint torques from measured and computed kinematics, and it can be used
    for investigating normal and pathologic gait. I took the 3D marker data
    (markers at the joints) from the gait lab, then plotted the stick-figure
    animation (in MATLAB) to confirm the validity of each sampled trial (there
    is always data which is not relevant) and to select the beginning and ending
    time (or sample) of each gait phase (SLS, DLS, stance, swing, etc.). I then
    computed the inertial angular positions of each link (in MATLAB), fitted a
    high-order polynomial to each time history, and differentiated it twice.
    The differentiation is exact, since the derivative of a polynomial is
    itself a polynomial of one lower degree, and the degree of the polynomial
    you select has a filtering effect.
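    The fit-and-differentiate idea above can be sketched as follows, with
    NumPy standing in for the MATLAB the poster used; the data, degree, and
    variable names are purely illustrative.

```python
import numpy as np

# Illustrative joint-angle time history sampled at 50 Hz
t = np.linspace(0.0, 1.0, 51)
theta = 0.5 * np.sin(2 * np.pi * 1.2 * t)  # stand-in for a measured angle

# Fit a high-order polynomial; the chosen degree acts as the "filter knob"
degree = 9
coeffs = np.polyfit(t, theta, degree)

# Differentiating a polynomial is exact: each derivative is a polynomial
# of one lower degree
vel_coeffs = np.polyder(coeffs)        # angular velocity coefficients
acc_coeffs = np.polyder(vel_coeffs)    # angular acceleration coefficients

omega = np.polyval(vel_coeffs, t)      # velocity time history
alpha = np.polyval(acc_coeffs, t)      # acceleration time history
```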

    I hope this helps. I am originally from Carmelia (Haifa) and have been
    working here for some years on developing an intelligently controlled A/K
    prosthesis.

    Dani.

    ================================================================
    From: Chris Kirtley

    You could just filter at the 6th harmonic of the natural frequency
    (cadence) of the runner. Actually, you might be able to get away with
    the 4th or even 3rd harmonic for more proximal markers such as the hip.

    Chris Kirtley
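    The harmonic rule above reduces to a one-liner; the stride (cadence)
    frequency used here is an illustrative value, not from the posting.

```python
def harmonic_cutoff(stride_freq_hz, harmonic=6):
    """Cut-off frequency at a chosen harmonic of the runner's cadence."""
    return harmonic * stride_freq_hz

# For a runner striding at roughly 1.4 Hz (illustrative):
fc_distal = harmonic_cutoff(1.4, harmonic=6)  # distal markers
fc_hip = harmonic_cutoff(1.4, harmonic=3)     # proximal markers such as the hip
```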

    ================================================================
    From: Christian.Peham@vu-wien.ac.at (Thu May 22 11:10:08 1997)

    Try a Fourier analysis of your data. Then you will know the frequency
    range, and with this information you can adapt your filter's cut-off
    frequency for smoothing the data.
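    A minimal NumPy sketch of this spectrum-inspection idea; the synthetic
    trajectory and the keep-99%-of-the-power criterion are illustrative
    choices, not from the posting.

```python
import numpy as np

fs = 50.0                          # camera sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
# Stand-in marker trajectory: slow "movement" plus high-frequency noise
x = (np.sin(2 * np.pi * 1.5 * t)
     + 0.05 * np.random.default_rng(0).standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

# One common heuristic: keep the band holding 99% of the signal power
cum_power = np.cumsum(spectrum) / spectrum.sum()
fc = freqs[np.searchsorted(cum_power, 0.99)]
```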

    Good Luck

    Christian Peham

    ================================================================
    From: Bing.Yu@css.unc.edu (Thu May 22 11:10:11 1997)

    You asked a question that has been repeatedly asked on the list. You may
    want to try an equation I developed in my master's thesis to estimate the
    optimum cutoff frequency for a Butterworth low-pass digital filter:

    Fc = [1.4845 + 0.1532 * sqrt(Fs)]^2

    where Fc is the estimated optimum cutoff frequency and Fs is the sampling
    frequency. This equation has been used for different human body movements
    for several years, and the results have been satisfactory.
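    Reading "sqr" in the posted equation as a square root, plugging in the
    50 Hz camera rate from the original question gives a cutoff of roughly
    6.6 Hz:

```python
import math

def yu_cutoff(fs_hz):
    """Estimated optimum Butterworth cutoff from the sampling rate (Yu's equation)."""
    return (1.4845 + 0.1532 * math.sqrt(fs_hz)) ** 2

fc = yu_cutoff(50.0)  # ~6.6 Hz for a 50 Hz video camera
```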

    It seems that you are expecting different cutoff frequencies for different
    running speeds. I have to say that I have never seen any quantitative
    evidence showing that the optimum cutoff frequency is a function of
    movement speed. If you have any quantitative data supporting this view, I
    would like to have the reference.

    Dr. Winter's residual analysis procedure for determining the cutoff
    frequency for the Butterworth low-pass digital filter can easily be
    implemented in a computer program, and does not cost a lot of time in use.
    However, as you may have already noticed, there is no evidence that the
    cutoff frequencies determined using this procedure are optimum.

    Whichever procedure you use, good luck.

    The reference for the equation I gave above is:

    Yu, B. and Hay, J.G. (1995) Angular momentum and performance in the triple
    jump: a cross-sectional analysis. Journal of Applied Biomechanics, 11:
    81-102.

    Bing Yu, Ph.D.
    Assistant Professor
    Division of Physical Therapy
    The University of North Carolina at Chapel Hill

    ================================================================
    From: Bing.Yu@css.unc.edu (Thu May 22 11:10:29 1997)

    David,

    I understand the reason why you expect different cutoff frequencies for
    different markers. I know Dr. Winter says that in his book. His statement
    about different cutoff frequencies for different markers is based on the
    cutoff frequencies determined using his residual analysis procedure. As I
    mentioned in my previous e-mail to you, there is no solid support showing
    that the cutoff frequencies determined using the residual analysis
    procedure are optimum. In addition, Dr. Winter did not say that different
    markers have different cutoff frequencies because their speeds are
    different; what he said is that different markers have different cutoff
    frequencies because their frequency spectra are different. I hope this
    helps to clarify the confusion.


    Bing Yu, Ph.D.
    Assistant Professor
    Division of Physical Therapy
    The University of North Carolina at Chapel Hill

    ================================================================
    From: G.Giakas@mmu.ac.uk (Thu May 22 11:10:17 1997)

    First of all, if you want to use Winter's method, why don't you write a
    small routine to analyse all these files in one go? For the purposes of my
    studies I used 1440 signals, and it took about 2-3 hours of computer
    processing to run everything. You just have to use a character code for
    your data file names that the software can understand and use to create
    the file names. Then all you have to do is write a few nested loops
    (20 * 5 * 4 * 3).

    I cannot think of another systematic way of smoothing your data, if I
    understood what you mean. I can, however, suggest other filtering
    techniques. You may also want to have a look at two of my papers:
    --
    1) Giakas G and V Baltzopoulos (1997). A comparison of automatic
    filtering techniques applied to biomechanical walking data. Journal
    of Biomechanics (in press).

    2) Giakas G and V Baltzopoulos (1997). Optimal digital filtering
    requires a different cut-off frequency strategy for the determination
    of the higher derivatives. Journal of Biomechanics (in press).
    --

    Contact me again if you need any more help.

    Good luck

    Giannis

    ================================================================
    From: morrisa@ecf.toronto.edu (Thu May 22 11:10:21 1997)

    I have been looking into smoothing routines for our 3-D walking data, but
    the same considerations apply. If you want to be rigorous, you should do
    an FFT for each marker, look at its spectrum, and apply a low-pass filter
    at a suitably determined frequency; 6 Hz is good for walking, but I am not
    sure about running. One means of automatically applying a satisfactory
    filter to each marker is the generalized cross-validation scheme used by
    Woltring, based on the work of Craven and Wahba. Woltring's software is
    available on the ISB website; you should look at that and the associated
    references.

    Regards,

    Alan Morris

    ================================================================
    From: smccaw@ilstu.edu (Thu May 22 11:10:24 1997)

    If you have to enter code for each smoothing, it is a lot of work. I too
    have coded Winter's method (in QuickBasic), but with an efficient way to
    process the files the work is minimal.

    Your problem probably stems from how you name your data files. I did my
    doctoral work at the U of Oregon in the late 1980's, and adopted a file
    naming system that lends itself to easy processing.

    Files are called SxCyTz, where Sx identifies the subject number, Cy
    identifies the condition number, and Tz identifies the trial number. We
    use nested loops, one each for subject, condition, and trial, and then
    loops for the X & Y coordinates of each landmark. This makes analyzing
    multiple files easy. The analysis can run all night, and the x,y
    coordinates of multiple landmarks can be smoothed with no operator input
    after the initial set-up. All smoothed data are stored to similarly named
    files for use in further processing.
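    A sketch of this naming-and-looping scheme in Python; the SxCyTz
    convention is kept from the posting, while the counts and the smoothing
    step itself are placeholders.

```python
def batch_filenames(n_subjects=20, n_conditions=4, n_trials=3):
    """Generate SxCyTz file names with nested loops, as described above."""
    names = []
    for s in range(1, n_subjects + 1):
        for c in range(1, n_conditions + 1):
            for t in range(1, n_trials + 1):
                names.append(f"S{s}C{c}T{t}")
    return names

files = batch_filenames()
# Each file would then be loaded, every landmark's x,y series smoothed in
# inner loops, and the result written to a similarly named output file.
```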

    This is the most efficient method I know of. It works wonderfully.

    Regards, Steve

    *********************************************************
    **** visit our web site ****
    *********************************************************
    http://www.cast.ilstu.edu/hperd/facility/hprbio.htm
    *********************************************************
    Steven T. McCaw, Ph.D
    Associate Professor, Biomechanics
    Dept of HPER
    5120 Illinois State University
    Normal, IL 61790-5120
    phone: 309-438-3804
    fax: 309-438-5559
    home: 309-452-9411
    ================================================================
    From: beyerr@bme.ri.ccf.org (Thu May 22 11:10:26 1997)

    I am a master's student as well, and so am very inexperienced at this
    myself. However, I did run into a similar problem, so I will tell you what
    I've done. Please take it with a grain of salt, as I may have
    oversimplified the solution.
    My project involves the kinematics of slip-and-fall accidents. I have
    33 subjects, so I understand your concern about doing work for each
    individual, and each trial. I am looking at markers on the knees and feet.
    It occurred to me to filter them differently, so this is how I checked
    for cutoff frequencies.

    I wrote a program to digitally filter my data at cutoff frequencies
    from 5-20 Hz (increments of 1 Hz). I knew that a standard cutoff
    frequency for walking was 6 Hz, but also knew I should go higher for slips
    (however I did not know how much higher). For each of these cutoff
    frequencies, my program determined the average rms error between the
    unfiltered data and the data filtered at that cutoff. I made a plot
    of cutoff frequency vs error. (The shape comes out looking like an
    exponential decay curve.)
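    This cutoff sweep is straightforward with SciPy's Butterworth tools
    (assumed available here); the synthetic data and sampling rate below are
    illustrative, not Rachel's.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                                # sampling rate (Hz); illustrative
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 2 * t) + 0.1 * rng.standard_normal(t.size)

cutoffs = np.arange(5, 21)                # 5-20 Hz in 1 Hz steps
rms = []
for fc in cutoffs:
    b, a = butter(2, fc / (fs / 2))       # 2nd-order low-pass, normalized cutoff
    filtered = filtfilt(b, a, x)          # zero-lag (forward-backward) filtering
    rms.append(np.sqrt(np.mean((x - filtered) ** 2)))

# Plot cutoffs vs rms and look for the "knee" of the decay curve.
```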

    I experimented this way for a dozen or so subjects and looked for the
    point on each graph where the curve levels off, which represents a
    potential cutoff frequency to use. The reference my advisor gave me for
    this method is K.M. Jackson, "Fitting of Mathematical Functions to
    Biomechanical Data". By visual inspection of the graphs, a cutoff
    frequency of about 9 Hz seemed appropriate for my study.
    I did the same thing for the knee data and found that it was not
    significantly different, so I ended up using the same cutoff for all
    subjects and for both the knee and foot markers.

    I'll be interested to see what other methods are suggested to you.
    Good luck with your project. Please let me know which method you end
    up using; it might be worth it for me to do it another way.

    -Rachel

    ================================================================
    From: jvdura@ibv.upv.es (Thu May 22 11:10:32 1997)

    You can use smoothing B-splines with generalized cross-validation or the
    standard-error method. If you write a program in C or MATLAB, all the work
    is done by the computer.

    Look for :

    WOLTRING, H.J. (1986)
    A Fortran package for generalized cross-validatory spline smoothing
    and differentiation
    Adv. Eng. Software, 8, 104-113
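    Woltring's GCVSPL package is Fortran, but modern SciPy (>= 1.10, an
    assumption about your environment) offers make_smoothing_spline, which
    likewise picks the smoothing parameter by generalized cross-validation:

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline  # SciPy >= 1.10

t = np.linspace(0, 1, 101)
rng = np.random.default_rng(2)
y = np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(t.size)  # noisy signal

# With lam=None the smoothing parameter is chosen by generalized
# cross-validation, in the spirit of Woltring's GCVSPL package
spl = make_smoothing_spline(t, y)
y_smooth = spl(t)
y_vel = spl.derivative()(t)  # a smoothed first derivative comes for free
```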

    ================================================================
    From: niiler@UDel.Edu (Thu May 22 11:10:42 1997)

    A student here wrote a program using Winter's algorithm. This program
    gives the cutoff frequency for each marker when given a *.p3d file. In
    addition, it produces an output *.p3d file which has been filtered at the
    correct frequency. The program is not terribly complex and should not be
    that difficult to create. To obtain a copy of this program, e-mail Ed
    Quigley at quigley@udel.edu (he wrote it).

    Cheers,
    Tim Niiler

    ================================================================
    From: Jeffrey Kumer

    Keith Williams here at UC Davis did a study very similar to yours, and you
    may want to talk to him about it. His email address is
    krwilliams@ucdavis.edu.

    Good luck,
    Jeff Kumer

    ================================================================
    From: Tim Doyle

    I am currently doing some cycling research and also have to smooth 2-D
    data. I am using Matlab and wrote some scripts which, using Winter's
    method, smooth my data and allow for different cut-off frequencies.
    Perhaps you should consider this.

    Good Luck,
    Tim Doyle

    ================================================================
    From: Michael DeLancey

    Look for a paper by Lauder and Reilly (1992). It uses salamanders as an
    example, BUT they devised several methods for smoothing and simplifying
    kinematic data.

    ================================================================
    From: Michael Orendurff

    You're right, that level of work is absurd, and really you'll probably end
    up with about the same numbers anyway. I did this on my master's project,
    and a single file (1000 Hz but only 50 samples) was over 2 MB in Excel.
    My experience is that Winter's method almost always ends up with a cutoff
    frequency which over-smooths the data (as judged by experienced
    biomechanists). From my limited experience it appears that no one has come
    up with a fool-proof method for picking cutoff frequencies, and that sound
    judgement is always necessary.
    My advice is to pick a point like the knee, which will have a moderate
    velocity change during a stride, and perform the residual analysis on it
    for a single representative (read: median) subject. (The hip will have a
    very small velocity change and the foot a very large one, so the knee is
    perhaps a good medium point for residual analysis.) Smooth all data with
    this cutoff frequency. After all, is your thesis about fatigue in running,
    or is it about smoothing techniques, picking cutoff frequencies, and
    endless computational nightmares? My guess is that even if you did all
    this, the cutoff frequencies would fall between 6-9 Hz.
    If your advisor still insists on an individual cutoff value, point by
    point and individual by individual, then make a deal with him: if you ever
    finish it, a Ph.D. will be awarded.

    Michael



    ================================================================
    Bye...& Have a nice day...

    () David Daily
    /\ Dept. of Biomedical Eng.
    Dudi Daily_/) Technion, IIT
    /\ Haifa 32000 , ISRAEL
    / \ E-mail : daily@biomed.technion.ac.il
    _\ _\ Tel. : 972-4-8294141
    ================================================================