
Oversampled signal: determining the re-sampling freq?



    My question involves determining the resampling frequency for a
    signal that was initially oversampled. My end goal is to compute the
    first derivative of this signal with respect to time.

    I am collecting data and sending control signals from an A/D board at
    1000 Hz, the rate at which the control signals need to be generated.
    Unfortunately, the tasks on the board cannot be split to run at
    different event speeds, so data is also collected at 1000 Hz.

    The problem is that, for the data I am collecting, a 1000 Hz collection
    rate results in an oversampled signal. Giannis Giakas has suggested
    that the resampling frequency can be found by determining the frequency
    content of the signal, then finding the frequency below which x% of the
    signal's power is contained.

    When this is done, however, the first derivative of the resulting signal
    changes dramatically depending on whether the resampling frequency is
    taken at 99.0, 99.5 or 99.9% of signal content (3 Hz, 16 Hz and 30 Hz
    respectively).
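    For readers unfamiliar with the power-content approach described above, it can be
    sketched as follows. This is only an illustration of the general idea, not
    Giakas's exact procedure: it computes a plain DFT (pure Python for
    self-containment; in practice an FFT routine would be used), accumulates power
    over the positive-frequency bins, and returns the lowest frequency at which the
    running sum reaches the requested fraction of the total.

    ```python
    import cmath
    import math

    def power_content_frequency(signal, fs, fraction):
        """Return the lowest frequency (Hz) below which `fraction` of the
        signal's spectral power is contained (DC excluded)."""
        n = len(signal)
        # Remove the mean so a DC offset does not dominate the power sum.
        mean = sum(signal) / n
        x = [s - mean for s in signal]
        # Power at each positive-frequency DFT bin (O(n^2) DFT for clarity).
        half = n // 2
        power = []
        for k in range(1, half + 1):
            xk = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                     for t in range(n))
            power.append(abs(xk) ** 2)
        total = sum(power)
        running = 0.0
        for k, p in enumerate(power, start=1):
            running += p
            if running >= fraction * total:
                return k * fs / n   # convert bin index to Hz
        return half * fs / n
    ```

    With something like this, the cutoffs quoted in the post (3, 16 and 30 Hz)
    could be reproduced from the recorded data; the choice of fraction remains
    the open question.
    
    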

    Can anyone:
    1] recommend another method of determining the resampling frequency?

    2] provide a rationale for why the resampling frequency should
    be taken at a specific percentage of signal content?
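    One reason the cutoff matters so much here: differentiation scales each
    spectral component by 2*pi*f, so whatever high-frequency content survives
    the resampling is amplified in the derivative. A minimal central-difference
    sketch (one common way to take the first time derivative; the original post
    does not specify which method is being used):

    ```python
    def central_difference(signal, fs):
        """First time derivative via central differences.
        Returns len(signal) - 2 values (endpoints are dropped)."""
        dt = 1.0 / fs
        return [(signal[i + 1] - signal[i - 1]) / (2 * dt)
                for i in range(1, len(signal) - 1)]
    ```

    Any residual noise at frequency f in the input appears in this output with
    its amplitude multiplied by roughly 2*pi*f, which is why the derivative is
    so sensitive to where the resampling cutoff is placed.
    
    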

    Thank you in advance. I will publish a summary of any responses.

    ____________________________________________________________________
    Greg Kawchuk D.C., M.Sc.
    Clinician, University Health Services
    Ph.D. Candidate, McCaig Centre for Joint Injury and Arthritis Research

    To unsubscribe send UNSUBSCRIBE BIOMCH-L to
    For information and archives: