Historical gait data accuracy

  • Historical gait data accuracy

I was discussing this with a user recently and it was suggested that I should post this for the historical record. In the early days of clinical gait analysis I was often helping users set up gait labs, installing the original Vicon 3D motion capture systems, AMTI force plates, and MA100 systems that collected EMG and foot-switch data. Every configuration involved testing and verifying the lab's data collection accuracy, and once that was done the labs would start collecting gait data for clinical research.
Once the data was collected and analysed, I saw a few complaints from labs that the foot-switch data did not match the force data: the heel switch always appeared to close before the heel hit the force plate. Everyone trusted the force plates, so the problem was seen as a foot-switch issue and the analysis software was updated to shift the foot-switch signal timing so that the gait cycle foot-switch data was synchronized with the force plate data.
Recently I was working with a lab user to help them verify their system accuracy and we saw a similar issue - we dropped a golf ball covered in reflective tape onto the force plate, detected the ball impact with a loudspeaker connected to an EMG input, and saw that the force data was delayed slightly more than the EMG signal when both signals were compared with the golf ball trajectory. Trying to figure this out made me think back to the days helping Dr Kadaba’s team install the Helen Hayes software in many labs: the initial lab installation always used the default AMTI filter setting of 1050 Hz, but when the Helen Hayes software was installed the force plate filter settings were always changed to 10.5 Hz. That would have resulted in the cleaner force plate data being delayed slightly, and might explain why the heel foot-switch signal did not match the force plate Fz signal.
I’m only writing this because I think it might affect the historical records of normal gait data - I’m just reporting an issue that I’ve seen, not saying that the original data is bad. When we set up the gait labs we initially thought that we had verified the data collection accuracy, but we didn't realize that we also needed to verify the working environment, and we always assumed it was good until we saw the “foot-switch problem”.
    As a result, it might be a useful study for a user to verify their gait lab data collection accuracy and then repeat the original gait calculations and compare the results - hopefully verifying the historical data, or reporting any issues.
    Last edited by Edmund Cramp; October 13, 2021, 10:36 AM.

  • #2
    That's fascinating!

The 10.5 Hz low-pass filter for the force plate is not a bad idea: it is closer to the low-pass filters used for kinematics, and helps prevent impact artifacts in the joint moments.

    But not a good idea to filter in real time (analog or digital) because this introduces a time lag in the signal. This is a good reminder to always record unfiltered data, so a zero-lag filter can be used off-line.

    With increasing interest in real-time analysis, we must be aware of this issue which really is inevitable in real-time applications. Joint moments and EMG envelope all require low pass filtering somewhere. As long as all signals are processed with the same real-time filter, at least they will have the same time lag.

    For reference the group delay and phase delay for a 2nd order Butterworth filter, at the low frequency limit, are both \(\tau = 1 / (\sqrt{2} \pi f_c)\), which for a 10.5 Hz filter would be 21 ms. Higher frequencies could be delayed up to 50% more. (I can post the full equations if anyone is interested).
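A minimal sketch of that delay in practice (Python with NumPy/SciPy; the 1000 Hz analog rate, the step input, and the 50% threshold are illustrative assumptions, not values from the thread): a causal 2nd-order Butterworth at 10.5 Hz shifts a step by roughly the predicted 21 ms, while an off-line zero-lag (forward-backward) pass shows essentially no shift.

```python
# Sketch: causal (real-time) vs zero-lag Butterworth filtering of a step input.
import numpy as np
from scipy import signal

fs = 1000.0                                  # assumed analog sample rate (Hz)
fc = 10.5                                    # force plate filter cut-off (Hz)
b, a = signal.butter(2, fc / (fs / 2))       # 2nd-order Butterworth low-pass

t = np.arange(0.0, 2.0, 1.0 / fs)
x = (t > 1.0).astype(float)                  # step "load" at t = 1 s (stand-in for an Fz onset)

y_rt = signal.lfilter(b, a, x)               # causal filter, as applied in real time / by hardware
y_zl = signal.filtfilt(b, a, x)              # forward-backward pass, zero phase lag (off-line only)

delay_rt = t[np.argmax(y_rt >= 0.5)] - 1.0   # time at which the filtered step reaches 50%
delay_zl = t[np.argmax(y_zl >= 0.5)] - 1.0

print(f"predicted low-frequency delay: {1.0 / (np.sqrt(2) * np.pi * fc) * 1000:.1f} ms")
print(f"causal filter delay:           {delay_rt * 1000:.1f} ms")
print(f"zero-lag filter delay:         {delay_zl * 1000:.1f} ms")
```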

    Did the golf ball test show delay times in that range?

    Ton van den Bogert



    • #3
I just spent a while reviewing a lot of old force plate gait data (heel marker and Fz) and the delays typically appear to be no more than one 3D frame (8, 16, or 20 ms). It's worth remembering that the 3D marker sample rates are normally much lower than the analog sample rates. This matters because the timing of the 3D impact can only be resolved to the nearest 3D frame, while the marker is accelerating under gravity (9.8 m/s²) and is moving at several metres per second by the time it hits the plate.
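As a rough worked example (the 0.5 m drop height is my own illustrative assumption, not from the post): a ball dropped from 0.5 m reaches \( v = \sqrt{2gh} \approx 3.1 \) m/s, so at a 50 Hz 3D rate (20 ms per frame) the marker travels roughly 60 mm between consecutive frames, and the impact instant can only be located to within that one frame.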
One old ball-drop test showed the force data appearing about 20 ms before the ball hit the plate when the test was performed at the end of the trial - a 3D/analog data synchronization issue that was resolved by the manufacturer afterwards. I had always been quite confident about the motion capture synchronization in the past because I had to demonstrate it when the labs were set up, but these days I see some significant delays in modern radio-telemetry based sensors, so I would recommend that modern motion capture lab users consider always calibrating both the 3D volume and the data collection synchronization environment.



      • #4
        Great point about modern telemetry systems. Every lab should have test protocols to evaluate the synchronization between all modalities.

We have tests to check the synchronization between motion capture, force plates, and telemetered IMUs. What is a good way to test EMG synchronization?



        • #5
The traditional golf-ball test - dropping a reflective golf ball onto the force plate and comparing the golf-ball trajectory to the Fz signal - can also verify the EMG signal latency if you connect a small loudspeaker to an EMG input and place the loudspeaker face down on the force plate. When the ball lands on the plate, the loudspeaker generates a signal in the mV range, so you have a single event detected by the motion capture system, the force plate, and the EMG system. This is documented in our MTD manual - the MTD enables users to verify the force plate configuration.
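A small self-contained sketch of how that comparison can be reduced to numbers (Python; the signals are simulated here rather than loaded from a real trial, and the sample rates, thresholds, drop height, and injected latencies are illustrative assumptions only):

```python
# Simulated golf-ball drop: locate the impact in the 3D, force plate and EMG
# (loudspeaker) data and report the relative latencies.
import numpy as np

fs_3d, fs_an = 200.0, 2000.0                 # assumed 3D and analog sample rates (Hz)
drop_h, g = 0.5, 9.81                        # assumed drop height (m), gravity (m/s^2)
t_hit = np.sqrt(2.0 * drop_h / g)            # instant the ball reaches the plate

t3 = np.arange(0.0, 1.0, 1.0 / fs_3d)        # 3D time base
ball_z = np.maximum(drop_h - 0.5 * g * t3**2, 0.0)

ta = np.arange(0.0, 1.0, 1.0 / fs_an)        # analog time base
fz  = np.where(ta > t_hit + 0.020, 20.0, 0.0)    # Fz onset, with e.g. 20 ms filter delay
emg = np.where(ta > t_hit + 0.002, 1.0e-3, 0.0)  # loudspeaker "click", e.g. 2 ms latency

t_ball = t3[np.argmax(ball_z < 0.001)]       # first 3D frame with the marker on the plate
t_fz   = ta[np.argmax(fz  > 5.0)]            # Fz threshold crossing (5 N, illustrative)
t_emg  = ta[np.argmax(emg > 0.5e-3)]         # loudspeaker spike on the EMG channel

print(f"Fz lag vs 3D impact : {(t_fz  - t_ball) * 1000:.1f} ms")
print(f"EMG lag vs 3D impact: {(t_emg - t_ball) * 1000:.1f} ms")
```

Note that the 3D event time is only resolved to the nearest 3D frame, which is the resolution point raised earlier in the thread.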
I think that the significant issue with telemetry is generating reliable data that has a very low risk of being corrupted by interference; latency is simply part of the data transfer method. The Wi-Fi protocol was designed primarily so that users could browse the web, while Bluetooth was designed for low power consumption - very high data reliability and very low latency were not big factors in the market when those telemetry protocols were created. Our EMG systems support a telemetry option but do not use Wi-Fi or Bluetooth, so our telemetry adds less than a millisecond of latency to the default 2 ms latency of a cabled connection.
          Here's a picture of the golf ball and loudspeaker PXL_20211018_175727843.jpg
          Last edited by Edmund Cramp; October 18, 2021, 02:08 PM.



          • #6
Originally posted by Ton van den Bogert
For reference the group delay and phase delay for a 2nd order Butterworth filter, at the low frequency limit, are both \(\tau = 1 / (\sqrt{2} \pi f_c)\), which for a 10.5 Hz filter would be 21 ms. Higher frequencies could be delayed up to 50% more. (I can post the full equations if anyone is interested).
            Hi Ton,

I would be interested to see the full equations; if you have a key reference you recommend, that would be much appreciated too.

            Thanks
            Dan



            • #7
Originally posted by Ton van den Bogert
The 10.5 Hz low-pass filter for the force plate is not a bad idea: it is closer to the low-pass filters used for kinematics, and helps prevent impact artifacts in the joint moments.
When I was helping Dr Kadaba and H.K. Ramakrishnan set up gait labs to use the Helen Hayes Software we got much cleaner graphs when the force plate filters were set to 10.5 Hz, but in those days we simply recorded all of the raw force plate data, which occasionally let us see problems and diagnose the issues - typically noise appearing in the force plate data as the subject walked towards the force plate, caused by the walkway being a little loose and tapping on the force plate mountings. These days it's common to see the force plate data processed to "be clean" and only reported when the foot is over the plate ... this means that any problems are hidden. While filtering and data processing are useful when we look at the results, if you want to be sure that there are no problems with the data collection then recording the original raw data for review is very helpful and gives everyone confidence that the processing is accurate and functional.



              • #8
                Dan:

                I looked for a reference, but could not find a good one. Most sources use the Laplace transform and don't pay much attention to the time delay. So I went to my own notes from when I developed the real time filter for the Motek D-Flow system.

                It is rather tedious to type equations into the Biomch-L system, so I have attached a PDF file.
                Attached Files
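For readers who cannot open the attachment, here is a sketch of the standard 2nd-order Butterworth relationships behind the numbers quoted above (my own summary using the usual textbook definitions, not a reproduction of Ton's notes). With \( \omega_c = 2\pi f_c \), the low-pass transfer function is

\[ H(s) = \frac{\omega_c^2}{s^2 + \sqrt{2}\,\omega_c\, s + \omega_c^2} \]

so the phase at frequency \( \omega < \omega_c \) is \( \phi(\omega) = -\arctan\!\left( \dfrac{\sqrt{2}\,\omega\,\omega_c}{\omega_c^2 - \omega^2} \right) \). The phase delay \( \tau_p = -\phi(\omega)/\omega \) and the group delay \( \tau_g = -d\phi/d\omega \) both tend to \( \sqrt{2}/\omega_c = 1/(\sqrt{2}\,\pi f_c) \) as \( \omega \to 0 \), which gives the 21 ms quoted earlier for \( f_c = 10.5 \) Hz.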



                • #9
                  Edmund,

                  Thanks for the loudspeaker idea, that makes sense!

Delsys seems to do a good job with their EMG telemetry (though I have not checked). For the analog outputs (which we plug into our motion capture system) they specify a constant time lag, which is 48 ms on our system. I suspect that this fixed delay allows time for resending data that did not arrive, and for some buffering. We might verify that with the loudspeaker test.



                  • #10
My attitude (probably as a result of many years of configuring motion data collections) is that it's always best to verify the data collection environment - I've told users in the past that a good approach is to try to show that the data is bad, and then, when you fail to find any problems, you can be very confident.
Diagnosing problems can teach users a lot about their environment. For example, I worked with a lab many years ago that thought one of their force plates was bad because the COP data did not match the subject as they walked over it. They had several force plates, so they swapped the "faulty" one and saw the same problem again - they assumed they had proved it was a force plate amplifier fault and sent the amplifier back to be replaced. The replacement amplifier showed the same problem, so they returned the force plate ... it was also shown to be good. But this was in the early days of calibrating the 3D volume by walking around waving a rod with a pair of markers. It turned out that the distance between the rod markers was slightly off, producing a non-linear 3D data collection volume that became more distorted further from the lab origin - so measured locations were incorrect over the third plate, which was furthest from the origin. They had "verified" their 3D volume by walking around with the calibration rod, which returned the expected length everywhere because the volume had been "calibrated" by that same rod - we all learned a lot once this had been figured out.
Data latency can be caused by filtering and by the data collection methods, and the golf-ball test should make it reasonably easy to verify things. If you are performing long trials (e.g. running on a treadmill) then I would recommend performing the golf-ball test twice, once at the start of the data collection and again at the end - identical latency measurements would show that everything is synchronized. It's worth remembering that the measurement resolution is limited by the actual sample rates (both 3D and analog), so a "measurement" is more accurate if the sample rates are high, but you also need to run the verification at the sample rates that you use to record the trial data, to confirm that your data collection environment is working.



                    • #11
Originally posted by Ton van den Bogert
It is rather tedious to type equations into the Biomch-L system, so I have attached a PDF file.
                      Thanks Ton,

This is very useful - did you publish this anywhere, so that I can reference you if I use the equations?

                      Thanks again
                      Dan



                      • #12
                        Dan:

                        What I did is guess the differential equation, and see if it led to the known Butterworth transfer function. No, I did not publish this.

                        If you just want the transfer function, there are probably sources you can cite. Unfortunately, the Wikipedia article is not that good and does not provide the complex transfer function and phase response. I found various posts on blogs and stack exchange. There are probably journal papers but I guess that would be from the 1930s...

                        Ton
                        Last edited by Ton van den Bogert; October 22, 2021, 10:09 AM.



                        • #13
                          Hi Ton and Dan,

Originally, virtually all analog data was sampled by Analog-to-Digital Converters (ADCs), and prior to sampling the signal would have been filtered by resistors and capacitors (potentially an active filter if transistors were involved), so the delay introduced by the filtering could be accurately predicted from the Butterworth transfer function mathematics once the cut-off frequency and order of the filter were known. Analog filtering was almost always applied before the analog signal was sampled, to prevent aliasing errors. Users were then presented with a sequence of analog samples at a documented sample rate which could be analyzed in their data processing environment.

But the data collection environment has changed. These days analog data is often sampled by an ADC as described above, but is then transmitted via different methods (Wi-Fi, Bluetooth, BLE, etc.) that may involve gathering samples into packets, transmitting them at their own rates via a radio protocol, then receiving them, extracting the samples from the transmission protocol, verifying the signal validity, and applying various methods of handling any detected errors. This results in additional latencies appearing in the user's analog samples that are completely unrelated to the signals themselves - as a result, the traditional methods of latency calculation no longer describe the data collection signal latencies, because the data transfer methods can add many additional factors. For example, individual sensor ADCs may have different sample rates, requiring additional data processing to ensure that users receive a sequence of analog samples at a documented sample rate so that they can be reliably processed and analyzed.
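One way to put a number on that added transmission latency is a cross-correlation check - a sketch, assuming a wired reference channel and a telemetered channel record the same physical event (e.g. the loudspeaker click); the sample rates, the synthetic "click", and the 48 ms delay (borrowed from the Delsys figure earlier in the thread) are illustrative assumptions only:

```python
# Estimate a fixed telemetry latency by resampling both channels onto a common
# time base and cross-correlating them.
import numpy as np

fs_ref, fs_tel = 2000.0, 1481.0                  # wired and telemetered ADC rates (unequal)
t_ref = np.arange(0.0, 1.0, 1.0 / fs_ref)
click = np.exp(-((t_ref - 0.4) / 0.002) ** 2)    # reference channel: a short "click" at 0.4 s

t_tel = np.arange(0.0, 1.0, 1.0 / fs_tel)        # telemetered copy: own rate, delayed 48 ms
tele  = np.interp(t_tel - 0.048, t_ref, click, left=0.0)

tele_rs = np.interp(t_ref, t_tel, tele)          # resample onto the reference time base
lag = np.argmax(np.correlate(tele_rs, click, mode="full")) - (len(click) - 1)
print(f"estimated telemetry latency: {lag / fs_ref * 1000.0:.1f} ms")
```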

                          Originally the Butterworth mathematics described the signal processing, but unfortunately nowadays it's just a small factor.

