Team Members
- Richard Hsiao
- Michael Fanton
Description
This project focuses on using inertial measurement units (IMUs) in the "wild," meaning using IMUs to capture movement and state data in a non-laboratory, uncontrolled environment. IMUs capture linear acceleration and angular velocity using accelerometers and gyroscopes. By integrating these signals, the position of the IMU can be estimated, which provides a useful way to capture biomechanical data such as joint angles and body positions. However, one of the largest issues with capturing real-time position data is the noise present not only in the environment, but also in the sensors themselves. Determining positions and joint angles by integration introduces drift error, a systematic artifact of the integration that causes the estimate to shift in one direction over time. Existing studies have shown that applying a Kalman filter can mitigate the effects of this drift.
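To make the idea concrete, here is a minimal sketch of a one-state Kalman filter for a single tilt angle, fusing the integrated gyroscope rate with a drift-free (but noisy) accelerometer-derived angle. The function name and the noise variances q and r are our own illustrative assumptions, not the filter we will ultimately apply:

```python
import numpy as np

def kalman_tilt(gyro, acc_angle, dt, q=1e-4, r=1e-2):
    """Minimal 1-state Kalman filter for a single tilt angle.

    gyro:      angular velocity about the flexion axis (rad/s), per sample
    acc_angle: noisy tilt angle derived from the accelerometer (rad)
    q, r:      assumed process / measurement noise variances (tuning values)
    """
    angle, p = acc_angle[0], 1.0          # state estimate and its variance
    out = np.empty_like(acc_angle)
    for k in range(acc_angle.size):
        # Predict: integrate the gyro rate over one timestep.
        angle += gyro[k] * dt
        p += q
        # Update: blend in the drift-free accelerometer measurement.
        gain = p / (p + r)
        angle += gain * (acc_angle[k] - angle)
        p *= 1.0 - gain
        out[k] = angle
    return out
```

Predicting with the integrated gyro keeps the estimate smooth, while the absolute accelerometer reference bounds the integration drift.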
Our project aims to apply a Kalman filter to the raw IMU data in order to obtain an accurate measurement of the orientation of the head/neck.
We will use IMUs to capture orientations of the head/neck and then simulate these orientations in OpenSim, where we can analyze the kinematics and forces acting on the head and neck. The XSENS model outputs the neck joint angle as a single value, but the underlying biomechanical model is unknown. Other biomechanical models constrain the neck joint so that the angle is distributed over the cervical vertebrae (Vasavada). Distributing the angle between the vertebrae may give a more accurate representation of the forces acting on each vertebra. In this project, we aim to determine whether distributing the joint angles is significant in determining the overall kinematics of the neck.
Research Questions
How can we integrate biomechanical models with Kalman filters to more accurately sense orientations from IMUs?
Can multiple IMUs be used to measure neck angle?
Does the way the improved Vasavada head/neck model constrains the biomechanical model (distributing the neck angle over the vertebrae) agree with the kinematic data from motion capture?
How does the improved Vasavada head/neck model compare with the results from the XSENS IMU system for neck angle?
How important is the Vasavada model's distribution of the neck angle across all the cervical vertebrae? (For validation, we will compare the same position changes in the motion capture videos against the XSENS output.)
Progress
We started our project by collecting data on knee flexion. We wanted to show that IMUs can capture joint angle data, but with some inherent error. As a starting point, we model head/neck flexion as a pin joint; to illustrate the same issues with using IMUs as orientation trackers, we began with a simple 2D knee flexion movement, which can also be modeled as a pin joint.
1. Collected experimental data of knee flexion, paced at 120, 80, and 40 bpm, using a high-speed motion capture camera and the XSENS IMU tracking system.
By using high-speed mo-cap, markers on the shank, knee joint, and thigh, and a goniometer, we were able to capture ground-truth biomechanical movement and state data during knee flexion. We used knee flexion as a starting point because it was easier to collect the data: the XSENS system provides marker data for the thigh and shank. These steps also served as the starting point for another group project (Using Inertial Measurement Units to Calculate Knee Flexion Angle), and the steps for the knee flexion experiment can be found there.
2. Processed the motion-capture data of knee flexion/extension to calculate the knee angle from the markers
The above video shows the knee flexion set-up and movement as well as the trackers (yellow: thigh, green: knee joint, blue: shank). We can see the knee has a range of motion (ROM) from around 10 to 80 degrees. Using the 2D marker position data, we can form the vector v from the thigh marker to the knee joint and the vector w from the knee joint to the shank marker, then estimate the knee joint angle as the angle between the two vectors: cos(θ) = (v · w) / (‖v‖ ‖w‖). The plot below shows the knee angle over time, and it follows what we observed in the video.
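As a minimal sketch of this calculation, assuming the marker trajectories have been exported as NumPy arrays of 2D pixel coordinates (the function and variable names here are ours, for illustration):

```python
import numpy as np

def knee_angle_deg(thigh, knee, shank):
    """Estimate the knee joint angle (degrees) from 2D marker positions.

    thigh, knee, shank: (N, 2) arrays of x/y marker coordinates over N frames.
    """
    v = knee - thigh   # vector from the thigh marker to the knee joint
    w = shank - knee   # vector from the knee joint to the shank marker
    # cos(theta) = (v . w) / (||v|| ||w||), evaluated frame by frame
    cos_theta = np.sum(v * w, axis=1) / (
        np.linalg.norm(v, axis=1) * np.linalg.norm(w, axis=1)
    )
    # Clip to [-1, 1] to guard against floating-point round-off in arccos.
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```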
3. Processed the raw IMU data of knee flexion to obtain position by integrating the linear accelerations and angular velocities
To show the drift error inherent in the raw IMU data, we took the raw data from the IMUs and integrated it to get position data.
The linear acceleration of the upper leg is fairly constant, which is what we expect: in the experiment, only the lower leg moves to flex the knee. The lower leg's linear acceleration also behaves as expected; however, in both cases the raw data is very noisy. If we integrate this data, we would expect to see position data similar to what was captured in the motion capture video: the upper leg should remain relatively constant with minimal changes in position, while the lower leg should show oscillatory changes in position. The plot below shows the integrated angular velocity for the upper and lower leg.
And behold, we see that the upper leg, after some timesteps, begins to drift away from its expected constant value. This is the systematic error we expected from integrating the raw data. Similarly, the lower leg shows some drift at later timesteps. The raw data can be compared to the processed data that comes with the XSENS software, but more on this later. These results form part of the motivation for finding a way to obtain more accurate orientation estimates from IMUs.
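The sketch below shows how this integration is performed, and why a small constant gyroscope bias produces the drift seen in the plot. The sampling rate, bias, and signal here are synthetic stand-ins, not our actual XSENS recordings:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

fs = 100.0                          # assumed IMU sampling rate (Hz)
t = np.arange(0.0, 30.0, 1.0 / fs)

# Synthetic stand-in for raw gyro data: flexion paced at ~40 bpm plus a
# small constant sensor bias and white noise (replace with real data).
bias = 0.02                         # rad/s
omega = (0.5 * np.sin(2 * np.pi * (40 / 60) * t)
         + bias + 0.05 * np.random.randn(t.size))

# Integrating angular velocity yields the segment angle, but the constant
# bias integrates into a linearly growing drift term: angle ~ bias * t.
angle = cumulative_trapezoid(omega, t, initial=0.0)
```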
Now that we've looked at the raw IMU data and its difficulties, we will move on to the processed IMU data output by the XSENS system, which presumably applies some black-box filtering that removes some of the errors inherent in the data (we plan to plot the processed sensor data against the raw sensor data). From the processed IMU orientation data, we will calculate the sagittal neck joint angle by assuming a biomechanical model in which the torso and head are rigid bodies connected by a pin joint. We will then take this neck joint angle and fit it into the improved Vasavada model, which constrains the model in a way that distributes the single neck angle across several vertebrae. From there, we can compare the model's kinematics to ground-truth data from motion capture and to the results from XSENS.
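A minimal sketch of the pin-joint angle calculation, assuming the processed torso and head orientations are available as unit quaternions and that the first Euler angle of the relative rotation corresponds to the flexion axis (both the quaternion ordering and the axis choice must be checked against the XSENS segment frame conventions):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def sagittal_neck_angle_deg(q_torso, q_head):
    """Neck flexion/extension angle from segment orientation quaternions.

    q_torso, q_head: (N, 4) arrays of unit quaternions in scipy's
    (x, y, z, w) order; XSENS exports (w, x, y, z), so reorder first.
    """
    r_rel = R.from_quat(q_torso).inv() * R.from_quat(q_head)
    # Decompose the head-relative-to-torso rotation; we assume the first
    # axis is mediolateral, so the first angle is flexion/extension.
    return r_rel.as_euler("xyz", degrees=True)[:, 0]
```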
4. Collected experimental data of neck flexion, paced at 120, 80, and 40 bpm, using a high-speed motion capture camera and the XSENS IMU tracking system.
By using high-speed mo-cap and markers on the shoulder, IMU, neck, cheek, and forehead, we were able to capture ground-truth biomechanical movement and state data during neck flexion. We used an altered version of the knee flexion experiment from the first part to capture neck flexion. The motion capture was processed using Kinovea; since we didn't use a goniometer, we used the software to extract the angle and position data. The XSENS IMU capture was done with the subject flexing and extending their neck at different rates.
5. Processed the IMU data and marker data to obtain joint angle data of the neck
The XSENS IMU tracking outputs joint angle data for C7T1 and C1Head. We plan to put the joint angles we captured into OpenSim to calculate positions of the head, and then compare these positions to the motion capture data.
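One way to get these angles into OpenSim is to write them out as a motion (.mot) file that the GUI or API can load. A minimal sketch, assuming a single time series of neck flexion angles in degrees and a model coordinate named neck_flexion (the coordinate name must match whichever head/neck model we load):

```python
def write_mot(path, time, angles_deg, coord="neck_flexion"):
    """Write a one-coordinate OpenSim motion (.mot) file."""
    with open(path, "w") as f:
        f.write("Coordinates\n")
        f.write(f"nRows={len(time)}\n")
        f.write("nColumns=2\n")
        f.write("inDegrees=yes\n")   # angles below are in degrees
        f.write("endheader\n")
        f.write(f"time\t{coord}\n")
        for ti, ai in zip(time, angles_deg):
            f.write(f"{ti:.4f}\t{ai:.6f}\n")

# e.g. write_mot("neck_flexion.mot", t, neck_angles)
```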
The XSENS output consists of body segment positions and body segment orientations, the latter given as quaternions. We're keeping the XSENS data the same, but feeding it into different models and comparing the two.
The way the Vasavada model works, there is a coupling function between the joint angles: if you adjust one joint's angle by one unit, the other joints' angles adjust by fixed factors. Setting those coupling weights to zero would reduce it to a two-link model. We will try to limit the number of dependent variables and compare two pipelines: XSENS data into the XSENS model versus XSENS data into the Vasavada model.
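A minimal sketch of what this coupling looks like in code. The joint names follow the cervical levels, but the weights below are placeholders we made up for illustration; they are not the actual Vasavada model coefficients:

```python
# Placeholder coupling weights: the fraction of the total sagittal neck
# angle assigned to each intervertebral joint. They must sum to 1.
WEIGHTS = {
    "skull_c1": 0.10, "c1_c2": 0.10, "c2_c3": 0.11, "c3_c4": 0.12,
    "c4_c5": 0.13, "c5_c6": 0.14, "c6_c7": 0.15, "c7_t1": 0.15,
}

def distribute_neck_angle(total_angle_deg):
    """Split one pin-joint neck angle into per-joint cervical angles."""
    return {joint: w * total_angle_deg for joint, w in WEIGHTS.items()}

# Setting one weight to 1 and the rest to 0 recovers the simple
# two-link (single pin joint) model described above.
```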
6. Plan to examine rotations in other planes and to run Computed Muscle Control (CMC).
Bibliography
Vasavada, Li, and Delp. "Influence of muscle morphometry and moment arms on the moment-generating capacity of human neck muscles." Spine, 1998.