- Richard Hsiao
- Michael Fanton
This project focuses on using inertial measurement units (IMUs) in the "wild," meaning using IMUs to capture state-specific data in a non-laboratory, non-controlled environment. IMUs generally capture linear acceleration and angular velocity data using accelerometers and gyroscopes. Through integration, these linear accelerations and angular velocities can be used to determine the position of the IMU, which provides a useful way to capture biomechanical data such as joint angles and body positions. However, one of the largest issues with capturing real-time position data is the noise present not only in the environment, but also in the sensors themselves. Using the integration method to determine positions and joint angles introduces drift error - a systematic artifact of the integration that causes the estimate to wander in one direction over time. Existing studies have shown that applying a Kalman filter can ameliorate the effects of this drift.
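The drift problem described above can be illustrated with a small synthetic sketch (all numbers here are illustrative assumptions, not our sensor's actual specs): even when the true angular velocity is zero, naively integrating a gyroscope signal with a small constant bias and white noise produces an angle estimate that walks away from the truth.

```python
import numpy as np

# Hypothetical illustration: integrating a noisy, slightly biased gyro
# signal accumulates drift even when the sensor is perfectly still.
rng = np.random.default_rng(0)
dt = 0.01                          # 100 Hz sample rate (assumed)
n = 6000                           # 60 s of data
bias = 0.02                        # deg/s constant gyro bias (assumed)
noise = rng.normal(0.0, 0.5, n)    # deg/s white measurement noise (assumed)
measured = bias + noise            # true rate is zero; sensor is stationary

# Naive integration: angle(t) = sum of rate * dt
angle = np.cumsum(measured) * dt

# The true angle never moves, but the estimate drifts away from zero.
print(f"estimated angle after 60 s: {angle[-1]:.2f} deg")
```

The bias term alone contributes roughly bias × duration (here about 1.2 degrees over 60 s), and the integrated white noise adds a random walk on top of it; neither error source is bounded as recording time grows.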
Our project aims to apply the Kalman filter to the raw IMU data in order to find an accurate measurement of the orientation of the head/neck.
How can we integrate biomechanical models with Kalman filters to more accurately sense orientations from IMUs?
Can multiple IMUs be used to measure neck angle?
We started our project by collecting data on knee flexion. Our goal was to show that IMUs can capture joint-angle data, but that the measurements carry error. As a starting point, we wanted to model head and neck flexion as a pin joint. To demonstrate the same issues that arise when using IMUs as orientation trackers, we began the project by illustrating the problems with data from a simple 2D knee flexion movement, which can also be modeled as a pin joint.
- Collected experimental data of knee flexion using a high-speed motion-capture camera and the XSENS IMU tracking system, with flexion paced at 120, 80, and 40 bpm.
By using high-speed mo-cap, markers on the shank, knee joint, and thigh, and a goniometer, we were able to capture the ground-truth biomechanical movement and state data during knee flexion. We used knee flexion as a starting point because it was easier to collect the data: the XSENS system provides marker data for the thigh and shank. These steps also served as a starting point for another group project (Using Inertial Measurement Units to Calculate Knee Flexion Angle), and the steps for the knee flexion experiment can be found there.
- Processed the motion-capture data from the knee flexion/extension trials to compute knee angle from the markers
The above video shows the knee flexion set-up and movement as well as the trackers (yellow - thigh, green - knee joint, blue - shank). We can see the knee has a range of motion (ROM) from around 10 degrees to 80 degrees. Using the 2D marker position data, we can find the vector from the thigh marker to the knee joint and the vector from the knee joint to the shank marker. From there, we can calculate the angle between the two vectors and estimate the knee joint angle with the equation: cos(θ) = (v · w) / (‖v‖ ‖w‖). The plot below shows the knee angle over time. We can see that it follows what we observed in the video.
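The angle calculation above can be sketched in a few lines (the marker coordinates here are made-up values for illustration, not our actual dataset):

```python
import numpy as np

# Sketch of the knee-angle calculation from 2D marker positions.
# Marker coordinates are illustrative, not the real capture data.
thigh = np.array([0.0, 1.0])     # thigh marker
knee = np.array([0.0, 0.0])      # knee-joint marker
shank = np.array([0.5, -0.87])   # shank marker

v = knee - thigh                 # thigh -> knee vector
w = shank - knee                 # knee -> shank vector

# cos(theta) = (v . w) / (|v| |w|); clip guards against round-off
cos_theta = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
theta = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
print(f"angle between segment vectors: {theta:.1f} deg")
```

The angle between the two segment vectors measures the deviation from a straight leg, i.e. the flexion angle at the knee; for the example coordinates above it comes out near 30 degrees.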
- Processed the raw IMU data of knee flexion to obtain position by integrating the linear acceleration and angular velocities
In order to show the drift error inherent in the IMU raw data, we needed to take the raw data from the IMUs and integrate to get position data. This step was to illustrate the error.
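The integration step can be sketched as follows, using a synthetic oscillation as a stand-in for the real XSENS gyro trace (the sample rate and signal here are assumptions for illustration):

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

# Illustrative sketch: integrating raw angular velocity to get angle.
# A synthetic 1 Hz oscillation stands in for the real lower-leg data.
fs = 100.0                                      # assumed sample rate (Hz)
t = np.arange(0.0, 10.0, 1.0 / fs)
omega = 70.0 * np.pi * np.cos(2 * np.pi * t)    # deg/s, synthetic signal

# Cumulative trapezoidal integration yields the angle trace; with real,
# noisy data this is exactly where bias and noise accumulate into drift.
angle = cumulative_trapezoid(omega, t, initial=0.0)
print(f"peak integrated angle: {angle.max():.1f} deg")
```

For this clean synthetic signal the integral recovers a 35-degree oscillation almost exactly; the drift shown in our plots appears only once real sensor noise and bias enter the integrand.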
The linear acceleration of the upper leg is fairly constant, which is what we expect: in the experiment, only the lower leg moves to flex the knee. The lower leg's linear acceleration also matches the motion; however, in both cases the raw data is very noisy. If we integrate this data, we would expect position data similar to what was captured in the motion-capture video: the upper leg should remain relatively constant with minimal changes in position, while the lower leg should show oscillatory changes in position. The plot below shows the integrated angular velocity for the upper and lower leg.
And behold, we see that the upper leg, which should stay roughly constant, begins to drift away from that constant value after some timesteps. This is the systematic error we expected from integrating the raw data. Similarly, the lower leg shows some drift nearing the later timesteps. The raw data can be compared against the processed data that comes with the XSENS software, but more on this later. These results form part of the motivation for finding a way to produce more accurate orientation estimates from IMUs.
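One way to suppress this drift, in line with the project's aim of applying a Kalman filter, is to fuse the gyro rate (prediction) with a drift-free but noisy angle reference such as an accelerometer-derived tilt (correction). The sketch below is a minimal one-dimensional Kalman filter on synthetic data; the noise parameters, bias, and signal are all illustrative assumptions, not our tuned implementation.

```python
import numpy as np

# Minimal 1-D Kalman filter sketch: predict with the gyro rate,
# correct with a noisy but drift-free accelerometer angle.
rng = np.random.default_rng(1)
dt, n = 0.01, 3000
t = np.arange(n) * dt
true_angle = 30.0 * np.sin(2 * np.pi * 0.5 * t)                    # deg
gyro = np.gradient(true_angle, dt) + 0.5 + rng.normal(0, 1.0, n)   # biased, noisy rate
acc_angle = true_angle + rng.normal(0, 3.0, n)                     # noisy, drift-free

q, r = 0.01, 9.0          # process / measurement noise variances (assumed)
x, p = 0.0, 1.0           # state (angle) and its variance
est = np.empty(n)
for k in range(n):
    x += gyro[k] * dt                  # predict: integrate the gyro
    p += q
    kgain = p / (p + r)                # Kalman gain
    x += kgain * (acc_angle[k] - x)    # correct with the accelerometer
    p *= (1.0 - kgain)
    est[k] = x

naive = np.cumsum(gyro) * dt           # integration-only estimate drifts
print(f"final error, naive: {abs(naive[-1] - true_angle[-1]):.1f} deg, "
      f"filtered: {abs(est[-1] - true_angle[-1]):.1f} deg")
```

With a 0.5 deg/s gyro bias, the integration-only estimate drifts by roughly 15 degrees over 30 seconds, while the filtered estimate stays within a few degrees of the truth, because the accelerometer correction continuously pulls the state back toward an absolute reference.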