Welcome to the OpenSense documentation! If you try the example and software, please send any issues or feedback to firstname.lastname@example.org.
OpenSense is a new workflow for analyzing movement with inertial measurement unit (IMU) data. On this page, we introduce the tool, show you how to get started, and describe how to use the software to compute and analyze gait kinematics through a hands-on example.
OpenSense is a workflow that enables users to compute the motions of body segments based on inertial measurement unit (IMU) data, as shown in the animation below. The OpenSense workflow is summarized in the text and flowchart below.
To get started, you will need an OpenSim model that includes the joints and degrees of freedom of interest. In the example below, we provide a model for studying lower-extremity kinematics during gait. There are many more models available through our Model Library. If you are solely interested in joint angles and other kinematic quantities (e.g., normalized muscle lengths and lengthening velocities), the model need not be scaled to the anthropometry of the subject.
You also need orientation data from one or more IMU sensors as input. We currently support Xsens and APDM file formats. You can create your own file converter to support any other IMU system, and we plan to add support for additional sensor manufacturers in the future. We currently assume that sensor fusion and syncing have been performed using a vendor's or third-party algorithm. Several open-source sensor fusion algorithms are also available on GitHub.
OpenSense provides an interface to associate and register each IMU sensor with a body segment of an OpenSim model (as an IMU Frame). We provide a basic calibration routine in which the first frame of IMU data is registered to the default pose of the model. You can change the registration pose by changing the default coordinate values of the model. You can also write your own calibration procedures in Matlab, Python, etc. to optimize the initial pose of the model for calibration using other data sources (markers, goniometers, etc.).
An inverse kinematics method is used to compute the set of joint angles at each time step of a motion that minimizes the errors between the experimental IMU orientations and the model’s IMU Frames. The angles can then be used as inputs to other OpenSim tools and analyses or you can visualize these angles in the OpenSim GUI. The OpenSense capabilities are available through the command line and through scripting (Matlab or Python). The resulting Model and Motion can be loaded, visualized, and analyzed in the OpenSim GUI. In the future, we will also provide a direct GUI-based tool to run IMU-based kinematics.
The OpenSense workflow is available as of OpenSim 4.1. You can perform the OpenSense workflow on Mac or Windows through the command line or through Matlab or Python scripting.
As with OpenSim, the OpenSense tools use XML settings files to specify the details of your workflow. To get started, you will first need to download and install the latest OpenSim version (minimum version is 4.1). OpenSense can be downloaded from SimTK, with both Windows and Mac builds available.
To get started, follow the instructions below depending on how you plan to access OpenSense.
To use OpenSense in Matlab, you must install OpenSim version 4.1 and then follow the Matlab scripting setup instructions.
To use OpenSense in Python, you must install OpenSim version 4.1 and then follow the Python scripting setup instructions.
Your OpenSim download includes a command line executable that can run all of the OpenSense tools. After you have installed OpenSim, you need to tell your system where to find the OpenSense tools by adding them to your system's path.
On Mac, you will need to open a Terminal window and edit your .bash_profile file.
You can learn more about your bash profile and how to edit it here.
On Windows, you also need to add the path to OpenSense executables and dynamic libraries to your system path by performing the following steps:
Now that you've installed OpenSim, we will show you how to use the software through a hands-on example using example experimental IMU data from a study of lower extremity gait kinematics. The example data, models, scripts, and setup files can be found in your OpenSim resources directory under /Code/Matlab/OpenSenseExample. You can also download a zip of the example files here.
The basic steps for running an IMU-based OpenSense kinematics analysis are the following:
The flowchart below shows the workflow for the example. We will import the IMU sensor data, calibrate our OpenSim model, compute inverse kinematics, and then visualize the results.
We use Xsens sensor data in this example, but all the steps for using APDM sensors are identical except for data reading. Read how to import APDM sensor data in the section below. Please note that the data in this example is for illustrative purposes and not intended for research use.
The first step is to collect your data and convert it into a format that you can read into OpenSim and process with the OpenSense workflow.
For our example, we have eight sensors: one each on the pelvis and trunk, and one on each thigh, shank, and foot.
Once you have collected and pre-processed your data, you must convert it to OpenSim's file format and associate it with an OpenSim model. Data from IMUs can come in various formats: a single file with numbered sensor names (e.g., APDM) or multiple files with sensor-specific numbering (e.g., Xsens). Upon import, OpenSim creates a single, time-synced Storage (.sto) file of orientations, converting the rotation matrices into quaternions.
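To make that last conversion step concrete, the snippet below converts one 3x3 rotation matrix into the unit quaternion (w, x, y, z) representation stored in the .sto file. This is a generic, self-contained sketch in plain Python, not OpenSense's internal code:

```python
import math

def quat_from_matrix(R):
    """Convert a 3x3 rotation matrix (nested lists, row-major) to a unit
    quaternion (w, x, y, z), branching on the largest diagonal term for
    numerical stability (Shepperd-style)."""
    tr = R[0][0] + R[1][1] + R[2][2]
    if tr > 0.0:
        s = math.sqrt(tr + 1.0) * 2.0                              # s = 4*w
        return (0.25 * s,
                (R[2][1] - R[1][2]) / s,
                (R[0][2] - R[2][0]) / s,
                (R[1][0] - R[0][1]) / s)
    elif R[0][0] >= R[1][1] and R[0][0] >= R[2][2]:
        s = math.sqrt(1.0 + R[0][0] - R[1][1] - R[2][2]) * 2.0     # s = 4*x
        return ((R[2][1] - R[1][2]) / s, 0.25 * s,
                (R[0][1] + R[1][0]) / s, (R[0][2] + R[2][0]) / s)
    elif R[1][1] >= R[2][2]:
        s = math.sqrt(1.0 + R[1][1] - R[0][0] - R[2][2]) * 2.0     # s = 4*y
        return ((R[0][2] - R[2][0]) / s, (R[0][1] + R[1][0]) / s,
                0.25 * s, (R[1][2] + R[2][1]) / s)
    else:
        s = math.sqrt(1.0 + R[2][2] - R[0][0] - R[1][1]) * 2.0     # s = 4*z
        return ((R[1][0] - R[0][1]) / s, (R[0][2] + R[2][0]) / s,
                (R[1][2] + R[2][1]) / s, 0.25 * s)

# A 90-degree rotation about the vertical (y) axis, as a sensor might report:
Ry90 = [[0.0, 0.0, 1.0],
        [0.0, 1.0, 0.0],
        [-1.0, 0.0, 0.0]]
print(quat_from_matrix(Ry90))  # roughly (0.7071, 0, 0.7071, 0)
```

Each row of the orientations .sto file then holds one such quaternion per sensor, per time step.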
In this example, we will be using data from an Xsens system that has been pre-processed (e.g., time-syncing and sensor fusion has been performed) and exported to an Xsens text format. You can find this data in the IMUData folder. Each Xsens sensor is represented by a single text (.txt) file with time histories of the internal sensor data.
To read your data, you first need to create a file that lets OpenSense know which sensor is associated with which body segment in the Model. In our example, this file is called myIMUMappings.xml. You can open and edit this file in any text editor. In this settings/XML file you specify the following information:
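For reference, a minimal mappings file might look like the snippet below. The trial prefix and sensor IDs here are placeholders for your own data; compare against the myIMUMappings.xml shipped with the example for the exact element names used by your OpenSim version:

```xml
<OpenSimDocument Version="40000">
    <XsensDataReaderSettings>
        <!-- Prefix shared by the exported Xsens .txt files for this trial -->
        <trial_prefix>MT_012005D6_009-001</trial_prefix>
        <ExperimentalSensors>
            <!-- name: the suffix of the sensor's .txt file;
                 name_in_model: the IMU frame in the OpenSim model -->
            <ExperimentalSensor name="_00B421AF">
                <name_in_model>pelvis_imu</name_in_model>
            </ExperimentalSensor>
            <ExperimentalSensor name="_00B4227B">
                <name_in_model>femur_r_imu</name_in_model>
            </ExperimentalSensor>
        </ExperimentalSensors>
    </XsensDataReaderSettings>
</OpenSimDocument>
```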
Each IMU sensor is represented as a Frame in an OpenSim Model, where a Frame is an orthogonal XYZ coordinate system. When you read in your data, OpenSense will find the appropriate IMU frame in your model (based on the mappings XML file) or create an IMU Frame, if it doesn't already exist. OpenSense uses a naming convention where we expect the sensor to be named as <bodyname>_imu. For example, the OpenSim model has a right femur body called femur_r, therefore the IMU sensor must be called femur_r_imu.
The IMU reader then creates a storage file with the orientation data for each sensor, where each column in the storage file is named according to the frame in the corresponding OpenSim model. To read in your data, use the following steps, depending on how you are accessing the OpenSense workflow.
To read your data from the command line, use the following steps.
Since we are working with Xsens data, we use the -ReadX option (read Xsens data). We next provide the directory where the IMU sensor data is located (IMUData) and then the name of the local IMU mappings file (myIMUMappings.xml):
>> opensense -ReadX IMUData myIMUMappings.xml
Running this command line call will generate an orientations file called <trial_name>_orientations.sto (<trial_name> is defined in your myIMUMappings.xml file) in your OpenSenseExampleFiles folder.
The next step is to calibrate the IMUs to an OpenSim model. The OpenSense calibration step takes an OpenSim Model and the IMU calibration data and finds the initial orientations of the IMU Frames (i.e., offsets) relative to the OpenSim body segments. We provide a basic calibration algorithm, or you can create your own calibration methods (in C++ or via scripting) to compute a default pose and/or the transforms of the IMU sensors.
To use OpenSense's calibration, you must provide an OpenSim Model in the calibration step. In our example, we are using the Rajagopal (2015) model. As noted above, on data read, either your Model should have IMU frames attached that correspond to the name_in_model specified in Step Two, or if you use our assumed naming convention (<bodyname>_imu), the calibrate step will add IMU Frames to the model as long as there is a corresponding body segment with a matching <bodyname>.
OpenSense calibration assumes that the pose of the subject in the calibration data matches the default pose of the model. In our example, the calibration pose is with the pelvis, hip, knee, and ankle at neutral, so we did not need to make any adjustments to the model's default pose. If you use a different pose, you can edit the default pose of the model in the OpenSim GUI, through scripting, or in XML (see Coordinate Controls and Poses to learn how to edit the default pose through the OpenSim GUI).
You must next provide the calibration data. OpenSense assumes the first time point corresponds to the calibration pose. If you have a trial where the calibration pose is performed at some time other than the first time row, you must edit your orientations file (or make a new one) where the first time row best corresponds to the calibration pose.
You can also specify optional arguments that enable OpenSense to correct for the overall difference in heading (forward direction) between the IMU data and the OpenSim model. Typically, an OpenSim model faces the positive X direction of the ground frame in its initial pose, but the base IMU (e.g., on the pelvis or torso) can have any initial heading. Given the <base_imu_label> (the label that identifies the base IMU in the provided orientation data) and the <base_heading_axis> (the 'x', 'y', or 'z' axis of the base IMU that represents its heading direction), OpenSense computes the angular offset between the two headings and uses it to rotate all the orientation data so that the heading of the base IMU is directed along the X-axis of the OpenSim ground reference frame (same as the model). If <base_imu_label> is not provided, no heading correction is applied. If <base_imu_label> is provided but <base_heading_axis> is not, the 'z' axis of the base IMU is assumed to be its heading direction.
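The geometry behind this heading correction amounts to measuring the horizontal angle between the base IMU's heading axis and the model's forward axis, then rotating every orientation about the vertical axis by that angle. Below is a minimal sketch of that geometry in plain Python: it is illustrative only (not the OpenSense source), and it assumes OpenSim's y-up ground frame with +X forward and Hamilton quaternions (w, x, y, z):

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate_vec(q, v):
    """Rotate vector v by unit quaternion q: q * (0, v) * conj(q)."""
    qc = (q[0], -q[1], -q[2], -q[3])
    p = quat_mul(quat_mul(q, (0.0,) + tuple(v)), qc)
    return p[1:]

def heading_angle(base_quat, heading_axis=(0.0, 0.0, 1.0)):
    """Horizontal angle (radians, about vertical +y) between the base IMU's
    heading axis, expressed in ground, and the model's forward +x axis."""
    h = rotate_vec(base_quat, heading_axis)
    return math.atan2(h[2], h[0])

def apply_heading_correction(quats, theta):
    """Pre-rotate every orientation about vertical +y by theta so the base
    IMU's heading lines up with ground +x."""
    qc = (math.cos(theta / 2.0), 0.0, math.sin(theta / 2.0), 0.0)
    return [quat_mul(qc, q) for q in quats]
```

After correction, the base IMU's heading axis points along ground +X, matching the model's default facing direction.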
The output of the calibration step is a calibrated model, where each IMU is registered to the OpenSim model. The image below shows our example subject with IMUs on the pelvis, trunk, thigh, shank, and foot segments and the corresponding OpenSim model in the matching pose.
>> myIMUPlacer = IMUPlacer('myIMUPlacer_Setup.xml');
>> myIMUPlacer.run(true);
>> myIMUPlacer.getCalibratedModel().print('calibrated_Rajagopal_2015.osim');
To calibrate your model with IMU Orientations from the command line, use the following steps.
>> opensense -Calibrate myIMUPlacer_Setup.xml
In our example, the pelvis_imu is set as the base IMU, and z is the axis of the base IMU that corresponds to its heading.
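Putting these settings together, a myIMUPlacer_Setup.xml along the following lines configures the calibration. This sketch is illustrative; the file names follow this example, and element names should be checked against the setup file shipped with the example for your OpenSim version:

```xml
<OpenSimDocument Version="40000">
    <IMUPlacer>
        <!-- Model to which the IMU frames will be registered -->
        <model_file>Rajagopal_2015.osim</model_file>
        <!-- Orientations file whose first row is the calibration pose -->
        <orientation_file_for_calibration>MT_012005D6_009-001_orientations.sto</orientation_file_for_calibration>
        <!-- Base IMU used for heading correction, and its heading axis -->
        <base_imu_label>pelvis_imu</base_imu_label>
        <base_heading_axis>z</base_heading_axis>
    </IMUPlacer>
</OpenSimDocument>
```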
A visualizer window will appear, showing the calibrated model. The pose of the model is determined by the model's default pose and will not change from one calibration to the next (unless you edit the model's default pose). What will change is the orientation of the sensors attached to each body. You can zoom in on the sensors, represented as small orange bricks located at the COM of each body.
Note: You can close the visualizer window, when selected, by using the keyboard shortcut of ctrl-Q (command-Q on Mac).
The terminal window will print out the calibration offset for each IMU. This is the transform between the model body and the IMU sensor.
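Conceptually, each printed offset is the relative rotation between the body frame and the measured IMU frame at the first (calibration) time step. A toy sketch of that computation in plain Python, using rotation matrices as nested lists (illustrative only, not the OpenSense source):

```python
def transpose(R):
    """Transpose of a 3x3 matrix (equals the inverse for rotations)."""
    return [[R[j][i] for j in range(3)] for i in range(3)]

def matmul(A, B):
    """Product of two 3x3 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def imu_offset(R_ground_body, R_ground_imu):
    """Fixed IMU-in-body offset from the calibration frame:
    R_body_imu = R_ground_body^T * R_ground_imu."""
    return matmul(transpose(R_ground_body), R_ground_imu)

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
Ry90 = [[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 0.0]]
```

If the body and IMU frames agree at calibration, the offset is the identity; otherwise the offset captures exactly how the sensor is mounted relative to the segment, and it stays fixed for the rest of the trial.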
To continue the calibration, and print the calibrated model to file, select the visualizer window and press any key to continue.
The Calibrated Model is written to file and will have the prefix 'calibrated_' added (i.e., if the input Model file is called model.osim, the output calibrated model file will be named calibrated_model.osim).
Now that you have read in your data and calibrated your model, you can use OpenSense's Inverse Kinematics to track Orientation data from IMU sensors. The Inverse Kinematics step finds the pose of the model at each time-step that minimizes, in the least-squares sense, the difference between the orientation data from the IMU sensors and the IMU Frames on your calibrated model. The computed kinematics depend on both the calibrated model and the sensor data. Thus to perform inverse kinematics tracking of orientation data you need (i) a Calibrated Model (.osim), (ii) an orientations file (as quaternions), and (iii) an Inverse Kinematics Setup file (.xml). Using the Calibrated Model we generated in the previous section, we will track orientation data for walking that we read in during Step Two.
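To make "minimizes, in the least-squares sense" concrete, here is a deliberately tiny planar analogue. It is hypothetical and much simpler than the OpenSense solver, which minimizes 3-D orientation errors over all model coordinates simultaneously: one hinge angle, two sensors whose model frames sit at fixed angular offsets from the joint, and a measured heading for each:

```python
def ik_step(measured, frame_offsets):
    """Find the hinge angle theta minimizing
    sum_i (measured[i] - (theta + frame_offsets[i]))**2.
    In this linear, planar toy, setting the derivative to zero shows the
    least-squares optimum is the mean of the per-sensor residuals."""
    residuals = [m - o for m, o in zip(measured, frame_offsets)]
    return sum(residuals) / len(residuals)

# Two sensors: one aligned with the joint frame, one mounted 1 rad away.
# Their (noisy) measurements disagree slightly, so the least-squares fit
# splits the difference between them.
theta = ik_step(measured=[0.5, 1.6], frame_offsets=[0.0, 1.0])
```

OpenSense repeats this kind of fit at every time step, producing one set of joint angles per frame of orientation data.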
In a text editor (such as Notepad++, SublimeText, Atom, or Matlab), open the IK_Setup.xml file. The setup file stores properties that tell OpenSense how to run the inverse kinematics simulation. In the setup file, you specify:
<time_range> The time range for the inverse kinematics tracking (in seconds). In our example, we use data between 7.25 and 15 seconds.
An example setup file is shown below.
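The sketch below illustrates the general shape of such a setup file; file names follow this example, and element names should be verified against the IK_Setup.xml shipped with the example for your OpenSim version:

```xml
<OpenSimDocument Version="40000">
    <IMUInverseKinematicsTool>
        <!-- Calibrated model produced in the previous step -->
        <model_file>calibrated_Rajagopal_2015.osim</model_file>
        <!-- Quaternion orientations produced on data import -->
        <orientations_file>MT_012005D6_009-001_orientations.sto</orientations_file>
        <!-- Time range, in seconds, over which to track -->
        <time_range>7.25 15</time_range>
        <!-- Directory where the output motion is written -->
        <results_directory>IKResults</results_directory>
    </IMUInverseKinematicsTool>
</OpenSimDocument>
```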
For now, leave these settings as they are. This settings file can be copied and edited for your own workflow.
To perform Inverse Kinematics with OpenSense from the command line, use the following steps.
>> opensense -IK myIMUIK_Setup.xml
The output motion file is written to file and will have the prefix 'ik_' added (i.e., if the input orientations file is called MT_012005D6_009-001_orientations.sto, the output motion file will be named IKResults/ik_MT_012005D6_009-001_orientations.mot).
Visualization isn't generated when running OpenSense Inverse Kinematics through the command line, so it is useful to view the results of the simulation using the OpenSim application (GUI) visualizer. You can also use OpenSim's plotter to plot the kinematics or perform further analyses with other OpenSim pipeline tools. (Note that you will generally need to scale your model and provide ground reaction forces if you want to generate muscle driven simulations.)
To view the Inverse Kinematics results:
To plot the OpenSense Kinematics
You can run OpenSense through the Matlab scripting environment. The Matlab interface provides additional tools to customize your workflow and also allows you to visualize the results of the inverse kinematics tool in real-time.
To use OpenSense through Matlab, you must perform the following setup.
Download and unzip OpenSense on your system, then add OpenSense to your system path by following the instructions here.
To test that everything is configured correctly, run the following command:
The configureOpenSim.m file will detect any installations of OpenSim that were previously configured with MATLAB, and will "remove" them from MATLAB (the other OpenSim installations are not deleted, they are simply no longer configured with MATLAB). The configureOpenSim.m file also backs up any changes it makes to MATLAB configuration files.
We have provided a set of scripts to run through the workflow from the example above in Matlab.
You can read your IMU data into OpenSense through the Matlab scripting interface. Note that, as in the example above, we will still use the myIMUMappings.xml file to define the mappings from IMU sensor to OpenSim model. A feature of the scripting interface is that you can also read and export the accelerometer, magnetometer, and gyroscope data to file.
You can also perform calibration through Matlab scripting. The scripting interface is similar to the command line. You provide the model names, the orientations file, and, optionally, the base IMU name and heading. You can also specify if you want to have the calibrated model visualized.
To perform calibration, run the OpenSense_CalibrateModel.m script in Matlab. A visualization window will open, showing the calibrated model, and the calibrated model will be written to file.
The Scripting interface for the OpenSense IMUInverseKinematicsTool gives you finer control over the inverse kinematic properties. In particular, you can visualize the Inverse Kinematics tracking.
To perform inverse kinematics, open and run the OpenSense_OrientationTracking.m script in Matlab. A visualizer window will open, showing the model tracking the IMU orientation data. The results of the orientation tracking will be written to the IKResults directory. Results can be plotted in the OpenSim GUI as described above.
You can run OpenSense through the Python scripting environment. The Python interface provides additional tools to customize your workflow and also allows you to visualize the results of the inverse kinematics tool in real-time.
We have provided a set of scripts to run through the workflow from the example in Python. The steps parallel the Matlab instructions described above.
Typically, APDM exports a trial both as a .h5 file and as a comma-delimited .csv text file, with columns grouped in order by sensor. The OpenSense APDM reader can only read the CSV file type.
To read the APDM CSV file, you must create a file that associates the column labels in the APDM .csv file with an OpenSim model body segment. You can open and edit this file in any text editor. In this settings/XML file you specify the following information:
An example settings file for APDM sensors can be downloaded here. You can open and edit this file in any text editor. A snippet of the file is shown below:
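The snippet below illustrates the general shape of the APDM mappings file; the sensor labels are placeholders for the column labels in your own .csv export, and element names should be checked against the downloadable example file:

```xml
<OpenSimDocument Version="40000">
    <APDMDataReaderSettings>
        <ExperimentalSensors>
            <!-- name: the sensor's column label in the APDM .csv;
                 name_in_model: the matching IMU frame in the model -->
            <ExperimentalSensor name="Lumbar">
                <name_in_model>pelvis_imu</name_in_model>
            </ExperimentalSensor>
            <ExperimentalSensor name="Right Thigh">
                <name_in_model>femur_r_imu</name_in_model>
            </ExperimentalSensor>
        </ExperimentalSensors>
    </APDMDataReaderSettings>
</OpenSimDocument>
```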
To use the OpenSense command-line tool to read in the APDM data and export an orientations .sto file for use in OpenSense, use the call below.
>> opensense -ReadA myAPDFile.csv myAPDMMappings.xml
The current version of OpenSense is our first step in bringing IMU-based biomechanics to the research community. We plan several additional enhancements and new features in future releases, for example:
More advanced approaches to calibration and interfaces to better support users developing their own calibration protocols
Support for OpenSense inverse kinematics directly in the OpenSim application (GUI)
Tools to easily visualize data and results for debugging (e.g., are the sensors registered to the correct body segments?)