Kinematics

These routines facilitate the calculation of 3D movement kinematics.

Functions

General

kinematics.vel2quat(vel, q0, rate, CStype) Convert an angular velocity into the corresponding orientation quaternion.

Optical Systems

kinematics.analyze3Dmarkers(MarkerPos, ReferencePos) Calculate center of mass (COM) and orientation from the recorded positions of three markers.
kinematics.movement_from_markers(r0, Position, Orientation) Movement trajectory of a point on an object, from marker position and orientation.

IMUs

kinematics.acc_gyr(omega, accMeasured, ...) Reconstruct position and orientation from angular velocity and linear acceleration.
kinematics.getXSensData(inFile, paramList) Read in the sample rate and the stored 3D parameters from XSens IMUs.
kinematics.kalman_quat(rate, acc, gyr, mag) Calculate the orientation from IMU data, using a quaternion-based Kalman filter.

Class

kinematics.IMU([inFile, inData]) Class for working with data from inertial measurement units (IMUs)

Methods

kinematics.IMU.calc_orientation(...[, type]) Calculate the current orientation
kinematics.IMU.calc_position(initialPosition) Calculate the position, assuming that the orientation is already known.
kinematics.IMU.setData(data) Set the properties of an IMU-object.

Details

Utilities for movements in 3D space

class kinematics.IMU(inFile=None, inData=None)[source]

Class for working with data from inertial measurement units (IMUs)

An IMU object can be initialized
  • by giving a filename
  • by providing measurement data and a sample rate (a sketch follows the parameter description below)
  • without giving any parameter; in that case the user is prompted to select an input file
Parameters:

inFile : string

Path and name of the input file, if the data are read from a file.

inData : dictionary

The following fields are required:

acc : (N x 3) array

Linear acceleration [m/s^2], in the x-, y-, and z-direction

omega : (N x 3) array

Angular velocity [deg/s], about the x-, y-, and z-axis

[mag] : (N x 3) array (optional)

Local magnetic field, in the x-, y-, and z-direction

rate : integer

sample rate [Hz]
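
For initialization from data, the required dictionary can be assembled directly from the measurement arrays. A minimal sketch with synthetic values, assuming the module is importable as kinematics (adjust the import to your installation):

>>> import numpy as np
>>> import kinematics
>>>
>>> # synthetic data: 1000 samples of a stationary sensor at 100 Hz
>>> acc = np.tile([0., 0., 9.81], (1000, 1))    # gravity along z [m/s^2]
>>> omega = np.zeros((1000, 3))                 # no rotation [deg/s]
>>> inData = {'acc': acc, 'omega': omega, 'rate': 100}
>>> myimu = kinematics.IMU(inData=inData)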

Notes

IMU-Properties:
  • source
  • acc
  • omega
  • mag
  • rate
  • totalSamples
  • duration
  • dataType

Examples

>>> myimu = IMU(r'.\Data\Walking_02.txt')
>>>
>>> initialOrientation = np.array([[1, 0, 0],
...                                [0, 0, -1],
...                                [0, 1, 0]])
>>> initialPosition = np.r_[0, 0, 0]
>>>
>>> myimu.calc_orientation(initialOrientation)
>>> q_simple = myimu.quat[:, 1:]
>>>
>>> calcType = 'Madgwick'
>>> myimu.calc_orientation(initialOrientation, type=calcType)
>>> q_Madgwick = myimu.quat[:, 1:]

calc_orientation(R_initialOrientation, type='exact')[source]

Calculate the current orientation

Parameters:

R_initialOrientation : 3x3 array

approximate alignment of the sensor CS with the space-fixed CS

type : string

  • ‘acc_gyr’ : quaternion integration of angular velocity
  • ‘Kalman’ : quaternion Kalman filter
  • ‘Madgwick’ : gradient descent method, efficient
  • ‘Mahony’ : formula from Mahony, as implemented by Madgwick

calc_position(initialPosition)[source]

Calculate the position, assuming that the orientation is already known.
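
A minimal usage sketch, continuing the IMU example above; the attribute holding the computed trajectory is assumed to be called pos, which is not confirmed by this page:

>>> myimu.calc_orientation(initialOrientation)   # the orientation must be computed first
>>> myimu.calc_position(np.r_[0, 0, 0])          # start at the origin [m]
>>> trajectory = myimu.pos                       # 'pos' attribute name is an assumption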

setData(data)[source]

Set the properties of an IMU-object.

class kinematics.MahonyAHRS(SamplePeriod=0.00390625, Kp=1.0, Ki=0, Quaternion=[1, 0, 0, 0])[source]

Madgwick’s implementation of Mahony’s AHRS algorithm

Update(Gyroscope, Accelerometer, Magnetometer)[source]

Calculate the best-fit quaternion for the given measurement values.
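
A minimal sketch of how such a sample-by-sample filter is typically driven, with synthetic stationary data; the gyroscope units and the Quaternion attribute holding the current estimate are assumptions, not confirmed by this page:

>>> import numpy as np
>>> import kinematics
>>>
>>> gyr = np.zeros((100, 3))                     # stationary sensor; [rad/s] assumed
>>> acc = np.tile([0., 0., 1.], (100, 1))        # gravity direction
>>> mag = np.tile([1., 0., 0.], (100, 1))        # constant local magnetic field
>>> ahrs = kinematics.MahonyAHRS(SamplePeriod=1./256, Kp=1.0)
>>> for ii in range(gyr.shape[0]):
...     ahrs.Update(gyr[ii], acc[ii], mag[ii])   # one gyr/acc/mag sample per call
>>> q_current = ahrs.Quaternion                  # attribute name is an assumption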

kinematics.acc_gyr(omega, accMeasured, initialPosition, R_initialOrientation, rate)[source]

Reconstruct position and orientation from angular velocity and linear acceleration. Assumes a start in a stationary position; no compensation for drift is applied.

Parameters:

omega : ndarray(N,3)

Angular velocity, in [rad/s]

accMeasured : ndarray(N,3)

Linear acceleration, in [m/s^2]

initialPosition : ndarray(3,)

initial Position, in [m]

R_initialOrientation: ndarray(3,3)

Rotation matrix describing the initial orientation of the sensor, except for a possible misorientation with respect to gravity

rate : float

sampling rate, in [Hz]

Returns:

q : ndarray(N,3)

Orientation, expressed as a quaternion vector

pos : ndarray(N,3)

Position in space [m]
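
A minimal sketch with synthetic data for a sensor that is stationary and aligned with the space-fixed frame, assuming the module is importable as kinematics:

>>> import numpy as np
>>> import kinematics
>>>
>>> rate = 100.                                  # [Hz]
>>> N = 500
>>> omega = np.zeros((N, 3))                     # no rotation [rad/s]
>>> acc = np.tile([0., 0., 9.81], (N, 1))        # gravity only [m/s^2]
>>> q, pos = kinematics.acc_gyr(omega, acc,
...                             np.zeros(3),     # initialPosition [m]
...                             np.eye(3),       # R_initialOrientation
...                             rate)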

kinematics.analyze3Dmarkers(MarkerPos, ReferencePos)[source]

Take the recorded positions of 3 markers, and calculate the center of mass (COM) and the orientation. Can be used, e.g., for the analysis of Optotrak data.

Parameters:

MarkerPos : ndarray, shape (N,9)

x/y/z coordinates of 3 markers

ReferencePos : ndarray, shape (1,9)

x/y/z coordinates of markers in the reference position

Returns:

Position : ndarray, shape (N,3)

x/y/z coordinates of COM, relative to the reference position

Orientation : ndarray, shape (N,3)

Orientation relative to reference orientation, expressed as quaternion
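
A minimal sketch showing the expected data layout, with the x/y/z columns of three static markers stacked side by side into an (N, 9) array; a more complete example is given under movement_from_markers below:

>>> import numpy as np
>>> import kinematics
>>>
>>> N = 100
>>> m1 = np.tile([0., 0., 0.], (N, 1))           # marker 1, x/y/z
>>> m2 = np.tile([1., 0., 0.], (N, 1))           # marker 2
>>> m3 = np.tile([1., 1., 0.], (N, 1))           # marker 3
>>> markerPos = np.hstack((m1, m2, m3))          # shape (N, 9)
>>> pos, ori = kinematics.analyze3Dmarkers(markerPos, markerPos[0])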

kinematics.getXSensData(inFile, paramList)[source]

Read in the sample rate and the stored 3D parameters from XSens IMUs.

kinematics.kalman_quat(rate, acc, gyr, mag)[source]

Calculate the orientation from IMU data (linear acceleration, angular velocity, and magnetic field), using a quaternion-based Kalman filter.

Parameters:

rate : float

sample rate [Hz]

acc : (N,3) ndarray

linear acceleration [m/sec^2]

gyr : (N,3) ndarray

angular velocity [rad/sec]

mag : (N,3) ndarray

magnetic field orientation

Returns:

qOut : (N,4) ndarray

unit quaternion, describing the orientation relative to the coordinate system spanned by the local magnetic field and gravity

Notes

Based on “Design, Implementation, and Experimental Results of a Quaternion-Based Kalman Filter for Human Body Motion Tracking”, Yun, X. and Bachmann, E.R., IEEE Transactions on Robotics, Vol. 22, pp. 1216-1227 (2006).
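
A minimal sketch with synthetic data for a stationary sensor, with gravity along z and the magnetic field along x, assuming the module is importable as kinematics:

>>> import numpy as np
>>> import kinematics
>>>
>>> N, rate = 1000, 100.
>>> acc = np.tile([0., 0., 9.81], (N, 1))        # [m/s^2]
>>> gyr = np.zeros((N, 3))                       # [rad/s]
>>> mag = np.tile([1., 0., 0.], (N, 1))          # local magnetic field direction
>>> qOut = kinematics.kalman_quat(rate, acc, gyr, mag)
>>> qOut.shape
(1000, 4)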

kinematics.movement_from_markers(r0, Position, Orientation)[source]

Movement trajectory of a point on an object, from the position and orientation of a set of markers, and the relative position of the point at t=0.

Parameters:

r0 : ndarray (3,)

Position of point relative to center of markers, when the object is in the reference position.

Position : ndarray, shape (N,3)

x/y/z coordinates of COM, relative to the reference position

Orientation : ndarray, shape (N,3)

Orientation relative to reference orientation, expressed as quaternion

Returns:

mov : ndarray, shape (N,3)

x/y/z coordinates of the position on the object, relative to the reference position of the markers

Notes

\[\vec C(t) = \vec M(t) + \vec r(t) = \vec M(t) + {{\bf{R}}_{mov}}(t) \cdot \vec r({t_0})\]

Examples

>>> t = np.arange(0, 10, 0.1)
>>> translation = (np.c_[[1, 1, 0]]*t).T
>>> M = np.empty((3, 3))
>>> M[0] = np.r_[0, 0, 0]
>>> M[1] = np.r_[1, 0, 0]
>>> M[2] = np.r_[1, 1, 0]
>>> M -= np.mean(M, axis=0)
>>> q = np.vstack((np.zeros_like(t), np.zeros_like(t), quat.deg2quat(100*t))).T
>>> M0 = vector.rotate_vector(M[0], q) + translation
>>> M1 = vector.rotate_vector(M[1], q) + translation
>>> M2 = vector.rotate_vector(M[2], q) + translation
>>> data = np.hstack((M0, M1, M2))
>>> (pos, ori) = kinematics.analyze3Dmarkers(data, data[0])
>>> r0 = np.r_[1, 2, 3]
>>> movement = kinematics.movement_from_markers(r0, pos, ori)

kinematics.vel2quat(vel, q0, rate, CStype)[source]

Take an angular velocity (in rad/s), and convert it into the corresponding orientation quaternion.

Parameters:

vel : array, shape (3,) or (N,3)

angular velocity [rad/s].

q0 : array (3,)

vector part of the initial orientation quaternion (note: only the vector part!)

rate : float

sampling rate (in [Hz])

CStype : string

coordinate system: space-fixed (“sf”) or body-fixed (“bf”)

Returns:

quats : array, shape (N,4)

unit quaternion vectors.

Notes

  1. The output has the same length as the input. As a consequence, the last velocity vector is ignored.
  2. For angular velocity with respect to space (“sf”), the orientation is given by
\[q(t) = \Delta q(t_n) \circ \Delta q(t_{n-1}) \circ ... \circ \Delta q(t_2) \circ \Delta q(t_1) \circ q(t_0)\]
\[\Delta \vec{q_i} = \vec{n}(t_i)\sin \left( \frac{\Delta \phi (t_i)}{2} \right) = \frac{\vec \omega (t_i)}{\left| {\vec \omega (t_i)} \right|}\sin \left( \frac{\left| {\vec \omega ({t_i})} \right|\Delta t}{2} \right)\]
  3. For angular velocity with respect to the body (“bf”), the sequence of quaternions is inverted.
  4. Take care that you choose a high enough sampling rate!

Examples

>>> v0 = np.r_[0., 0., 100.] * np.pi/180.
>>> vel = np.tile(v0, (1000, 1))
>>> rate = 100
>>> out = kinematics.vel2quat(vel, [0., 0., 0.], rate, 'sf')
>>> out[-1:]
array([[-0.76040597,  0.        ,  0.        ,  0.64944805]])