We are trying to keep the project simple and are using an Arduino Nano with two connected BNO055 IMUs to obtain Euler and quaternion measurements.

The physical setup has the two IMUs mounted one above the other, with both boards oriented in the same direction and on the same horizontal plane.

The bottom IMU is mounted at a midpoint (centre of gravity) for stability, and the top IMU can move forward, sideways, and backwards, or any combination of these, relative to the bottom one.

The Arduino is mounted between the sensors. We are using RF to transmit the data to a collection Arduino.

We are currently able to obtain the measurements on the Arduino and have implemented a basic comparison between the Euler measurements to see if there are deviations from the original starting point, plotting the result to determine whether there is possible bending in one of the directions.

We now need to establish the movement of the sensors in the horizontal plane (and possibly the vertical) so we can see at what point the deviation between the sensors occurred.

We have been reading quite a lot but are struggling with converting the quaternion measurements into movement. We received this comment:

"Quaternions (or Euler Angles) are used to create a rotation matrix that modifies the raw acceleration values. This gives you accelerations in an inertial coordinate system (e.g. the room your robot is in). From there, you multiply your timestep (integrate time) and use it to update a 1x3 velocity array (matrix), and multiply your timestep again to determine displacement. The displacement is added to a position array (matrix) to determine current position."

but we are not really making any progress trying to implement this calculation on the Arduino.

This is for a project for my son and could possibly be expanded in different directions as it is turning out to be an interesting area.

The basic Euler comparison is the following:

data is a structure into which the measurements from both sensors are logged at the same time:

#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

sensors_event_t event;
bno1.getEvent(&event);

// Orientation of this sensor in degrees
data.OX2 = event.orientation.x;
data.OY2 = event.orientation.y;
data.OZ2 = event.orientation.z;

//data.qOR = quat / quat2;

// Deviation from the other sensor's Euler angles
// (OX1/OY1/OZ1 are filled from the other BNO055 in the same way)
data.xOR = data.OX1 - data.OX2;
data.yOR = data.OY1 - data.OY2;
data.zOR = data.OZ1 - data.OZ2;

This is then transmitted.

Any help in translating the comment into actual code would be appreciated. We know that we will not get absolute values and that we will need to add filtering to the processing at a later stage.