IMU vs GPS which is better for this application?

nsaspook

Joined Aug 27, 2009
13,553
Most of the BNO086 basic stuff is done and connected into the CANBUS to Ethernet host for remote testing.
1683747970729.png

The Java 3D cube display on the Linux computer uses the vector rotation and total acceleration reports from the BNO086 IMU for movements.

I need to adjust the rest-frame orientations to match hand and screen rotations and accelerations.
 

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
Most of the BNO086 basic stuff is done and connected into the CANBUS to Ethernet host for remote testing.
View attachment 293916

The Java 3D cube display on the Linux computer uses the vector rotation and total acceleration reports from the BNO086 IMU for movements.

I need to adjust the rest-frame orientations to match hand and screen rotations and accelerations.
Nice. I have the display board just about done and almost ready to order it along with parts. So it will be a while before I get to the second board that has the IMU on it. I am considering adding some buttons/switches to the display board to allow for changing of ranges on the display (maybe like G's and milli-G's). Stuff like that. I should get myself a Linux box. I have some tools I use that need Linux OS. Xilinx ISE is one of them, so I have to run a virtual machine to use the software.
 

nsaspook

Joined Aug 27, 2009
13,553
Nice. I have the display board just about done and almost ready to order it along with parts. So it will be a while before I get to the second board that has the IMU on it. I am considering adding some buttons/switches to the display board to allow for changing of ranges on the display (maybe like G's and milli-G's). Stuff like that. I should get myself a Linux box. I have some tools I use that need Linux OS. Xilinx ISE is one of them, so I have to run a virtual machine to use the software.
Remember to add a ludicrous number of test points headers for software development and debugging with scopes and logic probes. The space for a 10 pin header is minimal but the functionality of critical test points, easily connected, is priceless.
1683764888506.png
1683764975728.png
 
Last edited:

nsaspook

Joined Aug 27, 2009
13,553
Added text to the 3D demo so I can get the IMU movement changes correct. :D
1683812213710.png

Java:
/*
* To change this license header, choose License Headers in Project Properties.
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*
* http://www.java2s.com/example/java-api/javax/media/j3d/orientedshape3d/rotate_about_point-0.html
 * send an ascii string of CSV data point via a tcp port
*/
package bnocube;

import java.awt.*;
import java.util.Scanner;
import javax.media.j3d.BranchGroup;
import javax.media.j3d.Canvas3D;
import javax.media.j3d.Transform3D;
import javax.media.j3d.TransformGroup;
import javax.swing.JFrame;
import javax.vecmath.Quat4d;
import javax.vecmath.Vector3d;
import javax.vecmath.Point3f;
import com.sun.j3d.utils.geometry.ColorCube;
import javax.media.j3d.Text3D;
import com.sun.j3d.utils.universe.SimpleUniverse;
import java.net.Socket;
import javax.media.j3d.*;

/**
*
* @author root
*/
public class Bnocube {

    public static void main(String[] args) {

        String ttl_eth_host = "NA";
        JFrame frame = new JFrame("Sensor Fusion Visual Test Program");
        Canvas3D canvas = new Canvas3D(SimpleUniverse.getPreferredConfiguration());
        SimpleUniverse universe = new SimpleUniverse(canvas);
        BranchGroup group = new BranchGroup();
        ColorCube cube = new ColorCube(0.3);
        Font myFont = new Font("TimesRoman", Font.CENTER_BASELINE, 1);
        Font3D myFont3D = new Font3D(myFont, new FontExtrusion());
        Point3f textPt = new Point3f(-1.6f, 0.4f, 0.0f);
        Text3D myText3D = new Text3D(myFont3D, "BNO086", textPt);
        Shape3D myShape3D = new Shape3D(myText3D, new Appearance());

        TransformGroup transformGroup = new TransformGroup();
        transformGroup.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
        transformGroup.addChild(myShape3D);
        transformGroup.addChild(cube);

        universe.getViewingPlatform().setNominalViewingTransform();
        group.addChild(transformGroup);
        universe.addBranchGraph(group);

        frame.add(canvas);
        frame.setSize(800, 600);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);

        ttl_eth_host = "10.1.1.238";

        try {
            Socket socket = new Socket(ttl_eth_host, 20108);
            // send IP address query to remote module
            socket.getOutputStream().write("L\r".getBytes("US-ASCII")); // or UTF-8 or any other applicable encoding...

            // grab data from the IMU host tcp server
            Scanner s = new Scanner(socket.getInputStream());
            System.err.println("Scanner running.");

            while (s.hasNextLine()) {
                try {
                    String line = s.nextLine();
                    String[] token = line.split(",");

                    if (token[0].equals("  1")) {
                        System.out.println(String.format("dtype = %3s  device = %s :%s:%s:%s:%s:%s:%s:%s:", token[0], token[1], token[2], token[3], token[4], token[5], token[6], token[7], token[8]));
                        // multiply x/y/z by -1 to swap frames of reference
                        double x = Double.parseDouble(token[4]);
                        double y = Double.parseDouble(token[2]);
                        double z = -Double.parseDouble(token[3]);
                        double w = Double.parseDouble(token[5]);
                        double ax = Double.parseDouble(token[6]);
                        double ay = Double.parseDouble(token[7]);
                        double az = Double.parseDouble(token[8]);

                        Quat4d quaternion = new Quat4d(w, x, y, z);
                        Vector3d vector = new Vector3d((ax * 0.02), (ay * 0.02), (az * 0.02)); // scaled acceleration offsets for each axis
                        transformGroup.setTransform(new Transform3D(quaternion, vector, 0.5));

                        // the inverse cosine of w gives you the pitch *if* you normalize the quaternion with x and z being zero
                        double pitch = Math.acos(w / Math.sqrt(w * w + y * y)) * 2.0 - (Math.PI / 2.0);

                        System.out.println(String.format("w = %+2.3f     x = %+2.3f     y = %+2.3f     z = %+2.3f     pitch = %+1.3f", w, x, y, z, pitch));
                    }
                } catch (Exception e) {
                    System.err.println("Scan Line error.");
                }
            }
            socket.close();
            s.close();
            System.err.println("Lost communication with the IMU socket. Exiting.");
            System.exit(1);
        } catch (Exception e) {
            System.err.println("Unable to open socket to ttl_eth_host. Exiting.");
            System.exit(1);
        }
        System.err.println("Exiting.");
        System.exit(1);
    }
}
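The pitch extraction near the bottom of the loop can be sanity-checked standalone; here is a minimal sketch in plain Java (no Java 3D required) using the same formula. For a pure rotation of angle theta about the Y axis, w = cos(theta/2) and y = sin(theta/2) with x = z = 0, so the formula should return theta minus pi/2 (the same offset the display code applies):

```java
public class PitchCheck {
    // same pitch-from-quaternion expression as in the display loop,
    // valid when x and z are (near) zero
    static double pitch(double w, double y) {
        return Math.acos(w / Math.sqrt(w * w + y * y)) * 2.0 - (Math.PI / 2.0);
    }

    public static void main(String[] args) {
        double theta = Math.PI / 2.0 + 0.3;  // known rotation about Y
        double w = Math.cos(theta / 2.0);
        double y = Math.sin(theta / 2.0);
        double p = pitch(w, y);
        // expect theta - PI/2 = 0.300
        System.out.printf("pitch = %+1.3f (expected %+1.3f)%n", p, theta - Math.PI / 2.0);
    }
}
```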
 
Last edited:

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
Added text to the 3D demo so I can get the IMU movement changes correct. :D
Is that Java or JavaScript? I was practicing writing JS a long time ago for web page reasons. This looks a lot like JS.
 

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
More buttons are a wise choice. Selection of different scale factors is very handy for navigation, setting the axis rotation is a handy option as well, and possibly display dimming.
For display dimming and excellent brightness levels, I am using the following chips (8 of them on the board). This chip allows up to 75 mA constant current per LED along with PWM inputs for dimming. I have it set for 18 mA per LED; the LEDs I am using are rated for 20 mA maximum, so I am running a little under that for safety. I plan to use a 256-step PWM. The user gets a knob connected to a rotary optical encoder with 32 counts per rotation, so the user can go from minimum brightness to maximum brightness with about 8 turns of the knob. If the user turns the knob quickly, above a certain rate, I plan to interpret that as more increments of the PWM per knob increment.
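The knob-to-brightness mapping described above can be sketched as follows. This is not the actual firmware; the 100 counts/s threshold and the 4x step multiplier are hypothetical values for illustration, with the level clamped to the 256-step (0..255) PWM range:

```java
public class DimKnob {
    static int level = 0;  // current PWM level, 0..255

    // deltaCounts: encoder counts since the last poll, dtMs: poll period
    static int update(int deltaCounts, int dtMs) {
        // turn rate in counts per second; above the (assumed) 100 cps
        // threshold, each count moves the level by 4 steps instead of 1
        double rate = Math.abs(deltaCounts) * 1000.0 / dtMs;
        int step = rate > 100.0 ? 4 : 1;
        level = Math.max(0, Math.min(255, level + deltaCounts * step));
        return level;
    }

    public static void main(String[] args) {
        System.out.println(update(32, 1000)); // one slow revolution: 32 steps -> 32
        System.out.println(update(10, 50));   // fast burst: 10 counts * 4 steps -> 72
    }
}
```

With 32 counts per revolution and a step of 1, eight slow turns cover the full 256-step range, matching the post's arithmetic.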
 

Attachments

nsaspook

Joined Aug 27, 2009
13,553
For display dimming and excellent brightness levels, I am using the following chips (8 of them on the board). This chip allows up to 75 mA constant current per LED along with PWM inputs for dimming. I have it set for 18 mA per LED; the LEDs I am using are rated for 20 mA maximum, so I am running a little under that for safety. I plan to use a 256-step PWM. The user gets a knob connected to a rotary optical encoder with 32 counts per rotation, so the user can go from minimum brightness to maximum brightness with about 8 turns of the knob.
With day/night preset levels, right? An ambient light sensor or signal option is nice.
 

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
With day/night preset levels, right? An ambient light sensor or signal option is nice.
I am glad you mentioned that! I don't have that on my board yet. I will have to research that. Do you have a sensor in mind that is easy to use? Optimum would be a chip that has a digital output with enough bits to allow for some variation. Thanks!
 

nsaspook

Joined Aug 27, 2009
13,553
I am glad you mentioned that! I don't have that on my board yet. I will have to research that. Do you have a sensor in mind that is easy to use? Optimum would be a chip that has a digital output with enough bits to allow for some variation. Thanks!
Not really. For my BNO086 design I'm using the VCNL4040 (https://www.vishay.com/docs/84274/vcnl4040.pdf), but that requires integration with the sensor hub using the BNO08* aux I2C port. For this application you want something that models the human eye, so this might be an option; it is basically just a phototransistor you can use with an ADC or comparator.
https://www.vishay.com/docs/81579/temt6000.pdf
 

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
Not really. For my BNO086 design I'm using the VCNL4040 (https://www.vishay.com/docs/84274/vcnl4040.pdf), but that requires integration with the sensor hub using the BNO08* aux I2C port. For this application you want something that models the human eye, so this might be an option; it is basically just a phototransistor you can use with an ADC or comparator.
https://www.vishay.com/docs/81579/temt6000.pdf
I may consider the TEMT6000 with a very simple interface and a non-linear 'ADC' for simplicity:

1683863278726.png

1. The CPLD sets START high and starts a timer.
2. The CPLD waits for DETECT to go high, stores the timer value, and sets START low again.
3. The CPLD waits about 10 times longer than the timer value to allow C1 to discharge completely.
4. The process repeats.

The stored timer value is then non-linearly proportional to the ambient light level. It looks complicated, but it is only 5 parts not including the light sensor. I have not calculated the exact part values, but the idea seems simple and sound.
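A rough model of why the stored count relates non-linearly to light: the phototransistor current I is roughly proportional to light level, and the time for C1 to charge to a fixed comparator threshold Vth goes as t = C * Vth / I, i.e. ~1/light. The component values below are illustrative assumptions only, not the actual schematic values:

```java
public class LightTimer {
    // photoCurrentUa: assumed phototransistor current in microamps
    static double chargeTimeMs(double photoCurrentUa) {
        double c1Nf = 100.0; // assumed C1 = 100 nF
        double vth = 1.65;   // assumed comparator threshold, half of 3.3 V
        // units: nF * V / uA works out to milliseconds
        return c1Nf * vth / photoCurrentUa;
    }

    public static void main(String[] args) {
        // brighter light -> more current -> shorter timer value
        System.out.printf("dim    (1 uA):  %.1f ms%n", chargeTimeMs(1.0));
        System.out.printf("bright (50 uA): %.1f ms%n", chargeTimeMs(50.0));
    }
}
```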
 

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
I have the display board schematic completed and am starting PCB layout and routing. Here is the schematic of the display board. Per recommendations from others, there are plenty of test points. The test points for this board are plated-through holes that allow a 22-gauge wire to fit 'snugly'. I have used this method in the past for test points, as it keeps things simple and works very well. There is a 40-pin square header that will send all the signals to the main board via a ribbon cable. The other board will have the IMU, processor, etc.
 

Attachments

nsaspook

Joined Aug 27, 2009
13,553
Last revision of this board for this project.
https://github.com/nsaspook/ll-test/blob/newboard/ll-tester_schem.pdf
https://github.com/nsaspook/ll-test/blob/newboard/ll-tester.pdf

Added two environmental sensors to the BNO086 aux I2C port: testing the BME280 and VCNL4040. The BNO086 sensor hub software has built-in support for the two devices, providing input reports from the sensor hub, so integration is fairly easy.
C:
            /*
             * start the IMU sensor data reports
             */
            sendTareCommand(TARE_SET_REORIENTATION, 0, 0);
            enableReport(ROTATION, UPDATE_MS_R);
            enableReport(TOTAL_ACCELERATION, UPDATE_MS_T);
            enableReport(LINEAR_ACCELERATION, UPDATE_MS_L);
            enableReport(GYROSCOPE, UPDATE_MS_CAL);
            enableReport(MAG_FIELD, UPDATE_MS_CAL);
            enableReport(SENSOR_REPORTID_TAP_DETECTOR, UPDATE_MS_MISC);
            enableReport(SENSOR_REPORTID_STABILITY_CLASSIFIER, UPDATE_MS_MISC);
            enableReport(SENSOR_REPORTID_SIGNIFICANT_MOTION, UPDATE_MS_MISC);
            enableReport(SENSOR_REPORTID_SHAKE_DETECTOR, UPDATE_MS_MISC);
            enableReport(SENSOR_REPORTID_CIRCLE_DETECTOR, UPDATE_MS_MISC);
            enableReport(SENSOR_REPORTID_AMBIENT_DETECTOR, UPDATE_MS_ENV);
            enableReport(SENSOR_REPORTID_PRESSURE_DETECTOR, UPDATE_MS_ENV);
            if (!enableCalibration(true, true, true)) {
                //                while (true) {
                eaDogM_WriteStringAtPos(6, 0, cmd_buffer);
                eaDogM_WriteStringAtPos(7, 0, response_buffer);
                OledUpdate();
                WaitMs(5000);
                //                }
            }
            imu_start = false;
https://raw.githubusercontent.com/nsaspook/ll-test/newboard/firmware/src/bno086.h

https://www.bosch-sensortec.com/products/environmental-sensors/humidity-sensors-bme280/ relative humidity, barometric pressure and ambient temperature
https://www.vishay.com/en/product/84274/ Proximity and Ambient Light Sensor

Also added a few other options, like a 3.3 V to 5 V boost module to run 5 VDC QEI encoders, with voltage dividers to bring the A/B outputs down to 3.3 V logic, and corrected a few PCB wiring bugs.
1684790612403.png
The 1.3 version is on the right. The little pinkish light near the IC3 pad is the ambient and proximity sensor. The Status MHH readout is the current accuracy status of the 3 (AGM) sensors on the BNO086: H for high, M for medium, L for low, and U for unreliable.
This was after sensor calibration.
Will rework the parts from the old boards into the new PCB.
1684790885611.png
QEI encoder, BNO086, BME280, and step-up DC-DC converter MYRBP500080B21RE https://www.mouser.com/datasheet/2/281/1/MYRBP_B_W-2498168.pdf
1684791676454.png
The VCNL4040 is not calibrated or scaled yet but has a usable signal (1.043) for light-level detection. The 118.992 is the air-pressure sensor readout.
1684791869111.png
19.996 with a flashlight on the sensor.

Some I2C data streams of the BNO086 aux port talking to the sensors on that bus, in response to controller data requests to the BNO086 via SPI.
Aux sensor request periods are 1 s on D0 and D1; the rest are controller debug signals.
1684794003904.png

1684792214992.png
1684792341651.png

1684792237019.png

1684792307928.png
 
Last edited:

nsaspook

Joined Aug 27, 2009
13,553
Proper scaling for units: ambient light in lux (the L readout) from the VCNL4040, with pressure in millibars (1018.80M) and temperature in °C (33.40C) from the BME280.
1684814651469.png
Darkened room
1684814675114.png
Light on
1684814702447.png
Flashlight
 

nsaspook

Joined Aug 27, 2009
13,553
I am impressed. Very much. And it is functional, as well.
The BNO086 and aux sensors are impressive and easy to design controller-compatible interfaces for, if you have experience and the right test equipment to find issues quickly. The vendor's software and documentation, not so much. The Android-compliant 4.4 KitKat-and-above software specification for full system capability is overly complex for a simple IMU application, but once a stable API framework is done, the rest falls right into place.
https://source.android.com/docs/core/interaction/sensors

The current project github branch (PCB design, software, hardware doc, etc..) is here: https://github.com/nsaspook/ll-test/tree/newboard
 
Last edited:

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
The BNO086 and aux sensors are impressive and easy to design controller-compatible interfaces for, if you have experience and the right test equipment to find issues quickly. The vendor's software and documentation, not so much. The Android-compliant 4.4 KitKat-and-above software specification for full system capability is overly complex for a simple IMU application, but once a stable API framework is done, the rest falls right into place.

The current project github branch (PCB design, software, hardware doc, etc..) is here: https://github.com/nsaspook/ll-test/tree/newboard
Nice! Thanks for sharing the project. After I have my display board completed, your experience with the sensor will help considerably when I have to work with the IMU. :)
 

nsaspook

Joined Aug 27, 2009
13,553
Nice! Thanks for sharing the project. After I have my display board completed, your experience with the sensor will help considerably when I have to work with the IMU. :)

I still have a bit of software to finish for using the device as a remote HID controller via CANBUS, and I need to clean up and harden the driver functions. I'll be around if you need me.
 

nsaspook

Joined Aug 27, 2009
13,553
One last sensor demo for the Proximity report from the VCNL4040. There are several breakout boards so it's easy to add as a standalone device.
https://www.adafruit.com/product/4161

10cm is max detection range.
Near the center screen for changing values.

The original testing board donated parts for another 'final' board.
1684886671443.png1684886830038.png
 

nsaspook

Joined Aug 27, 2009
13,553
Some software refinements.
Added a few info bits, like the part number and software versions (PN/SW) read from the chip, and the current estimated rotational tracking accuracy in radians (RAcc), with all 3 raw sensors at High stability Status.
1684981797240.png
1684975315941.png
Added a few GUI-HID (display buttons, buzzer) options to set the initial heading and to set only the forward axis for device sensor orientation. My chip is mounted upside down on the bottom of the PCB, so a few 3D transformations are needed to match the PC display to the IMU input.
The first calibration is to align magnetic North.
They supply docs on the Tare Procedures. https://device.report/m/4c258d2249c126f669c247d75c0baab5eb699404610635b5eb7ac24940ea58a0.pdf
Introduction
This document describes the Tare function of the BNO080/BNO085 that redefines the orientation of the sensor. This allows the outputs of the BNO080/BNO085 to be in line with the orientation with which it was mounted into the main device. These commands are described in more detail in the SH-2 Reference Manual [1] and this document assumes that the reader has this manual available for reference.
That output-position rotation quaternion is then transferred (via SPI, CAN, and Ethernet) to the end-user program (the Java 3D display), which needs to transform it into the correct user display orientation.
Java:
                        // multiply x/y/z by -1 to swap frames of reference
                        Quat4d quaternion = new Quat4d(w, -y, -z, -x); // cube 3D position
                        // Quat4d takes (x, y, z, w), so x = 1 here is a 180° flip about the X axis;
                        // a 90° correction would use (√2)/2 = 0.70710678118f components
                        Quat4d a = new Quat4d(1.0f, 0.0f, 0.0f, 0.0f); // set cube position correction rotation quaternion
                        a.normalize(); // make sure we have a proper rotation quaterion
                        quaternion.mul(a); // multiply IMU quaternion with rotation quaternion to flip cube and text
                        Vector3d vector = new Vector3d((ax * 0.02), (ay * 0.02), (az * 0.02)); // acceleration vector for cube vibration
                        transformGroup.setTransform(new Transform3D(quaternion, vector, 0.5)); // place it on the screen with motions from IMU
I need to swap frames of reference so the display CW/CCW moves match the PCB moves, and rotate the BNO086 chip-package reference vector frame back to the normal chip-facing-up orientation.
https://source.android.com/docs/core/interaction/sensors/sensor-types
1684989890629.png

// multiply x/y/z by -1 to swap frames of reference
1684976596601.png
I just use the default system axis orientation record when saving the Tare heading and correct off chip.
1684976631117.png
quaternion.mul(a); // multiply IMU quaternion with rotation quaternion to flip cube and text
That's combined with the acceleration vector to display forces moving the PCB and rotations of the PCB.
Vector3d vector = new Vector3d((ax * 0.02), (ay * 0.02), (az * 0.02)); // acceleration vector for cube vibration
transformGroup.setTransform(new Transform3D(quaternion, vector, 0.5)); // place it on the screen with motions from IMU
This is the end result.
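The quaternion composition above can be illustrated without the Java 3D scene graph. A minimal sketch using plain w-first arrays (avoiding vecmath's Quat4d x, y, z, w constructor order entirely), with an assumed 180° X flip standing in for the mounting correction:

```java
public class QuatFix {
    // Hamilton product q*r, quaternions as {w, x, y, z}
    static double[] mul(double[] q, double[] r) {
        return new double[]{
            q[0]*r[0] - q[1]*r[1] - q[2]*r[2] - q[3]*r[3],
            q[0]*r[1] + q[1]*r[0] + q[2]*r[3] - q[3]*r[2],
            q[0]*r[2] - q[1]*r[3] + q[2]*r[0] + q[3]*r[1],
            q[0]*r[3] + q[1]*r[2] - q[2]*r[1] + q[3]*r[0]};
    }

    public static void main(String[] args) {
        // assumed fixed correction: 180-degree flip about X for an
        // upside-down mounted chip
        double[] flip = {0.0, 1.0, 0.0, 0.0};
        double[] imu  = {1.0, 0.0, 0.0, 0.0}; // identity reading from the IMU
        double[] out  = mul(imu, flip);
        // prints w=0.0 x=1.0 y=0.0 z=0.0
        System.out.printf("w=%.1f x=%.1f y=%.1f z=%.1f%n",
                out[0], out[1], out[2], out[3]);
    }
}
```

Composing the flip with itself gives (-1, 0, 0, 0), which represents the same rotation as the identity, a quick self-check that the product is right.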
 
Last edited: