Best way to find 'down'


scottm
Senior Contributor II

I've got the sensor fusion code integrated with my application and working with my sensors, but this project has been reminding me just how long it's been since my last math class (coming up on 25 years, I think), and we never covered quaternions.

I'm running the 9DOF Kalman filter algorithm, and at the moment I'm just trying to get the 'down' angle in one plane. The sensor is in the rim of a large, relatively slow-moving wheel (maybe a few RPM) and I need to identify which part of the wheel is in contact with the ground.

The pitch reading seems like the simplest way to go - if the sensor is at the bottom and level, the pitch should be 0; after 1/4 rotation it'll be pointing straight up and the pitch will be 90. The problem is that pitch doesn't distinguish right side up from upside down, so when the sensor is at the top and level it's back to 0. If the wheel weren't moving it'd be easy - I could get what I need from a single accelerometer reading.

Which parameter(s) would be most appropriate for this?

Thanks,

Scott


lisettelozano
NXP Employee

Hello Scott,

As mentioned, the experts on our Sensor Fusion library team have shared their opinions regarding the implementation of our algorithm in your application. Please see their response below:

We checked this using the Sensor Fusion Toolbox. We think the parameters most appropriate for describing a particular position would be roll and pitch together. For example, if the sensor is placed flat on the ground, then ideally roll = 0 and pitch = 0; when it is upside down, roll = 180 and pitch = 0.

The reason for this is the math behind it. For roll and pitch estimation we use rotation matrices; in mathematical terms, this requires solving either Rxyz or Ryxz. The equations that result from these have an infinite number of solutions, so we first restrict both roll and pitch to +/-180 deg. That helps somewhat, but it still leaves two distinct solutions, so we further restrict one of the angles to +/-90 deg. In our case, roll is restricted to +/-180 deg and pitch to +/-90 deg for the NED frame of reference, which is the conventional aerospace sequence (the ranges may be swapped in the Android/Windows reference frames). This yields a unique solution, and these angles cover the entire 1 g sphere.
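For illustration, the restricted angles can be derived from a single calibrated accelerometer reading (Gx, Gy, Gz) in g. The following is only a sketch of the standard aerospace (Rxyz) result, not our library's actual code, and the function names are made up for the example:

#include <math.h>

#define RAD_TO_DEG 57.2957795f

/* Roll about x, covering the full +/-180 deg range. */
static float roll_deg(float gy, float gz)
{
    return atan2f(gy, gz) * RAD_TO_DEG;
}

/* Pitch about y, restricted to +/-90 deg because the second
 * argument of atan2f is always non-negative. */
static float pitch_deg(float gx, float gy, float gz)
{
    return atan2f(-gx, sqrtf(gy * gy + gz * gz)) * RAD_TO_DEG;
}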

The other way you could distinguish the front and back positions is by simply looking at your Z values: they should vary between +1 g and -1 g for front and back, respectively.
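As a sketch (again, not library code; the 0.2 g dead band is an arbitrary example to handle the on-edge case), the test is just the sign of Z:

/* Returns +1 for front (Z near +1 g), -1 for back (Z near -1 g),
 * and 0 when the board is on edge and the test is inconclusive. */
static int front_or_back(float gz)
{
    if (gz > 0.2f)  return  1;
    if (gz < -0.2f) return -1;
    return 0;
}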

Let us know which method is easier for you.

Quaternions are an extension of the complex numbers and are the state of the art in animation, sensor orientation, computer graphics, spherical trigonometry, etc. - basically anywhere a sequence of rotations is involved. They might not be needed if you are only focused on calculating pitch and roll angles; rotation matrices are good enough to calculate those.

We hope this information can help. Let us know if you have follow-up questions or if this works well for you.


Have a great day,

Paulina

-------------------------------------------------------------------------------
Note:
- If this post answers your question, please click the "Mark Correct" button. Thank you!

- We follow threads for 7 weeks after the last post; later replies are ignored.
Please open a new thread and refer to the closed one if you have a related question at a later point in time.
-------------------------------------------------------------------------------


scottm
Senior Contributor II

Hi Paulina,

What you're describing with roll and pitch is the approach I was starting to experiment with.  I feel like I really need to capture video of the system in motion, synced up with the sensor readings, to properly analyze what's going on - it's tough debugging on a target board that's rolling around!

My concern with using Z values to get front vs back is that it's going to be rolling and will have centrifugal forces acting on it, so it's not going to see +/- 1 g.  This is the kind of motion it'll have to deal with.
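Back-of-the-envelope, I'm guessing I'd have to subtract the centripetal term (omega^2 * r) from the radial axis before the Z check could work - something like the sketch below, assuming the Z axis points from the sensor toward the hub, the gyro gives me the spin rate about the wheel's axis, and a made-up wheel radius:

#include <math.h>

#define WHEEL_RADIUS_M 0.30f   /* placeholder - not my actual wheel */
#define G_MS2          9.80665f
#define DEG_TO_RAD     0.0174533f

/* gz: radial accel reading in g; omega_dps: spin rate from the gyro
 * in deg/s. With +Z toward the hub, the spin adds omega^2*r/g to gz;
 * the sign flips if the radial axis points outward instead. */
static float gz_without_spin(float gz, float omega_dps)
{
    float omega = omega_dps * DEG_TO_RAD;              /* rad/s */
    return gz - (omega * omega * WHEEL_RADIUS_M) / G_MS2;
}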

I know what quaternions are - I just don't have a good grasp of how to use them.  The orientation vector seems easier to get my head around, but I think I'm just going to have to experiment to make sure I really understand what it's representing.

When the sensor fusion toolbox Windows app is displaying the rotated image of the board, which parameters is it using?  Is the source for the Windows app available?

Thanks,

Scott


lisettelozano
NXP Employee

Hi Scott,

The rotated image of the board is rendered from the quaternion calculations. The quaternion values for a particular orientation can be seen in the Sensor Fusion Toolbox -> Dynamics -> Orientation and Quaternion.

Unfortunately, the source code is not available to share.
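While we cannot share the application code, the conversion a 3D engine needs is standard: a unit quaternion q = (w, x, y, z) expands into an ordinary 3x3 rotation matrix. A generic sketch (not our toolbox's actual code) follows; note that OpenGL expects column-major storage, so transpose as needed:

typedef struct { float w, x, y, z; } quat_t;

/* Expand a unit quaternion into a row-major 3x3 rotation matrix. */
static void quat_to_matrix(const quat_t *q, float m[3][3])
{
    float w = q->w, x = q->x, y = q->y, z = q->z;

    m[0][0] = 1.0f - 2.0f * (y * y + z * z);
    m[0][1] =        2.0f * (x * y - w * z);
    m[0][2] =        2.0f * (x * z + w * y);

    m[1][0] =        2.0f * (x * y + w * z);
    m[1][1] = 1.0f - 2.0f * (x * x + z * z);
    m[1][2] =        2.0f * (y * z - w * x);

    m[2][0] =        2.0f * (x * z - w * y);
    m[2][1] =        2.0f * (y * z + w * x);
    m[2][2] = 1.0f - 2.0f * (x * x + y * y);
}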


Have a great day,

Paulina



scottm
Senior Contributor II

Hi Paulina,

Do you know if the Sensor Fusion Toolbox is providing the rotation in quaternion form to its 3D engine for that display (I'm assuming OpenGL or maybe DirectX), or is it converting to Euler angles?  It'd be helpful to me to have an example to work from - even if it's just a brief description and not the actual code.  One of the things I hope to accomplish at some point is a preview (something like the one in the toolbox) in a browser using WebGL that'll show the orientation of the device.

Thanks,

Scott


lisettelozano
NXP Employee

Hello Scott, 

First, please accept my apologies for the delayed response. Regarding your question, the Sensor Fusion library is targeted at hand-held applications; however, it may be possible to modify it for other kinds of applications. To double-check this, I have requested assistance from our Sensor Fusion expert and am waiting for more information to share.


Have a great day,

Paulina
