This question is about an algorithm, not a specific language.
I have a mobile device with the following axes:
When the device is pointed perpendicular to the ground (positive Z facing up), I get an accurate compass reading in degrees, with the angle measured between north and the device's positive Y axis.
So if I hold the device and point it west, I will get either the value 270 or the value -90 (I'm not quite sure how to figure out which one I'll get).
When I hold the device with Y pointing towards the sky, the compass is no longer accurate, but I can use the device's accelerometer to detect when that is the case, and I can use the gyro to determine the rotation of the device.
So, when I get an accurate reading from the compass, I save a parameter called "lastAccurateAttitude", which is the quaternion from the gyro.
I also save "lastAccurateHeading", the angle from north.
What I am attempting to do is use lastAccurateHeading, lastAccurateAttitude and the current attitude to calculate the angle between north and the negative Z axis of the device.
Right now, I am stuck on the math, so I would appreciate some help.
If possible, I would love for this to also work when either X, -X, Y or -Y of the device is pointing upwards (north would still be calculated relative to -Z).
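One possible way the math could go, as a rough sketch in Python/NumPy. It assumes the attitude quaternions are given as (w, x, y, z) and rotate device coordinates into a gravity-aligned reference frame with Z pointing up, that the gyro frame does not drift between the two samples, and that the heading is measured clockwise from north; the sign conventions may need flipping on a particular platform.

```python
import numpy as np

def quat_mult(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def rotate(q, v):
    """Rotate vector v from device coordinates into the reference frame."""
    p = np.concatenate(([0.0], v))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mult(quat_mult(q, p), q_conj)[1:]

def heading_of_negative_z(last_accurate_heading, last_accurate_attitude, current_attitude):
    # 1. Recover the north direction in the reference frame from the accurate
    #    sample: device +Y expressed in the reference frame, projected onto the
    #    horizontal plane, then rotated back by the known heading.
    y_ref = rotate(last_accurate_attitude, np.array([0.0, 1.0, 0.0]))
    y_ref[2] = 0.0
    y_ref /= np.linalg.norm(y_ref)
    h = np.radians(last_accurate_heading)
    north = np.array([np.cos(h) * y_ref[0] - np.sin(h) * y_ref[1],
                      np.sin(h) * y_ref[0] + np.cos(h) * y_ref[1]])

    # 2. Express the device's -Z axis in the reference frame using the current
    #    attitude and project it onto the horizontal plane.
    #    (Degenerates if -Z points straight up or down.)
    v = rotate(current_attitude, np.array([0.0, 0.0, -1.0]))
    v = v[:2] / np.linalg.norm(v[:2])

    # 3. Signed angle from north to the projected -Z axis,
    #    clockwise positive like a compass heading.
    ccw = np.arctan2(north[0]*v[1] - north[1]*v[0],
                     north[0]*v[0] + north[1]*v[1])
    return (-np.degrees(ccw)) % 360.0
```

The same projection in step 2 works when X, -X, Y or -Y points upwards; it only breaks down when -Z itself is close to vertical.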
I made an object tracker that calculates the position of an object recorded in a live camera feed using stereoscopic cameras. The math was simple, once you know the camera distance and orientation. However, now I thought it would be nice to allow me to quickly extract all these parameters, so when I change my setup or cameras I will be able to quickly calibrate it again.
To calculate the object position I made some simplifications/assumptions, which made the math easier: the cameras are in the same YZ plane, so there is only a distance in x between them. Their tilt is also just in the XY plane.
To reverse the triangulation I thought a test pattern (a square) of 4 points whose distances to each other I know would suffice. Ideally I would like to get the cameras' positions (distances to the test pattern and to each other), their rotation about X (and maybe Y and Z if applicable/possible), as well as their view angle (to translate pixel positions to real-world distances; that should be a camera constant, but in case I change cameras it is quite a bit of work to determine accurately).
I started with the same trigonometric calculations, but I always end up missing parameters. I am wondering if there is an existing solution or a solid approach. If I need to add parameters (like distances, which are easy enough to measure), that's no problem (my calculations didn't give me any simple equations with that possibility, though).
I also read about homography in OpenCV, but it seems it applies to 2D space only, or does it?
Any help is appreciated!
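As a sketch of one possible approach (not necessarily the only one): if the real-world coordinates of the four pattern points are known, OpenCV's solvePnP can recover a camera's rotation and translation relative to the pattern, and the intrinsics (which encode the view angle) can come from a one-off calibrateCamera run or a rough guess. All names and values below are illustrative placeholders.

```python
import cv2
import numpy as np

# 3D coordinates of the square test pattern in its own frame (meters).
# Assumed here to be a 0.2 m square; replace with the measured distances.
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.2, 0.0, 0.0],
                          [0.2, 0.2, 0.0],
                          [0.0, 0.2, 0.0]], dtype=np.float32)

# Pixel positions of the same four corners as seen by one camera
# (placeholder values; take them from the live feed).
image_points = np.array([[311.0, 240.0],
                         [423.0, 238.0],
                         [425.0, 352.0],
                         [310.0, 355.0]], dtype=np.float32)

# Intrinsics: fx/fy encode the view angle, cx/cy the principal point.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)        # rotation of the pattern in camera coordinates
camera_position = -R.T @ tvec     # camera position in the pattern's frame
print(camera_position.ravel())
```

Running this once per camera against the same pattern gives each camera's pose; the difference of the two positions is the baseline, and the relative rotation follows from the two rotation matrices.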
I got my rotation values from the OPI0 to my PC, and now I want an object to rotate in the direction my gyroscope is facing (degree values from 180 to -180). I'm fairly new to Python.
glRotatef(1, xrot, yrot, zrot) only applies the rotation on top of the current rotation.
But what I actually want is that if the gyroscope reads 180 degrees, the object is placed at 180 degrees. This could be achieved by getting the current rotation, testing whether it is smaller or larger and then adjusting the rotation, or by setting the rotation to the degree value with a single command.
So my main questions are:
Is there a command to set the rotation value of the created object?
Is there a way to read the current rotation value?
Well, if you
glLoadIdentity()
before
glRotatef()
then it should always rotate to the specified degrees, instead of relatively rotating.
Depending on how and where you've set up your camera and object translation, you might have to refactor your code a bit because of the identity load (it will clear all other transformations you have done before).
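For illustration, a minimal PyOpenGL sketch of that idea; the translation values and the angle variables are placeholders for whatever the scene already uses.

```python
from OpenGL.GL import glLoadIdentity, glTranslatef, glRotatef

def draw_object(xrot, yrot, zrot):
    glLoadIdentity()                # start from a clean matrix every frame
    glTranslatef(0.0, 0.0, -5.0)    # re-apply your own camera/object translation
    glRotatef(xrot, 1.0, 0.0, 0.0)  # absolute rotation about X
    glRotatef(yrot, 0.0, 1.0, 0.0)  # absolute rotation about Y
    glRotatef(zrot, 0.0, 0.0, 1.0)  # absolute rotation about Z
    # ... then draw the object as before
```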
I have the following problem: there is a set of data points in a spherical coordinate system (a local one, not strictly following the geographic or mathematical convention), and I'm trying to convert them to a Cartesian system so I can preview them in any program and draw the shape that should emerge from these points.
The points are collected by a meter with a rotating laser head (thus slightly noisy). The head rotates about two axes, called phi and theta, and measures the distance r.
Where
phi - is left-right rotation (-90 to 90)
theta - is up-down rotation (-90 to 90)
r - the distance
This can be seen in the figure below:
I tried to convert the data to Cartesian (xyz) according to the following formulas:
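For reference, a standard conversion for angles defined as above (phi as left-right rotation about the vertical axis, theta as up-down rotation measured from the horizontal plane) looks like the sketch below; the exact axis assignment is an assumption about the meter's geometry, not necessarily the formulas actually used.

```python
import math

def to_cartesian(r, phi_deg, theta_deg):
    """phi: left-right rotation, theta: up-down rotation (both in degrees)."""
    phi = math.radians(phi_deg)
    theta = math.radians(theta_deg)
    x = r * math.cos(theta) * math.sin(phi)   # left-right
    y = r * math.cos(theta) * math.cos(phi)   # forward
    z = r * math.sin(theta)                   # up-down
    return x, y, z
```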
Unfortunately, every time I run the data through the conversion, the picture that I get is incorrect.
For sample collection:
sample
I get such a picture (seen from top):
The expected result should be a rectangular tub (with an open upper part). The first arc (at the point where the data had not yet been run through) is the so-called lens effect, which results from the fact that the meter was close to the wall; what puzzles me is the end of the graph, where the data are arranged in a completely unexpected way.
With this number of points it is hard for me to figure out what causes the failure: a bad conversion of the data, or simply the way the meter measured. I would be grateful for a verification of my way of thinking and any advice.
Thank you in advance.
I think I am late to answer this question.
I can't see the images, but you can go through enter link description here.
It will give you a clear idea of how to convert spherical coordinate data into the Cartesian coordinate system.
I am coding an app which is the same as the app "iHandy Level Free" on Google Play.
I am using the gyroscope sensor, but I don't know what the axis is around which the device rotates, because when I rotate or tilt the device, all three values x, y, z change.
Thanks.
Android follows the ENU (East, North, Up) convention, so you will get a bigger value for the axis around which the device is being rotated. Have a look at this application note: http://www.st.com/st-web-ui/static/active/jp/resource/technical/document/application_note/DM00063297.pdf
It is not possible to get a zero value around any axis, no matter how gently you move the device. You are bound to get some angular rate around the axis which you are assuming to be stationary.
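As a rough illustration of that (plain Python, assuming you already receive the three angular-rate values in your sensor callback), you can simply pick the axis with the largest magnitude and treat small rates as noise:

```python
def dominant_rotation_axis(gx, gy, gz, noise_threshold=0.1):
    """Return 'x', 'y' or 'z' for the axis with the largest angular rate,
    or None if all rates are below the noise threshold (rad/s)."""
    rates = {'x': gx, 'y': gy, 'z': gz}
    axis = max(rates, key=lambda k: abs(rates[k]))
    return axis if abs(rates[axis]) > noise_threshold else None
```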
I wish to calculate position of a small remote controlled car (relative to starting position). The car moves on a flat surface for example: a room floor.
Now, I am using an accelerometer and a gyroscope. To be precise this board --> http://www.sparkfun.com/products/9623
As a first step I just took the accelerometer data in the x and y axes (since the car moves on a surface) and double-integrated it to get the position. The formulae I used were:
vel_new = vel_old + ( acc_old + ( (acc_new - acc_old ) / 2.0 ) ) * SAMPLING_TIME;
pos_new = pos_old + ( vel_old + ( (vel_new - vel_old ) / 2.0 ) ) * SAMPLING_TIME;
vel_old = vel_new;
pos_old = pos_new;
acc_new = measured value from accelerometer
The above formulae are based on this document: http://perso-etis.ensea.fr/~pierandr/cours/M1_SIC/AN3397.pdf
However, this is giving a horrible error.
After reading other similar questions on this forum, I found out that I need to subtract the component of gravity from the above acceleration values (every time from acc_new) by using the gyroscope somehow. This idea is very well explained in the Google Tech Talks video Sensor Fusion on Android Devices: A Revolution in Motion Processing at 23:49.
Now my problem is how to subtract that gravity component?
I get angular velocity from the gyroscope. How do I convert it into acceleration so that I can subtract it from the output of the accelerometer?
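For what it's worth, a minimal sketch of the usual approach (not taken from the asker's code): the gyro does not give an acceleration to subtract; instead, the orientation estimated from the gyro (and accelerometer) tells you where gravity points in device coordinates, so it can be removed from the raw reading. The quaternion here is assumed to be in SciPy's (x, y, z, w) order and to rotate device coordinates into a world frame with Z pointing up.

```python
import numpy as np
from scipy.spatial.transform import Rotation

G = 9.81  # m/s^2

def linear_acceleration(acc_device, q_device_to_world):
    """acc_device: raw accelerometer sample as a 3-vector in m/s^2."""
    gravity_world = np.array([0.0, 0.0, G])
    # Express gravity in the device frame using the inverse orientation,
    # then subtract it from the measurement.
    gravity_device = Rotation.from_quat(q_device_to_world).inv().apply(gravity_world)
    return acc_device - gravity_device
```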
It won't work; these sensors are not accurate enough to calculate the position.
The reason is also explained in the video you are referring to.
The best you could do is to get the velocity based on the rpm of the wheels. If you also know the heading that belongs to the velocity, you can integrate the velocity to get position. Far from perfect but it should give you a reasonable estimate of the position.
I do not know how you could get the heading of the car, it depends on the hardware you have.
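As a toy sketch of that dead-reckoning suggestion, assuming a fixed sampling interval dt, a speed in m/s derived from the wheel RPM, and a heading in radians measured from north:

```python
import math

def dead_reckoning_step(x, y, speed, heading, dt):
    """Advance the estimated position by one sample."""
    x += speed * math.sin(heading) * dt
    y += speed * math.cos(heading) * dt
    return x, y
```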
I'm afraid Ali's answer is quite true when it comes to those devices. However, why don't you try searching for "arduino dead reckoning", which will bring up stories of people trying similar boards?
Here's a link that appeared after a search that I think may help you:
Converting IMU to INS
Even if it seems like all of them failed, you may come across workarounds that reduce the errors to acceptable amounts, or you can calibrate your algorithm with some other sensor to put it back on track, since the squared error of the acceleration, along with the gyro's white noise, destroys the accuracy.
One reason you have a huge error is that the equation appears to be wrong.
Example: to get the updated velocity, use vel_new = vel_old + ((acc_old + acc_new) / 2.0) * SAMPLING_TIME.
It looks like you had an extra accel term in the equation. Using this alone will not correct all of the error; as you say, you also need to correct for the influence of gravity and other things.