Shift angle orientation using atan2 - trigonometry

I need to calculate the instantaneous phase of a 2D wave field U using the Hilbert transform H(U).
So far it works, and the result makes sense.
However, the instantaneous phase comes from atan2(H(U), U), which follows the standard convention: zero along the positive x-axis, increasing counterclockwise.
Is there an easy way to have zero degrees point down and pi point up?
Thanks!!!
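
One way to do the remap (a minimal sketch using SciPy and NumPy; it assumes U is a 2D array, the Hilbert transform runs along axis 0, and "down" corresponds to -pi/2 in the standard atan2 convention):

    import numpy as np
    from scipy.signal import hilbert

    # Toy 2D wave field; replace with your own U.
    x = np.linspace(0, 4 * np.pi, 128)
    U = np.sin(x)[:, None] * np.ones((1, 64))

    # hilbert() returns the analytic signal U + i*H(U), so np.angle()
    # is exactly atan2(H(U), U): zero along +x, counterclockwise positive.
    phase = np.angle(hilbert(U, axis=0))

    # Shift by +pi/2 and wrap to [0, 2*pi): -pi/2 ("down") maps to 0
    # and +pi/2 ("up") maps to pi.
    phase_shifted = np.mod(phase + np.pi / 2, 2 * np.pi)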

Related

Triangulate camera position and orientation in regards to known objects

I made an object tracker that calculates the position of an object recorded in a live camera feed using stereoscopic cameras. The math was simple once you know the camera distance and orientation. However, I would now like to be able to extract all these parameters quickly, so that when I change my setup or cameras I can recalibrate it quickly.
To calculate the object position I made some simplifying assumptions that made the math easier: the cameras are separated only along the X axis (they share the same Y and Z coordinates), and their tilt is only in the XY plane.
To reverse the triangulation, I figured a test pattern (a square) of 4 points with known distances to each other would suffice. Ideally I would like to get the cameras' positions (their distances to the test pattern and to each other), their rotation in X (and maybe Y and Z if applicable/possible), as well as their view angle (to translate pixel positions to real-world distances; that should be a camera constant, but in case I change cameras, it is quite a bit of work to determine accurately).
I started with the same trigonometric calculations, but I am always missing parameters. I am wondering whether there is an existing solution or a solid approach. If I need to add parameters (like distances, which are easy enough to measure), that's no problem, though my calculations didn't yield any simple equations with that possibility.
I also read about homography in OpenCV, but it seems to apply only to 2D space. Or doesn't it?
Any help is appreciated!
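
For what it's worth, a sketch of one existing solution: OpenCV's solvePnP estimates a camera's rotation and translation from known 3D points and their detected pixel positions, which is the full 3D pose problem rather than a 2D homography. The intrinsic matrix K, the square size, and the pixel coordinates below are illustrative placeholders, not values from the question:

    import numpy as np
    import cv2

    side = 0.20  # edge length of the test square, in meters
    # 3D corners of the pattern in its own coordinate frame (Z = 0 plane).
    obj_pts = np.array([[-side / 2,  side / 2, 0],
                        [ side / 2,  side / 2, 0],
                        [ side / 2, -side / 2, 0],
                        [-side / 2, -side / 2, 0]], dtype=np.float32)
    # The same corners as detected in one camera's image (placeholder values).
    img_pts = np.array([[310, 220], [420, 225], [415, 330], [305, 325]],
                       dtype=np.float32)
    # Camera intrinsics (focal length and principal point in pixels); these
    # encode the view angle and can be measured once with cv2.calibrateCamera.
    K = np.array([[800, 0, 320],
                  [0, 800, 240],
                  [0,   0,   1]], dtype=np.float32)

    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, None)
    R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation of the pattern in the camera frame

Running this once per camera gives each camera's pose relative to the test pattern, and from those two poses the inter-camera distance and relative rotation follow.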

What is the reference point for measuring angles in OpenCV?

I'm trying to infer an object's direction of movement using dense optical flow in OpenCV. I'm using calcOpticalFlowFarneback() to get flow coordinates and cartToPolar() to acquire vector angles which would indicate direction.
To interpret the results I need to know the reference point for measuring the angle. I found this blog post indicating that the range of angles is 360°, which tells me the angle measurement follows the unit circle, but I couldn't make out much more than that.
The documentation for cartToPolar() doesn't cover this and my attempts at testing it have failed.
It seems that the angle produced by cartToPolar() is measured on a unit circle rotated clockwise by 90°, centered on the image coordinate origin in the top-left corner.
I came to this conclusion using the dense optical flow example provided by OpenCV. I replaced the line hsv[...,0] = ang*180/np.pi/2 with hsv[...,0] = ang*180/np.pi to get the full conversion of the angle from radians to degrees. Then I tested a video of people moving from top right to bottom left and vice versa, sampled the dominant color with GIMP, and converted the resulting RGB values to HSV; the hue value corresponds to the angle in degrees.
People moving from top right to bottom left produced an angle of about 300°, and people moving the other way round produced an angle of about 120°. This hinted at how the unit circle is oriented.
Looking at the code, fastAtan32f is used to compute the angles, and it appears to be an atan2 implementation.
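
Independent of the hue sampling, cartToPolar() can also be probed directly with known vectors (a minimal sketch):

    import numpy as np
    import cv2

    # Unit vectors pointing right, down, left and up in image coordinates
    # (+x is right, +y is down, origin at the top-left corner).
    x = np.array([1.0, 0.0, -1.0, 0.0], dtype=np.float32)
    y = np.array([0.0, 1.0, 0.0, -1.0], dtype=np.float32)

    mag, ang = cv2.cartToPolar(x, y, angleInDegrees=True)
    print(ang.ravel())  # ~[0, 90, 180, 270]

So 0° points along +x and the angle grows toward +y; since +y points down in image coordinates, the sweep is clockwise as displayed on screen.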

Projected travelled distance from gyroscope / accelerometer measurements

I want to build an app that calculates the projected distance traveled by a kicked ball (the ball is kicked into a training net), based on initial readings from the accelerometer/gyroscope and some assumptions about weight, air resistance, etc. The ball contains a triple-axis accelerometer and gyro. The measurements can be either raw accel/gyro x/y/z values or one of the following:
Actual quaternion components in a [w, x, y, z] format
Euler angles (in degrees) calculated from the quaternions
yaw/pitch/roll angles (in degrees) calculated from the quaternions
acceleration components with gravity removed. This acceleration reference frame is not compensated for orientation, so +X is always +X according to the sensor, just without the effects of gravity.
acceleration components with gravity removed and adjusted for the world frame of reference (yaw is relative to initial orientation)
I actually don't know what any of the above means.
Could someone suggest a formula to use in this scenario with this kind of data available? I went through this question, but it doesn't exactly have what I'm looking for.
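
Not a complete answer, but a minimal sketch of the physics under strong assumptions: take the world-frame, gravity-removed acceleration (the last option in the list above), integrate it over the kick impulse to get a launch velocity, and plug that into the drag-free projectile range formula. The sample rate, the +Z-up axis convention and the neglect of air resistance are all assumptions here:

    import numpy as np

    G = 9.81  # gravitational acceleration, m/s^2

    def estimate_range(accel_world, fs):
        """accel_world: (N, 3) world-frame acceleration samples (m/s^2,
        gravity removed) covering the kick impulse; fs: sample rate in Hz."""
        dt = 1.0 / fs
        v = accel_world.sum(axis=0) * dt  # integrate accel -> launch velocity
        v_h = np.hypot(v[0], v[1])        # horizontal speed
        v_z = max(v[2], 0.0)              # vertical speed, assuming +Z is up
        t_flight = 2.0 * v_z / G          # time to fall back to launch height
        return v_h * t_flight             # drag-free range: R = v_h * t

    # Example: a fake 20 ms impulse at 1 kHz giving v = (12, 0, 6) m/s.
    samples = np.tile([600.0, 0.0, 300.0], (20, 1))
    print(estimate_range(samples, fs=1000.0))  # ~14.7 m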

How is 3D plane normal vector related to its rotation

What I am trying to do: http://www.youtube.com/watch?v=CaTI2d0tQME (at 3:15).
In my 3D API there is quad.rotation[x,y,z], quad[x,y,z] (which is its center), and width/height. I understand that the vertices are calculated from all of the given values, and the normal can be calculated from the vertices, but I have a feeling I should be able to get it just from the rotation?
Yes you can!
Your quad must be axis-oriented (along the X, Y, or Z axis, which is its normal vector in its local space). Transform this vector by the quad's rotation matrix and you will have your new, nice and shiny normal vector in world space!
A little warning: if the quad's transformation matrix is generated by a 3D engine, it could contain scaling factors that will mess up the normal vector. In this case, the classical solution is to use the inverse transpose of the matrix, or to generate your own transformation matrix from the quad's rotation values.
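
A toy sketch of this answer, assuming the quad's local normal is +Z and quad.rotation holds Euler angles in radians applied in X, then Y, then Z order (the order is engine-specific, so check your API):

    import numpy as np

    def rotation_matrix(rx, ry, rz):
        """Build R = Rz @ Ry @ Rx from Euler angles in radians."""
        cx, sx = np.cos(rx), np.sin(rx)
        cy, sy = np.cos(ry), np.sin(ry)
        cz, sz = np.cos(rz), np.sin(rz)
        Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    local_normal = np.array([0.0, 0.0, 1.0])  # axis-aligned normal in local space
    world_normal = rotation_matrix(0.3, 0.5, 0.0) @ local_normal
    # No inverse transpose is needed here: a pure rotation has no scaling.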

Defining Up in the Direct3D View Matrix when Camera Is Constantly Moving

In my Direct3D application, the camera can be moved using the mouse or arrow keys. But if I hard code (0,1,0) as the up direction vector in LookAtLH, the frame goes blank at some orientations of the camera.
I just learned the hard way that when looking along the Y-axis, (0,1,0) no longer works as the Up direction (seems obvious?). I am thinking of switching my up direction to something else for each of these special cases. Is there a more graceful way to handle this?
Assuming you can calculate a vector pointing forward (what you are looking at minus your position) and a vector pointing right (always in the XZ plane unless you can roll): normalize both vectors, then up is forward x right (where x is the cross product). There is a small sketch of this below.
In general, you can plug your yaw, pitch and roll into a rotation matrix and rotate the axis vectors to get right, up and forward, but I guess that's what you are using LookAtLH to avoid.
See http://en.wikipedia.org/wiki/Rotation_matrix#The_3-dimensional_rotation_matricies
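
A small sketch of that recipe in a left-handed, Y-up, Z-forward (Direct3D-style) basis; the fallback for the degenerate pole case is an addition, not part of the recipe above:

    import numpy as np

    def camera_up(eye, target):
        """Derive up from forward and right, assuming +X right, +Y up,
        +Z forward and no roll."""
        forward = target - eye
        forward = forward / np.linalg.norm(forward)
        right = np.cross([0.0, 1.0, 0.0], forward)  # lies in the XZ plane
        n = np.linalg.norm(right)
        if n < 1e-6:  # looking straight up or down: the degenerate pole case
            right = np.array([1.0, 0.0, 0.0])
        else:
            right = right / n
        return np.cross(forward, right)  # up = forward x right

    print(camera_up(np.array([0.0, 0.0, 0.0]),
                    np.array([0.0, 0.0, 5.0])))  # [0. 1. 0.]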
The graceful way to handle this is to use unit quaternions. A quaternion is a vector of 4 values that encodes an orientation in 3D space (not a rotation, as some articles assert), and a unit quaternion is one whose length sqrt(x^2+y^2+z^2+w^2) is 1.0. There is a set of mathematical operations for working with quaternions that is analogous to using matrices to encode rotations, with the added bonus that a quaternion can never represent a degenerate orientation. You can freely convert quaternions to a 3x3 or 4x4 matrix when you need to feed the result to a GPU.
Your problem is that, while you are moving your camera, you introduce a little twist into the camera's up direction. By forcing the camera to re-center itself on the (0,1,0) vector every iteration, you are in effect rotating the camera and then clamping its orientation to remain on the surface of a sphere. But when your camera hits the pole of this sphere, there is no good direction to call "up": your matrix goes singular and gives you zero-sized polygons (hence the black screen). Quaternions can interpolate through these poles and come out the other side just fine, leaving you with a valid matrix at all times. All you have to do is control the twist.
To measure this twist you should read Ken Shoemake's article "Fiber Bundle Twist Reduction" in the book Graphics Gems 4. He shows a good way to measure this accumulated twist and how to remove it when it is offensive.
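
A toy sketch of the quaternion bookkeeping described above (plain NumPy, [w, x, y, z] convention; all names are illustrative):

    import numpy as np

    def quat_mul(a, b):
        """Hamilton product of two quaternions in [w, x, y, z] order."""
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return np.array([aw*bw - ax*bx - ay*by - az*bz,
                         aw*bx + ax*bw + ay*bz - az*by,
                         aw*by - ax*bz + ay*bw + az*bx,
                         aw*bz + ax*by - ay*bx + az*bw])

    def quat_from_axis_angle(axis, angle):
        axis = np.asarray(axis) / np.linalg.norm(axis)
        return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

    def quat_rotate(q, v):
        """Rotate vector v by unit quaternion q: q * (0, v) * conj(q)."""
        qv = np.concatenate(([0.0], v))
        q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
        return quat_mul(quat_mul(q, qv), q_conj)[1:]

    q = np.array([1.0, 0.0, 0.0, 0.0])  # identity orientation
    for _ in range(100):                # accumulate small pitch steps per frame
        q = quat_mul(quat_from_axis_angle([1.0, 0.0, 0.0], 0.02), q)
        q /= np.linalg.norm(q)          # renormalize to stay a unit quaternion
    # 2 radians of pitch is well past the pole, yet up is still well defined:
    print(quat_rotate(q, np.array([0.0, 1.0, 0.0])))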
