How to slerp quaternion from zero radians to 2PI radians - graphics

I am using a quaternion to interpolate a rotation angle for an animation:
- (GLKMatrix4)interpolate:(float)progress
{
    GLKMatrix4 R1 = GLKMatrix4MakeZRotation(0);
    GLKMatrix4 R2 = GLKMatrix4MakeZRotation(M_PI * 2);
    GLKQuaternion quat = GLKQuaternionSlerp(GLKQuaternionMakeWithMatrix4(R1),
                                            GLKQuaternionMakeWithMatrix4(R2), progress);
    return GLKMatrix4MakeWithQuaternion(quat);
}
However, I find that a rotation of 2*PI is the same as a rotation of 0 radians, so the interpolation between 0 and 2*PI stays at 0 radians the whole time.
How can I solve this problem?

Since both GLKMatrix4MakeZRotation(0) and GLKMatrix4MakeZRotation(M_PI * 2) yield the same identity matrix, interpolating between them has no effect. If you'd like to achieve a full 2PI rotation, interpolate the quaternions over the 0 to PI/2 range, then quadruple the resulting rotation by squaring the quaternion twice:
- (GLKMatrix4)interpolate:(float)progress
{
    GLKMatrix4 R1 = GLKMatrix4MakeZRotation(0);
    GLKMatrix4 R2 = GLKMatrix4MakeZRotation(M_PI / 2);
    GLKQuaternion quat = GLKQuaternionSlerp(GLKQuaternionMakeWithMatrix4(R1),
                                            GLKQuaternionMakeWithMatrix4(R2), progress);
    quat = GLKQuaternionMultiply(quat, quat); // squaring doubles the rotation angle
    quat = GLKQuaternionMultiply(quat, quat); // squaring again quadruples it
    return GLKMatrix4MakeWithQuaternion(quat);
}
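
The same trick generalizes to any rotation larger than PI: slerp across a fraction of the angle small enough to be unambiguous, then raise the result to the matching power. Here is a minimal C++ sketch of that idea using GLM (an assumption on my part; the question itself uses GLKit):

#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Interpolate a Z rotation from 0 to totalAngle (which may exceed PI)
// by slerping over totalAngle / n, then composing the result n times.
// Requires totalAngle / n < PI so the slerp takes the short way around.
glm::quat interpolateLargeRotation(float progress, float totalAngle, int n)
{
    glm::quat from = glm::angleAxis(0.0f, glm::vec3(0.0f, 0.0f, 1.0f));
    glm::quat to   = glm::angleAxis(totalAngle / n, glm::vec3(0.0f, 0.0f, 1.0f));
    glm::quat q    = glm::slerp(from, to, progress);

    glm::quat result(1.0f, 0.0f, 0.0f, 0.0f); // identity (w, x, y, z)
    for (int i = 0; i < n; ++i)
        result = q * result; // q^n: the rotation angle scales by n
    return result;
}

With totalAngle = 2*PI and n = 4 this reproduces the quarter-angle version above.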

Related

OpenCV get pixels on an ellipse

I'm trying to get the pixels of an ellipse from an image.
For example, I draw an ellipse on a random image (sample geeksforgeeks code):
import cv2
path = r'C:\Users\Rajnish\Desktop\geeksforgeeks\geeks.png'
image = cv2.imread(path)
window_name = 'Image'
center_coordinates = (120, 100)
axesLength = (100, 50)
angle = 0
startAngle = 0
endAngle = 360
color = (0, 0, 255)
thickness = 5
image = cv2.ellipse(image, center_coordinates, axesLength,
                    angle, startAngle, endAngle, color, thickness)
cv2.imshow(window_name, image)
cv2.waitKey(0)
It gives the output below:
Now, I want to get the pixel values along the boundary line of the ellipse. If possible, I would like to get the pixels drawn by cv2.ellipse() back as an array of coordinates.
Can anyone help me with this, please?
There is probably no direct OpenCV way to get these points of the ellipse, but you can extract them indirectly like this:
import numpy as np

mask = cv2.inRange(image, np.array(color), np.array(color))
contour = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)[-2][0]
contour will store the outer points of your red ellipse.
Here, what I have done is create a mask image of the ellipse and find the points of its outermost contour, which are exactly the points you need.
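
For completeness, the same mask-and-contour extraction as a C++ sketch (assuming, as above, that the ellipse was drawn in a pure color not present elsewhere in the image):

#include <opencv2/imgproc.hpp>
#include <vector>

// Extract the boundary points of a shape drawn in a known color.
std::vector<cv::Point> extractDrawnPoints(const cv::Mat& image, const cv::Scalar& color)
{
    cv::Mat mask;
    cv::inRange(image, color, color, mask); // keep only pixels of the drawn color
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_NONE);
    return contours.empty() ? std::vector<cv::Point>() : contours[0];
}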
If you want to obtain points (locations) on an ellipse, you can use the ellipse2Poly() function.
If the argument types of ellipse2Poly() are inconvenient, calculating the points yourself is straightforward.
This sample code is C++, but what it calculates should be clear.
#include <opencv2/imgproc.hpp>
#include <vector>

// Degree -> Radian
inline double RadFromDeg( double Deg ){ return CV_PI*Deg/180.0; }

// Just calculate points mathematically.
// Arguments are the same as cv::ellipse2Poly (although the ellipse parameters are a cv::RotatedRect).
void My_ellipse2Poly(
    const cv::RotatedRect &EllipseParam,
    double StartAngle_deg,
    double EndAngle_deg,
    double DeltaAngle_deg,
    std::vector< cv::Point2d > &DstPoints
)
{
    double Cos,Sin;
    {
        double EllipseAngleRad = RadFromDeg(EllipseParam.angle);
        Cos = cos( EllipseAngleRad );
        Sin = sin( EllipseAngleRad );
    }
    // Here you could reserve the destination vector's size, but in this sample it is omitted.
    DstPoints.clear();

    const double HalfW = EllipseParam.size.width * 0.5;
    const double HalfH = EllipseParam.size.height * 0.5;

    for( double deg=StartAngle_deg; deg<EndAngle_deg; deg+=DeltaAngle_deg )
    {
        double rad = RadFromDeg( deg );
        double u = cos(rad) * HalfW;
        double v = sin(rad) * HalfH;
        double x = u*Cos + v*Sin + EllipseParam.center.x;
        double y = u*Sin - v*Cos + EllipseParam.center.y;
        DstPoints.emplace_back( x,y );
    }
}
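
For comparison, here is a minimal sketch of calling the built-in cv::ellipse2Poly() with the same parameters as the Python snippet in the question (the numbers are taken from that snippet; the wrapper function name is mine):

#include <opencv2/imgproc.hpp>
#include <vector>

std::vector<cv::Point> ellipsePoints()
{
    cv::Point center(120, 100); // center_coordinates from the question
    cv::Size  axes(100, 50);    // axesLength from the question
    std::vector<cv::Point> pts;
    // rotation angle 0, arc from 0 to 360 degrees, sampled every 1 degree
    cv::ellipse2Poly(center, axes, 0, 0, 360, 1, pts);
    return pts; // integer pixel locations along the ellipse
}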

Path tracing cosine hemisphere sampling and emissive objects

I'm building my own path tracer, teaching myself from online resources, but I find that my implementation has an issue with emissive objects in the scene, especially in a dark environment (no skybox).
For example, in the following environment:
The box in the middle is the only light source, with an emission value of (3.0, 3.0, 3.0); all other objects have an emission value of (0.0, 0.0, 0.0). I was expecting the light to scatter smoothly over the walls, but instead it looks biased towards one direction.
My cosine sampling function is (modified from lwjgl3-demos):
float3 SampleHemisphere3(float3 norm, float alpha = 0.0)
{
    float3 randomVec = rand3();
    float r = saturate(pow(randomVec.x, 1.0 / (1.0 + alpha)));
    float angle = randomVec.y * PI_TWO;
    float sr = saturate(sqrt(1.0 - r * r));
    float3 ph = float3(sr * cos(angle), sr * sin(angle), r);
    float3 tangent = normalize(randomVec * 2.0 - 1.0);
    float3 bitangent = cross(norm, tangent);
    tangent = cross(norm, bitangent);
    return mul(ph, float3x3(tangent, bitangent, norm));
}
This is how I compute the shading and next ray info:
float3 Shade(inout Ray ray, HitInfo hit)
{
    ray.origin = hit.pos + hit.norm * 1e-5;
    ray.dir = normalize(SampleHemisphere3(hit.norm, 0.0));
    ray.energy *= 2.0 * hit.colors.albedo * saturate(dot(hit.norm, ray.dir));
    return hit.colors.emission;
}
And the bounce loop happens here:
// generate ray from camera
Ray ray = CreateCameraRay(camera, PixelCenter);

// trace ray
float3 color = 0.0;
for (int i = 0; i < _TraceDepth; i++)
{
    // get nearest ray hit
    HitInfo hit = Trace(ray);
    // accumulate color
    color += ray.energy * Shade(ray, hit);
    // if ray has no energy, stop tracing
    if (!any(ray.energy))
        break;
}

// write to frame target
_FrameTarget[id.xy] = float4(color, 1.0);
I learned the last two functions from GPU Path Tracing in Unity.
Here is another example of a similar error:
I feel that the problem is caused by the cosine-weighted hemisphere sampling, but I have no idea how to fix it.
What should I do to get a smoothly distributed lighting effect from emissive objects on diffuse surfaces? Do I have to define explicit light sources with shapes and sample them directly, instead of relying on emissive objects?
Edit:
It is indeed the cosine weighted sampling that is causing the problem.
Instead of:
float3 tangent = normalize(randomVec * 2.0 - 1.0);
I should have another vector of independent random values:
float3 tangent = normalize(rand3() * 2.0 - 1.0);
Now it shows:
Still not perfect, because there is clearly a cross shape (probably caused by the sparsity of floating-point values).
How can I further improve this?
Edit 2:
After some more debugging and experiments, I figured out a "solution", but I don't understand the reason behind it.
The random value generator is from this Shadertoy project; I picked it because I saw that GLSL-PathTracer also uses it.
Here is part of it:
void rng_initialize(float2 p, int frame)
{
    // white noise seed
    RandomSeed = uint4(p, frame, p.x + p.y);
}

void pcg4d(inout uint4 v)
{
    v = v * 1664525u + 1013904223u;
    v.x += v.y * v.w;
    v.y += v.z * v.x;
    v.z += v.x * v.y;
    v.w += v.y * v.z;
    v = v ^ (v >> 16u);
    v.x += v.y * v.w;
    v.y += v.z * v.x;
    v.z += v.x * v.y;
    v.w += v.y * v.z;
}

float3 rand3()
{
    pcg4d(RandomSeed);
    return float3(RandomSeed.xyz) / float(0xffffffffu);
}

float4 rand4()
{
    pcg4d(RandomSeed);
    return float4(RandomSeed) / float(0xffffffffu);
}
At initialization, I pass float2(id.xy) from SV_DispatchThreadID and the current frame counter to rng_initialize.
And here is my new cosine weighted hemisphere sampling function:
float3 SampleHemisphere3(float3 norm, float alpha = 0.0)
{
    float4 rand = rand4();
    float r = pow(rand.w, 1.0 / (1.0 + alpha));
    float angle = rand.y * PI_TWO;
    float sr = sqrt(1.0 - r * r);
    float3 ph = float3(sr * cos(angle), sr * sin(angle), r);
    float3 tangent = normalize(rand.zyx + rand3() - 1.0);
    float3 bitangent = cross(norm, tangent);
    tangent = cross(norm, bitangent);
    return mul(ph, float3x3(tangent, bitangent, norm));
}
And the results, which look much better, are:
My discoveries from the experiments are:
r in the sampling function has to depend on the w component of the random values.
angle can use any of x, y, z.
tangent has to depend on both the current xyz values and a new vector of random xyz values. The order doesn't matter, so I use zyx here. Missing either the current xyz or the new xyz results in a cross shape on the wall.
I'm not sure if this is a correct solution, but as far as my eyes can tell, it solves the problem.
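
For reference, the textbook construction sidesteps these pitfalls entirely: draw two independent uniform numbers, map them to a cosine-weighted direction via Malley's method, and build the tangent frame from the normal alone so it cannot correlate with the sample. Below is a C++ sketch of that construction (the types and names are mine, not from the shader above). Note that with a cosine-weighted pdf of cos(theta)/PI and a Lambertian BRDF of albedo/PI, the 2.0 * albedo * dot(norm, dir) factor in Shade would reduce to just albedo, because the cosine and the pdf cancel.

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 cross(const Vec3& a, const Vec3& b)
{
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}

// Cosine-weighted hemisphere sample around a normalized normal n.
// u1 and u2 are independent uniform random numbers in [0, 1).
Vec3 sampleCosineHemisphere(const Vec3& n, float u1, float u2)
{
    const float pi = 3.14159265f;

    // Malley's method: sample the unit disk, then project up onto the hemisphere.
    float r   = std::sqrt(u1);
    float phi = 2.0f * pi * u2;
    float px  = r * std::cos(phi);
    float py  = r * std::sin(phi);
    float pz  = std::sqrt(std::max(0.0f, 1.0f - u1)); // cos(theta)

    // Orthonormal basis built from the normal only, independent of the sample.
    Vec3 helper = (std::fabs(n.x) > 0.9f) ? Vec3{0.0f, 1.0f, 0.0f}
                                          : Vec3{1.0f, 0.0f, 0.0f};
    Vec3 t = cross(n, helper);
    float len = std::sqrt(t.x*t.x + t.y*t.y + t.z*t.z);
    t = { t.x/len, t.y/len, t.z/len };
    Vec3 b = cross(n, t);

    return { px*t.x + py*b.x + pz*n.x,
             px*t.y + py*b.y + pz*n.y,
             px*t.z + py*b.z + pz*n.z };
}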

Converting X, Z coords to RGB using GLSL shaders

I have a Three.js scene that contains a 100x100 plane centred at the origin (i.e. min coord (-50,-50), max coord (50,50)). I am trying to make the plane appear as a colour wheel by using the x and z coords in a custom GLSL shader. Using this guide (see "HSB in polar coordinates", towards the bottom of the page) I have gotten my
Shader Code with Three.js Scene
but it is not quite right.
I have played around tweaking all the variables that make sense to me, but as you can see in the screenshot, the colours cycle twice as often as they should. My maths intuition said to just divide the angle by 2, but when I tried that the result was completely incorrect.
I know the solution is probably very simple, but I have tried for a couple of hours and haven't got it.
How do I turn the shader I currently have into one that makes exactly one full colour rotation in 2pi radians?
EDIT: here is the relevant shader code in plain text
varying vec3 vColor;

const float PI = 3.1415926535897932384626433832795;

uniform float delta;
uniform float scale;
uniform float size;

vec3 hsb2rgb( in vec3 c ){
    vec3 rgb = clamp(abs(mod(c.x*6.0+vec3(0.0,4.0,2.0), 6.0)-3.0)-1.0, 0.0, 1.0);
    rgb = rgb*rgb*(3.0-2.0*rgb);
    return c.z * mix( vec3(1.0), rgb, c.y);
}

void main()
{
    vec4 worldPosition = modelMatrix * vec4(position, 1.0);
    float r = 0.875;
    float g = 0.875;
    float b = 0.875;
    if (worldPosition.y > 0.06 || worldPosition.y < -0.06) {
        vec2 toCenter = vec2(0.5) - vec2((worldPosition.z+50.0)/100.0, (worldPosition.x+50.0)/100.0);
        float angle = atan(worldPosition.z/worldPosition.x);
        float radius = length(toCenter) * 2.0;
        vColor = hsb2rgb(vec3((angle/(PI))+0.5, radius, 1.0));
    } else {
        vColor = vec3(r,g,b);
    }
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    gl_PointSize = size * (scale/length(mvPosition.xyz));
    gl_Position = projectionMatrix * mvPosition;
}
I have discovered that the guide I was following was incorrect. I wasn't thinking about my maths properly, but I now know what the problem was.
Single-argument atan has a range from -PI/2 to PI/2, which only covers half of a circle. When worldPosition.x is negative, atan does not return the correct angle, since the true angle falls outside that range. The angle needs to be adjusted based on which quadrant of the plane the point is in:
Q1: do nothing
Q2: add PI to the angle
Q3: add PI to the angle
Q4: add 2PI to the angle
After this, normalize the angle (divide by 2PI), then pass it to the hsb2rgb function.
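
This quadrant adjustment is exactly what the two-argument arctangent does for you; in GLSL, atan(y, x) is the two-argument overload. For illustration, here is a standalone C++ sketch (not the original shader) of mapping an (x, z) position to a hue in [0, 1) with exactly one full colour rotation per 2pi radians:

#include <cmath>

// Map an (x, z) position to a hue in [0, 1), one full rotation per 2*pi radians.
float hueFromXZ(float x, float z)
{
    const float pi = 3.14159265358979f;
    float angle = std::atan2(z, x); // range (-pi, pi], covers all four quadrants
    if (angle < 0.0f)
        angle += 2.0f * pi;         // shift to [0, 2*pi)
    return angle / (2.0f * pi);     // normalize to [0, 1)
}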

Determining a sphere's vertices via polar coordinates, and rendering it

I am working with OpenGL ES 2.0 on an Android device.
I am trying to get a sphere up and running and drawing. Currently, I almost have a sphere, but clearly it's being done very, very wrong.
In my app, I hold a list of Vector3's, which I convert to a ByteBuffer along the way and pass to OpenGL.
I know my code is okay, since I have a Cube and a Tetrahedron drawing nicely.
The two parts I changed were:
Determining the vertices
Drawing the vertices.
Here are the code snippets in question. What am I doing wrong?
Determining the polar coordinates:
private void ConstructPositionVertices()
{
    for (float latitude = 0.0f; latitude < (float)(Math.PI * 2.0f); latitude += 0.1f)
    {
        for (float longitude = 0.0f; longitude < (float)(2.0f * Math.PI); longitude += 0.1f)
        {
            mPositionVertices.add(ConvertFromSphericalToCartesian(1.0f, latitude, longitude));
        }
    }
}
Converting from Polar to Cartesian:
public static Vector3 ConvertFromSphericalToCartesian(float inLength, float inPhi, float inTheta)
{
    float x = inLength * (float)(Math.sin(inPhi) * Math.cos(inTheta));
    float y = inLength * (float)(Math.sin(inPhi) * Math.sin(inTheta));
    float z = inLength * (float)Math.cos(inTheta);

    Vector3 convertedVector = new Vector3(x, y, z);
    return convertedVector;
}
Drawing the circle:
inGL.glDrawArrays(GL10.GL_TRIANGLES, 0, numVertices);
Obviously I omitted some code, but I am positive my mistake lies somewhere in these snippets.
I do nothing more with the points than pass them to OpenGL, then ask for triangles, which should connect the points for me... right?
EDIT:
A picture might be nice!
Your z must be calculated using phi: float z = inLength * (float)Math.cos(inPhi);
Also, the points generated are not triangles, so it would be better to use GL_LINE_STRIP.
Using a triangle strip on a polar sphere is as easy as emitting points in pairs, for example:
const float GL_PI = 3.141592f;
GLfloat x, y, z, alpha, beta; // Storage for coordinates and angles
GLfloat radius = 60.0f;
const int gradation = 20;

for (alpha = 0.0; alpha < GL_PI; alpha += GL_PI/gradation)
{
    glBegin(GL_TRIANGLE_STRIP);
    for (beta = 0.0; beta < 2.01*GL_PI; beta += GL_PI/gradation)
    {
        x = radius*cos(beta)*sin(alpha);
        y = radius*sin(beta)*sin(alpha);
        z = radius*cos(alpha);
        glVertex3f(x, y, z);
        x = radius*cos(beta)*sin(alpha + GL_PI/gradation);
        y = radius*sin(beta)*sin(alpha + GL_PI/gradation);
        z = radius*cos(alpha + GL_PI/gradation);
        glVertex3f(x, y, z);
    }
    glEnd();
}
The first point follows the formula directly, and the second one is shifted by a single step of the alpha angle (i.e. it comes from the next parallel).
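
Note that glBegin/glEnd are desktop legacy GL and do not exist in OpenGL ES 2.0, so on Android you would generate the same vertex pairs into an array and upload it instead. A minimal C++ sketch of that (the function name and layout are assumptions, not from the question's code):

#include <cmath>
#include <vector>

// Build triangle-strip vertices for a sphere, pairing each parallel with
// the next one, exactly like the glBegin/glEnd loop above.
std::vector<float> buildSphereStrip(float radius, int gradation)
{
    const float pi = 3.141592f;
    std::vector<float> verts; // x, y, z triples, ready for glBufferData

    for (int i = 0; i < gradation; ++i)
    {
        float alpha0 = pi * i / gradation;
        float alpha1 = pi * (i + 1) / gradation;
        for (int j = 0; j <= 2 * gradation; ++j)
        {
            float beta = pi * j / gradation;
            auto push = [&](float alpha)
            {
                verts.push_back(radius * std::cos(beta) * std::sin(alpha));
                verts.push_back(radius * std::sin(beta) * std::sin(alpha));
                verts.push_back(radius * std::cos(alpha));
            };
            push(alpha0); // point on the current parallel
            push(alpha1); // matching point on the next parallel
        }
    }
    return verts; // draw each band of 2*(2*gradation+1) vertices as GL_TRIANGLE_STRIP
}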

Need Algorithm for Tie Dye Pattern

I am looking for an algorithm or help developing one for creating a tie-dye pattern in a 2-dimensional canvas. I will be using HTML Canvas (via fabric.js) or SVG and JavaScript, but I'm open to examples in any 2D graphics package, like Processing.
I would draw concentric rings of different colors, and then go around radially and offset them. Here's some pseudo-code for drawing concentric rings:
const kRingWidth = 10;
const centerX = maxX / 2;
const centerY = maxY / 2;
for (y = 0; y < maxY; y++)
{
    for (x = 0; x < maxX; x++)
    {
        // Get the color of a concentric ring - assume rings are 10 pixels wide
        deltaX = x - centerX;
        deltaY = y - centerY;
        distance = sqrt(deltaX * deltaX + deltaY * deltaY);
        whichRing = int(distance / kRingWidth);
        setPixel(x, y, myColorTable[whichRing]); // set the pixel based on a color look-up table
    }
}
Now, to get the offsets, you can perturb the distance based on the angle of (x, y) around the center. I'd generate a random noise table with, say, 360 entries (one per degree - you could try more or fewer to see how it looks). So after calculating the distance, try something like this:
angle = atan2(deltaY, deltaX); // Arctangent of deltaY/deltaX - atan2 handles deltaX == 0 and all four quadrants
if (angle < 0) angle += 2.0 * PI; // Make it always positive
angle = int(angle * 180 / PI); // Convert from radians to degrees and to an integer
distance += noiseTable[angle]; // Every pixel at this angle gets offset by the same amount.
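
Putting the two pieces together, here is a self-contained C++ sketch of the whole pass; setPixel, the color table, and the noise initialization are hypothetical stand-ins for whatever your 2D package provides:

#include <cmath>
#include <cstdlib>
#include <vector>

// Ring + radial-noise tie-dye pass. setPixel is a hypothetical callback.
void drawTieDye(int maxX, int maxY,
                const std::vector<unsigned>& colorTable,
                void (*setPixel)(int x, int y, unsigned color))
{
    const double pi = 3.14159265358979;
    const int kRingWidth = 10;
    const double centerX = maxX / 2.0;
    const double centerY = maxY / 2.0;

    // One random radial offset per degree, up to one ring width.
    double noiseTable[360];
    for (int i = 0; i < 360; ++i)
        noiseTable[i] = (std::rand() / (double)RAND_MAX) * kRingWidth;

    for (int y = 0; y < maxY; ++y)
    {
        for (int x = 0; x < maxX; ++x)
        {
            double dx = x - centerX;
            double dy = y - centerY;
            double distance = std::sqrt(dx * dx + dy * dy);

            // Perturb the ring radius by the noise at this angle around the center.
            double angle = std::atan2(dy, dx);
            if (angle < 0.0) angle += 2.0 * pi;
            int deg = (int)(angle * 180.0 / pi) % 360;
            distance += noiseTable[deg];

            size_t ring = (size_t)(distance / kRingWidth) % colorTable.size();
            setPixel(x, y, colorTable[ring]);
        }
    }
}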
