I'm trying to generate a mesh from a sphere of radius r. My goal is to create a UV sphere such that every point on the polyhedron has distance from the sphere smaller than tol.
The following code creates a grid of points on the sphere. How can I compute parallels_count and meridians_count so that all the points of the mesh are within tolerance?
for j in range(parallels_count):
    parallel = PI * (j + 1) / parallels_count
    for i in range(meridians_count):
        meridian = 2.0 * PI * i / meridians_count
        yield spherical_to_cartesian(meridian, parallel)
The code comes from here, and this is a picture of the UV sphere:
The distance between each face of the mesh and the sphere will be maximum around the center of the face.
So, for the distance between a face and the sphere to be smaller than tol it is not sufficient that the distances between the edges of the face and the corresponding circumferences are smaller than tol.
This picture is out of context but helps me explain what I mean.
The biggest distance between points is on the equator, so use the circle circumference to obtain the angular step. If I am not mistaken it should be:
dangle = tol/r; //[rad]
where r is the sphere radius in the same units as tol. You can use a smaller step to be sure, like dangle*=0.75; and use this for both the parallel and meridian angles.
If you want your counts instead then try:
meridians_count = (2.0*PI*r/tol)+1; // ceil or +1 just to be sure
parallels_count = 0.5*meridians_count;
It is still early here so I hope I did not make any silly math mistake (the easiest stuff is the worst for silly bugs).
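Just to illustrate, here is a minimal C++ sketch of turning that step into counts (the function name is my own; it assumes tol and r are in the same units and applies the 0.75 safety factor mentioned above):
#include <cmath>

// Rough sketch: angular step from the linear approximation dangle = tol / r,
// shrunk a bit for safety, then converted to meridian/parallel counts.
void sphere_counts(double r, double tol, int &meridians_count, int &parallels_count)
{
    const double PI = 3.14159265358979323846;
    double dangle = 0.75 * tol / r;                       // [rad], safety factor
    meridians_count = (int)std::ceil(2.0 * PI / dangle);  // full circle along the equator
    parallels_count = (int)std::ceil(PI / dangle);        // half circle from pole to pole
}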
Also take a look at a few related QA's of mine:
Applying map of the earth texture a Sphere
Make a sphere with equidistant vertices
Sphere triangulation
[Edit1] Well, your new definition of tol changes everything.
I see it like this: the sagitta of a chord spanning angle da on a circle of radius r is r*(1 - cos(da/2)), so requiring it to stay below tol gives:
cos(da/2) = (r-tol)/r
da = 2.0*acos((r-tol)/r)
If you convert to a spherical surface then the max difference is in the center of a uv grid cell, whose diagonal spans roughly sqrt(2)*da, so try to use:
da = sqrt(2.0)*acos((r-tol)/r)
so your angle step should be a bit smaller than that ...
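A hedged C++ sketch of this refined step (the function name is mine; it assumes 0 < tol < r):
#include <cmath>

// Exact angular step from the sagitta relation cos(da/2) = (r - tol) / r,
// reduced by sqrt(2) to cover the diagonal of a uv grid cell.
double angular_step(double r, double tol)
{
    double da = 2.0 * std::acos((r - tol) / r); // step along one grid direction
    return da / std::sqrt(2.0);                 // shrink for the cell diagonal
}
The meridian/parallel counts then follow as before: 2*PI divided by this step, and half of that respectively.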
I need to offset a curve, which in the simplest way means just shifting the points perpendicularly. I can access each point to calculate the angle of each line along the given path; for now I use atan2. Then I take those two angles and average them. That returns the shortest angle, which is not what I need in this case.
How can I calculate the angle of each connection, given that I am not interested in the shortest angle but in the one that would create a parallel offset curve?
Assuming 2D case...
So do a cross product of the direction vectors of 2 neighboring lines; the sign of the z coordinate of the result will tell you if the lines turn CW/CCW.
So if you got 3 consequent control points on the polyline: p0,p1,p2 then:
d1 = p1-p0
d2 = p2-p1
if you use some 3D vector math then convert them to 3D by setting:
d1.z=0;
d2.z=0;
now compute 3D cross:
n = cross(d1,d2)
which returns a vector perpendicular to both input vectors, whose magnitude equals the area of the quad (parallelogram) constructed with d1,d2 as base vectors. Its direction (out of the 2 possible) is determined by the winding of p0,p1,p2, so inspecting the z of the result is enough.
The n.x,n.y are not needed, so you can compute n.z directly without doing the full cross product:
n.z=(d1.x*d2.y)-(d1.y*d2.x)
if (n.z>0) case1
if (n.z<0) case2
Whether case1 is CW or CCW depends on your coordinate system properties (left/right handedness). This approach is very commonly used in CG for back-face culling of polygons ...
If n.z is zero it means that your vectors/lines are either parallel or at least one of them is zero.
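For illustration, a small C++ sketch of that turn test (the struct and function names are my own):
// Sign of the 2D cross product gives the turn direction at p1 on the polyline p0,p1,p2.
struct Vec2 { double x, y; };

int turn_direction(Vec2 p0, Vec2 p1, Vec2 p2)
{
    Vec2 d1 = { p1.x - p0.x, p1.y - p0.y };
    Vec2 d2 = { p2.x - p1.x, p2.y - p1.y };
    double nz = d1.x * d2.y - d1.y * d2.x; // z of cross(d1, d2)
    if (nz > 0.0) return +1;               // case1 (CW or CCW depends on handedness)
    if (nz < 0.0) return -1;               // case2
    return 0;                              // parallel or degenerate segment
}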
I think these might interest you:
draw outline for some connected lines
How can I create an internal spiral for a polygon?
Also, in 2D you do not need atan2 to get a perpendicular vector... You can do this instead:
u = (x,y)
v = (-y,x)
w = (x,-y)
So u is any 2D vector and v,w are the 2 possible perpendicular vectors to u in 2D. They are the result of:
cross((x,y,0),(0,0,1))
cross((0,0,1),(x,y,0))
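A minimal sketch of how that perpendicular could be used to offset a point (the normalization and the names are my own addition):
#include <cmath>

// Offset point p by distance 'offset' perpendicular to direction u = (x, y),
// using the rotated vector (-y, x) instead of atan2.
struct Vec2 { double x, y; };

Vec2 offset_point(Vec2 p, Vec2 u, double offset)
{
    double len = std::sqrt(u.x * u.x + u.y * u.y); // length of the direction vector
    Vec2 v = { -u.y / len, u.x / len };            // unit perpendicular to u
    return { p.x + offset * v.x, p.y + offset * v.y };
}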
I'm working on a 3D mapping application, and I've got to do some work with things like figuring out the visible region of a sphere (Earth) from a given point in space for things like clipping mapped regions and such.
Several things get easier if I can project the outline of Earth into screen space, clip polygons there, and then project back to the surface of the Earth (lat/lon), but I'm lost as to how to do that.
Is there a reasonable way to compute the outline of a sphere after perspective projection, and then a reasonable way to project things back onto the sphere?
You can clip the polygons in 3D. The silhouette of the sphere - back-projected into 3D - will always be a circle on a plane. Perspective projection does not change that. Thus, you can clip all polygons at the plane.
Calculating the plane is not too hard. If you consider the sphere's center the origin, then the plane could be represented in normal form as:
dot(n, x) = d
n is the normal. This one is easy. It is just the unit direction vector from the sphere center to the observer.
d is the distance from the sphere center. This is a bit harder but not too hard. If l is the distance of the observer to the sphere center and r is the sphere radius, then
d = r^2 / l
This is the plane which you can use to clip your polygons in 3D. If you need the radius of the circle on it, you can use the following formula:
r_c = sqrt(r^2 - d^2)
(Pythagoras in the right triangle formed by the sphere center, the circle center, and a point on the circle.)
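A small C++ sketch of that plane computation, assuming the sphere is centered at the origin and the observer is outside it (names are mine, not from the answer):
#include <cmath>

struct Vec3 { double x, y, z; };

// Silhouette plane dot(n, x) = d of a sphere of radius r centered at the origin,
// seen from 'observer'; also returns the radius of the silhouette circle.
void silhouette_plane(Vec3 observer, double r, Vec3 &n, double &d, double &r_c)
{
    double l = std::sqrt(observer.x * observer.x + observer.y * observer.y + observer.z * observer.z);
    n = { observer.x / l, observer.y / l, observer.z / l }; // unit direction toward the observer
    d = r * r / l;                                          // distance of the plane from the center
    r_c = std::sqrt(r * r - d * d);                         // radius of the silhouette circle
}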
Let us take a point on the unit sphere in spherical coordinates (cos(u)sin(v), sin(u)sin(v), cos(v)) and an arbitrary projection center (x,y,z).
We express that a projecting line is tangent to the sphere by the perpendicularity condition between the direction of the line and the radius vector from the center of the sphere to the point:
(x - cos(u)sin(v)) cos(u)sin(v) + (y - sin(u)sin(v)) sin(u)sin(v) + (z - cos(v)) cos(v) = 0
This simplifies to
x cos(u)sin(v) + y sin(u)sin(v) + z cos(v) = 1
which is a curve in the longitude/latitude coordinates. You can solve u as a function of v or conversely.
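As an illustration, here is a C++ sketch that solves that equation for v at a fixed u by rewriting A*sin(v) + z*cos(v) as R*sin(v + phi) (one of the two branches; the names and this solving approach are my own sketch):
#include <cmath>

// Solve x*cos(u)*sin(v) + y*sin(u)*sin(v) + z*cos(v) = 1 for v at a given u.
// Valid when R >= 1, i.e. when a tangent line exists for this u.
double silhouette_v(double u, double x, double y, double z)
{
    double A = x * std::cos(u) + y * std::sin(u); // coefficient of sin(v)
    double R = std::sqrt(A * A + z * z);          // amplitude of A*sin(v) + z*cos(v)
    double phi = std::atan2(z, A);                // phase shift
    return std::asin(1.0 / R) - phi;              // one solution of R*sin(v + phi) = 1
}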
I have a question regarding the projection of an image over a set of 3D points. The image is given to me as a JPG, together with position and attitude information of the camera relative to a cartesian coordinate system (Xc,Yc,Zc and yaw, pitch, roll), as well as the horizontal and vertical field of view (in degrees).
Points are given using solely their 3d position in the same coordinate system (Xp,Yp,Zp).
In my coordinate system, Z is up. To project the image onto the points, I
compute the vector from camera to each point
Vector3 c2p = (Xp,Yp,Zp)-(Xc,Yc,Zc);
rotate c2p according to my camera's attitude (quaternion):
Vector3 c2pCamFrame = getCamQuaternion().conjugate().rotate(c2p);
compute azimuth and elevation from the camera's "center ray" to the point:
float azimuth = atan2(c2pCamFrame.x(), c2pCamFrame.y());
float elevation = atan2(c2pCamFrame.z(),sqrt(pow(c2pCamFrame.x(),2)+pow(c2pCamFrame.y(),2)));
if azimuth and elevation are within the field of view, I assign the color of the corresponding pixel to the point.
This works almost perfectly, and the "almost" motivates my question. Let me show you:
I cannot figure out why the elevation of the projection is distorted. In the bottom right of the image, you can see that points outside the frustum (exceeding the elevation) actually become colored - and this distortion is null at an azimuth of 0 degrees and peaks at the left and right edges of the image, creating the pillow distortion.
Why does this distortion appear? I'd love to understand this problem both in geometrical as well as mathematical terms. Thank you!
The field of view angles are only valid on the principal axes: a line of constant elevation is a cone around the camera, while the top and bottom of your image are planes, and the two only coincide at an azimuth of 0, which is why the error grows toward the image edges. But you can do it the other way around, i.e. calculate the x/y bounds from the angles:
maxX = tan(horizontal_fov / 2)
maxY = tan(vertical_fov / 2)
And check
if(abs(c2pCamFrame.x() / c2pCamFrame.z()) <= maxX
&& abs(c2pCamFrame.y() / c2pCamFrame.z()) <= maxY)
Additionally, you might want to check if the points are in front of the camera:
... && c2pCamFrame.z() > 0
This assumes a left-handed coordinate system.
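Putting the check together in a hedged C++ sketch (free-standing doubles instead of the question's Vector3, and a degrees-to-radians conversion of my own):
#include <cmath>

// Frustum test in the camera frame: compare slopes against the tangents of the
// half fields of view instead of comparing azimuth/elevation angles.
// Assumes z is the forward axis of the camera.
bool in_frustum(double x, double y, double z, double hfov_deg, double vfov_deg)
{
    const double PI = 3.14159265358979323846;
    double maxX = std::tan(hfov_deg * PI / 180.0 / 2.0); // horizontal bound as a slope
    double maxY = std::tan(vfov_deg * PI / 180.0 / 2.0); // vertical bound as a slope
    return z > 0.0                                       // point is in front of the camera
        && std::fabs(x / z) <= maxX
        && std::fabs(y / z) <= maxY;
}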
I'm trying to find the best way to calculate this. On a 2D plane I have fixed points, each with an instantaneous measurement value. The coordinates of these points are known. I want to predict the value of a movable point between these fixed points. The movable point coordinates will be known, so the distances between the points are known as well.
This could be comparable to temperature readings or elevation in topography. In this case I'm wanting to predict the ionospheric TEC of the mobile point from the fixed point measurements. The fixed point measurements are smoothed over time; however, I do not want to have to store previous values of the mobile point estimate in RAM.
Would some sort of gradient function be the way to go here?
This is the same algorithm as interpolating the height of a point inside a triangle.
In your case you don't have z values for heights but some other float value at each triangle vertex; it's the same concept, still 3D points.
Where you have 3D triangle points p, q, r and a test point pt, the pseudo code is something like this:
Vector3 v1 = q - p;
Vector3 v2 = r - p;
Vector3 n = v1.CrossProduct(v2);
if (n.z != 0)
    return ((n.x * (pt.x - p.x) + n.y * (pt.y - p.y)) / -n.z) + p.z;
As you indicate in your comment to #Phpdevpad, you do have 3 fixed points so this will work.
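A self-contained C++ sketch of that plane interpolation, with the measured value stored as the z of each fixed point (struct and function names are my own):
struct Vec3 { double x, y, z; };

// Interpolate the value (stored in .z) at (pt.x, pt.y) from the plane through
// the three fixed points p, q, r. Falls back to p.z for a degenerate triangle.
double interpolate_value(Vec3 p, Vec3 q, Vec3 r, Vec3 pt)
{
    Vec3 v1 = { q.x - p.x, q.y - p.y, q.z - p.z };
    Vec3 v2 = { r.x - p.x, r.y - p.y, r.z - p.z };
    Vec3 n = { v1.y * v2.z - v1.z * v2.y,   // cross product = plane normal
               v1.z * v2.x - v1.x * v2.z,
               v1.x * v2.y - v1.y * v2.x };
    if (n.z == 0.0) return p.z;             // the three points are collinear in xy
    return ((n.x * (pt.x - p.x) + n.y * (pt.y - p.y)) / -n.z) + p.z;
}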
You can try contour plots, especially contour lines. Simply use a Delaunay triangulation of the points and linear interpolation along the edges. You can try my PHP implementation https://contourplot.codeplex.com for geographic maps. Another option is the CONREC algorithm from Paul Bourke.
I have gone through all the study resources available on the internet, which are in the form of simple equations, vectors or trigonometric equations.
I couldn't find a way of doing the following:
Assuming Y is up in a 3D world.
I need to draw two 2D trajectories for a 3D trajectory, not as plain axis-aligned projections: an XY-plane-style side view taken w.r.t. the trajectory itself, and an XZ-plane top view of the same.
I have all the 3D points of the trajectory and the initial velocity; both angles can be calculated by vector mathematics.
How should I proceed further?
Refer to the picture below: a curve shown from different angles, which can lose its significance if projected onto the XY-plane. All I want is to convert the red curve along itself, the green curve along the green curve, and so on; and further, how would I map the side view to a plane? The top view is comparatively easy and is done just by taking the X and Z ordinates of each point.
I mean this is the requirement. :)
I don't think I understand the question, but I'll answer my interpretation anyway.
You have a 3D trajectory described by a sequence of points p0, ..., pN. We are given an angle v for a plane P parallel to the Y-axis, and wish to compute the 2D coordinates (di, hi) of the points pi projected onto that plane, where hi is the height coordinate in the direction Y and di is the distance coordinate in the direction v. Assume p0 = (0, 0, 0) or else subtract p0 from all vectors.
Let pi = (xi, yi, zi). The height coordinate is hi = yi. Assume the angle v is given relative to the Z-axis. The vector for the direction v is then r = (sin(v), 0, cos(v)), and the distance coordinate becomes di = dot(pi, r).
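A minimal C++ sketch of that projection (the types and names are mine; v is in radians, measured from the Z-axis as above):
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };
struct Vec2 { double d, h; }; // distance along the view direction, height

// Project the trajectory onto the vertical plane whose horizontal direction
// makes angle v with the Z-axis; p0 is subtracted so the curve starts at the origin.
std::vector<Vec2> side_view(const std::vector<Vec3> &pts, double v)
{
    std::vector<Vec2> out;
    if (pts.empty()) return out;
    Vec3 r = { std::sin(v), 0.0, std::cos(v) }; // horizontal direction inside the plane
    for (const Vec3 &p : pts)
    {
        double x = p.x - pts[0].x, y = p.y - pts[0].y, z = p.z - pts[0].z;
        out.push_back({ x * r.x + z * r.z, y }); // di = dot(pi, r), hi = yi
    }
    return out;
}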