Can someone tell me how to calculate incidence angle from view angle?
I have a satellite image product that has some meta-information including in-track/cross-track/off-nadir view angles. I want to get in-track/cross-track incidence angles from them.
I am reposting this question for better visibility. I have a triangular mesh (assume it is a manifold mesh). I want to sample the corners of a square on the mesh in a way that is independent of the triangulation.
I am following these steps:
Sample a triangle (based on the areas of the triangles)
Sample a point uniformly on the triangle/face
Sample a pair of random perpendicular directions
I want to compute the locations of the three other corners of the square, given an edge length. Since those corners can land on any other face, the output should be in the format (face, barycentric coordinates on that face).
I am looking at libraries such as Polyscope or pygeodesic that use the heat method to compute the geodesic distance between two vertices of the mesh, but I am not sure how to get points at an arbitrary geodesic distance from another point.
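For reference, the first two steps above can be done independently of the triangulation already; here is a minimal sketch, assuming `vertices` is an (n, 3) array and `faces` is an (m, 3) index array (the geodesic "walk a given distance" part remains the open question):

```python
import numpy as np

def sample_point_on_mesh(vertices, faces, rng=np.random.default_rng()):
    """Pick a face with probability proportional to its area, then sample a
    point uniformly on that face (returned as barycentric coordinates)."""
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    areas = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1)
    face_idx = rng.choice(len(faces), p=areas / areas.sum())

    # uniform barycentric sampling via the square-root trick
    r1, r2 = rng.random(2)
    u = 1.0 - np.sqrt(r1)
    v = np.sqrt(r1) * (1.0 - r2)
    w = 1.0 - u - v
    return face_idx, np.array([u, v, w])
```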
I am trying to cluster a radar database. The observations of the radar are given as azimuth angle and range. For processing-speed reasons I do not want to transform this polar data into a Cartesian system.
A normal Euclidean distance metric needs the same unit in the x- and y-directions. I need to find the distance in the azimuth direction and in the range direction separately. Basically, you can imagine the Euclidean distance metric working within a circle, whereas my solution should work within an ellipse.
A simple solution to the problem is the red example in the attached image.
Sketch of ellipse radius
There, I take the distance from a centre point in each direction separately.
I am searching for a more advanced solution, like the blue example, where I get all the points within an elliptical search region.
EDIT:
I asked the same question, formulated a bit differently, on the Mathematics forum; maybe that helps you understand my problem better. https://math.stackexchange.com/questions/3635438/epsilon-neighbourhood-for-polar-data
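One way to express the elliptical search the blue example describes is to normalise each direction by its own epsilon before combining them. A minimal sketch, assuming observations are (azimuth in degrees, range) pairs; the names `eps_azimuth` and `eps_range` (the two semi-axes of the ellipse) are placeholders:

```python
import numpy as np

def in_elliptical_neighbourhood(points, centre, eps_azimuth, eps_range):
    """Elliptical epsilon-neighbourhood test directly in polar coordinates.

    points : (n, 2) array of (azimuth_deg, range) observations
    centre : (azimuth_deg, range) of the query point
    """
    d_az = points[:, 0] - centre[0]
    # wrap azimuth differences into [-180, 180) so 359 deg and 1 deg are close
    d_az = (d_az + 180.0) % 360.0 - 180.0
    d_r = points[:, 1] - centre[1]
    # inside the ellipse if the normalised quadratic form is <= 1
    return (d_az / eps_azimuth) ** 2 + (d_r / eps_range) ** 2 <= 1.0
```

The square root of that normalised quantity can also be used as a custom distance callable for clustering algorithms that accept one (with eps set to 1), so the data never has to leave polar coordinates.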
I have a 3D model of a 10km x 10km topographic map which I've imported from SketchUp. The model is just a bunch of X,Y,Z points (where +X is north and +Z is straight up, perpendicular to the ground).
I know the latitude/longitude values of the (0,0,0) point. So given an X,Y,Z point, how do I get its latitude/longitude values?
I need this to be pretty accurate, so you can't assume the Earth is a perfect sphere (you can, however, assume it's an ellipsoid).
For best accuracy you need to know what map projection the map was drawn in. You should be able to find that out from the map. For example, in the UK the Ordnance Survey maps use the OSGB36 datum and the Transverse Mercator projection. The projection tells you how to convert geographic coordinates (lat, long on the datum ellipsoid) to map coordinates (easting and northing) and how to do the reverse calculation, which is pretty much what you want.
If you don't know the projection, the next best thing would be to find out -- again from the map, they are often written on it -- the scale factor and convergence of the projection at some points on the map. The point is that there is usually a slowly spatially varying difference between map north (the direction the map's north axis points in) and true north (the direction of the north pole from a point, i.e. the direction of the latitude axis), and there is always a slowly spatially varying scale factor, the ratio of a distance in map coordinates to the true distance. Note that this is not the same thing as the scale of the printed map (an inch to a mile or whatever); it is a property of the projection.
Over a 10km square, it would be reasonably accurate to treat both the scale and convergence as constants. Then given an x,y point you compute the map bearing from 0,0 using
b = atan2(x,y)
and convert this to a true bearing by subtracting the convergence.
You also compute the map distance by
r = hypot(x,y)*S
where S is the scale of the map, e.g. if a change of 1 in the x coordinate represents a distance of 100 m, S is 100,
and convert r to a true distance by dividing by the scale-factor.
Finally you want to compute the lat,long at a given distance and bearing from a given point (the lat,long of 0,0). An accurate way to do this is to use Vincenty's formulae.
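A minimal sketch of that pipeline, using pyproj's Geod for the direct (destination) calculation instead of hand-coded Vincenty; the convergence, scale factor, grid unit S and origin lat/long are values you would read off your own map, and WGS84 is assumed as the ellipsoid (see the caveat below):

```python
from math import atan2, hypot, degrees
from pyproj import Geod

def xy_to_latlon(x, y, lat0, lon0, S=1.0, convergence_deg=0.0, scale_factor=1.0):
    """Convert a local map (x, y) offset to lat/long on the WGS84 ellipsoid.

    lat0, lon0      : latitude/longitude of the (0, 0) map origin
    S               : metres per map unit (e.g. 100 if one unit is 100 m)
    convergence_deg : grid convergence at the origin, read from the map
    scale_factor    : projection scale factor at the origin
    """
    b_map = degrees(atan2(x, y))       # map bearing, clockwise from map north
    b_true = b_map - convergence_deg   # correct to a true bearing
    r_map = hypot(x, y) * S            # distance in metres on the map grid
    r_true = r_map / scale_factor      # correct to a true ground distance

    # direct geodesic problem: start point, bearing, distance -> end point
    geod = Geod(ellps="WGS84")
    lon, lat, _ = geod.fwd(lon0, lat0, b_true, r_true)
    return lat, lon
```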
One thing to note here is that the scale and convergence, if quoted on the map, will be relative to the ellipsoid used in the construction of the map, so you will be computing lat,long coordinates for that ellipsoid.
Please help me solve a subproblem in my programming task (k-means clustering on a sphere).
Suppose the Earth is a sphere, and there are two points on it (we know their latitudes and longitudes) with masses m_1 and m_2.
The problem is to find the latitude and longitude of these two points' center of mass on the sphere, where distance is measured as the great-circle distance.
You want to find the point that lies on the great-circle arc at distance
l = L * m2 / (m1 + m2)
from the first point, where L is the full distance between the points (the center of mass lies closer to the heavier point).
You can use either
spherical linear interpolation: translate the spherical coordinates to Cartesian vectors, interpolate between the vectors, and translate back,
or
a geodesic approach: find the bearing from the first point to the second, find the distance L, and move the distance l along that bearing. All the formulae are on this page: Destination point given distance and bearing from start point
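A minimal sketch of the first (slerp) approach, assuming latitudes and longitudes in degrees on a unit sphere, and that the two points are neither identical nor antipodal:

```python
import numpy as np

def spherical_center_of_mass(lat1, lon1, m1, lat2, lon2, m2):
    """Point dividing the great-circle arc so its distance from point 1
    is L * m2 / (m1 + m2), computed via spherical linear interpolation."""
    def to_vec(lat, lon):
        lat, lon = np.radians([lat, lon])
        return np.array([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)])

    p1, p2 = to_vec(lat1, lon1), to_vec(lat2, lon2)
    omega = np.arccos(np.clip(np.dot(p1, p2), -1.0, 1.0))  # arc angle between points
    t = m2 / (m1 + m2)                                     # fraction of arc from point 1

    # spherical linear interpolation between the two unit vectors
    p = (np.sin((1 - t) * omega) * p1 + np.sin(t * omega) * p2) / np.sin(omega)

    lat = np.degrees(np.arcsin(p[2]))
    lon = np.degrees(np.arctan2(p[1], p[0]))
    return lat, lon
```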
As an input, I receive some planar, triangulated geometry. Now I need to compute the coordinates of the four corners of the bounding rectangle. Any ideas?
I'm going to assume that you mean 2D space in the question title, because everything else refers to 2D.
Go through all the vertices (x,y) in your geometry, and calculate the maximum and minimum of the x's, and the max and min of the y's.
Then the vertices of your bounding rectangle will be (min_x,min_y), (max_x,min_y), (max_x, max_y), and (min_x, max_y).
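A minimal sketch of that, assuming the vertices are available as an (n, 2) NumPy array:

```python
import numpy as np

def bounding_rectangle(vertices):
    """Axis-aligned bounding rectangle of a set of 2D vertices.

    vertices : (n, 2) array of (x, y) points
    Returns the four corners in counter-clockwise order.
    """
    min_x, min_y = vertices.min(axis=0)
    max_x, max_y = vertices.max(axis=0)
    return [(min_x, min_y), (max_x, min_y), (max_x, max_y), (min_x, max_y)]
```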