Google Elevation API resolution range and data sources

What is the minimum and maximum resolution of the elevation data from the Google Elevation API?
"A resolution value, indicating the maximum distance between data points from which the elevation was interpolated, in meters. This property will be missing if the resolution is not known. Note that elevation data becomes more coarse (larger resolution values) when multiple points are passed. To obtain the most accurate elevation value for a point, it should be queried independently." - https://developers.google.com/maps/documentation/elevation/intro
What data sources is it built from? SRTM1 (30 m, US)? SRTM3 (90 m, global)? LiDAR? The National Elevation Dataset (NED)?
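Per the documentation quoted above, resolution is reported per result and may be absent when unknown. A minimal sketch of reading it from an already-parsed response; the JSON shape follows the public documentation, but the helper function name and the sample values are made up for illustration:

```python
def extract_resolutions(response):
    """Return the resolution (metres) of each result, or None where unknown."""
    if response.get("status") != "OK":
        return []
    # `resolution` may be missing entirely when it is not known
    return [r.get("resolution") for r in response.get("results", [])]

# Illustrative response following the documented shape
sample_response = {
    "status": "OK",
    "results": [{
        "elevation": 1608.6,
        "location": {"lat": 39.739, "lng": -104.984},
        "resolution": 4.77,  # max distance (m) between interpolation data points
    }],
}

print(extract_resolutions(sample_response))  # → [4.77]
```

As the docs note, querying points one at a time tends to yield the smallest (best) resolution values.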

Related

Calculate 3D vector at 90 degrees on a plane

I am trying to calculate the width and height of a 3D object at different locations along this object.
I skeletonize it, then process it through skan to extract the network, which allows me to calculate the longest path.
I then want to calculate the width and height of my object at each point of the network, taking into consideration the location of the next point, as a 3D vector. Below is the kind of thing I want.
Points A and B are known, and I want to get a vector perpendicular to AB and rotate it around the AB axis step-wise, by something like 90 degrees.
After that, I should be able to handle the rest by adjusting the length of this second vector and checking whether the pixel is inside my object or not (my source image is a segmented image with square voxels).
From what I understand, I will need to use the dot product and/or cross product, but I am at a loss.
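The cross product does give you a vector perpendicular to AB, and Rodrigues' rotation formula rotates it about the AB axis step-wise. A minimal numpy sketch under those assumptions (the coordinates for A and B are made up):

```python
import numpy as np

def perpendicular_vector(d):
    # Cross d with any vector not parallel to it to get a perpendicular
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(d, helper)) > 0.9 * np.linalg.norm(d):
        helper = np.array([0.0, 1.0, 0.0])
    v = np.cross(d, helper)
    return v / np.linalg.norm(v)

def rotate_about_axis(v, axis, angle):
    # Rodrigues' rotation formula: rotate v by `angle` about `axis`
    k = axis / np.linalg.norm(axis)
    return (v * np.cos(angle)
            + np.cross(k, v) * np.sin(angle)
            + k * np.dot(k, v) * (1 - np.cos(angle)))

A = np.array([0.0, 0.0, 0.0])   # made-up network points
B = np.array([0.0, 0.0, 5.0])
d = B - A

v0 = perpendicular_vector(d)                # first probe direction
v90 = rotate_about_axis(v0, d, np.pi / 2)   # rotated 90 degrees about AB
```

Stepping the angle in smaller increments would let you probe the cross-section all the way around AB, scaling the probe vector until it leaves the segmented voxel mask.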

What type of model should I use?

I am trying to assess the influence of sex (nominal), altitude (nominal) and latitude (nominal) on corrected wing size (continuous; the residual of wing size on body mass) of an animal species. I treated altitude as a nominal factor because this particular species is mainly distributed at the extremes (low and high) of steep elevational gradients in my study area. I also treated latitude as a nominal fixed factor because I sampled individuals at only three main latitudinal levels (north, center and south).
It has been suggested that I use a linear mixed model for this analysis; specifically, with sex, altitude, latitude, sex:latitude, sex:altitude, and altitude:latitude as fixed factors, and collection site (nominal) as the random effect, the latter because of the clustered distribution of the collection sites.
However, I noticed that although corrected wing size follows a normal distribution, it violates the assumption of homoscedasticity between some altitudinal/latitudinal groups. I tried a non-parametric equivalent of factorial ANOVA (ARTool), but I cannot make it run because it does not allow missing data and it requires assessing all possible fixed factors and their interactions. I would appreciate any advice on what type of model I can use given the design of my data, and what software/package I can use to perform the analysis.
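For reference, the suggested mixed model (all two-way interactions as fixed effects, collection site as a random intercept) could be sketched in Python with statsmodels; the data frame and column names below are made up for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data with the described design
rng = np.random.default_rng(42)
n = 240
sites = [f"site_{i}" for i in range(8)]
df = pd.DataFrame({
    "sex": rng.choice(["F", "M"], n),
    "altitude": rng.choice(["low", "high"], n),
    "latitude": rng.choice(["north", "center", "south"], n),
    "site": rng.choice(sites, n),
})
site_effect = {s: rng.normal(scale=0.5) for s in sites}
df["wing_corrected"] = (
    df["site"].map(site_effect)          # clustered random site effect
    + (df["sex"] == "M") * 0.3           # small fixed sex effect
    + rng.normal(scale=1.0, size=n)      # residual noise
)

# Fixed effects: main effects plus the two-way interactions;
# random intercept for collection site.
model = smf.mixedlm(
    "wing_corrected ~ sex * altitude + sex * latitude + altitude * latitude",
    data=df,
    groups=df["site"],
)
result = model.fit()
print(result.summary())
```

If heteroscedasticity across the altitude/latitude groups remains a concern, common alternatives are robust standard errors or a model with a per-group variance structure (e.g. nlme::lme with varIdent in R).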
Thanks in advance for your kind attention.
Regards,

What is the default tolerance for ETRS89 in geospatial operations if the user does not specify any?

In MarkLogic I'm using cts.geospatialRegionQuery to search for documents that contain an (indexed) geometry that intersects the geometry I search with.
The geospatial region index uses etrs89/double as coordinate system. All geometries in the data have 9 decimal places.
According to the MarkLogic Geospatial search applications documentation:
[...] geospatial queries against single precision indexes are accurate to within 1 meter for geodetic coordinate systems.
I would, therefore, expect my queries to have sub-meter accuracy. However, I get search results from cts.geospatialRegionQuery containing geometries up to ~5 meters away from my search geometry. As far as I can see, the only possible reason is the tolerance option, which I'm not specifying yet and which therefore falls back to the default.
The documentation mentions that
If you do not explicitly set tolerance, MarkLogic uses the default tolerance appropriate for the coordinate system.
To ensure accuracy, MarkLogic enforces a minimum tolerance for each coordinate system.
This brings us to the actual question:
What is the default (and minimum) tolerance for the etrs89 coordinate system in MarkLogic?
EDIT:
Looked further into the issue with help from MarkLogic Support and found the cause of the low accuracy of my geospatial queries.
Before using cts.geospatialRegionQuery I parsed the search geometry with geo.parseWkt. This function does not allow you to explicitly set the coordinate system, and therefore uses the coordinate system set in the AppServer settings. By default this is single-precision wgs84, which led to a loss of 2-3 digits on my search geometry.
After setting the coordinate system to etrs89/double in the AppServer settings, geo.parseWkt no longer reduced the precision of the search geometry, and my geospatial queries had the expected 5 mm accuracy.
The default tolerance for the WGS84 and ETRS89 coordinate systems is 0.5 cm for double precision and 5 meters for single precision.
Closing the loop on this issue using feedback provided by MarkLogic support:
When setting up the query, geo.parseWkt was used to create the POINT, and as this function does not take a coordinate system or precision as options, the result was truncated to 8 significant digits by default. At the latitude they were working at, this reduced the precision from 0.5 cm to 5 m, leading to the observed results.
geo.parseWkt("POINT(5.176605744 52.045696539)");
Results in:
POINT(5.1766057 52.045697)
When using JavaScript the solution is to set the correct coordinate system in the AppServer, see https://docs.marklogic.com/guide/search-dev/geospatial#id_77035 and the following example (written in XQuery):
xquery version "1.0-ml";
import module namespace admin = "http://marklogic.com/xdmp/admin" at "/MarkLogic/admin.xqy";

let $config := admin:get-configuration()
let $groupid := admin:group-get-id($config, "Default")
return admin:save-configuration(
  admin:appserver-set-coordinate-system(
    $config,
    admin:appserver-get-id($config, $groupid, "App-Services"),
    "etrs89/double"))
Once this was done, the POINT created using geo.parseWkt had the correct level of precision.
With XQuery you can declare the coordinate system directly in the query:
declare option xdmp:coordinate-system "etrs89/double";

How to find the probability distribution of a user's check-ins?

I read a paper that mentioned that a user's check-in behavior follows a power-law distribution. I want to know how I can calculate the distribution of a user's check-in behavior.
This is the figure of the probability, and they said:
To obtain this measurement, we calculate the distances between all pairs of POIs that a user has checked in and plot a histogram (actually a probability density function) over the distance of POIs checked in by the same user. As shown in Figure 2, a significant percentage of POI pairs checked in by the same user appears to be within short distance, indicating a geographical clustering phenomenon in user check-in activities.
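As a sketch of that measurement: compute the great-circle distance between every pair of one user's check-in POIs, then histogram those distances (typically on log-log axes, where a power law appears as a straight line). The coordinates below are made up:

```python
import math
from itertools import combinations

def haversine_km(p, q):
    # Great-circle distance between two (lat, lon) pairs, in kilometres
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

# Hypothetical check-in locations (lat, lon) for a single user
checkins = [(52.370, 4.895), (52.372, 4.900), (52.520, 13.405)]

# Distances between all pairs of POIs this user has checked in at
distances = [haversine_km(p, q) for p, q in combinations(checkins, 2)]
```

Binning `distances` into a normalized histogram gives the empirical probability density; for an actual power-law fit (rather than eyeballing the log-log plot), a dedicated fitting method such as maximum likelihood is generally recommended.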

Orthographic projection of point to line in WGS84

I need to find points (from a rather small dataset) which are close enough to a polyline. All coordinates are WGS84.
I think of some r-tree thing to reduce the data to just a few candidates which then have to be checked in more detail.
While I managed to do this using "great circle" arithmetic, I am sure this is more precise than necessary, for the following reasons:
The segmentation of those polylines is quite high. A single segment of a polyline can be considered to be no longer than 10 km.
The points in question are not more than a few hundred meters away from segments.
The area in question is Europe, so the algorithm does not need to be valid for extreme (near-pole) conditions. Again: points don't need to be checked against the whole polyline (which could be hundreds of kilometers long); only the "nearby" segments need to be considered.
Do I need to transform the WGS84 coordinates to
some local Cartesian reference system, or
to a Mercator system?
Or can I even just calculate with "angle differences"? I know that this is just a matter of accuracy: I can accept an error below ~50 meters.
I highly appreciate your suggestions!
On how to measure distance from point to polyline:
You have to measure distances from all your points to all segments of the polyline.
See Distance from a point to a polygon
You can do this without converting coordinates to Cartesian (especially if the area is rather small, you don't mind a 50-meter error, and you don't need exact distances, just relative ones). See https://en.wikipedia.org/wiki/Decimal_degrees.
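For points a few hundred metres from segments no longer than ~10 km, a local equirectangular approximation (scaling longitude by cos(latitude)) keeps the error far below the stated 50 m tolerance at European latitudes. A sketch, with made-up coordinates in the usage:

```python
import math

EARTH_RADIUS_M = 6371000.0

def to_local_xy(lat, lon, lat0, lon0):
    # Equirectangular approximation around (lat0, lon0):
    # adequate for spans of a few tens of kilometres away from the poles
    x = math.radians(lon - lon0) * math.cos(math.radians(lat0)) * EARTH_RADIUS_M
    y = math.radians(lat - lat0) * EARTH_RADIUS_M
    return x, y

def point_segment_distance_m(p, a, b):
    # p, a, b are (lat, lon); project into a plane centred on segment start a
    lat0, lon0 = a
    px, py = to_local_xy(*p, lat0, lon0)
    bx, by = to_local_xy(*b, lat0, lon0)
    seg_len2 = bx * bx + by * by
    if seg_len2 == 0.0:
        t = 0.0  # degenerate segment: measure to the single point
    else:
        # Clamp the projection parameter to stay on the segment
        t = max(0.0, min(1.0, (px * bx + py * by) / seg_len2))
    cx, cy = t * bx, t * by
    return math.hypot(px - cx, py - cy)
```

Running each candidate (pre-filtered by the r-tree) through point_segment_distance_m against the nearby segments then gives the comparison you need.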
