how to calculate geographical coordinates / which geo system is this? - geospatial

I have a pair of geo coordinates I don't understand. Does anybody know this coordinate system and how to translate them to longitude and latitude in degrees?
Long. 7662.251 West, Lat. 9144.590 North
It has to be a position in Honduras (13-16° N and 83-89° W).

You really need to investigate the source of the data: files, printouts, local gurus, etc. Even if someone can find something with a stab in the dark, it's likely to be wrong in some tiny detail. Where did the pair of coordinates come from? A text file? An email? Were there any auxiliary files with the same file name but a different extension? In the same folder? Attached to the email? Who gave it to you? What sort of computer do they use? Are they still around to ask? Etc.
There's simply not enough information in this question to answer it. Tell us at least how you know that they are "geo coordinates", and how you know that they must be in Honduras.
Unfortunately, this level of cultural metadata is often about the best we have. As a massive stab in the dark, they might be projected coordinates in kilometres rather than metres, but they could be in feet, or anything else; only educated guesses can improve on this as it stands.

If it's not one of the common ones, such as UTM, search the EPSG registry for coordinate systems covering Honduras:
http://www.epsg-registry.org/
and see if any of them work out. It would be a lot easier if you had more coordinate pairs that are also in Honduras, because then we could work out an approximate scale.
I'll have a play and see if one of the UTM zones works out...
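For what it's worth, that trial-and-error can be automated. Here is a minimal sketch (assuming Python with pyproj, and assuming the pair really is an easting/northing in some projected CRS, which is itself a guess): it tries a few candidate EPSG codes for the area and a few unit scalings, and keeps any combination that unprojects into the 13-16° N / 83-89° W box from the question.

```python
# Rough, hedged sketch: the EPSG codes and unit scalings below are guesses,
# not a known answer for this data set.
from pyproj import Transformer

easting, northing = 7662.251, 9144.590   # the mystery pair (axis order is also a guess)

candidate_crs = ["EPSG:32616", "EPSG:32617",   # WGS84 / UTM zones 16N, 17N
                 "EPSG:26716", "EPSG:26717"]   # NAD27 / UTM zones 16N, 17N
candidate_scales = {"metres": 1.0, "kilometres": 1000.0, "feet": 0.3048}

for crs in candidate_crs:
    to_wgs84 = Transformer.from_crs(crs, "EPSG:4326", always_xy=True)
    for unit, scale in candidate_scales.items():
        lon, lat = to_wgs84.transform(easting * scale, northing * scale)
        if -89 <= lon <= -83 and 13 <= lat <= 16:   # the Honduras box from the question
            print(f"{crs}, values in {unit}: lat {lat:.4f}, lon {lon:.4f}")
```

If a second known point in Honduras turns up, the same loop can be used to check the implied scale, which is why more coordinate pairs would help so much.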

Related

3D bounding box - is there a simple method for making one?

Is there a simple/quick way to find the dimensions of a 3D axis-aligned bounding box for an item with rotations about the x, y, and z axes? I know that in math there are different methods/equations that arrive at the same answer, and I have previously described, with bare-bones information, a method that worked for me, which I probably didn't explain well.
I also know that there are (downloadable) functions available for some languages, like Java and Python, that help tackle parts of the math. That might not help those who want to understand the different methods better, or who don't have access to those particular functions in the language they are using.
So if a person has the following information:
The item's center point coordinate in x, y, z
The item's size in x, y, z
The item's rotation(s) about x, y, z
What would be the easiest approach for someone with that information, whether or not they use the item's center point, matrices, or cos and sin? Ideally the approach should also be compact enough to use as little memory as possible.
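One common way to compute this (a sketch, not necessarily the method the poster alluded to): build the rotation matrix from the Euler angles, take its element-wise absolute value, and multiply it by the half-size vector; that product is the half-extent of the axis-aligned box. The x-y-z rotation order below is an assumption.

```python
import numpy as np

def rotation_matrix(rx, ry, rz):
    """Compose rotations about x, y, z (radians), applied in x-y-z order (an assumption)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def aabb_of_rotated_box(center, size, rotation):
    """Axis-aligned bounding box of a box with the given center, size and Euler rotation.
    Uses the identity: AABB half-extent = |R| @ (size / 2)."""
    center = np.asarray(center, dtype=float)
    half = np.asarray(size, dtype=float) / 2.0
    R = rotation_matrix(*rotation)
    aabb_half = np.abs(R) @ half       # element-wise absolute value of the rotation matrix
    return center - aabb_half, center + aabb_half
```

Rotating all eight corners and taking per-axis min/max gives the same box; the |R| trick just gets there with a single matrix multiply and three stored values.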

Algorithm for finding an empty space that fits a rectangle that is closest to a target rectangle among other rectangles

Let's say you're placing rectangular tooltips on a screen of elements you want to provide information for. You want all of these tooltips to be visible at once, and no tooltip should cover any of the elements the other tooltips point to.
You want each tooltip to be as close to the item it's related to as feasible. What algorithm(s) exist to help solve this problem?
I've checked out R-trees, which seem to only help you find collisions but don't help with actually searching for free locations. I've found rectangle-packing algorithms, but they search for a position without an objective function (like "be as close as possible to this other element").
I can imagine an algorithm with some kind of physics simulation where each node and its tooltip are connected by a rubber band and the system is played out until equilibrium, but I'd think things could be calculated faster and with less complexity than that.
Any related algorithms or libraries would be helpful. Bonus points for a JavaScript library : )
You might investigate map labeling algorithms.
See, for example, these lecture notes by Roberto Tamassia at Brown:
PDF download.
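To make the map-labeling suggestion concrete, here is a minimal greedy sketch (an illustration under assumptions, not taken from the lecture notes): for each element, try a small set of candidate offsets around it, in order of preference, and keep the first candidate rectangle that overlaps neither the elements nor the tooltips already placed.

```python
def overlaps(a, b):
    """Axis-aligned rectangle overlap test; rectangles are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_tooltips(elements, tip_w, tip_h, gap=4):
    """Greedy candidate-position labeling.
    elements: list of (x, y, w, h) rectangles; returns one tooltip rect per element,
    or None where no free candidate position was found."""
    placed, result = [], []
    for ex, ey, ew, eh in elements:
        # Candidate positions: right, left, below, above the element, in that order.
        candidates = [(ex + ew + gap, ey), (ex - tip_w - gap, ey),
                      (ex, ey + eh + gap), (ex, ey - tip_h - gap)]
        choice = None
        for cx, cy in candidates:
            rect = (cx, cy, tip_w, tip_h)
            if not any(overlaps(rect, r) for r in placed + elements):
                choice = rect
                break
        if choice is not None:
            placed.append(choice)
        result.append(choice)
    return result
```

Production map-labeling algorithms add more candidate positions, a cost term for distance to the element, and backtracking or simulated annealing for the cases where the greedy pass fails.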

Fusion Tables: Polygon not displayed as of certain zoom level

I'm working on a map that shows different population statistics on a rather granular level in Berlin (447 sub-districts).
https://www.google.com/fusiontables/data?docid=1tIAPGaYK1iEWWLANQOupkAqCcPhVauMjdPS1qOs#map:id=3
For some reason, a small number of polygons (3) are not displayed once you zoom in to level 12 or higher.
Since the polygons are displayed at the previous zoom level, they should have the proper coordinates. I first thought the source files (KMLs provided by the local statistics authority) might be buggy, but that does not seem to be the case.
Can anybody explain to me why this happens?
Thank you very much!
Michael
There are two possibilities that I can think of:
it is a complexity problem or a winding-direction issue with the polygon. There is a thread on the Fusion Tables Users Group discussing this issue.
it is a complexity issue with the number of "features" on the tile. See Limits in the documentation; it used to be more clearly defined.
Reversing the winding direction of two of the problem polygons seems to fix the issue:
https://www.google.com/fusiontables/DataSource?snapid=S787935DQC4
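If you want to check or flip the winding yourself before re-uploading the KML, the shoelace formula gives a ring's orientation. This is a generic sketch in plain Python (the ring is assumed to be a list of (x, y) pairs without the closing point repeated), not Fusion Tables-specific code:

```python
def signed_area(ring):
    """Shoelace formula: positive for counter-clockwise rings, negative for clockwise."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        area += x1 * y2 - x2 * y1
    return area / 2.0

def ensure_counter_clockwise(ring):
    """Reverse the ring if it is wound clockwise."""
    return ring if signed_area(ring) >= 0 else list(reversed(ring))
```

Run it over each outer boundary (and reverse the test for holes, which conventionally wind the opposite way).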

Fixing an incorrectly taken 3D head scan

The problem I am facing is the following.
I have a number of 3D head scans. Some of them were taken correctly (like the attached example), but in many it is easy to see that the scanned person's head was not exactly aligned with the front of the machine, so one side of the texture (and depth map) appears "wider" (the exact reason is that one side was captured more from behind; it is easy to see if you look at the ears).
Fortunately, when I go from cylindrical coordinates to Cartesian ones and render the face with XNA, the face is symmetrical.
The thing is that I would like the texture and depth maps of all my heads to be as nice and symmetrical as the correct one (because later I want to align them and perform PCA).
The idea I have at the moment is that I could interpolate the surfaces between all of the vertices and, from those interpolations, take new vertices that are equally spaced from each other.
This solution seems like a lot of work and may be overkill.
Maybe there is some other way (like getting that interpolation data from DirectX/XNA, which has to calculate it at some point anyway).
I will be most thankful for helpful answers.
The correct example:
http://i55.tinypic.com/332mio2.jpg
Incorrect example:
http://i54.tinypic.com/309ujvt.jpg
It's probably possible to salvage (some of) the bad scans to some degree using coordinate transformations, but you would have to guess how far off the alignment was, and it's probably impossible to do automatically.
But unless the original subject is dead (or otherwise unavailable), it's probably a lot easier to redo the scans.
Making another scan is very likely to be quicker, and you won't lose quality the way transforming the bad scans probably will. In the incorrect sample the nose appears to shadow one of its sides, and no fancy algorithm can ever fix that missing data.
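If re-scanning really is not an option, the resampling idea from the question could be sketched roughly like this (an untested illustration under assumptions, using NumPy): treat each horizontal slice of the head as a polyline and linearly resample it so the new vertices are equally spaced along the curve.

```python
import numpy as np

def resample_equal_arclength(points, n_samples):
    """Resample a polyline (one horizontal slice of the head) so that
    consecutive points are equally spaced along the curve.
    points: (N, 2) or (N, 3) array of vertices along the slice."""
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)   # segment lengths
    s = np.concatenate([[0.0], np.cumsum(seg)])             # cumulative arc length
    target = np.linspace(0.0, s[-1], n_samples)             # equally spaced positions
    return np.column_stack([np.interp(target, s, points[:, k])
                            for k in range(points.shape[1])])
```

Note this only evens out the vertex spacing; it cannot recreate the data lost behind occlusions, which is the caution in the answer above.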

How can I select layer by location AND attributes in ArcMap?

I have a dataset (i.e. a shapefile) containing spatial location data (coordinates) and elevation data as well as other attribute fields.
I want to select points which have at least 200m vertical separation (i.e. are at least 200m apart on the z-axis) AND are within 3km of each other.
The aim is to create a new shapefile with all points that have this relationship with 1 or more other points.
I'm sure there is a solution to this problem (maybe not using ArcMap at all?), but I just can't find it. Any help would be greatly appreciated.
Chris
You are going to have much better luck asking this question on gis.stackexchange.com. There are many more ESRI users/programmers there. As a matter of fact, I bet you'll find your solution there without even having to ask the question.
You can run the ArcGIS Near tool on all the points.
Then select by attributes the points with Z values of >200 m and distance values of <3000 m.
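If you would rather try it outside ArcMap entirely, a brute-force pairwise check is easy to sketch (assuming the coordinates are projected in metres; fine for a few thousand points, too slow far beyond that):

```python
import numpy as np

def select_pairs(xy, z, max_dist=3000.0, min_dz=200.0):
    """Return indices of points that have at least one other point
    within max_dist horizontally AND at least min_dz apart vertically.
    xy: (N, 2) projected coordinates in metres; z: (N,) elevations in metres."""
    xy = np.asarray(xy, dtype=float)
    z = np.asarray(z, dtype=float)
    d_xy = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)  # pairwise horizontal distances
    d_z = np.abs(z[:, None] - z[None, :])                            # pairwise vertical separations
    mask = (d_xy <= max_dist) & (d_z >= min_dz)
    np.fill_diagonal(mask, False)                                    # ignore self-pairs
    return np.where(mask.any(axis=1))[0]
```

The returned indices are the points to copy into the new shapefile; for larger datasets a spatial index (e.g. a k-d tree) would replace the full distance matrix.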
