Contour detection on sketched lines - graphics

I have a set of curves as input, represented as a list of point coordinates.
I want to merge them into one, more or less beautiful, curve.
Any idea how to do that?
UPDATE:
It should work on a single curve like that:
The most important aspect is to make the resulting curve nice and to suppress the drawing errors.

You have zero noise in your image, that is, there are no salt-and-pepper artefacts, just somewhat curvy lines. The easiest way to merge them, if there are not that many gaps, is to use a blur or a morphological dilation to connect the parts together (both may distort the shape a bit) and then use findContours().
If there are larger gaps you have to use a convex hull on the convex parts and then on the concave residuals. A snake or active-contour algorithm is probably overkill in this situation.
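A minimal OpenCV sketch of this approach in Python (my illustration, not part of the original answer; the file name sketch.png and the 7x7 kernel are assumptions, and the right kernel size depends on how large the gaps are):

    import cv2

    # Assumed input: white strokes on a black background.
    sketch = cv2.imread("sketch.png", cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(sketch, 127, 255, cv2.THRESH_BINARY)

    # Dilation bridges small gaps between strokes (it also thickens the shape slightly).
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    merged = cv2.dilate(binary, kernel, iterations=1)

    # Extract the outer contour of the merged strokes (OpenCV >= 4 return signature).
    contours, _ = cv2.findContours(merged, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    outline = max(contours, key=cv2.contourArea)    # keep the largest outline

    # Optionally approximate the contour to suppress small drawing errors.
    smooth = cv2.approxPolyDP(outline, 2.0, True)   # 2.0 px tolerance is a guess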

Related

Convex hull of parallel lines

I have arbitrary many lines in 3D space which are all parallel to each other. Now I want to find the convex hull of these lines. To illustrate this, I've drawn a picture:
I know the start- and endpoints of all lines (the blue dots). The lines are not equally long. If a viewer looks in the direction of the lines (marked as the viewer direction in the picture), he sees only the dots. Now I want to find the convex hull of these dots. Hopefully it's clear what I mean.
My idea was to project the start or endpoints on a plane which is perpendicular to the line's direction. After that I can apply some kind of convex hull algorithm to these points. But I have no idea how.
Your idea is exactly correct. One way to accomplish this is to define a vector v along your viewing direction, and then rotate v to the z-axis. The same rotation will convert lines to vertical lines. Then drop the z-coordinate of the endpoints to get your projected points. Then compute the convex hull. There are hull algorithms all over the web, including my own here.
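As a rough sketch of that recipe (my own code, with numpy assumed; the 2-D hull step here is Andrew's monotone chain rather than the algorithm linked above):

    import numpy as np

    def hull_of_parallel_lines(endpoints, direction):
        """Project 3-D segment endpoints onto a plane perpendicular to `direction`
        and return the 2-D convex hull of the projections."""
        d = np.asarray(direction, dtype=float)
        d /= np.linalg.norm(d)
        # Orthonormal basis (u, w) of the plane perpendicular to d.
        helper = np.array([1.0, 0.0, 0.0]) if abs(d[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
        u = np.cross(d, helper)
        u /= np.linalg.norm(u)
        w = np.cross(d, u)
        pts = sorted({(float(np.dot(p, u)), float(np.dot(p, w)))
                      for p in np.asarray(endpoints, dtype=float)})
        if len(pts) <= 2:
            return pts

        def cross(o, a, b):      # z-component of (a - o) x (b - o)
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

        lower, upper = [], []
        for p in pts:
            while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
                lower.pop()
            lower.append(p)
        for p in reversed(pts):
            while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
                upper.pop()
            upper.append(p)
        return lower[:-1] + upper[:-1]   # counter-clockwise hull, no duplicated endpoints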
Here's a suggestion based on the calculus of variations.
Consider enclosing your collection of parallel line segments in a simple closed curve that minimizes the enclosed area, subject to the constraint that it encloses all of your segments.
Your "curve" is going to be piecewise linear, so you might be able to use a piecewise-linear basis function in the iterations, though it's possible that you could run into singularities when the algorithm needs to drop a segment.

smooth curve through points, using only horizontal, vertical lines and fixed-radius arcs

Given an ordered list of points, I want to draw a smooth curve that passes through all of them. Each part of the curve can either be horizontal, vertical, or an arc with given radius r (all arcs will have the same radius). The transitions should be smooth, i.e., the heading at the end of one part should be the same as the heading at the beginning of the next part. There can be any number of arcs or straight line segments between any two consecutive input points.
It's sort of like a train track that should run orthogonally or along sections with fixed curvature.
Is there a good algorithm to construct such a curve? (Or, in cases where such a curve is not possible, I would like to know that.)
I looked into Bezier curves, but that seems like overkill and I couldn't find a good way to enforce my constraints.
What you are asking for implies to me that you seek tangent continuity of your curve across the points (similar to a spline with tangent continuity at its knots). The train-track analogy at least conveys this requirement. Given the strict limitation to straight lines and fixed-radius circular arcs, I am fairly certain that you will not be able to do this in general. Why not consider a spline interpolation of your points instead, if you require such smoothness?
To see why consider the following image:
Consider replacing the line segment between B and C with a circular arc. You can do it so that the join is continuous, but to make it tangent continuous you would need a great deal of good fortune, as there is only one circle that is tangent to the segment AB at B and also passes through C. The chances of that circle's tangent at C matching the tangent of line CD are remote. It is possible that your data will line up like this, but you cannot rely on it.
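To make that concrete, here is a small Python sketch (my own illustration, with made-up point tuples) of the one circle that is tangent to AB at B and passes through C; comparing its tangent at C with the direction of CD shows whether you got lucky:

    import math

    def tangent_circle(a, b, c):
        """Centre and radius of the unique circle tangent to AB at B that passes
        through C (assumes C does not lie on the line AB)."""
        abx, aby = b[0] - a[0], b[1] - a[1]
        norm = math.hypot(abx, aby)
        nx, ny = -aby / norm, abx / norm          # unit normal to AB
        bcx, bcy = c[0] - b[0], c[1] - b[1]
        t = (bcx * bcx + bcy * bcy) / (2.0 * (nx * bcx + ny * bcy))
        return (b[0] + t * nx, b[1] + t * ny), abs(t)

    def unit_tangent(point, centre):
        """Unit tangent of the circle at `point` (orientation is arbitrary)."""
        rx, ry = point[0] - centre[0], point[1] - centre[1]
        norm = math.hypot(rx, ry)
        return (-ry / norm, rx / norm)

    # Example with arbitrary points: the arc's tangent at C rarely lines up with CD.
    A, B, C, D = (0.0, 0.0), (1.0, 0.0), (2.0, 1.0), (3.0, 1.2)
    centre, radius = tangent_circle(A, B, C)
    tx, ty = unit_tangent(C, centre)
    dx, dy = D[0] - C[0], D[1] - C[1]
    aligned = abs(tx * dy - ty * dx) / math.hypot(dx, dy) < 1e-9   # almost never true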
If I have misunderstood your question please let me know and I will adjust the answer.

circle drawing algorithm for n-pixel border

I know the Bresenham and related algorithms, and I found a good algorithm to draw a circle with a 1-pixel wide border. Is there any 'standard' algorithm to draw a circle with an n-pixel wide border, without resorting to drawing n circles?
Drawing each pixel and the n^2 pixels surrounding it might be a solution, but it draws many more pixels than needed.
I am writing a graphics library for an embedded system, so I am not looking for a way to do this using an existing library, although a library that does this function and is open source might be a lead.
Compute the points of a single octant for both radii at the same time and replicate them eight ways, which is how Bresenham circles are usually drawn anyway. To avoid overdrawing (e.g., for XOR drawing), the second octant should be constrained to draw outside the first octant's x-extents.
Note that this approach breaks down if the line is very thick compared to the radius.
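A rough Python sketch of this octant-pair idea (my own code; set_pixel(x, y) is a hypothetical callback into your library). It walks one octant of both radii with the integer midpoint recurrence and fills the horizontal span between them, mirrored eight ways; the fall-back to the diagonal is meant to cover the case where the ring is thick and the inner octant ends early:

    def octant_xs(r):
        """Yield the midpoint-circle x for y = 0, 1, 2, ... within the octant y <= x."""
        x, y, d = r, 0, 1 - r
        while y <= x:
            yield x
            y += 1
            if d < 0:
                d += 2 * y + 1
            else:
                x -= 1
                d += 2 * (y - x) + 1

    def draw_ring(cx, cy, r_inner, r_outer, set_pixel):
        """Fill the ring between r_inner and r_outer around (cx, cy)."""
        inner = octant_xs(r_inner)
        for y, xo in enumerate(octant_xs(r_outer)):
            xi = next(inner, None)                 # None once the inner octant ends
            x0 = xi if xi is not None else y       # fall back to the diagonal
            for x in range(x0, xo + 1):
                # Eight-way symmetry; the set drops duplicates on the diagonals.
                for px, py in {(x, y), (y, x), (-x, y), (-y, x),
                               (x, -y), (y, -x), (-x, -y), (-y, -x)}:
                    set_pixel(cx + px, cy + py)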
Treat it as a rasterization problem:
Take the bounding box of your annulus.
Consider the image rows falling in the bounding box.
For each row, compute the intersection with the two circles (i.e. solve x^2 + y^2 = r^2, so x = ±sqrt(r^2 - y^2) for each circle, with x and y relative to the circle centre).
Fill in the spans. Repeat for next row.
This approach generalizes to all sorts of shapes, can produce sub-pixel coordinates useful for anti-aliasing and scales better with increasing resolution than hacky solutions involving multiple shifted draws.
If the sqrt looks scary for an embedded system, bear in mind there are fast approximate algorithms which would probably be good enough, especially if you're rounding off to the nearest pixel.
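A Python sketch of this row-by-row approach (my own code; set_pixel(x, y) is again a hypothetical callback, the centre and radii are assumed to be integers, and rounding/anti-aliasing details are left out):

    import math

    def fill_ring(cx, cy, r_inner, r_outer, set_pixel):
        """For every row of the bounding box, intersect it with both circles and
        fill the resulting spans (two spans where the row crosses the hole)."""
        for y in range(cy - r_outer, cy + r_outer + 1):
            dy = y - cy
            half_out = math.sqrt(r_outer * r_outer - dy * dy)
            left, right = int(math.ceil(cx - half_out)), int(math.floor(cx + half_out))
            if abs(dy) <= r_inner:
                half_in = math.sqrt(r_inner * r_inner - dy * dy)
                spans = [(left, int(math.floor(cx - half_in))),    # left of the hole
                         (int(math.ceil(cx + half_in)), right)]    # right of the hole
            else:
                spans = [(left, right)]                            # row misses the hole
            for x0, x1 in spans:
                for x in range(x0, x1 + 1):
                    set_pixel(x, y)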

Creating closed spatial polygons

I need to create a (large) set of spatial polygons for test purposes. Is there an algorithm that will create a randomly shaped polygon staying within a bounding envelope? I'm using OGC Simple Features, so a routine that creates the well-known text is the most useful. Language of choice is C#, but it's not that important.
Here you can find two examples of how to generate random convex polygons. They are both in Java, but it should be easy to rewrite them in C#:
Generate Polygon example from Sun
from JTS mailing list, post Minimum Area bounding box by Michael Bedward
Another possible approach is based on generating a set of random points and employing Delaunay tessellation.
In general, the problem of generating proper random polygons is not trivial.
Do they really need to be random, or would some real WKT do? Because if it would, just go to http://koordinates.com/ and download a few layers.
What shape is your bounding envelope? If it's a rectangle, then generate your random polygon as a list of points within [0,1]x[0,1] and scale to the size of your rectangle.
If the envelope is not a rectangle things get a little more tricky. In this case you might get the best performance by simply generating points inside the unit square and rejecting any which lie in the part of the unit square that does not scale to the bounding envelope of your choice.
HTH
Mark
Supplement
If you wanted only convex polygons you'd use one of the convex hull algorithms. Since you don't seem to want only convex polygons your suggestion of a circular sweep would work.
But you might find it simpler to sweep along a line parallel to either the x- or y-axis. Assume the x-axis.
Sort the points into x-order.
Select the leftmost (i.e. first) point. At the y-coordinate of this point draw an imaginary horizontal line across the unit square. Prepare to create a list of points along the boundary of the polygon above the imaginary line, and another list along the boundary below it.
Select the next point. Add it to the upper or lower boundary list as determined by its y-coordinate.
Continue until you're out of points.
This will generate convex and non-convex polygons, but the non-convexity will be of a fairly limited form. No inlets or twists and turns.
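A Python sketch of this sweep (my own variant: it splits along the chord from the leftmost to the rightmost point rather than a horizontal line, which guarantees a simple, x-monotone polygon):

    import random

    def random_polygon(n, width, height):
        """n random points in the unit square, joined into a simple polygon and
        scaled to a width x height rectangle (n >= 3)."""
        pts = sorted((random.random(), random.random()) for _ in range(n))
        first, last = pts[0], pts[-1]

        def above(p):   # is p above the chord from `first` to `last`?
            return ((last[0] - first[0]) * (p[1] - first[1])
                    - (last[1] - first[1]) * (p[0] - first[0])) > 0

        upper = [p for p in pts[1:-1] if above(p)]        # left to right
        lower = [p for p in pts[1:-1] if not above(p)]    # left to right
        ring = [first] + upper + [last] + lower[::-1]     # out along the top, back along the bottom
        return [(x * width, y * height) for x, y in ring]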
Another Thought
To avoid edge crossings, and to avoid a circular sweep after generating your random points inside the unit square, you could:
Generate random points inside the unit circle in polar coordinates, i.e. (r, theta).
Sort the points in theta order.
Transform to cartesian coordinates.
Scale the unit circle to a bounding ellipse of your choice.
Off the top of my head, that seems to work OK.
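A Python sketch of this polar-coordinate recipe (my own code); sorting by theta makes the polygon star-shaped around the centre, hence simple:

    import math
    import random

    def random_star_polygon(n, cx, cy, rx, ry):
        """n random (theta, r) points sorted by angle, mapped into the bounding
        ellipse with centre (cx, cy) and semi-axes rx, ry."""
        pts = sorted((random.uniform(0.0, 2.0 * math.pi), random.random()) for _ in range(n))
        return [(cx + r * rx * math.cos(t), cy + r * ry * math.sin(t)) for t, r in pts]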

Minimize Polygon Vertices

What is a good algorithm for reducing the number of vertices in a polygon without changing the way it looks very much?
Input: A polygon, represented as a list of points, with way too many vertices: raw input from the mouse, for example.
Output: A polygon with far fewer vertices that still looks a lot like the original: something usable for collision detection, for example (not necessarily convex).
Edit: The solution to this would be similar to finding a multi-segmented line of best fit on a graph. It's called Segmented Least Squares in my algorithms book.
Edit2: The Douglas-Peucker algorithm is what I really want.
Edit: Oh look, Simplifying Polygons
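For reference, a compact Python sketch of Douglas-Peucker for an open chain of points (my own code; for a closed polygon, split it into two chains first, e.g. at the two mutually farthest vertices):

    import math

    def douglas_peucker(points, tolerance):
        """Keep the point farthest from the chord between the endpoints if its
        distance exceeds `tolerance`, and recurse on both halves."""
        if len(points) < 3:
            return list(points)
        (x1, y1), (x2, y2) = points[0], points[-1]
        dx, dy = x2 - x1, y2 - y1
        length = math.hypot(dx, dy)
        if length == 0.0:       # endpoints coincide: fall back to point distance
            dists = [math.hypot(x - x1, y - y1) for x, y in points[1:-1]]
        else:
            dists = [abs(dy * (x - x1) - dx * (y - y1)) / length for x, y in points[1:-1]]
        i = max(range(len(dists)), key=dists.__getitem__) + 1
        if dists[i - 1] <= tolerance:
            return [points[0], points[-1]]
        return douglas_peucker(points[: i + 1], tolerance)[:-1] + \
               douglas_peucker(points[i:], tolerance)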
You mentioned collision detection. You could go really simple and calculate a bounding convex hull around it.
If you care about the concave areas, you can calculate a concave hull by taking the centroid of your polygon and choosing a point to start from. From the starting point, rotate around the centroid, finding each vertex you want to keep and assigning it as the next vertex of the bounding hull. The complexity of the algorithm comes from how you determine which vertices to keep, but I'm sure you've thought of that already. You can throw all your vertices into buckets based on their location relative to the centroid. When a bucket gets more than an arbitrary number of vertices, you can split it. Then take the mean of the vertices in that bucket as the vertex to use in your bounding hull. Or, forget the buckets, and when you're moving around the centroid, only choose a point if it's more than a given distance from the last point.
Actually, you could probably just use all the vertices of your polygon as a "cloud of points" and calculate the concave hull around that. I'll look for an algorithm link. The worst case for this would be a completely convex polygon.
Another alternative is to start with a bounding rectangle. For each vertex on the rectangle, find the distance from that point to the polygon. For the farthest vertex, split it into two vertices and move them inward a little. Repeat until some target proportion of either vertices or area is met. I'd have to think about the details of this one a little more.
If you care about the polygon actually looking similar, even in the case of a self-intersecting polygon, then another approach would be required, but it doesn't sound like that's necessary since you asked about collision detection.
This post has some details about the convex hull part.
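A rough sketch of the angular-bucket idea (my own reading of the answer above; it assumes the polygon is roughly star-shaped around its centroid):

    import math

    def radial_bucket_simplify(polygon, n_buckets=32):
        """Group vertices by their angle around the centroid and replace each
        non-empty bucket with the mean of its vertices."""
        cx = sum(x for x, _ in polygon) / len(polygon)
        cy = sum(y for _, y in polygon) / len(polygon)
        buckets = [[] for _ in range(n_buckets)]
        for x, y in polygon:
            a = math.atan2(y - cy, x - cx) + math.pi          # 0 .. 2*pi
            buckets[int(a / (2.0 * math.pi) * n_buckets) % n_buckets].append((x, y))
        return [(sum(x for x, _ in b) / len(b), sum(y for _, y in b) / len(b))
                for b in buckets if b]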
There's a lot of material out there. Just google for things like "mesh reduction", "mesh simplification", "mesh optimization", etc.
