I have a polydata that looks like this:
What I want to obtain is something smoother, like this (edited in Paint for demonstration purposes):
So far I've tried the following filters:
vtkWindowedSincPolyDataFilter
vtkSmoothPolyDataFilter
However, the closest I got was with the first one, with a result like this:
Is there any filter or strategy in VTK that would allow me to reach something really close to the second picture?
Thanks in advance.
I suggest you play with the convergence and iterations parameters of vtkSmoothPolyDataFilter to achieve the optimal result for a single application of that filter. If that is not satisfying, why not apply it multiple times, one after the other? That is what I would do if I had this problem on my hands.
Another solution could be to generate a binary vtkImageData from the polydata using vtkPolyDataToImageStencil, smooth the image with something like vtkImageGaussianSmooth, and then go back to the polydata world using vtkMarchingCubes.
You'll need to tweak some parameters for each filter, but that should work and give you more control over the smoothing.
I'm playing with some mesh operations.
Suppose we have two meshes: one a human head and the other a human body. We just have to add the head to the body so that the end result is one single complete human body mesh.
I think it can be done using Python scripting in Blender, but I'm not much of an expert in Blender scripting. Maybe another Python library would be useful. Please recommend a way forward.
I tried the join operation in Blender, but it doesn't work as expected, because we want to join the two meshes at a specific location, i.e. the neck.
Boolean operations on meshes are a new feature of Open3D v0.16; you could take a look at it:
http://www.open3d.org/blog/
I'm having trouble finding a way to solve this specific problem using MeshLab.
As you can see in the figure, the mesh I'm working with has cracks in certain areas, and I would like to close them. The "close holes" option does not seem to work because, these being technically cracks rather than holes, it seems unable to weld them.
I managed to get a good result using the "Screened Poisson Surface Reconstruction" option, but that operation rebuilds the whole mesh topology, so I would lose all the information about the mesh's UVs (and I cannot afford to lose them).
I need advice on the best method to weld these cracks without changing the vertices that are not along them, adding only the geometry needed to close the mesh (or, ideally, welding using the existing vertices along the crack edges).
Thanks in advance!
As answered by A.Comer in a comment to the main question, I was able to get the desired result simply by playing a bit with the parameters of the "close holes" tool.
Just for the sake of completeness, here is a copy of the comment:
The close holes option should be able to handle this. Did you try changing the max size for that filter to a much larger number? Do filters >> selection >> select border and put the number of selected faces as the max size into that filter – A.Comer
I'm attempting to evolve optimal strategies for the Iterated Prisoner's Dilemma using a basic genetic algorithm (Stochastic Universal Sampling, 1-point crossover, Canonical GA). I've implemented this algorithm in Haskell and recently added chart output. Unfortunately the graphs produced don't fit the expected pattern for this problem so it appears I have a bug.
All graphs of fitnesses I have seen for this problem look something like this:
Other examples can be seen in On Evolving Robust Strategies for Iterated Prisoner's Dilemma, P.J. Darwen and X. Yao (1993), pp. 6-7.
However my output looks like this:
If I set mutation rate to 1 I get:
Perhaps this suggests that my selection function is not quite as random as I had thought, since the graph implies a homogeneous population.
My code is in this git repository should you wish to inspect it.
Now for the question: Could any of you suggest what I might be doing wrong in my GA implementation to make the graph look like this?
e.g. I would assume it is unlikely to be the fitness function, since I use the same fitness function both for the output and for what the GA is maximising; even if the fitness function is wrong in some way, the GA will still be maximising that wrong function (though I'm sure I could be wrong here; I'm rather new to genetic algorithms).
I would just like suggestions for which functions to look at, I'm tearing my hair out trying to fix this.
EDIT: Having added some debug code to my combine function it seems that it is always being passed the same individuals (even with mutation set to 1) so presumably selection is going wrong somewhere.
EDIT: Selection was going wrong, but that wasn't causing all the problems, just homogeneity in the population.
You have a function maybeFlip, which will change an allele to its opposite with a given probability. Hence, when the mutation rate is 1, you will just keep flipping all the alleles back and forth between two opposites. This explains the zig-zag pattern seen in your graph.
Also, swap is in Data.Tuple :)
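To see why a mutation rate of 1 produces the zig-zag, here is a small Python stand-in for maybeFlip (the names are illustrative, not your actual Haskell code):

```python
import random

def maybe_flip(allele, rate, rng):
    # Flip the allele to its opposite with probability `rate`.
    return 1 - allele if rng.random() < rate else allele

def mutate(genome, rate, rng):
    return [maybe_flip(a, rate, rng) for a in genome]

rng = random.Random(0)
genome = [0, 1, 1, 0]
# With rate=1 every allele flips every generation, so the population
# just oscillates between a genome and its complement:
gen1 = mutate(genome, 1.0, rng)   # [1, 0, 0, 1]
gen2 = mutate(gen1, 1.0, rng)     # [0, 1, 1, 0] again
```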
I have 5 recorded wav files. I want to compare the new incoming recordings with these files and determine which one it resembles most.
In the final product I need to implement it in C++ on Linux, but now I am experimenting in Matlab. I can see FFT plots very easily. But I don't know how to compare them.
How can I compute the similarity of two FFT plots?
Edit: There is only speech in the recordings. Actually, I am trying to identify the response of answering machines of a few telecom companies. It's enough to distinguish two messages "this person can not be reached at the moment" and "this number is not used anymore"
This depends a lot on your definition of "resembles most"; depending on your use case it can mean many things. If you just want to compare the bare spectra of the whole files, you can simply correlate the values returned by the two FFTs.
However, spectra tend to change a lot when the files get warped in time. To deal with this, you need to do a windowed FFT and compare the spectra for each window. That defines a difference function you can then use in a dynamic time warping algorithm.
If you need perceptual resemblance, an FFT probably does not get you what you need; MFCCs of the recordings are most likely much closer to this problem. Again, you might need to calculate windowed MFCCs instead of MFCCs of the whole recording.
If you have musical recordings, you need completely different approaches again. There is a blog posting that describes how Shazam works, so you might be able to find it on Google. Or if you want real musical similarity, have a look at this book.
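To illustrate the first two ideas, a small NumPy sketch: whole-file spectrum correlation plus a bare-bones windowed FFT (the window and hop sizes are arbitrary choices to tune):

```python
import numpy as np

def spectrum_correlation(a, b):
    # Normalised correlation of the whole-file magnitude spectra.
    fa, fb = np.abs(np.fft.rfft(a)), np.abs(np.fft.rfft(b))
    n = min(len(fa), len(fb))
    fa, fb = fa[:n], fb[:n]
    return float(np.dot(fa, fb) /
                 (np.linalg.norm(fa) * np.linalg.norm(fb) + 1e-12))

def windowed_spectra(signal, win=256, hop=128):
    # Magnitude spectra of overlapping Hann windows (a bare-bones STFT).
    # Each row is one window; compare rows pairwise for the DTW cost.
    frames = [signal[i:i + win] * np.hanning(win)
              for i in range(0, len(signal) - win + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))

fs = 8000
t = np.linspace(0, 1, fs, endpoint=False)
a = np.sin(2 * np.pi * 440 * t)
b = np.sin(2 * np.pi * 440 * t + 0.3)   # same tone, shifted phase
c = np.sin(2 * np.pi * 880 * t)         # a different tone
# a and b have nearly identical magnitude spectra; c does not.
```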
EDIT:
The best solution for the problem specified above would be the one described here (the "Shazam algorithm" mentioned above). It is, however, a bit complicated to implement, and an easier solution might do well enough.
If you know that there are only 5 possible incoming files, I would suggest first trying something as easy as the Euclidean distance between the two signals (in the time or Fourier domain). It is likely to give you good results.
Edit: To handle different possible start offsets, try doing a cross-correlation and see which file has the highest peak.
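A hedged NumPy sketch of that last idea: cross-correlate the incoming signal against each reference and pick the one with the highest peak (the normalisation step is my own addition, so that louder recordings don't automatically win):

```python
import numpy as np

def best_match(incoming, references):
    # Return the index of the reference with the highest
    # cross-correlation peak against the incoming signal.
    peaks = []
    a = (incoming - incoming.mean()) / (incoming.std() + 1e-12)
    for ref in references:
        b = (ref - ref.mean()) / (ref.std() + 1e-12)
        # mode="full" scans all relative offsets, so a late start
        # in the recording still lines up at some lag.
        peaks.append(np.max(np.correlate(a, b, mode="full")))
    return int(np.argmax(peaks))

rng = np.random.default_rng(0)
template = rng.standard_normal(500)                   # a known message
shifted = np.concatenate([np.zeros(100), template])   # same message, later start
other = rng.standard_normal(600)                      # an unrelated message
match = best_match(shifted, [template, other])        # picks index 0
```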
I suggest you compute a simple sound parameter like the fundamental frequency. There are several methods of getting this value; I tried autocorrelation and the cepstrum, and for voice signals they worked fine. With such a function working, you can do a time analysis and compare two signals (the base, which you compare against, and the incoming one, which you want to match) over a given frequency interval. Comparing several intervals by this criterion can tell you which base sample matches best.
Of course everything depends on what you mean by "resembles most". To compare the signals you can introduce other parameters like volume, noise, clicks, pitch...
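A rough NumPy sketch of the autocorrelation-based fundamental frequency estimate (the peak-picking here is deliberately naive; real voice signals would need more care):

```python
import numpy as np

def fundamental_freq(signal, fs):
    # Estimate F0 from the strongest autocorrelation peak after lag 0.
    sig = signal - signal.mean()
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]  # lags 0..N-1
    # Skip the lag-0 peak: move past the initial decay, then take
    # the strongest remaining peak.
    d = np.diff(ac)
    start = int(np.argmax(d > 0))        # first rising point after lag 0
    lag = start + int(np.argmax(ac[start:]))
    return fs / lag

fs = 4000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 200 * t)
f0 = fundamental_freq(tone, fs)          # roughly 200 Hz for a 200 Hz sine
```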
I need to create a 3000x3000 bit map for the coordinates of my robots. In theory I have an MxM array, M=3000, and if my robot sees something then at coords [5][5], for example, I put 1; if it sees nothing, 0.
When I tried to create int[][] b = new int[3000][3000]
I got an OutOfMemoryError.
I tried to use RMS, but I could create 3000 rows and only 50 columns.
I'm thinking of using a text file, but I need custom updates, and working with text files is very hard in J2ME.
Thanks for any replies!
Some approaches:

- Store your coordinates in a file, and load and update into memory only those rows/columns of data that surround the robot (maybe a 10x10 matrix). Buffering.
- Use a quadtree to store your coordinates. You may have to use the external-file approach here too, but maybe you can think of something better.
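One more option worth noting: packed one bit per cell, a 3000x3000 map is only about 1.1 MB, versus roughly 36 MB for int[3000][3000], which may already fit in memory. A Python sketch of the packing (in J2ME you would do the same thing with a byte[]):

```python
class BitGrid:
    # 3000x3000 grid stored one bit per cell in a flat byte buffer:
    # 9,000,000 bits = 1,125,000 bytes, instead of ~36 MB of ints.
    def __init__(self, size=3000):
        self.size = size
        self.bits = bytearray((size * size + 7) // 8)

    def _index(self, x, y):
        i = y * self.size + x            # flatten (x, y) to a bit offset
        return i >> 3, i & 7             # byte index, bit within byte

    def set(self, x, y, value):
        byte, bit = self._index(x, y)
        if value:
            self.bits[byte] |= 1 << bit
        else:
            self.bits[byte] &= ~(1 << bit)

    def get(self, x, y):
        byte, bit = self._index(x, y)
        return (self.bits[byte] >> bit) & 1

grid = BitGrid()
grid.set(5, 5, 1)    # robot saw something at [5][5]
```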