This is a pretty rookie question, but unfortunately I'm not really familiar with CMake or CGAL. I just can't figure out how to configure CMake to generate a shared library from my source code. I looked through some docs and some pages on GitHub, but I don't seem to get anywhere.
I would be really grateful if someone could point me to some documentation, or provide an example CMakeLists.txt or something (to be honest I'm completely lost here).
I need the .so files for a Python binding for some functionality of the 2D Arrangements package.
The CGAL Manual has copious information you can use to get you started. Here you'll find information for building/installing CGAL itself.
If you already have CGAL built/installed on your system, there are some examples that show how to use it in your own CMake project. Check out their GitHub wiki, or, for even more info, Sections 14 and 15 of their Installation Manual.
You mention Python bindings, so if you're looking to use something like SWIG, I suggest taking a look here.
I'm working on a fairly simple text-editor for Haskell, and I'd like to be able to highlight static errors in code when the user hits "check."
Is there a way to use the GHC-API to do a "dry-run" of compiling a Haskell file without actually compiling it? I'd like to be able to take a string and do all the checks of normal compilation, but without the output. The GHC-API would be ideal because then I wouldn't have to parse command-line output from GHC to highlight errors and such.
In addition, is it possible to do this check on a string, instead of on a file? (If not, I can just write it to a temp file, which isn't terribly efficient, but would work).
If this is possible, could you provide or point me to an example of how to do this?
This question asks the same thing, but it is from three years ago, at which time the answer was "GHC-API is new and there isn't good documentation yet." So my hope is that the status has changed.
EDIT: the "dry-run" restriction is because I'm doing this in a web-based setting where compilation happens server-side, so I'd like to avoid unnecessary disk reads/writes every time the user hits "check". The executable would just get thrown away anyway, until they had a version ready to run.
Just to move this to an answer: this already exists as ghc-mod (here's the homepage). It already has frontends for Emacs, Sublime, and Vim, so if you need examples of how to use it, there are plenty. In essence, ghc-mod is just what you want: a wrapper around the GHC API designed for editors.
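If you'd rather call the GHC API directly, the usual trick for a "dry-run" is to disable code generation and linking before loading the module: you still get all the parse, rename, and type errors, but nothing is written to disk. Here is a rough, untested sketch against the pre-9.x GHC API module layout (checkFile is my own name; GHC.Paths comes from the ghc-paths package):

    import GHC
    import GHC.Paths (libdir)  -- from the ghc-paths package

    -- Type-check a source file without producing any output on disk.
    checkFile :: FilePath -> IO SuccessFlag
    checkFile path = runGhc (Just libdir) $ do
        dflags <- getSessionDynFlags
        -- HscNothing skips code generation and NoLink skips linking,
        -- so loading becomes a check-only pass.
        _ <- setSessionDynFlags dflags { hscTarget = HscNothing
                                       , ghcLink   = NoLink }
        target <- guessTarget path Nothing
        setTargets [target]
        load LoadAllTargets  -- returns Succeeded or Failed

By default errors and warnings go through the log_action field of DynFlags, which you can override to collect diagnostics programmatically instead of scraping stderr. Checking a string rather than a file is also possible (a Target can carry an in-memory buffer), but those record fields and several module names have moved around across GHC versions, so treat the above as a starting point and check the API docs for the GHC you build against.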
I've got some unused functionality in my codebase, but it's hard to identify. The code has evolved over the last year as I explore its problem space and possible solutions. What I need to do is find that unused code so I can get rid of it. I'm happy if it deals with the problem on an exported-name basis. GHC has warnings that deal with non-exported unused code. Any tools specific to this task would be of interest.
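To illustrate what GHC already covers: with -Wall it will flag a non-exported top-level binding that is never used, but it can't say anything about exported names (the module and names below are made up purely for illustration):

    {-# OPTIONS_GHC -Wall #-}      -- -Wall includes the unused-binding warnings
    module Example (used) where

    used :: Int
    used = helper + 1

    helper :: Int                  -- reachable from an exported name: no warning
    helper = 41

    orphan :: Int                  -- never used and not exported:
    orphan = 0                     -- GHC warns "Defined but not used" here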
However, I'm curious about a comprehensive cross referencing tool. I can find the unused code with such a tool. Years ago when I was working in C and assembler, I found that a good xref was a pretty handy tool, useful for many different purposes.
I'm getting nowhere with googling. Apparently in Haskell the dominant meaning of cross-reference is within literate programming. Though maybe something there would be useful.
I don’t know of such a tool, so in the past I have done a bit of a hack instead.
If you have a comprehensive test suite, you can run it with GHC’s code coverage tracing enabled. Compile with -fhpc and use hpc markup to generate annotated source. This gives you the union of unused code and untested code, both of which you would probably like to address anyway.
SourceGraph can give you a bunch of information which you may also find useful.
There is now a tool for this very purpose: https://hackage.haskell.org/package/weeder
It's been around since 2017, and while it has limitations, it definitely helps with large codebases.
I am developing a library and I need to add extra code to some of my objects' methods at run time. There are two points here. First, the program I want to add extra code to was written by somebody else, and I don't want to edit it. Second, what I'm doing is very similar to adding an aspect before a method call.
After searching and reading on the internet, I found many frameworks like AspectJ, AspectWerkz, etc. that could do this job, but the problem with AspectJ (when used in a Spring context), for example, is that it doesn't provide an API to do your weaving at runtime.
I also found that there are libraries like ASM and Javassist, but they are very general and hard to learn, and what I need is much closer to aspects.
So what do you suggest? Is there any good library out there? Am I wrong about the libraries I mentioned above? Please help!
With AspectJ you can apply your aspects when classes are loaded at runtime. See Load-Time Weaving documentation. Alternatively, you don't have to change your old code and can apply aspects at compile time.
Lately I've been listening to Jeff Atwood and Joel Spolsky's radio show, where they have been talking about dogfooding (the process of reusing your own code, see Jeff Atwood's blog post). So my question is: should programmers use decompilers to see how that programmer's code is implemented and works, to make sure it won't break your code? Or should you just trust that programmer's code and adapt to it, because using decompilers goes against everything we as programmers have ever learned about hiding data (well, OO programmers at least)?
Note: I wasn't sure which tags this would go under so feel free to retag it.
Edit: Just to clarify I was asking about decompilers as a last resort, say you can't get the source code for some reason. Sorry, I should have supplied this in the original question.
Yes, it can be useful to use the output of a decompiler, but not for what you suggest. The output of a compiler doesn't ever look much like what a human would write (except when it does). It can't tell you why the code does what it does, or what a particular variable should mean. It's unlikely to be worth the trouble to do this unless you already have the source.
If you do have the source, then there are lots of good reasons to use a decompiler in your development process.
Most often, the reason for using the output of a decompiler is to better optimize code. Sometimes, with high optimization settings, a compiler will just get it wrong. This can be almost impossible to sort out in some cases without comparing the compiler's output at different levels of optimization.
Other times, when trying to squeeze the most performance out of a very hot code path, a developer can try arranging their code in a few different ways and compare the compiled results. As a last resort, duplicating the compiler's output may be the simplest way to start when implementing a code block in assembly language.
Dogfooding is the process of using the code that you write, not necessarily re-using code.
However, code re-use typically means you have the source (hence "code reuse"); otherwise it's just using a library supplied by someone else.
Decompiling is hard to get right, and the output is typically very hard to follow.
You should use a decompiler if it is the tool that's required to get the job done. However, I don't think it's the proper use of a decompiler to get an idea of how well the code which is being decompiled was written. Depending on the language you use, the decompiled code can be very different from the code which was actually written. If you want to see some real code, look at open source code. If you want to see the code of some particular product, it's probably better to try to get access to the actual code through some legal means.
I'm not sure exactly what you are asking, what you expect "decompilers" to show you, or what this has to do with Atwood and Spolsky. If you're programming to public interfaces, why would you need to see the original source of the third-party code to see if it will "break" your code? You could determine this more effectively by building tests. Also, what the decompiler will tell you largely depends on the language/platform the software was written in, whether it is Java, .NET, C and so forth. It's not the same as having the original source to read, even in the case of .NET assemblies. Anyway, if you are worried about third-party code not working for you, then you should really be writing typical unit tests against the code rather than trying to decompile it. As far as whether you "should": if you mean "should" in some sense other than what would be the best use of your time, then I'm not sure what you mean.
Should Programmers Use Decompilers?
Use the right tool for the right job. Decompilers don't often produce results that are easy to understand, but sometimes they are what's needed.
should programmers use decompilers to see how that programmer's code is implemented and works, to make sure it won't break your code?
No, not unless you find a problem and need support. In general you don't use it if you don't trust it, and if you have to use it even though you don't trust it, you develop tests to prove the functionality and verify that later upgrades still work as expected.
Don't use functionality you don't test, unless you have very good support or a relationship of trust.
-Adam
Or should you just trust that programmer's code and adapt to it, because using decompilers goes against everything we as programmers have ever learned about hiding data (well, OO programmers at least)?
This is not true at all. You would use a decompiler not because you want to get around abstraction or encapsulation, or to defeat OO principles, but because you want to better understand why the code is behaving the way it is.
Sometimes you need to use a decompiler (or in the Java world, a bytecode viewer) when you are troubleshooting an annoying bug with a 3rd party library where an exception is thrown with no useful error message, no logging, etc.
Use of a decompiler has nothing to do with OO principles.
The short answer to this... Program to a public and documented specification, not to an implementation. Relying on implementation specifics and side-effects will burn you.
Decompilation is not a tool to help you program correctly, though it might, in a pinch, assist you in understanding a problem with someone else's code for which you don't have source.
Also, beware of the possible legal risk of decompiling; many software companies have no-decompile clauses which could expose you and your employer to legal consequences.