Fast (possibly approximate) linear programming library [closed] - shared-libraries

I need to solve a sparse linear programming problem, and I am looking for a library to do it.
Primary requirements:
The most important requirement is that it should be very fast. A randomised approximate solution is acceptable if it is faster.
LP specifications:
The size of the problem is a function of two parameters, P and Q, with P << Q most of the time.
No. of variables ~ P + Q
No. of constraints ~ 2Q
The constraint matrix is sparse - it has only O(Q) non-zero entries.
Solutions tried
1) MATLAB: MATLAB's linprog function is not particularly useful in our setting, as it takes very long to solve the LP.
2) GLPK: glpk_simplex is also not as fast as expected: for a problem with P=15, Q=15,000, I need an answer within 10 seconds, but glpk_simplex takes 20-25 minutes. glpk_interior runs out of memory on a problem of that size.
Can anyone suggest some efficient libraries? Both free and commercially available ones that can solve the problem exactly or approximately are welcome.

Regarding other solver options, here are two SO questions that you should take a look at if you haven't checked them out already:
SO Question on which solvers to use
Java Solver Options
But the reason I am posting is that I have a couple of other suggestions for you, beyond just chasing solver speed. (Something might work for Q ~ 15K in your problem, but if Q gets bigger, you will have to search for even faster solvers.)
Other suggestions to try
Have you played around with the solver options in either MATLAB or GLPK? There are quite a number of things you could try: setting an iteration limit, or a time limit (say, 10000 milliseconds).
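For instance, with the GLPK C API the time and iteration limits are fields of the glp_smcp control structure passed to glp_simplex. A minimal sketch (the model file name and the limit values are just placeholders):

#include <stdio.h>
#include <glpk.h>

int main(void)
{
    glp_prob *lp = glp_create_prob();
    glp_read_lp(lp, NULL, "model.lp");   /* load your LP from a CPLEX LP file */

    glp_smcp parm;
    glp_init_smcp(&parm);                /* start from the default simplex settings */
    parm.tm_lim   = 10000;               /* time limit, in milliseconds */
    parm.it_lim   = 200000;              /* simplex iteration limit */
    parm.presolve = GLP_ON;              /* let the LP presolver shrink the problem */
    parm.meth     = GLP_DUALP;           /* dual simplex, falling back to primal */

    int ret = glp_simplex(lp, &parm);    /* GLP_ETMLIM / GLP_EITLIM mean a limit was hit */
    if (ret == 0)
        printf("objective = %g\n", glp_get_obj_val(lp));

    glp_delete_prob(lp);
    return 0;
}

With the time limit you get whatever basis the solver has reached when the budget runs out, which may be good enough as an approximate answer.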
Look into decomposing and relaxing your formulation. Typically, these large LPs have a nice underlying structure, but a few dense constraints play spoilsport, and those are the ones that give the solver trouble. If you can identify them, you can relax them, and maybe even move them into the objective function with a multiplier.
To make it a little more concrete, consider Lagrangian relaxations of the 'troublesome constraints'. (As one reference to what I'm referring to, see how problem 12.3 becomes 12.4 here after being relaxed.) You can do the same for the several dense constraints in your problem.
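In general form (this is just the textbook Lagrangian relaxation, not something specific to your model): if the troublesome dense constraints are Dx \le d, you move them into the objective with multipliers \lambda \ge 0,
\min_x \; c^T x \quad \text{s.t.} \quad Ax \le b, \; Dx \le d
\qquad \longrightarrow \qquad
\min_x \; c^T x + \lambda^T (Dx - d) \quad \text{s.t.} \quad Ax \le b
The relaxed problem keeps only the sparse, well-structured constraints, and for any \lambda \ge 0 its optimum is a lower bound on the original optimum.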
Hope this helps you move forward.

Related

High Level language with Low level Graphics [closed]

I am looking for a high level language which will still allow me to work directly with graphics. I want to be able to modify screen pixels, for instance. But I do not want to write huge amounts of code for each operation. I want simple one line commands for graphics somewhat like those listed below. What are some programming languages which would have these features?
Possible pseudocode:
Screen.clear
Graphics.line(4,5,20,25).color=green
Circle(centerx,centery,radius)
Depending on what you want to do (i.e., how complex you need to get), Processing is a very high-level, graphics-focused environment. Note, however, that it seems to be focused on the fixed-function OpenGL pipeline, which is deprecated (though arguably the easiest and most intuitive way to get started).
Processing is built in Java, runs in a web browser (or from your desktop), and abstracts away most of the initialization and cleanup code required to use OpenGL.
Edit
I've just noticed your comment that says you're not an experienced programmer. In that case, I'd recommend starting with Processing. Once you get the hang of it, move on to Python.
Another, slightly more complex, option is Python. Python is very powerful, fairly easy to pick up (depending upon your prior development experience), and widely supported. It will also allow you to use shaders and other features from the 21st century, and it is cross-platform. See this link for PyOpenGL, the first Python OpenGL site that popped up in Google.
Then, there's C# + OpenTK. This can get pretty complex pretty quickly, but is very powerful, and since it's compiled (under .NET or Mono), can potentially give you better performance than Python.
Finally, for close-to-bare-metal performance, C++ is unbeatable, though arguably the most complex of these options, with a significant learning curve. However, most of the example code you'll find online is in C++, which can be an issue if you're not using C++ and aren't comfortable reading it.
Using Qt you can create QImage objects and draw on them using QPainter.
You of course have per-pixel control at that abstraction level, but you can also access the underlying memory directly through the bits() and bytesPerLine() methods.
In my opinion, the easiest format to use for fast computations is QImage::Format_ARGB32, with 32 bits per pixel.
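To make that concrete, a minimal Qt 5 sketch of the QImage/QPainter route (sizes, coordinates and colours are arbitrary):

#include <QGuiApplication>
#include <QImage>
#include <QPainter>
#include <QPoint>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);          // needed for Qt's painting machinery

    QImage img(640, 480, QImage::Format_ARGB32);
    img.fill(Qt::white);                      // "Screen.clear"

    QPainter painter(&img);
    painter.setPen(Qt::green);
    painter.drawLine(4, 5, 20, 25);           // "Graphics.line(4,5,20,25).color=green"
    painter.drawEllipse(QPoint(320, 240), 100, 100);   // "Circle(centerx,centery,radius)"
    painter.end();

    // direct pixel access through the raw buffer
    QRgb *row = reinterpret_cast<QRgb *>(img.scanLine(10));
    row[4] = qRgb(255, 0, 0);                 // set pixel (4, 10) to red

    img.save("out.png");
    return 0;
}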
Qt is a C++ library portable across many OSs and platforms, and bindings are available for many very high-level languages (e.g. Python).

Any good tools to solve integer programs on linux? [closed]

Are there any good tools to solve integer programs on Linux?
I have a small problem that I want to compute to save time :D. It is a kind of subset-sum problem. I have a list of around 20 integer values, and I want to compute the subset with the smallest sum that still satisfies a certain minimum. You could formulate this as an integer program... something like
\min \sum_{i=1}^{n} w_i x_i
subject to
\sum_{i=1}^{n} w_i x_i \ge c, \qquad x_i \in \{0, 1\}
Or is there another good way to do this?
I would try either GLPK or SCIP.
They each have their own modeling language (GLPK has GNU MathProg, SCIP has ZIMPL), so you can conveniently code your problem.
GNU MathProg has the advantage of being compatible with AMPL. Thus, you could try the student version of AMPL with CPLEX or Gurobi with your GNU MathProg model. Keep in mind that AMPL, CPLEX and Gurobi are commercial software.
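If you would rather call GLPK from code than write a MathProg model, a sketch with the GLPK C API could look like this (the weights and the minimum c below are made-up example data):

#include <stdio.h>
#include <glpk.h>

int main(void)
{
    double w[] = {12, 7, 31, 5, 19, 23, 8, 4};           /* example weights */
    int    n   = sizeof(w) / sizeof(w[0]);
    double c   = 40.0;                                    /* required minimum sum */

    glp_prob *mip = glp_create_prob();
    glp_set_obj_dir(mip, GLP_MIN);

    glp_add_rows(mip, 1);
    glp_set_row_bnds(mip, 1, GLP_LO, c, 0.0);             /* sum w[i]*x[i] >= c */

    glp_add_cols(mip, n);
    int ia[1 + 64], ja[1 + 64];
    double ar[1 + 64];
    for (int j = 1; j <= n; j++) {
        glp_set_col_kind(mip, j, GLP_BV);                 /* x[j] is binary */
        glp_set_obj_coef(mip, j, w[j - 1]);               /* objective: sum w[i]*x[i] */
        ia[j] = 1; ja[j] = j; ar[j] = w[j - 1];           /* constraint row coefficients */
    }
    glp_load_matrix(mip, n, ia, ja, ar);

    glp_iocp parm;
    glp_init_iocp(&parm);
    parm.presolve = GLP_ON;                               /* solve the LP relaxation internally */
    glp_intopt(mip, &parm);

    printf("best sum = %g\n", glp_mip_obj_val(mip));
    for (int j = 1; j <= n; j++)
        if (glp_mip_col_val(mip, j) > 0.5)
            printf("take item %d (w = %g)\n", j, w[j - 1]);

    glp_delete_prob(mip);
    return 0;
}

With only ~20 binary variables and a single constraint, this should solve essentially instantly.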
Have you tried to do that with LibreOffice Calc Solver?
Microsoft Solver Foundation on Mono Framework could also do the job for you if you know C#.
Try Lindo/Lingo. They are not free, but you can try them.
They allow you to specify your problem in a very neat mathematical way.
You could try GNU Octave - it's a subset of MATLAB.
I wanted to add one more option to the GLPK suggestions that @Ali has made. I suggest that anyone interested in solving LPs/IPs also look into the optimization packages that the R language offers.
If you already know and use R, then it is just a matter of downloading the right package. And even if you don't, this is a good way to get introduced to R, which is really taking off in the analytics space.
This vignette is a very good way to find out which R packages are relevant.
For you, RSymphony or Rglpk might be the ones to start with.

Can someone suggest a high performance shared memory API that supports complex data? [closed]

I'm looking at porting an old driver that generates a large, complex set of tables of data into user space, because the tables have become large enough that memory consumption is a serious problem.
Since performance is critical and there will be 16-32 simultaneous readers of the data, we thought we'd replace the old /dev-based interface to the code with a shared-memory model that would allow clients to search the tables directly rather than querying the daemon.
The question is - what's the best way to do that? I could use shm_open() directly, but that would probably require me to devise my own record locking and even, possibly, an ISAM data structure for the shared memory.
Rather than writing my own code to re-visit the 1970s, is there a high-performance shared memory API that provides a hash-based lookup mechanism? The data is completely numeric, and the search keys are fixed-length bit fields that may be 8, 16, or 32 bytes long.
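To illustrate what "using shm_open() directly" would involve, here is a minimal sketch of mapping a fixed-size region with a process-shared reader/writer lock in its header (the segment name, sizes, and record layout are made up, and error checking is omitted):

#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <pthread.h>
#include <stdint.h>

#define TABLE_BYTES (64u * 1024u * 1024u)   /* made-up size for the table area */

struct shm_header {
    pthread_rwlock_t lock;     /* many readers, one writer (the daemon) */
    uint64_t         records;  /* number of valid records in the table area */
    /* fixed-layout, offset-based table data follows the header */
};

int main(void)
{
    size_t total = sizeof(struct shm_header) + TABLE_BYTES;

    int fd = shm_open("/mytables", O_CREAT | O_RDWR, 0660);   /* name is made up */
    ftruncate(fd, total);

    void *base = mmap(NULL, total, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    struct shm_header *hdr = (struct shm_header *) base;

    /* the daemon (writer) would do this once, when creating the segment */
    pthread_rwlockattr_t attr;
    pthread_rwlockattr_init(&attr);
    pthread_rwlockattr_setpshared(&attr, PTHREAD_PROCESS_SHARED);
    pthread_rwlock_init(&hdr->lock, &attr);

    /* a reader takes the shared lock, searches the table, and releases it */
    pthread_rwlock_rdlock(&hdr->lock);
    /* ... hash or binary search over the offset-based table here ... */
    pthread_rwlock_unlock(&hdr->lock);

    munmap(base, total);
    close(fd);
    return 0;
}

The point being: the mapping and locking parts are small; it's the offset-based (pointer-free) table layout and hash scheme that you would have to devise yourself.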
This is something I've wanted to write for some time, but there's always something more pressing to do...
Still, for most use cases of a shared key-value RAM store, memcached would be the simplest answer.
In your case, it looks like it's lower-level, so memcached, fast as it is, might not be the best answer. I'd try Judy arrays on a shmem block. They're really fast, so even if you wrap the access with a simplistic lock, you'd still get high-performance access.
For more complex tasks, I'd look into lock-free structures (some links: 1, 2, 3, 4). I even wrote one some time ago, with hopes of integrating it into a Lua kernel, but it proved really hard to keep up with the existing implementation. Still, it might interest you.

How can I make voronoi treemaps? [closed]

I want to make Voronoi treemaps for statistical data, like newsgraphy.
Do you know how I can do that in Perl, PHP, Ruby, or Python?
Math::Geometry::Voronoi
Nice demos and graphics for Python: http://home.scarlet.be/zoetrope/voronoi/ (Archived copy at wayback)
Just found this page. I've been working on a Voronoi demo applet using Javascript/canvas, after translating into Javascript a C# version of Steven Fortune's algorithm by Benjamin Dittes (available at Code Project, see "Fortune's Voronoi algorithm implemented in C#"). Here is the page, which includes Fortune's Voronoi algorithm in Javascript:
http://www.raymondhill.net/voronoi/voronoi.php
This is a first iteration; I plan to adapt it further to be better suited to Javascript. Hope this helps.
First of all, the lines are not strange: it's the result of the fact that this is not a normal Voronoi tessellation, but an area-weighted Voronoi (AWT) tessellation, possibly even a centroidal Voronoi tessellation (CVT). That being said, in order to have Voronoi regions (polygons) with significantly differing areas (which would reflect some attribute of the data), you need AWTs (preferably implemented as CVTs to retain nice aspect ratios for the polygons); a normal Voronoi algorithm (as suggested by some people above) will not be able to help you. There is probably no direct solution for this available, especially not for scripted languages, since the computational complexity due to iterative updating steps for AWTs is quite high. You should look up the work on "Voronoi Treemaps" and "Dynamic Voronoi Treemaps" by Balzer et al. and Sud et al. to get an idea of the algorithm and then implement it on your own (everything that you need is in their papers).
The other Python answer seems to point at a raster-only solution. I am also interested in solving this problem (in Python), and I think the following script could form a usable starting point:
http://www.oxfish.com/python/voronoi.py
(Archived copy at wayback)
James Tauber is writing a tutorial that uses JavaScript and Fortune's algorithm to draw a Voronoi diagram in a canvas element: Voronoi Canvas Tutorial
It's not complete yet (he's at part 3 of 4), but there's enough there to complete it, I think.
The latest version (2.0) of Macrofocus TreeMap has the Voronoi algorithm as an option, among others.

Interactive Statistical Analysis tool [closed]

I'm looking for basic software for statistical analysis. Most important is simple and intuitive use, getting started "right out of the box". At least basic operations should be interactive. Free would be a bonus :)
The purpose is analysis of data dumps and logs of various processes.
Importing a comma/tab separated file
sorting and filtering rows on conditions
basic aggregates: count, average, deviation, regression, trend
visualization: plotting the data, bin distributions, etc.
Excel fails (at least for me) at filtering and re-combining data; I guess something like "Excel with SQL" would be nice. I've been using MS Access + Excel and copying data around before, but that's a pain.
Do you have any recommendation?
Clarification: I am not looking for a specific tool for IIS/web server logs, but for various data and event logs (mostly from custom applications) with tab-separated values.
Specifically for log file analysis, I would recommend Microsoft's Log Parser (free), which will allow you to run queries with basic aggregation against all types of text-based files (and across sets of files), XML, CSV, the Event Log, the Registry, the file system, Active Directory, etc.
There is also a free GUI built on top of it, called Log Parser Lizard GUI, which makes it more user-friendly and can do basic graphing, etc.
I would consider looking at R. It is:
Free
Widely used by statisticians
Fairly easy to use
Can easily do everything you mention in your post
I used Tableau Software at a previous gig, and it's pretty amazing - extremely intuitive and easy to use.
Unfortunately it's also pricey.
