Data analysis tool like MS Excel [closed]

I have a large amount of data that needs to be compared.
We are using Microsoft Excel; it costs money, it is slow, and the graphs it generates are not up to the mark.
Is there any other tool that is free and has good graphing facilities?
Thank you.

If you need a good data analysis toolkit, you can spend some time and try R, a free statistics/data analysis toolkit. It has pretty good graphics capabilities, especially via the ggplot2 package for static graphics and GGobi for dynamic data exploration.
Data import/export is pretty easy too: R can import CSV/TSV files, or Excel data via ODBC, and so on.
A pretty good introduction to exploratory data analysis with R and ggplot
Introduction to R
Data Import/Export
It takes some time to learn, but after that you are not limited by the tool's capabilities: R can handle plenty of analysis tasks, from simple data crunching, pivoting, and aggregation to advanced statistics/machine learning methods such as clustering, classification, regression, etc.
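If you end up preferring Python to R, the same import/aggregate/plot loop is just as short there. Here is a minimal sketch with pandas and matplotlib (both free); the file and column names are invented for illustration:

    import pandas as pd
    import matplotlib.pyplot as plt

    # Import a CSV dump (file and column names are hypothetical).
    df = pd.read_csv("measurements.csv")

    # Simple crunching: filter rows, then pivot/aggregate.
    ok = df[df["status"] == "ok"]
    summary = ok.pivot_table(index="machine", values="runtime", aggfunc="mean")

    # Plot the aggregate.
    summary.plot(kind="bar", legend=False)
    plt.ylabel("mean runtime (s)")
    plt.show()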
If you are more interested in data transformation with simple calculations, you can use some kind of Extract-Transform-Load toolkit, such as Talend Open Studio.

If you already have MS Office Suite, you probably have MS Access already installed. It does a better job of processing heaps of data, and is a bit more versatile in graphing/charting/displaying data. There may be a bit of a learning curve if you've never used it before.
As for free: I like working with Resolver One, which is free-ish under certain circumstances. At the very least, its charts are marginally nicer to look at than Excel 2003's.
It's going to be difficult to find a free, fast, powerful alternative that generates classy charts, though, unless you're willing to code something yourself in, say, a web interface.

Related

FORTRAN graphics library on Linux [closed]

I started learning FORTRAN and need a graphics library to plot output.
As I'm not familiar with the FORTRAN environment, I wanted to ask for recommendations.
I'm used to matplotlib, and am preferably looking for something similar: similar in terms of available features and workflow concepts.
Searching through Synaptic, it seems like PGPLOT is the way to go.
PS: I know I could wrap FORTRAN code in Python in different ways.
I've used PGPLOT, DISLIN and PLplot. In this era I'd use DISLIN or PLplot. PGPLOT was last updated in 2001 and only has a FORTRAN 77 interface, which can be used from Fortran 90/95/2003, but the compiler won't be able to check that your calls have the correct arguments. The other two have Fortran 95 interfaces. Of DISLIN and PLplot, I think DISLIN is better documented. DISLIN also provides widgets for GUI input. DISLIN is free for some uses; for business use one is supposed to purchase a license. PLplot is open source under the LGPL.
Yes, PGPLOT is an option.
You may also want to look into PLplot: http://plplot.sourceforge.net/
DISLIN - supports several platforms and languages (Python included).
Have you considered using VisIt or similar software? VisIt can visualise very large datasets and has a mechanism for in-situ visualization with the libsim library. See this presentation for a nice introduction to in-situ visualization with VisIt: http://calcul.math.cnrs.fr/Documents/Ecoles/Data-2011/CouplageSimulationVisualization.pdf.
See here for the libsim API.
Finally, a few additional notes.
While you state:
I'm used to matplotlib, and am preferably looking for something similar: similar in terms of available features and workflow concepts.
using libsim would be quite different, but it is very powerful. And regarding:
Right now just general graphic package to plot intermediate data
products. If I do well then perhaps I'll look for package that handles
large data sets. But if I get there I'll probably know what to use
till then.
Rather than write a solution now and then change it to deal with large data sets, why not just write a scalable solution now?
I'm not familiar with FORTRAN plotting, but in most languages a quick way to do easy-to-program static graphics is not to use a library at all, but to find that language's equivalent of the C system() call, find a program that will do the plotting, and write the equivalent of:
1. Write plot data to file.
2. Do system call running plot program with data file argument.
3. [Optional] Delete plot data file.
This gives your program the full functionality of the plot program with minimal programming. The plot data and plot program are likely to be in the disk cache so it's all in memory. It's also easily debugged.
People underestimate the system() call - it gives access to a vast array of functionality, including scripting. Some would say it's not efficient or "pure", but the user won't care, and it will often drastically reduce the amount of programming necessary. Don't reinvent the wheel.
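To make the three steps above concrete, here is a minimal sketch of the pattern in Python (chosen just to keep the example short; in Fortran 2008 you would call execute_command_line the same way). It assumes gnuplot is installed and on the PATH; the data file name is arbitrary:

    import os
    import tempfile

    # 1. Write plot data to a file (x y pairs, one per line).
    path = os.path.join(tempfile.gettempdir(), "plotdata.txt")
    with open(path, "w") as f:
        for x in range(10):
            f.write(f"{x} {x * x}\n")

    # 2. System call running the plot program with the data file as argument.
    #    "pause -1" keeps the gnuplot window open until a key is pressed.
    os.system(f"gnuplot -e \"plot '{path}' with lines; pause -1\"")

    # 3. [Optional] Delete the plot data file.
    os.remove(path)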
You can use:
PLplot: http://plplot.sourceforge.net/
gtk-fortran, a GTK/Fortran binding which also offers an interface to PLplot: https://github.com/vmagnin/gtk-fortran/wiki

Quantitative finance research language [closed]

I am working on an interpreted quant finance library, mostly for rapid prototyping of equity derivatives. I do not have any experience with such languages (I've heard of Goldman Sachs' Slang, but have never seen it).
What sort of functionality is found in such languages, and do they have some unique features which correspond to the financial markets?
Have you ever considered Python? There are many mature libraries that can be used for statistical analysis, data acquisition and cleaning. To name a few:
NumPy - N-dimensional array objects
SciPy - library of statistical and optimisation tools
statsmodels - statistical modeling
pandas - data structures for time series, cross-sectional, or any other form of “labeled” data
matplotlib - MATLAB-like plotting tools
PyTables - hierarchical database package designed to efficiently manage very large amounts of data
CVXOPT - convex optimization routines
I've personally implemented some pretty complex derivatives pricing models in Python, including a jump-diffusion Vasicek interest rate lattice and many stochastic processes, and even managed to write a genetic optimizer.
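To give a flavor of how compact such prototyping can be, here is a minimal illustrative sketch (my own example, not one of the models mentioned above): a Monte Carlo pricer for a European call under Black-Scholes dynamics, using only NumPy:

    import numpy as np

    def mc_european_call(s0, k, r, sigma, t, n_paths=100_000, seed=42):
        """Monte Carlo price of a European call under geometric Brownian motion."""
        rng = np.random.default_rng(seed)
        z = rng.standard_normal(n_paths)
        # Terminal stock price under the risk-neutral measure.
        st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
        # Discount the average payoff back to today.
        return np.exp(-r * t) * np.maximum(st - k, 0.0).mean()

    # Example: at-the-money call, one year to expiry (Black-Scholes value ~10.45).
    print(mc_european_call(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0))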
One of my professors is director of research (PhD in math) at a Chicago hedge fund who uses Python exclusively.
Perhaps every company has something of their own, but there are some materials available on the web (mainly about DSLs):
Going functional on exotic trades
Composing contracts: an adventure in financial engineering
As for your own language (and libraries/runtime!), there is not too much to say without knowing your requirements. To name just a few that immediately came to mind:
Who will use it: sales, traders, quants, or all of them
How it will be used: just pricing of predefined blocks, and/or solving optimization problems (which would lead to the ability to define workflows)
Interaction with the underlying infrastructure and its level of abstraction
Extensibility (and to what extent)
Live calculations or simulation
I/O support
Most languages/tools provide constructs for representing and analyzing time series [e.g. time series regression and cross-correlation stuff]
The "unique" features refer to either speed of access, ease of querying, or expressivity.
K is notably quick, having a very terse language
MATLAB is very expressive, allowing you to use the entire set of toolboxes and extend it with Java
But at the end of the day it really depends on what exactly you want to do.
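To make the cross-correlation point above concrete, here is a minimal sketch in Python with NumPy; the two series are synthetic, built so the peak correlation sits at a known lag:

    import numpy as np

    # Two synthetic series; the second lags the first by 3 steps.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(500)
    y = np.roll(x, 3) + 0.5 * rng.standard_normal(500)

    def cross_corr(a, b, lag):
        """Correlation of a[t] with b[t + lag]."""
        if lag > 0:
            a, b = a[:-lag], b[lag:]
        elif lag < 0:
            a, b = a[-lag:], b[:lag]
        return np.corrcoef(a, b)[0, 1]

    # The peak should recover the built-in delay.
    best = max(range(-10, 11), key=lambda k: cross_corr(x, y, k))
    print(best)  # expected: 3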

Music Visualization [closed]

I am interested in learning about Music Visualization.
(e.g. http://en.wikipedia.org/wiki/Music_visualization)
Does anyone have any books to recommend on the subject?
(I know it's not a technical question, but it seems like a good place to ask.)
Many thanks
You're in luck: it's a great time to get involved in the medium. Lots of new open source multimedia platforms are available now, with great communities forming around them, making it much easier to get something up and running.
I'm not aware of any books specifically on audio visualisation, but I think you'd be well served by reading more general material on:
computer graphics in general
graphic design (color, form, etc.)
data visualisation
any of the great new open source multimedia platforms
If you're writing a visualization plugin for a media player, the problem can usually be treated as mapping FFT data and time to pixel space. You get the time and FFT data nearly for free, so the remainder of the problem is graphics programming, visual design, musical sensitivity and imagination. The way you combine these will ideally be your own.
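As a minimal sketch of that FFT-data-to-pixel-space mapping, here is the idea in Python with NumPy, outside any media-player plugin API; the audio buffer is synthetic, where a real plugin would receive it from the host:

    import numpy as np

    SAMPLE_RATE = 44100
    N_BARS = 16

    # Synthetic audio buffer: a 440 Hz sine chunk (a real plugin would get
    # this buffer from the host media player).
    t = np.arange(1024) / SAMPLE_RATE
    buf = np.sin(2 * np.pi * 440 * t)

    # Magnitudes of the positive-frequency FFT bins.
    spectrum = np.abs(np.fft.rfft(buf))

    # Map frequency bins to a few bars, and bar energy to "pixel" height.
    bands = np.array_split(spectrum, N_BARS)
    heights = [int(40 * band.max() / spectrum.max()) for band in bands]
    for h in heights:
        print("#" * h)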
You can expect to find lots of great information, tools, examples and communities surrounding any of the modern open source multimedia platforms:
processing.org -- a Java-based platform which makes it really easy to get your works (called "sketches") up and running, with plenty of examples. You could plug in a library like minim to get the audio FFT parts for free
openFrameworks and libcinder -- C++ based platforms. If you want to write plugins for a media player like iTunes, you may need to use a language like C++. If you already know (or want to learn) C++, both are good choices.
I'd recommend jumping straight in with a platform like processing.org, together with a library like minim, play with the bundled examples, and build your knowledge from there.
There are quite a few books on processing if that suits your learning style.
If you want to stay current, blogs like createdigitalmotion are a great resource.
Also check out artists like flight404 and Memo Akten who are using these frameworks.
Hope that helps.
Check this fantastic blog post:
http://www.ethanhein.com/wp/2011/visualizing-music/
There is also some great material in this book:
http://www.amazon.com/gp/aw/d/0060926716/ref=aw_d_detail?pd=1
The author also has a website with some examples.
http://www.constructingtheuniverse.com/Amen%20Break%20and%20GR.html
Happy visualizing

Article about code density as a measure of programming language power [closed]

I remember reading an article saying something like
"The number of bugs introduced doesn't vary much with different programming languages, but it depends pretty much on SLOC (source lines of code). So, using the programming language that can implement the same functions with smaller SLOC is preferable in terms of stability."
The author wanted to stress the advantages of using functional programming, since one can normally implement the same functionality in fewer lines of code. I remember the author cited a research paper on the irrelevance of the choice of programming language to the number of bugs.
Is there anyone who knows the research paper or the article?
Paul Graham wrote something very like this in his essay Succinctness is Power. He quotes a report from Ericsson, which may be the paper you remember?
Reports from the field, though they will necessarily be less precise than "scientific" studies, are likely to be more meaningful. For example, Ulf Wiger of Ericsson did a study that concluded that Erlang was 4-10x more succinct than C++, and proportionately faster to develop software in:
Comparisons between Ericsson-internal development projects indicate similar line/hour productivity, including all phases of software development, rather independently of which language (Erlang, PLEX, C, C++, or Java) was used. What differentiates the different languages then becomes source code volume.
I'm not sure if it's the source you're thinking of, but there's something about this in Code Complete chapter 27.3 (p652) - that references "Program Quality and Programmer Productivity" (Jones 1977) and "Estimating Software Costs" (Jones 1998).
I've seen this argument about "succinctness = power" a few times, and I've never really bought it. That's because there are languages (e.g., J, Ursala) which are quite succinct but not (IMO) easy to read because they put so much meaning into individual symbols.
Perhaps the true metric should be the extent to which it is possible to write a particular algorithm both clearly and succinctly. Mind you, I don't know how to measure that.
The book Pragmatic Thinking & Learning points to this article:
Can a Manufacturing Quality Model Work for Software?

Interactive Statistical Analysis tool [closed]

I'm looking for a basic software for statistical analysis. Most important is simple and intuitive use, getting started "right out of the box". At least basic operations should be interactive. Free would be a bonus :)
The purpose is analysis of data dumps and logs of various processes.
Importing a comma/tab-separated file
sorting and filtering rows on conditions
basic aggregates: count, average, deviation, regression, trend
visualization - plotting the data, bin distribution, etc.
Excel fails (at least for me) at filtering and re-combining data; I guess something like "Excel with SQL" would be nice. I've been using MS Access + Excel and copying data around before, but that's a pain.
Do you have any recommendation?
Clarification: I am not looking for a specific tool for IIS/web server logs, but for various data and event logs (mostly from custom applications) with tab-separated values.
Specifically for log file analysis I would recommend Microsoft's Log Parser (free), which will allow you to run queries with basic aggregation against all types of text-based files (and across sets of files), XML, CSV, the Event Log, the Registry, the file system, Active Directory, etc.
There is also a free GUI built on top of it, called Log Parser Lizard GUI, which makes it more user-friendly and can do basic graphing etc.
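In the same spirit of running SQL over flat files, a minimal sketch using only Python's standard library shows the "Excel with SQL" idea the question asks for; the file and column names are made up for illustration:

    import csv
    import sqlite3

    # Load a tab-separated event log into an in-memory SQLite table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE log (ts TEXT, process TEXT, duration REAL)")
    with open("events.tsv", newline="") as f:
        rows = csv.reader(f, delimiter="\t")
        next(rows)  # skip the header line
        conn.executemany("INSERT INTO log VALUES (?, ?, ?)", rows)

    # Filtering, re-combining and aggregating is now plain SQL.
    query = """
        SELECT process, COUNT(*), AVG(duration)
        FROM log
        WHERE duration > 0
        GROUP BY process
        ORDER BY AVG(duration) DESC
    """
    for process, n, avg in conn.execute(query):
        print(process, n, avg)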
I would consider looking at R; it is:
Free
Widely used by statisticians
Fairly easy to use.
Can easily do everything you mention in your post
I used Tableau Software at a previous gig, and it's pretty amazing - extremely intuitive and easy to use.
Unfortunately it's also pricey.
