Open source or free financial analysis programs/libraries - statistics

I'm looking for something with functions similar to Matlab's Financial and Financial Derivatives Toolboxes, but I don't have the cash to spend on Matlab. I would appreciate any info on free or open source libraries or programs that will let me easily calculate interest rates, risk, etc.

How about JQuantLib or QuantLib?
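QuantLib has Python bindings, so you can try it without writing C++. A minimal sketch of what working with it feels like (untested; assumes the QuantLib package from PyPI, and the numbers are purely illustrative):

    import QuantLib as ql

    # a 5% rate, annually compounded, on an Actual/365 (Fixed) day count
    rate = ql.InterestRate(0.05, ql.Actual365Fixed(), ql.Compounded, ql.Annual)

    print(rate.compoundFactor(2.0))   # growth over 2 years: 1.05**2 = 1.1025
    print(rate.discountFactor(2.0))   # its reciprocal, roughly 0.9070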

How about the Octave financial functions?
http://www.gnu.org/software/octave/doc/interpreter/Financial-Functions.html#Financial-Functions
I'm not familiar with the Matlab toolbox, so you'll have to judge for yourself.
GNU Octave is a high-level language, primarily intended for numerical computations. It provides a convenient command line interface for solving linear and nonlinear problems numerically, and for performing other numerical experiments using a language that is mostly compatible with Matlab. It may also be used as a batch-oriented language.

Exactly what functions do you need, and how advanced? .NET has some built-in financial functions. I'm sure it doesn't cover everything, but calculating interest and a few other things is no problem:
http://msdn.microsoft.com/en-us/library/daksysx3(VS.80).aspx
Calculate depreciation: DDB, SLN, SYD
Calculate future value: FV
Calculate interest rate: Rate
Calculate internal rate of return: IRR, MIRR
Calculate number of periods: NPer
Calculate payments: IPmt, Pmt, PPmt
Calculate present value: NPV, PV
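If you'd rather stay free and open source, the numpy-financial package for Python provides near-identical functions (fv, pv, npv, irr, mirr, nper, pmt, ipmt, ppmt, rate). A minimal sketch, with made-up illustrative numbers:

    import numpy_financial as npf

    # monthly payment on a 200,000 loan at 5% annual interest over 30 years
    payment = npf.pmt(0.05 / 12, 30 * 12, 200_000)   # roughly -1073.64

    # internal rate of return of a cash-flow series (outlay, then inflows)
    roi = npf.irr([-1000, 300, 400, 500])

    # future value of saving 100/month for 10 years at 4% annual interest
    nest_egg = npf.fv(0.04 / 12, 10 * 12, -100, 0)

    print(payment, roi, nest_egg)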


How to determine sample size given few parameters

How do I determine the sample size, given that there is a 20 percentage-point reduction [before change – after change = 20%] with a 95% confidence level and 90% power? Any pointers on how to solve this?
A good first step is always to think about what kind of test you plan to use. From the very little information you give, a paired t-test (or a one-sample t-test comparing the differences to zero) is a likely candidate.
You can now google for "statistical power of t test" and add the name of whatever language or statistics software you plan to use. Except maybe for educational purposes, I'd advise computing statistics with software rather than by hand.
An obvious option for statistics software on Stack Overflow might be R. In R you'll find solutions to many sample size and power calculations in the package pwr. Here is a link to a getting-started text: https://cran.r-project.org/web/packages/pwr/vignettes/pwr-vignette.html
The pwr.t.test function is a good fit for your problem. Google will readily turn up alternatives for Python, Julia, and SPSS, and I assume for C++, Java, and JavaScript as well.
However, you will have to make assumptions about the variance, or equivalently about the effect size. Will each value be reduced by almost exactly 20%, or will some be reduced a lot while some increase? That is of utmost importance to the question: you need only one observation if there is no variance, a small number of observations if there is little variance, and a large number of observations if there is a lot of variance.
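If you end up in Python instead of R, the same calculation is available in statsmodels. A minimal sketch, assuming (purely for illustration) a standardized effect size of d = 0.5 for the paired differences:

    from statsmodels.stats.power import TTestPower

    # paired / one-sample t-test power analysis; effect_size is Cohen's d,
    # i.e. the mean difference divided by the SD of the differences
    n = TTestPower().solve_power(effect_size=0.5, alpha=0.05, power=0.9,
                                 alternative='two-sided')
    print(n)   # roughly 44 pairs under these assumptions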

Determine fundamental frequency of voice recordings

I am using the command line tool aubiopitch to analyze voice recordings. My goal is to determine the fundamental frequency of the voice recorded. I know, of course, that the frequency varies – that's why I want to calculate an "average" in Hz over a 30-second recording.
My question: aubio offers different methods to determine the pitch of a recording: Schmitt trigger, harmonic comb, yin, yinfft, etc. Which one of those would be the preferred choice when dealing with pure human voice recordings (no background music, ambient noise, etc.)?
I would recommend using yinfast or yinfft (default). For a discussion of the algorithms, their parameters, and their performance, see Chapter 3 of this document.
Note that the median is better suited than the average in this case.
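For example, a small sketch of that idea in Python (the file name is made up, and the one-"timestamp pitch"-pair-per-line output format is an assumption; check your aubio version):

    import statistics
    import subprocess

    # run aubiopitch and capture its "timestamp pitch_hz" output lines
    out = subprocess.run(["aubiopitch", "-i", "recording.wav"],
                         capture_output=True, text=True).stdout

    pitches = [float(line.split()[1])
               for line in out.splitlines() if line.strip()]
    voiced = [p for p in pitches if p > 0]   # drop frames reported as unvoiced

    print("median f0: %.1f Hz" % statistics.median(voiced))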
CREPE is good and outperforms many others, since it uses a neural network for pitch prediction. It might be unstable in unseen conditions, though, and might not be very easy to integrate, since it requires TensorFlow.
For a more traditional and lightweight solution, you can try REAPER.

Quickest and easiest algorithm for comparing the frequency content of two sounds

I want to take two sounds that each contain a dominant frequency and say 'this one is higher than that one'. I could do an FFT, find the frequency with the greatest amplitude in each, and compare them. I'm wondering whether, since I have such a specific task, there may be a simpler algorithm.
The sounds are quite dirty with many frequencies, but contain a clear dominant pitch. They aren't perfectly produced sine waves.
Given that the sounds are quite dirty, I would suggest developing the algorithm against the output of an FFT first, as it'll be much simpler to diagnose any problems. Then, when you're happy that it's working, you can think about optimising/simplifying.
As a rule of thumb when developing this kind of numeric algorithm, I try to operate in the most relevant domain first (you're interested in frequencies, so analyse in frequency space), and only consider shortcuts/optimisations once everything is behaving itself. That way you can test the optimised solution against the known-good reference.
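A minimal sketch of that FFT-peak baseline in Python with numpy/scipy (the file names are made up):

    import numpy as np
    from scipy.io import wavfile

    def dominant_frequency(path):
        fs, x = wavfile.read(path)
        if x.ndim > 1:
            x = x.mean(axis=1)              # mix stereo down to mono
        x = x * np.hanning(len(x))          # window to reduce spectral leakage
        spectrum = np.abs(np.fft.rfft(x))
        spectrum[0] = 0.0                   # ignore the DC component
        return np.argmax(spectrum) * fs / len(x)   # peak bin index -> Hz

    f_a = dominant_frequency("sound_a.wav")
    f_b = dominant_frequency("sound_b.wav")
    print("A is higher" if f_a > f_b else "B is higher")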
In the general case, decent pitch detection/estimation generally requires a more sophisticated algorithm than looking at FFT peaks, not a simpler one.
There are a variety of pitch detection methods, ranging in sophistication from counting zero-crossings (which obviously won't work in your case) to extremely complex algorithms.
While the frequency-domain methods seem most appropriate, it's not as simple as "taking the FFT". If your data is very noisy, you may have spurious peaks that are higher than what you would consider the dominant frequency. One solution is to take overlapping windowed segments of your signal, compute an STFT on each, and average the results. But this raises more questions: how big should the windows be? That depends on how far apart you expect the dominant peaks to be, how long your recordings are, etc. (Note: FFT methods can resolve frequency to better than one bin by taking phase information into account; in that case you would have to do something more complex than averaging all your FFT windows together.)
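A minimal sketch of the averaged-STFT idea (run here on a synthetic noisy tone so it works standalone; the window sizes are illustrative, not recommendations):

    import numpy as np
    from scipy.signal import stft

    # synthetic test signal: a noisy 440 Hz tone at 44.1 kHz
    fs = 44100
    t = np.arange(fs * 2) / fs
    x = np.sin(2 * np.pi * 440 * t) + 0.5 * np.random.randn(len(t))

    f, times, Zxx = stft(x, fs=fs, nperseg=4096, noverlap=2048)
    avg_mag = np.abs(Zxx).mean(axis=1)   # average magnitude across windows
    avg_mag[0] = 0.0                     # ignore DC
    print("dominant frequency: %.1f Hz" % f[np.argmax(avg_mag)])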
Another approach would be a time-domain method, such as YIN:
http://recherche.ircam.fr/equipes/pcm/cheveign/pss/2002_JASA_YIN.pdf
Wikipedia discusses some more methods:
http://en.wikipedia.org/wiki/Pitch_detection_algorithm
You can also explore some more methods in chapter 9 of this book:
http://www.amazon.com/DAFX-Digital-Udo-ouml-lzer/dp/0471490784
You can get Matlab source code for YIN from chapter 9 of that book here:
http://www2.hsu-hh.de/ant/dafx2002/DAFX_Book_Page_2nd_edition/matlab.html

Is there any generic statistical dashboard software?

I was thinking about monitoring the evolution of a single value (nope, this is not my SO rep :-p), and I'd like to have some nice histograms of it. My needs are simple:
daily / weekly / monthly / yearly evolution histograms;
daily / weekly / monthly / yearly calculation of the max, min and average value.
Ideally the product should be scriptable so I can feed it with the result of the script.
Something simple like: set it up, set cron (or, if it has a daemon, even better), set input, enjoy output.
If it does not exist, would you be interested in such a tool? I could end up coding it eventually.
EDIT: I am not looking for a lib or a language but an app.
I'm not sure if they are what you are looking for, but I would look at the R statistics packages and the various Python numerical and scientific libraries, like numpy and scipy. They are probably overkill, but they might have graphical components that you can leverage.
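If you do end up coding it yourself along those lines, here's a rough sketch of the core with pandas/matplotlib (the CSV name and layout are assumptions, i.e. whatever your cron job appends):

    import pandas as pd
    import matplotlib.pyplot as plt

    # assuming the cron script appends "timestamp,value" lines to values.csv
    df = pd.read_csv("values.csv", names=["timestamp", "value"],
                     parse_dates=["timestamp"], index_col="timestamp")

    # daily min / max / mean; swap "D" for "W", "M" or "Y" as needed
    daily = df["value"].resample("D").agg(["min", "max", "mean"])

    daily.plot()                  # or kind="bar" for histogram-style output
    plt.savefig("daily.png")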
How about RRDtool?
From a Java program you can create charts with JFreeChart.

What's the best interactive Analysis and Plotting Tool for software testing?

My realtime app generates a data log: 100 words of data at 10 kHz. I need to analyze it and produce some plots of the results. There are intermediate calculations involved: I need to take some differences, averages, etc. Excel would work fine, except that:
the 32,000-item limit on graph data series is too small: that's only about 3 seconds of data;
the glacial speed at which it processes changes to graphs containing large data series is unbearable.
What are good alternatives to Excel for manipulating and plotting large quantities of data? I'm looking for something interactive, not a library.
For this sort of stuff we typically roll our own, but I know that isn't the solution you want. Can you use a good-quality database (e.g. Oracle) to do the manipulation, then maybe put the summarized data back into Excel for the plotting? I believe Excel will link to databases these days, so you could make it quite automated.
Otherwise there are statistical tools like SAS (http://www.sas.com/technologies/analytics/statistics/stat/index.html), but get your cheque book out first.
There are also several free tools for analysing and plotting (see below). But I am not sure whether they have components to handle data in real-time.
R (similar to SAS) for statistical computations
octave (similar to Matlab) for mathematical computations
R (for data manipulation) and its ggplot2 package for creating sexy graphs. Incredibly useful.
If you need real-time graphics, then I'd look at building something using matplotlib. It's a Python module, and you can link it to R using rpy2 if required.
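A rough sketch of that with matplotlib (the log file name and its layout, one 100-word record per line, are assumptions):

    import numpy as np
    import matplotlib.pyplot as plt

    # assuming one 100-word record per line, sampled at 10 kHz
    data = np.loadtxt("app.log")            # shape: (n_samples, 100)
    t = np.arange(len(data)) / 10_000.0     # time axis in seconds

    word0 = data[:, 0]
    smooth = np.convolve(word0, np.ones(100) / 100, mode="same")  # running mean
    # differences etc. are one-liners too, e.g. np.diff(word0)

    plt.plot(t, word0, label="raw")
    plt.plot(t, smooth, label="100-sample average")
    plt.xlabel("time [s]")
    plt.legend()
    plt.show()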
In particle and nuclear physics the big tool is ROOT, which I have seen used in an "update every two seconds as the data comes in" mode with a lot of data and a modest amount of intermediate processing.
Mind you, the student who wrote that module was a very slick programmer, and it took a while to shake the bugs out, even so.
ROOT is available for free, and provides all kinds of tools and support.
