Quantitative finance research language [closed] - programming-languages

I am working on an interpreted quant-finance library, mostly for rapid prototyping of equity derivatives. I do not have any experience with such languages (I've heard of Goldman Sachs' Slang, but have never seen it).
What sort of functionality is found in such languages, and do they have unique features that correspond to the financial markets?

Have you ever considered Python? There are many mature libraries that can be used for statistical analysis, data acquisition and cleaning. To name a few:
NumPy - N-dimensional array objects
SciPy - a library of statistical and optimisation tools
statsmodels - statistical modeling
pandas - data structures for time series, cross-sectional, or any other form of "labeled" data
matplotlib - MATLAB-like plotting tools
PyTables - a hierarchical database package designed to efficiently manage very large amounts of data
CVXOPT - convex optimization routines
I've personally implemented some pretty complex derivatives pricing models in Python, including a jump-diffusion Vasicek interest-rate lattice and many stochastic processes, and I even managed to write a genetic optimizer.
One of my professors, the director of research (PhD in math) at a Chicago hedge fund, uses Python exclusively.
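For a flavour of what those libraries make easy, here is a minimal sketch (my own toy example with made-up parameter values, not code from any of the projects above) that prices a European call by Monte Carlo with NumPy and cross-checks it against the Black-Scholes closed form via SciPy:

    import numpy as np
    from scipy.stats import norm

    def bs_call(S0, K, T, r, sigma):
        """Closed-form Black-Scholes price of a European call."""
        d1 = (np.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

    def mc_call(S0, K, T, r, sigma, n_paths=200_000, seed=0):
        """Monte Carlo estimate of the same price from terminal GBM draws."""
        rng = np.random.default_rng(seed)
        z = rng.standard_normal(n_paths)
        ST = S0 * np.exp((r - 0.5 * sigma ** 2) * T + sigma * np.sqrt(T) * z)
        return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

    # the two numbers should agree to within Monte Carlo error
    print(bs_call(100.0, 105.0, 1.0, 0.03, 0.2))
    print(mc_call(100.0, 105.0, 1.0, 0.03, 0.2))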

Perhaps every company has something of its own, but there are some materials available on the web (mainly about DSLs):
Going functional on exotic trades
Composing contracts: an adventure in financial engineering
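For a flavour of the combinator style the Composing Contracts paper describes, here is a toy Python sketch of my own (not the paper's actual combinator set; the names and the flat discount curve are invented): contracts are built from a few primitives and valued by a pluggable discount function.

    from dataclasses import dataclass
    from datetime import date
    import math

    @dataclass(frozen=True)
    class Zero:                    # a worthless contract
        pass

    @dataclass(frozen=True)
    class One:                     # receive one unit of a currency now
        currency: str

    @dataclass(frozen=True)
    class Scale:                   # multiply a contract's payoff by a constant
        factor: float
        contract: "Contract"

    @dataclass(frozen=True)
    class And:                     # hold both contracts
        left: "Contract"
        right: "Contract"

    @dataclass(frozen=True)
    class ZeroCouponBond:          # receive `notional` of `currency` at `maturity`
        maturity: date
        notional: float
        currency: str

    Contract = Zero | One | Scale | And | ZeroCouponBond

    def value(c: Contract, discount) -> float:
        """Price a contract, given a discount(maturity) -> factor function."""
        match c:
            case Zero():
                return 0.0
            case One(_):
                return 1.0
            case Scale(k, inner):
                return k * value(inner, discount)
            case And(a, b):
                return value(a, discount) + value(b, discount)
            case ZeroCouponBond(maturity, notional, _):
                return notional * discount(maturity)

    # a two-bond portfolio valued under a flat 3% continuously compounded curve
    def flat(maturity, today=date(2024, 1, 1), rate=0.03):
        return math.exp(-rate * (maturity - today).days / 365.0)

    portfolio = And(ZeroCouponBond(date(2025, 1, 1), 100.0, "USD"),
                    Scale(2.0, ZeroCouponBond(date(2026, 1, 1), 50.0, "USD")))
    print(value(portfolio, flat))

The appeal of this style is that new products are assembled from the same small algebra of primitives rather than being coded one-off.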
As for your own language (and libraries/runtime!), there is not much to say without knowing your requirements. To name just a few that immediately come to mind:
Who will use it: sales, traders, quants, or all of them
How it will be used: just pricing of predefined building blocks, and/or solving optimization problems, which determines whether users need to be able to define workflows
Interaction with the underlying infrastructure and its level of abstraction
Extensibility (and to what extent)
Live calculations or simulation
I/O support

Most languages/tools provide constructs for representing and analyzing time series (e.g. time-series regression and cross-correlation).
The "unique" features come down to speed of access, ease of querying, or expressiveness.
K is notably quick, with a very terse language.
MATLAB is very expressive, letting you use its entire set of toolboxes and extend it with Java.
But at the end of the day it really depends on what exactly you want to do.
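To make the time-series point concrete, here is a small pandas/NumPy sketch (synthetic data and an invented lag, purely for illustration) of rolling correlation and a lagged cross-correlation scan:

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    idx = pd.bdate_range("2020-01-01", periods=500)
    x = pd.Series(rng.standard_normal(500), index=idx).cumsum()
    y = x.shift(5) + rng.standard_normal(500)        # y lags x by roughly 5 days

    returns = pd.DataFrame({"x": x, "y": y}).diff().dropna()

    # rolling correlation of daily changes over a 60-day window
    rolling_corr = returns["x"].rolling(60).corr(returns["y"])
    print(rolling_corr.tail(3))

    # a simple lagged cross-correlation scan; the peak should sit near lag 5
    xcorr = {lag: returns["x"].corr(returns["y"].shift(-lag)) for lag in range(-10, 11)}
    print(max(xcorr, key=xcorr.get))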

Related

Diagrammatic method to model software components, their interactions & I/O [closed]

I'd like to model software components and the interactions between them: what information is passed, what processes take place in each component (not in too much detail), and a clear specification of each component's input/output.
What I've seen so far in UML is far too abstract and doesn't go into enough detail.
Any suggestions?
Some people design programs on paper as diagrams, then pass them to software developers to construct.
This approach has been tried: "clever" people do the modeling and hand the models to "ordinary" developers for the laborious work. It has not worked.
We like analogies, so we often compare software to the construction industry, where some people produce the models (blueprints) and others do the building. Our first thought is that UML and other model diagrams are the equivalent of construction blueprints. But it seems we are wrong.
To make an honest analogy with the construction industry, our blueprints are not models or diagrams; our blueprints are actually the code we write.
Detailed Paper Models Are Like Cooking Recipes
It is not realistic to design a software system entirely on paper with detailed models up front. Software development is an iterative and incremental process.
Think of a map maker who draws a paper map of a city as big as the city itself, because he includes every detail without any level of abstraction. Would it be useful?
Is Modeling Useless?
Definitely not. But you should apply it to the difficult parts of your problem-solution space, not to every trivial part.
So instead of handing developers every detail of the system on paper, explore the difficult parts of the problem-solution space with them face to face, using visual diagrams.
In the software industry, like it or not, source code is still the king, and all models are liars until they are implemented and tested.

Article about code density as a measure of programming language power [closed]

I remember reading an article saying something like
"The number of bugs introduced doesn't vary much with different programming languages, but it depends pretty much on SLOC (source lines of code). So, using the programming language that can implement the same functions with smaller SLOC is preferable in terms of stability."
The author wanted to stress the advantages of functional programming, since one can normally implement the same functionality with fewer lines of code. I remember the author cited a research paper arguing that the choice of programming language is largely irrelevant to the number of bugs.
Is there anyone who knows the research paper or the article?
Paul Graham wrote something very like this in his essay Succinctness is Power. He quotes a report from Ericsson, which may be the paper you remember?
Reports from the field, though they will necessarily be less precise than "scientific" studies, are likely to be more meaningful. For example, Ulf Wiger of Ericsson did a study that concluded that Erlang was 4-10x more succinct than C++, and proportionately faster to develop software in:
Comparisons between Ericsson-internal development projects indicate similar line/hour productivity, including all phases of software development, rather independently of which language (Erlang, PLEX, C, C++, or Java) was used. What differentiates the different languages then becomes source code volume.
I'm not sure if it's the source you're thinking of, but there's something about this in Code Complete chapter 27.3 (p652) - that references "Program Quality and Programmer Productivity" (Jones 1977) and "Estimating Software Costs" (Jones 1998).
I've seen this argument about "succinctness = power" a few times, and I've never really bought it. That's because there are languages (e.g., J, Ursala) which are quite succinct but not (IMO) easy to read because they put so much meaning into individual symbols.
Perhaps the true metric should be the extent to which it is possible to write a particular algorithm both clearly and succinctly. Mind you, I don't know how to measure that.
The book Pragmatic Thinking and Learning points to this article:
Can a Manufacturing Quality Model Work for Software?

Data analysis tool like MS excel [closed]

I have a large amount of data that needs to be compared. We are using Microsoft Excel; it costs money, it is slow, and the graphs it generates are not up to the mark.
Is there any other tool that is free and has good graphing facilities?
Thank you.
If you need a good data analysis toolkit, you can spend some time and try R, a free statistics/data analysis environment. It has pretty good graphics capabilities, especially via the ggplot2 package for static graphics and GGobi for dynamic data exploration.
Data import/export is pretty easy too: R can import CSV/TSV, or Excel data via ODBC, and so on.
A pretty good introduction to exploratory data analysis with R and ggplot:
Introduction to R
Data Import/Export
It takes some time to learn, but after that you are not limited by the tool's capabilities: R can handle plenty of analysis tasks, from simple data crunching, pivoting, and aggregation to advanced statistics/machine-learning methods such as clustering, classification, and regression.
If you are more interested in data transformation with simple calculations, you can use some kind of Extract-Transform-Load toolkit, such as Talend Open Studio.
If you already have MS Office Suite, you probably have MS Access already installed. It does a better job of processing heaps of data, and is a bit more versatile in graphing/charting/displaying data. There may be a bit of a learning curve if you've never used it before.
As for free: I like working with Resolver One, which is free-ish under certain circumstances. At the very least, its charts are marginally nicer to look at than Excel 2003's.
It's going to be difficult to find a free, fast, powerful alternative that generates classy charts, though, unless you're willing to code something yourself in, say, a web interface.

Economics of software development [closed]

Can anyone point me towards any references that attempt to formulate an economics of software development? In my own research, I discovered a book by Barry Boehm on this, but it seems very awkward and theoretical.
Dependency Structure Matrices seem to offer something worthwhile. Carliss Baldwin has used these in some work on modularization, boundaries, and transaction costs. A lot of it comes off as just common sense, though.
Also, economists have developed something called Behavioral Economics. Is there a "Behavioral Software Engineering" that addresses cognitive biases in developers or groups of developers?
Here's an interesting looking reference:
http://www.amazon.com/Knowledge-Sharing-Software-Development-Comparing/dp/3639100840/ref=sr_1_1?ie=UTF8&s=books&qid=1232979573&sr=1-1
Before Hal Varian became the Chief Economist at Google, he had worked on the economics of information technology at Berkeley, although he did not focus on software development per se. Nevertheless I would recommend a look at his paper on the more general topic from 2001. You can find a more complete list of his research work on his website. Hope that helps.
Software as Capital wasn't a waste of time, though you won't find any math in it, and it reads like a PhD thesis because it started as one.
Another review.
I think that what you're looking for might fall under a sociology of software development... sociologists study all modern subjects, and from there you will no doubt find references to an economics of software development if there is one.
Facts and Fallacies of Software Engineering by Robert Glass has some dollar amounts associated with some activities (or, at least, percentage of total budget). Don't know if that helps at all, but it's something.
Several years ago I taught an "Economics of E-Commerce" course using Varian's book INFORMATION RULES. His idea of lock-in, though, leads the reader almost towards a drug-addict model of purchaser behaviour and exploitation. This book is more of an economics of e-business than an analysis of the software development process.
In terms of actually making software, there are ideas in the Mythical Man Month well worth knowing about.
The "Applied Information Economics" approach of Douglas Hubbard could be part of what you're looking for. If we assume software development is (often|always|sometimes|???) about supporting decision making by providing (better|more accurate|more up to date|whatever) information, then AIE helps as it's a technique for quantifying the value of better information. Read Hubbard's book How to Measure Anything for a good overview of the idea.
Also, the book Software By Numbers by Mark Denne and Jane Cleland-Huang provides a model for managing software projects by using something they call the "Incremental Funding Methodology". IFM is based on decomposing software projects into features based on the business value created, rather than decomposing them along technical boundaries. They then use a series of calculations based on Discounted Cash Flow (DCF), Net Present Value (NPV), Internal Rate of Return (IRR), etc. to show when in the project lifecycle the project will reach self-funding status, when it will reach "breakeven" and when it will generate a real positive cash return for the organization.
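To make the cash-flow arithmetic concrete, here is a hedged Python sketch (my own illustration with invented numbers, not the book's actual method) of discounting per-period net cash flows and finding when a project's cumulative NPV turns positive:

    def discounted(cash_flows, rate):
        """Discount each period's net cash flow back to period 0."""
        return [cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1)]

    def breakeven_period(cash_flows, rate):
        """First period at which cumulative discounted cash flow turns non-negative."""
        total = 0.0
        for t, dcf in enumerate(discounted(cash_flows, rate), start=1):
            total += dcf
            if total >= 0:
                return t
        return None   # never breaks even over this horizon

    # invented per-quarter net cash flows (in $k) and a 2% quarterly discount rate
    net_cash_flows = [-50, -30, 10, 25, 40, 40, 40]
    print(breakeven_period(net_cash_flows, rate=0.02))   # breakeven quarter
    print(sum(discounted(net_cash_flows, rate=0.02)))    # project NPV

Sequencing features so that revenue-generating ones ship earlier shifts the positive cash flows forward and pulls the breakeven period in, which is the basic lever the IFM analysis turns on.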
You might also find the Capability Cases book of interest. It doesn't strictly deal with any economic issues in detail, but it's an approach to software specification which attempts to more clearly map software capabilities to business strategy and business issues.

Agent-based modeling resources [closed]

I would like to know what kind of toolkits, languages, libraries exist for agent-based modeling and what are the pros/cons of them?
Some examples of what I am thinking of are Swarm, Repast, and MASS.
I found a survey from June 2009 that answers your question:
Survey of Agent Based Modelling and Simulation Tools
Author: R. J. Allan
Abstract: Agent Based Modelling and Simulation is a computationally demanding technique based on discrete event simulation and having its origins in genetic algorithms. It is a powerful technique for simulating dynamic complex systems and observing "emergent" behaviour. The most common uses of ABMS are in social simulation and optimisation problems, such as traffic flow and supply chains. We will investigate other uses in computational science and engineering. ABMS has been adapted to run on novel architectures such as GPGPU (e.g. nVidia using CUDA). Argonne National Laboratory have a Web site on Exascale ABMS and have run models on the IBM BlueGene with funding from the SciDAC Programme. We plan to organise a workshop on ABMS methodologies and applications in summer of 2009. Keywords: agent based modelling, archaeology.
http://epubs.cclrc.ac.uk/bitstream/3637/ABMS.pdf
I also recommend NetLogo. It is an IDE + environment + programming language based on Logo (which was based on Lisp) that lets you build multi-agent models extremely fast. I have found that I can reproduce (simulate) algorithms from research articles in a couple of hours, algorithms that would have taken weeks to implement with other libraries.
You can check some of my models at this page.
I was introduced to Dramatis at OSCON 2008; it is an agent-based framework for Ruby and Python. The author (Steven Parkes) has some references on his blog and is working on running a language-agnostic Actors discussion list.
This page at erights.org has a great set of references to what I think are the core papers that introduce and explore the Actors message-passing model.
There is also a pretty good link on Wikipedia:
http://en.wikipedia.org/wiki/Comparison_of_agent-based_modeling_software
On the modelling side, have a look at FAML, an agent-oriented modelling language. This is a pretty academic paper, but it may help depending on your interests: http://ieeexplore.ieee.org/xpl/freepre_abs_all.jsp?isnumber=4359463&arnumber=4967615
I know this is an old thread, but I thought it would not hurt to add some extra info. There is a great new website which is dedicated to agent-based modeling. The site contains links to papers, tutorials, tools, resources, and researchers working on agent-based modeling in a number of fields.
You should also have a look at MadKit and TurtleKit.
Old thread, but for completeness, there are also AnyLogic and pyabm, which can be used for ABMs.
I have experience programming agent-based models in several environments/languages. My opinion is that if you want to implement a relatively simple model, use NetLogo. It's possible to use NetLogo for heavy-duty models as well (I've done this successfully), but at some point the flexibility of a general-purpose language like Java/Python/C++ outweighs the convenience of NetLogo's native methods, especially when performance becomes a major issue.
Repast is becoming a bit bloated. If you are an experienced programmer, all you really need to start building an ABM is the ability to schedule events and draw random numbers; the rest (defining agents/environments and their behaviors) you can craft on your own, using the regular data structures you're used to (arrays, hashes, trees, etc.). To this end, I'm developing a very lightweight Java library called "ABMUtils" (on GitHub) that implements a scheduler and wraps a random number generator. It is at an early stage of development, but I expect to flesh things out (keeping it simple) over the coming months.
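To illustrate the scheduler-plus-random-numbers point, here is a bare-bones toy sketch in Python (my own example, unrelated to ABMUtils): agents are random walkers that reschedule themselves on an event queue.

    import heapq
    import random

    class Agent:
        def __init__(self, agent_id, rng):
            self.agent_id = agent_id
            self.rng = rng
            self.position = 0.0

        def step(self, now):
            """Take a random step, then return the time of my next event."""
            self.position += self.rng.gauss(0.0, 1.0)
            return now + self.rng.expovariate(1.0)

    def run(n_agents=5, horizon=10.0, seed=42):
        rng = random.Random(seed)
        agents = [Agent(i, rng) for i in range(n_agents)]
        # event queue of (time, tiebreaker, agent)
        events = [(0.0, i, a) for i, a in enumerate(agents)]
        heapq.heapify(events)
        counter = len(agents)
        while events:
            now, _, agent = heapq.heappop(events)
            if now > horizon:
                break
            heapq.heappush(events, (agent.step(now), counter, agent))
            counter += 1
        return {a.agent_id: round(a.position, 2) for a in agents}

    print(run())

Everything beyond this (agent behaviors, environments, data collection) is ordinary programming with whatever data structures you prefer.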
If you are an evolutionary economist, you can also check out the Laboratory for Simulation Development (LSD).
PHP and Java developers should take a look at KATO.
