We are heading to the airport soon to fly to London. The main target is the Workout in Computational Finance on Thursday, 30-Jan-14, but we also have a tight schedule meeting partners, members of focus groups and distinguished quant finance experts. These will be exciting days. I hope there will be a little time for a half pint or two from one of the great UK microbreweries and a short visit to the Serpentine Galleries.
But, I am afraid, we may not be able to post during our stay in London - sorry for that.
Numbed By Numbers - The Disruption and Overshoot of Quantitative Analysis?
This post is inspired by an article in Wired Jan-14 - Why Quants Don't Know Everything
Don't quants know everything?
Whilst in finance quants are blamed for relying too much on numerical patterns, the discipline of quantitative analysis is entering unexpected sectors: sports, journalism, culture, … and even politics.
A Workshop on Risk Management
This week I will interrupt my blog post series about models since I want to report about an event.
Yesterday Andreas Binder and I held a workshop on risk management for capital management firms and small banks in Vienna. Topics covered included:
- Commitment Approach
- Value At Risk
- Methodologies (parametric, historical, Monte Carlo)
- Individual VaR
- Contribution VaR
- Additional Risk Measures like Expected Shortfall (ES)
- Backtesting
- Extreme Value Theory - Peak over Threshold
Although regulators allow the commitment approach to be used for certain types of funds, we showed with our examples that VaR calculations provide more than risk measures for reporting to the regulators: they can be used to gain valuable information about the fund structure and its sensitivities to different risk factors.
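To make the parametric and historical methodologies above concrete, here is a minimal Python sketch; the return series, confidence level and portfolio value are illustrative assumptions, not figures from the workshop:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

# Illustrative daily portfolio returns (assumption: 500 observations, slightly heavy-tailed)
returns = rng.standard_t(df=5, size=500) * 0.01

confidence = 0.99                 # 99 percent confidence level
portfolio_value = 10_000_000      # hypothetical fund size

# Parametric (variance-covariance) VaR: assumes normally distributed returns
mu, sigma = returns.mean(), returns.std(ddof=1)
var_parametric = -(mu + sigma * norm.ppf(1 - confidence)) * portfolio_value

# Historical VaR: empirical quantile of the observed returns
var_historical = -np.quantile(returns, 1 - confidence) * portfolio_value

# Expected shortfall: average loss beyond the historical quantile
threshold = np.quantile(returns, 1 - confidence)
expected_shortfall = -returns[returns <= threshold].mean() * portfolio_value

print(f"Parametric 99% VaR: {var_parametric:,.0f}")
print(f"Historical 99% VaR: {var_historical:,.0f}")
print(f"Expected shortfall: {expected_shortfall:,.0f}")
```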
A Short History of Floating Rates
In my recent post on THE swap, I mentioned the underlying loan (with a notional of 195 million CHF). The interest rate the city had to pay for this loan was the Swiss Franc 6-month Libor plus a margin of 4.9 basis points (i.e., 0.049 percent).
Thinking of today's interest rate levels, this seems to be quite an attractive funding rate. Was this also the case in 2007?
We observe that in 2007/2008 the 6m Libor rate for the Swiss franc was around 3 percent, and I observe that I am quite good at forgetting historical economic conditions. (I had assumed Swiss interest rates to be lower between 2005 and 2008.)
Nevertheless, compare it to the 6m Euribor:
What surprised me even more were the Swiss interest rate levels in the early 1990s.
To finish the underlying loan discussion: it was equipped with an early exercise option for the loan creditor. When the creditor thought that the margins (the 4.9 basis points mentioned above) were not attractive any more, they had the right to call back the 195 million CHF (or, alternatively, to renegotiate the margins). This (an increase of the margins to be paid) happened in April 2013.
Historical rates of the CHF Libor 6m. Image source: Teletrader. We can clearly observe the breakdown of Lehman Brothers in autumn 2008.
Euribor 6m (red) vs. CHF Libor 6m (blue). Data source: Federal Reserve Bank of St. Louis
In the early 1990s: also in Switzerland, Libor rates close to 10 percent. Image source: homefinance.nl
5 Ways Technology Will Change Quant Finance?
This is a transformation of the NewCompany article 8 Unexpected Ways Technology Will Change The World by 2020.
I took the ways I found adequate for quant finance:
UnRisk Is Not for "You" - Isn't That Arrogant?
In No, UnRisk is Not for "You", I have pointed out why "that's okay, not for you" may be a response to a prospect who knows our offers but does not buy.
Is that arrogant? We have discussed this intensively, and even if it may look that way, it is not intended.
Daily Returns and the Black Scholes Model
The Daily Return:
Let $S_n$ be the stock index at the end of the n-th trading day. Then the daily return of the stock is defined by

$$R_n = \frac{S_n - S_{n-1}}{S_{n-1}},$$

i.e. the relative change. If we take the log-return instead,

$$r_n = \ln\left(\frac{S_n}{S_{n-1}}\right),$$

we get an even easier to handle quantity, since the log return over k days is simply the sum of the daily log returns:

$$\ln\left(\frac{S_{n+k}}{S_n}\right) = \sum_{j=1}^{k} r_{n+j}.$$
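A quick numerical check of this additivity property (the price path below is a made-up example, not market data):

```python
import numpy as np

# Hypothetical closing prices over six trading days (made-up numbers)
S = np.array([100.0, 101.5, 99.8, 100.9, 102.3, 101.7])

# Daily log returns r_n = ln(S_n / S_{n-1})
r = np.diff(np.log(S))

# The log return over the whole period is the sum of the daily log returns
print(r.sum())                 # sum of daily log returns
print(np.log(S[-1] / S[0]))    # log return over the full period - identical up to rounding
```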
The Desired Properties For Our Model:
Under the assumption that the log returns over disjoint, equidistant time intervals are independent and identically distributed, we can use the central limit theorem of probability theory to state:
Log-returns, seen as a sum of a large number of independent, identically distributed random variables with finite variance, are approximately normally distributed.
Therefore we are looking for a stochastic market model, defined in continuous time, where the log-returns are normally distributed for arbitrary time intervals. Let $S_t$ be the stock price at time t. Bachelier modelled the stock price with a Brownian motion, with the disadvantage that the stock price within this model can be negative. Samuelson modified the model, using geometric Brownian motion for the dynamics of the stock price:

$$dS_t = \mu S_t\,dt + \sigma S_t\,dW_t.$$

We denote by $\mu$ the expected return rate (drift) and by $\sigma$ the volatility. The process described above is a special case of the Ito process with general $\mu$ and $\sigma$.
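To make the model concrete, here is a minimal simulation sketch using the exact log-normal solution of geometric Brownian motion; the drift, volatility and time grid are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

S0, mu, sigma = 100.0, 0.05, 0.2        # assumed initial price, drift and volatility
T, n_steps, n_paths = 1.0, 252, 10_000  # one year of daily steps
dt = T / n_steps

# Exact solution of dS_t = mu*S_t dt + sigma*S_t dW_t over one step:
# S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z),  Z ~ N(0, 1)
Z = rng.standard_normal((n_paths, n_steps))
log_increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z
S = S0 * np.exp(np.cumsum(log_increments, axis=1))

# The log returns over the full year are normally distributed with
# mean (mu - sigma^2/2) * T and standard deviation sigma * sqrt(T)
annual_log_returns = np.log(S[:, -1] / S0)
print(annual_log_returns.mean(), (mu - 0.5 * sigma**2) * T)  # both close to 0.03
print(annual_log_returns.std(ddof=1), sigma * np.sqrt(T))    # both close to 0.20
```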
UnRisk's Culture of Helping
Remember our Agenda 2014: package and disseminate know how.
I just read the January-February 2014 issue of the HBR magazine - the cover: A Great Place To Work, about building high-performance cultures in organizations.
Making collaborative generosity the norm
We valuate THE swap on February 5.
Among Austrian market participants, the so-called "Swap 4175" between "the city" and "the bank" has attracted some attention. It was entered into in February 2007 with the purpose of somehow optimizing the liabilities of the city arising from a CHF loan with a face value of 195 million CHF and variable (CHF Libor) coupons.
The ingredients of swap 4175 are that the bank pays Libor to the city and receives a (very low) fixed rate from the city as long as one EUR is worth more than 1.54 CHF. (The exchange rate then was around 1.65.) If the EUR is observed below 1.54 CHF, then the interest rate to be paid increases by (1.54 - x)/x, with x being the exchange rate. Hence, if EUR and CHF were quoted at par, this would mean an interest rate of 54 percent.
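A minimal sketch of the rate the city pays as a function of the exchange rate, exactly as described above; the fixed rate used below is a hypothetical placeholder, since the actual fixed rate of swap 4175 is not stated here:

```python
def city_pays(fx_eur_chf: float, fixed_rate: float = 0.001) -> float:
    """Annual rate payable by the city under the simplified swap 4175 terms.

    fx_eur_chf: CHF per 1 EUR; fixed_rate is a hypothetical placeholder.
    Below the 1.54 barrier the payable rate increases by (1.54 - x) / x.
    """
    barrier = 1.54
    if fx_eur_chf >= barrier:
        return fixed_rate
    return fixed_rate + (barrier - fx_eur_chf) / fx_eur_chf


for x in (1.65, 1.54, 1.40, 1.22, 1.00):
    print(f"EUR/CHF = {x:.2f}: the city pays {city_pays(x):.2%}")
```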
In October/November 2011 (the exchange rate being 1.22 then), the city filed a claim that the swap had not been entered into on a legally binding basis and that all payments arising from the swap should be returned. The bank filed a counterclaim. The trial at the commercial court is still ongoing.
I will discuss the mathematics (and only the mathematics) of this swap in a public lecture (in German) at the Johannes Kepler Symposium, Wir bewerten einen Swap ("We value a swap"), on February 5.
Exchange rate of CHF / EUR between Jan. 2007 and Jan. 2014. Data source: Oesterreichische Nationalbank.
Tribal Reality and Financial Markets
This post is inspired by Noah Smith's Blog Post: Tribal Reality and Extant Reality.
It is about second-hand information vs finding out yourself. In Noah's sense, Tribal Reality is a way of signaling "Hey, I am in your tribe" (using the same symbols, conveying the same information, playing the same game, ...). The problem starts when speculative stuff is mistaken for reality.
Tribal Reality and Technology
This is the part that I find so interesting. It is about the power of information and explorative learning. In low-tech societies, a few people took the lead by simply insisting that they could see more than others (as prophets, cult leaders, ...). Technology helps individuals to explore and apply useful information without taking a special lead.
I want to extend this to low-abstraction vs high-abstraction societies (clearly, I am a biased mathematician, interested in the spiral of mathematical innovation).
The Wisdom of Crowds and Financial Markets
How do you make money? Why are the many smarter than the few? When information is aggregated, the average guess of a tribe becomes more accurate than the individual guess. Outguessing the crowd is not easy and it will cost (the idea of the EMH - remember, EMH <==> P=NP).
I am interested in the operational aspects of tribal reality in quant finance. Up to 1987 option traders played the simple Black Scholes game. Then far out-of-the-money options were traded and the headache began. Has Black Scholes lost the innocence of being a mere game rule?
Based on abstraction and technology, individual insight becomes possible - against The Macho of Financial Modeling.
Picture from sehfelder
Join Us at the FREE Workout in Computational Finance in London - Seats Are Still Available
About The Workout in Computational Finance 2014 in London.
"For shaping your body you should go to a gym, while for building up your numerical tool kit you need a workout in computational finance", Philipp Mayer, Finance Modeling, ING Financial Markets, Brussels
"For shaping your body you should go to a gym, while for building up your numerical tool kit you need a workout in computational finance", Philipp Mayer, Finance Modeling, ING Financial Markets, Brussels
A short first view on equity models
We will start our review of models with the family of equity models. What is the empirical evidence?
- First, the distribution of stock returns exhibits heavier tails than the Gaussian distribution (see the short sketch right after this list).
- Second, implied volatilities of options depend on strike prices and maturities.
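As a hedged sketch of how the first observation can be checked: compute the excess kurtosis of daily log returns, which is zero for a Gaussian and clearly positive for heavy-tailed data. Simulated returns stand in for a real return series here:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(1)

# Stand-in return series: Student-t returns mimic the heavy tails of real equity data
heavy_tailed = rng.standard_t(df=4, size=5_000) * 0.01
gaussian = rng.normal(scale=0.01, size=5_000)

# Fisher excess kurtosis: ~0 for the Gaussian sample, clearly positive for heavy tails
print("heavy-tailed sample:", kurtosis(heavy_tailed))
print("Gaussian sample:    ", kurtosis(gaussian))
```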
When we take a look at commonly used equity models we observe that:
- The classical Black Scholes model cannot reproduce market volatilities, as these exhibit smiles and/or skews.
- Dupire's local volatility can - in principle - fit market volatilities, but still does not contain stochasticity of volatility.
- Advanced stochastic volatility models (e.g. Heston) cover additional features and are popular in the community.
- Adding jumps to diffusion models (Bates), or modelling the evolution of the stock prices using jumps only (NIG, VG), may increase the coverage of additional features or provide an alternative to stochastic volatility for closing the gap between models and reality.
In the next blog posts we will pick one model after the other and highlight their advantages and disadvantages from a modelling as well as from a numerical point of view.
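As a taster for the posts to come, here is a minimal Euler-type Monte Carlo sketch of the Heston stochastic volatility model; all parameter values are illustrative assumptions, not calibrated to any market:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative Heston parameters (assumptions, not calibrated to any market)
S0, v0 = 100.0, 0.04                  # initial price and variance
kappa, theta, xi = 2.0, 0.04, 0.3     # mean reversion speed, long-run variance, vol of vol
rho, r = -0.7, 0.01                   # spot/variance correlation, risk-free rate
T, n_steps, n_paths = 1.0, 252, 20_000
dt = T / n_steps

S = np.full(n_paths, S0)
v = np.full(n_paths, v0)

for _ in range(n_steps):
    Z1 = rng.standard_normal(n_paths)
    Z2 = rho * Z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
    v_pos = np.maximum(v, 0.0)        # full truncation to keep the CIR variance usable
    S = S * np.exp((r - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * Z1)
    v = v + kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * Z2

# Monte Carlo price of an at-the-money European call as a simple sanity check
K = 100.0
call_price = np.exp(-r * T) * np.maximum(S - K, 0.0).mean()
print(f"ATM call price (Heston, Monte Carlo): {call_price:.2f}")
```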
Tomography and Adaptive Optics
When you try to increase the angle of view in Adaptive Optics by MOAO techniques (as described in my post on Guide Stars), you typically combine several natural guide stars and several laser guide stars for the calibration of the deformable mirrors you use.
As you realise, the wavefront sensors (WFS) and the guide stars are at different angles; therefore the light passes through different sections of the atmosphere. In order to obtain a good quality of reconstruction over several arc minutes of view, techniques from computerized tomography have to be applied.
And again, the performance restrictions are on the milliseconds scale.
MOAO system: schematic representation. Image source: eso.org
Of Brains and Balls: Trolling of Economists
Of Brains and Balls: … is the recent post title in "Noahpinion".
I picked it because it is an insightful read about misunderstandings and the complexity of macroeconomics and about trading views, advice and concrete deals.
Quant Work - Accuracy vs Resilience
During the holiday time I have been cross-country skiing in the Bohemian Forest, an ideal region for everyone from young professionals to older skaters like myself (over 65).
My central theme for the first blog posts in 2014 ...
The end of the holidays is coming closer and closer, and as a warm-up I was thinking about what central theme I would like to cover in my first blog posts in 2014. Then an old colleague shared a joke from/about physicists with me. You know, those slightly nerdy jokes of the kind: "An electron and a positron walk into a bar...".
But the following joke encouraged me to think (once more) over models, model reduction and parameter identification.
A group of wealthy investors wanted to be able to predict the outcome of a horse race. So they hired a group of biologists, a group of statisticians, and a group of physicists. Each group was given a year to research the issue. After one year, the groups all reported to the investors. The biologists said that they could genetically engineer an unbeatable racehorse, but it would take 200 years and $100bn. The statisticians reported next. They said that they could predict the outcome of any race, at a cost of $100m per race, and they would only be right 10% of the time. Finally, the physicists reported that they could also predict the outcome of any race, and that their process was cheap and simple. The investors listened eagerly to this proposal. The head physicist reported, "We have made several simplifying assumptions: first, let each horse be a perfect rolling sphere… "
Doesn't this joke have some grain of truth in it, or is it rather the other way round? Do we like to add complexity to our models, either by adding parameters or by combining different forms of propagation? In the next blog posts we will recap some of the most common models, try to show their (dis)advantages and compare them to each other.
The Top 5 UnRisk Stories Of 2013
As 2013 has closed, I take a final look back at the stories I found most compelling from the market's, as well as the makers', point of view.
1. UnRisk FACTORY UnRisk-Q Bundle
Whilst -Q programmers enjoy the manipulation of all UnRisk objects - deal types of interest rates, equity, FX, inflation, commodities, credit and convertibles with hybrid contract features, models, simulators including the VaR Universe, utilities, ... - from the UnRisk Financial Language, FACTORY (CM) users enjoy the broad coverage, integrated valuation and data management, fast time-to-productivity, low cost of ownership and automation with an enormous throughput.
Since Mar-13 we have offered an automated solution and development system in one.
2. CVA/FVA/DVA Development the UnRisk Way
Selecting momentary technologies blindly may make it impossible to achieve the ambitious goals. Data and valuation management need to be integrated carefully, and an exposure modeling engine needs to work event-driven. Billions of single valuations might be required ...
Consequently, we have built a blazingly fast foundation first, invited featured customers to join the project, and built the high-level tasks by closing the feedback loops of practical needs, future technology implementations and testing - atop the UnRisk Financial Language.
3. A Workout in Computational Finance
The book was released in Aug-13 and we have decided to offer live workouts as a reference class of the UnRisk Academy (first event in London, 30-Jan-14).
Five hours of inspiring practical sessions for those who want to enjoy the freestyle of quant work by staying strong, agile and balanced.
The open information policy related to the mathematical schemes behind UnRisk includes the Mathematics Wednesday and Physics Friday posts here.
4. Arming David - A Great Success
To encourage the flowering of small investment and capital management units with competitive advantage, we have introduced the bundled UnRisk FACTORY Capital Manager.
In 2013 we additionally installed it at 4 dedicated capital management firms, the smallest with 7 people, and at the asset management departments of 2 small insurance companies.
5. Close the Trap of Negative ….
This seems to be a hidden-detail story? Our customers are quite happy that we offer a multi-model approach - see Black vs Bachelier - and look deeper into the mathematical schemes: Libor and the Negative Eigenvalue Trap.
We have big things planned for 2014, so my hope is that you continue reading our contributions.
Adaptive Optics: Extreme Performance Gain by CuRe
Last week, when I wrote about Guide Stars, I presented roughly how a Shack Hartmann sensor is utilized to obtain information on the perturbation of the wavefront, which should be plane in the ideal case. To determine the actuator commands for the deformable mirror, a linear system of equations has to be solved.
The common approach, until, in all modesty, the Austrian Adaptive Optics team entered the play, was the so-called matrix vector method (MVM), which has a computational effort of the order n^2, where n is the number of actuators. The influence matrix is inverted once and then applied to the sensor signals.
CuRe (the cumulative reconstructor), which has been developed in Linz, takes into account the special structure of the Shack Hartmann operator and achieves a performance of order n. While the MVM took 20 milliseconds on an 8-CPU realisation, CuRe did the same job in 130 microseconds on a single CPU - a speed-up of roughly 150 in wall-clock time, and of almost 1000 per CPU.
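To convey the complexity difference (this is not the actual CuRe algorithm - see the paper linked below for that), here is a one-dimensional toy sketch: a wavefront is reconstructed from Shack-Hartmann-like slope measurements either by applying a precomputed dense matrix, O(n^2) per time step, or by a cumulative sum along the sensor line, O(n):

```python
import numpy as np

n = 1_000                     # number of subapertures in this 1D toy example
dx = 1.0                      # subaperture spacing

# A made-up smooth wavefront and the slopes a Shack-Hartmann sensor would measure
x = np.arange(n + 1) * dx
wavefront = np.sin(2 * np.pi * x / n) + 0.1 * np.cos(6 * np.pi * x / n)
slopes = np.diff(wavefront) / dx                 # finite-difference slopes, length n

# MVM flavour: apply a dense precomputed reconstruction matrix to the slopes.
# (Here a simple integration matrix stands in for the inverted influence matrix.)
R = np.tril(np.ones((n + 1, n)), k=-1) * dx      # O(n^2) work per reconstruction
phi_mvm = R @ slopes

# Cumulative reconstructor flavour: integrate the slopes directly, O(n) work
phi_cure = np.concatenate(([0.0], np.cumsum(slopes) * dx))

# Both recover the wavefront up to the unknown constant offset (piston mode)
print(np.max(np.abs(phi_mvm - (wavefront - wavefront[0]))))
print(np.max(np.abs(phi_cure - (wavefront - wavefront[0]))))
```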
The actuator commands then look as indicated by the following image (which I like to call "psychedelic donut").
More details can be found in CuRe - A new wavefront reconstruction by Andreas Obereder, Ronny Ramlau, Matthias Rosensteiner and Mariya Zhariy.