To Tree is The Answer, Provided …


you want to perform a valuation of certain real options.

Trees may destroy delta-hedging

In To Tree or Not To Tree, Andreas pointed out that binomial trees are good because they are insightful, but that their computational behavior is really bad. So bad that they may destroy delta hedging (the heart of option theory) when applied to the analytics of more complex options.

Real options are different

A real option is the right to perform certain business initiatives in a capital investment project. There are real options referring to project size (expand or contract), life and timing (defer, abandon, sequence or parallelize), and operation (product or process flexibility). These determine the option characteristics.

Real options are usually distinguished from financial options in that they, and their underlyings, are typically not tradable. Most real options have a value but not a price. On the other hand, option holders (management) can influence the underlying project.

While financial options can help optimize the risk of a portfolio, real options can help maximize the value of a project by managing uncertainty and creating value through flexibility.

Real option valuation

Real options can be the underlying principle of agile practices. With this objective, insight is more important than computational quality. In real option valuation you have to deal with a co-evolution: you need to take into account the uncertain development of the parameters that determine the value of your project and the management decisions that influence them.

So, in general, real options are more complex than financial options. Consequently, valuing their inputs (factors) and defining the option characteristics is more challenging.

To Tree

You can model them with PDEs or apply forward techniques with (least squares) Monte Carlo methods, but most practitioners use binomial trees, as they allow for implementing rules (up and down probabilities under conditions, …) at each node. They cannot, however, handle higher-dimensional problems.
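
To make the node-by-node rules concrete, here is a minimal sketch (not UnRisk code; all inputs are made-up numbers) of a Cox-Ross-Rubinstein tree for a project with the option to abandon it at any decision date for a fixed salvage value. It assumes, for the sake of the sketch, the tradability "fiction" discussed at the end of this post:

#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

int main() {
    // Made-up illustrative inputs
    const double V0      = 100.0;  // today's project value
    const double sigma   = 0.35;   // volatility of the project value
    const double r       = 0.03;   // risk-free rate
    const double salvage = 80.0;   // value recovered on abandonment
    const double T       = 5.0;    // decision horizon in years
    const int    n       = 500;    // number of tree steps

    const double dt   = T / n;
    const double u    = std::exp(sigma * std::sqrt(dt));   // up factor
    const double d    = 1.0 / u;                           // down factor
    const double disc = std::exp(-r * dt);                 // one-step discount
    const double p    = (std::exp(r * dt) - d) / (u - d);  // risk-neutral up probability

    // Terminal nodes: keep the project or abandon it for the salvage value
    std::vector<double> value(n + 1);
    for (int i = 0; i <= n; ++i) {
        const double V = V0 * std::pow(u, n - i) * std::pow(d, i);
        value[i] = std::max(V, salvage);
    }

    // Backward induction: apply the abandonment rule at every node
    for (int step = n - 1; step >= 0; --step) {
        for (int i = 0; i <= step; ++i) {
            const double cont = disc * (p * value[i] + (1.0 - p) * value[i + 1]);
            value[i] = std::max(cont, salvage);
        }
    }

    std::cout << "project value with abandonment option: " << value[0] << "\n";
    std::cout << "value added by the flexibility:        " << value[0] - V0 << "\n";
}

The decision rule inside the backward induction is the point: any other rule (expand, defer, switch, …) can be plugged in at the node level, which is exactly why practitioners like trees.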

Real option analytics as sparring partner

One can see an investment project as a cash conversion cycle with many decision points. Real option analytics can be a kind of sparring partner, telling the management: what if …

The return is characterized by the investment, time and the distribution of the cash flows. You need to know the cash drivers and volatilities, formulate possible actions (the options) and know their influence on the cash flows.

If your project is innovative, you don't have a history. So you might need to simulate the project to get insight into the quantitative aspects of possible decisions. Trees are well suited for this purpose, right?

Optionality 

You can buy antifragility. In finance, antifragility needs fragility, because hedgers need speculators as counterparties who accept the fragile side of a contract. The difficult thing about this is transparency: who holds which fragile/antifragile positions, and how do those positions cross-connect into fragility concentration or fragility diffusion and buffering? "Correct" pricing, valuation and risk analytics are vital to make the market a "fair" play.

Real options usually pertain to tangible assets such as capital equipment, rather than financial instruments. They are not derivative instruments but actual options that have a value, and you gain from knowing which to exercise and when, in a tree of possible decisions in an uncertain world. If you compare this to a traditional discounted cash flow method, you cannot lose …

But

the real economy could learn from the innovations of the financial system. It could maybe adopt the "fiction" that the option and the underlying are tradable, or "replicate" the cash flows of the option with a risk-free bond and proportions of the underlying?
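
The replication idea, in its simplest one-period binomial form from the financial world (S the underlying value today, u and d the up and down factors, r the risk-free rate over the period, V_u and V_d the option's cash flows in the two states): hold Δ units of the underlying and the amount B in the risk-free bond, so that

Δ S u + B e^r = V_u,   Δ S d + B e^r = V_d

which gives

Δ = (V_u - V_d) / (S (u - d)),   B = e^{-r} (u V_d - d V_u) / (u - d)

and today's option value is V = Δ S + B.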

But this is another story. Would trees be devalued to firewood again?

I am not a real options expert, so I have compiled information here from Wikipedia and from long discussions with Hermann Fuchs, a real options expert who runs a financial controlling advisory firm. But I understand that we offer attractive options for a build-an-advanced-risk-management-system project.

Linking the UnRisk FACTORY to UnRisk-Q Chapter 1


In my blog post UnRisk Web Service: Combining our Products I gave an example of how the UnRisk Web Service may be used to make data stored in the UnRisk FACTORY database accessible via Excel.
Today I will show how this Web Service may be used to combine our two products, UnRisk FACTORY and UnRisk-Q. The necessary steps in this example are:
  • Set up a financial instrument in the UnRisk FACTORY (this can be done in a very convenient way)
  • Load this instrument into Mathematica by using the UnRisk Web Service
  • Set up an interest rate curve in Mathematica by using UnRisk-Q
  • Price this instrument in Mathematica
Step 1: Setting up the financial instrument (in our example it is a simple fixed rate bond) in the UnRisk FACTORY


Steps 2 to 4 are explained in the following Mathematica code:

Conclusion: By the use of the UnRisk FACTORY, the user can set up financial instruments in a very intuitive and convenient way. The UnRisk Web Service enables the user to import these financial instruments into the world of Mathematica. By the use of UnRisk-Q, the user can perform valuations or analyze the behaviour of instrument prices under market scenarios, which can be set up in a very flexible way within Mathematica.

Sunday Thought - Should I Hope to Live Not Too Long

This Marginal Revolution post pointed me to the Atlantic piece by Ezekiel Emanuel, Why I Hope to Die at 75, with this key message:
Once I've lived to 75, my approach to my health care will completely change. I won't actively end my life. But I won't try to prolong it either.
I am 70 and obviously not neutral on this topic. BTW, when I was 12, people at 35 looked so old, and so old-fashioned in their behavior, that I thought: "I hope to die before becoming that old." Now, I want to live long - but I mean long and full, rich, exciting, mobile, recognized, loved, …

Death is a loss. But is living too long also a loss?

Loss is overrated

Beyond philosophical, psychological, socioeconomic and cultural aspects, this sounds to me like a question of prediction and risk management. It's about loss aversion (in the sense of Kahneman). Paradoxically, people who hate to realize a loss often take more risk when losses increase. Is this what Ezekiel Emanuel wants to avoid?

Optimal Risk

It is difficult to optimize risk if you don't have enough quantitative information. In Optimal Risk I briefly described my attempt to find the optimal risk when skating on cross-country ski trails. But life is more complex than cross-country skiing. It "grants" more unexpected events.

Long, but boring?

But let me take the roulette metaphor. You know you can't win in the long run. The Kelly criterion (the Kelly bet on "red" was -1/19) tells you not to bet. But you can use a small fraction of your current bankroll to stay "long" at the casino (just betting on "red"). Boring, isn't it? And the longer you play, the closer the bankroll gets to zero, and a fraction of it may become really too small to continue …
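
For the record, that number is the even-money Kelly fraction on an American wheel (38 pockets, 18 of them red):

f* = p - q = 18/38 - 20/38 = -1/19

A negative f* says the growth-optimal bet is no bet at all.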

A complete life?

I have most probably celebrated more wine-and-dine events than I should have, and slept, exercised, trained my brain, applied preventive medication … less than I should have, to prolong life as long as possible. And maybe I risked becoming slower, less creative and less productive earlier than necessary.

My statistical life expectancy is 78. But statistics also say: most probably, I will suffer from this and that "long" before. To NN Taleb, "long in history" means "long in the future", but the future is unknown. You can't really predict it, but you can build it.

Logarithmic loss (LogLoss) vs 0/1 loss?

It's not quantifiable. But maybe I was lucky and found a kind of optimal risk for a full, rich, … life. Maybe I have intuitively used a kind of LogLoss payoff instead of the crisp loss (of living long) function Ezekiel Emanuel seems to "apply"? LogLoss penalizes the extremes (confident but wrong), and "predictions" under its regime are not 0/1.
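
For reference, with y the 0/1 outcome and p the predicted probability:

LogLoss(y, p) = -(y log(p) + (1 - y) log(1 - p))

It blows up for predictions that are made with full confidence (p near 0 or 1) and turn out wrong, while the 0/1 loss only asks on which side of a threshold you end up.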

Inquisitiveness wins

However, there are too many things I haven't seen, understood, managed … yet. Too many corners I haven't looked around. Life is never "complete". I still work. Good partners and friends will tell me when I should stop.

But what I certainly know: I do not want to live infinitely long. There is an individual age x_1 … But 75? Really?

A Very Singaporean Road Sign

This summer, I went to an opera event at the Salzburg Festival. Driving into Salzburg, we ran into a really heavy rain and hailstorm. I even feared some damage from hailstones (fortunately, my little Fiat 500 stayed unscathed). Stuck in heavy traffic, I had no chance to escape without causing (further) chaos. No chance to just park in one of the underpasses, no gas station in sight … For motorcyclists it must have been much worse.

A few minutes ago, I found this at "Slate". The umbrella sign guides to designated covered areas where motorcyclists can safely wait.

We have a lot of "forbidden" road signs - to our benefit (I guess). So, I find this one pretty cool.

I can't help it: regulation and advisory come to my mind …

Summer Reading II - Americanah

Those were the thicker books I read in the first half of this summer. The second half of the summer happened on a Thursday, so to say. It was chilly and rainy … most of the time. Time for longer novels again, but I read only one:

Americanah, Chimamanda Ngozi Adichie - this is the (love) story of a Nigerian woman, Ifemelu, who left Africa for America, and her school friend Obinze, who only made it to the UK (illegally) … They plan to return together, but things do not go according to plan. Where Obinze fails (he is deported), Ifemelu thrives. Back in Nigeria, Obinze finds a lucrative job and marries a beautiful woman …

When we first meet Ifemelu, she is getting her hair braided at an African salon, 13 years after coming to America. We read that she won a prestigious fellowship at Princeton and writes a popular blog: observations about American Blacks by a Non-American Black. Yet she decided to throw this away and return home. And she returned. In America she was black - in Nigeria, she's an Americanah.

The hairdresser asks: why? So did I … but Chimamanda makes it clear through her characters.

Americanah is the first novel I have read by the award-winning Nigerian writer (it's her third). How did I discover it? It was selected as one of The 10 Best Books of 2013 by the NY Times … I read it in German.

And I really, really enjoyed reading it. A great story, a great analysis of complicated real-life situations (race and identity, love, …), a virtuosically written text.

The next round in the swap trial

The trial between the city of Linz and Bawag PSK concerning the swap 4175 is still ongoing.

On June 30, 2014, the city filed a motion (full text in German here) claiming that the judge was biased. A senate (consisting of three judges) of the commercial court in Vienna had to decide whether to allow this appeal or to dismiss it.

On Sept. 12, the appeal was dismissed (in German here). The senate found no evidence that the judge was biased.

The city of Linz will not appeal against this decision.

Therefore, the next step will be to decide whether Uwe Wystup, who was named as one of two expert witnesses, is biased.

Don't Ride the Waves of Standardization Blind

No doubt, without standardization we'd have no powerful interfaces that transform one world into another.

My industrial socialization, factory automation, was characterized by standardization. If you don't have standard machine elements, function complexes, mechanisms, … you need to make everything yourself from scratch. Even the languages understood by machine, robot, … controls need to be standardized. Our high-level, task-oriented offline programming languages were compiled to standardized control code.

Even highly automated factories were individual configurations of (kind of) standard components - hardware and software.

But we realized quickly: complex automated manufacturing tasks cannot be centrally supervised; they need to be organized as an interplay of systems with local intelligence, in a bottom-up fashion.

This is where I come from. And I am still for standardization, but I have reservations about strict supervision and  centralization.

In banking, standards consolidate transactions, rationalize accounts, integrate payment environments and what have you, but

Big regulatory wave

Regulatory bodies often use "(international) standards" to designate principles and rules of financial regulation and supervision for the general and detailed business processes in the financial sector.

After the financial crisis, it seems regulation has become a synonym for centralization. It comes like a big wave and causes big changes in financial business principles, far beyond core capital rules, risk management requirements, …

It redefines the rules of the game, even in pricing.

ISO standards are voluntary

ISO standards are written international agreements on the use of technologies, methods and processes, adopted by consensus of the partners concerned. They support consistent technical implementation.

To me this is vital: it suggests an orthogonal engineering, implementation and management of technologies and solutions. Decentralized implementation does not restrict systemic use.

Consequently, standardized platforms do not kill innovation.

Central counterparty - Unintended consequences?

My view on central clearing: at a higher level, central clearing reduces counterparty exposure but may result in an increase in liquidity risk. This kind of centralization may drive a marginal-cost regime with margin compression (OTC revenue reduction) …

One of the rationales: Deloitte's Central Clearing for OTC Derivatives ...

Technology providers will not be able to influence the rules, but

Individualize with UnRisk

We will put our best efforts into supporting our small and medium-sized clients in evaluating the revenue impact and maybe refining product and sales strategies to match their business strengths.

Quants will become even more important as our partners. Instruments will become less complex, but the valuation space will become massive. Market dynamics will change basic rates more frequently. Consequently, the methodology to price a simple swap changes fundamentally, portfolio optimization gets another meaning, …

We will soon offer the methodologies for these new regimes to be managed in our UnRisk Financial Language in combination with the UnRisk FACTORY Data Framework supporting the corresponding financial objects and data.

Designed to enable quants to build systems for better trading decisions and risk-informed sales strategies under a new (regulatory) regime.

Sharpen Your Tools

In fall of 2010 we decided to go cross platform with our quantitative finance tool UnRisk-Q. The library was initially developed for Windows only, but the ongoing shift in platform popularity made us consider also offering it for Linux and Mac OS X. Mathematica, which forms the basis of UnRisk Financial Language, is also available for these three platforms.

When we started, the whole build process of UnRisk-Q was based on manually maintained Visual Studio C++ projects. We looked at different cross platform build tools and finally settled on using CMake as our build tool for the following reasons:

  • It generates native build solutions (Visual Studio projects on Windows, Xcode projects on Mac OS X and Makefiles under Linux).

  • Unlike other build tools, it does not have a platform bias, but works equally well on the three target platforms Windows, Linux and Mac OS X.

  • CMake covers the whole software build process consisting of building, automated testing, code coverage, continuous integration and packaging.

  • A CMake installation is fully self-contained and does not depend on a third-party scripting language.

Once the build system was chosen, the existing C++ code needed to be made cross platform. This is a straightforward process, which requires replacing platform-specific code with platform-agnostic code where possible and insulating the platform-specific code that remains. In doing that, we often had to make changes to widely used project header files, which triggered a rebuild of the whole project. Since UnRisk-Q's code base consists of about half a million lines of C++ code, this meant that we had to wait almost half an hour for a build to finish.

The Preprocessor Takes the Blame

A short C++ program consisting of about 100 lines of source code is turned into a 40,000-line compilation unit by the preprocessor, which handles the inclusion of standard headers. So all a C++ compiler does these days is continually parse massive compilation units. Since any complex C++ project consists of dozens of C++ source files, and many of the source files use the same standard headers, the C++ compiler has to do a lot of redundant work.
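
The blowup is easy to observe yourself; here is a minimal sketch (the exact line count varies with compiler and standard library version):

// hello.cpp - about five lines of source code
#include <iostream>

int main() {
    // Run the preprocessor alone, e.g. "g++ -E hello.cpp | wc -l",
    // to count the tens of thousands of lines the compiler actually parses.
    std::cout << "hello, preprocessor" << std::endl;
}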

The downsides of the preprocessor have been known for a long time. In his book The Design and Evolution of C++ Bjarne Stroustrup made the following statement about the preprocessor (Cpp): “Furthermore I am of the opinion that Cpp must be destroyed.” The book was released in 1994. 20 years later the preprocessor is still alive and kicking in the world of C++ programming.

The preprocessor is here to stay, so two different techniques have been developed to speed up preprocessing. The first one is the precompiled header (PCH); the other one is the single compilation unit, more commonly known as a "unity build". Both techniques are good ideas in principle; they have, however, failed to gain wide use in many C++ projects for the following reasons:

  • Precompiled headers require the creation and continuous manual maintenance of a prefix header.
  • C++ compiler vendors have implemented PCH support in different, incompatible ways.
  • Unity builds break the use of many C++ language features. They may cause unintended collisions of global variable and macro definitions (see the sketch after this list). Thus unity builds rarely work without source code modifications.
  • Most C++ projects start out small and grow over time. When the need for adding PCH or unity build support is felt, it is too much work to incorporate it into the existing build system.
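
As an illustration of the collision problem, consider this minimal sketch (file names and contents are made up):

// a.cpp
#define BUFFER_SIZE 512
static int counter = 0;   // file-local by design

// b.cpp
static int counter = 0;   // fine as long as b.cpp is a separate
                          // translation unit

// unity.cpp - the generated single compilation unit
#include "a.cpp"
#include "b.cpp"          // error: redefinition of 'counter'; also,
                          // BUFFER_SIZE silently leaks into b.cpp

Both files compile cleanly on their own; concatenated into one compilation unit, they don't.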

Given the modern build infrastructure that CMake provides, I thought that adding support for precompiled headers and unity builds should be as easy as stealing candy from a baby. I couldn't have been more wrong. The existing solutions at that time were only hacks divorced from software engineering reality. So this was clearly a case where Jean-Baptiste Emanuel Zorg's rule ("if you want something done, do it yourself") applies. On top of that, it was an interesting weekend project to take on.

Designing the Interface

Interface-wise, I wanted to be able to speed up the compilation of a CMake project by using one of the simplest technical interfaces known to man.

In programming terms, this means that if you have a CMake project which creates an executable:

add_executable(example main.cpp example.cpp log.cpp)

you just call a function with the corresponding CMake target, after making the module that provides it available with include(cotire):

include(cotire)
cotire(example)

cotire is an acronym for compile time reducer. The function should then do its magic of speeding up the build process. It should hide all the nasty details of setting up the necessary build steps and should work seamlessly with the big four C++ compiler vendors: Clang, GCC, Intel and MSVC.

Once you have designed an interface that you think succinctly solves your problem, it is extremely important to fight the urge to make the interface more complicated than it needs to be just to make it cope with some edge cases. Giving in to that urge too early is the reason why software developers have to deal with subpar tools and libraries on a daily basis.

Implementation

A well designed interface should give you a crystal-clear view of the technical problems that need to be solved in order to make the interface work in reality. For cotire, the following problems needed to be solved:

  1. Generate a unity build source file.
  2. Add a new CMake target that lets you build the original target as a unity build.
  3. Generate a prefix header.
  4. Precompile the prefix header.
  5. Apply the precompiled prefix header to the CMake target to make it compile faster.

Using CMake custom build rules, cotire has the build system generate the following files at build time.

The unity build source file is generated from the information in the CMake list file by querying the target’s SOURCES property. It consists of preprocessor include directives for each of the target source files. The files are included in the same order that is used in the CMake add_executable or add_library call.

This is the unity source generated for the example project under Linux:

#ifdef __cplusplus
#include "/home/kratky/example/src/main.cpp"
#include "/home/kratky/example/src/example.cpp"
#include "/home/kratky/example/src/log.cpp"
#endif

The prefix header is then produced from the unity source file by running it through the preprocessor and keeping track of each header file used. Cotire automatically chooses headers that are located outside of the project root directory and thus are likely to change only infrequently.

For a complex CMake target, the prefix header may contain system headers from many different software packages, as can be seen in the example prefix header below generated for one of UnRisk-Q’s core libraries under Linux:

#pragma GCC system_header
#ifdef __cplusplus
#include "/usr/local/include/boost/tokenizer.hpp"
#include "/usr/local/include/boost/algorithm/string.hpp"
#include "/usr/include/c++/4.6/iostream"
#include "/usr/local/include/boost/lexical_cast.hpp"
#include "/usr/local/include/boost/date_time/gregorian/gregorian.hpp"
#include "/usr/local/include/boost/numeric/ublas/matrix.hpp"
#include "/usr/local/include/boost/numeric/ublas/matrix_proxy.hpp"
#include "/usr/local/include/boost/numeric/ublas/matrix_sparse.hpp"
#include "/usr/local/include/boost/numeric/ublas/banded.hpp"
#include "/usr/local/include/boost/numeric/ublas/triangular.hpp"
#include "/usr/local/include/boost/numeric/ublas/lu.hpp"
#include "/usr/local/include/boost/numeric/ublas/io.hpp"
#include "/usr/include/c++/4.6/set"
#include "/usr/include/c++/4.6/bitset"
#include "/usr/include/c++/4.6/cmath"
#include "/usr/local/include/boost/foreach.hpp"
#include "/usr/local/include/boost/regex.hpp"
#include "/usr/local/include/boost/function.hpp"
#include "/usr/local/include/cminpack-1/cminpack.h"
#include "/usr/local/include/cminpack-1/minpack.h"
#include "/usr/include/c++/4.6/fstream"
#include "/usr/include/c++/4.6/ctime"
#include "/usr/include/c++/4.6/numeric"
#include "/usr/include/c++/4.6/cfloat"
#include "/usr/include/c++/4.6/cstdlib"
#include "/usr/include/c++/4.6/cstring"
#endif

The precompiled header, which is a binary file, is then produced from the generated prefix header by using a proprietary precompiling mechanism that depends on the compiler used. For the precompiled header compilation, the compile options (flags, include directories and preprocessor defines) must match the target's compile options exactly. Cotire extracts the necessary information automatically from the target's build properties that CMake provides.

As a final step, cotire modifies the COMPILE_FLAGS property of the CMake target to force the inclusion of the precompiled header, using the compiler's forced-include option (e.g. -include under GCC and Clang, or /FI under MSVC).

Speedup

With cotire we were able to cut the build time of the Windows version of UnRisk-Q by 40 percent.

Users who have adopted cotire for adding precompiled headers have reported similar speedup numbers.

With tools that are developed with a special in-house purpose in mind, it's all too easy to fall into the "it works on my machine" trap. Therefore we also applied cotire to some popular open source projects in order to test its general-purpose applicability. One project we tested it on is LLVM, a huge C++ project with close to a million lines of code; yet the change set needed to apply cotire to it is just 100 lines of code. A cotire PCH build reduces the build time for LLVM 3.4 by about 20 percent.

One project where unity builds work out of the box, without changes to the source code, is an example text editor application for Qt 5. Applying a cotire-generated precompiled header to this project reduces compile time by the usual 20 percent, but doing a cotire unity build results in a reduction of 70 percent.

Other users of cotire have reported even larger speedups with cotire unity builds.

Conclusion

As described in the book The Cathedral and the Bazaar, one of the lessons for creating good open source software is that every good work of software starts by scratching a developer's personal itch. Cotire was released as an open source project in March 2012. Since then it has been adopted by hundreds of open and closed source projects that use CMake as a build system, among them projects from Facebook and Netflix.

A book each physicist should have in her library

I am currently furnishing my new office, and for my bookshelf I decided to buy special editions of my favourite books. During my studies of physics I had a lot of different textbooks for the basic physics courses, like the Berkeley Physics Course or Tipler's book. And although many of them were great, none of them impressed me like The Feynman Lectures on Physics. Therefore I decided to buy the Millennium Edition of this book(s).


Richard Feynman taught these lectures to Caltech freshmen and sophomores in the early 1960s; out of them, the three volumes of the book were created by him and his coauthors and published between 1963 and 1965. Volume I concentrates on mechanics, radiation, and heat; Volume II on electromagnetism and matter; and Volume III on quantum mechanics.

I want to end today's blog post with a quote from Mark Kac:

"There are two kinds of geniuses: the 'ordinary' and the 'magicians'. An ordinary genius is a fellow whom you and I would be just as good as, if we were only many times better. There is no mystery as to how his mind works. Once we understand what they've done, we feel certain that we, too, could have done it. It is different with the magicians. Even after we understand what they have done it is completely dark. Richard Feynman is a magician of the highest calibre."

There Is A National Pygmy Goat Association

As one of the men who stare at goats, wolves and lions, I am really amazed: there is an NPGA. Here is the story, in which there is nothing like a magic forest, but a fierce battle over genetic purity …

(again pointed at Marginal Revolution)