New Year 2012. We have passed 10 years of UnRisk in bank use.
Our pricing and calibration engine, forming the solving foundation of the UnRisk products, is on version 6, the 20th release since Dec-01. UnRisk FACTORY is on version 4. The UnRisk DIRECTOR is released.
UnRisk runs on hybrid CPU and GPU systems, where the hosting PCs now have 16 cores minimum. UnRisk supports cloud computing and smart client devices.
UnRisk Academy ties all learning aspects of UnRisk together, completing the UnRisk offerings into know-how packages.
What were the driving forces of our innovations?
Connecting problems and ideas from different fields.
For the most sophisticated deal types, we have transferred our high-end numerical schemes and inverse-problems knowledge from industrial process and risk control to quantitative finance.
We count on our multi-sector experience in numerical mathematics, symbolic computation, machine learning and software engineering.
What were the primary developments in the past 2 years?
We know from projects like weather forecasting that systems should be multi-strategy and multi-method, checking the effectiveness of valuations against alternatives.
To understand financial data better, we apply a set of machine learning methods, from support vector machines to fuzzy methods.
Our hybrid system was built in layers from the beginning: number crunching in C++; a QF-specific programming language built in Mathematica; front-ends built as little web-based operating systems; and middleware in Java.
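The layering idea can be sketched in a few lines. This is a purely illustrative Python miniature, not UnRisk code: a toy discounting kernel stands in for the C++ number-crunching layer, and a small deal class stands in for the QF-specific language layer on top of it.

```python
import math

def discount_kernel(cashflows, times, rate):
    """Numeric layer: plain number crunching (in UnRisk this lives in C++)."""
    return sum(cf * math.exp(-rate * t) for cf, t in zip(cashflows, times))

class Bond:
    """Domain layer: a deal-type vocabulary on top of the kernel (illustrative)."""
    def __init__(self, coupon, maturity, notional=100.0):
        self.times = list(range(1, maturity + 1))
        self.cashflows = ([coupon * notional] * (maturity - 1)
                          + [(1.0 + coupon) * notional])

    def value(self, rate):
        # The domain layer only translates the deal into kernel calls.
        return discount_kernel(self.cashflows, self.times, rate)

# A 3-year 5% coupon bond, continuously discounted at a flat 3% rate
print(round(Bond(coupon=0.05, maturity=3).value(rate=0.03), 2))
```

The point of the layering is that the kernel knows nothing about deals and the deal layer knows nothing about numerics, so either can be swapped (e.g. for a GPU kernel) without touching the other.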
Is this enough?
To understand the most critical problems in the universe of deal types, we established closed customer feedback loops. But we also ask questions that challenge common sense.
Leading to .... ?
In fact, the idea to build the FACTORY on top of the ENGINE was the joint idea of 3 customers and us, questioning whether we could unite valuation and risk management in one system that also makes all actions persistent.
A good base to fully consolidate and exploit?
But there are still challenges that quickly spring to mind: Are large clusters required to get enough speed-up? Why not UnRisk from the plug? What can we learn from WolframAlpha? Do our customers need to select the models for their deal types explicitly? Is it possible to apply adaptive precision control? Is SQL a bottleneck for high-performance transaction processing? ...
We grow and exploit technologies in co-evolution with customer and market requirements.
But any next big ideas?
By looking out for the behavioural details of our customers, the market and the finance industry. The crisis has shown how quickly markets can shift to new regimes. Besides personal contacts, there are many resources.
I especially like the discussions in the Wilmott forums. Like: "will the continuum of quantitative analysts expand from quantitative finance engineers to quantitative finance economists?"
IMO yes, as modeling, the intuitive interpretation of financial phenomena, needs mathematics supported by knowledge- and data-driven methods.
Is this radical innovation?
Yes, IMO: mathematics and data-driven methods, intelligently combined, is.
We try out ideas through evolutionary prototyping. My first rendezvous with symbolic computation, parallel computing and machine learning was around 1990. I have always taken advantage of trying those technologies in challenging experimental prototypes. Andreas, with his industrial mathematics experts working for process engineers and automation specialists, also prototyped to verify ground-breaking approaches, both in the process and manufacturing industries, the life sciences, ....
15 years ago, we were asked by a London-based trading desk to develop pricing tools for Asian-type convertible bonds. Andreas found the Adaptive Integration algorithm, still used in UnRisk, and verified it in a few-days prototype. It led to a comprehensive system.
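For readers unfamiliar with the idea: adaptive integration refines the grid only where a local error estimate demands it. Here is a generic adaptive Simpson sketch in Python, for illustration only; the actual UnRisk Adaptive Integration algorithm is not public and is not what is shown here.

```python
import math

def simpson(f, a, b):
    """One Simpson panel over [a, b]."""
    c = (a + b) / 2.0
    return (b - a) / 6.0 * (f(a) + 4.0 * f(c) + f(b))

def adaptive_simpson(f, a, b, tol=1e-9, whole=None):
    """Bisect recursively until the Richardson error estimate meets tol."""
    if whole is None:
        whole = simpson(f, a, b)
    c = (a + b) / 2.0
    left, right = simpson(f, a, c), simpson(f, c, b)
    # |left + right - whole| / 15 estimates the error of the refined result.
    if abs(left + right - whole) < 15.0 * tol:
        return left + right + (left + right - whole) / 15.0
    return (adaptive_simpson(f, a, c, tol / 2.0, left)
            + adaptive_simpson(f, c, b, tol / 2.0, right))

print(adaptive_simpson(math.exp, 0.0, 1.0))  # close to e - 1
```

The payoff is that effort concentrates near kinks and sharp features, which is exactly where payoff functions of exotic deals misbehave.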
10 years ago, Wolfram and we built a reference webUnRisk example.
6 years ago, we remotely benchmarked gridUnRisk at the HPC Center Cambridge.
A year later we began our GPU experiments.
Oh yes, all more than 2 years old?!
Recent experiments dealt with surrogate modelling, machine learning for pre- and post-processing and advanced global optimization techniques for the objective functionals.
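The surrogate-modelling idea, roughly: sample an expensive objective at a few points, fit a cheap model through the samples, and search that instead. A minimal, purely hypothetical Python sketch; the toy objective and sample points are invented for illustration and have nothing to do with UnRisk internals.

```python
import math

def expensive_objective(x):
    """Stand-in for a costly calibration objective (illustrative toy)."""
    return (x - 1.3) ** 2 + 0.05 * math.cos(3 * x)

def quadratic_surrogate(f, x0, x1, x2):
    """Fit y = a*x^2 + b*x + c exactly through three sample points."""
    y0, y1, y2 = f(x0), f(x1), f(x2)
    # Newton divided differences, expanded to monomial coefficients.
    d1 = (y1 - y0) / (x1 - x0)
    d2 = ((y2 - y1) / (x2 - x1) - d1) / (x2 - x0)
    a = d2
    b = d1 - d2 * (x0 + x1)
    c = y0 - d1 * x0 + d2 * x0 * x1
    return a, b, c

# Three expensive evaluations, then a closed-form minimum of the cheap model.
a, b, c = quadratic_surrogate(expensive_objective, 0.0, 1.5, 3.0)
x_star = -b / (2 * a)
print(round(x_star, 2))  # close to the true minimizer near 1.3
```

Real surrogate modelling uses richer models and iterates (re-sampling near the surrogate optimum), but the cost argument is the same: three expensive calls replace a dense search.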
Backed up on a larger scale?
We partner with Wolfram Research, the MS HPC Group and the Nvidia Tesla Group. This forms a natural network for high-performance computing in finance. This server-side technology network has been extended to cloud-computing partners.
The quantitative finance community has always emphasized speed and accuracy?
Yes, but has it always been successfully realized? However, we are now in the Mobile Decade. Rumors about the "Apple Tablet" were fired by Wired at the end of 2009. We were prepared, because UnRisk FACTORY could already be accessed from an iPhone. This opened new partnerships, which we are intensifying now.
You seem to be quite technology driven?
No, don't be mistaken: we serve customers individually, whilst driving generic technologies and packaging know-how.
Take Solventis. They have been running an UnRisk FACTORY based application server for approximately 5 years.
Other financial services institutions followed. Like us, markets have begun to disrupt and re-invent themselves.
UnRisk Academy goes far beyond product-use training, giving full explanations of the advantages, limits, barriers and traps of a solving foundation, use cases and implementation details. This closes the customer feedback loop.
Oh, I read the post of 4-Jan-10 on this blog. You haven't followed all of your forecasts!
No? Call it demand-first innovation responding to rapid change.
Source of inspiration: the 5 skills of innovators from HBR, Dec-09, "The Innovator's DNA".