If you look at it bottom-up, many instruments today have intelligence built in, and that is a big change.
In physics, new tools made available by computing power drive new ways of studying and interpreting behavior. Instead of formulating individual specifics and dependencies in complex models, researchers begin to work with immense collections of data. And yes, a system that lets you, say, zoom from a forest into the inside of a single cell of a chanterelle will give you new insight, perhaps into how the mycelium connects plants.
Katy Börner calls such a system a macroscope: a system that provides a vision of the whole.
To ask provocatively: does this only change the way we do visual data analysis? No, it is more general; it is about visualizing the dynamics of knowledge. But what about the knowledge engines? Do they support the same platforms and tools?
Last Thursday and Friday we conducted a workshop with one of our customers on controlling risk, from the valuation of single instruments up to the level of large portfolios over time. Looking at the computational side of financial processes, uncertainty can arise from the environment, the initial conditions, and the intrinsic generators. Let me call these, in short, technology risk: poor adequacy, accuracy, or robustness, but also poor timing.
If you have clever algorithms and powerful computing muscle, you can do intrinsic method and model testing, automated precision control, data plausibility analysis, and more.
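To make "automated precision control" concrete, here is a minimal sketch, not the UnRisk engine: a Monte Carlo valuation is cross-checked against a closed-form Black-Scholes reference, and the method is flagged when the discrepancy exceeds a statistical tolerance. All names, parameters, and tolerances are illustrative assumptions.

```python
# Automated precision control, sketched: cross-check a Monte Carlo price
# against a closed-form reference and flag a failure when they disagree
# beyond the statistical error. Illustrative only, not the UnRisk API.
import math
import numpy as np

def bs_call(spot, strike, rate, vol, maturity):
    """Closed-form Black-Scholes price of a European call (reference method)."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol**2) * maturity) / (vol * math.sqrt(maturity))
    d2 = d1 - vol * math.sqrt(maturity)
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return spot * cdf(d1) - strike * math.exp(-rate * maturity) * cdf(d2)

def mc_call(spot, strike, rate, vol, maturity, n_paths=200_000, seed=42):
    """Monte Carlo price of the same call (method under test), with standard error."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    terminal = spot * np.exp((rate - 0.5 * vol**2) * maturity + vol * math.sqrt(maturity) * z)
    payoff = np.maximum(terminal - strike, 0.0)
    disc = math.exp(-rate * maturity)
    return disc * payoff.mean(), disc * payoff.std(ddof=1) / math.sqrt(n_paths)

ref = bs_call(100.0, 100.0, 0.02, 0.2, 1.0)
mc, se = mc_call(100.0, 100.0, 0.02, 0.2, 1.0)
# Flag the method if the reference lies outside ~3 standard errors of the estimate.
assert abs(mc - ref) < 3 * se, f"precision check failed: {mc:.4f} vs {ref:.4f}"
print(f"reference {ref:.4f}, MC {mc:.4f} +/- {se:.4f}: OK")
```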
Analyzing portfolios across scenarios over time (VaR calculations, back testing, stress testing, ...) calls for blazingly fast calculations, with single valuation times in microseconds. For instance, revaluing 10,000 instruments under 1,000 scenarios already means ten million valuations per run; at one microsecond each, that is ten seconds of pure valuation work.
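To show where those valuation counts come from, here is a minimal historical-simulation VaR sketch under simplifying assumptions: a linear portfolio, synthetic scenario returns, and made-up sizes. A real engine would revalue nonlinear instruments in each scenario, which is exactly why per-valuation speed dominates.

```python
# Historical-simulation VaR, sketched: revalue the portfolio under each
# scenario and read the loss quantile. Data and sizes are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_instruments, n_scenarios = 10_000, 1_000

positions = rng.uniform(-1.0, 1.0, n_instruments)        # holdings per instrument
base_prices = rng.uniform(50.0, 150.0, n_instruments)    # today's prices
# Scenario returns, e.g. from 1,000 historical days (synthetic here).
scenario_returns = rng.normal(0.0, 0.01, (n_scenarios, n_instruments))

# One full revaluation per scenario: n_scenarios x n_instruments valuations.
scenario_pnl = scenario_returns @ (positions * base_prices)

# 99% one-day VaR: the loss not exceeded in 99% of scenarios.
var_99 = -np.quantile(scenario_pnl, 0.01)
print(f"99% VaR over {n_scenarios} scenarios: {var_99:,.0f}")
```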
We at UnRisk are in such a Blazing Business.
The new hybrid CPU-GPU systems empower us to intelligently combine coarse-grained and massively fine-grained parallelism; the two multiply into speed-ups in the hundreds of thousands, providing our customers with a macroscope to monitor the dynamics of their blazing business.
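A toy sketch of why the two levels multiply: a process pool distributes scenarios across CPU cores (coarse grain), while vectorized NumPy arithmetic over instruments stands in for the GPU's fine-grained parallelism. With P processes and a K-fold vector speed-up per scenario, the overall gain is roughly P times K. Everything here is a stand-in; the real hybrid setup is CUDA on actual GPUs.

```python
# Multiplicative parallelism, sketched: scenarios in parallel across cores
# (coarse grain), instruments vectorized within each scenario (fine grain,
# standing in for the GPU). Sizes and the valuation formula are toy choices.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def revalue_scenario(seed: int) -> float:
    """Revalue all instruments under one scenario (vectorized inner level)."""
    rng = np.random.default_rng(seed)
    shocks = rng.normal(0.0, 0.01, 1_000_000)   # one shock per instrument
    values = 100.0 * np.exp(shocks)             # toy valuation, fully vectorized
    return float(values.sum())

if __name__ == "__main__":
    # Coarse grain: 1,000 scenario revaluations spread over the CPU cores.
    with ProcessPoolExecutor() as pool:
        totals = list(pool.map(revalue_scenario, range(1_000), chunksize=50))
    print(f"{len(totals)} scenarios, mean portfolio value {np.mean(totals):,.0f}")
```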
For one aspect of this, meet us at Accelerating Risk Analytics.