Sixty years ago, digital computers made information readable and models computational. Twenty years ago, the internet made information reachable. Ten years ago, Google built massive computing infrastructure to search massive amounts of information. Can it still our hunger for enormous computing performance? Valuing 1000 instruments across 1000 scenarios, at 10 seconds per valuation, takes 10 million seconds, about 116 days. If you need the result within a minute, a speed-up of roughly 167,000 is required. Coarse-grained parallelization across up to 1000 cores gives you a reasonable factor of 1000, but a further speed-up of about 167 is still required.
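The back-of-envelope arithmetic above can be checked in a few lines (a sketch of the calculation only; the instrument and scenario counts are the illustrative figures from the text):

```python
# Back-of-envelope check of the valuation workload described above.
instruments = 1000
scenarios = 1000
sec_per_valuation = 10

total_sec = instruments * scenarios * sec_per_valuation   # 10,000,000 s
days = total_sec / (60 * 60 * 24)                         # ~116 days

target_sec = 60                                           # we want the answer within a minute
required_speedup = total_sec / target_sec                 # ~167,000
cores = 1000                                              # coarse-grained parallelization
remaining_speedup = required_speedup / cores              # ~167 still missing

print(f"{days:.0f} days, speed-up {required_speedup:,.0f}, "
      f"remaining after {cores} cores: {remaining_speedup:.0f}")
```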
As a next step, you might think of running, say, the Monte Carlo simulation on GPUs, reducing the number of valuations by a substantial factor, or applying surrogate models.
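The surrogate-model idea can be sketched in a few lines: evaluate the expensive model on a small training grid, then answer the bulk of the scenarios with a cheap approximation. The toy pricing function and grid size below are hypothetical stand-ins, not a real pricing model:

```python
import math

def expensive_valuation(s):
    # Hypothetical stand-in for a slow pricing routine
    # (in the text, ~10 seconds per call).
    return math.exp(-0.5 * s) * 100.0

# "Train" the surrogate: call the expensive model on a coarse grid only.
grid = [i / 10 for i in range(11)]             # 11 expensive calls, not 1000
values = [expensive_valuation(s) for s in grid]

def surrogate(s):
    # Cheap piecewise-linear interpolation between precomputed grid points.
    i = min(int(s * 10), 9)
    w = s * 10 - i
    return (1 - w) * values[i] + w * values[i + 1]

# Answer all 1000 scenarios with the surrogate instead of the full model.
approx = [surrogate(k / 1000) for k in range(1000)]
exact = [expensive_valuation(k / 1000) for k in range(1000)]
max_err = max(abs(a - e) for a, e in zip(approx, exact))
```

For smooth pricing functions the interpolation error stays small while the number of expensive calls drops by orders of magnitude; in practice the surrogate would be a regression or machine-learning model fitted to the training points.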
But if we then extend the task, to multi-model valuations, cross-model validation, dynamic visualisation, we may lose the gain just as quickly. And who wants to invest in a 1000-core system for a once-a-week requirement? Welcome to cloud computing.
Web clients have replaced rich-client interfaces for applications related to communication, shopping and personal banking. Web interfaces are ubiquitous, because the only software the user needs is a web browser, which increasingly comes pre-installed on mobile devices. You want to be reactive and make decisions wherever you are? Welcome to ubiquitous computing.