Royal Bank of Scotland fiasco shows the power of networks

The last week or so has seen complete mayhem in the Royal Bank of Scotland and its subsidiaries. A computer glitch has caused their payments systems to collapse. Payments have not been processed, and 17 million customers have been unable to access their accounts or pay their bills.

The impact for RBS has been catastrophic. So, an incident of this magnitude must surely have been caused by a massive event? Perhaps the building containing the Bank’s main computers was burned to the ground? Or the system was the victim of a malevolent cyber attack by a hostile power? In fact, nothing like this took place at all. It seems that an inexperienced operative in India accidentally wiped information during a routine software upgrade.

In other words, a relatively trivial problem cascaded across the entire network and ‘went global’. 

This is not an issue which is specific to RBS. It is a fundamental feature of any system in which networks are important. The classic example is outages in electricity supply systems, leading to huge blackouts. Sometimes a major failure does indeed have a major cause, such as a hurricane or ice storm destroying physical links in the system. But all too often, it is a trivial failure which leads to a cascade across the system.

Most of the time, of course, the impact of small events is confined to their immediate locality and spreads no further. But it is the connected nature of networked systems which means that, in principle, even small events can have consequences on a scale up to and including the network as a whole. The probability of any single small event causing a dramatic incident is very, very small. But trivial problems occur on an almost daily basis in almost all systems. So at any time, there is the potential for catastrophic failure.

In the scientific literature on the fundamental mathematical properties of networks, there is a jargon to describe this inherent property of networked systems.  They are ‘robust yet fragile’, a phrase initially coined by the top Caltech scientist John Doyle, way back (!) in the 1990s.  They are ‘robust’ in that small shocks, small problems, do not usually spread very far in the system.  But at the same time they are ‘fragile’.  A tiny adverse event can in principle bring the whole system down.
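The 'robust yet fragile' property can be illustrated with a toy simulation (a sketch of my own, not drawn from the article or from Doyle's work): seed a single failure on a random network and let it spread to each neighbour with some fixed probability. Near the critical point, most cascades fizzle out almost immediately, but a rare few engulf a large part of the system. All names and parameter values below are illustrative choices.

```python
import random

def make_network(n, k, rng):
    """Build a random undirected network: each node links to ~k random others."""
    edges = {i: set() for i in range(n)}
    for i in range(n):
        for j in rng.sample(range(n), k):
            if j != i:
                edges[i].add(j)
                edges[j].add(i)
    return edges

def cascade_size(edges, p_spread, rng):
    """Fail one random node; each failed node independently knocks out
    each healthy neighbour with probability p_spread. Returns the number
    of nodes that end up failed."""
    seed = rng.randrange(len(edges))
    failed = {seed}
    frontier = [seed]
    while frontier:
        node = frontier.pop()
        for nb in edges[node]:
            if nb not in failed and rng.random() < p_spread:
                failed.add(nb)
                frontier.append(nb)
    return len(failed)

rng = random.Random(42)
net = make_network(500, 3, rng)
sizes = sorted(cascade_size(net, 0.15, rng) for _ in range(200))
# Typically: the median cascade is tiny, the largest is far bigger.
print("median cascade:", sizes[100], "largest:", sizes[-1])
```

The same trivial shock is applied every time; only the network's connectivity decides whether it stays local or 'goes global', which is exactly the robust-yet-fragile point.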

We see this principle very clearly in financial markets. Think back to the banking credit crisis of the late summer of 2007, the harbinger of the major crash just over a year later. At the end of June 2007, there were few problems. Voices were being raised about the problems of debt, but these were still very much in a minority. The anxieties had not percolated across the network of banks, and their confidence in lending to each other. Suddenly, this changed, and we had a major liquidity crisis. Inter-bank lending collapsed, leading very quickly to the demise of Northern Rock. Not much had happened. But negative sentiment suddenly cascaded across the banking network.

Companies must take these fundamental features of networks into account.  The potential problem extends far wider than financial markets.  Adverse comments, often with no basis in reality, about a firm and its products are posted all the time on the internet.  Most of the time, these do not get very far, often no further than the green-ink perpetrator of the comments.  But, very occasionally, a grievance, even one which is completely ill-founded, will get global traction and seriously damage a brand or even a whole company’s reputation.

One of the real cutting-edge areas of scientific investigation on networks is how to spot, at a very early stage, when a comment has the potential to go global. So defensive strategies are possible: firms are not powerless in our highly connected world. But it is crucial that both firms and governments start learning the lessons of the networked world of the 21st century.

by Paul Ormerod

1 Comment
  1. Interesting blog. I much agree that businesses and governments are being rapidly outpaced by the levels of development, integration and inter-connectedness in the real world. However, I think we have to be wary of the capacity of scientific models to predict and prevent future problems or potential cascades. There is a high degree of unpredictability attached to our economic and social systems; it is extremely difficult to pin down and is inherent to the system. Perhaps a more effective way is to acknowledge the limits of prevention (though this does not mean we cannot and should not improve our power to foresee and act preventively) and work more on safety nets and the fallback side – i.e. scenario analysis: if this happens, what can we do about it? In other words, less predictive science and more exploratory science and scenario analysis.
