Readers who occasionally watch any of the documentary channels on TV might be familiar with the warnings that our highly-automated and interlinked society is really rather fragile. One large solar flare or a heavy ice-storm, and power supplies and communications links suddenly fail, bringing normal activities to a grinding halt. It doesn’t necessarily take a global catastrophe to prove just how much we rely on technology for things to work smoothly, how few immediate fall-back options we have, and how much chaos follows when the technology fails. A recent incident at one client’s site was far less dramatic, but still provides a small-scale illustration.

Take a busy dealer forecourt in the middle of a large modern town. No freak weather, just demolition contractors pulling down a nearby building, whose work included cutting all of the cables and telephone lines on their patch. The lights didn’t go off on the forecourt; neither pumps nor tills stopped working. The first indications that there was something wrong were that the online card authorisation had gone ’dead’, the Lottery terminal was ’out’, and the office phone seemed quieter than usual. Not very convenient when your site does a large volume of diesel sales to heavy truck fleets, but at least everyone has a mobile phone these days. Calling the card companies for authorisation of every large sale slowed things down, and mobile calls are more expensive, but trading didn’t stop altogether.

Big Telephone Utility

Meanwhile, the site manager tried to get in touch with their Big Telephone Utility to find out what was happening, and when links would be restored. After multiple calls and transfers to various departments, it became obvious that there wasn’t going to be a quick ’fix’.

Now at this site, like most others, cash represents only some 25% of sales receipts; the rest of the daily takings are EFTs of various sorts.

Fortunately, modern POS systems are able to continue working without constant access to the phone system, and will store days or weeks of sales data, so trade didn’t have to stop, although it became considerably slower than usual. Of course, it meant that only a fraction of the site takings would be hitting the bank account in the following few days, so the client quite sensibly called their bank branch, advised them of what had happened, and reassured them that the site was open, trading as normally as possible, and that everything should be back to normal in a day or so.

Apparently fixing a cut phone line isn’t quite as simple as it once was. Eventually an engineer came out and advised that a risk assessment was required before any restoration work could begin.

risky business

The risk assessment noted the nearby demolition site and concluded that it was too risky for any repair work to be carried out. Around this time the client, realising that this saga could go on for a while and was hitting volume, spoke with their oil company. To cut a long story short (and to give some credit to the oil company, which apparently helped ’influence’ the telephone utility to get back on the case), the engineers came out again. This time an alternative route for the cable was established, but since it involved putting up a new telegraph pole on a corner of the client’s site, they advised that they’d require permission from the local council.

Hours slipped into days, and the days began to mount. Business was definitely affected, volume was certainly lower, and customers and staff were very frustrated. Then the bank manager called, asking why there wasn’t much going into the account and pointing out that the site was nearly at its account limit. You’ll recall that the client had already explained the situation to the bank right at the start of the week, but the ’helpful’ person to whom they’d spoken had seemingly forgotten to make a note of the conversation or mention it to anyone senior.

Eventually a new phone line was connected, although even this wasn’t completely successful at the first attempt, since the data line wasn’t working (it seems that the Big Telephone Utility’s engineers didn’t realise that there was one, even though their own company bills the client for it every quarter).

Now the reaction of some readers will be: ’sue someone’, or ’claim off the insurance’. Quite right, in principle, but not always practical.

How do you quantify the cost of an event like this? The obvious answer might appear to be to look at volume losses, but you’d have to compare that week’s volume with comparable weeks a year, two years or more earlier, and work out an average. Given the volatility of the fuel market, it would be difficult to prove how much of any shortfall had been caused by this particular incident; that would have been easier if the site had actually been forced to close altogether. You can try to quantify any overdraft interest costs, assuming that the incident created or exacerbated an overdraft situation, and add up the extra mobile phone charges.
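For anyone inclined to attempt that sum, here is a minimal sketch of the back-of-envelope arithmetic in Python. Every figure in it is invented for illustration rather than taken from the client’s books; the per-litre margin, overdraft rate and call charges are all assumptions:

```python
# Rough estimate of the cost of an incident week at a forecourt.
# All figures are hypothetical, chosen purely for illustration.

# Litres sold in the same calendar week in previous years (assumed).
prior_year_volumes = [112_000, 108_000, 116_000]
incident_week_volume = 74_000      # litres actually sold that week (assumed)
margin_per_litre = 0.05            # gross margin in pounds per litre (assumed)

# Baseline: average of comparable weeks from earlier years.
baseline = sum(prior_year_volumes) / len(prior_year_volumes)
lost_volume = max(baseline - incident_week_volume, 0)
lost_margin = lost_volume * margin_per_litre

# Overdraft interest, if the delayed takings pushed the account
# (further) into overdraft: simple daily interest over the outage.
delayed_takings = 60_000           # pounds held up in the POS system (assumed)
overdraft_rate = 0.12              # annual overdraft rate (assumed)
days_affected = 7
overdraft_interest = delayed_takings * overdraft_rate / 365 * days_affected

extra_mobile_charges = 150.0       # extra cost of mobile calls (assumed)

total = lost_margin + overdraft_interest + extra_mobile_charges
print(f"Baseline week:      {baseline:,.0f} litres")
print(f"Lost margin:        £{lost_margin:,.2f}")
print(f"Overdraft interest: £{overdraft_interest:,.2f}")
print(f"Extra mobile calls: £{extra_mobile_charges:,.2f}")
print(f"Estimated total:    £{total:,.2f}")
```

Even on those fairly generous assumptions the recoverable figure is modest, which rather supports the point that follows.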

stress factor

But in the end, the biggest ’cost’ was the stress and frustration suffered by staff and customers, and it’s far harder to put a financial value on that, even assuming that you were wealthy enough (or just plain bloody-minded enough) to involve the lawyers. Of course, you’d need to know who to claim from. The demolition contractors may seem the prime candidates, but the real irony of this case was that they’d actually done nothing wrong: the lines they’d cut had already been marked as ’dead’ on their plans by all of the relevant utility companies.

This wasn’t cutting-edge technology that had failed; the humble landline has been with us for over a century. There may be no way of preventing human error, but if this incident had occurred at a business that was struggling, rather than soundly financed, the resulting cashflow interruption might well have proven the final straw.