The optimist proclaims that we live in the best of all possible worlds; and the pessimist fears this is true. - James Branch Cabell

Our technology just can't go wrong

- an essay on risk management that reflects the concerns and events depicted in the series, by Leo Singer.

In the aftermath of a major disaster the news media trade in the language of shock, promises and condemnation. You never expect it to happen to you... This was a disaster waiting to happen... We will do all we can to get to the bottom of this tragedy... Experts are questioning the reliability of this new technology... The management knew the risks... We trusted them to do things properly... I'll be talking to my lawyers. The bloody mess left behind by death on a large scale fires the emotions. Heartfelt but baseless accusations start flying and blame is apportioned. Those in positions of responsibility or authority become unduly defensive and say things they will later regret.

Federico Peña, in the desolate swamps of the Florida Everglades, surrounded by the wreckage of a DC-9, will say "ValuJet is a safe airline, as is our entire aviation system". The Chief Executive of Occidental Petroleum will say "I think we have taken every precaution we know how". The Union Carbide works manager at Bhopal will say "the gas leak just can't be from my plant... our technology just can't go wrong".

As the initial confusion recedes an official investigation will usually begin. Teams of experts in many different fields will sift through the wreckage, interview the witnesses, debate cause and effect. They unpeel layer after layer, cause behind cause, seeking the real and provable physical events which led to a tragedy. But there will be contradictory testimony, intractable forensic puzzles, and missing pieces of the jigsaw. 

And behind it all is a real and unshakable fear that someone, somewhere, is to blame. Someone, whether through incompetence, inaction, lack of knowledge or carelessness, will have blood on their hands, and this is what distinguishes a man-made from a natural disaster. 

An engineer once compared designing a new passenger ferry or an airplane to throwing knives in a circus act. If everything works, that's fine, that's what you're paid for. But one fatal slip-up and your knife-throwing days are over.

Risky business 

People who own airlines or run train companies will say: it's safer, statistically, to fly on one of our planes or sit on a train than it is to cross the road (or some other relatively dangerous but mundane activity). Or: you are more likely to die from a natural methane explosion than from a toxic leak at a refinery. Or: you are more likely to die from a meteorite strike than from a meltdown in a nuclear reactor.

It seems sensible to be aware of the relative risks involved. It is certainly true that we are far more likely to die in a car wreck or of cancer than in a plane or train crash. But this is as it should be: it is a prerequisite for airlines being allowed to fly us in the first place. If even 1% of planes crashed, even the most daredevil among us would think twice about flying. In any case, few people (except the FAA, as we shall see) would use these numbers to justify putting people at greater risk than is absolutely necessary. The likelihood of other possible ways of dying becomes irrelevant when you are sitting on an airplane. You are implicitly placing your trust in a whole hangar-full of people: airframe and engine mechanics, government aviation inspectors, airline managers, aeronautical engineers, fuelling workers, baggage handlers, security screeners, pilots and crew, weather information services, air traffic controllers, designers of air traffic control systems, maintenance subcontractors, pilot trainers, and emergency services.

On the road, by contrast, you mostly need to trust other road users and your own driving ability. The overall risk is higher (depending on how much and where you drive, how good a driver you are, and a hundred other factors) but the survivability rate is good: people more often than not survive car crashes. And if the technology fails you are far more likely to splutter to a halt than crash and burn. Cars do not travel 35,000 feet up in the air. So there is a difference. Technological disasters affect people who are put in a position of risk, but who are trusting others to minimise that risk. It is this trust - and lack of control over the outcome - that inflames anger and drives the desire to find out exactly what went wrong. And often we find that the tragedy was inevitable, and the underlying causes were easily avoidable.

Too close to the sun

Big construction projects or programmes like the space race or a new form of power station are looked upon with pride by those who pay for them and those who build them. They are symbols of wealth, of power, of taming nature, of technological innovation. The Channel Tunnel was one of the biggest construction projects ever undertaken in the UK. The space programme may reach for the stars but, far more usefully, the Channel Tunnel reached across to France. It was a step towards a modern European future and a signal that Britain was ready to renounce its island status and become part of something bigger. It was also a showcase for British engineering and design.

With many prestige projects there is a continual tug-of-war between those who supply the money - whether private financiers or politicians - and the engineers. The tunnel was no different. Like most big construction projects undertaken in uncertain waters, both physical and financial, its costs spiralled out of control and it opened years after it was first promised. But the public - who, after all, footed the bill for a large proportion of the work - patiently applauded the link on completion and set to work in Calais hypermarkets taking advantage of the new duty-free laws. Then, when a fire started on a lorry in November 1996, the honeymoon swiftly ground to a halt. Safety was paramount. But getting the link finished was also paramount, and the two objectives were not easy bedfellows. The adoption of open-sided coaches to carry lorries was a cost-cutting exercise which carried with it certain fire risks, risks which were well known to the engineers and worried over by fire service chiefs on both sides of the Channel. The subsequent report catalogued a whole series of errors, exposing serious faults in emergency training and systems.

The Space Shuttle programme was an ongoing 'prestige project'. At the beginning of 1986 the NASA administrator said the year "will probably be the most important year of our space programme, and the most important year since the space race began". The Challenger flight in January 1986 was especially important. This mission, as well as launching a satellite and a comet observatory, was to carry the first Presidentially-inspired 'teacher in space', Christa McAuliffe, who planned to broadcast lessons from orbit. NASA had planned an ambitious fifteen missions during the year, which meant any delays would have repercussions for the rest of the schedule. Challenger had already been delayed several times due to bad weather and a problem with a hatch bolt. NASA also felt itself permanently under the watchful eye of Congress, which had seen billions of tax dollars poured into a space vehicle that had not really lived up to expectations. From a public relations point of view Challenger was probably the worst shuttle NASA could have lost, and the programme was set back by at least three years. It was, some say, a classic case of flying too close to the sun.

Profit and Loss

With any big transportation or construction project the heaviest load the managers have to bear is money. Money determines everything from the price of a ticket to the strength of steel used to build a ship. For accident investigators and safety engineers money is the devil incarnate, and the desire to save it is a black and dangerous art.

The world is full of sharks ready to make money and run, and the shipping industry has its fair share; some of them own bulk carriers. These ships are the only cost-effective method of transporting large quantities of raw materials. Unless you survive on home-grown produce and solar electricity you are likely to depend on this rarely seen tramp trade of massive ships and foreign crews. In Victorian England Samuel Plimsoll fought for a legal limit to the amount of cargo a ship could carry. Many overloaded ships were lost at sea, but their owners were generally insured, and accepted the losses as part of the business. Eventually Plimsoll got his 'line', a method of easily determining whether a ship is leaving port overloaded.

So have things changed in the last hundred years? Let's say you wanted to get started in the world of bulk shipping. New ships are prohibitively expensive, especially for the owners and operators who make their living in the 'tramp' trade, the ad hoc chartering of ships from port to port. So you might go to a disreputable second-hand ship broker in the Eastern Baltic and buy a bargain bulk carrier which has carried iron ore between Australia and St Petersburg for 25 years, under seven different owners. You could then go to a large marine insurance underwriter and get the ship insured. The market has sagged and premiums are low; they need your custom. The fact that the deck is rusting away and cracks are starting to appear around the bulkheads does not seem to ring any alarm bells.
Then you would register under the Maltese, Cypriot or some other flag of convenience in order to escape any effective maritime legislation. Finally, you would go to an agency and hire the cheapest crew you can find - probably Indian or Filipino - and wait for your first customer. This combination of circumstances means you are able to run your ship into the ground - so to speak - and run the risk of losing it, along with the crew. This method only holds for a minority of ship owners. But there is a lot of money to be made, and they will lobby fiercely to prevent safety issues eating into their profits. If their sailors are killed in the meantime, well, it's unlikely anyone will kick up a fuss; Filipinos dying in the middle of the Pacific rarely make the headlines.

There is a balance to be struck between cost and risk. Of course risk should be minimised if the cost to the consumer or taxpayer is negligible. But what if the price of grain had to double in order to save one or two sailors' lives? Or airline tickets increased by 50% in order to save ten passengers a year? The Federal Aviation Administration in the US conducts cost/benefit analyses for all of the regulation changes it considers. For example, in the past they considered recommending fire detection and suppression equipment for all sealed cargo holds on passenger jets. They worked out that this had a certain cost to the industry in the United States, say $200m. They set this against the benefit, and in order to do so placed a price on each human life lost - a figure of about $1.7m, though no-one seems to know how they arrived at it. In this case, they totted up the number of people who might have been saved by these systems, multiplied by $1.7m, and found the total came out at less than $200m. The measures were not worth implementing after all. Many people are amazed by such a harsh and utilitarian concept, although there has to be some method of comparing risk and cost.
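The logic of such an analysis can be sketched in a few lines. The $200m cost and $1.7m value per life are the essay's illustrative figures; the lives-saved numbers below are invented purely for the example.

```python
# A sketch of an FAA-style cost/benefit test, as described above.
# The dollar figures follow the essay's example; the lives-saved
# inputs are hypothetical.
COST_OF_MEASURE = 200_000_000  # cost to industry of fitting the equipment
VALUE_PER_LIFE = 1_700_000     # statistical price placed on each life

def measure_justified(lives_saved: int) -> bool:
    """Return True if the projected benefit exceeds the industry cost."""
    benefit = lives_saved * VALUE_PER_LIFE
    return benefit > COST_OF_MEASURE

# 50 projected lives saved is worth $85m - below $200m, so the
# measure fails the test, as the essay describes.
print(measure_justified(50))   # False
# 150 lives would be worth $255m, tipping the balance the other way.
print(measure_justified(150))  # True
```

The trouble, as the next paragraph argues, is that a single crash can change the lives-saved input overnight.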
But the FAA's calculations are actually nonsensical. Because of the large number of people killed in a single airplane crash, it often takes just one disaster for the balance to swing the other way, for the benefits suddenly to outweigh the costs. Hence the ValuJet accident, which might have been prevented had fire suppression systems been installed, immediately threw the calculations into reverse. There is no long-term view. The FAA are nicknamed the 'Tombstone Agency', because they wait for the bodies to mount up before taking any action. But in this country we were the pioneers. As the Illustrated Times noted in 1867, "we in England never begin making a reform or adopting an improvement till some disaster has demonstrated its absolute necessity".

Regulatory Bodies

Sometimes regulators have things on their minds other than safety. The US Federal Aviation Administration, for example, had a dual role enshrined in its constitution: it was responsible for safety, but also for promoting the US airline industry. Mary Schiavo, the government Inspector General who heavily criticised the FAA after the ValuJet crash, says "it's two jobs that just don't fit together. Either you're going to be an oversight agency, and you like that role, or you're going to be a business partner, and it's pretty hard to do both". After the crash the wording in the constitution was changed; but the way the FAA operates has remained exactly the same. Ironically, the manufacturers know full well that FAA regulation is inadequate when it comes to building aeroplanes. An aircraft engineer claims "I could build you an aeroplane which fitted into every one of the federal airworthiness regulations, and you still wouldn't fly in it if you knew what you were doing". In the US the National Transportation Safety Board, despite having just an advisory role, does concentrate on safety alone, and is open and straightforward in the way it reports and makes recommendations.

We are not so lucky in the UK. The Channel Tunnel Safety Authority has an uneasy relationship with Eurotunnel, the owners and operators of the Channel Tunnel link. The Authority is officially independent, but has been accused of kowtowing to Eurotunnel's commercial desires. Mark Watts, an MEP who has been involved with the case, says "rather surprisingly the Safety Authority that should independently assess Eurotunnel's safety operations is paid for by Eurotunnel". It is also accused of secrecy: "the Safety Authority operates almost clandestinely... its minutes are rarely released and are very threadbare." Its operations are "submerged in a fog of red tape".
After the fire in the tunnel it transpired that the Safety Authority had at first opposed the use of semi-open wagons, then changed its mind and allowed Eurotunnel to proceed. In its report on the fire the Safety Authority ignored this factor. Watts says "it was this design that allowed the fire to take hold, and then spread and become so serious". These agencies are independent to the same extent as an unruly teenager: they want complete freedom but still rely on their parents to get fed.

The most blatant commercial relationship of this kind is in shipping. Classification societies are commercial companies which survey and classify ships on their register. They give the all-clear to insurers and the marine authorities in the country of registration that everything is shipshape and nothing is going to fall off. There is, as I'm sure you've guessed, a flaw in the system. The classification societies are wholly funded by their clients, the ship owners and operators. If an owner feels mistreated by one society, he can join another, which might be a little more lenient and ignore the peeling paint or the ageing radio systems. One consequence of this is that commercial losses in international waters, especially when there is no pollution to speak of, are very rarely investigated. The societies are under no obligation to give details of their surveys, and the authorities of the flags of convenience are ineffectual. Most bulk carriers lost at sea lie on the sea bed, never to be seen by the prying eyes of an accident investigator.

We don't need lifeboats as the ship can't sink

Many disasters are the result of one small error after another, a chain of events which could have been broken at any point. This is best demonstrated in aviation disasters. Aeroplanes are designed with built-in redundancy. This means that if one system fails, there is a back-up to take its place. If one engine catches fire, the plane can still be flown.
If one hydraulic line is broken, another keeps the pilot from losing control. Similarly, during aircraft maintenance, fixes and procedures are checked and double-checked by people who must log every single nut and bolt that is used or replaced. It's all in the detail, and accident reports will plough through reams of technical analysis, photographs of offending parts, schematics, reports, and a whole range of other evidence. But then they will ask why it all went wrong. And usually the question takes them rapidly up the chain of command to the management.

The Challenger disaster is a well-known case where the decision makers overruled the engineers on the crucial question of the temperature at which the O-rings would seal. The decisions faced by NASA were hugely complex, dealing with new technology and a myriad of potential pitfalls. There were hundreds of parts or systems on the 'criticality' list, in the same category as the O-ring joints; loss or failure of any of those systems would have meant catastrophic failure, and there was no reasonable way to build redundancy, or back-ups, into them. NASA had to live with imperfection and do all it could to minimise the risk. What became apparent was that a certain complacency had set in, otherwise known as the Titanic syndrome: 'we don't need lifeboats as the ship can't sink'. Ironically, in the aftermath of the report, the engineers were dismissed while the managers survived. Nonetheless NASA instigated far-reaching changes to the way it conducted its flight readiness reviews.

After the Channel Tunnel fire, one member of the Channel Tunnel Safety Authority admitted "fundamentally, of course, it's a failure of management". People in the Rail Control Centre were not properly trained and were singularly unable to cope when the fire began. Procedures were not followed, and those which were followed were shown up to be flawed.
The communication systems failed, and management were directly responsible for their design and implementation.

Whatever can go wrong

Murphy's Law states that whatever can go wrong, will go wrong. Murphy was probably an accident investigator. If there is one common thread in most technological disasters, it is that the outcome was never likely to happen; it was just possible, an outcome dangling at the end of a slim thread of circumstance. But in a world of increasing complexity there is a lot which can, potentially, go wrong. A Boeing 747 has over a million parts. Out of those million parts, a certain percentage is likely to fail, depending on the age of the machine, the quality of the parts, and the care and expertise of the maintenance engineers. The vast majority of failures will be contained and non-catastrophic. But as millions of flying hours are clocked up it becomes less and less likely that every failure will be free of serious and fatal implications. So in one respect disasters are, more or less, inevitable. New technology can never be perfectly tested or remain perfectly immune to human error. The success of the Space Shuttle programme up until the Challenger accident was beyond anyone's dreams, an achievement made all the more incredible by the speed with which the shuttle was developed.

So fatalism comes easily to those on the front line. Some engineers and others who deal in accident prevention every day will read the outraged headlines after a crash and shrug: well, what do people expect from an antiquated train service? What do you expect if you want to fly to Majorca for under £100? How can sub-contracted mechanics perform to the best of their ability when they don't have job security? How can civil engineers foresee every possible hiccup when they are given less than three years to build a bridge? It is wrong to expect engineering perfection while squeezing engineers into a drive for high efficiency and low cost.
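The arithmetic behind this inevitability is simple compounding: a failure that is vanishingly unlikely on any one flight becomes near-certain somewhere across millions of flights. The per-flight probability below is an invented illustrative figure, not a real reliability number for any aircraft.

```python
# Back-of-envelope sketch of how rare failures compound at scale.
# The one-in-a-million per-flight figure is purely illustrative.

def prob_at_least_one(per_flight_prob: float, flights: int) -> float:
    """Probability that at least one catastrophic failure occurs
    somewhere across the given number of independent flights."""
    return 1 - (1 - per_flight_prob) ** flights

# Negligible for any single passenger on any single flight...
single = prob_at_least_one(1e-6, 1)
# ...but across two million flights, a failure somewhere in the fleet
# becomes more likely than not (about 0.86).
fleet = prob_at_least_one(1e-6, 2_000_000)
print(single, fleet)
```

This is the sense in which disasters are "more or less inevitable": no realistic per-flight probability is small enough to stay negligible over a whole industry's flying hours.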
But, within reason, we have to trust those people into whose hands we place our lives, and that trust needs to be based on confidence that the risk is minimised. We all have a vested interest. We fly on airplanes, take trains, live near chemical plants, or simply harbour sympathies for the oil workers or sailors who know they are living with unnecessary danger. Trauma and grief linger long after the investigation is completed and forgotten. There are oil workers who vividly remember the fear they felt on Piper Alpha. Families of the people who died on ValuJet Flight 592 are still battling in the courts. Mothers of sailors will never know the cause of their sons' deaths and will never be able to bury the bodies. There is a good reason for examining the causes of disasters: we can try to prevent them from happening again. In the meantime, I'll be talking to my lawyers.

If you want to clear out your system sit on a piece of cheese and swallow a mouse - Johnny Carson