...2009 budgets... have already been consigned to the shredder, as the economic crisis has blown away the assumptions on which they were based.
Faced with exceptionally volatile business conditions, senior executives are finding it harder than ever to gauge how their companies are likely to fare in the months ahead...
With even short-term horizons as obscure as the San Francisco skyline during a summer fog, companies are finding their standard budgeting and forecasting of little use. The usual trick of plugging figures from operating units into spreadsheets appeals to number-crunchers, but can often generate misleading targets, especially when conditions change fast. [link and emphasis added]
...the first guy hits a fog bank and can't see squat. He taps his brakes. The second guy sees red lights and fog coming up fast and taps his brakes just a little bit harder, and so on. In just a few seconds, hundreds of cars end up in a tangled heap and people die.
The issue is not the fog itself. Sitting inside watching it with a warm cup of tea connotes a sense of calm introspection amidst muffled sounds and soft light. Rather, the issue is moving through it, i.e., a mismatch between conditions, terrain and speed.
As I've told each of my teenage daughters: speed-limit signs, lateness, adrenaline and energetic music are poor indicators of what will work to get them from point A to point B as fast as possible in a snowstorm at night.
The risk of serious mishap must play a part in gauging what "as fast as possible" means (as the teenage son of a friend learned on St. Patrick's Day when a drunk driver hit him -- thankfully all are OK, it seems). Speed must be matched to what is actually going on, not to some idealized, wishful, outdated view of what should be.
So am I wagging a fatherly finger at managers, advising them to curtail expectations, slow down and wait for the fog to lift? Not necessarily. Here's the big thought to chew on:
Instead of turning up the headlights of forecasting (counter-intuitively deadly in fog), the most important adaptation executives and strategic planners can make to such conditions is to increase collective responsiveness.
How does one do that under conditions analogous to heavy fog, when landing the plane or pulling over isn't a viable option? And what does that metaphor mean in practice? Several things.
Easy? No. Formulaic? Not at all. In fact, the process is highly creative. Effective? In the current environment, conventional 'steady-state' planning alternatives are actively misleading (i.e., more of the same will only get you in trouble faster). Turn up the headlights and you'll be blinded. Slow to a crawl and you'll be passed or hit.
Citigroup officials have decided they need to reckon with a range of scenarios that were unthinkable only weeks ago.
It neatly captures the essence of why it is vital for scenario thinking to be cultivated as a natural part of an organization's analytical culture and to become an accepted aspect of ongoing corporate dialogue, and why the term 'scenario' is so often maligned and misused.
Two simple ideas: 1) It's hard to think as clearly or creatively when an 18-wheeler is bearing down on you as when it's not, and 2) calling something unthinkable is... unthoughtful. I'll deal with the first point here and second point more extensively in another post.
Studies of police officers under stress bear out that fear and adrenaline reduce or even shut down access to key higher brain functions and narrow one's peripheral vision and other sensory input. They don't do much for collaboration or listening skills either. A group's thinking may appear to get more expansive under such circumstances; that's typically an illusion.
Ideas long held close to the vest (because others would have thought them crazy) are suddenly poured out on the table in a torrent of collective logorrhea. By definition they are not pro-active, nor have they had time to ripen. It's an environment in which green and rotten ideas become indiscriminately commingled with the good and timely ones.
Some might be tempted to reply with Boswell's reporting of Samuel Johnson's famous quip, "Depend upon it, Sir, when a man knows he is to be hanged in a fortnight, it concentrates his mind wonderfully." (September 19, 1777, p. 351, Volume III of 'The Life of Samuel Johnson'; often misquoted as 'being hanged in the morning'.) It's a quote I love and have used often.
Yet the frame for that quote involves a man already devoid of attractive options (other than prayer, perhaps). He's done. The only mystery is what kind of scene he will make as he leaves the picture and what lessons others will draw from it. The value of having thought steadily, wisely and creatively about a broad array of alternatives years in advance of one's crime, trial, conviction and sentencing doesn't even enter the picture.
Laser-like tactical concentration under extreme stress is very different from the calm, disciplined thoughtfulness and habituated creative behaviors (either corporate or individual) that can help steer clear of circumstances that might lead to one staring death in the face. They make so-called 'unthinkable' futures far less surprising and lead to better responses.
Better choice: create an environment in which 'unthinkable' futures and strategic options for addressing them are systematically thought-out by groups of key managers and executives and carefully positioned in relation to one another (and to the present state) within a detailed, event-based hypothetical cartography. Such scenario maps make it far easier to avoid being surprised by the kinds of dire circumstances Citi and others face today -- or that Johnson spoke about 231 years ago.
How does one reduce the chances of an 18-wheeler bearing down with little room to maneuver? Use a map to figure out where the sidewalks are. Don't walk on exit ramps after dark with your back to traffic. Don't make a desperate, instinctive lunge for the bushes and call the choice of thorns or poison ivy 'scenarios'.
It is actually possible not to see something that is really there... We can’t see patterns that our brains have filtered out... But although the risk hunters saw no danger as they stared into the seemingly benign financial jungle, in retrospect, there were certain signs — whose significance was not realized then — that should have sounded the alarm about the crisis that is now upon us...
The signs were there, but they couldn’t interpret them, even when strange and unusual things were happening. Part of the problem with grasping the significance of the anomalies was that the risk hunters were prisoners of their analytic models. What cues they picked up were interpreted in the context of their mental frameworks. They “saw” through the prism of their models. And their models did not account for the existence of the monster that was now closing in on them...
The scenario work I've done with clients for years is rooted in precisely the precept Fernandez begins with: that the mental models or frameworks that people build up over a lifetime, a career and/or a role strongly color what they believe is possible and therefore the kinds of potential problems and opportunities they are willing to consider. I mean literally color. Fernandez' analogies to invisible spectra are apt. It's not that the spectra aren't there. It's just that I'm not equipped to see ultraviolet, infrared or to hear a dog whistle.
I should note that, while the Economist article is about 'seeing' risk, the concept applies equally well to innovation. More on that in future posts. It's where I've been spending most of my time and energy recently.
Mental models are seldom examined. You and I may talk about our views on particular future events, but it takes a lot of time and effort (plus a common 'language' for discussing it) to talk in terms of entire models. And mental models that are not examined and described -- at any level -- tend to lead to decisions that don't fit reality. (Think blindfolded child attempting to hit the party pinata and hitting grandma instead.)
Mental models are also fractal. My own colors the decisions I make, but I also go about my day constrained, to a greater or lesser degree, by the mental models of the groups to which I belong (family, company, church, country, etc.).
I'm reluctant to conclude that individuals' mental models are always able to change faster than those of groups. (We all know some stubborn stick-in-the-mud; perhaps he's the one staring at us in the mirror). Anecdotally, however, I know of more individuals who have had a 'Damascus-road' 180-degree 'a-ha!', 'boy I've been stupid, haven't I?' moment in some part of their lives than I do larger groups or companies that have done the same. (When that does happen, e.g., Bill Gates' famous 'Internet' memo ten or so years ago, it tends to be just a powerful individual who changed his mind, not evidence of the group turning quickly together.)
Finally, mental models not challenged systematically and fundamentally on a pro-active basis (e.g., with scenarios) tend to change only defensively and slowly (in response to hard, external truths; often at great cost). That's painful. As Fernandez writes:
I wonder to what extent the policy reaction to the current financial crisis is still colored by the limitations of received wisdom — the financial models — and the need to keep the music playing. The interventionary mechanisms of the last few weeks are designed to fix problems as we understand them. If you’re convinced we understand things now. At any rate we are firing into the last known position of the monster. Nothing can withstand that firepower. The crowds are being told not to worry because the monster will soon be dead. True, there are few doubting souls who are worried that the creature may actually be feeding off our weapons, but their fears are dismissed as nonsense. The important thing, we are told, is to keep things going, which was just the advice the traders gave the risk managers.
It is remarkable how the advent of the current financial crisis structurally resembles the intelligence failures leading up to 9/11: the same misinterpretation of warning signs, the same blindness to threats now evident in retrospect, the same belief in a rapid resolution and that the normal would soon be back.
In other words, we all sense that things have changed fundamentally (in financial markets and geopolitics) but we probably won't be able to develop our new mental models (at least collectively, agreeing on which ones are 'real') for many years -- not without a lot of deliberate thought-work, that is. Nor will we be able to perceive ongoing change properly (assessing both direction and magnitude) or develop said models in a way that enables us to act with precision, resolve and clarity. For years. That's a bit scary. It's also humbling (at least it should be).
Here's another thought that I should probably 'unpack' more fully in another post. Mental models are reflected in, but less often challenged by, analytical tools. Oh sure, insights can most definitely be had with them, but I'm talking about much bigger constructs. Sophisticated computer models, for the most part, tend only to amplify the force and reach of our brains. They do not, as a general rule, make them better able to 'think different(ly)' and develop new mental models for seeing the world -- for assessing the big risks in truly complex adaptive systems or for finding innovations.
I don't agree with everything in this piece by Thomas Homer-Dixon that appeared last week in the Toronto Globe and Mail, but this quote is an absolute gem (emphasis added):
Our global financial system has become so staggeringly complex and opaque that we’ve moved from a world of risk to a world of uncertainty. In a world of risk, we can judge dangers and opportunities by using the best evidence at hand to estimate the probability of a particular outcome. But in a world of uncertainty, we can’t estimate probabilities, because we don’t have any clear basis for making such a judgment. In fact, we might not even know what the possible outcomes are. Surprises keep coming out of the blue, because we’re fundamentally ignorant of our own ignorance. We’re surrounded by unknown unknowns.
It's something I've said for a long time:
It's tempting to think that all things are predictable given enough information, enough minds, enough time and enough computing power. It's just not true. (Which is not to say that some things are not predictable... and with incredible precision... a phenomenon that leads to overestimating the scope of problems and questions that lend themselves to such methods.)
Telling which is which is the trick...
I would go even further: really smart people who know from life experience that some things are fundamentally unpredictable still draw an unvoiced emotional comfort in their business lives from the idea that some wise expert somewhere has, for all intents and purposes, been to the future -- and that if we could just find him or her, things would be OK -- and/or that a really sophisticated computer model or prediction market (the 'collective mind') can provide crystal-ball-level insights.
Sometimes yes. Often, no.
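Homer-Dixon's risk/uncertainty distinction can be made concrete with a toy calculation. Everything below -- the probabilities, the impact figures, the function name -- is my own illustration, not anything from the article:

```python
# Under "risk", a known probability lets us price a danger.
# Under "uncertainty", plausible probabilities span orders of
# magnitude, so the "expected loss" is effectively meaningless.

def expected_loss(probability: float, impact: float) -> float:
    """Classic risk calculation: probability times impact."""
    return probability * impact

# A world of risk: actuarial data pins the probability down.
house_fire = expected_loss(probability=0.003, impact=250_000)

# A world of uncertainty: nobody can pin the probability down,
# so honest estimates differ by orders of magnitude.
guesses = [expected_loss(p, impact=1_000_000_000) for p in (1e-6, 1e-4, 1e-2)]

print(house_fire)   # one defensible number
print(guesses)      # answers spanning four orders of magnitude
```

The first calculation is insurance; the second is guesswork dressed up as insurance.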
I liken Mr. Homer-Dixon's observations to those tragically massive car pile-ups that happen a few times a year in fog-prone areas like the Central Valley of California. Everyone is driving along at a reasonable speed, with reasonable spacing between vehicles. People are sipping coffee, tuning radios, maybe talking on cell phones. Slightly distracted, but mostly responsible. All is normal.
Then the first guy hits a fog bank and can't see squat. He taps his brakes. The second guy sees red lights and fog coming up fast and taps his brakes just a little bit harder, and so on. In just a few seconds, hundreds of cars end up in a tangled heap and people die. All because the guy in front was convinced by every one of his senses and not without justification based on experience that the visibility on the next 100 yards of road would be the same as on the last 100 yards of road.
From global warming to obesity, bird flu to terrorism: 2007 was the year when the threat of an apocalypse became an everyday, even banal public issue... The fear market in apocalyptic scenarios continued to flourish in 2007. Almost every week we were told that ‘the situation’ is far worse than we originally thought... Public figures appear to have lost the capacity to reassure or lead people. Instead, they frequently opt for evoking frightening futuristic scenarios where the line between fiction and reality become unclear.
One consequence of Western societies’ obsessive preoccupation with the apocalypse-to-come is that less and less creative energy is devoted to confronting the all too important problems that exist in the here and now. Take the global credit crunch unleashed by the sub-prime home loan crisis this year for instance.
In terms of its material impact, this was arguably the most significant event of the year. After more than a decade of economic stability, the world economy faces the threat of a major recession with important implications for people’s lives. This threat may not make an exciting plot for a sci-fi movie, but it has a direct bearing on the quality of life of millions of people. It also raises important questions about an economic system that is so heavily reliant on using fictitious capital to reproduce itself.
Events over the past 12 months suggest that what we think and how we think influences how we experience our reality.
Some rules and questions we use to avoid these traps and test whether scenarios are useful include:
Are scenarios sufficiently orthogonal to and distinct from one another? Does each embody both 'good' and 'bad' elements? Real world developments are seldom all good or all bad at the same time and from the same perspective. If participants in a scenario workshop find it trivial to line up the scenarios in the same way from "good/easy for us" to "bad/frightening for us", we haven't done our job of representing real-world nuance in hypothetical future stories.
Do scenarios incorporate "here and now" events and choices? (We usually embody these in what we call 'events', a component of modular scenarios). Scenarios entirely about some far-off, visionary 'place' with no explicit ties to current issues are seldom useful beyond the fiction stacks.
Are scenarios directly comparable to current conventional wisdom? (I.e., as Furedi puts it, "how we think... [and] experience our [present] reality"). Without a concrete "you are here dot" scenario that represents what constituents are thinking and assuming, it's impossible to describe how "far away" hypothetical future scenarios really are, or what change they imply. If I'm contemplating a trip to Miami, it helps to know (in terms of budgeting, preparation and mode of transport) whether I'm currently in Juneau, Ft. Lauderdale or Tierra del Fuego.
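The three tests above lend themselves to a simple sketch. The `Scenario` shape and every field name here are my own invention for illustration; they are not the actual structure of the modular scenarios mentioned above:

```python
from dataclasses import dataclass, field

@dataclass
class Scenario:
    name: str
    good_elements: list = field(default_factory=list)
    bad_elements: list = field(default_factory=list)
    near_term_events: list = field(default_factory=list)  # ties to "here and now"
    is_baseline: bool = False  # the "you are here" dot

def check_scenario_set(scenarios):
    problems = []
    # Test 1: each scenario should mix 'good' and 'bad' elements.
    for s in scenarios:
        if not (s.good_elements and s.bad_elements):
            problems.append(f"{s.name}: all-good or all-bad (not nuanced)")
    # Test 2: each should tie back to current events and choices.
    for s in scenarios:
        if not s.near_term_events:
            problems.append(f"{s.name}: no 'here and now' events")
    # Test 3: the set needs a baseline representing conventional wisdom.
    if not any(s.is_baseline for s in scenarios):
        problems.append("no 'you are here' baseline scenario")
    return problems
```

A set that fails these checks isn't a scenario map; it's a wish list (or the fiction stacks).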
In a recent interview [PDF] with the Federal Reserve Bank of Richmond, W. Kip Viscusi is asked about the public-policy response to the threat of terrorism and whether we are weighing the costs and benefits in a generally rational way. "The reason this is tricky is we don’t have very good numbers on what these risks are," Viscusi says. "The estimates of the probability of a terrorist attack or the number of people who are going to die in the coming year are all over the map. So if you can’t assess the likelihood of a terrorist attack or how deadly it is going to be, it is really hard to say how much you should spend to try to prevent it." [emphasis added]
Without the qualifier, the idea sounds terribly cold. With all reverence for the families of those who have lost loved ones in terrorist incidents, however, it's not. Businesses and governments--not to mention non-profit institutions, individuals and families--all need to assess risks as rationally as possible and take measures to hedge them. Like it or not, the value of a human life can be defined--at least partially--in dollar terms. Anyone who's ever taken a course in economics or statistics (or balanced a checkbook for that matter) has figured out that infinite spending to protect against risk is infinitely foolish and that infinite spending on risk 'A' (and no spending at all on risks 'B', 'C', and 'D') is only a variant on the same wrong-headed assumption.
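The checkbook arithmetic can be made explicit. This is a toy model with invented numbers: assume each unit of mitigation spending cuts a risk's remaining expected loss by a fixed fraction (diminishing returns), then compare 'all on A' with spreading the budget:

```python
def residual_loss(expected_loss: float, spend: float, efficiency: float) -> float:
    """Each unit of mitigation spending cuts the remaining expected
    loss by a fixed fraction (diminishing returns)."""
    return expected_loss * (1 - efficiency) ** spend

risks = {"A": 100.0, "B": 80.0, "C": 60.0, "D": 40.0}  # invented expected losses
budget = 20  # units of mitigation spending

# Strategy 1: everything on risk A; risks B-D are left untouched.
all_on_a = residual_loss(risks["A"], budget, 0.2) + sum(
    v for k, v in risks.items() if k != "A"
)

# Strategy 2: spread the budget evenly across all four risks.
spread = sum(residual_loss(v, budget / len(risks), 0.2) for v in risks.values())

print(round(all_on_a, 1), round(spread, 1))  # spreading beats all-on-A
```

Risk A ends up nearly eliminated under Strategy 1, but the untouched losses on B-D dominate the total; even a crude spread does better.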
That's not what's really at issue. What is at issue is the degree to which the political leanings of some lead them to believe that we can know how much to spend combating terrorism and that the 'right' number is obviously much less than we are spending today (e.g., tactical domestic measures, overhead for business, strategic overseas measures, etc.). To which my response is: really? Show me your pre-9-11 white paper predicting the order-of-magnitude sea-change that occurred in that 'industry' on 9-11.
Bryan Caplan is one of Viscusi's critics. The CHE post notes:
"I am frankly puzzled," Caplan writes at EconLog. Citing the work of John Mueller, he argues that our long experience with terrorism has "shown it to be an extremely small problem in the broad scheme of things. How much longer does Viscusi want to wait before he'll conclude that the risk is very low?"
Unfortunately, long experience in and of itself is not sufficient for prediction, even at a macro level. Hold that thought for a quick diversion.
Here's where it gets weird. In criticizing Viscusi, Caplan, an econ prof at famously free-market GMU, ends up in league with Schneier, who, as far as I can tell, tends towards the opposite end of the political spectrum. Both conclude, for entirely different reasons, that the future threat of terrorism can be known and condensed to a dollar figure and that rational budgets (both public and private) can be set accordingly. Oh that it were so.
Caplan, in particular, "favors the establishment of a prediction market to help assess the likelihood of a terrorist attack". That's something I can conditionally applaud. If the results are used to "help assess", then we're fine. The possibility that a prediction market might help draw in and roll up marginal, highly distributed, even intuitive information that can supplement traditional (and sadly inadequate) intelligence-gathering mechanisms is certainly a good thing. I've been a huge fan of prediction markets for years. They should be used for more things than they are today. To my delight, more and more are catching on.
...as longtime readers know, I've also concluded that there are some (and arguably many) problems for which prediction markets are not only silly but grossly misleading. I don't have the space to review all of them here. Thinking that they can precisely predict and quantify particular terrorist threats or even the threat of terrorism generally is a notion that falls into that category. Almost by definition, a successful terrorist venture is compartmentalized, secret and surprising.
In short, Caplan, Schneier and others appear to have an ideologically-induced blind spot that leads them to declare certainty where it does not exist. Let me explain.
Schneier and Caplan draw their essential argument from a backward-looking, actuary-style catalog of terrorist incidents. This many people died. This much property was lost. Productivity was reduced by this much for this long. Etc. Etc. It all sounds very rational. If we had reason to believe that terrorism were a natural, forecastable, perhaps even cyclical phenomenon, that approach would be absolutely correct. We don't.
The unpredictability of terrorism renders any backwards-looking, purely quantitative, actuarial mode of analysis inappropriate and ineffective. That is, future deaths due to terrorism are something that neither Schneier nor anyone else can possibly predict with any degree of confidence.
Until 2001, the biggest single terrorist incident had caused around 300 deaths. Then in the space of a few hours, that number went up by an order of magnitude. There was no consensus (or even a significant plurality) of expert opinion predicting that that would happen - much less when, where and how. [emphasis in original]
And that's the problem.
Sudden, step-function, order-of-magnitude change, precipitated by a small group that has every reason to keep its plans secret, is inherently unpredictable. It can only be imagined. If that sounds familiar outside of the terrorist context, it should. Businesses face this kind of challenge all the time; it is the very nature of business, in fact: snowboards vs. skis; digital photography vs. wet-process; PCs vs. mainframes; VoIP vs. legacy telephony; biotech vs. big pharma. The list is very long. It's harder to name an industry that hasn't been touched by this kind of change at some point (often precipitating the re-invention and re-definition of the former "industry") than it is to list industries that have been altered in this way. Bottom line: it's important to differentiate between problems that lend themselves to forecasting and those that can only be dealt with via imaginative scenarios.
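A minimal worked example of that step-function problem. The individual incident figures below are invented; only the roughly-300 pre-2001 cap and the order-of-magnitude jump come from the discussion above:

```python
# Backward-looking actuarial analysis caps its "worst case" at what the
# record contains -- until a step-function event blows past it.

history = [30, 12, 168, 270, 300, 85]   # invented pre-2001 incident tolls
actuarial_worst_case = max(history)      # what the record "supports": 300

new_event = 3000                         # the order-of-magnitude step change
ratio = new_event / actuarial_worst_case

print(actuarial_worst_case, ratio)       # 300 10.0 -- the record never hinted at it
```

No averaging, smoothing or trend-fitting over `history` gets anywhere near 3,000; the jump has to be imagined, not extrapolated.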
“I keep telling them this planning is good for the company. If we have a pandemic we’ll build market share, because our competitors aren’t doing the planning.”
What kind of market share, you may ask? Earlier in the article, another consultant provides examples of opportunity in the midst of hypothetical chaos after avian flu has hit:
A big hotel chain, figuring tourism would collapse, is studying how to rent its properties to governments as places to tend the sick. A hospital operator has contracted for refrigeration trucks [because] their in-house mortuaries are too small... A communications corporation has built a self-contained “clean facility” within its headquarters to house critical personnel and key operations.
Sounds like the kind of grim-but-true advice I'd expect to hear on Six Feet Under.
Elsewhere in the article, Robert E. Mittelstaedt Jr., dean of Arizona State University’s W.P. Carey School of Business comments on scenario planning, noting (as I have before) that scenarios are often misused as a predictive tool and at a far too detailed level for business continuity planning (or much of anything else, for that matter):
...it’s difficult to engage in detailed scenario planning, because no one knows how, exactly, a pandemic would unfold—“but we want to make sure there’s a plan there, while not micromanaging it.”
Shortly after 9-11, a client showed us a massive spreadsheet of disaster possibilities, deeming them all 'scenarios'. They were not. Every conceivable thing was covered--from half a dozen megatonnages and locations for nuclear detonations to countless permutations of bridges and tunnels being taken out, anthrax being mailed, utilities and emergency services being cut and the simultaneity of all of the above.
Too much. Waaaaay too much. The 'scenarios' were also confined by definition to what the client could conceive of on its own, colored, it's now clear in hindsight, by the headlines of the day. We were brought in in part to help make sense of it all.
Another angle some now see, and for which real scenarios are well suited, is thinking through supply chain risk and contingency--something many have woken up to in the wake of the Taiwan earthquake (some of the resilience implications of which I blogged about last week: here, here and here). The article continues:
For manufacturers, supply-chain resilience is a chief concern. Some have been conducting due diligence on the readiness of suppliers, knowing that if critical components aren’t delivered it will render their own contingency plans useless. Others are establishing relationships with alternative suppliers, just in case... [though] some directors privately concede that little attention is being paid to the specific challenges posed by a pandemic. Having lived through Y2K, 9/11, and Hurricane Katrina, many outfits feel comfortable that their general risk-management and business-continuity plans will suffice during a pandemic. [emphasis added]
Key differences to note: 9-11 was local; Katrina was regional; Y2K was long-anticipated. Yes they all had global effects, but none were even in the same league as a 1918-style pandemic. In no case were people unable or unwilling on a wholesale basis to travel to work for more than a few days.
All of which leads me to the conclusion the esteemed Former Secretary has already reached: surviving bird flu as a business entity may require as much clever seizing of new opportunities as it does determination to keep the old business running. In some cases there is no rational continuity strategy, only one of strategic resilience (building the capability--structurally, culturally and otherwise) to flexibly adapt to radically changed and changing circumstances.
Laws designed to keep fishing vessels from dragging undersea cables carry fines pitifully out of proportion to the potential damage (roughly a million-to-one). It's not clear why, in a nation (China) not shy about capital punishment, someone has not lost their head over previous cable cuts (e.g., April 6, 2004).
The capacity of land-lines from Europe to Asia is tiny in comparison with those from the U.S. to Asia and the U.S. to Europe. This global asymmetry raises a host of issues ranging from economic development to politics to outsourcing to culture.
Satellites are expensive and have only limited capacity. The illusion that they offer backup to undersea cables is just that--little better than thinking that carrier pigeons will fill the gap.
With that and other emerging news, it's also becoming clear that lessons learned after 9-11 haven't 'stuck'.
Within the IT-savvy portion of the New York financial community at least (and, I would have imagined, well beyond it) a great deal was learned, publicly documented and put into practice to address the fragility of such networks. The cost was not small; however, there was unanimous agreement that the alternative was worse. The direct and indirect costs of closed markets (four days in the case of 9-11) were understood to be vastly larger--something that many in Asia are rediscovering to their collective chagrin.
The lessons of 9-11 (from a business resilience perspective anyway) are not particular to New York, to the U.S., or even to a cause. I'll rephrase and repeat that last bit, because without justification, thinking about business resilience is too often segmented by what went bump in the night rather than what happened as a result of it.
The business resilience lessons of 9-11 apply as well to natural as to man-made disasters; as well to widespread as to 'point' impacts; and as well to protracted as to short-duration events.
The lessons are immutable--essential principles for building and maintaining resilient networks and organizations. Two biggies are:
Understand layer one. Don't assume that a carrier's logical architecture (much less its marketing architecture) showing a ring, double ring, or even mesh network has anything to do with physical reality. Both 9-11 and the Taiwan earthquake illustrate how economic pressures and crowd dynamics drive networks towards similar if not identical paths-of-least-resistance. (I suspect that carrier assurances in articles like this are not really about physical routes.) This is no more the fault of a particular company or person than is Interstate 80 through the California Sierras, the Golden Gate Bridge across the mouth of San Francisco Bay or the majority of the U.S. financial community on one tiny island (Manhattan). Those were simply the most logical and least costly places to put those things. So it is under the ocean...
Forbes characterizes the failure as, "the largest outage of telephone and Internet service in years... demonstrating the vulnerability of the global telecommunications network."
...Up to a dozen fiber-optic cables cross the ocean floor south of Taiwan, carrying traffic between China, Japan, Korea, Southeast Asia, the U.S. and the island itself. Chunghwa Telecom Co., Taiwan's largest phone company, said the quake damaged several of them, and repairs could take two to three weeks.
Taiwan lost almost all of its telephone capacity to Japan and mainland China... Later, Chunghwa said connections to the U.S., China and Canada were mostly restored, but 70 percent of the capacity to Japan was still down, along with 90 percent of the capacity to Southeast Asia.
ZDNet says that "two of seven" cables were disrupted, enough to cause service-disrupting congestion on the remaining lines. Based on experience, I find the higher figure Forbes cites more likely, though it's possible that distinctions in terminology account for the difference.
Technicians in Singapore said that the Internet's built-in protocol was automatically rerouting traffic over alternative routes, either overland through China, west through the Middle East to Europe or even south to Australia.
All true... to a point... the problem being one of layer confusion: if the physical cables don't exist, nothing can be re-routed over them--automatically, manually or otherwise. (Wireless, much less satellite links, do not provide the kind of trunk capacity we're talking about over long distances.)
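The "two of seven" arithmetic is worth spelling out. A back-of-envelope sketch, with the pre-failure utilization figure assumed rather than reported:

```python
# Losing 2 of 7 roughly-equal cables congests the survivors whenever
# pre-failure utilization exceeds the surviving share of capacity.

cables_total = 7
cables_cut = 2
surviving_share = (cables_total - cables_cut) / cables_total  # ~0.71

pre_failure_utilization = 0.85  # assumed; long-haul trunks often run hot

congested = pre_failure_utilization > surviving_share
overload = pre_failure_utilization / surviving_share  # demand vs. what's left

print(congested, round(overload, 2))  # demand exceeds surviving capacity
```

With all seven cables cut (the Forbes reading), the surviving share is zero and no amount of "automatic rerouting" at the logical layer helps, which is the layer-confusion point exactly.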
As this map shows, the IHT's wishful thinking about "overland through China" and "west through the Middle East" is really just a way of expressing grim humor while waiting for the broken undersea cables to be repaired. Last I checked, there were some really big mountains and corrupt Islamic, authoritarian and 'former' communist governments in the way. This map, showing China and vicinity, makes this clear. This one shows only Alcatel submarine routes but nonetheless shows what's true of most other providers: most roads go near or through Taiwan. This one, by China Telecom, confirms the tiny capacity running west over land.
Could this recognition of a geographic reality (i.e., Taiwan as cross-roads on which the mainland is dependent) cause the Chinese to see with greater acuity their dependence on Taiwan... and act on it in new ways?
While the clusters of glass fibers are enclosed in protective material, they remain vulnerable to undersea earthquakes, fishing trawlers and ship anchors. There are also many choke points around the globe, including the vicinity of Tuesday's earthquake, where a number of key cables converge... "It's unprecedented that all seven cable systems suffered damage at the same time," said Au Man-ho, director-general of the [Hong Kong] Office of the Telecommunications Authority. [emphasis added]
Unprecedented stuff happens all the time. This whole incident is architecturally and organizationally reminiscent of what happened to communications on 9-11, including its effects across a much wider region. I.e., a common single point of failure among several trunk networks previously imagined to have been geographically distributed was discovered only in crisis.
...four repair ships with crews will arrive in the affected area on Jan. 2 to begin repair work on four of the damaged lines.
That's five days from now. The location is less than 100 miles off the coast. I can only surmise that the reason for the delays is one that both plagues and enables business resilience efforts (but which is seldom taken into account), namely: people. Mustering a crew to do the repairs (or perhaps mustering the factories to produce the materials needed to do the repairs) is no easy task over the Christmas/New Year's break. When things break, it's seldom at the most opportune time.