Unending Ransomware Attacks Are A Symptom, Not The Sickness
Opinion It’s been a devastating few weeks for UK retail giants. Marks & Spencer, the Co-op, and now uber-posh Harrods have suffered massive disruption as ransomware attacks took systems down for prolonged periods.
If the goods these people sold were one-tenth as shoddy as their corporate cybersecurity, they’d have been out of business years ago. It’s a wake-up call, says the UK’s National Centre for Stating the Obvious. And what will happen? The industry will just press the snooze button again, as we hear reports that other retailers are “patching like crazy.”
The bare fact that entire sectors remain exquisitely vulnerable to what is, by now, a very familiar form of attack is diagnostic of systemic failure in the way such sectors are run. There are few details of what exactly happened, but it’s not the details that matter; it’s the fact that so little was made public.
We see only silence, deflection, and grudging admission as the undeniable effects multiply – which is a very familiar pattern. The only surprise is that there is no surprise. This isn’t part of the problem, it is the problem. Like alcoholics, organizations cannot get better until they admit, confront, and work with others to mitigate the compulsions that bring them low. The raw facts are not in doubt; it’s the barriers to admitting them and drawing their sting that perpetuate the problem.
We know this because there is so much evidence of corporate IT’s fundamental flaws. If you have been in the business for a few years, you’ll already know what they are – just as surely as you’ll have despaired of progress. If you are a joyfully innocent newbie, then look at the British Library’s report into its own 2023 ransomware catastrophe. It took many core systems down, some of them forever, while leaking huge amounts of data that belonged to staff and customers. As a major public institution established by law, and one devoted to knowledge as a social good, the British Library wasn’t just free to be frank about what happened; it had a moral obligation to do so.
The entry point for the attack could have been any of a host of things; it really doesn’t matter. What does matter is that legacy components of a very complex whole couldn’t be secured to modern standards, and in many cases couldn’t be restored once they were disrupted. The Library’s report attributed this to underspending on lifecycle management, underspending caused by new projects being preferentially funded from a fixed budget. The complexity was due to the amalgamation of very different departmental systems over time, again without the resources needed. And IT, starved of budget, was being asked to do too much while competing for funds with other, higher-status parts of the organization.
Does this sound familiar? You may not be a national library with multiple collections and legal obligations vying for oxygen, but you may be a large retailer with IT systems bearing the scars of corporate mergers and amalgamations, of old logistics systems integrated with new online customer-facing portals, with migrations and data flows built out of more string and sealing wax than any PowerPoint executive summary dare show. Once something works, however ungainly it is, attention moves elsewhere. Security is a business expense that, if it works, has no visible effect whatsoever. Ongoing security looks like a bucket with a hole in it to those for whom visible effects are vital.
This is basic human psychology, and it operates at every scale. Getting the boiler serviced or buying a sparkling new gaming rig – there’s a right decision and one you’ll actually make. Promising to run a state well while starving it of funds is again hardly unknown. Such behavior is basic, but toxic, and it betrays its toxicity by being something polite people are loath to discuss in public.
Where there’s insufficient discipline to Do The Right Thing in private, though, making it public is a powerful corrective. Self-help groups for alcohol abuse work for many. Religions are big on public confession for a reason. Democracy forces periodic public review of promises kept or truths disowned. What might work for the toxic psychology of organizations that keeps them addicted to terrible cybersecurity?
It’s unlikely that entrenched corporate culture will reform itself. You are welcome to look for historical examples; they’re filed alongside tobacco companies moving into tomato farming and the Kalashnikov Ploughshare Company.
There is one way of publicly accepting shortcomings with consequences, one that is frequently and not inaccurately seen as advanced ass-covering. It’s a variant of the It’s Everyone’s Fault So It’s Nobody’s Fault defense. Often an excuse for inaction, it does at least get problems out in the open, where there is opportunity for innovation in fixing them. Imagine an inverse Black Hat conference, an Alcoholics Anonymous for CISOs, where everyone commits to frank disclosure and debate on the underlying structural causes of persistently failing cybersecurity syndrome, or PFCS. The British Library report could serve as a starting point, with hundreds of other major incidents used to create a properly abstracted, generalized description of PFCS. Diagnosis is a prerequisite of a cure; it’s like getting an owner’s manual for what ails you.
What then? A protocol for ensuring, or at least encouraging, security across the whole lifecycle of a project or component. How long will it live, how much will it cost to watch and maintain, what mechanisms are there to reassess it regularly as the threat environment evolves, what dependencies need safeguarding, and, lastly, what is the threat surface of third-party elements? In short, we must agree to accept that there is no such thing as “legacy IT,” no level of technical debt that can be quietly shoved off the books. If all that isn’t signed off at the start of a system’s life, it doesn’t happen.
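To make the idea concrete, here is a minimal sketch of what such a sign-off record might look like if forced into code – entirely hypothetical, not drawn from any existing standard or from the British Library report. The field names and the overdue-review check are illustrative assumptions, standing in for whatever a real protocol would mandate:

```python
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import date

@dataclass
class LifecycleRecord:
    """Hypothetical sign-off record answering the questions above."""
    system: str
    planned_end_of_life: date          # how long will it live?
    annual_maintenance_budget: int     # what will it cost to watch and maintain?
    review_interval_months: int        # how often is it reassessed against new threats?
    dependencies: list[str] = field(default_factory=list)       # what needs safeguarding?
    third_party_components: list[str] = field(default_factory=list)  # external threat surface
    last_reviewed: date | None = None

    def review_overdue(self, today: date) -> bool:
        """True if the system has never been reviewed, or the interval has lapsed."""
        if self.last_reviewed is None:
            return True
        elapsed = (today.year - self.last_reviewed.year) * 12 \
            + (today.month - self.last_reviewed.month)
        return elapsed >= self.review_interval_months

# Example: a legacy integration of the string-and-sealing-wax kind described above.
record = LifecycleRecord(
    system="warehouse-to-web bridge",
    planned_end_of_life=date(2027, 12, 31),
    annual_maintenance_budget=40_000,
    review_interval_months=6,
    dependencies=["inventory DB", "payment gateway"],
    third_party_components=["legacy SOAP connector"],
    last_reviewed=date(2023, 1, 10),
)
print(record.review_overdue(date.today()))  # True – the review cadence lapsed long ago
```

The point isn’t the code; it’s that a record like this makes the uncomfortable questions mandatory at inception, and makes a lapsed review a visible fact rather than a quiet omission.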
This would be no silver bullet, nor proof against toxic psychology, but it would be a tool for everyone who knows what the right decision is yet can’t see how to make it happen. There are plenty of accepted methodologies for characterizing the shape of a project at its inception and development, and all came about to fix previous problems.
One that takes on board the nature of PFCS would be the start of fixing an unacceptably embarrassing and highly dangerous toxic reality that has been in plain sight for decades. ®