Is it fair to criticise government agencies for drowning in a growing flood of security alerts and associated patches?
Probably not; it's hard to see how they could do anything else in the present situation.
Take the Department of Human Services, whose chief information officer, Gary Sterrenberg, recently told the parliamentary public accounts and audit committee that his organisation alone had patched two and a half million devices over the past year.
That's a stupidly large number, and with the volume of required patches having doubled in a decade, even an otherwise sane thirty-day rotation policy for updates means Sterrenberg cannot guarantee that every device is secure at any given time.
Government chief information officers and administrators are being criticised for being slow, yet they face an unenviable Sisyphean task that becomes harder with every cycle.
On the one hand, their organisations face an efficiency requirement that pushes an increasing number of functions onto machines rather than people - it's faster, usually more convenient, and almost always cheaper. You can't argue against that if you're spending taxpayer money.
On the other hand, the machines and systems replacing people are often badly designed and can fail catastrophically in another key requirement, namely keeping government and citizens' data safe.
Meanwhile, the amount of data for safekeeping is growing along with the complexity of managing the systems storing and processing it.
While it's generally a good idea to keep systems up to date, patching them isn't without risk and can cause more problems than the ones it's supposed to fix.
This is where the IT business model is the opposite of just about everything else on the planet.
Apart from rare recalls, we would never accept a car manufacturer that asked owners to "patch" against or repair potentially dangerous or even lethal flaws in vehicles on a monthly basis.
Even if we did, it would be completely unacceptable if the manufacturer issued fixes that themselves failed, as they often do in IT. The reason is obvious: if something goes wrong in a car, there's a good chance people will be badly hurt or even killed.
Yet in IT we accept that at least one day a month will be dedicated to fixing flaws, some minor, some really major, in systems that are critical to the operation of government, business, healthcare and the military. Again, if something goes wrong with these systems, there could be serious damage and even risk to people's lives.
A vendor's inability to provide sufficient quality control to avoid regular patching of hardware and software actually makes the problem worse by attracting criminals - like flies to a rotting cadaver.
In other words, there is a reason that hacking, cracking and digital miscreancy in general are so popular: they're easy to pull off.
There are plenty of existing high-volume, high-value targets, with more coming online each day, offering lucrative opportunities for criminals and others who face very little risk of being caught.
Clearly, this is an untenable situation, one that won't be fixed by adding more perimeter defences and working out how to patch even faster.
Instead, IT systems vendors need to step up and stop delivering insecure crapware that requires regular patching. You could be excused for thinking that vendors ignore security on purpose, to lock customers into lengthy maintenance contracts.
There may be a way to fix this: some time ago, a regional telco outsourced the maintenance of its network to contractors to save money. To make sure it could realise the planned cost savings, the telco wrote a sinking-lid clause into the contract, stipulating that the number of faults repaired would have to decrease each year.
This had the desired effect: it gave the contractor a strong incentive to do preventive maintenance, because beyond a certain number of faults a year, the telco would not pay for repairs.
A similar sinking-lid policy for IT systems, in which vendors are expected to supply fewer patches over the life of their goods and services and are penalised for security issues that breach this requirement, could make them take their responsibilities seriously for a change.
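To make the sinking-lid mechanism concrete, here is a minimal numerical sketch. The cap schedule, starting figure and reduction rate are invented for illustration; the source describes only the general principle of a shrinking annual fault allowance beyond which the contractor goes unpaid:

```python
# Hypothetical sketch of a sinking-lid contract: the cap on billable
# faults shrinks each year, and faults beyond the cap go unpaid, so
# preventive maintenance becomes the contractor's cheapest option.
# All figures are invented for illustration.

def billable_faults(faults_fixed, year, initial_cap=1000, annual_reduction=0.10):
    """Return how many repaired faults the client will actually pay for.

    The cap 'sinks' by a fixed fraction each contract year.
    """
    cap = int(initial_cap * (1 - annual_reduction) ** year)
    return min(faults_fixed, cap)

# Year 0: the cap is 1000, so fixing 1200 faults is paid as 1000.
print(billable_faults(1200, year=0))  # 1000
# Year 3: the cap has sunk to int(1000 * 0.9**3) = 729.
print(billable_faults(1200, year=3))  # 729
```

Under such a schedule, every fault prevented rather than repaired protects the contractor's margin, which is exactly the incentive the telco was after.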