Feature

Cyber attack, atom bombs and your PLC software


November 1, 2010 by Jim Anderton, Technical Editor



I suppose it had to happen eventually: computer viruses have morphed from nuisances into real hazards to business data and our personal identities. For plastics processors, the silver lining has always been that the computing power that runs production machinery, from PLCs to more modern integrated PC-based controllers, wasn't affected.

Until now.

The so-called Stuxnet worm specifically infects Siemens WinCC/PCS7 Supervisory Control and Data Acquisition (SCADA) software, and can reprogram systems and then hide the changes. What does it change? Why? We still don't know, but the rumours sound like something out of a John le Carré novel. Hackers are apparently off the hook, industry experts say, because Stuxnet is too big, too complex and too specific to be the product of some nerdy kid in a university dorm. Beyond that, any theory goes. One thing we do know is that, presently, only Siemens systems are affected. But these are extensively employed worldwide, including in Iran's nuclear facilities, which raises the question of whether Stuxnet is a cyber attack on the Iranian nuclear program.

Whatever the truth about Stuxnet, the larger point is this: it may be the start of many similar infections. How can we protect against this kind of attack? Mainly by treating our control software the same way we would financials, personal records or any other critical business information. Here are some ideas:

1. Restrict access to production controllers. Sure, it’s great that your system can download upgrades automatically, but do you have to let software decide who gets into your system? Configure it so that upgrades have to be approved by plant personnel before they’re downloaded. Better still, download patches and upgrades to a USB stick and check them for viruses before loading. Log downloads the same way you’d document changes to a blueprint. Make software backups a part of your PM program, just like greasing bearings or adjusting clearances.
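The "check before loading, then log it" habit doesn't need anything fancier than a checksum comparison. Here's a minimal sketch in Python using only the standard library; the file names, the log format and the idea of a vendor-published SHA-256 checksum are my assumptions, not anything tied to a particular vendor's update process:

```python
# Minimal sketch: verify a downloaded patch against a published
# SHA-256 checksum before it goes anywhere near a controller, and
# append the result to a plain-text log (file names are hypothetical).
import hashlib
import datetime

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_and_log(patch_path, expected_sha256, log_path="patch_log.txt"):
    """Compare the patch's digest to the published one; log the event."""
    actual = sha256_of(patch_path)
    ok = (actual == expected_sha256.lower())
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    with open(log_path, "a") as log:
        log.write(f"{stamp}  {patch_path}  {'OK' if ok else 'MISMATCH'}  {actual}\n")
    return ok
```

A mismatch doesn't always mean malware, of course; it can mean a corrupted download. Either way, a patch that fails the check shouldn't touch a production machine.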

2. Keep several backups of previous configurations. I've kept as many as four USB backups of software, keeping a copy of each version going back through recent upgrades and patches. Why? Because you're far more likely to have a workable instruction set that predates the infection, allowing you to get your machines up and running even if you have to blow out the entire program set to get rid of a virus or worm.
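That rotation can be scripted so nobody has to remember which stick holds which version. Below is a minimal sketch, again in standard-library Python; the `.v001`-style naming, the backup directory and the default of four copies are assumptions to illustrate the habit, not a prescribed scheme:

```python
# Minimal sketch of backup rotation: copy the controller program into a
# backup folder with an incrementing version suffix, then prune so only
# the newest `keep` copies remain (oldest dropped first).
import shutil
from pathlib import Path

def rotate_backup(program_file, backup_dir, keep=4):
    """Version-stamp a copy of program_file, keeping the newest `keep`."""
    src = Path(program_file)
    dest_dir = Path(backup_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    pattern = src.stem + ".v*" + src.suffix
    # Next version number = highest existing version + 1, so numbers
    # never collide even after old copies have been pruned.
    nums = [int(p.stem.rsplit(".v", 1)[1]) for p in dest_dir.glob(pattern)]
    next_num = max(nums, default=0) + 1
    dest = dest_dir / f"{src.stem}.v{next_num:03d}{src.suffix}"
    shutil.copy2(src, dest)
    # Drop the oldest copies beyond the keep limit.
    for old in sorted(dest_dir.glob(pattern))[:-keep]:
        old.unlink()
    return dest
```

The point of keeping the version numbers monotonic is that the log of what changed, and when, survives even after the oldest files are gone.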

3. Control USB storage. Use USB memory from different manufacturers, so backup sticks are instantly distinguishable from everyday ones, and add a massive, non-pocket-size key fob to backup sticks. I use paint stirrers because they're easy to label and find on a desk or workbench. The idea is to keep them from accidentally ending up in an engineer's pocket, where they become convenient storage for every other file on his or her PC.

4. Isolate the hardware. I prefer production equipment to have no physical connection to front office computers, no matter how much your IT people say it’s the best/cheapest/most convenient way. In my world, any intranet system controlling production equipment should be accessible by production personnel only, and master processors should be configured to prevent access to Web browsers, email and internal communications software. If someone emails an upgrade, put it on a USB stick and check it for viruses first.

In the end, we're all stuck with systems that are vulnerable to a certain degree. Stuxnet itself isn't the issue here; the very flexibility of modern control software means that one-size-fits-all controllers will eventually accept malware built for a sinister purpose.

Try telling your customer that the shipment will be late because your press shut down after a cyber attack on somebody’s atomic bomb program. It sounds far-fetched and I hope it never happens, but why roll the dice?