“Bad guy gets in, lights go off” seems to be the general understanding of offensive industrial control system (ICS) operations and a perspective many still propagate. Disruptive and destructive cyber operations are not straightforward. They’re complex. They take time (often measured in years), money, and research, and usually involve several teams or individuals, each contributing their own expertise.
- Sometimes this is a product of differentiated responsibilities, created through organizational, legal, experiential, or other boundaries, where some members are limited in what they can do or how far they can go. For instance, in the US, intelligence access and effects operations fall under different legal authorities (Title 50 and Title 10, respectively).
- Sometimes this is a product of specialization where certain teams or individuals have spent years building the skills and knowledge necessary to conduct a specific portion of the mission.
- Sometimes it is a collaborative effort across wider boundaries, like two countries working together (i.e. foreign technical assistance).
- Sometimes part of the work is done by researchers in an industrial automation or engineering laboratory whose work is later incorporated, without their knowledge, into a larger military mission.
- Sometimes expertise across domains such as computer science, chemistry, electrical engineering, and physics is used to limit the risk of catastrophic unintended effects, given international legal sensitivities.
- Sometimes legal frameworks forbid action by one party but not another, enabling collaboration in the interest of both parties (e.g., the “wink and nod” approach). This type of collaboration is common in offensive operations conducted by private individuals or organizations on behalf of a state.
In any case, the most complex intrusions in the world bear the hallmarks of different elements working together.
Difficulty in Attributing Offensive ICS Operations
The complex and “messy” nature of offensive ICS operations also often precludes simple attribution to a single entity. It would be a significant mistake to attribute an entire ICS operation to a single entity (either specific or general) based solely on a few matching elements. One could easily say “Operation X” is the work of “Group Y” because X shares several elements with Y. But if X were really a combined operation of W, Y, and Z, then your attribution statement would be wrong and misleading, even if partially correct.
Further, offensive ICS cyber operations pose a significant risk to the operational party due to likely blowback and retribution for damage and disruption. To limit potential blowback, effects operations have a higher than normal likelihood of being infused with disinformation; this disinformation supports deniability or redirects retribution onto an innocent party.
Access vs. Effects
There are generally two elements of ICS disruptive or destructive operations: access operations and effects operations. In almost every case access is necessary prior to causing effects – hence the two necessary elements.
Access Operations gain access to (e.g., exploit) a target network. Once access is achieved it can be used to conduct espionage, maintain access for future planning, or enable other teams, such as effects teams.
Effects Operations manipulate industrial devices to deny, degrade, disrupt, or destroy visibility or control of an industrial process.
However, the parties conducting ICS access and effects operations may not be the same individual, group, or even government. Due to the time, cost, and expertise necessary to understand industrial control systems well enough to cause determinate effects, an “effects team” may not be able to exploit networks at all and may only be skilled at manipulating industrial control equipment. In fact, there may be different effects teams for different purposes, such as a “downstream oil and gas refining effects team” or an “electric power generation effects team,” because industrial environments can vary so greatly.
Therefore, in understanding ICS threats, we separate the access operations from the effects operations. It’s common for ICS threat analysts to say, “evidence suggests this threat is limited to access operations at this time,” meaning we only saw network exploitation and possible data exfiltration without process control or view manipulation. It is an important distinction.
Intent and Preparation of the Environment
Not all intrusions into ICS environments will lead to a destructive or disruptive effect. While it is this author’s and Dragos’ position that any illicit access to industrial control systems could lead to disruption, it may not be the adversary’s intent. There are times when simply having and maintaining access to industrial environments for future contingencies (e.g., a potential military conflict) is enough. In these cases, we say the adversary is engaging in the “preparation of the environment” to create an advantage if ICS disruption becomes necessary. This is why Dragos does not ascribe intent to most offensive ICS operations. Intent is a human element that is not always clear in computer operations.
Importantly, prepositioned access to industrial processes is bad for defenders and should be treated as a clear and present danger to industrial control systems. However, we cannot say that, just because an adversary has access, it intends to do something damaging or disruptive. Those must remain two separate thoughts.
An Extension of Statecraft
Offensive ICS operations are, at this time, limited to states and similarly focused and resourced entities; they are an extension of statecraft and just one component modern states use to exert power and influence. The impact of ICS disruption is deep and wide, not only affecting the local geographic region but also having societal and psychological dimensions. Industrial control systems have no value in themselves; their value is tied to the output of the industrial process they’re managing and the life and environment they’re protecting. Attacking industrial control systems is not an end goal; it is a means to an end. When you attack control systems, you’re not attacking computers or data; you’re attacking the process (e.g., power, gas, food, pharmaceuticals) and the people who rely on it.
This fact also raises the question of whether these (or any) offensive cyber-enabled ICS effects meet the threshold of “armed force” necessary to invoke the Law of Armed Conflict (LOAC), which incorporates the Geneva Conventions, jus in bello (i.e., the laws of warfare), and similar protections. At this time the question is left unanswered by the international legal community, meaning the behavior of state cyber operations against industrial control systems will likely be more irregular than other elements of statecraft, possibly putting innocent civilian lives at risk.
- Offensive ICS operations are difficult and expensive and therefore will normally express hallmarks of complex organization
- Accurately attributing an offensive ICS operation without substantial insight is difficult and likely to be wrong or at least misleading
- Offensive ICS operations can be understood as two elements: access to an industrial environment and causing effects; both being analyzed and understood separately
- Not all access is intended to lead to industrial process disruption; it may instead preposition assets for some unknown future contingency, limiting our ability to assess intent accurately
- ICS operations cannot be understood in a vacuum but in the larger context of statecraft, used to exert influence and cause change, including the complexity of international legal frameworks and norms
Offensive ICS operations are a shady and complex business filled with a myriad of private entities, individuals, and states, some working collaboratively and some competitively. ICS threats are not straightforward, and their long-term consequences can be deadly. We must treat ICS threats with the seriousness they deserve, but also seek to understand them before we leap to conclusions and erroneous judgments from a simple perspective on a complex problem.
- “Demystifying the Title 10-Title 50 Debate: Distinguishing Military Operations, Intelligence Activities & Covert Action” by Andru E. Wall, https://www.soc.mil/528th/PDFs/Title10Title50.pdf
- Joint Publication 3-12, Cyberspace Operations, https://www.jcs.mil/Portals/36/Documents/Doctrine/pubs/jp3_12.pdf
- Introduction to the Law of Armed Conflict by the International Committee of the Red Cross, https://www.icrc.org/en/doc/assets/files/other/law1_final.pdf
- The Law of Armed Conflict: Conduct of Operations by the International Committee of the Red Cross, https://www.icrc.org/en/doc/assets/files/other/law3_final.pdf