It’s time to rethink security certification for OT devices

Those who do not learn history are doomed to repeat it.

We’ve heard this saying countless times, but it takes on added significance when it comes to operational technology (OT) cybersecurity.

History repeats itself over and over as entities – from manufacturers to third parties to end users – follow the same legacy processes and execute the same strategies.

For the past three years, Forescout has conducted research into OT device security issues and led the largest security assessment of TCP/IP stacks – the communication protocols that OT devices rely on to operate – discovering over 95 new vulnerabilities. This work continued earlier this summer with our OT:Icefall research into OT equipment and protocols, which uncovered 56 additional vulnerabilities.

Similar conclusions can be drawn from all of this research: legacy processes, insecure-by-design practices, and reliance on previous certifications are the main culprits and need to be addressed. One way to address them is to rethink security certifications.

The problem with following the same processes and certifications

We live in an ever-changing, connected world. The industries that power commerce, support our health, and drive innovation deliver their value faster than ever, thanks to OT devices.

It is precisely because of this speed and constant change that it is no longer enough to follow the same processes and rely on the same certifications.

To know where we want to go, it is imperative to examine where we have been. While not the case for all OT devices, security is often a second- or third-tier priority before a device hits the market. Activities such as scanning for vulnerable code do take place, as do component and protocol walkthroughs to ensure the device meets compliance requirements. These activities inform the security certification process, which is flawed because it is a static, one-time assessment.

The problem is that a device may go through a rigorous security risk assessment before being released to market or deployed on a network, but that doesn’t mean it stays secure for its entire lifetime. Additionally, during this security risk assessment, the security of the actual protocols and software components is rarely examined to a satisfactory level. Our OT:Icefall research found that 74% of the product families affected by the discovered vulnerabilities had already received some form of security certification.

That doesn’t mean security certifications don’t make sense. It means we need to reassess the security certification process.

How to reassess security certifications

Security teams, manufacturers, and regulators have grown accustomed to certifications based on opaque security definitions and functional testing. They have also gotten into the habit of playing hot potato with security accountability: government agencies have attempted to place greater responsibility on manufacturers, and manufacturers, in turn, have pushed it onto security teams. This is the problem. Security certification and the management of the long-term security risk posture of OT devices need a more holistic approach and must be a team sport.

Security certification in an OT world should encompass the following:

  • Well-defined and widely accepted security requirements tied to realistic attacker models. Security certifications should clearly state what they certify against. Some schemes adopt certification levels that correspond to increasingly sophisticated classes of attackers, but this sophistication is defined in generic terms, such as “moderate resources”, “sophisticated means” and “specific skills”. These vague terms lend themselves to interpretations that reflect an auditor’s perceptions and expectations. Attacker models and their capabilities need to be standardized. Additionally, lower certification levels sometimes only consider issues such as unintentional misuse, which is too lax and allows for insecure designs. Basic security requirements should include signed firmware and encrypted, authenticated protocols (see the first sketch after this list).
  • Rigorous testing of protocol implementations. Many certification schemes limit the assessment of security requirements to functional testing, which means that the presence of a feature is verified but its implementation is never inspected. These tests also generally exclude proprietary protocols. As a result, a functional security assessment may conclude that authentication is present on an engineering interface even though the underlying protocol is unauthenticated and all authentication is performed on the client side (see the second sketch after this list). Similarly, communication tests often only evaluate open protocols that auditors already know. The specifications of all communication protocols should be provided to auditors during certification efforts, and ideally these protocols should be evaluated at the implementation level to avoid issues where a feature is present but vulnerable.
  • Certification of individual components of a connected device. Supply chain vulnerabilities are widespread. Since virtually every device is made up of a myriad of reusable software components, these components should be considered the basic unit for testing and certification. This could lead to libraries of trusted components and reusable certifications that would let device manufacturers choose from known-good designs and implementations.
  • Automatic certification invalidation. The discovery of vulnerabilities in a device should automatically invalidate its security-certified status until the issues are resolved and patched. This automatic invalidation is achievable with recent technical developments such as software bills of materials (SBOMs), the Common Security Advisory Framework (CSAF) and the Vulnerability Exploitability eXchange (VEX) (see the third sketch after this list).
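
To make the signed-firmware requirement from the first bullet concrete, here is a minimal sketch of the check a device could run before accepting an update. It assumes an Ed25519 vendor key and a detached signature file; the function name, file paths, and signature scheme are illustrative, not taken from any particular certification scheme.

```python
# Minimal sketch: verify a firmware image against a vendor signature
# before installing it. Requires the "cryptography" package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def firmware_is_authentic(image_path: str, sig_path: str,
                          vendor_pubkey: bytes) -> bool:
    """Return True only if the image matches the vendor's signature."""
    public_key = Ed25519PublicKey.from_public_bytes(vendor_pubkey)
    with open(image_path, "rb") as f:
        image = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    try:
        public_key.verify(signature, image)  # raises on any mismatch
        return True
    except InvalidSignature:
        return False
```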
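
The client-side authentication anti-pattern from the second bullet is easiest to see in code. In this deliberately simplified sketch, the device executes whatever arrives on its engineering port; the port number and framing are invented for illustration. A functional test driven through the official client would report that authentication is present, while a packet-level test would show that any peer can send commands directly.

```python
# Minimal sketch of a device whose "authentication" lives entirely in
# the official engineering client. The protocol carries no credentials.
import socket

def vulnerable_controller(host: str = "0.0.0.0", port: int = 20547) -> None:
    srv = socket.create_server((host, port))
    conn, _ = srv.accept()
    with conn:
        command = conn.recv(1024).decode()
        # No credential check here: the device trusts that the official
        # client already verified the operator's password locally.
        print(f"executing privileged command: {command}")

# Sending b"STOP" straight to port 20547 succeeds without any login,
# which only an implementation-level protocol test would reveal.
```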
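
Finally, here is a minimal sketch of how automatic invalidation could work in practice, assuming three hypothetical inputs: an SBOM listing component versions, an advisory feed of the kind CSAF standardizes, and VEX statements marking advisories that do not apply to the product. All data structures and names here are invented for illustration.

```python
# Minimal sketch: suspend a device's certified status whenever an
# unresolved advisory matches a component in its SBOM.
from dataclasses import dataclass

@dataclass
class Advisory:
    vuln_id: str           # e.g., a CVE identifier
    component: str
    affected_version: str

def certification_valid(sbom: dict[str, str],
                        advisories: list[Advisory],
                        vex_not_affected: set[str]) -> bool:
    """Certification stays valid only if no open advisory matches the SBOM."""
    for adv in advisories:
        if (sbom.get(adv.component) == adv.affected_version
                and adv.vuln_id not in vex_not_affected):
            return False  # open vulnerability: invalidate certified status
    return True

# A device embedding a vulnerable TCP/IP stack version would lose its
# certification until a fixed version shows up in its SBOM.
sbom = {"example-ip-stack": "2.1"}
advisories = [Advisory("CVE-0000-0000", "example-ip-stack", "2.1")]
print(certification_valid(sbom, advisories, vex_not_affected=set()))  # False
```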

Once a certified device is running and communicating on a company’s network, the real work of managing long-term security risk begins. Consistent monitoring and contextual risk assessment of OT devices by the security team is critical. Likewise, manufacturers must continually test their devices in new scenarios, reassess device components to identify emerging risks, and share this information with enterprise end users. When it comes to improving security posture, remember that we are all part of the same team.

About the Author
Daniel dos Santos is the Head of Security Research at Vedere Labs at Forescout, where he leads a team of researchers who identify new vulnerabilities and monitor active threats. He holds a PhD in computer science, has published over 30 articles in cybersecurity journals and conferences, and has spoken at conferences such as Black Hat, Hack In The Box, and x33fcon.
