
Executive Order: “Improving the nation’s cybersecurity”

Supply chain cybersecurity risk management and NERC CIP-013 consulting, Tom Alrich LLC

Currently with Tom Alrich LLC, I provide strategy and compliance consulting to electric power industry clients and vendors to the power industry, focusing on the NERC CIP cybersecurity standards....

Yesterday, the White House put out its long-rumored EO on cybersecurity. From what I’d read, I thought it would focus on software security. It certainly does that, but it also addresses other areas of cybersecurity, especially incident response. Regardless, every topic covered in the EO addresses an important need – and some of them were needs I hadn’t thought of, for example the “cyber National Transportation Safety Board”, which, now that I think of it, is a great idea.


In this post, I’m not going to try to summarize the whole order. Rather, I’ll focus on what’s in Section 4, “Enhancing Software Supply Chain Security”, and I’ll skip some of the items that I find less interesting than the others. There’s a lot in there!

  • On page 13 in item (c), NIST is ordered to issue preliminary guidelines for most of the items in Section 4, within 180 days of the date of the order.
  • In item (e) on the same page, NIST is ordered to issue guidance for those items, due 90 days after the preliminary guidelines are issued. The guidance includes “standards, procedures or criteria” regarding:

(i) secure software development environments, including such actions as: (A) using administratively separate build environments; (B) auditing trust relationships; (C) establishing multi-factor, risk-based authentication and conditional access across the enterprise; (D) documenting and minimizing dependencies on enterprise products that are part of the environments used to develop, build, and edit software; (E) employing encryption for data; and (F) monitoring operations and alerts and responding to attempted and actual cyber incidents;
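Of items A-F, item (D) – documenting and minimizing dependencies in the build environment – is the easiest to start on programmatically: you can’t minimize dependencies you haven’t documented. As a purely illustrative sketch (my own, not anything the EO or NIST prescribes), a Python-based build environment could inventory its own installed packages like this:

```python
from importlib import metadata

def build_environment_inventory():
    """Enumerate the packages installed in this build environment.

    A crude first step toward item (D): produce a documented list
    of (name, version) pairs for everything present at build time.
    """
    return sorted(
        (dist.metadata["Name"], dist.version)
        for dist in metadata.distributions()
        if dist.metadata["Name"]  # skip entries with broken metadata
    )

for name, version in build_environment_inventory():
    print(f"{name}=={version}")
```

This only covers Python packages, of course; a real inventory would also have to cover OS packages, compilers and the build tools themselves.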

  • It seems to me that all of the above controls were probably inspired by the SolarWinds attack, since the key event (if you want to call a 9-month process an “event”) was when the Russians penetrated the SolarWinds build environment for their flagship Orion platform and implanted the Sunburst malware in 7 or 8 Orion updates. In any case, all of items A-F are important for securing a software build environment, and I certainly support making them mandatory for all software suppliers to the federal government – and by extension, suppliers to private industry as well, since few if any suppliers will sell one very secure product for government use and a not-so-secure product for everyone else. That just doesn’t work well on a single marketing brochure.
  • The first item I want to discuss is item (vi) at the top of page 15. This deals with what I consider to be a really important area of concern for software supply chain security: The security of what went into building a software product, not just the security of the code written by the developer of the product. Item (vi) includes:
    1. Maintaining accurate and up-to-date data, provenance (i.e., origin) of software code or components;
    2. Controls on internal and third-party software components, tools, and services present in software development processes; and
    3. Performing audits and enforcement of these controls on a recurring basis.
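To make the second and third items a little more concrete, here is a minimal sketch (my own illustration, not anything the EO prescribes) of one common control on third-party components: pinning each component to a known SHA-256 digest and verifying it before use. The component name and digest below are placeholders, not real values.

```python
import hashlib

# Hypothetical allowlist of approved third-party components:
# file name -> expected SHA-256 digest (placeholder, not a real digest).
PINNED_COMPONENTS = {
    "examplelib-1.0.0.tar.gz": "0" * 64,
}

def verify_component(path, expected_sha256):
    """Return True if the file at `path` hashes to the pinned digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large components don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256
```

The audit in the third item is then, at its simplest, just re-running this check on a schedule and raising an alarm on any mismatch.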
  • Note the second item: Besides controls on components, this item says there need to be controls on tools and services that were “present in software development processes”. Tools and services aren’t components, and they aren’t normally included in an SBOM; on the other hand, a lot of people think they should be. Note to self: This is something to be discussed in the Energy SBOM Proof of Concept (which BTW will hold its second public workshop next Wednesday, as I described in yesterday’s post. Anyone who uses energy is welcome to attend – and if you don’t use energy, please tell me how you accomplish that feat).
  • Item (vii) on page 15 requires that, starting 270 days from yesterday, software suppliers to the federal government provide a software bill of materials (SBOM) with every software product they deliver (it uses the term “guidance”, not “requirement”, but in this context I think a federal agency will consider this to be an offer they can’t refuse). And – again – I’m sure that suppliers of software to the Feds will deliver SBOMs to their commercial customers as well. How could they not?
  • So since SBOMs will be “mandatory” in 269 days, does this mean that the participants in the NTIA Software Transparency Initiative can disband and go home? No, not at all. This is because “providing SBOMs” isn’t a one-off deal. Yes, it is essential that a software supplier provide an SBOM with a new product, but they can’t stop there. Ultimately, they will probably need to provide a new SBOM every time there is a change in the software (for example, when an older component is replaced with a newer version of the same component), or even when just the value of one field in the SBOM changes without any change in the code at all (e.g. the supplier of a component is acquired by another supplier, so their name changes).
  • But this is still an unresolved question; the same goes for a host of other questions about how SBOMs are produced, distributed and especially used. Most of these questions can’t be settled simply by a bunch of wise people sitting around a virtual table and stroking their chins; they have to be worked out through a proof of concept in which these procedures are tested in practice. An NTIA PoC for healthcare has been in operation (with different iterations) since 2018; PoCs for autos and energy are starting up now; and others will undoubtedly follow.
  • So even though delivering a single SBOM will be “mandatory” for software suppliers in 270 days, that is just the beginning. The real goal of the Software Transparency Initiative is for most suppliers to be producing SBOMs (and also VEX documents) as often as needed, which may in some cases be quite frequently. This goal will take many years to achieve, but I don’t doubt that it will be achieved at some point. Meanwhile, the initiative needs to keep testing the procedures in PoCs and documenting the results.
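For illustration, the kind of update trigger I just described can be sketched as a simple diff of component lists. The tuple layout and the component names below are my own hypothetical choices, not any SBOM format’s:

```python
def sbom_changed(old_components, new_components):
    """Diff two SBOM component lists.

    Components are (name, version, supplier) tuples; any added or
    removed entry -- including a supplier rename with no code change
    at all -- means a new SBOM should be issued.
    """
    old, new = set(old_components), set(new_components)
    return {"added": new - old, "removed": old - new}

# Hypothetical example: one component is upgraded to a newer version.
v1 = [("libfoo", "1.2.0", "Foo Inc")]
v2 = [("libfoo", "1.3.0", "Foo Inc")]
diff = sbom_changed(v1, v2)
# Both sets are non-empty, so the supplier owes its customers a new SBOM.
```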
  • Another interesting item on page 15 of the EO is “(x) ensuring and attesting, to the extent practicable, to the integrity and provenance of open source software used within any portion of a product”. It’s just about certain that the majority – one study says 90% – of software components are open source. So this is obviously an important requirement. I’ll point out that it’s very fortunate that the words “to the extent practicable” are included here, since this won’t be easy to comply with. Of course, knowing what open source components are included in a software product requires an SBOM.
  • One item on page 15 specifically calls out NTIA: “(f) Within 60 days of the date of this order, the…Administrator of the National Telecommunications and Information Administration shall publish minimum elements for an SBOM.” This should in theory be easy, since the minimum elements for an SBOM were described in this document in 2019: Author name, Supplier name, Component name, Version string, Component hash, Unique identifier and Relationship. However, I know there’s still some disagreement about whether this is the right list, so I’m sure this will be a topic in some NTIA meetings over the next two months. The nice aspect of this is that the question will without a doubt be settled after 60 days!
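For what it’s worth, those seven elements fit in a very small record. Here’s a hedged Python sketch – the field names are my paraphrase of the NTIA list, not a spec, and the values are invented for illustration:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class SbomEntry:
    """One component record holding the seven candidate minimum elements."""
    author_name: str        # who created this SBOM
    supplier_name: str      # who supplies the component
    component_name: str
    version_string: str
    component_hash: str
    unique_identifier: str
    relationship: str       # e.g. "included in" the parent product

entry = SbomEntry(
    author_name="Example Corp",
    supplier_name="OpenSSL Project",
    component_name="openssl",
    version_string="1.1.1k",
    component_hash="sha256:" + "0" * 64,            # placeholder digest
    unique_identifier="pkg:generic/openssl@1.1.1k",  # illustrative ID
    relationship="included in",
)
print(json.dumps(asdict(entry), indent=2))
```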
  • Item (g) on page 15 reads “Within 45 days of the date of this order, the Director of NIST…shall publish a definition of the term ‘critical software’…That definition shall reflect the level of privilege or access required to function, integration and dependencies with other software, direct access to networking and computing resources, performance of a function critical to trust, and potential for harm if compromised.”
  • Furthermore, item (h) on page 16 reads “Within 30 days of the publication of the definition…the… Director of CISA…shall identify and make available to agencies a list of categories of software and software products in use or in the acquisition process meeting the definition of critical software…” So NIST will define critical software and CISA will decide what software that definition applies to in practice.
  • I find the above quite interesting. Anybody who has been involved with NERC CIP compliance knows that the foundation of that compliance is identification of “critical assets” (particular servers, workstations and integrated devices like relays and firewalls); these are what the protections required by the CIP standards apply to (and I’m sure most other cybersecurity standards follow this approach to determine applicability). However, in this case the assets being protected are software assets, which would normally be run on generic Intel-standard servers and workstations.
  • But what about integrated, sealed devices, like Cisco™ firewalls or Schweitzer™ relays? The software running in them can certainly be considered critical, but the user can never open up the device and access the software directly in order to apply security controls.
  • Of course, maybe CISA would address this issue by not categorizing as critical any software that runs on a sealed device. But doesn’t that strike you as a cop out? For example, it would be very hard to argue to a substation engineer that the software running on a protection relay isn’t critical to the safe functioning of the power grid.
  • We’ll see what happens with this, but meanwhile it seems there may be a serious flaw in the EO, in that it doesn’t seem to know how to deal with intelligent devices, just software sold separately from hardware. Of course, there are various ways to remediate this problem; it doesn’t have to sink the whole EO!
  • What happens once critical software is identified? Item (i) on page 16 reads “Within 60 days of the date of this order… the Director of NIST…shall publish guidance outlining security measures for critical software…, including applying practices of least privilege, network segmentation, and proper configuration.”
  • Note that here we’re no longer talking about measures that software suppliers have to take, but rather that federal agencies have to take to secure “critical software”. In other words, Section 4 of the EO applies both to suppliers and users. Of course, there’s nothing wrong with this approach. Sometimes, the best mitigations for supply chain cyber risks are the mitigations that are applied by the user organization itself.
  • Now we jump over a couple pages of discussion of implementation to item (r) on page 18, which reads “Within 60 days of the date of this order…the Director of NIST…shall publish guidelines recommending minimum standards for vendors' testing of their software source code, including identifying recommended types of manual or automated testing (such as code review tools, static and dynamic analysis, software composition tools, and penetration testing).”
  • In other words, rather than try to specify up front how vendors should test their source code (a best practice for software developers, but one that’s seldom mandatory), the EO requires NIST to recommend types (note the plural) of tools and methodologies for testing. This is how security requirements should be written.
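To illustrate the simplest end of that spectrum, here is a toy static-analysis check in Python – obviously nothing like a commercial code review tool, but the same idea in miniature: parse the source without running it, then walk the syntax tree looking for a risky construct (here, calls to eval).

```python
import ast

def find_eval_calls(source):
    """Return the line numbers of calls to eval() in the given source.

    A toy static-analysis check: the code is parsed, never executed,
    and each Call node is inspected for the risky built-in.
    """
    hits = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "eval"):
            hits.append(node.lineno)
    return sorted(hits)
```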
  • Items (t) and (u) on page 19 direct NIST, within 270 days of the order, to develop a consumer software labeling program – i.e. some way of classifying consumer software by categories ranging from “You have nothing to fear from this software” to “This is toxic s__t”. I wish NIST good luck in developing this program, but the White House may have just handed them the tail of a raging tiger and told them to capture it and lock it in a cage. This probably won’t go well, but I admire the WH for even being willing to try.
  • Finally, I point you to item (j) on page 31. This is a “definition” of SBOM, which goes well beyond what a normal definition would include. It seems that somebody – perhaps Allan Friedman? – decided to use this definition as a teaching tool, even more compact than the recently-produced SBoM at a Glance document (which I also recommend you read).

Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC. If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com.

 
