New SEC cybersecurity rules put more onus on the CISO, not so much on directors
Myke Lyons
August 3, 2023
With the SEC's adoption of new rules on cybersecurity risk management, strategy, governance, and incident disclosure by public companies, one thing is clear: better definitions are required.
Disclosure of cybersecurity breaches: open to interpretation
If you haven't read the SEC filing from July 26, 2023 (or the promptly released dissent from Commissioner Hester M. Peirce), let's take a quick look at it together, starting with:
The new rules will require registrants to disclose on the new Item 1.05 of Form 8-K any cybersecurity incident they determine to be material and to describe the material aspects of the incident's nature, scope, and timing, as well as its material impact or reasonably likely material impact on the registrant. An Item 1.05 Form 8-K will generally be due four business days after a registrant determines that a cybersecurity incident is material. The disclosure may be delayed if the United States Attorney General determines that immediate disclosure would pose a substantial risk to national security or public safety and notifies the Commission of such determination in writing.
Ok, simple but not simple. The simple interpretation is that publicly traded companies (all under the purview of the SEC) must report cybersecurity incidents that have a material impact to the SEC via Form 8-K within four business days of determining materiality. The "not simple" parts are:
What is "material"? It may be wise for CISOs to align "material" with SOX descriptions of materiality. "Hi CFO, we are going to spend more time together."
What is a "reasonably likely" material impact? You've all received breach notices from companies that assure you that you're under no risk, that all data is encrypted and that bad actors have no way to decrypt. Would those same breaches still require an 8-K, and if so, does that send a very mixed message to users and investors?
What about breaches of non-material data (e.g., PII) that only harm users but not the company itself? If bad actors get your home address, does the SEC care? Likely not (for now).
How are companies supposed to respond to and resolve a vulnerability, as well as file an 8-K, within four business days? As a CISO, a rule that's been beaten into my brain (and I've since beaten into other brains) is that security teams need to minimize the number of people who know about an incident until we know the scope and impact. That's why we create access programs like TLP (Traffic Light Protocol) that emphasize "need to know" and "do not share". Sharing breach information publicly within four business days makes me think the authors may assume a breach is binary.
Does disclosing nature, scope, and timing put companies at risk? If you've only discovered parts of a breach, does your disclosure let attackers know that they had access to your system for three months before detection? Some could argue that publicly sharing this information puts your company at risk, as well as potentially giving feedback to bad actors on how to refine their attacks.
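The TLP mention above boils down to a simple gate: only information labeled for public release should ever reach a public channel like an 8-K. Here's a minimal sketch of that idea; the audience descriptions paraphrase FIRST's TLP 2.0 labels, and `may_share_publicly` is a hypothetical helper, not part of any standard tooling.

```python
from enum import Enum

# FIRST's Traffic Light Protocol (TLP) labels, paraphrased.
# Each value summarizes the audience that label permits.
class TLP(Enum):
    RED = "named recipients only, no further sharing"
    AMBER = "recipient's organization and clients, need-to-know"
    GREEN = "peers and community, but not public channels"
    CLEAR = "no restrictions, public disclosure allowed"

def may_share_publicly(label: TLP) -> bool:
    """Only TLP:CLEAR information belongs in a public filing."""
    return label is TLP.CLEAR

print(may_share_publicly(TLP.AMBER))  # False: incident details stay need-to-know
print(may_share_publicly(TLP.CLEAR))  # True
```

The tension with the SEC's four-business-day clock is exactly this: during an active investigation most incident details sit at TLP:AMBER or RED, and promoting them to CLEAR is a deliberate, late-stage decision.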
These are just a few items that come to mind. Commissioner Peirce laid out her own, very articulate reasons for concern, and I encourage you to read them.
Disclosure of security processes: a public peek under the hood
Up next is a rule that requires companies to publicly disclose their security practices, as well as the potential risks that can come from a breach.
The new rules also add Regulation S-K Item 106, which will require registrants to describe their processes, if any, for assessing, identifying, and managing material risks from cybersecurity threats, as well as the material effects or reasonably likely material effects of risks from cybersecurity threats and previous cybersecurity incidents. Item 106 will also require registrants to describe the board of directors’ oversight of risks from cybersecurity threats and management’s role and expertise in assessing and managing material risks from cybersecurity threats. These disclosures will be required in a registrant's annual report on Form 10-K.
Disclosure of some additional concerns
This new set of rules was created in an effort to protect investors. "Whether a company loses a factory in a fire — or millions of files in a cybersecurity incident — it may be material to investors," said SEC Chair Gary Gensler. He followed with, "Currently, many public companies provide cybersecurity disclosure to investors. I think companies and investors alike, however, would benefit if this disclosure were made in a more consistent, comparable, and decision-useful way. Through helping to ensure that companies disclose material cybersecurity information, today’s rules will benefit investors, companies, and the markets connecting them."
This all makes sense in a vacuum where every company is the same and every incident is neatly cleaned up within three days. A large company, while fully staffed in security, would be challenged by sprawling infrastructure where four days isn't reasonable. A small company, on the other hand, may not have the personnel or expertise to fully manage an incident that quickly and could be forced to go offline for a while. In either case, a rigid response time protocol doesn't help the company, and in some cases, will hurt it. If a company feels pain, its investors feel that same pain.
Additionally, what are the repercussions of not disclosing within four days? Any company is going to perform a cost-benefit analysis as soon as the breach is discovered. If disclosing the breach quickly could put them at more risk, CFOs will work with CISOs to calculate that potential cost and then compare it to the fines that will accrue from disclosing late. If it makes more financial sense to disclose later, that's just what will happen. Also, companies could just lie about when they discovered it. I'm not recommending that, but breaches of trust don't necessarily incur long-term punishment from the market (NYSE: EFX > $200 USD).
Finally, this is a ruling from the SEC. This only impacts publicly traded companies. What about massive private companies that have a lot of highly sensitive data? X (formerly Twitter) has a lot of private information but remains out of the SEC's grasp as long as it remains private. What about startups that often "move fast and break things" to get to the market first? What about government agencies that may or may not be listening to me through my computer because sometimes I talk out loud while typing?
In general, this is a well-intentioned first step, but it strays at times and falls short at others. And as the CISO of a privately owned business, thankfully it doesn't impact me today.
The unspoken message: design and develop securely from the start
Ok, that was a lot of naysaying, but I'm a CISO — it's my job to look for potential flaws. Overall, I'm supportive of transparency. I believe public companies should be transparent. Heck, I think all companies and governments should be transparent, but people aren’t great at measuring or defining what a breach is.
If we read it as it's written, we can identify problems. But if we read between the lines, what the SEC is really saying (whether they realize it or not) is that companies always need to put security at the forefront of any technology, hurray! If your "material" data can't be breached, the disclosure time could be an hour and it still wouldn't impact you.
That said, I understand that there is no way to stay fully secure. No matter how fortified a castle is, an attacker only needs to find one weak spot in the wall (which could be building a ridiculously large trebuchet). So assuming we can't build a perfectly secure technological ecosystem, here are some things I think we should all be doing, regardless of SEC rulings:
Know your assets: Asset discovery is a core fundamental needed to secure your systems. If you don't know everything you have running and all the points that they're accessible, someone could walk in undetected.
Keep shifting left: I know, I know, a well-beaten drum, but it's an important one. Systems must be designed with security in mind from the start. Retrofitting security is a crappy option — and considerably more expensive.
Build muscle memory around IR & comms: Have a practiced comms plan in place for if/when a breach happens. Have PR, IR, the CISO, and lawyers create a practiced approach with variations for breach types. Even if you're private, you should have this in place. It's only a matter of time before this type of requirement extends beyond public companies. This is where an ounce of prevention goes a long way.
Use accurate tools to fix fast: Security teams are generally understaffed and overwhelmed. Give them the tools they need to succeed quickly. Not to toot Snyk's own horn too loudly, but when Log4Shell hit in late 2021, 98% of our customers were able to resolve their instances of the vuln within the first 48 hours. That would've given them 48 more hours to file their 8-K.
Lastly, partner with your engineering friends: As security people, we rely on many others to help us secure our stuff. Use technology that is digestible and accessible to other technologists. The right milkshake will bring them to the yard (mint chocolate chip for me, thx).
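The first, second, and fourth points above meet in one basic loop: know what you're running, and check it against known advisories fast. Here's a minimal sketch of that idea, assuming a pinned dependency inventory; the `KNOWN_VULNERABLE` table and `audit` function are illustrative stand-ins, not a real advisory feed or any vendor's API.

```python
# Illustrative advisory data keyed by (package, pinned version).
# The entries here are examples, not a live vulnerability feed.
KNOWN_VULNERABLE = {
    ("log4j-core", "2.14.1"): "CVE-2021-44228 (Log4Shell)",
}

def audit(inventory: dict) -> list:
    """Return advisories matching pinned (package, version) pairs.

    inventory maps package name -> pinned version string.
    """
    findings = []
    for pkg, version in inventory.items():
        advisory = KNOWN_VULNERABLE.get((pkg, version))
        if advisory:
            findings.append(f"{pkg}=={version}: {advisory}")
    return findings

# A toy asset inventory: if you don't know these pins exist,
# you can't know you're exposed.
assets = {"log4j-core": "2.14.1", "guava": "31.0"}
print(audit(assets))  # flags the vulnerable log4j-core pin
```

Real tooling works against full dependency graphs and continuously updated advisory databases, but the principle is the same: the completeness of the inventory bounds the usefulness of the scan.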