Season 7, Episode 118

This Is How They Tell Me The World Ends - A Look At Supply Chain Security With Nicole Perlroth

Guests:
Nicole Perlroth

Nicole is a cyber security journalist and has covered many high-profile cases, such as the Russian hacking of nuclear power plants, North Korea’s attacks on movie studios, and Chinese government-sanctioned cyber-attacks around the globe. She is also the author of This Is How They Tell Me the World Ends, which provides readers with details about the most secretive, government-backed market in the world, cyberweapons.

In this conversation, we learn why cybersecurity is such an essential topic for non-technical people, cyber security threats that exist within global supply-chain markets, and the definition of cyber hygiene. Hear some examples of high-profile cyber attacks, steps companies should take regarding cyber security, why cyber security stories rarely make headlines, and the human-behaviour element behind the problem. We also learn ways in which society and governments can act to overcome the challenges of cyber security, and what advancements are needed within the space. Tune in to learn everything you need to know about the undercover cyber threat and what you can do about it, with expert Nicole Perlroth!

EPISODE 118

“Nicole Perlroth: We need to basically hand developers a grab bag of intuitive tools that give them no excuse not to use these tools as they're developing code. Just, ‘Oh, let me stop and just click on this button,’ almost like spell check. Actually, I've never said that before. We need spell check for code.”

[INTRODUCTION]

[00:00:25] ANNOUNCER: Hi. You’re listening to The Secure Developer. It’s part of the DevSecCon community, a platform for developers, operators and security people to share their views and practices on DevSecOps, dev and sec collaboration, cloud security and more. Check out devseccon.com to join the community and find other great resources.

This podcast is sponsored by Snyk. Snyk’s developer security platform helps developers build secure applications without slowing down, fixing vulnerabilities in code, open source, containers, and infrastructure as code. To learn more, visit snyk.io/tsd. That’s S-N-Y-K.IO/TSD.

[INTERVIEW]

[00:01:14] Guy Podjarny: Hello, everyone. Thanks for tuning back in to The Secure Developer. Nicole, we're really excited to have you here with us on the show. Thanks for coming on.

[00:01:22] Nicole Perlroth: Thank you so much, Guy. I'm excited to be here.

[00:01:26] Guy Podjarny: Nicole, we all know now that you've written this great book, This Is How They Tell Me the World Ends. Quite a riveting read or listen for me as I listened to it. For those who are not familiar with it, haven't read it yet, tell us a little bit about what the book is about.

[00:01:41] Nicole Perlroth: Sure. The book is really about this race to the bottom that we have been in with cybersecurity. The title came to me in the shower one day and just stuck. It's a little bit about my own journey in learning about the threat landscape and our vulnerability. It comes from the point of view of someone who had very little technical expertise, and was essentially thrown into the deep end of the pool, onto the cybersecurity beat at the New York Times, during what would become one of the most consequential decades for cybersecurity, when threats really moved from identity scams, spam, the occasional high-profile DDoS attack, to this post-Stuxnet era. This recognition from every country under the sun that they could use code not just for espionage, but for destruction.

The goal of the book was really to wake up not the technical community. People in this industry have known for a very, very long time what the stakes were here. But to wake up the non-technical community and to say, “Hey, this is not really just a technical problem anymore. This is a societal problem. This is a culture problem. This is a leadership problem. This is a policy problem, and this is an awareness problem.” It's time that all of you learn what's been happening and the tradeoffs we've been making on cybersecurity in the name of everything else, whether it was cost savings, or convenience, or even national security, old thinking about national security that we were sacrificing cybersecurity for. That's the book. That's the long answer.

[00:03:42] Guy Podjarny: The book is full of stories that are interesting, although not always – you're not always happy to hear that they've happened. They're riveting. Even as a techie, even as someone in cyber, there was still some newness for me, not as much as I would like, I won't lie. It gets dark at times in terms of all these stories. Did you have a moment where you wondered, “You know what? Maybe I don't want to really continue exploring here,” or were there crisis moments on that front?

[00:04:09] Nicole Perlroth: There were a lot of crisis moments. It took me seven years to do this book. It's not that it took me seven years to write 400 pages, or whatever it is. There was a lot of writer's block involved. I think some of that was fear related to the subject matter. I think some of it was impostor complex, sort of, how dare I, this non-technical woman journalist, tell this story? Because I was a non-technical female, New York Times mainstream journalist telling the story, I was definitely scared of the Twitter blowback that I knew would inevitably follow.

But interestingly, I never thought, “I need to stop here, because I'm afraid of nation-state hackers coming for me,” or, “I am scared of what will happen if I expose these highly classified programs, offensive programs, within government.” It was really more about, how is this going to land with the cybersecurity community? I think at a certain point, I just had to put them out of my head and remind myself, I'm not writing it for the community that already knows and has known for a long time what's been happening in this space. I'm writing it for my mom. I'm writing it for Senator Klobuchar. I'm writing it for someone who might have some vague notions of what's going on in this space.

Maybe they've read my articles every now and then, but they don't really know how we got here. They don't really know our own government's role in how we got here, and don't understand that this is not nuclear. This is not a situation where just because you're on top, and you have the top weapons in the space, your adversaries won't get there. It's really a space where there's a low barrier to entry, and the gap between the cyber superpowers and everyone else was closing much quicker than anyone in the intelligence community gave them credit for. At the end of the day, I got the book done.

I'll tell you that, actually, the publisher, my original publisher canceled on me about two years before the book came out. I had to go out and resell the book. Everyone turned me down, except for one editor at Bloomsbury; Anton Mueller, who said, “I just want it as is. I don't want to change the title. I don't want to change a single word. I see that this book needs to get out there, and I will do whatever it takes to get you across the finish line.” Thank God that happened. I even got a note from the old publisher saying, “Wow, we really screwed up here.”

[00:07:11] Guy Podjarny: Yeah. Well, good on you for persevering here. It's like J.K. Rowling, who famously got turned down by some 50, 60, 70 publishers before –

[00:07:23] Nicole Perlroth: Right. Guess what? My publisher was her publisher, Bloomsbury. The only one who took a chance on her is the only one who took a chance on it.

[00:07:32] Guy Podjarny: That's a good sign, if there's a pattern over there. I found the book very interesting, and I feel it actually is quite relevant to people in the technical community, not only if you're non-technical. By the way, technical people get Twitter blowback all the time as well. I think a lot of the problem of security is that it's invisible. It doesn't have a natural feedback loop. You don't really know that you're not doing it right, that you're causing problems. Like it or not, one of the ways of raising awareness and raising how much people care to look is to understand the cautionary tales, understand the what-ifs.

It's hard when you hear about a hack that happened to some other corporation. It's not you. There's a lot of othering in what's happened here. Part of what the book does is talk about the geopolitical impact of cyber espionage and cyber warfare. You can't really say, “This doesn't affect me.” It needs to touch home sometimes, and I think the book achieves a lot of that.

[00:08:33] Nicole Perlroth: Yeah. I think that was definitely my takeaway over the last 10 years. It was one that the person I call the ‘godfather of cyber war’ in the United States, this character Jim Gosler, who's considered a real hero in the intelligence community, emphasised over and over again. If he didn't think I was paying attention, he would state it again. He would say, “Nicole, I need to repeat this. You can never be fully sure that a system doesn't have vulnerabilities.”

It goes back to this famous speech by, I believe it was Ken Thompson, Reflections on Trusting Trust, which he gave in 1984, where he said, “Unless you've written the code yourself, you can never be fully confident that it doesn't contain a backdoor.” What Jim Gosler said to me was, “Think of where we are now. Think of where we are now with global supply chains. Not only do you not write the code, you don't make the chips, you're not overseeing the factory. In fact, a lot of these factories are in countries that we would call adversaries, like China, that are some of the most prolific backers of state-sponsored hacking and cyber espionage.” It's just this huge wake-up call that even the people who are at the top of their game don't trust anything.
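
Thompson's construction is worth making concrete. Below is a toy sketch of the idea, in Python rather than a real compiler, with every name hypothetical: a trojaned build tool that injects a backdoor the source code never shows.

```python
# Toy illustration of Thompson's "Trusting Trust" attack, greatly simplified.
# A trojaned "compiler" recognises that it is building a login routine and
# quietly appends a backdoor. All names here are hypothetical.

def compile_source(source: str) -> str:
    """Pretend to compile source, returning the program that will actually run."""
    compiled = source
    if "def check_password" in source:
        # Injected by the compiler; auditing the source reveals nothing.
        compiled += '\nMASTER_PASSWORD = "letmein"  # backdoor, not in the source\n'
    return compiled

login_source = """
def check_password(user, password):
    return password == USERS.get(user)
"""

print(compile_source(login_source))
```

Thompson's deeper point is that the same trick can hide in the compiler that compiles the compiler, so reviewing source code alone can never prove the absence of a backdoor, which is Gosler's warning scaled up to global supply chains.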

[00:10:00] Guy Podjarny: Yeah, absolutely. I love that you mentioned Reflections on Trusting Trust. I had a talk at QCon a few years ago titled ‘Developers: A Malware Distribution Vehicle.’ Not as good as your title, but I was fond of it at the time. Think about XcodeGhost and how that point holds up today. The world is full of malicious component examples.

Indeed, there's a lot of good reason to bring you on here and share the stories with the audience. Specifically, I think an area that touches home for some people listening here is the whole world of supply chain security. Indeed, software produced by one entity may be consumed by another. We'll talk a little bit about the intricacies and what we can do about this. Maybe let's start with some stories. You've come across quite a few of those. Can you share a few examples, from the book or elsewhere, around supply chain security and how it plays into the geopolitical cyber warfare world?

[00:10:50] Nicole Perlroth: I think the biggest supply chain security moment in recent times was SolarWinds, this attack that I'm sure your audience is familiar with. Just in case they've been hiding in a cave somewhere for the last couple of years: you have this attack conducted by hackers from Russia's SVR intelligence agency, who essentially hijacked the software update from SolarWinds, a Texas company that marketed itself as a security company, selling visibility software, software that essentially lets you see what's on your network. They hijacked the software update, so it hit everyone who did what we're all told we're supposed to do, keep our software up to date, patched, etc.

Now, when they downloaded the latest SolarWinds software update, they were instead getting a Russian Trojan horse. We now know that Russia used that access to breach some of our most critical government agencies, like the Treasury Department, the Department of Justice, the Department of Homeland Security, the very agency charged with keeping us safe in this realm, and the Department of Energy. Even some of the nuclear labs used SolarWinds.
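
One basic defence at the consuming end is to verify an update against a digest pinned out-of-band before installing it. Here is a minimal sketch, with a hypothetical URL and digest. Note that this only guards against tampering in transit; it would not have caught SolarWinds, where the vendor's own build system was compromised and the malicious update was validly signed.

```python
# Minimal sketch: verify a downloaded update against a pinned SHA-256 digest
# before handing it to the installer. URL and digest are hypothetical.
import hashlib
import urllib.request

UPDATE_URL = "https://updates.example.com/agent-2.3.1.bin"  # hypothetical
PINNED_SHA256 = "replace-with-the-digest-published-out-of-band"

def fetch_and_verify(url: str, expected_hex: str) -> bytes:
    with urllib.request.urlopen(url) as resp:
        payload = resp.read()
    digest = hashlib.sha256(payload).hexdigest()
    if digest != expected_hex:
        raise RuntimeError(f"update rejected: digest mismatch ({digest})")
    return payload  # only verified bytes reach the installer

update = fetch_and_verify(UPDATE_URL, PINNED_SHA256)  # raises on mismatch
```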

What struck me at the time was that once this was disclosed. And by the way, it wasn't disclosed by the NSA, or the top hackers in government, who for years had been saying, we have a policy of defend forward, active defence. So long as we're walking into foreign networks, into the SVR's systems, into Russia's systems, we'll have this early alert system.

[00:12:31] Guy Podjarny: Sounds like some overconfidence over there, if that's what they said then.

[00:12:34] Nicole Perlroth: Right, not only overconfident. This attack really exposed our blind spot, because they staged it on servers inside the United States, where the NSA can't look. It was only discovered because Mandiant, FireEye at the time, found that they had been breached and, out of the goodness of their own hearts, disclosed that breach. It didn't touch PII, Personally Identifiable Information, which would have triggered state data breach notification laws. They raised the red flag anyway, and thank God that they did.

Then, in rewinding the tape, they discovered, okay, this started with the update that we installed from SolarWinds, and they were able to flag that for SolarWinds and for everyone else. Otherwise, we might not have ever known about this breach. They were inside our systems, I believe, for as long as nine months before anyone knew about it. The final thing I'll say about SolarWinds is that what was stunning to me as a reporter was calling up all of these companies and government agencies listed on SolarWinds' website as customers and trying to confirm that they had been impacted, or were at least in the process of trying to forensically understand whether they had been impacted. Half of them didn't even know that they had been using SolarWinds.

Let alone that SolarWinds had been really under-investing in its own security for a long time; that the company was now owned by a private equity firm that had been cost-cutting and was essentially run by an accountant; that much of the code was built in Poland, which is fine, but also in Belarus, not exactly a friendly. It just shines this huge light on our blind spots: in the supply chain, in the code itself, in the NSA's ability to actually look for this stuff, and in the overconfidence that our offensive strategy was all we needed to defend ourselves.

I think since then, we've only had more and more wake-up calls. The Log4Shell issue last December, where we learned that this critical open-source component, a logging library, was basically a giant vulnerability. There too, we only learned about it because an employee at Alibaba defied new national security laws in China to flag it for everyone. Alibaba actually paid dearly; as a result, I believe the Chinese Communist Party ended up suspending some of their government contracts. Because a couple of years earlier, they had put in place this new law saying, if you find a critical vulnerability, a zero day, you must give the state first notice. You can't go out to the broader market, which is what this Chinese engineer had done.

Thank God he did. Because as a result, we all learned just how vulnerable these systems are, and we're still learning, right? Jen Easterly has called it the most critical vulnerability she has seen in her career. Over and over again, this just keeps happening. We can't ignore it any longer.
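
For a sense of what that remediation looked like in practice, here is a rough sketch of the triage many teams ran in December 2021: walking the filesystem for .jar archives that bundle JndiLookup.class, the class at the heart of Log4Shell (CVE-2021-44228). The scan path is hypothetical, and real tooling also checked versions and jars nested inside jars.

```python
# Rough sketch of Log4Shell triage: flag any .jar that bundles
# JndiLookup.class (CVE-2021-44228). Nested jars and version checks omitted.
import pathlib
import zipfile

def find_suspect_jars(root: str):
    for jar in pathlib.Path(root).rglob("*.jar"):
        try:
            with zipfile.ZipFile(jar) as zf:
                if any(n.endswith("JndiLookup.class") for n in zf.namelist()):
                    yield jar
        except zipfile.BadZipFile:
            continue  # skip unreadable archives

for hit in find_suspect_jars("/opt/apps"):  # hypothetical scan root
    print(f"review: {hit}")
```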

[00:15:49] Guy Podjarny: Both are indeed key stories, I guess the poster children now for supply chain security concerns. Maybe a question more from the media lens: Log4Shell was interesting, because for a moment it was top news on mainstream media sites. I say a moment, because I don't know if it was even a full day that it was top news. It felt like maybe there was an appreciation for the seriousness of the moment.

Then pretty instantly, I think definitely by the next day, it was nowhere to be found. It was maybe buried somewhere in some of the tech pages of the paper. How do you think about that? Do you think there is popular appetite to hear about these things, to hear about these concerns? What needs to happen for something to stay in the public's awareness for more than a few moments?

[00:16:44] Nicole Perlroth: Yeah. I mean, you could extrapolate that generally to so many different things. There are so many news items, mass shootings like Uvalde, Texas, where I mourn the fact that they're not front-page news still, even these many weeks or months later. How are we still not talking about this? How have we all just seemingly moved on? It's incredibly frustrating, just as a citizen. With security, cybersecurity, it's even worse, because it is a technical topic. It's an intimidating topic for a lot of people. I will say that the hardest part of my job at the New York Times was Twitter. Being a female journalist on Twitter working for The New York Times, you're going to have a target on your back. It got really ugly very quickly.

The second hardest part of my job at the New York Times was the badgering I would have to do to try to get some of these major cybersecurity stories, in my opinion, on the front page. Convincing the masthead, “Listen, I know that this is technical. We have to put this on the front page and we have to follow up tomorrow. We probably have to do a longer deep dive. It probably won't be ready for a week or two, when everyone has moved on. But this is so important. We need to make sure that this sticks.” Sometimes I was successful. Sometimes I wasn't. I do think it is too bad that we had incidents like Y2K, where there was just this –

[00:18:30] Guy Podjarny: A lot of noise. Yeah.

[00:18:32] Nicole Perlroth: A lot of noise. Then nothing really seemed to happen. Now, we learn later that actually, some serious things did happen in the periphery, but nothing that really made its way into the –

[00:18:42] Guy Podjarny: Yeah. Not of the magnitude of what it was made up to be. Yeah.

[00:18:46] Nicole Perlroth: Exactly. That's a tough balance. That was probably the third hardest challenge in my job: how do you walk this tightrope between waving your hands, “Hey, this is really important. Look over here,” and, “Okay, let's not create panic.” That's actually why I ended up doing the book. I would parachute into these situations, whether it was Log4Shell – I was just out of the New York Times when that happened, so I didn't cover it – Heartbleed, SolarWinds, APT attacks, what the US was doing. So much was happening at once, almost simultaneously, that you're jumping around, and it's hard to hold people's attention.

I realised, wow, this really needs someone to step back and hold the reader by the hand, the non-technical readers too, and walk them through the space almost in chronological order and make it entertaining. Technical accuracy is critical, but hold their attention long enough to say, “Hey, these things are all connected. There is a through line here. It's pretty ugly where it's going. It's time for all of us to pay attention and connect the dots here.” In the end, I feel my book had far greater impact in holding attention spans than any one article I did at the New York Times.

[00:20:17] Guy Podjarny: Yeah. Well, I guess people come in and you get the chance to build up a bit of their knowledge amidst the story, versus the brief attention span of a news article. I will say that what you're describing here sounds awfully similar to developer security training programs that have been successful. We had teams from Segment, now Twilio, for instance, talking about how important it is to get real-world vulnerabilities that happened in their systems into the training that they give to developers. Because otherwise, people generally have a hard time engaging. They have a hard time focusing on, remembering and retaining information.

Within the software world, I think it's at this point fairly accepted that you need a culture of security, having people understand that security matters and care about security. There's often pushback; security people say, “Well, developers don't care about security.” You have to dispel that and get developers engaged. Is that also true from a societal perspective? I mean, can we address the gravity of this concern without getting public appreciation for it? Or is that a necessity to get these stories to stick around, or arrive on the front page a bit more often?

[00:21:32] Nicole Perlroth: Just stepping back a bit. I think we've been on this collision course, certainly while I was at the New York Times and probably before, when I was at Forbes just covering the startup scene in general. We've been on this collision course between what Mark Zuckerberg articulated as, “Move fast and break things,” and what Marc Andreessen articulated as, “Software is eating the world.” Move fast, it doesn't have to be perfect, speed is critical. That is the enemy of security. Speed has always been the enemy of security.

What do we do to remind the developer to essentially slow down and fix their shit? Which, I'll point out, is on the walls at Facebook now. There used to be graffiti that said, “Move fast and break things.” The last time I visited, it said, “Slow down and fix your shit.” Certainly, the awareness has trickled down at Facebook.

[00:22:27] Guy Podjarny: What's called written in blood, right? Like, based on historical learnings.

[00:22:33] Nicole Perlroth: Yes. There's a point where I thought about making my gravestone say, “Slow down and fix your shit.” No, I think the best example of how to hammer this home is what Google now does. When you are a new hire at Google – I believe they still do this, I hope they still do this – Heather Adkins, who heads up their information security, walks you through the Aurora attack, the 2009 attack by a Chinese APT on Google. It doesn't matter who you are, or where you fit into the organisation. Whether you're a developer, or you work in a completely non-technical sales office somewhere. Doesn't matter.

She'll tell you the story about how someone in their Beijing office at the time was convinced to click on a link. I believe it was on AOL Instant Messenger, or MSN Messenger in those days, whatever the Slack predecessor was, and how that translated to dissidents' Gmail accounts getting hacked. I think what's so powerful about that story is that, hey, wake up. It starts with you. You click on one spear-phishing link – it wouldn't happen the same way anymore, because Google, I think, epitomises zero trust now as an organisation – but you click on this one spear-phishing link, and you could get someone killed. It's not that far of a stretch.

That kind of storytelling, to everyone in the organisation, is our only way out. When I interviewed Bob Lord as he was leaving the DNC, the Democratic National Committee, after the 2020 election, I said, “What did you do to keep the DNC from getting a repeat of what happened in 2016?” He said, “It was really hard. I did not have the budget I had for security at Yahoo, or before that at Twitter. There are operatives everywhere. This is a really hard-to-defend organisation that is an extremely high-value target. You can't deny it's a target.” We all saw what happened with John Podesta's risotto recipe. Something as seemingly innocuous as risotto became weaponised in an election.

I said, “How'd you do it?” He said, “I wish I could tell you there was a magic algorithm, but what it was, was the organisation made it a priority. They told me that I had 5, 10 minutes at every single all-hands meeting to use however I wished, but basically, to hammer home the criticality of information security and cyber hygiene.” He put a checklist on the back of the women's bathroom door and above the men's urinal that had a big Bobmoji on it, and it said, “Have you turned on MFA? Are you using Signal for your most sensitive communications? Have you done the latest software update? Are you using hardware MFA keys?” Etc., etc.

That's what he credited: basically, putting a Bobmoji above the urinal saying, “Don't forget, you too.” It's almost the Smokey the Bear equivalent for forest fires. It's just someone reminding you, don't throw your cigarette out the window. You play a very critical role in preventing the next forest fire. As weird and soft as that sounds, we absolutely need to address the human-behaviour element of this problem. Why? Because how did Colonial Pipeline get breached? It got breached because one employee, whose account they never deactivated, had a crappy password. How did Equifax get hacked? They got hacked through, I believe, a vulnerability in, what was it? Apache?

[00:26:38] Guy Podjarny: Apache Struts, like an open-source Java library.

[00:26:41] Nicole Perlroth: Right. A mistake in the code. Is that too simplistic?

[00:26:45] Guy Podjarny: No, it's okay. It's a bug. Security bug. Yeah.

[00:26:48] Nicole Perlroth: Right. It's user error. We think of this as a technical problem, but at the end of the day, it's really user error. The only way to address user error is to convince the user to care about this. The only way I know how to convince people to care about it is either to hack them myself – everyone always seems to get religion after they get hacked; you only turn on MFA after your Instagram account gets taken over, or whatever it is – or to tell them the story, the really bad story, about Colonial Pipeline, or Aurora, or your friend who got hacked and what happened in the aftermath. That's the only way I know how.

[00:27:29] Guy Podjarny: Yeah. I mean, it's interesting. It's the constant debate in security. On one hand, maybe to your example about Y2K, if you cry wolf on a regular basis and most of the time nothing happens, people say, “Well, they keep telling me I need to do this, but look, it's been five years and nothing happened. Probably nothing will happen in the next five years.” That's a constant concern, a sort of alert fatigue. I guess the balance is always between constant mindfulness and awareness, and desensitisation, people saying, “Well, you keep telling me about it, but you're just spreading fear, because it serves your purpose or your agenda, and it's not real.”

I guess, the more concrete and the closer to home it is, the more people will get religion. As you point out, everybody gets religion after they're hacked. Hopefully, we'll find slightly better approaches over time.

I'd love to talk a little bit about observations you have coming into the industry from the outside, because you're maybe non-technical and looking at it fresh; sometimes we get blinded to it a bit from the inside. Before I do that, just before we leave the stories side, I'd be remiss if I didn't ask you about Ukraine. I know you're not at the New York Times at the moment, but from what you see, have you seen cyber warfare, specifically supply chain, play a role? There seems to have been a sense of almost, I don't know if I want to say, disappointment. There was maybe an expectation that there would be more visible blowups in this space from Russia's forces. Any thoughts on what's happening over there, or what you've heard?

[00:28:57] Nicole Perlroth: In some ways, my book was prescient, in that it starts in Ukraine after the 2017 NotPetya attack. When the invasion started, I was getting a lot of great questions – very positive, affirming, validating questions like, “How did you know to start this in Ukraine? Did you know that this would happen?” Now the question is, “Do you think you were wrong, that Ukraine created all this blowback for cyber?”

I think it's just more nuanced, as most things are. I think we are seeing maybe the limits of cyber war tested in real time. That's one thought I have. And the horrors of what is happening on the ground in Mariupol; to me, it's harrowing to read about some of the atrocities, human atrocities, that happened. Rapes, forced migration.

[00:29:58] Guy Podjarny: Yeah. Horrible stories.

[00:29:59] Nicole Perlroth: Horrible stories. Those, obviously, are going to consume your bandwidth for attention in ways that code and cyber will never be able to compete with, unless you see a big explosion, or maybe a Colonial Pipeline. When you look at the facts on the ground, we now know that Russia did, in fact, breach a couple of Ukrainian substations, power stations, in the weeks around the invasion. Now, they didn't turn off the power, but they could have. They could have turned off the power to millions of Ukrainians, but they didn't.

I suspect, and you'd have to be a fly on Putin's wall to know why they didn't shut the power down, that if the reporting is accurate, Russian intelligence believed Russia would have its puppet government installed in Kyiv within 48, 72 hours. They wouldn't need to shut the power off. They'd only be sabotaging themselves. They got in. They got the foothold. Then they waited until things weren't necessarily going their way to time the actual detonation. We know they didn't time the impact until April, when it was clear things weren't going Russia's way.

Thank God, thanks to really what I would call unprecedented collaboration between allies, government agencies, including CISA (Cybersecurity and Infrastructure Security Agency), the agency I now advise here in the United States, and the private sector, private security companies on the ground, they discovered and were able to remediate this attack before Russia could turn the lights off. Similarly, Microsoft raised the flag, when they discovered wiper malware on critical Ukrainian banks and government agency systems and were able to help remediate before we saw the worst-case scenario.

Similarly, we know, we believe – I don't know if the attribution has been fully nailed down – that Russia hacked Viasat, the satellite Internet broadband provider, serving not just Ukraine, but Europe, and basically interrupted people's modem connections. That could have really swung the information war Russia's way, if we hadn't been able to see this incredible footage, not just from news organisations, but from grandmothers' phones on their rooftops, of these Russian atrocities on the ground.

Lo and behold, in came Starlink and Elon Musk. We all probably have some strong feelings about Elon. In this case, it was a really, I think, underappreciated Elon moment. He was contacted by a Ukrainian CEO here in Silicon Valley who went to college with Zelensky and was asked if he could do anything, because Zelensky said, “Hey, we have five Internet links in and out of the country. We believe Russia is going to bomb all of them.” Which, I believe, they eventually did. “Can you help?” I think the quote I heard Elon say was something like, “Oh, you want me to help take out an authoritarian dictator? That's on my bucket list.”

[00:33:26] Guy Podjarny: Sounds fitting of the stereotype. I don't know him personally, but sounds in character.

[00:33:31] Nicole Perlroth: They were able to mitigate the Internet going out, the power going out, all of their data being destroyed. These things were attempted.

[00:33:40] Guy Podjarny: Right. They didn't make the headlines, because sadly, there were far worse things happening from a human-interest perspective, things people were looking to find out about.

[00:33:52] Nicole Perlroth: Yeah. I mean, I think the last thing I'll say about Ukraine is, we're not out of the woods yet. The more we tighten the screws on Russia, with sanctions, with the ban on Russian gas, vodka, diamonds, the more we explore this theory of, do we ban visas, etc., the more likely it is that Putin will respond directly to the West. So far, it's all been a lot of fear mongering and threats. I think the most likely way to respond directly to the West is via cyber, because until now, it really has been treated as a short-of-war weapon. We haven't seen anyone respond in kind to some of these egregious attacks, like Colonial Pipeline, or even the Ukrainian power outages that Russia caused in 2015, 2016.

I think something will come of this. I do have to say, I've been surprised and actually rather hopeful, for the first time, that we might be able to hack our way out of this, with the kind of declassification that the US government and our allies have been doing, in tandem with the real-time information sharing that has been happening. I mean, Slack channels have been lighting up for a year now, sharing what people are picking up on their networks. Basically, we're doing all the things that we've talked about being critical and necessary, but never actually made progress on. We are not letting a good crisis go to waste. I hope that Shields Up, as CISA calls it, just becomes the new normal.

[00:35:37] Guy Podjarny: Yeah. I agree with you that it's a demonstration of what could happen when you collaborate, not dissimilar to the coming together of the scientific community during COVID. When there is some unity, along with calibre, along with capability, you can actually fend off some pretty serious enemies. For now, you know the job is not done.

Thanks. This is super insightful. You've said it well: don't let a good crisis go to waste, in the sense of understanding what happens if this happens again, and how do we continue? To an extent, it's an acceleration of certain trends and drives for transparency and security that you actually mentioned in the book. They were long-term trends, and this might have fuelled that fire.

[00:36:22] Nicole Perlroth: Yeah. It's almost the antidote to what happened during COVID. I was racing to finish my book during COVID. We were seeing a huge uptick in the frequency of attacks. It was clear that adversaries, threat actors, cyber criminals were all exploiting remote work, the rush to remote work. It was clear that because this was a global pandemic, we were leaving it to each nation to figure out how to respond. There was a lot of APT (Advanced Persistent Threat) activity, some from countries that had never really crossed our desks, hacking each other just to figure out what everyone else was up to, or how far along they were with vaccine development, or what China knew and when they knew it.

What I said in my book was something like, it's been a while now: this is either going to be a learning lesson, and we're going to adjust our defences accordingly, or this is just a very, very small glimpse of the nightmare that will come. And I think Ukraine, interestingly, is the more hopeful prism. Here's a glimpse at what cyber defence and collaboration and threat sharing and declassification and partnering with our allies and having a unified response could look like in cyber. Here's the impact it could have. We can't just let this trickle out. We have to figure out how to keep it up and keep improving, keep iterating and sharing.

[00:37:55] Guy Podjarny: Yeah. You actually mentioned to me in a conversation before this that you're seeing a bit of a dip in ransomware payouts, potentially affiliated with the Ukraine war?

[00:38:07] Nicole Perlroth: Yeah. It's really interesting. I think this data is still being collected. Nozomi, which specialises in industrial control security and where I'm now an advisor, shared with me that one of the things they're seeing is, instead of ransomware, just wiper malware. That's one data point. The other is that Recorded Future and others who really are tracking ransomware have said there has been this mysterious dip in ransomware attacks on the United States.

I know, also, that TRM Labs, which tracks blockchain intelligence and ransomware payments, is seeing a big reset among some of the major Eastern European, let's say, ransomware groups; they're rejiggering. One theory here is that for a long time, we know, ransomware victims were just paying. The ransomware groups knew just what to charge. It was always going to be cheaper than the cost of total remediation and the threat that your data was going to pop up online. Victims were paying for a long time.

After the Russian invasion, general counsels had to say, “Hey, I don't know if we can pay this. I don't know where this ransomware group is based. But if there is even a minuscule chance that they are based in Russia, we would be in violation of sanctions by paying them right now. We can't pay.” We started seeing some resistance pop up that I don't think ransomware groups expected, or anticipated, for a long time –

[00:39:49] Guy Podjarny: It took a bit more of a roundabout path to get there.

[00:39:52] Nicole Perlroth: Right. We also started seeing a pickup in ransomware against other countries, in Latin America, for example. There is an interesting question to be asked, which is, huh, maybe this idea that we should ban ransomware payments, just have a blanket policy, has some legs as a deterrent. Then, like I said, Nozomi said, in some cases there is no ransom. They're just wiping. Maybe these groups that would have charged you a ransom are just giving you the middle finger.

[00:40:24] Guy Podjarny: Just looking to attack and cause damage. Super, super interesting. We veered a little bit there, although it's still all about the impact of vulnerabilities and attacks. Let's maybe spend the last stretch talking about what we can do about them. We talked about this a little bit; we talked about awareness, and you mentioned the value of training and raising awareness inside the organisation. Thinking on two fronts, on one side from an industry or society perspective, and on the other side, maybe more from a DevSecOps, development side: what do you think, or what suggestions have you heard, on what we should be doing to change a bunch of these equations? Supply chain security is massive, maybe at the scale of fake news in terms of its complexity. What can we do about this?

[00:41:14] Nicole Perlroth: The first thing I'll say is we need our Steve Jobs of cybersecurity. We need someone to come in who makes it a lot easier to be secure than it currently is. I'm out there talking all day long about how it's inexcusable if you don't have MFA turned on on your email, on your financial accounts, on your social media accounts. Then I have friends that I went to Princeton with who are like, “I've been trying for three days now to turn on MFA on my Venmo account, and I'm sorry, I can't figure it out. I give up.” That's ridiculous.

If it's hard for my Ivy League-educated friend, who peripherally works in cybersecurity, to turn on MFA on her Venmo account, then we are failing everyone as an industry. That's just ridiculous. I think we need a Steve Jobs of cybersecurity to make security as intuitive as the iPhone was when it was released. It should be just as simple to turn on MFA as it is to download an app on your iPhone. Right now, it's a nightmare. The stakes only get higher the longer you wait.
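
For all the enrollment friction, the mechanism underneath those six-digit MFA codes is remarkably small. Here is a minimal sketch of TOTP (RFC 6238, built on the HOTP construction of RFC 4226) using only Python's standard library, with a hypothetical base32 secret of the kind encoded in enrollment QR codes:

```python
# Minimal TOTP sketch (RFC 6238): the codes most authenticator apps generate.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // interval)
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # hypothetical enrolled secret
```

The hard part, as Perlroth says, was never the cryptography; it's wrapping something this simple in a flow that a busy person can actually complete.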

That's number one: we need to attack this problem from a usability and design perspective. I'm still waiting for that. Then at the DevSecOps layer, we need to basically hand developers a grab bag of intuitive tools that give them no excuse not to use these tools as they're developing code. Just, “Oh, let me stop and just click on this button,” almost like spell check. Actually, I've never said that before. We need spell check for code. There's a huge market opportunity there.
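
In its smallest possible form, “spell check for code” looks something like the sketch below: a pre-commit style scan that flags a few risky patterns before they land. The patterns and the exit-code convention are illustrative assumptions; real tools (linters, SAST, secret scanners) go far deeper.

```python
# Toy "spell check for code": flag a few risky patterns in the given files.
# Illustrative only; real static analysis is far more sophisticated.
import pathlib
import re
import sys

RISKY = {
    r"\beval\(": "eval() on dynamic input is a code-injection risk",
    r"verify\s*=\s*False": "TLS certificate verification disabled",
    r"(?i)(api_key|password)\s*=\s*['\"][^'\"]+['\"]": "possible hardcoded secret",
}

def scan(path: pathlib.Path) -> int:
    findings = 0
    for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
        for pattern, message in RISKY.items():
            if re.search(pattern, line):
                print(f"{path}:{lineno}: {message}")
                findings += 1
    return findings

if __name__ == "__main__":
    total = sum(scan(pathlib.Path(p)) for p in sys.argv[1:])
    sys.exit(1 if total else 0)  # a non-zero exit blocks the commit hook
```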

The other thing we need is quantifiable risk metrics. We need to know how much risk I'm taking on when I hire you, or bring in your software, or acquire you. There's nothing. There's nothing like it. There's no FICO score for cybersecurity. Now, there are companies, like BitSight, who are looking at this problem. They are doing it from the outside. They're looking for anything they can see from the outside: scanning your network, seeing if you have some server out there on the web that doesn't have MFA turned on, or that a hacker could potentially access, that kind of thing. That is a really good start, and we need to standardise it.
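
One concrete flavour of such an outside-in signal, sketched under the assumption that you can only observe what the public internet exposes: how close a domain's TLS certificate is to expiry. It's a single data point, not a risk score; rating services aggregate hundreds of observations like this.

```python
# One outside-in posture signal: days until a host's TLS certificate expires.
import datetime
import socket
import ssl

def cert_days_remaining(host: str, port: int = 443) -> int:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = datetime.datetime.utcfromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]))
    return (expires - datetime.datetime.utcnow()).days

print(cert_days_remaining("example.com"))  # e.g. 90
```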

At some point, we need to figure out a new incentive model, hopefully not a regulation model, that incentivises boards to want to know how risky they are. To want to know how much risk they are bringing to their customers and business partners when they work with them, or when they finally merge systems. There's just nothing like it. We have these one-off, piecemeal things: “Oh, maybe in this contract we require you to get a pen test.” It's not enough.

Again, it's ridiculous. I mean, to do a basic bathroom remodel at my house this year, I had to wait several months and have an inspector come over. He wouldn't certify our remodel plans until we agreed to move our toilet 2 inches, because it created some risk for the plumbing. It's annoying, okay, and I hope we don't get to that level of regulation. But it's ridiculous that that's the level of effort I have to go to to put some tile on my bathroom floor and do a new shower, and there's no equivalent when Verizon acquires Yahoo, or a company agrees to work with SolarWinds and gives them blanket access, essentially, to their network. I mean, that is stupidity. It's just insanity. We need standards.

I really am pushing hard on this idea that, listen, as a capitalist society, we have to work with incentives. We have to figure out how to work with incentives, because unfortunately, when you do regulation in the United States, number one, we have a terrible track record at creating comprehensive, nuanced cybersecurity legislation –

[00:45:45] Guy Podjarny: It’s a complicated and fast-moving space. It's hard for legislation to keep up.

[00:45:51] Nicole Perlroth: Exactly. Why can't we create tax breaks for companies that agree to get a real pen test, not a compliance guy coming in with a checklist, but a real pen test, and then show, you don't need perfect security, but show that over time, between this quarter and two quarters ago, I was able to minimise my attack surface by 30%? Great, you should get a tax break, because your cybersecurity posture is improving. Especially if you're a water treatment plant, or a SolarWinds, or whatever you are. There are so many companies that really are critical infrastructure, United States critical infrastructure, global critical infrastructure, so we should be figuring out how to help secure them. We haven't figured that out on the regulation front, so I'm hoping we can figure it out as a form of incentive structure.

[00:46:49] Guy Podjarny: Yeah. First of all, just to get back a little bit to the squiggly line, because I'd be remiss not to, I'm a 100% with you on that. In fact, to an extent, that's what we're doing at Snyk. Literally, we have a little squiggly line behind the code. Still a long way to go, like –

[00:47:02] Nicole Perlroth: I love that. You're writing the code, and it's like, are you just checking as you go?

[00:47:06] Guy Podjarny: Technically, yeah, I think it's as you hit save. The autosave works as well. It'll show you if you've written a security mistake in a library, or in your code.

[00:47:15] Nicole Perlroth: I mean, I think some of the biggest advancements in cybersecurity are not the ones that the cybersecurity community generally thinks about. I think it's things like biometric facial recognition on your iPhone. I think it's things like, when you're typing in a new password, and it tells you it's weak, medium, or strong. It's these basic things that trigger that cortisol response, where it's like, “Oh, I don't want a crappy password. I want an A-plus.”
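
That weak/medium/strong feedback can be approximated in a few lines. A naive sketch based on estimated entropy, with arbitrary illustrative thresholds; production meters such as the zxcvbn library also penalise dictionary words and keyboard patterns:

```python
# Naive password strength meter: charset size ** length, expressed in bits.
import math
import string

def strength(password: str) -> str:
    pools = (string.ascii_lowercase, string.ascii_uppercase,
             string.digits, string.punctuation)
    charset = sum(len(pool) for pool in pools
                  if any(ch in pool for ch in password))
    bits = len(password) * math.log2(charset) if charset else 0.0
    return "weak" if bits < 40 else "medium" if bits < 70 else "strong"

print(strength("password"))     # weak
print(strength("Tr0ub4dor&3"))  # strong by this naive count; zxcvbn disagrees
```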

[00:47:46] Guy Podjarny: Yeah. We had Adrian Ludwig on the podcast a while back; he's the CISO at Atlassian. He was commenting, “I don't want enterprise security. I want consumer security. I want my developers to get consumer-grade security. Because with consumers, you don't expect people to have a degree in how to write secure code. You expect the security control to be easy enough that anybody, that your mom, can handle it.” I feel bad for moms; they've become the poster child of this. Maybe it's grandma. Maybe today, it's that your grandma can operate it.

[00:48:17] Nicole Perlroth: It’s actually grandmothers. Moms have enough on their plate. It's an interesting point, because, yeah, think of the busiest person you know. It's probably a working mother, or even just a stay-at-home mother or father. There's so much on their plate. When I see them at school drop-off, they're like, “Ah, cybersecurity. That scares my socks off.” It's like, “Well, you're the person that we should be helping.”

[00:48:44] Guy Podjarny: With the new executive order from Biden, a decent amount of that is about transparency, where you can argue whether it's useful or not. For instance, it requires a software bill of materials to be passed on from vendor to customer, alongside a whole bunch of other requirements. How do you see that playing into the mix here? Do you think it moves the needle? What's your perception?

[00:49:05] Nicole Perlroth: I am just so happy to have seen the words ‘software bill of materials’ in the executive order. I was jumping up and down. I mean, listen, I now serve on this CISA advisory committee. Some people will think this is a bias, or a political bias. I wouldn't have joined them if I didn't think this administration was serious about cybersecurity. That Biden cyber executive order is the most comprehensive piece of cybersecurity policy we have seen in this country. Frankly, I think it's a brilliant document. Putting software bills of materials aside for a second, I apologise, to me, the magic in it was this recognition that the government has limited purview, or authority, when it comes to our cyber hygiene. It recognises that that is a terrible place to be when 85% of our critical infrastructure is in private-sector hands, right? It's completely backwards.

What the document did was say, “Okay, we can't regulate this, right? We're going to be accused of being regulation-hungry Democrats if we try and go crazy on this.” What they did was say, “Okay, here are the NIST standards. We will let you self-certify that you meet these NIST standards. You don't even have to go to a third-party auditor. We're not going to make you do the thing I had to do when I did my bathroom remodel and moved the toilet 2 inches. You can self-certify that you meet these NIST standards. However, if we catch you lying to us, and you are a federal contractor, you are banned from ever doing business with the federal government again.” That is really the first stick I have ever seen in this space. It has had a huge impact.

Good on them for figuring out how to use, essentially, the power of the purse to say, “Hey, cut it out. You can't accuse us of adding regulation here, because we're letting you self-certify. But you had better meet these requirements if you want to do business with the biggest buyer out there, which is us.” Hats off to them for that. I've been pleased. I note this in the book: medical devices. There's been a lot of progress on medical devices, on SBOM, on software bills of materials. I think that is where we have to go. We should know what open-source code is baked into these systems that are so critical, like a pacemaker, and what software is in there. Then ideally, back to my previous point about quantifiable risk metrics, what's the risk score of that software? It's basically a nutrition chart for cybersecurity. I think it's brilliant, and it's amazing we don't have it already. It's going to be an ugly process to get there, but we absolutely need it.
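
To make the “nutrition chart” idea concrete, here is a toy SBOM loosely modelled on CycloneDX's JSON shape. The component values are hypothetical, and real SBOMs are generated by build tooling rather than written by hand:

```python
# Toy SBOM, loosely modelled on CycloneDX JSON. Values are hypothetical.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "components": [
        {
            "type": "library",
            "name": "log4j-core",
            "version": "2.14.1",
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1",
        },
    ],
}

print(json.dumps(sbom, indent=2))
```

With an inventory like this passed from vendor to customer, a question like “are we exposed to Log4Shell?” becomes a lookup instead of a week of guesswork.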

[00:52:04] Guy Podjarny: Yeah, I mean, I think I agree. If I recap a little bit, a lot of this does come back to the right incentive model and the right visibility. The tax credit is a more interesting concept; I hadn't heard of it before. It aligns with what we're talking about: it can't be all stick. There needs to be some carrot for doing the right thing. That carrot on one hand might be the tax incentive, but also the ability to sell. You can debate whether that's a stick or a carrot; it's a pretty big stick if you can't sell.

In general, be more successful by touting your security certifications and security investments to sell more. Also, raise and simplify that awareness everywhere and increase the transparency, in the hope that it mobilises us to act. I'm name-dropping a bunch of past guests here: Tim from Mandiant and Jeff from LinkedIn both commented on how they expect public markets to also introduce things along the lines of the generally accepted accounting principles that came post-Enron and the fraud over there, when it comes to security.

[00:53:09] Nicole Perlroth: I call that the digital Sarbanes-Oxley, and it's coming. It is coming. My hope is that all of the people involved in that legislation will have read my book, and have had at least a basic primer on the issues involved, before they send out that legislation, because it could go terribly wrong, or it could be exactly what we need.

I think we had almost too much hope, for instance, that cyber insurance would save us, and it didn't. It ended up being an enabler for ransomware, because in too many cases, insurance companies were saying, “Oh, just pay the ransom. That's going to be a lot cheaper than the other option.” Sometimes market solutions are not perfect, and neither is regulation. Yeah, like I said, among my target audience was Senator Klobuchar: someone who is interested in this issue, could potentially be a leader on this issue, and has the capacity for nuanced policy thinking. Let's hope for improvement.

[00:54:12] Guy Podjarny: Some improvement. There's definitely raised awareness. Then we continue to do our share and see where it leads us. Nicole, huge thanks for coming onto the show. I encourage everybody to find your book; you can find it on any one of the platforms you might expect. Really a good read for awareness and, for what it's worth, also an interesting, fun read, great storytelling. I recommend it. Thanks for coming on and sharing your insights over here.

[00:54:37] Nicole Perlroth: Thank you, Guy. I really have loved speaking with you. I'm always intimidated when I speak to someone with your technical calibre. But when you start having these discussions, you realise, there's just so much overlap and alignment in our thinking and we come at this from such different perspectives. This is one way out, too, having these conversations with people who are coming at the same problem from completely different areas. I really appreciate you doing this. Thanks for having me.

[00:55:10] Guy Podjarny: Oh, no, no. I think I mentioned this to you as well: I believe that you need to leave security to fix security. When you're in an echo chamber, when you look around, it's hard to see how things might evolve or change. It's great to get your fresh perspective. I think there's been much more alignment than misalignment with the views from inside. Thank you, and thanks everybody for tuning in. I hope you join us for the next one.

[END OF INTERVIEW]

[00:55:37] ANNOUNCER: Thanks for listening to The Secure Developer. That's all we have time for today. To find additional episodes and full transcriptions, visit thesecuredeveloper.com. If you'd like to be a guest on the show, or get involved in the community, find us on Twitter at @DevSecCon. Don't forget to leave us a review on iTunes if you enjoyed today's episode. Bye for now.

[END]