
Season 7, Episode 114

Ask Guy Anything!


We're switching it up in this episode and putting Guy Podjarny in the hot seat to answer all of your most pressing security questions! Prompted by questions from listeners and the community, Guy explains everything from how startups can build in security with limited resources to how security teams need to transform going forward. We discuss the balance of security and usability, the security implications of quantum computing, and the role developers are predicted to play in DevSecOps. We also speculate on how NoOps might affect DevOps and the potential of achieving zero trust for application security. For all of this and so much more, tune in for an in-depth AMA with Guy as he answers all of your unanswered DevSecOps-related questions!


[EPISODE]

[00:00:37] ANNOUNCER: Hi, you're listening to The Secure Developer. It's part of the DevSecCon community, a platform for developers, operators and security people to share their views and practices on DevSecOps, Dev and Sec collaboration, cloud security, and more. Check out devseccon.com to join the community and find other great resources.

This podcast is sponsored by Snyk. Snyk's developer security platform helps developers build secure applications without slowing down, fixing vulnerabilities in code, open source, containers, and infrastructure as code. To learn more, visit snyk.io/tsd. That's snyk.io/tsd.

[INTERVIEW]

[00:01:27] Simon Maple: A big hello to The Secure Developer listeners out there. This is going to be another special session. Normally, Guy, you and I are on these podcasts together for a year-end review of what's been happening over the previous year. This is a slightly different one, where we're going to be running an AMA with Guy. Quite frankly, Guy has asked too many questions on this podcast. We're getting our own back and asking Guy a number of questions as well. So welcome, Guy. How are you?

[00:01:54] Guy Podjarny: I am good. I'm looking forward to sort of seeing if I can match the guests here.

[00:01:58] Simon Maple: Absolutely. Well, we're going to ask you some strange questions to start off with, just as an icebreaker, and these are going to be quick-fire questions. I don't want you to think about it too much, maybe a couple of seconds, and as short an answer as you can muster. So let's get started. Question one: favorite color?

[00:02:13] Guy Podjarny: Blue.

[00:02:14] Simon Maple: Blue. Question two –

[00:02:15] Guy Podjarny: Should have been purple given the –

[00:02:18] Simon Maple: That leads on. Next question, Snyk or Sneak?

[00:02:22] Guy Podjarny: Snyk.

[00:02:23] Simon Maple: Question three, favorite fruit?

[00:02:26] Guy Podjarny: Probably passion fruit.

[00:02:28] Simon Maple: Passion fruit. Question four, dev or sec?

[00:02:31] Guy Podjarny: Dev.

[00:02:33] Simon Maple: Question five, would you rather be – this is a tough one. Would you rather be a horse-sized duck or a duck-sized horse?

[00:02:41] Guy Podjarny: Horse sized duck, definitely.

[00:02:42] Simon Maple: Horse-sized duck, interesting. I don't know if a horse-sized duck would even functionally work, but good answer. Question six, favorite meal?

[00:02:52] Guy Podjarny: Favorite meal, burger. A good burger.

[00:02:53] Simon Maple: Burger. Nice. Interesting. Question seven, tabs or spaces?

[00:02:57] Guy Podjarny: I refuse to answer. I might incriminate myself. I'm actually not quite as religious as it sounds.

[00:03:05] Simon Maple: Okay, well, let's not offend anyone with that. But let's offend people with this one instead. Question eight, Star Wars or Star Trek?

[00:03:13] Guy Podjarny: Star Wars.

[00:03:15] Simon Maple: Star Wars, that’s the answer to that.

[00:03:16] Guy Podjarny: Star Trek is nice. I like it as well. But yeah, Star Wars.

[00:03:20] Simon Maple: No, you can't have both, Guy. Come on. Star Wars is your answer, let's stick with that. So let's get into some more serious questions then. That was a nice icebreaker. We may do this over a couple of sessions, depending on the number of questions and how many we can get through in a reasonable podcast duration. But let's start straight away with a question from Leron Talet: "You're one of my favorite developer advocates in the industry." His question is really around what you would do as a CISO starting at a company. And actually, first question, have you ever been a CISO before? Is that ever something that might interest you?

[00:03:59] Guy Podjarny: I've never been a CISO. I like sort of thinking through the problem. I leave that to the experts to apply them.

[00:04:07] Simon Maple: So well, let's assume you were a CISO starting at a company that didn't have any good security posture in place. Can you imagine that, Guy, or is that kind of beyond imagination? What would be the first three things on your to-do list?

[00:04:22] Guy Podjarny: First of all, it's easier to imagine that than being a duck-sized horse, or vice versa. Overall, the questions are getting easier. So the first thing that I would do as a CISO, if you don't have anything in place, the first thing you need is visibility. You can't fix what you can't see, nor can you optimize it. You have to work on visibility. I think this is a time to get the basics right. So optimize for speed, for getting broad visibility, versus necessarily comprehensiveness. In the world of open source, for instance, it's about knowing which open source components you are using, and then which ones are vulnerable, and the same in code and elsewhere, whether it's IT or other. Just know what you have, know the lay of the land as quickly as you can. I think visibility definitely has to be first.
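
To make that first "know what you have" step concrete, here is a minimal sketch, assuming a Node.js project with an npm v2/v3 package-lock.json (which keeps a flat "packages" map keyed by install path). It only builds the inventory; wiring the list into an advisory database or scanner is left out.

```typescript
// Minimal dependency inventory: read an npm lockfile and list every
// resolved package with its version (package-lock.json v2/v3 format).
import { readFileSync } from "node:fs";

interface LockfilePackage {
  version?: string;
  dev?: boolean;
}

const lock = JSON.parse(readFileSync("package-lock.json", "utf8")) as {
  packages?: Record<string, LockfilePackage>;
};

const inventory = Object.entries(lock.packages ?? {})
  // The empty key "" is the root project itself; skip it.
  .filter(([path]) => path !== "")
  .map(([path, meta]) => ({
    // "node_modules/@scope/name" -> "@scope/name"
    name: path.replace(/^.*node_modules\//, ""),
    version: meta.version ?? "unknown",
    dev: meta.dev ?? false,
  }));

console.log(`${inventory.length} packages installed`);
for (const pkg of inventory) {
  console.log(`${pkg.name}@${pkg.version}${pkg.dev ? " (dev)" : ""}`);
}
// Next step (not shown): check each name@version against a vulnerability
// database to learn which of these components are known to be vulnerable.
```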

Once you have visibility, the second thing I would do is probably actually two parallel paths. On one hand, you've just gotten visibility, so you have to get going on fixing the top issues, the most critical issues. It's very likely that you will have encountered a few special unpleasantries, and you have to get going on fixing them. But that fixing does take time, even for the critical issues. So I'd say there are two things you need to do in parallel: fix the top issues, and stop the bleeding. Stop getting worse.

So introduce a few things that just raise visibility to new problems that are occurring, maybe new vulnerabilities introduced, new mistakes, and try to instill some process, again 80/20, to stop the bleeding and not get any worse. I think that also builds a certain set of muscles that improve over time. So maybe a bit of a two-in-one. And then number three, or maybe four, is to start moving the culture forward. That's a long, long journey. On one hand, it doesn't have urgency like the others; on the other hand, if you don't do it, then you're always going to be chasing yourself. So you have to build some security mindset in here.

I'd also say there are probably two views to it. There's the more traditional breadth approach, where you want to get everybody to some base level. I see a lot of people do this: pick a problem, try to get everybody to start addressing it, start talking about that problem, celebrate some wins, get the right education. And I think that's good, getting that across the board. But when you're talking about AppSec specifically, about engaging with developers, then it's important to understand that developer tools don't get adopted top down; they get adopted through top teams that become role models.

So you have a team, they pick up some dev tool, they love it, they master it, and the team next door says, "Hey, I want to use that as well. I'm seeing the greatness." Same for security. So alongside the breadth work, you should probably pick a team that's a role model in the company, one people look up to, that has more appetite, and maybe more aptitude, to embrace security, and help them get to mastery. The rest of the organization, you're trying to get to base camp; this team, you're trying to get to the summit. Then have them become the North Star as you roll out further in the company. So it's not an easy job, but that's probably the order in which I would approach things.

[00:07:24] Simon Maple: Yeah. That's great advice. And I think particularly with those teams that really do push the boundaries and always want to change for the better, those are the teams that are very good at documenting what they do and why they're doing it, and others can learn a lot from that and follow it, almost like a paved path. Really great advice. I think a lot of people will resonate with that very, very well.

Another question in, from Gerald from our Discord community. As a startup, how would you build in security, knowing that you have very limited resources, and that what you're building at the moment could go to waste at any time, because achieving product or market fit might require you to alter and change your solution as you go?

[00:08:12] Guy Podjarny: Yeah. I mean, I think the easiest way to approach this is to think about security as an aspect of quality. When you're a startup, do you care about uptime, that your system doesn't go down? Do you care about quality, that the system doesn't have bugs? I mean, you care about those. As a startup, if you're right at the very beginning, trying to find product-market fit, it's probably not your top priority. So you need to do a certain amount over here, but you probably don't need to master it. That's not the most important thing. The most important thing is figuring out the product capabilities, the fit, how to message them, how to get them to customers.

So there is an order. But just like uptime is important all the time, just like quality and avoidance of bugs are important, because that sets a precedent, security has to be something you do right away. As for how you do it: at the beginning, I wouldn't go straight to programs. You don't need some methodology. It's just about asking the question. So a very, very simple thing is, if you start embracing some form of template for your features or for your sprint plans, just add a security section there that asks, "What are we doing about security here?" And just communicate that. That starts getting you in the habit of thinking about it.
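
One lightweight way to keep that habit honest, purely as an illustrative sketch: a tiny check that fails if a feature brief or RFC document has no security section. The file path and the heading convention here are assumptions, not any particular team's standard.

```typescript
// Toy lint: fail if a feature/RFC markdown document is missing a
// "Security" section heading. Intended to run locally or in CI.
import { readFileSync } from "node:fs";

const docPath = process.argv[2] ?? "docs/feature-brief.md";
const doc = readFileSync(docPath, "utf8");

// Accept "## Security", "### Security considerations", etc.
const hasSecuritySection = /^#{2,3}\s+Security/im.test(doc);

if (!hasSecuritySection) {
  console.error(`${docPath}: no "Security" section found; add one before review.`);
  process.exit(1);
}
console.log(`${docPath}: security section present.`);
```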

It also flags or identifies any gaps of knowledge that you might have, because within the team, you may have more or less security knowledge. And again, just like operational skills and things like that, you want to have that in the mix. The next milestone probably comes with customers, and today that is coming earlier and earlier, especially if you're B2B. Increasingly, and I see this with customers all the time now, the demand for SOC 2 compliance or some other security certification starts very early. That's a bit of an industry-specific need, clearly a little less demanded in B2C. But in B2C, you might be asked to demonstrate privacy or other things.

I think that's probably the next bit, and I do see it coming earlier and earlier. At that point, it goes hand in hand with understanding where security can help you actually succeed more with your product. Except for specific products that are very security sensitive, security is probably not a core component of your product-market fit. People expect the product to be secure, but they don't necessarily buy it because it's secure, once it's above a certain bar. Other products, again, sometimes deal with very sensitive things, and then that's not the case. But for all products, I think the notion of being trustworthy, demonstrating that you are trustworthy, that you will protect their data, that you will protect your customers, is increasingly important.

SOC 2 compliance is an element of that. It's a formal way of saying, "Look, you can trust me, at least to this degree." But in general, there are also security capabilities, around how much access you have, around encryption, around just making it a part of your original pitch; all of those are useful. That's probably a step past the initial "just get them in the door" stage. So I think that's probably the sequence. After that, you start getting into proper evolution of your security program as you scale.

[00:11:13] Simon Maple: Yeah. And security probably goes alongside a number of other capabilities, whether it's high availability, reliability, and a whole ton of other things, whereby, very often when you're at the very early stages of prototyping, it's kind of a nice-to-have, but it's not necessarily the thing you start with, over and above getting something actually working and out there.

[00:11:32] Guy Podjarny: Yeah, absolutely. And those elements are different in different cases. If you're building a healthcare system that's really about vital signs, if you have some service that gets signals from pacemakers, your uptime is really, really important, and you can't really cut corners, even with the initial offering. Whereas if you're another social media platform, or something that's very async, then maybe uptime isn't as critical. Similarly for privacy or security, it depends on the sensitivity of the data, and how much that is table stakes for your system. Some products need to excel in security from the very beginning, and hopefully, for those, the right founders of that product are aware and mindful of the importance of it and invest in it right away.

[00:12:21] Simon Maple: Excellent. So we're going to take a step into more of the security and DevSecOps hot takes that you've got. A question over LinkedIn from Ian Andrews, the CMO of Chainalysis. Ian is interested in your take on blockchain, smart contracts, and dev security. He says, "It seems like tooling is super immature relative to other software ecosystems; hacks like the one on Wormhole resulted in massive losses." So what's your hot take on this?

[00:12:49] Guy Podjarny: So crypto security, Web 3.0, blockchain security is super interesting. If you look at the stats, the last that I saw was that there is probably about $100 billion in DeFi, in decentralized-finance-enabled Web 3.0, and about $12 billion of that was stolen in the last year. So clearly, security is a concern, and a lot of it is through holes in smart contracts. Is security important for blockchain? Absolutely. And is there a concern around smart contracts having problems? For sure.

It's a little bit trickier when you think about it as a market for tools. There are probably about 300, give or take, big smart contracts out there, and they command the vast majority of transactions, of the money handling in the system, and they're still fairly specialized. There's still a relatively small, all told, number of authors of smart contracts, and specific organizations. So, if you're building a company that builds tools to help smart contract developers secure what they build, it's a fairly small market. Also, these contracts are quite small and specialized, so they oftentimes justify some pretty heavy manual review, although tools can still help that manual review.

I'd say, right now, the most interesting thing in crypto security is probably the risk management exercise. As an investor, you purchase some tokens, or however it is you invest in crypto, and you're probably pretty blind in your ability to assess the risk of the contracts involved. So that's an immediate need, and probably there are a lot of people that would need that type of value.

As far as dev tooling, I do think that eventually it arrives. It's easy to imagine a future in which every e-commerce purchase is a smart contract with some things attached to it. There are a lot of transactions that you might do that are smart contracts, at which point more and more companies and organizations will start building and writing smart contracts. And I think then there will be more of a need for automated tooling, because you really get to that point, again, where it's not the top, top experts writing them. It's more people building functionality and just using smart contracts as part of that. The timeline to that point is anybody's guess. But getting it right is probably going to be quite lucrative.
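
As a flavor of what such automated tooling might look like, here is a deliberately toy sketch, not any vendor's product: it scans Solidity sources for one well-known pitfall, using tx.origin for authorization checks. A real tool would parse the AST rather than grep the text; the contracts directory and the single rule are assumptions for illustration.

```typescript
// Toy smart-contract check: flag tx.origin usage in .sol files.
import { readdirSync, readFileSync, statSync } from "node:fs";
import { join } from "node:path";

// Recursively collect Solidity source files under a directory.
function solidityFiles(dir: string): string[] {
  const out: string[] = [];
  for (const entry of readdirSync(dir)) {
    const full = join(dir, entry);
    if (statSync(full).isDirectory()) out.push(...solidityFiles(full));
    else if (full.endsWith(".sol")) out.push(full);
  }
  return out;
}

let findings = 0;
for (const file of solidityFiles(process.argv[2] ?? "contracts")) {
  const lines = readFileSync(file, "utf8").split("\n");
  lines.forEach((line, i) => {
    if (line.includes("tx.origin")) {
      findings++;
      console.log(`${file}:${i + 1}: tx.origin used; prefer msg.sender for auth checks`);
    }
  });
}
process.exit(findings > 0 ? 1 : 0);
```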

[00:15:07] Simon Maple: And what would your guess be?

[00:15:11] Guy Podjarny: I actually really don't know. Crypto is such a volatile world that it's hard to know. NFTs, for example, are smart contracts, but very templated ones. So it's almost like security for no-code; it's not the most obvious immediate concern, because it's something that's fairly constrained. I think it will happen. I don't think it's an overnight thing, and another interesting question is how much these tools become separate from regular software development tools, versus part of the same set. But yeah, I wouldn't hazard a guess.

[00:15:44] Simon Maple: We shall see, we shall see. Next question is from Rajesh, who is the Director of Cloud Infrastructure and Security Services. Rajesh asks: what is your view on achieving pragmatic zero trust for application security?

[00:15:58] Guy Podjarny: Yeah, it's an interesting question. So zero trust is a very big term, and when you say zero trust, it doesn't necessarily mean the same thing to everyone. When it comes to application security, I would say it mostly relates to permissions, to access permissions, and the notion of how you don't just rely on something being in the network, and therefore allow it to do everything, but instead constrain its permissions to the relevant scope.

I think in that context, microservices and cloud architectures are a good step in that direction, because unlike a monolith, they are split, and each of these microservices does have, on a cloud platform, certain constrained permissions around what it can do. If you're doing it correctly, especially today with a variety of service meshes, you can constrain that authorization, what they can do, regardless of their subnet. The challenge for microservices has always been less around the technical elements of blocking; that world is evolving nicely. There's Open Policy Agent and Styra, there's a bunch of service meshes, Solo, all sorts of controls that can sit there. It's more about the definition: how do you know what this microservice is allowed to do?

I think that's still an unsolved problem. A lot of tools in the security space try to do it automatically, try to reverse engineer from looking at the application what the intended behavior is and what it isn't. I'm not a fan of that approach. It can be immediately practical, because it will find a bunch of problems, but I don't think it's the right long-term solution. You don't want to reverse engineer the intended behavior; you actually want to specify it. But specifying it has been demonstrated to be something that isn't done well, right? Permissions just, for convenience, expand and expand until someone adds an asterisk, and then everything is allowed.

So I think that's the problem to solve. I do think it will be solved, but we need to figure those things out. All of those, I think, are very immediate activities and an opportunity. Maybe the CIEM world, Cloud Infrastructure Entitlement Management, gets a bit closer to it, but those tools are mostly very non-dev-friendly. Long term, I do think there are some interesting things to talk about in getting the granularity lower. You see Deno, which is the attempt at a more secure Node-style environment, being more constrained with permissions, and you can think about portions of the application, modules, being constrained in what they're allowed to do. It's quite tricky. You start thinking about how you know what the context is right now, and even more, it exacerbates the configuration problem: how do you specify what is allowed to happen?

I do think over time that is what we will see. We will approach the more mobile-style, intent-like constraints that really limit what a component can do. But how you do that in a way that is successful is still something that needs to be resolved.
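
To ground the Deno reference, here is a minimal sketch of that scoped-permission model. The file names, host, and paths are placeholders; the point is simply that the script is launched with explicitly narrowed permissions rather than a blanket allow-all.

```typescript
// fetch_report.ts - run under Deno with narrowly scoped permissions, e.g.:
//   deno run --allow-read=./config --allow-net=api.example.com fetch_report.ts
// (api.example.com and ./config are illustrative placeholders.)

const cfg = JSON.parse(await Deno.readTextFile("./config/report.json"));
const res = await fetch(`https://api.example.com/reports/${cfg.reportId}`);

console.log(res.ok ? await res.text() : `request failed: ${res.status}`);

// Reading outside ./config or calling any other host is denied at runtime
// (a permission prompt or error) instead of silently succeeding.
```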

[00:18:57] Simon Maple: I think that's so interesting, because when these things aren't designed correctly, developers can really easily, really quickly, just work around them. The Java Security Manager is an example; me as well, back in my early days, it was just like, "Okay, this has fallen over because the Java Security Manager says no? Let's just turn that off." It's the first thing that you do, even when the control is asking something reasonable, and it's very similar with other things. SAST is a great example, right? Traditional SAST had so many false positives that, historically, that's what the world got used to from a development point of view. So what do developers do? They don't trust the results, and as a result, they overlook actual, real vulnerabilities, because there are so many false positives in there. So yeah, it's interesting how developers will always find a way, whether it's the way you want them to do it or not.

[00:19:52] Guy Podjarny: The Java Security Manager was actually something that we looked at at the very beginning of Snyk. The original name of Snyk came from the idea of data being "sneaked" out, and protecting the application against that. We looked into the notion of containing components and things like that. The Java Security Manager is, technically, a very powerful tool, in that it's running and you can configure it to a fairly granular degree, but the practicality of doing that is just extremely, extremely low. So it becomes a sort of black box that's never tuned. I think this is doable. I think this is something that can be done, but it has to be done hand in hand. What I don't believe in is, again, the convenient attempt at a platform-y approach, or a post-deployment approach that just looks at the system and figures out what would work. I've seen that fail with firewalls. I've seen that fail with pretty much every attempt at it. It does provide immediate value, but I just can't accept it as the long-term solution.

[00:20:52] Simon Maple: Yeah, those are good points. Okay, so we're going to jump to a question from Rachel, Head of UX at Multiverse. Rachel asks on LinkedIn, and this is an interesting question actually, about – you've been at Snyk now, Guy, for how long? Gosh, since you founded it. Six, nearly seven years.

[00:21:09] Guy Podjarny: Seven years now, six and a half, seven years, yeah.

[00:21:11] Simon Maple: And of course, as the founder, you've been the founder, the CEO, the president, the prime minister, and I'm not sure if there's any other – so the question that Rachel asks is, have your views around the balance between security and usability changed over time? So of course, back in the day when you created Snyk, one of the core problems around the market was tools, security tools weren't focused around developer usability and developer experience, and that's the premise behind Snyk as a company. So how have your views changed, I guess in the six years?

[00:21:48] Guy Podjarny: Actually, I know Rachel, who asked this question. Rachel and I gave a talk around the time Snyk was created, about seven years ago, a little bit before Snyk was created, titled "Security Ergonomics." Back then, Rachel was a designer on the Chrome team, and we talked about how browsers have really evolved the security usability of handling certificates and such, where it used to be that if you had a bad certificate, you would just accept it and push forward.

Today, it's actually pretty hard to accept; developers can probably relate to this. You know better, you're intentionally browsing your dev servers, you know their certificate is bad, and yet you need to really click through, find the gray text within the gray area, expand it, say, "Yes, I want to do this." Are you sure? Are you absolutely sure? And that has been really successful; the data shows it has been tremendously helpful.

Fundamentally, I think my views haven't changed. A lot of the tactics have changed, but I think the core of it remains the same. There's a certain balance between how much you care about a topic and how hard it is to do the right thing, and you need to care more than it is hard. That's true in life for everything you do. If you go on a diet, or spend time with the kids, or go on a holiday, all of those decisions, you need to care about them, want to do them, more than it is hard to do them. Historically in security, it's just very, very hard to do security well. There's a bunch of things that we can do, and are doing, around elevating the importance of security, and the security industry has consistently tried to scare you into submission. To an extent, you have to highlight the risk. You can't talk about avoiding risk without talking about the implications of it, and you also have stories about the costly implications of downtime, for instance, in ops. So it's legit.

But it can't be the only tool in your toolkit. We have to invest in making it easier for people to do the right thing and making the default path the secure path. And there are a lot of ways to do that. I think the browsers are ahead of where we are in developer security in general. We actually had Adrian Ludwig on the podcast quite a while ago now; he's Chief Trust Officer there now. He had a good point, which is that people talk about enterprise-grade security, but that's actually not the best bar. The best bar is consumer-grade security, because with consumers, you don't expect them to know what to do; you need to build your systems such that they guide them to do the right thing, and it needs to be secure.

So I think we're still quite far from it, but the quest is conceptually the same. On one hand, keep people from making insecure choices where possible, not even giving them the option of shooting themselves in the foot. And where we do need to loosen up, where we do need to allow flexibility, really invest in making it easier to make the secure decision than the insecure decision, or at the very least, make it as easy as possible to choose that secure path.

[00:24:53] Simon Maple: Yeah. And even, being able to choose that path without necessarily being an expert in the domain of security.

[00:25:00] Guy Podjarny: Absolutely. Yeah. And to me, that's all part of the difficulty: the expertise required, the attention required, the time required. At the time, we talked about browsers and so on. If you go onto Facebook and you're trying to log in, you want to see baby pictures, and nothing is going to get in your way; you want to see those. Maybe today, seven years later, it's not the same thing you're chasing. So if you just get prompts like, "Do you want to allow X, Y, Z?", you're sort of like, "Yeah, get out of my way. I want to achieve this thing that I'm trying to do." So you have to rethink how you paint things.

If you're a developer trying to build a piece of functionality, you're trying to see it work, and you're failing because of some permission or whatever it is. You want to say, "Get out of my way. Let me put that asterisk over here, so I can actually get on with the thing I'm trying to do, which is see this feature in action." And then the attention and effort and time required to go back and change that is a different matter. But if, in this example, you got a prompt saying, "Hey, do you want to add just this specific permission now, as you're trying to do it?", maybe you'd be more inclined to create a lower-privileged system. And there are a million examples like that.

[00:26:09] Simon Maple: Very, very true. And after two kids, I think I'm more scrolling for dog pictures these days rather than baby pictures. There's a quick-fire question I didn't ask you, Guy. Dogs or cats?

[00:26:21] Guy Podjarny: Dogs.

[00:26:21] Simon Maple: Dogs, of course, yeah. That is the right answer. A lot of people have a preference, but that is the right answer. So, Dogeard asks on Discord, following on really from how much developers need to be experts: how far left do developers need to shift? That is to say, at what point do developers take on being security experts?

[00:26:43] Guy Podjarny: Yeah, I'd say there's a false dichotomy in that question. You need to shift left so much so that you don't need to know you're doing security, almost the opposite of becoming a security expert. First of all, shifting left is an industry term, and we use it, and I think it's valuable because, like DevSecOps, it helps push a movement. But it's not really about just left. It's about top to bottom. It's about getting developers to embrace security. Because as a developer, if you're a modern developer, you're not just living at the left. You've expanded to the right, you're working on things that are deployed, you are carrying a pager, a sort of virtual pager, and dealing with on-call.

So you have to think more about deepening, about top to bottom, about embedding security into software. The answer is: until it's part of the fabric of software development. From a left perspective, from an earliest-stage perspective, it probably starts in education, probably continues into design and planning, how much security is embedded into it, and then it goes on to the coding, and deployment, and such. But really, all of those are predicated on ownership, predicated on developers thinking about security as a part of what they need to do. If we ever get to the point where we need to make them security experts, we would have failed.

In fact, I also think, for instance, the level of Kubernetes know-how that you need to deal with systems today, that's also a failure. Those are temporary glitches. Typically, what you want is for it to be simpler. You want them to care about the concepts, but you want them to have tools that help them actually apply those concepts correctly. You want them dealing with the logic and the decisions, not the technicalities.

[00:28:28] Simon Maple: Yeah, excellent answer. So it sounds like developers shift left, they expand right, then they push down. So I think they've only got out and up to go.

[00:28:39] Guy Podjarny: When you're about 30 and a bit, there's also typically a bit of a midsection. I don't know if it's exclusive to developers, but they definitely explore it. That helps that a little bit.

[00:28:48] Simon Maple: Well, that leads us nicely on to Martin's question from Discord as well. What's beyond shift left? How do we build towards our next step in secure coding? And how do we help the companies that are struggling at the beginning of their DevSecOps journey, or stalled out in the middle? That's probably a little bit of what you've said already there.

[00:29:05] Guy Podjarny: Yeah, it's a little bit of what I was saying right now, which is that shift left needs to evolve into that top to bottom, into that embedding into software. It's not the same as asking how we help companies that are struggling at the beginning of their DevSecOps journey. At the beginning, it's more about showing that it's possible, showing developers that, yes, there could be a security tool that, as a developer, you will actually enjoy using, that would not only not get in the way, but actually make you feel more proud about the craft of what you've created.

It's much more about providing visibility into something that previously was invisible, and raising questions you never knew to ask. That's the very beginning for developers and for organizations that are just starting with DevSecOps. And to be frank, security teams need to go through a similar change as well. They need to transform from being the ones running the tests, finding the issues, and choosing which issues get addressed, to being more platform builders, and that's uncomfortable. When you think about the world of SREs today, of site reliability engineers, system reliability engineers, even that world is split a little bit. Some people over there like building platforms that allow the developers to build operable software and, when a problem occurs, address it and fix it. Some like being superheroes. They like coming in and saving the day when there's a really bad problem; they take the page, they come in, and that's fine. There's going to be a bit of a mix of the two, but we have to shift towards the former and not the latter. Otherwise, we're never going to change that equation.

So it's really about starting that transition, and the end of it is a natural part of it; I don't think it ever fully, fully ends. If you look at ops, ops is probably a good role model for us. There are still experts, there are still teams that remain, they're still building out centers of excellence, they're still building platforms, and there are still escalation points. But most of the work is done by the core teams. So that's what we aspire to for security.

[00:31:03] Simon Maple: Great answers. So we have another question from Discord. This one is from CloudGeek7, and CloudGeek's question is: how do you transform your team from a DevOps to a DevSecOps mindset? I guess there are some terminology questions here as to how much you see DevSecOps being distinct from DevOps as well.

[00:31:23] Guy Podjarny: There are sort of two transitions that happen at once. On one hand, it's a change for the people doing DevOps, to bring security into the fold. I don't know if they think of it as expanding DevOps or transitioning to DevSecOps. That really happens by, whenever you're asking about ops, asking about security. If you're dealing with operability as part of your team's KPIs, why aren't you asking a security question over there? If you're thinking about resilience and uptime in terms of your budgeting and resourcing for the team, why aren't you doing the same for security?

So there's a whole practice to it, and there's equipping your team with tools. When you think about the tools we have today in the ops world that weren't in developers' hands a decade ago, it's night and day, right? Think about the dashboards we have for seeing our systems work, the logging systems that we have. Logging used to be something you only did for the edge case; it was in the exception catch. Now you log throughout, so that you know what the system has done and what it hasn't.

So there's a bunch of these types of practices and tooling. From the DevOps side, it's more about stopping and thinking, "Okay, how can I weave security into different phases? From your stand-up meetings to your plans, all of these different stages." I mentioned this before: there are two paths that you need to take. On one hand, you try to get the whole organization to level up a little bit in a specific space. On the other hand, you want to find a team and help them master it, and have them become the North Star. That's also how DevOps typically gets rolled out in organizations.

The other transition that happens is for security teams, and they need to let go. They need to switch to being an empowering organization. They need to switch to becoming platform builders. They need to figure out how they answer compliance questions when they're not the ones doing the work. They need to figure out how they govern when they need to be supervising, and how they become collaborative with the development teams. All of these are journeys, so you shouldn't come into it thinking you could just flip a switch and make it happen. And I'll just emphasize, I'm a bit of a broken record here: don't try to do it to the whole organization at once, especially if it's a large organization. Get everybody aligned on the general mission, but then get a subset going full strength forward.

[00:33:43] Simon Maple: Yeah, great answer. I totally resonate with that take about learning from that feedback as you go and then changing your approach depending on how certain teams have taken it. So yeah, love that. It's actually very refreshing to hear a security talk mention logging so much these days without talking about Log4j. So thank you for that. I'm sure many people will appreciate that.

But now we're going to switch across to the DevSecOps future. We'd love to hear some future predictions and future takes. Deepak, co-founder and CEO at BoxyHQ, asks on LinkedIn: there are some exciting developments in the area of software supply chain security management, like SBOMs (software bills of materials) and software signing, Sigstore for instance. What are your thoughts on the future of these initiatives?

[00:34:27] Guy Podjarny: Yeah, I think software supply chain security is a mouthful, but it's also a very important topic, and it's very big. It's big and complicated, so much so that we're actually going to do a little miniseries on it here in the podcast. So stay tuned. We'll update when it’s there because there are a lot of terms to sort through and a lot of steps to consider.

In a nutshell, I think, one, it's a real problem and it's going to get bigger. There's this chain of trust. Because of complexity, we are using components, and at best, you kind of choose to trust a component, but you don't think about the fact that you're also trusting its maintainers, how they secure their desktops, and the organizations that they work in. And then, subsequently, all of those have the dependencies that they chose, and so on down the chain. There's so much that we depend on, it's so big, it's so hard, that we revert to blind trust, and that's just fragile.

What we're seeing is attackers exploiting that: SolarWinds, Codecov, and just now node-ipc. Sometimes it's about fragility due to an action of someone that was inside the system, and sometimes it's someone that was outside and manipulated it. So it's absolutely a big problem that we need to address. There are a lot of forming practices and tools in the industry right now, commercial and open. For instance, the Open Source Security Foundation has been formed. I'm on the board of it; Snyk is a premier member. There's a lot of support for it from different commercial vendors, and I think there's a lot of work coming out of it.

Sigstore is an example of that; it allows you to keep and store attestations and proofs through the process of building a component. There's also Scorecard, which is an attempt at a standardized way to evaluate and assess a repository. There are many others. What I would say is, stay close to it. Again, we'll have a podcast series here that describes it in more detail, but it's a journey. The first thing you need to do comes back to things that you really should have been doing anyway, which is start with visibility. Start with just knowing what it is that you have: which components are in your system? What are the problems that are already known in these systems? If you don't do that, if you don't really know how to deal with the basics, all of these other things are really more like a 202 course, further expertise that you need in order to address and start managing the risk of problems that might occur downstream.

So my view is: stay tuned, keep track of these activities. There's no obvious set of actions unless you're really quite a bit further ahead in your security journey, and just make sure that, in the meantime, you set yourself up to have a good handle on your dependencies in the first place.
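
One small, concrete piece of that "good handle on your dependencies" idea, sketched under assumptions: npm lockfiles pin each package with an integrity hash in SSRI form (for example "sha512-" followed by a base64 digest), and you can verify a downloaded tarball against that pin. The package name and tarball path below are illustrative; richer provenance and attestation (the Sigstore model) go well beyond this, but hash pinning is the basic building block.

```typescript
// Verify a package tarball against the integrity hash pinned in
// package-lock.json (npm SSRI format, e.g. "sha512-<base64 digest>").
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

const lock = JSON.parse(readFileSync("package-lock.json", "utf8"));
const entry = lock.packages?.["node_modules/left-pad"]; // illustrative package
if (!entry?.integrity) throw new Error("no pinned integrity for left-pad");

const sep = entry.integrity.indexOf("-");
const algorithm = entry.integrity.slice(0, sep); // e.g. "sha512"
const expected = entry.integrity.slice(sep + 1); // base64-encoded digest

const tarball = readFileSync("left-pad-1.3.0.tgz"); // illustrative path
const actual = createHash(algorithm).update(tarball).digest("base64");

if (actual !== expected) {
  throw new Error(`integrity mismatch: expected ${expected}, got ${actual}`);
}
console.log("tarball matches the lockfile pin");
```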

[00:37:02] Simon Maple: Yeah. I look forward to those sessions coming out; they sound interesting. So, staying on the DevSecOps futures, Vincent Goulet, the founder and CEO at Sonder Security, says: "I would love your take on the implications for application security with regards to cyber warfare and cyber espionage." He mentions that AppSec is typically well understood in the space of management systems, ERP, HR systems, et cetera. But what is the future with regards to industrial systems, SCADA, and the like?

[00:37:31] Guy Podjarny: Yeah, it's a good question. We have to think a little bit about the technology trends that occur here. What has happened in the world of more advanced technologies is that, in the name of agility and speed, we've increasingly moved towards these independent teams that are able to run quickly, and we've moved more and more responsibility into the hands of those teams, so they have fewer dependencies in the org and can run quickly, ship software to a customer, and iterate. Those practices actually didn't start in ERP and HR systems. They started in commerce and media and such, continued into a variety of startups and fintech, and then they got into ERP and HR.

So with that agility, with that independence, the scope of the application grows, and the importance of securing that application grows. More issues are at the app layer, which in turn draws more attackers to exploit it, because the decisions and the power, and therefore the flaws, are growing in the AppSec space. It took a long time for this to happen, and now, I think, a lot of those lines are blurring, and a lot of infrastructure today is really an aspect of the app, et cetera.

When you look at mission-critical systems, when you look at industrial systems, they don't tend to be at the cutting edge of those technologies. But they're not immune to these demands. Even managing a nuclear plant or an oil pipeline is still subject to the desire to out-innovate competition, to move faster, to get more automation, more efficiency, a better user experience, so that you can run faster, be more competitive in the market, or just provide a better outcome. Even if you're not competing, and again, for a nuclear plant or something, maybe competition is less of a factor, it is about efficiency, about getting better results through technology.

So that drive for speed happens there as well, a bit further behind, but they're going through the same process of moving things into the application layer, taking advantage of containers, of open source components. Because that's the other thing: it's not just the application layer, it's also the supply chain, leveraging dependencies, leveraging the open source ecosystem. I think that will continue, that will grow, and with it, the importance of AppSec will grow within these systems. That was the trend with every other security practice. For instance, network security started with systems that were far more networked, far more open and connected to public networks. Subsequently, it became important in-house as well.

Fundamentally, I'd say it's the same journey, and as always, the SCADA systems, the mission-critical systems, are just going to be a little behind. One interesting tidbit: the Biden administration's latest fact sheet, which emphasizes fears of cyber war because of the Ukraine crisis, explicitly calls out, in a fairly tiny set of bullets, I think about three or four of them, the importance of developer security, the importance of software security, the importance of handling your dependencies. So it's already there. It's already a concern, and it will only strengthen.

[00:40:39] Simon Maple: That sounds very interesting. It will be very interesting to see how the rest of the world follows suit with similar kinds of requirements to those the Biden administration has put on organizations. A question from CloudGeek7 on the DevSecOps community Discord: is NoOps the end of DevOps?

[00:40:57] Guy Podjarny: You got to love the sort of buzzwords. No ops, no more ops, no more operator –

[00:41:04] Simon Maple: Maybe it's NoOps. We'll call it NoOps, as well.

[00:41:09] Guy Podjarny: I think maybe it's NoOps. But fundamentally, with NoOps, first of all, there is an application and it needs to be operated. So it's not about not having an application that needs to be operated, any more than serverless doesn't have servers behind the scenes. Clearly, it needs to be operated, and there are servers. I think the difference, really, is about who's doing the work, and where it is that you want to differentiate. For a certain scale and type of application, there's absolute value and absolute merit in advancing towards a place in which you don't need to invest in ops as much. Maybe you don't have an ops team. If you're using serverless, you need to worry less about ops in the sense that you've offloaded some of that to the platform. If you're using a PaaS like Heroku, or hosted monitoring systems, you do less of it yourself.

If you look at the Salesforce AppExchange and things like that, there are even fewer operational responsibilities on you, with fewer degrees of freedom in what you do. So once you hit a certain scale, or if you actually want to differentiate through your tech and don't want to be beholden to these constraints, then you do need ops skills inside your organization. So I don't think NoOps changes DevOps. I think it's just a continuation of how much is offloaded to the platform, and I think there will always be a need: if you're looking to be better because of X, if you're looking to excel at operations, at security, you would still need dedicated teams. You would still need those centers of expertise. You would still need to invest in choosing the right components of your security or ops platform and build those out.

What you shouldn't do is make those teams bottlenecks. So in some teams, NoOps means no DevOps team. Frankly, DevOps teams are a misnomer; they shouldn't exist. It's platform teams. Today, you see the term platform teams a lot more, because DevOps is all about breaking down the barriers, but they've all embraced the platform approach. And eventually, the same is true for security, but security is probably easily a decade behind in that journey.

[00:43:19] Simon Maple: Yeah. Very interesting. We have a couple of questions left. This next one is a very future-looking one, a very interesting question from developer Steve, again from the Discord DevSecOps community chat. He asks: what are your thoughts around security in the emerging quantum computing space? And how long do you think until we start seeing the first quantum vulnerabilities appear? You mentioned visibility into your vulnerabilities; I wonder, if you observe quantum vulnerabilities, whether they change state or disappear? Who knows?

[00:43:49] Guy Podjarny: Yeah. There are a few very interesting spaces when you think about weaknesses in them. Blockchain is one. The notion of biotech is interesting; machine learning and machine deception are interesting. Quantum applies to security in two ways. On one side, and probably the bigger conversation, is the use of quantum in exploits. It's about where we have relied on encryption and assumed the system to be secure because of those crypto defenses, and maybe quantum changes and breaks that equation. That defensive conversation I'm actually seeing grow a fair bit. The evolution here, while it's probably still a bit of a ways away from quantum being able to break the top tier of public encryption algorithms, doesn't feel farfetched anymore. It's when, not if. So you do see some defensive mechanisms come into it, things that overlay other layers of protection, where you can't rely just on crypto, and you need to maybe think about where the data is, and maybe you just need different mechanisms. It's still a scary proposition, and when that time happens, I expect some turmoil for a while, especially when you think about quantum breaking crypto.

Vulnerabilities in quantum itself, basically thinking about exploiting quantum algorithms, I haven't heard of. I would say, fundamentally, attackers are lazy, just like developers are lazy. They're seeking the best way to make money. And so, if there are fewer victims to look at, they're less likely to invest their resources in trying to attack them, unless those victims are very, very lucrative. So I don't think quantum is a very compelling target at the moment for attackers to invest in. I think there are more appealing targets, for instance, blockchain or machine deception on the cutting edge, and of course, things that are far more mundane but very lucrative, like cloud and open source vulnerabilities and the supply chain.

[00:46:00] Simon Maple: Yeah. And that leads us actually on to the next question, the final question from Propaganda Panda. What an amazing handle. That definitely wasn't a Propaganda Panda. What will be the new major challenges that security and engineering will need to face in the coming years? How can engineering keep up with the expanding attack surface?

[00:46:18] Guy Podjarny: I think, in general, the growth of complexity, the increasing complexity of applications, is probably the biggest challenge that we have. What we can't do is keep piling onto developers, who also have to care about this and this and this and this. That's just not a winning strategy. So we have to figure out how we eliminate some problems. We have to figure out how we build defense in depth, or how we build defenses such that, if you've established some specific set of practices, a whole set of problems goes away.

That type of answer might sound impossible, but it isn't. When you think, for instance, about desktop vulnerabilities, and you think about the risk of hacks around Windows and things like that, and you look at where it was 20 years ago and where it is today. Twenty years ago, you would have said it's impossible, there's no way, these operating systems are really poor. Two things happened in the meantime. One, mobile happened, and in mobile, the companies rethought security. It's a much more modular, much more componentized security model, and it holds up a lot better, alongside attention from those companies. So, cloud potentially allows us that type of opportunity, not just for security, but for resilience and a bunch of other challenges, other non-functional attributes of the application. We just need to take advantage of that.

The second thing that happened was that, in this case, Microsoft really turned around in terms of its attention to vulnerabilities. Fifteen years ago, I think, or maybe 10 years ago, it was a punching bag for attackers; it was so easy to find holes, and the Mac was even running on this premise of being more secure. And today, maybe that's still the case. I don't know; Windows is still a more open platform than the Mac, I believe. But fundamentally, it's far, far more secure. There's a lot more security built in, there's a lot more security awareness. So that's the other piece that we can do, which is just to raise awareness of it. I'm kind of focusing on security with this question, and I think the question is about more than just that. But to me, if I think about security as an attribute of quality, it is about embedding it in, figuring it out now as a part of how you build software, and pushing it forward. We need to deal with that complexity at the core, versus just patching and detecting and responding and things like that. We have to think about how we actually solve some of these problems so we can move on.

[00:48:50] Simon Maple: Well, thank you very much, Guy. That concludes our AGA, our Ask Guy Anything. How was it for you, Guy?

[00:48:58] Guy Podjarny: It was fun. It's nice to sort of voice some opinions over here. Much of it learned from the smart guests on the show, to be frank.

[00:49:05] Simon Maple: Yeah. And it's always great chatting and listening to your insights and takes on the industry, on DevSecOps, and security in general. So a big thank you, Guy, for all your answers. And of course, we'd love to hear from the audience if this is a format that you enjoyed. Let us know and we'll make sure we do more of these. But for now, we'll see you next time on The Secure Developer podcast.

[OUTRO]

[00:49:28] ANNOUNCER: Thanks for listening to The Secure Developer. That's all we have time for today. To find additional episodes and full transcriptions, visit thesecuredeveloper.com. If you'd like to be a guest on the show, or get involved in the community, find us on Twitter at @devseccon. Don't forget to leave us a review on iTunes if you enjoyed today's episode. Bye for now.

[END]
