
Season 7, Episode 107

A Look Into The Future


Today we have a fun episode lined up for you! Over the last year of 2021, we’ve been honored to have some incredibly smart people on the show to share their views and practices in the DevSecCon space with us all. And in each episode, they were asked a slightly open-ended question: if you took out your crystal ball and you thought about someone sitting in your position or your type of role in five years’ time, what would be most different about their reality? For this special installment, we’ve put together some highlights of these brilliant answers! Hear perspectives that cover everything from changes on the data, AI, and ML front to the idea of ownership when it comes to security. We also touch on the increased fragmentation in the DevOps scene that we’re going to need to work with, bigger picture concerns about how regulation might be different in five years, and some final optimistic predictions on ways we could all be in a much better place! We hear some golden nuggets from the likes of Robert Wood from CMS, cybersecurity influencer Ashish Rajan, Liz Rice from eBPF pioneers Isovalent, our very own Simon Maple who weighs in with his concrete expectations of what will happen, Dev Akhawe, Daniel Bryant, Rinki Sethi, and so many more! So to hear what these top industry professionals have to say about the future, join us today!


ANNOUNCER: Hi. You’re listening to The Secure Developer. It’s part of the DevSecCon community, a platform for developers, operators and security people to share their views and practices on DevSecOps, dev and sec collaboration, cloud security and more. Check out mydevsecops.com to join the community and find other great resources.

This podcast is sponsored by Snyk. Snyk is a developer security platform that helps developers build secure applications without slowing down, fixing vulnerabilities in code, open source, containers and infrastructure as code. To learn more, visit snyk.io/tsd.

[INTRODUCTION]

[0:00:51.8] Guy Podjarny: Hello everyone, thanks for tuning back into The Secure Developer. Today, we have a fun episode. Over the majority of 2021, as I had all these smart people on the show, I really liked to ask them at the very end of the show a slightly open-ended question, just to get some raw thoughts out.

The question I’ve asked for most of ’21 was, “If you took out your crystal ball and you thought about someone sitting in your position or your type of role in five years’ time, what would be most different about their reality?” It was really great to hear all these different perspectives about what would increase or decrease in the industry from these smart people. What we tried to do today is take highlights of these answers.

Unfortunately, there’s too much to fit all of them into one episode, albeit there are a lot of brilliant perspectives. We’ve categorized them a little bit and tried to spot the themes. So let’s dig in and see some of these predictions for the future from our past guests.

[INTERVIEW]

[0:01:57.0] Guy Podjarny: The first batch of predictions revolved around data or machine learning or AI which was definitely, I think, the most recurring theme amidst the answers. Let’s hear four predictions about data and AI and those are coming from Jet at Nike, from Joshua at United Health Group, Robert Wood at CMS and Ashish at PageUp. Let’s hear what they have to say about how in five years’ time, data and AI will play a different role in our lives.

[0:02:22.8] Jet Anderson: That’s a fantastic question, and I’ve been thinking about it. One of the things that I see being the greatest change in our technical infrastructure landscape right now is the explosion of data. By it, the explosion of fields like data science, AI and ML leveraging this data to gain insights and train models to help us make better decisions as companies and reach our customers in better ways and that sort of thing.

With that comes a huge set of data privacy concerns that are off the charts scary. Couple that with the fact that we haven’t been training software engineers. Now, are we even thinking about training data engineers and those who are involved in creating these models and the security of those models, and so forth, the security of the training data that goes into these models?

I think that that landscape will only get more complicated. My job as an educator, for everyone who produces computed insight and applications that do whatever they do, is to really try to adjust to that changing landscape and provide value for everyone writing code, whether it’s R and Python with NumPy or if it’s a frontend developer using React or Vue or Angular. Everyone deserves an equal chance at writing the most secure software.

[0:03:54.1] Guy Podjarny: Well, well-said. Jet, this was excellent. Thanks a lot for coming onto the show and sharing your learnings here.

[0:04:02.6] Joshua Gamradt: What I see coming, especially — and this is kind of a maybe overused term or misunderstood term, is the idea of how artificial intelligence is going to make its way into how we drive security. A lot of what we see today is humans still interacting with their particular systems and leveraging technologies to kind of be the intermediary to show us the telemetry that gives us, “How do I interact with you from a security analyst point of view to you as a developer?” And then, “How are we working through it to identify false positives?” That entire process.

What I see coming is, an artificial intelligence platform that identifies, based upon input, looking at code, saying, “Just so you know, you’re most likely with your team going to be introducing this type of quality, bad quality into your product.” It’s going to give us intelligence to be able to point us in the directions that we would have had more human interaction with before, that I think artificial intelligence is going to work its way into.

I think the other thing from an artificial intelligence standpoint is that it’s going to reduce the amount of, like I said, the manual interaction that goes along with how we do our work, and the way we analyze and what we do is going to be totally different as we look into the future. I think it’s going to free us up to again be more of a community of how we develop together versus having, I see a combination of an industry where you’re not necessarily having the security industry and the engineering industry. We’re actually seeing a quality - a development industry that’s built into it.

[0:05:44.9] Guy Podjarny: Yeah. That’s a great kind of lookout into the future. I relate to a lot of this. Shua, this has been great. Thanks a lot for coming onto the show and sharing these insights and learnings. As we said, you are now signed up to future episodes to do a V2 or sort of learnings of a year later. So, we’ll get back to you.

[0:06:04.7] Robert Wood: Yeah, I think it’s more and more likely that the job of a CISO in the future is more akin to managing a data platform. So security telemetry is only going to increase. We’re only going to get it more frequently and more like streaming data, as opposed to these point-in-time sort of snapshots of data. And it’s almost like a chief data officer crafting intelligence out of this mass of data that we’re getting on an ongoing basis.

If I was to think about security as a big data platform and all of these things dumping into this data lake or data warehouse, whatever sort of way in which you’re set up, I am able to craft — or the CISO of the future is able to craft — intelligence out of all of that and then share it out to produce intelligence products, back to the people who need to be accountable for it, closer to the point where they need to make a decision on it. That, I think, is really a game-changer for instilling this sense of ownership and accountability in the way one might manage risk.

I think that brings in the need for data science talent in this field and not just building machine learning models to, like, triage SOC alerts or find malware, things like that. But it is really making sense out of this mass of data that we sit on. Because typically we have all of these security activities that stovepipe themselves. They all sort of operate as a standalone thing or they just generate a report. They generate insights but they don’t generate data that can then go and be connected to something else to generate even more valuable insights.

It’s almost like thinking about security as a platform or security in a network effect sort of way and that’s my hunch. I hope that’s where we’re going or at least some flavor of it because I really believe that there is a lot of power in taking that approach because our field hasn’t really embraced the sort of big data revolution so to speak like other fields have. Marketing’s got it on lockdown, other fields have but we really haven’t yet and you know, if and when we do, I think there is a lot of opportunity in there for us.

[0:08:19.1] Guy Podjarny: Yeah, no, fully agree. Security is naturally invisible and you don’t really know what risk you’re taking on, so shining a spotlight there and understanding what is there with smart data management makes a lot of sense. We’re out of time here. Rob, thanks a lot for coming onto the show and for sharing your insights.

[0:08:37.8] Ashish Rajan: Funny. Depending on if they’re in cloud, I definitely feel the job would be a lot different. I definitely feel security as a team should try and do enough automation that they can get themselves out of a job. So, all of us can start working towards AI and ML, instead of trying to still solve patching problems.

If I had a crystal ball, that’s where I would see the future would be, where someone who’s probably the head of security or CISO is trying to solve AI and ML problems or using AI and ML to solve security problems. Instead of — we have a lot of data, which we’ve been consuming for so long, in terms of creating code, in terms of generating security events. How do I make sense of that in a way that it makes sense? And not the drinking Kool-Aid kind of way where everyone says they’re doing AI and ML, but what they’re really doing is pattern matching.

It’s completely fine to call it pattern matching, but it sounds much nicer if you say AI and ML. So, I would love for security to be kind of going down that path, as well as developers leading that race — they seem to have a lot more evolved AI and ML programs than what we do. But I definitely see this already happening in Netflix and other places as well.

At least from the guys that I spoke to in Netflix, it seems like they’ve hired data scientists and they’re already going down that path already. So, I would love for more security and someone, I guess, Ashish in five years to be possibly solving that problem for a much bigger organization as well.

[0:10:01.1] Guy Podjarny: That’s a powerful destination there. So hopefully, it comes to the fore. Ashish, thanks for coming on the show. This has been a great conversation.

[0:10:07.7] Ashish Rajan: No problem. Thanks for having me. I appreciate it.

[0:10:10.2] Guy Podjarny: Okay, you’ve probably now seen that there are some changes happening on the data and ML front. The next most recurring theme — maybe with a slight selection bias given the nature of this podcast — was one around developer ownership of security. We’re going to hear Daniel from Ambassador Labs, which is not necessarily a security focus, Rinki, who was the Twitter CISO, Nick, who runs security at Pearson, and Simon, our very own Simon Maple here from Snyk, talk about dev ownership and how that would be different in five years’ time when it comes to security. Let’s hear them out.

[0:10:47.0] Daniel Bryant: It’s a great question, Guy. I think riffing off some of the things we’ve talked around low code, no code and citizen developers, I think more and more people are going to be “developers,” if that makes sense. That brings fantastic opportunities, but it also brings challenges. We’ve talked about meeting folks where they’re at, and that’s going to be a big challenge.

A lot of us with a lot of engineering background, we’re very comfortable with fellow engineers. But when you start getting what we would historically call business folks involved in the mix, and let’s be honest, we’re all business folks, right, to some degree. But when you start getting folks that are coming up with fantastic ideas, doing POCs, or even bigger systems with these low-code, no-code approaches, you do have to think that level deeper as those systems get more adoption. You do have to think about sustainability, you have to think about cost, you have to think about security, all these things.

I imagine part of my job in the community is going to be really taking the time to meet folks where they’re at. Now, it’s some degree of, “Are you a Go developer? Are you a Node developer? Are you a JavaScript developer?” I think — I’m sorry, five years in the future — it’s more around this goal-oriented idea that we talked about earlier in the podcast. It’s that, what are you trying to achieve?

It doesn’t really matter whether you’re a Go developer, a low code developer or whatever. What are you trying to achieve and what are your constraints? I imagine the job of the DevRel is going to pivot a lot more to even more thinking about empathy, meeting folks where they’re at and then understanding the problem space before then guiding folks: “Here are your options, here are your tools, here are your practices, here are your platforms.”

I’m super excited. I think the whole developer education space is going to be super interesting over the next five years. I think software is eating the world, as the cliché goes, and Kubernetes and the internet are eating software. There’s just more and more opportunity for us, and I think it’s just really paying attention to the folks coming on board and their goals.

[0:12:39.9] Guy Podjarny: That’s, I feel, a fairly fair description. I would subscribe to that crystal ball of the reality changing, and now it’s just the question of timeline. Is it five or ten?

[0:12:49.5] Daniel Bryant: That’s it.

[0:12:50.3] Guy Podjarny: Yeah, cool. Daniel, thanks for coming onto the show. This has been great.

[0:12:54.8] Rinki Sethi: I think one of the big shifts that I’m seeing and I see it with Snyk, I see it with LevelOps, I’ve seen it with now more companies that are coming out, there’s this desire to make engineering lives easier and more engineering companies selling security to engineers, right? I think that model is going to go places. I’ve seen the success with Snyk. I’ve seen the success with other companies that do that. I think we’re going to see the shift where engineering teams are going to start owning security more and more. I think that’s one thing that I see happening more in five years, especially when you’re talking about large-scale security engineering organizations.

On the dream side of it, I would say I’m waiting for the day where we work ourselves out of cybersecurity functions. This is what drives me and this is why I have passion around, this is where security is part of the DNA of companies, period. Right? That’s where I’d love to see things go, and I don’t think that’s the five-year crystal ball. I think that’s a future crystal ball, but I hope that every security person out there is doing that for that reason that it’s embedded within a company well understood in the DNA of a company.

[0:14:02.1] Guy Podjarny: That is definitely in kind of the right direction towards which we need to drive.

[0:14:07.2] Nick Vinson: I think, or I hope at least, there’ll be actually less need for dedicated security engineers. It’ll be something that developers take on a lot more of the responsibility themselves and it becomes more of their day-to-day activities. Similar to the way that it was for QA with developers writing their own tests and TDD, and also from ops activities, where developers are responsible for building and maintaining their own CI/CD pipelines. I hope that’s going to be a similar case for security activities. Developers and teams are going to be much more able to do their own threat modeling and their own security testing, and being able to easily interpret those results and carry out those security processes to create merge requests for security updates and carry out a lot of those activities themselves.

[0:14:54.4] Guy Podjarny: Yeah. Well, that’s awesome. I’d love to agree and see that. I do think that’s the trajectory. What do you think, then, about the person wearing your hat — does that role go away, or what would that future clone of yours be doing then?

[0:15:08.3] Nick Vinson: I think, it might look similar to how it is with QA, where you’ll still have a lead, you’ll still have someone who runs that department. Instead of having manual testers integrated into every team, you’ll just have a QA automation lead who’s coordinating activities, making sure that’s working within a good framework. I think that’d be similar to this for security and I still think you’ll have security teams, but I think the security engineers will be able to spend their time researching and also facilitating training. I think, just less hands-on day-to-day just specifically in projects to be making security changes.

[0:15:45.2] Guy Podjarny: Yeah, perfect. Well, Nick. Thanks a lot. Thanks for coming onto the show, for sharing this great insight about the journey, the learnings for it. Thanks for coming on.

[0:15:54.6] Simon Maple: Yes. That’s a great question, and I think there are plenty of very obvious expectations that we would expect in a couple of years. Docker and container adoption to continue to increase. I think some of the stats that Gartner have recently released about what they expect in the years to come of container adoption look to be coming true from these kinds of data points. I expect that to continue to grow as people in companies of all sizes adopt further.

A couple of things I would love to see, and I think will happen, is that people won’t just adopt the technologies, but they’ll adopt the correct practices that will require them to make best use of those technologies. We’re talking about automated pipelines. We’re talking about automated deployments. As a great part of having the confidence in those automated deployments, you need to make sure that security testing is at all parts of that pipeline.

I perceive the levels of automation increasing. That 30%, that third of people who are fully automated, I would expect that to grow in the next couple of years. As a result, that will — looking at the stats that show the amount of adoption in local development and IDEs — act as a catalyst to encourage developers to be more exposed, more visible to security practices, as well as processes and programs.

I absolutely see more of that shift left and I would expect to see, in fact, companies and organizations doing more testing in their local environments and early in that process than they would in the later stages because it is so cheap and it is so valuable for companies to test and get early feedback. I’d like to see that kind of a shift. It’s not far away now. We’re only 10, 15% away there now, so it only takes a little bit more to do that.

In terms of, with that automation, yes, we will absolutely see the speed at which people can react, to be much, much faster. It will be great to be able to see the speed of fixing critical vulnerabilities to be much, much closer to that one day. I would expect that one day or less to be much faster as fixing a critical security issue is just the same as fixing any other bugs. Security will be just another piece of that quality testing for a developer. It will be much more ingrained in the developer’s day-to-day life.

A couple of things before we jump on: the misconfigurations and the known security vulnerabilities. It’s a really interesting one. It was very interesting to see as one of the big highlights that people are very concerned about it, and people are seeing more incidents.

I think that misconfiguration in particular is a very hot topic right now, and a very strong area. People are adopting IaC quickly but need to make sure that it is done in a secure way. We’re seeing the headlines. We’re seeing companies look as to what they can do to fix these issues.

I think, I would expect this to actually come down in terms of being a concern and down in terms of – well, maybe it will continue in the incidents as this is still growing. But I would like to see that actually come down and I’d like to – that may sound a bit weird, given that we’re growing more and more into this space, but I think the security tooling and the abilities to test and understand where those misconfigurations are, I see that as such a core area of learning that that is something that will be dealt with in the next couple of years by organizations.

They’ll see that as a critical thing that they need to jump on and fix, and that will become less of a concern because more organizations are dealing with it and tackling it head-on.

[0:19:35.9] Guy Podjarny: Yeah, very much hope so. I mean, I think it’s an area that would be just as important, so it will probably take a while to kind of rein in the chaos, but at least people would sort of feel like they’re better equipped for it.

[0:19:45.3] Guy Podjarny: Very cool. You know, definitely some optimistic perspectives around how developers would own security more, which I subscribe to and hope they will actually happen. We’re definitely doing our share here and trying to make that a reality. The next section talks less about security and more about really the increased fragmentation in the DevOps scene that we’re going to need to work with.

Both of those are actually coming from people representing companies that are not security-focused but rather indeed in the DevOps scene, so that’s the reality we need to live in. We’re going to have Justin Cormack from Docker and Liz Rice from Isovalent, who also chairs the CNCF Technical Oversight Committee, talk about this different fragmentation that we might see in five years’ time. Let’s hear them out.

[0:20:29.8] Justin Cormack: We have already made some progress towards fixing these things. I mean, I think we’re seeing a Cambrian explosion in software, and there’s so much more of it, and it’s becoming much more diverse, and in five years’ time, we have to be able to manage this better, because it’s just going to get more and more messy and complex.

I think about the shipping industry and containers quite a lot because I like history, and I mean, we’re in the container business. If you think about the container shipping industry, in software terms, we think a lot about containers as these boxes, and I think the first years of the container were all about, like, “Can you escape from the box?” Runtime security-type things.

But the value in containers in the shipping industry was not the box, it was the supply chain and the modern industry that it enabled. Everything changed because you could get things from China to Europe, fast and cheaply. You could build things where you used a lot more external components, and it was more reliable, and we built all the machines for getting them off the ships, we built gigantic container ships, all those bits of infrastructure that we built around it.

People think that containers are about the container and the runtime, but they’re really about the content of what’s in the containers, the things that you’re using to build in your software from, that reach you in containers. I think that we haven’t made a lot of those sort of industrial processes kind of easy with containers yet.

We’ve spent a lot of time thinking about the boxes, and those things and not about helping people build stuff using the supply chain that we’ve built from modern software. The Cambrian explosion kind of metaphor, I think someone came up originally with like JavaScript frameworks, but every area of software, there is more and more diversity and we have got to be able to manage it. The Cambrian explosion lasted for 20 million years. There’s just a huge amount of stuff going on.

[0:22:46.9] Guy Podjarny: It’s not going to go away anytime soon. I think it’s well said, and we focus on the new technology bits and how to use them but really, it’s easy to forget that containers are still relatively new to the ecosystem, and that the practices around them have evolved. It’s not just containers, it’s indeed kind of a holistic change of process and culture and means of software development.

We have to sort of think about problems a bit more from first principles and understand what’s the right way to approach them, versus constantly iterating and kind of retrofitting some past processes to address them. That’s very, very well said here. Justin, it was great to have you on. Thanks for coming on to the show.

[0:23:24.6] Liz Rice: Five years is such a long time in this space, isn’t it? Five years ago, Kubernetes was barely, you know, it was just a new thing. We didn’t know –

[0:23:35.0] Guy Podjarny: Like eBPF was.

[0:23:36.5] Liz Rice: Yeah, absolutely. Setting aside whether or not we’ll have the climate apocalypse in that timeframe, let’s hope that doesn’t happen. I mean, we’re seeing such rapid adoption of cloud technologies. I think it will be table stakes for any company to be just using the cloud. We won’t be talking about early adopters anymore for sure. It’s so difficult to say. I’m sure we will see huge advances in observability.

We’ll see massive scale - we’ll see the proliferation of things like edge and the internet of things. So many more entities that we’ll need to sort of monitor, and secure and control. I think that will add a whole new layer. I think things are pretty massive scale already, and I think that’s only going to explode over the next few years. We’ll all have our work cut out for us, building the tools that will manage that, and secure that and allow us to keep an eye on what’s happening.

[0:24:49.7] Guy Podjarny: Yeah. Sort of seeing it getting messier before it gets cleaner, I guess in the evolution. More and more moving parts.

[0:24:56.2] Liz Rice: Yeah. I mean, everything is always a bit of an arms race, isn’t it? Things get worse and then people build a solution, and then things get worse and people build a solution. I suspect that will be true with the scale of internet-connected devices.

[0:25:09.6] Guy Podjarny: Indeed. Liz, thanks for coming on the show. This was a great conversation. I do fully believe we probably could have spoken for another hour and still only scratched the surface on a lot of things, but thanks for coming on and sharing more knowledge and views with us.

[0:25:25.3] Guy Podjarny: Yeah, so it seems like we’re going to have some more complexity to deal with as time goes on. I’m sure we’ll do well. The next couple talk about a slightly bigger picture, and they talk more about regulation. More about how the government or the public markets need to get involved in security — or, you know, might actually do so. And those two comments come from Geoff Belknap, who is now the CISO of LinkedIn, and Tim Crothers, who runs security at Mandiant.

Let’s hear those two bigger picture perspectives about how regulations might be different in five years’ time.

[0:26:00.7] Geoff Belknap: Yeah. I think the abstract position I have on this is, I expect within the next five years that there will be more regulation around this role. If you had asked me this when we did this before, I probably would have said, “Ah, maybe 15, 20 years, there’ll be sort of a Sarbanes-Oxley or something like that for CISOs.” But I think now, as things are changing really rapidly and we’re seeing direct impact for things that used to be nuisance attacks on critical infrastructure, and the way that you use customer data being so heavily regulated or so under scrutiny, I suspect in the next five years you’re going to look at something not unlike the CFO’s role or the general counsel’s role, where the CISO has some regulatory accountability for how the security program operates in an organization.

I don’t know what that’s going to look like or where it’s going to come from first, but I can’t help but look down the road and see that like that’s going to come. A CISO is going to be on the hook for these things directly and directly accountable to regulators and lawmakers. I don’t know how I feel about that but I definitely feel like it’s coming.

[0:27:04.0] Guy Podjarny: That’s a really interesting perspective. You expect this to come with transparency as well, kind of reporting just like you do for accounting, your kind of method of handling vulnerabilities?

[0:27:13.5] Geoff Belknap: I think, look, Alex Stamos talks about this a lot, and I think it really resonates with me. Pre-Enron, CFOs didn’t really have skin in the game. Your auditors didn’t have the same skin in the game that they do now, at least in the US markets. I think the CISO today, there are a lot of companies that hold them accountable. Certainly, in the financial services space, the State of New York is definitely on a path to making the CISO somebody that has to sign off on audit reports or has to have a transparency report about their cyber security activities that comes out on a regular basis.

I see that all as foreshadowing towards, especially for public companies or companies of a certain size, they’re going to be required to have a CISO, probably required to have somebody that has cyber security experience on the board, and then be required to report something either in their regular quarterly filings or sort of an annual report as to the state of security in their organization. I don’t see how we, as a society, leaning so heavily on technology and everybody’s data being something that is part of their daily lives, get to avoid something like that. I think transparency is good. I think that makes us all better.

[0:28:15.4] Guy Podjarny: Yeah. I fully agree with that. Geoff, this has been a pleasure. Thanks a lot for coming onto the show and sharing all these great insights and learnings.

[0:28:22.7] Guy Podjarny: I found those comments on regulation very, very interesting, and I think it could very well happen that we end up living in a different regulatory environment, which you can argue whether it’s good or bad for security, but it’s definitely a change in our reality. For the last two, I want to finish off with some optimistic predictions about how, if we put on the sort of rosy glasses, things can actually turn out very, very well and we could be in a much better place in five years’ time.

We’re going to hear Dev from Figma and DJ from VillageMD talk about how things might be kind of cool in five years’ time — it might be pretty good. Let’s hear Dev and DJ.

[0:29:00.3] Dev Akhawe: I would say, let me break it down into maybe cultural, technical and people. I think the cultural change I think will happen is, more and more people would say, “Security has to be a positive for us. Security has to be a team that, as they call it, ‘solves for a yes’ rather than ‘figures out a no’.” That cultural relationship between security and the rest of the company and rest of engineering, and security being seen as the enabler and a positive force in the company, will keep growing, and that, I’m excited for.

I think on the technical front, I touched on this, but the ecosystem around security just keeps improving. It’s an exciting time to be in security. The different startups and tools mean we can do more and more powerful things. In particular, one of the areas I’m excited for is data. I think security teams right now don’t use data as widely as they should, because a lot of our tools haven’t been great and logging was expensive.

But with modern data infrastructure, for the same reason everyone else is able to use insights from data, I think security teams will also grow. We’ll have logs and analysis, and machine learning models or something similar, that will be much more powerful on that front. Even in AppSec, I think a lot of the tools around static analysis, dynamic analysis, and automated testing will just become more and more powerful. I’m excited for that.

I think around people, honestly, every year as we hire, I meet new grads and new engineers joining security teams. It’s easy to forget, but 10, 12 years ago, new grads did not join security teams. New grads joined software engineering teams. But every year, new grads and new engineers join security who, and I feel like an old man now, are fresh out of school and know more about security than I did after my PhD.

They’re so smart, they’re writing amazing systems, and they have a passion for fixing security. I think the security ecosystem will have a lot of really talented people doing a lot of amazing work, with passion and a real engineering culture. That’s going to be amazing as well. Yeah, I’m super optimistic about the future.

[0:31:08.8] Guy Podjarny: Yeah, that’s awesome, and that’s very reassuring, because I do think you have a pulse on things. Hopefully, that’s a good crystal ball, and I think all of them make a lot of sense. Dev, this has been a true pleasure. Thanks a lot for coming onto the show.

[0:31:22.6] DJ Schleen: That’s a deep question. I think the reality, if I could look into a crystal ball, would be that they have a single view of the overall security and quality state of their applications as they deploy, right? The utopia for me is to get to a point where your system gets deployed, with multiple microservices, UI, data, whatever it is, and you have a holistic view of all the vulnerabilities and all the issues across supply chain, code quality, infrastructure-as-code configuration, all in one place.

So, I think five years from now, we’re going to get to that. I think that’s where things are starting to go, and where the business is looking. I want a number. I want to know what my risk is for this app, right? Because it all comes back to this: you start looking and say, “Gosh, I have some certification books here,” like this Certified Information Security Manager one. It’s great, but it barely talks about software.

It’s fantastic for things like tracking locks on doors, but there’s almost nothing in there about actual software. If we take the risk from there, we can start running through a risk calculation and really get an idea of what our risk exposure is for our organization, from a monetary, non-monetary, or legal and compliance perspective, right? I think people won’t have a problem getting to that information.

[0:32:52.6] Guy Podjarny: I think that’s a great, optimistic crystal ball.

[0:32:56.8] DJ Schleen: I’d love to see it.

[0:32:59.0] Guy Podjarny: I’m with you. I relate to it. I hope it happens.

[0:33:03.2] DJ Schleen: I need to make sure it happens for us. Talking with people in the industry who are building tools, and seeing the amazing creativity they come up with to solve issues we never even knew we had, I think we’re just going to have safer software. I hope the tables turn a little bit on the attackers, so that we’re not reactive anymore, we’re proactive about security, and the results show it.

[0:33:28.7] Guy Podjarny: Yeah. That’s definitely the only path to take, really, the only way to raise the bar on security.

[END OF INTERVIEW]

[0:33:35.0] Guy Podjarny: That’s it for today’s episode. I hope you enjoyed these peeks into the future that our guests gave us. I love these types of questions, and I don’t want to hold back any of the brilliance of the guests I bring on. I just want to get their perspectives, and I really enjoyed asking this question.

This year, I’m going to be asking a different question, so stay tuned to see what that is. I hope it gives us just as much insight and variety of answers as this one did. Thanks again for listening, and I hope you join us for the next one and for the whole year that’s ahead of us.

[OUTRO]

[0:34:12.7] ANNOUNCER: Thanks for listening to The Secure Developer. That is all we have time for today. To find additional episodes and full transcriptions, visit thesecuredevelop.com. If you’d like to be a guest on the show or get involved with the community, find us on Twitter at @devseccon. Don’t forget to leave us a review on iTunes if you enjoyed today’s episode. Bye for now.

[END]