Season 4, Episode 34

Positive Security With Siren Hofvander

Guests:
Siren Hofvander

In episode 34 of The Secure Developer, Guy speaks with Siren Hofvander of Cybercom about her enlightening journey from the digital medical space to running a secure developer consulting team, as well as her empathy-driven ethos in the one-size-fits-all security world.

“SIREN HOFVANDER: I tend to say that the security team is kind of like a seat belt. It's not the seat belt that prevents you from driving 900 kilometres an hour, but it is the thing that will keep you safe if something bad would happen. The seat belt in the spaceship looks really, really different from the seat belt in the car, but they both have the same function of keeping the driver and the vehicle safe. I've never met a developer that comes to work and says, ‘You know what? I don't care about security.’ But what they are classically lacking is the ability to make what they're already doing visible and seen.”

[INTRO]

[00:00:34] GUY PODJARNY: Hi. I'm Guy Podjarny, CEO and Co-Founder of Snyk. You're listening to The Secure Developer, a podcast about security for developers covering security tools and practices you can and should adopt into your development workflow. It is a part of The Secure Developer community. Check out thesecuredeveloper.com for great talks and content about developer security and to ask questions and share your knowledge. The Secure Developer is brought to you by Heavybit, a program dedicated to helping startups take their developer products to market. For more information, visit heavybit.com.

[INTERVIEW]

[00:01:07] GUY PODJARNY: Hello, everyone. Thanks for tuning back into the show. Today, we have a great guest, a CSO, a Security Pony, and many other things. We're going to talk about that in a sec. We have Siren Hofvander on our show. Thanks for joining us, Siren.

[00:01:20] SIREN HOFVANDER: Hi. Thank you very much for having me.

[00:01:22] GUY PODJARNY: Siren, before we dig into sort of the topics that we like, there's a lot for us to discuss, tell people a little bit about who you are, how you got into security, and kind of what you do these days.

[00:01:34] SIREN HOFVANDER: I got into security a long, long time ago. All of my friends at the time were developers, I was the only girl, and I was a difficult child with difficult teenage years. It turned out that I was really good at puzzles, and I got into finding where my friends missed things, so to speak, which I later learned was pen testing. I kind of fell into it because I liked making fun of my friends' mistakes. It wasn't a glorious start; it was just the fact that I was a brat.

I started as a pen tester, then became a systems administrator. I was turning on people's lights at three in the morning, super cranky and super salty. As many people do, I got really lucky: I knew a guy who knew a guy who was looking for a blue team security person. I came in that way, through luck, and I started in secure development because I knew a guy who knew a guy. I am even luckier that the guy who took me on from my crazy red team background was a saint, tamped down my wilder side, and helped me fall in love with the blue team and with building things from a secure perspective.

I was really tired of finding the same bugs over and over again. It got to a point where you feel like you're kicking someone who's already lying down. It broke my heart. I did that, then worked at QlikTech, Securitas, and Verisure. I did their alarm systems; I was their product head for the Northern systems. From there, I went into the digital medical space. I was the CSO for a company called Min Doktor for a couple of years, which was super cool. I've recently left that, and I've started a secure development consulting team, which is also super cool.

There are a lot of companies that can't really afford an entire security team, so I've set up a model where you can hire chunks of a team's time across a range of capabilities, as opposed to deciding, "We need 30 different skill sets from 30 different people," and a year later you're still not done hiring. I'm continuing my journey of trying to help people stand up.

[00:03:43] GUY PODJARNY: Perfect. I think maybe we start from the approach, from people rather than the tech, or maybe from the job-definition aspect of it. When you talk about doing security, you mentioned the blue team, going into protection or defense, versus the one finding holes, the pen testing, the red teams. What is it that security teams do in the first place? How would you even define the job?

[00:04:12] SIREN HOFVANDER: I would actually avoid defining the job, because I think a lot of the problems come from a definition. Security has suffered from being something that a certain set of people do, whereas security is found in everybody's job. Everybody has a little bit of security in their job, because everyone understands, "These are the risks within my particular function."

If you expand on that to take the security team's job, I would say it's helping people to find and identify the risks within their own areas and being kind of a supporting function for them in that journey. That looks different if you're helping the finance team or if you're helping the office administrator or if you're helping a developer, but the function is the same. You're helping somebody else become more risk-aware and to treat those risks in a relevant manner.

How you do that can vary vastly, and it should vary vastly. It's the one-size-fits-all approach, where we're going to document the problem to death or scare you to death with our pen testers, that has made things really, really difficult. I tend to say that the security team is kind of like a seat belt. It's not the seat belt that prevents you from driving 900 kilometres an hour, but it is the thing that will keep you safe if, God forbid, something bad happens.

The seat belt in a spaceship looks really, really different from the seat belt in a car, but they both have the same function of keeping the driver or the passenger in the vehicle safe.

[00:05:36] GUY PODJARNY: Is it the security team's job to help build the seat belt, or are you the seat belt? When you are the security team, how much are you advising, consulting, and making people aware of the risk, versus hands-on creating, providing tooling, and doing some of that work?

[00:05:58] SIREN HOFVANDER: I think in a utopian scenario, we would be helping other people to create their own seat belts because everybody is an expert in their own particular area, and the best seat belt is always going to be the one that's the best suited for the context in which it's in. That being said, security's done a really, really poor job of marketing how awesome it is. A lot of the hands-on stuff, a lot of the hands-on know-how, it just isn't there.

Currently, the reality is I do a lot of stuff with automation. We do a lot of stuff in infrastructure. We do a lot of stuff with configuration and training and risk awareness and documentation.

But in a utopian state, I want to be the person helping somebody else to make a seat belt, because one of the things that has always really bothered me with security is the idea that you have to know this much to be secure. Just saying that means we're saying that some people are unworthy of security. I like the seat belt analogy because a seat belt doesn't ask whether it's a good or a bad driver or a good or a bad person driving the car. The level of security should be the same. I really dislike the idea of blaming the user, blaming whoever it is for not doing something.

An agnostic consulting role is the ideal state. The reality is I have literally picked up someone's dry cleaning to help them have more time to do security. I have written the YAML files. I have done automated testing. I will literally clean your bathroom because the end goal should be a more secure system. So I'll do whatever it is.

[00:07:26] GUY PODJARNY: You're very much describing security as this supporting entity. Oftentimes, in many organizations, it's not like that. People feel security is the big stick, right? It's the one beating you over the head when you misbehave, rather than something that's looking to uplift you.

You've got the Security Pony handle. You're very much about positive security. How is that different? How do you approach implementing this notion of support or positive security when you work with development teams?

[00:08:01] SIREN HOFVANDER: I've led a couple of teams, and I've always said that our job is always going to be to keep showing up, because security has done such a poor job of marketing, and it's taken such an aggressive, offensive tone for such a long time, that people are afraid of what we do. We can see that as unfair and scream into the night, or we can be the person that's constantly showing up, picking up your dry cleaning.

Because if our end goal is to have a secure system, then that has to be an agnostic approach. It can't be, "I'm only going to do it if you're nice to me," because that's not agnostic. You have to give people space to be afraid of what you do, because that fear comes from history. It's constantly showing up with, "Okay, so this is where the problem is, and this is how I'm going to help you fix it," and looking for context, looking for something to make that person feel like you're there to help them.

That is truly what security's function is. We are there to help, but we've done a poor job of presenting that help. We've hidden it behind pen testing and documentation and being the big stick. If you look at a lot of the really, really big security fails, I can guarantee that there were people on the inside who knew about them but were too afraid to go to the security team or to tell somebody. Those failures became worse than they needed to be. Had there been a culture of, "Hey, if you see something, say something. We're here to help you. We can do this together," I think a lot of those could have been handled in a much better way.

[00:09:27] GUY PODJARNY: Practically speaking, when you run a team, how do you handle the typical size delta? Typically, there's one security person to many others. If I focus on developers, because that's the core principle, there are many employees but also specifically many developers. How do you do that? I mean, you can only pick up dry cleaning for so many developers, right?

[00:09:50] SIREN HOFVANDER: Exactly. Well, let's say the ratio is 1 security person to 100 developers, probably. One of the biggest areas where I see security teams fail is that they try to apply more security onto a development team that is probably overburdened as it is. Even if it is 100 people, they're working full-time. They're already doing something full-time. So you need to find ways of scooching security into what they're already doing, rather than saying, "Hey, we're going to run static code analysis in a tool that's going to slow down your build by X minutes." That's an extra process where they're not going to see any return on investment or anything that actually helps them.

Rather than trying to take a standardized approach, you can do a lot of really good security tests in unit testing. You can use dev tools. You can use a lot of the tools that developers are already using. Rather than taking a "these are our square Excel requirements" approach, you can look at what they're already doing and shape security requirements to reflect that better, because I guarantee that a lot of development teams are already doing a lot of security work. The problem is that it's not visible. They might have really good session handling, even though no session security requirements have ever been written.
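
To make that concrete, here is a minimal sketch of what a security requirement can look like when it lives inside an ordinary unit test. It is an editor's illustration rather than code from the episode; the setSessionCookie handler is a hypothetical stand-in for an application's real login code, and the test simply asserts the Secure and HttpOnly flags that good session handling implies.

```go
package session_test

import (
	"net/http"
	"net/http/httptest"
	"testing"
)

// setSessionCookie stands in for the application's real login handler.
func setSessionCookie(w http.ResponseWriter, r *http.Request) {
	http.SetCookie(w, &http.Cookie{
		Name:     "session",
		Value:    "opaque-token",
		Secure:   true,
		HttpOnly: true,
		SameSite: http.SameSiteLaxMode,
	})
}

// TestSessionCookieFlags encodes a session-handling requirement as a plain
// unit test, making security work the team already does visible in CI.
func TestSessionCookieFlags(t *testing.T) {
	rec := httptest.NewRecorder()
	req := httptest.NewRequest(http.MethodGet, "/login", nil)
	setSessionCookie(rec, req)

	cookies := rec.Result().Cookies()
	if len(cookies) == 0 {
		t.Fatal("expected a session cookie to be set")
	}
	if c := cookies[0]; !c.Secure || !c.HttpOnly {
		t.Errorf("session cookie must be Secure and HttpOnly, got Secure=%v HttpOnly=%v",
			c.Secure, c.HttpOnly)
	}
}
```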

Why don't you write those requirements and bring that effort that's already being done to the light of their managers, their bosses, whoever the case may be? Because it's never the case that 100 people think, "Clear text is awesome, MD5 is the way to go, and whatever, throw it over." That's not the case. But the work that they're already doing should be made visible so that they can get credit for something they're already doing, as opposed to you showing up and saying, "All of the standard security tools aren't here. The standard way of working isn't here. Therefore, you're doing nothing." That's simply not the case, and it sets you up for an immediate adversarial relationship.

I've never met a developer that comes to work and says, "You know what? I don't care about security. I'm never going to do anything. I just want it all to be wrong." That's not the case. But what they are classically lacking is the ability to make what they're already doing visible and seen.

[00:11:56] GUY PODJARNY: I think I understand and relate to the notion of celebrating the work of developers for the good security they do. Maybe let's hone in on that, and then later come back out and talk about the tips and tricks on how you prioritise. Give us some examples of how you've celebrated success. You highlighted the session handling example or the like. Give us a few other examples of this positive mindset.

[00:12:21] SIREN HOFVANDER: Well, we have Cake For No Reason Day. We've had cake for all the things that didn't happen, because the system didn't fall over, so let's celebrate that. I am a big fan of trying to spread knowledge in a way that people actually understand. One of the prides of my life is when developers have made a bug and I've helped them see how to fix it, and then I see it come up in a code review at a later stage, and they're helping another developer not make the same mistake, without involving my team at all. They're like, "Oh, you know, I'm never going to write this open redirect again." Rather than, "Oh, you have to talk to the security team. You have to do this. You have to do that," they're taking ownership of that issue and making sure the next person in line doesn't make the same mistake.
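
As an aside, the open redirect fix she describes developers carrying forward tends to look something like the following minimal sketch. The safeRedirect handler and the "next" parameter are hypothetical names for illustration; the point is simply to reject absolute or protocol-relative targets so the redirect can only stay on the current site.

```go
package main

import (
	"net/http"
	"net/url"
	"strings"
)

// safeRedirect only follows redirect targets that are relative paths on this
// site. That is the usual guard against an open redirect: an attacker who
// controls the "next" parameter can no longer send users to another origin.
func safeRedirect(w http.ResponseWriter, r *http.Request) {
	target := r.URL.Query().Get("next")
	u, err := url.Parse(target)
	// Reject unparseable values, absolute URLs ("http://evil.example"),
	// and protocol-relative ones ("//evil.example"); fall back to home.
	if err != nil || u.IsAbs() || u.Host != "" || !strings.HasPrefix(u.Path, "/") {
		target = "/"
	}
	http.Redirect(w, r, target, http.StatusSeeOther)
}

func main() {
	http.HandleFunc("/go", safeRedirect)
	_ = http.ListenAndServe(":8080", nil)
}
```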

I make sure that when I talk about security success, I don't talk about my team. I talk about the person in the development team who found the security bug. Hey, did you know that such and such had this really good idea? How can we make sure that such and such really gets time for this idea? It needs to highlight their work first, because we're a supporting function. That's not an ego thing. It's not an us versus them. It's us together, where what we're bringing to that environment initially is the ability to highlight somebody else's skill. A lot of the time, what's stopping development teams is confidence. Taking that first step is really difficult because there has been a lot of negativity. Security is the buzzword. It's the cool thing to do. No one really wants to raise their hand and say, "This is something I already do," unless they have a lot of courage or craziness, or they're on the dedicated security team, because somebody could call them out, and that's a really difficult thing.

What you can do is give them the confidence to say, "Hey, I'm already doing this, and it may be small, but at least it's something." This is also something where you can come in and say, "Okay, management is saying we have all these requirements, but the development team gets no extra minutes to train, to learn, to do anything. Let's use our big stick for good and go beat up on the management to actually give them time to do it."

[00:14:27] GUY PODJARNY: Yes. This really all boils down to being the developer's champion, which is a very positive thing. Highlight their work. How do you equip your team to help? You described things like getting right down to unit tests, or uncovering a bunch of the things developers are already doing. Would you hire more coders or more pen testers, in terms of background, for a security team that has that approach?

[00:14:54] SIREN HOFVANDER: I would hire the person with empathy. I've hired a lot of people, and the first quality I always look for is empathy. Whether you're a pen tester or a coder, I've hired really good people who can't code at all but still work really, really well with development teams. But it has to be a person with empathy who's willing to pick up the dry cleaning. As a supporting function, we have to keep showing up. It might take three months for the development team to lower their guard and really let us in. But once that's done, you can have year after year of a positive relationship with that team. It means you have to keep showing up for those three months and make sure that you worm your way into processes that may already be there.

It also means that you can't be a certain type of person. Here's where I see a lot of people who come from the really, really offensive side fail. This cross-site scripting bug is really, really bad, sure. But if it costs more to fix than the exploit would actually cost you, then beating the developer over the head with it is never going to be a positive.

[00:15:57] GUY PODJARNY: Yes.

[00:15:58] SIREN HOFVANDER: I don't think I have a standard of "it has to be this type of person." It has to be a person with a lot of empathy who's willing to go the extra mile to solve the larger issue.

[00:16:10] GUY PODJARNY: This is inside the organization, right? We've identified three parties here so far. We talked about the security team having high empathy, and having whatever variety of skills is right in your constellation to be able to apply that empathy and help out. Then you have management, which might still need the stick, in the sense of, "Hey, you need to make time and budget for this because this matters." Then you've got the developers, whose successes you want to celebrate because they've got enough work on their plate as is; you highlight successes rather than focus on failures.

What about the compliance bit? What do the auditors think about this type of approach? When an external entity comes along with their stick, how does that translate into the surrounding?

[00:16:55] SIREN HOFVANDER: I actually really like compliance work, and I think compliance is another area that's been misappropriated by big stick people, people who can, and I'm really generalizing here, hide their lack of work behind a really big pile of documentation. To go back to my seat belt example, compliance is the thing that says the seat belt has to go over the person, has to be connected to the car, and actually has to work when the car stops. All of those are compliance regulations written into law.

If those regulations didn't exist, it would be good enough, to stretch my example a little further, to throw a seat belt in through the window of the car, and that would be the end of the story. There's no one I can think of who thinks that's a good solution for a seat belt. Compliance has this image of, "Oh, you show up in a tie, and you're a jerk." I know a lot of compliance people who are like that. To them I say, "Peace be on you. I'm not here for that."

But compliance can also be, "Hey, let's make sure that we take the integrity and privacy of our own staff seriously. Let's make sure we take the banking information that they give us into consideration. Let's make sure we treat their health data correctly." A lot of these compliance rules aren't actually written as, "You must do this thing in all circumstances, always." That's not what they say. They say, "Based on the risk." A lot of what you can do with development teams you can actually do across a company, because it's based on the risk. A lot of it is: are you aware of what you're actually doing?

When I did pen testing, one of the things that actually gave the greatest return on investment was simply, "These are the systems you have." A lot of companies don't know what systems they're actually using. That shouldn't be a huge moment; it should just be a, "Yes." If you have packages that are super vulnerable, as long as you know about them and are treating them with due care, that's probably going to be okay.

[00:18:48] GUY PODJARNY: Yes. I guess that's another aspect of the security team, or whoever it is that's handling compliance: you have that perspective, and you apply it internally.

[00:18:56] SIREN HOFVANDER: Exactly.

[00:18:57] GUY PODJARNY: Sometimes, with the auditor who comes in to audit your company, you don't necessarily have as much choice. Basically, somebody needs to explain and bridge those gaps. The security team needs to be able to map those risk elements and explain them, even if that conversation is a little more complicated than, "Yes, I put this breaking check or this hard constraint into my development process."

[00:19:22] SIREN HOFVANDER: Exactly. The benefit of showing up and being a supporting function is that if you have a positive relationship with your developers and your QA staff, they're going to tell you where all of the bodies are buried. When you're sitting there with an auditor, and I've sat with a couple who are, shall we say, less friendly and less solutions-oriented, you can definitely make sure that you present information in a way that caters to the audience you're speaking to.

It comes back to the empathy I was talking about. It's their job, and they're risking their own certification if they certify a company that's wild and crazy. So how can you make their job easier? They're probably not technical, so sitting them down with a giant dump of a configuration file isn't going to help them. But walking them through how the configuration file is created, how you make it safe, and the training the staff has is going to make them feel like, "Yes, this is a company that knows what they're doing." They'll lower their defenses as well.

There are jerks out there, but I would say that the broad mass of compliance people that I know, they just want to make sure that they're doing their job. They're not there to eat you either. I would say compliance, it’s a natural part of my job. But it should be a natural part of everybody else's job, too, because no one just tosses a seat belt through the car window and is like, "Done." No one does that.

[00:20:40] GUY PODJARNY: Yes. This is an exciting model to run on. What are the primary challenges you came across when you were implementing this, or that you see around, and, I guess, what solutions?

[00:20:54] SIREN HOFVANDER: I think there were two major ones. I think the first one was, "It will never happen to us ever because it never has." Which, first of all, most companies wouldn't know, and second of all, isn't a good indicator of anything. That's a really, really difficult one, especially if you're working with teams where it's the developers who've written the first lines of code, and it is their baby, and they will defend it as if it is their child. The second one is we say it's really important, but it's not important enough to get time, budget, or emphasis in any real way from anybody.

[00:21:29] GUY PODJARNY: Yes. Let's look into solutions for those for a sec. The first one was more developer-oriented. I mean, what have you seen to be successful?

[00:21:35] SIREN HOFVANDER: I think it's not even developer-oriented, but let's focus on the developers because they're more interesting. I would say you just don't even go into the argument of whether it'll happen or it won't. I don't engage in that argument, because I don't have a crystal ball or magical powers, no matter what myths about me exist. I go at it from a hygiene factor: assume this is a thing that could happen, and ask how we can tackle it to make sure it's less crappy if it does. Go at it from a problem perspective rather than a probability perspective.

If you look at cross-site scripting, you can do a lot of that with frameworks and validation frameworks. How can we make sure that the traffic that's passed from endpoint to endpoint, whatever the case may be, goes through a filter? Make it easier, so that rather than having to validate every input, you can do it from a framework perspective or from a back-end perspective and make it a development architecture question rather than a point fix of, "This is where this particular bug is." Then it immediately goes from solving something that could possibly happen to making our product better.
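
One concrete reading of that architectural approach, sketched by the editor rather than taken from the conversation, is letting the framework do the output encoding everywhere. Go's standard html/template package, for instance, contextually escapes every interpolated value, so cross-site scripting is handled once at the template layer instead of input by input. The /search handler below is a hypothetical example.

```go
package main

import (
	"html/template"
	"net/http"
)

// With html/template, cross-site scripting defense moves into the framework
// layer: the engine contextually escapes every interpolated value, so no
// individual handler has to remember to encode user input by hand.
var page = template.Must(template.New("page").Parse(
	`<p>Search results for: {{.Query}}</p>`))

func search(w http.ResponseWriter, r *http.Request) {
	// Even a payload like <script>alert(1)</script> renders inert, because
	// escaping happens once, architecturally, for every page that uses it.
	_ = page.Execute(w, struct{ Query string }{r.URL.Query().Get("q")})
}

func main() {
	http.HandleFunc("/search", search)
	_ = http.ListenAndServe(":8080", nil)
}
```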

It happens to be cross-site scripting, for me, and that'll probably be what I may emphasise in a report or whatever the case may be when I'm talking to an auditor. But if I'm talking to a developer, I want to give that person a problem that they understand and are passionate about, which is craftsmanship. It could be hygiene. It could be performance. It could be the latest tech.

I had a really interesting conversation the other day about authentication and serverless systems, where we were looking at security requirements for authentication, but the system was serverless. How can we have a threat model for that? It had absolutely nothing to do with the security bug that I found. It was: assuming that we were going to do this, how could we do it better? If it gets into crystal ball territory, the "that'll never happen" argument, you can look at risk forecasting and make some really educated guesses. The problem becomes, if it doesn't happen, you're going to be the little girl who cried wolf.

A key risk is you have to be really careful and pick your battles. It can't just be, "Oh, the OWASP Top 10 says so." It can't just be the latest exploit on Twitter. It can't be, "My friend does it." It has to be something that actually delivers value, and you have to be really, really careful that the latest glittery security whatever-it-is isn't something you always jump on.

[00:24:04] GUY PODJARNY: Yes. Do pen tests come in there? There's this notion of saying, "This is a real problem we've observed in your code." Do you need that? Or is that too little, too late?

[00:24:13] SIREN HOFVANDER: I think pen tests are interesting, and I think they're more often than not poorly used. Pen testing should be a validation of the flow from requirement to production, rather than an end-state check of, "This is everything," because it's not. You can take the most talented pen tester in the entire world with all of the tools ever, but you're paying that consultant for a time period. If that person has a bad week, they're looking in the wrong place. They can find things, but they may not be the right things.

If there's an overemphasis on pen testing in a black box or blind state, it can give you overconfidence in your security. You could also wind up fixing bugs that aren't actually problems. "Oh, we found this super exploit on a system that's going to be killed in three weeks." That's not relevant. Or it's a system that only two people have access to, and we know that because they have FIDO keys, and it's only this, that, and the other thing. I know the Yubico exploit just came out the other day, but the amount of effort to exploit that bug is so high that it might not be worth it to actually fix.

What you can do with pen testing that can be really interesting is this: we have had an emphasis on validation from an architectural perspective, so let's use pen tests to actually test that validation and verify the work that has been done on that architectural principle, validation for input fields, for example. How good was that work? Then the bugs that come up can be concrete. We use bluemonday, for example; did that actually work, yes or no? Then it's not the hacker out in Wonderland. It's that the developers who were trying to fix this architectural problem missed this particular part. How can we make this more practical and more focused on what's actually relevant for them?
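
A pen test focused on verifying that architectural work can then reduce to a sharp yes-or-no question, which a test can capture. The sketch below is an editor's illustration, assuming the Go bluemonday sanitization library she appears to reference, and checks whether a handful of classic payloads actually come out inert.

```go
package sanitize_test

import (
	"strings"
	"testing"

	"github.com/microcosm-cc/bluemonday"
)

// TestPolicyStripsActiveContent is the kind of sharp check a pen test can
// anchor on: did the sanitization layer the team built actually work?
func TestPolicyStripsActiveContent(t *testing.T) {
	policy := bluemonday.UGCPolicy() // permits benign user-generated markup
	payloads := []string{
		`<script>alert(1)</script>`,
		`<img src=x onerror=alert(1)>`,
		`<a href="javascript:alert(1)">click</a>`,
	}
	for _, p := range payloads {
		out := policy.Sanitize(p)
		for _, marker := range []string{"<script", "onerror", "javascript:"} {
			if strings.Contains(out, marker) {
				t.Errorf("payload %q survived sanitization as %q", p, out)
			}
		}
	}
}
```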

[00:25:58] GUY PODJARNY: Yes, okay. Interesting. In this context, I sometimes think about bug bounties as this less focused but constant check. I sometimes think of them as a Pingdom for security results. Like Pingdom, they wouldn't necessarily represent the user experience, but they're a bit of a barometer, I guess. How do you feel about bug bounties? Do you find them helpful, not helpful? How do you contrast them with pen tests or other team activities?

[00:26:26] SIREN HOFVANDER: I've run bug bounties, both public and private. A few years ago, I think they were really great. I think now they have been oversold, and they've monetised bugs in a way that I just don't see as sustainable, because it's $800 or $900 for a cross-site scripting bug. If that's the incentive for finding the bug, what is the incentive for the developer? Fix that bug in three minutes, or go home and earn $900 for not fixing it.

The economy of the bugs just becomes completely skewed, and there's been so much marketing around it that you get a lot of people coming into the bug bounty space who aren't ethical, who have wild ideas of, "I'm going to make crazy money," but don't spend the time looking for the crazy bugs. So it becomes, "I'm going to blackmail your company." All of the time to handle those people, to handle the PR, all of those hours from the security team cost something. It could be a really good way to do responsible disclosure and to give the security community a safe and sane way to report bugs. But I think right now, it's been so overly buzzworded that it's become really, really damaging.

I think it's also because there's been an overemphasis on that. When bug bounties first came out, it was super easy to get a budget for them. Now, a few years on, it's super hard, because the bugs are coming in, but they can't be tied back to requirements. They can't be tied back to business investment. They can't be tied back to strategy, because they're random bugs. Why are you paying for them? It can be an awkward conversation if your management team isn't versed in technical security, and most of them aren't. It's a hard one to sustain, but I like the idea.

[00:28:06] GUY PODJARNY: Very interesting. I guess maybe it's just an evolution element you're describing, around how you define the scope for which you will get paid for that type of activity. I guess that's indeed being worked out, both on the tooling side but mostly in the community-to-company interaction.

[00:28:22] SIREN HOFVANDER: Like I said, I've worked with many bug bounty people. I know a lot of researchers. I think responsible disclosure is something that's really good. I know a lot of people that report in because they genuinely just want to help a company to fix good problems. Giving them a safe and a sane way to do that, I am all for it.

[00:28:39] GUY PODJARNY: I think we've talked a fair bit here. If I wrap up, we talked about empowering developers relative to security. I think the strongest theme here is the security team as a supporting entity. Running a startup myself, this is very near and dear to my heart, because we have the same conversations about the finance organization, or any IT organization, or even the people and HR functions in organizations. They're really important entities, but their job is to help everybody else do their job better.

Taking that angle to security is helpful. Help developers celebrate their successes rather than bashing them over the head; you gave a bunch of good examples of that. Have the security team bring high empathy to be able to achieve that; I guess it all rolls into that. Then think about the practical application of that in the context of software quality, of functionality and correct architecture and feature building for your systems, as opposed to just fixing a pointed bug that happens to be a vulnerability. I think all of those are great advice. Thanks for sharing that.

Maybe teeing up a little bit: you're doing the security consultancy now, right? A bunch of companies indeed don't have that. I don't want to say they don't have the budget for it; maybe they just haven't chosen to hire yet. At a certain scale of budget, they might indeed just not be able to afford it. But they haven't yet built up a security team. How do you see that working? Is that part of what you're trying to do with this community?

[00:30:11] SIREN HOFVANDER: Yes. A lot of companies not only don't have the budget, but they don't even really understand what they're looking for. You see this with a lot of small to medium startups. It's kind of, "We know we're supposed to be doing something." So who that first hire should be is a really, really difficult question, because, "Okay, well, if we pick a technical person, they're immediately going to be overwhelmed, and they're not going to have time to do the compliance, the GDPR, the larger program, the awareness training. But if we pick just a compliance person, all of the technical stuff may not be possible."

It becomes this weird, "We're looking for the magical unicorn that has 18 arms and can do all of it," which is just absurd. I mean, I'm sure that person exists, but Google or someone else has probably hired them. The truth is most companies need a wide range of skill sets because there is a wide range of problems that need to be tackled. I have set up a team where we have security developers. We have security architects. We have people who are really good with encryption and with RF stuff. But we also have people who can do compliance. We have people who can do requirements and the larger program. At the C-level, I can talk to the board myself.

Rather than putting a company in the position of, "Okay, we need something, so now we have to spend six months trying to find these 20 people," it's a team that can help them stand up on their own two feet as a company, figuring out what that means for them. That might mean a team of consultants helping them over time. Security can look really, really different for everybody. It doesn't have to be, "It must be this type of person. It must be this skill set." It's that one-size-fits-all approach that makes it really, really hard to start, because so many people assume it must be a hire. I think for some companies that's probably not true.

[00:32:03] GUY PODJARNY: Yes. I think it's a really interesting angle and, frankly, goes hand-in-hand with that perspective on the supporting entity. When you say, "Hey, you're too small to have a CFO," or, "You're too small to have a full-time HR person," that's a bit more arguable, given you can always hire people. But still, a lot of companies take that approach, and definitely a lot of companies outsource IT. If you're not there yet, and if you have that notion of, "This is a G&A function or a general supporting entity that you need," why not have something that scales before you're able to either hire or fully staff the whole team?

[00:32:41] SIREN HOFVANDER: Exactly. I mean, I've hired a lot of people, and I'm the type of person who wants to be helping other people stand up. I know a lot of people in security have that mindset. So I think this is a good way to enable them to help people in a productive manner.

Security is a hard job, and it's hard to be the first person at a company, because you get overwhelmed, and burnout is a real thing. Having different options means that if you want to help people, you can do this type of work. This can't be the only consulting security team out there; I'm not that much of a genius. There have to be more. But I think the more options people have to bring security into their company, the better.

[00:33:20] GUY PODJARNY: Yes. I do think it's a new-ish model for smaller companies. The notion of getting security help before your first hire is not a typical one. I don't know if that's attributable to new approaches or just to the rising profile of security, where people simply appreciate that they need to do security more.

[00:33:37] SIREN HOFVANDER: I think security has been buzzworded. Now, it's GDPR scaring everybody and public cloud and all of this stuff. Yes, I think it's a new-ish model. But I'm hoping that it makes the community better.

[00:33:49] GUY PODJARNY: Yes. To an extent, GDPR is another one that's often spoken of in a negative sense, in that it created a lot of FUD. But at the end of the day, it's the thing that forces you to build the right seat belt as well, right? It has all the right intentions, and it has mobilised an industry that had been waiting and procrastinating on a bunch of these things.

[00:34:11] SIREN HOFVANDER: Exactly. I mean, I love GDPR. It has a lot of things wrong with it, but the intention of the law is good. The security teams can help people focus on that intention: "Hey, this will give us more time to do privacy work." I think most developers are excited about being able to do things well. I don't think anybody wants to mishandle people's information. We can use this as the ability to give them time to do that.

[00:34:35] GUY PODJARNY: Yes, indeed. Cool. Siren, this was a great conversation. We're already, as expected, going a little bit over time here. Before I let you go, I like to ask every guest coming on the show, if you have one bit of advice or one pet peeve or something you want to give or tell a team that is looking to level up their security, what would that be?

[00:34:54] SIREN HOFVANDER: Start small. Don't add extra tools. Don't add extra processes. Don't do any of that. Go and look at the ASVS and MASVS, which are standard secure development requirement sets. Look to find the security work the team is already doing, and find ways to highlight that. Otherwise, they're just going to get that post-conference burnout of, "I'm going to do all of this," and then they don't have time, and they never do it again. Spend time finding and emphasising the work that's already being done and build on that instead. Then you can add all the cool tools later.

[00:35:26] GUY PODJARNY: Yes. I love that. It's super practical. Well, Siren, thanks a lot for coming on the show.

[00:35:31] SIREN HOFVANDER: Thank you very much for having me.

[00:35:32] GUY PODJARNY: Thanks, everybody, for tuning in. Join us for the next one.

[00:35:35] SIREN HOFVANDER: Bye, everybody.

[END OF INTERVIEW]

[00:35:38] GUY PODJARNY: That's all we have time for today. If you'd like to come on as a guest on this show or get involved in this community, find us at thesecuredeveloper.com or on Twitter @thesecuredev. Visit heavybit.com to find additional episodes, full transcriptions, and other great podcasts. See you next time.
