In today’s episode of The Secure Developer, Guy Podjarny is joined by Dr. David A. Wheeler, an expert in both open source and developing secure software. David is the Director of Open Source Supply Chain Security at the Linux Foundation and teaches a graduate course in developing secure software at George Mason University. He has a PhD in information technology, a master’s in computer science, and a certificate in information security, all from GMU, and he is also a Certified Information Systems Security Professional (CISSP) and Senior Member of the Institute of Electrical and Electronics Engineers (IEEE). Today’s discussion revolves around open source software (OSS) security, in which David is an expert, not just from the perspective of consuming open source but also creating and even governing open source. Tuning in, you’ll learn about some of the primary security concerns in open source and the necessity of educating developers about secure software, and David shares some of the tools, tests, and initiatives that you can include in your security arsenal. Ultimately, David believes that knowledge is critical, and this episode will educate users and developers alike about common OSS vulnerabilities and how to counter them. Tune in today!
Season 6, Episode 91
Open Source Security With Dr. David A. Wheeler
Dr. David A. Wheeler
[00:00:15] ANNOUNCER: Hi. You’re listening to The Secure Developer. It’s part of the DevSecCon community, a platform for developers, operators and security people to share their views and practices on DevSecOps, dev and sec collaboration, cloud security and more. Check out devseccon.com to join the community and find other great resources.
This podcast is sponsored by Snyk. Snyk is a dev-first security company, helping companies fix vulnerabilities in their open source components and containers, without slowing down development. To learn more, visit snyk.io.
Welcome back to another episode of The Secure Developer. On today's episode, Guy Podjarny, President and founder of Snyk, is joined by Dr. David A. Wheeler. David is an expert in both open source software and developing secure software. He is the Director of Open Source Supply Chain Security at the Linux Foundation, and also teaches a graduate course in developing secure software at George Mason University.
He has a PhD in information technology, a master's in computer science, and a certificate in information security from GMU. He's also a Certified Information Systems Security Professional (CISSP) and a Senior Member of the Institute of Electrical and Electronics Engineers (IEEE). We hope you enjoy their conversation, and don't forget to leave us a review on iTunes if you enjoyed today's episode.
[00:01:45] Guy Podjarny: Hello everyone, welcome back to The Secure Developer. Thanks for tuning back in. Today we're going to talk about open source security, and not just from the lens of consuming open source, but also creating open source and even governing the whole body of open source. To discuss that and share what is being done and what can be done about this is David A. Wheeler, who is the Director of Open Source Supply Chain Security at the Linux Foundation.
David, thanks for coming onto the show.
[00:02:11] David Wheeler: Thank you very much.
[00:02:14] Guy Podjarny: David, before we start digging into what the Linux Foundation does about security, and many, many more topics there, tell us a little bit about what it is that you do, and how did you get into security in the first place?
[00:02:25] David Wheeler: Well, my path is an unusual one, but my role is fundamentally to try to help make open source software more secure, both when it's initially developed and throughout the supply chain, all the way to use and operations. Obviously I cannot possibly do that all by myself, so instead I coordinate, I work with others.
I basically try to enable many, many different projects and tasks with that larger goal in mind. So how did I get into this? The quick answer is I've been interested in open source software and also in developing secure software for a long time, for decades. You know, I started programming when I was 14. I thought it was awesome.
It's amazing, and as I got older and wrote more different kinds of programs, I started to become interested in, well, how do I do this better? How do I make software that's better quality? How can I make it more easily? How can I make it so that it's easier to maintain? In fact, even in high school I wrote a paper for a class on computer security, because I became very, very interested, very, very early, in that topic.
And really ever since, I've been interested in the security of software, and as open source software became a thing, or more of a thing, I became very, very interested in it. I did some research and analysis; some people may remember me from that, on open source in general. So those two interests, how do I develop secure software, or just improving software development in general, and open source software, merged very nicely. The Linux Foundation was getting increasingly concerned about how to improve security, because obviously open source is everywhere now; everybody depends on it. I like to say there are two kinds of organizations: the ones that know they're using open source software and the ones that don't know they're using open source software.
Those are the options. And as a result, because we're all dependent on software, its security matters more and more. Unfortunately, the attackers are getting wilier and more persistent, so we, as the defenders, the folks developing the software and getting it deployed, need to up our game to do a better job of countering what the attackers are doing.
[00:04:56] Guy Podjarny: Yeah, absolutely. Even when you think about some of the big-name, branded vulnerabilities, like Heartbleed and the like, a lot of it is really about the broad adoption of those components, right? The severity of the vulnerabilities is noteworthy, but they're not very different from many other vulnerabilities that happen.
It's the prevalence and adoption of those specific open source components that make those vulnerabilities such an ecosystem event.
[00:05:24] David Wheeler: That's right. Nobody cares about software that's vulnerable if no one uses it, but when it's everywhere... And Heartbleed actually surfaced some other issues. I mentioned earlier the organizations that don't know: one of the problems with Heartbleed was not just that it was a vulnerability, but that it was in places people didn't realize it was in, and people were not prepared to update their systems. In some ways that's even gotten worse, because containers should make that easy, but a lot of people grab a container, add some stuff, and assume that magic somehow happens and vulnerabilities never occur. That's of course not true. There will be vulnerabilities found, so preparing for that, so you're ready and can respond before the adversary does, is very, very important.
[00:06:16] Guy Podjarny: Yeah, for sure. So let's lean into this for a sec. Before we dig into what the Linux Foundation does around open source security, what do you see as the primary challenges? You just named one, people assuming software will somehow automatically update itself. But when you look in general at the open source ecosystem and you think about security concerns, what bubbles to the top?
[00:06:41] David Wheeler: There isn't just one thing, really. So let me point out a couple of major areas, and I'll kind of move left to right. I realize we're on audio, so you can't see that, but we can picture it mentally. First, we need software developers to know how to develop secure software at all.
You know, a significant proportion, I think, depending on which numbers you believe, over half of software developers don't have any formal training at all in software development. And even if you had formal training, most programs related to software development, computer science, applied computer science, that sort of thing, don't have anything involving secure software. So it's very, very easy to get out of even a formal education knowing absolutely nothing about the topic. And by the way, as I go, I'll mention a couple of things that we're trying to do to counter some of that. For that problem, for example, the Linux Foundation has released a free course on edX, Secure Software Development Fundamentals.
You can literally just show up and start; it's like many edX courses. If you want to take some tests and get a certificate to prove you learned the material, there's an extra fee, but just learning the material doesn't cost anything. And I would urge people: take that course, take some other course, do something, because the attackers are attacking the software they're developing, and for most people it's not that they're stupid, it's that no one has told them what they need to know. I think that's the critical first step: knowledge is absolutely critical.
[00:08:15] Guy Podjarny: This predates open source, or rather, this is for software as a whole?
[00:08:19] David Wheeler: Software as a whole, open source or closed source. In fact, that fundamentals course mentions a couple of things specific to open source, but you could be a developer of proprietary software and it would be just as valuable to you, because the problems, as far as security goes, are essentially the same.
The next step, once you know how to do it, is picking tools wisely to help you do that: unless there's a reason not to, choosing a memory-safe language; choosing tools and languages that make it easier to write secure software; getting tools into your CI pipeline so that you immediately detect, on commit or even earlier while you're editing: oh wait, don't do that, that's a terrible idea.
There's a danger in relying on tools, because no tool is going to find everything you'd like it to find, but tools are an important part of the arsenal. Just depending on manual analysis is not, for most organizations, a cost-effective approach.
So you need to include tools in your mental toolkit, and there are many different kinds of tools. Use them all; grab as many as you can convince yourself to. Each one will only find some things, and each one will find some things that aren't really a problem. Fine, ignore the ones that are nonsense.
Be grateful when it finds the real ones. It's like a reviewer on the side: some of that stuff you can ignore, some of that stuff you'll be glad to know, and so will your users.
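David's point that each tool finds different things, including some findings that aren't real problems, can be sketched as a small triage step: merge findings from several scanners, drop the ones already reviewed as nonsense, and act only on what remains. The scanner names and the finding format below are hypothetical, invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Finding:
    tool: str      # which scanner reported it (hypothetical names)
    rule: str      # e.g. a CWE identifier or rule name
    location: str  # file:line where the issue was flagged

def triage(findings, suppressed):
    """Deduplicate findings reported by multiple tools, then drop
    the (rule, location) pairs a human has already reviewed as noise."""
    unique = {(f.rule, f.location): f for f in findings}  # dedupe across tools
    return [f for f in unique.values() if (f.rule, f.location) not in suppressed]

# Two tools overlap on one real finding; another finding is a known false positive.
findings = [
    Finding("linter-a", "CWE-89", "app.py:42"),
    Finding("linter-b", "CWE-89", "app.py:42"),    # duplicate of the above
    Finding("linter-b", "CWE-798", "config.py:7"),
]
suppressed = {("CWE-798", "config.py:7")}  # reviewed: not a real problem here

remaining = triage(findings, suppressed)
print(len(remaining))  # one actionable finding left
```

The point is not the data structure but the workflow: each tool contributes partial coverage, and triage keeps the noise from drowning out the signal.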
[00:09:54] Guy Podjarny: Yeah, and this is a natural augmentation of the knowledge. We can know stuff, but we're all human, we make mistakes, and there are going to be some things we don't know.
And that's when some expertise built into the tools can help: an extra pair of eyes, but also eyes with a slightly different lens, a level of security expertise, that can help you get the job done.
[00:10:16] David Wheeler: And I probably should mention, in between those, and I can easily argue this is part of the education: before you write your code, think about what you're going to write. What are you trying to accomplish, and how are you dividing up your problem? Typically it's called design. What classes are you going to use? Are you going to talk to a database?
If so, how? Those kinds of basics. You have to make those decisions in order to write software. Think about it first, and think about how an adversary might look at it. For example, in most applications your database shouldn't be accessible to the public. You talk to it; you don't make the database public.
Sometimes you need to, and that's okay, but in general there's a principle called least privilege: don't give access if it's not needed. The same goes for checking inputs, and so on. After that, test your software, what an amazing concept! Thankfully, more and more projects I see are developing automated tests. It's been a best practice for a long time, but now I'm actually seeing it as an increasingly performed practice, which is great to see. So get your automated tests in there, and make sure they include negative tests. Test with things that aren't supposed to happen, and make sure they don't happen. I'm not logged in, I tried to delete something, oh, it worked. That wasn't supposed to happen. That should be a failed test.
[00:11:43] Guy Podjarny: Yeah, it's one of the gaps people have, because they don't know to test for unexpected behavior. Tests generally bias in favor of testing the expected actions of a user, versus the actions they're not supposed to be able to do.
[00:11:57] David Wheeler: That's right. There are even a number of courses that talk about testing, and I bang my head, because some folks want to do test-driven development: write your test, make sure it fails, write your code, make sure it passes. But a remarkable number of the TDD folks, and a majority of the TDD material I've read, totally miss the idea that some tests should be testing for things that should not be allowed, making sure the activity fails and stays not working, which is a little bit of a head flip. When you think about things from a security viewpoint, some things shouldn't be allowed, and if that's important, you test for it. That's also going to come up later when I talk about updates. You're going to bring in components and libraries: use a package manager. It's crazy to do that by yourself. Use a language-level package manager, use a system-level package manager, use various kinds of package managers to automate that stuff. When you only had one reused component, it was fine to do it yourself; when you've got a thousand, that's not really sensible. Okay. Obviously, we then need to get that code out to our co-developers and eventually users.
This is actually a big challenge right now, because for signing in the world of open source, there are tools, and they're a little painful to use, but that's not actually the bigger problem. The bigger problem is figuring out: okay, it was signed, but is this the right signer? Anybody can sign anything. "Is this the right signer" turns out to be a really, really hard problem. Some projects like the Linux kernel have actually spent quite some effort on this, and Debian also has spent quite some effort working on it. The Linux Foundation has just started a project called sigstore. And we're also working with some folks to propose improvements to git, the widely used version control tool, to try to make this whole signing thing more practical.
You can do it in theory? I don't care. Can you do it in practice? If it's too difficult to actually use, it doesn't matter what the theory and math say. It just doesn't happen.
[00:14:14] Guy Podjarny: That's sigstore?
[00:14:14] David Wheeler: sigstore, S-I-G-S-T-O-R-E. Yeah. Now, this is relatively early work. If you're running a project, you probably shouldn't run out today and expect that it's going to do everything you need, but there is working code right now, and people are working on it to harden it up. Their goal is to work a lot like Let's Encrypt. Now, I want to be careful: Let's Encrypt is actually another LF project, and that's been a wild success. The success is actually pretty easy to explain: people didn't use HTTPS before because the certs were expensive.
Here are free certs. Oh, that's all right. So with sigstore, the goal is to provide something somewhat similar: a free way to do signature verification and certificate transparency logs, with the result that people can sign things. I think it's extremely promising. As I said, it's not quite ready yet for deployment, but the code's there, and I would certainly encourage you to look at it. On the other end, we need to make sure that you get the software you thought you were going to get. Part of that is signing, of course, verifying that it came from where you thought it came from. In the longer term, I think we really need to move more towards reproducible builds.
So the Linux Foundation has funded another group called Reproducible Builds, and we're working on funding some other projects to work on that. If you're familiar with the recent attack on SolarWinds, that involved a subverted build environment. The developers wrote code, but what was shipped to the users was not the code that was written. An attacker subverted the build environment so that when the source code went in to be converted into the actual executables to be run, in this case Java class files, the code was changed to be malicious. That's a nasty, devastating attack. There's a well-known solution that's been known about for years, called reproducible builds, and it basically gives you a way to verify that what you received was actually built from the code that was written.
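The check that reproducible builds enable can be sketched as: independently rebuild from the same source and configuration, then compare cryptographic digests of the resulting bytes. Here the `build` function is a trivial stand-in for a deterministic build step, not a real compiler invocation:

```python
import hashlib

def digest(artifact: bytes) -> str:
    """SHA-256 digest of a build artifact's bytes."""
    return hashlib.sha256(artifact).hexdigest()

def build(source: str) -> bytes:
    # Stand-in for a deterministic build: same source in, same bytes out.
    # A real reproducible build pins the compiler, flags, and environment.
    return source.encode("utf-8")

source = "print('hello')"
shipped = build(source)   # what the project distributed
rebuilt = build(source)   # what an independent verifier produced

# Reproducible: an independent rebuild yields byte-identical output.
assert digest(shipped) == digest(rebuilt)

# A subverted build environment (as in the SolarWinds attack) changes the
# shipped bytes, and the digest comparison catches it.
tampered = shipped + b"\n# malicious payload"
print(digest(shipped) == digest(tampered))  # False: tampering detected
```

The hard part in practice is making real builds deterministic (timestamps, file ordering, build paths), which is exactly what the Reproducible Builds project works on.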
[00:16:23] Guy Podjarny: Let me dig into this for a sec, just on reproducible builds. How do you reconcile the desire for updates, which we touched on a second ago, with reproducible builds? Oftentimes in open source consumption you encounter things like npm's version range updates, or Docker's latest tag, or not even latest but tags as a whole, that are not reproducible per se. You take the same source code and build it today, or build it in a month, and you might get slightly different open source packages.
[00:16:58] David Wheeler: For a specific version, it had better produce exactly the same bytes. You're right that for a different version it's going to be different, but that's okay.
Because that also prevents some other attacks. There are all sorts of fun, sneaky attacks if you get into this stuff. One sneaky attack is basically convincing users to download or install an older version, and you can even do this by lying: oh, I'm downloading version two.
Well, actually it's version one, but I tell you it's two, and it has all the vulnerabilities of version one while you think you're up to date. There's actually a whole bunch of attacks that hit the recipients. So let's go back to reproducible builds. For reproducible builds, the idea is: given this source code and certain other assumptions...
Okay, because if you build software for a Mac, build it for Windows, build it for Linux, it may produce different bits from the same source code, but that's okay. You say: for this source code, for this particular configuration, compiler, whatever, these are the bits that should be produced. If you can reproduce them exactly, then you understand what the assumptions are and what the source code is.
[00:18:10] Guy Podjarny: Got it. And this assumes a lock file, sort of. It assumes that within the open source consumption aspects of the same source code, you have explicitly specified versions for your transitive dependencies or your base image.
[00:18:25] David Wheeler: Well, it very much depends on what it is you're downloading. You can do reproducible builds of libraries. In that case, you typically don't yank in the other libraries you depend on. And this works just as well with JavaScript. JavaScript's compiled? Yeah, but it's often delivered minified. Same thing: you're using a program to transform it.
That's right, or there's a packaging format. So what you're doing is saying: given the source code, given this other information, I can verify. Now, when verifying one specific library, what you're verifying is: this library says it depends on these other libraries, and you may not pull them in. You're just verifying that it says it depends on those other things. Now, how do you verify those other ones? The same way. This is actually good from the point of view of verification, because I can tell you what I'm verifying, and on the other end, I know what I verified. If I only verified one part, that's okay; then I know that part's okay, and when I pull in the other pieces, I'll go verify those. And there's nothing that says every individual has to do this. I would envision multiple organizations operating as a check, sort of like auditors do on companies when they look at their financials. I'm going to look to somebody else to verify.
So that's what reproducible builds are, and I can talk more about that, but let me walk through the rest of that chain. There are a whole lot of challenges on the recipient side, making sure things like: I want to get program X. Well, wait a minute, do I really want to get program X, or do I want to get program Y? Is it foo or foo1? There's an attack called typosquatting; in the open source world right now, that is the most common kind of supply chain attack. It's creating names that are misleading, very much like domain squatting, the same kind of thing. But of course there are other kinds of attacks.
Sometimes developers' accounts get subverted, and somebody inserts something malicious. It turns out to be rare, but it does happen that some developers actually create malicious code. You certainly don't want that in there. So basically, the LF right now... we'll come back to it, but I should shout out to a group, the OpenSSF, the Open Source Software Foundation, which is focused on trying to deal with a number of these kinds of issues.
[00:20:52] Guy Podjarny: Open Source Security Foundation, or Open Source Software...
[00:20:54] David Wheeler: It's the Open Source Security Foundation, OpenSSF; go to openssf.org. You'll find that they've got six working groups and lots of people working on different things. Basically, the issue is that open source is really important, and so they are working to improve the security of open source, writ large. That's the foundation I currently work the most with.
[00:21:16] Guy Podjarny: Yeah, we'll definitely come back to the OpenSSF for sure, in a sec.
[00:21:21] David Wheeler: Okay, awesome. And of course, I should note that even after you get the right code, bring it in, and put it in your containers, after you deploy you've got to make sure that, first of all, you deploy it securely, and that you go back and are notified when there are vulnerabilities, because that's the other problem. That's why automated testing is important, by the way: quickly update, quickly ship, and you can only quickly ship if you've got automated testing. So, a long answer to a short question.
[00:21:52] Guy Podjarny: No, I think it's a good answer, and it's well structured. This answer is really the chain of software development, right? When you are creating software and when you are consuming software, these are the steps or concerns you should be mindful of. When you think about the open source ecosystem, the whole community, you alluded to one example with the typosquatting attacks. What are the biggest broad open source security concerns?
What would you say are the top concerns for the community as a whole, whether because they're especially prevalent or especially severe?
[00:22:32] David Wheeler: There are multiple ways to answer that question, actually. So let me try to take a couple of steps, because there are different ways to measure. For some organizations, there are certain components that they really desperately depend on; it's very, very critical, critical infrastructure and so on, and they're especially worried about those very specific components: are they resistant to attack, are they the right things, that sort of thing. More broadly, as I mentioned, as far as the supply chain goes, when users are trying to grab libraries, one of the most common attacks right now is typosquatting, and so in some sense one countermeasure that is really effective and costs you almost nothing is: double-check before you download that library. Is that the library you actually meant to get? Because there's lots of awesome open source software, but there are also a number of people who love to make names that are a little misleading, spelled not quite the right way.
And there are some ongoing efforts to try to detect and counter them, and in the long term I'm hoping we're going to get better at that. Typosquatting is still a thing, it's been reduced over time, but it isn't easy. So really the first step is: before you download it, make sure it's actually the one you wanted to download.
You know, you don't need to be a rocket scientist to do some double checks: if it was released last month but supposedly everybody has been using it for the last 10 years, that's not it. So checking the name is one. Beyond that, open source software is just another kind of software, and therefore it has all the challenges of normal software.
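A cheap version of the double-check David suggests: before installing, compare the requested name against the short list of packages you actually intend to depend on, and flag near-misses. The package names here are made up for illustration; `difflib` is Python's standard-library string matcher:

```python
import difflib

# Packages this project actually intends to depend on (illustrative names).
INTENDED = ["requests", "cryptography", "urllib3"]

def check_name(requested: str, intended=INTENDED, cutoff=0.8):
    """Return 'ok' for an exact match, flag close-but-wrong names as
    possible typosquats, and ask for review of anything unknown."""
    if requested in intended:
        return "ok"
    close = difflib.get_close_matches(requested, intended, n=1, cutoff=cutoff)
    if close:
        return f"suspicious: did you mean '{close[0]}'?"
    return "unknown package: review before installing"

print(check_name("requests"))   # exact match: ok
print(check_name("requessts"))  # near-miss: flagged as a possible typosquat
```

This is obviously not a complete defense, but it captures the spirit of the countermeasure: the check costs almost nothing and catches the most common mistake.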
There are certain well-known, common kinds of mistakes that lead to vulnerabilities. If you're developing a web application, I suggest looking at the OWASP Top 10. For everything else, or even if you're doing a web application, there's the CWE Top 25. Those list what's really common. As part of your education, make sure you know what those are and how to counter them.
The vast majority of vulnerabilities are those kinds, for any software, closed or open. Knowing about them, and knowing how to counter them, means you're much, much less likely to make those kinds of mistakes. You're going to drop the number of vulnerabilities, at least in the software you develop, by an order of magnitude, just by knowing that.
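As a concrete instance of one of those common mistakes, SQL injection (CWE-89, which appears on both lists), the countermeasure is mechanical once you know it: bind user input as a query parameter instead of splicing it into the SQL string. A minimal sketch using Python's built-in sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "bob' OR '1'='1"  # attacker-controlled string

# Vulnerable: the input is spliced into the SQL, so the OR clause executes
# as part of the query and every row comes back.
rows = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()
print(len(rows))  # 2: the injection matched all users

# Safe: the input is bound as a parameter and treated purely as data.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()
print(len(rows))  # 0: no user is literally named "bob' OR '1'='1"
```

The same parameter-binding pattern applies in every mainstream database API, which is why knowing the vulnerability class is most of the fix.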
[00:24:58] Guy Podjarny: So let's shift gears a little bit to talk about the tools, you've already mentioned a few, or the platforms that the Linux Foundation is providing. For starters, how do you see the Linux Foundation's role in helping open source security? Why is this being done? What's within scope and what's out of scope?
[00:25:18] David Wheeler: Well, let's see. As far as the OpenSSF is concerned, if it's open source software and security is an issue, and by the way, it always is, at least in the sense of "do you want it secure", then we want to improve things. Certainly if they're LF projects, but it's not really limited to LF projects. The LF funds projects that aren't LF projects themselves, because we rely on them, and we, as a larger ecosystem, rely on them.
[00:25:47] Guy Podjarny: Does that include consuming open source? So, thinking about open source security problems one step at a time: I'm writing open source and it needs to be secure; maybe that open source itself consumes open source, so you can help there. But does that extend to: I'm a closed source product, or I'm a commercial entity or whatever it is, and I am consuming open source; clearly I need to stay updated and things like that. Do you perceive that as something the Linux Foundation aims to help with, or is that where you draw the line?
[00:26:19] David Wheeler: Let me try to draw the line in a way that I hope is clear. If you're developing a closed source application, it's not really our job to make your closed source program secure. However, you said commercial, so let me do a quick side note: at least in the US, under US law, open source software is commercial software. So there are two kinds of commercial software: there's open source software, and there's closed source software, because if it's released to the general public by license, or sold, or leased, it's commercial.
[00:26:50] Guy Podjarny: Yeah, good correction. So it's not commercial versus open source; it's closed source versus open source.
[00:26:54] David Wheeler: That's right. I try to use that terminology because I think it's important. Open source has become so broad and so important to the industry that it's tempting, I think, for some people to say, well, that's something else.
No. In fact, a lot of studies have shown that 80 to 90% of applications, when you open up the hood, are in fact open source software; even if the entire application is closed source, the majority of its insides are open source. So basically, our role is: if you're building a closed source application, we're going to try to make it so that when you bring in open source software, you can have confidence in that open source software, for a variety of reasons.
You know that it's less likely to be malicious, that you get the software you actually intended, that sort of thing. Basically, I don't want people to think, oh my gosh, open source software is hideously insecure. There are actually potential security advantages for open source software, and we're trying to turn those potential advantages into realities.
While some of it's really, really secure and there's lots that's great about it, other parts are less so. So we want to raise the bar and improve those projects which aren't doing as well as they should be. And for those that are doing well, great, we'll praise them and try to learn from those open source projects and apply those lessons elsewhere.
[00:28:14] Guy Podjarny: Got it. Excellent. So with that in mind, with that lens or mission for the Linux Foundation, and I guess the Open Source Security Foundation, what are some of the key initiatives or tooling that come out of this work?
[00:28:29] David Wheeler: There is no way I could possibly list them all, and in fact more is going on, so I'll just mention a few, and my apologies to everybody listening for all the many other things that I've forgotten or just didn't say. But since I'm the lead of a particular project, I get to shout about it: I've been leading for several years something called the CII Best Practices Badge.
It's a set of simple criteria for open source projects that embody best practices for developing secure open source software. Basically, if you meet enough of them, you get a badge, and there are three badge levels: passing, silver, gold. It's quite active. I probably should have grabbed the numbers; I'll see if I can do that while talking at the same time.
But basically, we continue to get more and more participation; more and more projects are earning badges. Numbers as of today, let's see here: over 3,700 projects participating, and over 500 have earned at least a passing badge. And there are just more and more over time. So: bestpractices.coreinfrastructure.org.
If you're involved in an open source project, please check us out, get involved, get a badge.
[00:29:39] Guy Podjarny: How are the badges verified? How do you know? Is it self-declared? Is it tooling?
[00:29:43] David Wheeler: It's a combination. There are some questions which are very, very hard to automate, and so we rely on self-declaration; others we can automate. For example, we can tell whether they're using version control; it's not that hard to figure out. So basically, it's a combination of things, and we also force publication of your answers, including any justifications. So people come back and say, yeah, that's just false, and question things, because it's posted publicly.
[00:30:13] Guy Podjarny: Is this something that you could use to badge your own internal closed source project as well, or is there some staffing or back-office methodology here that is required to operate this?
[00:30:24] David Wheeler: I mean, there is staffing. The whole point of this is to encourage projects to collaborate openly in a public way, and so a number of these criteria really don't make sense for closed source. Indeed, if your monetary model is to ensure that very, very few people can see the code and have access to it, making sure you have open collaboration is probably not what you were looking for, okay? They're very much intended for open source software, but I don't think that's a negative.
[00:30:59] Guy Podjarny: It's just purpose-built.
[00:31:00] David Wheeler: Yeah, it's focused on that, but it is designed to handle all sorts of scale, anywhere from a small one-line JavaScript project all the way to kernels, although, as you can imagine, some of the criteria say "if this applies, then..." but that's it.
I think some of the criteria might be useful instruction for closed source as well, but that wasn't the goal. The goal is to help open source software projects become more secure, very much focusing on enabling that worldwide collaboration that is vital to a well-run open source project, but there are so many ways to not get there.
[00:31:38] Guy Podjarny: Right! Yeah. And helping consumers of open source, be able to spot the badge and have greater trust in the security practices behind that open source project.
[00:31:47] David Wheeler: That's exactly right. That's exactly right. It's a simple way for projects to know what they should be doing, and a simple way for users to know. Now, it doesn't guarantee that there are no vulnerabilities.
I don't know of any technique that can truly verify that; even formal methods have limitations. But... I'm sorry?
[00:32:06] Guy Podjarny: Unplugged from the internet, you know, sort of don't have any connections to anything. No interfaces…
[00:32:11] David Wheeler: Yeah, no interfaces, never turned it on. But nevertheless, we believe there are very good reasons to think that these greatly reduce the likely vulnerabilities and greatly increase responsiveness when a vulnerability is found.
[00:32:29] Guy Podjarny: Awesome. So that's one. So the CII badges, the best practices badges.
[00:32:33] David Wheeler: Yep. I've already mentioned sigstore, which is an effort to make signing better, and I mentioned the OpenSSF. There's actually another educational thing: the edX course is very much a quick, rapid fundamentals course, and if you want something that takes more time, more hands-on, there's coordination with the OWASP SKF, which gives you a hands-on kind of education. There are ongoing efforts to identify metrics and eventually create a dashboard, so you could go to one place and say, "Tell me about this program. Should I be concerned? What should I be concerned about?"
There's another group called Securing Critical Projects. That's where, for example, Harvard and the LF have worked together: Harvard's done some research to try to identify the most critical open source projects, maybe ones that need funding for hardening of their security, and also surveys of open source developers to get a better understanding of what's more likely and less likely to work when working to improve their security. And if you're interested, by the way, just in general, in improving the security of open source software, I would suggest you go to that openssf.org site and click around.
Find a meeting you might be interested in, show up, get involved. Basically, get involved.
[00:33:50] Guy Podjarny: Yep. Perfect, and then we'll put links to all these tools in the show notes.
[00:33:54] David Wheeler: There are a million more I'm forgetting, and my apologies, because in fact we're trying to do a lot of different things. It's not one answer. I think that's really the key takeaway: there are multiple challenges, and so there are multiple approaches working to resolve those different challenges.
[00:34:09] Guy Podjarny: Yeah. Understood. I think one clear takeaway is the role that you can play: you can provide tools to help open source maintainers actually build more secure software. But what is probably even more open source specific is indeed this element of being the trusted entity that says "this is secure," like with the badges, or even just certifying, "Hey, these are the metrics you should look at," or "this is a perspective on what it looks like." Never perfect, but significant. I know of one project, because Snyk was involved: the LFX Security piece.
And I don't know if that's... I think it's an evolution of CommunityBridge from before, and maybe I'm kind of, um, misrepresenting...
[00:34:49] David Wheeler: There's a relationship, yeah. LFX is kind of a rebranding and regathering of several different tools. There are a lot of open source projects within the Linux Foundation. I don't know how many of your listeners are familiar with the Linux Foundation, but essentially it's a foundation that creates foundations. You know, it's got Linux in the name, and the Linux kernel is one of our projects, but at this point there's a huge number of them. You are almost certainly using a number of the open source projects that we, the Linux Foundation, develop and release.
So the Linux Foundation creates foundations; within those foundations are projects and working groups and so on, to work different problems. The OpenSSF focuses on improving the security of open source software. There are others, like LF Public Health, which works on software related to pandemics. I wonder why we created that.
So there are many, many others. Because there are so many different projects within the LF, we don't actually impose that many "thou shalts." The main rule that the Linux Foundation imposes is: you must obey the law, because there are a lot of different companies that are competitors, and there are all sorts of antitrust issues that can show up when companies work together.
And so there are rules that the Linux Foundation absolutely enforces to protect everybody against antitrust concerns, because nobody wants that, and we all want to obey the law. But there's a lot of flexibility. That said, there are certain kinds of tools that projects need over and over and over again, and so we certainly use a number of existing tools; lots of organizations, for example, use GitHub and GitLab. But there are a lot of tools that, say, GitHub and GitLab don't provide, or maybe we can provide a higher quality or a different approach, or, even just for some security tools, it's better to have multiples.
And so LFX is a suite of tools, primarily intended for Linux Foundation projects, to help them do stuff. Now, there's a whole bunch. You mentioned LFX Security. LFX Security is intended to be a suite of tools to improve security. The one in there right now is basically looking for dependencies that have known vulnerabilities.
It's using Snyk's technology, and we do intend to add others to that toolchain within security. Actually, just in general, the goal is: as people say, "Hey, I need this," and we start noticing, "Oh wait, all our LF projects need that," then sometimes we work with organizations outside, and sometimes it makes sense to try to provide them with tools. But coming back to the whole security thing, it's important to have tools in the toolbox, and that's one of them: looking for dependencies which have known vulnerabilities.
[00:37:43] Guy Podjarny: I think it kind of goes full circle, because when we started this conversation about open source security as a whole, you started off by talking about, "Hey, build secure software," and here's a bunch of things; one of the key elements is that you need tools. We went on to talk about the open source ecosystem as a whole: typosquatting, choosing the right packages, the practices and tools that the Linux Foundation offers to help you consume open source correctly and know that you're doing the right things, with the badges, with the metrics. But then you come back to say, well, the other role the LF can play is to actually equip you with these tools, to provide you maybe "enterprise caliber" or, you know, that's poorly phrased. I had Adrian Ludwig on the show, he was the CISO of Atlassian, and he made a great comment, which is that enterprise security is inferior to consumer security, because enterprise security is something you might need to work on and educate people about and get people to do, while consumer security needs to just work.
So I don't know if this is enterprise-grade security tools or if we should say consumer-grade security tools, but basically equipping them with the tools that organizations might otherwise have a team helping to build. Is that the right state of mind? Is it completing... I guess the, uh...
[00:39:01] David Wheeler: Let me respond a little bit to that. I think what he's hinting at, at least in part, is ease of use. I mean, the easiest thing to use is when you don't have to do anything, and I think, where possible, we want to go there. If we can't make it impossible to not do the right thing, then we can at least try to make it maximally easy. And I'm not one of those people who believes that you always have to trade off "is it easy or is it secure?" In fact, one of the main principles of security, which goes back to the seventies and is still just as true, is ease of use: psychological acceptability. If it's too hard to use, people will not do it, or they'll work around it, which makes things worse,
because now you're imposing extra work to avoid something that was supposed to actually be an improvement. Usually the real problem is that people didn't think about security in the beginning, and they try to jam something on the side that's not thought through. If that's your experience with security, I totally get it.
And I agree that that's terrible. But the solution isn't to say "no security"; the solution is to try to make it either automatic, where you can, or at least as easy as possible. And you mentioned the tools: ideally, for example, we are trying to make these tools not give many false positives.
There is a trade-off. It's easy to have no false positives by never saying anything, but that's not useful. Okay, so we need to keep working with the tools to make them better at not having false positives, but we also need to help developers understand that if a tool says something, it's supposed to be helpful; that doesn't mean you always have to do whatever the tool says.
You know, it's an opportunity to think. I am very much a believer that thinking is necessary. If you're not thinking, then you're not doing it right.
[00:40:57] Guy Podjarny: David, this has been great. We talked again about building secure software as well as consuming it, and the Linux Foundation's role in helping open source maintainers and consumers be more secure.
One last question before we run out of time here. If I ask you to take out your crystal ball and think about someone doing your role, or even just the state of open source security, I guess the security of the open source ecosystem, in five years' time, what would you say would be the most different?
[00:41:29] David Wheeler: Well, that's a really hard question. My crystal ball is very fuzzy, okay? That said, let me just try to give a few prognostications, and if I get one in three, I'm doing better than expected. I think in many ways, strange as it may seem, things will not change. If you look at the list of common vulnerabilities, you know, the OWASP Top 10, very few have dropped off over the years and very few have been added.
The vast majority have stayed exactly the same, and there are reasons for that. I think the reality is that attackers will keep attacking, and certain kinds of mistakes will continue to be made, because we keep having more and more new developers who are not trained in any way in how to develop secure software.
And until we get the vast majority of software developers knowing how to develop secure software, that's going to be a continuing problem. That said, what do I expect to change? Frankly, I'm going to be looking around at what's already happening. You know, there's that phrase by the science fiction author William Gibson: the future's already here, it's just not evenly distributed.
I think we're already seeing a lot of transitions. I mean, we're already in a world where containers are all over the place; we'll see more of that. We're already seeing people use more and more tools to detect components with known vulnerabilities, and to detect vulnerabilities using static source code analysis, fuzzers, web application scanners, and so on.
We're going to see more and more of that; more and more projects are going to integrate those tools. I think we're going to see more and more package managers, both language and system-level package managers, as well as, frankly, the container package managers, which are package managers, though we don't often call them that; that's what they are. I think we're going to see them include more defenses, in the package managers or the repositories that back them, to help counter attacks. I think we are going to see more reproducible builds. That's been worked on for a while. It's not easy, but it's not rocket science.
It's just that, historically, nobody paid attention to it, and so there are lots of little issues that need to be addressed to make that work. But it's not rocket science. It's just, "Oh, I didn't know that was important. Now I do, and I can check for it."
So those are at least some of the things. Frankly, I think we're also seeing memory-safe languages: Rust in particular is coming in, and I think some others will be in use as well.
[00:44:03] Guy Podjarny: In lieu of C++ or?
[00:44:05] David Wheeler: In lieu of C and C++, right. The frank reality is, and I'm not saying you should never, ever use C or C++, and there are lots of applications where it would be incredibly expensive to just rewrite, especially trying to rewrite everything. But already a number of folks who have written programs that were originally all in C are taking pieces and trying to replace those pieces with parts written in other languages, such as Rust, simply because it's incredibly difficult to develop secure code in C and C++. It's actually become worse over the years, because the C standard in particular has a lot of undefined behaviors.
A lot of developers don't even know they're undefined behaviors. I would guess, for example, that most C programmers think that if you take INT_MAX and add one, it wraps. That's not guaranteed, and in fact compilers can exploit the fact that it's not guaranteed to do things you didn't expect.
So if you're going to use C and C++, you had better sit down and study your specs for a couple of months. Okay? Read every line very carefully…
[00:45:16] Guy Podjarny: Right, there's that, uh, security education coming up again there.
[00:45:20] David Wheeler: No, it's not even mainly security education; it's just knowing your tool, because if you do anything wrong in those languages, the assumption is that the program is always right. So if you do something wrong, that must mean you meant to have a vulnerability, and that's very, very tricky. So there are programs which, I'm sure, won't change, and if they've got the resources, or security isn't really that important, I mean, that's okay. But the Linux kernel folks spend an incredible amount of effort to look for vulnerabilities and to counter them. A lot of projects really can't afford that kind of effort.
[00:45:58] Guy Podjarny: Yep. Understood. So maybe a good chunk of those would move to, uh, the safe…
[00:46:02] David Wheeler: To some other languages. And I don't think it's instantaneous; I think that's going to happen over time.
[00:46:05] Guy Podjarny: We talked about what happens in a bunch of years. David, this has been great. Thanks for coming onto the show, sharing some knowledge about these projects, and good luck continuing to secure the open source ecosystem.
[00:46:16] David Wheeler: Thank you. And thanks everybody for tuning in, and I hope you join us for the next one.
[00:46:25] ANNOUNCER: Thanks for listening to The Secure Developer. That's all we have time for today. To find additional episodes and full transcriptions, visit thesecuredeveloper.com. If you'd like to be a guest on the show, or get involved in the community, find us on Twitter at @DevSecCon. Don't forget to leave us a review on iTunes if you enjoyed today's episode. Bye for now.
[END]