Hacking The Pentagon - An Interview with Lisa Wiswell of Grimm & HackerOne
Interview with Lisa Wiswell, of Grimm and HackerOne:
Cyber Security Dispatch: Season 1, Episode 15
Show Notes:
Today on the show we welcome Lisa Wiswell. Lisa is a leader in the security space with nearly a decade of programmatic and cyberwarfare experience. Lisa helped start the Hack the Pentagon program during her time working at the Department of Defense. Hack the Pentagon was initially a three-week bug bounty in which the department allowed 1,187 people, completely unaffiliated with the U.S. government, to hack it. Now an ongoing program, Hack the Pentagon continues to drive major cultural shifts in cyber security practices. In this episode, we discuss the challenges of overcoming institutional resistance to having outsiders hack your systems and the surprising success and praise the program received. We also touch on current issues in vulnerability disclosure and how to create a system where vulnerabilities can be disclosed in a responsible way. Today, Lisa works as a Principal at Grimm and an advisor at HackerOne, and in this episode, she reminds us why you cannot tell the world you are secure if you aren’t!
Key Points From This Episode:
- Discover how Lisa entered the field of cyber security.
- How Lisa came to work as a “bureaucracy hacker” at the Pentagon.
- Learn more about the aims and direction of the DARPA program.
- Lisa shares more about DARPA’s flagship program titled PlanX.
- Find out more about the intricate links between Cybercom and the NSA.
- Hear what Lisa believes is the problem with standards and compliance.
- Lisa’s thoughts on how to build mature cyber security ecosystems today.
- Hacking the Pentagon: how, why, and when it happened.
- Also, hacking the Defense Travel System, the Army, and the Air Force (twice).
- How Hack the Pentagon saved the Department of Defense over a million dollars.
- The effects of the demonization of hackers in popular media today.
- Why you cannot tell the world you are secure if you aren’t!
- How Hack the Pentagon created a culture shift in security practices.
- Lisa shares her view on vulnerability disclosure and policy.
- See something, say something: The importance of reporting vulnerabilities.
- And much more!
Links Mentioned in Today’s Episode:
Lisa Wiswell Twitter – https://twitter.com/ljwiswel
Lisa Wiswell LinkedIn – https://www.linkedin.com/in/lisa-w-35470432/
NSA - https://www.nsa.gov/
Department of Defense - https://www.defense.gov/
DARPA – https://www.darpa.mil/
PlanX - https://www.darpa.mil/program/plan-x
Cybercom – https://www.cybercom.mil/
Hack the Pentagon – https://www.hackerone.com/resources/hack-the-pentagon
iWatch – https://www.apple.com/watch/
Fitbit – https://www.fitbit.com/home
FCC – https://www.fcc.gov/
OPM hack - https://en.wikipedia.org/wiki/Office_of_Personnel_Management_data_breach
Grimm – https://www.grimm-co.com/
HackerOne – https://www.hackerone.com/
Introduction:
Welcome to another edition of Cyber Security Dispatch. This is your host, Andy Anderson. In this episode, Hacking the Pentagon, we talk with Lisa Wiswell, who helped start the Hack the Pentagon program during her time working at the Department of Defense. We talk about the challenge of overcoming institutional resistance to having outsiders hack your systems and the surprising success and praise the program received. We also touch on current issues in vulnerability disclosure and how to create a system where vulnerabilities can be disclosed in a responsible way. Now, let’s hear from Lisa.
TRANSCRIPT
[0:00:47.5] LW: Yeah. My name is Lisa Wiswell. I’m a principal at a company called Grimm, and I’m an adviser to HackerOne.
[0:00:56.8] AA: So I think it’s always interesting to hear how people got into this space, because it’s not exactly the sort of career path that’s on a third grader’s list of future careers.
[0:01:06.6] LW: Yes. Hopefully it’s on some third graders’ lists nowadays, but it will be 20 years before they are doing anything. Yeah, honestly, this was an accident for me, winding up in this community, in this space. I had been working for a Member of Congress on Capitol Hill, and we represented very rural Pennsylvania. I was focused on a lot of issues like education reform and things like that. Then my boss wound up involved in this area. He was the chairman of a subcommittee that’s not called this anymore, but it was the Subcommittee on Management, Investigations, and Oversight under Homeland Security.
We started thinking about cyber, finally, and that was a long time ago now, 12 years. I started becoming really interested in it because of that, and then, just by happenstance, I was offered a job basically keeping the trains on track for a couple of program managers at DARPA. At the time they were called networking program managers, which just translates to cyber warfare today.
So I went over to DARPA thinking it was going to be the cerebral challenge of a lifetime, and it was. A few years later, we declassified the fact that the Department of Defense did offensive cyber, and then I could start talking about it. When I did, and was no longer in the shadows, I was asked to serve my country further by going to work as a political appointee at the Pentagon. I worked in a number of policy shops in the Pentagon, and that’s when I was able to come up with some innovative program ideas in addition to things that really implicated policy and compliance.
[0:02:49.9] AA: Yeah. So your background has been completely on the policy side. Did you have any sort of technical background before?
[0:02:56.1] LW: None. So you can imagine, I came straight from not just policy but politics when I went to DARPA, which makes me a really effective bureaucracy hacker, which was my last title at the Pentagon. At the time, I was just learning. Even the difference between IPv4 and IPv6 was crazy for me to try to figure out, technically, with zero background. But after four years at DARPA you get almost a Ph.D. in it, and you’re surrounded by some of the most brilliant minds in the world, technically and academically.
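For readers who, like Lisa then, have zero technical background, here is a minimal sketch of the IPv4 versus IPv6 difference she mentions. This is an illustration only, not something from the interview; it uses Python’s standard-library ipaddress module, and both addresses are reserved documentation examples rather than real hosts.

```python
# Illustrative only: the IPv4 vs. IPv6 difference Lisa mentions learning.
# Both addresses are reserved documentation ranges, not real hosts.
import ipaddress

v4 = ipaddress.ip_address("192.0.2.1")    # IPv4: 32-bit, dotted-decimal notation
v6 = ipaddress.ip_address("2001:db8::1")  # IPv6: 128-bit, colon-separated hex groups

print(v4.version, v4.max_prefixlen)  # -> 4 32
print(v6.version, v6.max_prefixlen)  # -> 6 128
```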
[0:03:34.8] AA: Yeah, there’s something to be said for approaching things from a place that’s outside the typical pathway, if there is a typical pathway in cyber. You can ask those dumb questions, those stupid questions, and it often leads to really interesting things.
For what you can talk about from your work at DARPA, what were the sorts of things that you guys were pushing and driving in that program?
[0:04:02.7] LW: Yeah. One of the things that became really clear during my time at DARPA, to me and to a couple of the program managers I worked with there, was that the NSA’s mission and the newly created Cybercom’s mission really should be completely different. Their objectives are different, but a lot of the military leadership that was directly responsible for both organizations was really clouding those missions, clouding the people, clouding the skillsets, and it just became this mess of a situation.
So we had said, “Listen. We have got to do something that will enable only the war fighters. Something that is specific to them. That does not mean living in the shadows and doing this foreign intelligence mission.” Just like when we rolled into Iraq, it’s not like we went in all stealthy. We had the American flag at the front, and that was our position for what we needed to do as we enabled the DoD (Department of Defense) war fighters: if you’re using a cyber capability for the purposes of war fighting, it doesn’t need to be really super sophisticated and stealthy. It needs to get the job done and it needs to send a message to the world, which generally also means you have to take ownership of it. You have to say in the packet header, “Hugs and kisses, America.”
That thinking was behind the final program I worked on, a program called PlanX, which was kind of the flagship cyber program at DARPA for a while. It was around that time that I decided, “Okay. I’ll go do this policy Pentagon thing and try to figure out how to help policymakers really catch up with the realm of technical feasibility for what we can and can’t do.” I took that whole vision with me, so when I went to the Pentagon, I was not a popular person trying to fight that fight internally too, but it is important.
[0:06:10.3] AA: Yeah. There’s so much there. Obviously, anything that you feel like you can’t talk about for whatever reason, you’re welcome to take a pass on. The argument I’ve heard for why Cybercom and the NSA are so intricately linked, and I’m probably wrong here, is the capabilities of the two. If you separate them, suddenly you’re going to lose a lot of the cutting-edge, newest stuff, the understanding of how we break into systems. How did you guys get comfortable with that, losing that potential touch with the capabilities?
[0:06:52.1] LW: Yeah. It is a valid argument for those who see the missions overlapping as much as they think they do. I really don’t think that the missions overlap that much. Obviously, in order to touch someone’s network, you have to have access capabilities to do that. I do think that there are occasionally reasons to reuse, if you will.
I don't think that you should rely on the same set of capabilities for everything. Look at the news. Look at the last number of years. If it gets rolled up once, it’s rolled up everywhere. So you shouldn't rely on these super stealthy, super sophisticated capabilities to do everyone's mission.
[0:07:36.2] AA: Yeah. You need that idea of redundancy and diversity in all your systems. This week I also interviewed Dr. Ross from NIST. I don’t know if you’ve seen the new cyber resiliency standards that came out?
[0:07:52.9] LW: I have.
[0:07:53.7] AA: You’re giving me a no. The audience can’t see the knowing smile that you’re giving me. What’s your sort of thinking on this? I’m curious.
[0:08:02.3] LW: It’s so hard. I think probably anybody who’s heard me talk about cyber for even 20 minutes has heard me say that part of the problem with standards and compliance is that they strip away a person’s need to make smart, informed decisions on their own when it comes to security. Being resilient is a very personalized thing, right? You, your network, have to figure out how to survive in very difficult situations, and that’s not a one-size-fits-all thing. I don’t think anything in security is.
Security sometimes means you're implicating privacy. Security sometimes means you're implicating safety. Security means a whole lot of things and you have to just kind of take a step back and look at your organization and determine who might be coming after you and why, and then figure out, “Okay. What is it that I care most about based on what it is that I do?”
I do think that this work has a tremendous number of really hard-working, smart people behind it, and it is good and necessary, but it doesn’t always allow people to think creatively like they need to, like real security demands.
[0:09:23.3] AA: Yeah. I’m curious. I, like you, don’t come from a technical background, but I think it’s sometimes hard for people who didn’t grow up in this space to understand the challenge of, literally, an ephemeral system that you can’t see and can’t necessarily get your hands around.
I was thinking it through in a much more tangible space, one of the spaces that I know, which is the construction industry, how you build things, right? When you build a building, there’s a mix of different professionals. There are guidelines and standards. You need to make sure that you frame using this size of wood and that electrical work is done this way. But then there’s also the interpretation of those, which both your architects and your engineers work through, and there’s a third step where you’ve got the actual trades who put it in, who are also experts. Then you’ve got another step where there’s an inspector who’s coming back throughout the process.
I go to conferences and I hear people rail against one of those layers, and from my perspective, you need them all. A mature ecosystem has multiple layers of influence, accountability, and guidance.
[0:10:39.7] LW: That is the smartest analogy I have ever heard for this argument. I think you’re absolutely right, but the problem is almost nothing is a mature ecosystem today, right? Except for the top five tech firms, who has a really mature ecosystem? I worry even more about organizations like IoT vendors that just have no real impetus to do anything right now. That’s terrible, I shouldn’t say none of them do, but many do not. They’ll point fingers at anybody else they can get away with pointing at: “It’s your job. It’s your responsibility.”
In your example, all those people have very designated roles and responsibilities, and that’s clearly codified, right? In this space, at least in most of it, nobody has roles and responsibilities, which means nobody’s actually held accountable unless somebody needs to get fired because of some breach, and that’s not codified anywhere. It’s just done kind of willy-nilly.
[0:11:46.4] AA: Yeah. Just to beat a dead horse with the analogy, there are hundreds of years of history in that space, but it wasn’t until things like the 1906 earthquake, until, literally, people were being hurt, that changes happened. I’m wondering whether we’re starting to feel that in cyber, not only the pain but also the public outcry, where there are so many news stories. I think this year it’s just gone to another level in terms of public awareness.
[0:12:28.6] LW: Yeah, you’re right, but we don’t have enough people that care yet, and then we had election tampering happening. That’s something that every U.S. citizen should care about. Until, I’m afraid, more people get hurt or are killed, it’s just not going to be enough to force something major to happen. I never suggest that regulation is the be-all and end-all, but it is one way to start holding people’s feet to the fire in terms of designating roles and responsibilities. Maybe that’s what it will take. I hope not. I hope we can just figure this out on our own, but I’ve been hoping that for 12 years now.
[0:13:14.4] AA: Yeah. I mean, unfortunately the tech industry has tried to push regulation off, and quite pointedly. There was some interesting stuff that the FCC (Federal Communications Commission) was going to do in the Obama administration that basically got lobbied away, right? Now GDPR (the General Data Protection Regulation) has come in and is starting to be that regulation. If you thought you didn’t like what came out of Congress, are you really going to like what comes out of the European Union?
Anyways, I shouldn’t be pontificating. This isn’t about me. This is about you. So walk me through your Hack the Pentagon efforts and program. For those who haven’t heard that story, I think they’d love to.
[0:13:56.7] LW: Yeah. We hacked the Pentagon. We still are. I had gotten to a point where I’d been trying to throw the bowling ball through the window and hit the Pentagon and shake up the culture a fair amount. I’d spent a lot of years of my life on the offensive problem set, and I kept thinking, you can use the same set of people for defensive purposes too. I had thought about this for quite a few years.
Microsoft had come out with their bug bounty program, and we had been asking a lot of questions about that, straight to the Microsoft folks. It got to a point where, well, the OPM hack happened. I had a buddy who is a hacker for the government who called me up and said, “Lisa, my information was just stolen too. Can you please just find some kind of legal way for me to jam on some of these other government systems, the kinds of things that implicate my life too? I’ll do it on my weekends just so that we know where our big problems are.”
When you have a bunch of folks like that who just want to help, and they just want to help, I thought, “You know, we really have to figure out how to do this.” He was exactly right when he said, “Can you find a legal way for me to do this?” Because at the time, even though he was a Department of Defense contractor, it would have been completely illegal for him to do even basic scanning of certain systems in the government.
We spent a lot of time trying to figure out where all of the third rails were and what the laws would be so that we could allow people completely unaffiliated with the Department of Defense or the government to hack us. Hack the Pentagon was initially a three-week long bug bounty where we allowed 1,187 people completely unaffiliated with the U.S. government to hack us. They signed up with a username, so the government didn’t even know who they were, but they signed up and told us – they committed to us that they would follow our instructions to a tee, and they did.
And that was the most shocking part for me. I thought, “For sure, we’ll have a couple of incidents where folks try to do something nefarious,” and they didn’t. What was interesting, being on the inside, was trying to be the person to actually get seniors across the government okay with the fact that we were going to let people we didn’t know hack us. We had to do a lot of PR management in some ways. I had to do a lot of promising that I was going to make this as controlled as possible given the parameters. A lot of people were still really nervous, and we’re talking about a lot of people with a lot of stars on their shoulders who were really nervous.
Afterwards, after we did a lot of forensics, I think one of the things that I did very well in terms of making this a successful ongoing program is that I wrote the hell out of everything. I would write after-action reports. I would write a final report, and I would push the information out as broadly as I could to make everybody in the government feel very comfortable about it, and they did, and now it’s an ongoing thing. We just announced that Hack the Defense Travel System is going to happen. We hacked the Army. We hacked the Air Force twice. It’s cool, because we’ve got a good cadre of researchers across the globe who are looking for weaknesses in DoD systems, telling us what they are and providing recommendations to us on how to fix them.

Discovery, disclosure and remediation are the three parts of the vulnerability life cycle. When you outsource basically two-thirds of the problem, two of those parts [discovery and disclosure], and you just let your workforce focus on the remediation phase, suddenly you go from having vulnerabilities open for months or years, in some cases years from the time you’ve known about them, to days, with weeks being a long time. I mean, that’s a really, really empowering thing. It’s made a big, big security impact.

The secretary of defense at the time, Ash Carter, said that we saved easily a million dollars just by doing the first bug bounty pilot. There’s a lot of obvious potential that we, and I say we even though I’m no longer with the U.S. government, keep trying to push forward on, and I’m really proud of that.
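To make that division of labor concrete, here is a toy sketch, my own illustration and not anything the DoD runs, of the three-phase life cycle Lisa describes: outside researchers handle discovery and disclosure, and the internal workforce focuses on remediation. The vulnerability name and dates are hypothetical.

```python
# Toy model (illustration only, not DoD code) of the vulnerability life cycle
# Lisa describes: discovery -> disclosure -> remediation.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Vulnerability:
    title: str
    discovered: date                   # phase 1: an outside researcher finds it
    disclosed: Optional[date] = None   # phase 2: reported through the bounty program
    remediated: Optional[date] = None  # phase 3: the internal workforce fixes it

    def days_open(self) -> Optional[int]:
        """Disclosure-to-fix window: the span the program shrinks from
        months or years down to days or weeks."""
        if self.disclosed and self.remediated:
            return (self.remediated - self.disclosed).days
        return None

# Hypothetical finding: reported the day it is found, fixed 12 days later.
v = Vulnerability("auth bypass on a public portal", discovered=date(2016, 4, 18))
v.disclosed = date(2016, 4, 18)
v.remediated = date(2016, 4, 30)
print(v.days_open())  # -> 12
```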
[0:18:45.3] AA: Yeah, and when I heard you tell this story, you talked about what it was like before that, before there was an official program. I think it was interesting. What was it like? There’s the community of security researchers and hackers, right? In the general media, hackers are always seen as nefarious, as potential criminals.
[0:19:09.6] LW: Always wearing that black hoodie too.
[0:19:12.4] AA: Right. Yeah, always the same picture.
[0:19:13.4] LW: Green code going above, over their heads. Funny.
[0:19:16.2] AA: Yeah. It’s like we need a new picture, right? Talk about what it was like before that program was enacted.
[0:19:20.6] LW: One of the things that is most interesting to me, and I just don’t even know how it can be, is that within the Department of Defense, folks really didn’t understand that the kinds of skills you need, the people you need, to do the things that you expect the NSA and Cybercom to do, those are hackers. They’re hacking. That’s what they’re doing.
So the people doing that, contractors a lot of the time, are the same people who are going to be able to help you on defense if you allow a wider group with the same skillsets to participate. There’s a finite number of people with those skills across the globe, and if you can get more of them, you can look at more things. People really didn’t put that into perspective, because, as you said, the word hacker has always been a really bad thing, which is why we called them security researchers for quite a while. That’s one of my hacks, right? Just call it something else.
They did have a really bad rep for a long time, and if I were a hacker, I wouldn’t want to do any work with the government given how badly they’ve been demonized for as long as they have. We were getting information, and in fact you can just Google it and see all kinds of stuff out there. People would post cease-and-desist notices that they were issued from some lawyer at the NSA because they’d done some port scanning of DoD IP space. I can’t even imagine how terrified I would be. Violating the Computer Fraud and Abuse Act is of course a felony, and a lot of people have done some hard prison time for doing that kind of stuff.
We needed to really fix that. We needed to be done with that. We needed to make it not illegal, de facto or otherwise, and we needed to make sure that people could continue to do what it is that they do, whether it’s part of their everyday job or one of their hobbies. But the problem, and this is why it took a while, a lot longer to set up than most folks would believe a bug bounty should take, at least at the Department of Defense, is that a lot of the senior leaders honestly, and this isn’t because they’re bad people, obviously, really didn’t understand how insecure we were. This is after the OPM hack and all kinds of DoD hacks. They just didn’t really quite understand.
I said, “Okay. Let’s do this. We’ll have a couple of our own red teams go against our systems for a few weeks before we open it to hackers. Will that make you feel better? At least then we’ll have eyes on where we know our weaknesses are,” and these are our best people, our internal workforce, that is. When Hack the Pentagon opened, we had somebody submit a vulnerability within 13 minutes. That wasn’t one of the ones found by a red team, right?
Over the next few weeks, people really started to understand that security through obscurity does not work. You cannot just tell the world that you’re secure when you aren’t. So that was a big cultural change, pushing away from that.
Fresh eyes are really the key when it comes to finding weaknesses. If you have the same set of folks looking at everything all the time internally, without bringing in fresh blood and fresh eyes, it’d be like looking at a term paper in college for the ninth time. You’d miss every typo after the second time you read it, but if you gave it to your buddy, he’d find all of the typos. It’s the same concept.
When you break it down that way and then you show them some numbers too, like, “Hey, we found 138 actionable vulnerabilities during this three-week program,” vulnerabilities our people were then able to go fix, that’s a really empowering thing, and that has created a big cultural shift, I’d say.
[0:23:27.1] AA: Yeah. We’ve talked about a maturing industry, right? Any industry that’s doing unique research or pushing the knowledge base forward has some sort of process where work is peer reviewed and open, right? I think about medical research and those sorts of things, where I’m sure it’s a scary moment when you send out that study and suddenly all your peers come at it. But you do that. That is how things move forward.
Then there’s that second piece, which I think is really interesting, which is changing the culture of how you respond to things, right? Sunshine cures a lot of ills, and once there is sunshine on these issues, you can no longer, as an organization or as an individual, delude yourself into believing, “Hey, no, no. We’re fine. This isn’t a big deal.”
[0:24:28.0] LW: Yeah. Certainly as an organization, that’s gone. I think that veil is starting to be pulled back across the U.S. government as well, and across many governments. The U.S., of course, started this whole push forward, but plenty of other governments are following suit, which is exciting.
There are always going to be individuals who say, “Don’t worry. I got this. We’re cool. My people are great.” To those folks, all I can say is, “Listen. For the foreseeable future, human beings are the ones developing code, which means it is going to have flaws.” It’s just the way it is, and it doesn’t mean that anybody is doing anything maliciously. It’s just the way it is.
You have to find different, unique ways to make sure that you’re minimizing your risk based on that and, most important, that you’re fixing the weaknesses, the vulnerabilities that you know about, in a very short timeframe, instead of putting them on the shelf and saying, “Just because I know about it doesn’t mean anybody else does, and nobody is going to bother to use this vulnerability.” So long as you do that, your organization is definitely going to be more secure because of it.
[0:25:43.7] AA: Yeah. I mean, I think the pure complexity of systems, whether humans create them or machines create them, is just enormous. I think you’re starting to reach a point where it’s almost organic in its complexity. The classic joke is that a butterfly flaps its wings in Beijing and that affects the weather here in New York. We may be getting there, and if you look at these diagrams of the number of IoT devices, they’re just exploding. There is that level of complexity, and you have multiple actors at all different levels of sophistication acting in different ways with different motivations, and sometimes it’s uncertain, unclear, what those are.
Let’s talk a little bit about disclosure, right? Thinking through what we were talking about before we started the recording, how do you think about the challenges in a lot of the devices that we are increasingly surrounding ourselves with and depending on?
[0:26:49.7] LW: Yeah. Congress has done us, us meaning our community, some favors. Though they’re not able to pass much legislation these days, they are still helping shine a light on certain big issues. Obviously, IoT devices and now election equipment are two really big thrusts. I kind of lump them all under this idea of IoT.
I think any time you deal with systems where you’re dealing with the intersection of hardware and software, that seam is the place where most vulnerabilities lie. In those two examples, it’s not that every IoT vendor is without a great security plan, but most of them are, because security isn’t necessarily what they do.
I think they really need to take a hard look, particularly the IoT vendors that offer devices for private citizens like myself to purchase that might implicate privacy and/or safety. If that’s the case, then we need to find a better way to do what it is that they’re doing today, because if you don’t start to take security more seriously, perhaps dire things will happen. Medical IT devices are of course a really great example of that. Automotive too, every single widget in a car these days. Those are the kinds of things where you’re putting lives in jeopardy, not just data breaches and ransomware. Those things are bad, but when you get to a point where it could affect a life, then you need to really stop and take a real hard look.
I do think that a vulnerability disclosure policy is a tremendous way to at least have what I call a “see something, say something” channel, right? If I were a researcher trying to figure out whether I wanted to buy an iWatch or a Fitbit, maybe I would want to actually test the security of both, or at least have enough information to understand the privacy implications of either, before I bought the device. There are a lot of things that will evolve over the next couple of years to allow consumers to make better decisions on things like that, I think. But the first step is 100% this: if there are vulnerabilities in these devices, and it’s not even worth saying if, because there are vulnerabilities in these devices, the vendors should have some way of hearing about them from the security research community.
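One concrete form that “see something, say something” channel takes today, not mentioned in the episode but a useful illustration, is a security.txt file (RFC 9116) served at /.well-known/security.txt, which tells researchers where to send a report and points at the vendor’s disclosure policy. All values below are placeholders.

```
# Hypothetical /.well-known/security.txt (RFC 9116); placeholder values only.
Contact: mailto:security@example.com
Policy: https://example.com/vulnerability-disclosure-policy
Expires: 2026-12-31T23:59:59Z
Preferred-Languages: en
```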
[0:29:37.2] AA: Yeah, obviously your experience in the space is much more extensive than mine. Let’s say I’m a security researcher and I find a vulnerability in a smart watch or some sort of device. Can I disclose that publicly, or do I risk legal action from the company if I do? Walk me through how that works from a legal perspective, because it’s not something that I’ve really thought about.
[0:30:01.6] LW: Right. Sometimes it’s murky, right? Because the Computer Fraud and Abuse Act still exists, as do other pieces of legislation, and unauthorized access is still a felony.
[0:30:10.9] AA: Even for like a normal — Like whatever —
[0:30:15.1] LW: Okay. If you were to —
[0:30:18.1] AA: Like I bought that watch. Is that considered illegal?
[0:30:22.4] LW: Unauthorized access is still not permitted. However, if you —
[0:30:28.8] AA: But if it’s a product that I own, is that considered unauthorized?
[0:30:30.5] LW: Do you own the code?
[0:30:31.4] AA: Interesting.
[0:30:33.7] LW: Right? Who owns what is one of those things where I’m just grateful I’m not a lawyer, right? Because those are the kinds of questions that would drive me insane. It’s kind of a circle of doom. It depends on different states and localities, not to mention federal courts. One of the ways that companies have made this easier for researchers is giving them an alternative to making it public, to tweeting to the world, “Yo! I found this vulnerability.” That is something you definitely don’t want, whether you’re the researcher or the vendor. You don’t want to do that.
Vulnerability disclosure policies are a really great way to avoid that. It still allows that researcher a legal avenue just to tell the vendor, “Hey, I found this vulnerability.”
[0:31:23.5] AA: A safe harbor system.
[0:31:24.5] LW: Exactly. Yes, a great analogy for it. Then the company, because they have that policy, has authorized that individual to do the testing activities to find that vulnerability.
[0:31:37.2] AA: Although, whether it happens or not, you would love the company to do the right thing, right? But we’re also realists and students of history, at least I am. So you also want the researcher to have the ability to disclose it in a reasonable time period if he feels like the company is not being responsible.
[0:31:59.7] LW: Yeah, and that’s kind of a normal practice that a lot of organizations have followed in the past. At my company, Grimm, we do this thing called Not Quite 0-Day Fridays, where we’ll have found a vulnerability and we’ll discuss what it was, how it worked, how we found it. Frankly, it’s a great recruiting mechanism. It’s a great tool for fellow researchers across the globe to look at and learn something from.
But we always work with the vendor first, and we try really hard to get that information to them. We’re a pen-test type of firm. This is not extortion. We’re not trying to say, “Hey, give us all the moneys. Here, take this.” It’s completely legal and ethical, but still, there are always going to be issues like that if a company is not willing to even listen to you when you try to tell them, “You’ve got this vulnerability. It’s really important to me that you fix it. I’m sure it’s important to your other users as well. Here’s what it is. Happy to help you in any way if you need it.”
[0:33:08.9] AA: Yeah. I mean, think of even the situation with the major vulnerabilities in chipsets, Meltdown and Spectre, when those came out. You could completely understand a company basically wanting to keep those private and not ever wanting to disclose them. There’s the court of public opinion, and you may win there, but I would not want to be getting sued by Intel. They have very, very deep pockets, and they could put me in litigious hell for a long time, right?
[0:33:44.9] LW: For a long, long time. Long, long time. Yeah, I think one of the ways a company can minimize that risk, though, is just a good feedback loop with the researcher. In my experience, the Hack the Pentagon pilot was so successful that we were able to launch a DoD-wide vulnerability disclosure policy, and we’ve had thousands of vulnerabilities submitted to us, thousands, and our people have been able to fix them. Think of how empowering that is, and we don’t pay those folks a cent for it. It’s not incentivized. It’s just there as a, “Hey, if you have information about a DoD vulnerability, please submit it here. Thank you very much.”
We wind up communicating with those researchers a lot. The researchers are very understanding if something is taking longer to fix than what they expected. Now, it’s not a chip that two-thirds of the world relies on or anything, but obviously there’s some work that would need to be done for things like that, which are kind of one-off edge cases.
So as long as normal firms with normal vulnerability issues are constantly providing feedback to the researcher and not making them feel as though, “We don’t even take this seriously, kid. Thanks so much,” then you’re in good shape.
[0:35:07.0] AA: Where do you see regulatory oversight happening, at least in the U.S.? Given your experience hacking bureaucracy, I’d be curious what agency you see that coming out of. Is that the consumer protection bureau?
[0:35:25.0] LW: I do. I think so. In part because IoT is the thing that most people think about now when they think about cyber security risk, because it’s the thing that implicates everyday citizens. If you can give your toaster an IP address, that’s kind of a scary thing. So consumer protection probably is the right place for it.
There’s going to be a lot of —
[0:35:53.9] AA: Currently, though, the FCC is getting pretty involved, because any time spectrum or essentially network communication is involved, the FCC starts to step in.
[0:36:02.7] LW: That’s right. This happens with everything when it’s an initial go at trying to, no kidding, come up with some kind of standards, whether that’s regulation or not. There will be a lot of stakeholders initially, a lot of stakeholders. You’re already starting to see the effects of that, where you have members of Congress from every walk of life, from every state all over the place, who are suddenly very interested in these spaces.
So there will be many, many stakeholders, and I think in the end it will shake out. Because this all boils down to protecting consumers, consumer protection very likely really is the one.
[0:36:44.1] AA: Also, do they have any money?
[0:36:46.2] LW: Well, does anybody right now? Or are we going to have to give it to the Department of Defense, just because they’re the government agency that has money?
[0:36:54.9] AA: We’ll just be taking all our IoT devices and dropping them off in the middle of the Pentagon, in the middle of that courtyard.
[0:37:03.0] LW: By the hotdog stand. Yeah.
[0:37:03.8] AA: Anyways, thank you so much. This is fascinating. Really interesting and we touched upon a lot of good stuff.
[0:37:08.9] LW: Thank you. Super fun.