Uncle Sam is Learning New Tricks - An Interview with Steve Orrin, CTO of Intel Federal


Full Transcript of the Interview

In this interview, we talk with Steve Orrin, CTO of Intel Federal, and take a deep dive into how government agencies are speeding up and changing their processes for adopting new technology.

Well, Steve, just introduce yourself: your name, where you're from, your company, those sorts of things.

Sure. I’m Steve Orrin. I’m the Federal Chief Technologist for Intel Corporation. I drive our strategy, direction and technologies working with the government, as well as helping the government work with Intel.

Awesome. Everybody seems to have ended up in the world of security in a unique way. What was your path into this space?

I started out as a research biologist, going all the way back, and was going to do something in that field. Then I had an interesting idea back in '95 and did my first security startup before going to med school. The rest, as they say, is history.

Med school, I trust never really happened.

Never happened. I did about four security startups throughout the '90s and early 2000s. Then one of them was acquired by Intel, which is how I ended up here back in 2005. It's interesting, there are two benefits to not having come from a classic CS or EE background. The first is that I don't assume things work in a certain way, because I wasn't taught that that's the way it has to be.

The second is that, in many cases, it has helped me translate what we're doing in cybersecurity, and security in general, for audiences that don't have that background. Working within healthcare organizations, or on HIPAA, and understanding how the security technologies help, I can speak their language. Now, with things like genetics research and the AI being applied to it, having some of that background has really helped me be the translation function and communicate across multiple domains.

Yeah. I bet that biology background helps with the interplay of complicated systems.

Absolutely. We've seen examples where computer systems are compared to biological systems. Now, neuromorphic computing and neuroscience have brought it back to the fore. We're modeling chips after brain activity, so it comes back to haunt you on a regular basis.

We certainly steal the terminology: viruses and malware and all those sorts of things, right? That's interesting. One of the things I've seen you talk about is defining the problems of the organizations you work with, and your focus is really government, particularly the federal government: bridging the gap between the problems they have and the new technology that's either out there or potentially coming down the pike. How does that process work?

Well, I think it's part of how we've approached the industry for a long time: problem solving. Really, it starts with listening. Spending time with the customers, spending time with the various agencies and the ecosystem that supports them, to understand what they're trying to do, the challenges they're facing today, and where they want to go.

In the government space especially, they design things well in advance of actually building them. It's about getting a feel for what they're actually trying to achieve and talking to multiple customers. What's happening in one agency may be slightly different than in another, but you can find those common themes.

Then it's understanding the breadth of the technologies, from security obviously, but also compute capabilities, networking, data analytics and artificial intelligence, and piecing it together and saying, "You know what? If I brought something from over here and used this system here in this way, we can actually solve not only the problem this one agency is having, but the commonality we're seeing across multiple agencies."

Really, the nice thing about working with the government is that a lot of times they're a vanguard for the broader industry. A challenge the government sees today with a drone, or with a compute system, is something that banks, healthcare and industrial are going to see, or are already seeing, but don't necessarily know they have yet.


In the government space especially, they design things well in advance of actually building them. It's about getting a feel for what they're actually trying to achieve and talking to multiple customers. What's happening in one agency may be slightly different than in another, but you can find those common themes.
— Steve Orrin

We can use those requirements and bring them to the broader commercial space. That's been the approach for a long time. One of the big changes we're also seeing is that the government is willing to take commercial systems, as opposed to everything having to be special for the government. You're looking more at how to take what's already working and scaling in commercial, or even consumer, use cases; take 80% of that and do the federalization for the last 20% to harden it, or to make it work in that mission context.

As we're seeing, there's a two-way street now that we haven't seen for a long time: adopting technologies that are more readily available, with quicker time to deployment, but also helping drive the commercial side to get better security and better performance.

Yeah. Particularly in security, it seems like the flow of individuals and ideas between government, academia and private industry is more fluid compared to other spheres. A lot of the funding might be coming from different places, and so might the people working on the problems.

We're seeing a lot more of it, whether you call it government-industry collaboration or government-industry-academic collaboration, with funding going through universities to commercial entities. We're seeing that both in the engagement on ideas and in the actual funding of projects. I think a lot of this is because of the recognition that good ideas can come from anywhere, and it doesn't need to be a 20-year program to get us to something usable.

Yeah. I mean, certainly we've seen the wish list come out of an agency: "Okay, we're looking for potential solutions here." How does that work? Who is going to realize that, and where might it come from? From academia, or academia morphing into private industry, those sorts of things?

We're seeing private industry sort of separate into two buckets. The big companies like Intel and Microsoft and Amazon and others have a focus on federal and engage with their commercial capabilities. But there's also an effort to bring in the smaller companies. There are a lot more vehicles, or contracts, available for small companies to get involved. There are organizations like DIUx and others that are specifically tasked with finding those innovative companies and helping bring them into the DOD.

Then there are companies, a couple of them in the Valley, that provide training courses: how can your startup learn to work with the government? We're seeing a lot more of that investment, because we can't wait for some lab to come up with the next big thing. From the government's perspective, they have needs today. They want to be able to do what Silicon Valley does; what New York is doing for business, the high-speed transaction processing, is needed as well. You're seeing a lot more openness to engage, and I think we're seeing it from both sides.

One of the areas I know you've spoken about in the past, and it's tied in with those issues, is the movement to the cloud: whether you're putting resources or data out there, and what that process looks like. For someone who isn't as deeply enmeshed in that space, walk through some of the challenges, as well as the opportunities there.

The cloud presented some very interesting challenges for the government. Most people think, well, of course, the security problem. Yes, that was a huge problem: how do we secure the cloud? How do you apply the security controls for data protection, for systems, for the regulations around them, and so forth? How do you deploy to an environment you don't control and don't own?

Some of the early stages were hosted private clouds. That's still an important part of the puzzle, but part of what changed was things like FedRAMP and others helping make it easier for cloud providers to offer that. The biggest challenge, especially in the early days, wasn't the technology and wasn't the security. It was the contractual capability.

Typically, you buy a thing: I want to buy a phone. You can't buy a cloud. The idea of a subscription model, or buying services, was not something the acquisition process was built for, and then you have to be able to deploy it through that process. Also, when the government wants to buy something, it's not just the purchase. There's an accreditation process, and there's an authority to operate. Those are documentation, certifications and tests that have to be done. If I did it once for one agency and then went to sell my product or widget to another, I'd start over from scratch. I could reuse the documents, but I'd have to go through that process again. That doesn't work in a cloud model.

FedRAMP now provides a framework not only for the security controls, but also for the contractual mechanisms, to enable the adoption of services. That was a key change that helped the government start to adopt. It also gave the cloud providers the means to figure out how they could take their cloud and make it a GovCloud, or Azure Government, or these other things.


FedRAMP now provides a framework not only for the security controls, but also for the contractual mechanisms, to enable the adoption of services. That was a key change that helped the government start to adopt.
— Steve Orrin

That worked for the civilian side. It worked really well, and we're seeing a lot of adoption of the cloud by civilian agencies, and state and local of course. But then you start looking at things like the DOD, which has a higher level of security requirements, plus separation and segmentation requirements. That required somewhat different engineering on the cloud provider side. We've seen GovCloud, we've seen Amazon C2S and other implementations where the cloud providers have gone that extra mile and are going through the process to get the classifications and clearances they need.

It's been a two-way street on that. There are still some fundamental challenges. You have this notion of: I want to be able to protect my data independent of where it's living. I need to be able to apply my security controls in an environment that I no longer control. Even with hosting, you could still point at a data center and say, that's the data center, and I have my people sitting there operating on it. When you have shared services, figuring out how I still get that level of security control and visibility is a fundamental challenge.

This is why one of the things I worked on for a number of years is taking those physical hardware capabilities and implementing them in the virtual and cloud domain, to bridge that gap and give you the ability to use attestation of the environment before you deliver your workload or your data.

That was another key change; once it got adopted, it enabled people to be more comfortable with these models. We won't see the DOD throwing everything into a public cloud; that's not what's happening. But the notion of a hybrid cloud, and being able to have data in specific community clouds, is definitely happening already and will continue to do so.

The next challenge will be as they look at the next layers of the stack: PaaS, SaaS and function-as-a-service present unique challenges that have to be worked through. Because again, with infrastructure I still have a thing; I can build my software load as a gold disk I can put out there. This is my database; I know where it is. When you distribute that across multiple services, across multiple systems, it's another area of: how do I then wrap that with the same controls?

A related question, and I'm curious because this isn't an area we've had as many conversations about: if an agency, whether through FedRAMP or more on the DOD side, has vetted the environment of one of the major cloud providers (AWS, or Azure, or whatnot), do you then have the opportunity for some of the smaller startups to piggyback on the work that's already taken place?

"Hey, I have my SaaS solution. It normally runs in an AWS environment. You, the federal agency, are comfortable with that environment. Can I then create essentially a private, bespoke deployment of it in that environment, so I don't have to go through all the hoops and whatnot?"

I can give you two good examples. One is when we worked with IBM Federal and IBM SoftLayer.

Working with the cloud provider, we were able to build in and bring in companies like HyTrust, which provides access control, policy control and data encryption. Having that run in the environment as part of a service can then be provided to the tenants, the government customers.

In that case, it was picking a very specific set of ecosystem vendors and building them in as part of that offering. Then you see others, and we see this with Amazon and Google, where if the application from that SaaS provider passes the appropriate security reviews (because they still need to have the security controls in their system), there's an easy migration path from the public cloud, assuming you have a government agency that's going to sponsor it, or you meet a certain bare-minimum level of requirements.

I don't think Amazon advertises it much, but there's an Amazon marketplace for GovCloud, as well as equivalents for Google and others, where ecosystem vendors that have implemented the security controls and gone through that certification process can now run on that infrastructure and provide their services to government customers as well. It comes from both directions: either an ecosystem vendor wants to get into GovCloud, or wants their product running for government, and they go through the heavy lift; or an agency says, "I need this product," and works with the vendor to help get them there.

Yeah. Then the checklist of what you need to do just becomes a lot shorter. There's still a checklist, but it's not 100 items, it's 10.

Exactly. The other thing is you can ride on top of the fact that you're running on a FedRAMP cloud. You don't have to go through a re-inspection. You can leverage that certification and the authority to operate.

Yeah, very cool. You touched on it a little bit: bridging digital security and physical security. How are those two worlds coming together and maybe trying to improve each other?

There are a couple of really good examples of this. One is around root of trust: being able to measure and then attest to the measurements. That means using the physical hardware in an environment to securely boot the system, then attesting to that secure boot and wrapping it up in what we call a quote. That quote can then be used by the virtualization infrastructure, and by the cloud management and policy infrastructure, to say, "Before I provision this workload to that particular server, or to that group or cluster of servers, I can attest to the security state it booted into in hardware and verify that prior to provisioning."

Or say I'm going to encrypt something for an environment. I can verify that the key is protected in a hardware TPM and will only be decryptable on the system that has attested to its security. That's a really good example of bridging the gap between physical and virtual.
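The attestation-gated provisioning described above can be sketched roughly as follows. This is a hypothetical illustration, not Intel's or any vendor's actual API: a real TPM quote is a signed structure over PCR registers, which is simulated here with a plain hash of the measured boot chain.

```python
import hashlib

# "Golden" measurements the policy engine trusts, e.g. digests of
# known-good boot chains. (Values here are invented for illustration.)
TRUSTED_MEASUREMENTS = {
    hashlib.sha256(b"bios-v2.1|bootloader-v5|kernel-4.19-hardened").hexdigest(),
}

def quote_for(host_boot_log: bytes) -> str:
    """Simulate the hardware 'quote': a digest over the measured boot chain.
    A real quote would be signed by the TPM so it cannot be forged."""
    return hashlib.sha256(host_boot_log).hexdigest()

def can_provision(host_boot_log: bytes) -> bool:
    """Only allow workloads onto hosts whose quote matches a trusted value."""
    return quote_for(host_boot_log) in TRUSTED_MEASUREMENTS

good = b"bios-v2.1|bootloader-v5|kernel-4.19-hardened"
tampered = b"bios-v2.1|bootloader-EVIL|kernel-4.19-hardened"
```

With this shape, `can_provision(good)` succeeds while the tampered boot chain is rejected before the workload or data is ever delivered.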

Another great example, and this comes up a lot in current regulations, is sovereignty and geo-location: that data physically sits within a country, within a region. You may have certain regulations, whether it be some of the European Union regulations –

GDPR.

Or, in the case of FISMA, a requirement that it be based in the US, as well as other safe-harbor requirements. You need to know not only that the system is secure, but that it's in the location it's supposed to be in for you to be able to do that work.

It's about getting location tags embedded into a system and making them attestable, hooking up to those physical systems and then communicating that across the digital side as part of the attestation, so I know it's a secure server, in Virginia, in the Chantilly data center. Then I can use that as policy, just like you would when you say, "I only want to provision to systems that can handle this throughput and this capacity and have this memory." You can now have these attributes: is the system reporting itself to still be secure? Is it in the correct geo-location, or the correct cluster, for my policy? And you use that as just another part of the migration or provisioning policies.
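Treating attested facts (secure state, geo-location) as ordinary scheduling attributes alongside capacity, as described above, might look something like this sketch. The field names and policy shape are invented for illustration; a real scheduler would consume signed attestation evidence rather than plain dictionaries.

```python
# Each host advertises attested attributes next to its normal capacity facts.
servers = [
    {"name": "node-a", "attested_secure": True,  "geo": "us-va", "mem_gb": 256},
    {"name": "node-b", "attested_secure": False, "geo": "us-va", "mem_gb": 512},
    {"name": "node-c", "attested_secure": True,  "geo": "eu-de", "mem_gb": 256},
]

def eligible(policy: dict, hosts: list) -> list:
    """Return names of hosts that satisfy every constraint in the workload's
    policy: attested-secure boot, required geo-location, and minimum memory."""
    return [
        h["name"] for h in hosts
        if h["attested_secure"]
        and h["geo"] == policy["geo"]
        and h["mem_gb"] >= policy["min_mem_gb"]
    ]

# A workload whose data must stay on secure hosts in a US-Virginia location.
fisma_policy = {"geo": "us-va", "min_mem_gb": 128}
```

Here only `node-a` qualifies: `node-b` failed its secure-boot attestation and `node-c` is in the wrong geo-location, so capacity alone never overrides the attested constraints.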

We're seeing that as another linkage of the digital and physical worlds. Where this gets more interesting is when you get out of the big data center side of cloud and start looking at tactical and deployed clouds. There you're asking not only, can I know that this cloud, a rack sitting in a closet somewhere, is secure, but also, has anything changed on it? Has it been accessed? You want that evidence not just from the actual software and the boot of that software, but from all the access points and touch points on that system. Is the firmware secure? Has it been upgraded? Again, it's all part of the attestation of the physical and virtual state of a given system.

Tied in there also is encryption: thinking about where it is encrypted. Is it encrypted in transit? Is it encrypted at rest? What are those states?

Does it have the capabilities to support at-rest, in-transit and now in-use encryption? Are the keys supporting that protected and verifiable?

I know public/private key exchange is near and dear to your heart.

Indeed.

Awesome. Yeah, I think it's interesting. A lot of the things you've touched upon, the attestation of where the workloads and servers are, the encryption piece, as well as control over the data, are all things we're talking about every day. We spend a lot of time thinking about moving target defense: can you change your network profile and make the attack surface look different, and keep changing, for an adversary? You couldn't do that without a number of the things you talk about: understanding where things are, what is there, whether we can attest to them, and whether things are encrypted.

I think those technologies, being able to attest to the security state, physical location and operational configuration, are going to be essential for scaling those software-defined perimeter and dynamic defensive measure approaches. One of the key challenges is that while this thwarts certain reconnaissance and attack vectors, if you can't manage something at scale, it doesn't do you any good either.

You need to build in that I can make these changes and have a dynamic environment, but from a management perspective, on the good-guy side of the gap, I can still get visibility. I can attest that this is the system and the configuration I deployed, even though the ports may be changing and the IPs and the services, and where they're hosted, can be dynamic and moving around. From a management and security-management perspective, I can get that visibility.

That's how you're going to get comfortable scaling and operating it across multiple nodes. I think those two are going to work hand in hand. I see a lot of promise in this dynamic digital perimeter, especially the idea that what you're protecting has changed. It's no longer, "Well, I've got this big enterprise and I want to keep a hard shell on the outside," and then, as we discussed, there are things coming through anyway, so I'm going to supplement with defense in depth.

Well, even defense in depth is really something that's falling apart now, when you have microservices and SaaS, where it's no longer even your perimeter to defend. Having not just a dynamic or soft environment, but a perimeter that moves with the application and with the data, inside and out, is ultimately how we're going to try to solve some of the security challenges with these integrated and incorporated services.

Yeah, and moves with the teams too, right?

Exactly.

We're often talking with people and thinking about segmentation. Usually they've got three categories: untrusted, trusted and super-trusted. Beyond that, actually implementing things becomes really, really challenging.

I think the shift we should be looking at is moving away from walls (again, it's a hard change) and from people and networks to being data-driven. At the end of the day, the data is what the attacker is after, and what's important to the company is the access to and use of that data. The controls we deploy should be dependent on the data. The data should have those classifications, and then it's just a mapping of the firewall, the network and the system to the data; we often call this data use controls. I think if we take a data-centric approach, that will help us deal with this more dynamic environment, because data will live wherever it lives, and data wants to be free.
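A minimal sketch of the data-centric "data use controls" idea: the required controls follow the data's classification label wherever it lives, rather than depending on which network segment it happens to sit in. The labels and control names below are invented for illustration, not a standard taxonomy.

```python
# Controls demanded by each classification label (illustrative names only).
CONTROLS_BY_LABEL = {
    "public":   set(),
    "internal": {"encrypt_at_rest"},
    "pii":      {"encrypt_at_rest", "encrypt_in_transit", "audit_access"},
}

def required_controls(label: str) -> set:
    """Look up what the data's label demands, independent of location."""
    return CONTROLS_BY_LABEL[label]

def placement_allowed(label: str, environment_controls: set) -> bool:
    """An environment may hold the data only if it enforces every control
    the label demands; the check is the same for a laptop or a cloud."""
    return required_controls(label) <= environment_controls

# Two hypothetical environments and what they actually enforce.
laptop = {"encrypt_at_rest"}
gov_cloud = {"encrypt_at_rest", "encrypt_in_transit", "audit_access"}
```

Under this model, PII can land in the hardened cloud environment but not on the laptop, while "internal" data fits either; the firewall and network rules become a derived mapping from the label rather than the primary control.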

Right. Are you starting to see that happen in practice? Are you seeing either organizations (in this business, you never name names, right?) or projects where they're starting to really hone in on where the data is going and how they're protecting it, that sort of focus?

We are. Obviously, things like GDPR, and past regulations as well, require people to really understand their data: understand what's important, what's PII, what's the IP I want to protect. At the same time, looking at the military and the government, we have things like classification levels.

It's always been there, and I think the current state of these architectures is really driving the need to get better at how we manage the data life cycle. One of the things that will also exacerbate that is analytics. Knowing my data is a good thing: I know my data, I know where I created it. But what about the aggregate of that data, or the inferences I derive from it?

That really starts to stretch things: "Well, I need better controls and metadata around that, so I can protect its access points." At the same time, you want to be able to give someone access to the inference or to the aggregate without giving them access to the underlying data. I think those things are really getting people to think differently.

I think there's a lot of good work that's been done, and we're starting to see organizations get smart about how they do data use protections, controls and access. The answer isn't just "encrypt everything." That is part of it, but it's also about how you encrypt it, where you encrypt it and how you decrypt it.

I think it's that full life cycle that's really going to drive change. We are seeing a lot of folks in government and in industry look hard at how they're working with data. Things like cloud, shared infrastructures, information sharing and shared analytics are really driving them: "Now we have to come up with a better model." They're looking at solutions, whether those come from the enterprise side, ERM approaches, or more from the IoT space, at things that can be embedded with the data. We're seeing a lot of activity in that space.

Yeah. I'm curious. We've been having a lot of conversations, and everyone is certainly talking about GDPR and trying to understand how it will be implemented. In some of those conversations, people are basically thinking GDPR will in some ways become a worldwide standard, right? Because if you're going to have to comply with what's happening there, it's crazy to maintain other standards on top of it. Is that the thinking you're hearing from your conversations with the government, or whatnot?

There are folks that would love to see one standard to rule them all: just one thing to deal with. I don't think that's reality. I think what we'll end up seeing is multiple iterations and alternates, with existing regulations adopting GDPR-like facets and major countries coming up with their own flavors of it. Organizations, like they did when PCI and HIPAA and all the others came out a number of years ago, are going to have to manage and deal with all of them.


I think one thing to keep in mind is that compliance does not equal security. Even if we had just one standard and we had a good standard, it means that we can document what we did and report it to an auditor. It still doesn’t mean that your security prevents you from being attacked, or that you’ve protected the data. It means that you adhere to a set of requirements that came about via consensus.
— Steve Orrin

I think one thing to keep in mind is that compliance does not equal security. Even if we had just one standard and we had a good standard, it means that we can document what we did and report it to an auditor. It still doesn’t mean that your security prevents you from being attacked, or that you’ve protected the data. It means that you adhere to a set of requirements that came about via consensus.

I think GDPR is going to drive security, and it's going to drive a lot of product sales for a lot of companies. But I think we should use it as an opportunity to go back and look at our data, because again, GDPR is really about the data. Independent of GDPR, we're going to have to protect our data better. It gives us that opportunity to take another look.

As far as one standard goes, because it's the EU and a lot of companies are multinational, I think a lot of organizations are going to have to adopt it. But I think what you'll find is that it's a subset of a broader set of regulations. For now, there are going to be multiple regulations, and just when we get them figured out, there will be a new one we'll have to conform to in the future.

Progress recedes like a horizon. I've been peppering you with questions. You've got a soapbox: what do you want to talk about? What do you wish the community was more aware of these days?

I think there are two things keeping me engaged and excited. One is, how do we secure these artificial intelligence and machine learning environments, and understand the complexity of that full lifecycle? That's an exciting area where we've only just started to understand the different aspects: protecting the training, the inferencing, the analytics and cognitive side, and then the actuation or visualization, and understanding that those are different systems with different parties involved.

Some of them we have more control over than others. The system capabilities are very different too: what you have in a data center on the training side versus what you have in a camera that's doing the inferencing. I think there's a lot of exciting work to be done on how we secure AI. These systems are starting to make real decisions, and I don't think we have good visibility into what went into training one to recognize a person, a truck or a tree, and how the data it was fed affected that outcome.

That's one of the things near and dear to me: how we start securing that. Another key area is looking at the way systems are actually getting deployed. We're seeing very complex backend data center and cloud services, all of them talking all the way out to the edge, and we need to understand how we get the right security at the right place. I know it sounds a bit like hygiene, which it is. The idea that your laptops and the data center are the only things you have to care about should be a gone notion.

You have to worry about the entire enterprise, which includes the devices, the sensors, the components that are all connecting either directly or indirectly into your network, or that you're relying upon for mission-critical systems. The complexity and the challenge is that what I can do on my laptop, which is fully capable, what I can do on a phone, which is somewhat capable, and what I can do in a smart meter are very different things. I need to apply the right amount of security to each, good enough for its context, and that's still a hard problem.

I think that's an area where there's a lot of interest in what the right approach is. Do I make the smart meter the most secure thing on the planet? Do I put a gateway in place? Do I aggregate the security? How do we get the evidence and the controls deployed across that complex organization? If you see the theme there: I like complex systems.

Yeah. All right. We had some scary conversations about IoT devices earlier today, about what's out there and what's broken. And they're all on the same network, right? It's great if you secure the things you're more aware of, more often thinking about, but you can't even patch, let alone have visibility into, some of those devices. It's really a challenge.

Exactly. It’s like, what do you do when you’re being DDoS’d by your refrigerator, right?

We've all seen the Silicon Valley-style incredible things. Steve, this has been great. Thank you so much for sitting down with us. We've covered so much ground and so many different things. We'd love to have you back anytime.

Absolutely. Thank you.

Awesome. Thanks so much.