Trusted Computing has a Negative Connotation
Join our host Reece Guida, Beyond Identity's CTO Jasson Casey, Product Evangelist Nelson Melo, and VP of Product Strategy Husnain Bajwa (HB) as they discuss trusted computing and what the term really means.
Transcription
Reece
Hello, everybody, and welcome to another episode of "Hot Takes." We're here today with me, your host, Reece Guida, Enterprise Sales rep at Beyond Identity. And next to me is...
Jasson
I am Jasson Casey, the CTO of Beyond Identity.
Reece
Sounds about right. And Nelson, who are you exactly?
Nelson
Hey, I'm the founding engineer.
Reece
Then there's that other guy, HB.
HB
Hey, HB here, out of Austin. I'm in charge of product strategy.
Reece
So, last time we sat down to record an episode, we were talking about CircleCI and the token compromise that happened to them. And we got really into talking about trusted computing. So, we said, "Hey, why not do an episode about it?" And Nelson begged me to be the one to kick off the Hot Take. I was about to say hot cake. Because it was really personal to him. So, Nelson, I'm going to hand it over to you. Tell the people what's on your mind, please.
Nelson
And it's 12.
Jasson
And share the hotcakes.
Nelson
Yeah, and it's 12 and a hotcake would be kind of cool right now. So, we're talking about… I got curious because Jasson very casually dropped trusted computing into the conversation. And we've talked about it internally, but for me, the first time I heard about trusted computing, it actually had a negative connotation. If I remember correctly, I watched a video of someone, it may have been the EFF, really railing on why trusted computing was a bad thing.
And the thing I remember is that the origin of the term, or the folks who were trying to use trusted computing, were associated with Microsoft, and it had some tinge of DRM that didn't sit well with some people. So I got curious and wanted to ask these guys if I missed the boat there, or if there's a longer story. Because actually, I grew up in Cuba.
I heard of the term in the context of the Free Software Foundation. I think Richard Stallman mentioned it at some point in a talk I attended. So, what's the deal, guys?
Jasson
So, I think it's interesting for everyone to remember. So, trusted computing is older than the stories we're going to talk about. But one of the first major industrial uses in the 2000s was about digital rights management or more specifically, how does the music and entertainment industry prevent people from ripping CDs and DVDs and VHS tapes and sharing media?
Because if a device, if hardware could verify it was only running a very specific type of software and nothing else, then all of a sudden, you start to have some of the tools and techniques that you would need to actually be able to distribute digital media with some sort of structural guarantee that makes it, maybe not impossible, but incredibly difficult for people to essentially steal, right?
Now, that was the industrial application of it. The Free Software Foundation and really the larger community, I would say, of open source developers and whatnot, were kind of against the idea that Dell or Compaq... or Compaq was still a thing then, right...
or Gateway, that they could ship computers that would refuse to load the Linux operating system or BeOS or BSD, right, or, you know, on and on. And, you know, it's obvious why they would think that way, right? Like, this is what they worked on. This is their life's work. And also, there's a large hobbyist community where we just... we want to buy hardware, we want to tinker with it and adjust the software and whatnot.
So, the big part of the backlash that I remember, because I was actually a young, impressionable engineer at the time, was the early incarnations of TPM specifically, or a version of a trusted computing, was going to get in the way of me being able to run whatever software I wanted to run on the hardware that I purchased. And because I purchased it, "You know what? Screw you. This is my thing, I'm going to do whatever the heck I want with it."
And a similar argument carried over to music, right? Now, legally, this is probably not a viable argument. But from the consumer's sentiment at the time, I bought the CD. I should be able to copy it onto all of my digital devices if I want to and play it from whatever I want to because I already paid for it. Like, why not, right? And that was really where a lot of the pushback was coming from and I think why it slowed down. Now, the benefit of having these things in place to help us actually verify the integrity of the operating system, the integrity of the bootloader, the integrity of the things the operating system loads, to be able to store encryption keys that literally do not work unless the integrity of the operating system has been verified.
Having these things is not just useful, but necessary for modern enterprise protection and modern cybersecurity. I don't think that was obvious in the 2000s. In the 2000s, security was still a problem, but it wasn't a problem that was material to most companies. It was really a problem that was material to critical infrastructure.
Nelson
Here's the thing where I feel like sometimes we talk past the regular user. What's in it for me? I am an employee of a company that runs infrastructure that I want to be secure because my pay is coming from a company that has to provide a business and provide a product.
How do I… If that company is, all of a sudden, asking to put all these requirements or all these security features on my devices, is there any impact for me as a user?
Jasson
Well, there's definitely impact on you as a user, right? Essentially, you're going to have a system that's locked to a particular configuration. But I would rephrase it a little bit. In the scenario that you just described, the primary beneficiary is not the employee of the company.
The primary beneficiary is the company itself. Very few companies in the world do not fall under some form of compliance regulation. Most regulation regimes expect some form of data-at-rest encryption, right? The encryption keys for the file system in most modern systems are protected using these devices, these secure enclaves.
And they're protected in such a way where the key that literally opens the file system, if you will, won't even be given up by the enclave unless the booting firmware and operating system pass integrity checks in a very specific and ordered way, right? So, like, imagine I were to walk up and boot a machine with, you know, my custom operating system with the intention of reading the disk, the disk is encrypted, the keys are stored in the enclave, and the enclave won't burp it up for me unless I can actually prove my boot sequence.
But, you know, in most trusted systems, you never prove the trust of yourself. It's always whoever came before you, right? So, BIOS or UEFI, bootloader one is trusted because it's usually on-die on the processor. It does the calculation and verification of bootloader two. Bootloader two does the calculation and verification of bootloader three.
Bootloader three does the calculation and verification of GRUB or LILO or whatever the primary bootloader is. And then that does the verification calculation of the operating system kernel. And in each of these verification processes, there's this call out to the enclave. It's basically saying, "Extend my running checksum, right, my cryptographic checksum, with this particular value."
If that's not followed, then the initial conditions to decrypt the disk aren't satisfied. And that key is never burped up or divulged to actually open up the disks. This is exactly how Microsoft BitLocker works. And we're specifically talking about a feature of the TPM called PCRs, Platform Configuration Registers. But the benefit, you know, it's kind of a very, very deep hole, but coming back up to the top, the beneficiary is the company.
They're forced to do these sorts of things. But if they weren't forced to, you would still want to, right? You know people lose devices. You know devices get stolen. You don't want someone to be able to run an arbitrary OS and actually read your disks. Well, most businesses would not want that.
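The measured-boot and key-sealing flow Jasson walks through can be sketched in a few lines. This is a minimal illustrative model, not a real TPM interface: the function names `extend`, `measure_boot_chain`, and `unseal` are invented here, and an actual TPM performs the extend operation in hardware (via commands like TPM2_PCR_Extend) and enforces sealing with its own policy machinery.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """The PCR 'extend' operation: new PCR = H(old PCR || H(measurement)).
    History can only be appended to, never rewritten."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def measure_boot_chain(stage_images):
    """Each boot stage measures the next one before handing off control,
    extending the PCR with that measurement."""
    pcr = b"\x00" * 32  # PCRs start zeroed at power-on
    for image in stage_images:
        pcr = extend(pcr, image)
    return pcr

def unseal(sealed_key: bytes, expected_pcr: bytes, current_pcr: bytes) -> bytes:
    """The enclave only releases a sealed key if the current PCR value
    matches the one the key was sealed against."""
    if current_pcr != expected_pcr:
        raise PermissionError("PCR mismatch: boot chain differs from the sealed one")
    return sealed_key

# Hypothetical boot chain and disk key, purely for illustration.
trusted_chain = [b"bootloader2", b"bootloader3", b"grub", b"kernel"]
golden_pcr = measure_boot_chain(trusted_chain)

# Normal boot: identical measurements reproduce the PCR, so the key is released.
key = unseal(b"disk-encryption-key", golden_pcr, measure_boot_chain(trusted_chain))

# An attacker booting a custom OS produces a different PCR; the key stays sealed.
evil_pcr = measure_boot_chain([b"bootloader2", b"bootloader3", b"grub", b"evil-kernel"])
assert evil_pcr != golden_pcr
```

Because the extend operation chains hashes, swapping any one stage changes every PCR value after it, which is why an arbitrary OS can never coax the disk key out of the enclave.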
HB
I almost felt dewy-eyed there remembering Jasson 20 years ago as a young impressionable engineer.
Reece
Same, I was about to start crying.
HB
When I look at trusted computing, having been in the embedded hardware and network computing space, especially for enterprise and government applications, I had sort of a split relationship with trusted computing for a number of years. The companies I worked for were able to deliver high-assurance products based on secure bootloaders and signed software images, with assurances that weren't previously available.
But I also shared some of the concerns around DRM. And, you know, those DRM concerns didn't go away in a lot of other spaces either. Companies I worked for and worked with used to take that signed software and they would only allow select signed accessories to work with their solutions, even when they were standards-based, so, optical modules, various kinds of accessories.
They would basically lock it down. So, in the early days, I think, like, the lack of creativity and commonality of the idea brought the DRM stuff to the forefront. Then I think, oddly, governments stepped in and provided an interesting resurrection of that entire space by focusing on how to create that integrity that Jasson was talking about with all of the various attestations and focusing on things like disk encryption.
And when I look at it now, like, I almost don't remember all of those DRM concerns that I had in the 2000s. And I spend most of my time thinking about, like, all of the integrity and attestation conversations that have now sort of converged on a user level to mostly be about privacy risk.
Like, when I look at what the major challenges are to the next level of adoption of trusted computing, the operating system vendors are having the same kind of philosophical battles that existed around treacherous computing versus trusted computing.
And now, it's all about, like, whether attestation is a sort of side channel privacy leak that gives off too much information about a user. And yeah, I don't know. A lot of those things are interesting problems to tackle now.
Jasson
There was a watershed moment in the early 2010s that… Well, technically speaking, if you worked in the networking industry, you kind of knew about this a long time ago because even as far back as the early 2000s, we were seeing counterfeit line cards for, like, Cisco routers show up.
And so there was this fundamental question we were trying to grapple with: how do I know I'm loading Cisco firmware and nothing else, and how does the device prove to the software or the firmware that it is, in fact, Cisco original, whether it was Cisco or HP or whoever. So, in a small thread of the industry, we were aware of the problem in the early 2000s, but there was a watershed moment in the early 2010s that I think brought it mainstream.
I mean mainstream by, like, nerd standards, right, basically outside of just telco. Do you guys remember what that might have been?
Nelson
No idea.
Jasson
I'll give you a hint. Bootkits and Rootkits.
Nelson
Still lost.
Jasson
HB, do you know?
HB
I think you're going towards, like, Stuxnet and other, like, sort of, like…
Jasson
Close. Imagine Dora the Explorer, if she went to China and then Russia and ultimately probably didn't have the best intentions for our nation at heart, but did open a bit of Pandora's box.
Reece
You should take that idea to Hollywood.
Jasson
What idea?
Reece
The Dora the Explorer.
Jasson
Yeah, except, yeah, she's a better person. Yeah, so, with the Snowden files, right?
HB
I think, in a way, he's got Carmen Sandiego and Dora possibly confused, but…
Jasson
Oh, you know what? You're absolutely right. So, Husnain knows me really well. We've been together for a very long time. I always mix metaphors and now you're getting the real-time correction.
Reece
Honestly, I like the original.
Jasson
I meant Carmen Sandiego. Exactly. But, yeah, so, I remember, when the Snowden disclosures happened, it became very obvious that equipment was being interdicted, and essentially firmware was getting rewritten, right, with backdoors, with essentially implants to phone home and, you know, be useful later.
And I think that is the point where awareness kind of elevated outside of just telco into the general enterprise community that, "How do I know this firmware is…" The problem is old, but I think it elevated the "Why should I care?" if that makes more sense.
Nelson
So, HB was probing into something in the last episode that ties back to what you said earlier, HB. Governments have started to catch up to this and create regulation. Is there a next level that needs to happen? SOC 2 and HIPAA seem like base-level regulations that set the standard for what it means to do okay.
Jasson
I'm not a big fan of regulations that prescribe what to do. I'm more a fan of regulations that describe outcomes and penalties for bad outcomes because, honestly, I think we can take… Let's just focus on SOC 2 Type 2, right?
We can find two organizations that are compliant under SOC 2 Type 2 where one of them is probably a really solid security operation and one of them is not, right? Like, security is a team sport. You've got to have a good team. You've got to have a good plan. The plan never works in reality, but planning is valuable in forming the team, right? A lot of these compliance architectures that say, "Do this.
Do this. Do this. Do that," they're not bad, right? They're great mental models for us to break down how our organization is composed and take what seems to be an intractable problem and make it tractable. But whether people actually turn that into a secure operation that minimizes bad outcomes is more a measure of the quality of the team.
And again, this is my opinion, but on the regulation side, I like the idea of regulation and penalties around outcomes. I don't like the idea of regulation that thou shalt do X and Y. Now, maybe there are some industries that, like, you know, that's just not practical and if you don't mandate it, it just won't happen. And I just don't know better, but personally, outcome-based policy is kind of preferred.
HB
I do think that we're seeing an interesting shift that's maybe not super obvious yet, but if you look at sort of the continuum of functional safety tools in a digital environment, a lot of these kinds of tools are fundamentally… Like, the way that trusted computing has been designed thus far is really around maximizing trust.
And there's, like, something on the continuum that goes past trust, which is the stuff that we're seeing about attestation right now. And that speaks to provenance that, like, once you understand, like, the relationship, you want to know, with a high level of certainty, what the origin was.
We often kind of ignore the complete origin or the root that we want to consider, and we allow ourselves to take a transactional view that says, you know, a couple of levels lower, somebody else is taking care of it and it's good enough. And with software bills of materials, I feel like that's the starting point of someone starting to think, "Hey, provenance matters." But it's a really incomplete solution. It doesn't look at it from a structural and end-to-end perspective.
Nelson
That's an interesting way to break it down because when trusted computing talks about provenance, it's not scary. When it talks about operations that are allowed based on provenance, that gets into the space of DRM and being able to use your own devices.
That's the piece that gets scary for most people.
Jasson
I mean, I'll be honest. Anytime I hear words that come out of the bag called emotion, in my mind, I just file it into the marketing shelf and move on. With that said, though, I do… This is one of those areas where you don't have to be an expert and you can go to your engineering team, your security leaders, and you can basically play the dummy and ask the five whys.
How do we know this is what we think it is? Why do we think that? And why do we think that? And why do we think that? Honestly, this would be a good game, we should actually try it. But I'd be willing to bet that inside of the five whys is probably system-level provenance.
Right? Like, was this board actually manufactured by Dell and how do I know?
Reece
So, it sounds like, you know, to kind of recap this, every 10 years, there's been a little paradigm shift for trusted computing. So, in the 2000s, it was "Leave me and my CDs alone." In the 2010s, it was like, "Oh, wait a second. There's a good use case here." Hey, it's 10 years later.
We're in the roaring '20s, early on. What do you think the sentiment is? Do we still have that emotional baggage or is there something optimistic on the horizon here?
Jasson
It's still there.
Nelson
Is it being talked about enough, though?
Jasson
No. So, it's totally still there, right? And we get this. Like when we interact with customers and new prospects and whatnot, there's always this contingent of privacy concerns, right? Like, what's trackable? What can I see? What harm could this cause?
Right? So, absolutely, is it still there? It's motivated in a different way. Right?
Reece
Got it.
Jasson
This is no longer "I want to load whatever software I want to run." And in fact, the hardware manufacturers have done a good job there. There's literally a little switch that you flip saying, "I want to load only things that are in your trust store, or I want to load stuff that I just put on there." But the motivation that matters today is much, much more about privacy tracking. So, for instance, if I were a controlling government and I wanted to always be able to attribute something, I could set up a national authentication system, just like a national ID, and I could start using essentially hardware-backed authentication, and then retroactively, anytime I wanted to investigate someone, I could track things back to their exact device.
Now, there are provisions in the TPM to do things like EK resets and whatnot to kind of move on and be someone else, but the key thing there is you can't erase history, right? All you can do is claim a new identity. And much like some of the cases in cryptocurrency have taught us, like, cryptocurrency doesn't guarantee anonymity, right?
It just gives you an ability to assume different identities. But when you're passing the same kind of digital artifacts through these digital identities, if you're doing something that connects these disparate events, it will eventually become possible to essentially decloak you, all of your activities, right? So, it's like this… So, classically, we call this data flow analysis, right?
It's just if this same value just flows through all of these variables, then I know these variables are intimately related. It doesn't matter if they all have different names. So, it's still important. It's still a real concern. It's still something we need to be conscious about. And government solutions versus company solutions versus consumer solutions are going to treat this a bit differently.
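The data-flow point Jasson makes, that the same value flowing through differently named identities links them, can be shown with a toy clustering sketch. Everything here is invented for illustration (the identity names, the artifact labels, the `cluster_identities` helper); real deanonymization works on far messier signals, but the linking logic is the same.

```python
from collections import defaultdict

def cluster_identities(events):
    """events: (identity, artifact) pairs. Any two identities that ever
    touched the same artifact end up in the same cluster (union-find)."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    first_seen = {}  # artifact -> first identity observed with it
    for identity, artifact in events:
        find(identity)  # register the identity even if its artifact is unique
        if artifact in first_seen:
            union(identity, first_seen[artifact])
        else:
            first_seen[artifact] = identity

    groups = defaultdict(set)
    for identity in parent:
        groups[find(identity)].add(identity)
    return list(groups.values())

# Hypothetical events: "new" identities reuse a wallet and a device key.
events = [
    ("alice_v1", "wallet-123"),
    ("alice_v2", "wallet-123"),    # same wallet links v2 to v1
    ("alice_v2", "device-key-9"),
    ("alice_v3", "device-key-9"),  # same device key links v3 transitively
    ("bob", "wallet-777"),         # no shared artifact, stays separate
]
clusters = cluster_identities(events)
```

Here all three "alice" identities collapse into one cluster while "bob" stays alone, which is the sense in which claiming a new identity without erasing shared artifacts doesn't buy anonymity.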
HB
But that's one of the things that really attracted me to the company. When Jasson described Beyond Identity to me and the idea of using distributed HSMs to generate credentials, it was mind-blowing to me initially, as silly and seemingly operationally deep as the idea is. But the idea of avoiding the centralization risk of a CA and a centralized hardware security module, that's been the traditional model for PKI for as long as PKI has been around, so, 1986, '87, in terms of a standardized approach to these things.
The ability to make these cryptographic associations and establish provenance and some sort of chaining to entities that we choose to trust, it's an interesting thing.
And I feel like the challenge of managing globally unique IDs and anonymity is one that's really solvable. Like, you know, we see a lot of progress in the sort of cryptocurrency space, whether or not you see it as progress is a function of how much you care about knowing your customer and anti-money laundering, but...
Jasson
I think we're going to have to care.
HB
But I think for enterprises and organizations, this is, like, a really interesting kind of important moment, and finding that correct balance is super important. And I think there are certainly directions that we could go that would be bad, right, like, accidentally recreating the mess that cookies are today, especially third-party cookies are today.
We should definitely try to avoid those kinds of things and also, like, complex fingerprinting technologies that people use to do, you know, multi-channel or omnichannel like ad tracking and reconstitution of identities.
But I think there are a lot of clean and good ways to use trusted computing and most of the people who seem to be deeply involved in driving this are coming at it from the constructive side which I think is a really positive sort of development over time.
Reece
Absolutely. And like you said, we are at a pivotal moment here. Let's see what the future holds. I'm going to try to contact Dora the Explorer after this to see if she has any thoughts to add to the podcast. We'll edit her in. Thank you, guys, for baring your souls today when it comes to trusted computing. I look forward to gathering around the round table with our fancy new podcast equipment very soon.
Thanks for tuning in, audience. Don't forget to smash that button and subscribe.