Life’s Certainties: Death, Taxes, and Violating Security Policies

https://cisoseries.com/lifes-certainties-death-taxes-and-violating-security-policies/

This week’s episode is hosted by me, David Spark (@dspark), producer of CISO Series, and Andy Ellis (@csoandy), operating partner, YL Ventures. Our guest is Bruce Schneier (@schneierblog), chief of security architecture, Inrupt, and fellow and lecturer at Harvard Kennedy School.

Full transcript

[Voiceover] Best advice for a CISO. Go!

[Bruce Schneier] I teach a class in cyber security policy at the Harvard Kennedy School, basically trying to teach cyber security to people doing policy who didn’t take math in college. And I’m trying to teach them about economics. What I said is, if there’s one thing they can take away from my class, it is that they should use policy interventions to solve economic problems that prevent the implementation of technical solutions. Right now that is my best piece of advice.

[Voiceover] It’s time to begin the CISO Series Podcast.

[David Spark] Welcome to the CISO Series Podcast. My name is David Spark. I am the producer of the CISO Series. Joining me for this very episode is my cohost, Andy Ellis. He’s also known as the operating partner over at YL Ventures. Andy, make the sound of your voice so others know what it sounds like.

[Andy Ellis] [Speaks foreign language 00:01:01] Sorry, I just got back from Israel.

[David Spark] But we’re not eating a meal, are we? Oh, [Speaks foreign language 00:01:06], I’m sorry. That’s good morning, isn’t it?

[Andy Ellis] That would be good morning if you’d like, [Speaks foreign language 00:01:10] depending on when you’re listening.

[David Spark] We’re available at CISOseries.com. I don’t know how to spell that in Hebrew unfortunately. Our sponsor for today’s episode is PlexTrac. They’ve been a phenomenal sponsor of the CISO Series. And by the way, if you’re doing any kind of red teaming or purple teaming exercises, they’ve got an amazing platform to help you do that to sort of iterate your exercises to improve your overall proactive security program. More about them later in the show. But first, Andy, I have a question for you.

[Andy Ellis] I have an answer for you.

[David Spark] Do you have something personal, and it does not have to be security related, a personal victory that, when you get it, you like to brag about?

[Andy Ellis] A personal victory that…? Because I like to brag about a lot of things. So, I’ll brag about the fact that I just signed a contract with Ashat [Phonetic 00:02:05] Books.

[David Spark] I’m talking small personal victory. I’m going to give you an idea where I’m going with this. For me, I’m a big fan of pinball. And if I get a high score on a pinball machine, I take a photo of it, and I let my family know and sometimes my friends as well. That to me is a personal victory that I love to brag about.

[Andy Ellis] I’m trying to think what it’d be like. I take pictures of food and send it… Oh, when I go to Israel, I take pictures of the ice cream and send it to my kids. I can spell in Hebrew the flavors that I’m having, and I send it to them. And I troll them with like, “I’m getting good ice cream in Israel, and they’re not.”

[David Spark] All right, let’s bring in our guest. I just want to know about personal victories.

[Andy Ellis] Personal victories. I got to think on these because you took me off guard with that one. You don’t usually catch me by surprise.

[David Spark] Well, let’s hope you do better through the rest of the show, okay?

[Andy Ellis] Yeah. Well, and I’m jet lagged, so I’m going to apologize in advance to everybody.

[David Spark] Don’t. We may have to do two, three runs of this episode if you’re that jet lagged. Let’s bring in our guest. I’m so thrilled to have our guest. We’ve had him on before. Everyone who listens to the show I’m sure knows this guest because he’s pretty much a well-known legendary figure in our space. It is Bruce Schneier. He’s the chief of security architecture over at Inrupt, and he’s also a fellow and lecturer at the Harvard Kennedy School. Bruce, thank you so much for joining us again.

[Bruce Schneier] Yeah, thanks for having me again.

First 90 days of a CISO.

3:24.179

[David Spark] I asked both of you, Andy and Bruce, do you have any special tips for a first-time CISO? Now, we offer plenty of that kind of advice on this show, but this post on the AskNetSec subreddit is from a soon-to-be CISO who is just asking the reddit masses for some advice for a first timer. I’m just going to mention a few of the tips right now. One was listen to the tech grunts and give them exactly what they need to do their job, not a bastardized version of what they want.

[Bruce Schneier] Guarantee you the tech grunt wrote that tip.

[Andy Ellis] Oh, absolutely. The person who wrote that is tired of not getting what they ask for, whether or not it will be helpful.

[David Spark] Well, they may have at one time. That’s true. Hold on. I’ve got other tips for you. Next tip is find out who controls the budget and who influences the budget. And lastly, find ways to insert security practices into existing processes. So, I think this is some pretty good advice for those just starting out and trying to establish their position. I’m going to start with you, Andy. What do you think for a first time CISO would be sort of good advice in that first 90-day push for them?

[Andy Ellis] So, first 90 days I have the same piece of advice I give to every CISO, whether they’re first time or not. Go meet with all of your peers, and you’re going to ask them two questions. First question is, what is the stupidest risk that we’re not taking care of, the one where you can’t figure out why nobody has dealt with it? Tell me what that is. And then go deal with those. And the second one is, what is the dumbest security control that gets in your way? And figure out if you can get rid of it. People will hand you the roadmap to easy victories, and then they will talk about how you are aligned with the business, when all that you did was use all of your peers as sensors on the existing security program. And they told you exactly what things they hate and what they would love for you to do.

[David Spark] That is great advice. That is super-fast wins right out of the gate. I like that.

[Andy Ellis] Right out of the gate. And now you have political capital. People are more willing to work with you. And conveniently, there is a never-ending list with those two questions. You can just keep asking these…

[David Spark] Well, I’m sure they light up when you ask that question, too.

[Andy Ellis] Because people will say things like, “I’ve never understood why we don’t have an X program.” And sometimes this is an opportunity for you to explain how expensive, and painful, and how disruptive it would be to solve that problem that way. And then they’re like, “Oh, you’re protecting me.” Other times there really is no good reason why you’re not doing it. Just go do it.

[David Spark] Bruce, what’s your advice for first-time CISOs?

[Bruce Schneier] It’s tough. Being a CISO is only half security. The other half is politics. It’s interpersonal. That advice Andy just gave has nothing to do with security if you think about it. It’s like advice for being in charge of anything. Ask the people what they want you to do and what they want you to stop doing, and then figure out if those are good ideas, and do them if you can. So, it’s really interesting that the advice was not domain specific at all. And I think that is important to know when you’re a CISO. The hard problems are really not the security problems. We actually have tech. The question is, do they make sense, are they cost-effective, will they actually mitigate the risk that you think they will. It’s all the stuff around it. I think we know that about security in general. It’s rarely the tech. It’s the stuff around the tech. That’s what makes that job hard. That’s why you get fired all the time. That’s why you get hired all the time. So, I think it really is that subtle. And I think if someone is good at that, they will get that. I don’t know. So, maybe my advice is sweat the politics more than the tech. But my real advice is don’t listen to me. I’ve never done the job.

[Andy Ellis] Actually I really do appreciate though Bruce’s willingness to say don’t listen to him. If you look at the advice that was being given… And we made jokes about how the advice that was in this reddit was obviously from people who hadn’t done the job but had felt burned by the job. And so be careful about when people do give you advice what is their vested interest in you following that advice.

[Bruce Schneier] And, again, that has nothing to do with cyber security and everything to do with interpersonal relationships.

Close your eyes and visualize the perfect engagement.

7:50.232

[David Spark] In the digital age, the incentives of the free market are often counter to society’s values around individual privacy and security. Market forces cannot solve this problem. For that reason, as you, Bruce, wrote on your blog, US National Cyber Director Chris Inglis outlined a plan for a new social contract for the cyber age. It was a public/private partnership akin to, say, the Clean Air Act and efforts around airline regulations. He recognizes it won’t happen quickly. And he also recognizes it won’t be easy, and it’ll take many, many iterations. So, Bruce, what do you feel are the market forces fighting the most against achieving societal values in the digital space? And focusing on this public/private partnership aspect of it, what is the one sign you’ve seen that we’re actually moving in the right direction at developing a digital social contract?

[Bruce Schneier] I think Inglis’ essay is a really good step in the right direction. I think he’s recognizing what a lot of us in cyber security understand – that the organizations, the people we’re tasking for defense, aren’t up to it. That Russia against a small power company in the middle of Ohio isn’t a fair fight. And what Inglis is really saying is we need to rethink this. That in fact that small power company in Akron, Ohio has national security implications. It’s not really their job to defend against the Russians. Like other aspects of society through history that were technical and complicated, governments stepped in to help figure things out. It’s a good essay. I really recommend reading it. It goes into what I think are problems, solutions, problems with solutions, steps towards solutions. Really trying to rethink how we might achieve a digital future. And a lot of the essay – what I like about it – is iterating what could be possible. If it was more secure, how would the power grid work, how would driverless cars work, how would privacy and trust work. He’s really saying not just that there are national security implications to these problems, but that doing it right becomes a national asset. This is really the first time I’ve seen this laid out by a government official like this. I can quibble around the edges, and there’s a lot to quibble about. But I really like what he’s saying.

[David Spark] Well, also what I was impressed by was he actually outlined, “Look, the government is actually making a lot of moves in this right direction.” And again, it’s more like moves in the right direction, not necessarily solutions.

[Bruce Schneier] And a lot of it was government only – governments making moves inside itself rather than with industry. We recognize the fact that at least in the United States, so much of our critical national infrastructure is in private hands. The whole notion of, “Well, government will just take it over,” like scares everybody. So, that’s not going to work. Right? So, how are we going to do it?

[David Spark] Andy?

[Andy Ellis] First of all, everybody should read this. Although entertainingly it is behind the paywall.

[Bruce Schneier] You can subscribe to the email list and get it for free.

[Andy Ellis] Give them your contact information and join the surveillance capitalism state to be able to read this article, which I just found slightly ironic in prepping for this. I think this is interesting. It’s a very utopian vision, which often worries me when you say, “Oh, look at all these beautiful, wonderful things we can do.” I think there is an excessive focus on how government can take over, to use Bruce’s words. Although I don’t think Chris phrases it that way. He phrases it more as this unprecedented intimacy between government and industry.

[Bruce Schneier] There’s an undercurrent of, “Just let us handle it,” that worries me, too.

[Andy Ellis] Right.

[David Spark] Hence the need for many iterations.

[Andy Ellis] But we’ve seen places where government has done that and not done well, and I would prefer to see some lessons from, like, how has FedRAMP actually improved the cyber security of the industry that interacts with the government? Because that’s what it was supposed to do. Have we seen successes from partnerships like this? So, I’m excited about the Cyber Safety Review Board. Let’s see if we can get some good positive outcomes. Is that actually going to be comparable to the review boards that we see in aviation safety and in highway safety? Does that model work, or does it not work?

[Bruce Schneier] And I think Log4j is the first thing they’re going to be looking at. It was going to be SolarWinds, but that’s too old.

[Andy Ellis] And it’ll be interesting. I hope they come out and say, “By the way, none of you knew what Log4j was, and the next incident will be in something that none of you know what it is today, so don’t plan for Log4j. You’ve got to plan for the next thing.” That’s a hard problem, and I think one of the reasons why cyber is a uniquely hard problem is because we spend so much time reinventing the wheel. If you want to talk about automobile safety, automobiles until the last few years functionally haven’t changed. They’ve just gotten a little more powerful. Some small things change. But recently they became networks of computers that are also mobile, and that’s a fundamental shift. Buildings haven’t changed in thousands of years. Small twists around materials. But a software component built today looks nothing like one built five years ago and has absolutely nothing in common with the lessons we learned 20 years ago except at a philosophical level. And so I do sometimes wonder how much of our knowledge of, “Oh, we learned from this incident,” [Inaudible 00:13:42] is going to move forward to the next incident, because that’s built on an entirely different technology.

[Bruce Schneier] Yeah, some of his examples, he looks at the FDA. He looks at the FAA reducing airline crashes. Certainly the pharmaceutical industry moves faster. But you’re right, there are unique aspects of the internet. We need to get this right. We actually need to figure this out. And I’m glad that he’s putting in and saying something.

[David Spark] I think that’s kind of the point of the whole article: we can’t keep the course we’re on now, because one of the things he sets it up with is that single actions are causing catastrophic events, and there isn’t necessarily enough regulation, or a means for that to not happen, essentially. And how we’re going to come about that, who knows. Andy, you have the closing comment.

[Andy Ellis] I want to disagree with two things you just said, David. So, one is we’re not seeing catastrophic events from single actions. We’re seeing bad events, but they’re not catastrophic. We haven’t wiped out our country because of a cyber-attack.

[David Spark] I overuse the term. Go ahead.

[Bruce Schneier] That’s actually important, because we here in our industry exaggerate the effect. You walk out on the street, and nobody has freaking heard of Log4j even though the disaster happened. So, as disasters go it’s pretty benign.

[Andy Ellis] Yeah, and I always worry about any plan of action that starts with, “Well, we have to do something.” And this is something? Something isn’t always better than nothing.

[David Spark] Well, something in a positive direction. Let me say it that way.

[Andy Ellis] But just because it’s marketed in a positive direction does not always mean it will deliver.

Sponsor – PlexTrac

15:11.975

[Dan DeCloss] I think CISOs and security leaders in general want to know that their team is being as efficient as possible. These are highly paid resources. They’re valuable resources. And they want to be making sure that their team is firing on all cylinders, working as efficiently and effectively as possible.

[Steve Prentice] This is Dan DeCloss, founder and CEO of PlexTrac, whose focus is to help security professionals stay focused on winning the right cyber security battles.

[Dan DeCloss] We definitely fill the niche of workflow and efficiency gains when it comes to conducting security assessments – whether those are penetration tests, risk assessments, vulnerability assessments, or even purple team engagements. So, being able to help facilitate all the nuances of those workflows and speed up the reporting life cycle. And then the most important part is being able to report those effectively, get them into the hands of the people who need to fix them, and track those issues accordingly.

[Steve Prentice] And then this kind of information needs analytics such as…


[Dan DeCloss] Where you actually show the progress over time – how are we doing, where are our biggest gaps, and what are the highest priority items that we should be focused on. When you can bring all that data together, assign it to the right people, and get good visibility on who’s working on what, the security leaders really want to know, “Hey, how are we doing? Are we making progress?” And it actually improves morale because you can actually visualize, “Hey, we are making a difference.”

[Steve Prentice] For more information visit plextrac.com.

It’s time to play, “What’s worse?”

16:41.896

[David Spark] All right, this is always a fun game. I love playing it. This comes from Simon Goldsmith of OVO, and here is his “what’s worse” scenario. You have to determine which of these two horrible situations is the worst situation. It’s a risk management exercise.

[Andy Ellis] But the important part is you’re not allowed to qualify the situations that are given. You are stuck in the situation, and you cannot improve it.

[Bruce Schneier] So, at the law school they call this fighting the hypothetical. Don’t fight the hypothetical.

[Andy Ellis] Yes.

[David Spark] Don’t fight the hypothetical.

[Andy Ellis] Except we have to fight the hypothetical in this case.

[David Spark] Don’t fight it. Here’s the “what’s worse” scenario.

[Bruce Schneier] I totally get to fight the hypothetical.

[David Spark] Bruce, don’t answer first. Andy answers first. Here we go. A team of brilliant jerks who make info sec the department of no, albeit a superbly well argued no. That’s scenario number one. You got the team of brilliant jerks who make it a department of no. Or you get a team of hard working team players so fatigued by the organization’s poor attitudes to info sec that they offer a weak challenge to that attitude and become accomplices to a default response of documenting and accepting risk. Which one is worse, Andy?

[Andy Ellis] Oh, this is the easiest one you’ve given me yet. That first one is awful. I’ve got a team of people who are basically the paranoids who say no to everything. And the business just ignores them, or the business goes out of business. Those are the two outcomes of that organization. Versus the second one, where we at least have documented what our issues are. If we put those in front of people over time, hopefully that will ameliorate things. They’re both pretty bad. Neither one is affecting the world. If it turns out the business is going to exist and keep doing its thing whether or not we did A or B, it’s not clear there’s much difference between them. I would just prefer to hang out with the not-jerks all day. So, I’ll take team B.

[David Spark] But it seems like they’re kind of passive, and they’re like, “Yeah, whatever,” kind of attitude.

[Andy Ellis] Yeah. But hey, they have to do a lot of documentation at least. They’ve got to learn how the system works, and they’re getting educated because they’re being exposed to new ideas on a regular basis. Whereas it’s very clear the team of jerks is not actually being exposed to new ideas, because they’re not sinking in. So, I’m going to say the team of jerks is A. Normally I’d take the team of jerks over the team of not-jerks in “what’s worse” scenarios. But in this case, I’m going to say I don’t want the team of jerks.

[David Spark] All right, Bruce, do you agree or disagree?

[Bruce Schneier] I agree. Andy said exactly what I was going to say when he said that they’re both basically the same. Neither makes any difference. Neither affects the organization. So, now it’s who do you like better, and “jerks” kind of makes them bad, and you pick the other one. But what’s really interesting is how similar they are in the end, because neither is going to be effective in figuring out what an organization should or shouldn’t do. And that’s basically their job. All yeses and all nos are equally ignored.

[Andy Ellis] And also I will admit to being swayed by the wording of this argument, when you said one of them were brilliant but jerks, and you said the other ones were accomplices. I’m like, “Oh, I’m totally going to hang out with the accomplices just because the person who wrote the question didn’t like them.” But they wanted me to pick the other way, so I’m not going to go there.

[David Spark] Let’s not attack Simon.

[Andy Ellis] I guess that makes me the brilliant jerk today.

[Bruce Schneier] I don’t know. I think that’s too much meta-analysis, but all right.

[Andy Ellis] It would require me to be brilliant first.

[David Spark]  You’re leaning too hard into that.

Are we making the situation better or worse?

20:03.801

[David Spark] Now for years I personally have attended the ad:tech conferences to hear people talk endlessly about how individuals will have the power to see the ads they want to see, and we all swallowed that as something people wanted. I never actually heard any individual say they want that. Now, last year we got irrefutable evidence that people don’t want that. Only a measly four percent of US iPhone users allow apps to access their identifiers for advertisers, as reported by Flurry Analytics. So, what that tells you is for years we’ve all been force-fed something that almost nobody wants. So, it appears there’s more of a desire for advertising to be like pre-internet advertising, where ads were tied to the content being consumed, not the individual consuming the content. So, I’m going to start with you, Bruce. Wouldn’t that be the inevitable result if all users had the same power as iPhone users? Would it revert, or would we see the rise of a new model that isn’t as intrusive to our privacy? But this goes back to our earlier discussion regarding market forces.

[Bruce Schneier] Right, because this is actually all about the monopolies again. There’s a reason why as a consumer you didn’t get a choice whether you want to be spied on or not. Because everyone is spying on you. And so Apple is a little bit different here. Apple makes its money selling you overpriced electronics, not by spying on you. Because they have that different business model, they can go against ad:tech without their loss of revenue. Although they spy on you in the iTunes store. They do their spying in their own space.

[Andy Ellis] So, Bruce, you just argued that the Apple monopoly is a good thing? Just checking.

[Bruce Schneier] Oh, no, no, no. Oh my God. Right now we are trying to open up their monopoly platform on the iTunes store, and they’re of course fighting that. They have all their own monopoly issues. But it is not surveillance capitalism. So, here they are actually able to work against the Googles and the Facebooks who make their money spying. It is interesting. I have been following this, and the companies that do make their money spying on you are going to look for other avenues.

[Andy Ellis] To spy on you. Just to be very clear, Bruce is not saying to look for other avenues to make money. They’ll look for other avenues to spy on you to make money.

[Bruce Schneier] Because their business model is spying on you.

[Andy Ellis] Right.

[Bruce Schneier] We can imagine Google changing their business model… That’s probably something we’re not going to talk about. So, yes, look for more ways to spy on you because that’s their business model. And what are they going to have? Google has tried… They backed off getting rid of cookies and sort of building an integral surveillance model and then [Inaudible 00:22:53] other people not to. They backed off on that. We’ll see how that goes. It’s not clear…

[Andy Ellis] They’ve claimed to have backed off on that.

[Bruce Schneier] Yeah, that’s going away.

[David Spark] Neither one of you think my theory that advertising will be tied to the content, not the individual…that’s never going to happen.

[Bruce Schneier] If it’s forced, it’ll happen. If we as a society decide that this is an immoral business model, like selling your kidney or sending five-year-olds up chimneys, we can agree to make the business model illegal. And then the companies will do something else. Short of that, I think not. I think advertising tied to the person but not the content is believed to be, and probably is, more effective advertising.

[David Spark] Oh, for sure. For sure.

[Bruce Schneier] That drives surveillance capitalism. And I don’t think that goes away unless society says, “This is an immoral business model.”

[David Spark] But the point I was trying to make at the very beginning is that we had been spoon-fed this for years, saying, “This is what people want. They want to choose their advertising.” But then the iPhone came about and said, “Okay, you actually can do this.” And nobody wanted it.

[Bruce Schneier] And I think that speaks to the power of the monopoly. That you had no choice. There wasn’t an alternative.

[Andy Ellis] Right, so you’re not talking about a monopoly. You’re really talking about there’s an established way that everybody does this. It’s more of a trust than a monopoly…cartel. Because the cartel does this.

[Bruce Schneier] Yeah. And this is, I think… Talking about this, I think… A monopoly doesn’t necessarily have to be one.

[Andy Ellis] Literally it’s in the name, Bruce. Come on.

[Bruce Schneier] Right. Yeah. There’s having a few choices or having many.

[Andy Ellis] But I think, David, that you posited a question – do people prefer to choose their advertisements and have advertising they like? And you’re saying that because they opted out of everything targeted, they wouldn’t prefer to choose. And I don’t know that that’s necessarily true. I do think it’s probably somewhat true. That’s a thing that we need to come to terms with. We’re not going to get rid of targeted advertising, because that’s all marketing is. And we’re talking about the mass market down to the consumer layer. We feel odd about it. It’s how can we get to a world… And first of all, how can we figure out if advertising is really effective? I think I’d love to see more data that says this is valuable rather than just an assumption that we throw money at it, and of course it might work.

Pay attention. It’s security awareness training time.

25:16.941

[David Spark] When an employee violates cyber security policies, knowingly or unknowingly, they are often creating a pathway for cyber thieves to bypass technical controls. Now, according to an article in Harvard Business Review by Clay Posey and Mindy Shoss, people violate cyber security policies at a rate of 1 out of every 20 job tasks for the following reasons – one, following the rules will limit my ability to do my job effectively for myself and others. Two, high stress reduces an employee’s desire to follow rules. And non-malicious rule breaking is 28 times more likely than malicious attacks. So, the fact that 1 out of 20 job tasks has a security violation seems daunting. How would an organization’s security change if you could just solve this one problem? Is employees violating security policies the top issue that needs to be resolved?

[Andy Ellis] I read this article when it first came out, and this one also has the highlight that says when they’re stressed, they’re more likely to violate policies. We should note that this is an example of a bad reading of causality in a study. You asked people to self-report their stress level and whether they had violated any policies. And you discovered that these two things correlated, so I call this the zero [Beep] left to give correlation, which is when you are stressed and willing to admit it to a third party, you’re also willing to say, “And I violated the security norms also.” So, there may not actually be a correlation between stress and you violating but certainly a correlation between you being willing to assert both of them. And you see this in a lot of social studies – that correlations are often about what people are willing to self-report, not necessarily about ground truth. So, always be cautious about self-reporting. Now let’s come back to let’s assume the data is useful, and I suspect people violate security policies all the time. I know I do. I’m pretty sure Bruce does. I’m willing to bet David does.

[David Spark] It’s like driving over 55. We’ve all done it many times.

[Andy Ellis] Oh, we know that 85% of the people do. They actually designed the speed limits for the 15 percentile because they know you have to raise the speed limit, otherwise 15% of the people are going too slow, and it creates a disaster situation. Some fantastic studies there. Total side note over there. But let’s come back to people are violating our policies. Are they violating our policies because our policies were helpful, and they didn’t like them? Or were they violating our policies because our policies were not helpful, and they were trying to get their job done?

[David Spark] Or they don’t even know what the policies are. Throw that out there.

[Andy Ellis] Well, in this case they had to know what the policy was. They’re self-reporting that they violated the policy.

[Bruce Schneier] But they might have found out after the fact.

[Andy Ellis] Okay, they might have found out after the fact. But let’s just assume they’re violating the policy, and they’re getting their job done. That sounds like a problem you need to solve by fixing the policy.

[David Spark] Bruce, where do you stand on this? That people are just chronically violating policies. And, again, non-maliciously.

[Bruce Schneier] So, it speaks to a mismatch between the policies and the jobs. And it’s not obvious that the job is more important. If your job is actually bad for the company, bad for… maybe the policies are good. I’m a mass murderer, and all of these murder policies are inhibiting my job, and this is really annoying to me.

[Laughter]

[Bruce Schneier] I violate them all the time. What should I do? So, it is not obvious that the policies are to blame. But there is a mismatch here, and that needs to be figured out. There’s a lot of stuff that we do that is normal that is stupid to do. There are a lot of things we do that are bad for security, and if we thought about it we wouldn’t actually do them. Maybe we as the bosses give employees unreasonable tasks given what’s secure, and then we get mad at them for violating policy. That makes no sense. So, I would look at the disconnect more than assuming the policy is at fault. I will point out that it is Andy’s libertarian predilection to immediately go to the policy and say, “That’s what’s bad,” when in fact it could be either.

[Andy Ellis] What I will say – Bruce does have a fair point. It’s not necessarily the policy is wrong in an absolute fashion, but it’s almost certainly wrong in a relative fashion. But if I walk into a situation, walk into a company, and there’s some awful practices, one of the things CISOs often get taught and security people get taught is like write a policy and then enforce it. That’s actually a really ineffective way of making organizational change. First you start changing the practice. Make the practice easier. Then once you have gotten most of the organization over then you write the policy so you only have to enforce it on cleanup. But everybody is already onboard with the new policy. What a lot of organizations do is shove in like, “Here’s a bunch of policies, and we just expect that you will follow them even though it is neither convenient to do so nor even sometimes possible to do your job by doing so.” And that’s why I said to me that’s like the policy is to blame. It’s really more of the implementor of the policy is to blame, and that’s the thing we got to watch out for.

Closing

30:33.281

[David Spark] That brings us to the very end of this episode. I want to thank our guest, Bruce Schneier, and also my cohost, Andy Ellis. But before the two of you speak, I want to mention our sponsor, PlexTrac, the proactive security platform. They have been a phenomenal sponsor of the CISO Series. Check them out at PlexTrac.com if you’re looking to improve your proactive security posture. Andy, any last words?

[Andy Ellis] So, I just want to give a quick plug for critical thinking. Whenever we’re going to debate some thorny topic like this, take both sides of it in your head – how could this be wrong, how could this be right. Because in anything there’s two sides. Actually there’s like 12 sides. But there’s at least two sides to the conversation. Maybe have both of them in your own head before you pick a fight with everybody on the internet.

[David Spark] Very good point. Cheers for critical thinking. We want critical thinking to be a sponsor of this show, too, by the way. Bruce, please plug away. Inrupt. I’m assuming you’re hiring over at Inrupt. Also any books? What’s the last word from you, Bruce?

[Bruce Schneier] Yeah, it seems like every event for the past two years that has been cancelled or postponed is happening this year in June. I have gotten more speaking invites than I could possibly fulfill, but I am doing a bunch of them. I’m going to Europe and speaking at a few different places. So, I hope to see people in person again. Remember when we used to do things in person with people and do things? Hopefully we will all do this before the next variant shuts us all down again.

[David Spark] Let’s hope so. And by the way, I know you are hiring over at Inrupt. If you want to work with an awesome person like Bruce or Davi, who was a previous guest as well, just reach out. I’m sure they’ve got a careers page, yes?

[Bruce Schneier] We certainly do.

[David Spark] All right. Thank you very much, Bruce. Thank you very much, Andy. Thank you to our audience. As always, we greatly appreciate your contributions and listening to the CISO Series Podcast.

[Voiceover] That wraps up another episode. If you haven’t subscribed to the podcast, please do. We have lots more shows on our website, CISOseries.com. Please join us on Fridays for our live shows – Super Cyber Friday, our virtual meet up, and Cyber Security Headlines – Week in Review. This show thrives on your input. Go to the participate menu on our site for plenty of ways to get involved, including recording a question or a comment for the show. If you’re interested in sponsoring the podcast, contact David Spark directly at David@CISOseries.com. Thanks for listening to the CISO Series Podcast.

