We Built This City on Outdated Software

https://cisoseries.com/we-built-this-city-on-outdated-software/

“The biggest threat to national security is that many of the most vital systems on the planet currently run on outdated and insecure software,” said Robert Slaughter of Defense Unicorns on LinkedIn. That’s at the core of the third-party security issue.
This week’s episode is hosted by David Spark (@dspark), producer of CISO Series, and Andy Ellis (@csoandy), operating partner, YL Ventures. Our sponsored guest is Richard Marcus, VP, InfoSec, AuditBoard.

Full transcript

[Voiceover] Best advice I ever got in security. Go!

[Richard Marcus] The best advice I ever got in security is that good news should travel fast, but bad news should travel faster, so I make it my mission to foster transparency and accountability through the connectedness of people and data.

[Voiceover] It’s time to begin the CISO Series Podcast.

[David Spark] Welcome to the CISO Series Podcast. My name is David Spark, producer of the CISO Series. Joining me for this very episode, you’ve heard him before, and this won’t be the last time you hear him. Unless he does something incredibly stupid that we can’t for some bizarre reason edit out, which I don’t think will happen. It’s Andy Ellis who is the operating partner of YL Ventures. Andy, thank you so much for joining us.

[Andy Ellis] Thanks for having me, David. I will say – those are a challenge.

[David Spark] [Laughter] Saying something not unbelievably stupid? You won’t do that. I always had this fear, because I did standup comedy many years ago, that there’s some video of me floating around saying something horrible on stage, because I used to say completely inappropriate things on stage that would not fly by today’s standards at all.

[Andy Ellis] Yeah. And now you have editors to take those out of the show.

[David Spark] Yes, exactly. We’re available at CISOseries.com. Our sponsor for today’s episode is AuditBoard. And by the way, they are a brand-new sponsor of the CISO Series, and we love having them onboard. Connect risk, connect your teams – AuditBoard. And they’re responsible for bringing our guest, who I’ll introduce in just a moment. But first, I want to ask you, Andy. You have been to many an RSA. I believe I actually met you at RSA and I’ve interviewed you many times at RSA.

[Andy Ellis] Yep. I remember that.

[David Spark] Can you think… Because supposedly a week from our recording right now is when they’re closing out the call for papers or call for presentations, whatever they call the…

[Andy Ellis] Call for speakers is what they now call it.

[David Spark] Speakers, call for speakers. Can you think of an extremely memorable presentation you saw at RSA? And you can’t say one of your own, by the way.

[Andy Ellis] Oh, see, but that’s the problem, I normally don’t go see other people’s presentations at RSA. But if I have to pick one, it would be Wendy Nather’s “The Security Poverty Line.”

[David Spark] Let me tell you – she hit a frigging home run with that line because it has so struck a chord with the industry, and I’ve quoted her endlessly on that.

[Andy Ellis] Yeah, and I think that was like 2012, if I recall correctly, because I know that I referenced it in talks I gave. Maybe they were 2014, so it might have been 2013, but I think it was 2012.

[David Spark] Well, the trickle-down references to that line, the security poverty line, matter because we’re seeing more and more third-party attacks happening, and it’s so important that we raise the tide for everybody, not just ourselves. And by the way, I absolutely – and we’ve mentioned this before – I hate it when people make that reference to the “You don’t have to be faster than the bear, you just have to be faster than the other guy the bear is chasing.” I hate that.

[Andy Ellis] Yeah, Mike Smith who used to work for me, which is not very specific given how many Mike Smiths are out there, he used to say, “Yeah, but if you’re the one who’s covered in bacon and slathered in honey, it doesn’t matter if you’re faster than some other guy. The bear’s coming for you.”

[David Spark] So, we don’t recommend anybody slather themselves in bacon and honey, which is a very bizarre combination, but I’m sure it would taste good because both of those things taste good.

[Andy Ellis] Probably, and it’d have to be beef bacon or lamb bacon for me.

[David Spark] There you go. We’ll make that happen for you. Let’s bring on our guest, enough of this nonsense. I’m very excited to have our guest because AuditBoard has been doing an awesome job, and we’re thrilled that they have sponsored us. It is actually the VP of information security over at AuditBoard, none other than Richard Marcus, our sponsored guest. Thank you for joining us, Richard.

[Richard Marcus] Hey. Thanks for having me, David. I’m excited to be here.

There’s got to be a better way to handle this.

4:03.404

[David Spark] “Any conversation around ‘better’ securing a system by adding more controls to the software supply chain needs to appreciate that the biggest threat to national security is that many of the most vital systems on the planet currently run on outdated and insecure software,” said Robert Slaughter of Defense Unicorns on LinkedIn. Now, Robert argued that these debates about risk are taking our eye off the big issue, which is that software needs to be updated. Both need to happen in tandem – addressing vulnerabilities and updating software – but outdated software is the bigger issue, and it needs to be handled quickly, he said. Speed can only happen if an organization beefs up its process. Is insecurity, Andy, just the result of a lack of an efficient process? What do you think?

[Andy Ellis] So, first I just have to say that Robert Slaughter of Defense Unicorns, if there’s a more Voldemort name out there, I’m not really sure what it is.

[David Spark] It’s a good name.

[Andy Ellis] So, I just want to shout out, I’m assuming that’s your real name. Hopefully, you don’t mind the humor at it. So, “just” is a very scary word. Is insecurity partially driven by poor processes? Absolutely.

[David Spark] Yeah, I should have qualified that. Yeah, I would agree there.

[Andy Ellis] And you should think about the history of safety systems in general. The job of a safety system is to enable the rest of the system to move more quickly. Why do you wear a seat belt? Why do you have a brake in your car? So that you can drive faster. If you think the purpose of those is to save your life, you’re absolutely wrong. The purpose of those is to make you more comfortable driving more quickly. They will save your life in certain circumstances, but you’re going to go take more risk anyway.

[David Spark] And by the way, we’ve seen the stats – when seat belts and airbags got introduced, people drove faster.

[Andy Ellis] Absolutely. And if you look, until the advent of Waze and Lyft and Uber, you actually see year over year just sort of a continuous, slow, steady decline in fatalities per mile driven. It’s continuous, the whole way down. It doesn’t matter what safety systems we add in; people adapt very quickly. Waze, Lyft, and Uber all of a sudden put people on roads they’re not familiar with, where they’re being distracted by a nav system, and we’re seeing an increase now. Which I think is attributable to those, or it might just be correlated. We should remember that while correlation is not causation, it usually is indicative of a shared causal factor, so I think there is probably some relation we should explore.

But let’s come to Bob’s question, or Robert’s question – I don’t know that he goes by “Bob.” Absolutely, this is a problem. People don’t know how to patch quickly enough, people don’t know if patches are safe to take, integration is a challenge. At what point do you trust some library? Software is so massive and interconnected, and people don’t even know what’s in their software. Your developers added in some random library because you had to handle PNGs at some point, and when libpng has a vulnerability in it, does anybody remember to update that library? Maybe there were major point releases that changed the interface, and that’s dangerous.
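Andy’s libpng example is, at bottom, an inventory problem. A minimal sketch of that first step – assuming nothing more than a Python 3.8+ environment, and illustrative only rather than a real dependency scanner – might look like this; it simply lists the third-party packages actually installed, which is what lets anyone notice the forgotten dependency in the first place:

```python
# Illustrative sketch: list every third-party distribution installed in the
# current Python environment, so "what's in our software?" has a concrete answer.
# Assumes Python 3.8+ (importlib.metadata is in the standard library there).
from importlib.metadata import distributions


def inventory():
    """Return a sorted list of (name, version) pairs for installed packages."""
    packages = {}
    for dist in distributions():
        name = dist.metadata["Name"]
        if name:  # some broken or editable installs report no name
            packages[name.lower()] = dist.version
    return sorted(packages.items())


if __name__ == "__main__":
    for name, version in inventory():
        print(f"{name}=={version}")
    # Diff this output against a known-good baseline in CI, or feed it to a
    # vulnerability scanner, so a stale dependency is at least visible.
```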

[David Spark] And I would say it goes down to the user. I mean, users put off updating Microsoft for fear of, “What is it going to break in the process?” We always have this fear.

[Andy Ellis] I think there’s a little bit of it around the user piece of it, but there’s far more around… We’ve seen cases where you update the browser, and it breaks some app that’s on the back end. Now, whose fault is that? Probably the people who allowed you to buy an app that said, “This app only works with Internet Explorer version 5.6 and below.” That should have been a deal killer in your purchasing process.

[David Spark] All right. Let me throw this to Richard. Richard, the whole process. I mean, this has to do with the supply chain, but we have this fear of the interconnectedness. So even though we know we should update software, we have the fear of the act.

[Richard Marcus] Yeah. I’ll actually introduce another contender into the debate. I think software supply chain integrity is of course important, and process around hygiene and basic patching is important. One of the spaces a lot of organizations need to get better in is visibility into and prioritization of their third-party ecosystem. It’s not uncommon for some organizations to have tens of thousands of third-party vendors, and we know that any piece of commercial software you might be using can contain thousands of downstream dependencies. So, the complexity there gets really big, really fast. And I think the use of enhanced third-party risk assessments – really identifying who the key suppliers in your supply chain are that present the greatest risks to you – allows you to categorize and prioritize your efforts around hardening that supply chain, and then ultimately apply controls around them in any number of different ways.

That’s something we see a lot of organizations starting to focus more on – that enhanced third-party assessment and inventory practice. And then from there, applying a lot of the best practice techniques that you guys are talking about around integrity and process. From an integrity standpoint, one of the things that we’re focusing on an awful lot at AuditBoard is the integrity of our own build process, trying to incorporate some of the best practices that are coming out now from NIST and the executive orders around supply chain security, and making sure that we’ve got those incorporated in our development practices.

And we’re actually also finding a lot of great utility in Google’s SLSA framework, which was published last year and really focuses on whether you have the necessary integrity checks in your build process and your software supply chain to produce secure code. And so those are two things we’re doing that I think have really resonated with the users of our software, the customers of our software. And as this trend around transparency in software starts to develop down the road, we’re going to see a lot more demand for transparency in the process. We’re starting to get requests for things like software bills of materials, or SBOMs, and even things like collaboration in our vulnerability management process.
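For readers who haven’t seen one, an SBOM request is essentially asking for that dependency inventory in a standard format. The following is a rough, illustrative sketch of a minimal CycloneDX-style document built from the same environment data – not AuditBoard’s process, and not a substitute for dedicated SBOM or SLSA attestation tooling in a real build pipeline:

```python
# Illustrative sketch: emit a minimal CycloneDX-style SBOM for the current
# Python environment. Real pipelines would use dedicated SBOM tooling and
# attach build provenance/attestations per the SLSA guidance mentioned above.
import json
from importlib.metadata import distributions


def build_sbom():
    components = [
        {"type": "library", "name": d.metadata["Name"], "version": d.version}
        for d in distributions()
        if d.metadata["Name"]
    ]
    return {
        "bomFormat": "CycloneDX",  # CycloneDX is one common SBOM format; SPDX is another
        "specVersion": "1.4",
        "components": sorted(components, key=lambda c: c["name"].lower()),
    }


if __name__ == "__main__":
    print(json.dumps(build_sbom(), indent=2))
```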

How would you handle this situation?

10:22.960

[David Spark] Richard, in an article on Dark Reading, you said, “How transparent companies are before, during, and after a breach tells you a lot about their corporate character.” You laid out a bunch of great examples of essentially telling people what happened and suggesting ways they can protect themselves so that they also don’t fall victim. Now, these were the “show me” examples to the commonly uttered phrase after a breach, “We take security and privacy very seriously.”

[Andy Ellis] Very, very seriously.

[David Spark] Very seriously. Sometimes we don’t take it so seriously, but we usually take it very seriously. So, two questions here, and I’m going to start with you, Richard. How does the business feel about disclosure during a breach? So, it’s not about the security environment; it’s the business – what are they comfortable doing beyond what is legally required? And second, what’s the behavior after a breach you want to see that reaffirms your commitment to doing business with that vendor? Because we always say this – it’s not the breach, it’s how they behave. So, what say you, Richard?

[Richard Marcus] Yeah, I think it’s really clear that the software market is demanding more transparency and really putting a premium on partners that they can trust. But it is really a balancing act. I think you have to protect operational security posture; disclosing too much can certainly put your customers at further risk. But when a vendor’s impacted by a breach, or even some high profile vulnerabilities, you definitely want to see timely and transparent responses. Especially if you as a customer are at risk, or there’s something that you need to do or can do to further mitigate that risk. The behaviors that you really want to see are that vendors are being honest about the impact, they’re partnering with you in the recovery, and their status updates are rapid and recurring. I think those are all key signs that your partner really understands their ethical responsibilities as your partner.

[David Spark] All right, Andy. What would you like to see? What have you done? And I’m interested in how far can you go with the disclosure that does, like what Richard said, it’s kind of a balancing act.

[Andy Ellis] Yep. So, I’m going to give some tactical advice here. First piece of tactical advice is as soon as you disclose anything, or even as soon as you know there’s a breach, you write down the time you are about to do your next disclosure.

[David Spark] Oh, yeah. You’ve mentioned this on the show before, and I love this advice. Repeat it again.

[Andy Ellis] So, you don’t wait until your disclosure is perfect and everybody has scrubbed it because it’ll never get out the door. Because everybody’s got an incentive to drag their heels. But if you have a breach, you say, “In one hour, we’re going to notify the world.” Or maybe it’s going to be four hours, whatever it is, everybody’s working to a clock. And so at every point, you have something you’d be willing to put out there, and so you just keep doing that. And the moment you’re done, you’re like, “Hey, four hours to run our next update. Let’s start drafting it.” And maybe you pull it forward because you’re like, “Yep, we’re good, we’re happy, we have something meaningful.” Great. Now very important – you should recognize that as bad of a day as you are having, you have caused a lot of people to have a much, much worse day. And you don’t get to play the victim card even if you are a victim because you have been part of victimizing a lot of other people.

[David Spark] That’s a good point.

[Andy Ellis] And they’re not going to be happy with you, and so you have to be very apologetic. When people say mean and awful things to you, you have to ignore tone. You can only respond to technical content. Like if someone says, “Hey, you complete bleep-bleep-bleepety-bleepety-bleep. You lied in this blog post because you said this when this thing is actually true.” Great. Respond to the technical correction, ignore the fact that the person said awful things about you. I’ve had to do this, it’s really hard. Where you’re like, “We got this one thing wrong,” and you spent an entire page telling me that I didn’t deserve to be a human being. I have to ignore all of that and just respond to, “Yep, I got something wrong. Here’s what we’re doing now.” But in that moment you just don’t take it personally.

[David Spark] So, that’s a really good point, I like that point. What is the behavior you like to see after the breach? Like you’ve gone through all the mechanisms, sort of the time has passed, we’re now in the sort of X months past that, and this is more of, “I want to know that I can still work with this vendor because this is the way they behave.”

[Richard Marcus] Yeah. So, I think some of this is actually going to become mandated through law and regulatory requirements. We’re seeing the SEC, for example, propose disclosure rules that would involve continuous updates on past breaches. And the immediate disclosure requirement that I think is causing the most heartburn in the industry is that we’re expected to be able to report on a breach within four days of determining that breach to be materially impactful to your organization.

[Andy Ellis] Potentially material. Not even material.

[Richard Marcus] Potentially material, right. But then also on a quarterly basis, going and doing that lookback and saying, “Are there events across the last quarter that collectively represent materiality?” and making sure that you’re reporting on those too, in addition to providing an update. So, what’s the status? How has the status changed since the incident was initially reported? Does risk still exist? What progress have you made toward remediation, etc.? So, that’s going to become required, I think, for most companies, but you’re seeing across our industry a lot more transparency and frequency of updates on past incidents.

[David Spark] Quick response, Andy, on this one.

[Andy Ellis] I mean, I just want to see how has this changed how you approach your business. It’s possible this breach does nothing for you to change what you’re doing. That can sometimes be okay. But when I am talking to you in a year or seeing you give a talk at a conference, I want to see that this breach has informed your new security architecture.

It’s time to play “What’s Worse?”

16:39.085

[David Spark] All right, it’s time to play “What’s Worse?” Richard, you’re familiar with this game, correct?

[Richard Marcus] Yes.

[David Spark] All right. It’s a risk management exercise – I make Andy answer first. I love when our guests disagree with Andy. Andy does not feel that way; he likes it when you agree with him. So, let’s see where you fall. All right. Andy, this comes from Jason Dance, who’s now working at StubHub, and Jason has given us a host of great “What’s Worse?” scenarios. Here’s another one.

You are told that your security controls sometimes block customers from consuming your services. What’s worse? Okay, you got two scenarios here. You are told to immediately disable all security controls, and you know if you do that you will have a very tough battle to turn them back on afterwards. Or your company does nothing, choosing to accept the risk that customers will have problems with your platforms as the cost of doing business, but your security controls stay in place. What’s worse?

[Andy Ellis] Man. This one is challenging. I think your challenge here is, if you have security controls that are interfering with your customers on a regular basis, that’s likely to drive your business into a bad place. And if the rest of the company doesn’t care – like, it’s not your job as the CISO… I mean, it is your job to care about that, but it’s not your job to be the biggest customer advocate. But if the people who should be customer advocates are not telling you to get yourself out of the way of the customer, that really worries me. The other model is the equivalent of “Allow Any” in the firewall world. How many firewalls got that dropped in, and you’re like, “Oh. There go all of our firewall rules. We let anything through.”

[David Spark] Why pay for a firewall?

[Andy Ellis] Why bother paying for a firewall? And while those were bad, I think they became incentives for security teams to do things better. And I think the other one doesn’t. So, I’m going to give a slight edge for what’s worse to the second scenario: the rest of your company doesn’t care, feel free to keep screwing over the customer base.

[David Spark] So, that’s worse because it’s really damaging the actual business.

[Andy Ellis] Yeah.

[David Spark] The first scenario is probably going to damage the business, but I’m going to throw this out – it could damage more than just your business. And Richard’s nodding his head. Are you going the other way, Richard?

[Richard Marcus] Well, you might say that I’m not allowed to give this as an answer, but I would say it totally depends on the product you’re selling, the nature of the relationship with the customer.

[Andy Ellis] You’re not allowed to give that as an answer.

[David Spark] No, you’re not allowed to say that.

[Richard Marcus] Dang it.

[David Spark] I know. I know. You know you’re not allowed to do that.

[Richard Marcus] But in all honesty, you should be falling back on your risk assessment, and you should understand what’s the impact of a false positive versus a false negative for that control. And some customers, depending on the product you’re offering, might appreciate you failing closed, right? Depending on what the impact of not having that security control in place might be. So, I would say it really varies depending on the product and the control.

[Andy Ellis] Richard, when we tell you you’re not allowed to answer, “It depends,” you don’t get to…

[Crosstalk 00:19:31]

[Andy Ellis] …it varies.

[David Spark] Yeah, it varies, it depends, whatever.

[Andy Ellis] Nice try, though. I appreciate the try.

[David Spark] By the way, you know you have to pick one of these, and I still don’t know which one you’re going with.

[Richard Marcus] Sure. Well, I think in my line of work, I would rather the control fail closed. But we also had a really interesting example of this just this past week with a pretty popular antivirus vendor that had an issue that blocked all customers from accessing Google, right? Which is a really critical, foundational thing that lots of companies rely on. If you’re a Google shop, you use it for mail, you use it for calendar. They could have disabled web protection proactively and eliminated the disruption, but rather than make the decision for their customers, they didn’t do that. They let each customer decide, based on their own risk assessment, whether it made more sense to continue blocking access to Google or to remove that web protection temporarily. So, I know a lot of security leaders were struggling with that decision this week, but really it’s a contextually driven decision, and each customer might want to make their own call.

[David Spark] All right. That’s a good story. It’s very relevant.

[Andy Ellis] You still didn’t answer the question. [Laughter]

[David Spark] You still did not answer the question. So, I need to know is it you disable all the security controls, or you leave them on and make the customer’s life miserable?

[Richard Marcus] I’ll tell you that in my organization, we decided to take the outage for the course of a few hours, keep the security control in place.

[David Spark] All right. So, you flipped with Andy on this.

[Andy Ellis] He didn’t agree. But he’s not allowed to agree with me on this one.

[David Spark] Yeah. Because…

[Andy Ellis] If he agreed with me, they’d be like, “Richard, that’s the last time you get to publicly speak on our behalf.”

[David Spark] [Laughter]

Please. Enough. No more.

21:04.543

[David Spark] Today’s topic is “compliance does not equal security.” We’ve heard this a lot. We’ve heard variations of this – that compliance is check-the-box security – and our audience is pretty darn savvy, and they’ve known this for quite some time. So, Andy, let’s move on from that. What have you heard enough about from this “compliance is not security” argument, and where would you like it to go? What would you like to hear more about?

[Andy Ellis] So, I think I’d like people to talk more about the fact that compliance is two things, right? One is that it’s a set of product requirements – like you’re doing product management. I need to comply with HIPAA to sell into healthcare, so those are just a set of requirements. The other is that it’s a typesetting exercise: taking your security program and presenting it in a way that is pretty and beautiful and makes your auditors happy. And unfortunately, I think what happens often is people try to do those two things at the same time – they ask what the requirement is and how to make you happy – and they don’t make that connection back through an actual security program. So, I’d like to hear us talk about compliance as being an input and an output, but that those are not connected, they’re not the same thing, because that’s I think what gets us into the check-the-box security problem.

[David Spark] That’s a really good point – compliance, because you’ve got to do it anyway, could be a great structure for developing an awesome security program. Richard?

[Richard Marcus] Yeah, I would agree. I think most CISOs are looking to strike a balance between the use of established frameworks and a risk and threat focus. So, one of the things that I’d like to see is more discussion around risks and threats. As a SaaS vendor, I see a ton of security questionnaires, I get a lot of requirements from customers, a lot of requests for industry standard compliance. And one of the things I’d like to hear more about is questions that show the customer really understands the threat model the product operates under and seeks to understand how those threats are being managed within the product. So, more open-ended discussion around risks and threats, and less focus on a standard list of requirements that may or may not actually apply to the product being sold.

[David Spark] So, how are you dealing with I guess customers who have kind of a check-the-box attitude towards compliance?

[Richard Marcus] Sure. In a connected risk platform, you’re really striving to show traceability between assets and risks and requirements, right? So, you really understand why you’re doing certain control activities, you understand how investment in those control activities addresses your risk or threat landscape, and that makes it a lot easier to focus on the key controls that matter and let go of maybe the ones that don’t. So, I know a lot of times CISOs need to be able to communicate the why around security requirements, and a lot of times the why around required investments, and they need to be able to do that in terms other than “XYZ framework says we need to do it.” And so it really facilitates that storytelling and really optimizes the prioritization of your control activities.

[David Spark] Let me throw something out, and I want Andy to jump in on this as well. We hear this a lot – that the relationship with the auditor goes two ways. They don’t have all the answers on how things should be handled, and [Inaudible 00:24:18] be communicating back to the auditors like, “This is the better way to do it. We should be auditing this.” Andy?

[Andy Ellis] Oh, absolutely. In fact, most auditors – the person who actually shows up to read through all of your paperwork – are very junior. There’s high turnover in that career field, so you’ll sometimes get somebody for whom this is their first audit in their first year; they’re just out of school. And so you do have to do a lot of education, and that is a challenge, because how do they know that they should trust what you say to them? But you should absolutely be doing that, and sometimes you do have to have the argument with an auditor. I’ve been there, where I’ve had auditors insisting that a certain control was critical.

In fact, for me it was with the FedRAMP program office insisting that measuring humidity sensors was an interesting control in data centers. I’m like, “I’m in 3,000 data centers. I don’t care about the humidity in any one of them because data centers are disposable to me.” But they’re coming from the point of view of like, “You’re only in one data center. If you don’t have good humidity controls and you’re the one operating it…” I’m like, “I don’t operate data centers. Other people operate data centers for me, and I assume that they check the humidity because if the humidity screws up all the computers in the data center, they’re out of business, and I just lost 1 out of 3,000 nodes. What do I care?”

[David Spark] So, this goes back to you, Richard, and correct me if I’m wrong – I haven’t seen this. Given that AuditBoard is essentially a good storytelling tool, it can help greatly in this conversation, to move this discussion in the direction you want for your own darn security program. Yes?

[Richard Marcus] Yeah, absolutely. It’s tough to ask an auditor to take your word for it, but if you can show the math on how that risk management calculation works and show that that’s been reviewed and management is aware and that’s been worked through a formal process, I think that conversation goes a lot more smoothly. You’re really trying to paint a picture that risk management is done in a mature and effective way in your organization, and that any gaps or any maybe risk acceptance that you’ve taken is thoughtful and transparent across the organization.

[Andy Ellis] There’s a really good thing in what Richard just said that we should highlight around that management is aware and what does that mean. And this was really telling for me. I used to have fights with my internal audit team because I was like, “Why don’t you highlight certain things, and you highlight other things?” when they’re pointing at other organizations. And then they come point at my organization. They did an audit and they had like six findings, and I looked at three of those findings and I’m like, “That’s BS. I will happily stand up in front of the board and say I’m not going to fix that,” and they deleted the findings. And I was like, “What are you doing?” And he said, “Oh, our findings are not the acceptable risks. You’ve accepted that’s a risk, that’s not a finding.” He said, “The findings are the things that you would be embarrassed to have to say that you’re doing it and so you want to fix it. So, that’s what we report to the board is the things that you’re embarrassed to have in front of the board.” And I’m like, “Ooh, I can be pretty ballsy and just say I’ll accept all of these, get rid of all your findings.” Obviously, they didn’t take me up on that.

What annoys a security professional?

27:36.042

[David Spark] “If your position in an organization includes responsibility for security but does not include corresponding authority, then your role in the organization is to take the blame when something happens. You should make sure your resume is up to date.” And this was said by Gene Spafford, a very well-known security professional who teaches at Purdue University, and it was actually quoted by a redditor on the cybersecurity subreddit. And this was in reaction to a post on the cybersecurity subreddit from a redditor who said, “I feel like cybersecurity is a sham at a lot of companies. They’ll say they care, but in reality they don’t.” And we go back to the line of “We care, very serious about security and privacy.” So, the responsibility/authority paradox is what highly frustrates security professionals, but at the same time, they’re not running the business. So, if you’re falling into the gap, Andy, what can be done to remediate it outside of leaving?

[Andy Ellis] So, I look at this and I say I don’t think that most people have a coherent definition of what the words “responsibility” and “authority” actually mean there. Gene’s got a great statement, I love Gene. He’s really good at saying things from an academic perspective that sound nice and just don’t play in the business world. And he’ll admit that, so I don’t think he’s going to think I just stabbed him in the back because he wasn’t here.

The reality is if you’re a security professional, you have two jobs. One is to help your business make wiser risk decisions, and the second is to execute on the risk decisions that they have made. Because sometimes we’re operational and we get told, “Oh, you’re the one who has to go manage the IAM system.” “Okay, great. I’ll go do that.” But there’s no responsibility or authority other than to make sure your business partners are making good choices. That’s it. And good choices doesn’t mean no risk. It means they’re taking profitable risk decisions that are consistent with the business, and that you have made sure that the right level of the business is aware of the decision and has bought off on it. And that’s your only decision. That’s it. You have the responsibility to do that, and you have the authority to do that. If you don’t like the decision someone is making, and you think their boss should have made it, take it to their boss. You have the authority to go do that.

[David Spark] So, there’s a clear definition of authority and responsibility. What say you, Richard? And do you think though there’s a trap that many security professionals fall into?

[Richard Marcus] Well, I think what you’re really talking about there, Andy, is influence. I’d rather have the authority, but if I don’t have the authority, at least I hopefully do have some influence. And I think how you use your influence may impact the amount of authority you gain or lose in the future, as the organization comes to understand how that influence has been helpful in support of their broader business objectives over time.

So, there’s a couple of different ways that I think you can use your influence. It kind of goes back to what I said at the beginning of the podcast here about sharing the bad news faster, right? You can use that influence to kind of highlight certain things in the organization and sort of drive the decision making that you think is most prudent. One of the ways that you can do that is by demonstrating alignment between your security initiatives and business objectives. In my line of work, it’s about using the customer feedback to my advantage, and it’s amazing how fast my budget initiatives get approved when it’s not just my idea but it’s our biggest strategic customer that’s coming in and saying they need this to continue doing business with us.

[David Spark] So, the idea of letting somebody else lead the charge, and you’re just sort of being the champion for it, if you will?

[Andy Ellis] Right. He’s being a product manager, right?

[Richard Marcus] Sure.

[Andy Ellis] It comes down to my earlier thing about compliance. We have a product requirement from a customer to do this thing. This isn’t my authority saying, “I’m going to do the thing.” This is the business making a choice about what product features we’re going to go do.

[Richard Marcus] Sure.

[David Spark] Good point.

[Richard Marcus] It’s about alignment. And it’s not that they think my ideas are bad, it’s just if I can demonstrate traceability back to revenue or some business objective, it’s a lot easier for them to be influenced by that guidance.

[David Spark] That’s the key right there. Trace it back – “Why am I doing this?” I’m going to quote Steve Zalewski, who said this all the time when he was the CISO over at Levi’s: “How is this going to sell me more jeans?” How do I trace it back – good point.

[Richard Marcus] Yeah, I mean, there’s also traceability back to risks and threats. I mean, that’s probably obvious, but I think making it easy for your executive sponsors to understand kind of how to connect the dots between what you’re trying to get done and the risk and threat landscape and just making it easier for everyone involved in that process, whether they’re a risk owner, an asset owner, they’re on the executive team, internal audit, or the board. You’re all kind of understanding kind of how those dots connect and sort of why you’re doing what you’re doing.

[David Spark] Excellent point.

Closing

32:39.938

[David Spark] And that brings us to the end of this fantastic show. This was spectacular, Richard. This was awesome. Thank you, Andy, too. I’ll give you some credit as well.

[Andy Ellis] Just a little bit.

[David Spark] Just a little bit. I don’t want your head to inflate. Here’s my fear – when your book comes out, and it’s going to get spectacular reviews, your head is going to swell to a size that I am not going to be able to handle anymore.

[Andy Ellis] It’s going to be so big you’ll have to launch a new podcast just for my head.

[David Spark] Just for your head. [Laughter] It’ll be called Andy’s Swelled Head Podcast.

[Andy Ellis] There we go.

[David Spark] Well, let’s close this show up. Thank you very much. I want to thank our sponsor – that’s Richard’s company, AuditBoard, auditboard.com – and what a great way to attack this problem. We’re all dealing with compliance issues, we’re all dealing with auditing. You can do it the boring way and literally chug through it again and again and again, or, if you actually want to connect some dots here – which is what I’m seeing AuditBoard is offering – take a look at what they’re doing. And by the way, Richard, I’ll let you have the very last word here. Andy, any last thoughts yourself?

[Andy Ellis] Well, I just want to say Happy Thanksgiving to those folks in the United States. If you’re in Canada, I’m saying Happy Thanksgiving, but it’s going to happen in about a week from our recording rather than in a week from the aired episode.

[David Spark] So, you can say, “I hope you had a good Thanksgiving.”

[Andy Ellis] So, I hope you had a great Thanksgiving, but from where I am right now, I’m wishing you all a happy Thanksgiving, you just don’t hear it until afterwards.

[David Spark] No, I’m recommending our Canadian listeners travel to the United States and essentially knock on somebody’s door…

[Andy Ellis] And do both.

[David Spark] …and have an American Thanksgiving.

[Andy Ellis] That’s an awesome idea. In fact, I know at least one listener who I think is going to come down right after this and come down for Thanksgiving.

[David Spark] Sounds good. All right. Richard – the question I always ask our guests – are you hiring? So, please let me know the answer to that, and any last words you want to say about AuditBoard? And if you’ve got an offer for our audience or anything like that.

[Richard Marcus] Yeah, sure. We’re always hiring. We’re growing like mad. So, if you’re interested in helping us tackle this problem, definitely reach out to us. I just wanted to say thanks, Dave and Andy, for having me here – it was a really enjoyable conversation. And just as a kind of closing thought, I think a lot of the topics we touched on today really point out that, more and more, the effectiveness of the CISO role is about your ability to wield data and influence people.

[David Spark] Tell a story. We hear that all the time.

[Richard Marcus] Yeah. And I want everyone to know that AuditBoard was designed by practitioners, for practitioners, to help you connect the dots – bringing people together, internal and external, operating from the same data core and the same workflow, and reducing the effort around this process to really help increase transparency and accountability in your organization. So, if that’s something you could use some help with…

[David Spark] I’m going to sum it up for you here, Richard.

[Richard Marcus] Sure.

[David Spark] AuditBoard, the cybersecurity storytelling tool.

[Richard Marcus] Sure, yep. That’s it. That’s it. So, if that’s something that you got challenges with, definitely check us out.

[David Spark] Awesome – auditboard.com, and you’ll be able to link to it from the blog post of this very episode as well. Thank you again to my co-host, to Richard Marcus, who is the VP of information security over at AuditBoard, and to our spectacular listeners. We greatly appreciate your contributions and listening to the CISO Series Podcast.

[Voiceover] That wraps up another episode. If you haven’t subscribed to the podcast, please do. We have lots more shows on our website, CISOseries.com. Please join us on Fridays for our live shows – Super Cyber Friday, our Virtual Meetup, and Cybersecurity Headlines Week in Review. This show thrives on your input. Go to the Participate menu on our site for plenty of ways to get involved, including recording a question or a comment for the show. If you’re interested in sponsoring the podcast, contact David Spark directly at David@CISOseries.com. Thank you for listening to the CISO Series Podcast.

