Why security teams need an empathy filter
28th Feb, 2023 | Episode #24
Everyone who works in cybersecurity needs this reminder from time to time: people who are not in this space aren’t obsessed with the latest attacks and their impact. They probably don’t care at all because they already have other difficult projects they’re working on or personal issues that stretch them thin.
Any security team that wants to be effective and make a difference needs to keep this idea at the top of their mind when rolling out an awareness campaign or sending out an email.
Overly technical and dramatic messages about trending or successful attacks fly right by busy ears.
So what’s the solution?
Creating simple messages that resonate with people in their context. This is a practical way of using empathy to create true resonance, but it’s often difficult to accomplish without help. That’s why a non-IT specialist with communication expertise can act as an empathy filter for the security team when brought on board.
My guest today, Lance Spitzner, Director of Security Awareness at the SANS Institute and founder of the Honeynet Project, coined that term (“empathy filter”) as we were recording.
His over 20 years of security experience in cyberthreat research, security architecture, and awareness training really shine in this episode, creating momentum and motivation for change.
Lance has published three security books, consulted in over 25 countries, and helped over 350 organizations build awareness programs to manage their human risk. He remains hands-on, dedicated, and an energetic vector for the cybersecurity community.
In this Cyber Empathy episode, Lance explains why simplifying security is the best approach to protecting cybercriminals’ favorite target: people. He also shares examples of how to do this in practice and who to ask for help to achieve it. What’s more, this episode helps you determine whether your security team is empathetic.
In this episode, you will learn:
Why simplifying security is the best approach to secure people (02:24)
Why security teams need an “empathy filter” and who can play that role (10:20)
The importance of having an empathetic security team (18:13)
Lance shares an empathetic security approach success story (30:00)
Lance Spitzner is the Director of Security Awareness at the SANS Institute and founder of the Honeynet Project. He has over 20 years of security experience in cyberthreat research, security architecture, and awareness training. He has also published three security books, consulted in over 25 countries, and helped over 350 organizations build awareness programs to manage their human risk.
Through it all, he remains hands-on, dedicated, and an energetic vector for the cybersecurity community.
[00:42] Andra Zaharia: Something that people don't really see when they look at cybersecurity from the outside is that this discipline offers a lot of principles, concepts, and practical ways to develop a healthy company culture. At the core of cybersecurity are a couple of ethical principles that really stimulate critical thinking, focus on proactive behavior, offer a lot of insight into understanding and reducing risk, and a lot of other helpful concepts that are simply oriented towards positive outcomes. So, it was a pleasure for me to talk to a specialist who spent over half of his three-decade career in cybersecurity on the topic of building a security culture. So, this episode features Lance Spitzner, who has been transforming how security teams operate across the world. He's actually helped over 350 organizations build security awareness and culture programs to manage their human risk. He's also worked with specialists in over 25 countries to improve their security culture. So, through that experience and wealth of knowledge, he can pinpoint what makes empathy practical for cybersecurity teams, which is what this episode is all about. Very excited for you to hear it, and I'm even more excited to see what you do with the insights that you get from it. So enjoy it.
[02:24] Andra Zaharia: I wanted to dive right in and ask you, how do you get people to reach that moment of self-awareness where they realize that empathy makes a real difference in their work and the lack thereof as well?
[02:40] Lance Spitzner: Sure, and I'm assuming this is from the cybersecurity perspective. So, let's take a step back. And before we go into the how, let’s go into the why. If we think about it, what is cybersecurity all about? Managing risk. For the past 20 years or so, that's really been focused on the technical side. I would argue we've actually gotten pretty good at using technology to secure technology. I mean, you and I were just talking earlier about endpoint technologies, perimeter technologies, cloud-based technologies — we've got a lot of technology we've been throwing at this security challenge, and we've gotten really good at it. And in some ways, we've gotten so good at it that we're now driving the cyber threat actor to target the human, because the human is now the most insecure operating system out there, and we see that in phishing attacks, password attacks, things like that — and it's working. So, talk with most security leadership today, and they'll tell you that the human is the biggest challenge. Sometimes we even hear that the human is the weakest link, a term I'm not a fan of because it blames people — empathy. And what I prefer is: people are not the weakest link, people are the primary attack vector, and the reason is that we've done a really good job of securing technology and a really bad job of securing people. So, there's the why; now we have to make the connection to empathy. Okay, Lance, people are the primary attack vector, what does this have to do with empathy? Well, let's take the next step. Why are people the primary attack vector? Because they're vulnerable. Why are they vulnerable? Because we've done a lousy job of securing them. Why have we done a lousy job of securing them? Because we've made security very complex, overwhelming, and scary. So, when security teams are telling people what to do, they tend to speak on their own terms.
And as a result, people are confused and they often just give up or they try to do what we tell them to do and fail. Passwords — a classic example — we always wonder why people don't exhibit secure behaviors with passwords. Well, think about it, we're overwhelming them: Passwords have to be 15 characters, uppercase, lowercase, symbols, numbers, mixing the blood of a virgin, then change it every 90 days and don't write it down.
[05:20] Lance Spitzner: So, if we think about it, really, in a lot of ways, it's the security community's fault. So, what we need to do is communicate, engage, and make security simple. Because if we don't, right now, if I go to most people and I say, “What words or what terms would you use to describe your security team?” they're probably going to say words like arrogant, egotistical, unfriendly, unhelpful, technical, confusing, and overwhelming. And why is that? That's kind of like what you and I were talking about earlier: because security teams are not thinking in terms of the people they communicate to. Your workforce — they're not stupid. They're doctors, lawyers, accountants, scientists, engineers, nurses, and researchers. They are smart people that want to do the right thing. It's just that security is not their passion, and it's not their job. So we can't get mad at them if they don't understand MFA, 2FA, SSO, and things like that. This all comes down to empathy. If security teams just took a moment and thought in these terms: “Well, what's their world like? How could I make cybersecurity simple for them? How can I communicate in their terms?” they'd be far more likely to engage and make cybersecurity simple. So, remember all those terms used to describe a typical security team. Well, the security team can take a step back and go, “Well, you know what? Empathy, emotion is such a big driver of people's behaviors.” If we start making security simple, if we start becoming friendly and approachable and helping them, then people start describing the security team in terms of “Wow! They're friendly, helpful, collaborative, approachable, engaging.” And really, this builds trust.
[07:18] Lance Spitzner: So, why do we keep coming back to empathy? Empathy is that key ingredient in getting security teams to effectively engage your workforce and really create a strong security culture. So, what's interesting is I do a lot of research in culture. Because think about it, culture is something businesses and organizations have been studying for decades. There are so many great models on culture out there: ADKAR, John Kotter's eight steps, and there's also all sorts of great research by Daniel Kahneman, Cass R. Sunstein, Cialdini; the list just goes on and on. So, if you take all this great research, it keeps saying the same thing: to build that strong culture, it really comes back to empathy. In fact, the latest book I'm reading right now was written by the CEO of Microsoft, Satya Nadella. He emphasizes that when he went in there, Microsoft was notorious for a very toxic internal culture, and he specifically brought out this word, empathy. So, that's why I was so excited to see your series about empathy, because a lot of times we don't realize it. But when we are struggling to secure our workforce, it's not the technology, it's not the processes, it's not the policies; it's the security team and how they're engaging the workforce. Quick example: phishing simulations. That is the easiest way to tell if a security team has empathy or not. Phishing simulations are when we send out phishing emails to train our workforce. If your security team is sending out super targeted phishing emails, and then when people click, the security team is like, “Oh, they're bad. Let's fire them,” you're going to develop a really toxic security culture, nobody's going to trust the security team, and wham, things go down. But if the security team approaches it as, “Hey, folks, we want to be very helpful. We're only going to use the attacks that cyber attackers are using. If you click once, not a big deal,” it's a great reflection of how empathetic your security team is.
And I've seen phishing simulations go very bad, and I've seen them go very good, where people love it; it's almost like gamification. So, that was a big data dump there, but I'm so excited about this word “empathy.” At first, you're like, “What does this have to do with cybersecurity?” Then you take a step back and realize, “Well, the humans are the biggest risk.” Why are they such a big risk? Because we've done such a lousy job engaging them. Why have we done such a lousy job of engaging them? Well, it comes back to emotion — empathy. If security teams could view things through the lens of empathy, they’d be so much more successful. And it's hard to do. It's not that the security teams are bad; it's just hard to do. Big data dump, sorry.
[10:20] Andra Zaharia: No, please do not apologize. That was amazing. It's like a tiny workshop in a few minutes, which is a huge thing. You really took a deep dive into the problem and exposed the entire ecosystem in a simple way, which I think is so important to do: to be able to take a step back and have that big-picture view. And you mentioned that this is something difficult to do, and I have a few hypotheses about this. Well, they're rooted in psychology, which, obviously, communication is based on. You mentioned culture, and the culture of hacking in general — and I don't mean malicious hacking, I mean the culture of ethical hacking — is so obviously rooted in some values that are very adversarial. The big picture is good versus bad, obviously. And I think that often people end up in the bad category in this kind of dichotomy, which helps no one. So, now that we're bringing in more nuance and people from other disciplines into cybersecurity, I see the discourse and the culture changing just a little bit. But it takes a long time, obviously, to change those mentalities. So, I was wondering, because you have such huge, extensive experience with training people, what have you seen are some, let's say, examples or moments, those aha moments where people start to realize that this is something that's viable and practical? And by this, obviously, I mean actually practicing empathy.
[11:54] Lance Spitzner: Fantastic question. And once again, like you said, let's dive into the world of human psychology. Sometimes I get people from the security community saying, “We need a strong security culture,” so they develop an entirely new process to do that. Why? There's all this research out there on how to build strong cultures, innovative cultures, friendly cultures, wellness cultures, and safety cultures; let's just steal from that. So, bringing up psychology, the Curse of Knowledge is a cognitive bias. Cognitive biases are basically shortcuts our brains are hard-wired with, so we have the ability to make decisions very quickly when we're overwhelmed by a lot of information. And there are hundreds of cognitive biases out there. Curse of Knowledge is a cognitive bias that states that the more of an expert you are at something, in many ways, the worse you are at communicating it. So, what really happens is, and this is where that empathy thing kicks in, a lot of times security teams will communicate to the workforce, “Hey, we're rolling out a new tool, password managers.” “Hey, we have a new policy, single sign-on.” “Hey, we've added these new expectations.” “Hey, we just got hit by this attack.” And when the security team communicates to their workforce, they communicate on their own terms — it's very technical, it's about protecting the organization, talking about these compliance standards. So, a lot of times, people, when they read it, don't understand it, and then don't care. And the security team is not doing this with malicious intent; it's just that they assume everybody is just as passionate about cybersecurity as they are, and they assume people know what they know. So you'll see security teams talk, and I've seen this happen: “Hey, we've had this APT threat actor try out a new TTP, specifically password brute forcing on our SSO active directory systems.” I'm just throwing words. And to them, that makes perfect sense.
And then they send it out to the workforce, and it's just boom, doesn't work.
[13:59] Lance Spitzner: The key thing with these ideas of empathy and culture is that it's not a one-time event; your culture is built over time. And as we keep engaging, communicating, and interacting with our workforce, over the years, that culture goes one way, toxic and untrusting, or the other way. So, when you said light bulbs: quite often it's just helping the security teams understand this curse of knowledge, “Hey, you're communicating on your own terms.” So, I've literally done this, and it works very well. I'll work with security teams when they're going to make a big announcement, “Hey, there's a big patch update, a new tool rollout, a new policy, we got hit,” whatever. The security team will craft the email, and I'll say, send it to me first, and then I'll recraft it so it's easier for people to understand. So, they'll send me a five-paragraph, highly technical email that has all the details of what's happening and why, all this stuff people don't care about and don't understand; people just want to know what they need to know. The security team will send me that email, and then I'll simplify it, I'll shorten it, I'll get straight to the point, but also convert it into the terms that people care about — “Hey, here's how this benefits you,” and things like that. And then I'll show the security team the email, and they're like, “Holy shit, we never thought of that!” So, it's not that there's malicious intent; they just don't know how to do this. So, a quick example, and then I'll give you a solution to this. Password managers are something that organizations are quite often rolling out. Are they perfect? No. Are they better than what we have now? In a lot of cases, yes.
[15:42] Lance Spitzner: So, a typical security team will roll out password managers, and you'll see in the email, “Hey, folks, we're rolling out this technology called the password manager. It's a secure encrypted database that exists in the cloud, enabling you to secure your passwords by doing all these things.” It's a little confusing, it's a little overwhelming, and people are like, “Oh, crap! Why do I have to do this now?” Whereas if the security team sends out an email, “Folks, do you hate passwords? Do you hate remembering them, creating them? We've got a solution that's going to take care of all your problems and simplify your life. There's this thing called password managers that's going to do all the work for you.” So the one security team is talking in technical terms about the importance of securing the company, whereas if we flip it: “Hey, folks, the security team is here to simplify your life.” Boom! It's just a simple example. So, this is why, as you and I were talking earlier, sometimes it helps if the security team has somebody with a communications background or organizational change background. Sometimes I don't want them to be a security expert; I want them to understand the concepts of risk and risk management, but I don't want them to be a cyber threat intelligence, TTP world expert. And the reason why is this: if your security awareness, culture, or communications officer is not a world security expert, then when the security team puts together a tool rollout, an email communication, a new update, or a new policy, that communications person can go, “I don't understand this. And if I don't understand it, the workforce won't understand it.” It's almost like the curse of knowledge litmus test. So, I always get a little frustrated when, for roles covering culture, behavior, awareness, and training, companies are looking for somebody with a computer engineering degree.
No, they don't need to be a security expert, because you already have a security team of experts. The security team can tell the communications person what needs to go out. The communications person is basically your empathy filter. In other words, getting a security team to do empathy takes a while; they will get it as they start seeing the impact. I just came up with that term, but you made me think of it. Your security awareness and culture person is your empathy filter.
[18:13] Andra Zaharia: And a great contributor to the overall culture of the organization. All of the language that you mentioned, and the way that security teams tend to typically communicate with the people that they serve, is such a good reminder that most security companies, unless they are talking to a very specialized B2B audience, tend to communicate similarly with their customers, with consumers, and it's equally confusing, overwhelming, and fear-inducing, obviously. So, I think that the principles that you mentioned apply across the board to a large part of our industry, both in the teams that serve organizations, but well beyond that as well. And when it comes to the culture, I'm such a firm believer that cybersecurity works and functions on a couple of principles that are really great drivers for ethical behavior, for stimulating critical thinking, and for developing a culture that is focused on good values that benefit everyone. And I see it as a very powerful tool for personal development. And I've seen it in people such as yourself and other members of the community who are pushing things forward and changing things for the better across the board in terms of how companies are built, how teams are developed, how products are designed, and so many other aspects that influence society at large. So, I wanted to ask, because you've obviously seen both sides of the coin: the great progress that people make once they have these realizations and act on them, but also the, let's say, lack of progress or development after those experiences. So, I was wondering if you could pinpoint for listeners: what are the unseen costs of that lack of empathy? How does growth stall in those companies? But also on a personal level, because I think that that's where we feel the consequences the most.
[20:14] Lance Spitzner: So, I'm going to take this back to just the cybersecurity perspective, because that's where my expertise lies — like, what's the impact? What's the cause? I teach a security culture class. And one of the things I cover is: what are the indicators of a strong, healthy security culture, and what are the indicators of a bad or toxic security culture? And really, what it comes down to is, if you have a toxic security culture, there's really no empathy, there is no trust. Whereas if you have a strong security culture, you've probably got a lot of empathy and a lot of trust. So, let me give you an example of that indicator, which also gives you an example of the cost — what happens if you don't have that? So, here's a question, this is what I would ask your workforce, because I'm measuring trust, measuring empathy: How safe, how comfortable do you feel reporting an incident even though you know you caused it? So, there's a lot of training out there: “Hey, how to identify an incident.” “Hey, how to report an incident.” But has anybody ever asked how comfortable you feel reporting the incident? Well, think about it. If you have a very negative, toxic security team where there's no empathy, people don't trust the security team; the security team is known as punitive. All right, think about this scenario: one of your workforce clicks on a link and instantly sees their computer infected, and they get a popup message: “You are infected with ransomware. Pay $200 to get your computer back.” Now, is that person gonna go, “Oh, man, I just got my computer infected! I better let the security team know.” Or is that individual gonna go, “Oh, man, I just infected my computer. I'm gonna pay that $200 so nobody finds out.” So, first of all, you've got to ask yourself, if you've got a negative or toxic security culture, what are people not reporting? What are people not identifying or pointing out?
[22:26] Lance Spitzner: So, for example, another one of my favorite indicators of a strong security culture is this. I don't want to know how often the security team is communicating to or engaging your workforce; that's a training metric. What I want to know is: how often is your workforce taking the initiative to reach out to the security team? In other words, “Hey, security team, I've got a question.” “Hey, security team, this policy is a little confusing, can you clarify?” Or “Hey, security team, we've got a team meeting next week. We're the legal team. Could you stop by and brief us on cybersecurity issues?” “Hey, security team, we're designing a new product and would love to involve security from the beginning.” So, what happens is, if the security team is trusted, quite often that means people in your workforce are going to be reaching out to and engaging the security team. Whereas if you have a toxic security culture, an untrusted one, or a lack of empathy, people are going to be doing everything possible to avoid the security team. So, once again, you have to start asking: what are the costs there? What are the vulnerabilities being introduced by people trying to avoid the security team because they're known as the team of pain? And it really comes down to trust. But what builds that trust? Empathy. Because really, what security teams don't realize is this — and this, once again, goes back to all the great research out there. And the best one out there is Thinking, Fast and Slow, written by Daniel Kahneman. He won a Nobel Prize for his research. It took me three months to read the book because it's so much information. But really, what the book comes down to is that people don't base their decisions on facts and logic; they base them on emotion. Our brains, by default, make decisions by emotion. Lots of other books support that — Nudge, Malcolm Gladwell, Simon Sinek, all this great research out there — emotion drives people's behavior, not logic.
And what happens is, security teams communicate and engage through the lens of logic when we need to communicate and engage through the lens of emotion. In my security culture class, I have a picture of Spock and Homer Simpson. See this Spock guy? That's who security teams develop their security policies for. You see Homer Simpson? That's who we should be developing our policies for. That's what it really comes down to, and that's why I love this idea of empathy, because that's the foundation. If you go and communicate through the lens of empathy, it all comes together.
[25:11] Andra Zaharia: The point that you made about adding cognitive effort versus reducing it by creating that comfortable space through emotion — so powerful. I keep waving this flag of “let's focus on people's emotions,” even with the most technical people that I've worked with. And although, obviously, their instant reaction is that this doesn't belong here, that's starting to change a lot with younger generations being a lot more, let's say, emotionally mature. I think they developed emotional maturity a lot faster, perhaps, than we've had a chance to do. And it's very interesting to watch how the conversation changes, the aspects that they incorporate into their work, and how multidisciplinary they end up being. When I see people thriving in this industry, whether as a team or individually, I see that there's very close alignment between their structure of values and principles and the work that they do. And it's that alignment that creates deeper involvement; it makes it easier to be empathetic and to connect to people. So, I was wondering, because you're the kind of person who does this kind of great work with such huge impact, work that travels to corners of the internet and of the world that perhaps you may not even realize, do you have a personal story of someone showing you empathy that made you want to pursue this as a key component in your work?
[26:41] Lance Spitzner: That's a really, really good question. You run into it all the time in this field. I would say probably one of the biggest things that really helped me: I'm not going to go into the details of it, but I had a very close family member just recently go through cancer. And it was not fun. Everything's clear and everything's fine now, but we went through six months of having to deal with cancer. And one of the things that it really taught me is that everybody out there is dealing with some challenge in their life. It might be a financial challenge, a healthcare challenge, an emotional challenge, dealing with depression, maybe a child having problems in school, maybe they just had a big car wreck — all of us are dealing with something, and it's going on in the background and we're not going to see it. So, if they're late to a call, if they seem a little distant, if they seem a little frustrated or something like that, what it's really taught me is the importance of empathy. I’ve had a lot of things go really well for me, and then there was a six-month period of real darkness because somebody very close to me was going through something very difficult. And that really taught me the importance of “Hey, everybody's gonna have something rough in their life, so look through the lens of empathy, try to understand, and just assume the best.” They didn't respond to your email? It's not because they don't like you, it's not because they're malicious; it's that they're overwhelmed, they're crushed, they're busy, things like that. So, that really taught me the importance of empathy, and it's just something I always try to keep in mind when communicating with or working with others.
[28:37] Andra Zaharia: Thank you for sharing that, and thank you for the vulnerability. I know that it's not easy to dive into these things, but that's such a powerful reminder for everyone. And this is one of the things that I really appreciate about this community: people tend to talk about their personal struggles a lot and explain how they influence their career, which helps set healthier expectations for other professionals who are trying to grow in this field, whether they're dealing, like you mentioned, with depression, with health issues, or with other kinds of problems in their lives: breakups, moving houses, even getting fired, and other things like that. I think that those conversations are very valuable because they normalize not being able to be 100% there but still trying to make a positive contribution as much as you can. So, that factor really helps. And honestly, sometimes showing other people empathy comes easier than showing ourselves the same thing, because many of us just tend to push ourselves a lot, and I think that that's a good exercise to engage in as well. As a, let's say, way to round off a conversation that I honestly wish wouldn't have to end, but we're trying to give listeners as much of an insight into the kind of work that you do and what you bring into this community as possible, I wanted to ask if there is, let's say, a particular story or example where you saw this empathetic approach really make a difference over a longer timeline, because you have a chance to see people evolve in this field. I was wondering if you could share one of those stories, because I know you're also a great people connector; you basically bring people together around you and manage to build such a great community.
[30:45] Lance Spitzner: It's not a big deal, but sometimes these little aha’s can have a big impact. I've worked with one smaller organization where they didn't and don't have a security awareness officer or security culture officer; it's just been 5-10 total security geeks on the total security team; so, smaller company, smaller security team. But I worked with them because they wanted to get a better understanding of the people side. So, they took my class (I have a class on security awareness, security culture, and things like that) and they wanted to start applying it. And they sent the first couple of emails and communications, tool rollouts, and things like that. And bless their hearts, they tried to do the right thing, but the emails were just horrid; they were technical, confusing, overwhelming; you read them and people just didn't know what they were supposed to do, if anything. So, I started working with them and I said, “Look, anytime you're going to send out an email, send it to me first.” I would recraft it and send it back to them, and then they would send out what I crafted. And after a while they started getting it, and I started seeing these emails get shorter, easier to read, things like that. Now, could they be improved? Absolutely. Communications experts would look at them and may even cringe. But bless their hearts, these technical guys are trying to do the right thing, and they are making a big improvement. So, what it really helped me understand is that this lack of empathy is not something with malicious intent; they just have no idea that how they're engaging creates the perception of no empathy. From their perspective, they are showing empathy: they're trying to do the right thing, they're trying to secure people. But remember, reality is perception.
What they don't realize is the workforce has this perception that the security team is toxic, and what they don't realize is the impact of that perception. So, working with that small security team in this small company helped me really realize it's just a knowledge issue, an education issue. It’s not just educating the workforce, it's educating the security team on how to engage the workforce.
[33:02] Andra Zaharia: Thank you so much for that example; that is super, super helpful. And especially for emphasizing, again, the fact that this doesn't come from a lack of trying or a lack of willingness, but simply because these people come from a different background. And that's, again, something that the rest of us, let's say, who work in a different space or a different niche in cybersecurity, should not criticize but look to support and understand, and then try to engage people in that constructive conversation, because there are also a lot of hot takes, takedowns, and criticism around the language that some people tend to use. But again, going negative never helped anyone change for the better. It's only through making things more palatable for people, making them digestible, making them feel like they're part of their universe already, and helping them use the skills that they already have. I mean, specialists in this industry are so insanely talented; they have such a capacity to learn and to be creative. And that's something that can be applied to language and communication like anything else. But yes, I'm going on a tangent here simply because I get very excited about this topic and, of course, about talking to you, which has been such an honor. Thank you so much for being so generous with your insights, and for everything that you do for the industry. Could you tell us where people can find and follow your work more closely and start to dive into it, and perhaps take your class if they're up to it and make that change?
[34:46] Lance Spitzner: We're big believers in building community. And what I tend to find is that the folks who are focused on the human side of cybersecurity are, probably not surprisingly, some of the most fun, supportive, and interactive folks out there. So, if anybody wants to get involved in this community or learn more about the classes: we have summits, which are free to attend and virtual, from around the world, and we have an online community forum. If anybody wants to learn more, probably the best way is just to reach out to me. I'm on LinkedIn. I'm on Twitter. So, once again, my name is Lance Spitzner, or if you want to just email me, email@example.com. Reach out. You have questions, you want free resources, “Hey, how do I access the free summit, the course?” Happy to help out.
[33:03] Andra Zaharia: That is absolutely awesome. Thank you so much for that level of openness. It takes a lot of patience and generosity with your time. So, thank you for that.
[35:51] Lance Spitzner: Well, thank you. This is awesome. We’ve got to do this again sometime. Let's get you out to the summit.
[35:55] Andra Zaharia: Thanks so much!