Creating psychological safety in cybersecurity
14th Mar, 2023 | Episode #25
Psychological safety doesn't just help performance. It also creates space for us to name problems, admit vulnerabilities, and then find ways to get better.
Besides keeping technology safe to use, cybersecurity also needs to protect the people using it. Arguably, this is the most important goal for this industry.
That’s why cybersecurity specialists are responsible not only for making things safer, but also for making people feel safe. To achieve this, specialists in the information security space need to understand people’s emotional background: their fears, their motivations, their perceptions of cybersecurity, and what makes them pay attention and become emotionally invested.
When people feel that it's them against cybersecurity policies and specialists, they resist new information, processes, and habits. But when they feel heard and understood with no judgment, they open up to sharing their experience, to learning, and to putting those lessons into practice.
If cybersecurity specialists approach their work from an empathetic point of view, it becomes easier and it creates a powerful feeling of camaraderie between them and their colleagues.
Today's guest, Emma W., an experienced cybersecurity specialist with the UK Civil Service, offers detail-rich, practical observations and examples of how cybersecurity teams can create a positive experience for the people they serve.
You’ll also learn about some of the abilities that information security specialists can use to create the psychological safety that’s conducive to openness and learning. Additionally, you get to hear Emma talk about her work and what keeps her going in such a complex and challenging space.
In this episode, discover how to make cybersecurity less intimidating and more approachable by:
Creating a positive experience for people (02:26)
Developing the abilities to guide people (13:13)
Learning from Emma about what her role looks like (17:35)
Practicing self-care to make high-pressure cybersecurity work sustainable (29:55)
Emma W. brings decades of experience of working to keep people safe through her role in the UK Civil Service. For over 10 years, she's been focusing on cybersecurity, bringing clarity, kindness, and nuance to a space dominated by technology-related conversations.
Her expertise spans disciplines and covers areas that remain hidden from most people in the information security industry. It is those exact areas (e.g. psychology, sociology, anthropology) that form the key stepping stones for improving how we do cybersecurity awareness.
[00:42] Andra Zaharia: Maybe we've never thought about this, but cybersecurity gives us plenty of tools and opportunities to act out our values. What that means is that we have a chance to do things that are very much in line with our moral compass, such as keeping things safe, being fair to others, and protecting what's important to us and to other people as well. When we look at how we choose to use technology, cybersecurity actually helps us understand the consequences of our actions. And something that I've noticed in this space is that people who resonate with the concepts of empathy, compassion, and kindness actually behave according to their values and key principles. And they set an example that's really helpful for other people in the entire community. Emma offers the clarity, kindness, and nuance that we need to proceed with more caution, generosity, and curiosity in our work as cybersecurity specialists. It's very interesting how she highlights certain situations that remain hidden from most people but actually represent key lessons for improving how we do cybersecurity awareness, how we talk to people, and how we build relationships in the industry, and, of course, the other side of it as well. I am absolutely delighted for you to listen to this episode, to meet Emma, and to just soak in all that she has to offer.
[02:26] Andra Zaharia: Emma, I am absolutely delighted to have you on the Cyber Empathy podcast to have the opportunity to talk to you about all of the ways in which you're helping people improve their relationship with cybersecurity in general and improve their lives as a result. So, thanks so much for being here.
[02:46] Emma W.: Thank you for having me. It's a pleasure.
[02:47] Andra Zaharia: To dive right in, I wanted to ask: you're at the forefront of all of these opportunities to get people to feel good about themselves when they interact with any sort of cybersecurity technology or experience. How do you try to make cybersecurity specialists aware that this is such an opportunity, an opportunity to create a positive experience for people?
[03:16] Emma W.: That's a really good question. There are so many different ways you can approach it. I think, instinctively, I always head towards creating a personal connection. When you find someone who is not experiencing the positivity in cybersecurity, and is maybe tempted to start framing others in a more negative way (why did that user over there do that? Why did they push that button? Why did they click that phish?), I think the first thing to do is to try and empathize with them, and with what's got them to that position where they're feeling this amount of frustration and, almost, desperation sometimes. Because the one thing I've seen in security is that it's not a case of the good guys and the bad guys or anything like that; it's a bunch of people who are nearly all doing their best with the skills that they have and the perspective that they bring, in our positions in organizations, within families, or wherever we are. Nearly everyone is trying to do their best and trying to do the right things. So, I think when you see someone who's gone some way down a path of negativity, the main thing is to understand why they are there and to try and empathize with why they are there. Often, I think, it's a sort of bitter experience. People don't pull these views out of nowhere; they have many years behind them of trying to do things right and to help other people do things right, but ending up frustrated that others don't share their perspective and their skills. And then once you've made the connection with them and let them know that you've seen them, you understand them, and you completely get why it's hard because you've been there as well, then you can start trying to bring in alternate perspectives and encourage people to see the world from the perspective of the person that they are talking to or talking about.
[04:53] Emma W.: As an example with phishing, it's so easy sometimes to look at a test phishing email and go, “That's really obvious, why on earth would anybody ever click on that?” Well, it's obvious to you partly because you know it's a phish. The person who is on the receiving end, by definition, does not know. They are in the middle of a busy inbox with tons and tons of emails, most of which they need to click on, many of which have attachments that they need to open, and many of which are not expected, because that's how email works, that's how life works. Some of which will have spelling and grammar errors and all the things that we traditionally tell people to look out for. And I think when you start trying to put yourself in the position of that person, it's easier to go, “Oh, yeah, okay, actually my expectation isn't reasonable; my expectation that everyone should find it as easy to spot phishing emails as I do is just not going to work.” I think the other thing is to encourage self-awareness. Even as an alleged cybersecurity professional, I know how often I use weaker passwords than I should, and it does happen. I know that I write passwords down sometimes, insecurely; it happens. I know that I reuse passwords sometimes, always in an allegedly sensible, risk-managed way, according to what I think at the time.
[05:05] Emma W.: But one thing research shows us is that cybersecurity professionals are actually worse for this sort of behavior, because they have a certain grasp of the risk, just enough to make it feel that it's okay for them to break the rules. And that's fine; it's probably not something that we're going to change very quickly, but it's just something to be conscious of. If we find cybersecurity too difficult, if we find it too difficult to manage passwords the way that we know we should, well, how on earth would we expect everybody else to magically do it perfectly when we cannot do it ourselves? And I think just being honest with ourselves about that gets us a long way. So I try to notice all the time and say, “I managed that password in a slightly suboptimal way,” or, “I skipped an update because I went, ‘Oh, God, I just don't want to do that now,’” even though I know I should do it now. I always will get around to it later, I always will. But I think just being conscious of those things and the thought processes that we go through, and of the fact that those are the same things that other people experience, helps, because we're all people at the end of the day (newsflash!) and we all experience these frustrations and difficulties. And again, I think just encouraging that self-awareness and honesty is a great thing; it helps remind us that we're all in the same boat to some extent.
[07:12] Andra Zaharia: Those are such powerful examples. Thank you so much for that. I feel like the way that you talk about compassion and self-compassion always starts with our own examples and experiences. It's just like when you learn to give feedback: the first thing you learn, or that helps to learn, is how to receive feedback. And when you learn how to receive it, then you know how to properly give feedback in a constructive way. This is the same thing: when we pay attention to our experiences, that starts to build up empathy muscles, and starts to give us practical ways in which we can create that, let's say, compassionate space for others to make mistakes, to be themselves, and also to be vulnerable. Because this is a notion that is entirely demonized in cybersecurity, for good reasons, because it's right there in the vocabulary: vulnerabilities are bad. So, people don't really feel comfortable when they feel vulnerable, when they feel like they don't know enough or don't know how to behave. I'm talking here about both categories that we're discussing: the people that we're trying to protect, but also cybersecurity specialists. When they feel like they're not managing to communicate clearly enough, that they can’t make that connection, that they can’t reach people in the way that they want to, that also puts them in a state of vulnerability. So, how do you think we could go about making personal vulnerability a good thing, a thing that actually helps connection? Because research also shows that vulnerability is actually the one thing that most helps people connect in a very genuine way, and that it really helps with self-awareness and transformation. So, how do you think we could get people in this industry to change their perspective a little bit around what it takes to be personally vulnerable?
[09:13] Emma W.: I think we need to be talking about psychological safety a lot more. It's not a term that I hear terribly often at work, and I wish I heard it a lot more because you're right, it's that sort of explicit permission to be vulnerable, and deliberately going out and making yourself vulnerable, that actually creates the space for you to build trust, for others to admit vulnerability, and for you to forge a bond where you say, “Okay, actually, we're in this together and we can try to get out of it together.” And partly, that's what I'm doing when I admit to poor password behavior, or to the fact that it's just as possible that I will click on a phishing email as the next person: deliberately making that vulnerability happen and putting myself out there to let other people know that it's safe to admit their vulnerability as well. I do suspect a big part of where we are in security is that it hasn't been safe to go, “Actually, some of this stuff is a bit rubbish for everyone.” Actually, there is no one who is invulnerable to phishing emails; there is nobody who can happily memorize 200 passwords. It's like a weird conspiracy of silence. And I think you're on the right lines in saying that it's about making space for us to be vulnerable and be safer. As for how you do this, how do we make this conversation around psychological safety happen?
[10:25] Emma W.: As in so many places, we need to be bringing security more into line with how our wider organizations operate. Because in organizations, when we're doing leadership and management development, you will hear talk about psychological safety. And leaders are encouraged to know about the importance of making psychologically safe spaces for their teams in order to help that team bond and to help it perform better. And I think it is just a case of applying it to the security context as well, to understand that psychological safety doesn't just help performance, but it helps us to be more secure by, again, creating that space for us to name problems and identify problems and admit vulnerabilities, and then help us get better. And I think it's similar with comms in security. There are so many excellent comms professionals whose skills we are not taking full advantage of in security, in terms of how we communicate threat information and how we give actionable advice to help people deal with those threats. So often, I think, all of this stuff is there; it's just a case of bringing it into security and using it well within that context. That's what we've historically been rather bad at, I think, and it needs to change.
[11:26] Andra Zaharia: It does. And I think we have such great premises now to support that change and to give it, let's say, momentum, simply because people from different disciplines are coming into cybersecurity and contributing their communication skills, their design skills, and their UX skills. There are so many people coming in from the outside with less familiarity, with fresh eyes, which I think is always such a helpful thing to have. It never fails to show the real experience of interacting with a cybersecurity product, experience, procedure, or policy. And something that also occurred to me while you were pointing to exactly how we can do this is that there's a powerful feeling, the feeling of camaraderie, the feeling of ‘we're in this together.’ It's not something that I've seen very much being intentionally cultivated in companies. But even to me, for example — and I wanted to ask if this is true from your incredible experience — even a cybersecurity incident could be an opportunity to build that camaraderie, to build that sense of ‘we're going through this together, we're learning things together.’ And I was wondering if you’ve seen this in your practice and from your experience.
[12:54] Emma W.: I think so. There is something about a shared incident that nothing else can really replicate in terms of team building. It really accelerates that process of everyone learning to rely on one another in order to deliver, to save all of us and all the people we're working for. Yeah, that's a really big thing.
[13:13] Andra Zaharia: I mean, we see this in the industry. For example, I keep going back to 2017, when the entire world woke up to a new reality. That was a tipping point for the industry as a whole, but also for, let's say, the public understanding of what cyber attacks can actually do. And obviously, it's been increasingly difficult and challenging ever since. But I feel like these are some opportunities that we can find and leverage to create that positive association and that sense that we can rely on one another, because I think that's where the trust relationship really starts to blossom and to have very practical effects on everybody's work. So, in terms of normalizing imperfection: there's a lot of nuance in this industry, where people want to put things into boxes and declare them safe, untouchable, accessible, or whatever. We want clear labels, but we can’t always provide those labels. What kind of abilities have you seen people exhibit when they have to manage all of these nuances and all of these tricky situations of internal politics and every other expectation that people have of cybersecurity specialists?
[14:52] Emma W.: Gosh! I think it's really difficult. It's really, really hard when you're in an incident and you're up against the wall, particularly if people are looking to you for leadership; that's, personally, a really challenging space to be in. A lot of that difficulty is down to the inherent complexity of cybersecurity, the complexity of any incident that you might face, and the knowledge of how fast things can and do change. You've always got to be prepared for what is around the corner, regardless of the fact that you have no idea what that may be. There's a lot of power, I think, in being the person who can simplify the narrative, who can take in all of the complexity, or a lot of the complexity, and still manage to cut through to ‘this is the situation, this is where we are, this is what we need to do next.’ If you can do that, then people around you will immediately start feeling safer because there's a plan and there's a leader. And the plan will change, because plans always change, but the important thing is that you make people feel like we have a direction to go in. The next thing that comes to mind (and I think this is difficult during an incident, when everything is very pressured, but it works in a longer-term way) is that it helps to build resilience in the organization by letting people know that it's okay that we don't always know what's around the corner. I think a lot of traditional approaches to leadership rely on leading from the front, being the person who knows where we're going and who's going to get us there. And that may have worked in the past, to greater or lesser extents, but I don't think it works nearly so well now, again, because of the increasing complexity of cybersecurity, the threat environment, and the general world in which we work.
[16:28] Emma W.: I think a lot of it is around helping more people to become comfortable with ambiguity, and comfortable with the idea that we don't know where we're going but we can still take sensible steps according to our assessment of the immediate risks in front of us, our assessment of the probabilities of what is going to happen a few days down the line, and how we talk about this. And again, psychological safety is a huge part of that, I think, as are equality, diversity, and inclusion, so that we have all of the right perspectives in the room and we allow all of those perspectives to be heard. Because often it can be just one little voice in the corner going, “Guys, am I the only person seeing this?” And they may be the only person seeing that. And if they don't speak up about what they're seeing, then, you know, we'll all be a lot worse off than we were. I've seen leaders get really uncomfortable in the past at the idea that they cannot provide guidance to their team more than a couple of steps ahead. And I think, for them: really, it's okay. No one else in the situation would be doing any better at foretelling the future unless they have an actual crystal ball. It is okay, and it is not your job to tell the future; it's your job to help your team get to the next step, and then the next step, and then the next step, in light of the information that is emerging.
[17:35] Andra Zaharia: That is a really powerful example, especially because it touches on the changing paradigm. We're changing the culture around leadership; it's changing before our eyes. We can see that there's no planning far ahead, that the environment is a lot less predictable than it was a decade ago (it's incomparable), and that there's no perfect model to get through this. I think that we as humans obviously need a certain degree of certainty, and we've gotten used to a society in which we can plan ahead a lot. I remember reading about this a couple of weeks ago: we have this expectation of predictability because we can plan our routes, because we have maps, because we know exactly where we're going, how long it's going to take to get there, what the weather is going to be like tomorrow, and all of these other things. So, we have the same expectation of other things in our lives, which are a lot more complex and a lot less predictable. So, this changing story and this changing culture are such a great influence, because cybersecurity has so many great thinkers, people who are rebels, people who are incredibly creative, whose culture is rooted in challenging the rules and in understanding how systems work, deconstructing them, putting them back together, improving them. So, I feel like there are a lot of elements in cybersecurity that can help people make even more of their innate abilities and the skills that they've built over time. And something that I wanted to touch on in this area is the alignment between your personal set of values and the work that you do in this field, because, from your example, I can see that you're the kind of person who is very much aligned with her work: you act out your values. So, I was wondering how things evolved for you in your career such that you ended up in this place where you can build on these values and use them to guide others and help them do the same.
[20:03] Emma W.: That is an interesting question. I've had the same working values, I think, for probably my whole career. I've worked in the public sector all my life because it's a very strong driver for me just to try and make the world a better place. That's the bottom line; it's the thing that gets me out of bed every day to keep doing the work that I do. And I've been lucky to have quite a varied career in public service where, I think, every single role I've ever done has met that sort of ultimate value, that ultimate need for me at work. I've been working in the exact field that I'm in, in cybersecurity, for about the last 10 years. And that is by far the longest that I've ever stuck at anything in my life. I can't, at the moment, imagine wanting to move away from it, but never say never; things happen. And I think the reason is that I'm able to bring a perspective to cybersecurity which still isn't terribly common, which is more of that naive user perspective, frankly. I came into my role in, of course, a department stuffed with amazing technical experts (if I throw a stone from here, I'll hit ten of them) and really wonderful people, generous with their time and skills. They have technical skills that I will never have. And to be honest, I don't really want to have them; they're not my thing, and I don't get excited by the things they do. It's great that they do, and I'm really, really grateful for their expertise, but it's just not me; that's never going to be me. Partly, I'd say, I bring that naive user perspective of “Well, okay, so you're doing this very, very clever thing. It's got an audience over here who does not share your perspective; how do we make it work for that audience?” And it's a question that’s quite hard to ask in the abstract, but it's easier to have that conversation when there's someone in front of you, i.e. me, who is representing that perspective and saying, “Look, I'm an expert in my field, I have a seat at this table. But the perspective that I bring is more along the lines of someone who doesn't understand a word that you just said. Make it make sense to me. Because if you can make it make sense to me, then we have half a chance of making it make sense for the people out there.”
[21:57] Emma W.: So, I think it's partly that sort of perspective: the audience perspective, the customer perspective, that experts always kind of need to keep hearing. Beyond that, there's all of the reading, learning, and research that I've done in the socio-technical areas of cybersecurity, particularly psychology and the social sciences. I'm not an expert in any of these things by any means, but I have done enough around them to know what I'm talking about, basically, for the past 10 years. So, it's never just my voice in the room. It always starts with “Well, this is my perspective, this is how I see things, but I can back it up.” I can say, actually, there is research over here that points in this direction. There's research there that shows that not everyone feels the way that I feel about this, but here's the range of ways that people feel. So, here's the whole range of things that we need to consider. And honestly, I think that's kind of it, that's my shtick, that's what I've always done and what I really enjoy doing. I think the other part of it is getting fired up sometimes about things that just aren't fair in cybersecurity. And again, I mostly feel aggrieved on the part of the user, the person who is easily written off as the weakest link in ways that are completely unfair; written off because they don't have the supreme cybersecurity skills that the experts have; written off because they don't come to work to do security, they just want to do their jobs. And you say this in a meeting and people start shifting in their chairs; it's almost an unsayable thing sometimes. We're talking about wanting people to put security first. Why would they put security first? What has security ever done for them, from their perspective?
And again, you can ask these challenging questions, and people go, “Oh, yeah, actually, this is the perspective that we need to think about.” We need to think about selling security in ways that make it attractive, not just saying to people, “Well, you should care about security. And if you don't care about security enough to come into work every morning and spend an hour updating your device and refreshing all your passwords and doing all the things, well, then you're a bad person,” as if that's how it has to be. It's so unfair, the way that we sometimes treat people in security, the fact that we fail to consider their needs, their perspectives, and their capabilities, and then tell them it's their fault. And I'm like, “Come on, I'm just not having that. I'm not having it. I'm here to make all of that better.” So, in terms of values, I think it's everything I just said: wanting to make the world a better place, wanting to bring diverse perspectives into security where they are needed, and wanting to stand up and give a voice to a set of people who don't often have the voice that they should when decisions are being made: the user, for want of a better term, even though I'm not particularly fond of that term.
[24:34] Andra Zaharia: I know what you mean. I try to find ways around it all the time as well. Thank you for sharing your story, and thank you for pinpointing exactly those moments, those points of friction that still create a lot of resistance between cybersecurity specialists and the people that they serve. Those particular areas are exactly where you're bringing in your expertise and your perspective, and where you're getting cybersecurity specialists to build that empathy muscle and to see and understand the bigger context around their customers, so to speak. I feel like those points of friction are very tied to the stories that cybersecurity specialists tell about themselves and the identity that they're part of: the identity of a protector, the identity built on technical prowess. I think that's the element that's part of the story.
[25:35] Emma W.: Like a gatekeeper role sometimes: the protector, the knight who is bravely protecting the castle and all the people inside. That's a bit like their mentality sometimes, I find.
[25:49] Andra Zaharia: It's not bad at all, but it does create a distance between them and the people that they serve, which serves no one in the end. It's not helpful for anyone. So, I was wondering if you might be able to surface some of the elements of this identity that is slowly fading away, because you have contact with so many specialists from this industry, who serve in so many different roles, bring change into the organizations that they work with, and contribute to shaping the culture in those organizations. So, I was wondering if you could provide some examples of what these people look like and how they speak, so that listeners can really picture and see these opportunities, which are, at the end of the day, growth opportunities, or, let's say, opportunities for personal and professional development for them.
[26:50] Emma W.: This is such necessary work, this work of bringing a more diverse set of perspectives into cybersecurity, because nearly always, cybersecurity is seen as a highly technical profession. And in every meeting I think I've ever been in, regardless of how technical it is, there's an immediate acknowledgment that the difficult bit isn't the tech; the difficult bit is the people work, the process work, and the business work that goes around it. And I do think that gets lost, and I wonder quite why, because it's entirely well acknowledged by everybody. In terms of which kinds of diverse perspectives we bring into the room, which we rely on to succeed: well, it's pretty much everyone you can imagine. I think some of the people I've most enjoyed working with are the comms professionals, from whom I have learned so much about how to connect with audiences and how to bring audience perspectives into the work. We talked earlier about the expert problem. The expert problem can only be solved collaboratively; it is impossible to solve it by yourself, by definition. The only perspective you can ever bring to your work is your own perspective.
[27:50] Emma W.: I think of the tech teams, the comms teams, and the user researchers that I've been able to work with here, who have amazing systems, processes, and methods for bringing other people's diverse perspectives into the room and getting those perspectives considered in an appropriate way, a generous way, a kind and respectful way. I think that tone is perhaps one thing that I would pick out as often different. It's never ‘Oh, my God, that person over there did something I don't understand. What's wrong with them?’ It's always ‘Oh, that person over there did something I did not understand and that I would not do. Why did they do that?’ It's a tone of positive curiosity rather than frustration and distance. So, coming back to the term ‘user’: I always fall into using it every so often, but I think we all do. I don't like it, because it's essentially othering; it creates division between us over here and those people over there. And that othering is dehumanizing. It allows you to more easily write off that person over there for doing the thing that you don't understand, or doing something in a way that you wouldn't have done it, or even using a word that you wouldn’t have used. So, yeah, I hate it. We need to bring more people into the tent, so to speak. Remember, we're all on the same side and we're around the same table: the comms professionals, the user researchers, and, I don't quite know how to say this, the people who know how to get stuff done in organizations, the people who have the ability to work outside silos and to bring people together to achieve common goals. In the government, where I work at the moment, it's really complicated. It's very complicated getting stuff done, bringing people together in that way, getting money to spend on the right projects and funding it the right way.
Knowing how to spend that money, jumping through the hoops and the governance that you rightfully have to go through in order to be able to spend public money, is a whole set of skills in itself. And those people are just as much cybersecurity professionals as any of the more technical people, or as I am. I admire them on a daily basis because I know I couldn't do what they do, and I couldn't achieve any effect without the people in those enabling roles.
[29:55] Andra Zaharia: I was just thinking about how fascinating it must be to work in cybersecurity in a government institution, where there's so much responsibility. Not that there isn't the same level of responsibility in private organizations, but the expectations of governments are even higher: they have to protect critical infrastructure, they have to deal not only with legacy technology but also make sure they're at the forefront of threats and state-sponsored attacks, and all kinds of very, very difficult things that touch on politics, diplomacy, and so much else. Cybersecurity is no longer a technical issue. Well, it hasn't been just a technical issue for a while; it's a political issue, a geopolitical issue, and it has so many implications that it can be overwhelming for everyone. So I was wondering: in a space with so much complexity, how do you practice self-empathy? How do you practice self-compassion? What keeps you sane and healthy in an environment that is so demanding? Because it is. It is emotionally demanding, intellectually demanding, and sometimes even physically demanding.
[31:16] Emma W.: Huge assumption there that I'm sane and healthy, by the way. It's difficult. I've worked in a number of different kinds of operational environments, with different kinds of things at stake, sometimes the most extreme stakes. And you're right, it's terribly, terribly hard to feel like you can ever put work down and walk away, even at the expense of taking care of yourself. It's something I think we've got a lot better at over the years. And again, I think the role of leadership there has been really, really key. We've focused a lot on bringing on leaders who... no, that's perhaps not fair. I think leaders always thought about people first, but maybe they kept it to themselves a little bit. Now we have leaders who practice that care of their people right out loud, and at the most demanding moments. It's always been the case that during crises at work, you will see leaders walking the floors, doing that management-by-walking-about thing: checking on everyone, having conversations, picking up the mood, seeing if everyone is okay, and ordering a pizza — the pizza is a huge part of the self-care. I don't think it's wrong to bring in ideas from other realms that just aren't traditionally known about in cybersecurity, because we have to practice self-care in all areas of our lives and recognize good ways to do that. And I think a lot of those good ways are just the same: trying to keep your working hours reasonable, recognizing that if you work too long, even on the most crucial incident, you're going to get much, much less effective with every hour that passes, and you need to be able to step away and pass the baton to someone else, go home, have a sleep.
[32:46] Emma W.: And again, there's the role of leadership in structuring working environments, structuring team rotas, and so on. Leading from the front and saying, “It really is okay to go home for a bit and look after yourself.” In the longer term, or in less pressured circumstances, the biggest piece of self-care I do is to get out and talk to other people, really. And this has, I think, become rather pronounced over the past couple of years, during the pandemic period particularly, when we were mostly stuck in our homes and running the world over Teams. It was very, very hard to get that same degree of connection, and we had to work quite hard and quite explicitly on achieving it. These days, of course, we're doing a bit more hybrid working, and I can largely choose where I work, at home or at the office. So I make it a priority to come into the office at least once a week just to talk to people, literally to talk to people with no particular agenda, just to have a shared experience and shared concerns, and to be reminded that you're not the only person in the world feeling the way you do at that particular moment. Sometimes we have a bit of a moan together and are a bit negative, and then laugh and go, “Oh, well, never mind,” and carry on. Other times, more formally, it's about recognizing the frustrations and asking, “Well, how do we move forward? What can we actually do here now?” In cybersecurity, we talk a lot about encouraging self-efficacy and empowerment, and not letting people take on board threat information to the point where they feel paralyzed and powerless to help themselves. And I think that applies to us as well. Often the challenge is: what can we ourselves do, here and now, to make things better? Even when we think the problem is those people over there, what can we start to do to make things better?
And again, that, I think, is how you encourage people to feel happier about things: by reminding them that they do have a certain amount of control over their environment. It might feel quite limited, but whatever control you have, you can use it to the max, and make sure you do.
[34:44] Andra Zaharia: That's beautifully articulated. Plus that sense of: there's someone else out there who has my back, someone I can count on, someone I can talk to about the fact that this is too much for me, that I need to slow down a little bit, that I need help, whatever it is. I feel like there's a lot of cultivating individual success, high-growth career paths, relentless learning, and certifications; there's an intensity to building your career in cybersecurity. But that comes at a huge cost. In conversations on Twitter and other places where the information security industry congregates, we've seen people complain about burnout, and we've seen people go through a lot. Obviously, this is not specific to cybersecurity, but there is an added intensity and complexity to it. Being able to talk about this in public shows that there are other approaches to building this path: taking your own journey at your own pace, having periods where you put in more work and others where you rest more, and normalizing that. I think this is one of the ways we can contribute as an industry to company culture. Just like you mentioned, leadership does make a huge difference. And something I wanted to point out here is that I've seen CEOs, managers, and leaders (not necessarily all of these roles in the same person) with a highly technical background who are more empathetic, closer, and more supportive of their teams than some people who come from a more social background. So I think this is a space where we can break down stereotypes, where we should not walk in with assumptions but rather with that healthy curiosity you mentioned earlier, which is so much more rewarding and eases the tension in the body, in the mind, and in the room as well.
[37:08] Emma W.: That's right. Let's start off with the assumption that we're all on the same team, and let's be interested and curious about what other people bring. Even when we see people doing stuff we don't understand, that doesn't make sense at first, let's approach that again with empathy and curiosity because there must be some reason why they did the thing. If it doesn't make sense to us, all that means is we don't know what the reason is. We're not seeing that reason from where we're standing. So let's get over to where they're standing and see why they did it. Because I guarantee when you understand it, as they would explain it, it makes more sense.
[37:40] Andra Zaharia: One of the questions I recently heard in a talk someone gave is that you can go and ask people, “Show me how you do it. Just show me how you do it. Walk me through it.” It's such a simple question. These kinds of questions really help when you have them handy to go back to, to activate that: “You lead the way. You show me what things look like for you, in your own context, exactly where you're sitting, and I’ll follow along and try to learn from that.”
[38:13] Emma W.: Exactly. And you ask people that and you see them come alive, because you have shown interest, you've shown that you respect what they do, and you want to understand it. You're coming at it from a position of, say, ignorance and curiosity, and you're relying on them to lead you through this space. And again, some of the least successful security comms I've ever seen have come from that perspective of “Well, we're security, and it's our job to tell you what to do, and it's your job to listen and do exactly as we say.” I mean, where's the relationship there? Where's the trust? Where's the feeling? If there's just an assumption that it's the job of the people on the far end of your process to uncritically put into practice whatever advice you choose to give them, from a central perspective where you don't understand the local context they're working in, the pressures they're under, or all the minutiae and complexity that drive their daily decision-making, then, of course, your comms are going to fail. Admittedly, on the flip side, mass comms is a very hard challenge. By definition, it's impossible to put out central comms across a very, very broad customer set that perfectly apply to everybody; it's not really possible beyond the very highest-level stuff. So it's a challenge, but you can do so much with the tone, the approach, and how you break things down for people. And, primarily, I think it's about how you seek feedback: understanding how your words land and then tweaking slightly what you do next time in response.
[39:42] Emma W.: Occasionally, I’ve seen that almost cited as a sign of weakness: “Well, why would we care what they think? It's our job to tell them what to do. We’re the authority. They need to listen to us. We're supposed to have all the answers.” And that's just another flavor of the kind of toxic or outdated leadership model I talked about earlier, where people think that because they're the leader, they should have all the answers, and that if they don't, other people won't have any faith in them. In fact, that's just not the way the world works anymore. A leader's job, I think, more than anything, is to show the way in general terms, to employ the frameworks, to model the behaviors, and otherwise to create a safe space for us to discuss and explore things together. It's almost more of a facilitation role, I think, than anything else. I was going to say this earlier when we were talking about incident leadership: it's very easy and natural for leaders to devolve back to more directive leadership at those times, because when you're feeling the pressure, you feel the need to be definite and directive. And it often works quite well in the short term, because that's what people are expecting of you. When people are feeling fearful and uncertain and don't know what to do, it's great just to be told what to do; you can go and do the thing and you haven't got to worry any further. So it can work in a limited way in the short term, but I think all wise leaders will be very conscious of that dynamic. They may be using it consciously, but aware that it needs to exist in that wider context of “All right, we’ll do what we have to do right now to put the fire out. But after that, we're going to take a breath and do things a bit differently.” When we're through the incident, and when we're looking back at what caused it, and even “How did we deal with it?
And was that the best possible way?” You want to bring a different dynamic to that kind of leadership: more curious, more open, more accepting, and more willing to hear all the perspectives in the room, rather than “That fire is burning; we're just going to put it out.”
[41:28] Andra Zaharia: It's beautiful to know that there are increasingly more people who practice this kind of leadership, who really understand, let’s say, the servant leadership model and manage to apply it. And I feel that the generations coming into the workplace — Gen Z specifically — have the potential to reshape the culture in this sense, because their expectations of leadership are exactly the ones you described. They no longer blindly and unequivocally accept what's thrown at them; they question everything, they want to know why, and they are led by their need for meaning and for different models. I strongly believe they're bringing on a change that is irreversible, that they're going to really shake up the way we do things and slowly push away these outdated models that stand in the way of change, in the way of getting meaningful work done: work that helps others and that helps the people who do it as well, because that's ideally where we want to land. Especially when you're doing hard, complex, really challenging work day in and day out, you need those wins, you need that human connection to keep you going.
[42:47] Emma W.: It's fascinating times. That kind of working model you described can feel a bit scary at times; it can certainly be really, really disruptive and unsettling to people. I think the biggest trend I've seen is an expectation that the workspace will be reshaped to suit the person who is asking: “Well, why not? Why can't I have things the way I want them?” And other people can sometimes look at that and go, “No, it's your job to fit into what is already happening.” There has to be a tension between those perspectives, basically, because you’re right: if people never come along, ask questions, and demand for things to be different, then nothing will ever change in the ways it needs to change. But equally, this is a collective endeavor, and we will have to find some way to work together that suits everybody. And I think there's a notion of Chesterton's fence. Have you heard of Chesterton's fence?
[43:35] Andra Zaharia: No, I haven't.
[43:36] Emma W.: Oh, I love it. I heard it from my friend, Rob, and now I repeat it at every opportunity. It's from the author G. K. Chesterton, who wrote that sometimes, if you're walking in a field or a forest, you may see a fence. Your tendency might be to think, “Well, I don't see why that fence is there; therefore, I'm going to take it away.” The notion of Chesterton's fence is that you should never take away a fence you don't see the need for until you've found out why it was put there in the first place. Sometimes the fence is very necessary; you just don't immediately see why. So, obviously, the analogy is that before you change working practices or upend things to any significant degree, you need to understand why they were being done the way they are in the first place. It doesn't mean that change is wrong; it just means you can very, very easily put your foot in it if you don't understand exactly why that fence was there and why it is still there.
[44:26] Andra Zaharia: That is absolutely true. The power of context will probably never become less important in any conversation, ever. And I think context is, let's say, the red thread that connected all the key topics we touched on today. I really enjoyed this conversation. It's given me so many insights to start with. It opened so many doors, so many pathways to explore, in terms of personal development and just understanding how cybersecurity people work, what opportunities there are to contribute to bringing in positive change, reshaping the language, strengthening those relationships, and creating an atmosphere of trust. And, again, shifting cybersecurity from something that feels imposed and constricting to something that is helpful, dependable, and just an ally for people.
[45:23] Emma W.: Yeah, and accessible: something that is not distant, scary, and surrounded by gatekeepers using technical terms, terminology, systems, and processes that you don't understand, but something that everyone should have a reasonable expectation of being able to understand, grip, and use to do the things they want to do. It shouldn't be as scary as it is.
[45:44] Andra Zaharia: It definitely shouldn't, and we're very lucky to have people like you leading the way and showing that this is possible, and actually contributing to that change. That gives me so much hope and so much enthusiasm that the momentum of this conversation is just rippling across the industry and beyond it as well because I really think we need that. So, thank you so much for sharing all of this and for being here, and just for doing the work that you do.
[46:13] Emma W.: Cool. Thank you very much for having me. It's great to talk to you and it's great to feel a part of, as you say, that wider community that is making change and helping to build great things for the future.