Taking a day off work, I’m thinking about how work gets structured. Use standards such as the CIS Critical Security Controls, NIST SP 800-53B, and the National Initiative for Cybersecurity Education (NICE). Define what the team will do and, just as important, what the team will not do.
The common belief in CyberSecurity is that end-users want security that’s all but invisible. But studies show a surprising fact: people want to be involved and want to put in some effort. Let’s take a closer look at the IKEA Effect and Effort Justification cognitive biases and see if we can’t piece together what’s going on.
With apologies to Yves Saint Laurent, who once said that fashion fades but style is eternal: while we’re all caught up in what’s changing, it’s imperative to look at the aspects of CyberSecurity that are, if not eternal, certainly long-lived.
People are stressed about their jobs, their health, their families, and their friends. Meanwhile, people are working from home with weakened security. The two make a dangerous combination.
Charlotte Perriand was inspired by the American cowboy, stretched out, feet up, lounging after a long hard day’s work. This inspiration carried over into the LC4 Chaise Longue chair. Perriand was also a bit punk, and would fit in well with today’s hacker and maker community. “Perriand embodied l’esprit nouveau. She was often pictured wearing a homemade ball-bearing necklace, giving her the look of a lithe component plucked from a finely tuned machine.” Her impressive career stretched decades and focused mainly on architecture. But back to the LC4 Chaise Longue, designed early in her career while with Le Corbusier. More specifically, back to the inspiring metaphor.
Technology advances at the speed at which new metaphors are identified, shared, adopted, and absorbed. Metaphors make the new feel familiar. Metaphors provide the language and mental models for discussing and thinking. Our minds love easy-to-recall, easy-to-consider ideas, and so these ideas are more readily adopted. But then a curious thing happens. The more we learn and play with the idea, the less we need the metaphor, and eventually the metaphor fades away altogether. This is the point where a new set of innovations and ideas emerges, along with a new set of metaphors, and the cycle repeats.
Around 1930, Perriand applies the metaphor of the lounging cowboy to the LC4 Chaise Longue. Twenty years later, around 1950, Børge Mogensen applies the metaphor of Perriand’s chair to Mogensen’s Hunting Chair. And twenty years after that, we have lawn furniture inspired by Mogensen and Perriand. Nearly a hundred years later, none of us look at deck furniture on a cruise ship and see a cowboy. We don’t need to. Culture has absorbed the metaphor.
The same pattern happens in IT, albeit at a much faster pace, leading to three considerations for designing security capabilities. First, cultivate a garden of metaphors. We need inspiration to innovate and, perhaps more importantly, we need to inspire our organizations. Second, don’t move security along faster than the metaphor. Organizations need time to adopt and absorb our metaphors. Go too fast, skip metaphors along the way, and we’ll lose people, which will hinder or even stop the organization from adopting our security practice. Beware the curse of knowledge. Finally, increment the metaphors while incrementing the design. Think in stages.
From the castle to the perimeter firewall, from the perimeter to network segmentation, from network segmentation to micro-segmentation, take it one comparison at a time.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
Sheffield has leaked its automatic number-plate recognition (ANPR) dashboard, exposing where cars went and when. This raises the topic of data toxicity: combining data with new data, or using combined data in new ways, can create dangerous data. Here are three things you can do about toxic data.
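To make the idea concrete, here is a minimal sketch of how two individually harmless datasets become toxic when joined. All records, names, and plates below are invented for illustration; this is not the actual Sheffield data.

```python
# Hypothetical illustration of data toxicity: camera sightings and plate
# registrations are each fairly benign on their own. Joined together,
# they answer "where was this person, and when?"

anpr_sightings = [  # plate, camera location, timestamp
    {"plate": "YS61 ABC", "camera": "Tinsley Viaduct", "time": "2020-07-01 08:15"},
    {"plate": "YS61 ABC", "camera": "Hallamshire Hospital", "time": "2020-07-01 09:02"},
    {"plate": "KK12 XYZ", "camera": "Meadowhall", "time": "2020-07-01 08:40"},
]

plate_registrations = [  # plate-to-owner lookup
    {"plate": "YS61 ABC", "owner": "A. Example"},
    {"plate": "KK12 XYZ", "owner": "B. Sample"},
]

def movements_of(owner_name, sightings, registrations):
    """Join the two datasets: map an owner to every place their car was seen."""
    plates = {r["plate"] for r in registrations if r["owner"] == owner_name}
    return [(s["camera"], s["time"]) for s in sightings if s["plate"] in plates]

# The combined view is what makes the data toxic: neither dataset
# alone reveals a person's movements, but the join does.
print(movements_of("A. Example", anpr_sightings, plate_registrations))
```

Neither dataset needed to change for the risk to appear; the new use of the combination created it. That is why toxicity is hard to spot in a data inventory that reviews datasets one at a time.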
On a recent webinar, an attendee asked how we should talk to our end-users about passwordless authentication. My answer: don’t.
Look to Doug Dietz to understand why. Dietz is the principal design thinker at GE Healthcare. The book Creative Confidence featured his work on MRIs for children. Originally, the MRI was a technologist’s technology. This meant it scared the kids, often to the point of them needing sedation. Dietz realized this and redesigned the MRI as an experience attractive to kids. The key insight was empathy. To paraphrase Dietz’s TED talk, “Empathy at the beginning sets the heartbeat of the project. When you move forward into the iteration and prototyping and some of the design phases you go through, you need to refocus and see what the empathy was that got you started.”
We don’t talk to kids about the MRI. We talk to them about the jungle experience. We don’t talk to end-users about passwordless. We talk to them about a more enjoyable work experience.
When designing security, we start with the vision, the business capabilities, and the business outcomes. We begin with empathy and then, as Dietz put it, let empathy be the heartbeat through the design process. Don’t do this, and we end up with the equivalent of the MRI machine. That is, security which people avoid and work around. Possibly security that will have people wanting to be sedated. Good design creates security experiences that people adopt and, in rare but exciting cases, actually enjoy.
Empathy is incredibly hard. Seeing the world through someone else’s eyes always is. It is doing the hard things that elevates design.
(TEK Keynote) We turn to DevOps for speed. We turn to Cloud for flexibility. We adopt faster, leaner, more collaborative processes to drive change. And then? We turn to information security for protection. But can we secure the technology without slowing the pace? This session presents an entirely fictional development organization adopting DevOps. We will discuss which traditional software security processes work, and which ones fail entirely. Awareness training, muscle memory, and culture shifts will all be brought together. The presentation will conclude with takeaways for applying security to your DevOps team without slowing down.
Who do we need to include to successfully PoC a security product? Let’s look to Walt Disney for inspiration. Seems like a stretch, but trust me, cartoon animals are a success criterion.