Design Thinking for Cyber Security Services – Design Monday

June 18, 2021

IDEO has been at the center of many fundamental designs in computing history. This includes the simple and ubiquitous mouse.

Thought it was Apple? Think again. Steve Jobs came to a firm called Hovey-Kelley in the late seventies, a firm which would become IDEO in 1991. Jobs had a problem. The only computer mouse in existence cost 16 times what people could afford. The mouse also broke frequently and was, well, ugly. None of this would work for the Lisa and Mac.

David Kelley (of David Kelley Design, Hovey-Kelley, and later one of three founders of IDEO) assembled a team. Douglas Dayton worked on the frame. Jim Yurchenco was responsible for the mechanical design. Bill Dresselhaus, with his love of Art Deco, handled the packaging. The technology of the day was finicky and “required such precision that it probably couldn’t be mass-produced.” There were practical debates about the sound of the click, or the number of buttons. Each change required every other part to be redesigned to fit in the tiny space. But even in those early days, the firm that would become IDEO had a secret weapon.

Design thinking. IDEO refined it and popularized it. Design thinking is a way of problem solving and developing solutions that’s a departure from how we in IT have long done things. Consider the following five points of design thinking:

  1. Empathize – think about people who we’re serving (empathy is the heartbeat)
  2. Define – think about the main problem we’re trying to solve
  3. Ideate – brainstorm, mindmap, whiteboard, play
  4. Prototype – build a possible solution
  5. Test – sit down with the people and have them test the prototype

Now compare the design thinking steps to ITIL service design:

  1. Service solution – think about requirements, deadlines, costs, budgets
  2. Information systems and tools – think about the service portfolio, configuration management, capacity, and security
  3. Technology and architecture – think about designs, plans, and processes to align IT policy and strategy
  4. Design processes – think about the process model for operation and improvement
  5. Measures and metrics – think about what we’ll measure to ensure the service is working

Notice what’s missing? People. I mean, ITIL practitioners will reply, “no, no, no. We have the 4P’s: Product, People, Process, and Partner.” Fair enough. But compare the two lists. People are not the focus. And to anyone who has been in the workforce as an enterprise end-user? It shows. We can feel it. Because people designing IT and IT security don’t think much about the people who’ll use it, the people who use it don’t think much about what we’ve designed.

Case in point: credentials. Research shows that people with more technical knowledge don’t take more steps to protect their data than people with basic knowledge (User Mental Models of the Internet and Implications for Privacy and Security). Most people know they should use separate passwords for every app (91%). But most people use the same password anyway (66%). Most people know they should use MFA. But most people don’t (66%). The problem isn’t one of awareness. (Source: LastPass and Security Ledger.) In not considering how regular people use and secure technology, we’ve created a situation where people simply opt out.

Enterprise IT is like the original mice. The Xerox mouse, the one Apple copied, cost $400, or about $1,200 in 2020 US dollars. Doug Engelbart’s mouse, the one Xerox copied, required a training course, and it took six months to master the damned thing. That’s ITIL thinking. That’s the type of technology people will be aware of, but not take steps to use.

Design thinking, the focus on people and rapid prototyping, led to a mechanical mouse setup which would dominate mouse designs for the next twenty years. The original Apple mouse was $25. (Adjusted for inflation, that’s $79, which is coincidentally the price Apple charges for the optical Magic Mouse in 2020.) A child could pick up the mouse and immediately use it. Most of my generation learned in grade school. It just worked, worked well, and worked at a fraction of the cost.

In my office hangs Five Phases of Design Thinking by Maisey Design. It’s a reminder. When working on security services and specific controls, keep the focus on people.

The Apple Mouse, Photography via Wikipedia

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Swatch and the Crisis Business Case – Design Monday

May 10, 2021

Crisis creates momentum for security initiatives. When the existing ways no longer work, change becomes possible. It could be as mundane as an audit finding, or it could be as high profile as a security breach. Once it happens, once things get moving, implementing security controls becomes much easier.

Yet crisis response isn’t conducive to thoughtful security design. The trouble is, with greater support comes greater time pressure and greater expectations.

The Quartz Crisis

A case study in turning crisis into strategy is the Swiss watchmakers ASUAG/SSIH and the Quartz Crisis. The cause of the crisis was a disruptive technology which changed the game. Namely, the quartz watch movement.

The Swiss dominated the watch market. Capturing the majority of the market share in the 1950s following World War II, the Swiss built their industry on the mechanical watch movement. The movements were a feat of mechanical engineering: beauty in a set of precise gears and levers operating in lockstep. The watch was a matter of national pride — and rightfully so.

Then came the quartz watch movements. Cheaper. Simpler. At first, serious watch makers looked down on these upstarts. But by 1970, quartz watches were disrupting the industry. The crisis hit the Swiss hard. Bankruptcies. Unemployment. By the early 1980s, two-thirds of the Swiss watch industry had evaporated.

Then, in 1983, ASUAG and SSIH merged and set a course for becoming the Swatch Group in 1998. Yep. That Swatch.


Swatch was the cool kid watch for those growing up in the 1980s. It was bright. It was colorful. Swatches were affordable without feeling cheap, Swiss prestige without feeling stodgy.

This led to a cultural shift. Before Swatch, a Swiss watch was a gift given on a major life milestone. The watch was unwrapped then tucked away, taken out for special occasions, carefully (or not so carefully) wound and set, and then placed back in the box afterwards.

With the Swatch, people used watches for everyday style. It became common for people to have a few, in different colors and patterns. Sometimes people wore several watches at once. Swatch was a form of rebellion and a form of personal expression. The eighties were a crazy time.

A decade later, it even inspired Jony Ive when designing the Apple iMac. In Ive’s biography, he says:

“We talked about companies like Swatch—companies that broke the rules—that viewed technology as a way to the consumer, not the consumer as the path to the technology.”

Crisis Response

The good thing about crisis is that it gets the ball rolling. The danger with crisis is people taking short term action without an eye towards a bigger vision.

We can tell when a security program has been built by crisis. There are many great things being done, but they don’t integrate or connect. The security products purchased in response to an event tower above the rubble like well-funded monuments in an under-funded abandoned development.

When the Quartz Crisis hit, look at what ASUAG and SSIH did not do:

  • Double-down on what they’ve always done – they developed the Swatch, not another expensive heritage watch
  • Give up their strengths – Swatches brought Swiss quality to the quartz watch market segment
  • Respond in isolation – the Swatch strategy tapped into a cultural moment, and that success was channeled into a comprehensive product strategy
  • Ignore the workforce – early profits from Swatch went to engineering talent, shoring up the profession

Sure, the Swatch was more engaging, was simpler, was more affordable, was more fun. But what made Swatch a successful response to the crisis was that the watch was tied to an overall strategy. It wasn’t an isolated action. The Swatch was a well-built cog in the ASUAG/SSIH mechanism.

The Crisis Business Case

When the business case is to address the crisis, build a Swatch.

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Look back to future plans – Design Monday

May 3, 2021

The future won’t follow the futurists. Never does. This is problematic when designing security for the future. We know this. That might be why security presentations about the future often ask the tough questions. Where’s our jetpack? Where’s our flying car? Where’s our round house above the clouds?

Let’s answer one of these. Right here. Right now.

The House of the Future

Futuro was the closest we came to the round houses of science fiction. Designed in 1965 by Matti Suuronen as a ski cabin, the sleek Futuro resembles a plastic flying saucer. The owner of the ski cabin loved it. Absolutely loved it. Fresh from this success, Suuronen turned towards mass production of the house. The resulting retail price was less than half the cost of American homes at the time. When mankind landed on the moon in 1969, there was a wave of interest in space. Everything space-themed was suddenly trendy. Futuro was affordable, stylish, and well-timed.

And then? After about a hundred Futuro pods were produced, the entire line was shuttered.

When trying to predict the future, security leaders often look to peers. Who has done a similar project before? What worked well, and where did they stumble? It’s a valuable source of insights for project planning and project risk management. There’s another area, however, that’s often overlooked. What happened here, in our organization, in our past?

The Futuro story offers a few lessons:

  • The house landed too far outside of Most Advanced, Yet Acceptable (MAYA). Remember the first house, the ski cabin? The locals held public protests. Futuro was too different to be acceptable.
  • Like Wallace Neff’s concrete bubble houses, like Buckminster Fuller’s aluminum Dymaxion house, the round Futuro didn’t offer a comfortable living experience. Round houses are a struggle when so much is optimized for the rectangular. Futuro wasn’t human-centric.
  • The economics didn’t work out as planned. Yes, the house was half the cost of a typical American house. But it was a third of the size. Moreover, there were many unexpected costs in delivery and installation. The oil crisis in 1973 dealt a final blow as raw material costs skyrocketed. Futuro’s initial total cost of ownership was unaffordable.

Looking Back on the Future

I visited my first Futuro during a trip in Europe. The Futuro offered a look back at a more optimistic time. A time when jetpacks and flying cars were within reach. I keenly felt the gulf between the future we predicted and the future we lived.

We gap assess all the time in security. What’s the compliance standard for this IT environment, and where do we fall short? Given the reference model for this security capability, how do we measure up? What’s the gap between our approach and our peers in the industry? But rarely do we look inward, look backward, look at the gap between our expectation and our execution.

Find your organization’s Futuro, those projects with great promise which fizzled. Look there for lessons to apply to your next security project.


Futuro House in Carlisle, Ohio, with DMC DeLorean. Photography by Jeremy Popp.

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Pilot with security chaos engineering – Design Monday

April 26, 2021

No security capability operates as intended. Even with perfect data, perfect planning, and perfect foresight? Small differences between our assumptions and reality quickly add up to unpredictable situations. Security faces the proverbial butterfly flapping its wings in Brazil producing a tornado in the United States.

The butterfly effect was coined by Edward Lorenz, a meteorologist and father of chaos theory. It all started when the limitations of computing led to limitations in forecasting. It’s a pattern that still plays out today, leading some to point to the need for chaos engineering.

Edward Lorenz was working on one of the first desktop computers: the Royal McBee LGP-30. Desktop in the sense that the computer was, in fact, the size of a desk. It also cost nearly half a million dollars, in today’s US currency. We’re talking state-of-the-art vacuum tube technology. A teletype machine, the Friden Flexowriter, provided both input and output. It printed at a glacial ten characters per second.

These constraints of his machine inspired Edward Lorenz. But I’m getting ahead of myself.

So there Lorenz was, modeling the weather. To save memory, as he ran the calculations, he printed the results to charts. At ten characters a second this was tedious. To save time, he printed to three decimal places.

The LGP-30 would hum and pop while it calculated a value to six decimal places. The Flexowriter would bang and punch out the result to three decimal places. Calculate 0.573547 and print 0.574. Again and again, line by line, while Lorenz waited.

This shouldn’t have been a big deal. The differences between the calculated results and printed values were quite small. But when Lorenz retyped the numbers and reran the models, he noticed something extraordinary. Weather on the original chart and the new chart would track for a day or two. But pretty soon, they’d differ widely, unexpectedly. What was once a calm day suddenly turned into a tornado. All due to the tiny differences in the source data. Edward Lorenz had discovered chaos theory.
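Lorenz’s accident is easy to reproduce on a laptop. The sketch below uses the logistic map as a stand-in for his weather model (an assumption for illustration; his actual equations were more involved). It compares a trajectory started from a six-decimal value against one started from its three-decimal printout:

```python
# A toy reproduction of Lorenz's rounding accident. The logistic map
# stands in for his weather model; the point is the same: trajectories
# from inputs that differ only past the third decimal place soon
# diverge completely.

def logistic(x, r=3.9):
    """One step of the chaotic logistic map."""
    return r * x * (1 - x)

def trajectory(x0, steps):
    """Iterate the map, keeping every intermediate value."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1]))
    return xs

full    = trajectory(0.573547, 50)  # the "calculated" value
rounded = trajectory(0.574,    50)  # the "printed" value, 3 decimals

diffs = [abs(a - b) for a, b in zip(full, rounded)]
print(f"difference at step 2: {diffs[2]:.6f}")   # still tiny
print(f"largest difference:   {max(diffs):.3f}")  # the runs no longer agree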

“Complexity. It’s extremely difficult to predict all the outcomes of one seemingly small change,” David Lavezzo of Capital One wrote in the book Security Chaos Engineering. “Measurement is hard.” And even when we have metrics, which we rarely do, these small changes compound and lead us into unforeseen territory.

You can’t just rely on the temperature numbers predicted at the beginning of the week. You have to actually go outside. See if you need a jacket. See if you should be wearing shorts. The same is true of security. We can’t rely on our long-range forecast. We need to check the reality on the ground. Regularly. From there, adapt according to our principles.

We future-proof our security architecture by choosing versatility. We design for adaptability by prioritizing principles over rules-based approaches. But when we get to implementation, we should expect that we’ve missed something. Expect people and applications and devices and butterflies have behaved in ways that are a few decimal places further than we had considered.

We need some applied chaos to test and harden our implementation. The emerging domain of security chaos engineering is providing some useful techniques. Inject some evidence. Change some settings. Run some exploits. Validate that the security controls continue to operate. Security chaos engineering provides a way to explore the unexpected.
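The loop above can be sketched in a few lines of Python. Everything here is invented for illustration (the settings, the drift detector, the function names) and drawn from no real tool; the shape of the experiment is what matters: inject a change, then validate that the control catches it.

```python
# A toy security chaos experiment. All names here are hypothetical --
# a sketch of the inject-then-validate loop, not a real harness.

class Environment:
    """A toy environment: security settings we expect to hold."""
    def __init__(self):
        self.settings = {
            "mfa_enforced": True,
            "egress_filtered": True,
            "logging_enabled": True,
        }

    def inject(self, setting):
        """The chaos step: flip one security setting off."""
        self.settings[setting] = False

def detect_drift(env):
    """The control under test: report any setting that drifted."""
    return [name for name, enabled in env.settings.items() if not enabled]

def run_experiment(setting):
    """Inject one failure, then check the control caught it."""
    env = Environment()
    env.inject(setting)
    return setting in detect_drift(env)

for setting in ["mfa_enforced", "egress_filtered", "logging_enabled"]:
    outcome = "caught" if run_experiment(setting) else "MISSED -- investigate"
    print(f"{setting}: {outcome}")
```

A real harness would change live (or staged) systems rather than an in-memory dictionary, but the discipline is identical: every injected failure the detector misses is a finding.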

But ultimately, the take-away from Edward Lorenz is one of humility. We simply don’t know what will come. With the data we have, we can’t predict what will happen. Decades of advances in computing since the Royal McBee LGP-30 haven’t changed this equation. When implementing security, pilot with chaos to prepare for the unforeseen.

Royal McBee LGP-30 replica by Jürgen Müller

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

We got it wrong! – Great Lakes Security Conference

April 20, 2021

This session is on all the things we all say all the time, about all the things we all know. Security through obscurity is bad. Defense in depth is good. Stop clicking things. Next generation is bad, or maybe, next generation is good. The list goes on and on. The resulting rules of thumb are sometimes contradictory and often misleading. With war stories and anecdotes, we’ll explore what happens when teams run security by tribal knowledge instead of research and reason. Spoiler alert: they get pwned. Turns out, we were wrong.

Presented for Great Lakes Security Conference (GLSC) 2021.

Watch more videos on my YouTube channel.

Change the game – Design Monday

April 19, 2021

Cyber security can be thought of as a game. Offense and defense. A set of motions and movements to score points, or to prevent the other team from scoring. Red team and blue team. A series of tactics and techniques to break in, or to detect and prevent such action. This thought is a good starting point. But we shouldn’t simply work on being better at the game. We need to change it.

Take basketball. When basketball debuted at the Berlin Summer Olympics in 1936, the game looked much the same as it does today. Sure, there have been subsequent rule changes. But the ball and hoop, well, those are classic.


During the first fifteen years of basketball, no one thought beyond the basket. Peach basket, to be precise. James Naismith famously nailed a peach basket to a gymnasium wall and thus invented the game. But it was the whole basket. After points were scored, a ladder would be used to fetch the ball. Sometimes, they used a stick to push the ball out. For fifteen years.


One reason is it’s hard to see beyond things. Functional fixedness. Another reason? We’re hardwired to add rather than subtract. Given the choice between adding a fetching stick and removing the bottom of the basket, we almost always choose the stick.

This human tendency has been studied. (See: People systematically overlook subtractive changes.) There’s even a book on the topic, Subtract: The Untapped Science of Less. The book examines it from practically every domain, from science to business to medicine and more. Except cyber security. Perhaps we can make it into a future edition.

Imagine people using IT in the organization. Imagine that’s the game we’re seeking to win. Get a sense of the players and the ball using business impact analysis. Get a sense of the movement and plays using journey mapping. Now imagine ways to secure this.

Your instinct will be to add. Pause. Look around for the peach baskets which can be replaced with hoops. Find something to subtract that improves the security.

Then change the game.

Peach baskets: the basket in basketball.

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Good security is like a good coffee pot – Design Monday

April 5, 2021

Coffee. Coffee fuels hackers, coders and security wonks alike. For hackers of my generation, we tackled many a problem and brewed many a pot with a Braun. And within its hourglass shape lies a lesson for today’s security professionals.

The chief designer at Braun from 1961-1995 was Dieter Rams. He was behind the ubiquitous Braun coffeemaker from the 1980s. (I had a hand-me-down pot in my workshop in the 1990s.) Now you might think the shape was for decoration. Makes sense. One of Dieter Rams’ ten principles for good design is that good design is aesthetic. You’d be wrong.

Attractiveness for the sake of attractiveness isn’t Dieter Rams’s point. His design aesthetic was first solving the problem, and then solving the problem in a beautiful way.

The hourglass coffeemaker’s shape stemmed from a problem with the plastic. Plastic casings were still relatively new at the time. The process wasn’t producing plastic that was strong enough. The fluting provided strength and structure. As Dieter Rams wrote, “what was often misunderstood as some kind of post-modern decorative element had in fact a definite structural function.”

Applying this to cyber security: first design to meet the security requirements, then redesign using the same elements to provide a good experience.

Braun KF 157 Coffeemaker, Photography via WorthPoint.

Good Design is Aesthetic

I’m nostalgic about the Braun KF 157 coffeemaker. But I’m in love with the Braun KF 20.

The KF 20 was ahead of its time. It looked like science fiction. In the futuristic world of Alien, set in 2122, there was the Braun KF 20.

Florian Seiffert designed the coffeemaker in 1972. Following Dieter Rams’s direction and principles, every stylistic element has a functional purpose. The end result is well-designed, well-intentioned beauty.

“It is truly unpleasant and tiring to have to put up with products day in and day out that are confusing, that literally get on your nerves, and that you are unable to relate to.” Dieter Rams spoke of products like coffee pots. But he just as easily could have been describing security controls.

Good security has a design aesthetic that is relatable and understandable.

Braun KF 20 Coffeemaker, Image via Dan Gorman

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Add some nice rims – Design Monday

March 29, 2021

“Simple cars need complex wheels.”

So said automotive designer Lowie Vermeersch about the Pininfarina Nido. When you make something so incredibly simple, a bit extra makes the entire thing pop.

The equivalent of nice rims in a security capability is that one thing we do that goes just a little bit further to make the end-user happy. It’s not something we have to do. We’re going to need wheels anyway. It’s a little extra.

It’s not something that adds much to the cost of the project. A nice set of rims runs around $1,000 with the average price of a car being $40,000. But it’s something the end-user notices and appreciates far above the price tag.

The path for designing a security capability goes from complexity to simplicity, taking those steps with empathy and understanding. As we follow that path, keep an eye open. Find opportunities to spend a fraction of the budget (say 1/40th?) on one detail that pleases people.

Simple security still needs chrome.

Pininfarina Nido EV, Photography courtesy

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Contrast the status quo with the new vision – Design Monday

March 22, 2021

“I want to be Batman.” This is the greatest answer I’ve received to the interview question, “where do you see yourself in five years?” 

I hired him. Of course.

If only stopping criminals and villains was as simple as hiring superheroes. But we need equipment. We need partners and support. And before we get our batcave and police commissioner Gordon, we first need to reach people. 

Leaders excite and engage people to get things done. We use strong, clear communication that cuts through debate and doubt, and provides a solution we can agree upon. It takes strong visual and verbal communication.


One more thing about superheroes: what happened to them visually? The Golden Age and Silver Age comic books were full of bright bursts of primary colors. These days, superheroes have been drained of color. DC’s Superman’s original bright blue and bright red are so muted, they look nearly black-and-grey. Marvel has taken a similar approach. Looking at you, WandaVision. The Scarlet Witch isn’t scarlet but a dark burgundy. Modern heroes are a study in dark contrast.

Christopher Nolan’s Batman trilogy takes the blame. The films defined the noir look which has played out across all recent comic book movies. But who inspired Nolan?

Visual Contrast

The answer is Johannes Itten from the Bauhaus. That’s Bauhaus the design school, not Bauhaus the band. Its final form was in Berlin, where Ludwig Mies van der Rohe was the director. Before that, the Bauhaus was in Dessau, getting its start in Weimar in 1919. Many great names, and many great designs, trace back to this time. But in Weimar? At the start? There was Johannes Itten.

Johannes Itten taught art and color at the Bauhaus. Had a blast doing so, from what we can tell. “Play becomes joy, joy becomes work, work becomes play.”

While with the Bauhaus, Itten studied colors, establishing the fundamental categories for contrast: hue, light-dark, cold-warm, complementary, analogous, saturation, and extension. This work, specifically with contrasting seasonal color palettes, inspires painters and artists to this day. And nearly a century later, Christopher Nolan would turn to Itten’s desaturated and muted color palettes when establishing the mood of The Dark Knight Rises.

Contrast is what makes the visual beautiful.

Verbal Contrast

The communications expert Nancy Duarte studied storytelling and presentations. She looked at superhero movies, she looked at boardroom talks. “After all this study, it was a couple of years of study, I drew a shape,” Duarte recounted on the TED stage. “There is this commonplace of the status quo, and you need to contrast that with the loftiness of your idea.” 

Duarte details her contrast model and shape in her presentation, The secret structure of great talks, and in her Resonate book.

It was a pattern I followed when establishing the vision for my monitoring program. I explained the status quo of audits and manual efforts. I painted the picture of automation and visibility. I showed where we were weak, and pitched how my team could be stronger. I leaned into the contrast. In the end? I obtained the funding for the SIEM and equipped my team’s Batman.

Contrast is what makes the verbal actionable.

Sell the Vision

“The objective laws of form and color help strengthen a person’s powers and to expand his creative gifts,” Johannes Itten once said. Duarte’s research shows similar laws of form and content strengthen a person’s persuasive powers. 

Explain your vision by contrasting what is and what will be. Use this approach to gain buy-in, support, and budget. That’s how we hire Batman, and that’s how we get those wonderful toys.

A noir color study in contrast.

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.