Archive for the ‘Design’ Category

Frameworks and Relationships


“Today we welcome J Wolfgang Goerlich, an advisory CISO, mentor, and strategist. We delve into the intricacies of security design frameworks and the importance of building and maintaining relationships in the cybersecurity field. Wolfgang shares his expertise on creating effective security programs, fostering trust within teams, and navigating the challenges of the CISO role. Tune in to gain valuable insights on cybersecurity strategy and the significance of collaborative relationships in achieving security goals.”

Empathy, kindness, and behavioral economics on We Hack Purple Podcast


Tanya Janca invited me onto her We Hack Purple Podcast to discuss vulnerabilities beyond code. Along the way, we cover behavioral economics and the importance of empathy in cybersecurity design. “Kindness is the original security principle” makes an appearance, as we talk about how all this and more applies to building better products.

Our conversation was sponsored by the Diana Initiative, a conference committed to helping all those underrepresented in Information Security.

 


To listen to other podcast interviews, visit the Podcasts page or the Podcasts category.

Things Wolfgang Goerlich Says – Design Monday


Alright, alright. This feels a bit strange. But I’m collecting my folksy sayings on cybersecurity leadership and design in one place. I’ll update this over time.

Good Security

  • Good security is usable security.
  • Good security gets out of the way of users while getting in the way of adversaries.
    • Good security frustrates attackers, not users.
  • Good security first delivers a business outcome and then, as a result, increases security.
  • Good security supports changing maturity.
  • Good security projects leave people hungry to play again.

Cloud Security

  • Ownership is not a security control.
  • Security is not what we control, it is what they do.

Defense and Offense

  • When work looks like work, work gets done.
  • Risk isn’t the language of the business. Story is.
  • Security happens where mankind meets machine.
  • The more constraints placed on users, the more creative they become.
  • All a better mousetrap does is breed better mice.

Always remember, friends: the Cyber War will not be won with platitudes.

— Wolf

Identify improvements as security matures – Design Monday


In writing the book Rethinking Sitting, Peter Opsvik manages to do with chairs what we should do with cyber security: study the item in the wider context of how people interact.

Peter Opsvik’s critique is that furniture design isn’t “particularly concerned with the needs of the sitting human body.” Many rituals, he believed, are driven by a need to relieve people and compensate for poor seats: kneeling to pray, standing to sing. Opsvik considered how the positioning of a chair, say in a kitchen or dining area, can make a person feel more or less connected, more or less important. He also spent considerable time thinking about how sitting changes as children grow into adults.

Design spans time frames: an experience lasting an hour, a stage in life lasting years, a lifetime. It spans contexts: personal, communal, societal.

We struggle with this in cyber security. Take, for example, the break glass account. Right then. We set up an account with administrative-level access, write the password down, seal it in an envelope, and stuff the envelope in a vault. But what happens when most administrators are working remotely? Fair point. Let’s move the password from a physical vault to a password vault, and share the vault with our backup person. But what happens when the vault goes down? How about when the backup person resigns and leaves for another company? How do we handle the longer lifecycle of this seemingly simple control?

Peter Opsvik’s answer to the lifecycle question is the Tripp Trapp chair. The chair is well-made, long-lasting, and stable. Simply change the seat and footrest, and the chair accommodates the user from infancy to adult. Five sets of adjustments as they mature.

The chair reminds me of the five-stage maturity models. Security capabilities move from initial to repeatable, defined, capable, and finally, optimized. To design a Tripp Trapp security control, think through how to reconfigure the control to support the evolving capability. Ideally, simplify these adjustments down to a small number of items.

What’s the seat and footrest in our break glass example? I suggest the credential storage and credential access. That is, how we set it up, and how the person handling the emergency breaks the glass.
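The idea can be sketched in code. The two adjustable parts become a small configuration table keyed by maturity stage; the specific settings below are illustrative assumptions, not a prescription from the original post:

```python
# Illustrative only: the break glass control as a "Tripp Trapp" design,
# with two adjustable parts (credential storage, credential access)
# reconfigured at each of the five maturity stages.
BREAK_GLASS = {
    "initial":    ("sealed envelope in a physical safe",
                   "on-site retrieval by any administrator"),
    "repeatable": ("entry in a shared password vault",
                   "vault login by the primary or backup admin"),
    "defined":    ("vault entry with a documented procedure",
                   "checkout is logged and reviewed"),
    "capable":    ("vault plus an offline copy for vault outages",
                   "checkout alerts the security team"),
    "optimized":  ("credential rotated automatically after each use",
                   "time-boxed access via an approval workflow"),
}

def reconfigure(stage):
    """Return the storage and access settings for a maturity stage."""
    storage, access = BREAK_GLASS[stage]
    return {"storage": storage, "access": access}
```

The point is not these particular settings. It is that only two parts change between stages, the way only the seat and footrest move on the chair.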

Tripp-Trapp-Tresko is Norwegian for Tic-Tac-Toe. In the kids game, like chairs and like security, you succeed by thinking ahead. “The best sitting position,” Opsvik once said, “is always the next position.” Start with minimum viable security. Plan for future stages early, and identify the adjustments we can make. Good security controls support an evolving capability maturity.


The Tripp Trapp Chair from Stokke.

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Nudge and Sludge: Driving DevOps Security with Design


Security people say users are the weakest link. When security becomes burdensome, users take shortcuts, jeopardizing security. Design offers a solution. We will walk through affordances, nudges, sludge, and principles to inform and direct our design. Come learn how better usability leads to DevOps security.

This talk was given at DevOpsDay Tel Aviv 2021.

Security Architecture Principles – Design Monday


Clack. Clack. Two hands. Hunt and peck typing. Clack. Clack. The beautiful red Valentine typewriter. Clack. Dieter Rams at his desk. This is the opening shot of the Rams documentary. What is he typing? Ten principles for good design.

An entire series of articles could be written applying Dieter Rams’ principles to cybersecurity. This is not that. Instead, let’s look to Rams as an example of creating and living with principles.

What makes a good architecture principle? It makes a statement. “Good design is honest,” Dieter Rams might type out. “Buy not build” is one I often encounter. A good architecture principle has a rationale. “It does not make a product more innovative, powerful or valuable than it really is. It does not attempt to manipulate the consumer with promises that cannot be kept.” For buy not build, our development resources are valuable and must be deployed only in areas where there is a clear advantage and where an existing solution doesn’t satisfy the majority of our needs. Finally, a good principle makes an impact. It has implications for later decisions.

“I like orderly confusion very much. But this is neither orderly nor properly confused,” Dieter Rams says about an hour into the documentary, while evaluating objects against his aesthetic and principles. “Others may like it. I do not.” A set of good architecture principles enables the team to make decisions. These decisions may be very different from those of other security teams, even security teams in similar industries and at similar times. The success of a security architecture depends not upon the individual decisions. Rather, success depends on the consistency across decisions, initiatives, and capabilities. Consistency through principles.

Consistency poses a challenge. The same thing means different things to different people. For architecture principles to work, the team must debate implications and applications. An example of this comes in the documentary when Mark Adams walks Dieter Rams through the new Vitsoe headquarters. For background, Adams is the managing director of Vitsoe, the firm which produces Rams’ furniture. “I want it to be completely honest that that is a fire barrier,” Adams explains. But is it honest? And does the honesty balance against the other principles? After a moment of thought, Rams says simply: “It’s a little bit irritating.” After some back and forth, they decide to sand it and blend it in. (In the photo below, you can see the resulting gray fire panels.) The moment captures this discussion of application. Principles live through debate.

Be principled. Develop a small set of architectural principles to guide the technical design. Live with them. Argue them. Disagree and commit. Apply and iterate them. But be principled.

Vitsoe London Headquarters, Photography by Vitsoe.

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Pilot with security chaos engineering – Design Monday


No security capability operates as intended. Even with perfect data, perfect planning, and perfect foresight, small differences between our assumptions and reality quickly add up to unpredictable situations. Security faces the proverbial butterfly flapping its wings in Brazil and producing a tornado in the United States.

The term “butterfly effect” was coined by Edward Lorenz, a meteorologist and the father of chaos theory. It all started when the limitations of computing led to limitations in forecasting. It’s a pattern that still plays out today, leading some to point to the need for chaos engineering.

Edward Lorenz was working on one of the first desktop computers: the Royal McBee LGP-30. Desktop in the sense that the computer was, in fact, the size of a desk. It also cost nearly a half million dollars in today’s US currency. We’re talking state-of-the-art vacuum tube technology. A teletype machine, the Friden Flexowriter, provided both input and output. It printed at a glacial ten characters per second.

These constraints of his machine inspired Edward Lorenz. But I’m getting ahead of myself.

So there Lorenz was, modeling the weather. To save memory, as he ran the calculations, he printed the results to charts. At ten characters a second, this was tedious. To save time, he printed to three decimal places.

The LGP-30 would hum and pop while it calculated a value to six decimal places. The Flexowriter would bang and punch out the result to three decimal places. Calculate 0.573547 and print 0.574. Again and again, line by line, while Lorenz waited.

This shouldn’t have been a big deal. The differences between the calculated results and printed values were quite small. But when Lorenz retyped the numbers and reran the models, he noticed something extraordinary. Weather on the original chart and the new chart would track for a day or two. But pretty soon, they’d differ widely, unexpectedly. What was once a calm day suddenly turned into a tornado. All due to the tiny differences in the source data. Edward Lorenz had discovered chaos theory.
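Lorenz’s experiment is easy to reproduce on any modern machine. Here is a minimal sketch using the logistic map, a standard chaotic system standing in for his weather model (an illustrative substitution; his actual model was far richer):

```python
# Iterate a chaotic map from the full-precision value the LGP-30
# computed (0.573547) and from the three-decimal value the
# Flexowriter printed (0.574), then compare the two trajectories.
def step(x, r=3.9):
    # Logistic map; chaotic at r = 3.9
    return r * x * (1 - x)

def trajectory(x0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(step(xs[-1]))
    return xs

full = trajectory(0.573547)
printed = trajectory(0.574)

# Early on the runs agree closely; after enough steps they
# differ widely, just as Lorenz's weather charts did.
for n in (1, 10, 60):
    print(n, abs(full[n] - printed[n]))
```

The initial gap is under a half a thousandth, yet the chaotic dynamics amplify it until the two runs bear no resemblance to one another.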

“Complexity. It’s extremely difficult to predict all the outcomes of one seemingly small change,” David Lavezzo of Capital One wrote in the book Security Chaos Engineering. “Measurement is hard.” And even when we have metrics, which we rarely do, these small changes compound and lead us into unforeseen territory.

You can’t just rely on the temperature numbers predicted at the beginning of the week. You have to actually go outside. See if you need a jacket. See if you should be wearing shorts. The same is true of security. We can’t rely on our long-range forecast. We need to check the reality on the ground. Regularly. From there, adapt according to our principles.

We future-proof our security architecture by choosing versatility. We design for adaptability by prioritizing principles over rules-based approaches. But when we get to implementation, we should expect that we’ve missed something. Expect that people and applications and devices and butterflies have behaved in ways that are a few decimal places beyond what we had considered.

We need some applied chaos to test and harden our implementation. The emerging domain of security chaos engineering is providing some useful techniques. Inject some faults. Change some settings. Run some exploits. Validate that the security controls continue to operate. Security chaos engineering provides a way to explore the unexpected.
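One way to picture such an experiment is as a small inject, verify, restore loop. The sketch below is a toy harness with hypothetical names; a real experiment would drive your actual infrastructure and monitoring APIs:

```python
# A minimal security chaos experiment harness (illustrative only).
from dataclasses import dataclass
from typing import Callable

@dataclass
class ChaosExperiment:
    hypothesis: str
    inject: Callable[[], None]    # introduce the unexpected condition
    verify: Callable[[], bool]    # did the security control hold?
    restore: Callable[[], None]   # undo the injection

def run(exp: ChaosExperiment) -> bool:
    exp.inject()
    try:
        return exp.verify()
    finally:
        exp.restore()             # always clean up, even on failure

# Toy example: open an unexpected port and check that a simple
# policy check notices it. (The "monitoring" here is a stand-in.)
APPROVED = {22, 443}
open_ports = {22, 443}

experiment = ChaosExperiment(
    hypothesis="Monitoring flags any port outside the approved set",
    inject=lambda: open_ports.add(8080),
    verify=lambda: bool(open_ports - APPROVED),  # flagged if extras exist
    restore=lambda: open_ports.discard(8080),
)

print(run(experiment))  # True: the control held, the extra port was flagged
```

The pattern matters more than the example: state a hypothesis, disturb the system, check the control, and leave the environment as you found it.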

But ultimately, the take-away from Edward Lorenz is one of humility. We simply don’t know what will come. With the data we have, we can’t predict what will happen. Decades of advances in computing since the Royal McBee LGP-30 haven’t changed this equation. When implementing security, pilot with chaos to prepare for the unforeseen.

Royal McBee LGP-30 replica by Jürgen Müller

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Change the game – Design Monday


Cyber security can be thought of as a game. Offense and defense. A set of motions and movements to score points, or to prevent the other team from scoring. Red team and blue team. A series of tactics and techniques to break in, or to detect and prevent such action. This thought is a good starting point. But we shouldn’t simply work on being better at the game. We need to change it.

Take basketball. When basketball debuted at the Berlin Summer Olympics in 1936, the game looked much the same as it does today. Sure, there have been subsequent rule changes. But the ball and hoop, well, those are classic.

Except.

During the first fifteen years of basketball, no one thought beyond the basket. Peach basket, to be precise. James Naismith famously nailed a peach basket to a gymnasium wall and thus invented the game. But it was the whole basket. After points were scored, a ladder would be used to fetch the ball. Sometimes, they used a stick to push the ball out. For fifteen years.

Why?

One reason is it’s hard to see beyond things. Functional fixedness. Another reason? We’re hardwired to add rather than subtract. Given the choice between adding a fetching stick and removing the bottom of the basket, we almost always choose the stick.

This human tendency has been studied. (See: People systematically overlook subtractive changes.) There’s even a book on the topic, Subtract: The Untapped Science of Less. The book looks at the tendency from practically every domain: science, business, medicine, and more. Except cyber security. Perhaps we can make it into a future edition.

Imagine people using IT in the organization. Imagine that’s the game we’re seeking to win. Get a sense of the players and the ball using business impact analysis. Get a sense of the movement and plays using journey mapping. Now imagine ways to secure this.

Your instinct will be to add. Pause. Look around for the peach baskets which can be replaced with hoops. Find something to subtract that improves the security.

Then change the game.

Peach baskets: the basket in basketball.

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Good security is like a good coffee pot – Design Monday


Coffee. Coffee fuels hackers, coders and security wonks alike. For hackers of my generation, we tackled many a problem and brewed many a pot with a Braun. And within its hourglass shape lies a lesson for today’s security professionals.

The chief designer at Braun from 1961-1995 was Dieter Rams. He was behind the ubiquitous Braun coffeemaker from the 1980s. (I had a hand-me-down pot in my workshop in the 1990s.) Now you might think the shape was for decoration. Makes sense. One of Dieter Rams’ ten principles for good design is that good design is aesthetic. You’d be wrong.

Attractiveness for the sake of attractiveness isn’t Dieter Rams’ point. His design aesthetic was first solving the problem, and then solving the problem in a beautiful way.

The hourglass coffeemaker’s shape stemmed from a problem with the plastic. Plastic casings were still relatively new at the time. The process wasn’t producing plastic that was strong enough. The fluting provided strength and structure. As Dieter Rams wrote, “what was often misunderstood as some kind of post-modern decorative element had in fact a definite structural function.”

Applying this to cyber security: first design to meet the security requirements, then redesign using the same elements to provide a good experience.

Braun KF 157 Coffeemaker, Photography via WorthPoint.

Good Design is Aesthetic

I’m nostalgic about the Braun KF 157 coffeemaker. But I’m in love with the Braun KF 20.

The KF 20 was ahead of its time. It looked like science fiction. In the futuristic world of Alien, set in 2122, there was the Braun KF 20.

Florian Seiffert designed the coffeemaker in 1972. Following Dieter Rams’ direction and principles, every stylistic element has a functional purpose. The end result is well-designed, well-intentioned beauty.

“It is truly unpleasant and tiring to have to put up with products day in and day out that are confusing, that literally get on your nerves, and that you are unable to relate to.” Dieter Rams spoke of products like coffee pots. But he just as easily could have been describing security controls.

Good security has a design aesthetic that is relatable and understandable.

Braun KF 20 Coffeemaker, Image via Dan Gorman

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Add some nice rims – Design Monday


“Simple cars need complex wheels.”

So said automotive designer Lowie Vermeersch about the Pininfarina Nido. When you make something so incredibly simple, a bit extra makes the entire thing pop.

The equivalent of nice rims in a security capability is that one thing we do that goes just a little bit further to make the end-user happy. It’s not something we have to do. We’re going to need wheels anyway. It’s a little extra.

It’s not something that adds much to the cost of the project. A nice set of rims runs around $1,000, while the average price of a car is $40,000. But it’s something the end-user notices and appreciates far above the price tag.

The path for designing a security capability goes from complexity to simplicity, taking those steps with empathy and understanding. As we follow that path, keep an eye open. Find opportunities to spend a fraction of the budget (say 1/40th?) on one detail that pleases people.

Simple security still needs chrome.

Pininfarina Nido EV, Photography courtesy NetCarShow.com

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.