Cybersecurity design weekly recap for October 26-31.
This week: Renzo Piano and the California Academy of Sciences. There’s a tension in designing a security architecture. The architecture must meet and mirror the culture of the organization; the design can’t run contrary to how the organization works. But at the same time, the new controls must facilitate a cultural change toward a more secure way of being. The architecture mirrors while it modifies. Principle: Design for change and stability.
Previously: Paul Hekkert and the Unified Model of Aesthetics. Most Advanced, Yet Acceptable (MAYA) is the name Hekkert has given this principle. How advanced can the design be while still remaining familiar, still being acceptable, still looking like it works? The answer will vary from organization to organization because of culture. But the question must remain top of mind for security leaders pushing the envelope. Principle: Balance familiarity with novelty.
One thing more: I was asked this week, “How can companies reduce the human errors that so often lead to security breaches?” Here’s the thing. The number one cause of problems in early flight? Human error. The number one cause of manufacturing accidents? Human error. The number one cause of nuclear power plant problems? Human error. Security problems? Yep, human error. The root cause of all these issues: poor design.
Check out User Friendly: How the Hidden Rules of Design Are Changing the Way We Live, Work & Play for more on the root cause of human error in flight, manufacturing, and computing.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.