“Why is SIEM an area of unease for so many security officers? To make detection and response successful, we need tools capable of upskilling practitioners as well as equipping them to succeed. We need tools we can rely on.
In today’s episode, we had an inspiring conversation with J Wolfgang Goerlich, Advisory CISO at Cisco Secure. We discussed how trust is a determining factor in building the security tools of the future, why so many CISOs have lost trust in SIEMs, and what we can do to rebuild it.”
Nudge and Sludge: Driving DevOps Security with Design
Security people say users are the weakest link. When security becomes burdensome, users take shortcuts that jeopardize security. Design offers a solution. We will walk through affordances, nudges, sludge, and principles to inform and direct our design. Come learn how better usability leads to DevOps security.
Google last week revealed that it was coordinating efforts with global partners to hand out free USB security keys to 10,000 elected officials, political campaign workers, human rights activists and journalists, and other users considered to be at high risk of getting hacked.
“Whenever a major organization makes a major announcement bolstering their security controls, it sparks conversation and movement in the broader industry,” agreed Wolfgang Goerlich, advisory CISO at Cisco Secure. “Google’s announcement that it is enrolling 10,000 people in authenticating with strong security keys will make it easier to explain a similar need in other organizations.”
And this isn’t the first such corporate endorsement of hardware-based authentication. Among the companies using FIDO’s standards for Universal 2nd Factor (U2F) authentication keys is Yubico, which, like Google, has been working with the DDC to provide its hardware-based authentication keys to campaigns from both major parties.
This post is an excerpt from a press article. To see other media mentions and press coverage, click to view the Media page or the News category. Do you want to interview Wolf for a similar article? Contact Wolf through his media request form.
79% of people used two-factor authentication at least once in 2021, with 72% regularly using the technology, as remote work, social media, and online retail spur demand.
SMS texts continued to be the most-used type of two-factor authentication, with 85% of people using that 2FA technology. Verification emails are the second most common type at 74%, while passcodes issued by mobile authentication apps came in third with 44%.
Companies need to educate consumers more on the pitfalls of SMS text messages as a second factor, Goerlich says. More than half of people surveyed would choose SMS as the second factor for a new account, while fewer than 10% would choose a mobile passcode application and 7% would use a push notification. SMS tied with security keys, such as the YubiKey, for the highest perceived security and topped the list for usability.
“There is a clear mismatch between what the survey respondents are using in terms of security and what researchers have found and identified in terms of security,” he says. “It makes sense that SMS is rated high in usability, and there is a really strong familiarity with the factor, but a lot of issues have been identified by researchers.”
Attempts to educate people on security problems with SMS should be careful, however, not to dissuade them from using two-factor authentication at all, Goerlich stressed.
“Wolfgang Goerlich, Advisory CISO, explains the current state of information security, and why he thinks many environments are focusing on the wrong things. We speak about ransomware, extortionware, and phishing, even giving examples where we know we have personally been phished! He explains how this illustrates his point that we need more emphasis in different areas of information security.”
“Hacker History sits down with Wolfgang Goerlich. We learn about the importance of having root and admin when becoming a hacker and how truly important it is that we have patience and empathy for one another. We also learn that skateboarding in a datacenter doesn’t actually work.”
“In this episode of CyberWire-X, guests will discuss a framework of technical controls to ensure only trusted sessions authenticate, regardless of faults or failures in any one factor. We will share a path forward for increasing trust in passwordless authentication. Nikk Gilbert, CISO of Cherokee Nation Businesses, and retired CSO Gary McAlum share their insights with Rick Howard, and Wolfgang Goerlich, Advisory CISO of Duo Security at Cisco, offers his thoughts with Dave Bittner on behalf of sponsor Duo Security.”
An entire series of articles could be written applying Dieter Rams’ principles to cybersecurity. This is not that. Instead, let’s look to Rams as an example of creating and living with principles.
What makes a good architecture principle? It makes a statement. “Good design is honest,” Dieter Rams might type out. “Buy not build” is one I often encounter. A good architecture principle has a rationale. “It does not make a product more innovative, powerful or valuable than it really is. It does not attempt to manipulate the consumer with promises that cannot be kept.” For buy not build, the rationale is that our development resources are valuable and must be deployed only in areas where there is a clear advantage and where an existing solution doesn’t satisfy the majority of our needs. Finally, a good principle makes an impact. It has implications for later decisions.
“I like orderly confusion very much. But this is neither orderly nor properly confused,” Dieter Rams says about an hour into the documentary, while evaluating objects against his aesthetic and principles. “Others may like it. I do not.” A set of good architecture principles enables the team to make decisions. These decisions may be very different from other security teams, even other security teams in similar industries and at similar times. The success of a security architecture depends not upon the individual decisions. Rather, success depends on the consistency across decisions, initiatives, and capabilities. Consistency through principles.
Consistency poses a challenge. The same thing means different things to different people. For architecture principles to work, the team must debate implications and applications. An example of this comes in the documentary when Mark Adams walks Dieter Rams through the new Vitsoe headquarters. For background, Adams is the managing director of Vitsoe, the firm that produces Rams’ furniture. “I want it to be completely honest that that is a fire barrier,” Adams explains. But is it honest? And does the honesty balance against the other principles? After a moment of thought, Rams says simply: “It’s a little bit irritating.” After some back and forth, they decide to sand it and blend it in. (In the photo below, you can see the resulting gray fire panels.) The moment captures this discussion of application. Principles live through debate.
Be principled. Develop a small set of architectural principles to guide the technical design. Live with them. Argue them. Disagree and commit. Apply and iterate them. But be principled.
Vitsoe London Headquarters, Photography by Vitsoe.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
No security capability operates as intended. Even with perfect data, perfect planning, and perfect foresight? Small differences between our assumptions and reality quickly add up to unpredictable situations. Security faces the proverbial butterfly flapping its wings in Brazil producing a tornado in the United States.
The term “butterfly effect” was coined by Edward Lorenz, a meteorologist and the father of chaos theory. It all started when the limitations of computing led to limitations in forecasting. It’s a pattern that still plays out today, leading some to point to the need for chaos engineering.
Edward Lorenz was working on one of the first desktop computers: the Royal McBee LGP-30. Desktop in the sense that the computer was, in fact, the size of a desk. It also cost nearly half a million dollars in today’s US currency. We’re talking state-of-the-art vacuum tube technology. A teletype machine, the Friden Flexowriter, provided both input and output. It printed at a glacial ten characters per second.
These constraints of his machine inspired Edward Lorenz. But I’m getting ahead of myself.
So there Lorenz was, modeling the weather. To save memory, as he ran the calculations, he printed the results to charts. At ten characters a second this was tedious. To save time, he printed to three decimal places.
The LGP-30 would hum and pop while it calculated a value to six decimal places. The Flexowriter would bang and punch out the result to three decimal places. Calculate 0.573547 and print 0.574. Again and again, line by line, while Lorenz waited.
This shouldn’t have been a big deal. The differences between the calculated results and printed values were quite small. But when Lorenz retyped the numbers and reran the models, he noticed something extraordinary. Weather on the original chart and the new chart would track for a day or two. But pretty soon, they’d differ widely, unexpectedly. What was once a calm day suddenly turned into a tornado. All due to the tiny differences in the source data. Edward Lorenz had discovered chaos theory.
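Lorenz’s rounding experiment is easy to reproduce with any chaotic system. The sketch below is illustrative only: it substitutes the logistic map for Lorenz’s weather model (an assumption made for brevity), and restarts it from a three-decimal “printout” of a six-decimal starting value, just as Lorenz retyped his charts.

```python
# Reproduce the spirit of Lorenz's experiment with the logistic map,
# x_{n+1} = r * x * (1 - x), a textbook chaotic system.
# (Lorenz used a weather model; any chaotic map shows the same effect.)

def run(x, r=3.9, steps=30):
    """Iterate the map from x and return the full trajectory."""
    trajectory = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        trajectory.append(x)
    return trajectory

full = run(0.573547)    # the six-decimal value the machine computed
reprinted = run(0.574)  # the three-decimal value the printer produced

for step, (a, b) in enumerate(zip(full, reprinted)):
    print(f"step {step:2d}  gap {abs(a - b):.6f}")
# Early steps track closely; within a few dozen iterations the two
# trajectories bear no resemblance to each other.
```

The starting values differ by less than a thousandth, yet the gap grows roughly geometrically with each iteration until the two runs are effectively unrelated.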
“Complexity. It’s extremely difficult to predict all the outcomes of one seemingly small change,” David Lavezzo of Capital One wrote in the book Security Chaos Engineering. “Measurement is hard.” And even when we have metrics, which we rarely do, these small changes compound and lead us into unforeseen territory.
You can’t just rely on the temperature numbers predicted at the beginning of the week. You have to actually go outside. See if you need a jacket. See if you should be wearing shorts. The same is true of security. We can’t rely on our long-range forecast. We need to check the reality on the ground. Regularly. From there, adapt according to our principles.
We future-proof our security architecture by choosing versatility. We design for adaptability by prioritizing principles over rules-based approaches. But when we get to implementation, we should expect that we’ve missed something. Expect that people and applications and devices and butterflies have behaved in ways that are a few decimal places further than we had considered.
We need some applied chaos to test and harden our implementation. The emerging domain of security chaos engineering provides some useful techniques. Inject some faults. Change some settings. Run some exploits. Validate that the security controls continue to operate. Security chaos engineering provides a way to explore the unexpected.
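In code, that loop can be as small as a hypothesis check wrapped around a fault injection. Here is a minimal sketch; the `experiment` helper and the toy firewall rules are hypothetical stand-ins, not a real chaos-engineering tool’s API.

```python
# Hypothetical sketch of a security chaos experiment: check a hypothesis
# about a control, inject a fault, check again, then restore state.
# All names below are illustrative, not a real tool's API.

def experiment(name, hypothesis, inject_fault, revert_fault):
    """Run one steady-state -> fault -> verify -> revert cycle."""
    assert hypothesis(), f"{name}: control already failing before the fault"
    inject_fault()
    try:
        holds = hypothesis()  # does the control survive the injected fault?
    finally:
        revert_fault()        # always restore the original state
    print(f"{name}: {'held' if holds else 'FAILED under fault'}")
    return holds

# Toy example: inbound traffic should stay blocked even if the primary
# firewall rule is dropped, because a backup rule exists.
firewall = {"deny_all_inbound": True, "backup_deny_rule": True}

def inbound_blocked():
    return firewall["deny_all_inbound"] or firewall["backup_deny_rule"]

result = experiment(
    "inbound-block",
    hypothesis=inbound_blocked,
    inject_fault=lambda: firewall.update(deny_all_inbound=False),
    revert_fault=lambda: firewall.update(deny_all_inbound=True),
)
```

The value is less in the five lines of plumbing than in being forced to state the hypothesis explicitly: if you cannot write `inbound_blocked()`, you do not yet know what your control is supposed to guarantee.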
But ultimately, the takeaway from Edward Lorenz is one of humility. We simply don’t know what will come. With the data we have, we can’t predict what will happen. Decades of advances in computing since the Royal McBee LGP-30 haven’t changed this equation. When implementing security, pilot with chaos to prepare for the unforeseen.