79% of people used two-factor authentication at least once in 2021, with 72% regularly using the technology, as remote work, social media, and online retail spur demand.
SMS texts continued to be the most-used type of two-factor authentication, with 85% of people using that 2FA technology. Verification emails are the second most common type at 74%, while passcodes issued by mobile authentication apps came in third with 44%.
Companies need to do more to educate consumers on the pitfalls of SMS text messages as a second factor, Goerlich says. More than half of those surveyed would choose SMS as the second factor for a new account, while fewer than 10% would choose a mobile passcode application and 7% would use a push notification. SMS tied with security keys, such as the YubiKey, for highest perceived security and topped the list for usability.
“There is a clear mismatch between what the survey respondents are using in terms of security and what researchers have found and identified in terms of security,” he says. “It makes sense that SMS is rated high in usability, and there is a really strong familiarity with the factor, but a lot of issues have been identified by researchers.”
Anyone educating people on the security problems of SMS should be careful, however, not to dissuade them from using two-factor authentication at all, Goerlich stressed.
This post is an excerpt from a press article. For other media mentions and press coverage, visit the Media page or the News category. Want to interview Wolf for a similar article? Contact him through his media request form.
“Wolfgang Goerlich, Advisory CISO, explains the current state of information security, and why he thinks many environments are focusing on the wrong things. We speak about ransomware, extortionware, and phishing, even giving examples where we know we have personally been phished! He explains how this illustrates his point that we need more emphasis in different areas of information security.”
“Hacker History sits down with Wolfgang Goerlich. We learn about the importance of having root and admin when becoming a hacker and how truly important it is that we have patience and empathy for one another. We also learn that skateboarding in a datacenter doesn’t actually work.”
In this episode of CyberWire-X, guests discuss a framework of technical controls to ensure only trusted sessions authenticate, regardless of faults or failures in any one factor, and share a path forward for increasing trust in passwordless authentication. Nikk Gilbert, CISO of Cherokee Nation Businesses, and retired CSO Gary McAlum share their insights with Rick Howard, while Wolfgang Goerlich, Advisory CISO of Duo Security at Cisco, offers his thoughts with Dave Bittner on behalf of episode sponsor Duo Security.
An entire series of articles could be written applying Dieter Rams’ principles to cybersecurity. This is not that. Instead, let’s look to Rams as an example of creating and living with principles.
What makes a good architecture principle? It makes a statement. “Good design is honest,” Dieter Rams might type out. “Buy not build” is one I often encounter. A good architecture principle has a rationale. For honest design, Rams wrote: “It does not make a product more innovative, powerful or valuable than it really is. It does not attempt to manipulate the consumer with promises that cannot be kept.” For buy not build, the rationale is that development resources are valuable and must be deployed only where there is a clear advantage and where no existing solution satisfies the majority of our needs. Finally, a good principle makes an impact. It has implications for later decisions.
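To make this concrete, here is a minimal sketch in Python of one way a team might record principles so they can be cited in design reviews. The statement, rationale, and implications fields mirror the three traits above; the specific class and example entries are illustrative, not any standard.

```python
# A minimal sketch (not any standard) of recording architecture
# principles so they can be cited in design reviews. The structure
# mirrors the three traits above: a statement, a rationale, and
# implications for later decisions.

from dataclasses import dataclass, field


@dataclass
class Principle:
    statement: str                # the memorable assertion
    rationale: str                # why the team holds it
    implications: list[str] = field(default_factory=list)  # decisions it drives


buy_not_build = Principle(
    statement="Buy not build",
    rationale=(
        "Development resources are valuable and must be deployed only "
        "where there is a clear advantage and no existing solution "
        "satisfies the majority of our needs."
    ),
    implications=[
        "New capabilities start with a market scan, not a new repo.",
        "Custom code requires a documented, reviewed exception.",
    ],
)

print(buy_not_build.statement, "because:", buy_not_build.rationale)
```

Written down this way, the rationale travels with the slogan, so later debates can start from the reasoning rather than the bumper sticker.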
“I like orderly confusion very much. But this is neither orderly nor properly confused,” Dieter Rams says about an hour into the documentary, while evaluating objects against his aesthetic and principles. “Others may like it. I do not.” A set of good architecture principles enables the team to make decisions. These decisions may be very different from those of other security teams, even teams in similar industries and at similar times. The success of a security architecture depends not upon the individual decisions. Rather, success depends on the consistency across decisions, initiatives, and capabilities. Consistency through principles.
Consistency poses a challenge. The same thing means different things to different people. For architecture principles to work, the team must debate implications and applications. An example comes in the documentary when Mark Adams walks Dieter Rams through the new Vitsoe headquarters. For background, Adams is the managing director of Vitsoe, the firm that produces Rams’ furniture. “I want it to be completely honest that that is a fire barrier,” Adams explains. But is it honest? And does the honesty balance against the other principles? After a moment of thought, Rams says simply: “It’s a little bit irritating.” After some back and forth, they decide to sand the gray fire panels and blend them in. The moment captures this discussion of application. Principles live through debate.
Be principled. Develop a small set of architectural principles to guide the technical design. Live with them. Argue them. Disagree and commit. Apply and iterate them. But be principled.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
No security capability operates as intended. Even with perfect data, perfect planning, and perfect foresight? Small differences between our assumptions and reality quickly add up to unpredictable situations. Security faces the proverbial butterfly flapping its wings in Brazil and producing a tornado in the United States.
The term “butterfly effect” was coined by Edward Lorenz, a meteorologist and father of chaos theory. It all started when the limitations of computing led to limitations in forecasting. It’s a pattern that still plays out today, leading some to point to the need for chaos engineering.
Edward Lorenz was working on one of the first desktop computers: the Royal McBee LGP-30. Desktop in the sense that the computer was, in fact, the size of a desk. It also cost nearly a half million dollars in today’s US currency. We’re talking state-of-the-art vacuum tube technology. A teletype machine, the Friden Flexowriter, provided both input and output. It printed at a glacial ten characters per second.
These constraints of his machine inspired Edward Lorenz. But I’m getting ahead of myself.
So there Lorenz was, modeling the weather. To save memory, as he ran the calculations, he printed the results to charts. At ten characters a second, this was tedious. To save time, he printed to three decimal places.
The LGP-30 would hum and pop while it calculated a value to six decimal places. The Flexowriter would bang and punch out the result to three decimal places. Calculate 0.573547 and print 0.574. Again and again, line by line, while Lorenz waited.
This shouldn’t have been a big deal. The differences between the calculated results and printed values were quite small. But when Lorenz retyped the numbers and reran the models, he noticed something extraordinary. Weather on the original chart and the new chart would track for a day or two. But pretty soon, they’d differ widely, unexpectedly. What was once a calm day suddenly turned into a tornado. All due to the tiny differences in the source data. Edward Lorenz had discovered chaos theory.
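You can replay Lorenz’s accident in a few lines of code. The sketch below is a simplification, using the classic three-variable Lorenz system rather than his original, more complex weather model: one run starts from a full-precision state, the other from the same state rounded to three decimal places, the way the Flexowriter printed it.

```python
# A minimal sketch of Lorenz's rounding accident, using the classic
# three-variable Lorenz system (a simplification; his original weather
# model was larger). Forward Euler is coarse but fine for a demo.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one forward-Euler step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

exact = (1.000001, 1.000001, 1.000001)        # full-precision state
rounded = tuple(round(v, 3) for v in exact)   # what the printer showed

for step in range(3001):
    if step % 500 == 0:
        print(f"step {step:4d}  exact x={exact[0]:+9.4f}  rounded x={rounded[0]:+9.4f}")
    exact = lorenz_step(*exact)
    rounded = lorenz_step(*rounded)
```

The first few printed rows agree almost perfectly; by the end of the run, the two columns have nothing to do with each other. A millionth of a difference in the starting state is enough.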
“Complexity. It’s extremely difficult to predict all the outcomes of one seemingly small change,” David Lavezzo of Capital One wrote in the book Security Chaos Engineering. “Measurement is hard.” And even when we have metrics, which we rarely do, these small changes compound and lead us into unforeseen territory.
You can’t just rely on the temperature numbers predicted at the beginning of the week. You have to actually go outside. See if you need a jacket. See if you should be wearing shorts. The same is true of security. We can’t rely on our long-range forecast. We need to check the reality on the ground. Regularly. From there, adapt according to our principles.
We future-proof our security architecture by choosing versatility. We design for adaptability by prioritizing principles over rules-based approaches. But when we get to implementation, we should expect that we’ve missed something. Expect that people and applications and devices and butterflies have behaved in ways a few decimal places beyond what we considered.
We need some applied chaos to test and harden our implementation. The emerging domain of security chaos engineering is providing some useful techniques. Inject some evidence. Change some settings. Run some exploits. Validate that the security controls continue to operate. Security chaos engineering provides a way to explore the unexpected.
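As a hedged sketch of what such an experiment might look like in practice, the harness below injects a fault, watches for the control to respond, and always rolls back. The helpers in the usage example (simulate_c2_beacon, alert_fired, cleanup) are hypothetical stand-ins for your own tooling, not a real library API.

```python
import time

# A hedged sketch of a security chaos experiment harness. The shape is
# the point: inject a fault, test a steady-state hypothesis, and always
# roll back. Nothing here is a real library API.

def run_experiment(inject, hypothesis, rollback, timeout_s=300, poll_s=10):
    """Inject a fault, verify a control responds, then restore the environment."""
    inject()  # e.g., disable an agent, drop a firewall rule, replay an exploit
    try:
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if hypothesis():  # did the control detect or contain the fault?
                return "hypothesis held: the control operated as intended"
            time.sleep(poll_s)
        return "hypothesis failed: no response within the timeout"
    finally:
        rollback()  # always leave the environment the way you found it

# Hypothetical usage, with stand-ins for your own tooling:
# result = run_experiment(
#     inject=lambda: simulate_c2_beacon("test-host-01"),
#     hypothesis=lambda: alert_fired("egress-filter", "test-host-01"),
#     rollback=lambda: cleanup("test-host-01"),
# )
```

The finally block is the design choice that matters: chaos experiments earn trust by being safe to run, which means the rollback happens whether or not the hypothesis holds.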
But ultimately, the take-away from Edward Lorenz is one of humility. We simply don’t know what will come. With the data we have, we can’t predict what will happen. Decades of advances in computing since the Royal McBee LGP-30 haven’t changed this equation. When implementing security, pilot with chaos to prepare for the unforeseen.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
This session is on all the things we all say all the time, about all the things we all know. Security through obscurity is bad. Defense in depth is good. Stop clicking things. Next generation is bad, or maybe, next generation is good. The list goes on and on. The resulting rules of thumb are sometimes contradictory and often misleading. With war stories and anecdotes, we’ll explore what happens when teams run security by tribal knowledge instead of research and reason. Spoiler alert: they get pwned. Turns out, we were wrong.
Presented at the Great Lakes Security Conference (GLSC) 2021.
Cyber security can be thought of as a game. Offense and defense. A set of motions and movements to score points, or to prevent the other team from scoring. Red team and blue team. A series of tactics and techniques to break in, or to detect and prevent such action. This thought is a good starting point. But we shouldn’t simply work on being better at the game. We need to change it.
Take basketball. When basketball debuted at the Berlin Summer Olympics in 1936, the game looked much the same as it does today. Sure, there have been subsequent rule changes. But the ball and hoop, well, those are classic.
Except.
During the first fifteen years of basketball, no one thought beyond the basket. Peach basket, to be precise. James Naismith famously nailed a peach basket to a gymnasium wall and thus invented the game. But it was the whole basket. After points were scored, a ladder would be used to fetch the ball. Sometimes, they used a stick to push the ball out. For fifteen years.
Why?
One reason is it’s hard to see beyond things. Functional fixedness. Another reason? We’re hardwired to add rather than subtract. Given the choice between adding a fetching stick and removing the bottom of the basket, we almost always choose the stick.
This human tendency has been studied. (See: “People systematically overlook subtractive changes.”) There’s even a book on the topic, Subtract: The Untapped Science of Less. The book looks at subtraction in practically every domain, from science to business to medicine and more. Except cyber security. Perhaps we can make it into a future edition.
Imagine people using IT in the organization. Imagine that’s the game we’re seeking to win. Get a sense of the players and the ball using business impact analysis. Get a sense of the movement and plays using journey mapping. Now imagine ways to secure this.
Your instinct will be to add. Pause. Look around for the peach baskets which can be replaced with hoops. Find something to subtract that improves the security.
Then change the game.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
Coffee. Coffee fuels hackers, coders, and security wonks alike. Hackers of my generation tackled many a problem and brewed many a pot with a Braun. And within its hourglass shape lies a lesson for today’s security professionals.
The chief designer at Braun from 1961 to 1995 was Dieter Rams. He was behind the ubiquitous Braun coffeemaker of the 1980s. (I had a hand-me-down pot in my workshop in the 1990s.) Now you might think the shape was for decoration. Makes sense. One of Dieter Rams’ ten principles for good design is that good design is aesthetic. You’d be wrong.
Attractiveness for the sake of attractiveness isn’t Dieter Rams’ point. His design aesthetic was first solving the problem, and then solving the problem in a beautiful way.
The hourglass coffeemaker’s shape stemmed from a problem with the plastic. Plastic casings were still relatively new at the time. The process wasn’t producing plastic that was strong enough. The fluting provided strength and structure. As Dieter Rams wrote, “what was often misunderstood as some kind of post-modern decorative element had in fact a definite structural function.”
Applying this to cyber security: first design to meet the security requirements, then redesign using the same elements to provide a good experience.
Good Design is Aesthetic
I’m nostalgic about the Braun KF 157 coffeemaker. But I’m in love with the Braun KF 20.
The KF 20 was ahead of its time. It looked like science fiction. In the futuristic world of Alien, set in 2122, there was the Braun KF 20.
Florian Seiffert designed the coffeemaker in 1972. Following Dieter Rams’ direction and principles, every stylistic element has a functional purpose. The end result is well-designed, well-intentioned beauty.
“It is truly unpleasant and tiring to have to put up with products day in and day out that are confusing, that literally get on your nerves, and that you are unable to relate to.” Dieter Rams spoke of products like coffee pots. But he just as easily could have been describing security controls.
Good security has a design aesthetic that is relatable and understandable.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.