For years, the Peerlyst social network has been a resource for software developers looking for a job or cybersecurity enthusiasts wanting to host meetups across the world. But on Aug. 27, the website will shut down, Peerlyst founder Limor Elbaz said Monday, citing financial pressure.
Cybersecurity professionals lamented the end of the platform. “I took the news hard,” said J. Wolfgang Goerlich, an advisory CISO at Duo Security who has posted nearly 700 times on Peerlyst. “With the Peerlyst going away, we’re losing a central watering hole. The conversations may continue over LinkedIn and Facebook groups. But the loss of a dedicated security social media site will be felt for some time.”
The site also let users plan their own offline meetups in various cities across Asia, Australia, Europe, and North America.
I was an early adopter of Peerlyst and a regular contributor. I ended up the 22nd most popular user on a site that boasted of serving “70% of security professionals around the world” and of ranking higher than the majority of security companies. Also? Peerlyst once put my face on the side of a bus during the RSA Conference. So I’m a little biased.
There is tremendous value in community. Apple itself got its start at the Homebrew Computer Club. I cut my teeth over many years as a top poster in the Citrix online community back in the early 2000s. And in the last decade, more people than I can count have had their careers launched through my local security community, MiSec.
I’m sad to see Peerlyst go and am grateful to Limor Elbaz, Evgeny Belenky, and the entire Peerlyst team. My thanks to them for the memories and connections.
To you, the reader, I ask this: what community will you build?
This post is an excerpt from a press article. To see other media mentions and press coverage, click to view the Media page or the News category. Do you want to interview Wolf for a similar article? Contact Wolf through his media request form.
Saul Bass designed corporate identities. He created movie posters. In both, his signature style was minimalism and clarity. Consider the iconic AT&T bell logo (1969), or the Magnificent Seven poster below (1960). Clean. Concise. But he is best remembered for his reimagining of the movie title sequence. Originally, titles were simply how a film provided its credits. And because of this, people naturally ignored them, using the time for a concession run.
Saul Bass saw it differently: “The audience involvement with the film should begin with the first frame. Use titles in a new way to create a climate for the story that was about to unfold.” Take my favorite of his title sequences: Grand Prix (1966). The engine revs. The cars come into view. The engineers’ and mechanics’ movements are isolated, amplified, repeated, glorified. Everything about those first few minutes pumps me up. I frankly can’t recall anything else about the film. But I never forgot that intro.
Of course, my reaction was a bit of a problem for studios. “There was a backlash against inventiveness in credit design, first from the industry and then from at least one well-known critic,” Jan-Christopher Horak writes in Saul Bass: Anatomy of Film Design. Quoting Variety in 1957: “An offbeat credit runoff, while pleasing to the patrons, does an injustice to the talent since the audience’s attention is diverted from the names.”
Let’s put Saul Bass’s story aside for a moment and turn towards designing and architecting cyber security capabilities. In the final phase, when planning the implementation, how are we treating the critical beginning of the project?
Most of us kick off with the equivalent of running credits while the stakeholders are getting popcorn. A 2018 study by the Project Management Institute (PMI) into project failures reflects this status quo. Projects failed due to inadequate vision (29%), poor communication (29%), and, unsurprisingly, inadequate support from stakeholders and sponsors (26%). We read off the checklist and they check out.
“In a sense,” says Art of the Title, “all modern opening title sequences that introduce the mood or theme of a film are a legacy of the Basses’ work.” It’s short-form storytelling. It’s the entire theme of a movie boiled down to simple ideas, well visualized. An opening title sequence frames the movie and creates excitement for what’s to come. If we want our implementation to be successful, this is what our kick-off meeting must deliver.
Start strong. Start with style. Plan the kick-off meeting like Saul Bass planned a title sequence. The project will be our blockbuster. Start it like one.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
We spend far too much time talking about defense in depth and far too little time talking about economy of mechanism.
As a design inspiration, look to Alfred Heineken. Not a designer, Heineken was a brewer and a businessman. In the 1950s, modernizing the look of the Dutch brewing company, Heineken made two changes to the beer’s logo. He dropped the upper-casing and then, to be playful, he tilted the e until it resembled a smile. Simple.
Defense in depth suggests more controls and more tools are better. However, this complexity comes at a cost. In a study performed by Cisco, the number of vendor tools was directly correlated with the downtime from a security incident. Security teams using one vendor averaged four hours or less of downtime, while teams managing more than 50 averaged more than 17 hours of downtime.
I suspect the downtime is driven by the team’s confusion when responding to incidents. It fits my personal experience, and reminds me of what Donald A. Norman wrote in Living with Complexity. “Modern technology can be complex, but complexity by itself is neither good nor bad: it is confusion that is bad. Forget the complaints against complexity; instead, complain about confusion.”
Economy of mechanism suggests implementing the fewest controls and fewest tools to mount an adequate defense. We have a finite cognitive throughput from people doing the work and people securing the work. We have a finite budget. After we have the requirements and possible tooling options, ask how we can achieve the same results with less. Ask again, and again.
Find the letter e, tilt it a bit, and smile.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
Artists create unique pieces for a limited audience. Designers create for scale. The tension lies between creating something that works and building something that’s repeatable.
This tension came up in conversation around the article I wrote about Kenji Kawakami and the art of Chindōgu. The principle is employing playful anarchy to bring security controls from useless to un-useless to useful. People were quick to point out that quantifiable, repeatable, scalable security is jeopardized by the ad hoc chaos of creation.
For guidance, look to George Nelson who was the Director of Design for Herman Miller from 1947 to 1972. One of the first designs George Nelson brought forward was a “sculpture-for-use” table by Isamu Noguchi. Sculpture remade as a repeatable product. Nelson also managed designers such as Charles and Ray Eames, Alexander Girard, and Robert Propst. It’s a simple comparison to draw from furniture to technology, from the difficulty of managing people like the Eames to the difficulty of managing today’s cybersecurity talent.
Here is how Nelson did it for twenty-five years:
Philosophy. Read George Nelson’s introduction to the Herman Miller catalog in light of the intrinsic motivation framework laid out in the book Drive: autonomy, mastery, purpose. Nelson’s philosophy is finely tuned for getting the best out of innovative people. An unstated undercurrent is that designs must be producible. After all, Herman Miller is a business. The trick was to protect the playful anarchy while harnessing the results for manufacturing at scale. “There is a hint of the craftsman as opposed to the industrialist.”
Methodology. In modern times, George Nelson has been described as a meta-designer. That is, he spent more time designing the furniture design process than he spent designing the actual furniture. While he retired some twenty years before the founding of IDEO, Nelson would have been right at home in the world of design thinking. He pioneered a formal way to go from a series of conversations, to a series of prototypes, to a finished product. Along the way, he captured information and provided feedback to refine not only the design but also the lifecycle itself. Nelson’s approach was showcased in the 1975 exhibit “The Design Process at Herman Miller.”
The challenge in cyber security design is taking a successful proof-of-concept and scaling from prototype to securing the overall organization. How to balance the artist with the designer? The craftsman with the industrialist? Playful anarchy with well-defined operations? Nelson held a philosophy geared to foster the intrinsic motivations of the creative mind. He created a methodology for taking ideas to market. George Nelson combined both into his meta-design approach.
For security leadership, the way to get meta is to develop a philosophy and a methodology, design a way to design, and improve based on feedback.
Philosophy drives the satisfaction of our people. Methodology drives the success of our initiatives. We need both, and both need continuous improvement.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
This week: Yves Saint Laurent and fashion. Cybersecurity can be a bit too much like fashion. Every major event, there’s a new trend. The media buzz will say that new threats appear every day. The buzz is that our ways of defending become dated and ineffective as quickly as they’re implemented. What to do? Do the fundamentals well. Do them consistently. Do them with style. Principle: Frameworks fade but security is eternal.
Previously: Charlotte Perriand and the LC4 Chaise. Around 1930, Perriand applies the metaphor of the lounging cowboy to the LC4 Chaise Longue. Twenty years later, around 1950, Børge Mogensen applies the metaphor of Perriand’s chair to Mogensen’s Hunting Chair. And twenty years after that, we have lawn furniture inspired by Mogensen and Perriand. Technology advances at the speed at which new metaphors are identified, shared, adopted, and absorbed. Principle: Take it one metaphor at a time.
One thing more: YouTube has a documentary called Charlotte Perriand: Inventing the World. “An opportunity to review Perriand’s life and career from the perspective of her artistic activities as well as her social and political engagement. We talked about her stance on the individual’s role in nature, the position of women in society, a new type of living environment, the way different types of artistic creation relate to each other, and the concept of a synthesis of the arts.”
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
Frameworks fade but security is eternal. Said with apologies to Yves Saint Laurent.
Yves Saint Laurent was a dominant force in fashion from the 1960s through the end of the century. His strengths stemmed from three areas. First, seeing the underlying fundamentals and being able to re-envision them across genders, across times, and across trends. Second, the ability to cross artforms for inspiration, most notably with Piet Mondrian and geometrical shapes. Finally, the ability to reformulate haute couture for mass production. Yves Saint Laurent was the first to open a ready-to-wear line in Paris. He was a designer who mastered how to take the pieces apart and put them back together for new tastes and new markets. It was Yves Saint Laurent who once famously said, “Fashion fades but style is eternal.”
Last week, we looked at how the adoption of a control — doing something right but rare — has surprising stopping power against common attacks. But the fast-changing early adoption must be balanced with slow-changing fundamentals.
Cybersecurity can be a bit too much like fashion. Every major event, there’s a new trend. The media buzz will say that new threats appear every day. The buzz is that our ways of defending become dated and ineffective as quickly as they’re implemented. New frameworks cry out that the old ways were wrong.
This last bit is particularly on my mind in 2020. A new version of the CIS Critical Security Controls came out late last year. NIST is releasing a new version of its standard for security and privacy controls (NIST SP 800-53). And the new PCI DSS (Data Security Standard) for credit card security is due any time now. Each framework will be accompanied by a wave of press on how everything has changed. The last version is so last season, and simply won’t do.
But is it? Is it really?
Like style, fundamentals in security remain the same even while the specifics evolve. We need to know our people and our technology. We need visibility into what’s happening and what’s changing. We need to think in terms of lifecycles and act in terms of incidents. We need to make sure the essential habits which result in defensible positions are done regularly. Finally, we need to understand the adversary’s objectives and tactics. From mainframes to data centers to cloud infrastructures to tomorrow, the fundamentals hold true.
A security architecture is composed of a series of building blocks. Some building blocks should be innovative and ahead of our peers. Most building blocks should do the fundamentals and broadly cover the frameworks.
Do the fundamentals well. Do them consistently. Do them with style.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
This week: Wim Crouwel and the New Alphabet. The first computer screen font predated the personal computer by a decade. Crouwel saw the possibility of CRTs and glimpsed the future of computers. By accepting the CRT’s limitations as creative constraints, Crouwel redesigned the alphabet with straight, quick lines. Crouwel released New Alphabet in 1967. It was innovative. It was unreadable. But it made a statement. Principle: Be ahead of the curve and ahead of the criminals.
Previously: Colonel John A. Macready, Bausch & Lomb, and Ray-Bans. A little-known fact: Ray-Bans are safety goggles. You wouldn’t know it today. You can pay a couple hundred dollars to buy these as sunglasses from Luxottica. How Ray-Bans went from practical to luxury is a story with a lesson for developing implementation plans. Principle: Hand out Ray-Bans, not safety goggles.
One thing more: There’s a YouTube video on How Ray Ban Became the King of Sunglasses that’s worth checking out. One thing I didn’t mention in the Ray-Bans article: the invention of a technique for making impact-resistant lenses from molten glass made the sunglasses possible in the first place. It was a technical leap forward.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
The first computer screen font predated the personal computer by a decade.
The tech wasn’t about to cooperate. For those who weren’t around during the CRT (cathode-ray tube) screen days, here’s the thing. CRTs in the sixties refreshed slowly, updated even more slowly, couldn’t draw curves, and could barely draw a pixel. Any sane person would stay away from them.
Enter Wim Crouwel. Crouwel saw the possibility of CRTs and glimpsed the future of computers. By accepting the CRT’s limitations as creative constraints, Crouwel redesigned the alphabet with straight, quick lines. The resulting font, New Alphabet, displayed clearly on the limited screens. Crouwel released New Alphabet in 1967. It was innovative. It was unreadable. But it made a statement. New Alphabet informed the designers of the personal computer. It took a decade. But when the Apple II, Commodore PET, and TRS-80 hit in 1977, each computer featured a CRT screen and a fully readable font. The possibility Crouwel saw had come true.
With all the talk about cyber security constantly changing, we’re surprisingly slow at adopting new and innovative controls. We give the same excuses Wim Crouwel would have heard from his peers: the technology isn’t ready, it’s too hard, it’s too new. I recall running into this when deploying firewalls in the early 2000s. An excellent control was egress filtering. Most thought of firewalls as protecting traffic coming in. But by filtering traffic going out, we could stop malware and attackers from calling home. Most engineers didn’t want to do this because it was too hard. We did. And until most defenders adopted egress filtering, attackers didn’t bother working around it, so the simple control caught many a bad guy.
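For anyone who hasn’t worked with egress filtering, here is a minimal conceptual sketch of the idea in Python: deny outbound traffic by default and allow only a short list of known destinations. The hostnames, ports, and function names are hypothetical illustrations, not a configuration for any particular firewall.

```python
# Conceptual sketch of default-deny egress filtering (illustrative only).
# Hypothetical allowlist: outbound traffic is permitted only to known services.
ALLOWED_EGRESS = {
    ("proxy.internal.example", 3128),  # web traffic must go through the proxy
    ("smtp.internal.example", 25),     # mail leaves through the relay
    ("dns.internal.example", 53),      # name resolution via internal resolvers
}

def egress_allowed(dest_host: str, dest_port: int) -> bool:
    """Deny by default: allow only destinations explicitly on the list."""
    return (dest_host, dest_port) in ALLOWED_EGRESS

# Malware calling home directly to an arbitrary address gets dropped.
for host, port in [("proxy.internal.example", 3128), ("198.51.100.7", 443)]:
    verdict = "ALLOW" if egress_allowed(host, port) else "DENY"
    print(f"{verdict} outbound to {host}:{port}")
```

The point of the sketch is the default: anything not explicitly allowed, including an attacker’s command-and-control server, never gets out.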
Early adoption of a control — doing something right but rare — is super effective against casual attackers and commodity attacks. It may be easily bypassed by advanced attackers or sophisticated tools, but the majority of the time organizations face more common threats. The control continues to be effective until many have adopted it. Consider:
Example 1) Mac OS X on the Intel platform was more secure than Windows when it was released in 2006. Macs had 8% of the market share by 2014 and little malware. By 2019, the share of the desktop market running Macs had climbed to 17%. That same year, Windows had 5.8 malware detections per computer per year. Macs had nearly double that, at 11 malware detections per computer. Macs had great stopping power for thirteen years.
Example 2) Windows 10’s market share reached 25% by 2017. Windows 10 had a feature that automatically mounted disc image files like ISOs. This was a great new feature for phishers, because most spam filters blocked executables like EXEs. In May 2017, criminals started repackaging their malicious EXEs in ISO files and sending them on through. Sure, some organizations were filtering ISOs. But most weren’t, at least not until 2019. When spam filters finally caught up in April 2019, criminals simply switched from ISO to IMG image files. But for nearly two years, a simple ISO filter had stopping power.
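To make the stopping power concrete, here is a minimal sketch of extension-based attachment filtering in Python. The blocklist is a hypothetical example, not any specific mail gateway’s rule set; the point is how one added entry (.iso, and later .img) closes the gap described above.

```python
# Conceptual sketch of blocking risky attachments by file extension.
from pathlib import Path

# Hypothetical blocklist circa 2017: executables are caught, ISOs are not.
BLOCKED_EXTENSIONS = {".exe", ".js", ".vbs"}
BLOCKED_EXTENSIONS.add(".iso")    # the simple addition that worked until 2019
# BLOCKED_EXTENSIONS.add(".img")  # the next move once criminals switch again

def is_blocked(filename: str) -> bool:
    """Return True if the attachment's extension is on the blocklist."""
    return Path(filename.lower()).suffix in BLOCKED_EXTENSIONS

for name in ["invoice.exe", "invoice.iso", "invoice.img"]:
    print(name, "->", "blocked" if is_blocked(name) else "delivered")
```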
Example 3) One last example that’s near to my heart. When Microsoft Office 365 email launched in 2011, the early adopters quickly rolled out multi-factor authentication (MFA). Attacks reusing stolen credentials were easily blocked, stopping phishing for passwords. By 2019, MFA adoption on Office 365 email exceeded 20%. The criminals began to switch from trying to steal passwords to trying to steal the authentication tokens, thereby bypassing MFA altogether. Eight years. While MFA still has stopping power, the threats are beginning to adapt.
Wim Crouwel was a decade ahead of his time and his font never saw wide adoption. Though it did have a resurgence in popular culture in 1988, when Peter Saville and Brett Wickens used New Alphabet for Joy Division’s Substance album cover. Wide adoption wasn’t the point. Showing others the possibility of the new medium was, and at that, Crouwel succeeded.
When designing and implementing cyber security controls, Crouwel is an inspiration. The tech will not cooperate. The result won’t look normal. But doing something right but rare, adopting a security control ahead of the pack, has demonstrated stopping power. Because it’s right, it stops the common attacks. Because it’s rare, criminals aren’t incentivized to work around it. The early adopter strategy can give our organizations an advantage that lasts years.
Being ahead of the adoption curve is being ahead of the criminals.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.
IDEO has been at the center of many fundamental designs in computing history. This includes the simple and ubiquitous mouse.
Thought it was Apple? Think again. Steve Jobs came to a firm called Hovey-Kelley in the late seventies, a firm which would become IDEO in 1991. Jobs had a problem. The only other computer mouse in existence cost 16 times what people could afford. The mouse also broke frequently and was, well, ugly. None of this would work for the Lisa and Mac.
David Kelley (of David Kelley Design, Hovey-Kelley, and later one of the three founders of IDEO) assembled a team. Douglas Dayton worked on the frame. Jim Yurchenco was responsible for the mechanical design. Bill Dresselhaus, with his love of Art Deco, handled the packaging. The technology of the day was finicky and “required such precision that it probably couldn’t be mass-produced.” There were practical debates about the sound of the click, or the number of buttons. Each change required every other part to be redesigned to fit in the tiny space. But even in those early days, the firm that would become IDEO had a secret weapon.
Design thinking. IDEO refined it and popularized it. Design thinking is a way of problem solving and developing solutions that’s a departure from how we in IT have long done things. Consider the following five points of design thinking:
Empathize – observe and talk with the people we’re designing for
Define – think about the main problem we’re trying to solve
Ideate – brainstorm, mindmap, whiteboard, play
Prototype – build a possible solution
Test – sit down with the people and have them test the prototype
Now compare the design thinking steps to ITIL service design:
Service solution – think about requirements, deadlines, costs, budgets
Information systems and tools – think about the service portfolio, configuration management, capacity, and security
Technology and architecture – think about designs, plans, and processes to align IT policy and strategy
Design processes – think about the process model for operation and improvement
Measures and metrics – think about what we’ll measure to ensure the service is working
Notice what’s missing? People. I mean, ITIL practitioners will reply, “no, no, no. We have the 4P’s: Product, People, Process, and Partner.” Fair enough. But compare the two lists. People are not the focus. And to anyone who has been in the workforce as an enterprise end-user? It shows. We can feel it. Because people designing IT and IT security don’t think much about the people who’ll use it, the people who use it don’t think much about what we’ve designed.
Case in point: credentials. Research shows that people with more technical knowledge don’t take more steps to protect their data than people with basic knowledge (User Mental Models of the Internet and Implications for Privacy and Security). Most people know they should use a separate password for every app (91%). But most people reuse the same password anyway (66%). Most people know they should use MFA. But most don’t (66%). The problem isn’t one of awareness. (Source: LastPass and Security Ledger.) In not considering how regular people use and secure technology, we’ve created a situation where people simply opt out.
Enterprise IT is a lot like those original mice. The Xerox mouse, the one Apple copied, cost $400, or $1,200 in 2020 US dollars. Doug Engelbart’s mouse, the one Xerox copied, required a six-month training course to master the damned thing. That’s ITIL thinking. That’s the type of technology people will be aware of, but not take steps to use.
Design thinking, the focus on people and rapid prototyping, led to a mechanical mouse mechanism that would dominate mouse design for the next twenty years. The original Apple mouse was $25. (Adjusted for inflation, that’s $79, which is coincidentally the price Apple charges for the optical Magic Mouse in 2020.) A child could pick up the mouse and immediately use it. Most of my generation learned in grade school. It just worked, worked well, and worked at a fraction of the cost.
This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.