
Archive for the ‘Architecture’ Category

Security is not the control, it is the context – Design Monday


Seeing Is Forgetting the Name of the Thing One Sees. A fantastic title, right? I was having a coffee meeting with a new product designer a few months back. As can happen, I was pretty wound up, going on about the need for usability and human-centric design in cybersecurity. She told me, “You need to read Seeing Is Forgetting the Name of the Thing One Sees.”

The book covers conversations Lawrence Weschler, the author, had over three time periods with Robert Irwin. It gets to the heart of Irwin’s philosophy and approach. Irwin began as an abstract painter in the 1960s. He painted lines. He painted dots. But when displaying his work, Irwin noticed that the way the art was experienced was influenced by factors outside of his paintings. Any of us who have seen optical illusions with colors and lines understand this instinctively and likely think nothing of it. But to Irwin, who was obsessed with the experience to the point of banning photography, this simply wouldn’t do. Irwin took to replastering and repainting walls, sometimes whole studios, where his art was displayed.

Robert Irwin insisted on controlling the entire experience, and this led to the realization that the surroundings were just as important as the artwork itself.

We’ve been slow at coming to a similar realization in cybersecurity. Consider the Web application. A thousand things have to go right for it to work, and a thousand things can go wrong from a security perspective. OWASP framed these issues into a Top 10 list. This simplified the work of developing a secure Web app. However, OWASP initially focused solely on the app itself. Of the six releases since 2003, only the last two included the walls and studios, the vulnerable server components, on the OWASP Top 10. We’re slow to recognize the importance of the surroundings.

Robert Irwin’s obsession with the surroundings transformed the artist from painter to landscaper. He has gone on to produce more than fifty large-scale projects since 1975.

From the perspective of a designer, we must consider how the new capability fits into the existing cybersecurity portfolio and, more broadly, into the organization. We have to replaster the walls. We must make sure it fits in the studio. From the defensive perspective, this makes a lot of sense. A criminal faced with a strong control will look at the environment for other weaknesses and take advantage of gaps. From the usability perspective, Robert Irwin reminds us that how something is seen is as much about the thing as it is about the overall experience.

Security is not the control itself. Security is the surroundings.

Robert Irwin’s Double Blind exhibit at the Vienna Secession, Austria.
Photography: Philipp Scholz Ritterman

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Mies and IBM Plaza: Knowing When More is More – Design Monday


The building came into view. My vantage point was on the Chicago River. It was Valentine’s Day. Chicago natives had warned us about the cold February winds. But there my wife and I were, on a river tour of Chicago’s architecture. Frozen to the ship’s deck, we looked up at the IBM Plaza.

Ludwig Mies van der Rohe designed the building in the 1960s. Mies came from the famed Bauhaus school, another of my favorite sources of inspiration. In fact, Mies was the last director of the Bauhaus. He moved from Berlin to Chicago in 1937 to head the architecture department of the Illinois Institute of Technology. There’s a direct line from Bauhaus to the Second Chicago School of architecture, specifically in minimizing ornamentation in favor of emphasizing the building materials themselves.

It was this modernism which drew IBM to Mies van der Rohe. But there was a problem. Many, in fact, with the building IBM wanted. Computing technology of that age was notoriously hot and power-hungry. Moreover, computer engineers were at a premium, which meant a large workforce with little patience for waiting on elevators. Every minute counted. Moving to the ground, the lot was oddly shaped. Triangular. It sat partially atop a train line, which restricted the foundation needed for a skyscraper. And to top it off, the site had an agreement to provide storage for the Sun-Times. That’s a lot.

“Less is more” was popularized by Mies van der Rohe. Boil down architectural requirements to the essentials. In cybersecurity, we’ve embraced less is more. You see it in concepts like least privilege, least trust (aka Zero Trust), economy of mechanism, and limited security blast radius. You see it in my security principles, like when I discuss building Roombas, not Rosies. Less is more is a reminder to take a minimalist approach.

Even from the Chicago River, you can feel the minimalism of the IBM Plaza. The exposed vertical beams, the glass and steel materials on full display. Less is more. But it’s more than it seems. The building has more than double the elevators of a comparable building. The cooling system is similarly over-powered. Designed by C.F. Murphy, the HVAC is tuned for 1970s-era computing. Mies also made several floors taller to support raised flooring, and reinforced them to support the weight. The building is subtly shifted back to make use of the lot, resting the weight on a strong foundation. This feature explains the open pillars in front and allowed Mies to neatly avoid the question of the railway. Less is more? If anything, much of the IBM building is overdone.

Less is more is not a call for doing less. It is a reminder to save our energies to do more where it counts. It is a reminder to pour the savings into solutions for the problem at hand. When we save resources for priorities, less isn’t loss.

IBM moved into IBM Plaza in 1971. For more than three decades, the building was the Chicago office of the tech giant. The building was declared a Chicago Landmark on February 6, 2008, and added to the National Register of Historic Places on March 26, 2010. Today, the building at 330 North Wabash is known as the AMA Plaza. It stands as a testament to Ludwig Mies van der Rohe’s ability to balance less and more.

The design lesson: More of what matters is more.

The floating foundation of 330 North Wabash, Chicago. Photography by Ryan Cramer.

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

Build Roombas not Rosies – Design Mondays


The Jetsons debuted this month in 1962. The cartoon depicted a family living a hundred years in the future, 2062. The swooping architectural style, with the quite fun name Googie, serves as the visual language of the future in shows from The Incredibles to Futurama. The everyday gadgetry in the Jetsons foreshadows today’s drones, holograms, moving walkways and stationary treadmills, flat-screen televisions, tablet computers, and smart watches.

Remember, color television was on the very cutting edge of technology when The Jetsons debuted. This list is impressive. But that smart watch? That last one was no accident.

The dominant smart watch in 2020 is the Apple Watch, designed by Marc Newson and Jony Ive. In an interview with the New York Times, Marc Newson explained that his fascination with the Jetsons led him into the world of design. “Modernism and the idea of the future were synonymous with the romance of space travel and the exotic materials and processes of space technology.” Newson’s streamlined aesthetic was influenced by his Jetsonian vision of the future. I imagine the first time Newson FaceTimed Jony Ive on an Apple Watch, they felt the future had finally arrived.

Designing the future has constraints that imagining the future lacks.

For starters, people and culture constrain innovation. Consider George and his flying car, Elroy and his jetpack, and space tourism. All these are technically feasible in 2020. But I wouldn’t trust a young boy with a jetpack, nor would most of us have money for a trip to the moon. Another constraint is technical complexity. Sure, we have talking dogs. But the reality is much different from the Jetson’s Astro. And yes, we have AI and robotics. But Siri is no R.U.D.I.

When designing future security capabilities and controls, we need to identify and quantify the constraints. One technique for this is the Business Transformation Readiness Assessment. Evaluate factors such as the following (a small scoring sketch follows the list):

  • Desire, willingness, and resolve 
  • IT capacity to execute
  • IT ability to implement and operate
  • Organizational capacity to execute
  • Organizational ability to implement and operate
  • More factors here: https://pubs.opengroup.org/…/chap26.html
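
To make the ranking step concrete, here is a minimal sketch of how such an assessment might be tallied. The factor weights, the 1-to-5 maturity scores, and the candidate capabilities below are all hypothetical illustrations, not values from the TOGAF method:

```python
# Hypothetical readiness scoring for candidate security capabilities.
# Factor names follow the readiness assessment above; the weights and
# the 1-5 maturity scores are illustrative assumptions, not TOGAF values.

FACTOR_WEIGHTS = {
    "desire_willingness_resolve": 0.30,
    "it_capacity_to_execute": 0.20,
    "it_ability_to_implement_operate": 0.20,
    "org_capacity_to_execute": 0.15,
    "org_ability_to_implement_operate": 0.15,
}

# Two hypothetical candidate capabilities, scored 1 (low) to 5 (high).
candidates = {
    "mfa_rollout": {
        "desire_willingness_resolve": 5,
        "it_capacity_to_execute": 4,
        "it_ability_to_implement_operate": 4,
        "org_capacity_to_execute": 3,
        "org_ability_to_implement_operate": 4,
    },
    "ml_threat_analytics": {
        "desire_willingness_resolve": 3,
        "it_capacity_to_execute": 2,
        "it_ability_to_implement_operate": 2,
        "org_capacity_to_execute": 2,
        "org_ability_to_implement_operate": 1,
    },
}

def readiness(scores: dict) -> float:
    """Weighted readiness: higher means more organizational momentum."""
    return sum(weight * scores[factor] for factor, weight in FACTOR_WEIGHTS.items())

# Rank what's feasible: act first where the momentum is.
for name in sorted(candidates, key=lambda n: readiness(candidates[n]), reverse=True):
    print(f"{name}: {readiness(candidates[name]):.2f}")
```

The arithmetic is trivial by design. The value is in forcing an explicit, comparable statement of readiness before committing to a capability.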

With this evaluation, we can rank what’s feasible against what’s needed. We can act on areas with momentum (desire, willingness, resolve) and build capabilities that can be maintained. But! There’s one additional step.

We don’t need a robot to push around a vacuum when we have a robot vacuum. We don’t need a full AI/ML deep learning platform when we can have a well-tuned SIEM. Implement security in a minimum viable way.

Identify the constraints. Select the security capability the organization is most ready for. Then build Roombas, not Rosies.

Rosie the Robot, The Jetsons, Photography by Brilux.

This article is part of a series on designing cyber security capabilities. To see other articles in the series, including a full list of design principles, click here.

CSO: Implementing Zero Trust


Having a vision and a specific use case help get companies started toward Zero Trust implementation.

Excerpt from: Zero Trust Part 2: Implementation Considerations

A piece of advice at the outset: “Don’t do too much too fast,” says Wolfgang Goerlich, CISO Advisor with Cisco. “Have specific goals, meaningful use cases, and measurable results.”

To build momentum, start with a series of small Zero Trust projects with deliverable milestones, and demonstrate success every few months by showing how risk has been reduced.

“We need to show the board progress. With specific initiatives aimed at specific use cases, we can demonstrate progress towards Zero Trust,” Goerlich says. “You build momentum and a track record for success.”

Read the full article: https://www.csoonline.com/article/3537388/zero-trust-part-2-implementation-considerations.html


This post is an excerpt from a press article. To see other media mentions and press coverage, click to view the Media page or the News category.

CSO: Demystifying Zero Trust


Despite the fact that Zero Trust has been around for a decade, there are still misconceptions about it in the marketplace.

Excerpt from: Zero Trust Part 1: Demystifying the Concept

Zero Trust is not one product or solution. Better to think of it as an approach, says Goerlich.

“Zero Trust is trusting someone to access something from somewhere,” he says. “Is it an employee, an application, a device? What is it accessing? How can we determine if we trust this request? At the end of the day, Zero Trust means providing a consistent set of controls and policies for strong authentication and contextual access.”

The term was coined by Forrester Research in 2010. It was established as an information security concept based on the principle of “never trust, always verify.” Since then, the National Institute of Standards and Technology (NIST) has produced comprehensive explanations and guidelines toward the implementation of a Zero Trust architecture.

“NIST has a draft standard that dictates their view of Zero Trust — what the principles are, and what an architecture looks like,” Goerlich says. “The U.K. NCSC has done the same. Zero Trust has matured, and the need for it is now in sharp relief due to changes in the market and the way we use technology.”

Read the full article: https://www.csoonline.com/article/3537189/zero-trust-part-1-demystifying-the-concept.html

Wolf’s Additional Thoughts

I am leading a series of Zero Trust workshops this year. One concept I always stress: we’re applying existing technology to a new architecture. If you think back to when Role-Based Access Control (RBAC) was first being standardized, we used off-the-shelf X.509 directories and existing Unix/Windows groups to do it.

Now of course, better products offer better solutions. But the point remains. The application of existing standards to realize the principles of Zero Trust brings the concept beyond hype and into reality. Moreover, it makes it much easier to have confidence in Zero Trust. There’s no rip-and-replace. There’s no proprietary protocol layer. We’re simply taking authentication and access management to the next logical level.
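
As a rough illustration of that point, here is a minimal sketch of “trusting someone to access something from somewhere” as a contextual policy check. The request fields, the policy table, and the rules are all hypothetical, meant only to show existing concepts (directory groups, strong authentication, device posture) composed into a Zero Trust decision:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str             # someone: the authenticated identity
    group: str            # existing directory group (the RBAC parallel)
    resource: str         # something: what is being accessed
    network: str          # somewhere: "corporate", "home", or "unknown"
    device_managed: bool  # is the device enrolled and healthy?
    mfa_passed: bool      # did strong authentication succeed?

# Hypothetical policy table: which group may reach which resource,
# and whether that resource demands a managed device.
POLICY = {
    "payroll_app": {"group": "finance", "managed_device": True},
    "wiki":        {"group": "staff",   "managed_device": False},
}

def evaluate(req: AccessRequest) -> bool:
    """Never trust, always verify: check every request in context."""
    rule = POLICY.get(req.resource)
    if rule is None:
        return False  # default deny for unknown resources
    if not req.mfa_passed:
        return False  # strong authentication is table stakes
    if req.network == "unknown":
        return False  # no decision without knowing the somewhere
    if req.group != rule["group"]:
        return False  # role check, reusing existing directory groups
    if rule["managed_device"] and not req.device_managed:
        return False  # sensitive resources demand a trusted device
    return True

# A finance user, strongly authenticated, but on an unmanaged home laptop:
request = AccessRequest("alice", "finance", "payroll_app", "home",
                        device_managed=False, mfa_passed=True)
print(evaluate(request))  # False: context, not just identity, decides
```

Note what is old and what is new here. The group check is plain RBAC against existing directories; the device and network checks are the added context. That layering is the next logical level in practice.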

Want to know more? Watch my calendar or subscribe to my newsletter to join an upcoming workshop.


This post is an excerpt from a press article. To see other media mentions and press coverage, click to view the Media page or the News category.

Dark Reading: OS, Authentication, Browser & Cloud Trends


New research shows cloud apps are climbing, SMS authentication is falling, Chrome is the enterprise browser favorite, and Android leads outdated devices.

Excerpt from: OS, Authentication, Browser & Cloud Trends

Application integration is up across most key categories. The number of customers per cloud app is up 189% year-over-year, and the number of authentications per customer per app is up 56%.

The massive spike in cloud applications means any given employee has at least two or three cloud apps they use to do their jobs, says Wolfgang Goerlich, advisory CISO for Duo Security. “It was a big explosion of shadow IT,” he adds. “It really got away from a lot of the organizations.” People often use the same applications for personal and business use, driving the need for businesses to enforce their security policies for cloud-based applications and resources.

Read the full article: https://www.darkreading.com/cloud/security-snapshot-os-authentication-browser-and-cloud-trends/d/d-id/1335262

Wolf’s Additional Thoughts

IT history repeats itself.

The organization moves slowly to provide employees with tools and technology. Consumer tech fills in the gap outside of the office. People get savvier and more experienced with tech. People innovate with what they know, to get done what they need to get done.

The organization notices people doing things in an innovative yet ad hoc way. Work is done to standardize tech use. More work is done to secure the tech use. The wild ways of people, the wilderness of shadow IT, are tamed and brought into the light.

We’re at this point now. That’s what the numbers show. But tamed IT is slower than shadow IT. If the past has taught us anything, it is that the cycle will repeat.


This post is an excerpt from a press article. To see other media mentions and press coverage, click to view the Media page or the News category.

Cloud adoption and use


I am tremendously in favor of virtualization, a staunch proponent of cloud computing, and I’d automate my own life if I could. After all, we dedicated most of last year to investigating and piloting various cloud backup solutions. But take a peek at my infrastructure and you might be surprised.

Why is my team still running physical servers? Why are we using so few public resources? And tape, really?

I am not the only one who is a bit behind on rolling out the new technology. Check out this study that came out in Forbes this week. “The slower adoption of cloud … reflects a greater hesitancy … remain conservative about putting mission-critical and customer data on the cloud. Regulations … may explain much of this reluctance. The prevalence of long-established corporate data centers with legacy systems throughout the US and Europe … may be another factor. Accordingly, the study confirms that overcoming the fear of security risks remains the key to adopting and benefiting from cloud applications.”

I have a sense that cloud computing, at least in the IaaS sense, is roughly where virtualization was circa 2004. It is good for point solutions. Some firms are looking at it for development regions. Now folks are beginning to investigate cloud for disaster recovery. (See, for example, Mark Stanislav’s Cloud Disaster Recovery presentation.) These low-risk areas enable IT management to build competencies in the team. A next step would be moving out tier 3 apps. A few years after that, the mission-critical tier 1 apps will start to move. This will happen over the next five to eight years.

This logical progression gives the impression that I see everything moving to the cloud. As Ray DePena said this week, “Resist the cloud if you must, but know that it is inevitable.” I can see that. However inevitable cloud computing is, like virtualization, it does not fit all use cases.

Why are some servers still physical? In large part, it is due to legacy support. Some things cannot be virtualized or unplugged without incurring significant costs. In some cases, this choice is driven by the software vendor. Some support contracts still mandate that they cover only physical servers. Legacy and vendors aside, some servers went physical because the performance gains outweighed the drawbacks. Decisions, decisions.

The majority of my environment is virtualized and is managed as a private cloud. Even there, however, there are gaps. Some areas are not automated and fully managed due to project constraints. We simply have not gotten there yet. Other areas probably will never be automated. With how infrequently an event occurs, and how little manual work is needed, it does not make sense at my scale to invest the time. This is a conscious decision on where it is appropriate to apply automation.

Why are we not using more public resources? Oh, I want to. Believe me. Now I am not keen on spending several weeks educating auditors until cloud reaches critical mass and the audit bodies catch up. But the real killer is cost. For stable systems, the economics do not make sense. The Forbes article points out that the drivers of public cloud are “speed and agility — not cost-cutting.” My team spent ten months in 2011 trying to make the economics work for cloud backup. Fast forward half a year, and we are still on tape. It is an informed decision based on the current pricing models.
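
For a sense of why the math failed for stable systems, here is a back-of-envelope sketch. Every figure is a hypothetical assumption for illustration, not our actual quotes and not any vendor’s real pricing:

```python
# Hypothetical backup cost comparison over three years.
# All figures are illustrative assumptions, not real 2011-2012 pricing.

protected_gb = 20_000          # 20 TB of protected data (assumed)
monthly_change = 0.10          # 10% monthly change rate (assumed)
years = 3

# Cloud: pay per GB stored per month, plus transfer for incrementals.
storage_rate = 0.10            # $/GB-month (assumed)
transfer_rate = 0.05           # $/GB transferred (assumed)
cloud_cost = years * 12 * (protected_gb * storage_rate
                           + protected_gb * monthly_change * transfer_rate)

# Tape: up-front library and drives, plus media and offsite rotation.
tape_capex = 25_000            # library and drives (assumed)
tape_media_year = 3_000        # cartridges (assumed)
tape_offsite_year = 4_000      # courier and vaulting (assumed)
tape_cost = tape_capex + years * (tape_media_year + tape_offsite_year)

print(f"Cloud over {years} years: ${cloud_cost:,.0f}")  # $75,600
print(f"Tape over {years} years:  ${tape_cost:,.0f}")   # $46,000
```

Under assumptions like these, steady per-GB-month charges on a large, stable data set outrun tape’s up-front cost within a few years. Fast-changing or short-lived workloads tilt the math the other way, which is exactly the speed-and-agility point.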

Is cloud inevitable? The progression of the technology most surely is, as is the adoption of the technology in areas where it makes sense. The adoption curve of virtualization gives us some insight into the future. Today, there are successful firms that still run solely on physical servers with direct attached storage. Come 2020, as inevitable as cloud computing is, it is equally inevitable that there will be successful firms still running on in-house IT.

Many firms, such as mine, will continue to use a variety of approaches to meet a variety of needs. Cloud computing is simply the latest tactic. The strategy is striking the right balance between usability, flexibility, security, and economics.

Wolfgang

Side note: If you do not already follow Ray DePena, you should. He is @RayDePena on Twitter and cloudbender.com on the Web.

Peer Incites next week


I will be on Peer Incites next Tuesday, March 6th, for a lunch time chat on team management. The talk is scheduled for 12-1pm ET / 9-10am PT.

DevOps — the integration of software development and IT operations — is a hot topic these days. In my current role, I took on IT operations in 2008 and took on software development in 2010. I have been driving the combined team using a value-proposition lens: the nexus of passion, skill sets, and business value. Add to this my favorite topic, training and skill hops, and we get a winning mix for leading a productive DevOps team.

I will dig into the nuts-and-bolts next Tuesday. Details are below. Hope you can join us.

Wolfgang


Mar 6 Peer Incite: Achieving Hyper Productivity Through DevOps – A new Methodology for Business Technology Management

By combining IT operations management and application development disciplines with highly-motivating human capital techniques, IT organizations can achieve amazing breakthroughs in productivity, IT quality, and time to deployment. DevOps, the intersection of application development and IT operations, is delivering incredible value through collaborative techniques and new IT management principles.


More details at:
http://wikibon.org/wiki/v/Mar_6_Peer_Incite:_Achieving_Hyper_Productivity_Through_DevOps_-_A_new_Methodology_for_Business_Technology_Management

Comments on Cloud computing disappoints early adopters


Symantec surveyed several businesses to find out how they felt about cloud computing. The standard concerns about security were expressed. There are still no concrete statistics on the difference between the threat exposure of in-house IT and that of public cloud IT. The concern about expertise surprises me, however, as managing a cloud environment is only slightly different from managing an enterprise data center. I have a hunch that it may be IT managers protecting their turf by claiming their guys don’t have the expertise, but I may be off. So what’s going cloud? Backups, security, and other non-business apps. No surprise there. Give it a few more years yet.

“While three out of four organizations have adopted or are currently adopting cloud services such as backup, storage and security, when it comes to the wholesale outsourcing of applications there is more talk than action, Symantec found. Concerns about security and a lack of expertise among IT staff are the main factors holding companies back, according to the survey of 5,300 organizations …”

Cloud computing disappoints early adopters:
http://www.reuters.com/article/2011/10/04/us-computing-cloud-survey-idUSTRE7932G720111004

Private clouds, public clouds, and car repair


I am getting some work done on one of my cars. I never have any time. I rarely have any patience. And occasionally, I have car troubles. So into the dealership I go.

Every time, I hear from my car-savvy friends and coworkers. The dealership takes too long. The dealership costs too much. If there is anything custom or unique about your vehicle, it throws the dealership for a loop.

Sure. Doing it yourself can be faster and cheaper. But if, and only if, you have the time, tools, and training. Short of any of these three, the dealership wins hands down. If you are like me, then you have no time and no tools more complex than pliers and a four-bit screwdriver set.

What does this have to do with cloud computing? Well, it provides a good metaphor for businesses and their IT.

Some businesses have built excellent IT teams. Their teams have the time to bring services online, and to enable new business functionality. These are the businesses that equip their IT teams with the tools and provide the training. Hands down, no questions asked, these teams will deliver solutions with higher quality. These IT teams can do it in less time and for less cost.

Other businesses have neglected IT. These are the teams that are told to keep the lights on and maintain dial-tone. Their IT systems are outdated. Possibly, their personnel have outdated skill sets. It makes as much sense for these internal IT teams to take on infrastructure projects as it does for me to change out my transmission. The costs, efforts, and frustration will be higher. The quality? Lower.

These are two ends of the spectrum, of course. Most IT teams are a mix. They are strong in some areas, and weak in others.

I suggest we play to our strengths. Businesses look to enable new functionality. Like with car repairs, we can step back and consider. Does our team have the time, tools, and training in this area? What will bring the higher quality and lower costs? That’s the way to decide the build-versus-buy and private-cloud-versus-public-cloud questions.