When you look at the FedEx logo, do you see an E and an X? Or do you see an arrow? Let’s look at the cognitive processes that drive what we see. I’ll give two tips on how to get more creative when designing security controls.
A friend of mine tweeted that he was up at 1 o’clock watching my videos with his daughter. Apparently, when you can’t sleep, my videos do the trick. She had questions. Here are my answers.
Secure360 2020 – Security happens where man meets machine. Or fails to happen, as we see all too often. Blame the users. They’ll click anything. Blame the developers. Half their code is riddled with vulnerabilities anyway. Blame the IT staff. You’d think they’d at least know better. But perhaps we’ve been placing the blame in the wrong place. What exactly happens where people and technology meet? At that moment, that very moment, what factors in human psychology and industrial design are at play? And suppose we could pause time for a moment. Suppose we could tease out those factors. Could we design a better experience, design a better outcome, design a better path to the future? This session explores these questions and identifies lessons the cybersecurity field can learn from industrial design.
Taking a day off work, I’m thinking about how work gets structured. Use standards such as the CIS Critical Security Controls, NIST SP 800-53B, and the National Initiative for Cybersecurity Education (NICE). Define what the team will do and, just as important, what the team will not do.
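One way to make that scope explicit is to write it down as data the team can review and version. Here is a minimal sketch, assuming a hypothetical team charter mapped to CIS Controls v8; the in-scope and out-of-scope split is invented for the example, not a recommendation.

```python
# Illustrative sketch only: a hypothetical security team's scope, expressed as data.
# Control numbers and names follow CIS Critical Security Controls v8; the split is an example.

TEAM_CHARTER = {
    "in_scope": {
        "CIS Control 1": "Inventory and Control of Enterprise Assets",
        "CIS Control 5": "Account Management",
        "CIS Control 8": "Audit Log Management",
    },
    "out_of_scope": {
        "CIS Control 11": "Data Recovery",                  # assumed owned by IT operations
        "CIS Control 17": "Incident Response Management",   # assumed owned by the SOC
    },
}

def covers(control: str) -> bool:
    """Return True if the team has explicitly taken ownership of the control."""
    return control in TEAM_CHARTER["in_scope"]

if __name__ == "__main__":
    for scope, controls in TEAM_CHARTER.items():
        print(scope)
        for number, name in controls.items():
            print(f"  {number}: {name}")
```

The point isn’t the code; it’s that what the team will and will not do becomes something you can point at, argue over, and change deliberately.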
The common belief in CyberSecurity is that end-users want security that’s all but invisible. But studies are showing a surprising fact: people want to be involved and want to put in some effort. Let’s take a closer look at the IKEA Effect and Effort Justification cognitive biases and see if we can’t tease out what’s going on.
With apologies to Yves Saint Laurent, who once said that fashion fades but style is eternal: while we’re all caught up in what’s changing, it’s imperative to look at the aspects of CyberSecurity that are, if not eternal, certainly long-lived.
People are stressed about their jobs, their health, their families, and their friends. Meanwhile, they’re working from home with weakened security. The two make a dangerous combination.
Sheffield has leaked its automatic number-plate recognition (ANPR) dashboard, including records of where cars went and when. This raises the topic of data toxicity: data becomes dangerous when it is combined with new data, or when the combined data is put to new uses. Here are three things you can do about toxic data.
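The excerpt doesn’t spell out the three things, so take this as a minimal sketch of just one common approach, data minimization with enforced retention, under assumed field names (`plate`, `camera_id`, `seen_at`) and an assumed 30-day window.

```python
import hashlib
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION = timedelta(days=30)   # assumed retention window for the example
SALT = b"rotate-me-regularly"    # placeholder; a real deployment would manage this secret

def detoxify(record: dict, now: datetime) -> Optional[dict]:
    """Reduce the toxicity of a single ANPR record.

    - Drops records older than the retention window (limit how long you keep it).
    - Replaces the plate with a salted hash (limit what you keep).
    - Coarsens the timestamp to the hour (limit how precisely you keep it).
    """
    seen_at = record["seen_at"]
    if now - seen_at > RETENTION:
        return None  # expired: delete rather than accumulate
    return {
        "plate_hash": hashlib.sha256(SALT + record["plate"].encode()).hexdigest(),
        "camera_id": record["camera_id"],
        "seen_hour": seen_at.replace(minute=0, second=0, microsecond=0),
    }

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    sample = {"plate": "AB12 CDE", "camera_id": "cam-041",
              "seen_at": now - timedelta(days=2)}
    print(detoxify(sample, now))
```

Less data, kept for less time, at lower precision, is data that can do less damage when the dashboard leaks.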
(TEK Keynote) We turn to DevOps for speed. We turn to Cloud for flexibility. We adopt faster, leaner, more collaborative processes to drive change. And then? We turn to information security for protection. But can we secure the technology without slowing the pace? This session presents an entirely fictional development organization adopting DevOps. We will discuss which traditional software security processes work, and which ones fail entirely. Awareness training, muscle memory, and culture shifts will all be brought together. The presentation will conclude with takeaways for applying security to your DevOps team without slowing it down.