Claiming to be Anton Chernoff, one of the (DEC) project's lead developers, I placed a simple phone call to the system manager. I claimed I couldn't log in to one of “my” accounts and was convincing enough to talk the guy into giving me access and allowing me to select a password of my choice.
Something stands out to me here. Without an account name and password, Mitnick couldn't have gotten in, and the way he acquired those credentials was social engineering. Social engineering, in a cybersecurity context, is all about fooling human beings into helping you acquire access to computer systems you aren't allowed to have. The specific kind of social engineering Mitnick used is called vishing, or voice phishing: using phone calls to impersonate a trusted party, such as DEC developer Anton Chernoff, and coax out information that can facilitate a cyberattack. Vishing is one category of phishing, in which media such as phone calls, text messages, web pages, emails, or social media messages are used to impersonate trusted entities and gain malicious access to computers. All kinds of phishing, including vishing, are common social engineering attacks. Mitnick exploited human psychology. The Art of Deception, indeed.
Mitnick started learning social engineering when he was very young. In the mid-1970s, when he was 12, he wanted to ride Los Angeles public transit for free. So he dumpster dived for unused bus transfer slips and tricked a bus driver into giving him a ticket punch by saying he needed it for a school project. With those in hand, young Kevin Mitnick was able to spoof bus transfers for free rides. But he couldn't have done it without social engineering the bus driver.
Mitnick's successful Los Angeles bus exploit gave him the confidence to attempt social engineering in other ways. He went on to trick his way into DEC's computer system. After years of criminal investigations and a trial, he was convicted in 1988 and sentenced to a year in prison and three years of supervised release. By the early 1990s, toward the end of his supervised release, he conducted his second notorious cyberattack.
Mitnick social engineered his way into the voicemail system of Pacific Bell, a major telecommunications company in California. His techniques were very similar to those he used to penetrate DEC. Those in the know didn't consider Mitnick a master of computer science; rather, he was a clever conman. Eventually, Mitnick targeted a genuine master of computer science, Tsutomu Shimomura. Shimomura studied physics with the famous physicist Richard Feynman before pursuing computer technology research full time at the San Diego Supercomputer Center. Mitnick wanted access to Shimomura's work. He chose the wrong target this time, because Shimomura helped law enforcement investigate Mitnick's Pacific Bell breach and other criminal activities. The FBI arrested Mitnick in 1995, and he remained in prison until 2000.
After his release, Mitnick decided to use his skills in law-abiding ways. He wrote books, some of which were published by Wiley, and he started his own cybersecurity firm, Mitnick Security Consulting, LLC.
The Importance of a Strong Security Culture
The cyber threat actors who will try to harm your company could be glorified conmen like Mitnick or computer scientists as brilliant as Shimomura. Either way, the majority of cyberattacks involve social engineering at one point or another. A strong security culture hardens your organization against social engineering exploits by making your employees, contractors, and executives less likely to succumb to them. A strong security culture also encourages your workers to develop good habits in the ways they use computer technology, so your precious data assets are better protected.
A strong security culture doesn't stop at your IT department. Everyone from the janitors to the CEO must be a part of it because computer systems aren't used only by people with IT certifications. Even an authorized person entering your office could put your computer networks at risk.
One of the most important things you can do to make sure your company can thrive in our rapidly evolving cyber threat landscape is to establish and maintain a strong security culture. And that's what step 1 is all about. With this crucial step taken care of, the other seven steps in my book will be feasible. For a cybersecure business, start with people's behaviors and attitudes.
Let's start by demystifying the word hacker, shall we?
Hackers Are the Bad Guys, Right?
When most people hear the word hacker, they think of cybercriminals. Apparently, hackers are the bad guys. This misconception is reinforced not only in Hollywood movies and TV shows but also in the news: when cyberattacks are covered on TV news shows and in newspapers, magazines, and online news sources, the bad guys who perpetrate the crimes are called hackers. Those of us who promote a more accurate use of the word face an uphill battle with the public consciousness.
One of my favorite books of all time is Steven Levy's Hackers. It was published by Dell, Penguin, and O'Reilly in various editions between 1984 and 2010. That book is one of the best ways to learn about the history of actual computer hackers. Not long after ENIAC, the first proper electronic computer, was unveiled in 1946, hacker culture began to take shape; Levy covers the history of hacking from the 1950s onward.
Hackers are people who find new and innovative ways to use computer technology. Some of the people who became famous billionaires in the tech industry, such as Steve Wozniak, Steve Jobs, Bill Gates, and Mark Zuckerberg, started as hackers themselves. In fact, the street address of Facebook's Menlo Park, California, headquarters is 1 Hacker Way.
Hackers developed the computer technologies you use every day: the TCP/IP backbone of the modern internet; the Linux kernels inside the Android devices and Red Hat servers you interact with, whether or not you're aware of it; the GNU General Public License and MIT License, under which much of the open-source code you directly or indirectly use was published; and so on.
Hacking can produce useful new technological applications, but it can also be used harmfully. The general public seems to have latched onto the latter connotation of the word hacker at the expense of its original meaning.
Many computer programmers, cybersecurity professionals, software engineers, and other computer technology specialists call themselves hackers, in the spirit of the original meaning of the word. If someone innovates with computer technology, you can safely call them a hacker.
I'm an advocate of an organization called Hacking Is Not a Crime, led by my friends Bryan McAninch, Chloé Messdaghi, and Phillip Wylie. Wylie is my coauthor on the first book I wrote for Wiley, The Pentester Blueprint; the book you're reading right now is my debut solo work for Wiley. And no, Wylie isn't related to Charles Wiley, who founded this company back in 1807. But perhaps all of this illustrates how tight-knit the cybersecurity and hacker communities are: we tend to know each other quite well.
I'm an idea person within the cybersecurity community, so my contribution to Hacking Is Not a Crime's mission of promoting the positive use of the word hacker is to write mindfully and responsibly, both in my media work and in books like this one. In the many years I have been writing about cybersecurity and hacking, I have always referred to the people who use computer technology to do harm as cyberattackers, cybercriminals, or cyber threat actors. This distinction is a vital pillar of both cybersecurity culture and hacker culture.
Even if you're 100 percent businessperson and 0 percent computer geek, understanding this will help you work with cybersecurity professionals and foster a strong security culture.
What Is Security Culture?
Lifestyle and wellness writer Tim Ferriss once said, “Culture is what happens when people are left to their own devices.” There are all kinds of cultures in our world, from ethnic cultures and national