How many times have you read marketing propaganda for information security products that includes slogans that sound like the following?
- “Find out what’s lurking inside your system.”
- “With network security, if you’re not ahead of the threat, you’re cleaning up behind it.”
- “Your system could be infected right now.”
The difference between the first and the last example is a time span of almost thirty years, yet the tactics haven’t changed. Underlying all of these slogans is a theme of fear. Fear has been a prevalent marketing strategy in the personal computer industry since its inception. Ultimately, this fear is at least partially what gave rise to the information security industry as we know it today, and it’s exactly that same fear we must now continuously battle in order to actually build a more secure environment. Companies are so fearful of being breached that they are constantly looking for a quick and easy fix to solve all of their information security woes. And because companies are so fearful, vendors as well as those in security have an entry point into the boardroom or the manager’s budget. Enter Cyber Security FUD.
Chris Roberts, Geek In Residence, Hillbilly Hit Squad, calls the quest for a security panacea “blinky box syndrome”. By “blinky box”, Roberts means hardware supposedly built on whatever the current technological marketing craze is, such as Artificial Intelligence, Machine Learning, or Next-Generation, each of which is meant to be the newest “easy button” for InfoSec. Yet despite the billions of dollars spent by companies each year in an attempt to become more secure, we still managed to “lose” somewhere between two and eight billion records in 2017 alone. Clearly, blinky boxes cannot be the silver bullets their vendors claim. How, then, did we get to a place where vendors are continually able to market a new one successfully every year? Once again, I am reminded of the fear tactics used in marketing campaigns. But is fear always bad? Let’s take a look at the history of that fear in InfoSec, explore how it has shaped the industry, and consider whether its continued use will help or hinder us moving forward.
The Early Days of Cyber Security FUD
In the early days of computing, the terms “cyber security” and “information security” did not yet exist. Instead, the original goals of computer security focused on government risk assessment, policy, and controls. However, by the mid-1980s the ARPANET was expanding rapidly, the personal computer boom had begun, and companies were starting to use this thing called the Internet for communications, leading to concerns about security and privacy. Thus, the realm of “internet security” was born, and it wouldn’t be long before keeping internet-connected computers secure became big business.
The first significant security product to become mainstream for the PC was anti-virus software. Although the first PC virus, “Brain,” was written as a copy-protection tool and not to harm systems, it caused a widespread media response in January of 1986 after numerous people flooded the developers’ phone line with complaints that Brain had wiped files from their computers. The only remediation at that time was to format and reinstall, which was frustrating and time-consuming. Brain was also the catalyst for John McAfee to enter the anti-virus market: he reverse engineered its code in the hope that he could help individuals remediate their systems. Intending to profit only from corporate customers, he launched McAfee Associates at the end of 1987 and by 1990 was making $5 million a year. McAfee wasn’t the only successful anti-virus vendor in this realm; Symantec and Sophos also made their debuts in the late 80s. Although the prevalence of viruses and other forms of malware continued to slowly increase, by 1989 there were actually more anti-virus vendors than viruses.
The fear of being hit by a virus infection was also beginning to grip the US government. An article titled “Future bugs: Computer world dreading electronic ‘virus’ attack”, published in the Toronto Globe & Mail on August 5, 1986, described US government computer security experts using phrases such as “potentially devastating weapon” to describe a virus and stating further that “the ‘virus’ is a high technology equivalent of germ warfare.” To some degree, their anxiety was well founded, as only two years later the Morris Worm rapidly spread to and crashed roughly a tenth of all the computers on the internet at that time. It was the first worm to gain significant mainstream media attention, and it ultimately led to the formation of the first CERT (Computer Emergency Response Team) Coordination Center at Carnegie Mellon University for the purposes of research and responsible disclosure of software vulnerabilities. Moreover, during a subsequent 1989 hearing before Congress about the Morris Worm, John Landry, executive VP of Cullinet Software Inc., stated that “virus attacks can be life threatening. Recently a computer used in real time control of a medical experiment was attacked. If the attack had not been detected, a patient might have been injured, or worse.”
By 1990, corporate reliance on the internet and the accelerating propagation of new viruses were inexorably tied together, and the projected cost to the worldwide microcomputing community of removing malicious software was approximately $1.5 billion per year. At this point, the cost of either purchasing protective software at $5-10 per month per machine or hiring two additional staff members to triage infected systems at an estimated $120k-$150k per year seemed reasonable. Companies were finally realizing that something to combat the growing problem was absolutely necessary, thus giving rise to the beginnings of the internet security industry.
As more and more systems connected to the internet, fear of attacks from external networks grew. Enter the network firewall, which first appeared in the late 80s but became commercially available in the early 90s. The goal of a network firewall, which originally provided only basic packet filtering, was to provide a secure gateway to the internet for private networks and keep out unwanted traffic from external networks. As Frederick M. Avolio, a well-known early security consultant, observed, “Firewalls were the first big security item, the first successful Internet security product, and the most visible security device.” In other words, firewalls were essentially the first “blinky boxes” in the industry. For some time, firewalls were thought to be “virtually fail-safe protection”, but that all changed when Kevin Mitnick attacked the San Diego Supercomputer Center in December of 1994 by spoofing his address and using a TCP sequence prediction attack. So much for the first blinky box being a successful “easy” button!
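The basic packet filtering described above can be sketched in a few lines: each packet’s header fields are compared against an ordered rule list, and the first match wins. The following is a hypothetical toy in Python, not any vendor’s implementation; the policy, rule fields, and function names are invented for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Rule:
    action: str               # "allow" or "deny"
    src_prefix: str           # source address prefix; "" matches any source
    dst_port: Optional[int]   # destination port; None matches any port

def filter_packet(rules: List[Rule], src_ip: str, dst_port: int) -> str:
    """Return the action of the first rule matching the packet."""
    for rule in rules:
        if src_ip.startswith(rule.src_prefix) and rule.dst_port in (None, dst_port):
            return rule.action
    return "deny"  # default-deny: drop anything no rule explicitly permits

# Hypothetical policy: trust the internal 10.0.0.0/8 network,
# allow inbound mail (port 25) from anywhere, drop everything else.
policy = [
    Rule("allow", "10.", None),
    Rule("allow", "", 25),
]
```

Note that trust here rests entirely on the claimed source address, which is precisely the assumption Mitnick’s spoofing and sequence prediction attack exploited.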
Fear and Its Consequences
Fear, as we’ve seen, played a significant role in the evolution of what we now know as the information security industry. Fear of virus infection led to the development and sale of anti-virus products, and fear of malicious external attacks led to firewall technology and its sales. If we traced the pattern further, we’d likely see the same dynamic behind many other security offerings over the years, such as IDS/IPS or web application firewalls. However, fear is a tricky beast to use as a motivational tool. Research has shown that while fear appeals can be effective, it’s extremely difficult to gauge what level of fear is needed for success.
Provide too little fear and people are complacent, but present extreme amounts of fear and people are paralyzed. Furthermore, fear can also backfire, causing negative associations with a company or a product. All too often, fear appeals in our industry lean toward the provocative, and thus the typical response is over-reaction at best. My overall experience is that people need to believe they have a sense of agency with regard to computer security before they can be successfully motivated to act. If they think there is no way for them to stay safe, they suffer from paralysis and do nothing.
One of the worst types of fear-based marketing strategies is FUD – fear, uncertainty, and doubt. FUD is a tactic used by a company to propagate misinformation about a competing product or technology in order to discourage people from buying it. The earliest example was IBM’s 1970s advertising against its former employee, Gene Amdahl, and his line of computers. IBM claimed that his computers did not have cooling fans and thus were susceptible to overheating and data loss. In fact, Amdahl had explicitly designed his machines not to need a cooling fan, because the power supply was located outside of the main case. It was these tactics that led Amdahl himself to coin the term FUD. By the 80s, FUD was widely used by many other companies, such as Microsoft and Sun. As Rich Smith of Duo Security said in a recent keynote, “The security industry generates FUD in order to sell hope.”
Fear also spawned the image of the lone figure in a black hoodie, mask, and gloves hunched over a keyboard that is so pervasive in the marketing materials of many technology companies. This image has become synonymous with the word “hacker”, which is unfortunate because it implies that a hacker is always someone with malicious intent. That is simply not true. Instead, I believe it is more accurate to think of hacking as “the art of understanding how computers work, rather than how you are told they ought to work.” The same can be said of almost any device: those engaged in that art of understanding are the “hackers” of their given industries. Thus, hackers are people who are inquisitive and want to understand how something works, while criminals are people who use hacking techniques with malicious intent. In her October 2018 blog post “Infosec Has an Image Problem”, Hafsah Mijinyawa of Duo notes that, “In large part due to mainstream media, the idea of security often becomes entangled with fictional concepts of who the people in the world of security are and what the data battlefield looks like.” As we often fear what we don’t understand, it’s not too surprising that the mainstream media paints this negative picture of the hacker community.
Not only does the InfoSec industry have an image problem in terms of marketing, but we also suffer from a far more fundamental flaw – one of consistency. Consistency is typically seen as something positive; in this case, however, the consistency lies in handling information security the same way over and over again, typically as an afterthought. The basics, like knowing who has what data, are ignored; fear is encouraged such that new blinky boxes are purchased annually as quick fixes; and we often work against our users instead of with them. As previously mentioned, billions of dollars have been spent in an attempt to make organizations more secure, only to have even more records “lost” each year. Just as critical is the fact that despite existing for over fifty years, computer science degree programs continue to turn out graduates who have no training in the basics of security. Why, then, are we so surprised when new security flaws are discovered and exploited, and data is exposed as a result? Our fundamental flaw of consistency thus more closely resembles the often misattributed Einstein quote: “The definition of insanity is doing the same thing over and over again and expecting different results.”
Where do we go from here?
The way that fear is used throughout our industry must change. Plain fear must be replaced by healthy skepticism and nuanced learning. Healthy skepticism involves questioning what arrives in email and what is posted on web sites, along with better awareness of the risks faced each day. Nuanced learning involves teaching our users to be our partners in online safety, using language they can understand and concepts they find relatable. It should also happen with regularity, as opposed to once-a-year “User Awareness Training”. Pay attention to those around you and change the “us” vs. “them” perception that information security professionals often perpetuate. Take the time to listen to the concerns of corporate staff who are not experts in IT/InfoSec and provide honest responses that avoid condescension in words or tone. Only by doing so can we begin to change the negative mentality that implies “they” are somehow less important than we are. Furthermore, users given actionable information are far more likely to want to act and far less likely to be paralyzed by fear.
Within the ranks of InfoSec itself, we must also replace plain fear with healthy skepticism. Be skeptical of any marketing materials from vendors that perpetuate fear, and speak out against the image they have built, including the verbiage they use. For example, focusing on scary-sounding lingo such as “advanced persistent threat” does nothing but drive fearful customers to the marketplace looking for an easy solution that does not exist. We must stop looking for the next blinky box to help “secure all the things” and return to the basics. Returning to the basics involves knowing where the data lives, who is using it, and what kind(s) of data exist within the environment. But that’s not all – you also must know where all of your assets actually are. Yes, this is a tedious task that no one wants to do, but it’s an absolutely critical first step. Remember, too, that especially in a BYOD world there are no longer any perimeters, and security plans must be adjusted accordingly. Remove the easy ways into the network (why, hello, RDP!) and find a way to monitor logs efficiently. Focusing on the basics rather than on abject fear will ultimately give you a better bang for your buck than any expensive gizmo on your network. And, as with all of these steps, don’t let the goal of perfection become the enemy of good.
The basics also involve fixing simple things like XSS and SQL injection, which continue to plague web applications and persist on the OWASP Top Ten list. Preventing these vulnerabilities requires developers who understand how to write secure code and how to spot insecure code written by others. Here, fear plays a slightly different role in that it involves fear of consequences. Developers are often under the gun, expected to produce significant amounts of code and/or complex code in a relatively short time. Not surprisingly, then, they are also haunted by the fear of potential shame or humiliation should any of their code lead to a breach. Unfortunately, because security is not integrated into most development training, including undergraduate computer science programs, it is often an afterthought, thereby requiring additional time and effort to address. As Sarah Zatko observes, “Although security is integral to just about every computer science topic, curricula tend to treat it separately, often making it an advanced elective for interested seniors or graduate students.” She further states, “Rather than addressing security after the foundations of computer science are laid, curricula should integrate it into areas such as networking, programming languages, and OSs. Even introductory courses need applied security content.” Certainly, training developers to integrate security from the ground up is a more measured approach than scaring people into buying blinky boxes.
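To make the SQL injection point concrete, here is a minimal sketch of why parameterized queries defeat injection where string concatenation does not. It uses Python’s standard-library sqlite3 module; the table, data, and payload are invented for illustration.

```python
import sqlite3

# Set up a throwaway in-memory database with one row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "' OR '1'='1"  # a classic injection payload

# Vulnerable: attacker input is concatenated into the SQL text,
# so the WHERE clause collapses into an always-true condition.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: the driver binds the value as data, never as SQL syntax.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('alice',)] -- the injection returned every row
print(safe)        # []           -- the payload was treated as a literal string
```

The fix costs nothing at runtime and is a one-line habit; it is exactly the kind of basic that curricula integrating security from the start would instill.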
Fear played a significant role in the rise of the information security field and continues to do so today. Initially, this fear was constructive in bringing the problems of network security to light and, I’d contend, was instrumental in helping to educate the public about the inherent risks of networking and computers as they evolved. However, the continued use of FUD, scary negative stereotypes in marketing campaigns, and fear-laden verbiage over the years has only succeeded in causing a mixture of complacency and paralysis both inside and outside the InfoSec community. We, as security practitioners, continue to do the same thing over and over again, somehow expecting different results. As a consequence, little useful action is taken and minimal strides are made in information security as a whole.
The deep-seated fear that has been endorsed for years must cease. Instead, we must encourage the development of healthy skepticism and nuanced learning both for ourselves in the InfoSec community and for those outside of it. Rather than running to the next blinky box solution, a return to the basics is required in order to create a more secure computing environment. Help change the paradigm so that others want to make things more secure and feel they have agency to do so. Furthermore, security needs to be built-in from the ground up, including being at the core of all developer training in order to fix some of the most basic vulnerabilities such as those found in web applications. Ultimately, we need to get off the merry-go-round, do away with deep-seated fear tactics, and stop the insanity.
- (1986). “Future bugs: Computer world dreading electronic ‘virus’ attack.” The Globe and Mail. Toronto, ON: The Globe and Mail Inc.
- (1989). Network World. Framingham, MA: IDG Communications. 6: 41.
- (2007). Computerworld. Framingham, MA: IDG. 41: 51.
- Alexander, M. (1990). “Health insurance for computers.” Computerworld. Framingham, MA: IDG News Service. XXIV: 1.
- Avolio, F. (1999). “Firewalls and Internet Security, the Second Hundred (Internet) Years.” The Internet Protocol Journal 2(2).
- Davis, J. (2012). “John McAfee Fled to Belize, But He Couldn’t Escape Himself.” Wired. Retrieved Sept 25, 2018, from https://www.wired.com/2012/12/ff-john-mcafees-last-stand/.
- Design, A. J. (2018). Darktrace adverts. Retrieved November 1, 2018, from http://www.alexjdesign.com/works/darktrace-adverts/.
- DiDio, L. (1989). “To keep your system virus-free, use proper computer hygiene.” Network World: 90.
- Elliott, D., Swartz, E., & Herbane, B. (2002). Business Continuity Management: A Crisis Management Approach. London: Routledge.
- U.S. Government (1989). Computer Viruses: Hearing Before the Subcommittee on Telecommunications and Finance of the Committee on Energy and Commerce, House of Representatives. Washington, DC: U.S. Government Printing Office: 14.
- Mijinyawa, H. (2018). “Infosec Has an Image Problem.” Duo Security.
- Radeska, T. (2016). “Brain – The first computer virus was created by two brothers from Pakistan. They just wanted to prevent their customers from making illegal software copies.” Retrieved Oct 19, 2018, from https://www.thevintagenews.com/2016/09/08/priority-brain-first-computer-virus-created-two-brothers-pakistan-just-wanted-prevent-customers-making-illegal-software-copies/.
- Roberts, C. (2018). “Welcome To 2018 – Full speed off the edge of the cliff.” DEF CON. Caesars Palace, Las Vegas, NV.
- Smith, R. (2018). Keynote. Rochester Security Summit 2018. Rochester Riverside Convention Center, Rochester, NY.
- Tills, C. (2017). “Fear appeals: what are they good for?” Clear Security Communication.
- Zatko, S. (2016). “Rethinking the Role of Security in Undergraduate Education.” IEEE Security & Privacy.
Dr. Catherine J. Ullman is a security researcher, speaker, and Senior Information Security Analyst at University at Buffalo with over 20 years of highly technical experience. In her current role, Cathy is a data forensics and incident response (DFIR) specialist, performing incident management, intrusion detection, investigative services, and personnel case resolution in a dynamic academic environment. She additionally builds security awareness amongst faculty and staff via a comprehensive department-wide program which educates and informs users about how to prevent and detect social engineering threats, and how to compute and digitally communicate safely. Cathy has presented at numerous prestigious information security conferences including DEF CON and Hacker Halted. In her (minimal) spare time, she enjoys visiting her adopted three-toed sloth Flash at the Buffalo zoo, researching death and the dead, and learning more about hacking things to make the world a more secure place.