Cyber security, as an industry, has recorded exponential growth, especially within the last two decades. It grew along with the Internet, evolving from a simple buzzword into a real technological risk that can put you out of business rather quickly. In recent years, cyber security has seized the media’s attention and reached the top of most CEOs’ agendas.
But 20 years ago, it was not like this. The need for cyber security arose shortly after interconnectivity and interoperability among systems and applications grew sky-high. Once you could reach any corner of the globe and data could easily be transformed into many formats, the real digital transformation of humanity started. The opportunity to disrupt this development became a valuable strategic and tactical advantage. That is why, today, when talking about the most dangerous threat actors in cyber space, we actually refer to nation states: governments that have taken advantage of this great “opportunity”.
But let’s take a deep dive into how exactly we got here and what can be done to tackle this phenomenon. It all started in 1971 with the Creeper virus, less than 20 years after the launch of the first commercial computer (IBM 701, 1953). Bob Thomas created Creeper as an experimental self-duplicating program. He didn’t have malicious intentions; he just wanted to illustrate how software can copy itself from one system to another.
(Figure 1 — Creeper virus and its source code)
After Creeper, other viruses erupted during the ’80s (Cascade, Friday the 13th, Stoned, etc.). All this was made possible by the adoption of the TCP/IP standards (1981), which allowed wide-scale interconnection and opened the gates to threats.
It was not until 1988, when the Morris worm massively hit the Internet (infecting around 10% of connected machines), that people really realized that, if proper measures are not taken, malware can seriously disrupt the digital world.
Things continued to escalate as the Internet grew (5 million devices in 1995), and many companies started falling victim to cyber-attacks. It was around the 2000s that the term cyber-crime came into use. Cyber-crime was becoming a business, as many cyber incidents started to fall within this category. A global convention against cyber-crime was signed in Budapest in 2001 (the Budapest Convention on Cybercrime).
In 2007 we see a major milestone: Estonia, a country of 2 million people, was temporarily wiped off the Internet map (around 1 million computers affected). This is the moment when cyber-crime became a threat to national security.
The story goes on with Stuxnet (discovered in 2010), the first cyber weapon, aimed at disrupting Iran’s nuclear program. Nation-state involvement was clearer than ever, and the scale achieved became impressive, but also frightening.
Other important milestones are the WannaCry, Petya and NotPetya attacks of 2017, which clearly demonstrated the global impact that cyber-attacks can have.
If one were to describe how cybercrime evolved over the years, one could undoubtedly point to the picture below, adapted from an Infosec Institute graphical representation. You can easily notice how the complexity and volume of cyber incidents evolved over time, going from “simple” incidents to actual attacks. The motives behind them also changed dramatically, from personal fame, jokes or showing off (script kiddies) to financial interests or cyber warfare (launched by nation states).
(Fig. 2 — Cyber Crime historical evolution)
More on cyber security history can be found here.
Measuring an industry’s evolution should also rely on facts and figures. A proper assessment can only be made based on measurable data, not on press headlines.
In this respect, I have considered several metrics to capture the evolution of the cyber security industry over time: incident cost, global security spending, global investments, security budget and workforce.
It’s good to know, in general, that the cyber security market is valued at around $120 billion for 2019 and estimated to grow to $170.4 billion by 2022. A good source of statistics in this area can be found here. According to Gartner, around half of the total spending goes into security services, while the other half is divided among different types of solutions.
When talking about the cost of incidents, you will always run into colossal values. It’s as if they all want to emphasize the importance of this aspect: incidents cost a lot of money, so you’d better prepare yourself. Nevertheless, one needs to keep in mind that the financial impact of a cyber incident is not something that can be computed easily.
A lot of factors must be considered, aspects that in most cases depend directly on the particularities of the situation. For example, in many cases you cannot generalize based on unreliable sets of data (such as incidents collected from public sources). With this in mind, one should not expect too much accuracy from statistics on the financial impact of cyber incidents.
Scouting the available online resources will turn up plenty of articles and studies on this topic. Among them, I have found the following:
- $6 trillion in damage costs per year by 2021, according to this source;
- $3.86 million average cost of a data breach, according to an IBM study;
- $13 million average cost per company, according to an Accenture study;
- $150 per record, according to the same IBM study above, cited by this source.
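To see how the per-record and per-breach figures relate, a back-of-the-envelope estimator is enough. This is a minimal sketch: the $150/record value comes from the IBM study cited above, while the record count used in the example is a hypothetical illustration, not a number from any study.

```python
# Rough breach-cost estimator from a per-record cost figure.
# $150/record is the IBM figure quoted above; the record counts
# passed in below are hypothetical examples.
def breach_cost_estimate(records_exposed: int, cost_per_record: float = 150.0) -> float:
    """Estimate the financial impact of a breach from the number of records exposed."""
    return records_exposed * cost_per_record

# A breach exposing roughly 25,700 records lands near the $3.86M
# average breach cost reported by the same study.
print(f"${breach_cost_estimate(25_733):,.0f}")  # prints "$3,859,950"
```

This also shows why the averages should be read with care: a single mega-breach with millions of records skews any per-record or per-company mean dramatically.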
Another metric worth analyzing is the security budget. Don’t expect to find straightforward figures here, as companies rarely publish such numbers. What statistics usually show is the percentage of the total IT budget spent on security. According to the 2017 Cyberthreat Defense Report, such percentages vary between 22% (education) and 38% (retail), depending on the industry. What’s worth mentioning is that the security budget is taking bigger and bigger chunks out of the IT budget.
There is also another trend worth mentioning here: the positioning of the information security department under a more business-related unit, such as under the CFO (chief financial officer). This shows that cyber is getting more and more attention and is starting to be rightfully treated as a risk to the business.
When trying to comprehend the value of an industry, one must also look at global venture capital investments, meaning the amount of money funneled into an industry (as investment) in a specific timeframe. According to Strategic Cyber Ventures, more than $5 billion was invested in more than 350 companies in 2018. That’s a 20% increase from 2017 and almost double compared to 2016. The trend is definitely positive here as well.
Finally, another representative metric is the workforce gap. Numerous online resources point to a quite large number of unfilled positions in cyber security. CyberSecurity Ventures mentions 3.5 million unfilled jobs globally by 2021. That’s quite a huge number!
The workforce gap is there, for sure, as it can clearly be seen with the naked eye. Nevertheless, going into the details brings up some interesting underlying facts.
CyberSeek.com, a portal that provides useful data on the cyber security job market (supported by NIST, among others), mentions that as of January 2020 there are almost 1 million people employed in cyber security and around 500,000 openings in the US alone. That gives us a supply/demand ratio of 2, which is quite tight: other industries have around 6.5 people employed for every opening.
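The ratio quoted above is simple division, but making it explicit helps when comparing industries. A minimal sketch, using the CyberSeek figures from the paragraph above:

```python
# Supply/demand ratio: employed workers per open position.
# A lower ratio means a tighter labor market.
def supply_demand_ratio(employed: int, openings: int) -> float:
    return employed / openings

cyber_ratio = supply_demand_ratio(1_000_000, 500_000)  # CyberSeek US figures, Jan 2020
other_industries = 6.5                                  # approximate figure quoted above
print(cyber_ratio)  # prints 2.0 — roughly one opening for every two employed workers
```

Read the other way around, cyber security would need to grow its employed workforce by ~50% just to fill the openings that exist today.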
CyberSeek also provides info on certification holders vs. openings requiring a certification. As you may notice, certifications for entry-level positions (such as CompTIA Security+) are not in very high demand (or the market is saturated), while more “senior-level” certifications (CISSP, CISM, CISA) are in very high demand. An analysis of these aspects has been done by Richard A. Clarke and Robert K. Knake in the book “The Fifth Domain: Defending Our Country, Our Companies, and Ourselves in the Age of Cyber Threats”. Their conclusion is that since the real shortage in the cyber security workforce concerns mostly experienced professionals, it is a much harder problem to solve.
Overall, business-wise, from a measurable perspective, it seems like all indicators are going up, meaning the industry is still expanding and has not reached maturity yet. The problem is that we do not know how much more expansion this industry can handle.
Figure 1 shows the amount of code behind the first virus in history (Creeper). 14 lines, that’s all that was needed. Nowadays, malware may contain multiple modules with thousands of lines of code. Early malware (simply called a virus) would just display annoying messages or mess with your peripherals. You could also clearly divide malware into categories such as virus, trojan, logic bomb, etc.
Modern malware can do far more damage, such as erasing hard drives, stealing data or disrupting networks; it is polymorphic and more often manages to bypass detection. The bottom line is that malware has evolved and attackers have evolved, so suddenly we are facing threats that are more advanced and persistent.
On the other side, the tools we use to protect our systems have also evolved. Some say antivirus is dead. Instead we use endpoint protection, a more complex solution with many more functions besides the old antivirus one. Although it is called endpoint protection, such a solution usually protects more than just the endpoint itself. It may contain modules such as antivirus, behavioral analysis, host IDS, containment and sandboxing, URL filtering, etc.
Firewalls have also evolved over time. Since their birth in 1989, their coverage has extended from packet filtering only to stateful inspection, application-level control, VPN and intrusion detection. We now buy Next Generation Firewalls (NGFW), tools that cover many more functions than just packet filtering.
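The jump from plain packet filtering to stateful inspection is easy to see in code. Below is a minimal sketch, not a real firewall: the packet fields and rule set are simplified illustrations, and real implementations (netfilter, pf, etc.) track far more state (sequence numbers, timeouts, TCP flags beyond SYN).

```python
from typing import NamedTuple

class Packet(NamedTuple):
    src: str
    dst: str
    dport: int
    syn: bool  # True if this packet attempts to open a new TCP connection

ALLOWED_PORTS = {80, 443}

# Stateless packet filter: every packet is judged in isolation.
def stateless_allow(pkt: Packet) -> bool:
    return pkt.dport in ALLOWED_PORTS

# Stateful inspection: connection attempts are checked against the rules,
# and subsequent packets are allowed only if they belong to a connection
# the firewall has already seen and accepted.
class StatefulFirewall:
    def __init__(self) -> None:
        self.connections: set[tuple[str, str, int]] = set()

    def allow(self, pkt: Packet) -> bool:
        key = (pkt.src, pkt.dst, pkt.dport)
        if pkt.syn:  # new connection attempt: apply the rule set
            if pkt.dport in ALLOWED_PORTS:
                self.connections.add(key)
                return True
            return False
        # mid-stream packet: only valid as part of a tracked connection
        return key in self.connections

fw = StatefulFirewall()
print(fw.allow(Packet("10.0.0.1", "93.184.216.34", 443, syn=True)))   # True: opens connection
print(fw.allow(Packet("10.0.0.1", "93.184.216.34", 443, syn=False)))  # True: established
print(fw.allow(Packet("10.0.0.9", "93.184.216.34", 443, syn=False)))  # False: no tracked connection
```

Note the last packet: a stateless filter would have let it through (port 443 is allowed), while the stateful one rejects it because no connection was ever established — exactly the class of spoofed, out-of-state traffic that stateful inspection was invented to stop.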
The bottom line is that cyber security technology is evolving. Even though you might get the impression that, for some years now, innovation has stopped in cyber and the industry is just reshuffling and combining what were once separate solutions, some amount of technological advancement is clearly visible. On the other hand, the cyber domain is a leader when it comes to adopting new technologies, such as AI/ML. A study by Deloitte, available here, ranks cyber security 3rd globally among the top use cases for AI adoption.
The above story basically tells us that cyber has historically evolved in terms of complexity and volume: more malware, more complexity, more solutions and different motives behind attacks.
In business terms, cyber has become an industry that is still growing. Nevertheless, we are not sure, at this point, about its maturity. It can continue to grow in numbers for the next decade, or it can deflate, just like a bursting bubble.
But if, by now, you have gotten the impression that all is well with cyber, keep in mind that, despite all the improvements and developments described above, the Internet (and not only the Internet) still relies on a pile of legacy technologies, such as the TCP/IP stack of protocols. Unfortunately, we have not yet found the business case for a full technological rewrite of the Internet. Many patches (HTTPS, DNSSEC, SFTP, etc.) have been added over the years on top of the legacy infrastructure, but that’s still not enough. Even the most advanced and secure networks can fall victim to cyber-attacks if attackers have the proper motivation and persistence.
As Robert Mueller (FBI Director 2001–2013) put it: “I am convinced that there are only two types of companies: those that have been hacked and those that will be.”
There’s also the inter-dependency issue. Systems have become more complex nowadays, meaning that you will certainly rely on, or connect to, parts and resources over which you have no control. Nobody develops systems (especially apps) from scratch anymore. There’s always a function or library somewhere that you can use to shorten the delivery date. That helps the propagation of vulnerabilities, bugs, zero-days, etc.
Getting back to the question in the title, the answer should probably be NO, at least not in this lifetime. Cyber security is an industry, and industries die hard. Or rather, they don’t die, they reinvent themselves. An industry concentrates many forces that feed each other, so they don’t starve. Secondly, there is still room for development, by bringing in new technological developments from outside the bubble (AI/ML, quantum, etc.). There’s still work to be done, and who knows what might follow after. Finally, it’s about the complexity, the interconnections and the too many variables to manage. No matter how many resources you invest, a determined hacker will always find a way in!
Originally published at http://technology-insights.com on January 29, 2020.