
Introducing the InfoSec colour wheel — blending developers with red and blue security teams.

by Louis Cremen, November 20th, 2018

Too Long; Didn't Read

Louis Cremen, a developer turned security person, explains why it is important for all teams to work together, beyond just DevSecOps. April Wright proposed a solution in her Black Hat talk titled “Orange is the new Purple” (DefCamp recorded version), showing that builders, attackers and defenders are all one InfoSec team. In theory, organisations have a cycle: Yellow builds it. Red breaks it. Blue defends it. Yellow fixes it.


As a developer turned security person, I’ve learnt first-hand how important it is for all teams to work together, more than just DevSecOps.

It’s time to include and blend developers into our Infosec circle.

Current state of Information Security — Red and Blue Teams

In the realm of information security, there tend to be two main groups:

  • The Red Team, employees or contractors hired to be Attackers, ethical hackers that work for an organisation finding security holes that a malicious individual could exploit.
  • The Blue Team, the organisation’s Defenders, who are responsible for protective measures within an organisation.

While it is good to have people dedicated to securing an organisation through defence or attack, organisations and their systems do not stay static. Additional processes, automations and products are being built by developers and architects constantly, and the potential attack surface grows with each new change or integration.

Only having Red and Blue Security Teams is not enough. The people building what must be defended need to be included.

Introducing Yellow Team — The Builders

Yellow Team — Builders

Application developers, software engineers and architects fall into this category.

These are the people that build and design software, systems, and integrations that make businesses more efficient.

Their focus is usually on requirements, functionality, user experience and back-end performance.

If we want to have applications, automations and processes designed and implemented securely, Red Team and Blue Team need to work with the Builders. Builders need to be included as a part of Information Security.

Yellow need to be part of the team. We can only do this by working together

Last year, April Wright proposed a solution in her Black Hat talk titled “Orange is the new Purple” (DefCamp Recorded Version), and she totally nails how builders/attackers/defenders are all one InfoSec team. Some of the content below is directly from her submission paper.

We need to make Builders “The Yellow Team” and give them the capacity to make their applications more secure.

A developer’s focus is on functionality, and on making it work as quickly as possible. Once an application’s functionality works for all users and stakeholders, its developers will sleep easy because they’ve done their job perfectly.

Developers never know, or really think about, if they’ve done something insecurely — until it’s pointed out to them as part of a penetration test conducted by Red Team, or as part of a breach discovered by Blue Team (keeping in mind that two thirds of breaches take months or longer to discover).

But, we are building software faster than it can be security tested and defended.

We place the security responsibility on programmers, who don’t have the security experience.

Our security industry and organisations are plagued with vulnerabilities and misconfigurations, and they all have the same source: the person or group who built it. As the saying goes:

If debugging is the process of removing bugs, then programming is the process of putting bugs into the application. Testing only proves the presence of bugs, not the absence of them.

To help solve this, organisations currently have a cycle:

Yellow Builds it. Red Breaks it. Blue Defends it. Yellow Fixes it.

Well, that’s the theory. In reality it’s:

Sir? Sir? Are you listening? Sir?

Yellow Builds it. Red Breaks it. Blue Complains about it. Yellow ignores it. Management hides it.

That’s what our current Security / Development culture tends to encourage.

However, our ideal cycle will work best when Yellow are educated with Red and work hand-in-hand with Blue.

Yellow need to be involved to secure an organisation.

How to help Yellow Team become more secure

The solution is educating the Yellow Team with attack techniques, and using the Yellow Team’s strengths to make defence easier and more visible.

I’ve spent the last five years working with Blue Teams programming security automation solutions and optimising how to improve their monitoring and defence capabilities. If I had to label myself at the moment, I’m probably what the industry is trying to classify as DevSecOps. But more Dev and Sec than Ops.

Over the past five years, I’ve also run many training sessions and ethical hacking courses for developers and other technical staff. I teach them how attackers break into networks, infrastructure and applications — and how to mitigate those attacks with good coding practices and other defence techniques.

How do we bridge the gap between developers and security?

We’ve now introduced our Yellow Team. We already have our Red Team and Blue Team.

Red. Blue. Yellow. These are our Primary Colours.

These three teams are needed to keep an organisation secure from threats, making them all responsible for the security of an organisation.

Attack and Defence cannot do it alone. Coders can’t do it alone. They all need to work together.

Individually, they don’t have the skills or visibility to protect an organisation.

Introducing the InfoSec Colour Wheel

If Red, Blue and Yellow are our Primary Colours, we can blend them to create secondary teams that combine the skills and strengths of two primary teams.

Red, Blue and Yellow are our Primary Colours. Combine two of them and you get Secondary Colours.

This InfoSec Colour Wheel expands on April Wright’s amazing work of bringing builders into the security team by putting it into a single infographic. (Image zip at the bottom of the article)

How does it work?

We have our Red and Blue Teams just as we always have, but now with the introduction of a Yellow Team, we can have secondary coloured teams (Orange, Green and Purple) dedicated to mixing skills between attackers, defenders and coders — making code more secure and the organisation more secure.

Secondary colour creation chart

Secondary Colours of InfoSec

Security cannot and should not be siloed, but in many cases, software developers and security are currently segregated from each other and from the rest of the organisation.

People tend to specialise in a primary colour, but for an organisation to be secure, all colours must be strong and must work together and blend.

As we blend, these teams will combine their knowledge and capability to enhance and improve the other teams to make an organisation more secure.

Do not think of these “Secondary colour” teams as dedicated full-time roles — these secondary colours are a concept and function of existing members.

It is more likely there will be regular and scheduled secondary team “meetings/exercises/engagements” and an open-door policy between groups. The goal is to have these usually-opposing primary teams work together to achieve a common goal.

Daniel Miessler captures this well:

Think of Purple Team as a marriage counsellor. It’s fine to have someone act in that role in order to fix communication, but under no circumstances should you decide that the new, permanent way for both partners to communicate is through a mediator.

Purple Team — Maximise Red. Enhance Blue.

The primary goal of a Purple Team is to maximise the results of Red Team engagements and improve Blue Team capability.

This is an already established, or easily spun-up, team within many security-mature organisations. We’ve learned from experience that the business works best when Red and Blue Teams work together to improve the security posture of the organisation. There are many Purple Teams already in existence, and people happily declare their allegiance to Purple Team.

Because knowing both attack and defence is a huge asset to any organisation, team and individual.

Orange Team — Inspiring coders with attackers

The reason for many security bugs within software is not malicious programmers, but a lack of security awareness among software development teams and architects.

The purpose of the Orange Team is to inspire Yellow Team to be more security conscious, increasing their security awareness by providing education to benefit software code and design implementation. There should be structured and ongoing engagements between Red and Yellow Team for the benefit of Yellow.

So you wanna write secure code…

I joined the security industry as a “Developer Security Trainer Guy”, teaching coders how attackers break into networks and apps. In my experience running ethical hacking courses and secure coding courses, developers don’t respond positively to a list of vulnerabilities, but they LOVE to learn how to think like an attacker.

Yellow Team want to see and understand how the attack tool/automation works, and they love how clever an attack can be. The techniques they learn tend to be more clever, insidious or easily available than they initially realised.

A developer will then begin to internalise how to make their apps resistant to these types of attacks. They begin to understand not just “use cases”, but “mis-use cases” and “abuse cases”.

This develops and inspires offensive critical thinking in Yellow Team members themselves, which is the goal — it helps prevent security bugs from being introduced in the first place, since security becomes an intrinsic part of how they develop and design solutions.
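As a minimal sketch of that shift, consider a hypothetical user-lookup function (the schema and names are illustrative, not from the article): the string-built query is what an attacker hopes to find, and the parameterised query is what a developer who has seen injection demonstrated first-hand tends to write instead.

```python
# Hypothetical example: fetching a user record by username from a SQLite database.
# The same idea applies to any SQL driver.
import sqlite3

def get_user_insecure(conn: sqlite3.Connection, username: str):
    # The use case works fine, but an abuse case like username = "' OR '1'='1"
    # turns this into a SQL injection point.
    query = f"SELECT id, username FROM users WHERE username = '{username}'"
    return conn.execute(query).fetchone()

def get_user_secure(conn: sqlite3.Connection, username: str):
    # Parameterised query: the driver treats the input as data, never as SQL.
    query = "SELECT id, username FROM users WHERE username = ?"
    return conn.execute(query, (username,)).fetchone()
```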

How are Yellow Team learning security without the help of an Orange Team?

Good question. They’re certainly not reading about it. Yellow Team are too busy keeping up with their own software tooling updates. Their first experience of security is usually the day they receive a penetration testing report with a list of vulnerabilities; they Google what the findings mean, change a line or two of code or configuration, and go back to their actual job (which isn’t security).

Getting Builders to understand how attacks work, and why they want to code securely is significantly more effective than giving them a spreadsheet of vulnerability findings from a penetration test.

When you give a developer a checklist of security bugs to fix, you are essentially telling them their work is wrong after they were told it was right. No one likes being told their art is wrong, being given more work to do, or watching everyone resent security for ruining schedules and pushing back timelines. This only creates conflict.

Traditionally, we exclude builders from Information Security, but scold them after a penetration test or breach for not knowing things that change constantly and are hard to keep up with even when it is our job!

Very difficult to keep up with InfoSec, even when you’re in InfoSec. Everything is always burning.

Some thoughts on creating an Orange Team

As mentioned previously, I don’t think there will be a full-time Orange Team, although it may depend on the size of an organisation. Some organisations already have “Security Awareness Champions”, and I feel like Orange Team is similar to that role.

Since Orange’s goal is to make developers think more like attackers, it will most likely start with certain Red Team members being available to Yellow Team. Working with developers to understand how attackers work will be the first step.

Orange Team members are highly technical and ‘in the weeds’: people who can speak code and speak attack methods, and who understand both fundamentally and from an implementation perspective.

Bringing in a Red or Orange Team member at the beginning of a software sprint allows additional “Abuser Stories” and “Mis-use cases” to be developed alongside the usual “User Stories” and “use cases” that Yellow Team would typically rely upon to build features. This gives Yellow Team immediate feedback on the areas they need to secure before they write a single line of code.
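To make that concrete, here is a minimal, hypothetical sketch of a user story and an abuser story for the same feature, written as automated tests before the feature is implemented. The in-memory invoice store and its API are illustrative assumptions, not something taken from April Wright’s paper.

```python
# Hypothetical sketch: the user story and the abuser story for one feature,
# expressed as tests the team agrees on before writing the implementation.
import pytest

class InvoiceStore:
    """Toy in-memory store, included only to make the example runnable."""
    def __init__(self):
        self._invoices = {}  # invoice_id -> (owner_id, amount)

    def add(self, invoice_id, owner_id, amount):
        self._invoices[invoice_id] = (owner_id, amount)

    def get(self, invoice_id, requesting_user_id):
        owner_id, amount = self._invoices[invoice_id]
        if owner_id != requesting_user_id:
            # The abuser story forces this ownership check to exist from day one.
            raise PermissionError("not your invoice")
        return amount

def test_user_story_owner_can_read_their_invoice():
    store = InvoiceStore()
    store.add("inv-1", owner_id="alice", amount=100)
    assert store.get("inv-1", requesting_user_id="alice") == 100

def test_abuser_story_other_users_cannot_read_someone_elses_invoice():
    store = InvoiceStore()
    store.add("inv-1", owner_id="alice", amount=100)
    with pytest.raises(PermissionError):
        store.get("inv-1", requesting_user_id="mallory")
```

The abuse case is cheap to write in the same sprint and expensive to retrofit after a penetration test finds the missing ownership check.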

Time dedicated to training Yellow in how the Red Team operates, or even just lunch-time learning sessions to help encourage conversations, is also a great way to engage Yellow Team in thinking more securely. Encouraging an open-door policy, or a specific point of contact, is another great way to bridge the gap between Yellow and Red.

If a developer can understand a security flaw, they may recognise the same flaw they programmed into a separate project that wasn’t tested by attackers — and security flaws start to get fixed proactively.

Pictured above ^^ Me. So many times.

Once Orange have more software builders on the team, those members may also be well placed to perform code reviews on other Yellow Team members’ changes, catching flaws before they’re committed, or to take architecture design positions to ensure that the right decisions are made at the beginning of a project, not at the end after testing.

The end result should hopefully be better coders trained by attackers, who then train each other to encourage a culture of security amongst developers.

Green Team — Enhance defence with coders

Blue Team are not always aware of all the frameworks, libraries, third-party systems, network calls and functionality added by Yellow Team. Yellow Team may be barely aware of some of the dependencies behind their own code.

Ask any coder using Node what their 80+ dependencies do. Stealthily, one of them might “heart” a random Hot Pockets tweet. Another might steal bitcoin or developer passwords. And more. These are real cases from very popular libraries!
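The first step is simply knowing what you are trusting. As a small sketch (in Python rather than Node, but the question is the same), this lists every third-party distribution installed in the current environment using only the standard library:

```python
# Minimal sketch: enumerate the third-party packages this environment trusts.
# Standard library only (Python 3.8+); run it inside the project's virtualenv.
from importlib.metadata import distributions

names = sorted({dist.metadata["Name"] for dist in distributions() if dist.metadata["Name"]})
print(f"You are implicitly trusting {len(names)} installed distributions:")
for name in names:
    print(" -", name)
```

The length of that list, compared with the handful of packages a developer consciously chose, is usually the quickest way to start the dependency conversation with Blue Team.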

In the event of an incident, Blue Team may not have the data needed to investigate or defend breached systems and no one wants to test or touch the production environment for fear of it breaking.

Green Team consists of ongoing structured interactions between the Blue Team and members of the software team (Yellow Team). The ultimate goal is to improve code-based and design-based defence capability for detection, incident response and data forensics.

Yellow Team know how to set up systems and code. Green Team will work together with Blue Team or Purple Team to discuss solutions and improvements in areas such as the following (a small logging sketch follows the list):

  • DFIR Output
  • Logging improvements
  • Log content / events
  • Log generation standardising
  • Change Management
  • Integrity Monitoring
  • Anti-Virus / End Point Protection
  • Full coverage monitoring

When Yellow Team works with Blue Team, an event can be caught before it becomes an incident, and an incident before it becomes a breach, which is essential to the defence of any organisation.

Detected and trapped. Blue and Yellow together are hard to break through

How are Blue Team currently doing this without the help of a Green Team?

With great difficulty (or a mandate from management). They inject security and monitoring requirements into projects; however, that in turn increases timeframes and budgets, so those requirements tend to be categorised as non-essential for launch.

Alternatively, they have to ask, beg and push for monitoring improvements once a production system is live, forcing double the work to add those improvements at the end of the development lifecycle.

Ignoring monitoring is a medium-to-high-risk practice that companies engage in every day, and insufficient logging and monitoring is now considered one of the most common web app vulnerabilities by OWASP.

I constantly talk about bringing security in at the beginning of the Software Development Life Cycle (SDLC), and having a Green Team would help fulfil this requirement. Making the case at the beginning of a project, and during development, for monitoring improvements, logging improvements and integrity monitoring that integrate with Blue Team systems is vital for the security visibility of an organisation.

A bug found in requirements is significantly cheaper than a breach in production. Source: Defense Systems Management College, 1993. Still very relevant today.

Some thoughts on creating a Green Team

I think having a Blue Team member, ideally with some scripting ability, present at the beginning of the SDLC or sprint process is a good first step. This new Green member gets to hear the use cases of the sprint, understand the infrastructure being used and the different integrations that might be required, hear potential misuse cases from Orange Team — and work out whether there is anything that should be monitored from a defence perspective. They can also give early tips on easy ways to protect something, so the Yellow Team doesn’t have to worry about it.

Blue tend to know the better and quicker way to secure something

The other side of Green Team is helping Blue with automation: have a Yellow Team member shadow Blue operations, understand how they work, and see where they might have challenges that can be fixed with automated processes. A Yellow or Green Team member might be able to improve Blue’s defences through relatively simple code additions or logging standardisations within a project.

It can be much easier to determine what is easy or difficult to automate if someone is a developer and has visibility within the code of a project.

Green Team might be able to integrate Automated Security Testing (one of my specialties!) into the normal code testing workflow, so that when Yellow Team commit code and run their automated test cases, they also run automated security test cases, making Blue and Yellow’s jobs easier.
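A minimal sketch of that wiring, assuming pytest, bandit and pip-audit are installed (any equivalent test runner, static analyser and dependency scanner would do): one script runs the functional tests and the security checks together on every commit.

```python
# Hypothetical sketch: run the normal unit tests and the security checks in one step,
# so security tests ride along with every commit.
# Assumes pytest, bandit and pip-audit are installed; swap in your team's preferred tools.
import subprocess
import sys

def run(cmd):
    print(f"$ {' '.join(cmd)}")
    return subprocess.run(cmd).returncode

def main():
    failures = 0
    failures += run(["pytest", "tests/"])            # Yellow Team's functional test cases
    failures += run(["bandit", "-q", "-r", "src/"])  # static analysis for common security bugs
    failures += run(["pip-audit"])                   # flags known-vulnerable dependencies
    sys.exit(1 if failures else 0)

if __name__ == "__main__":
    main()
```

Hooking a script like this into the existing CI job means the security checks fail the build the same way a broken unit test does, so nobody has to remember to run them.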

The two sides of Green Team are enhancing Blue’s ability to perform monitoring and forensics, while also encouraging Yellow to consider the long-term defensive maintenance of the application.

The end result has many benefits that help the organisation as a whole.

The first is a culture of developers who want to integrate their work into the security fabric of an organisation and understand how they can benefit from the intrinsic protections of those defenders.

The second is a health check of Blue operations, saving hours, days, weeks or months of time through simple automation, reporting and standardisations that only a Yellow Team can deliver.

White Team — bringing all colours together.

At the end of the day, Information Security is dictated by p̶e̶r̶f̶e̶c̶t̶ ̶s̶e̶c̶u̶r̶i̶t̶y̶ ̶r̶e̶q̶u̶i̶r̶e̶m̶e̶n̶t̶s̶ governance, compliance, policy and business requirements.

Please someone send me a better image to show Information Security Management. My Google-fu failed me.

White Team members include elements of Compliance, Management, Analysts, Logistics and more. These are all-knowing, neutral, third-party individuals who set the rules of engagement, organise teams, make plans and monitor progress.

They’re essential for every organisation, but they’re seen as “too corporate” for the developers, “too demanding” for the defenders and “too much of a killjoy” for the attackers.

Since White Team don’t tend to be technical in the same way as our primary-coloured teams, conversations between the two groups tend to be more adversarial, with each side trying to get its own point of view across and its requirements heard.

Life is hard for White Team

But White Team are just as essential as builders, attackers and defenders. White Team encompass and manage all of the colours without directly being one of them.

So, although they have to be across a lot of organisational areas at a high level, they tend to conflict when directly interacting with a primary colour.

Having White Team interact with a secondary colour team will decrease conflict since secondary colours understand the language of the primary colours they come from, but they are “closer to white” and understand multiple pieces of the security puzzle.

Rainbow Team — All of the above

So I put this Colour Wheel concept to some of my security rockstar friends (people far smarter than me, with a minimum of 10 years’ security experience, who run events, communities and more) and many identified with all of the colours and declared themselves a Rainbow Team.

They’ve done coding, they’ve done training, they’ve performed attacks, they’ve implemented defensive systems, and they’ve audited and designed for compliance and regulation.

Having a full spectrum understanding might be the ultimate path or goal for a passionate security person.

Or maybe some people will prefer to specialise. I think we’ll find out over the next few decades how this plays out. I do see some people preferring to specialise, but at some point all teams have a hard limit for what they can achieve alone.

One of my best pieces of feedback so far suggests that System Engineers are supposed to be the glue between all colours and white — and that problems Red and Blue are facing are because of a lack of System Engineers.

Tracking, documenting, verifying and managing the various interactions between Red, Blue and Yellow. Systems engineers are technical integrators who are multi-disciplined.

While I don’t disagree, and I’m not suggesting this model is a replacement for System Engineers, I believe this is a great way to see how we can include software builders in working with security to help keep the organisation secure. I’m very happy for System Engineers to be the glue, but I would also like to have developers trained by Red and have developers help Blue.

What to do next?

I really like the Infosec Colour Wheel, and early feedback has shown that it’s already helping security teams understand how to interact with development teams, and where certain security professionals now sit within their own organisation as one of the new Orange / Green colours.

Got some thoughts? Did I miss something? Feedback is very welcome 😊. I like improving articles, so this one may change slightly once I receive some additional feedback.

Hope you enjoyed the article. Clap the article if you liked it. You can clap it 50 times if you really like it. I’ve also included links to the pictures used at the bottom of the page. DM me if you want access to PSDs.

For more reading on this topic, I recommend April Wright’s paper, as she goes into more detail and talks about how to implement this concept. Check out her YouTube video and podcast on the topic. You should follow her on Twitter too — she’s pretty rad.


April C. Wright (@aprilwright) | Twitter (twitter.com)

While you’re on Twitter, follow me @proxyblue. I tweet InfoSec stuff.


Louis Cremen👨🏻‍💻 (@proxyblue) | Twitter (twitter.com)


For myself, I’ve been a security person for 5+ years, but never fit the Blue/Red/Purple mould. As an #orangeTeam #greenTeam person, now I can see where my skills fit in an organisation, rather than just being brought in as a contractor from time to time. Also, it’s always nice to be included and have a team 😊🧡💚

If you want to discuss and learn more, I regularly hang out at The Many Hats Club on Discord, the best InfoSec Discord. We have podcasts and lots of channels for discussion and questions. We’re also currently the only one with Discord Partner Status. https://discord.gg/infosec



As I continue my InfoSec journey, I’m learning more Blue and Red skills, so hopefully I’m on my way to being Rainbow Team too. 🌈

Check out my other popular posts. They’re actually all very Orange Team; it’s nice to finally be included in the InfoSec colour circle 🧡💚


10 things InfoSec professionals need to know about networking (hackernoon.com)


You ‘do’ InfoSec right? What do you read? Who do you listen to? (louis.land)


What devs need to know about Encoding / Encryption / Hashing / Salting / Stretching (hackernoon.com)


After scanning over a million apps — 3 things Mobile App Devs need to know about App Security (hackernoon.com)

Useful Links to Colour Wheel Graphics

https://goto.louis.land/infosec-wheel-pngs-zip

Thanks for reading the word “team” 120 times

And thanks to all my proofreaders. This article would not be the same without you amazing people.