Every so often I read a blog post outlining how Agile has got it all wrong and that, if you really want high-quality software, you have to ditch the methodology in favour of good old upfront planning and design cycles.
I must admit, this type of article gets me thinking every time. The last one I read prompted me to write this post as a way for me to rethink the concept of Software Quality through Agile eyes, if you will.
I’ve been working in Agile tech contexts throughout most of my career and what I've heard most often from the detractors of Agile is that you can hardly have good quality and proper architecture with this methodology.
I think there's a lot to unpack here to give a proper response to this claim so I figured I'd write a blog post focused on the Quality aspect.
Let's first define what Quality means. As per Wikipedia:
- Software functional quality reflects how well it complies with or conforms to a given design, based on functional requirements or specifications. [...]
- Software structural quality refers to how it meets non-functional requirements that support the delivery of the functional requirements, such as robustness or maintainability. It has a lot more to do with the degree to which the software works as needed.
What I get from this definition is that you can't have quality without a set of requirements to adhere to. Now, as a detractor, you might argue that you've hardly ever seen extensive specifications in Agile, and that it's therefore a natural consequence for Agile software not to be of quality.
The reality is that, in Agile, we have just enough requirements and specifications to fulfill the current iteration.
> Working software is the primary measure of progress.
>
> *The Agile Manifesto*
Therefore, strictly speaking, if the software adheres to the definition of that specific iteration then it is of quality.
If you come from non-Agile environments, very light requirements and specs might throw you off.
The truth is this is foundational to Agile:
> Simplicity--the art of maximizing the amount of work not done--is essential.
>
> *The Agile Manifesto*
Agile has no desire to develop any verbose specification upfront: it acknowledges that in order to build working software, embracing change and iteration is essential. This makes it quite hard to think about quality in traditional terms.
If you come from a waterfall-like mindset you might classify this iterative approach as rework: you might think that light specifications and poor planning have led to this many iterations. Whereas, if you had put enough effort into upfront planning, design, and architecture, you'd have had only one development iteration. By now you'd be able to move on to the next project.
Yes, if you are extremely lucky and you do a lot of thinking upfront, you might come out of a 6-month development phase with a product that is right at that point in time, and that users actually want. Or, more likely, you are just optimising a solution for a problem that's already been solved. The rest of the time, the lack of feedback during development will make you build something that doesn't fully meet your users' needs.
In Agile, frequent iterations on the same software area are expected, even sought after.
Now, I know that feeling. You've stuck with Agile. You've built and shipped something according to those slim requirements, but you already (think you) know what's going to scale and what isn't. What corner cases you haven't handled. What would happen if your users pushed your software to its limits?
Is an unscalable solution enough to help you validate what your users want?
If the answer is yes (and it is most of the time) you should not worry about putting an unscalable solution out there.
I'm sure you are opinionated about what is going to scale and whatnot. Real production usage, though, might inform otherwise or give you a new perspective.
You might have some ideas on which non-functional aspects to tackle next: making it more resilient and scalable. If so, I think you should either dismiss those ideas or put them into your backlog. And if you must keep them in your backlog, be prepared to revisit them as you gather more usage data.
You’ll probably discover that the way you're solving your users' problems is not helping anyone. In that case, the whole iteration might go into the bin, together with your ideas on how to make it scalable and resilient.
On the other hand, what if you have picked the right path and thousands of users suddenly flock to your unscalable solution? First of all, I must tell you, this is extremely rare. But yes, it does happen from time to time.
If it happens you are going to have an actual production use case to architect your non-functional requirements upon. You’ll be able to base your scalability decisions upon real usage data. Your software will end up being more solid than you could have anticipated. Plus, you’ll have the confidence that you are building the right solution! This is a considerable advantage that goes beyond just software engineering. It might be crucial for the company as a whole.
I’ve read many posts advocating that cycles spent on design and planning upfront will help you embed quality into your software. But how can you embed quality into something that you haven't validated yet?
Agile methodologies try to move the focus away from the endless specifications in favour of frequent validation cycles. The purpose here is to make a bet, validate, and iterate to minimise wasted effort.
This means that all those characteristics you feel should be part of your software as you're building it will emerge naturally through iteration, once validation shows they actually make sense.
Remember, you want to maximise the amount of work not done.
Naturally, understanding the minimum amount of work required before putting your software in front of your (potential?) customers varies according to many factors. From experience, I can tell you it's usually less than what you think it should be.
Nevertheless, I understand that putting garbage in the hands of your customers might not help anyone. At the same time, investing in making your software fully resilient before acquiring your first customer might be the difference between you being stuck in development and your competitor winning their Nth customer.
I won't deny it's a fine balance to find, though. You have to focus just enough on non-functional requirements so that you can keep iterating on what the users see while solidifying the foundations you build upon.
From my experience, some of us (software engineers) don't particularly enjoy the discovery aspect of working in an Agile context. It's something I personally struggled with at the beginning of my career: all I cared about was the act of solving a problem in the most elegant (according to whom?) way possible. And I've noticed it in colleagues too, whether junior or highly experienced senior engineers.
In these situations, a big list of highly detailed requirements might feel like a great thing. There's a bunch of problems you have to figure out how to solve and you can do that with code. It'll be like a playground where you can experiment and hammer at each requirement like a box-ticking exercise until you're done.
There's a great feeling of progress involved. You know exactly where you are and how much is left. Every requirement is like a challenge. There's very little uncertainty for you in this situation. This feeling is great for human beings. We don't particularly enjoy uncertainty after all.
Maybe you don't even care about whether your software is going to be used or useful for your company to acquire your next customer. You just care about doing your job.
It must feel great to complete the development phase having implemented every single requirement, to the exact level of detail in the specification. As a developer, you think you have done your job and you can move on.
Fortunately, Agile tries to open our eyes to the crude reality out there: you had no idea how your software was going to be used. Turns out your target users can't get any value out of it. You have just finished implementing a perfectly engineered high-quality piece of garbage that nobody is willing to use because it just makes no sense.
That being said, even if you don't want to take part in the discovery process, I don't think that means there's no place for you in Agile.
You'll have to accept you are going to work with a slim set of requirements, and you might end up working on the same area of the software across multiple cycles. That's a good thing! It means your organization is betting on the right things and it's receiving feedback that they indeed are the right things for your customers.
Building software in Agile is a discovery process. Users gradually discover what they want. You work with them to discover how to meet their demand and solve their problems. Additionally, all of this happens while the environment around changes and influences the problems you are trying to solve.
Similar to Architecture, Quality is a characteristic that gets refined rather than being something planned upfront.
> The best architectures, requirements, and designs emerge from self-organizing teams.
>
> *The Agile Manifesto*
If you think light planning and multiple iterations represent the failure of software engineering, think about how wasteful it would be to lock yourself in a room to architect the highest-quality software while your potential users and the environment around them keep changing their needs. By the time you come out of that room, you might have produced something that's no longer relevant.
We have explored how embracing Agile means embracing the emergence of characteristics in a changing environment. One thing I haven't stressed enough, though, is that you have to be prepared to receive those signals from the changing environment around you. If you ship an MVP, how are you going to evaluate whether it's solving a problem for your users or not?
Be it customer interviews or instrumentation and observability, this is one of the things that you should invest in as an upfront effort. Agile is all about being able to understand the signals and make the next bet according to the signals from the previous iterations.
I find this aspect is probably one of the most neglected. In my experience, it's hard to stop and think about which signals we'd want to look at to understand whether what we're building is valuable. On the other hand, I think this is the most important thing to invest in upfront.
Luckily for us, we have plenty of tools at our disposal that help us embed observability into our software. Analytics and metrics platforms usually require only a little upfront investment to start producing meaningful data. As with everything else in Agile, you'll discover how to tune your signals to make them more useful.
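To make this concrete, here's a minimal sketch of what "embedding observability" can look like at its simplest: recording named usage events you can count later. The event names and the tiny in-process tracker are invented for illustration; in practice you'd send these events to whatever analytics or metrics platform you use.

```python
import time
from collections import Counter


class UsageTracker:
    """A toy in-process event tracker, standing in for a real analytics client."""

    def __init__(self):
        self.events = []

    def track(self, name, **props):
        # Record the event name, a timestamp, and any extra properties.
        self.events.append({"event": name, "ts": time.time(), **props})

    def counts(self):
        # How often did each event happen? This is the "signal" to look at.
        return Counter(e["event"] for e in self.events)


tracker = UsageTracker()
# Hypothetical events for a reporting feature:
tracker.track("report_exported", fmt="csv")
tracker.track("report_exported", fmt="pdf")
tracker.track("search_used", query_len=12)

print(tracker.counts()["report_exported"])  # → 2
```

The point is not the implementation but the habit: deciding, while you build a feature, which events would tell you whether anyone is actually using it.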
Do not wait until it's too late. Embedding observability as an afterthought is a much harder effort: not only will you have to make code changes to activate observability, you'll also have to change how you build software, so that observability becomes part of your act of producing it. As with any habit change, this is going to be hard.
I've framed this post on the assumption that we are employing Agile methodologies in a fast-paced and changing environment. I've assumed a high degree of competitiveness between entities chasing market value through software products. In this context, I find Agile helps maximise the pace and focus of your efforts around finding the shortest path to product-market fit.
I won't deny that, in my experience, this approach leads to a higher degree of technological instability in the earlier iterations. Maximising all our efforts on getting early product feedback makes us neglect certain non-functional aspects like resilience, scalability, availability and more.
This happens because we are maximising our efforts on activities that give us a return in terms of confidence that there is an appetite for what we are building.
We use techniques like A/B testing, blue-green deployments, and feature flagging to run experiments that test our assumptions. These experiments are often brought into production on software that is not yet production-quality.
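A feature flag or A/B split can be sketched with deterministic hash bucketing: the same user always lands in the same bucket, so they see a consistent variant across sessions while you gradually widen the rollout. The function, flag name, and user IDs below are hypothetical, not from any particular feature-flagging product.

```python
import hashlib


def in_experiment(user_id: str, flag: str, rollout_pct: int) -> bool:
    """Deterministically assign a user to a bucket from 0-99.

    The same (flag, user_id) pair always hashes to the same bucket,
    so the user's experience is stable as long as rollout_pct doesn't shrink.
    """
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_pct


# At 0% rollout nobody sees the feature; at 100% everybody does.
print(in_experiment("user-42", "new_checkout", 0))    # → False
print(in_experiment("user-42", "new_checkout", 100))  # → True
```

Keying the hash on both the flag and the user means different experiments slice the user base independently, which keeps one experiment from biasing another.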
Hopefully, after having received the feedback we were looking for, we will have enough data to make an informed decision about which non-functional requirements to invest in during subsequent iterations.
Agile has been formalised around practices that try to set you up for success in highly competitive and changing environments. But not all contexts are like this.
Highly regulated or critical domains, where the problem is well known and possibly already solved, will probably benefit from a more waterfall-like approach. Avionics software will hardly benefit from A/B testing or feature flagging in production. Nor will it need to wait for user feedback on how to keep the plane in the air.
The Agile methodologies have a strong user-centric focus and are heavily oriented towards minimising effort. Try to understand if Agile really makes sense for you and your organization.
I hope you have enjoyed this post. If you did please consider following me on Twitter where I share my thoughts about Startup Software Engineering.