Some perspectives on quality
A couple of weeks ago, I participated in an assessment with 7 distinct teams to understand a bit more about their overall development process, including their approach to testing and quality. We tried to uncover a range of things related to collaboration, practices, tooling, CI/CD, testing, etc.
One of the more subjective questions raised was: What does quality mean to you and your team?
We wanted to understand the team's perspective on quality and cross-relate it with the practices followed within the team and the overall status of the underlying project.
The answers to: What does quality mean to you and your team?
In order to understand the answers, we need to provide a bit more context about each team and the software product being handled by them.
This product can be described as providing a very basic UI for processing an input and producing a well-defined output; it's a small team working on a product that has existed for quite a while.
Security on the usage of the software; working without problems; always works in the same way and produces the same output
The team has been addressing some security issues that have been found (internally and through crowd-sourced security resources). The perspective on quality matches the ongoing security concerns, and also correctness.
This is not actually a product; it's a more tactical team (with shared resources) that helps with small projects that can serve multiple teams, or that can start some module from scratch, to be incorporated later into another product.
Conformity level between initial requirements and what the team delivered
For this team, it makes sense to provide such an answer as the team works with specifications, addresses these in a short time frame, does the handover, and moves on.
This is a medium team working on a product that provides a web UI. The product is SaaS based and has many concurrent users. The team embraces some DevOps practices, handles support, deployment and operations.
Absence of bugs (mostly bugs from support feeds) and of performance issues; (we're) mostly focused on customer feedback from support; how the team is performing/working
The feedback about quality essentially revolves around feedback coming from support, which may result in bug reports or feature requests. Besides, as it is a SaaS product serving many thousands of customers in multiple locations, performance is a concern, especially because the team is responsible for deploying to production and maintaining it. This kind of end-to-end perspective is interesting: on the left side, listening to customers... and on the right side, caring for production, using monitoring and observability (to an extent).
This small, young team works on an innovative product. It's a product where user experience and UI are important, that runs on local machines, supporting multiple OSes (operating systems), and receives frequent updates.
Easy to use (without requiring documentation) and the app is working without issues (no crashes or bugs)
This perspective on quality matches the product context, as it has to be easy to use and it must "behave fine" on user machines, without crashes. Knowing that users will run different versions of OSes, with all sorts of configurations, makes both features and stability challenging.
This medium-sized team works on a product that has existed for several years. It provides a basic UI, but the core features are related to data processing, whose size and conditions are unknown upfront. The product is deployed on servers, under different configuration/infrastructure scenarios. Feature releases are done every month, while bugfix releases are pushed more frequently. The team handles support issues often.
Be able to make a release without bugs and without performance issues
The feedback on the meaning of quality aligns with the core concerns of the product. Besides correctness, performance is one of the main concerns, due to data aspects (e.g. size, format) and also to the fact that data handling/processing is done in customer infrastructure, which may not be properly sized.
This is a medium/large team working on a product that provides a UI and is installed locally. It has to support diverse customer scenarios. It is used by millions of users, in numerous ways. The product has a long tail of feature requests and bug reports.
Keep up with customer expectations, concerning features and bugs
This answer has a reasoning behind it: the high level of customer interaction with the team through support and other channels. Also, there's a public roadmap that customers track and try to push.
This team works on a product that is actually a component used by other products/teams. It has no UI; it receives some data, processes it, and then generates an output. It's a very small team.
Generate output without errors and in a fast way
The concern of this team is around always delivering the same output, and doing so as fast as possible. Thus, correctness and performance arise as the main concerns.
We can see that the actual meaning of quality is distinct from team to team.
This is due to several factors:
- team context (e.g. maturity, skills, collaboration)
- product context (e.g. what is important for continuous product success?)
- current context (i.e. what's important right now? What are we currently struggling with?)
Answers may be a bit vague (e.g. "without bugs/errors", "keep up with customer expectations") in the sense that they hide multiple quality criteria that we would need to decompose further to gain clearer insights. Some of these become clearer when we uncover the testing being done, or the testing the team wished to have.
The quest for uncovering quality proceeds :)
Thanks for reading this article; as always, this represents one point in time of my evolving view :) Feedback is always welcome. Feel free to leave comments, share, retweet or contact me. If you can and wish to support this and additional contents, you may buy me a coffee ☕