Poor software quality cost US businesses US$2.08trn in 2020
According to a new report, the cost of poor software quality (CPSQ) in the US in 2020 was approximately US$2.08 trillion (€1.69 trillion). This included poor software quality resulting from software failures, unsuccessful development projects, legacy system problems, technical debt and cybercrime enabled by exploitable weaknesses and vulnerabilities in software.
Co-sponsored by Synopsys, the new report has been produced by the Consortium for Information & Software Quality (CISQ), an organisation which develops international standards to automate software quality measurement and promotes the development and sustainment of secure, reliable, and trustworthy software.
Its key findings include:
- Operational software failure is the leading driver of the cost of poor software quality (CPSQ), estimated at $1.56 trillion (€1.27 trillion) – a 22% increase since 2018. That figure may be low, given the meteoric rise in cybersecurity failures and the understanding that many failures go unreported. Cybercrimes enabled by exploitable weaknesses and vulnerabilities in software are by far the largest growth area of the last two years. The underlying cause is primarily unmitigated software flaws.
- Unsuccessful development projects, the next largest growth area of the CPSQ, are estimated at $260 billion (€210.84 billion) – a figure that has risen by 46% since 2018. The project failure rate has held steady at ~19% for over a decade.
- The underlying causes are varied, but one consistent theme is a lack of attention to quality. Research suggests that success rates rise dramatically when agile and DevOps are used and decision latency is minimised.
- Legacy system problems accounted for $520 billion (€421.67 billion) of CPSQ, down from $635 billion (€514.92 billion) in 2018. The US spent $1.6 trillion (€1.30 trillion) on IT in 2020, and ~75% of that – $1.2 trillion (€0.97 trillion) – went on legacy systems.
- If $1.2 trillion (€0.97 trillion) is being spent on legacy systems, and if as much as 2/3 of that could be classified as “waste”, that gives an approximate upper bound of $800 billion (€648.22 billion) on the cost of poor-quality software from a maintenance perspective. This waste does not include the additional costs incurred outside the IT organisation.
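The legacy-waste bound above is a simple back-of-the-envelope calculation, and it can be reproduced directly from the report's figures. In the sketch below, the 75% legacy share and the 2/3 waste share are the approximations quoted in the text, not precise measurements:

```python
# Back-of-the-envelope reproduction of the report's legacy-system waste bound.
# The shares are the approximate figures quoted in the report, not exact data.

total_it_spend = 1.6e12   # US IT spend in 2020, in dollars
legacy_share = 0.75       # ~75% of IT spend goes to legacy systems
waste_share = 2 / 3       # up to 2/3 of legacy spend may be "waste"

legacy_spend = total_it_spend * legacy_share      # ~$1.2 trillion
waste_upper_bound = legacy_spend * waste_share    # ~$0.8 trillion

print(f"Legacy spend:      ${legacy_spend / 1e12:.1f} trillion")
print(f"Waste upper bound: ${waste_upper_bound / 1e12:.1f} trillion")
```

Running this confirms the article's $1.2 trillion legacy figure and the $800 billion upper bound.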
The report shows that, despite the global pandemic, software continues to grow, proliferate, and enhance our digitally enabled lives: as organisations undertake major digital transformations, software-based innovation and development rapidly expand.
The result is a balancing act trying to deliver value at high speed without sacrificing quality. Generally, however, we are not very good at balancing. Software quality lags behind other objectives in most organisations. That lack of primary attention to quality comes at a steep cost, which is revealed in this report. While organisations can monetise the business value of speed, they rarely measure the offsetting cost of poor quality.
For 2020, the authors determined the total Cost of Poor Software Quality (CPSQ) in the US to be $2.08 trillion (€1.69 trillion). They also noted that the 2020 US figure for the software technical debt residing in severe defects that need to be corrected would have been $1.31 trillion (€1.06 trillion) (minus interest), but they did not include technical debt in the total CPSQ, since it represents a future cost – one that is rising (up 14% since 2018). The graphical results are shown below.
Specifically, the authors determined that:
- The largest contributor to CPSQ is operational software failures. For 2020, the authors estimated this at approximately $1.56 trillion (€1.27 trillion), a 22% growth over two years – though that could be an underestimate, given the meteoric rise in cybersecurity failures and the fact that many failures go unreported. The underlying cause is primarily unmitigated flaws in the software.
- The next largest contributor to CPSQ is unsuccessful development projects, totalling $260 billion (€210.84 billion), up 46% since 2018. The project failure rate has been steady at approximately 19% for over a decade. The underlying causes are varied, but one consistent theme has been the lack of attention to quality.
- Legacy system problems contributed $520 billion (€421.67 billion) to CPSQ (down from $635 billion (€514.92 billion) in 2018), mostly still due to non-value-added “waste”.
The cost of poor software quality in the US: A 2020 report
In the report, the authors say “Our general recommendations for 2020 continue to emphasise prevention. The next best approach is to address weaknesses and vulnerabilities in software by isolating, mitigating, and correcting them as closely as possible to where they were injected to limit the damage done.”
More specifically, they recommend that software shops:
- Avoid low-quality development practices and adopt secure coding practices.
- Recognise the inherent difficulties of developing software and use effective tools to help deal with those difficulties.
- Ensure early and regular analysis of source code to detect violations, weaknesses, and vulnerabilities.
- Measure structural quality characteristics.
- Focus on the evaluation of included components (e.g., open source) and platforms which may have unknown weaknesses or vulnerabilities.
- Learn more about the typical vulnerabilities and exploitable weaknesses attributable to certain programming languages.
- Use best known practices for managing a legacy system – especially when it comes to overcoming the loss of understanding and knowledge of how the system works internally. Benchmarking health status is a good place to start.
- Avoid unsuccessful projects by not creating arbitrary schedules. Pay attention to defined quality objectives and measure against those objectives throughout the project lifecycle.
- Invest smartly in software quality improvements based on CPSQ numbers in hand.
- Focus on the different results of good vs. poor software quality in your shop and relevant benchmark organisations.
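The recommendation to analyse source code early and regularly can start very small – even a script in the build pipeline. As a minimal illustration (the rule set and function name here are hypothetical, and a real shop would use a full static analyser), a Python `ast` walk can flag a classic exploitable weakness such as calls to `eval`:

```python
import ast

# Minimal sketch of an "early and regular" source-code check: walk a file's
# syntax tree and flag direct calls to eval()/exec(), a common exploitable
# weakness. This only illustrates the idea, not a production analyser.

RISKY_CALLS = {"eval", "exec"}  # hypothetical rule set for this sketch

def find_risky_calls(source: str) -> list:
    """Return (line_number, function_name) for each risky call found."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in RISKY_CALLS):
            findings.append((node.lineno, node.func.id))
    return findings

sample = "x = eval(user_input)\nprint(x)\n"
print(find_risky_calls(sample))  # → [(1, 'eval')]
```

Hooking a check like this into every commit is the kind of "isolate and correct close to where the flaw was injected" practice the report advocates.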
Attempting to improve CPSQ will also impact other economic target areas – for example, cost of ownership, profitability, human performance levels, ability to innovate, and the effectiveness of your mission-critical IT systems.
“In our conclusions we identify what specific actions you can take at the level of: 1) individual software professional, 2) team/project leader, and 3) management/executive level of an organisation. We also reveal an important (but little known) study that explains the difference in practices between high performing vs. low performing software organisations.
That study revealed a 5-10X difference in performance between the top 10% and the bottom 10% of organisations sampled. When you dig deeper into the data, the reason is clearly the adoption of certain quality and process best practices.”
The key enablers for achieving the highest levels of cost, schedule and quality performance are:
- A well-defined, yet adaptable development process
- Excellent estimation methods
- Project management discipline
- Excellent staff skill levels
- Quality vision
- Customer satisfaction focus
- TQM management culture
- Defect prevention
These best practices and recommendations are then consolidated in CISQ’s conceptual process model called DevQualOps – which represents the next evolutionary step beyond today’s Agile plus DevOps and similar continuous evolution and delivery models.
In this report we quantified the negative economic value of poor quality in our software systems at a US national level. We did so with the hope and expectation that the readers of this report will be inspired to do likewise within their own organisations.