The third semi-annual "State of Software Security Report - The Intractable Problem of Insecure Software" has been issued by Veracode (see my previous comments on volume 1 and volume 2).
It provides further insight into the results of static binary, dynamic and manual security testing of almost 5,000 applications submitted over the last 18 months by Veracode's wide client base. The data covers both web and non-web application code in the most common programming languages: C/C++, ColdFusion, Java, .NET and PHP.
This report provides even more data on the types of vulnerabilities found, with further comparisons between applications by industry sector, company type, purpose, supplier type and time to acceptable quality. There is a wealth of statistics that will be useful to anyone looking to reduce software vulnerabilities, including developers, testers and those in the information security industry. I'm particularly impressed by the thought that has gone into the design of the data-rich charts and the honesty about whether trends are statistically significant.
One aspect mentioned is that newer applications tested on first submission are not much better than older ones (in this case "older" means "a year or so ago"). The reasons suggested are either a lack of secure development practices, or that such practices were performed but inadequately. But I wonder whether this might instead reflect Veracode's customers beginning to work backwards through their legacy applications, assessing them in order to rank them for remediation effort. If so, those legacy applications will not have had the same degree of care and attention as more recently developed software.
The mine of information presented over 50 pages also discusses the relatively low level of security knowledge among developers, and the need to provide better awareness and training. A new section in this report attempts to examine remediation efforts. I really appreciate the work that has gone into this and the presentation of so much data analysis. We also have to thank Veracode's customers for allowing their data to be included in the aggregate.
The report also discusses how there is a growing usage of third-party risk assessments, where the software is assessed independently using multiple testing techniques. In some sectors, software suppliers are increasingly being held accountable for the security quality of the applications they produce. I think that is a good thing.
While there is a comparison of different sectors, I wonder if it will be possible to delve deeper into some of the details in future. For example, some large providers of outsourced development are also active in the software security space: they have their own products for static and dynamic security testing, and even provide software security consultancy services. Do those companies take their own medicine? Do they apply the knowledge and tools they offer in one part of their business to their own software development services? We probably won't find out any time soon, but it would be fascinating to know.
The report's data suggests web applications are still plagued by vulnerabilities such as cross-site scripting (XSS), information leakage and injection (both SQL injection and CRLF injection). Meanwhile, the most frequently found issues in non-web applications are buffer overflows, error handling flaws and potential backdoors. Cryptographic issues are also very common. The majority of applications tested suffer from these well-known defects, all of which are well documented and have a range of established remediation methods.
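To illustrate how well documented these defences are, here is a minimal sketch of the standard fix for SQL injection, one of the defect classes the report highlights: using bound parameters instead of string concatenation. The table, column names and payload below are my own illustration, not taken from the report.

```python
import sqlite3

# Illustrative in-memory database (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "nobody' OR '1'='1"  # a typical injection payload

# Unsafe: concatenating untrusted input lets the payload rewrite the query logic.
unsafe = conn.execute(
    "SELECT role FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a bound parameter is always treated as data, never as SQL.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()

print(unsafe)  # the payload's OR clause matches every row
print(safe)    # no row matches the literal payload string
```

The same principle (keep untrusted data out of the code/markup channel) underlies the standard fixes for XSS and CRLF injection as well, which is partly why it is striking that these defects remain so prevalent.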
Good reading for the beach this weekend!