SoftwareSecurity2014/Group 9/Reflection


Our reflection on the ASVS

For this project we used the OWASP Application Security Verification Standard (ASVS). The standard provides a structured approach to performing a security audit. It defines four levels of audit thoroughness, and each level prescribes its own methodologies and points of focus.

While using this method we were impressed by the structure it provides by defining different classes of security-related problems (e.g. authentication or input validation). Each class lists the requirements an application should meet in order to pass the audit. However, these requirements were sometimes unclear about what exactly needed to be done. We had quite a few discussions about the meaning of certain requirements, simply because they were ambiguous or vague.

During our assessment we only needed to focus on two of these security classes (Authentication and Cryptography), up to level 2B. Because of this narrow focus we had difficulty seeing the global picture of security within phpBB. This also led to feelings of uncertainty about the exhaustiveness of the verification process: even if the application complied with all the requirements and received the level 2B secure predicate, we still would not be sure that the application was actually secure. Performing higher-level audits, or audits covering more classes, could diminish these doubts.

Our reflection on the verification process

During the verification process we encountered some problems. The main problem we had to tackle was the enormous amount of data we had to cope with: not only the many lines of code of phpBB, but also the huge number of issues reported by the various tools. Aside from the sheer number of issues, we had a hard time determining which part of this information was relevant to the categories we focused on during our audit.

The problem of having too much code is probably due to our group's lack of experience with larger web applications. We expect that doing more audits will help us learn to spot the patterns and understand more quickly what is going on. Also, since most of our assessment focused on authentication, we tried to capture issues early with the use of tools. However, authentication issues can mostly only be found through thorough code-flow analysis, since the whole authentication architecture cannot be summarized in a few lines of code. Getting a tool to recognize such authentication patterns in many different contexts, however, is extremely difficult and not yet possible. (Erik: Indeed!) We therefore had to manually verify most of the phpBB source code to validate our authentication requirements. The tools by themselves are thus not yet developed enough for authentication audits.
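
To illustrate why pattern-matching tools struggle with this, consider the following minimal PHP sketch. This is hypothetical code, not taken from phpBB: the flaw is the absence of a permission check in one code path, which is only visible by following the control flow across functions, not by matching any single dangerous line.

 <?php
 // Hypothetical sketch, not actual phpBB code: a central permission
 // check and two call sites, one of which forgets to use it.
 function user_can(array $user, string $permission): bool
 {
     return in_array($permission, $user['permissions'], true);
 }
 
 function delete_post(array $user, int $post_id): void
 {
     if (!user_can($user, 'm_delete')) {  // correct: check before acting
         die('Not authorised');
     }
     // ... perform the deletion ...
 }
 
 function edit_post(array $user, int $post_id, string $text): void
 {
     // The bug: no user_can() call at all. No line here matches a
     // "dangerous function" signature, so a lexical tool has nothing
     // to flag; only code-flow analysis (or a human) spots the omission.
     // ... perform the edit ...
 }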

We had two deadlines: the first was to perform a level 1B analysis, the second to perform a level 2B analysis. This turned out to be quite inefficient, since many steps from the first analysis had to be redone during the second phase, only more thoroughly. A better approach would be to pick the target level of the audit up front and work from there, instead of auditing at multiple levels; the security requirements from the lower levels propagate to the higher levels anyway. (Erik: I agree. Note that having the two deadlines, for level 1B and 2B, was only really done for mundane practical considerations in having this work as part of this course: the early deadline makes sure groups get started, and the tools give a nice concrete way to get started. For a real security evaluation one would always do these steps, of tool-based and manual analysis, together.)

Splitting the project into smaller parts by only looking at some of the security classes has its pros and cons. We believe OWASP defined the classes in such a way that splitting along class boundaries is the recommended way of dividing the workload. However, considering that the class 'input validation' influences SQL injection, and that SQL injection can in turn yield authentication issues, we note that many classes overlap. In practice this means that when a complete audit is performed by multiple people, some work is done twice. In terms of efficiency this is wasteful, but it may actually improve the quality of the audit, since overlapping areas are reviewed more than once. A sketch of such an overlap is given below.
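
As a minimal sketch of this overlap, consider the following hypothetical PHP login fragment (not taken from phpBB; $pdo stands for an assumed PDO database handle). A missing input-validation control turns into an SQL injection, which in a login query is simultaneously an authentication bypass:

 <?php
 // Hypothetical sketch, not phpBB code; $pdo is an assumed PDO handle.
 
 // Vulnerable: user input is concatenated straight into the SQL query.
 $query = "SELECT user_id FROM users"
        . " WHERE username = '" . $_POST['username'] . "'"
        . " AND password_hash = '" . hash('sha256', $_POST['password']) . "'";
 // Submitting the username  admin' --  (note the trailing space) comments
 // out the password clause, logging the attacker in as admin: an input
 // validation flaw that is at the same time an authentication bypass.
 
 // Fixed: a parameterised query keeps the data out of the SQL structure.
 $stmt = $pdo->prepare(
     'SELECT user_id FROM users WHERE username = ? AND password_hash = ?'
 );
 $stmt->execute([$_POST['username'], hash('sha256', $_POST['password'])]);
 // (Proper password storage, e.g. via password_hash(), is a separate
 // requirement and omitted here for brevity.)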

Another issue was that we only focused on static analysis of the application; penetration testing was not part of the assignment. We believe that dynamic analysis also aids the static part of the audit: if a live environment of the target of verification is available, it is easier to comprehend the code by observing its actions and output in that environment, giving the reviewers a better picture of the application. Doing both a static and a dynamic analysis improves the quality of the audit and gives a better overview of the security of the application. (Erik: Indeed. Because there is already attention to dynamic analysis in the pen-testing-like work in Hacker's Hut, I deliberately avoid it here. Normally one would always combine these techniques.)

The same applies to the fact that we did not do a risk assessment prior to our audit. Since there are many attack vectors to consider, it would have been useful to know beforehand which vectors are more important than others.

General remarks

It was not surprising that the final verdict for our target of verification was not a pass. At the time our application's version was released, there was not much focus on security. To improve an application's security it is important to address security early on (not 'code first, analyse later'). Developers thus have to keep security in mind during the various stages of the design process and during development, instead of developing functionality first and only then worrying about security. It is also important for developers to properly document how, and especially why, various (security) choices were made, to give other developers added insight (especially in open-source applications).

Overall, the verification project was an interesting journey. We learned which kinds of tools are available and how to use them properly for (relatively) simple verification analyses, we gained several insights into code analysis (both with tools and manually), and it was interesting to see how such a large open-source project is actually developed.

Unfortunately, the project also had some downsides, since the tools were not always as good as advertised. We had to work through quite a lot of false positives, and in most cases the tools were not really that useful. Doing a manual code review on a large codebase was also difficult, since it was hard to figure out where to start; it takes a while before one begins to see the general structure of the application.

The software security analysis definitely gave us more insight into security aspects we were not aware of at the start, and especially into how web security is actually implemented. Thanks to the tools, the ASVS, and the manual analysis, we can at least say we have become better web developers!