SoftwareSecurity2014/Group 12/Reflection
Contents
- 1 Reflection
- 1.1 What difficulties did you encounter, either with the code or the ASVS? Can you think of ways to reduce or avoid these difficulties?
- 1.2 Could the ASVS be clearer, more complete, or structured in a better way?
- 1.3 Is splitting up the work as we did, with different groups looking at different security requirements, a sensible way to split the work? Or are there more practical ways?
- 1.4 What could or should developers do to facilitate a security assessment?
- 1.5 Any other interesting, frustrating, boring, educational, etc. aspects of the whole experience.
Reflection
What difficulties did you encounter, either with the code or the ASVS? Can you think of ways to reduce or avoid these difficulties?
In our opinion the code is very complex, and PHP does not strike us as a well-structured language, which caused a lot of frustration. We tried to build up a structural overview of the code; this eventually worked out, but it took a lot of effort. We chose to focus on the files related to the login, registration, search and contact functionality, because these are the files where input validation is most critical for our requirement.
As for the OWASP ASVS V5 requirements, we sometimes had difficulties interpreting them. For instance, we were not sure how to interpret requirement V5.6 ("Verify that a single input validation control is used by the application for each type of data that is accepted."). We discussed how the phrase "type of data" should be interpreted and finally agreed on a definition. (Erik: Good point - this notion of data type is rather vague.) Some of the requirements are only satisfied if something holds for all input validation failures. To disprove such a requirement, only one counterexample needs to be found, but we experienced that such a counterexample is sometimes hard to find.
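Our reading of V5.6 - one validation control per accepted data type - can be sketched as a central validator registry with a single entry point. This is only an illustrative sketch in Python; the type names and patterns are our own assumptions, not taken from Joomla's actual code.

```python
import re

# Hypothetical central registry: one validator per accepted data type
# (our interpretation of ASVS V5.6). Patterns are illustrative only.
VALIDATORS = {
    "username": re.compile(r"^[A-Za-z0-9_]{3,30}$"),
    "email":    re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "search":   re.compile(r"^[\w\s\-]{1,100}$"),
}

def validate(data_type: str, value: str) -> bool:
    """Single entry point for input validation: every form field is routed
    through here, so each data type has exactly one validation control."""
    pattern = VALIDATORS.get(data_type)
    if pattern is None:
        # Unknown types are rejected outright rather than silently accepted.
        raise ValueError(f"no validator registered for type {data_type!r}")
    return bool(pattern.fullmatch(value))

print(validate("username", "alice_01"))    # True
print(validate("email", "not-an-email"))   # False
```

With such a single control in place, finding one input field that bypasses `validate` would be exactly the kind of counterexample that disproves the requirement.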
Could the ASVS be clearer, more complete, or structured in a better way?
ASVS is a well-structured method to evaluate code on different aspects. It is well defined and helps you keep your focus on the most important security aspects. However, it is unclear when a program passes or fails a requirement: if a single piece of code fails, does the whole test fail?
Even though the ASVS document formulates precise security requirements (Erik: I think it is not always that clear - e.g., as you note with the rather vague concept of 'data type' above.), to our knowledge it lacks guidelines for how to properly check the requirements. (Erik: It sure lacks such guidelines. Note that it is hard to give simple and clear guidelines that can be followed - a lot of it comes down to the nature of the application, the programming language used, how the application is organised, etc.)
Is splitting up the work as we did, with different groups looking at different security requirements, a sensible way to split the work? Or are there more practical ways?
We think it was a sensible way to split the work, because each group could focus on its own specific security requirement. On the other hand, we only looked at specific security requirements (e.g. V5 Input Validation) and not at all of them, so we do not have a complete view of the application's security.
Within our group we divided the work by focusing on different pieces of code and tested all requirements on each piece to get an overall view of the code. Spreading the load let us discover certain special modules faster and helped us get a good idea of the code structure.
What could or should developers do to facilitate a security assessment?
Developers should convey their overall design, to clarify how Joomla was built and designed. A document listing the security-critical functions, in which the developers motivate their security decisions, would also help. Furthermore, the code should be well documented so that security analysts are able to give a proper security analysis. (Erik: From your analysis I get the impression that some thought and design has gone into organising input validation for Joomla, even though it is hard to figure out how it's organised from just the pile of code. So some design documentation could indeed have helped a lot.)
Any other interesting, frustrating, boring, educational, etc. aspects of the whole experience.
We had the feeling that we were thrown to the wolves: we were expected to choose a large web application without any prior knowledge of that specific application, and since we were expected to investigate ASVS V5, we only had the chance to focus on a single requirement. It would have been a better learning experience to pick an already prepared project. A practical session could have been spent on an in-depth explanation of the web application, and we could have covered one ASVS chapter every week and reviewed the results the week after. This would be a better learning experience; the feedback helps to develop a proper way of thinking.
(Erik: I appreciate your points. Some groups really like the freedom to look at some web application they happen to know and use, but I can see that people with less experience would like some more structure. My fear is that a more structured approach as you suggest would turn it into more of a toy exercise. Experiencing what it is like to be thrown to the wolves is one of the goals of the project - security reviewing is a messy issue, so experiencing it hopefully changes how people approach the task of engineering the security of the applications they build.)