SoftwareSecurity2012/CodeScanners


Assignment

As a first step of the code review project, I want each group to try out RIPS and at least one other source code analysis tool on the source code.

Concrete steps for this:

  1. Besides RIPS, choose one additional tool from the list at the bottom of this page.
  2. Run these tools over the source code of the web application you are looking at.
    If you find that a tool cannot be installed (on a particular platform, or not at all) or, say, crashes when you try it, please document this in the wiki, on a wiki page for that tool, to prevent others from wasting time on this.
  3. Consider the feedback you get from the tools. What kind of feedback do you get? How much feedback do you get: a few lines, a few screenfuls, or tons and tons of warnings? Does it look meaningful/useful? Is (some of) the feedback useful for any security requirements mentioned in the ASVS, and if so, which ones? More in particular, does the tool provide feedback that might be useful for any security requirements your group is looking at? Maybe you already notice that the two tools you try report very similar things? How easy is it to trace back a problem in the source code given the feedback from the tool? Can you understand how these tools actually work? Do you spot obvious false positives? Can you see things that the tools are great at/useful for/hopeless at? Etc. (For a concrete idea of the kind of flaw these scanners report, see the sketch just after this list.)
  4. Look at the results of the code scanners from the point of view of the security requirements (V?) that your group looks at. Are any of the warnings relevant for this, and if so, which ones? If not, can you imagine some code scanning tool that would give feedback relevant for your security requirements? Or is there some fundamental reason why a code scanner cannot do this?
  5. On your own group wiki-page, or a subpage, write a small section (say a screenful or an A4) discussing these issues for each of the tools, and - if applicable - a short comparison between them. Deadline: April 22, so that I have a chance to look at it before the lecture on April 29 and we can discuss and compare our findings during that lecture.

    Overall goal of the section on your group page should be to give a rough impression of what the tools can do and how useful this might be, for which purposes. Of course, a big issue is how accurate and complete the feedback from the tools is, but that is something that I do not expect you to look at now. Hopefully, that might be clearer at the end of the whole project.
    If the tool produces output that you can stick up on the web or in the wiki for others to have a look at, that is fine. Of course, that's no substitute for the discussion of the tools.

  6. Also, don't forget to keep a log of what you have done on your group log page! This should also be a useful means for synchronising work between members of the group. In the log also record your decisions on who will do what, to make sure everyone is clear on this.
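
As a concrete illustration for step 3: the snippet below is a hypothetical PHP fragment (the file name and variable names are invented) showing the kind of tainted data flow that taint-based scanners such as RIPS are designed to report: user input from $_GET reaches a database query and an echo without any sanitization.

  <?php
  // search.php - hypothetical example of flaws a taint-based scanner should flag.

  $term = $_GET['term'];                    // source: attacker-controlled input

  // Sink 1: SQL injection - $term flows into the query string unescaped.
  $result = mysql_query("SELECT * FROM articles WHERE title LIKE '%$term%'");

  // Sink 2: reflected XSS - $term is echoed back without output encoding.
  echo "Results for: " . $term;

  // Breaking the taint flows should silence a taint analyzer, e.g.:
  //   $term = mysql_real_escape_string($_GET['term']);   // for the query
  //   echo "Results for: " . htmlspecialchars($term);    // for the output
  ?>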

Once this is done, if the code scanners did provide something useful for your security requirements, then you can start checking whether these warnings are false positives; this would amount to a Level 1B evaluation in the ASVS approach. If not, then you have to think of another way to get started verifying your security requirements, and you have to move on to a Level 2B type of evaluation. I don't expect you to have finished all this by April 29, but you should have some ideas on how to get started.
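
To illustrate the kind of false-positive checking this involves, here is a hypothetical fragment (names invented): purely lexical scanners such as RATS match calls to "dangerous" functions like system() by name, so they will typically still report the call below even though its argument has been sanitized on the preceding lines. Deciding that such a warning is harmless is exactly the Level 1B work.

  <?php
  // Hypothetical false positive: a pattern-matching scanner flags the
  // system() call by name, ignoring the sanitization above it.
  $file = basename($_GET['file']);          // strips directory components
  $arg  = escapeshellarg($file);            // quotes/escapes shell metacharacters
  system("wc -l " . $arg);                  // typically still reported as dangerous
  ?>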

Code analysis tools

There are several source code analysis tools we can experiment with. Below, record which tools your group looks at. Also, create/update the wiki-page for that tool to record any problems/successes running it, your impressions about what it can/cannot do, etc. The status of some tools is not so clear, so please record it if a tool is effectively dead, impossible to install or use, etc., to save others the effort of trying it.

  • RIPS tried out by all groups
    Tool available here
  • RATS tried out by groups 4 (as Yasca plugin), 5, 1, 8, 3 (standalone and plugin).
    Tool available here
  • PHPLint tried out by groups 4 (as Yasca plugin), 8 (standalone) ...
    Tool available here
  • Yasca tried out by groups 4, 3, 8, ...
    Tool available here (Besides its own analysis, Yasca supports use of RATS and PHPLint as plugins)
  • CodeSecure tried out by groups 3, 8
    A commercial tool, but they offer free 2-week trials, and we have an academic license that we can email you.
  • Fortify - if we manage to get our academic license renewed in time...

Dead tools

There are some tools around that seem to be dead or not really usable for real applications: Pixy, PHP-SAT, SWAAT, CodeScan. PHP Codesniffer only appears to check (syntactic) coding styles.
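
To make that last point concrete, here is a made-up fragment that a coding-style checker such as PHP CodeSniffer would happily accept (consistent naming, spacing, alignment) but that contains an obvious SQL injection; style checks and security checks are different things.

  <?php
  // Cleanly formatted by most coding standards, yet trivially injectable;
  // a style checker passes it, a security scanner should not.
  $userId = $_GET['id'];
  $query  = "DELETE FROM users WHERE id = " . $userId;
  ?>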

Overall impressions of the source code analysis tools

The table below gives a very brief overview of everyone's impressions of the tools and their usefulness, both in general and specifically for the security requirements you are looking at. Try to stick to a three-word evaluation (e.g. great, good, very much, a bit, not much, marginally, not at all, not sure, not sure yet, ...). Motivation for this, and possibly a more detailed judgement, should be somewhere on your group's "reflection on code scanning" page.

Each cell below gives three answers: works? / useful? / useful for us?. A single "-" means the group did not try that tool.

Group | RIPS | RATS | PHPLint | Yasca | CodeSecure
1 | Yes / Yes / Maybe | Yes / A bit / Maybe | Yes / Yes / No | - | -
2 | Yes / Yes / No | - | - | Yes / Yes / No | -
3 | Yes / Yes / Yes | Yes / Limited / No | Yes / Yes / Yes | Yes / Yes / ? | ? / Limited to 10000 lines of code / ?
4 | Yes / Very much / Good | Yes (as Yasca plugin) / Marginally / Not much | Yes (as Yasca plugin) / No, too much output / No | Yes / Good / A bit | -
5 | Yes / Yes / No | Yes / Limited (very sparse output) / No | - | - | -
6 | Yes / Very much / Yes | Yes / A bit / No | Yes / Too much output / Not sure | - | Yes (in pieces of code) / Yes / No (more targeted customization may reveal more)
7 | Yes / Definitely / Yes | Yes (standalone and Yasca plugin) / A bit / Less useful | Yes (Yasca plugin) / Not much / Not much | Yes / Yes (should improve result information) / Yes | -
8 | Yes / Yes / Yes | Yes / Yes / Maybe | Yes (standalone) / Limited / No | Yes / Yes (limited) / Yes (limited) | Yes (stripped source code) / No (false positives) / No
9 | Yes / Yes / No | - | - | Yes / Yes / No | A bit / Not sure / Not sure
10 | Yes / Yes / Yes | Yes / Maybe / Probably not | Yes / Not really (too much output) / No | No / ? / ? | No (license missing) / No / No