
Re: Debian audits (qa, security, rough, performance...)



On Sat, 26 Oct 2002, Steve Kemp wrote:

> On Sat, Oct 26, 2002 at 08:55:56AM -0500, Drew Scott Daniels wrote:
>
>   My project is primarily focussed on security issues, and is carried out
>  by hand in the main.  However I've just created a tools page with pointers
>  to some of the tools you mention, and a 'fuzz tester'.
>
crashme is an interesting system 'fuzz tester'. There are also some really
good network tools that do similar fuzzing and may be worth looking at; IIRC
hping and firewalker can do some of these kinds of things. Interestingly
enough, nmap breaks quite a few network software implementations.
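
For anyone who hasn't played with one, here's a rough sketch in C of the
command-line flavour of this kind of testing (a hypothetical illustration,
not how crashme itself works): build an oversized argument, run the target
with it, and check whether it dies with a signal.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s /path/to/target\n", argv[0]);
        return 1;
    }

    size_t len = 3000;                    /* oversized "xxxx..." argument */
    char *big = malloc(len + 1);
    if (big == NULL)
        return 1;
    memset(big, 'x', len);
    big[len] = '\0';

    pid_t pid = fork();
    if (pid < 0)
        return 1;
    if (pid == 0) {
        execl(argv[1], argv[1], big, (char *)NULL);
        _exit(127);                       /* exec failed */
    }

    int status;
    waitpid(pid, &status, 0);
    if (WIFSIGNALED(status))
        printf("%s died with signal %d on a %lu-byte argument\n",
               argv[1], WTERMSIG(status), (unsigned long)len);
    else
        printf("%s survived (exit status %d)\n",
               argv[1], WEXITSTATUS(status));

    free(big);
    return 0;
}

Compile that and point it at a binary (e.g. ./argfuzz /usr/bin/foo, both
names being placeholders) and it tells you whether the target coped with the
oversized argument.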

> > http://www.debian.org/~apenwarr/popcon/ may be a good place to look for
>   I'd forgotten about the popularity contest; my initial thoughts were that
>  all packages which are remotely accessible should be looked at first,
>  along with those 'base packages' which we all know and love.
>
>   The popularity contest approach is a good one though.
>
I like your approach too; however, popcon can help you target the more
popular remotely accessible programs.

> > https://sourceforge.net/projects/debraudit/ which is a more general audit,
> > I feel will assist audits of Debian code. I would also like to target
> > performance and any kind of bugs in my audit project.
>
>   I think that performance bugs may be hard to fix, even if simple to report,
>  so I'd, personally, not mention them.
>
>   Other bugs are worth reporting.  I know that I've reported a few trivial
>  ones over the past few days (eg. program `foo` crashes if given a command
>  line argument of xxxxx... x 3000), these are the kind of things that might
>  be exploitable if the binary in question is setuid/setgid.
>
Performance issues are just something I hope to find. It would be hard to
look for them systematically, but there may be some slow, commonly used
patterns that could be found easily.
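
To give a (made-up) example of what I mean by an easily found slow common
pattern: strlen() in a loop condition, which turns a linear pass over a
string into a quadratic one.

#include <string.h>

/* Hypothetical example of an easy-to-spot slow pattern: strlen() is
 * re-evaluated on every loop iteration, making this O(n^2). */
unsigned count_vowels_slow(const char *s)
{
    unsigned n = 0;
    size_t i;
    for (i = 0; i < strlen(s); i++)       /* rescans the whole string each time */
        if (strchr("aeiou", s[i]))
            n++;
    return n;
}

/* The fix is trivial: compute the length once, outside the loop. */
unsigned count_vowels_fast(const char *s)
{
    unsigned n = 0;
    size_t i, len = strlen(s);
    for (i = 0; i < len; i++)
        if (strchr("aeiou", s[i]))
            n++;
    return n;
}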

>   I'd be happy to share any interesting results with you, or anybody
>  else.  (Obviously some things can't be publicly mentioned until they've
>  been fixed though)
>
I'm more interested in how to use the audit tools and methods than in what
the results for a specific program were. Of course I'd like all my
software to be secure. :-)

>   They look good, I've recently been playing with rats and my initial impression
>  is that it looks good, I like the way it tries to weed out false positives.
>
There are other code auditing programs too, but I don't know where I put
their names and URLs. You seem to have found the top three on your tools
page anyway.
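
For what it's worth, the classic construct those scanners flag is an
unbounded string copy into a fixed buffer; a made-up before/after:

#include <stdio.h>
#include <string.h>

/* The kind of construct a source scanner flags: an unbounded copy into
 * a fixed-size buffer, which overflows if `name` is longer than 63 bytes. */
void greet_unsafe(const char *name)
{
    char buf[64];
    strcpy(buf, name);                    /* flagged: no length check */
    printf("hello, %s\n", buf);
}

/* One common fix: bound the copy and guarantee NUL termination. */
void greet_safer(const char *name)
{
    char buf[64];
    strncpy(buf, name, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';
    printf("hello, %s\n", buf);
}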

>   From my limited examination of ADL it just appears to be a pre-condition
>  and post-condition assertion system for C++, modelled after XP.  If it
>  were to be used it would have to be used globally within the project,
>  and seems hard to retrofit.
>
>   Is my understanding flawed, or is this a fair assessment?
>
I'm not sure. I'm taking a higher-level software engineering course now,
and we're covering the Z specification language. It seems possible, though
difficult, to do; I think it's well worth retrofitting code for easier
audits in the future, and perhaps even in the present. Such audit code
would have to be filed as wishlist bugs against packages, but I can't see
why anyone would be against adding audit code to their diffs.
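
To make 'audit code' a bit more concrete, here's a hypothetical sketch in
plain C of the pre-/post-condition style (not ADL or Z syntax, just assert()
calls documenting a function's contract), the sort of thing that could go
into a small diff:

#include <assert.h>
#include <string.h>

/* Hypothetical sketch of contract-style "audit code": plain assert()s
 * documenting a function's pre- and post-conditions, the sort of thing
 * that could be added to an existing package in a small diff. */
size_t copy_field(char *dst, size_t dstlen, const char *src)
{
    size_t n;

    /* preconditions */
    assert(dst != NULL);
    assert(src != NULL);
    assert(dstlen > 0);

    n = strlen(src);
    if (n >= dstlen)
        n = dstlen - 1;                   /* truncate rather than overflow */
    memcpy(dst, src, n);
    dst[n] = '\0';

    /* postconditions */
    assert(strlen(dst) == n);
    assert(strlen(dst) < dstlen);

    return n;
}

The nice thing is that assert() compiles away under -DNDEBUG, so the checks
cost nothing in a normal build while still documenting the contract for
anyone auditing the code.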

     Drew Daniels
Still looking for work. My resume is at:
http://home.cc.umanitoba.ca/~umdanie8/resume.html


