[Nikto-discuss] Mutation & Databases & Memory
Sullo
csullo at gmail.com
Tue Jan 12 05:03:45 UTC 2010
Ok, I've been messing around with using a flat-file database for
storage of mutation (m=1) attacks. Here are some preliminary findings.
I'm looking for feedback on things I missed or should try...
I tried perl's BerkeleyDB and DBM::Deep modules. Both offer flat-file
databases that can store serialized data (though BerkeleyDB requires
Data::Serializer to do so). Both also offer optional compression using Zlib.
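For reference, here's roughly what the two approaches look like. This
is just a sketch of my test setup: the filenames, the example testid,
the record fields, and the choice of Storable as the serializer under
Data::Serializer are all arbitrary.

    use BerkeleyDB;
    use Data::Serializer;
    use DBM::Deep;

    # BerkeleyDB: tie a hash to the db file, serialize each record by hand
    my %bdb;
    tie %bdb, 'BerkeleyDB::Hash',
        -Filename => 'mutations.bdb',
        -Flags    => DB_CREATE
      or die "can't open mutations.bdb: $! $BerkeleyDB::Error\n";

    my $ser = Data::Serializer->new(serializer => 'Storable');
    $bdb{'999001'} = $ser->serialize([ '/cgi-bin/', 'GET', 'fake test data' ]);
    my $fields = $ser->deserialize($bdb{'999001'});

    # DBM::Deep: stores nested perl structures directly, no serializer needed
    my $deep = DBM::Deep->new('mutations.ddb');
    $deep->{'999001'} = [ '/cgi-bin/', 'GET', 'fake test data' ];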
- Compression isn't worth the overhead, since it's applied on a per-field basis
- BerkeleyDB offers the best overall file size for the data set (arrays
stored via Data::Serializer) using testid as the key
- BerkeleyDB is significantly (many orders of magnitude) faster than
DBM::Deep (a rough timing harness is sketched after this list)
- A full BerkeleyDB file with over 3 million mutated tests takes
approximately 20 minutes to generate and uses 660 MB of disk space
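For anyone who wants to reproduce the speed comparison, this is
roughly the kind of harness I mean. It's only a sketch: the record
count and record shape are made up, and Benchmark is core perl.

    use Benchmark qw(timethese);
    use BerkeleyDB;
    use Data::Serializer;
    use DBM::Deep;

    my %bdb;
    tie %bdb, 'BerkeleyDB::Hash', -Filename => 'bench.bdb', -Flags => DB_CREATE
      or die "can't open bench.bdb: $! $BerkeleyDB::Error\n";
    my $ser  = Data::Serializer->new(serializer => 'Storable');
    my $deep = DBM::Deep->new('bench.ddb');

    # write the same 10k fake mutation records through each backend once
    timethese(1, {
        berkeleydb => sub {
            $bdb{"b$_"} = $ser->serialize([ "/path$_", 'GET', 'data' ]) for 1 .. 10_000;
        },
        dbm_deep => sub {
            $deep->{"d$_"} = [ "/path$_", 'GET', 'data' ] for 1 .. 10_000;
        },
    });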
So, BerkeleyDB comes out the winner on speed alone (that said,
DBM::Deep shouldn't be ignored if you need something easy and
lightweight with smaller file sizes--it's easier on dependencies and
works quite well).
Anyway...
Questions for the crowd:
- If we convert all the databases into a binary format using
BerkeleyDB, do we "lose" anything... besides them being greppable?
- We'll have to craft a way to still support user databases (does
anyone use them?) if we make all the dbs binary--one possible
approach is sketched after this list.
- Any other modules recommended for testing?
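On the user database question, one way to keep them working would be
to leave user dbs as plain text and fold them into the binary db when
it's opened. A sketch, assuming a comma-separated, quoted-field format
and a hypothetical 'udb_tests' filename (the naive split is just for
illustration; real parsing would need to be smarter):

    use BerkeleyDB;
    use Data::Serializer;

    my %tests;
    tie %tests, 'BerkeleyDB::Hash', -Filename => 'db_tests.bdb', -Flags => DB_CREATE
      or die "can't open db_tests.bdb: $! $BerkeleyDB::Error\n";
    my $ser = Data::Serializer->new(serializer => 'Storable');

    # merge a plain-text user db into the tied binary db at load time
    if (open my $fh, '<', 'udb_tests') {
        while (my $line = <$fh>) {
            chomp $line;
            next if $line =~ /^\s*(#|$)/;      # skip comments and blank lines
            my @fields = split /","/, $line;   # naive quoted-CSV split
            (my $testid = $fields[0]) =~ s/^"//;
            $tests{$testid} = $ser->serialize(\@fields);  # user entry wins
        }
        close $fh;
    }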
--
http://www.cirt.net | http://www.osvdb.org/