
NAME
       - make combined logfile from SQL database

SYNOPSIS
       <days> <virtual host>


DESCRIPTION
       This Perl script extracts httpd access data from a MySQL database and
       formats it as a combined logfile suitable for parsing by third-party
       log analysis tools.

       The script is intended to be run by cron. Its command-line arguments
       tell it how many days' worth of access records to extract, and which
       virtual host you are interested in (many people log several virtual
       hosts to one MySQL database). This permits you to run it daily,
       weekly, every 9 days -- whatever you decide.
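       The two arguments effectively become parameters of the extraction
       query. A minimal Python sketch of how that might look; the table and
       column names ("access_log", "virtual_host", "time_stamp") are
       hypothetical, since the actual schema is not given here:

       ```python
       def build_query(days, virtual_host):
           # Hypothetical schema: adjust table/column names to your own.
           # Placeholders keep the virtual host and day count out of the
           # SQL text itself, so the values are passed separately.
           sql = ("SELECT * FROM access_log "
                  "WHERE virtual_host = %s "
                  "AND time_stamp >= NOW() - INTERVAL %s DAY "
                  "ORDER BY time_stamp")
           return sql, (virtual_host, days)
       ```

       The query string and parameter tuple would then be handed to a DB
       driver's execute() call.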


       By "days" I mean "chunks of 24 hours prior to the moment this script
       is run." So if you run it at 4:34 p.m. on the 12th, it will go back
       to 4:34 p.m. on the 11th.
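       That window arithmetic can be sketched in a few lines of Python (the
       original script is Perl; this is only an illustration of the "chunks
       of 24 hours" rule):

       ```python
       from datetime import datetime, timedelta

       def cutoff(days, now=None):
           # Start of the extraction window: `days` chunks of 24 hours
           # counted back from the moment the script is run.
           now = now or datetime.now()
           return now - timedelta(days=days)

       # Run at 4:34 p.m. on the 12th with days=1:
       # the window starts at 4:34 p.m. on the 11th.
       ```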


       Because GET and POST are not distinguished in the MySQL log, we'll
       just assume that all requests are GETs. This should have a negligible
       effect on any analysis software. It could be remedied if you stored
       the full HTTP request line in your database instead of just the URI,
       but that will cost you a lot of space very quickly.
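       To make the GET assumption concrete, here is a sketch of turning one
       database row into a Combined Log Format line. The column names in the
       row dict and the "HTTP/1.1" protocol string are assumptions, not the
       script's actual schema:

       ```python
       from datetime import datetime

       def row_to_combined(row):
           # Format one access-log row (a dict with hypothetical column
           # names) as a Combined Log Format line. Every request is written
           # as a GET, since the MySQL log does not record the method.
           ts = row["time_stamp"].strftime("%d/%b/%Y:%H:%M:%S +0000")  # assumes UTC
           return ('%s - %s [%s] "GET %s HTTP/1.1" %d %d "%s" "%s"' % (
               row["remote_host"], row["remote_user"] or "-", ts,
               row["request_uri"], row["status"], row["bytes_sent"],
               row["referer"] or "-", row["agent"] or "-"))
       ```

       Log analyzers that break requests down by method would simply report
       everything under GET, which is the negligible effect described above.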

       Because this is something of a quick hack, it doesn't do the most
       robust error checking in the world. Run it by hand to confirm your
       usage before putting it in your crontab.


AUTHORS
       Edward Rudd <>


       Michael A. Toth <> - based on the script's comments


       This man page was written using xml2man(1) by the same author.