Net::Analysis Parse tcpdump for HTTP Request/Response Headers
on 29.07.2007 12:41:04 by raja.osu

Hi,
I am pretty new to Perl. I have a requirement to parse a tcpdump file
and extract the HTTP Request/Response headers of successful requests
that have query strings. This I have done using the Net::Analysis
package. But I have the additional requirement of writing the headers
out to different files based on the server program being requested
(e.g. all requests/responses corresponding to example.com/login.pl
should go to one file).
I realised that the command:
perl -MNet::Analysis -e main HTTP Example3.pm tcpdump.file
invokes the .pm file for each line of the dump file. I was initially
thinking of opening one file handle for each unique server program and
writing the headers accordingly, but if the .pm file is invoked per
line of the input file, this does not seem possible. I am very new to
Perl and am not able to think of the best way to get this done. Could
you please help me out with this? I have pasted below the simple
parser which reads the tcpdump file and prints out the request/
response headers for successful requests with query strings.
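Here is a rough, untested sketch of what I was thinking: keep a
file-scoped hash of open filehandles keyed by host+path, so each
server program gets its own output file. The fh_for() helper and the
%handles hash are names I made up; they are not part of Net::Analysis.

=============================================
use strict;
use warnings;
use URI;

my %handles;    # "host/path" => open filehandle, persists across calls

# Return (opening on first use) the filehandle for this request's URI.
sub fh_for {
    my ($uri_string) = @_;
    my $u    = URI->new($uri_string, "http");
    my $host = $u->host || 'unknown';
    my $key  = $host . $u->path;              # e.g. "example.com/login.pl"
    (my $file = $key) =~ s{[^\w.-]}{_}g;      # sanitise into a filename
    unless ($handles{$key}) {
        open my $fh, '>>', "$file.txt" or die "open $file.txt: $!";
        $handles{$key} = $fh;
    }
    return $handles{$key};
}

# Then inside http_transaction, instead of printing to STDOUT:
#   my $fh = fh_for($req->uri);
#   print {$fh} $req->as_string, "\n";
#   print {$fh} $resp->headers_as_string, "\n";
=============================================

Since %handles lives outside the subroutine, the handles would stay
open across callbacks, which was the part I was unsure about.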
Thanks a lot,
Raja
=============================================
package Example3;

use strict;
use warnings;
use base qw(Net::Analysis::Listener::Base);
use URI;
use URI::QueryParam;

# Called for each HTTP transaction found in the capture
sub http_transaction {
    my ($self, $args) = @_;
    my ($req)  = $args->{req};   # isa HTTP::Request
    my ($resp) = $args->{resp};  # isa HTTP::Response

    # if ( $args->{req} )  { printf "%s\n", $req->as_string; }
    # if ( $args->{resp} ) { printf "%s\n", $resp->headers_as_string; }

    if ( $args->{req} && $args->{resp} && lc($req->method) eq "get" ) {
        my $u = URI->new($req->uri, "http");
        if ( $u->query && $resp->code >= 200 && $resp->code < 300 ) {
            print $req->as_string, "\n";
            print $resp->headers_as_string, "\n";
            print "Method: ", $req->method, "\n";
            print "URI: ",    $req->uri,    "\n";
            print "QUERY: ",  $u->query,    "\n";
            for my $key ($u->query_param) {
                print "$key: ", join(", ", $u->query_param($key)), "\n";
            }
        }
    }
}

1;
=============================================