Mirror of https://github.com/zeek/zeek.git (synced 2025-10-05 08:08:19 +00:00)
Merge remote-tracking branch 'origin/topic/jsiwek/faf-cleanup'
Closes #1002.

* origin/topic/jsiwek/faf-cleanup:
  Move file analyzers to new plugin infrastructure.
  Add a general file analysis overview/how-to document.
  Improve file analysis doxygen comments.
  Improve tracking of HTTP file extraction (addresses #988).
  Fix HTTP multipart body file analysis.
  Remove logging of analyzers field of FileAnalysis::Info.
  Remove extraction counter in default file extraction scripts.
  Remove FileAnalysis::postpone_timeout.
  Make default get_file_handle handlers &priority=5.
  Add input interface to forward data for file analysis.
  File analysis framework interface simplifications.
commit d8b05af7e5
127 changed files with 2458 additions and 1412 deletions
CHANGES (41 changed lines)
@ -1,4 +1,45 @@
2.1-755 | 2013-07-03 16:22:43 -0700
|
||||
|
||||
* Add a general file analysis overview/how-to document. (Jon Siwek)
|
||||
|
||||
* Improve file analysis doxygen comments. (Jon Siwek)
|
||||
|
||||
* Improve tracking of HTTP file extraction. http.log now has files
|
||||
taken from request and response bodies in different fields for
|
||||
each, and can now track multiple files per body. That is, the
|
||||
"extraction_file" field is now "extracted_request_files" and
|
||||
"extracted_response_files". Addresses #988. (Jon Siwek)
|
||||
|
||||
* Fix HTTP multipart body file analysis. Each part now gets assigned
|
||||
a different file handle/id. (Jon Siwek)
|
||||
|
||||
* Remove logging of analyzers field of FileAnalysis::Info. (Jon
|
||||
Siwek)
|
||||
|
||||
* Remove extraction counter in default file extraction scripts. (Jon
|
||||
Siwek)
|
||||
|
||||
* Remove FileAnalysis::postpone_timeout.
|
||||
FileAnalysis::set_timeout_interval can now perform same function.
|
||||
(Jon Siwek)
|
||||
|
||||
* Make default get_file_handle handlers &priority=5 so they're
|
||||
easier to override. (Jon Siwek)
|
||||
|
||||
* Add input interface to forward data for file analysis. The new
|
||||
Input::add_analysis function is used to automatically forward
|
||||
input data on to the file analysis framework. (Jon Siwek)
|
||||
|
||||
* File analysis framework interface simplifications. (Jon Siwek)
|
||||
|
||||
- Remove script-layer data input interface (will be managed directly
|
||||
by input framework later).
|
||||
|
||||
- Only track files internally by file id hash. Chance of collision
|
||||
too small to justify also tracking unique file string.
|
||||
|
||||
|
||||
2.1-741 | 2013-06-07 17:28:50 -0700
|
||||
|
||||
* Fixing typo that could cause an assertion to falsely trigger.
|
||||
|
|
VERSION (2 changed lines)
@ -1 +1 @@
2.1-741
|
||||
2.1-755
|
||||
|
|
doc/file-analysis.rst (new file, 184 lines)
@ -0,0 +1,184 @@
=============
|
||||
File Analysis
|
||||
=============
|
||||
|
||||
.. rst-class:: opening
|
||||
|
||||
In the past, writing Bro scripts to analyze file content could be
cumbersome because the content was presented at the script layer in
different ways, via different events, depending on which network
protocol carried the file. Scripts written to analyze files over one
protocol had to be copied and modified to fit other protocols. The
file analysis framework (FAF) instead provides a generalized
presentation of file-related information. The protocol that
transported a file over the network is still available, but it no
longer dictates how you organize your scripting logic. A goal of the
FAF is to provide analysis specifically for files that is analogous
to the analysis Bro provides for network connections.
|
||||
|
||||
.. contents::
|
||||
|
||||
File Lifecycle Events
|
||||
=====================
|
||||
|
||||
The key events that may occur during the lifetime of a file are:
|
||||
:bro:see:`file_new`, :bro:see:`file_over_new_connection`,
|
||||
:bro:see:`file_timeout`, :bro:see:`file_gap`, and
|
||||
:bro:see:`file_state_remove`. Handling any of these events provides
|
||||
some information about the file such as which network
|
||||
:bro:see:`connection` and protocol are transporting the file, how many
|
||||
bytes have been transferred so far, and its MIME type.
|
||||
|
||||
.. code:: bro
|
||||
|
||||
event connection_state_remove(c: connection)
|
||||
{
|
||||
print "connection_state_remove";
|
||||
print c$uid;
|
||||
print c$id;
|
||||
for ( s in c$service )
|
||||
print s;
|
||||
}
|
||||
|
||||
event file_state_remove(f: fa_file)
|
||||
{
|
||||
print "file_state_remove";
|
||||
print f$id;
|
||||
for ( cid in f$conns )
|
||||
{
|
||||
print f$conns[cid]$uid;
|
||||
print cid;
|
||||
}
|
||||
print f$source;
|
||||
}
|
||||
|
||||
might give output like::
|
||||
|
||||
file_state_remove
|
||||
Cx92a0ym5R8
|
||||
REs2LQfVW2j
|
||||
[orig_h=10.0.0.7, orig_p=59856/tcp, resp_h=192.150.187.43, resp_p=80/tcp]
|
||||
HTTP
|
||||
connection_state_remove
|
||||
REs2LQfVW2j
|
||||
[orig_h=10.0.0.7, orig_p=59856/tcp, resp_h=192.150.187.43, resp_p=80/tcp]
|
||||
HTTP
|
||||
|
||||
This doesn't perform any interesting analysis yet, but it does highlight
the similarity between analysis of connections and files. Connections
are identified by the usual 5-tuple or a convenient UID string, while
files are identified by a string of the same format as the connection
UID. So both connections and files have unique identifiers, and a file
holds references to the connection (or connections) that transported it.
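The other lifecycle events listed above can be handled the same way; a
minimal sketch follows, where the handler bodies simply print the
parameters documented for these events:

.. code:: bro

    event file_over_new_connection(f: fa_file, c: connection)
        {
        # The same file is now also being seen over another connection.
        print "file_over_new_connection", f$id, c$uid;
        }

    event file_gap(f: fa_file, offset: count, len: count)
        {
        # Part of the file's contents was never seen on the wire.
        print "file_gap", f$id, offset, len;
        }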
|
||||
|
||||
Adding Analysis
|
||||
===============
|
||||
|
||||
There are builtin file analyzers which can be attached to files. Once
|
||||
attached, they start receiving the contents of the file as Bro extracts
|
||||
it from an ongoing network connection. What they do with the file
|
||||
contents is up to the particular file analyzer implementation, but
|
||||
they'll typically either report further information about the file via
|
||||
events (e.g. :bro:see:`FileAnalysis::ANALYZER_MD5` will report the
|
||||
file's MD5 checksum via :bro:see:`file_hash` once calculated) or they'll
|
||||
have some side effect (e.g. :bro:see:`FileAnalysis::ANALYZER_EXTRACT`
|
||||
will write the contents of the file out to the local file system).
|
||||
|
||||
In the future there may be file analyzers that automatically attach to
|
||||
files based on heuristics, similar to the Dynamic Protocol Detection
|
||||
(DPD) framework for connections, but many will always require an
|
||||
explicit attachment decision:
|
||||
|
||||
.. code:: bro
|
||||
|
||||
event file_new(f: fa_file)
|
||||
{
|
||||
print "new file", f$id;
|
||||
if ( f?$mime_type && f$mime_type == "text/plain" )
|
||||
FileAnalysis::add_analyzer(f, [$tag=FileAnalysis::ANALYZER_MD5]);
|
||||
}
|
||||
|
||||
event file_hash(f: fa_file, kind: string, hash: string)
|
||||
{
|
||||
print "file_hash", f$id, kind, hash;
|
||||
}
|
||||
|
||||
This script calculates MD5 checksums for all plain-text files and might
give output like::
|
||||
|
||||
new file, Cx92a0ym5R8
|
||||
file_hash, Cx92a0ym5R8, md5, 397168fd09991a0e712254df7bc639ac
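The other hash analyzers work the same way: ``FileAnalysis::ANALYZER_SHA1``
and ``FileAnalysis::ANALYZER_SHA256`` also report through
:bro:see:`file_hash`, with the ``kind`` parameter identifying the digest.
A sketch that requests several digests of the same file:

.. code:: bro

    event file_new(f: fa_file)
        {
        # Each analyzer reports its result separately via file_hash.
        FileAnalysis::add_analyzer(f, [$tag=FileAnalysis::ANALYZER_MD5]);
        FileAnalysis::add_analyzer(f, [$tag=FileAnalysis::ANALYZER_SHA1]);
        FileAnalysis::add_analyzer(f, [$tag=FileAnalysis::ANALYZER_SHA256]);
        }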
|
||||
|
||||
Some file analyzers might have tunable parameters that need to be
|
||||
specified in the call to :bro:see:`FileAnalysis::add_analyzer`:
|
||||
|
||||
.. code:: bro
|
||||
|
||||
event file_new(f: fa_file)
|
||||
{
|
||||
FileAnalysis::add_analyzer(f, [$tag=FileAnalysis::ANALYZER_EXTRACT,
|
||||
$extract_filename="./myfile"]);
|
||||
}
|
||||
|
||||
In this case, the file extraction analyzer doesn't generate any further
events, but it does have the side effect of writing out the file contents
to the local file system at the specified location of ``./myfile``. Of
course, on a network with more than a single file being transferred, you
will probably want a distinct extraction path for each file rather than
the fixed one used in this example; one way to do that is sketched below.
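A minimal sketch of per-file extraction paths, deriving the on-disk name
from the file's ``id`` field (much like the default extraction scripts in
this change do):

.. code:: bro

    event file_new(f: fa_file)
        {
        # Use the unique file id to build a distinct output path per file.
        local fname = fmt("./extract-%s.dat", f$id);
        FileAnalysis::add_analyzer(f, [$tag=FileAnalysis::ANALYZER_EXTRACT,
                                       $extract_filename=fname]);
        }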
|
||||
|
||||
Regardless of which file analyzers end up acting on a file, general
information about the file (e.g. size, time of last data transferred,
MIME type, etc.) is logged in ``file_analysis.log``.
|
||||
|
||||
Input Framework Integration
|
||||
===========================
|
||||
|
||||
The FAF comes with a simple way to integrate with the
:doc:`Input Framework <input>`, so that Bro can analyze files from
external sources in the same way it analyzes files it sees in traffic
on a monitored network interface. It only requires a call to
:bro:see:`Input::add_analysis`:
|
||||
|
||||
.. code:: bro
|
||||
|
||||
redef exit_only_after_terminate = T;
|
||||
|
||||
event file_new(f: fa_file)
|
||||
{
|
||||
print "new file", f$id;
|
||||
FileAnalysis::add_analyzer(f, [$tag=FileAnalysis::ANALYZER_MD5]);
|
||||
}
|
||||
|
||||
event file_state_remove(f: fa_file)
|
||||
{
|
||||
Input::remove(f$source);
|
||||
terminate();
|
||||
}
|
||||
|
||||
event file_hash(f: fa_file, kind: string, hash: string)
|
||||
{
|
||||
print "file_hash", f$id, kind, hash;
|
||||
}
|
||||
|
||||
event bro_init()
|
||||
{
|
||||
local source: string = "./myfile";
|
||||
Input::add_analysis([$source=source, $name=source]);
|
||||
}
|
||||
|
||||
Note that the "source" field of :bro:see:`fa_file` corresponds to the
|
||||
"name" field of :bro:see:`Input::AnalysisDescription` since that is what
|
||||
the input framework uses to uniquely identify an input stream.
|
||||
|
||||
The output of the above script may be::
|
||||
|
||||
new file, G1fS2xthS4l
|
||||
file_hash, G1fS2xthS4l, md5, 54098b367d2e87b078671fad4afb9dbb
|
||||
|
||||
Nothing too special, but it at least verifies that the MD5 file analyzer
saw all the bytes of the input file and calculated the checksum
correctly.
|
|
@ -25,6 +25,7 @@ Frameworks
|
|||
notice
|
||||
logging
|
||||
input
|
||||
file-analysis
|
||||
cluster
|
||||
signatures
|
||||
|
||||
|
|
|
@ -34,6 +34,7 @@ rest_target(${CMAKE_BINARY_DIR}/scripts base/bif/plugins/Bro_DNS.events.bif.bro)
|
|||
rest_target(${CMAKE_BINARY_DIR}/scripts base/bif/plugins/Bro_FTP.events.bif.bro)
|
||||
rest_target(${CMAKE_BINARY_DIR}/scripts base/bif/plugins/Bro_FTP.functions.bif.bro)
|
||||
rest_target(${CMAKE_BINARY_DIR}/scripts base/bif/plugins/Bro_File.events.bif.bro)
|
||||
rest_target(${CMAKE_BINARY_DIR}/scripts base/bif/plugins/Bro_FileHash.events.bif.bro)
|
||||
rest_target(${CMAKE_BINARY_DIR}/scripts base/bif/plugins/Bro_Finger.events.bif.bro)
|
||||
rest_target(${CMAKE_BINARY_DIR}/scripts base/bif/plugins/Bro_GTPv1.events.bif.bro)
|
||||
rest_target(${CMAKE_BINARY_DIR}/scripts base/bif/plugins/Bro_Gnutella.events.bif.bro)
|
||||
|
|
|
@ -15,18 +15,20 @@ export {
|
|||
## A structure which represents a desired type of file analysis.
|
||||
type AnalyzerArgs: record {
|
||||
## The type of analysis.
|
||||
tag: Analyzer;
|
||||
tag: FileAnalysis::Tag;
|
||||
|
||||
## The local filename to which to write an extracted file. Must be
|
||||
## set when *tag* is :bro:see:`FileAnalysis::ANALYZER_EXTRACT`.
|
||||
extract_filename: string &optional;
|
||||
|
||||
## An event which will be generated for all new file contents,
|
||||
## chunk-wise.
|
||||
## chunk-wise. Used when *tag* is
|
||||
## :bro:see:`FileAnalysis::ANALYZER_DATA_EVENT`.
|
||||
chunk_event: event(f: fa_file, data: string, off: count) &optional;
|
||||
|
||||
## An event which will be generated for all new file contents,
|
||||
## stream-wise.
|
||||
## stream-wise. Used when *tag* is
|
||||
## :bro:see:`FileAnalysis::ANALYZER_DATA_EVENT`.
|
||||
stream_event: event(f: fa_file, data: string) &optional;
|
||||
} &redef;
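The *chunk_event* and *stream_event* fields above pair with
``FileAnalysis::ANALYZER_DATA_EVENT``; a minimal sketch of wiring up a
stream handler (the event name ``my_file_data`` is hypothetical):

event my_file_data(f: fa_file, data: string)
    {
    # Receives each new piece of the file's content, stream-wise.
    print "file data", f$id, |data|;
    }

event file_new(f: fa_file)
    {
    FileAnalysis::add_analyzer(f, [$tag=FileAnalysis::ANALYZER_DATA_EVENT,
                                   $stream_event=my_file_data]);
    }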
|
||||
|
||||
|
@ -87,7 +89,7 @@ export {
|
|||
conn_uids: set[string] &log;
|
||||
|
||||
## A set of analysis types done during the file analysis.
|
||||
analyzers: set[Analyzer] &log;
|
||||
analyzers: set[FileAnalysis::Tag];
|
||||
|
||||
## Local filenames of extracted files.
|
||||
extracted_files: set[string] &log;
|
||||
|
@ -120,7 +122,9 @@ export {
|
|||
|
||||
## Sets the *timeout_interval* field of :bro:see:`fa_file`, which is
|
||||
## used to determine the length of inactivity that is allowed for a file
|
||||
## before internal state related to it is cleaned up.
|
||||
## before internal state related to it is cleaned up. When used within a
|
||||
## :bro:see:`file_timeout` handler, the analysis will delay timing out
|
||||
## again for the period specified by *t*.
|
||||
##
|
||||
## f: the file.
|
||||
##
|
||||
|
@ -130,18 +134,6 @@ export {
|
|||
## for the *id* isn't currently active.
|
||||
global set_timeout_interval: function(f: fa_file, t: interval): bool;
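A sketch of how ``set_timeout_interval`` replaces the removed
``postpone_timeout`` below: calling it from a ``file_timeout`` handler
grants the file another period of allowed inactivity (the 30-second value
is arbitrary, for illustration):

event file_timeout(f: fa_file)
    {
    # Delay the next timeout by another 30 seconds of allowed inactivity.
    FileAnalysis::set_timeout_interval(f, 30sec);
    }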
|
||||
|
||||
## Postpones the timeout of file analysis for a given file.
|
||||
## When used within a :bro:see:`file_timeout` handler for, the analysis
|
||||
## the analysis will delay timing out for the period of time indicated by
|
||||
## the *timeout_interval* field of :bro:see:`fa_file`, which can be set
|
||||
## with :bro:see:`FileAnalysis::set_timeout_interval`.
|
||||
##
|
||||
## f: the file.
|
||||
##
|
||||
## Returns: true if the timeout will be postponed, or false if analysis
|
||||
## for the *id* isn't currently active.
|
||||
global postpone_timeout: function(f: fa_file): bool;
|
||||
|
||||
## Adds an analyzer to the analysis of a given file.
|
||||
##
|
||||
## f: the file.
|
||||
|
@ -171,58 +163,6 @@ export {
|
|||
## rest of it's contents, or false if analysis for the *id*
|
||||
## isn't currently active.
|
||||
global stop: function(f: fa_file): bool;
|
||||
|
||||
## Sends a sequential stream of data in for file analysis.
|
||||
## Meant for use when providing external file analysis input (e.g.
|
||||
## from the input framework).
|
||||
##
|
||||
## source: a string that uniquely identifies the logical file that the
|
||||
## data is a part of and describes its source.
|
||||
##
|
||||
## data: bytestring contents of the file to analyze.
|
||||
global data_stream: function(source: string, data: string);
|
||||
|
||||
## Sends a non-sequential chunk of data in for file analysis.
|
||||
## Meant for use when providing external file analysis input (e.g.
|
||||
## from the input framework).
|
||||
##
|
||||
## source: a string that uniquely identifies the logical file that the
|
||||
## data is a part of and describes its source.
|
||||
##
|
||||
## data: bytestring contents of the file to analyze.
|
||||
##
|
||||
## offset: the offset within the file that this chunk starts.
|
||||
global data_chunk: function(source: string, data: string, offset: count);
|
||||
|
||||
## Signals a content gap in the file bytestream.
|
||||
## Meant for use when providing external file analysis input (e.g.
|
||||
## from the input framework).
|
||||
##
|
||||
## source: a string that uniquely identifies the logical file that the
|
||||
## data is a part of and describes its source.
|
||||
##
|
||||
## offset: the offset within the file that this gap starts.
|
||||
##
|
||||
## len: the number of bytes that are missing.
|
||||
global gap: function(source: string, offset: count, len: count);
|
||||
|
||||
## Signals the total size of a file.
|
||||
## Meant for use when providing external file analysis input (e.g.
|
||||
## from the input framework).
|
||||
##
|
||||
## source: a string that uniquely identifies the logical file that the
|
||||
## data is a part of and describes its source.
|
||||
##
|
||||
## size: the number of bytes that comprise the full file.
|
||||
global set_size: function(source: string, size: count);
|
||||
|
||||
## Signals the end of a file.
|
||||
## Meant for use when providing external file analysis input (e.g.
|
||||
## from the input framework).
|
||||
##
|
||||
## source: a string that uniquely identifies the logical file that the
|
||||
## data is a part of and describes its source.
|
||||
global eof: function(source: string);
|
||||
}
|
||||
|
||||
redef record fa_file += {
|
||||
|
@ -259,11 +199,6 @@ function set_timeout_interval(f: fa_file, t: interval): bool
|
|||
return __set_timeout_interval(f$id, t);
|
||||
}
|
||||
|
||||
function postpone_timeout(f: fa_file): bool
|
||||
{
|
||||
return __postpone_timeout(f$id);
|
||||
}
|
||||
|
||||
function add_analyzer(f: fa_file, args: AnalyzerArgs): bool
|
||||
{
|
||||
if ( ! __add_analyzer(f$id, args) ) return F;
|
||||
|
@ -287,31 +222,6 @@ function stop(f: fa_file): bool
|
|||
return __stop(f$id);
|
||||
}
|
||||
|
||||
function data_stream(source: string, data: string)
|
||||
{
|
||||
__data_stream(source, data);
|
||||
}
|
||||
|
||||
function data_chunk(source: string, data: string, offset: count)
|
||||
{
|
||||
__data_chunk(source, data, offset);
|
||||
}
|
||||
|
||||
function gap(source: string, offset: count, len: count)
|
||||
{
|
||||
__gap(source, offset, len);
|
||||
}
|
||||
|
||||
function set_size(source: string, size: count)
|
||||
{
|
||||
__set_size(source, size);
|
||||
}
|
||||
|
||||
function eof(source: string)
|
||||
{
|
||||
__eof(source);
|
||||
}
|
||||
|
||||
event bro_init() &priority=5
|
||||
{
|
||||
Log::create_stream(FileAnalysis::LOG,
|
||||
|
|
|
@ -122,6 +122,34 @@ export {
|
|||
config: table[string] of string &default=table();
|
||||
};
|
||||
|
||||
## A file analysis input stream type used to forward input data to the
|
||||
## file analysis framework.
|
||||
type AnalysisDescription: record {
|
||||
## String that allows the reader to find the source.
|
||||
## For `READER_ASCII`, this is the filename.
|
||||
source: string;
|
||||
|
||||
## Reader to use for this stream. Compatible readers must be
|
||||
## able to accept a filter of a single string type (i.e.
|
||||
## they read a byte stream).
|
||||
reader: Reader &default=Input::READER_BINARY;
|
||||
|
||||
## Read mode to use for this stream.
|
||||
mode: Mode &default=default_mode;
|
||||
|
||||
## Descriptive name that uniquely identifies the input source.
|
||||
## Can be used to remove a stream at a later time.
|
||||
## This will also be used for the unique *source* field of
|
||||
## :bro:see:`fa_file`. Most of the time, the best choice for this
|
||||
## field will be the same value as the *source* field.
|
||||
name: string;
|
||||
|
||||
## A key/value table that will be passed on to the reader.
## Interpretation of the values is left to the reader, but
## usually they will be used for configuration purposes.
|
||||
config: table[string] of string &default=table();
|
||||
};
|
||||
|
||||
## Create a new table input from a given source. Returns true on success.
|
||||
##
|
||||
## description: `TableDescription` record describing the source.
|
||||
|
@ -132,6 +160,14 @@ export {
|
|||
## description: `TableDescription` record describing the source.
|
||||
global add_event: function(description: Input::EventDescription) : bool;
|
||||
|
||||
## Create a new file analysis input from a given source. Data read from
|
||||
## the source is automatically forwarded to the file analysis framework.
|
||||
##
|
||||
## description: A record describing the source
|
||||
##
|
||||
## Returns: true on success.
|
||||
global add_analysis: function(description: Input::AnalysisDescription) : bool;
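A usage sketch mirroring the example in doc/file-analysis.rst (the path
is illustrative):

event bro_init()
    {
    # Forward the contents of a local file into the file analysis framework.
    Input::add_analysis([$source="./myfile", $name="./myfile"]);
    }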
|
||||
|
||||
## Remove an input stream. Returns true on success and false if the named stream was
|
||||
## not found.
|
||||
##
|
||||
|
@ -164,6 +200,11 @@ function add_event(description: Input::EventDescription) : bool
|
|||
return __create_event_stream(description);
|
||||
}
|
||||
|
||||
function add_analysis(description: Input::AnalysisDescription) : bool
|
||||
{
|
||||
return __create_analysis_stream(description);
|
||||
}
|
||||
|
||||
function remove(id: string) : bool
|
||||
{
|
||||
return __remove_stream(id);
|
||||
|
|
|
@ -222,17 +222,6 @@ type endpoint_stats: record {
|
|||
endian_type: count;
|
||||
};
|
||||
|
||||
## A unique analyzer instance ID. Each time instantiates a protocol analyzers
|
||||
## for a connection, it assigns it a unique ID that can be used to reference
|
||||
## that instance.
|
||||
##
|
||||
## .. bro:see:: Analyzer::name Analyzer::disable_analyzer protocol_confirmation
|
||||
## protocol_violation
|
||||
##
|
||||
## .. todo::While we declare an alias for the type here, the events/functions still
|
||||
## use ``count``. That should be changed.
|
||||
type AnalyzerID: count;
|
||||
|
||||
module Tunnel;
|
||||
export {
|
||||
## Records the identity of an encapsulating parent of a tunneled connection.
|
||||
|
@ -3065,12 +3054,12 @@ module GLOBAL;
|
|||
## Number of bytes per packet to capture from live interfaces.
|
||||
const snaplen = 8192 &redef;
|
||||
|
||||
# Load BiFs defined by plugins.
|
||||
@load base/bif/plugins
|
||||
|
||||
# Load these frameworks here because they use fairly deep integration with
|
||||
# BiFs and script-land defined types.
|
||||
@load base/frameworks/logging
|
||||
@load base/frameworks/input
|
||||
@load base/frameworks/analyzer
|
||||
@load base/frameworks/file-analysis
|
||||
|
||||
# Load BiFs defined by plugins.
|
||||
@load base/bif/plugins
|
||||
|
|
|
@ -41,6 +41,7 @@ function get_file_handle(c: connection, is_orig: bool): string
|
|||
module GLOBAL;
|
||||
|
||||
event get_file_handle(tag: Analyzer::Tag, c: connection, is_orig: bool)
|
||||
&priority=5
|
||||
{
|
||||
if ( tag != Analyzer::ANALYZER_FTP_DATA ) return;
|
||||
set_file_handle(FTP::get_file_handle(c, is_orig));
|
||||
|
|
|
@ -13,8 +13,6 @@ export {
|
|||
const extraction_prefix = "ftp-item" &redef;
|
||||
}
|
||||
|
||||
global extract_count: count = 0;
|
||||
|
||||
redef record Info += {
|
||||
## On disk file where it was extracted to.
|
||||
extraction_file: string &log &optional;
|
||||
|
@ -26,8 +24,7 @@ redef record Info += {
|
|||
|
||||
function get_extraction_name(f: fa_file): string
|
||||
{
|
||||
local r = fmt("%s-%s-%d.dat", extraction_prefix, f$id, extract_count);
|
||||
++extract_count;
|
||||
local r = fmt("%s-%s.dat", extraction_prefix, f$id);
|
||||
return r;
|
||||
}
|
||||
|
||||
|
|
|
@ -6,25 +6,48 @@
|
|||
module HTTP;
|
||||
|
||||
export {
|
||||
redef record HTTP::Info += {
|
||||
## Number of MIME entities in the HTTP request message body so far.
|
||||
request_mime_level: count &default=0;
|
||||
## Number of MIME entities in the HTTP response message body so far.
|
||||
response_mime_level: count &default=0;
|
||||
};
|
||||
|
||||
## Default file handle provider for HTTP.
|
||||
global get_file_handle: function(c: connection, is_orig: bool): string;
|
||||
}
|
||||
|
||||
event http_begin_entity(c: connection, is_orig: bool) &priority=5
|
||||
{
|
||||
if ( ! c?$http )
|
||||
return;
|
||||
|
||||
if ( is_orig )
|
||||
++c$http$request_mime_level;
|
||||
else
|
||||
++c$http$response_mime_level;
|
||||
}
|
||||
|
||||
function get_file_handle(c: connection, is_orig: bool): string
|
||||
{
|
||||
if ( ! c?$http ) return "";
|
||||
|
||||
local mime_level: count =
|
||||
is_orig ? c$http$request_mime_level : c$http$response_mime_level;
|
||||
local mime_level_str: string = mime_level > 1 ? cat(mime_level) : "";
|
||||
|
||||
if ( c$http$range_request )
|
||||
return cat(Analyzer::ANALYZER_HTTP, " ", is_orig, " ", c$id$orig_h, " ",
|
||||
build_url(c$http));
|
||||
|
||||
return cat(Analyzer::ANALYZER_HTTP, " ", c$start_time, " ", is_orig, " ",
|
||||
c$http$trans_depth, " ", id_string(c$id));
|
||||
c$http$trans_depth, mime_level_str, " ", id_string(c$id));
|
||||
}
|
||||
|
||||
module GLOBAL;
|
||||
|
||||
event get_file_handle(tag: Analyzer::Tag, c: connection, is_orig: bool)
|
||||
&priority=5
|
||||
{
|
||||
if ( tag != Analyzer::ANALYZER_HTTP ) return;
|
||||
set_file_handle(HTTP::get_file_handle(c, is_orig));
|
||||
|
|
|
@ -14,8 +14,11 @@ export {
|
|||
const extraction_prefix = "http-item" &redef;
|
||||
|
||||
redef record Info += {
|
||||
## On-disk file where the response body was extracted to.
|
||||
extraction_file: string &log &optional;
|
||||
## On-disk location where files in request body were extracted.
|
||||
extracted_request_files: vector of string &log &optional;
|
||||
|
||||
## On-disk location where files in response body were extracted.
|
||||
extracted_response_files: vector of string &log &optional;
|
||||
|
||||
## Indicates if the response body is to be extracted or not. Must be
|
||||
## set before or by the first :bro:see:`file_new` for the file content.
|
||||
|
@ -23,15 +26,28 @@ export {
|
|||
};
|
||||
}
|
||||
|
||||
global extract_count: count = 0;
|
||||
|
||||
function get_extraction_name(f: fa_file): string
|
||||
{
|
||||
local r = fmt("%s-%s-%d.dat", extraction_prefix, f$id, extract_count);
|
||||
++extract_count;
|
||||
local r = fmt("%s-%s.dat", extraction_prefix, f$id);
|
||||
return r;
|
||||
}
|
||||
|
||||
function add_extraction_file(c: connection, is_orig: bool, fn: string)
|
||||
{
|
||||
if ( is_orig )
|
||||
{
|
||||
if ( ! c$http?$extracted_request_files )
|
||||
c$http$extracted_request_files = vector();
|
||||
c$http$extracted_request_files[|c$http$extracted_request_files|] = fn;
|
||||
}
|
||||
else
|
||||
{
|
||||
if ( ! c$http?$extracted_response_files )
|
||||
c$http$extracted_response_files = vector();
|
||||
c$http$extracted_response_files[|c$http$extracted_response_files|] = fn;
|
||||
}
|
||||
}
|
||||
|
||||
event file_new(f: fa_file) &priority=5
|
||||
{
|
||||
if ( ! f?$source ) return;
|
||||
|
@ -51,7 +67,7 @@ event file_new(f: fa_file) &priority=5
|
|||
{
|
||||
c = f$conns[cid];
|
||||
if ( ! c?$http ) next;
|
||||
c$http$extraction_file = fname;
|
||||
add_extraction_file(c, f$is_orig, fname);
|
||||
}
|
||||
|
||||
return;
|
||||
|
@ -79,6 +95,6 @@ event file_new(f: fa_file) &priority=5
|
|||
{
|
||||
c = f$conns[cid];
|
||||
if ( ! c?$http ) next;
|
||||
c$http$extraction_file = fname;
|
||||
add_extraction_file(c, f$is_orig, fname);
|
||||
}
|
||||
}
|
||||
|
|
|
@ -39,8 +39,6 @@ export {
|
|||
|
||||
global dcc_expected_transfers: table[addr, port] of Info &read_expire=5mins;
|
||||
|
||||
global extract_count: count = 0;
|
||||
|
||||
function set_dcc_mime(f: fa_file)
|
||||
{
|
||||
if ( ! f?$conns ) return;
|
||||
|
@ -75,8 +73,7 @@ function set_dcc_extraction_file(f: fa_file, filename: string)
|
|||
|
||||
function get_extraction_name(f: fa_file): string
|
||||
{
|
||||
local r = fmt("%s-%s-%d.dat", extraction_prefix, f$id, extract_count);
|
||||
++extract_count;
|
||||
local r = fmt("%s-%s.dat", extraction_prefix, f$id);
|
||||
return r;
|
||||
}
|
||||
|
||||
|
|
|
@ -18,6 +18,7 @@ function get_file_handle(c: connection, is_orig: bool): string
|
|||
module GLOBAL;
|
||||
|
||||
event get_file_handle(tag: Analyzer::Tag, c: connection, is_orig: bool)
|
||||
&priority=5
|
||||
{
|
||||
if ( tag != Analyzer::ANALYZER_IRC_DATA ) return;
|
||||
set_file_handle(IRC::get_file_handle(c, is_orig));
|
||||
|
|
|
@ -66,8 +66,6 @@ export {
|
|||
global log_mime: event(rec: EntityInfo);
|
||||
}
|
||||
|
||||
global extract_count: count = 0;
|
||||
|
||||
event bro_init() &priority=5
|
||||
{
|
||||
Log::create_stream(SMTP::ENTITIES_LOG, [$columns=EntityInfo, $ev=log_mime]);
|
||||
|
@ -90,8 +88,7 @@ function set_session(c: connection, new_entity: bool)
|
|||
|
||||
function get_extraction_name(f: fa_file): string
|
||||
{
|
||||
local r = fmt("%s-%s-%d.dat", extraction_prefix, f$id, extract_count);
|
||||
++extract_count;
|
||||
local r = fmt("%s-%s.dat", extraction_prefix, f$id);
|
||||
return r;
|
||||
}
|
||||
|
||||
|
@ -127,7 +124,6 @@ event file_new(f: fa_file) &priority=5
|
|||
[$tag=FileAnalysis::ANALYZER_EXTRACT,
|
||||
$extract_filename=fname]);
|
||||
extracting = T;
|
||||
++extract_count;
|
||||
}
|
||||
|
||||
c$smtp$current_entity$extraction_file = fname;
|
||||
|
|
|
@ -20,6 +20,7 @@ function get_file_handle(c: connection, is_orig: bool): string
|
|||
module GLOBAL;
|
||||
|
||||
event get_file_handle(tag: Analyzer::Tag, c: connection, is_orig: bool)
|
||||
&priority=5
|
||||
{
|
||||
if ( tag != Analyzer::ANALYZER_SMTP ) return;
|
||||
set_file_handle(SMTP::get_file_handle(c, is_orig));
|
||||
|
|
|
@ -114,7 +114,6 @@ set(BIF_SRCS
|
|||
logging.bif
|
||||
input.bif
|
||||
event.bif
|
||||
file_analysis.bif
|
||||
const.bif
|
||||
types.bif
|
||||
strings.bif
|
||||
|
@ -150,6 +149,7 @@ set(bro_SUBDIR_LIBS CACHE INTERNAL "subdir libraries" FORCE)
|
|||
set(bro_PLUGIN_LIBS CACHE INTERNAL "plugin libraries" FORCE)
|
||||
|
||||
add_subdirectory(analyzer)
|
||||
add_subdirectory(file_analysis)
|
||||
|
||||
set(bro_SUBDIRS
|
||||
${bro_SUBDIR_LIBS}
|
||||
|
@ -355,21 +355,12 @@ set(bro_SRCS
|
|||
input/readers/Binary.cc
|
||||
input/readers/SQLite.cc
|
||||
|
||||
file_analysis/Manager.cc
|
||||
file_analysis/File.cc
|
||||
file_analysis/FileTimer.cc
|
||||
file_analysis/FileID.h
|
||||
file_analysis/Analyzer.h
|
||||
file_analysis/AnalyzerSet.cc
|
||||
file_analysis/Extract.cc
|
||||
file_analysis/Hash.cc
|
||||
file_analysis/DataEvent.cc
|
||||
|
||||
3rdparty/sqlite3.c
|
||||
|
||||
plugin/Component.cc
|
||||
plugin/Manager.cc
|
||||
plugin/Plugin.cc
|
||||
plugin/Macros.h
|
||||
|
||||
nb_dns.c
|
||||
digest.h
|
||||
|
|
|
@ -553,14 +553,12 @@ void builtin_error(const char* msg, BroObj* arg)
|
|||
#include "input.bif.func_h"
|
||||
#include "reporter.bif.func_h"
|
||||
#include "strings.bif.func_h"
|
||||
#include "file_analysis.bif.func_h"
|
||||
|
||||
#include "bro.bif.func_def"
|
||||
#include "logging.bif.func_def"
|
||||
#include "input.bif.func_def"
|
||||
#include "reporter.bif.func_def"
|
||||
#include "strings.bif.func_def"
|
||||
#include "file_analysis.bif.func_def"
|
||||
|
||||
void init_builtin_funcs()
|
||||
{
|
||||
|
@ -575,7 +573,6 @@ void init_builtin_funcs()
|
|||
#include "input.bif.func_init"
|
||||
#include "reporter.bif.func_init"
|
||||
#include "strings.bif.func_init"
|
||||
#include "file_analysis.bif.func_init"
|
||||
|
||||
did_builtin_init = true;
|
||||
}
|
||||
|
|
|
@ -249,7 +249,6 @@ OpaqueType* entropy_type;
|
|||
#include "logging.bif.netvar_def"
|
||||
#include "input.bif.netvar_def"
|
||||
#include "reporter.bif.netvar_def"
|
||||
#include "file_analysis.bif.netvar_def"
|
||||
|
||||
void init_event_handlers()
|
||||
{
|
||||
|
@ -317,7 +316,6 @@ void init_net_var()
|
|||
#include "logging.bif.netvar_init"
|
||||
#include "input.bif.netvar_init"
|
||||
#include "reporter.bif.netvar_init"
|
||||
#include "file_analysis.bif.netvar_init"
|
||||
|
||||
conn_id = internal_type("conn_id")->AsRecordType();
|
||||
endpoint = internal_type("endpoint")->AsRecordType();
|
||||
|
|
|
@ -260,6 +260,5 @@ extern void init_net_var();
|
|||
#include "logging.bif.netvar_h"
|
||||
#include "input.bif.netvar_h"
|
||||
#include "reporter.bif.netvar_h"
|
||||
#include "file_analysis.bif.netvar_h"
|
||||
|
||||
#endif
|
||||
|
|
|
@ -4,26 +4,12 @@
|
|||
#include "Manager.h"
|
||||
|
||||
#include "../Desc.h"
|
||||
#include "../util.h"
|
||||
|
||||
using namespace analyzer;
|
||||
|
||||
Tag::type_t Component::type_counter = 0;
|
||||
|
||||
static const char* canonify_name(const char* name)
|
||||
{
|
||||
unsigned int len = strlen(name);
|
||||
char* nname = new char[len + 1];
|
||||
|
||||
for ( unsigned int i = 0; i < len; i++ )
|
||||
{
|
||||
char c = isalnum(name[i]) ? name[i] : '_';
|
||||
nname[i] = toupper(c);
|
||||
}
|
||||
|
||||
nname[len] = '\0';
|
||||
return nname;
|
||||
}
|
||||
|
||||
Component::Component(const char* arg_name, factory_callback arg_factory, Tag::subtype_t arg_subtype, bool arg_enabled, bool arg_partial)
|
||||
: plugin::Component(plugin::component::ANALYZER)
|
||||
{
|
||||
|
|
|
@ -23,7 +23,6 @@ class Analyzer;
|
|||
*/
|
||||
class Component : public plugin::Component {
|
||||
public:
|
||||
typedef bool (*available_callback)();
|
||||
typedef Analyzer* (*factory_callback)(Connection* conn);
|
||||
|
||||
/**
|
||||
|
|
|
@ -8,6 +8,11 @@
|
|||
|
||||
class EnumVal;
|
||||
|
||||
namespace file_analysis {
|
||||
class Manager;
|
||||
class Component;
|
||||
}
|
||||
|
||||
namespace analyzer {
|
||||
|
||||
class Manager;
|
||||
|
@ -24,7 +29,7 @@ class Component;
|
|||
* subtype form an analyzer "tag". Each unique tag corresponds to a single
|
||||
* "analyzer" from the user's perspective. At the script layer, these tags
|
||||
* are mapped into enums of type \c Analyzer::Tag. Internally, the
|
||||
* analyzer::Mangager maintains the mapping of tag to analyzer (and it also
|
||||
* analyzer::Manager maintains the mapping of tag to analyzer (and it also
|
||||
* assigns them their main types), and analyzer::Component creates new
|
||||
* tags.
|
||||
*
|
||||
|
@ -121,9 +126,11 @@ public:
|
|||
protected:
|
||||
friend class analyzer::Manager;
|
||||
friend class analyzer::Component;
|
||||
friend class file_analysis::Manager;
|
||||
friend class file_analysis::Component;
|
||||
|
||||
/**
|
||||
* Constructor. Note
|
||||
* Constructor.
|
||||
*
|
||||
* @param type The main type. Note that the \a analyzer::Manager
|
||||
* manages the value space internally, so noone else should assign
|
||||
|
|
|
@ -23,5 +23,3 @@ const Tunnel::delay_gtp_confirmation: bool;
|
|||
const Tunnel::ip_tunnel_timeout: interval;
|
||||
|
||||
const Threading::heartbeat_interval: interval;
|
||||
|
||||
const FileAnalysis::salt: string;
|
||||
|
|
|
@ -920,7 +920,7 @@ event file_over_new_connection%(f: fa_file, c: connection%);
|
|||
## f: The file.
|
||||
##
|
||||
## .. bro:see:: file_new file_over_new_connection file_gap file_state_remove
|
||||
## default_file_timeout_interval FileAnalysis::postpone_timeout
|
||||
## default_file_timeout_interval FileAnalysis::set_timeout_interval
|
||||
## FileAnalysis::set_timeout_interval
|
||||
event file_timeout%(f: fa_file%);
|
||||
|
||||
|
@ -942,19 +942,6 @@ event file_gap%(f: fa_file, offset: count, len: count%);
|
|||
## .. bro:see:: file_new file_over_new_connection file_timeout file_gap
|
||||
event file_state_remove%(f: fa_file%);
|
||||
|
||||
## This event is generated each time file analysis generates a digest of the
|
||||
## file contents.
|
||||
##
|
||||
## f: The file.
|
||||
##
|
||||
## kind: The type of digest algorithm.
|
||||
##
|
||||
## hash: The result of the hashing.
|
||||
##
|
||||
## .. bro:see:: FileAnalysis::add_analyzer FileAnalysis::ANALYZER_MD5
|
||||
## FileAnalysis::ANALYZER_SHA1 FileAnalysis::ANALYZER_SHA256
|
||||
event file_hash%(f: fa_file, kind: string, hash: string%);
|
||||
|
||||
## Generated when an internal DNS lookup produces the same result as last time.
|
||||
## Bro keeps an internal DNS cache for host names and IP addresses it has
|
||||
## already resolved. This event is generated when a subsequent lookup returns
|
||||
|
|
|
@ -1,127 +0,0 @@
|
|||
##! Internal functions and types used by the logging framework.
|
||||
|
||||
module FileAnalysis;
|
||||
|
||||
%%{
|
||||
#include "file_analysis/Manager.h"
|
||||
%%}
|
||||
|
||||
type AnalyzerArgs: record;
|
||||
|
||||
## An enumeration of various file analysis actions that can be taken.
|
||||
enum Analyzer %{
|
||||
|
||||
## Extract a file to local filesystem
|
||||
ANALYZER_EXTRACT,
|
||||
|
||||
## Calculate an MD5 digest of the file's contents.
|
||||
ANALYZER_MD5,
|
||||
|
||||
## Calculate an SHA1 digest of the file's contents.
|
||||
ANALYZER_SHA1,
|
||||
|
||||
## Calculate an SHA256 digest of the file's contents.
|
||||
ANALYZER_SHA256,
|
||||
|
||||
## Deliver the file contents to the script-layer in an event.
|
||||
ANALYZER_DATA_EVENT,
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::postpone_timeout`.
|
||||
function FileAnalysis::__postpone_timeout%(file_id: string%): bool
|
||||
%{
|
||||
using file_analysis::FileID;
|
||||
bool result = file_mgr->PostponeTimeout(FileID(file_id->CheckString()));
|
||||
return new Val(result, TYPE_BOOL);
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::set_timeout_interval`.
|
||||
function FileAnalysis::__set_timeout_interval%(file_id: string, t: interval%): bool
|
||||
%{
|
||||
using file_analysis::FileID;
|
||||
bool result = file_mgr->SetTimeoutInterval(FileID(file_id->CheckString()),
|
||||
t);
|
||||
return new Val(result, TYPE_BOOL);
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::add_analyzer`.
|
||||
function FileAnalysis::__add_analyzer%(file_id: string, args: any%): bool
|
||||
%{
|
||||
using file_analysis::FileID;
|
||||
using BifType::Record::FileAnalysis::AnalyzerArgs;
|
||||
RecordVal* rv = args->AsRecordVal()->CoerceTo(AnalyzerArgs);
|
||||
bool result = file_mgr->AddAnalyzer(FileID(file_id->CheckString()), rv);
|
||||
Unref(rv);
|
||||
return new Val(result, TYPE_BOOL);
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::remove_analyzer`.
|
||||
function FileAnalysis::__remove_analyzer%(file_id: string, args: any%): bool
|
||||
%{
|
||||
using file_analysis::FileID;
|
||||
using BifType::Record::FileAnalysis::AnalyzerArgs;
|
||||
RecordVal* rv = args->AsRecordVal()->CoerceTo(AnalyzerArgs);
|
||||
bool result = file_mgr->RemoveAnalyzer(FileID(file_id->CheckString()), rv);
|
||||
Unref(rv);
|
||||
return new Val(result, TYPE_BOOL);
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::stop`.
|
||||
function FileAnalysis::__stop%(file_id: string%): bool
|
||||
%{
|
||||
using file_analysis::FileID;
|
||||
bool result = file_mgr->IgnoreFile(FileID(file_id->CheckString()));
|
||||
return new Val(result, TYPE_BOOL);
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::data_stream`.
|
||||
function FileAnalysis::__data_stream%(source: string, data: string%): any
|
||||
%{
|
||||
file_mgr->DataIn(data->Bytes(), data->Len(), source->CheckString());
|
||||
return 0;
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::data_chunk`.
|
||||
function FileAnalysis::__data_chunk%(source: string, data: string,
|
||||
offset: count%): any
|
||||
%{
|
||||
file_mgr->DataIn(data->Bytes(), data->Len(), offset, source->CheckString());
|
||||
return 0;
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::gap`.
|
||||
function FileAnalysis::__gap%(source: string, offset: count, len: count%): any
|
||||
%{
|
||||
file_mgr->Gap(offset, len, source->CheckString());
|
||||
return 0;
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::set_size`.
|
||||
function FileAnalysis::__set_size%(source: string, size: count%): any
|
||||
%{
|
||||
file_mgr->SetSize(size, source->CheckString());
|
||||
return 0;
|
||||
%}
|
||||
|
||||
## :bro:see:`FileAnalysis::eof`.
|
||||
function FileAnalysis::__eof%(source: string%): any
|
||||
%{
|
||||
file_mgr->EndOfFile(source->CheckString());
|
||||
return 0;
|
||||
%}
|
||||
|
||||
module GLOBAL;
|
||||
|
||||
## For use within a :bro:see:`get_file_handle` handler to set a unique
|
||||
## identifier to associate with the current input to the file analysis
|
||||
## framework. Using an empty string for the handle signifies that the
|
||||
## input will be ignored/discarded.
|
||||
##
|
||||
## handle: A string that uniquely identifies a file.
|
||||
##
|
||||
## .. bro:see:: get_file_handle
|
||||
function set_file_handle%(handle: string%): any
|
||||
%{
|
||||
file_mgr->SetHandle(handle->CheckString());
|
||||
return 0;
|
||||
%}
|
|
@ -5,10 +5,13 @@
|
|||
|
||||
#include "Val.h"
|
||||
#include "NetVar.h"
|
||||
#include "analyzer/Tag.h"
|
||||
|
||||
#include "file_analysis/file_analysis.bif.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
typedef BifEnum::FileAnalysis::Analyzer FA_Tag;
|
||||
typedef int FA_Tag;
|
||||
|
||||
class File;
|
||||
|
||||
|
@ -17,6 +20,11 @@ class File;
|
|||
*/
|
||||
class Analyzer {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Destructor. Nothing special about it. Virtual since we definitely expect
|
||||
* to delete instances of derived classes via pointers to this class.
|
||||
*/
|
||||
virtual ~Analyzer()
|
||||
{
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "Destroy file analyzer %d", tag);
|
||||
|
@ -24,7 +32,10 @@ public:
|
|||
}
|
||||
|
||||
/**
|
||||
* Subclasses may override this to receive file data non-sequentially.
|
||||
* Subclasses may override this method to receive file data non-sequentially.
|
||||
* @param data points to start of a chunk of file data.
|
||||
* @param len length in bytes of the chunk of data pointed to by \a data.
|
||||
* @param offset the byte offset within full file that data chunk starts.
|
||||
* @return true if the analyzer is still in a valid state to continue
|
||||
* receiving data/events or false if it's essentially "done".
|
||||
*/
|
||||
|
@ -32,7 +43,9 @@ public:
|
|||
{ return true; }
|
||||
|
||||
/**
|
||||
* Subclasses may override this to receive file sequentially.
|
||||
* Subclasses may override this method to receive file sequentially.
|
||||
* @param data points to start of the next chunk of file data.
|
||||
* @param len length in bytes of the chunk of data pointed to by \a data.
|
||||
* @return true if the analyzer is still in a valid state to continue
|
||||
* receiving data/events or false if it's essentially "done".
|
||||
*/
|
||||
|
@ -40,7 +53,7 @@ public:
|
|||
{ return true; }
|
||||
|
||||
/**
|
||||
* Subclasses may override this to specifically handle an EOF signal,
|
||||
* Subclasses may override this method to specifically handle an EOF signal,
|
||||
* which means no more data is going to be incoming and the analyzer
|
||||
* may be deleted/cleaned up soon.
|
||||
* @return true if the analyzer is still in a valid state to continue
|
||||
|
@ -50,7 +63,10 @@ public:
|
|||
{ return true; }
|
||||
|
||||
/**
|
||||
* Subclasses may override this to handle missing data in a file stream.
|
||||
* Subclasses may override this method to handle missing data in a file.
|
||||
* @param offset the byte offset within full file at which the missing
|
||||
* data chunk occurs.
|
||||
* @param len the number of missing bytes.
|
||||
* @return true if the analyzer is still in a valid state to continue
|
||||
* receiving data/events or false if it's essentially "done".
|
||||
*/
|
||||
|
@ -73,17 +89,25 @@ public:
|
|||
File* GetFile() const { return file; }
|
||||
|
||||
/**
|
||||
* Retrieves an analyzer tag field from full analyzer argument record.
|
||||
* @param args an \c AnalyzerArgs (script-layer type) value.
|
||||
* @return the analyzer tag equivalent of the 'tag' field from the
|
||||
* AnalyzerArgs value \a args.
|
||||
* \c AnalyzerArgs value \a args.
|
||||
*/
|
||||
static FA_Tag ArgsTag(const RecordVal* args)
|
||||
{
|
||||
using BifType::Record::FileAnalysis::AnalyzerArgs;
|
||||
return static_cast<FA_Tag>(
|
||||
args->Lookup(AnalyzerArgs->FieldOffset("tag"))->AsEnum());
|
||||
return args->Lookup(AnalyzerArgs->FieldOffset("tag"))->AsEnum();
|
||||
}
|
||||
|
||||
protected:
|
||||
|
||||
/**
|
||||
* Constructor. Only derived classes are meant to be instantiated.
|
||||
* @param arg_args an \c AnalyzerArgs (script-layer type) value specifiying
|
||||
* tunable options, if any, related to a particular analyzer type.
|
||||
* @param arg_file the file to which the analyzer is being attached.
|
||||
*/
|
||||
Analyzer(RecordVal* arg_args, File* arg_file)
|
||||
: tag(file_analysis::Analyzer::ArgsTag(arg_args)),
|
||||
args(arg_args->Ref()->AsRecordVal()),
|
||||
|
@ -91,13 +115,11 @@ protected:
|
|||
{}
|
||||
|
||||
private:
|
||||
FA_Tag tag;
|
||||
RecordVal* args;
|
||||
File* file;
|
||||
};
|
||||
|
||||
typedef file_analysis::Analyzer* (*AnalyzerInstantiator)(RecordVal* args,
|
||||
File* file);
|
||||
FA_Tag tag; /**< The particular analyzer type of the analyzer instance. */
|
||||
RecordVal* args; /**< \c AnalyzerArgs val gives tunable analyzer params. */
|
||||
File* file; /**< The file to which the analyzer is attached. */
|
||||
};
|
||||
|
||||
} // namespace file_analysis
|
||||
|
||||
|
|
|
@ -3,21 +3,10 @@
|
|||
#include "AnalyzerSet.h"
|
||||
#include "File.h"
|
||||
#include "Analyzer.h"
|
||||
#include "Extract.h"
|
||||
#include "DataEvent.h"
|
||||
#include "Hash.h"
|
||||
#include "Manager.h"
|
||||
|
||||
using namespace file_analysis;
|
||||
|
||||
// keep in order w/ declared enum values in file_analysis.bif
|
||||
static AnalyzerInstantiator analyzer_factory[] = {
|
||||
file_analysis::Extract::Instantiate,
|
||||
file_analysis::MD5::Instantiate,
|
||||
file_analysis::SHA1::Instantiate,
|
||||
file_analysis::SHA256::Instantiate,
|
||||
file_analysis::DataEvent::Instantiate,
|
||||
};
|
||||
|
||||
static void analyzer_del_func(void* v)
|
||||
{
|
||||
delete (file_analysis::Analyzer*) v;
|
||||
|
@ -154,14 +143,13 @@ HashKey* AnalyzerSet::GetKey(const RecordVal* args) const
|
|||
|
||||
file_analysis::Analyzer* AnalyzerSet::InstantiateAnalyzer(RecordVal* args) const
|
||||
{
|
||||
file_analysis::Analyzer* a =
|
||||
analyzer_factory[file_analysis::Analyzer::ArgsTag(args)](args, file);
|
||||
FA_Tag tag = file_analysis::Analyzer::ArgsTag(args);
|
||||
file_analysis::Analyzer* a = file_mgr->InstantiateAnalyzer(tag, args, file);
|
||||
|
||||
if ( ! a )
|
||||
{
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "Instantiate analyzer %d failed for file id",
|
||||
" %s", file_analysis::Analyzer::ArgsTag(args),
|
||||
file->GetID().c_str());
|
||||
reporter->Error("Failed file analyzer %s instantiation for file id %s",
|
||||
file_mgr->GetAnalyzerName(tag), file->GetID().c_str());
|
||||
return 0;
|
||||
}
|
||||
|
||||
|
|
|
@ -16,67 +16,144 @@ class File;
|
|||
declare(PDict,Analyzer);
|
||||
|
||||
/**
|
||||
* A set of file analysis analyzers indexed by AnalyzerArgs. Allows queueing
|
||||
* of addition/removals so that those modifications can happen at well-defined
|
||||
* times (e.g. to make sure a loop iterator isn't invalidated).
|
||||
* A set of file analysis analyzers indexed by an \c AnalyzerArgs (script-layer
|
||||
* type) value. Allows queueing of addition/removals so that those
|
||||
* modifications can happen at well-defined times (e.g. to make sure a loop
|
||||
* iterator isn't invalidated).
|
||||
*/
|
||||
class AnalyzerSet {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Constructor. Nothing special.
|
||||
* @param arg_file the file to which all analyzers in the set are attached.
|
||||
*/
|
||||
AnalyzerSet(File* arg_file);
|
||||
|
||||
/**
|
||||
* Destructor. Any queued analyzer additions/removals are aborted and
|
||||
* will not occur.
|
||||
*/
|
||||
~AnalyzerSet();
|
||||
|
||||
/**
|
||||
* Attach an analyzer to #file immediately.
|
||||
* @param args an \c AnalyzerArgs value which specifies an analyzer.
|
||||
* @return true if analyzer was instantiated/attached, else false.
|
||||
*/
|
||||
bool Add(RecordVal* args);
|
||||
|
||||
/**
|
||||
* Queue the attachment of an analyzer to #file.
|
||||
* @param args an \c AnalyzerArgs value which specifies an analyzer.
|
||||
* @return true if analyzer was able to be instantiated, else false.
|
||||
*/
|
||||
bool QueueAdd(RecordVal* args);
|
||||
|
||||
/**
|
||||
* Remove an analyzer from #file immediately.
|
||||
* @param args an \c AnalyzerArgs value which specifies an analyzer.
|
||||
* @return false if analyzer didn't exist and so wasn't removed, else true.
|
||||
*/
|
||||
bool Remove(const RecordVal* args);
|
||||
|
||||
/**
|
||||
* Queue the removal of an analyzer from #file.
|
||||
* @param args an \c AnalyzerArgs value which specifies an analyzer.
|
||||
* @return true if analyzer exists at time of call, else false;
|
||||
*/
|
||||
bool QueueRemove(const RecordVal* args);
|
||||
|
||||
/**
|
||||
* Perform all queued modifications to the currently active analyzers.
|
||||
* Perform all queued modifications to the current analyzer set.
|
||||
*/
|
||||
void DrainModifications();
|
||||
|
||||
/**
|
||||
* Prepare the analyzer set to be iterated over.
|
||||
* @see Dictionary#InitForIteration
|
||||
* @return an iterator that may be used to loop over analyzers in the set.
|
||||
*/
|
||||
IterCookie* InitForIteration() const
|
||||
{ return analyzer_map.InitForIteration(); }
|
||||
|
||||
/**
|
||||
* Get next entry in the analyzer set.
|
||||
* @see Dictionary#NextEntry
|
||||
* @param c a set iterator.
|
||||
* @return the next analyzer in the set or a null pointer if there is no
|
||||
* more left (in that case the cookie is also deleted).
|
||||
*/
|
||||
file_analysis::Analyzer* NextEntry(IterCookie* c)
|
||||
{ return analyzer_map.NextEntry(c); }
|
||||
|
||||
protected:
|
||||
|
||||
/**
|
||||
* Get a hash key which represents an analyzer instance.
|
||||
* @param args an \c AnalyzerArgs value which specifies an analyzer.
|
||||
* @return the hash key calculated from \a args
|
||||
*/
|
||||
HashKey* GetKey(const RecordVal* args) const;
|
||||
|
||||
/**
|
||||
* Create an instance of a file analyzer.
|
||||
* @param args an \c AnalyzerArgs value which specifies an analyzer.
|
||||
* @return a new file analyzer instance.
|
||||
*/
|
||||
file_analysis::Analyzer* InstantiateAnalyzer(RecordVal* args) const;
|
||||
|
||||
/**
|
||||
* Insert an analyzer instance in to the set.
|
||||
* @param a an analyzer instance.
|
||||
* @param key the hash key which represents the analyzer's \c AnalyzerArgs.
|
||||
*/
|
||||
void Insert(file_analysis::Analyzer* a, HashKey* key);
|
||||
|
||||
/**
|
||||
* Remove an analyzer instance from the set.
|
||||
* @param tag enumerator which specifies the type of the analyzer to remove,
|
||||
* just used for debugging messages.
|
||||
* @param key the hash key which represents the analyzer's \c AnalyzerArgs.
|
||||
*/
|
||||
bool Remove(FA_Tag tag, HashKey* key);
|
||||
|
||||
private:
|
||||
File* file;
|
||||
|
||||
File* file; /**< File which owns the set */
|
||||
CompositeHash* analyzer_hash; /**< AnalyzerArgs hashes. */
|
||||
PDict(file_analysis::Analyzer) analyzer_map; /**< Indexed by AnalyzerArgs. */
|
||||
|
||||
/**
|
||||
* Abstract base class for analyzer set modifications.
|
||||
*/
|
||||
class Modification {
|
||||
public:
|
||||
virtual ~Modification() {}
|
||||
|
||||
/**
|
||||
* Perform the modification on an analyzer set.
|
||||
* @param set the analyzer set on which the modification will happen.
|
||||
* @return true if the modification altered \a set.
|
||||
*/
|
||||
virtual bool Perform(AnalyzerSet* set) = 0;
|
||||
|
||||
/**
|
||||
* Don't perform the modification on the analyzer set and clean up.
|
||||
*/
|
||||
virtual void Abort() = 0;
|
||||
};
|
||||
|
||||
/**
|
||||
* Represents a request to add an analyzer to an analyzer set.
|
||||
*/
|
||||
class AddMod : public Modification {
|
||||
public:
|
||||
/**
|
||||
* Construct request which can add an analyzer to an analyzer set.
|
||||
* @param arg_a an analyzer instance to add to an analyzer set.
|
||||
* @param arg_key hash key representing the analyzer's \c AnalyzerArgs.
|
||||
*/
|
||||
AddMod(file_analysis::Analyzer* arg_a, HashKey* arg_key)
|
||||
: Modification(), a(arg_a), key(arg_key) {}
|
||||
virtual ~AddMod() {}
|
||||
|
@ -88,8 +165,16 @@ private:
|
|||
HashKey* key;
|
||||
};
|
||||
|
||||
/**
|
||||
* Represents a request to remove an analyzer from an analyzer set.
|
||||
*/
|
||||
class RemoveMod : public Modification {
|
||||
public:
|
||||
/**
|
||||
* Construct request which can remove an analyzer from an analyzer set.
|
||||
* @param arg_tag the tag of the analyzer to remove from an analyzer set.
|
||||
* @param arg_key hash key representing the analyzer's \c AnalyzerArgs.
|
||||
*/
|
||||
RemoveMod(FA_Tag arg_tag, HashKey* arg_key)
|
||||
: Modification(), tag(arg_tag), key(arg_key) {}
|
||||
virtual ~RemoveMod() {}
|
||||
|
@ -102,7 +187,7 @@ private:
|
|||
};
|
||||
|
||||
typedef queue<Modification*> ModQueue;
|
||||
ModQueue mod_queue;
|
||||
ModQueue mod_queue; /**< A queue of analyzer additions/removals requests. */
|
||||
};
|
||||
|
||||
} // namespace file_analysis
|
||||
|
|
src/file_analysis/CMakeLists.txt (new file, 22 lines)
@ -0,0 +1,22 @@
include(BroSubdir)
|
||||
|
||||
include_directories(BEFORE
|
||||
${CMAKE_CURRENT_SOURCE_DIR}
|
||||
${CMAKE_CURRENT_BINARY_DIR}
|
||||
)
|
||||
|
||||
add_subdirectory(analyzer)
|
||||
|
||||
set(file_analysis_SRCS
|
||||
Manager.cc
|
||||
File.cc
|
||||
FileTimer.cc
|
||||
Analyzer.h
|
||||
AnalyzerSet.cc
|
||||
Component.cc
|
||||
)
|
||||
|
||||
bif_target(file_analysis.bif)
|
||||
|
||||
bro_add_subdir_library(file_analysis ${file_analysis_SRCS} ${BIF_OUTPUT_CC})
|
||||
add_dependencies(bro_file_analysis generate_outputs)
|
src/file_analysis/Component.cc (new file, 69 lines)
@ -0,0 +1,69 @@
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#include "Component.h"
|
||||
#include "Manager.h"
|
||||
|
||||
#include "../Desc.h"
|
||||
#include "../util.h"
|
||||
|
||||
using namespace file_analysis;
|
||||
|
||||
analyzer::Tag::type_t Component::type_counter = 0;
|
||||
|
||||
Component::Component(const char* arg_name, factory_callback arg_factory,
|
||||
analyzer::Tag::subtype_t arg_subtype)
|
||||
: plugin::Component(plugin::component::FILE_ANALYZER)
|
||||
{
|
||||
name = copy_string(arg_name);
|
||||
canon_name = canonify_name(arg_name);
|
||||
factory = arg_factory;
|
||||
|
||||
tag = analyzer::Tag(++type_counter, arg_subtype);
|
||||
}
|
||||
|
||||
Component::Component(const Component& other)
|
||||
: plugin::Component(Type())
|
||||
{
|
||||
name = copy_string(other.name);
|
||||
canon_name = copy_string(other.canon_name);
|
||||
factory = other.factory;
|
||||
tag = other.tag;
|
||||
}
|
||||
|
||||
Component::~Component()
|
||||
{
|
||||
delete [] name;
|
||||
delete [] canon_name;
|
||||
}
|
||||
|
||||
analyzer::Tag Component::Tag() const
|
||||
{
|
||||
return tag;
|
||||
}
|
||||
|
||||
void Component::Describe(ODesc* d)
|
||||
{
|
||||
plugin::Component::Describe(d);
|
||||
d->Add(name);
|
||||
d->Add(" (");
|
||||
|
||||
if ( factory )
|
||||
{
|
||||
d->Add("ANALYZER_");
|
||||
d->Add(canon_name);
|
||||
}
|
||||
|
||||
d->Add(")");
|
||||
}
|
||||
|
||||
Component& Component::operator=(const Component& other)
|
||||
{
|
||||
if ( &other != this )
|
||||
{
|
||||
name = copy_string(other.name);
|
||||
factory = other.factory;
|
||||
tag = other.tag;
|
||||
}
|
||||
|
||||
return *this;
|
||||
}
|
src/file_analysis/Component.h (new file, 109 lines)
@ -0,0 +1,109 @@
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#ifndef FILE_ANALYZER_PLUGIN_COMPONENT_H
|
||||
#define FILE_ANALYZER_PLUGIN_COMPONENT_H
|
||||
|
||||
#include "analyzer/Tag.h"
|
||||
#include "plugin/Component.h"
|
||||
|
||||
#include "Val.h"
|
||||
|
||||
#include "../config.h"
|
||||
#include "../util.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
class File;
|
||||
class Analyzer;
|
||||
|
||||
/**
|
||||
* Component description for plugins providing file analyzers.
|
||||
*
|
||||
* A plugin can provide a specific file analyzer by registering this
|
||||
* analyzer component, describing the analyzer.
|
||||
*/
|
||||
class Component : public plugin::Component {
|
||||
public:
|
||||
typedef Analyzer* (*factory_callback)(RecordVal* args, File* file);
|
||||
|
||||
/**
|
||||
* Constructor.
|
||||
*
|
||||
* @param name The name of the provided analyzer. This name is used
|
||||
* across the system to identify the analyzer, e.g., when calling
|
||||
* file_analysis::Manager::InstantiateAnalyzer with a name.
|
||||
*
|
||||
* @param factory A factory function to instantiate instances of the
|
||||
* analyzer's class, which must be derived directly or indirectly
|
||||
* from file_analysis::Analyzer. This is typically a static \c
|
||||
* Instantiate() method inside the class that just allocates and
|
||||
* returns a new instance.
|
||||
*
|
||||
* @param subtype A subtype associated with this component that
|
||||
* further distinguishes it. The subtype will be integrated into
|
||||
* the analyzer::Tag that the manager associates with this analyzer,
|
||||
* and analyzer instances can accordingly access it via analyzer::Tag().
|
||||
* If not used, leave at zero.
|
||||
*/
|
||||
Component(const char* name, factory_callback factory,
|
||||
analyzer::Tag::subtype_t subtype = 0);
|
||||
|
||||
/**
|
||||
* Copy constructor.
|
||||
*/
|
||||
Component(const Component& other);
|
||||
|
||||
/**
|
||||
* Destructor.
|
||||
*/
|
||||
~Component();
|
||||
|
||||
/**
|
||||
* Returns the name of the analyzer. This name is unique across all
|
||||
* analyzers and used to identify it. The returned name is derived
|
||||
* from what's passed to the constructor but upper-cased and
|
||||
* canonified to allow being part of a script-level ID.
|
||||
*/
|
||||
const char* Name() const { return name; }
|
||||
|
||||
/**
|
||||
* Returns a canonicalized version of the analyzer's name. The
|
||||
* returned name is derived from what's passed to the constructor but
|
||||
* upper-cased and transformed to allow being part of a script-level
|
||||
* ID.
|
||||
*/
|
||||
const char* CanonicalName() const { return canon_name; }
|
||||
|
||||
/**
|
||||
* Returns the analyzer's factory function.
|
||||
*/
|
||||
factory_callback Factory() const { return factory; }
|
||||
|
||||
/**
|
||||
* Returns the analyzer's tag. Note that this is automatically
|
||||
* generated for each new Component, and hence unique across all of
|
||||
* them.
|
||||
*/
|
||||
analyzer::Tag Tag() const;
|
||||
|
||||
/**
|
||||
* Generates a human-readable description of the component's main
|
||||
* parameters. This goes into the output of \c "bro -NN".
|
||||
*/
|
||||
virtual void Describe(ODesc* d);
|
||||
|
||||
Component& operator=(const Component& other);
|
||||
|
||||
private:
|
||||
const char* name; // The analyzer's name.
|
||||
const char* canon_name; // The analyzer's canonical name.
|
||||
factory_callback factory; // The analyzer's factory callback.
|
||||
analyzer::Tag tag; // The automatically assigned analyzer tag.
|
||||
|
||||
// Global counter used to generate unique tags.
|
||||
static analyzer::Tag::type_t type_counter;
|
||||
};

}

#endif
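As a quick orientation for the Component API above, here is a minimal sketch of what a third-party file analyzer registered through it could look like. The class name Null, the "NULL_EXAMPLE" tag and the base-class constructor signature are assumptions made for illustration; only the Component constructor and the factory_callback typedef come from the header itself.

// Sketch only -- a do-nothing file analyzer wired up via file_analysis::Component.
#include "Val.h"
#include "File.h"
#include "Analyzer.h"
#include "file_analysis/Component.h"

namespace file_analysis {

class Null : public Analyzer {
public:
	// Accept and discard every chunk; returning true keeps the analyzer attached.
	virtual bool DeliverStream(const u_char* data, uint64 len)
		{ return true; }

	// Matches Component::factory_callback.
	static Analyzer* Instantiate(RecordVal* args, File* file)
		{ return new Null(args, file); }

protected:
	// The base-class constructor signature is assumed here.
	Null(RecordVal* args, File* file) : Analyzer(args, file) { }
};

}

// Registration inside a plugin, mirroring the Plugin.cc files added below:
//   AddComponent(new ::file_analysis::Component("NULL_EXAMPLE",
//                    ::file_analysis::Null::Instantiate));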
|
@ -1,36 +0,0 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#ifndef FILE_ANALYSIS_DATAEVENT_H
|
||||
#define FILE_ANALYSIS_DATAEVENT_H
|
||||
|
||||
#include <string>
|
||||
|
||||
#include "Val.h"
|
||||
#include "File.h"
|
||||
#include "Analyzer.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
/**
|
||||
* An analyzer to send file data to script-layer events.
|
||||
*/
|
||||
class DataEvent : public file_analysis::Analyzer {
|
||||
public:
|
||||
virtual bool DeliverChunk(const u_char* data, uint64 len, uint64 offset);
|
||||
|
||||
virtual bool DeliverStream(const u_char* data, uint64 len);
|
||||
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file);
|
||||
|
||||
protected:
|
||||
DataEvent(RecordVal* args, File* file,
|
||||
EventHandlerPtr ce, EventHandlerPtr se);
|
||||
|
||||
private:
|
||||
EventHandlerPtr chunk_event;
|
||||
EventHandlerPtr stream_event;
|
||||
};
|
||||
|
||||
} // namespace file_analysis
|
||||
|
||||
#endif
|
|
@ -1,35 +0,0 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#ifndef FILE_ANALYSIS_EXTRACT_H
|
||||
#define FILE_ANALYSIS_EXTRACT_H
|
||||
|
||||
#include <string>
|
||||
|
||||
#include "Val.h"
|
||||
#include "File.h"
|
||||
#include "Analyzer.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
/**
|
||||
* An analyzer to extract files to disk.
|
||||
*/
|
||||
class Extract : public file_analysis::Analyzer {
|
||||
public:
|
||||
virtual ~Extract();
|
||||
|
||||
virtual bool DeliverChunk(const u_char* data, uint64 len, uint64 offset);
|
||||
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file);
|
||||
|
||||
protected:
|
||||
Extract(RecordVal* args, File* file, const string& arg_filename);
|
||||
|
||||
private:
|
||||
string filename;
|
||||
int fd;
|
||||
};
|
||||
|
||||
} // namespace file_analysis
|
||||
|
||||
#endif
|
|
@ -1,11 +1,9 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#include <string>
|
||||
#include <openssl/md5.h>
|
||||
|
||||
#include "File.h"
|
||||
#include "FileTimer.h"
|
||||
#include "FileID.h"
|
||||
#include "Analyzer.h"
|
||||
#include "Manager.h"
|
||||
#include "Reporter.h"
|
||||
|
@ -53,8 +51,6 @@ int File::bof_buffer_size_idx = -1;
|
|||
int File::bof_buffer_idx = -1;
|
||||
int File::mime_type_idx = -1;
|
||||
|
||||
string File::salt;
|
||||
|
||||
void File::StaticInit()
|
||||
{
|
||||
if ( id_idx != -1 )
|
||||
|
@ -74,42 +70,27 @@ void File::StaticInit()
|
|||
bof_buffer_size_idx = Idx("bof_buffer_size");
|
||||
bof_buffer_idx = Idx("bof_buffer");
|
||||
mime_type_idx = Idx("mime_type");
|
||||
|
||||
salt = BifConst::FileAnalysis::salt->CheckString();
|
||||
}
|
||||
|
||||
File::File(const string& unique, Connection* conn, analyzer::Tag tag,
|
||||
File::File(const string& file_id, Connection* conn, analyzer::Tag tag,
|
||||
bool is_orig)
|
||||
: id(""), unique(unique), val(0), postpone_timeout(false),
|
||||
first_chunk(true), missed_bof(false), need_reassembly(false), done(false),
|
||||
analyzers(this)
|
||||
: id(file_id), val(0), postpone_timeout(false), first_chunk(true),
|
||||
missed_bof(false), need_reassembly(false), done(false), analyzers(this)
|
||||
{
|
||||
StaticInit();
|
||||
|
||||
char tmp[20];
|
||||
uint64 hash[2];
|
||||
string msg(unique + salt);
|
||||
MD5(reinterpret_cast<const u_char*>(msg.data()), msg.size(),
|
||||
reinterpret_cast<u_char*>(hash));
|
||||
uitoa_n(hash[0], tmp, sizeof(tmp), 62);
|
||||
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "Creating new File object %s (%s)", tmp,
|
||||
unique.c_str());
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "Creating new File object %s", file_id.c_str());
|
||||
|
||||
val = new RecordVal(fa_file_type);
|
||||
val->Assign(id_idx, new StringVal(tmp));
|
||||
id = FileID(tmp);
|
||||
val->Assign(id_idx, new StringVal(file_id.c_str()));
|
||||
|
||||
if ( conn )
|
||||
{
|
||||
// add source, connection, is_orig fields
|
||||
val->Assign(source_idx, new StringVal(analyzer_mgr->GetAnalyzerName(tag)));
|
||||
SetSource(analyzer_mgr->GetAnalyzerName(tag));
|
||||
val->Assign(is_orig_idx, new Val(is_orig, TYPE_BOOL));
|
||||
UpdateConnectionFields(conn);
|
||||
}
|
||||
else
|
||||
// use the unique file handle as source
|
||||
val->Assign(source_idx, new StringVal(unique.c_str()));
|
||||
|
||||
UpdateLastActivityTime();
|
||||
}
|
||||
|
@ -189,6 +170,18 @@ int File::Idx(const string& field)
|
|||
return rval;
|
||||
}
|
||||
|
||||
string File::GetSource() const
|
||||
{
|
||||
Val* v = val->Lookup(source_idx);
|
||||
|
||||
return v ? v->AsString()->CheckString() : string();
|
||||
}
|
||||
|
||||
void File::SetSource(const string& source)
|
||||
{
|
||||
val->Assign(source_idx, new StringVal(source.c_str()));
|
||||
}
|
||||
|
||||
double File::GetTimeoutInterval() const
|
||||
{
|
||||
return LookupFieldDefaultInterval(timeout_interval_idx);
|
||||
|
@ -425,7 +418,7 @@ void File::Gap(uint64 offset, uint64 len)
|
|||
|
||||
bool File::FileEventAvailable(EventHandlerPtr h)
|
||||
{
|
||||
return h && ! file_mgr->IsIgnored(unique);
|
||||
return h && ! file_mgr->IsIgnored(id);
|
||||
}
|
||||
|
||||
void File::FileEvent(EventHandlerPtr h)
|
||||
|
|
|
@ -9,7 +9,6 @@
|
|||
#include "Conn.h"
|
||||
#include "Val.h"
|
||||
#include "AnalyzerSet.h"
|
||||
#include "FileID.h"
|
||||
#include "BroString.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
@ -19,13 +18,30 @@ namespace file_analysis {
|
|||
*/
|
||||
class File {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Destructor. Nothing fancy, releases a reference to the wrapped
|
||||
* \c fa_file value.
|
||||
*/
|
||||
~File();
|
||||
|
||||
/**
|
||||
* @return the #val record.
|
||||
* @return the wrapped \c fa_file record value, #val.
|
||||
*/
|
||||
RecordVal* GetVal() const { return val; }
|
||||
|
||||
/**
|
||||
* @return the value of the "source" field from #val record or an empty
|
||||
* string if it's not initialized.
|
||||
*/
|
||||
string GetSource() const;
|
||||
|
||||
/**
|
||||
* Set the "source" field from #val record to \a source.
|
||||
* @param source the new value of the "source" field.
|
||||
*/
|
||||
void SetSource(const string& source);
|
||||
|
||||
/**
|
||||
* @return value (seconds) of the "timeout_interval" field from #val record.
|
||||
*/
|
||||
|
@ -33,18 +49,14 @@ public:
|
|||
|
||||
/**
|
||||
* Set the "timeout_interval" field from #val record to \a interval seconds.
|
||||
* @param interval the new value of the "timeout_interval" field.
|
||||
*/
|
||||
void SetTimeoutInterval(double interval);
|
||||
|
||||
/**
|
||||
* @return value of the "id" field from #val record.
|
||||
*/
|
||||
FileID GetID() const { return id; }
|
||||
|
||||
/**
|
||||
* @return the string which uniquely identifies the file.
|
||||
*/
|
||||
string GetUnique() const { return unique; }
|
||||
string GetID() const { return id; }
|
||||
|
||||
/**
|
||||
* @return value of "last_active" field in #val record;
|
||||
|
@ -58,13 +70,15 @@ public:
|
|||
|
||||
/**
|
||||
* Set "total_bytes" field of #val record to \a size.
|
||||
* @param size the new value of the "total_bytes" field.
|
||||
*/
|
||||
void SetTotalBytes(uint64 size);
|
||||
|
||||
/**
|
||||
* Compares "seen_bytes" field to "total_bytes" field of #val record
|
||||
* and returns true if the comparison indicates the full file was seen.
|
||||
* If "total_bytes" hasn't been set yet, it returns false.
|
||||
* Compares "seen_bytes" field to "total_bytes" field of #val record to
|
||||
* determine if the full file has been seen.
|
||||
* @return false if "total_bytes" hasn't been set yet or "seen_bytes" is
|
||||
* less than it, else true.
|
||||
*/
|
||||
bool IsComplete() const;
|
||||
|
||||
|
@ -78,23 +92,30 @@ public:
|
|||
/**
|
||||
* Queues attaching an analyzer. Only one analyzer per type can be attached
|
||||
* at a time unless the arguments differ.
|
||||
* @param args an \c AnalyzerArgs value representing a file analyzer.
|
||||
* @return false if analyzer can't be instantiated, else true.
|
||||
*/
|
||||
bool AddAnalyzer(RecordVal* args);
|
||||
|
||||
/**
|
||||
* Queues removal of an analyzer.
|
||||
* @param args an \c AnalyzerArgs value representing a file analyzer.
|
||||
* @return true if analyzer was active at time of call, else false.
|
||||
*/
|
||||
bool RemoveAnalyzer(const RecordVal* args);
|
||||
|
||||
/**
|
||||
* Pass in non-sequential data and deliver to attached analyzers.
|
||||
* @param data pointer to start of a chunk of file data.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @param offset number of bytes from start of file at which chunk occurs.
|
||||
*/
|
||||
void DataIn(const u_char* data, uint64 len, uint64 offset);
|
||||
|
||||
/**
|
||||
* Pass in sequential data and deliver to attached analyzers.
|
||||
* @param data pointer to start of a chunk of file data.
|
||||
* @param len number of bytes in the data chunk.
|
||||
*/
|
||||
void DataIn(const u_char* data, uint64 len);
|
||||
|
||||
|
@ -105,10 +126,13 @@ public:
|
|||
|
||||
/**
|
||||
* Inform attached analyzers about a gap in file stream.
|
||||
* @param offset number of bytes in to file at which missing chunk starts.
|
||||
* @param len length in bytes of the missing chunk of file data.
|
||||
*/
|
||||
void Gap(uint64 offset, uint64 len);
|
||||
|
||||
/**
|
||||
* @param h pointer to an event handler.
|
||||
* @return true if event has a handler and the file isn't ignored.
|
||||
*/
|
||||
bool FileEventAvailable(EventHandlerPtr h);
|
||||
|
@ -116,11 +140,14 @@ public:
|
|||
/**
|
||||
* Raises an event related to the file's life-cycle; the only parameter
* to that event is the \c fa_file record.
|
||||
* @param h pointer to an event handler.
|
||||
*/
|
||||
void FileEvent(EventHandlerPtr h);
|
||||
|
||||
/**
|
||||
* Raises an event related to the file's life-cycle.
|
||||
* @param h pointer to an event handler.
|
||||
* @param vl list of argument values to pass to event call.
|
||||
*/
|
||||
void FileEvent(EventHandlerPtr h, val_list* vl);
|
||||
|
||||
|
@ -129,35 +156,51 @@ protected:
|
|||
|
||||
/**
|
||||
* Constructor; only file_analysis::Manager should be creating these.
|
||||
* @param file_id an identifier string for the file in pretty hash form
|
||||
* (similar to connection uids).
|
||||
* @param conn a network connection over which the file is transferred.
|
||||
* @param tag the network protocol over which the file is transferred.
|
||||
* @param is_orig true if the file is being transferred from the originator
|
||||
* of the connection to the responder. False indicates the other
|
||||
* direction.
|
||||
*/
|
||||
File(const string& unique, Connection* conn = 0,
|
||||
File(const string& file_id, Connection* conn = 0,
|
||||
analyzer::Tag tag = analyzer::Tag::Error, bool is_orig = false);
|
||||
|
||||
/**
|
||||
* Updates the "conn_ids" and "conn_uids" fields in #val record with the
|
||||
* \c conn_id and UID taken from \a conn.
|
||||
* @param conn the connection over which a part of the file has been seen.
|
||||
*/
|
||||
void UpdateConnectionFields(Connection* conn);
|
||||
|
||||
/**
|
||||
* Increment a byte count field of #val record by \a size.
|
||||
* @param size number of bytes by which to increment.
|
||||
* @param field_idx the index of the field in \c fa_file to increment.
|
||||
*/
|
||||
void IncrementByteCount(uint64 size, int field_idx);
|
||||
|
||||
/**
|
||||
* Wrapper to RecordVal::LookupWithDefault for the field in #val at index
|
||||
* \a idx which automatically unrefs the Val and returns a converted value.
|
||||
* @param idx the index of a field of type "count" in \c fa_file.
|
||||
* @return the value of the field, which may be its &default.
|
||||
*/
|
||||
uint64 LookupFieldDefaultCount(int idx) const;
|
||||
|
||||
/**
|
||||
* Wrapper to RecordVal::LookupWithDefault for the field in #val at index
|
||||
* \a idx which automatically unrefs the Val and returns a converted value.
|
||||
* @param idx the index of a field of type "interval" in \c fa_file.
|
||||
* @return the value of the field, which may be its &default.
|
||||
*/
|
||||
double LookupFieldDefaultInterval(int idx) const;
|
||||
|
||||
/**
|
||||
* Buffers incoming data at the beginning of a file.
|
||||
* @param data pointer to a data chunk to buffer.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @return true if buffering is still required, else false
|
||||
*/
|
||||
bool BufferBOF(const u_char* data, uint64 len);
|
||||
|
@ -170,11 +213,15 @@ protected:
|
|||
/**
|
||||
* Does mime type detection and assigns type (if available) to \c mime_type
|
||||
* field in #val.
|
||||
* @param data pointer to a chunk of file data.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @return whether mime type was available.
|
||||
*/
|
||||
bool DetectMIME(const u_char* data, uint64 len);
|
||||
|
||||
/**
|
||||
* Lookup a record field index/offset by name.
|
||||
* @param field_name the name of the \c fa_file record field.
|
||||
* @return the field offset in #val record corresponding to \a field_name.
|
||||
*/
|
||||
static int Idx(const string& field_name);
|
||||
|
@ -185,15 +232,14 @@ protected:
|
|||
static void StaticInit();
|
||||
|
||||
private:
|
||||
FileID id; /**< A pretty hash that likely identifies file */
|
||||
string unique; /**< A string that uniquely identifies file */
|
||||
string id; /**< A pretty hash that likely identifies file */
|
||||
RecordVal* val; /**< \c fa_file from script layer. */
|
||||
bool postpone_timeout; /**< Whether postponing timeout is requested. */
|
||||
bool first_chunk; /**< Track first non-linear chunk. */
|
||||
bool missed_bof; /**< Flags that we missed start of file. */
|
||||
bool need_reassembly; /**< Whether file stream reassembly is needed. */
|
||||
bool done; /**< If this object is about to be deleted. */
|
||||
AnalyzerSet analyzers;
|
||||
AnalyzerSet analyzers; /**< A set of attached file analyzers. */
|
||||
|
||||
struct BOF_Buffer {
|
||||
BOF_Buffer() : full(false), replayed(false), size(0) {}
|
||||
|
@ -206,8 +252,6 @@ private:
|
|||
BroString::CVec chunks;
|
||||
} bof_buffer; /**< Beginning of file buffer. */
|
||||
|
||||
static string salt;
|
||||
|
||||
static int id_idx;
|
||||
static int parent_id_idx;
|
||||
static int source_idx;
|
||||
|
|
|
@ -1,34 +0,0 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#ifndef FILE_ANALYSIS_FILEID_H
|
||||
#define FILE_ANALYSIS_FILEID_H
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
/**
|
||||
* A simple string wrapper class to help enforce some type safety between
|
||||
* methods of FileAnalysis::Manager, some of which use a unique string to
|
||||
* identify files, and others which use a pretty hash (the FileID) to identify
|
||||
* files. A FileID is primarily used in methods which interface with the
|
||||
* script-layer, while the unique strings are used for methods which interface
|
||||
* with protocol analyzers or anything that sends data to the file analysis
|
||||
* framework.
|
||||
*/
|
||||
struct FileID {
|
||||
string id;
|
||||
|
||||
explicit FileID(const string arg_id) : id(arg_id) {}
|
||||
FileID(const FileID& other) : id(other.id) {}
|
||||
|
||||
const char* c_str() const { return id.c_str(); }
|
||||
|
||||
bool operator==(const FileID& rhs) const { return id == rhs.id; }
|
||||
bool operator<(const FileID& rhs) const { return id < rhs.id; }
|
||||
|
||||
FileID& operator=(const FileID& rhs) { id = rhs.id; return *this; }
|
||||
FileID& operator=(const string& rhs) { id = rhs; return *this; }
|
||||
};
|
||||
|
||||
} // namespace file_analysis
|
||||
|
||||
#endif
|
|
@ -5,7 +5,7 @@
|
|||
|
||||
using namespace file_analysis;
|
||||
|
||||
FileTimer::FileTimer(double t, const FileID& id, double interval)
|
||||
FileTimer::FileTimer(double t, const string& id, double interval)
|
||||
: Timer(t + interval, TIMER_FILE_ANALYSIS_INACTIVITY), file_id(id)
|
||||
{
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "New %f second timeout timer for %s",
|
||||
|
|
|
@ -5,7 +5,6 @@
|
|||
|
||||
#include <string>
|
||||
#include "Timer.h"
|
||||
#include "FileID.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
|
@ -14,16 +13,25 @@ namespace file_analysis {
|
|||
*/
|
||||
class FileTimer : public Timer {
|
||||
public:
|
||||
FileTimer(double t, const FileID& id, double interval);
|
||||
|
||||
/**
|
||||
* Constructor, nothing interesting about it.
|
||||
* @param t unix time at which the timer should start ticking.
|
||||
* @param id the file identifier which will be checked for inactivity.
|
||||
* @param interval amount of time after \a t to check for inactivity.
|
||||
*/
|
||||
FileTimer(double t, const string& id, double interval);
|
||||
|
||||
/**
|
||||
* Check inactivity of file_analysis::File corresponding to #file_id,
|
||||
* reschedule if active, else call file_analysis::Manager::Timeout.
|
||||
* @param t current unix time
|
||||
* @param is_expire true if all pending timers are being expired.
|
||||
*/
|
||||
void Dispatch(double t, int is_expire);
|
||||
|
||||
private:
|
||||
FileID file_id;
|
||||
string file_id;
|
||||
};
|
||||
|
||||
} // namespace file_analysis
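For context on how these timers get armed: Manager::GetFile (later in this diff) calls File::ScheduleInactivityTimer, which presumably reduces to the call sketched below. The method body is an assumption for illustration; only the FileTimer constructor arguments and the File accessors come from this patch.

// Assumed shape of the scheduling call; timer_mgr and network_time are Bro globals.
void File::ScheduleInactivityTimer() const
	{
	timer_mgr->Add(new FileTimer(network_time, GetID(), GetTimeoutInterval()));
	}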
@ -1,74 +0,0 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#ifndef FILE_ANALYSIS_HASH_H
|
||||
#define FILE_ANALYSIS_HASH_H
|
||||
|
||||
#include <string>
|
||||
|
||||
#include "Val.h"
|
||||
#include "OpaqueVal.h"
|
||||
#include "File.h"
|
||||
#include "Analyzer.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
/**
|
||||
* An analyzer to produce a hash of file contents.
|
||||
*/
|
||||
class Hash : public file_analysis::Analyzer {
|
||||
public:
|
||||
virtual ~Hash();
|
||||
|
||||
virtual bool DeliverStream(const u_char* data, uint64 len);
|
||||
|
||||
virtual bool EndOfFile();
|
||||
|
||||
virtual bool Undelivered(uint64 offset, uint64 len);
|
||||
|
||||
protected:
|
||||
Hash(RecordVal* args, File* file, HashVal* hv, const char* kind);
|
||||
|
||||
void Finalize();
|
||||
|
||||
private:
|
||||
HashVal* hash;
|
||||
bool fed;
|
||||
const char* kind;
|
||||
};
|
||||
|
||||
class MD5 : public Hash {
|
||||
public:
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file)
|
||||
{ return file_hash ? new MD5(args, file) : 0; }
|
||||
|
||||
protected:
|
||||
MD5(RecordVal* args, File* file)
|
||||
: Hash(args, file, new MD5Val(), "md5")
|
||||
{}
|
||||
};
|
||||
|
||||
class SHA1 : public Hash {
|
||||
public:
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file)
|
||||
{ return file_hash ? new SHA1(args, file) : 0; }
|
||||
|
||||
protected:
|
||||
SHA1(RecordVal* args, File* file)
|
||||
: Hash(args, file, new SHA1Val(), "sha1")
|
||||
{}
|
||||
};
|
||||
|
||||
class SHA256 : public Hash {
|
||||
public:
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file)
|
||||
{ return file_hash ? new SHA256(args, file) : 0; }
|
||||
|
||||
protected:
|
||||
SHA256(RecordVal* args, File* file)
|
||||
: Hash(args, file, new SHA256Val(), "sha256")
|
||||
{}
|
||||
};
|
||||
|
||||
} // namespace file_analysis
|
||||
|
||||
#endif
|
|
@ -2,6 +2,7 @@
|
|||
|
||||
#include <vector>
|
||||
#include <string>
|
||||
#include <openssl/md5.h>
|
||||
|
||||
#include "Manager.h"
|
||||
#include "File.h"
|
||||
|
@ -9,12 +10,18 @@
|
|||
#include "Var.h"
|
||||
#include "Event.h"
|
||||
|
||||
#include "plugin/Manager.h"
|
||||
|
||||
using namespace file_analysis;
|
||||
|
||||
TableVal* Manager::disabled = 0;
|
||||
string Manager::salt;
|
||||
|
||||
Manager::Manager()
|
||||
{
|
||||
tag_enum_type = new EnumType("FileAnalysis::Tag");
|
||||
::ID* id = install_ID("Tag", "FileAnalysis", true, true);
|
||||
add_type(id, tag_enum_type, 0, 0);
|
||||
}
|
||||
|
||||
Manager::~Manager()
|
||||
|
@ -22,9 +29,44 @@ Manager::~Manager()
|
|||
Terminate();
|
||||
}
|
||||
|
||||
void Manager::InitPreScript()
|
||||
{
|
||||
std::list<Component*> analyzers = plugin_mgr->Components<Component>();
|
||||
|
||||
for ( std::list<Component*>::const_iterator i = analyzers.begin();
|
||||
i != analyzers.end(); ++i )
|
||||
RegisterAnalyzerComponent(*i);
|
||||
}
|
||||
|
||||
void Manager::RegisterAnalyzerComponent(Component* component)
|
||||
{
|
||||
const char* cname = component->CanonicalName();
|
||||
|
||||
if ( tag_enum_type->Lookup("FileAnalysis", cname) != -1 )
|
||||
reporter->FatalError("File Analyzer %s defined more than once", cname);
|
||||
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "Registering analyzer %s (tag %s)",
|
||||
component->Name(), component->Tag().AsString().c_str());
|
||||
|
||||
analyzers_by_name.insert(std::make_pair(cname, component));
|
||||
analyzers_by_tag.insert(std::make_pair(component->Tag(), component));
|
||||
analyzers_by_val.insert(std::make_pair(
|
||||
component->Tag().AsEnumVal()->InternalInt(), component));
|
||||
|
||||
string id = fmt("ANALYZER_%s", cname);
|
||||
tag_enum_type->AddName("FileAnalysis", id.c_str(),
|
||||
component->Tag().AsEnumVal()->InternalInt(), true);
|
||||
}
|
||||
|
||||
void Manager::InitPostScript()
|
||||
{
|
||||
#include "file_analysis.bif.init.cc"
|
||||
}
|
||||
|
||||
void Manager::Terminate()
|
||||
{
|
||||
vector<FileID> keys;
|
||||
vector<string> keys;
|
||||
|
||||
for ( IDMap::iterator it = id_map.begin(); it != id_map.end(); ++it )
|
||||
keys.push_back(it->first);
|
||||
|
||||
|
@ -32,66 +74,77 @@ void Manager::Terminate()
|
|||
Timeout(keys[i], true);
|
||||
}
|
||||
|
||||
string Manager::HashHandle(const string& handle) const
|
||||
{
|
||||
if ( salt.empty() )
|
||||
salt = BifConst::FileAnalysis::salt->CheckString();
|
||||
|
||||
char tmp[20];
|
||||
uint64 hash[2];
|
||||
string msg(handle + salt);
|
||||
|
||||
MD5(reinterpret_cast<const u_char*>(msg.data()), msg.size(),
|
||||
reinterpret_cast<u_char*>(hash));
|
||||
uitoa_n(hash[0], tmp, sizeof(tmp), 62);
|
||||
|
||||
return tmp;
|
||||
}
|
||||
|
||||
void Manager::SetHandle(const string& handle)
|
||||
{
|
||||
current_handle = handle;
|
||||
if ( handle.empty() )
|
||||
return;
|
||||
|
||||
current_file_id = HashHandle(handle);
|
||||
}
|
||||
|
||||
void Manager::DataIn(const u_char* data, uint64 len, uint64 offset,
|
||||
analyzer::Tag tag, Connection* conn, bool is_orig)
|
||||
{
|
||||
if ( IsDisabled(tag) )
|
||||
return;
|
||||
|
||||
GetFileHandle(tag, conn, is_orig);
|
||||
DataIn(data, len, offset, GetFile(current_handle, conn, tag, is_orig));
|
||||
}
|
||||
File* file = GetFile(current_file_id, conn, tag, is_orig);
|
||||
|
||||
void Manager::DataIn(const u_char* data, uint64 len, uint64 offset,
|
||||
const string& unique)
|
||||
{
|
||||
DataIn(data, len, offset, GetFile(unique));
|
||||
}
|
||||
|
||||
void Manager::DataIn(const u_char* data, uint64 len, uint64 offset,
|
||||
File* file)
|
||||
{
|
||||
if ( ! file )
|
||||
return;
|
||||
|
||||
file->DataIn(data, len, offset);
|
||||
|
||||
if ( file->IsComplete() )
|
||||
RemoveFile(file->GetUnique());
|
||||
RemoveFile(file->GetID());
|
||||
}
|
||||
|
||||
void Manager::DataIn(const u_char* data, uint64 len, analyzer::Tag tag,
|
||||
Connection* conn, bool is_orig)
|
||||
{
|
||||
if ( IsDisabled(tag) )
|
||||
return;
|
||||
|
||||
GetFileHandle(tag, conn, is_orig);
|
||||
|
||||
// Sequential data input shouldn't be going over multiple conns, so don't
|
||||
// do the check to update connection set.
|
||||
DataIn(data, len, GetFile(current_handle, conn, tag, is_orig, false));
|
||||
}
|
||||
File* file = GetFile(current_file_id, conn, tag, is_orig, false);
|
||||
|
||||
void Manager::DataIn(const u_char* data, uint64 len, const string& unique)
|
||||
{
|
||||
DataIn(data, len, GetFile(unique));
|
||||
}
|
||||
|
||||
void Manager::DataIn(const u_char* data, uint64 len, File* file)
|
||||
{
|
||||
if ( ! file )
|
||||
return;
|
||||
|
||||
file->DataIn(data, len);
|
||||
|
||||
if ( file->IsComplete() )
|
||||
RemoveFile(file->GetUnique());
|
||||
RemoveFile(file->GetID());
|
||||
}
|
||||
|
||||
void Manager::DataIn(const u_char* data, uint64 len, const string& file_id,
                     const string& source)
	{
	File* file = GetFile(file_id);

	if ( ! file )
		return;

	if ( file->GetSource().empty() )
		file->SetSource(source);

	file->DataIn(data, len);

	if ( file->IsComplete() )
		RemoveFile(file->GetID());
	}
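This overload is the hook behind the new Input::add_analysis path mentioned in the CHANGES entry. A rough sketch of a caller that has already chunked its input follows; the reader object and variable names are illustrative, while HashHandle, DataIn and EndOfFile are the Manager methods from this patch.

// Sketch only: feeding file analysis from a non-network source.
string source = "/var/tmp/sample.bin";        // human-readable origin of the data
string fid = file_mgr->HashHandle(source);    // salted, base-62 MD5 identifier

u_char buf[4096];
while ( uint64 n = reader->NextChunk(buf, sizeof(buf)) )   // hypothetical reader
	file_mgr->DataIn(buf, n, fid, source);

file_mgr->EndOfFile(fid);                     // flush analyzers and drop the File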
void Manager::EndOfFile(analyzer::Tag tag, Connection* conn)
|
||||
|
@ -102,35 +155,22 @@ void Manager::EndOfFile(analyzer::Tag tag, Connection* conn)
|
|||
|
||||
void Manager::EndOfFile(analyzer::Tag tag, Connection* conn, bool is_orig)
|
||||
{
|
||||
if ( IsDisabled(tag) )
|
||||
return;
|
||||
|
||||
// Don't need to create a file if we're just going to remove it right away.
|
||||
GetFileHandle(tag, conn, is_orig);
|
||||
EndOfFile(current_handle);
|
||||
RemoveFile(current_file_id);
|
||||
}
|
||||
|
||||
void Manager::EndOfFile(const string& unique)
|
||||
void Manager::EndOfFile(const string& file_id)
|
||||
{
|
||||
RemoveFile(unique);
|
||||
RemoveFile(file_id);
|
||||
}
|
||||
|
||||
void Manager::Gap(uint64 offset, uint64 len, analyzer::Tag tag,
|
||||
Connection* conn, bool is_orig)
|
||||
{
|
||||
if ( IsDisabled(tag) )
|
||||
return;
|
||||
|
||||
GetFileHandle(tag, conn, is_orig);
|
||||
Gap(offset, len, GetFile(current_handle, conn, tag, is_orig));
|
||||
}
|
||||
File* file = GetFile(current_file_id, conn, tag, is_orig);
|
||||
|
||||
void Manager::Gap(uint64 offset, uint64 len, const string& unique)
|
||||
{
|
||||
Gap(offset, len, GetFile(unique));
|
||||
}
|
||||
|
||||
void Manager::Gap(uint64 offset, uint64 len, File* file)
|
||||
{
|
||||
if ( ! file )
|
||||
return;
|
||||
|
||||
|
@ -140,52 +180,33 @@ void Manager::Gap(uint64 offset, uint64 len, File* file)
|
|||
void Manager::SetSize(uint64 size, analyzer::Tag tag, Connection* conn,
|
||||
bool is_orig)
|
||||
{
|
||||
if ( IsDisabled(tag) )
|
||||
return;
|
||||
|
||||
GetFileHandle(tag, conn, is_orig);
|
||||
SetSize(size, GetFile(current_handle, conn, tag, is_orig));
|
||||
}
|
||||
File* file = GetFile(current_file_id, conn, tag, is_orig);
|
||||
|
||||
void Manager::SetSize(uint64 size, const string& unique)
|
||||
{
|
||||
SetSize(size, GetFile(unique));
|
||||
}
|
||||
|
||||
void Manager::SetSize(uint64 size, File* file)
|
||||
{
|
||||
if ( ! file )
|
||||
return;
|
||||
|
||||
file->SetTotalBytes(size);
|
||||
|
||||
if ( file->IsComplete() )
|
||||
RemoveFile(file->GetUnique());
|
||||
RemoveFile(file->GetID());
|
||||
}
|
||||
|
||||
bool Manager::PostponeTimeout(const FileID& file_id) const
|
||||
bool Manager::SetTimeoutInterval(const string& file_id, double interval) const
|
||||
{
|
||||
File* file = Lookup(file_id);
|
||||
|
||||
if ( ! file )
|
||||
return false;
|
||||
|
||||
if ( interval > 0 )
|
||||
file->postpone_timeout = true;
|
||||
return true;
|
||||
}
|
||||
|
||||
bool Manager::SetTimeoutInterval(const FileID& file_id, double interval) const
|
||||
{
|
||||
File* file = Lookup(file_id);
|
||||
|
||||
if ( ! file )
|
||||
return false;
|
||||
|
||||
file->SetTimeoutInterval(interval);
|
||||
return true;
|
||||
}
|
||||
|
||||
bool Manager::AddAnalyzer(const FileID& file_id, RecordVal* args) const
|
||||
bool Manager::AddAnalyzer(const string& file_id, RecordVal* args) const
|
||||
{
|
||||
File* file = Lookup(file_id);
|
||||
|
||||
|
@ -195,7 +216,7 @@ bool Manager::AddAnalyzer(const FileID& file_id, RecordVal* args) const
|
|||
return file->AddAnalyzer(args);
|
||||
}
|
||||
|
||||
bool Manager::RemoveAnalyzer(const FileID& file_id, const RecordVal* args) const
|
||||
bool Manager::RemoveAnalyzer(const string& file_id, const RecordVal* args) const
|
||||
{
|
||||
File* file = Lookup(file_id);
|
||||
|
||||
|
@ -205,32 +226,23 @@ bool Manager::RemoveAnalyzer(const FileID& file_id, const RecordVal* args) const
|
|||
return file->RemoveAnalyzer(args);
|
||||
}
|
||||
|
||||
File* Manager::GetFile(const string& unique, Connection* conn,
|
||||
File* Manager::GetFile(const string& file_id, Connection* conn,
|
||||
analyzer::Tag tag, bool is_orig, bool update_conn)
|
||||
{
|
||||
if ( unique.empty() )
|
||||
if ( file_id.empty() )
|
||||
return 0;
|
||||
|
||||
if ( IsIgnored(unique) )
|
||||
if ( IsIgnored(file_id) )
|
||||
return 0;
|
||||
|
||||
File* rval = str_map[unique];
|
||||
File* rval = id_map[file_id];
|
||||
|
||||
if ( ! rval )
|
||||
{
|
||||
rval = str_map[unique] = new File(unique, conn, tag, is_orig);
|
||||
FileID id = rval->GetID();
|
||||
|
||||
if ( id_map[id] )
|
||||
{
|
||||
reporter->Error("Evicted duplicate file ID: %s", id.c_str());
|
||||
RemoveFile(unique);
|
||||
}
|
||||
|
||||
id_map[id] = rval;
|
||||
rval = id_map[file_id] = new File(file_id, conn, tag, is_orig);
|
||||
rval->ScheduleInactivityTimer();
|
||||
|
||||
if ( IsIgnored(unique) )
|
||||
if ( IsIgnored(file_id) )
|
||||
return 0;
|
||||
}
|
||||
else
|
||||
|
@ -244,7 +256,7 @@ File* Manager::GetFile(const string& unique, Connection* conn,
|
|||
return rval;
|
||||
}
|
||||
|
||||
File* Manager::Lookup(const FileID& file_id) const
|
||||
File* Manager::Lookup(const string& file_id) const
|
||||
{
|
||||
IDMap::const_iterator it = id_map.find(file_id);
|
||||
|
||||
|
@ -254,7 +266,7 @@ File* Manager::Lookup(const FileID& file_id) const
|
|||
return it->second;
|
||||
}
|
||||
|
||||
void Manager::Timeout(const FileID& file_id, bool is_terminating)
|
||||
void Manager::Timeout(const string& file_id, bool is_terminating)
|
||||
{
|
||||
File* file = Lookup(file_id);
|
||||
|
||||
|
@ -277,53 +289,50 @@ void Manager::Timeout(const FileID& file_id, bool is_terminating)
|
|||
DBG_LOG(DBG_FILE_ANALYSIS, "File analysis timeout for %s",
|
||||
file->GetID().c_str());
|
||||
|
||||
RemoveFile(file->GetUnique());
|
||||
RemoveFile(file->GetID());
|
||||
}
|
||||
|
||||
bool Manager::IgnoreFile(const FileID& file_id)
|
||||
bool Manager::IgnoreFile(const string& file_id)
|
||||
{
|
||||
if ( id_map.find(file_id) == id_map.end() )
|
||||
return false;
|
||||
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "Ignore FileID %s", file_id.c_str());
|
||||
|
||||
ignored.insert(file_id);
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
bool Manager::RemoveFile(const string& file_id)
|
||||
{
|
||||
IDMap::iterator it = id_map.find(file_id);
|
||||
|
||||
if ( it == id_map.end() )
|
||||
return false;
|
||||
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "Ignore FileID %s", file_id.c_str());
|
||||
|
||||
ignored.insert(it->second->GetUnique());
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
bool Manager::RemoveFile(const string& unique)
|
||||
{
|
||||
StrMap::iterator it = str_map.find(unique);
|
||||
|
||||
if ( it == str_map.end() )
|
||||
return false;
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "Remove FileID %s", file_id.c_str());
|
||||
|
||||
it->second->EndOfFile();
|
||||
|
||||
FileID id = it->second->GetID();
|
||||
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "Remove FileID %s", id.c_str());
|
||||
|
||||
if ( ! id_map.erase(id) )
|
||||
reporter->Error("No mapping for fileID %s", id.c_str());
|
||||
|
||||
ignored.erase(unique);
|
||||
delete it->second;
|
||||
str_map.erase(unique);
|
||||
id_map.erase(file_id);
|
||||
ignored.erase(file_id);
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
bool Manager::IsIgnored(const string& unique)
|
||||
bool Manager::IsIgnored(const string& file_id)
|
||||
{
|
||||
return ignored.find(unique) != ignored.end();
|
||||
return ignored.find(file_id) != ignored.end();
|
||||
}
|
||||
|
||||
void Manager::GetFileHandle(analyzer::Tag tag, Connection* c, bool is_orig)
|
||||
{
|
||||
current_handle.clear();
|
||||
current_file_id.clear();
|
||||
|
||||
if ( IsDisabled(tag) )
|
||||
return;
|
||||
|
||||
if ( ! get_file_handle )
|
||||
return;
|
||||
|
@ -357,3 +366,31 @@ bool Manager::IsDisabled(analyzer::Tag tag)
|
|||
|
||||
return rval;
|
||||
}
|
||||
|
||||
Analyzer* Manager::InstantiateAnalyzer(int tag, RecordVal* args, File* f) const
	{
	analyzer_map_by_val::const_iterator it = analyzers_by_val.find(tag);

	if ( it == analyzers_by_val.end() )
		reporter->InternalError("cannot instantiate unknown file analyzer: %d",
		                        tag);

	Component* c = it->second;

	if ( ! c->Factory() )
		reporter->InternalError("file analyzer %s cannot be instantiated "
		                        "dynamically", c->CanonicalName());

	return c->Factory()(args, f);
	}

const char* Manager::GetAnalyzerName(int tag) const
	{
	analyzer_map_by_val::const_iterator it = analyzers_by_val.find(tag);

	if ( it == analyzers_by_val.end() )
		reporter->InternalError("cannot get name of unknown file analyzer: %d",
		                        tag);

	return it->second->CanonicalName();
	}
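To make the lookup chain concrete, here is a hedged sketch of how a script-layer AnalyzerArgs value ends up in this factory path (this is what the AnalyzerSet behind File::AddAnalyzer does). Pulling the tag enum out of args by hand, and its field position, are assumptions for illustration; InstantiateAnalyzer and GetAnalyzerName are as defined above.

// Sketch only: instantiating an analyzer from an AnalyzerArgs record value.
int tag_val = args->Lookup(0)->AsEnumVal()->InternalInt();   // field layout assumed
file_analysis::Analyzer* a = file_mgr->InstantiateAnalyzer(tag_val, args, file);

if ( ! a )
	reporter->Error("could not instantiate file analyzer %s",
	                file_mgr->GetAnalyzerName(tag_val));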
@ -17,10 +17,12 @@
|
|||
|
||||
#include "File.h"
|
||||
#include "FileTimer.h"
|
||||
#include "FileID.h"
|
||||
#include "Component.h"
|
||||
|
||||
#include "analyzer/Tag.h"
|
||||
|
||||
#include "file_analysis/file_analysis.bif.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
/**
|
||||
|
@ -28,152 +30,280 @@ namespace file_analysis {
|
|||
*/
|
||||
class Manager {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Constructor.
|
||||
*/
|
||||
Manager();
|
||||
|
||||
/**
|
||||
* Destructor. Times out any currently active file analyses.
|
||||
*/
|
||||
~Manager();
|
||||
|
||||
/**
|
||||
* First-stage initialization of the manager. This is called early on
|
||||
* during Bro's initialization, before any scripts are processed.
|
||||
*/
|
||||
void InitPreScript();
|
||||
|
||||
/**
|
||||
* Second-stage initialization of the manager. This is called late
|
||||
* during Bro's initialization after any scripts are processed.
|
||||
*/
|
||||
void InitPostScript();
|
||||
|
||||
/**
|
||||
* Times out any active file analysis to prepare for shutdown.
|
||||
*/
|
||||
void Terminate();
|
||||
|
||||
/**
|
||||
* Take in a unique file handle string to identifiy incoming file data.
|
||||
* Creates a file identifier from a unique file handle string.
|
||||
* @param handle a unique string which identifies a single file.
|
||||
* @return a prettified MD5 hash of \a handle, truncated to 64-bits.
|
||||
*/
|
||||
string HashHandle(const string& handle) const;
|
||||
|
||||
/**
|
||||
* Take in a unique file handle string to identify next piece of
|
||||
* incoming file data/information.
|
||||
* @param handle a unique string which identifies a single file.
|
||||
*/
|
||||
void SetHandle(const string& handle);
|
||||
|
||||
/**
|
||||
* Pass in non-sequential file data.
|
||||
* @param data pointer to start of a chunk of file data.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @param offset number of bytes from start of file that data chunk occurs.
|
||||
* @param tag network protocol over which the file data is transferred.
|
||||
* @param conn network connection over which the file data is transferred.
|
||||
* @param is_orig true if the file is being sent from connection originator
|
||||
* or false if is being sent in the opposite direction.
|
||||
*/
|
||||
void DataIn(const u_char* data, uint64 len, uint64 offset,
|
||||
analyzer::Tag tag, Connection* conn, bool is_orig);
|
||||
void DataIn(const u_char* data, uint64 len, uint64 offset,
|
||||
const string& unique);
|
||||
void DataIn(const u_char* data, uint64 len, uint64 offset,
|
||||
File* file);
|
||||
|
||||
/**
|
||||
* Pass in sequential file data.
|
||||
* @param data pointer to start of a chunk of file data.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @param tag network protocol over which the file data is transferred.
|
||||
* @param conn network connection over which the file data is transferred.
|
||||
* @param is_orig true if the file is being sent from connection originator
|
||||
* or false if is being sent in the opposite direction.
|
||||
*/
|
||||
void DataIn(const u_char* data, uint64 len, analyzer::Tag tag,
|
||||
Connection* conn, bool is_orig);
|
||||
void DataIn(const u_char* data, uint64 len, const string& unique);
|
||||
void DataIn(const u_char* data, uint64 len, File* file);
|
||||
|
||||
/**
|
||||
* Signal the end of file data.
|
||||
* Pass in sequential file data from external source (e.g. input framework).
|
||||
* @param data pointer to start of a chunk of file data.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @param file_id an identifier for the file (usually a hash of \a source).
|
||||
* @param source uniquely identifies the file and should also describe
|
||||
* in human-readable form where the file input is coming from (e.g.
|
||||
* a local file path).
|
||||
*/
|
||||
void DataIn(const u_char* data, uint64 len, const string& file_id,
|
||||
const string& source);
|
||||
|
||||
/**
|
||||
* Signal the end of file data regardless of which direction it is being
|
||||
* sent over the connection.
|
||||
* @param tag network protocol over which the file data is transferred.
|
||||
* @param conn network connection over which the file data is transferred.
|
||||
*/
|
||||
void EndOfFile(analyzer::Tag tag, Connection* conn);
|
||||
|
||||
/**
|
||||
* Signal the end of file data being transferred over a connection in
|
||||
* a particular direction.
|
||||
* @param tag network protocol over which the file data is transferred.
|
||||
* @param conn network connection over which the file data is transferred.
|
||||
*/
|
||||
void EndOfFile(analyzer::Tag tag, Connection* conn, bool is_orig);
|
||||
void EndOfFile(const string& unique);
|
||||
|
||||
/**
|
||||
* Signal the end of file data being transferred using the file identifier.
|
||||
* @param file_id the file identifier/hash.
|
||||
*/
|
||||
void EndOfFile(const string& file_id);
|
||||
|
||||
/**
|
||||
* Signal a gap in the file data stream.
|
||||
* @param offset number of bytes in to file at which missing chunk starts.
|
||||
* @param len length in bytes of the missing chunk of file data.
|
||||
* @param tag network protocol over which the file data is transferred.
|
||||
* @param conn network connection over which the file data is transferred.
|
||||
* @param is_orig true if the file is being sent from connection originator
|
||||
* or false if is being sent in the opposite direction.
|
||||
*/
|
||||
void Gap(uint64 offset, uint64 len, analyzer::Tag tag, Connection* conn,
|
||||
bool is_orig);
|
||||
void Gap(uint64 offset, uint64 len, const string& unique);
|
||||
void Gap(uint64 offset, uint64 len, File* file);
|
||||
|
||||
/**
|
||||
* Provide the expected number of bytes that comprise a file.
|
||||
* @param size the number of bytes in the full file.
|
||||
* @param tag network protocol over which the file data is transferred.
|
||||
* @param conn network connection over which the file data is transferred.
|
||||
* @param is_orig true if the file is being sent from connection originator
|
||||
* or false if is being sent in the opposite direction.
|
||||
*/
|
||||
void SetSize(uint64 size, analyzer::Tag tag, Connection* conn,
|
||||
bool is_orig);
|
||||
void SetSize(uint64 size, const string& unique);
|
||||
void SetSize(uint64 size, File* file);
|
||||
|
||||
/**
|
||||
* Starts ignoring a file, which will finally be removed from internal
|
||||
* mappings on EOF or TIMEOUT.
|
||||
* @param file_id the file identifier/hash.
|
||||
* @return false if file identifier did not map to anything, else true.
|
||||
*/
|
||||
bool IgnoreFile(const FileID& file_id);
|
||||
|
||||
/**
|
||||
* If called during a \c file_timeout event handler, requests deferral of
|
||||
* analysis timeout.
|
||||
*/
|
||||
bool PostponeTimeout(const FileID& file_id) const;
|
||||
bool IgnoreFile(const string& file_id);
|
||||
|
||||
/**
|
||||
* Sets an inactivity threshold for the file.
|
||||
* @param file_id the file identifier/hash.
|
||||
* @param interval the amount of time in which no activity is seen for
|
||||
* the file identified by \a file_id that will cause the file
|
||||
* to be considered stale, timed out, and then resource reclaimed.
|
||||
* @return false if file identifier did not map to anything, else true.
|
||||
*/
|
||||
bool SetTimeoutInterval(const FileID& file_id, double interval) const;
|
||||
bool SetTimeoutInterval(const string& file_id, double interval) const;
|
||||
|
||||
/**
|
||||
* Queue attachment of an analyzer to the file identifier. Multiple
|
||||
* analyzers of a given type can be attached per file identifier at a time
|
||||
* as long as the arguments differ.
|
||||
* @param file_id the file identifier/hash.
|
||||
* @param args a \c AnalyzerArgs value which describes a file analyzer.
|
||||
* @return false if the analyzer failed to be instantiated, else true.
|
||||
*/
|
||||
bool AddAnalyzer(const FileID& file_id, RecordVal* args) const;
|
||||
bool AddAnalyzer(const string& file_id, RecordVal* args) const;
|
||||
|
||||
/**
|
||||
* Queue removal of an analyzer for a given file identifier.
|
||||
* @param file_id the file identifier/hash.
|
||||
* @param args a \c AnalyzerArgs value which describes a file analyzer.
|
||||
* @return true if the analyzer is active at the time of call, else false.
|
||||
*/
|
||||
bool RemoveAnalyzer(const FileID& file_id, const RecordVal* args) const;
|
||||
bool RemoveAnalyzer(const string& file_id, const RecordVal* args) const;
|
||||
|
||||
/**
|
||||
* @return whether the file mapped to \a unique is being ignored.
|
||||
* Tells whether analysis for a file is active or ignored.
|
||||
* @param file_id the file identifier/hash.
|
||||
* @return whether the file mapped to \a file_id is being ignored.
|
||||
*/
|
||||
bool IsIgnored(const string& unique);
|
||||
bool IsIgnored(const string& file_id);
|
||||
|
||||
/**
|
||||
* Instantiates a new file analyzer instance for the file.
|
||||
* @param tag The file analyzer's tag.
|
||||
* @param args The file analyzer argument/option values.
* @param f The file the analyzer is to be associated with.
|
||||
* @return The new analyzer instance or null if tag is invalid.
|
||||
*/
|
||||
Analyzer* InstantiateAnalyzer(int tag, RecordVal* args, File* f) const;
|
||||
|
||||
/**
|
||||
* Translates a script-level file analyzer tag in to corresponding file
|
||||
* analyzer name.
|
||||
* @param tag The enum val of a file analyzer.
|
||||
* @return The human-readable name of the file analyzer.
|
||||
*/
|
||||
const char* GetAnalyzerName(int tag) const;
|
||||
|
||||
protected:
|
||||
friend class FileTimer;
|
||||
|
||||
typedef map<string, File*> StrMap;
|
||||
typedef set<string> StrSet;
|
||||
typedef map<FileID, File*> IDMap;
|
||||
typedef set<string> IDSet;
|
||||
typedef map<string, File*> IDMap;
|
||||
|
||||
/**
|
||||
* @return the File object mapped to \a unique or a null pointer if analysis
|
||||
* is being ignored for the associated file. An File object may be
|
||||
* created if a mapping doesn't exist, and if it did exist, the
|
||||
* activity time is refreshed along with any connection-related
|
||||
* fields.
|
||||
* Create a new file to be analyzed or retrieve an existing one.
|
||||
* @param file_id the file identifier/hash.
|
||||
* @param conn network connection, if any, over which the file is
|
||||
* transferred.
|
||||
* @param tag network protocol, if any, over which the file is transferred.
|
||||
* @param is_orig true if the file is being sent from connection originator
|
||||
* or false if is being sent in the opposite direction (or if it
|
||||
* this file isn't related to a connection).
|
||||
* @param update_conn whether we need to update connection-related field
|
||||
* in the \c fa_file record value associated with the file.
|
||||
* @return the File object mapped to \a file_id or a null pointer if
|
||||
* analysis is being ignored for the associated file. A File
|
||||
* object may be created if a mapping doesn't exist, and if it did
|
||||
* exist, the activity time is refreshed along with any
|
||||
* connection-related fields.
|
||||
*/
|
||||
File* GetFile(const string& unique, Connection* conn = 0,
|
||||
File* GetFile(const string& file_id, Connection* conn = 0,
|
||||
analyzer::Tag tag = analyzer::Tag::Error,
|
||||
bool is_orig = false, bool update_conn = true);
|
||||
|
||||
/**
|
||||
* Try to retrieve a file that's being analyzed, using its identifier/hash.
|
||||
* @param file_id the file identifier/hash.
|
||||
* @return the File object mapped to \a file_id, or a null pointer if no
|
||||
* mapping exists.
|
||||
*/
|
||||
File* Lookup(const FileID& file_id) const;
|
||||
File* Lookup(const string& file_id) const;
|
||||
|
||||
/**
|
||||
* Evaluate timeout policy for a file and remove the File object mapped to
|
||||
* \a file_id if needed.
|
||||
* @param file_id the file identifier/hash.
|
||||
* @param is_termination whether the Manager (and probably Bro) is in a
|
||||
* terminating state. If true, then the timeout cannot be postponed.
|
||||
*/
|
||||
void Timeout(const FileID& file_id, bool is_terminating = ::terminating);
|
||||
void Timeout(const string& file_id, bool is_terminating = ::terminating);
|
||||
|
||||
/**
|
||||
* Immediately remove file_analysis::File object associated with \a unique.
|
||||
* @return false if file string did not map to anything, else true.
|
||||
* Immediately remove file_analysis::File object associated with \a file_id.
|
||||
* @param file_id the file identifier/hash.
|
||||
* @return false if file id string did not map to anything, else true.
|
||||
*/
|
||||
bool RemoveFile(const string& unique);
|
||||
bool RemoveFile(const string& file_id);
|
||||
|
||||
/**
|
||||
* Sets #current_handle to a unique file handle string based on what the
|
||||
* \c get_file_handle event derives from the connection params. The
|
||||
* event queue is flushed so that we can get the handle value immediately.
|
||||
* Sets #current_file_id to a hash of a unique file handle string based on
|
||||
* what the \c get_file_handle event derives from the connection params.
|
||||
* Event queue is flushed so that we can get the handle value immediately.
|
||||
* @param tag network protocol over which the file is transferred.
|
||||
* @param conn network connection over which the file is transferred.
|
||||
* @param is_orig true if the file is being sent from connection originator
|
||||
* or false if is being sent in the opposite direction.
|
||||
*/
|
||||
void GetFileHandle(analyzer::Tag tag, Connection* c, bool is_orig);
|
||||
|
||||
/**
|
||||
* @return whether file analysis is disabled for the given analyzer.
|
||||
* Check if analysis is available for files transferred over a given
|
||||
* network protocol.
|
||||
* @param tag the network protocol over which files can be transferred and
|
||||
* analyzed by the file analysis framework.
|
||||
* @return whether file analysis is disabled for the analyzer given by
|
||||
* \a tag.
|
||||
*/
|
||||
static bool IsDisabled(analyzer::Tag tag);
|
||||
|
||||
private:
|
||||
StrMap str_map; /**< Map unique string to file_analysis::File. */
|
||||
typedef map<string, Component*> analyzer_map_by_name;
|
||||
typedef map<analyzer::Tag, Component*> analyzer_map_by_tag;
|
||||
typedef map<int, Component*> analyzer_map_by_val;
|
||||
|
||||
void RegisterAnalyzerComponent(Component* component);
|
||||
|
||||
IDMap id_map; /**< Map file ID to file_analysis::File records. */
|
||||
StrSet ignored; /**< Ignored files. Will be finally removed on EOF. */
|
||||
string current_handle; /**< Last file handle set by get_file_handle event.*/
|
||||
IDSet ignored; /**< Ignored files. Will be finally removed on EOF. */
|
||||
string current_file_id; /**< Hash of what get_file_handle event sets. */
|
||||
EnumType* tag_enum_type; /**< File analyzer tag type. */
|
||||
|
||||
analyzer_map_by_name analyzers_by_name;
|
||||
analyzer_map_by_tag analyzers_by_tag;
|
||||
analyzer_map_by_val analyzers_by_val;
|
||||
|
||||
	static TableVal* disabled; /**< Table of disabled analyzers. */
	static string salt; /**< A salt added to file handles before hashing. */
};

} // namespace file_analysis
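Taken together, the methods above are the entire surface a protocol analyzer needs to touch. A condensed sketch of the typical call sequence for one transferred file follows; the variable names are illustrative, and every call shown is declared in this header.

// Typical per-file flow from a protocol analyzer's point of view (sketch only).
file_mgr->SetSize(content_len, tag, conn, is_orig);       // expected total size
file_mgr->DataIn(chunk, chunk_len, tag, conn, is_orig);   // sequential delivery
file_mgr->Gap(gap_off, gap_len, tag, conn, is_orig);      // bytes that never arrived
file_mgr->EndOfFile(tag, conn, is_orig);                  // done with this direction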
3
src/file_analysis/analyzer/CMakeLists.txt
Normal file

@@ -0,0 +1,3 @@
|
|||
add_subdirectory(data_event)
|
||||
add_subdirectory(extract)
|
||||
add_subdirectory(hash)
|
8
src/file_analysis/analyzer/data_event/CMakeLists.txt
Normal file

@@ -0,0 +1,8 @@
|
|||
include(BroPlugin)
|
||||
|
||||
include_directories(BEFORE ${CMAKE_CURRENT_SOURCE_DIR}
|
||||
${CMAKE_CURRENT_BINARY_DIR})
|
||||
|
||||
bro_plugin_begin(Bro FileDataEvent)
|
||||
bro_plugin_cc(DataEvent.cc Plugin.cc)
|
||||
bro_plugin_end()
|
69
src/file_analysis/analyzer/data_event/DataEvent.h
Normal file

@@ -0,0 +1,69 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#ifndef FILE_ANALYSIS_DATAEVENT_H
|
||||
#define FILE_ANALYSIS_DATAEVENT_H
|
||||
|
||||
#include <string>
|
||||
|
||||
#include "Val.h"
|
||||
#include "File.h"
|
||||
#include "Analyzer.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
/**
|
||||
* An analyzer to send file data to script-layer via events.
|
||||
*/
|
||||
class DataEvent : public file_analysis::Analyzer {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Generates the event, if any, specified by the "chunk_event" field of this
|
||||
* analyzer's \c AnalyzerArgs. This is for non-sequential file data input.
|
||||
* @param data pointer to start of file data chunk.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @param offset number of bytes from start of file at which chunk occurs.
|
||||
* @return always true
|
||||
*/
|
||||
virtual bool DeliverChunk(const u_char* data, uint64 len, uint64 offset);
|
||||
|
||||
/**
|
||||
* Generates the event, if any, specified by the "stream_event" field of
|
||||
* this analyzer's \c AnalyzerArgs. This is for sequential file data input.
|
||||
* @param data pointer to start of file data chunk.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @return always true
|
||||
*/
|
||||
virtual bool DeliverStream(const u_char* data, uint64 len);
|
||||
|
||||
/**
|
||||
* Create a new instance of a DataEvent analyzer.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
* @return the new DataEvent analyzer instance or a null pointer if
|
||||
* no "chunk_event" or "stream_event" field was specfied in \a args.
|
||||
*/
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file);
|
||||
|
||||
protected:
|
||||
|
||||
/**
|
||||
* Constructor.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
* @param ce pointer to event handler which will be called to receive
|
||||
* non-sequential file data.
|
||||
* @param se pointer to event handler which will be called to receive
|
||||
* sequential file data.
|
||||
*/
|
||||
DataEvent(RecordVal* args, File* file,
|
||||
EventHandlerPtr ce, EventHandlerPtr se);
|
||||
|
||||
private:
|
||||
EventHandlerPtr chunk_event;
|
||||
EventHandlerPtr stream_event;
|
||||
};
|
||||
|
||||
} // namespace file_analysis

#endif
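For orientation, this is roughly how the analyzer is driven once attached. Instantiate and the Deliver methods are the ones declared above; calling them directly, outside the manager's AnalyzerSet, is only an illustration.

// Sketch only: manual instantiation and delivery, bypassing the AnalyzerSet.
file_analysis::Analyzer* a =
	file_analysis::DataEvent::Instantiate(args, file);   // null if neither event is set

if ( a )
	{
	a->DeliverStream(data, len);          // raises the configured "stream_event"
	a->DeliverChunk(data, len, offset);   // raises the configured "chunk_event"
	}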
26
src/file_analysis/analyzer/data_event/Plugin.cc
Normal file

@@ -0,0 +1,26 @@
|
|||
#include "plugin/Plugin.h"
|
||||
#include "file_analysis/Component.h"
|
||||
|
||||
#include "DataEvent.h"
|
||||
|
||||
namespace plugin { namespace Bro_FileDataEvent {
|
||||
|
||||
class Plugin : public plugin::Plugin {
|
||||
protected:
|
||||
void InitPreScript()
|
||||
{
|
||||
SetName("Bro::FileDataEvent");
|
||||
SetVersion(-1);
|
||||
SetAPIVersion(BRO_PLUGIN_API_VERSION);
|
||||
SetDynamicPlugin(false);
|
||||
|
||||
SetDescription("Delivers file content via events");
|
||||
|
||||
AddComponent(new ::file_analysis::Component("DATA_EVENT",
|
||||
::file_analysis::DataEvent::Instantiate));
|
||||
}
|
||||
};
|
||||
|
||||
Plugin __plugin;
|
||||
|
||||
} }
|
8
src/file_analysis/analyzer/extract/CMakeLists.txt
Normal file

@@ -0,0 +1,8 @@
|
|||
include(BroPlugin)
|
||||
|
||||
include_directories(BEFORE ${CMAKE_CURRENT_SOURCE_DIR}
|
||||
${CMAKE_CURRENT_BINARY_DIR})
|
||||
|
||||
bro_plugin_begin(Bro FileExtract)
|
||||
bro_plugin_cc(Extract.cc Plugin.cc)
|
||||
bro_plugin_end()
|
62
src/file_analysis/analyzer/extract/Extract.h
Normal file

@@ -0,0 +1,62 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#ifndef FILE_ANALYSIS_EXTRACT_H
|
||||
#define FILE_ANALYSIS_EXTRACT_H
|
||||
|
||||
#include <string>
|
||||
|
||||
#include "Val.h"
|
||||
#include "File.h"
|
||||
#include "Analyzer.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
/**
|
||||
* An analyzer to extract content of files to local disk.
|
||||
*/
|
||||
class Extract : public file_analysis::Analyzer {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Destructor. Will close the file that was used for data extraction.
|
||||
*/
|
||||
virtual ~Extract();
|
||||
|
||||
/**
|
||||
* Write a chunk of file data to the local extraction file.
|
||||
* @param data pointer to a chunk of file data.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @param offset number of bytes from start of file at which chunk starts.
|
||||
* @return false if there was no extraction file open and the data couldn't
|
||||
* be written, else true.
|
||||
*/
|
||||
virtual bool DeliverChunk(const u_char* data, uint64 len, uint64 offset);
|
||||
|
||||
/**
|
||||
* Create a new instance of an Extract analyzer.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
* @return the new Extract analyzer instance or a null pointer if
* the "extraction_file" field of \a args wasn't set.
|
||||
*/
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file);
|
||||
|
||||
protected:
|
||||
|
||||
/**
|
||||
* Constructor.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
* @param arg_filename a file system path which specifies the local file
|
||||
* to which the contents of the file will be extracted/written.
|
||||
*/
|
||||
Extract(RecordVal* args, File* file, const string& arg_filename);
|
||||
|
||||
private:
|
||||
string filename;
|
||||
int fd;
|
||||
};
|
||||
|
||||
} // namespace file_analysis
|
||||
|
||||
#endif
|
26
src/file_analysis/analyzer/extract/Plugin.cc
Normal file

@@ -0,0 +1,26 @@
#include "plugin/Plugin.h"
#include "file_analysis/Component.h"

#include "Extract.h"

namespace plugin { namespace Bro_FileExtract {

class Plugin : public plugin::Plugin {
protected:
	void InitPreScript()
		{
		SetName("Bro::FileExtract");
		SetVersion(-1);
		SetAPIVersion(BRO_PLUGIN_API_VERSION);
		SetDynamicPlugin(false);

		SetDescription("Extract file content to local file system");

		AddComponent(new ::file_analysis::Component("EXTRACT",
		             ::file_analysis::Extract::Instantiate));
		}
};

Plugin __plugin;

} }
|
9
src/file_analysis/analyzer/hash/CMakeLists.txt
Normal file
|
@ -0,0 +1,9 @@
include(BroPlugin)

include_directories(BEFORE ${CMAKE_CURRENT_SOURCE_DIR}
                    ${CMAKE_CURRENT_BINARY_DIR})

bro_plugin_begin(Bro FileHash)
bro_plugin_cc(Hash.cc Plugin.cc)
bro_plugin_bif(events.bif)
bro_plugin_end()
|
160
src/file_analysis/analyzer/hash/Hash.h
Normal file
|
@ -0,0 +1,160 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#ifndef FILE_ANALYSIS_HASH_H
|
||||
#define FILE_ANALYSIS_HASH_H
|
||||
|
||||
#include <string>
|
||||
|
||||
#include "Val.h"
|
||||
#include "OpaqueVal.h"
|
||||
#include "File.h"
|
||||
#include "Analyzer.h"
|
||||
|
||||
#include "events.bif.h"
|
||||
|
||||
namespace file_analysis {
|
||||
|
||||
/**
|
||||
* An analyzer to produce a hash of file contents.
|
||||
*/
|
||||
class Hash : public file_analysis::Analyzer {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Destructor.
|
||||
*/
|
||||
virtual ~Hash();
|
||||
|
||||
/**
|
||||
* Incrementally hash next chunk of file contents.
|
||||
* @param data pointer to start of a chunk of file data.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* @return false if the digest is in an invalid state, else true.
|
||||
*/
|
||||
virtual bool DeliverStream(const u_char* data, uint64 len);
|
||||
|
||||
/**
|
||||
* Finalizes the hash and raises a "file_hash" event.
|
||||
* @return always false so the analyzer will be detached from the file.
|
||||
*/
|
||||
virtual bool EndOfFile();
|
||||
|
||||
/**
|
||||
* Missing data can't be handled, so just indicate that this analyzer should
|
||||
* be removed from receiving further data. The hash will not be finalized.
|
||||
* @param offset byte offset in file at which missing chunk starts.
|
||||
* @param len number of missing bytes.
|
||||
* @return always false so analyzer will detach from file.
|
||||
*/
|
||||
virtual bool Undelivered(uint64 offset, uint64 len);
|
||||
|
||||
protected:
|
||||
|
||||
/**
|
||||
* Constructor.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
* @param hv specific hash calculator object.
|
||||
* @param kind human readable name of the hash algorithm to use.
|
||||
*/
|
||||
Hash(RecordVal* args, File* file, HashVal* hv, const char* kind);
|
||||
|
||||
/**
|
||||
* If some file contents have been seen, finalizes the hash of them and
|
||||
* raises the "file_hash" event with the results.
|
||||
*/
|
||||
void Finalize();
|
||||
|
||||
private:
|
||||
HashVal* hash;
|
||||
bool fed;
|
||||
const char* kind;
|
||||
};
|
||||
|
||||
/**
|
||||
* An analyzer to produce an MD5 hash of file contents.
|
||||
*/
|
||||
class MD5 : public Hash {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Create a new instance of the MD5 hashing file analyzer.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
* @return the new MD5 analyzer instance or a null pointer if there's no
|
||||
* handler for the "file_hash" event.
|
||||
*/
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file)
|
||||
{ return file_hash ? new MD5(args, file) : 0; }
|
||||
|
||||
protected:
|
||||
|
||||
/**
|
||||
* Constructor.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
*/
|
||||
MD5(RecordVal* args, File* file)
|
||||
: Hash(args, file, new MD5Val(), "md5")
|
||||
{}
|
||||
};
|
||||
|
||||
/**
|
||||
* An analyzer to produce a SHA1 hash of file contents.
|
||||
*/
|
||||
class SHA1 : public Hash {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Create a new instance of the SHA1 hashing file analyzer.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
* @return the new SHA1 analyzer instance or a null pointer if there's no
|
||||
* handler for the "file_hash" event.
|
||||
*/
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file)
|
||||
{ return file_hash ? new SHA1(args, file) : 0; }
|
||||
|
||||
protected:
|
||||
|
||||
/**
|
||||
* Constructor.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
*/
|
||||
SHA1(RecordVal* args, File* file)
|
||||
: Hash(args, file, new SHA1Val(), "sha1")
|
||||
{}
|
||||
};
|
||||
|
||||
/**
|
||||
* An analyzer to produce a SHA256 hash of file contents.
|
||||
*/
|
||||
class SHA256 : public Hash {
|
||||
public:
|
||||
|
||||
/**
|
||||
* Create a new instance of the SHA256 hashing file analyzer.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
* @return the new SHA256 analyzer instance or a null pointer if there's no
|
||||
* handler for the "file_hash" event.
|
||||
*/
|
||||
static file_analysis::Analyzer* Instantiate(RecordVal* args, File* file)
|
||||
{ return file_hash ? new SHA256(args, file) : 0; }
|
||||
|
||||
protected:
|
||||
|
||||
/**
|
||||
* Constructor.
|
||||
* @param args the \c AnalyzerArgs value which represents the analyzer.
|
||||
* @param file the file to which the analyzer will be attached.
|
||||
*/
|
||||
SHA256(RecordVal* args, File* file)
|
||||
: Hash(args, file, new SHA256Val(), "sha256")
|
||||
{}
|
||||
};
|
||||
|
||||
} // namespace file_analysis
|
||||
|
||||
#endif
|
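For orientation (not part of this commit), a script-layer sketch of requesting
a digest for a file: FileAnalysis::add_analyzer and FileAnalysis::ANALYZER_MD5
are named in the events.bif documentation below, while the exact AnalyzerArgs
record shape ($tag) is an assumption.

# Hypothetical sketch; the $tag field name is assumed.
event file_new(f: fa_file)
	{
	FileAnalysis::add_analyzer(f, [$tag=FileAnalysis::ANALYZER_MD5]);
	}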
33
src/file_analysis/analyzer/hash/Plugin.cc
Normal file
|
@ -0,0 +1,33 @@
#include "plugin/Plugin.h"
#include "file_analysis/Component.h"

#include "Hash.h"

namespace plugin { namespace Bro_FileHash {

class Plugin : public plugin::Plugin {
protected:
	void InitPreScript()
		{
		SetName("Bro::FileHash");
		SetVersion(-1);
		SetAPIVersion(BRO_PLUGIN_API_VERSION);
		SetDynamicPlugin(false);

		SetDescription("Hash file content");

		AddComponent(new ::file_analysis::Component("MD5",
		             ::file_analysis::MD5::Instantiate));
		AddComponent(new ::file_analysis::Component("SHA1",
		             ::file_analysis::SHA1::Instantiate));
		AddComponent(new ::file_analysis::Component("SHA256",
		             ::file_analysis::SHA256::Instantiate));

		extern std::list<std::pair<const char*, int> > __bif_events_init();
		AddBifInitFunction(&__bif_events_init);
		}
};

Plugin __plugin;

} }
|
12
src/file_analysis/analyzer/hash/events.bif
Normal file
|
@ -0,0 +1,12 @@
## This event is generated each time file analysis generates a digest of the
## file contents.
##
## f: The file.
##
## kind: The type of digest algorithm.
##
## hash: The result of the hashing.
##
## .. bro:see:: FileAnalysis::add_analyzer FileAnalysis::ANALYZER_MD5
##              FileAnalysis::ANALYZER_SHA1 FileAnalysis::ANALYZER_SHA256
event file_hash%(f: fa_file, kind: string, hash: string%);
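A minimal handler for the event declared above might look like this (purely
illustrative; it uses only the event parameters as declared):

event file_hash(f: fa_file, kind: string, hash: string)
	{
	# e.g. prints "md5 <hex digest>" for each hashed file.
	print fmt("%s %s", kind, hash);
	}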
61
src/file_analysis/file_analysis.bif
Normal file
|
@ -0,0 +1,61 @@
##! Internal functions and types used by the file analysis framework.

module FileAnalysis;

%%{
#include "file_analysis/Manager.h"
%%}

type AnalyzerArgs: record;

## :bro:see:`FileAnalysis::set_timeout_interval`.
function FileAnalysis::__set_timeout_interval%(file_id: string, t: interval%): bool
	%{
	bool result = file_mgr->SetTimeoutInterval(file_id->CheckString(), t);
	return new Val(result, TYPE_BOOL);
	%}

## :bro:see:`FileAnalysis::add_analyzer`.
function FileAnalysis::__add_analyzer%(file_id: string, args: any%): bool
	%{
	using BifType::Record::FileAnalysis::AnalyzerArgs;
	RecordVal* rv = args->AsRecordVal()->CoerceTo(AnalyzerArgs);
	bool result = file_mgr->AddAnalyzer(file_id->CheckString(), rv);
	Unref(rv);
	return new Val(result, TYPE_BOOL);
	%}

## :bro:see:`FileAnalysis::remove_analyzer`.
function FileAnalysis::__remove_analyzer%(file_id: string, args: any%): bool
	%{
	using BifType::Record::FileAnalysis::AnalyzerArgs;
	RecordVal* rv = args->AsRecordVal()->CoerceTo(AnalyzerArgs);
	bool result = file_mgr->RemoveAnalyzer(file_id->CheckString(), rv);
	Unref(rv);
	return new Val(result, TYPE_BOOL);
	%}

## :bro:see:`FileAnalysis::stop`.
function FileAnalysis::__stop%(file_id: string%): bool
	%{
	bool result = file_mgr->IgnoreFile(file_id->CheckString());
	return new Val(result, TYPE_BOOL);
	%}

module GLOBAL;

## For use within a :bro:see:`get_file_handle` handler to set a unique
## identifier to associate with the current input to the file analysis
## framework. Using an empty string for the handle signifies that the
## input will be ignored/discarded.
##
## handle: A string that uniquely identifies a file.
##
## .. bro:see:: get_file_handle
function set_file_handle%(handle: string%): any
	%{
	file_mgr->SetHandle(handle->CheckString());
	return 0;
	%}

const FileAnalysis::salt: string;
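As a sketch of how set_file_handle is meant to be used from a get_file_handle
handler (the handler signature shown here is an assumption, not something
defined in this file):

# Hypothetical protocol-specific handler; the event signature is assumed.
event get_file_handle(tag: Analyzer::Tag, c: connection, is_orig: bool)
	{
	# Build a handle that is unique per analyzer, connection, and direction.
	set_file_handle(cat(tag, c$id$orig_h, c$id$resp_h, is_orig));
	}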
|
@ -9,6 +9,7 @@ module Input;
|
|||
|
||||
type TableDescription: record;
|
||||
type EventDescription: record;
|
||||
type AnalysisDescription: record;
|
||||
|
||||
function Input::__create_table_stream%(description: Input::TableDescription%) : bool
|
||||
%{
|
||||
|
@ -22,6 +23,12 @@ function Input::__create_event_stream%(description: Input::EventDescription%) :
|
|||
return new Val(res, TYPE_BOOL);
|
||||
%}
|
||||
|
||||
function Input::__create_analysis_stream%(description: Input::AnalysisDescription%) : bool
|
||||
%{
|
||||
bool res = input_mgr->CreateAnalysisStream(description->AsRecordVal());
|
||||
return new Val(res, TYPE_BOOL);
|
||||
%}
|
||||
|
||||
function Input::__remove_stream%(id: string%) : bool
|
||||
%{
|
||||
bool res = input_mgr->RemoveStream(id->AsString()->CheckString());
|
||||
|
|
|
@ -15,10 +15,9 @@
|
|||
#include "EventHandler.h"
|
||||
#include "NetVar.h"
|
||||
#include "Net.h"
|
||||
|
||||
|
||||
#include "CompHash.h"
|
||||
|
||||
#include "../file_analysis/Manager.h"
|
||||
#include "../threading/SerialTypes.h"
|
||||
|
||||
using namespace input;
|
||||
|
@ -148,6 +147,14 @@ public:
|
|||
~EventStream();
|
||||
};
|
||||
|
||||
class Manager::AnalysisStream: public Manager::Stream {
|
||||
public:
|
||||
string file_id;
|
||||
|
||||
AnalysisStream();
|
||||
~AnalysisStream();
|
||||
};
|
||||
|
||||
Manager::TableStream::TableStream() : Manager::Stream::Stream()
|
||||
{
|
||||
stream_type = TABLE_STREAM;
|
||||
|
@ -198,6 +205,15 @@ Manager::TableStream::~TableStream()
|
|||
}
|
||||
}
|
||||
|
||||
Manager::AnalysisStream::AnalysisStream() : Manager::Stream::Stream()
|
||||
{
|
||||
stream_type = ANALYSIS_STREAM;
|
||||
}
|
||||
|
||||
Manager::AnalysisStream::~AnalysisStream()
|
||||
{
|
||||
}
|
||||
|
||||
Manager::Manager()
|
||||
{
|
||||
end_of_data = internal_handler("Input::end_of_data");
|
||||
|
@ -274,7 +290,8 @@ bool Manager::CreateStream(Stream* info, RecordVal* description)
|
|||
|
||||
RecordType* rtype = description->Type()->AsRecordType();
|
||||
if ( ! ( same_type(rtype, BifType::Record::Input::TableDescription, 0)
|
||||
|| same_type(rtype, BifType::Record::Input::EventDescription, 0) ) )
|
||||
|| same_type(rtype, BifType::Record::Input::EventDescription, 0)
|
||||
|| same_type(rtype, BifType::Record::Input::AnalysisDescription, 0) ) )
|
||||
{
|
||||
reporter->Error("Streamdescription argument not of right type for new input stream");
|
||||
return false;
|
||||
|
@ -680,6 +697,40 @@ bool Manager::CreateTableStream(RecordVal* fval)
|
|||
return true;
|
||||
}
|
||||
|
||||
bool Manager::CreateAnalysisStream(RecordVal* fval)
|
||||
{
|
||||
RecordType* rtype = fval->Type()->AsRecordType();
|
||||
|
||||
if ( ! same_type(rtype, BifType::Record::Input::AnalysisDescription, 0) )
|
||||
{
|
||||
reporter->Error("AnalysisDescription argument not of right type");
|
||||
return false;
|
||||
}
|
||||
|
||||
AnalysisStream* stream = new AnalysisStream();
|
||||
|
||||
if ( ! CreateStream(stream, fval) )
|
||||
{
|
||||
delete stream;
|
||||
return false;
|
||||
}
|
||||
|
||||
stream->file_id = file_mgr->HashHandle(stream->name);
|
||||
|
||||
assert(stream->reader);
|
||||
|
||||
// reader takes in a byte stream as the only field
|
||||
Field** fields = new Field*[1];
|
||||
fields[0] = new Field("bytestream", 0, TYPE_STRING, TYPE_VOID, false);
|
||||
stream->reader->Init(1, fields);
|
||||
|
||||
readers[stream->reader] = stream;
|
||||
|
||||
DBG_LOG(DBG_INPUT, "Successfully created analysis stream %s",
|
||||
stream->name.c_str());
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
bool Manager::IsCompatibleType(BroType* t, bool atomic_only)
|
||||
{
|
||||
|
@ -966,6 +1017,15 @@ void Manager::SendEntry(ReaderFrontend* reader, Value* *vals)
|
|||
readFields = SendEventStreamEvent(i, type, vals);
|
||||
}
|
||||
|
||||
else if ( i->stream_type == ANALYSIS_STREAM )
|
||||
{
|
||||
readFields = 1;
|
||||
assert(vals[0]->type == TYPE_STRING);
|
||||
file_mgr->DataIn(reinterpret_cast<u_char*>(vals[0]->val.string_val.data),
|
||||
vals[0]->val.string_val.length,
|
||||
static_cast<AnalysisStream*>(i)->file_id, i->name);
|
||||
}
|
||||
|
||||
else
|
||||
assert(false);
|
||||
|
||||
|
@ -1179,7 +1239,7 @@ void Manager::EndCurrentSend(ReaderFrontend* reader)
|
|||
DBG_LOG(DBG_INPUT, "Got EndCurrentSend stream %s", i->name.c_str());
|
||||
#endif
|
||||
|
||||
if ( i->stream_type == EVENT_STREAM )
|
||||
if ( i->stream_type != TABLE_STREAM )
|
||||
{
|
||||
// just signal the end of the data source
|
||||
SendEndOfData(i);
|
||||
|
@ -1288,6 +1348,9 @@ void Manager::SendEndOfData(ReaderFrontend* reader)
|
|||
void Manager::SendEndOfData(const Stream *i)
|
||||
{
|
||||
SendEvent(end_of_data, 2, new StringVal(i->name.c_str()), new StringVal(i->info->source));
|
||||
|
||||
if ( i->stream_type == ANALYSIS_STREAM )
|
||||
file_mgr->EndOfFile(static_cast<const AnalysisStream*>(i)->file_id);
|
||||
}
|
||||
|
||||
void Manager::Put(ReaderFrontend* reader, Value* *vals)
|
||||
|
@ -1310,6 +1373,15 @@ void Manager::Put(ReaderFrontend* reader, Value* *vals)
|
|||
readFields = SendEventStreamEvent(i, type, vals);
|
||||
}
|
||||
|
||||
else if ( i->stream_type == ANALYSIS_STREAM )
|
||||
{
|
||||
readFields = 1;
|
||||
assert(vals[0]->type == TYPE_STRING);
|
||||
file_mgr->DataIn(reinterpret_cast<u_char*>(vals[0]->val.string_val.data),
|
||||
vals[0]->val.string_val.length,
|
||||
static_cast<AnalysisStream*>(i)->file_id, i->name);
|
||||
}
|
||||
|
||||
else
|
||||
assert(false);
|
||||
|
||||
|
@ -1577,6 +1649,12 @@ bool Manager::Delete(ReaderFrontend* reader, Value* *vals)
|
|||
success = true;
|
||||
}
|
||||
|
||||
else if ( i->stream_type == ANALYSIS_STREAM )
|
||||
{
|
||||
// can't do anything
|
||||
success = true;
|
||||
}
|
||||
|
||||
else
|
||||
{
|
||||
assert(false);
|
||||
|
|
|
@ -55,6 +55,18 @@ public:
|
|||
*/
|
||||
bool CreateEventStream(RecordVal* description);
|
||||
|
||||
/**
|
||||
* Creates a new input stream which will forward the data from the data
|
||||
* source on to the file analysis framework. The internal BiF defined
|
||||
* in input.bif just forwards here. For an input reader to be compatible
|
||||
* with this method, it must be able to accept a filter of a single string
|
||||
* type (i.e., it reads a byte stream).
|
||||
*
|
||||
* @param description A record of the script type \c
|
||||
* Input::AnalysisDescription
|
||||
*/
|
||||
bool CreateAnalysisStream(RecordVal* description);
|
||||
|
||||
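At the script layer this path is reached through the input framework; a hedged
sketch follows. The Input::add_analysis wrapper name and the
AnalysisDescription field names used here ($source, $name, $reader) are
assumptions for illustration, not taken from this diff.

# Hypothetical sketch; field names and the reader default are assumed.
event bro_init()
	{
	Input::add_analysis([$source="/tmp/sample.bin",
	                     $name="sample.bin",
	                     $reader=Input::READER_BINARY]);
	}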
/**
|
||||
* Force update on an input stream. Forces a re-read of the whole
|
||||
* input source. Usually used when an input stream is opened in
|
||||
|
@ -138,6 +150,7 @@ private:
|
|||
class Stream;
|
||||
class TableStream;
|
||||
class EventStream;
|
||||
class AnalysisStream;
|
||||
|
||||
// Actual RemoveStream implementation -- the function's public and
|
||||
// protected definitions are wrappers around this function.
|
||||
|
@ -202,7 +215,7 @@ private:
|
|||
Stream* FindStream(const string &name);
|
||||
Stream* FindStream(ReaderFrontend* reader);
|
||||
|
||||
enum StreamType { TABLE_STREAM, EVENT_STREAM };
|
||||
enum StreamType { TABLE_STREAM, EVENT_STREAM, ANALYSIS_STREAM };
|
||||
|
||||
map<ReaderFrontend*, Stream*> readers;
|
||||
|
||||
|
|
|
@ -834,6 +834,7 @@ int main(int argc, char** argv)
|
|||
|
||||
plugin_mgr->InitPreScript();
|
||||
analyzer_mgr->InitPreScript();
|
||||
file_mgr->InitPreScript();
|
||||
|
||||
if ( events_file )
|
||||
event_player = new EventPlayer(events_file);
|
||||
|
@ -855,6 +856,7 @@ int main(int argc, char** argv)
|
|||
|
||||
plugin_mgr->InitPostScript();
|
||||
analyzer_mgr->InitPostScript();
|
||||
file_mgr->InitPostScript();
|
||||
|
||||
if ( print_plugins )
|
||||
{
|
||||
|
|
|
@ -39,6 +39,10 @@ void Component::Describe(ODesc* d)
|
|||
d->Add("Analyzer");
|
||||
break;
|
||||
|
||||
case component::FILE_ANALYZER:
|
||||
d->Add("File Analyzer");
|
||||
break;
|
||||
|
||||
default:
|
||||
reporter->InternalError("unknown component type in plugin::Component::Describe");
|
||||
}
|
||||
|
|
|
@ -15,16 +15,11 @@ namespace component {
|
|||
enum Type {
|
||||
READER, /// An input reader (not currently used).
|
||||
WRITER, /// A logging writer (not currently used).
|
||||
ANALYZER /// A protocol analyzer.
|
||||
ANALYZER, /// A protocol analyzer.
|
||||
FILE_ANALYZER /// A file analyzer.
|
||||
};
|
||||
}
|
||||
|
||||
#if 0
|
||||
namespace input { class PluginComponent; }
|
||||
namespace logging { class PluginComponent; }
|
||||
namespace analyzer { class PluginComponent; }
|
||||
#endif
|
||||
|
||||
/**
|
||||
* Base class for plugin components. A component is a specific piece of
|
||||
* functionality that a plugin provides, such as a protocol analyzer or a log
|
||||
|
|
15
src/util.cc
|
@ -1617,3 +1617,18 @@ const char* bro_magic_buffer(magic_t cookie, const void* buffer, size_t length)
|
|||
|
||||
return rval;
|
||||
}
|
||||
|
||||
const char* canonify_name(const char* name)
|
||||
{
|
||||
unsigned int len = strlen(name);
|
||||
char* nname = new char[len + 1];
|
||||
|
||||
for ( unsigned int i = 0; i < len; i++ )
|
||||
{
|
||||
char c = isalnum(name[i]) ? name[i] : '_';
|
||||
nname[i] = toupper(c);
|
||||
}
|
||||
|
||||
nname[len] = '\0';
|
||||
return nname;
|
||||
}
|
||||
|
|
|
@ -383,4 +383,12 @@ extern magic_t magic_mime_cookie;
|
|||
void bro_init_magic(magic_t* cookie_ptr, int flags);
|
||||
const char* bro_magic_buffer(magic_t cookie, const void* buffer, size_t length);
|
||||
|
||||
/**
|
||||
* Canonicalizes a name by converting it to uppercase letters and replacing
|
||||
* all non-alphanumeric characters with an underscore.
|
||||
* @param name The string to canonicalize.
|
||||
* @return The canonicalized version of \a name, which the caller may later delete[].
|
||||
*/
|
||||
const char* canonify_name(const char* name);
|
||||
|
||||
#endif
|
||||
|
|
|
@ -3,10 +3,10 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-03-22-14-38-11
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1257655301.652206 5OKnoww6xl4 2001:4978:f:4c::2 53382 2001:4860:b002::68 80 1 GET ipv6.google.com / - Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en; rv:1.9.0.15pre) Gecko/2009091516 Camino/2.0b4 (like Firefox/3.0.15pre) 0 10102 200 OK - - - (empty) - - - text/html - -
|
||||
1257655302.514424 5OKnoww6xl4 2001:4978:f:4c::2 53382 2001:4860:b002::68 80 2 GET ipv6.google.com /csi?v=3&s=webhp&action=&tran=undefined&e=17259,19771,21517,21766,21887,22212&ei=BUz2Su7PMJTglQfz3NzCAw&rt=prt.77,xjs.565,ol.645 http://ipv6.google.com/ Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en; rv:1.9.0.15pre) Gecko/2009091516 Camino/2.0b4 (like Firefox/3.0.15pre) 0 0 204 No Content - - - (empty) - - - - - -
|
||||
1257655303.603569 5OKnoww6xl4 2001:4978:f:4c::2 53382 2001:4860:b002::68 80 3 GET ipv6.google.com /gen_204?atyp=i&ct=fade&cad=1254&ei=BUz2Su7PMJTglQfz3NzCAw&zx=1257655303600 http://ipv6.google.com/ Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en; rv:1.9.0.15pre) Gecko/2009091516 Camino/2.0b4 (like Firefox/3.0.15pre) 0 0 204 No Content - - - (empty) - - - - - -
|
||||
#close 2013-03-22-14-38-11
|
||||
#open 2013-05-21-21-11-20
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1257655301.652206 5OKnoww6xl4 2001:4978:f:4c::2 53382 2001:4860:b002::68 80 1 GET ipv6.google.com / - Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en; rv:1.9.0.15pre) Gecko/2009091516 Camino/2.0b4 (like Firefox/3.0.15pre) 0 10102 200 OK - - - (empty) - - - text/html - - -
|
||||
1257655302.514424 5OKnoww6xl4 2001:4978:f:4c::2 53382 2001:4860:b002::68 80 2 GET ipv6.google.com /csi?v=3&s=webhp&action=&tran=undefined&e=17259,19771,21517,21766,21887,22212&ei=BUz2Su7PMJTglQfz3NzCAw&rt=prt.77,xjs.565,ol.645 http://ipv6.google.com/ Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en; rv:1.9.0.15pre) Gecko/2009091516 Camino/2.0b4 (like Firefox/3.0.15pre) 0 0 204 No Content - - - (empty) - - - - - - -
|
||||
1257655303.603569 5OKnoww6xl4 2001:4978:f:4c::2 53382 2001:4860:b002::68 80 3 GET ipv6.google.com /gen_204?atyp=i&ct=fade&cad=1254&ei=BUz2Su7PMJTglQfz3NzCAw&zx=1257655303600 http://ipv6.google.com/ Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en; rv:1.9.0.15pre) Gecko/2009091516 Camino/2.0b4 (like Firefox/3.0.15pre) 0 0 204 No Content - - - (empty) - - - - - - -
|
||||
#close 2013-05-21-21-11-20
|
||||
|
|
|
@ -3,9 +3,9 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-03-22-14-37-45
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1333458850.340368 arKYeMETxOg 10.131.17.170 51803 173.199.115.168 80 1 GET cdn.epicgameads.com /ads/flash/728x90_nx8com.swf?clickTAG=http://www.epicgameads.com/ads/bannerclickPage.php?id=e3ubwU6IF&pd=1&adid=0&icpc=1&axid=0&uctt=1&channel=4&cac=1&t=728x90&cb=1333458879 http://www.epicgameads.com/ads/banneriframe.php?id=e3ubwU6IF&t=728x90&channel=4&cb=1333458905296 Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0) 0 31461 200 OK - - - (empty) - - - application/x-shockwave-flash - -
|
||||
1333458850.399501 arKYeMETxOg 10.131.17.170 51803 173.199.115.168 80 2 GET cdn.epicgameads.com /ads/flash/728x90_nx8com.swf?clickTAG=http://www.epicgameads.com/ads/bannerclickPage.php?id=e3ubwU6IF&pd=1&adid=0&icpc=1&axid=0&uctt=1&channel=0&cac=1&t=728x90&cb=1333458881 http://www.epicgameads.com/ads/banneriframe.php?id=e3ubwU6IF&t=728x90&cb=1333458920207 Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0) 0 31461 200 OK - - - (empty) - - - application/x-shockwave-flash - -
|
||||
#close 2013-03-22-14-37-45
|
||||
#open 2013-05-21-21-11-21
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1333458850.340368 arKYeMETxOg 10.131.17.170 51803 173.199.115.168 80 1 GET cdn.epicgameads.com /ads/flash/728x90_nx8com.swf?clickTAG=http://www.epicgameads.com/ads/bannerclickPage.php?id=e3ubwU6IF&pd=1&adid=0&icpc=1&axid=0&uctt=1&channel=4&cac=1&t=728x90&cb=1333458879 http://www.epicgameads.com/ads/banneriframe.php?id=e3ubwU6IF&t=728x90&channel=4&cb=1333458905296 Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0) 0 31461 200 OK - - - (empty) - - - application/x-shockwave-flash - - -
|
||||
1333458850.399501 arKYeMETxOg 10.131.17.170 51803 173.199.115.168 80 2 GET cdn.epicgameads.com /ads/flash/728x90_nx8com.swf?clickTAG=http://www.epicgameads.com/ads/bannerclickPage.php?id=e3ubwU6IF&pd=1&adid=0&icpc=1&axid=0&uctt=1&channel=0&cac=1&t=728x90&cb=1333458881 http://www.epicgameads.com/ads/banneriframe.php?id=e3ubwU6IF&t=728x90&cb=1333458920207 Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0) 0 31461 200 OK - - - (empty) - - - application/x-shockwave-flash - - -
|
||||
#close 2013-05-21-21-11-21
|
||||
|
|
|
@ -3,8 +3,8 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-03-28-21-35-15
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1333458850.375568 arKYeMETxOg 10.131.47.185 1923 79.101.110.141 80 1 GET o-o.preferred.telekomrs-beg1.v2.lscache8.c.youtube.com /videoplayback?upn=MTU2MDY5NzQ5OTM0NTI3NDY4NDc&sparams=algorithm,burst,cp,factor,id,ip,ipbits,itag,source,upn,expire&fexp=912300,907210&algorithm=throttle-factor&itag=34&ip=212.0.0.0&burst=40&sver=3&signature=832FB1042E20780CFCA77A4DB5EA64AC593E8627.D1166C7E8365732E52DAFD68076DAE0146E0AE01&source=youtube&expire=1333484980&key=yt1&ipbits=8&factor=1.25&cp=U0hSSFRTUl9NSkNOMl9MTVZKOjh5eEN2SG8tZF84&id=ebf1e932d4bd1286&cm2=1 http://s.ytimg.com/yt/swfbin/watch_as3-vflqrJwOA.swf Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko; X-SBLSP) Chrome/17.0.963.83 Safari/535.11 0 56320 206 Partial Content - - - (empty) - - - application/octet-stream - -
|
||||
#close 2013-03-28-21-35-15
|
||||
#open 2013-05-21-21-11-22
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1333458850.375568 arKYeMETxOg 10.131.47.185 1923 79.101.110.141 80 1 GET o-o.preferred.telekomrs-beg1.v2.lscache8.c.youtube.com /videoplayback?upn=MTU2MDY5NzQ5OTM0NTI3NDY4NDc&sparams=algorithm,burst,cp,factor,id,ip,ipbits,itag,source,upn,expire&fexp=912300,907210&algorithm=throttle-factor&itag=34&ip=212.0.0.0&burst=40&sver=3&signature=832FB1042E20780CFCA77A4DB5EA64AC593E8627.D1166C7E8365732E52DAFD68076DAE0146E0AE01&source=youtube&expire=1333484980&key=yt1&ipbits=8&factor=1.25&cp=U0hSSFRTUl9NSkNOMl9MTVZKOjh5eEN2SG8tZF84&id=ebf1e932d4bd1286&cm2=1 http://s.ytimg.com/yt/swfbin/watch_as3-vflqrJwOA.swf Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko; X-SBLSP) Chrome/17.0.963.83 Safari/535.11 0 56320 206 Partial Content - - - (empty) - - - application/octet-stream - - -
|
||||
#close 2013-05-21-21-11-22
|
||||
|
|
|
@ -3,11 +3,11 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-03-22-14-37-44
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1210953057.917183 3PKsZ2Uye21 192.168.2.16 1578 75.126.203.78 80 1 POST download913.avast.com /cgi-bin/iavs4stats.cgi - Syncer/4.80 (av_pro-1169;f) 589 0 204 <empty> - - - (empty) - - - text/plain - -
|
||||
1210953061.585996 70MGiRM1Qf4 2001:0:4137:9e50:8000:f12a:b9c8:2815 1286 2001:4860:0:2001::68 80 1 GET ipv6.google.com / - Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 6640 200 OK - - - (empty) - - - text/html - -
|
||||
1210953073.381474 70MGiRM1Qf4 2001:0:4137:9e50:8000:f12a:b9c8:2815 1286 2001:4860:0:2001::68 80 2 GET ipv6.google.com /search?hl=en&q=Wireshark+!&btnG=Google+Search http://ipv6.google.com/ Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 25119 200 OK - - - (empty) - - - text/html - -
|
||||
1210953074.674817 c4Zw9TmAE05 192.168.2.16 1580 67.228.110.120 80 1 GET www.wireshark.org / http://ipv6.google.com/search?hl=en&q=Wireshark+%21&btnG=Google+Search Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 11845 200 OK - - - (empty) - - - application/xml - -
|
||||
#close 2013-03-22-14-37-44
|
||||
#open 2013-05-21-21-11-21
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1210953057.917183 3PKsZ2Uye21 192.168.2.16 1578 75.126.203.78 80 1 POST download913.avast.com /cgi-bin/iavs4stats.cgi - Syncer/4.80 (av_pro-1169;f) 589 0 204 <empty> - - - (empty) - - - text/plain - - -
|
||||
1210953061.585996 70MGiRM1Qf4 2001:0:4137:9e50:8000:f12a:b9c8:2815 1286 2001:4860:0:2001::68 80 1 GET ipv6.google.com / - Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 6640 200 OK - - - (empty) - - - text/html - - -
|
||||
1210953073.381474 70MGiRM1Qf4 2001:0:4137:9e50:8000:f12a:b9c8:2815 1286 2001:4860:0:2001::68 80 2 GET ipv6.google.com /search?hl=en&q=Wireshark+!&btnG=Google+Search http://ipv6.google.com/ Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 25119 200 OK - - - (empty) - - - text/html - - -
|
||||
1210953074.674817 c4Zw9TmAE05 192.168.2.16 1580 67.228.110.120 80 1 GET www.wireshark.org / http://ipv6.google.com/search?hl=en&q=Wireshark+%21&btnG=Google+Search Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 11845 200 OK - - - (empty) - - - application/xml - - -
|
||||
#close 2013-05-21-21-11-21
|
||||
|
|
|
@ -3,9 +3,9 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-03-22-14-37-44
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1340127577.361683 FrJExwHcSal 2001:0:4137:9e50:8000:f12a:b9c8:2815 1286 2001:4860:0:2001::68 80 1 GET ipv6.google.com / - Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 6640 200 OK - - - (empty) - - - text/html - -
|
||||
1340127577.379360 FrJExwHcSal 2001:0:4137:9e50:8000:f12a:b9c8:2815 1286 2001:4860:0:2001::68 80 2 GET ipv6.google.com /search?hl=en&q=Wireshark+!&btnG=Google+Search http://ipv6.google.com/ Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 25119 200 OK - - - (empty) - - - text/html - -
|
||||
#close 2013-03-22-14-37-44
|
||||
#open 2013-05-21-21-11-22
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1340127577.361683 FrJExwHcSal 2001:0:4137:9e50:8000:f12a:b9c8:2815 1286 2001:4860:0:2001::68 80 1 GET ipv6.google.com / - Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 6640 200 OK - - - (empty) - - - text/html - - -
|
||||
1340127577.379360 FrJExwHcSal 2001:0:4137:9e50:8000:f12a:b9c8:2815 1286 2001:4860:0:2001::68 80 2 GET ipv6.google.com /search?hl=en&q=Wireshark+!&btnG=Google+Search http://ipv6.google.com/ Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 0 25119 200 OK - - - (empty) - - - text/html - - -
|
||||
#close 2013-05-21-21-11-22
|
||||
|
|
|
@ -3,7 +3,7 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path loaded_scripts
|
||||
#open 2013-05-17-03-57-47
|
||||
#open 2013-06-10-19-50-56
|
||||
#fields name
|
||||
#types string
|
||||
scripts/base/init-bare.bro
|
||||
|
@ -13,31 +13,6 @@ scripts/base/init-bare.bro
|
|||
build/scripts/base/bif/bro.bif.bro
|
||||
build/scripts/base/bif/reporter.bif.bro
|
||||
build/scripts/base/bif/event.bif.bro
|
||||
scripts/base/frameworks/logging/__load__.bro
|
||||
scripts/base/frameworks/logging/main.bro
|
||||
build/scripts/base/bif/logging.bif.bro
|
||||
scripts/base/frameworks/logging/postprocessors/__load__.bro
|
||||
scripts/base/frameworks/logging/postprocessors/scp.bro
|
||||
scripts/base/frameworks/logging/postprocessors/sftp.bro
|
||||
scripts/base/frameworks/logging/writers/ascii.bro
|
||||
scripts/base/frameworks/logging/writers/dataseries.bro
|
||||
scripts/base/frameworks/logging/writers/sqlite.bro
|
||||
scripts/base/frameworks/logging/writers/elasticsearch.bro
|
||||
scripts/base/frameworks/logging/writers/none.bro
|
||||
scripts/base/frameworks/input/__load__.bro
|
||||
scripts/base/frameworks/input/main.bro
|
||||
build/scripts/base/bif/input.bif.bro
|
||||
scripts/base/frameworks/input/readers/ascii.bro
|
||||
scripts/base/frameworks/input/readers/raw.bro
|
||||
scripts/base/frameworks/input/readers/benchmark.bro
|
||||
scripts/base/frameworks/input/readers/binary.bro
|
||||
scripts/base/frameworks/input/readers/sqlite.bro
|
||||
scripts/base/frameworks/analyzer/__load__.bro
|
||||
scripts/base/frameworks/analyzer/main.bro
|
||||
build/scripts/base/bif/analyzer.bif.bro
|
||||
scripts/base/frameworks/file-analysis/__load__.bro
|
||||
scripts/base/frameworks/file-analysis/main.bro
|
||||
build/scripts/base/bif/file_analysis.bif.bro
|
||||
build/scripts/base/bif/plugins/__load__.bro
|
||||
build/scripts/base/bif/plugins/Bro_ARP.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_AYIYA.events.bif.bro
|
||||
|
@ -50,6 +25,7 @@ scripts/base/init-bare.bro
|
|||
build/scripts/base/bif/plugins/Bro_FTP.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_FTP.functions.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_File.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_FileHash.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_Finger.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_GTPv1.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_Gnutella.events.bif.bro
|
||||
|
@ -85,6 +61,31 @@ scripts/base/init-bare.bro
|
|||
build/scripts/base/bif/plugins/Bro_Teredo.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_UDP.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_ZIP.events.bif.bro
|
||||
scripts/base/frameworks/logging/__load__.bro
|
||||
scripts/base/frameworks/logging/main.bro
|
||||
build/scripts/base/bif/logging.bif.bro
|
||||
scripts/base/frameworks/logging/postprocessors/__load__.bro
|
||||
scripts/base/frameworks/logging/postprocessors/scp.bro
|
||||
scripts/base/frameworks/logging/postprocessors/sftp.bro
|
||||
scripts/base/frameworks/logging/writers/ascii.bro
|
||||
scripts/base/frameworks/logging/writers/dataseries.bro
|
||||
scripts/base/frameworks/logging/writers/sqlite.bro
|
||||
scripts/base/frameworks/logging/writers/elasticsearch.bro
|
||||
scripts/base/frameworks/logging/writers/none.bro
|
||||
scripts/base/frameworks/input/__load__.bro
|
||||
scripts/base/frameworks/input/main.bro
|
||||
build/scripts/base/bif/input.bif.bro
|
||||
scripts/base/frameworks/input/readers/ascii.bro
|
||||
scripts/base/frameworks/input/readers/raw.bro
|
||||
scripts/base/frameworks/input/readers/benchmark.bro
|
||||
scripts/base/frameworks/input/readers/binary.bro
|
||||
scripts/base/frameworks/input/readers/sqlite.bro
|
||||
scripts/base/frameworks/analyzer/__load__.bro
|
||||
scripts/base/frameworks/analyzer/main.bro
|
||||
build/scripts/base/bif/analyzer.bif.bro
|
||||
scripts/base/frameworks/file-analysis/__load__.bro
|
||||
scripts/base/frameworks/file-analysis/main.bro
|
||||
build/scripts/base/bif/file_analysis.bif.bro
|
||||
scripts/policy/misc/loaded-scripts.bro
|
||||
scripts/base/utils/paths.bro
|
||||
#close 2013-05-17-03-57-47
|
||||
#close 2013-06-10-19-50-56
|
||||
|
|
|
@ -3,7 +3,7 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path loaded_scripts
|
||||
#open 2013-05-17-03-58-48
|
||||
#open 2013-06-10-19-50-57
|
||||
#fields name
|
||||
#types string
|
||||
scripts/base/init-bare.bro
|
||||
|
@ -13,31 +13,6 @@ scripts/base/init-bare.bro
|
|||
build/scripts/base/bif/bro.bif.bro
|
||||
build/scripts/base/bif/reporter.bif.bro
|
||||
build/scripts/base/bif/event.bif.bro
|
||||
scripts/base/frameworks/logging/__load__.bro
|
||||
scripts/base/frameworks/logging/main.bro
|
||||
build/scripts/base/bif/logging.bif.bro
|
||||
scripts/base/frameworks/logging/postprocessors/__load__.bro
|
||||
scripts/base/frameworks/logging/postprocessors/scp.bro
|
||||
scripts/base/frameworks/logging/postprocessors/sftp.bro
|
||||
scripts/base/frameworks/logging/writers/ascii.bro
|
||||
scripts/base/frameworks/logging/writers/dataseries.bro
|
||||
scripts/base/frameworks/logging/writers/sqlite.bro
|
||||
scripts/base/frameworks/logging/writers/elasticsearch.bro
|
||||
scripts/base/frameworks/logging/writers/none.bro
|
||||
scripts/base/frameworks/input/__load__.bro
|
||||
scripts/base/frameworks/input/main.bro
|
||||
build/scripts/base/bif/input.bif.bro
|
||||
scripts/base/frameworks/input/readers/ascii.bro
|
||||
scripts/base/frameworks/input/readers/raw.bro
|
||||
scripts/base/frameworks/input/readers/benchmark.bro
|
||||
scripts/base/frameworks/input/readers/binary.bro
|
||||
scripts/base/frameworks/input/readers/sqlite.bro
|
||||
scripts/base/frameworks/analyzer/__load__.bro
|
||||
scripts/base/frameworks/analyzer/main.bro
|
||||
build/scripts/base/bif/analyzer.bif.bro
|
||||
scripts/base/frameworks/file-analysis/__load__.bro
|
||||
scripts/base/frameworks/file-analysis/main.bro
|
||||
build/scripts/base/bif/file_analysis.bif.bro
|
||||
build/scripts/base/bif/plugins/__load__.bro
|
||||
build/scripts/base/bif/plugins/Bro_ARP.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_AYIYA.events.bif.bro
|
||||
|
@ -50,6 +25,7 @@ scripts/base/init-bare.bro
|
|||
build/scripts/base/bif/plugins/Bro_FTP.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_FTP.functions.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_File.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_FileHash.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_Finger.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_GTPv1.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_Gnutella.events.bif.bro
|
||||
|
@ -85,6 +61,31 @@ scripts/base/init-bare.bro
|
|||
build/scripts/base/bif/plugins/Bro_Teredo.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_UDP.events.bif.bro
|
||||
build/scripts/base/bif/plugins/Bro_ZIP.events.bif.bro
|
||||
scripts/base/frameworks/logging/__load__.bro
|
||||
scripts/base/frameworks/logging/main.bro
|
||||
build/scripts/base/bif/logging.bif.bro
|
||||
scripts/base/frameworks/logging/postprocessors/__load__.bro
|
||||
scripts/base/frameworks/logging/postprocessors/scp.bro
|
||||
scripts/base/frameworks/logging/postprocessors/sftp.bro
|
||||
scripts/base/frameworks/logging/writers/ascii.bro
|
||||
scripts/base/frameworks/logging/writers/dataseries.bro
|
||||
scripts/base/frameworks/logging/writers/sqlite.bro
|
||||
scripts/base/frameworks/logging/writers/elasticsearch.bro
|
||||
scripts/base/frameworks/logging/writers/none.bro
|
||||
scripts/base/frameworks/input/__load__.bro
|
||||
scripts/base/frameworks/input/main.bro
|
||||
build/scripts/base/bif/input.bif.bro
|
||||
scripts/base/frameworks/input/readers/ascii.bro
|
||||
scripts/base/frameworks/input/readers/raw.bro
|
||||
scripts/base/frameworks/input/readers/benchmark.bro
|
||||
scripts/base/frameworks/input/readers/binary.bro
|
||||
scripts/base/frameworks/input/readers/sqlite.bro
|
||||
scripts/base/frameworks/analyzer/__load__.bro
|
||||
scripts/base/frameworks/analyzer/main.bro
|
||||
build/scripts/base/bif/analyzer.bif.bro
|
||||
scripts/base/frameworks/file-analysis/__load__.bro
|
||||
scripts/base/frameworks/file-analysis/main.bro
|
||||
build/scripts/base/bif/file_analysis.bif.bro
|
||||
scripts/base/init-default.bro
|
||||
scripts/base/utils/site.bro
|
||||
scripts/base/utils/patterns.bro
|
||||
|
@ -191,4 +192,4 @@ scripts/base/init-default.bro
|
|||
scripts/base/protocols/syslog/main.bro
|
||||
scripts/base/misc/find-checksum-offloading.bro
|
||||
scripts/policy/misc/loaded-scripts.bro
|
||||
#close 2013-05-17-03-58-48
|
||||
#close 2013-06-10-19-50-57
|
||||
|
|
|
@ -3,8 +3,8 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-03-22-21-05-55
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1363986354.505533 arKYeMETxOg 141.42.64.125 56730 125.190.109.199 80 1 GET www.icir.org / - Wget/1.10 0 9130 200 OK - - - (empty) - - - - - -
|
||||
#close 2013-03-22-21-05-56
|
||||
#open 2013-05-21-21-11-32
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1369170691.550143 arKYeMETxOg 141.42.64.125 56730 125.190.109.199 80 1 GET www.icir.org / - Wget/1.10 0 9130 200 OK - - - (empty) - - - - - - -
|
||||
#close 2013-05-21-21-11-33
|
||||
|
|
|
@ -3,8 +3,8 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-04-10-15-49-37
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1365608977.146651 arKYeMETxOg 141.42.64.125 56730 125.190.109.199 80 1 GET www.icir.org / - Wget/1.10 0 9130 200 OK - - - (empty) - - - - - -
|
||||
#close 2013-04-10-15-49-38
|
||||
#open 2013-05-21-21-11-32
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1369170691.550143 arKYeMETxOg 141.42.64.125 56730 125.190.109.199 80 1 GET www.icir.org / - Wget/1.10 0 9130 200 OK - - - (empty) - - - - - - -
|
||||
#close 2013-05-21-21-11-33
|
||||
|
|
|
@ -3,8 +3,8 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-03-22-21-03-17
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1363986197.076696 arKYeMETxOg 141.42.64.125 56730 125.190.109.199 80 1 GET www.icir.org / - Wget/1.10 0 9130 200 OK - - - (empty) - - - - - -
|
||||
#close 2013-03-22-21-03-18
|
||||
#open 2013-05-21-21-11-40
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1369170699.511968 arKYeMETxOg 141.42.64.125 56730 125.190.109.199 80 1 GET www.icir.org / - Wget/1.10 0 9130 200 OK - - - (empty) - - - - - - -
|
||||
#close 2013-05-21-21-11-41
|
||||
|
|
|
@ -3,8 +3,8 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-04-10-15-48-08
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1365608887.935644 arKYeMETxOg 141.42.64.125 56730 125.190.109.199 80 1 GET www.icir.org / - Wget/1.10 0 9130 200 OK - - - (empty) - - - - - -
|
||||
#close 2013-04-10-15-48-09
|
||||
#open 2013-05-21-21-11-40
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1369170699.511968 arKYeMETxOg 141.42.64.125 56730 125.190.109.199 80 1 GET www.icir.org / - Wget/1.10 0 9130 200 OK - - - (empty) - - - - - - -
|
||||
#close 2013-05-21-21-11-41
|
||||
|
|
|
@ -1,23 +1,23 @@
|
|||
FILE_NEW
|
||||
BYYd1GSNX5c, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
^J0.26 | 201
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
file_stream, BYYd1GSNX5c, 1500, ^J0.26 | 2012-08-24 15:10:04 -0700^J^J * Fixing update-changes, which could pick the wrong control file. (Robin Sommer)^J^J * Fixing GPG signing script. (Robin Sommer)^J^J0.25 | 2012-08-01 13:55:46 -0500^J^J * Fix configure script to exit with non-zero status on error (Jon Siwek)^J^J0.24 | 2012-07-05 12:50:43 -0700^J^J * Raise minimum required CMake version to 2.6.3 (Jon Siwek)^J^J * Adding script to delete old fully-merged branches. (Robin Sommer)^J^J0.23-2 | 2012-01-25 13:24:01 -0800^J^J * Fix a bro-cut error message. (Daniel Thayer)^J^J0.23 | 2012-01-11 12:16:11 -0800^J^J * Tweaks to release scripts, plus a new one for signing files.^J (Robin Sommer)^J^J0.22 | 2012-01-10 16:45:19 -0800^J^J * Tweaks for OpenBSD support. (Jon Siwek)^J^J * bro-cut extensions and fixes. (Robin Sommer)^J ^J - If no field names are given on the command line, we now pass through^J all fields. Adresses #657.^J^J - Removing some GNUism from awk script. Addresses #653.^J^J - Added option for time output in UTC. Addresses #668.^J^J - Added output field separator option -F. Addresses #649.^J^J - Fixing option -c: only some header lines were passed through^J rather than all. (Robin Sommer)^J^J * Fix parallel make portability. (Jon Siwek)^J^J0.21-9 | 2011-11-07 05:44:14 -0800^J^J * Fixing compiler warnings. Addresses #388. (Jon Siwek)^J^J0.21-2 | 2011-11-02 18:12:13 -0700^J^J * Fix for misnaming temp file in update-changes script. (Robin Sommer)^J^J0.21-1 | 2011-11-02 18:10:39 -0700^J^J * Little fix for make-relea
|
||||
file_chunk, BYYd1GSNX5c, 1500, 0, ^J0.26 | 2012-08-24 15:10:04 -0700^J^J * Fixing update-changes, which could pick the wrong control file. (Robin Sommer)^J^J * Fixing GPG signing script. (Robin Sommer)^J^J0.25 | 2012-08-01 13:55:46 -0500^J^J * Fix configure script to exit with non-zero status on error (Jon Siwek)^J^J0.24 | 2012-07-05 12:50:43 -0700^J^J * Raise minimum required CMake version to 2.6.3 (Jon Siwek)^J^J * Adding script to delete old fully-merged branches. (Robin Sommer)^J^J0.23-2 | 2012-01-25 13:24:01 -0800^J^J * Fix a bro-cut error message. (Daniel Thayer)^J^J0.23 | 2012-01-11 12:16:11 -0800^J^J * Tweaks to release scripts, plus a new one for signing files.^J (Robin Sommer)^J^J0.22 | 2012-01-10 16:45:19 -0800^J^J * Tweaks for OpenBSD support. (Jon Siwek)^J^J * bro-cut extensions and fixes. (Robin Sommer)^J ^J - If no field names are given on the command line, we now pass through^J all fields. Adresses #657.^J^J - Removing some GNUism from awk script. Addresses #653.^J^J - Added option for time output in UTC. Addresses #668.^J^J - Added output field separator option -F. Addresses #649.^J^J - Fixing option -c: only some header lines were passed through^J rather than all. (Robin Sommer)^J^J * Fix parallel make portability. (Jon Siwek)^J^J0.21-9 | 2011-11-07 05:44:14 -0800^J^J * Fixing compiler warnings. Addresses #388. (Jon Siwek)^J^J0.21-2 | 2011-11-02 18:12:13 -0700^J^J * Fix for misnaming temp file in update-changes script. (Robin Sommer)^J^J0.21-1 | 2011-11-02 18:10:39 -0700^J^J * Little fix for make-relea
|
||||
file_stream, BYYd1GSNX5c, 1024, se script, which could pick out the wrong^J tag. (Robin Sommer)^J^J0.21 | 2011-10-27 17:40:45 -0700^J^J * Fixing bro-cut's usage message and argument error handling. (Robin Sommer)^J^J * Bugfix in update-changes script. (Robin Sommer)^J^J * update-changes now ignores commits it did itself. (Robin Sommer)^J^J * Fix a bug in the update-changes script. (Robin Sommer)^J^J * bro-cut now always installs to $prefix/bin by `make install`. (Jon Siwek)^J^J * Options to adjust time format for bro-cut. (Robin Sommer)^J^J The default with -d is now ISO format. The new option "-D <fmt>"^J specifies a custom strftime()-style format string. Alternatively,^J the environment variable BRO_CUT_TIMEFMT can set the format as^J well.^J^J * bro-cut now understands the field separator header. (Robin Sommer)^J^J * Renaming options -h/-H -> -c/-C, and doing some general cleanup.^J^J0.2 | 2011-10-25 19:53:57 -0700^J^J * Adding support for replacing version string in a setup.py. (Robin^J Sommer)^J^J * Change generated root cert DN indices f
|
||||
file_chunk, BYYd1GSNX5c, 1024, 1500, se script, which could pick out the wrong^J tag. (Robin Sommer)^J^J0.21 | 2011-10-27 17:40:45 -0700^J^J * Fixing bro-cut's usage message and argument error handling. (Robin Sommer)^J^J * Bugfix in update-changes script. (Robin Sommer)^J^J * update-changes now ignores commits it did itself. (Robin Sommer)^J^J * Fix a bug in the update-changes script. (Robin Sommer)^J^J * bro-cut now always installs to $prefix/bin by `make install`. (Jon Siwek)^J^J * Options to adjust time format for bro-cut. (Robin Sommer)^J^J The default with -d is now ISO format. The new option "-D <fmt>"^J specifies a custom strftime()-style format string. Alternatively,^J the environment variable BRO_CUT_TIMEFMT can set the format as^J well.^J^J * bro-cut now understands the field separator header. (Robin Sommer)^J^J * Renaming options -h/-H -> -c/-C, and doing some general cleanup.^J^J0.2 | 2011-10-25 19:53:57 -0700^J^J * Adding support for replacing version string in a setup.py. (Robin^J Sommer)^J^J * Change generated root cert DN indices f
|
||||
file_stream, BYYd1GSNX5c, 476, ormat for RFC2253^J compliance. (Jon Siwek)^J^J * New tool devel-tools/check-release to run before making releases.^J (Robin Sommer)^J^J * devel-tools/update-changes gets a new option -a to amend to^J previous commit if possible. Default is now not to (used to be the^J opposite). (Robin Sommer)^J^J * Change Mozilla trust root generation to index certs by subject DN. (Jon Siwek)^J^J * Change distclean to only remove build dir. (Jon Siwek)^J^J * Make dist now cleans the
|
||||
file_chunk, BYYd1GSNX5c, 476, 2524, ormat for RFC2253^J compliance. (Jon Siwek)^J^J * New tool devel-tools/check-release to run before making releases.^J (Robin Sommer)^J^J * devel-tools/update-changes gets a new option -a to amend to^J previous commit if possible. Default is now not to (used to be the^J opposite). (Robin Sommer)^J^J * Change Mozilla trust root generation to index certs by subject DN. (Jon Siwek)^J^J * Change distclean to only remove build dir. (Jon Siwek)^J^J * Make dist now cleans the
|
||||
file_stream, BYYd1GSNX5c, 1024, copied source (Jon Siwek)^J^J * Small tweak to make-release for forced git-clean. (Jon Siwek)^J^J * Fix to not let updates scripts loose their executable permissions.^J (Robin Sommer)^J^J * devel-tools/update-changes now looks for a 'release' tag to^J idenfify the stable version, and 'beta' for the beta versions.^J (Robin Sommer).^J^J * Distribution cleanup. (Robin Sommer)^J^J * New script devel-tools/make-release to create source tar balls.^J (Robin Sommer)^J^J * Removing bdcat. With the new log format, this isn't very useful^J anymore. (Robin Sommer)^J^J * Adding script that shows all pending git fastpath commits. (Robin^J Sommer)^J^J * Script to measure CPU time by loading an increasing set of^J scripts. (Robin Sommer)^J^J * extract-conn script now deals wit *.gz files. (Robin Sommer)^J^J * Tiny update to output a valid CA list file for SSL cert^J validation. (Seth Hall)^J^J * Adding "install-aux" target. Addresses #622. (Jon Siwek)^J^J * Distribution cleanup. (Jon Siwek and Robin Sommer)^J^J * FindPCAP
|
||||
file_chunk, BYYd1GSNX5c, 1024, 3000, copied source (Jon Siwek)^J^J * Small tweak to make-release for forced git-clean. (Jon Siwek)^J^J * Fix to not let updates scripts loose their executable permissions.^J (Robin Sommer)^J^J * devel-tools/update-changes now looks for a 'release' tag to^J idenfify the stable version, and 'beta' for the beta versions.^J (Robin Sommer).^J^J * Distribution cleanup. (Robin Sommer)^J^J * New script devel-tools/make-release to create source tar balls.^J (Robin Sommer)^J^J * Removing bdcat. With the new log format, this isn't very useful^J anymore. (Robin Sommer)^J^J * Adding script that shows all pending git fastpath commits. (Robin^J Sommer)^J^J * Script to measure CPU time by loading an increasing set of^J scripts. (Robin Sommer)^J^J * extract-conn script now deals wit *.gz files. (Robin Sommer)^J^J * Tiny update to output a valid CA list file for SSL cert^J validation. (Seth Hall)^J^J * Adding "install-aux" target. Addresses #622. (Jon Siwek)^J^J * Distribution cleanup. (Jon Siwek and Robin Sommer)^J^J * FindPCAP
|
||||
file_stream, BYYd1GSNX5c, 476, now links against thread library when necessary (e.g.^J PF_RING's libpcap) (Jon Siwek)^J^J * Install binaries with an RPATH (Jon Siwek)^J^J * Workaround for FreeBSD CMake port missing debug flags (Jon Siwek)^J^J * Rewrite of the update-changes script. (Robin Sommer)^J^J0.1-1 | 2011-06-14 21:12:41 -0700^J^J * Add a script for generating Mozilla's CA list for the SSL analyzer.^J (Seth Hall)^J^J0.1 | 2011-04-01 16:28:22 -0700^J^J * Converting build process to CMake. (Jon Siwek)^J
|
||||
file_chunk, BYYd1GSNX5c, 476, 4024, now links against thread library when necessary (e.g.^J PF_RING's libpcap) (Jon Siwek)^J^J * Install binaries with an RPATH (Jon Siwek)^J^J * Workaround for FreeBSD CMake port missing debug flags (Jon Siwek)^J^J * Rewrite of the update-changes script. (Robin Sommer)^J^J0.1-1 | 2011-06-14 21:12:41 -0700^J^J * Add a script for generating Mozilla's CA list for the SSL analyzer.^J (Seth Hall)^J^J0.1 | 2011-04-01 16:28:22 -0700^J^J * Converting build process to CMake. (Jon Siwek)^J
|
||||
file_stream, BYYd1GSNX5c, 205, ^J * Removing cf/hf/ca-* from distribution. The README has a note where^J to find them now. (Robin Sommer)^J^J * General cleanup. (Robin Sommer)^J^J * Initial import of bro/aux from SVN r7088. (Jon Siwek)^J
|
||||
file_chunk, BYYd1GSNX5c, 205, 4500, ^J * Removing cf/hf/ca-* from distribution. The README has a note where^J to find them now. (Robin Sommer)^J^J * General cleanup. (Robin Sommer)^J^J * Initial import of bro/aux from SVN r7088. (Jon Siwek)^J
|
||||
file_stream, file #0, 1500, ^J0.26 | 2012-08-24 15:10:04 -0700^J^J * Fixing update-changes, which could pick the wrong control file. (Robin Sommer)^J^J * Fixing GPG signing script. (Robin Sommer)^J^J0.25 | 2012-08-01 13:55:46 -0500^J^J * Fix configure script to exit with non-zero status on error (Jon Siwek)^J^J0.24 | 2012-07-05 12:50:43 -0700^J^J * Raise minimum required CMake version to 2.6.3 (Jon Siwek)^J^J * Adding script to delete old fully-merged branches. (Robin Sommer)^J^J0.23-2 | 2012-01-25 13:24:01 -0800^J^J * Fix a bro-cut error message. (Daniel Thayer)^J^J0.23 | 2012-01-11 12:16:11 -0800^J^J * Tweaks to release scripts, plus a new one for signing files.^J (Robin Sommer)^J^J0.22 | 2012-01-10 16:45:19 -0800^J^J * Tweaks for OpenBSD support. (Jon Siwek)^J^J * bro-cut extensions and fixes. (Robin Sommer)^J ^J - If no field names are given on the command line, we now pass through^J all fields. Adresses #657.^J^J - Removing some GNUism from awk script. Addresses #653.^J^J - Added option for time output in UTC. Addresses #668.^J^J - Added output field separator option -F. Addresses #649.^J^J - Fixing option -c: only some header lines were passed through^J rather than all. (Robin Sommer)^J^J * Fix parallel make portability. (Jon Siwek)^J^J0.21-9 | 2011-11-07 05:44:14 -0800^J^J * Fixing compiler warnings. Addresses #388. (Jon Siwek)^J^J0.21-2 | 2011-11-02 18:12:13 -0700^J^J * Fix for misnaming temp file in update-changes script. (Robin Sommer)^J^J0.21-1 | 2011-11-02 18:10:39 -0700^J^J * Little fix for make-relea
|
||||
file_chunk, file #0, 1500, 0, ^J0.26 | 2012-08-24 15:10:04 -0700^J^J * Fixing update-changes, which could pick the wrong control file. (Robin Sommer)^J^J * Fixing GPG signing script. (Robin Sommer)^J^J0.25 | 2012-08-01 13:55:46 -0500^J^J * Fix configure script to exit with non-zero status on error (Jon Siwek)^J^J0.24 | 2012-07-05 12:50:43 -0700^J^J * Raise minimum required CMake version to 2.6.3 (Jon Siwek)^J^J * Adding script to delete old fully-merged branches. (Robin Sommer)^J^J0.23-2 | 2012-01-25 13:24:01 -0800^J^J * Fix a bro-cut error message. (Daniel Thayer)^J^J0.23 | 2012-01-11 12:16:11 -0800^J^J * Tweaks to release scripts, plus a new one for signing files.^J (Robin Sommer)^J^J0.22 | 2012-01-10 16:45:19 -0800^J^J * Tweaks for OpenBSD support. (Jon Siwek)^J^J * bro-cut extensions and fixes. (Robin Sommer)^J ^J - If no field names are given on the command line, we now pass through^J all fields. Adresses #657.^J^J - Removing some GNUism from awk script. Addresses #653.^J^J - Added option for time output in UTC. Addresses #668.^J^J - Added output field separator option -F. Addresses #649.^J^J - Fixing option -c: only some header lines were passed through^J rather than all. (Robin Sommer)^J^J * Fix parallel make portability. (Jon Siwek)^J^J0.21-9 | 2011-11-07 05:44:14 -0800^J^J * Fixing compiler warnings. Addresses #388. (Jon Siwek)^J^J0.21-2 | 2011-11-02 18:12:13 -0700^J^J * Fix for misnaming temp file in update-changes script. (Robin Sommer)^J^J0.21-1 | 2011-11-02 18:10:39 -0700^J^J * Little fix for make-relea
|
||||
file_stream, file #0, 1024, se script, which could pick out the wrong^J tag. (Robin Sommer)^J^J0.21 | 2011-10-27 17:40:45 -0700^J^J * Fixing bro-cut's usage message and argument error handling. (Robin Sommer)^J^J * Bugfix in update-changes script. (Robin Sommer)^J^J * update-changes now ignores commits it did itself. (Robin Sommer)^J^J * Fix a bug in the update-changes script. (Robin Sommer)^J^J * bro-cut now always installs to $prefix/bin by `make install`. (Jon Siwek)^J^J * Options to adjust time format for bro-cut. (Robin Sommer)^J^J The default with -d is now ISO format. The new option "-D <fmt>"^J specifies a custom strftime()-style format string. Alternatively,^J the environment variable BRO_CUT_TIMEFMT can set the format as^J well.^J^J * bro-cut now understands the field separator header. (Robin Sommer)^J^J * Renaming options -h/-H -> -c/-C, and doing some general cleanup.^J^J0.2 | 2011-10-25 19:53:57 -0700^J^J * Adding support for replacing version string in a setup.py. (Robin^J Sommer)^J^J * Change generated root cert DN indices f
|
||||
file_chunk, file #0, 1024, 1500, se script, which could pick out the wrong^J tag. (Robin Sommer)^J^J0.21 | 2011-10-27 17:40:45 -0700^J^J * Fixing bro-cut's usage message and argument error handling. (Robin Sommer)^J^J * Bugfix in update-changes script. (Robin Sommer)^J^J * update-changes now ignores commits it did itself. (Robin Sommer)^J^J * Fix a bug in the update-changes script. (Robin Sommer)^J^J * bro-cut now always installs to $prefix/bin by `make install`. (Jon Siwek)^J^J * Options to adjust time format for bro-cut. (Robin Sommer)^J^J The default with -d is now ISO format. The new option "-D <fmt>"^J specifies a custom strftime()-style format string. Alternatively,^J the environment variable BRO_CUT_TIMEFMT can set the format as^J well.^J^J * bro-cut now understands the field separator header. (Robin Sommer)^J^J * Renaming options -h/-H -> -c/-C, and doing some general cleanup.^J^J0.2 | 2011-10-25 19:53:57 -0700^J^J * Adding support for replacing version string in a setup.py. (Robin^J Sommer)^J^J * Change generated root cert DN indices f
|
||||
file_stream, file #0, 476, ormat for RFC2253^J compliance. (Jon Siwek)^J^J * New tool devel-tools/check-release to run before making releases.^J (Robin Sommer)^J^J * devel-tools/update-changes gets a new option -a to amend to^J previous commit if possible. Default is now not to (used to be the^J opposite). (Robin Sommer)^J^J * Change Mozilla trust root generation to index certs by subject DN. (Jon Siwek)^J^J * Change distclean to only remove build dir. (Jon Siwek)^J^J * Make dist now cleans the
|
||||
file_chunk, file #0, 476, 2524, ormat for RFC2253^J compliance. (Jon Siwek)^J^J * New tool devel-tools/check-release to run before making releases.^J (Robin Sommer)^J^J * devel-tools/update-changes gets a new option -a to amend to^J previous commit if possible. Default is now not to (used to be the^J opposite). (Robin Sommer)^J^J * Change Mozilla trust root generation to index certs by subject DN. (Jon Siwek)^J^J * Change distclean to only remove build dir. (Jon Siwek)^J^J * Make dist now cleans the
|
||||
file_stream, file #0, 1024, copied source (Jon Siwek)^J^J * Small tweak to make-release for forced git-clean. (Jon Siwek)^J^J * Fix to not let updates scripts loose their executable permissions.^J (Robin Sommer)^J^J * devel-tools/update-changes now looks for a 'release' tag to^J idenfify the stable version, and 'beta' for the beta versions.^J (Robin Sommer).^J^J * Distribution cleanup. (Robin Sommer)^J^J * New script devel-tools/make-release to create source tar balls.^J (Robin Sommer)^J^J * Removing bdcat. With the new log format, this isn't very useful^J anymore. (Robin Sommer)^J^J * Adding script that shows all pending git fastpath commits. (Robin^J Sommer)^J^J * Script to measure CPU time by loading an increasing set of^J scripts. (Robin Sommer)^J^J * extract-conn script now deals wit *.gz files. (Robin Sommer)^J^J * Tiny update to output a valid CA list file for SSL cert^J validation. (Seth Hall)^J^J * Adding "install-aux" target. Addresses #622. (Jon Siwek)^J^J * Distribution cleanup. (Jon Siwek and Robin Sommer)^J^J * FindPCAP
|
||||
file_chunk, file #0, 1024, 3000, copied source (Jon Siwek)^J^J * Small tweak to make-release for forced git-clean. (Jon Siwek)^J^J * Fix to not let updates scripts loose their executable permissions.^J (Robin Sommer)^J^J * devel-tools/update-changes now looks for a 'release' tag to^J idenfify the stable version, and 'beta' for the beta versions.^J (Robin Sommer).^J^J * Distribution cleanup. (Robin Sommer)^J^J * New script devel-tools/make-release to create source tar balls.^J (Robin Sommer)^J^J * Removing bdcat. With the new log format, this isn't very useful^J anymore. (Robin Sommer)^J^J * Adding script that shows all pending git fastpath commits. (Robin^J Sommer)^J^J * Script to measure CPU time by loading an increasing set of^J scripts. (Robin Sommer)^J^J * extract-conn script now deals wit *.gz files. (Robin Sommer)^J^J * Tiny update to output a valid CA list file for SSL cert^J validation. (Seth Hall)^J^J * Adding "install-aux" target. Addresses #622. (Jon Siwek)^J^J * Distribution cleanup. (Jon Siwek and Robin Sommer)^J^J * FindPCAP
|
||||
file_stream, file #0, 476, now links against thread library when necessary (e.g.^J PF_RING's libpcap) (Jon Siwek)^J^J * Install binaries with an RPATH (Jon Siwek)^J^J * Workaround for FreeBSD CMake port missing debug flags (Jon Siwek)^J^J * Rewrite of the update-changes script. (Robin Sommer)^J^J0.1-1 | 2011-06-14 21:12:41 -0700^J^J * Add a script for generating Mozilla's CA list for the SSL analyzer.^J (Seth Hall)^J^J0.1 | 2011-04-01 16:28:22 -0700^J^J * Converting build process to CMake. (Jon Siwek)^J
|
||||
file_chunk, file #0, 476, 4024, now links against thread library when necessary (e.g.^J PF_RING's libpcap) (Jon Siwek)^J^J * Install binaries with an RPATH (Jon Siwek)^J^J * Workaround for FreeBSD CMake port missing debug flags (Jon Siwek)^J^J * Rewrite of the update-changes script. (Robin Sommer)^J^J0.1-1 | 2011-06-14 21:12:41 -0700^J^J * Add a script for generating Mozilla's CA list for the SSL analyzer.^J (Seth Hall)^J^J0.1 | 2011-04-01 16:28:22 -0700^J^J * Converting build process to CMake. (Jon Siwek)^J
|
||||
file_stream, file #0, 205, ^J * Removing cf/hf/ca-* from distribution. The README has a note where^J to find them now. (Robin Sommer)^J^J * General cleanup. (Robin Sommer)^J^J * Initial import of bro/aux from SVN r7088. (Jon Siwek)^J
|
||||
file_chunk, file #0, 205, 4500, ^J * Removing cf/hf/ca-* from distribution. The README has a note where^J to find them now. (Robin Sommer)^J^J * General cleanup. (Robin Sommer)^J^J * Initial import of bro/aux from SVN r7088. (Jon Siwek)^J
|
||||
FILE_STATE_REMOVE
|
||||
BYYd1GSNX5c, 4705, 0
|
||||
file #0, 4705, 0
|
||||
[orig_h=141.142.228.5, orig_p=59856/tcp, resp_h=192.150.187.43, resp_p=80/tcp]
|
||||
total bytes: 4705
|
||||
source: HTTP
|
||||
|
|
|
@ -1,11 +1,11 @@
|
|||
FILE_NEW
|
||||
BYYd1GSNX5c, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
^J0.26 | 201
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
BYYd1GSNX5c, 4705, 0
|
||||
file #0, 4705, 0
|
||||
[orig_h=141.142.228.5, orig_p=59856/tcp, resp_h=192.150.187.43, resp_p=80/tcp]
|
||||
total bytes: 4705
|
||||
source: HTTP
|
||||
|
|
|
@ -1,20 +1,20 @@
|
|||
FILE_NEW
|
||||
Cvu8OAp0WEd, 0, 0
|
||||
file #0, 0, 0
|
||||
MIME_TYPE
|
||||
application/x-dosexec
|
||||
FILE_STATE_REMOVE
|
||||
Cvu8OAp0WEd, 1022920, 0
|
||||
file #0, 1022920, 0
|
||||
[orig_h=192.168.72.14, orig_p=3254/tcp, resp_h=65.54.95.206, resp_p=80/tcp]
|
||||
total bytes: 1022920
|
||||
source: HTTP
|
||||
FILE_NEW
|
||||
Cvu8OAp0WEd, 0, 0
|
||||
file #1, 0, 0
|
||||
MIME_TYPE
|
||||
application/octet-stream
|
||||
FILE_TIMEOUT
|
||||
FILE_TIMEOUT
|
||||
FILE_STATE_REMOVE
|
||||
Cvu8OAp0WEd, 206024, 0
|
||||
file #1, 206024, 0
|
||||
[orig_h=192.168.72.14, orig_p=3257/tcp, resp_h=65.54.95.14, resp_p=80/tcp]
|
||||
total bytes: 1022920
|
||||
source: HTTP
|
|
@ -1,5 +1,5 @@
|
|||
FILE_NEW
|
||||
BYYd1GSNX5c, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
^J0.26 | 201
|
||||
MIME_TYPE
|
||||
|
|
|
@ -1,11 +1,11 @@
|
|||
FILE_NEW
|
||||
5LcdtqrLA97, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
The Nationa
|
||||
MIME_TYPE
|
||||
text/x-pascal
|
||||
FILE_STATE_REMOVE
|
||||
5LcdtqrLA97, 16557, 0
|
||||
file #0, 16557, 0
|
||||
[orig_h=141.142.228.5, orig_p=50737/tcp, resp_h=141.142.192.162, resp_p=38141/tcp]
|
||||
source: FTP_DATA
|
||||
MD5: 7192a8075196267203adb3dfaa5c908d
|
||||
|
|
|
@ -1,11 +1,11 @@
|
|||
FILE_NEW
|
||||
FBfDYB0kA49, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
{^J "origin
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
FBfDYB0kA49, 197, 0
|
||||
file #0, 197, 0
|
||||
[orig_h=141.142.228.5, orig_p=50153/tcp, resp_h=54.243.118.187, resp_p=80/tcp]
|
||||
source: HTTP
|
||||
MD5: 5baba7eea57bc8a42a92c817ed566d72
|
||||
|
|
|
@ -1,11 +1,11 @@
|
|||
FILE_NEW
|
||||
BYYd1GSNX5c, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
^J0.26 | 201
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
BYYd1GSNX5c, 4705, 0
|
||||
file #0, 4705, 0
|
||||
[orig_h=141.142.228.5, orig_p=59856/tcp, resp_h=192.150.187.43, resp_p=80/tcp]
|
||||
total bytes: 4705
|
||||
source: HTTP
|
||||
|
|
|
@ -0,0 +1 @@
|
|||
test
|
|
@ -0,0 +1 @@
|
|||
test2
|
|
@ -0,0 +1 @@
|
|||
test3
|
|
@ -0,0 +1,21 @@
|
|||
{
|
||||
"data": "",
|
||||
"form": {
|
||||
"example": "test",
|
||||
"example2": "test2",
|
||||
"example3": "test3"
|
||||
},
|
||||
"origin": "141.142.228.5",
|
||||
"json": null,
|
||||
"url": "http://httpbin.org/post",
|
||||
"args": {},
|
||||
"headers": {
|
||||
"Content-Type": "multipart/form-data; boundary=----------------------------4ebf00fbcf09",
|
||||
"User-Agent": "curl/7.30.0",
|
||||
"Connection": "close",
|
||||
"Accept": "*/*",
|
||||
"Content-Length": "350",
|
||||
"Host": "httpbin.org"
|
||||
},
|
||||
"files": {}
|
||||
}
|
|
@ -0,0 +1,53 @@
|
|||
FILE_NEW
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
test^M^J
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
file #0, 6, 0
|
||||
[orig_h=141.142.228.5, orig_p=57262/tcp, resp_h=54.243.88.146, resp_p=80/tcp]
|
||||
source: HTTP
|
||||
MD5: 9f06243abcb89c70e0c331c61d871fa7
|
||||
SHA1: fde773a18bb29f5ed65e6f0a7aa717fd1fa485d4
|
||||
SHA256: 837ccb607e312b170fac7383d7ccfd61fa5072793f19a25e75fbacb56539b86b
|
||||
FILE_NEW
|
||||
file #1, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
test2^M^J
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
file #1, 7, 0
|
||||
[orig_h=141.142.228.5, orig_p=57262/tcp, resp_h=54.243.88.146, resp_p=80/tcp]
|
||||
source: HTTP
|
||||
MD5: d68af81ef370b3873d50f09140068810
|
||||
SHA1: 51a7b6f2d91f6a87822dc04560f2972bc14fc97e
|
||||
SHA256: de0edd0ac4a705aff70f34734e90a1d0a1d8b76abe4bb53f3ea934bc105b3b17
|
||||
FILE_NEW
|
||||
file #2, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
test3^M^J
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
file #2, 7, 0
|
||||
[orig_h=141.142.228.5, orig_p=57262/tcp, resp_h=54.243.88.146, resp_p=80/tcp]
|
||||
source: HTTP
|
||||
MD5: 1a3d75d44753ad246f0bd333cdaf08b0
|
||||
SHA1: 4f98809ab09272dfcc58266e3f23ae2393f70e76
|
||||
SHA256: 018c67a2c30ed9977e1dddfe98cac542165dac355cf9764c91a362613e752933
|
||||
FILE_NEW
|
||||
file #3, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
{^J "data":
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
file #3, 465, 0
|
||||
[orig_h=141.142.228.5, orig_p=57262/tcp, resp_h=54.243.88.146, resp_p=80/tcp]
|
||||
total bytes: 465
|
||||
source: HTTP
|
||||
MD5: 226244811006caf4ac904344841168dd
|
||||
SHA1: 7222902b8b8e68e25c0422e7f8bdf344efeda54d
|
||||
SHA256: dd485ecf240e12807516b0a27718fc3ab9a17c1158a452967343c98cefba07a0
|
|
@ -1,10 +1,10 @@
|
|||
FILE_NEW
|
||||
1QXxzNpRT3h, 0, 0
|
||||
file #0, 0, 0
|
||||
MIME_TYPE
|
||||
application/pdf
|
||||
FILE_OVER_NEW_CONNECTION
|
||||
FILE_STATE_REMOVE
|
||||
1QXxzNpRT3h, 555523, 0
|
||||
file #0, 555523, 0
|
||||
[orig_h=10.101.84.70, orig_p=10978/tcp, resp_h=129.174.93.161, resp_p=80/tcp]
|
||||
[orig_h=10.101.84.70, orig_p=10977/tcp, resp_h=129.174.93.161, resp_p=80/tcp]
|
||||
total bytes: 555523
|
||||
|
|
|
@ -1,19 +1,19 @@
|
|||
FILE_NEW
|
||||
Cvu8OAp0WEd, 0, 0
|
||||
file #0, 0, 0
|
||||
MIME_TYPE
|
||||
application/x-dosexec
|
||||
FILE_STATE_REMOVE
|
||||
Cvu8OAp0WEd, 1022920, 0
|
||||
file #0, 1022920, 0
|
||||
[orig_h=192.168.72.14, orig_p=3254/tcp, resp_h=65.54.95.206, resp_p=80/tcp]
|
||||
total bytes: 1022920
|
||||
source: HTTP
|
||||
FILE_NEW
|
||||
Cvu8OAp0WEd, 0, 0
|
||||
file #1, 0, 0
|
||||
MIME_TYPE
|
||||
application/octet-stream
|
||||
FILE_TIMEOUT
|
||||
FILE_STATE_REMOVE
|
||||
Cvu8OAp0WEd, 206024, 0
|
||||
file #1, 206024, 0
|
||||
[orig_h=192.168.72.14, orig_p=3257/tcp, resp_h=65.54.95.14, resp_p=80/tcp]
|
||||
total bytes: 1022920
|
||||
source: HTTP
|
||||
|
|
|
@ -1,10 +1,10 @@
|
|||
FILE_NEW
|
||||
me4WAjZH0Ik, 0, 0
|
||||
file #0, 0, 0
|
||||
MIME_TYPE
|
||||
application/octet-stream
|
||||
FILE_OVER_NEW_CONNECTION
|
||||
FILE_STATE_REMOVE
|
||||
me4WAjZH0Ik, 498702, 0
|
||||
file #0, 498702, 0
|
||||
[orig_h=10.45.179.94, orig_p=19950/tcp, resp_h=129.174.93.170, resp_p=80/tcp]
|
||||
[orig_h=10.45.179.94, orig_p=19953/tcp, resp_h=129.174.93.170, resp_p=80/tcp]
|
||||
total bytes: 498668
|
||||
|
|
|
@ -1,37 +1,37 @@
|
|||
FILE_NEW
|
||||
FiqZGsUZjXk, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
/*^J********
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
FiqZGsUZjXk, 2675, 0
|
||||
file #0, 2675, 0
|
||||
[orig_h=192.168.1.104, orig_p=1673/tcp, resp_h=63.245.209.11, resp_p=80/tcp]
|
||||
source: HTTP
|
||||
MD5: b932c3310ce47e158d1a5a42e0b01279
|
||||
SHA1: 0e42ae17eea9b074981bd3a34535ad3a22d02706
|
||||
SHA256: 5b037a2c5e36f56e63a3012c73e46a04b27741d8ff8f8b62c832fb681fc60f42
|
||||
FILE_NEW
|
||||
GU8RrggV4f5, 0, 0
|
||||
file #1, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
//-- Google
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
GU8RrggV4f5, 21421, 0
|
||||
file #1, 21421, 0
|
||||
[orig_h=192.168.1.104, orig_p=1673/tcp, resp_h=63.245.209.11, resp_p=80/tcp]
|
||||
source: HTTP
|
||||
MD5: e732f7bf1d7cb4eedcb1661697d7bc8c
|
||||
SHA1: 8f241117afaa8ca5f41dc059e66d75c283dcc983
|
||||
SHA256: 6a509fd05aa7c8fa05080198894bb19e638554ffcee0e0b3d7bc8ff54afee1da
|
||||
FILE_NEW
|
||||
0afVj9ZG1J9, 0, 0
|
||||
file #2, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
GIF89a^D\0^D\0\xb3
|
||||
MIME_TYPE
|
||||
image/gif
|
||||
FILE_STATE_REMOVE
|
||||
0afVj9ZG1J9, 94, 0
|
||||
file #2, 94, 0
|
||||
[orig_h=192.168.1.104, orig_p=1673/tcp, resp_h=63.245.209.11, resp_p=80/tcp]
|
||||
total bytes: 94
|
||||
source: HTTP
|
||||
|
@ -39,13 +39,13 @@ MD5: d903de7e30db1691d3130ba5eae6b9a7
|
|||
SHA1: 81f5f056ce5e97d940854bb0c48017b45dd9f15e
|
||||
SHA256: 6fb22aa9d780ea63bd7a2e12b92b16fcbf1c4874f1d3e11309a5ba984433c315
|
||||
FILE_NEW
|
||||
oMJlhgZt8Nh, 0, 0
|
||||
file #3, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
\x89PNG^M^J^Z^J\0\0\0
|
||||
MIME_TYPE
|
||||
image/png
|
||||
FILE_STATE_REMOVE
|
||||
oMJlhgZt8Nh, 2349, 0
|
||||
file #3, 2349, 0
|
||||
[orig_h=192.168.1.104, orig_p=1673/tcp, resp_h=63.245.209.11, resp_p=80/tcp]
|
||||
total bytes: 2349
|
||||
source: HTTP
|
||||
|
@ -53,13 +53,13 @@ MD5: e0029eea80812e9a8e57b8d05d52938a
|
|||
SHA1: 560eab5a0177246827a94042dd103916d8765ac7
|
||||
SHA256: e0b4500c1fd1d675da4137461cbe64d3c8489f4180d194e47683b20e7fb876f4
|
||||
FILE_NEW
|
||||
KajlXqmipId, 0, 0
|
||||
file #4, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
\x89PNG^M^J^Z^J\0\0\0
|
||||
MIME_TYPE
|
||||
image/png
|
||||
FILE_STATE_REMOVE
|
||||
KajlXqmipId, 27579, 0
|
||||
file #4, 27579, 0
|
||||
[orig_h=192.168.1.104, orig_p=1673/tcp, resp_h=63.245.209.11, resp_p=80/tcp]
|
||||
total bytes: 27579
|
||||
source: HTTP
|
||||
|
|
|
@ -1,11 +1,11 @@
|
|||
FILE_NEW
|
||||
1V1QkS1JR02, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
hello world
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
1V1QkS1JR02, 11, 0
|
||||
file #0, 11, 0
|
||||
[orig_h=141.142.228.5, orig_p=53595/tcp, resp_h=54.243.55.129, resp_p=80/tcp]
|
||||
total bytes: 11
|
||||
source: HTTP
|
||||
|
@ -13,13 +13,13 @@ MD5: 5eb63bbbe01eeed093cb22bb8f5acdc3
|
|||
SHA1: 2aae6c35c94fcfb415dbe95f408b9ce91ee846ed
|
||||
SHA256: b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9
|
||||
FILE_NEW
|
||||
IYuq13QwRPh, 0, 0
|
||||
file #1, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
{^J "origin
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
IYuq13QwRPh, 366, 0
|
||||
file #1, 366, 0
|
||||
[orig_h=141.142.228.5, orig_p=53595/tcp, resp_h=54.243.55.129, resp_p=80/tcp]
|
||||
total bytes: 366
|
||||
source: HTTP
|
||||
|
|
|
@ -1,11 +1,11 @@
|
|||
FILE_NEW
|
||||
nYgPNGLrZf9, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
#separator
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
nYgPNGLrZf9, 311, 0
|
||||
file #0, 311, 0
|
||||
source: ../input.log
|
||||
MD5: bf4dfa6169b74146da5236e918743599
|
||||
SHA1: 0a0f20de89c86d7bce1301af6548d6e9ae87b0f1
|
||||
|
|
|
@ -1,11 +1,11 @@
|
|||
FILE_NEW
|
||||
A3OSdqG9zvk, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
PK^C^D^T\0\0\0^H\0\xae
|
||||
MIME_TYPE
|
||||
application/zip
|
||||
FILE_STATE_REMOVE
|
||||
A3OSdqG9zvk, 42208, 0
|
||||
file #0, 42208, 0
|
||||
[orig_h=192.168.1.77, orig_p=57655/tcp, resp_h=209.197.168.151, resp_p=1024/tcp]
|
||||
source: IRC_DATA
|
||||
MD5: 8c0803242f549c2780cb88b9a9215c65
|
||||
|
|
|
@ -3,8 +3,8 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path file_analysis
|
||||
#open 2013-05-17-00-55-16
|
||||
#fields id parent_id source is_orig last_active seen_bytes total_bytes missing_bytes overflow_bytes timeout_interval bof_buffer_size mime_type timedout conn_uids analyzers extracted_files md5 sha1 sha256
|
||||
#types string string string bool time count count count count interval count string bool table[string] table[enum] table[string] string string string
|
||||
BYYd1GSNX5c - HTTP F 1362692527.009775 4705 4705 0 0 120.000000 1024 text/plain F UWkUyAuUGXf FileAnalysis::ANALYZER_SHA1,FileAnalysis::ANALYZER_EXTRACT,FileAnalysis::ANALYZER_DATA_EVENT,FileAnalysis::ANALYZER_MD5,FileAnalysis::ANALYZER_SHA256 BYYd1GSNX5c-file 397168fd09991a0e712254df7bc639ac 1dd7ac0398df6cbc0696445a91ec681facf4dc47 4e7c7ef0984119447e743e3ec77e1de52713e345cde03fe7df753a35849bed18
|
||||
#close 2013-05-17-00-55-16
|
||||
#open 2013-06-07-18-51-45
|
||||
#fields id parent_id source is_orig last_active seen_bytes total_bytes missing_bytes overflow_bytes timeout_interval bof_buffer_size mime_type timedout conn_uids extracted_files md5 sha1 sha256
|
||||
#types string string string bool time count count count count interval count string bool table[string] table[string] string string string
|
||||
BYYd1GSNX5c - HTTP F 1362692527.009775 4705 4705 0 0 120.000000 1024 text/plain F UWkUyAuUGXf BYYd1GSNX5c-file 397168fd09991a0e712254df7bc639ac 1dd7ac0398df6cbc0696445a91ec681facf4dc47 4e7c7ef0984119447e743e3ec77e1de52713e345cde03fe7df753a35849bed18
|
||||
#close 2013-06-07-18-51-46
|
||||
|
|
|
@ -1,37 +1,37 @@
|
|||
FILE_NEW
|
||||
mR3f2AAKo11, 0, 0
|
||||
file #0, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
Hello^M^J^M^J ^M
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
mR3f2AAKo11, 79, 0
|
||||
file #0, 79, 0
|
||||
[orig_h=10.10.1.4, orig_p=1470/tcp, resp_h=74.53.140.153, resp_p=25/tcp]
|
||||
source: SMTP
|
||||
MD5: 92bca2e6cdcde73647125da7dccbdd07
|
||||
SHA1: b7e497be8a9f5e2c4b6980fceb015360f98f4a13
|
||||
SHA256: 785a8a044d1454ec88837108f443bbb30cc4f529393ffd57118261036bfe59f5
|
||||
FILE_NEW
|
||||
svBvmJEWan2, 0, 0
|
||||
file #1, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
<html xmlns
|
||||
MIME_TYPE
|
||||
text/html
|
||||
FILE_STATE_REMOVE
|
||||
svBvmJEWan2, 1918, 0
|
||||
file #1, 1918, 0
|
||||
[orig_h=10.10.1.4, orig_p=1470/tcp, resp_h=74.53.140.153, resp_p=25/tcp]
|
||||
source: SMTP
|
||||
MD5: d194c6359c85bb88b54caee18b1e9b44
|
||||
SHA1: e54af6c6616525611364b80bd6557a7ea21dae94
|
||||
SHA256: b9556e92ddbe52379b64804136f830d111cafe7fcd78e54817fe40f3bc24268d
|
||||
FILE_NEW
|
||||
ZNp0KBSLByc, 0, 0
|
||||
file #2, 0, 0
|
||||
FILE_BOF_BUFFER
|
||||
Version 4.9
|
||||
MIME_TYPE
|
||||
text/plain
|
||||
FILE_STATE_REMOVE
|
||||
ZNp0KBSLByc, 10823, 0
|
||||
file #2, 10823, 0
|
||||
[orig_h=10.10.1.4, orig_p=1470/tcp, resp_h=74.53.140.153, resp_p=25/tcp]
|
||||
source: SMTP
|
||||
MD5: a968bb0f9f9d95835b2e74c845877e87
|
||||
|
|
|
@ -3,8 +3,8 @@
|
|||
#empty_field (empty)
|
||||
#unset_field -
|
||||
#path http
|
||||
#open 2013-03-22-14-38-21
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string string
|
||||
1315799856.264750 UWkUyAuUGXf 10.0.1.104 64216 193.40.5.162 80 1 GET lepo.it.da.ut.ee /~cect/teoreetilised seminarid_2010/arheoloogia_uurimisr\xfchma_seminar/Joyce et al - The Languages of Archaeology ~ Dialogue, Narrative and Writing.pdf - Wget/1.12 (darwin10.8.0) 0 346 404 Not Found - - - (empty) - - - text/html - -
|
||||
#close 2013-03-22-14-38-21
|
||||
#open 2013-05-21-21-11-23
|
||||
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
#types time string addr port addr port count string string string string string count count count string count string string table[enum] string string table[string] string string vector[string] vector[string]
|
||||
1315799856.264750 UWkUyAuUGXf 10.0.1.104 64216 193.40.5.162 80 1 GET lepo.it.da.ut.ee /~cect/teoreetilised seminarid_2010/arheoloogia_uurimisr\xfchma_seminar/Joyce et al - The Languages of Archaeology ~ Dialogue, Narrative and Writing.pdf - Wget/1.12 (darwin10.8.0) 0 346 404 Not Found - - - (empty) - - - text/html - - -
|
||||
#close 2013-05-21-21-11-23
|
||||
|
|
|
@ -34,7 +34,8 @@
|
|||
<field type="variable32" name="proxied" pack_unique="yes"/>
|
||||
<field type="variable32" name="mime_type" pack_unique="yes"/>
|
||||
<field type="variable32" name="md5" pack_unique="yes"/>
|
||||
<field type="variable32" name="extraction_file" pack_unique="yes"/>
|
||||
<field type="variable32" name="extracted_request_files" pack_unique="yes"/>
|
||||
<field type="variable32" name="extracted_response_files" pack_unique="yes"/>
|
||||
</ExtentType>
|
||||
<!-- ts : time -->
|
||||
<!-- uid : string -->
|
||||
|
@ -61,10 +62,11 @@
|
|||
<!-- proxied : table[string] -->
|
||||
<!-- mime_type : string -->
|
||||
<!-- md5 : string -->
|
||||
<!-- extraction_file : string -->
|
||||
<!-- extracted_request_files : vector[string] -->
|
||||
<!-- extracted_response_files : vector[string] -->
|
||||
|
||||
# Extent, type='http'
|
||||
ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extraction_file
|
||||
ts uid id.orig_h id.orig_p id.resp_h id.resp_p trans_depth method host uri referrer user_agent request_body_len response_body_len status_code status_msg info_code info_msg filename tags username password proxied mime_type md5 extracted_request_files extracted_response_files
|
||||
1300475168.784020 j4u32Pc5bif 141.142.220.118 48649 208.80.152.118 80 1 GET bits.wikimedia.org /skins-1.5/monobook/main.css http://www.wikipedia.org/ Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15 0 0 304 Not Modified 0
|
||||
1300475168.916018 VW0XPVINV8a 141.142.220.118 49997 208.80.152.3 80 1 GET upload.wikimedia.org /wikipedia/commons/6/63/Wikipedia-logo.png http://www.wikipedia.org/ Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15 0 0 304 Not Modified 0
|
||||
1300475168.916183 3PKsZ2Uye21 141.142.220.118 49996 208.80.152.3 80 1 GET upload.wikimedia.org /wikipedia/commons/thumb/b/bb/Wikipedia_wordmark.svg/174px-Wikipedia_wordmark.svg.png http://www.wikipedia.org/ Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15 0 0 304 Not Modified 0
|
||||
|
|
|
@ -1,14 +1,14 @@
|
|||
1300475168.78402|j4u32Pc5bif|141.142.220.118|48649|208.80.152.118|80|1|GET|bits.wikimedia.org|/skins-1.5/monobook/main.css|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.91602|VW0XPVINV8a|141.142.220.118|49997|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/6/63/Wikipedia-logo.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.91618|3PKsZ2Uye21|141.142.220.118|49996|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/thumb/b/bb/Wikipedia_wordmark.svg/174px-Wikipedia_wordmark.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.91836|GSxOnSLghOa|141.142.220.118|49998|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/b/bd/Bookshelf-40x201_6.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.9523|P654jzLoe3a|141.142.220.118|49999|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/4/4a/Wiktionary-logo-en-35px.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.95231|Tw8jXtpTGu6|141.142.220.118|50000|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/thumb/8/8a/Wikinews-logo.png/35px-Wikinews-logo.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.95482|0Q4FH8sESw5|141.142.220.118|50001|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/thumb/f/fa/Wikiquote-logo.svg/35px-Wikiquote-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.96269|i2rO3KD1Syg|141.142.220.118|35642|208.80.152.2|80|1|GET|meta.wikimedia.org|/images/wikimedia-button.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.97593|VW0XPVINV8a|141.142.220.118|49997|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/f/fa/Wikibooks-logo.svg/35px-Wikibooks-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.97644|3PKsZ2Uye21|141.142.220.118|49996|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/d/df/Wikispecies-logo.svg/35px-Wikispecies-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.97926|GSxOnSLghOa|141.142.220.118|49998|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/4/4c/Wikisource-logo.svg/35px-Wikisource-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475169.01459|P654jzLoe3a|141.142.220.118|49999|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/9/91/Wikiversity-logo.svg/35px-Wikiversity-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475169.01462|Tw8jXtpTGu6|141.142.220.118|50000|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/4/4a/Commons-logo.svg/35px-Commons-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475169.01493|0Q4FH8sESw5|141.142.220.118|50001|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/7/75/Wikimedia_Community_Logo.svg/35px-Wikimedia_Community_Logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)||||||
|
||||
1300475168.78402|j4u32Pc5bif|141.142.220.118|48649|208.80.152.118|80|1|GET|bits.wikimedia.org|/skins-1.5/monobook/main.css|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.91602|VW0XPVINV8a|141.142.220.118|49997|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/6/63/Wikipedia-logo.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.91618|3PKsZ2Uye21|141.142.220.118|49996|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/thumb/b/bb/Wikipedia_wordmark.svg/174px-Wikipedia_wordmark.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.91836|GSxOnSLghOa|141.142.220.118|49998|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/b/bd/Bookshelf-40x201_6.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.9523|P654jzLoe3a|141.142.220.118|49999|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/4/4a/Wiktionary-logo-en-35px.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.95231|Tw8jXtpTGu6|141.142.220.118|50000|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/thumb/8/8a/Wikinews-logo.png/35px-Wikinews-logo.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.95482|0Q4FH8sESw5|141.142.220.118|50001|208.80.152.3|80|1|GET|upload.wikimedia.org|/wikipedia/commons/thumb/f/fa/Wikiquote-logo.svg/35px-Wikiquote-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.96269|i2rO3KD1Syg|141.142.220.118|35642|208.80.152.2|80|1|GET|meta.wikimedia.org|/images/wikimedia-button.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.97593|VW0XPVINV8a|141.142.220.118|49997|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/f/fa/Wikibooks-logo.svg/35px-Wikibooks-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.97644|3PKsZ2Uye21|141.142.220.118|49996|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/d/df/Wikispecies-logo.svg/35px-Wikispecies-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475168.97926|GSxOnSLghOa|141.142.220.118|49998|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/4/4c/Wikisource-logo.svg/35px-Wikisource-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475169.01459|P654jzLoe3a|141.142.220.118|49999|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/9/91/Wikiversity-logo.svg/35px-Wikiversity-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475169.01462|Tw8jXtpTGu6|141.142.220.118|50000|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/4/4a/Commons-logo.svg/35px-Commons-logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
1300475169.01493|0Q4FH8sESw5|141.142.220.118|50001|208.80.152.3|80|2|GET|upload.wikimedia.org|/wikipedia/commons/thumb/7/75/Wikimedia_Community_Logo.svg/35px-Wikimedia_Community_Logo.svg.png|http://www.wikipedia.org/|Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.15) Gecko/20110303 Ubuntu/10.04 (lucid) Firefox/3.6.15|0|0|304|Not Modified||||(empty)|||||||
|
||||
|
|
Some files were not shown because too many files have changed in this diff.