Merge remote-tracking branch 'origin/master' into topic/liangzhu/analyzer-ocsp

Liang Zhu 2015-08-18 16:00:59 -07:00
commit 12c68f197c
30 changed files with 255 additions and 252 deletions

CHANGES

@@ -1,4 +1,41 @@
+2.4-87 | 2015-08-14 08:34:41 -0700
+
+  * Removing the yielding_teredo_decapsulation option. (Robin Sommer)
+
+2.4-86 | 2015-08-12 17:02:24 -0700
+
+  * Make Teredo DPD signature more precise. (Martina Balint)
+
+2.4-84 | 2015-08-10 14:44:39 -0700
+
+  * Add hook 'HookSetupAnalyzerTree' to allow plugins access to a
+    connection's initial analyzer tree for customization. (James
+    Swaro)
+
+  * Plugins now look for a file "__preload__.bro" in the top-level
+    script directory. If found, they load it first, before any scripts
+    defining BiF elements. This can be used to define types that the
+    BiFs already depend on (like a custom type for an event argument).
+    (Robin Sommer)
+
+2.4-81 | 2015-08-08 07:38:42 -0700
+
+  * Fix a test that is failing very frequently. (Daniel Thayer)
+
+2.4-78 | 2015-08-06 22:25:19 -0400
+
+  * Remove build dependency on Perl (now requiring Python instead).
+    (Daniel Thayer)
+
+  * CID 1314754: Fixing unreachable code in RSH analyzer. (Robin
+    Sommer)
+
+  * CID 1312752: Add comment to mark 'case' fallthrough as ok. (Robin
+    Sommer)
+
+  * CID 1312751: Removing redundant assignment. (Robin Sommer)
+
 2.4-73 | 2015-07-31 08:53:49 -0700
 
   * BIT-1429: SMTP logs now include CC: addresses. (Albert Zaharovits)


@@ -61,7 +61,7 @@ if (NOT SED_EXE)
     endif ()
 endif ()
 
-FindRequiredPackage(Perl)
+FindRequiredPackage(PythonInterp)
 FindRequiredPackage(FLEX)
 FindRequiredPackage(BISON)
 FindRequiredPackage(PCAP)

NEWS

@@ -16,6 +16,8 @@ New Dependencies
 - Bro now requires the C++ Actor Framework, CAF, which must be
   installed first. See http://actor-framework.org.
 
+- Bro now requires Python instead of Perl to compile the source code.
+
 New Functionality
 -----------------
 
@@ -29,6 +31,7 @@ New Functionality
 - New Bro plugins in aux/plugins:
 
     - pf_ring: Native PF_RING support.
+    - redis: An experimental log writer for Redis.
 
 Bro 2.4
 =======


@@ -1 +1 @@
-2.4-73
+2.4-87

@@ -1 +1 @@
-Subproject commit 07af9748f40dc47d3a2b3290db494a90dcbddbdc
+Subproject commit 2470f64b58d875f9491e251b866a15a2ec4c05da

@@ -1 +1 @@
-Subproject commit 2799b2a13577fc70eea1da6192879a25c58902de
+Subproject commit bb86ad945c823c94ea8385ec4ebb9546ba5198af

configure

@@ -55,7 +55,7 @@ Usage: $0 [OPTION]... [VAR=VALUE]...
     --with-binpac=PATH       path to BinPAC install root
     --with-flex=PATH         path to flex executable
     --with-bison=PATH        path to bison executable
-    --with-perl=PATH         path to perl executable
+    --with-python=PATH       path to Python executable
 
     --with-libcaf=PATH       path to C++ Actor Framework installation
                              (a required Broker dependency)
@@ -63,7 +63,6 @@ Usage: $0 [OPTION]... [VAR=VALUE]...
     --with-geoip=PATH        path to the libGeoIP install root
     --with-perftools=PATH    path to Google Perftools install root
     --with-jemalloc=PATH     path to jemalloc install root
-    --with-python=PATH       path to Python interpreter
     --with-python-lib=PATH   path to libpython
     --with-python-inc=PATH   path to Python headers
     --with-ruby=PATH         path to ruby interpreter
@@ -239,9 +238,6 @@ while [ $# -ne 0 ]; do
         --with-bison=*)
             append_cache_entry BISON_EXECUTABLE PATH $optarg
             ;;
-        --with-perl=*)
-            append_cache_entry PERL_EXECUTABLE PATH $optarg
-            ;;
         --with-geoip=*)
             append_cache_entry LibGeoIP_ROOT_DIR PATH $optarg
             ;;


@@ -209,8 +209,15 @@ directory. With the skeleton, ``<base>`` corresponds to ``build/``.
   "@load"ed.
 
 ``scripts``/__load__.bro
-    A Bro script that will be loaded immediately when the plugin gets
-    activated. See below for more information on activating plugins.
+    A Bro script that will be loaded when the plugin gets activated.
+    When this script executes, any BiF elements that the plugin
+    defines will already be available. See below for more information
+    on activating plugins.
+
+``scripts``/__preload__.bro
+    A Bro script that will be loaded when the plugin gets activated,
+    but before any BiF elements become available. See below for more
+    information on activating plugins.
 
 ``lib/bif/``
     Directory with auto-generated Bro scripts that declare the plugin's
@@ -279,7 +286,9 @@ Activating a plugin will:
 1. Load the dynamic module
 2. Make any bif items available
 3. Add the ``scripts/`` directory to ``BROPATH``
-4. Load ``scripts/__load__.bro``
+4. Load ``scripts/__preload__.bro``
+5. Make BiF elements available to scripts.
+6. Load ``scripts/__load__.bro``
 
 By default, Bro will automatically activate all dynamic plugins found
 in its search path ``BRO_PLUGIN_PATH``. However, in bare mode (``bro
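The activation order above can be sketched as a small helper. This is a Python sketch only, not Bro's actual C++ logic; the file names are the ones documented above, and the injectable `is_file` parameter is an illustrative convenience:

```python
import os

# Sketch of the documented load order: scripts/__preload__.bro first, then
# the auto-generated lib/bif/__load__.bro that exposes BiF elements, and
# finally scripts/__load__.bro. A missing file is simply skipped, mirroring
# the is_file() checks in the plugin manager.
def plugin_load_order(plugin_dir, is_file=os.path.isfile):
    candidates = [
        os.path.join(plugin_dir, "scripts", "__preload__.bro"),
        os.path.join(plugin_dir, "lib", "bif", "__load__.bro"),
        os.path.join(plugin_dir, "scripts", "__load__.bro"),
    ]
    return [p for p in candidates if is_file(p)]
```

This ordering is what lets `__preload__.bro` define types that the BiF declarations depend on.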


@@ -45,7 +45,7 @@ To build Bro from source, the following additional dependencies are required:
 * Libpcap headers (http://www.tcpdump.org)
 * OpenSSL headers (http://www.openssl.org)
 * zlib headers
-* Perl
+* Python
 
 .. todo::
 
@@ -72,7 +72,7 @@ To install the required dependencies, you can use:
 .. console::
 
-    sudo pkg install bash cmake swig bison python perl5 py27-sqlite3
+    sudo pkg install bash cmake swig bison python py27-sqlite3
 
 Note that in older versions of FreeBSD, you might have to use the
 "pkg_add -r" command instead of "pkg install".


@@ -3712,20 +3712,11 @@ export {
 	## Toggle whether to do GRE decapsulation.
 	const enable_gre = T &redef;
 
-	## With this option set, the Teredo analysis will first check to see if
-	## other protocol analyzers have confirmed that they think they're
-	## parsing the right protocol and only continue with Teredo tunnel
-	## decapsulation if nothing else has yet confirmed. This can help
-	## reduce false positives of UDP traffic (e.g. DNS) that also happens
-	## to have a valid Teredo encapsulation.
-	const yielding_teredo_decapsulation = T &redef;
-
 	## With this set, the Teredo analyzer waits until it sees both sides
 	## of a connection using a valid Teredo encapsulation before issuing
 	## a :bro:see:`protocol_confirmation`. If it's false, the first
 	## occurrence of a packet with valid Teredo encapsulation causes a
-	## confirmation. Both cases are still subject to effects of
-	## :bro:see:`Tunnel::yielding_teredo_decapsulation`.
+	## confirmation.
 	const delay_teredo_confirmation = T &redef;
 
 	## With this set, the GTP analyzer waits until the most-recent upflow


@@ -9,6 +9,6 @@ signature dpd_ayiya {
 signature dpd_teredo {
 	ip-proto = udp
-	payload /^(\x00\x00)|(\x00\x01)|([\x60-\x6f])/
+	payload /^(\x00\x00)|(\x00\x01)|([\x60-\x6f].{7}((\x20\x01\x00\x00)).{28})|([\x60-\x6f].{23}((\x20\x01\x00\x00))).{12}/
 	enable "teredo"
 }
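For intuition, the refined pattern can be transcribed into a Python bytes regex. This is a sketch; the offset interpretation is an assumption based on the IPv6 header layout (bytes 8-11 start the source address, bytes 24-27 the destination), so the new clauses require an address beginning with Teredo's 2001:0000::/32 prefix rather than matching any 0x6X first byte:

```python
import re

# Bytes transcription of the dpd_teredo payload regex above. re.DOTALL makes
# "." match raw \x00 bytes, as Bro's signature matcher does.
TEREDO_DPD = re.compile(
    rb"^(\x00\x00)|(\x00\x01)"                          # origin/auth indication
    rb"|([\x60-\x6f].{7}((\x20\x01\x00\x00)).{28})"     # src addr in 2001:0::/32
    rb"|([\x60-\x6f].{23}((\x20\x01\x00\x00))).{12}",   # dst addr in 2001:0::/32
    re.DOTALL)

def looks_like_teredo(payload):
    # re.match anchors every alternative at the start of the payload.
    return TEREDO_DPD.match(payload) is not None

# A fake 40-byte IPv6 header whose source address begins with 2001:0000:
ipv6_teredo_src = b"\x60" + b"\x00" * 7 + b"\x20\x01\x00\x00" + b"\x00" * 28
```

The check stays heuristic: payloads beginning with the two-byte indications can still match, which is why the removed comment mentioned false positives on ordinary UDP traffic such as DNS.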


@@ -223,16 +223,16 @@ endmacro(COLLECT_HEADERS _var)
 cmake_policy(POP)
 
-# define a command that's used to run the make_dbg_constants.pl script
+# define a command that's used to run the make_dbg_constants.py script
 # building the bro binary depends on the outputs of this script
 add_custom_command(OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/DebugCmdConstants.h
                           ${CMAKE_CURRENT_BINARY_DIR}/DebugCmdInfoConstants.cc
-                   COMMAND ${PERL_EXECUTABLE}
-                   ARGS ${CMAKE_CURRENT_SOURCE_DIR}/make_dbg_constants.pl
+                   COMMAND ${PYTHON_EXECUTABLE}
+                   ARGS ${CMAKE_CURRENT_SOURCE_DIR}/make_dbg_constants.py
                         ${CMAKE_CURRENT_SOURCE_DIR}/DebugCmdInfoConstants.in
-                   DEPENDS ${CMAKE_CURRENT_SOURCE_DIR}/make_dbg_constants.pl
+                   DEPENDS ${CMAKE_CURRENT_SOURCE_DIR}/make_dbg_constants.py
                            ${CMAKE_CURRENT_SOURCE_DIR}/DebugCmdInfoConstants.in
-                   COMMENT "[Perl] Processing debug commands"
+                   COMMENT "[Python] Processing debug commands"
                    WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}
 )


@@ -505,6 +505,8 @@ bool Manager::BuildInitialAnalyzerTree(Connection* conn)
 	if ( ! analyzed )
 		conn->SetLifetime(non_analyzed_lifetime);
 
+	PLUGIN_HOOK_VOID(HOOK_SETUP_ANALYZER_TREE, HookSetupAnalyzerTree(conn));
+
 	return true;
 	}


@@ -93,8 +93,7 @@ void Contents_Rsh_Analyzer::DoDeliver(int len, const u_char* data)
 		case RSH_LINE_MODE:
 		case RSH_UNKNOWN:
 		case RSH_PRESUMED_REJECTED:
-			if ( state == RSH_LINE_MODE &&
-			     state == RSH_PRESUMED_REJECTED )
+			if ( state == RSH_PRESUMED_REJECTED )
 				{
 				Conn()->Weird("rsh_text_after_rejected");
 				state = RSH_UNKNOWN;


@@ -722,6 +722,8 @@ void POP3_Analyzer::ProcessReply(int length, const char* line)
 	case CAPA:
 		ProtocolConfirmation();
 
+		// Fall-through.
+
 	case UIDL:
 	case LIST:
 		if (requestForMultiLine == true)


@@ -189,36 +189,7 @@ void Teredo_Analyzer::DeliverPacket(int len, const u_char* data, bool orig,
 		else
 			valid_resp = true;
 
-		if ( BifConst::Tunnel::yielding_teredo_decapsulation &&
-		     ! ProtocolConfirmed() )
-			{
-			// Only confirm the Teredo tunnel and start decapsulating packets
-			// when no other sibling analyzer thinks it's already parsing the
-			// right protocol.
-			bool sibling_has_confirmed = false;
-
-			if ( Parent() )
-				{
-				LOOP_OVER_GIVEN_CONST_CHILDREN(i, Parent()->GetChildren())
-					{
-					if ( (*i)->ProtocolConfirmed() )
-						{
-						sibling_has_confirmed = true;
-						break;
-						}
-					}
-				}
-
-			if ( ! sibling_has_confirmed )
-				Confirm();
-			else
-				{
-				delete inner;
-				return;
-				}
-			}
-		else
-			// Aggressively decapsulate anything with valid Teredo encapsulation.
-			Confirm();
+		Confirm();
 		}
 
 	else


@@ -19,7 +19,6 @@ const Tunnel::enable_ayiya: bool;
 const Tunnel::enable_teredo: bool;
 const Tunnel::enable_gtpv1: bool;
 const Tunnel::enable_gre: bool;
-const Tunnel::yielding_teredo_decapsulation: bool;
 const Tunnel::delay_teredo_confirmation: bool;
 const Tunnel::delay_gtp_confirmation: bool;
 const Tunnel::ip_tunnel_timeout: interval;


@@ -310,9 +310,8 @@ void Packet::ProcessLayer2()
 		}
 
-	// We've now determined (a) L3_IPV4 vs (b) L3_IPV6 vs
-	// (c) L3_ARP vs (d) L3_UNKNOWN.
-	l3_proto = l3_proto;
+	// We've now determined (a) L3_IPV4 vs (b) L3_IPV6 vs (c) L3_ARP vs
+	// (d) L3_UNKNOWN.
 
 	// Calculate how much header we've used up.
 	hdr_size = (pdata - data);


@@ -1,143 +0,0 @@
# Build the DebugCmdConstants.h and DebugCmdInfoConstants.h files from the
# DebugCmdInfoConstants.in file.
#
# We do this via a script rather than maintaining them directly because
# the struct is a little complicated, so has to be initialized from code,
# plus we want to make adding new constants somewhat less painful.
#
# The input filename should be supplied as an argument
#
# DebugCmds are printed to DebugCmdConstants.h
# DebugCmdInfos are printed to DebugCmdInfoConstants.h
#
# The input format is:
#
# cmd: [DebugCmd]
# names: [space delimited names of cmd]
# resume: ['true' or 'false': should execution resume after this command?]
# help: [some help text]
#
# Blank lines are skipped.
# Comments should start with // and should be on a line by themselves.
use strict;
open INPUT, $ARGV[0] or die "Input file $ARGV[0] not found.";
open DEBUGCMDS, ">DebugCmdConstants.h"
or die "Unable to open DebugCmdConstants.h";
open DEBUGCMDINFOS, ">DebugCmdInfoConstants.cc"
or die "Unable to open DebugCmdInfoConstants.cc";
my $init_tmpl =
'
{
DebugCmdInfo* info;
@@name_init
info = new DebugCmdInfo (@@cmd, names, @@num_names, @@resume, "@@help",
@@repeatable);
g_DebugCmdInfos.push_back(info);
}
';
my $enum_str = "
//
// This file was automatically generated from $ARGV[0]
// DO NOT EDIT.
//
enum DebugCmd {
";
my $init_str = "
//
// This file was automatically generated from $ARGV[0]
// DO NOT EDIT.
//
#include \"util.h\"
void init_global_dbg_constants () {
";
my %dbginfo;
# { cmd, num_names, \@names, name_init, resume, help, repeatable }
no strict "refs";
sub OutputRecord {
$dbginfo{name_init} .= "const char * const names[] = {\n\t";
$_ = "\"$_\"" foreach @{$dbginfo{names}}; # put quotes around the strings
my $name_strs = join ",\n\t", @{$dbginfo{names}};
$dbginfo{name_init} .= "$name_strs\n };\n";
$dbginfo{num_names} = scalar @{$dbginfo{names}};
# substitute into template
my $init = $init_tmpl;
$init =~ s/(\@\@(\w+))/defined $dbginfo{$2} ? $dbginfo{$2} : ""/eg;
$init_str .= $init;
$enum_str .= "\t$dbginfo{cmd},\n";
}
use strict "refs";
sub InitDbginfo
{
my $dbginfo = shift;
%$dbginfo = ( num_names => 0, names => [], resume => 'false', help => '',
repeatable => 'false' );
}
InitDbginfo(\%dbginfo);
while (<INPUT>) {
chomp ($_);
next if $_ =~ /^\s*$/; # skip blank
next if $_ =~ /^\s*\/\//; # skip comments
$_ =~ /^\s*([a-z]+):\s*(.*)$/ or
die "Error in debug constant file on line: $_";
if ($1 eq 'cmd')
{
my $newcmd = $2;
if (defined $dbginfo{cmd}) { # output the previous record
OutputRecord();
InitDbginfo(\%dbginfo);
}
$dbginfo{cmd} = $newcmd;
}
elsif ($1 eq 'names')
{
my @names = split / /, $2;
$dbginfo{names} = \@names;
}
elsif ($1 eq 'resume')
{
$dbginfo{resume} = $2;
}
elsif ($1 eq 'help')
{
$dbginfo{help} = $2;
$dbginfo{help} =~ s{\"}{\\\"}g; # escape quotation marks
}
elsif ($1 eq 'repeatable')
{
$dbginfo{repeatable} = $2;
}
else {
die "Unknown command: $_\n";
}
}
# output the last record
OutputRecord();
$init_str .= " \n}\n";
$enum_str .= " dcLast\n};\n";
print DEBUGCMDS $enum_str;
close DEBUGCMDS;
print DEBUGCMDINFOS $init_str;
close DEBUGCMDINFOS;

src/make_dbg_constants.py (new file)

@@ -0,0 +1,114 @@
# Build the DebugCmdConstants.h and DebugCmdInfoConstants.cc files from the
# DebugCmdInfoConstants.in file.
#
# We do this via a script rather than maintaining them directly because
# the struct is a little complicated, so has to be initialized from code,
# plus we want to make adding new constants somewhat less painful.
#
# The input filename should be supplied as an argument.
#
# DebugCmds are printed to DebugCmdConstants.h
# DebugCmdInfos are printed to DebugCmdInfoConstants.cc
#
# The input format is:
#
# cmd: [DebugCmd]
# names: [space delimited names of cmd]
# resume: ['true' or 'false': should execution resume after this command?]
# help: [some help text]
#
# Blank lines are skipped.
# Comments should start with // and should be on a line by themselves.
import sys
inputfile = sys.argv[1]
init_tmpl = '''
{
DebugCmdInfo* info;
%(name_init)s
info = new DebugCmdInfo (%(cmd)s, names, %(num_names)s, %(resume)s, "%(help)s",
%(repeatable)s);
g_DebugCmdInfos.push_back(info);
}
'''
enum_str = '''
//
// This file was automatically generated from %s
// DO NOT EDIT.
//
enum DebugCmd {
''' % inputfile
init_str = '''
//
// This file was automatically generated from %s
// DO NOT EDIT.
//
#include "util.h"
void init_global_dbg_constants () {
''' % inputfile
def outputrecord():
global init_str, enum_str
dbginfo["name_init"] = "const char * const names[] = {\n\t%s\n };\n" % ",\n\t".join(dbginfo["names"])
dbginfo["num_names"] = len(dbginfo["names"])
# substitute into template
init_str += init_tmpl % dbginfo
enum_str += "\t%s,\n" % dbginfo["cmd"]
def initdbginfo():
return {"cmd": "", "name_init": "", "num_names": 0, "names": [],
"resume": "false", "help": "", "repeatable": "false"}
dbginfo = initdbginfo()
inputf = open(inputfile, "r")
for line in inputf:
line = line.strip()
if not line or line.startswith("//"): # skip empty lines and comments
continue
fields = line.split(":", 1)
if len(fields) != 2:
raise RuntimeError("Error in debug constant file on line: %s" % line)
f1, f2 = fields
f2 = f2.strip()
if f1 == "cmd":
if dbginfo[f1]: # output the previous record
outputrecord()
dbginfo = initdbginfo()
dbginfo[f1] = f2
elif f1 == "names":
# put quotes around the strings
dbginfo[f1] = [ '"%s"' % n for n in f2.split() ]
elif f1 == "help":
dbginfo[f1] = f2.replace('"', '\\"') # escape quotation marks
elif f1 in ("resume", "repeatable"):
dbginfo[f1] = f2
else:
raise RuntimeError("Unknown command: %s" % line)
# output the last record
outputrecord()
init_str += " \n}\n"
enum_str += " dcLast\n};\n"
debugcmds = open("DebugCmdConstants.h", "w")
debugcmds.write(enum_str)
debugcmds.close()
debugcmdinfos = open("DebugCmdInfoConstants.cc", "w")
debugcmdinfos.write(init_str)
debugcmdinfos.close()
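The script above builds C++ source by %-substituting a per-command dict into a template string. A toy reduction of the same technique (the record values below are illustrative, not from the real DebugCmdInfoConstants.in):

```python
# Minimal version of the init_tmpl substitution used by make_dbg_constants.py:
# named %(key)s placeholders are filled from a dict describing one command.
init_tmpl = '''    info = new DebugCmdInfo (%(cmd)s, names, %(num_names)s, %(resume)s, "%(help)s",
                              %(repeatable)s);'''

record = {"cmd": "dcNext", "num_names": 1, "resume": "true",
          "help": "step to the next statement", "repeatable": "true"}

generated = init_tmpl % record
```

This mirrors the Perl version's `@@key` template, but leans on Python's built-in %-formatting instead of a hand-rolled regex substitution.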


@@ -182,9 +182,17 @@ bool Manager::ActivateDynamicPluginInternal(const std::string& name, bool ok_if_
 		add_to_bro_path(scripts);
 		}
 
-	// Load {bif,scripts}/__load__.bro automatically.
+	// First load {scripts}/__preload__.bro automatically.
+	string init = dir + "scripts/__preload__.bro";
 
-	string init = dir + "lib/bif/__load__.bro";
+	if ( is_file(init) )
+		{
+		DBG_LOG(DBG_PLUGINS, "  Loading %s", init.c_str());
+		scripts_to_load.push_back(init);
+		}
+
+	// Load {bif,scripts}/__load__.bro automatically.
+	init = dir + "lib/bif/__load__.bro";
 
 	if ( is_file(init) )
 		{
@@ -660,6 +668,33 @@ void Manager::HookDrainEvents() const
 	}
 
+void Manager::HookSetupAnalyzerTree(Connection *conn) const
+	{
+	HookArgumentList args;
+
+	if ( HavePluginForHook(META_HOOK_PRE) )
+		{
+		args.push_back(conn);
+		MetaHookPre(HOOK_SETUP_ANALYZER_TREE, args);
+		}
+
+	hook_list *l = hooks[HOOK_SETUP_ANALYZER_TREE];
+
+	if ( l )
+		{
+		for ( hook_list::iterator i = l->begin(); i != l->end(); ++i )
+			{
+			Plugin *p = (*i).second;
+			p->HookSetupAnalyzerTree(conn);
+			}
+		}
+
+	if ( HavePluginForHook(META_HOOK_POST) )
+		{
+		MetaHookPost(HOOK_SETUP_ANALYZER_TREE, args, HookArgument());
+		}
+	}
+
 void Manager::HookUpdateNetworkTime(double network_time) const
 	{
 	HookArgumentList args;
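The new hook follows the same dispatch shape as the other Manager hooks: an optional pre meta-hook, then each registered plugin's callback, then a post meta-hook, with hook_list holding (priority, plugin) pairs. A Python sketch of that pattern (class and method names here are illustrative, not Bro's API):

```python
# Sketch of the hook dispatch pattern: callbacks register with a priority,
# and dispatch runs them highest-priority first, the way Bro's hook_list
# stores (priority, plugin) pairs.
class HookManager:
    def __init__(self):
        self.hooks = {}  # hook name -> list of (priority, callback)

    def register(self, hook, callback, priority=0):
        self.hooks.setdefault(hook, []).append((priority, callback))

    def dispatch(self, hook, *args):
        # Sort by priority only, descending; stable for equal priorities.
        entries = sorted(self.hooks.get(hook, []),
                         key=lambda e: e[0], reverse=True)
        return [cb(*args) for _prio, cb in entries]
```

A hook with no registrations simply dispatches to nothing, just as the `if ( l )` guard above skips an empty hook_list.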


@@ -264,6 +264,15 @@ public:
 	 */
 	void HookUpdateNetworkTime(double network_time) const;
 
+	/**
+	 * Hook that executes when a connection's initial analyzer tree
+	 * has been fully set up. The hook can manipulate the tree at this time,
+	 * for example by adding further analyzers.
+	 *
+	 * @param conn The connection.
+	 */
+	void HookSetupAnalyzerTree(Connection *conn) const;
+
 	/**
 	 * Hook that informs plugins that the event queue is being drained.
 	 */


@@ -23,6 +23,7 @@ const char* plugin::hook_name(HookType h)
 		"DrainEvents",
 		"UpdateNetworkTime",
 		"BroObjDtor",
+		"SetupAnalyzerTree",
 		// MetaHooks
 		"MetaHookPre",
 		"MetaHookPost",
@@ -310,6 +311,10 @@ void Plugin::HookUpdateNetworkTime(double network_time)
 	{
 	}
 
+void Plugin::HookSetupAnalyzerTree(Connection *conn)
+	{
+	}
+
 void Plugin::HookBroObjDtor(void* obj)
 	{
 	}


@@ -14,7 +14,7 @@
 // We allow to override this externally for testing purposes.
 #ifndef BRO_PLUGIN_API_VERSION
-#define BRO_PLUGIN_API_VERSION 3
+#define BRO_PLUGIN_API_VERSION 4
 #endif
 
 class ODesc;
@@ -39,6 +39,7 @@ enum HookType {
 	HOOK_DRAIN_EVENTS,		//< Activates Plugin::HookDrainEvents()
 	HOOK_UPDATE_NETWORK_TIME,	//< Activates Plugin::HookUpdateNetworkTime.
 	HOOK_BRO_OBJ_DTOR,		//< Activates Plugin::HookBroObjDtor.
+	HOOK_SETUP_ANALYZER_TREE,	//< Activates Plugin::HookSetupAnalyzerTree.
 	// Meta hooks.
 	META_HOOK_PRE,			//< Activates Plugin::MetaHookPre().
 	META_HOOK_POST,			//< Activates Plugin::MetaHookPost().
@@ -636,6 +637,8 @@ protected:
 	 */
 	virtual void HookUpdateNetworkTime(double network_time);
 
+	virtual void HookSetupAnalyzerTree(Connection *conn);
+
 	/**
 	 * Hook for destruction of objects registered with
 	 * RequestBroObjDtor(). When Bro's reference counting triggers the


@@ -1,15 +0,0 @@
#separator \x09
#set_separator ,
#empty_field (empty)
#unset_field -
#path weird
#open 2009-11-18-17-59-51
#fields ts uid id.orig_h id.orig_p id.resp_h id.resp_p name addl notice peer
#types time string addr port addr port string string bool string
1258567191.405770 - - - - - truncated_header_in_tunnel - F bro
1258578181.260420 - - - - - truncated_header_in_tunnel - F bro
1258579063.557927 - - - - - truncated_header_in_tunnel - F bro
1258581768.568451 - - - - - truncated_header_in_tunnel - F bro
1258584478.859853 - - - - - truncated_header_in_tunnel - F bro
1258600683.934458 - - - - - truncated_header_in_tunnel - F bro
#close 2009-11-19-03-18-03


@@ -1,10 +0,0 @@
#separator \x09
#set_separator ,
#empty_field (empty)
#unset_field -
#path known_services
#open 2014-04-01-22-57-25
#fields ts host port_num port_proto service
#types time addr port enum set[string]
1258567191.405770 192.168.1.1 53 udp TEREDO
#close 2014-04-01-22-57-25


@ -220,7 +220,7 @@
0.000000 MetaHookPost CallFunction(Log::__create_stream, <frame>, (Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird])) -> <no result> 0.000000 MetaHookPost CallFunction(Log::__create_stream, <frame>, (Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird])) -> <no result>
0.000000 MetaHookPost CallFunction(Log::__create_stream, <frame>, (X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509])) -> <no result> 0.000000 MetaHookPost CallFunction(Log::__create_stream, <frame>, (X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509])) -> <no result>
0.000000 MetaHookPost CallFunction(Log::__create_stream, <frame>, (mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql])) -> <no result> 0.000000 MetaHookPost CallFunction(Log::__create_stream, <frame>, (mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql])) -> <no result>
0.000000 MetaHookPost CallFunction(Log::__write, <frame>, (PacketFilter::LOG, [ts=1429655378.868621, node=bro, filter=ip or not ip, init=T, success=T])) -> <no result> 0.000000 MetaHookPost CallFunction(Log::__write, <frame>, (PacketFilter::LOG, [ts=1439244305.210087, node=bro, filter=ip or not ip, init=T, success=T])) -> <no result>
0.000000 MetaHookPost CallFunction(Log::add_default_filter, <frame>, (Cluster::LOG)) -> <no result> 0.000000 MetaHookPost CallFunction(Log::add_default_filter, <frame>, (Cluster::LOG)) -> <no result>
0.000000 MetaHookPost CallFunction(Log::add_default_filter, <frame>, (Communication::LOG)) -> <no result> 0.000000 MetaHookPost CallFunction(Log::add_default_filter, <frame>, (Communication::LOG)) -> <no result>
0.000000 MetaHookPost CallFunction(Log::add_default_filter, <frame>, (Conn::LOG)) -> <no result> 0.000000 MetaHookPost CallFunction(Log::add_default_filter, <frame>, (Conn::LOG)) -> <no result>
@ -326,7 +326,7 @@
0.000000 MetaHookPost CallFunction(Log::create_stream, <frame>, (Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird])) -> <no result> 0.000000 MetaHookPost CallFunction(Log::create_stream, <frame>, (Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird])) -> <no result>
0.000000 MetaHookPost CallFunction(Log::create_stream, <frame>, (X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509])) -> <no result> 0.000000 MetaHookPost CallFunction(Log::create_stream, <frame>, (X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509])) -> <no result>
0.000000 MetaHookPost CallFunction(Log::create_stream, <frame>, (mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql])) -> <no result> 0.000000 MetaHookPost CallFunction(Log::create_stream, <frame>, (mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql])) -> <no result>
0.000000 MetaHookPost CallFunction(Log::write, <frame>, (PacketFilter::LOG, [ts=1429655378.868621, node=bro, filter=ip or not ip, init=T, success=T])) -> <no result> 0.000000 MetaHookPost CallFunction(Log::write, <frame>, (PacketFilter::LOG, [ts=1439244305.210087, node=bro, filter=ip or not ip, init=T, success=T])) -> <no result>
0.000000 MetaHookPost CallFunction(Notice::want_pp, <frame>, ()) -> <no result> 0.000000 MetaHookPost CallFunction(Notice::want_pp, <frame>, ()) -> <no result>
0.000000 MetaHookPost CallFunction(PacketFilter::build, <frame>, ()) -> <no result> 0.000000 MetaHookPost CallFunction(PacketFilter::build, <frame>, ()) -> <no result>
0.000000 MetaHookPost CallFunction(PacketFilter::combine_filters, <frame>, (ip or not ip, and, )) -> <no result> 0.000000 MetaHookPost CallFunction(PacketFilter::combine_filters, <frame>, (ip or not ip, and, )) -> <no result>
@ -490,6 +490,7 @@
0.000000 MetaHookPost LoadFile(./top-k.bif.bro) -> -1 0.000000 MetaHookPost LoadFile(./top-k.bif.bro) -> -1
0.000000 MetaHookPost LoadFile(./topk) -> -1 0.000000 MetaHookPost LoadFile(./topk) -> -1
0.000000 MetaHookPost LoadFile(./types.bif.bro) -> -1 0.000000 MetaHookPost LoadFile(./types.bif.bro) -> -1
0.000000 MetaHookPost LoadFile(./types.bro) -> -1
0.000000 MetaHookPost LoadFile(./unique) -> -1 0.000000 MetaHookPost LoadFile(./unique) -> -1
0.000000 MetaHookPost LoadFile(./utils) -> -1 0.000000 MetaHookPost LoadFile(./utils) -> -1
0.000000 MetaHookPost LoadFile(./utils-commands) -> -1 0.000000 MetaHookPost LoadFile(./utils-commands) -> -1
@ -509,6 +510,7 @@
0.000000 MetaHookPost LoadFile(.<...>/raw) -> -1 0.000000 MetaHookPost LoadFile(.<...>/raw) -> -1
0.000000 MetaHookPost LoadFile(.<...>/sqlite) -> -1 0.000000 MetaHookPost LoadFile(.<...>/sqlite) -> -1
0.000000 MetaHookPost LoadFile(<...>/__load__.bro) -> -1 0.000000 MetaHookPost LoadFile(<...>/__load__.bro) -> -1
0.000000 MetaHookPost LoadFile(<...>/__preload__.bro) -> -1
0.000000 MetaHookPost LoadFile(<...>/hooks.bro) -> -1 0.000000 MetaHookPost LoadFile(<...>/hooks.bro) -> -1
0.000000 MetaHookPost LoadFile(base/bif) -> -1 0.000000 MetaHookPost LoadFile(base/bif) -> -1
0.000000 MetaHookPost LoadFile(base/init-default.bro) -> -1 0.000000 MetaHookPost LoadFile(base/init-default.bro) -> -1
@ -810,7 +812,7 @@
0.000000 MetaHookPre CallFunction(Log::__create_stream, <frame>, (Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird])) 0.000000 MetaHookPre CallFunction(Log::__create_stream, <frame>, (Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird]))
0.000000 MetaHookPre CallFunction(Log::__create_stream, <frame>, (X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509])) 0.000000 MetaHookPre CallFunction(Log::__create_stream, <frame>, (X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509]))
0.000000 MetaHookPre CallFunction(Log::__create_stream, <frame>, (mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql])) 0.000000 MetaHookPre CallFunction(Log::__create_stream, <frame>, (mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql]))
0.000000 MetaHookPre CallFunction(Log::__write, <frame>, (PacketFilter::LOG, [ts=1429655378.868621, node=bro, filter=ip or not ip, init=T, success=T])) 0.000000 MetaHookPre CallFunction(Log::__write, <frame>, (PacketFilter::LOG, [ts=1439244305.210087, node=bro, filter=ip or not ip, init=T, success=T]))
0.000000 MetaHookPre CallFunction(Log::add_default_filter, <frame>, (Cluster::LOG)) 0.000000 MetaHookPre CallFunction(Log::add_default_filter, <frame>, (Cluster::LOG))
0.000000 MetaHookPre CallFunction(Log::add_default_filter, <frame>, (Communication::LOG)) 0.000000 MetaHookPre CallFunction(Log::add_default_filter, <frame>, (Communication::LOG))
0.000000 MetaHookPre CallFunction(Log::add_default_filter, <frame>, (Conn::LOG)) 0.000000 MetaHookPre CallFunction(Log::add_default_filter, <frame>, (Conn::LOG))
@ -916,7 +918,7 @@
0.000000 MetaHookPre CallFunction(Log::create_stream, <frame>, (Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird])) 0.000000 MetaHookPre CallFunction(Log::create_stream, <frame>, (Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird]))
0.000000 MetaHookPre CallFunction(Log::create_stream, <frame>, (X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509])) 0.000000 MetaHookPre CallFunction(Log::create_stream, <frame>, (X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509]))
0.000000 MetaHookPre CallFunction(Log::create_stream, <frame>, (mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql])) 0.000000 MetaHookPre CallFunction(Log::create_stream, <frame>, (mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql]))
0.000000 MetaHookPre CallFunction(Log::write, <frame>, (PacketFilter::LOG, [ts=1429655378.868621, node=bro, filter=ip or not ip, init=T, success=T])) 0.000000 MetaHookPre CallFunction(Log::write, <frame>, (PacketFilter::LOG, [ts=1439244305.210087, node=bro, filter=ip or not ip, init=T, success=T]))
0.000000 MetaHookPre CallFunction(Notice::want_pp, <frame>, ()) 0.000000 MetaHookPre CallFunction(Notice::want_pp, <frame>, ())
0.000000 MetaHookPre CallFunction(PacketFilter::build, <frame>, ()) 0.000000 MetaHookPre CallFunction(PacketFilter::build, <frame>, ())
0.000000 MetaHookPre CallFunction(PacketFilter::combine_filters, <frame>, (ip or not ip, and, )) 0.000000 MetaHookPre CallFunction(PacketFilter::combine_filters, <frame>, (ip or not ip, and, ))
@ -1080,6 +1082,7 @@
0.000000 MetaHookPre LoadFile(./top-k.bif.bro)
0.000000 MetaHookPre LoadFile(./topk)
0.000000 MetaHookPre LoadFile(./types.bif.bro)
+0.000000 MetaHookPre LoadFile(./types.bro)
0.000000 MetaHookPre LoadFile(./unique)
0.000000 MetaHookPre LoadFile(./utils)
0.000000 MetaHookPre LoadFile(./utils-commands)
@@ -1099,6 +1102,7 @@
0.000000 MetaHookPre LoadFile(.<...>/raw)
0.000000 MetaHookPre LoadFile(.<...>/sqlite)
0.000000 MetaHookPre LoadFile(<...>/__load__.bro)
+0.000000 MetaHookPre LoadFile(<...>/__preload__.bro)
0.000000 MetaHookPre LoadFile(<...>/hooks.bro)
0.000000 MetaHookPre LoadFile(base/bif)
0.000000 MetaHookPre LoadFile(base/init-default.bro)
@@ -1399,7 +1403,7 @@
0.000000 | HookCallFunction Log::__create_stream(Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird])
0.000000 | HookCallFunction Log::__create_stream(X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509])
0.000000 | HookCallFunction Log::__create_stream(mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql])
-0.000000 | HookCallFunction Log::__write(PacketFilter::LOG, [ts=1429655378.868621, node=bro, filter=ip or not ip, init=T, success=T])
+0.000000 | HookCallFunction Log::__write(PacketFilter::LOG, [ts=1439244305.210087, node=bro, filter=ip or not ip, init=T, success=T])
0.000000 | HookCallFunction Log::add_default_filter(Cluster::LOG)
0.000000 | HookCallFunction Log::add_default_filter(Communication::LOG)
0.000000 | HookCallFunction Log::add_default_filter(Conn::LOG)
@@ -1505,7 +1509,7 @@
0.000000 | HookCallFunction Log::create_stream(Weird::LOG, [columns=<no value description>, ev=Weird::log_weird, path=weird])
0.000000 | HookCallFunction Log::create_stream(X509::LOG, [columns=<no value description>, ev=X509::log_x509, path=x509])
0.000000 | HookCallFunction Log::create_stream(mysql::LOG, [columns=<no value description>, ev=MySQL::log_mysql, path=mysql])
-0.000000 | HookCallFunction Log::write(PacketFilter::LOG, [ts=1429655378.868621, node=bro, filter=ip or not ip, init=T, success=T])
+0.000000 | HookCallFunction Log::write(PacketFilter::LOG, [ts=1439244305.210087, node=bro, filter=ip or not ip, init=T, success=T])
0.000000 | HookCallFunction Notice::want_pp()
0.000000 | HookCallFunction PacketFilter::build()
0.000000 | HookCallFunction PacketFilter::combine_filters(ip or not ip, and, )
@@ -1,6 +1,6 @@
# @TEST-REQUIRES: grep -q ENABLE_BROKER $BUILD/CMakeCache.txt
-# @TEST-EXEC: btest-bg-run master "bro -b -r $TRACES/wikipedia.trace %INPUT >out"
+# @TEST-EXEC: btest-bg-run master "bro -b %INPUT >out"
# @TEST-EXEC: btest-bg-wait 60
# @TEST-EXEC: TEST_DIFF_CANONIFIER=$SCRIPTS/diff-sort btest-diff master/out
@@ -1,9 +1,6 @@
# @TEST-EXEC: bro -r $TRACES/tunnels/false-teredo.pcap %INPUT >output
# @TEST-EXEC: test ! -e weird.log
# @TEST-EXEC: test ! -e dpd.log
-# @TEST-EXEC: bro -r $TRACES/tunnels/false-teredo.pcap %INPUT Tunnel::yielding_teredo_decapsulation=F >output
-# @TEST-EXEC: btest-diff weird.log
-# @TEST-EXEC: test ! -e dpd.log
# In the first case, there isn't any weird or protocol violation logged
# since the teredo analyzer recognizes that the DNS analyzer has confirmed
@@ -1,11 +1,7 @@
# @TEST-EXEC: bro -r $TRACES/tunnels/false-teredo.pcap base/frameworks/dpd base/protocols/tunnels protocols/conn/known-services Tunnel::delay_teredo_confirmation=T "Site::local_nets+={192.168.1.0/24}"
# @TEST-EXEC: test ! -e known_services.log
-# @TEST-EXEC: bro -b -r $TRACES/tunnels/false-teredo.pcap base/frameworks/dpd base/protocols/tunnels protocols/conn/known-services Tunnel::delay_teredo_confirmation=F "Site::local_nets+={192.168.1.0/24}"
-# @TEST-EXEC: btest-diff known_services.log
# The first case using Tunnel::delay_teredo_confirmation=T doesn't produce
# a known services.log since valid Teredo encapsulations from both endpoints
# of a connection is never witnessed and a protocol_confirmation never issued.
-# The second case issues protocol_confirmations more hastily and so bogus
-# entries in known-services.log are more likely to appear.