Mirror of https://github.com/zeek/zeek.git (synced 2025-10-02 22:58:20 +00:00)

commit 2c8a3fce49

    Merge remote-tracking branch 'origin/master' into topic/vladg/kerberos

    Conflicts:
        testing/btest/Baseline/core.print-bpf-filters/output2
        testing/btest/Baseline/scripts.policy.misc.dump-events/smtp-events.log

156 changed files with 3758 additions and 1614 deletions

CHANGES (159 lines changed)
@@ -1,4 +1,163 @@

2.3-376 | 2015-01-12 09:38:10 -0600

  * Improve documentation for connection_established event. (Jon Siwek)

2.3-375 | 2015-01-08 13:10:09 -0600

  * Increase minimum required CMake version to 2.8. (Jon Siwek)

2.3-374 | 2015-01-07 10:03:17 -0600

  * Improve documentation of the Intelligence Framework. (Daniel Thayer)

2.3-371 | 2015-01-06 09:58:09 -0600

  * Update/improve file mime type identification. (Seth Hall)

    - Change the default BOF buffer size to 3000 (was 1024).

    - Reorganized MS signatures into a separate file.

    - Remove all of the x-c detections; nearly all were false positives.

    - Improve TAR detections, removing the old, backup TAR detections.

    - Remove one of the x-elc detections that was too loose
      and caused many false positives.

    - Improved lots of the signatures and added new ones. (Seth Hall)

  * Add support for file reassembly in the file analysis framework.
    (Seth Hall, Jon Siwek)

    - The reassembly behavior can be modified per-file by enabling or
      disabling the reassembler and/or modifying the size of the
      reassembly buffer (see the sketch below).

    - Changed the file extraction analyzer to use stream-wise input to
      avoid issues with the chunk-wise approach not immediately
      triggering the file_new event due to mime-type detection delay.
      Before, early chunks frequently ended up lost. Extraction also
      will now explicitly NUL-fill gaps in the file instead of
      implicitly relying on pwrite to do it.
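    As a rough sketch of the two items above (not part of this commit; it
    assumes the default_file_bof_buffer_size option and the
    Files::enable_reassembly() / Files::set_reassembly_buffer_size()
    entry points that the new reassembly support provides):

	# Larger buffer for signature-based mime detection at the
	# beginning of each file, matching the new default noted above.
	redef default_file_bof_buffer_size = 3000;

	event file_new(f: fa_file)
		{
		# Per-file control: only reassemble files seen over HTTP and
		# cap the reassembly buffer at 1 MB.
		if ( f$source == "HTTP" )
			{
			Files::enable_reassembly(f);
			Files::set_reassembly_buffer_size(f, 1024 * 1024);
			}
		}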
2.3-349 | 2015-01-05 15:21:13 -0600

  * Fix race condition in unified2 file analyzer startup. (Jon Siwek)

2.3-348 | 2014-12-31 09:19:34 -0800

  * Changing Makefile's test-all to run test-all for broctl, which now
    executes trace-summary tests as well. (Robin Sommer)

2.3-345 | 2014-12-31 09:06:15 -0800

  * Correct a typo in the Notice framework doc. (Daniel Thayer)

2.3-343 | 2014-12-12 12:43:46 -0800

  * Fix PIA packet replay to deliver a copy of the IP header. This prevented
    one from writing a packet-wise analyzer that needs access to IP
    headers and can be attached to a connection via signature match.
    Addresses BIT-1298. (Jon Siwek)

2.3-338 | 2014-12-08 13:56:19 -0800

  * Add man page for Bro. (Raúl Benencia)

  * Updating doc baselines. (Robin Sommer)

2.3-334 | 2014-12-03 14:22:07 -0800

  * Fix compound assignment to require a proper L-value (see the example
    below). Addresses BIT-1295. (Jon Siwek)
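    A minimal illustration of the stricter check (not from the commit):

	event bro_init()
		{
		local n = 1;
		n += 2;          # fine: "n" is an assignable L-value
		# (n + 1) += 2;  # now rejected: the left-hand side is not an L-value
		}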
2.3-332 | 2014-12-03 14:14:11 -0800
|
||||
|
||||
* Make using local IDs in @if directives an error. Addresses
|
||||
BIT-1296. (Jon Siwek)
|
||||
|
||||
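    For example (illustrative only): steering @if with a global constant
    remains fine, while referring to a local variable inside the condition
    is what now produces an error:

	const use_extra_logging = F &redef;

	@if ( use_extra_logging )
	# ... load additional scripts or define extra handlers here ...
	@endif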
2.3-330 | 2014-12-03 14:10:39 -0800

  * Fix some "make doc" warnings and update some doc tests. (Daniel
    Thayer)

2.3-328 | 2014-12-02 08:13:10 -0500

  * Update windows-version-detection.bro to add support for
    Windows 10. (Michal Purzynski)

2.3-326 | 2014-12-01 12:10:27 -0600

  * BIFScanner: fix invalid characters in generated preprocessor macros.
    (Hilko Bengen)

  * BIT-1294: fix exec.bro from mutating Input::end_of_data event
    parameters. (Johanna Amann)

  * Add/invoke "distclean" for testing directories. (Raúl Benencia)

  * Delete prebuilt Python bytecode files from git. (Jon Siwek)

  * Add Windows detection based on CryptoAPI HTTP traffic as a software
    framework policy script. (Vlad Grigorescu)

2.3-316 | 2014-11-25 17:35:06 -0800

  * Make the SSL analyzer skip further processing once encountering
    situations which are very probably non-recoverable. (Johanna
    Amann)

2.3-313 | 2014-11-25 14:27:07 -0800

  * Make SSL v2 protocol tests more strict. In their former state they
    sometimes triggered on HTTP traffic over port 443. Found by Michał
    Purzyński. (Johanna Amann)

  * Fix X509 analyzer to correctly return ECDSA as the key_type for
    ECDSA certs. Bug found by Michał Purzyński. (Johanna Amann)

2.3-310 | 2014-11-19 10:56:59 -0600

  * Disable verbose bison output. (Jon Siwek)

2.3-309 | 2014-11-18 12:17:53 -0800

  * New decompose_uri() function in base/utils/urls that splits a URI
    into its pieces (usage sketched below). (Anthony Kasza)
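    A rough usage sketch (the URL is made up; the returned record's fields
    come from base/utils/urls):

	@load base/utils/urls

	event bro_init()
		{
		local u = decompose_uri("http://www.example.com:8080/path/index.html?q=1");
		# The returned record carries the URI broken into its pieces
		# (scheme, host, port, path, file name, parameters, ...).
		print u;
		}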
2.3-305 | 2014-11-18 11:09:04 -0800

  * Improve coercion of &default expressions. Addresses BIT-1288. (Jon
    Siwek)

2.3-303 | 2014-11-18 10:53:04 -0800

  * For DH key exchanges, use p as the parameter for weak key
    exchanges. (Johanna Amann)

2.3-301 | 2014-11-11 13:47:27 -0800

  * Add builtin function enum_to_int() that converts an enum into an
    integer (usage sketched below). (Christian Struck)
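    A one-line usage sketch (Notice::ACTION_LOG is just a convenient
    built-in enum value to convert):

	event bro_init()
		{
		print enum_to_int(Notice::ACTION_LOG);
		}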
2.3-297 | 2014-11-11 11:50:47 -0800

  * Removing method from SSL analyzer that's no longer used. (Robin
    Sommer)

2.3-296 | 2014-11-11 11:42:38 -0800

  * A new analyzer parsing the MySQL wire protocol. Activity gets
    logged into mysql.log. Supports protocol versions 9 and 10. (Vlad
    Grigorescu)

2.3-280 | 2014-11-05 09:46:33 -0500

  * Add Windows detection based on CryptoAPI HTTP traffic as a
    software framework policy script. (Vlad Grigorescu)

2.3-278 | 2014-11-03 18:55:18 -0800

  * Add new curves from draft-ietf-tls-negotiated-ff-dhe to SSL
CMakeLists.txt

@@ -2,7 +2,7 @@ project(Bro C CXX)

# When changing the minimum version here, also adapt
# aux/bro-aux/plugin-support/skeleton/CMakeLists.txt
cmake_minimum_required(VERSION 2.6.3 FATAL_ERROR)
cmake_minimum_required(VERSION 2.8 FATAL_ERROR)

include(cmake/CommonCMakeConfig.cmake)

@@ -15,6 +15,11 @@ if (NOT BRO_SCRIPT_INSTALL_PATH)

set(BRO_SCRIPT_INSTALL_PATH ${BRO_ROOT_DIR}/share/bro)
endif ()

if (NOT BRO_MAN_INSTALL_PATH)
    # set the default Bro man page installation path (user did not specify one)
    set(BRO_MAN_INSTALL_PATH ${BRO_ROOT_DIR}/share/man)
endif ()

# sanitize the Bro script install directory into an absolute path
# (CMake is confused by ~ as a representation of home directory)
get_filename_component(BRO_SCRIPT_INSTALL_PATH ${BRO_SCRIPT_INSTALL_PATH}

@@ -175,6 +180,7 @@ include_directories(${CMAKE_CURRENT_BINARY_DIR})

add_subdirectory(src)
add_subdirectory(scripts)
add_subdirectory(doc)
add_subdirectory(man)

include(CheckOptionalBuildSources)
Makefile (3 lines changed)

@@ -48,12 +48,13 @@ bindist:

distclean:
	rm -rf $(BUILD)
	$(MAKE) -C testing $@

test:
	@( cd testing && make )

test-all: test
	test -d aux/broctl && ( cd aux/broctl && make test )
	test -d aux/broctl && ( cd aux/broctl && make test-all )
	test -d aux/btest && ( cd aux/btest && make test )
	test -d aux/bro-aux && ( cd aux/bro-aux && make test )
	test -d aux/plugins && ( cd aux/plugins && make test-all )
NEWS (23 lines changed)

@@ -25,11 +25,34 @@ New Functionality

  See https://www.bro.org/sphinx-git/devel/plugins.html for more
  information on writing plugins.

- Bro now has support for the MySQL wire protocol. Activity gets
  logged into mysql.log.

- Bro's file analysis now supports reassembly of files that are not
  transferred/seen sequentially.

Changed Functionality
---------------------

- bro-cut has been rewritten in C, and is hence much faster.

- File analysis

  * Removed ``fa_file`` record's ``mime_type`` and ``mime_types``
    fields. The events ``file_mime_type`` and ``file_mime_types``
    have been added which contain the same information. The
    ``mime_type`` field of ``Files::Info`` also still has this info.

  * Removed ``Files::add_analyzers_for_mime_type`` function.

  * Removed ``offset`` parameter of the ``file_extraction_limit``
    event. Since file extraction now internally depends on file
    reassembly for non-sequential files, "offset" can be obtained
    with other information already available -- adding together the
    ``seen_bytes`` and ``missed_bytes`` fields of the ``fa_file``
    record gives how many bytes have been written so far (i.e. the
    "offset"); see the sketch after this list.
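A rough sketch of that bookkeeping, reading only the two ``fa_file`` fields
named above (illustrative, not part of the release notes)::

	event file_state_remove(f: fa_file)
		{
		# seen_bytes + missed_bytes = how far into the file we are,
		# i.e. the old "offset" value.
		local offset = f$seen_bytes + f$missed_bytes;
		print fmt("%s: %d bytes accounted for", f$id, offset);
		}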
Bro 2.3
=======
VERSION (2 lines changed)

@@ -1 +1 @@
2.3-278
2.3-376

Submodule pointer updates:

@@ -1 +1 @@
Subproject commit 977654dc51ab08a2afde32241f108cdb4a581d8f
Subproject commit 0b713c027d3efaaca50e5df995c02656175573cd

@@ -1 +1 @@
Subproject commit acb8fbe8e7bc6ace5135fb73dca8e29432cdc1ca
Subproject commit d43cc790e5b8709b5e032e52ad0e00936494739b

@@ -1 +1 @@
Subproject commit 39e865dec9611b9b53b609cbc8df519cebae0a1e
Subproject commit 8c9b87bc73e1ddaa304e3d89028c1e7b95d37a91

@@ -1 +1 @@
Subproject commit 1efa4d10f943351efea96def68e598b053fd217a
Subproject commit d67d89aaee32ad5edb9068db55d1310c2f36970a

Two binary files changed (contents not shown).
(example script: file_new handler replaced by the new file_mime_type event)

@@ -1,7 +1,7 @@
event file_new(f: fa_file)
event file_mime_type(f: fa_file, mime_type: string)
	{
	print "new file", f$id;
	if ( f?$mime_type && f$mime_type == "text/plain" )
	if ( mime_type == "text/plain" )
		Files::add_analyzer(f, Files::ANALYZER_MD5);
	}
(Intelligence Framework documentation)

@@ -14,32 +14,35 @@ consume that data, make it available for matching, and provide

infrastructure around improving performance, memory utilization, and
generally making all of this easier.

Data in the Intelligence Framework is the atomic piece of intelligence
Data in the Intelligence Framework is an atomic piece of intelligence
such as an IP address or an e-mail address along with a suite of
metadata about it such as a freeform source field, a freeform
descriptive field and a URL which might lead to more information about
the specific item. The metadata in the default scripts has been
deliberately kept minimal so that the community can find the
appropriate fields that need added by writing scripts which extend the
appropriate fields that need to be added by writing scripts which extend the
base record using the normal record extension mechanism.

Quick Start
-----------

Load the package of scripts that sends data into the Intelligence
Framework to be checked by loading this script in local.bro::

	@load policy/frameworks/intel/seen

Refer to the "Loading Intelligence" section below to see the format
for Intelligence Framework text files, then load those text files with
this line in local.bro::

	redef Intel::read_files += { "/somewhere/yourdata.txt" };

The data itself only needs to reside on the manager if running in a
The text files need to reside only on the manager if running in a
cluster.

Add the following line to local.bro in order to load the scripts
that send "seen" data into the Intelligence Framework to be checked against
the loaded intelligence data::

	@load policy/frameworks/intel/seen

Intelligence data matches will be logged to the intel.log file.

Architecture
------------

@@ -58,8 +61,10 @@ manager is the only node that needs the intelligence data. The

intelligence framework has distribution mechanisms which will push
data out to all of the nodes that need it.

Here is an example of the intelligence data format. Note that all
whitespace field separators are literal tabs and fields containing only a
Here is an example of the intelligence data format (note that there will be
additional fields if you are using CIF intelligence data or if you are
using the policy/frameworks/intel/do_notice script). Note that all fields
must be separated by a single tab character and fields containing only a
hyphen are considered to be null values. ::

	#fields	indicator	indicator_type	meta.source	meta.desc	meta.url
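For illustration only, two made-up rows in that format (columns separated by
literal tabs; a lone hyphen means the field is empty)::

	1.2.3.4	Intel::ADDR	source1	Example scanner	http://example.com/more-info
	badhost.example.org	Intel::DOMAIN	source1	-	-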
@@ -69,8 +74,21 @@ hyphen are considered to be null values. ::

For a list of all built-in `indicator_type` values, please refer to the
documentation of :bro:see:`Intel::Type`.

To load the data once files are created, use the following example
code to define files to load with your own file names of course::
Note that if you are using data from the Collective Intelligence Framework,
then you will need to add the following line to your local.bro in order
to support additional metadata fields used by CIF::

	@load policy/integration/collective-intel

There is a simple mechanism to raise a Bro notice (of type Intel::Notice)
for user-specified intelligence matches. To use this feature, add the
following line to local.bro in order to support additional metadata fields
(documented in the :bro:see:`Intel::MetaData` record)::

	@load policy/frameworks/intel/do_notice

To load the data once the files are created, use the following example
to specify which files to load (with your own file names of course)::

	redef Intel::read_files += {
		"/somewhere/feed1.txt",

@@ -85,24 +103,23 @@ Seen Data

When some bit of data is extracted (such as an email address in the
"From" header in a message over SMTP), the Intelligence Framework
needs to be informed that this data was discovered and it's presence
should be checked within the intelligence data set. This is
accomplished through the :bro:see:`Intel::seen` function.
needs to be informed that this data was discovered so that its presence
will be checked within the loaded intelligence data. This is
accomplished through the :bro:see:`Intel::seen` function, however
typically users won't need to work with this function due to the
scripts included with Bro that will call this function.

Typically users won't need to work with this function due to built in
hook scripts that Bro ships with that will "see" data and send it into
the intelligence framework. A user may only need to load the entire
package of hook scripts as a module or pick and choose specific
scripts to load. Keep in mind that as more data is sent into the
To load all of the scripts included with Bro for sending "seen" data to
the intelligence framework, just add this line to local.bro::

	@load policy/frameworks/intel/seen

Alternatively, specific scripts in that directory can be loaded.
Keep in mind that as more data is sent into the
intelligence framework, the CPU load consumed by Bro will increase
depending on how many times the :bro:see:`Intel::seen` function is
being called which is heavily traffic dependent.

The full package of hook scripts that Bro ships with for sending this
"seen" data into the intelligence framework can be loading by adding
this line to local.bro::

	@load policy/frameworks/intel/seen

Intelligence Matches
********************

@@ -111,6 +128,7 @@ Against all hopes, most networks will eventually have a hit on

intelligence data which could indicate a possible compromise or other
unwanted activity. The Intelligence Framework provides an event that
is generated whenever a match is discovered named :bro:see:`Intel::match`.

Due to design restrictions placed upon
the intelligence framework, there is no assurance as to where this
event will be generated. It could be generated on the worker where

@@ -119,3 +137,7 @@ handled, only the data given as event arguments to the event can be

assured since the host where the data was seen may not be where
``Intel::match`` is handled.

Intelligence matches are logged to the intel.log file. For a description of
each field in that file, see the documentation for the :bro:see:`Intel::Info`
record.
(Notice framework documentation)

@@ -271,7 +271,7 @@ script that is generating the notice has indicated to the notice framework how

to identify notices that are intrinsically the same. Identification of these
"intrinsically duplicate" notices is implemented with an optional field in
:bro:see:`Notice::Info` records named ``$identifier`` which is a simple string.
If the ``$identifier`` and ``$type`` fields are the same for two notices, the
If the ``$identifier`` and ``$note`` fields are the same for two notices, the
notice framework actually considers them to be the same thing and can use that
information to suppress duplicates for a configurable period of time.
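For illustration, a notice raised with an explicit identifier so that repeats
for the same originator are suppressed (the module name and notice type below
are made up)::

	module Example;

	export {
		redef enum Notice::Type += { Suspicious_Host };
	}

	event connection_established(c: connection)
		{
		NOTICE([$note=Suspicious_Host,
		        $msg="contact with a host we care about",
		        $conn=c,
		        # same $note + same $identifier => suppressed as duplicates
		        $identifier=cat(c$id$orig_h)]);
		}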
(file extraction example script)

@@ -7,18 +7,15 @@ global mime_to_ext: table[string] of string = {

	["text/html"] = "html",
};

event file_new(f: fa_file)
event file_mime_type(f: fa_file, mime_type: string)
	{
	if ( f$source != "HTTP" )
		return;

	if ( ! f?$mime_type )
	if ( mime_type !in mime_to_ext )
		return;

	if ( f$mime_type !in mime_to_ext )
		return;

	local fname = fmt("%s-%s.%s", f$source, f$id, mime_to_ext[f$mime_type]);
	local fname = fmt("%s-%s.%s", f$source, f$id, mime_to_ext[mime_type]);
	print fmt("Extracting file %s", fname);
	Files::add_analyzer(f, Files::ANALYZER_EXTRACT, [$extract_filename=fname]);
	}
@ -35,7 +35,7 @@ before you begin:
|
|||
|
||||
To build Bro from source, the following additional dependencies are required:
|
||||
|
||||
* CMake 2.6.3 or greater (http://www.cmake.org)
|
||||
* CMake 2.8 or greater (http://www.cmake.org)
|
||||
* Make
|
||||
* C/C++ compiler
|
||||
* SWIG (http://www.swig.org)
|
||||
|
|
(logging documentation)

@@ -113,7 +113,7 @@ default, including:

As you can see, some log files are specific to a particular protocol,
while others aggregate information across different types of activity.
For a complete list of log files and a description of its purpose,
see :doc:`List of Log Files <../script-reference/list-of-log-files>`.
see :doc:`Log Files <../script-reference/log-files>`.

.. _bro-cut:
(file analysis documentation)

@@ -103,9 +103,9 @@ In the ``file_hash`` event handler, there is an ``if`` statement that is used

to check for the correct type of hash, in this case
a SHA1 hash. It also checks for a mime type we've defined as
being of interest as defined in the constant ``match_file_types``.
The comparison is made against the expression ``f$mime_type``, which uses
The comparison is made against the expression ``f$info$mime_type``, which uses
the ``$`` dereference operator to check the value ``mime_type``
inside the variable ``f``. If the entire expression evaluates to true,
inside the variable ``f$info``. If the entire expression evaluates to true,
then a helper function is called to do the rest of the work. In that
function, a local variable is defined to hold a string comprised of
the SHA1 hash concatenated with ``.malware.hash.cymru.com``; this
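As a sketch of the check this paragraph walks through (assuming, as in that
script, that ``match_file_types`` is a pattern constant; the helper call is
elided because the text introduces it afterwards)::

	event file_hash(f: fa_file, kind: string, hash: string)
		{
		if ( kind == "sha1" && f?$info && f$info?$mime_type &&
		     match_file_types in f$info$mime_type )
			{
			# ... hand the SHA1 hash to the helper function described next
			}
		}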
man/CMakeLists.txt (new file, 5 lines)

@@ -0,0 +1,5 @@
install(DIRECTORY . DESTINATION ${BRO_MAN_INSTALL_PATH}/man8 FILES_MATCHING
        PATTERN "*.8"
)
164
man/bro.8
Normal file
164
man/bro.8
Normal file
|
@ -0,0 +1,164 @@
|
|||
.TH BRO "8" "November 2014" "bro" "System Administration Utilities"
|
||||
.SH NAME
|
||||
bro \- passive network traffic analyzer
|
||||
.SH SYNOPSIS
|
||||
.B bro
|
||||
\/\fP [\fIoptions\fR] [\fIfile\fR ...]
|
||||
.SH DESCRIPTION
|
||||
Bro is primarily a security monitor that inspects all traffic on a link in
|
||||
depth for signs of suspicious activity. More generally, however, Bro
|
||||
supports a wide range of traffic analysis tasks even outside of the
|
||||
security domain, including performance measurements and helping with
|
||||
trouble-shooting.
|
||||
|
||||
Bro comes with built-in functionality for a range of analysis and detection
|
||||
tasks, including detecting malware by interfacing to external registries,
|
||||
reporting vulnerable versions of software seen on the network, identifying
|
||||
popular web applications, detecting SSH brute-forcing, validating SSL
|
||||
certificate chains, among others.
|
||||
.SH OPTIONS
|
||||
.TP
|
||||
.B <file>
|
||||
policy file, or read stdin
|
||||
.TP
|
||||
\fB\-a\fR,\ \-\-parse\-only
|
||||
exit immediately after parsing scripts
|
||||
.TP
|
||||
\fB\-b\fR,\ \-\-bare\-mode
|
||||
don't load scripts from the base/ directory
|
||||
.TP
|
||||
\fB\-d\fR,\ \-\-debug\-policy
|
||||
activate policy file debugging
|
||||
.TP
|
||||
\fB\-e\fR,\ \-\-exec <bro code>
|
||||
augment loaded policies by given code
|
||||
.TP
|
||||
\fB\-f\fR,\ \-\-filter <filter>
|
||||
tcpdump filter
|
||||
.TP
|
||||
\fB\-g\fR,\ \-\-dump\-config
|
||||
dump current config into .state dir
|
||||
.TP
|
||||
\fB\-h\fR,\ \-\-help|\-?
|
||||
command line help
|
||||
.TP
|
||||
\fB\-i\fR,\ \-\-iface <interface>
|
||||
read from given interface
|
||||
.TP
|
||||
\fB\-p\fR,\ \-\-prefix <prefix>
|
||||
add given prefix to policy file resolution
|
||||
.TP
|
||||
\fB\-r\fR,\ \-\-readfile <readfile>
|
||||
read from given tcpdump file
|
||||
.TP
|
||||
\fB\-y\fR,\ \-\-flowfile <file>[=<ident>]
|
||||
read from given flow file
|
||||
.TP
|
||||
\fB\-Y\fR,\ \-\-netflow <ip>:<prt>[=<id>]
|
||||
read flow from socket
|
||||
.TP
|
||||
\fB\-s\fR,\ \-\-rulefile <rulefile>
|
||||
read rules from given file
|
||||
.TP
|
||||
\fB\-t\fR,\ \-\-tracefile <tracefile>
|
||||
activate execution tracing
|
||||
.TP
|
||||
\fB\-w\fR,\ \-\-writefile <writefile>
|
||||
write to given tcpdump file
|
||||
.TP
|
||||
\fB\-v\fR,\ \-\-version
|
||||
print version and exit
|
||||
.TP
|
||||
\fB\-x\fR,\ \-\-print\-state <file.bst>
|
||||
print contents of state file
|
||||
.TP
|
||||
\fB\-z\fR,\ \-\-analyze <analysis>
|
||||
run the specified policy file analysis
|
||||
.TP
|
||||
\fB\-C\fR,\ \-\-no\-checksums
|
||||
ignore checksums
|
||||
.TP
|
||||
\fB\-D\fR,\ \-\-dfa\-size <size>
|
||||
DFA state cache size
|
||||
.TP
|
||||
\fB\-F\fR,\ \-\-force\-dns
|
||||
force DNS
|
||||
.TP
|
||||
\fB\-I\fR,\ \-\-print\-id <ID name>
|
||||
print out given ID
|
||||
.TP
|
||||
\fB\-K\fR,\ \-\-md5\-hashkey <hashkey>
|
||||
set key for MD5\-keyed hashing
|
||||
.TP
|
||||
\fB\-L\fR,\ \-\-rule\-benchmark
|
||||
benchmark for rules
|
||||
.TP
|
||||
\fB\-N\fR,\ \-\-print\-plugins
|
||||
print available plugins and exit (\fB\-NN\fR for verbose)
|
||||
.TP
|
||||
\fB\-O\fR,\ \-\-optimize
|
||||
optimize policy script
|
||||
.TP
|
||||
\fB\-P\fR,\ \-\-prime\-dns
|
||||
prime DNS
|
||||
.TP
|
||||
\fB\-Q\fR,\ \-\-time
|
||||
print execution time summary to stderr
|
||||
.TP
|
||||
\fB\-R\fR,\ \-\-replay <events.bst>
|
||||
replay events
|
||||
.TP
|
||||
\fB\-S\fR,\ \-\-debug\-rules
|
||||
enable rule debugging
|
||||
.TP
|
||||
\fB\-T\fR,\ \-\-re\-level <level>
|
||||
set 'RE_level' for rules
|
||||
.TP
|
||||
\fB\-U\fR,\ \-\-status\-file <file>
|
||||
Record process status in file
|
||||
.TP
|
||||
\fB\-W\fR,\ \-\-watchdog
|
||||
activate watchdog timer
|
||||
.TP
|
||||
\fB\-X\fR,\ \-\-broxygen
|
||||
generate documentation based on config file
|
||||
.TP
|
||||
\fB\-\-pseudo\-realtime[=\fR<speedup>]
|
||||
enable pseudo\-realtime for performance evaluation (default 1)
|
||||
.TP
|
||||
\fB\-\-load\-seeds\fR <file>
|
||||
load seeds from given file
|
||||
.TP
|
||||
\fB\-\-save\-seeds\fR <file>
|
||||
save seeds to given file
|
||||
.SH ENVIRONMENT
|
||||
.TP
|
||||
.B BROPATH
|
||||
file search path
|
||||
.TP
|
||||
.B BRO_PLUGIN_PATH
|
||||
plugin search path
|
||||
.TP
|
||||
.B BRO_PLUGIN_ACTIVATE
|
||||
plugins to always activate
|
||||
.TP
|
||||
.B BRO_PREFIXES
|
||||
prefix list
|
||||
.TP
|
||||
.B BRO_DNS_FAKE
|
||||
disable DNS lookups
|
||||
.TP
|
||||
.B BRO_SEED_FILE
|
||||
file to load seeds from
|
||||
.TP
|
||||
.B BRO_LOG_SUFFIX
|
||||
ASCII log file extension
|
||||
.TP
|
||||
.B BRO_PROFILER_FILE
|
||||
Output file for script execution statistics
|
||||
.TP
|
||||
.B BRO_DISABLE_BROXYGEN
|
||||
Disable Broxygen documentation support
|
||||
.SH AUTHOR
|
||||
.B bro
|
||||
was written by The Bro Project <info@bro.org>.
|
|
@ -71,11 +71,50 @@ global classification_map: table[count] of string;
|
|||
global sid_map: table[count] of string;
|
||||
global gen_map: table[count] of string;
|
||||
|
||||
global num_classification_map_reads = 0;
|
||||
global num_sid_map_reads = 0;
|
||||
global num_gen_map_reads = 0;
|
||||
global watching = F;
|
||||
|
||||
# For reading in config files.
|
||||
type OneLine: record {
|
||||
line: string;
|
||||
};
|
||||
|
||||
function mappings_initialized(): bool
|
||||
{
|
||||
return num_classification_map_reads > 0 &&
|
||||
num_sid_map_reads > 0 &&
|
||||
num_gen_map_reads > 0;
|
||||
}
|
||||
|
||||
function start_watching()
|
||||
{
|
||||
if ( watching )
|
||||
return;
|
||||
|
||||
watching = T;
|
||||
|
||||
if ( watch_dir != "" )
|
||||
{
|
||||
Dir::monitor(watch_dir, function(fname: string)
|
||||
{
|
||||
Input::add_analysis([$source=fname,
|
||||
$reader=Input::READER_BINARY,
|
||||
$mode=Input::STREAM,
|
||||
$name=fname]);
|
||||
}, 10secs);
|
||||
}
|
||||
|
||||
if ( watch_file != "" )
|
||||
{
|
||||
Input::add_analysis([$source=watch_file,
|
||||
$reader=Input::READER_BINARY,
|
||||
$mode=Input::STREAM,
|
||||
$name=watch_file]);
|
||||
}
|
||||
}
|
||||
|
||||
function create_info(ev: IDSEvent): Info
|
||||
{
|
||||
local info = Info($ts=ev$ts,
|
||||
|
@ -136,11 +175,33 @@ event Unified2::read_classification_line(desc: Input::EventDescription, tpe: Inp
|
|||
}
|
||||
}
|
||||
|
||||
event Input::end_of_data(name: string, source: string)
|
||||
{
|
||||
if ( name == classification_config )
|
||||
++num_classification_map_reads;
|
||||
else if ( name == sid_msg )
|
||||
++num_sid_map_reads;
|
||||
else if ( name == gen_msg )
|
||||
++num_gen_map_reads;
|
||||
else
|
||||
return;
|
||||
|
||||
if ( watching )
|
||||
return;
|
||||
|
||||
if ( mappings_initialized() )
|
||||
start_watching();
|
||||
}
|
||||
|
||||
event bro_init() &priority=5
|
||||
{
|
||||
Log::create_stream(Unified2::LOG, [$columns=Info, $ev=log_unified2]);
|
||||
|
||||
if ( sid_msg != "" )
|
||||
if ( sid_msg == "" )
|
||||
{
|
||||
num_sid_map_reads = 1;
|
||||
}
|
||||
else
|
||||
{
|
||||
Input::add_event([$source=sid_msg,
|
||||
$reader=Input::READER_RAW,
|
||||
|
@ -151,7 +212,11 @@ event bro_init() &priority=5
|
|||
$ev=Unified2::read_sid_msg_line]);
|
||||
}
|
||||
|
||||
if ( gen_msg != "" )
|
||||
if ( gen_msg == "" )
|
||||
{
|
||||
num_gen_map_reads = 1;
|
||||
}
|
||||
else
|
||||
{
|
||||
Input::add_event([$source=gen_msg,
|
||||
$name=gen_msg,
|
||||
|
@ -162,7 +227,11 @@ event bro_init() &priority=5
|
|||
$ev=Unified2::read_gen_msg_line]);
|
||||
}
|
||||
|
||||
if ( classification_config != "" )
|
||||
if ( classification_config == "" )
|
||||
{
|
||||
num_classification_map_reads = 1;
|
||||
}
|
||||
else
|
||||
{
|
||||
Input::add_event([$source=classification_config,
|
||||
$name=classification_config,
|
||||
|
@ -173,24 +242,8 @@ event bro_init() &priority=5
|
|||
$ev=Unified2::read_classification_line]);
|
||||
}
|
||||
|
||||
if ( watch_dir != "" )
|
||||
{
|
||||
Dir::monitor(watch_dir, function(fname: string)
|
||||
{
|
||||
Input::add_analysis([$source=fname,
|
||||
$reader=Input::READER_BINARY,
|
||||
$mode=Input::STREAM,
|
||||
$name=fname]);
|
||||
}, 10secs);
|
||||
}
|
||||
|
||||
if ( watch_file != "" )
|
||||
{
|
||||
Input::add_analysis([$source=watch_file,
|
||||
$reader=Input::READER_BINARY,
|
||||
$mode=Input::STREAM,
|
||||
$name=watch_file]);
|
||||
}
|
||||
if ( mappings_initialized() )
|
||||
start_watching();
|
||||
}
|
||||
|
||||
event file_new(f: fa_file)
|
||||
|
|
|
@ -1,2 +1,3 @@
|
|||
@load-sigs ./general
|
||||
@load-sigs ./msoffice
|
||||
@load-sigs ./libmagic
|
||||
|
|
|
@ -1,16 +1,137 @@
|
|||
# General purpose file magic signatures.
|
||||
|
||||
signature file-plaintext {
|
||||
file-magic /([[:print:][:space:]]{10})/
|
||||
file-magic /^([[:print:][:space:]]{10})/
|
||||
file-mime "text/plain", -20
|
||||
}
|
||||
|
||||
signature file-tar {
|
||||
file-magic /([[:print:]\x00]){100}(([[:digit:]\x00\x20]){8}){3}/
|
||||
file-mime "application/x-tar", 150
|
||||
file-magic /^[[:print:]\x00]{100}([[:digit:]\x20]{7}\x00){3}([[:digit:]\x20]{11}\x00){2}([[:digit:]\x00\x20]{7}[\x20\x00])[0-7\x00]/
|
||||
file-mime "application/x-tar", 100
|
||||
}
|
||||
|
||||
signature file-zip {
|
||||
file-mime "application/zip", 10
|
||||
file-magic /^PK\x03\x04.{2}/
|
||||
}
|
||||
|
||||
signature file-jar {
|
||||
file-mime "application/java-archive", 100
|
||||
file-magic /^PK\x03\x04.{1,200}\x14\x00..META-INF\/MANIFEST\.MF/
|
||||
}
|
||||
|
||||
signature file-java-applet {
|
||||
file-magic /^\xca\xfe\xba\xbe...[\x2e-\x34]/
|
||||
file-mime "application/x-java-applet", 71
|
||||
}
|
||||
|
||||
# Shockwave flash
|
||||
signature file-swf {
|
||||
file-magic /(F|C|Z)WS/
|
||||
file-magic /^(F|C|Z)WS/
|
||||
file-mime "application/x-shockwave-flash", 60
|
||||
}
|
||||
}
|
||||
|
||||
# Microsoft Outlook's Transport Neutral Encapsulation Format
|
||||
signature file-tnef {
|
||||
file-magic /^\x78\x9f\x3e\x22/
|
||||
file-mime "application/vnd.ms-tnef", 100
|
||||
}
|
||||
|
||||
# Mac OS X DMG files
|
||||
signature file-dmg {
|
||||
file-magic /^(\x78\x01\x73\x0D\x62\x62\x60|\x78\xDA\x63\x60\x18\x05|\x78\x01\x63\x60\x18\x05|\x78\xDA\x73\x0D|\x78[\x01\xDA]\xED[\xD0-\xD9])/
|
||||
file-mime "application/x-dmg", 100
|
||||
}
|
||||
|
||||
# Mac OS X Mach-O executable
|
||||
signature file-mach-o {
|
||||
file-magic /^[\xce\xcf]\xfa\xed\xfe/
|
||||
file-mime "application/x-mach-o-executable", 100
|
||||
}
|
||||
|
||||
# Mac OS X Universal Mach-O executable
|
||||
signature file-mach-o-universal {
|
||||
file-magic /^\xca\xfe\xba\xbe..\x00[\x01-\x14]/
|
||||
file-mime "application/x-mach-o-executable", 100
|
||||
}
|
||||
|
||||
# XAR (eXtensible ARchive) format.
|
||||
# Mac OS X uses this for the .pkg format.
|
||||
signature file-xar {
|
||||
file-magic /^xar\!/
|
||||
file-mime "application/x-xar", 100
|
||||
}
|
||||
|
||||
signature file-pkcs7 {
|
||||
file-magic /^MIME-Version:.*protocol=\"application\/pkcs7-signature\"/
|
||||
file-mime "application/pkcs7-signature", 100
|
||||
}
|
||||
|
||||
# Concatenated X.509 certificates in textual format.
|
||||
signature file-pem {
|
||||
file-magic /^-----BEGIN CERTIFICATE-----/
|
||||
file-mime "application/x-pem"
|
||||
}
|
||||
|
||||
# Java Web Start file.
|
||||
signature file-jnlp {
|
||||
file-magic /^\<jnlp\x20/
|
||||
file-mime "application/x-java-jnlp-file", 100
|
||||
}
|
||||
|
||||
signature file-ico {
|
||||
file-magic /^\x00\x00\x01\x00/
|
||||
file-mime "image/x-icon", 70
|
||||
}
|
||||
|
||||
signature file-cur {
|
||||
file-magic /^\x00\x00\x02\x00/
|
||||
file-mime "image/x-cursor", 70
|
||||
}
|
||||
|
||||
signature file-pcap {
|
||||
file-magic /^(\xa1\xb2\xc3\xd4|\xd4\xc3\xb2\xa1)/
|
||||
file-mime "application/vnd.tcpdump.pcap", 70
|
||||
}
|
||||
|
||||
signature file-pcap-ng {
|
||||
file-magic /^\x0a\x0d\x0d\x0a.{4}(\x1a\x2b\x3c\x4d|\x4d\x3c\x2b\x1a)/
|
||||
file-mime "application/vnd.tcpdump.pcap", 100
|
||||
}
|
||||
|
||||
signature file-shellscript {
|
||||
file-mime "text/x-shellscript", 250
|
||||
file-magic /^\x23\x21[^\n]{1,15}bin\/(env[[:space:]]+)?(ba|tc|c|z|fa|ae|k)?sh/
|
||||
}
|
||||
|
||||
signature file-perl {
|
||||
file-magic /^\x23\x21[^\n]{1,15}bin\/(env[[:space:]]+)?perl/
|
||||
file-mime "text/x-perl", 60
|
||||
}
|
||||
|
||||
signature file-ruby {
|
||||
file-magic /^\x23\x21[^\n]{1,15}bin\/(env[[:space:]]+)?ruby/
|
||||
file-mime "text/x-ruby", 60
|
||||
}
|
||||
|
||||
signature file-python {
|
||||
file-magic /^\x23\x21[^\n]{1,15}bin\/(env[[:space:]]+)?python/
|
||||
file-mime "text/x-python", 60
|
||||
}
|
||||
|
||||
signature file-php {
|
||||
file-magic /^.*<\?php/
|
||||
file-mime "text/x-php", 40
|
||||
}
|
||||
|
||||
# Stereolithography ASCII format
|
||||
signature file-stl-ascii {
|
||||
file-magic /^solid\x20/
|
||||
file-mime "application/sla", 10
|
||||
}
|
||||
|
||||
# Sketchup model file
|
||||
signature file-skp {
|
||||
file-magic /^\xFF\xFE\xFF\x0E\x53\x00\x6B\x00\x65\x00\x74\x00\x63\x00\x68\x00\x55\x00\x70\x00\x20\x00\x4D\x00\x6F\x00\x64\x00\x65\x00\x6C\x00/
|
||||
file-mime "application/skp", 100
|
||||
}
|
||||
|
|
|
@ -7,42 +7,18 @@
|
|||
# The instrumented version of the `file` command used to generate these
|
||||
# is located at: https://github.com/jsiwek/file/tree/bro-signatures.
|
||||
|
||||
# >2080 string,=Foglio di lavoro Microsoft Exce (len=31), ["%s"], swap_endian=0
|
||||
signature file-magic-auto0 {
|
||||
file-mime "application/vnd.ms-excel", 340
|
||||
file-magic /(.{2080})(Foglio di lavoro Microsoft Exce)/
|
||||
}
|
||||
|
||||
# >2 string,=---BEGIN PGP PUBLIC KEY BLOCK- (len=30), ["PGP public key block"], swap_endian=0
|
||||
signature file-magic-auto1 {
|
||||
file-mime "application/pgp-keys", 330
|
||||
file-magic /(.{2})(\x2d\x2d\x2dBEGIN PGP PUBLIC KEY BLOCK\x2d)/
|
||||
}
|
||||
|
||||
# >2080 string,=Microsoft Excel 5.0 Worksheet (len=29), ["%s"], swap_endian=0
|
||||
signature file-magic-auto2 {
|
||||
file-mime "application/vnd.ms-excel", 320
|
||||
file-magic /(.{2080})(Microsoft Excel 5\x2e0 Worksheet)/
|
||||
}
|
||||
|
||||
# >11 string,=must be converted with BinHex (len=29), ["BinHex binary text"], swap_endian=0
|
||||
signature file-magic-auto3 {
|
||||
file-mime "application/mac-binhex40", 320
|
||||
file-magic /(.{11})(must be converted with BinHex)/
|
||||
}
|
||||
|
||||
# >2080 string,=Microsoft Word 6.0 Document (len=27), ["%s"], swap_endian=0
|
||||
signature file-magic-auto4 {
|
||||
file-mime "application/msword", 300
|
||||
file-magic /(.{2080})(Microsoft Word 6\x2e0 Document)/
|
||||
}
|
||||
|
||||
# >2080 string,=Documento Microsoft Word 6 (len=26), ["Spanish Microsoft Word 6 document data"], swap_endian=0
|
||||
signature file-magic-auto5 {
|
||||
file-mime "application/msword", 290
|
||||
file-magic /(.{2080})(Documento Microsoft Word 6)/
|
||||
}
|
||||
|
||||
# >0 string,=-----BEGIN PGP SIGNATURE- (len=25), ["PGP signature"], swap_endian=0
|
||||
signature file-magic-auto6 {
|
||||
file-mime "application/pgp-signature", 280
|
||||
|
@ -92,36 +68,6 @@ signature file-magic-auto13 {
|
|||
file-magic /(\x23\x21 ?\x2fusr\x2flocal\x2fbin\x2fgawk)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/local/bin/bash (len=22), ["Bourne-Again shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto14 {
|
||||
file-mime "text/x-shellscript", 250
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2flocal\x2fbin\x2fbash)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/local/bin/tcsh (len=22), ["Tenex C shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto15 {
|
||||
file-mime "text/x-shellscript", 250
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2flocal\x2fbin\x2ftcsh)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/local/bin/zsh (len=21), ["Paul Falstad's zsh script text executable"], swap_endian=0
|
||||
signature file-magic-auto16 {
|
||||
file-mime "text/x-shellscript", 240
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2flocal\x2fbin\x2fzsh)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/local/bin/ash (len=21), ["Neil Brown's ash script text executable"], swap_endian=0
|
||||
signature file-magic-auto17 {
|
||||
file-mime "text/x-shellscript", 240
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2flocal\x2fbin\x2fash)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/local/bin/ae (len=20), ["Neil Brown's ae script text executable"], swap_endian=0
|
||||
signature file-magic-auto18 {
|
||||
file-mime "text/x-shellscript", 230
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2flocal\x2fbin\x2fae)/
|
||||
}
|
||||
|
||||
# >0 string,=# PaCkAgE DaTaStReAm (len=20), ["pkg Datastream (SVR4)"], swap_endian=0
|
||||
signature file-magic-auto19 {
|
||||
file-mime "application/x-svr4-package", 230
|
||||
|
@ -140,30 +86,12 @@ signature file-magic-auto21 {
|
|||
file-magic /(\x5bKDE Desktop Entry\x5d)/
|
||||
}
|
||||
|
||||
# >512 string,=R\000o\000o\000t\000 \000E\000n\000t\000r\000y (len=19), ["Microsoft Word Document"], swap_endian=0
|
||||
signature file-magic-auto22 {
|
||||
file-mime "application/msword", 220
|
||||
file-magic /(.{512})(R\x00o\x00o\x00t\x00 \x00E\x00n\x00t\x00r\x00y)/
|
||||
}
|
||||
|
||||
# >0 string,=!<arch>\n__________E (len=19), ["MIPS archive"], swap_endian=0
|
||||
signature file-magic-auto23 {
|
||||
file-mime "application/x-archive", 220
|
||||
file-magic /(\x21\x3carch\x3e\x0a\x5f\x5f\x5f\x5f\x5f\x5f\x5f\x5f\x5f\x5fE)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/local/tcsh (len=18), ["Tenex C shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto24 {
|
||||
file-mime "text/x-shellscript", 210
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2flocal\x2ftcsh)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/local/bash (len=18), ["Bourne-Again shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto25 {
|
||||
file-mime "text/x-shellscript", 210
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2flocal\x2fbash)/
|
||||
}
|
||||
|
||||
# >0 string/t,=# KDE Config File (len=17), ["KDE config file"], swap_endian=0
|
||||
signature file-magic-auto26 {
|
||||
file-mime "application/x-kdelnk", 200
|
||||
|
@ -189,12 +117,6 @@ signature file-magic-auto29 {
|
|||
file-magic /(\x23\x21 ?\x2fusr\x2fbin\x2fnawk)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/bin/tcsh (len=16), ["Tenex C shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto30 {
|
||||
file-mime "text/x-shellscript", 190
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2fbin\x2ftcsh)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/bin/gawk (len=16), ["GNU awk script text executable"], swap_endian=0
|
||||
signature file-magic-auto31 {
|
||||
file-mime "text/x-gawk", 190
|
||||
|
@ -207,12 +129,6 @@ signature file-magic-auto32 {
|
|||
file-magic /(.{369})(MICROSOFT PIFEX\x00)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/bin/bash (len=16), ["Bourne-Again shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto33 {
|
||||
file-mime "text/x-shellscript", 190
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2fbin\x2fbash)/
|
||||
}
|
||||
|
||||
# >0 string/w,=#VRML V1.0 ascii (len=16), ["VRML 1 file"], swap_endian=0
|
||||
signature file-magic-auto34 {
|
||||
file-mime "model/vrml", 190
|
||||
|
@ -334,12 +250,6 @@ signature file-magic-auto51 {
|
|||
file-magic /(\x23\x21 ?\x2fusr\x2fbin\x2fawk)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /usr/bin/zsh (len=15), ["Paul Falstad's zsh script text executable"], swap_endian=0
|
||||
signature file-magic-auto52 {
|
||||
file-mime "text/x-shellscript", 180
|
||||
file-magic /(\x23\x21 ?\x2fusr\x2fbin\x2fzsh)/
|
||||
}
|
||||
|
||||
# >0 string,=MAS_UTrack_V00 (len=14), [""], swap_endian=0
|
||||
# >>14 string,>/0 (len=2), ["ultratracker V1.%.1s module sound data"], swap_endian=0
|
||||
signature file-magic-auto53 {
|
||||
|
@ -457,12 +367,6 @@ signature file-magic-auto70 {
|
|||
file-magic /(\x3cmap ?version)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /bin/tcsh (len=12), ["Tenex C shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto71 {
|
||||
file-mime "text/x-shellscript", 150
|
||||
file-magic /(\x23\x21 ?\x2fbin\x2ftcsh)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /bin/nawk (len=12), ["new awk script text executable"], swap_endian=0
|
||||
signature file-magic-auto72 {
|
||||
file-mime "text/x-nawk", 150
|
||||
|
@ -475,12 +379,6 @@ signature file-magic-auto73 {
|
|||
file-magic /(\x23\x21 ?\x2fbin\x2fgawk)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /bin/bash (len=12), ["Bourne-Again shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto74 {
|
||||
file-mime "text/x-shellscript", 150
|
||||
file-magic /(\x23\x21 ?\x2fbin\x2fbash)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /bin/awk (len=11), ["awk script text executable"], swap_endian=0
|
||||
signature file-magic-auto75 {
|
||||
file-mime "text/x-awk", 140
|
||||
|
@ -505,24 +403,6 @@ signature file-magic-auto78 {
|
|||
file-magic /(d8\x3aannounce)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /bin/csh (len=11), ["C shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto79 {
|
||||
file-mime "text/x-shellscript", 140
|
||||
file-magic /(\x23\x21 ?\x2fbin\x2fcsh)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /bin/ksh (len=11), ["Korn shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto80 {
|
||||
file-mime "text/x-shellscript", 140
|
||||
file-magic /(\x23\x21 ?\x2fbin\x2fksh)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /bin/zsh (len=11), ["Paul Falstad's zsh script text executable"], swap_endian=0
|
||||
signature file-magic-auto81 {
|
||||
file-mime "text/x-shellscript", 140
|
||||
file-magic /(\x23\x21 ?\x2fbin\x2fzsh)/
|
||||
}
|
||||
|
||||
# >0 string/c,=BEGIN:VCARD (len=11), ["vCard visiting card"], swap_endian=0
|
||||
signature file-magic-auto82 {
|
||||
file-mime "text/x-vcard", 140
|
||||
|
@ -545,12 +425,6 @@ signature file-magic-auto84 {
|
|||
file-magic /(Forward to)/
|
||||
}
|
||||
|
||||
# >0 string/wt,=#! /bin/sh (len=10), ["POSIX shell script text executable"], swap_endian=0
|
||||
signature file-magic-auto85 {
|
||||
file-mime "text/x-shellscript", 130
|
||||
file-magic /(\x23\x21 ?\x2fbin\x2fsh)/
|
||||
}
|
||||
|
||||
# >0 string,=II*\000\020\000\000\000CR (len=10), ["Canon CR2 raw image data"], swap_endian=0
|
||||
signature file-magic-auto86 {
|
||||
file-mime "image/x-canon-cr2", 130
|
||||
|
@ -585,12 +459,6 @@ signature file-magic-auto90 {
|
|||
file-magic /(\x3cBookFile)/
|
||||
}
|
||||
|
||||
# >2112 string,=MSWordDoc (len=9), ["Microsoft Word document data"], swap_endian=0
|
||||
signature file-magic-auto91 {
|
||||
file-mime "application/msword", 120
|
||||
file-magic /(.{2112})(MSWordDoc)/
|
||||
}
|
||||
|
||||
# >0 string/t,=N#! rnews (len=9), ["mailed, batched news text"], swap_endian=0
|
||||
signature file-magic-auto92 {
|
||||
file-mime "message/rfc822", 120
|
||||
|
@ -656,12 +524,6 @@ signature file-magic-auto100 {
|
|||
file-magic /(MSCF\x00\x00\x00\x00)/
|
||||
}
|
||||
|
||||
# >0 string/b,=\320\317\021\340\241\261\032\341 (len=8), ["Microsoft Office Document"], swap_endian=0
|
||||
signature file-magic-auto101 {
|
||||
file-mime "application/msword", 110
|
||||
file-magic /(\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1)/
|
||||
}
|
||||
|
||||
# >21 string/c,=!SCREAM! (len=8), ["Screamtracker 2 module sound data"], swap_endian=0
|
||||
signature file-magic-auto102 {
|
||||
file-mime "audio/x-mod", 110
|
||||
|
@ -754,10 +616,10 @@ signature file-magic-auto116 {
|
|||
}
|
||||
|
||||
# >257 string,=ustar \000 (len=8), ["GNU tar archive"], swap_endian=0
|
||||
signature file-magic-auto117 {
|
||||
file-mime "application/x-tar", 110
|
||||
file-magic /(.{257})(ustar \x00)/
|
||||
}
|
||||
#signature file-magic-auto117 {
|
||||
# file-mime "application/x-tar", 110
|
||||
# file-magic /(.{257})(ustar \x00)/
|
||||
#}
|
||||
|
||||
# >0 string,=<MIFFile (len=8), ["FrameMaker MIF (ASCII) file"], swap_endian=0
|
||||
signature file-magic-auto118 {
|
||||
|
@ -771,12 +633,6 @@ signature file-magic-auto119 {
|
|||
file-magic /(PK\x07\x08PK\x03\x04)/
|
||||
}
|
||||
|
||||
# >0 string/b,=\t\004\006\000\000\000\020\000 (len=8), ["Microsoft Excel Worksheet"], swap_endian=0
|
||||
signature file-magic-auto120 {
|
||||
file-mime "application/vnd.ms-excel", 110
|
||||
file-magic /(\x09\x04\x06\x00\x00\x00\x10\x00)/
|
||||
}
|
||||
|
||||
# >0 string/b,=WordPro\000 (len=8), ["Lotus WordPro"], swap_endian=0
|
||||
signature file-magic-auto121 {
|
||||
file-mime "application/vnd.lotus-wordpro", 110
|
||||
|
@ -844,10 +700,10 @@ signature file-magic-auto130 {
|
|||
}
|
||||
|
||||
# >257 string,=ustar\000 (len=6), ["POSIX tar archive"], swap_endian=0
|
||||
signature file-magic-auto131 {
|
||||
file-mime "application/x-tar", 90
|
||||
file-magic /(.{257})(ustar\x00)/
|
||||
}
|
||||
#signature file-magic-auto131 {
|
||||
# file-mime "application/x-tar", 90
|
||||
# file-magic /(.{257})(ustar\x00)/
|
||||
#}
|
||||
|
||||
# >0 string,=AC1.40 (len=6), ["DWG AutoDesk AutoCAD Release 1.40"], swap_endian=0
|
||||
signature file-magic-auto132 {
|
||||
|
@ -994,12 +850,6 @@ signature file-magic-auto155 {
|
|||
file-magic /(\x23 xmcd)/
|
||||
}
|
||||
|
||||
# >0 string/b,=\333\245-\000\000\000 (len=6), ["Microsoft Office Document"], swap_endian=0
|
||||
signature file-magic-auto156 {
|
||||
file-mime "application/msword", 90
|
||||
file-magic /(\xdb\xa5\x2d\x00\x00\x00)/
|
||||
}
|
||||
|
||||
# >2 string,=MMXPR3 (len=6), ["Motorola Quark Express Document (English)"], swap_endian=0
|
||||
signature file-magic-auto157 {
|
||||
file-mime "application/x-quark-xpress-3", 90
|
||||
|
@ -1046,36 +896,6 @@ signature file-magic-auto162 {
|
|||
file-magic /(\x3c\x3fxml)(.{15})(.*)( xmlns\x3d)(['"]http:\x2f\x2fwww.opengis.net\x2fkml)/
|
||||
}
|
||||
|
||||
# >0 string,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>30 regex,=[Content_Types].xml|_rels/.rels (len=31), [""], swap_endian=0
|
||||
# >>>18 (lelong,+49), search/2000,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>>>&26 search/1000,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>>>>&26 string,=word/ (len=5), ["Microsoft Word 2007+"], swap_endian=0
|
||||
signature file-magic-auto163 {
|
||||
file-mime "application/vnd.openxmlformats-officedocument.wordprocessingml.document", 80
|
||||
file-magic /(PK\x03\x04)(.{26})(\[Content_Types\].xml|_rels\x2f.rels)(.*)(PK\x03\x04)(.{26})(.*)(PK\x03\x04)(.{26})(word\x2f)/
|
||||
}
|
||||
|
||||
# >0 string,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>30 regex,=[Content_Types].xml|_rels/.rels (len=31), [""], swap_endian=0
|
||||
# >>>18 (lelong,+49), search/2000,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>>>&26 search/1000,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>>>>&26 string,=ppt/ (len=4), ["Microsoft PowerPoint 2007+"], swap_endian=0
|
||||
signature file-magic-auto164 {
|
||||
file-mime "application/vnd.openxmlformats-officedocument.presentationml.presentation", 70
|
||||
file-magic /(PK\x03\x04)(.{26})(\[Content_Types\].xml|_rels\x2f.rels)(.*)(PK\x03\x04)(.{26})(.*)(PK\x03\x04)(.{26})(ppt\x2f)/
|
||||
}
|
||||
|
||||
# >0 string,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>30 regex,=[Content_Types].xml|_rels/.rels (len=31), [""], swap_endian=0
|
||||
# >>>18 (lelong,+49), search/2000,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>>>&26 search/1000,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>>>>&26 string,=xl/ (len=3), ["Microsoft Excel 2007+"], swap_endian=0
|
||||
signature file-magic-auto165 {
|
||||
file-mime "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet", 60
|
||||
file-magic /(PK\x03\x04)(.{26})(\[Content_Types\].xml|_rels\x2f.rels)(.*)(PK\x03\x04)(.{26})(.*)(PK\x03\x04)(.{26})(xl\x2f)/
|
||||
}
|
||||
|
||||
# >60 string,=RINEX (len=5), [""], swap_endian=0
|
||||
# >>80 search/256,=XXRINEXB (len=8), ["RINEX Data, GEO SBAS Broadcast"], swap_endian=0
|
||||
# >>>5 string,x, [", version %6.6s"], swap_endian=0
|
||||
|
@ -1229,30 +1049,12 @@ signature file-magic-auto187 {
|
|||
file-magic /(\x00\x01\x00\x00\x00)/
|
||||
}
|
||||
|
||||
# >0 string/b,=PO^Q` (len=5), ["Microsoft Word 6.0 Document"], swap_endian=0
|
||||
signature file-magic-auto188 {
|
||||
file-mime "application/msword", 80
|
||||
file-magic /(PO\x5eQ\x60)/
|
||||
}
|
||||
|
||||
# >0 string,=%PDF- (len=5), ["PDF document"], swap_endian=0
|
||||
signature file-magic-auto189 {
|
||||
file-mime "application/pdf", 80
|
||||
file-magic /(\x25PDF\x2d)/
|
||||
}
|
||||
|
||||
# >2114 string,=Biff5 (len=5), ["Microsoft Excel 5.0 Worksheet"], swap_endian=0
|
||||
signature file-magic-auto190 {
|
||||
file-mime "application/vnd.ms-excel", 80
|
||||
file-magic /(.{2114})(Biff5)/
|
||||
}
|
||||
|
||||
# >2121 string,=Biff5 (len=5), ["Microsoft Excel 5.0 Worksheet"], swap_endian=0
|
||||
signature file-magic-auto191 {
|
||||
file-mime "application/vnd.ms-excel", 80
|
||||
file-magic /(.{2121})(Biff5)/
|
||||
}
|
||||
|
||||
# >0 string/t,=Path: (len=5), ["news text"], swap_endian=0
|
||||
signature file-magic-auto192 {
|
||||
file-mime "message/news", 80
|
||||
|
@ -1383,12 +1185,6 @@ signature file-magic-auto211 {
|
|||
file-magic /(\x00\x00\x00\x01)([\x07\x27\x47\x67\x87\xa7\xc7\xe7])/
|
||||
}
|
||||
|
||||
# >0 belong&,=-889275714 (0xcafebabe), [""], swap_endian=0
|
||||
signature file-magic-auto212 {
|
||||
file-mime "application/x-java-applet", 71
|
||||
file-magic /(\xca\xfe\xba\xbe)/
|
||||
}
|
||||
|
||||
# >0 belong&ffffffffffffff00,=256 (0x00000100), [""], swap_endian=0
|
||||
# >>3 byte&,=0xba, ["MPEG sequence"], swap_endian=0
|
||||
signature file-magic-auto213 {
|
||||
|
@ -1706,46 +1502,6 @@ signature file-magic-auto245 {
|
|||
file-magic /(PK\x03\x04)(.{22})(\x08\x00\x00\x00mimetypeapplication\x2f)(epub\x2bzip)/
|
||||
}
|
||||
|
||||
# Seems redundant with other zip signature below.
|
||||
# >0 string,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>26 string,=\b\000\000\000mimetypeapplication/ (len=24), [""], swap_endian=0
|
||||
# >>>50 string,!epub+zip (len=8), [""], swap_endian=0
|
||||
# >>>>50 string,!vnd.oasis.opendocument. (len=23), [""], swap_endian=0
|
||||
# >>>>>50 string,!vnd.sun.xml. (len=12), [""], swap_endian=0
|
||||
# >>>>>>50 string,!vnd.kde. (len=8), [""], swap_endian=0
|
||||
# >>>>>>>38 regex,=[!-OQ-~]+ (len=9), ["Zip data (MIME type "%s"?)"], swap_endian=0
|
||||
#signature file-magic-auto246 {
|
||||
# file-mime "application/zip", 39
|
||||
# file-magic /(PK\x03\x04)(.{22})(\x08\x00\x00\x00mimetypeapplication\x2f)/
|
||||
#}
|
||||
|
||||
# >0 string,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>26 string,=\b\000\000\000mimetype (len=12), [""], swap_endian=0
|
||||
# >>>38 string,!application/ (len=12), [""], swap_endian=0
|
||||
# >>>>38 regex,=[!-OQ-~]+ (len=9), ["Zip data (MIME type "%s"?)"], swap_endian=0
|
||||
signature file-magic-auto247 {
|
||||
file-mime "application/zip", 39
|
||||
file-magic /(PK\x03\x04)(.{22})(\x08\x00\x00\x00mimetype)/
|
||||
}
|
||||
|
||||
# The indirect offset makes this difficult to convert.
|
||||
# The (.*) may be too generous.
|
||||
# >0 string,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>26 (leshort,+30), leshort&,=-13570 (0xcafe), ["Java archive data (JAR)"], swap_endian=0
|
||||
signature file-magic-auto248 {
|
||||
file-mime "application/java-archive", 50
|
||||
file-magic /(PK\x03\x04)(.*)(\xfe\xca)/
|
||||
}
|
||||
|
||||
# The indirect offset and string inequality make this difficult to convert.
|
||||
# >0 string,=PK\003\004 (len=4), [""], swap_endian=0
|
||||
# >>26 (leshort,+30), leshort&,!-13570 (0xcafe), [""], swap_endian=0
|
||||
# >>>26 string,!\b\000\000\000mimetype (len=12), ["Zip archive data"], swap_endian=0
|
||||
signature file-magic-auto249 {
|
||||
file-mime "application/zip", 10
|
||||
file-magic /(PK\x03\x04)(.{2})/
|
||||
}
|
||||
|
||||
# >0 belong&,=442 (0x000001ba), [""], swap_endian=0
|
||||
# >>4 byte&,&0x40, [""], swap_endian=0
|
||||
signature file-magic-auto250 {
|
||||
|
@ -2065,18 +1821,6 @@ signature file-magic-auto299 {
|
|||
file-magic /(PDN3)/
|
||||
}
|
||||
|
||||
# >0 ulelong&,=2712847316 (0xa1b2c3d4), ["tcpdump capture file (little-endian)"], swap_endian=0
|
||||
signature file-magic-auto300 {
|
||||
file-mime "application/vnd.tcpdump.pcap", 70
|
||||
file-magic /(\xd4\xc3\xb2\xa1)/
|
||||
}
|
||||
|
||||
# >0 ubelong&,=2712847316 (0xa1b2c3d4), ["tcpdump capture file (big-endian)"], swap_endian=0
|
||||
signature file-magic-auto301 {
|
||||
file-mime "application/vnd.tcpdump.pcap", 70
|
||||
file-magic /(\xa1\xb2\xc3\xd4)/
|
||||
}
|
||||
|
||||
# >0 belong&,=-17957139 (0xfeedfeed), ["Java KeyStore"], swap_endian=0
|
||||
signature file-magic-auto302 {
|
||||
file-mime "application/x-java-keystore", 70
|
||||
|
@ -2297,12 +2041,6 @@ signature file-magic-auto335 {
|
|||
file-magic /(SIT\x21)/
|
||||
}
|
||||
|
||||
# >0 lelong&,=574529400 (0x223e9f78), ["Transport Neutral Encapsulation Format"], swap_endian=0
|
||||
signature file-magic-auto336 {
|
||||
file-mime "application/vnd.ms-tnef", 70
|
||||
file-magic /(\x78\x9f\x3e\x22)/
|
||||
}
|
||||
|
||||
# >0 string,=<ar> (len=4), ["System V Release 1 ar archive"], swap_endian=0
|
||||
signature file-magic-auto337 {
|
||||
file-mime "application/x-archive", 70
|
||||
|
@ -2433,48 +2171,6 @@ signature file-magic-auto357 {
|
|||
file-magic /(RIFF)(.{4})(AVI )/
|
||||
}
|
||||
|
||||
# >0 belong&,=834535424 (0x31be0000), ["Microsoft Word Document"], swap_endian=0
|
||||
signature file-magic-auto358 {
|
||||
file-mime "application/msword", 70
|
||||
file-magic /(\x31\xbe\x00\x00)/
|
||||
}
|
||||
|
||||
# >0 string/b,=\3767\000# (len=4), ["Microsoft Office Document"], swap_endian=0
|
||||
signature file-magic-auto359 {
|
||||
file-mime "application/msword", 70
|
||||
file-magic /(\xfe7\x00\x23)/
|
||||
}
|
||||
|
||||
# >0 string/b,=\333\245-\000 (len=4), ["Microsoft WinWord 2.0 Document"], swap_endian=0
|
||||
signature file-magic-auto360 {
|
||||
file-mime "application/msword", 70
|
||||
file-magic /(\xdb\xa5\x2d\x00)/
|
||||
}
|
||||
|
||||
# >0 string/b,=\333\245-\000 (len=4), ["Microsoft WinWord 2.0 Document"], swap_endian=0
|
||||
signature file-magic-auto361 {
|
||||
file-mime "application/msword", 70
|
||||
file-magic /(\xdb\xa5\x2d\x00)/
|
||||
}
|
||||
|
||||
# >0 belong&,=6656 (0x00001a00), ["Lotus 1-2-3"], swap_endian=0
|
||||
signature file-magic-auto362 {
|
||||
file-mime "application/x-123", 70
|
||||
file-magic /(\x00\x00\x1a\x00)/
|
||||
}
|
||||
|
||||
# >0 belong&,=512 (0x00000200), ["Lotus 1-2-3"], swap_endian=0
|
||||
signature file-magic-auto363 {
|
||||
file-mime "application/x-123", 70
|
||||
file-magic /(\x00\x00\x02\x00)/
|
||||
}
|
||||
|
||||
# >0 string/b,=\000\000\001\000 (len=4), ["MS Windows icon resource"], swap_endian=0
|
||||
signature file-magic-auto364 {
|
||||
file-mime "image/x-icon", 70
|
||||
file-magic /(\x00\x00\x01\x00)/
|
||||
}
|
||||
|
||||
# >0 lelong&,=268435536 (0x10000050), ["Psion Series 5"], swap_endian=0
|
||||
# >>4 lelong&,=268435565 (0x1000006d), ["database"], swap_endian=0
|
||||
# >>>8 lelong&,=268435588 (0x10000084), ["Agenda file"], swap_endian=0
|
||||
|
@ -2737,12 +2433,6 @@ signature file-magic-auto403 {
|
|||
file-magic /(SBI)/
|
||||
}
|
||||
|
||||
# >0 string/b,=\224\246. (len=3), ["Microsoft Word Document"], swap_endian=0
|
||||
signature file-magic-auto404 {
|
||||
file-mime "application/msword", 60
|
||||
file-magic /(\x94\xa6\x2e)/
|
||||
}
|
||||
|
||||
# >0 string,=\004%! (len=3), ["PostScript document text"], swap_endian=0
|
||||
signature file-magic-auto405 {
|
||||
file-mime "application/postscript", 60
|
||||
|
@ -2763,17 +2453,11 @@ signature file-magic-auto407 {
|
|||
file-magic /(.*)([ \x09]*(class|module)[ \x09][A-Z])((modul|includ)e [A-Z]|def [a-z])(^[ \x09]*end([ \x09]*[;#].*)?$)/
|
||||
}
|
||||
|
||||
# >512 string/b,=\354\245\301 (len=3), ["Microsoft Word Document"], swap_endian=0
|
||||
signature file-magic-auto408 {
|
||||
file-mime "application/msword", 60
|
||||
file-magic /(.{512})(\xec\xa5\xc1)/
|
||||
}
|
||||
|
||||
# >0 regex/20,=^\.[A-Za-z0-9][A-Za-z0-9][ \t] (len=29), ["troff or preprocessor input text"], swap_endian=0
|
||||
signature file-magic-auto411 {
|
||||
file-mime "text/troff", 59
|
||||
file-magic /(^\.[A-Za-z0-9][A-Za-z0-9][ \x09])/
|
||||
}
|
||||
#signature file-magic-auto411 {
|
||||
# file-mime "text/troff", 59
|
||||
# file-magic /(^\.[A-Za-z0-9][A-Za-z0-9][ \x09])/
|
||||
#}
|
||||
|
||||
# >0 search/4096,=\documentclass (len=14), ["LaTeX 2e document text"], swap_endian=0
|
||||
signature file-magic-auto412 {
|
||||
|
@ -2806,10 +2490,10 @@ signature file-magic-auto416 {
|
|||
}
|
||||
|
||||
# >0 regex/20,=^\.[A-Za-z0-9][A-Za-z0-9]$ (len=26), ["troff or preprocessor input text"], swap_endian=0
|
||||
signature file-magic-auto417 {
|
||||
file-mime "text/troff", 56
|
||||
file-magic /(^\.[A-Za-z0-9][A-Za-z0-9]$)/
|
||||
}
|
||||
#signature file-magic-auto417 {
|
||||
# file-mime "text/troff", 56
|
||||
# file-magic /(^\.[A-Za-z0-9][A-Za-z0-9]$)/
|
||||
#}
|
||||
|
||||
# >0 search/w/1,=#! /usr/bin/php (len=15), ["PHP script text executable"], swap_endian=0
|
||||
signature file-magic-auto418 {
|
||||
|
@ -2829,30 +2513,12 @@ signature file-magic-auto420 {
|
|||
file-magic /(.*)(eval \x22exec \x2fusr\x2fbin\x2fperl)/
|
||||
}
|
||||
|
||||
# >0 search/w/1,=#! /usr/local/bin/python (len=24), ["Python script text executable"], swap_endian=0
|
||||
signature file-magic-auto421 {
|
||||
file-mime "text/x-python", 54
|
||||
file-magic /(.*)(\x23\x21 ?\x2fusr\x2flocal\x2fbin\x2fpython)/
|
||||
}
|
||||
|
||||
# >0 search/1,=Common subdirectories: (len=23), ["diff output text"], swap_endian=0
|
||||
signature file-magic-auto422 {
|
||||
file-mime "text/x-diff", 53
|
||||
file-magic /(.*)(Common subdirectories\x3a )/
|
||||
}
|
||||
|
||||
# >0 search/1,=#! /usr/bin/env python (len=22), ["Python script text executable"], swap_endian=0
|
||||
signature file-magic-auto423 {
|
||||
file-mime "text/x-python", 52
|
||||
file-magic /(.*)(\x23\x21 \x2fusr\x2fbin\x2fenv python)/
|
||||
}
|
||||
|
||||
# >0 search/w/1,=#! /usr/local/bin/ruby (len=22), ["Ruby script text executable"], swap_endian=0
|
||||
signature file-magic-auto424 {
|
||||
file-mime "text/x-ruby", 52
|
||||
file-magic /(.*)(\x23\x21 ?\x2fusr\x2flocal\x2fbin\x2fruby)/
|
||||
}
|
||||
|
||||
# >0 search/w/1,=#! /usr/local/bin/wish (len=22), ["Tcl/Tk script text executable"], swap_endian=0
|
||||
signature file-magic-auto425 {
|
||||
file-mime "text/x-tcl", 52
|
||||
|
@ -2871,12 +2537,6 @@ signature file-magic-auto427 {
|
|||
file-magic /(\xff\xd8)/
|
||||
}
|
||||
|
||||
# >0 search/1,=#!/usr/bin/env python (len=21), ["Python script text executable"], swap_endian=0
|
||||
signature file-magic-auto428 {
|
||||
file-mime "text/x-python", 51
|
||||
file-magic /(.*)(\x23\x21\x2fusr\x2fbin\x2fenv python)/
|
||||
}
|
||||
|
||||
# >0 search/1,=#!/usr/bin/env nodejs (len=21), ["Node.js script text executable"], swap_endian=0
|
||||
signature file-magic-auto429 {
|
||||
file-mime "application/javascript", 51
|
||||
|
@ -3189,12 +2849,6 @@ signature file-magic-auto474 {
|
|||
file-magic /(\x25\x21)/
|
||||
}
|
||||
|
||||
# >0 search/1,=#! /usr/bin/env ruby (len=20), ["Ruby script text executable"], swap_endian=0
|
||||
signature file-magic-auto475 {
|
||||
file-mime "text/x-ruby", 50
|
||||
file-magic /(.*)(\x23\x21 \x2fusr\x2fbin\x2fenv ruby)/
|
||||
}
|
||||
|
||||
# >0 regex/1,=(^[0-9]{5})[acdn][w] (len=20), ["MARC21 Classification"], swap_endian=0
|
||||
signature file-magic-auto476 {
|
||||
file-mime "application/marc", 50
|
||||
|
@ -3228,10 +2882,10 @@ signature file-magic-auto480 {
|
|||
}
|
||||
|
||||
# >0 string,=\n( (len=2), ["Emacs v18 byte-compiled Lisp data"], swap_endian=0
|
||||
signature file-magic-auto481 {
|
||||
file-mime "application/x-elc", 50
|
||||
file-magic /(\x0a\x28)/
|
||||
}
|
||||
#signature file-magic-auto481 {
|
||||
# file-mime "application/x-elc", 50
|
||||
# file-magic /(\x0a\x28)/
|
||||
#}
|
||||
|
||||
# >0 string,=\021\t (len=2), ["Award BIOS Logo, 136 x 126"], swap_endian=0
|
||||
signature file-magic-auto482 {
|
||||
|
@ -3305,17 +2959,17 @@ signature file-magic-auto493 {
|
|||
file-magic /(\xf7\x02)/
|
||||
}
|
||||
|
||||
# >2 string,=\000\021 (len=2), ["TeX font metric data"], swap_endian=0
|
||||
signature file-magic-auto494 {
|
||||
file-mime "application/x-tex-tfm", 50
|
||||
file-magic /(.{2})(\x00\x11)/
|
||||
}
|
||||
|
||||
# >2 string,=\000\022 (len=2), ["TeX font metric data"], swap_endian=0
|
||||
signature file-magic-auto495 {
|
||||
file-mime "application/x-tex-tfm", 50
|
||||
file-magic /(.{2})(\x00\x12)/
|
||||
}
|
||||
## >2 string,=\000\021 (len=2), ["TeX font metric data"], swap_endian=0
|
||||
#signature file-magic-auto494 {
|
||||
# file-mime "application/x-tex-tfm", 50
|
||||
# file-magic /(.{2})(\x00\x11)/
|
||||
#}
|
||||
#
|
||||
## >2 string,=\000\022 (len=2), ["TeX font metric data"], swap_endian=0
|
||||
#signature file-magic-auto495 {
|
||||
# file-mime "application/x-tex-tfm", 50
|
||||
# file-magic /(.{2})(\x00\x12)/
|
||||
#}
|
||||
|
||||
# >0 beshort&,=-31486 (0x8502), ["GPG encrypted data"], swap_endian=0
|
||||
signature file-magic-auto496 {
|
||||
|
@ -3470,12 +3124,6 @@ signature file-magic-auto514 {
|
|||
file-magic /(.*)(\x23\x21 \x2fusr\x2fbin\x2fenv lua)/
|
||||
}
|
||||
|
||||
# >0 search/1,=#!/usr/bin/env ruby (len=19), ["Ruby script text executable"], swap_endian=0
|
||||
signature file-magic-auto515 {
|
||||
file-mime "text/x-ruby", 49
|
||||
file-magic /(.*)(\x23\x21\x2fusr\x2fbin\x2fenv ruby)/
|
||||
}
|
||||
|
||||
# >0 search/1,=#! /usr/bin/env tcl (len=19), ["Tcl script text executable"], swap_endian=0
|
||||
signature file-magic-auto516 {
|
||||
file-mime "text/x-tcl", 49
|
||||
|
@ -3493,12 +3141,6 @@ signature file-magic-auto519 {
|
|||
file-magic /(.*)(\x23\x21\x2fusr\x2fbin\x2fenv lua)/
|
||||
}
|
||||
|
||||
# >0 search/w/1,=#! /usr/bin/python (len=18), ["Python script text executable"], swap_endian=0
|
||||
signature file-magic-auto520 {
|
||||
file-mime "text/x-python", 48
|
||||
file-magic /(.*)(\x23\x21 ?\x2fusr\x2fbin\x2fpython)/
|
||||
}
|
||||
|
||||
# >0 search/w/1,=#!/usr/bin/nodejs (len=17), ["Node.js script text executable"], swap_endian=0
|
||||
signature file-magic-auto521 {
|
||||
file-mime "application/javascript", 47
|
||||
|
@ -3506,10 +3148,10 @@ signature file-magic-auto521 {
|
|||
}
|
||||
|
||||
# >0 regex,=^class[ \t\n]+ (len=12), ["C++ source text"], swap_endian=0
|
||||
signature file-magic-auto522 {
|
||||
file-mime "text/x-c++", 47
|
||||
file-magic /(.*)(class[ \x09\x0a]+[[:alnum:]_]+)(.*)(\x7b)(.*)(public:)/
|
||||
}
|
||||
#signature file-magic-auto522 {
|
||||
# file-mime "text/x-c++", 47
|
||||
# file-magic /(.*)(class[ \x09\x0a]+[[:alnum:]_]+)(.*)(\x7b)(.*)(public:)/
|
||||
#}
|
||||
|
||||
# >0 search/1,=This is Info file (len=17), ["GNU Info text"], swap_endian=0
|
||||
signature file-magic-auto528 {
|
||||
|
@ -3658,12 +3300,6 @@ signature file-magic-auto545 {
|
|||
file-magic /(.*)(\x23\x21 ?\x2fusr\x2fbin\x2fwish)/
|
||||
}
|
||||
|
||||
# >0 search/w/1,=#! /usr/bin/ruby (len=16), ["Ruby script text executable"], swap_endian=0
|
||||
signature file-magic-auto546 {
|
||||
file-mime "text/x-ruby", 46
|
||||
file-magic /(.*)(\x23\x21 ?\x2fusr\x2fbin\x2fruby)/
|
||||
}
|
||||
|
||||
# >0 search/w/1,=#! /usr/bin/lua (len=15), ["Lua script text executable"], swap_endian=0
|
||||
signature file-magic-auto547 {
|
||||
file-mime "text/x-lua", 45
|
||||
|
@ -3727,10 +3363,10 @@ signature file-magic-auto556 {
|
|||
}
|
||||
|
||||
# >0 regex,=^extern[ \t\n]+ (len=13), ["C source text"], swap_endian=0
|
||||
signature file-magic-auto557 {
|
||||
file-mime "text/x-c", 43
|
||||
file-magic /(.*)(extern[ \x09\x0a]+)/
|
||||
}
|
||||
#signature file-magic-auto557 {
|
||||
# file-mime "text/x-c", 43
|
||||
# file-magic /(.*)(extern[ \x09\x0a]+)/
|
||||
#}
|
||||
|
||||
# >0 search/4096,=% -*-latex-*- (len=13), ["LaTeX document text"], swap_endian=0
|
||||
signature file-magic-auto558 {
|
||||
|
@ -3746,10 +3382,10 @@ signature file-magic-auto558 {
|
|||
#}
|
||||
|
||||
# >0 regex,=^struct[ \t\n]+ (len=13), ["C source text"], swap_endian=0
|
||||
signature file-magic-auto560 {
|
||||
file-mime "text/x-c", 43
|
||||
file-magic /(.*)(struct[ \x09\x0a]+)/
|
||||
}
|
||||
#signature file-magic-auto560 {
|
||||
# file-mime "text/x-c", 43
|
||||
# file-magic /(.*)(struct[ \x09\x0a]+)/
|
||||
#}
|
||||
|
||||
# >0 search/w/1,=#!/bin/nodejs (len=13), ["Node.js script text executable"], swap_endian=0
|
||||
signature file-magic-auto561 {
|
||||
|
@ -3802,10 +3438,10 @@ signature file-magic-auto567 {
|
|||
}
|
||||
|
||||
# >0 regex,=^char[ \t\n]+ (len=11), ["C source text"], swap_endian=0
|
||||
signature file-magic-auto568 {
|
||||
file-mime "text/x-c", 41
|
||||
file-magic /(.*)(char[ \x09\x0a]+)/
|
||||
}
|
||||
#signature file-magic-auto568 {
|
||||
# file-mime "text/x-c", 41
|
||||
# file-magic /(.*)(char[ \x09\x0a]+)/
|
||||
#}
|
||||
|
||||
# >0 search/1,=#! (len=2), [""], swap_endian=0
|
||||
# >>0 regex,=^#!.*/bin/perl$ (len=15), ["Perl script text executable"], swap_endian=0
|
||||
|
@ -3887,23 +3523,11 @@ signature file-magic-auto578 {
|
|||
file-magic /(^dnl )/
|
||||
}
|
||||
|
||||
# >0 regex,=^all: (len=5), ["makefile script text"], swap_endian=0
|
||||
signature file-magic-auto579 {
|
||||
file-mime "text/x-makefile", 40
|
||||
file-magic /(^all:)/
|
||||
}
|
||||
|
||||
# >0 regex,=^.PRECIOUS (len=10), ["makefile script text"], swap_endian=0
|
||||
signature file-magic-auto580 {
|
||||
file-mime "text/x-makefile", 40
|
||||
file-magic /(^.PRECIOUS)/
|
||||
}
|
||||
|
||||
# >0 search/8192,=main( (len=5), ["C source text"], swap_endian=0
|
||||
signature file-magic-auto581 {
|
||||
file-mime "text/x-c", 40
|
||||
file-magic /(.*)(main\x28)/
|
||||
}
|
||||
#signature file-magic-auto581 {
|
||||
# file-mime "text/x-c", 40
|
||||
# file-magic /(.*)(main\x28)/
|
||||
#}
|
||||
|
||||
# Not specific enough.
|
||||
# >0 search/1,=\" (len=2), ["troff or preprocessor input text"], swap_endian=0
|
||||
|
@ -3932,22 +3556,22 @@ signature file-magic-auto584 {
|
|||
#}
|
||||
|
||||
# >0 regex,=^#include (len=9), ["C source text"], swap_endian=0
|
||||
signature file-magic-auto586 {
|
||||
file-mime "text/x-c", 39
|
||||
file-magic /(.*)(#include)/
|
||||
}
|
||||
#signature file-magic-auto586 {
|
||||
# file-mime "text/x-c", 39
|
||||
# file-magic /(.*)(#include)/
|
||||
#}
|
||||
|
||||
# >0 search/1,=.\" (len=3), ["troff or preprocessor input text"], swap_endian=0
|
||||
signature file-magic-auto587 {
|
||||
file-mime "text/troff", 39
|
||||
file-magic /(.*)(\x2e\x5c\x22)/
|
||||
}
|
||||
#signature file-magic-auto587 {
|
||||
# file-mime "text/troff", 39
|
||||
# file-magic /(.*)(\x2e\x5c\x22)/
|
||||
#}
|
||||
|
||||
# >0 search/1,='\" (len=3), ["troff or preprocessor input text"], swap_endian=0
|
||||
signature file-magic-auto588 {
|
||||
file-mime "text/troff", 39
|
||||
file-magic /(.*)(\x27\x5c\x22)/
|
||||
}
|
||||
#signature file-magic-auto588 {
|
||||
# file-mime "text/troff", 39
|
||||
# file-magic /(.*)(\x27\x5c\x22)/
|
||||
#}
|
||||
|
||||
# >0 search/1,=<TeXmacs| (len=9), ["TeXmacs document text"], swap_endian=0
|
||||
signature file-magic-auto589 {
|
||||
|
@ -3974,10 +3598,10 @@ signature file-magic-auto592 {
|
|||
}
|
||||
|
||||
# >0 search/1,=''' (len=3), ["troff or preprocessor input text"], swap_endian=0
|
||||
signature file-magic-auto593 {
|
||||
file-mime "text/troff", 39
|
||||
file-magic /(.*)(\x27\x27\x27)/
|
||||
}
|
||||
#signature file-magic-auto593 {
|
||||
# file-mime "text/troff", 39
|
||||
# file-magic /(.*)(\x27\x27\x27)/
|
||||
#}
|
||||
|
||||
# >0 search/4096,=try: (len=4), [""], swap_endian=0
|
||||
# >>&0 regex,=^\s*except.*: (len=13), ["Python script text executable"], swap_endian=0
|
||||
|
@ -3999,12 +3623,6 @@ signature file-magic-auto596 {
|
|||
file-magic /(.*)(\x22LIBHDR\x22)/
|
||||
}
|
||||
|
||||
# >0 regex,=^SUBDIRS (len=8), ["automake makefile script text"], swap_endian=0
|
||||
signature file-magic-auto597 {
|
||||
file-mime "text/x-makefile", 38
|
||||
file-magic /(.*)(SUBDIRS)/
|
||||
}
|
||||
|
||||
# >0 search/4096,=(defvar (len=8), ["Lisp/Scheme program text"], swap_endian=0
|
||||
signature file-magic-auto598 {
|
||||
file-mime "text/x-lisp", 38
|
||||
|
@ -4031,19 +3649,6 @@ signature file-magic-auto600 {
|
|||
# file-magic /(.*)(\x2a\x2a\x2a )/
|
||||
#}
|
||||
|
||||
# >0 search/1,='.\" (len=4), ["troff or preprocessor input text"], swap_endian=0
|
||||
signature file-magic-auto602 {
|
||||
file-mime "text/troff", 38
|
||||
file-magic /(.*)(\x27\x2e\x5c\x22)/
|
||||
}
|
||||
|
||||
# LDFLAGS appears in other contexts, e.g. shell script.
|
||||
# >0 regex,=^LDFLAGS (len=8), ["makefile script text"], swap_endian=0
|
||||
#signature file-magic-auto603 {
|
||||
# file-mime "text/x-makefile", 38
|
||||
# file-magic /(.*)(LDFLAGS)/
|
||||
#}
|
||||
|
||||
# >0 search/8192,="libhdr" (len=8), ["BCPL source text"], swap_endian=0
|
||||
signature file-magic-auto604 {
|
||||
file-mime "text/x-bcpl", 38
|
||||
|
@ -4057,12 +3662,6 @@ signature file-magic-auto604 {
|
|||
# file-magic /(^record)/
|
||||
#}
|
||||
|
||||
# >0 regex,=^CFLAGS (len=7), ["makefile script text"], swap_endian=0
|
||||
signature file-magic-auto606 {
|
||||
file-mime "text/x-makefile", 37
|
||||
file-magic /(.*)(CFLAGS)/
|
||||
}
|
||||
|
||||
# >0 search/4096,=(defun (len=7), ["Lisp/Scheme program text"], swap_endian=0
|
||||
signature file-magic-auto607 {
|
||||
file-mime "text/x-lisp", 37
|
||||
|
|
28
scripts/base/frameworks/files/magic/msoffice.sig
Normal file

|
@ -0,0 +1,28 @@
|
|||
|
||||
# This signature is non-specific and terrible but after
|
||||
# searching for a long time there doesn't seem to be a
|
||||
# better option.
|
||||
signature file-msword {
|
||||
file-magic /^\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1/
|
||||
file-mime "application/msword", 50
|
||||
}
|
||||
|
||||
signature file-ooxml {
|
||||
file-magic /^PK\x03\x04\x14\x00\x06\x00/
|
||||
file-mime "application/vnd.openxmlformats-officedocument", 50
|
||||
}
|
||||
|
||||
signature file-docx {
|
||||
file-magic /^PK\x03\x04.{26}(\[Content_Types\]\.xml|_rels\x2f\.rels|word\x2f).*PK\x03\x04.{26}word\x2f/
|
||||
file-mime "application/vnd.openxmlformats-officedocument.wordprocessingml.document", 80
|
||||
}
|
||||
|
||||
signature file-xlsx {
|
||||
file-magic /^PK\x03\x04.{26}(\[Content_Types\]\.xml|_rels\x2f\.rels|xl\x2f).*PK\x03\x04.{26}xl\x2f/
|
||||
file-mime "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet", 80
|
||||
}
|
||||
|
||||
signature file-pptx {
|
||||
file-magic /^PK\x03\x04.{26}(\[Content_Types\]\.xml|_rels\x2f\.rels|ppt\x2f).*PK\x03\x04.{26}ppt\x2f/
|
||||
file-mime "application/vnd.openxmlformats-officedocument.presentationml.presentation", 80
|
||||
}
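These Office signatures use graded strengths on purpose: the generic OLE2 and OOXML container matches report at 50, while the docx/xlsx/pptx patterns that also see the format-specific directory report at 80 and therefore win. A minimal sketch of reacting to the winning type from script-land, using the file_mime_type event introduced by this change (the filter and message are illustrative only):

event file_mime_type(f: fa_file, mime_type: string)
	{
	# mime_type carries the strongest signature match, e.g.
	# "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
	# for a .docx seen on the wire.
	if ( /openxmlformats/ in mime_type )
		print fmt("OOXML Office document %s detected as %s", f$id, mime_type);
	}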
|
|
@ -100,8 +100,9 @@ export {
|
|||
## during the process of analysis e.g. due to dropped packets.
|
||||
missing_bytes: count &log &default=0;
|
||||
|
||||
## The number of not all-in-sequence bytes in the file stream that
|
||||
## were delivered to file analyzers due to reassembly buffer overflow.
|
||||
## The number of bytes in the file stream that were not delivered to
|
||||
## stream file analyzers. This could be overlapping bytes or
|
||||
## bytes that couldn't be reassembled.
|
||||
overflow_bytes: count &log &default=0;
|
||||
|
||||
## Whether the file analysis timed out at least once for the file.
|
||||
|
@ -124,6 +125,37 @@ export {
|
|||
## generate two handles that would hash to the same file id.
|
||||
const salt = "I recommend changing this." &redef;
|
||||
|
||||
## Decide if you want to automatically attach analyzers to
|
||||
## files based on the detected mime type of the file.
|
||||
const analyze_by_mime_type_automatically = T &redef;
|
||||
|
||||
## The default setting for whether the file reassembler is enabled for
|
||||
## each file.
|
||||
const enable_reassembler = T &redef;
|
||||
|
||||
## The default per-file reassembly buffer size.
|
||||
const reassembly_buffer_size = 1048576 &redef;
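All three constants are redef-able; a sketch of plausible site overrides in local.bro (the values are illustrative; the defaults above are T, T, and 1 MiB):

# Illustrative local.bro tuning only.
redef Files::analyze_by_mime_type_automatically = F;   # attach analyzers manually instead
redef Files::reassembly_buffer_size = 4194304;         # tolerate 4 MiB of out-of-order data per file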
|
||||
|
||||
## Allows the file reassembler to be used if it's necessary because the
|
||||
## file is transferred out of order.
|
||||
##
|
||||
## f: the file.
|
||||
global enable_reassembly: function(f: fa_file);
|
||||
|
||||
## Disables the file reassembler on this file. If the file is not
|
||||
## transferred out of order this will have no effect.
|
||||
##
|
||||
## f: the file.
|
||||
global disable_reassembly: function(f: fa_file);
|
||||
|
||||
## Set the maximum size the reassembly buffer is allowed to grow
|
||||
## for the given file.
|
||||
##
|
||||
## f: the file.
|
||||
##
|
||||
## max: Maximum allowed size of the reassembly buffer.
|
||||
global set_reassembly_buffer_size: function(f: fa_file, max: count);
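A minimal per-file sketch of these functions, e.g. switching reassembly off for one source while allowing a bigger buffer elsewhere (the policy and sizes are made up for illustration):

event file_over_new_connection(f: fa_file, c: connection, is_orig: bool)
	{
	if ( f$source == "HTTP" )
		# Hypothetical policy: skip reassembly overhead for HTTP transfers ...
		Files::disable_reassembly(f);
	else
		# ... but tolerate more out-of-order data for everything else.
		Files::set_reassembly_buffer_size(f, 4 * 1048576);
	}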
|
||||
|
||||
## Sets the *timeout_interval* field of :bro:see:`fa_file`, which is
|
||||
## used to determine the length of inactivity that is allowed for a file
|
||||
## before internal state related to it is cleaned up. When used within
|
||||
|
@ -153,15 +185,6 @@ export {
|
|||
tag: Files::Tag,
|
||||
args: AnalyzerArgs &default=AnalyzerArgs()): bool;
|
||||
|
||||
## Adds all analyzers associated with a given MIME type to the analysis of
|
||||
## a file. Note that analyzers added via MIME types cannot take further
|
||||
## arguments.
|
||||
##
|
||||
## f: the file.
|
||||
##
|
||||
## mtype: the MIME type; it will be compared case-insensitively.
|
||||
global add_analyzers_for_mime_type: function(f: fa_file, mtype: string);
|
||||
|
||||
## Removes an analyzer from the analysis of a given file.
|
||||
##
|
||||
## f: the file.
|
||||
|
@ -284,6 +307,7 @@ global registered_protocols: table[Analyzer::Tag] of ProtoRegistration = table()
|
|||
|
||||
# Store the MIME type to analyzer mappings.
|
||||
global mime_types: table[Analyzer::Tag] of set[string];
|
||||
global mime_type_to_analyzers: table[string] of set[Analyzer::Tag];
|
||||
|
||||
global analyzer_add_callbacks: table[Files::Tag] of function(f: fa_file, args: AnalyzerArgs) = table();
|
||||
|
||||
|
@ -313,8 +337,6 @@ function set_info(f: fa_file)
|
|||
f$info$overflow_bytes = f$overflow_bytes;
|
||||
if ( f?$is_orig )
|
||||
f$info$is_orig = f$is_orig;
|
||||
if ( f?$mime_type )
|
||||
f$info$mime_type = f$mime_type;
|
||||
}
|
||||
|
||||
function set_timeout_interval(f: fa_file, t: interval): bool
|
||||
|
@ -322,6 +344,21 @@ function set_timeout_interval(f: fa_file, t: interval): bool
|
|||
return __set_timeout_interval(f$id, t);
|
||||
}
|
||||
|
||||
function enable_reassembly(f: fa_file)
|
||||
{
|
||||
__enable_reassembly(f$id);
|
||||
}
|
||||
|
||||
function disable_reassembly(f: fa_file)
|
||||
{
|
||||
__disable_reassembly(f$id);
|
||||
}
|
||||
|
||||
function set_reassembly_buffer_size(f: fa_file, max: count)
|
||||
{
|
||||
__set_reassembly_buffer(f$id, max);
|
||||
}
|
||||
|
||||
function add_analyzer(f: fa_file, tag: Files::Tag, args: AnalyzerArgs): bool
|
||||
{
|
||||
add f$info$analyzers[Files::analyzer_name(tag)];
|
||||
|
@ -337,15 +374,6 @@ function add_analyzer(f: fa_file, tag: Files::Tag, args: AnalyzerArgs): bool
|
|||
return T;
|
||||
}
|
||||
|
||||
function add_analyzers_for_mime_type(f: fa_file, mtype: string)
|
||||
{
|
||||
local dummy_args: AnalyzerArgs;
|
||||
local analyzers = __add_analyzers_for_mime_type(f$id, mtype, dummy_args);
|
||||
|
||||
for ( tag in analyzers )
|
||||
add f$info$analyzers[Files::analyzer_name(tag)];
|
||||
}
|
||||
|
||||
function register_analyzer_add_callback(tag: Files::Tag, callback: function(f: fa_file, args: AnalyzerArgs))
|
||||
{
|
||||
analyzer_add_callbacks[tag] = callback;
|
||||
|
@ -366,42 +394,6 @@ function analyzer_name(tag: Files::Tag): string
|
|||
return __analyzer_name(tag);
|
||||
}
|
||||
|
||||
event file_new(f: fa_file) &priority=10
|
||||
{
|
||||
set_info(f);
|
||||
|
||||
if ( f?$mime_type )
|
||||
add_analyzers_for_mime_type(f, f$mime_type);
|
||||
}
|
||||
|
||||
event file_over_new_connection(f: fa_file, c: connection, is_orig: bool) &priority=10
|
||||
{
|
||||
set_info(f);
|
||||
add f$info$conn_uids[c$uid];
|
||||
local cid = c$id;
|
||||
add f$info$tx_hosts[f$is_orig ? cid$orig_h : cid$resp_h];
|
||||
if( |Site::local_nets| > 0 )
|
||||
f$info$local_orig=Site::is_local_addr(f$is_orig ? cid$orig_h : cid$resp_h);
|
||||
|
||||
add f$info$rx_hosts[f$is_orig ? cid$resp_h : cid$orig_h];
|
||||
}
|
||||
|
||||
event file_timeout(f: fa_file) &priority=10
|
||||
{
|
||||
set_info(f);
|
||||
f$info$timedout = T;
|
||||
}
|
||||
|
||||
event file_state_remove(f: fa_file) &priority=10
|
||||
{
|
||||
set_info(f);
|
||||
}
|
||||
|
||||
event file_state_remove(f: fa_file) &priority=-10
|
||||
{
|
||||
Log::write(Files::LOG, f$info);
|
||||
}
|
||||
|
||||
function register_protocol(tag: Analyzer::Tag, reg: ProtoRegistration): bool
|
||||
{
|
||||
local result = (tag !in registered_protocols);
|
||||
|
@ -424,13 +416,18 @@ function register_for_mime_types(tag: Analyzer::Tag, mime_types: set[string]) :
|
|||
|
||||
function register_for_mime_type(tag: Analyzer::Tag, mt: string) : bool
|
||||
{
|
||||
if ( ! __register_for_mime_type(tag, mt) )
|
||||
return F;
|
||||
|
||||
if ( tag !in mime_types )
|
||||
{
|
||||
mime_types[tag] = set();
|
||||
|
||||
}
|
||||
add mime_types[tag][mt];
|
||||
|
||||
if ( mt !in mime_type_to_analyzers )
|
||||
{
|
||||
mime_type_to_analyzers[mt] = set();
|
||||
}
|
||||
add mime_type_to_analyzers[mt][tag];
|
||||
|
||||
return T;
|
||||
}
|
||||
|
||||
|
@ -462,3 +459,61 @@ event get_file_handle(tag: Analyzer::Tag, c: connection, is_orig: bool) &priorit
|
|||
local handler = registered_protocols[tag];
|
||||
set_file_handle(handler$get_file_handle(c, is_orig));
|
||||
}
|
||||
|
||||
event file_new(f: fa_file) &priority=10
|
||||
{
|
||||
set_info(f);
|
||||
|
||||
if ( enable_reassembler )
|
||||
{
|
||||
Files::enable_reassembly(f);
|
||||
Files::set_reassembly_buffer_size(f, reassembly_buffer_size);
|
||||
}
|
||||
}
|
||||
|
||||
event file_over_new_connection(f: fa_file, c: connection, is_orig: bool) &priority=10
|
||||
{
|
||||
set_info(f);
|
||||
|
||||
add f$info$conn_uids[c$uid];
|
||||
local cid = c$id;
|
||||
add f$info$tx_hosts[f$is_orig ? cid$orig_h : cid$resp_h];
|
||||
if( |Site::local_nets| > 0 )
|
||||
f$info$local_orig=Site::is_local_addr(f$is_orig ? cid$orig_h : cid$resp_h);
|
||||
|
||||
add f$info$rx_hosts[f$is_orig ? cid$resp_h : cid$orig_h];
|
||||
}
|
||||
|
||||
event file_mime_type(f: fa_file, mime_type: string) &priority=10
|
||||
{
|
||||
set_info(f);
|
||||
|
||||
f$info$mime_type = mime_type;
|
||||
|
||||
if ( analyze_by_mime_type_automatically &&
|
||||
mime_type in mime_type_to_analyzers )
|
||||
{
|
||||
local analyzers = mime_type_to_analyzers[mime_type];
|
||||
for ( a in analyzers )
|
||||
{
|
||||
add f$info$analyzers[Files::analyzer_name(a)];
|
||||
Files::add_analyzer(f, a);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
event file_timeout(f: fa_file) &priority=10
|
||||
{
|
||||
set_info(f);
|
||||
f$info$timedout = T;
|
||||
}
|
||||
|
||||
event file_state_remove(f: fa_file) &priority=10
|
||||
{
|
||||
set_info(f);
|
||||
}
|
||||
|
||||
event file_state_remove(f: fa_file) &priority=-10
|
||||
{
|
||||
Log::write(Files::LOG, f$info);
|
||||
}
|
||||
|
|
|
@ -67,6 +67,7 @@ export {
|
|||
IN_ANYWHERE,
|
||||
};
|
||||
|
||||
## Information about a piece of "seen" data.
|
||||
type Seen: record {
|
||||
## The string if the data is about a string.
|
||||
indicator: string &log &optional;
|
||||
|
@ -124,7 +125,7 @@ export {
|
|||
sources: set[string] &log &default=string_set();
|
||||
};
|
||||
|
||||
## Intelligence data manipulation functions.
|
||||
## Intelligence data manipulation function.
|
||||
global insert: function(item: Item);
|
||||
|
||||
## Function to declare discovery of a piece of data in order to check
|
||||
|
@ -289,8 +290,8 @@ event Intel::match(s: Seen, items: set[Item]) &priority=5
|
|||
if ( ! info?$fuid )
|
||||
info$fuid = s$f$id;
|
||||
|
||||
if ( ! info?$file_mime_type && s$f?$mime_type )
|
||||
info$file_mime_type = s$f$mime_type;
|
||||
if ( ! info?$file_mime_type && s$f?$info && s$f$info?$mime_type )
|
||||
info$file_mime_type = s$f$info$mime_type;
|
||||
|
||||
if ( ! info?$file_desc )
|
||||
info$file_desc = Files::describe(s$f);
|
||||
|
|
|
@ -531,8 +531,8 @@ function create_file_info(f: fa_file): Notice::FileInfo
|
|||
local fi: Notice::FileInfo = Notice::FileInfo($fuid = f$id,
|
||||
$desc = Files::describe(f));
|
||||
|
||||
if ( f?$mime_type )
|
||||
fi$mime = f$mime_type;
|
||||
if ( f?$info && f$info?$mime_type )
|
||||
fi$mime = f$info$mime_type;
|
||||
|
||||
if ( f?$conns && |f$conns| == 1 )
|
||||
for ( id in f$conns )
|
||||
|
|
|
@ -353,9 +353,10 @@ type connection: record {
|
|||
## gives up and discards any internal state related to the file.
|
||||
const default_file_timeout_interval: interval = 2 mins &redef;
|
||||
|
||||
## Default amount of bytes that file analysis will buffer before raising
|
||||
## :bro:see:`file_new`.
|
||||
const default_file_bof_buffer_size: count = 1024 &redef;
|
||||
## Default amount of bytes that file analysis will buffer in order to use
|
||||
## for mime type matching. File analyzers attached at the time of mime type
|
||||
## matching or later will receive a copy of this buffer.
|
||||
const default_file_bof_buffer_size: count = 4096 &redef;
|
||||
|
||||
## A file that Bro is analyzing. This is Bro's type for describing the basic
|
||||
## internal metadata collected about a "file", which is essentially just a
|
||||
|
@ -394,8 +395,10 @@ type fa_file: record {
|
|||
## during the process of analysis e.g. due to dropped packets.
|
||||
missing_bytes: count &default=0;
|
||||
|
||||
## The number of not all-in-sequence bytes in the file stream that
|
||||
## were delivered to file analyzers due to reassembly buffer overflow.
|
||||
## The number of bytes in the file stream that were not delivered to
|
||||
## stream file analyzers. Generally, this consists of bytes that
|
||||
## couldn't be reassembled, either because reassembly simply isn't
|
||||
## enabled, or due to size limitations of the reassembly buffer.
|
||||
overflow_bytes: count &default=0;
|
||||
|
||||
## The amount of time between receiving new data for this file that
|
||||
|
@ -409,16 +412,6 @@ type fa_file: record {
|
|||
## The content of the beginning of a file up to *bof_buffer_size* bytes.
|
||||
## This is also the buffer that's used for file/mime type detection.
|
||||
bof_buffer: string &optional;
|
||||
|
||||
## The mime type of the strongest file magic signature matches against
|
||||
## the data chunk in *bof_buffer*, or in the cases where no buffering
|
||||
## of the beginning of file occurs, an initial guess of the mime type
|
||||
## based on the first data seen.
|
||||
mime_type: string &optional;
|
||||
|
||||
## All mime types that matched file magic signatures against the data
|
||||
## chunk in *bof_buffer*, in order of their strength value.
|
||||
mime_types: mime_matches &optional;
|
||||
} &redef;
|
||||
|
||||
## Fields of a SYN packet.
|
||||
|
|
|
@ -47,6 +47,7 @@
|
|||
@load base/protocols/irc
|
||||
@load base/protocols/krb
|
||||
@load base/protocols/modbus
|
||||
@load base/protocols/mysql
|
||||
@load base/protocols/pop3
|
||||
@load base/protocols/radius
|
||||
@load base/protocols/snmp
|
||||
|
|
|
@ -17,6 +17,10 @@ export {
|
|||
|
||||
## Describe the file being transferred.
|
||||
global describe_file: function(f: fa_file): string;
|
||||
|
||||
redef record fa_file += {
|
||||
ftp: FTP::Info &optional;
|
||||
};
|
||||
}
|
||||
|
||||
function get_file_handle(c: connection, is_orig: bool): string
|
||||
|
@ -48,7 +52,6 @@ event bro_init() &priority=5
|
|||
$describe = FTP::describe_file]);
|
||||
}
|
||||
|
||||
|
||||
event file_over_new_connection(f: fa_file, c: connection, is_orig: bool) &priority=5
|
||||
{
|
||||
if ( [c$id$resp_h, c$id$resp_p] !in ftp_data_expected )
|
||||
|
@ -56,6 +59,14 @@ event file_over_new_connection(f: fa_file, c: connection, is_orig: bool) &priori
|
|||
|
||||
local ftp = ftp_data_expected[c$id$resp_h, c$id$resp_p];
|
||||
ftp$fuid = f$id;
|
||||
if ( f?$mime_type )
|
||||
ftp$mime_type = f$mime_type;
|
||||
|
||||
f$ftp = ftp;
|
||||
}
|
||||
|
||||
event file_mime_type(f: fa_file, mime_type: string) &priority=5
|
||||
{
|
||||
if ( ! f?$ftp )
|
||||
return;
|
||||
|
||||
f$ftp$mime_type = mime_type;
|
||||
}
|
||||
|
|
|
@ -35,6 +35,10 @@ export {
|
|||
## body.
|
||||
resp_mime_depth: count &default=0;
|
||||
};
|
||||
|
||||
redef record fa_file += {
|
||||
http: HTTP::Info &optional;
|
||||
};
|
||||
}
|
||||
|
||||
event http_begin_entity(c: connection, is_orig: bool) &priority=10
|
||||
|
@ -67,6 +71,8 @@ event file_over_new_connection(f: fa_file, c: connection, is_orig: bool) &priori
|
|||
{
|
||||
if ( f$source == "HTTP" && c?$http )
|
||||
{
|
||||
f$http = c$http;
|
||||
|
||||
if ( c$http?$current_entity && c$http$current_entity?$filename )
|
||||
f$info$filename = c$http$current_entity$filename;
|
||||
|
||||
|
@ -76,14 +82,6 @@ event file_over_new_connection(f: fa_file, c: connection, is_orig: bool) &priori
|
|||
c$http$orig_fuids = string_vec(f$id);
|
||||
else
|
||||
c$http$orig_fuids[|c$http$orig_fuids|] = f$id;
|
||||
|
||||
if ( f?$mime_type )
|
||||
{
|
||||
if ( ! c$http?$orig_mime_types )
|
||||
c$http$orig_mime_types = string_vec(f$mime_type);
|
||||
else
|
||||
c$http$orig_mime_types[|c$http$orig_mime_types|] = f$mime_type;
|
||||
}
|
||||
}
|
||||
else
|
||||
{
|
||||
|
@ -91,17 +89,29 @@ event file_over_new_connection(f: fa_file, c: connection, is_orig: bool) &priori
|
|||
c$http$resp_fuids = string_vec(f$id);
|
||||
else
|
||||
c$http$resp_fuids[|c$http$resp_fuids|] = f$id;
|
||||
|
||||
if ( f?$mime_type )
|
||||
{
|
||||
if ( ! c$http?$resp_mime_types )
|
||||
c$http$resp_mime_types = string_vec(f$mime_type);
|
||||
else
|
||||
c$http$resp_mime_types[|c$http$resp_mime_types|] = f$mime_type;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
event file_mime_type(f: fa_file, mime_type: string) &priority=5
|
||||
{
|
||||
if ( ! f?$http || ! f?$is_orig )
|
||||
return;
|
||||
|
||||
if ( f$is_orig )
|
||||
{
|
||||
if ( ! f$http?$orig_mime_types )
|
||||
f$http$orig_mime_types = string_vec(mime_type);
|
||||
else
|
||||
f$http$orig_mime_types[|f$http$orig_mime_types|] = mime_type;
|
||||
}
|
||||
else
|
||||
{
|
||||
if ( ! f$http?$resp_mime_types )
|
||||
f$http$resp_mime_types = string_vec(mime_type);
|
||||
else
|
||||
f$http$resp_mime_types[|f$http$resp_mime_types|] = mime_type;
|
||||
}
|
||||
}
|
||||
|
||||
event http_end_entity(c: connection, is_orig: bool) &priority=5
|
||||
|
|
|
@ -12,6 +12,10 @@ export {
|
|||
|
||||
## Default file handle provider for IRC.
|
||||
global get_file_handle: function(c: connection, is_orig: bool): string;
|
||||
|
||||
redef record fa_file += {
|
||||
irc: IRC::Info &optional;
|
||||
};
|
||||
}
|
||||
|
||||
function get_file_handle(c: connection, is_orig: bool): string
|
||||
|
@ -34,6 +38,12 @@ event file_over_new_connection(f: fa_file, c: connection, is_orig: bool) &priori
|
|||
irc$fuid = f$id;
|
||||
if ( irc?$dcc_file_name )
|
||||
f$info$filename = irc$dcc_file_name;
|
||||
if ( f?$mime_type )
|
||||
irc$dcc_mime_type = f$mime_type;
|
||||
|
||||
f$irc = irc;
|
||||
}
|
||||
|
||||
event file_mime_type(f: fa_file, mime_type: string) &priority=5
|
||||
{
|
||||
if ( f?$irc )
|
||||
f$irc$dcc_mime_type = mime_type;
|
||||
}
|
1
scripts/base/protocols/mysql/__load__.bro
Normal file
|
@ -0,0 +1 @@
|
|||
@load ./main
|
38
scripts/base/protocols/mysql/consts.bro
Normal file
|
@ -0,0 +1,38 @@
|
|||
module MySQL;
|
||||
|
||||
export {
|
||||
const commands: table[count] of string = {
|
||||
[0] = "sleep",
|
||||
[1] = "quit",
|
||||
[2] = "init_db",
|
||||
[3] = "query",
|
||||
[4] = "field_list",
|
||||
[5] = "create_db",
|
||||
[6] = "drop_db",
|
||||
[7] = "refresh",
|
||||
[8] = "shutdown",
|
||||
[9] = "statistics",
|
||||
[10] = "process_info",
|
||||
[11] = "connect",
|
||||
[12] = "process_kill",
|
||||
[13] = "debug",
|
||||
[14] = "ping",
|
||||
[15] = "time",
|
||||
[16] = "delayed_insert",
|
||||
[17] = "change_user",
|
||||
[18] = "binlog_dump",
|
||||
[19] = "table_dump",
|
||||
[20] = "connect_out",
|
||||
[21] = "register_slave",
|
||||
[22] = "stmt_prepare",
|
||||
[23] = "stmt_execute",
|
||||
[24] = "stmt_send_long_data",
|
||||
[25] = "stmt_close",
|
||||
[26] = "stmt_reset",
|
||||
[27] = "set_option",
|
||||
[28] = "stmt_fetch",
|
||||
[29] = "daemon",
|
||||
[30] = "binlog_dump_gtid",
|
||||
[31] = "reset_connection",
|
||||
} &default=function(i: count): string { return fmt("unknown-%d", i); };
|
||||
}
|
116
scripts/base/protocols/mysql/main.bro
Normal file
|
@ -0,0 +1,116 @@
|
|||
##! Implements base functionality for MySQL analysis. Generates the mysql.log file.
|
||||
|
||||
module MySQL;
|
||||
|
||||
@load ./consts
|
||||
|
||||
export {
|
||||
redef enum Log::ID += { mysql::LOG };
|
||||
|
||||
type Info: record {
|
||||
## Timestamp for when the event happened.
|
||||
ts: time &log;
|
||||
## Unique ID for the connection.
|
||||
uid: string &log;
|
||||
## The connection's 4-tuple of endpoint addresses/ports.
|
||||
id: conn_id &log;
|
||||
## The command that was issued.
|
||||
cmd: string &log;
|
||||
## The argument issued to the command.
|
||||
arg: string &log;
|
||||
## The result (error, OK, etc.) from the server.
|
||||
result: string &log &optional;
|
||||
## Server message, if any.
|
||||
response: string &log &optional;
|
||||
};
|
||||
|
||||
## Event that can be handled to access the MySQL record as it is sent on
|
||||
## to the logging framework.
|
||||
global log_mysql: event(rec: Info);
|
||||
}
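A small sketch of consuming the new stream from another script, e.g. to surface server-side errors (illustrative only):

event MySQL::log_mysql(rec: MySQL::Info)
	{
	if ( rec?$result && rec$result == "error" )
		print fmt("MySQL error on %s: %s", rec$uid,
		          rec?$response ? rec$response : "<no message>");
	}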
|
||||
|
||||
redef record connection += {
|
||||
mysql: Info &optional;
|
||||
};
|
||||
|
||||
const ports = { 1434/tcp, 3306/tcp };
|
||||
|
||||
event bro_init() &priority=5
|
||||
{
|
||||
Log::create_stream(mysql::LOG, [$columns=Info, $ev=log_mysql]);
|
||||
Analyzer::register_for_ports(Analyzer::ANALYZER_MYSQL, ports);
|
||||
}
|
||||
|
||||
event mysql_handshake(c: connection, username: string)
|
||||
{
|
||||
if ( ! c?$mysql )
|
||||
{
|
||||
local info: Info;
|
||||
info$ts = network_time();
|
||||
info$uid = c$uid;
|
||||
info$id = c$id;
|
||||
info$cmd = "login";
|
||||
info$arg = username;
|
||||
c$mysql = info;
|
||||
}
|
||||
}
|
||||
|
||||
event mysql_command_request(c: connection, command: count, arg: string) &priority=5
|
||||
{
|
||||
if ( ! c?$mysql )
|
||||
{
|
||||
local info: Info;
|
||||
info$ts = network_time();
|
||||
info$uid = c$uid;
|
||||
info$id = c$id;
|
||||
info$cmd = commands[command];
|
||||
info$arg = sub(arg, /\0$/, "");
|
||||
c$mysql = info;
|
||||
}
|
||||
}
|
||||
|
||||
event mysql_command_request(c: connection, command: count, arg: string) &priority=-5
|
||||
{
|
||||
if ( c?$mysql && c$mysql?$cmd && c$mysql$cmd == "quit" )
|
||||
{
|
||||
# We get no response for quits, so let's just log it now.
|
||||
Log::write(mysql::LOG, c$mysql);
|
||||
delete c$mysql;
|
||||
}
|
||||
}
|
||||
|
||||
event mysql_error(c: connection, code: count, msg: string) &priority=5
|
||||
{
|
||||
if ( c?$mysql )
|
||||
{
|
||||
c$mysql$result = "error";
|
||||
c$mysql$response = msg;
|
||||
}
|
||||
}
|
||||
|
||||
event mysql_error(c: connection, code: count, msg: string) &priority=-5
|
||||
{
|
||||
if ( c?$mysql )
|
||||
{
|
||||
Log::write(mysql::LOG, c$mysql);
|
||||
delete c$mysql;
|
||||
}
|
||||
}
|
||||
|
||||
event mysql_ok(c: connection, affected_rows: count) &priority=5
|
||||
{
|
||||
if ( c?$mysql )
|
||||
{
|
||||
c$mysql$result = "ok";
|
||||
c$mysql$response = fmt("Affected rows: %d", affected_rows);
|
||||
}
|
||||
}
|
||||
|
||||
event mysql_ok(c: connection, affected_rows: count) &priority=-5
|
||||
{
|
||||
if ( c?$mysql )
|
||||
{
|
||||
Log::write(mysql::LOG, c$mysql);
|
||||
delete c$mysql;
|
||||
}
|
||||
}
|
|
@ -96,8 +96,9 @@ event Exec::file_line(description: Input::EventDescription, tpe: Input::Event, s
|
|||
result$files[track_file][|result$files[track_file]|] = s;
|
||||
}
|
||||
|
||||
event Input::end_of_data(name: string, source:string)
|
||||
event Input::end_of_data(orig_name: string, source:string)
|
||||
{
|
||||
local name = orig_name;
|
||||
local parts = split1(name, /_/);
|
||||
name = parts[1];
|
||||
|
||||
|
|
|
@ -3,6 +3,28 @@
|
|||
## A regular expression for matching and extracting URLs.
|
||||
const url_regex = /^([a-zA-Z\-]{3,5})(:\/\/[^\/?#"'\r\n><]*)([^?#"'\r\n><]*)([^[:blank:]\r\n"'><]*|\??[^"'\r\n><]*)/ &redef;
|
||||
|
||||
## A URI, as parsed by :bro:id:`decompose_uri`.
|
||||
type URI: record {
|
||||
## The URL's scheme.
|
||||
scheme: string &optional;
|
||||
## The location, which could be a domain name or an IP address. Left empty if not
|
||||
## specified.
|
||||
netlocation: string;
|
||||
## Port number, if included in URI.
|
||||
portnum: count &optional;
|
||||
## Full path, including the file name. Will be '/' if no path is given.
|
||||
path: string;
|
||||
## Full file name, including extension, if there is a file name.
|
||||
file_name: string &optional;
|
||||
## The base filename, without extension, if there is a file name.
|
||||
file_base: string &optional;
|
||||
## The filename's extension, if there is a file name.
|
||||
file_ext: string &optional;
|
||||
## A table of all query parameters, mapping their keys to values, if there's a
|
||||
## query.
|
||||
params: table[string] of string &optional;
|
||||
};
|
||||
|
||||
## Extracts URLs discovered in arbitrary text.
|
||||
function find_all_urls(s: string): string_set
|
||||
{
|
||||
|
@ -23,3 +45,84 @@ function find_all_urls_without_scheme(s: string): string_set
|
|||
|
||||
return return_urls;
|
||||
}
|
||||
|
||||
function decompose_uri(s: string): URI
|
||||
{
|
||||
local parts: string_array;
|
||||
local u: URI = [$netlocation="", $path="/"];
|
||||
|
||||
if ( /\?/ in s)
|
||||
{
|
||||
# Parse query.
|
||||
u$params = table();
|
||||
|
||||
parts = split1(s, /\?/);
|
||||
s = parts[1];
|
||||
local query: string = parts[2];
|
||||
|
||||
if ( /&/ in query )
|
||||
{
|
||||
local opv: table[count] of string = split(query, /&/);
|
||||
|
||||
for ( each in opv )
|
||||
{
|
||||
if ( /=/ in opv[each] )
|
||||
{
|
||||
parts = split1(opv[each], /=/);
|
||||
u$params[parts[1]] = parts[2];
|
||||
}
|
||||
}
|
||||
}
|
||||
else
|
||||
{
|
||||
parts = split1(query, /=/);
|
||||
u$params[parts[1]] = parts[2];
|
||||
}
|
||||
}
|
||||
|
||||
if ( /:\/\// in s )
|
||||
{
|
||||
# Parse scheme and remove from s.
|
||||
parts = split1(s, /:\/\//);
|
||||
u$scheme = parts[1];
|
||||
s = parts[2];
|
||||
}
|
||||
|
||||
if ( /\// in s )
|
||||
{
|
||||
# Parse path and remove from s.
|
||||
parts = split1(s, /\//);
|
||||
s = parts[1];
|
||||
u$path = fmt("/%s", parts[2]);
|
||||
|
||||
if ( |u$path| > 1 && u$path[|u$path| - 1] != "/" )
|
||||
{
|
||||
local last_token: string = find_last(u$path, /\/.+/);
|
||||
local full_filename = split1(last_token, /\//)[2];
|
||||
|
||||
if ( /\./ in full_filename )
|
||||
{
|
||||
u$file_name = full_filename;
|
||||
u$file_base = split1(full_filename, /\./)[1];
|
||||
u$file_ext = split1(full_filename, /\./)[2];
|
||||
}
|
||||
else
|
||||
{
|
||||
u$file_name = full_filename;
|
||||
u$file_base = full_filename;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if ( /:/ in s )
|
||||
{
|
||||
# Parse location and port.
|
||||
parts = split1(s, /:/);
|
||||
u$netlocation = parts[1];
|
||||
u$portnum = to_count(parts[2]);
|
||||
}
|
||||
else
|
||||
u$netlocation = s;
|
||||
|
||||
return u;
|
||||
}
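For illustration, the fields decompose_uri is expected to fill in for a typical URL, following the parsing steps above (values inferred from the logic, not taken from a test baseline):

event bro_init()
	{
	local u = decompose_uri("http://example.com:8080/dir/page.html?x=1&y=2");
	# Expected per the logic above:
	#   u$scheme      == "http"
	#   u$netlocation == "example.com"     u$portnum   == 8080
	#   u$path        == "/dir/page.html"
	#   u$file_name   == "page.html"       u$file_base == "page"   u$file_ext == "html"
	#   u$params["x"] == "1"               u$params["y"] == "2"
	print u;
	}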
|
||||
|
|
|
@ -66,6 +66,7 @@ function do_mhr_lookup(hash: string, fi: Notice::FileInfo)
|
|||
|
||||
event file_hash(f: fa_file, kind: string, hash: string)
|
||||
{
|
||||
if ( kind == "sha1" && f?$mime_type && match_file_types in f$mime_type )
|
||||
if ( kind == "sha1" && f?$info && f$info?$mime_type &&
|
||||
match_file_types in f$info$mime_type )
|
||||
do_mhr_lookup(hash, Notice::create_file_info(f));
|
||||
}
|
||||
|
|
|
@ -4,6 +4,7 @@
|
|||
##!
|
||||
##! This script will log the version of Windows that was identified to the Software framework.
|
||||
|
||||
@load base/protocols/http
|
||||
@load base/frameworks/software
|
||||
|
||||
module OS;
|
||||
|
@ -46,6 +47,7 @@ export {
|
|||
["Microsoft-CryptoAPI/6.1"] = [$name="Windows", $version=[$major=6, $minor=1, $addl="7 or Server 2008 R2"]],
|
||||
["Microsoft-CryptoAPI/6.2"] = [$name="Windows", $version=[$major=6, $minor=2, $addl="8 or Server 2012"]],
|
||||
["Microsoft-CryptoAPI/6.3"] = [$name="Windows", $version=[$major=6, $minor=3, $addl="8.1 or Server 2012 R2"]],
|
||||
["Microsoft-CryptoAPI/6.4"] = [$name="Windows", $version=[$major=6, $minor=4, $addl="10 Technical Preview"]],
|
||||
} &redef;
|
||||
}
|
||||
|
||||
|
|
20
scripts/policy/protocols/mysql/software.bro
Normal file
|
@ -0,0 +1,20 @@
|
|||
##! Software identification and extraction for MySQL traffic.
|
||||
|
||||
@load base/frameworks/software
|
||||
|
||||
module MySQL;
|
||||
|
||||
export {
|
||||
redef enum Software::Type += {
|
||||
## Identifier for MySQL servers in the software framework.
|
||||
SERVER,
|
||||
};
|
||||
}
|
||||
|
||||
event mysql_server_version(c: connection, ver: string)
|
||||
{
|
||||
if ( ver == "" )
|
||||
return;
|
||||
|
||||
Software::found(c$id, [$unparsed_version=ver, $host=c$id$resp_h, $software_type=SERVER]);
|
||||
}
|
|
@ -65,7 +65,7 @@ event ssl_dh_server_params(c: connection, p: string, q: string, Ys: string) &pri
|
|||
if ( ! addr_matches_host(c$id$resp_h, notify_weak_keys) )
|
||||
return;
|
||||
|
||||
local key_length = |Ys| * 8; # key length in bits
|
||||
local key_length = |p| * 8; # length of the used prime number in bits
|
||||
|
||||
if ( key_length < notify_minimal_key_length )
|
||||
NOTICE([$note=Weak_Key,
|
||||
|
|
|
@ -32,6 +32,7 @@
|
|||
@load frameworks/packet-filter/shunt.bro
|
||||
@load frameworks/software/version-changes.bro
|
||||
@load frameworks/software/vulnerable.bro
|
||||
@load frameworks/software/windows-version-detection.bro
|
||||
@load integration/barnyard2/__load__.bro
|
||||
@load integration/barnyard2/main.bro
|
||||
@load integration/barnyard2/types.bro
|
||||
|
@ -75,6 +76,7 @@
|
|||
@load protocols/http/var-extraction-uri.bro
|
||||
@load protocols/modbus/known-masters-slaves.bro
|
||||
@load protocols/modbus/track-memmap.bro
|
||||
@load protocols/mysql/software.bro
|
||||
@load protocols/smtp/blocklists.bro
|
||||
@load protocols/smtp/detect-suspicious-orig.bro
|
||||
@load protocols/smtp/entities-excerpt.bro
|
||||
|
|
11
src/Attr.cc
|
@ -265,6 +265,14 @@ void Attributes::CheckAttr(Attr* a)
|
|||
// Ok.
|
||||
break;
|
||||
|
||||
Expr* e = a->AttrExpr();
|
||||
if ( check_and_promote_expr(e, type) )
|
||||
{
|
||||
a->SetAttrExpr(e);
|
||||
// Ok.
|
||||
break;
|
||||
}
|
||||
|
||||
a->AttrExpr()->Error("&default value has inconsistent type", type);
|
||||
}
|
||||
|
||||
|
@ -297,8 +305,11 @@ void Attributes::CheckAttr(Attr* a)
|
|||
|
||||
Expr* e = a->AttrExpr();
|
||||
if ( check_and_promote_expr(e, ytype) )
|
||||
{
|
||||
a->SetAttrExpr(e);
|
||||
// Ok.
|
||||
break;
|
||||
}
|
||||
|
||||
Error("&default value has inconsistent type 2");
|
||||
}
|
||||
|
|
|
@ -45,6 +45,13 @@ public:
|
|||
attr_tag Tag() const { return tag; }
|
||||
Expr* AttrExpr() const { return expr; }
|
||||
|
||||
// Up to the caller to decide if previous expr can be unref'd since it may
|
||||
// not always be safe; e.g. expressions (at time of writing) don't always
|
||||
// keep careful track of referencing their operands, so doing something
|
||||
// like SetAttrExpr(coerce(AttrExpr())) must not completely unref the
|
||||
// previous expr as the new expr depends on it.
|
||||
void SetAttrExpr(Expr* e) { expr = e; }
|
||||
|
||||
int RedundantAttrOkay() const
|
||||
{ return tag == ATTR_REDEF || tag == ATTR_OPTIONAL; }
|
||||
|
||||
|
|
|
@ -48,7 +48,7 @@ set(BISON_FLAGS "--debug")
|
|||
bison_target(BIFParser builtin-func.y
|
||||
${CMAKE_CURRENT_BINARY_DIR}/bif_parse.cc
|
||||
HEADER ${CMAKE_CURRENT_BINARY_DIR}/bif_parse.h
|
||||
VERBOSE ${CMAKE_CURRENT_BINARY_DIR}/bif_parse.output
|
||||
#VERBOSE ${CMAKE_CURRENT_BINARY_DIR}/bif_parse.output
|
||||
COMPILE_FLAGS "${BISON_FLAGS}")
|
||||
flex_target(BIFScanner builtin-func.l ${CMAKE_CURRENT_BINARY_DIR}/bif_lex.cc)
|
||||
add_flex_bison_dependency(BIFScanner BIFParser)
|
||||
|
@ -57,7 +57,7 @@ add_flex_bison_dependency(BIFScanner BIFParser)
|
|||
bison_target(RuleParser rule-parse.y
|
||||
${CMAKE_CURRENT_BINARY_DIR}/rup.cc
|
||||
HEADER ${CMAKE_CURRENT_BINARY_DIR}/rup.h
|
||||
VERBOSE ${CMAKE_CURRENT_BINARY_DIR}/rule_parse.output
|
||||
#VERBOSE ${CMAKE_CURRENT_BINARY_DIR}/rule_parse.output
|
||||
COMPILE_FLAGS "${BISON_FLAGS}")
|
||||
replace_yy_prefix_target(${CMAKE_CURRENT_BINARY_DIR}/rup.cc
|
||||
${CMAKE_CURRENT_BINARY_DIR}/rule-parse.cc
|
||||
|
@ -72,7 +72,7 @@ flex_target(RuleScanner rule-scan.l ${CMAKE_CURRENT_BINARY_DIR}/rule-scan.cc
|
|||
bison_target(REParser re-parse.y
|
||||
${CMAKE_CURRENT_BINARY_DIR}/rep.cc
|
||||
HEADER ${CMAKE_CURRENT_BINARY_DIR}/re-parse.h
|
||||
VERBOSE ${CMAKE_CURRENT_BINARY_DIR}/re_parse.output
|
||||
#VERBOSE ${CMAKE_CURRENT_BINARY_DIR}/re_parse.output
|
||||
COMPILE_FLAGS "${BISON_FLAGS}")
|
||||
replace_yy_prefix_target(${CMAKE_CURRENT_BINARY_DIR}/rep.cc
|
||||
${CMAKE_CURRENT_BINARY_DIR}/re-parse.cc
|
||||
|
@ -85,7 +85,7 @@ add_flex_bison_dependency(REScanner REParser)
|
|||
bison_target(Parser parse.y
|
||||
${CMAKE_CURRENT_BINARY_DIR}/p.cc
|
||||
HEADER ${CMAKE_CURRENT_BINARY_DIR}/broparse.h
|
||||
VERBOSE ${CMAKE_CURRENT_BINARY_DIR}/parse.output
|
||||
#VERBOSE ${CMAKE_CURRENT_BINARY_DIR}/parse.output
|
||||
COMPILE_FLAGS "${BISON_FLAGS}")
|
||||
replace_yy_prefix_target(${CMAKE_CURRENT_BINARY_DIR}/p.cc
|
||||
${CMAKE_CURRENT_BINARY_DIR}/parse.cc
|
||||
|
|
|
@ -1438,7 +1438,7 @@ bool AddExpr::DoUnserialize(UnserialInfo* info)
|
|||
}
|
||||
|
||||
AddToExpr::AddToExpr(Expr* arg_op1, Expr* arg_op2)
|
||||
: BinaryExpr(EXPR_ADD_TO, arg_op1, arg_op2)
|
||||
: BinaryExpr(EXPR_ADD_TO, arg_op1->MakeLvalue(), arg_op2)
|
||||
{
|
||||
if ( IsError() )
|
||||
return;
|
||||
|
@ -1562,7 +1562,7 @@ bool SubExpr::DoUnserialize(UnserialInfo* info)
|
|||
}
|
||||
|
||||
RemoveFromExpr::RemoveFromExpr(Expr* arg_op1, Expr* arg_op2)
|
||||
: BinaryExpr(EXPR_REMOVE_FROM, arg_op1, arg_op2)
|
||||
: BinaryExpr(EXPR_REMOVE_FROM, arg_op1->MakeLvalue(), arg_op2)
|
||||
{
|
||||
if ( IsError() )
|
||||
return;
|
||||
|
|
|
@ -28,7 +28,7 @@ void FragTimer::Dispatch(double t, int /* is_expire */)
|
|||
FragReassembler::FragReassembler(NetSessions* arg_s,
|
||||
const IP_Hdr* ip, const u_char* pkt,
|
||||
HashKey* k, double t)
|
||||
: Reassembler(0, REASSEM_IP)
|
||||
: Reassembler(0)
|
||||
{
|
||||
s = arg_s;
|
||||
key = k;
|
||||
|
|
48
src/IP.cc
|
@ -636,3 +636,51 @@ VectorVal* IPv6_Hdr_Chain::BuildVal() const
|
|||
|
||||
return rval;
|
||||
}
|
||||
|
||||
IP_Hdr* IP_Hdr::Copy() const
|
||||
{
|
||||
char* new_hdr = new char[HdrLen()];
|
||||
|
||||
if ( ip4 )
|
||||
{
|
||||
memcpy(new_hdr, ip4, HdrLen());
|
||||
return new IP_Hdr((const struct ip*) new_hdr, true);
|
||||
}
|
||||
|
||||
memcpy(new_hdr, ip6, HdrLen());
|
||||
const struct ip6_hdr* new_ip6 = (const struct ip6_hdr*)new_hdr;
|
||||
IPv6_Hdr_Chain* new_ip6_hdrs = ip6_hdrs->Copy(new_ip6);
|
||||
return new IP_Hdr(new_ip6, true, 0, new_ip6_hdrs);
|
||||
}
|
||||
|
||||
IPv6_Hdr_Chain* IPv6_Hdr_Chain::Copy(const ip6_hdr* new_hdr) const
|
||||
{
|
||||
IPv6_Hdr_Chain* rval = new IPv6_Hdr_Chain;
|
||||
rval->length = length;
|
||||
|
||||
#ifdef ENABLE_MOBILE_IPV6
|
||||
if ( homeAddr )
|
||||
rval->homeAddr = new IPAddr(*homeAddr);
|
||||
#endif
|
||||
|
||||
if ( finalDst )
|
||||
rval->finalDst = new IPAddr(*finalDst);
|
||||
|
||||
if ( chain.empty() )
|
||||
{
|
||||
reporter->InternalWarning("empty IPv6 header chain");
|
||||
delete rval;
|
||||
return 0;
|
||||
}
|
||||
|
||||
const u_char* new_data = (const u_char*)new_hdr;
|
||||
const u_char* old_data = chain[0]->Data();
|
||||
|
||||
for ( size_t i = 0; i < chain.size(); ++i )
|
||||
{
|
||||
int off = chain[i]->Data() - old_data;
|
||||
rval->chain.push_back(new IPv6_Hdr(chain[i]->Type(), new_data + off));
|
||||
}
|
||||
|
||||
return rval;
|
||||
}
|
||||
|
|
22
src/IP.h
|
@ -157,6 +157,12 @@ public:
|
|||
delete finalDst;
|
||||
}
|
||||
|
||||
/**
|
||||
* @return a copy of the header chain, but with pointers to individual
|
||||
* IPv6 headers now pointing within \a new_hdr.
|
||||
*/
|
||||
IPv6_Hdr_Chain* Copy(const struct ip6_hdr* new_hdr) const;
|
||||
|
||||
/**
|
||||
* Returns the number of headers in the chain.
|
||||
*/
|
||||
|
@ -264,6 +270,14 @@ protected:
|
|||
// point to a fragment
|
||||
friend class FragReassembler;
|
||||
|
||||
IPv6_Hdr_Chain() :
|
||||
length(0),
|
||||
#ifdef ENABLE_MOBILE_IPV6
|
||||
homeAddr(0),
|
||||
#endif
|
||||
finalDst(0)
|
||||
{}
|
||||
|
||||
/**
|
||||
* Initializes the header chain from an IPv6 header structure, and replaces
|
||||
* the first next protocol pointer field that points to a fragment header.
|
||||
|
@ -353,6 +367,13 @@ public:
|
|||
{
|
||||
}
|
||||
|
||||
/**
|
||||
* Copy a header. The internal buffer which contains the header data
|
||||
* must not be truncated. Also note that if that buffer points to a full
|
||||
* packet payload, only the IP header portion is copied.
|
||||
*/
|
||||
IP_Hdr* Copy() const;
|
||||
|
||||
/**
|
||||
* Destructor.
|
||||
*/
|
||||
|
@ -554,6 +575,7 @@ public:
|
|||
RecordVal* BuildPktHdrVal() const;
|
||||
|
||||
private:
|
||||
|
||||
const struct ip* ip4;
|
||||
const struct ip6_hdr* ip6;
|
||||
bool del;
|
||||
|
|
|
@ -31,7 +31,7 @@ DataBlock::DataBlock(const u_char* data, uint64 size, uint64 arg_seq,
|
|||
|
||||
uint64 Reassembler::total_size = 0;
|
||||
|
||||
Reassembler::Reassembler(uint64 init_seq, ReassemblerType arg_type)
|
||||
Reassembler::Reassembler(uint64 init_seq)
|
||||
{
|
||||
blocks = last_block = 0;
|
||||
trim_seq = last_reassem_seq = init_seq;
|
||||
|
|
|
@ -22,11 +22,10 @@ public:
|
|||
};
|
||||
|
||||
|
||||
enum ReassemblerType { REASSEM_IP, REASSEM_TCP };
|
||||
|
||||
class Reassembler : public BroObj {
|
||||
public:
|
||||
Reassembler(uint64 init_seq, ReassemblerType arg_type);
|
||||
Reassembler(uint64 init_seq);
|
||||
virtual ~Reassembler();
|
||||
|
||||
void NewBlock(double t, uint64 seq, uint64 len, const u_char* data);
|
||||
|
|
|
@ -87,6 +87,7 @@ SERIAL_TCP_CONTENTS(TCP_NVT, 3)
|
|||
#define SERIAL_REASSEMBLER(name, val) SERIAL_CONST(name, val, REASSEMBLER)
|
||||
SERIAL_REASSEMBLER(REASSEMBLER, 1)
|
||||
SERIAL_REASSEMBLER(TCP_REASSEMBLER, 2)
|
||||
SERIAL_REASSEMBLER(FILE_REASSEMBLER, 3)
|
||||
|
||||
#define SERIAL_VAL(name, val) SERIAL_CONST(name, val, VAL)
|
||||
SERIAL_VAL(VAL, 1)
|
||||
|
|
|
@ -22,6 +22,7 @@ add_subdirectory(krb)
|
|||
add_subdirectory(login)
|
||||
add_subdirectory(mime)
|
||||
add_subdirectory(modbus)
|
||||
add_subdirectory(mysql)
|
||||
add_subdirectory(ncp)
|
||||
add_subdirectory(netbios)
|
||||
add_subdirectory(netflow)
|
||||
|
|
10
src/analyzer/protocol/mysql/CMakeLists.txt
Normal file
|
@ -0,0 +1,10 @@
|
|||
|
||||
include(BroPlugin)
|
||||
|
||||
include_directories(BEFORE ${CMAKE_CURRENT_SOURCE_DIR} ${CMAKE_CURRENT_BINARY_DIR})
|
||||
|
||||
bro_plugin_begin(Bro MySQL)
|
||||
bro_plugin_cc(MySQL.cc Plugin.cc)
|
||||
bro_plugin_bif(events.bif)
|
||||
bro_plugin_pac(mysql.pac mysql-analyzer.pac mysql-protocol.pac)
|
||||
bro_plugin_end()
|
65
src/analyzer/protocol/mysql/MySQL.cc
Normal file
|
@ -0,0 +1,65 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#include "MySQL.h"
|
||||
#include "analyzer/protocol/tcp/TCP_Reassembler.h"
|
||||
#include "Reporter.h"
|
||||
#include "events.bif.h"
|
||||
|
||||
using namespace analyzer::MySQL;
|
||||
|
||||
MySQL_Analyzer::MySQL_Analyzer(Connection* c)
|
||||
: tcp::TCP_ApplicationAnalyzer("MySQL", c)
|
||||
{
|
||||
interp = new binpac::MySQL::MySQL_Conn(this);
|
||||
had_gap = false;
|
||||
}
|
||||
|
||||
MySQL_Analyzer::~MySQL_Analyzer()
|
||||
{
|
||||
delete interp;
|
||||
}
|
||||
|
||||
void MySQL_Analyzer::Done()
|
||||
{
|
||||
tcp::TCP_ApplicationAnalyzer::Done();
|
||||
|
||||
interp->FlowEOF(true);
|
||||
interp->FlowEOF(false);
|
||||
}
|
||||
|
||||
void MySQL_Analyzer::EndpointEOF(bool is_orig)
|
||||
{
|
||||
tcp::TCP_ApplicationAnalyzer::EndpointEOF(is_orig);
|
||||
interp->FlowEOF(is_orig);
|
||||
}
|
||||
|
||||
void MySQL_Analyzer::DeliverStream(int len, const u_char* data, bool orig)
|
||||
{
|
||||
tcp::TCP_ApplicationAnalyzer::DeliverStream(len, data, orig);
|
||||
|
||||
assert(TCP());
|
||||
if ( TCP()->IsPartial() )
|
||||
return;
|
||||
|
||||
if ( had_gap )
|
||||
// If only one side had a content gap, we could still try to
|
||||
// deliver data to the other side if the script layer can
|
||||
// handle this.
|
||||
return;
|
||||
|
||||
try
|
||||
{
|
||||
interp->NewData(orig, data, data + len);
|
||||
}
|
||||
catch ( const binpac::Exception& e )
|
||||
{
|
||||
reporter->Weird(e.msg().c_str());
|
||||
}
|
||||
}
|
||||
|
||||
void MySQL_Analyzer::Undelivered(uint64 seq, int len, bool orig)
|
||||
{
|
||||
tcp::TCP_ApplicationAnalyzer::Undelivered(seq, len, orig);
|
||||
had_gap = true;
|
||||
interp->NewGap(orig, len);
|
||||
}
|
40
src/analyzer/protocol/mysql/MySQL.h
Normal file
|
@ -0,0 +1,40 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#ifndef ANALYZER_PROTOCOL_MYSQL_MYSQL_H
|
||||
#define ANALYZER_PROTOCOL_MYSQL_MYSQL_H
|
||||
|
||||
#include "events.bif.h"
|
||||
#include "analyzer/protocol/tcp/TCP.h"
|
||||
|
||||
#include "mysql_pac.h"
|
||||
|
||||
namespace analyzer { namespace MySQL {
|
||||
|
||||
class MySQL_Analyzer
|
||||
|
||||
: public tcp::TCP_ApplicationAnalyzer {
|
||||
|
||||
public:
|
||||
MySQL_Analyzer(Connection* conn);
|
||||
virtual ~MySQL_Analyzer();
|
||||
|
||||
// Overridden from Analyzer.
|
||||
virtual void Done();
|
||||
|
||||
virtual void DeliverStream(int len, const u_char* data, bool orig);
|
||||
virtual void Undelivered(uint64 seq, int len, bool orig);
|
||||
|
||||
// Overridden from tcp::TCP_ApplicationAnalyzer.
|
||||
virtual void EndpointEOF(bool is_orig);
|
||||
|
||||
static analyzer::Analyzer* Instantiate(Connection* conn)
|
||||
{ return new MySQL_Analyzer(conn); }
|
||||
|
||||
protected:
|
||||
binpac::MySQL::MySQL_Conn* interp;
|
||||
bool had_gap;
|
||||
};
|
||||
|
||||
} } // namespace analyzer::*
|
||||
|
||||
#endif
|
21
src/analyzer/protocol/mysql/Plugin.cc
Normal file
|
@ -0,0 +1,21 @@
|
|||
// See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
#include "plugin/Plugin.h"
|
||||
|
||||
#include "MySQL.h"
|
||||
|
||||
namespace plugin {
|
||||
namespace Bro_MySQL {
|
||||
class Plugin : public plugin::Plugin {
|
||||
public:
|
||||
plugin::Configuration Configure()
|
||||
{
|
||||
AddComponent(new ::analyzer::Component("MySQL", ::analyzer::MySQL::MySQL_Analyzer::Instantiate));
|
||||
plugin::Configuration config;
|
||||
config.name = "Bro::MySQL";
|
||||
config.description = "MySQL analyzer";
|
||||
return config;
|
||||
}
|
||||
} plugin;
|
||||
}
|
||||
}
|
65
src/analyzer/protocol/mysql/events.bif
Normal file
|
@ -0,0 +1,65 @@
|
|||
## Generated for a command request from a MySQL client.
|
||||
##
|
||||
## See the MySQL `documentation <http://dev.mysql.com/doc/internals/en/client-server-protocol.html>`__
|
||||
## for more information about the MySQL protocol.
|
||||
##
|
||||
## c: The connection.
|
||||
##
|
||||
## command: The numerical code of the command issued.
|
||||
##
|
||||
## arg: The argument for the command (empty string if not provided).
|
||||
##
|
||||
## .. bro:see:: mysql_error mysql_ok mysql_server_version mysql_handshake_response
|
||||
event mysql_command_request%(c: connection, command: count, arg: string%);
|
||||
|
||||
## Generated for an unsuccessful MySQL response.
|
||||
##
|
||||
## See the MySQL `documentation <http://dev.mysql.com/doc/internals/en/client-server-protocol.html>`__
|
||||
## for more information about the MySQL protocol.
|
||||
##
|
||||
## c: The connection.
|
||||
##
|
||||
## code: The error code.
|
||||
##
|
||||
## msg: Any extra details about the error (empty string if not provided).
|
||||
##
|
||||
## .. bro:see:: mysql_command_request mysql_ok mysql_server_version mysql_handshake
|
||||
event mysql_error%(c: connection, code: count, msg: string%);
|
||||
|
||||
## Generated for a successful MySQL response.
|
||||
##
|
||||
## See the MySQL `documentation <http://dev.mysql.com/doc/internals/en/client-server-protocol.html>`__
|
||||
## for more information about the MySQL protocol.
|
||||
##
|
||||
## c: The connection.
|
||||
##
|
||||
## affected_rows: The number of rows that were affected.
|
||||
##
|
||||
## .. bro:see:: mysql_command_request mysql_error mysql_server_version mysql_handshake
|
||||
event mysql_ok%(c: connection, affected_rows: count%);
|
||||
|
||||
## Generated for the initial server handshake packet, which includes the MySQL server version.
|
||||
##
|
||||
## See the MySQL `documentation <http://dev.mysql.com/doc/internals/en/client-server-protocol.html>`__
|
||||
## for more information about the MySQL protocol.
|
||||
##
|
||||
## c: The connection.
|
||||
##
|
||||
## ver: The server version string.
|
||||
##
|
||||
## .. bro:see:: mysql_command_request mysql_error mysql_ok mysql_handshake
|
||||
event mysql_server_version%(c: connection, ver: string%);
|
||||
|
||||
## Generated for a client handshake response packet, which includes the username the client is attempting
|
||||
## to connect as.
|
||||
##
|
||||
## See the MySQL `documentation <http://dev.mysql.com/doc/internals/en/client-server-protocol.html>`__
|
||||
## for more information about the MySQL protocol.
|
||||
##
|
||||
## c: The connection.
|
||||
##
|
||||
## username: The username supplied by the client.
|
||||
##
|
||||
## .. bro:see:: mysql_command_request mysql_error mysql_ok mysql_server_version
|
||||
event mysql_handshake%(c: connection, username: string%);
|
||||
|
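The five events above are the script-layer interface the new MySQL analyzer exposes. A minimal, hypothetical handler sketch (not part of this commit; command code 3 corresponds to COM_QUERY in the enum defined in mysql-protocol.pac):

# Hypothetical usage sketch for the MySQL events (illustrative only).
event mysql_command_request(c: connection, command: count, arg: string)
	{
	if ( command == 3 )  # COM_QUERY
		print fmt("%s issued query: %s", c$id$orig_h, arg);
	}

event mysql_error(c: connection, code: count, msg: string)
	{
	print fmt("MySQL error %d from %s: %s", code, c$id$resp_h, msg);
	}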
98
src/analyzer/protocol/mysql/mysql-analyzer.pac
Normal file
|
@ -0,0 +1,98 @@
|
|||
# See the file "COPYING" in the main distribution directory for copyright.
|
||||
|
||||
refine flow MySQL_Flow += {
|
||||
function proc_mysql_initial_handshake_packet(msg: Initial_Handshake_Packet): bool
|
||||
%{
|
||||
if ( mysql_server_version )
|
||||
{
|
||||
if ( ${msg.version} == 10 )
|
||||
BifEvent::generate_mysql_server_version(connection()->bro_analyzer(),
|
||||
connection()->bro_analyzer()->Conn(),
|
||||
bytestring_to_val(${msg.handshake10.server_version}));
|
||||
if ( ${msg.version} == 9 )
|
||||
BifEvent::generate_mysql_server_version(connection()->bro_analyzer(),
|
||||
connection()->bro_analyzer()->Conn(),
|
||||
bytestring_to_val(${msg.handshake9.server_version}));
|
||||
}
|
||||
return true;
|
||||
%}
|
||||
|
||||
function proc_mysql_handshake_response_packet(msg: Handshake_Response_Packet): bool
|
||||
%{
|
||||
if ( mysql_handshake )
|
||||
{
|
||||
if ( ${msg.version} == 10 )
|
||||
BifEvent::generate_mysql_handshake(connection()->bro_analyzer(),
|
||||
connection()->bro_analyzer()->Conn(),
|
||||
bytestring_to_val(${msg.v10_response.username}));
|
||||
if ( ${msg.version} == 9 )
|
||||
BifEvent::generate_mysql_handshake(connection()->bro_analyzer(),
|
||||
connection()->bro_analyzer()->Conn(),
|
||||
bytestring_to_val(${msg.v9_response.username}));
|
||||
}
|
||||
return true;
|
||||
%}
|
||||
|
||||
function proc_mysql_command_request_packet(msg: Command_Request_Packet): bool
|
||||
%{
|
||||
if ( mysql_command_request )
|
||||
BifEvent::generate_mysql_command_request(connection()->bro_analyzer(),
|
||||
connection()->bro_analyzer()->Conn(),
|
||||
${msg.command},
|
||||
bytestring_to_val(${msg.arg}));
|
||||
return true;
|
||||
%}
|
||||
|
||||
function proc_err_packet(msg: ERR_Packet): bool
|
||||
%{
|
||||
if ( mysql_error )
|
||||
BifEvent::generate_mysql_error(connection()->bro_analyzer(),
|
||||
connection()->bro_analyzer()->Conn(),
|
||||
${msg.code},
|
||||
bytestring_to_val(${msg.msg}));
|
||||
return true;
|
||||
%}
|
||||
|
||||
function proc_ok_packet(msg: OK_Packet): bool
|
||||
%{
|
||||
if ( mysql_ok )
|
||||
BifEvent::generate_mysql_ok(connection()->bro_analyzer(),
|
||||
connection()->bro_analyzer()->Conn(),
|
||||
${msg.rows});
|
||||
return true;
|
||||
%}
|
||||
|
||||
function proc_resultset(msg: Resultset): bool
|
||||
%{
|
||||
if ( mysql_ok )
|
||||
BifEvent::generate_mysql_ok(connection()->bro_analyzer(),
|
||||
connection()->bro_analyzer()->Conn(),
|
||||
${msg.rows}->size());
|
||||
return true;
|
||||
%}
|
||||
|
||||
};
|
||||
|
||||
refine typeattr Initial_Handshake_Packet += &let {
|
||||
proc = $context.flow.proc_mysql_initial_handshake_packet(this);
|
||||
};
|
||||
|
||||
refine typeattr Handshake_Response_Packet += &let {
|
||||
proc = $context.flow.proc_mysql_handshake_response_packet(this);
|
||||
};
|
||||
|
||||
refine typeattr Command_Request_Packet += &let {
|
||||
proc = $context.flow.proc_mysql_command_request_packet(this);
|
||||
};
|
||||
|
||||
refine typeattr ERR_Packet += &let {
|
||||
proc = $context.flow.proc_err_packet(this);
|
||||
};
|
||||
|
||||
refine typeattr OK_Packet += &let {
|
||||
proc = $context.flow.proc_ok_packet(this);
|
||||
};
|
||||
|
||||
refine typeattr Resultset += &let {
|
||||
proc = $context.flow.proc_resultset(this);
|
||||
};
|
407
src/analyzer/protocol/mysql/mysql-protocol.pac
Normal file
|
@ -0,0 +1,407 @@
|
|||
# See the file "COPYING" in the main distribution directory for copyright.
|
||||
#
|
||||
# All information is from the MySQL internals documentation at:
|
||||
# <http://dev.mysql.com/doc/internals/en/client-server-protocol.html>
|
||||
#
|
||||
|
||||
# Basic Types
|
||||
|
||||
type uint24le = record {
|
||||
byte3 : uint8;
|
||||
byte2 : uint8;
|
||||
byte1 : uint8;
|
||||
};
|
||||
|
||||
type LengthEncodedInteger = record {
|
||||
length : uint8;
|
||||
integer : LengthEncodedIntegerLookahead(length);
|
||||
};
|
||||
|
||||
type LengthEncodedIntegerLookahead(length: uint8) = record {
|
||||
val: case length of {
|
||||
0xfb -> i0 : empty;
|
||||
0xfc -> i2 : uint16;
|
||||
0xfd -> i3 : uint24le;
|
||||
0xfe -> i4 : uint64;
|
||||
0xff -> err_packet: empty;
|
||||
default -> one : empty;
|
||||
};
|
||||
};
|
||||
|
||||
type LengthEncodedString = record {
|
||||
len: LengthEncodedInteger;
|
||||
val: bytestring &length=to_int()(len);
|
||||
};
|
||||
|
||||
%header{
|
||||
class to_int
|
||||
{
|
||||
public:
|
||||
int operator()(uint24le* num) const
|
||||
{
|
||||
return (num->byte1() << 16) | (num->byte2() << 8) | num->byte3();
|
||||
}
|
||||
|
||||
int operator()(LengthEncodedInteger* lei) const
|
||||
{
|
||||
if ( lei->length() < 0xfb )
|
||||
return lei->length();
|
||||
else if ( lei->length() == 0xfc )
|
||||
return lei->integer()->i2();
|
||||
else if ( lei->length() == 0xfd )
|
||||
return to_int()(lei->integer()->i3());
|
||||
else if ( lei->length() == 0xfe )
|
||||
return lei->integer()->i4();
|
||||
else
|
||||
return 0;
|
||||
}
|
||||
|
||||
int operator()(LengthEncodedIntegerLookahead* lei) const
|
||||
{
|
||||
if ( lei->length() < 0xfb )
|
||||
return lei->length();
|
||||
else if ( lei->length() == 0xfc )
|
||||
return lei->i2();
|
||||
else if ( lei->length() == 0xfd )
|
||||
return to_int()(lei->i3());
|
||||
else if ( lei->length() == 0xfe )
|
||||
return lei->i4();
|
||||
else
|
||||
return 0;
|
||||
}
|
||||
};
|
||||
%}
|
||||
|
||||
extern type to_int;
|
||||
|
||||
# Enums
|
||||
|
||||
enum command_consts {
|
||||
COM_SLEEP = 0x00,
|
||||
COM_QUIT = 0x01,
|
||||
COM_INIT_DB = 0x02,
|
||||
COM_QUERY = 0x03,
|
||||
COM_FIELD_LIST = 0x04,
|
||||
COM_CREATE_DB = 0x05,
|
||||
COM_DROP_DB = 0x06,
|
||||
COM_REFRESH = 0x07,
|
||||
COM_SHUTDOWN = 0x08,
|
||||
COM_STATISTICS = 0x09,
|
||||
COM_PROCESS_INFO = 0x0a,
|
||||
COM_CONNECT = 0x0b,
|
||||
COM_PROCESS_KILL = 0x0c,
|
||||
COM_DEBUG = 0x0d,
|
||||
COM_PING = 0x0e,
|
||||
COM_TIME = 0x0f,
|
||||
COM_DELAYED_INSERT = 0x10,
|
||||
COM_CHANGE_USER = 0x11,
|
||||
COM_BINLOG_DUMP = 0x12,
|
||||
COM_TABLE_DUMP = 0x13,
|
||||
COM_CONNECT_OUT = 0x14,
|
||||
COM_REGISTER_SLAVE = 0x15,
|
||||
COM_STMT_PREPARE = 0x16,
|
||||
COM_STMT_EXECUTE = 0x17,
|
||||
COM_STMT_SEND_LONG_DATA = 0x18,
|
||||
COM_STMT_CLOSE = 0x19,
|
||||
COM_STMT_RESET = 0x1a,
|
||||
COM_SET_OPTION = 0x1b,
|
||||
COM_STMT_FETCH = 0x1c,
|
||||
COM_DAEMON = 0x1d,
|
||||
COM_BINLOG_DUMP_GTID = 0x1e
|
||||
};
|
||||
|
||||
enum state {
|
||||
CONNECTION_PHASE = 0,
|
||||
COMMAND_PHASE = 1,
|
||||
};
|
||||
|
||||
enum Expected {
|
||||
NO_EXPECTATION,
|
||||
EXPECT_STATUS,
|
||||
EXPECT_COLUMN_DEFINITION,
|
||||
EXPECT_COLUMN_COUNT,
|
||||
EXPECT_EOF1,
|
||||
EXPECT_EOF2,
|
||||
EXPECT_RESULTSET,
|
||||
EXPECT_QUERY_RESPONSE,
|
||||
};
|
||||
|
||||
type NUL_String = RE/[^\0]*/;
|
||||
|
||||
# MySQL PDU
|
||||
|
||||
type MySQL_PDU(is_orig: bool) = record {
|
||||
hdr : Header;
|
||||
msg : case is_orig of {
|
||||
false -> server_msg: Server_Message(hdr.seq_id);
|
||||
true -> client_msg: Client_Message(state);
|
||||
} &requires(state);
|
||||
} &let {
|
||||
state : int = $context.connection.get_state();
|
||||
} &length=hdr.len &byteorder=bigendian;
|
||||
|
||||
type Header = record {
|
||||
le_len: uint24le;
|
||||
seq_id: uint8;
|
||||
} &let {
|
||||
len : uint32 = to_int()(le_len) + 4;
|
||||
} &length=4;
|
||||
|
||||
type Server_Message(seq_id: uint8) = case seq_id of {
|
||||
0 -> initial_handshake: Initial_Handshake_Packet;
|
||||
default -> command_response : Command_Response;
|
||||
};
|
||||
|
||||
type Client_Message(state: int) = case state of {
|
||||
CONNECTION_PHASE -> connection_phase: Handshake_Response_Packet;
|
||||
COMMAND_PHASE -> command_phase : Command_Request_Packet;
|
||||
};
|
||||
|
||||
# Handshake Request
|
||||
|
||||
type Initial_Handshake_Packet = record {
|
||||
version : uint8;
|
||||
pkt : case version of {
|
||||
10 -> handshake10 : Handshake_v10;
|
||||
9 -> handshake9 : Handshake_v9;
|
||||
default -> error : ERR_Packet;
|
||||
};
|
||||
} &let {
|
||||
set_version : bool = $context.connection.set_version(version);
|
||||
};
|
||||
|
||||
type Handshake_v10 = record {
|
||||
server_version : NUL_String;
|
||||
connection_id : uint32;
|
||||
auth_plugin_data_part_1 : bytestring &length=8;
|
||||
filler_1 : uint8;
|
||||
capability_flag_1 : uint16;
|
||||
character_set : uint8;
|
||||
status_flags : uint16;
|
||||
capability_flags_2 : uint16;
|
||||
auth_plugin_data_len : uint8;
|
||||
auth_plugin_name : NUL_String;
|
||||
};
|
||||
|
||||
type Handshake_v9 = record {
|
||||
server_version : NUL_String;
|
||||
connection_id : uint32;
|
||||
scramble : NUL_String;
|
||||
};
|
||||
|
||||
# Handshake Response
|
||||
|
||||
type Handshake_Response_Packet = case $context.connection.get_version() of {
|
||||
10 -> v10_response : Handshake_Response_Packet_v10;
|
||||
9 -> v9_response : Handshake_Response_Packet_v9;
|
||||
} &let {
|
||||
version : uint8 = $context.connection.get_version();
|
||||
} &byteorder=bigendian;
|
||||
|
||||
type Handshake_Response_Packet_v10 = record {
|
||||
cap_flags : uint32;
|
||||
max_pkt_size : uint32;
|
||||
char_set : uint8;
|
||||
pad : padding[23];
|
||||
username : NUL_String;
|
||||
password : bytestring &restofdata;
|
||||
};
|
||||
|
||||
type Handshake_Response_Packet_v9 = record {
|
||||
cap_flags : uint16;
|
||||
max_pkt_size : uint24le;
|
||||
username : NUL_String;
|
||||
auth_response : NUL_String;
|
||||
have_db : case ( cap_flags & 0x8 ) of {
|
||||
0x8 -> database : NUL_String;
|
||||
0x0 -> none : empty;
|
||||
};
|
||||
password : bytestring &restofdata;
|
||||
};
|
||||
|
||||
# Command Request
|
||||
|
||||
type Command_Request_Packet = record {
|
||||
command : uint8;
|
||||
arg : bytestring &restofdata;
|
||||
} &let {
|
||||
update_expectation : bool = $context.connection.set_next_expected(EXPECT_COLUMN_COUNT);
|
||||
};
|
||||
|
||||
# Command Response
|
||||
|
||||
type Command_Response = case $context.connection.get_expectation() of {
|
||||
EXPECT_COLUMN_COUNT -> col_count_meta : ColumnCountMeta;
|
||||
EXPECT_COLUMN_DEFINITION -> col_defs : ColumnDefinitions;
|
||||
EXPECT_RESULTSET -> resultset : Resultset;
|
||||
EXPECT_STATUS -> status : Command_Response_Status;
|
||||
EXPECT_EOF1 -> eof1 : EOF1;
|
||||
EXPECT_EOF2 -> eof2 : EOF2;
|
||||
default -> unknow : empty;
|
||||
};
|
||||
|
||||
type Command_Response_Status = record {
|
||||
pkt_type: uint8;
|
||||
response: case pkt_type of {
|
||||
0x00 -> data_ok: OK_Packet;
|
||||
0xfe -> data_eof: EOF_Packet;
|
||||
0xff -> data_err: ERR_Packet;
|
||||
default -> unknown: empty;
|
||||
};
|
||||
};
|
||||
|
||||
type ColumnCountMeta = record {
|
||||
byte : uint8;
|
||||
pkt_type: case byte of {
|
||||
0x00 -> ok : OK_Packet;
|
||||
0xff -> err : ERR_Packet;
|
||||
# 0xfb -> Not implemented
|
||||
default -> col_count: ColumnCount(byte);
|
||||
};
|
||||
};
|
||||
|
||||
type ColumnCount(byte: uint8) = record {
|
||||
le_column_count : LengthEncodedIntegerLookahead(byte);
|
||||
} &let {
|
||||
col_num : uint32 = to_int()(le_column_count);
|
||||
update_col_num : bool = $context.connection.set_col_count(col_num);
|
||||
update_expectation : bool = $context.connection.set_next_expected(EXPECT_COLUMN_DEFINITION);
|
||||
};
|
||||
|
||||
type ColumnDefinitions = record {
|
||||
defs : ColumnDefinition41[1];
|
||||
} &let {
|
||||
update_expectation : bool = $context.connection.set_next_expected(EXPECT_EOF1);
|
||||
};
|
||||
|
||||
type EOF1 = record {
|
||||
eof : EOF_Packet;
|
||||
} &let {
|
||||
update_expectation : bool = $context.connection.set_next_expected(EXPECT_RESULTSET);
|
||||
};
|
||||
|
||||
type EOF2 = record {
|
||||
eof : EOF_Packet;
|
||||
} &let {
|
||||
update_expectation : bool = $context.connection.set_next_expected(NO_EXPECTATION);
|
||||
};
|
||||
|
||||
type Resultset = record {
|
||||
rows : ResultsetRow[] &until($input.length()==0);
|
||||
} &let {
|
||||
update_expectation : bool = $context.connection.set_next_expected(EXPECT_EOF2);
|
||||
};
|
||||
|
||||
type ResultsetRow = record {
|
||||
fields: LengthEncodedString[$context.connection.get_col_count()];
|
||||
};
|
||||
|
||||
type ColumnDefinition41 = record {
|
||||
catalog : LengthEncodedString;
|
||||
schema : LengthEncodedString;
|
||||
table : LengthEncodedString;
|
||||
org_table: LengthEncodedString;
|
||||
name : LengthEncodedString;
|
||||
org_name : LengthEncodedString;
|
||||
next_len : LengthEncodedInteger;
|
||||
char_set : uint16;
|
||||
col_len : uint32;
|
||||
type : uint8;
|
||||
flags : uint16;
|
||||
decimals : uint8;
|
||||
filler : padding[2];
|
||||
};
|
||||
|
||||
type ColumnDefinition320 = record {
|
||||
table : LengthEncodedString;
|
||||
name : LengthEncodedString;
|
||||
length_of_col_len: LengthEncodedInteger;
|
||||
col_len : uint24le;
|
||||
type_len : LengthEncodedInteger;
|
||||
type : uint8;
|
||||
};
|
||||
|
||||
type OK_Packet = record {
|
||||
le_rows : LengthEncodedInteger;
|
||||
todo : bytestring &restofdata;
|
||||
} &let {
|
||||
rows : uint32 = to_int()(le_rows);
|
||||
update_state: bool = $context.connection.update_state(COMMAND_PHASE);
|
||||
};
|
||||
|
||||
type ERR_Packet = record {
|
||||
code : uint16;
|
||||
state: bytestring &length=6;
|
||||
msg : bytestring &restofdata;
|
||||
} &let {
|
||||
update_state: bool = $context.connection.update_state(COMMAND_PHASE);
|
||||
};
|
||||
|
||||
type EOF_Packet = record {
|
||||
warnings: uint16;
|
||||
status : uint16;
|
||||
} &let {
|
||||
update_state: bool = $context.connection.update_state(COMMAND_PHASE);
|
||||
};
|
||||
|
||||
# State tracking
|
||||
|
||||
refine connection MySQL_Conn += {
|
||||
%member{
|
||||
uint8 version_;
|
||||
int state_;
|
||||
Expected expected_;
|
||||
uint32 col_count_;
|
||||
%}
|
||||
|
||||
%init{
|
||||
version_ = 0;
|
||||
state_ = CONNECTION_PHASE;
|
||||
expected_ = EXPECT_STATUS;
|
||||
col_count_ = 0;
|
||||
%}
|
||||
|
||||
function get_version(): uint8
|
||||
%{
|
||||
return version_;
|
||||
%}
|
||||
|
||||
function set_version(v: uint8): bool
|
||||
%{
|
||||
version_ = v;
|
||||
return true;
|
||||
%}
|
||||
|
||||
function get_state(): int
|
||||
%{
|
||||
return state_;
|
||||
%}
|
||||
|
||||
function update_state(s: state): bool
|
||||
%{
|
||||
state_ = s;
|
||||
return true;
|
||||
%}
|
||||
|
||||
function get_expectation(): Expected
|
||||
%{
|
||||
return expected_;
|
||||
%}
|
||||
|
||||
function set_next_expected(e: Expected): bool
|
||||
%{
|
||||
expected_ = e;
|
||||
return true;
|
||||
%}
|
||||
|
||||
function get_col_count(): uint32
|
||||
%{
|
||||
return col_count_;
|
||||
%}
|
||||
|
||||
function set_col_count(i: uint32): bool
|
||||
%{
|
||||
col_count_ = i;
|
||||
return true;
|
||||
%}
|
||||
};
|
37
src/analyzer/protocol/mysql/mysql.pac
Normal file
|
@ -0,0 +1,37 @@
|
|||
# See the file "COPYING" in the main distribution directory for copyright.
|
||||
#
|
||||
# Analyzer for MySQL
|
||||
# - mysql-protocol.pac: describes the MySQL protocol messages
|
||||
# - mysql-analyzer.pac: describes the MySQL analyzer code
|
||||
|
||||
%include binpac.pac
|
||||
%include bro.pac
|
||||
|
||||
%extern{
|
||||
#include "events.bif.h"
|
||||
%}
|
||||
|
||||
analyzer MySQL withcontext {
|
||||
connection: MySQL_Conn;
|
||||
flow: MySQL_Flow;
|
||||
};
|
||||
|
||||
# Our connection consists of two flows, one in each direction.
|
||||
connection MySQL_Conn(bro_analyzer: BroAnalyzer) {
|
||||
upflow = MySQL_Flow(true);
|
||||
downflow = MySQL_Flow(false);
|
||||
};
|
||||
|
||||
%include mysql-protocol.pac
|
||||
|
||||
# Now we define the flow:
|
||||
flow MySQL_Flow(is_orig: bool) {
|
||||
# There are two options here: flowunit or datagram.
|
||||
# flowunit = MySQL_PDU(is_orig) withcontext(connection, this);
|
||||
flowunit = MySQL_PDU(is_orig) withcontext(connection, this);
|
||||
# Using flowunit will cause the analyzer to buffer incremental input.
|
||||
# This is needed for &oneline and &length. If you don't need this, you'll
|
||||
# get better performance with datagram.
|
||||
};
|
||||
|
||||
%include mysql-analyzer.pac
|
|
@ -22,6 +22,7 @@ void PIA::ClearBuffer(Buffer* buffer)
|
|||
for ( DataBlock* b = buffer->head; b; b = next )
|
||||
{
|
||||
next = b->next;
|
||||
delete b->ip;
|
||||
delete [] b->data;
|
||||
delete b;
|
||||
}
|
||||
|
@ -31,7 +32,7 @@ void PIA::ClearBuffer(Buffer* buffer)
|
|||
}
|
||||
|
||||
void PIA::AddToBuffer(Buffer* buffer, uint64 seq, int len, const u_char* data,
|
||||
bool is_orig)
|
||||
bool is_orig, const IP_Hdr* ip)
|
||||
{
|
||||
u_char* tmp = 0;
|
||||
|
||||
|
@ -42,6 +43,7 @@ void PIA::AddToBuffer(Buffer* buffer, uint64 seq, int len, const u_char* data,
|
|||
}
|
||||
|
||||
DataBlock* b = new DataBlock;
|
||||
b->ip = ip ? ip->Copy() : 0;
|
||||
b->data = tmp;
|
||||
b->is_orig = is_orig;
|
||||
b->len = len;
|
||||
|
@ -59,9 +61,10 @@ void PIA::AddToBuffer(Buffer* buffer, uint64 seq, int len, const u_char* data,
|
|||
buffer->size += len;
|
||||
}
|
||||
|
||||
void PIA::AddToBuffer(Buffer* buffer, int len, const u_char* data, bool is_orig)
|
||||
void PIA::AddToBuffer(Buffer* buffer, int len, const u_char* data, bool is_orig,
|
||||
const IP_Hdr* ip)
|
||||
{
|
||||
AddToBuffer(buffer, -1, len, data, is_orig);
|
||||
AddToBuffer(buffer, -1, len, data, is_orig, ip);
|
||||
}
|
||||
|
||||
void PIA::ReplayPacketBuffer(analyzer::Analyzer* analyzer)
|
||||
|
@ -69,7 +72,7 @@ void PIA::ReplayPacketBuffer(analyzer::Analyzer* analyzer)
|
|||
DBG_LOG(DBG_ANALYZER, "PIA replaying %d total packet bytes", pkt_buffer.size);
|
||||
|
||||
for ( DataBlock* b = pkt_buffer.head; b; b = b->next )
|
||||
analyzer->DeliverPacket(b->len, b->data, b->is_orig, -1, 0, 0);
|
||||
analyzer->DeliverPacket(b->len, b->data, b->is_orig, -1, b->ip, 0);
|
||||
}
|
||||
|
||||
void PIA::PIA_Done()
|
||||
|
@ -96,7 +99,7 @@ void PIA::PIA_DeliverPacket(int len, const u_char* data, bool is_orig, uint64 se
|
|||
if ( (pkt_buffer.state == BUFFERING || new_state == BUFFERING) &&
|
||||
len > 0 )
|
||||
{
|
||||
AddToBuffer(&pkt_buffer, seq, len, data, is_orig);
|
||||
AddToBuffer(&pkt_buffer, seq, len, data, is_orig, ip);
|
||||
if ( pkt_buffer.size > dpd_buffer_size )
|
||||
new_state = dpd_match_only_beginning ?
|
||||
SKIPPING : MATCHING_ONLY;
|
||||
|
|
|
@ -49,6 +49,7 @@ protected:
|
|||
// Buffers one chunk of data. Used both for packet payload (incl.
|
||||
// sequence numbers for TCP) and chunks of a reassembled stream.
|
||||
struct DataBlock {
|
||||
IP_Hdr* ip;
|
||||
const u_char* data;
|
||||
bool is_orig;
|
||||
int len;
|
||||
|
@ -66,9 +67,9 @@ protected:
|
|||
};
|
||||
|
||||
void AddToBuffer(Buffer* buffer, uint64 seq, int len,
|
||||
const u_char* data, bool is_orig);
|
||||
const u_char* data, bool is_orig, const IP_Hdr* ip = 0);
|
||||
void AddToBuffer(Buffer* buffer, int len,
|
||||
const u_char* data, bool is_orig);
|
||||
const u_char* data, bool is_orig, const IP_Hdr* ip = 0);
|
||||
void ClearBuffer(Buffer* buffer);
|
||||
|
||||
DataBlock* CurrentPacket() { return &current_packet; }
|
||||
|
|
|
@ -24,12 +24,6 @@ public:
|
|||
static analyzer::Analyzer* Instantiate(Connection* conn)
|
||||
{ return new SSL_Analyzer(conn); }
|
||||
|
||||
static bool Available()
|
||||
{
|
||||
return ( ssl_client_hello || ssl_server_hello ||
|
||||
ssl_established || ssl_extension || ssl_alert );
|
||||
}
|
||||
|
||||
protected:
|
||||
binpac::SSL::SSL_Conn* interp;
|
||||
bool had_gap;
|
||||
|
|
|
@ -112,7 +112,10 @@ refine connection SSL_Conn += {
|
|||
cipher_suites24 : uint24[]) : bool
|
||||
%{
|
||||
if ( ! version_ok(version) )
|
||||
{
|
||||
bro_analyzer()->ProtocolViolation(fmt("unsupported client SSL version 0x%04x", version));
|
||||
bro_analyzer()->SetSkip(true);
|
||||
}
|
||||
else
|
||||
bro_analyzer()->ProtocolConfirmation();
|
||||
|
||||
|
@ -152,7 +155,10 @@ refine connection SSL_Conn += {
|
|||
comp_method : uint8) : bool
|
||||
%{
|
||||
if ( ! version_ok(version) )
|
||||
{
|
||||
bro_analyzer()->ProtocolViolation(fmt("unsupported server SSL version 0x%04x", version));
|
||||
bro_analyzer()->SetSkip(true);
|
||||
}
|
||||
|
||||
if ( ssl_server_hello )
|
||||
{
|
||||
|
@ -202,6 +208,7 @@ refine connection SSL_Conn += {
|
|||
// This should be impossible due to the binpac parser
|
||||
// and protocol description
|
||||
bro_analyzer()->ProtocolViolation(fmt("Impossible extension length: %lu", length));
|
||||
bro_analyzer()->SetSkip(true);
|
||||
return true;
|
||||
}
|
||||
|
||||
|
@ -392,7 +399,11 @@ refine connection SSL_Conn += {
|
|||
function proc_check_v2_server_hello_version(version: uint16) : bool
|
||||
%{
|
||||
if ( version != SSLv20 )
|
||||
{
|
||||
bro_analyzer()->ProtocolViolation(fmt("Invalid version in SSL server hello. Version: %d", version));
|
||||
bro_analyzer()->SetSkip(true);
|
||||
return false;
|
||||
}
|
||||
|
||||
return true;
|
||||
%}
|
||||
|
@ -479,13 +490,13 @@ refine typeattr ServerHello += &let {
|
|||
};
|
||||
|
||||
refine typeattr V2ServerHello += &let {
|
||||
proc : bool = $context.connection.proc_server_hello(rec, server_version, 0,
|
||||
conn_id_data, 0, 0, ciphers, 0);
|
||||
|
||||
check_v2 : bool = $context.connection.proc_check_v2_server_hello_version(server_version);
|
||||
|
||||
proc : bool = $context.connection.proc_server_hello(rec, server_version, 0,
|
||||
conn_id_data, 0, 0, ciphers, 0) &requires(check_v2) &if(check_v2 == true);
|
||||
|
||||
cert : bool = $context.connection.proc_v2_certificate(rec, cert_data)
|
||||
&requires(proc);
|
||||
&requires(proc) &requires(check_v2) &if(check_v2 == true);
|
||||
};
|
||||
|
||||
refine typeattr Certificate += &let {
|
||||
|
|
|
@ -36,7 +36,7 @@ type SSLRecord(is_orig: bool) = record {
|
|||
} &length = length+5, &byteorder=bigendian,
|
||||
&let {
|
||||
version : int =
|
||||
$context.connection.determine_ssl_record_layer(head0, head1, head2, head3, head4);
|
||||
$context.connection.determine_ssl_record_layer(head0, head1, head2, head3, head4, is_orig);
|
||||
|
||||
content_type : int = case version of {
|
||||
SSLv20 -> head2+300;
|
||||
|
@ -748,7 +748,7 @@ refine connection SSL_Conn += {
|
|||
%}
|
||||
|
||||
function determine_ssl_record_layer(head0 : uint8, head1 : uint8,
|
||||
head2 : uint8, head3: uint8, head4: uint8) : int
|
||||
head2 : uint8, head3: uint8, head4: uint8, is_orig: bool) : int
|
||||
%{
|
||||
// re-check record layer version to be sure that we still are synchronized with
|
||||
// the data stream
|
||||
|
@ -759,6 +759,7 @@ refine connection SSL_Conn += {
|
|||
version != TLSv11 && version != TLSv12 )
|
||||
{
|
||||
bro_analyzer()->ProtocolViolation(fmt("Invalid version late in TLS connection. Packet reported version: %d", version));
|
||||
bro_analyzer()->SetSkip(true);
|
||||
return UNKNOWN_VERSION;
|
||||
}
|
||||
}
|
||||
|
@ -768,13 +769,14 @@ refine connection SSL_Conn += {
|
|||
|
||||
if ( head0 & 0x80 )
|
||||
{
|
||||
if ( head2 == 0x01 ) // SSLv2 client hello.
|
||||
if ( head2 == 0x01 && is_orig ) // SSLv2 client hello.
|
||||
{
|
||||
uint16 version = (head3 << 8) | head4;
|
||||
if ( version != SSLv20 && version != SSLv30 && version != TLSv10 &&
|
||||
version != TLSv11 && version != TLSv12 )
|
||||
{
|
||||
bro_analyzer()->ProtocolViolation(fmt("Invalid version in SSL client hello. Version: %d", version));
|
||||
bro_analyzer()->SetSkip(true);
|
||||
return UNKNOWN_VERSION;
|
||||
}
|
||||
|
||||
|
@ -782,7 +784,7 @@ refine connection SSL_Conn += {
|
|||
return SSLv20;
|
||||
}
|
||||
|
||||
else if ( head2 == 0x04 ) // SSLv2 server hello. This connection will continue using SSLv2.
|
||||
else if ( head2 == 0x04 && head4 < 2 && ! is_orig ) // SSLv2 server hello. This connection will continue using SSLv2.
|
||||
{
|
||||
record_layer_version_ = SSLv20;
|
||||
return SSLv20;
|
||||
|
@ -791,6 +793,7 @@ refine connection SSL_Conn += {
|
|||
else // this is not SSL or TLS.
|
||||
{
|
||||
bro_analyzer()->ProtocolViolation(fmt("Invalid headers in SSL connection. Head1: %d, head2: %d, head3: %d", head1, head2, head3));
|
||||
bro_analyzer()->SetSkip(true);
|
||||
return UNKNOWN_VERSION;
|
||||
}
|
||||
}
|
||||
|
@ -800,6 +803,7 @@ refine connection SSL_Conn += {
|
|||
version != TLSv11 && version != TLSv12 )
|
||||
{
|
||||
bro_analyzer()->ProtocolViolation(fmt("Invalid version in TLS connection. Version: %d", version));
|
||||
bro_analyzer()->SetSkip(true);
|
||||
return UNKNOWN_VERSION;
|
||||
}
|
||||
|
||||
|
@ -810,6 +814,7 @@ refine connection SSL_Conn += {
|
|||
}
|
||||
|
||||
bro_analyzer()->ProtocolViolation(fmt("Invalid type in TLS connection. Version: %d, Type: %d", version, head0));
|
||||
bro_analyzer()->SetSkip(true);
|
||||
return UNKNOWN_VERSION;
|
||||
%}
|
||||
|
||||
|
|
|
@ -28,7 +28,7 @@ TCP_Reassembler::TCP_Reassembler(analyzer::Analyzer* arg_dst_analyzer,
|
|||
TCP_Analyzer* arg_tcp_analyzer,
|
||||
TCP_Reassembler::Type arg_type,
|
||||
TCP_Endpoint* arg_endp)
|
||||
: Reassembler(1, REASSEM_TCP)
|
||||
: Reassembler(1)
|
||||
{
|
||||
dst_analyzer = arg_dst_analyzer;
|
||||
tcp_analyzer = arg_tcp_analyzer;
|
||||
|
|
|
@ -11,9 +11,6 @@ namespace analyzer { namespace tcp {
|
|||
|
||||
class TCP_Analyzer;
|
||||
|
||||
const int STOP_ON_GAP = 1;
|
||||
const int PUNT_ON_PARTIAL = 1;
|
||||
|
||||
class TCP_Reassembler : public Reassembler {
|
||||
public:
|
||||
enum Type {
|
||||
|
|
|
@ -29,8 +29,10 @@ event new_connection_contents%(c: connection%);
|
|||
## new_connection new_connection_contents partial_connection
|
||||
event connection_attempt%(c: connection%);
|
||||
|
||||
## Generated when a SYN-ACK packet is seen in response to a SYN packet during
|
||||
## a TCP handshake. The final ACK of the handshake in response to SYN-ACK may
|
||||
## Generated when seeing a SYN-ACK packet from the responder in a TCP
|
||||
## handshake. An associated SYN packet was not seen from the originator
|
||||
## side if its state is not set to :bro:see:`TCP_ESTABLISHED`.
|
||||
## The final ACK of the handshake in response to SYN-ACK may
|
||||
## or may not occur later, one way to tell is to check the *history* field of
|
||||
## :bro:type:`connection` to see if the originator sent an ACK, indicated by
|
||||
## 'A' in the history string.
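The check described above can be sketched as follows; this assumes the event being documented here is connection_established (its declaration falls outside this hunk), so treat the snippet as illustrative only:

# Hypothetical sketch: inspect the history string for the originator's ACK.
event connection_established(c: connection)
	{
	if ( "A" in c$history )
		print fmt("%s completed the three-way handshake", c$id$orig_h);
	else
		print fmt("no ACK from %s seen yet", c$id$orig_h);
	}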
|
||||
|
|
20
src/bro.bif
|
@ -2,8 +2,8 @@
|
|||
##! such as general programming algorithms, string processing, math functions,
|
||||
##! introspection, type conversion, file/directory manipulation, packet
|
||||
##! filtering, interprocess communication and controlling protocol analyzer
|
||||
##! behavior.
|
||||
##!
|
||||
##! behavior.
|
||||
##!
|
||||
##! You'll find most of Bro's built-in functions that aren't protocol-specific
|
||||
##! in this file.
|
||||
|
||||
|
@ -2159,6 +2159,22 @@ function counts_to_addr%(v: index_vec%): addr
|
|||
}
|
||||
%}
|
||||
|
||||
## Converts an :bro:type:`enum` to an :bro:type:`int`.
|
||||
##
|
||||
## e: The :bro:type:`enum` to convert.
|
||||
##
|
||||
## Returns: The :bro:type:`int` value that corresponds to the :bro:type:`enum`.
|
||||
function enum_to_int%(e: any%): int
|
||||
%{
|
||||
if ( e->Type()->Tag() != TYPE_ENUM )
|
||||
{
|
||||
builtin_error("enum_to_int() requires enum value");
|
||||
return new Val(-1, TYPE_INT);
|
||||
}
|
||||
|
||||
return new Val(e->AsEnum(), TYPE_INT);
|
||||
%}
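A small usage sketch for the new built-in function; transport_proto is a standard script-level enum, so its values are valid arguments (illustrative only):

# Hypothetical usage sketch for enum_to_int().
event bro_init()
	{
	print fmt("tcp as int: %d", enum_to_int(tcp));
	}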
|
||||
|
||||
## Converts a :bro:type:`string` to an :bro:type:`int`.
|
||||
##
|
||||
## str: The :bro:type:`string` to convert.
|
||||
|
|
|
@ -1,4 +1,5 @@
|
|||
%{
|
||||
#include <ctype.h>
|
||||
#include <string.h>
|
||||
#include <unistd.h>
|
||||
#include "bif_arg.h"
|
||||
|
@ -223,7 +224,7 @@ void init_alternative_mode()
|
|||
|
||||
for ( char* p = guard; *p; p++ )
|
||||
{
|
||||
if ( strchr("/.-", *p) )
|
||||
if ( ! isalnum(*p) )
|
||||
*p = '_';
|
||||
}
|
||||
|
||||
|
|
|
@ -905,7 +905,8 @@ event get_file_handle%(tag: Analyzer::Tag, c: connection, is_orig: bool%);
|
|||
##
|
||||
## f: The file.
|
||||
##
|
||||
## .. bro:see:: file_over_new_connection file_timeout file_gap file_state_remove
|
||||
## .. bro:see:: file_over_new_connection file_timeout file_gap file_mime_type
|
||||
## file_state_remove
|
||||
event file_new%(f: fa_file%);
|
||||
|
||||
## Indicates that a file has been seen being transferred over a connection
|
||||
|
@ -917,16 +918,39 @@ event file_new%(f: fa_file%);
|
|||
##
|
||||
## is_orig: true if the originator of *c* is the one sending the file.
|
||||
##
|
||||
## .. bro:see:: file_new file_timeout file_gap file_state_remove
|
||||
## .. bro:see:: file_new file_timeout file_gap file_mime_type
|
||||
## file_state_remove
|
||||
event file_over_new_connection%(f: fa_file, c: connection, is_orig: bool%);
|
||||
|
||||
## Provide the most likely matching MIME type for this file. The analysis
|
||||
## can be augmented at this time via :bro:see:`Files::add_analyzer`.
|
||||
##
|
||||
## f: The file.
|
||||
##
|
||||
## mime_type: The mime type that was discovered.
|
||||
##
|
||||
## .. bro:see:: file_over_new_connection file_timeout file_gap file_mime_type
|
||||
## file_mime_types file_state_remove
|
||||
event file_mime_type%(f: fa_file, mime_type: string%);
|
||||
|
||||
## Provide all matching MIME types for this file. The analysis can be
|
||||
## augmented at this time via :bro:see:`Files::add_analyzer`.
|
||||
##
|
||||
## f: The file.
|
||||
##
|
||||
## mime_types: The mime types that were discovered.
|
||||
##
|
||||
## .. bro:see:: file_over_new_connection file_timeout file_gap file_mime_type
|
||||
## file_mime_types file_state_remove
|
||||
event file_mime_types%(f: fa_file, mime_types: mime_matches%);
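As the documentation notes, analysis can be augmented from these handlers via Files::add_analyzer. A hypothetical sketch attaching the extraction analyzer once a MIME type of interest is seen (the MIME string and analyzer choice are illustrative):

# Hypothetical sketch: react to the detected MIME type.
event file_mime_type(f: fa_file, mime_type: string)
	{
	if ( mime_type == "application/x-dosexec" )
		Files::add_analyzer(f, Files::ANALYZER_EXTRACT);
	}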
|
||||
|
||||
## Indicates that file analysis has timed out because no activity was seen
|
||||
## for the file in a while.
|
||||
##
|
||||
## f: The file.
|
||||
##
|
||||
## .. bro:see:: file_new file_over_new_connection file_gap file_state_remove
|
||||
## default_file_timeout_interval Files::set_timeout_interval
|
||||
## .. bro:see:: file_new file_over_new_connection file_gap file_mime_type
|
||||
## file_mime_types file_state_remove default_file_timeout_interval
|
||||
## Files::set_timeout_interval
|
||||
event file_timeout%(f: fa_file%);
|
||||
|
||||
|
@ -938,14 +962,34 @@ event file_timeout%(f: fa_file%);
|
|||
##
|
||||
## len: The number of missing bytes.
|
||||
##
|
||||
## .. bro:see:: file_new file_over_new_connection file_timeout file_state_remove
|
||||
## .. bro:see:: file_new file_over_new_connection file_timeout file_mime_type
|
||||
## file_mime_types file_state_remove file_reassembly_overflow
|
||||
event file_gap%(f: fa_file, offset: count, len: count%);
|
||||
|
||||
## Indicates that the file had an overflow of the reassembly buffer.
|
||||
## This is a specialization of the :bro:id:`file_gap` event.
|
||||
##
|
||||
## f: The file.
|
||||
##
|
||||
## offset: The byte offset from the start of the file at which the reassembly
|
||||
## couldn't continue due to running out of reassembly buffer space.
|
||||
##
|
||||
## skipped: The number of bytes of the file skipped over to flush some
|
||||
## file data and get back under the reassembly buffer size limit.
|
||||
## This value will also be represented as a gap.
|
||||
##
|
||||
## .. bro:see:: file_new file_over_new_connection file_timeout file_mime_type
|
||||
## file_mime_types file_state_remove file_gap Files::enable_reassembler
|
||||
## Files::reassembly_buffer_size Files::enable_reassembly
|
||||
## Files::disable_reassembly Files::set_reassembly_buffer_size
|
||||
event file_reassembly_overflow%(f: fa_file, offset: count, skipped: count%);
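A hypothetical sketch using the reassembly controls listed in the cross-references above; the buffer size is an arbitrary example value:

# Hypothetical sketch: enable per-file reassembly and watch for overflows.
event file_new(f: fa_file)
	{
	Files::enable_reassembly(f);
	Files::set_reassembly_buffer_size(f, 1024 * 1024);
	}

event file_reassembly_overflow(f: fa_file, offset: count, skipped: count)
	{
	print fmt("file %s: reassembly overflow at offset %d, skipped %d bytes", f$id, offset, skipped);
	}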
|
||||
|
||||
## This event is generated each time file analysis is ending for a given file.
|
||||
##
|
||||
## f: The file.
|
||||
##
|
||||
## .. bro:see:: file_new file_over_new_connection file_timeout file_gap
|
||||
## file_mime_type file_mime_types
|
||||
event file_state_remove%(f: fa_file%);
|
||||
|
||||
## Generated when an internal DNS lookup produces the same result as last time.
|
||||
|
|
|
@ -111,6 +111,18 @@ public:
|
|||
*/
|
||||
void SetAnalyzerTag(const file_analysis::Tag& tag);
|
||||
|
||||
/**
|
||||
* @return true if the analyzer has ever seen a stream-wise delivery.
|
||||
*/
|
||||
bool GotStreamDelivery() const
|
||||
{ return got_stream_delivery; }
|
||||
|
||||
/**
|
||||
* Flag the analyzer as having seen a stream-wise delivery.
|
||||
*/
|
||||
void SetGotStreamDelivery()
|
||||
{ got_stream_delivery = true; }
|
||||
|
||||
protected:
|
||||
|
||||
/**
|
||||
|
@ -123,7 +135,8 @@ protected:
|
|||
Analyzer(file_analysis::Tag arg_tag, RecordVal* arg_args, File* arg_file)
|
||||
: tag(arg_tag),
|
||||
args(arg_args->Ref()->AsRecordVal()),
|
||||
file(arg_file)
|
||||
file(arg_file),
|
||||
got_stream_delivery(false)
|
||||
{
|
||||
id = ++id_counter;
|
||||
}
|
||||
|
@ -140,7 +153,8 @@ protected:
|
|||
Analyzer(RecordVal* arg_args, File* arg_file)
|
||||
: tag(),
|
||||
args(arg_args->Ref()->AsRecordVal()),
|
||||
file(arg_file)
|
||||
file(arg_file),
|
||||
got_stream_delivery(false)
|
||||
{
|
||||
id = ++id_counter;
|
||||
}
|
||||
|
@ -151,6 +165,7 @@ private:
|
|||
file_analysis::Tag tag; /**< The particular type of the analyzer instance. */
|
||||
RecordVal* args; /**< \c AnalyzerArgs val gives tunable analyzer params. */
|
||||
File* file; /**< The file to which the analyzer is attached. */
|
||||
bool got_stream_delivery;
|
||||
|
||||
static ID id_counter;
|
||||
};
|
||||
|
|
|
@ -72,7 +72,7 @@ bool AnalyzerSet::Add(file_analysis::Tag tag, RecordVal* args)
|
|||
return true;
|
||||
}
|
||||
|
||||
bool AnalyzerSet::QueueAdd(file_analysis::Tag tag, RecordVal* args)
|
||||
Analyzer* AnalyzerSet::QueueAdd(file_analysis::Tag tag, RecordVal* args)
|
||||
{
|
||||
HashKey* key = GetKey(tag, args);
|
||||
file_analysis::Analyzer* a = InstantiateAnalyzer(tag, args);
|
||||
|
@ -80,12 +80,12 @@ bool AnalyzerSet::QueueAdd(file_analysis::Tag tag, RecordVal* args)
|
|||
if ( ! a )
|
||||
{
|
||||
delete key;
|
||||
return false;
|
||||
return 0;
|
||||
}
|
||||
|
||||
mod_queue.push(new AddMod(a, key));
|
||||
|
||||
return true;
|
||||
return a;
|
||||
}
|
||||
|
||||
bool AnalyzerSet::AddMod::Perform(AnalyzerSet* set)
|
||||
|
|
|
@ -57,9 +57,10 @@ public:
|
|||
* Queue the attachment of an analyzer to #file.
|
||||
* @param tag the analyzer tag of the file analyzer to add.
|
||||
* @param args an \c AnalyzerArgs value which specifies an analyzer.
|
||||
* @return true if analyzer was able to be instantiated, else false.
|
||||
* @return if successful, a pointer to a newly instantiated analyzer else
|
||||
* a null pointer. The caller does *not* take ownership of the memory.
|
||||
*/
|
||||
bool QueueAdd(file_analysis::Tag tag, RecordVal* args);
|
||||
file_analysis::Analyzer* QueueAdd(file_analysis::Tag tag, RecordVal* args);
|
||||
|
||||
/**
|
||||
* Remove an analyzer from #file immediately.
|
||||
|
|
|
@ -11,6 +11,7 @@ set(file_analysis_SRCS
|
|||
Manager.cc
|
||||
File.cc
|
||||
FileTimer.cc
|
||||
FileReassembler.cc
|
||||
Analyzer.cc
|
||||
AnalyzerSet.cc
|
||||
Component.cc
|
||||
|
|
|
@ -53,8 +53,6 @@ int File::overflow_bytes_idx = -1;
|
|||
int File::timeout_interval_idx = -1;
|
||||
int File::bof_buffer_size_idx = -1;
|
||||
int File::bof_buffer_idx = -1;
|
||||
int File::mime_type_idx = -1;
|
||||
int File::mime_types_idx = -1;
|
||||
|
||||
void File::StaticInit()
|
||||
{
|
||||
|
@ -74,15 +72,14 @@ void File::StaticInit()
|
|||
timeout_interval_idx = Idx("timeout_interval");
|
||||
bof_buffer_size_idx = Idx("bof_buffer_size");
|
||||
bof_buffer_idx = Idx("bof_buffer");
|
||||
mime_type_idx = Idx("mime_type");
|
||||
mime_types_idx = Idx("mime_types");
|
||||
}
|
||||
|
||||
File::File(const string& file_id, Connection* conn, analyzer::Tag tag,
|
||||
bool is_orig)
|
||||
: id(file_id), val(0), postpone_timeout(false), first_chunk(true),
|
||||
missed_bof(false), need_reassembly(false), done(false),
|
||||
did_file_new_event(false), analyzers(this)
|
||||
File::File(const string& file_id, const string& source_name, Connection* conn,
|
||||
analyzer::Tag tag, bool is_orig)
|
||||
: id(file_id), val(0), file_reassembler(0), stream_offset(0),
|
||||
reassembly_max_buffer(0), did_mime_type(false),
|
||||
reassembly_enabled(false), postpone_timeout(false), done(false),
|
||||
analyzers(this)
|
||||
{
|
||||
StaticInit();
|
||||
|
||||
|
@ -90,11 +87,10 @@ File::File(const string& file_id, Connection* conn, analyzer::Tag tag,
|
|||
|
||||
val = new RecordVal(fa_file_type);
|
||||
val->Assign(id_idx, new StringVal(file_id.c_str()));
|
||||
SetSource(source_name);
|
||||
|
||||
if ( conn )
|
||||
{
|
||||
// add source, connection, is_orig fields
|
||||
SetSource(analyzer_mgr->GetComponentName(tag));
|
||||
val->Assign(is_orig_idx, new Val(is_orig, TYPE_BOOL));
|
||||
UpdateConnectionFields(conn, is_orig);
|
||||
}
|
||||
|
@ -106,12 +102,7 @@ File::~File()
|
|||
{
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "[%s] Destroying File object", id.c_str());
|
||||
Unref(val);
|
||||
|
||||
while ( ! fonc_queue.empty() )
|
||||
{
|
||||
delete_vals(fonc_queue.front().second);
|
||||
fonc_queue.pop();
|
||||
}
|
||||
delete file_reassembler;
|
||||
}
|
||||
|
||||
void File::UpdateLastActivityTime()
|
||||
|
@ -124,10 +115,10 @@ double File::GetLastActivityTime() const
|
|||
return val->Lookup(last_active_idx)->AsTime();
|
||||
}
|
||||
|
||||
void File::UpdateConnectionFields(Connection* conn, bool is_orig)
|
||||
bool File::UpdateConnectionFields(Connection* conn, bool is_orig)
|
||||
{
|
||||
if ( ! conn )
|
||||
return;
|
||||
return false;
|
||||
|
||||
Val* conns = val->Lookup(conns_idx);
|
||||
|
||||
|
@ -138,27 +129,28 @@ void File::UpdateConnectionFields(Connection* conn, bool is_orig)
|
|||
}
|
||||
|
||||
Val* idx = get_conn_id_val(conn);
|
||||
if ( ! conns->AsTableVal()->Lookup(idx) )
|
||||
|
||||
if ( conns->AsTableVal()->Lookup(idx) )
|
||||
{
|
||||
Val* conn_val = conn->BuildConnVal();
|
||||
conns->AsTableVal()->Assign(idx, conn_val);
|
||||
|
||||
if ( FileEventAvailable(file_over_new_connection) )
|
||||
{
|
||||
val_list* vl = new val_list();
|
||||
vl->append(val->Ref());
|
||||
vl->append(conn_val->Ref());
|
||||
vl->append(new Val(is_orig, TYPE_BOOL));
|
||||
|
||||
if ( did_file_new_event )
|
||||
FileEvent(file_over_new_connection, vl);
|
||||
else
|
||||
fonc_queue.push(pair<EventHandlerPtr, val_list*>(
|
||||
file_over_new_connection, vl));
|
||||
}
|
||||
Unref(idx);
|
||||
return false;
|
||||
}
|
||||
|
||||
conns->AsTableVal()->Assign(idx, conn->BuildConnVal());
|
||||
Unref(idx);
|
||||
return true;
|
||||
}
|
||||
|
||||
void File::RaiseFileOverNewConnection(Connection* conn, bool is_orig)
|
||||
{
|
||||
if ( conn && FileEventAvailable(file_over_new_connection) )
|
||||
{
|
||||
val_list* vl = new val_list();
|
||||
vl->append(val->Ref());
|
||||
vl->append(conn->BuildConnVal());
|
||||
vl->append(new Val(is_orig, TYPE_BOOL));
|
||||
FileEvent(file_over_new_connection, vl);
|
||||
}
|
||||
}
|
||||
|
||||
uint64 File::LookupFieldDefaultCount(int idx) const
|
||||
|
@ -242,7 +234,7 @@ bool File::IsComplete() const
|
|||
if ( ! total )
|
||||
return false;
|
||||
|
||||
if ( LookupFieldDefaultCount(seen_bytes_idx) >= total->AsCount() )
|
||||
if ( stream_offset >= total->AsCount() )
|
||||
return true;
|
||||
|
||||
return false;
|
||||
|
@ -258,7 +250,10 @@ bool File::AddAnalyzer(file_analysis::Tag tag, RecordVal* args)
|
|||
DBG_LOG(DBG_FILE_ANALYSIS, "[%s] Queuing addition of %s analyzer",
|
||||
id.c_str(), file_mgr->GetComponentName(tag).c_str());
|
||||
|
||||
return done ? false : analyzers.QueueAdd(tag, args);
|
||||
if ( done )
|
||||
return false;
|
||||
|
||||
return analyzers.QueueAdd(tag, args) != 0;
|
||||
}
|
||||
|
||||
bool File::RemoveAnalyzer(file_analysis::Tag tag, RecordVal* args)
|
||||
|
@ -269,9 +264,70 @@ bool File::RemoveAnalyzer(file_analysis::Tag tag, RecordVal* args)
|
|||
return done ? false : analyzers.QueueRemove(tag, args);
|
||||
}
|
||||
|
||||
void File::EnableReassembly()
|
||||
{
|
||||
reassembly_enabled = true;
|
||||
}
|
||||
|
||||
void File::DisableReassembly()
|
||||
{
|
||||
reassembly_enabled = false;
|
||||
delete file_reassembler;
|
||||
file_reassembler = 0;
|
||||
}
|
||||
|
||||
void File::SetReassemblyBuffer(uint64 max)
|
||||
{
|
||||
reassembly_max_buffer = max;
|
||||
}
|
||||
|
||||
bool File::DetectMIME()
|
||||
{
|
||||
did_mime_type = true;
|
||||
|
||||
Val* bof_buffer_val = val->Lookup(bof_buffer_idx);
|
||||
|
||||
if ( ! bof_buffer_val )
|
||||
{
|
||||
if ( bof_buffer.size == 0 )
|
||||
return false;
|
||||
|
||||
BroString* bs = concatenate(bof_buffer.chunks);
|
||||
bof_buffer_val = new StringVal(bs);
|
||||
val->Assign(bof_buffer_idx, bof_buffer_val);
|
||||
}
|
||||
|
||||
RuleMatcher::MIME_Matches matches;
|
||||
const u_char* data = bof_buffer_val->AsString()->Bytes();
|
||||
uint64 len = bof_buffer_val->AsString()->Len();
|
||||
len = min(len, LookupFieldDefaultCount(bof_buffer_size_idx));
|
||||
file_mgr->DetectMIME(data, len, &matches);
|
||||
|
||||
if ( matches.empty() )
|
||||
return false;
|
||||
|
||||
if ( FileEventAvailable(file_mime_type) )
|
||||
{
|
||||
val_list* vl = new val_list();
|
||||
vl->append(val->Ref());
|
||||
vl->append(new StringVal(*(matches.begin()->second.begin())));
|
||||
FileEvent(file_mime_type, vl);
|
||||
}
|
||||
|
||||
if ( FileEventAvailable(file_mime_types) )
|
||||
{
|
||||
val_list* vl = new val_list();
|
||||
vl->append(val->Ref());
|
||||
vl->append(file_analysis::GenMIMEMatchesVal(matches));
|
||||
FileEvent(file_mime_types, vl);
|
||||
}
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
bool File::BufferBOF(const u_char* data, uint64 len)
|
||||
{
|
||||
if ( bof_buffer.full || bof_buffer.replayed )
|
||||
if ( bof_buffer.full )
|
||||
return false;
|
||||
|
||||
uint64 desired_size = LookupFieldDefaultCount(bof_buffer_size_idx);
|
||||
|
@ -279,131 +335,154 @@ bool File::BufferBOF(const u_char* data, uint64 len)
|
|||
bof_buffer.chunks.push_back(new BroString(data, len, 0));
|
||||
bof_buffer.size += len;
|
||||
|
||||
if ( bof_buffer.size >= desired_size )
|
||||
if ( bof_buffer.size < desired_size )
|
||||
return true;
|
||||
|
||||
bof_buffer.full = true;
|
||||
|
||||
if ( bof_buffer.size > 0 )
|
||||
{
|
||||
bof_buffer.full = true;
|
||||
ReplayBOF();
|
||||
BroString* bs = concatenate(bof_buffer.chunks);
|
||||
val->Assign(bof_buffer_idx, new StringVal(bs));
|
||||
}
|
||||
|
||||
return true;
|
||||
return false;
|
||||
}
|
||||
|
||||
bool File::DetectMIME(const u_char* data, uint64 len)
|
||||
void File::DeliverStream(const u_char* data, uint64 len)
|
||||
{
|
||||
RuleMatcher::MIME_Matches matches;
|
||||
len = min(len, LookupFieldDefaultCount(bof_buffer_size_idx));
|
||||
file_mgr->DetectMIME(data, len, &matches);
|
||||
bool bof_was_full = bof_buffer.full;
|
||||
// Buffer enough data for the BOF buffer
|
||||
BufferBOF(data, len);
|
||||
|
||||
if ( matches.empty() )
|
||||
return false;
|
||||
if ( ! did_mime_type && bof_buffer.full &&
|
||||
LookupFieldDefaultCount(missing_bytes_idx) == 0 )
|
||||
DetectMIME();
|
||||
|
||||
val->Assign(mime_type_idx,
|
||||
new StringVal(*(matches.begin()->second.begin())));
|
||||
val->Assign(mime_types_idx, file_analysis::GenMIMEMatchesVal(matches));
|
||||
DBG_LOG(DBG_FILE_ANALYSIS,
|
||||
"[%s] %" PRIu64 " stream bytes in at offset %" PRIu64 "; %s [%s%s]",
|
||||
id.c_str(), len, stream_offset,
|
||||
IsComplete() ? "complete" : "incomplete",
|
||||
fmt_bytes((const char*) data, min((uint64)40, len)),
|
||||
len > 40 ? "..." : "");
|
||||
|
||||
return true;
|
||||
}
|
||||
file_analysis::Analyzer* a = 0;
|
||||
IterCookie* c = analyzers.InitForIteration();
|
||||
|
||||
void File::ReplayBOF()
|
||||
{
|
||||
if ( bof_buffer.replayed )
|
||||
return;
|
||||
|
||||
bof_buffer.replayed = true;
|
||||
|
||||
if ( bof_buffer.chunks.empty() )
|
||||
while ( (a = analyzers.NextEntry(c)) )
|
||||
{
|
||||
// Since we missed the beginning, try file type detect on next data in.
|
||||
missed_bof = true;
|
||||
return;
|
||||
if ( ! a->GotStreamDelivery() )
|
||||
{
|
||||
int num_bof_chunks_behind = bof_buffer.chunks.size();
|
||||
|
||||
if ( ! bof_was_full )
|
||||
// We just added a chunk to the BOF buffer, don't count it
|
||||
// as it will get delivered on its own.
|
||||
num_bof_chunks_behind -= 1;
|
||||
|
||||
uint64 bytes_delivered = 0;
|
||||
|
||||
// Catch this analyzer up with the BOF buffer.
|
||||
for ( int i = 0; i < num_bof_chunks_behind; ++i )
|
||||
{
|
||||
if ( ! a->DeliverStream(bof_buffer.chunks[i]->Bytes(),
|
||||
bof_buffer.chunks[i]->Len()) )
|
||||
analyzers.QueueRemove(a->Tag(), a->Args());
|
||||
|
||||
bytes_delivered += bof_buffer.chunks[i]->Len();
|
||||
}
|
||||
|
||||
a->SetGotStreamDelivery();
|
||||
// May need to catch analyzer up on missed gap?
|
||||
// Analyzer should be fully caught up to stream_offset now.
|
||||
}
|
||||
|
||||
if ( ! a->DeliverStream(data, len) )
|
||||
analyzers.QueueRemove(a->Tag(), a->Args());
|
||||
}
|
||||
|
||||
BroString* bs = concatenate(bof_buffer.chunks);
|
||||
val->Assign(bof_buffer_idx, new StringVal(bs));
|
||||
stream_offset += len;
|
||||
IncrementByteCount(len, seen_bytes_idx);
|
||||
}
|
||||
|
||||
DetectMIME(bs->Bytes(), bs->Len());
|
||||
FileEvent(file_new);
|
||||
void File::DeliverChunk(const u_char* data, uint64 len, uint64 offset)
|
||||
{
|
||||
// Potentially handle reassembly and deliver to the stream analyzers.
|
||||
if ( file_reassembler )
|
||||
{
|
||||
if ( reassembly_max_buffer > 0 &&
|
||||
reassembly_max_buffer < file_reassembler->TotalSize() )
|
||||
{
|
||||
uint64 current_offset = stream_offset;
|
||||
uint64 gap_bytes = file_reassembler->Flush();
|
||||
IncrementByteCount(gap_bytes, overflow_bytes_idx);
|
||||
|
||||
for ( size_t i = 0; i < bof_buffer.chunks.size(); ++i )
|
||||
DataIn(bof_buffer.chunks[i]->Bytes(), bof_buffer.chunks[i]->Len());
|
||||
if ( FileEventAvailable(file_reassembly_overflow) )
|
||||
{
|
||||
val_list* vl = new val_list();
|
||||
vl->append(val->Ref());
|
||||
vl->append(new Val(current_offset, TYPE_COUNT));
|
||||
vl->append(new Val(gap_bytes, TYPE_COUNT));
|
||||
FileEvent(file_reassembly_overflow, vl);
|
||||
}
|
||||
}
|
||||
|
||||
// Forward data to the reassembler.
|
||||
file_reassembler->NewBlock(network_time, offset, len, data);
|
||||
}
|
||||
else if ( stream_offset == offset )
|
||||
{
|
||||
// This is the normal case where a file is transferred linearly.
|
||||
// Nothing special should be done here.
|
||||
DeliverStream(data, len);
|
||||
}
|
||||
else if ( reassembly_enabled )
|
||||
{
|
||||
// This is data that doesn't match the offset and the reassembler
|
||||
// needs to be enabled.
|
||||
file_reassembler = new FileReassembler(this, stream_offset);
|
||||
file_reassembler->NewBlock(network_time, offset, len, data);
|
||||
}
|
||||
else
|
||||
{
|
||||
// We can't reassemble so we throw out the data for streaming.
|
||||
IncrementByteCount(len, overflow_bytes_idx);
|
||||
}
|
||||
|
||||
DBG_LOG(DBG_FILE_ANALYSIS,
|
||||
"[%s] %" PRIu64 " chunk bytes in at offset %" PRIu64 "; %s [%s%s]",
|
||||
id.c_str(), len, offset,
|
||||
IsComplete() ? "complete" : "incomplete",
|
||||
fmt_bytes((const char*) data, min((uint64)40, len)),
|
||||
len > 40 ? "..." : "");
|
||||
|
||||
file_analysis::Analyzer* a = 0;
|
||||
IterCookie* c = analyzers.InitForIteration();
|
||||
|
||||
while ( (a = analyzers.NextEntry(c)) )
|
||||
{
|
||||
if ( ! a->DeliverChunk(data, len, offset) )
|
||||
{
|
||||
analyzers.QueueRemove(a->Tag(), a->Args());
|
||||
}
|
||||
}
|
||||
|
||||
if ( IsComplete() )
|
||||
EndOfFile();
|
||||
}
|
||||
|
||||
void File::DataIn(const u_char* data, uint64 len, uint64 offset)
|
||||
{
|
||||
analyzers.DrainModifications();
|
||||
|
||||
if ( first_chunk )
|
||||
{
|
||||
// TODO: this should all really be delayed until we attempt reassembly
|
||||
DetectMIME(data, len);
|
||||
FileEvent(file_new);
|
||||
first_chunk = false;
|
||||
}
|
||||
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "[%s] %" PRIu64 " bytes in at offset" PRIu64 "; %s [%s]",
|
||||
id.c_str(), len, offset,
|
||||
IsComplete() ? "complete" : "incomplete",
|
||||
fmt_bytes((const char*) data, min((uint64)40, len)), len > 40 ? "..." : "");
|
||||
|
||||
file_analysis::Analyzer* a = 0;
|
||||
IterCookie* c = analyzers.InitForIteration();
|
||||
|
||||
while ( (a = analyzers.NextEntry(c)) )
|
||||
{
|
||||
if ( ! a->DeliverChunk(data, len, offset) )
|
||||
analyzers.QueueRemove(a->Tag(), a->Args());
|
||||
}
|
||||
|
||||
DeliverChunk(data, len, offset);
|
||||
analyzers.DrainModifications();
|
||||
|
||||
// TODO: check reassembly requirement based on buffer size in record
|
||||
if ( need_reassembly )
|
||||
reporter->InternalError("file_analyzer::File TODO: reassembly not yet supported");
|
||||
|
||||
// TODO: reassembly overflow stuff, increment overflow count, eval trigger
|
||||
|
||||
IncrementByteCount(len, seen_bytes_idx);
|
||||
}
|
||||
|
||||
void File::DataIn(const u_char* data, uint64 len)
|
||||
{
|
||||
analyzers.DrainModifications();
|
||||
|
||||
if ( BufferBOF(data, len) )
|
||||
return;
|
||||
|
||||
if ( missed_bof )
|
||||
{
|
||||
DetectMIME(data, len);
|
||||
FileEvent(file_new);
|
||||
missed_bof = false;
|
||||
}
|
||||
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "[%s] %" PRIu64 " bytes in; %s [%s]",
|
||||
id.c_str(), len,
|
||||
IsComplete() ? "complete" : "incomplete",
|
||||
fmt_bytes((const char*) data, min((uint64)40, len)), len > 40 ? "..." : "");
|
||||
|
||||
file_analysis::Analyzer* a = 0;
|
||||
IterCookie* c = analyzers.InitForIteration();
|
||||
|
||||
while ( (a = analyzers.NextEntry(c)) )
|
||||
{
|
||||
if ( ! a->DeliverStream(data, len) )
|
||||
{
|
||||
analyzers.QueueRemove(a->Tag(), a->Args());
|
||||
continue;
|
||||
}
|
||||
|
||||
uint64 offset = LookupFieldDefaultCount(seen_bytes_idx) +
|
||||
LookupFieldDefaultCount(missing_bytes_idx);
|
||||
|
||||
if ( ! a->DeliverChunk(data, len, offset) )
|
||||
analyzers.QueueRemove(a->Tag(), a->Args());
|
||||
}
|
||||
|
||||
DeliverChunk(data, len, stream_offset);
|
||||
analyzers.DrainModifications();
|
||||
IncrementByteCount(len, seen_bytes_idx);
|
||||
}
|
||||
|
||||
void File::EndOfFile()
|
||||
|
@ -413,10 +492,17 @@ void File::EndOfFile()
|
|||
if ( done )
|
||||
return;
|
||||
|
||||
if ( ! did_mime_type &&
|
||||
LookupFieldDefaultCount(missing_bytes_idx) == 0 )
|
||||
DetectMIME();
|
||||
|
||||
analyzers.DrainModifications();
|
||||
|
||||
// Send along anything that's been buffered, but never flushed.
|
||||
ReplayBOF();
|
||||
if ( file_reassembler )
|
||||
{
|
||||
file_reassembler->Flush();
|
||||
analyzers.DrainModifications();
|
||||
}
|
||||
|
||||
done = true;
|
||||
|
||||
|
@ -436,14 +522,17 @@ void File::EndOfFile()
|
|||
|
||||
void File::Gap(uint64 offset, uint64 len)
|
||||
{
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "[%s] Gap of size %" PRIu64 " at offset %" PRIu64,
|
||||
DBG_LOG(DBG_FILE_ANALYSIS, "[%s] Gap of size %" PRIu64 " at offset %," PRIu64,
|
||||
id.c_str(), len, offset);
|
||||
|
||||
analyzers.DrainModifications();
|
||||
if ( file_reassembler && ! file_reassembler->IsCurrentlyFlushing() )
|
||||
{
|
||||
file_reassembler->FlushTo(offset + len);
|
||||
// The reassembler will call us back with all the gaps we need to know.
|
||||
return;
|
||||
}
|
||||
|
||||
// If we were buffering the beginning of the file, a gap means we've got
|
||||
// as much contiguous stuff at the beginning as possible, so work with that.
|
||||
ReplayBOF();
|
||||
analyzers.DrainModifications();
|
||||
|
||||
file_analysis::Analyzer* a = 0;
|
||||
IterCookie* c = analyzers.InitForIteration();
|
||||
|
@ -464,6 +553,8 @@ void File::Gap(uint64 offset, uint64 len)
|
|||
}
|
||||
|
||||
analyzers.DrainModifications();
|
||||
|
||||
stream_offset += len;
|
||||
IncrementByteCount(len, missing_bytes_idx);
|
||||
}
|
||||
|
||||
|
@ -482,30 +573,13 @@ void File::FileEvent(EventHandlerPtr h)
|
|||
FileEvent(h, vl);
|
||||
}
|
||||
|
||||
static void flush_file_event_queue(queue<pair<EventHandlerPtr, val_list*> >& q)
|
||||
{
|
||||
while ( ! q.empty() )
|
||||
{
|
||||
pair<EventHandlerPtr, val_list*> p = q.front();
|
||||
mgr.QueueEvent(p.first, p.second);
|
||||
q.pop();
|
||||
}
|
||||
}
|
||||
|
||||
void File::FileEvent(EventHandlerPtr h, val_list* vl)
|
||||
{
|
||||
if ( h == file_state_remove )
|
||||
flush_file_event_queue(fonc_queue);
|
||||
|
||||
mgr.QueueEvent(h, vl);
|
||||
|
||||
if ( h == file_new )
|
||||
{
|
||||
did_file_new_event = true;
|
||||
flush_file_event_queue(fonc_queue);
|
||||
}
|
||||
|
||||
if ( h == file_new || h == file_timeout || h == file_extraction_limit )
|
||||
if ( h == file_new || h == file_over_new_connection ||
|
||||
h == file_mime_type ||
|
||||
h == file_timeout || h == file_extraction_limit )
|
||||
{
|
||||
// immediate feedback is required for these events.
|
||||
mgr.Drain();
|
||||
|
|
|
@ -3,11 +3,11 @@
|
|||
#ifndef FILE_ANALYSIS_FILE_H
|
||||
#define FILE_ANALYSIS_FILE_H
|
||||
|
||||
#include <queue>
|
||||
#include <string>
|
||||
#include <utility>
|
||||
#include <vector>
|
||||
|
||||
#include "FileReassembler.h"
|
||||
#include "Conn.h"
|
||||
#include "Val.h"
|
||||
#include "Tag.h"
|
||||
|
@ -16,6 +16,8 @@
|
|||
|
||||
namespace file_analysis {
|
||||
|
||||
class FileReassembler;
|
||||
|
||||
/**
|
||||
* Wrapper class around \c fa_file record values from script layer.
|
||||
*/
|
||||
|
@ -86,10 +88,10 @@ public:
|
|||
void SetTotalBytes(uint64 size);
|
||||
|
||||
/**
|
||||
* Compares "seen_bytes" field to "total_bytes" field of #val record to
|
||||
* determine if the full file has been seen.
|
||||
* @return false if "total_bytes" hasn't been set yet or "seen_bytes" is
|
||||
* less than it, else true.
|
||||
* @return true if file analysis is complete for the file, else false.
|
||||
* It is incomplete if the total size is unknown or if the number of bytes
|
||||
* streamed to analyzers (either as data deliveries or gap information)
|
||||
* does not match the known total size.
|
||||
*/
|
||||
bool IsComplete() const;
|
||||
|
||||
|
@ -166,18 +168,20 @@ public:
|
|||
|
||||
protected:
|
||||
friend class Manager;
|
||||
friend class FileReassembler;
|
||||
|
||||
/**
|
||||
* Constructor; only file_analysis::Manager should be creating these.
|
||||
* @param file_id an identifier string for the file in pretty hash form
|
||||
* (similar to connection uids).
|
||||
* @param source_name the value for the source field to fill in.
|
||||
* @param conn a network connection over which the file is transferred.
|
||||
* @param tag the network protocol over which the file is transferred.
|
||||
* @param is_orig true if the file is being transferred from the originator
|
||||
* of the connection to the responder. False indicates the other
|
||||
* direction.
|
||||
*/
|
||||
File(const string& file_id, Connection* conn = 0,
|
||||
File(const string& file_id, const string& source_name, Connection* conn = 0,
|
||||
analyzer::Tag tag = analyzer::Tag::Error, bool is_orig = false);
|
||||
|
||||
/**
|
||||
|
@ -185,8 +189,14 @@ protected:
|
|||
* \c conn_id and UID taken from \a conn.
|
||||
* @param conn the connection over which a part of the file has been seen.
|
||||
* @param is_orig true if the connection originator is sending the file.
|
||||
* @return true if the connection was previously unknown.
|
||||
*/
|
||||
void UpdateConnectionFields(Connection* conn, bool is_orig);
|
||||
bool UpdateConnectionFields(Connection* conn, bool is_orig);
|
||||
|
||||
/**
|
||||
* Raise the file_over_new_connection event with given arguments.
|
||||
*/
|
||||
void RaiseFileOverNewConnection(Connection* conn, bool is_orig);
|
||||
|
||||
/**
|
||||
* Increment a byte count field of #val record by \a size.
|
||||
|
@ -219,20 +229,40 @@ protected:
|
|||
*/
|
||||
bool BufferBOF(const u_char* data, uint64 len);
|
||||
|
||||
/**
|
||||
* Forward any beginning-of-file buffered data on to DataIn stream.
|
||||
*/
|
||||
void ReplayBOF();
|
||||
|
||||
/**
|
||||
* Does mime type detection via file magic signatures and assigns
|
||||
* strongest matching mime type (if available) to \c mime_type
|
||||
* field in #val.
|
||||
* @param data pointer to a chunk of file data.
|
||||
* @param len number of bytes in the data chunk.
|
||||
* field in #val. It uses the data in the BOF buffer.
|
||||
* @return whether a mime type match was found.
|
||||
*/
|
||||
bool DetectMIME(const u_char* data, uint64 len);
|
||||
bool DetectMIME();
|
||||
|
||||
/**
|
||||
* Enables reassembly on the file.
|
||||
*/
|
||||
void EnableReassembly();
|
||||
|
||||
/**
|
||||
* Disables reassembly on the file. If there is an existing reassembler
|
||||
* for the file, this will cause it to be deleted and won't allow a new
|
||||
* one to be created until reassembly is reenabled.
|
||||
*/
|
||||
void DisableReassembly();
|
||||
|
||||
/**
|
||||
* Set a maximum allowed bytes of memory for file reassembly for this file.
|
||||
*/
|
||||
void SetReassemblyBuffer(uint64 max);
|
||||
|
||||
/**
|
||||
* Perform stream-wise delivery for analyzers that need it.
|
||||
*/
|
||||
void DeliverStream(const u_char* data, uint64 len);
|
||||
|
||||
/**
|
||||
* Perform chunk-wise delivery for analyzers that need it.
|
||||
*/
|
||||
void DeliverChunk(const u_char* data, uint64 len, uint64 offset);
|
||||
|
||||
/**
|
||||
* Lookup a record field index/offset by name.
|
||||
|
@ -246,25 +276,24 @@ protected:
|
|||
*/
|
||||
static void StaticInit();
|
||||
|
||||
private:
|
||||
protected:
|
||||
string id; /**< A pretty hash that likely identifies file */
|
||||
RecordVal* val; /**< \c fa_file from script layer. */
|
||||
FileReassembler* file_reassembler; /**< A reassembler for the file if it's needed. */
|
||||
uint64 stream_offset; /**< The offset of the file which has been forwarded. */
|
||||
uint64 reassembly_max_buffer; /**< Maximum allowed buffer for reassembly. */
|
||||
bool did_mime_type; /**< Whether the mime type ident has already been attempted. */
|
||||
bool reassembly_enabled; /**< Whether file stream reassembly is needed. */
|
||||
bool postpone_timeout; /**< Whether postponing timeout is requested. */
|
||||
bool first_chunk; /**< Track first non-linear chunk. */
|
||||
bool missed_bof; /**< Flags that we missed start of file. */
|
||||
bool need_reassembly; /**< Whether file stream reassembly is needed. */
|
||||
bool done; /**< If this object is about to be deleted. */
|
||||
bool did_file_new_event; /**< Whether the file_new event has been done. */
|
||||
AnalyzerSet analyzers; /**< A set of attached file analyzer. */
|
||||
queue<pair<EventHandlerPtr, val_list*> > fonc_queue;
|
||||
AnalyzerSet analyzers; /**< A set of attached file analyzers. */
|
||||
|
||||
struct BOF_Buffer {
|
||||
BOF_Buffer() : full(false), replayed(false), size(0) {}
|
||||
BOF_Buffer() : full(false), size(0) {}
|
||||
~BOF_Buffer()
|
||||
{ for ( size_t i = 0; i < chunks.size(); ++i ) delete chunks[i]; }
|
||||
|
||||
bool full;
|
||||
bool replayed;
|
||||
uint64 size;
|
||||
BroString::CVec chunks;
|
||||
} bof_buffer; /**< Beginning of file buffer. */
|
||||
|
|
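File.h above only declares the beginning-of-file (BOF) buffer that BufferBOF()/ReplayBOF()/DetectMIME() operate on. The following self-contained sketch shows the general buffering idea, accumulating leading chunks up to a threshold and replaying them as one contiguous prefix for signature matching; the BofBuffer class is invented for illustration and is not Zeek's implementation:

#include <algorithm>
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

// Invented BOF-buffer stand-in: keep the first 'capacity' bytes of a file,
// arriving in arbitrary-sized chunks, and replay them as one contiguous blob.
class BofBuffer
	{
public:
	explicit BofBuffer(uint64_t capacity) : capacity_(capacity) { }

	// Buffer (a prefix of) the chunk; returns true once the buffer is full.
	bool Add(const unsigned char* data, uint64_t len)
		{
		uint64_t take = std::min(len, capacity_ - size_);
		chunks_.emplace_back(reinterpret_cast<const char*>(data), take);
		size_ += take;
		return size_ >= capacity_;
		}

	// Hand the buffered prefix out as one contiguous string, the shape a
	// DetectMIME()-style signature matcher wants to see.
	std::string Replay() const
		{
		std::string out;
		out.reserve(size_);
		for ( const auto& c : chunks_ )
			out += c;
		return out;
		}

private:
	uint64_t capacity_;
	uint64_t size_ = 0;
	std::vector<std::string> chunks_;
	};

int main()
	{
	BofBuffer bof(8); // tiny threshold just for the demo

	const unsigned char part1[] = "%PDF";
	const unsigned char part2[] = "-1.7 rest of the file...";

	bof.Add(part1, 4);
	if ( bof.Add(part2, sizeof(part2) - 1) )
		std::cout << "buffered prefix: " << bof.Replay() << "\n"; // "%PDF-1.7"
	return 0;
	}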
src/file_analysis/FileReassembler.cc (new file, 123 lines)

@@ -0,0 +1,123 @@
#include "FileReassembler.h"
#include "File.h"

namespace file_analysis {

class File;

FileReassembler::FileReassembler(File *f, uint64 starting_offset)
	: Reassembler(starting_offset), the_file(f), flushing(false)
	{
	}

FileReassembler::~FileReassembler()
	{
	}

uint64 FileReassembler::Flush()
	{
	if ( flushing )
		return 0;

	if ( last_block )
		{
		// This is expected to call back into FileReassembler::Undelivered().
		flushing = true;
		uint64 rval = TrimToSeq(last_block->upper);
		flushing = false;
		return rval;
		}

	return 0;
	}

uint64 FileReassembler::FlushTo(uint64 sequence)
	{
	if ( flushing )
		return 0;

	flushing = true;
	uint64 rval = TrimToSeq(sequence);
	flushing = false;
	last_reassem_seq = sequence;
	return rval;
	}

void FileReassembler::BlockInserted(DataBlock* start_block)
	{
	if ( start_block->seq > last_reassem_seq ||
	     start_block->upper <= last_reassem_seq )
		return;

	for ( DataBlock* b = start_block;
	      b && b->seq <= last_reassem_seq; b = b->next )
		{
		if ( b->seq == last_reassem_seq )
			{ // New stuff.
			uint64 len = b->Size();
			last_reassem_seq += len;
			the_file->DeliverStream(b->block, len);
			}
		}

	// Throw out forwarded data.
	TrimToSeq(last_reassem_seq);
	}

void FileReassembler::Undelivered(uint64 up_to_seq)
	{
	// If we have blocks that begin below up_to_seq, deliver them.
	DataBlock* b = blocks;

	while ( b )
		{
		if ( b->seq < last_reassem_seq )
			{
			// Already delivered this block.
			b = b->next;
			continue;
			}

		if ( b->seq >= up_to_seq )
			// Block is beyond what we need to process at this point.
			break;

		uint64 gap_at_seq = last_reassem_seq;
		uint64 gap_len = b->seq - last_reassem_seq;
		the_file->Gap(gap_at_seq, gap_len);
		last_reassem_seq += gap_len;
		BlockInserted(b);
		// Inserting a block may cause trimming of what's buffered,
		// so have to assume 'b' is invalid, hence re-assign to start.
		b = blocks;
		}

	if ( up_to_seq > last_reassem_seq )
		{
		the_file->Gap(last_reassem_seq, up_to_seq - last_reassem_seq);
		last_reassem_seq = up_to_seq;
		}
	}

void FileReassembler::Overlap(const u_char* b1, const u_char* b2, uint64 n)
	{
	// Not doing anything here yet.
	}

IMPLEMENT_SERIAL(FileReassembler, SER_FILE_REASSEMBLER);

bool FileReassembler::DoSerialize(SerialInfo* info) const
	{
	reporter->InternalError("FileReassembler::DoSerialize not implemented");
	return false; // Cannot be reached.
	}

bool FileReassembler::DoUnserialize(UnserialInfo* info)
	{
	reporter->InternalError("FileReassembler::DoUnserialize not implemented");
	return false; // Cannot be reached.
	}

} // end file_analysis
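The core of FileReassembler::BlockInserted()/Undelivered() above is turning an out-of-order collection of blocks into an in-order byte stream plus explicit gap notifications. Here is a minimal self-contained sketch of that idea; the Blocks map and the deliver()/gap() helpers are stand-ins invented for the example, not Zeek's Reassembler API:

#include <cstdint>
#include <iostream>
#include <map>
#include <string>

// Invented stand-ins: a reassembly buffer keyed by starting offset, plus the
// two callbacks a File would receive (in Zeek: DeliverStream() and Gap()).
using Blocks = std::map<uint64_t, std::string>;

static void deliver(uint64_t off, const std::string& data)
	{
	std::cout << "deliver @" << off << ": \"" << data << "\"\n";
	}

static void gap(uint64_t off, uint64_t len)
	{
	std::cout << "gap     @" << off << ": " << len << " bytes\n";
	}

// Forward everything below 'upto' as an in-order stream, reporting missing
// ranges as gaps (roughly what Undelivered()/BlockInserted() do on a flush).
static uint64_t flush_to(Blocks& blocks, uint64_t next_seq, uint64_t upto)
	{
	for ( auto it = blocks.begin(); it != blocks.end() && it->first < upto; )
		{
		if ( it->first > next_seq )
			{
			gap(next_seq, it->first - next_seq); // hole before this block
			next_seq = it->first;
			}

		deliver(next_seq, it->second);
		next_seq += it->second.size();
		it = blocks.erase(it);
		}

	if ( upto > next_seq )
		{
		gap(next_seq, upto - next_seq); // trailing hole
		next_seq = upto;
		}

	return next_seq;
	}

int main()
	{
	Blocks buf;
	buf[0] = "HELLO";   // bytes [0,5)
	buf[10] = "WORLD";  // bytes [10,15); bytes [5,10) were never seen

	uint64_t next = flush_to(buf, 0, 20); // gaps reported at [5,10) and [15,20)
	std::cout << "stream offset now " << next << "\n";
	return 0;
	}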
src/file_analysis/FileReassembler.h (new file, 65 lines)

@@ -0,0 +1,65 @@
#ifndef FILE_ANALYSIS_FILEREASSEMBLER_H
#define FILE_ANALYSIS_FILEREASSEMBLER_H

#include "Reassem.h"
#include "File.h"

class BroFile;
class Connection;

namespace file_analysis {

class File;

class FileReassembler : public Reassembler {
public:

	FileReassembler(File* f, uint64 starting_offset);
	virtual ~FileReassembler();

	void Done();

	// Checks if we have delivered all contents that we can possibly
	// deliver for this endpoint.
	void CheckEOF();

	/**
	 * Discards all contents of the reassembly buffer. This will spin through
	 * the buffer and call File::DeliverStream() and File::Gap() wherever
	 * appropriate.
	 * @return the number of new bytes now detected as gaps in the file.
	 */
	uint64 Flush();

	/**
	 * Discards all contents of the reassembly buffer up to a given sequence
	 * number. This will spin through the buffer and call
	 * File::DeliverStream() and File::Gap() wherever appropriate.
	 * @param sequence the sequence number to flush until.
	 * @return the number of new bytes now detected as gaps in the file.
	 */
	uint64 FlushTo(uint64 sequence);

	/**
	 * @return whether the reassembler is currently in the process of flushing
	 * out the contents of its buffer.
	 */
	bool IsCurrentlyFlushing() const
		{ return flushing; }

protected:
	FileReassembler() { }

	DECLARE_SERIAL(FileReassembler);

	void Undelivered(uint64 up_to_seq);
	void BlockInserted(DataBlock* b);
	void Overlap(const u_char* b1, const u_char* b2, uint64 n);

	File* the_file;
	bool flushing;
};

} // namespace file_analysis

#endif
src/file_analysis/Manager.cc

@@ -154,14 +154,12 @@ string Manager::DataIn(const u_char* data, uint64 len, analyzer::Tag tag,
void Manager::DataIn(const u_char* data, uint64 len, const string& file_id,
                     const string& source)
	{
	File* file = GetFile(file_id);
	File* file = GetFile(file_id, 0, analyzer::Tag::Error, false, false,
	                     source.c_str());

	if ( ! file )
		return;

	if ( file->GetSource().empty() )
		file->SetSource(source);

	file->DataIn(data, len);

	if ( file->IsComplete() )

@@ -232,6 +230,39 @@ bool Manager::SetTimeoutInterval(const string& file_id, double interval) const
	return true;
	}

bool Manager::EnableReassembly(const string& file_id)
	{
	File* file = LookupFile(file_id);

	if ( ! file )
		return false;

	file->EnableReassembly();
	return true;
	}

bool Manager::DisableReassembly(const string& file_id)
	{
	File* file = LookupFile(file_id);

	if ( ! file )
		return false;

	file->DisableReassembly();
	return true;
	}

bool Manager::SetReassemblyBuffer(const string& file_id, uint64 max)
	{
	File* file = LookupFile(file_id);

	if ( ! file )
		return false;

	file->SetReassemblyBuffer(max);
	return true;
	}

bool Manager::SetExtractionLimit(const string& file_id, RecordVal* args,
                                 uint64 n) const
	{

@@ -254,28 +285,6 @@ bool Manager::AddAnalyzer(const string& file_id, file_analysis::Tag tag,
	return file->AddAnalyzer(tag, args);
	}

TableVal* Manager::AddAnalyzersForMIMEType(const string& file_id, const string& mtype,
                                           RecordVal* args)
	{
	if ( ! tag_set_type )
		tag_set_type = internal_type("files_tag_set")->AsTableType();

	TableVal* sval = new TableVal(tag_set_type);
	TagSet* l = LookupMIMEType(mtype, false);

	if ( ! l )
		return sval;

	for ( TagSet::const_iterator i = l->begin(); i != l->end(); i++ )
		{
		file_analysis::Tag tag = *i;
		if ( AddAnalyzer(file_id, tag, args) )
			sval->Assign(tag.AsEnumVal(), 0);
		}

	return sval;
	}

bool Manager::RemoveAnalyzer(const string& file_id, file_analysis::Tag tag,
                             RecordVal* args) const
	{

@@ -288,7 +297,8 @@ bool Manager::RemoveAnalyzer(const string& file_id, file_analysis::Tag tag,
	}

File* Manager::GetFile(const string& file_id, Connection* conn,
                       analyzer::Tag tag, bool is_orig, bool update_conn)
                       analyzer::Tag tag, bool is_orig, bool update_conn,
                       const char* source_name)
	{
	if ( file_id.empty() )
		return 0;

@@ -300,10 +310,19 @@ File* Manager::GetFile(const string& file_id, Connection* conn,

	if ( ! rval )
		{
		rval = new File(file_id, conn, tag, is_orig);
		rval = new File(file_id,
		                source_name ? source_name
		                            : analyzer_mgr->GetComponentName(tag),
		                conn, tag, is_orig);
		id_map.Insert(file_id.c_str(), rval);
		rval->ScheduleInactivityTimer();

		// Generate file_new after inserting it into manager's mapping
		// in case script-layer calls back in to core from the event.
		rval->FileEvent(file_new);
		// Same for file_over_new_connection.
		rval->RaiseFileOverNewConnection(conn, is_orig);

		if ( IsIgnored(file_id) )
			return 0;
		}

@@ -311,8 +330,8 @@ File* Manager::GetFile(const string& file_id, Connection* conn,
		{
		rval->UpdateLastActivityTime();

		if ( update_conn )
			rval->UpdateConnectionFields(conn, is_orig);
		if ( update_conn && rval->UpdateConnectionFields(conn, is_orig) )
			rval->RaiseFileOverNewConnection(conn, is_orig);
		}

	return rval;

@@ -461,63 +480,6 @@ Analyzer* Manager::InstantiateAnalyzer(Tag tag, RecordVal* args, File* f) const
	return a;
	}

Manager::TagSet* Manager::LookupMIMEType(const string& mtype, bool add_if_not_found)
	{
	MIMEMap::const_iterator i = mime_types.find(to_upper(mtype));

	if ( i != mime_types.end() )
		return i->second;

	if ( ! add_if_not_found )
		return 0;

	TagSet* l = new TagSet;
	mime_types.insert(std::make_pair(to_upper(mtype), l));
	return l;
	}

bool Manager::RegisterAnalyzerForMIMEType(EnumVal* tag, StringVal* mtype)
	{
	Component* p = Lookup(tag);

	if ( ! p )
		return false;

	return RegisterAnalyzerForMIMEType(p->Tag(), mtype->CheckString());
	}

bool Manager::RegisterAnalyzerForMIMEType(Tag tag, const string& mtype)
	{
	TagSet* l = LookupMIMEType(mtype, true);

	DBG_LOG(DBG_FILE_ANALYSIS, "Register analyzer %s for MIME type %s",
	        GetComponentName(tag).c_str(), mtype.c_str());

	l->insert(tag);
	return true;
	}

bool Manager::UnregisterAnalyzerForMIMEType(EnumVal* tag, StringVal* mtype)
	{
	Component* p = Lookup(tag);

	if ( ! p )
		return false;

	return UnregisterAnalyzerForMIMEType(p->Tag(), mtype->CheckString());
	}

bool Manager::UnregisterAnalyzerForMIMEType(Tag tag, const string& mtype)
	{
	TagSet* l = LookupMIMEType(mtype, true);

	DBG_LOG(DBG_FILE_ANALYSIS, "Unregister analyzer %s for MIME type %s",
	        GetComponentName(tag).c_str(), mtype.c_str());

	l->erase(tag);
	return true;
	}

RuleMatcher::MIME_Matches* Manager::DetectMIME(const u_char* data, uint64 len,
                                               RuleMatcher::MIME_Matches* rval) const
	{
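One detail worth noting in Manager::GetFile() above: the new File is inserted into the manager's id_map before file_new is raised, so an event handler that calls back into the manager already finds the file. A self-contained sketch of that create-then-notify ordering follows; Registry, Thing and on_new are invented names for illustration, not Zeek types:

#include <functional>
#include <iostream>
#include <map>
#include <memory>
#include <string>

// Invented get-or-create lookup that inserts the new object into its map
// *before* invoking the creation callback, so a callback that re-enters the
// registry (as script-layer handlers can in Zeek) already finds the object.
struct Thing
	{
	std::string id;
	};

class Registry
	{
public:
	explicit Registry(std::function<void(const Thing&)> on_new)
		: on_new_(std::move(on_new)) { }

	Thing* GetOrCreate(const std::string& id)
		{
		auto it = things_.find(id);
		if ( it != things_.end() )
			return it->second.get();

		auto owned = std::make_unique<Thing>();
		owned->id = id;
		Thing* raw = owned.get();
		things_[id] = std::move(owned); // insert first ...
		on_new_(*raw);                  // ... then notify
		return raw;
		}

	Thing* Lookup(const std::string& id)
		{
		auto it = things_.find(id);
		return it == things_.end() ? nullptr : it->second.get();
		}

private:
	std::map<std::string, std::unique_ptr<Thing>> things_;
	std::function<void(const Thing&)> on_new_;
	};

int main()
	{
	Registry* self = nullptr;
	Registry reg([&self](const Thing& t)
		{
		// The re-entrant lookup already succeeds because insertion came first.
		std::cout << "new object " << t.id << ", visible to lookup: "
		          << (self->Lookup(t.id) != nullptr) << "\n";
		});
	self = &reg;
	reg.GetOrCreate("Fabc123");
	return 0;
	}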
src/file_analysis/Manager.h

@@ -213,6 +213,21 @@
	 */
	bool SetTimeoutInterval(const string& file_id, double interval) const;

	/**
	 * Enable the reassembler for a file.
	 */
	bool EnableReassembly(const string& file_id);

	/**
	 * Disable the reassembler for a file.
	 */
	bool DisableReassembly(const string& file_id);

	/**
	 * Set the reassembly buffer size for a file, in bytes.
	 */
	bool SetReassemblyBuffer(const string& file_id, uint64 max);

	/**
	 * Sets a limit on the maximum size allowed for extracting the file
	 * to local disk;

@@ -238,18 +253,6 @@
	bool AddAnalyzer(const string& file_id, file_analysis::Tag tag,
	                 RecordVal* args) const;

	/**
	 * Queue attachment of all analyzers associated with a given MIME
	 * type to the file identifier.
	 *
	 * @param file_id the file identifier/hash.
	 * @param mtype the MIME type; comparisons will be performed case-insensitively.
	 * @param args a \c AnalyzerArgs value which describes a file analyzer.
	 * @return A ref'ed \c set[Tag] with all added analyzers.
	 */
	TableVal* AddAnalyzersForMIMEType(const string& file_id, const string& mtype,
	                                  RecordVal* args);

	/**
	 * Queue removal of an analyzer for a given file identifier.
	 * @param file_id the file identifier/hash.

@@ -277,62 +280,6 @@
	Analyzer* InstantiateAnalyzer(Tag tag, RecordVal* args, File* f) const;

	/**
	 * Registers a MIME type for an analyzer. Once registered, files of
	 * that MIME type will automatically get a corresponding analyzer
	 * assigned.
	 *
	 * @param tag The analyzer's tag as an enum of script type \c
	 * Files::Tag.
	 *
	 * @param mtype The MIME type. It will be matched case-insensitively.
	 *
	 * @return True if successful.
	 */
	bool RegisterAnalyzerForMIMEType(EnumVal* tag, StringVal* mtype);

	/**
	 * Registers a MIME type for an analyzer. Once registered, files of
	 * that MIME type will automatically get a corresponding analyzer
	 * assigned.
	 *
	 * @param tag The analyzer's tag as an enum of script type \c
	 * Files::Tag.
	 *
	 * @param mtype The MIME type. It will be matched case-insensitively.
	 *
	 * @return True if successful.
	 */
	bool RegisterAnalyzerForMIMEType(Tag tag, const string& mtype);

	/**
	 * Unregisters a MIME type for an analyzer.
	 *
	 * @param tag The analyzer's tag as an enum of script type \c
	 * Files::Tag.
	 *
	 * @param mtype The MIME type. It will be matched case-insensitively.
	 *
	 * @return True if successful (incl. when the type wasn't actually
	 * registered for the analyzer).
	 */
	bool UnregisterAnalyzerForMIMEType(EnumVal* tag, StringVal* mtype);

	/**
	 * Unregisters a MIME type for an analyzer.
	 *
	 * @param tag The analyzer's tag as an enum of script type \c
	 * Files::Tag.
	 *
	 * @param mtype The MIME type. It will be matched case-insensitively.
	 *
	 * @return True if successful (incl. when the type wasn't actually
	 * registered for the analyzer).
	 */
	bool UnregisterAnalyzerForMIMEType(Tag tag, const string& mtype);

	/**
	 * Returns a set of all matching MIME magic signatures for a given
	 * chunk of data.
	 * @param data A chunk of bytes to match magic MIME signatures against.

@@ -372,6 +319,7 @@ protected:
	 * this file isn't related to a connection).
	 * @param update_conn whether we need to update connection-related fields
	 * in the \c fa_file record value associated with the file.
	 * @param source_name an optional value of the source field to fill in.
	 * @return the File object mapped to \a file_id or a null pointer if
	 * analysis is being ignored for the associated file. A File
	 * object may be created if a mapping doesn't exist, and if it did

@@ -380,7 +328,8 @@ protected:
	 */
	File* GetFile(const string& file_id, Connection* conn = 0,
	              analyzer::Tag tag = analyzer::Tag::Error,
	              bool is_orig = false, bool update_conn = true);
	              bool is_orig = false, bool update_conn = true,
	              const char* source_name = 0);

	/**
	 * Try to retrieve a file that's being analyzed, using its identifier/hash.
src/file_analysis/analyzer/extract/Extract.cc

@@ -12,9 +12,9 @@ using namespace file_analysis;
Extract::Extract(RecordVal* args, File* file, const string& arg_filename,
                 uint64 arg_limit)
	: file_analysis::Analyzer(file_mgr->GetComponentTag("EXTRACT"), args, file),
	  filename(arg_filename), limit(arg_limit)
	  filename(arg_filename), limit(arg_limit), depth(0)
	{
	fd = open(filename.c_str(), O_WRONLY | O_CREAT | O_TRUNC, 0666);
	fd = open(filename.c_str(), O_WRONLY | O_CREAT | O_TRUNC | O_APPEND, 0666);

	if ( fd < 0 )
		{

@@ -53,7 +53,7 @@ file_analysis::Analyzer* Extract::Instantiate(RecordVal* args, File* file)
	                  limit->AsCount());
	}

static bool check_limit_exceeded(uint64 lim, uint64 off, uint64 len, uint64* n)
static bool check_limit_exceeded(uint64 lim, uint64 depth, uint64 len, uint64* n)
	{
	if ( lim == 0 )
		{

@@ -61,29 +61,31 @@ static bool check_limit_exceeded(uint64 lim, uint64 off, uint64 len, uint64* n)
		return false;
		}

	if ( off >= lim )
	if ( depth >= lim )
		{
		*n = 0;
		return true;
		}

	*n = lim - off;

	if ( len > *n )
	else if ( depth + len > lim )
		{
		*n = lim - depth;
		return true;
		}
	else
		{
		*n = len;
		}

	return false;
	}

bool Extract::DeliverChunk(const u_char* data, uint64 len, uint64 offset)
bool Extract::DeliverStream(const u_char* data, uint64 len)
	{
	if ( ! fd )
		return false;

	uint64 towrite = 0;
	bool limit_exceeded = check_limit_exceeded(limit, offset, len, &towrite);
	bool limit_exceeded = check_limit_exceeded(limit, depth, len, &towrite);

	if ( limit_exceeded && file_extraction_limit )
		{

@@ -92,16 +94,31 @@ bool Extract::DeliverChunk(const u_char* data, uint64 len, uint64 offset)
		vl->append(f->GetVal()->Ref());
		vl->append(Args()->Ref());
		vl->append(new Val(limit, TYPE_COUNT));
		vl->append(new Val(offset, TYPE_COUNT));
		vl->append(new Val(len, TYPE_COUNT));
		f->FileEvent(file_extraction_limit, vl);

		// Limit may have been modified by BIF, re-check it.
		limit_exceeded = check_limit_exceeded(limit, offset, len, &towrite);
		// Limit may have been modified by a BIF, re-check it.
		limit_exceeded = check_limit_exceeded(limit, depth, len, &towrite);
		}

	if ( towrite > 0 )
		safe_pwrite(fd, data, towrite, offset);
		{
		safe_write(fd, reinterpret_cast<const char*>(data), towrite);
		depth += towrite;
		}

	return ( ! limit_exceeded );
	}

bool Extract::Undelivered(uint64 offset, uint64 len)
	{
	if ( depth == offset )
		{
		char* tmp = new char[len]();
		safe_write(fd, tmp, len);
		delete [] tmp;
		depth += len;
		}

	return true;
	}
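The rewritten limit check above keys off the number of bytes already written (depth) rather than the chunk's file offset, and Undelivered() pads gaps with NUL bytes so the extracted file keeps its layout. Below is a self-contained mirror of the limit arithmetic with a few asserts covering the cases; it is an illustrative re-statement, not the Zeek function itself (the lim == 0 branch is assumed to mean "no limit configured"):

#include <cassert>
#include <cstdint>

// Illustrative mirror of the limit check above: 'depth' is how many bytes were
// already written to the extraction file, 'len' is the next chunk's size, and
// '*n' receives how much of it may still be written.
static bool limit_exceeded(uint64_t lim, uint64_t depth, uint64_t len, uint64_t* n)
	{
	if ( lim == 0 )
		{
		*n = len;          // no limit configured (assumption for this sketch)
		return false;
		}

	if ( depth >= lim )
		{
		*n = 0;            // already at or past the limit
		return true;
		}

	if ( depth + len > lim )
		{
		*n = lim - depth;  // only part of this chunk fits
		return true;
		}

	*n = len;
	return false;
	}

int main()
	{
	uint64_t n = 0;
	assert( ! limit_exceeded(0, 123, 10, &n) && n == 10 );  // unlimited
	assert(   limit_exceeded(100, 100, 10, &n) && n == 0 ); // nothing left to write
	assert(   limit_exceeded(100, 95, 10, &n) && n == 5 );  // partial write, then stop
	assert( ! limit_exceeded(100, 40, 10, &n) && n == 10 ); // fully under the limit
	return 0;
	}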
src/file_analysis/analyzer/extract/Extract.h

@@ -28,11 +28,18 @@ public:
	 * Write a chunk of file data to the local extraction file.
	 * @param data pointer to a chunk of file data.
	 * @param len number of bytes in the data chunk.
	 * @param offset number of bytes from start of file at which chunk starts.
	 * @return false if there was no extraction file open and the data couldn't
	 * be written, else true.
	 */
	virtual bool DeliverChunk(const u_char* data, uint64 len, uint64 offset);
	virtual bool DeliverStream(const u_char* data, uint64 len);

	/**
	 * Report undelivered bytes.
	 * @param offset distance into the file where the gap occurred.
	 * @param len number of bytes undelivered.
	 * @return true
	 */
	virtual bool Undelivered(uint64 offset, uint64 len);

	/**
	 * Create a new instance of an Extract analyzer.

@@ -67,6 +74,7 @@ private:
	string filename;
	int fd;
	uint64 limit;
	uint64 depth;
};

} // namespace file_analysis
src/file_analysis/analyzer/extract/events.bif

@@ -11,9 +11,7 @@
##
## limit: The limit, in bytes, the extracted file is about to breach.
##
## offset: The offset at which a file chunk is about to be written.
##
## len: The length of the file chunk about to be written.
##
## .. bro:see:: Files::add_analyzer Files::ANALYZER_EXTRACT
event file_extraction_limit%(f: fa_file, args: any, limit: count, offset: count, len: count%);
event file_extraction_limit%(f: fa_file, args: any, limit: count, len: count%);
src/file_analysis/analyzer/x509/X509.cc

@@ -147,7 +147,7 @@ RecordVal* file_analysis::X509::ParseCertificate(X509Val* cert_val)
#ifndef OPENSSL_NO_EC
	else if ( pkey->type == EVP_PKEY_EC )
		{
		pX509Cert->Assign(8, new StringVal("dsa"));
		pX509Cert->Assign(8, new StringVal("ecdsa"));
		pX509Cert->Assign(11, KeyCurve(pkey));
		}
#endif
src/file_analysis/files.bif

@@ -15,6 +15,27 @@ function Files::__set_timeout_interval%(file_id: string, t: interval%): bool
	return new Val(result, TYPE_BOOL);
	%}

## :bro:see:`Files::enable_reassembly`.
function Files::__enable_reassembly%(file_id: string%): bool
	%{
	bool result = file_mgr->EnableReassembly(file_id->CheckString());
	return new Val(result, TYPE_BOOL);
	%}

## :bro:see:`Files::disable_reassembly`.
function Files::__disable_reassembly%(file_id: string%): bool
	%{
	bool result = file_mgr->DisableReassembly(file_id->CheckString());
	return new Val(result, TYPE_BOOL);
	%}

## :bro:see:`Files::set_reassembly_buffer`.
function Files::__set_reassembly_buffer%(file_id: string, max: count%): bool
	%{
	bool result = file_mgr->SetReassemblyBuffer(file_id->CheckString(), max);
	return new Val(result, TYPE_BOOL);
	%}

## :bro:see:`Files::add_analyzer`.
function Files::__add_analyzer%(file_id: string, tag: Files::Tag, args: any%): bool
	%{

@@ -26,16 +47,6 @@ function Files::__add_analyzer%(file_id: string, tag: Files::Tag, args: any%): bool
	return new Val(result, TYPE_BOOL);
	%}

## :bro:see:`Files::add_analyzers_for_mime_type`.
function Files::__add_analyzers_for_mime_type%(file_id: string, mtype: string, args: any%): files_tag_set
	%{
	using BifType::Record::Files::AnalyzerArgs;
	RecordVal* rv = args->AsRecordVal()->CoerceTo(AnalyzerArgs);
	Val* analyzers = file_mgr->AddAnalyzersForMIMEType(file_id->CheckString(), mtype->CheckString(), rv);
	Unref(rv);
	return analyzers;
	%}

## :bro:see:`Files::remove_analyzer`.
function Files::__remove_analyzer%(file_id: string, tag: Files::Tag, args: any%): bool
	%{

@@ -60,13 +71,6 @@ function Files::__analyzer_name%(tag: Files::Tag%) : string
	return new StringVal(file_mgr->GetComponentName(tag));
	%}

## :bro:see:`Files::register_for_mime_type`.
function Files::__register_for_mime_type%(id: Analyzer::Tag, mt: string%) : bool
	%{
	bool result = file_mgr->RegisterAnalyzerForMIMEType(id->AsEnumVal(), mt);
	return new Val(result, TYPE_BOOL);
	%}

module GLOBAL;

## For use within a :bro:see:`get_file_handle` handler to set a unique
src/scan.l

@@ -26,6 +26,7 @@
#include "Reporter.h"
#include "RE.h"
#include "Net.h"
#include "Traverse.h"

#include "analyzer/Analyzer.h"
#include "broxygen/Manager.h"

@@ -615,11 +616,50 @@ void end_RE()
	BEGIN(INITIAL);
	}

class LocalNameFinder : public TraversalCallback {
public:
	LocalNameFinder()
		{}

	virtual TraversalCode PreExpr(const Expr* expr)
		{
		if ( expr->Tag() != EXPR_NAME )
			return TC_CONTINUE;

		const NameExpr* name_expr = static_cast<const NameExpr*>(expr);

		if ( name_expr->Id()->IsGlobal() )
			return TC_CONTINUE;

		local_names.push_back(name_expr);
		return TC_CONTINUE;
		}

	std::vector<const NameExpr*> local_names;
};

void do_atif(Expr* expr)
	{
	++current_depth;

	Val* val = expr->Eval(0);
	LocalNameFinder cb;
	expr->Traverse(&cb);
	Val* val = 0;

	if ( cb.local_names.empty() )
		val = expr->Eval(0);
	else
		{
		for ( size_t i = 0; i < cb.local_names.size(); ++i )
			cb.local_names[i]->Error("referencing a local name in @if");
		}

	if ( ! val )
		{
		expr->Error("invalid expression in @if");
		return;
		}

	if ( ! val->AsBool() )
		{
		if_stack.append(current_depth);
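The scan.l change above walks the @if expression with a LocalNameFinder and refuses to evaluate it when any non-global identifier is referenced. Here is a toy, self-contained version of that check; the Expr struct, make_name() helper and find_local_names() walker are invented for the example (Zeek's real traversal uses TraversalCallback over NameExpr nodes):

#include <iostream>
#include <memory>
#include <string>
#include <vector>

// Toy expression tree used only for this illustration.
struct Expr
	{
	std::string name;      // non-empty for identifier nodes
	bool is_global = true;
	std::vector<std::unique_ptr<Expr>> children;
	};

static std::unique_ptr<Expr> make_name(const std::string& n, bool global)
	{
	auto e = std::make_unique<Expr>();
	e->name = n;
	e->is_global = global;
	return e;
	}

// Collect every identifier in the tree that is not globally scoped.
static void find_local_names(const Expr& e, std::vector<std::string>& out)
	{
	if ( ! e.name.empty() && ! e.is_global )
		out.push_back(e.name);

	for ( const auto& c : e.children )
		find_local_names(*c, out);
	}

int main()
	{
	// Roughly "@if ( some_global && some_local )".
	Expr cond;
	cond.children.push_back(make_name("some_global", true));
	cond.children.push_back(make_name("some_local", false));

	std::vector<std::string> locals;
	find_local_names(cond, locals);

	if ( ! locals.empty() )
		{
		for ( const auto& n : locals )
			std::cerr << "error: referencing a local name in @if: " << n << "\n";
		return 1; // like do_atif(), refuse to evaluate the condition
		}

	// Otherwise the condition could be evaluated here.
	return 0;
	}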
@@ -5,6 +5,10 @@ all: make-verbose coverage

brief: make-brief coverage

distclean:
	@rm -f coverage.log
	$(MAKE) -C btest $@

make-verbose:
	@for repo in $(DIRS); do (cd $$repo && make -s ); done
testing/btest/Baseline/bifs.enum_to_int/out (new file, 6 lines)

@@ -0,0 +1,6 @@
A, 0
B, 1
C, 2
AV, 10
BV, 11
CV, 12
@@ -1,9 +1,9 @@
CUWkUyAuUGXfarKYeMETxOg
Ck6kgXLOoSKlnQcgTWjvg4c
Cj4u32Pc5bifTEfuqmmG4bh
Fj3nTWNjezo6G6xBmyo58Tf
Cj4u32Pc5bifTEfuqmmG4bh
F4VAnSiNGSQhKEoCPd4zuQd
CFrJExwHcSal5OKnoww6xl4
C3PKsZ2Uye21VW0XPVINV8a
FaJg8mtdsS86cWjSe4spPPl
C3PKsZ2Uye21VW0XPVINV8a
FvBr89nD30GgGAp3wgtm6qf

@@ -1,9 +1,9 @@
CUWkUyAuUGXfarKYeMETxOg
Ck6kgXLOoSKlnQcgTWjvg4c
Cj4u32Pc5bifTEfuqmmG4bh
Fj3nTWNjezo6G6xBmyo58Tf
Cj4u32Pc5bifTEfuqmmG4bh
F4VAnSiNGSQhKEoCPd4zuQd
CFrJExwHcSal5OKnoww6xl4
C3PKsZ2Uye21VW0XPVINV8a
FaJg8mtdsS86cWjSe4spPPl
C3PKsZ2Uye21VW0XPVINV8a
FvBr89nD30GgGAp3wgtm6qf

@@ -1,9 +1,9 @@
CXWv6p30
CCyvnA30
CjhGID40
F75yAm10
CjhGID40
FmGk6O30
CdfHBz20
CCvvfg30
Fuh3fj10
CCvvfg30
Ftwuyy30

@@ -1,9 +1,9 @@
CUWkUyAuUGXf0
CarKYeMETxOg0
Ck6kgXLOoSKl0
Fj3nTWNjezo60
Ck6kgXLOoSKl0
F4VAnSiNGSQh0
CnQcgTWjvg4c0
Cj4u32Pc5bif0
FaJg8mtdsS860
Cj4u32Pc5bif0
FvBr89nD30Gg0

@@ -1,9 +1,9 @@
CXWv6p3arKYeMETxOg
CjhGID4nQcgTWjvg4c
CCvvfg3TEfuqmmG4bh
F75yAm1G6xBmyo58Tf
CCvvfg3TEfuqmmG4bh
FmGk6O3KEoCPd4zuQd
CsRx2w45OKnoww6xl4
CRJuHdVW0XPVINV8a
Fuh3fj1cWjSe4spPPl
CRJuHdVW0XPVINV8a
Ftwuyy3GAp3wgtm6qf
@@ -1,5 +1,6 @@
2 1080
1 137
1 1434
1 161
1 162
1 1812

@@ -11,6 +12,7 @@
1 25
1 2811
1 3128
1 3306
1 3544
1 443
1 502

@@ -36,15 +38,14 @@
1 8000
1 8080
1 81
1 88
1 8888
1 989
1 990
1 992
1 993
1 995
47 and
46 or
47 port
32 tcp
15 udp
48 and
47 or
48 port
34 tcp
14 udp
@@ -3,7 +3,7 @@
#empty_field (empty)
#unset_field -
#path loaded_scripts
#open 2014-09-06-01-19-42
#open 2014-10-31-20-38-14
#fields name
#types string
scripts/base/init-bare.bro

@@ -73,6 +73,7 @@ scripts/base/init-bare.bro
build/scripts/base/bif/plugins/Bro_Login.functions.bif.bro
build/scripts/base/bif/plugins/Bro_MIME.events.bif.bro
build/scripts/base/bif/plugins/Bro_Modbus.events.bif.bro
build/scripts/base/bif/plugins/Bro_MySQL.events.bif.bro
build/scripts/base/bif/plugins/Bro_NCP.events.bif.bro
build/scripts/base/bif/plugins/Bro_NetBIOS.events.bif.bro
build/scripts/base/bif/plugins/Bro_NetBIOS.functions.bif.bro

@@ -114,4 +115,4 @@ scripts/base/init-bare.bro
build/scripts/base/bif/plugins/Bro_SQLiteWriter.sqlite.bif.bro
scripts/policy/misc/loaded-scripts.bro
scripts/base/utils/paths.bro
#close 2014-09-06-01-19-42
#close 2014-10-31-20-38-14

@@ -3,7 +3,7 @@
#empty_field (empty)
#unset_field -
#path loaded_scripts
#open 2014-09-06-01-20-32
#open 2014-10-31-20-38-48
#fields name
#types string
scripts/base/init-bare.bro

@@ -73,6 +73,7 @@ scripts/base/init-bare.bro
build/scripts/base/bif/plugins/Bro_Login.functions.bif.bro
build/scripts/base/bif/plugins/Bro_MIME.events.bif.bro
build/scripts/base/bif/plugins/Bro_Modbus.events.bif.bro
build/scripts/base/bif/plugins/Bro_MySQL.events.bif.bro
build/scripts/base/bif/plugins/Bro_NCP.events.bif.bro
build/scripts/base/bif/plugins/Bro_NetBIOS.events.bif.bro
build/scripts/base/bif/plugins/Bro_NetBIOS.functions.bif.bro

@@ -217,6 +218,9 @@ scripts/base/init-default.bro
scripts/base/protocols/modbus/__load__.bro
scripts/base/protocols/modbus/consts.bro
scripts/base/protocols/modbus/main.bro
scripts/base/protocols/mysql/__load__.bro
scripts/base/protocols/mysql/main.bro
scripts/base/protocols/mysql/consts.bro
scripts/base/protocols/pop3/__load__.bro
scripts/base/protocols/radius/__load__.bro
scripts/base/protocols/radius/main.bro

@@ -243,4 +247,4 @@ scripts/base/init-default.bro
scripts/base/misc/find-checksum-offloading.bro
scripts/base/misc/find-filtered-trace.bro
scripts/policy/misc/loaded-scripts.bro
#close 2014-09-06-01-20-32
#close 2014-10-31-20-38-48

@@ -21,6 +21,7 @@ known_services
loaded_scripts
modbus
modbus_register_change
mysql
notice
notice_alarm
packet_filter