This allows reading Zeek global variables from inside Spicy code. The main
challenge here is supporting all of Zeek's data types in a type-safe manner.

The most straightforward API is a set of functions `get_<type>(<id>)`, where
`<type>` is the Zeek-side type name (e.g., `count`, `string`, `bool`) and
`<id>` is the fully scoped name of the Zeek-side global (e.g.,
`MyModule::Boolean`). These functions return the corresponding Zeek value,
converted to an appropriate Spicy type. Example:

Zeek:

    module Foo;

    const x: count = 42;
    const y: string = "xxx";

Spicy:

    import zeek;

    assert zeek::get_count("Foo::x") == 42;
    assert zeek::get_string("Foo::y") == b"xxx"; # returns bytes(!)

For container types, the `get_*` function returns an opaque type that can be
used to access the container's values. An additional set of functions
`as_<type>` allows converting opaque values of atomic types to their Spicy
equivalents. Example:

Zeek:

    module Foo;

    const s: set[count] = { 1, 2 };
    const t: table[count] of string = { [1] = "One", [2] = "Two" };

Spicy:

    # Check set membership.
    local set_ = zeek::get_set("Foo::s");
    assert zeek::set_contains(set_, 1) == True;

    # Look up table element.
    local table_ = zeek::get_table("Foo::t");
    local value = zeek::table_lookup(table_, 1);
    assert zeek::as_string(value) == b"One";

There are also functions for accessing elements of Zeek-side vectors and
records (see the sketch at the end of this note).

If any of these `zeek::*` conversion functions fails (e.g., because no global
of that name exists), it throws an exception.

Design considerations:

- We support only reading Zeek variables, not writing. This both simplifies
  the API and, conceptually, avoids offering backdoors into Zeek state that
  could end up coupling Spicy and Zeek code very tightly.

- We accept that a single access might be relatively slow due to name lookup
  and data conversion. This is primarily meant for configuration-style data,
  not for transferring lots of dynamic state.

- In that spirit, we don't support deep-copying complex data types from Zeek
  over to Spicy. This is (1) to avoid performance problems when accidentally
  copying large containers over, potentially even on every access; and (2) to
  avoid the two sides getting out of sync if one ends up modifying a container
  without the other being able to see the change.
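The following sketch illustrates what access to vectors and records could look
like. The names `zeek::get_vector`, `zeek::vector_index`, `zeek::get_record`,
`zeek::record_field`, and `zeek::as_count` are assumptions that follow the
naming pattern described above; they are not spelled out in the text itself.

Zeek:

    module Foo;

    const v: vector of count = vector(10, 20, 30);

    type R: record {
        a: count;
        b: string;
    };

    const r: R = [$a = 1, $b = "one"];

Spicy:

    # Index into a Zeek-side vector (function names assumed, see above).
    local vec = zeek::get_vector("Foo::v");
    assert zeek::as_count(zeek::vector_index(vec, 1)) == 20;

    # Read a field of a Zeek-side record (function names assumed, see above).
    local rec = zeek::get_record("Foo::r");
    assert zeek::as_string(zeek::record_field(rec, "b")) == b"one";

As with sets and tables, the returned element values are opaque and need an
`as_<type>` conversion before they can be used as Spicy values.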
This is a test suite of small "unit tests" that verify individual pieces of
Zeek functionality. They all utilize BTest, a simple framework/driver for
writing unit tests. More information about BTest can be found at
https://github.com/zeek/btest

The test suite's BTest configuration is handled through the ``btest.cfg``
file. Of particular interest is the "TestDirs" setting, which specifies which
directories BTest will recursively search for test files.

Significant Subdirectories
==========================

* Baseline/

  Validated baselines for comparison against the output of each test on
  future runs. If the new output differs from the Baseline output, then the
  test fails.

* Traces/

  Packet captures utilized by the various BTest tests.

* scripts/

  This hierarchy of tests emulates the hierarchy of the Zeek scripts/
  directory.

* coverage/

  This collection of tests relates to checking whether we're covering
  everything we want to in terms of tests, documentation, and which scripts
  get loaded in different Zeek configurations. These tests are more prone to
  fail as new Zeek scripts are developed and added to the distribution -- an
  individual test's comments are the best place to look for details on what
  exactly it checks and for hints on how to fix it when it fails.

Running Tests
=============

Either use the ``make all`` or ``make brief`` ``Makefile`` targets, or run
``btest`` directly with desired options/arguments. Examples:

* btest <no arguments>

  If you simply execute btest in this directory with no arguments, then all
  directories listed as "TestDirs" in btest.cfg will be searched recursively
  for test files.

* btest <btest options> test_directory

  You can specify a directory on the command line to run just the tests
  contained in that directory. This is useful if you wish to run all of a
  given type of test without running every test there is. For example,
  "btest scripts" will run all of the Zeek script unit tests.

* btest <btest options> test_directory/test_file

  You can specify a single test file to run just that test. This is useful
  when testing a single failing test or when developing a new test.

Adding Tests
============

See either the `BTest documentation <https://github.com/zeek/btest>`_ or the
existing unit tests for examples of what they actually look like. The
essential components of a new test include:

* A test file in one of the subdirectories listed in the ``TestDirs`` of the
  ``btest.cfg`` file (a minimal sketch of such a file appears at the end of
  this README).

* If the unit test requires a known-good baseline output against which future
  tests will be compared (via ``btest-diff``), then that baseline output will
  need to live in the ``Baseline`` directory. Manually adding that is
  possible, but it's easier to just use the ``-u`` or ``-U`` options of
  ``btest`` to do it for you (using ``btest -d`` on a test for which no
  baseline exists will show you the output so it can be verified before
  adding/updating the baseline).

If you create a new top-level testing directory for collecting related tests,
then you'll need to add it to the list of ``TestDirs`` in ``btest.cfg``. Do
this only if your test really doesn't fit logically in any of the extant
directories.
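As an illustration only (this is not an existing test in the suite), a minimal
test file could look like the following. It uses the standard BTest
``@TEST-EXEC`` directives and the ``btest-diff`` comparison described above::

    # @TEST-EXEC: zeek -b %INPUT >output
    # @TEST-EXEC: btest-diff output

    # Print something deterministic so the baseline comparison stays stable.
    event zeek_init()
        {
        print fmt("2 + 2 = %d", 2 + 2);
        }

Running ``btest -d`` on such a file shows its output without requiring a
baseline; ``btest -u`` then records that output under ``Baseline/`` for
future comparisons.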