We need this to send opaque values through Broker, and we also leverage it for
cloning opaques. The serialization methods now produce Broker data
instances directly, and no longer go through the binary formatter.
Summary of the new API for types derived from OpaqueVal:
- Add DECLARE_OPAQUE_VALUE(<class>) to the class declaration
- Add IMPLEMENT_OPAQUE_VALUE(<class>) to the class' implementation file
- Implement these two methods, which are declared by the first macro (see the sketch after this list):
- broker::data DoSerialize() const
- bool DoUnserialize(const broker::data& data)
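As an illustration, here is a minimal sketch of what a derived type could look
like under this API. The macros and the two method signatures are the ones
listed above; the example class CounterVal, its count member, and the use of
broker::get_if are illustrative assumptions only, not code from this change:

    // CounterVal.h (hypothetical example type)
    #include "OpaqueVal.h"

    class CounterVal : public OpaqueVal {
    public:
        uint64_t count = 0;  // hypothetical state to round-trip
                             // (constructor/type bookkeeping omitted)

        // Declares DoSerialize()/DoUnserialize(), among other things.
        DECLARE_OPAQUE_VALUE(CounterVal)
    };

    // CounterVal.cc
    IMPLEMENT_OPAQUE_VALUE(CounterVal)

    broker::data CounterVal::DoSerialize() const
        {
        // Pack the value's state into a Broker data instance.
        return broker::data(count);
        }

    bool CounterVal::DoUnserialize(const broker::data& data)
        {
        // Rebuild the state; signal failure if the data doesn't match.
        auto c = broker::get_if<uint64_t>(&data);
        if ( ! c )
            return false;

        count = *c;
        return true;
        }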
This machinery should work correctly from dynamic plugins as well.
OpaqueVal also provides a default implementation of DoClone() that
goes through serialization. Derived classes can provide a more
efficient version if they want.
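Conceptually, that default clone is just a serialization round-trip. The
following is a rough sketch of the idea only; the real DoClone() signature is
not spelled out here, and NewInstance() is a hypothetical stand-in for creating
an empty value of the same dynamic type:

    // Conceptual sketch of "clone through serialization"; not the actual code.
    OpaqueVal* OpaqueVal::CloneThroughSerialization() const
        {
        OpaqueVal* copy = NewInstance();             // hypothetical helper
        if ( ! copy->DoUnserialize(DoSerialize()) )  // round-trip the state
            return nullptr;                          // serialization failed

        return copy;
        }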
The declaration of the "OpaqueVal" class has moved into the header
file "OpaqueVal.h", along with the new serialization infrastructure.
This breaks existing code that relies on the old location, but since the
API is changing anyway, that seems fine.
This adds an internal BiF
"Broker::__opaque_clone_through_serialization" that does what the name
says: deep-copying an opaque by serializing and then deserializing it. That
can be used to test the new functionality from btests.
Not quite done yet. TODO:
- Not all tests pass yet:
[ 0%] language.named-set-ctors ... failed
[ 16%] language.copy-all-opaques ... failed
[ 33%] language.set-type-checking ... failed
[ 50%] language.table-init-container-ctors ... failed
[ 66%] coverage.sphinx-zeekygen-docs ... failed
[ 83%] scripts.base.frameworks.sumstats.basic-cluster ... failed
(Some of the serialization may still be buggy.)
- Clean up the code a bit more.
Note - this compiles, but you cannot run Bro anymore - it crashes
immediately with a null-pointer access. The reason behind it is that the
clone functionality required by CompositeHash does not work anymore.
We do this by hashing values added to a BloomFilter a second time
with a stable hash seeded only by either the filter's name or the
global_hash_seed (or Bro's random() seed if neither is defined).
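A rough sketch of that seed-selection rule, with hypothetical helper and
parameter names and std::hash standing in for whatever stable string hash
the real code uses:

    #include <cstdint>
    #include <functional>
    #include <string>

    // Pick a stable seed for a Bloom filter's hashers (illustrative only).
    uint64_t bloom_filter_seed(const std::string& name,
                               const std::string& global_hash_seed,
                               uint64_t bro_random_seed)
        {
        if ( ! name.empty() )
            return std::hash<std::string>{}(name);              // per-filter seed

        if ( ! global_hash_seed.empty() )
            return std::hash<std::string>{}(global_hash_seed);  // site-wide seed

        return bro_random_seed;  // fall back to Bro's random() seed
        }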
I'm also adding a new BiF, bloomfilter_internal_state(), that returns a
string representation of a Bloom filter's current internal state. This
is solely for writing tests that check that the filters end up
consistent when seeded with the same value.
I'm moving the new files into a subdirectory probabilistic, and into a
corresponding namespace. We can later put code for the other
probabilistic data structures there as well.
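For example, the Bloom filter declaration would now live under the new
directory and inside the new namespace (path and contents are illustrative):

    // src/probabilistic/BloomFilter.h
    namespace probabilistic {

    class BloomFilter {
        // ...
    };

    } // namespace probabilistic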
* origin/topic/matthias/bloom-filter: (45 commits)
Implement and test Bloom filter merging.
Make hash functions equality comparable.
Make counter vectors mergeable.
Use half adder for bitwise addition and subtraction.
Fix and test counting Bloom filter.
Implement missing CounterVector functions.
Tweak hasher interface.
Add missing include for GCC.
Fix for unserialization error.
Small fixes and style tweaks.
Only serialize Bloom filter type if available.
Create hash policies through factory.
Remove lingering debug code.
Factor implementation and change interface.
Expose Bro's linear congruence PRNG as utility function.
H3 does not check for zero length input.
Support seeding for hashers.
Add utility function to access first random seed.
Update H3 documentation (and minor style nits.)
Make H3 seed configurable.
...