Spelling testing

* alphabet
* another
* associated
* avoiding
* base
* because
* constructors
* defining
* deterministic
* directly
* endlessly
* entity
* function
* indefinitely
* initial
* interpreter
* into
* modifying
* negotiate
* nonexistent
* observations
* occasional
* omission
* orphaned
* overridden
* passing
* primitive
* produces
* reassembly
* repository
* restore
* shouldn't
* something
* statement
* the
* therefore
* transferred
* uninitialized
* unsuccessful

Signed-off-by: Josh Soref <2119212+jsoref@users.noreply.github.com>
Josh Soref 2022-10-23 16:00:55 -04:00
parent f810f78e3e
commit 74af1ebe16
62 changed files with 207 additions and 199 deletions


@@ -1 +1,2 @@
+### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
 [[name=Broker::log_flush, times_called=2], [name=ChecksumOffloading::check, times_called=2], [name=NetControl::init, times_called=1], [name=analyzer_confirmation_info, times_called=1], [name=connection_established, times_called=1], [name=connection_state_remove, times_called=1], [name=file_new, times_called=1], [name=file_over_new_connection, times_called=1], [name=file_sniff, times_called=1], [name=file_state_remove, times_called=1], [name=filter_change_tracking, times_called=3], [name=get_file_handle, times_called=4], [name=http_begin_entity, times_called=2], [name=http_end_entity, times_called=2], [name=http_header, times_called=13], [name=http_message_done, times_called=2], [name=http_reply, times_called=1], [name=http_request, times_called=1], [name=net_done, times_called=1], [name=new_connection, times_called=1], [name=run_sync_hook, times_called=2], [name=zeek_done, times_called=1], [name=zeek_init, times_called=1]]


@@ -1,6 +1,8 @@
+### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
 Validation result: certificate has expired
 Validation result: ok
 Resulting chain:
 Fingerprint: 70829f77ff4b6e908324a3f4e1940fce6c489098, Subject: CN=www.tobu-estate.com,OU=Terms of use at www.verisign.com/rpa (c)05,O=TOBU RAILWAY Co.\,Ltd.,L=Sumida-ku,ST=Tokyo,C=JP
 Fingerprint: 5deb8f339e264c19f6686f5f8f32b54a4c46b476, Subject: CN=VeriSign Class 3 Secure Server CA - G3,OU=Terms of use at https://www.verisign.com/rpa (c)10,OU=VeriSign Trust Network,O=VeriSign\, Inc.,C=US
-Fingerprint: 4eb6d578499b1ccf5f581ead56be3d9b6744a5e5, Subject: CN=VeriSign Class 3 Public Primary Certification Authority - G5,OU=(c) 2006 VeriSign\, Inc. - For authorized use only,OU=VeriSign Trust Network,O=VeriSign\, Inc.,C=US
+Fingerprint: 32f30882622b87cf8856c63db873df0853b4dd27, Subject: CN=VeriSign Class 3 Public Primary Certification Authority - G5,OU=(c) 2006 VeriSign\, Inc. - For authorized use only,OU=VeriSign Trust Network,O=VeriSign\, Inc.,C=US
+Fingerprint: 742c3192e607e424eb4549542be1bbc53e6174e2, Subject: OU=Class 3 Public Primary Certification Authority,O=VeriSign\, Inc.,C=US


@@ -1,4 +1,4 @@
 ### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
-intial val, init
+initial val, init
 peer added
 updated val, newval


@@ -10,7 +10,7 @@
 []
-Yield type is documented/cross-referenced for primitize types.
+Yield type is documented/cross-referenced for primitive types.
 .. zeek:id:: test_vector1
 :source-code: <...>/vectors.zeek 14 14


@@ -1,2 +1,2 @@
 ### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
-fatal error in <...>/enum-nonexisting.zeek, line 4: unknown enum identifier "notexisting"
+fatal error in <...>/enum-nonexisting.zeek, line 4: unknown enum identifier "nonexistent"


@@ -2,7 +2,7 @@
 String!
 Count!
 Bool or address!
-Somethign else!
+Something else!
 Bool or address!
 n/a


@@ -1,2 +1,3 @@
+### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
 error in <...>/init-bare.zeek, line 1: dynamic plugin zeek::asciireader from directory <...>/build/ conflicts with built-in plugin Zeek::AsciiReader (-1.-1.0)
 fatal error in <...>/init-bare.zeek, line 1: aborting after plugin errors


@@ -1,132 +1,133 @@
-1529347003.888515 | HookUnprocessedPacket [ts=1529347003.888515 len=60]
-packet_not_processed: ts=1529347003.888515
-1529347003.889372 | HookUnprocessedPacket [ts=1529347003.889372 len=62]
-packet_not_processed: ts=1529347003.889372
-1529347003.890009 | HookUnprocessedPacket [ts=1529347003.890009 len=60]
-packet_not_processed: ts=1529347003.890009
-1529347003.890881 | HookUnprocessedPacket [ts=1529347003.890881 len=62]
-packet_not_processed: ts=1529347003.890881
-1529347003.891520 | HookUnprocessedPacket [ts=1529347003.89152 len=60]
-packet_not_processed: ts=1529347003.891520
-1529347003.892374 | HookUnprocessedPacket [ts=1529347003.892374 len=62]
-packet_not_processed: ts=1529347003.892374
-1529347003.893010 | HookUnprocessedPacket [ts=1529347003.89301 len=60]
-packet_not_processed: ts=1529347003.893010
-1529347003.893973 | HookUnprocessedPacket [ts=1529347003.893973 len=62]
-packet_not_processed: ts=1529347003.893973
-1529347003.894627 | HookUnprocessedPacket [ts=1529347003.894627 len=60]
-packet_not_processed: ts=1529347003.894627
-1529347003.895482 | HookUnprocessedPacket [ts=1529347003.895482 len=62]
-packet_not_processed: ts=1529347003.895482
-1529347003.896120 | HookUnprocessedPacket [ts=1529347003.89612 len=60]
-packet_not_processed: ts=1529347003.896120
-1529347003.896974 | HookUnprocessedPacket [ts=1529347003.896974 len=62]
-packet_not_processed: ts=1529347003.896974
-1529347003.897608 | HookUnprocessedPacket [ts=1529347003.897608 len=60]
-packet_not_processed: ts=1529347003.897608
-1529347003.898627 | HookUnprocessedPacket [ts=1529347003.898627 len=62]
-packet_not_processed: ts=1529347003.898627
-1529347003.899558 | HookUnprocessedPacket [ts=1529347003.899558 len=60]
-packet_not_processed: ts=1529347003.899558
-1529347003.900941 | HookUnprocessedPacket [ts=1529347003.900941 len=62]
-packet_not_processed: ts=1529347003.900941
-1529347003.901790 | HookUnprocessedPacket [ts=1529347003.90179 len=66]
-packet_not_processed: ts=1529347003.901790
-1529347003.902972 | HookUnprocessedPacket [ts=1529347003.902972 len=82]
-packet_not_processed: ts=1529347003.902972
-1529347003.903806 | HookUnprocessedPacket [ts=1529347003.903806 len=66]
-packet_not_processed: ts=1529347003.903806
-1529347003.904631 | HookUnprocessedPacket [ts=1529347003.904631 len=82]
-packet_not_processed: ts=1529347003.904631
-1529347003.905200 | HookUnprocessedPacket [ts=1529347003.9052 len=66]
-packet_not_processed: ts=1529347003.905200
-1529347003.905985 | HookUnprocessedPacket [ts=1529347003.905985 len=82]
-packet_not_processed: ts=1529347003.905985
-1529347003.906561 | HookUnprocessedPacket [ts=1529347003.906561 len=66]
-packet_not_processed: ts=1529347003.906561
-1529347003.907448 | HookUnprocessedPacket [ts=1529347003.907448 len=82]
-packet_not_processed: ts=1529347003.907448
-1529347003.908018 | HookUnprocessedPacket [ts=1529347003.908018 len=66]
-packet_not_processed: ts=1529347003.908018
-1529347003.908803 | HookUnprocessedPacket [ts=1529347003.908803 len=82]
-packet_not_processed: ts=1529347003.908803
-1529347003.909373 | HookUnprocessedPacket [ts=1529347003.909373 len=66]
-packet_not_processed: ts=1529347003.909373
-1529347003.910163 | HookUnprocessedPacket [ts=1529347003.910163 len=82]
-packet_not_processed: ts=1529347003.910163
-1529347003.910733 | HookUnprocessedPacket [ts=1529347003.910733 len=66]
-packet_not_processed: ts=1529347003.910733
-1529347003.911522 | HookUnprocessedPacket [ts=1529347003.911522 len=82]
-packet_not_processed: ts=1529347003.911522
-1529347003.912089 | HookUnprocessedPacket [ts=1529347003.912089 len=66]
-packet_not_processed: ts=1529347003.912089
-1529347003.912976 | HookUnprocessedPacket [ts=1529347003.912976 len=82]
-packet_not_processed: ts=1529347003.912976
-1529347003.913490 | HookUnprocessedPacket [ts=1529347003.91349 len=60]
-packet_not_processed: ts=1529347003.913490
-1529347003.914244 | HookUnprocessedPacket [ts=1529347003.914244 len=60]
-packet_not_processed: ts=1529347003.914244
-1529347003.914754 | HookUnprocessedPacket [ts=1529347003.914754 len=60]
-packet_not_processed: ts=1529347003.914754
-1529347003.915493 | HookUnprocessedPacket [ts=1529347003.915493 len=60]
-packet_not_processed: ts=1529347003.915493
-1529347003.916000 | HookUnprocessedPacket [ts=1529347003.916 len=60]
-packet_not_processed: ts=1529347003.916000
-1529347003.916730 | HookUnprocessedPacket [ts=1529347003.91673 len=60]
-packet_not_processed: ts=1529347003.916730
-1529347003.917240 | HookUnprocessedPacket [ts=1529347003.91724 len=60]
-packet_not_processed: ts=1529347003.917240
-1529347003.917969 | HookUnprocessedPacket [ts=1529347003.917969 len=60]
-packet_not_processed: ts=1529347003.917969
-1529347003.918497 | HookUnprocessedPacket [ts=1529347003.918497 len=60]
-packet_not_processed: ts=1529347003.918497
-1529347003.919336 | HookUnprocessedPacket [ts=1529347003.919336 len=60]
-packet_not_processed: ts=1529347003.919336
-1529347003.919847 | HookUnprocessedPacket [ts=1529347003.919847 len=60]
-packet_not_processed: ts=1529347003.919847
-1529347003.920577 | HookUnprocessedPacket [ts=1529347003.920577 len=60]
-packet_not_processed: ts=1529347003.920577
-1529347003.921087 | HookUnprocessedPacket [ts=1529347003.921087 len=60]
-packet_not_processed: ts=1529347003.921087
-1529347003.921815 | HookUnprocessedPacket [ts=1529347003.921815 len=60]
-packet_not_processed: ts=1529347003.921815
-1529347003.922337 | HookUnprocessedPacket [ts=1529347003.922337 len=60]
-packet_not_processed: ts=1529347003.922337
-1529347003.923067 | HookUnprocessedPacket [ts=1529347003.923067 len=60]
-packet_not_processed: ts=1529347003.923067
-1529347003.923524 | HookUnprocessedPacket [ts=1529347003.923524 len=60]
-packet_not_processed: ts=1529347003.923524
-1529347003.924192 | HookUnprocessedPacket [ts=1529347003.924192 len=70]
-packet_not_processed: ts=1529347003.924192
-1529347003.924644 | HookUnprocessedPacket [ts=1529347003.924644 len=60]
-packet_not_processed: ts=1529347003.924644
-1529347003.925420 | HookUnprocessedPacket [ts=1529347003.92542 len=70]
-packet_not_processed: ts=1529347003.925420
-1529347003.925870 | HookUnprocessedPacket [ts=1529347003.92587 len=60]
-packet_not_processed: ts=1529347003.925870
-1529347003.926550 | HookUnprocessedPacket [ts=1529347003.92655 len=70]
-packet_not_processed: ts=1529347003.926550
-1529347003.926999 | HookUnprocessedPacket [ts=1529347003.926999 len=60]
-packet_not_processed: ts=1529347003.926999
-1529347003.927662 | HookUnprocessedPacket [ts=1529347003.927662 len=70]
-packet_not_processed: ts=1529347003.927662
-1529347003.928108 | HookUnprocessedPacket [ts=1529347003.928108 len=60]
-packet_not_processed: ts=1529347003.928108
-1529347003.928773 | HookUnprocessedPacket [ts=1529347003.928773 len=70]
-packet_not_processed: ts=1529347003.928773
-1529347003.929220 | HookUnprocessedPacket [ts=1529347003.92922 len=60]
-packet_not_processed: ts=1529347003.929220
-1529347003.929885 | HookUnprocessedPacket [ts=1529347003.929885 len=70]
-packet_not_processed: ts=1529347003.929885
-1529347003.930352 | HookUnprocessedPacket [ts=1529347003.930352 len=60]
-packet_not_processed: ts=1529347003.930352
-1529347003.931118 | HookUnprocessedPacket [ts=1529347003.931118 len=70]
-packet_not_processed: ts=1529347003.931118
-1529347003.931567 | HookUnprocessedPacket [ts=1529347003.931567 len=60]
-packet_not_processed: ts=1529347003.931567
-1529347003.932231 | HookUnprocessedPacket [ts=1529347003.932231 len=70]
-packet_not_processed: ts=1529347003.932231
-1529347003.932477 | HookUnprocessedPacket [ts=1529347003.932477 len=60]
-packet_not_processed: ts=1529347003.932477
-1529347003.932971 | HookUnprocessedPacket [ts=1529347003.932971 len=60]
-packet_not_processed: ts=1529347003.932971
+### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=62]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=62]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=62]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=62]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=62]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=62]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=62]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=62]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=66]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=82]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=66]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=82]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=66]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=82]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=66]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=82]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=66]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=82]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=66]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=82]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=66]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=82]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=66]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=82]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=70]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=70]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=70]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=70]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=70]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=70]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=70]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=70]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX
+XXXXXXXXXX.XXXXXX | HookUnprocessedPacket [ts=XXXXXXXXXX.XXXXXX len=60]
+packet_not_processed: ts=XXXXXXXXXX.XXXXXX


@@ -1,3 +1,4 @@
+### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
 00000000 d4 c3 b2 a1 02 00 04 00 00 00 00 00 00 00 00 00 |................|
 00000010 00 24 00 00 01 00 00 00 bb fb 27 5b c3 8e 0d 00 |.$........'[....|
 00000020 3c 00 00 00 3c 00 00 00 a7 ab 16 9f 39 00 d4 f1 |<...<.......9...|


@@ -3,8 +3,8 @@ Input::EVENT_NEW line output (stderr=F): ../mydir:
 Input::EVENT_NEW line output (stderr=F): a
 Input::EVENT_NEW line output (stderr=F): b
 Input::EVENT_NEW line output (stderr=F): c
-Input::EVENT_NEW line output (stderr=T): <stderr output contained nonexistant>
-Input::EVENT_NEW line output (stderr=T): <stderr output contained nonexistant>
-Input::EVENT_NEW line output (stderr=T): <stderr output contained nonexistant>
+Input::EVENT_NEW line output (stderr=T): <stderr output contained nonexistent>
+Input::EVENT_NEW line output (stderr=T): <stderr output contained nonexistent>
+Input::EVENT_NEW line output (stderr=T): <stderr output contained nonexistent>
 End of Data event, input
 Process finished event, input, T


@@ -5,7 +5,7 @@ Host: 10.10.10.10 - num:1 - sum:5.0 - avg:5.0 - max:5.0 - min:5.0 - var:0.0 - st
 Host: 6.5.4.3 - num:2 - sum:6.0 - avg:3.0 - max:5.0 - min:1.0 - var:8.0 - std_dev:2.8 - unique:2 - hllunique:2
 Host: 7.2.1.5 - num:2 - sum:145.0 - avg:72.5 - max:91.0 - min:54.0 - var:684.5 - std_dev:26.2 - unique:2 - hllunique:2
 Performing first epoch, no observations
-Performing second epoch with overvations
+Performing second epoch with observations
 Sending ready for data
 epoch finished, F
 epoch finished, T


@@ -1,7 +1,7 @@
 ### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
 Performing first epoch, no observations
 epoch_finished
-Performing second epoch with overvations
+Performing second epoch with observations
 Host: 1.2.3.4 - num:5 - sum:221.0 - var:1144.2 - avg:44.2 - max:94.0 - min:5.0 - std_dev:33.8 - unique:4 - hllunique:4
 Host: 6.5.4.3 - num:1 - sum:2.0 - var:0.0 - avg:2.0 - max:2.0 - min:2.0 - std_dev:0.0 - unique:1 - hllunique:1
 Host: 7.2.1.5 - num:1 - sum:1.0 - var:0.0 - avg:1.0 - max:1.0 - min:1.0 - std_dev:0.0 - unique:1 - hllunique:1


@@ -1 +1,2 @@
+### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
 [svc_priority=1, target_name=.]


@@ -1 +1,2 @@
+### BTest baseline data generated by btest-diff. Do not edit. Use "btest -U/-u" to update. Requires BTest >= 0.63.
 [svc_priority=0, target_name=foo.example.com]


@@ -1,5 +1,5 @@
 These are the trace files that are used by the Zeek test suite.
-Note to maintainers: please take care when modyfing/removing files from here.
+Note to maintainers: please take care when modifying/removing files from here.
 We install these traces with the Zeek distribution and external packages might
 depend on them for tests.


@@ -7,10 +7,10 @@ global my_alphabet: string = "!#$%&/(),-.:;<>@[]^ `_{|}~abcdefghijklmnopqrstuvwx
 print decode_base64("YnJv");
 print decode_base64("YnJv", default_alphabet);
-print decode_base64("YnJv", ""); # should use default alpabet
+print decode_base64("YnJv", ""); # should use default alphabet
 print decode_base64("}n-v", my_alphabet);
 print decode_base64("YnJv");
 print decode_base64("YnJv", default_alphabet);
-print decode_base64("YnJv", ""); # should use default alpabet
+print decode_base64("YnJv", ""); # should use default alphabet
 print decode_base64("}n-v", my_alphabet);


@@ -7,7 +7,7 @@ global my_alphabet: string = "!#$%&/(),-.:;<>@[]^ `_{|}~abcdefghijklmnopqrstuvwx
 print encode_base64("bro");
 print encode_base64("bro", default_alphabet);
-print encode_base64("bro", ""); # should use default alpabet
+print encode_base64("bro", ""); # should use default alphabet
 print encode_base64("bro", my_alphabet);
 print encode_base64("padding");


@@ -46,7 +46,7 @@ event zeek_init()
 bloomfilter_add(bf_cnt, 0);
 print bloomfilter_lookup(bf_copy, 0);
 print bloomfilter_lookup(bf_copy, 42);
-# check that typefication transfered.
+# check that typefication transferred.
 bloomfilter_add(bf_copy, 0.5); # causes stderr output "error: incompatible Bloom filter types"
 print "============ Hashes";


@@ -48,7 +48,7 @@ event check_var()
 event zeek_init()
 {
-print "intial val", test_var;
+print "initial val", test_var;
 Broker::subscribe("zeek/ids");
 Broker::listen("127.0.0.1", to_port(getenv("BROKER_PORT")));
 }


@@ -1,5 +1,5 @@
 # This crashes with ZAM because it explicitly violates typing, which happens
-# to work in the intepreter, but isn't sound.
+# to work in the interpreter, but isn't sound.
 #
 # @TEST-REQUIRES: test "${ZEEK_ZAM}" != "1"
 #


@@ -1,5 +1,5 @@
 #
-# In "normal" test mode, connection uids should be determistic.
+# In "normal" test mode, connection uids should be deterministic.
 #
 # @TEST-EXEC: zeek -b -D -C -r $TRACES/wikipedia.trace %INPUT >output
 # @TEST-EXEC: btest-diff output


@@ -1,7 +1,7 @@
 # @TEST-EXEC: zeek -b %INPUT 2>&1 | grep -v termination | sort | uniq | wc -l | awk '{print $1}' >output
 # @TEST-EXEC: btest-diff output
-# In old version, the event would keep triggering endlessely, with the network
+# In old version, the event would keep triggering endlessly, with the network
 # time not moving forward and Zeek not terminating.
 #
 # Note that the output will not be 20 because we still execute two rounds


@@ -1,6 +1,6 @@
 #
-# This test procudes a recursive error: the error handler is itself broken. Rather
-# than looping indefinitly, the error inside the handler should reported to stderr.
+# This test produces a recursive error: the error handler is itself broken. Rather
+# than looping indefinitely, the error inside the handler should reported to stderr.
 #
 # @TEST-EXEC: zeek -b %INPUT >output 2>err
 # @TEST-EXEC: TEST_DIFF_CANONIFIER=$SCRIPTS/diff-remove-abspath btest-diff output


@@ -2,7 +2,7 @@
 # @TEST-EXEC: btest-diff out
 # In telecoms there is never a GTP tunnel within another GTP tunnel.
-# So if we find inside a GTP tunnel anohter IP/UDP packet with port 2152,
+# So if we find inside a GTP tunnel another IP/UDP packet with port 2152,
 # it is just a UDP packet, but not another GTP tunnel.
 event analyzer_violation(c: connection, atype: AllAnalyzers::Tag, aid: count, reason: string)


@@ -10,7 +10,7 @@ type TestRecord: record {
 field2: count;
 };
-## Yield type is documented/cross-referenced for primitize types.
+## Yield type is documented/cross-referenced for primitive types.
 global test_vector0: vector of string;
 ## Yield type is documented/cross-referenced for composite types.


@@ -34,7 +34,7 @@ global n = 0;
 function send_event()
 {
 local event_count = 1;
-# log fails to be looked up because of a missing print statment
+# log fails to be looked up because of a missing print statement
 # functions must have the same name on both ends of broker.
 local log : myfunctype = function(c: count) : function(d: count) : count
 {

@@ -18,7 +18,7 @@ type myrec: record {
function foo(mr: myrec) function foo(mr: myrec)
{ {
print "foo start"; print "foo start";
# Unitialized field access: unwind out of current event handler body # Uninitialized field access: unwind out of current event handler body
print mr$f; print mr$f;
# Unreachable # Unreachable
print "foo done"; print "foo done";

@@ -1,7 +1,7 @@
# @TEST-EXEC: zeek -b %INPUT >out # @TEST-EXEC: zeek -b %INPUT >out
# @TEST-EXEC: btest-diff out # @TEST-EXEC: btest-diff out
# All various container contructors should work at both global and local scope. # All various container constructors should work at both global and local scope.
global gt1: table[port] of count = table( [1/tcp] = 1, [2/tcp] = 2, [3/tcp] = 3 ); global gt1: table[port] of count = table( [1/tcp] = 1, [2/tcp] = 2, [3/tcp] = 3 );
global gs1: set[port] = set( 1/tcp, 2/tcp, 3/tcp ); global gs1: set[port] = set( 1/tcp, 2/tcp, 3/tcp );

@@ -42,7 +42,7 @@ event zeek_init()
bloomfilter_add(bf_cnt, 0); bloomfilter_add(bf_cnt, 0);
print bloomfilter_lookup(bf_copy, 0); print bloomfilter_lookup(bf_copy, 0);
print bloomfilter_lookup(bf_copy, 42); print bloomfilter_lookup(bf_copy, 42);
# check that typefication transfered. # check that typefication transferred.
bloomfilter_add(bf_copy, 0.5); # causes stderr output bloomfilter_add(bf_copy, 0.5); # causes stderr output
print "============ Hashes"; print "============ Hashes";

@@ -110,7 +110,7 @@ function compare_otr(a: TestRecord, b: TestRecord): bool
if ( same_object(a$i2, b$i2) ) if ( same_object(a$i2, b$i2) )
return F; return F;
# check that we restroe that i1 & i2 point to same object # check that we restore that i1 & i2 point to same object
if ( ! same_object(a$i1, a$i2) ) if ( ! same_object(a$i1, a$i2) )
return F; return F;
if ( ! same_object(b$i1, b$i2) ) if ( ! same_object(b$i1, b$i2) )
@@ -151,7 +151,7 @@ event zeek_init()
local pat1 = /.*PATTERN.*/; local pat1 = /.*PATTERN.*/;
local pat2 = copy(pat1); local pat2 = copy(pat1);
# patterns cannot be directoy compared # patterns cannot be directly compared
if ( same_object(pat1, pat2) ) if ( same_object(pat1, pat2) )
print "FAIL P1"; print "FAIL P1";
if ( ! ( pat1 == "PATTERN" ) ) if ( ! ( pat1 == "PATTERN" ) )

@@ -1,6 +1,6 @@
# @TEST-EXEC-FAIL: zeek -b %INPUT >output 2>&1 # @TEST-EXEC-FAIL: zeek -b %INPUT >output 2>&1
# @TEST-EXEC: TEST_DIFF_CANONIFIER="$SCRIPTS/diff-remove-abspath" btest-diff output # @TEST-EXEC: TEST_DIFF_CANONIFIER="$SCRIPTS/diff-remove-abspath" btest-diff output
redef enum notexisting += { redef enum nonexistent += {
This_Causes_a_Segfault This_Causes_a_Segfault
}; };

@@ -36,7 +36,7 @@ event zeek_init() &priority=-10
schedule 4sec { do_it() }; schedule 4sec { do_it() };
} }
# Test that re-defing a table with an expiry in a specific way # Test that re-defining a table with an expiry in a specific way
# does not crash Zeek; see GH-1687. # does not crash Zeek; see GH-1687.
global hosts: set[addr] &create_expire=1day &redef; global hosts: set[addr] &create_expire=1day &redef;

@@ -47,7 +47,7 @@ function make_lambda(start: count): function(): count
event zeek_init() &priority=10 event zeek_init() &priority=10
{ {
# just checking use of unitialized locals "works" (doesn't crash) # just checking use of uninitialized locals "works" (doesn't crash)
local one = make_lambda(1); local one = make_lambda(1);
local two = make_lambda(2); local two = make_lambda(2);
} }

@@ -92,7 +92,7 @@ event zeek_init()
print "expect [8, 16, 24]"; print "expect [8, 16, 24]";
print map_1(times_eight, test); print map_1(times_eight, test);
# things like this are only possible becuse we allow functions to # things like this are only possible because we allow functions to
# mutate their closures. # mutate their closures.
local thunder= make_dog("thunder", 10); local thunder= make_dog("thunder", 10);
thunder("get name", ""); thunder("get name", "");

@@ -14,7 +14,7 @@ function myfunc(rec: myrec)
event zeek_init() event zeek_init()
{ {
# Orhpaned fields in a record coercion reflect a programming error, like a typo, so should # Orphaned fields in a record coercion reflect a programming error, like a typo, so should
# be reported at parse-time to prevent unexpected run-time behavior. # be reported at parse-time to prevent unexpected run-time behavior.
local rec: myrec = [$a="test", $b=42, $wtf=1sec]; local rec: myrec = [$a="test", $b=42, $wtf=1sec];
print rec; print rec;

@@ -68,7 +68,7 @@ print fmt("Count %s: %d", c, |c|);
# type, so this wraps to a very large number. It may be more intuitive if it # type, so this wraps to a very large number. It may be more intuitive if it
# were to coerce to a signed integer, but it can also be more favorable to # were to coerce to a signed integer, but it can also be more favorable to
# simply have consistent behavior across arbitrary arithmetic expressions even # simply have consistent behavior across arbitrary arithmetic expressions even
# if that may result in occassional, unintended overflow/wrapping. # if that may result in occasional, unintended overflow/wrapping.
print fmt("Expr: %d", |5 - 9|); print fmt("Expr: %d", |5 - 9|);
# Same arithmetic on signed integers is likely what's originally intended. # Same arithmetic on signed integers is likely what's originally intended.
print fmt("Signed Expr: %d", |+5 - +9|); print fmt("Signed Expr: %d", |+5 - +9|);
@@ -101,7 +101,7 @@ print fmt("Record %s: %d", r, |r|);
# Size of set: returns number of elements in set. # Size of set: returns number of elements in set.
# Don't print the set, as its order depends on the seeding of the hash # Don't print the set, as its order depends on the seeding of the hash
# fnction, and it's not worth the trouble to normalize it. # function, and it's not worth the trouble to normalize it.
print fmt("Set: %d", |si|); print fmt("Set: %d", |si|);
# Size of string: returns string length. # Size of string: returns string length.

@@ -11,7 +11,7 @@ function switch_one(v: any): string
case type bool, type count: case type bool, type count:
return "Bool or address!"; return "Bool or address!";
default: default:
return "Somethign else!"; return "Something else!";
} }
} }

@@ -11,7 +11,7 @@ function switch_one(v: string): string
case type bool, type addr: case type bool, type addr:
return "Bool or address!"; return "Bool or address!";
default: default:
return "Somethign else!"; return "Something else!";
} }
} }

@@ -31,7 +31,7 @@ function switch_one(v: any)
break; break;
default: default:
print "Somethign else!"; print "Something else!";
break; break;
} }
} }

@@ -11,7 +11,7 @@ function switch_one(v: any): string
case type bool, type addr: case type bool, type addr:
return "Bool or address!"; return "Bool or address!";
default: default:
return "Somethign else!"; return "Something else!";
} }
} }

@@ -35,7 +35,7 @@ for ( i in t )
print t; print t;
for ( i in t ) for ( i in t )
# Trying to delete a non-existent element within in a loop does not # Trying to delete a nonexistent element within in a loop does not
# actually modify membership, so does not trigger a warning. # actually modify membership, so does not trigger a warning.
delete t[0]; delete t[0];
@@ -67,7 +67,7 @@ for ( n in s )
print s; print s;
for ( n in s ) for ( n in s )
# Trying to delete a non-existent element within in a loop does not # Trying to delete a nonexistent element within in a loop does not
# actually modify membership, so does not trigger a warning. # actually modify membership, so does not trigger a warning.
delete s[0]; delete s[0];

@@ -4,7 +4,7 @@
# The 'when' implementation historically performed an AST-traversal to locate # The 'when' implementation historically performed an AST-traversal to locate
# any index-expressions like `x[9]` and evaluated them so that it could # any index-expressions like `x[9]` and evaluated them so that it could
# register the assocated value as something for which it needs to receive # register the associated value as something for which it needs to receive
# "modification" notifications. # "modification" notifications.
# #
# Evaluating arbitrary expressions during an AST-traversal like that ignores # Evaluating arbitrary expressions during an AST-traversal like that ignores

@@ -15,13 +15,13 @@ public:
FOO_Analyzer(zeek::Connection* conn); FOO_Analyzer(zeek::Connection* conn);
virtual ~FOO_Analyzer(); virtual ~FOO_Analyzer();
// Overriden from Analyzer. // Overridden from Analyzer.
virtual void Done(); virtual void Done();
virtual void DeliverStream(int len, const u_char* data, bool orig); virtual void DeliverStream(int len, const u_char* data, bool orig);
virtual void Undelivered(uint64_t seq, int len, bool orig); virtual void Undelivered(uint64_t seq, int len, bool orig);
// Overriden from tcp::TCP_ApplicationAnalyzer. // Overridden from tcp::TCP_ApplicationAnalyzer.
virtual void EndpointEOF(bool is_orig); virtual void EndpointEOF(bool is_orig);
static zeek::analyzer::Analyzer* InstantiateAnalyzer(zeek::Connection* conn) static zeek::analyzer::Analyzer* InstantiateAnalyzer(zeek::Connection* conn)

@@ -1,4 +1,4 @@
# This used to crash the file reassemly code. # This used to crash the file reassembly code.
# #
# @TEST-EXEC: zeek -b -r $TRACES/http/byteranges.trace base/protocols/http base/files/hash frameworks/files/extract-all-files FileExtract::default_limit=4000 # @TEST-EXEC: zeek -b -r $TRACES/http/byteranges.trace base/protocols/http base/files/hash frameworks/files/extract-all-files FileExtract::default_limit=4000
# #

@@ -57,7 +57,7 @@ event Input::end_of_data(name: string, source:string)
++endcount; ++endcount;
# ... and when we're done, move to reading via events. # ... and when we're done, move to reading via events.
# This makes the reads sequential, avoding races in the output. # This makes the reads sequential, avoiding races in the output.
if ( endcount == 1 ) if ( endcount == 1 )
{ {
Input::add_event([$source="../input.log", $name="sshevent", $error_ev=handle_our_errors_event, $fields=Val, $want_record=T, $ev=line]); Input::add_event([$source="../input.log", $name="sshevent", $error_ev=handle_our_errors_event, $fields=Val, $want_record=T, $ev=line]);

@@ -58,7 +58,7 @@ event Input::end_of_data(name: string, source:string)
++endcount; ++endcount;
# ... and when we're done, move to reading via events. # ... and when we're done, move to reading via events.
# This makes the reads sequential, avoding races in the output. # This makes the reads sequential, avoiding races in the output.
if ( endcount == 1 ) if ( endcount == 1 )
{ {
Input::add_event([$source="../input.log", $name="sshevent", $error_ev=handle_our_errors_event, $fields=Val, $want_record=T, $ev=line]); Input::add_event([$source="../input.log", $name="sshevent", $error_ev=handle_our_errors_event, $fields=Val, $want_record=T, $ev=line]);

@@ -21,8 +21,8 @@ event line(description: Input::EventDescription, tpe: Input::Event, s: string, i
if ( is_stderr ) if ( is_stderr )
{ {
# work around localized error messages. and if some localization does not include the filename... well... that would be bad :) # work around localized error messages. and if some localization does not include the filename... well... that would be bad :)
if ( strstr(s, "nonexistant") > 0 ) if ( strstr(s, "nonexistent") > 0 )
line_output += "<stderr output contained nonexistant>"; line_output += "<stderr output contained nonexistent>";
else else
line_output += "<unexpected/weird error localization>"; line_output += "<unexpected/weird error localization>";
} }
@@ -61,7 +61,7 @@ event zeek_init()
}; };
outfile = open("../out"); outfile = open("../out");
Input::add_event([$source="ls ../mydir ../nonexistant ../nonexistant2 ../nonexistant3 |", Input::add_event([$source="ls ../mydir ../nonexistent ../nonexistent2 ../nonexistent3 |",
$reader=Input::READER_RAW, $name="input", $reader=Input::READER_RAW, $name="input",
$fields=Val, $ev=line, $want_record=F, $fields=Val, $ev=line, $want_record=F,
$config=config_strings, $mode=Input::STREAM]); $config=config_strings, $mode=Input::STREAM]);

@@ -82,7 +82,7 @@ event ready_for_data()
event second_test() event second_test()
{ {
print "Performing second epoch with overvations"; print "Performing second epoch with observations";
local ret = SumStats::next_epoch("test"); local ret = SumStats::next_epoch("test");
if ( ! ret ) if ( ! ret )
print "Return value false"; print "Return value false";

@@ -16,7 +16,7 @@ event second_test()
SumStats::observe("test.metric", [$host=6.5.4.3], [$num=2]); SumStats::observe("test.metric", [$host=6.5.4.3], [$num=2]);
SumStats::observe("test.metric", [$host=7.2.1.5], [$num=1]); SumStats::observe("test.metric", [$host=7.2.1.5], [$num=1]);
print "Performing second epoch with overvations"; print "Performing second epoch with observations";
local ret = SumStats::next_epoch("test"); local ret = SumStats::next_epoch("test");
if ( ! ret ) if ( ! ret )
print "Return value false"; print "Return value false";

@@ -36,7 +36,7 @@ event run_test()
} }
# Grumble grumble, ActiveHTTP actually joins away the \n characters # Grumble grumble, ActiveHTTP actually joins away the \n characters
# from the the response. Not sure how that's helpful. We simply # from the response. Not sure how that's helpful. We simply
# grep out the zeek_version_info{...} endpoint="..." pieces and # grep out the zeek_version_info{...} endpoint="..." pieces and
# expect one for each node to exist as a smoke test. # expect one for each node to exist as a smoke test.
local version_infos = find_all(response$body, /zeek_version_info\{[^}]+\}/); local version_infos = find_all(response$body, /zeek_version_info\{[^}]+\}/);

@@ -4,5 +4,5 @@
# This test is mainly checking the request_body_len field for correctness. # This test is mainly checking the request_body_len field for correctness.
# Historical versions of Zeek would mistakenly count the body-lengths of the # Historical versions of Zeek would mistakenly count the body-lengths of the
# multipart sub-entities twice: once upon the end of the sub-entity and then # multipart sub-entities twice: once upon the end of the sub-entity and then
# again upon the end of the top-level enitity that contains all sub-entities. # again upon the end of the top-level entity that contains all sub-entities.
# The size of just the top-level enitity is the correct one to use. # The size of just the top-level entity is the correct one to use.

@@ -1,5 +1,5 @@
# This test verifies that given the proper keytab file, the # This test verifies that given the proper keytab file, the
# Kerberos analyzer can open the AD ticket in the Negociate # Kerberos analyzer can open the AD ticket in the Negotiate
# Protocol Request and find the user. # Protocol Request and find the user.
# #
# @TEST-REQUIRES: grep -q "#define USE_KRB5" $BUILD/zeek-config.h # @TEST-REQUIRES: grep -q "#define USE_KRB5" $BUILD/zeek-config.h

@@ -1,6 +1,6 @@
# This test verifies that GSSAPI is correctly passing events to # This test verifies that GSSAPI is correctly passing events to
# the Kerberos analyzer. The specific trace example is a # the Kerberos analyzer. The specific trace example is a
# SMB authentication event and therfore relies on the SMB # SMB authentication event and therefore relies on the SMB
# analyzer as well. # analyzer as well.
# @TEST-EXEC: zeek -b -C -r $TRACES/krb/smb_gssapi.trace %INPUT # @TEST-EXEC: zeek -b -C -r $TRACES/krb/smb_gssapi.trace %INPUT

@@ -1,5 +1,5 @@
# The parser generated by BinPAC needs to handle this pcap without crashing # The parser generated by BinPAC needs to handle this pcap without crashing
# or asserting. Specifically, pasing Function Code 23, # or asserting. Specifically, passing Function Code 23,
# ReadWriteMultipleRegistersRequest, has a field: # ReadWriteMultipleRegistersRequest, has a field:
# #
# uint16[write_quantity] &length=write_byte_count; # uint16[write_quantity] &length=write_byte_count;

@@ -1,4 +1,4 @@
# This tests that successful/unsuccesful auth attempts get logged correctly # This tests that successful/unsuccessful auth attempts get logged correctly
# @TEST-EXEC: zeek -b -r $TRACES/mysql/auth.trace %INPUT # @TEST-EXEC: zeek -b -r $TRACES/mysql/auth.trace %INPUT
# @TEST-EXEC: btest-diff mysql.log # @TEST-EXEC: btest-diff mysql.log

@@ -112,7 +112,7 @@ event zeek_init()
ip = "2001:db8:0:0:0:FFFF:192.168.0.5"; ip = "2001:db8:0:0:0:FFFF:192.168.0.5";
print is_valid_ip(ip); print is_valid_ip(ip);
# hybrid ipv6-ipv4 address with zero ommission should work # hybrid ipv6-ipv4 address with zero omission should work
ip = "2001:db8::FFFF:192.168.0.5"; ip = "2001:db8::FFFF:192.168.0.5";
print is_valid_ip(ip); print is_valid_ip(ip);

@@ -34,7 +34,7 @@ event zeek_init()
Broker::subscribe(topic); Broker::subscribe(topic);
Broker::listen("127.0.0.1", to_port(getenv("BROKER_PORT"))); Broker::listen("127.0.0.1", to_port(getenv("BROKER_PORT")));
# Create a node that inherits basre mode from us. # Create a node that inherits base mode from us.
local sn = Supervisor::NodeConfig($name="inherit", $directory="inherit"); local sn = Supervisor::NodeConfig($name="inherit", $directory="inherit");
Supervisor::create(sn); Supervisor::create(sn);

@@ -66,7 +66,7 @@ function check_group_coverage {
done | sort | uniq) done | sort | uniq)
for i in $DIRS; do for i in $DIRS; do
# For elements in #src, we only care about the files direclty in the directory. # For elements in #src, we only care about the files directly in the directory.
if [[ "$i" = "#src" ]]; then if [[ "$i" = "#src" ]]; then
RUN=$(echo $(grep "$i#[^#]\+$" $DATA | grep "$SRC_FOLDER$i\|build$i" | cut -f 2) | tr " " "+" | bc) RUN=$(echo $(grep "$i#[^#]\+$" $DATA | grep "$SRC_FOLDER$i\|build$i" | cut -f 2) | tr " " "+" | bc)
TOTAL=$(echo $(grep "$i#[^#]\+$" $DATA | grep "$SRC_FOLDER$i\|build$i" | cut -f 3) | tr " " "+" | bc) TOTAL=$(echo $(grep "$i#[^#]\+$" $DATA | grep "$SRC_FOLDER$i\|build$i" | cut -f 3) | tr " " "+" | bc)

@@ -119,7 +119,7 @@ verify_run "which lcov" \
echo -n "Creating tracefile for output generation... " echo -n "Creating tracefile for output generation... "
verify_run "lcov --no-external --capture --directory . --output-file $COVERAGE_FILE" verify_run "lcov --no-external --capture --directory . --output-file $COVERAGE_FILE"
# 5. Remove a number of 3rdparty and "extra" files that shoudln't be included in the # 5. Remove a number of 3rdparty and "extra" files that shouldn't be included in the
# Zeek coverage numbers. # Zeek coverage numbers.
for TARGET in $REMOVE_TARGETS; do for TARGET in $REMOVE_TARGETS; do
echo -n "Getting rid of $TARGET files from tracefile... " echo -n "Getting rid of $TARGET files from tracefile... "

@@ -106,7 +106,7 @@ there. For each trace, you also need to calculate a checksum with ``md5sum`` and
put it into ``<url>.md5sum``. The scripts use this to decide if they need to put it into ``<url>.md5sum``. The scripts use this to decide if they need to
redownload the trace. Accordingly, if you update a trace, make sure to also redownload the trace. Accordingly, if you update a trace, make sure to also
recalculate its checksum. Note that the traces will be downloaded to ``Traces/`` recalculate its checksum. Note that the traces will be downloaded to ``Traces/``
but must not be added to the git repostiory; there's a ``.gitignore`` installed but must not be added to the git repository; there's a ``.gitignore`` installed
to prevent that. to prevent that.