Real-time Monitoring and Complex Event Processing in PUZZLE

  • March 31, 2022
  • 3 min read

One of the most crucial aspects of PUZZLE is its ability to aggregate, normalise and analyse logs in order to provide sophisticated security analytics to SMEs/MEs. There are three main approaches to security analytics: a) In Situ Analysis, b) Offline Analysis and c) Streaming/Real-Time Analysis. In In Situ Analysis, the logs are processed directly at their source. In Offline Analysis, the logs are centralised in a repository but are processed offline.

Finally, in Streaming Analysis, each log is treated as a timestamped event that is handled by a Complex Event Processing (CEP) engine; in this case, the inferencing logic is expressed in the form of CEP rules.

PUZZLE uses the Streaming approach, supported by Offline analysis. This choice provides several benefits. On the one hand, the streaming approach provides real-time insight into our applications and the overall system, and increases PUZZLE’s responsiveness in case of malfunctions, failures and security incidents.

An inference rule is evaluated practically in real time, as several streams can be combined and assessed together by the inference engine. On the other hand, streaming is complemented by offline analysis, in which data is indexed and kept available for future inspection (e.g., using Machine Learning techniques) to analyse incidents and patterns in a non-real-time fashion. The information obtained from the offline analysis is also used to enrich our CEP logic and enhance the PUZZLE security platform. Security analytics are thus provided in PUZZLE through a dedicated CEP pipeline, presented in Figure 1.

The first component of this pipeline, which receives and processes all the input provided by the logging agents, is the Context Broker. The broker is implemented using Apache Kafka, an open-source distributed event-streaming platform that functions here as a message broker.
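To illustrate how a logging agent might feed the Context Broker, here is a minimal sketch using Kafka’s standard Java producer API; the topic name, key and JSON payload are illustrative assumptions, not PUZZLE’s actual schema.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class LogAgentPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical topic and payload; PUZZLE's normalised log format may differ.
            String event = "{\"host\":\"web-01\",\"type\":\"FAILED_LOGIN\",\"ts\":1648720000}";
            producer.send(new ProducerRecord<>("monitoring-logs", "web-01", event));
        }
    }
}
```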

To analyse these streams and check the monitoring reports so that alerts are produced in real time, we use Kafka Streams, a lightweight client library for processing data stored in Kafka. Kafka Streams allows us to perform most SQL-style computations, to join different streams of data, and to apply windowing operations.
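A minimal sketch of such a topology is shown below; the topic names (monitoring-logs, security-alerts) and the brute-force threshold are illustrative assumptions, not PUZZLE’s actual configuration. It filters failed-login events, counts them per host over a one-minute tumbling window, and emits an alert when the count exceeds the threshold.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;
import java.time.Duration;
import java.util.Properties;

public class AlertTopology {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Hypothetical topics: raw monitoring events in, alerts out.
        KStream<String, String> logs = builder.stream("monitoring-logs");

        logs.filter((host, event) -> event.contains("FAILED_LOGIN"))
            .groupByKey()
            // Count failed logins per host over a one-minute tumbling window.
            .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
            .count()
            .toStream()
            // Illustrative threshold: more than 5 failures per window raises an alert.
            .filter((windowedHost, count) -> count > 5)
            .map((windowedHost, count) -> KeyValue.pair(
                    windowedHost.key(), "brute-force suspected: " + count + " failures"))
            .to("security-alerts", Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "puzzle-cep-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        new KafkaStreams(builder.build(), props).start();
    }
}
```

Windowed counting like this is where Kafka Streams’ stateful operators pay off: per-window state is kept in local state stores that are fault-tolerant because they are backed by Kafka itself.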

The resulting data is forwarded to our main CEP engine, Drools. Drools is an expert system whose rule engine implements an enhanced version of the Rete algorithm; in our case, it consumes the alerts produced by Kafka Streams and triggers the necessary reaction mechanisms. Like any other expert system, Drools consists of three major components: the working memory, the production memory and the result output interface (a.k.a. the Agenda).
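The sketch below shows how alerts could be handed to Drools as facts; it assumes a hypothetical Alert fact class and a session name (puzzle-session) defined in a kmodule.xml on the classpath, and illustrates the Drools API rather than PUZZLE’s actual integration code.

```java
package com.example.puzzle;

import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

// Hypothetical fact class matched by the DRL rule sketched further below.
class Alert {
    private final String host;
    private final String type;
    private final int count;

    Alert(String host, String type, int count) {
        this.host = host;
        this.type = type;
        this.count = count;
    }

    public String getHost() { return host; }
    public String getType() { return type; }
    public int getCount() { return count; }
}

public class AlertEvaluator {
    public static void main(String[] args) {
        KieServices ks = KieServices.Factory.get();
        // Loads the rules packaged on the classpath; the session name is illustrative.
        KieContainer container = ks.getKieClasspathContainer();
        KieSession session = container.newKieSession("puzzle-session");

        // In PUZZLE, this fact would be built from a Kafka Streams output record.
        session.insert(new Alert("web-01", "FAILED_LOGIN", 7)); // enters the working memory
        session.fireAllRules(); // Rete matching activates rules and puts them on the Agenda
        session.dispose();
    }
}
```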

The Working Memory contains the canonical events themselves, while the Production Memory contains the static rules, which in our case represent the actual conditions. These conditions are formally expressed using the official DRL grammar.
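As a sketch of what a condition in the DRL grammar looks like, the hypothetical rule below matches the Alert fact from the previous snippet; the rule name, constraint values and reaction are illustrative, not actual PUZZLE rules.

```drl
// repeated_failed_logins.drl — hypothetical rule, same package as the Alert fact
package com.example.puzzle

rule "Escalate repeated failed logins"
when
    // Condition: an Alert fact in the working memory satisfying these constraints
    $a : Alert( type == "FAILED_LOGIN", count > 5 )
then
    // Reaction mechanisms are application-specific; printing stands in for one here.
    System.out.println("Escalating alert for host " + $a.getHost());
end
```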

Author: Uni Systems
