This page was archived on 16th April, 2019 and is no longer actively maintained.

Complex Event Processing

This page has been archived, please visit the Streaming Analytics Platforms page for related content.

Complex event processing, also known as event, stream or event stream processing, is a technique for querying data before it is stored in a database or, in some cases, without it ever being stored at all.

If event processing involves tracking and analysing streams of data about things that happen (events) and deriving appropriate conclusions, then complex event processing, or CEP, deals in ‘complex events’: events derived from multiple source events (which may or may not come from the same source) that are combined to generate further downstream events or patterns. Complex events usually correspond to complicated but meaningful business events (such as opportunities or threats), with the implication that they will be responded to in something approaching real time. CEP is important where large volumes of data must be queried within a very short period of time, and some vendors (unsurprisingly) categorise it as a big data issue.
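
By way of illustration, the following minimal sketch (in Python, with invented event types, thresholds and window size rather than any vendor’s API) shows a complex event being derived from two simpler source events: a sharp price drop and a negative news item that co-occur within a short window.

```python
# A minimal sketch, assuming invented event types and thresholds (not any
# vendor's API): two independent source events - a sharp price drop and a
# negative news item - are correlated within a 60-second window, and only
# their combination produces a derived 'complex' event.

from collections import deque

WINDOW_SECONDS = 60

price_drops = deque()   # (timestamp, symbol) of significant price drops
news_alerts = deque()   # (timestamp, symbol) of negative news items

def expire(events, now):
    """Discard source events that have fallen outside the time window."""
    while events and now - events[0][0] > WINDOW_SECONDS:
        events.popleft()

def correlate(now, symbol):
    """Emit a derived event when both source events co-occur in the window."""
    expire(price_drops, now)
    expire(news_alerts, now)
    if any(s == symbol for _, s in price_drops) and \
       any(s == symbol for _, s in news_alerts):
        return {"event": "possible_threat", "symbol": symbol, "at": now}
    return None

def on_price(now, symbol, change_pct):
    if change_pct <= -5.0:
        price_drops.append((now, symbol))
    return correlate(now, symbol)

def on_news(now, symbol):
    news_alerts.append((now, symbol))
    return correlate(now, symbol)

# Neither source event alone is meaningful; the combination is.
print(on_news(10, "ACME"))            # None
print(on_price(30, "ACME", -6.2))     # {'event': 'possible_threat', ...}
```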

There is a secondary market in which complex event processing is used in conjunction with business process management to handle environments where a complex pattern of events may occur, but without the same requirement to handle high volumes.

Traditional query techniques involve storing the data and then running a query against it. However, ingesting and then storing the data takes time; when there are very large amounts of data to be processed and query latency requirements are very low, the overhead involved in storing the data first is too great. Complex event processing works by having the data pass through a query during the ingestion process, thereby providing much better performance.
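
The following minimal sketch contrasts the two approaches. The function names and registration mechanism are hypothetical; the point is simply that a standing (continuous) query is evaluated as each event arrives, rather than after a write to a database.

```python
# A minimal sketch, with hypothetical names: a 'standing' query is
# registered once and then evaluated against every event as it is
# ingested, so the answer is available before - or instead of - any
# write to a database.

standing_queries = []

def register(predicate, action):
    """Register a continuous query: a predicate plus what to do on a match."""
    standing_queries.append((predicate, action))

def ingest(event):
    """Evaluate every standing query as the event streams in."""
    for predicate, action in standing_queries:
        if predicate(event):
            action(event)
    # Persisting the event afterwards is optional - the query has already run.

register(lambda e: e["amount"] > 10_000,
         lambda e: print("large trade:", e))

ingest({"amount": 25_000, "symbol": "ACME"})   # fires at ingest time
ingest({"amount": 500, "symbol": "ACME"})      # no match, no action
```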

However, it is not always as simple as just passing the data through a query. It may be more a question of pattern recognition, whereby a series of events is correlated and together meets, or fails to meet, an expected pattern. For example, credit card fraud detection has been implemented using complex event processing, and the same is true of the identification of ‘low and slow’ attacks against corporate infrastructures.
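
As a minimal sketch of this kind of pattern matching, consider a simplified card-fraud rule: flag a card when two transactions from different countries arrive within a short interval. The rule and its thresholds are invented for illustration, not taken from any real fraud system.

```python
# A minimal sketch of pattern matching over an event stream, with an
# invented rule: flag a card if two transactions from different countries
# arrive within five minutes of each other.

from collections import defaultdict

MAX_GAP_SECONDS = 300
last_seen = defaultdict(dict)   # card -> {"ts": ..., "country": ...}

def on_transaction(card, ts, country):
    prev = last_seen[card]
    suspicious = bool(prev) and \
                 country != prev["country"] and \
                 ts - prev["ts"] <= MAX_GAP_SECONDS
    last_seen[card] = {"ts": ts, "country": country}
    return "FLAG" if suspicious else "OK"

print(on_transaction("4111", 0, "UK"))       # OK   - first sighting
print(on_transaction("4111", 120, "BR"))     # FLAG - two countries in 2 minutes
print(on_transaction("4111", 9000, "BR"))    # OK   - same country, big gap
```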

Complex event processing is widely used in capital markets for ‘black box’ trading algorithms and associated activities such as real-time exposure analysis. There is a growing market in the Telco sector, and there are a number of vendors who offer complex event processing to support Security Information and Event Management (SIEM). Fraud detection and prevention (for example, for credit card fraud) is another segment of particular interest. Beyond these specific markets there are a number of other use cases for this technology, but no recognised markets as such.

In general, complex event processing will be of interest to compliance and security personnel in particular industry verticals, as well as to CIOs and IT architects who need to resolve relevant big data issues.

Vendors of complex event processing products are still looking for more markets. Many of the use cases quoted by vendors are one-offs and, while interesting, do not represent deployments that are repeatable on a large scale. Telco is an exception to this rule.

Complex event processing is also under threat from other directions. For example, there are NoSQL-based deployments (eg using Cassandra) that rival the performance of complex event processing engines, at least where processing requirements are in the tens of thousands of events per second rather than hundreds of thousands. These will not be appropriate where combinations of events trigger new events, and complex event processing will be more suitable where you are looking for a specific pattern of events, such as credit card fraud patterns or low and slow attack patterns. Conversely, a NoSQL-based approach will be more suitable if those patterns are changeable and/or if you do need to store the data (for example, because you want to trend it in real time) and want a single product for that purpose rather than two.

Several new vendors have entered this market. Three of the most notable are SAS, SQLStream and Red Lambda. While the first two offer general-purpose products, the last is focused on SIEM. In addition, StreamBase has added a business intelligence front-end to its product (it had previously been focused on supporting the development of algorithms within capital markets). Darkstar, which entered the market in 2012, seems now to have disappeared from the radar, which will be a disappointment for its founder, whose previous venture into this market (Kaskad) met a similar fate.

SQLStream is interesting because, as its name suggests, it is strongly focused on SQL as a development environment. A number of the other products use proprietary languages for this purpose.
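
To illustrate the attraction of SQL in this context, a windowed query over a trade stream might be expressed declaratively. The syntax in the comment below is generic streaming-SQL pseudocode rather than SQLStream’s actual dialect, and the Python that follows is a hand-written equivalent of the same window logic.

```python
# The declarative version (generic streaming-SQL pseudocode, not
# SQLStream's actual dialect) might read:
#
#   SELECT symbol, AVG(price) OVER (RANGE INTERVAL '1' MINUTE)
#   FROM trades
#   WHERE price > 100
#
# The hand-written equivalent of the same window logic:

from collections import deque

WINDOW_SECONDS = 60
window = deque()   # (timestamp, price) pairs currently inside the window

def on_trade(ts, symbol, price):
    if price <= 100:
        return None                      # the WHERE clause
    window.append((ts, price))
    while window and ts - window[0][0] > WINDOW_SECONDS:
        window.popleft()                 # slide the window forward
    avg = sum(p for _, p in window) / len(window)
    return symbol, avg                   # the windowed SELECT

print(on_trade(0, "ACME", 110.0))    # ('ACME', 110.0)
print(on_trade(30, "ACME", 130.0))   # ('ACME', 120.0)
```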

Solutions

  • IBM
  • KX

These organisations are also known to offer solutions:

  • EsperTech
  • Event Zero
  • Informatica
  • iWay
  • Nastel
  • Odysseus
  • OneMarketData
  • Open PDC
  • Oracle
  • PowerStream
  • Red Lambda
  • SAP
  • SAS
  • SQLStream
  • Storm
  • StreamBase
  • TIBCO

Research

Software AG Apama and the Internet of Things

While this paper focuses on Software AG Apama, it is not a review of Apama per se, but rather of Software AG’s approach to IoT Analytics.