
EPTS Event Processing Symposium Snippets, Day 1

September 17, 2008 By brenda michelson

I’m at the 4th Event Processing Technical Symposium in Stamford, CT.  The meeting is run by the Event Processing Technical Society and co-located with Gartner’s Event Processing Summit.  There are about 60 of us here, mostly from vendors (IBM, Oracle, Streambase, Progress Apama, EventZero, Aleri, Tibco, Coral8, iLog, RuleML), but customers, researchers, consultants and analysts are also represented.  You could say it’s the usual suspects, led by Opher Etzion of IBM, Roy Schulte of Gartner and David Luckham.

Today is dedicated to the business and application side of event processing.  Tomorrow is dedicated to the technology and standards side.

Since I’m participating in the meeting, rather than just observing, I’m posting some snippets from the day rather than ‘live-blogging’.

The day opened with Ramin Marzabani, a VC investor, answering the question of why he is investing in EventZero, an event processing vendor.  In his comments, Ramin spoke of the characteristics of 21st century business, which he summed up as “complexity * velocity”.  He also spoke of business being about flows, and showed examples of visualizations, actual and futuristic, of how people want to receive and interact with information in real-time.

With this backdrop, Ramin indicated that a new operational layer is required to support these new applications, and that operational layer is event processing.  Relevant technologies in this space include: CEP/EP, ESP, EDA, Operational BI, BAM/BPM, and EPN.  According to Ramin, a key to success is “getting IT architects on board”.

Next up was the panel I participated in.  Our task was to discuss the event processing market from a business perspective, and specifically to answer the question “Is event processing hype, or the best invention since sliced bread?”

Tibco’s Alan Lundberg moderated the panel, which included Larry Fulton (Forrester), Ramin Marzabani (EventZero), Stephanie McReynolds (Oracle), John Partridge (Streambase) and me.

As panelists, we were in strong agreement that Event Processing isn’t overhyped, especially when compared to SOA, BPM & Web 2.0.  On that point, I showed a Google Trends chart from Monday:

[Image: Google Trends chart]

As for the ‘best thing ever’, we generally stayed away from this hyperbole and instead focused on market observations and real customer examples.  No customers were named in this exercise, though, because most organizations using event processing, especially CEP, keep their activities quiet to protect competitive advantage.

In discussion with the audience, we talked about the risk-reward of generating market hype, the need for precision in defining the event processing space, whether standards are required for broader adoption (mixed answers here), the readiness of event processing solutions for broader adoption (mixed answers here as well), and the relationship between event processing & data warehousing.  Essentially: what is the refresh rate on the data warehouse, and does that apply to the entire warehouse or only a portion of it?

This intersection/interplay of data warehousing, event processing, analytics, real-time, right-time, information visibility, info to the masses, etc. is something I find myself circling over and over.  I plan to carve out some research time for this.

All in all, an interesting conversation.  Lots of reality, lots of promise.

Event Processing in Algorithmic Trading, Robert Almgren, Quantitative Brokers Inc. and New York University

Following our panel, Robert Almgren spoke of his experience choosing and implementing CEP at a big sell-side firm.  What follows are the notes I took as he spoke.

Business: Agency algorithmic trading.  Execute trades for large institutional clients.  Why best execution is hard: large executions move the market, and the price moves while the trade is being executed.

Algo trading flow: the client (hedge fund, mutual fund) sends an order and execution parameters to the broker’s algo trading system, which splits the order into child orders and sends events back to the client: fill reports and execution quality analysis.
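
To make this flow concrete, here is a minimal Python sketch of the parent/child order split and the fill events flowing back to the client.  All class and field names (ParentOrder, ChildOrder, FillReport, max_child_size) are made up for illustration; they are not taken from Almgren’s system or any real trading API.

# Minimal sketch of the parent/child order flow described above.
# Names are illustrative only; no real trading API is implied.
from dataclasses import dataclass
from typing import List

@dataclass
class ParentOrder:
    symbol: str
    quantity: int          # total shares the client wants executed
    max_child_size: int    # execution parameter: cap on each child order

@dataclass
class ChildOrder:
    symbol: str
    quantity: int

@dataclass
class FillReport:
    symbol: str
    filled_quantity: int

def split_order(parent: ParentOrder) -> List[ChildOrder]:
    """Broker side: split the parent order into child orders for the market."""
    children, remaining = [], parent.quantity
    while remaining > 0:
        qty = min(parent.max_child_size, remaining)
        children.append(ChildOrder(parent.symbol, qty))
        remaining -= qty
    return children

def execute(children: List[ChildOrder]) -> List[FillReport]:
    """Pretend every child order fills completely; emit fill events back to the client."""
    return [FillReport(c.symbol, c.quantity) for c in children]

if __name__ == "__main__":
    order = ParentOrder(symbol="XYZ", quantity=10_000, max_child_size=3_000)
    fills = execute(split_order(order))
    print(len(fills), "child orders,", sum(f.filled_quantity for f in fills), "shares filled")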

How participants make money:

– Client (buy-side): investment decisions

– Broker (sell-side): execution quality (commissions)

– Exchanges: efficiency (transaction fees and market data fees)

– Hardware and Software vendors

Why algo?

– increasing technological capabilities

– sophistication and comfort of clients (confidence and trust)

– cost pressure on commissions

– regulatory changes

– market complexity

Overview of CEP Engine Buying Decision

– Led by business

– Cost of big evaluation (testing all products) is greater than cost of making a mistake

– Criteria: mainstream of product space, easy to program, system easy to work with, people easy to work with

– Didn’t need to change algorithm at run-time (during trading day)

Application Areas for CEP

1. Market Data Analytics – process the stream of market data to identify time-varying liquidity (see the sketch after these notes)

2. Smart Order Routing – split the order into child orders for best execution; trying to identify markets that have hidden liquidity; using a CEP product to analyze history

3. Trading Limit Checks – compliance checks

*CEP niches likely to be separate pieces — hard to replace entire systems

Challenge: not coding the algorithm, but getting the pieces to work together: market data in, algo decisions out to the trading engine.
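
To illustrate how these niches might hang together, here is a toy Python sketch of that plumbing: market data ticks in, a rolling liquidity analytic (area 1), a position-limit compliance check (area 3), and routing decisions out.  This is a plain-Python stand-in, not any vendor’s event processing language, and every name and threshold is invented for the example.

# Toy event-processing pipeline: analytics + limit check + routing glue.
# Plain Python stand-in; not any vendor's CEP product or language.
from collections import deque
from dataclasses import dataclass

@dataclass
class Tick:
    symbol: str
    price: float
    size: int  # shares quoted at this price

class LiquidityWindow:
    """Market data analytics: rolling sum of quoted size over the last N ticks."""
    def __init__(self, window: int = 20):
        self.sizes = deque(maxlen=window)

    def update(self, tick: Tick) -> int:
        self.sizes.append(tick.size)
        return sum(self.sizes)

def limit_check(child_qty: int, position: int, max_position: int) -> bool:
    """Compliance: block any child order that would breach the position limit."""
    return position + child_qty <= max_position

def route(ticks, child_qty=500, max_position=5_000):
    """Glue code: market data in, routing decisions out to the trading engine."""
    liquidity = LiquidityWindow()
    position = 0
    for tick in ticks:
        visible = liquidity.update(tick)
        # Send a child order only when enough liquidity is visible and limits allow.
        if visible >= 2 * child_qty and limit_check(child_qty, position, max_position):
            position += child_qty
            yield f"send {child_qty} {tick.symbol} (visible liquidity {visible})"

if __name__ == "__main__":
    feed = (Tick("XYZ", 10.0 + 0.01 * i, 200 + 10 * i) for i in range(50))
    for decision in route(feed):
        print(decision)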

Next up was a customer panel, moderated by David Luckham, featuring customers from defense, entertainment, finance and telecommunications.  The panelists were asked to share business scenarios for EP, ROI and what was missing in EP.  This group, with its high technical sophistication, gravitated toward the “what was missing” question.  Suggestions raised were event processing language formalization and in-memory caching to support analytical processing for multiple users.

After the presentations and panels, the EPTS working groups reported out on work on the Event Processing Glossary and Use Cases.  Amongst the use cases shared were fraud detection in a real-time betting market, information dissemination, first responder & crisis management, workplace safety monitoring and alternative trading markets (dark pools).  The group plans to publish the use cases, so I’ll post when that happens.


Filed Under: circuit, event driven architecture, event processing

