
elemental links

brenda michelson's technology advisory practice

Archives for September 2008

EPTS Event Processing Symposium, Live Blogging Day 2: Susan Urban

September 18, 2008 By brenda michelson

As I mentioned yesterday, today’s focus of the EPTS meeting is technology. The morning keynote is Susan Urban, Texas Tech University, Reactive Behavior and Data Management Research Lab. Susan’s background is in active database systems, and she is currently doing research in distributed event systems. Here are my notes from her talk:

Recap of Recent Activity in Event Processing

– David Luckham’s Pioneering Work, The Power of Events, Complex Event Processing Site

– Dagstuhl Event Processing Seminar, April 2007

– Event Processing Technical Society (that’s this group)

Driving Forces for Maturity of EP

– Autonomic computing / Ambient Intelligence

– RFID / Monitoring Applications

– Personalization / Mobile Devices

– Intelligence Applications / Asynchronous Programming (gaming)

The key word is “intelligence”: how to use events to create more intelligent applications that help us live better.

Some examples of these event-driven applications: business intelligence, autonomic computing, airline industry (air traffic, baggage, security), healthcare (epidemic, patient monitoring, RFID in-body monitoring), ambient intelligence (smart homes, smart cities, personalized mobile information systems), homeland security (storm evacuation, first responder systems, terrorism situational awareness, power / fuel / water system monitoring), environmental systems and more.

Research directions:

– language and semantics (event cloud, processing language)

– data mining, domain knowledge and agents (knowledge-intensive tasks: contextual reasoning, complex event processing agents that learn)

– modeling, validation and correctness (eliminating false positives/negatives) and reliability

– dashboards, GUIs and the human in the loop (HCI for monitoring, visualizing large volumes of complex events in enterprise applications)

– storage and distributed/parallel processing

– cloud computing (software, data, hardware and even events as a service)

– event security (event spam, malicious events, event viruses)

Event processing is not a paradigm shift; it is the way we currently live. “I want the world to respond to me when I walk down the street, drive my car, walk through my house, cruise the web, etc.”

Susan calls for more technical forums (industry and academia) on EP-related topics:

– data management

– AI

– Software Engineering & Formal Methods

– HCI

– Security & Information Assurance

– Sensor Networks

– Distributed, Parallel, and High Performance computing

This was an interesting talk. I enjoy hearing about the bigger picture, both the potential and the work still to be done. The laundry list of research items shouldn’t preclude folks from exploring event processing. But keep in mind there is still work to be done, especially if your problem domain is extremely complex.

Filed Under: circuit, event driven architecture, event processing

EPTS Event Processing Symposium Snippets, Day 1

September 17, 2008 By brenda michelson

I’m at the 4th Event Processing Technical Symposium in Stamford, CT. The meeting is run by the Event Processing Technical Society and co-located with Gartner’s Event Processing Summit. There are about 60 of us here, mostly from vendors (IBM, Oracle, StreamBase, Progress Apama, EventZero, Aleri, Tibco, Coral8, iLog, RuleML), but customers, researchers, consultants and analysts are represented as well. You could say it’s the usual suspects, led by Opher Etzion of IBM, Roy Schulte of Gartner and David Luckham.

Today is dedicated to the business and application side of event processing. Tomorrow covers the technology and standards side.

Since I’m participating in the meeting, rather than just observing, I’m posting some snippets from the day rather than ‘live-blogging’.

The day opened with Ramin Marzabani, VC investor, answering the question of why he is investing in EventZero, an event processing vendor. In his comments, Ramin spoke of the characteristics of 21st-century business, which he summed up as “complexity * velocity”. He also spoke of business being about flows, and showed examples of visualizations, actual and futuristic, of how people want to receive and interact with information in real-time.

With this backdrop, Ramin indicated that a new operational layer is required to support these new applications, and that operational layer is event processing. Relevant technologies in this space include: CEP/EP, ESP, EDA, Operational BI, BAM/BPM, and EPN. According to Ramin, a key to success is “getting IT architects on board”.

Next up was the panel I participated in. Our task was to discuss the event processing market from a business perspective, and specifically to answer the question “Is event processing hype or the best invention since sliced bread?”

Tibco’s Alan Lundberg moderated the panel of Larry Fulton (Forrester), Ramin Marzabani (EventZero), Stephanie McReynolds (Oracle), John Partridge (StreamBase) and me.

As panelists, we were in strong agreement that event processing isn’t over-hyped, especially when compared to SOA, BPM and Web 2.0. On that point, I showed a Google Trends chart from Monday:

[Google Trends chart from Monday: event processing compared with SOA, BPM and Web 2.0]

As for the ‘best thing ever’, we generally stayed away from this hyperbole and instead focused on market observations and real customer examples. No customers were named in this exercise, though, because most organizations using event processing, especially CEP, keep their activities quiet for competitive advantage.

In discussion with the audience, we talked about the risk-reward of generating market hype, the need for precision on the definition of the event processing space, whether standards are required for broader adoption (mixed answers here), the readiness of event processing solutions for broader adoption (mixed answers here as well) and the relationship between event processing and data warehousing. Essentially: what is the refresh rate on the data warehouse, and does that apply to the entire data warehouse or some portion?

This intersection/interplay of data warehousing, event processing, analytics, real-time, right-time, information visibility, info to the masses, etc. is something I find myself circling over and over. I plan to carve out some research time for this.

All in all, an interesting conversation. Lots of reality, lots of promise.

Event Processing in Algorithmic Trading, Robert Almgren, Quantitative Brokers Inc. and New York University

Following our panel, Robert Almgren spoke of his experience choosing and implementing CEP at a big sell-side firm. What follows are the notes I took as he spoke.

Business: agency algorithmic trading, executing trades for large institutional clients. Why best execution is hard: large executions move the market, and the price moves while the trade is executed.

Algo trading flow: the client (hedge fund, mutual fund) sends an order and execution parameters to the broker’s algo trading system, which splits the order into child orders and sends events back to the client with fill reports and execution quality analysis.
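To make the order-splitting step concrete, here’s a minimal sketch of a time-sliced (TWAP-style) split. The Order fields and the even split are my own illustration, not Almgren’s implementation; a real engine sizes child orders from liquidity and price-impact models.

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    quantity: int  # total shares to execute
    side: str      # "buy" or "sell"

def split_order(parent: Order, slices: int) -> list[Order]:
    """Split a parent order into evenly sized, time-sliced child orders.

    Illustrative only: a real algo trading system would size each child
    order from liquidity forecasts, not a simple even split.
    """
    base, remainder = divmod(parent.quantity, slices)
    children = []
    for i in range(slices):
        qty = base + (1 if i < remainder else 0)
        if qty > 0:
            children.append(Order(parent.symbol, qty, parent.side))
    return children

# Example: a 10,000-share buy split into 6 child orders
children = split_order(Order("UAL", 10_000, "buy"), slices=6)
print([c.quantity for c in children])  # [1667, 1667, 1667, 1667, 1666, 1666]
```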

How participants make money:

– Client (buy-side): investment decisions

– Broker (sell-side): execution quality (commissions)

– Exchanges: efficiency (transaction fees and market data fees)

– Hardware and Software vendors

Why algo?

– increasing technological capabilities

– sophistication and comfort of clients (confidence and trust)

– cost pressure on commissions

– regulatory changes

– market complexity

Overview of CEP Engine Buying Decision

– Led by business

– Cost of big evaluation (testing all products) is greater than cost of making a mistake

– Criteria: mainstream of product space, easy to program, system easy to work with, people easy to work with

– Didn’t need to change algorithm at run-time (during trading day)

Application Areas for CEP

1. Market Data Analytics – process stream of market data to identify time-varying liquidity

2. Smart Order Routing – split order to child orders for best execution; trying to identify markets that have hidden liquidity; using CEP product to analyze history

3. Trading Limit Checks – compliance checks (see the sketch after this list)

*CEP niches likely to be separate pieces — hard to replace entire systems
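Of the three application areas, the limit check is the simplest to illustrate. Here’s a minimal sketch of a pre-trade compliance check; the limits and field names are hypothetical, and a production check would also cover restricted lists, credit and regulatory rules.

```python
# Minimal sketch of a pre-trade limit check (application area 3).
# The limits and field names are hypothetical, for illustration only.

MAX_ORDER_QTY = 50_000       # per-order share limit
MAX_NOTIONAL = 5_000_000.0   # per-order dollar limit

def check_order(symbol: str, quantity: int, price: float) -> list[str]:
    """Return a list of limit violations; an empty list means the order may pass."""
    violations = []
    if quantity > MAX_ORDER_QTY:
        violations.append(f"{symbol}: quantity {quantity} exceeds {MAX_ORDER_QTY}")
    notional = quantity * price
    if notional > MAX_NOTIONAL:
        violations.append(f"{symbol}: notional {notional:,.2f} exceeds {MAX_NOTIONAL:,.2f}")
    return violations

print(check_order("UAL", 60_000, 12.50))
# ['UAL: quantity 60000 exceeds 50000']
```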

Challenge: not coding the algorithm, but getting the pieces to work together: market data in, algo decisions out to the trading engine.

Next up was a customer panel, moderated by David Luckham, featuring customers from defense, entertainment, finance and telecommunications. The panelists were asked to share business scenarios for EP, ROI and what was missing in EP. This group, with its high technical sophistication, gravitated towards the “what was missing” question. Suggestions raised were event processing language formalization and in-memory caching to support analytical processing for multiple users.

After the presentations and panels, the EPTS working groups reported on their work on the Event Processing Glossary and Use Cases. Amongst the use cases shared were fraud detection in a real-time betting market, information dissemination, first responder & crisis management, workplace safety monitoring and alternative trading markets (dark pools). The group plans to publish the use cases, so I’ll post when that happens.

Filed Under: circuit, event driven architecture, event processing

Links for September 12, 2008

September 12, 2008 By brenda michelson

The StreamBase Event Processing Blog: What the UAL Incident Teaches Us: Regulate News Market Data Sources

Mark Palmer on Regulation of information sources: “Here’s where regulators should step in: automated trading based on unregulated sources of information should be prohibited. It’s just not right, and the UAL debacle illustrates this perfectly. This “news” was 5 years old. Indeed, it was probably eliminated by most news processors as an outlier; but it only takes one to start a chain reaction. This “data” shouldn’t have entered any trading system in the first place. Dow Jones wouldn’t have put that story on the wire, nor would Reuters.” Good discussion in the comments.

Event Processing Thinking: On Occurrence time: a footnote to the UAL fiasco

Opher on Event Occurrence Time: “The works in the temporal area are talking about several time dimensions – the bi-temporal model talks about: transaction time — the time that a fact is recorded, and valid time — the time interval in which the fact is valid. In event processing we also look at a bi-temporal time similar to this: detection time — the time that the message that represents the event was detected by the processing system, and occurrence time — the time which the event happened in reality (occurrence time can be considered as the starting point of a valid time that ends when the event becomes irrelevant, but let’s get it out of the scope and concentrate in occurrence time).”…”Thinking about standard structures for events — I would think that having “standard header” with some mandatory properties for each event…the occurrence time should be a mandatory. Occurrence time has some inherent issues associated with it – but I’ll discuss it another time.”
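Opher’s “standard header” idea maps naturally to a small data structure. Here’s a minimal sketch, assuming Python dataclasses; the field names are my own illustration, not a proposed standard.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class EventHeader:
    """Illustrative 'standard header' with the two time dimensions Opher
    describes: when the event happened in reality (occurrence time), and
    when the processing system saw it (detection time)."""
    event_type: str
    occurrence_time: datetime  # mandatory: when it happened in reality
    detection_time: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    payload: dict[str, Any] = field(default_factory=dict)

# A December 2002 story detected in 2008: a large skew between the two
# times is exactly the signal the UAL incident failed to exploit.
stale = EventHeader(
    event_type="news.headline",
    occurrence_time=datetime(2002, 12, 1, tzinfo=timezone.utc),
    payload={"headline": "UAL Files for Bankruptcy"},
)
print((stale.detection_time - stale.occurrence_time).days > 365)  # True
```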

Microsoft joins OMG | InfoWorld | News | 2008-09-10 | By Paul Krill, InfoWorld

Microsoft joins OMG, (finally) supports UML, will bring BPMN to Visio: “As part of its strategy for model-driven software development, Microsoft on Wednesday announced it has joined the Object Management Group (OMG).”…”Microsoft views model-driven technologies as a main pillar of its “Dynamic IT” vision for aligning business and IT. Other pillars include service enablement, virtualization, and the user experience.”

The 10 Laws of Cloudonomics – GigaOM

Intro: “Public utility cloud services differ from traditional data center environments — and private enterprise clouds — in three fundamental ways. First, they provide true on-demand services, by multiplexing demand from numerous enterprises into a common pool of dynamically allocated resources. Second, large cloud providers operate at a scale much greater than even the largest private enterprises. Third, while enterprise data centers are naturally driven to reduce cost via consolidation and concentration, clouds — whether content, application or infrastructure — benefit from dispersion.” Followed by 10 laws of (groan) ‘Cloudonomics’ — bad name, good read.

Is the Cloud Right for You? Ask Yourself These 5 Questions – GigaOM

5 questions for enterprises on using the/a cloud: is demand constant? is growth predictable? can demand be shaped? where are the users? is the application interactive?

CFOs more optimistic about U.S. economy: survey: Financial News – Yahoo! Finance

“Chief financial officers are more optimistic about the direction of the U.S. economy, but remain concerned about consumer demand and weak credit markets, according to a quarterly survey.” “Compared with the previous quarter, 28.5 percent said they were more optimistic about the U.S. economy, up more than 7 percentage points from June, according to the Duke University/CFO Magazine survey of about 1,300 CFOs, including 524 from the United States. Those saying they are less optimistic fell to 41.5 percent, from more than half in the previous survey and more than 72 percent in March. About half expect the U.S. economy to begin recovering by the middle of 2009.” “Still, many finance chiefs say they are cutting plans for capital spending and employment and 43 percent say the credit crunch is directly hurting their company. CFOs named weak consumer demand as their most pressing concern, followed by credit markets and the cost of fuel, the survey found.”

User Experience: Learning from the Pros – ReadWriteWeb

Compelling user experience shouldn’t be limited to the web and consumer tech; enterprise users deserve good UX as well. ReadWriteWeb notes some UX resources to check out: “The initial User Experience has to be compelling or any new application is going to be passed up in favor of whatever shiny object is next in line. What’s a company to do? Luckily, there are people who specialize in the field of User Experience (UX) and many of them share their best practices freely. We see applications all the time that are based on a great idea but are poorly designed in a way that leaves us frustrated and unlikely to return as users. Below are some of our favorite resources for companies that want to smarten-up quickly about User Experience.”

Filed Under: links

UAL, Tribune, Google and the butterfly effect of a single click

September 11, 2008 By brenda michelson

This morning, the WSJ follows up on the GIGO-based UAL stock drop. This piece, ironically dated tomorrow, describes the Tribune’s side of the story: that a single click on the 2002 story was the butterfly, if you will, that led to the “computer glitch”. That glitch, from the Tribune’s point of view, began with Google’s web crawler:

“Tribune has offered details of the incident in pieces since Monday. In its latest explanation, Tribune said a single visit during a low-traffic period early Sunday morning pushed the undated story onto the list of most popular business news of its South Florida Sun-Sentinel newspaper’s Web site.

About 30 minutes after that visit, a user viewing a story about airline-cancellation policies during a storm-ravaged weekend clicked on the link for the old story. Seconds later, Google’s automated search agent, Googlebot, visited the Web site and found the story.

Soon after that, the story became available through Google News, and by Monday the article became more widely distributed to users of Bloomberg LP, the financial-news service widely watched on Wall Street.”

The story continues with Tribune pointing a finger at Google, and Google responding with a public statement:

“Tribune said it previously had identified problems with Google’s automated search service and had asked Google to stop trolling Tribune Web sites for inclusion in Google News.

“Despite the company’s earlier request and the confusion caused by Googlebot and Google News earlier this week, we believe that Googlebot continues to misclassify stories,” Tribune said.

Google spokesman Gabriel Stricker said in a statement: “The claim that the Tribune Company asked Google to stop crawling its newspaper Web sites is untrue.””

So, why did Google’s crawler pick up the results of this lone click? The Google News blog shares a chronology of events, including screenshots of the crawled pages. The summary (emphasis is mine):

“On Saturday, September 6th at 10:36 PM Pacific Daylight Time (or Sunday, September 7th at 1:36 AM Eastern Daylight Time), the Google crawler detected a new link on the Florida Sun-Sentinel’s website in a section of the most viewed stories labeled “Popular Stories: Business.” The link had newly appeared in that section since the last time Google News’ Googlebot webcrawler had visited the page (nineteen minutes earlier), so the crawler followed the link and found an article titled “UAL Files for Bankruptcy.” The article failed to include a standard newspaper article dateline, but the Sun-Sentinel page had a fresh date above the article on the top of the page of “September 7, 2008” (Eastern).

Because the Sun-Sentinel included a link to the story in its “Popular Stories” section, and provided a date on the article page of September 7, 2008, the Google News algorithm indexed it as a new story. We removed this story as soon as we were notified that it was posted in error.”

One lesson from this: make judicious use of metadata tagging in your content storage and publication.
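To make that lesson concrete, here’s a hypothetical sketch of the fallback logic at work: with no dateline in the article metadata, a crawler can only fall back to the page-level date, which is how a December 2002 story was indexed as news on September 7, 2008.

```python
from datetime import date

def resolve_story_date(article_meta: dict, page_date: date) -> tuple:
    """Hypothetical crawler heuristic: prefer an explicit article dateline,
    fall back to the date shown on the surrounding page."""
    if article_meta.get("dateline"):
        return article_meta["dateline"], "article dateline"
    return page_date, "page date (fallback)"

# The Sun-Sentinel page: no dateline on the old story, fresh date on the page
resolved, source = resolve_story_date(
    {"headline": "UAL Files for Bankruptcy", "dateline": None},
    page_date=date(2008, 9, 7),
)
print(resolved, "via", source)  # 2008-09-07 via page date (fallback)

# With an explicit dateline in the metadata (December 2002; exact day
# illustrative), the story resolves correctly and never looks like news.
resolved, source = resolve_story_date(
    {"headline": "UAL Files for Bankruptcy", "dateline": date(2002, 12, 1)},
    page_date=date(2008, 9, 7),
)
print(resolved, "via", source)  # 2002-12-01 via article dateline
```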

Filed Under: business, information strategies

UAL falls victim to GIGO processing at hyper-speed

September 10, 2008 By brenda michelson

Interesting piece in the WSJ today about “the computer glitch that cratered UAL’s stock yesterday”. Apparently, this was a case of GIGO processing at hyper-speed:

“…blame spread to the computers that robotically troll the Web for news stories and execute stock trades automatically.

An old article about UAL’s 2002 bankruptcy-court filing resurfaced Monday as an apparently fresh report on Google’s news service. Stock in the parent company of United Airlines quickly dropped to $3 a share from nearly $12.50 before the Nasdaq Stock Market halted trading and UAL issued a statement denying any fresh Chapter 11 filing.”

The evolving story on the glitch involves web-crawling, search engine optimization (SEO), syndication and complex event processing (CEP):

Web Crawling

“Google traces the appearance of the 2002 article in its search engine to a process that began late last Saturday night. At 10:36 p.m. PDT, Google’s “crawler” — the technology that finds Web pages — discovered a new link on the Web site of Tribune’s South Florida Sun-Sentinel newspaper in a section called “Popular Stories: Business.” The article — which didn’t carry a date but was published by the Chicago Tribune in December 2002 — hadn’t appeared there when Google’s crawler last visited the page at 10:17 p.m., the company said.”

Search Engine Optimization

“Amid serious storms in Florida and on the East Coast, Web surfers checking for news about travel delays may have stumbled onto the old UAL story by mistake, and a small number of fresh hits may have been enough to drive it onto the list. A Tribune spokesman declined to say how many hits the article received but said there was no indication of fraud.”

Syndication

“From the Sun-Sentinel site, the article became available through Google News service, accessible if a user searched for keywords like “United Airlines.” The article didn’t appear in any of the headlines on Google News’s home page, but it was picked up and sent via email to people who had created a custom Google News alert about UAL or related topics.

The stock market opened Monday with no drop in UAL shares, but the UAL story began circulating widely via a posting by research firm Income Securities Advisors Inc. that was made available to users of Bloomberg L.P., the financial-news service widely watched on Wall Street. Shortly after a headline from the outdated report flashed across Bloomberg screens at about 10:45 a.m., UAL shares began a precipitous drop. Over the next 15 minutes, before Nasdaq halted trading, they dropped as low as $3.”

Complex Event Processing | Algorithmic Trading

“The damage was exacerbated by the growing use on Wall Street of automated programs that trigger stock trades without any human interaction. The so-called algorithmic trading mechanisms, which buy and sell stocks based on news headlines and earnings data, were responsible for roughly a quarter of New York Stock Exchange trades in the last week of August.

Investors said simple human scrutiny would have indicated the UAL story was old, but computerized trading systems don’t make such determinations.

“A trader can pull back before proceeding, but some of these less sophisticated [automated trading systems] can’t do that,” said Bernie McSherry, a senior vice president with New York institutional brokerage Cuttone & Co.”

At the end of trading, UAL’s stock had regained much of its value, but the loss to traders and market value was still significant:

“UAL’s stock price ended Tuesday’s session at $10.60, down 2.8% on the day and nearly 13% off Monday’s open.”

As regular readers know, I’m a big proponent of automation, syndication and event processing — casting a wide information net to discover opportunities and threats, and reacting accordingly.  So, I’m not posting this to create FUD — I hate FUD. 

Instead, I am posting this to remind folks that shiny technology alone doesn’t guarantee better decisions.  Just faster decisions. 

Good decisions require good data, and for that, tried, true and boring concepts like data quality and source checking remain imperative. Oh, and when something seems too good or bad to be true, add a little human scrutiny.
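In event processing terms, some of that human scrutiny can be approximated in code. Here’s a minimal, hypothetical sketch of a staleness and plausibility gate in front of an automated trading rule; the thresholds are illustrative only.

```python
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(hours=24)  # hypothetical freshness threshold
MAX_MOVE = 0.25                # flag implied moves over 25% for human review

def should_auto_trade(occurrence_time: datetime,
                      detection_time: datetime,
                      implied_price_move: float) -> bool:
    """Gate automated action on headline events.

    Returns False (route to a human) when the story is stale or the
    implied move is too large to act on without scrutiny.
    """
    if detection_time - occurrence_time > MAX_AGE:
        return False  # stale: the UAL story was roughly six years old
    if abs(implied_price_move) > MAX_MOVE:
        return False  # too good or bad to be true: ask a human
    return True

detected = datetime(2008, 9, 8, 14, 45, tzinfo=timezone.utc)
occurred = datetime(2002, 12, 1, tzinfo=timezone.utc)  # December 2002 story
print(should_auto_trade(occurred, detected, implied_price_move=-0.76))  # False
```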

Filed Under: business, business-technology, event driven architecture
