
elemental links

brenda michelson's technology advisory practice

Rise of Event Processing / Active Information Picks

December 17, 2009 By brenda michelson

Earlier this month, I made my first and only prediction for 2010, that we would (finally) see the “Rise of Event Processing”.  Often, when I speak of Event Processing, I refer to a broader context of Active Information and (when relevant) the establishment of an Active Information Tier.  This past week, I read several articles that are applicable to the event processing / active information space.  I’ve included excerpts and links to 5 of those articles.  [Emphasis is my own.]

Briefly… I was particularly heartened to see the MIT Technology Review article dispel the notion that real-time, and the real-time web, is solely the domain of Twitter and related social media technologies.  The Economist SIS and Fast Company Corventis pieces highlight interesting sense, analyze and respond use cases in the real (physical) world.  The Carol Bartz piece, also in The Economist, discusses leadership traits in the age of information deluge.  Finally, the Progress Software predictions echo my sentiment in “Rise of Event Processing”, which is “You can’t change what you can’t see”.

Rise of Event Processing / Active Information Picks:

1. MIT Technology Review: Startups Mine the Real-Time Web – There’s more to it than microblog posts and social network updates.

“The "real-time Web" is a hot concept these days. Both Google and Microsoft are racing to add more real-time information to their search results, and a slew of startups are developing technology to collect and deliver the freshest information from around the Web.

But there’s more to the real-time Web than just microblogging posts, social network updates, and up-to-the-minute news stories. Huge volumes of data are generated, behind the scenes, every time a person watches a video, clicks on an ad, or performs just about any other action online. And if this user-generated data can be processed rapidly, it could provide new ways to tailor the content on a website, in close to real time.”

… “Richard Tibbetts, CTO of StreamBase, explains that financial markets make up about 80 percent of his company’s customers today. Web companies are just starting to adopt the technology.

"You’re going to see real-time Web mashups, where data is integrated from multiple sources," Tibbetts says. Such a mashup could, for example, monitor second-to-second fluctuations in the price of airline tickets and automatically purchase one when it falls below a certain price.”

… “Real-time applications, whether using traditional database technology or Hadoop, stand to become much more sophisticated going forward. "When people say real-time Web today, they have a narrow view of it–consumer applications like Twitter, Facebook, and a little bit of search," says StreamBase’s Tibbetts.”
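
Tibbetts’ airline-fare mashup is a useful mental model for event processing: a standing query over a stream, with an action attached. Below is a minimal Python sketch of that pattern. It is purely illustrative, not StreamBase’s API; the price feed, threshold, and purchase call are all hypothetical stand-ins.

```python
import random
import time

PRICE_THRESHOLD = 250.00  # hypothetical target fare, in dollars

def fare_stream():
    """Simulated real-time feed of fare quotes (stands in for a live API)."""
    while True:
        yield {"route": "BOS-SFO", "price": round(random.uniform(200, 400), 2)}
        time.sleep(0.1)

def purchase(quote):
    """Placeholder for a downstream booking-service call."""
    print(f"Buying {quote['route']} at ${quote['price']:.2f}")

# The "standing query": watch every event, act when the condition fires.
for quote in fare_stream():
    if quote["price"] < PRICE_THRESHOLD:
        purchase(quote)
        break
```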

2. The Economist: The World in 2010 – Big SIS (Societal Information-Technology Systems) is watching you

…“Thanks to Moore’s law (a doubling of capacity every 18 months or so), chips, sensors and radio devices have become so small and cheap that they can be embedded virtually anywhere. Today, two-thirds of new products already come with some electronics built in. By 2017 there could be 7 trillion wirelessly connected devices and objects—about 1,000 per person.

Sensors and chips will produce huge amounts of data. And IT systems are becoming powerful enough to analyse them in real time and predict how things will evolve. IBM has developed a technology it calls “stream computing”. Machines using it can analyse data streams from hundreds of sources, such as surveillance cameras and Wall Street trading desks, summarise the results and take decisions.

Transport is perhaps the industry in which the trend has gone furthest. Several cities have installed dynamic toll systems whose charges vary according to traffic flow. Drivers in Stockholm pay between $1.50 and $3 per entry into the downtown area. After the system—which uses a combination of smart tags, cameras and roadside sensors—was launched, traffic in the Swedish capital decreased by nearly 20%.

More importantly, 2010 will see a boom in “smart grids”. This is tech-speak for an intelligent network paralleling the power grid, and for applications that then manage energy use in real time. Pacific Gas & Electric, one of California’s main utilities, plans to install 10m “smart meters” to tell consumers how much they have to pay and, eventually, to switch off appliances during peak hours.

Smart technology is also likely to penetrate the natural environment. One example is the SmartBay project at Galway Bay in Ireland. The system there draws information from sensors attached to buoys and weather gauges and from text messages from boaters about potentially dangerous floating objects. Uses range from automatic alerts being sent to the harbourmaster when water levels rise above normal to fishermen selling their catch directly to restaurants, thus pocketing a better profit.

Yet it is in big cities that “smartification” will have the most impact. A plethora of systems can be made more intelligent and then combined into a “system of systems”: not just transport and the power grid, but public safety, water supply and even health care (think remote monitoring of patients). With the help of Cisco, another big IT firm, the South Korean city of Incheon aims to become a “Smart+Connected” community, with virtual government services, green energy services and intelligent buildings…”
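
The common thread in these examples is continuous analysis over sliding windows of sensor data, with the decision taken on the stream itself. As a toy illustration, here is the shape of a SmartBay-style water-level alert in Python; the window size, threshold, and notification are invented for the sketch.

```python
from collections import deque

WINDOW_SIZE = 12         # hypothetical: average over the last 12 readings
NORMAL_MAX_LEVEL = 2.5   # hypothetical "normal" water level, in metres

def on_reading(window, level, alert):
    """Maintain a sliding window of buoy readings; raise an alert
    when the windowed average rises above the normal range."""
    window.append(level)
    if len(window) == window.maxlen:
        avg = sum(window) / len(window)
        if avg > NORMAL_MAX_LEVEL:
            alert(avg)

window = deque(maxlen=WINDOW_SIZE)
notify = lambda avg: print(f"Alert harbourmaster: average level {avg:.2f} m")
for level in [2.1, 2.3, 2.6, 2.8, 3.0, 2.9, 3.1, 3.0, 2.8, 3.2, 3.1, 3.3]:
    on_reading(window, level, notify)
```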

3. Fast Company: Corventis’s PiiX Monitor Promises to Predict Heart Failure

… “The company’s first product, PiiX, is a wireless, water-resistant sensor that sticks to a patient’s chest like a large Band-Aid and monitors heart rate, respiratory rate, bodily fluids, and overall activity. It transmits the data to a central server for analysis and review by doctor and patient.

The basic technology platform has already received FDA approval, but Corventis envisions the PiiX as much more than a simple monitoring system. The company is working to generate algorithms that can predict, for instance, when a patient is on the verge of heart failure by comparing trends in his or her vital signs to other cases. "When you apply it in the real world, the algorithm begins to learn," says CEO Ed Manicka. "Not from 5 or 10 patients, but from hundreds of thousands of patients, as the system is applied across the planet."

… "What Corventis is trying to do is fundamentally create a new type of machine intelligence that serves to manage the patient’s overall health," he says. "It moves from the reactive approach of practicing medicine that is prevalent today to something that is much more proactive, preventative, and individualized."”

4. The Economist: The World in 2010 – Leadership in the information age, by Carol Bartz, CEO of Yahoo!

… “The second obligation that information creates for executives is to identify and mentor thought leaders. In the past, seeking out “high potential” employees typically meant looking for those who could climb the next rung of the management ladder. That remains important. But equally pressing is finding those employees who, though perhaps not the best managers, have the ability to digest and interpret information for others. Grooming these in-house ideas people helps foster a culture of openness to fresh thinking—the greatest energy an organisation can have.

The deluge of information is only going to rise. Leadership will increasingly mean leveraging that information, clarifying it, and using it to advance your strategy, engage customers and motivate employees. Business stakeholders are interested not only in your products and services, but also in your ideas.

So, welcome the information flood. Those who learn how to keep their head above it will be the most effective leaders.”

5. 2010: IT set to move from evolution to quiet revolution, predicts Progress Software

“Based on feedback from customers, as well as its own research and development, Progress Software sees five key technology trends set to shake up computing in 2010.

1. Real-time insight and business control will become a must-have, as organizations can ill-afford to lose money and customers through being slow to notice problems in delivery. In 2009, our research found that 67% of businesses only become aware of problems when customers report them. 80% of companies already have critical business events they need to monitor in real time. In 2010, insight into these events, powered by the right technology, will be essential to success.

2. Event-driven computing will accelerate, driven by business needs, and impacting both the way applications are built and how they are deployed in the enterprise. Architectures are increasingly being built around ‘events’, and this will increase to deal with both new sources of events appearing within the enterprise as well as external event sources from partners and customers.”
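
Building an architecture “around events” usually starts with a publish/subscribe core: producers emit events without knowing who consumes them, so new internal and external event sources can be added without touching the applications that react. A deliberately minimal Python sketch, with hypothetical event names:

```python
from collections import defaultdict

# Handlers register interest in an event type; publishers emit events
# without any knowledge of who is listening.
subscribers = defaultdict(list)

def subscribe(event_type, handler):
    subscribers[event_type].append(handler)

def publish(event_type, payload):
    for handler in subscribers[event_type]:
        handler(payload)

# Two independent reactions to the same business event.
subscribe("order.delayed", lambda e: print(f"Notify customer {e['customer']}"))
subscribe("order.delayed", lambda e: print(f"Escalate order {e['order_id']}"))

publish("order.delayed", {"order_id": "A-1029", "customer": "Acme Corp"})
```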

Filed Under: active information, business, event driven architecture, event processing, information strategies, trends

2010: The Rise of Event Processing

December 1, 2009 By brenda michelson

I don’t typically engage in predictions, but here’s mine for 2010, fresh from my tweet stream:

2010: Event Processing transcends niche status, to well-recognized & adopted business technique for real-time visibility & responsiveness.

I can list tons of reasons why, but it boils down to this:  you can’t change what you can’t see.

Filed Under: active information, business, business intelligence, event driven architecture, event processing, information strategies, innovation

Lessons from Googlenomics: Data Abundance, Insight Scarcity

June 29, 2009 By brenda michelson

“"What's ubiquitous and cheap?" [Google’s Hal] Varian asks. "Data." And what is scarce? The analytic ability to utilize that data.”

The June issue of Wired has an excellent article by Steven Levy, entitled Secret of Googlenomics: Data-Fueled Recipe Brews Profitability. The article delves into the history and algorithms behind Google’s auction-based ad system, highlighting the significance of engineering, mathematics, economics, and data mining in Google’s success.

On the economics front, the article explains Hal Varian’s role as Chief Economist at Google, including why Google needs a chief economist:

“The simplest reason is that the company is an economy unto itself. The ad auction, marinated in that special sauce, is a seething laboratory of fiduciary forensics, with customers ranging from giant multinationals to dorm-room entrepreneurs, all billed by the world's largest micropayment system.

Google depends on economic principles to hone what has become the search engine of choice for more than 60 percent of all Internet surfers, and the company uses auction theory to grease the skids of its own operations. All these calculations require an army of math geeks, algorithms of Ramanujanian complexity, and a sales force more comfortable with whiteboard markers than fairway irons.”

After reading the article, Varian’s economic view of data ubiquity and analytic scarcity really stuck with me.  The quote I opened the post with isn’t directed at software availability or processing power.  It refers to the scarcity of people qualified to churn abundant data into economic value.  

What follows are some excerpts “about harnessing supply and demand”.  The sub-headers and emphasis are mine.

Enter Econometricians

"The people working for me are generally econometricians—sort of a cross between statisticians and economists," says Varian, who moved to Google full-time in 2007 (he's on leave from Berkeley) and leads two teams, one of them focused on analysis.

"Google needs mathematical types that have a rich tool set for looking for signals in noise," says statistician Daryl Pregibon, who joined Google in 2003 after 23 years as a top scientist at Bell Labs and AT&T Labs. "The rough rule of thumb is one statistician for every 100 computer scientists."

Ubiquitous Data

“As the amount of data at the company's disposal grows, the opportunities to exploit it multiply, which ends up further extending the range and scope of the Google economy…

Keywords and click rates are their bread and butter. "We are trying to understand the mechanisms behind the metrics," says Qing Wu, one of Varian's minions. His specialty is forecasting, so now he predicts patterns of queries based on the season, the climate, international holidays, even the time of day. "We have temperature data, weather data, and queries data, so we can do correlation and statistical modeling," Wu says. The results all feed into Google's backend system, helping advertisers devise more-efficient campaigns.”

Continuous Analysis

“To track and test their predictions, Wu and his colleagues use dozens of onscreen dashboards that continuously stream information, a sort of Bloomberg terminal for the Googlesphere. Wu checks obsessively to see whether reality is matching the forecasts: "With a dashboard, you can monitor the queries, the amount of money you make, how many advertisers you have, how many keywords they're bidding on, what the rate of return is for each advertiser."”

Behavior-Based Insights

“Wu calls Google "the barometer of the world." Indeed, studying the clicks is like looking through a window with a panoramic view of everything. You can see the change of seasons—clicks gravitating toward skiing and heavy clothes in winter, bikinis and sunscreen in summer—and you can track who's up and down in pop culture. Most of us remember news events from television or newspapers; Googlers recall them as spikes in their graphs. "One of the big things a few years ago was the SARS epidemic," Tang says. Wu didn't even have to read the papers to know about the financial meltdown—he saw the jump in people Googling for gold. And since prediction and analysis are so crucial to AdWords, every bit of data, no matter how seemingly trivial, has potential value.”
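
Those “spikes in their graphs” are simple to detect mechanically: compare each day’s query volume to a trailing average. A toy sketch, with invented counts:

```python
def find_spikes(daily_counts, multiplier=3.0, lookback=7):
    """Return the indexes of days whose query volume jumps well above
    the trailing average: the spikes Googlers remember events by."""
    spikes = []
    for day in range(lookback, len(daily_counts)):
        trailing_avg = sum(daily_counts[day - lookback:day]) / lookback
        if daily_counts[day] > multiplier * trailing_avg:
            spikes.append(day)
    return spikes

# Hypothetical daily counts for the query "gold": the jump stands out.
counts = [100, 98, 105, 102, 97, 101, 99, 103, 100, 420, 390, 110]
print(find_spikes(counts))  # -> [9]
```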

Rise of the Datarati

“Varian believes that a new era is dawning for what you might call the datarati—and it's all about harnessing supply and demand. "What's ubiquitous and cheap?" Varian asks. "Data." And what is scarce? The analytic ability to utilize that data. As a result, he believes that the kind of technical person who once would have wound up working for a hedge fund on Wall Street will now work at a firm whose business hinges on making smart, daring choices—decisions based on surprising results gleaned from algorithmic spelunking and executed with the confidence that comes from really doing the math.”

Now, a few questions I think folks should consider:

  1. Who does that math in your organization? 
  2. Does your analytics / active information strategy suffer from information processing richness and insight scarcity?
  3. Who are, or should be, your datarati? 

Filed Under: active information, business, business intelligence, data science, information strategies, innovation, trends Tagged With: archive_0

Business-Driven Architecture Archives: Creating a Blended Architectural Portfolio

January 29, 2009 By brenda michelson

I published my first business-driven architecture paper on October 14, 2004, as a contributor to the Patricia Seybold Group. Today, I’m excerpting the introductory sections of that report. Why? For one, the content is still relevant. Two, the concepts presented here — combining architecture strategies, architecture realization, portfolio management, fluid enterprise — are foundational to my on-going work.

Since 2004, “process-based architecture” has evolved into business process management (BPM). Some of the grid constructs (“Grid-enhanced,” below) that I was intrigued by, and speak to later in the report, have evolved into virtualization and cloud computing.

Excerpt: Creating a Blended Architectural Portfolio, Brenda M. Michelson, Oct 14, 2004

Architecture Strategy Cornucopia: Exciting New Architecture Strategies

There are five compelling architecture strategies currently vying for corporate IT adoption. The front-runner is service-oriented architecture (SOA), but close on the heels of SOA are process-based architecture, event-driven architecture, Grid-enhanced architecture, and real-time/right-time architectures.

What I find interesting, and a bit perplexing, is that organizations believe they need to choose one of the strategies over the others. I refer to this phenomenon as “architecture zeal.”

Organizations are falling victim to architecture zeal—we just got SOA fever, or Grid fever, or business events fever. At the onset of zeal, everyone is happy, because there is a common architecture in place to advance the projects and the business solution portfolio. But then, sure enough, zeal fades. And the fade starts with the emergence of the 20 percent (or so)—those business problems not easily addressed with the chosen architecture strategy.

Sure, SOA is great in transaction-oriented scenarios, but it is not an efficient way (yet) to process an analytic request, which churns through volumes of data. Business events are a great way to inform and act on something notable, but when everything becomes an event, the overhead is untenable. Not only can the wrong architectural approach add extra expense and complexity to the problem at hand, it degrades the integrity of the architecture as a whole. Imagine if every sales transaction were a business event. How would we find the notable sales transactions—those that we want to act on differently? The sales for our best customers or sales with suspicious origins may completely escape our attention, because the business event pipeline is flooded with important, but not notable, things.
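
The remedy is to define notability explicitly, filtering at the edge so that ordinary transactions never enter the business event pipeline. A minimal sketch of such a filter, with invented rules and identifiers:

```python
BEST_CUSTOMERS = {"C-001", "C-042"}   # hypothetical customer segment
SUSPICIOUS_COUNTRIES = {"XX"}         # hypothetical risk rule
LARGE_ORDER_THRESHOLD = 10_000.00

def is_notable(sale):
    """Admit only sales worth acting on; everything else remains a
    plain transaction, not a business event."""
    return (sale["customer_id"] in BEST_CUSTOMERS
            or sale["ship_country"] in SUSPICIOUS_COUNTRIES
            or sale["amount"] >= LARGE_ORDER_THRESHOLD)

sales = [
    {"customer_id": "C-007", "ship_country": "US", "amount": 120.00},
    {"customer_id": "C-042", "ship_country": "US", "amount": 89.00},
    {"customer_id": "C-100", "ship_country": "XX", "amount": 45.00},
]
notable = [s for s in sales if is_notable(s)]
print(notable)  # only the best-customer and suspicious-origin sales remain
```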

As zeal fades, I see one of two things happening: Organizations declare the architecture a failure, or they start to bring in exception side architectures. The side architectures spring up in isolation, a project at a time, without consideration for others that may follow. What starts as one-size-fits-all ends up as one-size-for-many, with custom tailoring for the rest.

I believe this is a problem that can be easily avoided. Organizations shouldn’t be looking at these strategies in isolation; the strategies need to be considered collectively. The strategies should be mixed, as part of an architectural portfolio. Then we can select the right architectural strategy in each situation. But we shouldn’t stop there: a menu of strategies to choose from is merely the starting point; we need to take the next step.

We need to blend the strategies to work together, so we can seamlessly use different architectural strategies within the course of a business interaction. Now when our most important customer places an order, using our service-oriented Web site, the notable event not only informs us, but also invokes a promotions service, which tailors a special offer for that customer. We can send the customer this offer via email, or it can be available to her as a business-process-in-waiting, activated when she makes her next contact with us—in person, on the Web, or on the phone.
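
In code terms, the blend is an event handler that crosses strategy boundaries: the event-driven side detects the notable order, the service-oriented side computes the offer, and a delivery channel carries it to the customer. A small illustration, with all names hypothetical:

```python
def promotions_service(customer_id):
    """Stand-in for a real promotions service invoked via the SOA layer."""
    return {"customer_id": customer_id, "offer": "15% off next order"}

def on_notable_order(event, deliver):
    """Bridge between strategies: a notable business event triggers a
    service call, and the resulting offer goes to a delivery channel."""
    offer = promotions_service(event["customer_id"])
    deliver(offer)

# Delivery could equally be a queued business-process-in-waiting.
email = lambda offer: print(f"Email {offer['customer_id']}: {offer['offer']}")
on_notable_order({"customer_id": "C-042", "order_total": 14_500.00}, email)
```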

This is the true promise of the architecture strategies, used together to create what we call a “fluid enterprise.” In a fluid enterprise, lag time is squeezed, traditional organizational boundaries are dissolved, supply chains are optimized, information delivery is sped up, and attention is focused at the edges—where the customers are.

While this blended approach can bring great power to our businesses, it won’t help us one iota if it is executed poorly. We can’t take on this enterprise architectural blending activity with a traditional enterprise architecture mindset. We need more than a blueprint to make this happen—we need a realized architecture that can be easily used by projects. We need our architecture to be actionable.

But it isn’t just our architecture practices that need adjustment. We also need to think differently about our portfolios: business solution, information, and infrastructure. These new architecture strategies augment what is in place. Their power is in connecting and altering the behavior of existing assets. For example, as inventory is received in a warehouse, a content management system can automatically repost the product page for a product that had been out of stock. The assets in our portfolios are no longer sole-purposed applications or databases; they are also potential multiuse components and triggers to be exploited in the new architecture.

This is great, because we can leverage existing investments, but this can be problematic if our existing investments are poorly formed or underperforming. In that case, all is not lost, but some remediation will be in order. This remediation may include strengthening for performance, replacement of proprietary technology, dissection of monolithic code assets by business concept, or consolidating redundant code, databases, or infrastructure.

Business-Driven Architecture Approach

So to best serve our businesses, we really need to do three things. First, we need to combine these architecture strategies into a blended architecture, to bring new opportunity to our businesses.

Second, we need to shake up the practice of architecture in the enterprise, so architects can execute our blended architecture, taking it from the whiteboard all the way into production.

Third, we need to understand and manage the assets in all of our portfolios (business solution, information, and infrastructure) according to both their value to the business problem at hand, and to the portfolio in which they live. This will allow us to make better decisions as to the degree of architecture and development required as we introduce new assets and remediate existing ones.

To achieve this, we need a new approach to architecture in the enterprise. I believe architecture must have a strong bias to action, business opportunity, and project and portfolio advancement.


The solution is an approach I refer to as Business-Driven Architecture (BDA). BDA is based on the simple premise that “architecture is a means, not an end.” Business-Driven Architecture dictates the following:

• The goals of the business must drive the composition of the architecture.

• The architecture must be defined, realized, and consumed with a bias for action and a healthy dose of pragmatism.

• The architecture must provide tangible products and services to the builders of the business solution portfolio.

• The IT portfolios—and the individual code, information, and infrastructure assets within them—must be understood and managed according to their value.

• The links between projects, architecture, and portfolios are managed collectively using business discipline.

</Excerpt>

Filed Under: business-driven architecture, event driven architecture, information strategies, services architecture, soa Tagged With: archive_0

Sticky Quotes?

January 27, 2009 By brenda michelson

Are there certain quotes, passages, pictures, ideas that have stuck with you over time?  Perhaps they’ve informed, influenced or validated your work?  Either immediately, or after lingering in your background processing?  I certainly do.

The immediate impact ones you might expect for someone with a development and architecture background:

“There is no single development, in either technology or management technique, which by itself promises even one order-of-magnitude improvement within a decade in productivity, in reliability, in simplicity.” –Frederick P. Brooks, Jr., No Silver Bullet

“The people can shape buildings for themselves, and have done it for centuries, by using languages which I call pattern languages.  A pattern language gives each person who uses it the power to create an infinite variety of new and unique buildings, just as his ordinary language gives him the power to create an infinite variety of sentences.” –Christopher Alexander, The Timeless Way of Building

“Any intelligent fool can make things bigger, more complex, and more violent.  It takes a touch of genius–and a lot of courage–to move in the opposite direction.” —E.F. Schumacher

The I’m-more-right-brained-than-left one:

“Imagination is more important than knowledge.” —Albert Einstein

And many background lingerers.  Lately, the two pushing to the forefront as I work on my ‘active information tier’ concept are:

“Now when we speak of an information-rich world, we may expect, analogically, that the wealth of information means a dearth of something else — a scarcity of whatever it is that information consumes.  What information consumes is rather obvious: it consumes the attention of its recipients.  Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.” —Herbert Simon, Designing Organizations for an Information-Rich World (pdf)

“If killer apps will indeed emerge in nonobvious ways, the process can only be enhanced if applications are deployed in a context that provides more information than is actually needed for the application.  This will create opportunities to discover more important uses for the app than were originally intended…

It may soon no longer be possible for even gifted visionaries to imagine the next killer app. Extrapolation of the present will follow lines less straight and more recombinant than can be deciphered. In that case, we will need processes and technologies that will allow us to intelligently stumble upon the future.”  — Robert D. Austin and Richard L. Nolan, MIT Sloan Review, Summer 2005

And yes, some of these ideas are in conflict.  That, depending on your point of view, is either the frustrating or the interesting part.

Anyway, how about you?  What are your sticky quotes?  Leave a comment, post with a trackback, and/or tweet yours.

Filed Under: active information, enterprise architecture, information strategies
