Earlier this month, I made my first (and only) prediction for 2010: that we would (finally) see the “Rise of Event Processing”. Often, when I speak of Event Processing, I refer to the broader context of Active Information and (when relevant) the establishment of an Active Information Tier. This past week, I read several articles relevant to the event processing / active information space. I’ve included excerpts and links to five of those articles. [Emphasis is my own.]
Briefly… I was particularly heartened to see the MIT Technology Review article dispel the notion that real-time, and the real-time web, is solely the domain of Twitter and related social media technologies. The Economist SIS and Fast Company Corventis pieces highlight interesting sense-analyze-respond use cases in the real (physical) world. The Carol Bartz piece, also in The Economist, discusses leadership traits in the age of information deluge. Finally, the Progress Software predictions echo my sentiment in “Rise of Event Processing”: “You can’t change what you can’t see.”
Rise of Event Processing / Active Information Picks:
“The "real-time Web" is a hot concept these days. Both Google and Microsoft are racing to add more real-time information to their search results, and a slew of startups are developing technology to collect and deliver the freshest information from around the Web.
But there’s more to the real-time Web than just microblogging posts, social network updates, and up-to-the-minute news stories. Huge volumes of data are generated, behind the scenes, every time a person watches a video, clicks on an ad, or performs just about any other action online. And if this user-generated data can be processed rapidly, it could provide new ways to tailor the content on a website, in close to real time.”
… “Richard Tibbetts, CTO of StreamBase, explains that financial markets make up about 80 percent of his company’s customers today. Web companies are just starting to adopt the technology.
"You’re going to see real-time Web mashups, where data is integrated from multiple sources," Tibbetts says. Such a mashup could, for example, monitor second-to-second fluctuations in the price of airline tickets and automatically purchase one when it falls below a certain price.”
… “Real-time applications, whether using traditional database technology or Hadoop, stand to become much more sophisticated going forward. ‘When people say real-time Web today, they have a narrow view of it: consumer applications like Twitter, Facebook, and a little bit of search,’ says StreamBase’s Tibbetts.”
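Tibbetts’s airline-ticket mashup boils down to a classic event-processing rule: watch a stream of price events and fire an action the moment a value crosses a threshold. Here is a minimal sketch of that pattern in Python; all names and numbers are my own illustrations, not StreamBase’s API (a real CEP engine would express this rule declaratively over a live feed):

```python
# Sketch of the airfare example: fire a buy action the first time a
# streamed price drops below a threshold. Names/values are hypothetical.

def watch_fares(price_events, threshold, buy):
    """Consume (route, price) events; call buy() on the first price below threshold."""
    for route, price in price_events:
        if price < threshold:
            buy(route, price)
            return (route, price)   # stop after the first match
    return None

# Usage: a toy stream of second-to-second fare updates.
stream = [("SFO-JFK", 342.0), ("SFO-JFK", 311.5), ("SFO-JFK", 289.0), ("SFO-JFK", 295.0)]
purchases = []
watch_fares(stream, threshold=300.0, buy=lambda route, price: purchases.append((route, price)))
```

The interesting design point is that the logic is driven by the arrival of events, not by polling a database — exactly the inversion the article is describing.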
2. The Economist: The World in 2010 – Big SIS (Societal Information-Technology Systems) is watching you
…“Thanks to Moore’s law (a doubling of capacity every 18 months or so), chips, sensors and radio devices have become so small and cheap that they can be embedded virtually anywhere. Today, two-thirds of new products already come with some electronics built in. By 2017 there could be 7 trillion wirelessly connected devices and objects—about 1,000 per person.
Sensors and chips will produce huge amounts of data. And IT systems are becoming powerful enough to analyse them in real time and predict how things will evolve. IBM has developed a technology it calls “stream computing”. Machines using it can analyse data streams from hundreds of sources, such as surveillance cameras and Wall Street trading desks, summarise the results and take decisions.
Transport is perhaps the industry in which the trend has gone furthest. Several cities have installed dynamic toll systems whose charges vary according to traffic flow. Drivers in Stockholm pay between $1.50 and $3 per entry into the downtown area. After the system—which uses a combination of smart tags, cameras and roadside sensors—was launched, traffic in the Swedish capital decreased by nearly 20%.
More importantly, 2010 will see a boom in “smart grids”. This is tech-speak for an intelligent network paralleling the power grid, and for applications that then manage energy use in real time. Pacific Gas & Electric, one of California’s main utilities, plans to install 10m “smart meters” to tell consumers how much they have to pay and, eventually, to switch off appliances during peak hours.
Smart technology is also likely to penetrate the natural environment. One example is the SmartBay project at Galway Bay in Ireland. The system there draws information from sensors attached to buoys and weather gauges and from text messages from boaters about potentially dangerous floating objects. Uses range from automatic alerts being sent to the harbourmaster when water levels rise above normal to fishermen selling their catch directly to restaurants, thus pocketing a better profit.
Yet it is in big cities that “smartification” will have the most impact. A plethora of systems can be made more intelligent and then combined into a “system of systems”: not just transport and the power grid, but public safety, water supply and even health care (think remote monitoring of patients). With the help of Cisco, another big IT firm, the South Korean city of Incheon aims to become a “Smart+Connected” community, with virtual government services, green energy services and intelligent buildings…”
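The SmartBay alert described above — notify the harbourmaster when water levels rise above normal — is the same sense-analyze-respond loop in miniature. A hedged sketch (buoy IDs, levels, and the “normal” threshold are all invented for illustration; the real system fuses many more sources):

```python
# SmartBay-style alerting sketch: compare each buoy reading against an
# assumed "normal" level and collect alerts for the harbourmaster.
# Buoy IDs, readings, and the threshold are invented for illustration.

NORMAL_LEVEL_M = 2.0   # assumed normal water level, in metres

def check_readings(readings, normal=NORMAL_LEVEL_M):
    """readings: iterable of (buoy_id, level_m); return alert messages."""
    return [
        f"ALERT buoy {buoy}: water level {level:.1f} m above normal"
        for buoy, level in readings
        if level > normal
    ]

alerts = check_readings([("B-01", 1.8), ("B-02", 2.6), ("B-03", 2.1)])
```

Trivial on its own, but multiplied across thousands of sensors and rule types this is precisely the “system of systems” the article anticipates.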
3. Fast Company: Corventis’s PiiX Monitor Promises to Predict Heart Failure
… “The company’s first product, PiiX, is a wireless, water-resistant sensor that sticks to a patient’s chest like a large Band-Aid and monitors heart rate, respiratory rate, bodily fluids, and overall activity. It transmits the data to a central server for analysis and review by doctor and patient.
The basic technology platform has already received FDA approval, but Corventis envisions the PiiX as much more than a simple monitoring system. The company is working to generate algorithms that can predict, for instance, when a patient is on the verge of heart failure by comparing trends in his or her vital signs to other cases. "When you apply it in the real world, the algorithm begins to learn," says CEO Ed Manicka. "Not from 5 or 10 patients, but from hundreds of thousands of patients, as the system is applied across the planet."
… "What Corventis is trying to do is fundamentally create a new type of machine intelligence that serves to manage the patient’s overall health," he says. "It moves from the reactive approach of practicing medicine that is prevalent today to something that is much more proactive, preventative, and individualized."”
4. The Economist: The World in 2010 – Leadership in the information age, by Carol Bartz, CEO of Yahoo!
… “The second obligation that information creates for executives is to identify and mentor thought leaders. In the past, seeking out “high potential” employees typically meant looking for those who could climb the next rung of the management ladder. That remains important. But equally pressing is finding those employees who, though perhaps not the best managers, have the ability to digest and interpret information for others. Grooming these in-house ideas people helps foster a culture of openness to fresh thinking—the greatest energy an organisation can have.
The deluge of information is only going to rise. Leadership will increasingly mean leveraging that information, clarifying it, and using it to advance your strategy, engage customers and motivate employees. Business stakeholders are interested not only in your products and services, but also in your ideas.
So, welcome the information flood. Those who learn how to keep their head above it will be the most effective leaders.”
5. 2010: IT set to move from evolution to quiet revolution, predicts Progress Software
“Based on feedback from customers, as well as its own research and development, Progress Software sees five key technology trends set to shake up computing in 2010.
1. Real-time insight and business control will become a must-have, as organizations can ill-afford to lose money and customers through being slow to notice problems in delivery. In 2009, our research found that 67% of businesses only become aware of problems when customers report them. 80% of companies already have critical business events they need to monitor in real time. In 2010, insight into these events, powered by the right technology, will be essential to success.
2. Event-driven computing will accelerate, driven by business needs, and impacting both the way applications are built and how they are deployed in the enterprise. Architectures are increasingly being built around ‘events’, and this will increase to deal with both new sources of events appearing within the enterprise as well as external event sources from partners and customers.”
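The architectural shift Progress predicts — applications built around events rather than request/response — is easy to picture as a tiny publish/subscribe core: producers emit named events, and any number of handlers react without the producer knowing who is listening. A toy sketch, not any vendor’s API (event names and payloads are mine):

```python
# Toy event bus illustrating event-driven architecture: producers publish
# named events; subscribers react independently. Not any vendor's API.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        """Register a handler for all future events of event_type."""
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        """Deliver payload to every handler subscribed to event_type."""
        for handler in self._handlers[event_type]:
            handler(payload)

# Usage: two independent reactions to one business event.
bus = EventBus()
log = []
bus.subscribe("order.delayed", lambda order: log.append(f"notify customer {order['id']}"))
bus.subscribe("order.delayed", lambda order: log.append(f"alert ops about {order['id']}"))
bus.publish("order.delayed", {"id": "A-17"})
```

New event sources — internal or from partners and customers, as the prediction notes — plug in by publishing to the bus, with no changes to existing subscribers.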