
elemental links

brenda michelson's technology advisory practice

Rise of Event Processing / Active Information Picks

December 17, 2009 By brenda michelson

Earlier this month, I made my first and only prediction for 2010, that we would (finally) see the “Rise of Event Processing”.  Often, when I speak of Event Processing, I refer to a broader context of Active Information and (when relevant) the establishment of an Active Information Tier.  This past week, I read several articles that are applicable to the event processing / active information space.  I’ve included excerpts and links to 5 of those articles.  [Emphasis is my own.]

Briefly… I was particularly heartened to see the MIT Technology Review article dispel the notion that real-time, and the real-time web, is solely the domain of Twitter and related social media technologies.  The Economist SIS and Fast Company Corventis pieces highlight interesting sense, analyze and respond use cases in the real (physical) world.  The Carol Bartz piece, also in The Economist, discusses leadership traits in the age of information deluge.  Finally, the Progress Software predictions echo my sentiment in “Rise of Event Processing”, which is “You can’t change what you can’t see”.

Rise of Event Processing / Active Information Picks:

1. MIT Technology Review: Startups Mine the Real-Time Web, There’s more to it than microblog posts and social network updates.

“The "real-time Web" is a hot concept these days. Both Google and Microsoft are racing to add more real-time information to their search results, and a slew of startups are developing technology to collect and deliver the freshest information from around the Web.

But there’s more to the real-time Web than just microblogging posts, social network updates, and up-to-the-minute news stories. Huge volumes of data are generated, behind the scenes, every time a person watches a video, clicks on an ad, or performs just about any other action online. And if this user-generated data can be processed rapidly, it could provide new ways to tailor the content on a website, in close to real time.”

… “Richard Tibbetts, CTO of StreamBase, explains that financial markets make up about 80 percent of his company’s customers today. Web companies are just starting to adopt the technology.

"You’re going to see real-time Web mashups, where data is integrated from multiple sources," Tibbetts says. Such a mashup could, for example, monitor second-to-second fluctuations in the price of airline tickets and automatically purchase one when it falls below a certain price.”

… “Real-time applications, whether using traditional database technology or Hadoop, stand to become much more sophisticated going forward. "When people say real-time Web today, they have a narrow view of it–consumer applications like Twitter, Facebook, and a little bit of search," says StreamBase’s Tibbetts.”
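
The airline-fare mashup Tibbetts describes is, at its core, a continuous query over a stream of price events. As a rough illustration only (not StreamBase’s product or API), here is a minimal Python sketch of that pattern; the fare feed, route, threshold, and purchase action are all hypothetical.

```python
# Minimal sketch of the event-processing pattern Tibbetts describes:
# watch a stream of fare quotes and act when the price crosses a threshold.
# The fare feed, route, threshold, and "purchase" are hypothetical.

from dataclasses import dataclass
from typing import Iterable

@dataclass
class FareEvent:
    route: str        # e.g. "BOS->SFO"
    price: float      # quoted fare in dollars
    timestamp: float  # seconds since epoch

def watch_fares(events: Iterable[FareEvent], route: str, max_price: float):
    """Yield a purchase decision the first time the watched route drops below max_price."""
    for event in events:
        if event.route == route and event.price <= max_price:
            yield ("BUY", event)   # a real system would call a booking API here
            break

# Example with a hand-built stream of events
stream = [
    FareEvent("BOS->SFO", 412.00, 1.0),
    FareEvent("BOS->SFO", 389.00, 2.0),
    FareEvent("BOS->SFO", 349.00, 3.0),
]
for action, ev in watch_fares(stream, "BOS->SFO", 350.00):
    print(action, ev.route, ev.price)
```

A production event-processing engine would run many such rules over high-volume streams; the point here is only the shape of the pattern: filter a stream, detect a condition, trigger an action.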

2. The Economist: The World in 2010 – Big SIS (Societal Information-Technology Systems) is watching you

…“Thanks to Moore’s law (a doubling of capacity every 18 months or so), chips, sensors and radio devices have become so small and cheap that they can be embedded virtually anywhere. Today, two-thirds of new products already come with some electronics built in. By 2017 there could be 7 trillion wirelessly connected devices and objects—about 1,000 per person.

Sensors and chips will produce huge amounts of data. And IT systems are becoming powerful enough to analyse them in real time and predict how things will evolve. IBM has developed a technology it calls “stream computing”. Machines using it can analyse data streams from hundreds of sources, such as surveillance cameras and Wall Street trading desks, summarise the results and take decisions.

Transport is perhaps the industry in which the trend has gone furthest. Several cities have installed dynamic toll systems whose charges vary according to traffic flow. Drivers in Stockholm pay between $1.50 and $3 per entry into the downtown area. After the system—which uses a combination of smart tags, cameras and roadside sensors—was launched, traffic in the Swedish capital decreased by nearly 20%.

More importantly, 2010 will see a boom in “smart grids”. This is tech-speak for an intelligent network paralleling the power grid, and for applications that then manage energy use in real time. Pacific Gas & Electric, one of California’s main utilities, plans to install 10m “smart meters” to tell consumers how much they have to pay and, eventually, to switch off appliances during peak hours.

Smart technology is also likely to penetrate the natural environment. One example is the SmartBay project at Galway Bay in Ireland. The system there draws information from sensors attached to buoys and weather gauges and from text messages from boaters about potentially dangerous floating objects. Uses range from automatic alerts being sent to the harbourmaster when water levels rise above normal to fishermen selling their catch directly to restaurants, thus pocketing a better profit.

Yet it is in big cities that “smartification” will have the most impact. A plethora of systems can be made more intelligent and then combined into a “system of systems”: not just transport and the power grid, but public safety, water supply and even health care (think remote monitoring of patients). With the help of Cisco, another big IT firm, the South Korean city of Incheon aims to become a “Smart+Connected” community, with virtual government services, green energy services and intelligent buildings…”

3. Fast Company: Corventis’s PiiX Monitor Promises to Predict Heart Failure

… “The company’s first product, PiiX, is a wireless, water-resistant sensor that sticks to a patient’s chest like a large Band-Aid and monitors heart rate, respiratory rate, bodily fluids, and overall activity. It transmits the data to a central server for analysis and review by doctor and patient.

The basic technology platform has already received FDA approval, but Corventis envisions the PiiX as much more than a simple monitoring system. The company is working to generate algorithms that can predict, for instance, when a patient is on the verge of heart failure by comparing trends in his or her vital signs to other cases. "When you apply it in the real world, the algorithm begins to learn," says CEO Ed Manicka. "Not from 5 or 10 patients, but from hundreds of thousands of patients, as the system is applied across the planet."

… "What Corventis is trying to do is fundamentally create a new type of machine intelligence that serves to manage the patient’s overall health," he says. "It moves from the reactive approach of practicing medicine that is prevalent today to something that is much more proactive, preventative, and individualized."”

4. The Economist: The World in 2010 – Leadership in the information age, by Carol Bartz, CEO of Yahoo!

… “The second obligation that information creates for executives is to identify and mentor thought leaders. In the past, seeking out “high potential” employees typically meant looking for those who could climb the next rung of the management ladder. That remains important. But equally pressing is finding those employees who, though perhaps not the best managers, have the ability to digest and interpret information for others. Grooming these in-house ideas people helps foster a culture of openness to fresh thinking—the greatest energy an organisation can have.

The deluge of information is only going to rise. Leadership will increasingly mean leveraging that information, clarifying it, and using it to advance your strategy, engage customers and motivate employees. Business stakeholders are interested not only in your products and services, but also in your ideas.

So, welcome the information flood. Those who learn how to keep their head above it will be the most effective leaders.”

5. 2010: IT set to move from evolution to quiet revolution, predicts Progress Software

“Based on feedback from customers, as well as its own research and development, Progress Software sees five key technology trends set to shake up computing in 2010.

1. Real-time insight and business control will become a must-have, as organizations can ill afford to lose money and customers through being slow to notice problems in delivery. In 2009, our research found that 67% of businesses only become aware of problems when customers report them. 80% of companies already have critical business events they need to monitor in real time. In 2010, insight into these events, powered by the right technology, will be essential to success.

2. Event-driven computing will accelerate, driven by business needs, and impacting both the way applications are built and how they are deployed in the enterprise. Architectures are increasingly being built around ‘events’, and this will increase to deal with both new sources of events appearing within the enterprise as well as external event sources from partners and customers.”

Filed Under: active information, business, event driven architecture, event processing, information strategies, trends

Straddling the “Good Enough | Crapification Divide”

September 2, 2009 By brenda michelson

The current edition of Wired has an interesting article entitled “The Good Enough Revolution”.  The article highlights several successful products – from Pure Digital’s Flip Video Camera to the Predator drone aircraft – whose success was due to a “good enough” mindset.

Good Enough centers on accessibility:

“The attributes that now matter most all fall under the rubric of accessibility. Thanks to the speed and connectivity of the digital age, we’ve stopped fussing over pixel counts, sample rates, and feature lists. Instead, we’re now focused on three things: ease of use, continuous availability, and low price. Is it simple to get what we want out of the technology? Is it available everywhere, all the time—or as close to that ideal as possible? And is it so cheap that we don’t have to think about price? Products that benefit from the MP3 effect capitalize on one or more of these qualities. And they’ll happily sacrifice power and features to do so.”

Good Enough takes a unique view of technology:

“Speaking at an Online publishers conference in London last October, New York University new-media studies professor Clay Shirky had a mantra to offer the assembled producers and editors: "Don’t believe the myth of quality." When it comes to the future of media on the Web, Shirky sternly warned, resist the reflex to focus on high production values. "We’re getting to the point where the Internet can support high-quality content, and it’s as if what we’ve had so far has all been nice—a kind of placeholder—but now the professionals are coming," Shirky said. "That’s not true." To reinforce his point, he pointed to the MP3. The music industry initially laughed off the format, he explained, because compared with the CD it sounded terrible. What record labels and retailers failed to recognize was that although MP3 provided relatively low audio quality, it had a number of offsetting positive qualities.

Shirky’s point is crucial. By reducing the size of audio files, MP3s allowed us to get music into our computers—and, more important, onto the Internet—at a manageable size. This in turn let us listen to, manage, and manipulate tracks on our PCs, carry thousands of songs in our pockets, purchase songs from our living rooms, and share tracks with friends and even strangers. And as it turned out, those benefits actually mattered a lot more to music lovers than the single measure of quality we had previously applied to recorded music—fidelity. It wasn’t long before record labels were wringing their hands over declining CD sales.

"There comes a point at which improving upon the thing that was important in the past is a bad move," Shirky said in a recent interview. "It’s actually feeding competitive advantage to outsiders by not recognizing the value of other qualities." In other words, companies that focus on traditional measures of quality—fidelity, resolution, features—can become myopic and fail to address other, now essential attributes like convenience and shareability. And that means someone else can come along and drink their milk shake.

To a degree, the MP3 follows the classic pattern of a disruptive technology, as outlined by Clayton Christensen in his 1997 book The Innovator’s Dilemma. Disruptive technologies, Christensen explains, often enter at the bottom of the market, where they are ignored by established players. These technologies then grow in power and sophistication to the point where they eclipse the old systems.

That is certainly part of what happens with Good Enough tech: MP3s entered at the bottom of the market, were ignored, and then turned the music business upside down. But oddly, audio quality never really readjusted upward. Sure, software engineers have cooked up new encoding algorithms that produce fuller sound without drastically increasing file sizes. And with recent increases in bandwidth and the advent of giant hard drives, it’s now even possible to maintain, share, and carry vast libraries of uncompressed files. But better-sounding options have hardly gained any ground on the lo-fi MP3. The big advance—the one that had all the impact—was the move to easier-to-manage bits. Compared with that, improved sound quality just doesn’t move the needle.”

Good Enough is not crapification:

“To some, it looks like the crapification of everything. But it’s really an improvement. And businesses need to get used to it, because the Good Enough revolution has only just begun.”

Good Enough | Crapification Divide 

Undoubtedly, the Good Enough revolution will, if it hasn’t already, make inroads into enterprise and government IT shops.  And frankly, the pragmatist in me views this as a positive thing.  However, I’ve spent enough time in the real world to know that “good enough” can easily be (mis)interpreted as “slam something in” and result in “crapification”.

So, for me, the real question becomes: how do you straddle the good enough | crapification divide?  Top of mind, I’m thinking:

1.  Don’t deviate from the “good enough” design points: ease of use, continuous availability, and low price. 

2.  Understand that the above design points – ease of use, continuous availability and low price – are only possible with significant investment (time, talent) in design.

3. Pick a target audience, use case, scenario and stick to it.  Don’t be afraid to be niche or say no.  Better to win over a smaller audience than fail a large one.

4. Don’t force fit the use cases and scenarios where “good enough” isn’t good enough.

These are my early thoughts.  What points would you add to avoid crapification?

Filed Under: business, business-technology, trends

Lessons from the Crisis: Behavior Matters

August 25, 2009 By brenda michelson

The July/August issue of the Harvard Business Review has a feature by McKinsey & Company on 10 Trends You Have to Watch.  The premise is that, after a year in turmoil, business executives are starting to look towards the future.  However, the world has changed, and with it, so have some key trends.

The trend that caught my attention – Management as Science – falls squarely in the datarati realm:

“Data, computing power, and mathematical models have been transforming many realms of management from art to science. But the crisis exposed the limitations of certain tools. In particular, the world saw the folly of the reliance by banks, insurance companies, and others on financial models that assumed economic rationality, linearity, equilibrium, and bell-curve distributions. As the recession unfolded, it became clear that the models had failed badly.

It would be wrong to conclude that managers should go back to making decisions only on the basis of gut instinct. The real lessons are that the tools need to incorporate more-realistic visions of human behavior—most likely by drawing on behavioral economics, becoming more dynamic, and integrating real-world feedback—and that business executives need to get better at using them. Companies will, rightly, continue to seek ways to exploit the increasing amounts of data and computing power. As they do so, decision makers in every industry must take responsibility for looking inside the black boxes that advanced quantitative tools often represent and understanding their functioning, assumptions, and limitations.”

In retrospect, this makes perfect sense.  Human behavior is far from universally predictable.  Recall how the U.S. Government expected citizens to re-invigorate the economy by engaging in non-essential shopping with that first stimulus check.  Instead, what did many do?  Paid bills, bought groceries or tucked it away for the tough times to come.  Survival instincts won out over an algorithm.

Once you recognize that behavior matters, a natural follow-on is, “where does behavioral data come from?”  No surprise, Google has a veritable treasure trove:

“Wu calls Google "the barometer of the world." Indeed, studying the clicks is like looking through a window with a panoramic view of everything. You can see the change of seasons—clicks gravitating toward skiing and heavy clothes in winter, bikinis and sunscreen in summer—and you can track who’s up and down in pop culture. Most of us remember news events from television or newspapers; Googlers recall them as spikes in their graphs. "One of the big things a few years ago was the SARS epidemic," Tang says. Wu didn’t even have to read the papers to know about the financial meltdown—he saw the jump in people Googling for gold.”

As for the rest of us, we can mine internal and public datasets, set up prediction markets, employ sentiment tools and/or hire behavioral economics consultants.  First though, I’d recommend familiarizing yourself with the field of behavioral economics, paying special attention to the datarati ties. I plan to ease myself in with Dan Ariely’s Predictably Irrational.

If you have experience applying behavioral economics in your business, or reading/learning suggestions, please share what you can in the comments or via email.

Filed Under: active information, business, business intelligence, data science, trends

Lessons from Baseball Science: A picture is worth 1000 data points

August 5, 2009 By brenda michelson

It’d be easy to chalk up today’s choice to my being in pre-vacation mode, but in truth, I’ve had this New York Times Baseball Science article open in a tab for nearly a month.  When I first read it, I immediately thought of connections to my recent post Lessons from Googlenomics: Data Abundance, Insight Scarcity.

In the referenced Wired Googlenomics article, Hal Varian asks, “What’s ubiquitous and cheap?” His answer: “Data.” He follows up with, “And what is scarce? The analytic ability to utilize that data.”

The Baseball Science article highlights an innovative way Major League Baseball is collecting even more player data – defense and base running – via a new system of high-resolution cameras and supporting software:

“A new camera and software system in its final testing phases will record the exact speed and location of the ball and every player on the field, allowing the most digitized of sports to be overrun anew by hundreds of innovative statistics that will rate players more accurately, almost certainly affect their compensation and perhaps alter how the game itself is played.

…In San Francisco, four high-resolution cameras sit on light towers 162 feet up, capturing everything that happens on the field in three dimensions and wiring it to a control room below. Software tools determine which movements are the ball, which are fielders and runners, and which are passing seagulls. More than two million meaningful location points are recorded per game.”

However, the system output is “simple time-stamped x-y-z coordinates”, which require sophisticated algorithms to turn the raw data into insights:

“Software and artificial-intelligence algorithms must still be developed to turn simple time-stamped x-y-z coordinates into batted-ball speeds, throwing distances and comparative tools to make the data come alive.”
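
To make the “simple time-stamped x-y-z coordinates” point concrete, here is a small Python sketch of the kind of derivation involved: estimating ball speed from two successive camera samples. The sample values and sampling interval are invented for illustration; the article does not describe the system’s actual algorithms.

```python
# Hypothetical sketch: turning time-stamped x-y-z coordinates into a speed estimate.
# Sample values are invented; the real system's algorithms are not public.

import math

def speed_between(p1, p2):
    """Average speed (feet per second) between two (t, x, y, z) samples, coordinates in feet."""
    t1, x1, y1, z1 = p1
    t2, x2, y2, z2 = p2
    distance = math.sqrt((x2 - x1)**2 + (y2 - y1)**2 + (z2 - z1)**2)
    return distance / (t2 - t1)

# Two samples of a batted ball, roughly 1/30 of a second apart
sample_a = (0.000, 0.0, 0.0, 3.0)
sample_b = (0.033, 4.6, 1.1, 3.4)

fps = speed_between(sample_a, sample_b)
mph = fps * 3600 / 5280   # convert feet per second to miles per hour
print(f"{fps:.1f} ft/s ≈ {mph:.0f} mph")
```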

Beyond turning the raw data into meaningful information regarding player actions and game outcomes, the teams, league, and legions of fans and broadcasters still need to figure out how to act on, and manage, this data trove:

“Teams have begun scrambling to develop uses for the new data, which will be unveiled Saturday to a group of baseball executives, statisticians and academics, knowing it will probably become the largest single advance in baseball science since the development of the box score. Several major league executives would not publicly acknowledge their enthusiasm for the new system, to better protect their plans for leveraging it.

“It can be a big deal,” the Cleveland Indians’ general manager, Mark Shapiro, said. “We’ve gotten so much data for offense, but defensive objective analysis has been the most challenging area to get any meaningful handle on. This is information that’s not available anywhere. When you create that much data you almost have to change the structure of the front office to make sense of it.””

The above two challenges – making the data meaningful and developing actionable business insights – fall to the individuals Hal Varian refers to as the “datarati”:

“Varian believes that a new era is dawning for what you might call the datarati—and it’s all about harnessing supply and demand. “What’s ubiquitous and cheap?” Varian asks. “Data.” And what is scarce? The analytic ability to utilize that data. As a result, he believes that the kind of technical person who once would have wound up working for a hedge fund on Wall Street will now work at a firm whose business hinges on making smart, daring choices—decisions based on surprising results gleaned from algorithmic spelunking and executed with the confidence that comes from really doing the math.”

In the baseball world, Billy Beane and Theo Epstein are considered ‘datarati’ archetypes.

As a geek by trade and a lifelong baseball fan, I find myself intrigued by this new data collection technology and the resulting analytic and management possibilities.  Of course, it also got me thinking beyond baseball and sports, to wonder what other fields (no pun intended) might benefit from digital-camera-based data collection and data-point-to-scenario reconciliation.

From my own background, I can envision the technology being applied to analyze and improve efficiencies in retail stores, warehouses and factories.  How about you?

Some questions to consider:

Could this data collection technique benefit your organization?

How about as a data consumer?  Can you think of an external scenario that might provide meaningful “simple time-stamped x-y-z coordinates” to your organization?

Has your organization embraced the rise of the datarati?

Filed Under: active information, business, business intelligence, data science, innovation, trends Tagged With: archive_0

Lessons from Googlenomics: Data Abundance, Insight Scarcity

June 29, 2009 By brenda michelson

“"What's ubiquitous and cheap?" [Google’s Hal] Varian asks. "Data." And what is scarce? The analytic ability to utilize that data.”

The June issue of Wired has an excellent article by Steven Levy, entitled Secret of Googlenomics: Data-Fueled Recipe Brews Profitability.  The article delves into the history and algorithms behind Google’s auction based ad system, highlighting the significance of engineering, mathematics, economics, and data mining in Google’s success.

On the economics front, the article explains Hal Varian’s role as Chief Economist at Google, including why Google needs a chief economist:

“The simplest reason is that the company is an economy unto itself. The ad auction, marinated in that special sauce, is a seething laboratory of fiduciary forensics, with customers ranging from giant multinationals to dorm-room entrepreneurs, all billed by the world's largest micropayment system.

Google depends on economic principles to hone what has become the search engine of choice for more than 60 percent of all Internet surfers, and the company uses auction theory to grease the skids of its own operations. All these calculations require an army of math geeks, algorithms of Ramanujanian complexity, and a sales force more comfortable with whiteboard markers than fairway irons.”

After reading the article, Varian’s economic view of data ubiquity and analytic scarcity really stuck with me.  The quote I opened the post with isn’t directed at software availability or processing power.  It refers to the scarcity of people qualified to churn abundant data into economic value.  

What follows are some excerpts “about harnessing supply and demand”.  The sub-headers and emphasis are mine.

Enter Econometricians

"The people working for me are generally econometricians—sort of a cross between statisticians and economists," says Varian, who moved to Google full-time in 2007 (he's on leave from Berkeley) and leads two teams, one of them focused on analysis.

"Google needs mathematical types that have a rich tool set for looking for signals in noise," says statistician Daryl Pregibon, who joined Google in 2003 after 23 years as a top scientist at Bell Labs and AT&T Labs. "The rough rule of thumb is one statistician for every 100 computer scientists."

Ubiquitous Data

“As the amount of data at the company's disposal grows, the opportunities to exploit it multiply, which ends up further extending the range and scope of the Google economy…

Keywords and click rates are their bread and butter. "We are trying to understand the mechanisms behind the metrics," says Qing Wu, one of Varian's minions. His specialty is forecasting, so now he predicts patterns of queries based on the season, the climate, international holidays, even the time of day. "We have temperature data, weather data, and queries data, so we can do correlation and statistical modeling," Wu says. The results all feed into Google's backend system, helping advertisers devise more-efficient campaigns.”
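
The “correlation and statistical modeling” Wu mentions can be illustrated very simply: check how strongly query volume for a term tracks an external signal such as temperature. The sketch below uses invented numbers and is not meant to represent Google’s actual methods or data.

```python
# Hypothetical sketch of the "correlation" step Wu describes: how strongly does
# monthly query volume for a term track an external signal such as temperature?
# All numbers are invented for illustration.

from statistics import correlation  # Python 3.10+

monthly_avg_temp_f = [38, 45, 52, 61, 70, 78, 84, 82, 73, 60, 48, 40]
sunscreen_queries  = [110, 130, 180, 260, 400, 560, 650, 620, 470, 300, 170, 120]

r = correlation(monthly_avg_temp_f, sunscreen_queries)
print(f"Pearson correlation: {r:.2f}")  # close to 1.0: queries rise with temperature
```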

Continuous Analysis

“To track and test their predictions, Wu and his colleagues use dozens of onscreen dashboards that continuously stream information, a sort of Bloomberg terminal for the Googlesphere. Wu checks obsessively to see whether reality is matching the forecasts: "With a dashboard, you can monitor the queries, the amount of money you make, how many advertisers you have, how many keywords they're bidding on, what the rate of return is for each advertiser."”

Behavioral Based Insights

“Wu calls Google "the barometer of the world." Indeed, studying the clicks is like looking through a window with a panoramic view of everything. You can see the change of seasons—clicks gravitating toward skiing and heavy clothes in winter, bikinis and sunscreen in summer—and you can track who's up and down in pop culture. Most of us remember news events from television or newspapers; Googlers recall them as spikes in their graphs. "One of the big things a few years ago was the SARS epidemic," Tang says. Wu didn't even have to read the papers to know about the financial meltdown—he saw the jump in people Googling for gold. And since prediction and analysis are so crucial to AdWords, every bit of data, no matter how seemingly trivial, has potential value.”

Rise of the Datarati

“Varian believes that a new era is dawning for what you might call the datarati—and it's all about harnessing supply and demand. "What's ubiquitous and cheap?" Varian asks. "Data." And what is scarce? The analytic ability to utilize that data. As a result, he believes that the kind of technical person who once would have wound up working for a hedge fund on Wall Street will now work at a firm whose business hinges on making smart, daring choices—decisions based on surprising results gleaned from algorithmic spelunking and executed with the confidence that comes from really doing the math.”

Now, a few questions I think folks should consider:

  1. Who does that math in your organization? 
  2. Does your analytics / active information strategy suffer from information processing richness and insight scarcity?
  3. Who are, or should be, your datarati? 

Filed Under: active information, business, business intelligence, data science, information strategies, innovation, trends Tagged With: archive_0
