
elemental links

brenda michelson's technology advisory practice

Cloud Watch now underway… Lawyers, Data and Money

November 13, 2009 By brenda michelson

This week, I’ve started up the Cloud Watch section of Elemental Cloud Computing.  Cloud Watch items are snippets from cloud computing industry news, business and technical publications, and thought leaders.  These snippets may be as short as a 140-character tweet, or as long as a few paragraphs.  Cloud Watch items will be posted throughout the day, and may be expanded as more voices and sources cover a hot topic.  Cloud Watch items may include a short Elemental Cloud Computing perspective.

The opening Cloud Watch items mix business and economics with data, contract and legal issues:

  • Bessemer’s 8th Law of Cloud Computing and SaaS: Leverage and monetize the data asset
  • Workload Metrics: Business Transactions per Kilowatt?
  • Microsoft Exec: Customers Embracing "Cloud Computing" <– But whose?
  • GigaOm: 10 Open Source Resources for Cloud Computing
  • Lawyers, Clouds and Warrants
  • McKinsey to CIOs in ‘New Normal’: Rethink Procurement…
  • Irving Wladawsky-Berger: Cloud Computing is Relevant for (mostly) everyone
  • Joyent is First in China: Launches Commercial Cloud Computing Platform

There are several ways to follow Cloud Watch from Elemental Cloud Computing: the Current Cloud Watch section on the homepage, the Cloud Watch tab of the Recent Posts navigation element in the sidebar, the Cloud Watch tab of the site navigation, the Cloud Watch or Full Site feeds and email, and Twitter.

Filed Under: business, cloud computing, economy, open source, sustainability

What are real people doing with SOA?

October 1, 2009 By brenda michelson

I just posted this on SOA Consortium Insights. This is the final evolution of the SOA Drivers diagram I first posted here a year ago (more or less).

A frequent question to the SOA Consortium is “What are real people doing with SOA?” Sure, folks see and appreciate the winning case studies. But what about the “every organization”? What problems, or opportunities, are being resolved using a SOA approach? What’s the best place to start? Top-down from business strategy? Bottom-up from IT projects? Somewhere in the middle?

This is best answered in three parts, starting by level-setting on business-driven SOA.

By “business-driven SOA”, we mean three things:

1. Creating a portfolio of services that represent capabilities offered by, or required of, your organization. Those capabilities may represent business, information, or technology concepts.

2. Composing, or orchestrating, those services, along with events, rules and policies, into business processes and solutions that fulfill business scenarios (a minimal sketch follows this list).

3. Never proceeding without a business outcome in mind. That “business outcome” could be cost and complexity reduction via a rationalized IT portfolio. In other words, “business-driven” doesn’t require a business person tapping you on the shoulder; it means executing for business reasons.
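To make points 1 and 2 concrete, here is a minimal, purely illustrative sketch. The service names (check_credit, reserve_inventory, ship_order), the order-fulfillment scenario, and the policy are hypothetical examples of mine, not SOA Consortium material; a real portfolio would sit behind service contracts and an orchestration engine rather than plain functions.

# Illustrative sketch only: hypothetical services composed into a business process.
# Each function stands in for a service representing a business capability (point 1).

def check_credit(customer_id: str) -> bool:
    """Business capability: assess customer credit (stubbed)."""
    return True

def reserve_inventory(sku: str, qty: int) -> bool:
    """Business capability: reserve stock (stubbed)."""
    return qty <= 10

def ship_order(customer_id: str, sku: str, qty: int) -> str:
    """Business capability: arrange shipment (stubbed)."""
    return f"shipment-0001: {qty} x {sku} to {customer_id}"

# A simple business policy applied during composition.
MAX_UNVERIFIED_QTY = 5

# Point 2: orchestrate the services, a policy, and an inbound event
# into a business process that fulfills a business scenario.
def fulfill_order(event: dict) -> str:
    customer, sku, qty = event["customer"], event["sku"], event["qty"]
    if qty > MAX_UNVERIFIED_QTY and not check_credit(customer):
        return "rejected: credit check failed"
    if not reserve_inventory(sku, qty):
        return "rejected: insufficient inventory"
    return ship_order(customer, sku, qty)

print(fulfill_order({"customer": "C42", "sku": "WIDGET", "qty": 3}))

The point isn’t the code; it’s that each service stands for a capability, while the business scenario, and the business outcome, live in the composition.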

Next, our take on “SOA initiatives”: there shouldn’t be any. Rather, there should only be business initiatives that (when appropriate) use a SOA approach. Sure, it sounds like semantic nit-picking, but in actuality it’s a lesson from the trenches. Focus on the business problem, not the SOA grand challenge.

As you know, business initiatives can arise from strategic decisions, business architecture/design decisions, and/or operational results. And since SOA can be applied to a variety of business situations, it only makes sense that SOA can appear at any point, from strategy to operations. In other words, there is no best starting point, only the best starting point for you.

As these business initiatives are further refined in planning and execution, business and implementation details are surfaced, which also may call for a SOA approach. For example, Joint Business and IT Planning activities may surface common process, function and/or information needs, across projects. These are potential drivers for a SOA approach.

This business and IT activity continuum is expressed by the blue, green and yellow boxes, and grey arrows and boundaries, in the diagram below. While blue represents business activities, and yellow represents IT activities, we’ve learned that the most successful organizations spend their time in the green, where business and IT continuously collaborate.

Finally, the business scenarios. Referring to the diagram below, in each column of the continuum, we’ve listed real world SOA approach drivers. For example, strategic business initiatives that have benefited from a SOA approach include introducing new business capability, entering a new or adjacent market, integrating mergers and acquisitions, and introducing multi-channel strategies.

We captured these drivers during interactions with practitioners (real people) over the last three years. These interactions included our executive summit series, our community of practice discussions, invited speakers at our events, case study contest submissions, and the thousands of practitioners we’ve met via industry events, private forums, and in their conference rooms.

[Diagram: SOA Approach Drivers, October 2009 version]

[The SOA Consortium is a client of my firm, Elemental Links]

Filed Under: bpm, business, business architecture, enterprise architecture, services architecture, soa

Straddling the “Good Enough | Crapification Divide”

September 2, 2009 By brenda michelson

The current edition of Wired has an interesting article entitled “The Good Enough Revolution”.  The article highlights several successful products – from Pure Digital’s Flip Video Camera to the Predator drone aircraft – where the success was due to a “good enough” mindset.

Good Enough centers on accessibility:

“The attributes that now matter most all fall under the rubric of accessibility. Thanks to the speed and connectivity of the digital age, we’ve stopped fussing over pixel counts, sample rates, and feature lists. Instead, we’re now focused on three things: ease of use, continuous availability, and low price. Is it simple to get what we want out of the technology? Is it available everywhere, all the time—or as close to that ideal as possible? And is it so cheap that we don’t have to think about price? Products that benefit from the MP3 effect capitalize on one or more of these qualities. And they’ll happily sacrifice power and features to do so.”

Good Enough takes a unique view of technology:

“Speaking at an Online publishers conference in London last October, New York University new-media studies professor Clay Shirky had a mantra to offer the assembled producers and editors: "Don’t believe the myth of quality." When it comes to the future of media on the Web, Shirky sternly warned, resist the reflex to focus on high production values. "We’re getting to the point where the Internet can support high-quality content, and it’s as if what we’ve had so far has all been nice—a kind of placeholder—but now the professionals are coming," Shirky said. "That’s not true." To reinforce his point, he pointed to the MP3. The music industry initially laughed off the format, he explained, because compared with the CD it sounded terrible. What record labels and retailers failed to recognize was that although MP3 provided relatively low audio quality, it had a number of offsetting positive qualities.

Shirky’s point is crucial. By reducing the size of audio files, MP3s allowed us to get music into our computers—and, more important, onto the Internet—at a manageable size. This in turn let us listen to, manage, and manipulate tracks on our PCs, carry thousands of songs in our pockets, purchase songs from our living rooms, and share tracks with friends and even strangers. And as it turned out, those benefits actually mattered a lot more to music lovers than the single measure of quality we had previously applied to recorded music—fidelity. It wasn’t long before record labels were wringing their hands over declining CD sales.

"There comes a point at which improving upon the thing that was important in the past is a bad move," Shirky said in a recent interview. "It’s actually feeding competitive advantage to outsiders by not recognizing the value of other qualities." In other words, companies that focus on traditional measures of quality—fidelity, resolution, features—can become myopic and fail to address other, now essential attributes like convenience and shareability. And that means someone else can come along and drink their milk shake.

To a degree, the MP3 follows the classic pattern of a disruptive technology, as outlined by Clayton Christensen in his 1997 book The Innovator’s Dilemma. Disruptive technologies, Christensen explains, often enter at the bottom of the market, where they are ignored by established players. These technologies then grow in power and sophistication to the point where they eclipse the old systems.

That is certainly part of what happens with Good Enough tech: MP3s entered at the bottom of the market, were ignored, and then turned the music business upside down. But oddly, audio quality never really readjusted upward. Sure, software engineers have cooked up new encoding algorithms that produce fuller sound without drastically increasing file sizes. And with recent increases in bandwidth and the advent of giant hard drives, it’s now even possible to maintain, share, and carry vast libraries of uncompressed files. But better-sounding options have hardly gained any ground on the lo-fi MP3. The big advance—the one that had all the impact—was the move to easier-to-manage bits. Compared with that, improved sound quality just doesn’t move the needle.”
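The file-size argument is easy to quantify. A back-of-envelope sketch, using my own round numbers rather than anything from the article:

# Rough arithmetic (my numbers, not Wired's): a 3-minute track.
seconds = 3 * 60
cd_kbps = 44_100 * 16 * 2 / 1000        # CD audio: 44.1 kHz, 16-bit, stereo ~= 1411 kbps
mp3_kbps = 128                          # a typical early MP3 bitrate

cd_mb = cd_kbps * seconds / 8 / 1000    # kilobits -> megabytes
mp3_mb = mp3_kbps * seconds / 8 / 1000

print(f"CD-quality: {cd_mb:.1f} MB   128 kbps MP3: {mp3_mb:.1f} MB")
# roughly 31.8 MB versus 2.9 MB: about an elevenfold reduction, which was
# the difference between "manageable bits" and not in the dial-up and early broadband era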

Good Enough is not crapification:

“To some, it looks like the crapification of everything. But it’s really an improvement. And businesses need to get used to it, because the Good Enough revolution has only just begun.”

Good Enough | Crapification Divide 

Undoubtedly, the Good Enough revolution will, if it hasn’t already, make inroads into enterprise and government IT shops.  And frankly, the pragmatist in me views this as a positive thing.  However, I’ve spent enough time in the real world to know that “good enough” can easily be (mis)interpreted as “slam something in” and result in “crapification”.

So, for me, the real question becomes: how do you straddle the good enough | crapification divide?  Top of mind, I’m thinking:

1.  Don’t deviate from the “good enough” design points: ease of use, continuous availability, and low price. 

2.  Understand that the above design points – ease of use, continuous availability and low price – are only possible with significant investment (time, talent) in design.

3. Pick a target audience, use case, scenario and stick to it.  Don’t be afraid to be niche or say no.  Better to win over a smaller audience than fail a large one.

4. Don’t force-fit “good enough” onto use cases and scenarios where it isn’t good enough.

These are my early thoughts.  What points would you add to avoid crapification?

Filed Under: business, business-technology, trends

Lessons from the Crisis: Behavior Matters

August 25, 2009 By brenda michelson

The July/August issue of the Harvard Business Review has a feature by McKinsey & Company on 10 Trends You Have to Watch.  The premise is that after a year in turmoil, business executives are starting to look towards the future.  However, the world has changed, and with it, so have some key trends.

The trend that caught my attention – Management as Science – falls squarely in the datarati realm:

“Data, computing power, and mathematical models have been transforming many realms of management from art to science. But the crisis exposed the limitations of certain tools. In particular, the world saw the folly of the reliance by banks, insurance companies, and others on financial models that assumed economic rationality, linearity, equilibrium, and bell-curve distributions. As the recession unfolded, it became clear that the models had failed badly.

It would be wrong to conclude that managers should go back to making decisions only on the basis of gut instinct. The real lessons are that the tools need to incorporate more-realistic visions of human behavior—most likely by drawing on behavioral economics, becoming more dynamic, and integrating real-world feedback—and that business executives need to get better at using them. Companies will, rightly, continue to seek ways to exploit the increasing amounts of data and computing power. As they do so, decision makers in every industry must take responsibility for looking inside the black boxes that advanced quantitative tools often represent and understanding their functioning, assumptions, and limitations.”

In retrospect, this makes perfect sense.  Human behavior is far from universally predictable.  Recall how the U.S. Government expected citizens to re-invigorate the economy by engaging in non-essential shopping with that first stimulus check.  Instead, what did many do?  Paid bills, bought groceries or tucked it away for the tough times to come.  Survival instincts won out over an algorithm.
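The quote’s jab at bell-curve assumptions is also easy to make concrete. A minimal sketch, with a distribution and numbers of my own choosing (not McKinsey’s): how much likelier is a “six sigma” event if the world is actually fat-tailed?

# Illustration only (my numbers, not McKinsey's). Requires scipy.
import math
from scipy import stats

df = 3                                   # a fat-tailed Student's t distribution
t_sigma = math.sqrt(df / (df - 2))       # its standard deviation

k = 6                                    # a "six sigma" move
p_normal = stats.norm.sf(k)              # probability under a bell curve
p_fat = stats.t.sf(k * t_sigma, df)      # probability of a move 6 of its own
                                         # standard deviations, fat-tailed model

print(f"bell curve: {p_normal:.1e}   fat-tailed: {p_fat:.1e}")
# roughly 1e-09 versus 1e-03: the "impossible" event is about a million
# times more likely once the bell-curve assumption is dropped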

Once you recognize that behavior matters, a natural follow-on question is, “where does behavioral data come from?”  No surprise, Google has a veritable treasure trove:

“Wu calls Google "the barometer of the world." Indeed, studying the clicks is like looking through a window with a panoramic view of everything. You can see the change of seasons—clicks gravitating toward skiing and heavy clothes in winter, bikinis and sunscreen in summer—and you can track who’s up and down in pop culture. Most of us remember news events from television or newspapers; Googlers recall them as spikes in their graphs. "One of the big things a few years ago was the SARS epidemic," Tang says. Wu didn’t even have to read the papers to know about the financial meltdown—he saw the jump in people Googling for gold.”
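For a toy sense of what “spikes in their graphs” means in practice, here’s a minimal sketch over invented daily query counts; any day that sits well above the usual range flags an event:

# Toy sketch, invented data: flag days whose query volume is an outlier.
import statistics

daily_queries = [98, 102, 97, 101, 99, 103, 100, 410, 395, 120, 99, 101]

mean = statistics.mean(daily_queries)
stdev = statistics.stdev(daily_queries)

spikes = [(day, count) for day, count in enumerate(daily_queries)
          if count > mean + 2 * stdev]
print(spikes)   # -> [(7, 410), (8, 395)]: the two "news event" days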

As for the rest of us, we can mine internal and public datasets, set up prediction markets, employ sentiment tools and/or hire behavioral economics consultants.  First though, I’d recommend familiarizing yourself with the field of behavioral economics, paying special attention to the datarati ties.  I plan to ease myself in with Dan Ariely’s Predictably Irrational.

If you have experience applying behavioral economics in your business, or reading/learning suggestions, please share what you can in the comments or via email.

Filed Under: active information, business, business intelligence, data science, trends

Lessons from Baseball Science: A picture is worth 1000 data points

August 5, 2009 By brenda michelson

It’d be easy to chalk up today’s choice to my being in pre-vacation mode, but in truth, I’ve had this New York Times Baseball Science article open in a tab for nearly a month.  When I first read it, I immediately thought of connections to my recent post Lessons from Googlenomics: Data Abundance, Insight Scarcity.

In the referenced Wired Googlenomics article, Hal Varian asks, “What’s ubiquitous and cheap?” His answer “Data.” He follows up with “And what is scarce? The analytic ability to utilize that data.”

The Baseball Science article highlights an innovative way Major League Baseball is collecting even more player data – defense and base running – via a new system of high-resolution cameras and supporting software:

“A new camera and software system in its final testing phases will record the exact speed and location of the ball and every player on the field, allowing the most digitized of sports to be overrun anew by hundreds of innovative statistics that will rate players more accurately, almost certainly affect their compensation and perhaps alter how the game itself is played.

…In San Francisco, four high-resolution cameras sit on light towers 162 feet up, capturing everything that happens on the field in three dimensions and wiring it to a control room below. Software tools determine which movements are the ball, which are fielders and runners, and which are passing seagulls. More than two million meaningful location points are recorded per game.”

However, the system output is “simple time-stamped x-y-z coordinates” which require sophisticated algorithms to turn the raw data into insights:

“Software and artificial-intelligence algorithms must still be developed to turn simple time-stamped x-y-z coordinates into batted-ball speeds, throwing distances and comparative tools to make the data come alive.”
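As a toy illustration of that first step, here’s a minimal sketch with invented coordinates (units assumed to be feet and seconds; that’s my assumption, not something stated in the article) that turns a few time-stamped x-y-z samples into a batted-ball speed:

# Sketch only: invented time-stamped x-y-z samples for a ball just after contact.
import math

samples = [
    (0.00, 0.0, 0.0, 3.0),      # (t, x, y, z) in seconds and feet (assumed units)
    (0.05, 7.2, 1.1, 3.6),
    (0.10, 14.3, 2.1, 4.1),
]

def average_speed(p0, p1):
    """Average speed between two time-stamped positions, in feet per second."""
    (t0, x0, y0, z0), (t1, x1, y1, z1) = p0, p1
    distance = math.dist((x0, y0, z0), (x1, y1, z1))
    return distance / (t1 - t0)

fps = average_speed(samples[0], samples[-1])
print(f"{fps:.0f} ft/s (~{fps * 3600 / 5280:.0f} mph) off the bat")

The real system obviously has to do far more, separating the ball from fielders, runners and seagulls, but the core move is the same: geometry plus timestamps.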

Beyond turning the raw data into meaningful information regarding player actions and game outcomes, the teams, league, and legions of fans and broadcasters still need to figure out how to act on, and manage, this data trove:

“Teams have begun scrambling to develop uses for the new data, which will be unveiled Saturday to a group of baseball executives, statisticians and academics, knowing it will probably become the largest single advance in baseball science since the development of the box score. Several major league executives would not publicly acknowledge their enthusiasm for the new system, to better protect their plans for leveraging it.

“It can be a big deal,” the Cleveland Indians’ general manager, Mark Shapiro, said. “We’ve gotten so much data for offense, but defensive objective analysis has been the most challenging area to get any meaningful handle on. This is information that’s not available anywhere. When you create that much data you almost have to change the structure of the front office to make sense of it.””

The above two challenges, making the data meaningful and developing actionable business insights, are accomplished by the individuals Hal Varian refers to as the “datarati”:

“Varian believes that a new era is dawning for what you might call the datarati—and it’s all about harnessing supply and demand. “What’s ubiquitous and cheap?” Varian asks. “Data.” And what is scarce? The analytic ability to utilize that data. As a result, he believes that the kind of technical person who once would have wound up working for a hedge fund on Wall Street will now work at a firm whose business hinges on making smart, daring choices—decisions based on surprising results gleaned from algorithmic spelunking and executed with the confidence that comes from really doing the math.”

In the baseball world, Billy Beane and Theo Epstein are considered ‘datarati’ archetypes.

As a geek by trade and a lifelong baseball fan, I find myself intrigued by this new data collection technology and the resulting analytic and management possibilities.  Of course, it also got me thinking beyond baseball, and sports, to wonder what other fields (no pun intended) might benefit from digital-camera-based data collection and data-point-to-scenario reconciliation.

From my own background, I can envision the technology being applied to analyze and improve efficiencies in retail stores, warehouses and factories.  How about you?

Some questions to consider:

Could this data collection technique benefit your organization?

How about as a data consumer?  Can you think of an external scenario that might provide meaningful “simple time-stamped x-y-z coordinates” to your organization?

Has your organization embraced the rise of the datarati?

Filed Under: active information, business, business intelligence, data science, innovation, trends Tagged With: archive_0

