elemental links

brenda michelson: technology intersected

Big data fetishes: social and mobile – Active Information

May 10, 2012 By brenda michelson

This week, I wrote about data fetishes on Active Information. Excerpt:

“On the Big Data front, I’m intrigued by the potential of fast, wide and deep data processing to solve hard problems, learn from outliers and make informed, data-driven decisions.

And, as my clients will attest, I advocate instrumenting everything as a means to discover true customer, business and systems behaviors.

However, I don’t believe that all data has equal value. Nor does all valuable data hold its value over time. Good data programs rely on context and include data weeding.

But, what about the data that should never, ever get your attention? According to Wharton’s Peter Fader, the least valuable data is the noisiest in the Big Data space: social and mobile.”

Read the post: Big data fetishes: social and mobile – Input Output.

Filed Under: active information, information strategies

Active Information: Streaming through a Computational World, Changing change via experimentation platforms

January 4, 2012 By brenda michelson

My latest posts on the HPIO Active Information blog:

Streaming through a Computational World — (most popular post to date)

To take advantage of the computational world, or the nearer term internet of things, we need to infuse smarts throughout our data collection networks.  We need to employ up-front and intermediate filters, traffic cops, aggregators, pattern detectors, and intelligent agents.  We need to get over being data hoarders, and have the astuteness to leave data behind.
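
To make the shape of that concrete, here is a minimal Python sketch of an up-front filter feeding an intermediate aggregator; the (sensor_id, value) reading format, noise floor, and window size are all hypothetical illustrations, not anything from the post.

```python
from collections import defaultdict
from statistics import mean

NOISE_FLOOR = 0.05  # hypothetical up-front filter threshold
WINDOW = 10         # hypothetical aggregation window (readings per summary)

def stream_stage(readings):
    """Up-front filter plus intermediate aggregator: drop noise at the edge,
    forward per-sensor summaries, and leave the raw data behind."""
    buffers = defaultdict(list)
    for sensor_id, value in readings:
        if abs(value) < NOISE_FLOOR:
            continue  # filter: this reading never travels downstream
        buffers[sensor_id].append(value)
        if len(buffers[sensor_id]) == WINDOW:
            yield sensor_id, mean(buffers[sensor_id])  # aggregate, don't hoard
            buffers[sensor_id].clear()

# Example: 100 raw readings from one sensor collapse to a few summaries.
readings = [("s1", v / 100) for v in range(100)]
for sensor_id, avg in stream_stage(readings):
    print(sensor_id, round(avg, 3))
```

The point of the sketch is the flow: low-value readings are discarded at the edge, and only per-sensor summaries travel onward, deliberately leaving raw data behind.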

Busting cultural resistance via experimentation platforms — (changing change)

Culture, mistrust of the data, lack of interest. These very human factors are adoption barriers for 46% of the respondents. Yet, these barriers aren’t new, nor are they confined to big data and advanced analytics. To change a culture, you need to bring proof to the table. And proof requires hands-on experimentation and real-world data. We need data to prove that we need data. How will we get that?

Filed Under: active information, change, event driven architecture, event processing, information strategies

Data Quality for Real-time: Correctness and Tolerance

January 5, 2011 By brenda michelson

As part of my research agenda, I’m catching up on works by my thought-leading friends in the enterprise architecture field.  Today, I read Gene Leganza’s Top 15 Technology Trends EA Should Watch 2011 to 2013, published by Forrester’s EA Practice. [This is a client (for pay) report, which the good folks at Forrester shared with me.]

The themes of business visibility and responsiveness are present throughout the report, supported by several of the technology trends, including next generation business intelligence, business rules processing and policy-based SOA, smart technology management and event-driven patterns.

Tucked into the section [6-1] on Next-gen Business Intelligence (BI) is an extremely critical, and oft-overlooked, point on real-time processing and data quality:

“The shift from historical to real-time analytics will require that related processes such as data quality services also move to real time.”

The report continues to (correctly) state that “The complexity challenge [of Next-gen BI] will not be around the technologies per se but rather in the continued effort of gaining business consensus on data governance so that bad data is not driving strategic and operational decisions.”

I’ve written in the past about how real-time operational adjustments can actually aid strategic decision-making, because the adjustments add a degree of correctness to the historical information base. However, that’s only true when the operational decisions are based on good data.

As Gene’s report points out, this is where things get tricky.  You need to re-think and reposition data quality procedures to achieve correctness in real-time operational decision-making. 

However, you need to be cognizant of the delay that data quality checks add to real-time processing. Obviously, the tolerance for delay will vary by business scenario, and perhaps even by transaction.

If the business scenario (or transaction) has a low tolerance for introducing delay, then you need to shift your tolerance view to the impact of error. If the outcome of the real-time decision falls within your error tolerance range, then proceed with the action. If not, don’t force a real-time decision that your business might regret.

As you work through data governance policies for real-time, be sure to include tolerances for delay, correctness and decision (action or inaction) risk.
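
As a rough illustration of those three tolerances working together, here is a minimal Python sketch; the policy fields and the quality_check / estimate_error hooks are assumptions for illustration, not anything from Gene’s report.

```python
from dataclasses import dataclass

@dataclass
class RealTimeTolerances:
    """Hypothetical governance record covering the three tolerances above."""
    max_delay_ms: float   # how much delay quality checks may add
    max_error: float      # acceptable impact of acting on imperfect data
    check_cost_ms: float  # expected delay of running the quality service

def decide(event, tol, quality_check, estimate_error):
    """Return 'act' or 'defer' for one real-time decision.

    quality_check(event) -> bool and estimate_error(event) -> float are
    stand-in hooks for whatever data quality services actually exist."""
    if tol.check_cost_ms <= tol.max_delay_ms:
        # Delay is tolerable: verify correctness first, then act.
        return "act" if quality_check(event) else "defer"
    # Low tolerance for delay: shift the view to impact of error.
    if estimate_error(event) <= tol.max_error:
        return "act"    # outcome falls within the error tolerance range
    return "defer"      # don't force a decision the business might regret

# Example: a latency-sensitive scenario that skips checks but caps error risk.
tol = RealTimeTolerances(max_delay_ms=5, max_error=0.02, check_cost_ms=40)
print(decide({"amount": 120}, tol, lambda e: True, lambda e: 0.01))  # 'act'
```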

Filed Under: active information, change, information strategies

Contact Brenda

Have a question? Want to work together? Reach out via your preferred mode:
  • Email
  • LinkedIn
  • RSS
  • Twitter
© 2004-2021 Elemental Links, Inc.