
elemental links

brenda michelson's technology advisory practice

Data Quality for Real-time: Correctness and Tolerance

January 5, 2011 By brenda michelson

As part of my research agenda, I’m catching up on works by my thought-leading friends in the enterprise architecture field.  Today, I read Gene Leganza’s Top 15 Technology Trends EA Should Watch 2011 to 2013, published by Forrester’s EA Practice. [This is a client (for pay) report, which the good folks at Forrester shared with me.]

The themes of business visibility and responsiveness are present throughout the report, supported by several of the technology trends, including next generation business intelligence, business rules processing and policy-based SOA, smart technology management and event-driven patterns.

Tucked into the section [6-1] on Next-gen Business Intelligence (BI) is an extremely critical, and oft-overlooked, point on real-time processing and data quality:

“The shift from historical to real-time analytics will require that related processes such as data quality services also move to real time.”

The report continues to (correctly) state that “The complexity challenge [of Next-gen BI] will not be around the technologies per se but rather in the continued effort of gaining business consensus on data governance so that bad data is not driving strategic and operational decisions.”

I’ve written in the past about how real-time operational adjustments can actually aid strategic decision-making, because the adjustments add a degree of correctness to the historical information base.  However, that’s only true when the operational decisions are based on good data.

As Gene’s report points out, this is where things get tricky.  You need to re-think and reposition data quality procedures to achieve correctness in real-time operational decision-making. 

However, you need to be cognizant of the delay introduced by adding data quality checks to real-time processing.  Obviously, the tolerance for delay will vary by business scenario, and perhaps even by transaction.

If the business scenario (or transaction) has a low tolerance for added delay, then you need to shift your tolerance view from delay to the impact of error.  If the outcome of the real-time decision falls within your error tolerance range, then proceed with the action.  If not, don’t force a real-time decision that your business might regret.
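As a thought experiment, the delay-versus-error tolerance trade-off above can be sketched in a few lines of Python. This is an illustrative sketch only, not an implementation from the report; the names (`Tolerances`, `decide`, the quality score) are hypothetical placeholders for whatever your data governance policies actually define.

```python
from dataclasses import dataclass

@dataclass
class Tolerances:
    max_delay_ms: float  # acceptable added latency for inline quality checks
    max_error: float     # acceptable expected error if checks are skipped

def decide(quality_score: float, check_delay_ms: float, tol: Tolerances) -> str:
    """Decide how to handle a real-time event given delay and error tolerances.

    quality_score: estimated probability (0..1) that the incoming data is good.
    check_delay_ms: latency the inline data quality check would add.
    """
    if check_delay_ms <= tol.max_delay_ms:
        # Delay is tolerable: run data quality checks inline, then act.
        return "act_with_checks"
    # Delay is intolerable: shift to the error-tolerance view.
    expected_error = 1.0 - quality_score
    if expected_error <= tol.max_error:
        # Outcome falls within the error tolerance range: proceed.
        return "act_without_checks"
    # Don't force a real-time decision the business might regret.
    return "defer"
```

For example, with a 10 ms delay budget and a 10% error tolerance, a 50 ms quality check on high-confidence data (score 0.95) would be skipped and the action taken anyway, while the same check on low-confidence data (score 0.70) would defer the decision.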

As you work through data governance policies for real-time, be sure to include tolerances for delay, correctness and decision (action or inaction) risk.


Filed Under: active information, change, information strategies

© 2004-2022 Elemental Links, Inc.