
elemental links

brenda michelson's technology advisory practice

Big data fetishes: social and mobile – Active Information

May 10, 2012 By brenda michelson

This week, I wrote about data fetishes on Active Information. Excerpt:

“On the Big Data front, I’m intrigued by the potential of fast, wide and deep data processing to solve hard problems, learn from outliers and make informed, data-driven decisions.

And, as my clients will attest, I advocate instrumenting everything as a means to discover true customer, business and systems behaviors.

However, I don’t believe that all data has equal value. Nor does all valuable data hold its value over time. Good data programs rely on context and include data weeding.

But, what about the data that should never, ever get your attention? According to Wharton’s Peter Fader, the least valuable data is the noisiest in the Big Data space: social and mobile.”

Read the post: Big data fetishes: social and mobile – Input Output.

Filed Under: active information, information strategies

Active Information: Streaming through Computational World, Changing change via experimentation platforms

January 4, 2012 By brenda michelson

My latest posts on the HPIO Active Information blog:

Streaming through a Computational World — (most popular post to date)

To take advantage of the computational world, or the nearer term internet of things, we need to infuse smarts throughout our data collection networks.  We need to employ up-front and intermediate filters, traffic cops, aggregators, pattern detectors, and intelligent agents.  We need to get over being data hoarders, and have the astuteness to leave data behind.
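The filter-then-aggregate idea above can be sketched in a few lines of Python. This is a minimal, hypothetical pipeline (the sensor range and window size are illustrative): an up-front filter discards noise at the edge, and an intermediate aggregator keeps only window summaries, deliberately leaving the raw data behind.

```python
from statistics import mean

def sensor_filter(readings, lo, hi):
    """Up-front filter: drop out-of-range (noise) readings at the edge."""
    return (r for r in readings if lo <= r <= hi)

def window_aggregate(readings, size):
    """Intermediate aggregator: summarize each fixed-size window,
    discarding the raw readings once each summary is emitted."""
    window = []
    for r in readings:
        window.append(r)
        if len(window) == size:
            yield mean(window)
            window.clear()

# Two spurious readings (999.0, -40.0) never make it past the edge filter.
raw = [21.0, 21.5, 999.0, 22.0, 20.5, -40.0, 21.0, 22.5]
summaries = list(window_aggregate(sensor_filter(raw, 0, 50), 3))
print(summaries)  # two window means; the raw data is left behind
```

The point of the sketch is the shape, not the math: smarts sit in the collection path itself, so only the distilled signal travels downstream.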

Busting cultural resistance via experimentation platforms — (changing change)

Culture, mistrust of the data, lack of interest. These very human factors are adoption barriers for 46% of the respondents. Yet these barriers aren’t new, nor are they confined to big data and advanced analytics. To change a culture, you need to bring proof to the table.  And proof requires hands-on experimentation and real-world data. We need data to prove that we need data. How will we get that?

Filed Under: active information, change, event driven architecture, event processing, information strategies

Data Quality for Real-time: Correctness and Tolerance

January 5, 2011 By brenda michelson

As part of my research agenda, I’m catching up on works by my thought-leading friends in the enterprise architecture field.  Today, I read Gene Leganza’s Top 15 Technology Trends EA Should Watch 2011 to 2013, published by Forrester’s EA Practice. [This is a paid client report, which the good folks at Forrester shared with me.]

The themes of business visibility and responsiveness are present throughout the report, supported by several of the technology trends, including next generation business intelligence, business rules processing and policy-based SOA, smart technology management and event-driven patterns.

Tucked into the section [6-1] on Next-gen Business Intelligence (BI) is an extremely critical, and oft-overlooked, point on real-time processing and data quality:

“The shift from historical to real-time analytics will require that related processes such as data quality services also move to real time.”

The report goes on to (correctly) state that “The complexity challenge [of Next-gen BI] will not be around the technologies per se but rather in the continued effort of gaining business consensus on data governance so that bad data is not driving strategic and operational decisions.”

I’ve written in the past about how real-time operational adjustments can actually aid strategic decision-making, because the adjustments add a degree of correctness to the historical information base.  However, that’s only true when the operational decisions are based on good data.

As Gene’s report points out, this is where things get tricky.  You need to re-think and reposition data quality procedures to achieve correctness in real-time operational decision-making. 

However, you need to be cognizant of the impact, in terms of delay, caused by adding data quality checks to real-time processing.  Obviously, the tolerance for delay will vary by business scenario, and perhaps even by transaction.

If the business scenario (or transaction) has a low tolerance for introduced delay, then you need to shift your tolerance view to the impact of error.  If the outcome of the real-time decision falls within your error tolerance range, then proceed with the action.  If not, don’t force a real-time decision that your business might regret.
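That tolerance logic can be sketched as a small policy function. This is a hypothetical illustration, not anything from the Forrester report; the parameter names and thresholds are invented for the example.

```python
def decide(check_cost_ms, delay_tolerance_ms, passes_quality_check,
           expected_error, error_tolerance):
    """Toy real-time decision policy: run the data-quality check only
    when the delay budget allows; otherwise fall back to comparing
    expected error against the business's error tolerance."""
    if check_cost_ms <= delay_tolerance_ms:
        # There is time to verify the data before acting.
        return "act" if passes_quality_check else "reject"
    # No time to check: act only if a wrong answer is affordable.
    return "act" if expected_error <= error_tolerance else "defer"

# Enough delay budget: the quality check runs, and the data passes.
print(decide(10, 50, True, 0.20, 0.05))   # act
# No budget for the check, and the error risk is too high: hold off.
print(decide(10, 5, True, 0.20, 0.05))    # defer
```

The "defer" branch is the key point: when you can neither verify the data nor afford the error, not acting in real time is the correct outcome.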

As you work through data governance policies for real-time, be sure to include tolerances for delay, correctness and decision (action or inaction) risk.

Filed Under: active information, change, information strategies

Business Ecology: Optimization for Innovation

January 19, 2010 By brenda michelson

As I mentioned in my 2010 plans, one of my projects this year is writing and advocacy for the OMG’s Business Ecology Initiative (BEI).  This morning, we launched the new Business Ecology Initiative blog, and I published the following overview post on Business Ecology, the Business Ecology Initiative and Business Ecology enablers. 

As you read the post, you’ll notice several themes common to my on-going writings (soapboxes), which is why getting involved with the BEI is a no-brainer for me.  The original post follows.

What is Business Ecology?

Business Ecology is a business-technology imperative focused on streamlining business processes, removing waste from technology portfolios, and adjusting resource consumption, to optimize business operations and foster business innovation.

As the world economy emerges from a painful recession, organizations are confronted with the challenge of retaining bottom-line diligence while pursuing market-sustaining and market-gaining innovation.

For many organizations, the answer lies in harvesting savings and trapped value from existing processes, resources and capabilities.  To accomplish this, organizations are turning to Business Ecology. 

Business Ecology is not a one-time fix, but rather a management philosophy concerned with business vitality over time, balancing current conditions, optimization and innovation focus areas, resource allocations, and longer-term business motivations, capabilities and outcomes.

An important enabler of Business Ecology is the use of technology beyond automation. Business Ecology practitioners employ technology to identify, measure, model and drive business change. 

Business Ecology adoption and execution requires a cross functional team, comprised of business and information technology professionals. Key team capabilities include business process thinking, business performance measurement and analysis, financial analysis, IT architecture, portfolio management, service delivery and iterative project management.

Successful Business Ecology initiatives adopt the principles and values of Business Technology.  The team communicates with each other, sponsors and constituents in a common, business based language.  Measurement is defined and reported in business terms.  Funding and governance decisions and mechanisms reflect a shared resource, investment portfolio approach.

As Business Ecology teams progress, technology and business savvy is exchanged, fostering a greater understanding of each other’s challenges, skills and tools, which leads to a break in the longstanding, constraining, business and IT divide.  A byproduct of successful Business Ecology is business-IT integration.

The Chief Information Officer (CIO), given his/her unique position to view business processes, resource consumption, and technology portfolios across the organization, most often champions Business Ecology adoption and execution.  C-level executives, including the CEO, CFO and COO, sponsor Business Ecology initiatives.

What is the Business Ecology Initiative?

OMG, via the Business Ecology Initiative, leads the drive towards Business Ecology. 

The Business Ecology Initiative provides education, advocacy and member programs to enable organizations to achieve Business Ecology success, employ Actionable Architecture™, and carve a path to business-IT integration.

Actionable Architecture™ brings transparent business methodology to the definition and delivery of common IT infrastructure, platforms and services.  Emphasized business attributes include quality, efficiency, compliance, agility, value, effectiveness, ease of use, sustainability and business goal traceability.

Business-IT Integration is the organizational model for business and information technology convergence.  This model promotes collaborative strategy, planning, architecture and execution; shared decision-making, business-tech savvy personnel and service delivery at the point of value generation.   

What enables Business Ecology?

Techniques: LEAN, Six Sigma, BPM, Value Chain Analysis, Actionable Architecture™, Business Technology, Agile, Modeling, Simulation, Business Measurement, and Sustainability Analysis

Technology: SOA, BPM, Cloud Computing, Event Processing, Analytics, Master Data Management (MDM), and Open Standards

Measurement Models: Business Process Maturity Model (BPMM), Sustainability Assessment Model (SAM)

People: C-level executives, business and information technology professionals, who embrace the philosophy of Business Ecology

 

[Disclosure: The Business Ecology Initiative is a client of my firm, Elemental Links.]

Filed Under: bpm, business, business ecology, business-technology, information strategies, innovation, services architecture, soa, trends

Julian Hyde in ACM on In-Flight Data Processing with Streaming SQL Technology

January 14, 2010 By brenda michelson

Julian Hyde, chief architect of SQLstream, and lead developer of Mondrian, has a great article in Communications of the ACM on Data in Flight.  The article provides an overview of streaming query engines, demonstrates simple queries using a clickstream example, compares streaming query engines to relational database technology, discusses the advantages of streaming, and concludes with additional streaming applications, including CEP.

On streaming query engines vs. relational database technology:

“The streaming query engine is a new technology that excels in processing rapidly flowing data and producing results with low latency. It arose out of the database research community and therefore shares some of the characteristics that make relational databases popular, but it is most definitely not a database. In a database, the data arrives first and is stored on disk; then users apply queries to the stored data. In a streaming query engine, the queries arrive before the data. The data flows through a number of continuously executing queries, and the transformed data flows out to applications. One might say that a relational database processes data at rest, whereas a streaming query engine processes data in flight.”
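Hyde’s “queries arrive before the data” point can be illustrated with a minimal sketch. This is plain Python standing in for the idea, not SQLstream’s actual streaming SQL; the clickstream rows are invented for the example.

```python
class ContinuousQuery:
    """Toy in-flight query: registered before any data exists, then
    applied to each row as it flows past. No rows are stored."""
    def __init__(self, predicate, project):
        self.predicate = predicate  # which rows match
        self.project = project      # what to emit for a match
        self.results = []
    def on_row(self, row):
        if self.predicate(row):
            self.results.append(self.project(row))

# The query exists first...
clicks = ContinuousQuery(
    predicate=lambda row: row["event"] == "click",
    project=lambda row: row["page"],
)

# ...then the clickstream flows through it, row by row.
for row in [{"event": "click", "page": "/home"},
            {"event": "view",  "page": "/home"},
            {"event": "click", "page": "/buy"}]:
    clicks.on_row(row)

print(clicks.results)  # ["/home", "/buy"]
```

Contrast with a database: there, the rows would be written to disk first and the query would run later over the stored set. Here the query is the persistent thing and the data is transient.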

On CEP and Streaming, Hyde states:

“Application areas include complex event processing (CEP), monitoring, population data warehouses, and middleware. A CEP query looks for sequences of events on a single stream or on multiple streams that, together, match a pattern and create a "complex event" of interest to the business. Applications of CEP include fraud detection and electronic trading.

CEP has been used within the industry as a blanket term to describe the entire field of streaming query systems. This is regrettable because it has resulted in a religious war between SQL-based and non-SQL-based vendors and, in overly focusing on financial services applications, has caused other application areas to be neglected.”
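The “complex event” idea Hyde describes — a pattern matched across a sequence of simpler events — can be illustrated with a toy subsequence matcher. The event types here (a fraud-style login/password-change/transfer sequence) are hypothetical, not from the article.

```python
def match_sequence(events, pattern):
    """Toy CEP: return True if the pattern of event types occurs,
    in order, as a subsequence of the stream."""
    it = iter(events)  # a single pass: each event is consumed at most once
    return all(any(e["type"] == step for e in it) for step in pattern)

stream = [{"type": "login"},
          {"type": "password_change"},
          {"type": "transfer"}]

# The suspicious sequence occurs in order: a "complex event" fires.
print(match_sequence(stream, ["login", "password_change", "transfer"]))  # True
# Out of order, no match.
print(match_sequence(stream, ["transfer", "login"]))                      # False
```

Real CEP engines add time windows, multiple streams, and negation to this basic shape, but the core operation is the same: detecting an ordered pattern in flight and emitting a higher-level event when it completes.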

Hyde concludes his article as follows:

“Streaming query engines are based on the same technology as relational databases but are designed to process data in flight. Streaming query engines can solve some common problems much more efficiently than databases because they match the time-based nature of the problems, they retain only the working set of data needed to solve the problem, and they process data asynchronously and continuously.

Because of their shared SQL language, streaming query engines and relational databases can collaborate to solve problems in monitoring and realtime business intelligence. SQL makes them accessible to a large pool of people with SQL expertise.

Just as databases can be applied to a wide range of problems, from transaction processing to data warehousing, streaming query systems can support patterns such as enterprise messaging, complex event processing, continuous data integration, and new application areas that are still being discovered.”

If you are even remotely interested in event processing, active information strategies and/or stream processing, I highly recommend reading Hyde’s article.

Filed Under: active information, event driven architecture, event processing, information strategies, trends


© 2004-2022 Elemental Links, Inc.