elemental links

brenda michelson's technology advisory practice

Archives for September 2009

Cloud Computing Picks for Business Analysts

September 29, 2009 By brenda michelson

Recently, a business analyst asked me for some cloud computing reading ideas. His goal was to get familiar with cloud computing without drowning in technobabble or wading through near-religious rhetoric. After sending him an email with some picks, it occurred to me that many business analysts are similarly interested. With that, here is a selection of cloud computing reads and topics you should keep an eye on.

Cloud Computing Basics

For the basics, you are looking to learn about cloud computing components, offerings and economic value, because without value there is no need to explore further. Some starting points:

The Economist’s special cloud computing section from October 2008. I’ve linked to the lead article. Be sure to check out the follow-on pieces, listed to the right of the article, in the sidebar.

MIT’s Technology Review recently published a Cloud Computing Briefing. Be sure to read the Introduction and Technology Overview pieces.

WSJ’s The Internet Industry is on a Cloud. The value of this piece is seeing what your business executives are reading; note in particular the key players and how hype is outpacing reality.

Steve O’Grady of Redmonk on cloud economics and recessionary times. Me on the credit crisis and cloud computing, and how much cheap servers really cost.

Cloud Computing Issues

Nothing is without risk. The major concerns (as I type this) in the cloud computing space revolve around security, regulatory compliance, delegating management control, and early maturity. Of course, some of those concerns are also present in non-cloud environments. Some starting points:

Vint Cerf’s blog post on the early stage of the Cloud. In a perfect world, all clouds would seamlessly connect, and an organization could easily move applications and data between clouds, or run a business process across clouds. This is the “intercloud” vision.

Keep abreast of cloud computing (and intercloud) evolution by reading smart folks like James Urquhart.

Doug Cornelius’ Compliance Building blog. I covered a cloud panel Doug participated in; the post is here.
Another from MIT Technology Review on the current state of, and need for, cloud standards.

As for delegating management control, the key is establishing and managing service-level agreements (SLAs). Collecting SLA requirements early, before entering a service agreement with an outside party, will be critical.
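
To make the SLA point concrete, here is a minimal, hypothetical sketch (in Java) of what an early SLA requirements capture might look like. Every name and threshold in it is an illustrative assumption, not something drawn from any provider's actual agreement.

// Hypothetical sketch: capturing SLA requirements before committing to an outside provider.
// All field names and thresholds are illustrative assumptions.
public class SlaRequirement {

    private final String serviceName;                 // e.g., "customer order API"
    private final double availabilityTarget;          // e.g., 0.999 = 99.9% uptime
    private final int maxResponseTimeMillis;          // latency ceiling for normal operation
    private final int recoveryTimeObjectiveMinutes;   // how quickly the service must be restored

    public SlaRequirement(String serviceName, double availabilityTarget,
                          int maxResponseTimeMillis, int recoveryTimeObjectiveMinutes) {
        this.serviceName = serviceName;
        this.availabilityTarget = availabilityTarget;
        this.maxResponseTimeMillis = maxResponseTimeMillis;
        this.recoveryTimeObjectiveMinutes = recoveryTimeObjectiveMinutes;
    }

    /** Compare a provider's published SLA terms against this requirement. */
    public boolean isSatisfiedBy(double offeredAvailability, int offeredResponseMillis,
                                 int offeredRecoveryMinutes) {
        return offeredAvailability >= availabilityTarget
                && offeredResponseMillis <= maxResponseTimeMillis
                && offeredRecoveryMinutes <= recoveryTimeObjectiveMinutes;
    }

    public static void main(String[] args) {
        SlaRequirement orderApi = new SlaRequirement("customer order API", 0.999, 500, 60);
        // A hypothetical provider offering 99.95% availability, 300 ms responses, 45-minute recovery:
        System.out.println(orderApi.serviceName + " requirement met: "
                + orderApi.isSatisfiedBy(0.9995, 300, 45));
    }
}

The point of writing requirements down this explicitly, before signing, is that they become testable statements you can hold a provider to, rather than assumptions discovered after a failure.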

Cloud Computing and Business

In addition, whenever possible, read what your business executives are reading. Over the last 6-12 months, all of the major business publications – WSJ, Financial Times, Economist, BusinessWeek, Fortune, Forbes, etc – have covered some aspect of Cloud Computing.

Most often, these business journal articles lead with the strategies and movements of a cloud computing player – Amazon, Cisco, Google, HP, IBM, Microsoft, Oracle, Sun – so be creative in your searches.

Once you are comfortable with the basics, start to read for business analyst specific concerns, such as:

  • Use cases (pdf) and business scenarios that are a good fit for cloud computing
  • Cloud computing case studies in your industry
  • Service analysis methods for business analysis, service-oriented architecture and cloud computing

As for the last point, look at value chain analysis techniques or Geoffrey Moore’s Dealing with Darwin for determining core vs. non-core capabilities. And look for service analysis techniques that drive from business capabilities. This is a particular interest (and practice) area of mine. I’ll publish more on this in the future.
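
As a rough illustration of capability-driven service analysis, the sketch below tags business capabilities as core or context and suggests a sourcing style for each. The capability names and classifications are made-up examples, not a prescribed taxonomy.

import java.util.List;

// Hypothetical sketch of capability-driven service analysis.
// Capability names and the core/context tags are illustrative assumptions.
public class CapabilityAnalysis {

    enum Classification { CORE, CONTEXT }

    record Capability(String name, Classification classification) {
        /** Context (non-differentiating) capabilities are the natural candidates for cloud services. */
        String suggestedSourcing() {
            return classification == Classification.CORE
                    ? "build and run in-house"
                    : "candidate for SaaS / cloud service";
        }
    }

    public static void main(String[] args) {
        List<Capability> capabilities = List.of(
                new Capability("Pricing and product design", Classification.CORE),
                new Capability("Payroll processing", Classification.CONTEXT),
                new Capability("Email and collaboration", Classification.CONTEXT));

        for (Capability c : capabilities) {
            System.out.printf("%-28s -> %s%n", c.name(), c.suggestedSourcing());
        }
    }
}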

If you have some additional cloud computing reading suggestions for business analysts, please leave a comment, or send me a link on Twitter.

Filed Under: business architecture, cloud computing Tagged With: archive_0

Net Neutrality 2009 – An Invitation for National Conversation

September 25, 2009 By brenda michelson

“I am convinced that there are few goals more essential in the communications landscape than preserving and maintaining an open and robust Internet. I also know that achieving this goal will take an approach that is smart about technology, smart about markets, smart about law and policy, and smart about the lessons of history.”  — FCC Chairman Julius Genachowski at The Brookings Institution, Washington DC, September 21, 2009

 

On Monday, September 21, 2009, FCC Chairman Julius Genachowski outlined proposed net neutrality rules to preserve the openness of the Internet.  For those unfamiliar with the concept and issues of net neutrality, here is a snippet from my February 2006 paper, Net Neutrality: An Important Topic for National Conversation:

“The concept of Net Neutrality is straightforward.

In a neutral network, all network traffic is treated neutrally (not discriminated against) regardless of origin or destination. In a neutral network, all endpoints (content, applications, equipment) are treated neutrally (not discriminated against), regardless of function, ownership, or implementation. Since the Internet’s inception, the network has been neutral.

From the consumer’s point of view, a neutral network allows for unfettered access to any (legal) content and applications, using the equipment of their choice.

From the application and content provider’s point of view, a neutral network supports the offering of any (legal) content and application, using the platforms of their choice.

From the network operator’s point of view, a neutral network dictates the implementation and management of a standards-based network, to transmit information from origin to destination, without consideration for the usage patterns, payload, and volume generated by the endpoints.

In essence, a neutral network is a dumb pipe. The intelligence resides at the endpoints. And there lies the conflict.

In a neutral model, the network operators claim the application and content providers get a free ride on the network, with great financial returns on the network operator’s investment, while the network operators themselves are prohibited from offering new services (access tiers, prioritized traffic) to realize a return on their investment. Without a return on investment, network operators claim no incentive to continue their investment, specifically in broadband deployment.

In response, the application and content providers claim a discriminatory model would limit consumers’ choice on the Internet, and create high barriers to entry for new, innovative application and content providers. Without accessible and compelling content and applications, broadband adoption and usage will decline. Not to mention, the network operators do collect fees from the endpoints (consumers and providers).

As you can see, the Net Neutrality issue is circular. The network needs compelling applications and content. The applications and content need a viable network.

However, the good news is both sides, and the government, are in agreement that further broadband deployment and adoption are critical for innovation, social causes, economic growth, and global competitiveness.

The challenge, then, is setting forth policies that encourage both network and endpoint investment and innovation today, without placing unforeseen restrictions on future innovation.”

In making the case for net neutrality, Chairman Genachowski began by attributing the success of the Internet to its open design:

“Why has the Internet proved to be such a powerful engine for creativity, innovation, and economic growth? A big part of the answer traces back to one key decision by the Internet’s original architects: to make the Internet an open system.

Historian John Naughton describes the Internet as an attempt to answer the following question: How do you design a network that is “future proof” — that can support the applications that today’s inventors have not yet dreamed of? The solution was to devise a network of networks that would not be biased in favor of any particular application. The Internet’s creators didn’t want the network architecture — or any single entity — to pick winners and losers. Because it might pick the wrong ones. Instead, the Internet’s open architecture pushes decision-making and intelligence to the edge of the network — to end users, to the cloud, to businesses of every size and in every sector of the economy, to creators and speakers across the country and around the globe. In the words of Tim Berners-Lee, the Internet is a “blank canvas” — allowing anyone to contribute and to innovate without permission.”

Next, Chairman Genachowski spoke of real challenges to the Internet’s openness, including providers purposely blocking user access, limited broadband provider choice, broadband provider incentives and the traffic explosion.

“Notwithstanding its unparalleled record of success, today the free and open Internet faces emerging and substantial challenges. We’ve already seen some clear examples of deviations from the Internet’s historic openness. We have witnessed certain broadband providers unilaterally block access to VoIP applications (phone calls delivered over data networks) and implement technical measures that degrade the performance of peer-to-peer software distributing lawful content. We have even seen at least one service provider deny users access to political content. And as many members of the Internet community and key Congressional leaders have noted, there are compelling reasons to be concerned about the future of openness.

One reason has to do with limited competition among service providers. As American consumers make the shift from dial-up to broadband, their choice of providers has narrowed substantially. I don’t intend that remark as a policy conclusion or criticism — it is simply a fact about today’s marketplace that we must acknowledge and incorporate into our policymaking.

A second reason involves the economic incentives of broadband providers. The great majority of companies that operate our nation’s broadband pipes rely upon revenue from selling phone service, cable TV subscriptions, or both. These services increasingly compete with voice and video products provided over the Internet. The net result is that broadband providers’ rational bottom-line interests may diverge from the broad interests of consumers in competition and choice.

The third reason involves the explosion of traffic on the Internet. With the growing popularity of high-bandwidth applications, Internet traffic is roughly doubling every two years. Technologies for managing broadband networks have become more sophisticated and widely deployed. But these technologies are just tools. They cannot by themselves determine the right answers to difficult policy questions — and they raise their own set of new questions.”

According to the Chairman, we are at a crossroads:

“The rise of serious challenges to the free and open Internet puts us at a crossroads. We could see the Internet’s doors shut to entrepreneurs, the spirit of innovation stifled, a full and free flow of information compromised. Or we could take steps to preserve Internet openness, helping ensure a future of opportunity, innovation, and a vibrant marketplace of ideas.

I understand the Internet is a dynamic network and that technology continues to grow and evolve. I recognize that if we were to create unduly detailed rules that attempted to address every possible assault on openness, such rules would become outdated quickly. But the fact that the Internet is evolving rapidly does not mean we can, or should, abandon the underlying values fostered by an open network, or the important goal of setting rules of the road to protect the free and open Internet.

…

In view of these challenges and opportunities, and because it is vital that the Internet continue to be an engine of innovation, economic growth, competition and democratic engagement, I believe the FCC must be a smart cop on the beat preserving a free and open Internet.”

How will the FCC be a “smart cop”?  Chairman Genachowski recommends adding two more principles to the Four Freedoms (pdf) originally proposed by then-FCC Chairman Michael Powell in 2004.  In his speech, Chairman Genachowski summarized the Four Freedoms as:

“Network operators cannot prevent users from accessing the lawful Internet content, applications, and services of their choice, nor can they prohibit users from attaching non-harmful devices to the network.”

New 5th Principle is Non-discrimination:

“The fifth principle is one of non-discrimination — stating that broadband providers cannot discriminate against particular Internet content or applications.”

“This means they cannot block or degrade lawful traffic over their networks, or pick winners by favoring some content or applications over others in the connection to subscribers’ homes. Nor can they disfavor an Internet service just because it competes with a similar service offered by that broadband provider. The Internet must continue to allow users to decide what content and applications succeed.

This principle will not prevent broadband providers from reasonably managing their networks. During periods of network congestion, for example, it may be appropriate for providers to ensure that very heavy users do not crowd out everyone else. And this principle will not constrain efforts to ensure a safe, secure, and spam-free Internet experience, or to enforce the law…”

New 6th Principle is Transparency:

“The sixth principle is a transparency principle — stating that providers of broadband Internet access must be transparent about their network management practices.”

“…Today, broadband providers have the technical ability to change how the Internet works for millions of users — with profound consequences for those users and content, application, and service providers around the world.

…

We cannot afford to rely on happenstance for consumers, businesses, and policymakers to learn about changes to the basic functioning of the Internet. Greater transparency will give consumers the confidence of knowing that they’re getting the service they’ve paid for, enable innovators to make their offerings work effectively over the Internet, and allow policymakers to ensure that broadband providers are preserving the Internet as a level playing field…”

Call for Public Participation

After outlining the principles, the Chairman spoke of the rule-making process.  Although the process starts with fellow commissioners, there is a call for public participation, via openinternet.gov, the same site that published the Chairman’s full speech.

“While my goals are clear — to ensure the Internet remains a free and open platform that promotes innovation, investment, competition, and users’ interests — our path to implementing them is not pre-determined. I will ensure that the rulemaking process will be fair, transparent, fact-based, and data-driven. Anyone will be able to participate in this process, and I hope everyone will. We will hold a number of public workshops and, of course, use the Internet and other new media tools to facilitate participation. Today we’ve launched a new website, www.openinternet.gov, to kick off discussion of the issues I’ve been talking about. We encourage everyone to visit the site and contribute to the process.”

Public and Industry Reaction

As you might expect, not everyone is thrilled with the administration’s move towards a formalized net neutrality policy.  The party lines remain drawn: users, content and application providers are pleased with this action, while broadband and wireless carriers remain skeptical.  For a great synopsis of reactions, see this WSJ piece by Amy Schatz.

Why this post?

So, why such a long post on Net Neutrality? For the same reason I shared back in February 2006: I couldn’t help myself.  Net Neutrality is an important concept for all IT professionals to understand, even more so now, as Carl Brooks points out, in the era (or not) of cloud computing.

“The implications of net neutrality for the public cloud are plain; because it’s basically margins-driven, any squeeze from carriers would hamstring providers. Amazon’s cloud success is driven precisely by the fact that using it is easy and costs about the same as running your own server, minus the investment.

If it became more expensive to run a cloud server than a real server, which prejudicial network pricing would assuredly do, cloud adoption would stumble badly. Little users would stick with hosting; enterprises might still move into private cloud, but there would be no compelling reason for them to stick appropriate applications and data in the public cloud.

The true benefits of cloud computing– cheap, elastic and massively parallel computing power at the finger tips of the bright young things in industry and academia– would never be realized, since Comcast or Verizon would be lying in wait to pounce on data crunching projects and surcharge them.”

To learn about Net Neutrality’s historical context (policy and marketplace), issues, pro and con positions, intersections and convergences, see the full piece I wrote in February 2006, on the heels of a Senate Commerce Committee hearing.

To get involved in the conversation, check out openinternet.gov.

Filed Under: cloud computing, net neutrality

SOA Case Study Contest Special Recognition Winner: NY State Department of Taxation & Finance for e-MPIRE

September 21, 2009 By brenda michelson

Last week, the SOA Consortium announced that NY State Department of Taxation and Finance is the special recognition winner for Government/Public Sector in the SOA Consortium | CIO magazine case study contest.  NY State Department of Taxation and Finance Case highlights – originally posted by me on SOA Consortium Insights — follow. 

Organization Background

The New York State Department of Taxation and Finance is responsible for the administration of the state’s tax laws, including the administration of related local taxes, and the management of the State Treasury.

Business Scenario

Modernization & Re-engineering: Existing systems had several business issues. The user view into the system was not integrated across all platforms and systems, leading to longer training times and inconsistent service. The delivery of work was still paper-based.  The expectations of both internal and external customers changed dramatically with the emergence of the Web. The department wanted to facilitate Web filing and change the processing model from its historical batch pattern to a transactional one.

Legacy Constraints: Meanwhile, the technical side of the house was not only having trouble keeping up with the new demands, but was becoming unable to support all the different legacy technologies.  In fact, the primary system for processing personal income returns dates back to the 1970s and was built on a homegrown database system by experts who have all since retired.

Talent & Intellectual Capital: All the while, in both business and technical areas, experts were retiring with considerable undocumented enterprise intellectual capital.  The department was finding it hard to recruit the next generation of leaders, because the work involved was not as attractive as other offers. 

Cost Reductions: At the same time, the technical organization was being asked to cut costs and reduce total cost of ownership. 

e-MPIRE Project: The goal was to establish a 21st-century government system and toolset. To meet these goals, Tax brought in, through an RFP, a vendor who proposed an Integrated Tax System. The solution was a black box with security, work management and written business rules all built into it. After reviewing the vendor’s initial high-level design, the department decided to dismiss the vendor and pursue an approach that led to a more open solution leveraging existing assets.

ROI

Expected Value: The planned business value of e-MPIRE was to eliminate the risks of having the core departmental systems on unsupported platforms, give the user a single interface into all systems and build agility to adapt to changing legislative and business requirements. 

Agility: On the agility part, the annual legislative changes (referred to as annual cycles) for Corporation Tax historically had taken six weeks to code and more than two months to test.  Under e-MPIRE R2, and because of the externalized rules and improved testing tools, the annual cycle changes took two weeks to code and two weeks to test. 

Optimization: With the implementation of a workflow engine in R2, they found that individual work item time was reduced by 40%, exception inventories were reduced on average by 60% and backlog was reduced by 80%.  

Volume: In R3, the ability to process the high volumes of income tax returns and deliver refunds to taxpayers improved dramatically. Historically, barely 150,000 returns were processed a night, with a 24-hour delay for fraud detection.  e-MPIRE R3 hit a high-water mark of 390,000 returns in a night (it ran out of input), which includes a near-real-time evaluation for fraud.
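
The Agility result above credits externalized rules (plus better testing tools) for shrinking the annual cycle. The case does not describe the actual rule engine, so purely as a hypothetical sketch of the idea, the example below keeps the legislative parameters as data that can be replaced each cycle without touching the processing code. The brackets and rates are invented for illustration.

import java.math.BigDecimal;
import java.util.List;

// Hypothetical sketch of externalized business rules: an annual legislative change
// becomes new Rule data, not new processing code. Rates and thresholds are made up.
public class ExternalizedRules {

    record Rule(BigDecimal bracketFloor, BigDecimal flatRate) {}

    /** Generic evaluator: find the highest bracket the income reaches and apply its flat rate.
        (Deliberately simplistic; real tax computation is progressive and far more involved.) */
    static BigDecimal tax(BigDecimal income, List<Rule> rules) {
        BigDecimal rate = BigDecimal.ZERO;
        for (Rule r : rules) {                       // rules ordered by ascending bracketFloor
            if (income.compareTo(r.bracketFloor()) >= 0) {
                rate = r.flatRate();
            }
        }
        return income.multiply(rate);
    }

    public static void main(String[] args) {
        // An "annual cycle" change means swapping this list, while tax() stays untouched.
        List<Rule> rules = List.of(
                new Rule(new BigDecimal("0"), new BigDecimal("0.04")),
                new Rule(new BigDecimal("20000"), new BigDecimal("0.0685")));

        System.out.println(tax(new BigDecimal("35000"), rules));   // 2397.5000
    }
}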

Project Organization

Business and IT Collaboration: The project team included managers and staff people from across the Department of Taxation and Finance, along with business analysts and programmers from the IT organization. 

Two enterprise-wide IT organizations participated in the project via dotted line: Enterprise Architecture and Infrastructure.  Enterprise Architecture had teams embedded in the project, since the project was going to establish many standards and tools moving forward.  The User Interface Team, Java Framework Team and Technical Workflow Team were essentially Architecture’s Centers of Excellence on the project.

The User Interface Team worked with business analysts and users to establish everything from navigation patterns to field naming conventions, all with the intent of having a consistent UI to simplify training and usage. 

The Java Framework Team worked with the programmers and architecture teams to develop the code behind the navigation, build tools to aid development (some code generation), develop consistent integration services (to Content Management, Workflow, Business Components, etc.) and provide basic application development support.

The Technical Workflow Team worked with business analysts, users and the Workflow team (non-technical team of business modelers) to develop common workflow services, monitor models and process patterns. 

The Architecture Team helped align the business with the project because they enforced the consistent enterprise view of process and look and feel (the two places the system touches the users). 

There were validation meetings with larger groups of users to validate the design and direction of the project. 

The Business Roles and Navigation team of users was established to build the tabsets (functions) for the user groups, establish roles, make sure the right roles have access to those functions, and establish that function in the overall navigational scheme of e-MPIRE.
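
To make the roles-and-tabsets idea concrete, here is a minimal, hypothetical sketch of mapping roles to the functions (tabsets) they may access. The role and tabset names are invented for illustration and are not taken from e-MPIRE.

import java.util.Map;
import java.util.Set;

// Hypothetical sketch of role-based access to tabsets (functions).
// Role and tabset names are illustrative, not e-MPIRE's actual design.
public class RoleNavigationSketch {

    // Each role sees only the tabsets assigned to it.
    private static final Map<String, Set<String>> TABSETS_BY_ROLE = Map.of(
            "returns-processor", Set.of("Return Entry", "Error Resolution"),
            "fraud-analyst", Set.of("Fraud Review", "Case Notes"),
            "supervisor", Set.of("Return Entry", "Error Resolution", "Fraud Review", "Reports"));

    static boolean canAccess(String role, String tabset) {
        return TABSETS_BY_ROLE.getOrDefault(role, Set.of()).contains(tabset);
    }

    public static void main(String[] args) {
        System.out.println(canAccess("returns-processor", "Fraud Review"));  // false
        System.out.println(canAccess("supervisor", "Fraud Review"));         // true
    }
}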

Lessons

NY State Department of Taxation and Finance cites four reasons for success:

  1. Executives (commissioner, deputies and CIO) were committed to the creation of a system that would grow with the department for the next 20 years.  This is the most highly visible application at Tax and the one that has the largest impact on the citizenry ($3.5B in refunds into the economy). This was a risky system. It took real commitment and belief to pull the trigger.
  2. A shared vision of what the system had to be. 
  3. A department culture that facilitated success.  The cultural advantages they had were a trust of IT across the department, a history of working together, and within IT a culture of reuse of services (again a pattern of how they work together). 
  4. The considerable effort expended by their users, programmers and partners.

 

[Disclosures: The SOA Consortium is a client of my firm, Elemental Links. I was a contest judge.]

Filed Under: services architecture, soa

SOA Case Study Contest Special Recognition Winner: FINRA for NYSE & NASD Member Regulation Merger

September 18, 2009 By brenda michelson

Earlier this week, the SOA Consortium announced that Financial Industry Regulatory Agency (FINRA) is the special recognition winner for Regulatory in the SOA Consortium | CIO magazine case study contest.  FINRA Case Highlights — originally posted by me on SOA Consortium Insights — follow. 

Organization Background

The Financial Industry Regulatory Authority (FINRA) is the largest independent regulator for all securities firms doing business in the United States.  Created in July 2007 through the consolidation of NASD and the member regulation, enforcement and arbitration functions of the New York Stock Exchange, FINRA is dedicated to investor protection and market integrity through effective and efficient regulation and complementary compliance and technology-based services.

Business Scenario

Merger/Consolidation: The Financial Industry Regulatory Authority (FINRA) SOA project consolidated the New York Stock Exchange Member Regulation systems with the NASD Member Regulation systems. FINRA is the largest private independent regulator for all securities firms doing business in the United States. FINRA oversees nearly 4,850 brokerage firms, about 173,000 branch offices and approximately 649,000 registered securities representatives.

The primary challenges:

  1. Consolidation of the two organizations’ application portfolios that support the member regulation business.  Each application portfolio was sizable and heterogeneous. At the outset, FINRA had ~160 applications and NYSE Member Regulation was supported by 86 applications.
  2. Reconciliation of two sets of legacy business processes into a final-state business process.
  3. Final-state business processes must seamlessly integrate new systems and existing systems from both legacy organizations. The existing systems required enhancements.
  4. Business teams were distributed across the United States in district office locations.  The development team was located in New York City and the Washington D.C. area.

The objectives:

  1. The final-state business processes of the merged company required seamless operation.
  2. The team needed to ensure a continuity of business operations while transitioning in phases to the new final-state business processes.
  3. Performance and reliability of the systems were key requirements in maintaining core mission success.

Why SOA:

  1. The size and complexity of the project required multiple teams in different locations working effectively in parallel to meet the aggressive schedule.
  2. A SOA approach reduced risks presented by the large team size.
  3. The end-state systems had to be flexible and provide the ability to quickly deploy changed and new business processes without breaking the architecture.
  4. It was anticipated that the approach would deliver significant savings in both cost and time when compared to competing approaches.

ROI

The Member Regulation function of FINRA (the new, merged regulator) benefited greatly from the new system.  Broker regulation tasks were simplified and accelerated, and delivered cost savings for the business.

The key business values achieved are:

Time to Market – Project delivery was greatly accelerated by allowing development teams to conduct parallel development of 10 major services with minimal interaction and dependencies. The service oriented approach and detailed overall vision allowed each team to rapidly deliver individual services that were seamlessly integrated and tested by the system team.

Reduced Risk – The SOA approach mitigated many of the risks associated with large development teams (100+ staff) by facilitating parallel development while minimizing team interdependencies and setting clear team responsibilities. The key to reducing risk is the early definition of business service interfaces and responsibilities.

Cost Savings – The modular SOA architecture of the new system consolidated business functions into a common set of business services that are leveraged across many business processes, resulting in cost savings for construction, deployment and maintenance of the system. 

Improved Agility – The business-centric service design and modularity of the SOA approach provides flexible deployment to support current business processes and to rapidly adapt to support future business processes. Current business-centric services include data sourcing, analytic surveillance and case management.

Resilience – Fault-tolerant business continuity is achieved using guaranteed message delivery, as individual business services are moved off-line for maintenance and restored.

Process Optimization – Technology duplication is eliminated through the consolidation of functionality into discrete standardized business services. This also provides a uniform approach and consistent results across all the business processes.
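
The Reduced Risk point above hinges on defining business service interfaces early. As a hypothetical illustration only (the operation names and types are assumptions, not FINRA's actual design), the sketch below shows how an agreed contract lets a consumer team code and test against a throwaway stub while the provider team builds the real service in parallel.

import java.util.ArrayList;
import java.util.List;
import java.util.UUID;

// Hypothetical sketch: an early, agreed service contract decouples provider and
// consumer teams. Operation names and types are illustrative assumptions.
public class ServiceContractSketch {

    /** The agreed business service interface: the "early definition" in question. */
    interface CaseManagementService {
        String openCase(String firmId, String matterDescription);
        List<String> openCasesForFirm(String firmId);
    }

    /** A throwaway in-memory stub, so consumer teams can work before the real service exists. */
    static class InMemoryCaseManagement implements CaseManagementService {
        private final List<String[]> cases = new ArrayList<>();

        public String openCase(String firmId, String matterDescription) {
            String caseId = UUID.randomUUID().toString();
            cases.add(new String[] { caseId, firmId, matterDescription });
            return caseId;
        }

        public List<String> openCasesForFirm(String firmId) {
            List<String> ids = new ArrayList<>();
            for (String[] c : cases) {
                if (c[1].equals(firmId)) {
                    ids.add(c[0]);
                }
            }
            return ids;
        }
    }

    public static void main(String[] args) {
        CaseManagementService service = new InMemoryCaseManagement();
        service.openCase("FIRM-001", "Suspicious trading pattern");
        System.out.println(service.openCasesForFirm("FIRM-001"));
    }
}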

Project Organization

FINRA used a three-tier approach to organize the project:

Tier 1 comprised business analysts who spanned the various business processes.

Tier 2 consisted of 10 distinct service teams that defined the function of their service based on the business analysis and under the supervision and guidance of architecture and program management.

Each service team was individually sponsored by the organization and was a cross-functional team comprised of business analysts, architecture, development, and testing. The collaboration and alignment of technology and business staff on each individual service team was key to their success.

Tier 3 comprised the system architecture team and program management, who provided overall vision, governance, and timeline. The overall vision provided business-centric services spanning many business processes. This naturally created emphasis on major business functions such as analytic surveillance and case management.

Lessons

  1. The single most important lesson from this SOA project is that extremely large, time-critical applications can utilize a SOA approach to segregate and compartmentalize common services and allow for massively parallel work by independent teams. Not only does this approach increase organizational productivity, but it also mitigates some of the risk presented by a large project.
  2. The SOA approach gave the teams a measure of insulation, helping ensure decisions on one component did not negatively impact other components or the project.  The de-coupling allowed teams to deliver well-defined components on the aggressive timeline required for project success.
  3. Understanding the underlying business problems and processes is crucial to creating well-defined services that are reusable and exhibit the correct level of granularity.  The payoff for this is a flexible business process that can change and grow without changing the architecture.
  4. A concise architectural vision shared between system architects and application architects is key in large projects. Effective governance, along with well-defined services with clear functions and interfaces, is essential. Since the interfaces will change over time, it is important to develop a plan to handle this early.
  5. For projects that use process orchestration, identify and document those processes early in the project.  This will help ensure that they operate in conjunction with the business function they automate and avoid problems that would be more costly and difficult to solve later in the project.
  6. The combination of an ESB and BPM is a robust and powerful enterprise pattern. This combination greatly simplified several of the tasks of adapting existing capabilities into a new unified SOA system.

[Disclosures: The SOA Consortium is a client of my firm, Elemental Links. I was a contest judge.]

Filed Under: bpm, services architecture, soa

SOA Case Study Contest Special Recognition Winner: BlueStar Energy for NextStar™

September 17, 2009 By brenda michelson

Yesterday, the SOA Consortium announced that BlueStar Energy is the special recognition winner for Energy/Utility in the SOA Consortium | CIO magazine case study contest.  Highlights from the BlueStar Energy Case follow. 

Company Background

BlueStar Energy is an independent retail electric supplier, certified to sell electricity in Illinois, Maryland and the District of Columbia. In addition, BlueStar provides green power and energy efficiency solutions to home and business customers.

Business Scenario

Business Agility: BlueStar’s business is in a very fluid regulatory environment. Business conditions change all the time, and the IT infrastructure has to adapt to those changes immediately. The BlueStar executive team formulated a strong strategy and needed a foundation for business execution.

Unite Business & Technology: In mid 2006, BlueStar sponsored an enterprise-wide SOA initiative to unite business and technology based on a strategic enterprise vision to increase competitive advantage. The goal was to provide BlueStar with a rock-solid IT platform and digitized business processes to automate the company’s core capabilities.

Enterprise Architecture Evolution:  BlueStar adopted the concepts of a mature EA methodology and embraced service orientation (SOA) as their enterprise architecture style. The EA methodology allowed the CTO to make tough decisions about which processes BlueStar had to execute well, then implement the IT systems needed to automate those processes.

NextStar™: The enterprise architecture would be named NextStar™ (for the “next” BlueStar) with the goal of streamlining and automating core business processes including eCommerce, B-to-B Integration, Accounting, Automated Provisioning, Risk/Energy Management, Pricing and Product Management, Sales Force Automation, Customer Relationship Management and Billing Systems. Millions of dollars in revenue are tied to timely access to information and the ability to act on that information.

Incremental Delivery: Key to BlueStar’s success was thinking big and starting small—partitioning BlueStar into narrowly defined business domains, processes and services, building out features iteratively, and showing early value and success—instead of attempting to define the architecture for the entire company at once.

Their services and applications have been architected leveraging SOA constructs such as standards-based interfaces, consumer heterogeneity, loose coupling and composability, among others. These services and applications, as well as existing legacy systems, workflows and third-party service providers, interact with each other in a standard, loosely coupled manner via their Business Integration Suite, which consists of open source, distributed, scalable and reliable components such as an enterprise service bus, a business process management system and a messaging fabric.
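
The case names its integration style (standards-based interfaces, loose coupling, an ESB, BPM and a messaging fabric) without showing any code. Purely as a hypothetical illustration of loose coupling through messaging, the sketch below has a publisher and a subscriber that share only a topic name and a payload, never a direct reference to each other; the class and topic names are invented, not BlueStar's actual design.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Hypothetical sketch of loose coupling via messaging: enrollment and billing share
// only topic names and payloads. Names are illustrative, not BlueStar's design.
public class LooseCouplingSketch {

    /** Minimal in-process stand-in for an ESB / messaging fabric. */
    static class MessageBus {
        private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

        void subscribe(String topic, Consumer<String> handler) {
            subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
        }

        void publish(String topic, String payload) {
            subscribers.getOrDefault(topic, List.of()).forEach(h -> h.accept(payload));
        }
    }

    public static void main(String[] args) {
        MessageBus bus = new MessageBus();

        // Billing reacts to enrollment events without knowing who produced them.
        bus.subscribe("customer.enrolled",
                accountId -> System.out.println("Billing: set up invoicing for " + accountId));

        // Enrollment publishes the event without knowing who will consume it.
        bus.publish("customer.enrolled", "ACCT-42");
    }
}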

ROI

BlueStar’s CEO Guy Morgan attributes much of the company’s recent growth to the NextStar initiative. BlueStar has grown 12,197% over five years.

In addition to growth contribution, between the adoption of enterprise architecture, open source and offshore development, the company estimates saving $24 million over the course of five (5) years.

Business benefits:

  • Improve business agility by streamlining core business processes, accelerating supply chain integration and providing business tools to offer new products in new geographies for new demographics, at an attractive price
  • Reduce business operational expense and cycle time through automation of manual processes and seamless integration with trading partners and value-added vendor-supplied services
  • Position the company to compete with Illinois utilities to serve residential and small-business customers after the discounted rate the state had required utilities to charge for 10 years ended in January 2007
  • 50% reduction in billing staff
  • Met per account operational cost targets for small business and residential customers through 100% automation of account management
  • Ability to cope with frequent regulatory requirements changes from states where BlueStar operates
  • Ability to enter new energy markets such as natural gas
  • Ability to either in-source or outsource business processes to respond to market conditions
  • Facilitate merger and acquisition (M&A) activity due to likely industry consolidation

IT benefits:

  • Ability to scale from thousands of customers to tens of millions of customers nationwide
  • Serve as a foundation for massively scalable solutions
  • 10% reduction in IT expense associated with enrollment and billing processing
  • Reduced IT capital investment and ongoing operational expense through 100% use of open source technologies

Project Organization

Business & IT Collaboration: From inception to deployment, this project has been a collaborative effort between business and technology teams. As a first step in the governance and change management areas, BlueStar established a Steering Committee comprising the company’s executives. Then, with executive management sponsorship, the company formed several domain-based change control boards consisting of multi-disciplinary teams including business, legal and IT members. The change control board members meet weekly to discuss prioritization, progress, change and control management, and release planning.

In-house Effort: NextStar was developed 100% in house. The Chief Technology Officer, Director of Enterprise Architecture and Lead Solutions Architect worked in Chicago, IL. Software Engineering and Quality Assurance teams worked in Lima, Peru.

Enterprise Architecture: The Enterprise Architecture team was responsible for providing leadership, mentorship and oversight of the Business, Solutions, Information and Technology Architectures, all of this under the SOA paradigm.

Lessons

BlueStar discovered early on that these actions would make the project more successful:

  • Increased focus on program management (strategy) versus project management (tactical)
  • Addressing and agreeing on budgetary planning and allocation
  • Stricter adoption of the architecture and software engineering methodologies
  • Stricter management and governance of IT assets

[Disclosures: The SOA Consortium is a client of my firm, Elemental Links. I was a contest judge.]

Filed Under: enterprise architecture, services architecture, soa
