
elemental links

brenda michelson's technology advisory practice

Archives for March 2009

Cloud Watching for Enterprise Architects: Picks

March 23, 2009 By brenda michelson

As promised, here are some links that should be of interest to enterprise architects who are (or need to be) cloud watching.  And yes, I realize I’ve mixed in a leadership interview with Xerox’s Anne Mulcahy, but once you read the others, you’ll appreciate a leader who gets that dealing with ambiguity is a sought-after, and well-compensated, skill.

What SOA Can Learn from Cloud Computing and Vice Versa | David Linthicum

Linthicum argues that SOA can learn service design and expandability from cloud computing, and that cloud computing can learn governance and architecture-driven practice from SOA. On service design: "Those who deploy services in the cloud, such as Amazon, TheWebService, Force.com, have done a pretty good job with service design. You really have to do a good job in order to rent the darn things out. Many SOA projects have a tendency to build services that are too coarse-grained, too fine-grained, or just not at all well designed. The reality is that services that are not well defined and designed won’t sell well when delivered on-demand, and thus those who provide services out of the cloud – which are most major cloud computing providers – have to spend a lot of time on the design of the services, including usability and durability. I urge those who build services within their SOA, no matter the enabling technology and standards involved, to look at what’s out there for rent as good examples of how services should be designed, developed, and deployed."
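To make the granularity point concrete, here is a minimal Python sketch of my own (the service names are invented, not Linthicum’s): a too-fine-grained service forces chatty call sequences, while a business-scoped operation returns a coherent result in one call, which is the style the rentable cloud services tend to expose.

    # Hypothetical illustration of service granularity (names invented).

    # Too fine-grained: every attribute is a separate remote call, so a
    # simple "show customer" page costs four network round trips.
    class FineGrainedCustomerService:
        def get_name(self, customer_id): ...
        def get_address(self, customer_id): ...
        def get_balance(self, customer_id): ...
        def get_status(self, customer_id): ...

    # Business-scoped: one meaningful operation returns a coherent
    # document in a single call.
    class CustomerProfileService:
        def get_profile(self, customer_id):
            """Return name, address, balance and status together."""
            return {
                "name": "...",
                "address": "...",
                "balance": 0.0,
                "status": "active",
            }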

IBM, Sun and cloud computing | Gathering clouds | The Economist

"The economic crisis has pummelled Sun, which never really recovered from the dotcom bust. As its share price plumbed new lows, IBM’s remained relatively unscathed—a reflection of its business, which has been protected by the computer giant’s global scope and the fact that it makes most of its money from software and services. In the months to come, more big fish will seek to swallow smaller fry. That is because something deeper is going on in the computer industry. Thanks to ever more powerful chips and new software, servers and other hardware can now be “virtualised”, meaning physically separate systems can act as one. This enables computing power to become a utility: it is generated somewhere on the network (“in the cloud”) and supplied as a service. To simplify their complex data centres and cut costs, more and more companies are thinking about building in-house computing utilities, called “private clouds”, or outsourcing computing to “public clouds” of the kind Sun launched…"

James Governor’s Monkchips » Amazon Web Services: an instance of weakness as strength

"Amazon isn’t the de facto standard cloud services provider because it is complex – it is the leader because the company understands simplicity at a deep level, and minimum progress to declare victory. Competitors should take note – by the time you have established a once and future Fabric infrastructure Amazon is going to have created a billion dollar market. And what then? It will start offering more and more compelling fabric calls… People will start relying on things like SimpleDB and Simple Queue Service. Will that mean less portability? Sure it will…"
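To make the portability trade-off concrete, here is a minimal sketch of my own using boto3, the current AWS SDK for Python (which postdates this post); the queue URL and message body are placeholders. Every direct Simple Queue Service call is a point of reliance that has to be rewritten to move off Amazon.

    # Minimal sketch of reliance on Amazon's Simple Queue Service (boto3).
    # The queue URL is a placeholder; credentials come from the environment.
    import boto3

    sqs = boto3.client("sqs", region_name="us-east-1")
    queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/orders"

    # Each of these calls is Amazon-specific; portability means either
    # rewriting every call site or wrapping them behind your own queue
    # abstraction up front.
    sqs.send_message(QueueUrl=queue_url, MessageBody='{"order_id": 42}')

    resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
    for msg in resp.get("Messages", []):
        print(msg["Body"])
        sqs.delete_message(QueueUrl=queue_url,
                           ReceiptHandle=msg["ReceiptHandle"])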

10 Must-Know Topics For Software Architects In 2009

"after quite a lull, the software architecture business has gotten rather exciting again…The hegemony of traditional 3 and 4-tier application models, heavyweight run-time platforms, and classical soa that has dominated for about a decade is now literally being torn asunder by a raft of new approaches for designing and architecting applications…incautious words but major changes are in the air and architects are reaching out for new solutions as they encounter novel new challenges in the field…these new advances either address increasingly well-understood shortcomings of existing approaches or add new capabilities that we haven’t generally focused on before…Mainstays of application architecture such as the relational database model, monolithic run-times, and even deterministic behavior are being challenged by non-relational systems, cloud computing, and new pull-based systems where consistency and even data integrity sometimes take a backseat to uptime and performance."

Corner Office – The Manager of Change at Xerox – Question – NYTimes.com

"Q. Do you find yourself looking for certain qualities in a candidate more than you did several years ago?

A. Adaptability and flexibility. One of the things that is mind-boggling right now is how much we have to change all the time. For anybody who’s into comfort and structure, it gets harder and harder to feel satisfied in the company. It’s almost like you have to embrace a lot of ambiguity and be adaptable and not get into the rigidness or expectation-setting that I think there used to be 10 years ago, when you could kind of plot it out and define where you were going to go. I think it’s a lot more fluid right now. It has to be. The people who really do the best are those who actually sense it, enjoy it almost, that lack of definition around their roles and what they can contribute."

Filed Under: cloud computing, enterprise architecture, leadership, links, trends

Revisiting “Where will the services come from?”

March 19, 2009 By brenda michelson

[What follows is my post first published on SOA Consortium Insights]

In February 2007, as we were starting up the SOA Consortium, I facilitated a series of invitation-only, vendor-free SOA Executive Summits with leading CIOs and CTOs representing Fortune 1000 corporations, major government agencies and non-governmental organizations. The purpose of the Summits was two-fold: first, to validate, augment or contradict the mission, vision, strategies and tactics of the newly formed SOA Consortium; second, to conduct a roundtable discussion on real-world Service Oriented Architecture (SOA) implementation opportunities and challenges with advanced SOA adopters.

One of the questions I asked during the roundtable was “Where will the services come from?”  As documented in our Executive Insights Paper (pdf), this simple question generated an interesting conversation on the future of applications:

“Three of the CIOs attending the San Francisco Summit offered related, yet varying points of view:

“We’re going to completely change the IT execution model, it’ll be based on SOA principles…I’m not going to build any more monolithic applications. I don’t even want to buy any more monolithic applications. I want to use SOA to de-customize the ones I have spent the last 8 years customizing.”

“SOA fundamentally enables a change in the marketplace. The way we buy software today is changing and we are not going to buy software in the future. We are going to subscribe to services and you are going to deploy those services to develop and deploy your next generation applications.”

“Are we going to have software as a service? Yes. Is it a major part of our SOA? No. We’ll pick and choose things.”
– CIOs on future of applications

What became apparent during this conversation is services will come from a variety of providers – internal builds, service-oriented application packages, software bundles, service marketplace subscriptions, and open source. CIOs believe the challenges of this “mix and match” environment will be service certification, business level interoperability and quality-of-service.”

Two years later, inspired by a conversation with a cloud computing software provider, I revisited the “where will the services come from” question with members of our community of practice.  However, instead of focusing on where they might “get services”, I asked whether their organizations are currently “offering services via the cloud”, or considering it, to augment interactions with business partners and existing customers, and/or as a means to generate new revenue streams.

Without giving away the secrets of our members, here are five key insights from the conversation:

1. Eventually, at some level, everyone is going to be a provider to the cloud.  The offering might be as simple as an information or business service, or it might be an advanced scientific algorithm service that requires dedicated data storage and compute cycles.

2. In developing cloud offerings, organizations are extremely mindful of “the business they are in”. If a potential service offering supports the core business without risking competitive advantage or intellectual property, then creating that offering, whether to support an existing business relationship or to generate new revenue, will be pursued.

3. Following the “sticking with our core business” line of thinking, many of these organizations are considering engaging with third parties to host and manage any service offering. No members seem compelled to neglect their core business to become infrastructure-as-a-service providers.

4. With respect to defining potential service offerings and associated pricing models (per service call, per business transaction), the “product leadership” is coming from the business organization, rather than IT. However, in many cases, it is still incumbent upon IT to explain the potential of offering an external service and, of course, how that ties to a service-oriented approach.

5. This wasn’t a theoretical conversation amongst architects; call participants spoke of current internal discussions, studies and proofs-of-concept underway exploring cloud computing and offering services externally.

So, where will the services come from? Everywhere and everyone.

 

[Disclosure: The SOA Consortium is a client of my company, Elemental Links]

Filed Under: cloud computing, services architecture, soa, trends

Software from Walmart? Water from IBM? Giants, adjacent markets & tech providers

March 17, 2009 By brenda michelson

This week, the talk will be about Cisco boldly entering the blade server market and the end of co-opetition as we know it:

“Cisco’s chief technology officer, Padmasree Warrior, says the company has moved boldly in the past, and suggests the old rules are changing. “We’re going to compete with H-P. I don’t want to sugarcoat that,” she says. “There is bound to be change in the landscape of who you compete with and who you partner with.”

Battles are breaking out across the industry. Within the past year or so, H-P has fueled a new rivalry with IBM in tech outsourcing by buying services giant Electronic Data Systems Inc. Microsoft set its sights on Internet-search giant Google Inc. by attempting to buy Yahoo Inc. Sun Microsystems Inc. is moving beyond its core market in servers and software to take on database-software leader Oracle Corp. Later this month, Dell Inc. says it plans to introduce new data-center management software that will compete with existing offerings by H-P, IBM and others.”

Of course, that is interesting and important news. And I completely get the drivers of wanting to win in the data center and the convergence of data center technology – compute, networking and storage.

However, I find myself interested in what I’d categorize as ‘recessionary moves of giants with cash, guts and (relatively) decent stock prices’. Namely, recent adjacent market moves by Walmart and IBM.

On March 12, the WSJ reported that Walmart will start selling electronic medical records software, installation and maintenance, to single physicians and medical practices via Sam’s Club:

“Wal-Mart said it is forming a partnership with computer maker Dell Inc. and closely held software maker eClinicalWorks to offer a lower-priced medical records system, plus installation and maintenance, through its Sam’s Club membership warehouses. Sam’s Club would be the one-stop contact for any physician follow-up questions about the system. The questions would then be routed to the appropriate person at Dell or eClinicalWorks.

“Whether it is a single physician or a physician’s group who comes to us, we can offer a system that enables them to electronically prescribe medication, set appointments, track billings and keep records,” said Gregg Rossiter, a spokesman for Wal-Mart.

The system, expected to be available at the clubs in the spring, will cost $25,000 for the first installed system, and $10,000 for each additional system, plus $4,000 to $5,000 a year in maintenance costs.

The more complex systems cost about $40,000 for the first installation in a small physician group, said Kent Gale, founder of Klas Enterprises LLC, a research company for health-care technology.”

As I shared via Twitter, this is an installed offering, not a cloud offering like the patient-oriented Google Health. That tweet led into an interesting discussion on compliance in the cloud, which is exactly why companies such as Sonoa Systems exist. But I digress.

In what’s nothing more than conjecture, I’d speculate that Walmart’s investment in EMR software is related to its in-store health clinic initiative. If so, it’s a good way to capitalize on an internal investment — something more organizations should consider. If not, then it’s possibly a harbinger of software offerings to come.

On March 13, the WSJ (amongst others) reported that IBM is “embarking on a new business venture in which it will help manage water resources, an attempt by the technology giant to further expand its footprint outside traditional computer services.”

“The new business…will design and install systems of sensors and back-end software to monitor water pipes, reservoirs, rivers and harbors, according to Sharon Nunes, who heads the Big Green venture.

IBM has been touting its ability to help create a “smarter planet” by designing systems to monitor physical world activities such as electricity flows and traffic patterns. “There’s a lot of stress on water systems around the world. With a limited supply, you’d better be able to manage it,” said Ms. Nunes. She estimates that information technology for water management could become a $20 billion market.”

Sure, you could say that IBM selling IT for water management is business as usual, but the truly interesting part of this announcement is the research development:

“In a related development, IBM researchers said they have created a new desalination-membrane technology that goes beyond current systems and removes arsenic and boron salts from contaminated ground water, making it safe for humans. Desalination membranes filter out salts, allowing clean water to pass through.

Robert Allen, a chemist at IBM’s Almaden, Calif., lab said that his team found a way to put a polymer designed for immersive lithography — a technique for making semiconductors — into membranes that reject the toxic salts. He said arsenic contamination is a problem in some water supplies in Texas, Turkey, Bangladesh and China. IBM expects to license the technology rather than make desalination plants itself.”

For a quick overview of the breakthrough, check out this YouTube video.

This move comes days after CEO Samuel J. Palmisano addressed, in the chairman’s letter of IBM’s 2008 annual report (pdf), the economic climate:

“We’re not looking back, we’re looking ahead. We’re continuing to invest in R&D, in strategic acquisitions, in growth initiatives—and most importantly, during these difficult times, in our people.

In other words, we will not simply ride out the storm. Rather, we will take a long-term view, and go on offense. Throughout our history, during periods of disruption and global change, this is what IBM has done. Again and again, we have played a leadership role. We have imagined what the world might be, and actually built it.

We find ourselves at such a moment now. This is an inflection point—both in the course of modern technology and economic history, and in the nearly 100-year journey of IBM. As someone who has been here for more than a third of that journey, I can tell you that it presents the best opportunity I have seen in my IBM career to align those two trajectories in very powerful ways.”

What else does this “inflection point” have in store for us? Who will our primary IT providers be? And will IT be their primary business? Curse or not, we do live in interesting times…

Illustration: Since the market collapse, Walmart & IBM have consistently outperformed the Dow.

[Disclosure: None of the companies mentioned in this post are direct clients of my company, Elemental Links. However, Cisco, IBM, HP & Sun are sponsors of the SOA Consortium, which is a client of Elemental Links.]

Filed Under: business, cloud computing, economy, sustainability, trends

Getting Intentional with my Cloud Watching: Cloud Computing Expo, NY March 30 & 31

March 16, 2009 By brenda michelson

Last month, I spoke of how, despite my best intentions, I found myself cloud watching, and shared my cloud watching plan:

1.  Cloud watch with the lens of an enterprise architect type practicing business-driven architecture.
2. Share information and observations on selected (#1) cloud computing developments and activities.
3. Highlight interesting, relevant (#1) works of the cloud computing community — providers, consumers and consortia.
4. Add to the conversation, but don't engage in "yet another" syndrome. (No "What is Cloud Computing" piece from me.)

Since then, I’ve had some interesting conversations with cloud computing providers, consumers and some individuals who fall in both categories.  I imagine the latter will become quite common.

Next on my agenda is a trip to the Cloud Computing Expo in NY.  I’ll be attending Monday and Tuesday, and wifi willing, plan to do some live blogging.

If you are planning to attend the Cloud Computing Expo and want to connect, or bribe me to cover your session (strong black coffee please), drop me an email, or ping me on Twitter.

In the meantime, I’ll be preparing for and running a “SOA All-star” SOA Consortium meeting in DC.

Filed Under: circuit, cloud computing

Economist Tech Quarterly: Fueling your morning and commute with coffee

March 11, 2009 By brenda michelson

As you probably gathered from the title, although this post is technology related, it is definitely off-topic.  However, I found the article interesting, and thought others might as well.  Plus, my brother (an engineering geek) stops by occasionally, so if nothing else, this one is a reward for slogging through posts littered with the “tech acronym du jour”.

And yes, I did the math.  One gallon of coffee-ground-derived biodiesel requires about 50 lbs of spent grounds, the residue of approximately 2,250 cups of coffee.  So, if you see a major uptick in my writing output, accompanied by jittery speech, you know I’m doing my part in the “beaning of America”.
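For the curious, here is that back-of-the-envelope arithmetic as a small Python sketch. The 5-7 kg of grounds per litre figure comes from the Economist excerpt below; the roughly 10 g of grounds per cup is my own assumption.

    # Back-of-the-envelope check of the coffee-to-biodiesel math.
    KG_PER_LITRE = 6.0           # midpoint of the article's 5-7 kg/litre
    LITRES_PER_GALLON = 3.785
    LBS_PER_KG = 2.2046
    GROUNDS_PER_CUP_KG = 0.010   # assumption: ~10 g of grounds per cup

    grounds_kg = KG_PER_LITRE * LITRES_PER_GALLON    # ~22.7 kg per gallon
    grounds_lbs = grounds_kg * LBS_PER_KG            # ~50 lbs per gallon
    cups = grounds_kg / GROUNDS_PER_CUP_KG           # ~2,270 cups per gallon

    print(f"{grounds_lbs:.0f} lbs of grounds per gallon")   # ~50
    print(f"{cups:,.0f} cups of coffee per gallon")         # ~2,271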

Without (even) further ado, what follows are excerpts from Fuelled by coffee, in the most recent Economist Technology Quarterly.

The basics:

“In the case of coffee, the biodiesel is made from the leftover grounds, which would otherwise be thrown away or used as compost. Narasimharao Kondamudi, Susanta Mohapatra and Manoranjan Misra of the University of Nevada at Reno have found that coffee grounds can yield 10-15% of biodiesel by weight relatively easily. And when burned in an engine the fuel does not have an offensive smell—just a whiff of coffee. (Some biodiesels made from used cooking-oil produce exhaust that smells like a fast-food joint.) And after the diesel has been extracted, the coffee grounds can still be used for compost.”

The accidental discovery:

“The researchers’ work began two years ago when Dr Misra, a heavy coffee drinker, left a cup unfinished and noticed the next day that the coffee was covered by a film of oil. Since he was investigating biofuels, he enlisted his colleagues to look at coffee’s potential.”

Advantages beyond aroma:

“The researchers found that coffee biodiesel is comparable to the best biodiesels on the market. But unlike biodiesels based on soya or other plants, it does not divert crops or land from food production into fuel production.

A further advantage is that unmodified oils from plants, like the peanut oil used by Diesel in the 19th century, have high viscosity and require engine alterations. Diesel derived from coffee is less thick and can usually be burned in an engine with little or no tinkering.”

The math (why we won’t be doing this at home):

“Although some people make their own diesel at home from leftovers and recycled cooking oil, coffee-based biodiesel seems better suited to larger-scale processes. Dr Misra says that a litre of biodiesel requires 5-7kg of coffee grounds, depending on the oil content of the coffee in question. In their laboratory his team has set up a one-gallon-a-day production facility, which uses between 19kg and 26kg of coffee grounds. The biofuel should cost about $1 per gallon to make in a medium-sized installation, the researchers estimate.

Commercial production could be carried out by a company that collected coffee grounds from big coffee-chains and cafeterias. There is plenty available: according to a report by the United States Department of Agriculture, more than 7m tonnes of coffee are consumed every year, which the researchers estimate could produce some 340m gallons of biodiesel.”
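The researchers’ national-scale estimate is consistent with their lab figures, too. A quick check of my own, treating the 7m tonnes of coffee consumed per year as recoverable grounds and using the 19-26 kg of grounds the one-gallon-a-day facility consumes:

    # Rough check of the 340m-gallon estimate from the quoted figures.
    kg_grounds = 7_000_000 * 1_000   # 7m tonnes of coffee, as grounds

    for kg_per_gallon in (19, 26):
        gallons = kg_grounds / kg_per_gallon
        print(f"{kg_per_gallon} kg/gallon -> {gallons / 1e6:.0f}m gallons")
    # 19 kg/gallon -> 368m gallons; 26 kg/gallon -> 269m gallons,
    # which brackets the researchers' 340m-gallon estimate.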

If you found this interesting, check out the full article and the reader comments.

Filed Under: innovation, sustainability
