It wasn’t that long ago that the typical (and perhaps only) “sensor” use case for event processing was RFID. I certainly used that example — TVs escaping the warehouse — if I recall correctly. Oh, and more recently, the interesting but non-mainstream case of cow containment.
Well, that’s all drastically changing with the emergence of “smart” public infrastructure — roads, bridges, grids, waterways and the like — accompanied by a hefty “shovel ready” stimulus package. (Shovels to build the infrastructure, not for processing the events!)
“Imagine highways that alert motorists of a traffic jam before it forms. Or bridges that report when they’re at risk of collapse. Or an electric grid that fixes itself when blackouts hit.
This vision — known as “smart” infrastructure — promises to make the nation more productive and competitive, while helping the environment and saving lives. Not to mention saving money by making what we’ve got work better and break down less often.”
As I was reading the “Smart Roads. Smart Bridges. Smart Grids.” feature in the WSJ, I couldn’t help but notice all the event processing scenarios and patterns. As you would expect, all the scenarios start with sensors and instrumentation and involve event (instrumentation) transmission, detection and processing. The order, frequency and end actions of transmission, detection and processing vary by scenario.
What follows are excerpts from the article in the domains of smart transportation and grids and my associated event processing observations. The emphasis is mine.
The article describes the smartening of transportation as evolving from data collection and visibility all the way to “the ultimate science-fiction vision — roadways that control vehicles and make “driving” unnecessary — isn’t that far in the future. Mr. Lamba says IBM is in discussions with a small city to build a completely automated transportation system that would include 3,000 remote-controlled vehicles. The company won’t identify the city or give any other details.”
Being a pragmatist by nature, I’ll leave the science fiction to others, and stick with the nearer term:
“One promising avenue: real-time information about road conditions, traffic jams and other events. People can increasingly find that data on the Web with services such as Google Maps. But the next generation of technologies promises to get that news — and even more detailed information — directly to drivers in their cars. Armed with that information, drivers can make better decisions about which routes to take — which can have a big effect on traffic.”
Instrumentation for tracking, simple calculation and real-time visibility — roadside “dashboards” for decision-makers (drivers)
“The first step is collecting better data about traffic flows. The California Department of Transportation, or Caltrans, has installed radio receivers along several freeways in the San Francisco Bay area that read the electronic toll tags in passing cars. Using that information, Caltrans can track the speed of individual vehicles and determine the travel time from one point to another. Then those times are posted on electronic road signs. (Caltrans officials say they don’t keep track of personally identifiable information from the tags, to protect privacy.)
Eventually, the data from the roadside sensors could help traffic controllers guide drivers to other travel alternatives: Is a bus or a train faster than the freeway? To that end, Caltrans and the Bay area’s Metropolitan Transportation Commission are testing three electronic signs south of San Francisco. Along with freeway travel times, the signs show scheduled travel and arrival times on Caltrain. Drivers can see if they’d be better off getting out of heavy traffic, heading to a station and catching a train.
In the future, planners intend to show real-time train travel and arrival times, as well as the number of available parking places at the nearest station.”
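The Caltrans scenario above is a classic event-correlation pattern: match tag-read events from two receivers, compute a per-vehicle travel time, and post an aggregate on the sign. Here is a minimal sketch of that pattern; the event shape, receiver names and data are my own illustration, not anything Caltrans has published:

```python
from dataclasses import dataclass

@dataclass
class TagRead:
    tag_id: str       # anonymized toll-tag identifier
    receiver: str     # roadside receiver that saw the tag
    timestamp: float  # seconds

def segment_travel_times(reads, start="R1", end="R2"):
    """Correlate tag reads at two receivers into per-vehicle travel times."""
    entered = {}  # tag_id -> time first seen at the start receiver
    times = []
    for r in sorted(reads, key=lambda r: r.timestamp):
        if r.receiver == start:
            entered[r.tag_id] = r.timestamp
        elif r.receiver == end and r.tag_id in entered:
            times.append(r.timestamp - entered.pop(r.tag_id))
    return times

reads = [
    TagRead("a", "R1", 0), TagRead("b", "R1", 10),
    TagRead("a", "R2", 300), TagRead("b", "R2", 370),
]
times = segment_travel_times(reads)
avg = sum(times) / len(times)  # the number posted on the road sign
```

Note that only tag/receiver/time pairs are needed, which is consistent with Caltrans’ claim about not retaining personally identifiable information.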
Adding prediction and real-time actions — adjusting traffic signals
“Another way to ease congestion is to predict traffic jams before they form. IBM has developed software that can examine current traffic patterns and foresee congestion up to 45 minutes ahead. The system, being tested in Singapore, has proved to be about 90% accurate in predicting the volume and speed of drivers in the central business district. The information is then used to adjust 1,700 sets of traffic lights to smooth the flow of traffic.
“We say that real time is too late,” says Naveen Lamba, leader of IBM’s global intelligent-transportation efforts. “You have to see into the future to minimize the impact of what’s going to happen.”
[Aside: “Seeing the future”, or in their terms “Querying the Future”, is the mantra of SQLStream, a real-time business intelligence player with a new streaming engine built on the SQL:2003 standard.]
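The article says nothing about how IBM’s model works, of course. But the shape of “querying the future” is easy to show: take a window of recent traffic measurements and extrapolate it forward. A least-squares trend line, as below, is a deliberately crude stand-in for the far richer models a real system would use:

```python
def predict_volume(samples, horizon):
    """Extrapolate recent (minute, vehicles_per_minute) samples
    'horizon' minutes ahead with a least-squares trend line."""
    n = len(samples)
    xs = [t for t, _ in samples]
    ys = [v for _, v in samples]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my + slope * (xs[-1] + horizon - mx)

# Volume climbing 2 vehicles/min each minute; look 45 minutes ahead,
# the same horizon the Singapore system targets.
history = [(t, 100 + 2 * t) for t in range(10)]
forecast = predict_volume(history, 45)  # basis for retiming the signals
```

The interesting part is downstream: the forecast, not the current reading, is what drives the action (retiming 1,700 sets of lights), which is exactly the “real time is too late” point.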
Instrumentation for exceptions (accidents), correlation (time, location, severity), notification and dispatch (responders)
“Some researchers are attacking another source of traffic backups: accidents. Trimming the time it takes to clear the roadway after a crash would help ease congestion. Reducing the number of accidents would be even better — lowering injuries and fatalities, as well as costs associated with accidents.
Enter a concept called vehicle infrastructure integration, or VII. These systems would let roads, traffic signals and vehicles talk to each other, and share crucial information automatically, by using a range of technologies — GPS navigation, wireless communications, advanced sensors and onboard computers. For instance, a car in an accident could send out an automatic message about the time, location and severity of the crash to receivers along the roadside, which would then automatically dispatch emergency vehicles.”
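The VII crash message is the purest event-processing example in the article: a single event carrying time, location and severity, with dispatch as the automated end action. A toy sketch; the severity scale and dispatch thresholds are entirely my invention:

```python
from dataclasses import dataclass

@dataclass
class CrashEvent:
    time: float
    location: str
    severity: int  # 1 (minor) .. 5 (critical); scale is illustrative

def dispatch(event):
    """Map a crash event to responder units. Thresholds are made up."""
    units = ["tow_truck"]
    if event.severity >= 2:
        units.append("police")
    if event.severity >= 4:
        units.append("ambulance")
    return {"units": units, "where": event.location, "when": event.time}

order = dispatch(CrashEvent(time=1234.0, location="I-280 mile 12", severity=4))
```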
Bringing the ‘dashboard’ to the driver’s dashboard
“Caltrans is testing a left-turn signal that flashes a big red arrow with a slash through it when it detects a vehicle is approaching rapidly from the opposite direction. For now, the arrow flashes on the signal itself; the goal is to have the warning beamed directly into the turning vehicles.”
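The left-turn signal is detection plus notification compressed into one hop: estimate whether the oncoming vehicle will arrive before a turning car could clear the intersection, and warn if so. A sketch with an invented clearance window:

```python
def warn_left_turn(oncoming_speed_mps, distance_m, clearance_s=6.0):
    """Flash the no-left-turn warning when the oncoming vehicle would
    reach the intersection before a turning car could clear it.
    The 6-second clearance window is an assumed parameter."""
    if oncoming_speed_mps <= 0:
        return False
    return distance_m / oncoming_speed_mps < clearance_s

fast = warn_left_turn(25.0, 100.0)  # 4 s away: warn
slow = warn_left_turn(10.0, 100.0)  # 10 s away: no warning
```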
“The U.S. electricity grid is arguably one of the most important technological achievements of the 20th century. And yet, it’s pretty dumb. Power flows one way, but the utility gets back very little information about how it’s being used. And the grid is poorly set up to handle power coming in from alternative-energy sources, such as wind farms.”
Instrumentation for tracking, simple calculation and real-time visibility — “dashboards” for decision-makers (providers and consumers)
“The first step is installing advanced electric meters that send a steady stream of information back to the utility. They make it possible to read meters remotely and to determine more precisely the location of power outages. And they can give customers a more detailed view of their electricity use.”
Adding “what-if” capabilities to the consumer’s dashboard
“Beginning next month, Houston-based CenterPoint Energy Inc. is preparing to install more than two million smart meters over five years. During a two-year test of the technology, consumers were able to call up a Web portal showing the energy consumption of the home’s major appliances. Consumers also could calculate energy bills in different situations: What would be the effect of keeping the house at 75 degrees in the summer instead of 65? What adjustments would be necessary to keep summer electric bills under $200?”
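Both of CenterPoint’s what-if questions reduce to running a usage model forward (what does 75 vs. 65 degrees cost?) and inverting it (what setting keeps the bill under $200?). A sketch with invented coefficients, since the portal’s actual model isn’t described:

```python
def monthly_bill(setpoint_f, rate_per_kwh=0.12, base_kwh=600, kwh_per_degree=45):
    """Estimate a summer bill: cooling load grows as the thermostat is
    set further below 78F. All coefficients are invented."""
    cooling_kwh = max(0, 78 - setpoint_f) * kwh_per_degree
    return (base_kwh + cooling_kwh) * rate_per_kwh

def coolest_setting_under(budget):
    """Invert the model: coolest thermostat setting within budget."""
    for setpoint in range(60, 79):
        if monthly_bill(setpoint) <= budget:
            return setpoint
    return None

saving = monthly_bill(65) - monthly_bill(75)  # cost of the cooler house
setting = coolest_setting_under(150)
```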
Instrumentation for exceptions (outages), notification and near real-time response
“A smart grid would even be able to partly heal itself. Today, when a storm drops a tree branch on a power line, utilities typically have to rely on customer calls to locate the damage and assess the scope of the outage. CenterPoint is testing special sensors and switches that sit alongside power lines and detect sudden changes in the amount of current through the wire. The utility then can quickly route power around the break, restoring electricity within seconds to a large part of the blacked-out area and limiting the number of households affected.”
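That self-healing loop is detect-then-act: spot the segment where line current suddenly collapses, isolate it, and feed the downstream segments from another path. A toy sketch over invented readings (real fault detection and switching is vastly more involved):

```python
def locate_fault(currents, drop_fraction=0.8):
    """Given ordered line segments and their current readings (amps),
    flag the first segment whose current collapses relative to the
    previous one: a crude stand-in for real fault detection."""
    segments = list(currents)
    for prev, cur in zip(segments, segments[1:]):
        if currents[cur] < currents[prev] * (1 - drop_fraction):
            return cur
    return None

def reroute(segments, faulty):
    """Isolate the faulty segment; feed everything downstream of it
    from a backup path, restoring most of the blacked-out area."""
    plan, downstream = {}, False
    for s in segments:
        if s == faulty:
            plan[s] = "isolated"
            downstream = True
        else:
            plan[s] = "backup" if downstream else "primary"
    return plan

readings = {"S1": 100.0, "S2": 98.0, "S3": 2.0, "S4": 1.0}  # tree on S3
fault = locate_fault(readings)
plan = reroute(readings, fault)
```

Only the customers on the isolated segment stay dark, which is the article’s “limiting the number of households affected.”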
If you lived here in Maine, that last one would be particularly enticing. Camping in your den loses its allure by the 10th day.
Of course, the smartening of our infrastructure also requires innovation in sensor technologies. While not “event processing” per se, I found the following from the smart-bridge section interesting:
“The University of Texas at Austin received a grant to create wireless networks of sensors to monitor cracks in existing bridges where the failure of a single piece could bring down the entire structure. Because getting power to the sensors can be a problem, the group is studying how to use the vibrations of the bridge to generate electricity for the devices. It’s also working on devices with enough computing power to analyze the stream of data and send alerts when potentially serious damage occurs.
Looking beyond traditional sensor technologies, another grant went to a group led by the University of Michigan. The group is developing smart materials that can be built into or applied to key bridge components to detect and measure changes. For instance, researchers are working on a sensing “skin” that can carry electrical or magnetic signals to give a two-dimensional picture of how the structure responds to different stresses, says Jerome Lynch, an assistant professor at the university.”
If these excerpts intrigue you, I highly recommend reading the full article. If you are thinking, “interesting, but we aren’t in the infrastructure building, monitoring or use business,” then I ask you to think about the portions of your business that could benefit from instrumentation, monitoring and rapid response, in order to mitigate risk, seize opportunity and/or help you intelligently stumble on the future.