I was delighted to read this month’s Milestone column in the Ann Arbor Chronicle. Not only because it features the elmcity calendar syndication service, but also because the Chronicle’s editor, Dave Askins, connects the dots to a larger vision of community information management based on syndication of authoritative sources. Dave makes a seemingly unlikely comparison between the syndication of calendars and of crime reports. He traces a story that was reported by one publication, rewritten and retransmitted by others, and then revised by the original source in a way that wasn’t echoed by the secondaries.
The problem with the approach those organizations take to reporting the “spot news” of crime incidents is that they disconnect the information from its single, authoritative source. And as a result, any update to their original reports would need to be undertaken manually — that is, someone would need to think to do it.
Yes, exactly! Here’s the same thing in the calendar domain. The Knights Chess Club in Keene, NH, meets on Monday evenings. The venue used to be the Best Western hotel. A couple of years ago, the chess club posted that information on its website and also relayed it to our local newspaper, the Keene Sentinel. Sometime later, acting as a proxy for the chess club, I added the same info to one of the calendar feeds that flows into the Keene hub. Then I noticed the event had moved from the Best Western to the E.F. Lane, so I adjusted the event accordingly. Months later, I noticed the listing in the Sentinel. You can guess the punchline: the event was still reported to be at the Best Western! (It has since moved to Langdon Place.)
In the world I imagine and am trying to bootstrap, the chess club itself is the authoritative source for this information. The Sentinel syndicates it from the chess club, as does the chamber of commerce, and the Monadnock Shopper, and What’s Up in the Valley, and any other attention hub that cares about the chess club. When the club updates its info at the source, everybody downstream gets refreshed automatically. Attention hubs compete not by trying to capture the info exclusively, but by “amplifying the signal,” as Dave Askins so nicely puts it, in ways appropriate to their unique editorial missions and capabilities.
Here’s an architectural view of what I have in mind:
Monadnock Arts Alive is a real organization chartered to advance arts and culture in the Monadnock region. It runs an instance of an elmcity hub into which flow calendar feeds from local arts organizations, including the ones shown here.
Monadnock Arts Alive has relationships with the Mariposa Museum, the Sharon Arts Center, the Monadnock Folklore Society, and a few dozen other local arts and culture organizations. It’s appropriate for Arts Alive to merge their calendars into a view that brands them as a collection and “amplifies the signal.” Arts Alive can then retransmit that signal to one or more attention hubs that can leverage the editorial work done by Arts Alive — that is, gathering a set of feeds that represent the local arts scene, and working with sources to refine those feeds.
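The merge step a hub like Arts Alive performs can be sketched very simply. Here’s a minimal illustration in Python, assuming the feeds are plain iCalendar text; the feed contents, the `extract_events` and `merge_feeds` helpers, and the calendar name are all hypothetical stand-ins, since a real hub would fetch live feeds over HTTP and handle far more of the iCalendar format.

```python
# Minimal sketch of a hub's merge step: combine VEVENTs from several
# iCalendar feeds into one branded VCALENDAR. Feed contents are
# inlined for illustration; a real hub would fetch them over HTTP.

def extract_events(ics_text):
    """Return the VEVENT blocks found in an iCalendar document."""
    events, current, inside = [], [], False
    for line in ics_text.splitlines():
        if line.strip() == "BEGIN:VEVENT":
            inside, current = True, [line]
        elif line.strip() == "END:VEVENT":
            current.append(line)
            events.append("\n".join(current))
            inside = False
        elif inside:
            current.append(line)
    return events

def merge_feeds(feeds, calname):
    """Wrap all events from all feeds in a single named VCALENDAR."""
    header = ["BEGIN:VCALENDAR", "VERSION:2.0",
              "X-WR-CALNAME:" + calname]
    body = [ev for feed in feeds for ev in extract_events(feed)]
    return "\n".join(header + body + ["END:VCALENDAR"])

# Two hypothetical source feeds.
mariposa = """BEGIN:VCALENDAR
BEGIN:VEVENT
SUMMARY:Gallery talk
DTSTART:20110301T190000
END:VEVENT
END:VCALENDAR"""

folklore = """BEGIN:VCALENDAR
BEGIN:VEVENT
SUMMARY:Contra dance
DTSTART:20110305T200000
END:VEVENT
END:VCALENDAR"""

combined = merge_feeds([mariposa, folklore], "Monadnock Arts Alive")
print(combined)
```

The point of the sketch: the combined feed is just another feed. Downstream hubs can subscribe to it exactly as they would to any single source, which is what makes the network composable.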
Attention hubs aren’t restricted by Arts Alive’s choices, though. The Sentinel, the Chamber, or What’s Up in the Valley can use Arts Alive’s combined feed if it suits them. Alternatively they can create their own views that merge some of the feeds on Arts Alive’s list with other feeds not on that list.
What emerges, in theory if not yet in practice, is a pool of sources underlying a network of hubs. In this network sources are always authoritative for their data, just as intermediaries are always authoritative for the views of those sources they present. What’s more, all intermediaries can be bypassed. If as an individual I care a lot about a particular source, say the Monadnock Folklore Society, I can subscribe to that calendar directly on my desktop or phone. Similarly, if the Chamber of Commerce has a different idea than Arts Alive about which set of feeds best represents the local arts scene, it can go direct to those sources and synthesize its own view.
It’s a very general model. We can, for example, apply it to Dave Askins’ crime reporting example. Police reports aren’t, after all, the only possible authoritative basis for crime reporting. Citizens are another. Major incidents provoke online discussion. That discussion can be aggregated by emergent tags. And it can be filtered by whitelisting particular blogs, Twitter accounts, or other sources according to their reputations. Who establishes those reputations? Attention hubs whose editorial choices define views of reality that subscribers either will or won’t find useful. If you find that an attention hub usefully aggregates citizen chatter you may decide to peer through that lens. If not you can go direct to the sources which, in this model, will always be transparently cataloged by intermediaries.
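The whitelist-and-filter idea can be sketched in a few lines. This is only an illustration of the filtering step, assuming items already carry a source label; the source names and items here are made up, not real feeds.

```python
# Sketch of an attention hub's whitelist filter: keep only items
# whose source is on the hub's curated list. Sources and items
# are hypothetical.

whitelist = {"local_chronicle", "trusted_blogger"}

items = [
    {"source": "local_chronicle", "text": "Update on the incident"},
    {"source": "random_account",  "text": "Unverified rumor"},
    {"source": "trusted_blogger", "text": "Eyewitness report"},
]

# The hub's editorial choice is encoded in the whitelist; the
# filter itself is trivial.
curated = [item for item in items if item["source"] in whitelist]

for item in curated:
    print(item["source"], "-", item["text"])
```

Different hubs applying different whitelists to the same pool of sources is exactly the competition-by-editorial-choice described above.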
I’m working the calendar angle because I see it as a way to get a wide variety of people and organizations engaged with this model. But my hope is that they’ll be able to generalize from it, and apply it creatively in other domains.
2 thoughts on “A general model for community information management”
Is the aggregated page really XML under the covers, or is it HTML? I’m curious, because I believe XML (including RSS/Atom feeds) and XSLT are very powerful for repurposing data on the web – although of course XSLT can rework HTML if necessary.
I also think the GEO: stuff should be worked into this somehow — think of how it would leverage things if the location in your calendar entry on your smartphone worked with the GPS app.
The aggregation is multiply available, so for example:
You can replace /html with:
Geocoding is in there but limited by the (minimal) degree to which current sources provide it.