Interop in the Bazaar

Do open source projects want to inter-communicate and share? This article explores that question in the context of OSCOM Interop, a new project to foster interoperability and sharing between open source content management systems.
by Paul Everitt and Gregor J. Rothfuss

Everybody loves the idea of the bazaar. Small, autonomous shops doing commerce in the wild, right in the shadow of the centrally-planned economy of the cathedral. But even a bazaar needs rules, right? Coordination and cooperation don’t always spring up out of thin air.

In the world of open source, developers wonder if KDE and Gnome will ever interoperate in a meaningful way. But first we have to ask whether the question is even legitimate. Should they?

This article discusses a budding effort towards interoperability between open source content management systems, while evaluating the question, “Why interoperate?”

Background

The content management market has always been associated with the big boys. Large software, large consulting teams, and very large prices.

Due to a number of factors, this mindset is in decline. Smaller approaches to content management, including open source projects, are popping up constantly. The open source projects are attracting attention from mainstream analysts and journalists.

From this grew OSCOM, an international non-profit for open source content management. The basic idea is to foster communications and community amongst the creators of these open source projects. A very successful conference was held in Zurich earlier this year. Another is slated for Berkeley in September.

After Zurich, some of the presenters discussed ways to make future meetings less a parade of individual projects, and more a forum for sharing ideas and working together. This led to a discussion of interoperability amongst open source content management projects, particularly in relation to a Java Community Process proposal for content repositories, created by and for the big boys.

To test drive our ability to tackle interop issues, the OSCOM folks are working on a single problem: a common way to give presentations using a “SlideML” format and a set of rendering templates.
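
To make that concrete, here is a minimal sketch of the kind of thing SlideML is after: slides expressed in a simple XML vocabulary, rendered to HTML by a small piece of template code. The element names and the rendering below are our own illustration, not the actual SlideML schema or templates.

    import xml.etree.ElementTree as ET

    # A made-up slide deck; "slideset", "slide", "title" and "point" are
    # illustrative names, not the real SlideML vocabulary.
    DECK = """\
    <slideset>
      <slide>
        <title>Why Interop?</title>
        <point>Shared concepts lower the cognitive burden</point>
        <point>Common formats reduce lock-in</point>
      </slide>
    </slideset>
    """

    def render_html(xml_text):
        # Turn the XML deck into a crude HTML rendering.
        root = ET.fromstring(xml_text)
        parts = []
        for slide in root.findall("slide"):
            parts.append("<div class='slide'>")
            parts.append("<h1>%s</h1>" % slide.findtext("title", ""))
            parts.append("<ul>")
            for point in slide.findall("point"):
                parts.append("<li>%s</li>" % (point.text or ""))
            parts.append("</ul></div>")
        return "\n".join(parts)

    print(render_html(DECK))

The point of the exercise is that any of the projects could consume the same deck and swap in their own rendering templates.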

Reality Check

We are eager to continue these discussions face-to-face in Berkeley. But we should also step back and ask, “Is interop a bunch of crap?”

It’s a serious question. Why should a project leader or project team do the extra work? Many of the best open source projects aren’t really architectures. They are programs that started by scratching an individual itch. Later in life, if they live long enough, they recognize the bigger picture and do a massive rewrite, thus acquiring an architecture. But rarely is this new architecture designed with the idea of talking to other, similar systems.

So interop can impose serious scope creep on the architecture of a project. Strike one.

Next, how powerful is the motivation for working with “the competition”? At best, a project leader has little cultural involvement with other projects, and thus doesn’t have that good old maternal feeling that sparks late hours of doing something for free. At worst, one project can view another with condescension, envy, or whatever other mixture of emotions comes from the tribalism of balkanized projects.

Strike two.

Finally, aren’t there already enough standards? Writing standards is a difficult process, one that doesn’t come naturally to open source developers with the ethic of “speak with code”. Shouldn’t we embrace the man-years of existing standards and focus on good implementations? (Note: the answer is “yes”.)

Beneficiaries and Their Expectations

We now have a stark, bleak picture. So what is the driving need for interop, and who are its beneficiaries?

The first benefit is reducing the “cognitive burden” that our projects place on developers. Imagine you are a consultant, and you have become an expert at Midgard. But you have a project where you need to work with AxKit. On top of the difference in programming languages, everything about the world of content management is different. Concepts, jargon, etc. If interop can give you the tenuous grip of 5% commonality in approach, it can at least provide the mental connections to the next 25% of functionality.

The second set of beneficiaries is customers who might have more than one project in use, or who want to reserve the right to throw out their current project next year if they aren’t happy. Can they even get 25% of the current site’s content and configuration migrated? If not, then they are locked in. It is often argued that open source does not lock you in. But is this really true in a meaningful way? While it is certainly possible to migrate data between open source projects, or content management systems for that matter, it is by no means an easy and painless process.
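
To make the migration point concrete, even a crude, agreed-upon export would help. The sketch below dumps content items into a neutral XML interchange file that another system could import; the item fields and element names are hypothetical, not an agreed OSCOM format.

    import xml.etree.ElementTree as ET

    def export_items(items, filename):
        # Write a list of {"title", "author", "body"} dicts as a neutral
        # XML file; the "content-export" vocabulary is made up.
        root = ET.Element("content-export")
        for item in items:
            node = ET.SubElement(root, "item")
            for field in ("title", "author", "body"):
                ET.SubElement(node, field).text = item.get(field, "")
        ET.ElementTree(root).write(filename, encoding="utf-8",
                                   xml_declaration=True)

    export_items(
        [{"title": "Welcome", "author": "jane", "body": "Hello, world."}],
        "export.xml",
    )

Even getting this far presupposes that two projects agree on what an “item” is, which is exactly the kind of small, shared vocabulary interop has to establish.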

The third beneficiary is the implementor of authoring tools. Imagine you are a developer at Mozilla, OpenOffice, Xopus, Bitflux, or KDE. You’d like to tie the client-side environment, where *real* authoring happens, into the server-side environment, where collaboration happens.

There are over ten projects presenting at OSCOM. If Mozilla has to talk to ten different systems in ten different ways, they will probably opt to talk to none of them. However, if the various projects agree to a thin set of common capabilities, then there is a basis for authoring-side integration.
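
As a sketch of what such a thin set of capabilities might look like, consider a plain HTTP PUT in the style of WebDAV: if every system accepted one, a client-side editor could save a document to any of them with the same code. The host, path, and credentials below are made up for illustration.

    import base64
    import http.client

    def put_document(host, path, body, user, password):
        # Store an authored document on a server via a plain HTTP PUT,
        # the way WebDAV clients do.
        auth = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
        conn = http.client.HTTPConnection(host)
        conn.request("PUT", path, body.encode("utf-8"), headers={
            "Content-Type": "text/html; charset=utf-8",
            "Authorization": "Basic " + auth,
        })
        status = conn.getresponse().status  # 201 or 204 on success
        conn.close()
        return status

    # Hypothetical server and credentials, purely for illustration.
    print(put_document("cms.example.org", "/docs/article.html",
                       "<h1>Hello from the authoring tool</h1>",
                       "author", "secret"))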

But we’re all open source veterans here, so let’s cut the crap. Do any of these people have a right to ask for interop? This is open source, scratch your own itch, I-do-this-because-I-like-it territory. The time spent serving these beneficiaries could be better spent implementing Gadget B, which my mailing list tells me will cure cancer. Right?

Wrong, but first, let’s explore the hidden costs of the process of interop.

Hidden Costs

Doing interop is hard. It’s a lot harder than starting your own software project. Just review the mailing list archives for an interoperability project such as WebDAV. On and on the messages go, for months and years. It takes time to distill the common wisdom from diverse perspectives into a standard that can have multiple implementations.

Harder, though, are the human issues. As we have learned with the SlideML project, you have to bootstrap a culture and a process. Most of the participants are used to being the big fish in their pond. So who is the big fish in a shared pond? How do decisions become final?

From a process perspective, standards require a different kind of rigor than software. In fact, the purpose is to produce something that exists separately from the software.

Similar to the projects themselves, though, successful efforts seem to show character traits that combine intellectual confrontation with patient encouragement, with a strong dose of humor and enjoyment.

The Revenge of the Upside

We have discussed the reality check of interop, explored the beneficiaries and questioned their rights, and surveyed the hidden costs. So that’s the downside. What’s the upside of interop that makes it worthwhile?

The authors of this article are promoting the idea of pursuing interop between open source content management systems. We are advocates. So we’ll focus the article on the provocative questions of interop in general, and thus we will limit the upside to one discussion point.

In the world of open source web servers, there is one project that has a majority of the gravity. For databases, there are a couple of projects that split the gravity. Same for desktop environments. But for content management, there are a trillion. This kind of market splintering helps ensure that the big boys are safe to dominate the mainstream, where size and stability matter more than innovation and revolution.

Interop efforts, such as the Linux Standards Base, reduce risks for the mainstream customer. Not completely, perhaps not much at all initially. But it proves that we are interested in the motivations of the mainstream.

But interop is not solely a “feature” to appeal to the mass market, it can also unleash many new possibilities. Consider XML-RPC, which brought interop to scripting languages, and is now baked into dozens of scripting environments on various platforms.
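
To see just how thin that layer is, here is a sketch of an XML-RPC call using nothing but Python’s standard library. The endpoint URL and method name are our own invention, standing in for whatever a CMS might choose to expose.

    import xmlrpc.client

    # The endpoint and the "cms.listRecent" method are assumptions about
    # what a content management system might expose, not an agreed API.
    server = xmlrpc.client.ServerProxy("http://cms.example.org/xmlrpc")
    for doc in server.cms.listRecent(5):
        print(doc)

Any language with an XML-RPC library can make the same call, which is exactly why such a small agreement unlocked so many integrations.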

Possible Progress

The existence of OSCOM, the conferences, the budding community, SlideML, and the interop workshops in Berkeley next week are all signs that this interop effort is taking baby steps. At this early stage, we can all be prognosticators and foretell the future with 100% certainty. Take your pick:

  • Prediction One: Interop between open source projects is a fool’s errand.
  • Prediction Two: If we stay practical and focus on small steps, we can provide value with lower risk.
  • Prediction Three: We’ll stumble across the Big Idea that is the bait to get the fish (project leaders) on the hook in a big way.
  • Prediction Four: Somebody will get sued over a patent infringement and we’ll all move to Norway.

Open Questions

There are no easy answers for interop, nor are the questions that need to be answered unique to the content management space.

How and when does interop become “sexy” enough to arouse interest among developers? What can be learned from interop efforts that have succeeded?

Is lowest common denominator functionality still worth anything? Let’s say the choices are 100% interoperability (fantasy), 0% interoperability (surrender), or 20% interoperability (pragmatism).

Is 20% better than nothing?

.net and community mind share

microsoft is trying something new. asp had the downside that there were zero interesting open source apps beyond “hello world”. apparently, this is not supposed to happen with .net, and several initiatives suggest as much.


is a free ASP.net IDE that features wysiwyg web forms, web services support and much more.

mono
mono is making quick progress. a lot of the core libraries have already been implemented.

rotor
rotor is a complete CLI implementation that will help the mono effort.

protecting developers from users

typically, rookie open source developers like the ones now boasting how they “act in the interest of users” over at postnuke have just not been around long enough. user interaction is overrated, as whiprush points out.

I think a lot of OSS developers are probably sick of backseat drivers trying to dictate features and direction of something they’re doing for free anyway. The more OSS project mailing lists and forums I read, the more I am glad that developers choose to ignore more and more user requests.

here is my prediction: postnuke will switch to a less open development model within 6 months (just look at the cvs commit logs; there is already nothing happening) or the ones carrying the torch will burn out.

mozilla widgets & cms

I had a very interesting conversation with nisheet from netscape today. he heads the xml dom system and several other initiatives, and is now looking for ways to make the browser do more interesting stuff. we talked about how innovation has happened pretty much on the server side (cms, j2ee, xml technologies) while the browser is still stuck with basic forms for most of the gui.

nisheet is eager to learn more about the content management open source community, and to figure out how to work with oscom to make mozilla a better platform for accessing cms. I mentioned xopus to nisheet as an example of gui innovation, and we mused about ways to provide stuff like xopus for a wide variety of systems.

there is a lot of good technology out there in the browser that needs to be leveraged. nisheet thinks that the interests of mozilla and oscom are well aligned and I have invited him to our mailing list so that we can start the dialog.

we agreed that discussions should be result-driven, and that we should start to look for issues that we can solve together rather than talk about interop all day :) going forward, we should ask ourselves what mozilla can do for us, and vice versa. that may be a good approach to getting results.

beyond wysiwyg


xopus is a pretty nifty xml editor that allows in-place editing of xml layouts. used at the nzz and other places, this is the editor that we have been waiting for. can’t wait to integrate this sucker into postnuke.