This blog post is a wrap-up of the GilbaneSF 2010 debate on the “Future of Open Source CMS” (#fosc on Twitter) with Geoff Bock (Gilbane Group), Ian White (The Business Insider), and Jahia’s input from an Open Source Content Management vendor’s perspective.
You will find below the presented slides and a summary of the main topics we covered during the debate. This blog post will be followed by several others over the coming months, each detailing one of the most important paradigms shaping the future of Open Source CMS. We will consolidate all these entries into a whitepaper available for download next fall.
Today, it is hard to define what an “Open Source CMS vendor” is, since virtually every CMS vendor uses open source in its products, contributes to open source, or provides services around open source. Additionally, most, if not all, software deals with “Content” in one way or another.
To get a clearer picture of the future of Open Source CMS, we need to approach the topic from two different angles:
CMS is a strange beast whose definition is broad and uncertain. CMS mostly implies two different audiences: techies (CIOs, CTOs, and developers) and practitioners (marketers, information workers, and lines of business). From this perspective, it is evident we are not dealing with a single “system,” but rather two.
CMS: Content Management Services or Content Management Solutions?
The former stakeholders are looking for a “content platform”; the latter, for finished products and solutions to solve some of their content issues.
As Wikipedia notes:
Application software is contrasted with system software and middleware, which manage and integrate a computer's capabilities, but typically do not directly apply them in the performance of tasks that benefit the user. A simple, if imperfect analogy in the world of hardware would be the relationship of an electric light bulb (an application) to an electric power generation plant (a system). The power plant merely generates electricity, not itself of any real use until harnessed to an application like the electric light that performs a service that benefits the user.
For a long time, CMS was a simple mixture of horizontal infrastructural libraries combined with vertical applications, without any clear segregation of duties. Most CMS solutions available today are still based on this monolithic approach.
Recently, the massive industry-led standardization and interoperability effort (think JCR or CMIS) was coupled with a push to quickly prototype and launch rich content-enabled applications. This combination led to a greater separation of content platforms and content-enabled applications.
Towards Composite Content Platforms and Content-Enabled Applications
Even though the term “composite” has existed for quite some time, it has only recently gained traction as the cornerstone of SharePoint 2010, actively pushed by Gartner as a replacement for the older and more limited CEVA term.
Content particles are becoming increasingly granular and structured. Moreover, there is an ever-increasing need to rapidly assemble, cross-link, enrich, and combine heterogeneous content objects. Therefore, the term “composite” sounds convenient and appropriate.
Composite Content Platforms are tomorrow's ECM 2.0
The nice thing about composite content platforms (call them content application servers or content management platforms if you prefer) is that they act as dynamic content containers, or content runtimes, which can run composite content applications. The next generation of composite content applications will be even more dynamic. They will not only glue cold content together, but will also natively inherit from the merger of application servers and content stores, creating hot, actionable, content-driven applications.
Of course, a simple website could be considered a composite content application (in which case your httpd server could be seen as a kind of first-generation, lightweight composite content server). However, composite content applications can also scale to more complex content-enabled applications with advanced business process, strong business integration, and heavy personalization requirements.
All these composite content applications can produce and publish massive amounts of content and data, which need to be correctly managed. And here come the usual content management product families (WCM, DMS, DAM, RM…) that will help manage this deluge of information for all the content-enabled applications.
This platform/product split is quite common, at least in the high-end enterprise segment of the CMS market. It is rapidly moving down-market and impacting all other sub-segments.
Some recent examples:
Due to their historical focus and inherited technologies, some of these frameworks or content foundations are still driven by a portal-centric approach (e.g. eXo), a document-centric approach (e.g. Nuxeo, Alfresco), or a web-centric approach (e.g. Jahia, Day). However, we can assume that there will be a rapid consolidation towards a universal set of core value-added services able to nurture and enrich any content asset, be it a web page, a document, a record, an email or a scanned fax.
Content Lifecycle Services such as versioning, file plans, workspaces, content types, searching and querying services, interoperability services, mashability services, social services, persistence-independent storage services, etc. are becoming commodities.
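To make the commoditization point concrete, here is a toy, in-memory sketch of what one such lifecycle service (versioning) boils down to at the data-structure level. The `VersionedNode` class and its method names are purely illustrative, not any vendor's or standard's API; real platforms expose equivalent services through interfaces such as JCR or CMIS.

```python
class VersionedNode:
    """Toy illustration: a content node that keeps an immutable
    history of its states, mimicking a commodity versioning service.
    Any content asset (page, document, record) gets the same service."""

    def __init__(self, name, content=""):
        self.name = name
        self._versions = [content]   # version 0 is the initial state

    def checkin(self, new_content):
        """Store a new immutable version and return its version number."""
        self._versions.append(new_content)
        return len(self._versions) - 1

    def read(self, version=None):
        """Read the latest version, or a specific historical one."""
        return self._versions[-1 if version is None else version]

# Usage: the service is generic, regardless of content type.
page = VersionedNode("home", "Welcome!")
v1 = page.checkin("Welcome to our site!")
```

The point is that once services like this are standardized and freely available, they stop being a differentiator; the value moves elsewhere, as the next paragraph argues.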
As open source commoditization is actively ramping up and rapidly extending its borders, competitors must decide whether to horizontally extend the level of content services offered as part of their content middleware (e.g. archive more volume, support more load, add new value-added content enrichment services), or to go up the value chain and provide new lines of content-enabled products and applications to solve the needs of various business lines. Usually, both expansion strategies are pursued in parallel.
Four main trends are emerging:
With the rise of composite content platforms and content-enabled applications, we should see a shift from monolithic CMS towards better-partitioned ones:
Ideally, the next generation of composite content platforms should ensure a level of data openness and interoperability, aligned with the current CMIS, OpenData and other DataPortability trends.
The goal of this new generation of Composite Content Platforms will be to offer a wide range of content enrichment services, while ensuring proper data interoperability and freedom. Ideally, composite content applications will become more standardized and portable, much as web applications became more standardized during the last decade. However, such a standardization process would take at least 5-10 years. Data Portability will therefore become one of the key purchase criteria.
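As a hedged illustration of why CMIS matters for data portability: CMIS defines a SQL-92-based query language, so the same query statement can, in principle, run unchanged against any compliant repository. The small helper below merely builds such a statement; the function name and the example property values are hypothetical, and the CMIS type and property names (`cmis:document`, `cmis:name`, ...) are taken from the CMIS specification.

```python
def cmis_query(object_type, properties, where=None):
    """Build a CMIS QL statement (the SQL-92 subset defined by CMIS).
    This is a sketch of statement construction only; executing it
    requires a CMIS-compliant repository endpoint."""
    stmt = f"SELECT {', '.join(properties)} FROM {object_type}"
    if where:
        stmt += f" WHERE {where}"
    return stmt

# The same statement is portable across CMIS-compliant repositories
# (Alfresco, Nuxeo, SharePoint 2010, ...), regardless of their internals.
q = cmis_query("cmis:document",
               ["cmis:name", "cmis:lastModificationDate"],
               "cmis:name LIKE 'report%'")
```

A standard query language like this is exactly the kind of interoperability layer that turns Data Portability into a realistic purchase criterion rather than a slogan.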
2) The Future of Open Source: Properly defining the limits of Open Core
The line between proprietary and open source software has become increasingly blurred, as open source software is embedded in proprietary products and extensions. There is also plenty of confusion about the term “community”: community builds, originally based on the unstable development branch, are now promoted as “Freemium” editions for viral marketing purposes. Moreover, the scope of “core” features tends to be slowly but surely pared back to boost sales of newly created commercial extensions.
So what can we expect of “Open Core” software vendors? How can we better define ethical and fair boundaries both for open source communities and vendors, while ensuring a reasonable level of open source “purity”?
Simply put, more than 70% of open source contributors are now paid professionals, while all commercial open source vendors look for ways to monetize their initial investments. This makes perfect sense, as any commercial entity needs to generate revenue to pay its employees and reward its shareholders.
The Open Core business model is only the latest in a long series of commercial open source business models. Over the past two years, it has rapidly gained momentum, but it is also facing heavy criticism (cf. Gartner or InfoWorld). The model is increasingly considered just another type of “Shareware 2.0” or, at best, a lightweight, limited, free SMB edition of the vendor’s main product offering.
Essentially, there is nothing wrong with the Open Core approach, and it has existed for years, even if it was not marketed under that term. It comes down to how the vendor or the community defines the notion, including the scope and the “raison d’être” of the core versus the vendor’s product derivatives.
In today's landscape, we can discern several common pitfalls:
In practice, an Open Core strategy often leads to the following consequences:
Open Core Main Success Criteria
We can try to solve some of the classical Open Core issues with a series of best practices:
The value proposition and scope of the Open Core product offering should be clear to all stakeholders. Most importantly, users should be able to foresee a future for this Open Core product alongside the extensions, derivatives, or additional product lines promoted by the original vendor.
This leads us to the following suggestions:
Suggestion #1: Better distinguish your product branches
First, let’s not confuse Freemium offerings aimed at practitioners with development builds for code contributors or early adopters. The term “community” normally refers to the crowdsourced, collaborative aspects of an Open Source project (free as in speech), not to the fact that it is gratis (free as in beer). Releasing community builds does not prevent a vendor from simultaneously offering Freemium editions of its various product families.
So why are so many Open Core software vendors trying to redistribute an unstable version of their product as a promotional resource, in the hopes of converting users to stable and enhanced commercial editions?
Second, most vendors need to improve the transparency of their Open Core strategies. The vendor should clearly state which of its products aim to become a community-driven open source Core, and which it will more strictly control with dual GPL/Commercial licenses, or even more proprietary licensing schemes. There is no shame in being a Commercial Open Source vendor in 2010, so vendors should be candid about their position.
Suggestion #2: Keep your Core away from possible conflicts of interest
The second suggestion is to clearly delineate the scope of the various product lines, to avoid any long-term product cannibalization. This helps clarify not only the audience and feature scope, but also the entire roadmap of each sub-product. The only sure way to do this is to avoid all direct conflicts of interest between your core kernel and your product derivatives. Ideally, the Core should have a long-term perspective that encompasses the vendor's commercial derivatives. Other organizations, and even competitors, should be able to reuse, leverage and extend your core. Co-opetition should be made possible.
Common bad practices are the following:
A frequent underlying cause of these bad practices is the lack of clear product boundaries vis-a-vis the Core, leading to severe conflicts of interest.
Suggestion #3: One size does not fit all
Now that Open Source business models are better understood by developers and customers, more vendors are using various open source strategies as part of their product families.
For example, one could combine a community-based Open Core released under a business-friendly license, associated with some hybrid GPL/Commercial derivatives, combined with some other proprietary extensions.
The “purity” of an open source vendor no longer has much meaning in 2010. Rather, we should speak of the purity of a given open source project, be it a Core, a library, or an entire product line. As vendors continue to adopt more hybrid strategies, the various levels of “purity” should be assessed for each of their particular product offerings. 100% pure Open Source vendors should start exploring various licensing models and apply them distinctly to each of their sub-products. As a result, these vendors will develop a more global and valuable Open Source business model. Customers will have to understand a vendor’s entire open source strategy before rushing to deploy the core or any other sub-product.
This trend underscores the need for vendors to avoid spreading Open Source FUD among their customers. The company's open source business model should be simple to explain and clearly state the value proposition for all stakeholders.
We can summarize this chapter by listing the following key points:
3) Applying the Open Platform paradigm to the CMS industry
Let’s now try to combine the first chapter, Future of CMS, with the second one, Future of Open Source, to envision how Open Source CMS could evolve over the next few years:
Applying our conclusion to the Open Source CMS industry, we can now try to draw a general picture of future business models:
Of course, each CM product and vendor is different, so there will certainly be hundreds of heterogeneous variations of this business model over the next few years. But the underlying paradigms should be pretty similar.
I will further detail each of these major paradigms in future blog posts. Meanwhile, please do not hesitate to add your thoughts and comments below.
[Editor's Note: Our thanks to Stéphane Croisier for granting us permission to publish his article here at CMSReport.com. The original posting of this article can be found at Mr. Croisier's personal blog, Contention Re-Considered.]