Updated: Dec 22, 2019
The past twenty-five years of enterprise application software have witnessed a cold war between “integrated” suite solutions and best-of-breed (BOB) solutions. While there has been no formal end to this war, it is now over. Best-of-breed has won. It is now time to move on.
But what are we moving on to? This article explores the decision-making dynamic that played out between suite and BOB solutions; this exploration offers insights for enterprise applications and their future. These insights are useful for both end-user companies and their software providers.
The suite versus BOB struggle was primarily a philosophical one between central control and innovation. The primary argument for the suite was one of commonality and integration – enterprises buy a set of applications from a single vendor; these applications have a common architecture and design philosophy and are all integrated back to the same data model and physical database. This eliminates headaches associated with multiple architectures, integration, and associated costs. This is a compelling argument.
The primary argument of the best-of-breed approach was one of focus and innovation. Best-of-breed provides superior capabilities – richer data models, algorithms, user experiences, and a faster rate of innovation. Companies that need to innovate in their core competencies would be at a competitive disadvantage if they chose solutions with lesser capabilities. This is also a compelling argument.
These were the primary and initial arguments on both sides, as they were hatched twenty-plus years ago. Multiple sub-arguments and nuanced points emerged from these foundational arguments. For example, since the suite players were almost always larger than the BOB players, an argument was made that although the suite solutions were functionally inferior, they would eventually catch up. This argument launched the PowerPoint roadmap sub-wars, as the suite players sought to “freeze” the market. Dozens of such sub-arguments evolved over the years, some of substance, but many of which were simply intended to sow fear, uncertainty, and doubt (FUD) for whichever side was benefiting.
Software sales is a classic adversarial process, with all sides presenting their position in the best possible light and decision-makers having to filter through layers of details to get to the substance. Thankfully, this process has evolved in recent years to focus more on product, diminishing the use of PowerPoint subterfuge.
In the Beginning
The suite versus BOB war started in the 1990s when ERP hit its stride in the market. At that time, ERP dominated enterprise purchase discussions for commercial off-the-shelf software (COTS). The argument was that if ERP could be used for transaction management in finance, administration, human resources, and manufacturing, then it should also be used for other areas such as customer relationship management (CRM), supply chain management (SCM), and supplier relationship management (SRM). This argument was made under the banner of integration and vendor rationalization.
At that time, the enterprise markets for CRM, SCM, and SRM were small compared to the market for ERP. Furthermore, the ERP systems integration (SI) vendor ecosystem created a self-reinforcing industrial complex in which enterprises that started down the ERP path for one function stayed on that path for other functions, often with negative consequences. The industrial complex created a whole set of formal and informal rules meant to perpetuate it. That these rules (e.g. “why not ERP?”) became part of the fabric of enterprises and their technology ecosystem is a psychological study in groupthink, writ large.
Penetrating this sheath of armor required significant technical innovation and salesmanship on the part of the best-of-breed vendors. The most successful of the BOB players to emerge from this era was salesforce.com (started in 1999), which not only proved that BOB was a good direction, but also managed to create such dominance that CRM is now long past the time when it was looked at as an add-on module; indeed, CRM is now a market that significantly exceeds ERP in size.
Limitations of the Suite Approach
The problem with the primary suite argument is that it has significant limitations in the real world. It raises simple questions: What does an integrated suite even mean? What is the appropriate functional scope for an integrated suite? Should it exist for a functional area, a division, a geography, a business process, or for the entire enterprise? Should the suite cover all the needs of the finance department, or both finance and human resources? Or should it cover finance, HR, manufacturing, and procurement?
Early thinking from thirty years ago suggested that it was possible to extend suite thinking to the entire enterprise. At that time, many companies sought to create a unified enterprise data model spanning all functional areas. The argument was that if such a model were possible, then it could feed all applications, and seamless integration would be a natural by-product. This turned out to be elusive.
As Martin Fowler of ThoughtWorks points out:
A single unified data model is impractical for anything but the smallest organizations. To model even a slightly complex domain you need multiple bounded contexts, each with its own data model.
Nowhere has this statement played out more than in the domain of supply chain management. Each functional area – procurement, production, transportation, warehousing, retail, and service – is complex; each also has a complex set of interconnections with other functional areas. And, as I point out in Data Synchronization in Supply Chains, the functional areas must be managed across a time horizon, both individually and together.
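As an illustrative sketch (the entities and fields here are hypothetical, not drawn from any particular product), the bounded-context idea can be shown as two functional areas modeling the “same” item differently, with an explicit translation at the boundary rather than one shared enterprise-wide model:

```python
from dataclasses import dataclass

# Procurement context: an "item" is something purchased from a supplier.
@dataclass
class ProcurementItem:
    sku: str
    supplier_id: str
    unit_cost: float
    lead_time_days: int

# Warehousing context: the "same" item is a physical unit of storage.
@dataclass
class WarehouseItem:
    sku: str
    location: str
    cubic_feet: float
    on_hand: int

def to_warehouse(item, location, cubic_feet):
    """Explicit translation at the context boundary.
    Only the shared identifier (sku) crosses; everything else is
    remodeled in the receiving context's own terms."""
    return WarehouseItem(sku=item.sku, location=location,
                         cubic_feet=cubic_feet, on_hand=0)
```

The point of the sketch is that forcing both contexts onto a single "Item" record would either bloat it with fields most consumers ignore or flatten distinctions each function genuinely needs.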
For vendors, this does not mean that a common architectural approach cannot be sought and attained. This means common design patterns, underlying technologies, and abstractions can be used to create a common technology platform, from which multiple applications can be created. This can also be used to collapse functional boundaries to create synchronized business processes across functions. I pointed this out in a 2018 paper titled “10 technologies that will reshape SCM software.” In this approach, a single architecture with different pluggable data models can be used to solve similar problems across functional areas.
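A minimal sketch of what “a single architecture with different pluggable data models” might look like in practice, assuming a hypothetical common platform interface and two toy domain models (all names are invented for illustration):

```python
from abc import ABC, abstractmethod

class DomainModel(ABC):
    """Pluggable data model: each functional area supplies its own,
    conforming to the abstractions the platform expects."""
    @abstractmethod
    def demand(self):
        """Units of work requested of this functional area."""
    @abstractmethod
    def capacity(self):
        """Units of work this functional area can absorb."""

class TransportModel(DomainModel):
    def __init__(self, loads, truck_capacity):
        self.loads, self.truck_capacity = loads, truck_capacity
    def demand(self):   return self.loads
    def capacity(self): return self.truck_capacity

class WarehouseModel(DomainModel):
    def __init__(self, orders, dock_doors):
        self.orders, self.dock_doors = orders, dock_doors
    def demand(self):   return self.orders
    def capacity(self): return self.dock_doors

def utilization(model):
    """Common platform logic, written once and reused across
    whatever data model is plugged in."""
    return sum(model.demand()) / model.capacity()
```

The shared algorithm (utilization here, but the same pattern holds for planning or synchronization logic) never knows which functional area it is serving; the pluggable model carries the domain specifics.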
Decline of the Original Suite Argument
The original idea behind buying suites was that all the applications in the suite were integrated back to a single database in a star configuration. This idea harkens back to the days when unified data models and databases were seen as the best path forward. As companies bought these products, they quickly realized that implementations required separate instances for geographies and divisions. Furthermore, in order to get up and going quickly in places like emerging markets, it was expedient to just stand up another instance. In a short period of time, large enterprises had instance proliferation. On top of that, most large companies executed mergers and acquisitions, each of which brought its own instances and, in many cases, its own data models from different vendors.
On the vendor side, the historical integrated suite providers have had difficulty growing organically and have resorted to buying other software companies. They have thus recreated under their own roof the very problem they purported to solve for clients. Rather than a seamless, integrated set of applications with a single architectural design, they have become a proxy for their client environments – heterogeneous applications with different architectures and databases that must be integrated. The suite vendors have ironically been busy buying up best-of-breed vendors, many of which they competed against while arguing to clients against a best-of-breed approach.
Therefore, the suite argument shifted from one of architectural and technical purity to one of business: a single vendor that can handle the integration problem. This has turned out to be a specious argument.
For these vendors, this leads to an internal struggle between investment in integration and investment in innovation in the application areas themselves. Executives seek to take on acquisitions while holding R&D investment constant as a percentage of revenue. The integration investment therefore has to come from somewhere – most typically from innovation in the individual product lines. The right approach in acquisitions, such as those pursued by the large ERP vendors, is to hold constant the R&D investment in individual product lines while adding an entirely new area of R&D investment: integration. In essence, integration must be managed and funded as its own separate product line; furthermore, each product area must have an integration line item, funded without impacting the investment in the other product lines.
Without this approach, integration investment squeezes out functional innovation investment; the reverse is also true. This is precisely what has happened at large ERP suite vendors. Investment in integration has squeezed functional investment, causing individual product areas to fall further behind the focused best-of-breed players. Or, in some cases, a middling investment on both sides has driven the vendor into a betwixt-and-between quagmire where it appears to external observers that nothing is getting done. The Catch-22 is that if this necessary level of investment had been included in the original financial analysis during acquisition due diligence, the acquisition might never have been done.
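The budgeting argument above can be made concrete with a small arithmetic sketch (all figures are invented for illustration, not drawn from any vendor's financials):

```python
# Hypothetical figures, in $M.
revenue = 1_000.0
rd_budget = 0.15 * revenue   # R&D held at 15% of revenue -> 150.0
product_lines = 5
integration_cost = 30.0      # cost of integrating acquired products

# Common approach: integration is funded out of the fixed R&D envelope,
# so every product line's innovation budget shrinks.
per_line_before = rd_budget / product_lines                       # 30.0
per_line_after = (rd_budget - integration_cost) / product_lines   # 24.0

# Proposed approach: integration is funded as its own line item,
# leaving per-product innovation investment intact -- but raising
# total R&D spend, which is why due diligence rarely models it.
total_with_separate_line = rd_budget + integration_cost           # 180.0
```

Under the common approach, each product line loses a fifth of its innovation budget to integration; under the proposed approach, the acquisition's true cost shows up as a 20% larger R&D bill.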
If this problem sounds similar to the one faced by enterprise IT departments, it’s because it’s the same problem. Large ERP and other similar companies that are assemblies of acquired products are proxies for the IT departments to which they sell.
Where To Go From Here
Application portfolio management is an important part of the work of IT departments of large companies. At the same time, today’s competitive climate places a premium on IT departments that can deliver innovation at the pace at which business moves. Too often in the past, the IT department gated the pace at which business moved.
This need for innovation is a principal reason why best-of-breed won the war. At the same time, it is important to note that in enterprise applications, innovation must be driven in such a way that it does not create an unsupportable mess. Too much proliferation leads to significant support costs, which can then crowd out the investment necessary for perpetuating innovation. At the same time, too little proliferation leads to stunted innovation. How do we reconcile this and find the Goldilocks balance between not too much and not too little?
This is an important strategic question and leads to other questions:
Do I continue with my integrated suite vendor strategy?
Should I engage with a roll-up ERP company that can handle the integration problem for me?
Should I go all-in on an innovation strategy, embracing best-of-breed across the board and handling the underlying data flow problems myself?
Should I execute the above all-in innovation strategy along with partners who can handle the data pipeline?
Should I identify areas of strategic importance to the enterprise, establish strategic partnerships in each of those areas, and evaluate all other areas on a case-by-case basis?
Some companies are promoting a strategy where enterprises will have a handful of platforms for areas such as ERP, CRM, SCM, procurement, and service. These platforms would form anchors, and innovation could presumably be done around those platforms. This approach requires application vendors to not only invest in applications, but to also invest in platform and integration.
Tug of War Between Control and Innovation
The above discussion highlights a consistent tension between control and innovation. The following 2x2 may help in thinking through the trade-offs in this tug of war.
2x2 charts have their limits, but what this chart attempts to show is that in most companies innovation should be prioritized above control, and that innovation and control should each be focused in the areas where they matter most. Where is control most important, and where is innovation most important?
Innovation is most important in areas of core competency and competitive differentiation. Control is particularly important for data management and for functional areas that are not competitively differentiating. Data is the lifeblood of the corporation and increasingly a source of competitive differentiation. Access to data must be controlled not just for regulatory purposes, but also to ensure all areas of the enterprise can access it in a common way. Control is also important in non-differentiating areas so that high levels of efficiency can be attained.
There are no silver bullet answers here. Each company will have to devise its own strategy, driven by its business strategies, and a keen understanding of its core competencies and competitive differentiation in the market. Using this understanding, companies can devise an enterprise architecture with three very important components:
A common data flow infrastructure to support different application needs
Suite solutions to manage non-competitive differentiation areas
Best-of-breed solutions to support and manage areas of competitive differentiation
In this approach, the IT department, along with tool partners, focuses on data pipelines. It identifies business process areas that are not core to competitive differentiation and seeks to drive high efficiencies in those areas by engaging as few vendors as possible. It identifies priority-ranked areas of competitive differentiation and seeks the very best solutions for those areas, irrespective of other strategic vendor relationships.
I have broached this complex topic here at a high level; fleshing out the details is better left to another post.