The fees that Lokad charges its enterprise clients are straightforward1: a flat monthly fee for a mix of software+experts2. To the surprise of some, our monthly fees tend to stay flat over time, rather than dropping sharply at the end of the onboarding phase. Most are not surprised, however, as facing yet another preposterous demonstration of greed from a vendor is just another Monday in the world of enterprise software. Yet this is not a textbook display of wanton avarice. Rather, this fee is what it takes to achieve lasting supply chain performance.

(Image: iron worker over a cityscape.)

The most profitable path for enterprise software vendors is, and has always been, to take the money and run. Upfront licensing fees are akin to printing money. Compared to licensing, integration is more arduous: risks are higher and margins thinner. As a result, large vendors usually subcontract this part entirely, cultivating a network of integrators that can take the heat for them. However, the least profitable part, from the vendor’s perspective, is - by far - the maintenance. Incidentally, this is why vendors, despite charging sizeable maintenance fees, still mandate upgrades for their clients: those fees, substantial as they are, do not come close to what it would take for the vendor to support its own legacy.

Supply chain optimization is a special case, however, as vendors (not Lokad) have “succeeded” in eliminating the maintenance. This “success”, though, is certainly not the one that their clients had envisioned.

Since the 1980s, vendors (like Lokad, but decades earlier) have been delivering software to automate supply chain decisions3. Since then, almost every large company has acquired not one but often several such solutions. Even ERPs (Enterprise Resource Planning) got their name in the 1990s from this ambition to automate the “planning” part; otherwise, they would have been named ERMs, for Enterprise Resource Management.

Yet, the automation of supply chain decisions did not happen. The systems have been deployed, but they are either collecting dust or dodging their original mission4. As a result, the vast majority of supply chains are still managed through spreadsheets, proving that even if those optimization solutions were initially deemed a success, something went wrong with the maintenance.

Those failures are profitable as far as software vendors are concerned. The vendor walks away with the licensing fees, possibly in the form of a multi-year commitment (in the case of SaaS). Given that the solutions are not working - at least not the optimization part - little or no maintenance is required. Clients do not worry about software capabilities they are not using anyway, and consequently they do not pressure the vendor much. Of the original solution, only a fragment remains in use - usually a thin data entry gateway to manage basic automation rules integrated into company systems (e.g., min/max settings for SKUs).
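For concreteness, the kind of “basic automation rule” that typically survives such a rollout can be sketched in a few lines. The Python below is purely illustrative - the SKU identifiers and thresholds are hypothetical, and the min/max values are, in practice, maintained by hand by the planners:

```python
# Illustrative min/max replenishment rule - the thin slice of "automation"
# that often remains once the optimization part has been abandoned.
# SKU identifiers and thresholds below are hypothetical.
sku_settings = {
    "SKU-001": {"min": 20, "max": 80},  # thresholds entered manually by planners
    "SKU-002": {"min": 5,  "max": 30},
}

def reorder_quantity(sku: str, stock_on_hand: int) -> int:
    """Reorder up to 'max' whenever the stock drops below 'min'."""
    s = sku_settings[sku]
    return s["max"] - stock_on_hand if stock_on_hand < s["min"] else 0

print(reorder_quantity("SKU-001", 12))  # -> 68
```

The system dutifully executes this rule, but every number that matters is still picked by a human, which is how the burden quietly drifts back to spreadsheets.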

On the other end of the spectrum, Lokad does succeed in delivering production-grade automated supply chain decisions. However, it takes ongoing efforts from a task force dedicated to the client - the supply chain scientists in Lokad parlance - to make it happen. The scientist is in charge of crafting, and later maintaining, the numerical recipe that generates the supply chain decisions of interest.

The resulting numerical recipe can be left unattended. This is, pretty much, what “production-grade” automation means in the context of supply chain optimization. Thus, the supply chain scientist can be removed from the picture at any point in time without causing any immediate harm to the company.
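To give a sense of what such a numerical recipe involves, here is a deliberately simplified sketch in Python (Lokad’s actual recipes are not written this way; the naive forecast, the economic drivers, and all the figures below are illustrative assumptions):

```python
import statistics

# A toy "numerical recipe": historical data in, a purchase decision out.
# Real recipes rely on probabilistic forecasts and far richer economic
# drivers; every number below is an illustrative assumption.

def weekly_demand_forecast(weekly_sales: list[float]) -> float:
    """Naive forecast: the average of the last 8 weeks of sales."""
    return statistics.mean(weekly_sales[-8:])

def purchase_decision(weekly_sales: list[float],
                      stock_on_hand: float,
                      on_order: float,
                      lead_time_weeks: float,
                      unit_margin: float,
                      weekly_holding_cost: float) -> int:
    """Order enough units to cover the lead-time demand, provided the
    expected margin still beats the cost of holding the stock."""
    lead_time_demand = weekly_demand_forecast(weekly_sales) * lead_time_weeks
    gap = lead_time_demand - (stock_on_hand + on_order)
    if gap <= 0 or unit_margin <= weekly_holding_cost * lead_time_weeks:
        return 0  # do not order: already covered, or not worth holding
    return round(gap)

# Example run: 8 weeks of sales history, a 3-week lead time.
print(purchase_decision([12, 9, 14, 11, 10, 13, 12, 15],
                        stock_on_hand=10, on_order=5,
                        lead_time_weeks=3,
                        unit_margin=4.0, weekly_holding_cost=0.3))  # -> 21
```

Every ingredient of this sketch - the data feeds, the forecasting logic, the economic drivers - is precisely what drifts when the business changes, and that drift is where the maintenance effort described below comes from.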

That said, a supply chain is a continuously changing beast, and those changes carry knock-on effects. While our algorithms can cope with flows changing in magnitude, we do not yet have algorithms that can handle all the subtler changes required to keep the numerical recipe production-grade.

As a result, the supply chain scientists are left with quite a series of tasks that need to be addressed once Lokad is in production:

  • Newer data become available5, and the numerical recipe must be updated to take advantage of this new data. Conversely, some data sources are phased out, and the corresponding dependencies must be severed. In sizeable companies, the application landscape is in constant evolution, not just during ERP upgrades.

  • The strategy of the company changes. The numerical recipe is the reflection of the client’s strategic intent, and this reflection goes much deeper than picking values for a handful of parameters. It is not common for the supply chain scientist to rewrite entire portions of the recipe to accommodate inflections of the strategy, but this does happen occasionally.

  • Trust must be maintained. The supply chain leadership needs the supply chain scientist to provide ongoing evidence that the numerical recipe is well-behaved. The scientist is expected not only to produce new instrumentation to reflect fresh or revised performance indicators, but also to address any questions that the leadership may put forward.

  • Transparency must be maintained. The scientist is responsible for ‘white-boxing’ the numerical recipe. This implies training teams so that they have an adequate level of understanding, which in turn lets them make the most of the automation provided by the numerical recipe. As teams rotate, new entrants must be (re)trained.

If we fail at any of these tasks, then supply chain practitioners have no choice but to revert to their spreadsheets.

Hence, while the numerical recipe can be left unattended for weeks6, its relevance invariably decays over time. Ongoing engineering resources are therefore needed to keep the recipe relevant. Despite recent progress in artificial intelligence, engineering a piece of software capable of maintaining itself remains far beyond the current state of the art. It is perhaps contentious to write, but the task appears as difficult as attaining artificial general intelligence.

Though ongoing contributions from the supply chain scientist are needed, one would be forgiven for thinking that these efforts shrink once the numerical recipe is in production. Our experience has proved the opposite: the complexity of the numerical recipe invariably expands to match whatever level of engineering resources happens to be available7.

Over the last decade, we have repeatedly observed a tipping point when it comes to resource investment. If the initial resources invested in the recipe’s setup8 exceed what the company projects to invest yearly in its maintenance, then the recipe does not get the level of care required to preserve its production-grade status. The most frequent symptom of this oversight is an ever-growing backlog of important, but not immediately critical, work: documentation, code reviews, code clean-up, instrumentation, etc.

No technology or process guarantees business success9, but improper maintenance is a time-tested recipe to bring a company back to square one - before summarily drowning it in a sea of spreadsheets. Do not let your company become another data point in our growing ledger of perfectly avoidable supply chain failures.


  1. The world of enterprise software is full of fringe situations, with companies getting acquired, split, merged, and/or going bankrupt. From time to time, we have to give up on simplicity in order to remain aligned with whatever has become of the original client. ↩︎

  2. Lokad’s business model is probably best described as Supply Chain as a Service. In Lokad’s parlance, the supply chain scientists are the employees who spearhead the supply chain initiative on behalf of our clients. See Supply chain as a Service ↩︎

  3. Mundane daily decisions such as inventory replenishments, production batches, stock allocations, transport assignments, etc. ↩︎

  4. There are tons of pseudo-automations floating around out there: min/max inventory settings where the planner is expected to update the min and the max; safety stocks where the planner is expected to tune the target service levels; fractional demand forecasts where the planner is expected to round up – at the right time – because MOQs are present; etc. All those tasks treat the planner as some sort of “human coprocessor” of the system, invariably shifting the burden, in practice, back to spreadsheets. ↩︎

  5. The newer data might just be an existing table within an application. Enterprise applications are vast, and most of the time people only use a small fraction of the capabilities available to them. If the process is revised to take advantage of capabilities that have been, so far, left unused, new data may become relevant for supply chain purposes. ↩︎

  6. Unless something dramatic happens like a war, a lockdown, an ERP migration, a flood, ransomware, a strike, a new CEO, an earthquake, a reorganization, a new tariff, a budget cut, a snowstorm, a new regulation, etc. In other words, an event that commands immediate revision of the numerical recipe. Fortunately, such situations are rare, with only a couple of instances at most per quarter. ↩︎

  7. The setup of the numerical recipe can be seen as a direct application of Parkinson’s Law, which states that work expands to fill the time allotted for its completion. ↩︎

  8. A typical time horizon is 6 to 9 months for this phase. ↩︎

  9. Some technologies do provide near-certainty of immense overheads, however. Not all technologies are equal, let alone equally disposed to addressing supply chain challenges. See Factors of success in predictive supply chains ↩︎