FAQ: Glossary

Lokad’s approach to optimizing supply chain decisions leverages several perspectives and tools from different fields, not just supply chain itself. These include, amongst others, machine learning (ML), artificial intelligence (AI), probabilistic forecasting, philosophy, and economics. As such, our terminology is influenced by a multitude of disciplines. This page is intended to provide an explanation for how (and why) Lokad uses these terms in the context of supply chain optimization - as well as the specific nuance we aim to convey when we use them.

Intended audience: The supply chain and/or planning departments.
Last modified: May 2024

[Illustration: a man attaches a printed list to the doors of a factory while professionals observe in the background - a scene reminiscent of Martin Luther.]

What does “Supply Chain” mean?

For Lokad, supply chain is both a practice and a field of study that can be defined as follows:

Supply chain is the mastery of optionality in the presence of variability when managing the flow of physical goods.

Optionality refers to the capacity to pick the right “option” among many competing alternatives. The selected “option” becomes the “decision”. All the decisions that shape the flow of physical goods are considered, such as replenishment orders, production orders, and price changes. Furthermore, optionality refers to making the options available in the first place. For example, investing resources to identify alternative suppliers comes with the intent of creating further options for the company.

Variability refers to the irreducible uncertainty associated with the future state of the market, i.e., conditions can shift significantly from moment to moment. This is because supply chains are, by design, exposed to forces that cannot be fully controlled by the company. These forces include customer demand, commodity prices, supplier lead times, etc. Thus, whatever methods or instruments are used for supply chain purposes, they must frontally address the problems of imperfect knowledge and risk that are inherent to supply chains.

Finally, the flow of physical goods is critical and differentiates the mastery of supply chain from, say, financial trading. Supply chains are, naturally, constrained by their physical nature - the stakeholders (e.g., customers, suppliers, wholesalers, transporters, producers, etc.) are geographically distributed. Whatever methods or instruments are used to connect these stakeholders must directly (and adequately) address the numerous constraints that typically get in the way. Examples of these constraints include Minimum Order Quantities (MOQs), Minimum Order Values (MOVs), full truckloads, limited warehouse space, and a company’s overall capacity to handle inbound/outbound orders.
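To make such constraints concrete, here is a minimal sketch (in Python) of what checking a candidate purchase order against a few of them might look like; the function and all thresholds are hypothetical, purely for illustration:

```python
# A minimal sketch of checking a candidate purchase order against typical
# ordering constraints. The function and all thresholds are hypothetical.
def is_feasible(qty: int, unit_price: float, units_per_truck: int,
                moq: int = 100, mov: float = 1000.0) -> bool:
    if qty < moq:                   # Minimum Order Quantity
        return False
    if qty * unit_price < mov:      # Minimum Order Value
        return False
    if qty % units_per_truck != 0:  # full truckloads only
        return False
    return True

print(is_feasible(qty=240, unit_price=5.0, units_per_truck=60))  # True
print(is_feasible(qty=90, unit_price=5.0, units_per_truck=30))   # False (MOQ)
```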

What is the “mainstream supply chain perspective”?

Mainstream supply chain perspective (MSCP), or classic supply chain perspective, refers to several faulty assumptions and practices one typically encounters, including:

  • MSCP assumes the future is perfectly knowable. Methods, such as classic time-series forecasting, attempt to express the future as a single value (e.g., demand, returns, scrap rates, lead times, etc.). This is flawed because the future is, naturally, unknowable (i.e., future uncertainty cannot be completely eliminated with forecasting). Thus, identifying only one future value is suboptimal from a risk management perspective (as the likelihood dimension is missing).

  • MSCP assumes the supply chain itself is not subject to adversarial behavior. At no point does the MSCP even consider that stakeholders (e.g., employees, clients, suppliers, partners, competitors, etc.) may have agendas of their own, and that these agendas can adversely impact the company’s supply chain.

  • MSCP assumes observability. The reality is that every large company operates through an infuriatingly opaque applicative landscape (software applications), thus making direct observation of the supply chain exceedingly difficult.

  • MSCP lacks falsifiability. It is immune to reality. No matter how poorly the techniques listed in mainstream textbooks or vendor pitches perform in practice, none of those techniques can be invalidated by real-world feedback.

As of 2024, most large companies have implemented several supply chain optimization solutions since the 1980s, yet many (if not most) of those companies are still running on spreadsheets. This is another key aspect of the MSCP: software vendors have become masters at blame deflection, and invariably blame the supply chain practitioners for not being able to “follow the process” or “use adequate parameters”.

However, the reality is simpler: the MSCP theory itself does not work, and supply chain practitioners revert to their spreadsheets because whatever crude heuristics they have, those heuristics (though far from perfect) outperform the “sophisticated methods” found in MSCP textbooks.

This is precisely why Lokad undertook a refoundation of supply chain in 2011, both as a field of study and practice. We refer to this reformation as Quantitative Supply Chain.

The MSCP is perfectly captured by a series of classic textbooks:

  • Production and Operations Analysis, Seventh Edition, by Steven Nahmias and Tava Lennon Olsen, 2015
  • Inventory and Production Management in Supply Chains, Fourth Edition, by Edward A. Silver, David F. Pyke, and Douglas J. Thomas, 2016
  • Fundamentals of Supply Chain Theory, Second Edition, by Lawrence V. Snyder and Zuo-Jun Max Shen, 2019

What is “Demand Planning”?

From the mainstream supply chain perspective, demand planning is the collection of processes used by a company to quantify future demand. The implicit insight supporting demand planning is that once future demand has been accurately assessed, adequate supply chain management is mostly a matter of the correct and timely allocation of resources so that the company delivers “just enough” for the market.

Demand planning processes include backward-looking techniques, such as the statistical analysis of historical sales and the forecasting of their associated time-series. They also include forward-looking techniques, such as collaborating with sales and marketing to refine the numbers based on targets set by the company itself.

However, from Lokad’s Quantitative Supply Chain (QSC) perspective, demand planning is an antiquated concept that has no place in modern supply chain. Instead, QSC states that the execution of the supply chain should be robotized, driven by numerical recipes that typically feature a predictive modeling phase, and followed by a stochastic optimization phase.

The predictive modeling phase encompasses all aspects of “forecasting”, not just for the future demand, but also for all the other sources of uncertainty (e.g., future lead times, future commodity prices, future customer returns, etc.). The stochastic optimization phase encompasses all the “decision-making” parts (e.g., choosing the quantities to be reordered, allocating stock on hand through the sales network, repricing) - something that is traditionally kept separate from demand planning.

Demand planning is outdated for several reasons.

First, it assumes that people should be involved in the execution of the “quantitative assessment of the future”. This is a complete waste of time and energy. People should certainly be involved in crafting the numerical recipes that support the predictive models (as is the role of Lokad’s Supply Chain Scientists). However, the predictive models should operate completely unattended, as there is simply zero value-add in having people manually interfering with a process that typically generates thousands - if not millions - of numbers on a daily basis.

Second, as “manual demand planning” is already slow and costly, companies typically have no resources left to address all the other sources of uncertainty (e.g., lead times, returns, quality, etc.). While assessing future demand is critical, it is not the only source of uncertainty. Future lead times, future commodity prices, future competitor prices, future customer returns, etc., are other key sources of uncertainty that must also be assessed quantitatively.

In conclusion, demand planning is an outdated perspective on how to orchestrate supply chains. This perspective emerged before the advent of computers and has survived a few more decades than it should have. At this point, it should be replaced by more suitable approaches, such as Lokad’s QSC perspective.

What is “Quality of Service”?

In the context of supply chain optimization, “quality of service” (QoS) refers to the capacity of the company to service its clients while meeting their implicit expectations. QoS is not a metric or even something tangible: it reflects the company’s intent to adequately serve its customers. QoS is thus directional but vague.

Trying to uncover the implicit expectations of customers is a broad and multifaceted problem. Surveying customers is fraught with problems. Customers may politely say they are fully satisfied with the service and still visit your competition. Conversely, customers may loudly complain while remaining fervently loyal.

Also, QoS is never a one-dimensional problem. Cannibalization and substitution, along with pricing differences, usually obfuscate what “availability” really means for customers. Frequently, domain-specific concerns must be considered as well. For example, while there may be plenty of yogurts left on the shelf, customers may find them unacceptable if they all expire three days from now.

In practice, metrics that reasonably approximate QoS can only be uncovered through careful and intelligent examination of the business. It requires deep thinking and empathy with the customers. Certain methodologies, such as experimental optimization - an approach pioneered by Lokad - go a long way toward identifying high-quality proximate metrics.

One of the biggest errors of the mainstream supply chain perspective is to present service levels - the probability of not having an item out-of-stock - as if service levels were a reasonable proxy of QoS. This is almost invariably incorrect. Service levels entirely ignore all the cannibalization and substitution that are ubiquitous in most verticals. They also entirely ignore lumpy demand, where the client needs many items to be jointly available in order to be satisfied (e.g., a professor purchasing books for an entire class of students, multiple identical light switches needed for a house renovation project). Service levels also entirely dismiss the willingness of the client to pay more to be served faster, or, conversely, to pay less if the service is delayed.
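The lumpy demand point can be made concrete with a back-of-the-envelope calculation. Assuming, purely for illustration, that item availabilities are independent, even a high per-item service level collapses at the basket level:

```python
# Per-item service levels overstate the quality of service for lumpy
# demand. Both numbers are illustrative, and independence between item
# availabilities is a simplifying assumption.
service_level = 0.98   # per-item probability of being in stock
basket_size = 10       # items that must be jointly available

joint_availability = service_level ** basket_size
print(round(joint_availability, 3))  # ~0.817, despite 98% per item
```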

In conclusion, QoS is an aspirational perspective. It reflects what the company wants to optimize, even if the optimization criteria remain elusive, as the company is facing a wicked problem. QoS is the directional principle that guides the search for metrics that are credible proxies of this aspiration. Service levels, and other naïve metrics, should not be confused with reasonable proxies of QoS.

What is an “AI Pilot”?

This refers to the general automation of the orchestration of a supply chain using AI. The AI Pilot includes the decision-making processes (e.g., how much should I produce?) as well as the mundane supporting processes (e.g., obtaining up-to-date MOQs for a given supplier). Lokad coined this term in early 2024. The AI Pilot, as a piece of software, is crafted by Lokad’s Supply Chain Scientists. We craft one pilot per client company, although there are a lot of commonalities between our implementations. The AI Pilot runs on Lokad’s platform, featuring big data and machine learning capabilities. The AI Pilot is a service delivered by Lokad and typically billed monthly.

For more on how AI Pilots work, consult our long-form podcast on the topic.

What does “Quantitative Supply Chain” mean?

The “Quantitative Supply Chain” (QSC) is a set of methodologies and technologies pioneered by Lokad during the 2010s. It features a series of techniques like probabilistic forecasting, stochastic optimization, and differentiable programming, which are absent from the mainstream supply chain perspective. It also features a series of methodologies like experimental optimization, supply chain personae, and adversarial market research, which are also absent from the mainstream supply chain perspective.

The term “Quantitative Supply Chain” was coined in 2017 in Lokad founder Joannes Vermorel’s book The Quantitative Supply Chain. The QSC manifesto can be summarized as follows:

  1. All possible futures must be considered; a probability for each possibility
  2. All feasible decisions must be considered; possibilities vs. probabilities
  3. Economic drivers must be used to prioritize feasible decisions
  4. Being in control requires automation for every mundane task
  5. A supply chain scientist must take ownership of the numerical results

The Quantitative Supply Chain can be seen as the field of study that gathers all the materials needed for the practical implementation of an AI Pilot for a given supply chain.

What is “Supply Chain as a Service”?

From afar, Lokad can be described as an enterprise software company. However, up close, Lokad is quite different to what people might expect from a software vendor. Lokad not only delivers robotized supply chain decisions, but also takes ownership of the supply chain performance resulting from those decisions. Our Supply Chain Scientists are there to continuously monitor and refine the automation that we have put in place for every single one of our clients. Thus, a subscription to Lokad buys our clients a more profitable execution of their supply chain.

This approach is very different to, say, a SaaS (software as a service) offering where the client company remains ultimately responsible for everything that matters. With SaaS, the IT department is relieved from managing yet another app, but that is about it. If anything goes wrong, such as erratic demand, chaotic supplies, incomplete data, etc., it is entirely up to the supply chain teams to figure it out. On the other hand, with SCaaS (supply chain as a service), it is Lokad’s job to figure it out. Naturally, in practice, this is a collaborative effort between Lokad and its client. However, Lokad is committed to getting results, not merely committed to keeping the servers up.

What is a “numerical recipe”?

Lokad automates supply chain decision-making processes, and we commonly refer to the pieces of software that effectively perform the automation as our “numerical recipes”. These numerical recipes are typically quite complex, as they reflect the intrinsic complexities of the supply chains themselves. The recipes are a mix of a myriad of mundane data preparation steps interleaved with sophisticated analytical steps - typically machine learning or mathematical optimization algorithms.

We are using the term “recipe” rather than “algorithm” as those recipes are nowhere near as “pure” as what most software engineers would expect when talking about algorithms. Furthermore, while an algorithm is expected to address a well-defined problem, this is typically not the case for our recipes. Ultimately, the end-to-end performance of the supply chain is at stake, and this is an open and ill-defined problem. Thus, assessing the recipe is at least as complex as crafting the recipe itself.
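To give a sense of the overall shape, here is a minimal sketch of such a recipe; every step below is a hypothetical stub, whereas a real recipe involves hundreds of such steps, tailored to one company’s data and constraints:

```python
# A minimal sketch of the shape of a numerical recipe: mundane data
# preparation interleaved with predictive and optimization steps.
# Every step is a hypothetical stub, not Lokad's actual logic.
def clean(raw_sales):
    # Data preparation: deduplication, timezone fixes, returns
    # reconciliation, etc. (stubbed here as a pass-through).
    return raw_sales

def probabilistic_forecast(sales):
    # Predictive modeling: one probability distribution per SKU
    # (stubbed as a uniform distribution over 0..4 units).
    return {sku: [0.2] * 5 for sku in sales}

def stochastic_optimization(forecasts, stock_on_hand):
    # Decision-making: pick quantities maximizing the expected economic
    # outcome (stubbed as a naive order-up-to-2-units policy).
    return {sku: max(0, 2 - stock_on_hand.get(sku, 0)) for sku in forecasts}

def numerical_recipe(raw_sales, stock_on_hand):
    sales = clean(raw_sales)
    forecasts = probabilistic_forecast(sales)
    return stochastic_optimization(forecasts, stock_on_hand)

print(numerical_recipe({"sku-1": [3, 1, 2]}, {"sku-1": 1}))  # {'sku-1': 1}
```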

What are “(mundane) supply chain decisions”?

A supply chain decision is one that has real consequences for the flow of goods. For example, inventory replenishments, production orders, and price changes are decisions that profoundly affect the flow of goods.

Modern supply chains typically entail tens of thousands, and sometimes millions, of decisions daily. As a rule of thumb, every single SKU (stock keeping unit) encompasses about half a dozen decisions, including not doing anything, which is also a decision (though a trivial one).

These (daily) decisions are often referred to as “mundane” because they can be completely automated. In this sense, “mundane” does not mean “inconsequential”.

Decisions are opposed to “numerical artifacts” (things that appear important but lack any substantive impact on the supply chain in question). Indeed, as companies adopt more analytics, there is typically a great deal of confusion between the two. For example, a demand forecast, an ABC class, a safety stock, or a service level can be considered numerical artifacts. Those elements can certainly be instrumental in computing actual decisions, but by themselves, they are entirely inconsequential. It does not matter if the forecast is wrong as long as the inventory replenishment is adequate; however, the converse is obviously not true.

Lokad focuses on “decisions” instead of “numerical artifacts”, as too many companies cannot see the forest for the trees: they have so many performance indicators (artifacts) that they can no longer see what is actually happening with the decisions they make. Our focus on “decisions” is what guarantees that Lokad pursues what actually matters for our clients (better supply chain performance) rather than chasing arbitrary indicators (numerical artifacts).

What is a “numerical artifact”?

A numerical artifact refers to a number that is perceived as important, even if this number does not have any direct/tangible consequence for the company’s supply chain. For example, service levels are numerical artifacts. Numerical artifacts are not real; rather, they are abstractions - often arbitrarily selected by a practitioner.

For example, a 98% service level may hide the fact that numerous clients have already ceased to order altogether due to past poor quality of service. Furthermore, numerical artifacts cannot be controlled directly. A company can make more replenishment orders, but it cannot decide unilaterally that service levels will be at 98%, as customers ultimately decide how much of a given product is consumed.

Numerical artifacts are definitionally opposed to “decisions”, which have real consequences for the company. Decisions are also entirely at the discretion of the company. Typical supply chain decisions include replenishment orders, production orders, price changes, etc. Unlike numerical artifacts, every single bad decision is an irrevocable loss for the company. For example, the ABC class (numerical artifact) of an item can be completely incorrect, but as long as there is no overstock and no stockout, it does not matter. On the other hand, a single insanely large purchase order (decision) can turn a well-stocked item into a massive write-off.

Over the years, Lokad has learned the hard way that numerical indicators are mostly delusional and misguided. More often than not, forecasting accuracies and service levels entirely mischaracterize the problem of interest (namely, anticipating and satisfying demand in a profitable way for a company). We must focus on “dollars of error” for a given decision, not “percentages of error” for a numerical artifact.

What does “robotized” mean?

Lokad robotizes (“automates”) repetitive supply chain decisions, making them mundane. This means that all the daily decisions entrusted to Lokad (e.g., production orders, price changes, stock allocations) are generated in a fully unattended manner. As a rule of thumb, most of our clients can operate for weeks without any direct intervention from Lokad - assuming market conditions do not change too dramatically. Our Supply Chain Scientists continuously improve the numerical recipes that we have put in place, but no Supply Chain Scientist is needed to generate the supply chain decisions of any given day - our numerical recipes are designed to run unattended.

This approach is radically different from what our peers (other enterprise software vendors) offer. In their case, supply chain practitioners are treated like the “human coprocessors” of their system. The minute the practitioners stop operating the software, nothing happens anymore in the company, as the practitioners’ time is consumed producing the supply chain decisions.

In contrast, Lokad turns every single minute of the supply chain practitioners into an investment for the improvement of the numerical recipe that enables decision robotization. If the supply chain practitioners stop working, it has no impact on the robotized execution of the numerical recipe; the numerical recipe just stops improving. If this were to continue for a while, it would result in the inevitable decay of the quality of the decisions, as the numerical recipe would lose its relevance (due to changes in market conditions).

However, Lokad has implemented a series of self-monitoring mechanisms for the early detection of such quality decay, and can thus trigger a human inspection of the numerical recipe (first, by a Supply Chain Scientist, and then by a practitioner of the client company).

What is a “supply chain initiative”?

This refers specifically to a supply chain project driven by Lokad’s “Quantitative Supply Chain” (QSC) perspective. The goal of such an initiative is to robotize a given class of supply chain decisions and achieve beyond-human performance in the process. The aim is not only to improve a specific issue within the client’s supply chain, but to deliver systemic, company-wide improvement.

Lokad’s supply chain initiatives typically tackle purchase orders, production orders, inventory allocations, price changes, fine-grained production schedules, etc. In essence, we address all the mundane, repetitive decisions needed to keep the client’s supply chain functioning.

The supply chain initiative is intended to be spearheaded by Lokad’s Supply Chain Scientists (SCSs). An SCS is there to craft all the necessary numerical recipes required to automate the supply chain decisions of interest. Furthermore, the SCS is responsible for visualizing (e.g., through reports and dashboards) the numerical recipes in order for clients (particularly upper management) to understand both how and why the automation delivers beyond-human performance.

The deliverable for a Lokad supply chain initiative is putting into production the numerical recipe(s) that the SCS has crafted for the client. This deliverable automates decisions and effectively converts the supply chain into a productive asset for the client (in the same way a piece of equipment automates the production of physical goods).

What is “experimental optimization”?

Experimental optimization is a methodology, employed by Lokad, for tackling problems where the very notion of “improvement” is unclear at the beginning of the optimization process. This lack of clarity is because the optimization criteria (metrics) and their favorable levels are not known, or, even if previously set at certain levels, cannot be immediately justified in financial terms (e.g., profitability, ROI, etc.). The point of “experimental optimization” is to establish a rigorous (some would say “scientific”) method to quantify what “improvement” means for a supply chain from a financial perspective.

For example, consider a fashion store that wants to improve its quality of service. One of the major problems is that identifying what “quality of service” really means for the average client is difficult given the different nature of male vs. female shopping habits, as well as the influence of substitutions. Substitutions, by nature, make it difficult to identify how clients perceive your offering, even if they have made a purchase – e.g., purchasing a plain black t-shirt instead of a plain white one that is out-of-stock. On paper, a sale was made, but the absence of the plain white t-shirt may signal the end of the customer’s loyalty, particularly if it is a regular occurrence. Furthermore, men are less likely to spend time browsing multiple items than women, thus not having the exact item they want (or a suitable substitution) can be decisive when it comes to making a purchase. As such, what management thinks they know about their own business (and customers) might be incredibly misleading, which in turn can orient their stocking strategies in the wrong direction.

To this end, experimental optimization consists of conducting a series of experiments that challenge the optimization criterion itself - the very instrument that quantifies whether the supply chain is better or worse (e.g., “quality of service”). The gist of the method consists of picking a criterion, running a mathematical optimization (or more specifically a stochastic optimization) against this criterion, and assessing the resulting supply chain decisions. This assessment is not done in aggregate, but by hunting for the most insane decisions - those that simply cannot be correct. The criterion must then be modified in order to gradually eliminate those insane decisions, until there are none left.
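As a sketch, the iterative loop at the heart of the method might look as follows; all the callables are hypothetical stand-ins, and in practice flagging the insane decisions is a human review performed together with the supply chain practitioners:

```python
# A minimal sketch of the experimental optimization loop. The callables
# `optimize`, `flag_insane`, and `refine` are hypothetical stand-ins for
# the real (largely human-driven) work.
def experimental_optimization(criterion, forecasts,
                              optimize, flag_insane, refine,
                              max_rounds=20):
    decisions = optimize(criterion, forecasts)
    for _ in range(max_rounds):
        # Hunt for decisions that simply cannot be correct
        # (e.g., ordering years' worth of stock of a single item).
        insane = flag_insane(decisions)
        if not insane:
            break  # no insane decisions left: the criterion stands
        # Amend the criterion so the flagged decisions are eliminated.
        criterion = refine(criterion, insane)
        decisions = optimize(criterion, forecasts)
    return criterion, decisions
```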

The resulting criterion has, thus, been obtained through a series of experiments. Unlike the classic optimization perspective, which assumes the criteria to be known in advance and never challenged by the real world, experimental optimization uncovers the criteria by repeatedly confronting them with real-world settings.

To return to the previous fashion example, an experimental optimization may indicate that redistributing preexisting stock between stores is the optimal solution, or perhaps that simply rearranging the displays in each store is sufficient to stimulate greater traffic and perceived quality of service. These conclusions are only discoverable after repeatedly experimenting with (“tweaking”) the numerical recipe that generates the optimization recommendation(s).

Supply chain, like any other system, is more than the sum of its parts. In fact, the curse of supply chain is that most supposed improvements merely displace problems rather than solving them, as the problems are treated as local issues rather than expressions of system-wide ones. For example, increasing service levels usually implies increasing inventory write-offs as well. As such, there is no nudging or tweaking an isolated element within a system without impacting the rest of the system too. As a result, it is invariably difficult to quantify whether something is making the system (the supply chain) better or worse.

Furthermore, in the specific case of supply chains, this difficulty is compounded as it usually takes a lot of time for events to unfold. In the case of the fashion store from earlier, men may be incredibly loyal to a store that consistently has the items they want, driven by the sheer convenience of not having to spend too long shopping. As a result, stockout events can be devastating for customer loyalty, and the damage can take a long time to manifest (as men may only shop a few times per year, yet purchase many items with each visit to maximize the value of each trip). These considerations and contingencies confound all the naïve approaches to quantifying supply chain performance, hence Lokad advocates a solution predicated upon experimental optimization.

For a more detailed perspective, there is a full-length lecture on Experimental Optimization for Supply Chain.

What is “probabilistic forecasting”?

Probabilistic forecasting is the process of identifying all possible future values/outcomes/scenarios (e.g., demand for a given SKU), and assigning a probability to each value. The probability represents how likely that value is to become “real” (e.g., one might have a 3% probability of selling 4 units; a 4% probability of selling 5 units; a 2% probability of selling 6 units; etc.). Numerically, the probabilities sum to 100%, so this probability distribution (aka “probabilistic forecast”) covers all the potential values (e.g., of demand).
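As a minimal illustration, such a forecast can be represented as a discrete probability distribution; the numbers below are made up for the sake of the example:

```python
import numpy as np

# A discrete probabilistic demand forecast for one SKU; the probabilities
# below are made up for the sake of the example.
demand = np.arange(0, 11)             # possible demand values: 0..10 units
probs = np.array([0.02, 0.04, 0.08, 0.15, 0.20, 0.18,
                  0.13, 0.09, 0.06, 0.03, 0.02])
assert abs(probs.sum() - 1.0) < 1e-9  # the probabilities cover 100%

expected_demand = float((demand * probs).sum())  # one summary among many
tail_risk = float(probs[demand >= 8].sum())      # P(demand >= 8 units)
print(expected_demand, tail_risk)                # 4.71 0.11
```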

A forecast is said to be “probabilistic” if the forecasted value is a probability distribution rather than a single point. Probabilistic forecasts are the opposite of the traditional point forecasts that dominate the mainstream supply chain theory. The primary upside of probabilistic forecasts is that they embrace the irreducible uncertainty of the future, instead of pretending that the “perfect” forecast is just around the corner if only a slightly more accurate model could be uncovered. Probabilistic forecasts quantify uncertainty, and this quantification is fundamental in order to later produce risk-adjusted supply chain decisions. Without probabilistic forecasts, decisions are fragile as they are entirely dismissive of the mundane variations (e.g., demand, lead time) that are expected to happen even under regular market conditions.

It is worth noting that any point time-series forecast can be “mathematically” turned into a probability distribution. This is exactly what is being done with safety stocks, as both demand and lead time can be paired with normal (Gaussian) distributions. However, while such techniques nominally generate probability distributions, those techniques are also entirely missing the point. The core issue to be addressed by a probabilistic forecast is to produce a richer forecast - a forecast that holds more information than a point forecast. This probabilistic forecast is not necessarily more accurate, just like a color photograph does not necessarily have a better resolution than a black-and-white one. However, by construction, a point forecast lacks this extra dimension. Even if a mathematical trick can be used to add probabilities, those probabilities will be almost entirely made up, just like the colorization of a picture can be highly plausible while being factually incorrect.
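For reference, here is a minimal sketch of that classic “mathematical trick” - the textbook safety stock formula that dresses point forecasts with a normal distribution. All the numbers are illustrative assumptions, and the resulting probabilities suffer precisely from the “made up” problem described above:

```python
from math import sqrt
from statistics import NormalDist

# The textbook safety stock formula: point forecasts of demand and lead
# time, dressed with an assumed Gaussian spread. All numbers are
# illustrative assumptions.
mean_daily_demand = 50.0  # point forecast of demand (units/day)
std_daily_demand = 15.0   # assumed Gaussian spread of daily demand
lead_time_days = 7.0      # point forecast of the lead time
service_level = 0.98      # target probability of no stockout

z = NormalDist().inv_cdf(service_level)  # z-score for 98% (~2.05)
safety_stock = z * std_daily_demand * sqrt(lead_time_days)
reorder_point = mean_daily_demand * lead_time_days + safety_stock
# safety_stock ~ 81.5 units; reorder_point ~ 431.5 units
```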

In short, probabilistic forecasts represent one of the core data processing stages needed for the predictive optimization of a supply chain.

What is “general forecasting”?

A forecasting technique is said to be “general” if it supports data that do not present themselves as time-series. Indeed, while time-series forecasts are very useful for visualization purposes, they are ultimately simplistic, one-dimensional models that fail to reflect events as they unfold in a real-world supply chain.

Considering multiple time-series does not address the problem either.

For example:

  • repeat purchases from the same clients cannot be modeled with time-series as a demand time-series completely flattens the origin of every unit being purchased.
  • cannibalization or substitution cannot be represented as time-series as the dependencies between the items are lost.
  • competitors competing on prices, bulk discounts, service levels, etc., cannot be captured by a time-series as it cannot reflect these causal factors.

In mainstream supply chain theory, time-series forecasts are the alpha and the omega. Yet, a careful examination of real-world situations should demonstrate that time-series forecasts are invariably a severely misguided simplification of the situation - see the examples listed above. In Lokad’s Quantitative Supply Chain (QSC) theory, it is better to be approximately correct than exactly wrong. Pretending that a real-world problem (e.g., substitutions) does not exist does not make the problem go away.

For this reason, since the early 2010s, Lokad has developed and pioneered a whole series of superior forecasting technologies that deliver more general forms of forecasts (beyond basic time-series ones). As per our QSC perspective, every single source of uncertainty requires a probabilistic forecast of its own. These “general forecasts” are delivered not by “forecasting models”, but through programmatic machine learning paradigms, such as differentiable programming.

What is “classic forecasting”?

By “classic forecasting” we mean point time-series forecasting. Point time-series forecasts are so ubiquitous in the mainstream supply chain theory that many people, including many supply chain practitioners, do not realize that point time-series forecasts are merely one form of statistical forecasts. There are, in fact, a wealth of alternative forms of statistical forecasts, with point time-series being one of the most simplistic.

Note: A simplistic forecast is not necessarily a bad thing. In fact, Lokad believes forecasting software should be no more complicated than necessary to accomplish its task. That said, point time-series forecasts are insupportably simplistic, as was demonstrated in “What is “general forecasting”?”.

Point time-series forecasts garnered popularity at the very beginning of the 20th century, half a century before the advent of corporate computers. Until powerful computers became widely affordable, point time-series forecasts were the only sort of statistical forecasts that could be produced. Despite their rather extreme simplicity, time-series forecasts were already too much work to really be worth the investment, given they were produced without the processing power of corporate computers. As a result, most companies used all sorts of tricks to entirely remove the need to statistically forecast anything in the first place.

There are two distinct and complementary avenues to go beyond classic forecasts. The first avenue consists of replacing the “point forecast” angle with a “probabilistic forecast” one.

Probabilistic forecasts, unlike their “point” counterparts, deliver full probability densities. Probabilistic forecasts embrace the irreducible uncertainty of the future, and frontally quantify this uncertainty. Supply-chain wise, probabilistic forecasts are vastly superior to point forecasts because they lend themselves to the later computation of risk-adjusted supply chain decisions. In contrast, point forecasts ignore all sources of uncertainty, and decisions derived from these forecasts are fragile by design.

The second avenue consists of replacing the “time-series” angle with a higher dimensional alternative. Time-series are one-dimensional by design. This inherent limitation means that time-series forecasts are simply unable to capture even the most basic interdependencies that can be observed in the flow of supply chain goods.

For example, time-series forecasts cannot apprehend cannibalization and substitution. They cannot apprehend the risk of having steady sales volume that is entirely dependent on a single large customer (e.g., in B2B situations). They cannot apprehend the basket perspective of a patron shopping in a hypermarket who needs all the necessary ingredients to complete a recipe (i.e., not having any single item means nothing is purchased). Lokad uses differentiable programming to craft predictive models that go beyond the one-dimensional perspective of the time-series and capture the true information of interest.

In conclusion, classic forecasting is an antiquated statistical perspective that has no place in a modern supply chain. Relying on classic forecasts - aka point time-series forecasts - is a recipe for failure as those forecasts lead to fragile decisions at best, and downright incorrect decisions at worst. Instead, we recommend using general probabilistic forecasting, typically leveraging a programmatic machine learning paradigm like differentiable programming.

What is “the basket perspective”?

The basket perspective is a concern of prime relevance for all verticals where clients are expected to buy many items at once (as a single transaction), rather than a single item. It refers to the perceived value of having the purchases made in combination rather than in isolation. In other words, the value of all the items being available as a whole might be more than the sum of the values of the items acquired separately. This perspective is critical for many verticals, such as general merchandise retail. Recognizing this interdependence in demand leads to superior supply chain decisions compared to traditional methodologies that treat each SKU’s purchase as an isolated event.

For example, consider a customer entering a supermarket to purchase multiple items. These items represent a mix of essential staples (e.g., milk, bread, and eggs) and discretionary purchases (e.g., ice cream and chocolate). If the supermarket experiences a stockout for a discretionary item (e.g., chocolate), the customer is likely to still purchase the other items (milk, bread, eggs, and ice cream). However, if there is a stockout for an essential staple (e.g., milk), the customer may leave without buying anything and go to a competitor to complete their purchases. Thus, the financial penalty of the stockout for the essential item extends beyond the item itself, impacting the entire basket of sales.
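This asymmetry can be sketched with a toy model, under the assumed rule that a missing staple forfeits the whole basket while a missing discretionary item only forfeits that item; all items and prices are illustrative:

```python
# A toy model of basket-aware stockout penalties. Items, prices, and the
# staple/discretionary split are illustrative assumptions.
basket = {
    "milk":      {"price": 1.5, "staple": True},
    "bread":     {"price": 2.0, "staple": True},
    "eggs":      {"price": 3.0, "staple": True},
    "ice_cream": {"price": 4.5, "staple": False},
    "chocolate": {"price": 2.5, "staple": False},
}

def stockout_penalty(item: str) -> float:
    """Revenue lost across the basket if `item` is out of stock."""
    if basket[item]["staple"]:
        # The customer walks out: the entire basket is lost.
        return sum(entry["price"] for entry in basket.values())
    # Only the discretionary item itself is lost.
    return basket[item]["price"]

print(stockout_penalty("chocolate"))  # 2.5
print(stockout_penalty("milk"))       # 13.5 - the whole basket
```

Even this toy model makes the ranking obvious: the stockout on milk costs over five times more than the stockout on chocolate, despite milk being the cheaper item.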

Essentially, there are relationships between products, and the absence of some products affects the likelihood of customers purchasing others. Lokad incorporates this subtle but significant phenomenon into its supply chain decision recommendations to optimize inventory and reduce stockouts (ranked from those that would hurt the most to the least), thereby enhancing overall sales, client profits, and customer satisfaction.

What is a “supply chain scientist”?

A supply chain scientist (SCS) is the person who spearheads one of Lokad’s Quantitative Supply Chain (QSC) initiatives with a client, e.g., the provision of risk-adjusted purchase orders, stock allocation lists, prices, etc. The term “supply chain scientist” was coined by Joannes Vermorel, CEO and founder of Lokad, in 2017. The primary commitment of the SCS is the generation, maintenance, and ownership of the numerical recipes responsible for the decision-making in a given supply chain initiative.

Unlike a data scientist, whose primary responsibility consists of producing models to support decision-making processes, the SCS takes personal responsibility for the quality of the decision-recommendations generated by the numerical recipes. Furthermore, the SCS also takes direct responsibility for crafting all the instrumentation (e.g., dashboards, reports) that explains the logic and suitability of the generated decisions. It might seem a little paradoxical, but while Lokad emphasizes the robotization of mundane decision-making processes, we also put personal responsibility front and center. In QSC, it is not a “system” that is responsible for the performance of the supply chain; rather, it is the person driving the initiative.

However, while an SCS has a personal responsibility, they are not alone in their mission. Lokad is entirely dedicated to making sure that every SCS gets as much support as possible. This implies providing the SCS with all the necessary software tools, mathematical instruments, methodologies, training, and monitoring from senior SCSs.

A more granular description of what an SCS does can be found in our dedicated knowledge base article The Supply Chain Scientist.

What is a “supply chain practitioner”?

The term “supply chain practitioner” generally refers to all the people who are traditionally involved in or responsible for making the supply chain decisions that the company requires to operate. As there is no unified terminology, the title varies for different verticals and from one company to the next. Common variations of “supply chain practitioner” include supply and demand planner, inventory analyst, demand forecaster, category manager, inventory manager, production manager, purchasing manager, pricing manager, etc.

The Quantitative Supply Chain (QSC) offers a modernized vision of the role of supply chain practitioner. While the traditional supply chain practitioner is directly responsible for manually supporting the decision-making process, the QSC recommends entirely mechanizing all the repetitive tasks. Through this robotization, supply chain practitioners can focus on tasks that bring more value-add for the company. In particular, supply chain practitioners are at the forefront of challenging the numerical recipes (the software pieces that support the robotization of the supply chain) by gathering feedback and high-level intelligence from customers and suppliers.

What does “supply chain executive” mean?

For Lokad, this term specifically refers to a person who is in a position to arbitrate conflicting proposals within the company concerning the robotized decision-making processes that orchestrate the supply chain.

This role is critical in a Quantitative Supply Chain (QSC) initiative, which unifies the supply chain decision-making processes through explicit financial assessments performed with software logic. The QSC reveals all the contradictions and ambiguities that typically preexist in the company. As a result, in order to make sure that a supply chain initiative does not become deadlocked through sheer indecision, a supply chain executive must be appointed with the power to arbitrate conflicting proposals from a strategic perspective (e.g., achieving expected service levels at any cost versus finding an optimized set of decisions within a constrained budget).

Note: The QSC does not assume that the supply chain executive has some innate capacity to be “right” all the time. Sorting out what works from what does not is the role of the experimental optimization methodology Lokad employs, not the supply chain executive.

What is a “risk-adjusted decision”?

In the context of supply chain optimization, a decision is said to be risk-adjusted if the decision carefully balances the economic costs associated with the uncertain future state of the market and of the supply chain itself. A risk-adjusted decision is “better” in the sense that, by considering all the possible futures and their associated probabilities, the average financial outcome associated with this decision turns out to be greater than that of the alternatives.

Also, risk-adjusted decisions tend to be anti-fragile decisions (as opposed to fragile decisions). This means their expected economic gains remain somewhat good (or at least acceptable) for a wide range of future variations. This property is achieved by factoring in all the non-linear constraints and costs (e.g., perishability).

The implicit assumption behind the very idea of risk-adjusted decisions is that the serious economic costs lie at the extremes: it is unexpectedly high demand that causes stockouts, and unexpectedly low demand that causes overstocks. In between, everything goes pretty much according to plan, and whatever improvements are brought to the “according to plan” case are mostly negligible for the company.
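A minimal sketch of how such a decision can be derived: score every candidate order quantity by its expected outcome over a probabilistic demand forecast, with asymmetric penalties at the extremes. All prices, costs, and probabilities below are illustrative assumptions:

```python
import numpy as np

# A minimal sketch of a risk-adjusted ordering decision: score every
# candidate order quantity by its expected outcome over a probabilistic
# demand forecast. All prices, costs, and probabilities are illustrative.
demand = np.arange(0, 11)
probs = np.array([0.02, 0.04, 0.08, 0.15, 0.20, 0.18,
                  0.13, 0.09, 0.06, 0.03, 0.02])  # sums to 1.0

unit_margin = 5.0     # profit per unit sold
unit_overstock = 2.0  # cost per unit left over (e.g., write-off risk)
unit_stockout = 8.0   # penalty per unit of unserved demand (lost goodwill)

def expected_outcome(q: int) -> float:
    sold = np.minimum(demand, q)
    value = (unit_margin * sold
             - unit_overstock * (q - sold)       # overstock at low demand
             - unit_stockout * (demand - sold))  # stockout at high demand
    return float((value * probs).sum())

best_q = max(range(11), key=expected_outcome)
# best_q is 7 here, above the mean demand of ~4.7 units: stockouts cost
# more than overstocks, so the risk-adjusted decision hedges upward.
```

Non-linear constraints (e.g., MOQs) simply restrict the set of candidate quantities before the same expected-outcome comparison is applied; this is, in essence, what a stochastic optimization does at scale.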

In contrast, most decision-making processes recommended by the mainstream supply chain theory do not generate risk-adjusted decisions. In fact, the decisions are typically fragile. They are fragile (see full explanation below) due to lacking the one critical ingredient for generating a risk-adjusted decision in the first place: a probabilistic forecast. Indeed, if the only forecast available is a point time-series forecast, then the decision-making process is implicitly going “all in” on a single future value (e.g., demand) that is assumed to be perfectly known. This approach invariably leads to fragile decisions as they are rendered immediately inadequate the moment an exception or unanticipated risk is presented - something that is all too common in supply chain, and all too easy to anticipate with a probabilistic forecast.

A more granular (though more technical) illustration of how risk-adjusted decisions are derived in practice can be found in our tutorial Prioritized inventory replenishment in Excel with probabilistic forecasts and our QSC lecture Retail stock allocation with probabilistic forecasts.

What is a “fragile decision”?

In the context of supply chain optimization, a decision is fragile if minute variations of the market conditions or of the state of the supply chain itself undermine the economic gains that were originally expected from this decision. The decision-making processes promoted by the mainstream supply chain theory invariably produce fragile decisions - even considering mild market conditions where nothing notable is happening.

In our opinion, decisions must be risk-adjusted. This is the approach that Lokad’s Quantitative Supply Chain (QSC) philosophy recommends. In practice, producing a risk-adjusted decision requires two notable ingredients: first, probabilistic forecasting, and second, stochastic optimization.

Probabilistic forecasting quantifies the future uncertainty in the form of a probability distribution. Stochastic optimization computes the decision that will prove to be “the best” on average when considering all the possible futures and their respective probabilities. It does this by combining the economic drivers, the constraints, and the probabilistic forecasts.

What is “epistemic corruption”?

Epistemic corruption is when a body of knowledge loses its integrity and ceases to be of value for the people or the organization that relies on this knowledge for the betterment of their activity.

Supply chain, as a field of study, has unfortunately undergone a severe case of epistemic corruption since the end of World War II. There are two primary root causes for this present-day state of affairs:

First, academia, mostly unintentionally, dropped the ball decades ago. While tens of thousands of papers are published every year, practically none of those papers can either be reproduced or falsified (in the Popperian sense of falsification*). Unlike papers in fields that do not suffer from widespread epistemic corruption (e.g., the study of algorithms), supply chain papers are almost never used in real-world settings, and certainly not for long when they are.

Second, market analysts, software vendors, and consultants have been acting as adversaries for decades. Indeed, there are profits to be made in prolonging rather than addressing the problems. Antiquated methods that would have perished a long time ago have been put on life support by actors that were all too eager to maintain the status quo. Oddly enough, the status quo has been around for so long that most of those people can honestly claim that, from their perspective, the methods have “always” been around - because, technically, the methods effectively predate them.

The solution to widespread cases of epistemic corruption is more effective methodologies and methods that let companies separate the wheat from the chaff faster (and better). To this end, Lokad has been conducting a refoundation effort of supply chain since 2011. This novel approach is named “Quantitative Supply Chain” (QSC). It features alternative techniques and methodologies, like probabilistic forecasting and experimental optimization.

*The “Popperian sense of falsification” here refers to the philosophy of science developed by Karl Popper. According to Popper, for a theory to be considered scientific, it must be falsifiable — that is, it must be possible to conceive of an observation or an experiment that could prove the theory wrong. In other words, scientific theories should make predictions that can be tested and potentially refuted. This concept is critical in distinguishing scientific theories from non-scientific ones. Supply chain research (typically) lacks falsifiability as the theories cannot be tested and potentially disproven, which undermines their scientific value and contributes to the epistemic corruption of the field.

What is “correctness by design”?

Correctness by design is a principle that emphasizes the importance of ensuring that the design of a system inherently prevents certain types of errors or failures. This approach is in contrast to the more common practice of relying on extensive testing and quality control to catch and correct issues after they have occurred. The goal of correctness by design is to minimize the need for ongoing maintenance and to reduce the risk of catastrophic failures that can result from complex systems. In the context of supply chain optimization, correctness by design is particularly relevant because the analytical layer (the layer responsible for the decision-making processes) must not augment the chaos endemic to supply chain – an admittedly already chaotic environment.

Numerical recipes - intended to support the supply chain decision-making processes - are often implemented with generic tools that offer no correctness by design. As a result, such numerical recipes typically die a death by a thousand cuts. The production fails due to index-out-of-range errors, out-of-memory errors, race conditions, numerical overflows or underflows, etc. “Move fast and break things” is an acceptable engineering philosophy for a lifestyle app, not for a mission-critical business system.

To this end, Lokad has engineered Envision, its DSL (domain-specific programming language) dedicated to the predictive optimization of supply chains with correctness by design in mind. Lokad did not start with Envision back when it was founded in 2008. For years, we relied on general purpose languages like Python. It took us years to realize that our attempts failed more often than not because of Python.

Even more puzzling, the situation was exactly the same for the data science teams of our own clients. The story almost always unfolded in the same manner: within three weeks, the data science team had crafted what appeared to be a highly promising prototype. Yet, after one year of intense efforts to make sure it would work in real-world production settings, the project was discarded, as it had never reached the necessary “production-grade” quality.

Thus, after years of pain and misery, we concluded in 2012 that the programming language itself was the core problem to be addressed. In other words, Python was not the solution, but the problem. Thus, left with no better alternative, the Lokad engineering team initiated a decade-long engineering effort to create a DSL dedicated to supply chains that would “by design” address all those issues to the fullest extent. This is how Envision came to be.

More than a decade later, we now have several billions’ worth (USD and EUR) of inventory under the direct control of the extensive numerical recipes written in Envision. Envision has not only dramatically improved the productivity of Lokad’s Supply Chain Scientists, but it has also enormously reduced the frequency of “dumb” and very costly mistakes.

In conclusion, correctness by design is a core requirement for any programming language intended to steer real-world supply chains. Many software vendors, out of sheer negligence or incompetence, do not frontally address this problem, invariably causing immense damage for their clients.

What is “maintainability”?

Maintainability, in the context of supply chain software, refers to the capacity of the company and its supporting software vendors to keep its applicative landscape in working order.

As far as the “management” of the supply chain is concerned, “maintainability” is a relatively straightforward affair. There is a humdrum of security and compatibility patches to keep the software working under changing conditions (e.g., changes of operating systems, browsers, database versions, etc.). Unless the company desires functional changes to be brought to its “management” apps, maintainability is largely a given if the vendor is even modestly competent.

However, supply chain optimization is a very different problem. The numerical recipes that robotize the decision-making processes invariably lose their relevance over time. The cause of the decay is not so much the market growing and shrinking, as it is fairly straightforward to numerically accommodate this sort of variation as part of the static numerical recipes (e.g., a moving average does that, albeit in a crude manner).

Rather, the cause of the decay is the evolution of the problems to be solved. Evolving market conditions do not merely require answers that happen to be quantitatively different, but different sorts of answers altogether. For example, mail-order companies never had to face the problem of steering their SEM (search engine marketing) investments to support the liquidation of excess inventory - a situation routinely faced by e-commerce companies.

Supply chain optimization software is much more susceptible to evolving market conditions than supply chain management software. As of 2024, it is not rare to encounter companies that still operate an inventory management system that was deployed in the 1990s (which may still appear to work just fine), given clerical stock-keeping tasks have remained virtually unchanged over the last 30 years. However, a supply chain optimization logic that can even stay relevant for 3 years is vanishingly rare.

Most supply chain optimization vendors fail to acknowledge this fundamental problem. As a result, investments tend to be heavily concentrated at the very beginning, when the client company is being onboarded by the vendor. During the first couple of months, as the vendor is still heavily involved with an evolving setup, the solution gives the illusion of being satisfying. However, fast forward 18 months after the end of the onboarding phase, and the numerical recipes have decayed to the point of irrelevancy. Invariably, supply chain practitioners revert to their spreadsheets that, despite being crude, can be maintained to stay somewhat relevant.

The maintainability problem is one of the core reasons that led Lokad in 2012 to create Envision - a DSL (domain-specific programming language) dedicated to the predictive optimization of supply chains. Indeed, during the early years of Lokad (founded in 2008), we came to realize that whatever numerical recipes we could craft, more often than not, and no matter how good our initial implementation was, those recipes would have to be extensively rewritten every 18 months or so. This was the price to be paid to keep the automation strictly aligned with the ever-changing strategy and priorities of our clients. Thus, Envision was specifically engineered to accommodate the need for constant rewrites in order to avoid irrelevancy.

In conclusion, maintainability, as far as supply chain optimization is concerned, is largely defined by the capacity of the company to routinely rewrite the numerical recipes that govern the execution of its own supply chain. While this capacity hinges on the size of the Supply Chain Scientist team that can be allocated for the task, it is also heavily dependent on the quality of the programming language used to implement the numerical recipes in the first place.