00:00:07 The black box problem in supply chain management.
00:02:20 The rise of AI and complex numerical recipes exacerbating the black box problem.
00:03:59 Differences between older ERP systems and modern ones, and their relation to the black box issue.
00:06:10 Real-world examples of the black box problem and its effects on companies.
00:07:36 How companies are overcoming the black box problem using Excel sheets.
00:09:06 The concept of white boxing for transparency and insight.
00:11:27 Importance of crafting a few well-designed numbers to explain decisions.
00:14:17 The benefits of using economic drivers for decision-making.
00:15:28 Excel’s limitations in handling complex data and the need for better tools.
00:17:36 The importance of thorough documentation of system inputs to avoid garbage outputs.
00:18:47 Exploring the need for agile data crunching layers that support intermediate calculations.
00:20:08 Importance of crafting independent economic drivers to avoid misleading results.
00:21:09 Why white-boxing is crucial and the consequences of not having a transparent process.

Summary

In this interview, Kieran Chandler and Joannes Vermorel discuss the importance of “white boxing” in supply chain optimization. Supply chains are complex systems, and incorporating software or AI-driven techniques can make them more opaque. Vermorel argues that traditional ERP systems, often designed with zero intelligence, contribute to this opacity. Lokad’s white-boxing approach aims to recreate transparency by focusing on a few carefully crafted numbers and using company-wide metrics like economic drivers. With Envision, their programming language, Lokad helps maintain transparency in the supply chain optimization process. Embracing white-boxing can build trust in supply chain systems and improve overall efficiency.

Extended Summary

In this interview, Kieran Chandler, the host, discusses the concept of “white boxing” with Joannes Vermorel, the founder of Lokad, a software company that specializes in supply chain optimization. The conversation begins with an explanation of the “black box” problem in computer science, which refers to systems where inputs and outputs can be observed without understanding their internal workings. This can be problematic, especially with the increasing prevalence of artificial intelligence in various industries.

Vermorel explains that supply chains are inherently complex systems involving numerous people, products, and locations. This complexity leads to opacity, making it difficult for those involved to fully grasp the workings of the supply chain. The addition of software, even basic software that is not particularly “smart,” can further exacerbate this opacity. This issue is further compounded when advanced numerical recipes or AI-driven techniques are introduced, which can make it increasingly difficult for those in the organization to understand the meaning behind the numbers and results produced by these systems.

Opacity, as Vermorel defines it, is the difficulty in determining the origins and meaning behind results or measurements produced by a system. In the context of supply chains, this means that it can be challenging to understand why certain values or results were produced and what factors contributed to them.

According to Vermorel, the black box problem is widespread in the supply chain industry. Traditional enterprise resource planning (ERP) systems and other older technologies were often opaque for various reasons, such as difficulties in designing effective relational schemas for databases or the existence of multiple systems that were not well-integrated. These systems might be relatively simple on their own, but when combined, they can create a significant amount of confusion and complexity. Ad hoc integrations between systems that are not well-documented further contribute to the opacity.

The introduction of advanced numerical recipes or AI-driven techniques into these systems can cause opacity to skyrocket. This makes it even more difficult for those involved in the supply chain to validate whether the numbers being produced are accurate. Previously, it might have been possible to manually check stock values in different systems, for example. However, with the addition of complex numerical recipes, such validation becomes increasingly challenging.

Vermorel explains that traditional ERP systems are more about management than planning and are often designed with zero intelligence. The advanced numerical recipes that some vendors call AI are usually layered on top of the ERP, making the entire process more opaque.

In real-world examples, Vermorel has observed that even simple calculations, such as safety stock, can result in black-box effects. For instance, a company might input a service level of 99.9% but only achieve a 97% service level. This disconnect between input and output results in a lack of understanding of the system’s inner workings. Companies typically resort to using Excel sheets to overcome this issue, extracting foundational data from the ERP system and crafting their own numerical recipes to retain control over calculations.
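To make the disconnect concrete, here is a minimal sketch of the textbook safety stock calculation Vermorel alludes to, assuming normally distributed demand and a fixed lead time. The figures and variable names are illustrative only, not drawn from any Lokad implementation.

```python
# Minimal sketch of a textbook safety stock calculation (illustrative numbers only).
# The gap between a 99.9% target and a realized ~97% service level typically comes
# from the assumptions below (normal demand, fixed lead time) not holding in practice.
from scipy.stats import norm

daily_demand_mean = 100.0   # forecast, units per day
daily_demand_std = 30.0     # forecast error, assumed normally distributed
lead_time_days = 7.0        # assumed fixed; in practice it varies

target_service_level = 0.999
z = norm.ppf(target_service_level)  # about 3.09 for a 99.9% target

safety_stock = z * daily_demand_std * lead_time_days ** 0.5
reorder_point = daily_demand_mean * lead_time_days + safety_stock
print(f"safety stock ~= {safety_stock:.0f} units, reorder point ~= {reorder_point:.0f} units")
```

If lead times actually vary or demand is fatter-tailed than the normal assumption allows, the realized service level can land well below the 99.9% entered in the system, which is exactly the kind of disconnect described above.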

The white-boxing approach, as Vermorel describes, acknowledges that even simple models can become opaque when applied to complex supply chain scenarios. The goal of white-boxing is to recreate transparency and understanding within these models. By developing a process geared toward recreating transparency, companies can generate trust and eventually move away from reliance on Excel sheets.

One challenge of the white-boxing approach is avoiding an overload of metrics and indicators. Companies often demand more KPIs and indicators when faced with a black-box model, but this can lead to even greater complexity and opacity. Vermorel suggests focusing on a few carefully crafted numbers to explain the decisions made by the model.

White-boxing aims to provide explanations for the end results of decisions with physical impacts on the supply chain, such as production, purchasing, or relocating inventory. To achieve this, the process considers a handful of figures that explain these decisions, measured in dollars or euros, as these units make the most sense company-wide. By focusing on the endgame and explaining the decisions in terms of company-wide metrics, the white-boxing approach can help create understanding and transparency in supply chain optimization.
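As a hypothetical illustration of what “a handful of figures in dollars or euros per decision” might look like (the driver names and amounts below are ours, not Lokad's), consider a purchase suggestion scored by a few economic drivers:

```python
# Hypothetical sketch: each candidate decision is explained by a handful of
# economic drivers, all expressed in euros so they remain comparable company-wide.
from dataclasses import dataclass

@dataclass
class PurchaseDecision:
    sku: str
    quantity: int
    stockout_risk_eur: float      # expected cost of running out (e.g. a stopped line)
    carrying_cost_eur: float      # expected cost of holding the extra units
    purchase_cost_eur: float      # cash spent on the units themselves
    obsolescence_risk_eur: float  # expected write-off if demand never materializes

    def net_score_eur(self) -> float:
        # Reward (stockout avoided) minus all the costs of acting.
        return self.stockout_risk_eur - (
            self.carrying_cost_eur + self.purchase_cost_eur + self.obsolescence_risk_eur
        )

decision = PurchaseDecision("SKU-123", 100, 50_000.0, 800.0, 12_000.0, 1_500.0)
print(f"net economic score: {decision.net_score_eur():,.0f} EUR")
```

Because every driver is denominated in the same unit, the same scoring logic can rank purchase, production, and inventory relocation decisions against one another.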

Vermorel explains that the cost of a stockout can be much higher than the value of the items being purchased if the stockout leads to a production line stoppage. To deal with this, Lokad applies economic drivers to make decisions. These drivers are useful because they allow for comparison across the company, ensuring decisions are made using comparable metrics.

One of the challenges with traditional supply chain optimization techniques is the black-box nature of many analytical systems. In contrast, Lokad’s approach supports white-boxing, allowing users to understand the inner workings of the system and validate its outputs. Vermorel points out that Excel is an effective tool for white-boxing on a small scale, but it struggles when dealing with large datasets or complex calculations.

To address this limitation, Lokad has developed a programming language called Envision, which allows users to stay close to the data, perform validations, and generate dashboards. This helps maintain transparency in the supply chain optimization process.
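Envision is Lokad's own language and its syntax is not reproduced here; the sketch below uses Python and pandas only to illustrate the kind of small, Excel-like validation checks such dashboard tiles would carry. The data and thresholds are invented.

```python
# Illustration only (pandas, not Envision): small sanity checks pinned next to
# the main dashboard figures, in the spirit of the checks one would do in Excel.
import pandas as pd

lines = pd.DataFrame({
    "sku": ["A", "B", "C"],
    "stock_on_hand": [120, 0, 45],
    "unit_cost_eur": [3.5, 12.0, 7.25],
    "suggested_order_qty": [40, 200, 0],
})

erp_reported_stock_value_eur = 746.25  # figure extracted from the ERP for comparison

checks = {
    "no negative stock": bool((lines["stock_on_hand"] >= 0).all()),
    "no negative order quantities": bool((lines["suggested_order_qty"] >= 0).all()),
    "stock value matches ERP extract (within 1%)": abs(
        (lines["stock_on_hand"] * lines["unit_cost_eur"]).sum()
        - erp_reported_stock_value_eur
    ) <= 0.01 * erp_reported_stock_value_eur,
}

for name, ok in checks.items():
    print(f"[{'OK' if ok else 'FAIL'}] {name}")
```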

For companies using black-box software solutions, Vermorel recommends starting with thorough documentation of the system inputs. Often, companies have poor documentation of their data inputs, leading to a garbage in, garbage out scenario. The next step is to ensure that the analytical layer of the system allows for agility, similar to Excel, so users can add columns and debug their logic easily.
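Below is a minimal sketch of that Excel-like agility, again in Python and pandas rather than any specific vendor tool: a couple of lines add the intermediate columns needed to trace how a suggested order quantity was derived. Column names and values are hypothetical.

```python
# Sketch: keep intermediate calculation steps visible as extra columns,
# much like adding helper columns in an Excel sheet to debug the logic.
import pandas as pd

df = pd.DataFrame({
    "sku": ["A", "B"],
    "forecast_units": [100, 40],
    "stock_on_hand": [30, 80],
    "unit_price_eur": [9.9, 25.0],
})

# Two lines of code, two debug columns that can be inspected line by line.
df["net_requirement"] = (df["forecast_units"] - df["stock_on_hand"]).clip(lower=0)
df["order_value_eur"] = df["net_requirement"] * df["unit_price_eur"]
print(df)
```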

Once these prerequisites are met, companies should focus on crafting good economic drivers that are as independent from one another as possible. This avoids the risk of double-counting or overlooking important factors in the decision-making process.
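One crude way to spot overlapping drivers, offered here as our own illustration rather than Lokad's method, is to check whether two drivers move together so closely that they are likely measuring the same underlying effect:

```python
# Crude illustration: flag pairs of economic drivers that are so strongly
# correlated across decisions that they may be double-counting the same effect.
import pandas as pd

drivers = pd.DataFrame({
    "stockout_risk_eur": [50_000, 1_200, 300, 8_000],
    "lost_margin_eur":   [49_500, 1_150, 310, 7_900],  # suspiciously similar
    "carrying_cost_eur": [800, 650, 900, 700],
})

corr = drivers.corr()
for a in corr.columns:
    for b in corr.columns:
        if a < b and corr.loc[a, b] > 0.95:
            print(f"warning: {a} and {b} are highly correlated "
                  f"({corr.loc[a, b]:.2f}); possible double counting")
```

A high correlation is not proof of double counting, but it is a cheap prompt to re-examine how the two drivers were defined.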

White-boxing is crucial in supply chain optimization because it ensures the validity of the system’s results. If people do not trust the system, they will fall back on their Excel sheets, which Vermorel describes as a necessary defense against “system insanity.” By embracing white-boxing, companies can build trust in their supply chain optimization processes and improve their overall efficiency.

Full Transcript

Kieran Chandler: Today on Lokad TV, we’re going to be discussing the solution, something that is known as white boxing. So, Joannes, before we get on to white boxing, perhaps we should start off with the black box problem. What is it about black boxes and what is the problem here?

Joannes Vermorel: Supply chains are complex systems with a lot of people involved, many products, and potentially numerous locations. From the start, it’s a given that it’s a very complex problem and thus a problem that already has its own opacity just because of its complexity. It’s very hard to grasp the whole thing. Then when you add layers of software in the middle, it just makes the problem worse. I’m only talking about relatively basic software that is just shuffling the plumbing parts of your supply chain IT, nothing smart. But even that creates another layer of opacity. Now what is happening is that some vendors are advertising AI, but I prefer to think of smart, advanced numerical recipes for supply chain. As soon as you add non-trivial numerical recipes in the middle of your software that is actually driving your supply chain, it creates a whole new level of opacity.

Kieran Chandler: You mentioned a key word there, which is opacity. Could you expand on what you mean by opacity?

Joannes Vermorel: By opacity, I mean that for people in the organization who are working in the supply chain or organizing the supply chain, it’s very hard to figure out exactly what a number means, where it came from, and why it’s set to this value and not another value. The opacity is a reflection of how hard it is to get to the bottom of things when you want to investigate your supply chain.

Kieran Chandler: So we’re basically talking about a black box problem where results are coming out of a system and we’re not too sure where those results come from. How widespread is this problem? Is this something we’re seeing a lot in the supply chain industry?

Joannes Vermorel: Yes, I would say the older ERP and IT systems were very opaque for various reasons. At the time, it was much harder to design good relational schemas for databases, so the internal setup can be a bit messy. You can have many systems that are not well integrated in a complex supply chain, so when you put all of them together, there is a significant mess and a lot of ad hoc integrations between the systems that are not so well documented. This creates emerging opacity out of systems that are individually relatively simple. And then when you put numerical recipes in the mix, the opacity skyrockets because suddenly, it becomes very hard to validate whether the numbers are correct.

Joannes Vermorel: It’s no longer just about checking the central system for the stock value that I should have in a remote location and then checking in the system of the remote location itself that the stock value matches. Those sorts of checks could sometimes be tricky to do, but as long as there was no numerical recipe involved, you could do them. As soon as you have a linear regression model in the middle, it becomes like hell to even replicate anything or to understand what is going on.

Kieran Chandler: So I guess that’s what has changed between these older ERP systems and the more modern ones: we’re now using more complex numerical recipes. Is that the big change? Is that why this black box effect is occurring?

Joannes Vermorel: Typically, those advanced numerical recipes don’t live in the ERP. I mean, the ERP, despite its name, Enterprise Resource Planning, has very little to do with planning – almost nothing at all. It’s all about management. So you have Enterprise Resource Management, which is typically implemented with zero intelligence by design. You just want to track your assets, and you have layers of analytics on top. But then, you’re correct, within those layers of analytics, you have advanced recipes that some vendors might call AI, and it becomes way more opaque.

Kieran Chandler: How about a real-world example of how these black box problems are really affecting companies?

Joannes Vermorel: In the real world, what is surprising is that you don’t need AI to get a black box effect. Much simpler things are already causing widespread black box effects. I’ve seen many companies end up with black box effects from something as simple as a safety stock calculation, which is just how much stock you need to keep if you assume that your demand is normally distributed around your forecast, and the same for the lead time. They say, “Well, I am entering a service level of 99.9 percent, but then the measurement shows that in reality I’m only getting a 97 percent service level.” So you end up with a bizarre disconnect between the setting that you enter in the system, 99.9, and what you get in reality, which is 97. That’s the sort of situation where you have a black box and you don’t really understand what is going on. Clearly, the output of the system does not match what you expected from it, so you’re facing this black box effect head-on.

Kieran Chandler: If this is so widespread, how are companies overcoming these problems?

Joannes Vermorel: Typically, they overcome it with truckloads of Excel sheets. What happens when you have numbers that you cannot trust from the system? People just extract data from the ERP, foundational data that they can trust, like stock levels, historical sales, historical purchase orders – these sorts of things. They can double-check that the Excel sheet is aligned with the system, so no black box effect. They even double-check that what’s in the Excel sheet is actually aligned with what they have on the shelves in the warehouse, just to be sure. And then they start crafting their own numerical recipes in the Excel sheet itself, where the supply chain practitioner has control over the calculation at every single step, and it’s very visual. That’s how they try to stay away from a black box effect. Although, when an Excel sheet is passed from one supply chain practitioner to another over the years, you can end up with some degree of opacity and black box effect in those Excel sheets themselves. Excel is not a silver bullet; it just helps.

Kieran Chandler: Let’s talk about the solution, then: the white boxing approach.

Joannes Vermorel: White boxing starts from our observation that, in practice, even some trivial numerical recipes, just a simple safety stock formula or anything slightly more complicated than a moving average, are already a black box. It doesn’t take AI to get a black box. Most people cannot compute a linear model with three variables in their head, so even a model that simplistic very quickly becomes relatively opaque to the people who are actually using its results.

As soon as you move toward anything that is smarter, especially anything that captures the nonlinearities you have in supply chains, it gets opaque pretty much by design. What you need is a process, which we call white boxing, that is completely geared toward the idea that you will recreate transparency and insight, knowing that by default what you get is pretty much the opposite. There is no alternative. If even a model as oversimplistic as safety stock is already opaque, you cannot have any hope that a better, more realistic model, such as one based on probabilistic forecasts and a probabilistic assessment of the risk on your inventory, will be any less opaque. Quite the opposite: it will be more opaque, though also a lot more accurate. That is why we need this white boxing process, to create understanding, generate trust, and ultimately have people giving up, for good reasons, on their Excel sheets.

Kieran Chandler: So, how does it actually work in practice? How do you go about checking the results you’re given and having that real white box approach?

Joannes Vermorel: One tricky aspect is that it’s very easy to have people drowning in a sea of metrics. When people are facing a black box, the main reaction is to say, “Give me more indicators, create more KPIs, I want to see more and more.” But when you do that, you end up with gigantic tables with dozens of columns that are still completely opaque and incredibly complex. The starting point is to say that when we want to explain decisions, we need a few numbers that are exceedingly well crafted.

First, one idea is that you don’t want to try to explain everything, especially numerical artifacts. You don’t really care about how some intermediate steps of the computation are happening. What you care about is the end result, the decision that has a physical impact on your supply chain. You decide to produce, you decide to purchase more, or you decide to relocate inventory from one place to another. That’s the endgame, and that’s the decision you’re concerned about.

White boxing is about having, for every single decision, maybe half a dozen figures that explain that decision, measured in dollars or euros. The second idea is to use dollars and euros because those are the units that make the most sense company-wide. If you know why you are purchasing 100 more units of a product and you have a variable that says, “Well, the risk of stockout is 50 thousand euros because if we run out of this thing, the production line will stop,” then you see that the cost of the stockout can be much higher than the value of what you’re purchasing when a production line is at risk. These are the insights that people will get.

Kieran Chandler: So, there are these economic drivers that you apply. What is it about those?

Joannes Vermorel: Economic drivers are extremely useful because they are comparable company-wide. You have one decision with half a dozen economic drivers, and another type of decision with other economic drivers, and you want all your measurements to be compatible so that you can compare apples to apples and oranges to oranges. If you are mixing apples and oranges, then you’re kind of lost, and that’s typically what happens with all those percentages of error.

Kieran Chandler: So, other than economic drivers, what is it about Lokad’s approach that makes white-boxing so possible? How does it differ from other techniques?

Joannes Vermorel: You need tools to do that, and most analytical systems do not do such a good job at supporting white-boxing. Excel is actually quite good because you stay very close to the data, but it falls apart when the complexity and volume of the data increase. Excel is excellent as long as you’re dealing with less than ten columns and a thousand lines. If you start crunching millions of lines and dozens of columns, it rapidly becomes like a gigantic spaghetti code base within your Excel spreadsheet. Excel is very good at limited scale, but it falls apart when facing complexity.

Many analytical systems do not manage to preserve any of Excel’s properties, and the data ends up very far from the user. At Lokad, we have engineered Envision, a programming language with “vision” in its name, stemming from the idea that we needed to stay very close to the data so that we can do all those Excel-like validations all the time. Envision makes it very easy, when you craft formulas, to plot all your numbers in a dashboard so that you can check them just like you were doing in Excel, just for validation’s sake.

The idea is that we can generate highly composite dashboards where your main drivers are going to be displayed prominently, but you can have many small tiles that contain all your checks, just like you would do in Excel, so that you can check that your calculations are sound and that your intermediate steps are not completely out of whack.

Kieran Chandler: If I’m a company and I’m using a piece of software that’s exhibiting a lot of those black box characteristics, what should I be doing? What should be the first step I should take to improve my understanding about what’s going on?

Joannes Vermorel: First, you need to have thorough documentation of what goes into the system. Usually, the documentation is nonexistent. Just to clarify, we’re talking about having a table that requires about one page of documentation, not IT documentation, but supply chain, business-driven documentation.

Working with clients, at best, if every field has one line of documentation, we are happy. Usually there are something like 20 tables, each with 20 fields, and we barely have one line of documentation per table, when we should have one page of documentation per field. So the starting point for most companies is probably to fully document what you have as input to the system; otherwise, you end up with garbage in, garbage out.

The second thing is to check that your data crunching layers, your analytical layers, give you basically the sort of agility that you have with Excel, so that with a few statements or a few lines of code you can add all those easy reporting columns that you need to debug your logic. Unfortunately, you can’t avoid the coding aspect because, just like Excel, it’s about formulas and whatnot, so it’s coding. But within a few lines of code, you can have all those columns. If you don’t have a system that supports such an agile process, where you can just create columns like in Excel to double-check intermediate calculations, then it’s basically game over; you will never get to the bottom of your black box.

So, probably at this step, you need to change the analytical layer if you don’t have a layer that gives you some kind of agility. Then, the next step is to start crafting very good economic drivers. And by very good, I mean that every single driver needs to be as independent from the other drivers as possible. You want to have things that are very orthogonal. The danger is that if you do not manage to craft a series of indicators that are reflecting things that are literally orthogonal, then you might be looking at the same thing multiple times and be misled.

That’s a tricky part, but the idea is that when you look at an economic indicator that tells you €100 of cost or reward, you want to make sure that it’s as independent as possible from another driver that also claims to bring €100 of reward or cost.

Kieran Chandler: So the key points are really understanding your inputs and outputs and also defining those economic drivers. As a final word today, why is white-boxing so important?

Joannes Vermorel: White-boxing is so important because, otherwise, all your efforts at optimizing your supply chain will go to waste. People will, for good reason, fall back to their Excel sheets, because if you don’t have a white-boxing process in place that gives everybody the reassurance that the results coming out of those systems are sane, the odds are overwhelming that your results are insane. People then do the sane thing, which is to not trust those results and to fall back on their Excel sheets. The Excel sheets are, unfortunately, the necessary defense against system insanity. White-boxing is required; otherwise, don’t expect your teams of supply chain practitioners to give up on their Excel sheets anytime soon. They will not, and by not doing so, they are protecting your company, because trusting an insane system is much worse than wasting time on Excel sheets.

Kieran Chandler: And who knew? A lot of analysts probably did think Excel gives you insanity, but there you go. So that’s everything for this week. Thanks very much for your time, and we’ll see you again next week. Thanks for watching.