By Nicole Nelson
Scott Shaw recalls a time in the not-too-distant past when he attempted to help a food manufacturer calculate an optimal scheduling scenario to grind grain.
As the Clarkston Consulting senior consultant was well aware, running the big hammer mills was extremely energy intensive yet unavoidable: the ground grain, mixed with other ingredients, formed the necessary basis of the firm's final product.
“It took a ton of electricity and it cost a boatload to operate the grinders that they were using,” Shaw said, “and their electric rate at night was a lot cheaper than their electric rate during the day because the power company is trying to balance their demand and supply like everybody else.”
So, the solution was relatively straightforward, right? The obvious answer was to grind at night and process during the day.
But not so fast.
“It seemed like a really simple question until we started to dig into it, and we realized what we’re really balancing is our receiving capacity, with our grinding capacity, with our holding capacity, with what our projected schedule is going to be—and comparing that to our inventory position and the cost of either buying ahead, or potentially running low on stock, or running out of stock.
“There are a ton of variables there and, frankly, we gave up,” Shaw said, noting that scheduling optimization engines typically balance capacity and demand, but many of the variables that should be considered simply aren’t because the math is too hard.
This type of computation is just hard enough that it would take a traditional, binary computer program years, if not hundreds or thousands of years, to calculate every combination of the variables involved.
“In a lot of optimization engines that we’ve used historically, they use different techniques to try and solve problems without considering every possible combination because that calculation would run for years,” Shaw said, noting the programs don’t perform true optimization but rather probabilistic optimization. “If you really want to optimize, you’ve got to consider all the different potential combinations.”
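A toy calculation illustrates the combinatorial explosion Shaw describes. The numbers below (four scheduling options, hourly decisions over one week) are hypothetical, chosen only to show why exhaustive search is off the table for classical solvers:

```python
# Hypothetical scheduling problem: choose one of several options
# (grind, process, receive, idle) for each hour of a one-week schedule.
options_per_hour = 4
hours_per_week = 24 * 7  # 168 decision points

# Exhaustive ("true") optimization must evaluate every combination.
combinations = options_per_hour ** hours_per_week
print(f"{combinations:.3e} candidate schedules")

# Even at a billion evaluations per second, the runtime is astronomical.
seconds = combinations / 1e9
years = seconds / (3600 * 24 * 365)
print(f"~{years:.3e} years of compute")
```

With roughly 10^101 candidate schedules, even a generous billion-evaluations-per-second machine would need on the order of 10^84 years, which is why classical engines fall back on heuristics rather than checking every combination.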
“At some point, we said, ‘Well, we’re not going to do that. Let’s just grind when we need to grind.’”
Fast forward to the not-too-distant future, though, and Shaw believes a true optimization solution is quite possibly within reach with the advent of quantum computing (QC) solutions. Unlike classical computing, which uses binary code with transistors set to zero or one, quantum computing calculates with qubits, which can represent both zero and one simultaneously in a fluid, non-binary state.
Because quantum computing works not just with zeros and ones but with a continuum of states in between, it can consider all of those combinations across multiple variables much more quickly.
This quantum state is oftentimes visualized as a coin flipped in midair. While the coin drops, it is neither heads nor tails, but once it stops dropping, it will definitively be either heads or tails. Likewise, the computer’s data holds multiple possibilities until the program is forced to make a decision. This principle of superposition allows the computer to consider many possible outcomes at once instead of having to process each scenario individually.
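The coin-flip analogy can be sketched in a few lines of plain Python. This is a classical simulation, not a quantum program: a qubit is modeled as two amplitudes, and "measurement" collapses the state to a definite 0 or 1, like the coin landing:

```python
import math
import random

# A qubit modeled as two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. While unmeasured, both outcomes
# coexist -- the coin still in midair.
alpha = beta = 1 / math.sqrt(2)  # equal superposition of 0 and 1

def measure(a: float, b: float) -> int:
    """Collapse the state: 0 with probability |a|^2, otherwise 1."""
    return 0 if random.random() < abs(a) ** 2 else 1

# Each measurement of a fresh copy yields a definite answer;
# over many trials the outcomes split roughly 50/50.
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)
```

The simulation only tracks probabilities one flip at a time; the promise of actual quantum hardware is that the in-between states of many qubits can be manipulated together before any measurement is made.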