What does leveraging “Data Science (DS)” in Supply Chain actually mean? What should be considered “Data Science” as far as the Supply Chain function is concerned?
If you go by the Wikipedia definition of DS…
“an inter-disciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge and insights from structured and unstructured data”…
Then Operations Research has been doing exactly that with structured data for decades. So is the only new addition unstructured data and advanced AI algorithms like Deep Learning? But advanced AI algorithms have also been in existence for decades. We knew where they could be used and how. So why is DS in SCM suddenly exciting?
The answer to what is different today also, in my perspective, suggests how we should approach leveraging analytics in SCM. Despite all the advances that have led to the DS boom, we are still pouring old wine into new bottles in Supply Chains.
This article highlights three key aspects of why DS is exciting, unbounded and unconstrained today, and the lessons embedded in those aspects for leveraging DS in Supply Chains:
Optimization problems that were identified decades ago but could not be solved in days can now be solved in hours or minutes. I remember reading a research paper by a German scholar during my undergrad days, written in the early 90s, that identified instances of Multi Echelon Inventory Optimization (MEIO) problems and the corresponding problem formulations, noting that the “current technology” did not support solving them at an industrial scale. Those constraints do not exist now. That is exactly why we hear terms like “Big Data” and “unstructured data” more frequently: computing power, combined with modern technology, allows you to ingest data of any size and format.
Stop constraining your thinking to leveraging optimization only for “traditional” optimization problems in SCM. The ability to optimize a significantly higher number of variables means that you can expand classic optimization problems to areas of operations that you did not think about before.
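To illustrate how accessible this has become, here is a minimal sketch of a classic transportation problem, a staple of Operations Research, solved in milliseconds with off-the-shelf open-source tooling. All costs, supplies, and demands are hypothetical values invented for this example.

```python
# Toy transportation LP: two warehouses shipping to three stores at
# minimum cost, solved with SciPy's linear programming interface.
# All numbers below are hypothetical illustration values.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4, 6, 9],    # shipping cost: warehouse i -> store j
                 [5, 3, 7]])
supply = [80, 70]              # warehouse capacities
demand = [40, 50, 60]          # store requirements

c = cost.flatten()             # decision vars x[i][j], row-major

# Supply constraints: sum_j x[i][j] <= supply[i]
A_ub = np.zeros((2, 6))
A_ub[0, 0:3] = 1
A_ub[1, 3:6] = 1

# Demand constraints: sum_i x[i][j] == demand[j]
A_eq = np.zeros((3, 6))
for j in range(3):
    A_eq[j, [j, j + 3]] = 1

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand)
print(round(res.fun))          # minimum total shipping cost -> 810
```

The same formulation scales from this toy to thousands of nodes; what changed is not the math but how cheaply the solve now runs.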
Easily Accessible, Inexpensive Technology
Open Source dev and interfacing: allows you to build your own custom stacks, environments and platforms, inexpensively and rapidly.
Who said you still need to build separate modeling environments for demand planning, Network Optimization, Inventory Optimization, Vehicle Route Planning, etc.? What if they all sat in the same environment, tapping into the same base data and leveraging each other’s outputs where feasible? Who says Optimization and AI algorithms can’t coexist? Why can’t your forecasting algorithm feed into a capacity optimization model?
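A minimal sketch of that last idea, a forecast feeding directly into a capacity optimization in one environment, might look as follows. The demand history, costs, and capacity limits are hypothetical illustration values.

```python
# Step 1: forecast next-period demand with a simple linear trend;
# Step 2: feed that forecast straight into a capacity-planning LP.
# All numbers below are hypothetical illustration values.
import numpy as np
from scipy.optimize import linprog

history = np.array([100, 108, 115, 121, 130, 138])   # units per period
t = np.arange(len(history))
slope, intercept = np.polyfit(t, history, 1)
forecast = slope * len(history) + intercept          # next period

# Meet the forecast at minimum cost across regular and overtime
# production, each with its own capacity limit.
c = [10, 15]                        # cost/unit: regular, overtime
A_eq = [[1, 1]]                     # regular + overtime == forecast
b_eq = [forecast]
bounds = [(0, 120), (0, 60)]        # capacity limits per mode

plan = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(round(plan.x[0]), round(plan.x[1]))  # regular, overtime units
```

Because both steps live in the same script against the same base data, swapping the trend line for a richer forecasting model changes one step without rebuilding the pipeline.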
Cloud: relatively cheap, fast and safe access to virtually unlimited tech infrastructure, with rapid scaling capabilities.
Don’t be constrained by the amount of data: be a data hoarder. Even a simple AI algorithm like regression has become more powerful now that computing power, combined with the availability of cloud platforms, allows companies to apply it to more extensive, unstructured, truly “Big Data” sets.
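As a small sketch of that point: an ordinary least-squares regression over a million synthetic rows fits in well under a second on a laptop. The feature count and coefficients are hypothetical values invented for this example.

```python
# Ordinary least squares on a large synthetic dataset, fit in one
# closed-form call. All sizes and coefficients below are hypothetical
# illustration values.
import numpy as np

rng = np.random.default_rng(0)
n, p = 1_000_000, 5                     # a million rows, five features
X = rng.normal(size=(n, p))
true_coef = np.array([2.0, -1.0, 0.5, 3.0, 0.0])
y = X @ true_coef + rng.normal(scale=0.1, size=n)

# Closed-form least-squares fit, with an appended intercept column.
coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(n)], y, rcond=None)
print(np.round(coef[:p], 2))            # recovered coefficients
```

With a million rows, the recovered coefficients land within a fraction of a percent of the true values; the bottleneck at this scale is data plumbing, not the algorithm.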
Views my own.