Did Lean slow down the evolution of Manufacturing Analytics?

Lean had a drawback…

If you are well versed in Lean, and hence in the Toyota Production System, you are aware that unique Toyota tools such as four-step rapid setup are engineering intensive and yield a high return on investment only in a repetitive manufacturing environment. Every press and machine tool at Toyota produces only a dozen or so different part numbers during its lifetime – i.e., it is a highly repetitive manufacturing environment.

…and so did Lean Six Sigma

Since American manufacturers generally produce both new production and spare parts – i.e., they do both repetitive AND non-repetitive manufacturing – they modified Lean methods, and one such modification was Lean Six Sigma.

Lean Six Sigma used Pareto analysis to find the repetitive 20% of part numbers that delivered 80% of revenue. The idea was that if you make the production behind that 80% of revenue highly efficient, the remaining 20% will not impact your numbers significantly. For years, companies like Caterpillar, ITT, H.B. Fuller and others embraced this methodology.
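For readers who want to see what that 80/20 split looks like in practice, here is a minimal sketch of a Pareto analysis on part-number revenue. The file name and column names are illustrative assumptions, not taken from any particular system.

```python
# Minimal sketch of a Pareto (80/20) analysis on part-number revenue.
# "part_revenue.csv" and its columns are hypothetical.
import pandas as pd

df = pd.read_csv("part_revenue.csv")  # columns: part_number, revenue

# Total revenue per part number, largest first
revenue = df.groupby("part_number")["revenue"].sum().sort_values(ascending=False)

# Cumulative share of total revenue
cum_share = revenue.cumsum() / revenue.sum()

# Part numbers that together deliver roughly 80% of revenue
top_parts = cum_share[cum_share <= 0.80].index

print(f"{len(top_parts)} of {len(revenue)} part numbers "
      f"({len(top_parts) / len(revenue):.0%}) deliver ~80% of revenue")
```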

Lean Six Sigma thus neglected all the waste in the remaining 20% of revenue and never used the Big Data on the other 80% of part numbers. This focus on 20% of the parts eventually, and inadvertently, turned most of the Big Data into Dark Data – data that was never looked at and could contain significant value.

Why is that “Dark Data” important?

That 80% of Big Data contains valuable insights into many of the problems that plague Manufacturing processes today. In a study by Blackwell and Rajan, it was found that

73% of setup waste was concentrated in the 20% of revenue that comes from low-volume parts!

Rather than evaluating cost and waste globally, Lean Six Sigma focused on value stream maps at the department level and had Black Belts work on removing local sources of waste. This was holding back the evolution of Manufacturing Analytics.

AI is the Solution

Fortunately, the advent of multiprocessor PCs and the availability of cloud computing make the size of the data and the time taken to process and analyze it irrelevant. Another important development is the open-source availability of advanced Deep Learning tools to perform analytics that can have a monumental impact on many Manufacturing metrics. I write frequently about applications of Deep Learning in Manufacturing on this blog, and in one of my posts I have provided my perspective on how Neural Networks can be used to reduce setup times significantly. Neural Networks can process large amounts of data, both structured and unstructured, and hence open many doors to another level of Manufacturing Analytics.
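As an illustration of the kind of analytics this opens up, here is a minimal sketch of a small neural network that learns to predict setup time from changeover features, written in PyTorch. The features, synthetic data, and architecture are illustrative assumptions only, not the model from my earlier post.

```python
# Minimal sketch: a small neural network that predicts setup time (minutes)
# from changeover features. All data here is synthetic and illustrative.
import torch
import torch.nn as nn

# Hypothetical features per changeover: previous/next part family, number of
# tooling changes, operator experience, etc., already encoded as numbers.
X = torch.randn(1000, 6)            # 1000 historical changeovers, 6 features
y = torch.rand(1000, 1) * 120.0     # synthetic setup times in minutes

model = nn.Sequential(
    nn.Linear(6, 32),
    nn.ReLU(),
    nn.Linear(32, 16),
    nn.ReLU(),
    nn.Linear(16, 1),               # predicted setup time
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.2f}")
```

On real shop-floor data, a model like this can flag which upcoming changeovers are likely to run long, so setup-reduction effort is spent where it matters most.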

Views strictly my own.
