It is time for Lean Six Sigma to evolve by leveraging Big Data

Note: As a standard practice, I include a plagiarism detection link in all the articles on my blog site. You can use the online tool to evaluate any article published on the web for plagiarized content: paste the URL of the article and it will go through every sentence and flag any plagiarism. As a follower of "writer's integrity," I believe I should include this tool in all my articles:


What was once revolutionary eventually becomes defunct

The first Industrial Revolution was driven by the use of steam engines to power manufacturing machinery. Henry Ford's mass production of a single model of car at the lowest possible cost was a key element of the second Industrial Revolution. Toyota adapted Ford's line production model and evolved a methodology around it, popularly known as the Toyota Production System (TPS).

The Lean Six Sigma approach, which derives its inspiration primarily from TPS, has been used widely in manufacturing. It was so popular at one point that it spawned an entire industry of "Lean Six Sigma professionals." But in my view, Lean Six Sigma had some drawbacks that were overlooked at the time due to technology constraints, and they need to be addressed now. Big Data technologies can help.

Big Data or no Big Data… Lean Six Sigma needs to evolve

For its time, the Toyota Production System was indeed revolutionary. Lean Six Sigma, based on the Toyota Production System, became the most widely used process improvement method in industry and the military, helping create the third Industrial Revolution. Lean Six Sigma incorporated Little's Law for cycle time reduction, allowing TPS principles to be applied to any industry and process.
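Little's Law says that average cycle time equals work in process divided by throughput, which is why cutting WIP is such a powerful lever. A minimal sketch with made-up numbers (not figures from this article):

```python
# Little's Law: average cycle time = work in process / throughput.
# Illustrative numbers only: a line holding 120 jobs of WIP and
# completing 30 jobs per day.
wip = 120          # jobs currently in the process
throughput = 30.0  # jobs completed per day

cycle_time = wip / throughput  # days a typical job spends in the process
print(f"Average cycle time: {cycle_time:.1f} days")

# Halving WIP at the same throughput halves cycle time:
print(f"With WIP of {wip // 2}: {(wip // 2) / throughput:.1f} days")
```

The arithmetic is trivial, but it is the foundation that lets Lean Six Sigma reason about cycle time in any process, not just an automotive line.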

This was state of the art… but in 2001.

Now let us explore some of the drawbacks of TPS that were ignored back then.

Lean Six Sigma of 2001 was based on the repetitive manufacturing model of Toyota Motors. TPS allowed Toyota to set up a 2,000-ton stamping press in 4 minutes vs. GM's 4 hours (in 1986). This allowed Toyota to reduce the batch size by a factor of 24 while maintaining the same cost efficiency, and to avoid dings and rust on parts, as they were immediately assembled onto a chassis.

However, every press and machine tool at Toyota produced only a dozen or so different part numbers during its lifetime. What does that mean?

It means it was a highly repetitive manufacturing environment.

The gaping hole in the Lean Six Sigma approach

The typical manufacturer in America today produces both new production and spare parts, which essentially means they engage in both repetitive and non-repetitive manufacturing. Product designs have become more complex, and customization has increased, which means a reduction in the repetitive aspects of manufacturing.

Also, to gain the advantages of TPS, Lean Six Sigma used Pareto Analysis to find the repetitive 20% of part numbers that delivered 80% of revenue. The belief was that if we make 80% of revenue production highly efficient, we are making giant strides.

Lean Six Sigma thus neglected all the waste in the 20% of revenue.
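The Pareto cut described above is mechanically simple: rank part numbers by revenue and stop once the cumulative share reaches 80%. A sketch with hypothetical part numbers and revenue figures (in practice this data would come from an ERP system):

```python
# Hypothetical part-number revenue data; names and figures are made up
# purely to illustrate the Pareto (80/20) cut.
parts = {
    "P-100": 500_000, "P-101": 300_000, "P-102": 120_000,
    "P-103": 40_000,  "P-104": 20_000,  "P-105": 10_000,
    "P-106": 6_000,   "P-107": 2_500,   "P-108": 1_000,  "P-109": 500,
}

total = sum(parts.values())
running = 0.0
vital_few = []
# Walk parts from highest revenue down, accumulating revenue share.
for part, revenue in sorted(parts.items(), key=lambda kv: kv[1], reverse=True):
    running += revenue
    vital_few.append(part)
    if running / total >= 0.80:
        break

print(f"{len(vital_few)} of {len(parts)} parts "
      f"({len(vital_few) / len(parts):.0%}) deliver "
      f"{running / total:.0%} of revenue: {vital_few}")
```

Everything outside `vital_few` is exactly the "trivial many" tail that classic Lean Six Sigma programs left unanalyzed.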

The data for these 80% of parts went into a data black hole… the kind we now like to call "dark data." No one bothered to collect that data, let alone analyze it. But over the last two decades, the customer expectation landscape has changed.

Customer satisfaction now reigns supreme

Customers now demand quality and service, irrespective of whether a part accounts for a minuscule portion of your revenue. If you have 5,000 low-volume parts and 20% of them are late or have quality issues, it will soon start impacting your high-volume, high-revenue parts. Customers today have many channels to vent, and social media gives them unlimited power. You are being judged on every part you produce, even if it is a fastener.

How can Big Data technologies help?

As discussed earlier, rather than evaluating costs and waste holistically, Lean Six Sigma focused on value stream maps at the department level and had Black Belts work on removing local sources of waste.

The good news is that with advances in computing hardware (think processing power) and software, and the availability of cloud computing, it is now entirely feasible to analyze massive datasets covering all of your manufacturing parts and processes. This capability allows us to look beyond the "vital few" and finally address the so-called "irrelevant many."
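What "analyzing every part" looks like in miniature: aggregate delivery and quality records per part number and flag any part with a problem, regardless of its volume. The records, part numbers, and thresholds below are hypothetical; a real pipeline would pull this from MES and quality systems at scale.

```python
from collections import defaultdict

# Hypothetical production records: (part, delivered_on_time, had_defect).
# In a real system these would stream in for *every* part produced,
# not just the high-revenue ones.
records = [
    ("P-100", True,  False), ("P-100", True,  False),
    ("P-907", False, True),  ("P-907", False, False),
    ("P-554", True,  True),
]

stats = defaultdict(lambda: {"runs": 0, "late": 0, "defects": 0})
for part, on_time, defect in records:
    s = stats[part]
    s["runs"] += 1
    s["late"] += not on_time      # count late deliveries
    s["defects"] += defect        # count defective runs

# Flag every part with any late delivery or defect, regardless of volume.
flagged = sorted(p for p, s in stats.items() if s["late"] or s["defects"])
print(flagged)
```

The logic is deliberately simple; the point is that cheap compute makes it practical to run it across the entire part catalog instead of a Pareto-selected slice.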

And as I have indicated in many of my posts on this blog site, those same Big Data tools can then be enhanced with AI algorithms applied to each and every part being manufactured, generating plans and recommendations in near real time. Technology has opened the floodgates, and we should allow the current version of Lean Six Sigma to be washed away so that the next generation of LSS can sprout.

Views expressed are my own.
