Big Data and 30 Years of Total Quality Management


The notion of ‘Quality’ in business performance has exploded since I wrote the first edition of ‘Total Quality Management’ in 1989. That book was the first in the world to carry that title, and it presented one of the early ‘models’ of TQM: a simple triangle of ‘systems,’ ‘tools’ and ‘teams,’ surrounding a core of customer-supplier chains, both external and internal. It was created predominantly from my research on the quality and competitiveness of industries in the ‘West,’ following the huge impact of generally higher-quality goods from Japan being imported into America and Europe. Whole industries in the UK and USA were being devastated at that time.

In spite of the jacket looking more like a Dickens novel than a business book, a review of several books on similar topics at the time carried the heading: “And the winner is? – John Oakland’s new book on TQM!”

The Explosion of Big Data

I’m often asked, “What changes have you seen since that first publication?” Given that over 30 years have passed, it’s not surprising that nearly everything has changed – business and the world today are almost unrecognisable from that time. The main driver, of course, has been the huge development of technology and software that has led to the ‘big data’ world we live in today. The pace of evolution accelerated exponentially, driving changes we could not even have imagined. Smartphones and ‘tablets’ didn’t exist in 1989; now they’ve completely taken over our lives!

Just take data in a manufacturing environment. In the past, the problem was often that we couldn’t see the wood because we didn’t have enough trees to look at; now it is more likely that we cannot see the wood for the trees. We are generating terabytes of data, and it is now possible to obtain product or process measurements every few seconds, but that is not always a positive picture. So, in the 2020s we face questions around how to manage real-time applications effectively to collect, clean and provide usable, quality data, and then analyse it correctly to make better decisions and improve product performance.

Not surprisingly then, over the 30 years since publication of the first edition, interest in TQM and business performance improvement has exploded to keep pace with the technology. So much has been learned during those years of implementation that it has been necessary to rewrite and revise the book again and again. The content of this latest edition needed updating to reflect further developments, current understanding, and experience gained in the era of ‘big data’ and the application of powerful tools collectively known as data analytics. In our own business, for example, we have had to develop new approaches to managing the operation, and the quality of outcomes, of large infrastructure programmes, in order to handle the huge quantities of available data and make better decisions that reduce overruns on both time and budget. Importantly, given post-pandemic plans to launch many large infrastructure projects, our ‘Intelligent Forecasting’ model is achieving great results in this sphere: https://vimeo.com/455730290

Neglect of product and service quality can have disastrous consequences, as we have seen repeatedly in recent years around the globe. Consequential reputational damage is deeper and quicker now than ever before, because information, opinion and, ultimately, consumer choice are affected at scale by modern communication technologies. Maintaining or increasing the satisfaction of customers and other stakeholders through effective goal deployment, cost reduction, process and project improvement, people involvement, and supply chain development has proved essential for organisations to stay in existence in the 21st century. Alongside technology developments and the massive changes they have brought to our lives, we cannot avoid seeing how quality – meeting, indeed anticipating, our requirements – has remained the most important competitive weapon, and many organisations have realised that total quality and its ‘add-ons’ are the way of managing for the future.

The 5th edition of our book is structured in the main around the four parts of what has become known as the ‘Oakland model for TQM & OpEx’ – improving Performance through better Planning and management of People and the Processes in which they work. The core of the model will always be performance in the eyes of the customer, but this must be extended to include performance measures for all stakeholders. This core still needs to be surrounded by Commitment to quality and meeting customer requirements, Communication of the quality message, and recognition of the need in many cases to change the Culture of the organisation to create total quality. These three Cs are the ‘soft foundations’ which must encase the ‘hard management necessities’ of the four Ps, and they are STILL of paramount importance, even in the highly technology- and data-driven world of today. Hopefully, the ten case studies in the book will help to show how this can be done in practice.

https://www.routledge.com/Total-Quality-Management-and-Operational-Excellence-Text-with-Cases/Oakland-Oakland-Turner/p/book/9781138673410

Author: John Oakland, Chairman of the Oakland Group
