Companies that have committed to deploying Industry 4.0 technology at scale have achieved double the productivity improvements of those taking a less focussed approach. With a heavy dependence on analytics to drive value, this commitment requires the capability to mass produce analytical models. Like physical mass production, analytic mass production will only succeed if there are well-defined processes and suitable technology that can be optimised to scale to the desired throughput.
Adapting shop-floor production and maintenance processes to accept input from analytics, artificial intelligence and machine learning can be challenging. Gartner research suggests that through 2022, only 20% of analytic insights will deliver business outcomes. It need not be so.
As the early adopters have shown, there is significant value to be gained by introducing Industry 4.0 if approached systematically. I’ve collected three insights from recent research that provide guidance on how to increase the value analytics can provide to the shop floor.
Insight 1: Financial Commitment Has Far Superior Returns to Dabbling
In 2019 a Deloitte study found that the leaders – Trailblazers – in deploying smart factory technology achieved 20% improvement in production output, factory capacity utilisation and employee productivity. Trailblazers do not necessarily have larger factory budgets than the others. They simply focus their global factory budgets, with 65% being spent on Smart Factory Initiatives, vs. a 19% investment ratio for the next cohort.
Same budget, different investment focus, double the gains.
However, just throwing money at Smart Factory initiatives will not guarantee success. Nor is it a matter of installing one or two things, flicking a switch, and watching productivity improve. Value is realised through hundreds of separate analytical routines constantly ingesting vast quantities of machine data and providing insights to operators in a timely manner. To achieve this, you need the correct analytic environment.
Insight 2: Begin with a Highly Scalable Analytical Environment
A common mistake is to focus on the development of an analytic algorithm without considering how to integrate the algorithm into business processes. This leads directly to the large analytics drop-out Gartner has found. In June 2020, an article in the Harvard Business Review recognised this challenge. The authors highlight the need “to start with a well-designed production environment” to unlock the value of AI. They go on to describe this as an environment that is “flexible enough for quick and smooth system reconfiguration and data synchronization without compromising running efficiency.”
That is, the analytics environment to support the smart factory must be able to:
Continually ingest large quantities of data
Prepare this data for analytics
Run the analytic algorithms, and
Send results to the edge or a human machine interface
All this within a timeframe that enables operators to adapt their processes according to the insight delivered.
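The four capabilities above form a repeating loop: ingest, prepare, score, deliver. The sketch below illustrates that loop in miniature; every function name and rule in it (the normalisation, the 0.9 threshold, the alert count) is a hypothetical stand-in, not a real product API.

```python
# Minimal sketch of the smart-factory analytic loop. All names and
# rules here are illustrative assumptions, not a specific vendor API.

def ingest(raw_readings):
    """Ingest machine data (here: one batch of sensor readings),
    dropping records lost in transmission."""
    return [r for r in raw_readings if r is not None]

def prepare(readings):
    """Prepare data for analytics: normalise readings to a 0-1 range."""
    lo, hi = min(readings), max(readings)
    return [(r - lo) / (hi - lo) for r in readings]

def score(features, threshold=0.9):
    """Run the analytic algorithm: flag unusually high readings."""
    return [f > threshold for f in features]

def publish(flags):
    """Send results to the edge or an HMI (here: just count alerts)."""
    return sum(flags)

readings = [12.0, None, 14.5, 99.0, 13.2]
alerts = publish(score(prepare(ingest(readings))))
print(alerts)  # → 1 alert, for the 99.0 outlier
```

In production, each stage runs continuously against streaming machine data rather than a single batch, and hundreds of such routines run side by side, which is exactly why the environment, not the individual algorithm, is the scaling constraint.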
Selecting technology which performs adequately with smaller projects is no guarantee it will perform at the scale required for smart factory deployments.
Also, technology alone will not guarantee success at scale. To significantly increase manufacturing productivity through analytics, you must also increase data scientist productivity.
Insight 3: Focus on Minimising the Data Management Bottleneck
When I started my career in the mid ’90s, we accepted the 80/20 data rule: 80% of the time was spent gathering data, 20% on delivering value. More than two decades later, this ratio hasn’t changed much, if at all.
The productivity gains of sharing data management tasks become clearer once you understand the analytic process for smart factory projects. My colleague, Martin Willcox, explains the analytic process in more detail in this blog post. I’ll illustrate the benefits of data management collaboration with an example.
Regardless of the business goal for individual projects (efficiency, quality, predictive maintenance), data scientists will start by building a machine event table. This table logs when a machine was running, and what state it was in when running. By creating machine event tables as a common resource, productivity improvements follow quickly. Projects using the same machine simply reuse the data. Projects which require data from different machines can still benefit from reusing the transformations that created the first machine event tables.
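A machine event table can be derived from raw state-change logs by pairing each state change with the next one to form an interval. The sketch below assumes a hypothetical log format of (timestamp, machine_id, state); the column names are illustrative, not taken from any specific system.

```python
# Hedged sketch: building a machine event table from assumed raw
# state-change logs of the form (timestamp, machine_id, state).
from datetime import datetime

raw_log = [
    (datetime(2020, 6, 1, 8, 0),  "press_01", "RUNNING"),
    (datetime(2020, 6, 1, 9, 30), "press_01", "IDLE"),
    (datetime(2020, 6, 1, 10, 0), "press_01", "RUNNING"),
    (datetime(2020, 6, 1, 12, 0), "press_01", "STOPPED"),
]

def build_event_table(log):
    """Collapse consecutive state changes into
    (machine, state, start, end) intervals."""
    events = []
    for (start, machine, state), (end, _, _) in zip(log, log[1:]):
        events.append({"machine": machine, "state": state,
                       "start": start, "end": end})
    return events

event_table = build_event_table(raw_log)
# First interval: press_01 was RUNNING from 08:00 until 09:30.
```

Once such a table exists as a shared resource, an efficiency project, a quality project and a predictive-maintenance project can all query the same intervals instead of each re-deriving them from the raw logs.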
These productivity improvements accelerate as more projects contribute data and analytic routines for data preparation to a common “library.” As a bonus, curating a data library also increases data quality, by focussing data governance efforts. High data quality in turn increases analytic quality.
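One lightweight way to realise such a common library is a shared registry of named data-preparation routines that any project can contribute to and reuse. The registry pattern below is purely illustrative; the routine name and its deduplication logic are assumptions for the example, not part of any product.

```python
# Illustrative sketch of a shared library of data-preparation routines,
# assuming projects register transformations by name for reuse.

PREP_LIBRARY = {}

def register(name):
    """Decorator: add a transformation to the shared library."""
    def wrap(fn):
        PREP_LIBRARY[name] = fn
        return fn
    return wrap

@register("dedupe_events")
def dedupe_events(rows):
    """Remove duplicate event rows - a common first cleaning step."""
    seen, out = set(), []
    for row in rows:
        if row not in seen:
            seen.add(row)
            out.append(row)
    return out

# A second project reuses the routine instead of rewriting it:
cleaned = PREP_LIBRARY["dedupe_events"](
    [(1, "RUNNING"), (1, "RUNNING"), (2, "IDLE")]
)
```

Because every routine passes through one registry, the library also gives data governance a single place to review and certify transformations, which is where the data-quality benefit comes from.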
In Summary: Start small, use technology designed to scale – and collaborate.
The combination of these three insights shows a clear path to improving productivity through smart factory initiatives: financial commitment, careful technology selection and collaboration across projects. Committing a focused factory budget enables the analytic processes and technology required to scale – and deploy – analytic output.
With cloud deployment options and consumption-based pricing models, there is no excuse not to start your initial projects on Teradata Vantage. Teradata enables data scientists to draw on an agile data foundation, which encourages collaboration. Most importantly, Teradata technology is proven at the analytic scale required to deliver value to manufacturing processes.