
American Manufacturing Has a Data Problem

By: Berk Birand

Photo by Victor Garcia on Unsplash

Manufacturing is one of the most data-intensive industries. With constant input from hundreds of sensors, smart factories produce as much as 5 PB of data per week. However, most companies still aren’t effectively dealing with the data deluge.

It’s true that over recent decades, manufacturers have made vast strides in storing and managing the data they collect. Eager to join the Industry 4.0 revolution, they have invested billions in smart technologies, including digital sensors and connectivity devices, and the global market for big data in manufacturing is projected to top $9 billion by 2026. Google “big data in manufacturing” and you’ll find millions of results.

However, collecting data on its own provides zero business value. It’s the insights drawn from that data that matter.

And when it comes to actually making sense of all the data they’re gathering, industrial companies remain behind the curve. As a result, they’re missing out on key insights that could help them save millions in production costs and reduce carbon emissions.

The reason is simple—slow technology adoption. New technologies like machine learning are specifically designed for processing big, complex data sets, making them ideal for an industrial setting. With a machine learning solution, engineers no longer have to manually select which data points they want to analyze. Rather, the software can take in all the data produced in a given factory and provide real-time insights automatically, with much more speed and accuracy than human engineers.

However, automotive manufacturers, chemical manufacturers, and others frequently rely on outdated techniques for data analysis, such as Excel and traditional statistics. These methods were never intended for large data sets or real-time use cases. By neglecting newer approaches like industrial machine learning, manufacturers are passing up valuable insights and essentially leaving money on the table.

Turning data into savings

Making sense of the data deluge is vital for decision-making. Machine learning software can process vast amounts of data and mine it for insights into how to optimize the manufacturing process, at a speed and scale that engineers simply cannot replicate.

Product quality issues are one critical area where industrial machine learning proves its worth. Given the complex nature of quality issues, particularly in process industries such as chemicals, it often takes months to understand why a particular problem occurred. Engineers have to come up with a hypothesis, decide which parameters to analyze, and then conduct time-consuming root cause analysis.

Machine learning, however, changes the picture. With software combing through all the data produced on the factory floor, engineers can pinpoint the root cause of a quality issue in a matter of hours.
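
To make that concrete, here is a minimal sketch in Python with pandas and scikit-learn of what such an analysis might look like: a model is fit on historical batch data relating sensor readings to a quality metric, and its feature importances are used to rank which process parameters are most associated with the defect. The file name and column names are hypothetical, and this stands in for the far more involved pipelines real industrial software uses.

# Minimal sketch: rank candidate root-cause parameters for a quality issue.
# Assumes a historical log where each row is a production batch with numeric
# sensor readings and a recorded quality metric; all names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("batch_history.csv")     # hypothetical historical batch data
target = "defect_rate"
features = [c for c in df.columns if c != target]

# Fit a model that can capture nonlinear relationships between parameters and quality.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(df[features], df[target])

# Rank parameters by how much each one contributes to the model's predictions.
importances = pd.Series(model.feature_importances_, index=features)
print(importances.sort_values(ascending=False).head(10))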

Even when there are no quality issues, engineers often wonder how processes could be optimized. By quickly ingesting large amounts of data, machine learning can offer insight into how various parameters are connected and where the process could be made more efficient. Within a few months, the resulting optimizations can save a company millions of dollars.
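
Continuing the hypothetical sketch above, the same fitted model can double as a rough “what if” tool: sweep one parameter across its historical range while holding the others at typical values, and look for the setting with the best predicted quality. The column name is again made up, and any real optimization would need process constraints and engineering review on top of this.

# Continues from the sketch above (reuses df, features, model, and pandas).
# Hold other parameters at their medians and sweep a hypothetical "reactor_temp".
import numpy as np

baseline = df[features].median()
sweep = np.linspace(df["reactor_temp"].min(), df["reactor_temp"].max(), 25)

candidates = pd.DataFrame([baseline] * len(sweep))
candidates["reactor_temp"] = sweep

# Predict the quality metric at each candidate setting and pick the best one.
predictions = model.predict(candidates[features])
best = sweep[np.argmin(predictions)]
print(f"Lowest predicted defect rate at reactor_temp ~ {best:.1f}")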

Making sense of data has a clear payoff. Lofty goals that once conflicted with each other—like reducing raw material costs, saving money, and lowering carbon emissions—are now within reach. The big data revolution presents many opportunities. Will companies invest in the technology that lets them seize those opportunities?

Originally published in Industry Today.