Artificial intelligence, augmented reality, the Internet of Things and mobile devices are increasingly being applied in manufacturing, warehousing and other areas of the supply chain. In the following article, Forbes reflects on how analytics can solve problems more efficiently, helping companies progress and grow.
Focus on solving problems that transform operations
When discussing intelligent manufacturing, it can be harmful to dwell on the technologies, such as Artificial Intelligence (AI) or Machine Learning. Too often, when business leaders focus on technologies, they tend to view them as silver bullets that will fix everything. Instead, it’s better to think of your digital deployment strategy as a journey that will help you solve more – and more complex – problems. In practice, that means focusing on the proven use cases – whether or not they involve AI, machine learning or any other emerging technology.
The previous article on this topic focused on reaping the low-hanging fruit: reducing variability, addressing quality and machine problems faster, and having a system identify and alert operators when issues happen. These are perennial problems that manufacturers face and solving them faster drives value creation. The next step is to leverage the power of data analytics to solve problems that you otherwise would not be able to.
Solving New Problems
As you build data sets to solve existing problems, you can also be working to accumulate the data needed to address new issues – those that can only be resolved using new technologies and large data sets. Consider two use cases that, though they involve AI and machine learning, highlight targeted capabilities, not their gee-whiz possibilities. Still, these use cases are transformative because they solve problems you wouldn't otherwise be able to solve and will drive more efficiency into the business.
Environmental Analytics: The environment in a factory – including its temperature, humidity, dew point, etc. – plays a significant role when you’re processing materials. Without environmental data and a way to quickly analyze it, a person can’t walk into a factory and say, “Today it’s a little more humid than usual, so I’m going to change the parameters like this.”
However, with enough of the right data ingested and analyzed using a simple algorithm that understands how the environment affects the processing of a particular material, you can make a concrete recommendation to adjust production parameters, while optimizing for quality and output.
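To make this concrete, here is a minimal sketch of the kind of recommendation such an algorithm could produce. All of the names and numbers – the baseline humidity, the nominal setpoint, and the compensation coefficient – are hypothetical, standing in for relationships a real model would learn from ingested environmental data.

```python
# Hypothetical sketch: adjust a processing parameter based on ambient humidity.
# The baselines and sensitivity coefficient are illustrative, not real values;
# in practice they would be learned from historical environmental and quality data.

BASELINE_HUMIDITY = 45.0   # % relative humidity the process was tuned for
BASELINE_TEMP_C = 180.0    # nominal dryer temperature setpoint, in degrees C
TEMP_PER_HUMIDITY = 0.4    # assumed degrees C of compensation per % RH deviation

def recommend_temperature(current_humidity: float) -> float:
    """Recommend a dryer setpoint that compensates for today's humidity."""
    deviation = current_humidity - BASELINE_HUMIDITY
    return BASELINE_TEMP_C + TEMP_PER_HUMIDITY * deviation

# On a more humid day (55% RH) the model suggests raising the setpoint:
print(recommend_temperature(55.0))  # 184.0
```

The point is not the arithmetic, which is trivial, but that the recommendation is concrete and actionable – something an operator can apply directly instead of guessing.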
Performance Optimization: Similarly, with data about how you’re making a product (or a family of products if they’ve been grouped), you can use a simple algorithm to analyze your production process. It looks for the best relationship between multiple variables, while optimizing for quality and output, and makes recommendations for changes that will improve the process. For example, the data might show that you’d reduce the cycle time by 5 percent if you use a different set of parameters. That information helps engineers move faster and find new solutions very quickly.
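A simple way to picture this kind of analysis is a search over historical production runs for the parameter set with the best cycle time that still meets a quality floor. The field names, thresholds, and run data below are illustrative assumptions, not real production values; a real system would analyze far more variables and records.

```python
# Hypothetical sketch: recommend production parameters from historical runs.
# Pick the run with the lowest cycle time that still meets a quality floor.
# All field names and values are illustrative.

runs = [
    {"speed": 100, "pressure": 2.0, "cycle_time": 60.0, "quality": 0.97},  # current baseline
    {"speed": 110, "pressure": 2.2, "cycle_time": 57.0, "quality": 0.96},
    {"speed": 120, "pressure": 2.4, "cycle_time": 55.0, "quality": 0.93},  # fast, but quality too low
]

QUALITY_FLOOR = 0.95  # assumed minimum acceptable quality score

def recommend_parameters(history):
    """Return the acceptable-quality run with the shortest cycle time."""
    acceptable = [r for r in history if r["quality"] >= QUALITY_FLOOR]
    return min(acceptable, key=lambda r: r["cycle_time"])

best = recommend_parameters(runs)
baseline = runs[0]["cycle_time"]
improvement = (baseline - best["cycle_time"]) / baseline * 100
print(f"Use speed {best['speed']}: cycle time cut {improvement:.0f}%")
```

With these toy numbers the recommendation mirrors the example in the text: a different parameter set cuts cycle time by 5 percent while staying above the quality floor.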
Both of these use cases involve machine learning, but deploying machine learning isn't the goal; solving problems is. Also, both use cases can be expanded upon. In the beginning, you'll likely route recommendations from the models to an engineer for validation and execution. In the future, you will be able to direct them into an autonomous system, creating a closed loop. With the production process leveraging artificial intelligence and machine learning, it will produce products as efficiently as possible and improve continually, with every cycle.
What about predictive maintenance?
Many organizations think predictive maintenance is a good first digital use case, but it’s not, and here’s why. Predictive maintenance solutions require significantly more data than you need for anomaly detection and quality or performance optimization. That means it’ll take more time to build the data sets for predictive maintenance – and a much longer wait for ROI.
The rule of thumb for building a robust, intelligent predictive system is that you need roughly ten examples of the event you're trying to predict for every variable you're analyzing. So, if you're building a predictive maintenance model with 30 variables in a particular situation, you're going to need about 300 failure examples. For most manufacturers, quickly amassing such a large data set and simulating those failure modes will be challenging.
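The rule of thumb above reduces to a one-line calculation, sketched here for clarity (the function name and default factor are just labels for the article's heuristic, not an established formula):

```python
# The article's sizing heuristic: examples needed ~= 10 x number of variables.
# This is a rough rule of thumb, not a statistical guarantee.

def examples_needed(num_variables: int, factor: int = 10) -> int:
    """Estimate how many failure examples a predictive model needs."""
    return factor * num_variables

# A predictive maintenance model with 30 variables:
print(examples_needed(30))  # 300
```

Since unplanned failures are (by design) rare events, accumulating hundreds of labeled examples can take years, which is why the wait for ROI is so long.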
Building the biggest data set
Becoming an intelligent manufacturer isn’t as much about what technologies you deploy as it is about helping people solve problems, make decisions and take action faster. In turn, that means making sure they have the right information at the right time. The fastest way to get started is to implement targeted, high-impact projects that leverage production data to help solve problems.
Fundamentally, the sooner you begin ingesting data, the faster you'll be able to identify and solve problems – first by solving existing problems more quickly, and then by solving problems you wouldn't otherwise be able to solve. By starting with smaller, targeted projects, you'll drive ROI as you build the much larger data sets required for more complex predictive systems – and the sooner you'll be able to delight your customers in entirely new ways.
In the end, the manufacturer who wins will be the one with the most data – and the biggest constraint is the time it takes to ingest it.