Mark Boyle, Amada Weld Tech, discusses how Artificial Intelligence (AI) and Machine Learning (ML) algorithms are being implemented to determine material flow through production lines on the manufacturing floor, allocate parts to the right place at the right time to enhance throughput, and predict which product to build to maximise profits.
One key area of development is weld process monitoring: using AI/ML to look directly at a resistance or laser weld manufacturing process to determine success. This involves recording and analysing an array of the physical signals emitted during the weld, which requires not only high-resolution sensors to collect them (the duration of the weld can be on the order of milliseconds), but also an infrastructure to export and analyse copious amounts of data.
This blog post will explore the three main areas of development - and one collateral area - that are paving the road to AI for laser weld processes in today’s manufacturing. These developments will facilitate a deeper understanding of what is happening during the process, resulting in improved quality and yield.
Fig 1 – The road to artificial intelligence and machine learning for welding processes requires high-resolution data acquisition, high-speed data handling, and AI/ML algorithms to analyse and use the data. The collateral branch of data security is a logical additional step required when placing process monitors on a network.
Data acquisition: collecting high-resolution data
The first, and arguably the most important, part of the process is data acquisition, which includes collection, digitisation, and storage.
Resistance and laser weld processes can take just milliseconds for smaller parts, but there are dynamics, even within that very short time scale, that will yield valuable information about the success of the weld. Thus, the data capture resolution needs to be high enough to gather features on the microsecond time scale. For AI/ML algorithms, the richer the data set (higher resolution), the more accurately the algorithms can sort good from bad. Let’s take a closer look at the importance of resolution.
Fig. 2, below, depicts a physical signal from a weld, collected by a sensor over time. When the data is collected at a low sample rate (top centre), the output (top right) is a stair-step signal that misses the outlier feature in the centre of the curve, which could result in a poor weld being deemed good, or vice versa. If the sample rate is high (bottom centre), however, the true signal is better reconstructed (bottom right) and that outlier would be noticed. The same is true for the signal level: the higher the amplitude resolution, the better the reconstruction of the actual physical signal occurring during the welding process.
Fig 2 – High-resolution data is key to accurate reconstruction and interpretation of the physical signals. In this illustration, we see a representation of low and high sampling rates and the effect they have on capturing a feature.
Why is this so important? When making any kind of prediction about quality, more data yields more accurate predictions. If that blip in the middle of the curve were the primary indicator of a successful weld, it is clear that the higher-resolution signal is required to confirm its presence.
To be fair, the curve in the above example is fairly simple, and an experienced process engineer could easily pick out good/bad welds. In reality, however, the curves are much more involved, and the correlation between certain segments of the measured curve and welding success will not be obvious. This is where the AI/ML algorithms can be used to parse and de-trend the data in multiple ways to find new correlations. The takeaway here? Higher-resolution data will yield better results from the AI/ML algorithms.
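As a rough illustration of this sampling effect, the sketch below simulates a weld-like signal containing a brief dip and samples it at a low and a high rate. The signal shape, sample rates, and feature width are illustrative assumptions, not values from a real process.

```python
import numpy as np

# Illustrative "true" weld signal: a smooth ~1 ms pulse with a brief ~20 µs dip,
# the kind of outlier feature discussed above. All shapes and rates are assumptions.
def true_signal(t_us):
    pulse = np.exp(-((t_us - 500.0) / 300.0) ** 2)      # main weld event
    dip = 0.4 * np.exp(-((t_us - 520.0) / 10.0) ** 2)   # short-lived anomaly
    return pulse - dip

for sample_rate_hz in (10_000, 1_000_000):               # 10 kS/s vs 1 MS/s
    step_us = 1e6 / sample_rate_hz                        # sample spacing in µs
    t_sampled = np.arange(0.0, 1000.0, step_us)
    samples = true_signal(t_sampled)
    # Did the sampled trace capture the dip? Check the minimum near t = 520 µs.
    window = samples[(t_sampled > 450) & (t_sampled < 600)]
    print(f"{sample_rate_hz:>9,} S/s: min near dip = {window.min():.3f} "
          f"(true dip value ≈ {true_signal(np.array([520.0]))[0]:.3f})")
```

At the low rate the reported minimum stays close to the smooth pulse and the dip goes unseen; only the high-rate trace recovers it.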
Networking takes data collection from local to global
Acquiring high-resolution data does pose a challenge, however, when it comes time to transfer it from the sensor to local storage or to remote, networked storage. Depending on the resolution and number of channels, this can translate to many megabits per second. This clearly requires high-speed data transfer over the network and hard drives with write speeds capable of accepting this amount of information continuously throughout production.
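As a back-of-the-envelope illustration of the data rates involved (the channel count, bit depth, and sample rate below are assumptions, not figures from any particular monitor):

```python
# Hypothetical acquisition setup: 4 channels, 16-bit samples, 1 MS/s per channel.
channels = 4
bits_per_sample = 16
samples_per_second = 1_000_000

bits_per_second = channels * bits_per_sample * samples_per_second
print(f"Raw data rate: {bits_per_second / 1e6:.0f} Mbit/s "
      f"({bits_per_second / 8 / 1e6:.0f} MB/s)")   # 64 Mbit/s ≈ 8 MB/s
```

Sustained over a production shift, even this modest setup generates tens of gigabytes, which is why both transfer bandwidth and write speed matter.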
Of course, having a network brings additional advantages.
Historically, each individual work cell was equipped with a dedicated process monitor. Data was stored locally and aggregated so that basic numeric values - for example, a maximum or minimum signal - could be collected and compared. Exporting data was a time-consuming task, via USB stick or over RS-232. These interfaces are relatively slow, so huge amounts of data could not be easily transferred, and a lot of manual manipulation of the files was required. The result was machine- and operator-dependent information silos.
Fig 3 – Networked weld monitoring
Creating process monitors that are connected via Ethernet significantly eases the transfer of information, which can now be collected globally, assuming there is an outside connection to the internet. This means that process engineers can collect and analyse data from multiple factories located all around the world.
Tying this into AI/ML, the algorithms can source data from similar setups in different locations. This enriches the weld repository more quickly and expands the data set, providing more refined judgments of the welding process.
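As a sketch of what a networked monitor might do, the snippet below pushes one weld's waveform and metadata to a central repository over HTTP. The endpoint, payload fields, and station naming are hypothetical, not an actual Amada Weld Tech interface.

```python
import json
import urllib.request

# Hypothetical endpoint of a central weld-data repository (not a real API).
REPOSITORY_URL = "http://weld-data.example.com/api/welds"

def upload_weld_record(station_id, weld_id, sample_rate_hz, waveform):
    """Send one weld's sampled waveform and metadata to the central store."""
    payload = {
        "station_id": station_id,        # which factory / work cell produced the weld
        "weld_id": weld_id,
        "sample_rate_hz": sample_rate_hz,
        "waveform": waveform,            # list of sampled signal values
    }
    request = urllib.request.Request(
        REPOSITORY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        return response.status           # e.g. 200 or 201 on success

# Example usage (hypothetical identifiers):
# upload_weld_record("factory-A/cell-3", 10421, 1_000_000, samples)
```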
Using collected and stored data in AI and ML algorithms
Now a look at how to use this collected and stored data. Ultimately, the data should tell a story - or in other words, provide information that can be interpreted and used to make further decisions. It can help the process engineer or machine operator understand and answer the following questions:
- Is the process efficient, under control, and producing good product?
- Is the equipment performing correctly? Does it need maintenance?
- Is the equipment well utilised? What is the production rate?
- Are there conditions resulting in defective process results?
- Are there anomalies, unusual events – or data that do not fit?
The answers to these questions can help the manufacturer make business decisions to improve product quality or throughput. Historically, this decision making is done by process engineers and operators based on their experience with the welding process. Initially, this might have been done by sight or sound, but more recently with the aid of basic process monitors.
Applying AI/ML algorithms to welding process data captured by advanced networked monitors expands the capability of the process engineer and operator: the algorithms look for new features that are not readily visible in an aggregated number or within the waveform, and make correlations that cannot easily be seen by eye. They can determine any number of features from several sensor measurements simultaneously.
A “feature” is a quantity (scalar or vector) that is measured or calculated from gathered data. A “useful feature” is a quantity that changes as a result of a change in the process or machine setup. For example, the bump on the waveform shown in Fig. 2 might be a feature that, when present, indicates a weld that did not meet pull strength requirements.
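To make the idea concrete, here is a minimal sketch that computes a few scalar features from a single sampled waveform. The particular features chosen are assumptions for illustration, not a prescribed feature set.

```python
import numpy as np

def extract_features(waveform, sample_rate_hz):
    """Compute a few illustrative scalar features from one weld waveform.
    Assumes the waveform has at least a few hundred samples."""
    waveform = np.asarray(waveform, dtype=float)
    dt = 1.0 / sample_rate_hz
    # Smoothed baseline (51-sample moving average) used to measure local dips.
    baseline = np.convolve(waveform, np.ones(51) / 51, mode="same")
    return {
        "peak": float(waveform.max()),                    # maximum signal level
        "energy": float(np.sum(waveform ** 2) * dt),      # integrated signal energy
        "time_to_peak_s": float(np.argmax(waveform) * dt),
        # Depth of the largest dip below the baseline - a "bump/dip" feature
        # like the one discussed for Fig. 2.
        "max_dip": float(np.max(baseline - waveform)),
    }
```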
AI/ML algorithms work on the basis of either supervised or unsupervised learning. Supervised learning requires a series of tests with a known outcome (label) for each measurement. Because this labelled dataset must be created before the algorithm can be used, the process is fairly time-consuming and can be expensive. Unsupervised learning, on the other hand, starts from scratch without a pre-built dataset. As the dataset is populated, the algorithm can find outliers or anomalies and flag them for further review.
For welding processes, unsupervised learning is the better fit. The process engineer or operator can start welding parts immediately. As the programme develops, it identifies welds that fall outside the norm, and the process engineer or operator can inspect them and feed the result back into the algorithm. This is a far more cost-effective learning programme, and products can still be produced while it learns. The selection criteria will continue to improve as more data and information about good and bad welds are added to the database. This goes beyond simple process limits, as the limits themselves can be adjusted for different input measurements.
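As a minimal sketch of this kind of unsupervised flagging, the snippet below fits an anomaly detector (scikit-learn's IsolationForest, one possible choice among many) to accumulated feature vectors and flags welds that fall outside the norm. The feature values here are random placeholders, not real weld data.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row is one weld's feature vector, e.g. [peak, energy, time_to_peak, max_dip].
# In practice these would come from something like extract_features() above;
# here they are random placeholders.
rng = np.random.default_rng(0)
production_welds = rng.normal(loc=[1.0, 0.5, 0.3, 0.02], scale=0.05, size=(500, 4))

# Unsupervised: fit on production data as it accumulates, with no labels required.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(production_welds)

# Flag new welds for review; -1 means "outside the norm", 1 means "typical".
new_welds = np.vstack([
    rng.normal(loc=[1.0, 0.5, 0.3, 0.02], scale=0.05, size=(3, 4)),  # typical welds
    [[0.6, 0.5, 0.3, 0.40]],                                         # unusually deep dip
])
print(model.predict(new_welds))   # e.g. [ 1  1  1 -1]
```

Welds flagged as -1 would be pulled for inspection, and the outcome fed back to refine the selection criteria, as described above.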
The hidden piece of the puzzle – network security
Because these new process monitors are networked together, can communicate with each other, and may be accessed remotely by a process engineer, there is also the potential for outside sources to reach into the monitor's network. The concern is not so much that a hacker or competitor could access the data, but that this could become a port into the broader company network.
Indeed, network security must be (and has been) considered for Amada Weld Tech's newest products: the company has selectively opened ports and recommends programmes like SecureLink to ensure the product does not expose the customer's broader network.
Summary
The road to AI for welding processes includes high-resolution data acquisition, high-speed data transfer and storage through networked products, and AI/ML algorithms. Along with this, network security is key for a strong implementation at a factory. Achieving these allows AI/ML programmes to work with real-time process monitoring to further advance the understanding of weld processes and improve manufacturing quality and throughput.