Show simple item record

dc.contributor.advisorSen, Sagar
dc.contributor.advisorJenssen, Robert
dc.contributor.authorMohammad, Gutama Ibrahim
dc.date.accessioned2022-05-10T05:38:26Z
dc.date.available2022-05-10T05:38:26Z
dc.date.issued2022-01-26en
dc.description.abstractIn Industry 4.0 manufacturing, sensors provide information about the state, behavior, and performance of processes. One of the main goals of Industry 4.0 is therefore to collect high-quality data in service of its business goals, namely zero-defect manufacturing and high-quality products. However, hardware sensors cannot always gather quality data, for several reasons. First, Industry 4.0 deploys sensors in harsh environments, so measurements are likely to be corrupted by errors such as outliers, noise, or missing values. Second, sensors can, over time, be subject to faults such as bias, drift, complete failure, and precision degradation. Moreover, direct sensing of a process variable can be unavailable due to environmental constraints, such as a surface temperature beyond the range of the physical sensor. A virtual sensor is a tool that addresses these problems by allowing online estimation of process variables when the physical sensor is unreliable or unavailable. Deep learning methods are effective in developing virtual sensors; however, they assume that the data used for training and deployment are independent and identically distributed (i.i.d.). Deep learning in high-risk environments such as Industry 4.0 is therefore challenging: if the i.i.d. assumption fails to hold, the model may make errors that lead to disastrous consequences, such as financial losses, reputational damage, or even death. We can prevent model mistakes only if the model estimates the uncertainty of its predictions. Unfortunately, current deep learning-based virtual sensors are built on frequentist models, making them unable to capture uncertainty accurately. In this thesis, we explore the use of Bayesian convolutional neural networks (BCNN) to generate uncertainty-aware virtual sensors for Industry 4.0. We use two publicly available, realistic industrial datasets to generate virtual sensors and conduct experiments: the CNC Mill Tool Wear data (CNC) from a CNC milling machine, provided by the University of Michigan, and the Tennessee Eastman Process data (TEP), provided by Eastman Chemical Company for process monitoring and control studies. The root-mean-square error (RMSE), mean absolute percentage error (MAPE), and R-squared (R2) are used to evaluate the predictive capability of the generated virtual sensors. Their performance is compared to that of standard neural network-based virtual sensors, namely a convolutional neural network (CNN) and a long short-term memory (LSTM) network. We demonstrate the Bayesian neural network's ability to quantify uncertainty by computing the coverage probability of its uncertainty estimates. Additionally, we test whether the estimated uncertainty can detect changes in the input data distribution, using a fault injection method. Our BCNN virtual sensor achieved the best R-squared scores, with R2 = 0.99 on the CNC data and R2 = 0.98 on the TEP data. The coverage probability scores indicate reasonably good uncertainty estimates. However, although the predictive uncertainty detected faults injected into the input data, its detection accuracy declined as the fault length increased.en_US
dc.identifier.urihttps://hdl.handle.net/10037/25039
dc.language.isoengen_US
dc.publisherUiT Norges arktiske universitetno
dc.publisherUiT The Arctic University of Norwayen
dc.rights.holderCopyright 2022 The Author(s)
dc.rights.urihttps://creativecommons.org/licenses/by-nc-sa/4.0en_US
dc.rightsAttribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)en_US
dc.subject.courseIDFYS-3941
dc.subjectVirtual sensorsen_US
dc.subjectPredictive uncertaintyen_US
dc.subjectBayesian deep learningen_US
dc.subjectSensor faultsen_US
dc.titleValidating Uncertainty-Aware Virtual Sensors For Industry 4.0en_US
dc.typeMastergradsoppgaveno
dc.typeMaster thesisen


File(s) in this item


This item appears in the following collection(s)


Except where otherwise noted, this item's license is described as Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)