
Ebook: Intelligent Manufacturing and Cloud Computing

Intelligent manufacturing, sometimes referred to as smart manufacturing, is the term applied to manufacturing that uses automation, data collection, artificial intelligence, machine learning, or other types of integrated technology to create optimal production conditions.
This book presents the proceedings of ICIMCC 2024, the 2024 International Conference on Intelligent Manufacturing and Cloud Computing, held from 22 to 24 November 2024 in Wuhan, China. The conference provided a valuable international platform for invited speakers, authors, and participants from around the globe. It attracted a diverse range of participants and served not only as an academic forum at which papers and topics related to intelligent manufacturing and cloud computing were enthusiastically welcomed, but also as an excellent opportunity to establish business cooperation.
All submissions were subjected to a stringent peer-review process by two to four expert referees, and papers were selected on the basis of originality, significance, and clarity in relation to the scope of the conference. The book presents 77 papers, divided into five chapters according to subject: mechatronics and factory, computing, communication, optics, and robotics.
The book is a compilation of up-to-date, comprehensive, state-of-the-art knowledge on intelligent manufacturing and cloud computing from around the world. It showcases the latest research results and will be of interest to all those working in the field.
ICIMCC 2024, the 2024 International Conference on Intelligent Manufacturing and Cloud Computing, was held from 22 to 24 November 2024 in Wuhan, China.
ICIMCC 2024 provided a valuable international platform for invited speakers, authors, and participants from around the globe. The conference attracted a diverse range of participants and served not only as an academic forum at which papers and topics related to intelligent manufacturing and cloud computing were enthusiastically welcomed, but also as an excellent opportunity to establish business cooperation.
The proceedings of ICIMCC 2024 compile up-to-date, comprehensive, state-of-the-art knowledge on intelligent manufacturing and cloud computing from around the world. All accepted papers underwent a stringent peer-review process by two to four expert referees and were selected on the basis of their originality, significance, and clarity in relation to the conference’s objectives. The conference program was rich and substantial, featuring presentations of the selected papers as well as additional late-breaking contributions. The conference not only showcased the latest research results in the field but also provided a significant platform for academic connection and exchange.
The final conference program included 77 papers across 5 sessions. The proceedings will be published by IOS Press as part of their series, ensuring quick, informal, and high-quality publication.
We extend our heartfelt gratitude to the members of the Technical Program Committee, who worked tirelessly to meet the review deadlines, and to the organizers for all their hard work. The valuable time and dedication of both in preparing for the conference are much appreciated. Our deepest thanks also go to all volunteers and staff for the long hours they generously contributed to the conference. Lastly, we would like to express our appreciation to each and every author, speaker, and participant for their significant contribution to the success of ICIMCC 2024.
ICIMCC 2024 Organizing Committee
In federated learning systems, malicious attackers can manipulate training datasets by injecting backdoor triggers to achieve data poisoning. To address this vulnerability, this paper proposes a new defense method, the adaptive differential privacy stochastic gradient descent (ADP-SGD) algorithm, which increases the Euclidean distance between malicious and benign updates to enhance aggregation efficiency. Experiments were conducted on typical datasets, and the results show that combining ADP-SGD with the m-Krum aggregation algorithm can effectively defend against backdoor attacks in complex attack environments. While ensuring the accuracy of the main task, the method significantly reduces the success rate of backdoor attacks, especially when the proportion of malicious participants is unknown, proving it to be effective and robust.
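To make the aggregation step concrete, the sketch below shows a generic multi-Krum (m-Krum) selection rule in Python with NumPy; the function name, toy data, and parameter choices are illustrative assumptions and do not reproduce the authors' ADP-SGD code.

```python
import numpy as np

def m_krum(updates, n_malicious, m):
    """Select the m client updates with the smallest Krum scores and average them.

    updates: array of shape (n_clients, n_params), one flattened update per client.
    n_malicious: assumed upper bound f on the number of malicious clients.
    m: number of updates kept for averaging.
    """
    n = len(updates)
    # Pairwise squared Euclidean distances between client updates.
    dists = np.square(updates[:, None, :] - updates[None, :, :]).sum(axis=2)
    scores = np.empty(n)
    for i in range(n):
        # Sum of distances to the n - f - 2 closest other updates (Krum score).
        closest = np.sort(np.delete(dists[i], i))[: n - n_malicious - 2]
        scores[i] = closest.sum()
    selected = np.argsort(scores)[:m]          # benign-looking updates
    return updates[selected].mean(axis=0)       # aggregated global update

# Toy usage: 10 clients, 5-dimensional updates, at most 2 assumed attackers, keep 3.
rng = np.random.default_rng(0)
agg = m_krum(rng.normal(size=(10, 5)), n_malicious=2, m=3)
```

Krum-style selection excludes updates that lie far, in Euclidean distance, from the majority before averaging, which is the behaviour that enlarging the distance between malicious and benign updates is intended to exploit.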
From the perspective of online public opinion, this study uses web crawler retrieval technology and manual screening, together with targeted collection and information extraction from online public opinion monitoring platforms, to identify websites directly related to product quality. It monitors, analyzes, and summarizes product quality and safety information from 31 provinces in China, points out existing problems, and proposes corresponding countermeasures and suggestions. The monitoring results show that in 2023, public opinion information on product quality and safety was concentrated mostly in the eastern region and in product categories such as electrical appliances, automobiles, and daily chemical products. Quality and safety issues are currently a focus of social attention on people’s livelihoods. Relevant regulatory authorities should therefore further strengthen product quality and safety supervision, crack down severely on quality violations, improve product quality and safety levels, prevent product quality incidents, and effectively safeguard the personal and property safety of the people.
In recent years, China has released a large number of standard documents related to prefabricated concrete components. Because these standards are dispersed and complex, they are difficult for industry managers to implement, so it is imperative to establish a knowledge graph for the quality standards of precast concrete components to achieve knowledge management. The construction of a knowledge graph first requires named entity recognition, but owing to the specificity of the construction industry, entity recognition in texts from this field still suffers from low recognition accuracy and high resource consumption. To address these problems, this paper proposes a named entity recognition model, ALBERT-BiLSTM-SA-CRF, for the field of precast concrete component quality standards. By combining BiLSTM with a self-attention mechanism, the model captures full-text and local feature information more effectively and accurately recognizes entities in the standards. Meanwhile, using ALBERT as the word embedding layer reduces resource consumption and improves processing efficiency.
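For orientation, the minimal PyTorch sketch below renders the ALBERT-BiLSTM-self-attention encoder described above; the layer sizes, the placeholder checkpoint name, and the omission of the CRF decoding layer are simplifying assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn
from transformers import AlbertModel

class AlbertBiLSTMSA(nn.Module):
    """Hypothetical sketch of the ALBERT-BiLSTM-SA encoder; a CRF layer would sit on top."""

    def __init__(self, num_tags, hidden=256, heads=4,
                 pretrained="albert-base-v2"):   # placeholder; a Chinese checkpoint would be used
        super().__init__()
        self.albert = AlbertModel.from_pretrained(pretrained)
        emb = self.albert.config.hidden_size
        self.bilstm = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.self_attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.emissions = nn.Linear(2 * hidden, num_tags)

    def forward(self, input_ids, attention_mask):
        x = self.albert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state
        x, _ = self.bilstm(x)                       # bidirectional (full-text) context
        a, _ = self.self_attn(x, x, x,              # local feature re-weighting
                              key_padding_mask=~attention_mask.bool())
        return self.emissions(x + a)                # per-token tag scores (CRF input)
```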
Elevator inspection data and the elevator twin model are often poorly matched, which leads to data redundancy and delay when the twin model runs. Based on an analysis of elevator inspection data, we study a driving-data configuration based on correlation analysis to characterize the running state of the elevator twin model, and the driving of the twin model's operation based on coordinate transformation. Together these form an inspection-data-driven method for elevator twin visualization, which realizes automatic matching between the inspection data and the running elevator twin model and drives the model accordingly. Finally, the effectiveness of the method is illustrated by examples.
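As a minimal illustration of the coordinate-transformation step, the hypothetical NumPy sketch below maps inspection-data points into the twin model's frame with a single homogeneous transform; the rotation axis and numerical values are assumptions for illustration only.

```python
import numpy as np

def to_twin_frame(points, rotation_deg, translation):
    """Map inspection-data points into the twin model's coordinate frame.

    Minimal illustration: a rotation about the vertical (z) axis followed by a
    translation, expressed as one 4x4 homogeneous transform.
    """
    t = np.radians(rotation_deg)
    T = np.array([[np.cos(t), -np.sin(t), 0, translation[0]],
                  [np.sin(t),  np.cos(t), 0, translation[1]],
                  [0,          0,         1, translation[2]],
                  [0,          0,         0, 1]])
    homogeneous = np.c_[points, np.ones(len(points))]
    return (homogeneous @ T.T)[:, :3]

# Toy usage: one measured point rotated 90 degrees and shifted up 2 m.
print(to_twin_frame(np.array([[1.0, 0.0, 0.0]]), 90, [0, 0, 2]))
```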
This paper focuses on the utilization of waste heat from Stirling engines. Simulation calculations were performed using Aspen Exchanger Design and Rating (Aspen EDR) software. Based on the results, a simulated air purification device utilizing Stirling engine waste heat was designed. The experimental results demonstrate that the temperature of the cabin air at the outlet of the heat exchanger meets the operating conditions of the catalytic device. At this point, the heat required to warm the cabin air accounts for 45.6% of the available waste heat from the engine.
Detection algorithms can greatly improve the safety of construction workers on construction sites. In this paper, a helmet-wearing detection algorithm is studied. The YOLOv5 model is adopted as the core algorithm for detecting whether personnel are wearing helmets, and the lightweight Ghost module and the CBAM attention mechanism are added to adapt the original algorithm to construction scenarios. The Ghost module is a lightweight structure used to reduce the computational load of convolutional neural networks, while the CBAM attention mechanism is used to enlarge the effective receptive field of the network. Because construction scenes contain many hazards, emergencies may occur at any time, so efficient and reliable helmet-wearing detection is essential; the attention mechanism further improves operational efficiency and detection accuracy. The resulting algorithm performs well in terms of accuracy and robustness, with high detection accuracy and good adaptability.
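The CBAM module named above follows a published recipe of channel attention followed by spatial attention; the PyTorch sketch below is a generic rendering of that recipe, with the reduction ratio and kernel size as assumed defaults rather than the paper's settings.

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Convolutional Block Attention Module: channel attention then spatial attention."""

    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.mlp = nn.Sequential(                       # shared MLP for channel attention
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False))
        self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        # Channel attention: average and max pooling over H x W, shared MLP, sigmoid gate.
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # Spatial attention: pooling over channels, then a 7x7 convolution.
        s = torch.cat([x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))

# Toy usage: attach to a feature map from a detection backbone.
print(CBAM(64)(torch.randn(1, 64, 32, 32)).shape)
```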
In view of the lagging development of agricultural machinery in red soil hilly and mountainous areas, and of common problems such as broken gears and broken blades of rotary tillers during tillage, the front claws of the oriental mole cricket (Gryllotalpa orientalis) were taken as the research object. Combined with agronomic requirements, a virtual model was constructed with the help of digital twin technology to conduct simulation analysis of a crawler self-propelled rotary tiller whose cutter is bionically optimized based on the front claws of the oriental mole cricket, and a comprehensive solution for this automated equipment was proposed. Adhering to the industrial design concept of integrating art and science, and taking light weight, high efficiency, reliability, and applicability as the core principles, digital twins were used to achieve design optimization, and a small rotary tiller matching the crawler tractor was designed to meet the tillage needs of hilly terrain. The design, performance, and function of the rotary knife were simulated and optimized, the materials and structure were improved to enhance durability, and the power system was optimized to improve operating performance. The product achieves the goals of automation, standardization, high efficiency, and low cost to a good extent, improves agricultural production efficiency in the red soil hilly areas, and has excellent application prospects and promotion value.
This study addresses challenges inherent to water quality monitoring devices used in aquaculture, especially in extensive pond systems. These challenges include vulnerability to contamination, complex maintenance requirements, and constrained monitoring scope. We introduce a design and implementation strategy for a water quality monitoring system that leverages unmanned aerial vehicles (UAVs) integrated with three-dimensional Geographic Information Systems (3D GIS). Utilizing 3D GIS technology facilitates the creation of a precise operational milieu for the aquaculture monitoring UAV. This includes defining virtual trajectories for safety zones and operational pathways, thereby mitigating the risk of UAV positional loss. Furthermore, on-site experiments were undertaken to quantify deviations from the designated virtual operational trajectory. Results suggest that the proposed UAV-centric water quality monitoring system, enhanced by 3D GIS, not only offers precision in positioning and stability during flight but also ensures dependable data acquisition and supports offline maintenance procedures. Moreover, this system is capable of servicing water expanses exceeding 1000 acres, remaining unaffected by the fragmentation patterns inherent to large pond systems.
This paper establishes a correlation analysis model based on grey relational analysis (GRA). It addresses the main problem that sample data on space launch vehicle quality problems are limited, so statistical analysis techniques cannot be used effectively. The model takes a certain type of launch vehicle as the analysis object. The objects associated with quality problems are divided into introducing processes, systems, and factors; the number of quality problems is counted for each associated object, and GRA is then used to evaluate the degree of correlation between each associated object and the quality problems, which is used to interpret the main reasons for their frequent occurrence. Finally, the feasibility of the model is verified using the deduced quality-problem data of a certain type of launch vehicle during its launch missions.
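A compact NumPy sketch of the grey relational degree computation is given below; the normalization choice, distinguishing coefficient, and toy counts are assumptions for illustration, not the paper's data.

```python
import numpy as np

def grey_relational_degree(reference, candidates, rho=0.5):
    """Grey relational degree of each candidate sequence to the reference sequence.

    reference:  1-D array (e.g. counts of quality problems per period).
    candidates: 2-D array, one comparison sequence per row (process / system / factor).
    rho:        distinguishing coefficient, conventionally 0.5.
    """
    # Normalise each sequence by its mean so magnitudes are comparable.
    ref = reference / reference.mean()
    cand = candidates / candidates.mean(axis=1, keepdims=True)
    diff = np.abs(cand - ref)
    gmin, gmax = diff.min(), diff.max()
    # Grey relational coefficients, averaged over the sequence.
    coeff = (gmin + rho * gmax) / (diff + rho * gmax)
    return coeff.mean(axis=1)

# Toy usage with made-up counts (not the paper's data).
ref = np.array([5.0, 3.0, 7.0, 4.0])
cands = np.array([[4.0, 3.0, 6.0, 5.0],
                  [1.0, 0.0, 2.0, 1.0]])
print(grey_relational_degree(ref, cands))
```

A higher degree indicates that the associated object's problem counts track the reference sequence more closely, which is how the model ranks the likely main causes.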
With the rapid development of technology, intelligent education has gradually become a research hotspot in the field of education. The Third Plenary Session of the 20th Central Committee of the Communist Party of China explicitly proposed to promote the digitalization of education and align with the global “Education 4.0”. This article explores the application of intelligent education in practical teaching for undergraduate students majoring in mechanical and electrical engineering, aiming to improve teaching effectiveness and students’ operational skills. By analyzing relevant literature, this article proposes specific application strategies for establishing a complete practical teaching system for undergraduate students majoring in mechanical and electrical engineering, constructing an intelligent teaching platform for practical teaching in this field, enhancing teachers’ intelligent literacy in mechanical and electrical engineering, and innovating the intelligent teaching evaluation methods for practical teaching. These strategies can provide references for the cultivation of undergraduate students majoring in mechanical and electrical engineering in colleges and universities.
In this study, a multi-layer perceptron model combining ensemble learning and noise-enhanced prediction (EL-NEP-MLP) is proposed to improve the prediction accuracy and stability of gas concentration measurement in photoacoustic spectroscopy (PAS) under complex noise environments. In practical applications, PAS signals are susceptible to various kinds of noise, especially in low-concentration gas detection. A multi-layer perceptron (MLP) is used for photoacoustic signal feature extraction, and ensemble learning and noise-enhanced prediction are used to improve the robustness and generalization capability of the system. The experimental results show that EL-NEP-MLP can predict gas concentration quickly and accurately under complex noise conditions.
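As a rough sketch of the ensemble-plus-noise idea, the scikit-learn snippet below trains several MLP regressors on noise-perturbed copies of the input features and averages their predictions; the layer sizes, noise level, and synthetic data are assumptions, not the authors' EL-NEP-MLP configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_noise_ensemble(X, y, n_models=5, noise_std=0.01, seed=0):
    """Train several MLP regressors on noise-perturbed copies of the training signals."""
    rng = np.random.default_rng(seed)
    models = []
    for k in range(n_models):
        X_noisy = X + rng.normal(0.0, noise_std, size=X.shape)  # noise-enhanced copies
        m = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=k)
        models.append(m.fit(X_noisy, y))
    return models

def predict_ensemble(models, X):
    """Average the member predictions to obtain the ensemble concentration estimate."""
    return np.mean([m.predict(X) for m in models], axis=0)

# Toy usage with synthetic "photoacoustic features" (not real PAS data).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
y = X[:, 0] * 2.0 + 0.1 * rng.normal(size=200)
models = train_noise_ensemble(X, y)
print(predict_ensemble(models, X[:3]))
```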
Extra-high voltage transmission lines and cables are subject to irregular vibration caused by icing or unusual weather conditions, which is potentially hazardous to the safety of the towers and the structural strength of the wires. Most current monitoring methods use electronic acceleration sensors, which suffer from low sensitivity, poor signal-to-noise ratio, a troublesome installation process, and the need for an additional power supply. Distributed fibre optic vibration demodulation is a more advanced sensing technology that can achieve long-distance, large-scale, multi-dimensional online monitoring and structural safety warning. In this work, distributed laser-fibre Michelson coherent sensing technology is used to develop a cable vibration sensor that can monitor the vibration of cables and towers. The sensor consists of an elastic thin-walled cylinder, a mass ring, a fixed ring, and a sensing fibre. A highly sensitive fine optical fibre and a coupler are used to build a laser-fibre Michelson coherent optical path that is very sensitive to vibration signals, and the interferometer is encapsulated in a specific metal casing to achieve the packaging and integration of the vibration monitoring sensor. The optical system of this laser fibre sensor offers high sensitivity and a high signal-to-noise ratio. Numerous experiments using the differential cross-multiplication demodulation algorithm show that the demodulated output has a linear correlation coefficient of up to 0.95 for vibrations in the 10 Hz to 200 Hz range, making the sensor suitable for online vibration monitoring of high-voltage towers and cables.
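For reference, the textbook form of phase recovery by differential cross-multiplication from quadrature interference signals is sketched below in NumPy; the signal model and sampling parameters are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def dcm_demodulate(x, y, dt):
    """Recover the interferometric phase from quadrature signals by
    differential cross-multiplication (textbook form, not the authors' exact code).

    x, y: quadrature interference signals, ideally cos(phi) and sin(phi).
    dt:   sampling interval in seconds.
    """
    dx = np.gradient(x, dt)
    dy = np.gradient(y, dt)
    # x*dy/dt - y*dx/dt equals dphi/dt when x^2 + y^2 is normalised to 1.
    dphi = (x * dy - y * dx) / np.maximum(x**2 + y**2, 1e-12)
    return np.cumsum(dphi) * dt   # integrate to obtain the phase

# Toy usage: a 50 Hz sinusoidal vibration phase sampled at 10 kHz.
fs, f = 10_000, 50
t = np.arange(0, 0.2, 1 / fs)
phi = 2.0 * np.sin(2 * np.pi * f * t)
rec = dcm_demodulate(np.cos(phi), np.sin(phi), 1 / fs)
print(np.corrcoef(phi, rec)[0, 1])   # close to 1 for a clean signal
```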
The disposal of used smartphones has become an increasingly severe environmental problem: these devices are not only huge in quantity, but also contain harmful elements and scarce resources. There is an urgent need for effective management strategies to reduce the potential environmental impacts of smartphones. A new Life Cycle Assessment (LCA) model, covering the five phases of production, transportation, distribution, usage, and recycling, was constructed to quantify the carbon footprint of 5G smartphones. Based on the usage and recycling scenarios adopted by youth, adult, and elderly users in China, the cradle-to-cradle carbon footprints of smartphones were compared; they depend strongly on the recycling pattern. Repurposing a smartphone is the best way to reduce its environmental impact, reusing replacement parts can bring a certain carbon offset, and leaving used devices idle is the worst option. The analysis reveals that functional transformation and reuse represent a strategic circular-economy choice with greater environmental advantages. The recycling pattern and the reuse ratio of disassembled parts determine the carbon footprint of smartphones, and promoting the repurposing of used devices within the household does much more to improve that footprint.
This study explores the integration of digital production technologies in the preservation and innovation of intangible cultural heritage (ICH) products. With the decline of traditional craftsmanship and the rise of modern technology, digital tools like 3D modeling, CAD, and artificial intelligence provide new methods for documenting, replicating, and enhancing traditional crafts. Smart manufacturing, automation, and data-driven approaches offer scalable solutions to meet global demand without compromising the uniqueness of ICH products. Furthermore, the study discusses the challenges of blending traditional and modern practices, addressing issues such as the loss of cultural meaning in digital transformations and the global dissemination of ICH. By analyzing the role of digital tools in both preserving and transforming heritage, the research aims to offer strategies for ensuring the sustainable transmission of ICH in the digital age, while retaining its cultural essence.
Restricted by technology, quality, and other factors, summer and autumn tea tastes bitter and has a weak aroma. The utilization rate of summer and autumn tea has therefore been relatively low, and tea-producing areas in China mainly pick spring tea. However, matcha powder made by grinding summer and autumn tea can be further processed into various products, which can effectively enhance its utilization value. The low-temperature matcha powder grinder described in this paper can release the potential of summer and autumn tea. The tea leaves are coarsely crushed into small particles, enter the crushing bin, and are hammered into very fine powder by the double crushing and grinding structure; the powder passes through a mesh screen and is finally collected. The machine can control the particle size of the powder by adjusting the aperture of the mesh screen, and temperature control technology solves the problem of color change caused by excessive temperature during processing. The machine meets the standards and requirements of tea factory production and operation, can provide a wider development path for the middle- and low-grade green tea industry in China, and can significantly improve the added value of tea deep processing.
To meet the requirement for a remote center of motion (RCM) in minimally invasive surgery, a novel 3-UPrU parallel mechanism is proposed that allows two rotations around the incision site and movement in the direction of the incision point. Screw theory is used to solve the degree of freedom (DOF) of the mechanism, and the limit boundary method is used to solve its workspace. The research in this paper demonstrates that the mechanism has RCM motion properties and can suit the needs of minimally invasive surgery, implying that it has application potential.
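The abstract does not reproduce the mobility calculation; for orientation, the modified Grübler-Kutzbach criterion commonly used alongside screw-theory constraint analysis can be written (in notation that is ours, not the paper's) as

\[ M = d\,(n - g - 1) + \sum_{i=1}^{g} f_i + \nu \]

where \(d\) is the order of the mechanism (6 minus the number of common constraints identified from the reciprocal screws), \(n\) the number of links, \(g\) the number of joints, \(f_i\) the freedoms of joint \(i\), and \(\nu\) the number of redundant constraints. For a mechanism such as the 3-UPrU described above, this kind of analysis should recover the two rotations and one translation stated in the abstract.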
By the end of the 14th Five-Year Plan, China will have entered a moderately aging society. The rapid aging process has brought about real issues such as “empty-nest elderly,” “aging before becoming affluent,” and “uneven distribution of elderly care resources.” Building a diversified elderly care service system has become a key solution to these challenges. Zhejiang Province, in response to the planning requirements, has proposed construction standards for “Future Communities,” effectively integrating, supplementing, consolidating, and developing the features of home-based, community-based, and institutional elderly care. This project aims to explore the construction of an evaluation system for elderly care scenarios in Zhejiang’s Future Communities based on the Interpretive Structural Modeling (ISM) approach. Through this model, we will analyze and stratify the various factors influencing elderly care scenarios in Future Communities in Zhejiang.
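The core ISM step, computing the reachability matrix from a factor adjacency matrix before level partitioning, can be sketched as below; the adjacency matrix shown is a hypothetical four-factor example, not the project's factor set.

```python
import numpy as np

def reachability_matrix(adjacency):
    """Compute the Boolean reachability matrix used in Interpretive Structural Modeling.

    adjacency[i, j] = 1 if factor i directly influences factor j (illustrative input).
    """
    n = len(adjacency)
    M = ((adjacency + np.eye(n, dtype=int)) > 0).astype(int)   # A + I
    while True:
        M2 = ((M @ M) > 0).astype(int)      # Boolean matrix "squaring"
        if np.array_equal(M2, M):           # stop once (A + I)^k stabilises
            return M2
        M = M2

# Toy usage with 4 hypothetical influencing factors (not the project's factor set).
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]])
print(reachability_matrix(A))
```

The stabilised matrix is then used to partition factors into hierarchical levels, which gives the stratification referred to above.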
In the design of modern compilers, the generation and optimization of intermediate code play a crucial role. Serving as a bridge between source code and target machine code, intermediate code provides ample opportunities for optimization. This paper focuses on peephole optimization techniques, proposing a new optimization strategy aimed at reducing unnecessary computations and memory accesses by analyzing control and data flows within intermediate code, thereby enhancing program execution efficiency. We first outline the various stages of the compilation process, followed by a detailed exploration of the basic principles and implementation methods of peephole optimization. Finally, we validate the effectiveness and advantages of this optimization strategy through experiments. The experimental results indicate significant improvements in the execution performance of intermediate code after the adoption of peephole optimization, providing new insights and directions for compiler optimization.
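To illustrate what a peephole pass over intermediate code looks like, the short Python sketch below applies two classic local rewrites (redundant store/load elimination and an algebraic identity); the instruction format and patterns are simplified assumptions, not the strategy proposed in the paper.

```python
# A minimal, illustrative peephole pass over three-address-style intermediate code.
def peephole(instrs):
    """Remove redundant store/load pairs and simplify additions with zero."""
    out = []
    i = 0
    while i < len(instrs):
        cur = instrs[i]
        nxt = instrs[i + 1] if i + 1 < len(instrs) else None
        # Pattern 1: "store x; load x" -> keep only the store (value already in a register).
        if nxt and cur[0] == "store" and nxt[0] == "load" and cur[1] == nxt[1]:
            out.append(cur)
            i += 2
            continue
        # Pattern 2: "add t, x, 0" -> "mov t, x" (algebraic simplification).
        if cur[0] == "add" and cur[3] == 0:
            out.append(("mov", cur[1], cur[2]))
            i += 1
            continue
        out.append(cur)
        i += 1
    return out

code = [("store", "a"), ("load", "a"), ("add", "t1", "b", 0), ("mul", "t2", "t1", "c")]
print(peephole(code))
```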
In a fiber loop ring-down spectroscopy (FLRDS) system, the gain saturation absorption effect of the erbium-doped fiber amplifier (EDFA) leads to gain fluctuations in the attenuation pulses, thereby significantly impacting measurement accuracy. To reduce this impact, we propose an end-to-end model using a one-dimensional convolutional neural network (1D-CNN). After preprocessing, FLRDS data are fed into the model for training, and the trained model can then predict unknown concentrations. This method avoids the traditional decay-curve fitting process and thereby reduces the negative impact of the EDFA gain saturation absorption behavior on FLRDS measurement accuracy.
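A minimal PyTorch sketch of an end-to-end 1D-CNN that maps a ring-down trace directly to a concentration value is given below; the layer sizes and trace length are assumptions, since the paper's architecture details are not reproduced here.

```python
import torch
import torch.nn as nn

class RingDownCNN(nn.Module):
    """Illustrative 1D-CNN mapping a ring-down decay trace to a single concentration value."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1))            # global pooling over the time axis
        self.head = nn.Linear(32, 1)             # regression output: concentration

    def forward(self, x):                        # x: (batch, 1, samples)
        return self.head(self.features(x).squeeze(-1))

# Toy usage: a batch of 4 decay traces with 1024 samples each.
print(RingDownCNN()(torch.randn(4, 1, 1024)).shape)   # -> torch.Size([4, 1])
```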
In order to enhance the accuracy of point-of-interest (POI) recommendation systems, a model that dynamically captures user behavior patterns was established by combining Graph Neural Networks (GNNs) with an attention mechanism. The experimental results show that the model improved Top-1 accuracy by 15%, and Top-5 and Top-10 accuracy by 13% and 15%, respectively. The proposed method therefore significantly improves recommendation performance and provides effective support for personalized recommendation.
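For clarity on the reported metrics, the NumPy sketch below computes Top-k accuracy (hit rate) from predicted POI scores; the scores and targets are random placeholders, not the experimental data.

```python
import numpy as np

def top_k_accuracy(scores, targets, k):
    """Fraction of samples whose ground-truth next POI is among the k highest-scored candidates.

    scores:  (n_samples, n_pois) predicted relevance scores.
    targets: (n_samples,) index of the POI actually visited next.
    """
    top_k = np.argsort(-scores, axis=1)[:, :k]
    return np.mean([t in row for t, row in zip(targets, top_k)])

# Toy usage with random scores for 5 users over 20 candidate POIs.
rng = np.random.default_rng(0)
scores = rng.normal(size=(5, 20))
targets = rng.integers(0, 20, size=5)
print(top_k_accuracy(scores, targets, k=5))
```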
In response to the frequent trip alarm issue in distribution networks, this study innovatively proposes a deep learning-based analysis and prediction method. This method conducts exploratory analysis on massive trip alarm data to reveal the spatiotemporal distribution patterns and influencing factors of events. Subsequently, the ConvLSTM model is employed to automatically extract spatial correlation features of trip events, constructing a high-precision risk prediction model. Based on this, a graph attention network is utilized to model the topology of the distribution network, achieving precise estimation of the impact range and fault propagation paths of trip events. Experimental results demonstrate that this method effectively mines the spatiotemporal correlation information contained in trip data, providing real-time early warning and intelligent decision support for the safe and stable operation of distribution networks. Future research will incorporate more external factors to further enhance model performance and focus on engineering practice applications, providing new ideas and methods for the construction and development of smart distribution networks.
In the contemporary era, with its incidence rising at an alarming rate, diabetes has become a major global health concern. This work presents a data mining approach that compares the K-Means and Fuzzy C-Means (FCM) clustering algorithms to address the increasing demand for accurate diabetes management and prediction across a wide range of datasets. Because data mining is essential for drawing insightful conclusions from large, complicated datasets, the study concentrates on using these methods to improve the precision and effectiveness of diabetes prediction models. After preprocessing steps to handle missing values and outliers, the diabetes dataset is divided into separate clusters using FCM and K-Means. The choice of algorithm largely determines the outcome of the clustering process and its effectiveness in the application domain. Two significant clustering algorithms, centroid-based K-Means and representative-object-based FCM, are compared in this work, and the quality of the resulting clusters is used to assess performance. The evaluation metrics of accuracy, sensitivity, specificity, and AUC-ROC highlight the superior performance of the model over the individual clustering techniques. This work addresses the urgent problem of diabetes, advances the field of diabetes prediction, and highlights the crucial role that data mining plays in healthcare analytics for early detection and intervention.
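As a minimal illustration of the two algorithms compared, the sketch below runs scikit-learn's K-Means alongside a plain NumPy Fuzzy C-Means on synthetic two-dimensional blobs; the FCM implementation and the data are illustrative assumptions, not the paper's pipeline or the diabetes dataset.

```python
import numpy as np
from sklearn.cluster import KMeans

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Plain NumPy Fuzzy C-Means (illustrative, not the paper's implementation)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # fuzzy memberships sum to 1 per sample
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (dist ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)        # standard FCM membership update
    return centers, U

# Toy comparison on two synthetic blobs (not the diabetes dataset).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
_, U = fuzzy_c_means(X, c=2)
fcm_labels = U.argmax(axis=1)
print(km_labels[:5], fcm_labels[:5])
```

K-Means assigns each sample to exactly one centroid, while FCM returns graded memberships that are hardened only at the end, which is the essential difference the comparison above evaluates.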