Search results for: measurement accuracy
5883 The Effect of Using LDOCE on Iranian EFL Learners’ Pronunciation Accuracy
Authors: Mohammad Hadi Mahmoodi, Elahe Saedpanah
Abstract:
Since pronunciation is among the factors that can strongly affect EFL learners’ successful communication, instructional programs with accurate pronunciation purposes seem to be a necessity in any L2 teaching context. The widespread use of smart mobile phones brings with it various educational applications that can assist foreign language learners in learning and speaking a language other than their L1. In line with this supportive innovation, the present study investigated the role of LDOCE (Longman Dictionary of Contemporary English), a mobile application, in improving Iranian EFL learners’ pronunciation accuracy. To this aim, 40 EFL learners studying English at the intermediate level participated in the study. The research was experimental, with two groups of 20 students each: an experimental group and a control group. The data were collected through a pronunciation pretest administered before the instruction and a post-test after the treatment. In addition, the assessment was based on the participants’ recorded voices while reading the selected words. The results of the independent samples t-test indicated that using LDOCE significantly affected Iranian EFL learners' pronunciation accuracy, with those in the experimental group outperforming their control group counterparts.
Keywords: LDOCE, EFL learners, pronunciation accuracy, CALL, MALL
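As an illustration of the independent samples t-test used in this study, a minimal Python sketch follows; the gain scores below are hypothetical placeholders, not the study's data.

```python
# Minimal sketch of the independent-samples t-test described above.
# The score arrays are hypothetical; the study's actual data are not reproduced here.
from scipy import stats

experimental_gain = [12, 9, 11, 14, 10, 13, 9, 12, 15, 11,
                     10, 13, 12, 9, 14, 11, 12, 10, 13, 12]   # pre/post gain, LDOCE group
control_gain      = [5, 7, 6, 4, 8, 5, 6, 7, 5, 6,
                     4, 7, 5, 6, 8, 5, 6, 4, 7, 6]            # pre/post gain, control group

t_stat, p_value = stats.ttest_ind(experimental_gain, control_gain)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")  # p < 0.05 would indicate a significant difference
```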
Procedia PDF Downloads 549
5882 A Summary-Based Text Classification Model for Graph Attention Networks
Authors: Shuo Liu
Abstract:
In Chinese text classification tasks, redundant words and phrases can interfere with the extraction and analysis of text information, leading to a decrease in the accuracy of the classification model. To reduce irrelevant elements and to extract and utilize text content information more efficiently, this paper first summarizes the text in the corpus using the TextRank algorithm, uses the words in the summary as nodes to construct a text graph, and then applies a graph attention network (GAT) to complete the text classification task. In tests on a Chinese dataset collected from the web, classification accuracy improved over the direct method of generating graph structures from the full text.
Keywords: Chinese natural language processing, text classification, abstract extraction, graph attention network
Procedia PDF Downloads 102
5881 The Doctor-Patient Interaction Experience Hierarchy Using Rasch Measurement Model Analysis
Authors: Wan Nur'ashiqin Wan Mohamad, Zarina Othman, Mohd Azman Abas, Azizah Ya'acob, Rozmel Abdul Latiff
Abstract:
Effective doctor-patient interaction is vital to the doctor-patient relationship. It is the cornerstone of good practice and an integral quality of a healthcare institution. This paper presents the hierarchy of the communication elements in doctor-patient interaction during medical consultations in a medical centre in Malaysia. The study adapted The Picker Patient Experience Questionnaire (2002) to obtain information from patients. The questionnaire survey was completed by 100 patients between the ages of 20 and 50. Data collected were analysed using the Rasch Measurement Model to yield the hierarchy of the communication elements in doctor-patient interaction. The findings showed that the three highest-ranking elements of doctor-patient interaction were the doctor’s treatment, the delivery of important information, and patient satisfaction with the doctor’s responses. The results are valuable in developing a framework for the communication ethics of doctors.
Keywords: communication elements, doctor-patient interaction, hierarchy, Rasch measurement model
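The Rasch analysis behind such a hierarchy can be illustrated with a short sketch; the dichotomous Rasch probability is standard, but the toy responses and the endorsement-rate ordering below are a simplified stand-in for full Rasch item calibration, not the study's procedure.

```python
# Sketch of the dichotomous Rasch model used to order items by difficulty.
# Person abilities (theta) and item difficulties (delta) here are illustrative only.
import numpy as np

def rasch_probability(theta, delta):
    """P(endorse) = exp(theta - delta) / (1 + exp(theta - delta))."""
    return 1.0 / (1.0 + np.exp(-(theta - delta)))

# Toy responses: rows = patients, columns = interaction items (1 = positive response)
responses = np.array([[1, 1, 0, 1],
                      [1, 0, 0, 1],
                      [1, 1, 1, 1],
                      [0, 1, 0, 1]])

# A crude difficulty estimate: items endorsed less often sit higher in the hierarchy
endorsement_rate = responses.mean(axis=0)
hierarchy = np.argsort(endorsement_rate)  # hardest-to-endorse item first
print("Item hierarchy (hardest to easiest):", hierarchy)
print("P(endorse | theta=0.5, delta=-0.2) =", rasch_probability(0.5, -0.2))
```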
Procedia PDF Downloads 163
5880 Optimization of a Convolutional Neural Network for the Automated Diagnosis of Melanoma
Authors: Kemka C. Ihemelandu, Chukwuemeka U. Ihemelandu
Abstract:
The incidence of melanoma has been increasing rapidly over the past two decades, making melanoma a current public health crisis. Unfortunately, even as screening efforts continue to expand in an effort to ameliorate the death rate from melanoma, there is a need to improve diagnostic accuracy to decrease misdiagnosis. Artificial intelligence (AI), a new frontier in patient care, has the ability to improve the accuracy of melanoma diagnosis. The convolutional neural network (CNN), a form of deep neural network most commonly applied to analyzing visual imagery, has been shown to outperform the human brain in pattern recognition. However, there are noted limitations to the accuracy of CNN models. Our aim in this study was the optimization of convolutional neural network algorithms for the automated diagnosis of melanoma. We hypothesized that optimal selection of the momentum and batch hyperparameters increases model accuracy. Our most successful model showed that a momentum of 0.25 and a batch size of 2 led to superior performance and a faster model training time, with an accuracy of about 83% after nine hours of training. We did notice a lack of diversity in the dataset used, with a noted class imbalance favoring lighter versus darker skin tones. Training set image transformations did not result in a superior model performance in our study.
Keywords: melanoma, convolutional neural network, momentum, batch hyperparameter
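A minimal sketch of the reported hyperparameter choice (momentum 0.25, batch size 2) is shown below; the network architecture, learning rate, and data pipeline are placeholder assumptions, not the authors' exact model.

```python
# Sketch of training a CNN with the hyperparameters reported above
# (momentum = 0.25, batch size = 2). The architecture and data pipeline are placeholders.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(224, 224, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # melanoma vs. benign
])

optimizer = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.25)
model.compile(optimizer=optimizer, loss="binary_crossentropy", metrics=["accuracy"])

# x_train, y_train are assumed NumPy image arrays and labels prepared elsewhere:
# model.fit(x_train, y_train, epochs=50, batch_size=2, validation_split=0.2)
```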
Procedia PDF Downloads 101
5879 Sentiment Classification Using Enhanced Contextual Valence Shifters
Authors: Vo Ngoc Phu, Phan Thi Tuoi
Abstract:
We have explored different methods of improving the accuracy of sentiment classification. The sentiment orientation of a document can be positive (+), negative (-), or neutral (0). We combine five dictionaries from [2, 3, 4, 5, 6] into a new one with 21,137 entries. The new dictionary has many verbs, adverbs, phrases, and idioms that are not in the original five. The paper shows that our proposed method, based on the combination of the Term-Counting method and the Enhanced Contextual Valence Shifters method, has improved the accuracy of sentiment classification. The combined method achieves an accuracy of 68.984% on the testing dataset and 69.224% on the training dataset. All of these methods are implemented to classify reviews based on our new dictionary and the Internet Movie dataset.
Keywords: sentiment classification, sentiment orientation, valence shifters, contextual valence shifters, term counting
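A minimal sketch of Term-Counting combined with a contextual valence shifter (negation) follows; the tiny lexicon and the two-word negation window are illustrative assumptions, not the 21,137-entry dictionary used in the paper.

```python
# Minimal sketch of Term-Counting combined with a simple contextual valence shifter
# (a negator in the preceding window flips polarity). The lexicon is a toy example.
LEXICON = {"good": 1, "great": 1, "boring": -1, "bad": -1, "terrible": -1}
NEGATORS = {"not", "never", "no", "hardly"}

def sentiment_orientation(text):
    tokens = text.lower().split()
    score = 0
    for i, tok in enumerate(tokens):
        if tok in LEXICON:
            polarity = LEXICON[tok]
            # contextual valence shifter: negation within the two preceding tokens
            if any(t in NEGATORS for t in tokens[max(0, i - 2):i]):
                polarity = -polarity
            score += polarity
    return "+" if score > 0 else "-" if score < 0 else "0"

print(sentiment_orientation("the movie was not boring and the acting was great"))  # "+"
```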
Procedia PDF Downloads 505
5878 Application of Electro-Optical Hybrid Cables in Horizontal Well Production Logging
Authors: Daofan Guo, Dong Yang
Abstract:
For decades, well logging with coiled tubing has relied solely on surface data such as pump pressure, wellhead pressure, depth counter, and weight indicator readings. While this data serves the oil industry well, modern smart logging utilizes real-time downhole information, which automatically increases operational efficiency and optimizes intervention quality. For example, downhole pressure, temperature, and depth measurement data can be transmitted through the electro-optical hybrid cable in the coiled tubing to surface operators on a real-time basis. This paper mainly introduces the unique structural features and various applications of electro-optical hybrid cables deployed downhole with the help of coiled tubing technology. Fiber optic elements in the cable enable optical communications and distributed measurements, such as distributed temperature and acoustic sensing. The electrical elements provide continuous surface power for downhole tools, eliminating the limitations of traditional batteries, such as temperature, operating time, and safety concerns. The electrical elements also enable cable telemetry operation of cable tools. Both power supply and signal transmission are integrated into one electro-optical hybrid cable: downhole information is captured by downhole electrical sensors and distributed optical sensing technologies and then travels up through an optical fiber to the surface, which greatly improves the accuracy of measurement data transmission.
Keywords: electro-optical hybrid cable, underground photoelectric composite cable, seismic cable, coiled tubing, real-time monitoring
Procedia PDF Downloads 146
5877 Reliability of Eyewitness Statements in Fire and Explosion Investigations
Authors: Jeff Colwell, Benjamin Knox
Abstract:
While fire and explosion incidents are often observed by eyewitnesses, the weight that fire investigators should place on those observations in their investigations is a complex issue. There is no doubt that eyewitness statements can be an important component of an investigation, particularly when other evidence is sparse, as is often the case when damage to the scene is severe. However, it is well known that eyewitness statements can be incorrect for a variety of reasons, including deception. In this paper, we review factors that can affect the complex processes associated with the perception, retention, and retrieval of an event. We then review the accuracy of eyewitness statements from unique criminal and civil incidents, including fire and explosion incidents, in which the accuracy of the statements could be independently evaluated. Finally, the motives for deceptive eyewitness statements are described, along with techniques that fire and explosion investigators can employ to increase the accuracy of the eyewitness statements that they solicit.
Keywords: fire, explosion, eyewitness, reliability
Procedia PDF Downloads 383
5876 The Antecedent Factor Affecting the Entrepreneurs’ Decision Making for Using Accounting Office Service in Chiang Mai Province
Authors: Nawaporn Thongnut
Abstract:
The objective was to study the process and practice of accounting preparation in Thai temples and to study the performance and quality of the temples’ accounting preparation in accordance with the regulations. The population was the accountants and individuals involved in the accounting preparation of 17 temples in suburban Bangkok. The measurement used in this study was a questionnaire, and the statistics used in the analysis were descriptive statistics, with the data presented in percentage tables describing the demographic characteristics. The study found that temple wardens were responsible for the accounting and reporting of the temples, and abbots checked the accuracy of the accounts in the monasteries. For the most part, there was no external auditing of the monasteries’ accounts. When receiving income, most monasteries’ practice had been to keep financial documents in an orderly manner.
Keywords: corporate social responsibility, creating shared value, management accountant’s roles, stock exchange of Thailand
Procedia PDF Downloads 232
5875 Electricity Demand Modeling and Forecasting in Singapore
Authors: Xian Li, Qing-Guo Wang, Jiangshuai Huang, Jidong Liu, Ming Yu, Tan Kok Poh
Abstract:
In the power industry, accurate electricity demand forecasting for a certain lead time is important for system operation and control. In this paper, we investigate the modeling and forecasting of Singapore’s electricity demand. Several standard models, such as the HWT exponential smoothing model, the ARMA model, and ANN models, have been proposed based on historical demand data. We applied them to the Singapore electricity market and proposed three refinements, based on simulation, to improve the modeling accuracy. Compared with existing models, our refined model produces better forecasting accuracy. The simulations demonstrate that by feeding the forecasting error back into the forecasting equation, the modeling accuracy can be improved greatly.
Keywords: power industry, electricity demand, modeling, forecasting
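The error-feedback refinement can be illustrated schematically; the exponential-smoothing base model, the coefficients, and the demand values below are assumptions, not the paper's exact formulation.

```python
# Schematic illustration of feeding the previous forecast error back into the
# forecasting equation, on top of a simple exponential-smoothing base model.
# The smoothing and correction coefficients are illustrative assumptions.
import numpy as np

def forecast_with_error_feedback(demand, alpha=0.4, gamma=0.6):
    level = demand[0]
    prev_error = 0.0
    forecasts = []
    for y in demand[1:]:
        f = level + gamma * prev_error      # refined forecast: base level + error feedback
        forecasts.append(f)
        prev_error = y - f                  # error observed once the actual demand arrives
        level = alpha * y + (1 - alpha) * level
    return np.array(forecasts)

demand = np.array([5200, 5350, 5300, 5500, 5450, 5600, 5580])  # hypothetical MW values
print(forecast_with_error_feedback(demand).round(1))
```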
Procedia PDF Downloads 641
5874 Spatial Correlation of Channel State Information in Real Long Range Measurement
Authors: Ahmed Abdelghany, Bernard Uguen, Christophe Moy, Dominique Lemur
Abstract:
The Internet of Things (IoT) has been developed to provide monitoring and connectivity for different applications. It is therefore critical to study the channel propagation characteristics of Low Power Wide Area Networks (LPWAN), especially Long Range Wide Area Networks (LoRaWAN). In this paper, an in-depth investigation of the reciprocity between the uplink and downlink Channel State Information (CSI) is carried out through an outdoor measurement campaign in the area of Campus Beaulieu in Rennes. At each location, the CSI reciprocity is quantified using the Pearson Correlation Coefficient (PCC), which shows a very high linear correlation between the uplink and downlink CSI. This reciprocity feature could be utilized for physical layer security between the node and the gateway. On the other hand, most of the CSI shapes from different locations are highly uncorrelated with each other. Hence, a significant localization gain can be anticipated by utilizing frequency hopping in LoRa systems to access a wider band.
Keywords: IoT, LPWAN, LoRa, effective signal power, onsite measurement
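A short sketch of quantifying uplink/downlink reciprocity with the Pearson Correlation Coefficient follows; the CSI vectors are synthetic placeholders, not the campaign data.

```python
# Sketch of quantifying uplink/downlink CSI reciprocity with the Pearson
# correlation coefficient (PCC). The CSI magnitude vectors are synthetic.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
uplink_csi = rng.normal(0, 1, 64)                    # per-channel CSI magnitudes (uplink)
downlink_csi = uplink_csi + rng.normal(0, 0.1, 64)   # reciprocal channel plus small noise

pcc, _ = pearsonr(uplink_csi, downlink_csi)
print(f"Uplink/downlink PCC: {pcc:.3f}")  # values near 1 indicate strong reciprocity
```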
Procedia PDF Downloads 163
5873 6D Posture Estimation of Road Vehicles from Color Images
Authors: Yoshimoto Kurihara, Tad Gonsalves
Abstract:
Currently, in the field of object posture estimation, there is research on estimating the position and angle of an object by storing a 3D model of the object to be estimated in advance in a computer and matching observations against that model. In this research, however, we have succeeded in creating a module that is much simpler, smaller in scale, and faster in operation. Our 6D pose estimation model consists of two different networks: a classification network and a regression network. From a single RGB image, the trained model estimates the class of the object in the image, the coordinates of the object, and its rotation angle in 3D space. In addition, we compared the estimation accuracy for each camera position, i.e., the angle from which the object was captured. The highest accuracy was recorded at a camera position of 75°, where the classification accuracy was about 87.3% and the regression accuracy about 98.9%.
Keywords: 6D posture estimation, image recognition, deep learning, AlexNet
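A minimal PyTorch sketch of the two-branch design (a classification head plus a regression head for coordinates and rotation) is given below; the backbone and layer sizes are assumptions rather than the authors' AlexNet-based model.

```python
# Minimal sketch of a two-branch pose estimator: one head classifies the object,
# the other regresses its 3D coordinates and rotation angles. Layer sizes are assumptions.
import torch
import torch.nn as nn

class PoseNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(            # stand-in for an AlexNet-style feature extractor
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(64, num_classes)   # object class
        self.regressor = nn.Linear(64, 6)              # x, y, z and three rotation angles

    def forward(self, x):
        features = self.backbone(x)
        return self.classifier(features), self.regressor(features)

logits, pose = PoseNet()(torch.randn(1, 3, 224, 224))
print(logits.shape, pose.shape)  # torch.Size([1, 10]) torch.Size([1, 6])
```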
Procedia PDF Downloads 157
5872 Kýklos Dimensional Geometry: Entity Specific Core Measurement System
Authors: Steven D. P Moore
Abstract:
A novel method referred to as Kýklos (Ky) dimensional geometry is proposed as an entity-specific core geometric dimensional measurement system. Ky geometric measures can construct scaled multi-dimensional models using regular and irregular sets in IRn. This entity-specific, derived geometric measurement system shares similarities with fractal methods in which a ‘fractal transformation operator’ is applied to a set S to produce a union of N copies. The Kýklos inputs use 1D geometry as a core measure. One-dimensional inputs include the radius interval of a circle/sphere or the semiminor/semimajor axis intervals of an ellipse or spheroid. These geometric inputs have finite values that can be measured in SI distance units. The outputs for each interval are divided and subdivided 1D subcomponents whose union equals the interval geometry/length. Setting a limit on subdivision iterations creates a finite value for each 1D subcomponent. The uniqueness of this method is captured by allowing the simplest 1D inputs to define entity-specific subclass geometric core measurements that can also be used to derive length measures. Current methodologies for celestial-based measurement of time, as defined within SI units, fit within this methodology, thus combining spatial and temporal features into geometric core measures. The novel Ky method discussed here offers geometric measures to construct scaled multi-dimensional structures, even models. Ky classes proposed for consideration range from the celestial to the subatomic. The application of this offers incredible possibilities, for example, geometric architecture that can represent scaled celestial models incorporating planets (spheroids) and celestial motion (elliptical orbits).
Keywords: Kyklos, geometry, measurement, celestial, dimension
Procedia PDF Downloads 166
5871 Qualitative Measurement of Literacy
Authors: Indrajit Ghosh, Jaydip Roy
Abstract:
The literacy rate is an important indicator for the measurement of human development, but it is not a good one for capturing the qualitative dimension of the educational attainment of an individual or a society. The overall educational level of an area is an important issue beyond the literacy rate and can be thought of as an outcome of the educational levels of individuals. However, there is no well-defined algorithm or mathematical model available to measure the overall educational level of an area; a heuristic approach based on the accumulated experience of experts is an effective one. It is evident that fuzzy logic offers a natural and convenient framework for modeling various concepts in the social science domain. This work suggests the implementation of fuzzy logic to develop a mathematical model for measuring the educational attainment of an area in terms of an Education Index. The contribution of the study is twofold: the conceptualization of an “Education Profile” and the proposal of a new mathematical model to measure educational attainment in terms of an “Education Index”.
Keywords: education index, education profile, fuzzy logic, literacy
Procedia PDF Downloads 317
5870 Evaluation of Machine Learning Algorithms and Ensemble Methods for Prediction of Students’ Graduation
Authors: Soha A. Bahanshal, Vaibhav Verdhan, Bayong Kim
Abstract:
Graduation rates at six-year colleges are becoming a more essential indicator for incoming freshman students and for university rankings. Predicting student graduation is extremely beneficial to schools and has huge potential for targeted intervention. It is important for educational institutions since it enables the development of strategic plans that will assist or improve students' performance in achieving their degrees on time (GOT). Machine learning techniques offer a first step and a helping hand in extracting useful information from these data and gaining insights into the prediction of students' progress and performance. Data analysis and visualization techniques are applied to understand and interpret the data. The data used for the analysis contain science majors who graduated within six years as of the academic year 2017-2018, and this analysis can be used to predict the graduation of students in the next academic year. Different predictive models, such as logistic regression, decision trees, support vector machines, Random Forest, Naïve Bayes, and KNeighborsClassifier, are applied to predict whether a student will graduate. These classifiers were evaluated with 5-fold cross-validation, and their performance was compared based on accuracy. The results indicated that the Ensemble Classifier achieves better accuracy, about 91.12%. This GOT prediction model would hopefully be useful to university administration and academics in developing measures for assisting and boosting students' academic performance and ensuring they graduate on time.
Keywords: prediction, decision trees, machine learning, support vector machine, ensemble model, student graduation, GOT graduate on time
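A short scikit-learn sketch of the 5-fold evaluation of these classifiers follows; the synthetic feature matrix is a stand-in for the actual student data.

```python
# Sketch of comparing classifiers with 5-fold cross-validation, as described above.
# X (student features) and y (graduated-on-time labels) are stand-in synthetic data here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=12, random_state=0)

models = {
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "DecisionTree": DecisionTreeClassifier(),
    "SVM": SVC(),
    "RandomForest": RandomForestClassifier(),
    "NaiveBayes": GaussianNB(),
    "KNeighbors": KNeighborsClassifier(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```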
Procedia PDF Downloads 73
5869 On the Performance Analysis of Coexistence between IEEE 802.11g and IEEE 802.15.4 Networks
Authors: Chompunut Jantarasorn, Chutima Prommak
Abstract:
This paper presents an intensive measurement study of network performance when IEEE 802.11g Wireless Local Area Networks (WLAN) coexist with IEEE 802.15.4 Wireless Personal Area Networks (WPAN). The measurement results show that coexistence between the two networks can increase the Frame Error Rate (FER) of the IEEE 802.15.4 networks by up to 60% and decrease the throughput of the IEEE 802.11g networks by up to 55%.
Keywords: wireless performance analysis, coexistence analysis, IEEE 802.11g, IEEE 802.15.4
Procedia PDF Downloads 553
5868 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector
Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau
Abstract:
A performance measurement framework (PMF) is an essential tool in any organization to assess the performance of its processes. It guides businesses to stay on track with their objectives and to benchmark themselves against the market. With the growing trend of digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the conducted research, no such system has been developed in academia or in industry. In this context, this paper covers a variety of methodologies on performance measurement, overviews the major AI and Big Data applications in the banking sector, and compiles an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, drawn from market leaders, on the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure of intelligent digital solutions. The aforementioned classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application. This library is arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. The proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications to adopt. Furthermore, it will help them meet their business objectives through understanding the potential impact of such solutions on the entire organization.
Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement
Procedia PDF Downloads 198
5867 Offshore Wind Assessment and Analysis for South Western Mediterranean Sea
Authors: Abdallah Touaibia, Nachida Kasbadji Merzouk, Mustapha Merzouk, Ryma Belarbi
Abstract:
Accurate assessment and a better understanding of the wind resource distribution are the most important tasks for decision-making before installing wind energy systems in a given region; hence our interest in the Algerian coastline and its Mediterranean Sea area. Despite its long coastline on the Mediterranean Sea, Algeria still has no strategy encouraging the development of offshore wind farms in its waters. The present work aims to estimate the offshore wind fields of the Algerian Mediterranean Sea based on wind measurements from 1995 to 2018, i.e., 24 years of data provided by seven observation stations with measurement time steps of 30, 60, or 180 min depending on the station: two stations in Spain, two in Italy, and three on the Algerian coast, at Annaba in the east, Algiers in the center, and Oran in the west. The idea is to use multiple measurement points to characterize the area in terms of wind potential, interpolating the average wind speed values between the available stations to approximate values at locations where no measurements are available because of the difficulty of installing masts in deep water. This study is organized as follows: first, a brief description of the studied area and its climatic characteristics is given. After that, the statistical properties of the recorded data were checked by evaluating wind histograms, direction roses, and average speeds using MATLAB programs. Finally, ArcGIS and MapInfo software were used to establish offshore wind maps for better understanding of the wind resource distribution, as well as to identify windy sites for wind farm installation and power management. The study pointed out that Cap Carbonara is the windiest site, with an average wind speed of 7.26 m/s at 10 m, inducing a power density of 902 W/m², followed by the site of Cap Caccia with 4.88 m/s, inducing a power density of 282 W/m². An average wind speed of 4.83 m/s occurred at the site of Oran, inducing a power density of 230 W/m². The results also indicated that the dominant wind direction, where the frequencies are highest, is the west for Cap Carbonara, at 34%, with an average wind speed of 9.49 m/s and a power density of 1722 W/m². Then comes the site of Cap Caccia, where the prevailing wind direction is the north-west, at about 20%, with 5.82 m/s and a power density of 452 W/m². The site of Oran comes in third place, with the north as the dominant direction at 32%, inducing an average wind speed of 4.59 m/s and a power density of 189 W/m². It was also shown that the proposed method is both crucial for understanding the wind resource distribution and revealing windy sites over a large area, and effective for wind turbine micro-siting.
Keywords: wind resources, Mediterranean Sea, offshore, ArcGIS, MapInfo, wind maps, wind farms
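The power density figures quoted above can in principle be reproduced from a measured speed series; the short sketch below shows the basic calculation, with an assumed standard air density and hypothetical speed records rather than the station data.

```python
# Sketch of estimating mean wind power density from a measured speed series:
# P/A = 0.5 * rho * mean(v^3). Air density and the sample series are illustrative.
import numpy as np

def power_density(speeds_ms, rho=1.225):
    """Mean wind power density in W/m^2 from instantaneous speeds in m/s."""
    v = np.asarray(speeds_ms, dtype=float)
    return 0.5 * rho * np.mean(v ** 3)

speeds = np.array([3.2, 5.1, 7.8, 9.4, 6.0, 4.7, 8.3, 10.1])  # hypothetical 30-min records
print(f"Power density: {power_density(speeds):.0f} W/m^2")
```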
Procedia PDF Downloads 147
5866 Application of Rapid Eye Imagery in Crop Type Classification Using Vegetation Indices
Authors: Sunita Singh, Rajani Srivastava
Abstract:
Revolutionary remote sensing technology plays a significant role in natural resource management and other Earth observation applications. One such application is the monitoring and classification of crop types at spatial and temporal scales, as it provides the latest, most precise, and cost-effective information. The present study emphasizes the use of three different vegetation indices from RapidEye imagery for crop type classification and also analyzes the effect of each index on classification accuracy. RapidEye imagery is in high demand and preferred in the agricultural and forestry sectors as it has red-edge and NIR bands. The three indices used in this study were the Normalized Difference Vegetation Index (NDVI), the Green Normalized Difference Vegetation Index (GNDVI), and the Normalized Difference Red Edge Index (NDRE), and all of these incorporated the red-edge band. The study area is the Varanasi district of Uttar Pradesh, India, and a Radial Basis Function (RBF) kernel was used for the Support Vector Machine (SVM) classification. Classification was performed with these three vegetation indices, and the contribution of each index to image classification accuracy was also tested against single-band classification. The highest classification accuracy of 85% was obtained using the three vegetation indices. The study concluded that NDRE has the highest contribution to classification accuracy compared to the other vegetation indices and that RapidEye imagery can achieve satisfactory classification accuracy without the original bands.
Keywords: GNDVI, NDRE, NDVI, RapidEye, vegetation indices
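A short sketch of computing the three indices from RapidEye band arrays follows; the reflectance arrays are random placeholders for the actual rasters.

```python
# Sketch of the three vegetation indices computed from RapidEye bands.
# The band arrays are random placeholders for reflectance rasters.
import numpy as np

rng = np.random.default_rng(1)
green, red, red_edge, nir = (rng.uniform(0.05, 0.6, (100, 100)) for _ in range(4))

ndvi  = (nir - red) / (nir + red)            # Normalized Difference Vegetation Index
gndvi = (nir - green) / (nir + green)        # Green NDVI
ndre  = (nir - red_edge) / (nir + red_edge)  # Normalized Difference Red Edge Index

# The stacked indices can then feed an RBF-kernel SVM for per-pixel crop classification.
features = np.stack([ndvi, gndvi, ndre], axis=-1).reshape(-1, 3)
print(features.shape)  # (10000, 3)
```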
Procedia PDF Downloads 362
5865 The Accuracy of an In-House Developed Computer-Assisted Surgery Protocol for Mandibular Micro-Vascular Reconstruction
Authors: Christophe Spaas, Lies Pottel, Joke De Ceulaer, Johan Abeloos, Philippe Lamoral, Tom De Backer, Calix De Clercq
Abstract:
We aimed to evaluate the accuracy of an in-house developed, low-cost computer-assisted surgery (CAS) protocol for osseous free flap mandibular reconstruction. All patients who underwent primary or secondary mandibular reconstruction with a free (solely or composite) osseous flap, either a fibula free flap or an iliac crest free flap, between January 2014 and December 2017 were evaluated. The low-cost protocol consisted of virtual surgical planning, a pre-bent custom reconstruction plate, and an individualized free flap positioning guide. The accuracy of the protocol was evaluated by comparing the postoperative outcome with the 3D virtual planning, based on measurement of the following parameters: intercondylar distance, mandibular angle (axial and sagittal), inner angular distance, anterior-posterior distance, length of the fibular/iliac crest segments, and osteotomy angles. A statistical analysis of the obtained values was done. Virtual 3D surgical planning and cutting guide design were performed with Proplan CMF® software (Materialise, Leuven, Belgium) and IPS Gate (KLS Martin, Tuttlingen, Germany). Segmentation of the DICOM data as well as outcome analysis were done with BrainLab iPlan® Software (Brainlab AG, Feldkirchen, Germany). A cost analysis of the protocol was done. Twenty-two patients (11 fibula / 11 iliac crest) were included and analyzed. Based on voxel-based registration on the cranial base, 3D virtual planning landmark parameters did not significantly differ from those measured on the actual treatment outcome (p-values > 0.05). A cost evaluation of the in-house developed CAS protocol revealed a 1750 euro cost reduction in comparison with a standard CAS protocol with a patient-specific reconstruction plate. Our results indicate that an accurate transfer of the planning with our in-house developed low-cost CAS protocol is feasible at a significantly lower cost.
Keywords: CAD/CAM, computer-assisted surgery, low-cost, mandibular reconstruction
Procedia PDF Downloads 143
5864 Enhancing Fault Detection in Rotating Machinery Using Wiener-CNN Method
Authors: Mohamad R. Moshtagh, Ahmad Bagheri
Abstract:
Accurate fault detection in rotating machinery is of utmost importance to ensure optimal performance and prevent costly downtime in industrial applications. This study presents a robust fault detection system based on vibration data collected from rotating gears under various operating conditions. The considered scenarios include: (1) both gears being healthy, (2) one healthy gear and one faulty gear, and (3) introducing an imbalanced condition to a healthy gear. Vibration data was acquired using a Hentek 1008 device and stored in a CSV file. Python code implemented in the Spyder environment was used for data preprocessing and analysis. Wiener features were extracted using the Wiener feature selection method. These features were then employed in multiple machine learning algorithms, including Convolutional Neural Networks (CNN), Multilayer Perceptron (MLP), K-Nearest Neighbors (KNN), and Random Forest, to evaluate their performance in detecting and classifying faults in both the training and validation datasets. The comparative analysis of the methods revealed the superior performance of the Wiener-CNN approach. The Wiener-CNN method achieved a remarkable accuracy of 100% for both the two-class (healthy gear and faulty gear) and three-class (healthy gear, faulty gear, and imbalanced) scenarios in the training and validation datasets. In contrast, the other methods exhibited varying levels of accuracy. The Wiener-MLP method attained 100% accuracy for the two-class training dataset and 100% for the validation dataset. For the three-class scenario, the Wiener-MLP method demonstrated 100% accuracy in the training dataset and 95.3% accuracy in the validation dataset. The Wiener-KNN method yielded 96.3% accuracy for the two-class training dataset and 94.5% for the validation dataset. In the three-class scenario, it achieved 85.3% accuracy in the training dataset and 77.2% in the validation dataset. The Wiener-Random Forest method achieved 100% accuracy for the two-class training dataset and 85% for the validation dataset, while in the three-class scenario it attained 100% accuracy in the training dataset and 90.8% accuracy in the validation dataset. The exceptional accuracy demonstrated by the Wiener-CNN method underscores its effectiveness in accurately identifying and classifying fault conditions in rotating machinery. The proposed fault detection system utilizes vibration data analysis and advanced machine learning techniques to improve operational reliability and productivity. By adopting the Wiener-CNN method, industrial systems can benefit from enhanced fault detection capabilities, facilitating proactive maintenance and reducing equipment downtime.
Keywords: fault detection, gearbox, machine learning, Wiener method
Procedia PDF Downloads 81
5863 Airport Pavement Crack Measurement Systems and Crack Density for Pavement Evaluation
Authors: Ali Ashtiani, Hamid Shirazi
Abstract:
This paper reviews the status of existing practice and research related to measuring pavement cracking and using crack density as a pavement surface evaluation protocol. Crack density for pavement evaluation is currently not widely used within the airport community, and its use by the highway community is limited. However, surface cracking is a distress that is closely monitored by airport staff and significantly influences the development of maintenance, rehabilitation and reconstruction plans for airport pavements. Therefore, crack density has the potential to become an important indicator of pavement condition if the type, severity and extent of surface cracking can be accurately measured. A pavement distress survey is an essential component of any pavement assessment. Manual crack surveying has been widely used for decades to measure pavement performance. However, the accuracy and precision of manual surveys can vary depending upon the surveyor, and performing surveys may disrupt normal operations. Given the variability of manual surveys, this method has shown inconsistencies in distress classification and measurement. This can potentially impact the planning for pavement maintenance, rehabilitation and reconstruction and the associated funding strategies. A substantial effort has been devoted over the past 20 years to reducing human intervention and the error associated with it by moving toward automated distress collection methods. The automated methods refer to systems that identify, classify and quantify pavement distresses through processes that require no or very minimal human intervention. This principally involves the use of digital recognition software to analyze and characterize pavement distresses. The lack of established protocols for measurement and classification of pavement cracks captured using digital images is a challenge to developing a reliable automated system for distress assessment. Variations in the types and severity of distresses, different pavement surface textures and colors, and the presence of pavement joints and edges all complicate automated image processing and crack measurement and classification. This paper summarizes the commercially available systems and technologies for automated pavement distress evaluation. A comprehensive automated pavement distress survey involves collection, interpretation, and processing of the surface images to identify the type, quantity and severity of the surface distresses. The outputs can be used to quantitatively calculate the crack density. The systems for automated distress survey using digital images reviewed in this paper can assist the airport industry in the development of a pavement evaluation protocol based on crack density. Analysis of automated distress survey data can lead to a crack density index. This index can be used as a means of assessing pavement condition and of predicting pavement performance, and it can help airport owners determine the type of pavement maintenance and rehabilitation in a more consistent way.
Keywords: airport pavement management, crack density, pavement evaluation, pavement management
Procedia PDF Downloads 185
5862 Low-Cost IoT System for Monitoring Ground Propagation Waves due to Construction and Traffic Activities to Nearby Construction
Authors: Lan Nguyen, Kien Le Tan, Bao Nguyen Pham Gia
Abstract:
Due to their high cost, specialized dynamic measurement devices for industrial sites are difficult for many colleges to acquire for hands-on teaching. This study connects a dynamic measurement sensor and receiver using an inexpensive Raspberry Pi 4 board, 24-bit ADC circuits, a geophone vibration sensor, and embedded open-source Python programming. The system gathers and analyzes signals for dynamic measurement, ground vibration monitoring, and structural vibration monitoring. It can wirelessly communicate data to a computer and can be set up as a network of communication nodes, enabling real-time monitoring of background vibrations at various locations. The device can be utilized for a variety of dynamic measurement and monitoring tasks, including monitoring earthquake vibrations, ground vibrations from construction operations and traffic, and vibrations of building structures.
Keywords: sensors, FFT, signal processing, real-time data monitoring, ground propagation wave, Python, Raspberry Pi 4
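A sketch of the on-node processing chain (ADC sampling followed by an FFT of the geophone signal) is shown below; the sampling rate and the read_adc_block() driver call are hypothetical stand-ins, not the system's actual interface.

```python
# Sketch of the on-node processing chain: sample the geophone through the 24-bit ADC,
# then inspect the vibration spectrum with an FFT. read_adc_block() is a hypothetical
# stand-in for the board-specific driver call.
import numpy as np

FS = 500  # assumed sampling rate in Hz

def read_adc_block(n_samples):
    # Placeholder: in the real system this would poll the 24-bit ADC on the Raspberry Pi.
    t = np.arange(n_samples) / FS
    return np.sin(2 * np.pi * 25 * t) + 0.2 * np.random.randn(n_samples)

samples = read_adc_block(2048)
spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
freqs = np.fft.rfftfreq(len(samples), d=1 / FS)
print(f"Dominant vibration frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")
```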
Procedia PDF Downloads 103
5861 Optimization of Hate Speech and Abusive Language Detection on Indonesian-language Twitter using Genetic Algorithms
Authors: Rikson Gultom
Abstract:
Hate speech and abusive language on social media are difficult to detect; usually, they are detected only after becoming viral in cyberspace, which is of course too late for prevention. An early detection system with fairly good accuracy is needed to reduce the societal conflicts caused by social media posts that attack individuals, groups, and the government in Indonesia. The purpose of this study is to find, from among several machine learning methods studied, an early detection model for the Twitter social media platform with high accuracy. In this study, the Support Vector Machine (SVM), Naïve Bayes (NB), and Random Forest Decision Tree (RFDT) methods were compared with the Support Vector Machine with genetic algorithm (SVM-GA), Naïve Bayes with genetic algorithm (NB-GA), and Random Forest Decision Tree with genetic algorithm (RFDT-GA). The study produced a comparison of the accuracy of the hate speech and abusive language detection models, presented as a graph of the accuracy of the six algorithms developed on the Indonesian-language Twitter dataset, and identified the best model with the highest accuracy.
Keywords: abusive language, hate speech, machine learning, optimization, social media
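A compact sketch of a genetic algorithm tuning SVM hyperparameters by cross-validated accuracy follows; the population settings, the selection-and-mutation-only scheme, and the stand-in data are assumptions, not the study's exact configuration (in the study, the features would come from vectorized Indonesian-language tweets).

```python
# Compact sketch of a genetic algorithm tuning SVM hyperparameters (C, gamma),
# scored by cross-validated accuracy. Selection and mutation only; crossover is omitted.
import random
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=20, random_state=0)  # stand-in data

def fitness(genes):
    c, gamma = genes
    return cross_val_score(SVC(C=c, gamma=gamma), X, y, cv=3).mean()

def mutate(genes):
    return [max(1e-4, g * random.uniform(0.5, 2.0)) for g in genes]

population = [[random.uniform(0.1, 10), random.uniform(1e-3, 1)] for _ in range(8)]
for generation in range(5):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:4]                                            # selection
    children = [mutate(random.choice(parents)) for _ in range(4)]   # mutation
    population = parents + children

best = max(population, key=fitness)
print("Best (C, gamma):", best, "cross-validated accuracy:", round(fitness(best), 3))
```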
Procedia PDF Downloads 129
5860 Intra and International Collaborations as Important Factors of Organisational Innovation of Government Agencies in STI Ecosystem in ASEAN
Authors: Salinthip Thipayang, Achara Chandrachai, Rath Pichyangkura, Sukree Sinthupinyo
Abstract:
Most of the well-known frameworks and tools to measure and compare the organisational innovation of public or government agencies have been designed and used in developed economies such as the EU, the Nordic region, Australia, and South Korea. This project is one of the very first attempts to develop a measurement tool to adequately measure the organisational (administrative) innovation of government agencies in the developing economies of ASEAN. Components covering the intra and international collaborations of these government agencies with the private, public, and academic sectors were added to the proposed measurement framework. Questionnaires and in-depth interviews with experts and with the middle to top executives of the participating public agencies in the ASEAN member states were conducted to determine the suitability of, and to develop, the indicators that should be included in the measurement model. The results showed that the intra and international collaborations of these government organisations with other agencies in the public, private, and academic sectors can lead to new changes and greatly impact the ways in which government agencies in the ASEAN STI ecosystem are operated and administered. Government organisations in the less developed countries of ASEAN are ready and willing to learn from their counterparts in other more advanced countries and to adjust their internal management to be more innovative and to better handle international collaborative projects and commitments.
Keywords: organisational innovation, administrative innovation, government agencies, public agencies, ASEAN science technology and innovation ecosystem, international collaborations
Procedia PDF Downloads 386
5859 Applying Multiplicative Weight Update to Skin Cancer Classifiers
Authors: Animish Jain
Abstract:
This study deals with using Multiplicative Weight Update within artificial intelligence and machine learning to create models that can diagnose skin cancer from microscopic images of cancer samples. The multiplicative weight update method is used to combine the predictions of multiple models in an attempt to obtain more accurate results. Logistic Regression, Convolutional Neural Network (CNN), and Support Vector Machine Classifier (SVMC) models are employed within the Multiplicative Weight Update system. These models are trained on pictures of skin cancer from the ISIC-Archive to look for patterns that label unseen scans as either benign or malignant. The models are then used in a multiplicative weight update algorithm that takes into account the precision and accuracy of each model on each successive guess in order to weight its predictions, and the guesses and weights are analyzed together to try to obtain the correct predictions. The research hypothesis stated that there would be a significant difference between the accuracy of the three models and that of the Multiplicative Weight Update system. The SVMC model had an accuracy of 77.88%, the CNN model 85.30%, and the Logistic Regression model 79.09%; using Multiplicative Weight Update, the algorithm reached an accuracy of 72.27%. The final conclusion drawn was that there was a significant difference between the accuracy of the three models and that of the Multiplicative Weight Update system, and that a CNN model would be the best option for this problem rather than a Multiplicative Weight Update system. This is due to the possibility that Multiplicative Weight Update is not effective in a binary setting where there are only two possible classifications. In a categorical setting with multiple classes and groupings, a Multiplicative Weight Update system might become more proficient, as it takes into account the strengths of multiple different models to classify images into multiple categories rather than only two, as shown in this study. This experimentation and computer science project can help to create better algorithms and models for the future of artificial intelligence in the medical imaging field.
Keywords: artificial intelligence, machine learning, multiplicative weight update, skin cancer
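A short sketch of the multiplicative weights update rule used to combine several classifiers follows; the learning rate and the toy prediction streams are illustrative assumptions, not the study's trained models.

```python
# Sketch of the Multiplicative Weight Update rule combining three experts.
# Each expert's weight is multiplied by (1 - eta) whenever it predicts the wrong label.
# The toy prediction streams and eta are illustrative assumptions.
import numpy as np

eta = 0.3
experts = ["LogisticRegression", "CNN", "SVMC"]
weights = np.ones(len(experts))

# rows = samples, columns = each expert's predicted label (1 = malignant, 0 = benign)
predictions = np.array([[1, 1, 0], [0, 1, 0], [1, 1, 1], [0, 0, 1]])
truth = np.array([1, 1, 1, 0])

for preds, label in zip(predictions, truth):
    combined = 1 if np.dot(weights, preds) >= weights.sum() / 2 else 0  # weighted majority vote
    weights = np.where(preds == label, weights, weights * (1 - eta))    # penalize mistakes
    print(f"combined={combined}, true={label}, weights={np.round(weights, 2)}")
```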
Procedia PDF Downloads 80
5858 Measurement Technologies for Advanced Characterization of Magnetic Materials Used in Electric Drives and Automotive Applications
Authors: Lukasz Mierczak, Patrick Denke, Piotr Klimczyk, Stefan Siebert
Abstract:
Due to the high complexity of the magnetization in electrical machines and the influence of manufacturing processes on the magnetic properties of their components, the assessment and prediction of hysteresis and eddy current losses have remained a challenge. In the design process of electric motors and generators, the power losses of stators and rotors are calculated based on the material supplier’s data from standard magnetic measurements. This type of data includes neither the additional loss from non-sinusoidal multi-harmonic motor excitation nor the detrimental effects of residual stress remaining in the motor laminations after manufacturing processes such as punching, housing shrink fitting, and winding. Moreover, in production, considerable attention is given to measurements of the mechanical dimensions of stator and rotor cores, whereas verification of their magnetic properties is typically neglected, which can lead to inconsistent efficiency of assembled motors. Therefore, to enable a comprehensive characterization of motor materials and components, Brockhaus Measurements developed a range of in-line and offline measurement technologies for testing their magnetic properties under actual motor operating conditions. Multiple sets of experimental data were obtained to evaluate the influence of various factors, such as elevated temperature, applied and residual stress, and arbitrary magnetization, on the magnetic properties of different grades of non-oriented steel. Measured power loss for tested samples and stator cores varied significantly, by more than 100%, compared to standard measurement conditions. The quantitative effects of each of the applied measurement conditions were analyzed. This research and the applied Brockhaus measurement methodologies emphasize the requirement for advanced characterization of magnetic materials used in electric drives and automotive applications.
Keywords: magnetic materials, measurement technologies, permanent magnets, stator and rotor cores
Procedia PDF Downloads 141
5857 Assessment of Arterial Stiffness through Measurement of Magnetic Flux Disturbance and Electrocardiogram Signal
Authors: Jing Niu, Jun X. Wang
Abstract:
Arterial stiffness predicts mortality and morbidity independently of other cardiovascular risk factors, and it is a major risk factor for age-related morbidity and mortality. The non-invasive industry gold standard measurement system for arterial stiffness utilizes the pulse wave velocity method. However, the desktop device is expensive and requires a trained professional to operate. The main objective of this research is a proof of concept of the proposed non-invasive method, which uses measurement of magnetic flux disturbance and the electrocardiogram (ECG) signal to measure arterial stiffness. The method could enable accurate and easy self-assessment of arterial stiffness at home and could help doctors in research, diagnosis, and prescription in hospitals and clinics. A platform for assessing arterial stiffness through acquisition and analysis of the radial artery pulse waveform and the ECG signal has been developed based on the proposed method. The radial artery pulse waveform is acquired using magnetic-based sensing technology, while the ECG signal is acquired using two dry-contact single-arm ECG electrodes. The measurement only requires the participant to wear a wrist strap and an arm band. Participants were recruited for data collection using both the developed platform and the industry gold standard system, and the results from both systems underwent correlation assessment analysis. A strong positive correlation between the results of the two systems was observed. This study presents the possibility of developing an accurate, easy to use, and affordable measurement device for arterial stiffness assessment.
Keywords: arterial stiffness, electrocardiogram, pulse wave velocity, magnetic flux disturbance
Procedia PDF Downloads 188
5856 Nonlinear Power Measurement Algorithm of the Input Mix Components of the Noise Signal and Pulse Interference
Authors: Alexey V. Klyuev, Valery P. Samarin, Viktor F. Klyuev, Andrey V. Klyuev
Abstract:
A power measurement algorithm for the input mix components of a noise signal and pulse interference is considered. The efficiency of the algorithm has been analyzed for different interference-to-signal ratios, and its performance features have been explored through numerical experiments.
Keywords: noise signal, pulse interference, signal power, spectrum width, detection
Procedia PDF Downloads 337
5855 Bone Fracture Detection with X-Ray Images Using Mobilenet V3 Architecture
Authors: Ashlesha Khanapure, Harsh Kashyap, Abhinav Anand, Sanjana Habib, Anupama Bidargaddi
Abstract:
Rapidly evolving technologies are being developed daily in a variety of disciplines, particularly in the medical field. For the purpose of detecting bone fractures in X-ray images of different body segments, our work compares the ResNet-50 and MobileNetV3 architectures. It evaluates accuracy and computational efficiency with X-rays of the elbow, hand, and shoulder from the MURA dataset. Through training and validation, the models are evaluated on normal and fractured images. While ResNet-50 showcases superior accuracy in fracture identification, MobileNetV3 offers superior speed and resource optimization. Despite ResNet-50’s accuracy, MobileNetV3’s swifter inference makes it a viable choice for real-time clinical applications, emphasizing the importance of balancing computational efficiency and accuracy in medical imaging. We also created a graphical user interface (GUI) for bone fracture detection with the MobileNetV3 model. This research underscores MobileNetV3’s potential to streamline bone fracture diagnoses, potentially revolutionizing orthopedic medical procedures and enhancing patient care.
Keywords: CNN, MobileNet V3, ResNet-50, healthcare, MURA, X-ray, fracture detection
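A minimal Keras sketch of a MobileNetV3-based fracture classifier follows; the input size, classification head, and training details are assumptions rather than the study's exact setup.

```python
# Sketch of a MobileNetV3-based binary fracture classifier (normal vs. fractured).
# The classification head and training details are assumptions, not the study's exact setup.
import tensorflow as tf

base = tf.keras.applications.MobileNetV3Large(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # start with frozen ImageNet features

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds: MURA X-ray datasets
```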
Procedia PDF Downloads 68
5854 Organizational Resilience in the Perspective of Supply Chain Risk Management: A Scholarly Network Analysis
Authors: William Ho, Agus Wicaksana
Abstract:
Anecdotal evidence from the last decade shows that the occurrence of disruptive events and uncertainties in the supply chain is increasing. The coupling of these events with the nature of an increasingly complex and interdependent business environment leads to devastating impacts that quickly propagate within and across organizations. For example, the recent COVID-19 pandemic increased the global supply chain disruption frequency by at least 20% in 2020 and is projected to have a cumulative cost of $13.8 trillion by 2024. This crisis draws attention to organizational resilience as a way to weather business uncertainty. However, the concept has been criticized for being vague and lacking a consistent definition, thus reducing its significance for practice and research. This study is intended to address that issue by providing a comprehensive review of the conceptualization, measurement, and antecedents of operational resilience discussed in the supply chain risk management (SCRM) literature. We performed a hybrid Scholarly Network Analysis, combining citation-based and text-based approaches, on 252 articles published from 2000 to 2021 in top-tier journals selected on three parameters: AJG ranking and ABS ranking, the UT Dallas and FT50 lists, and editorial board review. Specifically, we employed a Bibliographic Coupling Analysis in the research cluster formation stage and a Co-words Analysis in the research cluster interpretation and analysis stage. Our analysis reveals three major research clusters of resilience research in the SCRM literature, namely (1) supply chain network design and optimization, (2) organizational capabilities, and (3) digital technologies. We portray the research process over the last two decades in terms of the exemplar studies, problems studied, commonly used approaches and theories, and solutions provided in each cluster. We then provide a conceptual framework on the conceptualization and antecedents of resilience based on studies in these clusters and highlight potential areas that need to be studied further. Finally, we leverage the concept of abnormal operating performance to propose a new measurement strategy for resilience. This measurement overcomes the limitation of most current measurements, which are event-dependent and focus on the resistance or recovery stage without capturing the growth stage. In conclusion, this study provides a robust literature review through a scholarly network analysis that increases the completeness and accuracy of research cluster identification and analysis for understanding the conceptualization, antecedents, and measurement of resilience. It also enables a comprehensive review of resilience research in the SCRM literature that includes articles published during the pandemic and connects this development with the plethora of articles published over the last two decades. From the managerial perspective, this study provides practitioners with clarity on the conceptualization and critical success factors of firm resilience from the SCRM perspective.
Keywords: supply chain risk management, organizational resilience, scholarly network analysis, systematic literature review
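A short sketch of the bibliographic coupling step (articles grouped by shared references) follows; the reference sets are illustrative, not drawn from the 252-article corpus.

```python
# Sketch of bibliographic coupling: two articles are coupled when their reference lists
# overlap, and the coupling strength is the size of that overlap. Reference sets are toy data.
references = {
    "Article_A": {"r1", "r2", "r3", "r5"},
    "Article_B": {"r2", "r3", "r7"},
    "Article_C": {"r6", "r8"},
}

def coupling_strength(p, q):
    return len(references[p] & references[q])

papers = sorted(references)
for i, p in enumerate(papers):
    for q in papers[i + 1:]:
        print(p, q, coupling_strength(p, q))
# Articles with high coupling strength are grouped into the same research cluster.
```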
Procedia PDF Downloads 74