Search results for: machine failures
2074 Modeling Aeration of Sharp Crested Weirs by Using Support Vector Machines
Authors: Arun Goel
Abstract:
The present paper investigates the prediction of the air entrainment rate and aeration efficiency of free over-fall jets issuing from a triangular sharp-crested weir by using regression-based modelling. Empirical equations, support vector machine (polynomial and radial basis function) models, and linear regression techniques were applied to triangular sharp-crested weirs, relating the air entrainment rate and the aeration efficiency to the input parameters, namely drop height, discharge, and vertex angle. A good agreement was observed between the measured values and the values obtained using the empirical equations, the support vector machine (polynomial and RBF) models, and the linear regression techniques. The test results demonstrated that the SVM-based (polynomial and RBF) models provided acceptable predictions of the measured values with reasonable accuracy, alongside the empirical equations and linear regression techniques, in modelling the air entrainment rate and the aeration efficiency of free over-fall jets issuing from a triangular sharp-crested weir. Further, a sensitivity analysis was performed to study the impact of each input parameter on the output in terms of air entrainment rate and aeration efficiency.
Keywords: air entrainment rate, dissolved oxygen, weir, SVM, regression
Procedia PDF Downloads 436
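As a hedged illustration of the modelling approach described above, the sketch below fits polynomial- and RBF-kernel support vector regressors to the three named inputs (drop height, discharge, vertex angle). The data are synthetic placeholders, not the authors' weir measurements, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
# Synthetic stand-in data: [drop height (m), discharge (l/s), vertex angle (deg)]
X = rng.uniform([0.2, 1.0, 30], [1.0, 10.0, 120], size=(80, 3))
# Hypothetical target: aeration efficiency rising with drop height and discharge
y = 0.4 * X[:, 0] + 0.03 * X[:, 1] + 0.001 * X[:, 2] + rng.normal(0, 0.02, 80)

for name, kernel_kwargs in [("polynomial", dict(kernel="poly", degree=2)),
                            ("rbf", dict(kernel="rbf"))]:
    model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.01, **kernel_kwargs))
    model.fit(X[:60], y[:60])                     # train on the first 60 samples
    print(name, "R2:", r2_score(y[60:], model.predict(X[60:])))
```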
2073 Alphabet Recognition Using Pixel Probability Distribution
Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay
Abstract:
Our project topic is “Alphabet Recognition Using Pixel Probability Distribution”. The project uses techniques of image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten, or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR is sometimes used for signature recognition in banks and other high-security buildings. One popular mobile application reads a visiting card and stores it directly to the contacts. OCR is also used in radar systems for reading the license plates of speeding vehicles, among many other applications. Our project was implemented using Visual Studio and OpenCV (Open Source Computer Vision). The algorithm is based on neural networks (machine learning). The project was implemented in three modules: (1) Training: this module handles database generation. The database was generated using two methods: (a) run-time generation, in which the database is generated at compilation time using the inbuilt fonts of the OpenCV library; human intervention is not necessary for generating this database; (b) contour detection, in which a ‘jpeg’ template containing different fonts of an alphabet is converted to a weighted matrix using specialized functions (contour detection and blob detection) of OpenCV. The main advantage of this type of database generation is that the algorithm becomes self-learning and the final database requires little memory to store (119 kB, precisely). (2) Preprocessing: the input image is pre-processed using image-processing techniques such as adaptive thresholding, binarizing, and dilating, and is made ready for segmentation. Segmentation includes the extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: the extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes an alphabet based on mathematical parameters calculated using the database and the weight matrix of the segmented image.
Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix
Procedia PDF Downloads 389
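The preprocessing step in module (2) can be sketched with standard OpenCV calls. This is a minimal illustration of adaptive thresholding, dilation, and contour-based letter extraction; the input filename and kernel sizes are placeholders, not the authors' settings.

```python
import cv2
import numpy as np

img = cv2.imread("page.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input image

# Adaptive thresholding binarizes the page despite uneven illumination
binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY_INV, 31, 10)

# Dilation thickens strokes so broken glyphs merge into single blobs
kernel = np.ones((3, 3), np.uint8)
dilated = cv2.dilate(binary, kernel, iterations=1)

# Contour detection yields one bounding box per candidate letter
contours, _ = cv2.findContours(dilated, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
letters = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 20]
# Sort left-to-right so letters reach the classifier in reading order
letters.sort(key=lambda box: box[0])
print(f"extracted {len(letters)} candidate letters")
```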
2072 A Novel Stress Instability Workability Criteria for Internal Ductile Failure in Steel Cold Heading Process
Authors: Amar Sabih, James Nemes
Abstract:
The occurrence of internal ductile failure within the adiabatic shear band (ASB) in cold-headed products presents a significant barrier to the fast-expanding cold-heading (CH) industry. The presence of internal ductile failure in cold-headed products may lead to catastrophic fracture under tensile loads despite the ductile nature of the material, causing expensive industrial recalls. Therefore, this paper presents a workability criterion that uses stress instability as an indicator to accurately reveal the locus of initiation of internal ductile failures. The concept of the instability criterion is to use the stress ratio at failure as a weighting function to indicate the initiation of ductile failure inside the ASBs. This paper presents a comprehensive experimental, metallurgical, and finite element simulation study to calculate the material constants used in this criterion.
Keywords: adiabatic shear band, workability criterion, ductile failure, stress instability
Procedia PDF Downloads 90
2071 Hardware in the Loop Platform for Virtual Commissioning: Case Study of a Hydraulic-Press Model Simulated in Real-Time
Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Ana Maria Macarulla
Abstract:
Hydraulic-press commissioning consumes a great number of man-hours, because it takes place several miles away from where the press has been designed. This is exacerbated by control designers' lack of knowledge about the final controller gains before they start working with the machine. Virtual commissioning has been postulated as an optimal solution to deal with this lack of knowledge. Here, a case study is presented in which a controller is set up against a real-time model of a hydraulic press. The press model is designed following manufacturer specifications and embedded in a real-time simulator. This methodology ensures that the model achieves responses similar to those of the real machine that will be placed in the industry. A deterministic communication protocol is in charge of the bidirectional information transmission between the real-time model and the controller. This platform allows the engineer to test and verify the final control responses with exactly the same hardware that is going to be installed in the hydraulic press, in other words, to realize a virtual commissioning of the electro-hydraulic actuator. The Hardware in the Loop (HiL) platform validates the designed control algorithms in laboratory conditions, without risk to the machine, which allows embedding them afterwards in the industrial environment without further modifications.
Keywords: deterministic communication protocol, electro-hydraulic actuator, hardware in the loop, real-time, virtual commissioning
Procedia PDF Downloads 142
2070 Identifying Autism Spectrum Disorder Using Optimization-Based Clustering
Authors: Sharifah Mousli, Sona Taheri, Jiayuan He
Abstract:
Autism spectrum disorder (ASD) is a complex developmental condition involving persistent difficulties with social communication, restricted interests, and repetitive behavior. The challenges associated with ASD can interfere with an affected individual's ability to function in social, academic, and employment settings. Although, to the best of our knowledge, there is no effective medication to treat ASD, early intervention can significantly improve an affected individual's overall development. Hence, an accurate diagnosis of ASD at an early phase is essential. The use of machine learning approaches improves and speeds up the diagnosis of ASD. In this paper, we focus on the application of unsupervised clustering methods to ASD, as the large volume of ASD data generated through hospitals, therapy centers, and mobile applications has no pre-existing labels. We conduct a comparative analysis of seven established clustering approaches (K-means, agglomerative hierarchical, model-based, fuzzy C-means, affinity propagation, self-organizing maps, and linear vector quantisation) as well as the recently developed optimization-based clustering (COMSEP-Clust) approach. We evaluate the performance of the clustering methods extensively on real-world ASD datasets encompassing different age groups: toddlers, children, adolescents, and adults. Our experimental results suggest that the COMSEP-Clust approach outperforms the other seven methods in recognizing ASD with well-separated clusters.
Keywords: autism spectrum disorder, clustering, optimization, unsupervised machine learning
Procedia PDF Downloads 115
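A minimal sketch of the kind of comparison described above, assuming unlabeled tabular screening data: three of the seven baseline methods are shown, scored by silhouette since no labels exist. COMSEP-Clust itself is not shown here, as it is not available as a standard package.

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
# Synthetic stand-in for unlabeled ASD screening features
X = StandardScaler().fit_transform(rng.normal(size=(200, 10)))

models = {
    "k-means": KMeans(n_clusters=2, n_init=10, random_state=1),
    "agglomerative": AgglomerativeClustering(n_clusters=2),
    "model-based (GMM)": GaussianMixture(n_components=2, random_state=1),
}
for name, model in models.items():
    labels = model.fit_predict(X)                # unsupervised cluster assignment
    print(f"{name}: silhouette = {silhouette_score(X, labels):.3f}")
```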
2069 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring
Authors: Hyun-Woo Cho
Abstract:
Detection of incipient abnormal events in production processes is important to improve the safety and reliability of manufacturing operations and to reduce losses caused by failures. The construction of calibration models for predicting faulty conditions is essential in making decisions on when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. The approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that the calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, prediction performance can be improved by excluding non-informative variables from the model-building steps.
Keywords: calibration model, monitoring, quality improvement, feature selection
Procedia PDF Downloads 356
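The abstract does not name the specific variable selection scheme or probabilistic model, so the sketch below is a hedged stand-in: mutual-information ranking drops non-informative variables before a Gaussian naive Bayes calibration model is fitted on synthetic process data.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 20))                 # 20 process measurements
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # fault label driven by 2 variables

# Keep only the most informative variables before fitting the calibration model
model = make_pipeline(SelectKBest(mutual_info_classif, k=5), GaussianNB())
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```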
2068 Charting Sentiments with Naive Bayes and Logistic Regression
Authors: Jummalla Aashrith, N. L. Shiva Sai, K. Bhavya Sri
Abstract:
The swift progress of web technology has not only amassed a vast reservoir of internet data but has also triggered a substantial surge in data generation. The internet has metamorphosed into a dynamic hub for online education, idea dissemination, and opinion-sharing. Notably, the widely used social networking platform Twitter is expanding considerably, providing users with the ability to share viewpoints, participate in discussions spanning diverse communities, and broadcast messages on a global scale. The upswing in online engagement has sparked significant interest in subjective analysis, particularly of Twitter data. This research is committed to sentiment analysis, focusing specifically on Twitter. It aims to offer valuable insights into deciphering information within tweets, where opinions manifest in a highly unstructured and diverse manner, spanning a spectrum from positive to negative and occasionally punctuated by neutral expressions. Within this document, we offer a comprehensive exploration and comparative assessment of modern approaches to opinion mining. Employing machine learning algorithms such as Naive Bayes and Logistic Regression, our investigation examines Twitter data streams. We also discuss overarching challenges and applications inherent in subjectivity analysis over Twitter.
Keywords: machine learning, sentiment analysis, visualisation, python
Procedia PDF Downloads 56
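Since the keywords name Python, the sketch below shows the two classifiers compared in the paper on a toy tweet corpus. The example tweets and TF-IDF settings are illustrative assumptions, not the authors' dataset or configuration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tweets = ["love this phone", "worst service ever", "great match today",
          "so disappointed", "amazing experience", "terrible update"]
labels = [1, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative

for clf in (MultinomialNB(), LogisticRegression(max_iter=1000)):
    # TF-IDF over unigrams and bigrams feeds each classifier
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
    model.fit(tweets, labels)
    print(type(clf).__name__, model.predict(["this update is great"]))
```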
2067 Dynamic Cellular Remanufacturing System (DCRS) Design
Authors: Tariq Aljuneidi, Akif Asil Bulgak
Abstract:
Remanufacturing may be defined as the process of bringing used products to a “like-new” functional state with a warranty to match, and it is one of the most popular product end-of-life scenarios. An efficient remanufacturing network leads to an efficient design of a sustainable manufacturing enterprise. In a remanufacturing network, products are collected from the customer zone, disassembled, and remanufactured at a suitable remanufacturing facility. In this respect, another issue to consider is how the returned product is to be remanufactured, in other words, what the best layout for such a facility is. In order to achieve a sustainable manufacturing system, Cellular Manufacturing System (CMS) designs are highly recommended; CMSs combine the high throughput rates of line layouts with the flexibility offered by functional layouts (job shops). Introducing the CMS while designing a remanufacturing network will benefit the utilization of such a network. This paper presents and analyzes a comprehensive mathematical model for the design of Dynamic Cellular Remanufacturing Systems (DCRSs). The proposed model is, to date, the first to consider the CMS and the remanufacturing system simultaneously. The proposed DCRS model considers several manufacturing attributes such as multi-period production planning, dynamic system reconfiguration, duplicate machines, machine capacity, available time for workers, worker assignments, and machine procurement, where the demand is totally satisfied from returned products. A numerical example is presented to illustrate the proposed model.
Keywords: cellular manufacturing system, remanufacturing, mathematical programming, sustainability
Procedia PDF Downloads 378
2066 Performance Analysis of 5G for Low Latency Transmission Based on Universal Filtered Multi-Carrier Technique and Interleave Division Multiple Access
Authors: A. Asgharzadeh, M. Maroufi
Abstract:
The 5G mobile communication system has drawn more and more attention. The 5G system needs to provide three different types of services: enhanced Mobile BroadBand (eMBB), massive machine-type communication (mMTC), and ultra-reliable and low-latency communication (URLLC). Universal Filtered Multi-Carrier (UFMC), Filter Bank Multicarrier (FBMC), and Filtered Orthogonal Frequency Division Multiplexing (f-OFDM) are well-known candidate waveforms for the coming 5G system. Machine-to-machine (M2M) communications are one of the essential applications in 5G, and they involve the exchange of concise messages with very short latency. In UFMC systems, the subcarriers are grouped into subbands, whereas in f-OFDM a single subband covers the entire band. Furthermore, in FBMC, a subband includes only one subcarrier, so the number of subbands equals the number of subcarriers. This paper mainly discusses the performance of UFMC with different parameters. The paper also shows that UFMC is the best choice, outperforming OFDM in all cases and FBMC in the case of very short packets, while performing similarly for long sequences, with channel estimation techniques, for Interleave Division Multiple Access (IDMA) systems.
Keywords: universal filtered multi-carrier technique, UFMC, interleave division multiple access, IDMA, fifth-generation, subband
Procedia PDF Downloads 134
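A minimal numpy sketch of a UFMC transmitter as described above: the band is split into subbands, each subband's subcarriers are IFFT-modulated, and each subband is shaped with its own FIR filter (a frequency-shifted Dolph-Chebyshev window here). The FFT size, subband width, and filter length are illustrative assumptions, not the paper's parameters.

```python
import numpy as np
from scipy.signal import lfilter
from scipy.signal.windows import chebwin

N_FFT, SUBBAND, N_SUB, L_FIR = 512, 12, 10, 73   # assumed UFMC parameters
proto = chebwin(L_FIR, at=60)                    # Dolph-Chebyshev prototype filter
proto /= proto.sum()
n = np.arange(L_FIR)

rng = np.random.default_rng(3)
tx = np.zeros(N_FFT + L_FIR - 1, dtype=complex)
for b in range(N_SUB):
    # QPSK symbols on this subband's subcarriers only
    sym = (rng.choice([-1.0, 1.0], SUBBAND)
           + 1j * rng.choice([-1.0, 1.0], SUBBAND)) / np.sqrt(2)
    freq = np.zeros(N_FFT, dtype=complex)
    freq[b * SUBBAND:(b + 1) * SUBBAND] = sym
    time = np.fft.ifft(freq)                     # subband signal in time domain
    # Shift the lowpass prototype to the subband centre, then filter the subband
    center = (b * SUBBAND + SUBBAND / 2) / N_FFT
    fir_b = proto * np.exp(2j * np.pi * center * n)
    tx += lfilter(fir_b, 1.0, np.concatenate([time, np.zeros(L_FIR - 1)]))
print("UFMC symbol length:", tx.size)            # N_FFT + L_FIR - 1 samples
```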
2065 Estimation of the Exergy-Aggregated Value Generated by a Manufacturing Process Using the Theory of the Exergetic Cost
Authors: German Osma, Gabriel Ordonez
Abstract:
The production of metal-rubber spares for vehicles is a sequential process that consists of the transformation of raw material through cutting activities and chemical and thermal treatments, which demand electricity and fossil fuels. Energy efficiency analysis for such cases mostly focuses on the study of each machine or production step, but it is not common to study the quality that the production process achieves from an aggregated-value viewpoint, which can be used as a quality measurement for determining the impact on the environment. In this paper, the theory of exergetic cost is used to determine the aggregated exergy of three metal-rubber spares, based on an exergy analysis and a thermoeconomic analysis. The manufacturing of these spares is based on a batch production technique, and therefore the use of this theory for discontinuous flows, starting from single models of workstations, is proposed; subsequently, the complete exergy model of each product is built using flowcharts. These models are a representation of the exergy flows between components of the machines according to electrical, mechanical and/or thermal expressions; they determine the exergy demanded to produce the effective transformation of raw materials (the aggregated exergy value) and the exergy losses caused by equipment and irreversibilities. The energy resources of the manufacturing process are electricity and natural gas. The workstations considered are lathes, punching presses, cutters, a zinc machine, chemical treatment tanks, hydraulic vulcanizing presses, and a rubber mixer. The thermoeconomic analysis was done by workstation and by spare; the first describes the operation of the components of each machine and where the exergy losses are, while the second estimates the exergy-aggregated value for the finished product and the wasted feedstock. Results indicate that the exergy efficiency of a mechanical workstation is between 10% and 60%, while this value in the thermal workstations is less than 5%; also, each effective exergy-aggregated value is one-thirtieth of the total exergy required for the operation of the manufacturing process, which amounts to approximately 2 MJ. These shortfalls are caused mainly by the technical limitations of the machines, oversizing of metal feedstock that demands more mechanical transformation work, and the low thermal insulation of the chemical treatment tanks and hydraulic vulcanizing presses. From the established information, it is possible in this case to appreciate the usefulness of the theory of exergetic cost for analyzing aggregated value in manufacturing processes.
Keywords: exergy-aggregated value, exergy efficiency, thermoeconomics, exergy modeling
Procedia PDF Downloads 170
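For orientation, the exergy bookkeeping behind such an analysis can be written in generic form. The notation below is a hedged sketch of a standard workstation exergy balance, not the authors' exact formulation.

```latex
% Exergy balance of one workstation k (generic form):
%   exergy in = aggregated (useful) exergy + losses + destruction
\begin{align}
  B_{\mathrm{in},k} &= B_{\mathrm{agg},k} + B_{\mathrm{loss},k} + B_{\mathrm{dest},k},\\
  \eta_{\mathrm{ex},k} &= \frac{B_{\mathrm{agg},k}}{B_{\mathrm{in},k}},\qquad
  B_{\mathrm{agg}} = \sum_{k} B_{\mathrm{agg},k}.
\end{align}
```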
2064 Hazard Alert in Malaysia Related to Occupational Safety and Health
Authors: Atikah Binti Azudin, Nurin Nazlah Binti Muhamad Yani, Nur Alya Nadhirah Binti Naaidith, Nur Amylia Wahida Binti Mat Ayob, Nurshamimi Shakirah Binti Suboh, Nur Auni Batrisyia Binti Md. Zaini, Nur Aziemah Binti Mohamad, Nurul Suffiyah Binti Sa’Dun, Sabrina Sasha Izzati Binti Zubaile, Umi Huwaina Binti Ahmiruddin, Wan Nur Shafawati Binti Wan Ghazali
Abstract:
A hazard alert is intended to provide brief information about significant incidents or existing difficulties in departmental workplaces. The alert gives guidelines for the proper processes, practices, and controls to be applied. When operated in accordance with the manufacturer's instructions, any machine or tool utilized at work provides a safe and dependable platform for workers to accomplish their job duties. However, when not utilized appropriately, a machine can pose a major hazard to employees. Employers have a duty to keep employees safe in this scenario. This hazard alert outlines specific occupational dangers and the controls that employers must apply to prevent injuries or fatal accidents. There have been several cases of hazard alerts in Malaysia that have had a negative impact on workers. On the bright side, such incidents can be mitigated in a variety of ways. One of these is ensuring that only qualified individuals operate mobile machinery and equipment. In addition, employees may perform frequent pre-use inspections of machinery to discover and fix flaws. Hazard alerts are very important, and this study covers a variety of subjects, including the methods employed.
Keywords: safe, hazard, impacts, duties
Procedia PDF Downloads 92
2063 Challenges Brought about by Integrating Multiple Stakeholders into Farm Management Mentorship of Land Reform Beneficiaries in South Africa
Authors: Carlu Van Der Westhuizen
Abstract:
The South African agricultural sector is of major socio-economic importance to the country due to its contribution to maintaining stability in food production and food security, providing labour opportunities, eradicating poverty, and earning foreign currency. Against this reality, this paper investigates, within the agricultural sector in South Africa, the changes in land policies that the new democratically elected government (African National Congress) brought about since its takeover in 1994. The change in the agricultural environment is decidedly dualistic, with 1) a commercial sector, and 2) a subsistence and emerging farmer sector. The future demands and challenges are mostly identified as those of land redistribution and social upliftment. Opportunities that arose from the challenge of change are, among others, small-holder participation in the value chain, while the challenge of change in agriculture and the opportunities that were identified could serve as a yardstick against which the sector's performance could be measured in future. Unfortunately, despite all of Government's policies, programmes, and projects and the inputs of the private sector, the outcomes are, to a large extent, unsuccessful. The urgency of the Land Redistribution Programme is underlined by the fact that, for the period 1994-2014, only 7.5% of the 30% redistribution target was achieved. Another serious concern is that 90% of the land redistribution projects are not in a state of productive use by emerging farmers. Several reasons may be offered for these failures, amongst others the uncoordinated way in which different stakeholders are involved in a specific farming project. These stakeholders could generally be identified as:
- the Government as the policy maker;
- the private sector, which has the potential to contribute to the sustainable pre- and post-settlement stages of the programme by coordinating the supporting services with Government;
- the communities in rural areas where the settlement takes place;
- the landowners as sellers of land (e.g. a Traditional Council); and
- the emerging beneficiaries as the receivers of land.
Mentorship is mostly the medium through which the support is coordinated. This paper focuses on three scenarios of different types of mentorship (or management support), namely:
- the Taung Irrigation Scheme (TIS), where multiple new land beneficiaries were established by sharing irrigation pivots and receiving mentorship support from commodity organisations within a traditional land-sharing system;
- projects whereby the mentor is a strategic partner (mostly a major agricultural 'cooperative' that also provides inputs to the farmer and is responsible for purchasing/marketing all commodities produced); and
- an individual mentor, a private person focusing mainly on farm management mentorship without direct gain other than a monthly stipend paid to the mentor by Government.
Against this introduction, the focus of the study is investigating the process for the sustainable implementation of Government's land redistribution in South African agriculture. To achieve this, the research paper is presented under the themes of problem statement, objectives, methodology and limitations, an outline of the research process, and proposed possible solutions.
Keywords: land reform, role-players, failures, mentorship, management models
Procedia PDF Downloads 271
2062 Ensemble Methods in Machine Learning: An Algorithmic Approach to Derive Distinctive Behaviors of Criminal Activity Applied to the Poaching Domain
Authors: Zachary Blanks, Solomon Sonya
Abstract:
Poaching presents a serious threat to endangered animal species, environmental conservation, and human life. Additionally, some poaching activity has even been linked to supplying funds to terrorist networks elsewhere around the world. Consequently, agencies dedicated to protecting wildlife habitats face a near-intractable task in adequately patrolling an entire area (spanning several thousand kilometers) given the limited resources, funds, and personnel at their disposal. Thus, agencies need predictive tools that are both high-performing and easily implementable by the user, to help in learning how the significant features (e.g., animal population densities, topography, behavior patterns of the criminals within the area) interact with each other, in the hope of abating poaching. This research develops a classification model using machine learning algorithms to aid in forecasting future attacks that is both easy to train and performs well when compared to other models. We demonstrate how data imputation methods (specifically predictive mean matching, gradient boosting, and random forest multiple imputation) can be applied to analyze data and create significant predictions across a varied data set. Specifically, we apply these methods to improve the accuracy of adopted prediction models (logistic regression, support vector machine, etc.). Finally, we assess the performance of the model and the accuracy of our data imputation methods by learning on a real-world data set constituting four years of imputed data and testing on one year of non-imputed data. This paper provides three main contributions. First, we extend work done by the Teamcore and CREATE (Center for Risk and Economic Analysis of Terrorism Events) research group at the University of Southern California (USC), working in conjunction with the Department of Homeland Security, to apply game theory and machine learning algorithms to develop more efficient ways of reducing poaching. This research introduces ensemble methods (random forests and stochastic gradient boosting) and applies them to real-world poaching data gathered from the Ugandan rain forest park rangers. Next, we consider the effect of data imputation on both the performance of various algorithms and the general accuracy of the method itself when applied to a dependent variable where a large number of observations are missing. Third, we provide an alternate approach to predicting the probability of observing poaching both by season and by month. The results from this research are very promising. We conclude that by using stochastic gradient boosting to predict observations for non-commercial poaching by season, we are able to produce statistically equivalent results while being orders of magnitude faster in computation time and complexity. Additionally, when predicting potential poaching incidents by individual month rather than by entire seasons, boosting techniques produce a mean area-under-the-curve increase of approximately 3% relative to previous prediction schedules by entire seasons.
Keywords: ensemble methods, imputation, machine learning, random forests, statistical analysis, stochastic gradient boosting, wildlife protection
Procedia PDF Downloads 292
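A hedged sketch of the pipeline named above: iterative imputation with random-forest estimators (approximating random forest multiple imputation) feeding a stochastic gradient boosting classifier. The synthetic missing-data matrix stands in for the rangers' patrol records.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
X = rng.normal(size=(400, 8))                    # patrol/terrain features
y = (X[:, 0] - X[:, 2] > 0).astype(int)          # poaching observed or not
X[rng.random(X.shape) < 0.3] = np.nan            # 30% of values missing

model = make_pipeline(
    IterativeImputer(estimator=RandomForestRegressor(n_estimators=30), max_iter=5),
    GradientBoostingClassifier(subsample=0.7),   # subsample < 1 => stochastic GB
)
print("CV AUC:", cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean())
```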
2061 Critical Heights of Sloped Unsupported Trenches in Unsaturated Sand
Authors: Won Taek Oh, Adin Richard
Abstract:
Workers are often required to enter unsupported trenches during the construction process, which may present serious risks. Trench failures can result in death or damage to adjacent properties; therefore, trenches should be excavated with extreme caution. Excavation work is often done in unsaturated soils, where the critical height (i.e., the maximum depth that can be excavated without failure) of unsupported trenches can be more reliably estimated by considering the influence of matric suction. In this study, coupled stress/pore-water pressure analyses are conducted to investigate the critical height of sloped unsupported trenches, considering the influence of pore-water pressure redistribution caused by excavating. Four different wall slopes (1.5V:1H, 2V:1H, 3V:1H, and 90°) and a vertical trench with the top 0.3 m sloped 1:1 were considered in the analyses, with multiple depths of the ground water table in a sand. For comparison, the critical heights were also estimated using the limit equilibrium method for the same excavation scenarios used in the coupled analyses.
Keywords: critical height, matric suction, unsaturated soil, unsupported trench
Procedia PDF Downloads 121
2060 Development of Ceramic Spheres Buoyancy Modules for Deep-Sea Oil Exploration
Authors: G. Blugan, B. Jiang, J. Thornberry, P. Sturzenegger, U. Gonzenbach, M. Misson, D. Cartlidge, R. Stenerud, J. Kuebler
Abstract:
Low-cost ceramic spheres were developed and manufactured from the engineering ceramic aluminium oxide. Hollow spheres of 50 mm diameter with a wall thickness of 0.5-1.0 mm were produced via an adapted slip casting technique. It was possible to produce the spheres with good repeatability and with no defects or failures in the spheres due to the manufacturing process. The spheres were developed specifically for use in buoyancy devices for deep-sea exploration at depths of 3000 m below sea level. The spheres with a 1.0 mm wall thickness exhibit a buoyancy of over 54%, while the spheres with a 0.5 mm wall thickness exhibit a buoyancy of over 73%. The mechanical performance of the spheres was confirmed by performing a hydraulic burst pressure test on individual spheres. With a safety factor of 3, all spheres with 1.0 mm wall thickness survived a hydraulic pressure of greater than 150 MPa, which is equivalent to a depth of more than 5000 m below sea level. The spheres were then incorporated into a buoyancy module. These hollow aluminium oxide ceramic spheres offer an excellent possibility of deep-sea exploration to depths greater than those reachable with currently used technology.
Keywords: buoyancy, ceramic spheres, deep-sea, oil exploration
Procedia PDF Downloads 414
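The quoted buoyancy fractions can be sanity-checked from the sphere geometry. The short calculation below assumes an alumina density of about 3950 kg/m³ and seawater at 1025 kg/m³ (typical handbook values, not figures from the paper).

```python
import math

RHO_ALUMINA = 3950.0   # kg/m^3, assumed value for aluminium oxide
RHO_SEAWATER = 1025.0  # kg/m^3

def buoyancy_fraction(outer_d_mm: float, wall_mm: float) -> float:
    """Net buoyancy as a fraction of the displaced water weight."""
    r_out = outer_d_mm / 2000.0          # mm -> m
    r_in = r_out - wall_mm / 1000.0
    v_out = 4.0 / 3.0 * math.pi * r_out**3
    v_shell = v_out - 4.0 / 3.0 * math.pi * r_in**3
    displaced = RHO_SEAWATER * v_out     # mass of displaced water (kg)
    return (displaced - RHO_ALUMINA * v_shell) / displaced

for wall in (1.0, 0.5):
    print(f"{wall} mm wall: {buoyancy_fraction(50.0, wall):.0%}")
# Prints roughly 56% and 77%, consistent with the reported 'over 54%' and 'over 73%'.
```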
2059 Availability Analysis of Milling System in a Rice Milling Plant
Authors: P. C. Tewari, Parveen Kumar
Abstract:
The paper describes the availability analysis of the milling system of a rice milling plant using a probabilistic approach. The subsystems under study are special-purpose machines. The availability analysis of the system is carried out to determine the effect of the failure and repair rates of each subsystem on the overall performance (i.e., steady-state availability) of the system concerned. Further, on the basis of the effect of the repair rates on the system availability, maintenance repair priorities have been suggested. The problem is formulated using a Markov birth-death process, assuming exponentially distributed failure and repair rates. The first-order differential equations associated with the transition diagram are developed by using the mnemonic rule. These equations are solved using normalizing conditions and a recursive method to derive the steady-state availability expression of the system. The findings of the paper are presented and discussed with the plant personnel in order to adopt a suitable maintenance policy to increase the productivity of the rice milling plant.
Keywords: availability modeling, Markov process, milling system, rice milling plant
Procedia PDF Downloads 234
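For a single subsystem with failure rate λ and repair rate μ, the birth-death formulation reduces to the familiar steady-state availability A = μ/(λ+μ). The sketch below solves a small generator matrix numerically, the same way a multi-subsystem model would be solved; the rates are illustrative, not the plant's.

```python
import numpy as np

# Two-state Markov model: state 0 = working, state 1 = failed
lam, mu = 0.02, 0.5          # assumed failure and repair rates (per hour)
Q = np.array([[-lam,  lam],  # generator matrix of the birth-death process
              [  mu,  -mu]])

# Steady state: pi @ Q = 0 with pi summing to 1 (the normalizing condition)
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state availability:", pi[0])           # numerical solution
print("closed form mu/(lam+mu):", mu / (lam + mu))   # matches the numeric value
```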
2058 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data
Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone
Abstract:
The aim of this work was to evaluate how well Random Forest (RF) and Support Vector Machine (SVM) algorithms can support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore their ability to distinguish between controls and patients using mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired with a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsFMRI), 200 volumes. The estimated total lesion load (ml) and number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR. All rsFMRI data were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance and the mean Euler angle. WM and CSF signals, together with 6 motion parameters, were regressed out from the time series. We applied an independent component analysis (ICA) with the GIFT toolbox, using the Infomax approach with number of components = 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in the network), with the R language. The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (RFE) for the SVM, to obtain a ranking of the most predictive variables. Thus, we built two new classifiers using only the most important features, and we evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and RFE-SVM was performed, the most important variable was the sensori-motor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest value of lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best network for discriminating between controls and early MS was the sensori-motor I. Similar importance values were obtained for the sensori-motor II, cerebellum, and working memory networks. These findings, in accordance with the early manifestation of motor/sensory deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine
Procedia PDF Downloads 240
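The study's analysis was done in R; the Python sketch below mirrors the same two feature-selection routes (RF Gini importance and RFE with an SVM) on synthetic stand-in data of the same shape. Note that RFE needs a linear kernel to expose coefficients for ranking, so the linear kernel here is an assumption that departs from the study's RBF-SVM classifier.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
X = rng.normal(size=(37, 15))            # 37 subjects x 15 network mean signals
y = np.r_[np.zeros(19), np.ones(18)]     # 19 controls, 18 early-MS patients
X[y == 1, 0] += 1.5                      # make feature 0 informative (toy effect)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.75, random_state=5)

rf = RandomForestClassifier(random_state=5).fit(X_tr, y_tr)
print("RF top feature:", np.argmax(rf.feature_importances_))   # Gini importance

# Recursive feature elimination down to the single most predictive variable
rfe = RFE(SVC(kernel="linear"), n_features_to_select=1).fit(X_tr, y_tr)
print("RFE-SVM top feature:", np.argmax(rfe.ranking_ == 1))
print("test accuracy:", rfe.score(X_te, y_te))
```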
2057 The Lubrication Regimes Recognition of a Pressure-Fed Journal Bearing by Time and Frequency Domain Analysis of Acoustic Emission Signals
Authors: S. Hosseini, M. Ahmadi Najafabadi, M. Akhlaghi
Abstract:
The health of journal bearings is very important in preventing unforeseen breakdowns in rotary machines, and poor lubrication is one of the most important causes of bearing failures. Hydrodynamic lubrication (HL), mixed lubrication (ML), and boundary lubrication (BL) are the three lubrication regimes of a journal bearing. This paper uses the acoustic emission (AE) measurement technique to correlate features of the AE signals with the three lubrication regimes. The transitions from HL to ML under operating factors such as rotating speed, load, and inlet oil pressure are detected by time-domain and time-frequency-domain signal analysis techniques, and metal-to-metal contacts between the sliding surfaces of the journal and bearing are then identified. It is found that there is a significant difference between the theoretical and experimental operating values obtained for defining the lubrication regions.
Keywords: acoustic emission technique, pressure fed journal bearing, time and frequency signal analysis, metal-to-metal contact
Procedia PDF Downloads 155
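Typical AE features for this kind of regime recognition can be computed as below: time-domain statistics (RMS, kurtosis) plus a short-time Fourier transform for the time-frequency view. The sampling rate and the synthetic burst signal are assumptions for illustration, not the paper's measurements.

```python
import numpy as np
from scipy.signal import stft
from scipy.stats import kurtosis

fs = 1_000_000                       # assumed AE sampling rate, 1 MHz
t = np.arange(0, 0.01, 1 / fs)
rng = np.random.default_rng(6)
# Synthetic AE trace: background noise plus a burst mimicking metal-to-metal contact
x = 0.01 * rng.normal(size=t.size)
burst = (t > 0.004) & (t < 0.0045)
x[burst] += 0.2 * np.sin(2 * np.pi * 150_000 * t[burst])

# Time-domain features: RMS rises with contact energy, kurtosis flags burstiness
print("RMS:", np.sqrt(np.mean(x**2)))
print("kurtosis:", kurtosis(x))

# Time-frequency view: the burst energy appears around 150 kHz in the spectrogram
f, tt, Z = stft(x, fs=fs, nperseg=256)
print("peak band (Hz):", f[np.argmax(np.abs(Z).max(axis=1))])
```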
2056 i2kit: A Tool for Immutable Infrastructure Deployments
Authors: Pablo Chico De Guzman, Cesar Sanchez
Abstract:
Microservice architectures are increasingly common in distributed cloud applications due to their advantages in software composition, development speed, release cycle frequency, and time to market for business logic. On the other hand, these architectures also introduce challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution, and isolation of processes. However, other issues remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing, and data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos, or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs in the control layer, affecting running applications, specific expertise is required for ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers. Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set in other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing, and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open-source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer implies more important disadvantages. Resource allocation is greatly improved by using linuxkit, which introduces a very small footprint (around 35 MB). Also, the system is more secure, since linuxkit installs only the minimum set of dependencies needed to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
Keywords: container, deployment, immutable infrastructure, microservice
Procedia PDF Downloads 179
2055 Numerical Investigation of Wave Run-Up on Curved Dikes
Authors: Suba Periyal Subramaniam, Babette Scheres, Altomare Corrado, Holger Schuttrumpf
Abstract:
Due to climate change and the usage of coastal areas, there is an increasing risk of dike failures along coasts worldwide. Wave run-up plays a key role in the planning and design of a coastal structure. Coastal dike lines are bent either due to geological characteristics or due to the influence of anthropogenic activities. The effect of the curvature of coastal dikes on wave run-up and overtopping has not yet been investigated. The scope of this research is to find the effects of dike curvature on wave run-up by employing numerical model studies for various dike opening angles. Numerical simulation is carried out using DualSPHysics, a meshless method, and OpenFOAM, a mesh-based method. The numerical results of the wave run-up on a curved dike and the wave transformation process for various opening angles, wave attacks, and wave parameters will be compared and discussed. This research aims to contribute a more precise analysis and understanding of the influence of curvature in the dike line, thus ensuring a higher level of protection in the future development of coastal structures.
Keywords: curved dikes, DualSPHysics, OpenFOAM, wave run-up
Procedia PDF Downloads 148
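For context, wave run-up on straight dikes is commonly estimated with an empirical relation of the EurOtop type before numerical results are compared against it. The form below is a hedged sketch of that reference formula from the general literature, with indicative coefficients; it is not an equation taken from this paper.

```latex
% EurOtop-type estimate of the 2%-exceedance run-up on a straight dike slope
% (breaking-wave branch), with influence factors for berms, roughness, and
% oblique wave attack; the coefficient 1.65 is indicative.
\begin{equation}
  \frac{R_{u2\%}}{H_{m0}} \approx 1.65\,\gamma_b\,\gamma_f\,\gamma_\beta\,\xi_{m-1,0},
  \qquad
  \xi_{m-1,0} = \frac{\tan\alpha}{\sqrt{H_{m0}/L_{m-1,0}}}
\end{equation}
```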
2054 Mechanical Simulation with Electrical and Dimensional Tests for AISHa Containment Chamber
Authors: F. Noto, G. Costa, L. Celona, F. Chines, G. Ciavola, G. Cuttone, S. Gammino, O. Leonardi, S. Marletta, G. Torrisi
Abstract:
At the Istituto Nazionale di Fisica Nucleare – Laboratorio Nazionale del Sud (INFN-LNS), broad experience in the design, construction, and commissioning of ECR and microwave ion sources is available. The AISHa ion source has been designed by taking into account the typical requirements of hospital-based facilities, where the minimization of the mean time between failures (MTBF) is a key point, together with the maintenance operations, which should be fast and easy. It is intended to be a multipurpose device, operating at 18 GHz in order to achieve higher plasma densities. It should provide enough versatility for the future needs of hadron therapy, including the ability to run at larger microwave power to produce different species and highly charged ion beams. The source is potentially interesting for any hadron therapy facility using heavy ions. In this paper, we analyze the dimensional and electrical tests of an innovative solution for the containment chamber that allows us to solve our insulation and structural problems.
Keywords: FEM analysis, electron cyclotron resonance ion source, dielectrical measurement, hadron therapy
Procedia PDF Downloads 293
2053 Applying Artificial Neural Networks to Predict Speed Skater Impact Concussion Risk
Authors: Yilin Liao, Hewen Li, Paula McConvey
Abstract:
Speed skaters often face a risk of concussion when they fall on the ice and impact crash mats during practices and competitive races. Several variables, including those related to the skater, the crash mat, and the impact position (body side/head/feet impact), are believed to influence the severity of the skater's concussion. While computer simulation modeling can be employed to analyze these accidents, the simulation process is time-consuming and does not provide rapid information for coaches and teams to assess the skater's injury risk in competitive events. This research paper explores the feasibility of using AI techniques to evaluate a skater's potential concussion severity and develops a fast concussion prediction tool using artificial neural networks to reduce the risk of treatment delays for injured skaters. The primary data are collected through virtual tests and physical experiments designed to simulate skater-mat impact. The data are then analyzed to identify patterns and correlations and finally used to train and fine-tune the artificial neural networks for accurate prediction. The development of the prediction tool by employing machine learning strategies contributes to the application of AI methods in sports science and has theoretical implications for using AI techniques in predicting and preventing sports-related injuries.
Keywords: artificial neural networks, concussion, machine learning, impact, speed skater
Procedia PDF Downloads 109
2052 Melanoma and Non-Melanoma, Skin Lesion Classification, Using a Deep Learning Model
Authors: Shaira L. Kee, Michael Aaron G. Sy, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar AlDahoul
Abstract:
Skin diseases are considered the fourth most common disease, with melanoma and non-melanoma skin cancer being the most common types of cancer in Caucasians. The alarming increase in skin cancer cases shows an urgent need for further research to improve diagnostic methods, as early diagnosis can significantly improve the 5-year survival rate. Machine learning algorithms for image pattern analysis in diagnosing skin lesions can dramatically increase the accuracy of detection and decrease possible human errors. Several studies have shown that the diagnostic performance of computer algorithms outperformed dermatologists. However, existing methods still need improvements to reduce diagnostic errors and generate efficient and accurate results. Our paper proposes an ensemble method to classify dermoscopic images into benign and malignant skin lesions. The experiments were conducted using the International Skin Imaging Collaboration (ISIC) image samples. The dataset contains 3,297 dermoscopic images with benign and malignant categories. The results show an improvement in performance, with an accuracy of 88% and an F1 score of 87%, outperforming other existing models such as support vector machine (SVM), residual network (ResNet50), EfficientNetB0, EfficientNetB4, and VGG16.
Keywords: deep learning, VGG16, EfficientNet, CNN, ensemble, dermoscopic images, melanoma
Procedia PDF Downloads 81
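A minimal sketch of the ensembling idea: averaging the softmax outputs of two ImageNet-pretrained backbones (VGG16 and EfficientNetB0) with binary classification heads. The head layers, input size, soft-voting scheme, and omitted per-backbone preprocessing are assumptions, not the authors' exact architecture.

```python
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG16, EfficientNetB0

def branch(backbone_cls, name):
    base = backbone_cls(include_top=False, weights="imagenet",
                        input_shape=(224, 224, 3), pooling="avg")
    base.trainable = False                    # fine-tune only the head here
    inp = layers.Input((224, 224, 3))
    out = layers.Dense(2, activation="softmax", name=f"{name}_head")(base(inp))
    return Model(inp, out, name=name)

vgg = branch(VGG16, "vgg16")
eff = branch(EfficientNetB0, "effnetb0")

# Ensemble: average the two probability vectors (soft voting)
inp = layers.Input((224, 224, 3))
avg = layers.Average()([vgg(inp), eff(inp)])
ensemble = Model(inp, avg, name="lesion_ensemble")
ensemble.compile(optimizer="adam", loss="categorical_crossentropy",
                 metrics=["accuracy"])
ensemble.summary()
```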
2051 Noise Measurement and Awareness at Construction Site: A Case Study
Authors: Feiruz Ab'lah, Zarini Ismail, Mohamad Zaki Hassan, Siti Nadia Mohd Bakhori, Mohamad Azlan Suhot, Mohd Yusof Md. Daud, Shamsul Sarip
Abstract:
The construction industry is one of the major sectors in Malaysia. Apart from providing facilities, services, and goods, it also offers employment opportunities to local and foreign workers. In fact, construction workers are exposed to hazardous levels of noise generated by various sources, including excavators, bulldozers, concrete mixers, and piling machines. Previous studies indicated that piling and concrete work are the main sources contributing to the highest noise levels. Therefore, the aim of this study is to measure the noise exposure during the piling process and to determine the workers' awareness of noise pollution at the construction site. Initially, noise readings were obtained at the construction site using a digital sound level meter (SLM), and the noise exposure of the workers was mapped. Readings were taken at four different distances (5, 10, 15, and 20 meters) from the piling machine. Furthermore, a questionnaire was distributed to assess knowledge regarding noise pollution at the construction site. The results showed that the mean noise level at a distance of 5 m was more than 90 dB, which exceeds the recommended level. Although the level of awareness regarding the effects of noise pollution is satisfactory, the majority of workers (90%) still did not wear ear protection devices during the work period. Therefore, safety module guidelines related to noise pollution controls should be implemented to provide a safe working environment and prevent the onset of occupational hearing loss.
Keywords: construction, noise awareness, noise pollution, piling machine
Procedia PDF Downloads 384
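As a rough cross-check on the measurement distances, a point source in a free field attenuates by about 6 dB per doubling of distance (L2 = L1 - 20 log10(r2/r1)). The sketch below applies this textbook relation to a 90 dB reading at 5 m; the reference level is an assumption rather than the study's fitted data.

```python
import math

L_REF, R_REF = 90.0, 5.0   # assumed reference level (dB) at 5 m

def spl_at(r_m: float) -> float:
    """Free-field point-source estimate: -6 dB per doubling of distance."""
    return L_REF - 20.0 * math.log10(r_m / R_REF)

for r in (5, 10, 15, 20):
    print(f"{r:>2} m: {spl_at(r):.1f} dB")
# ~90.0, 84.0, 80.5, 78.0 dB -- real site readings will differ because of
# reflections, ground effects, and multiple simultaneous sources.
```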
2050 Improved Classification Procedure for Imbalanced and Overlapped Situations
Authors: Hankyu Lee, Seoung Bum Kim
Abstract:
The issue of imbalance and overlap in the class distribution is important in various applications of data mining. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (i.e., the major class) heavily exceeds the number of observations of the other class (i.e., the minor class). An overlapped dataset is one in which many observations are shared between the two classes. Imbalanced and overlapped data can frequently be found in many real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The class imbalance and overlap problem is challenging because this situation degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts (non-overlapping, lightly overlapping, and severely overlapping) and applying a classification algorithm in each part. These three parts were determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and to compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
Keywords: classification, imbalanced data with class overlap, split data space, support vector machine
Procedia PDF Downloads 308
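A hedged sketch of the splitting idea: scipy's directed Hausdorff distance quantifies how far the two classes interpenetrate, and an SVM margin band flags observations in the overlap region. The thresholds separating "light" from "severe" overlap are illustrative choices, since the paper's exact rule is not given in the abstract.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.svm import SVC

rng = np.random.default_rng(7)
# Imbalanced, overlapping classes: 300 majority vs 40 minority samples
X_maj = rng.normal(0.0, 1.0, size=(300, 2))
X_min = rng.normal(1.0, 1.0, size=(40, 2))
X = np.vstack([X_maj, X_min])
y = np.r_[np.zeros(300), np.ones(40)]

# Symmetric Hausdorff distance between the two classes
h = max(directed_hausdorff(X_maj, X_min)[0], directed_hausdorff(X_min, X_maj)[0])
print("Hausdorff distance:", round(h, 3))

# SVM margin: points with small |f(x)| lie inside the overlap band
svm = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
margin = np.abs(svm.decision_function(X))
regions = np.where(margin >= 1.0, "non-overlap",
                   np.where(margin >= 0.5, "light overlap", "severe overlap"))
print({r: int((regions == r).sum()) for r in np.unique(regions)})
```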
2049 Smart Disassembly of Waste Printed Circuit Boards: The Role of IoT and Edge Computing
Authors: Muhammad Mohsin, Fawad Ahmad, Fatima Batool, Muhammad Kaab Zarrar
Abstract:
The integration of the Internet of Things (IoT) and edge computing devices offers a transformative approach to electronic waste management, particularly in the dismantling of printed circuit boards (PCBs). This paper explores how these technologies optimize operational efficiency and improve environmental sustainability by addressing challenges such as data security, interoperability, scalability, and real-time data processing. Proposed solutions include advanced machine learning algorithms for predictive maintenance, robust encryption protocols, and scalable architectures that incorporate edge computing. Case studies from leading e-waste management facilities illustrate benefits such as improved material recovery efficiency, reduced environmental impact, improved worker safety, and optimized resource utilization. The findings highlight the potential of IoT and edge computing to revolutionize e-waste dismantling and make the case for a collaborative approach between policymakers, waste management professionals, and technology developers. This research provides important insights into the use of IoT and edge computing to make significant progress in the sustainable management of electronic waste.
Keywords: internet of things, edge computing, waste PCB disassembly, electronic waste management, data security, interoperability, machine learning, predictive maintenance, sustainable development
Procedia PDF Downloads 30
2048 Does Stock Markets Asymmetric Information Affect Foreign Capital Flows?
Authors: Farid Habibi Tanha, Mojtaba Jahanbazi, Morteza Foroutan, Rasidah Mohd Rashid
Abstract:
This paper depicts the effects of asymmetric information on capital inflows, captured through stock market microstructure. The model can explain several stylized facts regarding capital immobility. The first phase of the research involves collecting and refining 150,000,000 daily data points from 11 stock markets over a period of one decade, in an effort to minimize the impact of survivorship bias. Three micro techniques were used to measure information asymmetries. The final phase analyzes the model through a panel data approach. As a unique contribution, this research provides valuable information regarding the negative effects of information asymmetries in stock markets on attracting foreign investments. The results of this study can be directly considered by policy makers to monitor and control changes in capital flow, in order to keep market conditions healthy by preventing and managing possible shocks and thus avoiding sudden reversals and market failures.
Keywords: asymmetric information, capital inflow, market microstructure, investment
Procedia PDF Downloads 321
2047 Study on Water Level Management Criteria of Reservoir Failure Alert System
Authors: B. Lee, B. H. Choi
Abstract:
The loss of safety of reservoirs brought about by climate change and facility aging leads to reservoir failures, which result in the loss of lives and property damage in downstream areas. Therefore, it is necessary to provide a reservoir failure alert system for downstream residents that detects the early signs of failure (with sensors) in real time and performs safety management to prevent and minimize possible damage. Ten case studies were carried out to verify the water level management criteria for the four alert levels (attention, caution, alert, serious). Peak changes in water level data were analysed. The results showed that 'caution' and 'alert' were close to 33% and 66%, respectively, of the difference in level between the flood water level and the full water level. Therefore, it is adequate to use the initial water level management criteria of the reservoir failure alert system for the first year.
Acknowledgment: This research was supported by a grant (2017-MPSS31-002) from the 'Supporting Technology Development Program for Disaster Management' funded by the Ministry of the Interior and Safety (MOIS).
Keywords: alert system, management criteria, reservoir failure, sensor
Procedia PDF Downloads 200
2046 A Machine Learning Approach for Earthquake Prediction in Various Zones Based on Solar Activity
Authors: Viacheslav Shkuratskyy, Aminu Bello Usman, Michael O’Dea, Saifur Rahman Sabuj
Abstract:
This paper examines relationships between solar activity and earthquakes by applying machine learning techniques: K-nearest neighbour, support vector regression, random forest regression, and a long short-term memory network. Data from the SILSO World Data Center, the NOAA National Center, the GOES satellite, NASA OMNIWeb, and the United States Geological Survey were used for the experiment. The 23rd and 24th solar cycles, daily sunspot number, solar wind velocity, proton density, and proton temperature were all included in the dataset. The study also examined sunspots, solar wind, and solar flares, which all reflect solar activity, and the earthquake frequency distribution by magnitude and depth. The findings showed that the long short-term memory network model predicts earthquakes more accurately than the other models applied in the study, and that solar activity is more likely to affect earthquakes of lower magnitude and shallow depth than earthquakes of magnitude 5.5 or larger at intermediate and deep depths.
Keywords: k-nearest neighbour, support vector regression, random forest regression, long short-term memory network, earthquakes, solar activity, sunspot number, solar wind, solar flares
Procedia PDF Downloads 73
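A hedged sketch of the comparison framing above: three of the named regressors are scored side by side on synthetic stand-ins for the solar features (sunspot number, solar wind velocity, proton density, proton temperature). An LSTM would additionally require windowing the series into sequences, which is omitted here; the toy target is invented for illustration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
# Columns: sunspot number, solar wind velocity, proton density, proton temperature
X = rng.normal(size=(500, 4))
y = 0.3 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 0.5, 500)  # toy seismicity index

models = {
    "KNN": KNeighborsRegressor(n_neighbors=10),
    "SVR": SVR(kernel="rbf"),
    "Random forest": RandomForestRegressor(n_estimators=100, random_state=8),
}
for name, model in models.items():
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean R2 = {score:.3f}")
```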
2045 Reconstructability Analysis for Landslide Prediction
Authors: David Percy
Abstract:
Landslides are a geologic phenomenon that affects a large number of inhabited places, and they are constantly monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data that works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, working with continuous data, such as porosity, requires that these data be binned for inclusion in the model. RA constructs models of the data that pick out the most informative elements, the independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot-encoding-type schemes, RA works directly with the data as they are encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields similar accuracy but with the advantage of a completely transparent model. The result of an RA session with a data set is a report on every combination of variables and their probability of landslide events occurring. In this way, every informative combination of states can be examined.
Keywords: reconstructability analysis, machine learning, landslides, raster analysis
Procedia PDF Downloads 66
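The binning-plus-state-combination idea above can be illustrated in a few lines: continuous porosity is discretized into quantile bins, then the empirical landslide probability is tabulated for every combination of discrete states, mirroring the kind of report an RA session produces. The data and bin count are invented for illustration.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
n = 1000
df = pd.DataFrame({
    "soil": rng.choice(["clay", "silt", "sand"], n),   # already discrete
    "porosity": rng.uniform(0.1, 0.5, n),              # continuous -> bin it
})
df["landslide"] = (rng.random(n) <
                   0.1 + 0.5 * (df.soil == "clay") * (df.porosity > 0.35)).astype(int)

# Discretize the continuous layer into 3 quantile bins, as RA requires
df["porosity_bin"] = pd.qcut(df.porosity, q=3, labels=["low", "mid", "high"])

# Probability of landslide for every combination of discrete states
report = (df.groupby(["soil", "porosity_bin"], observed=True)["landslide"]
            .agg(["mean", "count"])
            .rename(columns={"mean": "P(landslide)"}))
print(report)
```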