Search results for: Brain Machine Interfaces
996 Use of Bayesian Network in Information Extraction from Unstructured Data Sources
Authors: Quratulain N. Rajput, Sajjad Haider
Abstract:
This paper applies Bayesian networks to support information extraction from unstructured, ungrammatical, and incoherent data sources for semantic annotation. A tool has been developed that combines ontologies, machine learning, information extraction, and probabilistic reasoning techniques to support the extraction process. Data acquisition is performed with the aid of knowledge specified in the form of an ontology. Due to the variable amount of information available on different data sources, it is often the case that the extracted data contain missing values for certain variables of interest. It is desirable in such situations to predict the missing values. The methodology presented in this paper first learns a Bayesian network from the training data and then uses it to predict missing data and to resolve conflicts. Experiments have been conducted to analyze the performance of the presented methodology. The results look promising, as the methodology achieves a high degree of precision and recall for information extraction and reasonably good accuracy for predicting missing values.
Keywords: Information extraction, Bayesian network, ontology, machine learning.
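A minimal sketch of the missing-value prediction idea described in the abstract: estimate a conditional probability table (one node of a Bayesian network) from training records by maximum likelihood and take the most probable value for the missing attribute. The column names ("price", "condition", "brand") are hypothetical, not the authors' ontology.

```python
# Sketch: learn P(brand | price, condition) from training records and use it
# to fill a missing "brand" value. Column names are illustrative only.
import pandas as pd

train = pd.DataFrame({
    "price":     ["low", "low", "high", "high", "low"],
    "condition": ["used", "used", "new",  "new",  "new"],
    "brand":     ["acme", "acme", "zenit", "zenit", "acme"],
})

# Conditional probability table estimated by maximum likelihood (counts).
cpt = (train.groupby(["price", "condition"])["brand"]
            .value_counts(normalize=True))

def predict_missing(price, condition):
    """Return the most probable brand given the observed attributes."""
    dist = cpt.loc[(price, condition)]
    return dist.idxmax()

print(predict_missing("low", "used"))   # -> 'acme'
```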
995 Microwave Shielding of Magnetized Hydrogen Plasma in Carbon Nanotubes
Authors: Afshin Moradi, Mohammad Hosain Teimourpour
Abstract:
We derive simple sets of equations to describe the microwave response of a thin film of magnetized hydrogen plasma in the presence of carbon nanotubes, which were grown by iron-catalyzed high-pressure disproportionation (HiPco). By considering the interference effects due to multiple reflections between the thin plasma film interfaces, we present the effects of a continuously changing external magnetic field and of the plasma parameters on the reflected, absorbed, and transmitted power in the system. The simulation results show that the interference effects play an important role in the reflectance, transmittance, and absorptance of microwave radiation at the magnetized plasma slab. As a consequence, the interference effects lead to a sinusoidal variation of the reflected intensity and can greatly reduce the reflected power, while the absorbed power increases.
994 A Real-time 4M Collecting Method for Production Information System
Authors: Seung Woo Lee, So Jeong Nam, Jai-Kyung Lee
Abstract:
It can be said that the business sector is faced with a range of challenges – a rapidly changing business environment, an increase and diversification of customers' demands, and the consequent need for quick response – which call for flexible management and production information systems. As a matter of fact, many manufacturers have adopted production information management systems such as MES and ERP. Nevertheless, managers have difficulties obtaining ever-changing production process information in real time, or responding quickly to any change in production-related needs on the basis of such information. This is because they rely on poor production information systems which are not capable of providing real-time factory information. If the manufacturer does not have the capacity to collect or digitalize the 4Ms (Man, Machine, Material, Method), which are the resources for production, on a real-time basis, it might be difficult to effectively maintain information on the production process. In this regard, this paper introduces some new alternatives to the existing methods of collecting the 4Ms in real time, which currently comprise the production field.
Keywords: 4M, Acquisition of Data on shop-floor, Real-time machine interface
993 Three Dimensional Analysis of Sequential Quasi Isotropic Composite Disc for Rotating Machine Application
Authors: Amin Almasi
Abstract:
Composite laminates are relatively weak under out-of-plane loading, inter-laminar stress, stress concentration near the edges, and stress singularities. This paper develops a new analytical formulation for a laminated composite rotating disc fabricated from symmetric, sequential quasi-isotropic layers to predict three-dimensional stress and deformation. This analysis is necessary to evaluate the mechanical integrity of fiber-reinforced multi-layer laminates used for high-speed rotating applications such as high-speed impellers. Three-dimensional governing equations are written for the rotating composite disc. An explicit solution is obtained with a Frobenius expansion series. Based on the analytical results, there are two separate zones of three-dimensional stress fields, at the centre and at the edge of the rotating disc. For thin discs, out-of-plane deformations and stresses are small in comparison with the in-plane ones. For relatively thick discs, the deformation and stress fields are three-dimensional.
Keywords: Composite disc, rotating machine.
992 Genetic Algorithms for Feature Generation in the Context of Audio Classification
Authors: José A. Menezes, Giordano Cabral, Bruno T. Gomes
Abstract:
Choosing good features is an essential part of machine learning. Recent techniques aim to automate this process. For instance, feature learning intends to learn the transformation of raw data into a representation useful to machine learning tasks. In automatic audio classification tasks, this is interesting since the audio, usually complex information, needs to be transformed into a computationally convenient input to process. Another technique tries to generate features by searching a feature space. Genetic algorithms, for instance, have been used to generate audio features by combining or modifying them. We find this approach particularly interesting and, despite the undeniable advances of feature learning approaches, we wanted to take a step forward in the use of genetic algorithms to find audio features, combining them with more conventional methods, like PCA, and inserting search control mechanisms, such as constraints over a confusion matrix. This work presents the results obtained on particular audio classification problems.
Keywords: Feature generation, feature learning, genetic algorithm, music information retrieval.
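A hedged sketch of the genetic-search idea: each individual is a recipe that combines two raw audio descriptors with an arithmetic operation, and its fitness is the cross-validated accuracy of a simple classifier on the generated feature. The raw feature matrix, labels, and operations below are placeholders, not the authors' actual descriptor set or search controls.

```python
# Sketch of genetic feature generation: an individual is (column i, column j,
# operation) applied to a raw feature matrix X. X and y are placeholders for
# audio descriptors and genre labels.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))          # raw audio descriptors (placeholder)
y = rng.integers(0, 2, size=200)       # genre labels (placeholder)
OPS = {"add": np.add, "sub": np.subtract, "mul": np.multiply}

def random_individual():
    return (rng.integers(0, X.shape[1]), rng.integers(0, X.shape[1]),
            rng.choice(list(OPS)))

def fitness(ind):
    i, j, op = ind
    feat = OPS[op](X[:, i], X[:, j]).reshape(-1, 1)
    return cross_val_score(KNeighborsClassifier(), feat, y, cv=3).mean()

population = [random_individual() for _ in range(20)]
for _ in range(10):                    # generations
    population.sort(key=fitness, reverse=True)
    parents = population[:10]          # selection: keep the best half
    children = [(p[0], p[1], rng.choice(list(OPS))) for p in parents]  # mutate op
    population = parents + children

best = max(population, key=fitness)
print("best generated feature:", best, "score:", round(fitness(best), 3))
```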
991 Reducing the Imbalance Penalty through Artificial Intelligence Methods in Geothermal Production Forecasting: A Case Study for Turkey
Abstract:
In addition to being rich in renewable energy resources, Turkey is one of the countries that promise potential in geothermal energy production with its high installed power, low cost, and sustainability. Increasing imbalance penalties become an economic burden for organizations, since geothermal generation plants cannot maintain the balance of supply and demand due to the inadequacy of the production forecasts given in the day-ahead market. A better production forecast reduces the imbalance penalties of market participants and provides a better balance in the day-ahead market. In this study, using machine learning, deep learning, and time series methods, the total generation of the power plants belonging to Zorlu Doğal Electricity Generation, which has a high installed geothermal capacity, was predicted for the first week and the first two weeks of March; the imbalance penalties were then calculated with these estimates and compared with the real values. These modeling operations were carried out on two datasets: the basic dataset and a dataset created by extracting new features from it with feature engineering. According to the results, Support Vector Regression from the traditional machine learning models outperformed the other models and exhibited the best performance. In addition, the estimation results on the feature engineering dataset showed lower error rates than on the basic dataset. It has been concluded that the imbalance penalty calculated from the forecasts for the selected organization is lower than the actual imbalance penalty, which is optimal and profitable for the organization.
Keywords: Machine learning, deep learning, time series models, feature engineering, geothermal energy production forecasting.
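A minimal sketch of the forecasting setup the abstract reports as best performing: lag features are engineered from a generation time series and a Support Vector Regression model is fit. The synthetic hourly series below is a stand-in for the plant data, and the lag set and SVR parameters are illustrative assumptions.

```python
# Sketch: lagged features + hour of day, fit SVR, score on a held-out week.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
t = np.arange(24 * 60)                                   # 60 days of hourly data
gen = 100 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, t.size)

# Feature engineering: lagged generation values and the hour of day.
lags = (1, 2, 24, 168)
start = max(lags)
X = np.column_stack([gen[start - lag:len(gen) - lag] for lag in lags]
                    + [t[start:] % 24])
y = gen[start:]

split = len(y) - 7 * 24                                  # hold out the last week
model = SVR(C=10.0, epsilon=0.5).fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("MAE over the held-out week:", round(mean_absolute_error(y[split:], pred), 2))
```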
990 Design and Implementation of an AI-Enabled Task Assistance and Management System
Authors: Arun Prasad Jaganathan
Abstract:
In today's dynamic industrial world, traditional task allocation methods often fall short in adapting to evolving operational conditions. This paper presents an AI-enabled task assistance and management system designed to overcome the limitations of conventional approaches. By using artificial intelligence (AI) and machine learning (ML), the system intelligently interprets user instructions, analyzes tasks, and allocates resources based on real-time data and environmental factors. Additionally, geolocation tracking enables proactive identification of potential delays, ensuring timely interventions. With its transparent reporting mechanisms, the system provides stakeholders with clear insights into task progress, fostering accountability and informed decision-making. The paper presents a comprehensive overview of the system architecture, algorithm, and implementation, highlighting its potential to revolutionize task management across diverse industries.
Keywords: Artificial intelligence, machine learning, task allocation, operational efficiency, resource optimization.
989 Performance Analysis of a Flexible Manufacturing Line Operated Under Surplus-based Production Control
Authors: K. K. Starkov, A. Y. Pogromsky, I. J. B. F. Adan, J. E. Rooda
Abstract:
In this paper we present our results on the performance analysis of a multi-product manufacturing line. We study the influence of external perturbations, intermediate buffer content, and the number of manufacturing stages on the production tracking error of each machine in the multi-product line operated under a surplus-based production control policy. Starting with the analysis of a single machine with multiple production stages (one for each product type), we provide bounds on the production error of each stage. Then, we extend our analysis to a line of multi-stage machines, where, similarly, bounds on each production tracking error for each product type, as well as on the buffer content, are obtained. Details on the performance of the closed-loop flow line model are illustrated in numerical simulations.
Keywords: Flexible manufacturing systems, tracking systems, discrete time systems, production control, boundary conditions.
988 On Speeding Up Support Vector Machines: Proximity Graphs Versus Random Sampling for Pre-Selection Condensation
Authors: Xiaohua Liu, Juan F. Beltran, Nishant Mohanchandra, Godfried T. Toussaint
Abstract:
Support vector machines (SVMs) are considered to be among the best machine learning algorithms for minimizing the predictive probability of misclassification. However, their drawback is that for large data sets the computation of the optimal decision boundary is a time-consuming function of the size of the training set. Hence, several methods have been proposed to speed up the SVM algorithm. Here, three methods used to speed up the computation of SVM classifiers are compared experimentally using a musical genre classification problem. The simplest method pre-selects a random sample of the data before the application of the SVM algorithm. Two additional methods use proximity graphs to pre-select data that are near the decision boundary; one uses k-Nearest Neighbor graphs and the other Relative Neighborhood graphs to accomplish the task.
Keywords: Machine learning, data mining, support vector machines, proximity graphs, relative neighborhood graphs, k-nearest neighbor graphs, random sampling, training data condensation.
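A sketch of the simplest of the three compared speed-ups: train the SVM on a random sample of the training set instead of all of it, and compare accuracy and training time. The synthetic data stand in for the musical-genre features; the proximity-graph condensation methods are not reproduced here.

```python
# Random-sampling pre-selection before SVM training, versus the full set.
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

def fit_and_score(X_fit, y_fit):
    t0 = time.perf_counter()
    clf = SVC(kernel="rbf").fit(X_fit, y_fit)
    return clf.score(X_te, y_te), time.perf_counter() - t0

# Full training set versus a 10% random condensation of it.
rng = np.random.default_rng(0)
idx = rng.choice(len(X_tr), size=len(X_tr) // 10, replace=False)
print("full   (accuracy, seconds):", fit_and_score(X_tr, y_tr))
print("sampled(accuracy, seconds):", fit_and_score(X_tr[idx], y_tr[idx]))
```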
987 Climate Change in Albania and Its Effect on Cereal Yield
Abstract:
This study is focused on analyzing climate change in Albania and its potential effects on cereal yields. Initially, monthly temperatures and rainfall in Albania were studied for the period 1960-2021. Climatic variables are important when trying to model cereal yield behavior, especially when significant changes in weather conditions are observed. For this purpose, in the second part of the study, linear and nonlinear models explaining cereal yield are constructed for the same period, 1960-2021. Multiple linear regression analysis and the lasso regression method are applied to the data relating cereal yield to each independent variable: average temperature, average rainfall, fertilizer consumption, arable land, land under cereal production, and nitrous oxide emissions. In our regression model, heteroscedasticity is not observed, the data follow a normal distribution, and there is a low correlation between factors, so we do not have the problem of multicollinearity. Machine learning methods, such as Random Forest (RF), are used to predict cereal yield responses to climatic and other variables. RF showed high accuracy compared to the other statistical models in the prediction of cereal yield. We found that changes in average temperature negatively affect cereal yield. The coefficients of fertilizer consumption, arable land, and land under cereal production positively affect production. Our results show that the RF method is an effective and versatile machine-learning method for cereal yield prediction compared to the other two methods, multiple linear regression and lasso regression.
Keywords: Cereal yield, climate change, machine learning, multiple regression model, random forest.
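A hedged sketch of the model comparison: the same three model families named in the abstract are fit to stand-in data with one row per year and the six predictors listed (temperature, rainfall, fertilizer, arable land, land under cereals, nitrous oxide emissions). Data and coefficients are synthetic, not the Albanian series.

```python
# Compare multiple linear regression, lasso, and random forest on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 62                                            # one row per year, 1960-2021
X = rng.normal(size=(n, 6))                       # standardized predictors
yield_t = 3.0 - 0.6 * X[:, 0] + 0.4 * X[:, 2] + 0.3 * X[:, 3] \
          + 0.2 * X[:, 4] + rng.normal(0, 0.2, n) # temperature hurts, inputs help

for name, model in [("multiple linear regression", LinearRegression()),
                    ("lasso regression", Lasso(alpha=0.05)),
                    ("random forest", RandomForestRegressor(random_state=0))]:
    score = cross_val_score(model, X, yield_t, cv=5, scoring="r2").mean()
    print(f"{name:28s} mean CV R^2 = {score:.3f}")
```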
986 VISMA: A Method for System Analysis in Early Lifecycle Phases
Authors: Walter Sebron, Hans Tschürtz, Peter Krebs
Abstract:
The choice of applicable analysis methods in safety or systems engineering depends on the depth of knowledge about a system, and on the respective lifecycle phase. However, the analysis method chain still shows gaps, as it should support system analysis during the lifecycle of a system from a rough concept in the pre-project phase until end-of-life. This paper’s goal is to discuss an analysis method, the VISSE Shell Model Analysis (VISMA) method, which aims at closing the gap in the early system lifecycle phases, like the conceptual or pre-project phase, or the project start phase. It was originally developed to aid in the definition of the system boundary of electronic system parts, such as a control unit for a pump motor. Furthermore, it can also be applied to non-electronic system parts. The VISMA method is a graphical sketch-like method that stratifies a system and its parts in inner and outer shells, like the layers of an onion. It analyses a system in a two-step approach, from the innermost to the outermost components, followed by the reverse direction. To ensure a complete view of a system and its environment, the VISMA should be performed by (multifunctional) development teams. To introduce the method, a set of rules and guidelines has been defined in order to enable a proper shell build-up. In the first step, the innermost system, named the system under consideration (SUC), is selected, which is the focus of the subsequent analysis. Then, its directly adjacent components, responsible for providing input to and receiving output from the SUC, are identified. These components are the content of the first shell around the SUC. Next, the input and output components to the components in the first shell are identified and form the second shell around the first one. Continuing this way, shell by shell is added with its respective parts until the border of the complete system (external border) is reached. Last, two external shells are added to complete the system view, the environment and the use case shell. This system view is also stored for future use. In the second step, the shells are examined in the reverse direction (outside to inside) in order to remove superfluous components or subsystems. Input chains to the SUC, as well as output chains from the SUC, are described graphically via arrows, to highlight functional chains through the system. As a result, this method offers a clear and graphical description and overview of a system, its main parts and environment; however, the focus still remains on a specific SUC. It helps to identify the interfaces and interfacing components of the SUC, as well as important external interfaces of the overall system. It supports the identification of the first internal and external hazard causes and causal chains. Additionally, the method promotes a holistic picture and cross-functional understanding of a system, its contributing parts, internal relationships and possible dangers within a multidisciplinary development team.
Keywords: Analysis methods, functional safety, hazard identification, system and safety engineering, system boundary definition, system safety.
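The shell build-up described above can be read as a breadth-first traversal of a component input/output graph outward from the SUC. The sketch below illustrates that reading with a made-up component graph loosely following the abstract's pump-motor example; the components, connections, and the BFS framing are assumptions, not the authors' formal rule set.

```python
# Sketch of VISMA shell construction as breadth-first traversal from the SUC.
from collections import deque

# adjacency: component -> components it exchanges input/output with (illustrative)
links = {
    "control unit": ["motor driver", "speed sensor"],
    "motor driver": ["control unit", "pump motor"],
    "speed sensor": ["control unit", "pump motor"],
    "pump motor":   ["motor driver", "speed sensor", "pump"],
    "pump":         ["pump motor"],
}

def build_shells(suc):
    """Return shells around the SUC: shell 0 is the SUC itself."""
    shells, seen, frontier = [[suc]], {suc}, deque([suc])
    while frontier:
        nxt = sorted({n for c in frontier for n in links[c]} - seen)
        if not nxt:
            break
        shells.append(nxt)
        seen.update(nxt)
        frontier = deque(nxt)
    return shells

for depth, shell in enumerate(build_shells("control unit")):
    print(f"shell {depth}: {shell}")
```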
985 Yield Prediction Using Support Vectors Based Under-Sampling in Semiconductor Process
Authors: Sae-Rom Pak, Seung Hwan Park, Jeong Ho Cho, Daewoong An, Cheong-Sool Park, Jun Seok Kim, Jun-Geol Baek
Abstract:
It is important to predict yield in the semiconductor test process in order to increase yield. In this study, yield prediction means finding defective dies, wafers, or lots effectively. The semiconductor test process consists of several test steps, and each test includes various test items. In other words, test data are large and complicated. The data are also disproportionately distributed, as the number of data points belonging to the FAIL class is extremely low. For yield prediction, general data mining techniques are limited without data preprocessing, due to the inherent properties of test data. Therefore, this study proposes an under-sampling method using a support vector machine (SVM) to eliminate the imbalanced characteristic. To evaluate performance, a random under-sampling method is compared with the proposed method using actual semiconductor test data. As a result, the sampling method using SVM is effective in generating a robust model for yield prediction.
Keywords: Yield Prediction, Semiconductor Test Process, Support Vector Machine, Under Sampling
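A hedged sketch of the support-vector-based under-sampling idea: fit an initial SVM on the imbalanced data, then keep only the majority-class samples that are support vectors (those near the decision boundary) plus every minority-class (FAIL) sample. The synthetic data and the specific selection rule are illustrative assumptions, not the authors' exact procedure.

```python
# Support-vector-based under-sampling for an imbalanced yield-prediction set.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=5000, weights=[0.98, 0.02], random_state=0)

probe = SVC(kernel="rbf").fit(X, y)
sv = np.zeros(len(X), dtype=bool)
sv[probe.support_] = True                     # indices of support vectors

keep = sv | (y == 1)                          # boundary majority samples + all FAIL samples
X_bal, y_bal = X[keep], y[keep]
print("class counts before:", np.bincount(y), " after:", np.bincount(y_bal))

# The reduced, more balanced set is then used to train the final model.
final_model = SVC(kernel="rbf").fit(X_bal, y_bal)
```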
984 Fracture Location Characterizations of Dissimilar Friction Stir Welds
Authors: Esther T. Akinlabi, Stephen A. Akinlabi
Abstract:
This paper reports the tensile fracture location characterizations of dissimilar friction stir welds between 5754 aluminium alloy and C11000 copper. The welds were produced using three shoulder diameter tools; namely, 15, 18 and 25 mm by varying the process parameters. The rotational speeds considered were 600, 950 and 1200 rpm while the feed rates employed were 50, 150 and 300 mm/min to represent the low, medium and high settings respectively. The tensile fracture locations were evaluated using the optical microscope to identify the fracture locations and were characterized. It was observed that 70% of the tensile samples failed in the Thermo Mechanically Affected Zone (TMAZ) of copper at the weld joints. Further evaluation of the fracture surfaces of the pulled tensile samples revealed that welds with low Ultimate Tensile Strength either have defects or intermetallics present at their joint interfaces.
Keywords: Fracture location, friction stir welding, intermetallics, metallography.
983 Phase Transformation Temperatures for Shape Memory Alloy Wire
Authors: Tan Wee Choon, Abdul Saad Salleh, Saifulnizan Jamian, Mohd. Imran Ghazali
Abstract:
The phase transformation temperature is one of the most important parameters for shape memory alloys (SMAs). The most popular method to determine these phase transformation temperatures is Differential Scanning Calorimetry (DSC), but the limitations of DSC testing make it difficult to apply to finished products that are not in powder form. A novel method using a universal testing machine has been developed to determine the phase transformation temperatures. The Flexinol wire was loaded with a force maintained throughout the experiment and, at the same time, heated slowly with a direct current up to a temperature of approximately 100°C. The direct current was then slowly decreased to cool the Flexinol wire. All the phase transformation temperatures for the Flexinol wire were obtained: the austenite start and finish temperatures are 52.54°C and 60.90°C, while the martensite start and finish temperatures are 44.78°C and 32.84°C.
Keywords: Phase transformation temperature, robotics, shape memory alloy, universal testing machine.
982 A Machine Learning-based Analysis of Autism Prevalence Rates across US States against Multiple Potential Explanatory Variables
Authors: Ronit Chakraborty, Sugata Banerji
Abstract:
There has been a marked increase in the reported prevalence of Autism Spectrum Disorder (ASD) among children in the US over the past two decades. This research analyzes the growth in state-level ASD prevalence against 45 different potential explanatory factors, including socio-economic, demographic, healthcare, public policy, and political factors. The goal was to understand whether these factors have adequate predictive power in modeling the differential growth in ASD prevalence across states and, if they do, which factors are the most influential. The key findings of this study include: (1) a confirmation that the chosen feature set has considerable power in predicting the growth in ASD prevalence; (2) identification of the most influential predictive factors; (3) given the nature of the most influential predictive variables, an indication that a considerable portion of the reported ASD prevalence differentials across states could be attributable to over- and under-diagnosis; and (4) identification of Florida as a key outlier state, pointing to a potential under-diagnosis of ASD.
Keywords: Autism Spectrum Disorder, ASD, clustering, Machine Learning, predictive modeling.
981 Models to Customise Web Service Discovery Result using Static and Dynamic Parameters
Authors: Kee-Leong Tan, Cheng-Suan Lee, Hui-Na Chua
Abstract:
This paper presents three models which enable the customisation of Universal Description, Discovery and Integration (UDDI) query results, based on some pre-defined and/or real-time changing parameters. These proposed models detail the requirements, design, and techniques which make it possible to rank Web service discovery results from a service registry. Our contribution is twofold. First, we present an extension to the UDDI inquiry capabilities, which enables a private UDDI registry owner to customise or rank the query results based on its business requirements. Second, our proposal utilises existing technologies and standards which require minimal changes to existing UDDI interfaces or their data structures. We believe these models will serve as a valuable reference for enhancing the service discovery methods within a private UDDI registry environment.
Keywords: Web service, discovery, semantic, SOA, registry, UDDI.
980 Control of Thermal Flow in Machine Tools Using Shape Memory Alloys
Authors: Reimund Neugebauer, Welf-Guntram Drossel, Andre Bucht, Christoph Ohsenbrügge
Abstract:
In this paper the authors propose and verify an approach to control heat flow in machine tool components. Thermal deformations are a main aspect affecting the accuracy of machining. Due to energy-efficiency goals, thermal base loads should be reduced. This leads to inhomogeneous and time-variant temperature profiles. To counteract these negative consequences, material with high melting enthalpy is used as a means of thermal stabilization. The increased thermal capacity slows down the transient thermal behavior. To account for the delayed thermal equilibrium, a control mechanism for thermal flow is introduced. By varying a gap in a heat flow path, the thermal resistance of an assembly can be controlled. This mechanism is evaluated in two experimental setups: first to validate the ability to control the thermal resistance, and second to prove the possibility of a self-sufficient option based on the self-sensing abilities of thermal shape memory alloys.
Keywords: energy-efficiency, heat transfer path, MT thermal stability, thermal shape memory alloy
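A back-of-envelope illustration of the control mechanism described above: the conductive resistance of an air gap in a heat flow path is R = d / (k·A), so the heat flow Q = ΔT / R drops as the gap widens. The numbers below are illustrative, not from the paper's experimental setups.

```python
# Thermal resistance of an air gap versus gap width (illustrative values only).
k_air = 0.026        # W/(m*K), thermal conductivity of still air
area = 0.01          # m^2, cross-section of the heat flow path
delta_T = 20.0       # K, temperature difference across the gap

for gap_mm in (0.1, 0.5, 1.0, 2.0):
    R = (gap_mm / 1000) / (k_air * area)      # K/W
    Q = delta_T / R                           # W
    print(f"gap = {gap_mm:4.1f} mm  ->  R = {R:6.1f} K/W,  Q = {Q:6.2f} W")
```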
979 Performance Analysis of List Scheduling in Heterogeneous Computing Systems
Authors: Keqin Li
Abstract:
Given a parallel program to be executed on a heterogeneous computing system, the overall execution time of the program is determined by a schedule. In this paper, we analyze the worst-case performance of the list scheduling algorithm for scheduling tasks of a parallel program in a mixed-machine heterogeneous computing system such that the total execution time of the program is minimized. We prove tight lower and upper bounds for the worst-case performance ratio of the list scheduling algorithm. We also examine the average-case performance of the list scheduling algorithm. Our experimental data reveal that the average-case performance of the list scheduling algorithm is much better than the worst-case performance and is very close to optimal, except for large systems with large heterogeneity. Thus, the list scheduling algorithm is very useful in real applications.
Keywords: Average-case performance, list scheduling algorithm, mixed-machine heterogeneous computing system, worst-case performance.
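A minimal sketch of greedy list scheduling on machines of different speeds: tasks are taken in list order and each is assigned to the machine that would complete it earliest. The task sizes and machine speeds are illustrative, and the paper's theoretical bounds are not reproduced here.

```python
# Greedy list scheduling on a mixed-machine (different-speed) system.
tasks = [7, 3, 9, 2, 5, 8, 4]          # task work amounts, in list order
speeds = [1.0, 2.0, 4.0]               # machine speeds (heterogeneity)
free_at = [0.0] * len(speeds)          # time each machine becomes free

for work in tasks:
    # assign the task to the machine that would complete it earliest
    m = min(range(len(speeds)), key=lambda i: free_at[i] + work / speeds[i])
    free_at[m] += work / speeds[m]

print("makespan (total execution time):", max(free_at))
```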
978 Oil Debris Signal Detection Based on Integral Transform and Empirical Mode Decomposition
Authors: Chuan Li, Ming Liang
Abstract:
Oil debris signal generated from the inductive oil debris monitor (ODM) is useful information for machine condition monitoring but is often spoiled by background noise. To improve the reliability in machine condition monitoring, the high-fidelity signal has to be recovered from the noisy raw data. Considering that the noise components with large amplitude often have higher frequency than that of the oil debris signal, the integral transform is proposed to enhance the detectability of the oil debris signal. To cancel out the baseline wander resulting from the integral transform, the empirical mode decomposition (EMD) method is employed to identify the trend components. An optimal reconstruction strategy including both de-trending and de-noising is presented to detect the oil debris signal with less distortion. The proposed approach is applied to detect the oil debris signal in the raw data collected from an experimental setup. The result demonstrates that this approach is able to detect the weak oil debris signal with acceptable distortion from noisy raw data.
Keywords: Integral transform, empirical mode decomposition, oil debris, signal processing, detection.
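A hedged sketch of the detection idea: integrating the raw signal attenuates high-frequency noise relative to the slow debris pulse, and the baseline wander introduced by integration is then removed. A simple moving-average trend estimate stands in here for the paper's EMD-based trend extraction; the signal itself is synthetic.

```python
# Integral transform followed by de-trending on a synthetic ODM-like signal.
import numpy as np

rng = np.random.default_rng(0)
fs, n = 1000, 4000                               # sample rate (Hz), samples
t = np.arange(n) / fs
debris = 0.2 * np.exp(-((t - 2.0) ** 2) / 0.002) # weak debris pulse near t = 2 s
noise = 0.5 * np.sin(2 * np.pi * 150 * t) + rng.normal(0, 0.1, n)
raw = debris + noise

integrated = np.cumsum(raw) / fs                 # integral transform
trend = np.convolve(integrated, np.ones(400) / 400, mode="same")  # stand-in for EMD trend
detected = integrated - trend                    # de-trended, de-noised signal

print("peak location (s):", t[np.argmax(detected)])  # close to the 2 s pulse
```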
977 Machine Learning Methods for Flood Hazard Mapping
Authors: S. Zappacosta, C. Bove, M. Carmela Marinelli, P. di Lauro, K. Spasenovic, L. Ostano, G. Aiello, M. Pietrosanto
Abstract:
This paper proposes a neural network approach to flood hazard mapping. The core of the model is a machine learning component fed by frequency ratios, namely statistical correlations between flood event occurrences and a selected number of topographic properties. The classification capability was compared with the flood hazard maps of the River Basin Plans (Piani di Assetto Idrogeologico, abbreviated as PAI) designed by ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), the Italian Institute for Environmental Protection and Research, which encode four increasing flood hazard levels. The study area of Piemonte, an Italian region, has been considered without loss of generality. The frequency ratios may be used as a standalone block to model the flood hazard map. Nevertheless, combining them with a neural network improves the classification accuracy by several percentage points, and the approach may be proposed as a basic tool to model the flood hazard map in a wider scope.
Keywords: flood modeling, hazard map, neural networks, hydrogeological risk, flood risk assessment
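A hedged sketch of the two-stage idea: frequency ratios per class of a topographic factor (the share of flood cells in the class divided by the share of all cells in the class) become the features of a small neural network. The grid cells, factor classes, and flood probabilities below are synthetic, not the Piemonte data.

```python
# Frequency ratios as features for a neural-network flood hazard classifier.
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 3000
cells = pd.DataFrame({
    "slope_class": rng.integers(0, 5, n),        # binned slope
    "elev_class":  rng.integers(0, 5, n),        # binned elevation
})
# flooding is more likely in low-slope, low-elevation cells (synthetic truth)
p = 0.05 + 0.10 * (cells["slope_class"] == 0) + 0.10 * (cells["elev_class"] == 0)
cells["flooded"] = rng.random(n) < p

def frequency_ratio(df, factor):
    """FR per class: (flooded share of the class) / (overall share of the class)."""
    share_flooded = df[df["flooded"]][factor].value_counts(normalize=True)
    share_all = df[factor].value_counts(normalize=True)
    return (share_flooded / share_all).fillna(0.0)

X = np.column_stack([cells[f].map(frequency_ratio(cells, f))
                     for f in ("slope_class", "elev_class")])
y = cells["flooded"].astype(int)

nn = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
print("CV accuracy:", cross_val_score(nn, X, y, cv=3).mean().round(3))
```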
976 What the Future Holds for Social Media Data Analysis
Authors: P. Wlodarczak, J. Soar, M. Ally
Abstract:
The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews on products and services they bought, write about their interests, share ideas, or give their opinions and views on political issues. There is growing interest among organisations in the analysis of SM data for detecting new trends, obtaining user opinions on their products and services, or finding out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators of historic SM data are represented as time series and correlated with a variety of real-world phenomena like the outcome of elections, the development of financial indicators, box office revenue, and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.
Keywords: Social Media, text mining, knowledge discovery, predictive analysis, machine learning.
975 Knowledge Based Wear Particle Analysis
Authors: Mohammad S. Laghari, Qurban A. Memon, Gulzar A. Khuwaja
Abstract:
The paper describes a knowledge based system for analysis of microscopic wear particles. Wear particles contained in lubricating oil carry important information concerning machine condition, in particular the state of wear. Experts (Tribologists) in the field extract this information to monitor the operation of the machine and ensure safety, efficiency, quality, productivity, and economy of operation. This procedure is not always objective and it can also be expensive. The aim is to classify these particles according to their morphological attributes of size, shape, edge detail, thickness ratio, color, and texture, and by using this classification thereby predict wear failure modes in engines and other machinery. The attribute knowledge links human expertise to the devised Knowledge Based Wear Particle Analysis System (KBWPAS). The system provides an automated and systematic approach to wear particle identification which is linked directly to wear processes and modes that occur in machinery. This brings consistency in wear judgment prediction which leads to standardization and also less dependence on Tribologists.
Keywords: Computer vision, knowledge based systems, morphology, wear particles.
974 Prediction of Research Topics Using Ensemble of Best Predictors from Similar Dataset
Authors: Indra Budi, Rizal Fathoni Aji, Agus Widodo
Abstract:
Prediction of future research topics using time series analysis, either statistical or machine learning based, has been conducted previously by several researchers. Several methods have been proposed to combine forecasting results into a single forecast. These methods use a fixed combination of individual forecasts to obtain the final forecast. In this paper, a quite different approach is employed to select the forecasting methods, in which every point to forecast is calculated using the best method identified on a similar validation dataset. The dataset used in the experiment is a time series derived from research reports in Garuda, an online site belonging to the Ministry of Education in Indonesia, over the past 20 years. The experimental result demonstrates that the proposed method may perform better than the fixed combination of predictors. In addition, based on the prediction result, we can forecast emerging research topics for the next few years.
Keywords: Combination, emerging topics, ensemble, forecasting, machine learning, prediction, research topics, similarity measure, time series.
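A hedged sketch of the selection idea: for each forecast, use the method whose error was lowest on a validation window, instead of a fixed combination. The candidate forecasters below (naive, moving average, linear trend) and the toy series stand in for the paper's pool of statistical and machine learning methods.

```python
# Per-dataset best-predictor selection instead of a fixed combination.
import numpy as np

def naive(history, h):
    return np.full(h, history[-1])

def moving_average(history, h, w=3):
    return np.full(h, history[-w:].mean())

def linear_trend(history, h):
    x = np.arange(len(history))
    slope, intercept = np.polyfit(x, history, 1)
    return intercept + slope * (len(history) + np.arange(h))

methods = {"naive": naive, "moving average": moving_average,
           "linear trend": linear_trend}

series = np.array([5, 6, 8, 9, 11, 12, 14, 15, 17, 18], dtype=float)
train, valid, horizon = series[:6], series[6:8], 2

# 1) score every method on the validation window
errors = {name: np.abs(f(train, len(valid)) - valid).mean()
          for name, f in methods.items()}
best = min(errors, key=errors.get)

# 2) forecast the future points with the best-performing method
forecast = methods[best](series[:8], horizon)
print("best on validation:", best, "-> forecast:", forecast.round(2))
```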
973 The Effects of Speed on the Performance of Routing Protocols in Mobile Ad-hoc Networks
Authors: Narendra Singh Yadav, R.P.Yadav
Abstract:
A mobile ad hoc network is a collection of mobile nodes communicating through wireless channels without any existing network infrastructure or centralized administration. Because of the limited transmission range of wireless network interfaces, multiple "hops" may be needed to exchange data across the network. Consequently, many routing algorithms have come into existence to satisfy the communication needs of such networks. Researchers have conducted many simulations comparing the performance of these routing protocols under various conditions and constraints. One question that arises is whether the speed of nodes affects the relative performance of the routing protocols being studied. This paper addresses the question by simulating two routing protocols, AODV and DSDV. The protocols were simulated using ns-2 and were compared in terms of packet delivery fraction, normalized routing load, and average delay, while varying the number of nodes and their speed.
Keywords: AODV, DSDV, MANET, relative performance.
972 Modelling Silica Optical Fibre Reliability: A Software Application
Authors: I. Severin, M. Caramihai, R. El Abdi, M. Poulain, A. Avadanii
Abstract:
In order to assess optical fibre reliability under different environmental and stress conditions, series of tests are performed, simulating the overlap of controlled, varying chemical and mechanical factors. Each series of tests may be compared using statistical processing, i.e., Weibull plots. Due to the large amount of data to process, a software application proved useful for interpreting selected series of experiments as a function of the factors considered. The current paper presents a software application used in the storage, modelling, and interpretation of experimental data gathered from optical fibre testing. The present paper strictly deals with the software part of the project (regarding the modelling, storage, and processing of user-supplied data).
Keywords: Optical fibres, computer aided analysis, data models, data processing, graphical user interfaces.
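A short sketch of the statistical processing behind a Weibull plot: fit a two-parameter Weibull distribution to fibre failure stresses and report the shape (Weibull modulus) and scale. The failure data below are illustrative, not from the described testing campaigns.

```python
# Two-parameter Weibull fit to illustrative fibre failure stresses (GPa).
from scipy.stats import weibull_min

failure_stress = weibull_min.rvs(c=5.0, scale=4.2, size=40, random_state=0)

shape, loc, scale = weibull_min.fit(failure_stress, floc=0)   # two-parameter fit
print(f"Weibull modulus (shape) = {shape:.2f}, characteristic strength = {scale:.2f} GPa")
```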
971 On the Learning of Causal Relationships between Banks in Saudi Equities Market Using Ensemble Feature Selection Methods
Authors: Adel Aloraini
Abstract:
Financial forecasting using machine learning techniques has received great attention in the last decade. In this ongoing work, we show how machine learning of graphical models can infer visualized causal interactions between different banks in the Saudi equities market. One important discovery from such learned causal graphs is how companies influence each other and to what extent. In this work, we present a set of graphical models, namely Gaussian graphical models, with ensemble penalized feature selection methods that combine a filtering method, a wrapper method, and a regularizer. A comparison between these different ensemble combinations is also presented. The best ensemble method is then used to infer the causal relationships between banks in the Saudi equities market.
Keywords: Causal interactions, banks, feature selection, regularizer.
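A hedged sketch of one building block: a Gaussian graphical model over daily bank returns estimated with an l1 (graphical lasso) regularizer, where nonzero off-diagonal precision entries are read as dependence edges between banks. The bank names and returns are synthetic; the paper's ensemble feature selection and causal interpretation go beyond this sketch, which recovers only undirected partial-correlation structure.

```python
# Sparse Gaussian graphical model over synthetic daily bank returns.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
banks = ["Bank A", "Bank B", "Bank C", "Bank D"]
base = rng.normal(0, 1, (250, 1))                       # common market factor
returns = base @ np.array([[1.0, 0.9, 0.1, 0.0]]) + rng.normal(0, 0.5, (250, 4))

model = GraphicalLasso(alpha=0.05).fit(returns)
precision = model.precision_

for i in range(len(banks)):
    for j in range(i + 1, len(banks)):
        if abs(precision[i, j]) > 1e-3:                 # an edge in the graph
            print(f"{banks[i]} -- {banks[j]}  (partial dependence)")
```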
970 Risk Factors for Defective Autoparts Products Using Bayesian Method in Poisson Generalized Linear Mixed Model
Authors: Pitsanu Tongkhow, Pichet Jiraprasertwong
Abstract:
This research investigates risk factors for defective products in autoparts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM), in which the dependent variable, the number of defective products, has a Poisson distribution, is adopted. Its performance is compared with that of the Poisson GLM under a Bayesian framework. The factors considered are the production process, machines, and workers. The products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products is Process 1; for the machine factor, the highest risk is Machine 5; and for the worker factor, the highest risk is Worker 6.
Keywords: Defective autoparts products, Bayesian framework, Generalized linear mixed model (GLMM), Risk factors.
969 A Frequency Dependence of the Phase Field Model in Laminar Boundary Layer with Periodic Perturbations
Authors: Yasuo Obikane
Abstract:
The frequency dependence of the phase field model (PFM) is studied. A simple PFM is proposed and tested in a laminar boundary layer. The Blasius laminar boundary layer solution on a flat plate is used for the flow pattern, several frequencies are imposed on the PFM, and the decay times of the interfaces are obtained. The computations were conducted for three cases: 1) no flow, 2) a half ball on the laminar boundary layer, and 3) a line of mass sources in the laminar boundary layer. The computations show that the decay time becomes shorter as the frequency increases, and also that it is sensitive to both background disturbances and surface tension parameters. It is concluded that the proposed simple PFM can describe the properties of the decay process and could provide the fundamentals for the decay of the interface in turbulent flows.
Keywords: Phase field model, two-phase flows, laminar boundary layer.
968 Human Action Recognition Based on Ridgelet Transform and SVM
Authors: A. Ouanane, A. Serir
Abstract:
In this paper, a novel algorithm based on the Ridgelet transform and a support vector machine is proposed for human action recognition. The Ridgelet transform is a directional multi-resolution transform and is well suited to describing human actions, as its directional information forms spatial feature vectors. The dynamic transition between the spatial features is captured using both Principal Component Analysis and the k-means clustering algorithm. First, Principal Component Analysis is used to reduce the dimensionality of the obtained vectors. Then, the k-means algorithm is used to convert these vectors into a spatio-temporal pattern, called a set of labels, according to the given periodicity of the human action. Finally, a Support Vector Machine classifier is used to discriminate between the different human actions. Tests are conducted on popular datasets, such as Weizmann and KTH. The obtained results show that the proposed method provides a significant accuracy rate and is more robust in very challenging situations such as lighting changes, scaling, and dynamic environments.
Keywords: Human action, Ridgelet transform, PCA, k-means, SVM.
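A hedged sketch of the recognition pipeline with the Ridgelet descriptors assumed precomputed (random placeholders below): PCA reduces the per-frame vectors, k-means quantizes each frame into a label, and an SVM classifies a per-clip histogram of those labels, which is a simplification of the paper's set-of-labels pattern.

```python
# PCA -> k-means labels -> per-clip label histogram -> SVM classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_clips, frames_per_clip, dim = 40, 30, 64
frames = rng.normal(size=(n_clips, frames_per_clip, dim))     # placeholder Ridgelet features
actions = rng.integers(0, 2, n_clips)                         # e.g. walk / wave
frames[actions == 1] += 0.5                                    # make classes separable

reduced = PCA(n_components=8).fit_transform(frames.reshape(-1, dim))
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(reduced)
labels = labels.reshape(n_clips, frames_per_clip)

# one normalized label histogram per clip
hist = np.stack([np.bincount(l, minlength=10) / frames_per_clip for l in labels])

clf = SVC(kernel="rbf").fit(hist[:30], actions[:30])
print("held-out accuracy:", clf.score(hist[30:], actions[30:]))
```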
967 Genetic Algorithm Application in a Dynamic PCB Assembly with Carryover Sequence-Dependent Setups
Authors: M. T. Yazdani Sabouni, Rasaratnam Logendran
Abstract:
We consider a typical problem in the assembly of printed circuit boards (PCBs) in a two-machine flow shop system to simultaneously minimize the weighted sum of weighted tardiness and weighted flow time. The investigated problem is a group scheduling problem in which PCBs are assembled in groups, and the interest is in finding the best sequence of groups, as well as of the boards within each group, to minimize the objective function value. The setup operation between any two board groups is characterized by a carryover sequence-dependent setup time, which exactly matches the real application of this problem. As a technical constraint, all of the boards must be kitted by kitting staff before the assembly operation starts (kitting operation). The main idea developed in this paper is to completely eliminate the role of the kitting staff by assigning the kitting task to the machine operator during the operator's idle time, which is referred to as the integration of internal (machine) and external (kitting) setup times. Performing the kitting operation, which prepares the next set of boards while other boards are being assembled, results in boards continuously entering the system, i.e., having dynamic arrival times. Consequently, a dynamic PCB assembly system is introduced for the first time in the assembly of PCBs, which also has characteristics similar to those of just-in-time manufacturing. The problem investigated is computationally very complex, meaning that finding optimal solutions is impossible, especially as the problem size grows. Thus, a heuristic based on a Genetic Algorithm (GA) is employed. An example problem on the application of the developed GA is demonstrated, and numerical results of applying the GA to several instances are also provided.
Keywords: Genetic algorithm, dynamic PCB assembly, carryover sequence-dependent setup times, multi-objective.