Search results for: Low cost ECG machine
2513 A Machine Learning-based Analysis of Autism Prevalence Rates across US States against Multiple Potential Explanatory Variables
Authors: Ronit Chakraborty, Sugata Banerji
Abstract:
There has been a marked increase in the reported prevalence of Autism Spectrum Disorder (ASD) among children in the US over the past two decades. This research has analyzed the growth in state-level ASD prevalence against 45 different potentially explanatory factors including socio-economic, demographic, healthcare, public policy and political factors. The goal was to understand if these factors have adequate predictive power in modeling the differential growth in ASD prevalence across various states, and, if they do, which factors are the most influential. The key findings of this study include (1) a confirmation that the chosen feature set has considerable power in predicting the growth in ASD prevalence, (2) identification of the most influential predictive factors, (3) given the nature of the most influential predictive variables, an indication that a considerable portion of the reported ASD prevalence differentials across states could be attributable to over- and under-diagnosis, and (4) identification of Florida as a key outlier state pointing to a potential under-diagnosis of ASD.
Keywords: Autism Spectrum Disorder, ASD, clustering, Machine Learning, predictive modeling.
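The abstract does not spell out the specific model behind finding (2); as a minimal, hedged sketch of how influential factors can be ranked for such state-level data, the snippet below fits a random forest regressor and reads off its feature importances. The file name, target column and feature columns are hypothetical placeholders, not the study's actual variables.

```python
# Hedged sketch: rank candidate explanatory variables for state-level ASD prevalence
# growth with a random forest.  All column and file names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("asd_state_data.csv")            # hypothetical table: one row per state
target = "asd_prevalence_growth"                  # hypothetical target column
features = [c for c in df.columns if c not in (target, "state")]

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(df[features], df[target])

# Rank the candidate factors by their contribution to the model's predictions.
importance = pd.Series(model.feature_importances_, index=features)
print(importance.sort_values(ascending=False).head(10))
```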
2512 Identification of Coauthors in Scientific Database
Authors: Thiago M. R. Dias, Gray F. Moita
Abstract:
The analysis of scientific collaboration networks has contributed significantly to improving the understanding of how collaboration between researchers takes place and of how the scientific production of researchers or research groups evolves. However, the identification of collaborations in large scientific databases is not a trivial task, given the high computational cost of the methods commonly used. This paper proposes a method for identifying collaborations in a large database of researcher curricula. The proposed method has low computational cost and gives satisfactory results, proving to be an interesting alternative for the modeling and characterization of large scientific collaboration networks.
Keywords: Extraction and data integration, Information Retrieval, Scientific Collaboration.
2511 Control of Thermal Flow in Machine Tools Using Shape Memory Alloys
Authors: Reimund Neugebauer, Welf-Guntram Drossel, Andre Bucht, Christoph Ohsenbrügge
Abstract:
In this paper, the authors propose and verify an approach to control heat flow in machine tool components. Thermal deformations are a main aspect that affects the accuracy of machining. Due to energy efficiency goals, basic thermal loads should be reduced. This leads to inhomogeneous and time-variant temperature profiles. To counteract these negative consequences, material with high melting enthalpy is used as a means of thermal stabilization. The increased thermal capacity slows down the transient thermal behavior. To account for the delayed thermal equilibrium, a control mechanism for thermal flow is introduced. By varying a gap in a heat flow path, the thermal resistance of an assembly can be controlled. This mechanism is evaluated in two experimental setups: first to validate the ability to control the thermal resistance, and second to prove the possibility of a self-sufficient option based on the self-sensing abilities of thermal shape memory alloys.
Keywords: energy-efficiency, heat transfer path, MT thermal stability, thermal shape memory alloy
2510 Performance Analysis of List Scheduling in Heterogeneous Computing Systems
Authors: Keqin Li
Abstract:
Given a parallel program to be executed on a heterogeneous computing system, the overall execution time of the program is determined by a schedule. In this paper, we analyze the worst-case performance of the list scheduling algorithm for scheduling tasks of a parallel program in a mixed-machine heterogeneous computing system such that the total execution time of the program is minimized. We prove tight lower and upper bounds for the worst-case performance ratio of the list scheduling algorithm. We also examine the average-case performance of the list scheduling algorithm. Our experimental data reveal that the average-case performance of the list scheduling algorithm is much better than the worst-case performance and is very close to optimal, except for large systems with large heterogeneity. Thus, the list scheduling algorithm is very useful in real applications.
Keywords: Average-case performance, list scheduling algorithm, mixed-machine heterogeneous computing system, worst-case performance.
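As a hedged illustration of the algorithm being analyzed, the sketch below implements a basic list scheduling rule on a mixed-machine system: tasks are taken in list order and each is placed on the machine that would finish it earliest. Treating the tasks as independent is a simplification of the paper's parallel-program setting.

```python
# Sketch of list scheduling on a mixed-machine heterogeneous system (simplified to
# independent tasks).  exec_times[i][j] is the execution time of task i on machine j.
def list_schedule(exec_times):
    m = len(exec_times[0])             # number of machines
    finish = [0.0] * m                 # current finish time of each machine
    assignment = []
    for task_times in exec_times:      # tasks considered in list order
        # place the task on the machine that completes it earliest
        j = min(range(m), key=lambda k: finish[k] + task_times[k])
        finish[j] += task_times[j]
        assignment.append(j)
    return assignment, max(finish)     # task-to-machine map and makespan

# Example: 4 tasks on 2 heterogeneous machines
print(list_schedule([[3, 5], [2, 1], [4, 4], [6, 2]]))
```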
2509 Oil Debris Signal Detection Based on Integral Transform and Empirical Mode Decomposition
Authors: Chuan Li, Ming Liang
Abstract:
Oil debris signal generated from the inductive oil debris monitor (ODM) is useful information for machine condition monitoring but is often spoiled by background noise. To improve the reliability in machine condition monitoring, the high-fidelity signal has to be recovered from the noisy raw data. Considering that the noise components with large amplitude often have higher frequency than that of the oil debris signal, the integral transform is proposed to enhance the detectability of the oil debris signal. To cancel out the baseline wander resulting from the integral transform, the empirical mode decomposition (EMD) method is employed to identify the trend components. An optimal reconstruction strategy including both de-trending and de-noising is presented to detect the oil debris signal with less distortion. The proposed approach is applied to detect the oil debris signal in the raw data collected from an experimental setup. The result demonstrates that this approach is able to detect the weak oil debris signal with acceptable distortion from noisy raw data.
Keywords: Integral transform, empirical mode decomposition, oil debris, signal processing, detection.
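A minimal sketch of the processing chain on synthetic data is given below: the raw signal is integrated to attenuate high-frequency noise, and the slow trend introduced by the integration is then estimated and subtracted. EMD is performed here with the third-party PyEMD package, which is an assumption; the paper's optimal reconstruction strategy is not reproduced.

```python
# Hedged sketch: integral transform followed by EMD-based de-trending on a synthetic
# oil-debris pulse.  PyEMD (pip install EMD-signal) is an assumed dependency.
import numpy as np
from PyEMD import EMD

fs = 1000.0                                     # hypothetical sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
raw = 0.2 * np.exp(-((t - 1.0) / 0.02) ** 2)    # synthetic debris pulse near t = 1 s
raw += 0.5 * np.random.randn(t.size)            # high-frequency background noise

integrated = np.cumsum(raw) / fs                # integral transform (running integral)

imfs = EMD().emd(integrated)                    # empirical mode decomposition
trend = imfs[-1]                                # slowest component taken as the trend (assumption)
detected = integrated - trend                   # de-trended signal carrying the pulse
print(round(float(detected.max()), 3))          # peak of the de-trended signal
```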
2508 Machine Learning Methods for Flood Hazard Mapping
Authors: S. Zappacosta, C. Bove, M. Carmela Marinelli, P. di Lauro, K. Spasenovic, L. Ostano, G. Aiello, M. Pietrosanto
Abstract:
This paper proposes a neural network approach for assessing flood hazard mapping. The core of the model is a machine learning component fed by frequency ratios, namely statistical correlations between flood event occurrences and a selected number of topographic properties. The classification capability was compared with the flood hazard maps of the River Basin Plans (Piani Assetto Idrogeologico, abbreviated as PAI) designed by ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), the Italian Institute for Environmental Protection and Research, which encode four increasing flood hazard levels. The study area of Piemonte, an Italian region, has been considered without loss of generality. The frequency ratios may be used as a standalone block to model the flood hazard mapping. Nevertheless, the mixture with a neural network improves the classification power by several percentage points, and may be proposed as a basic tool to model the flood hazard map in a wider scope.
Keywords: flood modeling, hazard map, neural networks, hydrogeological risk, flood risk assessment
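A hedged sketch of the two-stage idea follows: frequency ratios (the flood-occurrence share of a class of a topographic property divided by its area share) are computed per binned property and then fed to a small neural network classifier. The file name, property names, bin count and hazard labels are illustrative assumptions.

```python
# Hedged sketch: frequency-ratio features feeding an MLP hazard classifier.
# Column names, properties and hazard labels are hypothetical.
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPClassifier

cells = pd.read_csv("grid_cells.csv")                    # one row per map cell (assumed)
properties = ["slope", "elevation", "distance_to_river"] # hypothetical topographic factors

fr_features = pd.DataFrame(index=cells.index)
for p in properties:
    bins = pd.qcut(cells[p], q=10, labels=False, duplicates="drop")
    fr = {}
    for b in np.unique(bins):
        in_bin = bins == b
        flood_share = cells.loc[in_bin, "flooded"].sum() / max(cells["flooded"].sum(), 1)
        area_share = in_bin.mean()
        fr[b] = flood_share / area_share                 # frequency ratio of this class
    fr_features[p + "_fr"] = bins.map(fr)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(fr_features, cells["hazard_class"])              # hypothetical PAI levels 1..4
```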
2507 What the Future Holds for Social Media Data Analysis
Authors: P. Wlodarczak, J. Soar, M. Ally
Abstract:
The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews on products and services they bought, write about their interests, share ideas or give their opinions and views on political issues. There is a growing interest from organisations in the analysis of SM data for detecting new trends, obtaining user opinions on their products and services or finding out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators of historic SM data are represented as time series and correlated with a variety of real-world phenomena like the outcome of elections, the development of financial indicators, box office revenue and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.
Keywords: Social Media, text mining, knowledge discovery, predictive analysis, machine learning.
2506 A Combinatorial Approach to Planning Manufacturing Safety Programme
Authors: Kazeem A. Adebiyi
Abstract:
Despite many success stories of manufacturing safety, many organizations are still reluctant, perceiving it as cost increasing and time consuming. A clear contributor may be the use of lagging rather than leading indicator measures. The study therefore proposes a combinatorial model for determining the best safety strategy. Combination theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of preventions and the cost of the prevention strategy. Documentation, interviews and a structured questionnaire were employed to collect before-and-after safety programme records from a tobacco company for the periods 1993-2001 (pre-safety) and 2002-2008 (safety period) for the model application. Three combinatorial alternatives A, B and C were obtained, resulting in 4, 6 and 4 strategies respectively, with PPE and training being predominant. A total of 728 accidents were recorded over the 9-year pre-safety period and 163 accidents over the 7-year safety programme period. Six prevention activities (alternative B) yielded the best results in all years of operation except 2004. The study provides a leading resource for planning a successful safety programme.
Keywords: Combination, Manufacturing Safety, Monetary Savings, Prevention Strategies.
2505 Knowledge Based Wear Particle Analysis
Authors: Mohammad S. Laghari, Qurban A. Memon, Gulzar A. Khuwaja
Abstract:
The paper describes a knowledge based system for analysis of microscopic wear particles. Wear particles contained in lubricating oil carry important information concerning machine condition, in particular the state of wear. Experts (Tribologists) in the field extract this information to monitor the operation of the machine and ensure safety, efficiency, quality, productivity, and economy of operation. This procedure is not always objective and it can also be expensive. The aim is to classify these particles according to their morphological attributes of size, shape, edge detail, thickness ratio, color, and texture, and by using this classification thereby predict wear failure modes in engines and other machinery. The attribute knowledge links human expertise to the devised Knowledge Based Wear Particle Analysis System (KBWPAS). The system provides an automated and systematic approach to wear particle identification which is linked directly to wear processes and modes that occur in machinery. This brings consistency in wear judgment prediction which leads to standardization and also less dependence on Tribologists.
Keywords: Computer vision, knowledge based systems, morphology, wear particles.
2504 Prediction of Research Topics Using Ensemble of Best Predictors from Similar Dataset
Authors: Indra Budi, Rizal Fathoni Aji, Agus Widodo
Abstract:
Prediction of future research topics using time series analysis, either statistical or machine learning based, has been conducted previously by several researchers. Several methods have been proposed to combine the forecasting results into a single forecast. These methods use a fixed combination of individual forecasts to obtain the final forecast. In this paper, a quite different approach is employed to select the forecasting methods, in which every point to be forecast is calculated using the methods that performed best on a similar validation dataset. The dataset used in the experiment is a set of time series derived from research reports in Garuda, an online site belonging to the Ministry of Education in Indonesia, over the past 20 years. The experimental results demonstrate that the proposed method may perform better than the fixed combination of predictors. In addition, based on the prediction results, we can forecast emerging research topics for the next few years.
Keywords: Combination, emerging topics, ensemble, forecasting, machine learning, prediction, research topics, similarity measure, time series.
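The per-point selection idea can be sketched as follows: for each horizon point, the forecast is taken from whichever candidate method had the lowest error at that horizon on the most similar validation series. The three candidate forecasters and the Euclidean similarity measure below are simplifying assumptions, not the paper's actual choices.

```python
# Hedged sketch of per-point best-predictor selection from a similar validation dataset.
import numpy as np

def naive(history, h):                     # repeat the last observation
    return np.repeat(history[-1], h)

def drift(history, h):                     # extrapolate the average historical step
    step = (history[-1] - history[0]) / (len(history) - 1)
    return history[-1] + step * np.arange(1, h + 1)

def moving_average(history, h, w=3):       # repeat the mean of the last w points
    return np.repeat(history[-w:].mean(), h)

METHODS = [naive, drift, moving_average]

def forecast(target_history, validation_sets, h):
    """validation_sets: list of (history, actual_future) pairs, len(actual_future) == h."""
    # pick the validation series whose history is most similar to the target's
    sims = [np.linalg.norm(target_history[-len(v[0]):] - v[0]) for v in validation_sets]
    val_hist, val_future = validation_sets[int(np.argmin(sims))]
    # for every horizon point, keep the method with the smallest validation error there
    val_preds = np.array([m(val_hist, h) for m in METHODS])
    best = np.abs(val_preds - val_future).argmin(axis=0)
    target_preds = np.array([m(target_history, h) for m in METHODS])
    return target_preds[best, np.arange(h)]

hist = np.array([3.0, 4.0, 6.0, 7.0, 9.0, 10.0])
vals = [(np.array([2.0, 3.0, 5.0, 6.0, 8.0, 9.0]), np.array([10.0, 11.0, 13.0]))]
print(forecast(hist, vals, h=3))
```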
2503 Performance Monitoring of the Refrigeration System with Minimum Set of Sensors
Authors: Radek Fisera, Petr Stluka
Abstract:
This paper describes a methodology for remote performance monitoring of retail refrigeration systems. The proposed framework starts with monitoring of the whole refrigeration circuit, which allows detecting deviations from expected behavior caused by various faults and degradations. The subsequent diagnostic methods drill down deeper in the equipment hierarchy to more specifically determine root causes. An important feature of the proposed concept is that it does not require any additional sensors, and thus the performance monitoring solution can be deployed at a low installation cost. Moreover, only a minimum of contextual information is required, which also substantially reduces the time and cost of the deployment process.
Keywords: Condition monitoring, energy baselining, fault detection and diagnostics, commercial refrigeration.
2502 Adaptive Motion Estimator Based on Variable Block Size Scheme
Authors: S. Dhahri, A. Zitouni, H. Chaouch, R. Tourki
Abstract:
This paper presents an adaptive motion estimator that can be dynamically reconfigured with the best algorithm depending on the nature of the video during the lifetime of a running application. The 4 Step Search (4SS) and the Gradient Search (GS) algorithms are integrated in the estimator in order to be used for rapid and slow video sequences respectively. The Full Search Block Matching (FSBM) algorithm has also been integrated in order to be used for video sequences that are not real-time oriented. In order to efficiently reduce the computational cost while achieving better visual quality with low power cost, the proposed motion estimator is based on a Variable Block Size (VBS) scheme that uses only the 16x16, 16x8, 8x16 and 8x8 modes. Experimental results show that the adaptive motion estimator gives better results in terms of Peak Signal to Noise Ratio (PSNR), computational cost, occupied FPGA area, and dissipated power relative to the most popular variable block size schemes presented in the literature.
Keywords: H264, configurable motion estimator, variable block size, PSNR, dissipated power.
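A minimal sketch of the FSBM baseline mentioned above is given below: for a 16x16 block, the reference frame is searched exhaustively within a fixed window for the candidate minimizing the sum of absolute differences (SAD). The search range and the synthetic frames are illustrative assumptions.

```python
# Hedged sketch of Full Search Block Matching with a SAD criterion on NumPy frames.
import numpy as np

def full_search(cur, ref, bx, by, block=16, search_range=7):
    """Motion vector (dy, dx) of the block whose top-left corner in `cur` is (by, bx)."""
    target = cur[by:by + block, bx:bx + block].astype(np.int32)
    best_sad, best_mv = None, (0, 0)
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue                                   # candidate falls outside the frame
            cand = ref[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(target - cand).sum())         # sum of absolute differences
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv, best_sad

ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, (2, -3), axis=(0, 1))       # current frame = reference shifted by (2, -3)
print(full_search(cur, ref, bx=16, by=16))     # expected motion vector (-2, 3), SAD 0
```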
2501 On the Learning of Causal Relationships between Banks in Saudi Equities Market Using Ensemble Feature Selection Methods
Authors: Adel Aloraini
Abstract:
Financial forecasting using machine learning techniques has received great attention in the last decade. In this ongoing work, we show how machine learning of graphical models can infer visualized causal interactions between different banks in the Saudi equities market. One important discovery from such learned causal graphs is how companies influence each other and to what extent. In this work, a set of graphical models, namely Gaussian graphical models with developed ensemble penalized feature selection methods that combine a filtering method, a wrapper method and a regularizer, is presented. A comparison between these different developed ensemble combinations is also shown. The best ensemble method is used to infer the causal relationships between banks in the Saudi equities market.
Keywords: Causal interactions, banks, feature selection, regularizer.
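As a hedged illustration of the regularizer component only, the sketch below fits an l1-penalized Gaussian graphical model (graphical lasso) to synthetic bank returns and reads interactions off the non-zero entries of the precision matrix. The ensemble filter and wrapper stages, and the actual market data, are not reproduced.

```python
# Hedged sketch: graphical-lasso estimate of a Gaussian graphical model over synthetic
# bank returns; bank names and data are placeholders.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
banks = ["Bank_A", "Bank_B", "Bank_C", "Bank_D", "Bank_E"]
returns = rng.normal(size=(250, len(banks)))
returns[:, 1] += 0.8 * returns[:, 0]          # inject one dependency for illustration

model = GraphicalLasso(alpha=0.05).fit(returns)
precision = model.precision_
for i in range(len(banks)):
    for j in range(i + 1, len(banks)):
        if abs(precision[i, j]) > 1e-3:       # an edge (conditional dependence) in the graph
            print(banks[i], "--", banks[j], round(precision[i, j], 3))
```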
2500 AGV Guidance System: An Application of Simple Active Contour for Visual Tracking
Authors: M. Asif, M. R. Arshad, P. A. Wilson
Abstract:
In this paper, a simple active contour based visual tracking algorithm is presented for an outdoor AGV application which is currently under development at the USM robotic research group (URRG) lab. The presented algorithm is computationally low cost, able to track road boundaries in an image sequence, and can easily be implemented on available low cost hardware. The proposed algorithm uses active shape modeling with a B-spline deformable template and a recursive curve fitting method to track the current orientation of the road.
Keywords: Active contour, B-spline, recursive curve fitting.
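A minimal sketch of the curve-fitting step is shown below: a smoothed cubic B-spline is fitted through candidate road-boundary points and evaluated densely. The points are illustrative, and the deformable-template and recursive-fitting details of the paper are not reproduced.

```python
# Hedged sketch: fit and sample a smoothed cubic B-spline through hypothetical
# road-boundary pixel coordinates.
import numpy as np
from scipy.interpolate import splprep, splev

x = np.array([ 20,  60, 110, 160, 220, 280, 320], dtype=float)   # boundary x (pixels)
y = np.array([470, 400, 330, 260, 200, 150, 120], dtype=float)   # boundary y (pixels)

tck, _ = splprep([x, y], s=50.0, k=3)         # smoothed parametric cubic B-spline
u = np.linspace(0, 1, 200)
xs, ys = splev(u, tck)                        # dense samples of the tracked boundary
print(np.round(xs[:3], 1), np.round(ys[:3], 1))
```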
2499 The Efficiency of Mechanization in Weed Control in Artificial Regeneration of Oriental Beech (Fagus orientalis Lipsky.)
Authors: Tuğrul Varol, Halil Barış Özel
Abstract:
In this study, conducted in the Akçasu Forest Range District of the Devrek Forest Directorate, 3 methods used in weed control during the regeneration of degraded oriental beech forests (weed control with labourer power, cover removal with a Hitachi F20 excavator, and weed control with agricultural equipment mounted on a Ferguson 240S agricultural tractor) have been compared. In this respect, the 3 methods have been compared by determining work hours and standard durations for unit areas (1 hectare). For this purpose, by evaluating the tasks performed with human and machine power in terms of duration, productivity and costs, the aim was to determine the most productive method under the actual ecological conditions of the research field. Within the scope of the study, time studies have been conducted for the 3 methods used in weed control efforts. While carrying out those studies, the performed implementations have been evaluated by dividing them into work stages. Also, actual data have been used while calculating the cost accounts. In those calculations, the latest formulas and equations which are also used in developed countries have been utilized. Analysis of variance (ANOVA) was used to determine whether there is any statistically significant difference among the obtained results, and the Duncan test was used for grouping where a significant difference was found. According to the measurements and findings of this study, it has been found that, in living cover removal during the regeneration of degraded oriental beech forests, the removal of the weed layer in 1 hectare of field has taken 920 hours with labourer power, 15.1 hours with the excavator and 60 hours with the equipment mounted on a tractor. On the other hand, it has been determined that the cost of removal of the living cover in a unit area (1 hectare) was 3220.00 TL for labourer power, 1250 TL for the excavator and 1825 TL for the equipment mounted on a tractor. According to the obtained results, the utilization of the excavator in weed control during the regeneration of degraded oriental beech areas, under the actual ecological conditions of the research field, has been found to be more productive in terms of both duration and cost. These determinations should be repeated in weed control efforts in degraded forest areas with different ecological conditions, as this is essential for finding the most efficient weed control method. These findings will guide the technical staff of the forestry directorate in determining the most effective and economical weed control method. Thus, more realistic data will be used while preparing weed control budgets, and there will be significant contributions to the national economy. Also, the results of this and similar studies are very important for developing the policies for our forestry in the short and long term.
Keywords: Artificial regeneration, weed control, oriental beech, productivity, mechanization, man power, cost analysis.
2498 Risk Factors for Defective Autoparts Products Using Bayesian Method in Poisson Generalized Linear Mixed Model
Authors: Pitsanu Tongkhow, Pichet Jiraprasertwong
Abstract:
This research investigates risk factors for defective products in autoparts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) in which the dependent variable, the number of defective products, has a Poisson distribution is adopted. Its performance is compared with the Poisson GLM under a Bayesian framework. The factors considered are production process, machines, and workers. The products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production Process factor, the highest risk of producing defective products is Process 1, for the Machine factor, the highest risk is Machine 5, and for the Worker factor, the highest risk is Worker 6.
Keywords: Defective autoparts products, Bayesian framework, Generalized linear mixed model (GLMM), Risk factors.
2497 An Approach for Optimization of Functions and Reducing the Value of the Product by Using Virtual Models
Authors: A. Bocevska, G. Todorov, T. Neshkov
Abstract:
A newly developed approach for Functional Cost Analysis (FCA), based on virtual prototyping (VP) models in a CAD/CAE environment, applicable and necessary in developing new products, is presented. It is an instrument for improving the value of the product while maintaining costs and/or reducing the costs of the product without reducing value. Five broad classes of VP methods are identified. Efficient use of prototypes in FCA is a vital activity that can make the difference between successful and unsuccessful entry of new products into the competitive world market. Successful realization of this approach is illustrated for a specific example using a press joint power tool.
Keywords: CAD/CAE environment, Functional Cost Analysis (FCA), Virtual prototyping (VP) models.
2496 Human Action Recognition Based on Ridgelet Transform and SVM
Authors: A. Ouanane, A. Serir
Abstract:
In this paper, a novel algorithm based on the Ridgelet transform and a support vector machine is proposed for human action recognition. The Ridgelet transform is a directional multi-resolution transform and is more suitable for describing human actions, as its directional information is used to form spatial feature vectors. The dynamic transition between the spatial features is captured using both Principal Component Analysis and the k-means clustering algorithm. First, Principal Component Analysis is used to reduce the dimensionality of the obtained vectors. Then, the k-means algorithm is used to cluster the obtained vectors into the spatio-temporal pattern, called a set-of-labels, according to a given periodicity of the human action. Finally, a Support Vector Machine classifier is used to discriminate between the different human actions. Different tests are conducted on popular datasets, such as Weizmann and KTH. The obtained results show that the proposed method provides a more significant accuracy rate and is more robust in very challenging situations such as lighting changes, scaling and dynamic environments.
Keywords: Human action, Ridgelet Transform, PCA, K-means, SVM.
2495 Genetic Algorithm Application in a Dynamic PCB Assembly with Carryover Sequence-Dependent Setups
Authors: M. T. Yazdani Sabouni, Rasaratnam Logendran
Abstract:
We consider a typical problem in the assembly of printed circuit boards (PCBs) in a two-machine flow shop system to simultaneously minimize the weighted sum of weighted tardiness and weighted flow time. The investigated problem is a group scheduling problem in which PCBs are assembled in groups and the interest is to find the best sequence of groups, as well as of the boards within each group, to minimize the objective function value. The type of setup operation between any two board groups is characterized as carryover sequence-dependent setup time, which exactly matches the real application of this problem. As a technical constraint, all of the boards must be kitted before the assembly operation starts (kitting operation), and by kitting staff. The main idea developed in this paper is to completely eliminate the role of kitting staff by assigning the task of kitting to the machine operator during the time he is idle, which is referred to as integration of internal (machine) and external (kitting) setup times. Performing the kitting operation, which is a preparation process for the next set of boards while the other boards are currently being assembled, results in the boards continuously entering the system, i.e. having dynamic arrival times. Consequently, a dynamic PCB assembly system is introduced for the first time in the assembly of PCBs, which also has characteristics similar to those of just-in-time manufacturing. The problem investigated is computationally very complex, meaning that finding the optimal solutions, especially when the problem size gets larger, is impossible. Thus, a heuristic based on a Genetic Algorithm (GA) is employed. An example problem on the application of the developed GA is demonstrated, and numerical results of applying the GA on solving several instances are also provided.
Keywords: Genetic algorithm, dynamic PCB assembly, carryover sequence-dependent setup times, multi-objective.
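To make the GA element concrete, the sketch below evolves board sequences with tournament selection, order crossover and swap mutation. The objective used here is a simple single-machine weighted flow time plus tardiness surrogate, an illustrative assumption rather than the paper's full two-machine model with carryover sequence-dependent setups.

```python
# Hedged GA sketch for sequencing: permutation individuals, tournament selection,
# order crossover (OX) and swap mutation.  Processing times, due dates and weights
# are hypothetical; the objective is a simplified surrogate.
import random

proc   = [4, 3, 6, 2, 5]          # hypothetical processing times per board
due    = [6, 8, 20, 10, 18]       # hypothetical due dates
weight = [1, 2, 1, 3, 1]          # hypothetical weights

def cost(seq):
    t, total = 0, 0.0
    for j in seq:
        t += proc[j]
        total += weight[j] * t + weight[j] * max(0, t - due[j])   # flow time + tardiness
    return total

def order_crossover(a, b):
    i, j = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    return [fill.pop(0) if c is None else c for c in child]

def ga(pop_size=30, generations=200):
    pop = [random.sample(range(len(proc)), len(proc)) for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            a = min(random.sample(pop, 3), key=cost)        # tournament selection
            b = min(random.sample(pop, 3), key=cost)
            c = order_crossover(a, b)
            if random.random() < 0.2:                       # swap mutation
                i, j = random.sample(range(len(c)), 2)
                c[i], c[j] = c[j], c[i]
            offspring.append(c)
        pop = sorted(pop + offspring, key=cost)[:pop_size]  # elitist replacement
    return pop[0], cost(pop[0])

print(ga())
```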
2494 Current Status of Industry 4.0 in Material Handling Automation and In-house Logistics
Authors: Orestis K. Efthymiou, Stavros T. Ponis
Abstract:
In the last decade, a new industrial revolution seems to be emerging, supported, once again, by the rapid advancements of Information Technology in the areas of Machine-to-Machine (M2M) communication, permitting large numbers of intelligent devices, e.g. sensors, to communicate with each other and take decisions without any, or with minimum, indirect human intervention. The advent of these technologies has triggered the emergence of a new category of hybrid (cyber-physical) manufacturing systems, combining advanced manufacturing techniques with innovative M2M applications based on the Internet of Things (IoT), under the umbrella term Industry 4.0. Even though the topic of Industry 4.0 has attracted much attention during the last few years, attempts at providing a systematic literature review of the subject are scarce. In this paper, we present the authors' initial study of the field, with a special focus on the use and applications of Industry 4.0 principles in material handling automation and in-house logistics. Research shows that despite the vivid discussion and attractiveness of the subject, there are still many challenges and issues that have to be addressed before Industry 4.0 becomes standardized and widely applicable.
Keywords: Industry 4.0, internet of things, manufacturing systems, material handling, logistics.
2493 Identification of Arousal and Relaxation by using SVM-Based Fusion of PPG Features
Authors: Chi Jung Kim, Mincheol Whang, Eui Chul Lee
Abstract:
In this paper, we propose a new method to distinguish between arousal and relaxation states by using multiple features acquired from a photoplethysmogram (PPG) and a support vector machine (SVM). To induce arousal and relaxation states in subjects, 2 kinds of sound stimuli are used, and the corresponding biosignals are obtained using the PPG sensor. Two features, pulse-to-pulse interval (PPI) and pulse amplitude (PA), are extracted from the acquired PPG data, and a nonlinear classification between arousal and relaxation is performed using the SVM. This methodology has several advantages when compared with previous similar studies. Firstly, we extracted 2 separate features from the PPG, i.e., PPI and PA. Secondly, in order to improve the classification accuracy, SVM-based nonlinear classification was performed. Thirdly, to solve classification problems caused by generalized features over whole subjects, we defined each threshold according to individual features. Experimental results showed that the average classification accuracy was 74.67%. Also, the proposed method showed better identification performance than single-feature-based methods. From this result, we confirmed that arousal and relaxation can be classified using the SVM and PPG features.
Keywords: Support Vector Machine, PPG, Emotion Recognition, Arousal, Relaxation.
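A hedged sketch of the feature extraction and classification chain follows: pulses are detected in the PPG waveform, PPI and PA are derived from them, and per-trial statistics are fed to an RBF-kernel SVM. The synthetic signal generator and labels are illustrative assumptions, not the paper's sound-stimulus protocol.

```python
# Hedged sketch: PPI/PA feature extraction from a PPG-like signal and SVM classification.
import numpy as np
from scipy.signal import find_peaks
from sklearn.svm import SVC

fs = 100.0                                            # hypothetical sampling rate, Hz

def ppg_features(signal):
    peaks, props = find_peaks(signal, distance=int(0.4 * fs), height=0.0)
    ppi = np.diff(peaks) / fs                         # pulse-to-pulse intervals (s)
    pa = props["peak_heights"]                        # pulse amplitudes
    return [ppi.mean(), ppi.std(), pa.mean(), pa.std()]

rng = np.random.default_rng(1)
X, y = [], []
for label, rate in ((0, 1.0), (1, 1.5)):              # 0 = relaxation, 1 = arousal (assumed)
    for _ in range(20):
        t = np.arange(0, 30, 1 / fs)
        sig = np.sin(2 * np.pi * rate * t) + 0.1 * rng.normal(size=t.size)
        X.append(ppg_features(sig))
        y.append(label)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))
```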
2492 Online Robust Model Predictive Control for Linear Fractional Transformation Systems Using Linear Matrix Inequalities
Authors: Peyman Sindareh Esfahani, Jeffery Kurt Pieper
Abstract:
In this paper, the problem of robust model predictive control (MPC) for discrete-time linear systems in linear fractional transformation form with structured uncertainty and norm-bounded disturbance is investigated. The problem of minimization of the cost function for MPC design is converted to minimization of the worst case of the cost function. Then, this problem is reduced to minimization of an upper bound of the cost function subject to a terminal inequality satisfying the l2-norm of the closed loop system. The characteristic of the linear fractional transformation system is taken into account, and by using some mathematical tools, the robust predictive controller design problem is turned into a linear matrix inequality minimization problem. Afterwards, a formulation which includes an integrator to improve the performance of the proposed robust model predictive controller in steady state condition is studied. The validity of the approaches is illustrated through a robust control benchmark problem.
Keywords: Linear fractional transformation, linear matrix inequality, robust model predictive control, state feedback control.
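The min-max step described above can be written generically as below; the quadratic stage weights Q and R and the state-feedback form of the control law are illustrative assumptions consistent with standard LMI-based robust MPC, not details quoted from the paper.

```latex
% Generic worst-case reformulation: the nominal cost is replaced by its maximum over
% the admissible uncertainty set, which is then bounded from above by \gamma.
\begin{align}
  \min_{u(k+i|k)} \; \max_{\Delta \in \boldsymbol{\Delta}} \; J(k)
     &= \sum_{i=0}^{\infty} \left( \|x(k+i|k)\|_{Q}^{2} + \|u(k+i|k)\|_{R}^{2} \right), \\
  \text{subject to} \quad \max_{\Delta \in \boldsymbol{\Delta}} J(k) &\le \gamma,
\end{align}
% with the bound \gamma and a state-feedback law u(k+i|k) = F x(k+i|k) obtained at each
% sampling instant from a set of linear matrix inequalities.
```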
2491 Machine Learning Techniques for COVID-19 Detection: A Comparative Analysis
Authors: Abeer Aljohani
Abstract:
The spread of the COVID-19 virus has been one of the most extreme pandemics across the globe. It is also referred to as coronavirus, a contagious disease that continuously mutates into numerous variants. Currently, the B.1.1.529 variant, labeled Omicron, is detected in South Africa. The huge spread of COVID-19 has affected many lives and has placed exceptional pressure on healthcare systems worldwide. Also, everyday life and the global economy have been at stake. Numerous COVID-19 cases have produced a huge burden on hospitals as well as health workers. To reduce this burden, this paper predicts COVID-19 disease based on the symptoms and medical history of the patient. As machine learning is a widely accepted area and gives promising results for healthcare, this research presents an architecture for COVID-19 detection using ML techniques integrated with feature dimensionality reduction. This paper uses a standard University of California Irvine (UCI) dataset for predicting COVID-19 disease. This dataset comprises symptoms of 5434 patients. This paper also compares several supervised ML techniques on the presented architecture. The architecture utilizes a 10-fold cross-validation process for generalization and the Principal Component Analysis (PCA) technique for feature reduction. Standard parameters are used to evaluate the proposed architecture, including F1-Score, precision, accuracy, recall, Receiver Operating Characteristic (ROC) and Area Under the Curve (AUC). The results show that decision tree, random forest and neural networks outperform all other state-of-the-art ML techniques. This result can be used to effectively identify COVID-19 infection cases.
Keywords: Supervised machine learning, COVID-19 prediction, healthcare analytics, Random Forest, Neural Network.
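A hedged sketch of the evaluation pipeline described above is given below: PCA for feature reduction followed by a supervised classifier, scored with 10-fold cross-validation. The file name and column layout are assumptions, and the UCI symptoms dataset itself is not reproduced here.

```python
# Hedged sketch: PCA feature reduction + classifier under 10-fold cross-validation.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

data = pd.read_csv("covid_symptoms.csv")          # hypothetical file with 5434 rows
X = data.drop(columns=["covid_result"])           # hypothetical label column
y = data["covid_result"]

pipe = make_pipeline(StandardScaler(), PCA(n_components=0.95),
                     RandomForestClassifier(n_estimators=300, random_state=0))
scores = cross_val_score(pipe, X, y, cv=10, scoring="f1")
print(scores.mean())
```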
2490 Conceptual Overview of Housing Affordability in Selangor, Malaysia
Authors: M. S. Suhaida, N. M. Tawil, N. Hamzah, A. I. Che-Ani, M.M. Tahir
Abstract:
The socioeconomic stability and development of a country can be described by housing affordability. The aim is to ensure that the housing provided is affordable by every income earner group, whether low-income, middle-income or high-income. The research carried out here is to find the level of home ownership affordability for the first medium-cost landed house among the middle-income group in Selangor, Malaysia. It is also hoped that it can contribute to the knowledge and understanding of the housing affordability level of the middle-income group and of the variables that influence this group's ability to own their first medium-cost house.
Keywords: Residential, Housing Affordability, Middle income.
2489 Formation and Evaluation of Lahar/HDPE Hybrid Composite as a Structural Material for Household Biogas Digester
Authors: Lady Marianne E. Polinga, Candy C. Mercado, Camilo A. Polinga
Abstract:
This study was an investigation of the suitability of a Lahar/HDPE composite as a primary material for low-cost small-scale biogas digesters. While sources of raw materials for biogas are abundant in the Philippines, the cost of the technology has made the widespread utilization of this resource an indefinite proposition. Aside from capital economics, another problem arises with the space requirements of current digester designs. These problems may be simultaneously addressed by fabricating digesters on a smaller, household scale to reach a wider market, and by using materials that may accommodate optimization of overall design and fabrication cost without sacrificing operational efficiency. This study involved actual fabrication of the Lahar/HDPE composite at varying composition and geometry, subsequent mechanical and thermal characterization, and implementation of statistical analysis to find intrinsic relationships between variables. From the results, the Lahar/HDPE composite was found to be feasible for use as a digester material from both mechanical and economic standpoints.
Keywords: Biogas digester, Composite, High density polyethylene, Lahar.
2488 Valorization of Industrial Wastes on Hybrid Low Embodied Carbon Cement Based Mortars
Authors: Z. Abdollahnejad, M. Mastali, F. Pacheco-Torgal
Abstract:
Waste reuse is crucial in a context of circular economy and zero-waste sustainability needs. Some wastes deserve further study by the scientific community, not only because they are generated in high amounts but also because they have a low reuse rate. This paper reports results of 32 hybrid cement mortars based on fly ash and waste glass. They allow exploring the influence of mix design on the cost and on the embodied carbon of the hybrid cement mortars. The embodied carbon data for all constituents were taken from the Ecoinvent database. This study led to the development of a mixture with just 70 kg CO2e.
Keywords: Waste reuse, fly ash, waste glass, hybrid cements, cost, embodied carbon.
2487 Utilization of Process Mapping Tool to Enhance Production Drilling in Underground Metal Mining Operations
Authors: Sidharth Talan, Sanjay Kumar Sharma, Eoin Joseph Wallace, Nikita Agrawal
Abstract:
Underground mining is at the core of the rapidly evolving metals and minerals sector due to the increasing mineral consumption globally. Even though surface mines are still more abundant on earth, the scales of the industry are slowly tipping towards underground mining due to the rising depth and complexity of orebodies. Thus, the efficient and productive functioning of underground operations depends significantly on the synchronized performance of key elements such as the operating site, mining equipment, manpower and mine services. Production drilling is the process of conducting long hole drilling for the purpose of charging and blasting these holes for the production of ore in underground metal mines. Thus, production drilling is a crucial segment in the underground metal mining value chain. This paper presents a process mapping tool to evaluate the production drilling process in underground metal mining operations by dividing the given process into three segments, namely Input, Process and Output. The three segments are further segregated into factors and sub-factors. As per the study, the major input factors crucial for the efficient functioning of the production drilling process are power, drilling water, geotechnical support of the drilling site, skilled drilling operators, a services installation crew, oils and drill accessories for the drilling machine, survey markings at the drill site, proper housekeeping, regular maintenance of the drill machine, suitable transportation for reaching the drilling site and, finally, proper ventilation. The major outputs of the production drilling process are ore, waste as a result of dilution, timely reporting and investigation of unsafe practices, optimized process time and, finally, well-fragmented blasted material within the specifications set by the mining company. The paper also exhibits the drilling loss matrix, which is utilized to appraise the loss in planned production meters per day in a mine on account of availability loss in the machine due to breakdowns, underutilization of the machine, and productivity loss in the machine measured in drilling meters per unit of percussion hour with respect to its planned productivity for the day. These three losses are essential for detecting the bottlenecks in the process map of the production drilling operation, so as to instigate an action plan to suppress or prevent the causes leading to the operational performance deficiency. The given tool is beneficial for mine management to focus on the critical factors negatively impacting the production drilling operation and to design the necessary operational and maintenance strategies to mitigate them.
Keywords: Process map, drilling loss matrix, availability, utilization, productivity, percussion rate.
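A small numerical sketch of the drilling loss matrix idea follows: the shortfall between planned and achieved drilling metres for a day is decomposed into availability, utilization and productivity losses. All figures and the exact decomposition are illustrative assumptions rather than the paper's calibrated values.

```python
# Hedged sketch: decompose a day's drilling shortfall into availability, utilization
# and productivity losses.  All numbers are hypothetical.
planned_metres    = 400.0    # planned production drilling metres for the day
planned_hours     = 10.0     # planned percussion hours
planned_rate      = planned_metres / planned_hours   # planned metres per percussion hour

breakdown_hours   = 1.5      # machine down for repairs
idle_hours        = 1.0      # machine available but not drilling (underutilization)
actual_rate       = 34.0     # achieved metres per percussion hour

available_hours   = planned_hours - breakdown_hours
drilling_hours    = available_hours - idle_hours

availability_loss = breakdown_hours * planned_rate                 # metres lost to breakdowns
utilization_loss  = idle_hours * planned_rate                      # metres lost to idle time
productivity_loss = drilling_hours * (planned_rate - actual_rate)  # metres lost to slow drilling

achieved_metres = drilling_hours * actual_rate
print(f"achieved {achieved_metres:.0f} m of {planned_metres:.0f} m planned")
print(f"losses: availability {availability_loss:.0f} m, "
      f"utilization {utilization_loss:.0f} m, productivity {productivity_loss:.0f} m")
```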
2486 Adsorption Refrigeration Working Pairs: The State-of-the-Art in the Application
Authors: Ahmed N. Shmroukh, Ahmed Hamza H. Ali, Ali K. Abel-Rahman
Abstract:
The adsorption refrigeration working pair is a vital part and the main component of the adsorption refrigeration machine. Therefore, the key to development lies in the adsorption pair, which leads to the improvement of the adsorption refrigeration machine. In this study, the state-of-the-art in the application of adsorption refrigeration working pairs, for both classical and modern adsorption pairs, is presented, compared and summarized. It is found that the maximum adsorption capacity for the classical working pairs was 0.259 kg/kg for activated carbon/methanol and that for the modern working pairs was 2 kg/kg for Maxsorb III/R-134a. The study concluded that the performance of the adsorption working pairs of adsorption cooling systems still needs further investigation, as does the development of adsorption pairs having higher sorption capacity with low or no environmental impact, in order to build compact, efficient, reliable and long-life adsorption chillers. Also, future research needs to focus on designing adsorption systems that provide efficient heating and cooling of the adsorbent materials by distributing the adsorbent material over the heat exchanger surface, to allow good heat and mass transfer between the adsorbent and the refrigerant.
Keywords: Adsorption, Adsorbent/Adsorbate Pairs, Refrigeration.
2485 Ranking of Performance Measures of GSCM towards Sustainability: Using Analytic Hierarchy Process
Authors: Dixit Garg, S. Luthra, A. Haleem
Abstract:
During recent years, the natural environment has become a challenging topic that business organizations must consider due to economic and ecological impacts and increasing awareness of environmental protection in society. Organizations are trying to achieve the goals of improvement in the environment, low cost, high quality, flexibility and greater customer satisfaction. Performance measurement frameworks are very useful to monitor the performance of any organization. The basic goal of this paper is to identify performance measures of GSCM performance measurement towards sustainability and to rank them within this framework. Five perspectives (environmental, economic, social, operational and cost performance) and nineteen performance measures of GSCM performance towards sustainability have been identified from an extensive literature review. The Analytic Hierarchy Process (AHP) technique has been utilized for ranking these performance perspectives and measures. All pairwise comparisons in AHP have been made on the basis of experts' opinions (selected from academia and industry). Ranking these performance perspectives and measures will help to understand the importance of environmental, economic, social, operational and cost performance in the supply chain.
Keywords: Analytical Hierarchy Process (AHP), Green Supply Chain Management, Performance Measures (PM), Sustainability.
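As a hedged sketch of the AHP step, the snippet below derives priority weights for the five perspectives from a pairwise comparison matrix via the principal eigenvector and checks the consistency ratio (CR below 0.1 is the usual acceptance threshold). The comparison values are illustrative, not the experts' actual judgements.

```python
# Hedged AHP sketch: priority vector from the principal eigenvector of a pairwise
# comparison matrix, plus a consistency check.  Judgement values are illustrative.
import numpy as np

perspectives = ["Environment", "Economic", "Social", "Operational", "Cost"]
# A[i, j] = how much more important perspective i is than j (Saaty's 1-9 scale)
A = np.array([
    [1,   3,   5,   3,   2],
    [1/3, 1,   3,   2,   1],
    [1/5, 1/3, 1,   1/2, 1/3],
    [1/3, 1/2, 2,   1,   1/2],
    [1/2, 1,   3,   2,   1],
])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # priority vector

lam_max = eigvals[k].real
n = A.shape[0]
ci = (lam_max - n) / (n - 1)                   # consistency index
ri = 1.12                                      # random index for n = 5 (Saaty)
print(dict(zip(perspectives, weights.round(3))), "CR =", round(ci / ri, 3))
```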
2484 Color Image Segmentation Using SVM Pixel Classification Image
Authors: K. Sakthivel, R. Nallusamy, C. Kavitha
Abstract:
The goal of image segmentation is to cluster pixels into salient image regions. Segmentation could be used for object recognition, occlusion boundary estimation within motion or stereo systems, image compression, image editing, or image database lookup. In this paper, we present a color image segmentation method using support vector machine (SVM) pixel classification. Firstly, the pixel-level color and texture features of the image are extracted and used as input to the SVM classifier. These features are extracted using the homogeneity model and a Gabor filter. With the extracted pixel-level features, the SVM classifier is trained using FCM (Fuzzy C-Means). The image segmentation takes advantage of both the pixel-level information of the image and the ability of the SVM classifier. The experiments show that the proposed method gives a very good segmentation result and better efficiency, and increases the quality of the image segmentation compared with the other segmentation methods proposed in the literature.
Keywords: Image Segmentation, Support Vector Machine, Fuzzy C–Means, Pixel Feature, Texture Feature, Homogeneity model, Gabor Filter.