Search results for: incremental principal components analysis (IPCA).
9009 A New Weighted LDA Method in Comparison to Some Versions of LDA
Authors: Delaram Jarchi, Reza Boostani
Abstract:
Linear Discriminant Analysis (LDA) is a linear solution for the classification of two classes. In this paper, we propose a variant of LDA for multi-class problems that redefines the between-class and within-class scatter matrices by incorporating a weight function into each of them. The aim is to separate the classes as much as possible; when one class is already well separated from the others, that class should have little influence on the classification. We therefore alleviate the influence of well-separated classes by adding a weight to the between-class and within-class scatter matrices. To obtain a simple and effective weight function, ordinary LDA is applied between every pair of classes to find the Fisher discrimination value, which is then passed as an input to the two weight functions that redefine the scatter matrices. Experimental results showed that the new LDA method improved the classification rate on the glass, iris and wine datasets in comparison to different versions of LDA.
Keywords: Discriminant vectors, weighted LDA, uncorrelation, principal components, Fisher-face method, Bootstrap method.
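The abstract does not give the exact weight functions, but the idea can be sketched: measure the pairwise separation between class means and down-weight well-separated pairs when accumulating the between-class scatter matrix. The weight function below (inverse squared distance between class means, standing in for the Fisher discrimination value) is an assumption for illustration, not the authors' formula.

```python
import numpy as np

def weighted_between_scatter(X, y, weight_fn=lambda d2: 1.0 / d2):
    """Pairwise weighted between-class scatter matrix S_b.

    Each class pair (i, j) contributes with a weight that shrinks as
    the pair becomes easier to separate, so classes that are already
    well separated have little influence on the final projection.
    """
    classes = np.unique(y)
    d = X.shape[1]
    means = {c: X[y == c].mean(axis=0) for c in classes}
    priors = {c: float(np.mean(y == c)) for c in classes}
    Sb = np.zeros((d, d))
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            diff = (means[ci] - means[cj]).reshape(-1, 1)
            d2 = float(diff.T @ diff)        # squared class-mean distance
            w = weight_fn(d2)                # down-weight separated pairs
            Sb += priors[ci] * priors[cj] * w * (diff @ diff.T)
    return Sb
```

The within-class matrix would be weighted analogously, and the discriminant vectors then come from the usual generalized eigenproblem on the two weighted matrices.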
9008 An Assessment of Food Control System and Development Perspective: The Case of Myanmar
Authors: Wai Yee Lin, Masahiro Yamao
Abstract:
Food control measures are critical in fostering the food safety management of a nation. However, no academic study has so far assessed the food control system of Myanmar. The objective of this paper was to assess that system through an in-depth examination of its five key components, using desktop analysis and a short survey of related food safety organizations, including regulators and inspectors. The study showed that the existing food control system is conventional, mainly focusing on a primary health care approach while relying on reactive measures. The achievements of food control work have been limited to a certain extent by the insufficient technical capacity, across various sectors, needed to train staff, upgrade laboratory equipment, provide technical assistance, etc. Assessing food control measures is the first step in the integration of food safety management; this paper could assist policy makers by providing information for enhancing the safety and quality of food produced and consumed in Myanmar.
Keywords: Food Control, Food Policy, Legislation, Management
9007 A New Model for Question Answering Systems
Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour
Abstract:
Most Question Answering (QA) systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems; if it does not work properly, it creates problems for the other sections. Moreover, answer processing is an emerging topic in Question Answering, where systems are often required to rank and validate candidate answers. These techniques, which aim at finding short and precise answers, are often based on semantic classification. This paper discusses a new model for question answering that improves two main modules: question processing and answer processing. Two components form the basis of question processing. The first is question classification, which specifies the types of question and answer. The second is reformulation, which converts the user's question into one the QA system can understand in a specific domain. The answer processing module consists of candidate answer filtering and candidate answer ordering components, and also has a validation section for interacting with the user. This module makes it easier to find the exact answer. In this paper we describe the question and answer processing modules by modeling, implementing and evaluating the system. The system was implemented in two versions. Results show that Version 1 gave correct answers to 70% of questions (30 correct answers to 50 asked questions) and Version 2 gave correct answers to 94% of questions (47 correct answers to 50 asked questions).
Keywords: Answer Processing, Classification, Question Answering, Query Reformulation.
9006 Modelling of Heating and Evaporation of Biodiesel Fuel Droplets
Authors: Mansour Al Qubeissi, Sergei S. Sazhin, Cyril Crua, Morgan R. Heikal
Abstract:
This paper presents the application of the Discrete Component Model for heating and evaporation to multi-component biodiesel fuel droplets in direct injection internal combustion engines. This model takes into account the effects of temperature gradient, recirculation and species diffusion inside droplets. A distinctive feature of the model used in the analysis is that it is based on the analytical solutions to the temperature and species diffusion equations inside the droplets. Nineteen types of biodiesel fuels are considered. It is shown that a simplistic model, based on the approximation of biodiesel fuel by a single component or ignoring the diffusion of components of biodiesel fuel, leads to noticeable errors in predicted droplet evaporation time and time evolution of droplet surface temperature and radius.
Keywords: Heat/Mass Transfer, Biodiesel, Multi-component Fuel, Droplet, Evaporation.
9005 Noise Removal from Surface Respiratory EMG Signal
Authors: Slim Yacoub, Kosai Raoof
Abstract:
The aim of this study was to remove the two principal noises that disturb the surface electromyography (diaphragm) signal: the electrocardiogram (ECG) artefact and the power line interference artefact. The proposed algorithm is based on a Widrow adaptive Least Mean Square (LMS) structure. Such structures require a reference signal that is correlated with the noise contaminating the signal. The noise references are extracted as follows: for the power line interference, a reference is mathematically constructed from two cosine functions, at 50 Hz (the fundamental) and 150 Hz (the first harmonic); for the ECG artefact, a matching pursuit technique combined with an LMS structure is used for the estimation. Both removal procedures are achieved without the use of supplementary electrodes. These filtering techniques are validated on real recordings of the surface diaphragm electromyography signal, and the performance of the proposed methods is compared with previously published results.
Keywords: Surface EMG, Adaptive, Matching Pursuit, Power line interference.
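The power-line part of the scheme can be sketched with a standard Widrow LMS canceller fed by a constructed cosine reference. The tap count, step size, and signal lengths below are illustrative assumptions, and the EMG is replaced by white noise:

```python
import numpy as np

def lms_cancel(noisy, reference, n_taps=8, mu=0.01):
    """Widrow LMS adaptive noise canceller.

    `reference` must be correlated with the interference; the filter
    output converges toward the interference, so the error signal is
    the cleaned-signal estimate.
    """
    w = np.zeros(n_taps)
    cleaned = np.zeros_like(noisy)
    for n in range(n_taps, len(noisy)):
        x = reference[n - n_taps:n][::-1]   # most recent samples first
        y_hat = w @ x                       # interference estimate
        e = noisy[n] - y_hat                # error = cleaned sample
        w += 2 * mu * e * x                 # LMS weight update
        cleaned[n] = e
    return cleaned

np.random.seed(0)
fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
emg = 0.2 * np.random.randn(t.size)          # white-noise stand-in for EMG
mains = np.cos(2 * np.pi * 50 * t + 0.3)     # power line interference
ref = np.cos(2 * np.pi * 50 * t)             # constructed cosine reference
cleaned = lms_cancel(emg + mains, ref)
```

Because the reference and the interference share the same frequency, a short FIR filter can match any phase and amplitude, so the 50 Hz component is removed after the filter converges.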
9004 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events
Authors: Jaqueline M. R. Vieira
Abstract:
Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be feasible and convenient with small images and little data, it becomes difficult and prone to errors when large databases of images must be treated; moreover, the patterns may differ across the image area, depending on many characteristics (drilling strategy, rock components, rock strength, etc.). In this work we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow the system, after a period in which parts of borehole images corresponding to tension regions and breakout areas are manually marked, to automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classification methods in order to achieve different knowledge-dataset configurations.
Keywords: Brazil, classifiers, data-mining, image segmentation, oil well visualization.
9003 Online Signature Verification Using Angular Transformation for e-Commerce Services
Authors: Peerapong Uthansakul, Monthippa Uthansakul
Abstract:
The rapid growth of e-Commerce services has been clearly observed in the past decade. However, the methods used to verify authenticated users still widely depend on numeric approaches, and the search for other verification methods suitable for online e-Commerce is an interesting issue. In this paper, a new online signature-verification method using an angular transformation is presented. Delay shifts existing in online signatures are estimated by an estimation method relying on an angle representation. In the proposed signature-verification algorithm, all components of the input signature are extracted by considering the discontinuous break points in the stream of angular values. The estimated delay shift is then captured by comparison with the selected reference signature, and the matching error is computed as the main feature used in the verification process. The threshold offsets are calculated from the two error characteristics of the signature verification problem, the False Rejection Rate (FRR) and the False Acceptance Rate (FAR); the levels of these two error rates depend on the chosen decision threshold, whose value is set so as to realize the Equal Error Rate (EER; FAR = FRR). The experimental results show that, through a simple program deployed on the Internet to demonstrate e-Commerce services, the proposed method provides 95.39% correct verifications, 7% better than a DP-matching-based signature-verification method. In addition, signature verification with extracted components provides more reliable results than making a decision on the whole signature.
Keywords: Online signature verification, e-Commerce services, angular transformation.
9002 Examining of Tool Wear in Cryogenic Machining of Cobalt-Based Haynes 25 Superalloy
Authors: Murat Sarıkaya, Abdulkadir Güllü
Abstract:
Haynes 25 alloy (also known as L-605 alloy) is a cobalt-based superalloy which has wide applications, such as in the aerospace industry, turbine and furnace parts, power generators, heat exchangers and petroleum refining components, owing to its excellent characteristics. However, the workability of this alloy is poorer than that of normal or even stainless steels. In the present work, an experimental investigation was performed under cryogenic cooling to determine cutting tool wear patterns and obtain optimal cutting parameters in the turning of the cobalt-based superalloy Haynes 25. In the experiments, an uncoated carbide tool was used, and cutting speed (V) and feed rate (f) were considered as test parameters. Tool wear (VBmax) was measured as the process performance indicator. Analysis of variance (ANOVA) was performed to determine the importance of the machining parameters.
Keywords: Cryogenic machining, difficult-to-cut alloy, tool wear, turning.
9001 A Framework for Data Mining Based Multi-Agent: An Application to Spatial Data
Authors: H. Baazaoui Zghal, S. Faiz, H. Ben Ghezala
Abstract:
Data mining is an extraordinarily demanding field concerned with the extraction of implicit knowledge and relationships that are not explicitly stored in databases. A wide variety of data mining methods has been introduced (classification, characterization, generalization, ...), and each of these methods includes more than one algorithm. A data mining system implies different user categories, which means that the user's behavior must be a component of the system. The problem at this level is to know which algorithm of which method to employ for an exploratory end, which one for a decisional end, and how they can collaborate and communicate. The agent paradigm presents a new way of conceiving and realizing data mining systems. The purpose is to combine different data mining algorithms to prepare elements for decision-makers, benefiting from the possibilities offered by multi-agent systems. In this paper the agent framework for data mining is introduced, and its overall architecture and functionality are presented. The validation is made on spatial data, and principal results are presented.
Keywords: Databases, data mining, multi-agent, spatial data mart.
9000 Breast Cancer Survivability Prediction via Classifier Ensemble
Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia
Abstract:
This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then tries to find the most important features among them, i.e. those that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features selected from SEER by the feature selection module. On top of them, another classifier gives the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period 1973-2002), giving an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
Keywords: Classifier ensemble, breast cancer survivability, data mining, SEER.
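The two-level architecture described above is essentially stacking. A minimal sketch with scikit-learn follows; it substitutes the breast-cancer dataset bundled with scikit-learn for SEER (which requires registration) and omits the Bayesian-network base learner, which scikit-learn does not provide:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Base level: two of the three learners named in the abstract.
base = [("tree", DecisionTreeClassifier(max_depth=4, random_state=0)),
        ("nb", GaussianNB())]
# Meta level: Naive Bayes combines the base-level decisions,
# mirroring the paper's best-performing setup.
stack = StackingClassifier(estimators=base, final_estimator=GaussianNB(), cv=3)

X_tr, X_te, y_tr, y_te = train_test_split(
    *load_breast_cancer(return_X_y=True), test_size=0.3, random_state=0)
stack.fit(X_tr, y_tr)
acc = stack.score(X_te, y_te)
```

`StackingClassifier` trains the meta-learner on out-of-fold predictions of the base learners, which is the standard way to feed "output decisions and confidence scores" to a combining classifier without leaking training labels.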
8999 The Contribution of Edgeworth, Bootstrap and Monte Carlo Methods in Financial Data
Authors: Edlira Donefski, Tina Donefski, Lorenc Ekonomi
Abstract:
Edgeworth approximation, bootstrap and Monte Carlo simulations have a considerable impact on achieving certain results related to the problems under study. In our paper, we treat a financial case: the effect that the components of the cash flow of one of the most successful businesses in the world (the financial, operational and investing activities) have on the cash and cash equivalents at the end of the three-month period. To get a better view of this case, we created a Vector Autoregression (VAR) model and then generated the impulse responses in terms of asymptotic analysis (Edgeworth approximation), Monte Carlo simulations and residual bootstrap, based on the standard errors of every series created. The generated results showed common tendencies across the three methods, which consequently verified their advantage in the optimization of a model that contains many variants.
Keywords: Autoregression, Bootstrap, Edgeworth Expansion, Monte Carlo Method.
8998 Performance Analysis of a Series of Adaptive Filters in Non-Stationary Environment for Noise Cancelling Setup
Authors: Anam Rafique, Syed Sohail Ahmed
Abstract:
Noise cancellation is an essential component of many DSP applications. Changes in real-time signals are rapid and swift. In noise cancellation, a reference signal, which is an approximation of the noise signal that corrupts the original information signal, is obtained and then subtracted from the noise-bearing signal to obtain a noise-free signal. This approximation of the noise signal is obtained through adaptive filters, which are self-adjusting. As the changes in real-time signals are abrupt, this requires an adaptive algorithm that converges fast and is stable. Least mean square (LMS) and normalized LMS (NLMS) are two widely used algorithms because of their simplicity in calculation and implementation, but their convergence rates are small. Adaptive averaging filters (AFA) are also used because they converge quickly, but they are less stable. This paper provides a comparative study of LMS, NLMS, AFA and the new enhanced average adaptive (Average NLMS, ANLMS) filters for a noise cancelling application using speech signals.
Keywords: AFA, ANLMS, LMS, NLMS.
8997 Different Approaches for the Design of IFIR Compaction Filter
Authors: Sheeba V.S, Elizabeth Elias
Abstract:
Optimization of filter banks based on knowledge of the input statistics has been of interest for a long time. Finite impulse response (FIR) compaction filters are used in the design of optimal signal-adapted orthonormal FIR filter banks. In this paper we discuss three different approaches for the design of interpolated finite impulse response (IFIR) compaction filters. In the first method, the magnitude-squared response satisfies the Nyquist constraint approximately; in the second and third methods it is satisfied exactly. These methods yield FIR compaction filters whose response is comparable with that of existing methods, while IFIR filters enjoy significant savings in the number of multipliers and can be implemented efficiently. Since an eigenfilter approach is used, the method is of low complexity. The design of IFIR filters in the least-squares sense is presented.
Keywords: Principal Component Filter Bank, Interpolated Finite Impulse Response Filter, Orthonormal Filter Bank, Eigenfilter.
8996 Energy Distribution of EEG Signals: EEG Signal Wavelet-Neural Network Classifier
Authors: I. Omerhodzic, S. Avdakovic, A. Nuhanovic, K. Dizdarevic
Abstract:
In this paper, a wavelet-based neural network (WNN) classifier for recognizing EEG signals is implemented and tested on three sets of EEG signals (healthy subjects, patients with epilepsy, and patients with epileptic syndrome during a seizure). First, the Discrete Wavelet Transform (DWT) with Multi-Resolution Analysis (MRA) is applied to decompose the EEG signal at the resolution levels of its components (δ, θ, α, β and γ), and Parseval's theorem is employed to extract the percentage distribution of energy features of the EEG signal at the different resolution levels. Second, a neural network (NN) classifies these extracted features to identify the EEG type according to the percentage distribution of energy features. The performance of the proposed algorithm has been evaluated using 300 EEG signals in total. The results showed that the proposed classifier is able to recognize and classify EEG signals efficiently.
Keywords: Epilepsy, EEG, Wavelet transform, Energy distribution, Neural network, Classification.
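The energy-percentage features can be sketched with an orthonormal Haar DWT, for which Parseval's theorem guarantees that the per-level coefficient energies sum to the signal energy. The Haar wavelet and the four decomposition levels are illustrative assumptions; the paper's exact wavelet is not stated in the abstract.

```python
import numpy as np

def haar_energy_distribution(signal, levels=4):
    """Percentage of signal energy at each resolution level of a Haar DWT.

    With the orthonormal (1/sqrt(2)) scaling, each split conserves
    energy exactly, so the percentages always sum to 100 and form a
    compact feature vector for a classifier.
    """
    a = np.asarray(signal, dtype=float)
    energies = []
    for _ in range(levels):
        if a.size % 2:                       # pad odd-length level with zero
            a = np.append(a, 0.0)
        approx = (a[0::2] + a[1::2]) / np.sqrt(2)
        detail = (a[0::2] - a[1::2]) / np.sqrt(2)
        energies.append(np.sum(detail ** 2))  # detail energy at this level
        a = approx
    energies.append(np.sum(a ** 2))           # final approximation energy
    total = float(sum(energies))
    return [100.0 * e / total for e in energies]

# Example: a 10 Hz sine sampled at 256 Hz, plus noise
rng = np.random.default_rng(1)
sig = np.sin(2 * np.pi * 10 * np.arange(256) / 256) \
      + 0.1 * rng.standard_normal(256)
dist = haar_energy_distribution(sig, levels=4)   # five percentages
```

In the paper's setting, the levels would be chosen so the detail bands line up with the δ, θ, α, β and γ frequency ranges.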
8995 Synthesis and Analysis of Swelling and Controlled Release Behaviour of Anionic sIPN Acrylamide based Hydrogels
Authors: Atefeh Hekmat, Abolfazl Barati, Ebrahim Vasheghani Frahani, Ali Afraz
Abstract:
In modern agriculture, polymeric hydrogels are known as a component able to hold an amount of water, owing to their 3-dimensional network structure and their tendency to absorb water in humid environments. In addition, these hydrogels are able to controllably release the fertilisers and pesticides loaded into them; they thus deliver these materials to the plants' roots and help them grow. These hydrogels also reduce the pollution of underground water sources by preventing the active components from leaching. In this study, sIPN acrylamide-based hydrogels are synthesised by free-radical polymerisation using acrylamide, potassium acrylate, and linear polyvinyl alcohol. Ammonium nitrate is loaded into the hydrogel as the fertiliser. The effect of various amounts of monomers and linear polymer, measured in molar ratio, on the swelling rate, equilibrium swelling, and release of ammonium nitrate is studied.
Keywords: Hydrogel, controlled release, ammonium nitrate fertiliser, sIPN.
8994 Improving Taint Analysis of Android Applications Using Finite State Machines
Authors: Assad Maalouf, Lunjin Lu, James Lynott
Abstract:
We present a taint analysis that can automatically detect when string operations result in a string that is free of taints, where all the tainted patterns have been removed. This is an improvement on the conservative behavior of previous taint analyzers, where a string operation on a tainted string always leads to a tainted string unless the operation is manually marked as a sanitizer. The taint analysis is built on top of a string analysis that uses finite state automata to approximate the sets of values that string variables can take during the execution of a program. The proposed approach has been implemented as an extension of FlowDroid and experimental results show that the resulting taint analyzer is much more precise than the original FlowDroid.
Keywords: Android, static analysis, string analysis, taint analysis.
8993 A Development of the Multiple Intelligences Measurement of Elementary Students
Authors: Chaiwat Waree
Abstract:
This research aims at the development of the Multiple Intelligences Measurement of Elementary Students. The structural accuracy test and norm establishment are based on Gardner's Multiple Intelligences Theory, which consists of eight aspects, namely linguistics, logic and mathematics, visual-spatial relations, body and movement, music, human relations, self-realization/self-understanding, and nature. The sample used in this research consists of elementary school students (aged between 5-11 years). The size of the sample group, 2,504 students, was determined by the Yamane table, and multistage sampling was used. Basic statistical analysis and construct validity testing were done using confirmatory factor analysis. The research can be summarized as follows. 1) The Multiple Intelligences Measurement, consisting of 120 items, is content-accurate. The internal consistency reliability of the whole measurement, by the Kuder-Richardson method, equals .91; the difficulty of the test items is between .39-.83 and their discrimination between .21-.85. 2) The measurement has construct validity in a good range: all 8 components and all 120 test items are statistically significant at the .01 level. The Chi-square value equals 4357.7 (p = .00) at 244 degrees of freedom; the Goodness of Fit Index equals 1.00, the Adjusted Goodness of Fit Index .92, the Comparative Fit Index (CFI) .68, the Root Mean Squared Residual (RMR) 0.064 and the Root Mean Square Error of Approximation 0.82. 3) The norms of the measurement are categorized into 3 levels: high intelligence at percentiles above 78, moderate intelligence at percentiles between 24 and 77.9, and low intelligence at percentiles of 23.9 and below.
Keywords: Multiple Intelligences, Measurement, Elementary Students.
8992 Investigation of Boll Properties on Cotton Picker Machine Performance
Authors: Shahram Nowrouzieh, Abbas Rezaei Asl, Mohamad Ali Jafari
Abstract:
Cotton, as a strategic crop, plays an important role in providing for human food and clothing needs, because of its oil, protein, and fiber. Iran was one of the largest cotton producers in the world in the past but, unfortunately, for economic reasons its production is now reduced. One way to reduce the cost of cotton production is to expand the mechanization of cotton harvesting; however, Iranian farmers have not accepted cotton harvesters, one reason being the amount of field losses of these machines, so the majority of cotton fields are harvested by hand. Although the correct setting of the harvesting machine is very important for cotton losses, the morphological properties of the cotton plant also affect the performance of cotton harvesters. In this study, the effect of some cotton morphological properties, such as the height of the cotton plant, the number and length of sympodial and monopodial branches, the boll dimensions, boll weight, number of carpels and bract angle, was evaluated on the performance of the cotton picker. The efficiency of a John Deere 9920 spindle cotton picker was investigated on five different Iranian cotton cultivars. The results indicate a significant difference among the five cultivars in machine harvest efficiency: the Golestan cultivar showed the best performance, with an average of 87.6% of total harvestable seed cotton, and the Khorshid cultivar the worst. The principal component analysis showed that, on the component explaining 50.76% of the variance, picker efficiency is affected positively by the bract angle and negatively by the boll dimensions, the number of carpels and the height of the cotton plants. The seed cotton remaining (on the plant and on the ground) after the harvester lay in the same zone of the PCA scatter plot as the boll dimensions and the number of carpels.
Keywords: Cotton, bract, harvester, carpel.
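The PCA step can be sketched as follows; the trait values are synthetic stand-ins, since the measured data are not reproduced in the abstract, and the variable names are illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 50
# Hypothetical stand-ins for the measured traits.
bract_angle = rng.normal(45, 5, n)
boll_len = rng.normal(30, 2, n)
carpels = rng.integers(3, 6, n).astype(float)
height = rng.normal(90, 10, n)
# Hypothetical picker efficiency correlated with the traits as in the
# abstract: positively with bract angle, negatively with the others.
efficiency = (0.5 * bract_angle - 0.3 * boll_len - 2.0 * carpels
              - 0.1 * height + rng.normal(0, 1, n))

X = np.column_stack([bract_angle, boll_len, carpels, height, efficiency])
X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardise before PCA

pca = PCA(n_components=2).fit(X)
loadings = pca.components_                   # trait loadings per component
explained = pca.explained_variance_ratio_    # variance share per component
```

The signs of the first component's loadings show which traits move with picker efficiency, which is how the abstract's positive (bract angle) and negative (boll dimensions, carpels, height) relationships would be read off.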
8991 The Co-application of Plant Growth Promoting Rhizobacteria and Inoculation with Rhizobium Bacteria on Grain Yield and Its Components of Mungbean (Vigna radiate L.) in Ilam Province, Iran
Authors: Abdollah Hosseini, Abbas Maleki, Khalil Fasihi, Rahim Naseri
Abstract:
In order to investigate the effect of Plant Growth Promoting Rhizobacteria (PGPR) and rhizobium bacteria on grain yield and some agronomic traits of mungbean (Vigna radiate L.), an experiment was carried out based on a randomized complete block design with three replications in Malekshahi, Ilam province, Iran, during the 2012-2013 cropping season. Experimental treatments consisted of a control; inoculation with rhizobium bacteria; rhizobium bacteria and Azotobacter; rhizobium bacteria and Azospirillum; rhizobium bacteria and Pseudomonas; rhizobium bacteria, Azotobacter and Azospirillum; rhizobium bacteria, Azotobacter and Pseudomonas; rhizobium bacteria, Azospirillum and Pseudomonas; and rhizobium bacteria, Azotobacter, Azospirillum and Pseudomonas. The results showed that PGPR and rhizobium bacteria had a significant effect on grain yield and its components in the mungbean plant. Grain yield was significantly increased by PGPR and rhizobium bacteria; the maximum grain yield, 2287 kg.ha-1, was obtained from rhizobium bacteria + Azospirillum + Pseudomonas, as compared to the control treatment. Excessive application of chemical fertilizers causes environmental and economic problems: over-fertilization with P and N leads to pollution through soil erosion and runoff water. The use of PGPR and rhizobium bacteria is therefore justified by reduced input costs, increased grain yield, and environmental friendliness.
Keywords: Azotobacter, Mungbean, Pseudomonas, Rhizobium bacteria.
8990 Design of Thermal Control Subsystem for TUSAT Telecommunication Satellite
Authors: N. Sozbir, M. Bulut, M. F. Oktem, A. Kahriman, A. Chaix
Abstract:
TUSAT is a prospective Turkish communication satellite designed to provide mainly data communication and broadcasting services through Ku-band and C-band channels. Thermal control is a vital issue in the satellite design process: all satellite subsystems and equipment should be maintained in the desired temperature range from launch to the end of manoeuvring life. The main function of thermal control is to keep the equipment and the satellite structures within a given temperature range for the various phases and operating modes of the spacecraft during its lifetime. This paper describes a thermal control design which uses passive and active thermal control concepts. The active thermal control is based on heaters regulated by software via thermistors. The passive thermal control comprises heat pipes, multilayer insulation (MLI) blankets, radiators, paints and surface finishes, maintaining the temperature of the overall carrier components at an acceptable level. The thermal control design is supported by thermal analysis using thermal mathematical models (TMM).
Keywords: Spacecraft thermal control, design of thermal control.
8989 An Intelligent Water Drop Algorithm for Solving Economic Load Dispatch Problem
Authors: S. Rao Rayapudi
Abstract:
Economic Load Dispatch (ELD) is a method of determining the most efficient, low-cost and reliable operation of a power system by dispatching the available electricity generation resources to supply the load on the system. The primary objective of economic dispatch is to minimize the total cost of generation while honoring the operational constraints of the available generation resources. In this paper an Intelligent Water Drop (IWD) algorithm is proposed to solve the ELD problem with the objective of minimizing the total cost of generation. The IWD algorithm is a swarm-based, nature-inspired optimization algorithm inspired by natural rivers: a natural river often finds good paths among the many possible paths on its way from source to destination, and finally finds an almost optimal path. These ideas are embedded into the proposed algorithm for solving the economic load dispatch problem. The main advantages of the proposed technique are that it is easy to implement and capable of finding a feasible near-global-optimal solution with little computational effort. To illustrate its effectiveness, the method has been tested on 6-unit and 20-unit test systems with incremental fuel cost functions that take the valve-point loading effects into account. Numerical results show that the proposed method has good convergence properties and better solution quality than other algorithms reported in the recent literature.
Keywords: Economic load dispatch, transmission loss, optimization, valve-point loading, Intelligent Water Drop algorithm.
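The valve-point loading effect makes the fuel cost non-smooth and non-convex, which is what motivates a derivative-free method such as IWD. The standard cost model can be sketched as follows; the coefficients are illustrative only, not the paper's 6-unit or 20-unit test data:

```python
import math

def fuel_cost(P, a, b, c, e, f, P_min):
    """Generator fuel cost with the valve-point loading effect:

        F(P) = a + b*P + c*P**2 + |e * sin(f * (P_min - P))|

    The rectified-sine term adds ripples to the quadratic cost curve,
    producing many local minima that defeat gradient-based dispatch.
    """
    return a + b * P + c * P ** 2 + abs(e * math.sin(f * (P_min - P)))

# Total cost of a hypothetical 3-unit dispatch (illustrative coefficients).
total_cost = sum(fuel_cost(P, 240, 7.0, 0.007, 300, 0.035, 100)
                 for P in (150, 200, 250))
```

A swarm method like IWD only ever evaluates `fuel_cost`, never its derivative, so the non-smooth absolute-value term poses no difficulty.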
8988 Multi-view Description of Real-Time Systems' Architecture
Authors: A. Bessam, M. T. Kimour
Abstract:
Real-time embedded systems should benefit from component-based software engineering to handle complexity and ensure dependability. In these systems, applications should not only be logically correct but also behave within time windows. However, among current component-based software engineering approaches, few component models handle time properties in a manner that allows efficient analysis and checking at the architectural level. In this paper, we present a meta-model for component-based software description that integrates timing issues. To achieve a complete functional model of software components, our meta-model focuses on four functional aspects: interface, static behavior, dynamic behavior, and interaction protocol. With each aspect we have explicitly associated a time model. Such a time model can be used to check a component's design against certain properties and to compute the timing properties of component assemblies.
Keywords: Real-time systems, software architecture, software component, dependability, time properties, ADL, metamodeling.
8987 Blood Elements Activation in Hemodialysis – Animal Model Studies
Authors: Karolina Grzeszczuk-Kuć, Jolanta Bujok, Tomasz Walski, Małgorzata Komorowska
Abstract:
Haemodialysis (HD) is a life-saving procedure for patients around the world; unfortunately, it brings numerous complications. Oxidative stress is one of the major factors leading to erythrocyte destruction during extracorporeal circulation, and repeated HD procedures destroy blood elements faster than the organism can replace them. Thirty HD procedures were performed on healthy sheep to evaluate the effects of such treatment. An oxidative stress study was performed together with an analysis of basic blood parameters and an empirical assessment of dialyzer condition after the procedure. A reversible decline in the absolute leukocyte count was observed during the first 30 min of HD. Blood clots formed in the area of the blood inlet and outlet of the dialyzer. Our results are consistent with outcomes reported throughout the literature, specifically the effects observed in humans, and will provide a basis for evaluating methods of blood protection during haemodialysis.
Keywords: Animal model, blood components, haemodialysis, leukocytes, oxidative stress, sheep.
8986 New Security Approach of Confidential Resources in Hybrid Clouds
Authors: Haythem Yahyaoui, Samir Moalla, Mounir Bouden, Skander Ghorbel
Abstract:
Nowadays, cloud environments are becoming a necessity for companies. This technology offers the opportunity to access data anywhere and at any time, and it provides optimized, secured access to resources along with additional security for the data stored on the platform. However, some companies do not trust cloud providers: they fear that providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption performed by the provider ensures confidentiality; this overlooks the fact that the provider itself can decrypt the confidential resources. A better solution is to apply some operations on the data before sending them to the cloud provider, with the objective of making them unreadable to anyone but the user. The principal idea is to let users protect their data with their own methods. In this paper, we demonstrate our approach and show that it is more efficient, in terms of execution time, than some existing methods. This work aims at enhancing the quality of service of providers and ensuring the trust of customers.
Keywords: Confidentiality, cryptography, security issues, trust issues.
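The core idea above — transforming data on the client side so that the provider only ever stores unreadable bytes — can be illustrated with a toy keyed transformation. This sketch uses a SHA-256 keystream purely for pedagogy; it is not the paper's method and not a substitute for a vetted cipher:

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from SHA-256 in counter mode."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def transform(data: bytes, key: bytes, nonce: bytes) -> bytes:
    """XOR the data with the keystream; applying it twice restores the data."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

secret = b"account: 1234-5678"
blob = transform(secret, b"user-held-key", b"nonce-01")  # what the cloud stores
restored = transform(blob, b"user-held-key", b"nonce-01")
```

Because the key never leaves the user, the provider can store and serve `blob` but cannot recover `secret`, which is exactly the trust property the abstract argues for.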
8985 DRE – A Quality Metric for Component-Based Software Products
Authors: K. S. Jasmine, R. Vasantha
Abstract:
The overriding goal of software engineering is to provide a high-quality system, application, or product. To achieve this goal, software engineers must apply effective methods coupled with modern tools within the context of a mature software process [2]. In addition, it is also necessary to assure that high quality is realized. Although many quality measures can be collected at the project level, the most important measures are errors and defects. Deriving a quality measure for reusable components has proven to be a challenging task nowadays. The results obtained from this study are based on empirical evidence of reuse practices, as it emerged from the analysis of industrial projects. Both large and small companies, working in a variety of business domains and using object-oriented as well as procedural development approaches, contributed to this study. This paper proposes a quality metric that provides benefit at both the project and process level, namely defect removal efficiency (DRE).
Keywords: Software reuse, defect density, reuse metrics, defect removal efficiency.
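DRE is conventionally defined as E/(E+D), where E is the number of errors found before delivery and D the number of defects found after delivery; a value close to 1 means the process filters out almost all defects before release. A minimal sketch of that conventional definition (not necessarily the exact variant the paper derives):

```python
def defect_removal_efficiency(errors_before: int, defects_after: int) -> float:
    """DRE = E / (E + D): E = errors found before delivery,
    D = defects reported after delivery."""
    if errors_before + defects_after == 0:
        return 1.0  # nothing found anywhere: vacuously perfect removal
    return errors_before / (errors_before + defects_after)

# e.g. 45 errors caught in review/testing, 5 defects reported after release
dre = defect_removal_efficiency(45, 5)  # 0.9
```

Tracked per project and per process activity, this ratio gives the dual project-level and process-level benefit the abstract claims for the metric.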
8984 Wear and Friction Analysis of Sintered Metal Powder Self-Lubricating Bush Bearing
Authors: J. K. Khare, Abhay Kumar Sharma, Ajay Tiwari, Amol A. Talankar
Abstract:
Powder metallurgy (P/M) is the only economic way to produce porous parts. Because P/M produces near-net-shape parts, it reduces waste of raw material and energy and avoids many machining operations. The most vital uses of P/M are in the production of metallic filters, self-lubricating bush bearings, and sliding surfaces. The porosity of a part can be controlled by varying the compaction pressure, sintering temperature, and composition of the metal powder mix. The present work provides an experimental analysis of the friction and wear properties of self-lubricating copper-tin bush bearings. Experimental results confirm that the wear rate of the sintered component is lowest for components containing 10% tin by weight, and increases at higher tin contents (tested at 20% and 30% tin) at the same sintering temperature. The results also confirm that the wear rate of a sintered component depends on the sintering temperature, soaking period, composition of the preform, compaction pressure, and powder particle shape and size. Interfacial friction between die and punch, between powder particles, and between the die face and the powder particles depends on the compaction pressure; on the size and shape of the powder particles; on the size and shape of the component, which in turn determine the size and shape of the die and punch; and on the materials of the die, the punch, and the powder particles.
Keywords: Interfacial friction, porous bronze bearing, sintering temperature, wear rate.
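A standard first-order way to reason about wear rates like those measured above is Archard's classical wear law, which relates worn volume to load, sliding distance, and hardness. This is a textbook model, not a relation stated in the paper, and the numbers are illustrative:

```python
def archard_wear_volume(k: float, load_n: float,
                        sliding_dist_m: float, hardness_pa: float) -> float:
    """Archard's law: V = k * F * s / H, worn volume in m^3.

    k is the dimensionless wear coefficient (lower for a well-lubricated
    bearing), F the normal load, s the sliding distance, H the hardness
    of the softer surface.
    """
    return k * load_n * sliding_dist_m / hardness_pa

# Illustrative numbers for a bronze bush (not measured values from this study)
v = archard_wear_volume(k=1e-4, load_n=50.0, sliding_dist_m=1000.0,
                        hardness_pa=6e8)
```

Under this model, composition and sintering conditions enter through k and H, which is consistent with the abstract's finding that tin content and sintering temperature shift the measured wear rate.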
8983 A Product Development for Green Logistics Model by Integrated Evaluation of Design and Manufacturing and Green Supply Chain
Authors: Yuan-Jye Tseng, Yen-Jung Wang
Abstract:
A product development for green logistics model, based on the fuzzy analytic network process (ANP) method, is presented for evaluating the relationships among product design, manufacturing activities, and the green supply chain. In the product development stage, there can be alternative ways to design the detailed components to satisfy the design concept and product requirements. Different design alternatives lead to different manufacturing activities, and the manufacturing activities in turn affect the green supply chain of the components and the product. In this research, a fuzzy ANP evaluation model is presented for evaluating the criteria in product design, manufacturing activities, and the green supply chain. Comparison matrices are established for evaluating the criteria among the three groups, and the total relational values between the three groups represent their relationships and effects. In application, the total relational values can be used to evaluate the design alternatives and support the decision to select a suitable design case and green supply chain. An example product is illustrated, showing that the model is useful for the integrated evaluation of design, manufacturing, and the green supply chain for the purpose of product development for green logistics.
Keywords: Supply chain management, green supply chain, product development for logistics, fuzzy analytic network process.
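The comparison-matrix step underlying ANP-style evaluation can be illustrated with a crisp (non-fuzzy) priority computation: the principal eigenvector of a pairwise comparison matrix gives the criteria weights. The matrix below is a hypothetical example, not data from the paper:

```python
def priority_vector(matrix, iters=100):
    """Principal-eigenvector priorities of a pairwise comparison matrix,
    computed by power iteration and normalized to sum to 1."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]
    return w

# Hypothetical comparisons among design, manufacturing, and supply-chain
# criteria: design is judged 2x as important as manufacturing, 4x as
# important as the supply-chain criterion (a perfectly consistent matrix).
m = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
weights = priority_vector(m)  # ~[4/7, 2/7, 1/7]
```

The fuzzy version replaces the crisp entries with fuzzy numbers and defuzzifies, but the structure of the computation — matrices per group, relational values between groups — is the same.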
8982 Investigation of Effective Parameters on Pullout Capacity in Soil Nailing with Special Attention to International Design Codes
Authors: R. Ziaie Moayed, M. Mortezaee
Abstract:
An important and influential factor in the design and safety-factor determination of soil nailing is the ultimate pullout capacity or, in other words, the bond strength. This parameter depends on several factors, such as the soil material and texture, the method of installation, the excavation diameter, the friction angle between the nail and the soil, the grouting pressure, the nail depth (overburden pressure), the drilling angle, and the degree of saturation of the soil. The Federal Highway Administration (FHWA) guideline, a customary reference in nailing design, considers only the effect of the soil (or rock) type and the installation method in determining the bond strength, which results in uneconomical designs. The other codes each have shortcomings of their own, leaving some of the parameters affecting bond resistance out of account. Therefore, in the present paper, the relationships and tables presented by several established codes for estimating the ultimate pullout capacity are first reviewed, and the effects of several important factors on the ultimate pullout capacity are then studied. It was found that the effects of overburden pressure (for pressure-grouted nails), soil dilatancy, and roughness of the drilled surface on pullout strength are incremental, whereas the effect of the degree of soil saturation is increasing up to a certain degree of saturation and decreasing thereafter. It is therefore preferable to rely on nail pullout test results and numerical modeling to evaluate the effects of parameters such as overburden pressure, dilatancy, and degree of soil saturation, in order to reach an optimal and economical design.
Keywords: Soil nailing, pullout capacity, FHWA, grout.
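The FHWA-style estimate the abstract criticizes treats the nominal pullout capacity as a presumptive ultimate bond strength acting over the grout-ground interface area, Q = q_u · π · D · L. The numbers below are illustrative, not from the paper's study:

```python
import math

def pullout_capacity_kn(bond_strength_kpa: float,
                        hole_diameter_m: float,
                        bond_length_m: float) -> float:
    """Nominal pullout capacity Q = q_u * pi * D * L (kN): the ultimate
    bond strength q_u acting over the cylindrical grout-ground interface."""
    return bond_strength_kpa * math.pi * hole_diameter_m * bond_length_m

# e.g. 100 kPa presumptive bond strength, 150 mm drill hole, 8 m bond length
q = pullout_capacity_kn(100.0, 0.15, 8.0)  # ~377 kN
```

The paper's point is that q_u in this formula is tabulated only by ground type and installation method, while overburden pressure, dilatancy, and saturation, which also move q_u, are left to pullout testing and numerical modeling.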
8981 Probabilistic Crash Prediction and Prevention of Vehicle Crash
Authors: Lavanya Annadi, Fahimeh Jafari
Abstract:
Transportation brings immense benefits to society, but it also has its costs. These include the cost of infrastructure, personnel, and equipment, but also the loss of life and property in road traffic accidents, delays due to traffic congestion, and various indirect costs. This research aims to predict the probability of vehicle crashes in the United States using machine learning, focusing on natural and structural causes and excluding behavioral ones such as speeding. The factors considered range from meteorological elements, such as weather conditions, precipitation, visibility, wind speed and direction, temperature, pressure, and humidity, to human-made road structure components, such as bumps, roundabouts, no-exit roads, turning loops, and give-way signs. The probabilities are categorized into ten distinct classes, and all predictions are based on multiclass classification, a supervised learning technique. This study considers all crashes across all states collected by the US government. The probability of a crash was determined by employing the multinomial expected value, and a classification label was assigned accordingly. We applied three classification models: multiclass logistic regression, random forest, and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the part played by natural and structural factors in crashes. The paper also provides in-depth insights through exploratory data analysis.
Keywords: Road safety, crash prediction, exploratory analysis, machine learning.
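The class-labeling step described above can be sketched as frequency-based binning into ten classes. The feature counts here are made up for illustration, and the paper's exact multinomial-expected-value computation may differ in detail:

```python
def crash_probability_class(count: int, total: int, n_classes: int = 10) -> int:
    """Bin an empirical crash frequency into one of n_classes labels
    (class 0 = least likely, n_classes - 1 = most likely)."""
    p = count / total
    return min(int(p * n_classes), n_classes - 1)

# Hypothetical counts of crashes observed under each road-structure feature
counts = {"Roundabout": 120, "Bump": 40, "Give_Way": 15}
total = sum(counts.values())  # 175
labels = {k: crash_probability_class(c, total) for k, c in counts.items()}
```

These integer labels then serve as the targets for the multiclass models (logistic regression, random forest, XGBoost) the study compares.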
8980 Prediction Heating Values of Lignocellulosics from Biomass Characteristics
Authors: Kaltima Phichai, Pornchanoke Pragrobpondee, Thaweesak Khumpart, Samorn Hirunpraditkoon
Abstract:
The paper characterizes biomasses by proximate analysis (volatile matter, fixed carbon, and ash) and ultimate analysis (carbon, hydrogen, nitrogen, and oxygen) in order to derive heating value prediction equations. Such heating value estimates for various biomasses can be used for energy evaluation. Thirteen types of biomass were studied. Proximate analysis was carried out by the mass loss method and with an infrared moisture analyzer; ultimate analysis was performed with a CHNO analyzer. The heating values varied from 15 to 22.4 MJ kg-1. Correlations of the calculated heating value with the proximate and ultimate analyses were obtained using multiple regression analysis and summarized into three and two equations, respectively. The correlations based on proximate analysis showed a larger deviation of calculated heating values from experimental heating values than the correlations based on ultimate analysis.
Keywords: Heating value equation, Proximate analysis, Ultimate analysis.
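The multiple-regression step behind such correlations can be sketched with a small least-squares fit via the normal equations. The proximate-analysis samples and the "true" coefficients below are synthetic, not the thirteen biomasses or the equations from the paper:

```python
def fit_linear(X, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 + ... by solving the
    normal equations with Gaussian elimination (partial pivoting)."""
    rows = [[1.0] + list(x) for x in X]          # prepend intercept column
    n = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
    for col in range(n):                          # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n                              # back substitution
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, n))) / A[i][i]
    return coef

# Synthetic samples: (volatile matter %, fixed carbon %) -> heating value MJ/kg
X = [(70.0, 15.0), (75.0, 12.0), (65.0, 20.0), (80.0, 10.0), (60.0, 18.0)]
y = [0.05 * vm + 0.2 * fc + 10.0 for vm, fc in X]
coef = fit_linear(X, y)  # recovers [10.0, 0.05, 0.2] up to rounding
```

With measured proximate- or ultimate-analysis fractions in place of the synthetic data, the same fit yields correlation equations of the kind the paper summarizes, and the residuals give the deviation the abstract compares between the two analysis types.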