Search results for: Deep approach metacognitive methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8510

6770 Improved Feature Processing for Iris Biometric Authentication System

Authors: Somnath Dey, Debasis Samanta

Abstract:

Iris-based biometric authentication is gaining importance in recent times. Iris biometric processing, however, is a complex and computationally expensive process. In the overall processing of iris biometrics in an iris-based authentication system, feature processing is an important task: we extract iris features, which are ultimately used in matching. Since the number of iris features is large and computational time increases with the number of features, the challenge is to develop an iris processing system with as few features as possible without compromising correctness. In this paper, we address this issue and present an approach to the feature extraction and feature matching process. We apply the Daubechies D4 wavelet with 4 levels to extract features from iris images. These features are encoded with 2 bits each by quantizing them into 4 quantization levels. With our proposed approach it is possible to represent an iris template with only 304 bits, whereas existing approaches require as many as 1024 bits. In addition, we assign different weights to different iris regions when comparing two iris templates, which significantly increases the accuracy. Further, we match the iris templates based on a weighted similarity measure. Experimental results on several iris databases substantiate the efficacy of our approach.
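
As an illustration of the kind of pipeline the abstract describes, the sketch below decomposes a normalized iris image with a Daubechies wavelet, quantizes the detail coefficients to 2 bits, and compares two codes with a region-weighted similarity. The quantization boundaries and weighting scheme are illustrative assumptions, and PyWavelets' 'db4' (4 vanishing moments) may not match the paper's 'D4' naming convention.

```python
import numpy as np
import pywt  # PyWavelets

def extract_iris_code(iris_img, levels=4):
    """Decompose a normalized iris image with a Daubechies wavelet and
    quantize the detail coefficients into 4 levels (2 bits each)."""
    coeffs = pywt.wavedec2(iris_img, 'db4', level=levels)
    details = np.concatenate([np.ravel(d) for lvl in coeffs[1:] for d in lvl])
    # Quantize into 4 levels using quartile boundaries (illustrative choice).
    edges = np.quantile(details, [0.25, 0.5, 0.75])
    return np.digitize(details, edges).astype(np.uint8)  # values 0..3

def weighted_similarity(code_a, code_b, region_weights):
    """Weighted fractional agreement between two quantized iris codes;
    region_weights assigns one weight per coefficient (hypothetical scheme)."""
    agree = (code_a == code_b).astype(float)
    return float(np.sum(region_weights * agree) / np.sum(region_weights))
```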

Keywords: Iris recognition, biometric, feature processing, pattern recognition, pattern matching.

6769 Dimensionality Reduction of PSSM Matrix and its Influence on Secondary Structure and Relative Solvent Accessibility Predictions

Authors: Rafał Adamczak

Abstract:

State-of-the-art methods for secondary structure (Porter, Psi-PRED, SAM-T99sec, Sable) and solvent accessibility (Sable, ACCpro) predictions use evolutionary profiles represented by the position specific scoring matrix (PSSM). It has been demonstrated that evolutionary profiles are the most important features in the feature space for these predictions. Unfortunately, applying the PSSM matrix leads to high dimensional feature spaces that may create problems with parameter optimization and generalization. Several recently published studies suggested that applying feature extraction to the PSSM matrix may result in improvements in secondary structure predictions. However, none of the top performing methods considered here utilizes dimensionality reduction to improve generalization. In the present study, we used simple and fast feature selection methods (t-statistics, information gain) that allow us to decrease the dimensionality of the PSSM matrix by 75% and improve generalization for secondary structure prediction compared to the Sable server.
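
A minimal sketch of filter-style feature selection of the kind named in the abstract, using scikit-learn: an ANOVA F-score as a t-statistic analogue and mutual information as an information-gain proxy. The 15-residue window layout, the 25% keep fraction, and the random stand-in data are assumptions for illustration only.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif

def reduce_pssm_features(X, y, keep_fraction=0.25, score="t"):
    """Filter-style feature selection on a flattened PSSM feature matrix X.
    score='t' uses an ANOVA F-score; score='ig' uses mutual information."""
    k = max(1, int(keep_fraction * X.shape[1]))
    scorer = f_classif if score == "t" else mutual_info_classif
    selector = SelectKBest(score_func=scorer, k=k).fit(X, y)
    return selector.transform(X), selector.get_support(indices=True)

# Stand-in data: a real PSSM window matrix would replace this.
X = np.random.rand(200, 15 * 20)   # 15-residue window x 20 amino-acid scores
y = np.random.randint(0, 3, 200)   # helix / strand / coil labels
X_reduced, kept = reduce_pssm_features(X, y, keep_fraction=0.25)
```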

Keywords: Secondary structure prediction, feature selection, position specific scoring matrix.

6768 Alternative Approach toward Waste Treatment: Biodrying for Solid Waste in Malaysia

Authors: Nurul' Ain Ab Jalil, Hassan Basri

Abstract:

This paper reviews the objectives, methods and results of previous studies on biodrying of solid waste in several countries. Biodrying of solid waste is a novel technology in developing countries such as Malaysia, where the high moisture content of organic waste complicates the segregation process for recycling and diminishes the calorific value of the waste as a fuel source. In addition, the high moisture content encourages the breeding of vectors and disease-bearing animals. From the laboratory results, the average moisture contents of organic waste, paper, plastics and metals are 58.17%, 37.93%, 29.79% and 1.03%, respectively, for the UKM campus. Biodrying of solid waste is a simple method of waste treatment as well as a cost-efficient technology for drying solid waste. The process depends on temperature monitoring and air flow control along with the natural biodegradation of organic waste. This review shows that biodrying of solid waste has high potential for the treatment and recycling of solid waste and can be useful for biodrying studies and implementation in Malaysia.

Keywords: Biodrying of solid waste, Organic waste, Fuel source.

6767 Removing Ocular Artifacts from EEG Signals using Adaptive Filtering and ARMAX Modeling

Authors: Parisa Shooshtari, Gelareh Mohamadi, Behnam Molaee Ardekani, Mohammad Bagher Shamsollahi

Abstract:

The EEG signal is one of the oldest measures of brain activity and has been used widely for clinical diagnoses and biomedical research. However, EEG signals are highly contaminated with various artifacts, both from the subject and from equipment interference. Among these various kinds of artifacts, ocular noise is the most important one. Since many applications such as BCI require online and real-time processing of the EEG signal, it is ideal if the removal of artifacts is performed in an online fashion. Recently, some methods for online ocular artifact removal have been proposed. One of these methods is ARMAX modeling of the EEG signal. This method assumes that the recorded EEG signal is a combination of EOG artifacts and the background EEG; the background EEG is then estimated via estimation of the ARMAX parameters. The other recently proposed method is based on adaptive filtering. This method uses the EOG signal as the reference input and subtracts EOG artifacts from the recorded EEG signals. In this paper we investigate the efficiency of each method for removing EOG artifacts and make a comparison between the two. We conclude from this comparison that the adaptive filtering method gives better results than those achieved by ARMAX modeling.
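
A minimal least-mean-squares (LMS) adaptive filter of the general kind the abstract describes, using the EOG channel as the reference input and subtracting the estimated ocular component; the filter order and step size are illustrative choices, not the paper's settings.

```python
import numpy as np

def lms_eog_removal(eeg, eog, order=5, mu=0.01):
    """LMS adaptive filtering: estimate the EOG component present in the EEG
    from the reference EOG channel and subtract it. Returns the cleaned EEG."""
    n = len(eeg)
    w = np.zeros(order)              # adaptive filter weights
    cleaned = np.zeros(n)
    for i in range(order, n):
        x = eog[i - order:i][::-1]   # most recent EOG samples (reference input)
        y_hat = np.dot(w, x)         # estimated ocular artifact
        e = eeg[i] - y_hat           # error = cleaned EEG sample
        w += 2 * mu * e * x          # LMS weight update
        cleaned[i] = e
    return cleaned
```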

Keywords: Ocular Artifacts, EEG, Adaptive Filtering, ARMAX

6766 An Empirical Analysis of Arabic WebPages Classification using Fuzzy Operators

Authors: Ahmad T. Al-Taani, Noor Aldeen K. Al-Awad

Abstract:

In this study, a fuzzy similarity approach for Arabic web pages classification is presented. The approach uses a fuzzy term-category relation by manipulating the membership degree for the training data and the degree value for a test web page. Six measures are used and compared in this study: the Einstein, Algebraic, Hamacher, MinMax, Special case fuzzy and Bounded Difference approaches. These measures are applied and compared using 50 different Arabic web pages. The Einstein measure gave the best performance among the measures considered. An analysis of these measures and concluding remarks are presented in this study.
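
Most of the named measures correspond to standard fuzzy conjunction (t-norm) operators; the sketch below gives their usual definitions and an averaged page-category score. The aggregation step and the omission of the "Special case fuzzy" measure (whose definition is not given in the abstract) are assumptions.

```python
def einstein(a, b):
    return (a * b) / (2.0 - (a + b - a * b))

def algebraic(a, b):            # algebraic product
    return a * b

def hamacher(a, b):             # Hamacher product
    return 0.0 if a == b == 0 else (a * b) / (a + b - a * b)

def min_max(a, b):              # standard min conjunction
    return min(a, b)

def bounded_difference(a, b):   # Lukasiewicz t-norm
    return max(0.0, a + b - 1.0)

def page_category_score(term_degrees_page, term_degrees_category, conj=einstein):
    """Aggregate per-term fuzzy conjunctions into one page-category score
    (simple averaging here; the paper's exact aggregation is not given)."""
    pairs = [(term_degrees_page[t], term_degrees_category[t])
             for t in term_degrees_page if t in term_degrees_category]
    return sum(conj(a, b) for a, b in pairs) / len(pairs) if pairs else 0.0
```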

Keywords: Text classification, HTML documents, Web pages, Machine learning, Fuzzy logic, Arabic Web pages.

6765 Optimizing Materials Cost and Mechanical Properties of PVC Electrical Cable's Insulation by Using Mixture Experimental Design Approach

Authors: Safwan Altarazi, Raghad Hemeimat, Mousa Wakileh, Ra'ad Qsous, Aya Khreisat

Abstract:

With the development of Polyvinyl chloride (PVC) products for many applications, the challenges of investigating the raw material composition and reducing the cost have both become more and more important. Considerable research has been done investigating the effect of additives on PVC products. Most PVC composites research investigates the effect of only a single factor, or a few factors, at a time. This isolated consideration of the input factors does not take into account the interaction effects between the different factors. This paper implements a mixture experimental design approach to find a cost-effective PVC composition for the production of electrical-insulation cables considering ASTM Designation D 6096. The results analysis showed that a minimum cost can be achieved by using 20% virgin PVC, 18.75% recycled PVC, 43.75% CaCO3 with a particle size of 10 microns, 14% DOP plasticizer, and 3.5% CPW plasticizer. For maximum UTS the compound should consist of 17.5% DOP, 62.5% virgin PVC, and 20.0% CaCO3 with a particle size of 5 microns. Finally, for the highest ductility the compound should be made of 35% virgin PVC, 20% CaCO3 with a particle size of 5 microns, and 45.0% DOP plasticizer.
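
As a rough illustration of the cost-minimization side of such a mixture problem, the sketch below poses it as a linear program in which the component proportions must sum to 100%. The unit costs, bounds, and absence of mechanical-property constraints are all illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

# Components: virgin PVC, recycled PVC, CaCO3 filler, DOP plasticizer, CPW plasticizer
cost = np.array([1.00, 0.40, 0.10, 1.20, 0.80])   # illustrative unit costs

# Mixture constraint: proportions sum to 1.
A_eq = np.ones((1, 5))
b_eq = np.array([1.0])

# Illustrative lower/upper bounds on each component fraction.
bounds = [(0.20, 0.70),   # virgin PVC
          (0.00, 0.30),   # recycled PVC
          (0.10, 0.50),   # CaCO3
          (0.05, 0.25),   # DOP
          (0.00, 0.10)]   # CPW

res = linprog(c=cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print(dict(zip(["vPVC", "rPVC", "CaCO3", "DOP", "CPW"], res.x.round(3))))
```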

Keywords: ASTM 6096, mixture experimental-design approach, PVC electrical cable insulation, recycled PVC.

6764 Mathematical Reconstruction of an Object Image Using X-Ray Interferometric Fourier Holography Method

Authors: M. K. Balyan

Abstract:

The main principles of the X-ray Fourier interferometric holography method are discussed. The object image is reconstructed by the mathematical method of Fourier transformation. Three methods are presented: the approximation method, the iteration method and the step-by-step method. As an example, the reconstruction of the complex amplitude transmission coefficient of a beryllium wire is considered. The results reconstructed by the three presented methods are compared. The best results are obtained by means of the step-by-step method.

Keywords: Dynamical diffraction, hologram, object image, X-ray holography.

6763 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or underestimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges in this exploratory study are also reported and recommendations for future studies proposed.
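
To make the idea of a machine-readable measurement rule concrete, the sketch below encodes one hypothetical NRM-style rule as RDF with rdflib. The namespace, class and property names, and the rule text are invented for illustration and are not the paper's Methontology output.

```python
from rdflib import Graph, Namespace, Literal, RDF, RDFS

NRM = Namespace("http://example.org/nrm#")   # hypothetical namespace
g = Graph()
g.bind("nrm", NRM)

# A hypothetical class for a measurable work item and one measurement rule.
g.add((NRM.WorkItem, RDF.type, RDFS.Class))
g.add((NRM.InternalWall, RDF.type, NRM.WorkItem))
g.add((NRM.InternalWall, RDFS.label, Literal("Internal partition wall")))
g.add((NRM.InternalWall, NRM.measuredIn, Literal("m2")))
g.add((NRM.InternalWall, NRM.measurementRule,
       Literal("Measure the wall area, deducting openings larger than 1 m2")))

print(g.serialize(format="turtle"))
```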

Keywords: BIM, Construction projects, Cost estimation, NRM, Ontology.

6762 Impacts of E-learning in Nursing Education: In the Light of Recent Studies

Authors: A.Ö. İlkay, C.O. Zeynep

Abstract:

Information and Communication Technologies (ICT) have changed the way we live and learn. ICT opens doors to new, innovative methods of delivering education. E-learning is a part of ICT and has been endorsed as a tool for developing “21st century skills” in higher education. The aim of this review is to establish the impacts of e-learning in undergraduate nursing education. A systematic literature review was conducted to assess the impacts of e-learning in nursing education using Akdeniz University electronic databases. According to the results, we can conclude that nursing faculties cannot treat e-learning methods as a single tool. E-learning should be used with a good understanding of learners’ needs.

Keywords: E-learning, nursing education, systematic literature review.

6761 An Improved Resource Discovery Approach Using P2P Model for Condor: A Grid Middleware

Authors: Anju Sharma, Seema Bawa

Abstract:

Resource discovery in Grids is critical for efficient resource allocation and management. The heterogeneous nature and dynamic availability of resources make resource discovery a challenging task. As the number of nodes increases from tens to thousands, scalability is essential. Peer-to-Peer (P2P) techniques, on the other hand, provide effective implementation of scalable services and applications. In this paper we propose a model for resource discovery in the Condor middleware by using the four-axis framework defined in the P2P approach. The proposed model enhances Condor to incorporate the functionality of a P2P system, thus aiming to make Condor more scalable, flexible, reliable and robust.

Keywords: Condor Middleware, Grid Computing, P2P, Resource Discovery.

6760 Contour Estimation in Synthetic and Real Weld Defect Images based on Maximum Likelihood

Authors: M. Tridi, N. Nacereddine, N. Oucief

Abstract:

This paper describes a novel method for the automatic estimation of the contours of weld defects in radiographic images. Contour detection is generally the first operation applied in a visual recognition system. Our approach can be described as a region-based maximum likelihood formulation of parametric deformable contours. This formulation provides robustness against poor image quality, and allows simultaneous estimation of the contour parameters together with other parameters of the model. Implementation is performed by a deterministic iterative algorithm with minimal user intervention. The results testify to the very good performance of the approach, especially on synthetic weld defect images.

Keywords: Contour, Gaussian, likelihood, Rayleigh.

6759 Near Shore Wave Manipulation for Electricity Generation

Authors: K. D. R. Jagath-Kumara, D. D. Dias

Abstract:

The sea waves carry thousands of GWs of power globally. Although there are a number of different approaches to harnessing offshore energy, they are likely to be expensive, practically challenging, and vulnerable to storms. Therefore, this paper considers using near-shore waves for generating mechanical and electrical power. It introduces two new approaches, wave manipulation and the use of a variable duct turbine, for intercepting very wide wave fronts and for coping with fluctuations of the wave height and the sea level, respectively. The first approach effectively allows capturing much more energy with a much narrower turbine rotor. The second approach allows using a rotor with a smaller radius that captures the energy of higher wave fronts at higher sea levels while preventing the rotor from totally submerging. To illustrate the effectiveness of the first approach, the paper contains a description and simulation results of a scale model of a wave manipulator. It then includes the results of testing a physical model of the manipulator and a single-duct, axial-flow turbine in a wave flume in the laboratory. The paper also includes comparisons of theoretical predictions, simulation results, and wave flume tests with respect to the incident energy, loss in wave manipulation, minimal loss, brake torque, and angular velocity.

Keywords: Near-shore sea waves, Renewable energy, Wave energy conversion, Wave manipulation.

6758 Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems

Authors: Dr. L. Arockiam, A. Aloysius

Abstract:

In general, class complexity is measured based on any one of several factors such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA) and so on. Several new techniques, methods and metrics based on different factors have been developed by researchers for calculating the complexity of a class in Object-Oriented (OO) software. Earlier, Arockiam et al. proposed a complexity measure named Extended Weighted Class Complexity (EWCC), an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of the class and of the classes derived from it. In EWCC, the cognitive weight of each attribute is taken to be 1. The main problem with the EWCC metric is that every attribute holds the same value, whereas in general the cognitive load in understanding different types of attributes cannot be the same. Here, we propose a new metric named Attribute Weighted Class Complexity (AWCC). In AWCC, cognitive weights are assigned to the attributes based on the effort needed to understand their data types. The proposed metric is shown to be a better measure of the complexity of a class with attributes through case studies and experiments.
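
A small sketch of how AWCC might be computed: attribute weights come from a data-type table and are summed with method weights and the complexity of derived classes, following the summation idea the abstract describes. The weight values in the table are hypothetical, not the paper's actual assignments.

```python
# Illustrative cognitive weights per attribute data type (hypothetical values,
# intended only to show the structure of the AWCC computation).
TYPE_WEIGHTS = {"int": 1, "float": 1, "bool": 1, "str": 2, "array": 3, "object": 4}

def awcc(attribute_types, method_weights, derived_class_awcc=0):
    """Attribute Weighted Class Complexity: sum of data-type-based attribute
    weights, cognitive weights of methods, and complexity of derived classes."""
    attr_part = sum(TYPE_WEIGHTS.get(t, 1) for t in attribute_types)
    method_part = sum(method_weights)
    return attr_part + method_part + derived_class_awcc

# Example: two int attributes, one array, one object, and three methods
# with cognitive weights 2, 4 and 1.
print(awcc(["int", "int", "array", "object"], [2, 4, 1]))   # -> 16
```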

Keywords: Software Complexity, Attribute Weighted Class Complexity, Weighted Class Complexity, Data Type

6757 A Study of Numerical Reaction-Diffusion Systems on Closed Surfaces

Authors: Mei-Hsiu Chi, Jyh-Yang Wu, Sheng-Gwo Chen

Abstract:

Reaction-diffusion equations are important partial differential equations in mathematical biology, materials science, physics, and other fields. However, finding efficient numerical methods for reaction-diffusion systems on curved surfaces is still an important and difficult problem. The purpose of this paper is to present a convergent geometric method for solving the reaction-diffusion equations on closed surfaces by an O(r)-LTL configuration method. The O(r)-LTL configuration method, which combines the local tangential lifting technique and configuration equations, is an effective method for estimating differential quantities on curved surfaces. Since estimating the Laplace-Beltrami operator is an important task in solving reaction-diffusion equations on surfaces, we use the local tangential lifting method and a generalized finite difference method to approximate the Laplace-Beltrami operator and solve the reaction-diffusion system on closed surfaces. Our method is not only conceptually simple, but also easy to implement.
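
A generic sketch of explicit time stepping for a two-species reaction-diffusion system once a discrete Laplace-Beltrami matrix L has been assembled (for example by an LTL or generalized finite difference construction); the reaction terms, coefficients, and the tiny stand-in Laplacian below are illustrative, not the paper's scheme.

```python
import numpy as np

def step_reaction_diffusion(u, v, L, dt, Du=1.0, Dv=0.5, f=None, g=None):
    """One explicit Euler step of du/dt = Du*L u + f(u,v), dv/dt = Dv*L v + g(u,v),
    where L is a discrete Laplace-Beltrami matrix on the surface mesh vertices."""
    f = f or (lambda u, v: u - u**3 - v)        # illustrative reaction terms
    g = g or (lambda u, v: 0.1 * (u - v))
    u_new = u + dt * (Du * (L @ u) + f(u, v))
    v_new = v + dt * (Dv * (L @ v) + g(u, v))
    return u_new, v_new

# Tiny stand-in example: 4 mesh vertices with a symmetric graph Laplacian.
L = np.array([[-2., 1., 1., 0.], [1., -2., 0., 1.],
              [1., 0., -2., 1.], [0., 1., 1., -2.]])
u, v = np.random.rand(4), np.random.rand(4)
for _ in range(100):
    u, v = step_reaction_diffusion(u, v, L, dt=0.01)
```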

Keywords: Closed surfaces, high-order approach, numerical solutions, reaction-diffusion systems.

6756 Advanced Neural Network Learning Applied to Pulping Modeling

Authors: Z. Zainuddin, W. D. Wan Rosli, R. Lanouette, S. Sathasivam

Abstract:

This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were used for the pulping problem. A three-layer feed-forward neural network, trained with Preconditioned Conjugate Gradient (PCG) methods, was used in this investigation. Preconditioning is a method to improve convergence by lowering the condition number and increasing the clustering of the eigenvalues. The idea is to solve the modified problem M⁻¹Ax = M⁻¹b, where M is a positive-definite preconditioner that is closely related to A. We mainly focused on Preconditioned Conjugate Gradient-based training methods which originated from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP) and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). The behavior of the PCG methods in the simulations proved to be robust against phenomena such as oscillations due to large step sizes.
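
A minimal preconditioned conjugate gradient routine illustrating the M⁻¹Ax = M⁻¹b idea from the abstract. The Jacobi (diagonal) preconditioner used in the example is an illustrative stand-in, not necessarily the preconditioner used in the PCGF/PCGP/PCGB training methods.

```python
import numpy as np

def preconditioned_cg(A, b, M_inv, x0=None, tol=1e-8, max_iter=200):
    """Solve Ax = b for symmetric positive definite A, preconditioned by M^-1."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x
    z = M_inv @ r
    p = z.copy()
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        z_new = M_inv @ r_new
        beta = (r_new @ z_new) / (r @ z)   # Fletcher-Reeves-style update
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

# Example with a Jacobi preconditioner M = diag(A).
A = np.array([[4., 1.], [1., 3.]])
b = np.array([1., 2.])
M_inv = np.diag(1.0 / np.diag(A))
print(preconditioned_cg(A, b, M_inv))   # close to the exact solution [1/11, 7/11]
```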

Keywords: Convergence, pulping modeling, neural networks, preconditioned conjugate gradient.

6755 Clustering Approach to Unveiling Relationships between Gene Regulatory Networks

Authors: Hiba Hasan, Khalid Raza

Abstract:

Reverse engineering of a genetic regulatory network involves modeling the given gene expression data in the form of a network. Computationally, it is possible to infer the relationships between genes, the so-called gene regulatory networks (GRNs), which can help in finding genomics- and proteomics-based diagnostic approaches for a disease. In this paper, a clustering-based method is used to reconstruct a genetic regulatory network from time series gene expression data. A supercoiled data set from Escherichia coli is used to demonstrate the proposed method.
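
A hedged sketch of the general idea only: cluster genes by the similarity of their expression time series and connect gene pairs whose profiles are strongly correlated. The hierarchical clustering, correlation distance, and thresholds below are illustrative choices, not the paper's exact pipeline.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def cluster_and_link_genes(expr, n_clusters=4, corr_threshold=0.9):
    """expr: genes x time-points matrix. Returns cluster labels and a list of
    putative regulatory edges between highly correlated gene pairs."""
    # Hierarchical clustering on correlation distance between gene profiles.
    dist = pdist(expr, metric="correlation")
    labels = fcluster(linkage(dist, method="average"),
                      t=n_clusters, criterion="maxclust")
    # Putative edges: gene pairs whose profiles are highly correlated.
    corr = np.corrcoef(expr)
    edges = [(i, j) for i in range(len(expr)) for j in range(i + 1, len(expr))
             if abs(corr[i, j]) >= corr_threshold]
    return labels, edges
```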

Keywords: Gene expression, gene regulatory networks (GRNs), clustering, data preprocessing, network visualization.

6754 Assessment of ATC with Shunt FACTS Devices

Authors: Ashwani Kumar, Jitender Kumar

Abstract:

In this paper, an optimal power flow (OPF) based approach has been applied for ATC determination in a multi-transaction deregulated environment with SVC and STATCOM. The main contributions of the paper are (i) an OPF-based approach for evaluation of ATC with multiple transactions, (ii) ATC enhancement with FACTS devices, viz. SVC and STATCOM, for intact and line contingency cases, and (iii) the impact of ZIP load on ATC determination and a comparison of the ATC obtained with SVC and STATCOM. The results have been determined for intact and line contingency cases, taking simultaneous as well as single transaction cases, for the IEEE 24-bus RTS.
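
The abstract notes the impact of the ZIP load model; for readers unfamiliar with it, the sketch below evaluates the standard ZIP (constant impedance, current, power) load at a given bus voltage. The coefficient triplets are illustrative values, not the ones used in the paper.

```python
def zip_load(p0, q0, v, v0=1.0, zip_p=(0.3, 0.3, 0.4), zip_q=(0.3, 0.3, 0.4)):
    """Standard ZIP load model: P = P0*(a*(V/V0)^2 + b*(V/V0) + c), same form for Q.
    Each coefficient triplet (a, b, c) must sum to 1; the values here are illustrative."""
    r = v / v0
    ap, bp, cp = zip_p
    aq, bq, cq = zip_q
    p = p0 * (ap * r**2 + bp * r + cp)
    q = q0 * (aq * r**2 + bq * r + cq)
    return p, q

# Example: a 100 MW / 40 MVAr load evaluated at 0.95 p.u. voltage.
print(zip_load(100.0, 40.0, 0.95))
```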

Keywords: Available transfer capability, FACTS devices, line contingency, multi-transactions, ZIP load model.

6753 Evolution of Performance Measurement Methods in Conditions of Uncertainty: The Implementation of Fuzzy Sets in Performance Measurement

Authors: E. A. Tkachenko, E. M. Rogova, V. V. Klimov

Abstract:

One of the basic issues of development management is connected with performance measurement as a prerequisite for identifying the achievement of development objectives. The aim of our research is to develop an improved model for assessing a company’s development results. The model should take into account the cyclical nature of development and the high degree of uncertainty in dealing with numerous management tasks. Our hypotheses may be formulated as follows. Hypothesis 1: the cycle of a company’s development may be studied from the standpoint of a project cycle; to do that, methods and tools of project analysis are to be used. Hypothesis 2: the problem of uncertainty when justifying managerial decisions within the framework of a company’s development cycle can be solved through the use of the mathematical apparatus of fuzzy logic. A reasoned justification of the validity of these hypotheses is given in the article. The fuzzy logic toolkit is applied to the case of a technology shift within an enterprise. It is shown that some restrictions in performance measurement that are inherent in conventional methods can be eliminated by implementing the fuzzy logic apparatus in performance measurement models.

Keywords: Fuzzy logic, fuzzy sets, performance measurement, project analysis.

6752 Integrated Grey Relational Analysis-Standard Deviation Method for Handover in Heterogeneous Networks

Authors: Mohanad Alhabo, Naveed Nawaz, Mahmoud Al-Faris

Abstract:

The dense deployment of small cells is a promising solution to enhance the coverage and capacity of heterogeneous networks (HetNets). However, unplanned deployment can bring new challenges to the network, ranging from interference to unnecessary handovers and handover failures. This causes a degradation in the quality of service (QoS) delivered to the end user. In this paper, we propose an integrated Grey Relational Analysis-Standard Deviation based handover method (GRA-SD) for HetNets. The proposed method integrates the Standard Deviation (SD) technique to acquire the weights of the handover metrics and the GRA method to select the best handover base station. The performance of the GRA-SD method is evaluated and compared with traditional Multiple Attribute Decision Making (MADM) methods, including the Simple Additive Weighting (SAW) and VIKOR methods. Results reveal that the proposed method outperforms the other methods in terms of minimizing the number of frequent unnecessary handovers and handover failures, in addition to improving energy efficiency.
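
A sketch of the two building blocks named in the abstract: standard-deviation-based weighting of the handover metrics and grey relational analysis to rank candidate base stations. The normalization directions, the distinguishing coefficient of 0.5, and the example metrics are common defaults assumed here, not the paper's exact settings.

```python
import numpy as np

def gra_sd_rank(X, benefit, xi=0.5):
    """X: candidates x metrics decision matrix; benefit[j] is True if larger is better.
    Returns grey relational grades (higher = better handover candidate)."""
    X = np.asarray(X, dtype=float)
    # Normalize each metric to [0, 1] according to its direction.
    norm = np.empty_like(X)
    for j in range(X.shape[1]):
        col = X[:, j]
        rng = (col.max() - col.min()) or 1.0
        norm[:, j] = (col - col.min()) / rng if benefit[j] else (col.max() - col) / rng
    # Standard-deviation weights: metrics with more spread get more weight.
    sd = norm.std(axis=0)
    w = sd / sd.sum()
    # Grey relational coefficients against the ideal sequence (all ones).
    delta = np.abs(1.0 - norm)
    coeff = (delta.min() + xi * delta.max()) / (delta + xi * delta.max())
    return coeff @ w

# Example: 3 candidate cells, metrics = [RSRP (benefit), load (cost), interference (cost)]
X = [[-80, 0.6, 30], [-75, 0.8, 10], [-90, 0.2, 50]]
print(gra_sd_rank(X, benefit=[True, False, False]))
```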

Keywords: Energy efficiency, handover, HetNets, MADM, small cells.

6751 Numerical Modelling of Crack Initiation around a Wellbore Due to Explosion

Authors: Meysam Lak, Mohammad Fatehi Marji, Alireza Yarahamdi Bafghi, Abolfazl Abdollahipour

Abstract:

A wellbore is a hole that is drilled to aid in the exploration and recovery of natural resources, including oil and gas. Occasionally, in order to increase the productivity index and porosity of the wellbore and reservoir, well stimulation methods are used. Hydraulic fracturing is one of these methods. Moreover, several explosions at the end of the well can stimulate the reservoir and create fractures around it. In this study, crack initiation in the rock around the wellbore due to explosion has been numerically modeled. One, two, three, and four pairs of explosions have been set at the end of the wellbore, on its wall. After each stage of the explosion, the results are presented and discussed. The results show that this method can initiate and probably propagate several fractures around the wellbore.

Keywords: Crack initiation, explosion, finite difference modelling, well productivity.

6750 Promoting Gender Equality within Islamic Tradition via Contextualist Approach

Authors: Ali Akbar

Abstract:

The importance of advancing women’s rights is closely intertwined with the development of civil society and the institutionalization of democracy in Middle Eastern countries. There is indeed an intimate relationship between the process of democratization and promoting gender equality, since democracy necessitates equality between men and women. In order to advance the issue of gender equality, what is required is a solid theoretical framework which has its roots in the reexamination of pre-modern interpretation of certain Qurʾānic passages that seem to have given men more rights than it gives women. This paper suggests that those Muslim scholars who adopt a contextualist approach to the Qurʾānic text and its interpretation provide a solid theoretical background for improving women’s rights. Indeed, the aim of the paper is to discuss how the contextualist approach to the Qurʾānic text and its interpretation given by a number of prominent scholars is capable of promoting the issue of gender equality. The paper concludes that since (1) much of the gender inequality found in the primary sources of Islam as well as pre-modern Muslim writings is rooted in the natural cultural norms and standards of early Islamic societies and (2) since the context of today’s world is so different from that of the pre-modern era, the proposed models provide a solid theoretical framework for promoting women’s rights and gender equality.

Keywords: Contextualism, Gender equality, Islam, Women’s rights.

6749 Object-Oriented Multivariate Proportional-Integral-Derivative Control of Hydraulic Systems

Authors: J. Fernandez de Canete, S. Fernandez-Calvo, I. García-Moral

Abstract:

This paper presents and discusses the application of the object-oriented modelling software SIMSCAPE to hydraulic systems, with particular reference to multivariable proportional-integral-derivative (PID) control. As a result, a particular modelling approach of a double cylinder-piston coupled system is proposed and motivated, and the SIMULINK based PID tuning tool has also been used to select the proper controller parameters. The paper demonstrates the usefulness of the object-oriented approach when both physical modelling and control are tackled.

Keywords: Object-oriented modeling, multivariable hydraulic system, multivariable PID control, computer simulation.

6748 Comparative Study of QRS Complex Detection in ECG

Authors: Ibtihel Nouira, Asma Ben Abdallah, Ibtissem Kouaja, Mohamed Hèdi Bedoui

Abstract:

The processing of the electrocardiogram (ECG) signal consists essentially of detecting the characteristic points of the signal, which are an important tool in the diagnosis of heart diseases. The most important of these is the detection of the R waves. In this paper, we present various mathematical tools used for filtering the ECG, namely digital filtering and Discrete Wavelet Transform (DWT) filtering. In addition, this paper includes two main R-peak detection methods that apply a windowing process: the first method is based on derivative calculations, and the second is a time-frequency method based on the Dyadic Wavelet Transform (DyWT).
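
A minimal sketch of the derivative-plus-windowing idea behind the first detection method: band-pass filter, differentiate, square, integrate over a sliding window, then threshold. The pass band, window length, threshold factor, and refractory period are illustrative values, not the paper's settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_r_peaks(ecg, fs, window_s=0.15, thresh_factor=0.5):
    """Derivative-based R-peak detection with a moving-window integrator."""
    # Band-pass filter to emphasize the QRS band (illustrative 5-15 Hz band).
    b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # Differentiate, square, and integrate over a sliding window.
    deriv = np.diff(filtered, prepend=filtered[0])
    squared = deriv ** 2
    win = max(1, int(window_s * fs))
    integrated = np.convolve(squared, np.ones(win) / win, mode="same")
    # Threshold and keep at most one detection per ~200 ms refractory period.
    thresh = thresh_factor * integrated.max()
    refractory = int(0.2 * fs)
    peaks, last = [], -refractory
    for i in np.where(integrated > thresh)[0]:
        if i - last >= refractory:
            peaks.append(i)
            last = i
    return np.array(peaks)
```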

Keywords: Derived calculation methods, Electrocardiogram, R peaks, Wavelet Transform.

6747 Modeling Exponential Growth Activity Using Technology: A Research with Bachelor of Business Administration Students

Authors: V. Vargas-Alejo, L. E. Montero-Moguel

Abstract:

Understanding the concept of function has been important in mathematics education for many years. In this study, the models built by a group of five business administration and accounting undergraduate students while carrying out a population growth activity are analyzed. The theoretical framework is the Models and Modeling Perspective. The results show how the students included tables, graphs, and algebraic representations in their models. Using technology was useful for interpreting, describing, and predicting the situation. The first model the students built to describe the situation was linear. After that, they modified and refined their ways of thinking and finally created an exponential growth model. Modeling the activity was useful for deepening understanding of mathematical concepts such as covariation, rate of change, and the exponential function, and also for differentiating between linear and exponential growth.
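
A small sketch of the kind of comparison the activity invites: fitting a linear model and an exponential model P(t) = P0·e^(rt) to population data and comparing their errors. The data values below are made up for illustration, not the activity's data.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(8)                                   # years (illustrative data)
pop = np.array([100, 128, 162, 210, 265, 340, 430, 552])

linear = lambda t, a, b: a * t + b
exponential = lambda t, p0, r: p0 * np.exp(r * t)  # P(t) = P0 * e^(r t)

pl, _ = curve_fit(linear, t, pop)
pe, _ = curve_fit(exponential, t, pop, p0=(100, 0.2))

sse = lambda f, p: np.sum((pop - f(t, *p)) ** 2)   # sum of squared errors
print("linear SSE:", round(sse(linear, pl), 1),
      "exponential SSE:", round(sse(exponential, pe), 1))
print("fitted growth rate r =", round(pe[1], 3))   # constant relative rate of change
```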

Keywords: Covariation reasoning, exponential function, modeling, representations.

6746 Application Reliability Method for Concrete Dams

Authors: Mustapha Kamel Mihoubi, Mohamed Essadik Kerkar

Abstract:

Probabilistic risk analysis models are used to provide a better understanding of the reliability and structural failure of works, including when assessing the stability of large structures against a major risk in the event of an accident or breakdown. This work studies the probability of failure of concrete dams through the application of reliability analysis methods used in engineering, in our case level 2 methods via the limit-state study. Hence, the probability of failure is estimated by analytical methods of the first-order reliability method (FORM) and second-order reliability method (SORM) type. By way of comparison, a level 3 method was also used; it generates a full analysis of the problem and involves integration of the probability density function of the random variables over the safety domain using the Monte Carlo simulation method. Taking into account the change in stress under the load combinations acting on the dam (normal, exceptional and extreme), the calculations provided acceptable failure probability values that largely corroborate the theory. In fact, the probability of failure tends to increase with increasing load intensities, causing a significant decrease in strength; the induced shear forces then produce sliding that threatens the reliability of the structure with intolerable values of the probability of failure, especially when uplift increases under a hypothetical failure of the drainage system.
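
A minimal sketch of the level 3 (Monte Carlo) side of such an analysis: sample the random variables and count violations of a limit-state function g ≤ 0. The sliding limit state, the distributions, and all numerical values below are illustrative assumptions, not the dam data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def g_sliding(c, phi_deg, uplift, n=1.0e6, t=2.0e5):
    """Illustrative sliding limit state: resisting force minus driving shear force.
    c: cohesion resultant (kN), phi_deg: friction angle (deg), uplift: uplift force (kN);
    n, t: normal and tangential loads (kN). g <= 0 means failure."""
    return c + (n - uplift) * np.tan(np.radians(phi_deg)) - t

N = 200_000
c = rng.normal(5.0e4, 1.0e4, N)          # illustrative distributions
phi = rng.normal(35.0, 3.0, N)
uplift = rng.lognormal(mean=np.log(3.0e5), sigma=0.3, size=N)

pf = np.mean(g_sliding(c, phi, uplift) <= 0.0)
print(f"Estimated probability of sliding failure: {pf:.5f}")
```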

Keywords: Dam, failure, limit-state, Monte Carlo simulation, reliability, probability, simulation, sliding, Taylor.

6745 Determination of Measurement Uncertainty in Extracting of Forming Limit Diagrams

Authors: M. Mahboubkhah, H. Fayazfar

Abstract:

In this research, Forming Limit Diagrams (FLDs) for supertension sheet metals used in the automobile industry have been obtained. The strains exerted on the sheet metals have been measured with four different methods, and the errors of each method are also presented. These methods are compared with each other, and the most efficient and economical way of extracting the exerted strains is introduced. In this paper the total error and uncertainty of the FLD extraction procedures are derived. Determination of the measurement uncertainty in extracting the FLD is of great importance in the design and analysis of the sheet metal forming process.

Keywords: Forming Limit Diagram, Major and Minor Strain, Measurement Uncertainty.

6744 Estimation of Real Power Transfer Allocation Using Intelligent Systems

Authors: H. Shareef, A. Mohamed, S. A. Khalid, Aziah Khamis

Abstract:

This paper presents the application of artificial intelligence (AI) techniques, namely the artificial neural network (ANN) and the adaptive neuro-fuzzy inference system (ANFIS), to estimate the real power transfer between generators and loads. Since these AI techniques adopt supervised learning, the modified nodal equations (MNE) method is first used to determine the real power contribution from each generator to the loads. Then the results of the MNE method and load flow information are utilized to estimate the power transfer using the AI techniques. The 25-bus equivalent system of south Malaysia is utilized as a test system to illustrate the effectiveness of both AI methods compared to that of the MNE method. The mean squared errors of the ANN and ANFIS power transfer allocation estimates are 1.19E-05 and 2.97E-05, respectively. Furthermore, compared to the MNE method, the ANN and ANFIS methods compute the generator contribution to loads within 20.99 ms and 39.37 ms, respectively, whereas the MNE method takes 360 ms for the calculation of the same real power transfer allocation.
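
A hedged sketch of the supervised-learning setup only: a neural network is trained on load-flow-style features with MNE-derived allocations as targets. The feature layout, network size, and the random stand-in data are assumptions; this is not the 25-bus system or the network architecture used in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Stand-in data: rows = operating states, features = load flow quantities
# (bus injections, line flows), targets = MNE-computed generator-to-load transfers.
rng = np.random.default_rng(1)
X = rng.random((500, 30))
y = rng.random((500, 8))          # e.g. 8 generator-load transfer values per state

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
mse = np.mean((ann.predict(X_te) - y_te) ** 2)
print(f"Test MSE of the ANN allocation estimator: {mse:.4f}")
```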

Keywords: Artificial intelligence, Power tracing, Artificial neural network, ANFIS, Power system deregulation.

6743 Probabilistic Electrical Power Generation Modeling Using Decimal to Binary Conversion

Authors: Ahmed S. Al-Abdulwahab

Abstract:

Generation system reliability assessment is an important task which can be performed using deterministic or probabilistic techniques. The probabilistic approaches have significant advantages over the deterministic methods; however, more complicated modeling is required by the probabilistic approaches. A power generation model is a basic requirement for this assessment. One form of generation model is the well-known capacity outage probability table (COPT). Different analytical techniques have been used to construct the COPT. These approaches require considerable mathematical modeling of the generating units, and the unit models are combined to build the COPT, which adds further burden to the process of creating the COPT. The Decimal to Binary Conversion (DBC) technique is widely and commonly applied in electronic systems and computing. This paper proposes a novel utilization of DBC to create the COPT without engaging in analytical modeling or time-consuming simulations. The simple binary representation, “0” and “1”, is used to model the states of the generating units. The proposed technique is proven to be an effective approach to building the generation model.
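
A sketch of the DBC idea as the abstract describes it: each integer from 0 to 2^n - 1 is converted to binary, bit j gives the state of unit j (1 = on outage), and the state probabilities are accumulated into the COPT. The unit capacities and forced outage rates in the example are illustrative.

```python
from collections import defaultdict

def copt_by_binary_conversion(units):
    """units: list of (capacity_MW, forced_outage_rate). Each integer 0..2^n-1
    is converted to binary; bit j = 1 means unit j is on outage. Returns the
    capacity outage probability table as {capacity_out_MW: probability}."""
    n = len(units)
    table = defaultdict(float)
    for state in range(2 ** n):
        cap_out, prob = 0.0, 1.0
        for j, (cap, fo_rate) in enumerate(units):
            if (state >> j) & 1:          # bit j of the binary representation
                cap_out += cap
                prob *= fo_rate           # unit j on outage
            else:
                prob *= (1.0 - fo_rate)   # unit j in service
        table[cap_out] += prob
    return dict(sorted(table.items()))

# Example: three units of 50, 50 and 100 MW with illustrative outage rates.
print(copt_by_binary_conversion([(50, 0.02), (50, 0.02), (100, 0.05)]))
```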

Keywords: Decimal to Binary, generation, reliability.

6742 Fuzzy Based Environmental System Approach for Impact Assessment - Case Studies

Authors: Marius Pislaru, Alexandru F. Trandabat

Abstract:

Environmental studies have expanded dramatically all over the world in the past few years. Nowadays, businesses interact with society and the environment in ways that leave their mark on both. Efforts to improve the human standard of living, through the control of nature and the development of new products, have also resulted in contamination of the environment. Consequently, companies play an important role in the environmental sustainability of a region or country. Therefore we can say that a company's sustainable development is strictly dependent on the environment. This article presents a fuzzy model to evaluate a company's environmental impact. The article illustrates an example from the automotive industry in order to demonstrate the usefulness of such a model.

Keywords: Fuzzy approach, environmental impact assessment, sustainability.

6741 A Comparison of Some Thresholding Selection Methods for Wavelet Regression

Authors: Alsaidi M. Altaher, Mohd T. Ismail

Abstract:

In wavelet regression, choosing the threshold value is a crucial issue. A value that is too large cuts too many coefficients, resulting in oversmoothing. Conversely, a threshold value that is too small allows many coefficients to be included in the reconstruction, giving a wiggly estimate that results in undersmoothing. The proper choice of threshold is therefore a careful balance between these principles. This paper gives a very brief introduction to some threshold selection methods. These methods include: Universal, SURE, EBayes, two-fold cross-validation and level-dependent cross-validation. A simulation study on a variety of sample sizes, test functions and signal-to-noise ratios is conducted to compare their numerical performance using three different noise structures. For Gaussian noise, EBayes outperforms the others in all cases for all the functions used, while two-fold cross-validation provides the best results in the case of long-tailed noise. For large signal-to-noise ratios, level-dependent cross-validation works well in the correlated noise case. As expected, increasing both the sample size and the signal-to-noise ratio increases estimation efficiency.
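
A minimal sketch of one of the compared rules, the universal threshold sigma·sqrt(2·log n) with soft thresholding of the detail coefficients, using PyWavelets. Estimating sigma from the MAD of the finest-level coefficients and the choice of wavelet and level are common conventions assumed here, not the paper's exact setup.

```python
import numpy as np
import pywt

def universal_threshold_denoise(y, wavelet="sym8", level=5):
    """Wavelet regression with the universal threshold sigma*sqrt(2*log(n))."""
    coeffs = pywt.wavedec(y, wavelet, level=level)
    # Estimate noise sigma from the finest detail level via the median absolute deviation.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(len(y)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)

# Example: noisy samples of a smooth test function.
x = np.linspace(0, 1, 1024)
y = np.sin(4 * np.pi * x) + 0.3 * np.random.randn(x.size)
f_hat = universal_threshold_denoise(y)
```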

Keywords: Wavelet regression, simulation, threshold.
