Search results for: Desirability Function Approach.
5938 A Distributed Approach to Extract High Utility Itemsets from XML Data
Authors: S. Kannimuthu, K. Premalatha
Abstract:
This paper investigates a new data mining capability that entails mining of High Utility Itemsets (HUI) in a distributed environment. Existing research in data mining deals only with the presence or absence of items and does not consider semantic measures such as the weight or cost of the items; HUI mining algorithms have evolved to address this. HUI mining is a form of utility mining that aims to identify itemsets whose utility satisfies a given threshold. However, mining HUIs in a distributed environment, and mining them from XML data, have not been explored yet. In this work, a novel approach is proposed to mine HUIs from XML-based data in a distributed environment. The work utilizes the Service Oriented Computing (SOC) paradigm, which provides Knowledge as a Service (KaaS). The interesting patterns are provided via web services, with the help of a knowledge server, to answer the queries of consumers. The performance of the approach is evaluated on various databases in terms of execution time and memory consumption.
Keywords: Data mining, Knowledge as a Service, service oriented computing, utility mining.
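For readers unfamiliar with utility mining, the minimal Python sketch below shows the core idea the abstract relies on: an itemset's utility is the sum of quantity times unit profit over the transactions containing it, and only itemsets meeting a minimum-utility threshold are kept. The transaction format and brute-force enumeration are illustrative assumptions, not the authors' distributed XML/KaaS implementation.

from itertools import combinations

# Illustrative transactions: item -> purchased quantity, plus external unit profits.
transactions = [
    {"a": 2, "b": 1},
    {"a": 1, "c": 3},
    {"b": 2, "c": 1},
]
unit_profit = {"a": 5, "b": 3, "c": 1}

def itemset_utility(itemset, db, profit):
    """Sum of quantity * unit profit over transactions containing the whole itemset."""
    total = 0
    for t in db:
        if all(i in t for i in itemset):
            total += sum(t[i] * profit[i] for i in itemset)
    return total

def high_utility_itemsets(db, profit, min_util):
    """Brute-force enumeration (exponential; real HUI miners prune the search space)."""
    items = sorted({i for t in db for i in t})
    result = {}
    for k in range(1, len(items) + 1):
        for combo in combinations(items, k):
            u = itemset_utility(combo, db, profit)
            if u >= min_util:
                result[combo] = u
    return result

print(high_utility_itemsets(transactions, unit_profit, min_util=10))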
5937 Improving Image Segmentation Performance via Edge Preserving Regularization
Authors: Ying-jie Zhang, Li-ling Ge
Abstract:
This paper presents an improved image segmentation model with edge-preserving regularization based on the piecewise-smooth Mumford-Shah functional. A level set formulation is considered for the Mumford-Shah functional minimization in segmentation, and the corresponding partial differential equations are solved by backward Euler discretization. To encourage edge-preserving regularization, a new edge indicator function is introduced in the level set framework, in which all the grid points used to locate the level set curve are considered to avoid blurring the edges, and a nonlinear smoothness constraint function is applied as the regularization term to smooth the image in the isophote direction instead of the gradient direction. In the implementation, strategies such as a new scheme for extending the computation of u+ and u- to the grid points and for speeding up convergence are studied to improve the efficiency of the algorithm. The resulting algorithm has been implemented and compared with previous methods, and has proved efficient in several cases.
Keywords: Energy minimization, image segmentation, level sets, edge regularization.
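A commonly used edge indicator of the kind referred to above is g = 1/(1 + |∇I|²), which is small on edges; the minimal sketch below computes it for a toy image. It is only illustrative — the paper's modified indicator and the backward Euler level set solver are not reproduced here.

import numpy as np

def edge_indicator(image):
    """Classical edge-stopping function g = 1 / (1 + |grad I|^2); g is small on edges."""
    gy, gx = np.gradient(image.astype(float))
    return 1.0 / (1.0 + gx**2 + gy**2)

# toy image with a vertical step edge
img = np.zeros((5, 5))
img[:, 3:] = 1.0
print(edge_indicator(img).round(3))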
5936 Evaluation of Edge Configuration in Medical Echo Images Using Genetic Algorithms
Authors: Ching-Fen Jiang
Abstract:
Edge detection is usually the first step in medical image processing. However, the difficulty increases when a conventional kernel-based edge detector is applied to ultrasonic images with a textural pattern and speckle noise. We designed an adaptive diffusion filter to remove speckle noise while preserving the initial edges detected by a Sobel edge detector. We also propose a genetic algorithm for edge selection to form complete boundaries of the detected entities. We designed two fitness functions to evaluate whether a criterion with a complex edge configuration can render a better result than a simple criterion such as gradient strength. The edges obtained by using the complex fitness function are thicker and more fragmented than those obtained by using the simple fitness function, suggesting that a complex edge-selection scheme is not necessary for good edge detection in medical ultrasonic images; instead, a proper noise-smoothing filter is the key.
Keywords: Edge detection, ultrasonic images, speckle noise.
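A minimal sketch of the simple criterion mentioned above: gradient strength from a Sobel operator, averaged over a candidate set of edge pixels. The adaptive diffusion filter and the genetic edge-selection scheme are not reproduced, and the toy image is an assumption.

import numpy as np
from scipy import ndimage

def sobel_gradient_magnitude(image):
    """Sobel responses in x and y combined into a gradient magnitude map."""
    gx = ndimage.sobel(image.astype(float), axis=1)
    gy = ndimage.sobel(image.astype(float), axis=0)
    return np.hypot(gx, gy)

def simple_fitness(edge_pixels, grad_mag):
    """Simple fitness: mean gradient strength over a candidate set of edge pixels."""
    rows, cols = zip(*edge_pixels)
    return grad_mag[rows, cols].mean()

img = np.zeros((6, 6))
img[:, 3:] = 1.0                       # toy image with a vertical step edge
g = sobel_gradient_magnitude(img)
print(simple_fitness([(2, 2), (2, 3), (3, 2), (3, 3)], g))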
5935 Automata Theory Approach for Solving Frequent Pattern Discovery Problems
Authors: Renáta Iváncsy, István Vajk
Abstract:
The various types of frequent pattern discovery problems, namely the frequent itemset, sequence, and graph mining problems, are solved in different ways that are, however, similar in certain aspects. The main approaches to discovering such patterns can be classified into two classes: level-wise methods and database projection-based methods. Level-wise algorithms generally use clever indexing structures for discovering the patterns. In this paper a new approach is proposed for efficiently discovering frequent sequences and tree-like patterns that is based on the level-wise idea. Because level-wise algorithms spend a lot of time on the subpattern testing problem, the new approach introduces the idea of using automata theory to solve this problem.
Keywords: Frequent pattern discovery, graph mining, pushdown automaton, sequence mining, state machine, tree mining.
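A minimal sketch of the subpattern (subsequence) test that level-wise algorithms repeat many times, written as a simple state machine whose state is a position in the candidate pattern; the paper's pushdown-automaton construction for tree-like patterns is not reproduced.

def contains_subsequence(candidate, sequence):
    """Run a simple automaton whose states are positions in the candidate pattern;
    each item of the data sequence may advance the state by one."""
    state = 0
    for item in sequence:
        if state < len(candidate) and item == candidate[state]:
            state += 1
            if state == len(candidate):
                return True   # accepting state reached
    return state == len(candidate)

print(contains_subsequence(["a", "c"], ["a", "b", "c", "d"]))  # True
print(contains_subsequence(["c", "a"], ["a", "b", "c", "d"]))  # False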
5934 Connectionist Approach to Generic Text Summarization
Authors: Rajesh S. Prasad, U. V. Kulkarni, Jayashree R. Prasad
Abstract:
As the enormous amount of on-line text on the World-Wide Web grows, the development of methods for automatically summarizing this text becomes more important. The primary goal of this research is to create an efficient tool that is able to summarize large documents automatically. We propose an Evolving Connectionist System, an adaptive, incremental learning and knowledge representation system that evolves its structure and functionality. In this paper, we propose a novel approach to part-of-speech disambiguation using a recurrent neural network, a paradigm capable of dealing with sequential data. We observed that the connectionist approach to text summarization has a natural way of learning grammatical structures through experience. Experimental results show that our approach achieves acceptable performance.
5933 Limits Problem Solving in Engineering Careers: Competences and Errors
Authors: Veronica Diaz Quezada
Abstract:
In this article, students' performance and errors in solving limit problems of a real-valued function are characterized and analysed, in correspondence with competency-based education in engineering careers in the south of Chile. The methodological component is contextualised in qualitative research, with a descriptive and exploratory design, including the elaboration, content validation, and application of quantitative instruments consisting of two parallel forms of open-answer tests based on limit application problems. The mathematical competences and errors made by students from five engineering careers at a public university are identified and characterized. Results show better performance only in the routine-context problem-solving competence, in which students are oriented towards a rational solution or use a suitable problem-solving method to reach the correct solution. Regarding errors, most of them are related to techniques and the incorrect use of theorems and definitions of limits of real-valued functions of a real variable.
Keywords: Engineering education, errors, limits, mathematics competences, problem solving.
5932 A Sub-Pixel Image Registration Technique with Applications to Defect Detection
Authors: Zhen-Hui Hu, Jyh-Shong Ju, Ming-Hwei Perng
Abstract:
This paper presents a useful sub-pixel image registration method using line segments and a sub-pixel edge detector. In this approach, straight line segments are first extracted from gray images at the pixel level before applying the sub-pixel edge detector. Next, all sub-pixel line edges are mapped onto the orientation-distance parameter space to solve for line correspondence between images. Finally, the registration parameters with sub-pixel accuracy are analytically solved via two linear least-squares problems. The present approach can be applied to various fields where fast registration with sub-pixel accuracy is required. To illustrate, the present approach is applied to the inspection of printed circuits on a flat panel. A numerical example shows that the present approach is effective and accurate when target images contain a sufficient number of line segments, which is true in many industrial problems.
Keywords: Defect detection, image registration, straight line segment, sub-pixel.
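Under the standard (theta, rho) line parameterization x cos(theta) + y sin(theta) = rho, a hedged sketch of two least-squares steps of the kind described could look as follows: the rotation is estimated from orientation differences, then the translation from a linear fit on the rho residuals. The synthetic data and variable names are illustrative, not the paper's.

import numpy as np

def register_from_lines(theta_ref, rho_ref, theta_tgt, rho_tgt):
    """Recover rotation and translation from matched lines (theta, rho).
    Rotation: mean of wrapped orientation differences (first least-squares problem).
    Translation: linear least-squares fit on the rho residuals (second problem)."""
    phi = np.mean((theta_tgt - theta_ref + np.pi) % (2 * np.pi) - np.pi)
    A = np.column_stack([np.cos(theta_tgt), np.sin(theta_tgt)])
    b = rho_tgt - rho_ref
    (tx, ty), *_ = np.linalg.lstsq(A, b, rcond=None)
    return phi, tx, ty

# synthetic check: rotate lines by 0.1 rad and translate by (2, -1)
theta = np.array([0.0, 0.8, 1.6]); rho = np.array([1.0, 2.0, 0.5])
phi_true, t_true = 0.1, np.array([2.0, -1.0])
theta2 = theta + phi_true
rho2 = rho + t_true[0] * np.cos(theta2) + t_true[1] * np.sin(theta2)
print(register_from_lines(theta, rho, theta2, rho2))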
5931 A Cohesive Lagrangian Swarm and Its Application to Multiple Unicycle-like Vehicles
Authors: Jito Vanualailai, Bibhya Sharma
Abstract:
Swarm principles are increasingly being used to design controllers for the coordination of multi-robot systems or, in general, multi-agent systems. This paper proposes a two-dimensional Lagrangian swarm model that enables the planar agents, modeled as point masses, to swarm whilst effectively avoiding each other and obstacles in the environment. A novel method, based on an extended Lyapunov approach, is used to construct the model. Importantly, the Lyapunov method ensures a form of practical stability that guarantees an emergent behavior, namely, a cohesive and well-spaced swarm with a constant arrangement of individuals about the swarm centroid. Computer simulations illustrate this basic feature of collective behavior. As an application, we show how multiple planar mobile unicycle-like robots swarm to eventually form patterns in which their velocities and orientations stabilize.
Keywords: Attractive-repulsive swarm model, individual-based swarm model, Lagrangian swarm model, Lyapunov stability, Lyapunov-like function, practical stability, unicycle.
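A minimal point-mass sketch of the attractive-repulsive mechanism described above: each agent is attracted to the swarm centroid and repelled, at short range, from its neighbours, producing a cohesive, well-spaced group. The Lyapunov-based construction and the unicycle dynamics are not reproduced, and all gains are illustrative.

import numpy as np

def swarm_step(positions, dt=0.05, k_attract=1.0, k_repel=0.5):
    """One explicit Euler step of an attractive-repulsive point-mass swarm."""
    centroid = positions.mean(axis=0)
    new_pos = positions.copy()
    for i, p in enumerate(positions):
        force = k_attract * (centroid - p)          # attraction toward the centroid
        for j, q in enumerate(positions):
            if i != j:
                d = p - q
                force += k_repel * d / (np.dot(d, d) + 1e-9)   # short-range repulsion
        new_pos[i] = p + dt * force
    return new_pos

pts = np.random.default_rng(0).uniform(-5, 5, size=(6, 2))
for _ in range(200):
    pts = swarm_step(pts)
print(pts.round(2))   # agents settle into a cohesive, well-spaced arrangement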
5930 A Study for Carbonation Degree on Concrete using a Phenolphthalein Indicator and Fourier-Transform Infrared Spectroscopy
Authors: Ho Jae Lee, Do Gyeum Kim, Jang Hwa Lee, Myoung Suk Cho
Abstract:
A concrete structure is designed and constructed for its purpose of use and is expected to maintain its function over the target service life for which it was planned. Nevertheless, as time elapses, the structure gradually deteriorates and eventually degrades to the point where it can no longer perform the function for which it was planned. The ability of concrete to maintain the required level of performance over the designed period of use, with little deterioration caused by the passage of time under the design conditions, is referred to as durability. There are a number of causes of durability degradation, but chloride attack, carbonation, and freeze-thaw are the main ones. In this study, carbonation, one of the main causes of deterioration of the durability of a concrete structure, was investigated via a microstructure analysis technique. The measurement of carbonation was studied using the existing indicator method, and a method of measuring the progress of carbonation quantitatively was simultaneously studied using an FT-IR (Fourier-Transform Infrared) spectrometer along with the microstructure analysis technique.
Keywords: Concrete, carbonation, microstructure, FT-IR.
5929 Processing Web-Cam Images by a Neuro-Fuzzy Approach for Vehicular Traffic Monitoring
Authors: A. Faro, D. Giordano, C. Spampinato
Abstract:
Traffic management in an urban area is highly facilitated by knowledge of the traffic conditions in every street or highway involved in the vehicular mobility system. The aim of the paper is to propose a neuro-fuzzy approach able to compute the main parameters of a traffic system, i.e., car density, velocity, and flow, by using the images collected by web-cams located at the crossroads of the traffic network. The performance of this approach encourages its application when the traffic system is far from saturation. A fuzzy model is also outlined to evaluate when it is suitable to use more accurate, even if more time-consuming, algorithms for measuring traffic conditions near saturation.
Keywords: Neuro-fuzzy networks, computer vision, Fuzzy systems, intelligent transportation system.
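For reference, the three parameters estimated by the approach are linked by the standard macroscopic relation of traffic flow (a textbook identity, not a formula specific to this paper):

q = k \, v, \qquad \text{flow (veh/h)} = \text{density (veh/km)} \times \text{space-mean speed (km/h)} .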
5928 Power-Efficient AND-EXOR-INV Based Realization of Achilles' heel Logic Functions
Authors: Padmanabhan Balasubramanian, R. Chinnadurai
Abstract:
This paper deals with a power-conscious AND-EXOR-Inverter type logic implementation for a complex class of Boolean functions, namely Achilles' heel functions. Different variants of the above function class, viz. positive, negative, and pure Horn, have been considered for analysis and simulation purposes. The proposed realization is compared with the decomposed implementation corresponding to an existing standard AND-EXOR logic minimizer; both result in Boolean networks with a good testability attribute. It may be noted that an AND-OR-EXOR type logic network does not exist for the positive phase of this unique class of logic functions. Experimental results report significant savings in all the power consumption components for designs based on standard cells pertaining to a 130nm UMC CMOS process. The simulations have been extended to validate the savings across all three library corners (typical, best, and worst case specifications).
Keywords: Achilles' heel functions, AND-EXOR-Inverter logic, CMOS technology, low power design.
5927 Operating Conditions Optimization of Steam Injection in Enhanced Oil Recovery Using Duelist Algorithm
Authors: Totok R. Biyanto, Sonny Irawan, Hiskia J. Ginting, Matradji, Ya’umar, A. I. Fitri
Abstract:
Steam injection is the most suitable Enhanced Oil Recovery (EOR) method for recovering high-viscosity oil. This is due to the ability of steam to reduce oil viscosity and increase the sweep of oil from the injection well toward the production well. The operating conditions in production should match the target operating condition at the bottom of the production well; this is influenced by the oil properties and the reservoir rock properties. Hence, the operating conditions should be optimized. Optimization requires three components, i.e., an objective function, a model, and an optimization technique. In this paper, the objective function is to obtain the optimum operating condition at the production well. The model was built using the Darcy equation and mass-energy balances. The optimization technique utilizes the Duelist Algorithm, owing to its effectiveness in obtaining the desired results at the optimum operating condition.
Keywords: Enhanced oil recovery, steam injection, operating conditions, modeling, optimization, Duelist algorithm.
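For reference, the Darcy building block cited above, in its standard single-phase form (the paper's coupled mass-energy balance model is not reproduced here):

q = \frac{k A}{\mu} \cdot \frac{\Delta p}{L},

where q is the volumetric flow rate, k the permeability, A the cross-sectional area, \mu the fluid viscosity, and \Delta p the pressure drop over flow length L.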
5926 A Generic, Functionally Comprehensive Approach to Maintaining an Ontology as a Relational Database
Authors: Jennifer Leopold, Alton Coalter, Leong Lee
Abstract:
An ontology is a data model that represents a set of concepts in a given field and the relationships among those concepts. As the emphasis on achieving a semantic web continues to escalate, ontologies for all types of domains will increasingly be developed. These ontologies may become large and complex, and as their size and complexity grow, so will the need for multi-user interfaces for ontology curation. Herein a functionally comprehensive, generic approach to maintaining an ontology as a relational database is presented. Unlike many other ontology editors that utilize a database, this approach is entirely domain-generic and fully supports Web-based, collaborative editing, including the designation of different levels of authorization for users.
Keywords: Ontology editor, relational database, collaborative curation.
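A minimal, domain-generic sketch of how concepts and typed relationships can be kept in a relational database (Python with sqlite3); the table and column names are illustrative assumptions, not the authors' schema.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE concept (
    id    INTEGER PRIMARY KEY,
    label TEXT NOT NULL UNIQUE
);
CREATE TABLE relationship (
    id        INTEGER PRIMARY KEY,
    subject   INTEGER NOT NULL REFERENCES concept(id),
    predicate TEXT    NOT NULL,            -- e.g. 'is_a', 'part_of'
    object    INTEGER NOT NULL REFERENCES concept(id)
);
""")
conn.executemany("INSERT INTO concept(label) VALUES (?)", [("vertebra",), ("skeleton",)])
conn.execute("""INSERT INTO relationship(subject, predicate, object)
                VALUES ((SELECT id FROM concept WHERE label='vertebra'), 'part_of',
                        (SELECT id FROM concept WHERE label='skeleton'))""")
print(conn.execute("""SELECT c1.label, r.predicate, c2.label
                      FROM relationship r
                      JOIN concept c1 ON c1.id = r.subject
                      JOIN concept c2 ON c2.id = r.object""").fetchall())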
5925 A Particle Swarm Optimization Approach for the Earliness-Tardiness No-Wait Flowshop Scheduling Problem
Authors: Sedighe Arabameri, Nasser Salmasi
Abstract:
In this research a particle swarm optimization (PSO) algorithm is proposed for the no-wait flowshop scheduling problem with sequence-dependent setup times and weighted earliness-tardiness penalties as the criterion. The smallest position value (SPV) rule is applied to convert the continuous position vector of particles in PSO into job permutations. A timing algorithm is developed to find the optimal schedule and calculate the objective function value of a given sequence in the PSO algorithm. Two different neighborhood structures are applied to improve the solution quality of the PSO algorithm: the first is based on variable neighborhood search (VNS) and the second is a simple one with an invariable structure. In order to compare the performance of the two neighborhood structures, random test problems are generated and solved by both approaches. Computational results show that the VNS-based algorithm performs better than the other one, especially for large-sized problems.
Keywords: Minimization of summation of weighted earliness and tardiness, no-wait flowshop scheduling, particle swarm optimization, sequence-dependent setup times.
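A minimal sketch of the SPV rule described above: jobs are ordered by ascending position value, turning one particle's continuous position vector into a job permutation. The timing algorithm and the neighborhood structures are not reproduced.

import numpy as np

def spv_rule(position):
    """Smallest Position Value rule: the job whose position component is smallest
    is scheduled first, the next smallest second, and so on."""
    return list(np.argsort(position))

position = np.array([0.7, -1.2, 0.1, 2.3])   # one particle's position vector
print(spv_rule(position))                    # [1, 2, 0, 3] -> job sequence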
5924 Application of Artificial Intelligence for Tuning the Parameters of an AGC
Authors: R. N. Patel
Abstract:
This paper deals with the tuning of parameters for Automatic Generation Control (AGC). A two-area interconnected hydrothermal system with a PI controller is considered. Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) algorithms have been applied to optimize the controller parameters. Two objective functions, namely Integral Square Error (ISE) and Integral of Time-multiplied Absolute value of the Error (ITAE), are considered for optimization. The effectiveness of an objective function is assessed based on the variation in tie-line power and the change in frequency in both areas. MATLAB/SIMULINK was used as the simulation tool. Simulation results reveal that ITAE is a better objective function than ISE. The performances of the optimization algorithms are also compared, and it was found that the genetic algorithm gives better results than particle swarm optimization for the AGC problem.
Keywords: Area control error, Artificial intelligence, Automatic generation control, Genetic Algorithms and modeling, ISE, ITAE, Particle swarm optimization.
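A minimal sketch of the two objective functions named above, computed for an error signal sampled on a uniform time grid; the two-area AGC model itself is not reproduced and the test signal below is synthetic.

import numpy as np

def ise(t, e):
    """Integral of Squared Error, approximated on a uniform time grid."""
    dt = t[1] - t[0]
    return float(np.sum(e ** 2) * dt)

def itae(t, e):
    """Integral of Time-multiplied Absolute Error on the same grid."""
    dt = t[1] - t[0]
    return float(np.sum(t * np.abs(e)) * dt)

t = np.linspace(0, 10, 1001)
e = np.exp(-t) * np.sin(2 * np.pi * 0.5 * t)   # a decaying, frequency-deviation-like error
print("ISE  =", ise(t, e))
print("ITAE =", itae(t, e))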
5923 Perturbation in the Fractional Fourier Span due to Erroneous Transform Order and Window Function
Authors: Sukrit Shankar, Chetana Shanta Patsa, Jaydev Sharma
Abstract:
The Fractional Fourier Transform is a generalization of the classical Fourier Transform. The Fractional Fourier span in general depends on the amplitude and phase functions of the signal and varies with the transform order. However, with the development of Fractional Fourier filter banks, it is advantageous in some cases to have different transform orders for different filter banks to achieve better decorrelation of the windowed and overlapped time signal. We present an expression that is useful for finding the perturbation in the Fractional Fourier span due to an erroneous transform order and possible variation in the window shape and length. The expression is based on the dependency of the time-Fractional Fourier span uncertainty on the amplitude and phase functions of the signal. We also show, with the help of the developed expression, that the perturbation of the span has a varying degree of sensitivity to the transform order and the window coefficients.
Keywords: Fractional Fourier Transform, perturbation, Fractional Fourier span, amplitude, phase, transform order, filter banks.
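For reference, the fractional Fourier transform of order a (rotation angle \alpha = a\pi/2, \alpha not a multiple of \pi) is conventionally defined as below; this is the textbook definition, not the paper's perturbation expression:

F_\alpha\{f\}(u) = \int_{-\infty}^{\infty} f(t)\, K_\alpha(t,u)\, dt, \qquad
K_\alpha(t,u) = \sqrt{\frac{1 - i\cot\alpha}{2\pi}}\;
\exp\!\Big( i\,\frac{t^{2}+u^{2}}{2}\cot\alpha \;-\; i\,t u \csc\alpha \Big).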
5922 Fourier Galerkin Approach to Wave Equation with Absorbing Boundary Conditions
Authors: Alexandra Leukauf, Alexander Schirrer, Emir Talic
Abstract:
Numerical computation of wave propagation in a large domain usually requires significant computational effort. Hence, the considered domain must be truncated to a smaller domain of interest. In addition, special boundary conditions, which absorb the outward-travelling waves, need to be implemented in order to describe the system domains correctly. In this work, the linear one-dimensional wave equation is approximated by utilizing the Fourier Galerkin approach. Furthermore, the artificial boundaries are realized with absorbing boundary conditions. A systematic workflow for setting up the wave problem, including the absorbing boundary conditions, is proposed. As a result, a convenient modal system description with an effective absorbing boundary formulation is established. Moreover, the truncated model shows high accuracy compared to the global domain.
Keywords: Absorbing boundary conditions, boundary control, Fourier Galerkin approach, modal approach, wave equation.
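For reference, a minimal statement of the truncated problem with first-order (Sommerfeld-type) absorbing boundary conditions of the kind described; the Fourier Galerkin discretization itself is not reproduced:

\frac{\partial^{2} u}{\partial t^{2}} = c^{2}\,\frac{\partial^{2} u}{\partial x^{2}}, \quad x \in (0, L),
\qquad
\Big(\frac{\partial u}{\partial t} - c\,\frac{\partial u}{\partial x}\Big)\Big|_{x=0} = 0,
\qquad
\Big(\frac{\partial u}{\partial t} + c\,\frac{\partial u}{\partial x}\Big)\Big|_{x=L} = 0,

so that outgoing characteristics leave the domain without reflection.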
5921 Gold Nanoparticle: Synthesis, Characterization, Clinico-Pathological, Pathological, and Bio-Distribution Studies in Rabbits
Authors: M. M. Bashandy, A. R. Ahmed, M. El-Gaffary, Sahar S. Abd El-Rahman
Abstract:
This study evaluated the acute toxicity and tissue distribution of intravenously administered gold nanoparticles (AuNPs) in male rabbits. Rabbits were exposed to a single dose of AuNPs (300 μg/kg). Toxic effects were assessed via general behavior, hematological parameters, serum biochemical parameters, and histopathological examination of the rabbits' organs. Inductively coupled plasma–mass spectrometry (ICP-MS) was used to determine gold concentrations in tissue samples collected at predetermined time intervals. After one week, AuNPs exerted no obvious acute toxicity in rabbits. However, inflammatory reactions were observed in the liver, lungs, and kidneys, accompanied by mild absolute neutrophilia and significant monocytosis. The highest gold levels were found in the spleen and liver, followed by the lungs and kidneys. These results indicate that AuNPs can be distributed extensively to various tissues in the body, but primarily to the spleen and liver.
Keywords: Gold nanoparticles, toxicity, pathology, hematology, liver function, kidney function.
5920 Base Change for Fisher Metrics: Case of the q−Gaussian Inverse Distribution
Authors: Gabriel I. Loaiza O., Carlos A. Cadavid M., Juan C. Arango P.
Abstract:
It is known that the Riemannian manifold determined by the family of inverse Gaussian distributions endowed with the Fisher metric has negative constant curvature κ = −1/2 , as does the family of usual Gaussian distributions. In the present paper, firstly we arrive at this result by following a different path, much simpler than the previous ones. We first put the family in exponential form, thus endowing the family with a new set of parameters, or coordinates, θ1, θ2; then we determine the matrix of the Fisher metric in terms of these parameters; and finally we compute this matrix in the original parameters. Secondly, we define the Inverse q−Gaussian distribution family (q < 3), as the family obtained by replacing the usual exponential function by the Tsallis q−exponential function in the expression for the Inverse Gaussian distribution, and observe that it supports two possible geometries, the Fisher and the q−Fisher geometry. And finally, we apply our strategy to obtain results about the Fisher and q−Fisher geometry of the Inverse q−Gaussian distribution family, similar to the ones obtained in the case of the Inverse Gaussian distribution family.
Keywords: Base of Changes, Information Geometry, Inverse Gaussian distribution, Inverse q-Gaussian distribution, Statistical Manifolds.
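For reference, the two ingredients of the construction described above: the inverse Gaussian density and the Tsallis q-exponential that replaces the exponential in the q-deformed family (standard definitions, not the paper's derived metrics):

f(x; \mu, \lambda) = \sqrt{\frac{\lambda}{2\pi x^{3}}}\,
\exp\!\left(-\frac{\lambda (x-\mu)^{2}}{2\mu^{2} x}\right), \quad x > 0,
\qquad
e_{q}(x) = \big[\,1 + (1-q)\,x\,\big]_{+}^{\frac{1}{1-q}}, \quad q \neq 1,

with e_{q}(x) \to e^{x} as q \to 1.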
5919 Probabilistic Characteristics of older PR Frames in the Mid-America Earthquake Region
Authors: Do-Hwan Kim, Roberto Leon
Abstract:
The probabilistic characteristics of the seismic responses of partially restrained connection rotation (PRCR) and panel zone deformation (PZD) in older steel moment frames were investigated in accordance with statistical inference in the decision-making process. The 4-, 6- and 8-story older steel moment frames with clip angle and T-stub connections were designed and analyzed using 2%/50 yrs ground motions in four cities of the Mid-America earthquake region. The probability density function and cumulative distribution function of PRCR and PZD were determined by goodness-of-fit tests based on probabilistic parameters measured from the results of the nonlinear time-history analyses. The obtained probabilistic parameters and distributions can be used to determine which performance level the PR connections and panel zones mainly satisfy, and how many PR connections and panel zones experience serious damage under the Mid-America ground motions.
Keywords: Mid-America earthquake, panel zone, PR connection, probabilistic characteristics, seismic performance.
5918 A Post Processing Method for Quantum Prime Factorization Algorithm based on Randomized Approach
Authors: Mir Shahriar Emami, Mohammad Reza Meybodi
Abstract:
Prime factorization based on a quantum approach is performed in two phases. The first phase is carried out on a quantum computer and the second phase on a classical computer (post processing). In the second phase, the goal is to estimate the period r of the equation x^r ≡ 1 (mod N) and to find the prime factors of the composite integer N on the classical computer. In this paper we present a method based on a randomized approach for estimating the period r with a satisfactory probability, so that the composite integer N can be factorized; with the randomized approach, even if the estimate of the period is not exactly the real period, at least one of the prime factors of the composite N can be found. Finally we present some important points for designing an emulator for quantum computer simulation.
Keywords: Quantum prime factorization, randomized algorithms, quantum computer simulation, quantum computation.
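A minimal sketch of the standard classical post-processing step once a period estimate is available: for an even period r with x^(r/2) not congruent to -1 (mod N), nontrivial factors of N divide gcd(x^(r/2) ± 1, N). The randomized period-estimation procedure itself is not reproduced.

from math import gcd

def factors_from_period(N, x, r):
    """Given x^r = 1 (mod N) with r even and x^(r/2) != -1 (mod N),
    nontrivial factors of N divide gcd(x^(r/2) +/- 1, N)."""
    if r % 2:
        return None                      # odd period: try another base / estimate
    y = pow(x, r // 2, N)
    if y == N - 1:
        return None                      # trivial case, no factor recovered
    f1, f2 = gcd(y - 1, N), gcd(y + 1, N)
    return tuple(f for f in (f1, f2) if 1 < f < N) or None

# e.g. N = 15, x = 7 has period r = 4: 7^2 = 4 (mod 15), gcd(3,15)=3, gcd(5,15)=5
print(factors_from_period(15, 7, 4))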
5917 Globally Convergent Edge-preserving Reconstruction with Contour-line Smoothing
Authors: Marc C. Robini, Pierre-Jean Viverge, Yuemin Zhu, Jianhua Luo
Abstract:
The standard approach to image reconstruction is to stabilize the problem by including an edge-preserving roughness penalty in addition to faithfulness to the data. However, this methodology produces noisy object boundaries and creates a staircase effect. The existing attempts to favor the formation of smooth contour lines take the edge field explicitly into account; they either are computationally expensive or produce disappointing results. In this paper, we propose to incorporate the smoothness of the edge field in an implicit way by means of an additional penalty term defined in the wavelet domain. We also derive an efficient half-quadratic algorithm to solve the resulting optimization problem, including the case when the data fidelity term is non-quadratic and the cost function is nonconvex. Numerical experiments show that our technique preserves edge sharpness while smoothing contour lines; it produces visually pleasing reconstructions which are quantitatively better than those obtained without wavelet-domain constraints.
5916 Towards a Suitable and Systematic Approach for Component Based Software Development
Authors: Kuljit Kaur, Parminder Kaur, Jaspreet Bedi, Hardeep Singh
Abstract:
Software crisis refers to the situation in which developers are not able to complete projects within time and budget constraints, and moreover these over-scheduled and over-budget projects are of low quality as well. Several methodologies have been adopted from time to time to overcome this situation, and the focus is now on component-based software engineering. In this approach, the emphasis is on the reuse of already existing software artifacts. But the results cannot be achieved just by preaching the principles; they need to be practiced as well. This paper highlights some of the very basic elements of this approach, which have to be in place to achieve the desired goals of high-quality, low-cost software products with a shorter time-to-market.
Keywords: Component model, software components, software repository, process models.
5915 Real-time Laser Monitoring based on Pipe Detective Operation
Authors: Mongkorn Klingajay, Tawatchai Jitson
Abstract:
Pipe inspection is a difficult detection task. Most applications rely mainly on manual recognition of defective areas, with detection carried out by an engineer. Therefore, an automated process becomes necessary in order to avoid the cost incurred by such a manual process. An automated monitoring method to obtain a complete picture of the sewer condition is proposed in this work. The focus of the research is the automated identification and classification of discontinuities in the internal surface of the pipe. The methodology consists of several processing stages, including image segmentation into potential defect regions and extraction of geometrical characteristic features. Automatic recognition and classification of pipe defects are carried out by means of an artificial neural network (ANN) technique based on Radial Basis Functions (RBF). Experiments in a realistic environment have been conducted and results are presented.
Keywords: Artificial neural network, radial basis function, curve fitting, CCTV, image segmentation, data acquisition.
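A minimal sketch of an RBF network forward pass (Gaussian hidden units and a linear output layer) of the kind used for the classification stage; the image-processing pipeline, the feature vector, and the weights below are illustrative assumptions, not trained values.

import numpy as np

def rbf_forward(x, centers, widths, weights, bias):
    """Hidden layer: Gaussian activations phi_j = exp(-||x - c_j||^2 / (2 s_j^2));
    output layer: linear combination giving class scores."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    phi = np.exp(-d2 / (2.0 * widths ** 2))
    return weights @ phi + bias

rng = np.random.default_rng(1)
centers = rng.normal(size=(4, 3))        # 4 RBF units for 3-dimensional feature vectors
widths = np.ones(4)
weights = rng.normal(size=(2, 4))        # 2 defect classes
bias = np.zeros(2)
x = np.array([0.2, -0.1, 0.5])           # e.g. geometric features of a candidate region
scores = rbf_forward(x, centers, widths, weights, bias)
print("predicted class:", int(np.argmax(scores)), "scores:", scores.round(3))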
5914 Modelling Conditional Volatility of Saving Rate by a Time-Varying Parameter Model
Authors: Katleho D. Makatjane, Kalebe M. Kalebe
Abstract:
The present paper uses time-varying parameters, which are based on the score function of a probability density at time t, to model the volatility of the saving rate. We use a scaled likelihood function to update the parameters of the model over time. Our results reveal a high degree of time variation, since the location parameter is greater than zero. Furthermore, we find a leptokurtic condition in the saving rate's distribution. The Kapetanios-Shin-Snell Nonlinear Augmented Dickey-Fuller (KSS-NADF) test shows that the saving rate has a nonlinear unit root; therefore, it can be modeled by a generalised autoregressive score (GAS) model. Additionally, value at risk (VaR) and conditional tail expectation (CTE) indicate that 99% of the time people in Lesotho are saving more than spending. This puts the economy at high risk of not expanding. Therefore, the monetary policy committee (MPC) of Lesotho should revise its monetary policies in view of this high saving-rate risk.
Keywords: Generalized autoregressive score, time-varying, saving rate, Lesotho.
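A minimal sketch of empirical VaR and CTE at the 99% level for a generic loss sample; the GAS volatility model is not reproduced and the heavy-tailed data below are synthetic.

import numpy as np

def var_cte(losses, level=0.99):
    """Empirical Value at Risk (quantile of losses) and Conditional Tail Expectation
    (mean loss beyond that quantile) at the given confidence level."""
    var = np.quantile(losses, level)
    return var, losses[losses >= var].mean()

rng = np.random.default_rng(42)
losses = rng.standard_t(df=5, size=10_000)   # heavy-tailed (leptokurtic) synthetic series
var99, cte99 = var_cte(losses, 0.99)
print(f"VaR(99%) = {var99:.3f}, CTE(99%) = {cte99:.3f}")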
5913 A Model-Driven Approach of User Interface for MVP Rich Internet Application
Authors: Sarra Roubi, Mohammed Erramdani, Samir Mbarki
Abstract:
This paper presents an approach for the model-driven generation of Rich Internet Applications (RIAs) focusing on the graphical aspect. We used well-known Model-Driven Engineering (MDE) frameworks and technologies, such as the Eclipse Modeling Framework (EMF), the Graphical Modeling Framework (GMF), Query View Transformation (QVTo) and Acceleo, to enable the design and automatic code generation of the RIA. During the development of the approach, we focused on the graphical aspect of the application in terms of interfaces, while opting for the Model View Presenter pattern, which is designed for graphical interfaces. The paper describes the process followed to define the approach and the supporting tool, and presents the results from a case study.
Keywords: Code generation, design pattern, metamodel, Model-Driven Engineering, MVP, Rich Internet Application, transformation, user interface.
5912 About the Instability Modes of Current Sheet in Wide Range of Frequencies
Authors: V. V. Lyahov, V. M. Neshchadim
Abstract:
We offer a new technique for studying the stability of current sheaths in space plasma, taking into account the effect of polarization. First, the obtained perturbation of the distribution function is used to calculate the dielectric permittivity tensor, which models the inhomogeneous medium of a current sheath. We then solve, in the usual manner, the system of Maxwell's equations closed with the material equation. The amplitudes of the Fourier perturbations are considered to decay exponentially through the current sheath thickness. The dispersion equation follows from the requirement of a nontrivial solution for the perturbations of the electromagnetic field. The resulting dispersion equation allows one to study the temporal and spatial characteristics of the instability modes of the current sheath (within the limits of the proposed model) over a wide frequency range, including low frequencies.
Keywords: Current sheath, distribution function, effect of polarization, instability modes, low frequencies, perturbation of electromagnetic field, dispersion equation, space plasma, tensor of dielectric permittivity.
5911 Fragility Analysis of Weir Structure Subjected to Flooding Water Damage
Authors: Oh Hyeon Jeon, WooYoung Jung
Abstract:
In this study, seepage analysis was performed for the water level difference between the upstream and downstream of a weir structure, for safety evaluation of the weir against flooding. The Monte Carlo simulation method was employed, considering the probability distribution of the adjacent ground parameter, i.e., the permeability coefficient of the weir structure. Moreover, modeling of the weir structure was carried out using a commercially available finite element program (ABAQUS). Based on this model, the characteristics of water seepage during flooding were determined at each water level, taking the uncertainty of the corresponding permeability coefficient into consideration. Subsequently, a fragility function could be constructed based on the responses from the numerical analysis; these fragility results can be used to determine the weakness of a weir structure subjected to a flooding disaster. They can also be used as reference data to comprehensively predict the probability of failure and the degree of damage of a weir structure.
Keywords: Weir structure, seepage, flood disaster fragility, probabilistic risk assessment, Monte-Carlo Simulation, permeability coefficient.
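A schematic Monte Carlo fragility loop of the kind described: the permeability coefficient is sampled from a lognormal distribution and a failure criterion is checked at each water level difference. The criterion and all parameter values below are purely illustrative placeholders for the ABAQUS seepage response used in the paper.

import numpy as np

rng = np.random.default_rng(0)

def failure_probability(head_diff, n_samples=20_000,
                        k_median=1e-5, k_log_std=0.5,
                        critical_gradient=0.8, path_length=10.0):
    """Sample the permeability coefficient from a lognormal distribution and count how
    often a (placeholder) seepage criterion is exceeded for a given head difference."""
    k = rng.lognormal(mean=np.log(k_median), sigma=k_log_std, size=n_samples)
    hydraulic_gradient = head_diff / path_length
    # placeholder: failure more likely for high gradients and more permeable ground
    demand = hydraulic_gradient * (k / k_median) ** 0.5
    return np.mean(demand > critical_gradient)

for h in [2.0, 4.0, 6.0, 8.0, 10.0]:          # water level differences (m)
    print(f"head diff {h:4.1f} m -> P(failure) = {failure_probability(h):.3f}")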
5910 Multimodal Biometric Authentication Using Choquet Integral and Genetic Algorithm
Authors: Anouar Ben Khalifa, Sami Gazzah, Najoua Essoukri BenAmara
Abstract:
The Choquet integral is a tool for information fusion that is very effective when the fuzzy measures associated with it are well chosen. In this paper, we propose a new approach for calculating the fuzzy measures associated with the Choquet integral in the context of data fusion in multimodal biometrics. The proposed approach is based on genetic algorithms. It has been validated on two databases: the first contains synthetic scores, and the second contains biometric data relating to the face, fingerprint, and palmprint. The results achieved attest to the robustness of the proposed approach.
Keywords: Multimodal biometrics, data fusion, Choquet integral, fuzzy measures, genetic algorithm.
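A minimal sketch of the discrete Choquet integral for three matcher scores with respect to a given fuzzy measure; the measure values below are illustrative, whereas the paper learns them with a genetic algorithm.

def choquet_integral(scores, mu):
    """Discrete Choquet integral: sort criteria by ascending score and weight each
    increment by the fuzzy measure of the set of criteria with at least that score."""
    order = sorted(scores, key=scores.get)              # criteria from lowest to highest score
    total, prev = 0.0, 0.0
    for i, c in enumerate(order):
        coalition = frozenset(order[i:])                # criteria whose score >= scores[c]
        total += (scores[c] - prev) * mu[coalition]
        prev = scores[c]
    return total

scores = {"face": 0.70, "finger": 0.90, "palm": 0.60}   # normalized matcher scores
mu = {                                                   # illustrative monotone fuzzy measure, mu(full set) = 1
    frozenset(["face", "finger", "palm"]): 1.00,
    frozenset(["face", "finger"]): 0.80, frozenset(["face", "palm"]): 0.60,
    frozenset(["finger", "palm"]): 0.70,
    frozenset(["face"]): 0.45, frozenset(["finger"]): 0.50, frozenset(["palm"]): 0.30,
}
print(round(choquet_integral(scores, mu), 3))            # fused score, here 0.78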
5909 Enhancement of Higher Order Thinking Skills among Teacher Trainers by Fun Game Learning Approach
Authors: Malathi Balakrishnan, Gananathan M. Nadarajah, Saraswathy Vellasamy, Evelyn Gnanam William George
Abstract:
The purpose of the study is to explore how the fun game-learning approach enhances teacher trainers' higher order thinking skills. A two-day, fun-filled game-learning approach was introduced to teacher trainers as a Continuous Professional Development (CPD) program. 26 teacher trainers participated in this Transformation of Teaching and Learning Fun Way Program, organized by the Institute of Teacher Education Malaysia. A qualitative research technique was adopted, as the researchers observed the participants' higher order thinking skills developing during the program. Data were collected from an observational checklist, interview transcriptions of four participants, and participants' reflection notes. All the data were later analyzed with the NVivo data analysis process. The findings of this study present five main themes: critical thinking, hands-on activities, creating, application, and use of technology. The study showed that the teacher trainers' higher order thinking skills were enhanced after the two-day CPD program. Therefore, the Institute of Teacher Education will have more success using the fun game-learning approach to develop higher order thinking skills among its teacher trainers, who can pass these skills on to their trainee teachers in future. This study also adds knowledge to constructivist learning theory, further highlighting the prominence of the fun-way learning approach in enhancing higher order thinking skills.
Keywords: Constructivism, game-learning approach, higher order thinking skill, teacher trainer.