Search results for: Trend based segmentation method
15621 Face Authentication for Access Control based on SVM using Class Characteristics
Authors: SeHun Lim, Sanghoon Kim, Sun-Tae Chung, Seongwon Cho
Abstract:
Face authentication for access control is a face membership authentication that passes an incoming face if the person turns out to be one of the enrolled persons, based on face recognition, and rejects it otherwise. Face membership authentication belongs to the two-class classification problem, where the SVM (Support Vector Machine) has been successfully applied and shows better performance than conventional threshold-based classification. However, most previous SVMs have been trained using image feature vectors extracted from face images of each class member (enrolled class/unenrolled class), so they are not robust to variations in illumination, pose, and facial expression and are strongly affected by changes in the member configuration of the enrolled class. In this paper, we propose an effective face membership authentication method based on SVM using class discriminating features which represent an incoming face image's associability with each class distinctively. These class discriminating features are only weakly related to image features, so they are less affected by variations in illumination, pose, and facial expression. Through experiments, it is shown that the proposed face membership authentication method performs better than the threshold rule-based or the conventional SVM-based authentication methods and is relatively less affected by changes in member size and membership.
Keywords: Face authentication, access control, membership authentication, SVM.
15620 A Noble Flow Rate Control based on Leaky Bucket Method for Multi-Media OBS Networks
Authors: Kentaro Miyoko, Yoshihiko Mori, Yugo Ikeda, Yoshihiro Nishino, Yong-Bok Choi, Hiromi Okada
Abstract:
Optical burst switching (OBS) has been proposed to realize the next-generation Internet based on wavelength division multiplexing (WDM) network technologies. In OBS, burst contention is one of the major problems. Deflection routing has been designed to resolve this problem; however, it becomes difficult for deflection routing to prevent burst contentions as the network load becomes high. In this paper, we introduce flow rate control methods to reduce burst contentions. We propose new flow rate control methods based on the leaky bucket algorithm and deflection routing, i.e., the separate leaky bucket deflection method and the dynamic leaky bucket deflection method. In the proposed methods, the edge nodes that generate data bursts carry out the flow rate control protocols. In order to verify the effectiveness of flow rate control in OBS networks, we show through computer simulations that the proposed methods improve the network utilization and reduce the burst loss probability.
Keywords: Optical burst switching, OBS, flow rate control.
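A minimal sketch of a leaky-bucket rate limiter of the kind these flow rate control methods build on (illustrative only; the class and parameter names are assumptions, not taken from the paper, and all OBS-specific signaling is omitted):

```python
import time

class LeakyBucket:
    """Simple leaky-bucket rate limiter: queued traffic drains at a fixed rate."""
    def __init__(self, capacity: float, leak_rate: float):
        self.capacity = capacity      # maximum volume the bucket can hold
        self.leak_rate = leak_rate    # volume drained per second
        self.level = 0.0
        self.last = time.monotonic()

    def allow(self, volume: float = 1.0) -> bool:
        now = time.monotonic()
        # Drain the bucket according to the elapsed time, then try to admit the burst.
        self.level = max(0.0, self.level - (now - self.last) * self.leak_rate)
        self.last = now
        if self.level + volume <= self.capacity:
            self.level += volume
            return True               # burst admitted at the edge node
        return False                  # bucket full: hold back or deflect the burst
```

In the paper's setting, the edge nodes that generate data bursts would apply a control of this kind before injecting bursts, in combination with deflection routing.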
15619 Effect Comparison of Speckle Noise Reduction Filters on 2D-Echocardigraphic Images
Authors: Faten A. Dawood, Rahmita W. Rahmat, Suhaini B. Kadiman, Lili N. Abdullah, Mohd D. Zamrin
Abstract:
Echocardiography imaging is one of the most common diagnostic tests used for assessing abnormalities of regional heart ventricle function. The main goal of the image enhancement task in 2D-echocardiography (2DE) is to address two major problems: speckle noise and low image quality. Speckle noise reduction is therefore an important pre-processing step used to reduce distortion effects in 2DE image segmentation. In this paper, we present the common filters based on some form of low-pass spatial smoothing, such as the mean, Gaussian, and median filters. The Laplacian filter was used as a high-pass sharpening filter. A comparative analysis was presented to test the effectiveness of these filters after being applied to original 2DE images of 4-chamber and 2-chamber views. Three statistical quantity measures, root mean square error (RMSE), peak signal-to-noise ratio (PSNR), and signal-to-noise ratio (SNR), are used to evaluate the filter performance quantitatively on the output enhanced image.
Keywords: Gaussian operator, median filter, speckle texture, peak signal-to-noise ratio.
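A minimal sketch of applying the smoothing filters named above and scoring the results with RMSE and PSNR (illustrative only; the 8-bit dynamic range, filter sizes, and function names are assumptions, not the study's exact settings):

```python
import numpy as np
from scipy import ndimage

def denoise_and_score(noisy: np.ndarray, reference: np.ndarray, size: int = 3):
    """Apply mean, Gaussian, and median filters, then report RMSE and PSNR against a reference."""
    filtered = {
        "mean":     ndimage.uniform_filter(noisy.astype(float), size=size),
        "gaussian": ndimage.gaussian_filter(noisy.astype(float), sigma=1.0),
        "median":   ndimage.median_filter(noisy.astype(float), size=size),
    }
    scores = {}
    for name, img in filtered.items():
        rmse = np.sqrt(np.mean((img - reference) ** 2))
        psnr = 20.0 * np.log10(255.0 / rmse)   # assumes 8-bit image intensities
        scores[name] = {"RMSE": rmse, "PSNR": psnr}
    return filtered, scores
```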
15618 A Nonlinear Parabolic Partial Differential Equation Model for Image Enhancement
Authors: Tudor Barbu
Abstract:
We present a robust nonlinear parabolic partial differential equation (PDE)-based denoising scheme in this article. Our approach is based on a second-order anisotropic diffusion model, which is described first. Then, a consistent and explicit numerical approximation algorithm is constructed for this continuous model using the finite-difference method. Finally, our restoration experiments and the method comparison, which prove the effectiveness of the proposed technique, are discussed.
Keywords: Image denoising and restoration, nonlinear PDE model, anisotropic diffusion, numerical approximation scheme, finite differences.
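For orientation, a minimal explicit finite-difference sketch of the classical Perona-Malik anisotropic diffusion is given below; it is shown only as a reference point and is not the specific second-order model proposed in this paper.

```python
import numpy as np

def perona_malik(img: np.ndarray, n_iter: int = 20, kappa: float = 30.0, dt: float = 0.15) -> np.ndarray:
    """Classical Perona-Malik diffusion with an explicit 4-neighbour finite-difference scheme."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)      # edge-stopping conductance
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u          # differences towards the four neighbours
        ds = np.roll(u,  1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u,  1, axis=1) - u
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```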
15617 Application of Mapping and Superimposing Rule for Solution of Parabolic PDE in Porous Medium under Cyclic Loading
Authors: Mohammad M. Toufigh, Ahad Ouria
Abstract:
This paper presents an analytical method to solve the governing consolidation parabolic partial differential equation (PDE) for an inelastic porous medium (soil), accounting for the variation of the equation coefficient under cyclic loading. Under cyclic loads the soil skeleton parameters change, which introduces a variable coefficient into the parabolic PDE, and classical theory cannot describe the consolidation phenomenon under such conditions. In this research, a method based on mapping the time space to a virtual time space, combined with the superimposing rule, is employed to solve the consolidation of inelastic soils under cyclic conditions. Changes in the consolidation coefficient are incorporated into the solution by modifying the loading and unloading durations through the virtual time. The mapping function is calculated based on the results of the consolidation partial differential equation. Based on the superimposing rule, a set of continuous static loads applied at specified times is used instead of the cyclic load. A set of laboratory consolidation tests under cyclic load, along with numerical calculations, was performed in order to verify the presented method. The numerical solution and laboratory test results showed the accuracy of the presented method.
Keywords: Mapping, consolidation, inelastic porous medium, cyclic loading, superimposing rule.
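For context, the classical constant-coefficient one-dimensional consolidation equation that such analyses start from is the standard Terzaghi form below; the contribution of the paper lies in handling a coefficient that varies under cyclic loading, which this form does not capture.

```latex
\frac{\partial u}{\partial t} = c_v \,\frac{\partial^2 u}{\partial z^2},
\qquad u = u(z,t) \text{ (excess pore pressure)}, \quad c_v \text{ (coefficient of consolidation)}
```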
15616 Experimental Analysis and Optimization of Process Parameters in Plasma Arc Cutting Machine of EN-45A Material Using Taguchi and ANOVA Method
Authors: Sahil Sharma, Mukesh Gupta, Raj Kumar, N. S Bindra
Abstract:
This paper presents an experimental investigation on the optimization and the effect of the cutting parameters on material removal rate (MRR) in plasma arc cutting (PAC) of EN-45A material using the Taguchi L16 orthogonal array method. Four process variables, viz. cutting speed, current, stand-off distance, and plasma gas pressure, have been considered for this experimental work. Analysis of variance (ANOVA) has been performed to obtain the percentage contribution of each process parameter to the response variable, i.e., MRR. Based on ANOVA, it has been observed that the cutting speed, current, and plasma gas pressure are the major influencing factors that affect the response variable. A confirmation test based on the optimal setting shows good agreement with the predicted values.
Keywords: Analysis of variance, material removal rate, plasma arc cutting, Taguchi method.
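A minimal sketch of the larger-the-better signal-to-noise ratio used in a Taguchi analysis of a response such as MRR (illustrative only; the replicated values below are hypothetical, not the paper's data):

```python
import numpy as np

def sn_larger_the_better(responses):
    """Taguchi S/N ratio for a response to be maximised (e.g. material removal rate)."""
    y = np.asarray(responses, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical replicated MRR measurements (g/min) for one row of an L16 array.
print(sn_larger_the_better([1.8, 2.1, 1.9]))
```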
15615 Design of PID Controller for Higher Order Continuous Systems using MPSO based Model Formulation Technique
Authors: S. N. Deepa, G. Sugumaran
Abstract:
This paper proposes a new algebraic scheme to design a PID controller for higher-order linear time-invariant continuous systems. Modified PSO (MPSO)-based model order formulation techniques have been applied to obtain an effective formulated second-order system. A controller is tuned to meet the desired performance specification by using the pole-zero cancellation method. The proposed PID controller is attached to both the higher-order system and the formulated second-order system. The closed-loop response is observed for the stabilization process and compared with the general PSO-based formulated second-order system. The proposed method is illustrated through a numerical example from the literature.
Keywords: Higher order systems, model order formulation, modified particle swarm optimization, PID controller, pole-zero cancellation.
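A minimal sketch of tuning a PID by pole-zero cancellation against a reduced second-order model (illustrative only; the model parameters, the desired closed-loop bandwidth, and the function name are assumptions, and the MPSO model-reduction step itself is not shown):

```python
def pid_by_pole_zero_cancellation(K: float, zeta: float, wn: float, wc: float):
    """
    Reduced model:  G(s) = K*wn^2 / (s^2 + 2*zeta*wn*s + wn^2)
    PID controller: C(s) = (Kd*s^2 + Kp*s + Ki) / s
    Choosing the PID numerator proportional to the model denominator cancels the
    model poles, leaving L(s) = wc/s, so wc becomes the closed-loop bandwidth.
    """
    Kd = wc / (K * wn ** 2)
    Kp = 2.0 * zeta * wn * Kd
    Ki = wn ** 2 * Kd
    return Kp, Ki, Kd

print(pid_by_pole_zero_cancellation(K=1.0, zeta=0.7, wn=5.0, wc=10.0))
```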
15614 Calculation of Heating Load for an Apartment Complex with Unit Building Method
Authors: Ju-Seok Kim, Sun-Ae Moon, Tae-Gu Lee, Seung-Jae Moon, Jae-Heon Lee
Abstract:
As a simple method to estimate the plant heating energy capacity of an apartment complex, a new load calculation method has been proposed. The method, which can be called the unit building method, predicts the heating load of the entire complex instead of summing up that of each apartment belonging to the complex. Comparison of the unit heating load for various floor sizes between the present method and the conventional approach shows close agreement with a dynamic load calculation code. Some additional calculations are performed to demonstrate its application with examples.
Keywords: Unit Building Method, Unit Heating Load, TFM Load.
15613 A Relationship Extraction Method from Literary Fiction Considering Korean Linguistic Features
Authors: Hee-Jeong Ahn, Kee-Won Kim, Seung-Hoon Kim
Abstract:
The knowledge of the relationships between characters can help readers to understand the overall story or plot of a literary fiction. In this paper, we present a method for extracting the specific relationships between characters from Korean literary fiction. Generally, methods for extracting relationships between characters in text are statistical or computational methods based on the sentence distance between characters, without considering Korean linguistic features. Furthermore, it is difficult to extract directed relationships, such as one-sided love, from text because these methods consider only the weight of a relationship, not its direction. Therefore, in order to identify specific relationships between characters, we propose a statistical method that considers linguistic features such as syntactic patterns and speech verbs in Korean. The result of our method is represented as a weighted directed graph of the relationships between the characters. Furthermore, we expect that the proposed method could be applied to relationship analysis between characters in other content, such as movies or TV dramas.
Keywords: Data mining, Korean linguistic feature, literary fiction, relationship extraction.
15612 A Diagnostic Fuzzy Rule-Based System for Congenital Heart Disease
Authors: Ersin Kaya, Bulent Oran, Ahmet Arslan
Abstract:
In this study, a fuzzy rule-based classifier is used for the diagnosis of congenital heart disease. Congenital heart diseases are defined as structural or functional heart diseases. Medical data sets were obtained from the Pediatric Cardiology Department at Selcuk University, covering the years 2000 to 2003. First, fuzzy rules were generated using the medical data. Then the weights of the fuzzy rules were calculated. Two different reasoning methods, the "weighted vote method" and the "single winner method", were used in this study. The results of the fuzzy classifiers were compared.
Keywords: Congenital heart disease, fuzzy rule-based classifiers, classification.
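A minimal sketch contrasting the two reasoning schemes compared above, single-winner versus weighted-vote fuzzy rule inference (illustrative only; the rule representation, weights, and labels are hypothetical, not taken from the study):

```python
from collections import defaultdict

# Each rule: (compatibility with the input pattern in [0, 1], rule weight, predicted class).
rules = [(0.8, 0.9, "CHD"), (0.6, 0.7, "normal"), (0.4, 0.95, "CHD")]

def single_winner(rules):
    """The rule with the largest compatibility * weight decides the class."""
    return max(rules, key=lambda r: r[0] * r[1])[2]

def weighted_vote(rules):
    """Every rule votes for its class with strength compatibility * weight."""
    votes = defaultdict(float)
    for comp, weight, label in rules:
        votes[label] += comp * weight
    return max(votes, key=votes.get)

print(single_winner(rules), weighted_vote(rules))
```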
15611 An Optimization of the New Die Design of Sheet Hydroforming by Taguchi Method
Authors: M. Hosseinzadeh, S. A. Zamani, A. Taheri
Abstract:
During the last few years, several sheet hydroforming processes have been introduced. Despite the advantages of these methods, they have some limitations. The two main processes are standard hydroforming and hydromechanical deep drawing. A new sheet hydroforming die set was proposed that has the advantages of both processes and eliminates their limitations. In this method, a polyurethane plate was used as a part of the die set to control the blank holder force. This paper outlines the Taguchi optimization methodology, which is applied to optimize the effective parameters in forming cylindrical cups with the new die set of the sheet hydroforming process. The process parameters evaluated in this research are polyurethane hardness, polyurethane thickness, forming pressure path, and polyurethane hole diameter. A design of experiments based upon the Taguchi L9 orthogonal array was used, and analysis of variance (ANOVA) was employed to analyze the effect of these parameters on the forming pressure. The analysis of the results showed that the optimal combination for low forming pressure is a harder polyurethane, a larger polyurethane hole diameter, and a thinner polyurethane plate. Finally, a confirmation test was carried out based on the optimal combination of parameters, and it was shown that the Taguchi method is suitable for examining the optimization process.
Keywords: Sheet hydroforming, optimization, Taguchi method.
15610 Trajectory Tracking of a Redundant Hybrid Manipulator Using a Switching Control Method
Authors: Atilla Bayram
Abstract:
This paper presents the trajectory tracking control of a spatial redundant hybrid manipulator. This manipulator consists of two parallel manipulators, each of which is a variable geometry truss (VGT) module. Each VGT module, with 3 degrees of freedom (DOF), is a planar parallel manipulator, and the operational planes of the two VGT modules are arranged to be orthogonal to each other. The manipulator also contains a twist motion part attached to the top of the second VGT module to supply the missing orientation of the end-effector. These three modules constitute a 7-DOF hybrid (parallel-parallel) redundant spatial manipulator. The forward kinematics equations of this manipulator are obtained; then, based on these equations, the inverse kinematics is solved by an optimization with joint limit avoidance. The dynamic equations are formed using the virtual work method. In order to test the performance of the redundant manipulator and the controllers presented, two different desired trajectories are followed using the computed force control method and a switching control method. The switching control method combines the computed force control method with a genetic algorithm; in it, the genetic algorithm is used only for fine tuning in the compensation of the trajectory tracking errors.
Keywords: Computed force control method, genetic algorithm, hybrid manipulator, inverse kinematics of redundant manipulators, variable geometry truss.
15609 Design and Analysis of Gauge R&R Studies: Making Decisions Based on ANOVA Method
Authors: Afrooz Moatari Kazerouni
Abstract:
In a competitive production environment, critical decisions are based on data obtained by random sampling of product units. The efficiency of these decisions depends on data quality and reliability. This leads to the necessity of a reliable measurement system; the process of assessing such a system and analysing its errors is known as Measurement System Analysis (MSA). The aim of this research is to determine the necessity of, and support for, further development in analysing measurement systems, particularly through Gauge Repeatability and Reproducibility (GR&R) studies, to improve physical measurements. Although repeatability and reproducibility gauge studies are now well established in manufacturing industries, they are not applied as widely as other measurement system analysis methods. To introduce this method and provide feedback for improving measurement systems, this survey focuses on the ANOVA method as the most widespread way of calculating repeatability and reproducibility (R&R).
Keywords: Analysis of Variance (ANOVA), Measurement System Analysis (MSA), part-operator interaction effect, repeatability and reproducibility.
15608 A Study on the Developing Method of the BIM (Building Information Modeling) Software Based On Cloud Computing Environment
Authors: Byung-Kon Kim
Abstract:
As Architecture, Engineering and Construction (AEC) industry projects have grown larger and more complex, the utilization of BIM for 3D design and simulation has increased significantly. Typical applications of BIM, such as clash detection and alternative measures based on 3-dimensional planning, are therefore being expanded to process management, cost and quantity management, structural analysis, regulation checking, and various other domains of virtual design and construction. Presently, commercial BIM software operates in a single-user environment, so the initial cost is high and the investment is frequently wasted. Cloud computing, a next-generation internet technology, enables simple internet devices (such as PCs, tablets, and smartphones) to use the services and resources of BIM software. In this paper, we suggest a development method for BIM software based on a cloud computing environment in order to expand the utilization of BIM and reduce the cost of BIM software. First, for benchmarking, we surveyed successful cases of BIM and cloud computing. Then we analyzed the needs and opportunities for BIM and cloud computing in the AEC industry. Finally, we suggested the main functions of BIM software based on a cloud computing environment and developed a simple prototype of cloud computing BIM software for basic BIM model viewing.
Keywords: Construction IT, BIM (Building Information Modeling), Cloud Computing, BIM Service Based Cloud Computing, Viewer Based BIM Server, 3D Design.
15607 DIFFER: A Propositionalization approach for Learning from Structured Data
Authors: Thashmee Karunaratne, Henrik Böstrom
Abstract:
Logic-based methods for learning from structured data are limited with respect to handling large search spaces, preventing large-sized substructures from being considered by the resulting classifiers. A novel approach to learning from structured data is introduced that employs a structure transformation method, called finger printing, for addressing these limitations. The method, which generates features corresponding to arbitrarily complex substructures, is implemented in a system called DIFFER. The method is demonstrated to perform comparably to an existing state-of-the-art method on some benchmark data sets without requiring restrictions on the search space. Furthermore, learning from the union of the features generated by finger printing and the previous method outperforms learning from each individual set of features on all benchmark data sets, demonstrating the benefit of developing complementary, rather than competing, methods for structure classification.
Keywords: Machine learning, structure classification, propositionalization.
15606 New Laguerre's Type Method for Solving of a Polynomial Equations Systems
Authors: Oleksandr Poliakov, Yevgen Pashkov, Marina Kolesova, Olena Chepenyuk, Mykhaylo Kalinin, Vadym Kramar
Abstract:
In this paper we present a substantiation of a new Laguerre-type iterative method for solving nonlinear systems of polynomial equations with real coefficients. The problems of its implementation, including those relating to the structural choice of initial approximations, are considered. Test examples demonstrate the effectiveness of the method in solving many practical problems.
Keywords: Iterative method, Laguerre's method, Newton's method, polynomial equation, system of equations.
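For reference, a minimal sketch of the classical Laguerre iteration for a single polynomial root (shown only as the starting point that Laguerre-type methods generalize; it is not the authors' extension to systems of equations):

```python
import numpy as np

def laguerre_root(coeffs, x0, tol=1e-12, max_iter=100):
    """Classical Laguerre iteration for one root of a polynomial (coeffs: highest degree first)."""
    p, n = np.poly1d(coeffs), len(coeffs) - 1
    dp, d2p = p.deriv(), p.deriv(2)
    x = complex(x0)
    for _ in range(max_iter):
        px = p(x)
        if abs(px) < tol:
            break
        G = dp(x) / px
        H = G * G - d2p(x) / px
        root = np.sqrt((n - 1) * (n * H - G * G))
        # Pick the denominator with the larger magnitude for numerical stability.
        denom = G + root if abs(G + root) > abs(G - root) else G - root
        x = x - n / denom
    return x

print(laguerre_root([1, 0, -2, -5], 1.0))   # a root of x^3 - 2x - 5
```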
15605 Wasting Human and Computer Resources
Authors: Mária Csernoch, Piroska Biró
Abstract:
The legends about "user-friendly" and "easy-to-use" birotical tools (computer-related office tools) have been spreading and misleading end-users. This has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement: (1) The lack of a definition of correctly edited, formatted documents. Consequently, end-users do not know whether their methods and results are correct or not. They are not aware of their ignorance, which does not allow them to realize their lack of knowledge. (2) The end-users' problem solving methods. We have found that in non-traditional programming environments end-users apply, almost exclusively, surface approach metacognitive methods to carry out their computer-related activities, which have proved less effective than deep approach methods. Based on these findings we have developed deep approach methods which are based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical documents, text-based documents. We have provided the definition of correctly edited text and, based on this definition, adapted the debugging method known from programming. According to the method, before real text editing takes place, a thorough debugging of already existing texts and a categorization of errors are carried out. With this method, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface approach methods. The advantages of the method are that real text handling requires much less human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and that data retrieval is much more effective than from error-prone documents.
Keywords: Deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources.
15604 Problem Based Learning in B. P. Koirala Institute of Health Sciences
Authors: Gurung S., Yadav B. N., Budhathoki SS.
Abstract:
Problem-based learning is one of the most highly acclaimed learning methods in medical education since its first introduction at McMaster University in Canada in the 1960s. It has now been adopted as a teaching-learning method in many medical colleges of Nepal. B. P. Koirala Institute of Health Sciences (BPKIHS), a health sciences deemed university, is the second institute in Nepal to establish a problem-based learning academic program and a need-based teaching approach, thereby minimizing teaching through lectures since its inception. During the first two years of the MBBS course, the curriculum is divided into various organ systems, each incorporating a problem-based learning exercise of one week's duration.
Keywords: PBL, medical education.
15603 Attention-Based Spatio-Temporal Approach for Fire and Smoke Detection
Authors: A. Mirrashid, M. Khoshbin, A. Atghaei, H. Shahbazi
Abstract:
In various industries, smoke and fire are two of the most important threats in the workplace. One of the common methods for detecting smoke and fire is the use of infrared thermal and smoke sensors, which cannot be used in outdoor applications. Therefore, the use of vision-based methods seems necessary. The problem of smoke and fire detection is spatiotemporal and requires spatiotemporal solutions. This paper presents a method that uses spatial features along with temporal features to detect smoke and fire in the scene. It consists of three main parts; the task of each part is to reduce the error of the previous part so that the final model has robust performance. The method also uses transformer modules to increase the accuracy of the model. The results show the proper performance of the proposed approach in solving the problem of smoke and fire detection, and it can be used to increase workplace safety.
Keywords: Attention, fire detection, smoke detection, spatiotemporal.
15602 Influence of Heterogeneous Traffic on the Roadside Fine (PM2.5 and PM1) and Coarse (PM10) Particulate Matter Concentrations in Chennai City, India
Authors: Srimuruganandam. B, S.M. Shiva Nagendra
Abstract:
In this paper, the influence of heterogeneous traffic on the temporal variation of ambient PM10, PM2.5, and PM1 concentrations at a busy arterial route (Sardar Patel Road) in Chennai city has been analyzed. The hourly PM concentrations, traffic counts, and average vehicle speeds were monitored at the study site for one week (19th-25th January 2009). Results indicated that the coarse (PM10) and fine (PM2.5 and PM1) PM concentrations at SP Road follow a similar trend during peak and non-peak hours, irrespective of the day. The PM concentrations showed two daily peaks corresponding to the morning (8 to 10 am) and evening (7 to 9 pm) peak-hour traffic flows. The PM10 concentration is dominated by fine particles (53% of PM2.5 and 45% of PM1). The high PM2.5/PM10 ratio indicates that the majority of PM10 particles originate from re-suspension of road dust. The analysis of traffic flow at the study site showed that 2W, 3W, and 4W vehicles have a diurnal trend similar to that of the PM concentrations. This confirms that 2W, 3W, and 4W vehicles are the main emission sources contributing to the ambient PM concentrations at SP Road. Speed measurements at SP Road showed that the average speeds of 2W, 3W, 4W, LCV, and HCV are 38, 40, 38, 40, and 38 km/hr on weekdays and 43, 41, 42, 40, and 41 km/hr on weekends, respectively.
Keywords: Particulate matter, heterogeneous traffic, fine particles, coarse particles, vehicle speed, weekend and weekday.
15601 3D Objects Indexing with a Direct and Analytical Method for Calculating the Spherical Harmonics Coefficients
Authors: S. Hellam, Y. Oulahrir, F. El Mounchid, A. Sadiq, S. Mbarki
Abstract:
In this paper, we propose a new method for three-dimensional object indexing based on the D.A.M.C-S.H.C descriptor (Direct and Analytical Method for Calculating the Spherical Harmonics Coefficients). To this end, we propose a direct calculation of the spherical harmonics coefficients with perfect precision. The aims of the method are to minimize the processing time on the 3D object database and the time needed to search for objects similar to a query object. First, we define the new descriptor using a new division of the 3D object within a sphere. Then we define a new distance, which is tested and whose efficiency is demonstrated in the search for similar objects in a database containing objects of widely varying and large sizes.
Keywords: 3D object indexing, 3D shape descriptor, spherical harmonic, 3D object similarity.
15600 A Rule-based Approach for Anomaly Detection in Subscriber Usage Pattern
Authors: Rupesh K. Gopal, Saroj K. Meher
Abstract:
In this report we present a rule-based approach to detect anomalous telephone calls. The method described here uses subscriber usage CDR (call detail record) data sampled over two observation periods: a study period and a test period. The study period contains call records of customers' non-anomalous behaviour. Customers are first grouped according to their similar usage behaviour (e.g., average number of local calls per week). For the customers in each group, we develop a probabilistic model to describe their usage. Next, we use maximum likelihood estimation (MLE) to estimate the parameters of the calling behaviour. Then we determine thresholds by calculating the acceptable change within a group. MLE is used on the data in the test period to estimate the parameters of the calling behaviour, and these parameters are compared against the thresholds. Any deviation beyond a threshold is used to raise an alarm. This method has the advantage of identifying local anomalies, as compared to techniques which identify global anomalies. The method was tested on 90 days of study data and 10 days of test data from telecom customers. For medium to large deviations in the data in the test window, the method is able to identify 90% of anomalous usage with a false alarm rate of less than 1%.
Keywords: Subscription fraud, fraud detection, anomaly detection, maximum likelihood estimation, rule-based systems.
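A minimal sketch of the kind of per-group MLE-and-threshold check outlined above, using a Poisson model of weekly call counts (illustrative only; the distributional choice, the k-sigma threshold rule, and the data are assumptions, not the paper's exact model):

```python
import numpy as np

def fit_group(study_counts):
    """MLE of a Poisson rate for a customer group's weekly call counts in the study period."""
    lam = float(np.mean(study_counts))           # Poisson MLE is the sample mean
    sigma = float(np.std(study_counts, ddof=1))  # spread of behaviour within the group
    return lam, sigma

def is_anomalous(test_counts, lam, sigma, k=3.0):
    """Flag the test period if its estimated rate deviates more than k*sigma from the group rate."""
    return abs(float(np.mean(test_counts)) - lam) > k * sigma

lam, sigma = fit_group([12, 15, 11, 14, 13, 12, 16, 14])  # hypothetical study-period weeks
print(is_anomalous([41, 38], lam, sigma))                 # hypothetical test-period weeks
```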
15599 Novel Approach to Design of a Class-EJ Power Amplifier Using High Power Technology
Authors: F. Rahmani, F. Razaghian, A. R. Kashaninia
Abstract:
This article proposes a new method for application in communication circuit systems that increases efficiency, power-added efficiency (PAE), output power, and gain in the circuit. The proposed method is based on a combination of switching class-E and class-J operation and has been termed class-EJ. This method was investigated using both theory and simulation to confirm ∼72% PAE and an output power of >39 dBm. The proposed power amplifier design achieves a gain of over 15 dB in the 2.9 to 3.5 GHz frequency band. The circuit was designed using MOSFET and high-power transistors. The load- and source-pull method was used to obtain the best input and output networks using lumped elements. The proposed technique was investigated for the fundamental and second harmonics, which have desirable amplitudes for the output signal.
Keywords: Power amplifier (PA), GaN HEMT, class-J and class-E, high efficiency.
15598 A Minimum Spanning Tree-Based Method for Initializing the K-Means Clustering Algorithm
Authors: J. Yang, Y. Ma, X. Zhang, S. Li, Y. Zhang
Abstract:
The traditional k-means algorithm has been widely used as a simple and efficient clustering method. However, the algorithm often converges to local minima because it is sensitive to the initial cluster centers. In this paper, an algorithm for selecting initial cluster centers on the basis of the minimum spanning tree (MST) is presented. The sets of vertices in the MST with the same degree are regarded as a whole, which is used to find the skeleton data points. Furthermore, a distance measure between the skeleton data points that takes both degree and Euclidean distance into account is presented. Finally, the MST-based initialization method for the k-means algorithm is presented, and the corresponding time complexity is analyzed as well. The presented algorithm is tested on five data sets from the UCI Machine Learning Repository. The experimental results illustrate the effectiveness of the presented algorithm compared to three existing initialization methods.
Keywords: Degree, initial cluster center, k-means, minimum spanning tree.
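A minimal sketch of seeding k-means from a minimum spanning tree of the data (illustrative only; picking the k highest-degree MST vertices, as done here, is a simplification of the paper's skeleton-point procedure):

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform
from sklearn.cluster import KMeans

def mst_init_kmeans(X: np.ndarray, k: int) -> KMeans:
    """Use the k highest-degree MST vertices as initial centers, then run k-means."""
    dist = squareform(pdist(X))                     # pairwise Euclidean distances
    mst = minimum_spanning_tree(dist).toarray()     # MST edge weights (upper triangular)
    adjacency = (mst > 0) | (mst.T > 0)
    degree = adjacency.sum(axis=1)
    centers = X[np.argsort(degree)[::-1][:k]]       # k points with the largest MST degree
    return KMeans(n_clusters=k, init=centers, n_init=1).fit(X)

X = np.random.default_rng(0).normal(size=(200, 2))
print(mst_init_kmeans(X, k=3).cluster_centers_)
```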
15597 Generalized Method for Estimating Best-Fit Vertical Alignments for Profile Data
Authors: Said M. Easa, Shinya Kikuchi
Abstract:
When the profile information of an existing road is missing or not up to date and the parameters of the vertical alignment are needed for engineering analysis, the engineer has to recreate the geometric design features of the road alignment using collected profile data. The profile data may be collected using traditional surveying methods, global positioning systems, or digital imagery. This paper develops a method that estimates the parameters of the geometric features that best characterize the existing vertical alignment in terms of tangents and curve expressions, where the curves may be symmetrical, asymmetrical, reverse, or complex vertical curves. The method is implemented using an Excel-based optimization that minimizes the differences between the observed profile and the profiles estimated from the equations of the vertical curves. The method uses a 'wireframe' representation of the profile, which makes it applicable to all types of vertical curves. A secondary contribution of this paper is to introduce the properties of the equal-arc asymmetrical curve that has recently been developed in the highway geometric design field.
Keywords: Optimization, parameters, data, reverse, spreadsheet, vertical curves.
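A minimal sketch of fitting a single symmetric (parabolic) vertical curve to profile data by least squares (illustrative only; the station and elevation values are hypothetical, and the paper's generalized method also handles tangents and asymmetrical, reverse, and complex curves, which this sketch does not):

```python
import numpy as np

# Hypothetical profile data: station (m) and elevation (m) within one vertical curve.
station   = np.array([0, 25, 50, 75, 100, 125, 150], dtype=float)
elevation = np.array([100.00, 100.55, 100.95, 101.20, 101.30, 101.25, 101.05])

# A symmetric vertical curve is a parabola: z = a*x^2 + b*x + c.
a, b, c = np.polyfit(station, elevation, deg=2)

g1 = b                          # entry grade (slope at the start of the curve)
g2 = 2 * a * station[-1] + b    # exit grade (slope at the end of the curve)
print(f"entry grade {g1:.4f}, exit grade {g2:.4f}, rate of grade change {2 * a:.6f} per m")
```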
15596 An Iterative Method for the Least-squares Symmetric Solution of AXB+CYD=F and its Application
Authors: Minghui Wang
Abstract:
Based on the classical algorithm LSQR for solving the (unconstrained) least-squares problem, an iterative method is proposed for the least-squares like-minimum-norm symmetric solution of AXB+CYD=E. As an application of this algorithm, an iterative method for the least-squares like-minimum-norm bisymmetric solution of AXB=E is also obtained. Numerical results are reported that show the efficiency of the proposed methods.
Keywords: Matrix equation, bisymmetric matrix, least squares problem, like-minimum norm, iterative algorithm.
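A minimal sketch of the unconstrained least-squares solution of AXB + CYD = E via Kronecker-product vectorization (illustrative only; it uses a dense solver rather than the LSQR-based iteration and does not impose the symmetric or bisymmetric constraints studied in the paper):

```python
import numpy as np

def solve_axb_cyd(A, B, C, D, E):
    """Least-squares X, Y for AXB + CYD ≈ E using vec(AXB) = (B^T ⊗ A) vec(X)."""
    m, n = A.shape[1], B.shape[0]   # size of X
    p, q = C.shape[1], D.shape[0]   # size of Y
    M = np.hstack([np.kron(B.T, A), np.kron(D.T, C)])
    sol, *_ = np.linalg.lstsq(M, E.flatten(order="F"), rcond=None)
    X = sol[: m * n].reshape((m, n), order="F")
    Y = sol[m * n:].reshape((p, q), order="F")
    return X, Y
```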
15595 Source Direction Detection based on Stationary Electronic Nose System
Authors: Jie Cai, David C. Levy
Abstract:
Electronic noses (arrays of chemical sensors) are widely used in the food industry and in pollution control. They can also be used to locate an odor source or detect the direction from which it is emitted. Usually this task is performed by an electronic nose (ENose) operated together with a mobile vehicle, but problems occur when a source is instantaneous or the surroundings are hard for vehicles to reach. Thus a method that allows a stationary ENose to detect the direction of the source and locate it is required. A novel method which uses the ratio between the responses of different sensors as a discriminant to determine the direction of the source in natural wind surroundings is presented in this paper. The results show that the method is accurate and easy to implement. The method could also be used in mobile settings, as an optimized algorithm for robot tracking of a source location.
Keywords: Electronic nose, natural wind situation, source direction detection.
15594 Design of Compliant Mechanism Based Microgripper with Three Finger Using Topology Optimization
Authors: R. Bharanidaran, B. T. Ramesh
Abstract:
High precision in motion is required to manipulate micro objects in precision industries for micro assembly, cell manipulation, etc. Precision manipulation is achieved through the appropriate mechanism design of micro devices such as microgrippers. A compliant-mechanism-based design is the better option to achieve highly precise and controlled motion. This research article highlights a method of designing a compliant-mechanism-based three-fingered microgripper suitable for holding asymmetric objects. The topology optimization technique, a systematic method, is implemented in this research work to arrive at a topologically optimized design of the mechanism needed to perform the required micro motion of the gripper. The optimization technique has the drawback of generating meaningless regions, such as node-to-node connectivity and a staircase effect at the boundaries. Hence, post processing of the design is required to make it manufacturable. To reduce the effect of the post-processing stage and to preserve the edges of the image, a cubic spline interpolation technique is introduced in the MATLAB program. The structural performance of the topologically developed mechanism design is tested using finite element method (FEM) software. Further, the microgripper structure is examined to find its fatigue life and vibration characteristics.
Keywords: Compliant mechanism, Cubic spline interpolation, FEM, Topology optimization.
15593 Application of Smooth Ergodic Hidden Markov Model in Text to Speech Systems
Authors: Armin Ghayoori, Faramarz Hendessi, Asrar Sheikh
Abstract:
In developing a text-to-speech system, it is well known that the accuracy of the information extracted from a text is crucial to produce high-quality synthesized speech. In this paper, a new scheme for converting text into its equivalent phonetic spelling is introduced and developed. This method is applicable to many text-to-speech conversion systems and has many advantages over other methods. The proposed method can also complement other methods with the purpose of improving their performance. The proposed method is a probabilistic model and is based on the Smooth Ergodic Hidden Markov Model, which can be considered an extension of the HMM. The proposed method is applied to the Persian language, and its accuracy in converting text to phonetics is evaluated using simulations.
Keywords: Hidden Markov Models, text, synthesis.
15592 Frequency Modulation in Vibro-Acoustic Modulation Method
Authors: D. Liu, D. M. Donskoy
Abstract:
The vibro-acoustic modulation method is based on the modulation of a high-frequency ultrasonic wave (the carrier) by a low-frequency vibration in the presence of various defects, primarily contact-type defects such as cracks, delaminations, etc. The presence and severity of the defect are measured by the ratio of the spectral sidebands to the carrier in the spectrum of the modulated signal. This approach, however, does not differentiate between amplitude and frequency modulation, AM and FM, respectively. This paper is an attempt to explain the generation mechanisms of FM and its correlation with the flaw properties. We propose two possible mechanisms leading to FM, based on nonlinear local defect resonance and dynamic acoustoelastic models.
Keywords: Non-destructive testing, nonlinear acoustics, structural health monitoring, acoustoelasticity, local defect resonance.
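A minimal sketch of the sideband-to-carrier measure referred to above, computed from the spectrum of a recorded signal (illustrative only; the nearest-bin peak picking and the dB convention are assumptions, not the authors' exact processing):

```python
import numpy as np

def sideband_to_carrier_db(signal, fs, f_carrier, f_vibration):
    """Ratio (dB) of the first modulation sidebands to the carrier in the signal spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    amp = lambda f: spectrum[np.argmin(np.abs(freqs - f))]   # amplitude at the nearest FFT bin
    carrier = amp(f_carrier)
    sidebands = amp(f_carrier - f_vibration) + amp(f_carrier + f_vibration)
    return 20.0 * np.log10(sidebands / carrier)
```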