Search results for: computational complexity
1118 Compressive Stresses near Crack Tip Induced by Thermo-Electric Field
Authors: Thomas Jin-Chee Liu
Abstract:
In this paper, the thermo-electro-structural coupled field in a cracked metal plate is studied using finite element analysis. The computational results show that compressive stresses develop near the crack tip, in agreement with previous work. Furthermore, the compressive condition can retard and stop crack growth during the Joule heating process.
Keywords: Compressive stress, crack tip, Joule heating, finite element.
1117 Inverse Sets-based Recognition of Video Clips
Authors: Alexei M. Mikhailov
Abstract:
The paper discusses the mathematics of pattern indexing and its application to the recognition of visual patterns found in video clips. It is shown that (a) pattern indexes can be represented by collections of inverted patterns, and (b) solutions to pattern classification problems can be found as intersections and histograms of inverted patterns, so that matching of the original patterns is avoided.
Keywords: Artificial neural cortex, computational biology, data mining, pattern recognition.
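The inverted-pattern idea in this abstract resembles a classical inverted index: each feature points back to the patterns that contain it, and a query is classified by intersecting or histogramming those postings rather than by matching whole patterns. The following Python sketch illustrates that general mechanism on assumed toy data; the feature extraction and the exact index structure of the paper are not specified here.

from collections import Counter

# Toy "patterns": each stored clip is described by a set of discrete features.
patterns = {
    "clip_a": {"edge_17", "color_3", "motion_2"},
    "clip_b": {"edge_17", "color_9", "motion_2"},
    "clip_c": {"edge_40", "color_3", "motion_7"},
}

# Inverted index: feature -> set of clips containing it.
inverted = {}
for clip, feats in patterns.items():
    for f in feats:
        inverted.setdefault(f, set()).add(clip)

def classify(query_features):
    # Histogram of how many query features each stored clip shares;
    # the best match is found without comparing whole patterns directly.
    votes = Counter()
    for f in query_features:
        for clip in inverted.get(f, ()):
            votes[clip] += 1
    return votes.most_common(1)[0] if votes else None

print(classify({"edge_17", "motion_2", "color_3"}))  # ('clip_a', 3)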
1116 Two-Phase Optimization for Selecting Materialized Views in a Data Warehouse
Authors: Jiratta Phuboon-ob, Raweewan Auepanwiriyakul
Abstract:
A data warehouse (DW) is a system that supports decision-making through querying. Queries to a DW are critical because of their complexity and length: they often access millions of tuples and involve joins between relations and aggregations. Materialized views can provide better performance for DW queries. However, these views incur a maintenance cost, so materializing all views is not possible. An important challenge in the DW environment is materialized view selection, because of the trade-off between query performance and view maintenance. In this paper, we therefore introduce a new approach to this challenge based on Two-Phase Optimization (2PO), a combination of Simulated Annealing (SA) and Iterative Improvement (II), using the Multiple View Processing Plan (MVPP). Our experiments show that 2PO outperforms the original algorithms in terms of both query processing cost and view maintenance cost.
Keywords: Data warehouse, materialized views, view selection problem, two-phase optimization.
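As a rough illustration of how the two phases of 2PO fit together, the sketch below runs iterative improvement (local moves accepted only when the cost drops) to reach a local optimum, then restarts simulated annealing from that solution. The cost function, neighbourhood move and parameters are placeholders; the paper's MVPP-based cost model is not reproduced.

import math
import random

def two_phase_optimize(initial, cost, neighbor, ii_steps=500,
                       sa_steps=2000, t0=1.0, alpha=0.999, seed=0):
    rng = random.Random(seed)
    # Phase 1: Iterative Improvement - accept only strictly better neighbours.
    current = initial
    for _ in range(ii_steps):
        cand = neighbor(current, rng)
        if cost(cand) < cost(current):
            current = cand
    # Phase 2: Simulated Annealing started from the II local optimum.
    best, t = current, t0
    for _ in range(sa_steps):
        cand = neighbor(current, rng)
        delta = cost(cand) - cost(current)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= alpha
    return best

# Hypothetical usage: a solution could be a frozenset of view indices to
# materialize, and `cost` would combine query processing and maintenance costs.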
1115 Data-organization Before Learning Multi-Entity Bayesian Networks Structure
Authors: H. Bouhamed, A. Rebai, T. Lecroq, M. Jaoua
Abstract:
The objective of our work is to develop a new approach for discovering knowledge from a large mass of data; the result of applying this approach will be an expert system serving as a diagnostic tool for a phenomenon related to a large information system. We first recall the general problem of learning Bayesian network structure from data and suggest a solution for optimizing its complexity by using data organization and optimization methods. We then propose a new heuristic for learning Multi-Entity Bayesian Network structures. We have applied our approach to biological data concerning complex hereditary illnesses, for which the biological literature identifies the variables responsible for those diseases. Finally, we conclude on the limits reached by this work.
Keywords: Data-organization, data-optimization, automatic knowledge discovery, Multi-Entities Bayesian networks, score merging.
1114 Demand and Supply Chain Simulation in Telecommunication Industry by Multi-Rate Expert Systems
Authors: Andrus Pedai, Igor Astrov
Abstract:
In the modern telecommunications industry, demand and supply chain management (DSCM) needs reliable design and versatile tools to control the material flow. The objective of efficient DSCM is to reduce inventory, lead times and related costs in order to assure reliable and on-time deliveries from manufacturing units to customers. In this paper, a multi-rate expert-system-based methodology is proposed for developing simulation tools that enable optimal DSCM for multi-region, high-volume and high-complexity manufacturing environments.
Keywords: Demand & supply chain management, expert systems, inventory control, multi-rate control, performance metrics.
1113 Joint Optimization of Pricing and Advertisement for Seasonal Branded Products
Authors: Mohammad Modarres, Shirin Aslani
Abstract:
The goal of this paper is to develop a model that integrates pricing and advertisement for short life cycle products, such as branded fashion clothing. To achieve this goal, we apply the concept of dynamic pricing. There are two classes of advertisement: for the brand (regardless of product) and for a particular product. Advertising the brand affects the demand and price of all the products, so the model considers all these products in relation to each other. We develop two different methods to integrate both types of advertisement with pricing. The first model is developed within the framework of dynamic programming; however, due to its complexity, this method is not applicable to large problems. We therefore develop another method, a hierarchical approach, which is capable of handling real-world problems. Finally, we show the accuracy of this method both theoretically and by simulation.
Keywords: Advertising, Dynamic programming, Dynamic pricing, Promotion.
1112 A New Floating Point Implementation of Base 2 Logarithm
Authors: Ahmed M. Mansour, Ali M. El-Sawy, Ahmed T. Sayed
Abstract:
Logarithms reduce products to sums and powers to products; they play an important role in signal processing, communication and information theory. They are primarily used for hardware calculations, handling multiplications, divisions, powers and roots effectively. There are three commonly used bases for logarithms: the common logarithm with base 10, the natural logarithm with base e, and the binary logarithm with base 2. This paper demonstrates different methods of calculating log2, showing the complexity of each, identifies the most accurate and efficient one, and gives insight into their hardware design. We present a new method called Floor Shift for fast calculation of log2, and then combine this algorithm with a Taylor series to improve the accuracy of the output; we illustrate this with two examples. We finally compare the algorithms and conclude with our remarks.
Keywords: Logarithms, log2, floor, iterative, CORDIC, Taylor series.
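To make the two-stage idea concrete, here is a small Python sketch assuming the usual decomposition log2(x) = e + ln(m)/ln(2) with x = m * 2^e: the integer part can be found by shifting (the floor stage), and the fractional part is refined with a truncated Taylor series for ln(1+u). This only illustrates the general approach; the paper's specific Floor Shift hardware formulation and fixed-point details are not reproduced.

import math

def floor_log2_int(n: int) -> int:
    # Integer (floor) part of log2 of a positive integer, found by shifting.
    k = -1
    while n:
        n >>= 1
        k += 1
    return k

def log2_taylor(x: float, terms: int = 12) -> float:
    # Sketch: normalize x = m * 2**e with m in [0.5, 1), then refine with
    # the Taylor series ln(1+u) = u - u^2/2 + u^3/3 - ...  where u = m - 1.
    if x <= 0:
        raise ValueError("log2 is defined for positive x only")
    m, e = math.frexp(x)
    u, s, term = m - 1.0, 0.0, 1.0
    for k in range(1, terms + 1):
        term *= u
        s += term / k if k % 2 else -term / k
    return e + s / math.log(2.0)

print(floor_log2_int(100), log2_taylor(100.0))  # 6 and roughly 6.6439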
1111 Algebraic Approach for the Reconstruction of Linear and Convolutional Error Correcting Codes
Authors: Johann Barbier, Guillaume Sicot, Sebastien Houcke
Abstract:
In this paper we present a generic approach to the problem of blind estimation of the parameters of linear and convolutional error correcting codes. In a non-cooperative context, an adversary has access only to the noisy transmission he has intercepted; the interceptor has no knowledge of the parameters used by the legitimate users. So, before gaining access to the information, he first has to blindly estimate the parameters of the error correcting code used in the communication. The main advantage of the presented approach is that the problem of reconstructing such codes can be expressed in a very simple way. This allows us to evaluate theoretical bounds on the complexity of the reconstruction process as well as bounds on the estimation rate. We show that some classical reconstruction techniques are optimal, and also explain why some of them have theoretical complexities greater than those observed experimentally.
Keywords: Blind parameter estimation, error correcting codes, non-cooperative context, reconstruction algorithm.
1110 Development of Interaction Factors Charts for Piled Raft Foundation
Authors: Abdelazim Makki Ibrahim, Esamaldeen Ali
Abstract:
This study aims at analysing the load-settlement behavior and predicting the bearing capacity of piled raft foundations. A series of finite element models with different foundation configurations and stiffnesses were established. Numerical modeling is used to study the behavior of the piled raft foundation because of the complexity of the pile, raft and soil interaction, and also because of the lack of a reliable analytical method that can predict the behavior of the piled raft foundation system. Simple analytical models are developed to predict the average settlement and the load sharing between the piles and the raft in the piled raft foundation system. A simple example demonstrating the application of these charts is included.
Keywords: Finite element method, pile-raft foundation, PLAXIS software, settlement.
1109 A Robust Visual Tracking Algorithm with Low-Rank Region Covariance
Authors: Songtao Wu, Yuesheng Zhu, Ziqiang Sun
Abstract:
The region covariance (RC) descriptor is an effective and efficient feature for visual tracking. Current RC-based tracking algorithms use the whole RC matrix directly to track the target in video. However, these whole-RC-based algorithms have some issues: if some features are contaminated, the whole RC becomes unreliable, which can cause the tracker to lose the object; and even when a few features are very discriminative against the background, the remaining features are still processed, which reduces efficiency. In this paper a new robust tracking method is proposed, in which the whole RC matrix is decomposed into several low-rank matrices. These matrices are dynamically chosen and processed so as to achieve a good trade-off between discriminability and complexity. Experimental results have shown that our method is more robust to complex environment changes than other RC-based methods, especially when occlusion happens or when the background is similar to the target.
Keywords: Visual tracking, region covariance descriptor, low-rank region covariance.
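For reference, the region covariance descriptor itself is just the covariance matrix of per-pixel feature vectors over an image region. The NumPy sketch below computes it for an assumed feature stack of pixel coordinates, intensity and gradient magnitudes; the paper's specific feature choice and its low-rank decomposition are not shown.

import numpy as np

def region_covariance(region: np.ndarray) -> np.ndarray:
    # region: (H, W) grayscale patch. Build per-pixel feature vectors
    # [x, y, I, |Ix|, |Iy|] and return their covariance matrix (5 x 5).
    h, w = region.shape
    ys, xs = np.mgrid[0:h, 0:w]
    iy, ix = np.gradient(region.astype(float))
    feats = np.stack([xs, ys, region, np.abs(ix), np.abs(iy)], axis=-1)
    return np.cov(feats.reshape(-1, feats.shape[-1]), rowvar=False)

patch = np.random.default_rng(0).random((32, 32))
print(region_covariance(patch).shape)  # (5, 5)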
1108 Potential Field Functions for Motion Planning and Posture of the Standard 3-Trailer System
Authors: K. Raghuwaiya, S. Singh, B. Sharma, J. Vanualailai
Abstract:
This paper presents a set of artificial potential field functions that improves, in general, the motion planning and posture control of 3-trailer systems in an a priori known environment, with theoretically guaranteed point and posture stabilities, convergence and collision avoidance properties. We design and inject two new concepts, ghost walls and the distance optimization technique (DOT), to strengthen point and posture stabilities, in the sense of Lyapunov, of our dynamical model. This new combination of techniques emerges as a convenient mechanism for obtaining feasible orientations at the target positions, with an overall reduction in the complexity of the navigation laws. The effectiveness of the proposed control laws was demonstrated via simulations of two traffic scenarios.
Keywords: Artificial potential fields, 3-trailer systems, motion planning, posture, parking and collision-free trajectories.
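For readers unfamiliar with artificial potential fields, the sketch below shows the standard attractive/repulsive construction and a numerical gradient-descent step toward the goal. It is a generic illustration only, with assumed gains and obstacle positions; the paper's ghost walls, DOT and Lyapunov-based controllers for the 3-trailer kinematics are not reproduced.

import numpy as np

def attractive(p, goal, k_att=1.0):
    # Quadratic potential well centred on the goal.
    return 0.5 * k_att * np.sum((p - goal) ** 2)

def repulsive(p, obstacle, rho0=1.0, k_rep=1.0):
    # Acts only within the influence distance rho0 of the obstacle.
    rho = np.linalg.norm(p - obstacle)
    if rho >= rho0:
        return 0.0
    return 0.5 * k_rep * (1.0 / rho - 1.0 / rho0) ** 2

def descend(p, goal, obstacles, step=0.05, eps=1e-4, iters=200):
    # Move downhill on the total potential using a numerical gradient.
    def total(q):
        return attractive(q, goal) + sum(repulsive(q, o) for o in obstacles)
    for _ in range(iters):
        grad = np.zeros_like(p)
        for i in range(p.size):
            d = np.zeros_like(p); d[i] = eps
            grad[i] = (total(p + d) - total(p - d)) / (2 * eps)
        p = p - step * grad
    return p

print(descend(np.array([0.0, 0.0]), np.array([3.0, 2.0]),
              [np.array([1.6, 0.6])]))  # ends near the goal, avoiding the obstacle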
1107 Exploring Inter-Relationships between Events to Identify Strategic Technological Competencies: A Combined Approach
Authors: Cláudio Santos, Madalena Araújo, Nuno Correia
Abstract:
The inherent complexity of today's business environments is forcing organizations to be attentive to dynamics on several fronts. The management of technological innovation is therefore continually faced with uncertainty about the future. These issues lead to the need for a systemic perspective, able to analyze the consequences of interactions between different factors. The field of technology foresight has proposed methods and tools to deal with this broader perspective. In an attempt to provide a method to analyze the complex interactions between events in several areas, starting from the identification of the most strategic competencies, this paper presents a methodology based on the Delphi method and Quality Function Deployment. The methodology is applied in a sheet metal processing equipment manufacturer as a case study.
Keywords: Competencies, Delphi Method, Quality Function Deployment, Technology Foresight.
1106 A Hyper-Domain Image Watermarking Method based on Macro Edge Block and Wavelet Transform for Digital Signal Processor
Authors: Yi-Pin Hsu, Shin-Yu Lin
Abstract:
Watermarking is a primary approach to protecting the copyright of digital information. However, algorithms that achieve high image quality may not run on embedded systems because of their computational complexity, yet most of today's algorithms must target consumer products, since integrated circuits have advanced greatly and become inexpensive. In this paper, we propose a novel algorithm that efficiently embeds a watermark into a digital image and is very easy to implement on a digital signal processor. Furthermore, we select a general-purpose and inexpensive digital signal processor made by Analog Devices to fit consumer applications. The experimental results show that the watermarked image quality reaches 46 dB, which is acceptable to human vision, and that the algorithm executes in real time on the digital signal processor.
Keywords: Watermarking, digital signal processor, embedded system.
1105 Frequency Offset Estimation Schemes Based On ML for OFDM Systems in Non-Gaussian Noise Environments
Authors: Keunhong Chae, Seokho Yoon
Abstract:
In this paper, frequency offset (FO) estimation schemes robust to non-Gaussian noise environments are proposed for orthogonal frequency division multiplexing (OFDM) systems. First, a maximum-likelihood (ML) estimation scheme for non-Gaussian noise environments is proposed; the complexity of the ML estimation scheme is then reduced by employing a reduced set of candidate values. Numerical results demonstrate that the proposed schemes provide a significant performance improvement over the conventional estimation scheme in non-Gaussian noise environments, while maintaining performance similar to that obtained in Gaussian noise environments.
Keywords: Frequency offset estimation, maximum-likelihood, non-Gaussian noise environment, OFDM, training symbol.
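A grid search over a set of candidate offsets is the simplest way to picture such an estimator. The sketch below uses the standard Gaussian-noise ML metric with a known training symbol purely for illustration; the paper's contribution is a metric matched to non-Gaussian noise and a reduced candidate set, neither of which is specified here.

import numpy as np

def fo_estimate(r, s, candidates, n_sub):
    # Grid-search FO estimator with a known training symbol s.
    # Gaussian-noise ML metric shown for illustration only.
    n = np.arange(len(r))
    metrics = [np.real(np.sum(r * np.conj(s) *
                              np.exp(-2j * np.pi * eps * n / n_sub)))
               for eps in candidates]
    return candidates[int(np.argmax(metrics))]

# Toy check: N = 64 subcarriers, true offset of 0.23 subcarrier spacings.
rng = np.random.default_rng(1)
N = 64
s = np.exp(2j * np.pi * rng.random(N))            # unit-modulus training symbol
r = s * np.exp(2j * np.pi * 0.23 * np.arange(N) / N)
r += 0.05 * (rng.normal(size=N) + 1j * rng.normal(size=N))
cands = np.arange(-0.5, 0.5, 0.01)                # reduced candidate set
print(fo_estimate(r, s, cands, N))                # close to 0.23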
1104 Scenarios of Societal Security and Business Continuity Cycles
Authors: Jiří F. Urbánek, Jiří Barta
Abstract:
Societal security, continuity scenarios and a methodological cycling approach are explained in this article. In particular, the organizational challenges of societal security call for the implementation of the international standard BS 25999-2 and the global ISO 22300 family of standards for business continuity management systems. An efficient global organizational system is distinguished by high entity complexity, connectivity and interoperability, and in fact does not have only cooperative relations. Competing businesses have numerous participating 'enemies', which play open or hidden opponent and antagonistic roles against a prosperous organizational system, leading to a crisis scene or even a battle theatre. Business continuity scenarios for the organization are necessary for preparedness, planning, management and mastery of such 'a play' in real environments.
Keywords: Business continuity, societal security, crisis scenarios cycles.
1103 Multiuser Detection in CDMA Fast Fading Multipath Channel using Heuristic Genetic Algorithms
Authors: Muhammad Naeem, Syed Ismail Shah, Habibullah Jamal
Abstract:
In this paper, a simple heuristic genetic algorithm is used for multistage multiuser detection in fast fading environments. Multipath channels, multiple access interference (MAI) and the near-far effect cause the performance of the conventional detector to degrade. Heuristic genetic algorithms, a rapidly growing area of artificial intelligence, use evolutionary programming for the initial search, which not only helps the solution converge efficiently towards near-optimal performance but also does so at a very low complexity compared with the optimal detector. This holds true for both Additive White Gaussian Noise (AWGN) and multipath fading channels. Experimental results are presented to show the superior performance of the proposed technique over the existing methods.
Keywords: Genetic Algorithm (GA), Multiple Access Interference (MAI), Multistage Detectors (MSD), Successive Interference Cancellation.
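The sketch below shows the bare genetic-algorithm machinery (truncation selection, one-point crossover, bit-flip mutation) searching over +/-1 bit vectors for the maximum-likelihood metric of a toy synchronous, real-valued CDMA model r = Sb + noise. The population sizes, operators and channel model are illustrative assumptions, not the multistage detector or fast-fading multipath setup of the paper.

import numpy as np

def ga_detect(r, S, pop_size=40, generations=60, p_mut=0.05, seed=0):
    # GA search over +/-1 bit vectors minimizing ||r - S b||^2.
    rng = np.random.default_rng(seed)
    k = S.shape[1]
    fitness = lambda b: -np.sum((r - S @ b) ** 2)
    pop = rng.choice([-1.0, 1.0], size=(pop_size, k))
    for _ in range(generations):
        scores = np.array([fitness(b) for b in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, k)                     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child[rng.random(k) < p_mut] *= -1           # bit-flip mutation
            children.append(child)
        pop = np.vstack([parents] + children)
    return pop[np.argmax([fitness(b) for b in pop])]

# Toy use: 3 users, random spreading codes of length 16, known true bits.
rng = np.random.default_rng(1)
S = rng.choice([-1.0, 1.0], size=(16, 3)) / np.sqrt(16)
b_true = np.array([1.0, -1.0, 1.0])
r = S @ b_true + 0.1 * rng.normal(size=16)
print(ga_detect(r, S))   # expected to recover [ 1., -1.,  1.]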
1102 Prediction of Bath Temperature Using Neural Networks
Authors: H. Meradi, S. Bouhouche, M. Lahreche
Abstract:
In this work, we consider an application of neural networks to the LD converter. This approach provides a reliable prediction of the steel temperature and reduces the reblow ratio in the steelworks. A conventional model has been applied to the charge calculation, but the results obtained with this technique are not always good, due to the complexity of the process. The difficulties are mainly generated by noisy measurements and process nonlinearities. Artificial Neural Networks (ANNs) have become a powerful tool for such complex applications. A backpropagation algorithm is used to train the neural networks, and the ANN is used to predict the steel bath temperature in the oxygen converter process at the end condition. The model has 11 input process variables and one output. The model was tested in the steelworks; the results obtained with the neural approach are better than those of the conventional model.
Keywords: LD converter, bath temperature, neural networks.
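As a minimal sketch of the kind of model described (a backpropagation-trained network mapping 11 process variables to one temperature output), the snippet below fits scikit-learn's MLPRegressor to synthetic stand-in data. The input variables, network size and data are assumptions for illustration; the paper's real process data and network configuration are not given in the abstract.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in data: 11 process variables -> bath temperature.
# The real inputs (charge weights, blowing parameters, ...) are not specified here.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 11))
y = 1650 + 10 * X[:, 0] - 5 * X[:, 3] + rng.normal(scale=2.0, size=500)

model = MLPRegressor(hidden_layer_sizes=(16,), solver="adam",
                     max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])
print("held-out MAE [deg C]:", np.abs(model.predict(X[400:]) - y[400:]).mean())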
1101 Recognition by Online Modeling – a New Approach of Recognizing Voice Signals in Linear Time
Authors: Jyh-Da Wei, Hsin-Chen Tsai
Abstract:
This work presents a novel means of extracting fixed-length parameters from voice signals, such that words can be recognized in linear time. The power and the zero-crossing rate are first calculated segment by segment from a voice signal, generating two feature sequences. We then construct an FIR system across these two sequences. The parameters of this FIR system, used as the input of a multilayer perceptron recognizer, can be derived by recursive LSE (least-squares estimation), implying that the complexity of the overall process is linear in the signal size. In the second part of this work, we introduce a weighting factor λ to emphasize recent input, so that we can further recognize continuous speech signals. The experiments employ voice signals of the numbers zero to nine spoken in Mandarin Chinese. The proposed method is verified to recognize voice signals efficiently and accurately.
Keywords: Speech Recognition, FIR system, Recursive LSE, Multilayer Perceptron.
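The first stage described above is straightforward to picture: slide a window over the signal and record, per segment, the short-time power and the zero-crossing rate. The sketch below implements just that stage with assumed frame sizes; the FIR fitting by recursive LSE and the perceptron recognizer are not shown.

import numpy as np

def frame_features(signal, frame_len=256, hop=128):
    # Short-time power and zero-crossing rate, segment by segment.
    powers, zcrs = [], []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        powers.append(float(np.mean(frame ** 2)))
        # Fraction of consecutive sample pairs whose signs differ.
        zcrs.append(float(np.mean(np.signbit(frame[:-1]) != np.signbit(frame[1:]))))
    return np.array(powers), np.array(zcrs)

t = np.arange(16000) / 8000.0
tone = 0.5 * np.sin(2 * np.pi * 440 * t)          # 2 s of a 440 Hz tone at 8 kHz
p, z = frame_features(tone)
print(len(p), p[0], z[0])                         # two aligned feature sequences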
1100 Shadow Detection for Increased Accuracy of Privacy Enhancing Methods in Video Surveillance Edge Devices
Authors: F. Matusek, G. Pujolle, R. Reda
Abstract:
Shadow detection is still considered one of the key challenges for intelligent automated video surveillance systems. A prerequisite for reliable and accurate detection and tracking is correct shadow detection and classification. In such a landscape of conditions, privacy issues add more and more complexity and require reliable shadow detection. In this work the intertwining of security, accuracy, reliability and privacy is analyzed and, accordingly, a novel architecture for Privacy Enhancing Video Surveillance (PEVS) is introduced. Shadow detection and masking are handled through the simultaneous combination of two different approaches. This results in a unique privacy enhancement without affecting security. The methodology was subsequently employed successfully in a large-scale wireless video surveillance system; privacy-relevant information was stored and encrypted on the unit, without transferring it over an untrusted network.
Keywords: Video Surveillance, Intelligent Video Surveillance, Physical Security, WSSU, Privacy, Shadow Detection.
1099 Digital Transformation as the Subject of the Knowledge Model of the Discursive Space
Authors: Rafal Maciag
Abstract:
Due to the development of the current civilization, one must create suitable models of its pervasive, massive phenomena. Such a phenomenon is the digital transformation, which has a substantial number of disciplined, methodical interpretations forming a diversified reflection. This reflection can be understood pragmatically as the current, temporally and locally differential state of knowledge. The model of the discursive space is proposed as a model for the analysis and description of this knowledge. Discursive space is understood as an autonomous multidimensional space in which separate discourses traverse specific trajectories, which can be presented in a multidimensional parallel coordinate system. Discursive space built on the world of facts preserves the complex character of that world. Digital transformation as a discursive space has a relativistic character: it is simultaneously created by the dynamic discourses, and these discourses are molded by the shape of this space.
Keywords: Knowledge, digital transformation, discourse, discursive space, complexity.
1098 Urban Transformations of the Mediterranean Cities in Light of Developments in the Modern Era
Authors: Bakr Hashem Paumey Ahmed Alashwal
Abstract:
Urban transformation processes, in their framework and general significance, have become a fundamental and vital subject of consideration for both developed and developing societies. It has become important to regulate the architectural systems adopted by the city, on the one hand to sustain present development and on the other to facilitate its future growth. The study therefore deals with the phenomenon of urban transformation of Mediterranean cities, and of the city of Alexandria in particular, because of its significant historical and cultural legacy, its historical architecture and its contemporary urbanization. This article investigates the cities of the Mediterranean region as a whole through an analysis of the relationship between the inflation and growth of these cities and the extent of the complexity of the city barriers. We hope to analyze not only the internal transformations, but also the external relationships (both imperial and post-colonial) that have shaped Alexandria's growth from the nineteenth century until today.
Keywords: Urban Transformations, Mediterranean cities, Modern Era, Alexandria.
1097 Action Potential of Lateral Geniculate Neurons at Low Threshold Currents: Simulation Study
Authors: Faris Tarlochan, Siva Mahesh Tangutooru
Abstract:
The Lateral Geniculate Nucleus (LGN) is the relay center in the visual pathway: it receives most of its input from retinal ganglion cells (RGC) and sends it to the visual cortex. Low-threshold calcium currents (IT) at the membrane are the unique indicator for characterizing the firing functionality that LGN neurons gain from RGC input. The morphologies of the LGN neurons were developed according to LGN functional requirements such as the functional mapping of RGC to LGN. In neurological disorders such as glaucoma, the mapping between RGC and LGN is disconnected, and stimulating the LGN electrically with deep brain electrodes can hence restore its functionality. A computational model was developed to simulate LGN neurons with three predominant morphologies, each representing a different functional mapping of RGC to LGN. The firing of action potentials at the LGN neuron due to IT was characterized by varying the stimulation parameters, morphological parameters and orientation. A wide range of stimulation parameters (stimulus amplitude, duration and frequency) represents the various strengths of the electrical stimulation, together with different morphological parameters (soma size, dendrite size and structure). The orientation (0–180°) of the LGN neuron with respect to the stimulating electrode represents the angle at which the extracellular deep brain stimulation of the LGN neuron is performed. A reduced dendrite structure, obtained with the Bush–Sejnowski algorithm, was used in the model to decrease the computational time while conserving its input resistance and total surface area. The major finding is that an input potential of 0.4 V is required to produce an action potential in an LGN neuron placed at a distance of 100 μm from the electrode. From this study, it can be concluded that the neuroprostheses under design would need to be capable of inducing at least 0.4 V to produce action potentials in the LGN.
Keywords: Lateral geniculate nucleus, visual cortex, finite element, glaucoma, neuroprostheses.
1096 Greek Compounds: A Challenging Case for the Parsing Techniques of PC-KIMMO v.2
Authors: Angela Ralli, Eleni Galiotou
Abstract:
In this paper we describe the recognition process of Greek compound words using the PC-KIMMO software. We try to show certain limitations of the system with respect to the principles of compound formation in Greek. Moreover, we discuss the computational processing of phenomena such as stress and syllabification, which are indispensable for the analysis of such constructions, and we try to propose linguistically acceptable solutions within the particular system.
Keywords: Morpho-phonological parsing, compound words, two-level morphology, natural language processing.
1095 Ranking and Unranking Algorithms for k-ary Trees in Gray Code Order
Authors: Fateme Ashari-Ghomi, Najme Khorasani, Abbas Nowzari-Dalini
Abstract:
In this paper, we present two new ranking and unranking algorithms for k-ary trees represented by x-sequences in Gray code order. These algorithms are based on a Gray code generation algorithm developed by Ahrabian et al. In that paper, a recursive backtracking generation algorithm for x-sequences corresponding to k-ary trees in Gray code order was presented; that generation algorithm is in turn based on Vajnovszki's algorithm for generating binary trees in Gray code ordering. To our knowledge, no ranking and unranking algorithms have been given for x-sequences in this ordering. We present ranking and unranking algorithms with O(kn^2) time complexity for x-sequences in this Gray code ordering.
Keywords: k-ary Tree Generation, Ranking, Unranking, Gray Code.
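To fix the terminology: ranking maps an object to its position in a given ordering, and unranking inverts that map. The sketch below shows the classic case for the binary reflected Gray code, where both directions reduce to a prefix XOR; it only illustrates the concept and is not the paper's O(kn^2) algorithm for k-ary-tree x-sequences.

def rank_to_gray(r: int) -> int:
    # Unranking: the r-th codeword of the binary reflected Gray code.
    return r ^ (r >> 1)

def gray_to_rank(g: int) -> int:
    # Ranking: recover the position of codeword g by a running prefix XOR.
    r = 0
    while g:
        r ^= g
        g >>= 1
    return r

print([rank_to_gray(i) for i in range(8)])                # [0, 1, 3, 2, 6, 7, 5, 4]
print([gray_to_rank(rank_to_gray(i)) for i in range(8)])  # [0, 1, 2, 3, 4, 5, 6, 7]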
1094 Evaluating and Selecting Optimization Software Packages: A Framework for Business Applications
Authors: Waleed Abohamad, Amr Arisha
Abstract:
Since business process optimization is a crucial requirement for navigating, surviving and even thriving in today's volatile business environment, this paper presents a framework for selecting a best-fit optimization package for solving complex business problems. The complexity level of the problem and/or the use of unsuitable optimization software can lead to biased solutions of the optimization problem. Accordingly, the proposed framework identifies a number of relevant factors (e.g. decision variables, objective functions, and modeling approach) to be considered during the evaluation and selection process. The application domain, problem specifications, and available accredited optimization approaches are also taken into account. The output of the framework is a recommendation of one or two optimization packages believed to provide the best results for the underlying problem. In addition, a set of guidelines and recommendations on how managers can conduct an effective optimization exercise is discussed.
Keywords: Complex Business Problems, Optimization, Selection Criteria, Software Evaluation.
1093 Detection and Pose Estimation of People in Images
Authors: Mousa Mojarrad, Amir Masoud Rahmani, Mehrab Mohebi
Abstract:
Detection, feature extraction and pose estimation of people in images and video are made challenging by the variability of human appearance, the complexity of natural scenes and the high dimensionality of articulated body models; they have also been an important field in image, signal and vision computing in recent years. In this paper, four types of people in 2D images are tested and proposed. The system extracts their size and body type (tall-fat, short-fat, tall-thin and short-thin) from the image. Whether a body is fat or thin is determined from the human body extracted from the image. The system also extracts the main dimensions of the human body, such as length and width, and shows them in the output.
Keywords: Analysis of Image Processing, Canny Edge Detection, Human Body Recognition, Measurement, Pose Estimation, 2D Human Dimension.
1092 A Novel Compression Algorithm for Electrocardiogram Signals based on Wavelet Transform and SPIHT
Authors: Sana Ktata, Kaïs Ouni, Noureddine Ellouze
Abstract:
An electrocardiogram (ECG) data compression algorithm is needed that reduces the amount of data to be transmitted, stored and analyzed without losing the clinical information content. A wavelet ECG data codec based on the Set Partitioning In Hierarchical Trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm has achieved notable success in still image coding; we modified the algorithm for the one-dimensional (1-D) case and applied it to the compression of ECG data. This compression method achieves a small percent root-mean-square difference (PRD) and a high compression ratio with low implementation complexity. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. Compression ratios of up to 48:1 for ECG signals lead to acceptable results on visual inspection.
Keywords: Discrete Wavelet Transform, ECG compression, SPIHT.
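The two figures of merit quoted above are easy to state precisely. The snippet below computes the compression ratio and one common form of the PRD; note that some papers subtract the signal mean in the PRD denominator, so the exact definition used here is an assumption, and the numbers in the example are illustrative only.

import numpy as np

def prd(original, reconstructed):
    # Percent root-mean-square difference between the original and
    # reconstructed signals (denominator without mean removal here).
    original = np.asarray(original, dtype=float)
    reconstructed = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2)
                           / np.sum(original ** 2))

def compression_ratio(original_bits, compressed_bits):
    # e.g. a ratio near 48 corresponds to the 48:1 figure quoted above.
    return original_bits / compressed_bits

x = np.sin(np.linspace(0, 20, 2000))
x_hat = x + 0.01 * np.random.default_rng(0).normal(size=x.size)
print(round(prd(x, x_hat), 3), round(compression_ratio(11 * 2000, 458), 1))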
1091 A Review of the Theoretical Context of the Role of Innovation in Economic Development
Authors: Maria E. Eggink
Abstract:
The aim of this paper is to place the role of innovation in economic development in its theoretical context through a literature review. The review compares classical economic theory and the neoclassical theories of “equilibrium in the markets” and “perfectly competitive markets” with the Schumpeterian theory. It was found that Schumpeter's contribution to economic development theories, and to creating awareness of the role of innovation in these theories, is of immeasurable importance. His contribution led to a change in economic thinking, although this was only realized much later than when his theories were first published. Neo-Schumpeterian thinking expanded on the Schumpeterian theory by studying innovation within a system of interaction among different role players. Studies on innovation should be founded in the neo-Schumpeterian school of thought in order to accommodate the complexity of the innovation system concept.
Keywords: Economic development, evolutionary economics, innovation, Schumpeter.
1090 Courses Pre-Required Visualization Using Force Directed Placement Technique
Authors: Imen Ammari, Mourad Elloumi, Ala Eddine Barouni
Abstract:
Visualizing the “Courses-Pre-Required-Architecture” on screen has proven to be useful and helpful for university actors, and especially for students. These students can easily identify courses and their prerequisites, perceive the courses to follow in the future, and then rapidly choose the appropriate course to register for. Given a set of courses and their prerequisites, we present an algorithm for visualizing a graph entitled “Courses-Pre-Required-Graph” that presents courses and their prerequisites, in order to help students recognize on their own which courses to take in the future and perceive the content of all the courses they will study. Our algorithm, using the force directed placement technique, visualizes the “Courses-Pre-Required-Graph” in such a way that courses are easily identifiable. The time complexity of our drawing algorithm is O(n^2), where n is the number of courses in the “Courses-Pre-Required-Graph”.
Keywords: Courses-Pre-Required-Architecture, Courses-Pre-Required-Graph, Courses-Pre-Required-Visualization, Force Directed Placement, Resolution.
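A force-directed layout of this kind typically balances pairwise repulsion between all nodes (the O(n^2) term per iteration) against spring attraction along edges. The sketch below is a minimal Fruchterman-Reingold-style illustration on a tiny hypothetical prerequisite graph; the paper's specific layout rules for course graphs are not reproduced.

import numpy as np

def force_directed_layout(edges, n, iters=200, k=1.0, step=0.05, seed=0):
    # Minimal force-directed placement: O(n^2) repulsion between all node
    # pairs plus spring attraction along edges.
    rng = np.random.default_rng(seed)
    pos = rng.random((n, 2))
    for _ in range(iters):
        disp = np.zeros((n, 2))
        for i in range(n):                      # pairwise repulsion: O(n^2)
            delta = pos[i] - pos
            dist = np.linalg.norm(delta, axis=1) + 1e-9
            disp[i] += np.sum((k * k / dist**2)[:, None] * delta, axis=0)
        for u, v in edges:                      # spring attraction along edges
            delta = pos[u] - pos[v]
            pull = (np.linalg.norm(delta) / k) * delta
            disp[u] -= pull
            disp[v] += pull
        # limit each node's move to a fixed step length
        pos += step * disp / (np.linalg.norm(disp, axis=1, keepdims=True) + 1e-9)
    return pos

# e.g. a tiny prerequisite graph: course 0 -> 1 -> 2, and 0 -> 3
print(force_directed_layout([(0, 1), (1, 2), (0, 3)], n=4))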
1089 An Optimal Feature Subset Selection for Leaf Analysis
Authors: N. Valliammal, S.N. Geethalakshmi
Abstract:
This paper describes an optimal approach for feature subset selection to classify leaves, based on a Genetic Algorithm (GA) and Kernel-Based Principal Component Analysis (KPCA). Due to the high complexity of selecting the optimal features, classification has become a critical task in analysing leaf image data. Initially, the shape, texture and colour features are extracted from the leaf images. These extracted features are then optimized separately by the GA and by KPCA. The approach performs an intersection operation over the subsets obtained from the two optimization processes, and the most common matching subset is forwarded to train the Support Vector Machine (SVM). Our experimental results show that the application of GA and KPCA for feature subset selection, with an SVM as the classifier, is computationally effective and improves the accuracy of the classifier.
Keywords: Optimization, Feature extraction, Feature subset, Classification, GA, KPCA, SVM, Computation.
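The final step of the pipeline, intersecting two candidate subsets and training an SVM on the common features, can be sketched as below. The two hard-coded subsets stand in for the GA-selected and KPCA-based selections (which the abstract does not specify), and the dataset is synthetic; this is only an illustration of the intersection idea, not of the paper's feature extraction.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=30, n_informative=8,
                           random_state=0)
ga_subset = {0, 2, 3, 5, 7, 11, 13, 17}      # hypothetical GA selection
kpca_subset = {0, 2, 5, 7, 9, 13, 19, 21}    # hypothetical KPCA-based selection
common = sorted(ga_subset & kpca_subset)     # intersection of the two subsets

svm = SVC(kernel="rbf", C=1.0)
score = cross_val_score(svm, X[:, common], y, cv=5).mean()
print("CV accuracy on the common subset:", round(score, 3))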