Search results for: greedy approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5065

4135 A New Biologically Inspired Pattern Recognition Approach for Face Recognition

Authors: V. Kabeer, N. K. Narayanan

Abstract:

This paper reports a new pattern recognition approach for face recognition. The biological model of the light receptors in the human eye (cones and rods), and the way they are associated with pattern vision, forms the basis of this approach. The functional model is simulated using CWD and WPD. The paper also discusses the experiments performed for face recognition using features extracted from images in the AT&T face database. Artificial Neural Network and k-Nearest Neighbour classifier algorithms are employed for recognition. A feature vector is formed for each face image in the database, and recognition accuracies are computed and compared across the classifiers. Simulation results show that the proposed method outperforms traditional feature extraction methods used for pattern recognition in terms of recognition accuracy for face images with pose and illumination variations.
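
As an illustration of this kind of pipeline, the following is a minimal sketch, not the authors' method: it extracts simple wavelet-energy features with PyWavelets and classifies them with a k-NN classifier on the Olivetti faces, which scikit-learn distributes as a version of the AT&T database. The specific decompositions (CWD/WPD), the feature design and the neural network classifier from the abstract are not reproduced; the wavelet, level and split are assumptions.

```python
# Minimal sketch: wavelet-energy features + k-NN on the AT&T (Olivetti) faces.
# Illustrative only; the paper's CWD/WPD feature model and ANN classifier differ.
import numpy as np
import pywt
from sklearn.datasets import fetch_olivetti_faces   # downloads the ORL/AT&T faces
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def wavelet_energy_features(img, wavelet="db2", level=3):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    feats = [np.sum(coeffs[0] ** 2)]                 # approximation subband energy
    for detail in coeffs[1:]:                        # (cH, cV, cD) per level
        feats.extend(np.sum(band ** 2) for band in detail)
    return np.log1p(feats)                           # compress dynamic range

faces = fetch_olivetti_faces()
X = np.array([wavelet_energy_features(img) for img in faces.images])
X_tr, X_te, y_tr, y_te = train_test_split(X, faces.target, test_size=0.25,
                                          stratify=faces.target, random_state=0)
knn = KNeighborsClassifier(n_neighbors=1).fit(X_tr, y_tr)
print("k-NN accuracy on wavelet-energy features:", knn.score(X_te, y_te))
```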

Keywords: Face recognition, Image analysis, Wavelet feature extraction, Pattern recognition, Classifier algorithms

4134 Evolutionary Approach for Automated Discovery of Censored Production Rules

Authors: Kamal K. Bharadwaj, Basheer M. Al-Maqaleh

Abstract:

In the recent past, there has been an increasing interest in applying evolutionary methods to Knowledge Discovery in Databases (KDD), and a number of successful applications of Genetic Algorithms (GA) and Genetic Programming (GP) to KDD have been demonstrated. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) of the form If P Then D. PRs, however, are unable to handle exceptions and do not exhibit variable precision. Censored Production Rules (CPRs), an extension of PRs proposed by Michalski and Winston, exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form: If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type, we are free to ignore the exception condition when the resources needed to establish its presence are tight or there is simply no information available as to whether it holds or not. Thus, the 'If P Then D' part of the CPR expresses the important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. This paper presents a classification algorithm based on an evolutionary approach that discovers comprehensible rules with exceptions in the form of CPRs. The proposed approach has a flexible chromosome encoding, where each chromosome corresponds to a CPR. Appropriate genetic operators are suggested, and a fitness function is proposed that incorporates the basic constraints on CPRs. Experimental results are presented to demonstrate the performance of the proposed algorithm.
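
To make the encoding concrete, here is a minimal toy sketch, not the authors' algorithm: each chromosome encodes one "If P Then D Unless C" rule over binary attributes, and the fitness rewards rules whose premise fires often and whose conclusion holds unless the censor is present. The dataset, operators and fitness are illustrative assumptions that simplify the constraints described in the paper.

```python
# Toy GA over Censored Production Rules "If P Then D Unless C" on binary data.
# Chromosome: (P mask, C mask, target class D). Fitness is illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n, m = 400, 6
X = rng.integers(0, 2, size=(n, m))
y = (X[:, 0] & X[:, 1] & (1 - X[:, 4])).astype(int)   # hidden rule with an exception

def random_rule():
    return rng.integers(0, 2, m), rng.integers(0, 2, m), 1

def fitness(rule):
    p_mask, c_mask, d = rule
    fires = (X[:, p_mask == 1] == 1).all(axis=1)       # premise P satisfied
    censored = c_mask.any() and (X[:, c_mask == 1] == 1).all(axis=1)
    predicted = np.where(fires & ~np.asarray(censored), d, 1 - d)
    accuracy = (predicted == y).mean()
    coverage = fires.mean()
    return accuracy * coverage                         # favour accurate, general rules

def mutate(rule):
    p, c, d = rule[0].copy(), rule[1].copy(), rule[2]
    i = rng.integers(m)
    (p if rng.random() < 0.5 else c)[i] ^= 1           # flip one P or C condition
    return p, c, d

pop = [random_rule() for _ in range(40)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:20] + [mutate(r) for r in pop[:20]]     # elitism + mutation
best = max(pop, key=fitness)
print("P attributes:", np.where(best[0])[0], "Censor attributes:", np.where(best[1])[0])
```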

Keywords: Censored Production Rule, Data Mining, Machine Learning, Evolutionary Algorithms.

4133 A Multilevel Comparative Assessment Approach to International Services Trade Competitiveness: The Case of Romania and Bulgaria

Authors: Ana Bobirca, Paul-Gabriel Miclaus

Abstract:

International competitiveness receives much attention nowadays, but its assessment has so far been based heavily on manufacturing industry statistics. This paper addresses the need for competitiveness indicators that cover the service sector and sets out a multilevel framework for measuring international services trade competitiveness. The approach undertaken here aims at comparatively examining the international competitiveness of the EU-25 (the twenty-five European Union member states before the 1st of January 2007), Romanian and Bulgarian services trade, as well as the latter two countries' structure of specialization on the EU-25 services market. The primary changes in the international competitiveness of three major services sectors (transportation, travel and other services) are analyzed. This research attempts to determine the ability of the two recent European Union (EU) member states to contend with the challenges that might arise from the intense competition within the enlarged EU in the field of services trade.

Keywords: Bulgaria, EU-25, international competitiveness, international services trade, Romania.

4132 Model-Based Small Area Estimation with Application to Unemployment Estimates

Authors: Hichem Omrani, Philippe Gerber, Patrick Bousch

Abstract:

The problem of Small Area Estimation (SAE) is complex because of the variety of information sources and insufficient data. In this paper, an approach for SAE is presented to support decision-making at the national, regional and local levels. We propose an Empirical Best Linear Unbiased Predictor (EBLUP) as an estimator in order to combine several information sources to evaluate various indicators. First, we present the urban audit project and its environmental, social and economic indicators. Secondly, we propose an approach for decision making in order to estimate the indicators. An application is used to validate the theoretical proposal. Finally, a decision support system based on an open-source environment is presented.
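
For readers unfamiliar with the estimator, the following is a minimal sketch of an area-level (Fay-Herriot style) EBLUP, one common way such a predictor is computed: the small-area prediction shrinks the direct survey estimate toward a regression-synthetic estimate. The model form, the moment-style variance estimator and the synthetic data are illustrative assumptions and are not the setup used in the paper.

```python
# Minimal area-level EBLUP sketch: EBLUP_i = gamma_i * direct_i + (1 - gamma_i) * x_i' beta.
# Illustrative assumptions: known sampling variances D_i, crude moment estimate of sigma_v^2.
import numpy as np

rng = np.random.default_rng(0)
m, p = 30, 2                                   # number of small areas, covariates
X = np.column_stack([np.ones(m), rng.normal(size=(m, p - 1))])
beta_true, sigma_v2 = np.array([5.0, 2.0]), 0.5
D = rng.uniform(0.2, 1.0, size=m)              # known sampling variances
theta = X @ beta_true + rng.normal(scale=np.sqrt(sigma_v2), size=m)   # true area means
y = theta + rng.normal(scale=np.sqrt(D))       # direct (survey) estimates

# 1) crude moment estimate of the area-effect variance sigma_v^2
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_ols
sigma_v2_hat = max(0.0, (resid @ resid - D.sum() * (1 - p / m)) / (m - p))

# 2) weighted least squares for beta given the estimated variances
W = 1.0 / (sigma_v2_hat + D)
beta_hat = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))

# 3) EBLUP: gamma_i weights the direct estimate against the synthetic estimate
gamma = sigma_v2_hat / (sigma_v2_hat + D)
eblup = gamma * y + (1 - gamma) * (X @ beta_hat)
print("mean abs error, direct:", np.abs(y - theta).mean(),
      "EBLUP:", np.abs(eblup - theta).mean())
```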

Keywords: Small area estimation, statistical method, sampling, empirical best linear unbiased predictor (EBLUP), decision-making.

4131 An LMI Approach of Robust H∞ Fuzzy State-Feedback Controller Design for HIV/AIDS Infection System with Dual Drug Dosages

Authors: Wudhichai Assawinchaichote

Abstract:

This paper examines the problem of designing robust H∞ controllers for an HIV/AIDS infection system with dual drug dosages described by a Takagi-Sugeno (TS) fuzzy model. Based on a linear matrix inequality (LMI) approach, we develop an H∞ controller which guarantees that the L2-gain of the mapping from the exogenous input noise to the regulated output is less than some prescribed value for the system. A sufficient condition for the controller for this system is given in terms of Linear Matrix Inequalities (LMIs). The effectiveness of the proposed controller design methodology is finally demonstrated through simulation results. It has been shown that anti-HIV vaccines are critically important in reducing the infected cells.

Keywords: H∞ Fuzzy control; Takagi-Sugeno (TS) fuzzy model; Linear Matrix Inequalities (LMIs); HIV/AIDS infection system

4130 Feature Reduction of Nearest Neighbor Classifiers using Genetic Algorithm

Authors: M. Analoui, M. Fadavi Amiri

Abstract:

The design of a pattern classifier includes an attempt to select, among a set of possible features, a minimum subset of weakly correlated features that better discriminate the pattern classes. This is usually a difficult task in practice, normally requiring the application of heuristic knowledge about the specific problem domain. The selection and quality of the features representing each pattern have a considerable bearing on the success of subsequent pattern classification. Feature extraction is the process of deriving new features from the original features in order to reduce the cost of feature measurement, increase classifier efficiency, and allow higher classification accuracy. Many current feature extraction techniques involve linear transformations of the original pattern vectors to new vectors of lower dimensionality. While this is useful for data visualization and for increasing classification efficiency, it does not necessarily reduce the number of features that must be measured, since each new feature may be a linear combination of all of the features in the original pattern vector. In this paper, a new approach to feature extraction is presented in which feature selection, feature extraction, and classifier training are performed simultaneously using a genetic algorithm. In this approach, each feature value is first normalized by a linear equation and then scaled by the associated weight prior to training, testing, and classification. A k-NN classifier is used to evaluate each set of feature weights. The genetic algorithm optimizes a vector of feature weights, which are used to scale the individual features in the original pattern vectors in either a linear or a nonlinear fashion. With this approach, the number of features used in classification can be substantially reduced.
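
The core loop can be pictured with a small sketch, illustrative only and not the authors' implementation: a simple genetic algorithm evolves a weight vector, each candidate is scored by the cross-validated accuracy of a k-NN classifier on the weighted features, and near-zero weights effectively deselect features. The population size, operators, thresholds and dataset are assumptions.

```python
# Sketch: GA-evolved feature weights scored by k-NN cross-validation accuracy.
# Weights close to zero act as feature deselection.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(0)
X, y = load_wine(return_X_y=True)
X = MinMaxScaler().fit_transform(X)            # the "linear normalization" step
n_feat, pop_size, gens = X.shape[1], 30, 25

def fitness(w):
    return cross_val_score(KNeighborsClassifier(n_neighbors=5), X * w, y, cv=3).mean()

pop = rng.random((pop_size, n_feat))
for _ in range(gens):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]          # truncation selection
    children = []
    for _ in range(pop_size - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        child = np.where(rng.random(n_feat) < 0.5, a, b)        # uniform crossover
        child += rng.normal(scale=0.1, size=n_feat) * (rng.random(n_feat) < 0.2)
        children.append(np.clip(child, 0.0, 1.0))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(w) for w in pop])]
print("selected features (weight > 0.1):", np.where(best > 0.1)[0])
```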

Keywords: Feature reduction, genetic algorithm, pattern classification, nearest neighbor rule classifiers (k-NNR).

4129 Split-Pipe Design of Water Distribution Networks Using a Combination of Tabu Search and Genetic Algorithm

Authors: J. Tospornsampan, I. Kita, M. Ishii, Y. Kitamura

Abstract:

In this paper, a combination approach of two heuristic-based algorithms, a genetic algorithm and tabu search, is proposed. It has been developed to obtain the least cost based on the split-pipe design of looped water distribution networks. The proposed combination algorithm has been applied to solve three well-known water distribution networks taken from the literature. The development of the combination of these two heuristic-based algorithms for optimization aims at enhancing their strengths and compensating for their weaknesses. Tabu search is a rather systematic and deterministic method that uses adaptive memory in the search process, while the genetic algorithm is a probabilistic and stochastic optimization technique in which the solution space is explored by generating candidate solutions. Split-pipe design may not be realistic in practice, but for optimization purposes, optimal solutions are always achieved with split-pipe design. The solutions obtained in this study show that the least-cost solutions obtained from the split-pipe design are always better than those obtained from the single-pipe design. The results obtained from the combination approach show its ability and effectiveness in solving combinatorial optimization problems. The solutions obtained are very satisfactory and of high quality; the solutions for two of the networks are the lowest-cost solutions yet presented in the literature. The concept of the combination approach proposed in this study is expected to contribute useful benefits to diverse problems.

Keywords: GAs, Heuristics, Looped network, Least-cost design, Pipe network, Optimization, TS

4128 Embedding the Dimensions of Sustainability into City Information Modelling

Authors: Ali M. Al-Shaery

Abstract:

The purpose of this paper is to address the functions of sustainability dimensions in city information modelling and to present the sustainability criteria required to support establishing a sustainable planning framework for enhancing existing cities and developing future smart cities. The paper is divided into two sections. The first section is based on the examination of a wide and extensive array of cross-disciplinary literature from the last decade and a half to conceptualize the terms 'sustainable' and 'smart city', and to map their associated criteria to city information modelling. The second section is based on analyzing two approaches relating to city information modelling, namely statistical and dynamic approaches, and their suitability for the development of cities' action plans. The paper argues that the use of statistical approaches to embed sustainability dimensions in city information modelling has limited value. Despite the popularity of such approaches in addressing other dimensions, like utility and service management, in the development and action plans of world cities, these approaches are unable to address the dynamics across various city sectors with regard to economic, environmental and social criteria. The paper suggests an integrative, dynamic and cross-disciplinary planning approach to embedding sustainability dimensions in city information modelling frameworks. Such an approach will pave the way towards optimal planning and implementation of priority actions, projects and investments. The approach can be used to achieve three main goals: (1) producing better development and action plans for world cities; (2) serving the development of an integrative, dynamic and cross-disciplinary framework that incorporates economic, environmental and social sustainability criteria; and (3) addressing areas that require further attention in the development of future sustainable and smart cities. The paper presents an innovative approach to city information modelling and a well-argued, balanced hierarchy of sustainability criteria that can contribute to an area of research which is still in its infancy in terms of development and management.

Keywords: Information modelling, smart city, sustainable city, sustainability dimensions, sustainability criteria, city development planning.

4127 Long-Term Economic-Ecological Assessment of Optimal Local Heat-Generating Technologies for the German Unrefurbished Residential Building Stock on the Quarter Level

Authors: M. A. Spielmann, L. Schebek

Abstract:

In order to reach the long-term national climate goals of the German government for the building sector, substantial energetic measures have to be executed. Historically, those measures have primarily been energy efficiency measures at the buildings' shells. Advanced technologies for the on-site generation of heat (or other types of energy) are often not feasible at the small spatial scale of a single building. Therefore, the present approach uses the spatially larger dimension of a quarter. The main focus of the present paper is the long-term economic-ecological assessment of available decentralized heat-generating technologies (CHP power plants and electrical heat pumps) at the quarter level for German unrefurbished residential buildings. Three distinct terms have to be described methodologically: i) the quarter approach, ii) the economic assessment, iii) the ecological assessment. The quarter approach is used to enable synergies and scaling effects beyond a single building. For the present study, generic quarters are used that are differentiated according to significant parameters concerning their heat demand; the core differentiation of those quarters is made by the construction period of the buildings. The economic assessment, the second crucial element, is executed with the following structure: full costs are quantified for each technology combination and quarter. The investment costs are analyzed on an annual basis and are modeled with the acquisition of debt, assuming annuity loans. Consequently, for each generic quarter, an optimal technology combination for decentralized heat generation is provided for each year within the temporal boundaries (2016-2050). The ecological assessment elaborates a life cycle assessment for each technology combination and each quarter; the impact category measured is GWP 100. The technology combinations for heat production can therefore be compared against each other concerning their long-term climatic impacts. The core results of the approach can be differentiated into an economic and an ecological dimension. With an annual resolution, the investment and running costs of different energetic technology combinations are quantified. For each quarter, an optimal technology combination for local heat supply and/or energetic refurbishment of the buildings within the quarter is provided. Coherently with the economic assessment, the climatic impacts of the technology combinations are quantified and compared against each other.

Keywords: Building sector, heat, LCA, quarter level, systemic approach.

4126 A Hybrid Approach for Color Image Quantization Using K-means and Firefly Algorithms

Authors: Parisut Jitpakdee, Pakinee Aimmanee, Bunyarit Uyyanonvara

Abstract:

Color image quantization (CQ) is an important problem in computer graphics and image processing. The aim of quantization is to reduce the number of colors in an image with minimum distortion. Clustering is a widely used technique for color quantization; all colors in an image are grouped into small clusters. In this paper, we propose a new hybrid approach for color quantization using the firefly algorithm (FA) and the K-means algorithm. The firefly algorithm is a swarm-based algorithm that can be used for solving optimization problems. The proposed method can overcome drawbacks of both algorithms, such as the local optima problem in K-means and the premature convergence of the firefly algorithm. Experiments on three commonly used images and the comparison of results show that the proposed algorithm surpasses both the baseline K-means clustering and the original firefly algorithm.
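
As a rough picture of how such a hybrid can be wired together, here is a minimal sketch of the general FA-plus-K-means structure, not the authors' code: each firefly encodes a candidate color palette, brightness is the negative quantization error, and the best firefly's palette is refined with K-means. The test image, swarm size and FA parameters are assumptions.

```python
# Sketch of a firefly + K-means hybrid for color quantization.
# Each firefly = a candidate palette; brightness = -mean squared quantization error.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_sample_image

pixels = load_sample_image("china.jpg").reshape(-1, 3).astype(float) / 255.0
pixels = pixels[np.random.default_rng(0).choice(len(pixels), 5000, replace=False)]
K, n_fireflies, iters = 8, 12, 30
rng = np.random.default_rng(1)

def quant_error(palette):
    d = ((pixels[:, None, :] - palette[None]) ** 2).sum(-1)   # pixel-to-color distances
    return d.min(axis=1).mean()

flies = rng.random((n_fireflies, K, 3))
beta0, gamma, alpha = 1.0, 1.0, 0.05
for _ in range(iters):
    err = np.array([quant_error(f) for f in flies])
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if err[j] < err[i]:                               # move toward brighter fly
                r2 = ((flies[i] - flies[j]) ** 2).sum()
                flies[i] += beta0 * np.exp(-gamma * r2) * (flies[j] - flies[i])
                flies[i] += alpha * (rng.random((K, 3)) - 0.5)
    flies = np.clip(flies, 0.0, 1.0)

best = flies[np.argmin([quant_error(f) for f in flies])]
palette = KMeans(n_clusters=K, init=best, n_init=1).fit(pixels).cluster_centers_
print("final quantization MSE:", quant_error(palette))
```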

Keywords: Clustering, Color quantization, Firefly algorithm, K-means.

4125 Fault-Tolerant Optimal Broadcast Algorithm for the Hypercube Topology

Authors: Lokendra Singh Umrao, Ravi Shankar Singh

Abstract:

This paper presents an optimal broadcast algorithm for hypercube networks. The main focus of the paper is the effectiveness of the algorithm in the presence of many node faults. For the optimal solution, our algorithm builds a spanning tree connecting all nodes of the network, through which messages are propagated from the source node to the remaining nodes. At any given time, at most n − 1 nodes may fail by crashing. We show that hypercube networks are strongly fault-tolerant. Simulation results analyze the algorithm's characteristics under many node faults. We have compared the simulation results of our proposed method with Fu's method: Fu's approach cannot tolerate n − 1 faulty nodes in the worst case, whereas our approach can.
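
For orientation, the following is a minimal sketch of the fault-free building block, a spanning binomial tree broadcast over a d-dimensional hypercube; the fault-tolerant rerouting that constitutes the paper's actual contribution is not reproduced here.

```python
# Spanning binomial tree broadcast on a d-dimensional hypercube (fault-free sketch).
# In round k, every node that already holds the message forwards it across dimension k.
def hypercube_broadcast(d, source=0):
    informed = {source}
    schedule = []                          # (round, sender, receiver) triples
    for k in range(d):
        newly = set()
        for u in sorted(informed):
            v = u ^ (1 << k)               # neighbour across dimension k
            if v not in informed:
                schedule.append((k, u, v))
                newly.add(v)
        informed |= newly
    return schedule

for step in hypercube_broadcast(3):
    print("round %d: node %d -> node %d" % step)
```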

Keywords: Fault tolerance, hypercube, broadcasting, link/node faults, routing.

4124 Project Objective Structure Model: An Integrated, Systematic and Balanced Approach in Order to Achieve Project Objectives

Authors: Mohammad Reza Oftadeh

Abstract:

The purpose of this article is to describe the project objective structure (POS) concept, which was developed from research activities and experience in project management, the Balanced Scorecard (BSC) and the European Foundation for Quality Management Excellence Model (EFQM Excellence Model). Furthermore, this paper tries to define a balanced, systematic and integrated measurement approach for meeting project objectives and project strategic goals based on a process-oriented model. In this paper, POS is suggested as a means to measure project performance across the project life cycle. Using the POS model, the project manager can ensure that the project objectives stated in the project charter are achieved. This concept can help project managers to implement integrated and balanced monitoring and control of project work.

Keywords: Project objectives, project performance management, PMBOK, key performance indicators, integration management.

4123 Communicative and Artistic Machines: A Survey of Models and Experiments on Artificial Agents

Authors: Artur Matuck, Guilherme F. Nobre

Abstract:

Machines can be tools, media, or social agents. Advances in technology have been delivering machines capable of autonomous expression, both through communication and through art. This paper deals with models (a theoretical approach) and experiments (an applied approach) related to artificial agents. On the one hand, it traces how social science scholars have worked with topics such as text automatization, man-machine writing cooperation, and communication. On the other hand, it covers how computer science scholars have built communicative and artistic machines, including the programming of creativity. The aim is to present a brief survey of artificially intelligent communicators and artificially creative writers, and to provide a basis for understanding meta-authorship as well as new and further forms of man-machine co-authorship.

Keywords: Artificial communication, artificial creativity, artificial writers, meta-authorship, robotic art.

4122 Reduction of False Positives in Head-Shoulder Detection Based on Multi-Part Color Segmentation

Authors: Lae-Jeong Park

Abstract:

The paper presents a method that utilizes figure-ground color segmentation to extract an effective global feature for false positive reduction in head-shoulder detection. Conventional detectors, which rely on local features such as HOG for real-time operation, suffer from false positives. The color cue in an input image provides salient information on a global characteristic, which is necessary to alleviate the false positives of local-feature-based detectors. An effective approach that uses figure-ground color segmentation has previously been presented in an effort to reduce false positives in object detection. In this paper, an extended version of the approach is presented that adopts separate multi-part foregrounds instead of a single prior foreground and performs the figure-ground color segmentation with each of the foregrounds. The multi-part foregrounds include the parts of the head-shoulder shape and additional auxiliary foregrounds optimized by a search algorithm. A classifier is constructed with a feature that consists of the set of resulting segmentations. Experimental results show that the presented method can discriminate more false positives than the single-prior-shape-based classifier as well as detectors based on local features. The improvement is possible because the presented approach can reduce the false positives that have the same colors in the head and shoulder foregrounds.

Keywords: Pedestrian detection, color segmentation, false positives, feature extraction.

4121 An Interactive Ontology Visualization Approach for the Networked Home Environment

Authors: Ilkka Niskanen, Jarmo Kalaoja, Julia Kantorovitch, Toni Piirainen

Abstract:

Ontologies are broadly used in the context of networked home environments. With ontologies it is possible to define and store context information, as well as to model different kinds of physical environments. Ontologies are central to networked home environments as they carry the meaning. However, ontologies and the OWL language are complex. Several ontology visualization approaches have been developed to enhance the understanding of ontologies. The domain of networked home environments sets some special requirements for an ontology visualization approach. The visualization tool presented here visualizes ontologies in a domain-specific way: it effectively represents the physical structures and spatial relationships of networked home environments. In addition, it provides extensive interaction possibilities for editing and manipulating the visualization. The tool narrows the gap from beginner to intermediate OWL ontology reader by visualizing instances in their actual locations and by making OWL ontologies more interesting, more concrete and, above all, easier to comprehend.

Keywords: Ontologies, visualization, interaction.

4120 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: G. Candel, D. Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely adopted. It supports two main tasks: displaying results by coloring items according to their class or a feature value, and forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, in which all neighbors in a high-dimensional space cannot be represented correctly in a low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, in which the area of a cluster is proportional to its size in number of items, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from the high- to the low-dimensional space is computed but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and it would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped to exactly the same position, making them indistinguishable, and this type of model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with the newly obtained embedding. The successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. This method has the same complexity as t-SNE per embedding, and the memory requirement is only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing us to observe the birth, evolution and death of clusters. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of high-dimensional datasets' dynamics.
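
The idea of anchoring successive embeddings on shared support points can be approximated with off-the-shelf tools. The sketch below is an illustrative stand-in: it reuses previous coordinates only as the initialization of scikit-learn's t-SNE rather than adding the paper's explicit matching cost, and the dataset and split are assumptions.

```python
# Sketch: keep successive t-SNE embeddings coherent by pinning shared "support"
# points to their previous coordinates through the initialization.
# This only approximates the paper's method, which adds an explicit matching cost.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

X, _ = load_digits(return_X_y=True)
support, batch_t0, batch_t1 = X[:300], X[300:900], X[900:1500]

def embed(batch, support, support_pos=None, seed=0):
    data = np.vstack([support, batch])
    init = PCA(n_components=2).fit_transform(data).astype(np.float32)
    if support_pos is not None:
        init[: len(support)] = support_pos              # pin the support points
    emb = TSNE(n_components=2, init=init, random_state=seed).fit_transform(data)
    return emb[: len(support)], emb[len(support):]

sup_pos, emb_t0 = embed(batch_t0, support)                   # embedding at time t0
_, emb_t1 = embed(batch_t1, support, support_pos=sup_pos)    # coherent embedding at t1
print(emb_t0.shape, emb_t1.shape)
```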

Keywords: Concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning.

4119 Designing Creative Events with Deconstructivism Approach

Authors: Maryam Memarian, Mahmood Naghizadeh

Abstract:

Deconstruction is an approach that is entirely incompatible with the prevalent traditional architecture. This approach attempts to put architecture in sharp contrast with its opposite, attends to the neglected and missing aspects of architecture, and deconstructs its stable structures. It also boldly proceeds beyond the existing frameworks and intends to create a different and more efficient prospect for space. The aim of deconstruction architecture is to satisfy both prospective and retrospective visions and to take into account all tastes of the present in order to transcend time. Likewise, it ventures to fragment the facts and symbols of the past and to extract from them new concepts that coincide with today's circumstances. Since this approach is an attempt to surpass the limits of the prevalent architecture, it can be employed to design places in which creative events occur and imagination and ambition flourish. Thought-provoking artistic events can grow and mature in such places and be represented in the best way possible to all people. The concept of event proposed in the plan grows out of the interaction between space and creation. In addition to triggering surprise and strong impressions, it is also considered a bold journey into the suspended realms of the traditional conflicts in architecture, such as architecture-landscape, interior-exterior, center-margin, product-process, and stability-instability. In this project, recognition and organizing first take place through an interpretive-historical research method, examination of the inputs, and data collection. After evaluating the obtained data using deductive reasoning, the data are interpreted. Given that the research topic is in its infancy, that there is no similar case in Iran, and that corresponding instances across the world are limited in number, the selected topic helps to shed light on unrevealed and neglected parts of architecture. Similarly, criticizing, investigating and comparing specific and highly prized cases in other countries with the project under study can serve as an introduction to this architectural style.

Keywords: Creativity, deconstruction, event.

4118 Attention-Based Spatio-Temporal Approach for Fire and Smoke Detection

Authors: A. Mirrashid, M. Khoshbin, A. Atghaei, H. Shahbazi

Abstract:

In various industries, smoke and fire are two of the most important threats in the workplace. One of the common methods for detecting smoke and fire is the use of infrared thermal and smoke sensors, which cannot be used in outdoor applications. Therefore, the use of vision-based methods seems necessary. The problem of smoke and fire detection is spatiotemporal and requires spatiotemporal solutions. This paper presents a method that uses spatial features along with temporal-based features to detect smoke and fire in the scene. It consists of three main parts; the task of each part is to reduce the error of the previous part so that the final model has a robust performance. This method also uses transformer modules to increase the accuracy of the model. The results of our model show the proper performance of the proposed approach in solving the problem of smoke and fire detection and can be used to increase workplace safety.

Keywords: Attention, fire detection, smoke detection, spatiotemporal.

4117 Verification of a Locked CFD Approach to Cool Down Modeling

Authors: P. Bárta

Abstract:

Increasing demands on the performance of Subsea Production Systems (SPS) suggest a need for a more detailed investigation of the fluid behavior taking place in subsea equipment. Complete CFD cool down analyses of subsea equipment are very time-consuming. The objective of this paper is to investigate a Locked CFD approach, which enables a significant reduction of the computational time while maintaining sufficient accuracy during thermal cool down simulations. The comparison of the results of a dead leg simulation using the Full CFD approach and the three LCFD methods confirms the validity of the locked flow field assumption for the selected case. For the tested case, an LCFD speed-up by a factor of 200 results in an absolute thermal error of 0.5 °C (3% relative error), while a speed-up by a factor of 10 keeps the LCFD results within 0.1 °C (0.5% relative error) of the Full CFD results.

Keywords: CFD, Locked Flow Field, Speed up of CFD simulation time, Subsea

4116 On Measuring the Reusability Proneness of Mobile Applications

Authors: Fathi Taibi

Abstract:

The abnormal increase in the number of applications available for download in Android markets is a good indication that they are being reused. However, little is known about their real reusability potential. A considerable number of these applications are reported as having poor quality or being malicious. Hence, in this paper, an approach to measure the reusability potential of classes in Android applications is proposed. The approach is not meant specifically for this particular type of application; rather, it is intended for Object-Oriented (OO) software systems in general, and it also aims to provide the means to prevent the classes of low-quality and defect-prone applications from being reused directly through inheritance and instantiation. An empirical investigation is conducted to measure and rank the reusability potential of the classes of randomly selected Android applications. The results obtained are thoroughly analyzed in order to understand the extent of this potential and the factors influencing it.

Keywords: Reusability, Software Quality Factors, Software Metrics, Empirical Investigation, Object-Oriented Software, Android Applications.

4115 Low-Cost Space-Based Geoengineering: An Assessment Based on Self-Replicating Manufacturing of in-Situ Resources on the Moon

Authors: Alex Ellery

Abstract:

Geoengineering approaches to climate change mitigation are unpopular and regarded with suspicion. Of these, space-based approaches are regarded as unworkable and enormously costly. Here, a space-based approach is presented that is modest in cost, fully controllable and reversible, and acts as a natural spur to the development of solar power satellites over the longer term as a clean source of energy. The low-cost approach exploits self-replication technology which it is proposed may be enabled by 3D printing technology. Self-replication of 3D printing platforms will enable mass production of simple spacecraft units. Key elements being developed are 3D-printable electric motors and 3D-printable vacuum tube-based electronics. The power of such technologies will open up enormous possibilities at low cost including space-based geoengineering.

Keywords: 3D printing, in-situ resource utilization, self-replication technology, space-based geoengineering.

4114 Loop Back Connected Component Labeling Algorithm and Its Implementation in Detecting Face

Authors: A. Rakhmadi, M. S. M. Rahim, A. Bade, H. Haron, I. M. Amin

Abstract:

In this study, a Loop Back Algorithm for connected component labeling for detecting objects in a digital image is presented. The approach uses a loop back connected component labeling algorithm that helps the system distinguish the detected objects according to their labels. Unlike the whole-window scanning technique, this technique reduces the search time for locating an object by focusing on suspected objects based on certain predefined features. In this study, the approach was also implemented in a face detection system. Face detection is becoming an interesting research area since many devices or systems require detecting faces for certain purposes. The input can be still images or videos; therefore, the sub-processes of such a system have to be simple, efficient and accurate to give good results.
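
For context, a standard connected component labeling pass over a binary image can be sketched as follows. This is a generic flood-fill baseline with assumed 4-connectivity, shown for comparison only; it is not the authors' loop back variant.

```python
# Generic connected component labeling on a binary image (flood fill, 4-connectivity).
# Baseline sketch only; the paper's loop back labeling strategy is not reproduced.
import numpy as np
from collections import deque

def label_components(binary):
    labels = np.zeros_like(binary, dtype=int)
    current = 0
    h, w = binary.shape
    for i in range(h):
        for j in range(w):
            if binary[i, j] and labels[i, j] == 0:
                current += 1
                labels[i, j] = current
                queue = deque([(i, j)])
                while queue:                          # flood fill one component
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = current
                            queue.append((ny, nx))
    return labels, current

img = np.array([[1, 1, 0, 0, 1],
                [0, 1, 0, 1, 1],
                [0, 0, 0, 0, 0],
                [1, 0, 1, 1, 0]])
labels, n = label_components(img)
print(n, "components\n", labels)
```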

Keywords: Image processing, connected components labeling, face detection.

4113 Emergentist Metaphorical Creativity: Towards a Model of Analysing Metaphorical Creativity in Interactive Talk

Authors: Afef Badri

Abstract:

Metaphorical creativity is not a static property of discourse; it is a dynamic, interactive process created online. There has been a lack of research concerning metaphorical creativity produced online. This paper intends to account for metaphorical creativity in online talk-in-interaction as a dynamic process that emerges as discourse unfolds. It brings together insights from the emergentist approach to the study of metaphor in verbal interactions and insights from the conceptual blending approach to analysing online metaphorical constructions in order to propose a model for studying metaphorical creativity in interactive talk. The model is based on three focal points. First, metaphorical creativity is a dynamic, emergent and open-to-change process that evolves in real time as interlocutors constantly blend and re-blend previous metaphorical contributions. Second, it is not a product of isolated individual minds but a joint achievement that is co-constructed and co-elaborated by the interlocutors. The third and most important point is that the emergent process of metaphorical creativity is tightly shaped by the contextual variables surrounding talk-in-interaction: it is grounded in the interlocutors' framework of interpretation; it is constrained by preceding contributions in a way that creates textual cohesion in the verbal exchange; and it is a goal-oriented process predefined by the communicative intention of each participant in a way that reveals the ideological coherence or incoherence of the entire conversation.

Keywords: Communicative intention, conceptual blending, contextual variables, the emergentist approach, ideological coherence, metaphorical creativity, textual cohesion

4112 Chua’s Circuit Regulation Using a Nonlinear Adaptive Feedback Technique

Authors: Abolhassan Razminia, Mohammad-Ali Sadrnia

Abstract:

Chua's circuit is one of the most important electronic devices used for chaos and bifurcation studies, and it plays a central role in secure communication. Since adaptive control is used extensively in linear systems control, here we introduce a new application of the adaptive method in the field of chaos control. In this paper, we derive a new adaptive control scheme for regulating Chua's circuit, because the control of chaos is often very important in practical operations. The novelty of this approach lies in its robustness against external perturbations, which are simulated as additive noise in all measured states, and it can be generalized to other chaotic systems. Our approach is based on Lyapunov analysis, and an adaptation law is considered for the feedback gain; because of this, we have named it NAFT (Nonlinear Adaptive Feedback Technique). Finally, simulations show the capability of the presented technique for Chua's circuit.
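
To fix ideas, the sketch below simulates the dimensionless Chua equations with a generic adaptive state feedback whose gain grows according to a Lyapunov-motivated rule. The specific law and parameter values are illustrative assumptions; measurement noise and the actual NAFT design are not reproduced.

```python
# Dimensionless Chua's circuit with a generic adaptive state feedback u = -k(t) * x,
# where the gain adapts as dk/dt = gamma * x^2 (illustrative, not the paper's NAFT law).
import numpy as np
from scipy.integrate import solve_ivp

alpha, beta, m0, m1, gamma = 9.0, 14.28, -1.143, -0.714, 2.0

def chua_adaptive(t, s):
    x, y, z, k = s
    fx = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))   # piecewise-linear diode
    u = -k * x                                                   # adaptive feedback
    dx = alpha * (y - x - fx) + u
    dy = x - y + z
    dz = -beta * y
    dk = gamma * x ** 2                                          # gain adaptation law
    return [dx, dy, dz, dk]

sol = solve_ivp(chua_adaptive, (0.0, 60.0), [0.7, 0.0, 0.0, 0.0], max_step=0.01)
x_end, k_end = sol.y[0, -1], sol.y[3, -1]
print("final |x| = %.4f with adapted gain k = %.2f" % (abs(x_end), k_end))
```

Driving x toward zero with the adapted gain also quiets the y and z states, since the remaining linear subsystem is stable once x is regulated.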

Keywords: Chaos, adaptive control, nonlinear control, Chua's circuit.

4111 Multi-Line Flexible Alternating Current Transmission System (FACTS) Controller for Transient Stability Analysis of a Multi-Machine Power System Network

Authors: A. V. Naresh Babu, S. Sivanagaraju

Abstract:

Considerable progress has been achieved in transient stability analysis (TSA) with various FACTS controllers, but all of these controllers are associated with a single transmission line. This paper discusses a new approach, i.e., a multi-line FACTS controller, the interline power flow controller (IPFC), for the TSA of a multi-machine power system network. A mathematical model of the IPFC, termed the power injection model (PIM), is presented and incorporated in the Newton-Raphson (NR) power flow algorithm. Then, the reduced admittance matrix of a multi-machine power system network for a three-phase fault, without and with the IPFC, is obtained; this matrix is required to draw the machine swing curves. A general approach based on the L-index has also been discussed to find the best location of the IPFC to reduce the proximity to instability of a power system. Numerical results are carried out on two test systems, namely 6-bus and 11-bus systems. A program in MATLAB has been written to plot the variation of the generator rotor angle and speed difference curves without and with the IPFC for TSA, and a simple approach has also been presented to evaluate the critical clearing time for the test systems. The results obtained without and with the IPFC are compared and discussed.

Keywords: Flexible alternating current transmission system (FACTS), first swing stability, interline power flow controller (IPFC), power injection model (PIM).

4110 MIM: A Species Independent Approach for Classifying Coding and Non-Coding DNA Sequences in Bacterial and Archaeal Genomes

Authors: Achraf El Allali, John R. Rose

Abstract:

A number of competing methodologies have been developed to identify genes and classify DNA sequences into coding and non-coding sequences. This classification process is fundamental in gene finding and gene annotation tools and is one of the most challenging tasks in bioinformatics and computational biology. An information-theoretic measure based on mutual information has shown good accuracy in classifying DNA sequences into coding and non-coding. In this paper we describe a species-independent iterative approach that distinguishes coding from non-coding sequences using the mutual information measure (MIM). A set of sixty prokaryotes is used to extract universal training data. To facilitate comparisons with the published results of other researchers, a test set of 51 bacterial and archaeal genomes was used to evaluate MIM. These results demonstrate that MIM produces superior results while remaining species independent.
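
As background on the kind of statistic involved, here is a minimal sketch of estimating the mutual information between nucleotides a fixed distance apart in a sequence; real coding DNA additionally shows a characteristic period-3 component in such profiles. The estimator, distances and synthetic sequences are generic illustrations, not the MIM features defined in the paper.

```python
# Sketch: mutual information between bases separated by distance d in a DNA string.
# A sequence built from a restricted codon vocabulary shows higher MI at short
# distances than a uniform random sequence, which is the kind of contrast exploited.
import numpy as np
from collections import Counter

def mutual_information(seq, d):
    pairs = Counter(zip(seq, seq[d:]))
    n = sum(pairs.values())
    left, right = Counter(seq[:-d]), Counter(seq[d:])
    mi = 0.0
    for (a, b), c in pairs.items():
        p_ab = c / n
        p_a, p_b = left[a] / n, right[b] / n
        mi += p_ab * np.log2(p_ab / (p_a * p_b))
    return mi

rng = np.random.default_rng(0)
codons = ["ATG", "GCT", "AAA", "GGC", "TTC"]              # toy codon bias
coding_like = "".join(rng.choice(codons, size=400))
random_seq = "".join(rng.choice(list("ACGT"), size=1200))
for name, seq in [("coding-like", coding_like), ("random", random_seq)]:
    profile = [round(mutual_information(seq, d), 4) for d in range(1, 7)]
    print(name, profile)
```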

Keywords: Coding Non-coding Classification, Entropy, Gene Recognition, Mutual Information.

4109 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal

Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert

Abstract:

We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of the calculation of the vertex errors, ordering of the error values and removal of vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
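
The per-vertex error that drives the ordering can be sketched compactly on the CPU; the block below uses the standard quadric error metric with a half edge collapse cost as an illustration, while the GPU parallelism and the per-vertex boundaries described in the abstract are omitted. The toy mesh is an assumption.

```python
# CPU sketch of quadric-error-metric ordering for vertex removal via half edge
# collapse: each face adds the plane quadric K = p p^T to its vertices, and the
# removal cost of v is the cheapest collapse of v onto a neighbour u, evaluated
# as u~^T (Q_v + Q_u) u~. GPU parallelism and per-vertex boundaries are omitted.
import numpy as np

def vertex_quadrics(V, F):
    Q = np.zeros((len(V), 4, 4))
    for tri in F:
        p0, p1, p2 = V[tri]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) == 0.0:
            continue                                  # skip degenerate faces
        n = n / np.linalg.norm(n)
        plane = np.append(n, -n @ p0)                 # ax + by + cz + d = 0
        for v in tri:
            Q[v] += np.outer(plane, plane)
    return Q

def removal_costs(V, F, Q):
    Vh = np.hstack([V, np.ones((len(V), 1))])
    nbrs = [set() for _ in V]
    for a, b, c in F:
        nbrs[a] |= {b, c}; nbrs[b] |= {a, c}; nbrs[c] |= {a, b}
    costs = np.full(len(V), np.inf)
    for v, ns in enumerate(nbrs):
        for u in ns:                                  # half edge collapse v -> u
            costs[v] = min(costs[v], Vh[u] @ (Q[v] + Q[u]) @ Vh[u])
    return costs

# toy mesh: a flat square fan around a slightly raised apex vertex (index 4)
V = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0], [0.5, 0.5, 0.2]], float)
F = np.array([[0, 1, 4], [1, 2, 4], [2, 3, 4], [3, 0, 4]])
costs = removal_costs(V, F, vertex_quadrics(V, F))
print("removal costs:", np.round(costs, 4))
print("removal order (cheapest first):", np.argsort(costs))
```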

Keywords: Computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving.

4108 Patient-Specific Modeling Algorithm for Medical Data Based on AUC

Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper

Abstract:

Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use the AUC as the metric for selecting an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
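
The selection step can be illustrated with a small sketch, which is one interpretation of "AUC as the attribute-selection metric" rather than the authors' exact algorithm: at each step of the path, the attribute whose value-conditional class frequencies give the highest training AUC is chosen, and the training set is then filtered to instances matching the patient's value for that attribute. Discrete attributes, the stopping rules and the synthetic data are assumptions.

```python
# Sketch: build a patient-specific decision path, selecting attributes by AUC.
# Assumes discrete-valued attributes; stopping rules are illustrative.
import numpy as np
from sklearn.metrics import roc_auc_score

def attribute_auc(X, y, j):
    # score each instance by the empirical P(y = 1 | value of attribute j)
    scores = np.empty(len(y))
    for val in np.unique(X[:, j]):
        mask = X[:, j] == val
        scores[mask] = y[mask].mean()
    return roc_auc_score(y, scores)

def patient_specific_path(X, y, patient, max_depth=4, min_support=20):
    path, rows = [], np.ones(len(y), dtype=bool)
    for _ in range(max_depth):
        if y[rows].min() == y[rows].max():
            break                                    # current node is pure
        used = {j for j, _ in path}
        candidates = [j for j in range(X.shape[1]) if j not in used]
        best = max(candidates, key=lambda j: attribute_auc(X[rows], y[rows], j))
        narrowed = rows & (X[:, best] == patient[best])
        if narrowed.sum() < min_support:
            break                                    # too few matching instances
        path.append((best, patient[best]))
        rows = narrowed
    return path, y[rows].mean()                      # path and P(y = 1) estimate

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(600, 6))
y = ((X[:, 0] == 2) | (X[:, 3] == 0)).astype(int)
print(patient_specific_path(X, y, patient=X[0]))
```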

Keywords: Instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions.

4107 Proposing Problem-Based Learning as an Effective Pedagogical Technique for Social Work Education

Authors: Christine K. Fulmer

Abstract:

Social work education is competency-based in nature. There is an expectation that graduates of social work programs throughout the world are prepared to practice at a level of competence that is beneficial to both the well-being of individuals and the community. Experiential learning is one way to prepare students for competent practice. Problem-Based Learning (PBL) is a form of experiential education that has been used successfully in a number of disciplines to bridge the gap between theoretical concepts in the classroom and the real world. PBL aligns with the constructivist theoretical approach to learning, which emphasizes the integration of new knowledge with the beliefs students already hold. In addition, the basic tenets of PBL correspond well with the practice behaviors associated with social work practice, including multi-disciplinary collaboration and critical thinking. This paper makes an argument for utilizing PBL in social work education.

Keywords: Constructivist theoretical approach, experiential learning, pedagogy, problem-based learning, social work education.

4106 Multi-Criteria Based Robust Markowitz Model under Box Uncertainty

Authors: Pulak Swain, A. K. Ojha

Abstract:

Portfolio optimization deals with the problem of efficient asset allocation. Risk and expected return are two conflicting criteria in such problems, where the investor prefers the return to be high and the risk to be low. Using a multi-objective approach, we can solve these types of problems. However, the information we have for the input parameters is generally ambiguous, and the input values can fluctuate around some nominal values. We cannot ignore the uncertainty in the input values, as it can affect the asset allocation drastically. So we use a robust optimization approach for problems whose input parameters lie within a box uncertainty set. In this paper, we solve the multi-criteria robust problem with the help of the ε-constraint method.
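
As an illustration of this kind of formulation (under the assumptions of long-only weights, box uncertainty on the expected returns only, and the ε-constraint applied to the risk objective; this is not the paper's exact model), the worst-case return over the box is maximized subject to a variance bound, which a convex solver handles directly.

```python
# Sketch: robust Markowitz via the epsilon-constraint method. Maximize the
# worst-case expected return over a box around the nominal returns, subject to
# a variance cap (the epsilon-constraint on the risk objective). Long-only.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n = 5
mu_hat = rng.uniform(0.05, 0.15, n)          # nominal expected returns
delta = 0.02 * np.ones(n)                    # half-width of the box uncertainty
A = rng.normal(size=(n, n))
Sigma = A @ A.T / n + 0.05 * np.eye(n)       # positive definite covariance

w = cp.Variable(n, nonneg=True)              # long-only weights
w_eq = np.ones(n) / n
eps = float(w_eq @ Sigma @ w_eq)             # variance cap (epsilon level), feasible by construction
worst_case_return = (mu_hat - delta) @ w     # worst case over the box when w >= 0
problem = cp.Problem(cp.Maximize(worst_case_return),
                     [cp.sum(w) == 1, cp.quad_form(w, Sigma) <= eps])
problem.solve()
print("weights:", np.round(w.value, 3), "worst-case return:", round(problem.value, 4))
```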

Keywords: Portfolio optimization, multi-objective optimization, ε-constraint method, box uncertainty, robust optimization.
