Search results for: connection approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5332

4132 Loop Back Connected Component Labeling Algorithm and Its Implementation in Detecting Face

Authors: A. Rakhmadi, M. S. M. Rahim, A. Bade, H. Haron, I. M. Amin

Abstract:

In this study, a loop back algorithm for connected component labeling for detecting objects in a digital image is presented. The approach uses a loop back connected component labeling algorithm that helps the system distinguish detected objects according to their labels. Unlike the whole-window scanning technique, this technique reduces the search time for locating an object by focusing on suspected objects based on certain predefined features. The approach was also implemented in a face detection system. Face detection has become an interesting research area since many devices and systems require detecting faces for certain purposes. The input can be a still image or video; therefore, the sub-processes of such a system have to be simple, efficient, and accurate to give a good result.
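As a rough illustration of the labeling step described above, the sketch below implements a plain flood-fill connected component labeling on a binary image; the paper's loop back variant and its feature-based search-space reduction are not reproduced, and 4-connectivity is an assumption.

```python
# Minimal sketch of connected component labeling on a binary image
# (illustrative only; the paper's loop back variant is not reproduced here).
from collections import deque
import numpy as np

def label_components(binary):
    """Assign an integer label to each 4-connected foreground region."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)
    next_label = 0
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    for y in range(h):
        for x in range(w):
            if binary[y, x] and labels[y, x] == 0:
                next_label += 1
                labels[y, x] = next_label
                queue = deque([(y, x)])
                while queue:
                    cy, cx = queue.popleft()
                    for dy, dx in offsets:
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = next_label
                            queue.append((ny, nx))
    return labels, next_label

# Example: two separate blobs receive labels 1 and 2.
img = np.array([[1, 1, 0, 0],
                [0, 1, 0, 1],
                [0, 0, 0, 1]])
labels, count = label_components(img)
```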

Keywords: Image processing, connected components labeling, face detection.

4131 Emergentist Metaphorical Creativity: Towards a Model of Analysing Metaphorical Creativity in Interactive Talk

Authors: Afef Badri

Abstract:

Metaphorical creativity is not a static property of discourse; it is an interactive, dynamic process created online. There has been a lack of research concerning metaphorical creativity produced online. This paper intends to account for metaphorical creativity in online talk-in-interaction as a dynamic process that emerges as discourse unfolds. It brings together insights from the emergentist approach to the study of metaphor in verbal interactions and insights from the conceptual blending approach as a model for analysing online metaphorical constructions, in order to propose a model for studying metaphorical creativity in interactive talk. The model is based on three focal points. First, metaphorical creativity is a dynamic, emergent and open-to-change process that evolves in real time as interlocutors constantly blend and re-blend previous metaphorical contributions. Second, it is not a product of isolated individual minds but a joint achievement that is co-constructed and co-elaborated by interlocutors. The third and most important point is that the emergent process of metaphorical creativity is tightly shaped by the contextual variables surrounding talk-in-interaction. It is grounded in the interlocutors' framework of interpretation. It is constrained by preceding contributions in a way that creates textual cohesion in the verbal exchange, and it is also a goal-oriented process predefined by the communicative intention of each participant in a way that reveals the ideological coherence or incoherence of the entire conversation.

Keywords: Communicative intention, conceptual blending, contextual variables, the emergentist approach, ideological coherence, metaphorical creativity, textual cohesion

4130 Chua’s Circuit Regulation Using a Nonlinear Adaptive Feedback Technique

Authors: Abolhassan Razminia, Mohammad-Ali Sadrnia

Abstract:

Chua’s circuit is one of the most important electronic devices used for chaos and bifurcation studies, and it plays a central role in secure communication. Since adaptive control is used widely in linear systems control, we introduce a new application of the adaptive method in the field of chaos control. In this paper, we derive a new adaptive control scheme for controlling Chua’s circuit, because control of chaos is often very important in practical operations. The novelty of this approach lies in its robustness against external perturbations, which are simulated as additive noise in all measured states, and it can be generalized to other chaotic systems. Our approach is based on Lyapunov analysis, and the adaptation law is applied to the feedback gain; for this reason, we have named it NAFT (Nonlinear Adaptive Feedback Technique). Finally, simulations show the capability of the presented technique for Chua’s circuit.
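The abstract does not reproduce the NAFT adaptation law, so the sketch below only illustrates the general idea: Chua's circuit with classic parameter values, a feedback term on the first state, and a generic Lyapunov-style gain update k_dot = gamma * e^2. The gain law, parameter values, and noise-free setting are assumptions for illustration.

```python
# Illustrative sketch: Chua's circuit with a simple adaptive feedback gain.
# The adaptation law k_dot = gamma * e**2 is a generic Lyapunov-style rule,
# not the paper's exact NAFT law (which is not reproduced in the abstract).
import numpy as np

alpha, beta = 15.6, 28.0          # classic Chua parameters
m0, m1 = -8/7, -5/7               # slopes of the piecewise-linear nonlinearity

def chua_nl(x):
    return m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))

def simulate(T=50.0, dt=1e-3, gamma=2.0, x_ref=0.0):
    x, y, z, k = 0.7, 0.0, 0.0, 0.0
    history = []
    for _ in range(int(T / dt)):
        e = x - x_ref                       # regulation error on the first state
        u = -k * e                          # adaptive feedback applied to x
        dx = alpha * (y - x - chua_nl(x)) + u
        dy = x - y + z
        dz = -beta * y
        dk = gamma * e ** 2                 # gain grows until the error is damped
        x, y, z, k = x + dt*dx, y + dt*dy, z + dt*dz, k + dt*dk
        history.append((x, y, z, k))
    return np.array(history)

traj = simulate()
```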

Keywords: Chaos, adaptive control, nonlinear control, Chua's circuit.

4129 Multi-Line Flexible Alternating Current Transmission System (FACTS) Controller for Transient Stability Analysis of a Multi-Machine Power System Network

Authors: A.V.Naresh Babu, S.Sivanagaraju

Abstract:

Considerable progress has been achieved in transient stability analysis (TSA) with various FACTS controllers, but all of these controllers are associated with a single transmission line. This paper discusses a new approach, i.e., a multi-line FACTS controller, namely the interline power flow controller (IPFC), for TSA of a multi-machine power system network. A mathematical model of the IPFC, termed the power injection model (PIM), is presented, and this model is incorporated into the Newton-Raphson (NR) power flow algorithm. Then, the reduced admittance matrix of a multi-machine power system network for a three-phase fault, without and with the IPFC, is obtained, which is required to draw the machine swing curves. A general approach based on the L-index has also been discussed to find the best location of the IPFC to reduce the proximity to instability of a power system. Numerical results are obtained for two test systems, namely 6-bus and 11-bus systems. A program in MATLAB has been written to plot the variation of generator rotor angle and speed difference curves without and with the IPFC for TSA, and a simple approach has been presented to evaluate the critical clearing time for the test systems. The results obtained without and with the IPFC are compared and discussed.
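The multi-machine reduced-admittance formulation with IPFC is beyond a short sketch; as a hedged illustration of the swing-curve and critical-clearing-time idea, the Python snippet below (not the authors' MATLAB program) integrates the classical single-machine-infinite-bus swing equation with assumed transfer limits.

```python
# Hedged sketch: classical single-machine-infinite-bus swing curve and a
# brute-force critical clearing time search. This is a textbook simplification,
# not the paper's multi-machine reduced-admittance formulation with IPFC.
import numpy as np

H, f0 = 5.0, 50.0                 # inertia constant (s), system frequency (Hz)
M = 2 * H / (2 * np.pi * f0)      # per-unit inertia coefficient
Pm = 0.8                          # mechanical input power (pu)
Pmax_pre, Pmax_fault, Pmax_post = 2.0, 0.5, 1.5   # assumed transfer limits (pu)

def swing_curve(t_clear, t_end=3.0, dt=1e-4):
    delta = np.arcsin(Pm / Pmax_pre)     # pre-fault operating angle
    omega = 0.0                          # speed deviation (rad/s)
    angles = []
    for step in range(int(t_end / dt)):
        t = step * dt
        Pmax = Pmax_fault if t < t_clear else Pmax_post
        domega = (Pm - Pmax * np.sin(delta)) / M
        delta += dt * omega
        omega += dt * domega
        angles.append(delta)
    return np.array(angles)

def critical_clearing_time():
    for t_clear in np.arange(0.05, 1.0, 0.01):
        if swing_curve(t_clear).max() > np.pi:   # crude instability check
            return t_clear - 0.01                # last stable clearing time
    return None

print("approximate CCT (s):", critical_clearing_time())
```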

Keywords: Flexible alternating current transmission system (FACTS), first swing stability, interline power flow controller (IPFC), power injection model (PIM).

4128 MIM: A Species Independent Approach for Classifying Coding and Non-Coding DNA Sequences in Bacterial and Archaeal Genomes

Authors: Achraf El Allali, John R. Rose

Abstract:

A number of competing methodologies have been developed to identify genes and classify DNA sequences into coding and non-coding sequences. This classification process is fundamental in gene finding and gene annotation tools and is one of the most challenging tasks in bioinformatics and computational biology. An information-theoretic measure based on mutual information has shown good accuracy in classifying DNA sequences into coding and non-coding. In this paper we describe a species-independent iterative approach that distinguishes coding from non-coding sequences using the mutual information measure (MIM). A set of sixty prokaryotes is used to extract universal training data. To facilitate comparisons with the published results of other researchers, a test set of 51 bacterial and archaeal genomes was used to evaluate MIM. These results demonstrate that MIM produces superior results while remaining species independent.
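The abstract does not detail the MIM feature set, so the following sketch only illustrates the underlying quantity: the mutual information between nucleotides a fixed distance apart, here k = 3 to capture the period-3 signal typical of coding regions (an assumption for illustration).

```python
# Illustrative sketch: mutual information between bases k positions apart.
# The paper's exact MIM feature set is not described in the abstract; the
# period-3 correlation shown here is a commonly used coding-region signal.
from collections import Counter
from math import log2

def mutual_information(seq, k=3):
    pairs = [(seq[i], seq[i + k]) for i in range(len(seq) - k)]
    joint = Counter(pairs)
    left = Counter(p[0] for p in pairs)
    right = Counter(p[1] for p in pairs)
    n = len(pairs)
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        mi += p_ab * log2(p_ab / ((left[a] / n) * (right[b] / n)))
    return mi

coding_like = "ATGGCTGCTAAAGCTGCTGAAGCTGCTAAA" * 10   # synthetic example sequence
print(mutual_information(coding_like, k=3))
```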

Keywords: Coding/non-coding classification, entropy, gene recognition, mutual information.

4127 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal

Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert

Abstract:

We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of the calculation of the vertex errors, ordering of the error values and removal of vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
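As a CPU-side illustration of the quadric error metric mentioned above (in the Garland-Heckbert sense), the sketch below accumulates per-vertex quadrics and evaluates v^T Q v; the GPU-parallel removal and the per-vertex foldover boundaries are not reproduced.

```python
# CPU sketch of the quadric error metric referenced in the abstract
# (Garland-Heckbert style); GPU-parallel removal and the per-vertex
# foldover boundaries are not reproduced here.
import numpy as np

def plane_quadric(v0, v1, v2):
    """Fundamental error quadric K_p = p p^T for the triangle's plane."""
    n = np.cross(v1 - v0, v2 - v0)
    n = n / np.linalg.norm(n)
    d = -np.dot(n, v0)
    p = np.append(n, d)                 # plane [a, b, c, d] with ax+by+cz+d=0
    return np.outer(p, p)

def vertex_errors(vertices, faces):
    """Accumulate quadrics per vertex and return v^T Q v for each vertex."""
    quadrics = [np.zeros((4, 4)) for _ in vertices]
    for i, j, k in faces:
        K = plane_quadric(vertices[i], vertices[j], vertices[k])
        for idx in (i, j, k):
            quadrics[idx] += K
    errors = []
    for v, Q in zip(vertices, quadrics):
        h = np.append(v, 1.0)           # homogeneous coordinates
        errors.append(float(h @ Q @ h))
    return errors                       # low error = candidate for removal

verts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [1., 1., 0.5]])
faces = [(0, 1, 2), (1, 3, 2)]
print(vertex_errors(verts, faces))
```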

Keywords: Computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving.

4126 Patient-Specific Modeling Algorithm for Medical Data Based on AUC

Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper

Abstract:

Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use the AUC as a metric to select an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the patient-specific decision path (PSDP) entropy-based and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
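A minimal sketch of the AUC-based attribute selection step, assuming scikit-learn's roc_auc_score as the AUC computation; the full patient-specific decision path construction is not reproduced.

```python
# Hedged sketch: choosing the attribute whose single-feature ranking yields
# the highest AUC, the selection criterion described in the abstract. The
# full patient-specific decision path construction is not reproduced.
import numpy as np
from sklearn.metrics import roc_auc_score

def best_attribute_by_auc(X, y):
    """Return (index, auc) of the feature with the largest class-separating AUC."""
    best_idx, best_auc = None, 0.5
    for j in range(X.shape[1]):
        auc = roc_auc_score(y, X[:, j])
        auc = max(auc, 1.0 - auc)        # direction-invariant ranking quality
        if auc > best_auc:
            best_idx, best_auc = j, auc
    return best_idx, best_auc

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = np.column_stack([y + rng.normal(0, 1.0, 200),    # weakly informative
                     y + rng.normal(0, 0.3, 200),    # strongly informative
                     rng.normal(0, 1.0, 200)])       # pure noise
print(best_attribute_by_auc(X, y))
```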

Keywords: Instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions.

4125 Proposing Problem-Based Learning as an Effective Pedagogical Technique for Social Work Education

Authors: Christine K. Fulmer

Abstract:

Social work education is competency-based in nature. There is an expectation that graduates of social work programs throughout the world are prepared to practice at a level of competence which is beneficial to the well-being of both individuals and the community. Experiential learning is one way to prepare students for competent practice. The use of Problem-Based Learning (PBL) is a form of experiential education that has been successful in a number of disciplines in bridging the gap between theoretical concepts in the classroom and the real world. PBL aligns with the constructivist theoretical approach to learning, which emphasizes the integration of new knowledge with the beliefs students already hold. In addition, the basic tenets of PBL correspond well with the practice behaviors associated with social work practice, including multi-disciplinary collaboration and critical thinking. This paper makes an argument for utilizing PBL in social work education.

Keywords: Constructivist theoretical approach, experiential learning, pedagogy, problem-based learning, social work education.

4124 Multi-Criteria Based Robust Markowitz Model under Box Uncertainty

Authors: Pulak Swain, A. K. Ojha

Abstract:

Portfolio optimization deals with the problem of efficient asset allocation. Risk and expected return are two conflicting criteria in such problems, where the investor prefers the return to be high and the risk to be low. Using a multi-objective approach, we can solve these types of problems. However, the information we have for the input parameters is generally ambiguous, and the input values can fluctuate around some nominal values. We cannot ignore the uncertainty in input values, as it can affect the asset allocation drastically. So we apply a robust optimization approach to problems where the input parameters come under box uncertainty. In this paper, we solve the multi-criteria robust problem with the help of the ε-constraint method.
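A minimal sketch of the robust problem in ε-constraint form, assuming a long-only portfolio, box uncertainty on the expected returns only, and illustrative data; it minimizes variance subject to a worst-case return floor using scipy's SLSQP solver.

```python
# Hedged sketch: a robust mean-variance problem under box uncertainty on the
# expected returns, solved in epsilon-constraint form (minimize risk subject to
# a worst-case return floor). Data and the uncertainty radius are illustrative.
import numpy as np
from scipy.optimize import minimize

mu_hat = np.array([0.10, 0.07, 0.04])          # nominal expected returns
delta = np.array([0.02, 0.01, 0.005])          # box uncertainty half-widths
Sigma = np.array([[0.09, 0.01, 0.00],
                  [0.01, 0.04, 0.01],
                  [0.00, 0.01, 0.01]])         # covariance of returns
eps = 0.05                                     # required worst-case return

def risk(w):
    return w @ Sigma @ w

constraints = [
    {"type": "eq",   "fun": lambda w: np.sum(w) - 1.0},
    # For long-only weights the worst case in the box is mu_hat - delta.
    {"type": "ineq", "fun": lambda w: (mu_hat - delta) @ w - eps},
]
bounds = [(0.0, 1.0)] * len(mu_hat)
w0 = np.ones(len(mu_hat)) / len(mu_hat)

res = minimize(risk, w0, method="SLSQP", bounds=bounds, constraints=constraints)
print("weights:", np.round(res.x, 3),
      "worst-case return:", (mu_hat - delta) @ res.x)
```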

Keywords: Portfolio optimization, multi-objective optimization, ε-constraint method, box uncertainty, robust optimization.

4123 Speed Sensorless Control with a Linearization by State Feedback of Asynchronous Machine Using a Model Reference Adaptive System

Authors: A. Larabi, M. S. Boucherit

Abstract:

In this paper, we present the association of PI regulators for the speed and stator currents with a control strategy based on linearization by state feedback for an induction machine without a speed sensor, together with an adaptation of the rotor resistance. The rotor speed is estimated using the model reference adaptive system (MRAS) approach. This method uses two models: the first is the reference model and the second is an adjustable one, in which two components of the stator flux, obtained from the measured stator currents and voltages, are estimated. The estimated rotor speed is then obtained by canceling the difference between the stator flux of the reference model and that of the adjustable one. Satisfactory simulation results are obtained and discussed in this paper to highlight the proposed approach.

Keywords: Asynchronous actuator, PI regulator, adaptive method with reference model, vector control.

4122 Modeling of Bio Scaffolds: Structural and Fluid Transport Characterization

Authors: Sahba Sadir, M. R. A. Kadir, A. Öchsner, M. N. Harun

Abstract:

Scaffolds play a key role in tissue engineering and can be produced in many different ways depending on the application and the materials used. Most researchers have used an experimental trial-and-error approach to new biomaterials, but computer simulation applied to tissue engineering can offer a more exhaustive approach to test and screen biomaterials. This paper develops a scaffold model and a computational fluid dynamics analysis that show the value of computer simulations in determining the influence of the geometrical scaffold parameters (porosity, pore size and shape) on the permeability of scaffolds, the magnitude of velocity, the pressure drop, the shear stress distribution and level, and the proper design of the scaffold geometry. This motivates more advanced studies in which the dynamic conditions of a micro-fluid passing through the scaffold are characterized for tissue engineering applications and the differentiation of tissues within scaffolds.
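Where permeability is extracted from the simulated pressure drop, a Darcy's-law estimate of the form k = QμL/(AΔP) is the usual post-processing step; the short sketch below shows that calculation with placeholder values (not results from the paper).

```python
# Minimal sketch: estimating scaffold permeability from a simulated or measured
# pressure drop using Darcy's law, k = Q * mu * L / (A * dP). The numbers are
# placeholders, not results from the paper.
def darcy_permeability(flow_rate, viscosity, length, area, pressure_drop):
    """Permeability in m^2 from volumetric flow (m^3/s), viscosity (Pa*s),
    sample length (m), cross-section area (m^2) and pressure drop (Pa)."""
    return flow_rate * viscosity * length / (area * pressure_drop)

k = darcy_permeability(flow_rate=1e-7, viscosity=1e-3,
                       length=2e-3, area=1e-4, pressure_drop=50.0)
print(f"estimated permeability: {k:.3e} m^2")
```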

Keywords: Scaffold engineering, tissue engineering, cellular structure, biomaterial, computational fluid dynamics.

4121 Developing New Media Credibility Scale: A Multidimensional Perspective

Authors: Hanaa Farouk Saleh

Abstract:

The main purposes of this study are to develop a scale that reflects emerging theoretical understandings of new media credibility, based on the evolution of credibility studies in Western research; to identify the determinants of credibility in the media and its components by comparing traditional and new media credibility scales; and to build a cumulative scale to test new media credibility. This approach builds on Western research using conceptualizations of media credibility that focus on four principal components: source (journalist), message (article), medium (newspaper, radio, TV, web, etc.), and organization (owner of the medium), adding the user and the cultural context as key components to assess new media credibility in particular. This study's value lies in its contribution to the conceptualization and development of new media credibility through the creation of a theoretical measurement tool. Future studies should explore this scale to test new media credibility, which represents a promising new approach in the efforts to define and measure the credibility of all media types.

Keywords: Credibility scale, media credibility components, new media credibility scale, scale development.

4120 Design Process and Real-Time Validation of an Innovative Autonomous Mid-Air Flight and Landing System

Authors: De Lellis E., Di Vito V., Garbarino L., Lai C., Corraro F.

Abstract:

This paper describes the design process and the real-time validation of an innovative autonomous mid-air flight and landing system developed by the Italian Aerospace Research Center in the framework of the Italian nationally funded project TECVOL (Technologies for the Autonomous Flight). The paper provides an insight into the whole development process of the system under study. In particular, the project framework is illustrated first, then the functional context and the adopted design and testing approach are described, and finally the purpose-designed on-ground validation test rig is addressed in detail. Furthermore, the hardware-in-the-loop validation of the autonomous mid-air flight and landing system by means of the real-time test rig is described and discussed.

Keywords: Autonomous landing, autonomous mid-air flight, design and test approach, real-time hardware-in-the-loop validation

4119 Augmented Reality Sandbox and Constructivist Approach for Geoscience Teaching and Learning

Authors: Muhammad Nawaz, Sandeep N. Kundu, Farha Sattar

Abstract:

The augmented reality sandbox adds new dimensions to education and the learning process. It can be a core component of geoscience teaching and learning for understanding geographic contexts and landform processes. The augmented reality sandbox is a useful tool not only for creating an interactive learning environment through spatial visualization but also for providing an active learning experience to students and enhancing the cognition process of learning. It can be used as an interactive learning tool to teach geomorphic and landform processes. This article explains the augmented reality sandbox and the constructivist approach for geoscience teaching and learning, and endeavours to explore ways to teach geographic processes using the three-dimensional digital environment for the deep, interactive learning of geoscience concepts.

Keywords: Augmented Reality Sandbox, constructivism, deep learning, geoscience.

4118 Formosa3: A Cloud-Enabled HPC Cluster in NCHC

Authors: Chin-Hung Li, Te-Ming Chen, Ying-Chuan Chen, Shuen-Tai Wang

Abstract:

This paper proposes a new approach to offering a private cloud service in HPC clusters. In particular, our approach relies on automatically scheduling users' customized environment requests as normal jobs in the batch system. After the virtualization request jobs finish, the guest operating systems are dismissed so that the compute nodes are released again for computing. We present initial work on the integration of an HPC batch system and virtualization tools that aims at coexistence with the minimal interference required by a traditional HPC cluster. Given the design of the initial infrastructure, the proposed effort has the potential to positively impact the synergy model. The results from the experiment show that the goal of provisioning a customized cluster environment can indeed be fulfilled by using virtual machines, and that efficiency can be improved with proper setup and arrangements.

Keywords: Cloud Computing, HPC Cluster, Private Cloud, Virtualization

4117 MIT Automatic ECG Beat Tachycardia Detection Using Artificial Neural Network

Authors: R. Amandi, A. Shahbazi, A. Mohebi, M. Bazargan, Y. Jaberi, P. Emadi, A. Valizade

Abstract:

The application of neural networks for disease diagnosis has made great progress and is widely used by physicians. An electrocardiogram carries vital information about heart activity, and physicians use this signal for cardiac disease diagnosis, which was the main motivation for our study. In our work, the tachycardia features obtained are used for training and testing a neural network. In this study we use a fuzzy probabilistic neural network as an automatic technique for ECG signal analysis. Since every real signal recorded by the equipment can contain different artifacts, some preprocessing steps are needed before feeding it to our system. The wavelet transform is used for extracting the morphological parameters of the ECG signal. The outcome of the approach for a variety of arrhythmias shows that the presented approach is superior to previously presented algorithms, with an average accuracy of about 95% for more than seven tachyarrhythmias.

Keywords: Fuzzy Logic, Probabilistic Neural Network, Tachycardia, Wavelet Transform.

4116 Liver Lesion Extraction with Fuzzy Thresholding in Contrast Enhanced Ultrasound Images

Authors: Abder-Rahman Ali, Adélaïde Albouy-Kissi, Manuel Grand-Brochier, Viviane Ladan-Marcus, Christine Hoeffl, Claude Marcus, Antoine Vacavant, Jean-Yves Boire

Abstract:

In this paper, we present a new segmentation approach for focal liver lesions in contrast enhanced ultrasound imaging. This approach, based on a two-cluster Fuzzy C-Means methodology, considers type-II fuzzy sets to handle uncertainty due to the image modality (presence of speckle noise, low contrast, etc.), and to calculate the optimum inter-cluster threshold. Fine boundaries are detected by a local recursive merging of ambiguous pixels. The method has been tested on a representative database. Compared to both Otsu and type-I Fuzzy C-Means techniques, the proposed method significantly reduces the segmentation errors.
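As a simplified stand-in for the described thresholding step, the sketch below runs a plain two-cluster (type-I) fuzzy C-means on pixel intensities and places the threshold between the two centres; the paper's type-II extension and local recursive merging are not reproduced.

```python
# Hedged sketch: a plain two-cluster (type-I) fuzzy C-means on pixel intensities
# and a threshold taken between the two cluster centres. The paper's type-II
# extension and local recursive merging are not reproduced here.
import numpy as np

def fcm_threshold(intensities, m=2.0, iters=50):
    x = intensities.astype(float).ravel()
    centers = np.array([x.min(), x.max()])          # initial cluster centres
    for _ in range(iters):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-9
        u = 1.0 / (d ** (2.0 / (m - 1.0)))
        u = u / u.sum(axis=1, keepdims=True)        # fuzzy memberships
        centers = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
    return centers.mean(), centers                  # inter-cluster threshold

# Synthetic intensity histogram: dark background plus a brighter lesion-like mode.
pixels = np.concatenate([np.random.normal(60, 10, 5000),
                         np.random.normal(160, 15, 2000)])
thr, centers = fcm_threshold(pixels)
print("threshold:", round(thr, 1), "centres:", np.round(centers, 1))
```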

Keywords: Defuzzification, fuzzy clustering, image segmentation, type-II fuzzy sets.

4115 Learning Undergraduate Mathematics in a Discovery-Enriched Approach

Authors: Kam-moon Liu, Kwok-chi Chim, Kwok-wai Chung, Daniel Wing-cheong Ho

Abstract:

Students often adopt routine practising as their learning strategy for mathematics, because in high school they are often bound and trained to solve conventional types of questions. This becomes problematic if students further consolidate this practice at university. Therefore, the Department of Mathematics emphasized and integrated the discovery-enriched approach in the undergraduate curriculum. This paper presents the details of implementing the discovery-enriched curriculum by providing an adequate platform for project learning, expertise for guidance, and internship opportunities for students majoring in Mathematics. The Department also provided project-learning opportunities in mathematics courses targeted at students majoring in other science or engineering disciplines. The outcome is promising: the research ability and problem-solving skills of students are enhanced.

Keywords: Discovery-enriched curriculum, higher education, mathematics education, project learning.

4114 Case Study Approach Using Scenario Analysis to Analyze Unabsorbed Head Office Overheads

Authors: K. C. Iyer, T. Gupta, Y. M. Bindal

Abstract:

Head office overhead (HOOH) is an indirect cost and is recovered through individual project billings by the contractor. Delay in a project impacts the absorption of the HOOH cost allocated to that particular project and thus diminishes the expected profit of the contractor. This unabsorbed HOOH cost is later claimed by contractors as damages. The subjective nature of the available formulae for computing unabsorbed HOOH is the difficulty that contractors and owners face, and thus they dispute it. The paper attempts to bring together the rationale of the various HOOH formulae by gathering a contractor's HOOH cost data on all of its projects, using a case study approach and comparing variations in the values of HOOH using scenario analysis. The case study approach uses project data collected from four construction projects of a contractor in India to calculate unabsorbed HOOH costs from the various available formulae. Scenario analysis provides further variations in HOOH values after considering two independent situations, namely scope changes and new projects during the delay period. Interestingly, one of the findings of this study reveals that, in spite of HOOH getting absorbed by the additional works available during the period of delay, a few formulae depict an increase in the value of unabsorbed HOOH, neglecting any absorption by the increase in scope. This indicates that these formulae are inappropriate for use in case of a change to the scope of work. The results of this study can help both parties decide on an appropriate formula more objectively, considering the events causing the delay on a project and the contractor's position in respect of obtaining new projects.
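The abstract does not name the formulae it compares, but the Eichleay formula is one commonly cited way to estimate unabsorbed HOOH and serves here purely as a representative, hedged example; all figures are invented.

```python
# Hedged example: the Eichleay formula, one commonly cited way to estimate
# unabsorbed head office overhead. The abstract does not name the specific
# formulae it compares, so this is only a representative illustration.
def eichleay_unabsorbed_hooh(contract_billings, total_billings,
                             total_hooh, performance_days, delay_days):
    # Step 1: overhead allocable to this contract, pro-rated by billings.
    allocable = (contract_billings / total_billings) * total_hooh
    # Step 2: daily allocable overhead over the performance period.
    daily_rate = allocable / performance_days
    # Step 3: unabsorbed overhead claimed for the compensable delay.
    return daily_rate * delay_days

claim = eichleay_unabsorbed_hooh(contract_billings=50e6, total_billings=400e6,
                                 total_hooh=30e6, performance_days=730,
                                 delay_days=90)
print(f"unabsorbed HOOH estimate: {claim:,.0f}")
```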

Keywords: Absorbed and unabsorbed overheads, head office overheads, scenario analysis, scope variation

4113 Some Issues of Measurement of Impairment of Non-Financial Assets in the Public Sector

Authors: Mariam Vardiashvili

Abstract:

The economic significance of the asset impairment process is quite large. Impairment reflects the reduction of the future economic benefits or service potential embodied in an asset. The assets owned by public sector entities either bring economic benefits or are used for the delivery of free-of-charge services; consequently, they are classified as cash-generating and non-cash-generating assets. IPSAS 21, Impairment of Non-Cash-Generating Assets, and IPSAS 26, Impairment of Cash-Generating Assets, have been designed with this specificity in mind. When measuring the impairment of assets, it is important to select the relevant methods. For measurement of impaired non-cash-generating assets, IPSAS 21 recommends three methods: the depreciated replacement cost approach, the restoration cost approach, and the service units approach. The value in use of cash-generating assets (as per IPSAS 26) is measured by the discounted value of the cash flows to be received in the future. The article classifies public sector assets as non-cash-generating and cash-generating assets and also deals with the factors which should be considered when evaluating the impairment of assets. The essence of impairment of non-financial assets and the methods of its measurement are formulated according to IPSAS 21 and IPSAS 26. The main emphasis is put on the different methods of measuring the value in use of impaired cash-generating and non-cash-generating assets and on the methods of their selection. The traditional and the expected cash flow approaches for calculating the discounted value are reviewed. The article also discusses the recognition of impairment loss and its reflection in financial reporting. The article concludes that, regardless of the functional purpose of the impaired asset and whichever method is used for measuring it, realistic information regarding the value of the assets should be presented in the financial reporting. In the theoretical development of the issue, the methods of scientific abstraction, analysis and synthesis were used. The research was carried out with a systemic approach and draws on international accounting standards, theoretical research and publications of Georgian and foreign scientists.
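A minimal sketch of the two present-value techniques mentioned above for value in use: the traditional approach (one cash flow stream, one rate) and the expected cash flow approach (probability-weighted scenarios). Amounts, probabilities and the discount rate are illustrative assumptions.

```python
# Minimal sketch of the two present-value techniques mentioned in the abstract:
# the traditional approach (a single set of cash flows and one rate) and the
# expected cash flow approach (probability-weighted scenarios). Figures are
# purely illustrative.
def present_value(cash_flows, rate):
    """Discount a list of yearly cash flows (years 1..n) at a single rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def expected_cash_flow_value(scenarios, rate):
    """scenarios: list of (probability, cash_flows) pairs."""
    expected = [sum(p * cfs[t] for p, cfs in scenarios)
                for t in range(len(scenarios[0][1]))]
    return present_value(expected, rate)

# Traditional approach: the single most likely cash flow stream.
print(present_value([1000, 1000, 900], rate=0.06))

# Expected cash flow approach: three scenarios with probabilities.
scenarios = [(0.3, [1200, 1100, 1000]),
             (0.5, [1000, 1000, 900]),
             (0.2, [700, 600, 500])]
print(expected_cash_flow_value(scenarios, rate=0.06))
```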

Keywords: Non-cash-generating assets, cash-generating assets, recoverable value, recoverable service amount, value in use.

4112 Dynamic Threshold Adjustment Approach For Neural Networks

Authors: Hamza A. Ali, Waleed A. J. Rasheed

Abstract:

The use of neural networks for recognition applications is generally constrained by the inflexibility of their parameters after the training phase: no adaptation is accommodated for input variations that have any influence on the network parameters. Attempts were made in this work to design a neural network that includes an additional mechanism that adjusts the threshold values according to input pattern variations. The new approach is based on splitting the whole network into two subnets: a main traditional net and a supportive net. The first deals with the required output of trained patterns with predefined settings, while the second generates output dynamically, with a tuning capability for any newly applied input. This tuning comes in the form of an adjustment to the threshold values. Two levels of supportive net were studied: one implements an extended additional layer with an adjustable neuronal threshold setting mechanism, while the second implements an auxiliary net with a traditional architecture that performs dynamic adjustment of the threshold value of the main net, which is constructed in a dual-layer architecture. Experimental results and analysis of the proposed designs have been quite satisfactory. The supportive-layer approach achieved a recognition rate of over 90%, while the multiple-network technique shows a more effective and acceptable level of recognition. However, this is achieved at the price of network complexity and computation time. Recognition generalization may also be improved by accommodating capabilities involving all the innate structures in conjunction with intelligence abilities and the needs of further advanced learning phases.

Keywords: Classification, Recognition, Neural Networks, Pattern Recognition, Generalization.

4111 High Quality Speech Coding using Combined Parametric and Perceptual Modules

Authors: M. Kulesza, G. Szwoch, A. Czyżewski

Abstract:

A novel approach to speech coding using a hybrid architecture is presented. The advantages of parametric and perceptual coding methods are utilized together in order to create a speech coding algorithm assuring better signal quality than a traditional CELP parametric codec. Two approaches are discussed. One is based on the selection of voiced signal components that are encoded using the parametric algorithm, unvoiced components that are encoded perceptually, and transients that remain unencoded. The second approach uses perceptual encoding of the residual signal in the CELP codec. The algorithm applied for precise transient selection is described. The signal quality achieved using the proposed hybrid codec is compared to the quality of some standard speech codecs.

Keywords: CELP residual coding, hybrid codec architecture, perceptual speech coding, speech codecs comparison.

4110 Intelligent Agent Communication by Using DAML to Build Agent Community Ontology

Authors: Cheng-Hsiung Hung, Hong-Jie Dai, Jason Jen-Yen Chen

Abstract:

This paper presents a new approach for intelligent agent communication based on ontology for agent community. DARPA agent markup language (DAML) is used to build the community ontology. This paper extends the agent management specification by the foundation for intelligent physical agents (FIPA) to develop an agent role called community facilitator (CF) that manages community directory and community ontology. CF helps build agent community. Precise description of agent service in this community can thus be achieved. This facilitates agent communication. Furthermore, through ontology update, agents with different ontology are capable of communicating with each other. An example of advanced traveler information system is included to illustrate practicality of this approach.

Keywords: Intelligent agent communication, DARPA agent markup language (DAML), Community ontology, Advanced Traveler Information System (ATIS).

4109 Analyzing Periurban Fringe with Rough Set

Authors: Benedetto Manganelli, Beniamino Murgante

Abstract:

The distinction among urban, periurban and rural areas represents a classical example of uncertainty in land classification. Satellite images, geostatistical analysis and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining great amounts of data to build complex knowledge about territory. Rough set theory may be a useful method to employ in this field. It represents a different mathematical approach to uncertainty by capturing indiscernibility: two different phenomena can be indiscernible in some contexts and classified in the same way when the available information about them is combined. This approach has been applied to a case study, comparing the results achieved with both the map algebra technique and spatial rough sets. The case study area, Potenza Province, is particularly suitable for the application of this theory, because it includes 100 municipalities with different numbers of inhabitants and morphologic features.
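A minimal sketch of the rough set machinery referred to above: objects are grouped into indiscernibility classes by their attribute values, and a target class (here "periurban") is bounded by its lower and upper approximations. The attribute table is invented for illustration.

```python
# Minimal sketch of rough set lower and upper approximations: municipalities
# are grouped into indiscernibility classes by their attribute values, and a
# target set ("periurban") is approximated from below and above.
# Attribute values here are invented for illustration.
from collections import defaultdict

records = {
    "A": ("high_density", "near_city"),
    "B": ("high_density", "near_city"),
    "C": ("low_density",  "near_city"),
    "D": ("low_density",  "far"),
    "E": ("low_density",  "far"),
}
periurban = {"A", "C"}                      # decision class to approximate

# Indiscernibility classes: objects with identical attribute values.
classes = defaultdict(set)
for name, attrs in records.items():
    classes[attrs].add(name)

lower = set().union(*[c for c in classes.values() if c <= periurban])
upper = set().union(*[c for c in classes.values() if c & periurban])

print("lower approximation:", lower)        # certainly periurban
print("upper approximation:", upper)        # possibly periurban
print("boundary region:", upper - lower)    # the uncertain fringe
```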

Keywords: Land Classification, Map Algebra, Periurban Fringe, Rough Set, Urban Planning, Urban Sprawl.

4108 Hybrid Method Using Wavelets and Predictive Method for Compression of Speech Signal

Authors: Karima Siham Aoubid, Mohamed Boulemden

Abstract:

The development of signal compression algorithms is making steady progress. These algorithms are continuously improved with new tools and aim to reduce, on average, the number of bits necessary for signal representation while minimizing the reconstruction error. This article proposes the compression of an Arabic speech signal by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on the one hand, on the decomposition of the original signal by a bank of analysis filters, which is followed by the compression stage, and on the other hand, on the application of order-5 linear prediction to the signal coefficients. The aim of this approach is the estimation of the prediction error, which is then coded and transmitted. The decoding operation is used to reconstruct the original signal. Thus, an adequate choice of the filter bank is necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
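A hedged sketch of the described pipeline: wavelet analysis of the signal followed by order-5 linear prediction of each subband's coefficients, with the prediction residual as the quantity that would be coded. Filter choice, quantization and the coding of the residual are not addressed, and the test signal is synthetic.

```python
# Hedged sketch: wavelet analysis followed by order-5 linear prediction of the
# subband coefficients, with the prediction residual as the quantity to code.
# This mirrors the described pipeline in simplified form; filter choice and
# residual quantization/coding are not addressed.
import numpy as np
import pywt

def lp_residual(x, order=5):
    """Fit prediction coefficients by least squares and return the residual."""
    rows = np.array([x[i - order:i][::-1] for i in range(order, len(x))])
    target = x[order:]
    coeffs, *_ = np.linalg.lstsq(rows, target, rcond=None)
    return target - rows @ coeffs, coeffs

t = np.linspace(0, 1, 2048)
speech_like = np.sin(2 * np.pi * 180 * t) + 0.3 * np.sin(2 * np.pi * 900 * t)

subbands = pywt.wavedec(speech_like, "db4", level=3)   # analysis filter bank
for i, band in enumerate(subbands):
    residual, coeffs = lp_residual(band)
    print(f"band {i}: coefficient energy {np.sum(band**2):.1f}, "
          f"residual energy {np.sum(residual**2):.3f}")
```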

Keywords: Compression, linear prediction analysis, multiresolution analysis, speech signal.

4107 Physical Modeling of Oil Well Fire Extinguishing Using a Turbojet on a Barge

Authors: M. Abbaspour, D. Mansouri, N. Mansouri

Abstract:

There are reports of gas and oil well fires due to different accidents. Many different methods are used for firefighting in the gas and oil industry. Traditional fire extinguishing techniques mostly face many problems, are usually time consuming, and need a lot of equipment. Besides, they cause damage to facilities and create health and environmental problems. This article proposes an innovative approach to fire extinguishing techniques in the oil and gas industry, especially applicable to burning oil wells located offshore. Fire extinguishment employing a turbojet is a novel approach which can help extinguish the fire in a short period of time. Divergent and convergent turbojets modeled at laboratory scale, along with a high-pressure flame, were used. Different experiments were conducted to determine the relationship between the output discharges of the trumpet and the oil wells. The results were corrected, and the relationships between the dimensionless parameters of flame and fire extinguishment distances, and between the output discharges of the turbojet and the oil wells at specified distances, are demonstrated by specific curves.

Keywords: Burning well, fire extinguishment, gas/oil industry, simulation.

4106 An Approach for Blind Source Separation using the Sliding DFT and Time Domain Independent Component Analysis

Authors: Koji Yamanouchi, Masaru Fujieda, Takahiro Murakami, Yoshihisa Ishida

Abstract:

The "cocktail party problem" is well known as one of the human auditory abilities: we can recognize the specific sound that we want to listen to even if a lot of undesirable sounds or noises are mixed in. Blind source separation (BSS) based on independent component analysis (ICA) is one of the methods by which we can separate a particular signal from mixed signals under simple hypotheses. In this paper, we propose an online approach for blind source separation using the sliding DFT and time-domain independent component analysis. The proposed method reduces the computational complexity in comparison with conventional methods and can be applied to parallel processing using digital signal processors (DSPs). We evaluate this method and show its effectiveness.
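A minimal sketch of the sliding DFT recurrence that makes the frequency-domain front end cheap: each new sample updates a DFT bin in O(1) rather than recomputing an N-point transform. The ICA separation stage is not reproduced, and the signal and bin choice are illustrative.

```python
# Minimal sketch of the sliding DFT recurrence used as the frequency-domain
# front end: each new sample updates a DFT bin in O(1) instead of recomputing
# a full N-point transform. The ICA separation stage is not reproduced here.
import numpy as np

def sliding_dft_bin(x, k, N):
    """Track bin k of an N-point DFT over a window sliding one sample at a time."""
    twiddle = np.exp(2j * np.pi * k / N)
    S = np.sum(x[:N] * np.exp(-2j * np.pi * k * np.arange(N) / N))  # initial bin
    outputs = [S]
    for n in range(N, len(x)):
        # Remove the sample leaving the window, add the new one, then rotate.
        S = (S + x[n] - x[n - N]) * twiddle
        outputs.append(S)
    return np.array(outputs)

fs, N = 8000, 256
t = np.arange(2 * N) / fs
x = np.sin(2 * np.pi * 1000 * t)
k = int(round(1000 * N / fs))        # bin closest to 1 kHz
bins = sliding_dft_bin(x, k, N)
print("magnitude at 1 kHz bin:", round(abs(bins[-1]), 1))
```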

Keywords: Cocktail party problem, blind source separation (BSS), independent component analysis, sliding DFT, online processing.

4105 Energy Map Construction using Adaptive Alpha Grey Prediction Model in WSNs

Authors: Surender Kumar Soni, Dhirendra Pratap Singh

Abstract:

Wireless sensor networks can be used to monitor physical phenomena in areas where human access is nearly impossible. The limited power supply is the major constraint of WSNs due to the use of non-rechargeable batteries in sensor nodes, and a lot of research is going on to reduce the energy consumption of sensor nodes. An energy map can be used with clustering, data dissemination and routing techniques to reduce the power consumption of WSNs; it can also be used to know which part of the network is going to fail in the near future. In this paper, the energy map is constructed using a prediction-based approach. The adaptive alpha GM(1,1) model is used as the prediction model. GM(1,1) is used worldwide in many applications for predicting future values of a time series from some past values, due to its high computational efficiency and accuracy.
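A hedged sketch of the GM(1,1) prediction step: the background-value weight α is exposed as a parameter, but the paper's rule for adapting α is not reproduced, and the energy readings are invented.

```python
# Hedged sketch of the GM(1,1) grey prediction step: the background-value
# weight alpha is exposed as a parameter, but the paper's rule for adapting
# alpha is not reproduced. Input values are illustrative energy readings.
import numpy as np

def gm11_predict(x0, steps=1, alpha=0.5):
    """Fit GM(1,1) to the series x0 and forecast `steps` future values."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated series (AGO)
    z1 = alpha * x1[1:] + (1 - alpha) * x1[:-1]         # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]         # developing/grey coefficients
    n = len(x0)
    def x1_hat(k):                                      # accumulated forecast at step k
        return (x0[0] - b / a) * np.exp(-a * k) + b / a
    preds = [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]
    return np.array(preds)

energy = [98.0, 95.5, 93.2, 91.0, 88.9, 86.9]           # remaining energy (%)
print(gm11_predict(energy, steps=3))
```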

Keywords: Adaptive Alpha GM(1, 1) Model, Energy Map, Prediction Based Data Reduction, Wireless Sensor Networks

4104 Prioritization of Mutation Test Generation with Centrality Measure

Authors: Supachai Supmak, Yachai Limpiyakorn

Abstract:

Mutation testing can be applied for the quality assessment of test cases. Prioritization of mutation test generation has been a critical element of industry practice that contributes to the evaluation of test cases. Industry generally delivers the product under time-to-market pressure and thus inevitably sacrifices software testing tasks, even though many test cases are required for software verification. This paper presents an approach that applies a social network centrality measure, PageRank, to prioritize mutation test generation. The source code with the highest PageRank values is focused on first when developing test cases, as these modules are vulnerable to defects or anomalies which may cause consequent defects in many other associated modules. Moreover, the approach helps identify the reducible test cases in the test suite while still maintaining the same criteria as the original number of test cases.
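A minimal sketch of the ranking step: PageRank computed by power iteration over a small, invented module-dependency graph, with the highest-ranked modules prioritized for mutation test generation.

```python
# Minimal sketch: ranking modules of a dependency graph with PageRank via
# power iteration, then prioritizing mutation test generation for the highest
# ranked modules. The graph below is invented for illustration.
import numpy as np

def pagerank(adj, damping=0.85, iters=100):
    """adj[i][j] = 1 if module i references module j."""
    n = len(adj)
    A = np.array(adj, dtype=float)
    out_deg = A.sum(axis=1, keepdims=True)
    out_deg[out_deg == 0] = 1.0           # dangling rows stay zero; renormalized below
    M = A / out_deg                       # row-stochastic transition matrix
    r = np.ones(n) / n
    for _ in range(iters):
        r = (1 - damping) / n + damping * (M.T @ r)
    return r / r.sum()

modules = ["parser", "core", "io", "utils"]
adj = [[0, 1, 0, 1],     # parser -> core, utils
       [0, 0, 1, 1],     # core   -> io, utils
       [0, 1, 0, 0],     # io     -> core
       [0, 0, 0, 0]]     # utils  -> (nothing)
ranks = pagerank(adj)
priority = sorted(zip(modules, ranks), key=lambda p: -p[1])
print(priority)          # generate mutation tests for top-ranked modules first
```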

Keywords: Software testing, mutation test, network centrality measure, test case prioritization.

4103 A Discrete-Event-Simulation Approach for Logistic Systems with Real Time Resource Routing and VR Integration

Authors: Gerrit Alves, Jürgen Roßmann, Roland Wischnewski

Abstract:

Today, transport and logistic systems are often tightly integrated into production. Lean production and just-in-time delivery create multiple constraints that have to be fulfilled. As transport networks have often evolved over time, they are very expensive to change. This paper describes a discrete-event simulation system which simulates transportation models using real-time resource routing and collision avoidance. It allows for the specification of custom control algorithms and the validation of new strategies. The simulation is integrated into a virtual reality (VR) environment and can be displayed in 3-D to show the progress. Simulation elements can be selected through VR metaphors. All data gathered during the simulation can be presented as a detailed summary afterwards. The included cost-benefit calculation can help to optimize the financial outcome. The operation of this approach is shown by the example of a timber harvest simulation.
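A minimal sketch of a discrete-event simulation core, assuming a simple time-ordered event queue; real-time resource routing, collision avoidance and the VR integration described above are far beyond this illustration.

```python
# Minimal sketch of a discrete-event simulation core: a time-ordered event
# queue and handlers that schedule follow-up events. Transport routing,
# collision avoidance and the VR front end are not reproduced here.
import heapq

class Simulation:
    def __init__(self):
        self.clock = 0.0
        self._queue = []            # (time, sequence, handler, payload)
        self._seq = 0

    def schedule(self, delay, handler, payload=None):
        heapq.heappush(self._queue,
                       (self.clock + delay, self._seq, handler, payload))
        self._seq += 1

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.clock, _, handler, payload = heapq.heappop(self._queue)
            handler(self, payload)

def truck_arrives(sim, truck_id):
    print(f"t={sim.clock:5.1f}  truck {truck_id} arrives at depot")
    sim.schedule(4.0, truck_departs, truck_id)      # loading takes 4 time units

def truck_departs(sim, truck_id):
    print(f"t={sim.clock:5.1f}  truck {truck_id} departs")

sim = Simulation()
for i, t in enumerate([0.0, 1.5, 3.0]):
    sim.schedule(t, truck_arrives, i)
sim.run(until=20.0)
```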

Keywords: Discrete-Event-Simulation, Logistic, Simulation, Virtual Reality.
