Search results for: weighted goal programming.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1633

1003 Physical Properties and Stability of Emulsions as Affected by Native and Modified Yam Starches

Authors: Nor Hayati Ibrahim, Shamini Nair Achudan

Abstract:

This study was conducted in order to determine the physical properties and stability of mayonnaise-like emulsions as affected by modified yam starches. Native yam starch was modified via pre-gelatinization and cross-linking phosphorylation procedures. The emulsions (50% oil dispersed phase) were prepared with 0.3% native potato, native yam, pre-gelatinized yam, and cross-linked phosphorylated yam starches. The surface-weighted mean droplet diameter was found to be significantly (p < 0.05) lower in the sample with cross-linked phosphorylated yam starch than in the other samples. Moreover, the viscosity of the sample with pre-gelatinized yam starch was higher than that of the other samples. The phase separation stability was low in the freshly prepared and stored (45 days, 5°C) emulsions containing native yam starch. This study thus suggests that modified yam starches, offering better physical properties and stability, are more suitable than native yam starch as stabilizers in similar systems such as light mayonnaise.
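
For reference, the surface-weighted mean diameter reported in such droplet-size analyses is commonly the Sauter mean diameter D[3,2]; below is a minimal sketch of that definition with invented diameters, not the study's measured data (which a laser-diffraction instrument would provide).

```python
def sauter_mean_diameter(diameters):
    """Surface-weighted (Sauter) mean diameter D[3,2] of a droplet population:
    D[3,2] = sum(d_i^3) / sum(d_i^2), with equal number weighting per droplet."""
    d3 = sum(d ** 3 for d in diameters)
    d2 = sum(d ** 2 for d in diameters)
    return d3 / d2

# Hypothetical droplet diameters in micrometres
print(sauter_mean_diameter([2.1, 3.4, 5.0, 4.2, 2.8]))
```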

Keywords: Oil-in-water emulsions, low-fat mayonnaises, modified yam starches, droplet size distribution, viscosity.

1002 Role of GIS in Distribution Power Systems

Authors: N. Rezaee, M. Nayeripour, A. Roosta, T. Niknam

Abstract:

With the prevalence of computers and the development of information technology, Geographic Information Systems (GIS) have long been used for a variety of applications in electrical engineering. GIS are designed to support the analysis, management, manipulation and mapping of spatial data. This paper presents several uses of GIS in power utilities, such as automated route selection for the construction of new power lines, which uses a dynamic programming model for route optimization, as well as load forecasting and optimal planning of substation location and capacity with a comprehensive algorithm that involves an accurate small-area electric load forecasting procedure and simulates the different cost functions of substations.
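
As a toy illustration of a dynamic programming route optimizer of the kind mentioned here, the sketch below finds a minimum-cost route over an invented per-cell cost raster, under the simplifying assumption that the route only moves right or down; it is not the paper's model.

```python
def min_cost_route(cost):
    """Dynamic programming: cheapest route from the top-left to the bottom-right
    of a cost raster, moving only right or down (a simplifying assumption)."""
    rows, cols = len(cost), len(cost[0])
    dp = [[0.0] * cols for _ in range(rows)]
    dp[0][0] = cost[0][0]
    for j in range(1, cols):          # first row: only reachable from the left
        dp[0][j] = dp[0][j - 1] + cost[0][j]
    for i in range(1, rows):          # first column: only reachable from above
        dp[i][0] = dp[i - 1][0] + cost[i][0]
    for i in range(1, rows):
        for j in range(1, cols):
            dp[i][j] = cost[i][j] + min(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

# Hypothetical per-cell construction costs derived from GIS layers (terrain, land use, ...)
grid = [[1, 3, 1],
        [2, 8, 2],
        [5, 3, 1]]
print(min_cost_route(grid))  # -> 8
```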

Keywords: Geographic information systems (GIS), optimal location and capacity, power distribution planning, route selection, spatial load forecasting.

1001 Multi Objective Micro Genetic Algorithm for Combine and Reroute Problem

Authors: Soottipoom Yaowiwat, Manoj Lohatepanont, Proadpran Punyabukkana

Abstract:

Several approaches such as linear programming, network modeling, greedy heuristics and decision support systems are well known for solving the irregular airline operations problem. This paper presents an alternative approach based on a Multi Objective Micro Genetic Algorithm. The aim of this research is to introduce the concept of the Multi Objective Micro Genetic Algorithm as a tool to solve the irregular airline operation, combine and reroute problem. The experimental results indicated that the model could obtain optimal solutions within a few seconds.

Keywords: Irregular Airline Operation, Combine and Reroute Routine, Genetic Algorithm, Micro Genetic Algorithm, Multi Objective Optimization, Evolutionary Algorithm.

1000 A Heuristic Based Conceptual Framework for Product Innovation

Authors: Amalia Suzianti

Abstract:

This research elaborates decision models for the early phases of product innovation, focusing on one of the most widely implemented methods in marketing research, conjoint analysis, and the related conjoint-based models, with special focus on heuristic programming techniques for the development of optimal product innovations. The concept, potential, requirements and limitations of conjoint analysis and its conjoint-based heuristic successors are analysed, and the development of a conceptual framework for the Genetic Algorithm (GA), one of the most widely implemented heuristic methods for developing product innovations, is discussed.
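
As a rough sketch of how a genetic algorithm can search conjoint part-worth utilities for a high-utility product concept: the attributes, levels, part-worths and GA settings below are invented for illustration and are not the paper's framework.

```python
import random

# Hypothetical part-worth utilities from a conjoint study: attribute -> utility per level
PART_WORTHS = {
    "price":   [0.9, 0.4, -0.6],
    "design":  [-0.2, 0.5, 0.7],
    "battery": [0.1, 0.3, 0.8],
}
ATTRS = list(PART_WORTHS)

def utility(profile):
    """Total conjoint utility of a product profile (one level index per attribute)."""
    return sum(PART_WORTHS[a][lvl] for a, lvl in zip(ATTRS, profile))

def genetic_search(pop_size=20, generations=50, mut_rate=0.2):
    pop = [[random.randrange(3) for _ in ATTRS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=utility, reverse=True)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(ATTRS))      # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut_rate:             # mutation: re-draw one level
                child[random.randrange(len(ATTRS))] = random.randrange(3)
            children.append(child)
        pop = parents + children
    return max(pop, key=utility)

best = genetic_search()
print(best, utility(best))
```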

Keywords: Product Innovation, Conjoint Analysis, Heuristic Model, Genetic Algorithm

999 Automated ECG Segmentation Using Piecewise Derivative Dynamic Time Warping

Authors: Ali Zifan, Sohrab Saberi, Mohammad Hassan Moradi, Farzad Towhidkhah

Abstract:

Electrocardiogram (ECG) segmentation is necessary to help reduce the time-consuming task of manually annotating ECGs. Several algorithms have been developed to segment the ECG automatically. We first review several such methods, and then present a new single-lead segmentation method based on Adaptive Piecewise Constant Approximation (APCA) and Piecewise Derivative Dynamic Time Warping (PDDTW). The results are tested on the QT database and compared to Laguna's two-lead method. Our proposed approach has a comparable mean error, but yields a slightly higher standard deviation than Laguna's method.
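
The dynamic-programming core on which both DTW and the derivative variant (PDDTW) rest can be sketched as follows; the derivative preprocessing and the APCA step of the paper are not reproduced, and the toy beat samples are invented.

```python
import math

def dtw_distance(x, y):
    """Classic dynamic time warping between two 1-D sequences.
    PDDTW would first replace the raw samples by local derivative estimates."""
    n, m = len(x), len(y)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # Optimal warping path: extend the cheapest of the three predecessors
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# Toy beats: a template beat vs. a slightly time-stretched version
template = [0, 0, 1, 5, 1, 0, 0]
observed = [0, 0, 0, 1, 5, 5, 1, 0]
print(dtw_distance(template, observed))
```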

Keywords: Adaptive Piecewise Constant Approximation, Dynamic programming, ECG segmentation, Piecewise Derivative Dynamic Time Warping.

998 Digital Library Evaluation by SWARA-WASPAS Method

Authors: Mehmet Yörükoğlu, Serhat Aydın

Abstract:

Since the advent of the manuscript, mechanical methods for storing, transferring and using information have evolved over time into digital methods. In this process, libraries, as centers of information, have also become digitized and, by taking on a structure with no physical boundaries, have become accessible from anywhere in the world at any time. In this context, certain criteria for information obtained from digital libraries have become more important for users. This paper evaluates, from different perspectives, the user criteria that make a digital library more useful. The Step-Wise Weight Assessment Ratio Analysis-Weighted Aggregated Sum Product Assessment (SWARA-WASPAS) method is used, with its flexibility and easy calculation steps, for the evaluation of digital library criteria. Three digital libraries are evaluated by information technology experts according to five conflicting main criteria: ‘interface design’, ‘effects on users’, ‘services’, ‘user engagement’ and ‘context’. Finally, the alternatives are ranked in descending order.
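
A minimal sketch of the WASPAS aggregation step is shown below; the SWARA-derived weights, expert scores, and the λ value are placeholders, not the paper's data.

```python
def waspas_scores(scores, weights, lam=0.5):
    """WASPAS: Q_i = lam * (weighted sum) + (1 - lam) * (weighted product),
    computed on max-normalized benefit-type criteria."""
    maxima = [max(vals) for vals in zip(*scores.values())]
    result = {}
    for alt, vals in scores.items():
        norm = [v / m for v, m in zip(vals, maxima)]
        wsm = sum(w * x for w, x in zip(weights, norm))
        wpm = 1.0
        for w, x in zip(weights, norm):
            wpm *= x ** w
        result[alt] = lam * wsm + (1 - lam) * wpm
    return result

# Placeholder expert scores for three digital libraries on the five main criteria
# (interface design, effects on users, services, user engagement, context)
scores = {"DL-A": [7, 8, 6, 9, 7], "DL-B": [9, 6, 8, 7, 8], "DL-C": [6, 7, 9, 6, 9]}
weights = [0.30, 0.25, 0.20, 0.15, 0.10]   # placeholder SWARA-derived weights (sum to 1)
for alt, q in sorted(waspas_scores(scores, weights).items(), key=lambda kv: -kv[1]):
    print(alt, round(q, 3))
```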

Keywords: Digital library, multi criteria decision making, SWARA-WASPAS method.

997 An Engineering Approach to Forecast Volatility of Financial Indices

Authors: Irwin Ma, Tony Wong, Thiagas Sankar

Abstract:

By systematically applying different engineering methods, difficult financial problems become approachable. Using a combination of theory and techniques such as wavelet transforms, time series data mining, Markov chain based discrete stochastic optimization, and evolutionary algorithms, this work formulated a strategy to characterize and forecast non-linear time series. It attempted to extract typical features from the volatility data sets of the S&P100 and S&P500 indices, which include abrupt drops, jumps and other non-linearities. As a result, forecasting accuracy reached an average of over 75%, surpassing any other publicly available results on the forecasting of any financial index.

Keywords: Discrete stochastic optimization, genetic algorithms, genetic programming, volatility forecast

996 Evaluation of Linear and Geometrically Nonlinear Static and Dynamic Analysis of Thin Shells by Flat Shell Finite Elements

Authors: Djamel Boutagouga, Kamel Djeghaba

Abstract:

The choice of which finite element to use in order to predict the nonlinear static or dynamic response of complex structures is an important factor. The main goal of this research work is therefore to study the effect of the in-plane rotational degrees of freedom in linear and geometrically nonlinear static and dynamic analysis of thin shell structures by flat shell finite elements. For this purpose: First, simple triangular and quadrilateral flat shell finite elements are implemented in an incremental formulation based on the updated Lagrangian corotational description for geometrically nonlinear analysis. The triangular element is a combination of the DKT and CST elements, while the quadrilateral is a combination of the DKQ and the bilinear quadrilateral membrane element. In both elements, the sixth degree of freedom is handled by introducing a fictitious stiffness. Secondly, in the same code, the sixth degree of freedom in these elements is handled differently, with the in-plane rotational d.o.f. considered as an effective d.o.f. in the in-plane field interpolation. Our goal is to compare the resulting shell elements. Third, the analysis is extended to linear dynamic analysis by direct integration using Newmark's implicit method. Finally, the linear dynamic analysis is extended to geometrically nonlinear dynamic analysis, where Newmark's method is used to integrate the equations of motion and the Newton-Raphson method is employed to iterate within each time step increment until equilibrium is achieved. The obtained results demonstrate the effectiveness and robustness of the interpolation of the in-plane rotational d.o.f. and reveal deficiencies of using a fictitious stiffness in dynamic linear and nonlinear analysis.
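
As an illustration of the time integration step only, here is a minimal Newmark sketch for a linear single-degree-of-freedom system with the average-acceleration parameters (β = 1/4, γ = 1/2); the shell finite element formulation and the Newton-Raphson loop of the paper are not reproduced, and the usage figures are hypothetical.

```python
def newmark_sdof(m, c, k, force, u0=0.0, v0=0.0, dt=0.01, steps=1000,
                 beta=0.25, gamma=0.5):
    """Implicit Newmark time integration for a linear single-DOF system
    m*a + c*v + k*u = f(t) (average-acceleration variant, unconditionally stable)."""
    u, v = u0, v0
    a = (force(0.0) - c * v - k * u) / m          # initial acceleration
    keff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    history = [u]
    for n in range(1, steps + 1):
        t = n * dt
        feff = (force(t)
                + m * (u / (beta * dt ** 2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
                + c * (gamma / (beta * dt) * u + (gamma / beta - 1.0) * v
                       + dt * (0.5 * gamma / beta - 1.0) * a))
        u_new = feff / keff
        a_new = ((u_new - u) / (beta * dt ** 2) - v / (beta * dt)
                 - (0.5 / beta - 1.0) * a)
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new
        history.append(u)
    return history

# Free vibration of a hypothetical unit mass-spring system released from u0 = 1
resp = newmark_sdof(m=1.0, c=0.1, k=100.0, force=lambda t: 0.0, u0=1.0, steps=500)
print(min(resp), max(resp))
```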

Keywords: Flat shell, dynamic analysis, nonlinear, Newmark, drilling rotation.

995 Implementing a Database from a Requirement Specification

Authors: M. Omer, D. Wilson

Abstract:

Creating a database schema is essentially a manual process. From a requirement specification, the information contained within has to be analyzed and reduced into a set of tables, attributes and relationships. This is a time-consuming process that has to go through several stages before an acceptable database schema is achieved. The purpose of this paper is to implement a Natural Language Processing (NLP) based tool to produce a relational database from a requirement specification. Stanford CoreNLP version 3.3.1 and the Java programming language were used to implement the proposed model. The outcome of this study indicates that a first draft of a relational database schema can be extracted from a requirement specification by using NLP tools and techniques with minimum user intervention. Therefore, this method is a step forward in finding a solution that requires little or no user intervention.
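
The mapping step can be pictured, very roughly, as turning extracted relation triples into a draft schema. The sketch below assumes the triples have already been produced by an NLP pipeline (it does not call Stanford CoreNLP), and the one-table-per-entity / foreign-key heuristic is an invented simplification, not the paper's actual rules.

```python
from collections import defaultdict

def draft_schema(triples):
    """Very rough first-draft relational schema from (subject, relation, object)
    triples of the kind a relation-extraction pipeline could produce from a
    requirement specification. Heuristic (invented for illustration): every
    entity becomes a table with its own id column, and each relation is assumed
    to be one-to-many, so the object table receives a foreign key to the subject."""
    tables = defaultdict(set)
    for subj, _rel, obj in triples:
        tables[subj].add(f"{subj}_id")
        tables[obj].add(f"{obj}_id")
        tables[obj].add(f"{subj}_id")     # naive foreign key: object references subject
    ddl = []
    for name, cols in sorted(tables.items()):
        cols_sql = ", ".join(f"{c} INTEGER" for c in sorted(cols))
        ddl.append(f"CREATE TABLE {name} ({cols_sql});")
    return "\n".join(ddl)

# Hypothetical triples extracted from requirement sentences such as
# "A customer places orders" and "An order contains products".
triples = [("customer", "places", "order"), ("order", "contains", "product")]
print(draft_schema(triples))
```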

Keywords: Information Extraction, Natural Language Processing, Relation Extraction.

994 Determinate Fuzzy Set Ranking Analysis for Combat Aircraft Selection with Multiple Criteria Group Decision Making

Authors: C. Ardil

Abstract:

With the aid of the Hausdorff distance function and the Minkowski distance function, this study proposes a novel method for selecting combat aircraft for an air force. To this end, the proximity measure method was developed with determinate fuzzy degrees based on the relationship between attributes and combat aircraft alternatives. The combat aircraft selection attributes were identified as payloadability, maneuverability, speedability, stealthability, and survivability. Determinate fuzzy data from the combat aircraft attributes were then aggregated using the determinate fuzzy weighted arithmetic average operator. Correlation analysis of the ranking order patterns of the options was also examined for the selection of combat aircraft. A numerical example from military aviation is used to demonstrate the applicability and effectiveness of the proposed method.

Keywords: Combat aircraft selection, multiple criteria decision making, fuzzy sets, determinate fuzzy sets, intuitionistic fuzzy sets, proximity measure method, Hausdorff distance function, Minkowski distance function, PMM, MCDM

993 An Application for Web Mining Systems with Services Oriented Architecture

Authors: Thiago M. R. Dias, Gray F. Moita, Paulo E. M. Almeida

Abstract:

Although the World Wide Web is considered the largest source of information in existence today, due to its inherently dynamic characteristics the task of finding useful and qualified information can become a very frustrating experience. This study presents research on information mining systems for the Web and proposes an implementation of these systems by means of components that can be built using Web services technology. This implies that they can encompass features offered by a service-oriented architecture (SOA) and that specific components may be used by other tools, independently of platforms or programming languages. Hence, the main objective of this work is to provide an architecture for Web mining systems, divided into stages, where each step is a component that incorporates the characteristics of SOA. The separation of these steps was designed based upon the existing literature. Interesting results were obtained and are shown here.

Keywords: Web Mining, Service Oriented Architecture, Web Services.

992 Possibilistic Aggregations in the Investment Decision Making

Authors: I. Khutsishvili, G. Sirbiladze, B. Ghvaberidze

Abstract:

This work proposes a fuzzy methodology to support investment decisions. When choosing among competing investment projects, the methodology ranks the projects using a new aggregation OWA operator, AsPOWA, presented in the environment of possibility uncertainty. For the numerical evaluation of the weighting vector associated with the AsPOWA operator, a mathematical programming problem is constructed. On the basis of the AsPOWA operator, a maximum criterion for the group ranking of projects is constructed. The methodology also allows the most profitable investments to be made in several of the projects, using the method developed by the authors for discrete possibilistic bicriteria problems. The article provides an example of investment decision-making that illustrates how the proposed methodology works.
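
AsPOWA builds on the ordinary OWA operator; a minimal sketch of plain OWA aggregation is shown below. The possibilistic extension and the weight-optimization program of the paper are not reproduced, and the evaluation values and weights are placeholders.

```python
def owa(values, weights):
    """Ordered Weighted Averaging: sort the arguments in descending order and
    take the dot product with the weighting vector (weights sum to 1)."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("OWA weights must sum to 1")
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Hypothetical expert evaluations of one investment project and a weighting vector
print(owa([0.6, 0.9, 0.7, 0.4], [0.4, 0.3, 0.2, 0.1]))   # -> 0.73
```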

Keywords: Expert evaluations, investment decision making, OWA operator, possibility uncertainty.

991 Classification of Soil Aptness for the Establishment of Panicum virgatum in Mississippi using Sensitivity Analysis and GIS

Authors: Eduardo F. Arias, William Cooke III, Zhaofei Fan, William Kingery

Abstract:

During the last decade, Panicum virgatum, known as switchgrass, has been broadly studied because of its remarkable attributes as a substitute pasture and as a functional biofuel source. The objective of this investigation was to establish soil suitability for switchgrass in the State of Mississippi. A linear weighted additive model was developed to forecast soil suitability. Multicriteria analysis and sensitivity analysis were utilized to adjust and optimize the model. The model was fit using seven years of field data associated with soil characteristics collected from the Natural Resources Conservation Service - United States Department of Agriculture (NRCS-USDA). The best model was selected by correlating calculated biomass yield with each model's soils-based output for switchgrass suitability. The coefficient of determination (r2) was the decisive factor used to establish the 'best' soil suitability model. Coefficients associated with the 'best' model were implemented within a Geographic Information System (GIS) to create a map of relative soil suitability for switchgrass in Mississippi. A geodatabase of soil parameters was built and is available for future Geographic Information System use.

Keywords: Aptness, GIS, sensitivity analysis, switchgrass, soil.

990 A Development of a Simulation Tool for Production Planning with Capacity-Booking at Specialty Store Retailer of Private Label Apparel Firms

Authors: Erika Yamaguchi, Sirawadee Arunyanrt, Shunichi Ohmori, Kazuho Yoshimoto

Abstract:

In this paper, we propose a simulation tool to support monthly production planning decisions that maximize the profit of specialty store retailer of private label apparel (SPA) firms. Most SPA firms are fabless and outsource production to the factories of their subcontractors. Every month, SPA firms book production lines and manpower in the factories. The booking is made a few months in advance, based on the demand prediction and the monthly production plan available at that time. However, the demand prediction is updated month by month, and the monthly production plan changes to meet the latest prediction. SPA firms therefore have to adjust the capacities initially booked, within a certain range, to suit the monthly production plan; this booking system is called "capacity-booking". Although making precise monthly production plans is an important issue for SPA firms, many firms still plan production by empirical rules. In addition, it is also a challenge for SPA firms to match their products and factories while considering demand predictability and the ability to regulate booked capacity. In this paper, we propose a model that addresses these two issues. The objective is to maximize the total profit over a given planning horizon, defined as sales minus the costs of production, inventory, and capacity-booking penalties. To improve monthly production planning at SPA firms, the following points should be considered: demand predictability under random trends, the production plans of the months before and after the target month, and the regulation ability of the capacity-booking. To match products and factories for outsourcing, it is important to consider the seasonality, volume, and predictability of each product, and the production capability, size, and regulation ability of each factory. SPA firms have to take these constraints into account and may place orders with several factories per product. We modeled these issues as a linear program. To validate the model, computational experiments with an SPA firm are presented, assuming four typical product groups: basic, seasonal (spring/summer), seasonal (fall/winter), and spot products. The experiments produced a monthly production plan in which the demand uncertainty arising from random trends is reduced by producing products of different types, and priority is given to high-margin products. In conclusion, we developed a simulation tool to support monthly production planning decisions, which is useful when the production plan is revised every month. We considered the features of capacity-booking and the matching of products and factories that have different features and conditions.
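
As a toy illustration of the kind of linear program involved (not the paper's formulation), the sketch below plans two hypothetical product groups at one factory whose booked capacity may be regulated within ±20% of the original booking; all figures are invented.

```python
from scipy.optimize import linprog

# Toy monthly planning instance (all numbers are placeholders, not the paper's data):
# two product groups produced at one subcontractor factory whose booked capacity
# may be regulated within +/-20% of the originally booked hours.
margins = [12.0, 8.0]          # profit per unit (sales price minus unit cost)
hours = [2.0, 1.0]             # factory hours needed per unit
demand = [300.0, 500.0]        # latest demand prediction (upper bound on sales)
booked = 900.0                 # originally booked capacity in hours

c = [-m for m in margins]                       # linprog minimizes, so negate profit
A_ub = [hours, [-h for h in hours]]             # total hours <= 1.2*booked and >= 0.8*booked
b_ub = [1.2 * booked, -0.8 * booked]
bounds = [(0, d) for d in demand]               # cannot produce more than predicted demand

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print(res.x, -res.fun)          # production quantities and total profit
```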

Keywords: Capacity-booking, SPA, monthly production planning, linear programming.

989 Joint Use of Factor Analysis (FA) and Data Envelopment Analysis (DEA) for Ranking of Data Envelopment Analysis

Authors: Reza Nadimi, Fariborz Jolai

Abstract:

This article combines two techniques, data envelopment analysis (DEA) and factor analysis (FA), for data reduction across decision making units (DMUs). DEA, a popular linear programming technique, is useful for comparatively rating the operational efficiency of DMUs based on their deterministic (not necessarily stochastic) input-output data, while factor analysis has been proposed as a data reduction and classification technique that can be applied within DEA to reduce the input-output data. Numerical results reveal that the new approach shows good consistency in ranking with DEA.
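
A minimal sketch of the DEA building block, the input-oriented CCR multiplier model solved as a linear program per DMU, is shown below with toy data; the factor analysis preprocessing step is not reproduced.

```python
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency (multiplier form) of DMU `o`.
    Variables: output weights u, then input weights v.
    max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0."""
    n = len(inputs)                       # number of DMUs
    s, m = len(outputs[0]), len(inputs[0])
    c = [-y for y in outputs[o]] + [0.0] * m              # maximize u.y_o
    A_ub = [list(outputs[j]) + [-x for x in inputs[j]] for j in range(n)]
    b_ub = [0.0] * n
    A_eq = [[0.0] * s + list(inputs[o])]                   # normalisation v.x_o = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

# Toy data: 4 DMUs, 2 inputs, 1 output
X = [[4, 3], [7, 3], [8, 1], [4, 2]]
Y = [[1], [1], [1], [1]]
for j in range(len(X)):
    print(f"DMU {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")
```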

Keywords: Effectiveness, Decision Making, Data Envelopment Analysis, Factor Analysis

988 Limit State of Heterogeneous Smart Structures under Unknown Cyclic Loading

Authors: M. Chen, S-Q. Zhang, X. Wang, D. Tate

Abstract:

This paper presents a numerical solution, namely limit and shakedown analysis, to predict the safety state of smart structures made of heterogeneous materials under unknown cyclic loadings, for instance, the flexure hinge in a micro-positioning stage driven by a piezoelectric actuator. By combining homogenization theory and the finite element method (FEM), the safety evaluation problem is converted into a large-scale nonlinear optimization program that determines an acceptable bounded loading as the design reference. Furthermore, a general numerical scheme integrating the FEM with an interior-point-algorithm-based optimization tool is developed, which makes practical application possible.

Keywords: Limit state, shakedown analysis, homogenization, heterogeneous structure.

987 ReSeT: Reverse Engineering System Requirements Tool

Authors: Rosziati Ibrahim, Tiu Kian Yong

Abstract:

Reverse engineering is a very important process in software engineering. It works backwards through the system development life cycle (SDLC) in order to recover the source data or representations of a system through analysis of its structure, function and operation. We use reverse engineering to introduce an automatic tool that generates system requirements from program source code. The tool accepts Cµ source code, scans it line by line, and passes it to a parser. The engine of the tool is then able to generate system requirements for that specific program, to facilitate reuse and enhancement of the program. The purpose of producing the tool is to help recover the system requirements of a system when the system requirements document (SRD) does not exist because the system is undocumented.

Keywords: System Requirements, Reverse Engineering, Source Codes.

986 Image Contrast Enhancement based Sub-histogram Equalization Technique without Over-equalization Noise

Authors: Hyunsup Yoon, Youngjoon Han, Hernsoo Hahn

Abstract:

In order to enhance the contrast in regions where pixels have similar intensities, this paper presents a new histogram equalization scheme. Conventional global equalization schemes over-equalize these regions, resulting in pixels that are too bright or too dark, while local equalization schemes produce unexpected discontinuities at the boundaries of the blocks. The proposed algorithm segments the original histogram into sub-histograms with reference to brightness level and equalizes each sub-histogram within a limited extent of equalization, considering its mean and variance. The final image is determined as the weighted sum of the equalized images obtained from the sub-histogram equalizations. By limiting the maximum and minimum ranges of the equalization operations on individual sub-histograms, the over-equalization effect is eliminated. The resulting image also does not lose feature information in low-density histogram regions, since these remaining regions are equalized separately. The paper also describes how to determine the segmentation points in the histogram. The proposed algorithm has been tested on more than 100 images with various contrasts, and the results are compared to conventional approaches to show its superiority.
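
The central idea, equalizing each sub-histogram only within its own intensity range, can be illustrated with a simplified two-segment version that splits at the image mean; the paper's segmentation point selection, mean/variance-based limits, and weighted recombination are not reproduced, and the test image is synthetic.

```python
import numpy as np

def bi_histogram_equalize(img):
    """Split the 8-bit grey-level histogram at the image mean and equalize each
    sub-histogram only within its own intensity range, so dark and bright
    regions are not pushed toward the opposite extreme (a simplified variant
    of sub-histogram equalization)."""
    out = np.empty_like(img)
    mean = int(img.mean())
    for lo, hi, mask in [(0, mean, img <= mean), (mean + 1, 255, img > mean)]:
        vals = img[mask]
        if vals.size == 0:
            continue
        hist, _ = np.histogram(vals, bins=hi - lo + 1, range=(lo, hi + 1))
        cdf = hist.cumsum() / vals.size
        # Map the sub-range onto itself (limited extent of equalization)
        out[mask] = lo + np.round(cdf[vals - lo] * (hi - lo)).astype(img.dtype)
    return out

# Hypothetical low-contrast image patch
img = np.random.randint(90, 140, size=(64, 64), dtype=np.uint8)
eq = bi_histogram_equalize(img)
print(img.min(), img.max(), "->", eq.min(), eq.max())
```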

Keywords: Contrast Enhancement, Histogram Equalization, Histogram Region Equalization, Equalization Noise

985 Using ε Value to Describe Regular Languages by Using Finite Automata, Operations on Languages and the Changing Algorithm Implementation

Authors: Abdulmajid Mukhtar Afat

Abstract:

This paper introduces nondeterministic finite automata with ε-values, which are used to perform some operations on languages. A program is created to implement the algorithm that converts a nondeterministic finite automaton with ε-values (ε-NFA) to a deterministic finite automaton (DFA). The program is written in the C++ programming language. The program reads FA 5-tuples from a text file and classifies the automaton as a DFA, an NFA, or an ε-NFA. For a DFA, the program takes a string w and decides whether it is accepted or rejected; the tracking path for an accepted string is saved by the program. In the case of an NFA or ε-NFA, the program converts the automaton to a DFA to enable tracking and to decide whether the string w belongs to the regular language or not.
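
The conversion itself follows the standard ε-closure and subset construction; a minimal sketch is given below in Python (the paper's program is in C++), using an invented toy automaton rather than the paper's file format.

```python
from collections import deque

def eclose(states, eps):
    """ε-closure: all states reachable from `states` using only ε-moves."""
    closure, stack = set(states), list(states)
    while stack:
        q = stack.pop()
        for r in eps.get(q, ()):
            if r not in closure:
                closure.add(r)
                stack.append(r)
    return frozenset(closure)

def nfa_eps_to_dfa(start, accept, delta, eps, alphabet):
    """Subset construction from an ε-NFA (delta: (state, symbol) -> set of states,
    eps: state -> set of ε-successors) to a DFA whose states are frozensets."""
    dfa_start = eclose({start}, eps)
    dfa_delta, dfa_accept = {}, set()
    queue, seen = deque([dfa_start]), {dfa_start}
    while queue:
        S = queue.popleft()
        if S & accept:
            dfa_accept.add(S)
        for a in alphabet:
            moved = set()
            for q in S:
                moved |= delta.get((q, a), set())
            T = eclose(moved, eps)
            dfa_delta[(S, a)] = T
            if T not in seen:
                seen.add(T)
                queue.append(T)
    return dfa_start, dfa_accept, dfa_delta

# Toy ε-NFA over {a, b} accepting strings that end in "ab" (no ε-moves here)
delta = {(0, "a"): {0, 1}, (0, "b"): {0}, (1, "b"): {2}}
start, accepting, dfa_delta = nfa_eps_to_dfa(0, {2}, delta, {}, "ab")

def accepts(w):
    S = start
    for ch in w:
        S = dfa_delta[(S, ch)]
    return S in accepting

print(accepts("aab"), accepts("abb"))   # True False
```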

Keywords: Finite automata, DFA, NFA, ε-NFA, Eclose, operations on languages.

984 Study of EEGs from Somatosensory Cortex and Alzheimer's Disease Sources

Authors: Md R. Bashar, Yan Li, Peng Wen

Abstract:

This study investigates the electroencephalogram (EEG) differences generated from normal and Alzheimer's disease (AD) sources. We also investigate the effects of brain tissue distortions due to AD on the EEG. We develop a realistic head model from T1-weighted magnetic resonance imaging (MRI) using the finite element method (FEM) for a normal source (the somatosensory cortex (SC) in the parietal lobe) and AD sources (the right amygdala (RA) and left amygdala (LA) in the medial temporal lobe). We then compare the AD-sourced EEGs to the SC-sourced EEG to study the nature of the potential changes due to the sources and to 5% to 20% brain tissue distortions. We find an average magnification error of 0.15 produced by AD-sourced EEGs. The different brain tissue distortion models generate a maximum magnification of 0.07. EEGs obtained from AD sources and different brain tissue distortion levels vary the scalp potentials relative to the normal source, and the electrodes residing in the parietal and temporal lobes are more sensitive than other electrodes for AD-sourced EEG.

Keywords: Alzheimer's disease (AD), brain tissue distortion, electroencephalogram, finite element method.

983 Level of Service Based Methodology for Municipal Infrastructure Management

Authors: Z. Khan, O. Moselhi, T. Zayed

Abstract:

The development of levels of service in a municipal context is a flexible vehicle to assist in performing quality-cost trade-off analysis for municipal services. This trade-off depends on the willingness of a community to pay as well as on the condition of the assets. The community's perspective on the performance of an asset, from a service point of view, may be quite different from the municipality's perspective on the performance of the same asset from a condition point of view. This paper presents a three-phase level-of-service-based methodology for water mains that consists of: 1) development of an analytical hierarchy model of level of service, 2) development of a fuzzy weighted sum model of the water main condition index, and 3) derivation of a fuzzy-logic-based function that maps level of service to the asset condition index. This mapping will assist asset managers in quantifying the condition improvement required to meet service goals and in making more informed decisions on interventions and related priorities.
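
As a generic illustration of the analytical hierarchy step, the sketch below derives priority weights and a consistency ratio from a pairwise comparison matrix; the judgements are placeholders, and the fuzzy weighted sum model and the mapping function of the paper are not reproduced.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise comparison matrix via the principal
    eigenvector, plus the consistency ratio (Saaty's random index for n = 3..5)."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                 # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                             # normalize weights to sum to 1
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # random index (Saaty)
    return w, ci / ri

# Hypothetical pairwise judgements for three level-of-service criteria
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(M)
print(np.round(w, 3), round(cr, 3))          # weights and consistency ratio
```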

Keywords: Asset Management, Level of Service, Condition Index, Analytical Hierarchy, Fuzzy Logic.

982 Data and Control Flow Analysis of VDMµ Specifications

Authors: Mubina Nazmeen, Iram Rubab

Abstract:

Formal specification languages are widely used for system specification and testing. Highly critical systems such as real-time systems, avionics, and medical systems are represented using formal specification languages. Formal-specification-based testing is mostly performed using black-box approaches, thus testing only the set of inputs and outputs of the system. A formal specification language such as VDMµ can be used for white-box testing, as it provides constructs comparable to those of any other high-level programming language. In this work, we perform data and control flow analysis of VDMµ class specifications. The proposed work is illustrated with a SavingAccount example.

Keywords: VDM-SL, VDMµ, data flow graph, control flow graph, testing, formal specification.

981 Hybrid Association Control Scheme and Load Balancing in Wireless LANs

Authors: Chutima Prommak, Airisa Jantaweetip

Abstract:

This paper presents a hybrid association control scheme that can maintain load balancing among access points in wireless LANs and can satisfy the quality of service requirements of multimedia traffic applications. The proposed model is mathematically described as a linear programming model. A simulation study and analysis were conducted in order to demonstrate the performance of the proposed hybrid load balancing and association control scheme. Simulation results show that the proposed scheme outperforms the other schemes in terms of the blocking percentage and the quality of the data transfer rate provided to multimedia and real-time applications.

Keywords: Association control, Load balancing, Wireless LANs

980 A 24-Bit, 8.1-MS/s D/A Converter for Audio Baseband Channel Applications

Authors: N. Ben Ameur, M. Loulou

Abstract:

This paper studies the high-level modelling and design of delta-sigma (ΔΣ) noise shapers for an audio Digital-to-Analog Converter (DAC), so as to eliminate the in-band Signal-to-Noise Ratio (SNR) degradation that accompanies a one-channel mismatch in the audio signal. The converter combines cascaded digital signal interpolation with a single-loop noise-shaping delta-sigma modulator with 5-bit quantizer resolution in the final stage. To reduce sensitivity to the DAC nonlinearities of the last stage, a second-order high-pass Data Weighted Averaging (R2DWA) scheme is introduced. This paper presents a MATLAB modelling approach for the proposed DAC architecture with low-distortion and swing-suppression integrator designs. The ΔΣ modulator can be configured as a 3rd-order design and allows 24-bit PCM at a sampling rate of 64 kHz for Digital Video Disc (DVD) audio applications. The modelling approach provides 139.38 dB of dynamic range for a 32 kHz signal band at a -1.6 dBFS input signal level.

Keywords: DVD-audio, DAC, Interpolator and Interpolation Filter, Single-Loop ΔΣ Modulation, R2DWA, Clock Jitter

979 Simultaneous Segmentation and Recognition of Arabic Characters in an Unconstrained On-Line Cursive Handwritten Document

Authors: Randa I. Elanwar, Mohsen A. Rashwan, Samia A. Mashali

Abstract:

The last two decades have witnessed advances in the development of Arabic character recognition (CR) systems. Arabic CR faces technical problems not encountered in any other language, which make Arabic CR systems achieve relatively low accuracy and retard their establishment as market products. We propose the basic stages towards a system that attacks the problem of recognizing online Arabic cursive handwriting. Rule-based methods are used to perform simultaneous segmentation and recognition of word portions in an unconstrained cursively handwritten document using dynamic programming. The output of these stages is a ranked list of possible decisions. A new technique for text line separation is also used.

Keywords: Arabic handwriting, character recognition, cursive handwriting, on-line recognition.

978 Thermal Stability of a Vertical SOI-Based Capacitorless One-Transistor DRAM with Trench-Body Structure

Authors: Po-Hsieh Lin, Jyi-Tsong Lin

Abstract:

A vertical SOI-based MOSFET with a trench body structure operated as a 1T DRAM cell at various temperatures has been studied and investigated. The device is evaluated at different operating temperatures to compare its performance, so that its thermal stability can be carefully assessed for future memory device applications. Based on the simulation, the vertical SOI-based MOSFET with a trench body structure demonstrates proper electrical characteristics and possesses a conspicuous kink effect at the various operating temperatures. Transient characteristics were also examined to show that its programming window values and retention time behaviors remain acceptable when the new 1T DRAM cell is operated at high temperature.

Keywords: SOI, 1T DRAM, thermal stability.

977 A Bayesian Classification System for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation and classification of institutional risk profiles supporting endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the setup of the most important risk factors. Subsequently, risk profiles employ a risk factor classifier and associated configurations to support digital preservation experts with a semi-automatic estimation of the endangerment group for file format risk profiles. Our goal is to make use of an expert knowledge base, acquired through a digital preservation survey, in order to detect preservation risks for a particular institution. Another contribution is support for the visualisation of risk factors for a required dimension of analysis. Using the naive Bayes method, the decision support system recommends to an expert the matching risk profile group for the previously selected institutional risk profile. The proposed methods improve the visibility of risk factor values and the quality of the digital preservation process. The presented approach is designed to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and the values of file format risk profiles. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector. The goal is to visualise particular dimensions of this vector for analysis by an expert and to define its profile group. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.
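
As a generic illustration of the naive Bayes recommendation step (not the paper's survey-derived knowledge base or factor set), here is a sketch using scikit-learn's GaussianNB with invented risk factor values and endangerment groups.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical training data: rows are institutional risk profiles described by
# numeric risk factor values; labels are endangerment groups assigned by experts.
X = np.array([[0.9, 0.8, 0.7],     # e.g. (format obsolescence, lack of tools, no migration plan)
              [0.2, 0.1, 0.3],
              [0.8, 0.6, 0.9],
              [0.1, 0.2, 0.2],
              [0.5, 0.5, 0.4]])
y = np.array(["high", "low", "high", "low", "medium"])

clf = GaussianNB().fit(X, y)

# Recommend an endangerment group for a newly defined institutional risk profile
new_profile = np.array([[0.7, 0.7, 0.6]])
print(clf.predict(new_profile)[0], clf.predict_proba(new_profile).round(2))
```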

Keywords: linked open data, information integration, digital libraries, data mining.

976 Qualitative Parametric Comparison of Load Balancing Algorithms in Parallel and Distributed Computing Environment

Authors: Amit Chhabra, Gurvinder Singh, Sandeep Singh Waraich, Bhavneet Sidhu, Gaurav Kumar

Abstract:

Decreases in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. One of the biggest issues in such systems is the development of effective techniques/algorithms for distributing the processes/load of a parallel program across multiple hosts to achieve goals such as minimizing execution time, minimizing communication delays, maximizing resource utilization and maximizing throughput. Substantive research using queuing analysis, and assuming job arrivals following a Poisson pattern, has shown that in a multi-host system the probability of one host being idle while another host has multiple jobs queued up can be very high. Such imbalances in system load suggest that performance can be improved either by transferring jobs from the currently heavily loaded hosts to the lightly loaded ones or by distributing the load evenly/fairly among the hosts. The algorithms known as load balancing algorithms help to achieve these goals. These algorithms fall into two basic categories: static and dynamic. Whereas static load balancing (SLB) algorithms take decisions regarding the assignment of tasks to processors at compile time, based on the average estimated values of process execution times and communication delays, dynamic load balancing (DLB) algorithms are adaptive to changing situations and take decisions at run time. The objective of this work is to identify qualitative parameters for the comparison of the above-said algorithms. In future, this work can be extended to develop an experimental environment to study these load balancing algorithms quantitatively, based on the comparative parameters.

Keywords: SLB, DLB, Host, Algorithm and Load.

975 Human Resource Management in the Innovation Activity in the Republic of Kazakhstan

Authors: A. T. Omarova, G. N. Nakipova

Abstract:

This article discusses the principles of object-oriented human capital development using a technology program. The article also covers the priorities of the strategy of industrial-innovative development of Kazakhstan under conditions of integration into the world community. The article outlines the tasks of human resource management in the implementation of industrial and innovative development, the particularities of Kazakhstan's theory of management staff, and the specificity of the Kazakhstan authorities. It also considers the factors affecting people in organizations and the mechanisms of HRM within organizations under the conditions of innovative development in Kazakhstan.

Keywords: Programming, management of human resources, innovation, investment, innovation process, HRD model, innovative development, integration, management, transformation, economic potential, competitiveness.

974 Sensitivity Computations of Time Relaxation Model with an Application in Cavity Computation

Authors: Monika Neda, Elena Nikonova

Abstract:

We present a numerical study of the sensitivity of the so-called time relaxation family of models of fluid motion with respect to the time relaxation parameter χ on the two-dimensional cavity problem. The goal of the study is to compute and compare the sensitivity of the model using the finite difference method (FDM) and the sensitivity equation method (SEM).
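
As a minimal illustration of the finite difference side of the comparison, the sketch below estimates du/dχ for an invented scalar quantity of interest; the sensitivity equation method would instead solve an auxiliary equation obtained by differentiating the governing equations with respect to χ.

```python
def model_output(chi):
    """Placeholder for a quantity of interest computed by the flow solver as a
    function of the time relaxation parameter chi (the actual model solves the
    time-relaxation Navier-Stokes system on the cavity)."""
    return 1.0 / (1.0 + 10.0 * chi)      # invented smooth dependence

def fd_sensitivity(f, chi, h=1e-4):
    """Central finite-difference estimate of du/dchi."""
    return (f(chi + h) - f(chi - h)) / (2.0 * h)

chi = 0.1
print(fd_sensitivity(model_output, chi))   # approx. -10/(1+10*chi)^2 = -2.5
```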

Keywords: Sensitivity, time relaxation, deconvolution, Navier-Stokes equations.
