Search results for: Heterogeneous Earliest Finish Time (HEFT) algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9221

4721 Multi-Objective Optimization of a Steam Turbine Stage

Authors: Alvise Pellegrini, Ernesto Benini

Abstract:

The design of a steam turbine is a very complex engineering task that can be simplified and improved thanks to computer-aided multi-objective optimization. This process makes use of existing optimization algorithms and loss correlations to identify those geometries that deliver the best balance of performance (i.e. Pareto-optimal points). This paper deals with a one-dimensional, multi-objective and multi-point optimization of a single-stage steam turbine. Using a genetic optimization algorithm and an algebraic one-dimensional ideal gas-path model based on loss and deviation correlations, a code capable of performing the optimization of a predefined steam turbine stage was developed. More specifically, during this study the parameters modified (i.e. decision variables) to identify the best performing geometries were the solidity and angles of both the stator and rotor cascades, while the objective functions to maximize were total-to-static efficiency and specific work done. Finally, an accurate analysis of the obtained results was carried out.
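
As an illustration of the Pareto-optimal selection step described in this abstract, the following is a minimal Python sketch (not the authors' code) of non-dominated filtering for two objectives to be maximized; the objective names and candidate values are illustrative assumptions only.

```python
# Minimal sketch of Pareto-optimal (non-dominated) filtering for two
# objectives to be maximized, e.g. total-to-static efficiency and specific
# work. Illustrative only; not the authors' optimization code.

def pareto_front(designs):
    """Return the designs not dominated by any other design.

    designs: list of (efficiency, specific_work) tuples.
    A dominates B if A >= B in both objectives and > in at least one."""
    front = []
    for i, a in enumerate(designs):
        dominated = any(
            (b[0] >= a[0] and b[1] >= a[1]) and (b[0] > a[0] or b[1] > a[1])
            for j, b in enumerate(designs) if j != i
        )
        if not dominated:
            front.append(a)
    return front

# Hypothetical stage geometries evaluated by a 1-D gas-path model
candidates = [(0.82, 310.0), (0.85, 290.0), (0.80, 260.0), (0.84, 305.0)]
print(pareto_front(candidates))  # -> [(0.82, 310.0), (0.85, 290.0), (0.84, 305.0)]
```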

Keywords: Steam turbine, optimization, genetic algorithms.

4720 Packaging in a Multivariate Conceptual Design Synthesis of a BWB Aircraft

Authors: Paul Okonkwo, Howard Smith

Abstract:

A study was conducted to estimate the size of the cabin and major aircraft components, as well as to detect and avoid interference between internally placed components and the external surface, during the conceptual design synthesis and optimisation used to explore the design space of a BWB. Sizing of components follows the Bradley cabin sizing and rubber engine scaling procedures to size the cabin and engine respectively. The interference detection and avoidance algorithm relies on the ability of the Class Shape Transform parameterisation technique to generate polynomial functions of the surfaces of a BWB aircraft configuration from the sizes of the cabin and internal objects using a few variables. Interference detection is essential in the packaging of non-conventional configurations like the BWB because of the non-uniform airfoil-shaped sections and the resultant varying internal space. The unique configuration increases the need for a methodology to prevent objects from being placed in locations that do not sufficiently enclose them within the geometry.

Keywords: Packaging, Optimisation, BWB, Parameterisation, Aircraft Conceptual Design.

4719 Efficiency Improvements of GaAs-based Solar Cells by Hydrothermally-deposited ZnO Nanostructure Array

Authors: Chun-Yuan Huang, Chiao-Yang Cheng, Chun-Yem Huang, Yan-Kuin Su, James Chin-Lung Fang

Abstract:

ZnO nanostructures including nanowires, nanorods, and nanoneedles were successfully deposited on GaAs substrates by a simple two-step chemical method for the first time. A ZnO seed layer was first pre-coated on the O₂-plasma-treated substrate by a sol-gel process, followed by the nucleation of ZnO nanostructures through hydrothermal synthesis. Nanostructures with different average diameters (15-250 nm), lengths (0.9-1.8 μm), and densities (0.9-16×10⁹ cm⁻²) were obtained by adjusting the growth time and the concentration of precursors. From the reflectivity spectra, we concluded that ordered and tapered nanostructures are preferential for photovoltaic applications. ZnO nanoneedles with an average diameter of 106 nm, a moderate length of 2.4 μm, and a density of 7.2×10⁹ cm⁻² could be synthesized at a concentration of 0.04 M for 18 h. Integrated with the nanoneedle array, the power conversion efficiency of the single-junction solar cell was increased from 7.3 to 12.2%, corresponding to a 67% improvement.

Keywords: Anti-reflection, Chemical synthesis, Solar cells, ZnO nanostructures.

4718 Incident Shock Wave Interaction with an Axisymmetric Cone Body Placed in Shock Tube

Authors: Rabah Haoui

Abstract:

This work presents a numerical simulation of the interaction of an incident shock wave, propagating from left to right, with a cone placed in a shock tube. The mathematical model is based on a non-stationary, viscous, axisymmetric flow. The discretization of the Navier-Stokes equations is carried out by the finite volume method in integral form, together with the flux vector splitting method of Van Leer. An adequate combination of time-stepping parameter, CFL coefficient and mesh size level is selected to ensure numerical convergence. The numerical simulation considers a shock tube filled with air. The incident shock wave propagates to the right with a prescribed Mach number and crosses the cone, leaving behind it a stationary detached shock wave in front of the nose of the cone. This type of interaction is observed over the course of the flow.
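
For readers unfamiliar with the Van Leer flux vector splitting named above, here is a minimal sketch of the split convective fluxes for the 1-D Euler equations; it is illustrative only (the paper's solver is axisymmetric, viscous and finite-volume, none of which is modelled here), and the interface states in the example are hypothetical.

```python
import numpy as np

GAMMA = 1.4

def van_leer_split(rho, u, p):
    """Van Leer flux-vector splitting for the 1-D Euler equations.
    Returns (F_plus, F_minus) as [mass, momentum, energy] fluxes."""
    a = np.sqrt(GAMMA * p / rho)            # speed of sound
    M = u / a                               # local Mach number
    E = p / (GAMMA - 1.0) + 0.5 * rho * u**2
    F = np.array([rho * u, rho * u**2 + p, u * (E + p)])  # full flux

    if M >= 1.0:                            # fully supersonic to the right
        return F, np.zeros(3)
    if M <= -1.0:                           # fully supersonic to the left
        return np.zeros(3), F

    # Subsonic: split the mass flux and build momentum/energy from it
    f_plus = 0.25 * rho * a * (M + 1.0)**2
    f_minus = -0.25 * rho * a * (M - 1.0)**2
    Fp = f_plus * np.array([
        1.0,
        ((GAMMA - 1.0) * u + 2.0 * a) / GAMMA,
        ((GAMMA - 1.0) * u + 2.0 * a)**2 / (2.0 * (GAMMA**2 - 1.0))])
    Fm = f_minus * np.array([
        1.0,
        ((GAMMA - 1.0) * u - 2.0 * a) / GAMMA,
        ((GAMMA - 1.0) * u - 2.0 * a)**2 / (2.0 * (GAMMA**2 - 1.0))])
    return Fp, Fm

# Interface flux between two cells: F_{i+1/2} = F+(left) + F-(right)
Fp_L, _ = van_leer_split(1.0, 300.0, 101325.0)   # hypothetical left state
_, Fm_R = van_leer_split(0.5, 0.0, 50000.0)      # hypothetical right state
F_interface = Fp_L + Fm_R
```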

Keywords: Supersonic flow, viscous flow, finite volume, cone body

4717 A New Vector Quantization Front-End Process for Discrete HMM Speech Recognition System

Authors: M. Debyeche, J. P. Haton, A. Houacine

Abstract:

The paper presents a complete discrete statistical framework based on a novel vector quantization (VQ) front-end process. This new VQ approach performs an optimal distribution of VQ codebook components over the HMM states. This technique, which we named distributed vector quantization (DVQ) of hidden Markov models, succeeds in unifying the acoustic micro-structure and the phonetic macro-structure when the estimation of HMM parameters is performed. The DVQ technique is implemented through two variants. The first variant uses the K-means algorithm (K-means-DVQ) to optimize the VQ, while the second variant exploits the benefits of the classification behavior of neural networks (NN-DVQ) for the same purpose. The proposed variants are compared with the HMM-based baseline system in experiments on the recognition of specific Arabic consonants. The results show that the distributed vector quantization technique increases the performance of the discrete HMM system.
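
A minimal sketch of the K-means codebook training that underlies the K-means-DVQ variant is given below; the feature dimension, codebook size and the random stand-in data are assumptions, and the distribution of codewords over HMM states described in the abstract is not shown.

```python
import numpy as np

def train_vq_codebook(features, codebook_size=64, iters=20, seed=0):
    """Minimal K-means vector-quantization codebook.
    features: (N, d) array of acoustic feature vectors (e.g. cepstral frames)."""
    rng = np.random.default_rng(seed)
    codebook = features[rng.choice(len(features), codebook_size, replace=False)]
    for _ in range(iters):
        # Assign each frame to its nearest codeword (quantization step)
        dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update each codeword to the mean of its assigned frames
        for k in range(codebook_size):
            members = features[labels == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook, labels

# Discrete observation symbols for an HMM are then the codeword indices
frames = np.random.randn(1000, 12)               # stand-in for cepstral frames
codebook, symbols = train_vq_codebook(frames, codebook_size=32)
```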

Keywords: Hidden Markov Model, Vector Quantization, Neural Network, Speech Recognition, Arabic Language

4716 Facility Location Problem in Emergency Logistic

Authors: Yousef Abu Nahleh, Arun Kumar, Fugen Daver

Abstract:

Facility location is one of the important problems affecting relief operations. The location model in this paper is motivated by arranging the flow of relief materials from the main warehouse to the continent warehouse, further to the regional warehouses, and from these to the disaster area. This flow keeps the relief organization ready to deal with a disaster situation within the shortest possible time. The main purpose of this paper is to merge the concept of just-in-time and the campaign system in the emergency supply chain, so that when a disaster happens the affected country can request help from the nearest regional warehouse, which will supply the relief material and the required staff to support and assist the victims in the disaster area. Furthermore, the regional warehouse places an order to the continent warehouse to replenish the material that has been distributed to the disaster area. This way they will always be ready to respond to any type of disaster.
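
The keywords mention the center-of-gravity technique; a minimal sketch of that technique for siting a single warehouse is shown below, with hypothetical demand points and weights.

```python
def center_of_gravity(points):
    """Center-of-gravity technique for locating a single warehouse.

    points: list of (x, y, weight) where weight is e.g. the expected relief
    volume shipped to that demand location. Returns the weighted centroid."""
    total_w = sum(w for _, _, w in points)
    x = sum(px * w for px, _, w in points) / total_w
    y = sum(py * w for _, py, w in points) / total_w
    return x, y

# Hypothetical disaster-prone regions served by one regional warehouse
regions = [(10.0, 20.0, 500), (40.0, 25.0, 300), (25.0, 60.0, 200)]
print(center_of_gravity(regions))   # -> (22.0, 29.5)
```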

Keywords: Facility location, Center-of-Gravity Technique, Humanitarian relief, emergency supply chain.

4715 Detection of Voltage Sag and Voltage Swell in Power Quality Using Wavelet Transforms

Authors: Nor Asrina Binti Ramlee

Abstract:

Voltage sag, voltage swell, high-frequency noise and voltage transients are kinds of disturbances in power quality, also known as power quality events. Equipment used in industry nowadays has become more sensitive to these events as its complexity increases. This highlights the importance of delivering clean power to the consumer. To provide better service, accurate power quality analysis is vital. Thus, this paper presents event detection focusing on voltage sag and swell. The method is developed by applying time-domain signal analysis using a wavelet transform approach in MATLAB. Four mother wavelets, namely Haar, Dmey, Daubechies, and Symlet, are used to detect the events. This project analyzed a real interrupted signal obtained from a 22 kV transmission line in Skudai, Johor Bahru, Malaysia. The signals are decomposed using these mother wavelets. The best mother wavelet is the one capable of detecting the time location of the event accurately.
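
A simplified sketch of the event-detection idea described above, assuming the PyWavelets (pywt) package and a single voltage waveform; the threshold rule and the sag/swell limits (0.9/1.1 of nominal RMS) are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def detect_event(signal, fs, wavelet='haar'):
    """Locate one sag/swell event from level-1 detail coefficients and
    classify it from the RMS over the affected span. Simplified sketch."""
    # Level-1 detail coefficients respond strongly to abrupt waveform changes
    _, cD1 = pywt.dwt(signal, wavelet)
    threshold = 5.0 * np.median(np.abs(cD1))           # assumed threshold rule
    edges = np.where(np.abs(cD1) > threshold)[0] * 2   # map back to sample index
    if edges.size == 0:
        return None
    start, end = int(edges[0]), int(edges[-1])
    rms_event = np.sqrt(np.mean(signal[start:end + 1] ** 2))
    rms_nominal = np.sqrt(np.mean(signal ** 2))
    if rms_event < 0.9 * rms_nominal:
        kind = 'sag'
    elif rms_event > 1.1 * rms_nominal:
        kind = 'swell'
    else:
        kind = 'normal'
    return start / fs, end / fs, kind
```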

Keywords: Power quality, voltage sag, voltage swell, wavelet transform.

4714 Performance Analysis of the Time-Based and Periodogram-Based Energy Detector for Spectrum Sensing

Authors: Sadaf Nawaz, Adnan Ahmed Khan, Asad Mahmood, Chaudhary Farrukh Javed

Abstract:

Classically, an energy detector is implemented in the time domain (TD). However, frequency-domain (FD) based energy detectors have demonstrated improved performance. This paper presents a comparison between the two approaches in order to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and the probability of detection (Pd) are derived for both approaches. The derived expressions naturally lead to an analytical as well as intuitive explanation of the improvement in Pf and Pd in different scenarios. Our analysis suggests that the improvement depends on the buffer sizes: Pf is improved in FD, whereas Pd is enhanced in TD energy detectors. Finally, Monte Carlo simulation results corroborate the analysis obtained from the derived expressions.
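
A minimal sketch of the two detector structures being compared; thresholds, noise statistics and the buffer-size effects that the analysis focuses on are not modelled here.

```python
import numpy as np

def energy_detector_td(x, threshold):
    """Classical time-domain energy detector: decide 'signal present'
    if the total received energy exceeds the threshold."""
    return np.sum(np.abs(x) ** 2) > threshold

def energy_detector_fd(x, threshold):
    """Periodogram-based (frequency-domain) detector: the test statistic is
    the sum of the periodogram bins |X[k]|^2 / N."""
    periodogram = np.abs(np.fft.fft(x)) ** 2 / len(x)
    return np.sum(periodogram) > threshold

# By Parseval's theorem the two statistics coincide for a full-length FFT;
# the performance differences analysed in the paper come from buffer-size
# and averaging choices, which this sketch does not model.
```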

Keywords: Cognitive radio, energy detector, periodogram, spectrum sensing.

4713 Comparative Study of Pasting Properties of High Fibre Plantain Based Flour Intended for Diabetic Food (Fufu)

Authors: C. C. Okafor, E. E. Ugwu

Abstract:

A comparative study on the feasibility of producing instant high-fibre plantain flour for diabetic fufu by blending soy residue with different plantain (Musa spp.) varieties (Horn, False Horn and French), all sieved at 60 mesh and mixed in a ratio of 60:40, was carried out, and the blends were analyzed for their pasting properties using standard analytical methods. Results show that VIIIS60 had the highest peak viscosity (303.75 RVU), trough value (182.08 RVU) and final viscosity (284.50 RVU), and the lowest breakdown viscosity (79.58 RVU), setback value (88.17 RVU), peak time (4.36 min) and pasting temperature (81.18°C), and differed significantly (p < 0.05) from the other samples. VIS60 had the lowest peak viscosity (192.25 RVU), trough value (112.67 RVU) and final viscosity (211.92 RVU), but the highest breakdown viscosity (121.61 RVU), peak time (4.66 min) and pasting temperature (82.35°C), and differed significantly (p < 0.05) from the other samples. VIIS60 had intermediate values of peak viscosity (236.67 RVU), trough value (116.58 RVU), breakdown viscosity (120.08 RVU), setback viscosity (167.92 RVU), peak time (4.39 min) and pasting temperature (81.44°C), and differed significantly (p < 0.05) from the other samples. The high final viscosity and low setback values of the French variety blended with soy residue at 60 mesh particle size recommend this variety and fibre composition as optimum for the production of an instant plantain-soy residue flour blend for diabetic fufu.

Keywords: Plantain, soy residue, pasting properties, particle size.

4712 Recognition of Noisy Words Using the Time Delay Neural Networks Approach

Authors: Khenfer-Koummich Fatima, Mesbahi Larbi, Hendel Fatiha

Abstract:

This paper presents a recognition system for isolated words such as robot commands. It is carried out using Time Delay Neural Networks (TDNN), with the aim of teleoperating a robot for specific tasks such as turn, close, etc., in an industrial environment and taking into account the noise coming from the machines. The choice of TDNN is based on its generalization in terms of accuracy; moreover, it acts as a filter that allows the passage of certain desirable frequency characteristics of speech. The goal is to determine the parameters of this filter so as to make the system adaptable to the variability of the speech signal and especially to noise; for this, the back-propagation technique was used in the learning phase. The approach was applied to commands pronounced in two languages separately: French and Arabic. The recognition rates for two test bases of 300 spoken words each are 87% and 97.6% in a neutral environment, and 77.67% and 92.67% when white Gaussian noise was added with an SNR of 35 dB.

Keywords: Neural networks, Noise, Speech Recognition.

4711 Relationship between Level of Physical Activity and Exercise Imagery among Klang Valley Citizens

Authors: Kok, M.O., Omar-Fauzee, M.S., Rosli, M.H.

Abstract:

This study investigated the relationship between exercise imagery use and level of physical activity among a wide range of exercisers in the Klang Valley, Malaysia. One hundred and twenty-four respondents (Mage = 28.92, SD = 9.34) completed two sets of questionnaires (the Exercise Imagery Inventory and the Leisure-Time Exercise Questionnaire) that measure the use of imagery and the exercise frequency of participants. From the results obtained, exercise imagery is found to be significantly correlated with level of physical activity. Besides that, variables such as gender, age and ethnicity that may affect the use of imagery and exercise frequency were also assessed in this study. Among all variables, only ethnicity showed a significant difference in level of physical activity (p < 0.05). The findings of this study suggest that further investigation should be done on other variables, such as socioeconomic status, educational level, and self-efficacy, that may affect imagery use and the frequency of physical activity among exercisers.

Keywords: Physical activity, exercise imagery, Exercise Imagery Inventory, Leisure-Time Exercise Questionnaire.

4710 The Study of Cost Accounting in S Company Based On TDABC

Authors: Heng Ma

Abstract:

Third-party warehousing logistics plays an important role in the development of external logistics. At present, third-party logistics in our country is still a new industry and its accounting system has not yet been established; the current financial accounting of third-party warehousing logistics mainly follows the traditional way of thinking and is only able to provide the total cost information of the entire enterprise during the accounting period, unable to reflect indirect operating cost information. In order to solve the problem of cost information distortion in the third-party logistics industry and to improve the level of logistics cost management, this paper combines theoretical research and case analysis to reflect cost allocation by building a third-party logistics costing model using Time-Driven Activity-Based Costing (TDABC), and takes S company as an example to account for and control warehousing logistics costs. Based on the idea that "products consume activities and activities consume resources", TDABC takes time as the main cost driver and uses time equations to assign resources to cost objects. In S company, the study focuses on three warehouses engaged in warehousing and transportation services (the second warehouse being a transit point). Each of the three warehouses comprises five departments (Business Unit, Production Unit, Settlement Center, Security Department and Equipment Division), and the activities in these departments are classified into in/out-of-storage forecasting, in/out-of-storage or transit, and safekeeping work. By computing the capacity cost rate and building the time equations, the paper calculates the final operating cost so as to reveal the real cost. The numerical analysis results show that TDABC can accurately reflect the cost allocation to service customers and reveal the spare capacity cost of the resource centers, which verifies the feasibility and validity of TDABC in third-party logistics cost accounting. It encourages enterprises to focus on customer relationship management and to reduce idle cost so as to strengthen the cost management of third-party logistics enterprises.
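
A toy illustration of the TDABC calculation described above ("capacity cost rate × time consumed"); all figures, the time-equation coefficients and the order parameters are hypothetical, not S company data.

```python
# Toy Time-Driven ABC calculation: capacity cost rate x time consumed.
# All numbers are hypothetical, not S company data.

monthly_resource_cost = 560_000.0     # cost of one resource centre per month
practical_capacity_min = 140_000.0    # practical working minutes supplied per month
capacity_cost_rate = monthly_resource_cost / practical_capacity_min  # 4.0 per minute

def order_cost(pallets, needs_transit):
    """Time equation: minutes consumed by one in/out-of-storage order,
    converted to cost via the capacity cost rate."""
    minutes = 8.0 + 1.5 * pallets + (12.0 if needs_transit else 0.0)
    return capacity_cost_rate * minutes

print(order_cost(pallets=20, needs_transit=True))   # (8 + 30 + 12) min * 4.0 = 200.0

# Unused (spare) capacity cost = cost of capacity supplied - cost assigned to
# orders, which is exactly the idle cost the paper recommends monitoring.
```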

Keywords: Third-party logistics enterprises, TDABC, cost management, S company.

4709 Analysis of the EEG Signal for a Practical Biometric System

Authors: Muhammad Kamil Abdullah, Khazaimatol S Subari, Justin Leo Cheang Loong, Nurul Nadia Ahmad

Abstract:

This paper discusses the effectiveness of the EEG signal for human identification using four or fewer channels from two different types of EEG recordings. Studies have shown that the EEG signal has biometric potential because the signal varies from person to person and is impossible to replicate and steal. Data were collected from 10 male subjects while resting with eyes open and eyes closed, in 5 separate sessions conducted over a course of two weeks. Features were extracted using wavelet packet decomposition and analyzed to obtain the feature vectors. Subsequently, a neural network algorithm was used to classify the feature vectors. Results show that whether or not the subjects' eyes were open is insignificant for a 4-channel biometric system, with a classification rate of 81%. However, for a 2-channel system, the P4 channel should not be included if data are acquired with the subjects' eyes open. It was observed that for a 2-channel system using only the C3 and C4 channels, a classification rate of 71% was achieved.

Keywords: Biometric, EEG, Wavelet Packet Decomposition, Neural Networks

4708 An Efficient Cache Replacement Strategy for the Hybrid Cache Consistency Approach

Authors: Aline Zeitunlian, Ramzi A. Haraty

Abstract:

Caching has been suggested as a solution for reducing bandwidth utilization and minimizing query latency in mobile environments. Over the years, different caching approaches have been proposed: some rely on the server to periodically broadcast reports announcing the updated data, while others allow the clients to request the data whenever needed. Recently, a hybrid cache consistency scheme, the Scalable Asynchronous Cache Consistency Scheme (SACCS), was proposed, which combines the benefits of the two approaches and has been shown to be more efficient and scalable. Nevertheless, caching has its limitations too, due to the limited cache size and limited bandwidth, which makes the cache replacement strategy an important aspect of improving cache consistency algorithms. In this work, we propose a new cache replacement strategy, the Least Unified Value (LUV) strategy, to replace the Least Recently Used (LRU) strategy on which SACCS was based. This paper studies the advantages and drawbacks of the newly proposed strategy, comparing it with different categories of cache replacement strategies.
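
For reference, a minimal sketch of the LRU policy that SACCS originally used and that the proposed LUV strategy replaces; the LUV value function itself is not specified in the abstract, so only the eviction hook where it would act is indicated.

```python
from collections import OrderedDict

class LRUCache:
    """Least-Recently-Used cache (the baseline replacement policy).
    Minimal sketch for illustration only."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)          # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # evict the least recently used
            # A value-based strategy such as LUV would instead evict the
            # entry with the lowest computed utility value here.
```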

Keywords: Cache consistency, hybrid algorithm, mobile environments

4707 Transformation of Kosovo Education from Traditional into Modern 1999-2012

Authors: Bekim Avdiaj

Abstract:

Everyday life is and will be influenced by the developments that society undergoes throughout history. In particular, countries undergoing transition from one system to another sustain the greatest impact in trying to embrace a modern system. Kosovo society had the fortune to experience such a change, which began in late 1999 and continues to this day. One of the 'developments' that accompanied the evolution of Kosovo society was the transition from the traditional education system to a modern one. This transformation began immediately after the war and continues even today. It was started by the international bodies which governed and administered Kosovo society, including education. There was a great 'evolution', because almost the entire system was 'changed'. Among other things, the opening of private schools, from the lowest level up to colleges and universities, was enabled for the first time. This paper addresses: how ready the society was to embrace such a 'cultural' change in education and, in particular, how prepared teachers were for such changes; to what extent the system that was conceived as modern actually meets international standards; and what the results and the current situation of Kosovo education are.

Keywords: Education, evolution, reform, transformation.

4706 Markov Chain Monte Carlo Model Composition Search Strategy for Quantitative Trait Loci in a Bayesian Hierarchical Model

Authors: Susan J. Simmons, Fang Fang, Qijun Fang, Karl Ricanek

Abstract:

Quantitative trait loci (QTL) experiments have yielded important biological and biochemical information necessary for understanding the relationship between genetic markers and quantitative traits. For many years, most QTL algorithms allowed only one observation per genotype. Recently, there has been an increasing demand for QTL algorithms that can accommodate more than one observation per genotypic distribution. The Bayesian hierarchical model is very flexible and can easily incorporate this information into the model. Herein, a methodology is presented that uses a Bayesian hierarchical model to capture the complexity of the data. Furthermore, the Markov chain Monte Carlo model composition (MC3) algorithm is used to search for and identify important markers. An extensive simulation study illustrates that the method captures the true QTL, even under non-normal noise and with up to 6 QTL.

Keywords: Bayesian hierarchical model, Markov chain Monte Carlo model composition, quantitative trait loci.

4705 Experimental Study on Two-Step Pyrolysis of Automotive Shredder Residue

Authors: Letizia Marchetti, Federica Annunzi, Federico Fiorini, Cristiano Nicolella

Abstract:

Automotive shredder residue (ASR) is a mixture of waste that makes up 20-25% of end-of-life vehicles. For many years, ASR was commonly disposed of in landfills or incinerated, causing serious environmental problems. Nowadays, thermochemical treatments are a promising alternative, although the heterogeneity of ASR still poses some challenges. One of the emerging thermochemical treatments for ASR is pyrolysis, which promotes the decomposition of long polymeric chains by providing heat in the absence of an oxidizing agent. In this way, pyrolysis promotes the conversion of ASR into solid, liquid, and gaseous phases. This work aims to improve the performance of a two-step pyrolysis process. After the characterization of the analysed ASR, the focus is on determining the effects of residence time on product yields and gas composition. A batch experimental setup that reproduces the entire process was used. The setup consists of three sections: the pyrolysis section (made of two reactors), the separation section, and the analysis section. Two different residence times were investigated to find suitable conditions for the first sample of ASR. These first tests showed that the products obtained were more sensitive to residence time in the second reactor. Indeed, slightly increasing residence time in the second reactor managed to raise the yield of gas and carbon residue and decrease the yield of liquid fraction. Then, to test the versatility of the setup, the same conditions were applied to a different sample of ASR coming from a different chemical plant. The comparison between the two ASR samples shows that similar product yields and compositions are obtained using the same setup.

Keywords: Automotive shredder residue, experimental tests, heterogeneity, product yields, two-step pyrolysis.

4704 Detecting and Measuring Fabric Pills Using Digital Image Analysis

Authors: Dariush Semnani, Hossein Ghayoor

Abstract:

In this paper, a novel method is presented for evaluating fabric pills using digital image processing techniques. This work provides a novel technique for detecting pills and also measuring their heights, surfaces and volumes. Clearly, measuring the intensity of defects by human vision is an inaccurate method for quality control; as a result, this problem became a motivation for employing digital image processing techniques for the detection of defects on the fabric surface. In former works, the systems were limited to measuring the surface of defects, but in the presented method the height and the volume of defects are also measured, which leads to more accurate quality control. An algorithm was developed to first find the pills and then measure their average intensity using the three criteria of height, surface and volume. The results showed a meaningful relation between the number of rotations and the quality of pilled fabrics.
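
A simplified sketch of the measuring step described above, assuming a height map of the fabric surface is already available (the image acquisition and 3-D reconstruction are not shown); the threshold and the pixel-based surface/volume measures are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def measure_pills(height_map, pill_threshold):
    """Detect pills on a fabric surface and measure their surface, height
    and volume from a 2-D height map (surface heights in consistent units)."""
    mask = height_map > pill_threshold          # pixels belonging to pills
    labels, n_pills = ndimage.label(mask)       # connected components = pills
    pills = []
    for k in range(1, n_pills + 1):
        region = labels == k
        heights = height_map[region]
        pills.append({
            'surface': int(region.sum()),       # area in pixels
            'height': float(heights.max()),     # peak height of the pill
            'volume': float(heights.sum()),     # height summed over the area
        })
    return pills
```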

Keywords: 3D analysis, computer vision, fabric, pile, surface evaluation

4703 Optimization of Loudspeaker Part Design Parameters by Air Viscosity Damping Effect

Authors: Yue Hu, Xilu Zhao, Takao Yamaguchi, Manabu Sasajima, Yoshio Koike, Akira Hara

Abstract:

This study optimized the design parameters of a cone loudspeaker as an example of highly flexible product design. We developed an acoustic analysis software program that considers the damping caused by air viscosity. In sound reproduction, it is difficult to optimize each parameter of the loudspeaker design. To overcome this limitation in practice, this study presents an acoustic analysis algorithm to optimize the design parameters of the loudspeaker. The material characteristics of the cone paper and the loudspeaker edge were the design parameters, and the vibration displacement of the cone paper was the objective function. The results of the analysis showed that the design had high accuracy compared with the predicted values. These results suggest that, although parameter design is difficult, with experience and intuition the design can be performed easily using the optimized design found with the acoustic analysis software.

Keywords: Air viscosity, design parameters, loudspeaker, optimization.

4702 Dynamics of a Vapour Bubble inside a Vertical Rigid Cylinder with a Deposit Rib

Authors: S. Mehran, S. Rouhi, F. Rouzbahani, E. Haghgoo

Abstract:

In this paper, the dynamics of a vapour bubble generated by a local energy input inside a vertical rigid cylinder, in the absence of buoyancy forces, are investigated. Different ratios of the diameter of the rigid cylinder to the maximum radius of the bubble are considered. The boundary integral equation method is employed for the numerical simulation of the problem. Results show that during the collapse phase of the bubble inside a vertical rigid cylinder, two liquid micro-jets develop on the top and bottom sides of the vapour bubble and are directed inward. Results also show that the existence of a deposit rib inside the vertical rigid cylinder slightly increases the lifetime of the bubble. It is found that by increasing the ratio of the cylinder diameter to the maximum radius of the bubble, the rate of the growth and collapse phases of the bubble increases and the lifetime of the bubble decreases.

Keywords: Vapour bubble, vertical rigid cylinder, boundary element method.

4701 Local Mesh Co-Occurrence Pattern for Content Based Image Retrieval

Authors: C. Yesubai Rubavathi, R. Ravi

Abstract:

This paper presents the local mesh co-occurrence pattern (LMCoP) using the HSV color space for an image retrieval system. The HSV color space is used in this method to utilize the color, intensity and brightness of images. Local mesh patterns are applied to define the local information of the image, and gray level co-occurrence is used to obtain the co-occurrence of LMeP pixels. The local mesh co-occurrence pattern extracts the local directional information from the local mesh pattern and converts it into a well-formed feature vector using the gray level co-occurrence matrix. The proposed method is tested on three different databases, namely MIT VisTex, Corel, and STex. Also, this algorithm is compared with existing methods, and results in terms of precision and recall are shown in this paper.

Keywords: Content-based image retrieval system, HSV color space, gray level co-occurrence matrix, local mesh pattern.

4700 Fuzzy Based Particle Swarm Optimization Routing Technique for Load Balancing in Wireless Sensor Networks

Authors: S. Balaji, E. Golden Julie, M. Rajaram, Y. Harold Robinson

Abstract:

Network lifetime improvement and uncertainty in multiple systems are key issues in wireless sensor network routing. This paper presents a fuzzy-based particle swarm optimization routing technique to improve network scalability. Specifically, in the cluster formation procedure, a fuzzy-based system is used to handle the uncertainty and balance the network. Cluster heads play an important role in reducing energy consumption: using the particle swarm optimization algorithm, each cluster head sends its information along with the data packets to the linked heads. The simulation results show that the presented routing protocol can perform load balancing effectively and reduce the energy consumption of the cluster heads.
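
A minimal sketch of the standard particle swarm optimization update used for cluster-head selection; the inertia and acceleration coefficients are typical textbook values, and the fitness function (residual energy, distance to the base station, etc.) is problem-specific and not shown.

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One particle-swarm update of positions x and velocities v.

    x, v, pbest: (n_particles, dim) arrays; gbest: (dim,) array."""
    rng = rng or np.random.default_rng()
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

# Example: 30 particles searching a 5-dimensional cluster-head configuration
x = np.random.rand(30, 5)
v = np.zeros((30, 5))
pbest, gbest = x.copy(), x[0]
x, v = pso_step(x, v, pbest, gbest)
```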

Keywords: Wireless sensor networks, fuzzy logic, PSO, LEACH.

4699 Construct Pairwise Test Suites Based on the Bak-Sneppen Model of Biological Evolution

Authors: Jianjun Yuan, Changjun Jiang

Abstract:

Pairwise testing, which requires that every combination of valid values of each pair of system factors be covered by at least one test case, plays an important role in software testing, since many faults are caused by unexpected 2-way interactions among system factors. Although meta-heuristic strategies like simulated annealing can generally discover smaller pairwise test suites, they may need more time to perform the search compared with greedy algorithms. We propose a new method, an improved Extremal Optimization (EO) based on the Bak-Sneppen (BS) model of biological evolution, for constructing pairwise test suites, and define the fitness function according to the requirements of the improved EO. Experimental results show that the improved EO gives a similar size of resulting pairwise test suite and yields an 85% reduction in solution time over SA.
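
A minimal sketch of the pairwise-coverage count that such a fitness function must drive to zero; the factor sizes and test cases are hypothetical, and neither the Bak-Sneppen/EO search step nor simulated annealing is reproduced here.

```python
from itertools import combinations

def uncovered_pairs(test_suite, factors):
    """Count the 2-way (pairwise) interactions not yet covered by a suite.

    factors: list of value counts per system factor.
    test_suite: list of tuples, one chosen value index per factor.
    A zero count means every pair of factor values appears in some test case."""
    required = set()
    for (i, ni), (j, nj) in combinations(enumerate(factors), 2):
        for vi in range(ni):
            for vj in range(nj):
                required.add((i, vi, j, vj))
    covered = set()
    for case in test_suite:
        for (i, _), (j, _) in combinations(enumerate(factors), 2):
            covered.add((i, case[i], j, case[j]))
    return len(required - covered)

# Three factors with 2, 2 and 3 values; two test cases cover only part of the pairs
print(uncovered_pairs([(0, 0, 0), (1, 1, 2)], [2, 2, 3]))   # -> 10
```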

Keywords: Covering Arrays, Extremal Optimization, Simulated Annealing, Software Testing.

4698 Simulating Discrete Time Model Reference Adaptive Control System with Great Initial Error

Authors: Bubaker M. F. Bushofa, Abdel Hafez A. Azab

Abstract:

This article is based on the technique called Discrete Parameter Tracking (DPT), first introduced by A. A. Azab [8], which is applicable to lower-order reference models. The order of the reference model is (n-1), where n is the number of adjustable parameters in the physical plant. The technique utilizes a modified gradient method [9] in which knowledge of the exact order of the non-adaptive system is not required, so as to eliminate the identification problem. The applicability of the mentioned technique (DPT) was examined through the solution of several problems. This article presents the solution of a third-order system with three adjustable parameters, controlled according to a second-order reference model. The adjustable parameters have a large initial error, which represents a demanding initial condition. Computer simulations for the solution and analysis are provided to demonstrate the simplicity and feasibility of the technique.

Keywords: Adaptive Control System, Discrete Parameter Tracking, Discrete Time Model.

4697 Data Gathering Protocols for Wireless Sensor Networks

Authors: Dhinu Johnson, Gurdip Singh

Abstract:

Sensor network applications are often data-centric and involve collecting data from a set of sensor nodes to be delivered to various consumers. Typically, nodes in a sensor network are resource-constrained, and hence the algorithms operating in these networks must be efficient. There may be several algorithms available implementing the same service, and efficiency considerations may require a sensor application to choose the best-suited algorithm. In this paper, we present a systematic evaluation of a set of algorithms implementing the data gathering service. We propose a modular infrastructure for implementing such algorithms in TOSSIM, with separate configurable modules for various tasks such as interest propagation, data propagation, aggregation, and path maintenance. By appropriately configuring these modules, we propose a number of data gathering algorithms, each of which incorporates a different set of heuristics for optimizing performance. We have performed comprehensive experiments to evaluate the effectiveness of these heuristics, and we present results from our experimentation efforts.

Keywords: Data Centric Protocols, Shortest Paths, Sensor networks, Message passing systems.

4696 Microbial Production of Levan using Date Syrup and Investigation of Its Properties

Authors: Marzieh Moosavi-Nasab, Behnaz Layegh, Ladan Aminlari, Mohammad B. Hashemi

Abstract:

Levan, an exopolysaccharide, was produced by Microbacterium laevaniformans, and its yield was characterized as a function of the concentrations of date syrup and sucrose and of the fermentation time. The optimum condition for levan production from sucrose was a concentration of 20% sucrose for 48 h, and for date syrup it was 25% for 48 h. The results show that an increase in fermentation time caused a decrease in levan production at all concentrations of date syrup tested. Under these conditions, after 48 h levan production reached 48.9 g/L in the sucrose medium and 10.48 g/L in date syrup. The effect of pH on the yield of the purified levan was examined, and the optimum pH for levan production was determined to be 6.0. Levan was composed mainly of fructose residues when analyzed by TLC and FT-IR spectroscopy. Date syrup is a cheap substrate widely available in Iran and has potential for levan production. The thermal stability of levan was assessed by thermogravimetric analysis (TGA), which revealed the onset of decomposition near 49°C for the levan produced from sucrose and 51°C for the levan from date syrup. DSC results showed a single Tg at 98°C for the levan produced from sucrose and at 206°C for the levan from date syrup.

Keywords: Date syrup, fermentation, levan, Microbacterium laevaniformans

4695 Application of Artificial Neural Network in Assessing Fill Slope Stability

Authors: An-Jui. Li, Kelvin Lim, Chien-Kuo Chiu, Benson Hsiung

Abstract:

This paper details the utilization of artificial intelligence (AI) in the field of slope stability, whereby quick and convenient solutions can be obtained using the developed tool. The AI tool used in this study is the artificial neural network (ANN), while the slope stability analyses are carried out with finite element limit analysis methods. The developed tool allows for the prompt prediction of the safety factors of fill slopes and their corresponding probability of failure (depending on the degree of variation of the soil parameters), which can give the practicing engineer a reasonable basis for decision making. In fact, the successful use of the Extreme Learning Machine (ELM) algorithm shows that slope stability analysis is no longer confined to the conventional methods of modeling, which at times may be tedious and repetitive during the preliminary design stage, where the focus is more on cost-saving options than on detailed design. Therefore, similar ANN-based tools can be further developed to assist engineers in this aspect.
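
A minimal sketch of the Extreme Learning Machine idea named in the abstract: a random hidden layer whose output weights are solved by least squares. The input features, targets and sizes are stand-ins; in the paper they would come from finite element limit analysis results.

```python
import numpy as np

def train_elm(X, T, n_hidden=50, seed=0):
    """Extreme Learning Machine: random hidden layer, output weights solved
    by least squares. X: (n_samples, n_features), T: (n_samples, n_outputs)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                            # hidden-layer outputs
    beta = np.linalg.pinv(H) @ T                      # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Hypothetical training set: 200 slopes, 5 input parameters, 1 safety factor each
X = np.random.rand(200, 5)
T = np.random.rand(200, 1)
W, b, beta = train_elm(X, T)
pred = elm_predict(X, W, b, beta)
```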

Keywords: Landslide, limit analysis, ANN, soil properties.

4694 Adaptive Bidirectional Flow for Image Interpolation and Enhancement

Authors: Shujun Fu, Qiuqi Ruan, Wenqia Wang

Abstract:

Image interpolation is a common problem in imaging applications. However, most existing interpolation algorithms suffer visually from the effects of blurred edges and jagged artifacts in the image to some extent. This paper presents an adaptive, feature-preserving bidirectional flow process, where an inverse diffusion is performed to sharpen edges along the normal directions to the isophote lines (edges), while a normal diffusion is performed to remove artifacts ("jaggies") along the tangent directions. In order to preserve image features such as edges, corners and textures, the nonlinear diffusion coefficients are locally adjusted according to the directional derivatives of the image. Experimental results on synthetic and natural images demonstrate that our interpolation algorithm substantially improves the subjective quality of the interpolated images over conventional interpolations.

Keywords: anisotropic diffusion, bidirectional flow, directional derivatives, edge enhancement, image interpolation, inverse flow, shock filter.

4693 A Study on Finding Similar Document with Multiple Categories

Authors: R. Saraçoğlu, N. Allahverdi

Abstract:

Searching for similar documents and document management have an important place in text mining. One of the most important parts of similar-document research is the process of classifying or clustering the documents. In this study, a similar-document search approach that includes a treatment of the case of belonging to multiple categories (the multiple-categories problem) has been carried out. The proposed method, which is based on Fuzzy Similarity Classification (FSC), has been compared with the Rocchio algorithm and the naive Bayes method, which are widely used in text mining. Empirical results show that the proposed method is quite successful and can be applied effectively. For the second stage, a multiple-categories vector method based on information about how frequently categories are seen together has been used. Empirical results show that the achievement is increased almost twofold when the proposed method is compared with the classical approach.
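
A minimal sketch of the Rocchio baseline mentioned above (one centroid per category, cosine similarity); the proposed fuzzy similarity classification is not reproduced here, and the thresholding note on multiple categories is only indicative.

```python
import numpy as np

def train_rocchio(doc_vectors, labels):
    """Rocchio baseline: one centroid per category.

    doc_vectors: (n_docs, n_terms) array of e.g. TF-IDF weights.
    labels: list of category labels, one per document."""
    centroids = {}
    for c in set(labels):
        idx = [i for i, l in enumerate(labels) if l == c]
        centroids[c] = doc_vectors[idx].mean(axis=0)
    return centroids

def cosine(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def classify(doc_vector, centroids, threshold=None):
    """Assign the closest category; with a threshold, a document may instead
    be assigned to every category whose similarity exceeds it (the
    multiple-categories case discussed in the abstract)."""
    sims = {c: cosine(doc_vector, v) for c, v in centroids.items()}
    if threshold is None:
        return max(sims, key=sims.get)
    return [c for c, s in sims.items() if s >= threshold]
```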

Keywords: Document similarity, Fuzzy classification, Multiple categories, Text mining.

4692 Decision Tree for Competing Risks Survival Probability in Breast Cancer Study

Authors: N. A. Ibrahim, A. Kudus, I. Daud, M. R. Abu Bakar

Abstract:

Competing risks survival data, which comprise more than one type of event, have been used in many applications, one of which is clinical studies (e.g. breast cancer studies). The decision tree method can be extended to competing risks survival data by modifying the split function so as to accommodate two or more risks which might be dependent on each other. Recently, researchers have constructed decision trees for recurrent survival time data using frailty and marginal modelling. We further extend the method to the case of competing risks. In this paper, we develop a decision tree method for competing risks survival time data based on proportional hazards for the subdistribution of competing risks. In particular, we grow a tree by using the deviance statistic. An application to breast cancer data is presented. Finally, to investigate the performance of the proposed method, simulation studies on the identification of the true group of observations were executed.

Keywords: Competing risks, Decision tree, Simulation, Subdistribution Proportional Hazard.
