Search results for: Search Algorithm
400 Dynamic Model Conception of Improving Services Quality in Railway Transport
Authors: Eva Nedeliakova, Jaroslav Masek, Juraj Camaj
Abstract:
This article describes the results of research focused on the quality of railway freight transport services. Improving these services is of crucial importance to customers considering the future use of railway transport. Processes for fulfilling customer demands and assessing output quality were defined as part of the research. This contribution introduces the quality planning map and the algorithm of the applied methodology. It characterizes a model that takes into account the characteristics of transportation, linking the perception of service quality in ordinary and extraordinary operation. Despite the fact that rail freight transport has a solid position in the transport market, many carriers worldwide have been experiencing stagnation for several years. Therefore, the specific results of the research are of significant importance and belong to the numerous initiatives aimed at developing and supporting railway transport, not only by creating a single railway area or reducing noise but also by promoting railway services. This contribution also focuses on the application of dynamic quality models, which represent an innovative method of evaluating service quality. Through this conception, the time factor and the expected and perceived quality at each moment of the transportation process can be taken into account.
Keywords: Quality, railway, transport, service.
399 A Comprehensive Review on Different Mixed Data Clustering Ensemble Methods
Authors: S. Sarumathi, N. Shanthi, S. Vidhya, M. Sharmila
Abstract:
An extensive amount of work has been done in data clustering research, an unsupervised learning technique in data mining, during the past two decades. Moreover, several approaches and methods have emerged focusing on clustering diverse data types, features of cluster models, and similarity rates of clusters. However, no single clustering algorithm performs best at extracting efficient clusters in every situation. Consequently, to address this issue, a challenging technique called the cluster ensemble method has emerged. This approach has become an alternative method for the cluster analysis problem. The main objective of a cluster ensemble is to aggregate diverse clustering solutions in such a way as to attain accuracy and to improve on the quality of the individual clustering algorithms. Given the massive and rapid development of new methods in data mining, it is essential to carry out a critical analysis of existing techniques and future directions. This paper presents a comparative analysis of different cluster ensemble methods along with their methodologies and salient features. This analysis should be useful to the clustering community and help in deciding the most appropriate method for the problem at hand.
Keywords: Clustering, Cluster Ensemble Methods, Coassociation matrix, Consensus Function, Median Partition.
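As an illustration of the evidence-accumulation family of consensus functions surveyed above, the following Python sketch builds a co-association matrix from several base k-means runs and extracts a consensus partition from it. The toy data, the number of base clusterings, and the use of average-linkage clustering on 1 - C are assumptions made only for this example (a recent scikit-learn is assumed for the `metric="precomputed"` argument); they are not details taken from the reviewed papers.

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

def coassociation_matrix(labelings, n_samples):
    """Fraction of base clusterings in which each pair of samples co-occurs."""
    C = np.zeros((n_samples, n_samples))
    for labels in labelings:
        C += (labels[:, None] == labels[None, :]).astype(float)
    return C / len(labelings)

# Toy data and a small ensemble of k-means runs (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
labelings = [KMeans(n_clusters=2, n_init=5, random_state=s).fit_predict(X) for s in range(5)]

C = coassociation_matrix(labelings, len(X))
# Consensus partition: hierarchical clustering on the co-association "distance" 1 - C.
consensus = AgglomerativeClustering(n_clusters=2, metric="precomputed",
                                    linkage="average").fit_predict(1 - C)
print(consensus)
```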
398 Optimal Placement of Piezoelectric Actuators on Plate Structures for Active Vibration Control Using Modified Control Matrix and Singular Value Decomposition Approach
Authors: Deepak Chhabra, Gian Bhushan, Pankaj Chandna
Abstract:
The present work deals with the optimal placement of piezoelectric actuators on a thin plate using the Modified Control Matrix and Singular Value Decomposition (MCSVD) approach. The problem is formulated with the finite element method, using ten piezoelectric actuators on a simply supported plate to suppress the first six modes. The ten actuators are combined into a single equivalent actuator by adding the ten columns of the control matrix to form one column matrix. The singular value of this column control matrix is taken as the fitness function, and the optimal positions of the actuators are obtained by maximizing it with a genetic algorithm (GA). Vibration suppression has been studied for the simply supported plate with piezoelectric patches in the optimal positions using a Linear Quadratic Regulator (LQR) scheme. It is observed that the MCSVD approach places the patches adjacent to each other and symmetric about the centre axis, and gives greater vibration suppression than previously published SVD-based results.
Keywords: Closed loop Average dB gain, Genetic Algorithm (GA), LQR Controller, MCSVD, Optimal positions, Singular Value Decomposition (SVD) Approaches.
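A minimal sketch of the fitness evaluation described above: the columns of the modal control matrix selected by a candidate actuator layout are summed into one column, and the singular value of that column (its 2-norm) is maximized. The control-matrix values and the random-search loop standing in for the GA are assumptions for illustration only.

```python
import numpy as np

# Illustrative control-influence matrix: rows = 6 controlled modes,
# columns = candidate actuator locations on the plate (values are assumed).
rng = np.random.default_rng(1)
B = rng.normal(size=(6, 40))          # 6 modes x 40 candidate positions

def mcsvd_fitness(positions, B):
    """Sum the columns of B selected by `positions` into one column and
    return its singular value (for a single column this is its 2-norm)."""
    col = B[:, positions].sum(axis=1, keepdims=True)
    return np.linalg.svd(col, compute_uv=False)[0]

# Crude random-search stand-in for the GA used in the paper.
best_pos, best_fit = None, -np.inf
for _ in range(2000):
    cand = rng.choice(B.shape[1], size=10, replace=False)
    fit = mcsvd_fitness(cand, B)
    if fit > best_fit:
        best_pos, best_fit = np.sort(cand), fit
print(best_pos, best_fit)
```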
397 Soft-Sensor for Estimation of Gasoline Octane Number in Platforming Processes with Adaptive Neuro-Fuzzy Inference Systems (ANFIS)
Authors: Hamed Vezvaei, Sepideh Ordibeheshti, Mehdi Ardjmand
Abstract:
Gasoline octane number is the standard measure of the anti-knock properties of a motor fuel. Platforming is one of the important unit operations in oil refineries, and the octane number of its product can be determined by online measurement or by using CFR (Cooperative Fuel Research) engines. Online measurement of the octane number can be done with direct octane number analyzers, but these are very expensive, so a feasible alternative analyzer, such as an ANFIS estimator, is needed. ANFIS is a system in which a neural network is incorporated into a fuzzy system, so that the fuzzy parameters are tuned automatically from data by neural network learning algorithms. ANFIS constructs an input-output mapping based both on human knowledge and on generated input-output data pairs. In this research, 31 industrial data sets are used (21 for training and the rest for generalization). Results show that, according to this simulation, the hybrid training algorithm in ANFIS gives good agreement between the industrial data and the simulated results.
Keywords: Adaptive Neuro-Fuzzy Inference Systems, Gasoline Octane Number, Soft-sensor, Catalytic Naphtha Reforming.
396 Cyber Security Enhancement via Software-Defined Pseudo-Random Private IP Address Hopping
Authors: Andre Slonopas, Warren Thompson, Zona Kostic
Abstract:
Obfuscation is one of the most useful tools to prevent network compromise. Previous research focused on obfuscating the network communications between external-facing edge devices. This work proposes the use of two edge devices, external and internal facing, which communicate via private IPv4 addresses using software-defined pseudo-random IP hopping. This methodology does not require additional IP addresses and/or resources to implement. Statistical analyses demonstrate that the hopping surface must be at least 1e3 IP addresses in size, with a broad standard deviation, to minimize the possibility of coincidence between monitored and communication IPs. Breaking the hopping algorithm requires a collection of at least 1e6 samples, which for large hopping surfaces will take years to collect. The probability of dropped packets is controlled via memory buffers and the frequency of hops, and can be reduced to levels acceptable for video streaming. This methodology provides an impenetrable layer of security that is ideal for information systems and supervisory control and data acquisition systems.
Keywords: Moving Target Defense, cybersecurity, network security, hopping randomization, software defined network, network security theory.
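One way to realise synchronized pseudo-random hopping over a private address block is for both edge devices to derive the current slot address from a shared secret, as in the Python sketch below. The hash construction, the /16 block, and the slot length are illustrative assumptions, not the scheme specified by the authors.

```python
import hashlib
import ipaddress

def hop_address(shared_secret: bytes, slot: int, network="10.0.0.0/16"):
    """Derive the private IPv4 address used during time slot `slot`.
    Both edge devices compute the same address from the shared secret."""
    net = ipaddress.ip_network(network)
    digest = hashlib.sha256(shared_secret + slot.to_bytes(8, "big")).digest()
    offset = int.from_bytes(digest[:4], "big") % net.num_addresses
    return net[offset]

secret = b"example-shared-secret"      # hypothetical key material
for slot in range(5):                  # e.g. one hop per second
    print(slot, hop_address(secret, slot))
```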
395 Perceptual JPEG Compliant Coding by Using DCT-Based Visibility Thresholds of Color Images
Authors: Kuo-Cheng Liu
Abstract:
Effective estimation of the just-noticeable distortion (JND) of images helps increase the efficiency of a compression algorithm in which both the statistical redundancy and the perceptual redundancy should be accurately removed. In this paper, we design a DCT-based model for estimating the JND profiles of color images. Based on a mathematical model of the base detection threshold for each DCT coefficient in each color component, luminance masking, contrast masking, and cross-masking adjustments are used for the luminance component, and a variance-based masking adjustment derived from the coefficient variation within each block is proposed for the chrominance components. To verify the proposed model, the JND estimator is incorporated into a conventional JPEG coder to improve the compression performance. A subjective and fair viewing test is designed to evaluate the visual quality of the coded images under the specified viewing conditions. The simulation results show that the JPEG coder integrated with the proposed DCT-based JND model gives better coding bit rates at visually lossless quality for a variety of color images.
Keywords: Just-noticeable distortion (JND), discrete cosine transform (DCT), JPEG.
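The core operation of JND-guided coding is to discard DCT coefficients whose magnitude stays below the visibility threshold, as in the hedged sketch below. The flat threshold profile and the random test block are placeholders; the paper derives a per-coefficient profile from luminance, contrast, and cross-masking models.

```python
import numpy as np
from scipy.fft import dctn, idctn

def perceptual_quantize(block, jnd):
    """Discard DCT coefficients of an 8x8 block that fall below the
    just-noticeable-distortion profile `jnd` (same shape as the block)."""
    coeffs = dctn(block, norm="ortho")
    coeffs[np.abs(coeffs) < jnd] = 0.0          # removal should stay visually lossless
    return idctn(coeffs, norm="ortho")

# Illustrative flat JND profile (values assumed).
rng = np.random.default_rng(0)
block = rng.uniform(0, 255, (8, 8))
jnd = np.full((8, 8), 4.0)
reconstructed = perceptual_quantize(block, jnd)
print(np.max(np.abs(reconstructed - block)))    # worst-case pixel change
```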
394 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have become a major security threat, so it is important to impede such intrusions. Impeding them relies entirely on their detection, which is the primary concern of any security tool such as an intrusion detection system (IDS). Therefore, it is imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance. The performance of an IDS can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques have the limitation of using the raw dataset for classification: the classifier may be confused by redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Patterns (LBP) can be applied to transform the raw features into a principal feature space and select features based on their sensitivity, with eigenvalues used to determine the sensitivity. From the selected features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can then be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power. This optimal feature subset is used to perform classification. For classification, Support Vector Machines (SVM) and Multilayer Perceptrons (MLP) are used due to their proven classification ability. The Knowledge Discovery and Data Mining (KDD'99) Cup dataset is considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms existing approaches and has the capability to minimize the number of features and maximize the detection rates.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP).
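A minimal sketch of the feature-reduction-then-classification stage described above, using PCA followed by an RBF SVM in scikit-learn. The synthetic records merely stand in for KDD'99-style flows; the number of retained components and the labels are assumptions for the example, and the PSO-based feature subset selection is not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for KDD'99-style records: 1000 flows x 41 raw features.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 41))
y = rng.integers(0, 2, size=1000)      # 0 = normal, 1 = attack (toy labels)

# Raw features -> eigenvalue-ranked principal-feature space, then SVM.
model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
print(cross_val_score(model, X, y, cv=5).mean())
```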
393 Detecting HCC Tumor in Three Phasic CT Liver Images with Optimization of Neural Network
Authors: Mahdieh Khalilinezhad, Silvana Dellepiane, Gianni Vernazza
Abstract:
The aim of this work is to build a model based on tissue characterization that is able to discriminate between pathological and non-pathological regions in three-phasic CT images. Based on feature selection in the different phases, we design a neural network system with an optimal number of neurons in the hidden layer. Our approach consists of three steps: feature selection, feature reduction, and classification. For each region of interest (ROI), six distinct sets of texture features are extracted: first-order histogram parameters, absolute gradient, run-length matrix, co-occurrence matrix, autoregressive model, and wavelet, for a total of 270 texture features. When analyzing the different phases, we show that the injected contrast liquid changes the most relevant features in each region. Our results demonstrate that phase 3 is the best phase for detecting HCC tumors for most of the features that we apply to the classification algorithm. With our method, classification between the pathological and healthy classes using first-order histogram parameters reaches an accuracy of 85% in phase 1, 95% in phase 2, and 95% in phase 3.
Keywords: Feature selection, Multi-phasic liver images, Neural network, Texture analysis.
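As an example of one of the six texture-feature families listed above, the sketch below computes co-occurrence (GLCM) features for a single ROI with scikit-image (version 0.19 or later is assumed for the `graycomatrix`/`graycoprops` names). The distances, angles, and property list are illustrative choices, not the exact configuration used in the paper.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def cooccurrence_features(roi):
    """Co-occurrence (GLCM) texture features for one region of interest,
    one of the six feature families mentioned in the abstract."""
    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])

# Toy 8-bit ROI standing in for a liver region from one CT phase.
rng = np.random.default_rng(0)
roi = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
print(cooccurrence_features(roi))
```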
392 End-to-End Pyramid Based Method for MRI Reconstruction
Authors: Omer Cahana, Maya Herman, Ofer Levi
Abstract:
Magnetic Resonance Imaging (MRI) is a lengthy medical scan because of its long acquisition time, whose length is mainly due to the traditional sampling theorem, which defines a lower bound on sampling. However, it is still possible to accelerate the scan by using a different approach such as Compressed Sensing (CS) or Parallel Imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve that, two conditions must be satisfied: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm must be applied to recover the signal. While the rapid advances in Deep Learning (DL) have brought tremendous successes in various computer vision tasks, the field of MRI reconstruction is still in its early stages. In this paper, we present an end-to-end method for MRI reconstruction from k-space to image. Our method contains two parts. The first is sensitivity map estimation (SME), a small yet effective network that can easily be extended to a variable number of coils. The second is reconstruction, a top-down architecture with lateral connections developed for building high-level refinement at all scales. Our method holds the state of the art on the fastMRI benchmark, which is the largest, most diverse benchmark for MRI reconstruction.
Keywords: Accelerate MRI scans, image reconstruction, pyramid network, deep learning.
391 Bandwidth Efficient Diversity Scheme Using STTC Concatenated With STBC: MIMO Systems
Authors: Sameru Sharma, Sanjay Sharma, Derick Engles
Abstract:
Multiple-input multiple-output (MIMO) systems are widely used to improve the quality and reliability of wireless transmission and to increase spectral efficiency. However, in MIMO systems, multiple copies of the data are received after experiencing various channel effects, and conventional decoding techniques suffer complexity limitations as the number of antennas grows. Accordingly, we propose a modified sphere decoder (MSD-1) algorithm with lower complexity, giving rise to a system with high spectral efficiency. With the aim of increasing signal diversity, we apply a rotated quadrature amplitude modulation (QAM) constellation in multidimensional space. Finally, we propose a new architecture involving a space-time trellis code (STTC) concatenated with a space-time block code (STBC), using MSD-1 at the receiver, to improve system performance. The system gains have been verified in the presence of channel state information (CSI) errors.
Keywords: Channel State Information, Diversity, Multi-Antenna, Rotated Constellation, Space Time Codes.
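For readers unfamiliar with the STBC building block used in the concatenated scheme, the sketch below encodes and linearly combines a 2x1 Alamouti block code over a flat Rayleigh channel in Python. It illustrates only the basic STBC diversity mechanism; the rotated constellation, the STTC outer code, and the MSD-1 decoder of the paper are not reproduced.

```python
import numpy as np

def alamouti_encode(s):
    """Alamouti 2x1 space-time block code: two symbols sent from two
    antennas over two time slots (illustrative STBC building block only)."""
    s1, s2 = s
    return np.array([[s1, s2],
                     [-np.conj(s2), np.conj(s1)]])   # rows = time slots, cols = antennas

rng = np.random.default_rng(0)
qpsk = np.exp(1j * np.pi / 4 * np.array([1, 3, 5, 7]))
s = rng.choice(qpsk, size=2)
h = (rng.normal(size=2) + 1j * rng.normal(size=2)) / np.sqrt(2)   # flat Rayleigh fading

X = alamouti_encode(s)
r = X @ h                                             # noiseless received samples, two slots
# Linear combining recovers each symbol scaled by the total channel gain.
s1_hat = np.conj(h[0]) * r[0] + h[1] * np.conj(r[1])
s2_hat = np.conj(h[1]) * r[0] - h[0] * np.conj(r[1])
gain = np.sum(np.abs(h) ** 2)
print(s, s1_hat / gain, s2_hat / gain)
```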
390 Antimicrobial, Antioxidant and Cytotoxic Activities of Cleoma viscosa Linn. Crude Extracts
Authors: Suttijit Sriwatcharakul
Abstract:
Ethanolic crude extracts from the leaf, stem, pod and root of the wild spider flower, Cleoma viscosa Linn., were analyzed for growth inhibition of six bacterial species, Salmonella typhimurium TISTR 5562, Pseudomonas aeruginosa ATCC 27853, Staphylococcus aureus TISTR 1466, Streptococcus epidermidis ATCC 1228, Escherichia coli DMST 4212 and Bacillus subtilis ATCC 6633, at an initial crude extract concentration of 50 mg/ml. The agar well diffusion results showed that the extracts inhibit only the Gram-positive species: S. aureus, S. epidermidis and B. subtilis. The minimum inhibitory concentration study with the Gram-positive strains revealed that the leaf crude extract gave the best result, inhibiting the growth of S. aureus, S. epidermidis and B. subtilis at the lowest concentrations among the plant parts: 0.78, 0.39 and below 0.39 mg/ml, respectively. Determination of total phenolic compounds in the crude extracts showed that the highest phenolic content, 10.41 mg GAE/g dry weight, was found in the leaf crude extract. The free radical scavenging efficacy of all crude extracts, analyzed with the DPPH radical scavenging assay, gave IC50 values for the leaf, stem, pod and root crude extracts of 8.32, 12.26, 21.62 and 35.99 mg/ml, respectively. Cytotoxicity of the crude extracts against a human breast adenocarcinoma cell line, studied by MTT assay, showed that the pod extract was the most cytotoxic, with a CC50 value of 32.41 µg/ml. Both the antioxidant activity and the cytotoxicity of the crude extracts increased with extract concentration. According to these bioactivity results, the leaf crude extract of Cleoma viscosa Linn. is the most promising plant part for further work on the beneficial uses of this weed.
Keywords: Antimicrobial, antioxidant activity, Cleoma viscosa Linn., cytotoxicity test, total phenolic compound.
389 A Model of Market Segmentation for the Customers of Mellat Bank in Iran
Authors: Nader Gharibnavaz, Hossein Yazdi
Abstract:
If an organization like Mellat Bank wants to identify its customer market completely in order to reach its specified goals, it can segment the market so as to offer the right product package to the right segment. Our objective is to offer a segmentation model for the Iranian banking market from Mellat Bank's point of view. The methodology of this project combines "segmentation on the basis of four part-quality variables" and "segmentation on the basis of differences in means". The required data were gathered from E-Systems and the researcher's personal observation. Finally, the research recommends that the organization first form a four-dimensional matrix with 756 segments using four variables named value-based, behavioral, activity style, and activity level; second, calculate the mean profit for every cell of the matrix at two distinct work levels (α1: normal conditions and α2: high-pressure conditions); and then compare the segments by checking two conditions, (1) homogeneity of every segment with its sub-segments and (2) heterogeneity with other segments, so that the necessary segmentation process can be carried out. The final recommendation (explained further by an operational example and a feedback algorithm) is to test and update the model regularly because of the dynamic environment, technology, and banking system.
Keywords: market segmentation model, banking system, Mellat bank
388 Females’ Usage Patterns of Information and Communication Technologies (ICTs) in the Vhembe District, South Africa
Authors: F. O. Maphiri-Makananise
Abstract:
This paper explores and provides substantiated evidence on the usage patterns of Information and Communication Technologies (ICTs) by female users in the Vhembe District in Limpopo Province, South Africa. The study presents a comprehensive picture of the usage of ICTs from the female users' perspective. The significance of this study stems from the need to assess the role, relevance and usage patterns of ICTs such as smartphones, computers, laptops, iPods, the internet and social networking sites among females, following the development of new media technologies in society. The objective of the study is to investigate the usability and accessibility of ICTs to empower female users in South Africa. The study used quantitative and qualitative research methods to determine the major ideas, perceptions and usage patterns of ICTs among users. Data collection involved a structured self-administered questionnaire completed by two groups of respondents. Female students (n=50) at the University of Venda provided their ideas and perceptions about the usefulness and usage patterns of ICTs such as smartphones, the internet and computers at the university level, whereas the second group comprised learners (n=50) from Makhado Comprehensive School who provided their perceptions and ideas about the use of ICTs at the high school level. The researcher also noted that the findings of the study are useful as a guideline and model for ICT interventions that could empower women in South Africa. It was observed that the central purpose of ICTs among female users was to search for information for assignment writing, conduct research, date, exchange ideas and network with friends and relatives. This was demonstrated by the high number of females who used ICTs for e-learning (62%) and social purposes (85%). The study therefore revealed that most females used ICTs for social purposes and accessing the internet rather than for entertainment, which provides an opportune space to empower rural women in South Africa.
Keywords: Female users, Information and Communication Technologies, Internet, Usage patterns.
387 Image Processing Approach for Detection of Three-Dimensional Tree-Rings from X-Ray Computed Tomography
Authors: Jorge Martinez-Garcia, Ingrid Stelzner, Joerg Stelzner, Damian Gwerder, Philipp Schuetz
Abstract:
Tree-ring analysis is an important part of the quality assessment and the dating of (archaeological) wood samples. It provides quantitative data about the whole anatomical ring structure, which can be used, for example, to measure the impact of the fluctuating environment on tree growth, for the dendrochronological analysis of archaeological wooden artefacts, and to estimate the mechanical properties of the wood. Despite advances in computer vision and edge recognition algorithms, detection and counting of annual rings are still limited to 2D datasets and performed in most cases manually, which is a time-consuming, tedious task that depends strongly on the operator's experience. This work presents an image processing approach to detect the whole 3D tree-ring structure directly from X-ray computed tomography imaging data. The approach relies on a modified Canny edge detection algorithm, which captures fully connected tree-ring edges throughout the measured image stack and is validated on X-ray computed tomography data taken from six wood species.
Keywords: Ring recognition, edge detection, X-ray computed tomography, dendrochronology.
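A per-slice starting point for the approach described above is a standard Canny edge map, as in the scikit-image sketch below; the paper's contribution is a modified Canny that links such maps across the slice stack into connected 3D ring surfaces. The synthetic concentric-ring slice and the smoothing parameter are assumptions for the example.

```python
import numpy as np
from skimage import feature

def ring_edges(ct_slice, sigma=2.0):
    """Canny edge map of one CT slice; the paper's modified Canny links such
    per-slice maps through the stack into fully connected 3D ring edges."""
    return feature.canny(ct_slice, sigma=sigma)

# Synthetic slice with concentric "rings" standing in for real CT data.
y, x = np.mgrid[-64:64, -64:64]
ct_slice = np.sin(np.hypot(x, y) / 4.0) + 0.1 * np.random.default_rng(0).normal(size=(128, 128))
edges = ring_edges(ct_slice)
print(edges.sum(), "edge pixels detected")
```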
386 Detection of Actuator Faults for an Attitude Control System using Neural Network
Authors: S. Montenegro, W. Hu
Abstract:
The objective of this paper is to develop a neural network-based residual generator to detect faults in the actuators of a specific communication satellite in its attitude control system (ACS). First, a dynamic multilayer perceptron network with dynamic neurons is used; each of these neurons comprises a second-order linear Infinite Impulse Response (IIR) filter and a nonlinear activation function with adjustable parameters. Second, the parameters of the network are adjusted to minimize a performance index specified by the output estimation error, using input-output data collected from the specific ACS. The proposed dynamic neural network is then trained and applied to detect faults injected into the wheel, which is the main actuator in the normal mode of the communication satellite. The performance and capabilities of the proposed network were then tested and compared with a conventional model-based observer residual, showing the differences between these two methods and indicating the benefit of the proposed algorithm in determining the real status of the momentum wheel. Finally, the application of the methods in a satellite ground station is discussed.
Keywords: Satellite, Attitude Control, Momentum Wheel, Neural Network, Fault Detection.
385 Improved Automated Classification of Alcoholics and Non-alcoholics
Authors: Ramaswamy Palaniappan
Abstract:
In this paper, several improvements are proposed to previous work on the automated classification of alcoholics and non-alcoholics. In the previous paper, a multilayer-perceptron neural network classifying the energy of gamma-band Visual Evoked Potential (VEP) signals gave the best classification performance, using 800 VEP signals from 10 alcoholics and 10 non-alcoholics. Here, the dataset is extended to include 3560 VEP signals from 102 subjects: 62 alcoholics and 40 non-alcoholics. Three modifications are introduced to improve the classification performance: i) increasing the gamma-band spectral range by increasing the pass-band width of the filter used, ii) using the Multiple Signal Classification (MUSIC) algorithm to obtain the power of the dominant frequency in gamma-band VEP signals as features, and iii) using the simple but effective k-nearest neighbour classifier. To validate that these modifications do give improved performance, a 10-fold cross-validation classification (CVC) scheme is used. Repeat experiments of the previously used methodology on the extended dataset are performed here, and an improvement from 94.49% to 98.71% in maximum averaged CVC accuracy is obtained using the modifications. These latest results show that VEP-based classification of alcoholics is worth exploring further for system development.
Keywords: Alcoholic, Multilayer-perceptron, Nearest neighbour, Gamma band, MUSIC, Visual evoked potential.
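The 10-fold CVC evaluation with a k-nearest-neighbour classifier described above can be sketched in a few lines of scikit-learn; the random feature matrix below merely stands in for the dominant-frequency gamma-band powers, and the channel count, trial count, and k value are assumptions rather than the paper's settings.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Toy stand-in for gamma-band VEP features: one dominant-frequency power
# value per channel (61 channels assumed), for 200 trials.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 61))
y = rng.integers(0, 2, size=200)       # 0 = non-alcoholic, 1 = alcoholic (toy labels)

knn = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(knn, X, y, cv=10)     # 10-fold CVC as in the paper
print(scores.mean())
```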
384 Efficient HAAR Wavelet Transform with Embedded Zerotrees of Wavelet Compression for Color Images
Authors: S. Piramu Kailasam
Abstract:
This study compresses true-color images with compression algorithms in different color spaces to provide high compression rates. A high compression ratio is needed to save storage space; an additional aim is to rank the compression algorithms in a suitable color space. The dataset is a sequence of true-color images of size 128 x 128. The Haar wavelet is one of the best-known wavelet transforms; it has great potential and maintains the image quality of color images. The Haar wavelet transform with the Set Partitioning in Hierarchical Trees (SPIHT) algorithm, within a framework of different color spaces, is applied to compress the sequence of images taken at different angles. Embedded Zerotree Wavelet (EZW) coding is a powerful standard method for coding the coefficient data in order of significance. Hence the proposed compression framework of Haar wavelet, XYZ color space, morphological gradient, and EZW compression obtained improvements over other methods in terms of compression ratio, mean square error, peak signal-to-noise ratio, and bits-per-pixel quality measures.
Keywords: Color Spaces, HAAR Wavelet, Morphological Gradient, Embedded Zerotrees Wavelet Compression.
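The sketch below performs a multilevel Haar decomposition of one colour channel with PyWavelets and zeroes small coefficients, a crude stand-in for the significance-ordered EZW/SPIHT coding stage. The threshold, the decomposition depth, and the random test channel are assumptions for illustration only.

```python
import numpy as np
import pywt

def haar_compress(channel, level=3, threshold=10.0):
    """Multilevel Haar decomposition of one colour channel; small
    coefficients are zeroed, standing in for the EZW/SPIHT coding stage."""
    coeffs = pywt.wavedec2(channel, "haar", level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    arr[np.abs(arr) < threshold] = 0.0
    kept = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(kept, "haar"), np.count_nonzero(arr) / arr.size

rng = np.random.default_rng(0)
channel = rng.uniform(0, 255, (128, 128))      # one channel of a 128 x 128 image
recon, kept_fraction = haar_compress(channel)
print(kept_fraction, np.sqrt(np.mean((recon - channel) ** 2)))   # rate proxy and RMSE
```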
383 Data Envelopment Analysis with Partially Perfect Objects
Authors: Alexander Y. Vaninsky
Abstract:
This paper presents a simplified version of Data Envelopment Analysis (DEA), a conventional approach to evaluating the performance and ranking of competitive objects characterized by two groups of factors acting in opposite directions: inputs and outputs. DEA with a Perfect Object (DEA PO) augments the group of actual objects with a virtual Perfect Object, the one having the greatest outputs and the smallest inputs. It allows for an explicit analytical solution and makes a step toward absolute efficiency. This paper develops the approach further and introduces a DEA model with Partially Perfect Objects. DEA PPO consecutively eliminates the smallest relative inputs or greatest relative outputs, and applies DEA PO to the reduced collections of indicators. The partial efficiency scores are combined to obtain a weighted efficiency score. The computational scheme remains simple, like that of DEA PO, but the advantage of DEA PPO is that it takes into account all of the inputs and outputs of each actual object. Firm evaluation is considered as an example.
Keywords: Data Envelopment Analysis, Perfect object, Partially perfect object, Partial efficiency, Explicit solution, Simplified algorithm.
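To make the Perfect Object idea concrete, the toy Python sketch below builds the virtual object from the column-wise best inputs and outputs and scores each firm against it. The equal weighting used here is only a placeholder assumption; the paper derives an explicit analytical weighting, and the partial-elimination scheme of DEA PPO is not reproduced.

```python
import numpy as np

# Toy data: 5 firms, 2 inputs (smaller is better) and 2 outputs (larger is better).
inputs = np.array([[4., 2.], [3., 3.], [5., 1.], [2., 4.], [3., 2.]])
outputs = np.array([[10., 5.], [8., 7.], [12., 4.], [6., 9.], [9., 6.]])

# Virtual Perfect Object: greatest outputs and smallest inputs over all firms.
perfect_in = inputs.min(axis=0)
perfect_out = outputs.max(axis=0)

# Illustrative efficiency score against the Perfect Object (equal weights assumed).
score = (outputs / perfect_out).mean(axis=1) / (inputs / perfect_in).mean(axis=1)
for firm, s in enumerate(score):
    print(f"firm {firm}: efficiency {s:.3f}")
```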
382 An Artificial Neural Network Based Model for Predicting H2 Production Rates in a Sucrose-Based Bioreactor System
Authors: Nikhil, Bestamin Özkaya, Ari Visa, Chiu-Yue Lin, Jaakko A. Puhakka, Olli Yli-Harja
Abstract:
The performance of sucrose-based H2 production in a completely stirred tank reactor (CSTR) was modeled with a neural network trained by the back-propagation (BP) algorithm. The H2 production was monitored over a period of 450 days at 35±1 ºC. The proposed model predicts H2 production rates based on hydraulic retention time (HRT), recycle ratio, sucrose concentration and degradation, biomass concentrations, pH, alkalinity, oxidation-reduction potential (ORP), and acid and alcohol concentrations. Artificial neural networks (ANNs) have the ability to capture non-linear information very efficiently. In this study, a predictive controller is proposed for the management and operation of large-scale H2-fermenting systems, and the relevant control strategies can be activated by this method. The BP-based ANN modeling results were very successful, and an excellent match was obtained between the measured and the predicted rates. Efficient H2 production and system control can be provided by the predictive control method combined with the robust BP-based ANN modeling tool.
Keywords: Back-propagation, biohydrogen, bioprocess modeling, neural networks.
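A compact stand-in for the BP-trained rate model described above is a small multilayer-perceptron regressor, sketched below with scikit-learn. The synthetic records, the 21/10 train-generalization split, the ten input variables, and the network size are assumptions mirroring the description, not the authors' actual data or topology.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Toy stand-in for the 31 industrial records: 10 process variables
# (HRT, recycle ratio, sucrose, biomass, pH, alkalinity, ORP, acids, ...).
rng = np.random.default_rng(0)
X = rng.normal(size=(31, 10))
y = rng.normal(size=31)                 # H2 production rate (toy values)

X_train, y_train = X[:21], y[:21]       # 21 records for training, as in the paper
X_test, y_test = X[21:], y[21:]         # remaining records for generalization

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                                   max_iter=5000, random_state=0))
model.fit(X_train, y_train)
print(model.score(X_test, y_test))      # R^2 on held-out records (meaningless for toy data)
```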
381 Binarization of Text Region based on Fuzzy Clustering and Histogram Distribution in Signboards
Authors: Jonghyun Park, Toan Nguyen Dinh, Gueesang Lee
Abstract:
In this paper, we present a novel approach to accurately detect text regions, including shop names, in signboard images with complex backgrounds for mobile system applications. The proposed method is based on the combination of text detection using edge profiles and region segmentation using the fuzzy c-means method. In the first step, we apply an elaborate Canny edge operator to extract all possible object edges. Then, edge profile analysis in the vertical and horizontal directions is performed on these edge pixels to detect potential text regions containing the shop name in a signboard. The edge profile and geometrical characteristics of each object contour are carefully examined to construct candidate text regions and to separate the main text region from the background. Finally, the fuzzy c-means algorithm is applied to segment and binarize the detected text region. Experimental results show that our proposed method is robust in text detection with respect to different character sizes and colors and can provide reliable text binarization results.
Keywords: Text detection, edge profile, signboard image, fuzzy clustering.
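The final binarization step can be illustrated with a minimal one-dimensional fuzzy c-means on pixel intensities, as in the Python sketch below; pixels are assigned to the cluster with the larger membership, and the brighter centre is assumed to be text. The fuzzifier, iteration count, and toy intensity data are assumptions for the example.

```python
import numpy as np

def fuzzy_cmeans(values, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means on pixel intensities (illustrative only)."""
    rng = np.random.default_rng(0)
    centers = rng.choice(values, size=c, replace=False)
    for _ in range(iters):
        d = np.abs(values[:, None] - centers[None, :]) + 1e-9
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        centers = (u ** m).T @ values / (u ** m).sum(axis=0)
    return centers, u

# Toy grayscale patch from a detected text region (bright text on dark board).
rng = np.random.default_rng(1)
patch = np.concatenate([rng.normal(60, 10, 400), rng.normal(200, 10, 100)])
centers, u = fuzzy_cmeans(patch)
binary = u.argmax(axis=1) == centers.argmax()   # True = text, False = background
print(centers, binary.sum(), "text pixels")
```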
380 A Comparative Analysis of Fuzzy, Neuro-Fuzzy and Fuzzy-GA Based Approaches for Software Reusability Evaluation
Authors: Parvinder Singh Sandhu, Dalwinder Singh Salaria, Hardeep Singh
Abstract:
Software reusability is a primary attribute of software quality. There are metrics for identifying the quality of reusable components, but the function that makes use of these metrics to determine the reusability of software components is still not clear. If identified in the design phase, or even in the coding phase, these metrics can help reduce rework by improving the quality of component reuse and hence improve productivity through a probabilistic increase in the reuse level. In this paper, we devise a framework of metrics that uses McCabe's Cyclomatic Complexity Measure for complexity measurement, the Regularity Metric, the Halstead Software Science Indicator for volume indication, the Reuse Frequency metric, and the Coupling Metric values of the software component as input attributes, and calculates the reusability of the software component. A comparative analysis of the fuzzy, neuro-fuzzy and Fuzzy-GA approaches is performed to evaluate the reusability of software components, and the Fuzzy-GA results outperform the other approaches used. The developed reusability model has produced high-precision results, as expected by the human experts.
Keywords: Software Reusability, Software Metrics, Neural Networks, Genetic Algorithm, Fuzzy Logic.
379 A Particle Swarm Optimal Control Method for DC Motor by Considering Energy Consumption
Authors: Yingjie Zhang, Ming Li, Ying Zhang, Jing Zhang, Zuolei Hu
Abstract:
In the actual start-up process of DC motors, the DC drive system often faces a conflict between energy consumption and acceleration performance. To resolve this conflict, this paper proposes a comprehensive performance index in which an energy consumption index is added to the classical control performance index for the DC motor starting process. Taking the comprehensive performance index as the cost function, a particle swarm optimization algorithm is designed to optimize the comprehensive performance. Simulations are then conducted on the optimization of the comprehensive performance of the DC motor, on the condition that the weight coefficient of the energy consumption index is properly designed. The simulation results show that, as the weight of energy consumption increased, the energy efficiency improved significantly at the expense of a slight sacrifice in the speed-of-response indicators with the comprehensive performance index method. Compared with a traditional proportional-integral-derivative controller, the energy efficiency increased from 63.18% to 68.48% while the response time was reduced from 0.2875 s to 0.1736 s, achieving energy savings.
Keywords: Comprehensive performance index, energy consumption, acceleration performance, particle swarm optimal control.
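The sketch below illustrates the idea of weighting an energy term into the starting-performance cost and searching the controller gains with a minimal particle swarm. The first-order motor model, the PI controller, the energy weight, and every numerical value are assumptions for illustration; they are not the paper's plant model or tuning.

```python
import numpy as np

def simulate(kp, ki, w_energy, dt=1e-3, t_end=0.5):
    """Crude first-order DC-motor speed loop with a PI controller; returns a
    weighted sum of tracking error and electrical energy (values assumed)."""
    speed, integ, err_cost, energy = 0.0, 0.0, 0.0, 0.0
    target, tau, k_motor = 100.0, 0.05, 1.0
    for _ in range(int(t_end / dt)):
        err = target - speed
        integ += err * dt
        u = np.clip(kp * err + ki * integ, -240.0, 240.0)   # applied voltage
        speed += dt * (k_motor * u - speed) / tau
        err_cost += abs(err) * dt
        energy += u * u * dt                                 # ~ resistive loss
    return err_cost + w_energy * energy

# Minimal particle swarm over (kp, ki); w_energy weighs the energy index.
rng = np.random.default_rng(0)
pos = rng.uniform(0.1, 10.0, (20, 2)); vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([simulate(*p, 1e-4) for p in pos])
for _ in range(30):
    gbest = pbest[pbest_val.argmin()]
    vel = (0.7 * vel + 1.5 * rng.random((20, 2)) * (pbest - pos)
                     + 1.5 * rng.random((20, 2)) * (gbest - pos))
    pos = np.clip(pos + vel, 0.01, 20.0)
    vals = np.array([simulate(*p, 1e-4) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
print(pbest[pbest_val.argmin()], pbest_val.min())
```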
378 New Approach for Minimizing Wavelength Fragmentation in Wavelength-Routed WDM Networks
Authors: Sami Baraketi, Jean-Marie Garcia, Olivier Brun
Abstract:
Wavelength Division Multiplexing (WDM) is the dominant transport technology used in numerous high-capacity backbone networks based on optical infrastructures. Given the importance of the costs (CapEx and OpEx) associated with these networks, resource management is becoming increasingly important, especially how the optical circuits, called "lightpaths", are routed throughout the network. This requires the use of efficient algorithms which provide routing strategies with the lowest cost. We focus on the lightpath routing and wavelength assignment problem, known as the RWA problem, while optimizing wavelength fragmentation over the network. Wavelength fragmentation poses a serious challenge for network operators since it leads to the misuse of the wavelength spectrum and consequently to the refusal of new lightpath requests. In this paper, we first establish a new Integer Linear Program (ILP) for the problem based on a node-link formulation. This formulation follows a multilayer approach where the original network is decomposed into several network layers, each corresponding to a wavelength. Furthermore, we propose an efficient heuristic for the problem based on a greedy algorithm followed by a post-treatment procedure. The obtained results show that the optimal solution is often reached. We also compare our results with those of other RWA heuristic methods.
Keywords: WDM, lightpath, RWA, wavelength fragmentation, optimization, linear programming, heuristic.
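For context, a baseline greedy RWA heuristic (shortest-path routing with first-fit wavelength assignment) can be written in a few lines with networkx, as below. This is only an illustrative baseline on an assumed topology and demand set; it is not the paper's multilayer ILP, its fragmentation-aware heuristic, or its post-treatment procedure.

```python
import networkx as nx

def first_fit_rwa(graph, requests, n_wavelengths=8):
    """Greedy RWA: route each lightpath on its shortest path and assign the
    first wavelength that is free on every link of that path."""
    used = {}                                   # (undirected link, wavelength) -> occupied
    assignment = {}
    for req in requests:
        path = nx.shortest_path(graph, *req)
        links = list(zip(path, path[1:]))
        for w in range(n_wavelengths):
            if all((frozenset(l), w) not in used for l in links):
                for l in links:
                    used[(frozenset(l), w)] = True
                assignment[req] = (path, w)
                break
        else:
            assignment[req] = None              # blocked request
    return assignment

g = nx.cycle_graph(6)                           # toy 6-node ring backbone
demands = [(0, 3), (1, 4), (2, 5), (0, 2)]
print(first_fit_rwa(g, demands))
```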
377 Analysis of Three-Dimensional Longitudinal Rolls Induced by Double Diffusive Poiseuille-Rayleigh-Benard Flows in Rectangular Channels
Authors: O. Rahli, N. Mimouni, R. Bennacer, K. Bouhadef
Abstract:
This numerical study investigates the appearance of travelling waves and the behavior of the Poiseuille-Rayleigh-Benard (PRB) flow induced by 3D thermosolutal mixed convection (TSMC) in horizontal rectangular channels. The governing equations are discretized by using a control volume method with a third-order QUICK scheme for approximating the advection terms. The SIMPLER algorithm is used to handle the coupling between the momentum and continuity equations. To avoid excessively long computation times, the full approximation storage (FAS) scheme with the full multigrid (FMG) method is used to solve the problem. For a broad range of dimensionless controlling parameters, the contribution of this work is the analysis of the flow regimes of the steady longitudinal thermoconvective rolls (denoted R//) for both heat and mass transfer (TSMC). The transition from opposed volume forces to cooperating ones considerably affects the birth and the development of the longitudinal rolls. The heat and mass transfer distributions are also examined.
Keywords: Heat and mass transfer, mixed convection, Poiseuille-Rayleigh-Benard flow, rectangular duct.
376 Error Rate Performance Comparisons of Precoding Schemes over Fading Channels for Multiuser MIMO
Authors: M. Arulvizhi
Abstract:
In multiuser MIMO communication systems, inter-user interference has a strong impact on the transmitted signals, and precoding schemes are employed on multiuser broadcast channels to suppress it. Different linear and nonlinear precoding schemes exist. For massive system dimensions, it is difficult to design a precoding algorithm with low computational complexity and good error rate performance at the same time over fading channels. This paper describes the error rate performance of precoding schemes over fading channels under the assumption of perfect channel state information at the transmitter. To estimate the bit error rate performance, different propagation environments, namely Rayleigh, Rician and Nakagami fading channels, are considered. The paper presents an error rate performance comparison over these fading channels for precoding methods such as channel inversion and dirty paper coding for a multiuser broadcast system. MATLAB simulation has been used. It is observed that the multiuser system achieves better error rate performance with dirty paper coding over the Rayleigh fading channel.
Keywords: Multiuser MIMO, channel inversion precoding, dirty paper coding, fading channels, BER.
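The simpler of the two compared schemes, channel inversion (zero-forcing) precoding, is sketched below in Python for a 4-user, 4-antenna downlink: pre-multiplying by the channel pseudo-inverse removes inter-user interference at the cost of a power penalty. The QPSK symbols, noise level, and unit transmit-power normalization are assumptions for the example (the paper's simulations are in MATLAB).

```python
import numpy as np

rng = np.random.default_rng(0)
n_users = n_tx = 4

# Flat Rayleigh multiuser downlink: one row of H per single-antenna user.
H = (rng.normal(size=(n_users, n_tx)) + 1j * rng.normal(size=(n_users, n_tx))) / np.sqrt(2)
s = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=n_users) / np.sqrt(2)   # QPSK

# Channel inversion (zero-forcing) precoding: the pseudo-inverse removes
# inter-user interference; the norm enforces a unit transmit-power budget.
x_unnormalised = np.linalg.pinv(H) @ s
scale = 1.0 / np.linalg.norm(x_unnormalised)
x = scale * x_unnormalised

noise = 0.05 * (rng.normal(size=n_users) + 1j * rng.normal(size=n_users))
y = H @ x + noise
print(np.round(y, 3))          # each user receives ~ scale * its own symbol
print(np.round(scale * s, 3))
```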
375 Collaborative Stylistic Group Project: A Drama Practical Analysis Application
Authors: Omnia F. Elkommos
Abstract:
In the course of teaching stylistics to undergraduate students of the Department of English Language and Literature, Faculty of Arts and Humanities, the linguistic toolkit of theories comes in handy for a better understanding of the different literary genres: poetry, drama, and short stories. In the present paper, a model for teaching stylistics is compiled and suggested. It is a collaborative group project technique for use in an undergraduate class of diverse specialisms (Literature, Linguistics and Translation tracks). Students are initially introduced to the different linguistic tools and theories suitable for each literary genre. The second step is to apply these linguistic tools to texts. Students are required to watch videos performing the poems or play, for example, and to search the internet for interpretations of the texts by other authorities. They use a template (prepared by the researcher) with guided questions leading them along in their analysis. Finally, a practical analysis is written up using the practical analysis essay template (also prepared by the researcher). In line with collaborative learning, all the steps include student-centered activities that address differentiation and consider the three different specialisms. In the process of selecting the proper tools, in the actual application, and in the analysis discussion, students are given tasks that require their collaboration. They also work in small groups, and the groups collaborate in seminars and group discussions. At the end of the course/module, students also present their work collaboratively and reflect and comment on their learning experience. The module/course uses a drama play that lends itself to the task: ‘The Bond’ by Amy Lowell and Robert Frost. The project results in an interpretation of its theme, characterization and plot. The linguistic tools are drawn from pragmatics and discourse analysis, among others.
Keywords: Applied linguistic theories, collaborative learning, cooperative principle, discourse analysis, drama analysis, group project, online acting performance, pragmatics, speech act theory, stylistics, technology enhanced learning.
374 A Noble Flow Rate Control based on Leaky Bucket Method for Multi-Media OBS Networks
Authors: Kentaro Miyoko, Yoshihiko Mori, Yugo Ikeda, Yoshihiro Nishino, Yong-Bok Choi, Hiromi Okada
Abstract:
Optical burst switching (OBS) has been proposed to realize the next-generation Internet based on wavelength division multiplexing (WDM) network technologies. In OBS, burst contention is one of the major problems. Deflection routing has been designed to resolve this problem; however, deflection routing finds it increasingly difficult to prevent burst contentions as the network load becomes high. In this paper, we introduce flow rate control methods to reduce burst contentions. We propose new flow rate control methods based on the leaky bucket algorithm and deflection routing, namely the separate leaky bucket deflection method and the dynamic leaky bucket deflection method. In the proposed methods, the edge nodes that generate data bursts carry out the flow rate control protocols. To verify the effectiveness of flow rate control in OBS networks, we show through computer simulations that the proposed methods improve the network utilization and reduce the burst loss probability.
Keywords: Optical burst switching, OBS, flow rate control.
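A plain leaky bucket regulator of the kind the proposed methods build on can be sketched as below: bursts drain at a fixed rate and arrivals that would overflow the buffer are held back at the edge. The rate, buffer size, and burst trace are assumptions for illustration; the deflection-aware and dynamic variants of the paper are not modelled.

```python
import collections

class LeakyBucket:
    """Leaky bucket: the bucket drains at a fixed rate; bursts that would
    overflow the buffer are deferred (a sketch of the edge-node flow-rate
    control, not the full deflection-routing protocol)."""
    def __init__(self, rate_bps, buffer_bits):
        self.rate, self.capacity = rate_bps, buffer_bits
        self.level, self.last_t = 0.0, 0.0
        self.deferred = collections.deque()

    def offer(self, t, burst_bits):
        self.level = max(0.0, self.level - self.rate * (t - self.last_t))
        self.last_t = t
        if self.level + burst_bits <= self.capacity:
            self.level += burst_bits
            return True                     # burst admitted into the core
        self.deferred.append((t, burst_bits))
        return False                        # burst held back at the edge

bucket = LeakyBucket(rate_bps=1e9, buffer_bits=8e6)
for t, size in [(0.000, 4e6), (0.001, 4e6), (0.002, 4e6), (0.010, 4e6)]:
    print(t, bucket.offer(t, size))
```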
373 An Inverse Heat Transfer Algorithm for Predicting the Thermal Properties of Tumors during Cryosurgery
Authors: Mohamed Hafid, Marcel Lacroix
Abstract:
This study aimed at developing an inverse heat transfer approach for predicting the time-varying freezing front and the temperature distribution of tumors during cryosurgery. Using a temperature probe pressed against the layer of tumor, the inverse approach is able to predict simultaneously the metabolic heat generation and the blood perfusion rate of the tumor. Once these parameters are predicted, the temperature field and the time-varying freezing fronts are determined with the direct model. The direct model rests on the one-dimensional Pennes bioheat equation, and the phase change problem is handled with the enthalpy method. The Levenberg-Marquardt Method (LMM) combined with the Broyden Method (BM) is used to solve the inverse model. The effects of (a) the thermal properties of the diseased tissues, (b) the initial guesses for the unknown thermal properties, (c) the data capture frequency, and (d) the noise on the recorded temperatures are examined. It is shown that the proposed inverse approach remains accurate for all the cases investigated.
Keywords: Cryosurgery, inverse heat transfer, Levenberg-Marquardt method, thermal properties, Pennes model, enthalpy method.
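To show the flavour of Levenberg-Marquardt parameter recovery from a probe record, the sketch below fits metabolic heat and perfusion in a deliberately simplified lumped (0-D) stand-in for the Pennes model using SciPy. Every property value, the measurement record, and the lumped model itself are assumptions; the paper solves the full 1-D Pennes equation with phase change and couples the LMM with the Broyden method.

```python
import numpy as np
from scipy.optimize import least_squares

# Lumped (0-D) stand-in for the Pennes bioheat model: the probe temperature
# relaxes toward a steady value set by metabolic heat q_m and perfusion w_b.
rho_c = 3.6e6                                          # J/(m^3 K), assumed
t = np.linspace(0.0, 1800.0, 61)                       # probe record, s
T_art, T0 = 37.0, 35.0

def probe_temperature(params, t):
    q_m, w_b = params
    w_b = max(w_b, 1e-6)                               # guard against non-physical trials
    k = w_b * 4.0e6 / rho_c                            # perfusion "rate constant"
    T_inf = T_art + q_m / (w_b * 4.0e6)                # steady temperature
    return T_inf + (T0 - T_inf) * np.exp(-k * t)

true = (4200.0, 0.0005)                                # "unknown" tumour parameters
measured = probe_temperature(true, t) + np.random.default_rng(0).normal(0, 0.02, t.size)

# Levenberg-Marquardt fit of (q_m, w_b) from the noisy probe record.
fit = least_squares(lambda p: probe_temperature(p, t) - measured,
                    x0=[1000.0, 0.001], method="lm")
print(fit.x)
```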
372 Optimized Energy Scheduling Algorithm for Energy Efficient Wireless Sensor Networks
Authors: S. Arun Rajan, S. Bhavani
Abstract:
Wireless sensor networks (WSNs) consist of tiny, low-cost, intelligent sensors connected by advanced communication systems. WSNs have attracted significant attention because industrial as well as medical applications employ them for target monitoring, environmental observation, obstacle detection, movement control, etc. In these applications, sensor nodes are densely deployed in unattended environments with small non-rechargeable batteries. This constraint requires energy-efficient schemes to prolong the network lifetime. There is redundancy in the data sent over the network. To overcome this, multiple virtual backbone scheduling has been presented; such problems are called Maximum Lifetime Backbone Scheduling (MLBS) problems. Though this sleep-wake cycle reduces radio usage, improvement can be made in the way the cluster heads are selected. Cluster head selection that emphasizes the geometrical relations in the network will enhance load sharing among the nodes. The data are also analyzed to reduce redundant transmissions, and multi-hop communication will place lighter loads on the network.
Keywords: WSN, wireless sensor networks, MLBS, maximum lifetime backbone scheduling.
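The geometry-aware cluster head selection mentioned above can be illustrated by grouping nodes spatially and picking, in each group, a head that balances residual energy and centrality, as in the Python sketch below. The field size, energy model, cluster count, and scoring rule are assumptions for illustration and are not the paper's MLBS formulation.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy field: 100 sensor nodes scattered over a 100 m x 100 m area.
rng = np.random.default_rng(0)
positions = rng.uniform(0, 100, size=(100, 2))
energy = rng.uniform(0.5, 1.0, size=100)        # remaining battery (normalised)

# Geometry-aware clustering: group nodes spatially, then pick as cluster head
# the member that best combines residual energy and closeness to the centroid.
k = 5
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(positions)
heads = []
for c in range(k):
    members = np.where(labels == c)[0]
    centroid = positions[members].mean(axis=0)
    dist = np.linalg.norm(positions[members] - centroid, axis=1)
    score = energy[members] - dist / (dist.max() + 1e-9)   # favour energy and centrality
    heads.append(members[score.argmax()])
print("cluster heads:", heads)
```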
371 Influence of Fiber Packing on Transverse Plastic Properties of Metal Matrix Composites
Authors: Mohammad Tahaye Abadi
Abstract:
The present paper concerns the influence of fiber packing on the transverse plastic properties of metal matrix composites. A micromechanical modeling procedure is used to predict the effective mechanical properties of composite materials at large tensile and compressive deformations. The microstructure is represented by a repeating unit cell (RUC). Two fiber arrays are considered, including an ideal square fiber packing and a random fiber packing defined by a random sequential algorithm. The micromechanical modeling procedure is implemented for a graphite/aluminum metal matrix composite in which the reinforcement behaves as an elastic, isotropic solid and the matrix is modeled as an isotropic elastic-plastic solid following the von Mises criterion with isotropic hardening and the Ramberg-Osgood relationship between equivalent true stress and logarithmic strain. The deformation is increased to a considerable value to evaluate both the elastic and plastic behaviors of metal matrix composites. The yield strength and true elastic-plastic stress are determined for the graphite/aluminum composites.
Keywords: Fiber packing, metal matrix composites, micromechanics, plastic deformation, random
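The random fiber arrangement named above can be generated with a random sequential algorithm: fibre centres are drawn one at a time and accepted only when they do not overlap fibres already placed. The simplified Python sketch below ignores periodic boundary images and uses an assumed fibre count and radius (roughly a 40% volume fraction), so it illustrates the idea rather than the paper's exact RUC generator.

```python
import numpy as np

def random_sequential_packing(n_fibers, radius, cell=1.0, max_tries=100000, seed=0):
    """Random sequential algorithm: draw candidate fibre centres one at a time
    and accept a candidate only if it does not overlap any placed fibre."""
    rng = np.random.default_rng(seed)
    centres = []
    for _ in range(max_tries):
        c = rng.uniform(radius, cell - radius, size=2)
        if all(np.hypot(*(c - p)) >= 2 * radius for p in centres):
            centres.append(c)
            if len(centres) == n_fibers:
                break
    return np.array(centres)

# ~40% fibre volume fraction in a unit cell (illustrative values).
centres = random_sequential_packing(n_fibers=30, radius=0.065)
print(len(centres), "fibres placed")
```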