Search results for: Adaptive fuzzy clustering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1926

246 Discovery of Human HMG-CoA Reductase Inhibitors Using Structure-Based Pharmacophore Modeling Combined with Molecular Dynamics Simulation Methodologies

Authors: Minky Son, Chanin Park, Ayoung Baek, Shalini John, Keun Woo Lee

Abstract:

3-Hydroxy-3-methylglutaryl coenzyme A reductase (HMGR) catalyzes the conversion of HMG-CoA to mevalonate using NADPH, and the enzyme is involved in the rate-controlling step of mevalonate biosynthesis. Inhibition of HMGR is considered an effective way to lower cholesterol levels, so the enzyme is a drug target for treating hypercholesterolemia, a major risk factor of cardiovascular disease. To discover novel HMGR inhibitors, we performed structure-based pharmacophore modeling combined with molecular dynamics (MD) simulation. Four HMGR inhibitors were used for MD simulation, and a representative structure of each simulation was selected by clustering analysis. Four structure-based pharmacophore models were generated using the representative structures. The generated models were validated and used in virtual screening to find novel scaffolds for inhibiting HMGR. The screened compounds were filtered by applying drug-like properties and used in molecular docking. Finally, four hit compounds were obtained and these complexes were refined using energy minimization. These compounds might be potential leads for designing novel HMGR inhibitors.
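The representative-structure selection by clustering analysis mentioned above can be sketched as follows, assuming each MD frame has already been reduced to a numeric feature vector; the feature choice, the use of k-means, and the cluster count are illustrative assumptions rather than the authors' exact protocol.

```python
import numpy as np
from sklearn.cluster import KMeans

def representative_frame(frame_features, n_clusters=5, random_state=0):
    """Cluster MD frames and return the index of the frame closest to the
    centroid of the most populated cluster (one common way to pick a single
    representative structure from a trajectory)."""
    km = KMeans(n_clusters=n_clusters, random_state=random_state, n_init=10)
    labels = km.fit_predict(frame_features)
    biggest = np.bincount(labels).argmax()          # most populated cluster
    members = np.where(labels == biggest)[0]
    centroid = km.cluster_centers_[biggest]
    dists = np.linalg.norm(frame_features[members] - centroid, axis=1)
    return members[np.argmin(dists)]

# Usage with stand-in data: 1000 frames, 30 features each.
frames = np.random.rand(1000, 30)
print("representative frame index:", representative_frame(frames))
```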

Keywords: Anti-hypercholesterolemia drug, HMGR inhibitor, Molecular dynamics simulation, Structure-based pharmacophore modeling.

245 Exploring SSD Suitable Allocation Schemes Incompliance with Workload Patterns

Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim

Abstract:

In Solid-State Drive (SSD) performance, how well the data has been parallelized is an important factor. SSD parallelization is affected by the allocation scheme, which is directly connected to SSD performance. The representative allocation schemes are dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better for read-operation parallelism. It is therefore hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated several mixed data patterns and analyzed the results to help choose the right scheme for better performance. The results show that static allocation is more suitable when the data arrival interval is long enough for prior operations to finish and in continuous read-intensive environments, while dynamic allocation performs best for write performance and random data patterns.
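A toy sketch of the two allocation schemes compared above: static allocation fixes the channel by the logical page number, while dynamic allocation sends each request to the least-busy channel. The queue model, channel count, and policies below are illustrative assumptions, not the simulator used in the study.

```python
from collections import deque

N_CHANNELS = 4

def static_channel(page_no):
    # Static allocation: the channel is fixed by the logical page number,
    # which keeps reads of the same data on predictable channels.
    return page_no % N_CHANNELS

def dynamic_channel(queues):
    # Dynamic allocation: pick whichever channel currently has the
    # shortest queue, which maximizes write parallelism.
    return min(range(N_CHANNELS), key=lambda c: len(queues[c]))

# Tiny demonstration: issue ten writes under the dynamic policy.
queues = [deque() for _ in range(N_CHANNELS)]
for page in range(10):
    c = dynamic_channel(queues)
    queues[c].append(("write", page))
print("dynamic queue lengths:", [len(q) for q in queues])
print("static channels for pages 0-9:", [static_channel(p) for p in range(10)])
```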

Keywords: Dynamic allocation, NAND Flash based SSD, SSD parallelism, static allocation.

244 Recognition and Reconstruction of Partially Occluded Objects

Authors: Michela Lecca, Stefano Messelodi

Abstract:

A new automatic system for the recognition and reconstruction of rescaled and/or rotated partially occluded objects is presented. The objects to be recognized are described by 2D views, and each view is occluded by several half-planes. The whole object views and their visible parts (linear cuts) are then stored in a database. To establish whether a region R of an input image represents a possibly occluded object, the system generates a set of linear cuts of R and compares them with the elements in the database. Each linear cut of R is associated with the most similar database linear cut. R is recognized as an instance of the object O if the majority of the linear cuts of R are associated with linear cuts of views of O. In the case of recognition, the system reconstructs the occluded part of R and determines the scale factor and the orientation in the image plane of the recognized object view. The system has been tested on two different datasets of objects, showing good performance both in terms of recognition and reconstruction accuracy.
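The recognition rule described above (associate each linear cut of R with its most similar database cut, then take a majority vote over object labels) can be sketched as a nearest-neighbour vote; the cut descriptors and distance below are stand-ins for whatever representation the system actually uses.

```python
import numpy as np
from collections import Counter

def recognize(region_cuts, db_descriptors, db_labels):
    """region_cuts: (m, d) descriptors of the linear cuts of region R.
    db_descriptors: (n, d) descriptors of all stored linear cuts.
    db_labels: object id of the view each stored cut belongs to.
    Returns the majority-voted object id."""
    votes = []
    for cut in region_cuts:
        nearest = np.argmin(np.linalg.norm(db_descriptors - cut, axis=1))
        votes.append(db_labels[nearest])
    return Counter(votes).most_common(1)[0][0]

# Stand-in data: 3 objects with 20 stored cuts each, query region with 8 cuts.
rng = np.random.default_rng(0)
db = rng.normal(size=(60, 16))
labels = np.repeat([0, 1, 2], 20)
query = db[40:48] + rng.normal(scale=0.05, size=(8, 16))   # noisy cuts of object 2
print("recognized object:", recognize(query, db, labels))
```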

Keywords: Occluded Object Recognition, Shape Reconstruction, Automatic Self-Adaptive Systems, Linear Cut.

243 Energy Recovery from Swell with a Height Inferior to 1.5 m

Authors: A. Errasti, F. Doffagne, O. Foucrier, S. Kao, A. Meigne, H. Pellae, T. Rouland

Abstract:

Renewable energy recovery has been an important research domain in the past few years in view of protecting our ecosystem. Several industrial companies are setting up widespread recovery systems to exploit wave energy. Most of them are large, are installed near the shore, and exploit current flows. However, as oceans cover 70% of the Earth's surface, a huge space is still unexploited for energy production. The present analysis focuses on small-scale surface wave energy recovery. The principle is exactly the opposite of that of a wheel damper for a car on a road. Instead of keeping the car body as non-oscillatory as possible through adapted control, the system is designed so that its oscillation amplitude under wave action is maximized with respect to the boat carrying it, in view of recovering the differential potential energy. From a parametric analysis of the system equations, interesting domains have been selected and the expected energy output has been evaluated.

Keywords: Small scale wave, potential energy, optimized energy recovery, auto-adaptive system.

242 Automatic Detection of Mass Type Breast Cancer using Texture Analysis in Korean Digital Mammography

Authors: E. B. Jo, J. H. Lee, J. Y. Park, S. M. Kim

Abstract:

In this study, we present an advanced detection technique for mass-type breast cancer based on the texture information of organs. The proposed method detects the cancer areas in three stages. In the first stage, the midpoints of the mass area are determined based on AHE (Adaptive Histogram Equalization). In the second stage, we set the threshold coefficient of homogeneity by using MLE (Maximum Likelihood Estimation) to compute the uniformity of texture. Finally, mass-type cancer tissues are extracted from the original image. As a result, it was observed that the proposed method shows improved detection performance on the dense breast tissues of Korean women compared with existing methods. It is expected that the proposed method may provide additional diagnostic information for the detection of mass-type breast cancer.

Keywords: Mass Type Breast Cancer, Mammography, Maximum Likelihood Estimation (MLE), Ranklets, SVM

241 Modeling and Simulation of Flow Shop Scheduling Problem through Petri Net Tools

Authors: Joselito Medina Marin, Norberto Hernández Romero, Juan Carlos Seck Tuoh Mora, Erick S. Martinez Gomez

Abstract:

The Flow Shop Scheduling Problem (FSSP) is a typical problem faced by production planning managers in Flexible Manufacturing Systems (FMS). The problem consists in finding the optimal schedule to carry out a set of jobs, which are processed on a set of machines or shared resources, with all jobs processed in the same machine sequence. As in all scheduling problems, the makespan can be obtained by drawing the Gantt chart according to the operation order, among other alternatives. In this way, an FMS presenting the FSSP can be modeled by Petri nets (PNs), a powerful tool that has been used to model and analyze discrete event systems. The makespan can then be obtained by simulating the PN through the token game animation and the incidence matrix. In this work, we present an adaptive PN to obtain the makespan of the FSSP by applying PN analytical tools.
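The token-game simulation relies on the PN state equation M' = M + C·σ, where C is the incidence matrix and σ the firing count vector. A minimal sketch with an assumed toy net follows; the FSSP-specific net from the paper is not reproduced.

```python
import numpy as np

def fire(marking, incidence, transition):
    """Fire one transition if it is enabled and return the new marking,
    following the state equation M' = M + C[:, t]."""
    pre = np.clip(-incidence[:, transition], 0, None)   # tokens consumed
    if np.all(marking >= pre):
        return marking + incidence[:, transition]
    raise ValueError("transition not enabled")

# Toy net: 3 places, 2 transitions (columns of the incidence matrix C).
C = np.array([[-1,  0],
              [ 1, -1],
              [ 0,  1]])
M0 = np.array([1, 0, 0])

M1 = fire(M0, C, 0)      # t0 moves the token from p0 to p1
M2 = fire(M1, C, 1)      # t1 moves it from p1 to p2
print(M0, "->", M1, "->", M2)
```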

Keywords: Flow-shop scheduling problem, makespan, Petri nets, state equation.

240 Off-State Leakage Power Reduction by Automatic Monitoring and Control System

Authors: S. Abdollahi Pour, M. Saneei

Abstract:

This paper proposes a new circuit design that monitors the total leakage current during standby mode and generates the optimal reverse body bias voltage, using the adaptive body bias (ABB) technique to compensate for die-to-die parameter variations. Design details of the power monitor are examined using a simulation framework in 65nm and 32nm BTPM model CMOS processes. Experimental results show that the overhead of the proposed circuit in terms of power consumption is about 10 μW for the 32nm technology and about 12 μW for the 65nm technology at the same supply voltage as the core power supply. Moreover, the results show that the proposed circuit design is not very sensitive to temperature or process variations. Besides, it uses simple blocks that offer good sensitivity, high speed, and a continuous feedback loop.

Keywords: leakage current, leakage power monitor, body biasing, low power

239 Shot Transition Detection with Minimal Decoding of MPEG Video Streams

Authors: Mona A. Fouad, Fatma M. Bayoumi, Hoda M. Onsi, Mohamed G. Darwish

Abstract:

Digital libraries are becoming more and more necessary in order to support users with powerful and easy-to-use tools for searching, browsing and retrieving media information. The starting point for these tasks is the segmentation of video content into shots. To segment MPEG video streams into shots, a fully automatic procedure that detects both abrupt and gradual transitions (dissolve and fade groups) with minimal decoding in real time is developed in this study. Each transition type is explored through two phases: macro-block type analysis in B-frames, and on-demand intensity information analysis. The experimental results show remarkable performance in detecting gradual transitions for some kinds of input data and comparable results for the rest of the examined video streams. Almost all abrupt transitions could be detected with very few false positive alarms.
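A minimal sketch of abrupt-transition detection with an adaptive threshold: a frame is flagged as a cut when its dissimilarity to the previous frame exceeds the local mean plus a multiple of the local standard deviation. The dissimilarity signal here is synthetic; the paper's actual features (macro-block types in B-frames and on-demand intensity information) are not reproduced.

```python
import numpy as np

def abrupt_cuts(dissim, window=10, alpha=3.0):
    """Flag frame i as a cut when its dissimilarity to frame i-1 exceeds
    mean + alpha * std of the preceding sliding window (adaptive threshold)."""
    cuts = []
    for i in range(window, len(dissim)):
        local = dissim[i - window:i]
        if dissim[i] > local.mean() + alpha * local.std() + 1e-9:
            cuts.append(i)
    return cuts

# Synthetic inter-frame dissimilarity signal with two injected cuts.
rng = np.random.default_rng(1)
d = rng.normal(0.05, 0.01, 200)
d[[60, 140]] = 0.9
print("detected cuts at frames:", abrupt_cuts(d))
```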

Keywords: Adaptive threshold, abrupt transitions, gradual transitions, MPEG video streams.

238 A Novel Nucleus-Based Classifier for Discrimination of Osteoclasts and Mesenchymal Precursor Cells in Mouse Bone Marrow Cultures

Authors: Andreas Heindl, Alexander K. Seewald, Martin Schepelmann, Radu Rogojanu, Giovanna Bises, Theresia Thalhammer, Isabella Ellinger

Abstract:

Bone remodeling occurs by the balanced action of bone-resorbing osteoclasts (OC) and bone-building osteoblasts. Increased bone resorption by excessive OC activity contributes to malignant and non-malignant diseases including osteoporosis. To study OC differentiation and function, OC formed in in vitro cultures are currently counted manually, a tedious procedure which is prone to inter-observer differences. Aiming at an automated OC-quantification system, classification of OC and precursor cells was done on fluorescence microscope images based on the distinct appearance of fluorescent nuclei. Following ellipse fitting to nuclei, a combination of eight features enabled clustering of OC and precursor cell nuclei. After evaluating different machine-learning techniques, LOGREG achieved 74% correctly classified OC and precursor cell nuclei, outperforming human experts (best expert: 55%). In combination with the automated detection of total cell areas, this system allows various cell parameters to be measured and, most importantly, proteins involved in osteoclastogenesis to be quantified.
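The LOGREG classifier reported above corresponds to plain logistic regression on the eight ellipse-derived features; a sketch with scikit-learn follows, using randomly generated stand-in features since the exact feature set is not listed in the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Stand-in data: 8 ellipse-derived features per nucleus (e.g. axis lengths,
# eccentricity, area, intensity statistics), label 1 = osteoclast nucleus,
# label 0 = precursor-cell nucleus.
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 8))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=1.0, size=300) > 0).astype(int)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```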

Keywords: osteoclasts, machine learning, ellipse fitting.

237 Adaptive Anisotropic Diffusion for Ultrasonic Image Denoising and Edge Enhancement

Authors: Shujun Fu, Qiuqi Ruan, Wenqia Wang, Yu Li

Abstract:

Utilizing the echoic intensity and distribution from different organs and local details of the human body, ultrasonic images can capture important medical pathological changes, which unfortunately may be affected by ultrasonic speckle noise. A feature-preserving ultrasonic image denoising and edge enhancement scheme is put forth, which includes two terms: anisotropic diffusion and edge enhancement, controlled by the optimum smoothing time. In this scheme, the anisotropic diffusion is governed by the local coordinate transformation and the first- and second-order normal derivatives of the image, while the edge enhancement is done by the hyperbolic tangent function. Experiments on real ultrasonic images indicate that our scheme effectively preserves edges, local details and ultrasonic echoic bright strips during denoising.
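The diffusion term in the paper is governed by a local coordinate transformation and normal derivatives; as a rough stand-in, the sketch below uses the classic Perona-Malik anisotropic diffusion followed by a hyperbolic-tangent contrast step, with all parameters illustrative.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, lam=0.2):
    """Classic Perona-Malik anisotropic diffusion: smooth within regions
    while slowing diffusion across strong edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward the four neighbours
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u,  1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u,  1, axis=1) - u
        # conduction coefficients (small across strong edges)
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += lam * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

def tanh_enhance(img, gain=5.0):
    """Stretch contrast around the mean with a hyperbolic tangent; the steep
    central slope sharpens the transitions at edges."""
    centred = img - img.mean()
    return 0.5 * (np.tanh(gain * centred) + 1.0)

noisy = np.clip(np.random.default_rng(0).normal(0.5, 0.1, (64, 64)), 0, 1)
enhanced = tanh_enhance(perona_malik(noisy))
```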

Keywords: anisotropic diffusion, coordinate transformation, directional derivatives, edge enhancement, hyperbolic tangent function, image denoising.

236 Survey on Strategic Games and Decision Making

Authors: S. Madhavi, K. Baala Srinivas, G. Bharath, R. K. Indhuja, M. Kowser Chandini

Abstract:

Game theory is the study of how people interact and make decisions to handle competitive situations. It has mainly been developed to study decision making in complex situations. Humans routinely alter their behaviour in response to changes in their social and physical environment. As a consequence, the outcomes of decisions that depend on the behaviour of multiple decision makers are difficult to predict and require highly adaptive decision-making strategies. In addition, decision makers may have preferences regarding consequences to other individuals and may choose their actions to improve or reduce the well-being of others. Nash equilibrium is a fundamental concept in the theory of games and the most widely used method of predicting the outcome of a strategic interaction in the social sciences. A Nash equilibrium exists when there is no profitable unilateral deviation for any of the players involved. In other words, no player in the game would take a different action as long as every other player's action remains the same.
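The unilateral-deviation definition of a Nash equilibrium translates directly into code: a strategy profile is an equilibrium when each player's action is a best response to the others. A sketch for pure strategies in a two-player bimatrix game:

```python
import numpy as np
from itertools import product

def pure_nash_equilibria(A, B):
    """Return all pure-strategy Nash equilibria of a bimatrix game.
    A[i, j] is the row player's payoff, B[i, j] the column player's.
    A profile (i, j) is an equilibrium when neither player gains by
    a unilateral deviation."""
    equilibria = []
    for i, j in product(range(A.shape[0]), range(A.shape[1])):
        row_best = A[i, j] >= A[:, j].max()
        col_best = B[i, j] >= B[i, :].max()
        if row_best and col_best:
            equilibria.append((i, j))
    return equilibria

# Prisoner's dilemma: mutual defection (1, 1) is the unique equilibrium.
A = np.array([[-1, -3],
              [ 0, -2]])
B = np.array([[-1,  0],
              [-3, -2]])
print(pure_nash_equilibria(A, B))
```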

Keywords: Game Theory, Nash Equilibrium, Rules of Dominance.

235 Re-Design of Load Shedding Schemes of the Kosovo Power System

Authors: A. Gjukaj, G. Kabashi, G. Pula, N. Avdiu, B. Prebreza

Abstract:

This paper discusses aspects of the re-design of load-shedding schemes with respect to actual developments in the Kosovo power system. Load-shedding is a type of emergency control that is designed to ensure system stability by reducing the power system load to match the power generation supply. This paper presents a new adaptive load-shedding scheme that provides emergency protection against excessive frequency decline in cases when the Kosovo power system might be disconnected from the regional transmission network. The proposed load-shedding scheme uses the local frequency rate information to adapt the load-shedding pattern to the size and location of the occurring disturbance. The proposed scheme is tested in a software simulation on a large-scale PSS/E model which represents nine power system areas of Southeast Europe, including the Kosovo power system.
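The adaptive idea (scale the amount of load to shed by the local rate of frequency decline) can be sketched as below; the frequency bands, shed fractions, and cap are illustrative assumptions, not the settings tuned for the Kosovo system.

```python
def load_to_shed(freq_hz, dfdt_hz_s, nominal=50.0):
    """Return the fraction of system load to shed, scaled by how fast
    the frequency is falling (an adaptive under-frequency scheme)."""
    if freq_hz >= 49.0:
        return 0.0                          # normal band, no action
    # Base stage depends on how far the frequency has already dropped ...
    base = 0.05 if freq_hz > 48.4 else 0.15
    # ... and the rate of decline scales the shed amount up to a cap.
    severity = min(abs(dfdt_hz_s) / 1.0, 2.0)
    return min(base * (1.0 + severity), 0.5)

for f, r in [(49.5, -0.1), (48.8, -0.3), (48.8, -1.2), (48.2, -1.5)]:
    print(f"f={f} Hz, df/dt={r} Hz/s -> shed {load_to_shed(f, r):.0%}")
```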

Keywords: Load Shedding, Power System Transient, PSS/E Dynamic Simulation, Under-frequency Protection

234 Automatic Detection and Classification of Diabetic Retinopathy Using Retinal Fundus Images

Authors: A. Biran, P. Sobhe Bidari, A. Almazroe, V. Lakshminarayanan, K. Raahemifar

Abstract:

Diabetic Retinopathy (DR) is a severe retinal disease caused by diabetes mellitus. It leads to blindness when it progresses to the proliferative level. Early indications of DR are the appearance of microaneurysms, hemorrhages and hard exudates. In this paper, an automatic algorithm for the detection of DR is proposed. The algorithm is based on a combination of several image processing techniques including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering and thresholding. A Support Vector Machine (SVM) classifier is then used to classify retinal images as normal or abnormal, the latter including non-proliferative or proliferative DR. The proposed method has been tested on images selected from the Structured Analysis of the Retina (STARE) database using MATLAB code, and it reliably detects DR. The sensitivity, specificity and accuracy of this approach are 90%, 87.5%, and 91.4%, respectively.

Keywords: Diabetic retinopathy, fundus images, STARE, Gabor filter, SVM.

233 Real-Time Episodic Memory Construction for Optimal Action Selection in Cognitive Robotics

Authors: Deon de Jager, Yahya Zweiri, Dimitrios Makris

Abstract:

The three most important components in the cognitive architecture for cognitive robotics are memory representation, memory recall, and action selection performed by the executive. In this paper, action selection, performed by the executive, is defined as a memory quantification and optimization process. The methodology describes the real-time construction of episodic memory through semantic memory optimization. The optimization is performed by set-based particle swarm optimization, using an adaptive entropy memory quantification approach for fitness evaluation. The performance of the approach is experimentally evaluated by simulation, where a UAV is tasked with the collection and delivery of a medical package. The experiments show that the UAV dynamically uses the episodic memory to autonomously control its velocity while successfully completing its mission.

Keywords: Cognitive robotics, semantic memory, episodic memory, maximum entropy principle, particle swarm optimization.

232 An Empirical Mode Decomposition Based Method for Action Potential Detection in Neural Raw Data

Authors: Sajjad Farashi, Mohammadjavad Abolhassani, Mostafa Taghavi Kani

Abstract:

Information in the nervous system is coded as firing patterns of electrical signals called action potentials or spikes, so an essential step in the analysis of neural mechanisms is the detection of action potentials embedded in the neural data. Several methods have been proposed in the literature for this purpose. In this paper, a novel method based on empirical mode decomposition (EMD) has been developed. EMD is a decomposition method that extracts oscillations with different frequency ranges from a waveform. The method is adaptive, and no a priori knowledge about the data or parameter tuning is needed. The results for simulated data indicate that the proposed method is comparable with wavelet-based methods for spike detection. For neural signals with a signal-to-noise ratio near 3, the proposed method is capable of detecting more than 95% of action potentials accurately.

Keywords: EMD, neural data processing, spike detection, wavelet decomposition.

231 A Novel Convergence Accelerator for the LMS Adaptive Algorithm

Authors: Jeng-Shin Sheu, Jenn-Kaie Lain, Tai-Kuo Woo, Jyh-Horng Wen

Abstract:

The least mean square (LMS) algorithm is one of the most well-known algorithms for mobile communication systems due to its implementation simplicity. However, its main limitation is its relatively slow convergence rate. In this paper, a booster using the concept of Markov chains is proposed to speed up the convergence rate of LMS algorithms. The nature of Markov chains makes it possible to exploit past information in the updating process. Moreover, since the transition matrix has a smaller variance than the weight itself by the central limit theorem, the weight transition matrix converges faster than the weight itself. Accordingly, the proposed Markov-chain based booster has the ability to track variations in signal characteristics and, meanwhile, accelerate the rate of convergence of LMS algorithms. Simulation results show that, when the Markov-chain based booster is applied, the LMS algorithm effectively increases its convergence rate and further approaches the Wiener solution. The mean square error is also remarkably reduced, while the convergence rate is improved.
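For reference, the baseline LMS recursion that the proposed booster accelerates is sketched below in a system-identification setting; the Markov-chain booster itself is not reproduced.

```python
import numpy as np

def lms(x, d, n_taps=8, mu=0.05):
    """Standard LMS adaptive filter: form the output from the most recent
    n_taps input samples, compute the error against the desired signal d,
    and move the weights a small step along the instantaneous gradient."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # x[n], x[n-1], ..., x[n-n_taps+1]
        y = w @ u
        e[n] = d[n] - y
        w += mu * e[n] * u                  # LMS weight update
    return w, e

# System-identification example: estimate an unknown FIR channel.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
h_true = np.array([0.5, -0.3, 0.2, 0.1, 0.0, 0.0, 0.0, 0.0])
d = np.convolve(x, h_true)[:len(x)] + 0.01 * rng.normal(size=len(x))
w, e = lms(x, d)
print("estimated taps:", np.round(w, 2))
```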

Keywords: LMS, Markov chain, convergence rate, accelerator.

230 Development of Orbital TIG Welding Robot System for the Pipe

Authors: Dongho Kim, Sung Choi, Kyowoong Pee, Youngsik Cho, Seungwoo Jeong, Soo-Ho Kim

Abstract:

This study is about an orbital TIG welding robot system which travels on a guide rail installed on the pipe, and welds and tracks the pipe seam using LVS (Laser Vision Sensor) joint profile data. The orbital welding robot system consists of the robot, welder, controller, and LVS. Moreover, we can define the relationship between the welding travel speed and the wire feed speed, and build a linear equation using the maximum and minimum amounts of weld metal. Using this linear equation, we can accurately determine the welding travel speed and the wire feed speed corresponding to the area of weld captured by the LVS. We applied this orbital TIG welding robot system to stainless steel and duplex pipes at the DSME (Daewoo Shipbuilding and Marine Engineering Co., Ltd.) shipyard, and the result of the radiographic test is almost perfect (defect rate: 0.033%).

Keywords: Adaptive welding, automatic welding, Pipe welding, Orbital welding, Laser vision sensor, LVS, welding D/B.

229 Effect of Neighborhood Size on Negative Weights in Punctual Kriging Based Image Restoration

Authors: Asmatullah Chaudhry, Anwar M. Mirza

Abstract:

We present a general comparison of punctual kriging based image restoration for different neighbourhood sizes. The formulation of the technique under consideration is based on punctual kriging and fuzzy concepts for image restoration in the spatial domain. Three different neighbourhood windows are considered to estimate the semivariance at different lags, in order to study its effect on the reduction of the negative weights that arise in punctual kriging and, consequently, on the restoration of degraded images. Our results show that the effect of neighbourhood sizes larger than 5x5 on the reduction of negative weights is insignificant. In addition, image quality measures, such as structure similarity indices, peak signal-to-noise ratios and the new variogram-based quality measures, show that a 3x3 window gives better performance compared with larger window sizes.
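Punctual kriging weights are derived from the empirical semivariance at different lags; a minimal sketch of estimating gamma(h) inside a small neighbourhood window follows, with the lag handling simplified to horizontal lags only.

```python
import numpy as np

def semivariance(window, max_lag=2):
    """Empirical semivariance gamma(h) = mean((z(x) - z(x+h))**2) / 2 for
    horizontal lags h = 1..max_lag inside a small image neighbourhood."""
    gamma = {}
    for h in range(1, max_lag + 1):
        diffs = window[:, h:] - window[:, :-h]
        gamma[h] = 0.5 * np.mean(diffs ** 2)
    return gamma

# 5x5 neighbourhood of a degraded image (stand-in values).
rng = np.random.default_rng(3)
patch = rng.normal(128, 20, size=(5, 5))
print(semivariance(patch))
```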

Keywords: Image restoration, punctual kriging, semi-variance, structure similarity index, negative weights in punctual kriging.

228 Agent Decision using Granular Computing in Traffic System

Authors: Yasser F. Hassan, Marwa Abdeen, Mustafa Fahmy

Abstract:

In recent years, multi-agent systems have emerged as one of the interesting architectures facilitating distributed collaboration and distributed problem solving. Each node (agent) of the network might pursue its own agenda, exploit its environment, develop its own problem-solving strategy and establish required communication strategies. Within each node of the network, one could encounter a diversity of problem-solving approaches. Quite commonly, the agents can realize their processing at the level of information granules that is most suitable from their local point of view. Information granules can come at various levels of granularity. Each agent could exploit a certain formalism of information granulation engaging a machinery of fuzzy sets, interval analysis, or rough sets, to name a few dominant technologies of granular computing. With this in mind, a fundamental issue arises of forming effective interaction linkages between the agents so that they fully broadcast their findings and benefit from interacting with others.

Keywords: Granular computing, rough sets, agents, traffic system.

227 Empirical Study from Final Exams of Computer Science Courses Demystifying the Notion of 'an Average Software Engineer'

Authors: Alex Elentukh

Abstract:

The paper is based on data collected from final exams administered during five years of teaching the graduate course in software engineering. A visualization instrument with four distinct personas has been used to improve the effectiveness of each class. The study offers a plethora of clues about students' behavioral preferences. Diversity among students (professional background, physical proximity) is too significant to assume a single face of a learner. This is particularly true for a body of online graduate students in computer science. The conclusions of the study (each learner is unique and each class is unique) are extrapolated to demystify the notion of an 'average software engineer'. An immediate direction for an educator is to ensure that a course applies to a wide audience of very different individuals. On the other hand, a student should be clear about his or her abilities and preferences, in order to follow the most effective learning path.

Keywords: K.3.2 computer & information science education, learner profiling, adaptive learning, software engineering.

226 A General Framework for Knowledge Discovery Using High Performance Machine Learning Algorithms

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks like clustering and classification with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.

Keywords: Active Contour, Bayesian, Echocardiographic image, Feature vector.

225 The Spiral_OWL Model – Towards Spiral Knowledge Engineering

Authors: Hafizullah A. Hashim, Aniza. A

Abstract:

The Spiral development model has been used successfully in many commercial systems and in a good number of defense systems. This is due to its cost-effective incremental commitment of funds (via an analogy of the spiral model to stud poker) and to the fact that it can be used to develop hardware or to integrate software, hardware, and systems. To support adaptive, semantic collaboration between domain experts and knowledge engineers, a new knowledge engineering process, called Spiral_OWL, is proposed. This model is based on the idea of iterative refinement, annotation and structuring of the knowledge base. The Spiral_OWL model is generated on the basis of the spiral model and knowledge engineering methodology. A central paradigm of the Spiral_OWL model is the concentration on risk-driven determination of the knowledge engineering process. The collaboration aspect comes into play during the knowledge acquisition and knowledge validation phases. The design rationale for the Spiral_OWL model is to be easy to implement and well organized, with an iterative development cycle shaped as an expanding spiral.

Keywords: Domain Expert, Knowledge Base, Ontology, Software Process.

224 Dominating Set Algorithm and Trust Evaluation Scheme for Secured Cluster Formation and Data Transferring

Authors: Y. Harold Robinson, M. Rajaram, E. Golden Julie, S. Balaji

Abstract:

This paper describes a proficient way of choosing cluster heads based on a dominating set algorithm in a wireless sensor network (WSN). The algorithm overcomes energy deterioration problems through this cluster head selection process. Clustering algorithms such as LEACH, EEHC and HEED enhance scalability in WSNs. The dominating set algorithm keeps the first node alive longer than the previously used protocols. As the dominating set of cluster heads is directly connected to each node, the energy of the network is saved by eliminating intermediate nodes in the WSN. Security and trust are pivotal in network messaging. Each cluster head is secured with a unique key, and a member can connect with the cluster head only if it is secured too. The secured trust model provides security for data transmission in the dominated set network with the group key. The concept can be extended by adding a mobile sink, for each cluster or for a number of clusters, to transmit data or messages between cluster heads and the base station. Data security is preferably high and data loss can be prevented. The simulation demonstrates the concept of choosing cluster heads by the dominating set algorithm and trust evaluation using DSTE. The research done is rationalized.
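A greedy sketch of dominating-set based cluster-head selection: repeatedly pick the node that covers the most not-yet-dominated nodes until every node is a head or adjacent to one. This is a generic heuristic illustrating the idea, not necessarily the exact algorithm of the paper.

```python
def greedy_dominating_set(adj):
    """adj: dict mapping node -> set of neighbours.
    Returns a set of cluster heads such that every node is either a head or
    adjacent to one (a dominating set), chosen greedily by coverage."""
    undominated = set(adj)
    heads = set()
    while undominated:
        # pick the node that dominates the most still-undominated nodes
        best = max(adj, key=lambda v: len(({v} | adj[v]) & undominated))
        heads.add(best)
        undominated -= {best} | adj[best]
    return heads

# Small sensor topology.
adj = {
    0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4},
    3: {1, 5}, 4: {2, 5},    5: {3, 4, 6}, 6: {5},
}
print("cluster heads:", greedy_dominating_set(adj))
```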

Keywords: Wireless Sensor Networks, LEACH, EEHC, HEED, DSTE.

223 Formalizing a Procedure for Generating Uncertain Resource Availability Assumptions Based On Real Time Logistic Data Capturing with Auto-ID Systems for Reactive Scheduling

Authors: Lars Laußat, Manfred Helmus, Kamil Szczesny, Markus König

Abstract:

As one result of the project "Reactive Construction Project Scheduling using Real Time Construction Logistic Data and Simulation", a procedure for using data about uncertain resource availability assumptions in reactive scheduling processes has been developed. Prediction data about resource availability are generated in a formalized way using real-time monitoring data, e.g. from auto-ID systems on the construction site and in the supply chains. The paper focuses on the formalization of the procedure for monitoring construction logistic processes, for the detection of disturbances, and for generating new and uncertain scheduling assumptions for the reactive resource-constrained simulation procedure that is, and will be, further described in other papers.

Keywords: Auto-ID, Construction Logistic, Fuzzy, Monitoring, RFID, Scheduling.

222 A Methodology for Data Migration between Different Database Management Systems

Authors: Bogdan Walek, Cyril Klimes

Abstract:

The area of data migration is very topical at present. Current tools for data migration between relational databases have several disadvantages, which are presented in this paper. We propose a methodology for migrating database tables and their data between various types of relational database management systems (RDBMS). The proposed methodology contains an expert system. The expert system contains a knowledge base composed of IF-THEN rules and, based on the input data, suggests appropriate data types for the columns of the database tables. The proposed tool, which contains the expert system, also includes the possibility of optimizing the data types in the target RDBMS database tables based on the processed data of the source RDBMS database tables. The proposed expert system is demonstrated on the data migration of a selected database from the source RDBMS to the target RDBMS.
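The IF-THEN rule idea (suggest a target column type from properties of the source column) can be sketched with a small rule table; the rule contents and the MySQL-to-PostgreSQL direction are illustrative assumptions, not the paper's knowledge base.

```python
# Each rule: IF the source column matches the condition
# THEN suggest the given target data type.
RULES = [
    (lambda c: c["type"] == "TINYINT" and c["length"] == 1, "BOOLEAN"),
    (lambda c: c["type"] == "INT" and c["auto_increment"],  "SERIAL"),
    (lambda c: c["type"] == "VARCHAR" and c["length"] > 255, "TEXT"),
    (lambda c: c["type"] == "VARCHAR",                        "VARCHAR"),
    (lambda c: c["type"] == "DATETIME",                       "TIMESTAMP"),
]

def suggest_type(column):
    """Return the target type of the first rule that fires, or None if no
    rule matches (a human expert would then decide)."""
    for condition, target in RULES:
        if condition(column):
            return target
    return None

col = {"type": "VARCHAR", "length": 1024, "auto_increment": False}
print(suggest_type(col))   # -> TEXT
```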

Keywords: Expert system, fuzzy, data migration, database, relational database, data type, relational database management system.

221 Signal Driven Sampling and Filtering a Promising Approach for Time Varying Signals Processing

Authors: Saeed Mian Qaisar, Laurent Fesquet, Marc Renaudin

Abstract:

Mobile systems are powered by batteries, and reducing system power consumption is key to increasing their autonomy. It is known that such systems mostly deal with time-varying signals. Thus, we aim to achieve power efficiency by smartly adapting the system's processing activity to the local characteristics of the input signal. This is done by completely rethinking the processing chain and adopting signal-driven sampling and processing. In this context, a signal-driven filtering technique based on level-crossing sampling is devised. It adapts the sampling frequency and the filter order by analysing the local variations of the input signal, thereby correlating the processing activity with the signal variations. This leads to a drastic computational gain of the proposed technique compared to the classical one.
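A minimal sketch of level-crossing sampling: a sample is kept only when the signal crosses into a new amplitude level spaced by a fixed quantum, so the effective sampling rate follows the signal's local activity. The quantum value and the adaptive filtering stage of the paper are not reproduced.

```python
import numpy as np

def level_crossing_sample(t, x, quantum=0.1):
    """Return the times and values at which x crosses a level k*quantum.
    Fast-varying segments produce many samples, quiet segments few."""
    times, values = [t[0]], [x[0]]
    last_level = np.round(x[0] / quantum)
    for ti, xi in zip(t[1:], x[1:]):
        level = np.round(xi / quantum)
        if level != last_level:
            times.append(ti)
            values.append(level * quantum)
            last_level = level
    return np.array(times), np.array(values)

# A burst-like signal: mostly flat, with one active region.
t = np.linspace(0, 1, 2000)
x = np.where((t > 0.4) & (t < 0.6), np.sin(2 * np.pi * 25 * t), 0.0)
ts, xs = level_crossing_sample(t, x)
print(f"{len(ts)} samples kept out of {len(t)}")
```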

Keywords: Level Crossing Sampling, Activity Selection, Adaptive Rate Filtering, Computational Complexity.

220 Development of Web-Based Remote Desktop to Provide Adaptive User Interfaces in Cloud Platform

Authors: Shuen-Tai Wang, Hsi-Ya Chang

Abstract:

Cloud virtualization technologies are becoming more and more prevalent, and cloud users usually encounter the problem of how to access virtualized remote desktops easily over the web without installing special clients. To resolve this issue, we took advantage of HTML5 technology and developed a web-based remote desktop. It permits users to access a terminal running in our cloud platform from anywhere. We implemented a sketch of a web interface following the cloud computing concept, which seeks to enable collaboration and communication among users for high performance computing. Given the development of remote desktop virtualization, the user's desktop can be shifted from the traditional PC environment to the cloud platform, where it is stored on a remote virtual machine rather than locally. This proposed effort has the potential to provide an efficient, resilient and elastic environment for online cloud services. This is also made possible by low administrative costs as well as relatively inexpensive end-user terminals and reduced energy expenses.

Keywords: Virtualization, Remote Desktop, HTML5, Cloud Computing.

219 An MADM Framework toward Hierarchical Production Planning in Hybrid MTS/MTO Environments

Authors: H. Rafiei, M. Rabbani

Abstract:

This paper proposes a new decision-making structure to determine the appropriate product delivery strategy for different products in a manufacturing system among make-to-stock, make-to-order, and a hybrid strategy. Given the product delivery strategies for all products in the manufacturing system, the position of the Order Penetration Point (OPP) can be located with regard to the delivery strategies, among which locating the OPP in the hybrid strategy is a cumbersome task. In this regard, we employ the analytic network process, because there is a variety of interrelated driving factors involved in choosing the right location. Moreover, the proposed structure is augmented with fuzzy sets theory in order to cope with the uncertainty of judgments. Finally, the applicability of the proposed structure is proven in practice through a real industrial case company. The numerical results demonstrate the efficiency of the proposed decision-making structure in order partitioning and OPP location.

Keywords: Hybrid make-to-stock/make-to-order, Multi-attribute decision making, Order partitioning, Order penetration point.

218 Data Transmission Reliability in Short Message Integrated Distributed Monitoring Systems

Authors: Sui Xin, Li Chunsheng, Tian Di

Abstract:

Short message integrated distributed monitoring systems (SM-DMS) are growing rapidly in wireless communication applications in various areas, such as electromagnetic field (EMF) management, wastewater monitoring, and air pollution supervision. However, delays in short messages often make the data embedded in SM-DMS transmit unreliably. Moreover, there are few regulations dealing with this problem in SMS transmission protocols. In this study, based on an analysis of the command and data requirements in the SM-DMS, we developed a processing model for the control center to solve the delay problem in data transmission. The three components of the model, the data transmission protocol, the receiving buffer pool method, and the timer mechanism, are described in detail. Adjusting the threshold parameter in the timer mechanism is discussed with respect to adaptive performance during the runtime of the SM-DMS. This model optimizes the data transmission reliability in SM-DMS and provides a supplement to data transmission reliability protocols at the application level.

Keywords: Delay, SMS, reliability, distributed monitoring system (DMS), wireless communication.

217 New Efficient Method for Coding Color Images

Authors: Walaa M. Abd-Elhafiez, Wajeb Gharibi

Abstract:

In this paper, a novel color image compression technique for the efficient storage and delivery of data is proposed. The proposed compression technique starts with an RGB to YCbCr color transformation. Secondly, the Canny edge detection method is used to classify the blocks into edge and non-edge blocks. Each color component (Y, Cb, and Cr) is compressed by a discrete cosine transform (DCT), then quantized and coded step by step using adaptive arithmetic coding. Our technique is concerned with the compression ratio, bits per pixel and peak signal-to-noise ratio, and produces better results than JPEG and more recently published schemes (like CBDCT-CABS and MHC). The provided experimental results illustrate that the proposed technique is efficient and feasible in terms of compression ratio, bits per pixel and peak signal-to-noise ratio.
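The front of the pipeline described above (RGB to YCbCr transform, then 8x8 block DCT and quantization per component) can be sketched as below; the Canny-based block classification and the adaptive arithmetic coder are omitted, and the uniform quantization step is a single illustrative constant.

```python
import numpy as np
from scipy.fftpack import dct

def rgb_to_ycbcr(img):
    """ITU-R BT.601 RGB -> YCbCr transform on a float image in [0, 255]."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.1687 * r - 0.3313 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.4187 * g - 0.0813 * b + 128
    return np.stack([y, cb, cr], axis=-1)

def block_dct_quantize(channel, q=16, block=8):
    """2-D DCT on non-overlapping 8x8 blocks followed by uniform quantization."""
    h, w = (channel.shape[0] // block) * block, (channel.shape[1] // block) * block
    out = np.zeros((h, w))
    for i in range(0, h, block):
        for j in range(0, w, block):
            b2 = channel[i:i + block, j:j + block]
            coeffs = dct(dct(b2, axis=0, norm='ortho'), axis=1, norm='ortho')
            out[i:i + block, j:j + block] = np.round(coeffs / q)
    return out

rgb = np.random.default_rng(0).integers(0, 256, (64, 64, 3)).astype(float)
ycc = rgb_to_ycbcr(rgb)
qY = block_dct_quantize(ycc[..., 0])
print("nonzero quantized Y coefficients:", int(np.count_nonzero(qY)))
```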

Keywords: Image compression, color image, Q-coder, quantization, edge-detection.
