Search results for: linked cluster algorithm
4811 An Approach to the Assembly Line Balancing Problem with Uncertain Operation Time
Authors: Zhongmin Wang, Lin Wei, Hengshan Zhang, Tianhua Chen, Yimin Zhou
Abstract:
Assembly line balancing problems are significant in mass production systems. To deal with uncertainties that exist in practice but are barely mentioned in the literature, this paper develops a mathematical model with an optimisation algorithm to solve the assembly line balancing problem with uncertain operation time. The developed model can work with a variable number of workstations under uncertainty, aiming to obtain the minimal number of workstations and minimal idle time for each workstation. In particular, the proposed approach first introduces the concept of protection time, which works closely with the uncertain operation time. Four dominance rules and a mechanism for determining upper and lower bounds are subsequently put forward, which serve as the basis for the proposed branch and bound algorithm. Experimental results show that the proposed approach, verified on a benchmark data set, handles the uncertainties efficiently.
Keywords: assembly lines, SALBP-UOT, uncertain operation time, branch and bound algorithm
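The protection-time idea supports a quick sanity check on the workstation count. Below is a hedged sketch (function name and interface invented for illustration, not taken from the paper): a trivial lower bound obtained by padding each task's nominal time with its protection time and dividing the total by the cycle time. The paper's branch and bound algorithm refines bounds of this kind with its dominance rules.

```python
import math

def min_workstations_lower_bound(task_times, protection_times, cycle_time):
    """Simple lower bound for SALBP-UOT-style instances (hypothetical sketch).

    Each task's uncertain duration is covered by adding a protection time to
    its nominal time; the bound is total protected work divided by the cycle
    time, rounded up. This is not the paper's branch-and-bound procedure.
    """
    total = sum(t + p for t, p in zip(task_times, protection_times))
    return math.ceil(total / cycle_time)
```

For example, three tasks of 4, 3, and 5 time units, each protected by 1 extra unit, need at least 3 stations under a cycle time of 5.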
Procedia PDF Downloads 169
4810 Thin Films of Glassy Carbon Prepared by Cluster Deposition
Authors: Hatem Diaf, Patrice Melinon, Antonio Pereira, Bernard Moine, Nicholas Blanchard, Florent Bourquard, Florence Garrelie, Christophe Donnet
Abstract:
Glassy carbon exhibits excellent biological compatibility with live tissues, meaning it has high potential for applications in life science. Moreover, glassy carbon has interesting properties including high temperature resistance, hardness, low density, low electrical resistance, low friction, and low thermal resistance. The structure of glassy carbon has long been a subject of debate. It is now accepted that glassy carbon is 100% sp2. This term is somewhat confusing, since sp2 hybridization as defined in quantum chemistry involves both properties: threefold configuration and pi bonding (parallel pz orbitals). Using plasma laser deposition of carbon clusters combined with pulsed nano/femtosecond laser annealing, we are able to synthesize thin films of glassy carbon of good quality (probed by the G band/D disorder band ratio in Raman spectroscopy) without thermal post-annealing. A careful inspection of the Raman signal, plasmon losses, and structure performed by HRTEM (High Resolution Transmission Electron Microscopy) reveals that the two properties (threefold coordination and pi orbitals) cannot coexist. The structure of the films is compared to models including schwarzites, built from negatively curved surfaces, as opposed to onion or fullerene-like structures with positively curved surfaces. This study shows that a large family of porous carbons named vitreous carbon, with different structures, can coexist.
Keywords: glassy carbon, cluster deposition, coating, electronic structure
Procedia PDF Downloads 317
4809 An Algorithm for Preventing the Irregular Operation Modes of the Drive Synchronous Motor Providing the Ore Grinding
Authors: Baghdasaryan Marinka
Abstract:
The scientific and engineering interest in preventing emergency manifestations in the drive synchronous motors that ensure the ore grinding technological process is justified. An analysis of the known works devoted to the abnormal operation modes of synchronous motors, and of the possibilities of protection against them, has shown that their application is inexpedient for preventing the impermissible manifestations arising in the electrical drive synchronous motors ensuring the ore-grinding process. The main energy and technological factors affecting the technical condition of synchronous motors are evaluated. An algorithm for preventing the irregular operation modes of the electrical drive synchronous motor applied in the ore-grinding technological process has been developed and proposed for further application. It enables smart solutions that ensure the safe operation of the drive synchronous motor through a comprehensive consideration of the energy and technological factors.
Keywords: synchronous motor, abnormal operating mode, electric drive, algorithm, energy factor, technological factor
Procedia PDF Downloads 134
4808 Efficient Reconstruction of DNA Distance Matrices Using an Inverse Problem Approach
Authors: Boris Melnikov, Ye Zhang, Dmitrii Chaikovskii
Abstract:
We continue to consider one of the cybernetic methods in computational biology related to the study of DNA chains. Namely, we consider the problem of reconstructing a not fully filled distance matrix of DNA chains. In a programming context, it turns out that with a modern computer of average capabilities, creating even a small-sized distance matrix for mitochondrial DNA sequences is quite time-consuming with standard algorithms. As the size of the matrix grows, the required computational effort increases significantly, potentially spanning several weeks to months of non-stop processing. Hence, calculating the distance matrix on conventional computers is hardly feasible, and supercomputers are usually not available. We therefore previously published our variants of the algorithms for calculating the distance between two DNA chains, followed by algorithms for restoring partially filled matrices, i.e., the inverse problem of matrix processing. In this paper, we propose an algorithm for restoring the distance matrix for DNA chains; the primary focus is on enhancing the algorithms that shape the greedy function within the branch and bound method framework.
Keywords: DNA chains, distance matrix, optimization problem, restoring algorithm, greedy algorithm, heuristics
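The restoration step can be illustrated with a minimal sketch, assuming (our assumption, not the paper's stated method) that a missing distance may be filled with its tightest triangle-inequality upper bound over known intermediates; the authors' greedy function inside branch and bound is more elaborate.

```python
def restore_missing(D):
    """Greedy restoration of missing entries (None) in a symmetric distance
    matrix. Hypothetical sketch: each missing d(i, j) is filled with the
    tightest triangle-inequality upper bound min_k d(i, k) + d(k, j) over
    intermediates k whose distances to both i and j are known. For matrices
    with many gaps, several passes may be needed."""
    n = len(D)
    for i in range(n):
        for j in range(i + 1, n):
            if D[i][j] is None:
                candidates = [D[i][k] + D[k][j] for k in range(n)
                              if k not in (i, j)
                              and D[i][k] is not None and D[k][j] is not None]
                if candidates:
                    D[i][j] = D[j][i] = min(candidates)
    return D
```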
Procedia PDF Downloads 116
4807 Spectral Clustering for Manufacturing Cell Formation
Authors: Yessica Nataliani, Miin-Shen Yang
Abstract:
Cell formation (CF) is an important step in group technology. It is used in designing cellular manufacturing systems, exploiting similarities between parts in relation to machines so that part families and machine groups can be identified. There are many CF methods in the literature, but spectral clustering has rarely been used for CF. In this paper, we propose a spectral clustering algorithm for machine-part CF. Some experimental examples are used to illustrate its efficiency. Overall, the spectral clustering algorithm can be used in CF with a wide variety of machine/part matrices.
Keywords: group technology, cell formation, spectral clustering, grouping efficiency
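A minimal version of the spectral step might look as follows, assuming a machine-machine similarity matrix has already been built from the machine/part incidence data; this two-cell Fiedler-vector split is a textbook sketch, not the authors' exact algorithm.

```python
import numpy as np

def spectral_bipartition(S):
    """Split machines into two cells by the sign of the Fiedler vector
    (eigenvector of the second-smallest eigenvalue) of the unnormalised
    graph Laplacian L = D - S, given a symmetric similarity matrix S.
    Illustrative two-cluster sketch only."""
    D = np.diag(S.sum(axis=1))
    L = D - S
    _, vecs = np.linalg.eigh(L)   # eigh returns eigenvalues in ascending order
    fiedler = vecs[:, 1]
    return (fiedler >= 0).astype(int)
```

On a toy similarity matrix with two strongly connected pairs and weak cross-links, the split recovers the two machine groups.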
Procedia PDF Downloads 403
4806 Construction and Cross-Linking of Polyelectrolyte Multilayers Based on Polysaccharides as Antifouling Coatings
Authors: Wenfa Yu, Thuva Gnanasampanthan, John Finlay, Jessica Clarke, Charlotte Anderson, Tony Clare, Axel Rosenhahn
Abstract:
Marine biofouling is a worldwide problem with vast economic and ecological costs. Historically it was combated with toxic coatings such as tributyltin. As those coatings are now banned, finding environmentally friendly antifouling solutions has become an urgent topic. In this study, antifouling coatings consisting of the naturally occurring polysaccharides hyaluronic acid (HA), alginic acid (AA), and chitosan (Ch) and the polyelectrolyte polyethylenimine (PEI) are constructed as polyelectrolyte multilayers (PEMs) using the layer-by-layer (LbL) method. LbL PEM construction is a straightforward way to assemble biomacromolecular coatings on surfaces. Advantages of PEMs include ease of handling, highly diverse PEM composition, and precise control over the thickness. PEMs have been widely employed in medical applications, and there are numerous studies of their protein adsorption, elasticity, and cell adhesion properties. By adjusting the coating composition, termination layer charge, coating morphology, and cross-linking method, it is possible to prepare low marine biofouling coatings with PEMs. In this study, using spin coating technology, PEM construction was achieved with smooth multilayers of roughness as low as 2 nm rms and highly reproducible thickness around 50 nm. To obtain stability in seawater, the multilayers were covalently cross-linked either thermally or chemically. The cross-linking method affected the surface energy, which was reflected in the water contact angle: thermal cross-linking led to hydrophobic surfaces and chemical cross-linking generated hydrophilic surfaces. The coatings were then evaluated regarding their protein resistance and biological species resistance.
While the hydrophobic thermally cross-linked PEM had low resistance towards proteins, the resistance of the chemically cross-linked PEM strongly depended on the PEM termination layer and the charge of the protein: opposite charge caused high adsorption and like charge low adsorption, indicating that electrostatic interaction plays a crucial role in the protein adsorption process. Ulva linza was chosen as the biological species for antifouling performance evaluation. Despite its poor resistance towards protein adsorption, the thermally cross-linked PEM showed good resistance against Ulva spore settlement, whereas the chemically cross-linked multilayers showed poor resistance regardless of the termination layer. Marine species adhesion is a complex process; although it involves proteins as bioadhesives, protein resistance on its own is not a full indicator of antifouling performance. The species pre-select the surface, responding to cues such as surface energy, chemistry, or charge, making it difficult for a single factor to determine antifouling performance. Preparing a PEM coating is a comprehensive task involving the choice of polyelectrolyte combination, the termination layer, and the method of cross-linking. These decisions affect PEM properties such as surface energy and charge, which is crucial, since biofouling is a process that responds to surface properties in a highly sensitive and dynamic way.
Keywords: hyaluronic acid, polyelectrolyte multilayers, protein resistance, Ulva linza zoospores
Procedia PDF Downloads 163
4805 Control of a Quadcopter Using Genetic Algorithm Methods
Authors: Mostafa Mjahed
Abstract:
This paper concerns the control of a nonlinear system using two different methods: a reference model and a genetic algorithm. The quadcopter is a nonlinear, unstable system belonging to the family of aerial robots. It consists of four rotors placed at the ends of a cross, whose center is occupied by the control circuit. Its motion is governed by six degrees of freedom: three rotations around the three axes (roll, pitch, and yaw) and three spatial translations. The control of such a system is complex because of the nonlinearity of its dynamic representation and the number of parameters involved. Numerous studies have been devoted to modeling and stabilizing such systems. The classical PID and LQ correction methods are widely used. While the latter have the advantage of simplicity because they are linear, they have the drawback of requiring a linear model for synthesis. They also lead to complex control laws, because these laws must be extended over the entire flight envelope of the quadcopter. Note that, while classical design methods are widely used to control aeronautical systems, artificial intelligence methods such as genetic algorithms have received little attention. In this paper, we compare two PID design methods. First, the PID parameters are calculated according to a reference model. In a second phase, these parameters are established using genetic algorithms. By reference model, we mean that the corrected system behaves like a reference system imposed by specifications such as settling time and zero overshoot. Inspired by natural evolution and Darwin's theory advocating the survival of the fittest, John Holland developed this evolutionary algorithm. A genetic algorithm (GA) possesses three basic operators: selection, crossover, and mutation. We start the iterations with an initial population, each member of which is evaluated through a fitness function.
Our purpose is to correct the behavior of the quadcopter around the three axes (roll, pitch, and yaw) with three PD controllers. For the altitude, we adopt a PID controller.
Keywords: quadcopter, genetic algorithm, PID, fitness, model, control, nonlinear system
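A toy version of the GA-based gain tuning can be sketched as follows, assuming (for illustration only) a first-order plant and an integral-squared-error fitness; the quadcopter's actual six-degree-of-freedom dynamics are far richer, and the operator choices here are ours, not the paper's.

```python
import random

def simulate_ise(kp, ki, setpoint=1.0, a=1.0, dt=0.01, steps=500):
    """Integral-squared-error of a PI loop around the toy plant dy/dt = -a*y + u."""
    y, integ, ise = 0.0, 0.0, 0.0
    for _ in range(steps):
        e = setpoint - y
        integ += e * dt
        u = kp * e + ki * integ
        y += dt * (-a * y + u)     # forward-Euler step of the plant
        ise += e * e * dt
    return ise

def ga_tune_pi(pop_size=30, gens=40, seed=0):
    """Toy GA (truncation selection, blend crossover, Gaussian mutation)
    searching PI gains that minimise the ISE. A hedged sketch, not the
    paper's exact operators or fitness."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 20), rng.uniform(0, 20)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda g: simulate_ise(*g))
        parents = pop[:pop_size // 2]            # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            w = rng.random()                     # blend crossover weight
            children.append(tuple(
                max(0.0, w * g1 + (1 - w) * g2 + rng.gauss(0, 0.5))
                for g1, g2 in zip(p1, p2)))      # mutation: small Gaussian noise
        pop = parents + children
    return min(pop, key=lambda g: simulate_ise(*g))
```

The evolved gains should at least outperform the open-loop case (zero gains), which is what the test below checks.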
Procedia PDF Downloads 431
4804 Optimal Portfolio Selection under Treynor Ratio Using Genetic Algorithms
Authors: Imad Zeyad Ramadan
Abstract:
In this paper, a genetic algorithm was developed to construct the optimal portfolio based on the Treynor method. The GA maximizes the Treynor ratio under a budget constraint to select the best allocation of the budget to the companies in the portfolio. The results show that the GA was able to construct a conservative portfolio that includes companies from all three sectors. This indicates that the GA reduced the risk to the investor, as it chose some companies with positive risks (moving with the market) and some with negative risks (moving against the market).
Keywords: optimization, genetic algorithm, portfolio selection, Treynor method
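The quantity being maximized can be written down directly. The sketch below computes the Treynor ratio of a weighted portfolio; the parameter values and the risk-free rate are illustrative, and the paper's GA searches over the weights under the budget constraint.

```python
def treynor_ratio(weights, returns, betas, risk_free=0.02):
    """Treynor ratio: portfolio excess return per unit of systematic risk.
    Portfolio return and beta are the weight-averaged asset values; the
    weights are assumed to sum to 1 (the budget constraint)."""
    rp = sum(w * r for w, r in zip(weights, returns))   # portfolio return
    bp = sum(w * b for w, b in zip(weights, betas))     # portfolio beta
    return (rp - risk_free) / bp
```

For instance, an equal split between an asset returning 10% with beta 1.2 and one returning 6% with beta 0.8 gives a portfolio return of 8%, a beta of 1.0, and a Treynor ratio of 0.06.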
Procedia PDF Downloads 445
4803 RA-Apriori: An Efficient and Faster MapReduce-Based Algorithm for Frequent Itemset Mining on Apache Flink
Authors: Sanjay Rathee, Arti Kashyap
Abstract:
Extraction of useful information from large datasets is one of the most important research problems. Association rule mining is one of the best methods for this purpose, and finding possible associations between items in large transaction-based datasets (finding frequent patterns) is its most important part. Many algorithms exist to find frequent patterns, but the Apriori algorithm remains a preferred choice due to its ease of implementation and natural tendency to be parallelized. Many single-machine Apriori variants exist, but the massive amount of data available these days is beyond the capacity of a single machine. Therefore, to meet the demands of this ever-growing data, there is a need for a multi-machine Apriori algorithm. For these types of distributed applications, MapReduce is a popular fault-tolerant framework. Hadoop is one of the best open-source software frameworks with the MapReduce approach for distributed storage and distributed processing of huge datasets using clusters built from commodity hardware. However, the heavy disk I/O at each iteration of a highly iterative algorithm like Apriori makes Hadoop inefficient. A number of MapReduce-based platforms have been developed for parallel computing in recent years. Among them, two platforms, namely Spark and Flink, have attracted a lot of attention because of their built-in support for distributed computations. Earlier we proposed a Reduced-Apriori algorithm on the Spark platform which outperforms parallel Apriori, first because of the use of Spark and second because of the improvement we proposed to standard Apriori. This work is therefore a natural sequel: it targets implementing, testing, and benchmarking Apriori, Reduced-Apriori, and our new algorithm ReducedAll-Apriori on Apache Flink, and compares them with the Spark implementation.
Flink, a streaming dataflow engine, overcomes the disk I/O bottlenecks of MapReduce, providing an ideal platform for distributed Apriori. Flink's pipelining-based structure allows the next iteration to start as soon as partial results of the earlier iteration are available, so there is no need to wait for all reducers' results before starting the next iteration. We conduct in-depth experiments to gain insight into the effectiveness, efficiency, and scalability of the Apriori and RA-Apriori algorithms on Flink.
Keywords: Apriori, Apache Flink, MapReduce, Spark, Hadoop, R-Apriori, frequent itemset mining
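For reference, the sequential baseline that all these distributed variants parallelize can be sketched in a few lines: a textbook level-wise Apriori with candidate generation and support-based pruning, not the authors' RA-Apriori.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Classic single-machine Apriori: grow candidate itemsets level by
    level, keeping only those whose support meets the threshold."""
    transactions = [frozenset(t) for t in transactions]

    def support(itemset):
        return sum(itemset <= t for t in transactions)

    items = sorted({i for t in transactions for i in t})
    frequent = {}
    k_sets = [frozenset([i]) for i in items]
    while k_sets:
        survivors = [s for s in k_sets if support(s) >= min_support]
        frequent.update({s: support(s) for s in survivors})
        # join step: merge frequent k-sets sharing all but one item
        k_sets = list({a | b for a, b in combinations(survivors, 2)
                       if len(a | b) == len(a) + 1})
    return frequent
```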
Procedia PDF Downloads 294
4802 Modelling Fluoride Pollution of Groundwater Using Artificial Neural Network in the Western Parts of Jharkhand
Authors: Neeta Kumari, Gopal Pathak
Abstract:
Artificial neural networks have proved to be an efficient tool for non-parametric modeling of data in various applications where the output is non-linearly associated with the input. They are a preferred tool for many predictive data mining applications because of their power, flexibility, and ease of use. A standard feed-forward network (FFN) is used to predict the groundwater fluoride content. The ANN model is trained using the back-propagation algorithm with tansig and logsig activation functions and a varying number of neurons. The models are evaluated on the basis of statistical performance criteria: root mean squared error (RMSE), regression coefficient (R2), bias (mean error), coefficient of variation (CV), Nash-Sutcliffe efficiency (NSE), and the index of agreement (IOA). The results of the study indicate that an artificial neural network (ANN) can be used for groundwater fluoride prediction in limited-data situations in hard rock regions like the western parts of Jharkhand with sufficiently good accuracy.
Keywords: artificial neural network (ANN), FFN (feed-forward network), backpropagation algorithm, Levenberg-Marquardt algorithm, groundwater fluoride contamination
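The abstract's FFN setup (tansig hidden layer, logsig output, back-propagation) can be sketched minimally as follows; the hyperparameters and the tiny XOR-style usage are illustrative, and the study's Levenberg-Marquardt training is not reproduced here.

```python
import numpy as np

def train_ffn(X, y, hidden=8, lr=0.2, epochs=5000, seed=0):
    """One-hidden-layer feed-forward net: tansig (tanh) hidden layer,
    logsig (sigmoid) output, trained by plain batch back-propagation on a
    squared-error loss. Minimal sketch, not the paper's configuration."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                   # tansig hidden layer
        out = 1 / (1 + np.exp(-(h @ W2 + b2)))     # logsig output layer
        err = out - y                              # loss gradient (up to a constant)
        g_out = err * out * (1 - out)              # back through the sigmoid
        g_h = (g_out @ W2.T) * (1 - h ** 2)        # back through the tanh
        W2 -= lr * h.T @ g_out; b2 -= lr * g_out.sum(axis=0)
        W1 -= lr * X.T @ g_h;   b1 -= lr * g_h.sum(axis=0)
    return lambda Xq: 1 / (1 + np.exp(-(np.tanh(Xq @ W1 + b1) @ W2 + b2)))
```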
Procedia PDF Downloads 548
4801 Probabilistic Gathering of Agents with Simple Sensors: Distributed Algorithm for Aggregation of Robots Equipped with Binary On-Board Detectors
Authors: Ariel Barel, Rotem Manor, Alfred M. Bruckstein
Abstract:
We present a probabilistic gathering algorithm for agents that can only detect the presence of other agents in front of or behind them. The agents act in the plane and are identical and indistinguishable, oblivious, and lack any means of direct communication. They do not have a common frame of reference in the plane and choose their orientation (direction of possible motion) at random. The analysis of the gathering process assumes that the agents act synchronously in selecting random orientations that remain fixed during each unit time interval. Two algorithms are discussed. The first assumes discrete jumps based on the sensing results given the randomly selected motion direction; in this case, extensive experimental results exhibit probabilistic clustering into a circular region with radius equal to the step size in time proportional to the number of agents. The second algorithm assumes agents with continuous sensing and motion; in this case, we can prove gathering into a very small circular region in finite expected time.
Keywords: control, decentralized, gathering, multi-agent, simple sensors
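A deterministic one-dimensional toy of the discrete-jump rule conveys the idea (the paper's setting is the plane with random orientations; this simplification and the interface are ours): each agent knows only whether some other agent lies ahead of or behind it, and jumps one step toward the side that is exclusively occupied.

```python
def gather_1d(positions, step=1.0, rounds=200):
    """1-D toy of binary-sensor gathering: agents with company on only one
    side jump one step toward it; agents with company on both sides (or
    none) stay put. Updates are synchronous."""
    pos = list(positions)
    for _ in range(rounds):
        new = []
        for i, x in enumerate(pos):
            ahead = any(y > x for j, y in enumerate(pos) if j != i)
            behind = any(y < x for j, y in enumerate(pos) if j != i)
            if ahead and not behind:
                new.append(x + step)
            elif behind and not ahead:
                new.append(x - step)
            else:
                new.append(x)   # surrounded, co-located, or alone: stay
        pos = new
    return pos
```

The extreme agents march inward each round, so the spread shrinks until the group coincides (here exactly, because the gaps are multiples of the step).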
Procedia PDF Downloads 162
4800 The Three-Zone Composite Productivity Model of Multi-Fractured Horizontal Wells under Different Diffusion Coefficients in a Shale Gas Reservoir
Authors: Weiyao Zhu, Qian Qi, Ming Yue, Dongxu Ma
Abstract:
Due to the nano-micro pore structures and the massive multi-stage, multi-cluster hydraulic fracturing in shale gas reservoirs, the multi-scale seepage flows are much more complicated than in most conventional reservoirs and are crucial for the economic development of shale gas. In this study, a new multi-scale non-linear flow model was established and simplified, based on different diffusion and slip correction coefficients. Because different flow laws exist between the fracture network and the matrix zone, a three-zone composite model was proposed. Then, using conformal transformation combined with the law of equivalent percolation resistance, the productivity equation of a horizontal fractured well was built, with consideration given to diffusion, slip, desorption, and absorption. An analytic solution was derived, and the interference of the multi-cluster fractures was analyzed. The results indicated that the diffusion of the shale gas was mainly in the transition and Fick diffusion regions. The matrix permeability was found to be influenced by slippage and diffusion, which was determined by the pore pressure and diameter according to the Knudsen number. With increased half-lengths of the fracture clusters, flow conductivity of the fractures, and permeability of the fracture network, the productivity of the fractured well also increased. Meanwhile, with an increased number of fractures, the distance between the fractures decreased and the productivity increased slowly due to the mutual interference of the fractures. For the fractured horizontal wells, the free gas was found to contribute the most to the productivity, while the contribution of desorption increased with increased pressure differences.
Keywords: multi-scale, fracture network, composite model, productivity
Procedia PDF Downloads 266
4799 A Clustering-Based Approach for Weblog Data Cleaning
Authors: Amine Ganibardi, Cherif Arab Ali
Abstract:
This paper addresses the data cleaning issue as part of web usage data preprocessing within the scope of Web Usage Mining. Weblog data recorded by web servers within log files reflect usage activity, i.e., end-users' clicks and underlying user-agents' hits. As Web Usage Mining is interested in end-users' behavior, user-agents' hits are regarded as noise to be cleaned off before mining. Filtering hits from clicks is not trivial for two reasons: a server records requests interlaced in sequential order regardless of their source or type, and website resources may be set up as requestable interchangeably by end-users and user-agents. Current methods are content-centric, based on filtering heuristics of relevant/irrelevant items in terms of some cleaning attributes, i.e., website resources' filetype extensions, website resources pointed to by hyperlinks/URIs, HTTP methods, user-agents, etc. These methods need exhaustive extra-weblog data and prior knowledge of the relevant and/or irrelevant items to be assumed as clicks or hits within the filtering heuristics. Such methods are not appropriate for the dynamic/responsive Web for three reasons: resources may be set up as clickable by end-users regardless of their type, website resources may be indexed by frame names without filetype extensions, and web contents are generated and cancelled differently from one end-user to another. In order to overcome these constraints, a clustering-based cleaning method centered on the logging structure is proposed. This method focuses on the statistical properties of the logging structure at the level of the requested and referring resources attributes. It is insensitive to logging content and does not need extra-weblog data. The statistical property used is the structure of the logging feature generated by webpage requests in terms of clicks and hits.
Since a webpage consists of a single URI and several components, this feature results in a single-click-to-multiple-hits ratio in terms of the requested and referring resources. Thus, the clustering-based method is meant to identify two clusters based on the application of an appropriate distance to the frequency matrix of the requested and referring resources levels. As the ratio of clicks to hits is single to multiple, the clicks cluster is the smaller one in number of requests. Hierarchical agglomerative clustering based on a pairwise distance (Gower) and average linkage has been applied to four logfiles of dynamic/responsive websites whose click-to-hits ratios range from 1/2 to 1/15. The optimal clustering, selected on the basis of average linkage and maximum inter-cluster inertia, always results in two clusters. The evaluation of the smaller cluster, referred to as the clicks cluster, in terms of confusion matrix indicators results in a true positive rate of 97%. The content-centric cleaning methods, i.e., conventional and advanced cleaning, resulted in a lower rate of 91%. Thus, the proposed clustering-based cleaning outperforms the content-centric methods for dynamic and responsive web designs without the need for any extra-weblog data. Such an improvement in cleaning quality is likely to refine dependent analyses.
Keywords: clustering approach, data cleaning, data preprocessing, weblog data, web usage data
Procedia PDF Downloads 168
4798 Frequent Itemset Mining Using Rough-Sets
Authors: Usman Qamar, Younus Javed
Abstract:
Frequent pattern mining is the process of finding a pattern (a set of items, subsequences, substructures, etc.) that occurs frequently in a data set. It was proposed in the context of frequent itemsets and association rule mining. Frequent pattern mining is used to find inherent regularities in data: what products were often purchased together? Its applications include basket data analysis, cross-marketing, catalog design, sale campaign analysis, web log (click stream) analysis, and DNA sequence analysis. However, one of the bottlenecks of frequent itemset mining is that as the data increase, the amount of time and resources required to mine the data increases at an exponential rate. In this investigation, a new algorithm is proposed which can be used as a pre-processor for frequent itemset mining. FASTER (FeAture SelecTion using Entropy and Rough sets) is a hybrid pre-processor algorithm which utilizes entropy and rough sets to carry out record reduction and feature (attribute) selection, respectively. FASTER for frequent itemset mining can produce a speed-up of 3.1 times when compared to the original algorithm while maintaining an accuracy of 71%.
Keywords: rough-sets, classification, feature selection, entropy, outliers, frequent itemset mining
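The entropy half of a FASTER-style pre-processor can be illustrated with a small sketch; treating low-entropy (near-constant) attributes as removal candidates is our simplified reading of the idea, and the rough-set feature selection is not shown.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a discrete attribute's value distribution."""
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def rank_features(records):
    """Rank attribute (column) indices by ascending entropy: near-constant,
    low-entropy columns carry little information and are candidates for
    removal before itemset mining. Illustrative fragment only."""
    cols = list(zip(*records))
    return sorted(range(len(cols)), key=lambda i: entropy(cols[i]))
```

A constant column has entropy 0 and ranks first (most removable); a balanced two-valued column has entropy 1 bit.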
Procedia PDF Downloads 435
4797 Seamless Mobility in Heterogeneous Mobile Networks
Authors: Mohab Magdy Mostafa Mohamed
Abstract:
The objective of this paper is to introduce a vertical handover (VHO) algorithm between wireless LANs (WLANs) and LTE mobile networks. The proposed algorithm is based on fuzzy control theory and takes into consideration power level, subscriber velocity, and target cell load, instead of only power level as in traditional algorithms. Simulation results show that network performance in terms of number of handovers and handover occurrence distance is improved.
Keywords: vertical handover, fuzzy control theory, power level, speed, target cell load
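A hypothetical fuzzy scoring of the handover decision, in the spirit of the abstract's three inputs, might look like this; the membership shapes, thresholds, and rule weights are all invented for illustration and are not the paper's design.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def handover_score(power_dbm, speed_kmh, cell_load):
    """Crisp aggregation of three illustrative fuzzy rules: weak WLAN power
    and a fast-moving subscriber favour the WLAN-to-LTE handover, but only
    if the target cell still has spare capacity (load in [0, 1])."""
    weak_power = tri(power_dbm, -100, -85, -70)
    high_speed = tri(speed_kmh, 20, 60, 100)
    low_load = tri(cell_load, -0.5, 0.0, 0.7)
    return 0.5 * weak_power + 0.3 * high_speed + 0.2 * low_load
```

A weak-signal, fast-moving subscriber facing an empty target cell scores far higher than a slow one with a strong signal and a loaded cell.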
Procedia PDF Downloads 349
4796 Supplier Selection and Order Allocation Using a Stochastic Multi-Objective Programming Model and Genetic Algorithm
Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh
Abstract:
In this paper, we develop a multi-objective supplier selection and order allocation model in a stochastic environment, in which the purchasing cost, the percentage of items delivered late, and the percentage of rejected items provided by each supplier are stochastic parameters following arbitrary probability distributions. To do so, we use dependent chance programming (DCP), which maximizes the probability that the total purchasing cost, total late-delivered items, and total rejected items are less than or equal to predetermined values given by the decision maker. After transforming the above-mentioned stochastic multi-objective programming problem into a stochastic single-objective problem using the minimum deviation method, we apply a genetic algorithm to solve the resulting single-objective problem. The genetic algorithm performs a simulation process to calculate the stochastic objective function as its fitness function. Finally, we explore the impact of the stochastic parameters on the solution via a sensitivity analysis exploiting the coefficient of variation. The results show that as the stochastic parameters have greater coefficients of variation, the value of the objective function in the stochastic single-objective programming problem worsens.
Keywords: dependent chance programming, genetic algorithm, minimum deviation method, order allocation, supplier selection
Procedia PDF Downloads 254
4795 Evolved Bat Algorithm Based Adaptive Fuzzy Sliding Mode Control with LMI Criterion
Authors: P.-W. Tsai, C.-Y. Chen, C.-W. Chen
Abstract:
In this paper, the stability analysis of an EBA-based adaptive fuzzy sliding mode controller for a nonlinear system is discussed. First, a nonlinear plant is well approximated and described with a reference model and a fuzzy model, both involving FLC rules. Then, the FLC rules and the consequent parameters are decided via an Evolved Bat Algorithm (EBA). After this, we guarantee a new tracking performance inequality for the control system. The tracking problem is characterized as an eigenvalue problem (EVP). Next, an adaptive fuzzy sliding mode controller (AFSMC) is proposed to stabilize the system so as to achieve good control performance. Lyapunov's direct method can be used to ensure the stability of the nonlinear system. It is shown that the stability analysis reduces the nonlinear system to a linear matrix inequality (LMI) problem. Finally, a numerical simulation is provided to demonstrate the control methodology.
Keywords: adaptive fuzzy sliding mode control, Lyapunov direct method, swarm intelligence, evolved bat algorithm
Procedia PDF Downloads 444
4794 Concept Mapping to Reach Consensus on an Antibiotic Smart Use Strategy Model to Promote and Support Appropriate Antibiotic Prescribing in a Hospital, Thailand
Authors: Phenphak Horadee, Rodchares Hanrinth, Saithip Suttiruksa
Abstract:
Inappropriate use of antibiotics occurs in several hospitals in Thailand. Drug use evaluation (DUE) is one strategy to overcome this difficulty. However, most community hospitals still perform incomplete evaluations, resulting in overuse of antibiotics at high cost. Consequently, drug-resistant bacteria have been rising due to inappropriate antibiotic use. The aim of this study was to involve stakeholders in conceptualizing, developing, and prioritizing a feasible intervention strategy to promote and support appropriate antibiotic prescribing in a community hospital in Thailand. The studied antibiotics were Meropenem, Piperacillin/tazobactam, Amoxicillin/clavulanic acid, and Vancomycin. The study was conducted over the 1-year period between March 1, 2018, and March 31, 2019, in a community hospital in the northeastern part of Thailand. Concept mapping was used in a purposive sample including doctors (one an administrator), pharmacists, and nurses involved in the drug use evaluation of antibiotics. In-depth interviews with each participant and survey research were conducted to identify the causes of inappropriate antibiotic use in the drug use evaluation system. Seventy-seven percent of DUE reports showed appropriate antibiotic prescribing, which still did not reach the goal of 80 percent appropriateness. Meropenem led the antibiotics in inappropriate prescribing. The causes of the unsuccessful DUE program were classified into three themes: personnel; lack of public relations and communication; and unsupportive policy and impractical regulations. During the first meeting, stakeholders (n = 21) generated interventions.
During the second meeting, participants, who were almost the same group of people as in the first meeting (n = 21), were asked to independently rate the feasibility and importance of each idea and to categorize the ideas into relevant clusters to facilitate multidimensional scaling and hierarchical cluster analysis. The outputs of the analysis included the idea list, cluster list, point map, point rating map, cluster map, and cluster rating map. All of these were distributed to participants (n = 21) during the third meeting to reach consensus on an intervention model. The final proposed intervention strategy included 29 feasible and crucial interventions in seven clusters: development of an information technology system; establishing policy and translating it into an action plan; proactive public relations for the policy, action plan, and workflow; cooperation of multidisciplinary teams in drug use evaluation; work review and evaluation with performance reporting; promoting and developing professional and clinical skills of staff through training programs; and developing a practical drug use evaluation guideline for antibiotics. These interventions are relevant to, and fit, several intervention strategies for antibiotic stewardship programs in many international organizations, such as participation of the multidisciplinary team, developing information technology to support antibiotic smart use, and communication. These interventions were prioritized for implementation over a 1-year period. Once the feasibility of each activity or plan is established, the proposed program could be applied and integrated into hospital policy after evaluation of the plans. Effective interventions could then be promoted to other community hospitals to promote and support antibiotic smart use.
Keywords: antibiotic, concept mapping, drug use evaluation, multidisciplinary teams
Procedia PDF Downloads 118
4793 Series-Parallel Systems Reliability Optimization Using Genetic Algorithm and Statistical Analysis
Authors: Essa Abrahim Abdulgader Saleem, Thien-My Dao
Abstract:
The main objective of this paper is to optimize series-parallel system reliability using a Genetic Algorithm (GA) and statistical analysis, considering system reliability constraints that involve the redundancy levels of the selected components, total cost, and total weight. To perform this work, first the mathematical model that maximizes system reliability subject to maximum system cost and maximum system weight constraints is presented; second, a statistical analysis is used to optimize the GA parameters; and third, the GA is used to optimize series-parallel system reliability. The objective is to determine the strategy for choosing the redundancy level of each subsystem to maximize the overall system reliability subject to total cost and total weight constraints. Finally, the reliability optimization results for the series-parallel system case study are shown, and comparisons with previous results are presented to demonstrate the performance of our GA.
Keywords: reliability, optimization, meta-heuristic, genetic algorithm, redundancy
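The abstract does not give the GA details, so the following is only a hedged sketch of how such a redundancy-allocation GA is commonly set up: an individual is a vector of per-subsystem redundancy levels, individuals violating the cost or weight budget receive zero fitness, and tournament selection, one-point crossover, and mutation evolve the population. All parameter values and operators here are illustrative assumptions, not the authors'.

```python
import random

def system_reliability(redundancy, rel):
    """Series system of subsystems, each holding n parallel copies of one component."""
    out = 1.0
    for n, r in zip(redundancy, rel):
        out *= 1.0 - (1.0 - r) ** n
    return out

def fitness(redundancy, rel, cost, weight, max_cost, max_weight):
    """Penalized fitness: zero when the cost or weight constraint is violated."""
    if sum(n * c for n, c in zip(redundancy, cost)) > max_cost:
        return 0.0
    if sum(n * w for n, w in zip(redundancy, weight)) > max_weight:
        return 0.0
    return system_reliability(redundancy, rel)

def ga(rel, cost, weight, max_cost, max_weight, pop_size=30, gens=200, seed=1):
    rng = random.Random(seed)
    n_sub = len(rel)
    pop = [[rng.randint(1, 4) for _ in range(n_sub)] for _ in range(pop_size)]
    fit = lambda ind: fitness(ind, rel, cost, weight, max_cost, max_weight)
    for _ in range(gens):
        # tournament selection (size 3)
        parents = [max(rng.sample(pop, 3), key=fit) for _ in range(pop_size)]
        nxt = []
        for i in range(0, pop_size, 2):
            # one-point crossover on a pair of parents
            a, b = parents[i][:], parents[i + 1][:]
            cut = rng.randrange(1, n_sub)
            a[cut:], b[cut:] = b[cut:], a[cut:]
            for child in (a, b):
                # mutation: bump one subsystem's redundancy up or down (min 1)
                if rng.random() < 0.2:
                    j = rng.randrange(n_sub)
                    child[j] = max(1, child[j] + rng.choice((-1, 1)))
                nxt.append(child)
        pop = nxt
    return max(pop, key=fit)
```

With three subsystems and loose budgets, the GA quickly finds feasible high-reliability allocations such as adding a third or fourth redundant component where reliability is lowest.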
Procedia PDF Downloads 334
4792 A Comparative Study of k-NN and MLP-NN Classifiers Using GA-kNN Based Feature Selection Method for Wood Recognition System
Authors: Uswah Khairuddin, Rubiyah Yusof, Nenny Ruthfalydia Rosli
Abstract:
This paper presents a comparative study between the k-Nearest Neighbour (k-NN) and Multi-Layer Perceptron Neural Network (MLP-NN) classifiers, using a Genetic Algorithm (GA) as the feature selector, for a wood recognition system. The features were extracted from the images using the Grey Level Co-Occurrence Matrix (GLCM). GA-based feature selection is used mainly to ensure that the database used for training the wood species pattern classifier consists of only optimized features. The feature selection process aims to select only the most discriminating features of the wood species, reducing confusion for the pattern classifier. This approach retains the 'good' features that maximize the inter-class distance and minimize the intra-class distance. A wrapper GA is used with the k-NN classifier as the fitness evaluator (GA-kNN). The results show that k-NN is the best choice of classifier because it uses a very simple distance calculation algorithm, and classification tasks can be completed in a short time with good classification accuracy.
Keywords: feature selection, genetic algorithm, optimization, wood recognition system
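As a hedged sketch of the wrapper idea described above (not the authors' implementation), the following uses leave-one-out k-NN accuracy as the fitness of a binary feature mask and evolves masks with a minimal GA; the dataset, population size, and mutation scheme are assumptions for illustration.

```python
import random

def knn_loo_accuracy(X, y, mask, k=1):
    """Leave-one-out accuracy of k-NN restricted to the features selected by mask."""
    feats = [j for j, keep in enumerate(mask) if keep]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        dists = []
        for m in range(len(X)):
            if m == i:
                continue
            d = sum((X[i][j] - X[m][j]) ** 2 for j in feats)
            dists.append((d, y[m]))
        dists.sort()
        votes = [lbl for _, lbl in dists[:k]]
        if max(set(votes), key=votes.count) == y[i]:
            correct += 1
    return correct / len(X)

def ga_select(X, y, pop=12, gens=30, seed=0):
    """Tiny wrapper GA over binary feature masks, fitness = k-NN LOO accuracy."""
    rng = random.Random(seed)
    n = len(X[0])
    P = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    fit = lambda m: knn_loo_accuracy(X, y, m)
    for _ in range(gens):
        P.sort(key=fit, reverse=True)
        P = P[: pop // 2]                 # truncation selection
        children = []
        for parent in P:
            c = parent[:]
            c[rng.randrange(n)] ^= 1      # bit-flip mutation
            children.append(c)
        P += children
    return max(P, key=fit)
```

On a toy set where only the first feature separates the classes, the GA discards the noisy feature and the masked k-NN reaches perfect leave-one-out accuracy.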
Procedia PDF Downloads 544
4791 Optimal Design of Friction Dampers for Seismic Retrofit of a Moment Frame
Authors: Hyungoo Kang, Jinkoo Kim
Abstract:
This study investigated the determination of the optimal location and friction force of friction dampers to effectively reduce the seismic response of a reinforced concrete structure designed without considering seismic load. To this end, the genetic algorithm process was applied and the results were compared with those obtained by simplified methods such as distribution of dampers based on the story shear or the inter-story drift ratio. The seismic performance of the model structure with optimally positioned friction dampers was evaluated by nonlinear static and dynamic analyses. The analysis results showed that compared with the system without friction dampers, the maximum roof displacement and the inter-story drift ratio were reduced by about 30% and 40%, respectively. After installation of the dampers about 70% of the earthquake input energy was dissipated by the dampers and the energy dissipated in the structural elements was reduced by about 50%. In comparison with the simplified methods of installation, the genetic algorithm provided more efficient solutions for seismic retrofit of the model structure.
Keywords: friction dampers, genetic algorithm, optimal design, RC buildings
Procedia PDF Downloads 243
4790 Preparation of Biomedical Hydrogels Using Phenolic Compounds and Electron Beam Irradiation
Authors: Farnaz Sadeghi, Moslem Tavakol
Abstract:
In this study, an attempt has been made to prepare a physically cross-linked gel by cooling a tannic acid (TA)-polyvinyl alcohol (PVA) solution, which is subsequently converted into an antibacterial, chemically cross-linked hydrogel by electron beam irradiation. PVA is known for its biocompatibility and hydrophilicity, and TA is a natural compound that can serve as both a cross-linking agent and a therapeutic agent. The swelling behavior, gel content, pore size, and mechanical properties of hydrogels prepared at 14, 28, and 56 kGy with different polymer ratios were investigated. The PVA-TA hydrogel showed sustained release of tannic acid, with approximately 20% and 50% of the loaded TA released from the hydrogel after 4 and 72 h, respectively. We found that the gel content decreased and the moisture retention capability increased with increasing TA content. In addition, the PVA-TA hydrogels showed good antibacterial activity against S. aureus. MTT analysis indicated that close to 83% of fibroblast cells remained viable after 48 h of exposure to the hydrogel extract. Moreover, cooling a 10% PVA solution containing 0.5 or 0.75% w/v tannic acid to room or refrigerator temperature, respectively, led to the formation of a physical gel that showed no flow after inversion of the hydrogel cast. According to these results, the hydrogel prepared by electron beam irradiation of a blended PVA-TA solution could be further investigated as a promising candidate for wound healing.
Keywords: poly vinyl alcohol, tannic acid, electron beam irradiation, hydrogel wound dressing
Procedia PDF Downloads 153
4789 Development of a Few-View Computed Tomographic Reconstruction Algorithm Using Multi-Directional Total Variation
Authors: Chia Jui Hsieh, Jyh Cheng Chen, Chih Wei Kuo, Ruei Teng Wang, Woei Chyn Chu
Abstract:
Compressed sensing (CS) based computed tomographic (CT) reconstruction algorithms utilize total variation (TV) to transform the CT image into a sparse domain and minimize the L1-norm of the sparse image for reconstruction. Unlike traditional CS-based reconstruction, which calculates only the x-coordinate and y-coordinate TV to transform CT images into the sparse domain, we propose a multi-directional TV to transform the tomographic image into the sparse domain for low-dose reconstruction. Our method considers all possible directions of TV calculation around a pixel, so the sparse transform for CS-based reconstruction is more accurate. In 2D CT reconstruction, we use an eight-directional TV to transform the CT image into the sparse domain; for 3D reconstruction, we use a 26-directional TV. This multi-directional sparse transform makes the CS-based reconstruction algorithm more powerful in reducing noise and increasing image quality. To validate and evaluate the performance of this multi-directional sparse transform, we used both the Shepp-Logan phantom and a head phantom as reconstruction targets with the corresponding simulated sparse projection data (angular sampling intervals of 5 deg and 6 deg, respectively). The results show that the multi-directional TV method reconstructs images with fewer artifacts than the traditional CS-based reconstruction algorithm, which calculates only the x-coordinate and y-coordinate TV. We also chose RMSE, PSNR, and UQI as parameters for quantitative analysis. In the quantitative analysis, the proposed multi-directional TV method performs better regardless of which parameter is calculated.
Keywords: compressed sensing (CS), low-dose CT reconstruction, total variation (TV), multi-directional gradient operator
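The eight-directional TV for a 2D image can be sketched as below; this is one plausible reading of the idea (the sum of absolute differences to all eight neighbours of each pixel, anisotropic form), not necessarily the authors' exact operator.

```python
import numpy as np

def tv_multidirectional(img):
    """Eight-directional anisotropic TV: sum of |pixel - neighbour| over all
    eight neighbour directions, with wrap-around edges excluded."""
    img = np.asarray(img, dtype=float)
    shifts = [(-1, -1), (-1, 0), (-1, 1),
              (0, -1),           (0, 1),
              (1, -1),  (1, 0),  (1, 1)]
    tv = 0.0
    for dy, dx in shifts:
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        diff = np.abs(img - shifted)
        # zero out the rows/columns where np.roll wrapped around the boundary
        if dy == 1:
            diff[0, :] = 0
        if dy == -1:
            diff[-1, :] = 0
        if dx == 1:
            diff[:, 0] = 0
        if dx == -1:
            diff[:, -1] = 0
        tv += diff.sum()
    return tv
```

A single isolated bright pixel contributes twice per direction (once at the pixel, once at the neighbour), so a unit pixel in a zero background gives a TV of 16; a constant image gives 0, which is the sparsity the L1 minimization exploits.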
Procedia PDF Downloads 255
4788 Non-Population Search Algorithms for Capacitated Material Requirement Planning in Multi-Stage Assembly Flow Shop with Alternative Machines
Authors: Watcharapan Sukkerd, Teeradej Wuttipornpun
Abstract:
This paper presents non-population search algorithms, namely tabu search (TS), simulated annealing (SA), and variable neighborhood search (VNS), to minimize the total cost of the capacitated MRP problem in a multi-stage assembly flow shop with two alternative machines. The algorithm has three main steps. First, an initial sequence of orders is constructed by a simple due date-based dispatching rule. Second, the sequence of orders is repeatedly improved to reduce the total cost by applying TS, SA, and VNS separately. Finally, the total cost is further reduced by optimizing the start time of each operation using a linear programming (LP) model. The parameters of the algorithm are tuned using real data from automotive companies. The results show that VNS significantly outperforms TS, SA, and the existing algorithm.
Keywords: capacitated MRP, tabu search, simulated annealing, variable neighborhood search, linear programming, assembly flow shop, application in industry
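Of the three metaheuristics, simulated annealing is the simplest to sketch. The illustrative SA below improves an order sequence with a pairwise-swap neighbourhood and geometric cooling; the cost function, parameters, and neighbourhood are demonstration assumptions, not the paper's capacitated-MRP model.

```python
import math
import random

def simulated_annealing(cost, n_orders, t0=10.0, cooling=0.95, iters=2000, seed=0):
    """Anneal over order sequences; `cost` maps a sequence (tuple) to a total cost.
    Starts from the identity sequence (standing in for a due-date-based rule)."""
    rng = random.Random(seed)
    cur = list(range(n_orders))
    best = cur[:]
    best_c = cur_c = cost(tuple(cur))
    t = t0
    for _ in range(iters):
        # neighbour: swap two randomly chosen positions
        i, j = rng.sample(range(n_orders), 2)
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]
        c = cost(tuple(cand))
        # accept improvements always, worse moves with Boltzmann probability
        if c < cur_c or rng.random() < math.exp((cur_c - c) / t):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand, c
        t = max(t * cooling, 1e-6)
    return best, best_c
```

Against a toy cost that is minimized by the reversed sequence, the anneal reliably escapes the poor initial ordering.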
Procedia PDF Downloads 232
4787 Evaluation of Dual Polarization Rainfall Estimation Algorithm Applicability in Korea: A Case Study on Biseulsan Radar
Authors: Chulsang Yoo, Gildo Kim
Abstract:
Dual polarization radar provides comprehensive information about rainfall by measuring multiple parameters. In Korea, the JPOLE and CSU-HIDRO algorithms are generally used for rainfall estimation. This study evaluated the local applicability of the JPOLE and CSU-HIDRO algorithms in Korea using rainfall data observed in August 2014 by the Biseulsan dual polarization radar and the KMA AWS. A total of 11,372 pairs of radar-ground rain rate data were classified, according to the thresholds of the synthetic algorithms, into suitable and unsuitable data. Evaluation criteria were then derived by comparing radar rain rates and ground rain rates for the entire, suitable, and unsuitable data sets, respectively. The results are as follows: (1) The radar rain rate equations including KDP were found to perform better in rainfall estimation than the other equations for both the JPOLE and CSU-HIDRO algorithms; the thresholds were found to be adequately applied for both algorithms when the specific differential phase was included. (2) The radar rain rate equations based on horizontal reflectivity and differential reflectivity performed poorly compared with the others, and the result was not improved even when only the suitable data were used. Acknowledgments: This work was supported by the Basic Science Research Program through the National Research Foundation of Korea, funded by the Ministry of Education (NRF-2013R1A1A2011012).
Keywords: CSU-HIDRO algorithm, dual polarization radar, JPOLE algorithm, radar rainfall estimation algorithm
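The rain rate equations mentioned above are power laws of the measured radar variables. The sketch below illustrates a CSU-HIDRO-style selection between an R(KDP) estimator and an R(Z) fallback; the coefficients (a Marshall-Palmer style Z = 200 R^1.6 and an illustrative R(KDP) power law) and the KDP threshold are textbook-style assumptions, not the values evaluated in the study.

```python
def rain_rate_z(z_linear, a=200.0, b=1.6):
    """R(Z) by inverting Z = a * R**b; Z in linear units (mm^6/m^3), R in mm/h.
    Coefficients are illustrative Marshall-Palmer style values."""
    return (z_linear / a) ** (1.0 / b)

def rain_rate_kdp(kdp, c=44.0, d=0.822):
    """Power-law R(KDP): KDP in deg/km, R in mm/h. Coefficients illustrative."""
    return c * kdp ** d

def synthetic_rain_rate(z_linear, kdp, kdp_threshold=0.3):
    """Selector sketch: use R(KDP) when KDP is large enough to be reliable,
    otherwise fall back to the reflectivity-based estimate."""
    if kdp >= kdp_threshold:
        return rain_rate_kdp(kdp)
    return rain_rate_z(z_linear)
```

For example, Z = 200 mm^6/m^3 maps to 1 mm/h through the R(Z) branch, while KDP = 1 deg/km triggers the R(KDP) branch and returns 44 mm/h with these illustrative coefficients.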
Procedia PDF Downloads 210
4786 Identification and Characterization of Polysaccharide Biosynthesis Protein (CAPD) of Enterococcus faecium
Authors: Liaqat Ali, Hubert E. Blum, Türkân Sakinc
Abstract:
Enterococcus faecium is an emerging multidrug-resistant nosocomial pathogen that has increased dramatically worldwide, causing bacteremia, endocarditis, urinary tract infections, and surgical site infections in immunocompromised patients. The capsular polysaccharides that contribute to pathogenesis through evasion of the host innate immune system are also involved in hindering leukocyte killing of enterococci. The gene cluster of E. faecalis (enterococcal polysaccharide antigen) encodes homologues of many genes involved in polysaccharide biosynthesis. We identified two putative loci of 22 kb and 19 kb containing 11 genes that encode glycosyltransferases (GTFs); genome comparison with already sequenced strains confirmed that these loci have no homology to known capsule genes or the epa locus. Polysaccharide-conjugate vaccines have rapidly emerged as a suitable strategy to combat different pathogenic bacteria; we therefore investigated a polysaccharide biosynthesis protein, CapD, in E. faecium, which contains 336 amino acids and has a putative function in N-linked glycosylation. A capD deletion (knock-out) mutant was constructed and complemented by the homologous recombination method and confirmed by PCR and sequencing. For further characterization and functional analysis, in vitro cell culture and an in vivo mouse infection model were used. Our ΔcapD mutant shows strong hydrophobicity, and all strains exhibited biofilm production. Subsequently, opsonic activity was tested in an opsonophagocytic assay; it was increased in the mutant compared with the complemented and wild-type strains, but a more than two-fold decrease in colonization and adherence to the surface of uroepithelial cells was seen. However, significantly higher bacterial colonization was observed for the capD mutant during bacteremia infection in the animal model.
Unlike other polysaccharide biosynthesis proteins, CapD does not seem to be a major virulence factor in enterococci, but further experiments and attention are needed to clarify its function, its exact mechanism, and its involvement in the pathogenesis of enterococcal nosocomial infections, eventually leading to a vaccine or targeted therapy.
Keywords: E. faecium, pathogenesis, polysaccharides, biofilm formation
Procedia PDF Downloads 331
4785 Riesz Mixture Model for Brain Tumor Detection
Authors: Mouna Zitouni, Mariem Tounsi
Abstract:
This research introduces an application of the Riesz mixture model for medical image segmentation for accurate diagnosis and treatment of brain tumors. We propose a pixel classification technique based on the Riesz distribution, derived from an extended Bartlett decomposition. To our knowledge, this is the first study addressing this approach. The Expectation-Maximization algorithm is implemented for parameter estimation. A comparative analysis, using both synthetic and real brain images, demonstrates the superiority of the Riesz model over a recent method based on the Wishart distribution.
Keywords: EM algorithm, segmentation, Riesz probability distribution, Wishart probability distribution
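The Riesz density itself is beyond a short sketch, but the EM alternation the paper implements is the same as for any mixture model. As a stand-in, the following fits a two-component 1-D Gaussian mixture by EM (E-step: posterior responsibilities per pixel; M-step: responsibility-weighted re-estimation), which is how pixels would be soft-classified before thresholding into segments.

```python
import math

def em_gmm(x, iters=50):
    """Two-component 1-D Gaussian mixture fitted by EM. A stand-in for the
    Riesz mixture: only the component density differs, the E/M steps are alike."""
    mu = [min(x), max(x)]          # spread the initial means apart
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each sample
        resp = []
        for xi in x:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(xi - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(x)
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var[k] = sum(r[k] * (xi - mu[k]) ** 2 for r, xi in zip(resp, x)) / nk
            var[k] = max(var[k], 1e-6)   # guard against variance collapse
    return mu, var, pi
```

On intensities drawn from two well-separated clusters, the estimated means converge to the cluster centers and the mixture weights sum to one.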
Procedia PDF Downloads 16
4784 An Efficient Design of Static Synchronous Series Compensator Based Fractional Order PID Controller Using Invasive Weed Optimization Algorithm
Authors: Abdelghani Choucha, Lakhdar Chaib, Salem Arif
Abstract:
This paper treats the problem of power system stability with the aid of a Static Synchronous Series Compensator (SSSC) installed in the transmission line of a single machine infinite bus (SMIB) power system. A fractional order PID (FOPID) controller is applied as a robust controller in the optimal SSSC design to control the power system characteristics. Additionally, the SSSC-based FOPID parameters are smoothly tuned using the Invasive Weed Optimization algorithm (IWO). To verify the strength of the proposed controller, the SSSC-based FOPID controller is validated over a wide range of operating conditions and compared with the conventional SSSC-POD controller scheme. The main purpose of the proposed approach is to greatly enhance the dynamic behavior of the tested system. Simulation results clearly demonstrate the superiority and performance of the proposed controller design.
Keywords: SSSC-FOPID, SSSC-POD, SMIB power system, invasive weed optimization algorithm
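A fractional order PID controller computes u = Kp e + Ki D^(-λ) e + Kd D^(μ) e, so it has five tunable parameters instead of three. As a hedged numerical sketch (not the paper's implementation), the Grünwald-Letnikov approximation below evaluates the fractional integral and derivative from a sampled error history; with λ = μ = 1 it reduces to an ordinary discrete PID.

```python
def gl_coeffs(alpha, n):
    """Grünwald-Letnikov weights (-1)^k * C(alpha, k), computed recursively."""
    c = [1.0]
    for k in range(1, n):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / k))
    return c

def fopid_output(err_history, kp, ki, kd, lam, mu, h):
    """FOPID control signal u = Kp e + Ki D^(-lam) e + Kd D^(mu) e on a sampled
    error history (newest sample last), using D^a f(t) ~ h^(-a) sum c_k f(t-kh)."""
    e = err_history[-1]
    n = len(err_history)
    ci = gl_coeffs(-lam, n)   # fractional integral = derivative of order -lam
    cd = gl_coeffs(mu, n)
    integ = h ** lam * sum(ci[k] * err_history[-1 - k] for k in range(n))
    deriv = h ** (-mu) * sum(cd[k] * err_history[-1 - k] for k in range(n))
    return kp * e + ki * integ + kd * deriv
```

With lam = mu = 1 the weights collapse to a rectangular-rule integral and a first difference, so for the history [0, 0, 1, 2] with h = 0.5 and unit gains the output is 2 + 1.5 + 2 = 5.5, matching a classical PID.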
Procedia PDF Downloads 187
4783 Deterministic Random Number Generator Algorithm for Cryptosystem Keys
Authors: Adi A. Maaita, Hamza A. A. Al Sewadi
Abstract:
One of the crucial parameters of digital cryptographic systems is the selection of keys and their distribution. The randomness of the keys has a strong impact on the system's security strength, as random keys are difficult to predict, guess, reproduce, or discover by a cryptanalyst. Therefore, adequate key randomness generation is still sought for the benefit of stronger cryptosystems. This paper suggests an algorithm designed to generate and test pseudorandom number sequences intended for cryptographic applications. The algorithm is based on mathematically manipulating information publicly agreed upon between sender and receiver over a public channel. This information is used as a seed for performing mathematical functions that generate a sequence of pseudorandom numbers to be used for encryption/decryption purposes. The manipulation involves permutations and substitutions that fulfill Shannon's principle of 'confusion and diffusion'. ASCII code characters were utilized in the generation process instead of bit strings, which adds flexibility in testing different seed values. Finally, the obtained results indicate that it would be soundly difficult for attackers to guess the keys.
Keywords: cryptosystems, information security agreement, key distribution, random numbers
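The abstract does not disclose the exact permutation/substitution rounds, so the following is purely an illustrative sketch of the confusion-and-diffusion idea: an RC4-style construction seeded from ASCII text, in which a key schedule permutes an internal state and each output byte is produced by an ongoing permutation plus a substitution lookup. This is not the authors' algorithm and is not cryptographically secure.

```python
def keystream(seed_text, length):
    """Deterministic byte stream from an ASCII seed via permutation and
    substitution (RC4-style sketch; illustrative only, NOT secure)."""
    state = list(range(256))
    key = [ord(c) for c in seed_text]
    # key schedule: permute the state using the ASCII codes of the seed
    j = 0
    for i in range(256):
        j = (j + state[i] + key[i % len(key)]) % 256
        state[i], state[j] = state[j], state[i]
    # generation: ongoing permutation of the state plus a substitution lookup
    out = []
    i = j = 0
    for _ in range(length):
        i = (i + 1) % 256
        j = (j + state[i]) % 256
        state[i], state[j] = state[j], state[i]
        out.append(state[(state[i] + state[j]) % 256])
    return out
```

Because the generator is deterministic, sender and receiver derive the same keystream from the publicly agreed seed, while different seeds diverge immediately.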
Procedia PDF Downloads 268
4782 Study of the Best Algorithm to Estimate Sunshine Duration from Global Radiation on Horizontal Surface for Tropical Region
Authors: Tovondahiniriko Fanjirindratovo, Olga Ramiarinjanahary, Paulisimone Rasoavonjy
Abstract:
The sunshine duration, which is the sum of all the moments when the solar beam radiation exceeds a minimum value, is an important parameter for climatology, tourism, agriculture, and solar energy. It is usually measured by a pyrheliometer installed on a two-axis solar tracker. Because of the high cost of this device, and given the wide availability of global radiation measurements on a horizontal surface, several studies have sought correlations between global radiation and sunshine duration. Most of these studies are fitted for the northern hemisphere using pyrheliometric databases. The aim of the present work is to list and assess all the existing methods and apply them to Reunion Island, a tropical region in the southern hemisphere. Using a ten-year database of global, diffuse, and beam radiation on a horizontal surface, the uncertainty of the existing algorithms is evaluated for a tropical region. The methodology is based on indirect comparison, because the solar beam radiation is not measured directly but calculated from the beam radiation on a horizontal surface and the sun elevation angle.
Keywords: Carpentras method, data fitting, global radiation, sunshine duration, Slob and Monna algorithm, step algorithm
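The indirect comparison described above can be sketched as follows: the direct normal irradiance is recovered from the beam-horizontal component and the sun elevation, then thresholded with the WMO convention of 120 W/m^2. The hourly sampling step and the example data are assumptions for illustration.

```python
import math

def sunshine_hours(beam_horizontal, elevation_deg, step_hours=1.0, threshold=120.0):
    """Sunshine duration from beam-horizontal irradiance samples (W/m^2) and the
    corresponding sun elevation angles (deg), using the WMO 120 W/m^2
    direct-normal threshold: DNI = B_h / sin(elevation)."""
    total = 0.0
    for b_h, h in zip(beam_horizontal, elevation_deg):
        if h <= 0:
            continue                      # sun below the horizon
        dni = b_h / math.sin(math.radians(h))
        if dni > threshold:
            total += step_hours
    return total
```

For instance, hourly samples of 0, 100, 400, and 5 W/m^2 at elevations of -5, 30, 60, and 10 degrees give direct normal values of (skipped), 200, 462, and 29 W/m^2, so only two hours count as sunshine.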
Procedia PDF Downloads 122