Search results for: laundry algorithm
2569 Solving Directional Overcurrent Relay Coordination Problem Using Artificial Bees Colony
Authors: M. H. Hussain, I. Musirin, A. F. Abidin, S. R. A. Rahim
Abstract:
This paper presents the implementation of the Artificial Bees Colony (ABC) algorithm in solving the Directional OverCurrent Relays (DOCRs) coordination problem for near-end faults occurring in a fixed network topology. The coordination optimization of DOCRs is formulated as a linear programming (LP) problem. The objective function is introduced to minimize the operating time of the associated relay, which depends on the time multiplier setting. An established technique is taken for comparison purposes in order to highlight the superiority of the proposed approach. The proposed algorithm has been tested successfully on an 8-bus test system. The simulation results demonstrate that the ABC algorithm, which has been proved to have good search ability, is capable of dealing with constrained optimization problems.
Keywords: artificial bees colony, directional overcurrent relay coordination problem, relay settings, time multiplier setting
Procedia PDF Downloads 330
2568 Designing an Intelligent Voltage Instability System in Power Distribution Systems in the Philippines Using IEEE 14 Bus Test System
Authors: Pocholo Rodriguez, Anne Bernadine Ocampo, Ian Benedict Chan, Janric Micah Gray
Abstract:
The state of an electric power system may be classified as either stable or unstable. The borderline of stability is any condition for which a slight change in an unfavourable direction of any pertinent quantity will cause instability. Voltage instability in power distribution systems can lead to voltage collapse and thus power blackouts. The researchers present an intelligent system using the back-propagation algorithm that can detect voltage instability from the output voltage of a power distribution system and classify it as stable or unstable. The researchers' contribution is the use of the parameters involved in voltage instability as input parameters to the neural network for training and testing, which can provide faster detection and monitoring of the power distribution system.
Keywords: back-propagation algorithm, load instability, neural network, power distribution system
Procedia PDF Downloads 435
2567 Networked Implementation of Milling Stability Optimization with Bayesian Learning
Authors: Christoph Ramsauer, Jaydeep Karandikar, Tony Schmitz, Friedrich Bleicher
Abstract:
Machining stability is an important limitation to discrete part machining. In this work, a networked implementation of milling stability optimization with Bayesian learning is presented. The milling process was monitored with a wireless sensory tool holder instrumented with an accelerometer at the Vienna University of Technology, Vienna, Austria. The recorded data from a milling test cut is used to classify the cut as stable or unstable based on the frequency analysis. The test cut result is fed to a Bayesian stability learning algorithm at the University of Tennessee, Knoxville, Tennessee, USA. The algorithm calculates the probability of stability as a function of axial depth of cut and spindle speed and recommends the parameters for the next test cut. The iterative process between two transatlantic locations repeats until convergence to a stable optimal process parameter set is achieved.
Keywords: machining stability, machine learning, sensor, optimization
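A minimal sketch of the Bayesian learning loop this abstract describes is given below. The grid values, the flat Beta prior, the depth-wise update rule (a stable cut implies stability at shallower depths, an unstable cut implies instability at deeper depths) and the next-test selection rule are illustrative assumptions, not the authors' exact implementation.

```python
# Hedged sketch of the Bayesian stability-learning loop described above.
import numpy as np

speeds = np.linspace(6000, 12000, 25)      # spindle speed grid [rpm] (assumed)
depths = np.linspace(0.5, 5.0, 19)         # axial depth grid [mm] (assumed)

# Beta(a, b) pseudo-counts per (speed, depth) cell; flat prior.
a = np.ones((speeds.size, depths.size))
b = np.ones_like(a)

def update(speed_idx, depth_idx, stable):
    """Propagate one test-cut result along the depth axis: a stable cut
    implies stability at all shallower depths, an unstable cut implies
    instability at all deeper depths (assumed monotonicity)."""
    if stable:
        a[speed_idx, :depth_idx + 1] += 1.0
    else:
        b[speed_idx, depth_idx:] += 1.0

def recommend():
    """Pick the next test cut: maximize expected depth of cut, i.e. the
    depth weighted by the current probability of stability."""
    p_stable = a / (a + b)
    gain = p_stable * depths[np.newaxis, :]
    i, j = np.unravel_index(np.argmax(gain), gain.shape)
    return i, j, p_stable[i, j]

# Example iteration: the sensor-side classification result feeds update().
i, j, p = recommend()
print(f"test at {speeds[i]:.0f} rpm, {depths[j]:.2f} mm (P(stable)={p:.2f})")
update(i, j, stable=True)   # result of the monitored test cut (placeholder)
```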
Procedia PDF Downloads 206
2566 Clinical Validation of an Automated Natural Language Processing Algorithm for Finding COVID-19 Symptoms and Complications in Patient Notes
Authors: Karolina Wieczorek, Sophie Wiliams
Abstract:
Introduction: Patient data is often collected in Electronic Health Record (EHR) systems for purposes such as providing care as well as reporting data. This information can be re-used to validate data models in clinical trials or in epidemiological studies. Manual validation of automated tools is vital to pick up errors in processing and to provide confidence in the output. Mentioning a disease in a discharge letter does not necessarily mean that a patient suffers from this disease; many letters discuss a diagnostic process, different tests, or whether a patient has a certain disease. The COVID-19 dataset in this study was built using natural language processing (NLP), an automated algorithm which extracts information related to COVID-19 symptoms, complications, and medications prescribed within the hospital. Free-text clinical patient notes are rich sources of information containing patient data not captured in a structured form, hence the use of named entity recognition (NER) to capture additional information. Methods: Patient data (discharge summary letters) were exported and screened by the algorithm to pick up relevant terms related to COVID-19. A list of 124 Systematized Nomenclature of Medicine (SNOMED) Clinical Terms was provided in Excel with corresponding IDs. Two independent medical student researchers were provided with a dictionary of the SNOMED list of terms to refer to when screening the notes. They worked on two separate datasets called "A" and "B", respectively. Notes were screened to check whether the correct term had been picked up by the algorithm and to ensure that negated terms were not picked up. Results: Its implementation in the hospital began on March 31, 2020, and the first EHR-derived extract was generated for use in an audit study on June 04, 2020. The dataset has contributed to large, priority clinical trials (including the International Severe Acute Respiratory and Emerging Infection Consortium (ISARIC), by bulk upload to REDcap research databases) and to local research and audit studies. Successful sharing of EHR-extracted datasets requires communicating the provenance and quality, including completeness and accuracy, of the data. The results of the validation of the algorithm were the following: precision 0.907, recall 0.416, and F-score 0.570. Percentage enhancement with NLP-extracted terms compared to regular data extraction alone was low (0.3%) for relatively well-documented data such as previous medical history but higher (16.6%, 29.53%, 30.3%, 45.1%) for complications, presenting illness, chronic procedures, and acute procedures, respectively. Conclusions: This automated NLP algorithm is shown to be useful in facilitating patient data analysis and has the potential to be used in more large-scale clinical trials to assess potential study exclusion criteria for participants in the development of vaccines.
Keywords: automated, algorithm, NLP, COVID-19
Procedia PDF Downloads 1022565 Automatic LV Segmentation with K-means Clustering and Graph Searching on Cardiac MRI
Authors: Hae-Yeoun Lee
Abstract:
Quantification of cardiac function is performed by calculating blood volume and ejection fraction in routine clinical practice. However, this work has been performed by manual contouring, which requires computational cost and varies with the observer. In this paper, an automatic left ventricle segmentation algorithm for cardiac magnetic resonance images (MRI) is presented. Using knowledge of cardiac MRI, a K-means clustering technique is applied to segment the blood region on a coil-sensitivity corrected image. Then, a graph searching technique is used to correct segmentation errors from coil distortion and noise. Finally, blood volume and ejection fraction are calculated. Using cardiac MRI from 15 subjects, the presented algorithm is tested and compared with manual contouring by experts, showing outstanding performance.
Keywords: cardiac MRI, graph searching, left ventricle segmentation, K-means clustering
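A minimal sketch of the K-means step in this pipeline follows: pixel intensities of a (coil-sensitivity corrected) slice are clustered into k classes and the brightest cluster is kept as a candidate blood-pool mask. The number of clusters and the "brightest cluster = blood" rule are assumptions made for illustration; the graph-search refinement is not shown.

```python
# Hedged sketch of intensity-based K-means segmentation of the blood pool.
import numpy as np

def kmeans_1d(values, k=3, iters=50, seed=0):
    """Plain 1-D k-means on pixel intensities (no external libraries)."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = values[labels == c].mean()
    return labels, centers

def segment_blood(slice_img, k=3):
    """Return a boolean mask of the cluster with the highest mean intensity."""
    flat = slice_img.astype(float).ravel()
    labels, centers = kmeans_1d(flat, k=k)
    blood_cluster = int(np.argmax(centers))
    return (labels == blood_cluster).reshape(slice_img.shape)

# Toy usage on a synthetic image; a graph-search refinement would follow.
img = np.clip(np.random.default_rng(1).normal(100, 20, (64, 64)), 0, 255)
img[20:40, 20:40] += 120          # bright "blood pool" region
mask = segment_blood(img)
print("blood-pool pixels:", int(mask.sum()))
```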
Procedia PDF Downloads 3992564 The Organization of Multi-Field Hospital’s Work Environment in the Republic of Sakha, Yakutia
Authors: Inna Vinokurova, N. Savvina
Abstract:
The goal of the research: to study the organization of a multi-field hospital's work environment in the Republic of Sakha (Yakutia) at the Autonomous Public Health Care Institution of the Republic of Sakha (Yakutia) - Republican Hospital No. 1 - National Center of Medicine. Results: The Autonomous Public Health Care Institution of the Republic of Sakha (Yakutia) - Republican Hospital No. 1 - National Center of Medicine is a multidisciplinary, specialized hospital complex that provides specialized and high-tech medical care to children and adults in the Republic of Sakha (Yakutia) of the Russian Federation. The National Center of Medicine comprises 5 diagnostic and treatment centers (advisory and diagnostic, clinical, pediatric, perinatal, and the Republican cardiologic dispensary) with 45 clinical specialized departments with 727 beds, 5 resuscitation departments, 20 operating rooms, and an out-patient department handling 905 visits per shift. Annually, more than 20,000 patients receive treatment in the hospital of the Republican Hospital of the Republic of Sakha (Yakutia), more than 70,000 patients visit out-patient sections, more than 2 million investigations are done, more than 12,000 surgeries are performed, and more than 2 thousand babies are delivered. The National Center of Medicine has a great influence on such population health indicators as total mortality, birth rate, maternal, infant and perinatal mortality, and circulatory system disease incidence. The work environment of the Republican Hospital of the Republic of Sakha (Yakutia) is represented by the following structural departments: pharmacy, blood transfusion department, sterilization department, laundry, dietetic department, infant-feeding centre, and material and technical supply. More than 200 employees work in these services. The main function of these services is to provide an on-time and fail-safe supply of all necessities: spare parts, medical supplies, donated blood and its components, foodstuffs, hospital linen, sterile instruments, etc. Thus, the activity of the medical organization, including the quality of health care, depends on the work environment, so it is a main part of a multi-field hospital's activity.
Keywords: organization of multi-field hospital's, work environment, quality health care, pharmacy, blood transfusion department, sterilization department
Procedia PDF Downloads 2422563 Lateral Control of Electric Vehicle Based on Fuzzy Logic Control
Authors: Hartani Kada, Merah Abdelkader
Abstract:
Aiming at the high nonlinearities and unmatched uncertainties of the intelligent electric vehicle's dynamic system, this paper presents a lateral motion control algorithm for intelligent electric vehicles with four in-wheel motors. A fuzzy logic procedure is presented and formulated to realize lateral control in lane changes. A vehicle dynamics model and a desired target-tracking model were established in this paper. A fuzzy logic controller was designed for integrated active front steering (AFS) and direct yaw moment control (DYC) in order to improve vehicle handling performance and stability, along with a fuzzy controller for the automatic steering problem. The simulation results demonstrate the strong robustness and excellent tracking performance of the proposed control algorithm.
Keywords: fuzzy logic, lateral control, AFS, DYC, electric car technology, longitudinal control, lateral motion
Procedia PDF Downloads 6102562 Bi-Criteria Vehicle Routing Problem for Possibility Environment
Authors: Bezhan Ghvaberidze
Abstract:
A multiple criteria optimization approach for the solution of the Fuzzy Vehicle Routing Problem (FVRP) is proposed. For the possibility environment, the levels of movement between customers are calculated by the constructed interactive simulation algorithm. The first criterion of the bi-criteria optimization problem, minimization of the expectation of the total fuzzy travel time on closed routes, is constructed for the FVRP. A new, second criterion, maximization of the feasibility of movement on the closed routes, is constructed by the Choquet finite averaging operator. The FVRP is reduced to the bi-criteria partitioning problem for the so-called "promising" routes, which were selected from all admissible closed routes. The convenient selection of the "promising" routes allows us to solve the reduced problem in real-time computing. For the numerical solution of the bi-criteria partitioning problem, the ε-constraint approach is used. An exact algorithm is implemented based on D. Knuth's Dancing Links technique and the algorithm DLX. The main objective was to present the new approach for the FVRP when there are difficulties while moving on the roads; this approach is called FVRP for extreme conditions (FVRP-EC) on the roads. A further aim of this paper was to construct the solution model for the constructed FVRP. Results are illustrated on a numerical example where all Pareto-optimal solutions are found. Also, an approach for the more complex model, FVRP with time windows, was developed. A numerical example is presented in which optimal routes are constructed for extreme conditions on the roads.
Keywords: combinatorial optimization, fuzzy vehicle routing problem, multiple objective programming, possibility theory
Procedia PDF Downloads 4852561 An Integration of Genetic Algorithm and Particle Swarm Optimization to Forecast Transport Energy Demand
Authors: N. R. Badurally Adam, S. R. Monebhurrun, M. Z. Dauhoo, A. Khoodaruth
Abstract:
Transport energy demand is vital for the economic growth of any country. Globalisation and a better standard of living play an important role in transport energy demand. Recently, transport energy demand in Mauritius has increased significantly, thus leading to an overuse of natural resources and thereby contributing to global warming. Forecasting the transport energy demand is therefore important for controlling and managing the demand. In this paper, we develop a model to predict the transport energy demand. The model developed is based on a system of five stochastic differential equations (SDEs) consisting of five endogenous variables: fuel price, population, gross domestic product (GDP), number of vehicles and transport energy demand, and three exogenous parameters: crude birth rate, crude death rate and labour force. An interval of seven years is used to avoid any falsification of results since Mauritius is a developing country. Data available for Mauritius from 2003 up to 2009 are used to obtain the values of the design variables by applying a genetic algorithm. The model is verified and validated for 2010 to 2012 by substituting the values of the coefficients obtained by the GA into the model and using particle swarm optimisation (PSO) to predict the values of the exogenous parameters. This model will help to control the transport energy demand in Mauritius, which will in turn foster Mauritius towards a pollution-free country and decrease its dependence on fossil fuels.
Keywords: genetic algorithm, modeling, particle swarm optimization, stochastic differential equations, transport energy demand
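A minimal sketch of the GA coefficient-fitting step described in this abstract is shown below. The toy demand recursion (a stand-in for the SDE system), the placeholder data, the bounds and the GA settings are all illustrative assumptions.

```python
# Hedged sketch of a real-coded genetic algorithm fitting model coefficients
# by minimizing squared error between predicted and observed demand.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2003, 2010)
observed = np.array([820., 850., 870., 905., 940., 985., 1010.])  # placeholder data
gdp = np.linspace(5.0, 7.5, years.size)                           # placeholder driver

def predict(coeffs, gdp_series, d0=observed[0]):
    """Toy recursion d_{t+1} = d_t + a*gdp_t + b (stand-in for the SDE system)."""
    a, b = coeffs
    d = [d0]
    for g in gdp_series[:-1]:
        d.append(d[-1] + a * g + b)
    return np.array(d)

def fitness(coeffs):
    return -np.sum((predict(coeffs, gdp) - observed) ** 2)

pop = rng.uniform([-5, -50], [20, 50], size=(60, 2))       # initial population
for gen in range(200):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[-20:]]                 # truncation selection
    children = []
    while len(children) < 40:
        p1, p2 = parents[rng.integers(20, size=2)]
        w = rng.random()
        child = w * p1 + (1 - w) * p2                       # blend crossover
        child += rng.normal(0, 0.1, size=2)                 # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(c) for c in pop])]
print("fitted coefficients:", best)
```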
Procedia PDF Downloads 3692560 Event Extraction, Analysis, and Event Linking
Authors: Anam Alam, Rahim Jamaluddin Kanji
Abstract:
With the rapid growth of event data everywhere, event extraction has become an important means of retrieving information from unstructured data. One of the challenging problems is extracting the events from such data. An event is an observable occurrence of interaction among entities. This paper investigates the effectiveness of the event extraction capabilities of three software tools: Wandora, Nitro and SPSS. We performed standard text mining techniques with these tools on the data sets of (i) Afghan War Diaries (AWD collection), (ii) MUC4 and (iii) WebKB. Information retrieval measures such as precision and recall are computed under an extensive set of experiments for event extraction. The experimental study analyzes the difference between events extracted by the software and by humans. This approach helps to construct an algorithm that can be applied to different machine learning methods.
Keywords: event extraction, Wandora, nitro, SPSS, event analysis, extraction method, AFG, Afghan War Diaries, MUC4, 4 universities, dataset, algorithm, precision, recall, evaluation
Procedia PDF Downloads 5962559 Optimization of Electrocoagulation Process Using Duelist Algorithm
Authors: Totok R. Biyanto, Arif T. Mardianto, M. Farid R. R., Luthfi Machmudi, Kandi Mulakasti
Abstract:
The main objective of this research is to optimize the electrocoagulation process design as a post-treatment for biologically treated vinasse effluent. A first-principles model with three independent variables that affect the energy consumption of the electrocoagulation process, i.e., current density, electrode distance, and treatment time, is chosen for the optimized variables. The process condition parameters were determined with values of pH, electrical conductivity, and temperature of the vinasse of about 6.5, 28.5 mS/cm, and 52 °C, respectively. Aluminum was chosen as the electrode material for the electrocoagulation process. The Duelist algorithm was used as the optimization technique due to its capability to reach a global optimum. The optimization results show that the optimal process can be reached at a current density of 2.9976 A/m2, an electrode distance of 1.5 cm and an electrolysis time of 119 min. The optimized energy consumption during the process is 34.02 Wh.
Keywords: optimization, vinasse effluent, electrocoagulation, energy consumption
Procedia PDF Downloads 4692558 Minimization of Propagation Delay in Multi Unmanned Aerial Vehicle Network
Authors: Purva Joshi, Rohit Thanki, Omar Hanif
Abstract:
Unmanned aerial vehicles (UAVs) are becoming increasingly important in various industrial applications and sectors. Nowadays, a multi-UAV network is used for specific types of communication (e.g., military) and monitoring purposes. Therefore, it is critical to reduce propagation delay during communication between UAVs, which is essential in a multi-UAV network. This paper presents how the propagation delay between the base station (BS) and the UAVs is reduced using a searching algorithm. Furthermore, the iterative-based K-nearest neighbor (k-NN) algorithm and the Travelling Salesman Problem (TSP) algorithm were utilized to optimize the distance between the BS and individual UAVs to overcome the problem of propagation delay in multi-UAV networks. The simulation results show that the proposed method reduced complexity, improved reliability, and reduced propagation delay in multi-UAV networks.
Keywords: multi UAV network, optimal distance, propagation delay, K - nearest neighbor, traveling salesmen problem
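A minimal sketch of a nearest-neighbour route heuristic, in the spirit of the k-NN/TSP distance optimization described above, is given below. The coordinates and the 2-D Euclidean distance model are assumptions added for illustration, not the paper's exact formulation.

```python
# Hedged sketch of a greedy nearest-neighbour TSP heuristic for visiting UAVs.
import math

def nearest_neighbour_route(base, uav_positions):
    """From the base station, always fly to the nearest unvisited UAV;
    returns the visiting order and total round-trip distance."""
    unvisited = list(range(len(uav_positions)))
    route, total, current = [], 0.0, base
    while unvisited:
        nxt = min(unvisited,
                  key=lambda i: math.dist(current, uav_positions[i]))
        total += math.dist(current, uav_positions[nxt])
        current = uav_positions[nxt]
        route.append(nxt)
        unvisited.remove(nxt)
    total += math.dist(current, base)        # return leg to the base station
    return route, total

# Toy usage with assumed coordinates (km).
bs = (0.0, 0.0)
uavs = [(2.0, 1.0), (5.0, 4.0), (1.0, 6.0), (4.0, 0.5)]
order, dist_km = nearest_neighbour_route(bs, uavs)
print("visit order:", order, "total distance:", round(dist_km, 2), "km")
```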
Procedia PDF Downloads 2012557 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning
Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar
Abstract:
As the quantity and complexity of computing in large-scale software systems increase, distributed system computing becomes increasingly important. A distributed system realizes high-performance computing through collaboration between different computing resources. Without efficient resource scheduling, the misuse of distributed computing may cause resource waste and high costs. Resource scheduling is usually an NP-hard problem, so we cannot find a general solution; however, some optimization algorithms exist, such as the genetic algorithm, ant colony optimization, etc. The large scale of distributed systems makes these traditional optimization algorithms challenging to apply, so heuristic and machine learning algorithms are usually employed in this situation to ease the computing load. As a result, we review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning method that utilizes the perceptual ability of neural networks and the decision-making ability of reinforcement learning. Using the machine learning method, we try to find important factors that influence the performance of distributed system computing and help the distributed system perform efficient computing resource scheduling. This paper surveys the application of deep reinforcement learning to distributed system computing resource scheduling, proposes a deep reinforcement learning method that uses a recurrent neural network to optimize the resource scheduling, and discusses the challenges and improvement directions for DRL-based resource scheduling algorithms.
Keywords: resource scheduling, deep reinforcement learning, distributed system, artificial intelligence
Procedia PDF Downloads 1112556 Integrated Model for Enhancing Data Security Performance in Cloud Computing
Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali
Abstract:
Cloud computing is an important and promising field in the recent decade. Cloud computing allows sharing resources, services and information among people across the whole world. Although the advantages of using clouds are great, there are many risks in a cloud. Data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users and save the user's time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For the user authentication process, a username and password are used; the password is protected with SHA-2 as a one-way function. The proposed system shows an improvement in the processing time of uploading and downloading files on the cloud in secure form.
Keywords: cloud computing, data security, SAAS, PAAS, IAAS, Blowfish
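A minimal sketch of the two SHA-2 roles mentioned in this abstract follows: a file-integrity digest for uploaded data and a one-way transform of the user's password. The salted PBKDF2 wrapper around SHA-256 is an assumption added for illustration; the abstract only states that SHA-2 is used.

```python
# Hedged sketch of SHA-2 usage for data integrity and password protection.
import hashlib
import os

def file_digest(data: bytes) -> str:
    """SHA-256 digest stored alongside the uploaded file; recomputed after
    download to verify integrity."""
    return hashlib.sha256(data).hexdigest()

def hash_password(password: str, salt: bytes | None = None) -> tuple[bytes, bytes]:
    """One-way password hashing built on SHA-256 (via PBKDF2 key stretching)."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

# Toy usage: integrity check and authentication check.
payload = b"encrypted file contents"           # e.g. Blowfish ciphertext
assert file_digest(payload) == file_digest(payload)

salt, stored = hash_password("s3cret")
_, attempt = hash_password("s3cret", salt)
print("password accepted:", attempt == stored)
```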
Procedia PDF Downloads 4772555 Saliency Detection Using a Background Probability Model
Authors: Junling Li, Fang Meng, Yichun Zhang
Abstract:
Image saliency detection has long been studied, while several challenging problems remain unsolved, such as detecting saliency inaccurately in complex scenes or suppressing salient objects at the image borders. In this paper, we propose a new saliency detection algorithm in order to solve these problems. We represent the image as a graph with superpixels as nodes. By considering the appearance similarity between the boundary and the background, the proposed method chooses non-salient boundary nodes as background priors to construct the background probability model. The probability that each node belongs to the model is computed, which measures its similarity with the background. Thus we can calculate saliency using the transformed probability as a metric. We compare our algorithm with ten state-of-the-art saliency detection methods on a public database. Experimental results show that our simple and effective approach can address those challenging problems that have long baffled image saliency detection.
Keywords: visual saliency, background probability, boundary knowledge, background priors
Procedia PDF Downloads 4292554 Effectiveness of Earthing System in Vertical Configurations
Authors: S. Yunus, A. Suratman, N. Mohamad Nor, M. Othman
Abstract:
This paper presents measurement and simulation results by the Finite Element Method (FEM) for the earth resistance (RDC) of interconnected vertical ground rod configurations. The soil resistivity was measured using the Wenner four-pin method, and RDC was measured using the Fall of Potential (FOP) method, as outlined in the standard. A Genetic Algorithm (GA) is employed to interpret the soil resistivity in terms of a 2-layer soil model. The same soil resistivity data that were obtained by the Wenner four-pin method were used in FEM for simulation. This paper compares the results of RDC obtained by FEM simulation with the real measurements at the field site. Good agreement was seen between the RDC obtained by measurement and by FEM. This shows that FEM is reliable software to be used for the design of earthing systems. It is also found that the parallel rod system has a better performance compared to a similar setup using a grid layout.
Keywords: earthing system, earth electrodes, finite element method, genetic algorithm, earth resistances
Procedia PDF Downloads 1102553 TMBCoI-SIOT: Trust Management System Based on the Community of Interest for the Social Internet of Things
Authors: Oumaima Ben Abderrahim, Mohamed Houcine Elhedhili, Leila Saidane
Abstract:
In this paper, we propose a trust management system based on a clustering architecture for the social internet of things, called TMBCoI-SIOT. The proposed model integrates numerous factors such as direct and indirect trust, a transaction factor, a precaution factor, and social modeling of trust. The novelty of our approach can be summed up in two aspects. The first aspect concerns the architecture based on communities of interest (CoI), where each community is headed by an administrator (admin). The second aspect is the trust management system, which tries to prevent On-Off attacks and mitigates dishonest recommendations using the k-means algorithm and guarantor things. The effectiveness of the proposed system is proved by simulation against malicious nodes.
Keywords: IoT, trust management system, attacks, trust, dishonest recommendations, K-means algorithm
Procedia PDF Downloads 2122552 Mixtures of Length-Biased Weibull Distributions for Loss Severity Modelling
Authors: Taehan Bae
Abstract:
In this paper, a class of length-biased Weibull mixtures is presented to model loss severity data. The proposed model generalizes the Erlang mixtures with a common scale parameter, and it shares many important modelling features, such as flexibility to fit various data distribution shapes and weak denseness in the class of positive continuous distributions, with the Erlang mixtures. We show that the asymptotic tail estimate of the length-biased Weibull mixture is of Weibull type, which makes the model effective for fitting loss severity data with heavy-tailed observations. A method of statistical estimation is discussed with applications to real catastrophic loss data sets.
Keywords: Erlang mixture, length-biased distribution, transformed gamma distribution, asymptotic tail estimate, EM algorithm, expectation-maximization algorithm
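For reference, the length-biasing operation follows the standard definition f_LB(x) = x f(x) / E[X]. The sketch below writes the resulting mixture density under the assumption that components share a common shape parameter k and differ in their scale parameters; the notation for weights, shape and scale is assumed here, not taken from the paper.

```latex
% Hedged sketch: length-biased Weibull component and its finite mixture.
\[
  f_{LB}(x \mid k,\lambda) \;=\; \frac{x\, f(x \mid k,\lambda)}{\lambda\,\Gamma(1+1/k)},
  \qquad
  f(x \mid k,\lambda) \;=\; \frac{k}{\lambda}\Big(\frac{x}{\lambda}\Big)^{k-1}
        e^{-(x/\lambda)^{k}},
\]
\[
  g(x) \;=\; \sum_{j=1}^{m} w_j\, f_{LB}(x \mid k,\lambda_j),
  \qquad w_j \ge 0,\; \sum_{j=1}^{m} w_j = 1 .
\]
```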
Procedia PDF Downloads 2242551 Robust Fault Diagnosis for Wind Turbine Systems Subjected to Multi-Faults
Authors: Sarah Odofin, Zhiwei Gao, Sun Kai
Abstract:
Operations, maintenance and reliability of wind turbines have received much attention over the years due to the rapid expansion of wind farms. This paper explores an early fault diagnosis technique based on a unique scheme for a 5 MW wind turbine system that is optimized by a genetic algorithm to be very sensitive to faults and resilient to disturbances. A quantitative model-based analysis is pragmatic for primary fault diagnosis and monitoring assessment to minimize downtime, which is mostly caused by component breakdown, and to maintain consistent productivity. Simulation results are computed to validate the wind turbine model, demonstrating system performance on practical examples of fault types. The results show the satisfactory effectiveness of the applied scheme investigated in a Matlab/Simulink/Gatool environment.
Keywords: disturbance robustness, fault monitoring and detection, genetic algorithm, observer technique
Procedia PDF Downloads 3792550 Reconfigurable Efficient IIR Filter Design Using MAC Algorithm
Authors: Rajesh Mehra
Abstract:
In this paper, an IIR filter has been designed and simulated on an FPGA. The implementation is based on the MAC algorithm, which uses multiply-and-accumulate operations for the IIR filter implementation. A parallel pipelined structure is used to implement the proposed IIR filter, taking optimal advantage of the look-up tables of the FPGA device. The designed filter has been synthesized on a DSP-slice-based FPGA, with the DSP slices performing the multiplier function of the MAC unit and enhancing speed performance. The developed IIR filter is designed and simulated with MATLAB, synthesized with the Xilinx Synthesis Tool (XST), and implemented on Virtex 5 and Spartan 3A DSP FPGA devices. The IIR filter implemented on the Virtex 5 FPGA can operate at an estimated frequency of 81.5 MHz as compared to 40.5 MHz for the Spartan 3A DSP FPGA. The Virtex 5 based implementation also consumes fewer slices and slice flip-flops of the target FPGA in comparison to the Spartan 3A DSP based implementation, providing a cost-effective solution for signal processing applications.
Keywords: butterworth, DSP, IIR, MAC, FPGA
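A minimal sketch of the multiply-and-accumulate (MAC) view of an IIR filter follows: each output sample is a running sum of coefficient-times-sample products, which is what the FPGA's DSP slices compute. The second-order low-pass coefficients used in the example are illustrative assumptions, not the paper's design.

```python
# Hedged sketch of a direct-form I IIR filter built purely from MAC operations.
def iir_mac(x, b, a):
    """y[n] = sum_k b[k]*x[n-k] - sum_k a[k]*y[n-k], with a[0] normalized to 1."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k in range(len(b)):                 # feed-forward MACs
            if n - k >= 0:
                acc += b[k] * x[n - k]
        for k in range(1, len(a)):              # feedback MACs
            if n - k >= 0:
                acc -= a[k] * y[n - k]
        y.append(acc)
    return y

# Toy usage with assumed low-pass coefficients and an impulse input.
b = [0.0675, 0.1349, 0.0675]
a = [1.0, -1.1430, 0.4128]
signal = [0.0, 1.0, 0.0, 0.0, 0.0, 0.0]
print([round(v, 4) for v in iir_mac(signal, b, a)])
```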
Procedia PDF Downloads 3572549 Use of Interpretable Evolved Search Query Classifiers for Sinhala Documents
Authors: Prasanna Haddela
Abstract:
Document analysis is a well-matured yet still active research field, partly as a result of the intricate nature of building computational tools but also due to the inherent problems arising from the variety and complexity of human languages. Breaking down language barriers is vital in enabling access to a number of recent technologies. This paper investigates the application of document classification methods to new Sinhalese datasets. This language is geographically isolated and rich with many of its own unique features. We examine the interpretability of the classification models, with a particular focus on the use of evolved Lucene search queries generated using a Genetic Algorithm (GA) as a method of document classification. We compare the accuracy and interpretability of these search queries with other popular classifiers. The results are promising and are roughly in line with previous work on English language datasets.
Keywords: evolved search queries, Sinhala document classification, Lucene Sinhala analyzer, interpretable text classification, genetic algorithm
Procedia PDF Downloads 1142548 Application of Neural Networks to Predict Changing the Diameters of Bubbles in Pool Boiling Distilled Water
Authors: V. Nikkhah Rashidabad, M. Manteghian, M. Masoumi, S. Mousavian, D. Ashouri
Abstract:
In this research, the capability of neural networks in modeling and learning complicated and nonlinear relations has been used to develop a model for the prediction of changes in the diameter of bubbles in pool boiling distilled water. The input parameters used in the development of this network include element temperature, heat flux, and retention time of the bubbles. The test data were obtained from pool boiling experiments on distilled water and measurements of the bubbles formed on the cylindrical element. The model was developed based on a training algorithm of the back-propagation type. The correlation coefficient obtained from this model is 0.9633, which shows that this model can be trusted for the simulation and modeling of bubble size and the thermal transfer of boiling.
Keywords: bubble diameter, heat flux, neural network, training algorithm
Procedia PDF Downloads 4432547 SAP-Reduce: Staleness-Aware P-Reduce with Weight Generator
Authors: Lizhi Ma, Chengcheng Hu, Fuxian Wong
Abstract:
Partial reduce (P-Reduce) has set state-of-the-art performance for distributed machine learning in heterogeneous environments over the All-Reduce architecture. Dynamic P-Reduce based on the exponential moving average (EMA) approach predicts all the intermediate model parameters, which raises reliability concerns: the approximation trick can lead to incorrect model parameters being obtained on the nodes. In this paper, SAP-Reduce is proposed, a variant of the All-Reduce distributed training model with staleness-aware dynamic P-Reduce. SAP-Reduce directly utilizes an EMA-like algorithm to generate the normalized weights. To demonstrate the effectiveness of the algorithm, experiments are set up on a number of deep learning models, comparing the single-step training acceleration ratio and convergence time. It is found that SAP-Reduce, which simplifies dynamic P-Reduce, outperforms the intermediate-approximation variant. The empirical results show SAP-Reduce is 1.3x-2.1x faster than existing baselines.
Keywords: collective communication, decentralized distributed training, machine learning, P-Reduce
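A minimal sketch of an EMA-style staleness-aware weight generator is given below: workers that report fresher parameters get larger normalized aggregation weights. The decay constant and the exponential staleness penalty are assumptions made for illustration; the paper's exact weight rule is not reproduced.

```python
# Hedged sketch of staleness-aware weight generation and weighted aggregation.
def staleness_weights(staleness, decay=0.9):
    """Map per-worker staleness (in steps) to normalized aggregation weights
    via an exponential-moving-average-like discount decay**staleness."""
    raw = [decay ** s for s in staleness]
    total = sum(raw)
    return [r / total for r in raw]

def aggregate(parameter_vectors, weights):
    """Weighted average of the workers' parameter vectors."""
    dim = len(parameter_vectors[0])
    return [sum(w * v[d] for w, v in zip(weights, parameter_vectors))
            for d in range(dim)]

# Toy usage: worker 2 is three steps stale and is down-weighted accordingly.
staleness = [0, 1, 3]
params = [[1.0, 2.0], [1.1, 2.1], [0.2, 0.3]]
w = staleness_weights(staleness)
print("weights:", [round(x, 3) for x in w])
print("aggregated parameters:", [round(x, 3) for x in aggregate(params, w)])
```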
Procedia PDF Downloads 322546 Unsupervised Segmentation Technique for Acute Leukemia Cells Using Clustering Algorithms
Authors: N. H. Harun, A. S. Abdul Nasir, M. Y. Mashor, R. Hassan
Abstract:
Leukaemia is a blood cancer that contributes to the increase in the mortality rate in Malaysia each year. There are two main categories of leukaemia: acute and chronic leukaemia. The production and development of acute leukaemia cells occur rapidly and uncontrollably. Therefore, if the identification of acute leukaemia cells could be done quickly and effectively, proper treatment and medicine could be delivered. Due to the requirement of prompt and accurate diagnosis of leukaemia, the current study has proposed unsupervised pixel segmentation based on clustering algorithms in order to obtain a fully segmented abnormal white blood cell (blast) in acute leukaemia images. In order to obtain the segmented blast, the current study proposed three clustering algorithms, namely the k-means, fuzzy c-means and moving k-means algorithms, which have been applied to the saturation component image. Then, a median filter and a seeded region growing area extraction algorithm have been applied to smooth the region of the segmented blast and to remove large unwanted regions from the image, respectively. Comparisons among the three clustering algorithms are made in order to measure the performance of each clustering algorithm in segmenting the blast area. Based on the good sensitivity value obtained, the results indicate that the moving k-means clustering algorithm has successfully produced the fully segmented blast region in acute leukaemia images. Hence, the resultant images could be helpful to haematologists for further analysis of acute leukaemia.
Keywords: acute leukaemia images, clustering algorithms, image segmentation, moving k-means
Procedia PDF Downloads 2912545 Effect of Punch Diameter on Optimal Loading Profiles in Hydromechanical Deep Drawing Process
Authors: Mehmet Halkaci, Ekrem Öztürk, Mevlüt Türköz, H. Selçuk Halkacı
Abstract:
Hydromechanical deep drawing (HMD) is an advanced manufacturing process used to form deep parts in only one forming step. In this process, the sheet metal blank can be drawn deeper by means of fluid pressure acting on the sheet surface in the direction opposite to the punch movement. A high limiting drawing ratio, good surface quality, less springback and high dimensional accuracy are some of the advantages of this process. The performance of the HMD process is affected by various process parameters such as fluid pressure, blank holder force, punch-die radius, pre-bulging pressure and height, punch diameter, and friction between sheet and die and between sheet and punch. The fluid pressure and blank holder force are the main loading parameters and affect the formability of the HMD process significantly. The punch diameter also influences the limiting drawing ratio (the ratio of initial sheet diameter to punch diameter) of the sheet metal blank. In this research, optimal loading (fluid pressure and blank holder force) profiles were determined for AA 5754-O sheet material through a fuzzy control algorithm developed in a previous study using the LS-DYNA finite element analysis (FEA) software. In the preceding study, the fuzzy control algorithm was developed utilizing geometrical criteria such as thinning and wrinkling. In order to obtain the final desired part with the developed algorithm in terms of the requested punch diameter, the effect of punch diameter, which is one of the process parameters, on the loading profiles was investigated separately using a blank thickness of 1 mm. Thus, the practicality of the previously developed fuzzy control algorithm with different punch diameters was clarified. Also, the thickness distributions of the sheet metal blank along a curvilinear distance were compared for the FEA in which different punch diameters were used. Consequently, it was found that the use of different punch diameters did not affect the optimal loading profiles very much.
Keywords: Finite Element Analysis (FEA), fuzzy control, hydromechanical deep drawing, optimal loading profiles, punch diameter
Procedia PDF Downloads 4312544 Smartphone Video Source Identification Based on Sensor Pattern Noise
Authors: Raquel Ramos López, Anissa El-Khattabi, Ana Lucila Sandoval Orozco, Luis Javier García Villalba
Abstract:
An increasing number of mobile devices with integrated cameras has meant that most digital video comes from these devices. These digital videos can be made anytime, anywhere and for different purposes. They can also be shared on the Internet in a short period of time and may sometimes contain recordings of illegal acts. The need to reliably trace their origin becomes evident when these videos are used for forensic purposes. This work proposes an algorithm to identify the brand and model of the mobile device which generated a video. Its procedure is as follows: after obtaining the relevant video information, a classification algorithm based on sensor noise and the Wavelet Transform performs the aforementioned identification process. We also present experimental results that support the validity of the techniques used and show promising results.
Keywords: digital video, forensics analysis, key frame, mobile device, PRNU, sensor noise, source identification
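A minimal sketch of sensor-pattern-noise (PRNU) source matching follows: a camera fingerprint is estimated as the average noise residual of its frames, and a query video is scored by correlating its residual with each fingerprint. A simple Gaussian denoiser stands in for the wavelet-based filtering mentioned in the abstract, and the frame data and thresholds are illustrative assumptions.

```python
# Hedged sketch of PRNU fingerprint estimation and correlation scoring.
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(frame, sigma=1.5):
    """Residual = frame - denoised(frame); approximates PRNU plus noise."""
    frame = frame.astype(float)
    return frame - gaussian_filter(frame, sigma)

def fingerprint(frames):
    """Average residual over many frames of the same device."""
    return np.mean([noise_residual(f) for f in frames], axis=0)

def ncc(a, b):
    """Normalized cross-correlation between two residual images."""
    a, b = a - a.mean(), b - b.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy usage with synthetic frames sharing a fixed multiplicative pattern.
rng = np.random.default_rng(0)
pattern = rng.normal(0, 0.02, (64, 64))
frames = [100 * (1 + pattern) + rng.normal(0, 2, (64, 64)) for _ in range(20)]
query = 100 * (1 + pattern) + rng.normal(0, 2, (64, 64))
fp = fingerprint(frames)
print("correlation with fingerprint:", round(ncc(fp, noise_residual(query)), 3))
```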
Procedia PDF Downloads 428
2543 An Approach to Noise Variance Estimation in Very Low Signal-to-Noise Ratio Stochastic Signals
Authors: Miljan B. Petrović, Dušan B. Petrović, Goran S. Nikolić
Abstract:
This paper describes a method for AWGN (Additive White Gaussian Noise) variance estimation in noisy stochastic signals, referred to as Multiplicative-Noising Variance Estimation (MNVE). The aim was to develop an estimation algorithm with a minimal number of assumptions on the original signal structure. The provided MATLAB simulation and analysis of the results of the method applied to speech signals showed more accuracy than the standardized AR (autoregressive) modeling noise estimation technique. In addition, great performance was observed at very low signal-to-noise ratios, which in general represent the worst-case scenario for signal denoising methods. High execution time appears to be the only disadvantage of MNVE. After close examination of all the observed features of the proposed algorithm, it was concluded that it is worth exploring and that, with some further adjustments and improvements, it can be enviably powerful.
Keywords: noise, signal-to-noise ratio, stochastic signals, variance estimation
Procedia PDF Downloads 386
2542 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem
Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee
Abstract:
Weapon-target assignment (WTA) is a problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over the past years for both static and dynamic environments (denoted SWTA and DWTA, respectively). Due to the requirement that the problem be solved in a relevant computational time, WTA has suffered from poor solution efficiency. As a result, SWTA and DWTA problems have been solved only in limited battlefield situations. In this paper, the general situation under continuous time is considered through the Time-based Weapon-Target Assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: decomposed opt-opt, decomposed opt-greedy, and greedy algorithms. Although the TWTA optimization model works inefficiently when it is characterized by a large size, the decomposed opt-opt algorithm based on the linearization and decomposition method extracted efficient solutions in a reasonable computation time. Because the computation time of the scheduling part is too long to solve with the optimization model, several greedy-based algorithms are proposed. These models show lower performance values than the decomposed opt-opt algorithm, but require very short computation times. Hence, this paper proposes an improved method by applying decomposition to TWTA, and more practical and effectual methods can be developed for using TWTA on the battlefield.
Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research
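A minimal sketch of a greedy assignment heuristic, in the spirit of the greedy algorithm named above, is given below: the launcher-target pair with the largest marginal reduction in expected surviving target value is assigned repeatedly. The kill probabilities, target values and one-shot-per-launcher rule are illustrative assumptions, not the paper's exact TWTA formulation.

```python
# Hedged sketch of a greedy weapon-target assignment heuristic.
def greedy_wta(values, kill_prob):
    """values[j]: target value; kill_prob[i][j]: P(launcher i kills target j).
    Each launcher fires once; returns {launcher: target} and leaked value."""
    survival = list(values)                      # expected surviving value per target
    assignment = {}
    free_launchers = set(range(len(kill_prob)))
    while free_launchers:
        best = None
        for i in free_launchers:
            for j, _ in enumerate(values):
                gain = survival[j] * kill_prob[i][j]
                if best is None or gain > best[0]:
                    best = (gain, i, j)
        gain, i, j = best
        if gain <= 0:
            break
        assignment[i] = j
        survival[j] *= (1 - kill_prob[i][j])     # update expected survival
        free_launchers.remove(i)
    return assignment, sum(survival)

# Toy usage: 3 launchers, 2 targets.
values = [10.0, 6.0]
kill_prob = [[0.8, 0.3],
             [0.5, 0.6],
             [0.2, 0.7]]
plan, leakage = greedy_wta(values, kill_prob)
print("assignment:", plan, "expected surviving value:", round(leakage, 2))
```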
Procedia PDF Downloads 336
2541 Analysis of CO₂ Two-Phase Ejector with Taguchi and ANOVA Optimization and Refrigerant Selection with Enviro Economic Concerns by TOPSIS Analysis
Authors: Karima Megdouli, Bourhan Tachtouch
Abstract:
Ejector refrigeration cycles offer an alternative to conventional systems for producing cold from low-temperature heat. In this article, a thermodynamic model is presented. This model has the advantage of simplifying the calculation algorithm and describes the complex double-throttling mechanism that occurs in the ejector. The model assumptions and calculation algorithm are presented first, and the impact of each efficiency is evaluated. Validation is performed on several data sets. The ejector model is then used to simulate a refrigeration ejector system (RES) to validate its robustness and suitability for use in predicting thermodynamic cycle performance. A Taguchi and ANOVA optimization is carried out on the RES. TOPSIS analysis was applied to select the optimum refrigerants with cost, safety, environmental and enviro-economic concerns, along with thermophysical properties.
Keywords: ejector, velocity distribution, shock circle, Taguchi and ANOVA optimization, TOPSIS analysis
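A minimal sketch of the TOPSIS ranking step is shown below: candidate refrigerants are scored on several criteria and ranked by closeness to the ideal solution. The criteria, weights and scores are illustrative assumptions only, not the paper's data.

```python
# Hedged sketch of TOPSIS multi-criteria ranking of candidate refrigerants.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of alternative i on criterion j; benefit[j] is True
    if larger is better. Returns closeness coefficients (higher = better)."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)                 # vector normalization
    v = norm * np.asarray(weights, dtype=float)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Toy usage: criteria = [COP, cost, GWP]; higher COP is good, cost and GWP bad.
candidates = ["R290", "R600a", "R1234yf"]
scores = [[1.35, 4.0, 3.0],
          [1.30, 3.5, 3.0],
          [1.25, 6.0, 4.0]]
cc = topsis(scores, weights=[0.5, 0.25, 0.25], benefit=[True, False, False])
for name, c in sorted(zip(candidates, cc), key=lambda t: -t[1]):
    print(f"{name}: closeness = {c:.3f}")
```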
Procedia PDF Downloads 89
2540 On the Other Side of Shining Mercury: In Silico Prediction of Cold Stabilizing Mutations in Serine Endopeptidase from Bacillus lentus
Authors: Debamitra Chakravorty, Pratap K. Parida
Abstract:
Cold-adapted proteases enhance wash performance in low-temperature laundry, resulting in a reduction in energy consumption and wear of textiles, and are also used in the dehairing process in leather industries. Unfortunately, a possible drawback of using cold-adapted proteases is their instability at higher temperatures. Wild-type cold-adapted proteases exhibit instability at higher temperatures and thus have low shelf lives; therefore, proteases with broad temperature stability are required. Attempts to engineer cold-adapted proteases by protein engineering were made previously by directed evolution and random mutagenesis. The lacuna is that the time, capital, and labour involved in obtaining these variants are very demanding and challenging. Therefore, rational engineering for cold stability without compromising an enzyme's optimum pH and temperature for activity is the current requirement. In this work, mutations were rationally designed with the aid of a high-throughput computational methodology of network analysis, evolutionary conservation scores, and molecular dynamics simulations for Savinase from Bacillus lentus, with the intention of rendering the mutants cold stable without affecting their optimum temperature and pH for activity. Further, an attempt was made to rationally incorporate a mutation in the most stable mutant obtained by this method to introduce oxidative stability in the mutant. Such enzymes are desired in detergents with bleaching agents. In silico analysis by performing 300 ns molecular dynamics simulations at 5 different temperatures revealed that these three mutants were found to be better in cold stability compared to the wild-type Savinase from Bacillus lentus. Conclusively, this work shows that cold adaptation without losing optimum temperature and pH stability, and additionally stability against oxidative damage, can be rationally designed by in silico enzyme engineering. The key findings of this work were: first, the in silico data of H5 (cold stable Savinase), used as a control in this work, corroborated its reported wet-lab temperature stability data; secondly, three cold-stable mutants of Savinase from Bacillus lentus were rationally identified; lastly, a mutation which will stabilize Savinase against oxidative damage was additionally identified.
Keywords: cold stability, molecular dynamics simulations, protein engineering, rational design
Procedia PDF Downloads 140