Search results for: recursive algorithm
2725 A Laundry Algorithm for Colored Textiles
Authors: H. E. Budak, B. Arslan-Ilkiz, N. Cakmakci, I. Gocek, U. K. Sahin, H. Acikgoz-Tufan, M. H. Arslan
Abstract:
The aim of this study is to design a novel laundry algorithm for colored textiles that have a significant decoloring problem. In the experimental work, bleached knitted single jersey fabric made of 100% cotton and dyed with reactive dyestuff was used, since, according to a conducted survey, cotton textiles are the products most in demand in the textile market and reactive dyestuffs are the ones most commonly used in the industry for dyeing cotton products. The fabric used in this study was therefore selected and purchased in accordance with the survey results. Samples cut from this fabric were dyed with different dyeing parameters using Remazol Brilliant Red 3BS dyestuff in a Gyrowash machine under laboratory conditions. From the alternative reactive-dyed cotton samples, those with a high tendency to color loss were identified and examined. The parameters of the dyeing process used for these samples were evaluated, and the dyeing process chosen to induce a high tendency to color loss in the cotton fabric was determined, so that the level of improvement in color loss achieved in this study could be revealed clearly. Afterwards, all of the untreated fabric samples were dyed with the selected process. Once dyeing was completed, an experimental design for the laundering process was created with the Minitab® program, considering temperature, time, and mechanical action as parameters. All washing experiments were performed in a domestic washing machine: 16 washing experiments were carried out, covering 8 experimental conditions with 2 repeats per condition. After each washing experiment, water samples from the main wash were measured with a UV spectrophotometer, and the values obtained were compared with the calibration curve of the materials used in the dyeing process. The results of the washing experiments were statistically analyzed with the Minitab® program. Based on the results, the most suitable washing algorithm for domestic washing machines, in terms of temperature, time, and mechanical action, for minimizing fabric color loss was chosen. The laundry algorithm proposed in this study minimizes the color loss of colored textiles in washing machines by eliminating the negative effects of the laundering parameters on textile color without compromising the fundamental cleaning action. Since fabric color loss is minimized with this washing algorithm, dyestuff residuals will also be lower in the grey water released from the laundering process. In addition, with this laundry algorithm it is possible to wash other types of textile products with a proper cleaning effect and minimized color loss.
Keywords: color loss, laundry algorithm, textiles, domestic washing process
Procedia PDF Downloads 357
2724 Monocular Visual Odometry for Three Different View Angles by Intel Realsense T265 with the Measurement of Remote
Authors: Heru Syah Putra, Aji Tri Pamungkas Nurcahyo, Chuang-Jan Chang
Abstract:
The MOIL-SDK method extracts views at different spatial angles, each with a different perspective, from a fisheye image. Visual odometry is a trusted technique for extending such projects by tracking over image sequences. We present a real-time, precise, and persistent approach that contributes to this work by taking datasets and generating ground truth as a reference for the estimates of each image. Keypoints are found with the FAST algorithm and evaluated during the tracking process with the 5-point algorithm combined with RANSAC, producing accurate estimates of the camera trajectory for rotational and translational movements along the X, Y, and Z axes.
Keywords: MOIL-SDK, intel realsense T265, Fisheye image, monocular visual odometry
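As a rough illustration of the tracking step described above (FAST keypoints evaluated with the 5-point algorithm and RANSAC), the sketch below uses OpenCV on a pair of frames. It is not the MOIL-SDK pipeline; the intrinsic matrix and the frame file names are placeholder assumptions.

```python
import cv2
import numpy as np

# Assumed (placeholder) intrinsics for a rectified view extracted from the fisheye image.
K = np.array([[285.0, 0.0, 424.0],
              [0.0, 285.0, 400.0],
              [0.0, 0.0, 1.0]])

def relative_pose(img1, img2, K):
    """Estimate rotation R and translation direction t between two grayscale frames."""
    fast = cv2.FastFeatureDetector_create(threshold=25)
    pts1 = np.float32([k.pt for k in fast.detect(img1, None)]).reshape(-1, 1, 2)
    # Track the FAST keypoints into the second frame with pyramidal Lucas-Kanade flow.
    pts2, status, _ = cv2.calcOpticalFlowPyrLK(img1, img2, pts1, None)
    good = status.ravel() == 1
    p1, p2 = pts1[good], pts2[good]
    # 5-point algorithm inside RANSAC rejects outlier correspondences.
    E, mask = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=mask)
    return R, t     # t is recovered only up to scale in monocular visual odometry

if __name__ == "__main__":
    f1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)   # placeholder frame files
    f2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
    if f1 is not None and f2 is not None:
        R, t = relative_pose(f1, f2, K)
        print(R, t)
```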
Procedia PDF Downloads 133
2723 A Multifactorial Algorithm to Automate Screening of Drug-Induced Liver Injury Cases in Clinical and Post-Marketing Settings
Authors: Osman Turkoglu, Alvin Estilo, Ritu Gupta, Liliam Pineda-Salgado, Rajesh Pandey
Abstract:
Background: Hepatotoxicity can be linked to a variety of clinical symptoms and histopathological signs, posing a great challenge in the surveillance of suspected drug-induced liver injury (DILI) cases in the safety database. Additionally, the majority of such cases are rare, idiosyncratic, highly unpredictable, and tend to demonstrate unique individual susceptibility; these qualities, in turn, make the pharmacovigilance monitoring process tedious and time-consuming. Objective: Develop a multifactorial algorithm to assist pharmacovigilance physicians in identifying high-risk hepatotoxicity cases associated with DILI from the sponsor’s safety database (Argus). Methods: Multifactorial selection criteria were established using Structured Query Language (SQL) and the TIBCO Spotfire® visualization tool, via a combination of word fragments, wildcard strings, and mathematical constructs, based on Hy’s law criteria and the pattern of injury (R-value). These criteria excluded non-eligible cases from monthly line listings mined from the Argus safety database. The capabilities and limitations of these criteria were verified by comparing a manual review of all monthly cases with the system-generated monthly listings over six months. Results: On average, over a period of six months, the algorithm accurately identified 92% of DILI cases meeting the established criteria. The automated process easily compared liver enzyme elevations with baseline values, reducing the screening time to under 15 minutes, as opposed to the multiple hours exhausted by a cognitively laborious, manual process. Limitations of the algorithm include its inability to identify cases associated with non-standard laboratory tests, naming conventions, and/or incomplete or incorrectly entered laboratory values. Conclusions: The newly developed multifactorial algorithm proved to be extremely useful in detecting potential DILI cases, while heightening the vigilance of the drug safety department. Additionally, the application of this algorithm may be useful in identifying a potential signal for DILI in drugs not yet known to cause liver injury (e.g., drugs in the initial phases of development). This algorithm also carries the potential for universal application, due to its product-agnostic data and keyword mining features. Plans for the tool include developing it into a fully automated application, thereby completely eliminating the manual screening process.
Keywords: automation, drug-induced liver injury, pharmacovigilance, post-marketing
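The laboratory-value arithmetic behind the cited criteria can be illustrated as follows. The functions use the conventional definitions of the R-value and Hy's law; the field names, the R-value band cut-offs, and the example case are illustrative assumptions, not the Argus/Spotfire implementation.

```python
# Illustrative sketch of the screening checks the abstract cites (pattern-of-injury
# R-value and Hy's law). All values are expressed against their upper limits of
# normal (ULN); the example case is hypothetical.

def r_value(alt, alt_uln, alp, alp_uln):
    """Pattern of injury: R = (ALT/ULN) / (ALP/ULN)."""
    return (alt / alt_uln) / (alp / alp_uln)

def injury_pattern(r):
    if r >= 5:
        return "hepatocellular"
    if r <= 2:
        return "cholestatic"
    return "mixed"

def meets_hys_law(alt, alt_uln, bili, bili_uln, alp, alp_uln):
    """Hy's law screen: ALT >= 3x ULN and total bilirubin >= 2x ULN, without marked ALP elevation."""
    return alt >= 3 * alt_uln and bili >= 2 * bili_uln and alp < 2 * alp_uln

case = {"alt": 420.0, "alt_uln": 40.0, "alp": 110.0, "alp_uln": 120.0,
        "bili": 3.1, "bili_uln": 1.2}   # hypothetical laboratory values
r = r_value(case["alt"], case["alt_uln"], case["alp"], case["alp_uln"])
print(injury_pattern(r),
      meets_hys_law(case["alt"], case["alt_uln"], case["bili"], case["bili_uln"],
                    case["alp"], case["alp_uln"]))
```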
Procedia PDF Downloads 150
2722 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm
Authors: Xiang Jianhong, Wang Cong, Wang Linyu
Abstract:
With the rise of medical IoT technologies, wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. A compressed sensing (CS)-based WBAN system can avoid sampling a large amount of redundant information and reduce the complexity and computing time of data processing, but existing algorithms have poor signal compression and reconstruction performance. In this paper, a joint block multi-orthogonal least squares (JBMOLS) algorithm is proposed. We apply the FECG signal to the joint block sparse model (JBSM), and a comparative study of sparse transformations and measurement matrices is carried out. An FECG signal compression and transmission mode based on the Rbio5.5 wavelet, a Bernoulli measurement matrix, and the JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signals in CS-based WBANs. Experimental results show that the compression ratio (CR) required for accurate reconstruction in this transmission mode is increased by nearly 10%, and the runtime is reduced by about 30%.
Keywords: telemedicine, fetal ECG, compressed sensing, joint sparse reconstruction, block sparse signal
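For readers unfamiliar with the reconstruction side, the following is a minimal compressed-sensing sketch using a Bernoulli measurement matrix and standard orthogonal matching pursuit (OMP) as a simple stand-in for JBMOLS; the sparse test signal is synthetic rather than an FECG record.

```python
# Compressed measurement with a Bernoulli matrix, then greedy recovery with plain OMP.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 96, 8                                        # signal length, measurements, sparsity
Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)     # Bernoulli measurement matrix

x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k) # k-sparse test signal
y = Phi @ x                                                 # compressed measurements

def omp(Phi, y, k):
    """Greedy OMP: pick the best-correlated atom, re-fit on the support, repeat."""
    residual, support, coef = y.copy(), [], np.zeros(0)
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print("relative reconstruction error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```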
Procedia PDF Downloads 127
2721 Design and Implementation of DC-DC Converter with Inc-Cond Algorithm
Authors: Mustafa Engin Başoğlu, Bekir Çakır
Abstract:
The component that most affects the efficiency of a photovoltaic power system is the solar panel, and overall system efficiency suffers because solar panel efficiency is low. Therefore, solar panels should be operated at their maximum power point through a power converter. In this study, a boost converter with maximum power point tracking (MPPT) operation has been designed and implemented using the incremental conductance (Inc-Cond) algorithm with direct duty control. Furthermore, it is shown that the performance of the boost converter with MPPT operation fails under low load resistance.
Keywords: boost converter, incremental conductance (Inc-Cond), MPPT, solar panel
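A textbook sketch of the Inc-Cond decision rule with direct duty control is shown below; it assumes a boost converter whose PV-side voltage falls as the duty cycle rises, and the step size, duty limits, and sample values are placeholders rather than the values used in the study.

```python
# Incremental conductance (Inc-Cond) rule: at the MPP dI/dV = -I/V; to the left
# dI/dV > -I/V, to the right dI/dV < -I/V. Assumption: boost converter whose
# PV-side voltage falls as the duty cycle rises.

def inc_cond_step(v, i, v_prev, i_prev, duty, step=0.005):
    """One MPPT update: returns the new duty cycle from two successive PV samples."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di > 0:                 # irradiance rose at constant voltage
            duty -= step           # move to a higher PV voltage
        elif di < 0:
            duty += step
    else:
        if di / dv > -i / v:       # left of the MPP: raise the operating voltage
            duty -= step
        elif di / dv < -i / v:     # right of the MPP: lower the operating voltage
            duty += step
    return min(max(duty, 0.05), 0.95)   # keep the duty cycle in a safe range

# Example update with made-up sensor samples:
print(inc_cond_step(v=17.8, i=5.1, v_prev=18.0, i_prev=5.0, duty=0.40))
```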
Procedia PDF Downloads 1044
2720 Stochastic Simulation of Random Numbers Using Linear Congruential Method
Authors: Melvin Ballera, Aldrich Olivar, Mary Soriano
Abstract:
Digital computers nowadays must have a utility capable of generating random numbers. Usually, computer-generated random numbers are not truly random, given predefined values such as a starting point and end points, which makes the sequence almost predictable. Random numbers have many applications, such as business simulation, manufacturing, the services domain, the entertainment sector, and other areas, making it worthwhile to design a unique method that allows unpredictable random numbers. Applying stochastic simulation with the linear congruential algorithm shows that as the seed and the range increase, the numbers produced or selected by the computer become unique. If this is implemented in an environment where random numbers are very much needed, the reliability of the random numbers is guaranteed.
Keywords: stochastic simulation, random numbers, linear congruential algorithm, pseudorandomness
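A minimal implementation of the linear congruential recurrence X(n+1) = (a·X(n) + c) mod m is shown below; the constants are common textbook values (Numerical Recipes), not necessarily those used in the study.

```python
# Minimal linear congruential generator: X(n+1) = (a * X(n) + c) mod m.

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m          # scale to a pseudorandom float in [0, 1)

gen = lcg(seed=42)
print([round(next(gen), 4) for _ in range(5)])
```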
Procedia PDF Downloads 315
2719 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition, and security are among the possible fields of application. In all these fields the amount of collected data is increasing quickly, and with this increase the computation speed becomes the critical factor. Data reduction is one solution to this problem. Removing the redundancy in rough sets can be achieved with the reduct. Many algorithms for generating the reduct have been developed, but most of them are only software implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time for both fetching and processing instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that provides the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes from the core. In this paper, a hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as the input, and the output of the algorithm is the superreduct, which is the reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. The algorithm described above has two disadvantages: i) it generates the superreduct instead of the reduct, and ii) the additional first stage may be unnecessary if the core is empty. For systems focused on fast computation of the reduct, however, the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit to control the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and software were compared. The results show an increase in the speed of data processing.
Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
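A software sketch of the two-stage greedy idea (core from singleton discernibility entries, then enrichment with the most frequent attributes) is given below for reference; the toy decision table is made up, and the paper's contribution is the FPGA realization rather than this software form.

```python
# Stage 1: core = attributes that appear as singleton entries of the discernibility
# matrix. Stage 2: greedily add the most frequently occurring attribute until every
# discernibility entry is covered, yielding a superreduct.
from collections import Counter

# rows: (condition attribute values, decision)
table = [((1, 0, 1), 0), ((1, 1, 1), 1), ((0, 0, 1), 0), ((0, 1, 0), 1), ((1, 1, 0), 1)]
n_attr = 3

def discernibility(table):
    """Attribute sets that discern each pair of objects with different decisions."""
    entries = []
    for i in range(len(table)):
        for j in range(i + 1, len(table)):
            (ci, di), (cj, dj) = table[i], table[j]
            if di != dj:
                diff = {a for a in range(n_attr) if ci[a] != cj[a]}
                if diff:                       # inconsistent pairs are skipped
                    entries.append(diff)
    return entries

entries = discernibility(table)
core = {next(iter(e)) for e in entries if len(e) == 1}          # stage 1: singletons
chosen = set(core)
while any(not (e & chosen) for e in entries):                   # stage 2: greedy enrichment
    freq = Counter(a for e in entries if not (e & chosen) for a in e)
    chosen.add(freq.most_common(1)[0][0])
print("core:", core, "superreduct:", chosen)
```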
Procedia PDF Downloads 218
2718 Troubleshooting Petroleum Equipment Based on Wireless Sensors Based on Bayesian Algorithm
Authors: Vahid Bayrami Rad
Abstract:
In this research, common methods and techniques have been investigated with a focus on intelligent fault-finding and monitoring systems in the oil industry. Remote and intelligent control methods are considered a necessity for implementing various operations in the oil industry, and benefiting from the knowledge extracted, with the help of data mining algorithms, from the countless data generated is a way to speed up the monitoring and troubleshooting process in today's big oil companies. Therefore, by comparing data mining algorithms and examining their efficiency, structure, and response under different conditions, the proposed (Bayesian) algorithm, using data clustering and analysis together with data evaluation by a colored Petri net, has provided an applicable and dynamic model from the point of view of reliability and response time. By using this method, it is possible to achieve a dynamic and consistent model of the remote control system, prevent the occurrence of leakage in oil pipelines and refineries, and reduce costs and human and financial errors. The statistical data obtained from the evaluation process show an increase in reliability, availability, and speed compared to previous methods.
Keywords: wireless sensors, petroleum equipment troubleshooting, Bayesian algorithm, colored Petri net, rapid miner, data mining-reliability
Procedia PDF Downloads 64
2717 Breast Cancer Risk is Predicted Using Fuzzy Logic in MATLAB Environment
Authors: S. Valarmathi, P. B. Harathi, R. Sridhar, S. Balasubramanian
Abstract:
The use of machine learning tools in medical diagnosis is increasing due to the improved effectiveness of classification and recognition systems in helping medical experts diagnose breast cancer. In this study, ID3 chooses the splitting attribute with the highest information gain, where gain is defined as the difference in entropy before versus after the split. It is applied to age, location, taluk, stage, year, period, marital status, treatment, heredity, sex, and habitat against the classes Very Serious (VS), Very Serious Moderate (VSM), Serious (S), and Not Serious (NS) to calculate the information gain. The ranked histogram gives the gain of each field for the breast cancer data. Doctors use TNM staging, which decides the risk level of the breast cancer and plays an important decision-making role in fuzzy logic for perception-based measurement. The spatial risk area (taluk) of breast cancer is calculated. The results clearly show that Coimbatore (North and South) is a higher-risk region for breast cancer than other areas at the 20% criterion. The weighted value of each taluk was compared with the criterion value and integrated with Map Object to visualize the results. The ID3 algorithm shows the high breast cancer risk regions in the study area. The study has outlined, discussed, and resolved the algorithms, techniques, and methods adopted through a soft computing methodology, such as the ID3 algorithm, for prognostic decision making on the seriousness of breast cancer.
Keywords: ID3 algorithm, breast cancer, fuzzy logic, MATLAB
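The information-gain computation at the heart of ID3 can be sketched as follows; the toy records and attribute names are illustrative assumptions, not the registry data used in the study.

```python
# Information gain = entropy(before split) - weighted entropy(after split).
import math
from collections import Counter, defaultdict

def entropy(labels):
    counts, total = Counter(labels), len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(records, attribute, label_key="risk"):
    labels = [r[label_key] for r in records]
    groups = defaultdict(list)
    for r in records:
        groups[r[attribute]].append(r[label_key])
    after = sum(len(g) / len(records) * entropy(g) for g in groups.values())
    return entropy(labels) - after

records = [   # hypothetical examples with VS/VSM/S/NS severity classes
    {"stage": "II",  "heredity": "yes", "risk": "VS"},
    {"stage": "I",   "heredity": "no",  "risk": "NS"},
    {"stage": "III", "heredity": "yes", "risk": "VSM"},
    {"stage": "I",   "heredity": "yes", "risk": "S"},
    {"stage": "II",  "heredity": "no",  "risk": "S"},
]
ranking = sorted(["stage", "heredity"],
                 key=lambda a: information_gain(records, a), reverse=True)
print(ranking)   # attributes ranked by information gain
```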
Procedia PDF Downloads 516
2716 Particle Filter Supported with the Neural Network for Aircraft Tracking Based on Kernel and Active Contour
Authors: Mohammad Izadkhah, Mojtaba Hoseini, Alireza Khalili Tehrani
Abstract:
In this paper, we present a new method for tracking flying targets in color video sequences based on contour and kernel information. The aim of this work is to overcome the problem of losing the target under changing light, large displacement, changing speed, and occlusion. The proposed method has three steps: estimating the target location with a particle filter, segmenting the target region using a neural network, and finding the exact contour with the greedy snake algorithm. We use both region and contour information to create the target candidate model, and this model is dynamically updated during tracking. To avoid the accumulation of errors when updating, the target region is given to a perceptron neural network to separate the target from the background; its output is then used for the exact calculation of the size and center of the target and as the initial contour for the greedy snake algorithm to find the target's exact edge. The proposed algorithm has been tested on a database that contains many challenges, such as the high speed and agility of aircraft, background clutter, occlusions, and camera movement. The experimental results show that the use of the neural network increases the accuracy of tracking and segmentation.
Keywords: video tracking, particle filter, greedy snake, neural network
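A generic bootstrap particle-filter skeleton for the first step (target-location estimation) is sketched below; the constant-velocity motion model and the placeholder likelihood stand in for the paper's kernel/color observation model.

```python
# Predict -> weight -> estimate -> resample loop for a 2-D target centre.
import numpy as np

rng = np.random.default_rng(1)
N = 500                                                     # number of particles
particles = rng.normal([320, 240, 0, 0], [20, 20, 2, 2], size=(N, 4))  # x, y, vx, vy
weights = np.full(N, 1.0 / N)

def likelihood(states, frame):
    """Placeholder: score how well each particle's window matches the target model."""
    cx, cy = 330.0, 238.0                                   # pretend measurement for the demo
    d2 = (states[:, 0] - cx) ** 2 + (states[:, 1] - cy) ** 2
    return np.exp(-d2 / (2 * 15.0 ** 2))

def step(particles, weights, frame, dt=1.0):
    # 1) predict: constant-velocity model plus process noise
    particles[:, 0] += particles[:, 2] * dt + rng.normal(0, 3, N)
    particles[:, 1] += particles[:, 3] * dt + rng.normal(0, 3, N)
    particles[:, 2:] += rng.normal(0, 0.5, (N, 2))
    # 2) update: weight particles by the observation likelihood
    weights = weights * likelihood(particles, frame)
    weights /= weights.sum()
    # 3) estimate the target centre and do systematic resampling
    estimate = weights @ particles[:, :2]
    positions = (rng.random() + np.arange(N)) / N
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), N - 1)
    return particles[idx], np.full(N, 1.0 / N), estimate

particles, weights, est = step(particles, weights, frame=None)
print("estimated target centre:", est)
```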
Procedia PDF Downloads 340
2715 Neural Network Based Fluctuation Frequency Control in PV-Diesel Hybrid Power System
Authors: Heri Suryoatmojo, Adi Kurniawan, Feby A. Pamuji, Nursalim, Syaffaruddin, Herbert Innah
Abstract:
Photovoltaic (PV) systems hybridized with diesel generators are widely used for electrification in remote areas. PV output power fluctuates due to the uncertain conditions of temperature and solar irradiance. When the penetration of PV power is large, the reliability of the power utility is disturbed and the system frequency becomes seriously unstable. Therefore, designing a robust frequency controller for a PV-diesel hybrid power system is very important. This paper proposes a new method of frequency control for a hybrid PV-diesel system based on an artificial neural network (ANN). This method can minimize the frequency deviation without smoothing the PV output power controlled by the maximum power point tracking (MPPT) method. The neural network controller considers the average irradiance, the change of irradiance, and the frequency deviation. To show the effectiveness of the proposed algorithm, the addition of a battery as an energy storage system is also presented. To validate the proposed method, the results of the proposed system are compared with those of a similar system using MPPT only. The simulation results show that the proposed method suppresses the frequency deviation to a smaller value than the system using MPPT only.
Keywords: energy storage system, frequency deviation, hybrid power generation, neural network algorithm
Procedia PDF Downloads 499
2714 Optimal Design of Submersible Permanent Magnet Linear Synchronous Motor Based Design of Experiment and Genetic Algorithm
Authors: Xiao Zhang, Wensheng Xiao, Junguo Cui, Hongmin Wang
Abstract:
Submersible permanent magnet linear synchronous motors (SPMLSMs) are electromagnetic devices that can directly drive a plunger pump to extract crude oil. These motors have gradually been applied in oil fields due to their high thrust force density and high efficiency. Since the force performance closely depends on the concrete structural parameters, seven different structural parameters are investigated in detail. This paper presents an optimum design of an SPMLSM to minimize the detent force and maximize the thrust by using design of experiment (DOE) and a genetic algorithm (GA). The three significant structural parameters (air-gap length, slot width, pole-arc coefficient) are screened using a 2⁷ 1/16 fractional factorial design (FFD) to investigate the significance of the effect of the seven parameters on the force performance. Response surface methodology (RSM) is well adapted to build analytical models of thrust and detent force with constraints on the corresponding significant parameters, enabling the objective functions to be easily created. GA is used as a searching tool to find the Pareto-optimal solutions. Finite element analysis shows that the proposed PMLSM improves the thrust and reduces the detent force dramatically.
Keywords: optimization, force performance, design of experiment (DOE), genetic algorithm (GA)
Procedia PDF Downloads 289
2713 Wind Power Forecasting Using Echo State Networks Optimized by Big Bang-Big Crunch Algorithm
Authors: Amir Hossein Hejazi, Nima Amjady
Abstract:
In recent years, due to environmental issues, traditional energy sources have been replaced by renewable ones. Wind energy, as the fastest-growing renewable energy, accounts for a considerable share of energy in electricity markets. With this fast growth of wind energy worldwide, owners and operators of wind farms, transmission system operators, and energy traders need reliable and secure forecasts of wind energy production. In this paper, a new forecasting strategy is proposed for short-term wind power prediction based on echo state networks (ESN). The forecast engine utilizes a state-of-the-art training process, including a dynamical reservoir with a high capability to learn the complex dynamics of wind power or wind vector signals. The study is made more interesting by incorporating prediction of the wind direction into the forecast strategy. The Big Bang-Big Crunch (BB-BC) evolutionary optimization algorithm is adopted to adjust the free parameters of the ESN-based forecaster. The proposed method is tested on real-world hourly data to show the efficiency of the forecasting engine in predicting both the wind vector and the wind power output of aggregated wind power production.
Keywords: wind power forecasting, echo state network, big bang-big crunch, evolutionary optimization algorithm
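A minimal echo state network of the kind used as the forecast engine is sketched below; the reservoir hyper-parameters are illustrative (in the paper they are tuned by BB-BC), and a noisy sine series stands in for real wind data.

```python
# Fixed random reservoir, leaky-tanh state updates, ridge-regression readout.
import numpy as np

rng = np.random.default_rng(7)
n_res, rho, leak, ridge = 200, 0.9, 0.3, 1e-6               # illustrative hyper-parameters

W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))              # rescale to spectral radius rho

series = np.sin(np.arange(1500) * 0.05) + 0.1 * rng.standard_normal(1500)
u, y = series[:-1], series[1:]                               # one-step-ahead targets

states, x = np.zeros((len(u), n_res)), np.zeros(n_res)
for t in range(len(u)):
    x = (1 - leak) * x + leak * np.tanh(W_in[:, 0] * u[t] + W @ x)
    states[t] = x

washout = 100                                                # discard the initial transient
S, Y = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ Y)   # ridge readout
pred = S @ W_out
print("train RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))
```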
Procedia PDF Downloads 571
2712 Filtering Intrusion Detection Alarms Using Ant Clustering Approach
Authors: Ghodhbani Salah, Jemili Farah
Abstract:
With the growth of cyber attacks, information security has become an important issue all over the world. Many firms rely on security technologies such as intrusion detection systems (IDSs) to manage information technology security risks. IDSs are considered the last line of defense to secure a network and play a very important role in detecting a large number of attacks. However, the main problem with today's most popular commercial IDSs is that they generate a high volume of alerts and a huge number of false positives. This drawback has become the main motivation for many research papers in the IDS area. Hence, in this paper we present a data mining technique to assist network administrators in analyzing and reducing the false positive alarms produced by an IDS and in increasing detection accuracy. Our data mining technique is an unsupervised clustering method based on a hybrid ANT algorithm. This algorithm discovers clusters of intruders' behavior without prior knowledge of the possible number of classes; we then apply the K-means algorithm to improve the convergence of the ANT clustering. Experimental results on a real dataset show that our proposed approach is efficient, with a high detection rate and a low false alarm rate.
Keywords: intrusion detection system, alarm filtering, ANT class, ant clustering, intruders’ behaviors, false alarms
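The K-means refinement stage can be sketched as follows; the starting centroids stand in for the output of the ant-based stage, and the two-dimensional alert features are synthetic.

```python
# Assign/update iterations that refine centroids handed over by the ant-clustering stage.
import numpy as np

rng = np.random.default_rng(3)
alerts = np.vstack([rng.normal(loc, 0.5, (100, 2)) for loc in ([0, 0], [4, 4], [0, 5])])
centroids = rng.uniform(-1, 6, (3, 2))      # placeholder for the ant-clustering output

def kmeans_refine(X, centroids, n_iter=50):
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == k].mean(axis=0) if np.any(labels == k) else centroids[k]
                        for k in range(len(centroids))])
        if np.allclose(new, centroids):      # converged
            break
        centroids = new
    return centroids, labels

centroids, labels = kmeans_refine(alerts, centroids)
print(centroids)
```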
Procedia PDF Downloads 402
2711 Design of an Augmented Automatic Choosing Control with Constrained Input by Lyapunov Functions Using Gradient Optimization Automatic Choosing Functions
Authors: Toshinori Nawata
Abstract:
In this paper, a nonlinear feedback control called augmented automatic choosing control (AACC) for a class of nonlinear systems with constrained input is presented. When designing the control, a constant term that arises from the linearization of the given nonlinear system is treated as a coefficient of stable zero dynamics. The parameters of the control are suboptimally selected by maximizing the stable region in the sense of Lyapunov with the aid of a genetic algorithm. This approach is applied to a field excitation control problem of a power system to demonstrate the effectiveness of the AACC. Simulation results show that the new controller can improve performance remarkably well.
Keywords: augmented automatic choosing control, nonlinear control, genetic algorithm, zero dynamics
Procedia PDF Downloads 476
2710 High Resolution Image Generation Algorithm for Archaeology Drawings
Authors: Xiaolin Zeng, Lei Cheng, Zhirong Li, Xueping Liu
Abstract:
To address the low accuracy of current image generation algorithms in producing high-resolution archaeology drawings and their susceptibility to cultural relic diseases, an archaeology drawing generation algorithm based on a conditional generative adversarial network is proposed. An attention mechanism is added into the backbone of the high-resolution image generation network, which enhances the line feature extraction capability and improves the accuracy of line drawing generation. A dual-branch parallel architecture consisting of two backbone networks is implemented, in which the semantic translation branch extracts semantic features from orthophotographs of cultural relics and the gradient screening branch extracts effective gradient features. Finally, a fusion fine-tuning module combines these two types of features to generate high-quality, high-resolution archaeology drawings. Experimental results on a self-constructed archaeology drawings dataset of grotto temple statues show that the proposed algorithm outperforms current mainstream image generation algorithms in terms of pixel accuracy (PA), structural similarity (SSIM), and peak signal-to-noise ratio (PSNR), and can be used to assist in producing archaeology drawings.
Keywords: archaeology drawings, digital heritage, image generation, deep learning
Procedia PDF Downloads 56
2709 Task Scheduling on Parallel System Using Genetic Algorithm
Authors: Jasbir Singh Gill, Baljit Singh
Abstract:
Scheduling and mapping an application task graph onto a multiprocessor parallel system is considered one of the most crucial and critical NP-complete problems. Many genetic algorithms have been proposed to solve such problems. In this paper, two genetic-approach-based algorithms have been designed and developed, with and without task duplication. The proposed algorithms work with two fitness functions: the first, task fitness, is used to minimize the total finish time of the schedule (schedule length), while the second, processor fitness, is concerned with allocating the tasks to the most efficient processors available from the list of processors (load balance). The proposed genetic-based algorithms have been implemented and evaluated experimentally against other state-of-the-art, popular, and widely used algorithms.
Keywords: parallel computing, task scheduling, task duplication, genetic algorithm
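A heavily simplified genetic-algorithm sketch for the task-to-processor mapping is given below; it ignores precedence edges and task duplication and uses only the schedule-length fitness, so it illustrates the encoding rather than the proposed algorithms. Task times and GA settings are made up.

```python
# Chromosome = processor index per task; fitness = load of the most-loaded processor.
import random

random.seed(0)
task_times = [4, 7, 3, 9, 2, 6, 5, 8]        # hypothetical task execution times
n_proc, pop_size, gens = 3, 30, 100

def makespan(assign):
    loads = [0] * n_proc
    for task, proc in enumerate(assign):
        loads[proc] += task_times[task]
    return max(loads)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(a, rate=0.1):
    return [random.randrange(n_proc) if random.random() < rate else g for g in a]

pop = [[random.randrange(n_proc) for _ in task_times] for _ in range(pop_size)]
for _ in range(gens):
    pop.sort(key=makespan)
    elite = pop[: pop_size // 2]                              # keep the better half
    children = [mutate(crossover(*random.sample(elite, 2)))
                for _ in range(pop_size - len(elite))]
    pop = elite + children

best = min(pop, key=makespan)
print("assignment:", best, "schedule length:", makespan(best))
```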
Procedia PDF Downloads 346
2708 Improvement Image Summarization using Image Processing and Particle swarm optimization Algorithm
Authors: Hooman Torabifard
Abstract:
In the last few years, with the progress of technology and the entry of computers and artificial intelligence into all kinds of scientific and industrial fields, human lifestyles have changed, and in general the way humans live has seen many changes and developments. Some of these changes have occurred in the context of digital images and image processing, and they still continue. Besides all the benefits, however, there have been disadvantages. One of these disadvantages is the multiplicity of images with high volumes of data; the focus of this paper is on improving and developing a method for summarizing these images and enhancing their usefulness. The general method used for this purpose consists of a set of techniques based on data obtained from image processing and on the PSO (particle swarm optimization) algorithm. In the remainder of this paper, the method used is elaborated in detail.
Keywords: image summarization, particle swarm optimization, image threshold, image processing
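As an illustration of how PSO can drive an image-threshold choice, the sketch below searches for the grey level that maximizes Otsu's between-class variance; the histogram is synthetic and the swarm coefficients are standard textbook values, not the paper's settings.

```python
# Particles are candidate grey-level thresholds; fitness is Otsu's between-class variance.
import numpy as np

rng = np.random.default_rng(5)
gray = np.concatenate([rng.normal(70, 12, 4000), rng.normal(180, 15, 6000)]).clip(0, 255)
hist, _ = np.histogram(gray, bins=256, range=(0, 256))
p = hist / hist.sum()

def between_class_variance(t):
    t = int(round(t))
    w0, w1 = p[:t].sum(), p[t:].sum()
    if w0 == 0 or w1 == 0:
        return 0.0
    mu0 = (np.arange(t) * p[:t]).sum() / w0
    mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
    return w0 * w1 * (mu0 - mu1) ** 2

n, iters, w, c1, c2 = 20, 60, 0.7, 1.5, 1.5        # swarm size and textbook coefficients
x = rng.uniform(1, 255, n)                         # particle positions (thresholds)
v = np.zeros(n)
pbest, pbest_val = x.copy(), np.array([between_class_variance(t) for t in x])
gbest = pbest[pbest_val.argmax()]

for _ in range(iters):
    r1, r2 = rng.random(n), rng.random(n)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
    x = np.clip(x + v, 1, 255)                                   # position update
    vals = np.array([between_class_variance(t) for t in x])
    better = vals > pbest_val
    pbest[better], pbest_val[better] = x[better], vals[better]
    gbest = pbest[pbest_val.argmax()]

print("PSO threshold:", round(float(gbest)))
```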
Procedia PDF Downloads 131
2707 Fuzzy Time Series Forecasting Based on Fuzzy Logical Relationships, PSO Technique, and Automatic Clustering Algorithm
Authors: A. K. M. Kamrul Islam, Abdelhamid Bouchachia, Suang Cang, Hongnian Yu
Abstract:
Forecasting models have a great impact on prediction and will continue to do so in the future. Although many forecasting models have been studied in recent years, most researchers focus on different forecasting methods based on fuzzy time series to solve forecasting problems. The accuracy of a forecasting model depends fully on two factors: the length of the intervals in the universe of discourse and the content of the forecast rules. Moreover, a hybrid forecasting method can be a more effective and efficient way to improve forecasts than an individual forecasting model. There are different hybrid forecasting models that combine fuzzy time series with evolutionary algorithms, but their performance is not quite satisfactory. In this paper, we propose a hybrid forecasting model that deals with first-order as well as high-order fuzzy time series and uses particle swarm optimization to improve the forecasting accuracy. The proposed method uses the historical enrollments of the University of Alabama as the dataset in the forecasting process. First, an automatic clustering algorithm is applied to calculate the appropriate intervals for the historical enrollments. Then particle swarm optimization and fuzzy time series are combined, which shows better forecasting accuracy than other existing forecasting models.
Keywords: fuzzy time series (fts), particle swarm optimization, clustering algorithm, hybrid forecasting model
Procedia PDF Downloads 249
2706 Wavelet Based Advanced Encryption Standard Algorithm for Image Encryption
Authors: Ajish Sreedharan
Abstract:
With the fast evolution of digital data exchange, information security becomes very important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. Since the encryption process in AES is applied to the whole image, it is difficult to improve efficiency. In this paper, wavelet decomposition is used to concentrate the main information of the image in the low-frequency part. Then, AES encryption is applied to the low-frequency part. The high-frequency parts are XORed with the encrypted low-frequency part, and a wavelet reconstruction is applied. Theoretical analysis and experimental results show that the proposed algorithm has high efficiency and satisfactory security, and suits image data transmission.
Keywords: discrete wavelet transforms, AES, dynamic SBox
Procedia PDF Downloads 430
2705 Implementation and Modeling of a Quadrotor
Authors: Ersan Aktas, Eren Turanoğuz
Abstract:
In this study, a quad-rotor electrically driven unmanned aerial vehicle system is designed and modeled using fundamental dynamic equations. After that, the mechanical, electronic, and control systems of the air vehicle are designed and implemented. Brushless motor speeds are altered via electronic speed controllers in order to achieve the desired controllability. The vehicle's fundamental Euler angles (i.e., roll angle, pitch angle, and yaw angle) are obtained via an AHRS sensor. These angles are provided as input to the control algorithm that runs on the soft processor of the electronic card, where the vehicle control algorithm is implemented. A controller is designed and improved for each Euler angle. Finally, flight tests have been performed to observe and improve the flight characteristics.
Keywords: quadrotor, UAS applications, control architectures, PID
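A minimal per-angle PID loop of the kind described is sketched below; the gains, loop rate, and sensor/actuator functions are placeholders rather than the values used on the actual vehicle.

```python
# One PID instance per Euler angle; the AHRS reading is compared with the setpoint and
# the output is mixed into the motor commands.
import time

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral, self.prev_error = 0.0, 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

roll_pid = PID(kp=4.5, ki=0.02, kd=1.2)       # illustrative gains

def read_roll_from_ahrs():
    return 2.0                                 # placeholder AHRS reading (degrees)

def apply_roll_command(u):
    pass                                       # placeholder: mix u into the four ESC outputs

dt = 0.005                                     # 200 Hz control loop (assumed)
for _ in range(3):
    u = roll_pid.update(setpoint=0.0, measured=read_roll_from_ahrs(), dt=dt)
    apply_roll_command(u)
    time.sleep(dt)
```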
Procedia PDF Downloads 362
2704 A Firefly Based Optimization Technique for Optimal Planning of Voltage Controlled Distributed Generators
Authors: M. M. Othman, Walid El-Khattam, Y. G. Hegazy, A. Y. Abdelaziz
Abstract:
This paper presents a method for finding the optimal location and capacity of dispatchable DGs connected to distribution feeders, planning for a specified power loss without violating practical system constraints. The distributed generation units in the proposed algorithm are modeled as voltage-controlled nodes with the flexibility to be converted to constant-power nodes in case of a reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results, validated by comparison with results obtained from other competing methods, show the effectiveness, accuracy, and speed of the proposed method.
Keywords: distributed generators, firefly technique, optimization, power loss
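A generic firefly-algorithm skeleton for this kind of siting-and-sizing search is sketched below; the toy quadratic objective stands in for a load-flow-based loss calculation, and the variable bounds and coefficients are assumptions.

```python
# Each firefly encodes (bus index, DG capacity); brighter fireflies (lower loss) attract
# dimmer ones with attractiveness beta0 * exp(-gamma * r^2).
import numpy as np

rng = np.random.default_rng(11)
n, iters = 15, 100
beta0, gamma, alpha = 1.0, 1.0, 0.2
low, high = np.array([1.0, 0.0]), np.array([37.0, 2.0])   # bus number, DG size in MW (assumed)

def objective(x):                      # placeholder for power loss from a load flow
    return (x[0] - 22.0) ** 2 / 100 + (x[1] - 1.2) ** 2

X = rng.uniform(low, high, (n, 2))
f = np.array([objective(x) for x in X])

for _ in range(iters):
    for i in range(n):
        for j in range(n):
            if f[j] < f[i]:                                  # j is brighter (lower loss)
                r2 = np.sum((X[i] - X[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)
                X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(2) - 0.5)
                X[i] = np.clip(X[i], low, high)
                f[i] = objective(X[i])

best = X[f.argmin()]
print("best bus:", int(round(best[0])), "best size (MW):", round(float(best[1]), 2))
```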
Procedia PDF Downloads 531
2703 Constructing Optimized Criteria of Objective Assessment Indicators among Elderly Frailty
Authors: Shu-Ching Chiu, Shu-Fang Chang
Abstract:
The World Health Organization (WHO) has been actively developing intervention programs to deal with geriatric frailty. In its White Paper on Healthcare Policy 2020, the Department of Health, Bureau of Health Promotion proposed that active aging and the prevention of disability are essential for elderly people to maintain good health. The paper recommended five main policies relevant to this objective, one of which is the prevention of frailty and disability. Scholars have proposed a number of different criteria to diagnose and assess frailty; no consistent or normative standard of measurement is currently available. In addition, many methods of assessment are recursive, which can easily result in recall bias. Due to the relationship between frailty and physical fitness with regard to co-morbidity, it is important that academics optimize the criteria used to assess frailty by objectively evaluating the physical fitness of senior citizens. This study used a review of the literature to identify fitness indicators suitable for measuring frailty in the elderly. This study recommends that measurement criteria be integrated to produce an optimized predictive value for frailty score. Healthcare professionals could use this data to detect frailty at an early stage and provide appropriate care to prevent further debilitation and increase longevity.
Keywords: frailty, aging, physical fitness, optimized criteria, healthcare
Procedia PDF Downloads 354
2702 Text Mining Techniques for Prioritizing Pathogenic Mutations in Protein Families Known to Misfold or Aggregate
Authors: Khaleel Saleh Al-Rababah
Abstract:
Amyloid fibril-forming regions, known as protein aggregation regions, in the sequences of some protein families are associated with a number of diseases known as amyloidoses. Mutations play a role in fibril formation by accelerating the fibril formation process. In this paper, we want to extract the diseases caused by those mutations as a result of the mutations' impact on the structural and functional properties of the aggregated protein. We propose a text mining system to automatically extract mutations, diseases, and the relations between mutations and diseases. We present a finite-state-based algorithm to cluster mutations found in the same sentence, since a sentence could contain different mutations that cause different diseases. We also present a coreference algorithm that enables the cross-linking of sentences.
Keywords: amyloid, amyloidosis, co reference, protein, text mining
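A small illustration of the extraction-and-clustering step is given below using regular expressions; the sentence, disease lexicon, and patterns are illustrative and do not reproduce the paper's finite-state system.

```python
# Pull protein point mutations (e.g. A30P or p.Ala30Pro) and nearby disease names out of
# each sentence, and group the mutations found in the same sentence together.
import re

AA3 = "Ala|Arg|Asn|Asp|Cys|Gln|Glu|Gly|His|Ile|Leu|Lys|Met|Phe|Pro|Ser|Thr|Trp|Tyr|Val"
mutation_re = re.compile(
    rf"\b(?:p\.(?:{AA3})\d+(?:{AA3})|[ACDEFGHIKLMNPQRSTVWY]\d+[ACDEFGHIKLMNPQRSTVWY])\b")
diseases = ["Parkinson's disease", "familial amyloid polyneuropathy"]   # toy lexicon

text = ("The A30P and A53T mutations in alpha-synuclein are linked to Parkinson's disease. "
        "The V30M substitution in transthyretin causes familial amyloid polyneuropathy.")

for sentence in re.split(r"(?<=\.)\s+", text):
    muts = mutation_re.findall(sentence)
    found = [d for d in diseases if d.lower() in sentence.lower()]
    if muts:
        print(muts, "->", found)      # mutations in one sentence are clustered together
```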
Procedia PDF Downloads 521
2701 A New Approach to the Digital Implementation of Analog Controllers for a Power System Control
Authors: G. Shabib, Esam H. Abd-Elhameed, G. Magdy
Abstract:
In this paper, a comparison of discrete-time PID and PSS controllers is presented through the small-signal stability of a power system comprising one machine connected to an infinite bus. This comparison is achieved by using a new discretization approach that converts the s-domain model of the analog controllers into a z-domain model in order to enhance the damping of the single-machine power system. The new method utilizes the Plant Input Mapping (PIM) algorithm. The proposed algorithm is stable for any sampling rate and takes the closed-loop characteristics into consideration. On the other hand, traditional discretization methods such as Tustin's method produce satisfactory results only when the sampling period is sufficiently small.
Keywords: PSS (power system stabilizer), PID (proportional-integral-derivative), PIM (plant input mapping)
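For contrast, the conventional Tustin (bilinear) discretization s -> (2/T)(z-1)/(z+1) that the paper compares against can be applied to a simple lead-lag block as follows; the time constants and sampling periods are illustrative, and this is not the PIM algorithm.

```python
# Tustin discretization of H(s) = (T1*s + 1)/(T2*s + 1) at two sampling periods.
import numpy as np
from scipy import signal

T1, T2 = 0.5, 0.05                       # lead-lag time constants (illustrative)
num, den = [T1, 1.0], [T2, 1.0]

for Ts in (0.01, 0.2):                   # small vs. large sampling period
    numd, dend, _ = signal.cont2discrete((num, den), Ts, method='bilinear')
    print(f"Ts = {Ts:.2f} s  ->  numerator {np.round(numd.ravel(), 4)}, "
          f"denominator {np.round(dend, 4)}")
```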
Procedia PDF Downloads 503
2700 Urban Rail Transit CBTC Computer Interlocking Subsystem Relying on Multi-Template Pen Point Tracking Algorithm
Authors: Xinli Chen, Xue Su
Abstract:
In the urban rail transit CBTC system, interlocking is considered one of the most basic subsystems, characterized by logical complexity and high security requirements. The development and verification of traditional interlocking subsystems are entirely manual processes that rely too much on the designer, which often hides many uncertain factors. In order to solve this problem, this article uses the multi-template pen-point (nib) tracking algorithm for model construction and verification, achieving the main safety attributes and using SCADE for formal verification. Experimental results show that this method helps to improve the quality and efficiency of the interlocking software.
Keywords: computer interlocking subsystem, penpoint tracking, communication-based train control system, multi-template tip tracking
Procedia PDF Downloads 159
2699 Optimization of Hate Speech and Abusive Language Detection on Indonesian-language Twitter using Genetic Algorithms
Authors: Rikson Gultom
Abstract:
Hate speech and abusive language on social media are difficult to detect; they are usually detected only after becoming viral in cyberspace, which is of course too late for prevention. An early detection system with fairly good accuracy is needed to reduce the conflicts in society caused by social media postings that attack individuals, groups, and governments in Indonesia. The purpose of this study is to find an early detection model for Twitter social media using machine learning with the highest accuracy among the machine learning methods studied. In this study, the support vector machine (SVM), Naïve Bayes (NB), and random forest decision tree (RFDT) methods are compared with the support vector machine with genetic algorithm (SVM-GA), Naïve Bayes with genetic algorithm (NB-GA), and random forest decision tree with genetic algorithm (RFDT-GA). The study produces a comparison table of the accuracy of the hate speech and abusive language detection models, presents a graph of the accuracy of the six algorithms developed on the Indonesian-language Twitter dataset, and identifies the best model with the highest accuracy.
Keywords: abusive language, hate speech, machine learning, optimization, social media
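A baseline of the kind of classifier being optimized (TF-IDF features with a linear SVM) can be sketched as follows; the example tweets and labels are fabricated placeholders, and the genetic-algorithm tuning of the hyper-parameters is not shown.

```python
# TF-IDF + linear-kernel SVM baseline with scikit-learn; the GA would search over
# hyper-parameters such as C, the kernel, and the n-gram range.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

tweets = [            # fabricated placeholder texts (0 = neutral, 1 = abusive/hate speech)
    "selamat pagi semua, semoga harimu menyenangkan",
    "dasar kamu bodoh dan tidak berguna",
    "terima kasih atas bantuannya kemarin",
    "kelompok itu memang sampah masyarakat",
]
labels = [0, 1, 0, 1]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), SVC(kernel="linear", C=1.0))
model.fit(tweets, labels)
print(model.predict(["kamu memang bodoh"]))
```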
Procedia PDF Downloads 126
2698 Single Machine Scheduling Problem to Minimize the Number of Tardy Jobs
Authors: Ali Allahverdi, Harun Aydilek, Asiye Aydilek
Abstract:
Minimizing the number of tardy jobs is an important factor to consider when making scheduling decisions, because on-time shipments are vital for lowering costs and increasing customer satisfaction. This paper addresses the single machine scheduling problem with the objective of minimizing the number of tardy jobs. The only known information is the lower and upper bounds for the processing times, together with deterministic job due dates. A dominance relation is established, and an algorithm is proposed. Several heuristics are generated from the proposed algorithm. Computational analysis indicates that the performance of one of the heuristics is very close to the optimal solution, i.e., on average, less than 1.5% from the optimal solution.
Keywords: single machine scheduling, number of tardy jobs, heuristic, lower and upper bounds
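For the deterministic special case (processing times known exactly), the classical Moore-Hodgson algorithm minimizes the number of tardy jobs and is a natural reference point; the sketch below is that baseline, not the paper's proposed heuristic, and the job data are illustrative.

```python
# Moore-Hodgson: process jobs in earliest-due-date order; whenever a job finishes late,
# drop the longest job scheduled so far. Remaining jobs are on time.
import heapq

def moore_hodgson(jobs):
    """jobs: list of (processing_time, due_date). Returns the on-time job count."""
    on_time, completion = [], 0                     # max-heap of processing times (negated)
    for p, d in sorted(jobs, key=lambda j: j[1]):   # earliest-due-date order
        heapq.heappush(on_time, -p)
        completion += p
        if completion > d:                          # late: drop the longest scheduled job
            completion += heapq.heappop(on_time)
    return len(on_time)

jobs = [(4, 6), (3, 7), (2, 8), (5, 9), (6, 11)]    # illustrative (time, due date) pairs
on_time = moore_hodgson(jobs)
print("on-time jobs:", on_time, "tardy jobs:", len(jobs) - on_time)
```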
Procedia PDF Downloads 554
2697 Adomian’s Decomposition Method to Functionally Graded Thermoelastic Materials with Power Law
Authors: Hamdy M. Youssef, Eman A. Al-Lehaibi
Abstract:
This paper presents an iteration method for the numerical solution of a one-dimensional problem of generalized thermoelasticity with one relaxation time under given initial and boundary conditions. A thermoelastic material whose properties vary as a power-law functionally graded material has been considered. Adomian's decomposition technique has been applied to the governing equations, and the numerical results have been calculated by using the iteration method with a certain algorithm. The numerical results are represented in figures, which affirm that Adomian's decomposition method is a successful method for modeling thermoelastic problems. Moreover, the empirical parameter of the functionally graded material and the lattice design parameter have significant effects on the temperature increment, the strain, the stress, and the displacement.
Keywords: Adomian, decomposition method, generalized thermoelasticity, algorithm
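The iteration behind Adomian's decomposition can be illustrated on a simple test equation (not the thermoelastic system of the paper): for u'(t) = u with u(0) = 1, the recursion u0 = 1, u(n+1) = integral of u(n) from 0 to t recovers the Taylor series of exp(t) term by term.

```python
# Symbolic ADM iteration for the linear test problem u' = u, u(0) = 1, using SymPy.
import sympy as sp

t, s = sp.symbols('t s')
u = sp.Integer(1)                 # u_0 from the initial condition
partial_sum = u
for _ in range(5):                # add five more Adomian components
    u = sp.integrate(u.subs(t, s), (s, 0, t))
    partial_sum += u

print(sp.expand(partial_sum))                 # 1 + t + t**2/2 + ... + t**5/120
print(sp.series(sp.exp(t), t, 0, 6).removeO())  # matches the truncated exp(t) series
```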
Procedia PDF Downloads 139
2696 Performance Evaluation of Karanja Oil Based Biodiesel Engine Using Modified Genetic Algorithm
Authors: G. Bhushan, S. Dhingra, K. K. Dubey
Abstract:
This paper presents the evaluation of the performance (BSFC and BTE), combustion (Pmax), and emission (CO, NOx, HC, and smoke opacity) parameters of karanja biodiesel in a single-cylinder, four-stroke, direct-injection diesel engine, considering the significant engine input parameters (blending ratio, compression ratio, and load torque). Multi-objective optimization of the performance, combustion, and emission parameters is also carried out for the karanja biodiesel engine using a hybrid RSM-NSGA-II technique. The Pareto-optimal solutions are obtained by running the hybrid RSM-NSGA-II technique, and each Pareto-optimal solution has its own importance. Confirmation tests are also conducted at a few randomly selected Pareto solutions to check the authenticity of the results.
Keywords: genetic algorithm, RSM, biodiesel, karanja
Procedia PDF Downloads 304