Search results for: modified backtracking search algorithm
6554 Research on Control Strategy of Differential Drive Assisted Steering of Distributed Drive Electric Vehicle
Authors: J. Liu, Z. P. Yu, L. Xiong, Y. Feng, J. He
Abstract:
According to the independence, accuracy and controllability of the driving/braking torque of a distributed drive electric vehicle, a control strategy for differential drive assisted steering was designed. Firstly, the assist curve under different speeds and steering wheel torques was developed, and the differential torques were distributed to the right and left front wheels. Then the steering return assist control algorithm was designed. Finally, a joint simulation was conducted in CarSim/Simulink. The results indicated that the differential drive assisted steering algorithm could provide sufficient steering assistance at low speed and improve steering portability. As the speed increased, the steering assistance provided decreased. With the control algorithm, the stiffness of the steering system increased with speed, which preserves the driver's road feel. The differential drive assisted steering control algorithm could effectively avoid understeer at low speed.
Keywords: differential assisted steering, control strategy, distributed drive electric vehicle, driving/braking torque
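As a rough illustration of the kind of assist law the abstract describes, the sketch below assigns a speed-dependent differential torque and splits it across the front wheels; the gain law, parameter names and values are illustrative assumptions, not the authors' actual assist curve:

```python
def assist_torque(steering_wheel_torque, speed_kmh, k0=2.0, v_ref=40.0):
    """Speed-dependent assist: full gain at standstill, fading as speed rises
    (the gain law and parameters are illustrative assumptions)."""
    gain = k0 / (1.0 + speed_kmh / v_ref)
    return gain * steering_wheel_torque

def distribute(assist, base_drive_torque):
    """Split the assist as a differential torque between the left and right
    front wheels, keeping the total drive torque unchanged."""
    left = base_drive_torque - assist / 2.0
    right = base_drive_torque + assist / 2.0
    return left, right
```

A monotonically decreasing gain reproduces the reported behaviour: large assist at low speed, and a stiffer steering feel as speed rises.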
Procedia PDF Downloads 478
6553 Finding Optimal Operation Condition in a Biological Nutrient Removal Process with Balancing Effluent Quality, Economic Cost and GHG Emissions
Authors: Seungchul Lee, Minjeong Kim, Iman Janghorban Esfahani, Jeong Tai Kim, ChangKyoo Yoo
Abstract:
It is hard to maintain the effluent quality of wastewater treatment plants (WWTPs) with fixed operational control because the influent flow rate and pollutant load change continuously. The aim of this study is the development of a multi-loop multi-objective control (ML-MOC) strategy at plant-wide scope targeting four objectives: 1) maximization of nutrient removal efficiency, 2) minimization of operational cost, 3) maximization of CH4 production in anaerobic digestion (AD) for reuse of CH4 as a heat and energy source, and 4) minimization of N2O gas emission to cope with global warming. First, the benchmark simulation model is modified to describe N2O dynamics in the biological process, namely the benchmark simulation model for greenhouse gases (BSM2G). Then, three single-loop proportional-integral (PI) controllers, for DO, NO3, and CH4, are implemented. The optimal set-points of the controllers are found using a multi-objective genetic algorithm (MOGA). Finally, the multi-loop MOC is implemented and evaluated in BSM2G. Compared with the reference case, the ML-MOC with the optimal set-points showed the best control performance, with improvements of 34%, 5% and 79% in effluent quality, CH4 productivity, and N2O emission, respectively, and a 65% decrease in operational cost.
Keywords: benchmark simulation model for greenhouse gas, multi-loop multi-objective controller, multi-objective genetic algorithm, wastewater treatment plant
Procedia PDF Downloads 503
6552 A Simple Recursive Framework to Generate Gray Codes for Weak Orders in Constant Amortized Time
Authors: Marsden Jacques, Dennis Wong
Abstract:
A weak order is a way to rank n objects where ties are allowed. In this talk, we present a recursive framework to generate Gray codes for weak orders. We then describe a simple algorithm based on the framework that generates 2-Gray codes for weak orders in constant amortized time per string. This framework can easily be modified to generate other Gray codes for weak orders. We provide an example of using the framework to generate the first Shift Gray code for weak orders, also in constant amortized time, where consecutive strings differ by a shift or a symbol change.
Keywords: weak order, Cayley permutation, Gray code, shift Gray code
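A brute-force way to see the objects being generated (though not the constant-amortized-time algorithm of the paper) is to enumerate weak orders as Cayley permutations: length-n strings over {1..n} whose set of values is exactly {1..max}. The counts follow the ordered Bell numbers 1, 3, 13, 75, ...:

```python
from itertools import product

def weak_orders(n):
    """Enumerate weak orders on n objects as Cayley permutations: length-n
    strings over {1..n} whose values form the initial segment {1..max}."""
    return [s for s in product(range(1, n + 1), repeat=n)
            if set(s) == set(range(1, max(s) + 1))]
```

For example, the string (1, 2, 1) encodes the weak order in which the first and third objects are tied below the second.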
Procedia PDF Downloads 179
6551 Design and Performance Analysis of Resource Management Algorithms in Response to Emergency and Disaster Situations
Authors: Volkan Uygun, H. Birkan Yilmaz, Tuna Tugcu
Abstract:
This study focuses on the development and use of algorithms that address the issue of resource management in response to emergency and disaster situations. The presented system, named the Disaster Management Platform (DMP), takes data from the data sources of service providers and distributes incoming requests so as both to manage load balancing and to minimize service time, resulting in improved user satisfaction. Three resource management algorithms, which give different levels of importance to load balancing and service time, are proposed. The first is the Minimum Distance algorithm, which assigns the request to the closest resource. The second is the Minimum Load algorithm, which assigns the request to the resource with the minimum load. The last is the Hybrid algorithm, which combines the previous two approaches. The performance of the proposed algorithms is evaluated with respect to waiting time, success ratio, and maximum load ratio, with the metrics monitored in simulations to find the optimal scheme for different loads. Two simulations are performed in the study: one time-based and the other lambda-based. The results indicate that the Minimum Load algorithm is generally the best in all metrics, whereas the Minimum Distance algorithm is the worst in all cases. The leading position in performance switches between the Minimum Distance and the Hybrid algorithms as lambda values change.
Keywords: emergency and disaster response, resource management algorithm, disaster situations, disaster management platform
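The three assignment rules can be sketched as follows; the weighted score in the hybrid rule is an assumption for illustration, since the abstract does not specify how the two criteria are combined:

```python
import math

def minimum_distance(request_pos, resources):
    """Assign the request to the closest resource."""
    return min(resources, key=lambda r: math.dist(request_pos, r["pos"]))

def minimum_load(request_pos, resources):
    """Assign the request to the least-loaded resource."""
    return min(resources, key=lambda r: r["load"])

def hybrid(request_pos, resources, w=0.5):
    """Assumed combination rule: weighted sum of normalized distance and load."""
    dmax = max(math.dist(request_pos, r["pos"]) for r in resources) or 1.0
    lmax = max(r["load"] for r in resources) or 1.0
    score = lambda r: (w * math.dist(request_pos, r["pos"]) / dmax
                       + (1 - w) * r["load"] / lmax)
    return min(resources, key=score)
```

With w close to 0 the hybrid rule behaves like Minimum Load; with w close to 1 it behaves like Minimum Distance.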
Procedia PDF Downloads 338
6550 Performance Optimization of Low-Cost Solar Dryer Using Modified PI Controller
Authors: Rajesh Kondareddy, Prakash Kumar Nayak, Maunash Das, Vrinatri Velentina Boro
Abstract:
Today, there is huge global concern for sustainable development, which includes minimizing the consumption of non-renewable energies without affecting the basic global economy. Solar drying is one of the important processes used for extending the shelf life of agricultural products. The performance of a low-cost automated solar dryer fitted with a cascade control scheme and a modified PI controller for drying chilli was investigated. The dryer was composed of a designed solar collector (air heater) fitted with cylindrical pipes to improve the air velocity, and a solar drying chamber containing a rack of two cheesecloth (net) trays, both integrated together. The air admitted through the inlet is heated in the solar collector and channelled through the drying chamber, where it removes moisture from the loaded food substance or agricultural produce. Here, to maintain the temperature in the heating chamber and to improve performance, a modified PI (proportional-integral) controller was used due to its simplicity and robustness. Drying chilli from an initial moisture content of 88.5% (wb) to 7.3% (wb) took an estimated 14 hours in the solar dryer, whereas 32 h was observed in open sun drying.
Keywords: cascade control, chilli, PI controller, solar dryer
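A minimal discrete PI loop on a toy first-order heating model illustrates the control idea; the plant equation, gains and the simple output clamp stand in for the authors' "modified PI" scheme, whose exact modification is not given in the abstract:

```python
def pi_controller(setpoint, kp=2.0, ki=0.5, dt=1.0, u_min=0.0, u_max=100.0):
    """Discrete PI controller returning a step function; the output is
    clamped to the actuator range."""
    integral = 0.0
    def step(measurement):
        nonlocal integral
        error = setpoint - measurement
        integral += error * dt
        u = kp * error + ki * integral
        return max(u_min, min(u_max, u))
    return step

def simulate(ctrl, t0=25.0, t_amb=25.0, tau=20.0, steps=400, dt=1.0):
    """Toy first-order heating chamber: tau * dT/dt = u - (T - T_amb)."""
    T = t0
    for _ in range(steps):
        T += dt * (ctrl(T) - (T - t_amb)) / tau
    return T
```

Driving this toy chamber to a 60 °C set-point, the loop overshoots slightly and then settles, which is the qualitative behaviour a PI controller is chosen for.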
Procedia PDF Downloads 288
6549 Tribological Characterization of ZrN Coatings on Titanium Modified Austenitic Stainless Steel
Authors: Mohammad Farooq Wani
Abstract:
The tribological characteristics of ZrN coatings deposited on titanium-modified austenitic stainless steel (alloy D-9) substrates have been investigated. The coatings were deposited in the deposition temperature range 300–873 K using the pulsed magnetron sputtering technique. Scratch adhesion tests were carried out using an Rc indenter under various loads. Detailed tribological studies were conducted to understand the friction and wear behaviour of these coatings. For all tribological studies, steel and ceramic balls were used as the counterface material. 3D surface profiles of all wear tracks were obtained using a 3D universal profiler.
Keywords: ZrN, surface coating, thin film, tribology, friction and wear
Procedia PDF Downloads 430
6548 An Experimental Study on Some Conventional and Hybrid Models of Fuzzy Clustering
Authors: Jeugert Kujtila, Kristi Hoxhalli, Ramazan Dalipi, Erjon Cota, Ardit Murati, Erind Bedalli
Abstract:
Clustering is a versatile instrument in the analysis of collections of data, providing insights into the underlying structures of the dataset and enhancing modeling capabilities. The fuzzy approach to the clustering problem increases flexibility by involving the concept of partial memberships (some value in the continuous interval [0, 1]) of the instances in the clusters. Several fuzzy clustering algorithms have been devised, like FCM, Gustafson-Kessel, Gath-Geva, kernel-based FCM, PCM, etc. Each of these algorithms has its own advantages and drawbacks, so none of them performs superiorly on all datasets. In this paper, we experimentally compare the FCM, GK and GG algorithms and a hybrid two-stage fuzzy clustering model combining the FCM and Gath-Geva algorithms. Firstly, we theoretically discuss the advantages and drawbacks of each of these algorithms and describe the hybrid clustering model, which exploits their advantages while diminishing their drawbacks. Secondly, we experimentally compare the accuracy of the hybrid model by applying it to several benchmark and synthetic datasets.
Keywords: fuzzy clustering, fuzzy c-means algorithm (FCM), Gustafson-Kessel algorithm, hybrid clustering model
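A compact 1-D sketch of the standard FCM iteration (membership update followed by weighted centroid update) is shown below, assuming fuzzifier m = 2, c ≥ 2 clusters, and deterministic evenly-spread initialization for brevity:

```python
def fcm(points, c, m=2.0, iters=100):
    """1-D fuzzy c-means; assumes c >= 2, centers spread over the data range."""
    lo, hi = min(points), max(points)
    centers = [lo + i * (hi - lo) / (c - 1) for i in range(c)]
    for _ in range(iters):
        # membership: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        u = []
        for x in points:
            d = [abs(x - v) or 1e-12 for v in centers]   # guard zero distance
            u.append([1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                for k in range(c)) for j in range(c)])
        # centers: means weighted by u_ij^m
        centers = [sum(u[i][j] ** m * points[i] for i in range(len(points)))
                   / sum(u[i][j] ** m for i in range(len(points)))
                   for j in range(c)]
    return centers, u
```

On two well-separated groups the centers converge to the group centroids, and memberships approach 0/1, recovering the crisp clustering as a special case.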
Procedia PDF Downloads 514
6547 Exploring Valproic Acid (VPA) Analogues Interactions with HDAC8 Involved in VPA Mediated Teratogenicity: A Toxicoinformatics Analysis
Authors: Sakshi Piplani, Ajit Kumar
Abstract:
Valproic acid (VPA) is the first synthetic therapeutic agent used to treat epileptic disorders, which affect nearly 1% of the world population. Teratogenicity caused by VPA has prompted the search for a next-generation drug with better efficacy and fewer side effects. Recent studies have identified HDAC8 as a direct target of VPA that causes the teratogenic effect in the foetus. We have employed molecular dynamics (MD) and docking simulations to understand the binding mode of VPA and its analogues onto HDAC8. A total of twenty 3D structures of human HDAC8 isoforms were selected using a BLAST-P search against the PDB. Multiple sequence alignment was carried out using ClustalW, and PDB entry 3F07, having the fewest missing and mutated regions, was selected for the study. The missing residues of the loop region were constructed using MODELLER and the energy was minimized. A set of 216 structural analogues (>90% identity) of VPA was obtained from the PubChem and ZINC databases, and their energies were optimized with ChemSketch software using a 3-D CHARMM-type force field. Four major neurotransmitter-related enzymes involved in anticonvulsant activity (GABAt, SSADH, α-KGDH, GAD) were docked with VPA and its analogues. Out of the 216 analogues, 75 were selected on the basis of lower binding energy and inhibition constant compared with VPA, and thus predicted to have anticonvulsant activity. The selected hHDAC8 structure was then subjected to MD simulation using a licensed version of YASARA with the AMBER99SB force field. The structure was solvated in a rectangular box of TIP3P. The simulation was carried out with periodic boundary conditions, and electrostatic interactions were treated with the particle mesh Ewald algorithm. The pH of the system was set to 7.4, the temperature to 323 K and the pressure to 1 atm. Simulation snapshots were stored every 25 ps. The MD simulation was carried out for 20 ns, and the PDB file of the HDAC8 structure was saved every 2 ns.
The structures were analysed using CastP and UCSF Chimera, and the most stabilized structure (20 ns) was used for the docking study. Molecular docking of the 75 selected VPA analogues with PDB-3F07 was performed using AutoDock 4.2.6. The Lamarckian genetic algorithm was used to generate conformations of the docked ligand and structure. The docking study revealed that VPA and its analogues have more affinity towards the 'hydrophobic active site channel', owing to its hydrophobic properties, which allows VPA and its analogues to take part in van der Waals interactions with TYR24, HIS42, VAL41, TYR20, SER138 and TRP137, while TRP137 and SER138 showed hydrogen bonding interactions with the VPA analogues. Fourteen analogues showed better binding affinity than VPA. The admetSAR server was used to predict the ADMET properties of the selected VPA analogues to assess their druggability. On the basis of ADMET screening, nine molecules were selected and are being used for in-vivo evaluation using a Danio rerio model.
Keywords: HDAC8, docking, molecular dynamics simulation, valproic acid
Procedia PDF Downloads 252
6546 Pharmacy-Station Mobile Application
Authors: Taissir Fekih Romdhane
Abstract:
This paper proposes a mobile web application named Pharmacy-Station that sells medicines and permits users to search for medications based on their symptoms, making it easy to locate a specific drug online without the need to visit a pharmacy where it may be out of stock. The application is developed using the jQuery Mobile framework, which builds on web technologies and languages such as HTML5, PHP, JavaScript and CSS3. To test the proposed application, we used data from popular pharmacies in Saudi Arabia that included important information such as location, contact details, and medicines in stock. This document describes the steps followed to create the Pharmacy-Station application, along with screenshots. Finally, based on the results, the paper concludes with recommendations and further work planned to improve the Pharmacy-Station mobile application.
Keywords: pharmacy, mobile application, jQuery Mobile framework, search, medicine
Procedia PDF Downloads 159
6545 Estimation of Optimum Parameters of Non-Linear Muskingum Model of Routing Using Imperialist Competition Algorithm (ICA)
Authors: Davood Rajabi, Mojgan Yazdani
Abstract:
The non-linear Muskingum model is an efficient method for flood routing; however, its efficiency depends on three applied parameters. Therefore, this study assessed the efficiency of the Imperialist Competition Algorithm (ICA) in evaluating the optimum parameters of the non-linear Muskingum model. In addition to ICA, the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) were also used, to provide a benchmark against which to judge ICA. ICA was first applied to the Wilson flood routing; then, the routing of two flood events of the Doab Samsami River was investigated. For the Wilson flood, the target function was the sum of squared deviations (SSQ) of observed and calculated discharges. For the two other floods, in addition to SSQ, a second target function was also considered: the sum of absolute deviations (SAD) of observed and calculated discharges. For the first floodwater, GA performed best based on SSQ, while ICA ranked first based on SAD. For the second floodwater, ICA performed better on both target functions. According to the obtained results, ICA can be regarded as an appropriate method for evaluating the parameters of the non-linear Muskingum model.
Keywords: Doab Samsami River, genetic algorithm, imperialist competition algorithm, meta-heuristic algorithms, particle swarm optimization, Wilson flood
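The two target functions used to score candidate parameter sets can be written down directly; whichever metaheuristic is used (ICA, GA or PSO), it simply minimizes one of these over routed versus observed discharges:

```python
def ssq(observed, computed):
    """Sum of squared deviations between observed and routed discharges."""
    return sum((o - c) ** 2 for o, c in zip(observed, computed))

def sad(observed, computed):
    """Sum of absolute deviations between observed and routed discharges."""
    return sum(abs(o - c) for o, c in zip(observed, computed))
```

SSQ penalizes large peak-discharge errors more heavily, while SAD weights all deviations equally, which is why the two criteria can rank the optimizers differently.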
Procedia PDF Downloads 505
6544 Influence of Initial Curing Time, Water Content and Apparent Water Content on Geopolymer Modified Sludge Generated in Landslide Area
Authors: Minh Chien Vu, Tomoaki Satomi, Hiroshi Takahashi
Abstract:
Because their clay content and clay mineralogy leave them without sufficient strength to support construction loading over the service life, soft and highly compressible soils (sludge) constitute a major problem in geotechnical engineering projects. Geopolymer, a kind of inorganic polymer, is a promising material with a wide range of applications and offers lower CO₂ emissions than conventional Portland cement. However, the feasibility of geopolymer for modifying soft and highly compressible soil has not received much attention, owing to the heat treatment required to activate the fly ash component and the high content of clay-size particles in sludge, which affects the efficiency of the reaction. On the other hand, geopolymer-modified sludge can be affected by other important factors such as initial curing time, initial water content and apparent water content. Therefore, this paper describes a different potential application of geopolymer: soil stabilization in landslide areas, to improve the technical properties of sludge so that heavy machines can move on it. A sludge conditioning process is used to demonstrate the possibility of stabilizing sludge with fly ash-based geopolymer under ambient curing conditions (±20 °C) in terms of failure strength, strain and bulk density. Sludge conditioning is a process whereby sludge is treated with chemicals or by various other means to improve its dewatering characteristics before application in the construction area. The effects of initial curing time, water content and apparent water content on the modification of sludge are the main focus of this study. Test results indicate that the initial curing time has potential for improving the failure strain and strength of modified sludge under the specific conditions of soft soil.
The results further show that an initial water content of more than 50% of the total mass of sludge could significantly decrease the strength of geopolymer-based modified sludge. The optimum apparent water content of geopolymer-modified sludge is strongly influenced by the geopolymer content and the initial water content of the sludge. Solutions to minimize the effect of high initial water content will be considered in more depth in future work.
Keywords: landslide, sludge, fly ash, geopolymer, sludge conditioning
Procedia PDF Downloads 116
6543 Three-Stage Multivariate Stratified Sample Surveys with Probabilistic Cost Constraint and Random Variance
Authors: Sanam Haseen, Abdul Bari
Abstract:
In this paper, a three-stage multivariate stratified sample survey problem with random survey cost and variances as random variables has been formulated as a non-linear stochastic programming problem. The problem is converted into an equivalent deterministic form using chance-constrained programming and modified E-modeling. An empirical study of the problem, carried out with R simulation, is presented at the end of the paper.
Keywords: chance constraint programming, modified E-model, stochastic programming, stratified sample surveys, three stage sample surveys
Procedia PDF Downloads 458
6542 High Capacity Reversible Watermarking through Interpolated Error Shifting
Authors: Hae-Yeoun Lee
Abstract:
Reversible watermarking, which not only protects the copyright but also preserves the original quality of the digital content, has been intensively studied, and demand for it has increased. In this paper, we propose a reversible watermarking scheme based on interpolation-error shifting and error precompensation. The intensity of a pixel is interpolated from the intensities of neighbouring pixels, and the histogram of differences between the interpolated and the original intensities is obtained and modified to embed the watermark message. By restoring the difference histogram, the embedded watermark is extracted, and the original image is recovered by compensating for the interpolation error. Overflow and underflow are prevented by error precompensation. To show the performance of the method, the proposed algorithm is compared with other methods on various test images.
Keywords: reversible watermarking, high capacity, high quality, interpolated error shifting, error precompensation
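The core embed/extract step of interpolation-error expansion can be illustrated on a single pixel (the overflow/underflow precompensation of the paper is omitted, and the neighbour-average predictor is a simplifying assumption):

```python
def embed_bit(left, pixel, right, bit):
    """Embed one bit by expanding the interpolation error: e' = 2e + bit."""
    pred = (left + right) // 2          # neighbour-average predictor (assumed)
    e = pixel - pred
    return pred + 2 * e + bit

def extract_bit(left, marked, right):
    """Recover the hidden bit and restore the original pixel exactly."""
    pred = (left + right) // 2
    e2 = marked - pred
    bit = e2 & 1
    return bit, pred + (e2 - bit) // 2
```

Because the expansion e' = 2e + bit is invertible, extraction restores the pixel bit-exactly, which is what makes the scheme reversible.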
Procedia PDF Downloads 323
6541 New Segmentation of Piecewise Moving-Average Model by Using Reversible Jump MCMC Algorithm
Authors: Suparman
Abstract:
This paper addresses the problem of signal segmentation within a Bayesian framework by using the reversible jump MCMC algorithm. The signal is modelled by a piecewise constant Moving-Average (MA) model, where the number of segments, the positions of the change-points, and the order and coefficients of the MA model for each segment are unknown. The reversible jump MCMC algorithm is then used to generate samples distributed according to the joint posterior distribution of the unknown parameters. These samples allow some interesting features of the posterior distribution to be calculated. The performance of the methodology is illustrated via several simulation results.
Keywords: piecewise, moving-average model, reversible jump MCMC, signal segmentation
Procedia PDF Downloads 227
6540 Peer-Assisted Learning of EBM in a UK Medical School: Evaluation of the NICE Evidence Search Student Champion Scheme
Authors: Emily Jin, Harry Sharples, Anne Weist
Abstract:
Introduction: The NICE Evidence Search Student Champion Scheme is a peer-assisted learning scheme that aims to improve the routine use of evidence-based information by future health and social care staff. The focus is the NICE Evidence Search portal, which provides selected information from more than 800 reliable health, social care, and medicines sources, including up-to-date guidelines and information for the public. This paper aims to evaluate the effectiveness of the scheme as implemented in Liverpool School of Medicine and to understand the experiences of those attending. Methods: Twelve student champions were recruited and trained in February 2020 as peer tutors during a workshop facilitated by NICE. Cascade sessions were then organised and delivered on an optional basis, in small groups of fewer than 10, to approximately 70 attendees. Surveys were administered immediately before and 8-12 weeks after the cascade sessions (n=47 and 45 respectively), and the data from these surveys formed the basis of the analysis of the scheme. Results: Surveys showed that 74% of attendees frequently searched for health and social care information online as part of their studies. However, only 15% of attendees reported prior formal training in searching for health information, despite receiving such training earlier in the curriculum. After attending cascade sessions, students reported a 58% increase in confidence when searching for information using Evidence Search, from a pre-session baseline of 36%. Conclusion: The NICE Evidence Search Student Champion Scheme provided clear benefits for attending students, increasing confidence in searching for peer-reviewed, mainly secondary, sources of health information. The lack of reported training represents the unmet need that the champion scheme satisfies, and this likely benefits student champions as well as attendees.
Increasing confidence in searching for healthcare information online may support future evidence-based decision-making.
Keywords: evidence-based medicine, NICE, medical education, medical school, peer-assisted learning
Procedia PDF Downloads 130
6539 Optimal Maintenance and Improvement Policies in Water Distribution System: Markov Decision Process Approach
Authors: Jong Woo Kim, Go Bong Choi, Sang Hwan Son, Dae Shik Kim, Jung Chul Suh, Jong Min Lee
Abstract:
A methodology based on the Markov Decision Process (MDP) is implemented in order to establish the optimal schedule that minimizes the cost. The formulation of the MDP problem is presented using information about the current state of a pipe, the improvement cost, the failure cost and a pipe deterioration model. The objective function and the detailed algorithm of dynamic programming (DP) are modified because of the difficulty of implementing conventional DP approaches. The optimal schedule derived from the suggested model is compared with several policies via Monte Carlo simulation. The validity of the solution and the improvement in computational time are demonstrated.
Keywords: Markov decision processes, dynamic programming, Monte Carlo simulation, periodic replacement, Weibull distribution
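A toy value-iteration sketch conveys the MDP formulation (states = deterioration levels, actions = keep or replace); the transition probabilities and costs below are invented for illustration and do not reflect the paper's Weibull deterioration model:

```python
def value_iteration(n=4, fail_cost=100.0, replace_cost=20.0,
                    p_worse=0.3, gamma=0.95, tol=1e-9):
    """Toy pipe MDP: states 0 (new) .. n-1 (failed); actions keep/replace."""
    def backup(V, s):
        nxt = min(s + 1, n - 1)
        # keep: pay the failure cost only in the failed state; the pipe
        # deteriorates one level with probability p_worse
        keep = (fail_cost if s == n - 1 else 0.0) + gamma * (
            p_worse * V[nxt] + (1.0 - p_worse) * V[s])
        # replace: pay a fixed cost and return to the "new" state
        replace = replace_cost + gamma * V[0]
        return keep, replace
    V = [0.0] * n
    while True:
        newV = [min(backup(V, s)) for s in range(n)]
        if max(abs(a - b) for a, b in zip(newV, V)) < tol:
            break
        V = newV
    policy = ["replace" if backup(V, s)[1] < backup(V, s)[0] else "keep"
              for s in range(n)]
    return V, policy
```

With these numbers the optimal policy keeps a new pipe and replaces a failed one, the qualitative threshold structure such maintenance MDPs typically exhibit.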
Procedia PDF Downloads 424
6538 Determination of Nanomolar Mercury (II) by Using Multi-Walled Carbon Nanotubes Modified Carbon Zinc/Aluminum Layered Double Hydroxide – 3 (4-Methoxyphenyl) Propionate Nanocomposite Paste Electrode
Authors: Illyas Md Isa, Sharifah Norain Mohd Sharif, Norhayati Hashima
Abstract:
A mercury(II) sensor was developed using a multi-walled carbon nanotube (MWCNT) paste electrode modified with Zn/Al layered double hydroxide-3(4-methoxyphenyl)propionate nanocomposite (Zn/Al-HMPP). The optimum conditions by cyclic voltammetry were observed at an electrode composition of 2.5% (w/w) Zn/Al-HMPP/MWCNTs, 0.4 M potassium chloride, pH 4.0, and a scan rate of 100 mVs-1. The sensor exhibited a wide linear range, from 1x10-3 M to 1x10-7 M Hg2+ and from 1x10-7 M to 1x10-9 M Hg2+, with a detection limit of 1x10-10 M Hg2+. The high sensitivity of the proposed electrode towards Hg(II) was confirmed by double potential-step chronocoulometry, which indicated a diffusion coefficient of 1.5445 x 10-9 cm2 s-1, a surface charge of 524.5 µC s-½ and a surface coverage of 4.41 x 10-2 mol cm-2. The presence of 25-fold concentrations of most metal ions had no influence on the anodic peak current. With characteristics such as high sensitivity, selectivity and repeatability, the electrode is proposed as an appropriate alternative for the determination of mercury(II).
Keywords: cyclic voltammetry, mercury(II), modified carbon paste electrode, nanocomposite
Procedia PDF Downloads 309
6537 Algorithmic Approach to Management of Complications of Permanent Facial Filler: A Saudi Experience
Authors: Luay Alsalmi
Abstract:
Background: Facial filler is the most common type of cosmetic procedure next to Botox. Permanent filler is preferred nowadays due to the low cost brought about by non-recurring injection appointments. However, such fillers pose a higher risk of complications, with even greater adverse effects when the procedure is done using unknown dermal filler injections. Aim: This study aimed to establish an algorithm to categorize and manage patients who receive permanent fillers. Materials and Methods: Twelve participants presented to the service through the emergency department or as outpatients from November 2015 to May 2021. Demographics such as age, sex, date of injection, time of onset, and type of complication were collected. After examination, all cases were managed based on the algorithm established. FACE-Q was used to measure overall satisfaction and psychological well-being. Results: An algorithm to diagnose and manage these patients effectively, with a high satisfaction rate, was established in this study. All participants were non-smoking females with no known medical comorbidities. The algorithm determined the treatment plan when complications were encountered. High appearance-related psychosocial distress was observed prior to surgery, dropping significantly afterwards. FACE-Q established evidence of satisfactory ratings among patients before and after surgery. Conclusion: This treatment algorithm can guide the surgeon in formulating a suitable plan with fewer complications and a high satisfaction rate.
Keywords: facial filler, FACE-Q, psycho-social stress, botox, treatment algorithm
Procedia PDF Downloads 84
6536 Commissioning of a Flattening Filter Free (FFF) Beam Using an Anisotropic Analytical Algorithm (AAA)
Authors: Safiqul Islam, Anamul Haque, Mohammad Amran Hossain
Abstract:
Aim: To compare the dosimetric parameters of flattened and flattening filter free (FFF) beams and to validate the beam data using the anisotropic analytical algorithm (AAA). Materials and Methods: All the dosimetric data (i.e. depth dose profiles, profile curves, output factors, penumbra, etc.) required for beam modeling in AAA were acquired using the Blue Phantom RFA for 6 MV, 6 FFF, 10 MV and 10 FFF. The Progressive Resolution Optimizer and Dose Volume Optimizer algorithms for VMAT and IMRT were also configured in the beam model. The AAA beam model was compared with the measured data sets. Results: Due to the larger low-energy component in the 6 FFF and 10 FFF beams, the surface doses are 10 to 15% higher compared with the flattened 6 MV and 10 MV beams. An FFF beam has a lower mean energy than the flattened beam; the beam quality indices were 0.667 for 6 MV, 0.629 for 6 FFF, 0.74 for 10 MV and 0.695 for 10 FFF. Gamma evaluation with 2% dose and 2 mm distance criteria for the open beam, IMRT and VMAT plans was also performed, and good agreement was found between the modeled and measured data. Conclusion: We have successfully modeled the AAA algorithm for the flattened and FFF beams and achieved good agreement between the calculated and measured values.
Keywords: commissioning of a flattening filter free (FFF) beam, anisotropic analytical algorithm (AAA), flattened beam, parameters
Procedia PDF Downloads 301
6535 Diesel Fault Prediction Based on Optimized Gray Neural Network
Authors: Han Bing, Yin Zhenjie
Abstract:
In order to analyze the status of a diesel engine and conduct fault prediction, a new prediction model based on a gray system is proposed in this paper, which takes advantage of a neural network and a genetic algorithm. The proposed GBPGA prediction model builds on the GM(1,5) model and uses a neural network, optimized by a genetic algorithm, to construct the error compensator. We verify the proposed model on diesel fault simulation data, and the experimental results show that GBPGA has the potential to be employed for fault prediction on diesel engines.
Keywords: fault prediction, neural network, GM(1,5), genetic algorithm, GBPGA
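The gray-system core of such a model can be illustrated with the univariate GM(1,1) (the paper's GM(1,5) adds four exogenous variables, omitted here): accumulate the series, fit the development coefficient a and gray input b by least squares, and predict by differencing the fitted exponential:

```python
import math

def gm11(series):
    """Fit GM(1,1): accumulate (AGO), estimate (a, b) by least squares on
    x0(k) = -a*z(k) + b, and return a predictor for the original series."""
    n = len(series)
    x1 = [sum(series[:i + 1]) for i in range(n)]              # accumulated series
    z = [0.5 * (x1[i] + x1[i + 1]) for i in range(n - 1)]     # background values
    m = n - 1
    sz, szz = sum(z), sum(v * v for v in z)
    sy = sum(series[1:])
    szy = sum(v * y for v, y in zip(z, series[1:]))
    a = (sz * sy - m * szy) / (m * szz - sz * sz)
    b = (sy * szz - sz * szy) / (m * szz - sz * sz)
    def predict(k):                 # k is a 0-based index, may exceed n - 1
        if k == 0:
            return series[0]
        x1_hat = lambda i: (series[0] - b / a) * math.exp(-a * i) + b / a
        return x1_hat(k) - x1_hat(k - 1)
    return a, b, predict
```

On roughly exponential data the fit is nearly exact; in GBPGA-style models the residual error of such a gray predictor is what the GA-optimized neural network compensates.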
Procedia PDF Downloads 305
6534 Comparison between the Conventional Methods and PSO Based MPPT Algorithm for Photovoltaic Systems
Authors: Ramdan B. A. Koad, Ahmed F. Zobaa
Abstract:
Since the output characteristics of a photovoltaic (PV) system depend on the ambient temperature, solar radiation and load impedance, its maximum power point (MPP) is not constant. Under each condition, the PV module has a point at which it produces its MPP. Therefore, a maximum power point tracking (MPPT) method is needed to keep the PV panel operating at its MPP. This paper presents a comparative study between the conventional MPPT methods used in PV systems, Perturb and Observe (P&O) and Incremental Conductance (IncCond), and the Particle Swarm Optimization (PSO) algorithm for MPPT of a PV system. To evaluate the study, the proposed PSO MPPT is implemented on a DC-DC converter and compared with the P&O and IncCond methods in terms of tracking speed, accuracy and performance, using the Matlab/Simulink tool. The simulation results show that the proposed algorithm is simple and superior to the P&O and IncCond methods.
Keywords: photovoltaic systems, maximum power point tracking, perturb and observe method, incremental conductance, particle swarm optimization algorithm
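The P&O baseline is easy to sketch as hill climbing on the P-V curve; the quadratic curve, starting voltage and step size below are illustrative assumptions, not the paper's converter model:

```python
def perturb_and_observe(power_at, v0=10.0, dv=0.2, steps=200):
    """P&O: keep perturbing the operating voltage in the same direction while
    power increases; reverse direction when power drops."""
    v, direction = v0, 1.0
    p_prev = power_at(v)
    for _ in range(steps):
        v += direction * dv
        p = power_at(v)
        if p < p_prev:
            direction = -direction      # passed the peak: turn around
        p_prev = p
    return v

# illustrative concave P-V curve with its maximum power point at 17 V
pv_curve = lambda v: 120.0 - (v - 17.0) ** 2
```

Note the characteristic weakness this exposes: the operating point never settles but oscillates around the MPP with an amplitude set by the step size, one of the issues a PSO-based tracker aims to avoid.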
Procedia PDF Downloads 358
6533 Synthetic Method of Contextual Knowledge Extraction
Authors: Olga Kononova, Sergey Lyapin
Abstract:
The requirements of the global information society are transparency and reliability of data, as well as the ability to manage information resources independently: in particular, to search for, analyze and evaluate information, thereby obtaining new expertise. Moreover, satisfying society's information needs increases the efficiency of enterprise management and public administration. The study of structurally organized thematic and semantic contexts of different types, automatically extracted from unstructured data, is one of the important tasks for the application of information technologies in education, science, culture, governance and business. The objectives of this study are the typologization of contextual knowledge and the selection or creation of effective tools for extracting and analyzing contextual knowledge. Explication of various kinds and forms of contextual knowledge involves the development and use of full-text search information systems. For implementation purposes, the authors use services of the e-library 'Humanitariana' such as contextual search, different types of queries (paragraph-oriented query, frequency-ranked query), and automatic extraction of knowledge from scientific texts. The multifunctional e-library 'Humanitariana' is implemented in an Internet architecture in a WWS configuration (Web browser / Web server / SQL server). An advantage of using 'Humanitariana' is the possibility of combining the resources of several organizations. Scholars and research groups may work in local network mode or in distributed IT environments, with the ability to appeal to the resources of the servers of any participating organization. The paper discusses some specific cases of contextual knowledge explication with the use of the e-library services and focuses on the possibilities of new types of contextual knowledge. The experimental research base consists of scientific texts about 'e-government' and 'computer games'.
An analysis of trends in the subject-themed texts allowed the authors to propose a content-analysis methodology that combines full-text search with the automatic construction of a 'terminogramma' and expert analysis of the selected contexts. A 'terminogramma' is laid out as a table containing a column with a frequency-ranked list of words (nouns), together with columns indicating the absolute frequency (count) and the relative frequency of occurrence of each word (in %). The analysis of the 'e-government' materials showed that the state takes a dominant position in the processes of electronic interaction between the authorities and society in modern Russia. The media credit the main role in these processes to the government, which provides public services through specialized portals. Factor analysis revealed two factors statistically describing the terms used: human interaction (the user) and the state (the government, as organizer of the processes); interaction management (the public officer, as performer of the processes) and technology (infrastructure). Isolating these factors will lead to changes in the model of electronic interaction between government and society. The study also identified the dominant social problems and the prevalence of different categories of subjects of computer gaming in scientific papers from 2005 to 2015. Several types of contextual knowledge are thus identified: micro context; macro context; dynamic context; thematic collections of queries (interactive contextual knowledge expanding the composition of the e-library's information resources); and multimodal context (functional integration of iconographic and full-text resources through a hybrid quasi-semantic search algorithm).
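The frequency-ranked 'terminogramma' described above can be sketched as a small table-building routine. The column layout (term, absolute count, relative frequency in %) follows the abstract's description; the pre-tokenized input and the omission of noun filtering (which would require a POS tagger) are simplifying assumptions:

```python
from collections import Counter

def terminogramma(tokens):
    # Frequency-ranked term table: (term, absolute count, relative frequency in %).
    counts = Counter(tokens)
    total = sum(counts.values())
    return [(term, n, round(100.0 * n / total, 2))
            for term, n in counts.most_common()]

# Hypothetical token stream from an 'e-government' text.
tokens = ["government", "service", "portal", "government",
          "user", "government", "service"]
table = terminogramma(tokens)
# table[0] is the most frequent term with its absolute and relative frequency.
```

An expert would then read the top of this table alongside the contexts retrieved by the full-text search, as the methodology above combines both.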
Further studies can be pursued both by expanding the resource base on which they are conducted and by developing appropriate tools.Keywords: contextual knowledge, contextual search, e-library services, frequency-ranked query, paragraph-oriented query, technologies of contextual knowledge extraction
Procedia PDF Downloads 3596532 Improving Gas Separation Performance of Poly(Vinylidene Fluoride) Based Membranes Containing Ionic Liquid
Authors: S. Al-Enezi, J. Samuel, A. Al-Banna
Abstract:
Polymer-based membranes are one of the low-cost technologies available for gas separation. Three major elements required of a commercial gas-separation membrane are high permeability, high selectivity, and good mechanical strength. Poly(vinylidene fluoride) (PVDF) is a commercially available fluoropolymer and a widely used membrane material in gas-separation devices, since it possesses remarkable thermal and chemical stability and excellent mechanical strength. The PVDF membrane was chemically modified by soaking in different ionic liquids and then dried. The thermal behavior of the modified membranes was investigated by differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA), and the results clearly identify the ionic liquid with the best affinity for the polymer support. The porous structure of the PVDF membranes is clearly seen in the scanning electron microscopy (SEM) images. The CO₂ permeability of the blended membranes was explored in comparison with the unmodified matrix. The ionic liquid immobilized in the hydrophobic PVDF support exhibited good performance for CO₂/N₂ separation. The improved permeability of the modified membrane (PVDF-IL) is attributed to the high concentration of nitrogen-rich imidazolium moieties.Keywords: PVDF, polymer membrane, gas permeability, CO₂ separation, nanotubes
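As a back-of-the-envelope illustration of the permeability/selectivity pairing discussed above, ideal (pure-gas) CO₂/N₂ selectivity is simply the ratio of the single-gas permeabilities. The numbers below are hypothetical placeholders, not values reported in this study:

```python
def ideal_selectivity(perm_fast, perm_slow):
    # Ideal (pure-gas) selectivity: ratio of the two single-gas permeabilities.
    return perm_fast / perm_slow

# Hypothetical single-gas permeabilities (in Barrer) for an IL-modified
# PVDF membrane; illustrative only, not taken from the study.
p_co2, p_n2 = 30.0, 1.2
alpha_co2_n2 = ideal_selectivity(p_co2, p_n2)  # 25.0
```

Mixed-gas selectivity can differ from this ideal value, so the ratio is only a first screening figure.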
Procedia PDF Downloads 2846531 Acceleration of DNA Hybridization Using Electroosmotic Flow
Authors: Yun-Hsiang Wang, Huai-Yi Chen, Kin Fong Lei
Abstract:
Deoxyribonucleic acid (DNA) hybridization is a technique widely used in genetic assays. However, the hybridization ratio and rate are usually limited by diffusion. Here, a microfluidic electrode platform producing electroosmosis, driven by an alternating-current signal, is proposed to enhance the hybridization ratio and rate. The electrodes were made of gold and fabricated by microfabrication techniques. A thiol-modified oligo probe was immobilized on the electrode for specific capture of the target, which carries a fluorescent tag. Alternating-current electroosmosis induces local microfluidic vortexes that accelerate DNA hybridization. This study provides a strategy to enhance the rate of DNA hybridization in genetic assays.Keywords: DNA hybridization, electroosmosis, electrical enhancement, hybridization ratio
Procedia PDF Downloads 3836530 An Experimental Investigation on Productivity and Performance of an Improved Design of Basin Type Solar Still
Authors: Mahmoud S. El-Sebaey, Asko Ellman, Ahmed Hegazy, Tarek Ghonim
Abstract:
Due to population growth, the need for drinkable, healthy water has greatly increased. Consequently, and since conventional sources of water are limited, researchers have devoted their efforts to obtaining fresh drinkable water from oceans and seas by thermal distillation. The current work is dedicated to the design and fabrication of a modified solar still model, as well as a conventional solar still for the sake of comparison. The modified still is a single-slope, double-basin solar still. It consists of a lower basin with dimensions of 1000 mm x 1000 mm, which contains the seawater, and a top basin, made of 4 mm acrylic, which rested on supporting strips permanently fixed to the side walls. Ten equally spaced vertical glass strips of 50 mm height and 3 mm thickness were provided in the upper basin to keep the water stagnant. Window glass of 3 mm thickness, inclined at 23°, was used as the transparent cover at the top of the still. Furthermore, the performance of these two models in converting salty seawater into drinkable fresh water is evaluated, compared, analyzed, and discussed. The experiments were performed from June to July 2018 at seawater depths of 2, 3, 4, and 5 cm. Additionally, the solar still models were operated simultaneously under the same climatic conditions to analyze the influence of the modifications on the freshwater output. It can be concluded that the modified design of the double-basin single-slope solar still gives the maximum freshwater output at all water depths tested. The results showed that the daily productivity of the modified and conventional solar stills was 2.9 and 1.8 dm³/m² day, respectively, indicating an increase of about 60% in freshwater production.Keywords: freshwater output, solar still, solar energy, thermal desalination
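The reported ~60% improvement follows directly from the daily productivities quoted above (2.9 vs. 1.8 dm³/m² day), as a quick check shows:

```python
def percent_increase(modified, conventional):
    # Relative gain of the modified still over the conventional one, in %.
    return 100.0 * (modified - conventional) / conventional

# Daily productivities from the abstract, in dm³/m² day.
gain = percent_increase(2.9, 1.8)  # about 61%, consistent with the reported ~60%
```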
Procedia PDF Downloads 1356529 Creative Accounting as a Financial Numbers Game
Authors: Feddaoui Amina
Abstract:
Through this study, we will try to shed light on the theoretical framework proposed for understanding creative accounting as a financial numbers game and one of the most important techniques of accounts manipulation, as well as its main actors and practices. We will examine the role of the modified Jones model (1995) in detecting creative accounting practices using discretionary accruals. Finally, we will try to confirm the importance of, and the need for, addressing this type of practice through corporate governance, as a main control system and an important line of defense for reducing these dangerous accounts manipulations.Keywords: financial numbers game, creative accounting, modified Jones model, accounts manipulation
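A minimal sketch of how discretionary accruals fall out of the modified Jones (1995) model mentioned above: non-discretionary accruals are predicted from lagged assets, the receivables-adjusted revenue change, and PPE, and the discretionary part is the residual. The coefficients and firm-year figures below are illustrative assumptions, not estimates from any real sample (in practice they come from an industry-level OLS regression of total accruals, which is not shown here):

```python
def modified_jones_nda(a1, a2, a3, assets_prev, d_rev, d_rec, ppe):
    # Non-discretionary accruals under the modified Jones (1995) model:
    # NDA = a1*(1/A) + a2*((dREV - dREC)/A) + a3*(PPE/A),
    # where A is lagged total assets.
    return (a1 / assets_prev
            + a2 * (d_rev - d_rec) / assets_prev
            + a3 * ppe / assets_prev)

def discretionary_accruals(total_accruals, assets_prev, nda):
    # Discretionary accruals = asset-scaled total accruals minus the model's NDA.
    return total_accruals / assets_prev - nda

# Assumed coefficients and firm-year figures, for illustration only.
nda = modified_jones_nda(a1=0.0, a2=0.10, a3=-0.05,
                         assets_prev=1000.0, d_rev=200.0, d_rec=50.0, ppe=400.0)
da = discretionary_accruals(total_accruals=20.0, assets_prev=1000.0, nda=nda)
```

A large positive residual `da` is what the detection literature reads as a signal of income-increasing manipulation.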
Procedia PDF Downloads 4776528 A Modified Shannon Entropy Measure for Improved Image Segmentation
Authors: Mohammad A. U. Khan, Omar A. Kittaneh, M. Akbar, Tariq M. Khan, Husam A. Bayoud
Abstract:
The Shannon entropy measure has been widely used for measuring uncertainty. In practical settings, however, the histogram is used to estimate the underlying distribution, and the histogram depends on the number of bins used. In this paper, a modification is proposed that makes histogram-based Shannon entropy consistent. To demonstrate its benefits, two applications in medical image processing are considered. Simulations are carried out to show the superiority of the modified measure for the image segmentation problem. The improvement may be attributed to the robustness it shows toward uneven background in images.Keywords: Shannon entropy, medical image processing, image segmentation, modification
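The bin-count dependence that motivates the modification can be seen in a plain histogram-based Shannon entropy estimator; the paper's consistency correction itself is not reproduced in this sketch:

```python
import math

def shannon_entropy_from_histogram(data, bins):
    # Shannon entropy (in bits) of a histogram estimate of the distribution.
    lo, hi = min(data), max(data)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for x in data:
        i = min(int((x - lo) / width), bins - 1)  # clamp the right edge
        counts[i] += 1
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

data = [0.0, 0.1, 0.4, 0.5, 0.8, 1.0]
# The same sample yields different entropy values for different bin counts:
h2 = shannon_entropy_from_histogram(data, 2)
h4 = shannon_entropy_from_histogram(data, 4)
```

This sensitivity to `bins` is exactly the inconsistency the proposed measure is meant to remove.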
Procedia PDF Downloads 4976527 Design and Implementation of a Counting and Differentiation System for Vehicles through Video Processing
Authors: Derlis Gregor, Kevin Cikel, Mario Arzamendia, Raúl Gregor
Abstract:
This paper presents a self-sustaining mobile system for counting and classifying vehicles through video processing. It proposes a counting and classification algorithm divided into four steps that can be executed multiple times in parallel on an SBC (Single Board Computer), such as the Raspberry Pi 2, so that it can run in real time. The first step of the proposed algorithm limits the zone of the image that will be processed. The second step detects moving objects using a BGS (Background Subtraction) algorithm based on the GMM (Gaussian Mixture Model), together with a shadow-removal algorithm using physics-based features, followed by morphological operations. In the third step, vehicle detection is performed using edge-detection algorithms, and vehicles are tracked with Kalman filters. The last step of the proposed algorithm registers each vehicle's passing and classifies it according to its area. A self-sustaining system is proposed, powered by batteries and photovoltaic solar panels, with data transmission over GPRS (General Packet Radio Service), eliminating the need for external cables and facilitating its deployment and relocation to any location where it might operate. The self-sustaining trailer will allow the counting and classification of vehicles in specific zones with difficult access.Keywords: intelligent transportation system, object detection, vehicle counting, vehicle classification, video processing
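The tracking step above can be sketched as a one-dimensional constant-velocity Kalman filter over per-frame vehicle positions. This is a simplified stand-in, not the paper's implementation: the GMM background subtraction, shadow removal, and morphology stages are omitted, and the measurement stream is assumed to come from the detector:

```python
def kalman_track(measurements, dt=1.0, q=1e-3, r=1.0):
    # 1-D constant-velocity Kalman filter over noisy position measurements.
    # State: (position x, velocity v); q = process noise, r = measurement noise.
    x, v = measurements[0], 0.0
    p = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
    track = []
    for z in measurements:
        # Predict: x <- x + v*dt, P <- F P F^T + Q.
        x = x + dt * v
        p = [[p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + q,
              p[0][1] + dt * p[1][1]],
             [p[1][0] + dt * p[1][1],
              p[1][1] + q]]
        # Update with the measured position z (H = [1, 0]).
        s = p[0][0] + r
        k0, k1 = p[0][0] / s, p[1][0] / s
        y = z - x
        x, v = x + k0 * y, v + k1 * y
        p = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
             [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
        track.append(x)
    return track

# A vehicle moving at constant speed across the frame, one position per frame.
track = kalman_track([float(i) for i in range(10)])
```

In the full pipeline a filter like this is run per detected vehicle, so that a track persists across frames where detection briefly fails.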
Procedia PDF Downloads 3236526 Density-based Denoising of Point Cloud
Authors: Faisal Zaman, Ya Ping Wong, Boon Yian Ng
Abstract:
Point cloud source data for surface reconstruction is usually contaminated with noise and outliers. To overcome this, we present a novel approach that uses a modified kernel density estimation (KDE) technique with bilateral filtering to remove noisy points and outliers. First, we present a method for estimating the optimal bandwidth of a multivariate KDE using a particle swarm optimization technique, which ensures robust density estimation. We then use the mean-shift algorithm to find the local maxima of the density estimate, which give the centroids of the clusters, and compute the distance of each point from its centroid. Points belonging to outliers are then removed by an automatic thresholding scheme, which yields an accurate and economical point surface. The experimental results show that our approach is comparably robust and efficient.Keywords: point preprocessing, outlier removal, surface reconstruction, kernel density estimation
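A stripped-down, one-dimensional version of the density-based idea: estimate a kernel density at every point and drop the lowest-density fraction. The paper's PSO bandwidth search, mean-shift clustering, and automatic thresholding are replaced here by a fixed bandwidth and keep-ratio, both assumptions made for illustration:

```python
import math

def gaussian_kde(points, h):
    # Kernel density estimate at each input point (1-D Gaussian kernel,
    # bandwidth h). Real point clouds would use the multivariate form.
    n = len(points)
    norm = n * h * math.sqrt(2 * math.pi)
    def density(x):
        return sum(math.exp(-((x - p) / h) ** 2 / 2) for p in points) / norm
    return [density(x) for x in points]

def remove_outliers(points, h=0.5, keep_ratio=0.8):
    # Keep the highest-density fraction of points; isolated points
    # (low density) are discarded as outliers.
    dens = gaussian_kde(points, h)
    ranked = sorted(zip(dens, points), reverse=True)
    k = int(len(points) * keep_ratio)
    return sorted(p for _, p in ranked[:k])

cloud = [1.0, 1.1, 0.9, 1.05, 10.0]  # 10.0 is an isolated outlier
clean = remove_outliers(cloud, keep_ratio=0.8)
```

The fixed `keep_ratio` is the crudest possible threshold; the automatic thresholding in the paper adapts this cut-off to the data instead.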
Procedia PDF Downloads 3456525 Algorithm Optimization to Sort in Parallel by Decreasing the Number of the Processors in SIMD (Single Instruction Multiple Data) Systems
Authors: Ali Hosseini
Abstract:
Parallelization is a mechanism for decreasing the time necessary to execute programs. Sorting is one of the important operations used in different systems, in that the proper functioning of many algorithms and operations depends on sorted data. The CRCW_SORT algorithm sorts 'N' elements in O(1) time on SIMD (Single Instruction Multiple Data) computers using n^2/2 - n/2 processors. In this article, by presenting a mechanism that divides the input sequence around a pivot ('hinge') element into two smaller sequences, the number of processors needed to sort 'N' elements in O(1) time is decreased to n^2/8 - n/4 in the best case; with this mechanism, the best case occurs when the pivot element is the median, and the worst case when it is the minimum. The findings from assessing the proposed algorithm against other methods, over data collections and processor counts, indicate that the proposed algorithm uses fewer processors during execution than the other methods.Keywords: CRCW, SIMD (Single Instruction Multiple Data) computers, parallel computers, number of the processors
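The processor counts quoted above can be checked directly. Note that the best-case count n^2/8 - n/4 equals the baseline requirement for a single half of size n/2; reading that as processor reuse across the two halves is our observation, not a claim taken from the abstract:

```python
def crcw_processors(n):
    # Baseline CRCW sort: n^2/2 - n/2 processors for n elements.
    return (n * n - n) // 2

def pivot_split_processors(n):
    # Best case of the pivot-split mechanism (median pivot): n^2/8 - n/4,
    # the same as the baseline count for one half of size n/2.
    return (n * n - 2 * n) // 8

# For n = 8 elements: 28 processors for the baseline versus 6 in the best case.
base = crcw_processors(8)
best = pivot_split_processors(8)
```

The worst case (minimum pivot) leaves one sub-sequence of size n - 1, so the saving vanishes, which is why the choice of pivot drives the result.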
Procedia PDF Downloads 310