Search results for: time based DNA codes
14310 Evolving a Fuzzy Rule-Base for Image Segmentation
Abstract:
A new method for color image segmentation using fuzzy logic is proposed in this paper. Our aim is to automatically produce a fuzzy system for color classification and image segmentation with the fewest rules and the minimum error rate. Particle swarm optimization is a subclass of evolutionary algorithms inspired by the social behavior of fish, bees, birds, and other animals that live together in colonies. We use the comprehensive learning particle swarm optimization (CLPSO) technique to find optimal fuzzy rules and membership functions because it discourages premature convergence. Each particle of the swarm encodes a set of fuzzy rules. During evolution, a population member tries to maximize a fitness criterion, which here rewards a high classification rate and a small number of rules. Finally, the particle with the highest fitness value is selected as the best set of fuzzy rules for image segmentation. Our results, using this method for soccer field image segmentation in RoboCup contests, show 89% performance. Less computational load is needed with this method than with other methods such as ANFIS, because it generates a smaller number of fuzzy rules. A large and varied training dataset makes the proposed method invariant to illumination noise.
Keywords: Comprehensive learning particle swarm optimization, fuzzy classification.
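The fitness function is not written out in the abstract; a minimal sketch of a fitness of the kind described above (high classification rate traded off against rule count), with an assumed weighting factor and an assumed rule-set encoding, could look like this:

```python
# Hypothetical sketch only: the weighting factor `alpha`, the `classify`
# callable and the encoding of a particle as a list of fuzzy rules are
# assumptions, not details taken from the paper.

def fitness(particle_rules, classify, pixels, labels, alpha=0.05):
    """Reward classification accuracy, penalize the size of the rule base."""
    correct = sum(1 for x, y in zip(pixels, labels)
                  if classify(particle_rules, x) == y)
    accuracy = correct / len(pixels)         # classification rate in [0, 1]
    penalty = alpha * len(particle_rules)    # discourage large rule bases
    return accuracy - penalty

# The CLPSO loop would evaluate fitness(...) for every particle each
# generation and keep the highest-fitness particle as the final rule base.
```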
14309 The Role of Motivations for Eco-driving and Social Norms on Behavioural Intentions Regarding Speed Limits and Time Headway
Authors: M. Cristea, F. Paran, P. Delhomme
Abstract:
Eco-driving allows the driver to optimize his/her behaviour in order to achieve several types of benefits: reducing pollution emissions, increasing road safety, and saving fuel. One of the main rules for adopting eco-driving is to anticipate traffic events by avoiding strong acceleration or braking and maintaining a steady speed when possible. Therefore, drivers have to comply with speed limits and time headway. The present study explored the role of three types of motivation and of social norms in predicting French drivers' intentions to comply with speed limits and time headway as eco-driving practices, and examined variations according to gender and age. In total, 1,234 drivers aged 18 to 75 years filled in a questionnaire presented as part of an online survey aiming to better understand drivers' road habits. It included items assessing: a) behavioural intentions to comply with speed limits and time headway according to three types of motivation: reducing pollution emissions, increasing road safety, and saving fuel; b) subjective and descriptive social norms regarding the intention to comply with speed limits and time headway; and c) sociodemographic variables. Drivers expressed their intention to frequently comply with speed limits and time headway in the following 6 months; however, they showed more intention to comply with speed limits than with time headway, regardless of the type of motivation. The subjective injunctive norms were significantly more important than the descriptive norms in predicting drivers' intentions to comply with speed limits and time headway. In addition, the most frequently reported type of motivation for complying with speed limits and time headway was increasing road safety, followed by fuel saving and reducing pollution emissions, underlining a low motivation to practice eco-driving. Practical implications of the results are discussed.
Keywords: Eco-driving, social norms, speed limits, time headway.
14308 Rural Connectivity Technologies Cost Analysis
Authors: F. Simba, L. Trojer, N.H. Mvungi, B.M. Mwinyiwiwa, E.M. Mjema
Abstract:
Rural areas of Tanzania are still disadvantaged in terms of the diffusion of IP-based services; this is due to a lack of Information and Communication Technology (ICT) infrastructure, especially a lack of connectivity. One of the limitations behind connectivity problems in rural areas of Tanzania is the high cost of establishing infrastructure for IP-based services [1-2]. However, the cost of connectivity varies from one technology to another and, at the same time, it also differs from one operator (service provider) to another within the country. This paper presents the development of a software system to calculate the cost of connectivity to rural areas of Tanzania. The system is developed to provide easy access to connectivity costs across different technologies and different operators. The development of the calculator follows the V-model software development lifecycle. The calculator is used to evaluate the economic viability of different technologies considered to be potential candidates for providing rural connectivity. In this paper, the evaluation is based on the techno-economic analysis approach.
Keywords: Rural, connectivity, cost, V-model, techno-economic analysis.
14307 Performance Evaluation of a Limited Round-Robin System
Authors: Yoshiaki Shikata
Abstract:
The performance of a limited Round-Robin (RR) rule is studied in order to clarify the characteristics of a realistic processor-sharing model. Under the limited RR rule, the processor allocates to each request a fixed amount of time, called a quantum, in a fixed order. The sum of the requests being allocated these quanta is kept below a fixed value. Arriving requests that cannot be allocated quanta because of this restriction are queued or rejected. Practical performance measures, such as the relationship between the mean sojourn time, the mean number of requests, or the loss probability and the quantum size, are evaluated via simulation. In the evaluation, the requested service time of an arriving request is converted into a number of quanta. One of these quanta is included in each RR cycle, that is, a series of quanta allocated to the requests in a fixed order. The service time of the arriving request can then be evaluated from the number of RR cycles required to complete the service, the number of requests receiving service, and the quantum size. The number of quanta still needed before service is completed is re-evaluated at the arrival or departure of other requests. Tracking these events and calculations enables us to analyze the performance of the limited RR rule. In particular, we obtain the most suitable quantum size, which minimizes the mean sojourn time, for the case in which the switching time for each quantum is considered.
Keywords: Limited RR rule, quantum, processor sharing, sojourn time, performance measures, simulation, loss probability.
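As a rough, hedged illustration of the quantum-based accounting described above (not the simulator used in the paper), the sketch below converts a requested service time into quanta and estimates a completion time from the number of RR cycles; the switching time value and the neglect of arrivals and departures during service are simplifying assumptions.

```python
# Simplified sketch of quantum accounting under an RR rule; it ignores the
# re-evaluation at arrivals/departures and the admission limit of the paper.
import math

def quanta_needed(service_time, quantum):
    """Convert a requested service time into a whole number of quanta."""
    return math.ceil(service_time / quantum)

def completion_estimate(service_time, other_requests, quantum, switch_time=0.0):
    """Each RR cycle grants one quantum to every request holding quanta, so the
    number of cycles needed equals this request's own quantum count."""
    own_quanta = quanta_needed(service_time, quantum)
    cycle_length = (len(other_requests) + 1) * (quantum + switch_time)
    return own_quanta * cycle_length

# Example: a request of 5 time units sharing the processor with two others.
print(completion_estimate(5.0, other_requests=[3, 7], quantum=1.0, switch_time=0.1))
```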
14306 IPSO Based UPFC Robust Output Feedback Controllers for Damping of Low Frequency Oscillations
Authors: A. Safari, H. Shayeghi, H. A. Shayanfar
Abstract:
On the basis of the linearized Phillips-Heffron model of a single-machine power system, a novel method for designing a unified power flow controller (UPFC) based output feedback controller is presented. The design problem of the output feedback controller for the UPFC is formulated as an optimization problem with a time-domain objective function, which is solved by iteration particle swarm optimization (IPSO), an algorithm with a strong ability to find near-optimal solutions. To ensure the robustness of the proposed damping controller, the design process takes into account a wide range of operating conditions and system configurations. The simulation results prove the effectiveness and robustness of the proposed method in achieving high power system performance. The simulation study shows that the controller designed by iteration PSO performs better than classical PSO in finding the solution.
Keywords: UPFC, IPSO, output feedback controller.
14305 Decision Tree Modeling in Emergency Logistics Planning
Authors: Yousef Abu Nahleh, Arun Kumar, Fugen Daver, Reham Al-Hindawi
Abstract:
Despite the availability of natural disaster related time series data for the last 110 years, there is no forecasting tool available to humanitarian relief organizations for emergency logistics planning. This study develops a forecasting tool based on identifying the probability of disaster for each country in the world by using decision tree modeling. Further, the determination of aggregate forecasts leads to efficient pre-disaster planning. Based on the research findings, relief agencies can optimize the allocation of various resources in emergency logistics planning.
Keywords: Decision tree modeling, Forecasting, Humanitarian relief, emergency supply chain.
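The abstract does not name the features or software behind the decision tree model; a purely hypothetical sketch of a country-level disaster-probability estimate of this kind, using scikit-learn and invented feature names, might look as follows.

```python
# Hypothetical sketch only: the file name, feature names and tree depth are
# invented for illustration; they are not taken from the study.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# One row per country-year: a few explanatory variables and a 0/1 label
# indicating whether a disaster requiring humanitarian relief occurred.
data = pd.read_csv("country_year_disasters.csv")
features = ["region_code", "rainfall_anomaly", "seismicity_index"]
X, y = data[features], data["disaster_occurred"]

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

# Probability of disaster per country in the latest year; aggregating these
# probabilities gives the forecast used for pre-disaster planning.
latest = data[data["year"] == data["year"].max()]
probabilities = tree.predict_proba(latest[features])[:, 1]
```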
14304 Typical Day Prediction Model for Output Power and Energy Efficiency of a Grid-Connected Solar Photovoltaic System
Authors: Yan Su, L. C. Chan
Abstract:
A novel typical day prediction model has been built and validated against the measured data of a grid-connected solar photovoltaic (PV) system in Macau. Unlike the conventional statistical method used in previous studies on PV systems, which obtains results by averaging nearby continuous points, the present typical day statistical method obtains the value at every minute of a typical day by averaging discontinuous points at the same minute in different days. This typical day statistical method based on discontinuous point averaging makes it possible to obtain the Gaussian-shaped dynamical distributions of solar irradiance and output power in a yearly or monthly typical day. Based on the yearly typical day statistical analysis results, the maximum possible accumulated output energy in a year under on-site climate conditions and the corresponding optimal PV system running time are obtained. Periodic Gaussian-shape prediction models for solar irradiance, output energy and system energy efficiency have been built, and their coefficients have been determined based on the yearly, maximum and minimum monthly typical day Gaussian distribution parameters, which are obtained from iterations for minimum Root Mean Squared Deviation (RMSD). With the present model, the dynamical effects due to time differences within a day are kept, and the day-to-day uncertainty due to changing weather is smoothed but still included. The periodic Gaussian-shape correlations for solar irradiance, output power and system energy efficiency compare favorably with the data of the PV system in Macau and prove to be an improvement over previous models.
Keywords: Grid Connected, RMSD, Solar PV System, Typical Day.
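The functional form of the Gaussian-shape daily profile is not written out in the abstract; a plausible form consistent with the description above is given below as an assumption, with the peak value, peak time and spread fitted (e.g. by minimizing the RMSD) for the yearly or monthly typical day.

```latex
% Assumed Gaussian-shape profile for output power P at minute t of the typical day:
% P_max is the peak power, t_0 the time of the peak and \sigma the spread.
P(t) = P_{\max}\,\exp\!\left(-\frac{(t - t_0)^{2}}{2\sigma^{2}}\right)
```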
14303 Optimization of Fuzzy Cluster Nodes in Cellular Multimedia Networks
Authors: J. D. Mallapur, Supriya H., Santosh B. K., Tej H.
Abstract:
The cellular network is one of the emerging areas of communication, in which the mobile nodes act as members of one base station. Cluster-based communication is now an emerging area of wireless cellular multimedia networks. The cluster renders fast communication and also a convenient way to work with connectivity. In our scheme we propose an optimization technique for the fuzzy cluster nodes, by categorizing the group members into three categories: long refreshable members, medium refreshable members and short refreshable members. By considering long refreshable nodes as static nodes, we compute new membership values for the other nodes in the cluster. We compare their previous and present membership values with the threshold value to categorize them into the three different member types. In this way, we optimize the nodes in the fuzzy clusters. The simulation results show that there is a reduction in the cluster computational time and iteration time after optimization.
Keywords: Clusters, fuzzy and optimization.
14302 Analysis of One Dimensional Advection Diffusion Model Using Finite Difference Method
Authors: Vijay Kumar Kukreja, Ravneet Kaur
Abstract:
In this paper, a one-dimensional advection-diffusion model is analyzed using a finite difference method based on the Crank-Nicolson scheme. A practical problem of filter cake washing in chemical engineering is analyzed. The model is converted into dimensionless form. For the grid Ω × ω = [0, 1] × [0, T], the Crank-Nicolson scheme is used for the spatial derivatives and the forward difference scheme is used in the time domain. The scheme is found to be unconditionally convergent, stable, first order accurate in time and second order accurate in the space domain. For a test problem, numerical results are compared with the analytical ones for different values of the parameter.
Keywords: Consistency, Crank-Nicolson scheme, Gerschgorin circle, Lax-Richtmyer theorem, Peclet number, stability.
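For reference, the Crank-Nicolson discretization of the one-dimensional advection-diffusion equation u_t + a u_x = D u_xx on a uniform grid (spatial step h, time step k) takes the standard form below; this is the textbook form of the scheme named above, not a formula reproduced from the paper.

```latex
% Crank-Nicolson scheme for u_t + a u_x = D u_{xx}, with u_i^n \approx u(x_i, t_n),
% averaging the spatial operator between time levels n and n+1:
\frac{u_i^{n+1}-u_i^{n}}{k}
  + \frac{a}{2}\left(\frac{u_{i+1}^{n+1}-u_{i-1}^{n+1}}{2h}
                    +\frac{u_{i+1}^{n}-u_{i-1}^{n}}{2h}\right)
  = \frac{D}{2}\left(\frac{u_{i+1}^{n+1}-2u_i^{n+1}+u_{i-1}^{n+1}}{h^{2}}
                    +\frac{u_{i+1}^{n}-2u_i^{n}+u_{i-1}^{n}}{h^{2}}\right)
```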
14301 Experimental Analysis of Diesel Hydrotreating Reactor to Develop a Simplified Tool for Process Real-Time Optimization
Authors: S. Shokri, S. Zahedi, M. Ahmadi Marvast, B. Baloochi, H. Ganji
Abstract:
In this research, a systematic investigation was carried out to determine the optimum conditions of an HDS reactor. Moreover, a suitable model was developed for a rigorous RTO (real-time optimization) loop of the HDS (hydrodesulfurization) process. A systematic experimental series was designed based on CCD (central composite design) and carried out in the related pilot plant to tune the developed model. The design variables in the experiments were temperature, LHSV and pressure, while the hydrogen to fresh feed ratio was kept constant. The ranges of these variables were 320-380 ºC, 1-2 hr⁻¹ and 50-55 bar, respectively. A power-law kinetic model was also developed for our further research. The reaction order, activation energy and frequency factor of this model were 1.4, 92.66 kJ/mol and k0 = 2.7×10⁹, respectively.
Keywords: Statistical model, multiphase reactors, gas oil, hydrodesulfurization, optimization, kinetics.
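Given the reported parameters, the power-law kinetic model mentioned above presumably takes the standard Arrhenius form below; the symbols C (reactant concentration) and R (gas constant) are our notation, not the paper's.

```latex
% Power-law HDS rate with Arrhenius temperature dependence, using the reported
% order n = 1.4, activation energy E_a = 92.66 kJ/mol and frequency factor k_0 = 2.7e9:
-r = k_0 \exp\!\left(-\frac{E_a}{RT}\right) C^{\,n},
\qquad n = 1.4,\quad E_a = 92.66\ \mathrm{kJ/mol},\quad k_0 = 2.7\times 10^{9}
```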
14300 Improving the Performances of the nMPRA Architecture by Implementing Specific Functions in Hardware
Authors: Ionel Zagan, Vasile Gheorghita Gaitan
Abstract:
Minimizing the response time to asynchronous events in a real-time system is an important factor in designing equipment fast enough for the most demanding applications. The present article presents results regarding the validation of the nMPRA (Multi Pipeline Register Architecture) concept using the FPGA Virtex-7 circuit. The nMPRA concept is a hardware processor with the scheduler implemented at the processor level; this is done without affecting a possible bus communication, as is the case with other CPU solutions. The implementation of static or dynamic scheduling operations in hardware and the improved handling of interrupts and events by the real-time executive described in the present article represent a key solution for eliminating the overhead of operating system functions. The nMPRA processor is capable of executing preemptive scheduling, using various algorithms, without a software scheduler. Therefore, we also present various scheduling methods and algorithms used in scheduling real-time tasks.
Keywords: nMPRA architecture, pipeline processor, preemptive scheduling, real-time system.
14299 Conceptual Multidimensional Model
Authors: Manpreet Singh, Parvinder Singh, Suman
Abstract:
Data are available in abundance in any business organization, including records for finance, maintenance, inventory, progress reports, etc. As time progresses, the data keep accumulating, and the challenge is to extract information from this data bank. Knowledge discovery from these large and complex databases is the key problem of this era. Data mining and machine learning techniques are needed that can scale to the size of the problems and can be customized to the business application. To develop accurate and relevant information for a particular problem, business analysts need to develop multidimensional models that give reliable information so that they can make the right decision for that problem. If the multidimensional model does not possess advanced features, the required accuracy cannot be expected. The present work involves the development of a multidimensional data model incorporating advanced features. The computation is based on data precision and includes a slowly changing time dimension. The final results are displayed in graphical form.
Keywords: Multidimensional, data precision.
14298 Classifier Based Text Mining for Neural Network
Authors: M. Govindarajan, R. M. Chandrasekaran
Abstract:
Text mining applies knowledge discovery techniques to unstructured text; this is also termed knowledge discovery in text (KDT) or text data mining. In neural networks that address classification problems, the training set, testing set and learning rate are key elements: the collections of input/output patterns used to train the network and to assess its performance, and the rate at which weight adjustments are made. This paper describes a proposed back-propagation neural network classifier that performs cross-validation for the original neural network, in order to optimize classification accuracy and training time. The feasibility and benefits of the proposed approach are demonstrated by means of five datasets: contact-lenses, cpu, weather-symbolic, weather and labor-nega-data. It is shown that, compared to the existing neural network, training time is reduced by more than a factor of 10 when the dataset is larger than cpu or the network has many hidden units, while accuracy ('percent correct') was the same for all datasets except contact-lenses, which is the only one with missing attributes. For contact-lenses, the accuracy with the proposed neural network was on average around 0.3% lower than with the original neural network. This algorithm is independent of specific datasets, so many ideas and solutions can be transferred to other classifier paradigms.
Keywords: Back propagation, classification accuracy, text mining, time complexity.
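As a hedged illustration of the cross-validated back-propagation classifier idea described above (not the authors' implementation), a scikit-learn equivalent on a stand-in dataset could be sketched as follows.

```python
# Illustrative sketch only: scikit-learn's MLPClassifier (a back-propagation
# trained network) stands in for the authors' classifier, and the iris data
# stands in for the Weka-style datasets named in the abstract.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(10,), learning_rate_init=0.01,
                  max_iter=1000, random_state=0),
)

# 5-fold cross-validation yields a 'percent correct' figure per fold.
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```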
14297 Evaluation of Risk Attributes Driven by Periodically Changing System Functionality
Authors: Dariusz Dymek, Leszek Kotulski
Abstract:
Modeling of distributed systems allows us to represent their whole functionality. A working system instance rarely fulfils the whole functionality represented by the model; usually some parts of this functionality are only required periodically. A reporting system based on the Data Warehouse concept seems to be an intuitive example of a system in which some functionality is required only from time to time. Analyzing the enterprise risk associated with periodical changes of system functionality, we should consider not only the inaccessibility of the components (objects) but also of their functions (methods), and the impact of such a situation on the system functionality from the business point of view. In the paper we suggest that these risk attributes should be estimated from the risk attributes specified at the requirements level (Use Cases in the UML model) on the basis of information about the structure of the model (presented at other levels of the UML model). We argue that it is desirable to consider the influence of periodical changes in requirements on the enterprise risk estimation. Finally, a proposition of such a solution based on the UML system model is presented.
Keywords: Risk assessing, software maintenance, UML, graph grammars.
14296 Curing Time Effect on Behavior of Cement Treated Marine Clay
Authors: H. W. Xiao, F. H. Lee
Abstract:
Cement stabilization has been widely used for improving the strength and stiffness of soft clayey soils. Cement treated soil specimens used to investigate the stress-strain behaviour in laboratory studies are usually cured for 7 days. This paper examines the effects of curing time on the strength and stress-strain behaviour of cement treated marine clay under triaxial loading conditions. Laboratory-prepared cement treated Singapore marine clay with different mix proportions S-C-W (soil solid-cement solid-water) and curing times (7 days to 180 days) was investigated through unconfined compressive strength tests and triaxial tests. The results show that the curing time has a significant effect on the unconfined compressive strength q_u, the isotropic compression behaviour and the stress-strain behaviour. Although the primary yield loci of the cement treated soil specimens with the same mix proportion expand with curing time, they are very narrowly banded and have nearly the same shape after being normalized by the isotropic compression primary yield stress p'_py. The isotropic compression primary yield stress p'_py was shown to be linearly related to the unconfined compressive strength q_u for specimens with different curing times and mix proportions. The effect of curing time on the hardening behaviour diminishes at consolidation stresses higher than the isotropic compression primary yield stress, but its damping rate is dependent on the cement content.
Keywords: Cement treated soil, curing time effect, hardening behaviour, isotropic compression primary yield stress, unconfined compressive strength.
14295 Uranium Adsorption Using a Composite Material Based on Platelet SBA-15 Supported Tin Salt Tungstomolybdophosphoric Acid
Authors: H. Aghayan, F. A. Hashemi, R. Yavari, S. Zolghadri
Abstract:
In this work, a new composite adsorbent based on a mesoporous silica SBA-15 with platelet morphology and the tin salt of tungstomolybdophosphoric (TWMP) acid was synthesized and applied for uranium adsorption from aqueous solution. The sample was characterized by X-ray diffraction, Fourier transform infrared spectroscopy, and N2 adsorption-desorption analysis, and then the effect of various parameters, such as metal ion concentration and contact time, on the adsorption behavior was examined. The experimental results showed that the adsorption process is explained very well by the Langmuir isotherm model, and that the predominant reaction mechanism is physisorption. Kinetic data suggest that the adsorption process can be described by the pseudo-second-order reaction rate model.
Keywords: Platelet SBA-15, tungstomolybdophosphoric acid, adsorption, uranium ion.
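For reference, the Langmuir isotherm and the pseudo-second-order kinetic model named above are conventionally written as follows; these are the standard textbook forms, not equations quoted from the paper.

```latex
% Langmuir isotherm: q_e is the equilibrium uptake, q_m the monolayer capacity,
% K_L the Langmuir constant and C_e the equilibrium concentration.
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}

% Pseudo-second-order kinetics (linearized form): q_t is the uptake at time t
% and k_2 the rate constant.
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
```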
14294 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media
Authors: Jinghui Peng, Shanyu Tang, Jia Li
Abstract:
Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is an imminent demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect the existence of secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data are then transferred from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain of original VoIP streams and stego VoIP streams are compared in turn using a t-test, yielding a p-value of 7.5686E-176, which is below the threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.
Keywords: Steganalysis, security, fast Fourier transform, streaming media.
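A minimal sketch of the spectral comparison step described above, assuming two equal-length arrays of samples taken from an original and a suspect VoIP stream, could look like the following; NumPy and SciPy are used as stand-ins for the authors' implementation, and the example signals are synthetic placeholders.

```python
# Sketch only: compare the power spectra of an original and a suspect stream
# with a two-sample t-test; a low p-value suggests hidden data may be present.
import numpy as np
from scipy import stats

def power_spectrum(samples):
    """Power spectrum of a real-valued stream via the FFT."""
    return np.abs(np.fft.rfft(samples)) ** 2

rng = np.random.default_rng(0)
original = rng.normal(size=8000)                         # placeholder clean frame
suspect = original + rng.normal(scale=0.01, size=8000)   # placeholder stego frame

t_stat, p_value = stats.ttest_ind(power_spectrum(original), power_spectrum(suspect))
print(p_value)   # compare against the chosen significance threshold
```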
14293 Fault Detection and Identification of COSMED K4b2 Based on PCA and Neural Network
Authors: Jing Zhou, Steven Su, Aihuang Guo
Abstract:
COSMED K4b2 is a portable electrical device designed to test pulmonary functions. It is ideal for many applications that need measurement of the cardio-respiratory response either in the field or in the lab, and it is capable of delivering real-time data to a sink node or a PC base station while storing the data in memory at the same time. However, the actual sensor outputs and the data received may contain errors, such as impulsive noise, which can be related to the sensors, low batteries, the environment or disturbances in the data acquisition process. These abnormal outputs might cause misinterpretations of the exercise or living activities of the persons being monitored. In this paper we propose an effective and feasible method to detect and identify such errors by principal component analysis (PCA) and a back propagation (BP) neural network.
Keywords: BP Neural Network, Exercising Testing, Fault Detection and Identification, Principal Component Analysis.
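A hedged sketch of the PCA stage for flagging abnormal sensor outputs is given below, using the reconstruction error (squared prediction error) as the fault indicator; scikit-learn is a stand-in for the authors' implementation and the percentile-based threshold is an assumption.

```python
# Sketch only: PCA-based fault detection via reconstruction error (SPE/Q statistic).
# The 99th-percentile threshold and the number of components are assumptions.
import numpy as np
from sklearn.decomposition import PCA

def fit_detector(train, n_components=3):
    """Fit PCA on normal data and derive a reconstruction-error threshold."""
    pca = PCA(n_components=n_components).fit(train)
    recon = pca.inverse_transform(pca.transform(train))
    spe = np.sum((train - recon) ** 2, axis=1)   # squared prediction error
    return pca, np.percentile(spe, 99)

def detect_faults(pca, threshold, samples):
    """Flag samples whose reconstruction error exceeds the threshold."""
    recon = pca.inverse_transform(pca.transform(samples))
    spe = np.sum((samples - recon) ** 2, axis=1)
    return spe > threshold

# Usage: fit on clean K4b2 recordings, then flag new measurements.
# pca, thr = fit_detector(clean_data); flags = detect_faults(pca, thr, new_data)
```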
14292 The Evaluation of Production Line Performance by Using ARENA – A Case Study
Authors: Muhammad Marsudi, Hani Shafeek
Abstract:
The purpose of this paper is to simulate the production process of a metal stamping industry and to evaluate the utilization of the production line by using the ARENA simulation software. The process time and the standard time for each process of the production line are obtained from data given by the company management. Other data are collected through direct observation of the line. There are three work stations performing ten different types of processes in order to produce a single product type. An ARENA simulation model is then developed based on the collected data. Verification and validation are performed on the ARENA model, and finally the results of the ARENA simulation are analyzed. It is found that the utilization at each workstation increases if the batch size is increased while the throughput rate is kept constant. This study is very useful for the company because it needs to improve the efficiency and utilization of its production lines.
Keywords: Arena software, case study, production line, utilization.
14291 Hybrid Fuzzy Selecting-Control-by-Range Controllers of a Servopneumatic Fatigue System
Authors: Marco Soares dos Santos, Jorge Augusto Ferreira, Camila Nicola Boeri, Fernando Neto da Silva
Abstract:
The present paper proposes high performance nonlinear force controllers for a servopneumatic real-time fatigue test machine. A CompactRIO® controller was used, fully programmed in the LabVIEW language. Fuzzy logic control algorithms were evaluated to tune the integral and derivative components in the development of hybrid controllers, namely FLC P and hybrid FLC PID real-time controllers. Their behaviours were described using state diagrams. The main contribution is to ensure a smooth transition between control states, avoiding discrete transitions in the controller outputs. Steady-state errors lower than 1.5 N were reached without retuning the controllers. Good results were also obtained for sinusoidal tracking tasks from 1/π to 8/π Hz.
Keywords: Hybrid fuzzy selecting-control-by-range controllers, servopneumatic fatigue system.
14290 Basic Calibration and Normalization Techniques for Time Domain Reflectometry Measurements
Authors: Shagufta Tabassum
Abstract:
The study of the dielectric properties of a binary mixture of liquids is very useful for understanding the liquid structure, molecular interactions, dynamics, and kinematics of the mixture. Time-domain reflectometry (TDR) is a powerful tool for studying the cooperative and molecular dynamics of H-bonded systems. Here we discuss the basic calibration and normalization procedures for TDR measurements. Our aim is to explain the different types of errors that occur during TDR measurements and how to minimize them.
Keywords: Time domain reflectometry measurement technique, cable and connector loss, oscilloscope loss, normalization technique.
14289 Enhancement of Accountability within the South African Public Sector: Knowledge Gained from the Case of a National Commissioner of the South African Police Service
Authors: Yasmin Nanabhay
Abstract:
The paper scrutinizes the literature on accountability and non-accountability, and then presents an analysis of a South African case which demonstrated the consequences of a lack of accountability. Ethical conduct displayed by members of the public sector is integral to creating a sustainable democratic government, which upholds the constitutional tenets of accountability, transparency and professional ethicality. Furthermore, a true constitutional democracy emphasises and advocates the notion of service leadership that nurtures public participation and engages with citizens in a positive manner. Ethical conduct and accountability in the public sector earn public trust; hence these are key principles in good governance. Yet, in the years since the advent of democracy in South Africa, the government has been plagued by rampant corruption and mal-administration by public officials and politicians in leadership positions. The control measures passed by government in an attempt to ensure ethicality and accountability within the public sector include codes of ethics, rules of conduct and the enactment of legislation. These are intended to shape the mindset of members of the public sector, with the ultimate aim of an efficient, effective, ethical, responsive and accountable public service. The purpose of the paper is to analyse control systems and accountability within the public sector and to present reasons for non-accountability by means of a selected case study. The selected case study is the corruption trial of Jackie Selebi, who served as National Commissioner of the South African Police Service but was dismissed from the post. The reasons for non-accountability in the public sector are examined, and recommendations based on the findings are offered to enhance accountability. The case study demonstrates the experience and impact of corruption and/or mal-administration, resulting from a lack of accountability, which has contributed to the increasing loss of confidence in political leadership in the country as elsewhere in the world. The literature is applied to the erstwhile National Commissioner of the South African Police Service and President of Interpol, as a case study of non-accountability.
Keywords: Public sector, public accountability, internal control, oversight mechanisms, non-compliance, corruption, mal-administration.
14288 Sampling Effects on Secondary Voltage Control of Microgrids Based on Network of Multiagent
Authors: M. J. Park, S. H. Lee, C. H. Lee, O. M. Kwon
Abstract:
This paper studies a secondary voltage control framework for microgrids based on consensus over a multiagent communication network. The proposed control is designed for a communication network with one-way links, modeled by a directed graph. The concept of sampling is considered as the communication constraint among the distributed generators in the microgrid. To analyze the sampling effects on the secondary voltage control of the microgrids, a sufficient condition for this problem is established in terms of a linear matrix inequality (LMI) by using Lyapunov theory and some mathematical techniques. Finally, some simulation results are given to illustrate the necessity of considering the sampling effects on the secondary voltage control of the microgrids.
Keywords: Microgrids, secondary control, multiagent, sampling, LMI.
14287 Parallelization of Ensemble Kalman Filter (EnKF) for Oil Reservoirs with Time-lapse Seismic Data
Authors: Md Khairullah, Hai-Xiang Lin, Remus G. Hanea, Arnold W. Heemink
Abstract:
In this paper we describe the design and implementation of a parallel algorithm for data assimilation with the ensemble Kalman filter (EnKF) for the oil reservoir history matching problem. The use of a large number of observations from time-lapse seismic data leads to a long turnaround time for the analysis step, in addition to the time-consuming simulations of the realizations. For efficient parallelization it is therefore important to consider parallel computation at the analysis step. Our experiments show that parallelization of the analysis step, in addition to the forecast step, has good scalability, exploiting the same set of resources with some additional effort.
Keywords: EnKF, Data assimilation, Parallel computing, Parallel efficiency.
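For context, the analysis step whose parallelization is discussed above can be written in its standard ensemble form as below; this is standard EnKF notation, not an equation reproduced from the paper.

```latex
% EnKF analysis: each forecast ensemble member x_j^f is updated with perturbed
% observations d_j; P^f is the forecast error covariance estimated from the
% ensemble, H the observation operator and R the observation error covariance.
% With many time-lapse seismic observations, forming and applying K dominates
% the cost, which is what motivates parallelizing this step.
x_j^{a} = x_j^{f} + K\left(d_j - H x_j^{f}\right),
\qquad
K = P^{f}H^{\mathsf T}\left(H P^{f}H^{\mathsf T} + R\right)^{-1}
```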
14286 Air Handling Units Power Consumption Using Generalized Additive Model for Anomaly Detection: A Case Study in a Singapore Campus
Authors: Ju Peng Poh, Jun Yu Charles Lee, Jonathan Chew Hoe Khoo
Abstract:
The emergence of digital twin technology, a digital replica of the physical world, has improved real-time access to data from sensors about the performance of buildings. This digital transformation has opened up many opportunities to improve the management of a building by using the collected data to help monitor consumption patterns and energy leakages. One example is the integration of predictive models for anomaly detection. In this paper, we use the GAM (Generalised Additive Model) for anomaly detection of the Air Handling Unit (AHU) power consumption pattern. There is ample research on the use of GAM for the prediction of power consumption at the office building and nation-wide level. However, there is limited illustration of its anomaly detection capabilities, of prescriptive analytics case studies, and of its integration with the latest developments in digital twin technology. In this paper, we applied the general GAM modelling framework to the historical data of the AHU power consumption and cooling load of a building between Jan 2018 and Aug 2019 from an education campus in Singapore to train prediction models that, in turn, yield predicted values and ranges. The historical data are seamlessly extracted from the digital twin for modelling purposes. We enhanced the utility of the GAM model by using it to power a real-time anomaly detection system based on the forward predicted ranges. The magnitude of deviation from the upper and lower bounds of the uncertainty intervals is used to inform and identify anomalous data points, all based on historical data, without explicit intervention from domain experts. Notwithstanding, the domain expert fits in through an optional feedback loop through which iterative data cleansing is performed. After an anomalously high or low level of power consumption is detected, a set of rule-based conditions is evaluated in real time to help determine the next course of action for the facilities manager. The performance of GAM is then compared with other approaches to evaluate its effectiveness. Lastly, we discuss the successful deployment of this approach for the detection of anomalous power consumption patterns, illustrated with real-world use cases.
Keywords: Anomaly detection, digital twin, Generalised Additive Model, Power Consumption Model.
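A minimal sketch of the prediction-interval-based flagging described above, assuming the pygam package and a simple feature set (cooling load and hour of day), might look as follows; the interval width, the synthetic data and the feature choice are illustrative assumptions, not details from the case study.

```python
# Sketch only: GAM-based anomaly flagging for AHU power consumption, assuming
# the pygam package; features and the 95% interval width are illustrative.
import numpy as np
from pygam import LinearGAM, s

# X: columns [cooling_load, hour_of_day]; y: AHU power consumption (synthetic here).
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(50, 300, 1000), rng.integers(0, 24, 1000)])
y = 0.1 * X[:, 0] + 2 * np.sin(X[:, 1] / 24 * 2 * np.pi) + rng.normal(0, 1, 1000)

gam = LinearGAM(s(0) + s(1)).fit(X, y)

# Points falling outside the forward-predicted interval are flagged as anomalies.
lower, upper = gam.prediction_intervals(X, width=0.95).T
anomalous = (y < lower) | (y > upper)
print(int(anomalous.sum()), "suspected anomalies")
```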
14285 Application of Model Free Adaptive Control in Main Steam Temperature System of Thermal Power Plant
Authors: Khaing Yadana Swe, Lillie Dewan
Abstract:
At present, cascade PID control is widely used to control the superheating temperature (main steam temperature). As the main steam temperature has characteristics such as large inertia, large time delay and time-varying behaviour, a conventional PID control strategy cannot achieve good control performance. In order to overcome the poor performance and deficiencies of the main steam temperature control system, a Model Free Adaptive Control (MFAC)-P cascade control system is proposed in this paper. By substituting MFAC for PID in the main control loop of the main steam temperature control, the system can overcome time delays, non-linearity, disturbances and time variation.
Keywords: Model free Adaptive Control, Cascade Control, Adaptive Control, PID.
14284 A Risk Assessment for the Small Hive Beetle Based on Meteorological Standard Measurements
Authors: J. Junk, M. Eickermann
Abstract:
The Small Hive Beetle, Aethina tumida (Coleoptera: Nitidulidae), is a parasite of honey bee (Apis mellifera) colonies and was recently accidentally introduced to the European continent. Based on the literature, a model was developed that uses regional meteorological variables (daily values of minimum, maximum and mean air temperature as well as mean soil temperature at 50 mm depth) to calculate the time-point of hive invasion by A. tumida in springtime, the development duration of pupae, and the number of generations of A. tumida per year. Luxembourg was used as a test region for our model for the years 2005 to 2013. The model output indicates successful survival of the Small Hive Beetle in Luxembourg, with two to three generations per year. Additionally, based on our meteorological data sets, a first migration of SHB to apiaries can be expected from mid-March to April. Our approach can be transferred easily to other countries to estimate the risk potential for a successful introduction and spread of A. tumida in Western Europe.
Keywords: Aethina tumida, air temperature, larval development, soil temperature.
14283 Optimal Manufacturing Scheduling for Dependent Details Processing
Authors: Ivan C. Mustakerov, Daniela I. Borissova
Abstract:
The increasing competitiveness in the manufacturing industry is forcing manufacturers to seek effective processing schedules. The paper presents an optimization approach to manufacturing scheduling for dependent details processing with given processing sequences and times on multiple machines. By defining decision variables as the start and end moments of details processing, it is possible to use straightforward variable restrictions to satisfy different technological requirements and to formulate optimization tasks that are easy to understand and solve for multiple details and machines. A case study example is solved for seven base moldings for CNC metalworking machines processed on five different machines with a given processing order among details and machines and known processing time durations. As a result of solving the linear optimization task, the optimal manufacturing schedule minimizing the overall processing time is obtained. The manufacturing schedule defines the moments of molding delivery, thus minimizing storage costs, and ensures that mounting due times are met. The proposed optimization approach is based on a real manufacturing plant problem. Different processing schedule variants for different technological restrictions were defined and implemented in practice at the Bulgarian company RAIS Ltd. The proposed approach could be generalized to other job shop scheduling problems for different applications.
Keywords: Optimal manufacturing scheduling, linear programming, metalworking machines production, dependent details processing.
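As a hedged illustration of the formulation described above (start and end moments as decision variables, precedence between operations, minimization of the overall processing time), a toy two-detail, two-machine linear program using the PuLP modeller is sketched below; the processing times and the fixed processing order are invented for illustration, not the RAIS Ltd. case data.

```python
# Toy sketch of the start/end-moment formulation; times and orders are invented.
import pulp

ops = {  # (detail, machine): processing time
    ("d1", "m1"): 3, ("d1", "m2"): 2,
    ("d2", "m1"): 2, ("d2", "m2"): 4,
}

prob = pulp.LpProblem("schedule", pulp.LpMinimize)
start = {k: pulp.LpVariable(f"start_{k[0]}_{k[1]}", lowBound=0) for k in ops}
makespan = pulp.LpVariable("makespan", lowBound=0)

for (d, m), t in ops.items():
    prob += start[(d, m)] + t <= makespan          # every operation ends before the makespan

for d in ("d1", "d2"):                             # technological order: m1 before m2
    prob += start[(d, "m1")] + ops[(d, "m1")] <= start[(d, "m2")]

for m in ("m1", "m2"):                             # fixed processing order on each machine: d1 before d2
    prob += start[("d1", m)] + ops[("d1", m)] <= start[("d2", m)]

prob += makespan                                   # objective: minimize overall processing time
prob.solve()
print(pulp.value(makespan), {k: v.value() for k, v in start.items()})
```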
14282 An Optimized Multi-block Method for Turbulent Flows
Authors: M. Goodarzi, P. Lashgari
Abstract:
In many turbulent flows, a major part of the flow field involves no complicated turbulent behavior. In this research work, in order to reduce the required memory and CPU time, the flow field was decomposed into several blocks, each block with its own turbulence treatment. A two-dimensional backward facing step was considered here. Four combinations of the Prandtl mixing length and standard k-ε models were implemented as well. Computer memory and CPU time consumption, in addition to numerical convergence and accuracy of the obtained results, were mainly investigated. Observations showed that a suitable combination of turbulence models in different blocks led to results with the same accuracy as using the higher order turbulence model for all of the blocks, in addition to reductions in memory and CPU time consumption.
Keywords: Computer memory, CPU time, multi-block method, turbulence modeling.
14281 Combined Effect of Heat Stimulation and Delayed Addition of Superplasticizer with Slag on Fresh and Hardened Property of Mortar
Authors: Faraidoon Rahmanzai, Mizuki Takigawa, Yu Bomura, Shigeyuki Date
Abstract:
To obtain high quality and the essential workability of mortar, different types of superplasticizers are used. Superplasticizers are chemical admixtures added to the mix to improve the fluidity of mortar. Many factors influence how well a superplasticizer disperses the cement particles in the mortar. The nature and amount of cement replaced by slag, the mixing procedure, the delayed addition time, and the heat stimulation technique of the superplasticizer all have varying effects on the fluidity of the cementitious material. In this experiment, the superplasticizers were heated for 1 hour at 60 °C in a thermostatic chamber. Furthermore, the effect of the delayed addition time of the heat-stimulated superplasticizers (SP) was also analyzed. This method was applied to two types of polycarboxylic acid based ether SP (a precast type superplasticizer (SP2) and a ready-mix type superplasticizer (SP1)) in combination with a partial replacement of normal Portland cement with blast furnace slag (BFS) at a 30% w/c ratio. The fluidity, air content, fresh density, and compressive strength at 7 and 28 days were studied. The results indicate that the addition time and heat stimulation technique improved the flow and air content, decreased the density, and slightly decreased the compressive strength of the mortar. Moreover, increasing the amount of slag improved the flow of the mortar, while the effect of the external temperature of the SP on the flow of the mortar decreased. In comparison, the flow of the mortar was improved with a 5-minute delay for both kinds of SP, but SP1 improved the flow under all conditions. Most importantly, the transition points for both types of SP appear to be the same, at about 5±1 min. Accordingly, the optimum addition time of SP to mortar should be within this period.
Keywords: Combined effect, delayed addition, heat stimulation, flow of mortar.