Search results for: moving average process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6888

6708 Evaluation of Fluoride Contents of Kirkuk City's Drinking Water and Its Source: Lesser Zab River and Its Effect on Human Health

Authors: Abbas R. Ali, Safa H. Abdulrahman

Abstract:

In this study, forty samples were collected from the Lesser Zab River and from drinking water to determine fluoride concentration and to show the impact of fluoride on the general health of the population of Kirkuk city. Fluoride concentrations in the water samples were measured carefully using a fluoride ion-selective electrode. The fluoride concentrations in the Lesser Zab River samples were between 0.0265 ppm and 0.0863 ppm with an average of 0.0451 ppm, whereas the average fluoride concentration in the drinking water samples was 0.102 ppm, ranging from 0.010 to 0.289 ppm. A comparison of these results with World Health Organization (WHO) guidelines shows a low fluoride concentration in the study samples. Thus, for health reasons, the concentration of this ion in the water of Kirkuk city should be raised to about 1.0 ppm, which can be achieved through a fluoridation process.

Keywords: Fluoride concentration, Lesser Zab River, drinking water, health society, Kirkuk city.

6707 Synthetic Daily Flow Duration Curves for the Çoruh River Basin, Turkey

Authors: Fatih Tosunoğlu, İbrahim Can

Abstract:

The flow duration curve (FDC) is an informative method that represents the flow regime's properties for a river basin. Therefore, the FDC is widely used in water resource projects such as hydropower, water supply, irrigation and water quality management. The primary purpose of this study is to obtain synthetic daily flow duration curves for the Çoruh Basin, Turkey. To this end, we first developed univariate autoregressive moving average (ARMA) models for the daily flows of 9 stations located in the Çoruh basin, and these models were then used to generate 100 synthetic flow series, each having the same length as the historical series. Secondly, the flow duration curve of each synthetic series was drawn, and the flows exceeded 10, 50 and 95% of the time, together with the 95% confidence limits of these flows, were calculated. As a result, the flood, mean and low flow potential of the Çoruh basin is comprehensively represented.
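The modelling chain described above (fit an ARMA model to daily flows, generate synthetic series, read exceedance flows off the flow duration curve) can be sketched as follows. This is a minimal illustration, not the authors' code: the placeholder flow series, the ARMA(1,1) order and the replicate count are assumptions.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
flows = np.abs(rng.normal(50, 15, 3650))            # placeholder daily flows (m^3/s)

res = ARIMA(flows, order=(1, 0, 1)).fit()            # ARMA(1,1) = ARIMA(1,0,1)

def exceedance_flows(series, probs=(10, 50, 95)):
    """Flow exceeded p% of the time, read off the empirical flow duration curve."""
    return {p: np.percentile(series, 100 - p) for p in probs}

# 100 synthetic series, each the same length as the historical record
synthetic = [res.simulate(nsimulations=len(flows)) for _ in range(100)]

for p in (10, 50, 95):
    q = [exceedance_flows(s)[p] for s in synthetic]
    lo, hi = np.percentile(q, [2.5, 97.5])           # 95% confidence limits
    print(f"Q{p}: mean = {np.mean(q):.1f}, 95% CI = ({lo:.1f}, {hi:.1f})")
```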

Keywords: ARMA models, Çoruh basin, flow duration curve, Turkey.

6706 Creation and Annihilation of Spacetime Elements

Authors: Dnyanesh P. Mathur, Gregory L. Slater

Abstract:

Gravitation and the expansion of the universe at a large scale are generally regarded as two completely distinct phenomena. Yet, in the General Theory of Relativity (GR), they both manifest as 'curvature' of spacetime. We propose a hypothesis which treats these two 'curvature-producing' phenomena as aspects of an underlying process. This process treats spacetime itself as composed of discrete units (Plancktons) and is 'dynamic' in the sense that these elements of spacetime are continually being both created and annihilated. It is these two complementary processes of Planckton creation and Planckton annihilation which manifest themselves as 'cosmic expansion' on the one hand and as 'gravitational attraction' on the other. The Planckton hypothesis treats spacetime as a perfect fluid in the same manner as the co-moving frame of reference of the Friedmann equations and the Gullstrand-Painlevé metric; i.e., the Planckton hypothesis replaces 'curvature' of spacetime by the 'flow' of Plancktons (spacetime). Here we discuss how this perspective may allow a unified description of both cosmological and gravitational acceleration as well as providing a mechanism for inducing an irreducible action at every point associated with the creation and annihilation of Plancktons, which could be identified as the zero-point energy.

Keywords: Discrete spacetime, spacetime flow, zero point energy, dark energy.

6705 A Video-based Algorithm for Moving Objects Detection at Signalized Intersection

Authors: Juan Li, Chunfu Shao, Chunjiao Dong, Dan Zhao, Yinhong Liu

Abstract:

Mixed-traffic data (e.g., pedestrians, bicycles, and vehicles) at an intersection are among the essential inputs for intersection design and traffic control. However, some data, such as pedestrian volume, cannot be directly collected by common detectors (e.g., inductive loop, sonar and microwave sensors). In this paper, a video-based detection algorithm is proposed for mixed-traffic data collection at intersections using surveillance cameras. The algorithm is derived from the Gaussian Mixture Model (GMM) and uses a mergence time adjustment scheme to improve the traditional algorithm. Real-world video data were selected to test the algorithm. The results show that the proposed algorithm has faster processing speed and higher accuracy than the traditional algorithm. This indicates that the improved algorithm can be applied to detect mixed traffic at signalized intersections, even when conflicts occur.
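A minimal sketch of the underlying GMM background-subtraction step, using OpenCV's MOG2 implementation; this is not the authors' algorithm (their mergence time adjustment scheme is not reproduced), and the video path and thresholds are placeholders.

```python
import cv2

cap = cv2.VideoCapture("intersection.avi")           # hypothetical surveillance clip
mog2 = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                          detectShadows=True)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = mog2.apply(frame)                          # per-pixel GMM foreground mask
    mask = cv2.medianBlur(mask, 5)                    # suppress speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 200:                  # ignore tiny blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) == 27:                          # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```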

Keywords: detection, intersection, mixed traffic, moving objects.

6704 A Study on Algorithm Fusion for Recognition and Tracking of Moving Robot

Authors: Jungho Choi, Youngwan Cho

Abstract:

This paper presents an algorithm for the recognition and tracking of moving objects; a 1/10-scale model car is used to verify the performance of the algorithm. The presented algorithm merges the SURF algorithm with the Lucas-Kanade algorithm. SURF is robust to changes in contrast, size and rotation and can recognize objects, but it is slow because of its high computational complexity. The Lucas-Kanade algorithm is fast, but it cannot recognize objects on its own; its optical flow compares the previous and current frames so that the movement of a pixel can be tracked. The fusion algorithm therefore uses a Kalman filter to complement the problems that arise when the two algorithms are combined: the Kalman filter estimates the next location and compensates for the accumulated error. The resolution of the camera (vision sensor) is fixed at 640x480. To verify the performance of the fusion algorithm, it is compared with the SURF algorithm in three situations: driving straight, driving along a curve, and recognizing cars behind obstacles. Situations similar to actual driving are reproduced with the model vehicle. The proposed fusion algorithm showed better performance and accuracy than existing object recognition and tracking algorithms. Future work will improve the performance of the algorithm so that it can be tested on images of actual road environments.
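The detect-then-track idea can be sketched roughly as follows; this is not the authors' code. ORB stands in for SURF (SURF requires the opencv-contrib build), the Lucas-Kanade tracker follows the detected keypoints, and a constant-velocity Kalman filter smooths the object centroid. The video path and all constants are placeholders.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("model_car.avi")               # placeholder video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
kps = cv2.ORB_create(500).detect(prev_gray, None)      # ORB stands in for SURF here
pts = np.float32([k.pt for k in kps]).reshape(-1, 1, 2)

kf = cv2.KalmanFilter(4, 2)                            # state: x, y, vx, vy
kf.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                                [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.errorCovPost = np.eye(4, dtype=np.float32)          # start from an uncertain state

while True:
    ok, frame = cap.read()
    if not ok or len(pts) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    pts = new_pts[status.flatten() == 1].reshape(-1, 1, 2)   # keep successfully tracked points
    if len(pts):
        cx, cy = pts.reshape(-1, 2).mean(axis=0)       # tracked-object centroid
        kf.predict()                                    # estimate the next location
        est = kf.correct(np.array([[cx], [cy]], np.float32))
        cv2.circle(frame, (int(est[0, 0]), int(est[1, 0])), 6, (0, 0, 255), -1)
    prev_gray = gray
cap.release()
```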

Keywords: SURF, Optical Flow Lucas-Kanade, Kalman Filter, object recognition, object tracking.

6703 Some Characteristics of Biodegradable Film Substituted by Yam (Dioscorea alata) Starch from Thailand

Authors: Orose Rugchati, Khumthong Mahawongwiriya, Kanita Thanacharoenchanaphas

Abstract:

Yam starch obtained from the water yam (munlued) by a wet milling process was studied for some physicochemical properties. Yam starch film was prepared by casting, using glycerol as a plasticizer. The effects of different glycerol concentrations (1.30, 1.65 and 2.00 g/100 g of filmogenic solution) and starch concentrations (3.30, 3.65 and 4.00 g/100 g of filmogenic solution) on some characteristics of the film were evaluated. The temperature for obtaining the gelatinized starch solution was 70-80°C, and the films were then dried at 45°C for 4 hours. The granular morphology of the resulting munlued starch was triangular, and the average granule size was 26.68 μm. The amylose content determined by a colorimetric method was 26%, and the gelatinization temperature was 70-80°C. The appearance of the film was smooth, transparent and glossy, with an average moisture content of 25.96% and a thickness of 0.01 mm. Puncture deformation and flexibility increased with glycerol content. The starch and glycerol concentrations were significant factors in the characteristics of the yam starch film. Yam starch film can be described as a biofilm with many applications and development possibilities, with the advantage of biodegradability.

Keywords: Characteristics of Biodegradable film, yam starch, Dioscorea alata, substitute, Thailand.

6702 Time Series Forecasting Using a Hybrid RBF Neural Network and AR Model Based On Binomial Smoothing

Authors: Fengxia Zheng, Shouming Zhong

Abstract:

The ANN-ARIMA approach, which combines an autoregressive integrated moving average (ARIMA) model with an artificial neural network (ANN) model, is a valuable tool for modeling and forecasting nonlinear time series, yet the over-fitting problem is more likely to occur in neural network models. This paper provides a hybrid methodology, called BS-RBFAR, that combines a radial basis function (RBF) neural network and an autoregressive (AR) model based on the binomial smoothing (BS) technique, which is efficient in data processing. The method is examined using the Canadian lynx data. Empirical results indicate that the over-fitting problem can be eased using the RBF neural network based on binomial smoothing (BS-RBF), and that the hybrid model BS-RBFAR can be an effective way to improve the forecasting accuracy achieved by BS-RBF used separately.
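A minimal sketch of the hybrid idea under stated assumptions, not the authors' BS-RBFAR code: the series is binomially smoothed, an RBF model (kernel ridge regression with an RBF kernel standing in for an RBF network) fits the lagged values, and an AR model fits the residuals; the two fits are then added. The data below are placeholders for the lynx series, and all hyper-parameters are illustrative.

```python
import numpy as np
from scipy.special import comb
from sklearn.kernel_ridge import KernelRidge
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(1)
y = 3 + np.sin(np.arange(114) / 5.0) + 0.3 * rng.standard_normal(114)   # stand-in series

def binomial_smooth(x, order=4):
    """Binomial smoothing: convolution with normalized binomial coefficients."""
    w = np.array([comb(order, k) for k in range(order + 1)], dtype=float)
    return np.convolve(x, w / w.sum(), mode="same")

ys = binomial_smooth(y)

p = 4                                                  # lag order (assumption)
X = np.column_stack([ys[i:len(ys) - p + i] for i in range(p)])
t = ys[p:]

rbf = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5).fit(X, t)
resid = t - rbf.predict(X)                             # what the RBF part missed

ar = AutoReg(resid, lags=2).fit()                      # linear AR model on the residuals
hybrid_fit = rbf.predict(X)[2:] + ar.fittedvalues      # the AR fit drops its 2 leading lags
print("in-sample RMSE:", round(float(np.sqrt(np.mean((t[2:] - hybrid_fit) ** 2))), 4))
```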

Keywords: Binomial smoothing (BS), hybrid, Canadian Lynx data, forecasting accuracy.

6701 Unsupervised Segmentation Technique for Acute Leukemia Cells Using Clustering Algorithms

Authors: N. H. Harun, A. S. Abdul Nasir, M. Y. Mashor, R. Hassan

Abstract:

Leukaemia is a blood cancer that contributes to the increasing mortality rate in Malaysia each year. There are two main categories of leukaemia: acute and chronic leukaemia. The production and development of acute leukaemia cells occur rapidly and uncontrollably. Therefore, if acute leukaemia cells could be identified quickly and effectively, proper treatment and medicine could be delivered. Due to the requirement of prompt and accurate diagnosis of leukaemia, the current study proposes unsupervised pixel segmentation based on clustering algorithms in order to obtain a fully segmented abnormal white blood cell (blast) in an acute leukaemia image. To obtain the segmented blast, three clustering algorithms, namely k-means, fuzzy c-means and moving k-means, were applied to the saturation component image. Then, a median filter and a seeded region growing area extraction algorithm were applied to smooth the region of the segmented blast and to remove large unwanted regions from the image, respectively. Comparisons among the three clustering algorithms were made to measure the performance of each algorithm in segmenting the blast area. Based on the good sensitivity values obtained, the results indicate that the moving k-means clustering algorithm successfully produced a fully segmented blast region in the acute leukaemia images. Hence, the resultant images could be helpful to haematologists for further analysis of acute leukaemia.
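A minimal sketch of the clustering step on the saturation component, not the authors' moving k-means pipeline: plain k-means and a median filter are shown, the image path and cluster count are placeholders, the "blast = most saturated cluster" rule is an assumption, and seeded region growing is omitted.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

img = cv2.imread("blood_smear.png")                    # hypothetical acute leukaemia image
sat = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)[:, :, 1]    # saturation component

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(sat.reshape(-1, 1))
labels = km.labels_.reshape(sat.shape)

blast_cluster = int(np.argmax(km.cluster_centers_.ravel()))   # most saturated cluster
mask = ((labels == blast_cluster) * 255).astype(np.uint8)
mask = cv2.medianBlur(mask, 7)                         # smooth the segmented blast region

cv2.imwrite("blast_mask.png", mask)
```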

Keywords: Acute Leukaemia Images, Clustering Algorithms, Image Segmentation, Moving k-Means.

6700 Requirement Engineering and Software Product Line Scoping Paradigm

Authors: Ahmed Mateen, Zhu Qingsheng, Faisal Shahzad

Abstract:

Requirements Engineering (RE) is the part of the software development lifecycle in which the structure of the software is specified. Software product line development is a relatively new topic area within the domain of software engineering. It also plays an important role in decision making and is ultimately helpful in a growing business environment for productive software development. Decisions are central to engineering processes and hold them together. It is argued that better decisions will lead to better engineering. Achieving better decisions requires that they are understood in detail. To address these issues, companies are moving towards Software Product Line Engineering (SPLE), which helps provide a large variety of products with minimum development effort and cost. This paper proposes a new framework for software product lines and compares it with other models. The results can help to understand the needs in SPL testing by identifying points that still require additional investigation. In a future scenario, we will combine this model in a controlled environment with industrial SPL projects, which will be a new horizon for SPL process management testing strategies.

Keywords: Requirements engineering, software product lines, scoping, process structure, domain specific language.

6699 Developing a Research Framework for Investigating the Transparency of ePortfolios

Authors: Elizabeth Downs, Judith Repman, Kenneth Clark

Abstract:

This paper describes the evolution of strategies to evaluate ePortfolios in an online Master's of Education (M.Ed.) degree in Instructional Technology. The ePortfolios are required as a culminating activity for students in the program. By using Web 2.0 tools to develop the ePortfolios, students are able to showcase their technical skills, integrate national standards, demonstrate their professional understandings, and reflect on their individual learning. Faculty have created assessment strategies to evaluate student achievement of these skills. To further develop ePortfolios as a tool promoting authentic learning, faculty are moving toward integrating transparency as part of the evaluation process.

Keywords: e-learning evaluation, ePortfolios, transparency, Web 2.0

6698 Intelligent Swarm-Finding in Formation Control of Multi-Robots to Track a Moving Target

Authors: Anh Duc Dang, Joachim Horn

Abstract:

This paper presents a new approach to controlling robots so that they can quickly find their swarm while tracking a moving target through the obstacles of the environment. In this approach, an artificial potential field is generated between each free robot and the virtual attractive point of the swarm. This artificial potential field leads free robots to their swarm. The swarm-finding of these free robots does not influence the general motion of the swarm or of the other robots. When a single robot reaches the swarm, its swarm search finishes, and it then participates with its swarm in reaching the position of the target. The connections between member robots and their neighbors are controlled by the artificial attractive/repulsive force field between them, to avoid collisions and to keep constant distances between them in an ordered formation. The effectiveness of the proposed approach has been verified in simulations.
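A minimal sketch of the artificial potential field idea, not the authors' controller: each robot is attracted to the swarm's virtual attractive point and repelled by neighbours closer than a safety distance. Gains, radii, time step and the attractive point are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
pos = rng.uniform(0, 10, size=(6, 2))                  # 6 robots in a 2-D plane
attract_pt = np.array([20.0, 20.0])                    # virtual attractive point of the swarm
k_att, k_rep, d_safe, dt = 1.0, 5.0, 1.5, 0.02

for _ in range(1000):
    force = k_att * (attract_pt - pos)                 # attraction toward the swarm
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            d = pos[i] - pos[j]
            dist = np.linalg.norm(d)
            if dist < d_safe:                          # repulsion keeps formation spacing
                force[i] += k_rep * (1.0 / dist - 1.0 / d_safe) * d / dist**3
    pos += dt * force                                  # first-order kinematics

print("final positions:\n", np.round(pos, 2))
```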

Keywords: Formation control, potential field method, obstacle avoidance, swarm intelligence, multi-agent systems.

6697 Analytical Modelling of Average Bond Stress within the Anchorage of Tensile Reinforcing Bars in Reinforced Concrete Members

Authors: Maruful H. Mazumder, Raymond I. Gilbert, Zhen-T. Chang

Abstract:

A reliable estimate of the average bond stress within the anchorage of steel reinforcing bars in tension is critically important for the design of reinforced concrete members. This paper describes part of a recently completed experimental research program at the Centre for Infrastructure Engineering and Safety (CIES) at the University of New South Wales, Sydney, Australia, aimed at assessing the effects of different factors on the anchorage requirements of modern high-strength steel reinforcing bars. The study found that an increase in anchorage length and bar diameter generally leads to a reduction in the average ultimate bond stress. By extending a well-established analytical model of bond and anchorage, it is shown here that the differences in the average ultimate bond stress for different anchorage lengths are associated with the variable degree of plastic deformation in the tensile zone of the concrete surrounding the bar.
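For reference, the average bond stress over an anchorage is conventionally the bar force divided by the bar-concrete interface area; the short example below uses this standard definition with illustrative numbers and is not the authors' refined analytical model.

```python
import math

d_b = 0.020        # bar diameter (m), illustrative
L_d = 0.300        # anchorage (development) length (m), illustrative
f_s = 500e6        # steel stress at the loaded end (Pa), illustrative

bar_force = f_s * math.pi * d_b**2 / 4.0               # tensile force in the bar (N)
tau_avg = bar_force / (math.pi * d_b * L_d)            # average bond stress over the anchorage (Pa)
print(f"average bond stress = {tau_avg / 1e6:.1f} MPa")
# For the same bar force, a longer anchorage length gives a lower average bond stress.
```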

Keywords: Anchorage, Bond stress, Development length, Reinforced concrete.

6696 MCDM Spectrum Handover Models for Cognitive Wireless Networks

Authors: Cesar Hernández, Diego Giral, Fernando Santa

Abstract:

Spectrum handover is a significant topic in cognitive radio networks for ensuring efficient data transmission in cognitive radio users' communications. This paper proposes a comparison between three spectrum handover models: VIKOR, SAW and MEW. Four evaluation metrics are used: the cumulative average of failed handovers, the cumulative average of handovers performed, the cumulative average of transmission bandwidth, and the cumulative average of transmission delay. In contrast to related work, the performance of the three spectrum handover models was validated with captured spectrum occupancy data from experiments performed in the GSM frequency band (824 MHz - 849 MHz). These data represent the actual behavior of the licensed users of this wireless frequency band. The results of the comparison show that the VIKOR algorithm provides a 15.8% performance improvement compared to the SAW algorithm and is 12.1% better than the MEW algorithm.
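As an illustration of one of the compared decision models, a minimal SAW (simple additive weighting) scoring of candidate channels is sketched below; the criteria, weights and data are assumptions, not the paper's evaluation set-up.

```python
import numpy as np

# rows: candidate channels; columns: bandwidth (benefit), delay (cost), availability (benefit)
X = np.array([[2.0, 40.0, 0.90],
              [1.2, 15.0, 0.95],
              [1.8, 25.0, 0.80]])
weights = np.array([0.4, 0.3, 0.3])                    # assumed criterion weights
benefit = np.array([True, False, True])                # False marks a cost criterion

# SAW normalization: x/max for benefit criteria, min/x for cost criteria
norm = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
scores = norm @ weights
print("SAW scores:", np.round(scores, 3), "-> best channel:", int(scores.argmax()))
```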

Keywords: Cognitive radio, decision making, MEW, SAW, spectrum handover, VIKOR.

6695 Motions of Multiple Objects Detection Based On Video Frames

Authors: Khin Thandar Lwin, Than Htike, Zaw Min Naing

Abstract:

This paper introduces an intelligent system which can be applied to monitoring vehicle speed using a single camera. The ability to track motion is extremely useful in many automation problems, and the solution to this problem will open up many future applications. One of the most common problems in our daily life is the speed detection of vehicles on a highway. In this paper, a novel technique is developed to track multiple moving objects and estimate their speeds using a sequence of video frames. A field test was conducted to capture real-life data, and the processed results are presented. Multiple-object scenarios and noisy data are also considered. Implementing this system in real time is straightforward. The proposal can accurately evaluate the position and orientation of moving objects in real time. The transformations and calibration between the 2D image and the actual road are also considered.
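A minimal sketch of the speed-estimation step, not the authors' system: tracked image positions are mapped to road-plane coordinates through a calibration homography, and speed is displacement divided by the frame interval. The homography, frame rate and pixel positions below are placeholders.

```python
import numpy as np

H = np.array([[0.02, 0.0, -5.0],                       # hypothetical image-to-road homography
              [0.0, 0.05, -8.0],                       # (pixels -> metres on the road plane)
              [0.0, 0.0, 1.0]])
fps = 25.0

def to_road(pt):
    """Project an image point to road-plane coordinates with the homography."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

p_prev, p_curr = (320, 400), (326, 388)                # tracked centroid in two consecutive frames
dist_m = np.linalg.norm(to_road(p_curr) - to_road(p_prev))
speed_kmh = dist_m * fps * 3.6
print(f"estimated speed: {speed_kmh:.1f} km/h")
```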

Keywords: Motion Estimation, Image Analyses, Speed Detection

6694 The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components

Authors: Alonggot Limcharoen, Jintana Wannarat, Vorawat Panich

Abstract:

This study aims to investigate balancing the number of operators (line balancing technique) in the production line of hard disk drive components in order to increase efficiency. At present, the use of hard disk drives has continuously declined, limiting the company's revenue potential. It is therefore important to improve and develop the production process to build market share and to compete with competitors on value and quality, and an effective tool is needed to support this. In this research, the Arena program was applied to analyze the results both before and after the improvement, and the validated model was used before proceeding with the real process. There were 14 work stations with 35 operators altogether in the RA production process where this study was conducted. In the actual process, the average production time was 84.03 seconds per piece (by timing 30 cycles at each work station), together with a rating assessment following the Westinghouse principles; the rating was 123%, and a 5% allowance was assumed. Consequently, the standard time was 108.53 seconds per piece. The takt time, calculated as the available working time in one day divided by customer demand, was 3.66 seconds per piece. From these values, the proper number of operators was 30, meaning that five operators should be removed to improve the efficiency of the production process. After that, a production model of the actual process was created in the Arena program to confirm model reliability; the simulation outputs were compared with the actual process, and their agreement indicated that the model was reliable. Then, the worker numbers and their job responsibilities were remodeled in the Arena program. Finally, the efficiency of the production process increased from 70.82% to 82.63%, in line with the target.
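The standard-time, takt-time and operator-count arithmetic reported above can be reproduced as follows; the daily demand and available working time are back-calculated assumptions chosen only so that the takt time matches 3.66 seconds.

```python
import math

observed_time = 84.03            # s/piece, average of the time study
rating = 1.23                    # Westinghouse performance rating (123%)
allowance = 0.05                 # 5% allowance

standard_time = observed_time * rating * (1 + allowance)
print(f"standard time = {standard_time:.2f} s/piece")          # ~108.53

available_time = 7.5 * 3600      # s of working time per day (assumption)
daily_demand = 7377              # pieces/day (assumption giving a ~3.66 s takt)
takt = available_time / daily_demand
print(f"takt time = {takt:.2f} s/piece")

operators = math.ceil(standard_time / takt)
print(f"minimum operators = {operators}")                       # ~30 of the 35
```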

Keywords: Hard disk drive, line balancing, simulation, Arena program.

6693 Efficiency Improvements of GaAs-based Solar Cells by Hydrothermally-deposited ZnO Nanostructure Array

Authors: Chun-Yuan Huang, Chiao-Yang Cheng, Chun-Yem Huang, Yan-Kuin Su, James Chin-Lung Fang

Abstract:

ZnO nanostructures including nanowires, nanorods, and nanoneedles were successfully deposited on GaAs substrates by a simple two-step chemical method for the first time. A ZnO seed layer was first pre-coated on the O2-plasma-treated substrate by a sol-gel process, followed by the nucleation of ZnO nanostructures through hydrothermal synthesis. Nanostructures with different average diameters (15-250 nm), lengths (0.9-1.8 μm) and densities (0.9-16×10⁹ cm⁻²) were obtained by adjusting the growth time and the concentration of the precursors. From the reflectivity spectra, we concluded that ordered and tapered nanostructures are preferable for photovoltaic applications. ZnO nanoneedles with an average diameter of 106 nm, a moderate length of 2.4 μm, and a density of 7.2×10⁹ cm⁻² could be synthesized at a concentration of 0.04 M for 18 h. Integrated with the nanoneedle array, the power conversion efficiency of a single-junction solar cell was increased from 7.3 to 12.2%, corresponding to a 67% improvement.

Keywords: Anti-reflection, Chemical synthesis, Solar cells, ZnO nanostructures.

6692 Rail Degradation Modelling Using ARMAX: A Case Study Applied to Melbourne Tram System

Authors: M. Karimpour, N. Elkhoury, L. Hitihamillage, S. Moridpour, R. Hesami

Abstract:

There is a need among rail transportation authorities for a better understanding of rail track degradation over time and of the factors influencing rail degradation. They need an accurate technique to identify when rail tracks fail or need maintenance. In turn, this will help to increase the level of safety and comfort of the passengers and the vehicles, as well as to improve the cost effectiveness of maintenance activities. An accurate model can play a key role in predicting the long-term behaviour of railroad tracks and can decrease the cost of maintenance. In this research, rail track degradation is predicted using an autoregressive moving average model with exogenous input (ARMAX). An ARMAX model has been implemented on Melbourne tram data to estimate the values of tram track degradation. Gauge values and rail usage in million gross tonnes (MGT) are the main parameters used in the model. The developed model can accurately predict the future status of the tram tracks.
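A minimal sketch of an ARMAX fit with cumulative traffic (MGT) as the exogenous input, not the authors' model: SARIMAX with d = 0 gives the ARMAX structure, and the inspection data below are synthetic placeholders.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)
mgt = np.cumsum(rng.uniform(0.8, 1.2, 60))             # cumulative traffic (MGT) per inspection
gauge = 1435 + 0.05 * mgt + rng.normal(0, 0.3, 60)     # track gauge widening with usage (mm)

model = SARIMAX(gauge, exog=mgt, order=(1, 0, 1))       # ARMA(1,1) errors + exogenous MGT
res = model.fit(disp=False)

future_mgt = mgt[-1] + np.cumsum(np.full(12, 1.0))      # assumed traffic for the next 12 periods
forecast = res.forecast(steps=12, exog=future_mgt.reshape(-1, 1))
print(np.round(forecast, 2))
```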

Keywords: ARMAX, Dynamic systems, MGT, Prediction, Rail degradation.

6691 Increasing The Speed of Convergence of an Artificial Neural Network based ARMA Coefficients Determination Technique

Authors: Abiodun M. Aibinu, Momoh J. E. Salami, Amir A. Shafie, Athaur Rahman Najeeb

Abstract:

In this paper, novel techniques for increasing the accuracy and speed of convergence of a feedforward back-propagation artificial neural network (FFBPNN) with polynomial activation function, as reported in the literature, are presented. These techniques were subsequently used to determine the coefficients of autoregressive moving average (ARMA) and autoregressive (AR) systems. The results obtained by introducing sequential and batch methods of weight initialization, a batch method of weight and coefficient update, and adaptive momentum and learning rate techniques give more accurate results and a significant reduction in convergence time compared with the traditional back-propagation algorithm, thereby making the FFBPNN an appropriate technique for online ARMA coefficient determination.
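A heavily simplified sketch of the training ideas named above (batch weight updates, momentum, an adaptive learning rate) applied to AR coefficient determination; for brevity a single linear neuron replaces the multi-layer polynomial-activation network, so this is an illustration of the idea, not the paper's FFBPNN.

```python
import numpy as np

rng = np.random.default_rng(4)
phi_true = np.array([0.6, -0.3])                       # AR(2) coefficients to recover
y = np.zeros(2000)
for t in range(2, len(y)):
    y[t] = phi_true[0] * y[t - 1] + phi_true[1] * y[t - 2] + rng.normal(0, 0.5)

X = np.column_stack([y[1:-1], y[:-2]])                 # lagged inputs y[t-1], y[t-2]
target = y[2:]

w, v = np.zeros(2), np.zeros(2)                         # weights and momentum term
lr, mom, prev_loss = 0.01, 0.9, np.inf
for epoch in range(300):
    err = X @ w - target
    grad = X.T @ err / len(target)                      # batch gradient
    v = mom * v - lr * grad                             # momentum update
    w += v
    loss = float(np.mean(err ** 2))
    lr = min(lr * 1.05, 0.2) if loss < prev_loss else lr * 0.5   # adaptive learning rate
    prev_loss = loss

print("estimated AR coefficients:", np.round(w, 3))     # should approach [0.6, -0.3]
```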

Keywords: Adaptive Learning rate, Adaptive momentum, Autoregressive, Modeling, Neural Network.

6690 RAPD Analysis of Genetic Diversity of Castor Bean

Authors: M. Vivodík, Ž. Balážová, Z. Gálová

Abstract:

The aim of this work was to detect the genetic variability among a set of 40 castor genotypes using 8 RAPD markers. Amplification of the genomic DNA of the 40 genotypes using RAPD analysis yielded 66 fragments, with an average of 8.25 polymorphic fragments per primer. The number of amplified fragments ranged from 3 to 13, with amplicon sizes ranging from 100 to 1200 bp. The polymorphic information content (PIC) values ranged from 0.556 to 0.895 with an average of 0.784, and the diversity index (DI) values ranged from 0.621 to 0.896 with an average of 0.798. A dendrogram based on hierarchical cluster analysis using the UPGMA algorithm was prepared; the analyzed genotypes were grouped into two main clusters, and only two genotypes could not be distinguished. Knowledge of the genetic diversity of castor can be used in future breeding programs for increased oil production for industrial uses.
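A minimal sketch of the marker-scoring arithmetic, not the authors' pipeline: RAPD bands are scored as a 0/1 matrix, a commonly used biallelic PIC formulation is computed per fragment, and a UPGMA dendrogram is built from Jaccard distances. The band matrix is a random placeholder, and the PIC formula here need not match the one used in the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(5)
bands = rng.integers(0, 2, size=(40, 66))              # 40 genotypes x 66 RAPD fragments (0/1)

p = bands.mean(axis=0)                                  # band (allele) frequency per fragment
pic = 1 - (p**2 + (1 - p) ** 2)                         # PIC for a biallelic dominant marker
print("mean PIC per fragment:", round(float(pic.mean()), 3))

dist = pdist(bands.astype(bool), metric="jaccard")      # pairwise genetic distance
tree = linkage(dist, method="average")                  # UPGMA clustering
dn = dendrogram(tree, no_plot=True)
print("genotype order in the dendrogram:", dn["leaves"][:10], "...")
```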

Keywords: Dendrogram, polymorphism, RAPD technique, Ricinus communis L.

6689 Grocery Customer Behavior Analysis using RFID-based Shopping Paths Data

Authors: In-Chul Jung, Young S. Kwon

Abstract:

Understanding customer behavior in a grocery store has been a long-standing issue in the retailing industry. The advent of RFID has made it easier to collect movement data on an individual shopper's behavior. Most of the previous studies used traditional statistical clustering techniques to find the major characteristics of customer behavior, especially the shopping path. However, due to various spatial constraints in the store, standard clustering methods are not feasible, because movement data such as the shopping path must be adjusted in advance of the analysis, which is time-consuming and causes data distortion. To alleviate this problem, we propose a new approach to spatial pattern clustering based on the longest common subsequence. Experimental results using real data obtained from a grocery store confirm the good performance of the proposed method in finding hot spots, dead spots and major path patterns of customer movements.
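The longest-common-subsequence similarity that the proposed clustering is built on can be sketched as follows; the zone labels are illustrative.

```python
def lcs_length(a, b):
    """Classic O(len(a)*len(b)) dynamic program for the LCS length."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

def lcs_similarity(a, b):
    """Normalized LCS similarity in [0, 1]; higher means more similar paths."""
    return lcs_length(a, b) / max(len(a), len(b))

path1 = ["entrance", "produce", "dairy", "bakery", "checkout"]
path2 = ["entrance", "produce", "meat", "dairy", "checkout"]
print(lcs_similarity(path1, path2))                     # 0.8 -> similar shopping paths
```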

Keywords: customer path, shopping behavior, exploratory analysis, LCS, RFID

6688 Energy Communities from Municipality Level to Province Level: A Comparison Using Autoregressive Integrated Moving Average Model

Authors: Amro Issam Hamed Attia Ramadan, Marco Zappatore, Pasquale Balena, Antonella Longo

Abstract:

Considering the energy crisis that is hitting Europe, it becomes increasingly necessary to change energy policies so as to depend less on fossil fuels and replace them with energy from renewable sources. This has triggered the urge to use clean energy, not only to satisfy energy needs and meet the required consumption, but also to decrease the danger of climatic change due to harmful emissions. Many countries have already started creating energy communities based on renewable energy sources. The first step to understanding energy needs in any place is to know the consumption precisely. In this work, we aim to estimate electricity consumption for a municipality in a rural area of southern Italy using forecast models that allow the estimation of electricity consumption for the next 10 years. We then apply the same model to the province where the municipality is located and estimate its future consumption for the same period, to examine whether it is possible to start from the municipality level and reach the province level when creating energy communities.
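A minimal sketch of the forecasting step, not the authors' model: a seasonal ARIMA is fitted to a placeholder monthly consumption series and projected ten years ahead. The model order and data are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
idx = pd.date_range("2012-01", periods=120, freq="MS")
trend = 500 + 0.8 * np.arange(120)
season = 10 * np.sin(2 * np.pi * idx.month / 12)
consumption = pd.Series(trend + season + rng.normal(0, 5, 120), index=idx)  # MWh/month, placeholder

res = ARIMA(consumption, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit()
forecast = res.forecast(steps=120)                      # the next 10 years, monthly
print(forecast.groupby(forecast.index.year).sum().round(1))   # yearly totals
```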

Keywords: ARIMA, electricity consumption, forecasting models, time series.

6687 A Hybrid Model of ARIMA and Multiple Polynomial Regression for Uncertainties Modeling of a Serial Production Line

Authors: Amir Azizi, Amir Yazid b. Ali, Loh Wei Ping, Mohsen Mohammadzadeh

Abstract:

Uncertainties in a serial production line affect the production throughput. These uncertainties cannot be prevented in a real production line; however, the uncertain conditions can be controlled by a robust prediction model. Thus, a hybrid model combining an autoregressive integrated moving average (ARIMA) model and multiple polynomial regression is proposed to model the nonlinear relationship between production uncertainties and throughput. The uncertainties considered in this study are demand, break time, scrap, and lead time. The nonlinear relationship between the production uncertainties and throughput is examined in the form of quadratic and cubic regression models, where the adjusted R-squared values for the quadratic and cubic regressions were 98.3% and 98.2%, respectively. We optimized the multiple quadratic regression (MQR) by considering the time series trend of the uncertainties using the ARIMA model. Finally, the hybrid ARIMA-MQR model is formulated, achieving a better adjusted R-squared of 98.9%.
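A minimal sketch of the hybrid structure under stated assumptions, not the authors' model: a quadratic regression links throughput to the uncertainties, and ARIMA models forecast each uncertainty series so the regression can predict future throughput. All data and model orders are placeholders.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
n = 120
U = np.column_stack([                                   # demand, break time, scrap, lead time
    rng.normal(100, 10, n),
    rng.normal(30, 5, n),
    rng.normal(2, 0.5, n),
    rng.normal(5, 1, n),
])
throughput = 200 - 0.5 * U[:, 1] - 3 * U[:, 2] + 0.002 * U[:, 0] ** 2 + rng.normal(0, 2, n)

poly = PolynomialFeatures(degree=2, include_bias=False)  # multiple quadratic regression (MQR)
mqr = LinearRegression().fit(poly.fit_transform(U), throughput)
print("R^2 of the quadratic fit:", round(mqr.score(poly.transform(U), throughput), 3))

# forecast each uncertainty one step ahead with ARIMA, then predict throughput
next_u = [ARIMA(U[:, j], order=(1, 1, 0)).fit().forecast(1)[0] for j in range(U.shape[1])]
print("predicted next throughput:",
      round(float(mqr.predict(poly.transform(np.array([next_u])))[0]), 1))
```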

Keywords: ARIMA, multiple polynomial regression, production throughput, uncertainties

6686 Sparse Coding Based Classification of Electrocardiography Signals Using Data-Driven Complete Dictionary Learning

Authors: Fuad Noman, Sh-Hussain Salleh, Chee-Ming Ting, Hadri Hussain, Syed Rasul

Abstract:

In this paper, a data-driven dictionary approach is proposed for the automatic detection and classification of cardiovascular abnormalities. The electrocardiography (ECG) signal is represented by trained complete dictionaries that contain prototypes, or atoms, to avoid the limitations of pre-defined dictionaries. The data-driven trained dictionaries simply take the ECG signal as input, rather than extracting features, to study the set of parameters that yield the most descriptive dictionary. The approach inherently learns the complicated morphological changes in the ECG waveform, which are then used to improve the classification. The classification performance was evaluated on ECG data under two different preprocessing environments. In the first category, the QT database is baseline-drift corrected and a notch filter removes the 60 Hz power-line noise. In the second category, the data are further filtered using a fast moving-average smoother. The experimental results on the QT database confirm that the proposed algorithm achieves a classification accuracy of 92%.
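A minimal sketch of the pipeline shape, not the authors' method: the beats are smoothed with a fast moving average, a dictionary is learned from the training beats, every beat is sparse-coded, and the codes are classified. The synthetic "beats" and all hyper-parameters are placeholders.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
t = np.linspace(0, 1, 200)
normal = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal((100, 200))
abnormal = np.sin(2 * np.pi * 7 * t) + 0.1 * rng.standard_normal((100, 200))
X = np.vstack([normal, abnormal])                       # placeholder "beats"
y = np.array([0] * 100 + [1] * 100)

def moving_average(x, w=5):
    """Fast moving-average smoother along the last axis."""
    kernel = np.ones(w) / w
    return np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), -1, x)

Xs = moving_average(X)
dico = DictionaryLearning(n_components=30, alpha=1.0, max_iter=20,
                          random_state=0).fit(Xs)       # learn the complete dictionary
codes = sparse_encode(Xs, dico.components_, alpha=1.0)  # sparse representation of each beat
clf = LogisticRegression(max_iter=1000).fit(codes, y)
print("training accuracy:", round(clf.score(codes, y), 3))
```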

Keywords: Electrocardiogram, dictionary learning, sparse coding, classification.

6685 Methods for Business Process Simulation Based on Petri Nets

Authors: K. Shoylekova, K. Grigorova

Abstract:

Petri nets are the first standard for business process modeling. This is most probably one of the core reasons why all standards created afterwards have to be reformulated so that they can be mapped onto Petri nets. The paper presents a business process repository based on a universal database. The repository makes it possible to store the data about a given process in three different ways. The business process repository is developed with a view to transforming a given model into a Petri net so that it can be easily simulated. Two different techniques for business process simulation based on Petri nets, Yasper and Woflan, are discussed, and their advantages and drawbacks are outlined. The way business process models stored in the business process repository are simulated is shown.
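A minimal illustration of the token-game firing rule by which a business process model mapped onto a Petri net is simulated; the two-step approval net below is an invented example, not part of the paper's repository.

```python
import random

# transition -> (input places, output places)
net = {
    "submit":  (["start"], ["submitted"]),
    "approve": (["submitted"], ["done"]),
    "reject":  (["submitted"], ["start"]),
}
marking = {"start": 1, "submitted": 0, "done": 0}        # initial token marking

def enabled(t):
    """A transition is enabled when every input place holds a token."""
    return all(marking[p] >= 1 for p in net[t][0])

random.seed(0)
while not marking["done"]:
    t = random.choice([t for t in net if enabled(t)])    # fire one enabled transition
    for p in net[t][0]:
        marking[p] -= 1                                  # consume input tokens
    for p in net[t][1]:
        marking[p] += 1                                  # produce output tokens
    print(f"fired {t:8s} -> {marking}")
```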

Keywords: Business process repository, Petri nets, Simulation, Woflan, Yasper.

6684 A Goal-Oriented Social Business Process Management Framework

Authors: Mohammad Ehson Rangiha, Bill Karakostas

Abstract:

Social Business Process Management (SBPM) promises to overcome the limitations of traditional BPM by allowing flexible process design and enactment through the involvement of users from a social community. This paper proposes a meta-model and architecture for socially driven business process management systems. It discusses the main facets of the architecture, such as goal-based role assignment that combines social recommendations with the user profile, and process recommendation, through a real example of a charity organization.

Keywords: Business Process Management, Goal-Based Modelling, Process Recommendation, Social Collaboration, Social BPM.

6683 A Practical Approach for Testing the Process Quality

Authors: Mou-Yuan Liao, Chien-Wei Wu, Chien-Hua Lin

Abstract:

The process capability index Cpk is the most widely used index in managerial decision making, since it provides bounds on the process yield for normally distributed processes. However, existing methods for assessing process performance that are constructed by statistical inference may unfortunately lead to unreliable results, because uncertainties exist in most real-world applications. Thus, this study adopts fuzzy inference to deal with the testing of Cpk. A brief score is obtained for assessing a supplier's process instead of a severe evaluation.
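For reference, the crisp Cpk that the paper's fuzzy test is built around is computed as below; the fuzzy inference itself is not reproduced, and the specification limits and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.normal(10.02, 0.03, 100)                        # measured quality characteristic
LSL, USL = 9.90, 10.10                                   # specification limits (assumed)

mu, sigma = x.mean(), x.std(ddof=1)
cpk = min((USL - mu) / (3 * sigma), (mu - LSL) / (3 * sigma))
print(f"Cpk = {cpk:.2f}")                                # > 1.33 is often taken as capable
```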

Keywords: Process capability analysis, quality control.

6682 The Investigation of the Role of Institutions in the Process of Growth and Development of Economy

Authors: Seyed Mohammad Reza Hosseini

Abstract:

New Institutional Economics helps generalize and expand neoclassical economics by adding theories of institutions to economics. It is clear that appropriate institutions are among the factors that lead to success in economic programs. If institutions are appropriate, society will save resources, and when programs are applied in a timely manner, welfare and the average revenue product will also increase. In an economy, one should not expect economic programs to materialize merely through a model for estimation and prediction; rather, institutions serving the same purpose, along with production, are needed to shape the process of growth and development. In this research, the role of institutions in transaction costs, financial markets, and the distribution of revenue and capital, and their influence on the process of growth and development, are investigated so that the handicaps and problems of Iran's economic institutions can be recognized. In other words, the incapability, unproductiveness and ambiguity of institutions in Iran's economy are among the factors that hamper economic growth and development. For example, why has the Iranian government, an important institution with 20 ministries, 83 organizations and 60 years of planning, been unable to keep pace with growth and development?

Keywords: Institution, New institutional economics, Transaction costs.

6681 Influence of Hydraulic Retention Time on Biogas Production from Frozen Seafood Wastewater using Decanter Cake as Anaerobic Co-digestion Material

Authors: Thaniya Kaosol, Narumol Sohgrathok

Abstract:

In this research, anaerobic co-digestion using decanter cake from the palm oil mill industry to improve biogas production from frozen seafood wastewater is studied using a continuously stirred tank reactor (CSTR) process. The experiments were conducted at laboratory scale. The suitable hydraulic retention time (HRT) was determined in CSTR experiments with 24 hours of mixing time using a mechanical mixer. The HRT of the CSTR process affects the efficiency of biogas production. The best performance for biogas production using the CSTR process was obtained with anaerobic co-digestion at an HRT of 20 days, with a maximum methane production rate of 1.86 l/d and an average maximum methane content of 64.6%. It can be concluded that decanter cake can improve the biogas productivity of frozen seafood wastewater.

Keywords: anaerobic co-digestion, frozen seafood wastewater, decanter cake, biogas, hydraulic retention time

6680 Investigation of the Effect of Grid Size on External Store Separation Trajectory Using CFD

Authors: Alaa A. Osman, Amgad M. Bayoumy, Ismail El baialy, Osama E. Abdellatif, Essam E. Khallil

Abstract:

In this paper, a numerical simulation of a finned store separating from a wing-pylon configuration has been studied and validated. A dynamic unstructured tetrahedral mesh approach is employed, using three grid sizes to numerically solve the discretized three-dimensional, inviscid, compressible Euler equations. The method computes the separation of the external store under a quasi-steady flow assumption. The quasi-steady flow computations are directly coupled to a six-degree-of-freedom (6DOF) rigid-body motion code to generate store trajectories. The pressure coefficients at four different angular cuts, the time histories of various trajectory parameters, and the wing pressure distribution during store separation are compared for every grid size with published experimental data.
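A heavily simplified sketch of the quasi-steady coupling idea, not the paper's solver: at every time step the "CFD" loads (here a placeholder force model) are fed to a rigid-body integrator that advances the store trajectory. A planar 3-DOF store stands in for the full 6DOF case, and all constants are invented.

```python
import numpy as np

m, Iyy, g = 250.0, 50.0, 9.81                           # store mass, pitch inertia (assumed)
state = np.array([0.0, 0.0, 0.0,                        # x, z (up), pitch
                  0.0, 0.0, 0.0])                       # u, w, pitch rate

def quasi_steady_loads(state):
    """Placeholder for the quasi-steady CFD solution at the current state."""
    drag = -20.0 * state[3]
    lift = -500.0 - 10.0 * state[4]                     # ejector + aerodynamic force (N)
    pitch_moment = -30.0 * state[2] - 5.0 * state[5]    # restoring + damping moment (N*m)
    return np.array([drag, lift, pitch_moment])

dt = 0.01
for step in range(50):                                  # 0.5 s of separation
    Fx, Fz, My = quasi_steady_loads(state)
    acc = np.array([Fx / m, Fz / m - g, My / Iyy])      # Newton-Euler equations, planar
    state[3:] += acc * dt                               # integrate rates
    state[:3] += state[3:] * dt                         # integrate positions
print("position after 0.5 s (x, z, pitch):", np.round(state[:3], 3))
```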

Keywords: CFD Modelling, Quasi-steady Flow, Moving-body Trajectories, Transonic Store Separation.

6679 Process Capability Analysis by Using Statistical Process Control of Rice Polished Cylinder Turning Practice

Authors: S. Bangphan, P. Bangphan, T. Boonkang

Abstract:

Quality control helps industries improve product quality and productivity. Statistical Process Control (SPC) is one of the tools used to control product quality; in this turning practice it is applied to bring a process in a department of industrial engineering under control. In this research, the process control of a rice polished cylinder turned on workshop machines is studied. Repeated measurements were recorded for a number of samples of the rice polished cylinder obtained from a number of trials of the turning practice. By adopting the SPC technique, the process is finally brought under control and the process capability is improved.
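A minimal sketch of the X-bar/R control chart limits commonly used in SPC, not the study's data: the subgroup measurements are random placeholders, and A2, D3, D4 are the standard chart constants for subgroups of five.

```python
import numpy as np

rng = np.random.default_rng(10)
subgroups = rng.normal(25.00, 0.02, size=(30, 5))        # 30 subgroups of 5 diameters (mm)

xbar = subgroups.mean(axis=1)
ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
xbar_bar, r_bar = xbar.mean(), ranges.mean()

A2, D3, D4 = 0.577, 0.0, 2.114                           # chart constants for n = 5
print(f"X-bar chart: CL = {xbar_bar:.3f}, "
      f"UCL = {xbar_bar + A2 * r_bar:.3f}, LCL = {xbar_bar - A2 * r_bar:.3f}")
print(f"R chart:     CL = {r_bar:.4f}, UCL = {D4 * r_bar:.4f}, LCL = {D3 * r_bar:.4f}")

out = np.where((xbar > xbar_bar + A2 * r_bar) | (xbar < xbar_bar - A2 * r_bar))[0]
print("out-of-control subgroups:", out)
```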

Keywords: Rice polished cylinder, statistical process control, control charts, process capability.
