Search results for: Data Clustering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7634

6134 A Novel Fuzzy Logic Based Controller to Adjust the Brightness of the Television Screen with Respect to Surrounding Light

Authors: A. V. Sai Balasubramanian, N. Ravi Shankar, S. Subbaraman, R. Rengaraj

Abstract:

One of the major causes of eye strain and related problems while watching television is the relative illumination between the screen and its surroundings. This can be overcome by adjusting the brightness of the screen with respect to the surrounding light. A controller based on fuzzy logic is proposed in this paper. The fuzzy controller takes the intensity of the light surrounding the screen and the present brightness of the screen as inputs. Its output is the grid voltage corresponding to the required brightness; this voltage is applied to the CRT so that the brightness is controlled dynamically. For the given test system data, different de-fuzzifier methods have been implemented and the results compared. To validate the effectiveness of the proposed approach, a fuzzy controller has been designed using test data obtained from a real-time system. The simulations are performed in MATLAB and verified against standard system data. The proposed approach can be implemented for real-time applications.
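
The abstract's own implementation is in MATLAB; purely as an illustration of the Mamdani-style inference it describes (fuzzify two inputs, fire rules, defuzzify to a grid voltage), here is a minimal NumPy sketch. The universes, membership breakpoints and the two rules are hypothetical, not the authors' rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    x = np.asarray(x, dtype=float)
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Hypothetical universes: ambient light (lux) and screen brightness (%) in, grid voltage (V) out
VOLTAGE = np.linspace(0.0, 100.0, 1001)
VOLT_LOW = tri(VOLTAGE, -1.0, 0.0, 60.0)      # "low grid voltage" output set
VOLT_HIGH = tri(VOLTAGE, 40.0, 100.0, 101.0)  # "high grid voltage" output set

def grid_voltage(light_lux, brightness_pct):
    """Two-rule Mamdani inference with centroid defuzzification (illustrative rule base)."""
    mu_light_low = tri(light_lux, -1.0, 0.0, 600.0)
    mu_light_high = tri(light_lux, 400.0, 1000.0, 1001.0)
    mu_bright_low = tri(brightness_pct, -1.0, 0.0, 60.0)
    mu_bright_high = tri(brightness_pct, 40.0, 100.0, 101.0)

    # Rule 1: IF ambient light is low AND the screen is bright THEN voltage is low
    w1 = min(mu_light_low, mu_bright_high)
    # Rule 2: IF ambient light is high AND the screen is dim THEN voltage is high
    w2 = min(mu_light_high, mu_bright_low)

    aggregated = np.maximum(np.minimum(w1, VOLT_LOW), np.minimum(w2, VOLT_HIGH))  # max of clipped sets
    return float(np.sum(VOLTAGE * aggregated) / (np.sum(aggregated) + 1e-12))     # centroid defuzzifier

print(grid_voltage(light_lux=850.0, brightness_pct=20.0))  # bright room, dim screen -> high grid voltage
```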

Keywords: Fuzzy controller, Grid voltage

Downloads: 2786
6133 Estimation of the Upper Tail Dependence Coefficient for Insurance Loss Data Using an Empirical Copula-Based Approach

Authors: Adrian O’Hagan, Robert McLoughlin

Abstract:

Considerable focus in the world of insurance risk quantification is placed on modeling loss values from lines of business (LOBs) that possess upper tail dependence. Copulas such as the Joe, Gumbel and Student-t copula may be used for this purpose. The copula structure imparts a desired level of tail dependence on the joint distribution of claims from the different LOBs. Alternatively, practitioners may possess historical or simulated data that already exhibit upper tail dependence, through the impact of catastrophe events such as hurricanes or earthquakes. In these circumstances, it is not desirable to induce additional upper tail dependence when modeling the joint distribution of the loss values from the individual LOBs. Instead, it is of interest to accurately assess the degree of tail dependence already present in the data. The empirical copula and its associated upper tail dependence coefficient are presented in this paper as robust, efficient means of achieving this goal.
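
A minimal sketch of the empirical-copula idea described above, assuming rank-based pseudo-observations and the common estimator lambda_U ~ 2 - (1 - C_n(u,u))/(1 - u) evaluated at a u close to 1; the simulated loss data and the shared "catastrophe" shock are illustrative only.

```python
import numpy as np
from scipy.stats import rankdata

def empirical_copula_diag(x, y, u):
    """Empirical copula C_n(u, u) evaluated on the diagonal, from paired losses (x_i, y_i)."""
    n = len(x)
    ux = rankdata(x) / (n + 1)   # pseudo-observations in (0, 1)
    uy = rankdata(y) / (n + 1)
    return np.mean((ux <= u) & (uy <= u))

def upper_tail_dependence(x, y, u=0.95):
    """One common empirical estimator of the upper tail dependence coefficient."""
    c_uu = empirical_copula_diag(x, y, u)
    return 2.0 - (1.0 - c_uu) / (1.0 - u)

# Hypothetical losses from two lines of business sharing a catastrophe component
rng = np.random.default_rng(0)
shock = rng.pareto(3.0, 5000)
x = rng.lognormal(0.0, 1.0, 5000) + shock
y = rng.lognormal(0.0, 1.0, 5000) + shock
print(upper_tail_dependence(x, y, u=0.95))
```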

Keywords: Empirical copula, extreme events, insurance loss reserving, upper tail dependence coefficient.

Downloads: 4847
6132 Investigation of Scour Depth at Bridge Piers using Bri-Stars Model in Iran

Authors: Gh. Saeidifar, F. Raeiszadeh

Abstract:

The BRI-STARS (BRIdge Stream Tube model for Alluvial River Simulation) program was used to investigate the scour depth around bridge piers in some of the major river systems in Iran. Model calibration was performed by collecting different field data. The field data are catalogued in three categories: bridges whose river beds are formed of fine material, bridges whose river beds are formed of sand, and bridges whose river beds are formed of gravel or cobble materials. Verification was performed with field data from Fars Province. Results show that for wide piers the computed scour depth exceeds the measured one. In gravel-bed streams the computed scour depth is also greater than the measured scour depth, owing to the formation of an armor layer on the channel bed; once this layer is eroded, the computed scour depth approaches the measured one.

Keywords: BRI-STARS, local scour, bridge, computer modeling

Downloads: 1995
6131 Effects of Global Warming on Climate Change in Udon Thani Province in the Period in 60 Surrounding Years (A.D.1951-2010)

Authors: T. Santiboon

Abstract:

This research investigated, determined, and analyzed the characteristic climate change in Udon Thani Province over a period of 60 years, from 1951 to 2010 A.D., and related the climatological data to the effects of global warming. No statistically significant trend was found for the raw 60 years' data (R2 < 0.81). Statistically significant results were found after the data were adapted to the 11-year sunspot cycle, at the 0.001 level (R2 = 1.00). These results indicate that Udon Thani's weather has changed: temperatures and evaporation have increased, while rainfall, the number of rainy days, cyclone storms, wind speed, humidity, and forest assessment have decreased. The effects of thermal energy from solar radiation and from human activities, both of which follow the sunspot cycle, allow the climate change and global warming effects on the region to be predicted from the past into the future on a uniformitarian basis.

Keywords: Climate Change, Global Warming, Udon Thani Province Weather

Downloads: 2110
6130 Unearthing Decisional Patterns of Air Traffic Control Officers from Simulator Data

Authors: Z. Zakaria, S. W. Lye, S. Endy

Abstract:

Despite the continuous advancements in automated conflict resolution tools, there is still a low rate of adoption of automation by Air Traffic Control Officers (ATCOs). Trust or acceptance of these tools and conformance to individual ATCO preferences in strategy execution for conflict resolution are two key factors that impact their use. This paper proposes a methodology to unearth and classify ATCO conflict resolution strategies from simulator data of trained and qualified ATCOs. The methodology involves the extraction of ATCO executive control actions and the establishment of a strategy resolution classification system based on ATCO radar commands and prevailing flight parameters in deconflicting a pair of aircraft. Six main strategies used to handle various categories of conflict were identified and discussed. It was found that ATCOs were about twice as likely to choose only vertical maneuvers in conflict resolution as horizontal maneuvers or a combination of both vertical and horizontal maneuvers.
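
The paper's actual classification scheme is not reproduced in the abstract; the following is a purely hypothetical sketch of how issued radar commands might be bucketed into vertical-only, horizontal-only or combined strategies. The field names and the treatment of speed instructions as horizontal maneuvers are assumptions, not the authors' system.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class RadarCommand:
    """Hypothetical record of one executive control action (field names are illustrative)."""
    aircraft_id: str
    new_flight_level: Optional[int] = None   # vertical instruction (flight level change)
    new_heading: Optional[int] = None        # horizontal instruction (heading, degrees)
    new_speed: Optional[int] = None          # treated here as a horizontal-plane maneuver

def classify_strategy(commands: List[RadarCommand]) -> str:
    """Bucket the commands issued to deconflict one aircraft pair into a coarse class."""
    vertical = any(c.new_flight_level is not None for c in commands)
    horizontal = any(c.new_heading is not None or c.new_speed is not None for c in commands)
    if vertical and horizontal:
        return "combined"
    if vertical:
        return "vertical-only"
    if horizontal:
        return "horizontal-only"
    return "no-action"

print(classify_strategy([RadarCommand("ABC123", new_flight_level=350)]))   # vertical-only
print(classify_strategy([RadarCommand("ABC123", new_heading=270),
                         RadarCommand("XYZ789", new_flight_level=330)]))   # combined
```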

Keywords: Air traffic control strategies, conflict resolution, simulator data, strategy classification system.

Downloads: 62
6129 Optimal Current Control of Externally Excited Synchronous Machines in Automotive Traction Drive Applications

Authors: Oliver Haala, Bernhard Wagner, Maximilian Hofmann, Martin Marz

Abstract:

The excellent suitability of the externally excited synchronous machine (EESM) for automotive traction drive applications is justified by its high efficiency over the whole operating range and the high availability of its materials. Usually, maximum efficiency is obtained by modelling each single loss and minimizing the sum of all losses. As a result, the quality of the optimization highly depends on the precision of the model. Moreover, it requires accurate knowledge of the saturation-dependent machine inductances. Therefore, the present contribution proposes a method to minimize the overall losses of a salient pole EESM and its inverter in steady state operation based on measurement data only. Since this method does not require any manufacturer data, it is well suited for automated measurement data evaluation and inverter parametrization. The field oriented control (FOC) of an EESM provides three current components, i.e. three degrees of freedom (DOF). An analytic minimization of the copper losses in the stator and the rotor (assuming constant inductances) is performed and serves as a first approximation of how to choose the optimal current reference values. After a numeric offline minimization of the overall losses based on measurement data, the results are compared to a control strategy that satisfies cos(ϕ) = 1.
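
A sketch of the analytic first step described above: minimizing stator and rotor copper losses over the three current degrees of freedom (i_d, i_q, i_f) subject to a torque constraint, assuming constant inductances. The machine parameters, the simplified loss and torque expressions, and the use of a generic numerical optimizer are illustrative; this is not the authors' measurement-based procedure.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical salient-pole EESM parameters (not from the paper)
Rs, Rf = 0.02, 1.5                        # stator / field resistance [Ohm]
Ld, Lq, Lmd = 0.8e-3, 0.5e-3, 1.5e-3      # d/q inductances and mutual inductance [H]
p = 4                                      # pole pairs

def copper_losses(x):
    id_, iq, if_ = x
    return 1.5 * Rs * (id_**2 + iq**2) + Rf * if_**2

def torque(x):
    id_, iq, if_ = x
    # Main (excitation) torque plus reluctance torque, constant-inductance approximation
    return 1.5 * p * (Lmd * if_ * iq + (Ld - Lq) * id_ * iq)

T_ref = 50.0   # requested torque [Nm]

# Minimize copper losses over (id, iq, if) subject to the torque constraint
res = minimize(copper_losses,
               x0=np.array([0.0, 150.0, 8.0]),
               constraints=[{"type": "eq", "fun": lambda x: torque(x) - T_ref}],
               bounds=[(-400, 400), (0, 600), (0, 20)])
print(res.x.round(2), round(float(copper_losses(res.x)), 1))
```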

Keywords: Current control, efficiency, externally excited synchronous machine, optimization.

Downloads: 4395
6128 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average process of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) are derived iteratively by using some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is conducted in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through this simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The BINARMA(1,1) model is applied to the accident data and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step-ahead forecasts.
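
A minimal sketch of the binomial thinning operator and of a bivariate integer-valued recursion whose cross-correlation comes only from a common Poisson shock in the innovations, as the abstract describes; the moving-average part of the full BINARMA(1,1) specification is omitted and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def thin(alpha, x):
    """Binomial thinning operator: alpha o x = sum of x independent Bernoulli(alpha) variables."""
    return rng.binomial(x, alpha)

def simulate_bivariate_counts(n, a1=0.4, a2=0.3, lam1=2.0, lam2=1.5, lam12=1.0):
    """Bivariate count series with correlated Poisson innovations.
    Cross-correlation enters only through the common shock 'c' in the innovation terms."""
    x = np.zeros((n, 2), dtype=int)
    for t in range(1, n):
        c = rng.poisson(lam12)                # shared innovation component
        r1 = rng.poisson(lam1) + c            # innovation of series 1
        r2 = rng.poisson(lam2) + c            # innovation of series 2
        x[t, 0] = thin(a1, x[t - 1, 0]) + r1  # surviving previous counts + new arrivals
        x[t, 1] = thin(a2, x[t - 1, 1]) + r2
    return x

x = simulate_bivariate_counts(1000)
print(np.corrcoef(x[:, 0], x[:, 1])[0, 1])    # positive cross-correlation from the common shock
```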

Keywords: Non-stationary, BINARMA(1, 1) model, Poisson Innovations, CML

Downloads: 588
6127 Development of an ArcGIS Toolbar for Trend Analysis of Climatic Data

Authors: Arnab Bandyopadhyay, Anubhab Pal, Subhajit Debnath

Abstract:

Climate change is a cumulative change in weather patterns over a period of time. Trend analysis using the non-parametric Mann-Kendall test may help to determine the existence of any statistically significant trend in climatic data. Another index, the Sen slope, may be used to quantify the magnitude of such trends. A toolbar extension to ESRI ArcGIS named Arc Trends has been developed in this study for performing the above-mentioned tasks. To study the temporal trend of meteorological parameters, 32 years (1971-2002) of monthly meteorological data were collected for 133 selected stations over different agro-ecological regions of India. Both the maximum and minimum temperatures were found to be rising. A significant increasing trend in relative humidity and a consistent significant decreasing trend in wind speed were found all over the country. However, a general increase in rainfall was not found in recent years.
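
The toolbar itself is an ArcGIS extension; as a standalone illustration of the two statistics it computes, here is a possible Mann-Kendall test and Sen slope in Python. The tie correction in the MK variance is omitted for brevity and the temperature series is simulated, not station data.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test: statistic S, normal approximation Z and two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0           # tie correction omitted
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

def sen_slope(x):
    """Sen's slope estimator: median of all pairwise slopes."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
    return float(np.median(slopes))

# Hypothetical annual mean temperature series (deg C), 1971-2002
temps = 24.0 + 0.02 * np.arange(32) + np.random.default_rng(1).normal(0, 0.3, 32)
print(mann_kendall(temps), sen_slope(temps))
```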

Keywords: Temporal trend, climate change, ArcGIS, Mann-Kendall test, Sen slope

Downloads: 3086
6126 Asymmetrical Informative Estimation for Macroeconomic Model: Special Case in the Tourism Sector of Thailand

Authors: Chukiat Chaiboonsri, Satawat Wannapan

Abstract:

This paper applies an asymmetric-information concept to the estimation of a macroeconomic model of the tourism sector in Thailand. The variables analyzed are Thailand's international and domestic tourism revenues, the expenditures of foreign and domestic tourists, service investments by the private sector, service investments by the government of Thailand, Thailand's service imports and exports, and net service income transfers. All data are time-series indices observed between 2002 and 2015. Empirically, the tourism multiplier and accelerator were estimated by two statistical approaches. The first was the Generalized Method of Moments (GMM) model, based on the assumption that the tourism market in Thailand has perfect information (symmetrical data). The second was the Maximum Entropy Bootstrapping approach (MEboot), based on a process that attempts to deal with imperfect information and to reduce uncertainty in the data observations (asymmetrical data). In addition, tourism leakages were investigated with a simple model based on the injections-and-leakages concept. The empirical findings show that the parameters computed from the MEboot approach differ from those of the GMM method. However, both the MEboot estimation and the GMM model suggest that Thailand's tourism sector is in a period capable of stimulating the economy.

Keywords: Thailand tourism, maximum entropy bootstrapping approach, macroeconomic model, asymmetric information.

Downloads: 1264
6125 Experimental Evaluation of Methane Adsorption on Granular Activated Carbon (GAC) and Determination of Model Isotherm

Authors: M. Delavar, A.A. Ghoreyshi, M. Jahanshahi, M. Irannejad

Abstract:

This study investigates the capacity of granular activated carbon (GAC) for the storage of methane through equilibrium adsorption. An experimental apparatus consisting of a dual adsorption vessel was set up for the measurement of the equilibrium adsorption of methane on GAC using a volumetric technique (pressure decay). Experimental isotherms of methane adsorption were determined by measuring the equilibrium uptake of methane at different pressures (0-50 bar) and temperatures (285.15-328.15 K). The experimental data were fitted to the Freundlich and Langmuir equations to determine the model isotherm. The results show that the experimental data are equally well fitted by both model isotherms. Using the experimental data obtained at different temperatures, the isosteric heat of methane adsorption was also calculated by the Clausius-Clapeyron equation from the Sips isotherm model. The results for the isosteric heat of adsorption show that decreasing the temperature or increasing the methane uptake by GAC decreases the isosteric heat of methane adsorption.
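
A sketch of the isotherm-fitting step, assuming hypothetical equilibrium data rather than the paper's measurements; the Langmuir and Freundlich forms are fitted by nonlinear least squares. The isosteric heat would then follow from the Clausius-Clapeyron relation q_st = -R d(ln P)/d(1/T) at constant uptake, which is not shown here.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(P, qm, b):
    """Langmuir isotherm: q = qm * b * P / (1 + b * P)."""
    return qm * b * P / (1.0 + b * P)

def freundlich(P, K, n):
    """Freundlich isotherm: q = K * P**(1/n)."""
    return K * P ** (1.0 / n)

# Hypothetical equilibrium data: pressure [bar] vs methane uptake [mmol/g]
P = np.array([1, 5, 10, 20, 30, 40, 50], dtype=float)
q = np.array([0.9, 3.2, 5.1, 7.0, 8.0, 8.6, 9.0])

pl, _ = curve_fit(langmuir, P, q, p0=[10.0, 0.1])
pf, _ = curve_fit(freundlich, P, q, p0=[1.0, 2.0])
print("Langmuir   qm=%.2f b=%.3f" % tuple(pl))
print("Freundlich K=%.2f n=%.2f" % tuple(pf))
```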

Keywords: Methane adsorption, Activated carbon, Model isotherm, Isosteric heat

Downloads: 2479
6124 Efficient Real-time Remote Data Propagation Mechanism for a Component-Based Approach to Distributed Manufacturing

Authors: V. Barot, S. McLeod, R. Harrison, A. A. West

Abstract:

Manufacturing industries face a crucial change as products and processes are required to be easily and efficiently reconfigurable and reusable. In order to stay competitive and flexible, situations also demand the global distribution of enterprises, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners via a circular buffer based data processing mechanism. The work highlighted in this paper includes the implementation of this mechanism in the domain of the manufacturing industry. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply chain client resources, such as an HMI, VRML-based 3D views and remote client instances, regardless of their distribution nature and/or their mechanisms. This approach is presented together with a set of evaluation results. The authors' main concentration surrounds the reliability and the performance of the adopted approach. Performance is evaluated in terms of the response times taken to process the data in this domain and compared with an alternative data processing implementation such as a linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved from the proposed implementation and highlight further research work to be carried out.
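
A minimal ring-buffer sketch, purely to illustrate the circular data-processing idea; it is not the Broadcaster's implementation, and the producer/consumer roles shown are hypothetical.

```python
from collections import deque
from typing import Any

class CircularBuffer:
    """Fixed-capacity ring buffer: when full, the oldest item is overwritten,
    so slow consumers never block the producer (at the cost of dropping stale data)."""

    def __init__(self, capacity: int):
        self._buf = deque(maxlen=capacity)

    def put(self, item: Any) -> None:
        self._buf.append(item)          # silently evicts the oldest entry when full

    def drain(self):
        """Yield and remove everything currently buffered, oldest first."""
        while self._buf:
            yield self._buf.popleft()

# Producer: machine state updates; consumers: HMI, 3D view, remote clients
buf = CircularBuffer(capacity=3)
for update in ["state-1", "state-2", "state-3", "state-4"]:
    buf.put(update)
print(list(buf.drain()))   # ['state-2', 'state-3', 'state-4'] - the oldest update was overwritten
```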

Keywords: Broadcaster, circular buffer, Component-based, distributed manufacturing, remote data propagation.

Downloads: 1373
6123 Hiding Data in Images Using PCP

Authors: Souvik Bhattacharyya, Gautam Sanyal

Abstract:

In recent years everything is trending toward digitalization, and with the rapid development of Internet technologies digital media need to be transmitted conveniently over the network. Attacks, misuse or unauthorized access to information is of great concern today, which makes the protection of documents transmitted through digital media a priority problem. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. The word is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver while no one except the authenticated receiver knows that the hidden information exists. A considerable amount of work has been carried out by different researchers on steganography. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a gray scale image. The proposed approach works by selecting the embedding pixels using a mathematical function, finding the 8-neighborhood of each selected pixel, and mapping each bit of the secret message to a neighbor pixel coordinate position in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or its neighbors lie at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
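
A simplified stand-in for the embedding step described above (keyed pixel selection, then writing secret bits into the 8-neighborhood of each selected pixel). The selection function, the use of plain LSB writes, and the restriction to interior pixels in place of the paper's boundary check are assumptions, not the authors' PCP mapping.

```python
import numpy as np

def select_pixels(shape, count, key=12345):
    """Hypothetical embedding-pixel selector: a keyed pseudo-random choice of interior
    coordinates (the paper uses its own mathematical function and a boundary check)."""
    rng = np.random.default_rng(key)
    rows = rng.integers(1, shape[0] - 1, size=count)
    cols = rng.integers(1, shape[1] - 1, size=count)
    return list(zip(rows, cols))

NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def embed(cover: np.ndarray, bits: str, key=12345) -> np.ndarray:
    """Write one secret bit into the least significant bit of each of the 8 neighbors
    of every selected pixel (a simplified stand-in for the PCP coordinate mapping)."""
    stego = cover.copy()
    pixels = select_pixels(cover.shape, (len(bits) + 7) // 8, key)
    for i, bit in enumerate(bits):
        r, c = pixels[i // 8]
        dr, dc = NEIGHBORS[i % 8]
        stego[r + dr, c + dc] = (stego[r + dr, c + dc] & 0xFE) | int(bit)
    return stego

cover = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
stego = embed(cover, format(ord('A'), '08b'))
print(int(np.abs(stego.astype(int) - cover.astype(int)).sum()))  # total distortion is tiny
```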

Keywords: Cover Image, LSB, Pixel Coordinate Position (PCP), Stego Image.

Downloads: 1821
6122 A Rigid Point Set Registration of Remote Sensing Images Based on Genetic Algorithms and Hausdorff Distance

Authors: F. Meskine, N. Taleb, M. Chikr El-Mezouar, K. Kpalma, A. Almhdie

Abstract:

Image registration is the process of establishing point-by-point correspondence between images obtained from the same scene. This process is very useful in remote sensing, medicine, cartography, computer vision, etc. The task of registration is to place the data into a common reference frame by estimating the transformations between the data sets. In this work, we develop a rigid point registration method based on the application of genetic algorithms and the Hausdorff distance. First, we extract the feature points from both images using the global and local curvature corner detection algorithm. After refining the feature points, we use the Hausdorff distance as the similarity measure between the two data sets and, to optimize over the search space, we use genetic algorithms, which achieve high computation speed thanks to their inherent parallelism. The results show the efficiency of this method for the registration of satellite images.
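
A sketch of the registration idea under two simplifications: SciPy's directed Hausdorff distance is used as the similarity measure, and a tiny mutation-only evolutionary search stands in for a full genetic algorithm (no crossover or selection operators); the point sets are synthetic.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two 2-D point sets."""
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

def transform(pts, theta, tx, ty):
    """Rigid transform: rotation by theta followed by translation (tx, ty)."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return pts @ R.T + np.array([tx, ty])

def register(src, dst, generations=200, pop=30, seed=0):
    """Evolutionary-style search over (theta, tx, ty) minimizing the Hausdorff distance."""
    rng = np.random.default_rng(seed)
    best = np.zeros(3)
    best_cost = hausdorff(transform(src, *best), dst)
    for _ in range(generations):
        candidates = best + rng.normal(0, [0.05, 1.0, 1.0], size=(pop, 3))  # mutate current best
        costs = [hausdorff(transform(src, *c), dst) for c in candidates]
        i = int(np.argmin(costs))
        if costs[i] < best_cost:
            best, best_cost = candidates[i], costs[i]
    return best, best_cost

# Synthetic feature points: dst is src rotated by 10 degrees and shifted
src = np.random.default_rng(1).uniform(0, 100, (40, 2))
dst = transform(src, np.deg2rad(10), 5.0, -3.0)
print(register(src, dst))
```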

Keywords: Feature extraction, Genetic algorithms, Hausdorff distance, Image registration, Point registration.

Downloads: 1931
6121 Oil Debris Signal Detection Based on Integral Transform and Empirical Mode Decomposition

Authors: Chuan Li, Ming Liang

Abstract:

Oil debris signal generated from the inductive oil debris monitor (ODM) is useful information for machine condition monitoring but is often spoiled by background noise. To improve the reliability in machine condition monitoring, the high-fidelity signal has to be recovered from the noisy raw data. Considering that the noise components with large amplitude often have higher frequency than that of the oil debris signal, the integral transform is proposed to enhance the detectability of the oil debris signal. To cancel out the baseline wander resulting from the integral transform, the empirical mode decomposition (EMD) method is employed to identify the trend components. An optimal reconstruction strategy including both de-trending and de-noising is presented to detect the oil debris signal with less distortion. The proposed approach is applied to detect the oil debris signal in the raw data collected from an experimental setup. The result demonstrates that this approach is able to detect the weak oil debris signal with acceptable distortion from noisy raw data.
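
A rough sketch of the detection chain, assuming a cumulative sum as the integral transform and a moving-average baseline removal as a simple stand-in for the EMD-based de-trending the abstract describes; the simulated ODM record is illustrative only.

```python
import numpy as np

def integral_transform(x, dt=1.0):
    """Cumulative (running) integral of the raw signal; it attenuates high-frequency
    noise relative to the lower-frequency oil-debris pulses."""
    return np.cumsum(x) * dt

def remove_trend(y, window=201):
    """Moving-average baseline removal. In the paper the baseline wander is identified
    with EMD; the moving average here is only a simple stand-in for that step."""
    kernel = np.ones(window) / window
    baseline = np.convolve(y, kernel, mode="same")
    return y - baseline

# Hypothetical raw ODM record: a weak debris pulse buried in high-frequency noise
t = np.linspace(0, 1, 4000)
pulse = 0.2 * np.exp(-((t - 0.5) ** 2) / (2 * 0.01 ** 2)) * np.sin(2 * np.pi * 25 * t)
noise = 0.5 * np.sin(2 * np.pi * 800 * t) + 0.3 * np.random.default_rng(2).normal(size=t.size)
raw = pulse + noise

detected = remove_trend(integral_transform(raw, dt=t[1] - t[0]))
print(detected.shape, float(np.abs(detected).max()))
```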

Keywords: Integral transform, empirical mode decomposition, oil debris, signal processing, detection.

Downloads: 1717
6120 Semi-Automatic Method to Assist Expert for Association Rules Validation

Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen

Abstract:

In order to help the expert validate association rules extracted from data, some quality measures are proposed in the literature. We distinguish two categories: objective and subjective measures. The first depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, so manually reviewing the rules is a very hard task. To solve this problem, we propose, in this paper, a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize the association rules with their classification quality to give the expert an overview and to assist him during the validation process.
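
A toy sketch of steps (i)-(iii), assuming association rules are given as (antecedent itemset, consequent) pairs and that rules whose consequent is a class label become the classifiers; the rule representation, the first-match policy and the accuracy measure are illustrative, not the paper's exact choices.

```python
from dataclasses import dataclass
from typing import Dict, FrozenSet, List, Tuple

@dataclass(frozen=True)
class ClassificationRule:
    antecedent: FrozenSet[str]
    label: str

def to_classifier(association_rules: List[Tuple[set, str]]) -> List[ClassificationRule]:
    """Step (i): keep only association rules whose consequent is a class label."""
    return [ClassificationRule(frozenset(ante), cons)
            for ante, cons in association_rules if cons.startswith("class=")]

def classify(rules: List[ClassificationRule], transaction: set) -> str:
    """Step (ii): first matching rule wins; 'unknown' if nothing matches."""
    for r in rules:
        if r.antecedent <= transaction:
            return r.label
    return "unknown"

def rule_accuracy(rules, data) -> Dict[ClassificationRule, float]:
    """Step (iii): per-rule classification quality that would be shown to the expert."""
    scores = {}
    for r in rules:
        covered = [(items, label) for items, label in data if r.antecedent <= items]
        scores[r] = (sum(label == r.label for _, label in covered) / len(covered)
                     if covered else 0.0)
    return scores

rules = to_classifier([({"bread", "butter"}, "class=breakfast"), ({"beer"}, "class=party")])
data = [({"bread", "butter", "jam"}, "class=breakfast"), ({"beer", "chips"}, "class=party")]
print(classify(rules, {"bread", "butter"}), rule_accuracy(rules, data))
```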

Keywords: Association rules, Rule-based classification, Classification quality, Validation.

Downloads: 1791
6119 Crack Opening Investigation in Fiberconcrete

Authors: Arturs Macanovskis, Vitalijs Lusis, Andrejs Krasnikovs

Abstract:

This work had three stages. In the first stage, the pull-out process was examined for a steel fiber embedded in concrete by one end and pulled out of the concrete at an angle to the pulling-out force direction; the angle was varied. Jumps were observed on the obtained force-displacement diagrams. To explain this mechanical behavior, a microscopic experimental investigation of the fiber channel in the concrete surface was performed using a KEYENCE VHX2000 microscope. At the second stage, load versus crack opening displacement diagrams were obtained for homogeneously reinforced and layered fiberconcrete prisms (with dimensions 10x10x40 cm) broken under 4-point bending; after testing, the main crack was analyzed. At the third stage, a prediction model for fiberconcrete beam failure under bending was elaborated using the following data: a) diagrams for fibers pulled out at different angles; b) experimental data about the locations of straight steel fibers in the main crack. Experimental and theoretical (modeling) data were compared.

Keywords: Fiberconcrete, pull-out, fiber channel, layered fiberconcrete.

Downloads: 1856
6118 An Attribute-Centre Based Decision Tree Classification Algorithm

Authors: Gökhan Silahtaroğlu

Abstract:

Decision tree algorithms hold a very important place among the classification models of data mining. In the literature, algorithms use the entropy concept or the Gini index to form the tree. The shape of the classes and their closeness to each other are some of the factors that affect the performance of the algorithm. In this paper we introduce a new decision tree algorithm which employs a data (attribute) folding method and the variation of the class variable over the branches to be created. A comparative performance analysis has been carried out between the proposed algorithm and C4.5.
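
For reference, the two classical split criteria the abstract mentions (not the proposed attribute-folding criterion itself) can be computed as follows; the label sample is made up.

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class label sample, in bits."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def gini(labels):
    """Gini impurity of a class label sample."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(1.0 - (p ** 2).sum())

y = ["a", "a", "a", "b", "b", "c"]
print(entropy(y), gini(y))   # ~1.459 bits, ~0.611
```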

Keywords: Classification, decision tree, split, pruning, entropy, gini.

Downloads: 1370
6117 Innovation in Traditional Game: A Case Study of Trainee Teachers' Learning Experiences

Authors: Malathi Balakrishnan, Cheng Lee Ooi, Chander Vengadasalam

Abstract:

The purpose of this study is to explore a case study of trainee teachers' learning experience in innovating traditional games during a traditional game carnival. It explores issues arising from multiple case studies of trainee teachers' learning experiences in innovating traditional games. A qualitative methodology was adopted through observations, semi-structured interviews and content analysis of reflective journals of trainee teachers' learning experiences in creating and implementing innovative traditional games. Twelve groups of 36 trainee teachers who registered for the Sports and Physical Education Management Course were the participants of this research during the traditional game carnival. Semi-structured interviews were administered after the trainee teachers' learning experiences in creating innovative traditional games. Reflective journals were collected after the carnival day and their content analyzed. Inductive data analysis was used to evaluate the various data sources. All the collected data were then evaluated through the NVivo data analysis process. Inductive reasoning was interpreted based on the Self-Determination Theory (SDT). The findings showed that the trainee teachers had positive game participation experiences, game knowledge about traditional games and positive motivation to innovate the games. The data also revealed the influence of themes such as cultural significance and creativity. It can be concluded from the findings that the organized game carnival, as a requirement of course work by the Institute of Teacher Training Malaysia, was able to enhance teacher trainers' innovative thinking skills. The SDT, as a multidimensional approach to motivation, was utilized; therefore, teacher trainers may have more learning experiences using the SDT.

Keywords: Learning experiences, innovation, traditional games, trainee teachers.

Downloads: 2443
6116 Combining Bagging and Boosting

Authors: S. B. Kotsiantis, P. E. Pintelas

Abstract:

Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using a voting methodology that combines bagging and boosting ensembles with 10 sub-classifiers each. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
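
A minimal sketch of the voting combination described above using scikit-learn, with 10 sub-classifiers in each ensemble; the benchmark dataset, default base learners and soft voting are assumptions standing in for the authors' exact experimental setup.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Two ensembles of 10 sub-classifiers each, combined by a (soft) vote
bagging = BaggingClassifier(n_estimators=10, random_state=0)
boosting = AdaBoostClassifier(n_estimators=10, random_state=0)
combined = VotingClassifier(estimators=[("bag", bagging), ("boost", boosting)],
                            voting="soft")

for name, model in [("bagging", bagging), ("boosting", boosting), ("combined", combined)]:
    print(name, cross_val_score(model, X, y, cv=5).mean().round(3))
```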

Keywords: data mining, machine learning, pattern recognition.

Downloads: 2563
6115 Trend Analysis for Extreme Rainfall Events in New South Wales, Australia

Authors: Evan Hajani, Ataur Rahman, Khaled Haddad

Abstract:

Climate change will affect the hydrological cycle in many different ways, such as increases in evaporation and rainfall. There has been growing interest among researchers in identifying the nature of trends in historical rainfall data in many different parts of the world. This paper examines the trends in annual maximum rainfall data from 30 stations in New South Wales, Australia, using two non-parametric tests, Mann-Kendall (MK) and Spearman's Rho (SR). Rainfall data were analyzed for fifteen different durations ranging from 6 min to 3 days. It is found that the sub-hourly durations (6, 12, 18, 24, 30 and 48 minutes) show statistically significant positive (upward) trends, whereas longer duration (sub-daily and daily) events generally show a statistically significant negative (downward) trend. It is also found that the MK test and SR test provide notably different results for some of the rainfall event durations considered in this study. Since shorter duration sub-hourly rainfall events show positive trends at many stations, the design rainfall data based on stationary frequency analysis for these durations need to be adjusted to account for the impact of climate change. These shorter durations are more relevant to many urban development projects based on smaller catchments having a much shorter response time.
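
To complement the Mann-Kendall sketch shown earlier in this list, a possible Spearman's Rho trend test simply rank-correlates the series with its time index; the annual-maximum series below is simulated, not station data.

```python
import numpy as np
from scipy.stats import spearmanr

def spearman_trend(series):
    """Spearman's Rho trend test: rank-correlate the series with time.
    Returns (rho, two-sided p-value); a significant positive rho indicates an upward trend."""
    time_index = np.arange(len(series))
    rho, p = spearmanr(time_index, series)
    return rho, p

# Hypothetical annual maximum 6-minute rainfall depths (mm) at one station
rng = np.random.default_rng(3)
rain = 10 + 0.08 * np.arange(50) + rng.gamma(2.0, 1.0, 50)
print(spearman_trend(rain))
```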

Keywords: Climate change, Mann-Kendall test, Spearman’s Rho test, trends, design rainfall.

Downloads: 2910
6114 Application of Artificial Neural Network to Forecast Actual Cost of a Project to Improve Earned Value Management System

Authors: Seyed Hossein Iranmanesh, Mansoureh Zarezadeh

Abstract:

This paper presents an application of an Artificial Neural Network (ANN) to forecast the actual cost of a project based on the earned value management system (EVMS). For this purpose, some projects were randomly selected from a standard data set, and the necessary progress data, such as actual cost, actual percent complete, baseline cost and percent complete, were produced for five periods of each project. Then an ANN with five inputs, five outputs and one hidden layer was trained to produce forecasted actual costs. The comparison between real and forecasted data shows good performance based on the Mean Absolute Percentage Error (MAPE) criterion. This approach could be applied to forecast project cost more accurately, resulting in a decreased risk of project cost overrun; it is therefore beneficial for planning preventive actions.
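
A sketch of the forecasting setup, assuming simulated EVMS progress data and a scikit-learn multilayer perceptron with one hidden layer in place of the authors' network; the MAPE computation at the end matches the criterion named in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)

# Hypothetical EVMS progress data for 200 projects over five periods:
# inputs  = planned (baseline) cost per period
# outputs = actual cost per period (simulated here with a random overrun factor)
planned = np.cumsum(rng.uniform(50, 150, size=(200, 5)), axis=1)
actual = planned * rng.uniform(0.95, 1.25, size=(200, 1))

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(planned[:150], actual[:150])

pred = model.predict(planned[150:])
mape = np.mean(np.abs((actual[150:] - pred) / actual[150:])) * 100
print(f"MAPE on held-out projects: {mape:.1f}%")
```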

Keywords: Earned Value Management System (EVMS), Artificial Neural Network (ANN), Estimate At Completion, Forecasting Methods, Project Performance Measurement.

Downloads: 2768
6113 Overview of Development of a Digital Platform for Building Critical Infrastructure Protection Systems in Smart Industries

Authors: Bruno Vilić Belina, Ivan Župan

Abstract:

Smart industry concepts and digital transformation are very popular in many industries. They develop their own digital platforms, which have an important role in innovations and transactions. The main idea of smart industry digital platforms is central data collection, industrial data integration and data usage for smart applications and services. This paper presents the development of a digital platform for building critical infrastructure protection systems in smart industries. It researches different service contracting modalities in Service Level Agreements (SLAs), Customer Relationship Management (CRM) relations, and trends and changes in business architectures (especially process business architecture) for the purpose of developing infrastructural production and distribution networks, information infrastructure meta-models and the generic processes demanded of critical infrastructure owners by critical infrastructure law, while satisfying cybersecurity requirements and taking hybrid threats into account.

Keywords: Cybersecurity, critical infrastructure, smart industries, digital platform.

Downloads: 228
6112 Customer Need Type Classification Model using Data Mining Techniques for Recommender Systems

Authors: Kyoung-jae Kim

Abstract:

Recommender systems are usually regarded as an important marketing tool in e-commerce. They use important information about users to facilitate accurate recommendation. The information includes user context, such as location, time and interest, for the personalization of mobile users. Information about location and time can easily be collected because mobile devices communicate with the base station of the service provider. However, information about user interest cannot be easily collected because user interest cannot be captured automatically without the user's approval. User interest is usually represented as a need. In this study, we classify needs into two types according to prior research. This study investigates the usefulness of data mining techniques for classifying user need type for recommender systems. We employ several data mining techniques, including artificial neural networks, decision trees, case-based reasoning, and multivariate discriminant analysis. Experimental results show that the CHAID algorithm outperforms the other models in classifying user need type. This study performs McNemar tests to examine the statistical significance of the differences between the classification results. The results of the McNemar tests also show that CHAID performs better than the other models with statistical significance.
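
As an illustration of the significance check mentioned above, a McNemar test on a hypothetical 2x2 disagreement table can be run with statsmodels; the counts below are made up and do not come from the paper.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical 2x2 table over holdout cases:
# rows = CHAID correct / wrong, columns = competing model correct / wrong
table = np.array([[520, 48],
                  [17, 115]])

result = mcnemar(table, exact=False, correction=True)
print(result.statistic, result.pvalue)   # a small p-value indicates a significant difference
```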

Keywords: Customer need type, Data mining techniques, Recommender system, Personalization, Mobile user.

Downloads: 2146
6111 Innovation Knowledge and Capability, Work Efficiency of Accountants and the Success of SME Registered in Nakorn Pathom Province

Authors: Autjira Songan, Supattra Kanchanopast

Abstract:

The objectives of this research were to compare the success of SMEs registered in Nakorn Pathom Province according to personal data, to study the relations between innovation knowledge and capability and the success of SMEs registered in Nakorn Pathom Province, and to study the relations between work efficiency and the success of SMEs registered in Nakorn Pathom Province. A questionnaire was utilized as a tool to collect data. Statistics utilized in this research included frequency, percentage, mean, standard deviation, and multiple regression analysis. Data were analyzed using the Statistical Package for the Social Sciences. The findings revealed that the majority of respondents were male, aged between 25 and 34 years old, held an undergraduate degree, and were married and living together. The average income of respondents was between 10,001 and 20,000 baht. It was also found that, in terms of innovation knowledge and capability, two variables had an influence on innovation evaluation: physical characteristics and the innovation process.

Keywords: Accountants, Innovation, Knowledge, Work Efficiency.

Downloads: 1736
6110 A Generalized Sparse Bayesian Learning Algorithm for Near-Field Synthetic Aperture Radar Imaging: By Exploiting Impropriety and Noncircularity

Authors: Pan Long, Bi Dongjie, Li Xifeng, Xie Yongle

Abstract:

Near-field synthetic aperture radar (SAR) imaging is an advanced nondestructive testing and evaluation (NDT&E) technique. This paper investigates the complex-valued signal processing related to the near-field SAR imaging system, where the measurement data turn out to be noncircular and improper, meaning that the complex-valued data are correlated with their complex conjugate. Furthermore, we discover that the degree of impropriety of the measurement data and that of the target image can be highly correlated in near-field SAR imaging. Based on these observations, a modified generalized sparse Bayesian learning algorithm is proposed, taking impropriety and noncircularity into account. Numerical results show that the proposed algorithm provides a performance gain with the help of the noncircular assumption on the signals.
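
A small illustration of the impropriety idea: the circularity coefficient |E[z^2]|/E[|z|^2] is near zero for a proper (circular) complex signal and approaches one for a strongly improper one. The sparse Bayesian learning algorithm itself is not reproduced here, and the test signals are synthetic.

```python
import numpy as np

def circularity_coefficient(z):
    """Degree of impropriety of a complex signal: |E[z^2]| / E[|z|^2].
    0 means proper (circular); values near 1 mean strongly improper (noncircular)."""
    z = np.asarray(z) - np.mean(z)
    return np.abs(np.mean(z ** 2)) / np.mean(np.abs(z) ** 2)

rng = np.random.default_rng(5)
proper = rng.normal(size=4000) + 1j * rng.normal(size=4000)   # circular complex noise
improper = rng.normal(size=4000) * (1 + 0.9j)                 # real and imaginary parts fully correlated
print(circularity_coefficient(proper), circularity_coefficient(improper))
```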

Keywords: Complex-valued signal processing, synthetic aperture radar (SAR), 2-D radar imaging, compressive sensing, Sparse Bayesian learning.

Downloads: 1526
6109 Validity and Reliability of Competency Assessment Implementation (CAI) Instrument Using Rasch Model

Authors: Nurfirdawati Muhamad Hanafi, Azmanirah Ab Rahman, Marina Ibrahim Mukhtar, Jamil Ahmad, Sarebah Warman

Abstract:

This study was conducted to generate empirical evidence on the validity and reliability of the items of the Competency Assessment Implementation (CAI) Instrument using the Rasch Model for polytomous data, aided by Winstep software version 3.68. The construct validity was examined by analyzing the point-measure correlation index (PTMEA) and the infit and outfit MNSQ values; meanwhile, the reliability was examined by analyzing the item reliability index. A survey technique was used as the major method, with the CAI instrument administered to 156 teachers from vocational schools. The results show that the reliability of the CAI Instrument items was between 0.80 and 0.98. The PTMEA correlations are positive, indicating that the items are able to distinguish between respondents of different ability. The statistical data obtained show that, out of 154 items, 12 items of the instrument are suggested to be omitted. It is hoped that this study could bring a new direction to the process of data analysis in educational research.

Keywords: Competency Assessment, Reliability, Validity, Item Analysis.

Downloads: 2831
6108 A Delay-Tolerant Distributed Query Processing Architecture for Mobile Environment

Authors: T.P. Andamuthu, Dr. P. Balasubramanie

Abstract:

Intermittent connectivity modifies the "always on" network assumption made by all distributed query processing systems. In modern-day systems, the absence of network connectivity is considered a fault. It might not be feasible to transmit, right away over the available connection, all the data accumulated since the last upload, and it is possible that vital information may be delayed excessively when less important information takes its place. Owing to the restricted and uneven bandwidth, it is vital that the mobile nodes make the most advantageous use of the connectivity when it arrives. Hence, in order to select the data that need to be transmitted first, some sort of data prioritization is essential. A continuous query processing system for intermittently connected mobile networks, comprising a delay-tolerant continuous query processor distributed across the mobile hosts, is proposed in this paper. In addition, a mechanism for prioritizing query results has been designed that guarantees enhanced accuracy and reduced delay. Extensive simulation results illustrate that our architecture reduces client power consumption and increases query efficiency.
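
The abstract does not specify the prioritization scheme; as a generic sketch, a heap-ordered upload queue transmits the most important buffered results first when a connection window opens. The priority values and the bandwidth budget below are hypothetical.

```python
import heapq
import itertools

class UploadQueue:
    """Priority-ordered buffer of pending query results: when connectivity returns,
    the most important (lowest priority number) results are transmitted first."""

    def __init__(self):
        self._heap = []
        self._tie = itertools.count()   # preserves FIFO order among equal priorities

    def add(self, priority: int, result):
        heapq.heappush(self._heap, (priority, next(self._tie), result))

    def drain(self, bandwidth_budget: int):
        """Send as many results as the current connection window allows."""
        sent = []
        while self._heap and len(sent) < bandwidth_budget:
            _, _, result = heapq.heappop(self._heap)
            sent.append(result)
        return sent

q = UploadQueue()
q.add(2, "routine sensor reading")
q.add(0, "vital alarm")
q.add(1, "location update")
print(q.drain(bandwidth_budget=2))   # ['vital alarm', 'location update']
```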

Keywords: Broadcast, Location, Mobile host, Mobility, Query.

Downloads: 1450
6107 Practical Guidelines and Examples for the Users of the TMS320C6713 DSK

Authors: Abdullah A Wardak

Abstract:

This paper describes how the correct endian mode of the TMS320C6713 DSK board can be identified. It also explains how the TMS320C6713 DSK board can be used in the little endian and in the big endian modes for assembly language programming in particular and for signal processing in general. Similarly, it discusses how crucially important it is for a user of the TMS320C6713 DSK board to identify the mode of operation and then use it correctly during the development stages of assembly language programming; otherwise, it will cause unnecessary confusion and erroneous results as far as storing data into memory and loading data from memory are concerned. Furthermore, it highlights and strongly recommends that users of the TMS320C6713 DSK board be aware of the availability and importance of the various display options in the Code Composer Studio (CCS) for correctly interpreting and displaying the desired data in memory. The information presented in this paper will be of great importance and interest to those practitioners and developers who want to use the TMS320C6713 DSK board for assembly language programming as well as input-output signal processing manipulations. Finally, examples that clearly illustrate the concept are presented.
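
Endianness on the C6713 is a matter of board configuration and assembly tooling; purely to illustrate the concept, a few lines of Python show how the same 32-bit word is laid out in memory under each convention and how reading it back with the wrong assumption corrupts the value.

```python
import struct

word = 0x11223344

little = struct.pack("<I", word)   # little endian: least significant byte stored first
big = struct.pack(">I", word)      # big endian: most significant byte stored first

print(little.hex())   # 44332211 - byte order as seen at increasing memory addresses
print(big.hex())      # 11223344

# Reading the bytes back with the wrong assumed endianness silently changes the value,
# which is exactly the kind of confusion the paper warns about.
print(hex(struct.unpack(">I", little)[0]))   # 0x44332211
```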

Keywords: Assembly language programming, big endian mode, little endian mode, signal processing.

Downloads: 2787
6106 A Formulation of the Latent Class Vector Model for Pairwise Data

Authors: Tomoya Okubo, Kuninori Nakamura, Shin-ichi Mayekawa

Abstract:

In this research, a latent class vector model for pairwise data is formulated. As compared to the basic vector model, this model yields consistent estimates of the parameters since the number of parameters to be estimated does not increase with the number of subjects. The result of the analysis reveals that the model was stable and could classify each subject to the latent classes representing the typical scales used by these subjects.

Keywords: finite mixture models, latent class analysis, Thurstone's paired comparison method, vector model

Downloads: 1216
6105 Detecting Circles in Image Using Statistical Image Analysis

Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee

Abstract:

The aim of this work is to detect geometrical shape objects in an image; in this paper, the object is considered to be a circle. The identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, the paper presents an algorithm that combines statistical approaches and image analysis techniques. The algorithm has been implemented to attain the major objectives of this paper. It has been evaluated using simulated data, where it yields good results, and has then been applied to real data.
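
A sketch of one possible pipeline in the spirit of the keywords (median filter, threshold, segmentation): it reports the number, size and location of bright blobs, without the circularity test a full circle detector would add; the image is simulated and the thresholds are illustrative.

```python
import numpy as np
from scipy import ndimage

def detect_blobs(image, threshold=0.5, min_area=20):
    """Median-filter, threshold and label an image; report the size (area)
    and location (centroid) of each detected object."""
    smoothed = ndimage.median_filter(image, size=3)
    mask = smoothed > threshold
    labels, count = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=range(1, count + 1))
    centroids = ndimage.center_of_mass(mask, labels, index=range(1, count + 1))
    return [(a, c) for a, c in zip(areas, centroids) if a >= min_area]

# Simulated data: two bright circles on a noisy background
yy, xx = np.mgrid[0:128, 0:128]
image = ((xx - 40) ** 2 + (yy - 40) ** 2 < 10 ** 2).astype(float)
image += ((xx - 90) ** 2 + (yy - 80) ** 2 < 6 ** 2).astype(float)
image += 0.2 * np.random.default_rng(6).normal(size=image.shape)

for area, (cy, cx) in detect_blobs(image):
    print(f"area={area:.0f} px, centre=({cx:.1f}, {cy:.1f})")
```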

Keywords: Image processing, median filter, projection, scale-space, segmentation, threshold.

Downloads: 1834