Search results for: statistical approach.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5960

5540 Machine Learning Methods for Flood Hazard Mapping

Authors: S. Zappacosta, C. Bove, M. Carmela Marinelli, P. di Lauro, K. Spasenovic, L. Ostano, G. Aiello, M. Pietrosanto

Abstract:

This paper proposes a neural network approach to flood hazard mapping. The core of the model is a machine learning component fed by frequency ratios, namely statistical correlations between flood event occurrences and a selected number of topographic properties. Its classification capability was compared with the flood hazard maps of the River Basin Plans (Piani Assetto Idrogeologico, abbreviated as PAI) designed by ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), the Italian Institute for Environmental Protection and Research, which encode four increasing flood hazard levels. The study area of Piemonte, an Italian region, has been considered without loss of generality. The frequency ratios may be used as a standalone block to model flood hazard. Nevertheless, combining them with a neural network improves the classification power by several percentage points, and the mixture may be proposed as a basic tool for modelling flood hazard maps in a wider scope.
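
As a concrete illustration of the frequency-ratio block described above, the following sketch (not the authors' code; the "slope class" raster and flood mask are synthetic toy arrays) computes the frequency ratio of each class of one topographic factor as the share of flood cells in the class divided by the share of all cells in the class.

```python
# Minimal sketch (not the authors' implementation): frequency ratio (FR) for
# one topographic factor, here a hypothetical slope-class raster flattened
# into a 1-D array, paired with a binary flood-occurrence array.
import numpy as np

def frequency_ratio(factor_classes, flooded):
    """FR per class = (% of flood cells in class) / (% of all cells in class)."""
    fr = {}
    total_cells = factor_classes.size
    total_floods = flooded.sum()
    for c in np.unique(factor_classes):
        in_class = factor_classes == c
        flood_share = flooded[in_class].sum() / total_floods
        area_share = in_class.sum() / total_cells
        fr[c] = flood_share / area_share if area_share > 0 else 0.0
    return fr

# toy example: FR > 1 marks a flood-prone class
classes = np.array([1, 1, 2, 2, 2, 3, 3, 3, 3, 3])
floods  = np.array([1, 0, 1, 1, 0, 0, 0, 1, 0, 0])
print(frequency_ratio(classes, floods))
```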

Keywords: flood modeling, hazard map, neural networks, hydrogeological risk, flood risk assessment

5539 Virtual Learning Process Environment: Cohort Analytics for Learning and Learning Processes

Authors: Ayodeji Adesina, Derek Molloy

Abstract:

Traditional higher-education classrooms allow lecturers to observe students' behaviours and responses to a particular pedagogy during learning, in a way that can inform changes to the pedagogical approach. Within current e-learning systems it is difficult to perform continuous analysis of the cohort's behavioural tendencies, making real-time pedagogical decisions difficult. This paper presents a Virtual Learning Process Environment (VLPE) based on the Business Process Management (BPM) conceptual framework. Within the VLPE, course designers can model various education pedagogies in the form of learning process workflows using an intuitive flow diagram interface. These diagrams are used to visually track the learning progress of a cohort of students. This helps assess the effectiveness of the chosen pedagogy and provides the information required to improve course design. A case scenario of a cohort of students is presented, and quantitative statistical analysis of their learning process performance is gathered and displayed in real time using dashboards.

Keywords: Business process management, cohort analytics, learning processes, virtual learning environment.

5538 Prediction of Research Topics Using Ensemble of Best Predictors from Similar Dataset

Authors: Indra Budi, Rizal Fathoni Aji, Agus Widodo

Abstract:

Prediction of future research topics by time series analysis, either statistical or machine learning based, has been conducted previously by several researchers. Several methods have been proposed to combine forecasting results into a single forecast; these methods use a fixed combination of individual forecasts to obtain the final result. In this paper, a quite different approach is employed to select the forecasting methods: each point to be forecast is calculated using the methods that performed best on the most similar validation dataset. The dataset used in the experiment consists of time series derived over the past 20 years from research reports in Garuda, an online site belonging to the Ministry of Education in Indonesia. The experimental results demonstrate that the proposed method may perform better than a fixed combination of predictors. In addition, based on the prediction results, we can forecast emerging research topics for the next few years.
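
The per-point selection idea can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: three simple forecasters stand in for the candidate methods, similarity is plain Euclidean distance between windows, and the per-method validation errors are hard-coded.

```python
# Minimal sketch: for each point to forecast, pick the method that did best
# on the most similar validation window. Forecasters, similarity measure and
# error values are illustrative assumptions.
import numpy as np

def naive(history):  return history[-1]
def drift(history):  return history[-1] + (history[-1] - history[0]) / max(len(history) - 1, 1)
def mean_f(history): return float(np.mean(history))

METHODS = {"naive": naive, "drift": drift, "mean": mean_f}

def best_method_for(window, validation_windows, validation_errors):
    """Name of the lowest-error method on the most similar validation window."""
    dists = [np.linalg.norm(window - vw) for vw in validation_windows]
    nearest = int(np.argmin(dists))
    return min(validation_errors[nearest], key=validation_errors[nearest].get)

# toy usage: two validation windows with per-method errors already measured
val_windows = [np.array([3.0, 4.0, 5.0]), np.array([9.0, 9.1, 9.0])]
val_errors = [{"naive": 0.9, "drift": 0.2, "mean": 1.1},   # trending segment
              {"naive": 0.1, "drift": 0.8, "mean": 0.2}]   # flat segment
current = np.array([2.0, 3.0, 4.0])
chosen = best_method_for(current, val_windows, val_errors)
print(chosen, METHODS[chosen](current))   # picks "drift" (closest to the trending window)
```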

Keywords: Combination, emerging topics, ensemble, forecasting, machine learning, prediction, research topics, similarity measure, time series.

5537 Non-equilibrium Statistical Mechanics of a Driven Lattice Gas Model: Probability Function, FDT-violation, and Monte Carlo Simulations

Authors: K. Sudprasert, M. Precharattana, N. Nuttavut, D. Triampo, B. Pattanasiri, Y. Lenbury, W. Triampo

Abstract:

The study of non-equilibrium systems has attracted increasing interest in recent years, mainly because, unlike their equilibrium counterparts, they lack general theoretical frameworks. Studying steady states and/or simple systems is thus one of the main interests. Hence, in this work we have focused our attention on the driven lattice gas (DLG) model, consisting of interacting particles subject to an external field E. The dynamics of the system are given by hopping of particles to nearby empty sites, with rates biased for jumps in the direction of E. Using small two-dimensional DLG systems, the stochastic properties at the non-equilibrium steady state were studied analytically. To understand the non-equilibrium phenomena, we applied an analytic approach via the master equation to calculate the probability function and to analyse the violation of detailed balance in terms of the fluctuation-dissipation theorem. Monte Carlo simulations have been performed to validate the analytic results.
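
A minimal Monte Carlo sketch of the biased hopping dynamics is given below. It is an illustration only: particles are treated as non-interacting (the paper's DLG model includes interactions), the field bias is applied as a Metropolis-like acceptance factor, and boundaries are periodic.

```python
# Minimal driven-lattice-gas sketch: particles hop to empty neighbour sites,
# with jumps along +x accepted with probability min(1, exp(E*dx)).
import numpy as np

rng = np.random.default_rng(0)
L, density, E, sweeps = 8, 0.3, 2.0, 2000
lattice = rng.random((L, L)) < density            # True = particle

# candidate jumps and their alignment with the field (+1 along E, -1 against)
jumps = [((0, 1), 1), ((0, -1), -1), ((1, 0), 0), ((-1, 0), 0)]

current = 0
for _ in range(sweeps * L * L):
    x, y = rng.integers(L), rng.integers(L)
    if not lattice[x, y]:
        continue
    (dr, dc), along = jumps[rng.integers(4)]
    nx, ny = (x + dr) % L, (y + dc) % L           # periodic boundaries
    if lattice[nx, ny]:
        continue                                  # target occupied: reject
    if rng.random() < min(1.0, np.exp(E * along)):
        lattice[x, y], lattice[nx, ny] = False, True
        current += along

print("net particle current along E:", current)
```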

Keywords: Non-equilibrium, lattice gas, stochastic process

5536 Design of a Low-Voltage, Low-Offset Class AB Op-Amp

Authors: B. Gholami, S. Gholami, A. Forouzantabar, Sh. Bazyari

Abstract:

A new design approach for three-stage operational amplifiers (op-amps) is proposed. It allows a symmetrical push-pull class-AB output stage to be implemented in well-established three-stage amplifiers using a feedforward transconductance stage. Compared with conventional design practice, the proposed approach leads to a significant improvement of the symmetry between the positive and the negative op-amp step responses, resulting in similar values of the positive and negative settling times. The new approach proves to be very useful for fully exploiting the speed potential of the op-amp. Design examples in a commercial 0.35-μm CMOS process prove the effectiveness of the proposed strategy.

Keywords: Low-voltage op amp, design, optimum design

5535 Electrocardiogram Signal Denoising Using a Hybrid Technique

Authors: R. Latif, W. Jenkal, A. Toumanari, A. Hatim

Abstract:

This paper presents an efficient method of electrocardiogram (ECG) signal denoising based on a hybrid approach. Two techniques are brought together to create an efficient denoising process: an Adaptive Dual Threshold Filter (ADTF) and the Discrete Wavelet Transform (DWT). The presented approach comprises three denoising steps: DWT decomposition, the ADTF step, and a highest-peaks correction step. The paper presents applications of the approach to electrocardiogram signals from the MIT-BIH database. The results of these applications are promising compared to other recently published techniques.
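
The wavelet part of such a pipeline can be sketched with PyWavelets as below. Note that the thresholding step here is plain soft thresholding with a universal threshold, standing in for the paper's ADTF stage, and the signal, wavelet and decomposition level are illustrative choices.

```python
# Minimal DWT-based denoising sketch (soft thresholding as a stand-in for ADTF).
import numpy as np
import pywt

def denoise_ecg(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # universal threshold estimated from the finest detail level
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# toy usage on a noisy synthetic beat-like signal
t = np.linspace(0, 1, 1000)
clean = np.sin(2 * np.pi * 5 * t) * np.exp(-((t - 0.5) ** 2) / 0.01)
noisy = clean + 0.2 * np.random.default_rng(1).standard_normal(t.size)
print(np.std(noisy - clean), np.std(denoise_ecg(noisy) - clean))
```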

Keywords: Hybrid technique, ADTF, DWT, thresholding, ECG signal.

5534 Leakage Reduction ONOFIC Approach for Deep Submicron VLSI Circuits Design

Authors: Vijay Kumar Sharma, Manisha Pattanaik, Balwinder Raj

Abstract:

Minimization of power dissipation and chip area together with higher circuit performance are the key requirements in the deep submicron regime. The leakage current increases sharply in this regime and directly affects the power dissipation of logic circuits. In the deep submicron region, power dissipation as well as high performance is a crucial concern, given the increasing importance of portable systems. A number of leakage reduction techniques have been employed to reduce the leakage current in the deep submicron region, but they involve trade-offs in controlling it. The ONOFIC approach gives an excellent compromise between power dissipation and propagation delay for designing efficient CMOS logic circuits. In this article, the ONOFIC approach is compared with the LECTOR technique, and the results show that ONOFIC significantly reduces the power dissipation and enhances the speed of the logic circuits. The lower power-delay product is the main outcome of this approach and makes it an influential leakage reduction technique.

Keywords: Deep submicron, Leakage Current, LECTOR, ONOFIC, Power Delay Product

5533 An AR/VR Based Approach Towards the Intuitive Control of Mobile Rescue Robots

Authors: Jürgen Roßmann, André Kupetz, Roland Wischnewski

Abstract:

An intuitive user interface for the teleoperation of mobile rescue robots is one key feature for a successful exploration of inaccessible and no-go areas. Therefore, we have developed a novel framework to embed a flexible and modular user interface into a complete 3-D virtual reality simulation system. Our approach is based on a client-server architecture to allow for a collaborative control of the rescue robot together with multiple clients on demand. Further, it is important that the user interface is not restricted to any specific type of mobile robot. Therefore, our flexible approach allows for the operation of different robot types with a consistent concept and user interface. In laboratory tests, we have evaluated the validity and effectiveness of our approach with the help of two different robot platforms and several input devices. As a result, an untrained person can intuitively teleoperate both robots without needing a familiarization time when changing the operating robot.

Keywords: Teleoperation of mobile robots, augmented reality, user interface, virtual reality.

5532 Statistical Analysis of the Factors that Influence the Properties of Blueberries from Cultivar Bluecrop

Authors: Raquel P. F. Guiné, Susana R. Matos, Daniela V. T. A. Costa, Fernando J. Gonçalves

Abstract:

Because blueberries are recognized worldwide as a good source of beneficial components, their consumption has increased in the past decades, and so have the scientific works about their properties. Hence, this work was undertaken to evaluate the effect of some production and conservation factors on the properties of blueberries from cultivar Bluecrop. The physical and chemical analyses were done according to established methodologies, and all data were then treated using the software SPSS to assess possible differences among the factors investigated and/or correlations between the variables under study. The results showed that the location of production influenced some of the berries' properties (caliber, sugars, antioxidant activity, color and texture) and that the age of the bushes was correlated with moisture, sugars and acidity, as well as lightness. On the other hand, the altitude of the farm was correlated only with sugar content. With regards to conservation, it influenced only the anthocyanin content and the DPPH antioxidant activity. Finally, the type of extract and the order of extraction had a pronounced influence on all the phenolic properties evaluated.

Keywords: Antioxidant activity, blueberry, conservation, geographical origin, phenolic compounds, statistical analysis.

5531 Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients Cohorts: A Case Study in Scotland

Authors: Sotirios Raptis

Abstract:

Health and Social care (HSc) services planning and scheduling are facing unprecedented challenges due to pandemic pressure, and also suffer from unplanned spending that has been negatively impacted by the global financial crisis. Data-driven approaches can help to improve policies and to plan and design service provision schedules, using algorithms that assist healthcare managers to face unexpected demands with fewer resources. The paper discusses services packing using statistical significance tests and machine learning (ML) to evaluate demand similarity and coupling. This is achieved by predicting the range of the demand (class) using ML methods such as Classification and Regression Trees (CART), Random Forests (RF), and Logistic Regression (LGR). The significance tests, Chi-Squared and Student's t-test, are applied to data over a 39-year span for which records exist for services delivered in Scotland. The demands are associated using probabilities and form parts of statistical hypotheses whose null part assumes that the target demand is statistically dependent on other services' demands; this linking is checked against the data. In addition, ML methods are used to predict the target demands linearly from the statistically found associations and to extend the linear dependence of the target demand to independent demands, thus forming groups of services. Statistical tests confirmed the ML coupling, made the prediction statistically meaningful, and proved that a target service can be matched reliably to other services, while ML showed that such marked relationships can also be linear ones. Zero padding was used for missing yearly records and better illustrated such relationships, both for limited years and for the entire span, offering long-term data visualizations, while limited-year periods showed how well patient numbers can be related over short periods of time, or how they can change over time as opposed to behaviours across more years. The prediction performance of the associations was measured using metrics such as the Receiver Operating Characteristic (ROC), the Area Under the Curve (AUC) and Accuracy (ACC), as well as the Chi-Squared and Student's tests. Co-plots and comparison tables for the RF, CART, and LGR methods, the p-values from the tests, and Information Exchange (IE/MIE) measures are provided, showing the relative performance of the ML methods and of the statistical tests, as well as the behaviour at different learning ratios. The impact of k-nearest neighbours classification (k-NN), Cross-Correlation (CC) and C-Means (CM) first groupings was also studied over limited years and for the entire span. It was found that CART was generally behind RF and LGR, but in some interesting cases LGR reached an AUC of 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by zero padding, by irregularities in the data, or by outliers. On average, three linear predictors were sufficient; LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was more than 0.05. Social-factor relationships were observed between home care services and the treatment of old people, low birth weights, alcoholism, drug abuse, and emergency admissions.
The work found that different HSc services can be well packed as plans of limited duration, across various services sectors and learning configurations, as confirmed by statistical hypothesis testing.
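
The two ingredients, an association test on binned demand classes and a linear (logistic) predictor scored by AUC, can be sketched as follows on synthetic data. The chi-squared test, the 39-point series and the high/low binning are illustrative stand-ins for the study's demand records, not its pipeline.

```python
# Minimal sketch: chi-squared association between binned demand classes, then
# a logistic-regression check of linear predictability, scored by AUC.
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
years = 39
other = rng.normal(size=(years, 3))                       # three candidate services
target = (other[:, 0] + 0.5 * other[:, 1] + rng.normal(scale=0.5, size=years)) > 0

# chi-squared on high/low demand classes for the first candidate vs. the target
table = np.histogram2d(other[:, 0] > 0, target, bins=2)[0]
chi2, p, _, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}  (small p -> association)")

# linear predictability of the target class from the candidate demands
clf = LogisticRegression().fit(other, target)
print("AUC:", roc_auc_score(target, clf.predict_proba(other)[:, 1]))
```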

Keywords: Class, cohorts, data frames, grouping, prediction, probabilities, services.

5530 An Approach for Optimization of Functions and Reducing the Value of the Product by Using Virtual Models

Authors: A. Bocevska, G. Todorov, T. Neshkov

Abstract:

A newly developed approach for Functional Cost Analysis (FCA), based on virtual prototyping (VP) models in a CAD/CAE environment, applicable and necessary in developing new products, is presented. It is an instrument for improving the value of a product while maintaining costs, and/or reducing the costs of the product without reducing its value. Five broad classes of VP methods are identified. Efficient use of prototypes in FCA is a vital activity that can make the difference between successful and unsuccessful entry of new products into the competitive world market. A successful realization of this approach is illustrated for a specific example using a press joint power tool.

Keywords: CAD/CAE environment, Functional Cost Analysis (FCA), Virtual prototyping (VP) models.

5529 An Experimental Design Approach to Determine Effects of the Operating Parameters on the Rate of Ru-Promoted Ir Carbonylation of Methanol

Authors: Vahid Hosseinpour, Mohammad Kazemini, Alireza Mohammadrezaee

Abstract:

Carbonylation of methanol in the homogeneous phase is one of the major routes for the production of acetic acid. Amongst the group VIII metal catalysts used in this process, iridium has displayed the best capabilities. To investigate the effect of operating parameters such as temperature, pressure, and methyl iodide, methyl acetate, iridium, ruthenium, and water concentrations on the reaction rate, an experimental design for this system based upon central composite design (CCD) was utilized. The statistical rate equation developed by this method contained individual, interaction and curvature effects of the parameters on the reaction rate. The model, with a p-value less than 0.0001 and R2 values greater than 0.9, confirmed a satisfactory fit between the experimental and theoretical studies. In other words, the developed model and the experimental data obtained passed all diagnostic tests, establishing the model as statistically significant.
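
The kind of second-order model a CCD is analysed with can be sketched as below; the factor names (coded temperature and methyl iodide level) and the response values are synthetic and purely illustrative, not the study's measurements.

```python
# Minimal sketch: fit a quadratic + interaction (response-surface) model of
# rate vs. two coded factors, as done when analysing a central composite design.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(30, 2))        # coded levels of two factors
rate = (5 + 2 * X[:, 0] + 1.5 * X[:, 1] - 1.2 * X[:, 0] ** 2
        + 0.8 * X[:, 0] * X[:, 1] + rng.normal(scale=0.1, size=30))

quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), rate)
print(dict(zip(quad.get_feature_names_out(["T", "MeI"]), model.coef_.round(2))))
print("R^2:", round(model.score(quad.transform(X), rate), 3))
```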

Keywords: Acetic Acid, Carbonylation of Methanol, Central Composite Design, Experimental Design, Iridium/Ruthenium

5528 Forecasting the Influences of Information and Communication Technology on the Structural Changes of Japanese Industrial Sectors: A Study Using Statistical Analysis

Authors: Ubaidillah Zuhdi, Shunsuke Mori, Kazuhisa Kamegai

Abstract:

The purpose of this study is to forecast the influences of information and communication technology (ICT) on structural changes in the Japanese economy. Input-output (IO) and statistical approaches are used as analysis instruments; more specifically, the study employs Leontief IO coefficients and a constrained multivariate regression (CMR) model. The initial and forecast periods in this study are 2005 and 2015, respectively. ICT is represented by ICT capital stocks. The study conducts two levels of analysis, namely macro and micro. The results of the macro-level analysis show that the dynamics of the Japanese economy in the forecast period, relative to the initial period, are not so high. For the micro-level analysis we focus on the (1) commerce, (2) business services and office supplies, and (3) personal services sectors, and further analyse their specific IO coefficients. The results of this analysis show that ICT exerts a strong influence on the changes of these coefficients from the initial to the forecast period.
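
For readers unfamiliar with the IO machinery, a toy worked example of Leontief technical coefficients and the Leontief inverse is given below; the three-sector flow table and demand vector are invented numbers, not the study's data.

```python
# Toy Leontief input-output example: technical coefficients A = Z / x
# (column-wise), and gross output x = (I - A)^(-1) d for final demand d.
import numpy as np

Z = np.array([[20., 30., 10.],     # inter-industry flows (rows sell to columns)
              [10., 10., 30.],
              [30., 20., 10.]])
x = np.array([100., 120., 150.])   # total output per sector

A = Z / x                          # technical (IO) coefficients, column-wise
d = np.array([40., 70., 90.])      # a hypothetical final-demand vector
leontief_inverse = np.linalg.inv(np.eye(3) - A)
print("gross output needed:", leontief_inverse @ d)
```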

Keywords: Forecast, ICT, Structural changes, Japanese economies.

5527 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency

Authors: Rania Alshikhe, Vinita Jindal

Abstract:

Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles, such as taxis, through installed global positioning system (GPS)-enabled devices can be utilized. They offer an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper aims to explore big trajectory data to measure the travel efficiency of road networks using the proposed statistical travel efficiency measure (STEM) across an entire city. Further, it identifies the causes of low travel efficiency by the proposed least square approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to point B in the road network. The obtained results show that the proposed approach outperforms the baseline algorithms for measuring the travel efficiency of the road network.

Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE

5526 Probability Density Estimation Using Advanced Support Vector Machines and the Expectation Maximization Algorithm

Authors: Refaat M Mohamed, Ayman El-Baz, Aly A. Farag

Abstract:

This paper presents a new approach to probability density function estimation using the Support Vector Machine (SVM) and Expectation Maximization (EM) algorithms. In the proposed approach, an advanced algorithm for SVM density estimation, which incorporates Mean Field theory in the learning process, is used. Instead of using ad-hoc values for the parameters of the kernel function used by the SVM algorithm, the proposed approach uses the EM algorithm for automatic optimization of the kernel. Experimental evaluation using a simulated data set shows encouraging results.

Keywords: Density Estimation, SVM, Learning Algorithms, Parameters Estimation.

5525 Signed Approach for Mining Web Content Outliers

Authors: G. Poonkuzhali, K. Thiagarajan, K. Sarukesi, G. V. Uma

Abstract:

The emergence of the Internet has brought about a revolution in information storage and retrieval. As most of the data on the web is unstructured and contains a mix of text, video, audio, etc., there is a need to mine information to cater to the specific needs of users without loss of important hidden information. Developing user-friendly and automated tools for providing relevant information quickly thus becomes a major challenge in web mining research. Most of the existing web mining algorithms have concentrated on finding frequent patterns while neglecting the less frequent ones, which are likely to contain outlying data such as noise and irrelevant or redundant data. This paper focuses on a signed approach and full word matching on an organized domain dictionary for mining web content outliers. The signed approach yields the relevant web documents as well as the outlying web documents. As the dictionary is organized based on the number of characters in a word, searching and retrieval of documents takes less time and less space.

Keywords: Outliers, Relevant document, Signed approach, Web content mining, Web documents.

5524 Approach of Measuring System Analyses for Automotive Part Manufacturing

Authors: S. Homrossukon, S. Sansureerungsigun

Abstract:

This work aims to introduce an efficient, standardized approach to measurement system analysis for automotive part manufacturing. The study started with a literature review on the management and analysis of measurement systems. An approach to measurement system management was then constructed and validated by collecting current measurement data using the equipment of interest, namely a vernier caliper and a micrometer, and analysing the accuracy and precision of their measurements. Finally, the measurement system was improved and re-evaluated. The study showed that the vernier caliper did not meet the required measuring characteristics in terms of linearity, whereas both instruments lacked the required measuring precision. Consequently, the causes of measurement variation for the equipment of interest were identified. After the improvement, the measuring performance of both instruments was found to be acceptable against the required standard. Finally, the standardized approach for analysing automotive measuring systems was established.
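
A bias/linearity check of the kind described can be sketched as follows; the reference sizes, the simulated caliper readings and the acceptance reasoning are illustrative assumptions, not the study's measurements.

```python
# Minimal sketch: regress measurement bias on reference size; a slope near
# zero indicates acceptable linearity, and the pooled per-part standard
# deviation gives a rough repeatability figure.
import numpy as np
from scipy.stats import linregress

reference = np.array([5.0, 10.0, 15.0, 20.0, 25.0])      # mm, reference parts
rng = np.random.default_rng(3)
# each reference part measured 10 times with the caliper (simulated readings)
measured = reference[:, None] * 1.002 + rng.normal(scale=0.02, size=(5, 10))

bias = measured.mean(axis=1) - reference
fit = linregress(reference, bias)
print(f"linearity slope = {fit.slope:.4f}, p = {fit.pvalue:.3f}")
print(f"repeatability (pooled std): {measured.std(axis=1, ddof=1).mean():.4f} mm")
```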

Keywords: Automotive part manufacturing measurement, measuring accuracy, measuring precision, measurement system analyses.

5523 Analysis of Web User Identification Methods

Authors: Renáta Iváncsy, Sándor Juhász

Abstract:

Web usage mining has become a popular research area, as a huge amount of data is available online. These data can be used for several purposes, such as web personalization, web structure enhancement, web navigation prediction, etc. However, the raw log files are not directly usable; they have to be preprocessed in order to transform them into a suitable format for different data mining tasks. One of the key issues in the preprocessing phase is to identify web users. Identifying users based on web log files is not a straightforward problem, thus various methods have been developed. Several difficulties have to be overcome, such as client-side caching and changing or shared IP addresses. This paper presents three different methods for identifying web users. Two of them are the most commonly used methods in web log mining systems, whereas the third one is our novel approach that uses a complex cookie-based method to identify web users. Furthermore, we also take steps towards identifying the individuals behind the impersonal web users. To demonstrate the efficiency of the new method we developed an implementation called the Web Activity Tracking (WAT) system, which aims at a more precise distinction of web users based on log data. We present statistical analyses created by WAT on real data about the behaviour of Hungarian web users, together with a comprehensive analysis and comparison of the three methods.
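
The fallback logic behind such identification methods can be sketched as below; the log fields and entries are hypothetical, and the cookie-first rule is a simplification rather than the paper's WAT implementation.

```python
# Minimal sketch: group requests by cookie id when present, otherwise by the
# IP address + user-agent pair (a common heuristic fallback).
from collections import defaultdict

log = [
    {"ip": "10.0.0.1", "agent": "Firefox", "cookie": "abc", "url": "/a"},
    {"ip": "10.0.0.1", "agent": "Firefox", "cookie": "abc", "url": "/b"},
    {"ip": "10.0.0.1", "agent": "Chrome",  "cookie": None,  "url": "/a"},  # shared IP
    {"ip": "10.0.0.2", "agent": "Chrome",  "cookie": None,  "url": "/c"},
]

def user_key(entry):
    if entry["cookie"]:                                   # cookie-based: most precise
        return ("cookie", entry["cookie"])
    return ("ip+agent", entry["ip"], entry["agent"])      # heuristic fallback

sessions = defaultdict(list)
for entry in log:
    sessions[user_key(entry)].append(entry["url"])
for user, pages in sessions.items():
    print(user, pages)
```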

Keywords: Data preparation, Tracking individuals, Web user identification, Web usage mining

5522 Approach Based on Fuzzy C-Means for Band Selection in Hyperspectral Images

Authors: Diego Saqui, José H. Saito, José R. Campos, Lúcio A. de C. Jorge

Abstract:

Hyperspectral images and remote sensing are important for many applications. A problem in the use of these images is the high volume of data to be processed, stored and transferred. Dimensionality reduction techniques can be used to reduce this volume of data. In this paper, an approach to band selection based on clustering algorithms is presented. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. Attributes that are new with respect to other studies in the literature, such as kurtosis and low correlation, are also considered. A comparison of the results of the approach using Fuzzy C-Means and K-Means with different attributes is performed. Both algorithms show similarly good results, particularly when the variance and kurtosis attributes are used in the clustering process, and are applicable to hyperspectral images.
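
The band-clustering idea can be sketched with K-Means (the K-Means variant compared in the paper; Fuzzy C-Means is not available in scikit-learn); the random data cube and the choice of variance and kurtosis as features are illustrative assumptions.

```python
# Minimal sketch: cluster bands by simple statistics (variance, kurtosis)
# and keep one representative band per cluster.
import numpy as np
from scipy.stats import kurtosis
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
cube = rng.random((64, 64, 50))                       # rows x cols x bands

bands = cube.reshape(-1, cube.shape[2]).T             # one row per band
features = np.c_[bands.var(axis=1), kurtosis(bands, axis=1)]

k = 5
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
selected = [int(np.where(labels == c)[0][0]) for c in range(k)]   # one band per cluster
print("selected band indices:", selected)
```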

Keywords: Band selection, fuzzy C-means, K-means, hyperspectral image.

5521 A Distributed Approach to Extract High Utility Itemsets from XML Data

Authors: S. Kannimuthu, K. Premalatha

Abstract:

This paper investigates a new data mining capability that entails mining of High Utility Itemsets (HUIs) in a distributed environment. Existing research in data mining deals only with the presence or absence of items and does not consider semantic measures such as the weight or cost of the items; thus, HUI mining algorithms have evolved. HUI mining is a kind of utility mining that aims to identify itemsets whose utility satisfies a given threshold. However, mining HUIs in a distributed environment, and mining them from XML data, have not been explored yet. In this work, a novel approach is proposed to mine HUIs from XML-based data in a distributed environment. The work utilizes the Service Oriented Computing (SOC) paradigm, which provides Knowledge as a Service (KaaS). The interesting patterns are provided via web services, with the help of a knowledge server, to answer the queries of the consumers. The performance of the approach is evaluated on various databases using execution time and memory consumption.
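
The utility notion underlying HUI mining can be sketched as follows; the item names, unit utilities, transactions and threshold are invented for illustration, and the brute-force enumeration is not the paper's distributed algorithm.

```python
# Minimal sketch: an itemset's utility in a transaction = sum over its items
# of (quantity x external utility); itemsets whose total utility across all
# transactions meets a threshold are "high utility".
from itertools import combinations

external_utility = {"A": 5, "B": 2, "C": 10}          # e.g. unit profit
transactions = [
    {"A": 2, "B": 1},          # item -> quantity purchased
    {"A": 1, "C": 3},
    {"B": 4, "C": 1},
]

def utility(itemset, txn):
    if not all(i in txn for i in itemset):
        return 0
    return sum(txn[i] * external_utility[i] for i in itemset)

min_util = 30
items = sorted(external_utility)
for size in (1, 2):
    for itemset in combinations(items, size):
        total = sum(utility(itemset, t) for t in transactions)
        if total >= min_util:
            print(itemset, total)                     # prints ('C',) 40 and ('A','C') 35
```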

Keywords: Data mining, Knowledge as a Service, service oriented computing, utility mining.

5520 ABURAS Index: A Statistically Developed Index for Dengue-Transmitting Vector Population Prediction

Authors: Hani M. Aburas

Abstract:

“Dengue" is an African word meaning “bone breaking" because it causes severe joint and muscle pain that feels like bones are breaking. It is an infectious disease mainly transmitted by female mosquito, Aedes aegypti, and causes four serotypes of dengue viruses. In recent years, a dramatic increase in the dengue fever confirmed cases around the equator-s belt has been reported. Several conventional indices have been designed so far to monitor the transmitting vector populations known as House Index (HI), Container Index (CI), Breteau Index (BI). However, none of them describes the adult mosquito population size which is important to direct and guide comprehensive control strategy operations since number of infected people has a direct relationship with the vector density. Therefore, it is crucial to know the population size of the transmitting vector in order to design a suitable and effective control program. In this context, a study is carried out to report a new statistical index, ABURAS Index, using Poisson distribution based on the collection of vector population in Jeddah Governorate, Saudi Arabia.

Keywords: Poisson distribution, statistical index, prediction, Aedes aegypti.

5519 Automata Theory Approach for Solving Frequent Pattern Discovery Problems

Authors: Renáta Iváncsy, István Vajk

Abstract:

The various types of frequent pattern discovery problems, namely the frequent itemset, sequence and graph mining problems, are solved in different ways which are, however, similar in certain aspects. The main approaches to discovering such patterns can be classified into two classes, namely the level-wise methods and the database projection-based methods. The level-wise algorithms generally use clever indexing structures for discovering the patterns. In this paper a new approach is proposed for discovering frequent sequences and tree-like patterns efficiently, based on the level-wise idea. Because the level-wise algorithms spend a lot of time on the subpattern testing problem, the new approach introduces the idea of using automata theory to solve this problem.
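
The subpattern-testing idea for sequences can be sketched as a tiny state machine: the candidate pattern defines the states, and a database sequence is accepted if it drives the machine to the final state. This illustrates the principle only and is not the paper's automaton construction.

```python
# Minimal sketch: one state per symbol of the candidate pattern; a database
# sequence advances the state on matches, and reaching the final state means
# the candidate occurs as a subsequence.
def contains_subsequence(database_seq, candidate):
    state = 0                                   # index of the next symbol to match
    for symbol in database_seq:
        if state < len(candidate) and symbol == candidate[state]:
            state += 1                          # transition on a matching symbol
    return state == len(candidate)              # accepting state reached?

print(contains_subsequence("abcacb", "acb"))    # True
print(contains_subsequence("abcacb", "bba"))    # False
```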

Keywords: Frequent pattern discovery, graph mining, pushdown automaton, sequence mining, state machine, tree mining.

5518 Connectionist Approach to Generic Text Summarization

Authors: Rajesh S. Prasad, U. V. Kulkarni, Jayashree R. Prasad

Abstract:

As the enormous amount of on-line text on the World-Wide Web grows, the development of methods for automatically summarizing this text becomes more important. The primary goal of this research is to create an efficient tool that is able to summarize large documents automatically. We propose an evolving connectionist system: an adaptive, incremental learning and knowledge representation system that evolves its structure and functionality. In this paper, we propose a novel approach to part-of-speech disambiguation using a recurrent neural network, a paradigm capable of dealing with sequential data. We observed that the connectionist approach to text summarization has a natural way of learning grammatical structures through experience. Experimental results show that our approach achieves acceptable performance.

Keywords: Artificial Neural Networks (ANN); Computational Intelligence (CI); Connectionist Text Summarizer (ECTS); Evolving connectionist systems; Evolving systems; Fuzzy systems (FS); Part of Speech (POS) disambiguation.

5517 Statistical Analysis of the Impact of Maritime Transport Gross Domestic Product on Nigeria’s Economy

Authors: K. P. Oyeduntan, K. Oshinubi

Abstract:

Nigeria is referred to as the ‘Giant of Africa’ due to its high population, land mass and large economy. However, it still trails far behind many smaller economies in the continent in terms of maritime operations. Since the maritime industry is the spark plug for national growth, housing the most crucial infrastructure that generates wealth for a nation, it is worrisome that a nation with six seaports lags in maritime activities. In this research, we studied how the Gross Domestic Product (GDP) of maritime transport influences the Nigerian economy. To do this, we applied Simple Linear Regression (SLR), Support Vector Machine (SVM), Polynomial Regression Model (PRM), Generalized Additive Model (GAM) and Generalized Linear Mixed Model (GLMM) approaches to model the relationship between the nation’s Total GDP (TGDP) and the Maritime Transport GDP (MGDP) using a 20-year time series. The results showed that the MGDP is statistically significant for the Nigerian economy. Amongst the statistical tools applied, a PRM of order 4 describes the relationship best. The recommendations presented in this study will guide policy makers and help improve the economy of Nigeria.
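
The order-4 polynomial fit can be sketched as follows; the MGDP/TGDP figures are synthetic, so the coefficients and R-squared shown are illustrative only and not the study's results.

```python
# Minimal sketch: order-4 polynomial regression of total GDP on maritime
# transport GDP, with a hand-computed R^2.
import numpy as np

rng = np.random.default_rng(5)
mgdp = np.linspace(1.0, 20.0, 20)                        # maritime transport GDP
tgdp = 50 + 4 * mgdp + 0.05 * mgdp ** 2 + rng.normal(scale=2.0, size=20)

coeffs = np.polyfit(mgdp, tgdp, deg=4)                   # order-4 fit
fitted = np.polyval(coeffs, mgdp)
ss_res = np.sum((tgdp - fitted) ** 2)
ss_tot = np.sum((tgdp - tgdp.mean()) ** 2)
print("coefficients (highest power first):", coeffs.round(4))
print("R^2:", 1 - ss_res / ss_tot)
```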

Keywords: Economy, GDP, maritime transport, port, regression.

5516 A Sub-Pixel Image Registration Technique with Applications to Defect Detection

Authors: Zhen-Hui Hu, Jyh-Shong Ju, Ming-Hwei Perng

Abstract:

This paper presents a useful sub-pixel image registration method using line segments and a sub-pixel edge detector. In this approach, straight line segments are first extracted from gray images at the pixel level before applying the sub-pixel edge detector. Next, all sub-pixel line edges are mapped onto the orientation-distance parameter space to solve for line correspondences between images. Finally, the registration parameters with sub-pixel accuracy are solved analytically via two linear least-squares problems. The present approach can be applied to various fields where fast registration with sub-pixel accuracy is required. To illustrate, it is applied to the inspection of printed circuits on a flat panel. A numerical example shows that the present approach is effective and accurate when target images contain a sufficient number of line segments, which is true in many industrial problems.

Keywords: Defect detection, Image registration, Straight line segment, Sub-pixel.

5515 Effects of Video Games and Online Chat on Mathematics Performance in High School: An Approach of Multivariate Data Analysis

Authors: Lina Wu, Wenyi Lu, Ye Li

Abstract:

Taking heavy video game playing among boys and intensive online chatting among girls as symbolic of current adolescent culture, this data analysis project verifies the displacement effect of these activities on deteriorating mathematics performance. To evaluate the correlation or regression coefficients between the factor of playing video games or chatting online and mathematics performance, compared with other factors, we use multivariate analysis techniques and take gender difference into account. We find that the most important reason for the negative sign of the displacement effect on mathematics performance is students’ poor academic background. The statistical analysis methods in this project could be applied to study internet users’ academic performance from high school education to college education.

Keywords: Correlation coefficients, displacement effect, gender difference, multivariate analysis technique, regression coefficients.

5514 Probability Distribution of Rainfall Depth at Hourly Time-Scale

Authors: S. Dan'azumi, S. Shamsudin, A. A. Rahman

Abstract:

Rainfall data at fine resolution, and knowledge of their characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. The paper aims to study the statistical distribution of hourly rainfall depth at 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected and their statistical characteristics were estimated. Three probability distributions, namely the Generalized Pareto, Exponential and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling and Chi-Squared tests, were used to evaluate their fitness. The results indicate that the east coast of the Peninsula receives a higher depth of rainfall than the west coast; however, the rainfall frequency is found to be irregular. The goodness-of-fit tests also show that all three models fit the rainfall data at the 1% level of significance. However, the Generalized Pareto fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
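
The fitting-and-testing workflow can be sketched with SciPy as below; the simulated depths stand in for the Malaysian hourly records, and only the Kolmogorov-Smirnov screen is shown (the study also uses Anderson-Darling and Chi-Squared tests).

```python
# Minimal sketch: fit the three candidate distributions to hourly rainfall
# depths and screen them with a Kolmogorov-Smirnov test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
depths = rng.gamma(shape=0.8, scale=5.0, size=500)       # simulated depths (mm)

candidates = {
    "genpareto": stats.genpareto,
    "expon":     stats.expon,
    "gamma":     stats.gamma,
}
for name, dist in candidates.items():
    params = dist.fit(depths, floc=0)                    # fix location at zero
    ks_stat, p_value = stats.kstest(depths, name, args=params)
    print(f"{name:10s}  KS={ks_stat:.3f}  p={p_value:.3f}")
```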

Keywords: Goodness-of-fit test, Hourly rainfall, Malaysia, Probability distribution.

5513 Customer Relationship Management on Social Media Affecting Brand Loyalty of Siam Commercial Bank in Bangkok

Authors: Charawee Butbumrung

Abstract:

The purpose of this research was to study how customer relationship management on social media affects brand loyalty towards Siam Commercial Bank in Bangkok. The statistics used in the data analysis were frequency, mean, standard deviation, and Pearson’s correlation coefficient, computed with a social science statistics program. The study found that the majority of the respondents were female, 37–47 years of age, held a bachelor degree, and had a monthly income between 10,001 and 15,000 Baht. In addition, customer relationship management, both overall and in each aspect of formulating, maintaining, and extending the customer relationship, received a high score. Furthermore, hypothesis testing showed that differences in the customers’ age, education, occupation, and average monthly income corresponded to differences in brand loyalty at the 0.05 level of statistical significance, and that customer relationship management was related to brand loyalty in the same direction, with a low degree of relationship, at the 0.05 level of statistical significance.

Keywords: Brand loyalty, customer relationship management, Siam Commercial Bank, social media.

5512 Processing Web-Cam Images by a Neuro-Fuzzy Approach for Vehicular Traffic Monitoring

Authors: A. Faro, D. Giordano, C. Spampinato

Abstract:

Traffic management in an urban area is highly facilitated by knowledge of the traffic conditions in every street or highway involved in the vehicular mobility system. The aim of the paper is to propose a neuro-fuzzy approach able to compute the main parameters of a traffic system, i.e., car density, velocity and flow, by using the images collected by web-cams located at the crossroads of the traffic network. The performance of this approach encourages its application when the traffic system is far from saturation. A fuzzy model is also outlined to evaluate when it is suitable to use more accurate, even if more time-consuming, algorithms for measuring traffic conditions near saturation.

Keywords: Neuro-fuzzy networks, computer vision, Fuzzy systems, intelligent transportation system.

5511 A Generic, Functionally Comprehensive Approach to Maintaining an Ontology as a Relational Database

Authors: Jennifer Leopold, Alton Coalter, Leong Lee

Abstract:

An ontology is a data model that represents a set of concepts in a given field and the relationships among those concepts. As the emphasis on achieving the semantic web continues to escalate, ontologies for all types of domains increasingly will be developed. These ontologies may become large and complex, and as their size and complexity grow, so will the need for multi-user interfaces for ontology curation. Herein, a functionally comprehensive, generic approach to maintaining an ontology as a relational database is presented. Unlike many other ontology editors that utilize a database, this approach is entirely domain-generic and fully supports Web-based, collaborative editing, including the designation of different levels of authorization for users.
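
A minimal sketch of the core storage idea, assuming a hypothetical two-table schema (this is not the authors' schema): concepts and typed relationships between them stored as rows in SQLite.

```python
# Minimal sketch: an ontology as relational tables of concepts and
# typed relationships, queried with a simple join.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE concept (
        id    INTEGER PRIMARY KEY,
        label TEXT NOT NULL UNIQUE
    );
    CREATE TABLE relationship (
        subject_id INTEGER REFERENCES concept(id),
        relation   TEXT NOT NULL,              -- e.g. 'is_a', 'part_of'
        object_id  INTEGER REFERENCES concept(id)
    );
""")
conn.executemany("INSERT INTO concept VALUES (?, ?)",
                 [(1, "wing"), (2, "forelimb"), (3, "appendage")])
conn.executemany("INSERT INTO relationship VALUES (?, ?, ?)",
                 [(1, "is_a", 2), (2, "part_of", 3)])

for row in conn.execute("""
        SELECT s.label, r.relation, o.label
        FROM relationship r
        JOIN concept s ON s.id = r.subject_id
        JOIN concept o ON o.id = r.object_id"""):
    print(row)
```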

Keywords: Ontology Editor, Relational Database, Collaborative Curation.
