Search results for: Military data networks
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8657


6527 The Use of Artificial Neural Network in Option Pricing: The Case of S and P 100 Index Options

Authors: Zeynep İltüzer Samur, Gül Tekin Temur

Abstract:

Due to the increasing and varying risks that economic units face, derivative instruments gain substantial importance, and trading volumes of derivatives have reached very significant levels. In parallel with these high trading volumes, researchers have developed many different pricing models, some parametric and some nonparametric. In this study, the aim is to analyse the success of artificial neural networks in pricing options, using S&P 100 index options data. Previous studies generally cover data on European-type call options; this study includes not only European call options but also American call and put options and European put options. Three data sets are used to build three different ANN models. One includes only data that are directly observed from the economic environment, i.e. strike price, spot price, interest rate, maturity and type of contract. The others include an extra input that is not observable data but a parameter, namely volatility. With these detailed data, the performance of the ANN along the put/call, American/European and moneyness dimensions is analyzed, and whether including volatility as an input improves prediction performance is examined. The most striking results revealed by the study are that the ANN performs better when pricing call options than put options, and that using volatility as an input does not improve performance.
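
As a rough illustration of the setup described above, the sketch below fits a feed-forward network to option features (strike, spot, rate, maturity, contract type, and optionally volatility). The data are synthetic placeholders, not the S&P 100 dataset, and scikit-learn's MLPRegressor is used as a generic ANN; it is not the authors' network architecture.

```python
# Minimal sketch: fitting a feed-forward network to option data.
# Feature columns mirror the inputs named in the abstract; the data
# themselves are synthetic placeholders, not the S&P 100 dataset.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(80, 120, n),      # strike
    rng.uniform(80, 120, n),      # spot
    rng.uniform(0.01, 0.05, n),   # risk-free rate
    rng.uniform(0.05, 1.0, n),    # maturity (years)
    rng.integers(0, 2, n),        # contract type: 0 = put, 1 = call
    rng.uniform(0.1, 0.4, n),     # volatility (the optional extra input)
])
y = np.maximum(X[:, 1] - X[:, 0], 0) + rng.normal(0, 1, n)  # stand-in prices

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16),
                                   max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("test R^2:", model.score(X_te, y_te))
```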

Keywords: Option Pricing, Neural Network, S&P 100 Index, American/European options

6526 Totally Integrated Smart Energy System through Data Acquisition via Remote Location

Authors: Muhammad Tahir Qadri, M. Irfan Anis, M. Nawaz Irshad Khan

Abstract:

This paper discusses a real-time approach to controlling an energy management system using the data acquisition tools of LabVIEW. The main idea is to interface a station (PC) with the system and publish the data on the Internet using LabVIEW. In this work, controlling and switching of three-phase AC loads is done effectively and efficiently, and the phases are sensed through dedicated devices. In case of any failure, the attached generator starts functioning automatically. The computer sends commands to the system and the system responds to the requests. A key feature is that the system can be accessed and controlled worldwide over the World Wide Web. This control can be exercised at any time from anywhere to use energy effectively, especially in developing countries where energy management is a major problem. The system uses fully integrated devices operated from a remote location.

Keywords: VI-server, Remote Access, Telemetry, Data Acquisition, web server.

6525 Preservation of Molecular Ozone in a Clathrate Hydrate: Three-Phase (Gas + Liquid + Hydrate) Equilibrium Measurements for O3 + O2 + CO2 + H2O Systems

Authors: Kazutoshi Shishido, Sanehiro Muromachi, Ryo Ohmura

Abstract:

This paper reports three-phase (gas + liquid + hydrate) equilibrium pressure versus temperature data for the (O3 + O2 + CO2 + H2O) system, aimed at developing a hydrate-based technology to preserve ozone, a chemically unstable substance, for various industrial, medical and consumer uses. The data cover the temperature range from 272 K to 277 K, corresponding to pressures from 1.6 MPa to 3.1 MPa, for (O3 + O2)-to-CO2 or O2-to-CO2 molar ratios in the gas phase of approximately 4:6 and 5:5. The mole fraction of ozone in the gas phase was ~0.03, the densest ozone fraction in an artificially formed O3-containing hydrate reported in the literature. Based on these data, the formation of hydrate containing high-concentration ozone, as high as 1 mass%, is expected.

Keywords: Clathrate hydrate, Ozone, Molecule storage, Sterilization.

6524 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data

Authors: N. Borjalilu, P. Rabiei, A. Enjoo

Abstract:

A Flight Data Monitoring (FDM) program assists an operator in the aviation industry to identify, quantify, assess and address operational safety risks in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator; integrated into the operator's Safety Management System (SMS), it allows the operator to detect, confirm, and assess safety issues associated with human error and to check the effectiveness of corrective actions. This article proposes a fuzzy-set-based model for assessing the safety risk level of operational flight data across different event categories, permitting evaluation of the operational safety level from the point of view of flight activities. The main advantage of the method is that it provides a qualitative safety analysis of flight data. The research gathers the opinions of aviation experts through questionnaires related to flight data in four categories of occurrence that can take place during an accident or incident: Runway Excursions (RE), Controlled Flight Into Terrain (CFIT), Mid-Air Collision (MAC), and Loss of Control in Flight (LOC-I). By weighting each category with fuzzy TOPSIS (F-TOPSIS) and applying the weight to the number of risks recorded for the event, the safety risk of each related event can be obtained.
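
For illustration only, the sketch below applies a crisp TOPSIS ranking to the four event categories named in the abstract. The decision matrix and criterion weights are invented, and the authors' fuzzy extension would replace these crisp scores with fuzzy numbers elicited from the expert questionnaires.

```python
# Crisp TOPSIS sketch for ranking the four event categories.
# The decision matrix (rows: RE, CFIT, MAC, LOC-I; columns: expert criteria)
# and the criterion weights are illustrative placeholders.
import numpy as np

X = np.array([[7, 6, 8],    # RE
              [9, 8, 7],    # CFIT
              [8, 9, 9],    # MAC
              [9, 9, 8]],   # LOC-I
             dtype=float)
w = np.array([0.5, 0.3, 0.2])          # criterion weights (sum to 1)

R = X / np.linalg.norm(X, axis=0)      # vector-normalised matrix
V = R * w                              # weighted normalised matrix
ideal, anti = V.max(axis=0), V.min(axis=0)   # "ideal" = highest-severity profile here
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)     # higher = closer to the ideal profile

for name, c in zip(["RE", "CFIT", "MAC", "LOC-I"], closeness):
    print(f"{name}: closeness coefficient = {c:.3f}")
```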

Keywords: F-TOPSIS, fuzzy set, FDM, flight safety.

6523 Time Series Forecasting Using a Hybrid RBF Neural Network and AR Model Based On Binomial Smoothing

Authors: Fengxia Zheng, Shouming Zhong

Abstract:

ANNARIMA, which combines the autoregressive integrated moving average (ARIMA) model and the artificial neural network (ANN) model, is a valuable tool for modeling and forecasting nonlinear time series, yet the over-fitting problem is more likely to occur in neural network models. This paper provides a hybrid methodology that combines a radial basis function (RBF) neural network and an autoregressive (AR) model based on the binomial smoothing (BS) technique, which is efficient in data processing; the resulting model is called BS-RBF-AR. The method is examined using the Canadian lynx data. Empirical results indicate that the over-fitting problem can be eased by using an RBF neural network based on binomial smoothing (BS-RBF), and that the hybrid BS-RBF-AR model can be an effective way to improve the forecasting accuracy achieved by BS-RBF used separately.
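
A minimal sketch of the two ingredients named above: binomial smoothing of a series followed by a one-step-ahead regression on lagged values. Kernel ridge regression with an RBF kernel stands in for the RBF neural network, and the series is synthetic rather than the Canadian lynx data, so this is only an approximation of the BS-RBF-AR idea.

```python
# Sketch: binomial smoothing, then one-step-ahead prediction from lagged
# values with an RBF-kernel regressor (a stand-in for the RBF network).
import numpy as np
from scipy.special import comb
from sklearn.kernel_ridge import KernelRidge

def binomial_smooth(x, order=4):
    """Smooth a series by convolving it with normalised binomial weights."""
    w = np.array([comb(order, k) for k in range(order + 1)], dtype=float)
    w /= w.sum()
    return np.convolve(x, w, mode="same")

rng = np.random.default_rng(1)
t = np.arange(200)
series = np.sin(t / 8.0) + 0.3 * rng.normal(size=t.size)   # placeholder series
smoothed = binomial_smooth(series)

p = 4                                   # AR order used to build lagged features
X = np.column_stack([smoothed[i:len(smoothed) - p + i] for i in range(p)])
y = smoothed[p:]
model = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5).fit(X[:-20], y[:-20])
print("held-out MSE:", np.mean((model.predict(X[-20:]) - y[-20:]) ** 2))
```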

Keywords: Binomial smoothing (BS), hybrid, Canadian Lynx data, forecasting accuracy.

6522 Kinematic Analysis of 2-DOF Planar Robot Using Artificial Neural Network

Authors: Jolly Shah, S. S. Rattan, B. C. Nakra

Abstract:

Automatic control of a robotic manipulator involves the study of kinematics and dynamics as a major issue. This paper addresses the forward and inverse kinematics of a 2-DOF robotic manipulator with revolute joints. The Denavit-Hartenberg (D-H) convention is used to model the robot links and joints, and forward and inverse kinematics solutions are obtained using artificial neural networks. The results show that the ANN-based solution is faster, acceptable, and has zero error.
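
A small sketch of the underlying geometry: analytic forward kinematics of a 2-DOF planar arm with revolute joints, used to generate training pairs for a neural network that approximates the inverse mapping. The link lengths and network settings are assumptions, not the values used by the authors.

```python
# Sketch: forward kinematics of a 2-DOF planar arm, plus an ANN trained to
# learn the inverse mapping (x, y) -> (theta1, theta2). Link lengths assumed.
import numpy as np
from sklearn.neural_network import MLPRegressor

L1, L2 = 1.0, 0.8                       # assumed link lengths

def forward_kinematics(theta1, theta2):
    x = L1 * np.cos(theta1) + L2 * np.cos(theta1 + theta2)
    y = L1 * np.sin(theta1) + L2 * np.sin(theta1 + theta2)
    return x, y

rng = np.random.default_rng(0)
# restrict joint angles to one quadrant so the inverse mapping is single-valued
theta = rng.uniform([0, 0], [np.pi / 2, np.pi / 2], size=(5000, 2))
xy = np.column_stack(forward_kinematics(theta[:, 0], theta[:, 1]))

ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=0)
ann.fit(xy, theta)
test_xy = np.array(forward_kinematics(0.3, 0.7)).reshape(1, -1)
print("predicted joint angles:", ann.predict(test_xy))
```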

Keywords: Artificial Neural Network, Forward Kinematics, Inverse Kinematics, Robotic Manipulator

6521 Unsupervised Texture Classification and Segmentation

Authors: V. P. Subramanyam Rallabandi, S. K. Sett

Abstract:

An unsupervised classification algorithm is derived by modeling observed data as a mixture of several mutually exclusive classes, each described by linear combinations of independent non-Gaussian densities. The algorithm estimates the data density in each class by using parametric nonlinear functions that fit the non-Gaussian structure of the data. This improves classification accuracy compared with standard Gaussian mixture models. When applied to textures, the algorithm can learn basis functions for images that capture the statistically significant structure intrinsic to the images. We apply this technique to the problem of unsupervised texture classification and segmentation.

Keywords: Gaussian Mixture Model, Independent Component Analysis, Segmentation, Unsupervised Classification.

6520 Enhancement of Capacity in a MC-CDMA based Cognitive Radio Network Using Non-Cooperative Game Model

Authors: Kalyani J. Kulkarni, Bharat S. Chaudhari

Abstract:

This paper addresses the issue of resource allocation in the emerging cognitive radio technology. Focusing on the Quality of Service (QoS) of Primary Users (PUs), a novel method is proposed for the resource allocation of Secondary Users (SUs). We propose a utility function in a game-theoretic model of cognitive radio that can be maximized to increase the capacity of the Cognitive Radio Network (CRN) and to minimize interference. The utility function is formulated to cater to the needs of PUs by observing the signal-to-noise ratio. The existence of a Nash equilibrium for the postulated game is established.

Keywords: Cognitive Networks, Game Theory, Nash Equilibrium, Resource Allocation.

6519 Classifying Turbomachinery Blade Mode Shapes Using Artificial Neural Networks

Authors: Ismail Abubakar, Hamid Mehrabi, Reg Morton

Abstract:

Currently, extensive signal analysis is performed in order to evaluate the structural health of turbomachinery blades. This approach is constrained by time and by the availability of qualified personnel, so new approaches to blade dynamics identification that provide faster and more accurate results are sought. Generally, modal analysis is employed to acquire the dynamic properties of a vibrating turbomachinery blade and is widely adopted in condition monitoring of blades. The analysis provides useful information on the different modes of vibration and natural frequencies by exploring the different shapes that can be taken up during vibration, since every mode shape has a corresponding natural frequency. Experimental modal testing and finite element analysis are the traditional methods used to evaluate mode shapes, but they have limited applicability to real-life scenarios and so cannot easily support a robust condition monitoring scheme. Real-time mode shape evaluation requires rapid evaluation and low computational cost, for which traditional techniques are unsuitable. In this study, an artificial neural network is developed to evaluate the mode shape of a lab-scale rotating blade assembly, using results from finite element modal analysis as training data. The network performance evaluation shows that an artificial neural network (ANN) is capable of mapping the correlation between natural frequencies and mode shapes without the need for extensive signal analysis. The approach offers the advantages that the network can classify mode shapes in real time, is simple to implement, and predicts accurately. The work paves the way for further development of a robust condition monitoring system that incorporates real-time mode shape evaluation.
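
As a hedged illustration of the mapping described above, the sketch below trains a small classifier to assign mode-shape labels from frequency-based features. The frequencies, the second feature, and the labels are synthetic placeholders rather than the finite-element results for the blade assembly.

```python
# Sketch: classifying mode shapes from natural-frequency features with an MLP.
# Frequencies, the extra feature, and the labels are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
base = np.array([120.0, 310.0, 540.0])      # nominal frequencies per mode family
X = np.concatenate([base[i] + rng.normal(0, 5, (300, 1)) for i in range(3)])
# second feature (e.g. a damping-ratio-like quantity), purely illustrative
X = np.column_stack([X.ravel(), rng.normal(0.02, 0.005, X.size)])
y = np.repeat(["1st bending", "1st torsion", "2nd bending"], 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```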

Keywords: Modal analysis, artificial neural network, mode shape, natural frequencies, pattern recognition.

6518 Vision-Based Daily Routine Recognition for Healthcare with Transfer Learning

Authors: Bruce X. B. Yu, Yan Liu, Keith C. C. Chan

Abstract:

We propose to record Activities of Daily Living (ADLs) of elderly people using a vision-based system so as to provide better assistive and personalization technologies. Current ADL-related research is based on data collected with the help of non-elderly subjects in laboratory environments, where the activities performed are predetermined for the sole purpose of data collection. To obtain more realistic datasets for the application, we recorded ADLs of the elderly in a real-world environment involving real elderly subjects. Motivated by the need to collect data for more effective research related to elderly care, we chose to collect data in the room of an elderly person. Specifically, we installed a Kinect, a vision-based sensor, on the ceiling to capture the activities that the elderly subject performs every morning. Based on the data, we identified 12 morning activities that the elderly person performs daily. To recognize these activities, we created the HARELCARE framework to investigate the effectiveness of existing Human Activity Recognition (HAR) algorithms and propose the use of a transfer learning algorithm for HAR. We compared performance in terms of accuracy and training progress. Although the collected dataset is relatively small, the proposed algorithm has good potential to be applied to all daily routine activities for healthcare purposes such as evidence-based diagnosis and treatment.

Keywords: Daily activity recognition, healthcare, IoT sensors, transfer learning.

6517 Visualization of Sediment Thickness Variation for Sea Bed Logging using Spline Interpolation

Authors: Hanita Daud, Noorhana Yahya, Vijanth Sagayan, Muizuddin Talib

Abstract:

This paper discusses the use of spline interpolation and mean square error (MSE) as tools to process data acquired from a developed simulator that replicates the sea bed logging environment. Sea bed logging (SBL) is a new technique that uses marine controlled-source electromagnetic (CSEM) sounding and has proven very successful in detecting and characterizing hydrocarbon reservoirs in deep-water areas by using resistivity contrasts. It uses very low frequencies of 0.1 Hz to 10 Hz to obtain greater wavelengths. In this work, the in-house simulator was provided with predefined parameters, and the transmitted frequency was varied for sediment thicknesses of 1000 m to 4000 m in environments with and without hydrocarbon. From a series of simulations, synthetic data were generated. These data were interpolated using the spline interpolation technique (degree three), and the mean square error (MSE) between the original and interpolated data was calculated. Comparisons were made by studying the trends and relationship between frequency and sediment thickness based on the calculated MSE. It was found that the MSE showed an increasing trend in the setup with hydrocarbon present compared to the one without, and a decreasing trend as sediment thickness increased and as transmitted frequency increased.
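
A minimal sketch of the processing step described above: a degree-three (cubic) spline interpolation followed by an MSE computation. The curve is a synthetic placeholder for the simulator output.

```python
# Sketch: cubic (degree-three) spline interpolation of a synthetic response
# curve and the MSE between the interpolated and reference samples.
import numpy as np
from scipy.interpolate import CubicSpline

thickness = np.linspace(1000, 4000, 13)                 # coarse samples (m)
response = np.exp(-thickness / 2500.0)                  # stand-in measurement

spline = CubicSpline(thickness, response)               # degree-three spline
fine = np.linspace(1000, 4000, 301)
interpolated = spline(fine)

# Compare the spline against a denser "reference" version of the same curve.
reference = np.exp(-fine / 2500.0)
mse = np.mean((interpolated - reference) ** 2)
print(f"MSE between reference and interpolated curve: {mse:.3e}")
```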

Keywords: Spline Interpolation, Mean Square Error, Sea Bed Logging, Controlled Source Electromagnetic

6516 The Consumer Private Space: What is and How it can be Approached without Affecting the Consumer's Privacy

Authors: Calin Veghes

Abstract:

The concept of privacy, seen in connection with the consumer's private space and personalization, has recently gained greater importance as a consequence of organizations' increasing marketing efforts based on the capture, processing and usage of consumers' personal data. The paper intends to provide a definition of the consumer's private space based on the types of personal data the consumer is willing to disclose, to assess attitudes toward personalization, and to identify the means preferred by consumers to control their personal data and defend their private space. Several implications generated by the definition of the consumer's private space are identified and weighted from both the consumers' and organizations' perspectives.

Keywords: Consumer private space, personalization, privacy.

6515 Quantifying the Methods of Monitoring Timers in Electric Water Heater for Grid Balancing on Demand Side Management: A Systematic Mapping Review

Authors: Yamamah Abdulrazaq, Lahieb A. Abrahim, Samuel E. Davies, Iain Shewring

Abstract:

An electric water heater (EWH) is a power-hungry appliance used in residential, commercial, and industrial settings, and the ability to control it properly results in cost savings and helps prevent blackouts on the national grid. This article discusses the usage of timers in EWH control strategies for demand-side management (DSM). To the authors' knowledge, no systematic mapping review focusing on the utilization of EWH control strategies in DSM has yet been conducted. Consequently, the purpose of this research is to identify and examine the main papers exploring EWH procedures in DSM by quantifying and categorizing information with regard to publication year and source, kind of method, and source of data for monitoring control techniques. In order to answer the research questions, a total of 31 publications published between 1999 and 2023 were selected according to specific inclusion and exclusion criteria. The data indicate that direct load control (DLC) has been somewhat more prevalent than indirect load control (ILC). Additionally, mixed methods are much less common than the other techniques, and the proportion of real-time data (RTD) to non-real-time data (NRTD) is about equal.

Keywords: Demand side management, direct load control, electric water heater, indirect load control, non-real-time data, real time data.

6514 VaR Forecasting in Times of Increased Volatility

Authors: Ivo Jánský, Milan Rippel

Abstract:

The paper evaluates several hundred one-day-ahead VaR forecasting models over the period 2004 to 2009 on data from six world stock indices: DJI, GSPC, IXIC, FTSE, GDAXI and N225. The models describe the mean using ARMA processes with up to two lags and the variance with one of the GARCH, EGARCH or TARCH processes with up to two lags. The models are estimated on data from the in-sample period, and their forecasting accuracy is evaluated on the out-of-sample data, which are more volatile. The main aim of the paper is to test whether a model estimated on data with lower volatility can be used in periods with higher volatility. The evaluation is based on the conditional coverage test and is performed on each stock index separately. The primary result of the paper is that volatility is best modelled using a GARCH process and that an ARMA pattern cannot be found in the analyzed time series.
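
As an illustration of one member of the model family evaluated above, the sketch below estimates an AR(1)-GARCH(1,1) model and produces a one-day-ahead VaR. It assumes the third-party arch package is available and uses simulated returns rather than the six index series.

```python
# Sketch: one-day-ahead VaR from an AR(1)-GARCH(1,1) model using the
# "arch" package (assumed installed). The return series is simulated.
import numpy as np
from scipy.stats import norm
from arch import arch_model

rng = np.random.default_rng(0)
returns = rng.standard_t(df=6, size=1500)      # placeholder daily returns (in %)

model = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1)
res = model.fit(disp="off")
fcast = res.forecast(horizon=1)

mu = fcast.mean.iloc[-1, 0]
sigma = np.sqrt(fcast.variance.iloc[-1, 0])
var_99 = mu + sigma * norm.ppf(0.01)           # 99% one-day-ahead VaR (a loss if negative)
print(f"1-day 99% VaR: {var_99:.3f}%")
```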

Keywords: VaR, risk analysis, conditional volatility, GARCH, EGARCH, TARCH, moving average process, autoregressive process

6513 A Weighted Sum Technique for the Joint Optimization of Performance and Power Consumption in Data Centers

Authors: Samee Ullah Khan, C. Ardil

Abstract:

With data centers, end-users can realize the pervasiveness of services that will one day be the cornerstone of our lives. However, data centers are often classified as the computing systems that consume the largest amounts of power. To circumvent this problem, we propose a self-adaptive weighted sum methodology that jointly optimizes the performance and power consumption of any given data center. Compared to traditional methodologies for multi-objective optimization problems, the proposed self-adaptive weighted sum technique does not rely on a systematic change of weights during the optimization procedure. The proposed technique is compared with the greedy and LR heuristics for large-scale problems, and with the optimal solution, implemented in LINDO, for small-scale problems. The experimental results reveal that the proposed self-adaptive weighted sum technique outperforms both heuristics and shows competitive performance compared to the optimal solution.
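
A toy sketch of weighted-sum scalarisation for a two-objective (performance versus power) problem. The objective functions and the fixed weight grid are invented for illustration; the paper's self-adaptive weight scheme is not reproduced here.

```python
# Sketch: weighted-sum scalarisation of a two-objective problem.
# The objectives and the weight grid are illustrative placeholders.
import numpy as np
from scipy.optimize import minimize_scalar

def makespan(x):      # stand-in "performance" objective (lower is better)
    return (x - 2.0) ** 2 + 1.0

def power(x):         # stand-in "power consumption" objective
    return 0.5 * x ** 2

def weighted_sum(w):
    """Minimise w*makespan + (1-w)*power over the decision variable x."""
    res = minimize_scalar(lambda x: w * makespan(x) + (1 - w) * power(x),
                          bounds=(0.0, 5.0), method="bounded")
    return res.x, makespan(res.x), power(res.x)

for w in (0.2, 0.5, 0.8):                 # a few trade-off points
    x, f1, f2 = weighted_sum(w)
    print(f"w={w:.1f}: x={x:.3f}, makespan={f1:.3f}, power={f2:.3f}")
```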

Keywords: Meta-heuristics, distributed systems, adaptive methods, resource allocation.

6512 Image-Based (RGB) Technique for Estimating Phosphorus Levels of Crops

Authors: M. M. Ali, Ahmed Al-Ani, Derek Eamus, Daniel K. Y. Tan

Abstract:

In this glasshouse study, we developed a new image-based, non-destructive technique for detecting the leaf P status of different crops such as cotton, tomato and lettuce. The plants were grown in a nutrient solution containing different P concentrations, e.g. 0%, 50% and 100% of the recommended P concentration (P0 = no P; P1 = 2.5 mL 10 L-1 of P; P2 = 5 mL 10 L-1 of P). After 7 weeks of treatment, the plants were harvested and data on leaf P contents were collected using the standard destructive laboratory method; at the same time, leaf images were collected with a handheld crop image sensor. We calculated leaf area, leaf perimeter and RGB (red, green and blue) values from these images. These data were then used in linear discriminant analysis (LDA) to estimate leaf P contents, which successfully classified the plants on the basis of leaf P content. The data indicate that P deficiency in crop plants can be predicted using leaf image and morphological data. Our proposed non-destructive imaging method is precise in estimating the P requirements of different crop species.
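
For illustration, the sketch below runs linear discriminant analysis on image-derived features (mean R, G, B plus a leaf-area-like value) to separate three P treatment levels. The feature table is randomly generated, not the glasshouse data.

```python
# Sketch: LDA on image-derived features to classify P treatment levels.
# The feature table is randomly generated, not the experimental data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class = 60
means = {"P0": [90, 140, 60, 25], "P1": [80, 160, 55, 35], "P2": [70, 180, 50, 45]}
X = np.vstack([rng.normal(m, [8, 10, 6, 4], (n_per_class, 4))
               for m in means.values()])        # columns: R, G, B, leaf area
y = np.repeat(list(means.keys()), n_per_class)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```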

Keywords: Image-based techniques, leaf area, leaf P contents, linear discriminant analysis.

6511 An Improved K-Means Algorithm for Gene Expression Data Clustering

Authors: Billel Kenidra, Mohamed Benmohammed

Abstract:

Data mining techniques used in the field of clustering are a subject of active research and assist in biological pattern recognition and the extraction of new knowledge from raw data. Clustering means the act of partitioning an unlabeled dataset into groups of similar objects. Each group, called a cluster, consists of objects that are similar to one another and dissimilar to objects of other groups. Several clustering methods are based on partitional clustering. This category attempts to directly decompose the dataset into a set of disjoint clusters, leading to an integer number of clusters that optimizes a given criterion function. The criterion function may emphasize a local or a global structure of the data, and its optimization is an iterative relocation procedure. The K-Means algorithm is one of the most widely used partitional clustering techniques. Since K-Means is extremely sensitive to the initial choice of centers, and a poor choice may lead to a local optimum that is far inferior to the global optimum, we propose a strategy to initialize the K-Means centers. The improved K-Means algorithm is compared with the original K-Means, and the results show that efficiency is significantly improved.
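
The sketch below only illustrates why centre initialisation matters, by comparing random initialisation with the standard k-means++ scheme on synthetic data; k-means++ is named here as a familiar baseline and is not the initialisation strategy proposed in the paper.

```python
# Sketch: sensitivity of K-Means to centre initialisation.
# k-means++ is shown only as a well-known baseline, not the paper's strategy.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=600, centers=6, n_features=20, random_state=0)

for init in ("random", "k-means++"):
    km = KMeans(n_clusters=6, init=init, n_init=1, random_state=0).fit(X)
    print(f"{init:10s} inertia (within-cluster SSE): {km.inertia_:.1f}")
```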

Keywords: Microarray data mining, biological pattern recognition, partitional clustering, k-means algorithm, centroid initialization.

6510 The Classification Performance in Parametric and Nonparametric Discriminant Analysis for a Class-Unbalanced Data of Diabetes Risk Groups

Authors: Lily Ingsrisawang, Tasanee Nacharoen

Abstract:

The problems arising from unbalanced data sets generally appear in real-world applications. Due to unequal class distributions, many researchers have found that the performance of existing classifiers tends to be biased towards the majority class. K-nearest neighbors' nonparametric discriminant analysis is a method that has been proposed for classifying unbalanced classes with good performance. In this study, methods of discriminant analysis are of interest for investigating misclassification error rates on class-imbalanced data for three diabetes risk groups. The purpose of this study was to compare the classification performance of parametric and nonparametric discriminant analysis in a three-class classification of class-imbalanced data on diabetes risk groups. Data from a project maintaining healthy conditions for 599 employees of a government hospital in Bangkok were obtained for the classification problem. The employees were divided into three diabetes risk groups: non-risk (90%), risk (5%), and diabetic (5%). The original data, including the variables diabetes risk group, age, gender, blood glucose, and BMI, were analyzed and bootstrapped for 50 and 100 samples of 599 observations each, for additional estimation of the misclassification error rate. Each data set was explored for departure from multivariate normality and for equality of the covariance matrices of the three risk groups. Both the original data and the bootstrap samples showed non-normality and unequal covariance matrices. The parametric linear discriminant function, the quadratic discriminant function, and the nonparametric k-nearest neighbors' discriminant function were applied to the 50 and 100 bootstrap samples and to the original data. In searching for the optimal classification rule, the prior probabilities were set to equal proportions (0.33:0.33:0.33) and to unequal proportions of (0.90:0.05:0.05), (0.80:0.10:0.10) and (0.70:0.15:0.15). The results from the 50 and 100 bootstrap samples indicated that the k-nearest neighbors approach with k=3 or k=4 and prior probabilities of non-risk:risk:diabetic set to 0.90:0.05:0.05 or 0.80:0.10:0.10 gave the smallest misclassification error rate. The k-nearest neighbors approach is therefore suggested for classifying three-class-imbalanced data on diabetes risk groups.
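
A hedged sketch of the comparison described above: LDA and QDA with specified priors versus k-NN on a simulated 90/5/5 class-imbalanced three-group dataset. The simulated features loosely mimic age, blood glucose and BMI and are not the hospital data.

```python
# Sketch: parametric (LDA, QDA with priors) vs nonparametric (k-NN)
# classification on a synthetic 90/5/5 class-imbalanced dataset.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
sizes = [540, 30, 30]                                   # non-risk, risk, diabetic
locs = [[45, 95, 23], [52, 110, 27], [58, 135, 30]]     # age, glucose, BMI means
X = np.vstack([rng.normal(loc, [8, 12, 3], (n, 3)) for loc, n in zip(locs, sizes)])
y = np.repeat([0, 1, 2], sizes)                         # labels in the same order

priors = [0.90, 0.05, 0.05]          # follows sorted class labels 0, 1, 2
models = {
    "LDA": LinearDiscriminantAnalysis(priors=priors),
    "QDA": QuadraticDiscriminantAnalysis(priors=priors),
    "3-NN": KNeighborsClassifier(n_neighbors=3),
}
for name, clf in models.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: error rate = {1 - acc:.3f}")
```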

Keywords: Bootstrap, diabetes risk groups, error rate, k-nearest neighbors.

6509 Localizing and Experiencing Electronic Questionnaires in an Educational Web Site

Authors: Theodore H. Kaskalis

Abstract:

One of the main research methods in humanistic studies is the collection and processing of data through questionnaires. This paper reports our experiences of localizing and adapting the phpESP package of electronic surveys, which led to a friendly online questionnaire environment offered through our department web site. After presenting the characteristics of this environment, we identify the expected benefits and present a questionnaire carried out in both the traditional and the electronic way. We present the respondents' feedback and then report the researchers' opinions. Finally, we propose ideas we intend to implement in order to further assist and enhance research based on this web-accessed, electronic questionnaire environment.

Keywords: Electronic questionnaires, Computer assisted web interviewing, Survey data collection, Survey data visualization.

6508 Design and Implementation of Security Middleware for Data Warehouse Signature Framework

Authors: Mayada AlMeghari

Abstract:

Recently, grid middlewares have provided large-scale integrated use of network resources, such as shared data and CPUs, to form a virtual supercomputer. In this work, we present the design and implementation of the middleware for the Data Warehouse Signature (DWS) framework. The aim of using the middleware in the proposed DWS framework is to achieve high performance through parallel computing. The middleware is developed on the Alchemi.Net framework to increase security among the network nodes through an authentication and group-key distribution model. This model secures the keys and prevents intermediate attacks on the middleware. This paper presents the flow process structures of the middleware design. In addition, the paper describes the security implementation for the DWS middleware, enhanced with the authentication and group-key distribution model. Finally, analysis against other middleware approaches indicates that the developed DWS middleware is the optimal solution, with complete coverage of security issues.

Keywords: Middleware, parallel computing, data warehouse, security, group-key, high performance.

6507 Re-Optimization MVPP Using Common Subexpression for Materialized View Selection

Authors: Boontita Suchyukorn, Raweewan Auepanwiriyakul

Abstract:

A data warehouse is a repository of information integrated from source data. Information stored in a data warehouse is materialized in order to provide better performance for answering queries. Deciding which views are appropriate to materialize is an important problem. To meet this requirement, constructing a search space close to optimal is a necessary task, as it provides an effective basis for selecting views to be materialized. In this paper we propose an approach to re-optimize the Multiple View Processing Plan (MVPP) by using global common subexpressions. Merged queries whose query processing cost is not close to optimal are rewritten. The experiment shows that our approach helps to improve the total query processing cost of the MVPP, and that the sum of query processing cost and materialized view maintenance cost is also reduced after views are selected for materialization.

Keywords: Data Warehouse, materialized views, query rewriting, common subexpressions.

6506 A Fuzzy Approach for Delay Proportion Differentiated Service

Authors: Mehran Garmehi, Yasser Mansouri

Abstract:

Two paradigms have been proposed to provide QoS for Internet applications: Integrated Services (IntServ) and Differentiated Services (DiffServ). IntServ is not appropriate for a large network like the Internet because it is very complex. Therefore, to reduce the complexity of QoS management, DiffServ was introduced to provide QoS within a domain using aggregation of flows and per-class service. In these networks the QoS relationship between classes is constant, which allows low-priority traffic to be affected by high-priority traffic; this is not suitable. In this paper, we propose a fuzzy controller that weakens this coupling between priority classes. Our simulations show that our approach reduces the latency dependency of the low-priority class on the higher-priority ones in an effective manner.

Keywords: QoS, Differentiated Service (DiffServ), Fuzzy Controller, Delay.

6505 Enhanced QoS Mechanisms for IEEE 802.11e Wireless Networks

Authors: Ho-Ting Wu, Min-Hua Yang, Kai-Wei Ke, Lei Yan

Abstract:

The quality-of-service (QoS) support for wireless LANs has been a hot research topic during the past few years. In this paper, two QoS provisioning mechanisms are proposed for the employment in 802.11e EDCA MAC scheme. First, the proposed call admission control mechanism can not only guarantee the QoS for the higher priority existing connections but also provide the minimum reserved bandwidth for traffic flows with lower priority. In addition, the adaptive contention window adjustment mechanism can adjust the maximum and minimum contention window size dynamically according to the existing connection number of each AC. The collision probability as well as the packet delay will thus be reduced effectively. Performance results via simulations have revealed the enhanced QoS property achieved by employing these two mechanisms.

Keywords: 802.11e, admission control, contention window, EDCA

6504 Modeling Methodologies for Optimization and Decision Support on Coastal Transport Information System (Co.Tr.I.S.)

Authors: Vassilios Moussas, Dimos N. Pantazis, Panagiotis Stratakis

Abstract:

The aim of this paper is to present the optimization methodology developed in the frame of a Coastal Transport Information System. The system will be used for the effective design of coastal transportation lines and incorporates subsystems that implement models, tools and techniques that may support the design of improved networks. The role of the optimization and decision subsystem is to provide the user with better and optimal scenarios that best fulfill any constraints, goals or requirements posed. The complexity of the problem and the large number of parameters and objectives involved led to the adoption of an evolutionary method (genetic algorithms). The problem model and the subsystem structure are presented in detail, and the subsystem's support for simulation is also discussed.

Keywords: Coastal transport, modeling, optimization.

6503 Clustering Categorical Data Using the K-Means Algorithm and the Attribute’s Relative Frequency

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data, and each type of data has its own specific clustering algorithms. In this context, two algorithms are commonly used: k-means for clustering numeric datasets and k-modes for categorical datasets. A major problem encountered in data mining applications is clustering the categorical data that are so prevalent in real datasets. One way to achieve clustering on categorical values is to transform the categorical attributes into numeric measures and directly apply the k-means algorithm instead of k-modes. In this paper, we experiment with an approach based on this idea by transforming the categorical values into numeric ones using the relative frequency of each modality in the attribute. The proposed approach is compared with a previous method based on transforming the categorical datasets into binary values, and the scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that our proposed method outperforms the binary method in all cases.
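
The encoding described above is easy to sketch: each categorical value is replaced by the relative frequency of its modality within the attribute, and ordinary k-means is then applied. The toy table and the number of clusters are illustrative.

```python
# Sketch: relative-frequency encoding of categorical attributes, then k-means.
# The toy table and k are illustrative placeholders.
import pandas as pd
from sklearn.cluster import KMeans

df = pd.DataFrame({
    "colour": ["red", "red", "blue", "green", "blue", "red"],
    "shape":  ["round", "square", "round", "round", "square", "round"],
})

# Encode: value -> frequency of that modality within its attribute.
encoded = df.apply(lambda col: col.map(col.value_counts(normalize=True)))
print(encoded)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(encoded)
print("cluster labels:", labels)
```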

Keywords: Clustering, k-means, categorical datasets, pattern recognition, unsupervised learning, knowledge discovery.

6502 ECG Analysis using Nature Inspired Algorithm

Authors: A. Sankara Subramanian, G. Gurusamy, G. Selvakumar, P. Gnanasekar, A. Nagappan

Abstract:

This paper presents an algorithm, based on wavelet decomposition, for feature extraction from the ECG signal and recognition of three types of ventricular arrhythmias using neural networks. A set of Discrete Wavelet Transform (DWT) coefficients that contains the maximum information about the arrhythmias is selected from the wavelet decomposition. A novel clustering algorithm based on a nature-inspired algorithm (Ant Colony Optimization) is then developed for classifying arrhythmia types. The algorithm is applied to ECG recordings from the MIT-BIH arrhythmia and malignant ventricular arrhythmia databases. We applied the Daubechies 4 wavelet in our algorithm. The wavelet decomposition enabled us to perform the task efficiently and produced reliable results.
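
A minimal sketch of the feature-extraction step: a Daubechies-4 wavelet decomposition of an ECG-like signal, with a simple sub-band energy feature vector (the energy summary is an illustrative choice, not necessarily the authors' coefficient selection). The signal is synthetic rather than an MIT-BIH recording.

```python
# Sketch: Daubechies-4 wavelet decomposition of an ECG-like signal and a
# simple sub-band feature vector. The signal is a synthetic placeholder.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 720)
signal = np.sin(2 * np.pi * 1.2 * t) + 0.2 * rng.normal(size=t.size)  # placeholder beat

coeffs = pywt.wavedec(signal, "db4", level=4)     # [cA4, cD4, cD3, cD2, cD1]
features = np.array([np.sqrt(np.mean(c ** 2)) for c in coeffs])  # sub-band RMS energy
print("feature vector (one value per sub-band):", np.round(features, 4))
```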

Keywords: Daubechies 4 Wavelet, ECG, Nature inspired algorithm, Ventricular Arrhythmias, Wavelet Decomposition.

6501 Integrating Agents and Computational Intelligence Techniques in E-learning Environments

Authors: Konstantinos C. Giotopoulos, Christos E. Alexakos, Grigorios N. Beligiannis, Spiridon D. Likothanassis

Abstract:

In this contribution a newly developed e-learning environment is presented, which incorporates intelligent agents and computational intelligence techniques. The new e-learning environment consists of three parts: the e-learning platform front-end, the Student Questioner Reasoning and the Student Model Agent. These parts are distributed geographically on dispersed computer servers, with the main focus on the design and development of these subsystems through the use of new and emerging technologies. The parts are interconnected in an interoperable way, using web services for the integration of the subsystems, in order to enhance the user modelling procedure and achieve the goals of the learning process.

Keywords: E-learning environments, intelligent agents, user modeling, Bayesian Networks, computational intelligence.

6500 A New Direct Updating Method for Undamped Structural Systems

Authors: Yongxin Yuan, Jiashang Jiang

Abstract:

A new numerical method is presented for simultaneously updating mass and stiffness matrices based on incomplete measured modal data. By using the Kronecker product, all the variables that are to be modified can be identified and then updated directly. The optimal approximation mass and stiffness matrices, which satisfy the required eigenvalue equation and orthogonality condition, are found in the Frobenius norm sense. The physical configuration of the analytical model is preserved, and the updated model exactly reproduces the measured modal data. The numerical example indicates that the method is quite accurate and efficient.

Keywords: Finite element model, model updating, modal data, optimal approximation.

6499 Towards an Effective Reputation Assessment Process in Peer-to-Peer Systems

Authors: Farag Azzedin, Ahmad Ridha

Abstract:

The need for reputation assessment is particularly strong in peer-to-peer (P2P) systems because the peers' personal site autonomy is amplified by the inherent technological decentralization of the environment. However, this decentralization makes designing a peer-to-peer based reputation assessment substantially harder in P2P networks than in centralized settings. Existing reputation systems tackle the reputation assessment process in an ad hoc manner, and there is no systematic and coherent way to derive measures and analyze current reputation systems. In this paper, we propose a reputation assessment process and use it to classify the existing reputation systems. Simulation experiments are conducted that focus on the different methods of selecting recommendation sources and retrieving recommendations, since these two phases can contribute significantly to the overall performance due to communication cost and coverage.

Keywords: P2P Systems, Trust, Reputation, Performance.

6498 Positioning a Southern Inclusive Framework Embedded in the Social Model of Disability Theory Contextualized for Guyana

Authors: Lidon Lashley

Abstract:

This paper presents how the social model of disability can be used to reshape inclusive education practices in Guyana. Inclusive education in Guyana is metamorphosing but is still firmly held in the tenets of the medical model of disability, which influences the experiences of children with Special Education Needs and/or Disabilities (SEN/D). An ethnographic approach to data gathering was employed in this study, and qualitative data were gathered from the voices of children with and without SEN/D, as well as their mainstream teachers, to present the interplay of discourses and subjectivities in the situation. The data were analyzed using Adele Clarke's situational analysis. The data suggest that it is possible, but will be challenging, to fully contextualize and adopt Loreman's synthesis and Booth and Ainscow's Index in the two mainstream schools studied. In addition, the data paved the way for the presentation of the 'Southern Inclusive Education Framework for Guyana' and its support tool, 'The Inclusive Checker created for Southern mainstream primary classrooms'.

Keywords: Social Model of Disability, Medical Model of Disability, subjectivities, metamorphosis, special education needs, postcolonial Guyana, Quasi-inclusion practices, Guyanese cultural challenges, mainstream primary schools, Loreman's Synthesis, Booth and Ainscow's Index.
