Search results for: Data mining techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9246

8166 Comparative Optical Analysis of Offset Reflector Antenna in GRASP

Authors: Ghulam Ahmad

Abstract:

In this paper, a comparison of reflector antenna analysis techniques based on the wave and ray natures of optics is presented for an offset reflector antenna using GRASP (General Reflector antenna Analysis Software Package). The results obtained using PO (Physical Optics), PTD (Physical Theory of Diffraction), and GTD (Geometrical Theory of Diffraction) are compared. The validity of the PO and GTD techniques in regions around the antenna, the caustic behavior of GTD in the main beam, and the deviation of GTD for the near-in sidelobes of the radiation pattern are discussed. The comparison of far-out sidelobes predicted by PO, PO + PTD, and GTD is described. The effect of direct radiation from the feed, which drives feed selection for the system, is also addressed.

Keywords: Geometrical optics & geometrical theory of diffraction, offset reflector antenna, physical optics & physical theory of diffraction, PO & GO comparison.

8165 Comparing Autoregressive Moving Average (ARMA) Coefficients Determination using Artificial Neural Networks with Other Techniques

Authors: Abiodun M. Aibinu, Momoh J. E. Salami, Amir A. Shafie, Athaur Rahman Najeeb

Abstract:

Autoregressive Moving Average (ARMA) is a parametric method of signal representation. It is suitable for problems in which the signal can be modeled by explicit known source functions with a few adjustable parameters. Various methods have been suggested for determining the coefficients, among which are Prony, Pade, autocorrelation, covariance and, most recently, the Artificial Neural Network technique. In this paper, the Artificial Neural Network (ANN) technique is compared with some known and widely accepted techniques. The comparison is based entirely on the values of the coefficients obtained. The results show that ANN also gives accurate estimates of the coefficients of an ARMA system.
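
The abstract does not reproduce any of the estimation procedures it compares, so the sketch below illustrates only the classical autocorrelation (Yule-Walker) route for the AR part of the model, in plain NumPy; the generated AR(2) signal, its coefficients, and the function names are illustrative assumptions, not material from the paper.

```python
import numpy as np

def yule_walker_ar(x, order):
    """Estimate AR coefficients via the autocorrelation (Yule-Walker) method."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])  # Toeplitz matrix
    a = np.linalg.solve(R, r[1:order + 1])          # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:order + 1])       # driving-noise variance
    return a, sigma2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_a = [0.75, -0.5]                            # x[t] = 0.75 x[t-1] - 0.5 x[t-2] + e[t]
    x = np.zeros(5000)
    e = rng.standard_normal(5000)
    for t in range(2, 5000):
        x[t] = true_a[0] * x[t - 1] + true_a[1] * x[t - 2] + e[t]
    a_hat, s2 = yule_walker_ar(x, order=2)
    print("estimated AR coefficients:", a_hat)
```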

Keywords: Autoregressive moving average, coefficients, back propagation, model parameters, neural network, weight.

8164 The New Method of Concealed Data Aggregation in Wireless Sensor: A Case Study

Authors: M. Abbasi Dezfouli, S. Mazraeh, M. H. Yektaie

Abstract:

A wireless sensor network (WSN) consists of many sensor nodes placed in unattended environments, such as military sites, in order to collect important information. It is therefore important to implement a secure protocol that prevents the forwarding of forged data and the modification of aggregated data, while keeping the delay and the communication, computation and storage overheads low. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key to send and receive concealed data among themselves. Because data aggregation within each cell is local and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot inject false data into the network. To evaluate the performance of the proposed protocol, we present computational models that show its performance and low overhead.

Keywords: Wireless Sensor Networks, Security, Concealed Data Aggregation.

8163 Peakwise Smoothing of Data Models using Wavelets

Authors: D Sudheer Reddy, N Gopal Reddy, P V Radhadevi, J Saibaba, Geeta Varadan

Abstract:

Smoothing or filtering of data is the first preprocessing step for noise suppression in many data analysis applications. The moving average is the most popular smoothing method; its generalization led to the development of the Savitzky-Golay filter. Many window-based smoothing methods have been developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and the Kaiser windows. Approximating the data by polynomial regression, Fourier expansion or wavelet expansion also yields smoothed data, and wavelets smooth the data to a great extent by thresholding the wavelet coefficients. Almost all smoothing methods destroy the peaks and flatten them as the support of the window is increased. In certain applications it is desirable to retain the peaks while smoothing the data as much as possible. In this paper we present a methodology called peak-wise smoothing that smooths the data to any desired level without losing the major peak features.
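
The abstract describes peak-wise smoothing only at a high level, so the following is a minimal NumPy sketch of the general idea: smooth with a moving average, then exempt the neighbourhoods of strong local maxima from smoothing. The window size, peak threshold and guard width are assumed parameters; this is not the authors' wavelet-based algorithm.

```python
import numpy as np

def moving_average(x, window):
    """Centered moving-average smoothing with a rectangular window."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def peak_preserving_smooth(x, window=11, peak_quantile=0.95, guard=3):
    """Smooth x but keep samples around strong local maxima untouched.

    Illustrative only: it exempts peak neighbourhoods from smoothing rather
    than implementing the paper's peak-wise method.
    """
    y = moving_average(x, window)
    threshold = np.quantile(x, peak_quantile)
    # indices of local maxima above the amplitude threshold
    peaks = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]) & (x[1:-1] > threshold))[0] + 1
    for p in peaks:
        lo, hi = max(0, p - guard), min(len(x), p + guard + 1)
        y[lo:hi] = x[lo:hi]          # restore the original peak samples
    return y

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 500)
    clean = np.exp(-((t - 0.3) / 0.01) ** 2) + 0.5 * np.sin(2 * np.pi * 3 * t)
    noisy = clean + 0.05 * rng.standard_normal(t.size)
    smoothed = peak_preserving_smooth(noisy)
    print("peak height kept:", round(noisy.max(), 3), "->", round(smoothed.max(), 3))
```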

Keywords: smoothing, moving average, peakwise smoothing, spatial density models, planar shape models, wavelets.

8162 A New Precautionary Method for Measurement and Improvement the Data Quality

Authors: Seyed Mohammad Hossein Moossavizadeh, Mehran Mohsenzadeh, Nasrin Arshadi

Abstract:

Data quality is a complex and unstructured concept that concerns information systems managers. The reason for this attention is the high cost of maintaining and cleaning inefficient data. Beyond these costs, low-quality data cause wrong statistics, analyses and decisions in organizations. Managers therefore aim to improve the quality of their information systems' data. One of the basic elements of quality improvement is measuring its extent. In this paper, we present a precautionary method whose application gives the data of information systems a better quality. Our method covers different dimensions of data quality and therefore has the necessary integrity. The presented method has been tested on three dimensions, namely accuracy, value-added and believability, and the results confirm the improvement and integrity achieved by this method.

Keywords: Data quality, precaution, information system, measurement, improvement.

8161 Performance Enhancement of Cellular OFDM Based Wireless LANs by Exploiting Spatial Diversity Techniques

Authors: S. Ali. Tajer, Babak H. Khalaj

Abstract:

This paper investigates how the use of multiple transmit antennas by OFDM-based wireless LAN subscribers can reduce the physical-layer error rate. By comparing wireless LANs that utilize spatial diversity techniques with conventional ones, it shows how PHY and TCP throughput behaviors are improved. It then assesses the same issues in a cellular context, which is introduced as a novel solution that, besides a multi-cell operation scenario, also benefits from spatio-temporal signaling schemes. The presented simulations shed light on the improved performance of the wide-range, high-quality wireless LAN services provided by the proposed approach.

Keywords: Multiple Input Multiple Output (MIMO), Orthogonal Frequency Division Multiplexing (OFDM), and Wireless Local Area Network (WLAN).

8160 Application of a Systemic Soft Domain-Driven Design Framework

Authors: Mohammed Salahat, Steve Wade, Izhar Ul-Haq

Abstract:

This paper proposes a "soft systems" approach to domain-driven design of computer-based information systems. We propose a systemic framework combining techniques from Soft Systems Methodology (SSM), the Unified Modelling Language (UML), and an implementation pattern known as "Naked Objects". We have used this framework in action research projects that have involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within the proposed framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to generate a ubiquitous language (soft language) which can be used as the basis for developing an object-oriented domain model. The domain model is further developed using techniques based on the UML and is implemented in software following the "Naked Objects" implementation pattern. We argue that there are advantages to combining and using techniques from different methodologies in this way. The proposed systemic framework is overviewed and justified as a multimethodology using Mingers' multimethodology ideas. This multimethodology approach is being evaluated through a series of action research projects based on real-world case studies. A peer-tutoring case study is presented here as a sample of the framework evaluation process.

Keywords: SSM, UML, Domain-Driven Design, Soft Domain-Driven Design, Naked Objects, Soft Language.

8159 Visualization and Indexing of Spectral Databases

Authors: Tibor Kulcsar, Gabor Sarossy, Gabor Bereznai, Robert Auer, Janos Abonyi

Abstract:

On-line (near-infrared) spectroscopy is widely used to support the operation of complex process systems. Information extracted from a spectral database can be used to estimate unmeasured product properties and to monitor the operation of the process. These techniques are based on looking for similar spectra using nearest-neighbor algorithms and distance-based searching methods. Searching for nearest neighbors in the spectral space is computationally expensive: the cost increases with the number of points in the discrete spectrum and the number of samples in the database. To reduce the calculation time, some kind of indexing can be used. The main idea presented in this paper is to combine indexing and visualization techniques to reduce the computational requirements of estimation algorithms by providing a two-dimensional index that can also be used to visualize the structure of the spectral database. This 2D visualization of the spectral database not only supports the application of distance- and similarity-based techniques but also enables the use of advanced clustering and prediction algorithms based on the Delaunay tessellation of the mapped spectral space. This means the prediction does not have to operate in the high-dimensional space but can be based on the mapped space instead. The results illustrate that the proposed method is able to segment (cluster) spectral databases and detect outliers that are not suitable for instance-based learning algorithms.
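
As a rough illustration of the two-dimensional indexing idea (not the authors' mapping, which is visualization-based and uses Delaunay tessellation), the sketch below projects spectra onto two principal components and builds a KD-tree over the mapped points for fast nearest-neighbour queries; the data and dimensions are synthetic assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def fit_pca_2d(X):
    """Fit a 2-component PCA: mean and first two right singular vectors."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:2]

def to_2d(X, mean, components):
    """Map spectra (rows of X) into the 2-D index space."""
    return (X - mean) @ components.T

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    spectra = rng.random((1000, 200))            # 1000 stored spectra, 200 wavelength channels
    mean, comps = fit_pca_2d(spectra)
    index = cKDTree(to_2d(spectra, mean, comps)) # 2-D spatial index over the database
    query = rng.random(200)                      # a newly measured spectrum
    dist, idx = index.query(to_2d(query[None, :], mean, comps)[0], k=5)
    print("indices of the 5 most similar stored spectra:", idx)
```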

Keywords: indexing high-dimensional databases, dimensionality reduction, clustering, similarity, k-NN algorithm.

8158 Holistic Simulation-Based Impact Analysis Framework for Sustainable Manufacturing

Authors: Mijoh A. Gbededo, Kapila Liyanage, Sabuj Mallik

Abstract:

The emerging approaches to sustainable manufacturing are considered to be solution-oriented, with the aim of addressing the environmental, economic and social issues holistically. However, the analysis of the interdependencies amongst the three sustainability dimensions has not been fully captured in the literature. In a recent review of approaches to sustainable manufacturing, two categories of techniques are identified: 1) Sustainable Product Development (SPD), and 2) Sustainability Performance Assessment (SPA) techniques. The challenges of these approaches relate not only to arguments and misconceptions about the relationships between the techniques and sustainable development, but also to their inability to capture and integrate the three sustainability dimensions. This requires a clear definition of some of the approaches and a road-map to the development of a holistic approach that supports sustainability decision-making. In this context, eco-innovation, social impact assessment, and life cycle sustainability analysis play an important role. This paper deployed an integrative approach that enabled the amalgamation of sustainable manufacturing approaches and the theories of reciprocity and motivation into a holistic simulation-based impact analysis framework. The findings of this research have the potential to guide sustainability analysts in capturing the aspects of the three sustainability dimensions in an analytical model. Additionally, the presented findings can aid the construction of a holistic simulation model of sustainable manufacturing and support effective decision-making.

Keywords: Life cycle sustainability analysis, sustainable manufacturing, sustainability performance assessment, sustainable product development.

8157 Weigh-in-Motion Data Analysis Software for Developing Traffic Data for Mechanistic Empirical Pavement Design

Authors: M. A. Hasan, M. R. Islam, R. A. Tarefder

Abstract:

Currently, there are few user-friendly Weigh-in-Motion (WIM) data analysis software packages available that can produce traffic input data for the recently developed AASHTOWare pavement Mechanistic-Empirical (ME) design software. Moreover, these packages have only rudimentary Quality Control (QC) processes and therefore cannot properly deal with erroneous WIM data. As pavement performance is highly sensitive to the quality of WIM data, it is highly recommended to apply a more refined QC process to raw WIM data to obtain good results. This study develops user-friendly software that can produce traffic input for the ME design software. The software takes the raw data (class and weight data) collected from the WIM station and processes it with a sophisticated QC procedure. Traffic data such as traffic volume, traffic distribution and axle load spectra can be obtained from this software and used directly in the ME design software.
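
A minimal pandas sketch of the kind of processing described, assuming hypothetical column names and a simple range-based QC rule (the paper's actual QC procedure is more sophisticated and is not specified in the abstract): implausible axle records are dropped and an axle load spectrum is binned per vehicle class.

```python
import pandas as pd

# Hypothetical WIM records: vehicle class and single-axle weights in kips.
raw = pd.DataFrame({
    "vehicle_class": [9, 9, 5, 9, 12, 9, 5],
    "axle_weight_kip": [12.4, 80.0, 6.1, 15.2, -3.0, 18.7, 7.9],
})

# Simple QC rule: single-axle weights outside a plausible range are treated as errors.
valid = raw[(raw["axle_weight_kip"] > 1.0) & (raw["axle_weight_kip"] < 50.0)].copy()

# Axle load spectrum: percentage of valid axles falling in 2-kip bins, per vehicle class.
bins = list(range(0, 52, 2))
valid["load_bin"] = pd.cut(valid["axle_weight_kip"], bins=bins, right=False)
counts = valid.groupby(["vehicle_class", "load_bin"], observed=True).size()
spectrum = 100.0 * counts / counts.groupby(level=0).transform("sum")
print(spectrum)
```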

Keywords: Weigh-in-motion, software, axle load spectra, traffic distribution, AASHTOWare.

8156 Protein Secondary Structure Prediction Using Parallelized Rule Induction from Coverings

Authors: Leong Lee, Cyriac Kandoth, Jennifer L. Leopold, Ronald L. Frank

Abstract:

Protein 3D structure prediction has always been an important research area in bioinformatics. In particular, the prediction of secondary structure has been a well-studied research topic. Despite the recent breakthrough of combining multiple sequence alignment information and artificial intelligence algorithms to predict protein secondary structure, the Q3 accuracy of various computational prediction algorithms has rarely exceeded 75%. In a previous paper [1], this research team presented a rule-based method called RT-RICO (Relaxed Threshold Rule Induction from Coverings) to predict protein secondary structure. The average Q3 accuracy on the sample datasets using RT-RICO was 80.3%, an improvement over comparable computational methods. Although this demonstrated that RT-RICO might be a promising approach for predicting secondary structure, the algorithm's computational complexity and program running time limited its use. Herein, a parallelized implementation of a slightly modified RT-RICO approach is presented. This new version of the algorithm facilitated the testing of a much larger dataset of 396 protein domains [2]. Parallelized RT-RICO achieved a Q3 score of 74.6%, which is higher than the consensus prediction accuracy of 72.9% that was achieved for the same test dataset by a combination of four secondary structure prediction methods [2].

Keywords: data mining, protein secondary structure prediction, parallelization.

8155 Review of Surface Electromyogram Signals: Its Analysis and Applications

Authors: Anjana Goen, D. C. Tiwari

Abstract:

Electromyography (EMG) is the study of muscle function through the analysis of the electrical activity produced by muscles. This electrical activity, which is displayed in the form of a signal, is the result of neuromuscular activation associated with muscle contraction. The most common techniques for EMG signal recording use surface and needle/wire electrodes, where the latter is usually used when deep muscles are of interest. This paper focuses on the surface electromyogram (SEMG) signal. During SEMG recording, several problems have to be countered, such as noise, motion artifacts and signal instability; thus, various signal processing techniques have been implemented to produce a reliable signal for analysis. The SEMG signal finds broad application, particularly in the biomedical field. It has been analyzed and studied for various interests such as neuromuscular disease, enhancement of muscular function and human-computer interfaces.

Keywords: Evolvable hardware (EHW), Functional Electrical Simulation (FES), Hidden Markov Model (HMM), Hjorth Time Domain (HTD).

8154 Human Growth Curve Estimation through a Combination of Longitudinal and Cross-sectional Data

Authors: Sedigheh Mirzaei S., Debasis Sengupta

Abstract:

Parametric models have been quite popular for studying human growth, particularly in relation to biological parameters such as peak size velocity and age at peak size velocity. Longitudinal data are generally considered vital for fitting a parametric model to individual-specific data and for studying the distribution of these biological parameters in a human population. However, cross-sectional data are easier to obtain than longitudinal data. In this paper, we present a method of combining longitudinal and cross-sectional data for the purpose of estimating the distribution of the biological parameters. We demonstrate, through simulations in the special case of the Preece-Baines model, how estimates based on longitudinal data can be improved by harnessing the information contained in cross-sectional data. We study the extent of improvement for different mixes of the two types of data, and finally illustrate the use of the method through data collected by the Indian Statistical Institute.
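
Since the simulations are built around the Preece-Baines model, a small sketch of fitting Preece-Baines model 1 to a single longitudinal height record may help; the parameter values, noise level and use of scipy.optimize.curve_fit are illustrative assumptions and do not reproduce the paper's combination of longitudinal and cross-sectional data.

```python
import numpy as np
from scipy.optimize import curve_fit

def preece_baines1(t, h1, h_theta, s0, s1, theta):
    """Preece-Baines model 1: adult size h1, size h_theta at age theta, rates s0 and s1."""
    return h1 - 2.0 * (h1 - h_theta) / (np.exp(s0 * (t - theta)) + np.exp(s1 * (t - theta)))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    ages = np.linspace(2, 18, 40)
    true_curve = preece_baines1(ages, 175.0, 163.0, 0.1, 1.2, 14.0)   # illustrative parameters
    heights = true_curve + rng.normal(0.0, 0.8, ages.size)            # one noisy longitudinal record
    p0 = [170.0, 160.0, 0.1, 1.0, 13.0]                               # rough initial guess
    params, _ = curve_fit(preece_baines1, ages, heights, p0=p0, maxfev=20000)
    print("estimated (h1, h_theta, s0, s1, theta):", np.round(params, 2))
```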

Keywords: Preece-Baines growth model, MCMC method, Mixed effect model.

8153 Semantic Support for Hypothesis-Based Research from Smart Environment Monitoring and Analysis Technologies

Authors: T. S. Myers, J. Trevathan

Abstract:

Improvements in the data fusion and data analysis phases of research are imperative due to the exponential growth of sensed data. Currently, there are developments in the Semantic Sensor Web community to explore efficient methods for the reuse, correlation and integration of web-based data sets and live data streams. This paper describes the integration of remotely sensed data with web-available static data for use in observational hypothesis testing and the analysis phase of research. The Semantic Reef system combines semantic technologies (e.g., well-defined ontologies and logic systems) with scientific workflows to enable hypothesis-based research. A framework is presented for how the data fusion concepts from the Semantic Reef architecture map to the Smart Environment Monitoring and Analysis Technologies (SEMAT) intelligent sensor network initiative. The data collected via SEMAT and the knowledge inferred by the Semantic Reef system are ingested into the Tropical Data Hub for data discovery, reuse, curation and publication.

Keywords: Information architecture, Semantic technologies, Sensor networks, Ontologies.

8152 A Proposed Program for Postgraduates in Egypt to Acquire the Skills and Techniques for Producing Concept Cartoons for Kindergarten Children

Authors: Ahmed Amin Mousa, M. Abd El Salam

Abstract:

The current study presents a proposed program for acquiring the skills and techniques needed to produce concept cartoons. The proposed program has been prepared for non-specialist students who have never used graphics or animation software. It was presented to postgraduates in the Faculty of Education for Early Childhood, Cairo University, during the spring term of the 2014-2015 academic year. The program covers three different aspects: drawing and image editing, sound manipulation, and creating animation. In addition, the researchers prepared a questionnaire for measuring the quality of the concept cartoons produced by the students. The questionnaire was used as a pre-test and post-test, and at the end of the study, a significant difference was found in favour of the post-test results.

Keywords: Cartoon, concept cartoon, kindergarten, animation.

8151 Data Migration between Document-Oriented and Relational Databases

Authors: Bogdan Walek, Cyril Klimes

Abstract:

Current tools for data migration between document-oriented and relational databases have several disadvantages. We propose a new approach for data migration between document-oriented and relational databases. During data migration, the relational schema of the target (relational) database is automatically created from a collection of XML documents. The proposed approach is verified on data migration between the document-oriented database IBM Lotus Notes/Domino and a relational database implemented in the relational database management system (RDBMS) MySQL.
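
A minimal sketch of the core idea, deriving a relational schema from a collection of XML documents and loading the documents as rows, under the assumption of flat documents; sqlite3 stands in for MySQL and the sample documents are invented, so this is not the authors' tool.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical exported documents; in the paper the source is IBM Lotus Notes/Domino
# and the target is MySQL. sqlite3 is used here to keep the sketch self-contained.
docs = [
    "<doc><name>Alice</name><email>alice@example.com</email></doc>",
    "<doc><name>Bob</name><email>bob@example.com</email><phone>555-0100</phone></doc>",
]

# Derive the relational schema: the union of element names over all documents.
columns = []
for d in docs:
    for child in ET.fromstring(d):
        if child.tag not in columns:
            columns.append(child.tag)

conn = sqlite3.connect(":memory:")
conn.execute(f"CREATE TABLE migrated ({', '.join(c + ' TEXT' for c in columns)})")

# Load each document as one row; elements missing from a document become NULLs.
for d in docs:
    values = {child.tag: child.text for child in ET.fromstring(d)}
    row = [values.get(c) for c in columns]
    conn.execute(f"INSERT INTO migrated VALUES ({', '.join('?' * len(columns))})", row)

print(conn.execute("SELECT * FROM migrated").fetchall())
```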

Keywords: data migration, database, document-oriented database, XML, relational schema.

8150 Identity Verification Using k-NN Classifiers and Autistic Genetic Data

Authors: Fuad M. Alkoot

Abstract:

DNA data have been used in forensics for decades. However, current research looks at using DNA as a biometric identity verification modality, with the goal of improving the speed of identification. We aim to use gene data that were initially collected for autism detection to find whether, and how accurately, these data can be used for identification applications. Our main goal is to determine whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1, and the classification rate remains close to optimal at noise standard deviations up to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
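
A small NumPy sketch of the experimental setup described, assuming synthetic feature vectors in place of the actual genetic data: each subject's template is enrolled, probes are formed by adding zero-mean Gaussian noise of varying standard deviation, and a plain 1-NN classifier performs the identification.

```python
import numpy as np
from scipy.spatial.distance import cdist

def knn_predict(train_X, train_y, test_X, k=1):
    """Plain k-NN classifier: majority vote among the k closest training vectors."""
    d = cdist(test_X, train_X)                      # Euclidean distances
    nearest = np.argsort(d, axis=1)[:, :k]
    votes = train_y[nearest]
    return np.array([np.bincount(v).argmax() for v in votes])

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    n_subjects, n_features = 50, 30
    templates = rng.random((n_subjects, n_features))   # one synthetic feature vector per subject
    labels = np.arange(n_subjects)
    # Probe samples: the enrolled templates corrupted by zero-mean Gaussian noise.
    for sigma in (0.5, 1.0, 3.0):
        probes = templates + rng.normal(0.0, sigma, templates.shape)
        acc = (knn_predict(templates, labels, probes, k=1) == labels).mean()
        print(f"noise sigma={sigma}: identification rate={acc:.2f}")
```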

Keywords: Biometrics, identity verification, genetic data, k-nearest neighbor.

8149 Flexible Arm Manipulator Control for Industrial Tasks

Authors: Mircea Ivanescu, Nirvana Popescu, Decebal Popescu, Dorin Popescu

Abstract:

This paper addresses the control problem of a class of hyper-redundant arms. In order to avoid discrepancy between the mathematical model and the actual dynamics, a dynamic model with uncertain parameters is derived for this class of manipulators. A procedure for designing a feedback controller that stabilizes the uncertain system is proposed. A PD boundary control algorithm is used to drive the manipulator to the desired position. This controller is easy to implement from the point of view of measurement techniques and actuation. Numerical simulations verify the effectiveness of the presented methods. In order to verify the suitability of the control algorithm, a platform with a 3D flexible manipulator has been employed for testing, and experimental tests on this platform illustrate the application of the techniques developed in the paper.

Keywords: Distributed model, flexible manipulator, observer, robot control.

8148 Design and Control of PEM Fuel Cell Diffused Aeration System using Artificial Intelligence Techniques

Authors: Doaa M. Atia, Faten H. Fahmy, Ninet M. Ahmed, Hassen T. Dorrah

Abstract:

Fuel cells have become one of the major areas of research in academia and industry. The goal of most fish farmers is to maximize production and profits while holding labor and management efforts to a minimum. Given the risk of fish kills, disease outbreaks and poor water quality in most pond culture operations, aeration offers the most immediate and practical solution to the water quality problems encountered at higher stocking and feeding rates. Many units of an aeration system are electrical, so a continuous, highly reliable, affordable and environmentally friendly power source is necessary. Aerating water using PEM fuel cell power is not only a new application of renewable energy; it also provides an affordable method to promote biodiversity in stagnant ponds and lakes. This paper presents a new design and control scheme for a PEM fuel cell-powered diffused-air aeration system for a shrimp farm in Mersa Matruh, Egypt. Artificial intelligence (AI) techniques are used to control the fuel cell output power by controlling the input gas flow rates. Moreover, the mathematical modeling and simulation of the PEM fuel cell is introduced. A comparative study is carried out between the performance of fuzzy logic control (FLC) and neural network control (NNC). The results show the effectiveness of NNC over FLC.

Keywords: PEM fuel cell, Diffused aeration system, Artificial intelligence (AI) techniques, neural network control, fuzzy logic control.

8147 Seed-Based Region Growing (SBRG) vs Adaptive Network-Based Inference System (ANFIS) vs Fuzzy c-Means (FCM): Brain Abnormalities Segmentation

Authors: Shafaf Ibrahim, Noor Elaiza Abdul Khalid, Mazani Manaf

Abstract:

Segmentation of Magnetic Resonance Imaging (MRI) images is one of the most challenging problems in medical imaging. This paper compares the performance of Seed-Based Region Growing (SBRG), Adaptive Network-Based Fuzzy Inference System (ANFIS) and Fuzzy c-Means (FCM) in brain abnormality segmentation. Controlled experimental data are used, designed in such a way that the sizes of the abnormalities are known a priori. This is done by cutting abnormalities of various sizes and pasting them onto normal brain tissue. The normal tissues, or backgrounds, are divided into three different categories, and the segmentation is performed on fifty-seven images of each category. The known sizes of the abnormalities, in numbers of pixels, are then compared with the segmentation results of the three techniques. It is shown that ANFIS returns the best segmentation performance for light abnormalities, whereas SBRG performs well in dark abnormality segmentation.
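
Of the three compared techniques, Fuzzy c-Means is the most compact to sketch; the following NumPy implementation of the standard FCM updates, applied to a synthetic 1-D intensity distribution with a brighter "abnormal" cluster, is illustrative only and does not reproduce the paper's MRI experiments.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Basic Fuzzy c-Means: alternate membership and centroid updates."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), n_clusters))
    u /= u.sum(axis=1, keepdims=True)               # fuzzy memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)           # normalised inverse-distance memberships
    return u, centers

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic 1-D intensities: dark background plus a brighter abnormal region.
    background = rng.normal(60, 5, (400, 1))
    abnormality = rng.normal(160, 5, (50, 1))
    intensities = np.vstack([background, abnormality])
    memberships, centers = fuzzy_c_means(intensities, n_clusters=2)
    labels = memberships.argmax(axis=1)
    bright = centers.ravel().argmax()               # cluster with the higher intensity centre
    print("cluster centres:", np.round(centers.ravel(), 1))
    print("pixels assigned to the bright (abnormal) cluster:", int((labels == bright).sum()))
```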

Keywords: Seed-Based Region Growing (SBRG), Adaptive Network-Based Fuzzy Inference System (ANFIS), Fuzzy c-Means (FCM), Brain segmentation.

8146 Sensitivity Analysis for Direction of Arrival Estimation Using Capon and Music Algorithms in Mobile Radio Environment

Authors: Mustafa Abdalla, Khaled A. Madi, Rajab Farhat

Abstract:

An array antenna system with innovative signal processing can improve the resolution of direction-of-arrival (DoA) estimation. High-resolution techniques take advantage of array antenna structures to better process the incoming waves, and they are also able to identify the directions of multiple targets. This paper investigates the performance of the DoA estimation algorithms Capon and MUSIC on a uniform linear array (ULA). The simulation results show that, for both Capon and MUSIC, the resolution of the DoA estimates improves as the number of snapshots, the number of array elements, the signal-to-noise ratio and the separation angle θ between the two sources increase.
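
A compact NumPy sketch of MUSIC on a half-wavelength-spaced ULA, assuming two uncorrelated sources and illustrative values for the number of elements, snapshots and noise level: the sample covariance is eigendecomposed, the noise subspace is kept, and the pseudospectrum is scanned for peaks.

```python
import numpy as np

def steering(theta_deg, n_elements, spacing=0.5):
    """ULA steering vector; spacing is in wavelengths (0.5 = half-wavelength)."""
    theta = np.deg2rad(theta_deg)
    return np.exp(-2j * np.pi * spacing * np.arange(n_elements) * np.sin(theta))

def music_spectrum(snapshots, n_sources, scan_deg):
    """MUSIC pseudospectrum from array snapshots (n_elements x n_snapshots)."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigval, eigvec = np.linalg.eigh(R)                        # eigenvalues in ascending order
    En = eigvec[:, : snapshots.shape[0] - n_sources]          # noise subspace
    return np.array([1.0 / np.real(steering(a, snapshots.shape[0]).conj() @ En
                                   @ En.conj().T @ steering(a, snapshots.shape[0]))
                     for a in scan_deg])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_el, n_snap, true_doas = 8, 200, (-20.0, 25.0)
    A = np.column_stack([steering(d, n_el) for d in true_doas])
    S = rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))
    noise = 0.1 * (rng.standard_normal((n_el, n_snap)) + 1j * rng.standard_normal((n_el, n_snap)))
    X = A @ S + noise
    grid = np.arange(-90.0, 90.1, 0.5)
    spectrum = music_spectrum(X, n_sources=2, scan_deg=grid)
    # crude peak picking: the two largest local maxima of the pseudospectrum
    is_peak = (spectrum[1:-1] > spectrum[:-2]) & (spectrum[1:-1] > spectrum[2:])
    peak_idx = np.where(is_peak)[0] + 1
    top = peak_idx[np.argsort(spectrum[peak_idx])[-2:]]
    print("estimated DoAs (deg):", sorted(grid[top]))
```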

Keywords: Antenna array, Capon, MUSIC, Direction-of-arrival estimation, signal processing, uniform linear arrays.

8145 Solution of Two Dimensional Quasi-Harmonic Equations with CA Approach

Authors: F. Rezaie Moghaddam, J. Amani, T. Rezaie Moghaddam

Abstract:

Many computational techniques have been applied to the solution of heat conduction problems, among them the finite difference (FD), finite element (FE) and, more recently, meshless methods. FE is commonly used for the heat conduction equation; it is based on the assembly of element stiffness matrices and the solution of the resulting global system of equations. Because of this assembly process, the convergence rate decreases. Hence, in the present paper a Cellular Automata (CA) approach is presented for the solution of the heat conduction problem. Each cell is treated as a fixed point in a regular grid, so that the solution of one large system of equations is replaced by discrete systems of equations with small dimensions. Results show that CA can be used for the solution of heat conduction problems.
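
A minimal sketch of the cell-local update idea, assuming a unit grid and a fixed-temperature boundary: each interior cell is repeatedly relaxed toward its four neighbours, which for the chosen coefficient coincides with the explicit finite-difference update of the 2-D heat equation; the grid size and boundary values are illustrative.

```python
import numpy as np

def ca_heat_step(T, alpha=0.2):
    """One local update: each interior cell moves toward the mean of its 4 neighbours.

    With alpha <= 0.25 this is the standard explicit (FTCS) update for the 2-D
    heat equation on a unit grid, interpreted here as a cellular-automaton rule.
    """
    T_new = T.copy()
    T_new[1:-1, 1:-1] += alpha * (T[:-2, 1:-1] + T[2:, 1:-1] +
                                  T[1:-1, :-2] + T[1:-1, 2:] - 4.0 * T[1:-1, 1:-1])
    return T_new

if __name__ == "__main__":
    T = np.zeros((50, 50))
    T[:, 0] = 100.0                 # fixed-temperature boundary on the left edge
    for _ in range(2000):           # iterate the local rule until near steady state
        T = ca_heat_step(T)
        T[:, 0] = 100.0             # re-impose the Dirichlet boundary condition
    print("temperature along the middle row:", np.round(T[25, ::10], 1))
```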

Keywords: Heat conduction, Cellular automata, convergence rate, discrete system.

8144 Equalization Algorithms for MIMO System

Authors: Said Elkassimi, Said Safi, B. Manaut

Abstract:

In recent years, multi-antenna techniques have been considered a potential solution for increasing the throughput of future wireless communication systems. The objective of this article is to study the MIMO (Multiple Input Multiple Output) transmission and reception system and to present the different reception decoding techniques. We first present the least complex, linear receivers, namely the zero-forcing (ZF) and minimum mean squared error (MMSE) equalizers, followed by a nonlinear technique called ordered successive interference cancellation (OSIC) and the optimal detector based on the maximum likelihood (ML) criterion. Finally, we simulate the associated decoding algorithms for the MIMO system (ZF, MMSE, OSIC and ML) and compare the performance of these algorithms in the MIMO context.
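
The two linear receivers mentioned can be written in a few lines; the sketch below applies ZF and MMSE equalization to one received vector of a synthetic 2x4 MIMO channel with QPSK symbols, with the channel, noise variance and symbols chosen purely for illustration.

```python
import numpy as np

def zf_equalize(H, y):
    """Zero-forcing: invert the channel (pseudo-inverse), ignoring noise."""
    return np.linalg.pinv(H) @ y

def mmse_equalize(H, y, noise_var):
    """MMSE: regularised inversion that balances interference suppression and noise."""
    n_tx = H.shape[1]
    W = np.linalg.inv(H.conj().T @ H + noise_var * np.eye(n_tx)) @ H.conj().T
    return W @ y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_tx, n_rx, noise_var = 2, 4, 0.1
    H = (rng.standard_normal((n_rx, n_tx)) + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2)
    x = np.array([1 + 1j, -1 - 1j]) / np.sqrt(2)            # QPSK symbols from the 2 transmit antennas
    n = np.sqrt(noise_var / 2) * (rng.standard_normal(n_rx) + 1j * rng.standard_normal(n_rx))
    y = H @ x + n                                           # received vector
    print("ZF estimate:  ", np.round(zf_equalize(H, y), 2))
    print("MMSE estimate:", np.round(mmse_equalize(H, y, noise_var), 2))
```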

Keywords: Multiple Input Multiple Output (MIMO), ZF, MMSE, Ordered Successive Interference Cancellation (OSIC), ML, Successive Interference Cancellation (SIC).

8143 Support Vector Machine based Intelligent Watermark Decoding for Anticipated Attack

Authors: Syed Fahad Tahir, Asifullah Khan, Abdul Majid, Anwar M. Mirza

Abstract:

In this paper, we present a scheme for blindly extracting message bits from an image distorted by an attack. A Support Vector Machine (SVM) is used to nonlinearly classify the bits of the embedded message. Traditionally, a hard decoder is used under the assumption that the underlying distribution of the Discrete Cosine Transform (DCT) coefficients does not change appreciably. In case of an attack, however, the distribution of the image coefficients is heavily altered: the distributions of the sufficient statistics at the receiving end corresponding to the antipodal signals overlap, and a simple hard decoder fails to classify them properly. We treat message retrieval of the antipodal signal as a binary classification problem and use machine learning techniques such as SVM to retrieve the message when a certain specific class of attacks is most probable. To validate the SVM-based decoding scheme, we take Gaussian noise as a test case. We generate a data set using 125 images and 25 different keys. A polynomial-kernel SVM achieved 100 percent accuracy on the test data.
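
A toy sketch of the decoding idea using scikit-learn's polynomial-kernel SVC, with synthetic "sufficient statistics" for antipodal bits standing in for the paper's DCT-domain features; the feature construction, noise level and kernel settings are assumptions, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_set(n_bits, noise_std):
    """Toy sufficient statistics for antipodal watermark bits (+1 / -1), 8 values per bit."""
    bits = rng.integers(0, 2, n_bits)
    signal = np.where(bits[:, None] == 1, 1.0, -1.0) * rng.uniform(0.5, 1.5, (n_bits, 8))
    return signal + rng.normal(0.0, noise_std, (n_bits, 8)), bits

X_train, y_train = make_set(2000, noise_std=1.0)     # training set distorted by the anticipated attack
X_test, y_test = make_set(500, noise_std=1.0)

clf = SVC(kernel="poly", degree=3, C=1.0)            # polynomial kernel, as in the paper
clf.fit(X_train, y_train)
bcr = (clf.predict(X_test) == y_test).mean()         # Bit Correct Ratio on the test bits
print(f"Bit Correct Ratio: {bcr:.3f}")
```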

Keywords: Bit Correct Ratio (BCR), Grid Search, Intelligent Decoding, Jackknife Technique, Support Vector Machine (SVM), Watermarking.

8142 Social Semantic Web-Based Analytics Approach to Support Lifelong Learning

Authors: Khaled Halimi, Hassina Seridi-Bouchelaghem

Abstract:

The purpose of this paper is to describe how learning analytics approaches based on social semantic web techniques can be applied to enhance lifelong learning experiences from a connectivist perspective. For this reason, a prototype of a system called SoLearn (Social Learning Environment) that supports this approach was developed. We observed and studied the literature related to lifelong learning systems, the social semantic web and ontologies, connectivism theory and learning analytics approaches, and reviewed systems implemented in these fields, in order to extract and draw conclusions about the features necessary for enhancing the lifelong learning process. The semantic analytics of learning can be used for viewing, studying and analysing the massive data generated by learners, which helps them to understand, through recommendations, charts and figures, their learning and behaviour, and to detect where they have weaknesses or limitations. This paper emphasises that implementing a learning analytics approach based on social semantic web representations can enhance the learning process. On the one hand, the analysis process leverages the meaning expressed by the semantics represented in the ontology (relationships between concepts). On the other hand, the analysis process exploits the discovery of new knowledge by means of the inference mechanism of the semantic web.

Keywords: Connectivism, data visualization, informal learning, learning analytics, semantic web, social web.

8141 Mixtures of Monotone Networks for Prediction

Authors: Marina Velikova, Hennie Daniels, Ad Feelders

Abstract:

In many data mining applications, it is known a priori that the target function should satisfy certain constraints imposed by, for example, economic theory or a human decision-maker. In this paper we consider partially monotone prediction problems, where the target variable depends monotonically on some of the input variables but not on all. We propose a novel method to construct prediction models in which monotone dependences with respect to some of the input variables are preserved by construction. Our method belongs to the class of mixture models. The basic idea is to convolve monotone neural networks with weight (kernel) functions to make predictions. Using simulation and real case studies, we demonstrate the application of our method. To obtain a sound assessment of the performance of our approach, we use standard neural networks with weight decay and partially monotone linear models as benchmark methods for comparison. The results show that our approach outperforms partially monotone linear models in terms of accuracy. Furthermore, the incorporation of partial monotonicity constraints not only leads to models that are in accordance with the decision maker's expertise, but also considerably reduces the model variance in comparison to standard neural networks with weight decay.

Keywords: mixture models, monotone neural networks, partially monotone models, partially monotone problems.

8140 A Methodology for Investigating Public Opinion Using Multilevel Text Analysis

Authors: William Xiu Shun Wong, Myungsu Lim, Yoonjin Hyun, Chen Liu, Seongi Choi, Dasom Kim, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, many users have begun to frequently share their opinions on diverse issues through various social media. Therefore, numerous governments have attempted to establish or improve national policies according to the public opinions captured from social media. In this paper, we indicate several limitations of the traditional approaches to analyzing public opinion on science and technology and provide an alternative methodology to overcome these limitations. First, we distinguish between the science and technology analysis phase and the social issue analysis phase to reflect the fact that public opinion can be formed only when a certain science and technology is applied to a specific social issue. Next, we successively apply a start list and a stop list to acquire clarified and interesting results. Finally, to identify the documents that best fit a given subject, we develop a new logical filter concept that consists not only of mere keywords but also of logical relationships among the keywords. The study then analyzes the possibilities for the practical use of the proposed methodology through its application to discovering core issues and public opinions from 1,700,886 documents comprising SNS posts, blogs, news, and discussions.
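
A minimal sketch of the logical-filter idea, keywords combined with AND/OR/NOT rather than used as a flat list; the filter terms and sample documents are invented for illustration and are unrelated to the paper's corpus.

```python
import re

# A logical filter: keywords combined with AND / OR / NOT instead of a flat keyword list.
logical_filter = {
    "all": ["drone"],                      # must contain every term here (AND)
    "any": ["privacy", "surveillance"],    # must contain at least one of these (OR)
    "none": ["racing"],                    # must contain none of these (NOT)
}

def tokens(text):
    """Lowercased word tokens of a document."""
    return set(re.findall(r"[a-z]+", text.lower()))

def matches(doc, flt):
    """True if the document satisfies the AND, OR and NOT clauses of the filter."""
    words = tokens(doc)
    return (all(t in words for t in flt["all"])
            and any(t in words for t in flt["any"])
            and not any(t in words for t in flt["none"]))

docs = [
    "Public concern grows over drone surveillance near residential areas.",
    "Drone racing league announces the new championship season.",
    "New privacy rules proposed for commercial drone operators.",
]
print([d for d in docs if matches(d, logical_filter)])
```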

Keywords: Big data, social network analysis, text mining, topic modeling.

8139 Survey of Access Controls in Cloud Computing

Authors: Monirah Alkathiry, Hanan Aljarwan

Abstract:

Cloud computing is one of the most significant technologies in use today, deployed across different sectors with different purposes and capabilities. The cloud faces various challenges in securing data from unauthorized access or modification, and security risks have consequently increased greatly. Therefore, cloud service providers (CSPs) and users need secure mechanisms that ensure that data are kept secret and safe from any disclosure or exploit. For this reason, CSPs need a number of techniques and technologies to manage and secure access to cloud services in order to achieve security goals such as confidentiality, integrity, and identity and access management (IAM). This paper therefore reviews and explores various access controls implemented in a cloud environment to achieve different security purposes. The methodology followed in this survey was an assessment, evaluation and comparison of those access control mechanisms and technologies based on factors such as the security goals they achieve, usability, and cost-effectiveness. This assessment showed that the technology used in an access control affects the security goals it achieves, and that no single access control method achieves all security goals. Consequently, such a comparison can help decision-makers choose the access controls that meet their requirements.

Keywords: Access controls, cloud computing, confidentiality, identity and access management.

8138 Fast Cosine Transform to Increase Speed-up and Efficiency of Karhunen-Loève Transform for Lossy Image Compression

Authors: Mario Mastriani, Juliana Gambini

Abstract:

In this work, we present a comparison between two image compression techniques. In the first, the image is divided into blocks which are collected according to a zig-zag scan. In the second, we apply the Fast Cosine Transform to the image, and the transformed image is then divided into blocks which are likewise collected according to a zig-zag scan. In both cases, the Karhunen-Loève transform is then applied to these blocks. We also present three new metrics based on eigenvalues for a better comparative evaluation of the techniques. Simulations show that the combined version is the best, with lower Mean Absolute Error (MAE) and Mean Squared Error (MSE), higher Peak Signal to Noise Ratio (PSNR) and better image quality. Finally, the new technique was far superior to JPEG and JPEG2000.
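
A small sketch of the block / zig-zag / transform plumbing the comparison relies on, using SciPy's 2-D DCT per block and then a Karhunen-Loève transform over the collected block vectors; the block size, image and DCT normalization are illustrative assumptions rather than the authors' exact configuration.

```python
import numpy as np
from scipy.fft import dctn

def zigzag_order(n=8):
    """Zig-zag traversal order of an n x n block (JPEG-style scan)."""
    idx = [(i, j) for i in range(n) for j in range(n)]
    return sorted(idx, key=lambda p: (p[0] + p[1], p[1] if (p[0] + p[1]) % 2 == 0 else p[0]))

def block_dct_zigzag(image, block=8):
    """Apply a 2-D DCT to each block and read its coefficients in zig-zag order."""
    order = zigzag_order(block)
    h, w = (image.shape[0] // block) * block, (image.shape[1] // block) * block
    vectors = []
    for r in range(0, h, block):
        for c in range(0, w, block):
            coeffs = dctn(image[r:r + block, c:c + block], norm="ortho")
            vectors.append([coeffs[i, j] for i, j in order])
    return np.array(vectors)          # one row per block, low frequencies first

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((64, 64))
    V = block_dct_zigzag(img)
    print("blocks x coefficients:", V.shape)
    # Karhunen-Loève transform of the collected block vectors:
    # project onto the eigenvectors of their covariance matrix.
    cov = np.cov(V, rowvar=False)
    _, eig_vecs = np.linalg.eigh(cov)
    klt_coeffs = (V - V.mean(axis=0)) @ eig_vecs
    print("KLT coefficient matrix:", klt_coeffs.shape)
```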

Keywords: Fast Cosine Transform, image compression, JPEG, JPEG2000, Karhunen-Loève Transform, zig-zag scan.

8137 Power Saving System in Green Data Center

Authors: Joon-young Jung, Dong-oh Kang, Chang-seok Bae

Abstract:

Power consumption in data centers is increasing rapidly because the number of data centers is growing and their scale is becoming larger. Reducing power consumption in the data center is therefore a key research topic. The peak power of a typical server is around 250 watts. When a server is idle, it continues to use around 60% of the power consumed when in use, though vendors are putting effort into reducing this "idle" power load. Servers tend to operate at only around a 5% to 20% utilization rate, partly because of response-time concerns, and on average about 10% of the servers in data centers are unused. For these reasons, we propose a dynamic power management system to reduce power consumption in a green data center. Experimental results show that power consumption is reduced by about 55% at idle time.
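
A toy sketch of threshold-based dynamic power management under the idle-power figures quoted above: servers below an assumed utilization threshold are put to sleep and their load is consolidated onto the remaining machines. The threshold, wattages and utilization values are illustrative, not the authors' measurements.

```python
IDLE_THRESHOLD = 0.05            # below 5% utilization a server is considered idle
PEAK_W, IDLE_W = 250.0, 150.0    # ~60% of peak power is drawn even when idle

def manage(utilizations):
    """Return total power draw before and after sleeping idle servers."""
    before = sum(IDLE_W + (PEAK_W - IDLE_W) * u for u in utilizations)
    active = [u for u in utilizations if u >= IDLE_THRESHOLD]
    migrated = sum(u for u in utilizations if u < IDLE_THRESHOLD)
    if active:                   # spread the migrated load over the servers that stay on
        active = [u + migrated / len(active) for u in active]
    after = sum(IDLE_W + (PEAK_W - IDLE_W) * u for u in active)   # sleeping servers draw ~0 W
    return before, after

if __name__ == "__main__":
    utilizations = [0.15, 0.02, 0.01, 0.12, 0.03, 0.18]   # measured utilization per server
    before, after = manage(utilizations)
    print(f"power before: {before:.0f} W, after consolidation: {after:.0f} W")
```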

Keywords: Data Center, Green IT, Management Server, Power Saving.
