Search results for: tree based model.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15719

8909 Development of a Simulator for Explaining Organic Chemical Reactions Based on Qualitative Process Theory

Authors: Alicia Y. C. Tang, Rukaini Hj. Abdullah, Sharifuddin M. Zain

Abstract:

This paper discusses the development of a qualitative simulator (abbreviated QRiOM) for predicting the behaviour of organic chemical reactions. The simulation technique is based on the qualitative process theory (QPT) ontology. The modelling constructs of QPT embody notions of causality that can be used to explain the behaviour of a chemical system. The major theme of this work is that, in a qualitative simulation environment, students are able to articulate their knowledge by inspecting the explanations generated by the software. The implementation languages are Java and Prolog. The software produces explanations in various forms that stress the causal theories at work in the chemical system, which can be used effectively to support learning.

Keywords: Chemical reactions, explanation, qualitative process theory, simulation

8908 Application of Adaptive Network-Based Fuzzy Inference System in Macroeconomic Variables Forecasting

Authors: E. Giovanis

Abstract:

In this paper we apply an Adaptive Network-Based Fuzzy Inference System (ANFIS) with one input, the dependent variable with one lag, to forecast four macroeconomic variables of the US economy: the gross domestic product, the inflation rate, the six-month treasury bill interest rate, and the unemployment rate. We compare the forecasting performance of ANFIS with that of the widely used linear autoregressive and nonlinear smooth transition autoregressive (STAR) models. The results are strongly in favour of ANFIS, indicating that it is an effective tool for macroeconomic forecasting, suitable for academic research as well as for research and application by governmental and other institutions.

Keywords: Linear models, Macroeconomics, Neuro-Fuzzy, Non-Linear models

8907 Hand Written Digit Recognition by Multiple Classifier Fusion based on Decision Templates Approach

Authors: Reza Ebrahimpour, Samaneh Hamedi

Abstract:

Classifier fusion may yield more accurate classification than any of the base classifiers. Fusion is often based on fixed combination rules such as the product and the average. This paper presents decision templates as a classifier fusion method for the recognition of handwritten English and Farsi numerals (1-9). The process involves extracting a feature vector from well-known image databases. The extracted feature vector is fed to the multiple classifier fusion stage. A set of experiments was conducted to compare decision templates (DTs) with several combination rules. Decision templates achieve recognition rates of 97.99% and 97.28% for Farsi and English handwritten digits, respectively.
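
A minimal sketch of the decision-template rule described above: one template (the mean decision profile of the base classifiers) is learned per class, and a new sample is assigned to the class whose template is closest to its decision profile. The toy data, array shapes and squared-Euclidean distance below are illustrative assumptions, not the paper's MLP/feature pipeline.

```python
import numpy as np

def fit_decision_templates(profiles, labels, n_classes):
    """profiles: (n_samples, n_classifiers, n_classes) soft outputs on training data.
    Returns one template (mean decision profile) per class."""
    templates = np.zeros((n_classes, profiles.shape[1], profiles.shape[2]))
    for c in range(n_classes):
        templates[c] = profiles[labels == c].mean(axis=0)
    return templates

def predict_dt(profile, templates):
    """Assign the class whose template is closest (squared Euclidean) to the
    decision profile of a new sample."""
    dists = ((templates - profile) ** 2).sum(axis=(1, 2))
    return int(np.argmin(dists))

# Toy usage: 3 classifiers, 2 classes, random soft outputs stand in for real ones.
rng = np.random.default_rng(0)
train_profiles = rng.random((100, 3, 2))
train_labels = rng.integers(0, 2, 100)
T = fit_decision_templates(train_profiles, train_labels, n_classes=2)
print(predict_dt(rng.random((3, 2)), T))
```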

Keywords: Decision templates, multi-layer perceptron, characteristic loci, principal component analysis (PCA).

8906 Socio-Economic Insight of the Secondary Housing Market in Colombo Suburbs: Sellers’ Point of View

Authors: R. G. Ariyawansa, M. A. N. R. M. Perera

Abstract:

“House” is a powerful symbol of the socio-economic background of individuals and families. In fact, housing meets all types of needs and wants, from basic needs to self-actualization. This phenomenon can be understood only by analyzing the hidden motives of buyers and sellers in the housing market. Hence, the aim of this study is to examine the socio-economic insight of the secondary housing market in the Colombo suburbs. This broader aim was achieved by analyzing the general pattern of the secondary housing market, identifying the socio-economic motives of sellers in that market, and reviewing sellers’ experience of buyer behavior. A purposive sample of 50 sellers from popular residential areas in Colombo such as Maharagama, Kottawa, Piliyandala, Punnipitiya, and Nugegoda was used to collect primary data, rather than relying on secondary data from published and unpublished reports. The sample was limited to selling prices ranging from Rs. 15 million to Rs. 25 million, which in this context corresponds to middle and upper-middle income houses. Participatory observation and semi-structured interviews were adopted as the key data collection tools, and the data were analyzed descriptively. The study found that the market is mainly handled by informal agents who are unqualified and unorganized. People such as taxi and three-wheeler drivers, boutique vendors, and security personnel engage in housing brokerage as a part-time career. A few full-time and formally organized agents were found, but they were not professionally qualified either. As far as housing quality is concerned, 90% of the houses observed were poorly maintained and illegally modified, and they are situated in poorly maintained neighborhoods as well. Among the observed houses, 2% were moderately maintained and 8% were well maintained and modified. The major socio-economic motives of sellers were migrating to foreign countries for education and employment (80% and 10% respectively), family problems (4%), and social status (3%). Other motives were health and environmental or neighborhood problems (3%). The study further noted that the secondary middle-income housing market in the area is directly related to migrants motivated by education in foreign countries, mainly Australia, the UK and the USA. As the literature suggests, families motivated by education tend to migrate to the Colombo suburbs from remote areas of the country, seeking temporary accommodation in lower middle-income housing, whereas the secondary middle-income housing market is tied to migration from Colombo to major global cities. Therefore, the final transaction price in this market may depend on migration-related dates such as university deadlines, visa appointments and other agreements; this creates a buyers’ market that lowers the selling price. It was also revealed that, as far as construction quality is concerned, buyers tend to trust this market more than brand-new houses built for sale.

Keywords: Informal housing market, hidden motives of buyers and sellers, secondary housing market, socio-economic insight.

8905 Searching k-Nearest Neighbors to be Appropriate under Gaming Environments

Authors: Jae Moon Lee

Abstract:

In general, algorithms for finding continuous k-nearest neighbors have been studied for location-based services, which periodically monitor moving objects such as vehicles and mobile phones. Those studies assume an environment in which the number of query points is much smaller than the number of moving objects and the query points are fixed rather than moving. In gaming environments, the problem arises when computing the next movement of each agent while considering its neighbors, as in flocking, crowd and robot simulations. In this case, every moving object becomes a query point, so the number of query points equals the number of moving objects and the query points themselves are moving. In this paper, we analyze how the existing algorithms, designed for location-based services, perform under gaming environments.
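
A minimal sketch of the setting the abstract describes, assuming a flocking-style simulation in which every moving object is also a query point: a k-d tree is rebuilt every frame and queried for the k nearest neighbours of all objects at once. The arena size, agent count and motion model are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_all_objects(positions, k=5):
    """positions: (n, 2) array of object coordinates for one frame.
    Returns indices of the k nearest neighbours of each object (self excluded)."""
    tree = cKDTree(positions)
    # query k+1 neighbours because the nearest point to each object is itself
    _, idx = tree.query(positions, k=k + 1)
    return idx[:, 1:]

rng = np.random.default_rng(1)
agents = rng.random((200, 2)) * 100.0            # 200 agents in a 100x100 arena
for frame in range(3):                           # a few simulation steps
    neighbours = knn_all_objects(agents, k=5)    # every object is a query point
    agents += rng.normal(scale=0.5, size=agents.shape)  # every object moves
print(neighbours.shape)                          # (200, 5)
```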

Keywords: Flocking behavior, heterogeneous agents, similarity, simulation.

8904 An Improved Algorithm for Channel Estimation of OFDM Systems Based on Pilot Signals

Authors: Ahmed N. H. Alnuaimy, Mahamod Ismail, Mohd. A. M. Ali, Kasmiran Jumari, Ayman A. El-Saleh

Abstract:

This paper presents a new algorithm for channel estimation in OFDM systems based on a pilot signal, targeting the new generation of high data rate communication systems. In orthogonal frequency division multiplexing (OFDM) systems over fast-varying fading channels, channel estimation and tracking are generally carried out by transmitting known pilot symbols at given positions of the frequency-time grid. We propose an improved algorithm based on calculating the mean and the variance of the adjacent pilot signals for a specific distribution of the pilots in the OFDM frequency-time grid, and then computing all of the unknown channel coefficients from the resulting mean and variance equations. Simulation results show that the performance of the OFDM system improves as the channel length increases, since the accuracy of the estimated channel increases with this low-complexity algorithm. In addition, the number of pilot signals that must be inserted in the OFDM signal is reduced, which increases the throughput of the OFDM system compared with other pilot distributions such as comb-type and block-type channel estimation.
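
For orientation, a minimal sketch of the comb-type baseline that the abstract compares against: least-squares estimates at the pilot subcarriers are interpolated across the data subcarriers. The proposed mean/variance-based rule is not reproduced here; the channel taps, pilot spacing and noise level are illustrative assumptions.

```python
import numpy as np

N, pilot_spacing = 64, 8
pilot_idx = np.arange(0, N, pilot_spacing)
pilots = np.ones(len(pilot_idx), dtype=complex)            # known pilot symbols

rng = np.random.default_rng(0)
h_true = np.fft.fft(np.array([0.9, 0.4j, 0.2]), N)          # toy 3-tap channel
tx = np.exp(1j * np.pi / 4) * np.ones(N, dtype=complex)     # dummy data symbols
tx[pilot_idx] = pilots
noise = 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
rx = h_true * tx + noise

h_ls_at_pilots = rx[pilot_idx] / pilots                      # LS estimate at pilots
# interpolate real and imaginary parts separately over all subcarriers
h_est = (np.interp(np.arange(N), pilot_idx, h_ls_at_pilots.real)
         + 1j * np.interp(np.arange(N), pilot_idx, h_ls_at_pilots.imag))
print(np.mean(np.abs(h_est - h_true) ** 2))                  # estimation MSE
```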

Keywords: Channel estimation, orthogonal frequency division multiplexing (OFDM), comb type channel estimation, block type channel estimation.

8903 An Effective Hybrid Genetic Algorithm for Job Shop Scheduling Problem

Authors: Bin Cai, Shilong Wang, Haibo Hu

Abstract:

The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations. Schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computation results validate the effectiveness of the proposed algorithm.
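
A minimal sketch of the operation-based chromosome representation mentioned above: each gene is a job index, every job index appears once per operation of that job, and decoding schedules each operation at the earliest time its job and machine are both free. This shows only a simple decoder and makespan evaluation on an assumed toy instance, not the hybrid GA or the Nowicki-Smutnicki local search.

```python
# jobs[j] = list of (machine, processing_time) in technological order
jobs = [[(0, 3), (1, 2), (2, 2)],
        [(0, 2), (2, 1), (1, 4)],
        [(1, 4), (2, 3), (0, 1)]]

def decode_makespan(chromosome, jobs, n_machines):
    next_op = [0] * len(jobs)       # next operation index per job
    job_ready = [0] * len(jobs)     # completion time of the last scheduled op per job
    mach_ready = [0] * n_machines   # time each machine becomes free
    for j in chromosome:            # each gene is a job index
        m, p = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready[m])
        job_ready[j] = mach_ready[m] = start + p
        next_op[j] += 1
    return max(job_ready)

# one chromosome: job index j appears len(jobs[j]) times
chrom = [0, 1, 2, 1, 0, 2, 2, 0, 1]
print(decode_makespan(chrom, jobs, n_machines=3))
```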

Keywords: Genetic algorithm, Job shop scheduling problem, Local search, Meta-heuristic algorithm

8902 A Multi-Level WEB Based Parallel Processing System: A Hierarchical Volunteer Computing Approach

Authors: Abdelrahman Ahmed Mohamed Osman

Abstract:

Over the past few years, a number of efforts have been made to build parallel processing systems that utilize the idle power of LANs and PCs available in many homes and corporations. The main advantage of these approaches is that they provide cheap parallel processing environments for those who cannot afford the expense of supercomputers and parallel processing hardware. However, most of the solutions provided are not very flexible in the use of available resources and are difficult to install and set up. In this paper, a multi-level web-based parallel processing system (MWPS) is designed. MWPS is based on the idea of volunteer computing; it is very flexible and easy to set up and use. MWPS allows three types of subscribers: simple volunteers (single computers), super volunteers (full networks) and end users. All of these entities are coordinated transparently through a secure web site. Volunteer nodes provide the processing power needed by the system's end users. There is no limit on the number of volunteer nodes, so the system can grow indefinitely. Both volunteers and system users must register and subscribe. Once they subscribe, each entity is provided with the appropriate MWPS components, which are very easy to install. Super volunteer nodes are provided with special components that make it possible to delegate some of the load to their inner nodes; these inner nodes may in turn delegate some of the load to lower-level inner nodes, and so on. It is the responsibility of the parent super nodes to coordinate the delegation process and deliver the results back to the user. MWPS uses a simple behavior-based scheduler that takes into consideration the current load and previous behavior of processing nodes. Nodes that fulfill their contracts within the expected time gain a high degree of trust; nodes that fail to satisfy their contracts receive a lower degree of trust. MWPS is based on the .NET framework and provides the minimal level of security expected in distributed processing environments. Users and processing nodes are fully authenticated, and communications and messages between nodes are secured. The system has been implemented using C#. MWPS may be used by any group of people or companies to establish a parallel processing or grid environment.
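
A minimal sketch of the behaviour-based scheduling idea (in Python for illustration, whereas the system itself is C#/.NET): nodes that fulfil their contracts on time gain trust, nodes that miss them lose it, and new work goes to the most trusted, least loaded node. The class names, trust-update constants and dispatch rule are illustrative assumptions, not the MWPS scheduler.

```python
class VolunteerNode:
    def __init__(self, name):
        self.name, self.trust, self.load = name, 0.5, 0

    def report(self, fulfilled_on_time):
        # simple exponential trust update toward 1 (kept contract) or 0 (missed it)
        target = 1.0 if fulfilled_on_time else 0.0
        self.trust = 0.8 * self.trust + 0.2 * target
        self.load = max(0, self.load - 1)

def dispatch(task, nodes):
    # favour high trust and low current load
    node = max(nodes, key=lambda n: n.trust / (1 + n.load))
    node.load += 1
    return node

nodes = [VolunteerNode("pc-1"), VolunteerNode("pc-2"), VolunteerNode("lan-A")]
chosen = dispatch("task-42", nodes)
chosen.report(fulfilled_on_time=True)
print(chosen.name, round(chosen.trust, 2))
```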

Keywords: Volunteer computing, Parallel Processing, XML Web Services, .NET Remoting, Tuplespace.

8901 Fuzzy Clustering of Categorical Attributes and its Use in Analyzing Cultural Data

Authors: George E. Tsekouras, Dimitris Papageorgiou, Sotiris Kotsiantis, Christos Kalloniatis, Panagiotis Pintelas

Abstract:

We develop a three-step fuzzy logic-based algorithm for clustering categorical attributes, and we apply it to analyze cultural data. In the first step the algorithm employs an entropy-based clustering scheme, which initializes the cluster centers. In the second step we apply the fuzzy c-modes algorithm to obtain a fuzzy partition of the data set, and the third step introduces a novel cluster validity index, which decides the final number of clusters.
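
A minimal sketch of the second step (fuzzy c-modes) on assumed toy categorical data: memberships are updated from simple-matching dissimilarities to the cluster modes, and each mode attribute is set to the category with the largest sum of weighted memberships. The entropy-based initialisation and the validity index of the first and third steps are not reproduced, and the fuzzifier and seeds are illustrative.

```python
import numpy as np

def mismatch(a, b):
    return np.sum(a != b)   # simple matching dissimilarity for categorical data

def fuzzy_c_modes(X, modes, m=1.5, n_iter=10):
    for _ in range(n_iter):
        # membership update from dissimilarities to each mode
        d = np.array([[mismatch(x, mode) + 1e-9 for mode in modes] for x in X])
        u = 1.0 / d ** (1.0 / (m - 1))
        u /= u.sum(axis=1, keepdims=True)
        # mode update: for each cluster/attribute pick the category with the
        # largest sum of (membership ** m) over the objects holding it
        for k in range(len(modes)):
            for j in range(X.shape[1]):
                cats = np.unique(X[:, j])
                scores = [np.sum(u[X[:, j] == cat, k] ** m) for cat in cats]
                modes[k][j] = cats[int(np.argmax(scores))]
    return u, modes

X = np.array([["red", "a"], ["red", "b"], ["blue", "b"], ["blue", "c"]])
modes = [X[0].copy(), X[3].copy()]   # e.g. seeds delivered by the entropy-based step
u, modes = fuzzy_c_modes(X, modes)
print(u.round(2))
```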

Keywords: Categorical data, cultural data, fuzzy logic clustering, fuzzy c-modes, cluster validity index.

8900 Highly Efficient White Light-emitting Diodes Based on Layered Quantum Dot-Phosphor Nanocomposites as Converting Materials

Authors: J. Y. Woo, J. Lee, N. Kim, C.-S. Han

Abstract:

This paper reports on the enhanced photoluminescence (PL) of nanocomposites through the layered structuring of phosphor and quantum dots (QDs). A green phosphor of Sr2SiO4:Eu, red QDs with a CdSe/CdS/CdZnS/ZnS core-multishell structure, and a thermo-curable resin were used for this study. Two kinds of composite (layered and mixed) were prepared, and schemes for optical energy transfer between QD and phosphor were suggested and investigated based on PL decay characteristics. It was found that the layered structure is more effective than the mixed one in terms of PL intensity, PL decay and thermal loss. When this layered nanocomposite (QDs on phosphor) is used to make a white light emitting diode (LED), the brightness is increased by 37%, and the color rendering index (CRI) value is raised to 88.4, compared to 80.4 for the mixed case.

Keywords: Quantum Dot, Nanocomposites, Photoluminescence, Light Emitting Diode

8899 Why do Clawback Provisions Affect Financial Reporting Quality? - An Analysis of Trigger Effects

Authors: Yu-Chun Lin

Abstract:

We identify clawback triggers from firms' proxy statements (Form DEF 14A) and use the likelihood of restatements as a proxy for financial reporting quality. Based on a sample of 578 U.S. firms that voluntarily adopted clawback provisions during 2003-2009, we decompose restatement-based triggers into two types, fraud and unintentional error, and we observe evidence that using fraud triggers is associated with high financial reporting quality. The findings support the view that fraud triggers can enhance the deterrent effect of clawback provisions by establishing a viable disincentive against fraud, misconduct, and other harmful acts. These results are robust to controlling for compensation components, to different sample specifications and to a number of sensitivity tests.

Keywords: Accruals quality, Clawback provisions, Compensation, Restatements.

8898 CScheme in Traditional Concurrency Problems

Authors: Nathar Shah, Visham Cheerkoot

Abstract:

CScheme, a concurrent programming paradigm based on the scheme concept, enables concurrency schemes to be constructed from smaller synchronization units through a GUI-based composer and later reused on other concurrency problems of a similar nature. This paradigm is particularly important in the multi-core environments prevalent nowadays. In this paper, we demonstrate techniques to separate concurrency from functional code using the CScheme paradigm. We then illustrate how the CScheme methodology can be used to solve some of the traditional concurrency problems, such as the critical section problem and the readers-writers problem, using synchronization schemes such as the Single Threaded Execution Scheme and the Readers Writers Scheme.

Keywords: Concurrent Programming, Object Oriented Programming, Environments for multiple-processor systems, Programming paradigms.

8897 Optimization Based Tuning of Autopilot Gains for a Fixed Wing UAV

Authors: Mansoor Ahsan, Khalid Rafique, Farrukh Mazhar

Abstract:

Unmanned Aerial Vehicles (UAVs) have gained tremendous importance, in both military and civil domains, during the first decade of this century. In a UAV, the onboard computer (autopilot) autonomously controls the flight and navigation of the aircraft. Depending on the aircraft's role and flight envelope, controllers ranging from basic to complex and sophisticated are used to stabilize the aircraft's flight parameters; these controllers constitute the UAV's autopilot system. Autopilot systems most commonly provide lateral and longitudinal control through Proportional-Integral-Derivative (PID) controllers or phase-lead or lag compensators. Various techniques are commonly used to tune the gains of these controllers, including in-flight step-by-step tuning and software-in-the-loop or hardware-in-the-loop tuning methods. Numerous in-flight tests are subsequently required to actually fine-tune these gains. However, an optimization-based tuning of these PID controllers or compensators, as presented in this paper, can greatly reduce the need for in-flight tuning and substantially reduce the risks and cost involved in flight testing.
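
A minimal sketch of the optimisation-based tuning idea: a scalar cost (here the integral of squared error of a closed-loop step response) is minimised over the PID gains with a derivative-free optimiser. The second-order plant, cost function and initial gains are illustrative assumptions, not a UAV autopilot model.

```python
import numpy as np
from scipy.optimize import minimize

def ise(gains, dt=0.01, t_end=5.0):
    """Integral of squared error for a unit-step response under PID control."""
    kp, ki, kd = gains
    y = v = integ = 0.0
    prev_err = 1.0
    cost = 0.0
    for _ in range(int(t_end / dt)):
        err = 1.0 - y                    # unit step reference
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        prev_err = err
        # toy plant: y'' + 2*y' + 4*y = 4*u, integrated with forward Euler
        a = 4.0 * u - 2.0 * v - 4.0 * y
        v += a * dt
        y += v * dt
        cost += err * err * dt
    return cost

res = minimize(ise, x0=[1.0, 0.5, 0.1], method="Nelder-Mead")
print(res.x, res.fun)   # tuned (Kp, Ki, Kd) and achieved ISE
```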

Keywords: Unmanned aerial vehicle (UAV), autopilot, autonomous controls, PID controller gains tuning, optimization.

8896 Microseismicity of the Tehran Region Based on Three Seismic Networks

Authors: Jamileh Vasheghani Farahani

Abstract:

The main purpose of this research is to identify the currently active faults and active tectonics of the area using three seismic networks in the Tehran region: 1) the Tehran Disaster Mitigation and Management Organization (TDMMO), 2) the Broadband Iranian National Seismic Network Center (BIN), and 3) the Iranian Seismological Center (IRSC). In this study, we analyzed microearthquakes that occurred in Tehran city and its surroundings, recorded by the Tehran networks from 1996 to 2015, and found several active faults and trends in the region. There is a 200-year history of historical earthquakes in Tehran. Historical and instrumental seismicity show that the east of Tehran is more active than the west. The Mosha fault in the north of Tehran is one of the active faults of the central Alborz; other major faults in the region are the Kahrizak, Eyvanakey, Parchin and North Tehran faults. An important seismicity region is the intersection of the Mosha and North Tehran fault systems (Kalan village in Lavasan), which shows a cluster of microearthquakes. According to the historical and microseismic events analyzed in this research, there is a seismic gap in the southeast of Tehran. An empirical relationship is used to assess Mmax based on the rupture length. There is a probability of a strong earthquake of magnitude 7.0 to 7.5 in the region, based on the assessed capability of the major faults, such as the Parchin and Eyvanekey faults, and on the historical earthquakes.
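
A minimal sketch of the kind of empirical Mmax assessment mentioned above, using one widely cited rupture-length relation (Wells and Coppersmith, 1994, all slip types: M = 5.08 + 1.16 log10 L, with L in km). The fault lengths below are purely hypothetical and are not measurements of the Tehran-region faults.

```python
import math

def mmax_from_rupture_length(length_km):
    # Wells and Coppersmith (1994), all slip types, surface rupture length in km
    return 5.08 + 1.16 * math.log10(length_km)

for name, length_km in [("hypothetical fault A", 60.0),
                        ("hypothetical fault B", 35.0)]:
    print(f"{name}: Mmax ~ {mmax_from_rupture_length(length_km):.1f}")
```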

Keywords: Iran, major faults, microseismicity, Tehran.

8895 The Vulnerability Analysis of Java Bytecode Based on Points-to Dataflow

Authors: Tang Hong, Zhang Lufeng, Chen Hua, Zhang Jianbo

Abstract:

Today many developers use Java components collected from the Internet as external LIBs to design and develop their own software. However, unknown security bugs may exist in these components; for example, an SQL injection bug may come from a component that performs no specific check on strings input by users. Checking for such bugs is very difficult without the source code. A novel method is therefore needed to check for bugs in Java bytecode based on points-to dataflow analysis, which differs from the common analysis techniques based on vulnerability pattern checking. It can be used as an assistant tool for the security analysis of Java bytecode from unknown software that will be used as external LIBs.
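
A minimal sketch of the dataflow idea behind the method: a "tainted" property is propagated from user-controlled sources along value-flow edges with a worklist, and a warning is raised if it reaches a sensitive sink unsanitised. The graph, node names and sink list are hypothetical; this is a toy propagation in Python, not a Java bytecode analyser.

```python
from collections import defaultdict, deque

edges = defaultdict(list)           # dataflow edges: value flows from key to each successor
edges["request.getParameter"] += ["userName"]
edges["userName"] += ["query"]      # string concatenated into an SQL query
edges["query"] += ["Statement.executeQuery"]

sources = {"request.getParameter"}          # user-controlled inputs
sinks = {"Statement.executeQuery"}          # security-sensitive operations
sanitizers = {"escapeSql"}                  # propagation stops here

tainted, work = set(sources), deque(sources)
while work:                                  # standard worklist propagation
    node = work.popleft()
    for succ in edges[node]:
        if succ not in tainted and succ not in sanitizers:
            tainted.add(succ)
            work.append(succ)

print("potential injection:", bool(tainted & sinks))
```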

Keywords: Java bytecode, points-to dataflow, vulnerability analysis

8894 Multiple Input Multiple Output Detection Using Roulette Wheel Based Ant Colony Optimization Technique

Authors: B. Rebekka, B. Malarkodi

Abstract:

This paper describes an approach to detecting the transmitted signals in a 2×2 Multiple Input Multiple Output (MIMO) setup using a roulette-wheel-based ant colony optimization technique. The results obtained are compared with classical zero forcing and least mean square techniques. The detection rates achieved using this technique are consistently higher than those achieved using the classical methods over 50 attempts, with two different antennas transmitting the input stream from a user. This paves the way for using alternative techniques to improve the throughput achieved in advanced networks such as Long Term Evolution (LTE) networks.
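
A minimal sketch of the roulette-wheel step inside ant colony optimisation: each candidate symbol is drawn with probability proportional to its pheromone-times-heuristic weight. The candidate set and the weights below are illustrative assumptions; the MIMO channel and likelihood terms are not modelled here.

```python
import numpy as np

def roulette_wheel(weights, rng):
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()                     # normalise weights to a probability vector
    return rng.choice(len(p), p=p)      # spin the wheel once

rng = np.random.default_rng(7)
candidates = [-3, -1, 1, 3]             # e.g. one real dimension of a 16-QAM symbol
pheromone = np.array([0.2, 1.5, 0.9, 0.4])
heuristic = np.array([0.1, 0.8, 0.7, 0.2])
weights = pheromone * heuristic         # attractiveness of each candidate

picked = [candidates[roulette_wheel(weights, rng)] for _ in range(10)]
print(picked)
```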

Keywords: MIMO, ant colony optimization, roulette wheel, soft computing, LTE.

8893 Performance Analysis of Traffic Classification with Machine Learning

Authors: Htay Htay Yi, Zin May Aye

Abstract:

Network security plays a key role in the ICT environment because the number of malicious users in education, business and other ICT-related realms is continually growing. Network security violations are typically described and examined centrally, based on a security event management system. Firewalls, Intrusion Detection Systems (IDS), and Intrusion Prevention Systems are becoming essential to monitor or prevent potential violations, attack incidents, and imminent threats. In this system, the firewall rules are set only where the system policies require them. The datasets deployed in this system are derived from the testbed environment, in which DoS and PortScan traffic is applied with a firewall and IDS implementation. The network traffic is classified as normal or attack in the existing testbed environment using six machine learning classification methods applied in the system. The dataset is based on CICIDS2017, with some added features, and the system tests 26 features from the applied dataset. The aims are to reduce false positive rates and to improve accuracy in the implemented testbed design. The system also demonstrates good performance by selecting important features and comparing the existing dataset across the machine learning classifiers.
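
A minimal sketch of the classification stage, assuming synthetic stand-ins for the CICIDS2017-derived testbed features (so the printed numbers are not the paper's results): a few of the classifier families mentioned above are trained on 26 features and compared on accuracy and false-negative rate.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 26))        # 26 flow features, as in the paper
# synthetic label: "attack" when a few features jointly exceed a threshold
y = (X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=2000) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for clf in (GaussianNB(), LogisticRegression(max_iter=1000),
            RandomForestClassifier(n_estimators=100, random_state=0)):
    y_hat = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_hat).ravel()
    fnr = fn / (fn + tp)               # false-negative rate for the attack class
    print(type(clf).__name__, round(accuracy_score(y_te, y_hat), 3), round(fnr, 3))
```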

Keywords: False negative rate, intrusion detection system, machine learning methods, performance.

8892 Transform-Domain Rate-Distortion Optimization Accelerator for H.264/AVC Video Encoding

Authors: Mohammed Golam Sarwer, Lai Man Po, Kai Guo, Q.M. Jonathan Wu

Abstract:

In H.264/AVC video encoding, rate-distortion optimization for mode selection plays a significant role in achieving outstanding performance in compression efficiency and video quality. However, this mode selection process also makes the encoding process extremely complex, especially in the computation of the rate-distortion cost function, which includes the computations of the sum of squared difference (SSD) between the original and reconstructed image blocks and context-based entropy coding of the block. In this paper, a transform-domain rate-distortion optimization accelerator based on fast SSD (FSSD) and a VLC-based rate estimation algorithm is proposed. This algorithm can significantly simplify the hardware architecture for the rate-distortion cost computation with only negligible performance degradation. An efficient hardware structure for implementing the proposed transform-domain rate-distortion optimization accelerator is also proposed. Simulation results demonstrate that the proposed algorithm reduces about 47% of total encoding time with negligible degradation of coding performance. The proposed method can be easily applied to many mobile video application areas such as digital cameras and DMB (Digital Multimedia Broadcasting) phones.

Keywords: Context-adaptive variable length coding (CAVLC), H.264/AVC, rate-distortion optimization (RDO), sum of squared difference (SSD).

8891 Adaptive Impedance Control for Unknown Non-Flat Environment

Authors: Norsinnira Zainul Azlan, Hiroshi Yamaura

Abstract:

This paper presents a new adaptive impedance control strategy, based on the Function Approximation Technique (FAT), to compensate for an unknown non-flat environment shape or a time-varying environment location. The target impedance in the force-controllable direction is modified by incorporating adaptive compensators, and the uncertainties are represented by FAT, allowing the update law to be derived easily. The force error feedback is utilized in the estimation, and accurate knowledge of the environment parameters is not required by the algorithm. It is shown mathematically that the stability of the controller is guaranteed based on Lyapunov theory. Simulation results are presented to demonstrate the validity of the proposed controller.

Keywords: Adaptive impedance control, Function Approximation Technique (FAT), impedance control, unknown environment position.

8890 Security Weaknesses of Dynamic ID-based Remote User Authentication Protocol

Authors: Hyoungseob Lee, Donghyun Choi, Yunho Lee, Dongho Won, Seungjoo Kim

Abstract:

Recently, with the appearance of smart cards, many user authentication protocols using smart cards have been proposed to mitigate the vulnerabilities in the user authentication process. In 2004, Das et al. proposed an ID-based user authentication protocol using smart cards that is secure against ID theft and replay attacks. In 2009, Wang et al. showed that Das et al.'s protocol is not secure against randomly chosen password attacks and impersonation attacks, and proposed an improved protocol that provides mutual authentication and efficient password management. In this paper, we analyze the security weaknesses and point out the vulnerabilities of Wang et al.'s protocol.

Keywords: Message Alteration Attack, Impersonation Attack

8889 Controlling the Angle of Attack of an Aircraft Using Genetic Algorithm Based Flight Controller

Authors: S. Swain, P. S Khuntia

Abstract:

In this paper, the unstable angle of attack of a FOXTROT aircraft is controlled using a Genetic Algorithm based flight controller, and the result is compared with conventional techniques for tuning the PID controller, namely Tyreus-Luyben (TL), Ziegler-Nichols (ZN) and the Interpolation Rule (IR). In addition, performance indices such as the Mean Square Error (MSE), Integral Square Error (ISE), and Integral Absolute Time Error (IATE) are improved by using the Genetic Algorithm. It was established that the error obtained using GA is much smaller than with the conventional techniques, thereby improving the performance indices of the dynamic system.

Keywords: Angle of Attack, Genetic Algorithm, Performance Indices, PID Controller.

8888 Low Overhead Dynamic Channel Selection with Cluster-Based Spatial-Temporal Station Reporting in Wireless Networks

Authors: Zeyad Abdelmageid, Xianbin Wang

Abstract:

Choosing the operational channel for a WLAN access point (AP) has traditionally been a static channel assignment process initiated by the user during deployment of the AP, which fails to cope with the subsequently changing conditions of the assigned channel at the station side. The dramatically growing number of Wi-Fi APs and stations operating in the unlicensed band has led to dynamic, distributed and often severe interference. This highlights the urgent need for the AP to dynamically select the best overall channel of operation for the basic service set (BSS) by considering the distributed and changing channel conditions at all stations. Consequently, dynamic channel selection algorithms that consider feedback from the station side have been developed. Despite the significant performance improvement, existing channel selection algorithms suffer from very high feedback overhead. Feedback latency from the STAs, due to this high overhead, can cause the eventually selected channel to no longer be optimal for operation, given the dynamic sharing nature of the unlicensed band. This has inspired us to develop a dynamic channel selection algorithm with reduced overhead through the proposed low-overhead, cluster-based station reporting mechanism. The main idea behind cluster-based station reporting is the observation that STAs which are very close to each other tend to have very similar channel conditions. Instead of requesting each STA to report on every candidate channel, which causes high overhead, the AP divides the STAs into clusters and assigns each STA in each cluster one channel to report feedback on. With proper design of the cluster-based reporting, the AP does not lose any information about the channel conditions at the station side while reducing feedback overhead. The simulation results show equal, and at times better, performance with a fraction of the overhead. We believe that this algorithm has great potential for the design of future dynamic channel selection algorithms with low overhead.
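
A minimal sketch of the cluster-based reporting mechanism, assuming DBSCAN (which appears in the keywords) over station coordinates: stations in the same cluster split the candidate-channel list round-robin, so each STA reports on a single channel while the cluster as a whole still covers them all. The positions, clustering parameters and channel list are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(3)
# three spatial groups of stations; coordinates are arbitrary metres
sta_xy = np.vstack([rng.normal(loc, 1.0, size=(8, 2))
                    for loc in ((0, 0), (20, 5), (5, 25))])
candidate_channels = [1, 6, 11, 36, 40]

labels = DBSCAN(eps=3.0, min_samples=3).fit_predict(sta_xy)
assignment = {}
for cluster in set(labels):                  # note: noise points get label -1
    members = np.where(labels == cluster)[0]
    for i, sta in enumerate(members):
        # round-robin: stations in the same cluster split the channel list
        assignment[int(sta)] = candidate_channels[i % len(candidate_channels)]
print(assignment)
```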

Keywords: Channel assignment, Wi-Fi networks, clustering, DBSCAN, overhead.

8887 A Case Study of Key-Dependent Permutations in Feistel Ciphers

Authors: Hani Almimi, Ola Osabi, Azman Samsudin

Abstract:

Many attempts have been made to strengthen Feistel-based block ciphers. Among the successful proposals is the key-dependent S-box, which was implemented in some high-profile ciphers. In this paper a key-dependent permutation box is proposed and implemented on DES as a case study. The new modified DES, MDES, was tested against the Diehard tests, an avalanche test, and a performance test. The results showed that in general MDES is more resistant to attacks than DES, with negligible overhead. Therefore, it is believed that the proposed key-dependent permutation should be considered a valuable primitive that can help strengthen the security of the Substitution-Permutation Network, which is a core design in many Feistel-based block ciphers.
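
A minimal sketch of the key-dependent permutation primitive, assuming the key seeds a pseudo-random shuffle of bit positions: the P-box wiring then changes with the key and remains invertible. This illustrates the idea only; it is not the MDES construction and is not offered as a secure design.

```python
import hashlib
import random

def key_dependent_pbox(key: bytes, width: int = 32):
    """Derive a permutation of bit positions from the key."""
    seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
    perm = list(range(width))
    random.Random(seed).shuffle(perm)        # key-dependent bit permutation
    return perm

def permute_bits(value: int, perm):
    out = 0
    for dst, src in enumerate(perm):
        out |= ((value >> src) & 1) << dst   # move bit `src` to position `dst`
    return out

perm = key_dependent_pbox(b"session key 01")
x = 0x12345678
y = permute_bits(x, perm)
inv = [perm.index(i) for i in range(len(perm))]
assert permute_bits(y, inv) == x             # the permutation is invertible
print(hex(y))
```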

Keywords: Block Cipher, Feistel Structure, DES, Diehard Tests, Avalanche Effect.

8886 A New Algorithm for Enhanced Robustness of Copyright Mark

Authors: Harsh Vikram Singh, S. P. Singh, Anand Mohan

Abstract:

This paper discusses a new heavy-tailed-distribution-based scheme for hiding data in the discrete cosine transform (DCT) coefficients of an image, which provides statistical security as well as robustness against steganalysis attacks. Unlike other data hiding algorithms, the proposed technique does not introduce much change in the stego-image's DCT coefficient probability plots, thus making the presence of hidden data statistically undetectable. In addition, the proposed method does not compromise on hiding capacity. When compared to the generic block-DCT-based data hiding scheme, our method is found to be more robust against a variety of image manipulation attacks such as filtering, blurring, and JPEG compression.

Keywords: Information Security, Robust Steganography, Steganalysis, Pareto Probability Distribution function.

8885 Near-Lossless Image Coding based on Orthogonal Polynomials

Authors: Krishnamoorthy R, Rajavijayalakshmi K, Punidha R

Abstract:

In this paper, a near-lossless image coding scheme based on the Orthogonal Polynomials Transform (OPT) is presented. The polynomial operators and polynomial basis operators are obtained from a set of orthogonal polynomial functions for the proposed transform coding. The image is partitioned into a number of distinct square blocks and the proposed transform coding is applied to each of these individually. After applying the proposed transform coding, the transformed coefficients are rearranged into a sub-band structure. The Embedded Zerotree (EZ) coding algorithm is then employed to quantize the coefficients. The proposed transform is implemented for various block sizes and the performance is compared with the existing Discrete Cosine Transform (DCT) coding scheme.

Keywords: Near-lossless Coding, Orthogonal Polynomials Transform, Embedded Zerotree Coding

8884 CBIR Using Multi-Resolution Transform for Brain Tumour Detection and Stages Identification

Authors: H. Benjamin Fredrick David, R. Balasubramanian, A. Anbarasa Pandian

Abstract:

Image retrieval is one of the most interesting techniques in use in today's digital world. CBIR, commonly expanded as Content Based Image Retrieval, is an image processing technique which identifies relevant images and retrieves them based on patterns extracted from the digital images. In this paper, two research works are presented using CBIR. The first provides an automated and interactive approach to the analysis of CBIR techniques. CBIR works on the principle of supervised machine learning, which involves feature selection followed by training and testing phases applied to a classifier in order to perform prediction. For feature extraction, image transforms such as the Contourlet, Ridgelet and Shearlet are utilized to retrieve texture features from the images. The extracted features are used to train and build a classifier using classification algorithms such as Naïve Bayes, K-Nearest Neighbour and the multi-class Support Vector Machine. The testing phase then predicts the class of a new input image using the trained classifier, labelling it as one of four classes: 1) normal brain, 2) benign tumour, 3) malignant tumour and 4) severe tumour. The second research work develops a tool for tumour stage identification using the best feature extraction method and classifier identified in the first work. Finally, the tool is used to predict the tumour stage and provide suggestions based on the stage identified by the system. These two approaches are a contribution to the medical field, offering better retrieval performance and tumour stage identification.

Keywords: Brain tumour detection, content based image retrieval, classification of tumours, image retrieval.

8883 Chilean Wines Classification based only on Aroma Information

Authors: Nicolás H. Beltrán, Manuel A. Duarte-Mermoud, Víctor A. Soto, Sebastián A. Salah, and Matías A. Bustos

Abstract:

Results of Chilean wine classification based on the information provided by an electronic nose are reported in this paper. The classification scheme consists of two parts: in the first stage, Principal Component Analysis is used as the feature extraction method to reduce the dimensionality of the original information; then, Radial Basis Function Neural Networks are used as the pattern recognition technique to perform the classification. The objective of this study is to classify different Cabernet Sauvignon, Merlot and Carménère wine samples from different years, valleys and vineyards of Chile.

Keywords: Feature extraction techniques, Pattern recognition techniques, Principal component analysis, Radial basis functions neural networks, Wine classification.

8882 Digital Narrative as a Change Agent to Teach Reading to Media-Centric Students

Authors: Robert F. Kenny

Abstract:

Because today's media-centric students have adopted digital media as their native form of communication, teachers are having an increasingly difficult time motivating reluctant readers to read and write. Our research has shown that these text-averse individuals can learn to understand the importance of reading and writing if the instruction is based on digital narratives. While these students are naturally attracted to story, they are better at consuming stories than creating them. Therefore, any intervention that utilizes story as its basis needs to include instruction on the elements of story making. This paper presents a series of digitally based tools to identify potential weaknesses of visually impaired visual learners and to help motivate these and other media-centric students to select and complete books that are assigned to them.

Keywords: Cognitive tempo, digital narratives, digital Booktalk

8881 Flood Modeling in Urban Area Using a Well-Balanced Discontinuous Galerkin Scheme on Unstructured Triangular Grids

Authors: Rabih Ghostine, Craig Kapfer, Viswanathan Kannan, Ibrahim Hoteit

Abstract:

Urban flooding resulting from a sudden release of water due to dam-break or excessive rainfall is a serious environmental hazard, which causes loss of human life and large economic losses. Anticipating floods before they occur could minimize human and economic losses through the implementation of appropriate protection, provision, and rescue plans. This work reports on the numerical modelling of flash flood propagation in urban areas after an excessive rainfall event or dam-break. A two-dimensional (2D) depth-averaged shallow water model is used with a refined unstructured grid of triangles for representing the urban area topography. The 2D shallow water equations are solved using a second-order well-balanced discontinuous Galerkin scheme. A theoretical test case and three flood events are described to demonstrate the potential benefits of the scheme: (i) wetting and drying in a parabolic basin; (ii) a flash flood over a physical model of the urbanized Toce River valley in Italy; (iii) wave propagation in the Reyran river valley following the Malpasset dam-break in 1959 (France); and (iv) the dam-break flood of October 1982 at the town of Sumacarcel (Spain). The capability of the scheme is also verified against alternative models. Computational results compare well with recorded data and show that the scheme is at least as efficient as comparable second-order finite volume schemes, with notable efficiency speedup due to parallelization.

Keywords: Flood modeling, dam-break, shallow water equations, Discontinuous Galerkin scheme, MUSCL scheme.

8880 Unipolar Anamorphosis and its use in Accessibility Analyses

Authors: T. Hudecek, Z. Zakova

Abstract:

The paper deals with the cartographic visualisation of the results of transport accessibility monitoring, using a semi-automated method of unipolar anamorphosis developed by the authors in a GIS environment. The method is based on transforming distance in the map into the values of a geographical phenomenon. In the case of time accessibility, it is based on the transformation of isochrones converted into the form of concentric circles, taking into account selected topographic and thematic elements in the map. The method is most suitable for analyses of accessibility to or from a centre and for modelling its long-term context. The paper provides a detailed analysis of the procedures and functionality of the method, discussing the issues of coordinates, transformation, scale and visualisation. It also offers a discussion of possible problems and inaccuracies. A practical application of the method is illustrated by the authors' previous research results in the field of accessibility in Czechia.

Keywords: accessibility, GIS, transformation, unipolar anamorphosis
