Search results for: edge based enhancement
9246 Development of a Simulator for Explaining Organic Chemical Reactions Based on Qualitative Process Theory
Authors: Alicia Y. C. Tang, Rukaini Hj. Abdullah, Sharifuddin M. Zain
Abstract:
This paper discusses the development of a qualitative simulator (abbreviated QRiOM) for predicting the behaviour of organic chemical reactions. The simulation technique is based on the qualitative process theory (QPT) ontology. The modelling constructs of QPT embody notions of causality which can be used to explain the behaviour of a chemical system. The major theme of this work is that, in a qualitative simulation environment, students are able to articulate their knowledge by inspecting the explanations generated by the software. The implementation languages are Java and Prolog. The software produces explanations in various forms that stress the causal theories in the chemical system and can be effectively used to support learning.
Keywords: Chemical reactions, explanation, qualitative process theory, simulation.
9245 Application of Adaptive Network-Based Fuzzy Inference System in Macroeconomic Variables Forecasting
Authors: E. Giovanis
Abstract:
In this paper we apply an Adaptive Network-Based Fuzzy Inference System (ANFIS) with a single input, the dependent variable with one lag, to forecast four macroeconomic variables of the US economy: Gross Domestic Product, the inflation rate, the six-month treasury bill interest rate and the unemployment rate. We compare the forecasting performance of ANFIS with that of the widely used linear autoregressive and nonlinear smooth transition autoregressive (STAR) models. The results are greatly in favour of ANFIS, indicating that it is an effective tool for macroeconomic forecasting, suitable for use in academic research as well as in research and applications by governmental and other institutions.
Keywords: Linear models, Macroeconomics, Neuro-Fuzzy, Non-Linear models.
9244 Hand Written Digit Recognition by Multiple Classifier Fusion based on Decision Templates Approach
Authors: Reza Ebrahimpour, Samaneh Hamedi
Abstract:
Classifier fusion may generate a more accurate classification than each of the base classifiers. Fusion is often based on fixed combination rules such as the product and the average. This paper presents decision templates as a classifier fusion method for the recognition of handwritten English and Farsi numerals (1-9). The process involves extracting feature vectors from well-known image databases, which are then fed to the multiple classifier fusion stage. A set of experiments was conducted to compare decision templates (DTs) with several combination rules. Decision templates achieve recognition rates of 97.99% and 97.28% for Farsi and English handwritten digits, respectively.
Keywords: Decision templates, multi-layer perceptron, characteristic loci, principal component analysis (PCA).
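As an illustration of the decision-template idea summarised above, the following minimal NumPy sketch builds one template per class as the mean decision profile of that class's training samples and labels a new sample by the nearest template. The profiles and data are placeholders, not the paper's MLP, characteristic-loci or PCA features.

```python
import numpy as np

def build_decision_templates(profiles, labels, n_classes):
    """profiles: (n_samples, n_classifiers, n_classes) soft classifier
    outputs on training data; a class template is the mean profile
    of the training samples of that class."""
    return np.stack([profiles[labels == c].mean(axis=0)
                     for c in range(n_classes)])

def dt_classify(profile, templates):
    """Pick the class whose template is closest to the decision profile
    (squared Euclidean distance over all classifier outputs)."""
    d = ((templates - profile) ** 2).sum(axis=(1, 2))
    return int(np.argmin(d))

# toy example: 3 classifiers, 2 classes, random stand-in profiles
rng = np.random.default_rng(0)
train_profiles = rng.random((20, 3, 2))
train_labels = np.tile([0, 1], 10)
templates = build_decision_templates(train_profiles, train_labels, n_classes=2)
print(dt_classify(rng.random((3, 2)), templates))
```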
9243 Designing Ontology-Based Knowledge Integration for Preprocessing of Medical Data in Enhancing a Machine Learning System for Coding Assignment of a Multi-Label Medical Text
Authors: Phanu Waraporn
Abstract:
This paper discusses the design of knowledge integration for clinical information extracted from distributed medical ontologies, in order to improve a machine learning-based multi-label coding assignment system. The proposed approach is implemented using a decision tree machine learning technique on university hospital data for patients with Coronary Heart Disease (CHD). The preliminary results show that the use of medical ontologies improves overall system performance.
Keywords: Medical Ontology, Knowledge Integration, Machine Learning, Medical Coding, Text Assignment.
9242 Ranking of Inventory Policies Using Distance Based Approach Method
Authors: Gupta Amit, Kumar Ramesh, Tewari P. C.
Abstract:
Globalization is putting enormous pressure on business organizations, especially manufacturing ones, to rethink their supply chains in innovative ways. Inventory consumes a major portion of total sales revenue, so effective and efficient inventory management plays a vital role in the successful functioning of any organization, and the selection of an inventory policy is one of the important purchasing activities. This paper focuses on the selection and ranking of alternative inventory policies. A deterministic quantitative model based on the Distance Based Approach (DBA) method has been developed for the evaluation and ranking of inventory policies; to our knowledge this concept is employed for the first time for this type of selection problem. Four inventory policies are considered: economic order quantity (EOQ), just in time (JIT), vendor managed inventory (VMI) and a monthly policy. Improper selection could affect a company's competitiveness in terms of the productivity of its facilities and the quality of its products. The ranking of inventory policies is a multi-criteria problem: the selection criteria must first be identified, and the information is then processed with reference to the relative importance of the attributes being compared. Criteria values for each inventory policy can be obtained analytically, by simulation, or as linguistic subjective judgments defined by fuzzy sets. A methodology is developed and applied to rank the inventory policies.
Keywords: Inventory Policy, Ranking, DBA, Selection criteria.
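The abstract does not spell out its DBA formulation, but a generic distance-based ranking step can be sketched as follows: normalise each criterion, form an ideal point from the best value of every criterion, and rank the alternatives by their weighted Euclidean distance to that point. The criteria, weights and scores below are purely illustrative and are not taken from the paper.

```python
import numpy as np

# rows: EOQ, JIT, VMI, monthly policy; columns: hypothetical criteria
# (e.g. holding cost, ordering cost, service level) with made-up values
scores = np.array([[0.6, 0.4, 0.80],
                   [0.3, 0.7, 0.90],
                   [0.5, 0.5, 0.85],
                   [0.7, 0.3, 0.70]])
benefit = np.array([False, False, True])    # True = higher is better
weights = np.array([0.4, 0.3, 0.3])

norm = (scores - scores.min(0)) / (scores.max(0) - scores.min(0))
ideal = np.where(benefit, norm.max(0), norm.min(0))   # best value per criterion
dist = np.sqrt((weights * (norm - ideal) ** 2).sum(axis=1))
ranking = np.argsort(dist)                  # smallest distance = best policy
print(ranking, np.round(dist, 3))
```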
9241 Searching k-Nearest Neighbors to be Appropriate under Gaming Environments
Authors: Jae Moon Lee
Abstract:
In general, algorithms for finding continuous k-nearest neighbors have been studied for location-based services, which periodically monitor moving objects such as vehicles and mobile phones. Those studies assume an environment in which the number of query points is much smaller than the number of moving objects and the query points are fixed rather than moving. In gaming environments, this problem arises when computing the next movement of an agent with respect to its neighbors, as in flocking, crowd and robot simulations. In this case, every moving object becomes a query point, so the number of query points equals the number of moving objects and the query points themselves are moving. In this paper, we analyze how the existing algorithms designed for location-based services perform under gaming environments.
Keywords: Flocking behavior, heterogeneous agents, similarity, simulation.
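When every moving object is also a query point, a common baseline is to rebuild a spatial index each frame and query all agents at once. Below is a minimal sketch using SciPy's cKDTree with made-up agent positions; it illustrates the all-query-point setting discussed in the abstract, not the specific algorithms the paper evaluates.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
positions = rng.random((500, 2))   # every agent is both object and query point
k = 8                              # neighbours used by flocking/crowd rules

# the index is rebuilt every simulation frame because all agents move
tree = cKDTree(positions)
# query k+1 neighbours because each point's nearest neighbour is itself
dists, idx = tree.query(positions, k=k + 1)
neighbours = idx[:, 1:]            # drop the self column
print(neighbours.shape)            # (500, 8)
```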
9240 An Improved Algorithm for Channel Estimations of OFDM System based Pilot Signal
Authors: Ahmed N. H. Alnuaimy, Mahamod Ismail, Mohd. A. M. Ali, Kasmiran Jumari, Ayman A. El-Saleh
Abstract:
This paper presents a new algorithm for channel estimation in OFDM systems based on pilot signals, aimed at the new generation of high-data-rate communication systems. In orthogonal frequency division multiplexing (OFDM) systems operating over fast-varying fading channels, channel estimation and tracking are generally carried out by transmitting known pilot symbols at given positions of the frequency-time grid. We propose an improved algorithm that calculates the mean and the variance of adjacent pilot signals for a specific distribution of the pilots in the OFDM frequency-time grid, and then derives all of the unknown channel coefficients from the mean and variance equations. Simulation results show that the performance of the OFDM system improves as the channel length increases, since the accuracy of the estimated channel increases with this low-complexity algorithm. The number of pilot signals that need to be inserted into the OFDM signal is also reduced, which increases the throughput of the system compared with other pilot distributions such as comb-type and block-type channel estimation.
Keywords: Channel estimation, orthogonal frequency division multiplexing (OFDM), comb-type channel estimation, block-type channel estimation.
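For orientation, the conventional comb-type baseline that such schemes are compared against is a least-squares estimate at the pilot subcarriers followed by interpolation across the grid. The sketch below shows that baseline on a made-up multipath channel; it does not reproduce the authors' mean/variance-based algorithm.

```python
import numpy as np

n_sub, pilot_step = 64, 8
pilot_idx = np.arange(0, n_sub, pilot_step)
pilot_tx = np.ones(len(pilot_idx), dtype=complex)     # known pilot symbols

# hypothetical multipath channel and noisy received pilots
h_time = np.array([1.0, 0.5 - 0.3j, 0.2j])
H_true = np.fft.fft(h_time, n_sub)
rng = np.random.default_rng(2)
noise = 0.05 * (rng.standard_normal(len(pilot_idx))
                + 1j * rng.standard_normal(len(pilot_idx)))
pilot_rx = H_true[pilot_idx] * pilot_tx + noise

# least-squares estimate at the pilots, then linear interpolation in between
H_ls = pilot_rx / pilot_tx
H_est = (np.interp(np.arange(n_sub), pilot_idx, H_ls.real)
         + 1j * np.interp(np.arange(n_sub), pilot_idx, H_ls.imag))
print(np.mean(np.abs(H_est - H_true) ** 2))            # mean squared error
```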
9239 An Effective Hybrid Genetic Algorithm for Job Shop Scheduling Problem
Authors: Bin Cai, Shilong Wang, Haibo Hu
Abstract:
The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations. Schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
Keywords: Genetic algorithm, Job shop scheduling problem, Local search, Meta-heuristic algorithm
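An operation-based chromosome of the kind mentioned in the abstract lists each job once per operation; scanning it left to right yields a schedule whose makespan the GA minimises. The sketch below decodes such a chromosome on a made-up 3-job, 3-machine instance; the paper's active-schedule builder, crossover and local search are not reproduced.

```python
def decode(chromosome, proc):
    """chromosome: job indices, each appearing once per operation;
    proc[j] = list of (machine, time) pairs in technological order.
    Returns the makespan of the resulting semi-active schedule."""
    next_op = [0] * len(proc)        # next operation index of each job
    job_ready = [0] * len(proc)      # completion time of each job's last op
    mach_ready = {}                  # completion time of each machine
    makespan = 0
    for j in chromosome:
        m, t = proc[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(m, 0))
        finish = start + t
        job_ready[j], mach_ready[m] = finish, finish
        next_op[j] += 1
        makespan = max(makespan, finish)
    return makespan

# toy 3x3 instance: (machine, processing time) per operation, per job
proc = [[(0, 3), (1, 2), (2, 2)],
        [(1, 2), (0, 4), (2, 1)],
        [(2, 3), (1, 2), (0, 3)]]
print(decode([0, 1, 2, 0, 2, 1, 1, 0, 2], proc))
```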
9238 A Multi-Level WEB Based Parallel Processing System A Hierarchical Volunteer Computing Approach
Authors: Abdelrahman Ahmed Mohamed Osman
Abstract:
Over the past few years, a number of efforts have been made to build parallel processing systems that utilize the idle power of the LANs and PCs available in many homes and corporations. The main advantage of these approaches is that they provide cheap parallel processing environments for those who cannot afford the expense of supercomputers and parallel processing hardware. However, most of the solutions provided are not very flexible in their use of available resources and are difficult to install and set up. In this paper, a multi-level web-based parallel processing system (MWPS) is designed. MWPS is based on the idea of volunteer computing; it is very flexible and easy to set up and use. MWPS allows three types of subscribers: simple volunteers (single computers), super volunteers (full networks) and end users. All of these entities are coordinated transparently through a secure web site. Volunteer nodes provide the processing power needed by the system's end users. There is no limit on the number of volunteer nodes, so the system can grow indefinitely. Both volunteers and system users must register and subscribe. Once they subscribe, each entity is provided with the appropriate MWPS components, which are very easy to install. Super volunteer nodes are provided with special components that make it possible to delegate some of the load to their inner nodes. These inner nodes may in turn delegate some of the load to lower-level inner nodes, and so on. It is the responsibility of the parent super nodes to coordinate the delegation process and deliver the results back to the user. MWPS uses a simple behavior-based scheduler that takes into consideration the current load and previous behavior of processing nodes. Nodes that fulfill their contracts within the expected time get a high degree of trust; nodes that fail to satisfy their contracts get a lower degree of trust. MWPS is based on the .NET framework and provides the minimal level of security expected in distributed processing environments. Users and processing nodes are fully authenticated, and communications and messages between nodes are secure. The system has been implemented using C#. MWPS may be used by any group of people or companies to establish a parallel processing or grid environment.
Keywords: Volunteer computing, Parallel Processing, XML Web Services, .NET Remoting, Tuplespace.
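The behaviour-based scheduling idea (raise a node's trust when it meets its contract on time, lower it otherwise, and prefer trusted nodes for new work) can be sketched in a few lines. This Python illustration only mirrors the general idea; it is not the actual MWPS implementation, which is .NET/C# based.

```python
import random

class Node:
    def __init__(self, name):
        self.name, self.trust = name, 0.5          # start with neutral trust

def pick_node(nodes):
    # trust-weighted choice: nodes with a better track record get more work
    return random.choices(nodes, weights=[n.trust for n in nodes])[0]

def report(node, finished_on_time, rate=0.1):
    # move trust towards 1 when the contract is met in time, towards 0 otherwise
    target = 1.0 if finished_on_time else 0.0
    node.trust += rate * (target - node.trust)

nodes = [Node("volunteer-A"), Node("volunteer-B"), Node("super-C")]
for _ in range(20):                                # simulated task assignments
    n = pick_node(nodes)
    report(n, finished_on_time=random.random() > 0.3)
print({n.name: round(n.trust, 2) for n in nodes})
```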
9237 Fuzzy Clustering of Categorical Attributes and its Use in Analyzing Cultural Data
Authors: George E. Tsekouras, Dimitris Papageorgiou, Sotiris Kotsiantis, Christos Kalloniatis, Panagiotis Pintelas
Abstract:
We develop a three-step fuzzy logic-based algorithm for clustering categorical attributes, and we apply it to analyze cultural data. In the first step the algorithm employs an entropy-based clustering scheme, which initializes the cluster centers. In the second step we apply the fuzzy c-modes algorithm to obtain a fuzzy partition of the data set, and the third step introduces a novel cluster validity index, which decides the final number of clusters.
Keywords: Categorical data, cultural data, fuzzy logic clustering, fuzzy c-modes, cluster validity index.
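A minimal sketch of the fuzzy c-modes step in the middle of the pipeline described above: simple-matching dissimilarity, fuzzy memberships, and modes updated to the fuzzily most frequent categories. The entropy-based initialisation and the validity index of the paper are not reproduced; modes are simply initialised from random samples.

```python
import numpy as np

def fuzzy_c_modes(X, k, m=1.5, n_iter=20, rng=None):
    """X: (n_samples, n_attrs) array of integer category codes."""
    rng = rng or np.random.default_rng(0)
    n, p = X.shape
    modes = X[rng.choice(n, k, replace=False)].copy()
    for _ in range(n_iter):
        # simple-matching dissimilarity of every sample to every mode
        d = np.array([(X != z).sum(axis=1) for z in modes], dtype=float)
        u = np.zeros_like(d)
        zero = d == 0
        for i in range(n):
            if zero[:, i].any():              # sample coincides with a mode
                u[zero[:, i], i] = 1.0 / zero[:, i].sum()
            else:                             # standard fuzzy membership update
                w = d[:, i] ** (-1.0 / (m - 1))
                u[:, i] = w / w.sum()
        um = u ** m
        for c in range(k):                    # update each mode attribute-wise
            for j in range(p):
                cats = np.unique(X[:, j])
                weight = [um[c, X[:, j] == v].sum() for v in cats]
                modes[c, j] = cats[int(np.argmax(weight))]
    return u, modes

u, modes = fuzzy_c_modes(np.array([[0, 1, 1], [0, 1, 0], [2, 3, 3], [2, 3, 3]]), k=2)
print(np.round(u, 2))
```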
9236 Highly Efficient White Light-emitting Diodes Based on Layered Quantum Dot-Phosphor Nanocomposites as Converting Materials
Authors: J. Y. Woo, J. Lee, N. Kim, C.-S. Han
Abstract:
This paper reports on the enhanced photoluminescence (PL) of nanocomposites achieved through the layered structuring of phosphor and quantum dots (QDs). A green phosphor of Sr2SiO4:Eu, red QDs of CdSe/CdS/CdZnS/ZnS core-multishell structure, and a thermo-curable resin were used for this study. Two kinds of composite (layered and mixed) were prepared, and schemes for optical energy transfer between the QDs and the phosphor were suggested and investigated based on PL decay characteristics. It was found that the layered structure is more effective than the mixed one in terms of PL intensity, PL decay and thermal loss. When this layered nanocomposite (QDs on phosphor) is used to make a white light-emitting diode (LED), the brightness is increased by 37%, and the color rendering index (CRI) value is raised to 88.4, compared with 80.4 for the mixed case.
Keywords: Quantum Dot, Nanocomposites, Photoluminescence, Light Emitting Diode.
9235 Why do Clawback Provisions Affect Financial Reporting Quality? - An Analysis of Trigger Effects
Authors: Yu-Chun Lin
Abstract:
We identify clawback triggers from firms' proxy statements (Form DEF 14A) and use the likelihood of restatements to proxy for financial reporting quality. Based on a sample of 578 U.S. firms that voluntarily adopted clawback provisions during 2003-2009, we decompose restatement-based triggers into two types, fraud and unintentional error, and we observe evidence that using fraud triggers is associated with high financial reporting quality. The findings support the view that fraud triggers can enhance the deterrent effect of clawback provisions by establishing a viable disincentive against fraud, misconduct, and other harmful acts. These results are robust to controlling for compensation components, to different sample specifications and to a number of sensitivity tests.
Keywords: Accruals quality, Clawback provisions, Compensation, Restatements.
9234 CScheme in Traditional Concurrency Problems
Authors: Nathar Shah, Visham Cheerkoot
Abstract:
CScheme, a concurrent programming paradigm based on the scheme concept, enables concurrency schemes to be constructed from smaller synchronization units through a GUI-based composer and later reused on other concurrency problems of a similar nature. This paradigm is particularly important in the multi-core environments prevalent today. In this paper, we demonstrate techniques for separating concurrency from functional code using the CScheme paradigm. We then illustrate how the CScheme methodology can be used to solve some of the traditional concurrency problems - the critical section problem and the readers-writers problem - using synchronization schemes such as the Single Threaded Execution Scheme and the Readers Writers Scheme.
Keywords: Concurrent Programming, Object Oriented Programming, Environments for multiple-processor systems, Programming paradigms.
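For reference, the readers-writers problem named above is classically solved with a shared/exclusive lock. The following Python threading sketch shows a simple readers-preference variant; it is only an analogue of the Readers Writers Scheme, which in CScheme is assembled from synchronization units in the authors' composer.

```python
import threading

class ReadersWriterLock:
    """Readers-preference lock: many concurrent readers, writers exclusive."""
    def __init__(self):
        self._readers = 0
        self._mutex = threading.Lock()   # protects the reader counter
        self._write = threading.Lock()   # held by a writer or the first reader

    def acquire_read(self):
        with self._mutex:
            self._readers += 1
            if self._readers == 1:
                self._write.acquire()    # first reader blocks writers

    def release_read(self):
        with self._mutex:
            self._readers -= 1
            if self._readers == 0:
                self._write.release()    # last reader lets writers in

    def acquire_write(self):
        self._write.acquire()

    def release_write(self):
        self._write.release()
```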
9233 Optimization Based Tuning of Autopilot Gains for a Fixed Wing UAV
Authors: Mansoor Ahsan, Khalid Rafique, Farrukh Mazhar
Abstract:
Unmanned Aerial Vehicles (UAVs) have gained tremendous importance, in both military and civil domains, during the first decade of this century. In a UAV, an onboard computer (autopilot) autonomously controls the flight and navigation of the aircraft. Based on the aircraft's role and flight envelope, controllers ranging from basic to complex and sophisticated are used to stabilize the aircraft's flight parameters; these controllers constitute the autopilot system for UAVs. The autopilot systems most commonly provide lateral and longitudinal control through Proportional-Integral-Derivative (PID) controllers or phase-lead or lag compensators. Various techniques are commonly used to tune the gains of these controllers, including in-flight step-by-step tuning and software-in-the-loop or hardware-in-the-loop tuning methods. Numerous in-flight tests are subsequently required to actually fine-tune these gains. However, an optimization-based tuning of these PID controllers or compensators, as presented in this paper, can greatly reduce the amount of in-flight tuning required and substantially reduce the risks and cost involved in flight testing.
Keywords: Unmanned aerial vehicle (UAV), autopilot, autonomous controls, PID controller gain tuning, optimization.
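A hedged sketch of the optimisation-based tuning idea: simulate the step response of a simple (hypothetical) second-order plant under PID control and let a derivative-free optimiser minimise an integral-squared-error cost over the three gains. The plant model, cost and optimiser below are illustrative and are not the UAV models or procedure used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def step_cost(gains, dt=0.01, T=5.0):
    """Integral of squared error for a unit step, PID controlling a toy
    plant y'' + 2 y' + 4 y = 4 u (hypothetical second-order dynamics)."""
    kp, ki, kd = gains
    y = dy = integ = 0.0
    prev_e, cost = 1.0, 0.0
    for _ in range(int(T / dt)):
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        ddy = -2.0 * dy - 4.0 * y + 4.0 * u
        dy += ddy * dt
        y += dy * dt
        prev_e = e
        cost += e * e * dt
        if not np.isfinite(y) or abs(y) > 1e6:
            return 1e6                    # penalise unstable gain sets
    return cost

res = minimize(step_cost, x0=[1.0, 0.5, 0.1], method="Nelder-Mead")
print(np.round(res.x, 3), res.fun)        # tuned [kp, ki, kd] and its cost
```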
9232 Microseismicity of the Tehran Region Based on Three Seismic Networks
Authors: Jamileh Vasheghani Farahani
Abstract:
The main purpose of this research is to show the currently active faults and active tectonics of the area using three seismic networks in the Tehran region: 1) the Tehran Disaster Mitigation and Management Organization (TDMMO), 2) the Broadband Iranian National Seismic Network Center (BIN), and 3) the Iranian Seismological Center (IRSC). In this study, we analyzed microearthquakes that occurred in Tehran city and its surroundings, recorded by the Tehran networks from 1996 to 2015, and found several active faults and trends in the region. There is a 200-year record of historical earthquakes in Tehran. Historical and instrumental seismicity show that the east of Tehran is more active than the west. The Mosha fault in the north of Tehran is one of the active faults of the central Alborz. Other major faults in the region are the Kahrizak, Eyvanakey, Parchin and North Tehran faults. An important seismicity region is the intersection of the Mosha and North Tehran fault systems (Kalan village in Lavasan), which shows a cluster of microearthquakes. According to the historical and microseismic events analyzed in this research, there is a seismic gap in the SE of Tehran. An empirical relationship is used to assess Mmax based on rupture length. There is a probability of occurrence of a strong earthquake of magnitude 7.0 to 7.5 in the region (based on the assessed capability of major faults such as the Parchin and Eyvanekey faults and on the historical earthquakes).
Keywords: Iran, major faults, microseismicity, Tehran.
9231 The Vulnerability Analysis of Java Bytecode Based on Points-to Dataflow
Authors: Tang Hong, Zhang Lufeng, Chen Hua, Zhang Jianbo
Abstract:
Today many developers use Java components collected from the Internet as external libraries to design and develop their own software. However, unknown security bugs may exist in these components; for example, an SQL injection bug may come from a component that performs no specific check on user input strings. Finding these bugs is very difficult without source code. A novel method is therefore needed to check for bugs in Java bytecode based on points-to dataflow analysis, which differs from the common analysis techniques based on vulnerability pattern checking. It can be used as an assistant tool for the security analysis of Java bytecode from unknown software that is to be used as an external library.
Keywords: Java bytecode, points-to dataflow, vulnerability analysis.
9230 Multiple Input Multiple Output Detection Using Roulette Wheel Based Ant Colony Optimization Technique
Authors: B. Rebekka, B. Malarkodi
Abstract:
This paper describes an approach to detecting the transmitted signals in a 2×2 Multiple Input Multiple Output (MIMO) setup using a roulette-wheel-based ant colony optimization technique. The results obtained are compared with the classical zero-forcing and least-mean-square techniques. The detection rates achieved using this technique are consistently higher than those achieved using the classical methods over 50 attempts, with two different antennas transmitting the input stream from a user. This paves the way for alternative techniques to improve the throughput achieved in advanced networks such as Long Term Evolution (LTE) networks.
Keywords: MIMO, ant colony optimization, roulette wheel, soft computing, LTE.
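For reference, the zero-forcing baseline mentioned in the abstract amounts to applying the pseudo-inverse of the channel matrix and slicing to the nearest constellation point. A minimal 2×2 sketch with BPSK symbols follows; the roulette-wheel ant colony detector itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical 2x2 Rayleigh fading channel
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
x = rng.choice([-1.0, 1.0], size=2)              # BPSK symbols, one per antenna
noise = 0.1 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
y = H @ x + noise                                # received vector

x_zf = np.linalg.pinv(H) @ y                     # zero-forcing equalisation
x_hat = np.where(x_zf.real >= 0, 1.0, -1.0)      # slice to the nearest BPSK symbol
print(x, x_hat)
```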
9229 Performance Analysis of Traffic Classification with Machine Learning
Authors: Htay Htay Yi, Zin May Aye
Abstract:
Network security plays a key role in the ICT environment because the number of malicious users is continually growing in education, business, and other ICT-related areas. Network security violations are typically described and examined centrally through a security event management system. Firewalls, Intrusion Detection Systems (IDS), and Intrusion Prevention Systems are becoming essential for monitoring or preventing potential violations, attack incidents, and imminent threats. In this system, firewall rules are set only where the system policies require them. The dataset deployed in this system is derived from the testbed environment, in which DoS and PortScan traffic is generated with the firewall and IDS implementation in place. The network traffic is classified as normal or attack in the testbed environment using six machine learning classification methods applied in the system. The dataset is based on CICIDS2017 with some added features, and 26 features from it are tested in this system. The goal is to reduce false positive rates and to improve accuracy in the implemented testbed design, and the system shows good performance by selecting important features and comparing machine learning classifiers on the dataset.
Keywords: False negative rate, intrusion detection system, machine learning methods, performance.
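The following scikit-learn sketch shows the kind of pipeline the abstract describes: several classifiers trained on flow features and compared on accuracy and the attack-class miss rate. The feature matrix here is synthetic; the paper uses a CICIDS2017-derived testbed dataset with 26 features, and only four stand-in classifiers are shown rather than the paper's six.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# synthetic stand-in for flow features (class 1 = DoS/PortScan attack)
X, y = make_classification(n_samples=2000, n_features=26, n_informative=10,
                           weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {"NaiveBayes": GaussianNB(), "kNN": KNeighborsClassifier(),
          "LogReg": LogisticRegression(max_iter=1000),
          "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0)}
for name, clf in models.items():
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    # 1 - recall on the attack class is the false negative rate
    print(name, accuracy_score(y_te, pred), 1 - recall_score(y_te, pred))
```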
9228 Software Model for a Computer Based Training for an HVDC Control Desk Simulator
Authors: José R. G. Braga, Joice B. Mendes, Guilherme H. Caponetto, Alexandre C. B. Ramos
Abstract:
With major technological advances, and in order to reduce the cost of training apprentices for real-time critical systems, it became necessary to develop Intelligent Tutoring Systems for training apprentices in these systems. These systems generally have interactive features so that learning is more efficient, making the learner more familiar with the mechanism in question. At the initial stage of learning, tests are performed to gauge the student's progress and how the system is being used. The aim of this paper is to present a framework for modeling an Intelligent Tutoring System using the UML language. The various steps of the analysis consider the diagrams required to build a general model, whose purpose is to present the different perspectives of its development.
Keywords: Computer based training, Hypermedia, Software modeling.
9227 Transform-Domain Rate-Distortion Optimization Accelerator for H.264/AVC Video Encoding
Authors: Mohammed Golam Sarwer, Lai Man Po, Kai Guo, Q.M. Jonathan Wu
Abstract:
In H.264/AVC video encoding, rate-distortion optimization for mode selection plays a significant role in achieving outstanding compression efficiency and video quality. However, this mode selection process also makes encoding extremely complex, especially the computation of the rate-distortion cost function, which includes computing the sum of squared differences (SSD) between the original and reconstructed image blocks and context-based entropy coding of the block. In this paper, a transform-domain rate-distortion optimization accelerator based on a fast SSD (FSSD) and a VLC-based rate estimation algorithm is proposed. This algorithm can significantly simplify the hardware architecture for the rate-distortion cost computation with only negligible performance degradation. An efficient hardware structure for implementing the proposed transform-domain rate-distortion optimization accelerator is also proposed. Simulation results demonstrate that the proposed algorithm reduces total encoding time by about 47% with negligible degradation of coding performance. The proposed method can easily be applied to many mobile video application areas such as digital cameras and DMB (Digital Multimedia Broadcasting) phones.
Keywords: Context-adaptive variable length coding (CAVLC), H.264/AVC, rate-distortion optimization (RDO), sum of squared difference (SSD).
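The key observation behind computing SSD in the transform domain is that an orthonormal transform such as the 2-D DCT preserves energy (Parseval), so the SSD of the coefficient differences equals the SSD of the pixel blocks. A small numerical check with SciPy; the paper's FSSD and VLC rate-estimation details are not reproduced.

```python
import numpy as np
from scipy.fft import dctn

rng = np.random.default_rng(4)
orig = rng.integers(0, 256, (4, 4)).astype(float)
recon = orig + rng.normal(0, 2, (4, 4))        # hypothetical reconstructed block

ssd_pixel = ((orig - recon) ** 2).sum()
# the orthonormal 2-D DCT preserves energy, so the coefficient-domain SSD matches
diff_coef = dctn(orig - recon, norm="ortho")
ssd_coef = (diff_coef ** 2).sum()
print(ssd_pixel, ssd_coef)                     # equal up to floating-point error
```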
9226 Adaptive Impedance Control for Unknown Non-Flat Environment
Authors: Norsinnira Zainul Azlan, Hiroshi Yamaura
Abstract:
This paper presents a new adaptive impedance control strategy, based on the Function Approximation Technique (FAT), to compensate for an unknown non-flat environment shape or time-varying environment location. The target impedance in the force-controllable direction is modified by incorporating adaptive compensators, and the uncertainties are represented by FAT, allowing the update law to be derived easily. The force error feedback is utilized in the estimation, and accurate knowledge of the environment parameters is not required by the algorithm. It is shown mathematically that the stability of the controller is guaranteed based on Lyapunov theory. Simulation results are presented to demonstrate the validity of the proposed controller.
Keywords: Adaptive impedance control, Function Approximation Technique (FAT), impedance control, unknown environment position.
9225 Security Weaknesses of Dynamic ID-based Remote User Authentication Protocol
Authors: Hyoungseob Lee, Donghyun Choi, Yunho Lee, Dongho Won, Seungjoo Kim
Abstract:
Recently, with the appearance of smart cards, many user authentication protocols using smart cards have been proposed to mitigate the vulnerabilities in the user authentication process. In 2004, Das et al. proposed an ID-based user authentication protocol that is secure against ID theft and replay attacks using a smart card. In 2009, Wang et al. showed that Das et al.'s protocol is not secure against the randomly chosen password attack and the impersonation attack, and proposed an improved protocol that provides mutual authentication and efficient password management. In this paper, we analyze the security weaknesses and point out the vulnerabilities of Wang et al.'s protocol.
Keywords: Message Alteration Attack, Impersonation Attack.
9224 Controlling the Angle of Attack of an Aircraft Using Genetic Algorithm Based Flight Controller
Authors: S. Swain, P. S Khuntia
Abstract:
In this paper, the unstable angle of attack of a FOXTROT aircraft is controlled using a Genetic Algorithm based flight controller, and the result is compared with conventional PID tuning techniques such as Tyreus-Luyben (TL), Ziegler-Nichols (ZN) and the Interpolation Rule (IR). In addition, performance indices such as the Mean Square Error (MSE), Integral Square Error (ISE), and Integral Absolute Time Error (IATE) are improved by using the Genetic Algorithm. It was established that the error obtained using the GA is much smaller than with the conventional techniques, thereby improving the performance indices of the dynamic system.
Keywords: Angle of Attack, Genetic Algorithm, Performance Indices, PID Controller.
9223 Low Overhead Dynamic Channel Selection with Cluster-Based Spatial-Temporal Station Reporting in Wireless Networks
Authors: Zeyad Abdelmageid, Xianbin Wang
Abstract:
Choosing the operational channel for a WLAN access point (AP) has traditionally been a static channel assignment process initiated by the user during deployment of the AP, which fails to cope with the dynamic conditions of the assigned channel at the station side afterwards. However, the dramatically growing number of Wi-Fi APs and stations operating in the unlicensed band has led to dynamic, distributed and often severe interference. This highlights the urgent need for the AP to dynamically select the best overall channel of operation for the basic service set (BSS) by considering the distributed and changing channel conditions at all stations. Consequently, dynamic channel selection algorithms that consider feedback from the station side have been developed. Despite the significant performance improvement, existing channel selection algorithms suffer from very high feedback overhead. Feedback latency from the STAs, due to the high overhead, can cause the eventually selected channel to no longer be optimal for operation because of the dynamic sharing nature of the unlicensed band. This has inspired us to develop our own dynamic channel selection algorithm with reduced overhead through the proposed low-overhead, cluster-based station reporting mechanism. The main idea behind cluster-based station reporting is the observation that STAs which are very close to each other tend to have very similar channel conditions. Instead of requesting each STA to report on every candidate channel, which causes high overhead, the AP divides STAs into clusters and then assigns each STA in each cluster one channel to report feedback on. With proper design of the cluster-based reporting, the AP does not lose any information about the channel conditions at the station side while reducing feedback overhead. The simulation results show equal, and at times better, performance with a fraction of the overhead. We believe that this algorithm has great potential for designing future dynamic channel selection algorithms with low overhead.
Keywords: Channel assignment, Wi-Fi networks, clustering, DBSCAN, overhead.
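A hedged sketch of the cluster-based reporting idea: group stations by position with DBSCAN and hand each station in a cluster one candidate channel to report on, so that a cluster collectively covers all channels while each STA reports only once. Positions, the channel list and the clustering parameters are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(5)
# hypothetical station positions (metres); nearby STAs see similar channel conditions
positions = np.vstack([rng.normal(loc, 2.0, (10, 2))
                       for loc in ((0, 0), (30, 5), (10, 40))])
channels = [1, 6, 11, 36, 40]                    # candidate channels to evaluate

labels = DBSCAN(eps=5.0, min_samples=3).fit_predict(positions)
assignment = {}
for cluster in set(labels) - {-1}:               # -1 marks noise points
    members = np.where(labels == cluster)[0]
    for i, sta in enumerate(members):
        # round-robin so that each cluster jointly covers every candidate channel
        assignment[int(sta)] = channels[i % len(channels)]
print(assignment)
```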
9222 A Case Study of Key-Dependent Permutations in Feistel Ciphers
Authors: Hani Almimi, Ola Osabi, Azman Samsudin
Abstract:
Many attempts have been made to strengthen Feistel-based block ciphers. Among the successful proposals is the key-dependent S-box, which was implemented in some high-profile ciphers. In this paper a key-dependent permutation box is proposed and implemented on DES as a case study. The new modified DES, MDES, was tested against the Diehard tests, an avalanche test, and a performance test. The results showed that in general MDES is more resistant to attacks than DES, with negligible overhead. Therefore, it is believed that the proposed key-dependent permutation should be considered a valuable primitive that can help strengthen the security of the Substitution-Permutation Network, which is a core design in many Feistel-based block ciphers.
Keywords: Block Cipher, Feistel Structure, DES, Diehard Tests, Avalanche Effect.
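One simple way to realise a key-dependent permutation box is to derive a pseudo-random shuffle of the bit positions from a key digest. The sketch below illustrates that general idea only (and uses Python's non-cryptographic PRNG purely for demonstration); the actual MDES construction is not specified in the abstract.

```python
import hashlib
import random

def key_dependent_pbox(key: bytes, width: int = 32):
    """Derive a permutation of bit positions 0..width-1 from the key.
    Illustration only: random.Random is not a cryptographic PRNG."""
    seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
    perm = list(range(width))
    random.Random(seed).shuffle(perm)        # deterministic, key-dependent shuffle
    return perm

def permute_bits(block: int, perm, width: int = 32) -> int:
    out = 0
    for dst, src in enumerate(perm):         # bit dst of output = bit perm[dst] of input
        out |= ((block >> src) & 1) << dst
    return out

perm = key_dependent_pbox(b"secret key")
c = permute_bits(0x12345678, perm)
inv = [perm.index(i) for i in range(len(perm))]   # inverse permutation
print(hex(c), hex(permute_bits(c, inv)))          # second value restores the block
```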
9221 A New Algorithm for Enhanced Robustness of Copyright Mark
Authors: Harsh Vikram Singh, S. P. Singh, Anand Mohan
Abstract:
This paper discusses a new heavy-tailed-distribution-based scheme for hiding data in the discrete cosine transform (DCT) coefficients of an image, which provides statistical security as well as robustness against steganalysis attacks. Unlike other data hiding algorithms, the proposed technique does not introduce much effect in the stego-image's DCT coefficient probability plots, thus making the presence of hidden data statistically undetectable. In addition, the proposed method does not compromise hiding capacity. When compared with the generic block-DCT-based data-hiding scheme, our method is found to be more robust against a variety of image manipulation attacks such as filtering, blurring and JPEG compression.
Keywords: Information Security, Robust Steganography, Steganalysis, Pareto Probability Distribution function.
9220 Near-Lossless Image Coding based on Orthogonal Polynomials
Authors: Krishnamoorthy R, Rajavijayalakshmi K, Punidha R
Abstract:
In this paper, a near-lossless image coding scheme based on the Orthogonal Polynomials Transform (OPT) is presented. The polynomial operators and polynomial basis operators are obtained from a set of orthogonal polynomial functions for the proposed transform coding. The image is partitioned into a number of distinct square blocks, and the proposed transform coding is applied to each of these individually. After the transform, the coefficients are rearranged into a sub-band structure, and the Embedded Zerotree (EZ) coding algorithm is then employed to quantize them. The proposed transform is implemented for various block sizes, and its performance is compared with the existing Discrete Cosine Transform (DCT) coding scheme.
Keywords: Near-lossless Coding, Orthogonal Polynomials Transform, Embedded Zerotree Coding.
9219 CBIR Using Multi-Resolution Transform for Brain Tumour Detection and Stages Identification
Authors: H. Benjamin Fredrick David, R. Balasubramanian, A. Anbarasa Pandian
Abstract:
Image retrieval is one of the most widely used techniques in today's digital world. CBIR, commonly expanded as Content Based Image Retrieval, is an image processing technique which identifies relevant images and retrieves them based on patterns extracted from digital images. In this paper, two research works are presented using CBIR. The first provides an automated and interactive approach to the analysis of CBIR techniques. CBIR works on the principle of supervised machine learning, which involves feature selection followed by training and testing phases applied to a classifier in order to perform prediction. Using feature extraction, image transforms such as the Contourlet, Ridgelet and Shearlet are utilized to retrieve texture features from the images. The extracted features are used to train and build classifiers using algorithms such as Naïve Bayes, K-Nearest Neighbour and the multi-class Support Vector Machine. The testing phase then predicts the label of a new input image with the trained classifier as one of four classes: 1) normal brain, 2) benign tumour, 3) malignant tumour or 4) severe tumour. The second research work develops a tool for tumour stage identification using the best feature extraction method and classifier identified in the first work. Finally, the tool is used to predict the tumour stage and provide suggestions based on the stage identified by the system. This paper presents these two approaches as a contribution to the medical field, giving better retrieval performance and enabling tumour stage identification.
Keywords: Brain tumour detection, content based image retrieval, classification of tumours, image retrieval.
9218 Chilean Wines Classification based only on Aroma Information
Authors: Nicolás H. Beltrán, Manuel A. Duarte-Mermoud, Víctor A. Soto, Sebastián A. Salah, and Matías A. Bustos
Abstract:
Results of Chilean wine classification based on the information provided by an electronic nose are reported in this paper. The classification scheme consists of two parts: in the first stage, Principal Component Analysis is used as the feature extraction method to reduce the dimensionality of the original information; then, Radial Basis Function Neural Networks are used as the pattern recognition technique to perform the classification. The objective of this study is to classify different Cabernet Sauvignon, Merlot and Carménère wine samples from different years, valleys and vineyards of Chile.
Keywords: Feature extraction techniques, pattern recognition techniques, principal component analysis, radial basis function neural networks, wine classification.
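A hedged sketch of the two-stage scheme just described: PCA for dimensionality reduction followed by a radial basis function network, here built simply from k-means centres with a linear readout. The data are a synthetic stand-in for the electronic-nose measurements, and the RBF construction is illustrative rather than the paper's exact network.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import train_test_split

def rbf_features(X, centers, gamma=1.0):
    # Gaussian activation of each sample with respect to every RBF centre
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# synthetic stand-in for e-nose sensor readings of three wine varieties
X, y = make_classification(n_samples=300, n_features=12, n_classes=3,
                           n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

pca = PCA(n_components=4).fit(X_tr)                 # stage 1: feature extraction
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)
centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(Z_tr).cluster_centers_
clf = RidgeClassifier().fit(rbf_features(Z_tr, centers), y_tr)  # stage 2: RBF network
print(clf.score(rbf_features(Z_te, centers), y_te))
```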
9217 Digital Narrative as a Change Agent to Teach Reading to Media-Centric Students
Authors: Robert F. Kenny
Abstract:
Because today's media-centric students have adopted digital as their native form of communication, teachers are having an increasingly difficult time motivating reluctant readers to read and write. Our research has shown that these text-averse individuals can learn to understand the importance of reading and writing if the instruction is based on digital narratives. While these students are naturally attracted to stories, they are better at consuming them than at creating them. Therefore, any intervention that uses story as its basis needs to include instruction on the elements of story making. This paper presents a series of digitally-based tools to identify potential weaknesses of visually impaired visual learners and to help motivate these and other media-centric students to select and complete the books that are assigned to them.
Keywords: Cognitive tempo, digital narratives, digital Booktalk.