Search results for: Rule Based Modeling

8696 An Effective Hybrid Genetic Algorithm for Job Shop Scheduling Problem

Authors: Bin Cai, Shilong Wang, Haibo Hu

Abstract:

The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations. Schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
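To make the memetic structure of such an approach concrete, the sketch below runs a toy hybrid GA on a hypothetical 3-job, 3-machine instance: an operation-based chromosome is decoded into a schedule, and each offspring is improved by a simple swap local search standing in for the Nowicki-Smutnicki neighborhood. The instance data, operators and parameters are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a hybrid GA for the JSSP with an operation-based chromosome.
# The local search is a plain swap neighbourhood, a placeholder for the
# Nowicki-Smutnicki neighbourhood used in the paper; instance data is hypothetical.
import random

# jobs[j] = list of (machine, processing_time) in technological order
jobs = [[(0, 3), (1, 2), (2, 2)],
        [(0, 2), (2, 1), (1, 4)],
        [(1, 4), (2, 3), (0, 1)]]
n_jobs, n_machines = len(jobs), 3

def decode(chromosome):
    """Build a schedule from a job-repetition chromosome; return its makespan."""
    op_index = [0] * n_jobs            # next operation of each job
    job_ready = [0] * n_jobs           # completion time of each job's last operation
    mach_ready = [0] * n_machines      # completion time of each machine's last operation
    for j in chromosome:
        machine, duration = jobs[j][op_index[j]]
        start = max(job_ready[j], mach_ready[machine])
        job_ready[j] = mach_ready[machine] = start + duration
        op_index[j] += 1
    return max(job_ready)

def local_search(chromosome):
    """First-improvement swap local search on the chromosome."""
    best, best_mk = list(chromosome), decode(chromosome)
    for i in range(len(best) - 1):
        cand = best[:]
        cand[i], cand[i + 1] = cand[i + 1], cand[i]
        if decode(cand) < best_mk:
            best, best_mk = cand, decode(cand)
    return best

def run_ga(pop_size=20, generations=50):
    base = [j for j in range(n_jobs) for _ in range(len(jobs[j]))]
    population = [random.sample(base, len(base)) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=decode)
        parents = population[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p = random.choice(parents)[:]
            i, k = sorted(random.sample(range(len(p)), 2))
            p[i], p[k] = p[k], p[i]             # swap mutation as a stand-in for crossover
            children.append(local_search(p))    # memetic step: improve each child
        population = parents + children
    return min(population, key=decode)

best = run_ga()
print("best makespan:", decode(best))
```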

Keywords: Genetic algorithm, Job shop scheduling problem, Local search, Meta-heuristic algorithm

8695 Web Content Mining: A Solution to Consumer's Product Hunt

Authors: Syed Salman Ahmed, Zahid Halim, Rauf Baig, Shariq Bashir

Abstract:

With the rapid growth in business size, today's businesses orient towards electronic technologies. Amazon.com and e-bay.com are some of the major stakeholders in this regard. Unfortunately, the enormous amount of largely unstructured data on the web, even for a single commodity, has become a source of ambiguity for consumers. Extracting valuable information from such ever-increasing data is an extremely tedious task and is fast becoming critical to the success of businesses. Web content mining can play a major role in solving these issues. It involves using efficient algorithmic techniques to search and retrieve the desired information from seemingly unsearchable, unstructured data on the Internet. Application of web content mining can be very encouraging in the areas of customer relations modeling, billing records, logistics investigations, product cataloguing and quality management. In this paper, we present a review of some very interesting, efficient yet implementable techniques from the field of web content mining and study their impact on business user needs, focusing on both the customer and the producer. The techniques reviewed include mining by developing a knowledge-base repository of the domain, iterative refinement of user queries for personalized search, using a graph-based approach for the development of a web crawler, and filtering information for personalized search using website captions. These techniques have been analyzed and compared on the basis of their execution time and the relevance of the results they produce for a particular search.

Keywords: Data mining, web mining, search engines, knowledge discovery.

8694 A Multi-Level Web-Based Parallel Processing System: A Hierarchical Volunteer Computing Approach

Authors: Abdelrahman Ahmed Mohamed Osman

Abstract:

Over the past few years, a number of efforts have been exerted to build parallel processing systems that utilize the idle power of LANs and PCs available in many homes and corporations. The main advantage of these approaches is that they provide cheap parallel processing environments for those who cannot afford the expenses of supercomputers and parallel processing hardware. However, most of the solutions provided are not very flexible in the use of available resources and are very difficult to install and set up. In this paper, a multi-level web-based parallel processing system (MWPS) is designed (see appendix). MWPS is based on the idea of volunteer computing; it is very flexible, easy to set up and easy to use. MWPS allows three types of subscribers: simple volunteers (single computers), super volunteers (full networks) and end users. All of these entities are coordinated transparently through a secure web site. Volunteer nodes provide the processing power needed by the system's end users. There is no limit on the number of volunteer nodes, and accordingly the system can grow indefinitely. Both volunteers and system users must register and subscribe. Once they subscribe, each entity is provided with the appropriate MWPS components, which are very easy to install. Super volunteer nodes are provided with special components that make it possible to delegate some of the load to their inner nodes. These inner nodes may, in turn, delegate some of the load to lower-level inner nodes, and so on. It is the responsibility of the parent super nodes to coordinate the delegation process and deliver the results back to the user. MWPS uses a simple behavior-based scheduler that takes into consideration the current load and previous behavior of processing nodes. Nodes that fulfill their contracts within the expected time get a high degree of trust. Nodes that fail to satisfy their contracts get a lower degree of trust. MWPS is based on the .NET framework and provides the minimal level of security expected in distributed processing environments. Users and processing nodes are fully authenticated. Communications and messages between nodes are very secure. The system has been implemented using C#. MWPS may be used by any group of people or companies to establish a parallel processing or grid environment.
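As a small illustration of the behavior-based trust idea described above, the sketch below keeps a per-node trust score that rises when a volunteer fulfills its contract on time and falls when it does not. The class name, smoothing weights and neutral starting value are assumptions for illustration; the actual MWPS scheduler is implemented in C# on .NET.

```python
# Minimal sketch of a behaviour-based trust score such as the MWPS scheduler might
# keep per volunteer node; the 0.8/0.2 smoothing weights are illustrative only.
from dataclasses import dataclass

@dataclass
class VolunteerNode:
    name: str
    trust: float = 0.5  # start neutral

    def report_result(self, finished_on_time: bool) -> None:
        # Exponentially weighted update: fulfilled contracts raise trust,
        # missed deadlines lower it.
        outcome = 1.0 if finished_on_time else 0.0
        self.trust = 0.8 * self.trust + 0.2 * outcome

nodes = [VolunteerNode("pc-01"), VolunteerNode("lab-net-07")]
nodes[0].report_result(True)
nodes[1].report_result(False)
# The scheduler would then prefer high-trust nodes when delegating work units.
print(sorted(nodes, key=lambda n: n.trust, reverse=True))
```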

Keywords: Volunteer computing, Parallel Processing, XML Web Services, .NET Remoting, Tuplespace.

8693 Fuzzy Clustering of Categorical Attributes and its Use in Analyzing Cultural Data

Authors: George E. Tsekouras, Dimitris Papageorgiou, Sotiris Kotsiantis, Christos Kalloniatis, Panagiotis Pintelas

Abstract:

We develop a three-step fuzzy logic-based algorithm for clustering categorical attributes, and we apply it to analyze cultural data. In the first step the algorithm employs an entropy-based clustering scheme, which initializes the cluster centers. In the second step we apply the fuzzy c-modes algorithm to obtain a fuzzy partition of the data set, and the third step introduces a novel cluster validity index, which decides the final number of clusters.
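For readers unfamiliar with fuzzy c-modes, the sketch below runs one membership-and-mode update on a toy categorical data set using the simple-matching dissimilarity; the data, fuzzifier value and initialisation are illustrative, and the paper's entropy-based initialisation and validity index are not reproduced.

```python
# Minimal sketch of one fuzzy c-modes iteration on categorical records
# (simple-matching dissimilarity); the toy data and fuzzifier m=1.5 are illustrative.
import numpy as np

X = np.array([["red", "round", "small"],
              ["red", "round", "large"],
              ["blue", "square", "large"],
              ["blue", "square", "small"]])
modes = X[[0, 2]].copy()          # two initial cluster modes
m = 1.5                           # fuzzifier

def dissimilarity(x, mode):
    return np.sum(x != mode)      # number of mismatching attributes

def update_memberships(X, modes):
    U = np.zeros((len(modes), len(X)))
    for i, x in enumerate(X):
        d = np.array([dissimilarity(x, z) for z in modes], dtype=float)
        if np.any(d == 0):                       # object coincides with a mode
            U[:, i] = (d == 0) / np.sum(d == 0)
        else:
            inv = d ** (-1.0 / (m - 1.0))
            U[:, i] = inv / inv.sum()
    return U

def update_modes(X, U):
    new_modes = []
    for l in range(U.shape[0]):
        mode = []
        for a in range(X.shape[1]):              # per attribute, membership-weighted vote
            cats = np.unique(X[:, a])
            weights = [np.sum((U[l] ** m)[X[:, a] == c]) for c in cats]
            mode.append(cats[int(np.argmax(weights))])
        new_modes.append(mode)
    return np.array(new_modes)

U = update_memberships(X, modes)
modes = update_modes(X, U)
print(U.round(2), modes, sep="\n")
```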

Keywords: Categorical data, cultural data, fuzzy logic clustering, fuzzy c-modes, cluster validity index.

8692 Highly Efficient White Light-emitting Diodes Based on Layered Quantum Dot-Phosphor Nanocomposites as Converting Materials

Authors: J. Y. Woo, J. Lee, N. Kim, C.-S. Han

Abstract:

This paper reports on the enhanced photoluminescence (PL) of nanocomposites through the layered structuring of phosphor and quantum dots (QDs). Green phosphor of Sr2SiO4:Eu, red QDs of CdSe/CdS/CdZnS/ZnS core-multishell, and thermo-curable resin were used for this study. Two kinds of composite (layered and mixed) were prepared, and schemes for optical energy transfer between QD and phosphor were suggested and investigated based on PL decay characteristics. It was found that the layered structure is more effective than the mixed one in terms of PL intensity, PL decay and thermal loss. When this layered nanocomposite (QDs on phosphor) is used to make a white light-emitting diode (LED), the brightness is increased by 37%, and the color rendering index (CRI) value is raised to 88.4, compared to 80.4 for the mixed case.

Keywords: Quantum Dot, Nanocomposites, Photoluminescence, Light Emitting Diode

8691 Why Do Clawback Provisions Affect Financial Reporting Quality? An Analysis of Trigger Effects

Authors: Yu-Chun Lin

Abstract:

We identify clawback triggers from firms' proxy statements (Form DEF 14A) and use the likelihood of restatements to proxy for financial reporting quality. We study a sample of 578 U.S. firms that voluntarily adopted clawback provisions during 2003-2009, where restatement-based triggers can be decomposed into two types, fraud and unintentional error, and we observe evidence that using fraud triggers is associated with high financial reporting quality. The findings support the view that fraud triggers can enhance the deterrent effect of clawback provisions by establishing a viable disincentive against fraud, misconduct, and otherwise harmful acts. These results are robust to controlling for compensation components, to different sample specifications, and to a number of sensitivity tests.

Keywords: Accruals quality, Clawback provisions, Compensation, Restatements.

8690 CScheme in Traditional Concurrency Problems

Authors: Nathar Shah, Visham Cheerkoot

Abstract:

CScheme, a concurrent programming paradigm based on the scheme concept, enables concurrency schemes to be constructed from smaller synchronization units through a GUI-based composer and later reused on other concurrency problems of a similar nature. This paradigm is particularly important in the multi-core environment prevalent nowadays. In this paper, we demonstrate techniques to separate concurrency from functional code using the CScheme paradigm. We then illustrate how the CScheme methodology can be used to solve some of the traditional concurrency problems, namely the critical section problem and the readers-writers problem, using synchronization schemes such as the Single Threaded Execution Scheme and the Readers Writers Scheme.
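CScheme itself composes synchronization units through a GUI, but the separation of concurrency from functional code can be illustrated in plain Python: below, a reusable readers-writers scheme is applied to ordinary functions as decorators. This is only an analogue of the idea, not the authors' paradigm or code.

```python
# Minimal Python analogue of a reusable synchronisation unit: a readers-writers
# scheme applied to plain functions, keeping concurrency out of the functional code.
import threading

class ReadersWritersScheme:
    """Many concurrent readers, exclusive writers (no writer preference)."""
    def __init__(self):
        self._readers = 0
        self._readers_lock = threading.Lock()
        self._resource = threading.Lock()

    def reader(self, fn):
        def wrapped(*args, **kwargs):
            with self._readers_lock:
                self._readers += 1
                if self._readers == 1:
                    self._resource.acquire()   # first reader locks out writers
            try:
                return fn(*args, **kwargs)
            finally:
                with self._readers_lock:
                    self._readers -= 1
                    if self._readers == 0:
                        self._resource.release()
        return wrapped

    def writer(self, fn):
        def wrapped(*args, **kwargs):
            with self._resource:               # writers get exclusive access
                return fn(*args, **kwargs)
        return wrapped

scheme = ReadersWritersScheme()
shared = []

@scheme.writer
def append(value):
    shared.append(value)

@scheme.reader
def snapshot():
    return list(shared)

threads = [threading.Thread(target=append, args=(i,)) for i in range(5)]
for t in threads: t.start()
for t in threads: t.join()
print(snapshot())
```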

Keywords: Concurrent Programming, Object Oriented Programming, Environments for multiple-processor systems, Programming paradigms.

8689 Elegant: An Intuitive Software Tool for Interactive Learning of Power System Analysis

Authors: Eduardo N. Velloso, Fernando M. N. Dantas, Luciano S. Barros

Abstract:

A common complaint from power system analysis students lies in the overly complex tools they need to learn and use just to simulate very basic systems or just to check the answers to power system calculations. The most basic power system studies are power-flow solutions and short-circuit calculations. This paper presents a simple tool with an intuitive interface to perform both these studies and assesses its performance in comparison with existing commercial solutions. With this in mind, Elegant is a pure Python software tool for learning power system analysis developed for undergraduate and graduate students. It solves the power-flow problem by iterative numerical methods and calculates bolted short-circuit fault currents by modeling the network in the domain of symmetrical components. Elegant can be used with a user-friendly Graphical User Interface (GUI) and automatically generates human-readable reports of the simulation results. The tool is exemplified using a typical Brazilian regional system with 18 buses. This study performs a comparative experiment with one undergraduate and four graduate students who attempted the same problem using both Elegant and a commercial tool. It was found that Elegant significantly reduces the time and labor involved in basic power system simulations while still providing some insights into real power system designs.
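As an example of the kind of iterative power-flow solution such a teaching tool performs, the sketch below runs a Gauss-Seidel power flow on a hypothetical 3-bus system; the line impedances, loads and the choice of Gauss-Seidel are assumptions for illustration and are not taken from Elegant's source code.

```python
# Minimal Gauss-Seidel power-flow sketch on a hypothetical 3-bus system
# (bus 0 = slack, buses 1-2 = PQ loads); values are illustrative only.
import numpy as np

# Line series impedances (per unit): (from, to, z)
lines = [(0, 1, 0.02 + 0.06j), (0, 2, 0.02 + 0.06j), (1, 2, 0.04 + 0.12j)]
n = 3
Y = np.zeros((n, n), dtype=complex)
for f, t, z in lines:                     # assemble the bus admittance matrix
    y = 1 / z
    Y[f, f] += y; Y[t, t] += y
    Y[f, t] -= y; Y[t, f] -= y

S = np.array([0.0, -(0.5 + 0.2j), -(0.3 + 0.1j)])   # injected power; loads are negative
V = np.ones(n, dtype=complex)                       # flat start, slack fixed at 1.0 p.u.

for _ in range(100):                                # Gauss-Seidel sweeps over PQ buses
    for i in (1, 2):
        sigma = Y[i] @ V - Y[i, i] * V[i]           # sum of Y[i, k] * V[k] for k != i
        V[i] = (np.conj(S[i] / V[i]) - sigma) / Y[i, i]

print("|V| (p.u.):", np.abs(V).round(4), "angle (deg):", np.degrees(np.angle(V)).round(2))
```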

Keywords: Free- and open-source software, power-flow, power system analysis, Python, short-circuit.

8688 Optimization Based Tuning of Autopilot Gains for a Fixed Wing UAV

Authors: Mansoor Ahsan, Khalid Rafique, Farrukh Mazhar

Abstract:

Unmanned Aerial Vehicles (UAVs) have gained tremendous importance, in both military and civil domains, during the first decade of this century. In a UAV, an onboard computer (autopilot) autonomously controls the flight and navigation of the aircraft. Based on the aircraft's role and flight envelope, controllers ranging from basic to complex and sophisticated are used to stabilize the aircraft's flight parameters. These controllers constitute the autopilot system for UAVs. Autopilot systems most commonly provide lateral and longitudinal control through Proportional-Integral-Derivative (PID) controllers or phase-lead or lag compensators. Various techniques are commonly used to tune the gains of these controllers, such as in-flight step-by-step tuning and software-in-the-loop or hardware-in-the-loop tuning methods. Subsequently, numerous in-flight tests are required to actually fine-tune these gains. However, an optimization-based tuning of these PID controllers or compensators, as presented in this paper, can greatly minimize the requirement for in-flight tuning and substantially reduce the risks and cost involved in flight testing.
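The sketch below illustrates the general idea of optimization-based gain tuning: a PID controller is simulated against a hypothetical second-order plant and a numerical optimizer minimizes the integral of squared error of the step response. The plant model, cost function and starting gains are illustrative assumptions, not the aircraft models or tools used in the paper.

```python
# Minimal sketch of optimisation-based PID tuning on a hypothetical second-order plant;
# the cost function (integral of squared error of a step response) and gains are
# illustrative only.
import numpy as np
from scipy.optimize import minimize

dt, T = 0.01, 5.0
t = np.arange(0.0, T, dt)

def step_response_ise(gains):
    """Simulate a unity step command through PID + plant, return integral squared error."""
    kp, ki, kd = gains
    # Plant: y'' + 2y' + y = u, simulated with explicit Euler integration
    y = dy = 0.0
    integral = prev_err = 0.0
    ise = 0.0
    for _ in t:
        err = 1.0 - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integral + kd * deriv
        ddy = u - 2.0 * dy - y
        dy += ddy * dt
        y += dy * dt
        ise += err * err * dt
    return ise

result = minimize(step_response_ise, x0=[1.0, 0.5, 0.1], method="Nelder-Mead")
print("tuned (kp, ki, kd):", result.x.round(3), "ISE:", round(result.fun, 4))
```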

Keywords: Unmanned aerial vehicle (UAV), autopilot, autonomous controls, PID controller gain tuning, optimization.

8687 Microseismicity of the Tehran Region Based on Three Seismic Networks

Authors: Jamileh Vasheghani Farahani

Abstract:

The main purpose of this research is to show the currently active faults and active tectonics of the area using three seismic networks in the Tehran region: 1) Tehran Disaster Mitigation and Management Organization (TDMMO), 2) Broadband Iranian National Seismic Network Center (BIN), 3) Iranian Seismological Center (IRSC). In this study, we analyzed microearthquakes that occurred in Tehran city and its surroundings using the Tehran networks from 1996 to 2015, and we found some active faults and trends in the region. There is a 200-year history of historical earthquakes in Tehran. Historical and instrumental seismicity show that the east of Tehran is more active than the west. The Mosha fault in the north of Tehran is one of the active faults of the central Alborz. Moreover, other major faults in the region are the Kahrizak, Eyvanakey, Parchin and North Tehran faults. An important seismicity region is the intersection of the Mosha and North Tehran fault systems (Kalan village in Lavasan), which shows a cluster of microearthquakes. According to the historical and microseismic events analyzed in this research, there is a seismic gap in the southeast of Tehran. An empirical relationship is used to assess the maximum expected magnitude (Mmax) based on rupture length. There is a probability of occurrence of a strong earthquake of magnitude 7.0 to 7.5 in the region, based on the assessed capability of the major faults, such as the Parchin and Eyvanekey faults, and on the historical earthquakes.

Keywords: Iran, major faults, microseismicity, Tehran.

8686 Utilizing Ontologies Using Ontology Editor for Creating Initial Unified Modeling Language (UML) Object Model

Authors: Waralak Vongdoiwang Siricharoen

Abstract:

One problem in object-oriented software development is the difficulty of finding appropriate and suitable objects with which to start the system. In this work, ontologies support object discovery in the initial stage of object-oriented software development. Many studies have tried to demonstrate that there is great potential between object models and ontologies. Constructing an ontology from an object model, called ontology engineering, can be done; this research, on the other hand, aims to support the idea that building an object model from an ontology is also promising and practical. Ontology classes are available online in many specific areas and can be found by semantic search engines. There are also many supporting tools; those used in this research are the Protégé ontology editor and Visual Paradigm, and putting them together gives a good outcome. This research shows how the approach works efficiently with a real case study using ontology classes in the travel/tourism domain. It is necessary to combine classes, properties, and relationships from more than two ontologies in order to generate the object model. This paper presents a simple methodology framework which explains the process of discovering objects. The results show that this framework has great value, with room for expansion. Reusing existing ontologies offers a much cheaper alternative than building new ones from scratch. More ontologies are becoming available on the web, and online ontology libraries for storing and indexing ontologies are increasing in number and demand. Semantic and ontology search engines have also started to appear, facilitating the search and retrieval of online ontologies.

Keywords: Software Development, Ontology, Ontology Library, Artificial Intelligence, Protégé, Object Model.

8685 The Vulnerability Analysis of Java Bytecode Based on Points-to Dataflow

Authors: Tang Hong, Zhang Lufeng, Chen Hua, Zhang Jianbo

Abstract:

Today many developers use Java components collected from the Internet as external libraries to design and develop their own software. However, unknown security bugs may exist in these components; for example, an SQL injection bug may come from components that perform no specific check on strings input by users. Checking for these bugs is very difficult without source code. Therefore, a novel method to check for bugs in Java bytecode based on points-to dataflow analysis is needed, which differs from the common analysis techniques based on vulnerability pattern checking. It can be used as an assistant tool for the security analysis of Java bytecode from unknown software that will be used as external libraries.

Keywords: Java bytecode, points-to dataflow, vulnerability analysis

8684 Modeling of Material Removal on Machining of Ti-6Al-4V through EDM using Copper Tungsten Electrode and Positive Polarity

Authors: M. M. Rahman, Md. Ashikur Rahman Khan, K. Kadirgama, M. M. Noor, Rosli A. Bakar

Abstract:

This paper presents an optimized model to investigate the effects of peak current, pulse-on time and pulse-off time on EDM performance, in terms of the material removal rate of a titanium alloy, utilizing copper tungsten as the electrode with positive polarity. The experiments are carried out on Ti-6Al-4V by varying the peak current, pulse-on time and pulse-off time. A mathematical model is developed to correlate the influences of these variables with the material removal rate of the workpiece. Design of experiments (DOE) and response surface methodology (RSM) techniques are implemented. The validity and adequacy of the proposed models are tested through analysis of variance (ANOVA). The obtained results show that the material removal rate increases as peak current and pulse-on time increase, while the effect of pulse-off time on MRR changes with peak current. The optimum machining conditions for material removal rate are estimated and verified against the proposed optimized results. The developed model is within the limits of acceptable error (about 4%) when compared to experimental results, leading to a desirable material removal rate and economical industrial machining with optimized input parameters.
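An RSM model of this kind is typically a second-order polynomial in the three process parameters fitted by least squares; the sketch below shows such a fit on placeholder data (the numbers are not the paper's measurements).

```python
# Minimal sketch of fitting a second-order response-surface model
# MRR = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj) by least squares;
# the data points below are placeholders, not the paper's measurements.
import numpy as np

# Columns: peak current (A), pulse-on time (us), pulse-off time (us)
X = np.array([[10, 100, 50], [10, 200, 100], [20, 100, 100], [20, 200, 50],
              [15, 150, 75], [25, 150, 50], [25, 250, 100], [15, 250, 75],
              [20, 150, 75], [10, 150, 100], [25, 100, 75], [15, 100, 50]], float)
mrr = np.array([2.1, 2.9, 4.0, 5.2, 3.6, 6.1, 7.4, 4.3, 4.8, 2.7, 5.5, 3.0])  # illustrative

def design_matrix(X):
    i, on, off = X.T
    return np.column_stack([np.ones(len(X)), i, on, off,
                            i**2, on**2, off**2,
                            i*on, i*off, on*off])

A = design_matrix(X)
coeffs, *_ = np.linalg.lstsq(A, mrr, rcond=None)
pred = A @ coeffs
print("coefficients:", coeffs.round(4))
print("R^2:", round(1 - np.sum((mrr - pred)**2) / np.sum((mrr - mrr.mean())**2), 3))
```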

Keywords: Ti-6Al-4V, material removal rate, copper tungsten, positive polarity, RSM.

8683 Multiple Input Multiple Output Detection Using Roulette Wheel Based Ant Colony Optimization Technique

Authors: B. Rebekka, B. Malarkodi

Abstract:

This paper describes an approach to detect the transmitted signals in a 2×2 Multiple Input Multiple Output (MIMO) setup using a roulette wheel based ant colony optimization technique. The results obtained are compared with classical zero forcing and least mean square techniques. The detection rates achieved using this technique are consistently higher than those achieved using classical methods over 50 attempts, with two different antennas transmitting the input stream from a user. This paves the path to using alternative techniques to improve the throughput achieved in advanced networks such as Long Term Evolution (LTE) networks.
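The sketch below illustrates the roulette-wheel selection step inside a toy ant colony search for a 2×2 MIMO symbol vector over a QPSK alphabet: ants pick symbols with probability proportional to pheromone, and the pheromone of the best solution (lowest residual ||y - Hx||^2) is reinforced. The channel, noise level and ACO parameters are illustrative assumptions, not the paper's setup.

```python
# Minimal sketch of roulette-wheel ant construction for 2x2 MIMO detection over a
# QPSK alphabet; channel, noise and ACO parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
alphabet = np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2)   # QPSK symbols

H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
x_true = alphabet[rng.integers(0, 4, size=2)]
y = H @ x_true + 0.05 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))

def roulette(weights):
    """Roulette-wheel selection: pick an index with probability proportional to weight."""
    p = weights / weights.sum()
    return rng.choice(len(weights), p=p)

tau = np.ones((2, 4))              # pheromone per (transmit antenna, symbol)
best_x, best_cost = None, np.inf
for _ in range(30):                # iterations
    for _ in range(10):            # ants
        idx = [roulette(tau[a]) for a in range(2)]
        x = alphabet[idx]
        cost = np.linalg.norm(y - H @ x) ** 2
        if cost < best_cost:
            best_x, best_cost, best_idx = x, cost, idx
    tau *= 0.9                                     # evaporation
    for a, s in enumerate(best_idx):
        tau[a, s] += 1.0 / (1e-9 + best_cost)      # reinforce the best solution

print("detected:", best_x, "transmitted:", x_true)
```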

Keywords: MIMO, ant colony optimization, roulette wheel, soft computing, LTE.

8682 Micromechanics Modeling of 3D Network Smart Orthotropic Structures

Authors: E. M. Hassan, A. L. Kalamkarov

Abstract:

Two micromechanical models for 3D smart composites with embedded periodic or nearly periodic networks of generally orthotropic reinforcements and actuators are developed and applied to cubic structures with unidirectional orientation of constituents. Analytical formulas for the effective piezothermoelastic coefficients are derived using the Asymptotic Homogenization Method (AHM). Finite Element Analysis (FEA) is subsequently developed and used to examine the aforementioned periodic 3D network reinforced smart structures. The deformation responses from the FE simulations are used to extract effective coefficients, and the results from both techniques are compared. This work considers piezoelectric materials that respond linearly to changes in electric field, electric displacement, mechanical stress and strain, and thermal effects. This combination of electric fields and thermo-mechanical response in smart composite structures is characterized by piezoelectric and thermal expansion coefficients. The problem is represented by a unit cell, and the models are developed using the AHM and the FEA to determine the effective piezoelectric and thermal expansion coefficients. Each unit cell contains a number of orthotropic inclusions in the form of structural reinforcements and actuators. Using a matrix representation of the coupled response of the unit cell, the effective piezoelectric and thermal expansion coefficients are calculated and compared with the results of the asymptotic homogenization method. A very good agreement is shown between these two approaches.

Keywords: Asymptotic Homogenization Method, Effective Piezothermoelastic Coefficients, Finite Element Analysis, 3D Smart Network Composite Structures.

8681 Performance Analysis of Traffic Classification with Machine Learning

Authors: Htay Htay Yi, Zin May Aye

Abstract:

Network security plays a central role in the ICT environment because the number of malicious users is continually growing in education, business, and other ICT-related realms. Network security contraventions are typically described and examined centrally based on a security event management system. Firewalls, Intrusion Detection Systems (IDS), and Intrusion Prevention Systems are becoming essential to monitor or prevent potential violations, attack incidents, and imminent threats. In this system, the firewall rules are set only where the system policies require them. The dataset deployed in this system is derived from the testbed environment: DoS and PortScan traffic is applied in the testbed with firewall and IDS implementation, and the network traffic is classified as normal or attack based on six machine learning classification methods applied in the system. The dataset is based on CICIDS2017 with some added features, and the system tested 26 features from the applied dataset. The aims are to reduce false positive rates and to improve accuracy in the implemented testbed design. The system also shows good performance by selecting important features and comparing machine learning classifiers on the dataset.
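A minimal version of the classifier-comparison step might look like the sketch below, which trains three commonly used classifiers on synthetic labelled features and reports accuracy and false positive rate; the synthetic data merely stands in for the CICIDS2017-based testbed dataset, and the particular classifier choice is an assumption rather than the paper's six methods.

```python
# Minimal sketch of comparing several classifiers on labelled traffic features;
# synthetic data stands in for the CICIDS2017-based testbed dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

X, y = make_classification(n_samples=2000, n_features=26, n_informative=12,
                           weights=[0.7, 0.3], random_state=1)   # 26 features, as in the paper
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

for clf in (GaussianNB(), KNeighborsClassifier(), RandomForestClassifier(random_state=1)):
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    print(type(clf).__name__,
          "accuracy:", round(accuracy_score(y_te, pred), 3),
          "false positive rate:", round(fp / (fp + tn), 3))
```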

Keywords: False negative rate, intrusion detection system, machine learning methods, performance.

8680 Transform-Domain Rate-Distortion Optimization Accelerator for H.264/AVC Video Encoding

Authors: Mohammed Golam Sarwer, Lai Man Po, Kai Guo, Q.M. Jonathan Wu

Abstract:

In H.264/AVC video encoding, rate-distortion optimization for mode selection plays a significant role to achieve outstanding performance in compression efficiency and video quality. However, this mode selection process also makes the encoding process extremely complex, especially in the computation of the rate-distortion cost function, which includes the computations of the sum of squared difference (SSD) between the original and reconstructed image blocks and context-based entropy coding of the block. In this paper, a transform-domain rate-distortion optimization accelerator based on fast SSD (FSSD) and a VLC-based rate estimation algorithm is proposed. This algorithm could significantly simplify the hardware architecture for the rate-distortion cost computation with only ignorable performance degradation. An efficient hardware structure for implementing the proposed transform-domain rate-distortion optimization accelerator is also proposed. Simulation results demonstrated that the proposed algorithm reduces about 47% of total encoding time with negligible degradation of coding performance. The proposed method can be easily applied to many mobile video application areas such as a digital camera and a DMB (Digital Multimedia Broadcasting) phone.
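The property that makes transform-domain SSD computation attractive is energy conservation under an orthonormal transform: the SSD between two blocks equals the SSD between their transform coefficients. The tiny check below demonstrates this with an orthonormal DCT; it illustrates only the underlying principle, not the proposed FSSD or VLC-based rate estimation.

```python
# Tiny numerical check: under an orthonormal transform (2-D DCT with norm="ortho"),
# the SSD between two blocks equals the SSD between their transforms.
import numpy as np
from scipy.fft import dctn

rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(4, 4)).astype(float)
reconstructed = original + rng.normal(0, 2, size=(4, 4))      # placeholder reconstruction

ssd_pixel = np.sum((original - reconstructed) ** 2)
ssd_transform = np.sum((dctn(original, norm="ortho") - dctn(reconstructed, norm="ortho")) ** 2)
print(round(ssd_pixel, 6), round(ssd_transform, 6))            # identical up to rounding
```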

Keywords: Context-adaptive variable length coding (CAVLC), H.264/AVC, rate-distortion optimization (RDO), sum of squared difference (SSD).

8679 Adaptive Impedance Control for Unknown Non-Flat Environment

Authors: Norsinnira Zainul Azlan, Hiroshi Yamaura

Abstract:

This paper presents a new adaptive impedance control strategy, based on the Function Approximation Technique (FAT), to compensate for an unknown non-flat environment shape or time-varying environment location. The target impedance in the force-controllable direction is modified by incorporating adaptive compensators, and the uncertainties are represented by FAT, allowing the update law to be derived easily. The force error feedback is utilized in the estimation, and accurate knowledge of the environment parameters is not required by the algorithm. It is shown mathematically that the stability of the controller is guaranteed based on Lyapunov theory. Simulation results are presented to demonstrate the validity of the proposed controller.

Keywords: Adaptive impedance control, Function Approximation Technique (FAT), impedance control, unknown environment position.

8678 Security Weaknesses of Dynamic ID-based Remote User Authentication Protocol

Authors: Hyoungseob Lee, Donghyun Choi, Yunho Lee, Dongho Won, Seungjoo Kim

Abstract:

Recently, with the appearance of smart cards, many user authentication protocols using smart cards have been proposed to mitigate the vulnerabilities in the user authentication process. In 2004, Das et al. proposed an ID-based user authentication protocol that is secure against ID theft and replay attacks using smart cards. In 2009, Wang et al. showed that Das et al.'s protocol is not secure against randomly chosen password attacks and impersonation attacks, and proposed an improved protocol. Their protocol provided mutual authentication and efficient password management. In this paper, we analyze the security weaknesses and point out the vulnerabilities of Wang et al.'s protocol.

Keywords: Message Alteration Attack, Impersonation Attack

8677 Feasibility Study of Friction Stir Welding Application for Kevlar Material

Authors: Ahmet Taşan, Süha Tirkeş, Yavuz Öztürk, Zafer Bingül

Abstract:

Friction stir welding (FSW) is a solid-state joining process, which eliminates problems associated with material melting and solidification, such as cracks, residual stresses and distortions generated during conventional welding. Among the most important advantages of FSW are easy automation, less distortion, lower residual stress and good mechanical properties in the joining region. FSW is a recent approach to metal joining and, although originally intended for aluminum alloys, it is investigated in a variety of metallic materials. The basic concept of FSW is a rotating tool, made of non-consumable material, specially designed with a geometry consisting of a pin and a recess (shoulder). This tool is inserted, spinning on its axis, at the adjoining edges of the two sheets or plates to be joined, and then travels along the joint line. The tool rotation axis defines an angle of inclination with respect to the components to be welded. This angle is used for receiving the material to be processed at the tool base and for promoting the gradual forging effect imposed by the shoulder during the passage of the tool. This prevents plastic flow of the material at the side of the tool, ensuring weld closure behind the pin. In this study, two 4 mm Kevlar® plates, produced from Kevlar® fabrics, are analyzed with COMSOL Multiphysics in order to investigate their weldability via FSW. Thereafter, experimental investigations are carried out on an appropriate workbench in order to compare them with the analysis results.

Keywords: Analytical modeling, composite materials welding, friction stir welding, heat generation.

8676 Low Overhead Dynamic Channel Selection with Cluster-Based Spatial-Temporal Station Reporting in Wireless Networks

Authors: Zeyad Abdelmageid, Xianbin Wang

Abstract:

Choosing the operational channel for a WLAN access point (AP) has traditionally been a static channel assignment process initiated by the user during the deployment of the AP, which fails to cope with the dynamic conditions of the assigned channel at the station side afterwards. However, the dramatically growing number of Wi-Fi APs and stations operating in the unlicensed band has led to dynamic, distributed and often severe interference. This highlights the urgent need for the AP to dynamically select the best overall channel of operation for the basic service set (BSS) by considering the distributed and changing channel conditions at all stations. Consequently, dynamic channel selection algorithms that consider feedback from the station side have been developed. Despite the significant performance improvement, existing channel selection algorithms suffer from very high feedback overhead. Feedback latency from the STAs, due to the high overhead, can cause the eventually selected channel to no longer be optimal for operation because of the dynamic sharing nature of the unlicensed band. This has inspired us to develop our own dynamic channel selection algorithm with reduced overhead, through the proposed low-overhead, cluster-based station reporting mechanism. The main idea behind cluster-based station reporting is the observation that STAs which are very close to each other tend to have very similar channel conditions. Instead of requesting each STA to report on every candidate channel, causing high overhead, the AP divides the STAs into clusters and then assigns each STA in each cluster one channel to report feedback on. With proper design of the cluster-based reporting, the AP does not lose any information about the channel conditions at the station side while reducing feedback overhead. The simulation results show equal performance, and at times better performance, with a fraction of the overhead. We believe that this algorithm has great potential in designing future dynamic channel selection algorithms with low overhead.
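A minimal sketch of the cluster-based reporting idea follows: nearby STAs are grouped with DBSCAN (as in the keywords) and the candidate channels are spread round-robin across each cluster's members, so each STA measures only one channel while the cluster as a whole covers them all. The coordinates, eps value and channel list are illustrative assumptions, not the paper's simulation setup.

```python
# Minimal sketch of cluster-based station reporting: group nearby STAs with DBSCAN
# and give each STA in a cluster a different candidate channel to measure.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(3)
# STA positions in metres: three tight groups around hypothetical coverage spots
sta_xy = np.vstack([rng.normal(loc, 1.0, size=(8, 2)) for loc in ([0, 0], [20, 5], [5, 25])])
candidate_channels = [1, 6, 11, 36, 40, 44, 48, 149]

labels = DBSCAN(eps=3.0, min_samples=3).fit_predict(sta_xy)

assignments = {}
for cluster in set(labels):
    members = np.where(labels == cluster)[0]
    # Spread the candidate channels across a cluster's members round-robin, so the
    # cluster covers every channel while each STA reports on only one.
    for k, sta in enumerate(members):
        assignments[int(sta)] = candidate_channels[k % len(candidate_channels)]

print("cluster labels:", labels)
print("per-STA reporting channel:", assignments)
```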

Keywords: Channel assignment, Wi-Fi networks, clustering, DBSCAN, overhead.

8675 A Case Study of Key-Dependent Permutations in Feistel Ciphers

Authors: Hani Almimi, Ola Osabi, Azman Samsudin

Abstract:

Many attempts have been made to strengthen Feistel-based block ciphers. Among the successful proposals is the key-dependent S-box, which was implemented in some high-profile ciphers. In this paper, a key-dependent permutation box is proposed and implemented on DES as a case study. The new modified DES, MDES, was tested against the Diehard tests, an avalanche test, and a performance test. The results showed that, in general, MDES is more resistant to attacks than DES with negligible overhead. Therefore, it is believed that the proposed key-dependent permutation should be considered as a valuable primitive that can help strengthen the security of the Substitution-Permutation Network, which is a core design in many Feistel-based block ciphers.
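The general primitive of a key-dependent permutation can be sketched as follows: a permutation box is derived deterministically from the key with a key-seeded Fisher-Yates shuffle and inverted for decryption. This illustrates the primitive only, not the exact construction added to DES in the paper.

```python
# Minimal sketch of a key-dependent permutation box derived with a key-seeded
# Fisher-Yates shuffle; box size and key are illustrative.
import hashlib
import random

def key_dependent_pbox(key: bytes, size: int = 32) -> list[int]:
    """Return a permutation of range(size) determined entirely by the key."""
    seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
    rng = random.Random(seed)            # deterministic PRNG seeded from the key
    pbox = list(range(size))
    rng.shuffle(pbox)                    # Fisher-Yates shuffle
    return pbox

def permute_bits(block: int, pbox: list[int]) -> int:
    """Apply the permutation to the bits of a block (bit 0 = least significant)."""
    out = 0
    for dst, src in enumerate(pbox):
        out |= ((block >> src) & 1) << dst
    return out

pbox = key_dependent_pbox(b"session key 1")
inverse = [0] * len(pbox)
for dst, src in enumerate(pbox):
    inverse[src] = dst                   # invert the permutation for decryption

x = 0xDEADBEEF
assert permute_bits(permute_bits(x, pbox), inverse) == x
print(pbox[:8], hex(permute_bits(x, pbox)))
```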

Keywords: Block Cipher, Feistel Structure, DES, Diehard Tests, Avalanche Effect.

8674 A New Algorithm for Enhanced Robustness of Copyright Mark

Authors: Harsh Vikram Singh, S. P. Singh, Anand Mohan

Abstract:

This paper discusses a new heavy-tailed distribution based data hiding into discrete cosine transform (DCT) coefficients of an image, which provides statistical security as well as robustness against steganalysis attacks. Unlike other data hiding algorithms, the proposed technique does not introduce much effect in the stego-image's DCT coefficient probability plots, thus making the presence of hidden data statistically undetectable. In addition, the proposed method does not compromise on hiding capacity. When compared to the generic block DCT based data-hiding scheme, our method is found to be more robust against a variety of image manipulation attacks such as filtering, blurring, JPEG compression, etc.
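As background on DCT-domain embedding in general, the sketch below hides a single bit in a mid-frequency coefficient of an 8×8 block by parity quantisation; the paper's heavy-tailed (Pareto-based) embedding rule is not reproduced, and the block data, coefficient position and quantisation step are assumptions.

```python
# Minimal sketch of embedding one payload bit in a mid-frequency DCT coefficient of
# an 8x8 block via parity quantisation; illustrates the pipeline only, not the
# paper's heavy-tailed embedding rule.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(7)
block = rng.integers(0, 256, size=(8, 8)).astype(float)   # placeholder image block
bit = 1
step = 8.0                                                 # quantisation step for embedding

coeffs = dctn(block, norm="ortho")
r, c = 3, 2                                                # a mid-frequency position
q = np.round(coeffs[r, c] / step)
if int(q) % 2 != bit:                                      # force coefficient parity to the bit
    q += 1
coeffs[r, c] = q * step
stego = idctn(coeffs, norm="ortho")

recovered = int(np.round(dctn(stego, norm="ortho")[r, c] / step)) % 2
print("embedded:", bit, "recovered:", recovered)
```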

Keywords: Information Security, Robust Steganography, Steganalysis, Pareto Probability Distribution function.

8673 Near-Lossless Image Coding based on Orthogonal Polynomials

Authors: Krishnamoorthy R, Rajavijayalakshmi K, Punidha R

Abstract:

In this paper, a near-lossless image coding scheme based on the Orthogonal Polynomials Transform (OPT) is presented. The polynomial operators and polynomial basis operators are obtained from a set of orthogonal polynomial functions for the proposed transform coding. The image is partitioned into a number of distinct square blocks and the proposed transform coding is applied to each of these individually. After applying the proposed transform coding, the transformed coefficients are rearranged into a sub-band structure. The Embedded Zerotree (EZ) coding algorithm is then employed to quantize the coefficients. The proposed transform is implemented for various block sizes and the performance is compared with the existing Discrete Cosine Transform (DCT) coding scheme.

Keywords: Near-lossless Coding, Orthogonal Polynomials Transform, Embedded Zerotree Coding

8672 River Analysis System Model for Proposed Weirs at Downstream of Large Dam, Thailand

Authors: S. Chuenchooklin

Abstract:

This research was conducted in the Lower Ping River Basin downstream of the Bhumibol Dam and the Lower Wang River Basin in Tak Province, Thailand. Most of the tributary streams of the Ping can be considered as ungauged catchments. There are 10 pumping stations installed on both banks of the Ping in Tak Province. Recently, most of them could not fully operate because the water level in the river fell below the pumping level, even when natural river flow and released flow from the Bhumibol Dam were included. The aim of this research was to increase the performance of those pumping stations using weir projects in the Ping. Therefore, the river analysis system model (HEC-RAS) was applied to study the hydraulic behavior of water surface profiles in the Ping River, for both the existing conditions and the proposed weirs, during the violent flood in 2011 and the severe drought in 2013. Moreover, the hydrologic modeling system (HMS) was applied to simulate lateral streamflow hydrographs from the ungauged catchments of the Ping. The HEC-RAS model calibration with the existing conditions in 2011 gave a best trial roughness coefficient for the main channel of 0.026. The simulated water surface levels fitted the observation data with an R2 of 0.8175. The model was applied to 3 proposed cascade weirs, 2.35 m in height, and found a surcharge water level only 0.27 m higher than under the existing condition in 2011. Moreover, those weirs could maintain river water levels and increase pumping performance during the low river flow of 2013.

Keywords: HEC-RAS, HMS, pumping stations, cascade weirs.

8671 CBIR Using Multi-Resolution Transform for Brain Tumour Detection and Stages Identification

Authors: H. Benjamin Fredrick David, R. Balasubramanian, A. Anbarasa Pandian

Abstract:

Image retrieval is one of the most interesting techniques being used today in our digital world. CBIR, commonly expanded as Content Based Image Retrieval, is an image processing technique which identifies relevant images and retrieves them based on the patterns that are extracted from the digital images. In this paper, two research works are presented using CBIR. The first work provides an automated and interactive approach to the analysis of CBIR techniques. CBIR works on the principle of supervised machine learning, which involves feature selection followed by training and testing phases applied on a classifier in order to perform prediction. Using feature extraction, image transforms such as the Contourlet, Ridgelet and Shearlet can be utilized to retrieve texture features from the images. The features extracted are used to train and build a classifier using classification algorithms such as Naïve Bayes, K-Nearest Neighbour and the Multi-class Support Vector Machine. The testing phase then involves prediction, which classifies a new input image using the trained classifier and labels it as one of four classes, namely 1) normal brain, 2) benign tumour, 3) malignant tumour and 4) severe tumour. The second research work includes developing a tool for tumour stage identification using the best feature extraction method and classifier identified in the first work. Finally, the tool is used to predict the tumour stage and provide suggestions based on the stage of tumour identified by the system. These two approaches contribute to the medical field by giving better retrieval performance and identifying tumour stages.
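The retrieval core of a CBIR system reduces to ranking database images by the distance between feature vectors, as in the sketch below; the random vectors stand in for the Contourlet/Ridgelet/Shearlet texture features and the four tumour classes, and are purely illustrative.

```python
# Minimal sketch of the CBIR retrieval step: rank database images by Euclidean
# distance between feature vectors; the random features are placeholders.
import numpy as np

rng = np.random.default_rng(5)
database_features = rng.random((200, 64))        # 200 images x 64-dim texture features
database_labels = rng.integers(1, 5, size=200)   # classes 1-4, as in the paper
query = rng.random(64)

def retrieve(query, features, k=5):
    """Return indices of the k nearest database images by Euclidean distance."""
    distances = np.linalg.norm(features - query, axis=1)
    return np.argsort(distances)[:k]

top = retrieve(query, database_features)
print("retrieved image indices:", top)
print("their class labels:", database_labels[top])
```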

Keywords: Brain tumour detection, content based image retrieval, classification of tumours, image retrieval.

8670 Chilean Wines Classification based only on Aroma Information

Authors: Nicolás H. Beltrán, Manuel A. Duarte-Mermoud, Víctor A. Soto, Sebastián A. Salah, and Matías A. Bustos

Abstract:

Results of Chilean wine classification based on the information provided by an electronic nose are reported in this paper. The classification scheme consists of two parts; in the first stage, Principal Component Analysis is used as feature extraction method to reduce the dimensionality of the original information. Then, Radial Basis Functions Neural Networks is used as pattern recognition technique to perform the classification. The objective of this study is to classify different Cabernet Sauvignon, Merlot and Carménère wine samples from different years, valleys and vineyards of Chile.

Keywords: Feature extraction techniques, Pattern recognition techniques, Principal component analysis, Radial basis functions neural networks, Wine classification.

8669 Digital Narrative as a Change Agent to Teach Reading to Media-Centric Students

Authors: Robert F. Kenny

Abstract:

Because today's media-centric students have adopted digital media as their native form of communication, teachers are having an increasingly difficult time motivating reluctant readers to read and write. Our research has shown that these text-averse individuals can learn to understand the importance of reading and writing if the instruction is based on digital narratives. While these students are naturally attracted to story, they are better at consuming stories than creating them. Therefore, any intervention that utilizes story as its basis needs to include instruction on the elements of story making. This paper presents a series of digitally based tools to identify potential weaknesses of visually impaired visual learners and to help motivate these and other media-centric students to select and complete the books that are assigned to them.

Keywords: Cognitive tempo, digital narratives, digital Booktalk

8668 Toward a Risk Assessment Model Based On Multi-Agent System for Cloud Consumer

Authors: Saadia Drissi, Siham Benhadou, Hicham Medromi

Abstract:

Cloud computing is an innovative paradigm that introduces several changes in technology, resulting in new ways for cloud providers to deliver their services to cloud consumers, particularly in terms of security risk assessment. Adapting current risk assessment tools to cloud computing is thus a very difficult task, due to the several characteristics of the cloud that challenge the effectiveness of risk assessment approaches. As a consequence, there is a need for a risk assessment model adapted to cloud computing. This paper proposes a new risk assessment model based on a multi-agent system and the AHP model as fundamental steps towards the development of a flexible risk assessment approach for cloud consumers.
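The AHP step referred to above can be sketched as deriving criterion weights from a Saaty pairwise comparison matrix via its principal eigenvector and checking the consistency ratio; the example matrix below (three hypothetical cloud-risk criteria) is illustrative only.

```python
# Minimal sketch of the AHP step: priority weights from a pairwise comparison
# matrix via its principal eigenvector, plus a consistency check.
import numpy as np

# Saaty-scale comparisons for e.g. confidentiality, availability, compliance (hypothetical)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                       # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                          # normalised priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)              # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # Saaty's random index
print("weights:", weights.round(3), "consistency ratio:", round(ci / ri, 3))
```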

Keywords: Cloud computing, risk assessment model, multi-agent system, AHP model, cloud consumer.

8667 Unipolar Anamorphosis and Its Use in Accessibility Analyses

Authors: T. Hudecek, Z. Zakova

Abstract:

The paper deals with cartographic visualisation of the results of transport accessibility monitoring, using a semi-automated method of unipolar anamorphosis developed by the authors in the GIS environment. The method is based on transforming distance in the map into values of a geographical phenomenon. In the case of time accessibility, it is based on transforming isochrones into the form of concentric circles, taking into account selected topographic and thematic elements in the map. The method is most suitable for analyses of accessibility to or from a centre and for modelling its long-term context. The paper provides a detailed analysis of the procedures and functionality of the method, discussing the issues of coordinates, transformation, scale and visualisation. It also offers a discussion of possible problems and inaccuracies. A practical application of the method is illustrated by the authors' previous research results in the field of accessibility in Czechia.
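The core transformation can be sketched as re-projecting each point so that its radial distance from the centre is proportional to travel time rather than geographic distance, which turns isochrones into concentric circles; the coordinates, travel times and scale below are illustrative assumptions, not the authors' semi-automated GIS workflow.

```python
# Minimal sketch of unipolar anamorphosis: keep each point's azimuth from the centre
# but set its radius proportional to travel time, so isochrones become circles.
import numpy as np

center = np.array([0.0, 0.0])
points = np.array([[10.0, 0.0], [0.0, 25.0], [-8.0, -6.0]])   # map coordinates (km)
travel_time = np.array([15.0, 20.0, 40.0])                    # minutes to reach each point

km_per_minute = 1.0          # chosen scale: 1 km of map radius per minute of travel
vectors = points - center
azimuth = np.arctan2(vectors[:, 1], vectors[:, 0])
new_radius = travel_time * km_per_minute
transformed = center + np.column_stack([new_radius * np.cos(azimuth),
                                        new_radius * np.sin(azimuth)])
print(transformed.round(2))
```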

Keywords: accessibility, GIS, transformation, unipolar anamorphosis
