Search results for: Parallel Techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2976

2166 Performance Improvement of MAC Protocols for Broadband Power-Line Access Networks of Developing Countries: A Case of Tanzania

Authors: Abdi T. Abdalla, Justinian Anatory

Abstract:

This paper investigates the possibility of improving the throughput of Media Access Control (MAC) protocols such as ALOHA, slotted ALOHA, and Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA), with the aim of increasing the performance of power-line access networks. A real power-line network topology in Tanzania, located in the Kariakoo area of Dar es Salaam City, was used as a case study. The Wireshark Network Protocol Analyzer was used to analyze the data traffic of a similar existing network for projection purposes, and the data were then simulated in MATLAB. The paper proposes and analyzes three improvement techniques based on collision domain, packet length, and a combination of the two. The results show that the throughput of the CSMA/CA protocol improved noticeably, while ALOHA and slotted ALOHA showed insignificant changes, especially when the hybrid technique was employed.
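
The paper's MATLAB traffic models are not reproduced here, but for the first two protocols the classical throughput-versus-offered-load relations are standard; a minimal Python sketch (offered-load values chosen for illustration; CSMA/CA has no comparably simple closed form):

```python
import math

def throughput_pure_aloha(G):
    """Classical pure ALOHA throughput: S = G * exp(-2G)."""
    return G * math.exp(-2 * G)

def throughput_slotted_aloha(G):
    """Classical slotted ALOHA throughput: S = G * exp(-G)."""
    return G * math.exp(-G)

for G in (0.25, 0.5, 1.0):
    print(f"G={G:.2f}  pure={throughput_pure_aloha(G):.3f}  "
          f"slotted={throughput_slotted_aloha(G):.3f}")
```

Pure ALOHA peaks at G = 0.5 with S about 0.184 and slotted ALOHA at G = 1 with S about 0.368, which is why throughput improvements matter most for the contention-based protocols.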

Keywords: Access network, ALOHA, broadband power-line communication, slotted ALOHA, CSMA/CA, MAC protocols.

2165 A Model for Test Case Selection in the Software-Development Life Cycle

Authors: Adtha Lawanna

Abstract:

Software maintenance is one of the essential processes of the software-development life cycle. The main goals of maintaining software are the correction of errors, the revision of code, the prevention of future errors, and improvements in performance and capacity. While changes are being applied, the software has to be retested to increase the level of assurance that it still behaves according to the requirements. Test cases must therefore be selected for exercising both the revised modules and the whole software. This problem is commonly addressed by regression test selection techniques such as retest-all selection, random/ad hoc selection, and safe regression test selection. In particular, the traditional techniques rely on a mapping between the test cases in a test suite and the lines of code they execute. However, lines of code are not the only factor that affects the size of a test suite; the number of functions and the number of faulty versions matter as well. Therefore, a model for test case selection is developed to cover these three factors through an integrated technique, which produces a smaller set of test cases than the traditional regression selection techniques.
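
The paper's integrated model itself is not given here; as a rough illustration of coverage-based regression test selection, the Python sketch below (with invented test names and coverage data) greedily keeps only the test cases needed to cover the lines touched by a change:

```python
# Hypothetical coverage map: test case -> set of covered line numbers.
coverage = {
    "t1": {1, 2, 3, 10},
    "t2": {3, 4},
    "t3": {10, 11, 12},
    "t4": {4, 5},
}
changed_lines = {3, 4, 10, 11}  # lines modified in this revision

def select_tests(coverage, changed):
    """Greedy set cover: pick tests until all changed lines are covered."""
    selected, remaining = [], set(changed)
    while remaining:
        # Choose the test covering the most still-uncovered changed lines.
        best = max(coverage, key=lambda t: len(coverage[t] & remaining))
        gain = coverage[best] & remaining
        if not gain:
            break  # some changed lines are not covered by any test
        selected.append(best)
        remaining -= gain
    return selected

print(select_tests(coverage, changed_lines))  # ['t1', 't2', 't3']
```

A model like the paper's would additionally weight the choice by the number of functions exercised and by fault history, not by line coverage alone.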

Keywords: Software maintenance, regression test selection, test case.

2163 Existence of Nano-Organic Carbon Particles below the Size Range of 10 nm in the Indoor Air Environment

Authors: Bireswar Paul, Amitava Datta

Abstract:

The indoor air environment has been a major concern in developing countries over the last few decades, with increased focus on monitoring air quality. In this work, an experimental study has been conducted to establish the existence of carbon nanoparticles below the size range of 10 nm in the non-sooting zone of an LPG/air partially premixed flame. Four characterization techniques, UV absorption spectroscopy, fluorescence spectroscopy, dynamic light scattering (DLS), and TEM, have been used to characterize and measure the size of the carbon nanoparticles in material sampled from the inner surface of the flame front. The existence of carbon nanoparticles in the sampled material is confirmed by the typical shapes of the absorption and fluorescence spectra reported in the literature. The band gap energy shows that the particles are made up of three to six aromatic rings. The size measurement by DLS likewise shows that the particles are below the 10 nm size range, and the DLS results are corroborated by a TEM image of the same material.
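
DLS infers particle size from the measured translational diffusion coefficient via the Stokes-Einstein relation; a short Python sketch (temperature and solvent viscosity are assumed values, not the paper's conditions):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(D, T=298.15, eta=8.9e-4):
    """Stokes-Einstein: d = k_B * T / (3 * pi * eta * D).

    D   -- measured diffusion coefficient, m^2/s
    T   -- absolute temperature, K (room temperature assumed)
    eta -- dynamic viscosity of the solvent, Pa*s (water assumed)
    """
    return K_B * T / (3 * math.pi * eta * D)

# A diffusion coefficient around 1e-10 m^2/s maps to a few nanometers:
print(f"{hydrodynamic_diameter(1e-10) * 1e9:.2f} nm")  # ~4.91 nm
```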

Keywords: Indoor air, carbon nanoparticles, LPG, partially premixed flame, optical techniques.

2162 Centralized Peak Consumption Smoothing Revisited for Habitat Energy Scheduling

Authors: M. Benbouzid, Q. Bresson, A. Duclos, K. Longo, Q. Morel

Abstract:

Electricity suppliers must currently predict the consumption of their customers in order to deduce the power they need to produce. It is therefore important, as a first step, to optimize household consumption and obtain flatter load curves by limiting peaks in energy consumption. Here, centralized real-time scheduling is proposed to manage appliances that start in parallel. The aim is not to exceed a certain power limit while optimizing consumption across a habitat. A Raspberry Pi serves as the energy box; the scheduler interacts with the various sensors over 6LoWPAN. At the scale of a single dwelling, household consumption decreases, particularly at the times corresponding to the peaks. However, it would be wiser to consider a whole residential complex so that the result is more significant; the ceiling would then no longer be fixed, and scheduling would be done on two scales, per dwelling and at the level of the residential complex.
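
The authors' scheduler is not reproduced; as a toy illustration of the idea (power cap, appliance list, and slot model are all invented), the Python sketch below delays appliance start-ups so that the aggregate draw never exceeds the ceiling:

```python
# Hypothetical appliances: (name, power draw in W, duration in time slots).
appliances = [("heater", 2000, 3), ("washer", 1500, 2), ("oven", 2500, 2)]
POWER_CAP = 4000  # ceiling in W, assumed for illustration
HORIZON = 10      # number of scheduling slots

def schedule(appliances, cap, horizon):
    """Greedy earliest-fit: start each appliance at the first slot where
    its draw fits under the cap for its whole duration."""
    load = [0] * horizon
    starts = {}
    for name, power, duration in appliances:
        for t in range(horizon - duration + 1):
            if all(load[t + k] + power <= cap for k in range(duration)):
                for k in range(duration):
                    load[t + k] += power
                starts[name] = t
                break
    return starts, load

starts, load = schedule(appliances, POWER_CAP, HORIZON)
print(starts)  # {'heater': 0, 'washer': 0, 'oven': 3}
```

The two-scale variant discussed above would apply the same check twice, once against a per-dwelling cap and once against the complex-wide cap.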

Keywords: Smart grid, Energy box, Scheduling, Gang Model, Energy consumption, Energy management system, and Wireless Sensor Network.

2161 Reliability Evaluation using Triangular Intuitionistic Fuzzy Numbers Arithmetic Operations

Authors: G. S. Mahapatra, T. K. Roy

Abstract:

Fuzzy sets are generally used to analyze fuzzy system reliability; here, intuitionistic fuzzy set theory is used instead. To analyze the fuzzy system reliability, the reliability of each component of the system is considered as a triangular intuitionistic fuzzy number. Triangular intuitionistic fuzzy numbers and their arithmetic operations are introduced, and expressions for computing the fuzzy reliability of a series system and a parallel system whose component reliabilities follow triangular intuitionistic fuzzy numbers are described. As an example, an imprecise reliability model of the electric network of a dark room is taken; to compute the imprecise reliability of this system, the reliability of each component is represented by a triangular intuitionistic fuzzy number. A corresponding numerical example is presented.
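
The paper's full triangular intuitionistic arithmetic is not reproduced; the Python sketch below handles only the membership triangles (the non-membership triangle is treated analogously) with the usual componentwise approximations for product and complement:

```python
def tfn_mul(a, b):
    """Approximate product of two triangular fuzzy numbers (l, m, u)."""
    return tuple(x * y for x, y in zip(a, b))

def tfn_complement(a):
    """1 - (l, m, u), reversing the bounds so that l <= m <= u holds."""
    l, m, u = a
    return (1 - u, 1 - m, 1 - l)

def series_reliability(components):
    """Series system: R = product of the component reliabilities."""
    r = (1.0, 1.0, 1.0)
    for c in components:
        r = tfn_mul(r, c)
    return r

def parallel_reliability(components):
    """Parallel system: R = 1 - product of (1 - R_i)."""
    q = (1.0, 1.0, 1.0)
    for c in components:
        q = tfn_mul(q, tfn_complement(c))
    return tfn_complement(q)

# Hypothetical component reliabilities as triangular fuzzy numbers:
r1, r2 = (0.85, 0.90, 0.95), (0.80, 0.88, 0.93)
print(series_reliability([r1, r2]))    # (0.68, 0.792, 0.8835)
print(parallel_reliability([r1, r2]))  # (0.97, 0.988, 0.9965)
```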

Keywords: Fuzzy set, intuitionistic fuzzy number, system reliability, triangular intuitionistic fuzzy number.

2160 Investigation of a Hybrid Process: Multipoint Incremental Forming

Authors: Safa Boudhaouia, Mohamed Amen Gahbiche, Eliane Giraud, Wacef Ben Salem, Philippe Dal Santo

Abstract:

Multi-point forming (MPF) and asymmetric incremental sheet forming (ISF) are two flexible processes for sheet metal manufacturing. To take advantage of both techniques, a hybrid process has been developed: multipoint incremental forming (MPIF). This process combines the advantages of each of these forming techniques, which makes it very interesting and particularly efficient for single-part, small, and medium series production. In this paper, an experimental and numerical investigation of this technique is presented. To highlight the flexibility of the process and its capacity to manufacture standard and complex shapes, several parts were produced using MPIF. The forming experiments were performed on a 3-axis CNC machine. Moreover, a numerical model of the MPIF process was implemented in ABAQUS, and the analysis showed good agreement with experimental results in terms of deformed shape. Furthermore, the use of an elastomeric interpolator avoids classical local defects such as dimples, which are generally caused by asymmetric contact, and also improves the distribution of residual strain. Future work will apply this approach to other alloys used in aeronautic or automotive applications.

Keywords: Incremental forming, numerical simulation, MPIF, multipoint forming.

2159 Object-Centric Process Mining Using Process Cubes

Authors: Anahita Farhang Ghahfarokhi, Alessandro Berti, Wil M.P. van der Aalst

Abstract:

Process mining provides ways to analyze business processes. Common process mining techniques consider the process as a whole. However, in real-life business processes, different behaviors exist that make the overall process too complex to interpret. Process comparison is a branch of process mining that isolates the different behaviors of a process from each other by using process cubes. Process cubes organize event data along different dimensions, and each cell contains a set of events that can be used as input to process mining techniques. Existing work on process cubes assumes a single case notion. However, in real processes, several case notions (e.g., order, item, package) are intertwined. Object-centric process mining is a new branch of process mining that addresses multiple case notions in a process. To build a bridge between object-centric process mining and process comparison, we propose a process cube framework that supports process cube operations such as slice and dice on object-centric event logs. To facilitate the comparison, the framework is integrated with several object-centric process discovery approaches.
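
The framework's operations on true object-centric event logs are richer than this, but the core of slice and dice can be illustrated on a flattened event table (hypothetical columns and values, using pandas):

```python
import pandas as pd

# Hypothetical event log flattened to one row per event.
log = pd.DataFrame({
    "activity":  ["create", "pack", "ship", "create", "pack"],
    "case_type": ["order", "item", "package", "order", "item"],
    "month":     ["2021-01", "2021-01", "2021-02", "2021-02", "2021-02"],
})

# Slice: fix one dimension to a single value, dropping that dimension.
item_slice = log[log["case_type"] == "item"]

# Dice: restrict several dimensions to subsets of their values.
dice = log[log["case_type"].isin(["order", "item"])
           & log["month"].isin(["2021-02"])]

# Each resulting cell is a sub-log that a discovery algorithm can consume.
print(item_slice)
print(dice.groupby(["case_type", "month"]).size())
```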

Keywords: Process mining, multidimensional process mining, multi-perspective business processes, OLAP, process cubes, process discovery.

2158 Using Different Aspects of the Signings for Appearance-based Sign Language Recognition

Authors: Morteza Zahedi, Philippe Dreuw, Thomas Deselaers, Hermann Ney

Abstract:

Sign language is used by deaf and hard-of-hearing people for communication. Automatic sign language recognition is a challenging research area, since sign language is often the only way of communication for deaf people. Sign language comprises different components of visual actions made by the signer using the hands, the face, and the torso to convey meaning. To use these different aspects of signs, we combine different groups of features extracted from image frames recorded directly by a stationary camera. We combine the features at two levels by employing three techniques. At the feature level, an early feature combination can be performed by concatenating and weighting different feature groups, or by concatenating feature groups over time and using LDA to choose the most discriminant elements. At the model level, a late fusion of differently trained models can be carried out by a log-linear model combination. In this paper, we investigate these three combination techniques in an automatic sign language recognition system and show that the recognition rate can be significantly improved.
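
Of the three techniques, the model-level one is the simplest to state: log-linear combination weights the log-probabilities of independently trained models. A minimal sketch (two hypothetical models, illustrative sign classes and weights, not the paper's system):

```python
import math

def log_linear_combine(model_scores, weights):
    """Combine per-class posteriors from several models:
    score(c) = sum_i w_i * log p_i(c | x); return the argmax class."""
    classes = model_scores[0].keys()
    combined = {
        c: sum(w * math.log(scores[c])
               for w, scores in zip(weights, model_scores))
        for c in classes
    }
    return max(combined, key=combined.get)

# Hypothetical posteriors for three sign classes from two feature models.
appearance_model = {"HOUSE": 0.5, "BOOK": 0.3, "CAR": 0.2}
trajectory_model = {"HOUSE": 0.2, "BOOK": 0.7, "CAR": 0.1}
print(log_linear_combine([appearance_model, trajectory_model],
                         weights=[0.4, 0.6]))  # -> BOOK
```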

Keywords: American sign language, appearance-based features, Feature combination, Sign language recognition

2157 Improved Network Construction Methods Based on Virtual Rails for Mobile Sensor Network

Authors: Noritaka Shigei, Kazuto Matsumoto, Yoshiki Nakashima, Hiromi Miyajima

Abstract:

Although mobile wireless sensor networks (MWSNs), which consist of mobile sensor nodes (MSNs), can cover a wide observation region using a small number of sensor nodes, they need to construct a network that collects the sensing data at the base station by moving the MSNs. As an effective method, a network construction method based on Virtual Rails (VRs), referred to as the VR method, has been proposed. In this paper, we propose two effective techniques for the VR method. They can prolong the operation time of the network, which is limited by the MSNs' battery capacities and energy consumption. The first technique, an effective arrangement of VRs, almost equalizes the number of MSNs belonging to each VR. The second technique, an adaptive movement method for MSNs, takes the residual battery energy into account. In simulation, we demonstrate that each technique can improve the network lifetime and that the combination of both techniques is the most effective.
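
As a toy illustration of the two techniques (node counts and energy values invented, not the authors' algorithms): the first balances MSN membership across VRs, and the second lets the node with the most residual energy take the next move:

```python
def balance_assignment(num_msns, num_vrs):
    """Technique 1: round-robin so each VR gets an almost equal share."""
    assignment = {vr: [] for vr in range(num_vrs)}
    for msn in range(num_msns):
        assignment[msn % num_vrs].append(msn)
    return assignment

def pick_mover(msns):
    """Technique 2: the MSN with the most residual energy moves next,
    spreading the movement cost across the network over time."""
    return max(msns, key=lambda m: m["residual_energy"])

print(balance_assignment(10, 3))  # {0: [0, 3, 6, 9], 1: [1, 4, 7], ...}
msns = [{"id": 1, "residual_energy": 0.8},
        {"id": 2, "residual_energy": 0.5},
        {"id": 3, "residual_energy": 0.9}]
print(pick_mover(msns)["id"])  # -> 3
```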

Keywords: Wireless sensor network, mobile sensor node, relay of sensing data, virtual rail, residual energy.

2156 Optimization of Acid Treatments by Assessing Diversion Strategies in Carbonate and Sandstone Formations

Authors: Ragi Poyyara, Vijaya Patnana, Mohammed Alam

Abstract:

When acid is pumped into damaged reservoirs for damage removal/stimulation, the acid flows unevenly into the formation because it preferentially travels into highly permeable regions rather than low-permeability regions, or, in general, along the path of least resistance. This can lead to poor zonal coverage and hence warrants diversion to achieve an effective placement of the acid. Diversion is, desirably, a reversible technique of temporarily reducing the permeability of high-permeability zones, thereby forcing the acid into lower-permeability zones. The uniqueness of each reservoir can pose several challenges to engineers attempting to devise optimum and effective diversion strategies. Diversion techniques include mechanical placement and/or chemical diversion of treatment fluids, further sub-classified into ball sealers, bridge plugs, packers, particulate diverters, viscous gels, crosslinked gels, relative permeability modifiers (RPMs), foams, and/or the use of placement techniques such as coiled tubing (CT) and the maximum pressure difference and injection rate (MAPDIR) methodology. It is not always realized that the effectiveness of diverters greatly depends on reservoir properties, such as formation type, temperature, reservoir permeability, heterogeneity, and physical well characteristics (e.g., completion type, well deviation, length of treatment interval, multiple intervals, etc.). This paper reviews the mechanisms by which each variety of diverter functions and discusses the effect of various reservoir properties on the efficiency of diversion techniques. Guidelines are recommended to help enhance productivity from zones of interest by choosing the best methods of diversion while pumping an optimized amount of treatment fluid. The success of an overall acid treatment often depends on the effectiveness of the diverting agents.

Keywords: Acid treatment, carbonate, diversion, sandstone.

2155 Pollutants Removal from Synthetic Wastewater by the Combined Electrochemical Sequencing Batch Reactor

Authors: Amin Mojiri, Akiyoshi Ohashi, Tomonori Kindaichi

Abstract:

Synthetic domestic wastewater was treated by combining treatment methods, including electrochemical oxidation, adsorption, and a sequencing batch reactor (SBR). In the upper part of the reactor, an anode and a cathode (Ti/RuO2-IrO2) were arranged in parallel for the electrochemical oxidation step. Sodium sulfate (Na2SO4) at a concentration of 2.5 g/L was used as the electrolyte, and the voltage and current were fixed at 7.50 V and 0.40 A, respectively. Then, 15% of the reactor's working volume was filled with activated sludge and the remaining 85% with synthetic wastewater. Powdered cockleshell, 1.5 g/L, was added to the reactor to perform ion exchange. Response surface methodology was employed for the statistical analysis, with reaction time (h) and pH considered as the independent factors. In total, 97.0% of biochemical oxygen demand, 99.9% of phosphorus, and 88.6% of cadmium were eliminated at the optimum reaction time (80.0 min) and pH (6.4).

Keywords: Adsorption, electrochemical oxidation, metals, sequencing batch reactor.

2154 An Off-the-Shelf Scheme for Dependable Grid Systems Using Virtualization

Authors: Toshinori Takabatake

Abstract:

Recently, grid computing has received wide attention in the science, industry, and business fields, which require vast amounts of computing. Grid computing provides an environment in which many nodes (i.e., many computers) are connected with each other through a local/global network and made available to many users. In this environment, to achieve data processing among nodes for any application, each node performs mutual authentication by using certificates published by the Certificate Authority (CA, for short). However, if a failure or fault occurs in the CA, no new certificates can be published, and as a result a new node cannot participate in the grid environment. In this paper, an off-the-shelf scheme for dependable grid systems using virtualization techniques is proposed and its implementation is verified. The proposed approach uses virtualization techniques to restart an application, e.g., the CA, if it has failed, so that the system can tolerate a failure or fault occurring in the CA. Since the proposed scheme is easily implemented at the application level, its implementation cost for the system builder is low compared with other methods. Simulation results show that the CA in the system can recover from its failure or fault.
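
The restart idea can be sketched as a generic process watchdog (Python; the launch command is a placeholder, and the paper's actual mechanism restarts the application inside a virtualized environment):

```python
import subprocess
import time

CA_COMMAND = ["./run-ca.sh"]  # placeholder for the CA service launcher
CHECK_INTERVAL = 5            # seconds between liveness checks

def supervise(command, check_interval):
    """Start the service and restart it whenever it exits."""
    process = subprocess.Popen(command)
    while True:
        time.sleep(check_interval)
        if process.poll() is not None:  # the process has terminated
            print(f"service exited with {process.returncode}; restarting")
            process = subprocess.Popen(command)

if __name__ == "__main__":
    supervise(CA_COMMAND, CHECK_INTERVAL)
```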

Keywords: grid computing, restarting application, certificate authority, virtualization, dependability.

2153 A Consistency Protocol Multi-Layer for Replicas Management in Large Scale Systems

Authors: Ghalem Belalem, Yahya Slimani

Abstract:

Large-scale systems such as computational grids are distributed computing infrastructures that can provide globally available network resources. The evolution of information processing systems in data grids is characterized by a strong decentralization of data across several sites, with the objective of ensuring the availability and reliability of the data and thereby providing fault tolerance and scalability, which is only possible through replication techniques. Unfortunately, the use of these techniques has a high cost, because consistency must be maintained between the distributed replicas. Nevertheless, agreeing to live with certain imperfections can improve the performance of the system by increasing concurrency. In this paper, we propose a multi-layer protocol combining the pessimistic and optimistic approaches, conceived for data consistency maintenance in large-scale systems. Our approach is based on a hierarchical representation model with three layers, and its objective is twofold: first, it makes it possible to reduce response times compared to a completely pessimistic approach; second, it improves the quality of service compared to an optimistic approach.

Keywords: Data Grid, replication, consistency, optimistic approach, pessimistic approach.

2152 A Normalization-based Robust Image Watermarking Scheme Using SVD and DCT

Authors: Say Wei Foo, Qi Dong

Abstract:

Digital watermarking is one of the techniques for copyright protection. In this paper, a normalization-based robust image watermarking scheme that encompasses singular value decomposition (SVD) and discrete cosine transform (DCT) techniques is proposed. In the proposed scheme, the host image is first normalized to a standard form and divided into non-overlapping image blocks, and SVD is applied to each block. By concatenating the first singular values (SV) of adjacent blocks of the normalized image, an SV block is obtained. DCT is then carried out on the SV blocks to produce SVD-DCT blocks. A watermark bit is embedded in the high-frequency band of an SVD-DCT block by imposing a particular relationship between two pseudo-randomly selected DCT coefficients. An adaptive frequency mask is used to adjust the local watermark embedding strength. Watermark extraction mainly involves the inverse process; the extraction method is blind and efficient. Experimental results show that the quality degradation of the watermarked image caused by the embedded watermark is visually transparent. Results also show that the proposed scheme is robust against various image processing operations and geometric attacks.
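
A minimal sketch of the embedding step, assuming fixed coefficient indices and margin (the paper selects the coefficient pair pseudo-randomly and adapts the strength with a frequency mask):

```python
import numpy as np
from scipy.fft import dct, idct

def embed_bit(sv_block, bit, i=3, j=5, margin=2.0):
    """Embed one watermark bit in a 1-D block of singular values by
    imposing an order relation between two DCT coefficients."""
    c = dct(sv_block, norm="ortho")
    if bit == 1 and c[i] < c[j] + margin:
        c[i], c[j] = c[j] + margin, c[i]   # enforce c[i] > c[j]
    elif bit == 0 and c[j] < c[i] + margin:
        c[i], c[j] = c[j], c[i] + margin   # enforce c[j] > c[i]
    return idct(c, norm="ortho")

def extract_bit(sv_block, i=3, j=5):
    """Blind extraction: read the relation back, no original needed."""
    c = dct(sv_block, norm="ortho")
    return 1 if c[i] > c[j] else 0

svs = np.array([180.0, 95.0, 60.0, 42.0, 30.0, 25.0, 18.0, 12.0])
print(extract_bit(embed_bit(svs, 1)))  # -> 1
```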

Keywords: Image watermarking, image normalization, singular value decomposition, discrete cosine transform, robustness.

2151 A Frugal Bidding Procedure for Replicating WWW Content

Authors: Samee Ullah Khan, C. Ardil

Abstract:

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, at certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize overall performance. A bidding mechanism asks how one can design systems so that the agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.
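
The paper's mechanism itself is not reproduced; the sketch below shows one generic sealed-bid round in Python (sites and benefit figures invented) in which each agent bids its expected benefit from hosting a replica and the winner pays the second-highest bid, a classic way to make selfish agents bid truthfully:

```python
# Hypothetical benefit each site's agent expects from hosting a replica
# of one data object (e.g., saved communication cost).
bids = {"site_A": 42.0, "site_B": 57.5, "site_C": 31.0}

def second_price_auction(bids):
    """Vickrey-style round: the highest bidder wins the replica but
    pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

print(second_price_auction(bids))  # ('site_B', 42.0)
```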

Keywords: Internet, data content replication, static allocation, mechanism design, equilibrium.

2150 TS Fuzzy Controller to Stochastic Systems

Authors: Joabe Silva, Ginalber Serra

Abstract:

This paper proposes the analysis and design of robust fuzzy control for stochastic parametric uncertain linear systems. The system to be controlled is partitioned into several linear sub-models, in terms of transfer functions, forming a convex polytope, similar to an LPV (Linear Parameter Varying) system. Once the linear sub-models of the plant are defined, they are organized into a fuzzy Takagi-Sugeno (TS) structure. Based on the Parallel Distributed Compensation (PDC) strategy, a mathematical formulation is defined in the frequency domain, based on gain and phase margin specifications, to obtain robust PI sub-controllers in accordance with the Takagi-Sugeno fuzzy model of the plant. The main results of the paper are robust stability conditions, with the proposal of one axiom and two theorems.
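
In the standard PDC formulation (a general statement of the blending law, not the paper's specific design), each rule contributes a local PI sub-controller and the global control signal is the membership-weighted blend:

```latex
u(t) = \sum_{i=1}^{r} h_i(z(t))\, u_i(t),
\qquad
u_i(t) = K_{P,i}\, e(t) + K_{I,i} \int_0^t e(\tau)\, d\tau,
\qquad
\sum_{i=1}^{r} h_i(z(t)) = 1,
```

where $h_i(z(t))$ is the normalized activation degree of rule $i$ and $r$ is the number of linear sub-models.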

Keywords: Fuzzy systems, robust stability, stochastic control, stochastic process.

2149 Genetic Programming: Principles, Applications and Opportunities for Hydrological Modelling

Authors: Oluwaseun K. Oyebode, Josiah A. Adeyemo

Abstract:

Hydrological modelling plays a crucial role in the planning and management of water resources, most especially in water-stressed regions where the need to effectively manage the available water resources is of critical importance. However, due to the complex, nonlinear and dynamic behaviour of hydro-climatic interactions, achieving reliable modelling of water resource systems and accurate projection of hydrological parameters is extremely challenging. Although a significant number of modelling techniques (process-based and data-driven) have been developed and adopted in that regard, the field of hydrological modelling is still considered one that has progressed sluggishly over the past decades. This is mainly a result of the degree of uncertainty identified in the methodologies and results of the techniques adopted. In recent times, evolutionary computation (EC) techniques have been developed and introduced in response to the search for efficient and reliable means of providing accurate solutions to hydrological problems. This paper presents a comprehensive review of the underlying principles, methodological needs and applications of a promising evolutionary computation modelling technique, genetic programming (GP). It examines the specific characteristics of the technique that make it suitable for solving hydrological modelling problems. It discusses the opportunities inherent in the application of GP in water-related studies such as rainfall estimation, rainfall-runoff modelling, streamflow forecasting, sediment transport modelling, water quality modelling and groundwater modelling, among others. Furthermore, the means by which such opportunities could be harnessed in the near future are discussed. In all, a case is made for the full embrace of GP and its variants in hydrological modelling studies, so as to put in place strategies that would translate into meaningful progress in the modelling of water resource systems and positively influence decision-making by relevant stakeholders.

Keywords: Computational modelling, evolutionary algorithms, genetic programming, hydrological modelling.

2148 A Materialized View Approach to Support Aggregation Operations over Long Periods in Sensor Networks

Authors: Minsoo Lee, Julee Choi, Sookyung Song

Abstract:

The increasing interest in processing data created by sensor networks has evolved into approaches that implement sensor networks as databases. The aggregation operator, which calculates a value from a large group of data, such as an average or a sum, is an essential function that needs to be provided when implementing such sensor network databases. This work proposes adding a DURING clause to TinySQL to calculate values over a specific long period, and suggests a way to implement the aggregation service in sensor networks by applying the materialized view and incremental view maintenance techniques that are used in data warehouses. In sensor networks, data values are passed from child nodes to parent nodes, and an aggregation value is computed at the root node. Since such root nodes need to be memory-efficient and low-powered, recomputing aggregate values from all past and current data is problematic. Applying incremental view maintenance techniques can therefore reduce memory consumption and support fast computation of aggregate values.
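
Not the paper's TinySQL extension, but the core of incremental view maintenance for an average can be shown in a few lines: the root keeps only a running (sum, count) pair instead of all past readings (Python sketch, invented readings):

```python
class IncrementalAverage:
    """Materialized AVG maintained incrementally: store only the
    running (sum, count) instead of every past sensor reading."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def insert(self, value):
        """Fold a new reading into the view in O(1) time and memory."""
        self.total += value
        self.count += 1

    def value(self):
        return self.total / self.count if self.count else None

view = IncrementalAverage()
for reading in [21.5, 22.0, 21.8]:
    view.insert(reading)
print(round(view.value(), 2))  # 21.77
```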

Keywords: Aggregation, Incremental View Maintenance, Materialized view, Sensor Network.

2147 Contrast Enhancement of Color Images with Color Morphing Approach

Authors: Javed Khan, Aamir Saeed Malik, Nidal Kamel, Sarat Chandra Dass, Azura Mohd Affandi

Abstract:

Low-contrast images can result from wrong image acquisition settings or poor illumination conditions. Such images may not be visually appealing and can be difficult for feature extraction. Contrast enhancement of color images can be useful in the medical area for visual inspection. In this paper, a new technique is proposed to improve the contrast of color images. The RGB (red, green, blue) color image is transformed into normalized RGB color space, and adaptive histogram equalization (AHE) is applied to each of the three channels of this space. The corresponding channels in the original (low-contrast) image and in the contrast-enhanced image are then morphed together in proper proportions. The proposed technique is tested on seventy color images of acne patients, and the results are analyzed using cumulative variance and contrast improvement factor measures and compared with decorrelation stretch. Both subjective and quantitative analysis demonstrate that the proposed technique outperforms the other techniques.
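
A rough sketch of the pipeline (Python with scikit-image; the blending proportion and normalization details are assumptions, not the paper's tuned values):

```python
import numpy as np
from skimage import exposure

def enhance(rgb, alpha=0.6):
    """Normalized RGB -> per-channel adaptive histogram equalization
    -> morph (alpha-blend) with the original image."""
    rgb = rgb.astype(np.float64)
    # Normalized RGB: each channel divided by the per-pixel intensity sum.
    intensity = rgb.sum(axis=2, keepdims=True) + 1e-8
    norm = rgb / intensity
    # Adaptive histogram equalization on each normalized channel.
    ahe = np.stack([exposure.equalize_adapthist(norm[..., c])
                    for c in range(3)], axis=2)
    # Morph the original and equalized channels in the given proportion.
    original = rgb / 255.0
    return np.clip((1 - alpha) * original + alpha * ahe, 0.0, 1.0)

image = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
print(enhance(image).shape)  # (64, 64, 3)
```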

Keywords: Contrast enhancement, normalized RGB, adaptive histogram equalization, cumulative variance.

2146 Secure Hashing Algorithm and Advance Encryption Algorithm in Cloud Computing

Authors: Jaimin Patel

Abstract:

Cloud computing is one of the most significant developments among computing technologies. It provides flexibility to users, cost effectiveness, location independence, easy maintenance, multitenancy, drastic performance improvements, and increased productivity. On the other hand, there are also major issues, such as security. Since the cloud is a shared server environment, security is a major concern: it is important to protect users' private data, especially in e-commerce and social networks. In this paper, encryption algorithms such as the Advanced Encryption Standard (AES), their vulnerabilities, risk of attacks, optimal time and complexity management, and a comparison with other algorithms based on software implementation are presented. Encryption techniques to improve the performance of AES and to reduce risk are given. Secure Hash Algorithms, their vulnerabilities, software implementations, and risk of attacks are compared with other hashing algorithms, and the advantages and disadvantages of hashing versus encryption are discussed.
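
A minimal usage sketch of the two primitive families the paper surveys, using Python's standard hashlib and the third-party cryptography package (illustrative only, not the paper's analysis):

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

message = b"cloud object v1"

# SHA-256 digest: integrity checking / fingerprinting.
print(hashlib.sha256(message).hexdigest())

# AES-256-GCM: authenticated encryption (confidentiality + integrity).
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)  # 96-bit nonce; must never repeat under one key
aesgcm = AESGCM(key)
ciphertext = aesgcm.encrypt(nonce, message, None)
assert aesgcm.decrypt(nonce, ciphertext, None) == message
```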

Keywords: Cloud computing, encryption algorithm, secure hashing algorithm, brute force attack, birthday attack, plaintext attack, man-in-the-middle attack.

2145 HaskellFL: A Tool for Detecting Logical Errors in Haskell

Authors: Vanessa Vasconcelos, Mariza A. S. Bigonha

Abstract:

Understanding and using the functional paradigm is a challenge for many programmers, and looking for logical errors in code may take a lot of a developer's time when a program grows in size. In order to facilitate both processes, this paper presents HaskellFL, a tool that uses fault localization techniques to locate a logical error in Haskell code. The Haskell subset used in this work is sufficiently expressive for those studying functional programming to get immediate help debugging their code and to answer questions about key concepts associated with the functional paradigm. HaskellFL was tested against functional programming assignments submitted by students enrolled in the Functional Programming class at the Federal University of Minas Gerais and against exercises from the Exercism Haskell track that are publicly available on GitHub. This work also evaluated the effectiveness of two fault localization techniques, Tarantula and Ochiai, in the Haskell context. The EXAM score was chosen to evaluate the tool's effectiveness, and the results showed that HaskellFL reduced the effort needed to locate an error in all tested scenarios. The results also showed that the Ochiai method was more effective than Tarantula.
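
The two metrics are standard; in their usual formulation (Python sketch; the counts are per-statement spectra collected from the test runs):

```python
import math

def tarantula(failed_s, passed_s, total_failed, total_passed):
    """Tarantula suspiciousness of statement s."""
    fail_ratio = failed_s / total_failed if total_failed else 0.0
    pass_ratio = passed_s / total_passed if total_passed else 0.0
    denom = fail_ratio + pass_ratio
    return fail_ratio / denom if denom else 0.0

def ochiai(failed_s, passed_s, total_failed):
    """Ochiai suspiciousness of statement s."""
    denom = math.sqrt(total_failed * (failed_s + passed_s))
    return failed_s / denom if denom else 0.0

# A statement hit by 3 of 4 failing tests and 1 of 6 passing tests:
print(round(tarantula(3, 1, 4, 6), 3))  # 0.818
print(round(ochiai(3, 1, 4), 3))        # 0.75
```

Statements are then ranked by suspiciousness, and the EXAM score measures how far down that ranking the real fault sits.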

Keywords: Debug, fault localization, functional programming, Haskell.

2144 Advanced Geolocation of IP Addresses

Authors: Robert Koch, Mario Golling, Gabi Dreo Rodosek

Abstract:

Tracing and locating the geographical position of users (geolocation) is used extensively in today's Internet. Whenever we request a page from Google, for example, we are, unless a specific configuration has been made, automatically forwarded to the page in the relevant language and, among other things, shown commercials specific to the identified location. Geolocation has a significant impact especially within the area of network security. Because of the way the Internet works, attacks can be executed from almost anywhere; therefore, for attribution, knowledge of the origin of an attack, and thus geolocation, is mandatory in order to be able to trace back an attacker. In addition, geolocation can also be used very successfully to increase the security of a network during operation (i.e., before an intrusion has actually taken place). Similar to greylisting in email, geolocation allows one to (i) correlate detected attacks with new connections and (ii) consequently classify traffic a priori as more suspicious (thus allowing this traffic to be inspected in more detail). Although numerous techniques for geolocation exist, each strategy is subject to certain restrictions. Following the ideas of Endo et al., this publication tries to overcome these shortcomings with a combined solution of different methods to allow improved and optimized geolocation. Thus, we present our architecture for improved geolocation, designing a new algorithm that combines several geolocation techniques to increase accuracy.
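
The combination algorithm itself is the paper's contribution and is not reproduced; a toy version of the idea (invented coordinates and confidence weights) simply fuses the individual estimates into a weighted centroid:

```python
def combine_estimates(estimates):
    """Weighted centroid of (lat, lon, confidence) estimates produced
    by different geolocation techniques."""
    total = sum(w for _, _, w in estimates)
    lat = sum(la * w for la, _, w in estimates) / total
    lon = sum(lo * w for _, lo, w in estimates) / total
    return lat, lon

# e.g. a WHOIS record, RTT-based multilateration, and a GeoIP database:
estimates = [(48.14, 11.58, 0.2), (48.35, 11.78, 0.5), (48.10, 11.60, 0.3)]
print(combine_estimates(estimates))  # (48.233, 11.686)
```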

Keywords: IP geolocation, prosecution of computer fraud, attack attribution, target-analysis

2143 An Integrated Framework for the Realtime Investigation of State Space Exploration

Authors: Jörg Lassig, Stefanie Thiem

Abstract:

The objective of this paper is to introduce a unified optimization framework for research and education. The OPTILIB framework implements different general-purpose algorithms for combinatorial optimization and minimum search on standard continuous test functions. The strengths of this library are the straightforward integration of new optimization algorithms and problems, as well as the visualization of the optimization process of different methods exploring the search space, either exclusively or side by side in real time. Furthermore, the usage of several implemented methods is presented on the basis of two use cases, with a particular focus on algorithm visualization. First, it is demonstrated how different methods can be compared conveniently using OPTILIB on the example of different iterative improvement schemes for the TRAVELING SALESMAN PROBLEM. A second study emphasizes how the framework can be used to find global minima in the continuous domain.
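
As an example of the kind of iterative improvement scheme compared in the first use case, a plain 2-opt local search for the TSP (Python sketch with random cities; OPTILIB's own API is not shown here):

```python
import itertools
import math
import random

def tour_length(tour, cities):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, cities):
    """Iterative improvement: reverse a segment whenever that shortens
    the tour; stop when no improving move remains (a local optimum)."""
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(tour)), 2):
            candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            if tour_length(candidate, cities) < tour_length(tour, cities):
                tour, improved = candidate, True
    return tour

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(8)]
print(two_opt(list(range(8)), cities))
```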

Keywords: Global Optimization Heuristics, Particle Swarm Optimization, Ensemble Based Threshold Accepting, Ruin and Recreate

2142 Importance of Public Communication Campaigns and Art Activities in Social Education

Authors: Bilgehan Gültekin, Tuba Gültekin

Abstract:

Universities play an important role in social education in many respects. In terms of creating awareness of and convincing the public about social issues, universities take a leading position, and the best way to win public support for social education is to develop public communication campaigns. The aim of this study is to present a public communication model to guide social education practices. The study, titled "Importance of Public Communication Campaigns and Art Activities in Social Education", is based on the following topics: the effects of public communication campaigns on social education, public relations techniques for education, communication strategies, the steps of public relations campaigns in social education, creating persuasive messages for public communication campaigns, and developing artistic messages and organizing art activities in social education. In addition to these topics, media planning for social education, forming a team of campaign managers, dialogues with opinion leaders in education, and preparing creative communication models for social education are taken into consideration. This study also critically examines social education case studies in Turkey, and some communicative methods and principles are given in light of the communication campaigns discussed in this paper.

Keywords: Art activities in social education, Persuasive communication, Public communication campaigns, Public relations techniques for education

2141 Carbon-Based Electrochemical Detection of Pharmaceuticals from Water

Authors: M. Ardelean, F. Manea, A. Pop, J. Schoonman

Abstract:

The presence of pharmaceuticals in the environment, and especially in water, has gained increasing attention. They belong to an emerging class of pollutants, and for most of them legal limits have not been set, either because their impact on human health and the ecosystem has not been determined or because advanced analytical methods for their quantification are lacking. In this context, the development of various advanced analytical methods for the quantification of pharmaceuticals in water is required. Electrochemical methods are known to exhibit great potential for high-performance analysis, but their performance is directly related to the electrode material and the operating techniques. In this study, two types of carbon-based electrode materials, boron-doped diamond (BDD) and carbon nanofiber (CNF)-epoxy composite electrodes, were investigated through voltammetric techniques for the detection of naproxen in water. The comparative electrochemical behavior of naproxen (NPX) on both BDD and CNF electrodes was studied by cyclic voltammetry, and a well-defined peak corresponding to NPX oxidation was found for each electrode. NPX oxidation occurred at a potential of about +1.4 V/SCE (saturated calomel electrode) on the BDD electrode and at about +1.2 V/SCE on the CNF electrode. The sensitivities for NPX detection were similar for both carbon-based electrodes, and thus the CNF electrode exhibited superiority with respect to the detection potential. Differential-pulse voltammetry (DPV) and square-wave voltammetry (SWV) techniques were exploited to improve the electroanalytical performance for NPX detection, and the best results, with a sensitivity of 9.959 µA·µM-1, were achieved using DPV. In addition, the simultaneous detection of NPX and fluoxetine, a very common antidepressant drug also present in water, was studied using the CNF electrode, and very good results were obtained. The detection potential values, which allowed a good separation of the detection signals, together with the good sensitivities, were appropriate for the simultaneous detection of both tested pharmaceuticals. These results confirm the CNF electrode as a valuable tool for the individual and simultaneous detection of pharmaceuticals in water.
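
With a linear calibration, an unknown concentration follows directly from the measured peak current and the reported sensitivity (the example current below is invented for illustration):

```latex
c = \frac{I_p}{S}, \qquad
\text{e.g. } I_p = 49.8\ \mu\text{A},\;
S = 9.959\ \mu\text{A}\cdot\mu\text{M}^{-1}
\;\Rightarrow\; c \approx 5.0\ \mu\text{M}.
```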

Keywords: Boron-doped diamond electrode, carbon nanofiber-epoxy composite electrode, emerging pollutants, pharmaceuticals.

2140 Treatment of Petroleum Refinery Wastewater by using UASB Reactors

Authors: H.A. Gasim, S.R.M. Kutty, M.H. Isa, M.P.M. Isa

Abstract:

Petroleum refineries discharge, during the refining process, large amounts of wastewater containing hazardous constituents that are hard to degrade. The anaerobic treatment process is well known as an efficient method of degrading high-strength wastewaters, and the Up-flow Anaerobic Sludge Blanket (UASB) reactor is a common process used for various wastewater treatments. Two UASB reactors were set up and operated in parallel to evaluate the treatment efficiency of petroleum refinery wastewater. In this study, four organic volumetric loading rates were applied (i.e., 0.58, 0.89, 1.21, and 2.34 kg/m3·d), two loads to each reactor. Each load was applied for a period of 60 days for the reactor to acclimatize and reach steady state, after which the second load was applied. The chemical oxygen demand (COD) removals were satisfactory, with removal efficiencies at the applied loadings of 78, 82, 83, and 81%, respectively.
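
For reference, the organic volumetric loading rate ties the influent flow and COD to the reactor volume (the numbers below are illustrative, not the study's):

```latex
\mathrm{OLR} = \frac{Q \cdot \mathrm{COD}_{in}}{V},
\qquad
\text{e.g. } Q = 50\ \tfrac{\mathrm{L}}{\mathrm{d}},\;
\mathrm{COD}_{in} = 2400\ \tfrac{\mathrm{mg}}{\mathrm{L}},\;
V = 100\ \mathrm{L}
\;\Rightarrow\;
\mathrm{OLR} = 1.2\ \tfrac{\mathrm{kg\,COD}}{\mathrm{m}^3\cdot\mathrm{d}}.
```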

Keywords: Petroleum refinery wastewater, anaerobic treatment, UASB, organic volumetric loading rate

2139 Sensitivity of Input Blocking Capacitor on Output Voltage and Current of a PV Inverter Employing IGBTs

Authors: Z.A. Jaffery, Vinay Kumar Chandna, Sunil Kumar Chaudhary

Abstract:

This paper presents a MATLAB-SIMULINK model of a single-phase 2.5 kVA, 240 V RMS controlled PV VSI (photovoltaic voltage source inverter) using IGBTs (insulated gate bipolar transistors). The behavior of the output voltage, output current, and total harmonic distortion (THD) with variation of the input DC blocking capacitor (Cdc) has been analyzed for linear and non-linear loads. The values of Cdc suggested by other authors in their papers are not clearly defined, which makes selecting a proper value difficult, since the DC power stored in Cdc (generally placed in parallel with the battery) is used as the input to the VSI. The simulation results show the variation in the output voltage and current for different values of Cdc with linear and non-linear loads connected at the output side of the PV VSI, and suggest the selection of a suitable value of Cdc.
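
The THD reported by the model follows the standard definition over the harmonic spectrum of the output (the example values below are illustrative, not simulation results):

```latex
\mathrm{THD} = \frac{\sqrt{\sum_{n=2}^{\infty} V_n^2}}{V_1} \times 100\%,
\qquad
\text{e.g. } V_1 = 240\ \mathrm{V},\; V_3 = 12\ \mathrm{V},\; V_5 = 5\ \mathrm{V}
\;\Rightarrow\; \mathrm{THD} \approx 5.4\%.
```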

Keywords: DC Blocking capacitor, IGBTs, PV VSI, THD.

2138 A Fast Replica Placement Methodology for Large-scale Distributed Computing Systems

Authors: Samee Ullah Khan, C. Ardil

Abstract:

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, at certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize overall performance. A bidding mechanism asks how one can design systems so that the agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.

Keywords: Data replication, auctions, static allocation, pricing.

2137 Questions Categorization in E-Learning Environment Using Data Mining Technique

Authors: Vilas P. Mahatme, K. K. Bhoyar

Abstract:

Nowadays, education cannot be imagined without digital technologies, which broaden the horizons of teaching and learning processes. Several universities are offering online courses, and for evaluation purposes e-examination systems are being widely adopted in academic environments; multiple-choice tests are extremely popular. In moving away from traditional examinations to e-examination, Moodle is used as the Learning Management System (LMS), and Moodle logs every click that students make for attempting and navigational purposes in an e-examination. Data mining has been applied in various domains, including retail sales and bioinformatics, and in recent years there has been increasing interest in its use in e-learning environments, where it has been applied to discover, extract, and evaluate parameters related to students' learning performance. The combination of data mining and e-learning is still in its infancy. Log data generated by students during online examinations can be used to discover knowledge with the help of data mining techniques. In web-based applications, the number of right and wrong answers in the test result is not sufficient to assess and evaluate student performance, so assessment techniques must be intelligent enough: if a student cannot answer the question asked by the instructor, an easier question can be asked; otherwise, a more difficult question on a similar topic can be posed. To do so, it is necessary to identify the difficulty level of the questions, and the proposed work concentrates on this issue. Data mining techniques, specifically clustering, are used in this work. The method decides the difficulty levels of the questions, categorizes them as tough, easy, or moderate, and later serves them to students based on their performance. The proposed experiment categorizes the question set and also groups the students based on their performance in the examination, which will help the instructor to guide students more specifically. In short, the mined knowledge helps to support, guide, facilitate, and enhance learning as a whole.
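
A minimal version of the clustering step (scikit-learn k-means on invented per-question statistics; the real feature set would be mined from the Moodle logs):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-question statistics:
# [fraction of correct answers, average time spent in seconds].
questions = np.array([
    [0.92, 30], [0.88, 35], [0.55, 70],
    [0.60, 65], [0.20, 120], [0.15, 110],
])

# Cluster into three difficulty levels: easy, moderate, tough.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(questions)
for stats, label in zip(questions, km.labels_):
    print(stats, "-> cluster", label)
```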

Keywords: Data mining, e-examination, e-learning, moodle.
