Search results for: Mamdani Rule-Based FCMs (MBFCMs)
9218 Detection, Tracking and Classification of Vehicles and Aircraft based on Magnetic Sensing Technology
Authors: K. Dimitropoulos, N. Grammalidis, I. Gragopoulos, H. Gao, Th. Heuer, M. Weinmann, S. Voit, C. Stockhammer, U. Hartmann, N. Pavlidou
Abstract:
Existing ground movement surveillance technologies at airports are subject to limitations due to shadowing effects or multiple reflections. There is therefore a strong demand for a new sensing technology that is cost effective and provides detection of non-cooperative targets under any weather conditions. This paper presents a new intelligent system, developed within the framework of the EC-funded ISMAEL project, which is based on a new magnetic sensing technology and provides detection, tracking and automatic classification of targets moving on the airport surface. The system is currently being installed at two European airports. Initial experimental results under real airport traffic demonstrate the great potential of the proposed system.
Keywords: Air traffic management, magnetic sensors, multitracking, A-SMGCS.
9217 Advertisement Effectiveness for Print Media: A Conceptual Model
Authors: Prateek Maheshwari, Nitin Seth, Anoop Kumar Gupta
Abstract:
The objective of this paper is to highlight the importance of measuring advertisement effectiveness in print media and to develop a conceptual model for advertisement effectiveness. The developed model is based on the dimensions on which advertisement effectiveness depends and the dimensions used to measure that effectiveness. An in-depth and extensive literature review is carried out to understand the concept of advertisement effectiveness and its various determinants in the context of print media. Based on the insights gained, a conceptual framework for advertisement effectiveness is presented. The model is an attempt to uncover the relatively less explored area of advertisement effectiveness in the Indian advertising scenario. It is believed that the present work will encourage scholars and academicians to explore the area further and will offer conceptual assistance and a fresh direction in the domain of advertisement effectiveness.
Keywords: Advertisement Effectiveness, Conceptual Model, Effectiveness Dimensions, Print Media.
9216 A Content Vector Model for Text Classification
Authors: Eric Jiang
Abstract:
As a popular rank-reduced vector space approach, Latent Semantic Indexing (LSI) has been used in information retrieval and other applications. In this paper, an LSI-based content vector model for text classification is presented, which constructs multiple augmented category LSI spaces and classifies texts by their content. The model integrates the class-discriminative information from the training data and is equipped with several pertinent feature selection and text classification algorithms. The proposed classifier has been applied to email classification, and its experiments on a benchmark spam testing corpus (PU1) have shown that the approach is a competitive alternative to other email classifiers based on the well-known SVM and naïve Bayes algorithms.
Keywords: Feature Selection, Latent Semantic Indexing, Text Classification, Vector Space Model.
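The per-category LSI spaces described above can be illustrated with a small sketch (not the paper's implementation): each category's TF-IDF matrix is reduced with a truncated SVD, and a new document is assigned to the category whose latent subspace preserves most of it. The toy corpus, labels and rank are placeholders, not the PU1 data.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Toy corpus standing in for the PU1 e-mail data used in the paper.
train_docs = [
    "cheap pills win money now", "limited offer win cash prize",
    "project meeting agenda attached", "please review the attached report",
]
train_labels = ["spam", "spam", "ham", "ham"]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(train_docs)

# Build one low-rank LSI space per category.
svd_by_class = {}
for label in set(train_labels):
    rows = [i for i, y in enumerate(train_labels) if y == label]
    svd = TruncatedSVD(n_components=1, random_state=0)  # tiny rank for the toy data
    svd.fit(X[rows])
    svd_by_class[label] = svd

def classify(text):
    """Assign the category whose LSI subspace preserves most of the document's norm."""
    x = vectorizer.transform([text])
    scores = {}
    for label, svd in svd_by_class.items():
        x_hat = svd.inverse_transform(svd.transform(x))
        scores[label] = np.linalg.norm(x_hat) / (np.linalg.norm(x.toarray()) + 1e-12)
    return max(scores, key=scores.get)

print(classify("win a cash prize now"))  # expected to land in the "spam" space
```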
9215 Analysis of Textual Data Based On Multiple 2-Class Classification Models
Authors: Shigeaki Sakurai, Ryohei Orihara
Abstract:
This paper proposes a new method for analyzing textual data. The method deals with items of textual data, where each item is described from various viewpoints. It acquires 2-class classification models of the viewpoints by applying an inductive learning method to items with multiple viewpoints. Using these models, the method infers whether or not the viewpoints are assigned to new items. It then extracts expressions from the new items classified into the viewpoints and identifies characteristic expressions corresponding to the viewpoints by comparing the frequency of expressions among the viewpoints. The paper also applies the method to questionnaire data given by guests at a hotel and verifies its effect through numerical experiments.
Keywords: Text mining, Multiple viewpoints, Differential analysis, Questionnaire data
9214 Effect on Bandwidth of Using Double Substrates Based Metamaterial Planar Antenna
Authors: Smrity Dwivedi
Abstract:
The present paper reveals the effect of double substrates on the bandwidth performance of planar antennas. The choice of material is important for obtaining minimum return loss and improved directivity, and double substrates are used here to enhance efficiency in terms of antenna gain. The metamaterial-based antenna has a specific structure that improves its performance. The improved return loss is -20 dB and the voltage standing wave ratio (VSWR) is 1.2, which is better than the single substrate with a return loss of -15 dB and a VSWR of 1.4. Complete results are obtained using the commercial software CST Microwave Studio.
Keywords: Metamaterials, return loss, standing wave ratio, directivity, CST microwave studio.
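The quoted return loss and VSWR figures are consistent with the standard relation between return loss, reflection coefficient and VSWR; the short check below (not from the paper) reproduces them.

```python
# Standard RF relations: |Gamma| = 10**(-RL_dB/20), VSWR = (1+|Gamma|)/(1-|Gamma|).
def vswr_from_return_loss(rl_db):
    gamma = 10 ** (-rl_db / 20.0)      # magnitude of reflection coefficient
    return (1 + gamma) / (1 - gamma)

print(round(vswr_from_return_loss(20.0), 2))  # double substrate: -20 dB -> ~1.22
print(round(vswr_from_return_loss(15.0), 2))  # single substrate: -15 dB -> ~1.43
```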
9213 Estimation of Thermal Conductivity of Nanofluids Using MD-Stochastic Simulation Based Approach
Authors: Sujoy Das, M. M. Ghosh
Abstract:
The thermal conductivity of a fluid can be significantly enhanced by dispersing nano-sized particles in it, and the resultant fluid is termed a "nanofluid". A theoretical model for estimating the thermal conductivity of a nanofluid is proposed here. It is based on the mechanism that evenly dispersed nanoparticles within a nanofluid undergo Brownian motion, in the course of which the nanoparticles repeatedly collide with the heat source. During each collision a rapid heat transfer occurs owing to the solid-solid contact. Molecular dynamics (MD) simulation of the collision of nanoparticles with the heat source has shown that there is a pulse-like pick-up of heat by the nanoparticles within 20-100 ps, the extent of which depends not only on the thermal conductivity of the nanoparticles, but also on their elastic and other physical properties. After the collision the nanoparticles undergo Brownian motion in the base fluid and release the excess heat to the surrounding base fluid within 2-10 ms. The Brownian motion and associated temperature variation of the nanoparticles have been modeled by stochastic analysis. Repeated occurrence of these events by the suspended nanoparticles significantly contributes to the characteristic thermal conductivity of the nanofluid, which has been estimated by the present model for an ethylene glycol based nanofluid containing Cu nanoparticles of size ranging from 8 to 20 nm, with a Gaussian size distribution. The prediction of the present model shows reasonable agreement with the experimental data available in the literature.
Keywords: Brownian dynamics, Molecular dynamics, Nanofluid, Thermal conductivity.
9212 Imposter Detection Based on Location in Vehicular Ad-Hoc Network
Authors: Sanjoy Das, Akash Arya, Rishi Pal Singh
Abstract:
A Vehicular Ad hoc Network is essentially a solution to several problems that arise while vehicles are travelling on the road. In this paper, we focus on the detection of an imposter node that has stolen the ID of an authenticated vehicle in the network, with the purpose of harming the network through imposter messages. We propose a protocol named Imposter Detection based on Location (IDBL), which stores the location coordinates of each vehicle as the key to the authenticity of its messages so that imposter nodes can be detected. An imposter node sends messages from a stolen ID and presents them as coming from an authentic node. To detect this anomaly, the reported location is checked first; if it differs from the original vehicle's location, the node is identified as an imposter. We implemented the algorithm in Java, tested various types of node distribution, and observed the detection probability of the imposter node.
Keywords: Authentication, detection, IDBL protocol, imposter node, node detection.
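A minimal sketch of the location-based check described in the abstract (the actual IDBL protocol details are not reproduced here): a message is flagged when the claimed position is implausibly far from the position last recorded under the same vehicle ID. The speed bound and coordinates are assumptions for illustration.

```python
import math

last_position = {}  # vehicle_id -> (x, y) in metres, built up from received messages

MAX_PLAUSIBLE_SPEED = 60.0  # m/s; assumed bound, not a value from the paper

def is_imposter(vehicle_id, claimed_xy, dt):
    """Return True if the claimed location cannot be reached from the last
    stored location within the elapsed time dt (seconds)."""
    if vehicle_id not in last_position:
        last_position[vehicle_id] = claimed_xy
        return False
    x0, y0 = last_position[vehicle_id]
    x1, y1 = claimed_xy
    distance = math.hypot(x1 - x0, y1 - y0)
    if distance > MAX_PLAUSIBLE_SPEED * dt:
        return True                      # location inconsistent with the stored key
    last_position[vehicle_id] = claimed_xy
    return False

print(is_imposter("V42", (0.0, 0.0), 1.0))      # first sighting: accepted
print(is_imposter("V42", (5000.0, 0.0), 1.0))   # 5 km jump in 1 s: flagged
```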
9211 A Proposed Approach for Emotion Lexicon Enrichment
Authors: Amr Mansour Mohsen, Hesham Ahmed Hassan, Amira M. Idrees
Abstract:
Document analysis is an important research field that aims to gather information by analyzing the data in documents. Since understanding what people actually want is an important target for many fields, sentiment analysis has become one of the vital fields tightly related to document analysis. This research focuses on analyzing text documents to classify each document according to its opinion. The aim is to detect emotions in text documents by enriching the lexicon, adapting its content based on semantic pattern extraction. The proposed approach is presented, and experiments from different perspectives are applied to reveal the positive impact of the proposed approach on the classification results.
Keywords: Document analysis, sentiment analysis, emotion detection, WEKA tool, NRC Lexicon.
9210 Autonomous Control of a Mobile Manipulator
Authors: Shonal Singh, Bibhya Sharma, Jito Vanualailai
Abstract:
This paper considers the design of a motion planner that simultaneously accomplishes control and motion planning of an n-link nonholonomic mobile manipulator, wherein an n-link holonomic manipulator is coupled with a nonholonomic mobile platform, within an obstacle-ridden environment. This planner, derived from a Lyapunov-based control scheme, generates collision-free trajectories from an initial configuration to a final configuration in a constrained environment cluttered with stationary solid objects of different shapes and sizes. We demonstrate the efficiency of the control scheme and the resulting acceleration controllers of the mobile manipulator through computer simulations of an interesting scenario.
Keywords: Artificial potential fields, Lyapunov-based control scheme, Lyapunov stability, nonholonomic manipulator, minimum distance technique, kinodynamic constraints.
9209 A Genetic Based Algorithm to Generate Random Simple Polygons Using a New Polygon Merge Algorithm
Authors: Ali Nourollah, Mohsen Movahedinejad
Abstract:
In this paper, a new algorithm to generate random simple polygons from a given set of points in a two-dimensional plane is designed. The proposed algorithm uses a genetic algorithm to generate polygons with few vertices. A new merge algorithm is presented which converts any two polygons into a simple polygon: it first changes the two polygons into a polygonal chain and then converts the polygonal chain into a simple polygon. The process of converting a polygonal chain into a simple polygon is based on the removal of intersecting edges. The experimental results show that the proposed algorithm is able to generate a great number of different simple polygons and performs better than well-known algorithms such as space partitioning and steady growth.
Keywords: Divide and conquer, genetic algorithm, merge polygons, Random simple polygon generation.
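The merge step hinges on detecting intersecting edges; the sketch below (illustrative, not the paper's code) shows the standard orientation test used to decide whether two edges of a polygonal chain cross. Collinear overlaps are ignored for brevity.

```python
def ccw(a, b, c):
    """Positive if a -> b -> c turns counter-clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def edges_intersect(p1, p2, p3, p4):
    """True if segments p1-p2 and p3-p4 properly cross (collinear overlaps ignored)."""
    d1, d2 = ccw(p3, p4, p1), ccw(p3, p4, p2)
    d3, d4 = ccw(p1, p2, p3), ccw(p1, p2, p4)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def has_self_intersection(chain):
    """Scan a polygonal chain for any pair of non-adjacent crossing edges."""
    edges = list(zip(chain, chain[1:]))
    for i in range(len(edges)):
        for j in range(i + 2, len(edges)):
            if edges_intersect(*edges[i], *edges[j]):
                return True
    return False

print(has_self_intersection([(0, 0), (4, 4), (4, 0), (0, 4)]))  # True: the chain crosses itself
```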
9208 Bank Loans and the Business Cycle: The Case of the Czech Republic
Authors: Libena Cernohorska, Jan Cernohorsky
Abstract:
This article aims to evaluate the impact of loans provided within the Czech banking sector on the growth of the Czech economy. The article is based on research into current scientific findings with respect to bank loans and economic development. It uses data from the Czech Statistical Office on the development of gross domestic product and data from the Czech National Bank on the development of loans for the period 2004-2015. Links between the selected variables are tested using Granger causality tests. The calculated results confirm the hypothesis that loans affect economic growth, with a six-month delay. The results thus correspond to standard economic findings and to the results of most previous studies.
Keywords: Bank, business cycle, economic growth, loans.
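The Granger causality test used above can be reproduced in outline with statsmodels; the series below are synthetic stand-ins (the study used Czech Statistical Office GDP data and Czech National Bank loan data), and the lag count is a placeholder.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic stand-in data with loan growth leading GDP growth by two periods.
rng = np.random.default_rng(0)
loans = rng.normal(size=60).cumsum()
gdp = np.roll(loans, 2) + rng.normal(scale=0.5, size=60)

data = pd.DataFrame({"gdp_growth": np.diff(gdp), "loan_growth": np.diff(loans)})

# Tests whether the second column (loan growth) Granger-causes the first (GDP growth).
results = grangercausalitytests(data[["gdp_growth", "loan_growth"]], maxlag=4)
```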
9207 A Robust Data Hiding Technique based on LSB Matching
Authors: Emad T. Khalaf, Norrozila Sulaiman
Abstract:
Many researchers are working on information hiding techniques, using different ideas and areas to hide their secret data. This paper introduces a robust technique for hiding secret data in an image based on LSB insertion and RSA encryption. The key of the proposed technique is to encrypt the secret data. The encrypted data are then converted into a bit stream and divided into a number of segments, and the cover image is divided into the same number of segments. Each data segment is compared with each image segment to find the best matching segment, in order to create a new random sequence of segments that is then inserted into the cover image. Experimental results show that the proposed technique has a high security level and produces better stego-image quality.
Keywords: Steganography, LSB matching, RSA encryption, data segments.
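A minimal LSB-insertion sketch (the RSA encryption and segment-matching steps of the proposed technique are deliberately omitted): the secret bit stream simply replaces the least significant bits of the cover image pixels.

```python
import numpy as np

def embed_lsb(cover, payload_bits):
    """Write payload bits into the least significant bits of a uint8 cover image.
    In the paper the payload would first be RSA-encrypted and split into segments
    matched to image segments; this sketch skips those steps."""
    flat = cover.flatten().copy()
    if len(payload_bits) > flat.size:
        raise ValueError("payload too large for cover image")
    flat[: len(payload_bits)] = (flat[: len(payload_bits)] & 0xFE) | payload_bits
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    return stego.flatten()[:n_bits] & 1

cover = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
stego = embed_lsb(cover, bits)
print(np.array_equal(extract_lsb(stego, len(bits)), bits))  # True: payload recovered
```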
9206 Evolutionary Algorithm Based Centralized Congestion Management for Multilateral Transactions
Authors: T. Mathumathi, S. Ganesh, R. Gunabalan
Abstract:
This work presents an AC load flow based centralized model for congestion management in forward markets. In this model, each transaction maximizes its profit within the limits of the transmission line capacities allocated by the Independent System Operator (ISO). The voltage and reactive power impacts on the system are also incorporated in the model. A genetic algorithm is used to solve the centralized congestion management problem for multilateral transactions. The results obtained for the centralized model using the genetic algorithm are compared with the Sequential Quadratic Programming (SQP) technique. The statistical performance of the algorithms, such as the best, worst, mean and standard deviation of social welfare, is given. Simulation results clearly demonstrate the better performance of the genetic algorithm over SQP.
Keywords: Congestion management, Genetic algorithm, Sequential quadratic programming.
9205 Taguchi-Based Optimization of Surface Roughness and Dimensional Accuracy in Wire EDM Process with S7 Heat Treated Steel
Authors: Joseph C. Chen, Joshua Cox
Abstract:
This research focuses on the use of the Taguchi method to reduce the surface roughness and improve the dimensional accuracy of parts machined by Wire Electrical Discharge Machining (EDM) with S7 heat treated steel. Due to its high impact toughness, the material is a candidate for a wide variety of tooling applications which require high precision in dimension and the desired surface roughness. This paper demonstrates that the Taguchi Parameter Design methodology is able to optimize both dimensional accuracy and surface roughness by investigating seven controllable wire-EDM parameters: pulse on time (ON), pulse off time (OFF), servo voltage (SV), voltage (V), servo feed (SF), wire tension (WT), and wire speed (WS). The temperature of the water in the wire EDM process is investigated as the noise factor. Experimental design and analysis based on L18 Taguchi orthogonal arrays are conducted. The Taguchi-based system enables the wire EDM process to produce (1) high precision parts with an average dimension of 0.6601 inches, against a desired dimension of 0.6600 inches; and (2) a surface roughness of 1.7322 microns, significantly improved from 2.8160 microns.
Keywords: Taguchi parameter design, surface roughness, dimensional accuracy, Wire EDM.
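Signal-to-noise ratios are the quantities a Taguchi analysis maximizes; the helpers below show the two standard formulations relevant here (smaller-the-better for roughness, nominal-the-best for hitting the 0.6600-inch target). The replicate readings are placeholders, not the paper's measurements.

```python
import numpy as np

def sn_smaller_the_better(y):
    """S/N = -10*log10(mean(y^2)); used for responses such as surface roughness."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

def sn_nominal_the_best(y):
    """S/N = 10*log10(mean^2 / variance); used for hitting a target dimension."""
    y = np.asarray(y, dtype=float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

roughness = [1.74, 1.71, 1.76]            # microns, illustrative replicate readings
dimension = [0.6601, 0.6600, 0.6602]      # inches, illustrative replicate readings
print(sn_smaller_the_better(roughness), sn_nominal_the_best(dimension))
```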
9204 Study of Fly Ash Geopolymer Based Composites with Polyester Waste Addition
Authors: Konstantinos Sotiriadis, Olesia Mikhailova
Abstract:
In the present work, fly ash geopolymer based composites including polyester (PES) waste were studied. Specimens of three compositions were prepared: (a) fly ash geopolymer with 5% PES waste; (b) fly ash geopolymer mortar with 5% PES waste; (c) fly ash geopolymer mortar with 6.25% PES waste. Compressive and bending strength measurements, a water absorption test and determination of the thermal conductivity coefficient were performed. The results showed that the addition of sand in a mixture of geopolymer with 5% PES content led to higher compressive strength, while it increased water absorption and reduced the thermal conductivity coefficient. The increase of PES addition in geopolymer mortars resulted in a denser structure, indicated by the increase of strength and thermal conductivity and the decrease of water absorption.
Keywords: Fly ash, geopolymers, polyester waste, composites.
9203 Genetic Algorithm Based Deep Learning Parameters Tuning for Robot Object Recognition and Grasping
Authors: Delowar Hossain, Genci Capi
Abstract:
This paper concerns the problem of deep learning parameter tuning using a genetic algorithm (GA) in order to improve the performance of the deep learning (DL) method. We present a GA-based DL method for robot object recognition and grasping. The GA is used to optimize the DL parameters during the learning procedure with respect to a fitness function. After the evolution process finishes, we obtain the optimal set of DL parameters. To evaluate the performance of our method, we consider object recognition and robot grasping tasks. Experimental results show that our method is efficient for robot object recognition and grasping.
Keywords: Deep learning, genetic algorithm, object recognition, robot grasping.
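The GA tuning loop can be sketched as follows; the fitness function is a stand-in for training and validating the DL network on the recognition/grasping data, and the parameter ranges are assumptions, not values from the paper.

```python
import random

random.seed(0)

def evaluate(params):
    """Stand-in fitness: in the paper this would train the DL model with the
    given parameters and return its validation accuracy."""
    lr, hidden = params
    return -((lr - 0.01) ** 2) - ((hidden - 128) / 256.0) ** 2  # toy surrogate

def random_individual():
    return [random.uniform(1e-4, 1e-1), random.randint(16, 512)]

def mutate(ind):
    return [max(1e-4, ind[0] * random.uniform(0.5, 2.0)),
            max(16, min(512, ind[1] + random.randint(-32, 32)))]

population = [random_individual() for _ in range(20)]
for generation in range(30):
    population.sort(key=evaluate, reverse=True)
    parents = population[:5]                               # truncation selection
    children = []
    while len(children) < 15:
        a, b = random.sample(parents, 2)
        child = [random.choice(pair) for pair in zip(a, b)]  # uniform crossover
        children.append(mutate(child))
    population = parents + children

best = max(population, key=evaluate)
print("best learning rate %.4f, hidden units %d" % (best[0], best[1]))
```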
9202 Traffic Forecasting for Open Radio Access Networks Virtualized Network Functions in 5G Networks
Authors: Khalid Ali, Manar Jammal
Abstract:
In order to meet the stringent latency and reliability requirements of the upcoming 5G networks, Open Radio Access Networks (O-RAN) have been proposed. The virtualization of O-RAN has allowed it to be treated as a Network Function Virtualization (NFV) architecture, while its components are considered Virtualized Network Functions (VNFs). Hence, intelligent Machine Learning (ML) based solutions can be utilized to apply different resource management and allocation techniques to O-RAN. However, intelligently allocating resources for O-RAN VNFs can prove challenging due to the dynamicity of traffic in mobile networks. Network providers need to dynamically scale the allocated resources in response to the incoming traffic. Elastically allocating resources can provide a higher level of flexibility in the network, in addition to reducing the OPerational EXpenditure (OPEX) and increasing resource utilization. Most existing elastic solutions are reactive in nature, even though proactive approaches are more agile since they scale instances ahead of time by predicting the incoming traffic. In this work, we propose and evaluate traffic forecasting models based on ML algorithms. The algorithms aim at predicting future O-RAN traffic by using previous traffic data. A detailed analysis of the traffic data was carried out to validate the quality and applicability of the traffic dataset. Two ML models were then proposed and evaluated based on their prediction capabilities.
Keywords: O-RAN, traffic forecasting, NFV, ARIMA, LSTM, elasticity.
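One of the two forecasting models listed in the keywords, ARIMA, can be applied to a traffic series as sketched below; the series, model order and horizon are illustrative, not the O-RAN dataset or the tuned configuration from the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic hourly traffic with a daily cycle, standing in for O-RAN VNF traffic.
rng = np.random.default_rng(1)
hours = pd.date_range("2023-01-01", periods=24 * 14, freq="h")
traffic = 100 + 30 * np.sin(2 * np.pi * np.arange(len(hours)) / 24) + rng.normal(0, 5, len(hours))
series = pd.Series(traffic, index=hours)

model = ARIMA(series, order=(2, 0, 1))   # (p, d, q) chosen for the example only
fit = model.fit()
forecast = fit.forecast(steps=24)        # predict the next day to scale VNFs proactively
print(forecast.head())
```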
9201 Evaluation of Clustering Based on Preprocessing in Gene Expression Data
Authors: Seo Young Kim, Toshimitsu Hamasaki
Abstract:
Microarrays have become effective, broadly used tools in biological and medical research to address a wide range of problems, including the classification of disease subtypes and tumors. Many statistical methods are available for analyzing and systematizing these complex data into meaningful information, and one of the main goals in analyzing gene expression data is the detection of samples or genes with similar expression patterns. In this paper, we assess and compare the performance of several clustering methods based on data preprocessing, including normalization and noise-removal strategies. We also evaluate each of these clustering methods with validation measures for both simulated data and real gene expression data. The results show that clustering methods commonly used in microarray data analysis are affected by normalization and by the degree of noise in the datasets.
Keywords: Gene expression, clustering, data preprocessing.
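The comparison of clustering under different preprocessing strategies can be outlined as below; the expression matrix is simulated, and the silhouette score stands in for the validation measures mentioned in the abstract.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Simulated expression matrix: 60 samples x 200 genes, two underlying groups.
rng = np.random.default_rng(0)
group_a = rng.normal(0.0, 1.0, size=(30, 200))
group_b = rng.normal(1.5, 1.0, size=(30, 200))
X = np.vstack([group_a, group_b])
X_noisy = X + rng.normal(0, 2.0, size=X.shape)   # added noise to mimic a low-quality dataset

for name, data in [("raw", X_noisy), ("normalized", StandardScaler().fit_transform(X_noisy))]:
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)
    print(name, round(silhouette_score(data, labels), 3))
```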
9200 New Wavelet-Based Superresolution Algorithm for Speckle Reduction in SAR Images
Authors: Mario Mastriani
Abstract:
This paper describes a novel projection algorithm, the Projection Onto Span Algorithm (POSA), for wavelet-based superresolution and for removing speckle of unknown variance (in the wavelet domain) from Synthetic Aperture Radar (SAR) images. Although POSA is valuable as a new superresolution algorithm for image enhancement, image metrology and biometric identification, here it is used as a despeckling tool; this is the first time that a super-resolution algorithm is used for despeckling SAR images. Specifically, the speckled SAR image is decomposed into wavelet subbands, POSA is applied to the high subbands, and a SAR image is reconstructed from the modified detail coefficients. Experimental results demonstrate that the new method compares favorably to several other despeckling methods on test SAR images.
Keywords: Projection, speckle, superresolution, synthetic aperture radar, thresholding, wavelets.
9199 Study of the Effect of Inclusion of TiO2 in Active Flux on Submerged Arc Welding of Low Carbon Mild Steel Plate and Parametric Optimization of the Process by Using DEA Based Bat Algorithm
Authors: Sheetal Kumar Parwar, J. Deb Barma, A. Majumder
Abstract:
Submerged arc welding is a very complex, efficient and high performance welding process. In the present study, an attempt has been made to reduce welding distortion by an increased amount of oxide flux through TiO2 in the submerged arc welding process. Care has been taken to avoid excessive amounts of the additive in order to attain significant results. A Data Envelopment Analysis (DEA) based BAT algorithm is used for parametric optimization, in which DEA converts the multiple response parameters into a single response parameter. The present study also shows the effectiveness of the addition of TiO2 in the active flux during the submerged arc welding process.
Keywords: BAT algorithm, design of experiment, optimization, submerged arc welding.
9198 Information Security Risk Management in IT-Based Process Virtualization: A Methodological Design Based on Action Research
Authors: Jefferson Camacho Mejía, Jenny Paola Forero Pachón, Luis Carlos Gómez Flórez
Abstract:
Action research is a qualitative research methodology which leads the researcher to delve into the problems of a community in order to understand its needs in depth and, finally, to propose actions that lead to a change of social paradigm. Although this methodology had its beginnings in the human sciences, it has attracted increasing interest and acceptance in the field of information systems research since the 1990s. The countless possibilities offered nowadays by the use of Information Technologies (IT) in the development of different socio-economic activities have meant a change of social paradigm and the emergence of the so-called information and knowledge society. Accordingly, governments, large corporations, small entrepreneurs and, in general, organizations of all kinds are using IT to virtualize their processes, taking them from the physical environment to the digital environment. However, there is a potential risk for organizations related to exposing valuable information without an appropriate framework for protecting it. This paper shows progress in the development of a methodological design to manage the information security risks associated with IT-based process virtualization by applying the principles of the action research methodology; it is the result of a systematic review of the scientific literature. The design consists of seven fundamental stages, distributed across the three stages described in the action research methodology: 1) observe, 2) analyze and 3) take action. Finally, this paper offers an alternative tool to traditional information security management methodologies, intended to be applied specifically in the planning stage of IT-based process virtualization in order to foresee risks and to establish security controls before formulating IT solutions in any type of organization.
Keywords: Action research, information security, information technology, methodological design, process virtualization, risk management.
9197 Ground Response Analyses in Budapest Based on Site Investigations and Laboratory Measurements
Authors: Zsolt Szilvágyi, Jakub Panuska, Orsolya Kegyes-Brassai, Ákos Wolf, Péter Tildy, Richard P. Ray
Abstract:
Near-surface loose sediments and local ground conditions in general have a major influence on the seismic response of structures. It is a difficult task to model ground behavior in seismic soil-structure-foundation interaction problems, to fully account for it in the seismic design of structures, or even to properly consider it in seismic hazard assessment. In this study, we focused on applying seismic soil investigation methods, used for determining soil stiffness and damping properties, to the response analysis used in seismic design. A site in Budapest, Hungary was investigated using Multichannel Analysis of Surface Waves, Seismic Cone Penetration Tests, Bender Elements, Resonant Column and Torsional Shear tests. Our aim was to compare the results of the different test methods and use the resulting soil properties for 1D ground response analysis. Often in practice, little to no data are available on dynamic soil properties, and estimated parameters are used for design. Therefore, a comparison is made between results based on estimated parameters and those based on detailed investigations. Ground response results are also compared to Eurocode 8 design spectra.
Keywords: Bender element, ground response analysis, MASW, resonant column test, SCPT, torsional shear test.
9196 Feature Subset Selection approach based on Maximizing Margin of Support Vector Classifier
Authors: Khin May Win, Nan Sai Moon Kham
Abstract:
Identification of cancer genes that might anticipate clinical behavior across different types of cancer is challenging due to the huge number of genes and the small number of patient samples. A new method is proposed based on supervised classification learning with support vector machines (SVMs). The new solution is described by the introduction of the Maximized Margin (MM) into the subset criterion, which makes it possible to approach the minimum generalization error rate. In the class prediction problem, gene selection is essential to improve accuracy and to identify genes relevant to the cancer disease. The performance of the new method was evaluated in a real-world data experiment; it gives better accuracy for classification.
Keywords: Microarray data, feature selection, recursive feature elimination, support vector machines.
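Margin-based gene selection in the spirit described above is commonly implemented as recursive feature elimination around a linear SVM; the sketch below (with synthetic data instead of microarray profiles) shows that pattern, not the paper's exact MM criterion.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE

# Synthetic stand-in for microarray data: 40 patients x 500 genes,
# with only the first 10 genes carrying class information.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 500))
y = (X[:, :10].sum(axis=1) > 0).astype(int)

selector = RFE(estimator=SVC(kernel="linear", C=1.0),
               n_features_to_select=10, step=0.1)
selector.fit(X, y)
print(np.flatnonzero(selector.support_))   # indices of the retained genes
```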
9195 Simulation of Low Cycle Fatigue Behaviour of Nickel-Based Alloy at Elevated Temperatures
Authors: Harish Ramesh Babu, Marco Böcker, Mario Raddatz, Sebastian Henkel, Horst Biermann, Uwe Gampe
Abstract:
Thermal power machines are subjected to cyclic loading conditions at elevated temperatures. Under these extreme conditions, the durability of the components has a significant influence on the design. The mechanical behaviour of the material has to be known in detail for a fail-safe construction. For this study a nickel-based alloy is considered, and the deformation and fatigue behaviour of the material is analysed under cyclic loading. A viscoplastic model is used to calculate the deformation behaviour as well as to simulate the rate-dependent and cyclic plasticity effects. Finally, the cyclic deformation results of the finite element simulations are compared with low cycle fatigue (LCF) experiments.
Keywords: Complex low cycle fatigue, elevated temperatures, IN718, viscoplastic.
9194 Shear Behaviour of RC Deep Beams with Openings Strengthened with Carbon Fiber Reinforced Polymer
Authors: Mannal Tariq
Abstract:
The construction industry is making progress at a high pace, and the worldwide trend is increasingly towards high-rise buildings. Deep beams, which have a small span-to-depth ratio, are one of the most common elements in modern construction and are mostly used as transfer girders. This experimental study consists of 16 reinforced concrete (RC) deep beams. These beams were divided into two groups, A and B, of eight beams each, having depths of 381 mm (15 in) and 457 mm (18 in) respectively. Each group was further subdivided into four subgroups, each consisting of two identical beams: a solid/control beam (without opening), and beams with an opening above the neutral axis (NA), at the NA and below the NA. Except for the control beams, all beams with openings were strengthened with vertical carbon fibre reinforced polymer (CFRP) strips. The eight subgroups thus differ from each other based on depth and on the location of the circular openings. For testing, all beams were loaded with two symmetrical point loads, and all beams were designed based on the strut and tie model concept. The experimental investigation elaborates the differences in the shear behaviour of deep beams based on depth and opening location: the 457 mm (18 in) deep beams with openings above the NA showed the highest strength, and the 381 mm (15 in) deep beams with openings below the NA showed the least strength. The CFRP sheets played a vital role in increasing the shear capacity of the beams.
Keywords: CFRP, deep beams, openings in deep beams, strut and tie model, shear behaviour.
9193 A Graph-Based Approach for Placement of No-Replicated Databases in Grid
Authors: Cherif Haddad, Faouzi Ben Charrada
Abstract:
In a wide-area environment such as a Grid, data placement is an important aspect of distributed database systems. In this paper, we address the problem of the initial placement of non-replicated database fragments in a Grid architecture. We propose a graph-based approach that considers resource restrictions. The goal is to optimize the use of computing, storage and communication resources. The proposed approach is developed in two phases: in the first phase, we perform fragment grouping using knowledge about fragment dependencies and, in the second phase, we determine an efficient placement of the fragment groups on the Grid. We also show, via experimental analysis, that our approach gives solutions that are close to optimal for different databases and Grid configurations.
Keywords: Grid computing, distributed systems, data resources management, database systems, database placement.
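The first phase, fragment grouping from dependency knowledge, can be pictured as finding connected components in a dependency graph; the fragments and dependency edges below are hypothetical, and the second (placement) phase is not shown.

```python
import networkx as nx

# Hypothetical fragments and dependency edges (e.g. fragments frequently queried together).
dependencies = [("F1", "F2"), ("F2", "F3"), ("F4", "F5")]

graph = nx.Graph()
graph.add_nodes_from(["F1", "F2", "F3", "F4", "F5", "F6"])
graph.add_edges_from(dependencies)

# Phase 1: each connected component becomes a fragment group to be placed
# on a Grid node in phase 2 (placement not shown here).
groups = [sorted(component) for component in nx.connected_components(graph)]
print(groups)   # e.g. [['F1', 'F2', 'F3'], ['F4', 'F5'], ['F6']]
```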
9192 Locating Critical Failure Surface in Rock Slope Stability with Hybrid Model Based on Artificial Immune System and Cellular Learning Automata (CLA-AIS)
Authors: Ramin Javadzadeh, Emad Javadzadeh
Abstract:
Locating the critical slip surface with the minimum factor of safety for a rock slope is a difficult problem. In recent years, some modern global optimization methods have been developed and applied with success to various types of problems, but very few of them have been applied to rock mechanics problems. In this paper, the use of a hybrid model based on an artificial immune system and cellular learning automata is proposed. The results show that the algorithm is an effective and efficient optimization method with a high confidence level.
Keywords: CLA-AIS, failure surface, optimization methods, rock slope.
9191 Diagnosis of Inter Turn Fault in the Stator of Synchronous Generator Using Wavelet Based ANFIS
Authors: R. Rajeswari, N. Kamaraj
Abstract:
In this paper, a wavelet-based ANFIS for detecting inter-turn faults in a generator is proposed. The detector uniquely responds to a winding inter-turn fault with remarkably high sensitivity. Discrimination of the different percentages of winding affected by the inter-turn fault is provided via an ANFIS with an eight-dimensional input vector. This input vector is obtained from features extracted from the DWT of the inter-turn fault current leaving the generator phase winding. Training data for the ANFIS are generated via a simulation of a generator with an inter-turn fault using MATLAB. The proposed ANFIS-based algorithm gives more satisfactory performance than an ANN using selected statistical data from the decomposed levels of the faulty current.
Keywords: Winding inter-turn fault, ANN, ANFIS, DWT.
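The DWT feature extraction feeding the ANFIS can be sketched as below: the phase current is decomposed and simple statistics of the detail coefficients form the input vector. The signal is synthetic, and the two statistics per level are assumptions (the abstract does not specify the exact eight features used).

```python
import numpy as np
import pywt

def dwt_features(signal, wavelet="db4", level=4):
    """Decompose the current signal and return per-level statistics of the
    detail coefficients as a feature vector (2 features x 4 levels = 8 values)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    features = []
    for detail in coeffs[1:]:                 # skip the approximation coefficients
        features.append(np.std(detail))
        features.append(np.sum(detail ** 2))  # energy of the subband
    return np.array(features)

# Synthetic stand-in for a faulty phase current: 50 Hz fundamental plus a high-frequency burst.
t = np.linspace(0, 0.2, 2000)
current = np.sin(2 * np.pi * 50 * t)
current[800:850] += 0.5 * np.sin(2 * np.pi * 1000 * t[800:850])
print(dwt_features(current))
```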
9190 Radar Task Schedulers based on Multiple Queue
Authors: María I. Jiménez, Alberto Izquierdo, Juan J. Villacorta, Lara del Val, Mariano Raboso
Abstract:
There are very complex communication systems, such as the multifunction array radar, MFAR (Multi-Function Array Radar), in which many functions are integrated: the classic tracking and surveillance functions are performed simultaneously with all the functions related to communication, countermeasures, and calibration. All these functions are divided into tasks to be executed. The task scheduler is a key element of the radar, since it plans and distributes the energy and time resources to be shared and used by all tasks. This paper presents schedulers based on the use of multiple queues. Several schedulers have been designed and studied, and a comparative analysis of the different schedulers has been carried out. The tests and experiments have been done by means of system software simulation. Finally, a suitable set of radar characteristics has been selected to evaluate the behavior of the task scheduler.
Keywords: Queue theory, radar, scheduler, task.
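A multiple-queue task scheduler of the kind studied can be sketched as one FIFO queue per task class, always serving the highest-priority non-empty queue; the task classes and their ordering below are assumptions for illustration, not the schedulers evaluated in the paper.

```python
from collections import deque

# One FIFO queue per task class, listed from highest to lowest priority.
queues = {
    "calibration": deque(),
    "tracking": deque(),
    "surveillance": deque(),
}

def submit(task_class, task):
    queues[task_class].append(task)

def next_task():
    """Serve the highest-priority non-empty queue (simple multiple-queue policy)."""
    for task_class in queues:                 # dict preserves the priority order
        if queues[task_class]:
            return task_class, queues[task_class].popleft()
    return None

submit("surveillance", "sector scan 3")
submit("tracking", "update track 17")
submit("tracking", "update track 9")
print(next_task())   # ('tracking', 'update track 17') -- tracking outranks surveillance
```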
9189 A Life Cycle Assessment (LCA) of Aluminum Production Process
Authors: Alaa Al Hawari, Mohammad Khader, Wael El Hasan, Mahmoud Alijla, Ammar Manawi, Abdelbaki Benamour
Abstract:
The production of aluminum alloys and ingots – starting from the processing of alumina to aluminum, and ending with the final cast product – was studied using a Life Cycle Assessment (LCA) approach. The studied aluminum supply chain consisted of a carbon plant, a reduction plant, a casting plant, and a power plant. In the LCA model, the environmental loads of the different plants for the production of 1 ton of aluminum metal were investigated, and the impact of aluminum production was assessed in eight impact categories. The results showed that the power plant had the highest impact in all of the impact categories, except for Human Toxicity Potential (HTP), where the reduction plant had the highest impact, and Marine Aquatic Eco-Toxicity Potential (MAETP), where the carbon plant had the highest impact. Furthermore, the combined impact of the carbon plant and the reduction plant was almost the same as the impact of the power plant in the case of Acidification Potential (AP). The carbon plant had a positive impact on the environment with regard to Eutrophication Potential (EP), due to the production of clean water in the process. The natural gas based power plant used in the case study had 8.4 times less negative impact on the environment when compared to a heavy fuel based power plant and 10.7 times less negative impact when compared to a hard coal based power plant.
Keywords: Life cycle assessment, aluminum production, Supply chain.