Search results for: Data Structures
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8500

6970 What Managers Think of Informal Networks and Knowledge Sharing by Means of Personal Networking?

Authors: Mahmood Q.K. Ghaznavi, Martin Perry, Paul Toulson, Keri Logan

Abstract:

The importance of nurturing, accumulating, and efficiently deploying knowledge resources through formal structures and organisational mechanisms is well understood. Recent trends in knowledge management (KM) highlight that the effective creation and transfer of knowledge can also rely upon extra-organisational channels, such as informal networks. The perception exists that the role of informal networks in knowledge creation and performance has been underestimated in the organisational context. The literature indicates that many managers fail to comprehend and successfully exploit the potential role of informal networks to create value for their organisations. This paper investigates: 1) whether managers share work-specific knowledge with informal contacts within and outside organisational boundaries; and 2) how important they consider this knowledge collaboration to be for their learning and work outcomes.

Keywords: Informal network, knowledge management, knowledge sharing, performance.

6969 Tree Based Data Aggregation to Resolve Funneling Effect in Wireless Sensor Network

Authors: G. Rajesh, B. Vinayaga Sundaram, C. Aarthi

Abstract:

In a wireless sensor network, sensor nodes periodically transmit the sensed data to the sink node over multi-hop communication. This high traffic induces congestion at the nodes located one hop from the sink node. The packet transmission and reception rates of these nodes are very high compared to the other sensor nodes in the network. Therefore, the energy consumption of these nodes is very high, an effect known as the “funneling effect”. The tree-based data aggregation technique (TBDA) is used to reduce the energy consumption of these nodes. The overall performance shows a considerable decrease in the number of packet transmissions to the sink node. The proposed scheme, TBDA, avoids the funneling effect and extends the lifetime of the wireless sensor network. The average-case time complexity for inserting a node in the tree is O(n log n), and the worst-case time complexity is O(n²).

Keywords: Data Aggregation, Funneling Effect, Traffic Congestion, Wireless Sensor Network.

6968 Sleep Scheduling Schemes Based on Location of Mobile User in Sensor-Cloud

Authors: N. Mahendran, R. Priya

Abstract:

Mobile cloud computing (MCC) combined with wireless sensor network (WSN) technology is attracting increasing attention from researchers because it combines the data-gathering ability of sensors with the data-processing capacity of the cloud. This approach overcomes the limited data storage capacity and computational ability of sensor nodes; the stored data are then sent to mobile users on request. Most integrated sensor-cloud schemes overlook two considerations: 1) mobile users request specific data from the cloud based on their present location; and 2) power consumption matters because most sensors are equipped with non-rechargeable batteries and are typically deployed in hazardous and remote areas. This paper addresses these observations and introduces a collaborative location-based sleep scheduling (CLSS) scheme. The awake or asleep status of each sensor node is determined dynamically by schedulers, purely on the basis of the mobile users’ current location; in this manner, a large amount of energy consumption in the WSN is avoided. CLSS comprises two variants: CLSS1 provides lower energy consumption, while CLSS2 provides scalability and robustness of the integrated WSN.

Keywords: Sleep scheduling, mobile cloud computing, wireless sensor network, integration, location, network lifetime.

6967 Particle Swarm Optimization Approach on Flexible Structure at Wiper Blade System

Authors: A. Zolfagharian, M.Z. Md. Zain, A. R. AbuBakar, M. Hussein

Abstract:

The application of flexible structures in industry and aerospace missions has increased significantly owing to their unique advantages over rigid counterparts. In this paper, the vibration of a flexible structure, an automobile wiper blade, is analyzed and controlled. The wiper generates unwanted noise and vibration while wiping rain and other particles from the windshield, producing annoying noise in different frequency ranges. A two-dimensional analytical model of the wiper blade, whose accuracy has been verified by numerical studies in the literature, is considered in this study. Particle swarm optimization (PSO) is employed together with the input shaping (IS) technique to attenuate the amplitude of the unwanted noise and vibration of the wiper blade.
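
To make the optimization step concrete, the sketch below is a minimal particle swarm optimizer in Python, applied to a stand-in cost surface; the function name, swarm parameters, and the two-parameter cost are illustrative assumptions, not the authors' tuned PSO/input-shaping setup.

```python
import numpy as np

def pso(cost, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer; defaults are illustrative only."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))   # particle positions
    v = np.zeros_like(x)                                    # particle velocities
    pbest = x.copy()
    pbest_cost = np.apply_along_axis(cost, 1, x)
    gbest = pbest[np.argmin(pbest_cost)]
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.apply_along_axis(cost, 1, x)
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[np.argmin(pbest_cost)]
    return gbest, pbest_cost.min()

# Hypothetical use: tune two shaper parameters against a stand-in cost surface.
best, best_cost = pso(lambda p: (p[0] - 0.6) ** 2 + (p[1] - 0.4) ** 2,
                      bounds=[(0.0, 1.0), (0.0, 1.0)])
print(best, best_cost)
```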

Keywords: Input shaping, noise reduction, particle swarm optimization, wiper blade.

6966 Impact of Safety and Quality Considerations of Housing Clients on the Construction Firms’ Intention to Adopt Quality Function Deployment: A Case of Construction Sector

Authors: Saif Ul Haq

Abstract:

The current study examines the safety and quality considerations of clients of housing projects and their impact on the adoption of Quality Function Deployment (QFD) by construction firms. A mixed-methods research technique was used to collect and analyze the data: a survey was conducted to collect data from 220 clients of housing projects in Saudi Arabia, and telephonic and Skype interviews were then conducted with 15 professionals working in the top ten real estate companies of Saudi Arabia. Data were analyzed using partial least squares (PLS) and thematic analysis techniques. Findings reveal that today’s customers prioritize the safety and quality requirements of their houses and, as a result, construction firms adopt QFD to address those needs. The findings are of great importance for the clients of housing projects as well as for construction firms, which can apply QFD in housing projects to address the safety and quality concerns of their clients.

Keywords: Construction industry, quality considerations, quality function deployment, safety considerations.

6965 Tourism Satellite Account: Approach and Information System Development

Authors: Pappas Theodoros, Michael Diakomichalis

Abstract:

Measuring the economic impact of tourism in a benchmark economy is a global concern, with previous measurements being partial and not fully integrated. Tourism is a phenomenon defined by the individual consumption of visitors, and it should be observed and measured to reveal its overall contribution to an economy. The Tourism Satellite Account (TSA) is a critical tool for assessing the annual growth of tourism, providing reliable measurements. This article presents a TSA information system that encompasses all TSA functions, including the input, storage, management, and analysis of data, as well as additional future functions, thereby enhancing the efficiency of tourism data management and the utility of TSA compilation. The methodology and results presented offer new insights for the development and implementation of TSA.

Keywords: Tourism Satellite Account, information system, data-based tourist account.

6964 Input Data Balancing in a Neural Network PM-10 Forecasting System

Authors: Suk-Hyun Yu, Heeyong Kwon

Abstract:

Recently, PM-10 has become a social and global issue. It is one of the major air pollutants affecting human health, so it needs to be forecasted rapidly and precisely. However, PM-10 comes from various emission sources, and its concentration is largely dependent on meteorological and geographical factors at both local and global scales, so forecasting PM-10 concentration is very difficult. A neural network model can be used in this case; however, there are few instances of high PM-10 concentration, which makes training the neural network difficult. In this paper, we suggest a simple input balancing method for cases where the data distribution is uneven, based on the probability of appearance of the data. Experimental results show that input balancing makes the neural networks’ learning easier and improves the forecasting rates.
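
As an illustration of input balancing based on the probability of appearance, the following sketch resamples training examples with probability inversely proportional to the empirical frequency of their PM-10 concentration bin; the binning scheme and function name are assumptions, not the authors' implementation.

```python
import numpy as np

def balance_by_appearance(X, y, n_bins=10, seed=0):
    """Resample (X, y) so that rare PM-10 concentration ranges appear as often
    as common ones: each sample is drawn with probability inversely
    proportional to the empirical frequency of its concentration bin."""
    rng = np.random.default_rng(seed)
    edges = np.quantile(y, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    bins = np.digitize(y, edges)                      # bin index per sample
    counts = np.bincount(bins, minlength=n_bins)
    weights = 1.0 / np.maximum(counts[bins], 1)       # rare bins weigh more
    weights = weights / weights.sum()
    idx = rng.choice(len(y), size=len(y), replace=True, p=weights)
    return X[idx], y[idx]
```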

Keywords: AI, air quality prediction, neural networks, pattern recognition, PM-10.

6963 Molecular Electronic Devices based on Carotenoid Derivatives

Authors: Vicente F. P. Aleixo, Augusto C. F. Saraiva, Jordan Del Nero

Abstract:

The production of nanoscale devices with a specific molecular rectifying function is one of the most significant goals of state-of-the-art technology. In this work we show, through ab initio quantum mechanical calculations coupled with a non-equilibrium Green’s function approach, the design of an organic two-terminal device. These molecular structures have a molecular source and drain connected by bridges of several lengths (from five up to 11 double bonds). Our results show significant features of a molecular rectifier and can be summarized as follows: (a) the device can be used as a bi-directional symmetrical rectifier; (b) two devices are integrated in one (an FET with one operational region, and a thyristor); (c) it has inherent stability due to the small intrinsic capacitance under forward/reverse bias. We utilize a transport-mechanism scheme based on previously established properties of π bonds that can be successfully utilized to construct organic nanodevices.

Keywords: Ab initio, Carotenoid, Charge Transfer, Nanodevice.

6962 Tree Based Data Fusion Clustering Routing Algorithm for Illimitable Network Administration in Wireless Sensor Network

Authors: Y. Harold Robinson, M. Rajaram, E. Golden Julie, S. Balaji

Abstract:

In wireless sensor networks, locality and positioning information can be captured using the Global Positioning System (GPS). This information can be gathered initially from each location to characterize the network. Users can retrieve information of interest from a wireless sensor network (WSN) by injecting queries and gathering results from the mobile sink nodes. Routing is the process of choosing an optimal path in a mobile network. Intermediate nodes group sensor nodes into clusters and generate cluster heads that gather the data from each cluster’s nodes and forward the aggregated data to the base station. WSNs are widely used for gathering data, and since sensors are power-constrained devices, it is vital for them to reduce power consumption. A tree-based data fusion clustering routing algorithm (TBDFC) is used to reduce energy consumption in wireless sensor networks. Here, the nodes in the tree use cluster formation, and the height of the tree is decided based on the distance of the member nodes from the cluster head. Network simulation shows that this scheme improves the power utilization of the nodes and thus considerably extends the network lifetime.

Keywords: WSN, TBDFC, LEACH, PEGASIS, TREEPSI.

6961 Transmission Mains Earthing Design and Concrete Pole Deployments

Authors: M. Nassereddine, J. Rizk, A. Hellany, M. Nagrial

Abstract:

Extending High Voltage (HV) transmission mains into the community necessitates an earthing design that ensures safety compliance of the system. Concrete poles are widely used within HV transmission mains, and they can affect the earth grid impedance and the input impedance of the system from the fault point of view. This paper provides information on concrete pole earthing to enhance the split factor of the system; further, it discusses the deployment of concrete structures in high-soil-resistivity areas to reduce the earth grid system of the plant. The paper also introduces the cut-off soil resistivity (ρSC) to consider when replacing timber poles with concrete ones.

Keywords: Concrete Poles, Earth Grid, EPR, High Voltage, Soil Resistivity.

6960 Repair of Concrete Structures with SCC

Authors: F. Kharchi, M. Benhadji, O. Bouksani

Abstract:

The objective of this work is to study the influence of the properties of the substrate on the retrofit (thin repair) of damaged concrete elements with SCC. Fluidity, the principal characteristic of SCC, enables it to cover and adhere to the concrete being repaired. Two aspects of repair are considered: the bond (adhesion), and tensile strength and cracking. The investigation is experimental; it was conducted on test specimens made of ordinary concrete prepared and hardened in advance (the material to be repaired), over which a self-compacting concrete layer was cast. Three SCC mixes and one ordinary concrete (for comparison) were tested. It appears that self-compacting concrete constitutes a good repair material: it conforms perfectly to the surfaces to be repaired and allows a perfect bond. Fracture tests on self-compacting concrete specimens show brittle behaviour; however, when a small percentage of fibres is added, the resistance to cracking is much improved.

Keywords: Adhesion, concrete, experimental, repair, self-compacting.

6959 Holistic Face Recognition using Multivariate Approximation, Genetic Algorithms and AdaBoost Classifier: Preliminary Results

Authors: C. Villegas-Quezada, J. Climent

Abstract:

Several works on facial recognition have dealt with methods that identify isolated characteristics of the face or with templates that encompass several regions of it. In this paper, a new technique is introduced that approaches the problem holistically, dispensing with the need to identify geometrical characteristics or regions of the face. The characterization of a face is achieved by randomly sampling selected attributes of the pixels of its image. From this information we construct a data set corresponding to the values of low frequencies, gradient, entropy, and several other pixel characteristics of the image, generating a set of “p” variables. The multivariate data set is approximated with polynomials of different degrees, minimizing the data fitness error in the minimax sense (L∞ norm). A Genetic Algorithm (GA) is used to circumvent the problem of dimensionality inherent in higher-degree polynomial approximations; the GA yields the degree and coefficient values of the polynomials approximating the image of a face. The system is trained by finding, through a resampling process, a family of characteristic polynomials in several variables (pixel characteristics) for each face (say Fi) in the database. A face (say F) is recognized by finding its characteristic polynomials and using an AdaBoost classifier to compare F’s polynomials with each Fi’s polynomials. The winner is the polynomial family closest to F’s, which corresponds to the target face in the database.

Keywords: AdaBoost Classifier, Holistic Face Recognition, Minimax Multivariate Approximation, Genetic Algorithm.

6958 Intrusion Detection based on Distance Combination

Authors: Joffroy Beauquier, Yongjie Hu

Abstract:

The intrusion detection problem has been frequently studied, but intrusion detection methods are often based on a single point of view, which limits the results. In this paper, we introduce a new intrusion detection model based on the combination of different current methods. First, we use a notion of distance to unify the different methods. Second, we combine these methods using Pearson correlation coefficients, which measure the relationship between two methods, to obtain a combined distance. If the combined distance is greater than a predetermined threshold, an intrusion is detected. We have implemented and tested the combination model with two different public data sets: the masquerade detection data set collected by Schonlau et al., and the program behavior data set from the University of New Mexico. The experimental results show that the combination model achieves better performance.
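
A minimal sketch of the distance-combination idea is given below, assuming per-method distance scores are already available as a matrix; the way Pearson correlations are turned into weights here is an illustrative choice, not necessarily the authors' exact rule.

```python
import numpy as np

def combined_distance(scores, threshold):
    """Combine per-method distance scores into one detection score.

    `scores` is an (n_sessions, n_methods) array. Pearson correlations between
    the methods are turned into per-method weights here; the paper's exact
    combination rule may differ, so treat this as a sketch of the idea."""
    corr = np.corrcoef(scores, rowvar=False)   # n_methods x n_methods
    weights = corr.mean(axis=0)                # one weight per method
    weights = weights / weights.sum()
    combined = scores @ weights                # weighted combined distance
    return combined, combined > threshold      # True -> flag as intrusion
```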

Keywords: Intrusion detection, combination, distance, Pearson correlation coefficients.

6957 Fault Tolerance in Distributed Database Systems

Authors: M. A. Adeboyejo, O. O. Adeosun

Abstract:

Early networked systems assume that connections are reliable and treat the loss of a connection as a faulty operation. Transient connections are typical of mobile devices. Application areas of data-sharing systems such as these lead to the conclusion that network connections may not always be reliable and that the conventional approaches can be improved. The Nigerian commercial banking industry is a critical system whose operation is increasingly dependent on information technology (IT)-driven information systems. The proposed solution to this problem makes use of a hierarchically clustered network structure, which we selected to reflect, as much as possible, the typical organisational structure of Nigerian commercial banks. Representative transactions, such as data updates and replication of the results of such updates, were used to simulate the proposed model and show its applicability.

Keywords: Dependability, reliability, data redundancy.

6956 Normalization Discriminant Independent Component Analysis

Authors: Liew Yee Ping, Pang Ying Han, Lau Siong Hoe, Ooi Shih Yin, Housam Khalifa Bashier Babiker

Abstract:

In face recognition, feature extraction techniques attempt to find an appropriate representation of the data. However, when the feature dimension is larger than the sample size, performance degrades. Hence, we propose a method called Normalization Discriminant Independent Component Analysis (NDICA). The input data are regularized to obtain the most reliable features and then processed using Independent Component Analysis (ICA). The proposed method is evaluated on three face databases: Olivetti Research Ltd (ORL), Face Recognition Technology (FERET) and Face Recognition Grand Challenge (FRGC). NDICA shows its effectiveness compared with other unsupervised and supervised techniques.
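
The sketch below illustrates a regularize-then-ICA pipeline in the spirit of the approach, using a ridge-style shrinkage of the PCA whitening step before FastICA; the specific normalization is an assumption, since the paper defines its own NDICA procedure.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

def regularized_ica_features(X, n_components=20, reg=1e-3, seed=0):
    """Regularize (shrink the whitening step), then apply ICA.

    This mirrors the regularize-then-ICA idea only in spirit; the exact
    NDICA normalization is defined in the paper, not here."""
    Xc = X - X.mean(axis=0)
    pca = PCA(n_components=n_components).fit(Xc)
    # add a small ridge term to the eigenvalues before whitening, guarding
    # against the small-sample-size problem
    whitened = pca.transform(Xc) / np.sqrt(pca.explained_variance_ + reg)
    ica = FastICA(n_components=n_components, whiten=False, random_state=seed)
    return ica.fit_transform(whitened)
```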

Keywords: Face recognition, small sample size, regularization, independent component analysis.

6955 Coding Considerations for Standalone Molecular Dynamics Simulations of Atomistic Structures

Authors: R. O. Ocaya, J. J. Terblans

Abstract:

The laws of Newtonian mechanics allow ab-initio molecular dynamics to model and simulate particle trajectories in materials science by defining a differentiable potential function. This paper discusses some considerations for the coding of ab-initio programs for simulation on a standalone computer and illustrates the approach with C-language code in the context of metallic atoms embedded in the face-centred cubic structure. The algorithms use velocity-time integration to determine the evolution of particle parameters for up to several thousand particles in a thermodynamical ensemble. Such functions are reusable and can be placed in a redistributable header library file. While both commercial and free packages are available, their heuristic nature prevents dissection. In addition, developing one's own code has the obvious advantage of teaching techniques applicable to new problems.
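
As a rough illustration of the velocity-time integration the abstract refers to, here is a minimal velocity-Verlet loop; it is written in Python with a toy harmonic force, whereas the paper's own routines are in C and use the embedded-atom potential for FCC metals.

```python
import numpy as np

def velocity_verlet(pos, vel, force, mass, dt, n_steps):
    """One possible velocity-time integration loop (velocity Verlet)."""
    f = force(pos)
    for _ in range(n_steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt ** 2   # position update
        f_new = force(pos)                                   # force at new positions
        vel = vel + 0.5 * (f + f_new) / mass * dt            # velocity update
        f = f_new
    return pos, vel

# Toy harmonic force just to show the call pattern; not the embedded-atom method.
pos, vel = velocity_verlet(np.array([1.0, 0.0]), np.zeros(2),
                           lambda x: -x, mass=1.0, dt=1e-2, n_steps=1000)
print(pos, vel)
```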

Keywords: C-language, molecular dynamics, simulation, embedded atom method.

6954 Daily Global Solar Radiation Modeling Using Multi-Layer Perceptron (MLP) Neural Networks

Authors: Seyed Fazel Ziaei Asl, Ali Karami, Gholamreza Ashari, Azam Behrang, Arezoo Assareh, N. Hedayat

Abstract:

Predicting daily global solar radiation (GSR) from meteorological variables using multi-layer perceptron (MLP) neural networks is the main objective of this study. Daily mean air temperature, relative humidity, sunshine hours, evaporation, wind speed, and soil temperature values between 2002 and 2006 for Dezful city in Iran (32° 16' N, 48° 25' E) are used. The measured data between 2002 and 2005 are used to train the neural networks, while the data for 214 days of 2006 are used for testing.
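
A minimal sketch of such an MLP forecaster is shown below, assuming the six meteorological inputs and the GSR target are available as CSV files; the file names, network size, and training settings are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical CSV files with columns: air temperature, relative humidity,
# sunshine hours, evaporation, wind speed, soil temperature, daily GSR.
train = np.loadtxt("dezful_2002_2005.csv", delimiter=",", skiprows=1)
test = np.loadtxt("dezful_2006.csv", delimiter=",", skiprows=1)
X_train, y_train = train[:, :6], train[:, 6]
X_test, y_test = test[:, :6], test[:, 6]

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("R^2 on the 2006 test days:", model.score(X_test, y_test))
```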

Keywords: Multi-layer Perceptron (MLP) Neural Networks, Global Solar Radiation (GSR), Meteorological Parameters, Prediction.

6953 The Effect of CPU Location in Total Immersion of Microelectronics

Authors: A. Almaneea, N. Kapur, J. L. Summers, H. M. Thompson

Abstract:

Meeting the growth in demand for digital services such as social media, telecommunications, and business and cloud services requires large-scale data centres, which has led to an increase in their end-use energy demand. Generally, over 30% of data centre power is consumed by the necessary cooling overhead, so energy use can be reduced by improving cooling efficiency. Air and liquid can both be used as cooling media for the data centre. Traditional data centre cooling systems use air; however, liquid cooling is recognised as a promising method that can handle more densely packed data centres. Liquid cooling can be classified into three methods: rack heat exchanger, on-chip heat exchanger and full immersion of the microelectronics. This study quantifies the heat transfer improvements specifically for the case of immersed microelectronics by varying the CPU and heat sink location. Immersion of the server is achieved by filling the gap between the microelectronics and a water jacket with a dielectric liquid, which convects the heat from the CPU to the water jacket on the opposite side. Heat transfer is governed by two physical mechanisms: natural convection in the fixed enclosure filled with dielectric liquid, and forced convection of the water pumped through the water jacket. The model in this study is validated against published numerical and experimental work and shows good agreement. The results show that the heat transfer performance and Nusselt number (Nu) are improved by 89% when the CPU and heat sink are placed at the bottom of the microelectronics enclosure.

Keywords: CPU location, data centre cooling, heat sink in enclosures, Immersed microelectronics, turbulent natural convection in enclosures.

6952 Influence of Milled Waste Glass to Clay Ceramic Foam Properties Made by Direct Foaming Route

Authors: A. Shishkin, V. Mironovs, D. Goljandin, A. Korjakins

Abstract:

The goal of this work is to develop sustainable and durable ceramic cellular structures using widely available natural resources: clay and milled waste glass. The present paper describes a method of obtaining clay ceramic foam (CCF) with additions of milled waste glass of 5, 7, and 10 wt% by direct foaming with a high-speed mixer-disperser (HSMD). For more efficient milling and mixing of the clay and waste glass, a high-velocity disintegrator was used. The CCFs with 5, 7, and 10 wt% glass were obtained at firing temperatures of 900, 950, 1000, and 1050 °C; across all 12 samples, they demonstrated compressive strengths ranging from 3.8 to 14.3 MPa and porosities of 65-76%. The resulting CCF has a compressive strength of 14.3 MPa and a porosity of 65.3%.

Keywords: Ceramic foam, waste glass, clay foam, glass foam, open cell, direct foaming.

6951 Low-Temperature Luminescence Spectroscopy of Violet Sr-Al-O:Eu2+ Phosphor Particles

Authors: Keiji Komatsu, Hayato Maruyama, Ariyuki Kato, Atsushi Nakamura, Shigeo Ohshio, Hiroki Akasaka, Hidetoshi Saitoh

Abstract:

Violet Sr–Al–O:Eu2+ phosphor particles were synthesized from a metal–ethylenediaminetetraacetic acid (EDTA) solution of Sr, Al, Eu, and particulate alumina via spray drying and sintering in a reducing atmosphere. The crystal structures and emission properties at 85–300 K were investigated. The composition of the violet Sr–Al–O:Eu2+ phosphor particles was determined from various Sr–Al–O:Eu2+ phosphors by their emission properties’ dependence on temperature. The highly crystalline SrAl12O19:Eu2+ emission phases were confirmed by their crystallite sizes and the activation energies for the 4f5d–8S7/2 transition of the Eu2+ ion. These results showed that the material identification for the violet Sr–Al–O:Eu2+ phosphor was accomplished by the low-temperature luminescence measurements.

Keywords: Low temperature luminescence spectroscopy, Material Identification, Strontium aluminate phosphor.

6950 A Study on the Cloud Simulation with a Network Topology Generator

Authors: Jun-Kwon Jung, Sung-Min Jung, Tae-Kyung Kim, Tai-Myoung Chung

Abstract:

CloudSim is a useful tool to simulate the cloud environment. It shows the service availability, the power consumption, and the network traffic of services in the cloud environment. Moreover, it can easily calculate network communication delays from network topology data. CloudSim allows inputting a file of topology data, but it does not provide any way to generate that data; thus, the topology data file must be generated by other tools. BRITE is a typical network topology generator that supports various types of topology-generation algorithms. If CloudSim could incorporate BRITE, network simulation for clouds would be easier than with the existing version. This paper shows the potential of connecting BRITE and CloudSim and proposes a direction for linking them.

Keywords: Cloud, simulation, topology, BRITE, network.

6949 Structural Behavior of Incomplete Box Girder Bridges Subjected to Unpredicted Loads

Authors: E. H. N. Gashti, J. Razzaghi, K. Kujala

Abstract:

In general, codes and regulations consider seismic loads only for completed bridge structures, while the evaluation of incomplete bridge structures under these loads, especially those constructed by the free cantilever method, is also of great importance. Hence, this research studies the behavior of the incomplete structure of a common bridge type (the box girder bridge) during the construction phase under vertical seismic loads. The paper then provides suitable guidelines and solutions to resist this destructive phenomenon. The results show that preventive methods can significantly reduce the stresses resulting from vertical seismic loads in box cross-sections to the acceptable range recommended by design codes.

Keywords: Box girder bridges, Prestress loads, Free cantilever method, Seismic loads, Construction phase.

6948 Low Power Circuit Architecture of AES Crypto Module for Wireless Sensor Network

Authors: MooSeop Kim, Juhan Kim, Yongje Choi

Abstract:

Recently, much research has been conducted on security for wireless sensor networks and ubiquitous computing. Security issues such as authentication and data integrity are major requirements for constructing sensor network systems. The Advanced Encryption Standard (AES) is considered one of the candidate algorithms for data encryption in wireless sensor networks. In this paper, we present a hardware architecture implementing a low-power AES crypto module. The module has an optimized architecture for the data encryption unit and the key schedule unit that is applicable to wireless sensor networks. We also detail the low-power design methods used in its design.

Keywords: Algorithm, Low Power Crypto Circuit, AES, Security.

6947 Role of Credit on Production Efficiency of Farming Sector in Pakistan (A Data Envelopment Analysis)

Authors: Saima Ayaz, Zakir Hussain, Maqbool Hussain Sial

Abstract:

The study identified the sources of production inefficiency of the farming sector in district Faisalabad in the Punjab province of Pakistan. The Data Envelopment Analysis (DEA) technique was applied to farm-level survey data of 300 farmers for the year 2009. The overall mean efficiency score was 0.78, indicating 22 percent inefficiency among the sample farmers. Computed efficiency scores were then regressed on farm-specific variables using Tobit regression analysis. Farming experience, education, access to farming credit, herd size, and number of cultivation practices showed a positive and significant effect on the farmers’ technical efficiency.

Keywords: Agricultural credit, DEA, Technical efficiency, Tobit analysis

6946 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated high ability in discriminating various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short reads from fragmented DNA sequences, stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time-consuming, and rely on a large number of parameters that introduce variability and affect the estimation of microbiome elements. Training Deep Neural Networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings that create a meaningful and numerical representation of DNA sequences, while extracting features and reducing the dimensionality of the data. In this paper we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction. Using two public real-life data sets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, the DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
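
The first two steps can be sketched as follows: k-mers are extracted from reads and embedded with a word-embedding model; the k value, example reads, and embedding settings below are illustrative assumptions, not the metagenome2vec configuration.

```python
from gensim.models import Word2Vec

def kmerize(read, k=4):
    """Split a DNA read into overlapping k-mers (step (i) of the pipeline sketch)."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

# Hypothetical reads; real input would be parsed from fastq files.
reads = ["ACGTACGGTCAT", "TTGACGTACGAA", "GGCATCGTACGT"]
corpus = [kmerize(r) for r in reads]

# Learn k-mer embeddings; a read embedding can then be formed, for example,
# as the mean of its k-mer vectors (step (ii)).
model = Word2Vec(corpus, vector_size=16, window=5, min_count=1, sg=1, epochs=50)
print(model.wv["ACGT"][:4])
```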

Keywords: Metagenomics, phenotype prediction, deep learning, embeddings, multiple instance learning.

6945 On the Efficient Implementation of a Serial and Parallel Decomposition Algorithm for Fast Support Vector Machine Training Including a Multi-Parameter Kernel

Authors: Tatjana Eitrich, Bruno Lang

Abstract:

This work deals with aspects of support vector machine learning for large-scale data mining tasks. Based on a decomposition algorithm for support vector machine training that can be run in serial as well as shared-memory parallel mode, we introduce a transformation of the training data that allows for the usage of an expensive generalized kernel without additional costs. We present experiments for the Gaussian kernel, but other kernel functions can be used as well. In order to further speed up the decomposition algorithm, we analyze the critical problem of working set selection for large training data sets. In addition, we analyze the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our tests and conclusions led to several modifications of the algorithm and to an improvement in overall support vector machine learning performance. Our method allows extensive parameter search methods to be used to optimize classification accuracy.
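
One common way to obtain a multi-parameter Gaussian kernel at the cost of a standard one is to fold the per-feature kernel widths into a transformation of the training data, as the abstract describes; the sketch below shows this idea with an off-the-shelf RBF SVM, using hypothetical widths and synthetic data rather than the authors' solver.

```python
import numpy as np
from sklearn.svm import SVC

def scale_per_feature(X, widths):
    """Scale each feature by its own width so a plain RBF kernel behaves like a
    multi-parameter Gaussian kernel k(x, z) = exp(-sum_d (x_d - z_d)^2 / (2 w_d^2))."""
    return X / (np.sqrt(2.0) * np.asarray(widths))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
widths = [1.0, 2.0, 0.5]            # hypothetical per-feature kernel widths

clf = SVC(kernel="rbf", gamma=1.0)  # gamma fixed; the scaling carries the widths
clf.fit(scale_per_feature(X, widths), y)
print(clf.score(scale_per_feature(X, widths), y))
```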

Keywords: Support Vector Machine Training, Multi-Parameter Kernels, Shared Memory Parallel Computing, Large Data.

6944 Structural Monitoring and Control During Support System Replacement of a Historical Gate

Authors: Ahmet Turer

Abstract:

The Middle Gate, located in Hasankeyf, Batman, dates back to 1800 BC and is one of the important historical structures in Turkey. The ancient structure has suffered major structural cracks due to aging as well as the lateral pressure of a cracked rock, which is estimated at about 100 tons. The existing support system was found to be inadequate to support the load, especially after a recent rockfall in the close vicinity. Concerns increased because the existing support system, which is integral with the damaged and cracked gate wall, needed to be replaced by a new one. The replacement process must be carefully monitored by crack meters, and control mechanisms should be integrated to prevent the cracks from expanding, while the same crack width must be maintained after the operation. The control system and the actions taken during the intervention are explained in this paper.

Keywords: structural control, crack width, replacement, support

6943 Dimensional Variations of Cement Matrices in the Presence of Metal Fibers

Authors: Fatima Setti, Ezziane Karim, Setti Bakhti, Negadi Kheira

Abstract:

The objective of this study is to analyze the feasibility of using steel fibers as reinforcement in the cementitious matrix to minimize the effect of free shrinkage, a major cause of the cracks observed on concrete structures, and to improve the mechanical strength of the reinforced concrete. The experimental study was performed on specimens with geometric characteristics adapted to the testing. The shrinkage tests were performed on prismatic specimens equipped with rods fixed to the ends, with different dosages of fibers; the fibers used are hooked-end fibers of 50 mm length and slenderness 67. The results show that the compressive strength and flexural strength increase as the fiber content increases, and the shrinkage deformations are generally smaller for fiber-reinforced concrete than those appearing in concrete without fibers.

Keywords: Concrete, Steel fibers, Compression, Flexural, Deformation.

6942 Establishing a Probabilistic Model of Extrapolated Wind Speed Data for Wind Energy Prediction

Authors: Mussa I. Mgwatu, Reuben R. M. Kainkwa

Abstract:

Wind is among the potential energy resources which can be harnessed to generate wind energy for conversion into electrical power. Due to the variability of wind speed with time and height, it is difficult to predict the generated wind energy optimally. In this paper, an attempt is made to establish a probabilistic model fitting the wind speed data recorded at the Makambako site in Tanzania. Wind speed and direction were measured using an anemometer (type AN1) and a wind vane (type WD1), respectively, both supplied by Delta-T Devices, at a measurement height of 2 m. Wind speeds were then extrapolated to a height of 10 m using the power-law equation with an exponent of 0.47. Data were analysed using MINITAB statistical software to show the variability of wind speeds with time and height, and to determine the underlying probability model of the extrapolated wind speed data. The results show that wind speeds at the Makambako site vary cyclically over time and conform to the Weibull probability distribution. From these results, the Weibull probability density function can be used to predict the wind energy.
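
The power-law extrapolation and Weibull fit can be sketched as follows; the 2 m wind-speed series below is synthetic, since the Makambako measurements are not reproduced here.

```python
import numpy as np
from scipy import stats

ALPHA = 0.47                      # power-law exponent used in the study
Z_MEAS, Z_TARGET = 2.0, 10.0      # measurement and extrapolation heights (m)

# Hypothetical 2 m wind-speed series standing in for the measured data.
v2 = np.abs(np.random.default_rng(1).normal(4.0, 1.5, size=1000))

# Power-law extrapolation: v(10 m) = v(2 m) * (10 / 2) ** 0.47
v10 = v2 * (Z_TARGET / Z_MEAS) ** ALPHA

# Fit a two-parameter Weibull distribution (location fixed at zero).
k, loc, c = stats.weibull_min.fit(v10, floc=0)
print(f"Weibull shape k = {k:.2f}, scale c = {c:.2f} m/s")
```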

Keywords: Probabilistic models, wind speed, wind energy

6941 Demographic Factors Influencing Employees’ Salary Expectations and Labor Turnover

Authors: M. Osipova

Abstract:

Thanks to the development of information technologies, every sphere of the economy is becoming more and more data-centric, as people generate huge datasets containing information on every aspect of their lives. Applying research on such data to human resources management yields otherwise scarce statistics on the state of the labor market, including salary expectations and potential employees’ typical career behavior, and this information can become a reliable basis for management decisions. This article presents the results of career behavior research based on freely accessible resume data. The information used in this study is much broader than that usually used in human resources surveys, which is why there are enough data for statistically significant results even in subgroup analysis.

Keywords: Human resources management, labor market, salary expectations, statistics, turnover.
