Search results for: optical techniques.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3035

695 General Purpose Graphic Processing Units Based Real Time Video Tracking System

Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai

Abstract:

Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about background and foreground. However, pixel-level processing of local regions consumes a considerable amount of computational time and memory in traditional approaches. In our approach, we have explored the concurrent computational ability of General Purpose Graphic Processing Units (GPGPU) to address this problem. A Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernels are influenced by local regions and are updated by inter-frame variations of the corresponding regions. The proposed system has been tested with GPU devices such as the GeForce GTX 280 and Quadro K2000. The results are encouraging, with a maximum speedup of 10X compared to the sequential approach.

Keywords: Connected components, Embrace threads, Local weighted kernel, Structuring element.
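
The CUDA kernels and the region-adaptive weighting scheme described in the abstract are not publicly available. As a rough, CPU-side sketch of the underlying idea only, the snippet below applies OpenCV's stock Gaussian-mixture background subtractor (MOG2) and extracts foreground connected components; the video file name and the thresholds are placeholders, not the authors' settings.

```python
# Minimal sketch of GMM-based background subtraction (not the authors' GPU code).
# Assumes OpenCV is installed; "traffic.mp4" is a placeholder input video.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16,
                                                detectShadows=False)
cap = cv2.VideoCapture("traffic.mp4")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = subtractor.apply(frame)          # per-pixel GMM update + classification
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN,
                               cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))
    n, labels, stats, _ = cv2.connectedComponentsWithStats(fg_mask)
    for i in range(1, n):                      # label 0 is the background
        x, y, w, h, area = stats[i]
        if area > 100:                         # drop small noise blobs
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```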

694 Zero Carbon & Low Energy Housing; Comparative Analysis of Two Persian Vernacular Architectural Solutions to Increase Energy Efficiency

Authors: N. Poorang

Abstract:

In order to respond to human needs, all regional, social, and economic factors must be brought together to achieve residents' comfort and an ideal architecture. There is no doubt that thermal comfort has to satisfy people not only in their daily physical activities but also by creating a pleasant environment for mental activity and relaxation. Achieving it costs energy and increases greenhouse gas emissions.

Reducing energy use in buildings is a critical component of meeting carbon reduction commitments. Hence housing design represents a major opportunity to cut energy use and CO2 emissions.

In terms of energy efficiency, it is vital to propose and research modern design methods for buildings; however, vernacular architecture techniques are proven, empirically established practices that also have to be considered. This research compares two architectural solutions proposed by Persian vernacular architecture to achieve energy efficiency in hot areas.

The aim of this research is to analyze two forms of traditional Persian architecture in different locations in order to develop systematic research and sustainable technologies for adapting them to contemporary living standards.

Keywords: Comparative Analysis, Persian Vernacular Architecture, Sustainable architecture.

693 Age–Related Changes of the Sella Turcica Morphometry in Adults Older Than 20-25 Years

Authors: Yu. I. Pigolkin, M. A. Garcia Corro

Abstract:

Age determination of unknown dead bodies in forensic personal identification is a complicated process which involves the application of numerous methods and techniques. Skeletal remains are less exposed to the influence of environmental factors. In order to enhance the accuracy of forensic age estimation, additional properties of bones correlating with age need to be revealed. Material and Methods: Dimensional examination of the sella turcica was carried out on cadavers with the cranium opened by a circular vibrating saw. The sample consisted of a total of 90 Russian subjects, ranging in age from two months to 87 years. Results: A tendency of dimensional variation throughout life was detected. There were no observed gender differences in the morphometry of the sella turcica. The combined use of sella turcica depth and length values revealed the possibility of assigning an examined sample to a certain age period. Conclusions: Alongside the results of existing methods of age determination, the morphometry of the sella turcica can serve as an additional characteristic, corroborating the obtained values and, accordingly, increasing the accuracy of forensic biological age diagnosis.

Keywords: Age–related changes in bone structures, forensic personal identification, Sella turcica morphometry, body identification.

692 Rapid Frequency Response Measurement of Power Conversion Products with Coherence-Based Confidence Analysis

Authors: Tomi Roinila, Aki Taskinen, Matti Vilkko

Abstract:

Switched-mode converters now play a significant role in modern society. Their operation is often crucial in various electrical applications affecting everyday life. Therefore, the quality of the converters needs to be reliably verified. Recent studies have shown that the converters can be fully characterized by a set of frequency responses which can be efficiently used to validate their proper operation. Consequently, several methods have been proposed to measure the frequency responses quickly and accurately. Most often, correlation-based techniques have been applied. The presented measurement methods are highly sensitive to external errors and system nonlinearities. This fact has often been overlooked, and the necessary uncertainty analysis of the measured responses has been neglected. This paper presents a simple approach to analyze the noise and nonlinearities in the frequency-response measurements of switched-mode converters. Coherence analysis is applied to form a confidence interval characterizing the noise and nonlinearities involved in the measurements. The presented method is verified by practical measurements from a high-frequency switched-mode converter.

Keywords: Switched-mode converters, Frequency analysis, Coherence analysis.
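
The excitation design used by the authors is not given in the abstract. As a minimal sketch of the coherence idea, the snippet below estimates a frequency response by Welch/CSD averaging on synthetic signals and converts the coherence into the standard normalized random error of the magnitude estimate (the Bendat–Piersol form); the signal names, segment length and the toy "converter" are illustrative assumptions.

```python
# Sketch: frequency-response estimate with a coherence-based confidence measure.
# x = injected perturbation, y = measured converter response (synthetic placeholders).
import numpy as np
from scipy import signal

fs = 100_000                                     # sampling rate [Hz], illustrative
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
x = rng.standard_normal(t.size)                  # broadband excitation
y = np.convolve(x, np.exp(-2e3 * t[:200]), mode="same")   # toy "converter" response
y += 0.1 * rng.standard_normal(t.size)           # measurement noise

nperseg = 4096
f, Pxx = signal.welch(x, fs, nperseg=nperseg)
_, Pxy = signal.csd(x, y, fs, nperseg=nperseg)
_, coh = signal.coherence(x, y, fs, nperseg=nperseg)

H1 = Pxy / Pxx                                   # H1 frequency-response estimate
n_avg = x.size // nperseg                        # rough number of averaged segments (Welch overlaps by default)
# Normalized random error of |H1| (Bendat & Piersol): sqrt(1 - coh) / (sqrt(coh) * sqrt(2 n))
eps = np.sqrt(1 - coh) / (np.sqrt(coh) * np.sqrt(2 * n_avg))
upper = np.abs(H1) * (1 + eps)                   # rough confidence band on the magnitude
lower = np.abs(H1) * (1 - eps)
print(f"worst-case relative error over the band: {eps.max():.1%}")
```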

691 Effect of Tube Materials and Special Coating on Coke Deposition in the Steam Cracking of Hydrocarbons

Authors: A. Niaei, D. Salari, N. Daneshvar, A. Chamandeh, R. Nabavi

Abstract:

The steam cracking reactions are always accompanied by the formation of coke, which deposits on the walls of the tubular reactors. This investigation has attempted to control catalytic coking by applying aluminum, zinc and ceramic coatings such as aluminum-magnesium using thermal spray and pack cementation methods. The rate of coke formation during steam cracking of naphtha has been investigated both for uncoated stainless steel (with different alloys) and for metal coatings produced by thermal spray and pack cementation with metal powders of aluminum, aluminum-magnesium, zinc, silicon, nickel and chromium. The results of the study show that passivating the surface of SS321 with a coating of aluminum or aluminum-magnesium can significantly reduce the rate of coke deposition during naphtha pyrolysis. SEM and EDAX techniques (Philips XL series) were used to examine the coke deposits formed by the metal-hydrocarbon reactions. Our objective was to separate the different stages by identifying their characteristic morphologies.

Keywords: Steam Cracking, Pyrolysis, Coke deposition, Thermal spray, Pack Cementation.

690 A Two Level Load Balancing Approach for Cloud Environment

Authors: Anurag Jain, Rajneesh Kumar

Abstract:

Cloud computing is an outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. The parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancer approach that combines the join idle queue and join shortest queue approaches. The authors have used the Cloud Analyst simulator to test the proposed two-level load balancer. The results are analyzed and compared with existing algorithms, and as observed, the proposed work is one step ahead of existing techniques.

Keywords: Cloud Analyst, Cloud Computing, Join Idle Queue, Join Shortest Queue, Load balancing, Task Scheduling.
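
The Cloud Analyst experiments themselves cannot be reproduced from the abstract. Purely as a conceptual sketch of the two-level policy it names, the snippet below dispatches to an idle server when one exists (join idle queue) and otherwise to the server with the shortest queue (join shortest queue); the class and its methods are invented for illustration, not the authors' implementation.

```python
# Conceptual two-level dispatcher: Join-Idle-Queue first, Join-Shortest-Queue as fallback.
from collections import deque

class TwoLevelBalancer:
    def __init__(self, n_servers):
        self.queues = [deque() for _ in range(n_servers)]
        self.idle = set(range(n_servers))            # level 1: servers currently reporting idle

    def dispatch(self, task):
        if self.idle:                                # JIQ: hand the task to any idle server
            sid = self.idle.pop()
        else:                                        # JSQ: otherwise pick the shortest queue
            sid = min(range(len(self.queues)), key=lambda i: len(self.queues[i]))
        self.queues[sid].append(task)
        return sid

    def complete_one(self, sid):
        """Server sid finishes its head-of-line task and reports idle when empty."""
        if self.queues[sid]:
            self.queues[sid].popleft()
        if not self.queues[sid]:
            self.idle.add(sid)

# Tiny usage example
lb = TwoLevelBalancer(n_servers=4)
for t in range(10):
    print(f"task-{t} -> server {lb.dispatch(f'task-{t}')}")
```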

689 An Effective Method of Head Lamp and Tail Lamp Recognition for Night Time Vehicle Detection

Authors: Hyun-Koo Kim, Sagong Kuk, MinKwan Kim, Ho-Youl Jung

Abstract:

This paper presents an effective method for detecting vehicles in front of a camera-assisted car during nighttime driving. The proposed method detects vehicles by detecting their headlights and taillights using image segmentation and clustering techniques. First, to effectively extract the spotlights of interest, a segmentation process based on an automatic multi-level threshold method is applied to the road-scene images. Second, to spatially cluster the detected lamps into vehicles, a grouping process based on light tracking and vehicle lighting-pattern location is applied. For evaluation, the method was implemented on a Da-vinci 7437 DSP board with a near-infrared mono-camera and tested on urban and rural roads. In these tests, classification performance was above a 97% true positive rate in a real-time environment. The method also performs well in clear, foggy and rainy weather.

Keywords: Assistance Driving System, Multi-level Threshold Method, Near Infrared Mono Camera, Nighttime Vehicle Detection.
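
The DSP implementation and the exact multi-level threshold are not reproduced here. The sketch below only illustrates the two stages on a single frame with stock OpenCV calls: bright-spot segmentation (Otsu thresholding standing in for the automatic multi-level method) followed by pairing blobs of similar row position and size into lamp candidates; the file name and thresholds are placeholders.

```python
# Sketch of the two-stage idea: segment bright spots, then pair them into lamp candidates.
import itertools
import cv2

frame = cv2.imread("night_frame.png", cv2.IMREAD_GRAYSCALE)   # placeholder image
_, bright = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

n, labels, stats, centroids = cv2.connectedComponentsWithStats(bright)
blobs = [(centroids[i], stats[i, cv2.CC_STAT_AREA])
         for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > 20]

pairs = []
for (c1, a1), (c2, a2) in itertools.combinations(blobs, 2):
    same_row   = abs(c1[1] - c2[1]) < 10        # lamps of one vehicle lie roughly on one row
    similar_sz = 0.5 < a1 / a2 < 2.0            # comparable blob areas
    if same_row and similar_sz:
        pairs.append((tuple(c1), tuple(c2)))    # headlight/taillight pair candidate

print(f"{len(pairs)} lamp-pair candidates found")
```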

688 Thermochemical Conversion: Jatropha curcus in Fixed Bed Reactor Using Slow Pyrolysis

Authors: Vipan Kumar Sohpal, Rajesh Kumar Sharma

Abstract:

Thermochemical conversion of non-edible biomass offers an efficient and economical process to provide valuable fuels and chemicals derived from biomass in the context of developing countries. Pyrolysis has advantages over other thermochemical conversion techniques because it can convert biomass directly into solid, liquid and gaseous products by thermal decomposition in the absence of oxygen. The present paper focuses on the slow thermochemical conversion process for non-edible Jatropha curcus seed cake. The discussion focuses on the effect of nitrogen gas flow rate on product composition (wt %). In addition, a comparative analysis has been performed for different mesh sizes in terms of product composition. Results show that slow pyrolysis experiments of Jatropha curcus seed cake in a fixed bed reactor yield 18.42 wt % bio-oil at a pyrolysis temperature of 500°C, a particle size of -6+8 mesh and a nitrogen gas flow rate of 150 ml/min.

Keywords: Jatropha curcus, Thermo-chemical, Pyrolysis, Product composition, Yield.

687 Spatial Pattern and GIS-Based Model for Risk Assessment – A Case Study of Dusit District, Bangkok

Authors: Morakot Worachairungreung

Abstract:

The objectives of the research are to study the spatial pattern of fire locations and to develop Geographic Information System (GIS) techniques for fire risk assessment in fire planning and management. Fire risk assessment was based on two groups of factors: vulnerability factors, such as building material type, building height and building density, and mitigation-capacity factors, such as accessibility by road, distance to the nearest fire station and distance to hydrants; the factor weights were obtained from four groups of stakeholders, including firemen, city planners, local government officers and local residents. Factors obtained from all stakeholders were converted into GIS raster data and then superimposed in order to prepare a fire risk map of the area showing levels of fire risk ranging from high to low. The level of fire risk was obtained from the weighted mean of the factors based on the stakeholders, and the weight of each factor was obtained by Analytical Hierarchy Analysis.

Keywords: Fire Risk Assessment, Geographic Information System (GIS), Raster Analysis, Analytical Hierarchy Analysis.
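
The stakeholder judgements are not listed in the abstract. As a minimal sketch of the weighting and overlay steps, the snippet below derives factor weights from a pairwise-comparison matrix via the principal eigenvector (the usual Analytic Hierarchy calculation), checks consistency, and applies the weights as a raster overlay; the comparison values and the raster layers are invented placeholders.

```python
# Sketch: AHP-style weights from a pairwise-comparison matrix, then a weighted raster overlay.
import numpy as np

# Illustrative pairwise comparisons for four factors
# (building material, building height, building density, road accessibility).
A = np.array([[1,   3,   2,   4],
              [1/3, 1,   1/2, 2],
              [1/2, 2,   1,   3],
              [1/4, 1/2, 1/3, 1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                     # normalized factor weights

# Consistency check: CI = (lambda_max - n) / (n - 1); random index for n = 4 is ~0.90
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 0.90, 3))

# Weighted overlay of pre-scaled raster layers (values 1..5, random placeholders here)
rng = np.random.default_rng(0)
layers = rng.integers(1, 6, size=(4, 100, 100))  # four factor rasters, 100x100 cells
risk = np.tensordot(w, layers, axes=1)           # per-cell weighted mean -> fire risk surface
print("risk range:", risk.min(), "-", risk.max())
```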

686 Static Priority Approach to Under-Frequency Based Load Shedding Scheme in Islanded Industrial Networks: Using the Case Study of Fatima Fertilizer Company Ltd - FFL

Authors: S. H. Kazmi, T. Ahmed, K. Javed, A. Ghani

Abstract:

In this paper, a static scheme of under-frequency based load shedding is considered for chemical and petrochemical industries with islanded distribution networks relying heavily on the primary commodity, in order to ensure minimum production loss, plant downtime or critical equipment shutdown. A simple methodology is proposed for in-house implementation of this scheme using under-frequency relays, and a step-by-step guide is provided, including techniques to calculate maximum percentage overloads, frequency decay rates, time-based frequency response and frequency-based time response of the system. A case study of the FFL electrical system is used, presenting the actual system parameters and the employed load shedding settings derived through the same series of steps. These settings are then verified for the worst overload condition (loss of a generation source in this case), and the comprehensive system response is investigated.

Keywords: Islanding, under-frequency load shedding, frequency rate of change, static UFLS.
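
The FFL system parameters are not reproduced in the abstract. The snippet below only illustrates the frequency-decay calculation such a scheme rests on, using the swing-equation approximation df/dt ≈ −ΔP·f0/(2H) for a per-unit overload ΔP and aggregate inertia constant H; every number is a placeholder, not an FFL setting.

```python
# Sketch: initial frequency decay rate and time to reach a shedding threshold
# after loss of a generation source (swing-equation approximation).
f0 = 50.0          # nominal frequency [Hz]
H = 4.0            # aggregate inertia constant [s] on the islanded system base (placeholder)
delta_p = 0.25     # per-unit overload after losing a generator (placeholder)

rocof = -delta_p * f0 / (2 * H)          # initial rate of change of frequency [Hz/s]
print(f"initial ROCOF: {rocof:.2f} Hz/s")

# Time for frequency to fall to the first under-frequency stage, assuming the
# decay rate stays roughly constant over this short interval.
f_stage1 = 48.8                          # first UFLS pickup [Hz], illustrative
t_stage1 = (f_stage1 - f0) / rocof
print(f"time to reach {f_stage1} Hz: {t_stage1:.2f} s")
```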

685 Prediction Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy

Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie

Abstract:

In recent years, there has been an explosion in the use of technologies that help discover diseases. For example, DNA microarrays allow us for the first time to obtain a "global" view of the cell. They have great potential to provide accurate medical diagnosis and to help find the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that can predict the occurrence of leukemia. In this study, we compared the classification accuracy and response time of eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces the best results and takes the lowest time to build a model among the tree classifiers. Among the classification rule algorithms, the nearest-neighbor-like algorithm (NNge) is the best, owing to its high accuracy and the lowest model building time.

Keywords: Data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data.
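
The Weka learners evaluated by the authors (Random Tree, NNge, and so on) are not reproduced here. As a minimal sketch of the comparison protocol only, the snippet below measures cross-validated accuracy and build time for two scikit-learn tree classifiers on a synthetic microarray-shaped dataset standing in for the leukemia data.

```python
# Sketch: comparing classification accuracy and build time on microarray-style data.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in: 72 samples x 1000 genes, 2 classes (roughly leukemia-dataset shaped)
X, y = make_classification(n_samples=72, n_features=1000, n_informative=30,
                           n_classes=2, random_state=0)

for name, clf in [("DecisionTree", DecisionTreeClassifier(random_state=0)),
                  ("RandomForest", RandomForestClassifier(n_estimators=100, random_state=0))]:
    t0 = time.perf_counter()
    scores = cross_val_score(clf, X, y, cv=5)       # 5-fold accuracy
    elapsed = time.perf_counter() - t0
    print(f"{name:12s} accuracy={scores.mean():.3f}  time={elapsed:.2f}s")
```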

684 Lattice Boltzmann Simulation of Binary Mixture Diffusion Using Modern Graphics Processors

Authors: Mohammad Amin Safi, Mahmud Ashrafizaadeh, Amir Ali Ashrafizaadeh

Abstract:

A highly optimized implementation of binary mixture diffusion with no initial bulk velocity on graphics processors is presented. The lattice Boltzmann model is employed for simulating the binary diffusion of oxygen and nitrogen into each other with different initial concentration distributions. Simulations have been performed using the latest proposed lattice Boltzmann model that satisfies both the indifferentiability principle and the H-theorem for multi-component gas mixtures. Contemporary numerical optimization techniques such as memory alignment and increasing the multiprocessor occupancy are exploited along with some novel optimization strategies to enhance the computational performance on graphics processors using the C for CUDA programming language. Speedup of more than two orders of magnitude over single-core processors is achieved on a variety of Graphical Processing Unit (GPU) devices ranging from conventional graphics cards to advanced, high-end GPUs, while the numerical results are in excellent agreement with the available analytical and numerical data in the literature.

Keywords: Lattice Boltzmann model, Graphical processing unit, Binary mixture diffusion, 2D flow simulations, Optimized algorithm.
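
The CUDA implementation and the multi-component model are not shown in the abstract. Purely as a CPU-side sketch of the lattice Boltzmann diffusion idea, the snippet below relaxes and streams a single-component D2Q5 distribution with zero bulk velocity; the grid size and relaxation time are illustrative.

```python
# Sketch: single-component D2Q5 lattice Boltzmann diffusion (zero bulk velocity).
import numpy as np

nx, ny, tau, steps = 128, 128, 0.8, 500          # grid, relaxation time, iterations
w = np.array([1/3, 1/6, 1/6, 1/6, 1/6])          # D2Q5 weights
ex = [0, 1, -1, 0, 0]
ey = [0, 0, 0, 1, -1]

# Initial concentration: a square patch of "oxygen" in a "nitrogen" background.
C = np.zeros((nx, ny))
C[48:80, 48:80] = 1.0
f = w[:, None, None] * C                         # start at equilibrium

for _ in range(steps):
    C = f.sum(axis=0)
    feq = w[:, None, None] * C                   # zero-velocity equilibrium
    f += (feq - f) / tau                         # BGK collision
    for i in range(5):                           # streaming with periodic boundaries
        f[i] = np.roll(np.roll(f[i], ex[i], axis=0), ey[i], axis=1)

D = (tau - 0.5) / 3                              # lattice diffusivity, cs^2 = 1/3
print(f"lattice diffusivity: {D:.3f}, total mass: {C.sum():.1f}")
```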

683 Performance Comparison of Neural Architectures for On-Line Speed Estimation in Sensorless IM Drives

Authors: K. Sedhuraman, S. Himavathi, A. Muthuramalingam

Abstract:

The performance of a sensorless controlled induction motor drive depends on the accuracy of the estimated speed. Conventional estimation techniques, being mathematically complex, require more execution time, resulting in poor dynamic response. The nonlinear mapping capability and powerful learning algorithms of neural networks provide a promising alternative for on-line speed estimation. An on-line speed estimator requires the NN model to be accurate, simple in design, structurally compact and computationally inexpensive to ensure faster execution and effective control in real-time implementation. This in turn depends to a large extent on the type of neural architecture. This paper investigates three types of neural architectures for on-line speed estimation, and their performance is compared in terms of accuracy, structural compactness, computational complexity and execution time. The neural architecture best suited for on-line speed estimation is identified, and the promising results obtained are presented.

Keywords: Sensorless IM drives, rotor speed estimators, artificial neural network, feed-forward architecture, single neuron cascaded architecture.

682 Bioleaching of Metals Contained in Spent Catalysts by Acidithiobacillus thiooxidans DSM 26636

Authors: Andrea M. Rivas-Castillo, Marlenne Gómez-Ramirez, Isela Rodríguez-Pozos, Norma G. Rojas-Avelizapa

Abstract:

Spent catalysts are considered hazardous residues of major concern, mainly due to the simultaneous presence of several metals in elevated concentrations. Although hydrometallurgical, pyrometallurgical and chelating-agent methods are available to remove and recover some metals contained in spent catalysts, these procedures generate potentially hazardous wastes and emit harmful gases. Thus, biotechnological treatments are currently gaining importance to avoid the negative impacts of chemical technologies. To this end, diverse microorganisms have been used to assess the removal of metals from spent catalysts, comprising bacteria, archaea and fungi, whose resistance and metal uptake capabilities differ depending on the microorganism tested. Acidophilic sulfur-oxidizing bacteria, namely Acidithiobacillus thiooxidans and Acidithiobacillus ferrooxidans, have been used to investigate the biotreatment and extraction of valuable metals from spent catalysts, as they are able to produce leaching agents such as sulfuric acid and sulfur oxidation intermediates. In the present work, the ability of A. thiooxidans DSM 26636 to bioleach the metals contained in five different spent catalysts was assessed by growing the culture in modified Starkey mineral medium (with elemental sulfur at 1%, w/v) and 1% (w/v) pulp density of each residue for up to 21 days at 30 °C and 150 rpm. Sulfur-oxidizing activity was periodically evaluated by determining the sulfate concentration in the supernatants according to the NMX-k-436-1977 method. The production of sulfuric acid was assessed in the supernatants as well, by titration with 0.5 M NaOH using bromothymol blue as acid-base indicator, and by measuring pH with a digital potentiometer. Inductively Coupled Plasma - Optical Emission Spectrometry was used to analyze metal removal from the five spent catalysts by A. thiooxidans DSM 26636. The results show that, as could be expected, sulfuric acid production is directly related to the decrease in pH and also to the highest metal removal efficiencies. It was observed that Al and Fe are consistently removed from refinery spent catalysts regardless of their origin and previous usage, although these removals vary from 9.5 ± 2.2 to 439 ± 3.9 mg/kg for Al and from 7.13 ± 0.31 to 368.4 ± 47.8 mg/kg for Fe, depending on the spent catalyst tested. In addition, bioleaching of metals such as Mg, Ni and Si was also obtained from automotive spent catalysts, with removals of up to 66 ± 2.2, 6.2 ± 0.07 and 100 ± 2.4, respectively. Hence, the data presented here exhibit the potential of A. thiooxidans DSM 26636 for the simultaneous bioleaching of metals contained in spent catalysts of diverse provenance.

Keywords: Acidithiobacillus thiooxidans, spent catalysts, bioleaching, metals, sulfuric acid, sulfur-oxidizing activity.

681 An Overview on Aluminum Matrix Composites: Liquid State Processing

Authors: S. P. Jordan, G. Christian, S. P. Jeffs

Abstract:

Modern composite materials are increasingly being chosen to replace heavier metallic material systems in many engineering fields, including the aerospace and automotive industries. The increasing push towards satisfying environmental targets is fuelling new material technologies and manufacturing processes. This paper introduces metal matrix composite materials and manufacturing processes, along with manufacturing processes optimized at Alvant Ltd., based in Basingstoke in the UK, which offers modern, cost-effective, selectively reinforced composites for light-weighting applications within engineering. An overview of, and introduction to, modern optimized manufacturing methods capable of producing viable replacements for heavier metallic and lower-temperature-capable polymer composites is offered. A review of the capabilities and future applications of this viable material is discussed to highlight the potential of further optimizing established manufacturing techniques to fully realize light-weighting with cost-effective methods.

Keywords: Aluminum matrix composites, light-weighting, hybrid squeeze casting, strategically placed reinforcements.

680 Harris Extraction and SIFT Matching for Correlation of Two Tablets

Authors: Ali Alzaabi, Georges Alquié, Hussain Tassadaq, Ali Seba

Abstract:

This article presents the development of efficient algorithms for comparing copies of tablets. Image recognition has specialized uses in digital systems such as medical imaging, computer vision, defense and communication. Comparison between two images that look indistinguishable is a formidable task. Two images taken from different sources might look identical, but due to different digitizing properties they are not, whereas small variations in image information such as cropping, rotation and slight photometric alteration make direct matching techniques unsuitable. In this paper we introduce different matching algorithms designed to help art centers identify real painting images from fake ones. Different vision algorithms for local image features are implemented using MATLAB. In this framework, a Tablet Comparison Computer Tool "TCCT" is designed to facilitate our research. The TCCT is a Graphical User Interface (GUI) tool used to identify images by their shapes and objects. The parameters of the vision system are fully accessible to the user through this graphical user interface. For matching, it then applies different description techniques that can identify the exact figures of objects.

Keywords: Harris Extraction and SIFT Matching
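
The TCCT tool itself is not public. The sketch below shows the two building blocks named in the title with stock OpenCV calls: the Harris corner response on one image and SIFT descriptor matching with Lowe's ratio test between two tablet photographs; the file names are placeholders, and SIFT_create requires OpenCV 4.4 or newer (or the contrib package in older versions).

```python
# Sketch: Harris corner extraction and SIFT matching between two tablet images.
import cv2
import numpy as np

img1 = cv2.imread("tablet_a.png", cv2.IMREAD_GRAYSCALE)    # placeholder file names
img2 = cv2.imread("tablet_b.png", cv2.IMREAD_GRAYSCALE)

# Harris corner response (strong corners where the response exceeds 1% of the maximum)
harris = cv2.cornerHarris(np.float32(img1), blockSize=2, ksize=3, k=0.04)
corners = np.argwhere(harris > 0.01 * harris.max())
print(f"{len(corners)} Harris corners on image 1")

# SIFT keypoints and descriptors, matched with Lowe's ratio test
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(good)} good SIFT matches between the two tablets")
```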

679 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error

Authors: Oscar Javier Herrera, Manuel Ángel Camacho

Abstract:

This paper addresses a cutting-edge method of business demand forecasting based on an empirical probability function, applicable when the historical behavior of the data is random. Additionally, it presents error determination based on the numerical technique of propagation of errors. The methodology first characterized and diagnosed the demand planning process as part of production management; new ways to predict demand through probability techniques and to calculate the associated error were then investigated using numerical methods, all based on the behavior of the data. The analysis considered the specific business circumstances of a company in the communications sector, located in the city of Bogota, Colombia. In conclusion, using this application it was possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase the client satisfaction rate, reduce stock which had not been in rotation for a long time, codify its inventory, and plan reorder points for the replenishment of stock.

Keywords: Demand Forecasting, Empirical Distribution, Propagation of Error.
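
The company's demand data are not available, so the sketch below only illustrates the two ingredients named in the title on synthetic demand: a quantile taken directly from the empirical distribution of historical demand, and a first-order propagation-of-error estimate for a reorder point derived from it; the lead-time figures are assumptions.

```python
# Sketch: empirical demand distribution plus first-order propagation of error.
import numpy as np

rng = np.random.default_rng(0)
demand = rng.poisson(40, size=365).astype(float)   # one year of daily demand (synthetic)

# Empirical distribution: service-level quantile taken straight from the history.
p95_demand = np.quantile(demand, 0.95)
print(f"95th-percentile daily demand (empirical): {p95_demand:.1f} units")

# Reorder point R = mean_demand * lead_time, with uncertainty in both factors.
d_mean, d_err = demand.mean(), demand.std(ddof=1) / np.sqrt(demand.size)
L_mean, L_err = 5.0, 0.5                            # lead time [days] and its uncertainty (assumed)

R = d_mean * L_mean
# Propagation of error for a product: sigma_R^2 = (L*sigma_d)^2 + (d*sigma_L)^2
R_err = np.sqrt((L_mean * d_err) ** 2 + (d_mean * L_err) ** 2)
print(f"reorder point: {R:.1f} ± {R_err:.1f} units")
```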

678 Optimization of Lean Methodologies in the Textile Industry Using Design of Experiments

Authors: Ahmad Yame, Ahad Ali, Badih Jawad, Daw Al-Werfalli Mohamed Nasser, Sabah Abro

Abstract:

Industries in general generate a lot of waste. The wool textile company in Baniwalid, Libya, has many complex problems that have led to enormous waste due to the lack of lean strategies, expertise, technical support and commitment. To successfully address waste at the wool textile company, this study will attempt to develop a methodical approach that integrates lean manufacturing tools to optimize performance characteristics such as lead time and delivery. The methodology will utilize Value Stream Mapping (VSM) techniques to identify the process variables that affect production. Once these variables are identified, Design of Experiments (DOE) methodology will be used to determine the significantly influential process variables; these variables are then controlled and set at their optimal levels to achieve optimal productivity, quality, agility, efficiency and delivery, and the outputs of the simulation model are analyzed for different lean configurations. The goal of this research is to investigate how the tools of lean manufacturing can be adapted from the discrete to the continuous manufacturing environment and to evaluate their benefits at a specific industrial site.

Keywords: Lean manufacturing, DOE, value stream mapping, textiles.
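
The factors selected from the value stream map are not listed in the abstract. As a minimal sketch of the DOE step, the snippet below estimates main effects from a 2³ full-factorial experiment on lead time, with invented factor names and synthetic responses standing in for the plant data.

```python
# Sketch: main-effect estimation for a 2^3 full factorial design (coded levels -1/+1).
import itertools
import numpy as np

factors = ["batch size", "operators", "changeover method"]     # illustrative factor names
design = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 runs x 3 factors

# Synthetic lead-time responses for the 8 runs (hours); real data would come from the plant.
y = np.array([42.0, 39.5, 36.0, 34.0, 41.0, 38.0, 35.5, 33.0])

for name, col in zip(factors, design.T):
    effect = y[col == 1].mean() - y[col == -1].mean()   # high-level mean minus low-level mean
    print(f"main effect of {name:18s}: {effect:+.2f} h")
```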

677 A Two-Stage Multi-Agent System to Predict the Unsmoothed Monthly Sunspot Numbers

Authors: Mak Kaboudan

Abstract:

A multi-agent system is developed here to predict the monthly details of the upcoming peak of the 24th solar magnetic cycle. While studies typically predict the timing and magnitude of cycle peaks using annual data, this one utilizes the unsmoothed monthly sunspot numbers instead. Monthly numbers display more pronounced fluctuations during periods of strong solar magnetic activity than the annual sunspot numbers. Because strong magnetic activity may cause significant economic damage, predicting monthly variations should provide different and perhaps helpful information for decision-making purposes. The multi-agent system developed here operates in two stages. In the first, it produces twelve predictions of the monthly numbers. In the second, it uses those predictions to deliver a final forecast. Acting as expert agents, genetic programming and neural networks produce the twelve fits and forecasts as well as the final forecast. According to the results obtained, the next peak is predicted to be 156 and is expected to occur in October 2011, with an average of 136 for that year.

Keywords: Computational techniques, discrete wavelet transformations, solar cycle prediction, sunspot numbers.

676 Study of Two Writing Schemes for a Magnetic Tunnel Junction Based On Spin Orbit Torque

Authors: K. Jabeur, L. D. Buda-Prejbeanu, G. Prenat, G. Di Pendina

Abstract:

MRAM technology provides a combination of fast access time, non-volatility, data retention and endurance. While growing interest is being given to two-terminal Magnetic Tunnel Junctions (MTJ) based on Spin-Transfer Torque (STT) switching as the potential candidate for a universal memory, their reliability is dramatically decreased because of the common writing/reading path. The three-terminal MTJ based on the Spin-Orbit Torque (SOT) approach revitalizes the hope of an ideal MRAM. It can overcome the reliability barrier encountered in current two-terminal MTJs by separating the reading and the writing paths. In this paper, we study two possible writing schemes for the SOT-MTJ device based on recently fabricated samples. While the first is based on precessional switching, the second requires the presence of a permanent magnetic field. Based on an accurate Verilog-A model, we simulate the two writing techniques and highlight the advantages and drawbacks of each one. Using the second technique, pioneering logic circuits based on the three-terminal architecture of the SOT-MTJ described in this work are under development, with preliminary attractive results.

Keywords: Spin orbit Torque, Magnetic Tunnel Junction, MRAM, Spintronic, Circuit simulation.

675 Development of Fake News Model Using Machine Learning through Natural Language Processing

Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini

Abstract:

Fake news detection research is still in an early stage, as this is a relatively new phenomenon in the interest raised by society. Machine learning helps to solve complex problems and to build AI systems nowadays, especially in those cases where we have tacit knowledge or knowledge that is not otherwise available. We used machine learning algorithms for the identification of fake news, applying three classifiers: Passive Aggressive, Naïve Bayes, and Support Vector Machine. Simple classification is not completely adequate for fake news detection because generic classification methods are not specialized for fake news. With the integration of machine learning and text-based processing, we can detect fake news and build classifiers that can classify the news data. Text classification mainly focuses on extracting various features of text and then incorporating those features into classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of corpora. We applied three different machine learning classifiers to two publicly available datasets. Experimental analysis based on the existing datasets indicates a very encouraging and improved performance.

Keywords: Fake news detection, types of fake news, machine learning, natural language processing, classification techniques.
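
The two public datasets are not named in the abstract, so the sketch below shows only the shape of such a pipeline: TF-IDF features feeding one of the three classifiers mentioned (Passive Aggressive), trained on a tiny in-line corpus that stands in for the real data.

```python
# Sketch: TF-IDF text features + Passive Aggressive classifier for fake/real labels.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.pipeline import make_pipeline

# Tiny placeholder corpus; a real run would load one of the public fake-news datasets.
texts = ["scientists confirm the study results after peer review",
         "shocking miracle cure that doctors don't want you to know",
         "city council approves the new transport budget",
         "celebrity secretly replaced by clone, insiders claim"]
labels = ["real", "fake", "real", "fake"]

model = make_pipeline(TfidfVectorizer(stop_words="english"),
                      PassiveAggressiveClassifier(max_iter=1000, random_state=0))
model.fit(texts, labels)
print(model.predict(["new budget approved after review",
                     "miracle clone cure shocks doctors"]))
```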

674 Fast Database Indexing for Large Protein Sequence Collections Using Parallel N-Gram Transformation Algorithm

Authors: Jehad A. H. Hammad, Nur'Aini binti Abdul Rashid

Abstract:

With the rapid development of the life sciences and the flood of genomic information, the need for faster and more scalable searching methods has become urgent. Indexing is one of the approaches that has been investigated. Indexing methods can be categorized into three classes: length-based index algorithms, transformation-based algorithms and mixed-technique-based algorithms. In this research, we focus on the transformation-based methods. We embedded the N-gram method into a transformation-based method to build an inverted index table. We then applied parallel methods to speed up the index building time and to reduce the overall retrieval time when querying the genomic database. Our experiments show that the use of the N-gram transformation algorithm is an economical solution; it saves both time and space. The results show that the size of the index is smaller than the size of the dataset when the N-gram size is 5 or 6. The results of the parallel N-gram transformation algorithm indicate that the use of parallel programming with large datasets is promising and can be improved further.

Keywords: Biological sequence, Database index, N-gram indexing, Parallel computing, Sequence retrieval.
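
The parallel implementation is not reproduced here. The sketch below shows the core transformation in serial form: each protein sequence is decomposed into overlapping N-grams (N = 5, one of the sizes reported as space-efficient) and posted into an inverted index, which a query fragment then probes by its own N-grams; the sequences are toy examples.

```python
# Sketch: N-gram inverted index over protein sequences (serial version of the core idea).
from collections import defaultdict

def ngrams(seq, n=5):
    return (seq[i:i + n] for i in range(len(seq) - n + 1))

# Toy sequence collection; a real run would stream a FASTA file.
sequences = {"P1": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
             "P2": "MKVLAAGITGLVGSHFSRQAAKTEELRR",
             "P3": "MTEYKLVVVGAGGVGKSALTIQLIQ"}

index = defaultdict(set)                     # n-gram -> set of sequence ids
for sid, seq in sequences.items():
    for g in ngrams(seq):
        index[g].add(sid)

def query(fragment, n=5):
    """Return candidate sequences sharing at least one n-gram with the fragment."""
    hits = set()
    for g in ngrams(fragment, n):
        hits |= index.get(g, set())
    return hits

print(query("GSHFSRQ"))                      # candidates sharing an n-gram: P1 and P2
```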

673 Real Time Data Communication with FlightGear Using Simulink over a UDP Protocol

Authors: Adil Loya, Ali Haider, Arslan A. Ghaffor, Abubaker Siddique

Abstract:

Simulation and modelling of Unmanned Aerial Vehicles (UAV) has gained wide popularity in the aerospace community. The demand for designing and modelling optimized control systems for UAVs has increased tenfold over the last decade, as next-generation warfare depends on unmanned technologies. Therefore, this research focuses on the simulation of nonlinear UAV dynamics in Simulink and its integration with FlightGear. There has been a lot of research on implementing optimized control using Simulink; however, fewer techniques are known for simulating these dynamics over FlightGear, and the tedious problem of acquiring data from it is tackled in this research. Sending data to FlightGear is easy, but receiving it in Simulink is not straightforward, i.e. we can normally only receive control data on the output. However, in this research we have managed to get the data out of FlightGear by implementing a level-2 S-function block within Simulink. Moreover, the results captured from FlightGear over a User Datagram Protocol (UDP) communication are then compared with the attitude signals that were sent previously. This provides useful information regarding the difference between the outputs obtained from Simulink and FlightGear. It was found that the values received in Simulink were in high agreement with the FlightGear output, and the complete study was conducted in a discrete manner.

Keywords: aerospace, flight control, FlightGear, communication, Simulink
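
The level-2 S-function is not reproduced here. On the receiving side, the same kind of attitude packets can be read outside Simulink with a few lines of socket code; the sketch assumes FlightGear is started with a generic UDP output protocol that emits comma-separated roll, pitch and yaw, and the port number and protocol file are placeholders.

```python
# Sketch: receiving comma-separated attitude data sent by FlightGear over UDP.
# Assumes FlightGear runs with a generic protocol configured for UDP output
# (e.g. --generic=socket,out,<rate>,<host>,5500,udp,<protocol-file>).
import socket

PORT = 5500                                      # placeholder, must match the FlightGear setting
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))

for _ in range(100):                             # read 100 packets, then stop
    data, _addr = sock.recvfrom(1024)
    roll, pitch, yaw = (float(v) for v in data.decode().strip().split(","))
    print(f"roll={roll:7.2f}  pitch={pitch:7.2f}  yaw={yaw:7.2f}")

sock.close()
```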

672 Predicting Protein-Protein Interactions from Protein Sequences Using Phylogenetic Profiles

Authors: Omer Nebil Yaveroglu, Tolga Can

Abstract:

In this study, a high-accuracy protein-protein interaction prediction method is developed. The importance of the proposed method is that it uses only sequence information of proteins when predicting interactions. The method extracts phylogenetic profiles of proteins from their sequence information. By combining the phylogenetic profiles of two proteins, based on the existence of homologs in different species, and fitting this combined profile into a statistical model, it is possible to make predictions about the interaction status of the two proteins. For this purpose, we apply a collection of pattern recognition techniques to a dataset of combined phylogenetic profiles of protein pairs. Support Vector Machines, feature extraction using ReliefF, Naive Bayes classification, k-nearest neighbor classification, decision trees and Random Forest classification are the methods we applied to find the classification method that best predicts the interaction status of protein pairs. Random Forest classification outperformed all other methods with a prediction accuracy of 76.93%.

Keywords: Protein Interaction Prediction, Phylogenetic Profile, SVM, ReliefF, Decision Trees, Random Forest Classification.
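
The dataset construction is described only at a high level in the abstract. The sketch below illustrates the representation it relies on: each protein becomes a binary vector of homolog presence/absence across species, two profiles are combined element-wise into a pair feature vector, and a Random Forest is fitted on labelled pairs; the profiles and labels are synthetic placeholders.

```python
# Sketch: combining phylogenetic profiles of protein pairs and classifying with a Random Forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_proteins, n_species = 200, 60
profiles = rng.integers(0, 2, size=(n_proteins, n_species))   # 1 = homolog present in species

def pair_features(i, j):
    """Element-wise combination of two profiles (both-present and either-present flags)."""
    return np.concatenate([profiles[i] & profiles[j], profiles[i] | profiles[j]])

# Synthetic labelled pairs: 1 = interacting, 0 = non-interacting (placeholder labels).
pairs = rng.integers(0, n_proteins, size=(500, 2))
labels = rng.integers(0, 2, size=500)
X = np.array([pair_features(i, j) for i, j in pairs])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print("training accuracy on the synthetic pairs:", clf.score(X, labels))
```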

671 Identification of Industrial Health Using ANN

Authors: Deepak Goswami, Padma Lochan Hazarika, Kandarpa Kumar Sarma

Abstract:

The customary practice for identifying industrial sickness is a set of traditional techniques that rely upon a range of manual monitoring and compilation of financial records. This makes the process tedious and time consuming, and it is often susceptible to manipulation. Therefore, readily available tools are required which can deal with the uncertain situations arising out of industrial sickness. This is more significant for a country like India, where the fruits of development are rarely equally distributed. In this paper, we propose an approach based on Artificial Neural Networks (ANN) to deal with industrial sickness, with specific focus on a few such units taken from a less developed north-east (NE) Indian state, Assam. The proposed system provides decisions regarding industrial sickness using eight different parameters which are directly related to the stages of sickness of such units. The mechanism primarily uses certain signals and symptoms of industrial health to decide upon the state of a unit. Specifically, we formulate an ANN-based block with data obtained from a few selected units of Assam so that the required decisions related to industrial health can be taken. The system thus formulated could become an important part of planning and development. It can also contribute towards the computerization of decision support systems related to industrial health and help in better management.

Keywords: Industrial, Health, Classification, ANN, MLP, MSE.

670 Analysing the Elementary Science and Technology Coursebook and Student Workbook in Terms of Constructivism

Authors: Nil Duban

Abstract:

The curriculum of the primary school science course was redesigned on the basis of constructivism in the 2005-2006 academic year in Turkey. In this context, the name of the course was changed to "Science and Technology", and the content, course books and student workbooks for this course were redesigned in the light of constructivism. The aim of this study is to determine whether the Science and Technology course books and student workbooks for primary school 5th grade are appropriate for constructivism by evaluating them in terms of its fundamental principles. In this study, among qualitative research methods, the documentation technique (i.e. document analysis) is applied; while selecting samples, criterion sampling is used from among the purposeful sampling techniques. When the Science and Technology course book and workbook for the 5th grade in primary education are examined, it is seen that both books complement each other in certain areas. Consequently, it can be claimed that, in spite of some inadequate and missing points in the course book and workbook of the primary school Science and Technology course for 5th grade students, these books were designed with the principles of constructivism in mind. To overcome the inadequacies in the books, it can be suggested that they be redesigned. In addition, so as not to ignore the technology dimension of the course, activities that encourage the students to prepare projects using the technology cycle should be included.

Keywords: Constructivism, coursebooks, science and technology education.

669 Collaborative Design System based on Object-Oriented Modeling of Supply Chain Simulation: A Case Study of Thai Jewelry Industry

Authors: Somlak Wannarumon, Apichai Ritvirool, Thana Boonrit

Abstract:

The paper proposes a new concept for developing a collaborative design system. The conceptual framework involves applying supply chain management simulation to collaborative design, called the 'SCM-Based Design Tool'. The system is developed particularly to support design activities and to integrate all facilities together. It aims to increase design productivity and creativity, so that designers and customers can collaborate through the system from the conceptual design stage onwards. JAG, a Jewelry Art Generator based on artificial intelligence techniques, is integrated into the system. Moreover, the proposed system can support users as a decision tool and with data propagation. The system covers the process from raw material supply to product delivery. Data management and information sharing are visually supported for designers and customers via the user interface. The system is developed in a Web-assisted product development environment. The prototype system is presented for the Thai jewelry industry as a demonstration, but it is applicable to other industries.

Keywords: Collaborative design, evolutionary art, jewelry design, supply chain management.

667 A Real-Time Rendering based on Efficient Updating of Static Objects Buffer

Authors: Youngjae Chun, Kyoungsu Oh

Abstract:

Real-time 3D applications have to guarantee interactive rendering speed, and the number of polygons that can be rendered is restricted by the performance of the graphics hardware and graphics algorithms. Generally, rendering performance increases drastically when handling only the dynamic 3D models, which are much fewer than the static ones. Since the shapes and colors of static objects do not change when the viewing direction is fixed, this information can be reused. We render huge numbers of polygons that cannot be handled by conventional rendering techniques in real time by using a static-object image and merging it with the rendering result of the dynamic objects. Performance necessarily decreases when the static-object image has to be updated, which includes removing a static object that starts to move and re-rendering the other static objects overlapped by the moving one. Based on the visibility of the object beginning to move, we can skip this updating process. As a result, we enhance rendering performance and reduce the differences in rendering speed between frames. The proposed method renders a total of 200,000,000 polygons, consisting of 500,000 dynamic polygons and the rest static, at about 100 frames per second.

Keywords: Occlusion query, Real-time rendering, Temporal coherence.

666 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges in this exploratory study are also reported and recommendations for future studies proposed.

Keywords: BIM, Construction projects, Cost estimation, NRM, Ontology.
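
The Methontology-built ontology itself is not included in the abstract. As a rough sketch of what machine-readable measurement rules can look like, the snippet below declares a hypothetical NRM-style work-item class and one individual as RDF/OWL triples with rdflib; every class, property and namespace name is invented for illustration and is not taken from the authors' ontology.

```python
# Sketch: a hypothetical fragment of an NRM-style cost-measurement ontology in RDF/OWL.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS, XSD

NRM = Namespace("http://example.org/nrm#")       # invented namespace for illustration
g = Graph()
g.bind("nrm", NRM)

# A class for measurable work items and a data property for their unit of measurement.
g.add((NRM.WorkItem, RDF.type, OWL.Class))
g.add((NRM.unitOfMeasurement, RDF.type, OWL.DatatypeProperty))
g.add((NRM.unitOfMeasurement, RDFS.domain, NRM.WorkItem))
g.add((NRM.unitOfMeasurement, RDFS.range, XSD.string))

# One example individual: in-situ concrete measured in cubic metres.
g.add((NRM.InSituConcrete, RDF.type, NRM.WorkItem))
g.add((NRM.InSituConcrete, RDFS.label, Literal("In-situ concrete")))
g.add((NRM.InSituConcrete, NRM.unitOfMeasurement, Literal("m3")))

print(g.serialize(format="turtle"))
```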
