Search results for: Dynamic Clusters algorithm

590 Improved Modulo 2^n + 1 Adder Design

Authors: Somayeh Timarchi, Keivan Navi

Abstract:

Efficient modulo 2^n + 1 adders are important for several applications, including residue number systems, digital signal processors and cryptography algorithms. In this paper we present a novel modulo 2^n + 1 addition algorithm for a recently introduced number system. The proposed approach aims to reduce the power dissipated. In a conventional modulo 2^n + 1 adder, all operands have (n+1)-bit length. To avoid using (n+1)-bit circuits, the diminished-1 and carry-save diminished-1 number systems can be used effectively in applications. We also derive two new architectures for designing modulo 2^n + 1 adders based on the n-bit ripple-carry adder. The first architecture is faster, whereas the second uses less hardware. In the proposed method, the special treatment required for zero operands in the diminished-1 number system is removed. The fastest modulo 2^n + 1 adders in the normal binary system require 3-operand adders; this problem is also resolved in this paper. The proposed architectures are compared with several efficient adders based on the ripple-carry adder and a high-speed adder. It is shown that the hardware overhead and power consumption are reduced. In addition to the power reduction, the power-delay product is also reduced in some cases.
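
As a rough illustration (not the authors' hardware design), the following Python sketch applies the diminished-1 addition rule mentioned above using ordinary integer arithmetic: an n-bit add followed by an inverted end-around carry. The zero-operand case that the paper removes in hardware is emulated here with an explicit flag, and the exhaustive check is only practical for small n.

```python
# Diminished-1 modulo 2^n + 1 addition sketch (illustrative only, not the paper's circuit).
# A value A in [1, 2^n] is stored as d(A) = A - 1; the value zero is flagged separately.

def dim1_add(a_star, b_star, n):
    """Add two diminished-1 operands modulo 2^n + 1.
    Returns (sum_star, is_zero): the diminished-1 sum and a zero flag."""
    mask = (1 << n) - 1
    total = a_star + b_star
    carry = total >> n                      # carry out of the n-bit adder
    s = (total + (1 - carry)) & mask        # add the inverted end-around carry
    # If a_star + b_star + 1 == 2^n, the true sum is zero (the special case).
    is_zero = (carry == 0) and (total + 1 == (1 << n))
    return s, is_zero

# Quick check against plain modular arithmetic for n = 4 (modulo 17).
n = 4
for A in range(1, (1 << n) + 1):
    for B in range(1, (1 << n) + 1):
        s_star, zero = dim1_add(A - 1, B - 1, n)
        expected = (A + B) % ((1 << n) + 1)
        got = 0 if zero else s_star + 1
        assert got == expected, (A, B)
print("diminished-1 addition matches (A + B) mod (2^n + 1)")
```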

Keywords: Modulo 2^n + 1 arithmetic, residue number system, low power, ripple-carry adders.

589 Simulation Study on the Indoor Thermal Comfort with Insulation on Interior Structural Components of Super High-Rise Residences

Authors: Y. Wang, H. Fukuda, A. Ozaki, H. Sato

Abstract:

In this study, we discuss how structural components with high thermal capacity affect the thermal comfort of super high-rise residences, considering different building orientations, structures, and insulation methods. We used the dynamic simulation software THERB (simulation of the thermal environment of residential buildings), which can estimate the temperature, humidity, sensible temperature, and heating/cooling load for multiple buildings. In past studies, we examined the AC loads (air-conditioning loads) associated with the interior structural parts and the AC-usage patterns of super high-rise residences. Super high-rise residences have more structural components, such as pillars and beams, than ordinary apartment buildings. The skeleton is generally made of concrete and steel, which have high thermal-storage capacities. The thermal-storage capacity of super high-rise residences is therefore considered to have a larger impact on the AC load and thermal comfort than that of ordinary residences. We show that the AC load of super high-rise units would be reduced by installing insulation on the surfaces of interior walls that are not usually insulated in Japan.

Keywords: High-rise Residences, AC Load, Thermal Comfort, Thermal Storage, Insulation Patterns

588 Fingerprint Compression Using Contourlet Transform and Multistage Vector Quantization

Authors: S. Esakkirajan, T. Veerakumar, V. Senthil Murugan, R. Sudhakar

Abstract:

This paper presents a new fingerprint coding technique based on the contourlet transform and multistage vector quantization. Wavelets have shown their ability to represent natural images that contain smooth areas separated by edges. However, wavelets cannot efficiently exploit the fact that the edges usually found in fingerprints are smooth curves. This issue is addressed by directional transforms, known as contourlets, which have the property of preserving edges. The contourlet transform is an extension of the wavelet transform in two dimensions using nonseparable and directional filter banks. The computation and storage requirements are the major difficulty in implementing a vector quantizer. In the full-search algorithm, the computation and storage complexity is an exponential function of the number of bits used in quantizing each frame of spectral information. The storage requirement in multistage vector quantization is lower than in full-search vector quantization. The coefficients of the contourlet transform are quantized by multistage vector quantization, and the quantized coefficients are encoded by Huffman coding. The results obtained are tabulated and compared with existing wavelet-based ones.
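
A minimal two-stage (residual) vector quantization sketch in Python, assuming the contourlet coefficients have already been grouped into training vectors; the contourlet transform and Huffman coding stages are not shown, and the vector dimension and codebook sizes are illustrative choices, not the paper's settings.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
vectors = rng.normal(size=(2000, 16))        # stand-in for contourlet coefficient vectors

# Stage 1: coarse codebook trained on the vectors themselves.
stage1 = KMeans(n_clusters=32, n_init=10, random_state=0).fit(vectors)
residual = vectors - stage1.cluster_centers_[stage1.labels_]

# Stage 2: finer codebook trained on the stage-1 residuals.
stage2 = KMeans(n_clusters=32, n_init=10, random_state=0).fit(residual)

def encode(v):
    """Return the pair of codebook indices for one coefficient vector."""
    i1 = stage1.predict(v.reshape(1, -1))[0]
    r = v - stage1.cluster_centers_[i1]
    i2 = stage2.predict(r.reshape(1, -1))[0]
    return i1, i2

def decode(i1, i2):
    return stage1.cluster_centers_[i1] + stage2.cluster_centers_[i2]

v = vectors[0]
print("reconstruction error:", np.linalg.norm(v - decode(*encode(v))))
```

Two 32-entry codebooks jointly address 1024 cells while each encoding step searches only 64 codewords, which is the storage and search saving over full-search VQ that the abstract refers to.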

Keywords: Contourlet Transform, Directional Filter bank, Laplacian Pyramid, Multistage Vector Quantization

587 RTCoord: A Methodology to Design WSAN Applications

Authors: J. Barbarán, M. Díaz, I. Esteve, D. Garrido, L. Llopis, B. Rubio

Abstract:

Wireless Sensor and Actor Networks (WSANs) constitute an emerging and pervasive technology that is attracting increasing interest in the research community for a wide range of applications. WSANs have two important requirements: coordination interactions and real-time communication to perform correct and timely actions. This paper introduces a methodology that eases the task of the application programmer by focusing on the coordination and real-time requirements of WSANs. The methodology uses a real-time component model, UM-RTCOM, to support the design and implementation of WSAN applications under the component-oriented paradigm. This enables the development of software components with attractive properties such as reusability and adaptability, which suit WSANs as highly dynamic environments with rapidly changing conditions. In addition, a high-level coordination model based on tuple channels (TC-WSAN) is integrated into the methodology through a component-based specification of this model in UM-RTCOM; this allows both sensor-actor and actor-actor coordination requirements in WSANs to be satisfied. Finally, we present the design and implementation of an application that shows how the methodology can be used to develop WSAN applications.

Keywords: Sensor networks, real time and embedded systems.

586 A Review and Comparative Analysis on Cluster Ensemble Methods

Authors: S. Sarumathi, P. Ranjetha, C. Saraswathy, M. Vaishnavi, S. Geetha

Abstract:

Clustering is an unsupervised learning technique for aggregating data objects into meaningful classes so that intra-cluster similarity is maximized and inter-cluster similarity is minimized in data mining. However, no single clustering algorithm proves to be the most effective in producing the best result. As a consequence, a challenging technique known as the cluster ensemble approach has emerged to address this problem and has proved a successful approach to the cluster analysis problem. The cluster ensemble's main goal is to combine individual clustering solutions in a way that preserves precision while improving on the quality of the individual data clusterings. Because of the massive and rapid creation of new approaches in the field of data mining, the ongoing interest in inventing novel algorithms necessitates a thorough examination of current techniques and future innovation. This paper presents a comparative analysis of various cluster ensemble approaches, including their methodologies, formal working processes, and standard accuracy and error rates. The community of clustering practitioners will benefit from this exploratory and clear review, which will aid in determining the most appropriate solution to the problem at hand.
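
To make the idea of a consensus function concrete, here is a short Python sketch of one common cluster ensemble scheme, evidence accumulation via a co-association matrix; the data, the choice of k-means members, and the final number of clusters are all illustrative assumptions rather than any method compared in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

X, _ = make_blobs(n_samples=200, centers=3, random_state=0)

# Ensemble members: k-means runs with different k and different seeds.
members = [KMeans(n_clusters=k, n_init=5, random_state=s).fit_predict(X)
           for s, k in enumerate([2, 3, 3, 4, 5])]

# Consensus function: co-association matrix = fraction of runs grouping i and j together.
n = X.shape[0]
coassoc = np.zeros((n, n))
for labels in members:
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= len(members)

# Final consensus partition: average-linkage clustering on 1 - co-association.
dist = 1.0 - coassoc
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
consensus = fcluster(Z, t=3, criterion="maxclust")
print("consensus cluster sizes:", np.bincount(consensus)[1:])
```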

Keywords: Clustering, cluster ensemble methods, consensus function, data mining, unsupervised learning.

585 Relative Radiometric Correction of Cloudy Multitemporal Satellite Imagery

Authors: Seema Biday, Udhav Bhosle

Abstract:

Repeated observation of a given area over time yields potential for many forms of change detection analysis. These repeated observations are confounded in terms of radiometric consistency due to changes in sensor calibration over time, differences in illumination, observation angles and variation in atmospheric effects. This paper demonstrates the applicability of an empirical relative radiometric normalization method to a set of multitemporal cloudy images acquired by the Resourcesat-1 LISS III sensor. The objective of this study is to detect and remove cloud cover and to normalize the images radiometrically. Cloud detection is achieved by using the Average Brightness Threshold (ABT) algorithm. The detected cloud is removed and replaced with data from another image of the same area. After cloud removal, the proposed normalization method is applied to reduce the radiometric influence caused by non-surface factors. This process identifies landscape elements whose reflectance values are nearly constant over time, i.e., the subset of non-changing pixels is identified using a frequency-based correlation technique. The quality of the radiometric normalization is statistically assessed by the R2 value and the mean square error (MSE) between each pair of analogous bands.
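
A compact Python sketch of the overall pipeline on synthetic data, under stated assumptions: the cloud mask is a simple brightness threshold standing in for ABT, and the pseudo-invariant pixels are picked by a temporal-difference rule rather than the paper's frequency-based correlation technique; the linear gain/offset fit, R2 and MSE checks mirror the assessment described above.

```python
import numpy as np

rng = np.random.default_rng(1)
reference = rng.uniform(30, 200, size=(512, 512))                    # reference-date band
subject = 0.8 * reference + 12 + rng.normal(0, 2, reference.shape)   # later date, shifted radiometry
subject[:80, :80] += 90                                              # simulated cloud patch

# 1) Brightness-threshold cloud mask on the subject image (ABT stand-in).
cloud_mask = subject > subject.mean() + 2 * subject.std()

# 2) Candidate pseudo-invariant pixels: cloud-free and temporally stable.
diff = np.abs(subject - reference)
stable = (~cloud_mask) & (diff < diff[~cloud_mask].mean())

# 3) Linear gain/offset from the stable pixels, applied to the whole band.
gain, offset = np.polyfit(subject[stable], reference[stable], deg=1)
normalized = gain * subject + offset

# 4) Quality of the normalization on the stable subset.
resid = normalized[stable] - reference[stable]
mse = np.mean(resid ** 2)
r2 = 1 - resid.var() / reference[stable].var()
print(f"gain={gain:.3f} offset={offset:.2f} MSE={mse:.3f} R2={r2:.3f}")
```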

Keywords: Correlation, Frequency domain, Multitemporal, Relative Radiometric Correction

584 Comparison of Deep Convolutional Neural Networks Models for Plant Disease Identification

Authors: Megha Gupta, Nupur Prakash

Abstract:

Identification of plant diseases has been performed using machine learning and deep learning models on datasets containing images of healthy and diseased plant leaves. The current study evaluates several deep learning models based on convolutional neural network architectures for the identification of plant diseases. For this purpose, the publicly available New Plant Diseases Dataset, an augmented version of the PlantVillage dataset available on the Kaggle platform and containing 87,900 images, has been used. The dataset contains images of 26 diseases of 14 different plants and images of 12 healthy plants. The CNN models selected for the study presented in this paper are AlexNet, ZFNet, VGGNet (four models), GoogLeNet, and ResNet (three models). The selected models are trained using PyTorch, an open-source machine learning library, on Google Colaboratory. A comparative study has been carried out to analyze the accuracy achieved by these models. The highest test accuracy and F1-score of 99.59% and 0.996, respectively, were achieved using GoogLeNet with a mini-batch momentum-based gradient descent learning algorithm.
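
A minimal PyTorch training sketch for the best-performing configuration named above (GoogLeNet with mini-batch momentum SGD). The folder path, batch size, learning rate, epoch count and image preprocessing are assumptions for illustration, not the paper's reported settings.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

DATA_DIR = "new_plant_diseases/train"   # hypothetical local copy arranged as class folders

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder(DATA_DIR, transform=tfm)
loader = DataLoader(train_set, batch_size=64, shuffle=True, num_workers=2)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.googlenet(num_classes=len(train_set.classes),
                         aux_logits=False, init_weights=True).to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)  # mini-batch momentum SGD

model.train()
for epoch in range(10):
    running = 0.0
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        running += loss.item()
    print(f"epoch {epoch}: mean loss {running / len(loader):.4f}")
```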

Keywords: comparative analysis, convolutional neural networks, deep learning, plant disease identification

583 The Mechanical Response of a Composite Propellant under Harsh Conditions

Authors: Xin Tong, Jin-sheng Xu, Xiong Chen, Ya Zheng

Abstract:

The aim of this paper is to study the mechanical properties of HTPB (hydroxyl-terminated polybutadiene) composite propellant under harsh conditions. It describes two sets of tests: uniaxial tensile tests at various strain rates (ranging from 0.0005 s^-1 to 1.5 s^-1) and temperatures (ranging from 223 K to 343 K), and high-cycle fatigue tests at low temperature (223 K, with frequencies set at 50, 100 and 150 Hz) using a DMA (dynamic mechanical analyzer). To highlight the effect of small pre-strain on the fatigue properties of HTPB propellant, quasi-static stretching was carried out before fatigue loading, and uniaxial tensile tests at constant strain rates were successively applied. The results reveal that the flow stress of the propellant increases with a reduction in temperature and a rise in strain rate, and that the strain rate-temperature equivalence relationship can be described by the TTSP (time-temperature superposition principle) incorporating a modified WLF equation. Moreover, the rate of performance degradation and damage accumulation of the propellant during the fatigue tests increased with increasing strain amplitude and loading frequency, while the initial quasi-static loading had a negative effect on the fatigue properties, as shown by comparing the stress-strain relations after the fatigue tests.
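
For readers unfamiliar with the time-temperature superposition step, here is a tiny Python sketch of the classical WLF shift factor; the C1 and C2 values are the commonly quoted "universal" constants, not the modified-WLF parameters fitted in this work, and the reference temperature is an assumption.

```python
import numpy as np

def wlf_shift_factor(T, T_ref, C1=17.44, C2=51.6):
    """log10 of the time-temperature superposition shift factor a_T from the
    classical WLF form: log10(a_T) = -C1*(T - T_ref) / (C2 + T - T_ref)."""
    return -C1 * (T - T_ref) / (C2 + (T - T_ref))

# Shift a strain-rate axis measured at temperature T onto the reference temperature.
T_ref = 293.0   # K, illustrative reference
for T in (223.0, 293.0, 343.0):
    print(f"T = {T:5.1f} K -> log10(a_T) = {wlf_shift_factor(T, T_ref):+.2f}")
```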

Keywords: Fatigue, HTPB propellant, tensile properties, time-temperature superposition principle.

582 Thermodynamic Modeling of the High Temperature Shift Converter Reactor Using Minimization of Gibbs Free Energy

Authors: H. Zare Aliabadi

Abstract:

The equilibrium chemical reactions taking place in the converter reactor of the Khorasan Petrochemical ammonia plant were studied using the minimization of Gibbs free energy. The Davidon-Fletcher-Powell (DFP) optimization procedure, with penalty terms included in a well-defined objective function, was used to minimize the Gibbs free energy function. It should be noted that in the DFP procedure with the corresponding penalty terms, the Hessian matrices for the compositions of the constituents in the converter reactor can be excluded; this can be considered the main advantage of the DFP optimization procedure. The effect of temperature and pressure on the equilibrium composition of the constituents was also investigated. The results obtained in this work were compared with data collected from the converter reactor of the Khorasan Petrochemical ammonia plant, and it was concluded that they are in good agreement with the industrial data. Notably, the algorithm developed in this work, in spite of its simplicity, has the advantage of short computation and convergence times.
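
A heavily simplified Python sketch of penalized Gibbs minimization for the water-gas shift system handled by a high temperature shift converter. SciPy has no DFP routine, so BFGS (a closely related quasi-Newton method that likewise avoids explicit Hessians) stands in for it, and the dimensionless standard chemical potentials below are placeholder numbers for illustration only, not thermodynamic data from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Water-gas shift species in the HT shift converter: CO, H2O, CO2, H2.
mu0_RT = np.array([-45.0, -60.0, -95.0, -10.0])   # placeholder mu0/RT values (illustrative only)
A = np.array([[1, 0, 1, 0],    # C balance
              [1, 1, 2, 0],    # O balance
              [0, 2, 0, 2]])   # H balance
b = A @ np.array([1.0, 1.0, 0.0, 0.0])   # feed: 1 mol CO + 1 mol H2O
P = 1.0                                   # pressure (ideal-gas mixture)
weight = 1e4                              # penalty weight on the element balances

def objective(y):
    n = np.exp(y)                         # log variables keep mole numbers positive
    n_tot = n.sum()
    gibbs = np.sum(n * (mu0_RT + np.log(n * P / n_tot)))
    penalty = weight * np.sum((A @ n - b) ** 2)
    return gibbs + penalty

res = minimize(objective, x0=np.log(np.full(4, 0.5)), method="BFGS")
n_eq = np.exp(res.x)
print("equilibrium moles (CO, H2O, CO2, H2):", np.round(n_eq, 4))
print("element balance residual:", np.round(A @ n_eq - b, 4))
```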

Keywords: Gibbs free energy, converter reactors, Chemical equilibrium

581 Fuzzy C-Means Clustering for Biomedical Documents Using Ontology Based Indexing and Semantic Annotation

Authors: S. Logeswari, K. Premalatha

Abstract:

Search is the most obvious application of information retrieval. The variety of widely obtainable biomedical data is enormous and expanding fast. This expansion means that existing techniques are not sufficient to extract the most interesting patterns from a collection according to user requirements. Recent research concentrates more on semantic-based searching than on traditional term-based searches. Algorithms for semantic search are implemented based on the relations that exist between the words of the documents. Ontologies are used as domain knowledge for identifying the semantic relations as well as for structuring the data for effective information retrieval. Annotation of data with ontology concepts is one of the widely used practices for clustering documents. In this paper, indexing based on concepts and annotations is proposed for clustering biomedical documents. The fuzzy c-means (FCM) clustering algorithm is used to cluster the documents. The performance of the proposed methods is compared with traditional term-based clustering on PubMed articles from five different disease communities. The experimental results show that the proposed methods outperform term-based fuzzy clustering.
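
A plain NumPy sketch of the standard fuzzy c-means updates (centers from membership-weighted means, memberships from inverse relative distances); the random "document vectors" stand in for the concept- and annotation-indexed document representations described above, and the cluster count and fuzzifier are illustrative.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, max_iter=200, tol=1e-5, seed=0):
    """Plain fuzzy c-means: returns (centers, membership matrix U of shape (n, c))."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)            # each row sums to 1
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1)), axis=2)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy use on random vectors standing in for concept/annotation-indexed documents.
docs = np.random.default_rng(1).random((100, 20))
centers, U = fuzzy_c_means(docs, c=5)
print("hard labels of first 10 docs:", U.argmax(axis=1)[:10])
```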

Keywords: MeSH Ontology, Concept Indexing, Annotation, semantic relations, Fuzzy c-means.

580 A New Approach to Face Recognition Using Dual Dimension Reduction

Authors: M. Almas Anjum, M. Younus Javed, A. Basit

Abstract:

In this paper a new approach to face recognition is presented that achieves a double dimension reduction, making the system computationally efficient, giving better recognition results and outperforming the common DCT technique of face recognition. In pattern recognition, the discriminative information of an image increases with resolution up to a certain extent; consequently, face recognition results change with face image resolution and become optimal at a certain resolution level. In the proposed model of face recognition, an image decimation algorithm is first applied to the face image to reduce its dimension to the resolution level that provides the best recognition results. Owing to its computational speed and feature extraction potential, the Discrete Cosine Transform (DCT) is then applied to the face image. A subset of DCT coefficients from low to mid frequencies that represents the face adequately and provides the best recognition results is retained. A tradeoff between the decimation factor, the number of DCT coefficients retained and the recognition rate with minimum computation is obtained. Preprocessing of the image is carried out to increase its robustness against variations in pose and illumination level. This new model has been tested on different databases, including the ORL, Yale and EME color databases.
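
A short Python sketch of the dual reduction idea (decimate, then keep a low-to-mid-frequency DCT block), with a nearest-neighbor match as a stand-in classifier; the simple sub-sampling decimation, the 16x16 coefficient block and the random toy images are assumptions for illustration, not the paper's tuned choices.

```python
import numpy as np
from scipy.fft import dctn

def face_features(image, decimation=2, block=16):
    """Decimate a grayscale face image, apply a 2D DCT, and keep a low-to-mid
    frequency block of coefficients as the feature vector."""
    small = image[::decimation, ::decimation].astype(float)   # simple decimation by sub-sampling
    coeffs = dctn(small, norm="ortho")
    return coeffs[:block, :block].ravel()                     # low/mid-frequency subset

def nearest_neighbor(probe, gallery_feats, gallery_ids):
    d = np.linalg.norm(gallery_feats - probe, axis=1)
    return gallery_ids[int(np.argmin(d))]

# Toy usage with random images standing in for ORL/Yale-sized faces.
rng = np.random.default_rng(0)
gallery = {pid: rng.random((112, 92)) for pid in range(5)}
feats = np.array([face_features(img) for img in gallery.values()])
ids = np.array(list(gallery.keys()))
probe = gallery[3] + 0.01 * rng.random((112, 92))             # noisy copy of subject 3
print("matched identity:", nearest_neighbor(face_features(probe), feats, ids))
```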

Keywords: Biometrics, DCT, Face Recognition, Illumination, Computation, Feature extraction.

579 Evaluation of the Rheological Properties of Bituminous Binders Modified with Biochars Obtained from Various Biomasses by Pyrolysis Method

Authors: Muhammed Ertuğrul Çeloğlu, Mehmet Yılmaz

Abstract:

In this study, apricot seed shell, walnut shell, and sawdust were chosen as biomass sources. The materials were sorted using a No. 50 sieve, and the sieved materials were subjected to a pyrolysis process at 400 °C, resulting in three different biochar products. The resulting biochar products were added to bitumen at three different rates (5%, 10% and 15%), producing modified bitumens. Penetration, softening point, rotational viscometer and dynamic shear rheometer (DSR) tests were conducted on the modified binders. Thus, the modified bitumens obtained with additives at three different rates of biochar produced at 400 °C from three different biomass sources were compared, and the effects of the pyrolysis temperature and additive rates were evaluated. As a result of the conducted tests, it was determined that the rheology of the pure bitumen improved significantly as a result of modifying the bitumen with biochar. Additionally, with the biochar additive, the rutting parameter values obtained from the softening point, viscometer and DSR tests increased, while the penetration and phase angle values decreased. It was also observed that the most effective biomass was sawdust, while the least effective was ground apricot seed shell.

Keywords: Rheology, biomass, pyrolysis, biochar.

578 Foundation Retrofitting of Storage Tank under Seismic Load

Authors: Seyed Abolhasan Naeini, Mohammad Hossein Zade, E. Izadi, M. Hossein Zade

Abstract:

The seismic behavior of liquid storage tanks differs from that of conventional structures, which makes their responses more complicated. Uplift and excessive settlement due to liquid sloshing are the most frequent damages in cylindrical liquid tanks after shell buckling failure modes. Liquid storage tanks are conventionally built on a compacted layer of soil as a foundation because of the simple construction, but in some cases retrofitting is essential. The seismic behavior of a tank can be improved by modifying its dynamic characteristics for the governing seismic loads as well as by retrofitting and improving the base ground. This paper focuses on a typical steel tank on loose, medium and stiff sandy soils and evaluates the displacement of the tank before and after retrofitting. The Abaqus program was selected for its ability to include shell and structural steel elements, soil-structure interaction, geometrical nonlinearities and contact-type elements. The results show a considerable decrease in settlement and uplift in the case of the retrofitted tank. Also, increasing the shear strength parameters of the soil improved the performance of the liquid storage tank under seismic load.

Keywords: Steel tank, soil-structure, sandy soil, seismic load.

577 A New Approach to Design an Efficient CIC Decimator Using Signed Digit Arithmetic

Authors: Vishal Awasthi, Krishna Raj

Abstract:

Any digital processing performed on a signal with a larger Nyquist interval requires more computation than signal processing performed on a smaller Nyquist interval. Sampling-rate alteration generates unwanted effects in the system, such as spectral aliasing and spectral imaging, during signal processing. A multirate, multistage implementation of a digital filter can give significant computational savings compared with a single-rate filter designed for sample-rate conversion. In this paper, we present an efficient cascaded integrator-comb (CIC) decimation filter that performs fast downsampling using a signed-digit adder algorithm and compensates the frequency droop that arises due to the aliasing effect during the decimation process. The proposed compensated CIC decimation filter structure with a hybrid signed-digit (HSD) fast adder improves downsampling speed by 65.15% compared with a ripple-carry adder (RCA), and reduces area and power by 57.5% and 0.01%, respectively, compared with signed-digit (SD) adder algorithms.
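
A behavioral Python sketch of a plain CIC decimator (integrator stages at the high rate, rate reduction, comb stages at the low rate). It illustrates only the filter structure: the signed-digit adder hardware and the droop-compensation stage proposed in the paper are not modeled, and the stage counts and rate change are example values.

```python
import numpy as np

def cic_decimate(x, R=8, N=3, M=1):
    """N-stage cascaded integrator-comb decimator with rate change R and
    differential delay M, followed by DC-gain normalization."""
    y = np.asarray(x, dtype=np.float64)
    for _ in range(N):                 # integrator section at the high rate
        y = np.cumsum(y)
    y = y[::R]                         # rate reduction
    for _ in range(N):                 # comb section at the low rate
        y = y - np.concatenate((np.zeros(M), y[:-M]))
    return y / float(R * M) ** N       # normalize out the (R*M)^N DC gain

# Quick check: a DC input should settle to the same level after decimation.
x = np.ones(256)
print(cic_decimate(x)[-4:])            # approximately [1, 1, 1, 1]
```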

Keywords: Sampling rate conversion, Multirate Filtering, Compensation Theory, Decimation filter, CIC filter, Redundant signed digit arithmetic, Fast adders.

576 Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring

Authors: Youngji Yoo, Cheong-Sool Park, Jun Seok Kim, Young-Hak Lee, Sung-Shick Kim, Jun-Geol Baek

Abstract:

In the semiconductor manufacturing process, large amounts of data are collected from various sensors across multiple facilities. The data collected from the sensors have several different characteristics due to variables such as product types, preceding processes and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data to detect out-of-control states of processes. Although the collected data have different characteristics, using them directly as inputs to SQC will increase the variation of the data, require wide control limits, and decrease the ability to detect out-of-control states. Therefore, it is necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree using a split algorithm based on the Pearson distribution system to handle non-normal distributions in a parametric way. The regression tree finds similar properties of the data across different variables. Experiments using real semiconductor manufacturing process data show improved fault-detection performance.

Keywords: Semiconductor, non-normal mixed process data, clustering, Statistical Quality Control (SQC), regression tree, Pearson distribution system.

575 An Intelligent Nondestructive Testing System of Ultrasonic Infrared Thermal Imaging Based on Embedded Linux

Authors: Hao Mi, Ming Yang, Tian-yue Yang

Abstract:

Ultrasonic infrared nondestructive testing is a testing method offering high speed, accuracy and localization. However, some problems remain: detection requires manual real-time field judgment, and the methods for storing and viewing results are still primitive. An intelligent nondestructive detection system based on embedded Linux is put forward in this paper. The hardware part of the detection system is based on an ARM (Advanced RISC Machine) core, and an embedded Linux system is built to realize image processing and defect detection on thermal images. The CLAHE algorithm and a Butterworth filter are used to process the thermal image, and then the Boa server and CGI (Common Gateway Interface) technology are used to transmit the test results to the display terminal through the network for real-time and remote monitoring. The system also reduces labor and removes the obstacle of manual judgment. According to the experimental results, the system provides a convenient and quick solution for industrial nondestructive testing.
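
A brief Python/OpenCV sketch of the image-processing step named above: CLAHE contrast enhancement followed by a frequency-domain Butterworth low-pass. The clip limit, tile size, cutoff and filter order are illustrative assumptions, and a random array stands in for an actual thermal camera frame.

```python
import cv2
import numpy as np

def enhance_thermal(gray, clip=2.0, tiles=(8, 8), cutoff=0.15, order=2):
    """CLAHE contrast enhancement followed by a frequency-domain
    Butterworth low-pass filter to suppress high-frequency noise."""
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=tiles)
    eq = clahe.apply(gray.astype(np.uint8))

    rows, cols = eq.shape
    u = np.fft.fftfreq(rows)[:, None]
    v = np.fft.fftfreq(cols)[None, :]
    D = np.sqrt(u ** 2 + v ** 2)                     # normalized radial frequency
    H = 1.0 / (1.0 + (D / cutoff) ** (2 * order))    # Butterworth low-pass response

    spectrum = np.fft.fft2(eq.astype(float))
    smoothed = np.real(np.fft.ifft2(spectrum * H))
    return np.clip(smoothed, 0, 255).astype(np.uint8)

# Usage on a synthetic noisy "thermal" frame (a real frame would come from the camera).
frame = (np.random.default_rng(0).random((240, 320)) * 255).astype(np.uint8)
out = enhance_thermal(frame)
print(out.shape, out.dtype)
```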

Keywords: Remote monitoring, non-destructive testing, embedded Linux system, image processing.

574 Finite Element Analysis of Sheet Metal Airbending Using Hyperform LS-DYNA

Authors: Himanshu V. Gajjar, Anish H. Gandhi, Harit K. Raval

Abstract:

Air bending is one of the important metal forming processes because of its simplicity and wide field of application. The accuracy of the analytical and empirical models reported for the analysis of bending processes is governed by simplifying assumptions, and these models do not consider the effect of dynamic parameters. A number of studies report the finite element analysis (FEA) of V-bending, U-bending, and air V-bending processes. FEA of bending is found to be very sensitive to many physical and numerical parameters, and FE models must be computationally efficient for practical use. The reported work presents the 3D FEA of the air bending process using Hyperform LS-DYNA and its comparison with published 3D FEA results of air bending in Ansys LS-DYNA and with experimental results. Observing the planar symmetry and assuming a plane-strain condition, the air bending problem was modeled in 2D with a symmetric boundary condition across the width. The stress-strain results of the 2D FEA were compared with the 3D FEA results and experiments. Simplifying the air bending problem from 3D to 2D resulted in a tremendous reduction in solution time with only a marginal effect on the stress-strain results. FE model simplification by studying the problem symmetry is a more efficient and practical approach for the solution of more complex, large-dimension, slow forming processes.

Keywords: Air V-bending, Finite element analysis, Hyperform LS-DYNA, Planar symmetry.

573 Performance of Steel Frame with a Viscoelastic Damper Device under Earthquake Excitation

Authors: M. H. Mehrabi, S. S. Ghodsi, Zainah Ibrahim, Meldi Suhatril

Abstract:

Standard routes for upgrading existing buildings to improve their seismic response can be expensive in terms of both time and cost due to the modifications required to the foundations. As a result, interest has grown in the installation of viscoelastic dampers (VEDs) in mid and high-rise buildings. Details of a low-cost viscoelastic passive control device, the rotary rubber braced damper (RRBD), are presented in this paper. This design has the added benefits of being lightweight and simple to install. Experimental methods and finite element modeling were used to assess the performance of the proposed VED design and its effect on building response during earthquakes. The analyses took into account the behaviors of non-linear materials and large deformations. The results indicate that the proposed RRBD provides high levels of energy absorption, ensuring the stable cyclical response of buildings in all scenarios considered. In addition, time history analysis was employed in this study to evaluate the RRBD’s ability to control the displacements and accelerations experienced by steel frame structures. It was demonstrated that the device responds well even at low displacements, highlighting its suitability for use in seismic events of varying severity.

Keywords: Dynamic response, passive control, performance test, seismic protection.

572 Investigating the Vehicle-Bicyclists Conflicts Using LIDAR Sensor Technology at Signalized Intersections

Authors: Alireza Ansariyar, Mansoureh Jeihani

Abstract:

Light Detection and Ranging (LiDAR) sensors are capable of recording traffic data, including the number of passing vehicles and bicyclists, the speed of vehicles and bicyclists, and the number of conflicts among both road user groups. In order to collect real-time traffic data and investigate the safety of different road users, a LiDAR sensor was installed at the Cold Spring Ln – Hillen Rd intersection in Baltimore City. The frequency and severity of the collected real-time conflicts were analyzed, and the results highlighted that 122 conflicts were recorded over a 10-month interval from May 2022 to February 2023. An image-processing algorithm was employed to derive a safety Measure of Effectiveness (MOE) that identifies critical zones for bicyclists upon entering each respective zone of the signalized intersection. Considering the trajectories of the conflicts, the analysis demonstrated that conflicts in the northern approach (zone N) are more frequent and severe. Additionally, severe vehicle-bike conflicts are more likely to occur in sunny weather.

Keywords: LiDAR sensor, Post Encroachment Time threshold, vehicle-bike conflicts, measure of effectiveness, weather condition.

571 A Multi-Feature Deep Learning Algorithm for Urban Traffic Classification with Limited Labeled Data

Authors: Rohan Putatunda, Aryya Gangopadhyay

Abstract:

Acoustic sensors, if embedded in smart street lights, can help capture the activities (car honking, sirens, events, traffic, etc.) in cities. The acoustic data from such scenarios are complex due to multiple audio streams originating from different events, and when they are decomposed into independent signals, the amount of retrieved data is small, which is inadequate for training deep neural networks. In this paper, we address two challenges: a) separating the mixed signals, and b) developing an efficient acoustic classifier under data paucity. To address these challenges, we propose an architecture with supervised deep learning, in which the captured mixed acoustic data are analyzed with the Fast Fourier Transform (FFT), the noise is filtered from the signal, and the result is decomposed into independent signals by fast independent component analysis (FastICA). To address the challenge of data paucity, we propose a multi-feature deep neural network whose high performance is reflected in our experiments when compared with a conventional convolutional neural network (CNN) and a multi-layer perceptron (MLP).
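
A small Python sketch of the separation front end only: two synthetic sources are mixed, unmixed with FastICA, and truncated FFT magnitude spectra are taken as features. The synthetic sources, mixing matrix and feature size are assumptions; the downstream multi-feature DNN, CNN and MLP classifiers compared in the paper are not shown.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000)

# Two synthetic street-sound sources (stand-ins for, e.g., a siren and engine noise).
siren = np.sin(2 * np.pi * 600 * t + 3 * np.sin(2 * np.pi * 3 * t))
engine = np.sign(np.sin(2 * np.pi * 90 * t)) * 0.5
sources = np.c_[siren, engine]

# Two-microphone mixture of the sources, plus a little sensor noise.
mixing = np.array([[1.0, 0.6], [0.4, 1.0]])
mixed = sources @ mixing.T + 0.02 * rng.normal(size=sources.shape)

# Blind separation with FastICA, then FFT magnitude features per recovered signal.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(mixed)                        # shape (samples, components)

features = np.abs(np.fft.rfft(recovered, axis=0))[:256].T   # truncated spectra as features
print("feature matrix shape (signals x FFT bins):", features.shape)
```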

Keywords: FFT, ICA, vehicle classification, multi-feature DNN, CNN, MLP.

570 Pattern Matching Based on Regular Tree Grammars

Authors: Riad S. Jabri

Abstract:

Pattern matching based on regular tree grammars has been widely used in many areas of computer science. In this paper, we propose a pattern matcher within the framework of code generation, based on a generic and formalized approach. According to this approach, parsers for regular tree grammars are adapted to a general pattern matching solution, rather than adapting the pattern matching to their parsing behavior. Hence, we first formalize the construction of the pattern matches with respect to input trees drawn from a regular tree grammar in the form of so-called match trees. Then, we adopt a recently developed generic parser and tightly couple its parsing behavior with this construction. In addition to its generality, the resulting pattern matcher is characterized by its soundness and efficient implementation. This is demonstrated by the proposed theory and by the algorithms derived for its implementation. A comparison with similar and well-known approaches, such as those based on tree automata and LR parsers, has shown that our pattern matcher can be applied to a broader class of grammars and achieves a better approximation of pattern matches in one pass. Furthermore, its use as a machine code selector incurs minimal overhead, due to the balanced distribution of the cost computations into static ones, during parser generation time, and dynamic ones, during parsing time.
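
To give a flavor of tree pattern matching in a code-selection setting, here is a toy Python matcher over expression trees. It is a plain structural matcher with a wildcard, far simpler than the grammar-driven match-tree construction formalized in the paper, and the tree encoding and example patterns are invented for illustration.

```python
# Trees are nested tuples: (operator, child, child, ...) with strings at the leaves.
WILDCARD = "_"

def matches_at(pattern, tree):
    """True if `pattern` matches the subtree rooted at `tree` (WILDCARD matches anything)."""
    if pattern == WILDCARD:
        return True
    if isinstance(pattern, str) or isinstance(tree, str):
        return pattern == tree
    if pattern[0] != tree[0] or len(pattern) != len(tree):
        return False
    return all(matches_at(p, t) for p, t in zip(pattern[1:], tree[1:]))

def match_sites(pattern, tree, path=()):
    """Bottom-up traversal collecting the paths of all subtrees the pattern matches."""
    sites = []
    if not isinstance(tree, str):
        for i, child in enumerate(tree[1:], start=1):
            sites.extend(match_sites(pattern, child, path + (i,)))
    if matches_at(pattern, tree):
        sites.append(path)
    return sites

# Example: find every add-of-the-constant-1 in an expression tree, e.g. for instruction selection.
expr = ("add", ("mul", "a", "b"), ("add", "c", "1"))
print(match_sites(("add", WILDCARD, "1"), expr))   # -> [(2,)]
```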

Keywords: Bottom-up automata, Code selection, Pattern matching, Regular tree grammars, Match trees.

569 A Reliable Secure Multicast Key Distribution Scheme for Mobile Adhoc Networks

Authors: D. SuganyaDevi, G. Padmavathi

Abstract:

Reliable secure multicast communication in mobile ad hoc networks is challenging due to their inherent characteristics: an infrastructure-less architecture lacking a central authority, high packet loss rates, and limited resources such as bandwidth, time and power. Many emerging commercial and military applications require secure multicast communication in ad hoc environments. Hence, key management is the fundamental challenge in achieving reliable secure communication using multicast key distribution for mobile ad hoc networks. Thus, in designing a reliable multicast key distribution scheme, reliability and congestion control over throughput are essential components. This paper proposes and evaluates the performance of an enhanced optimized multicast cluster tree algorithm with the destination-sequenced distance vector routing protocol to provide reliable multicast key distribution. Simulation results in NS2 accurately predict the performance of the proposed scheme in terms of key delivery ratio and packet loss rate under varying network conditions. The proposed scheme achieves reliability while exhibiting a low packet loss rate and a high key delivery ratio compared with the existing scheme.

Keywords: Key Distribution, Mobile Adhoc Network, Multicast and Reliability.

568 Mathematical Approach towards Fault Detection and Isolation of Linear Dynamical Systems

Authors: V. Manikandan, N. Devarajan

Abstract:

The main objective of this work is to provide fault detection and isolation based on Markov parameters for residual generation and a neural network for fault classification. The diagnostic approach is accomplished in two steps. In step 1, the system is identified using a series of input/output variables through an identification algorithm. In step 2, the fault is diagnosed by comparing the Markov parameters of the faulty and non-faulty systems. An artificial neural network trained on predetermined faulty conditions serves to classify the unknown fault. In step 1, the identification is done by first formulating a Hankel matrix from the input/output variables and then decomposing the matrix via the singular value decomposition technique. For identifying the system online, a sliding window approach is adopted wherein a window slides over a subset of 'n' input/output variables. The faults are introduced at arbitrary instants and the identification is carried out online. Fault residues are extracted by comparing the first five Markov parameters of the faulty and non-faulty systems. The proposed diagnostic approach is illustrated on benchmark problems with encouraging results.
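
A simplified Python sketch of the residual idea: estimate the first few Markov parameters (impulse-response samples) of a nominal and a faulty plant from input/output records and take their difference as the residual feature. For brevity this uses a least-squares FIR fit rather than the Hankel-matrix/SVD identification described in the abstract, and the first-order plant and fault are invented for illustration.

```python
import numpy as np
from scipy.linalg import toeplitz

def markov_parameters(u, y, p=5):
    """Least-squares estimate of the first p Markov parameters (impulse-response
    samples) of a SISO system from input/output records u, y."""
    U = toeplitz(u, np.r_[u[0], np.zeros(p - 1)])   # columns are delayed copies of the input
    h, *_ = np.linalg.lstsq(U, y, rcond=None)
    return h

def simulate(a, b, u):
    """First-order stand-in plant y[k] = a*y[k-1] + b*u[k]."""
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = a * y[k - 1] + b * u[k]
    return y

rng = np.random.default_rng(0)
u = rng.normal(size=2000)

h_nominal = markov_parameters(u, simulate(0.8, 1.0, u))
h_faulty = markov_parameters(u, simulate(0.8, 0.6, u))   # simulated actuator gain fault

residual = h_faulty - h_nominal   # feature vector a classifier (e.g., an ANN) would consume
print("Markov-parameter residual:", np.round(residual, 3))
```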

Keywords: Artificial neural network, Fault Diagnosis, Identification, Markov parameters.

567 Dynamic Clustering Estimation of Tool Flank Wear in Turning Process using SVD Models of the Emitted Sound Signals

Authors: A. Samraj, S. Sayeed, J. E. Raja, J. Hossen, A. Rahman

Abstract:

Monitoring tool flank wear without affecting throughput is considered a prudent method in production technology; the examination has to be done without affecting the machining process. In this paper we propose a novel method to determine tool flank wear by observing the sound signals emitted during the turning process. The workpiece materials used here are steel and aluminum, and the cutting insert was a carbide material. Two different cutting speeds were used in this work. The feed rate and the cutting depth were constant, whereas the flank wear was a variable. The sound signals emitted by a fresh tool (0 mm flank wear), a slightly worn tool (0.2-0.25 mm flank wear) and a severely worn tool (0.4 mm and above flank wear) during the turning process were recorded separately using a highly sensitive microphone. Singular value decomposition was applied to these sound signals to extract the characteristic sound components. The results showed that an increase in tool flank wear correlates with an increase in the values of the SVD features produced from the sound signals for both materials. Hence it can be concluded that monitoring tool flank wear during the turning process using SVD features with fuzzy c-means classification of the emitted sound signal is a promising and relatively simple method.
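
A minimal sketch of the SVD feature step, assuming the recorded sound is framed into a matrix whose leading singular values serve as features; the frame length, sampling rate and the synthetic "fresh" and "worn" recordings are illustrative, and the fuzzy c-means classification stage is not shown.

```python
import numpy as np

def svd_sound_features(signal, frame_len=1024, n_values=10):
    """Frame the recorded sound signal into a matrix and return its leading
    singular values as wear-sensitive features."""
    n_frames = len(signal) // frame_len
    frames = np.reshape(signal[:n_frames * frame_len], (n_frames, frame_len))
    singular_values = np.linalg.svd(frames, compute_uv=False)
    return singular_values[:n_values]

# Toy comparison: extra broadband energy in the "worn tool" recording raises
# the non-dominant singular values relative to the "fresh tool" recording.
rng = np.random.default_rng(0)
t = np.arange(48000) / 48000.0
fresh = np.sin(2 * np.pi * 2000 * t) + 0.05 * rng.normal(size=t.size)
worn = np.sin(2 * np.pi * 2000 * t) + 0.6 * rng.normal(size=t.size)
print("fresh:", np.round(svd_sound_features(fresh)[:3], 1))
print("worn: ", np.round(svd_sound_features(worn)[:3], 1))
```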

Keywords: Fuzzy c-means, Microphone, Singular Value Decomposition, Tool Flank Wear.

566 In Vitro Study of Coded Transmission in Synthetic Aperture Ultrasound Imaging Systems

Authors: Ihor Trots, Yuriy Tasinkevych, Andrzej Nowicki, Marcin Lewandowski

Abstract:

In this paper, a study of the synthetic transmit aperture method applying Golay coded transmission to medical ultrasound imaging is presented. Longer coded excitation allows the total energy of the transmitted signal to be increased without increasing the peak pressure. Moreover, the signal-to-noise ratio and penetration depth are improved while high ultrasound image resolution is maintained. In this work, a 128-element linear transducer array with 0.3 mm inter-element spacing was used, excited by a one-cycle burst and by 8- and 16-bit Golay coded sequences at a nominal frequency of 4 MHz. To generate a spherical wave covering the full image region, a single-element transmission aperture was used and all elements received the echo signals. A comparison of 2D ultrasound images of a tissue-mimicking phantom and in vitro measurements of beef liver is presented to illustrate the benefits of the coded transmission. The results were obtained using the synthetic aperture algorithm with transmit and receive signal correction based on a single-element directivity function.
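
A short numerical illustration (not the authors' beamforming code) of why Golay codes suit coded excitation: the summed autocorrelations of a complementary pair cancel all range sidelobes, leaving a single peak. The recursive construction below is the standard one; the 16-bit length matches the longer code mentioned above.

```python
import numpy as np

def golay_pair(n_steps):
    """Binary (+1/-1) Golay complementary pair of length 2**n_steps,
    built by the standard recursive concatenation."""
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(n_steps):
        a, b = np.concatenate((a, b)), np.concatenate((a, -b))
    return a, b

a, b = golay_pair(4)                       # 16-bit pair
acf = np.correlate(a, a, "full") + np.correlate(b, b, "full")

# Complementary property: a single peak of height 2N and zero sidelobes,
# which is what preserves axial resolution after pulse compression.
peak = acf[len(a) - 1]
sidelobes = np.delete(acf, len(a) - 1)
print("peak:", peak, "max sidelobe:", np.max(np.abs(sidelobes)))
```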

Keywords: Golay coded sequences, radiation pattern, signal processing, synthetic aperture, ultrasound imaging.

565 Analytical Studies on Volume Determination of Leg Ulcer using Structured Light and Laser Triangulation Data Acquisition Techniques

Authors: M. Abdul-Rani, K. K. Chong, A. F. M. Hani, Y. B. Yap, A. Jamil

Abstract:

Imaging is defined as the process of obtaining geometric images, either two-dimensional or three-dimensional, by scanning or digitizing existing objects or products. In this research, it is applied to retrieve 3D information of the human skin surface in a medical application. This research focuses on analyzing and determining the volume of leg ulcers using imaging devices. Volume determination is one of the important criteria in the clinical assessment of leg ulcers: the volume and size of the leg ulcer wound indicate whether it is responding to treatment, i.e. healing or worsening. Different imaging techniques are expected to give different results (and accuracies) in generating data and images. A midpoint projection algorithm was used to reconstruct the cavity into a solid model and compute the volume. Misinterpretation of the results can affect treatment efficacy. The objective of this paper is to compare the accuracy of two 3D data acquisition methods, laser triangulation and structured light. Using models with known volume, it was shown that the structured-light-based 3D technique produces better accuracy than the laser triangulation data acquisition method for leg ulcer volume determination.

Keywords: Imaging, Laser Triangulation, Structured Light, Volume Determination.

564 Analysis of Impact Load Induced by Ultrasonic Cavitation Bubble Collapse Using Thin Film Pressure Sensors

Authors: Moiz S. Vohra, Nagalingam Arun Prasanth, Wei L. Tan, S. H. Yeo

Abstract:

Understanding the generation and collapse of acoustic cavitation bubbles is a prerequisite for the application of cavitation erosion. Microbubbles generated by the rapid pressure fluctuations induced by a propagating ultrasonic wave lead to the formation of high-velocity microjets and/or shock waves upon collapse. Due to the wide application of ultrasonics, it is important to characterize and understand the cavitation collapse pressure under the radiating surface at different conditions. A comparative investigation is carried out to determine the impact load and dynamic pressure distribution exerted upon bubble collapse using thin film pressure sensors. Measurements were recorded at different input conditions such as amplitude, stand-off distance, insertion depth of the horn inside the liquid and pulse on-off time of the acoustic vibrations. An impact force of 2.97 N is recorded at an amplitude of 108 μm and a stand-off distance of 1 mm from the sensor film, whereas an impulsive force as low as 0.4 N is recorded at an amplitude of 12 μm and a stand-off distance of 5 mm from the sensor film. The results drawn from the investigation indicate that a variety of impact loads can be achieved by controlling the generation and collapse of bubbles, making the approach suitable for numerous applications.

Keywords: Ultrasonic cavitation, bubble collapse, pressure mapping sensor, impact load.

563 Effects of Blast Load on Historic Stone Masonry Buildings in Canada: A Review and Analytical Study

Authors: Abass Braimah, Maha Hussein Abdallah

Abstract:

The global rise of terrorist attacks on building infrastructure with economic and heritage significance has increased awareness of the possibility of terrorism in Canada. Many structures in Canada that are at risk of terrorist attack, including government buildings, were built many years ago of historic stone masonry construction. Although many researchers are investigating ways to retrofit stone masonry buildings to mitigate the effect of blast loading, the lack of knowledge of the dynamic behavior of historic stone masonry structures under blast loads makes it difficult to ascertain the effectiveness of the retrofitting techniques. This paper presents a review of the open-source literature on experimental and numerical studies of stone masonry structures under blast loads. This review yielded very little information on the response of historic stone masonry structures under blast loads; thus, a comprehensive study is needed to understand blast load effects on historic stone masonry buildings. The out-of-plane response of historic masonry structures to blast loads is investigated by using single-degree-of-freedom analysis. This approach provides equations that can be used effectively in the analysis of historic masonry walls subjected to out-of-plane blast loading.
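
A generic single-degree-of-freedom sketch in Python of a wall strip under an idealized triangular blast pulse; the equivalent mass, stiffness, damping and pulse parameters are illustrative assumptions, not values from the paper's analysis.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative SDOF idealization of a masonry wall strip (all parameters are assumptions).
m = 2000.0                        # equivalent mass, kg
k = 4.0e6                         # equivalent stiffness, N/m
c = 2 * 0.05 * np.sqrt(k * m)     # 5% of critical damping
F0, td = 8.0e4, 0.015             # peak reflected force (N) and positive-phase duration (s)

def blast_force(t):
    """Idealized triangular blast pulse: F0 at t = 0 decaying linearly to zero at td."""
    return F0 * (1 - t / td) if t < td else 0.0

def sdof(t, y):
    x, v = y
    return [v, (blast_force(t) - c * v - k * x) / m]

sol = solve_ivp(sdof, (0.0, 0.2), [0.0, 0.0], max_step=1e-4)
print(f"peak out-of-plane displacement: {1000 * np.max(np.abs(sol.y[0])):.2f} mm")
```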

Keywords: Blast loads, historical buildings, masonry structure, single-degree-of-freedom analysis.

562 Designing Information Systems in Education as Prerequisite for Successful Management Results

Authors: Vladimir Simovic, Matija Varga, Tonco Marusic

Abstract:

This research paper presents matrix technology models and examples of information systems in education (in the Republic of Croatia and in Germany) in support of business, education (learning and teaching) and e-learning. We researched and described the aims and objectives of the main processes in education and technology, with the main matrix classes of data. In this paper, we give an example of matrix technology with a detailed description of the processes related to specific data classes in education, together with an example module that supports the processes 'Filling in the directory and the diary of work' and 'Evaluation'. Also, at the lower level, we researched and described all activities which take place within the lower-level processes in education, and we described the characteristics and functioning of the modules 'Fill the directory and the diary of work' and 'Evaluation'. For the analysis of the affinity between the aforementioned processes and/or sub-processes, we used our application model created in Visual Basic, which is based on an algorithm for analyzing the affinity between the observed processes and/or sub-processes.

Keywords: Designing, education management, information systems, matrix technology, process affinity.

561 Utilizing Biological Models to Determine the Recruitment of the Irish Republican Army

Authors: Erika Ann Schaub, Christian J Darken

Abstract:

Sociological models (e.g., social network analysis, small-group dynamics and gang models) have historically been used to predict the behavior of terrorist groups. However, they may not be the most appropriate method for understanding the behavior of terrorist organizations because these models were not initially intended to incorporate the violent behavior of their subjects. Rather, models that incorporate life-and-death competition between subjects, i.e., models utilized by scientists to examine the behavior of wildlife populations, may provide a more accurate analysis. This paper suggests the use of biological models to attain a more robust method for understanding the behavior of terrorist organizations compared with traditional methods. This study also describes how a biological population model incorporating predator-prey behavior factors can predict terrorist organizational recruitment behavior, for the purpose of understanding the factors that govern the growth and decline of terrorist organizations. The Lotka-Volterra model, a biological model based on a predator-prey relationship, is applied to a highly suggestive case study, that of the Irish Republican Army. This case study illuminates how a biological model can be utilized to understand the actions of a terrorist organization.
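
For reference, a minimal Python integration of the classical Lotka-Volterra equations named above; the parameter values, initial conditions and the mapping of "prey" to a recruit pool and "predator" to active membership are illustrative assumptions, not calibrated IRA data from the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters: x = pool of potential recruits ("prey"),
# y = active members ("predator").
alpha, beta, delta, gamma = 1.0, 0.1, 0.05, 0.8

def lotka_volterra(t, z):
    x, y = z
    dx = alpha * x - beta * x * y       # recruit pool grows, depleted by recruitment
    dy = delta * x * y - gamma * y      # membership grows by recruitment, shrinks by attrition
    return [dx, dy]

sol = solve_ivp(lotka_volterra, (0, 50), [40.0, 9.0], dense_output=True)
t = np.linspace(0, 50, 6)
for ti, (x, y) in zip(t, sol.sol(t).T):
    print(f"t={ti:4.1f}  recruit pool={x:7.2f}  members={y:7.2f}")
```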

Keywords: Biological Models, Lotka-Volterra Predator-Prey Model, Terrorist Organizational Behavior, Terrorist Recruitment.
