Search results for: Natural Language Processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3670

670 Challenges in Adopting 3R Concept in the Heritage Building Restoration

Authors: H. H. Goh, K. C. Goh, T. W. Seow, N. S. Said, S. E. P. Ang

Abstract:

Malaysia is rich in historic buildings, particularly in the states of Penang and Malacca. Restoration activities are increasingly important as these states are recognized as UNESCO World Heritage Sites. Restoration helps to maintain the uniqueness and value of a heritage building; however, the increase in restoration activities has resulted in large quantities of waste. To cope with this problem, the 3R concept (reduce, reuse and recycle), one of the waste management hierarchies, is introduced. Compared with the construction industry, this concept has yet to be applied in the building restoration industry. Therefore, this study aims to promote the 3R concept in the heritage building restoration industry by examining its importance and identifying the challenges in applying it. The study focused on contractors and consultants involved in heritage restoration projects in Penang. A literature review and interviews were used to reach the research objectives, and the data obtained were analyzed using content analysis. According to the research, applying the 3R concept is important to conserve natural resources and reduce pollution problems; however, the limited space available for organising waste is an obstruction to its implementation. In conclusion, the 3R concept plays an important role in promoting environmental conservation and helping to reduce construction waste.

Keywords: 3R Concept, Heritage building, Restoration activities.

PDF Downloads: 3163
669 Design and Fabrication of a Programmable Stiffness-Sensitive Gripper for Object Handling

Authors: Mehdi Modabberifar, Sanaz Jabary, Mojtaba Ghodsi

Abstract:

Stiffness sensing is an important issue in medical diagnostics, robotic surgery, and the safe handling and grasping of objects in production lines. In surgery, it is necessary to detect and characterize lumps embedded in soft tissue and to remove and handle the detected lumps safely. In industry, grasping and handling an object without damaging it, in places inaccessible to a human operator, is also very important. In this paper, a method for object handling is presented. It is based on the use of an intelligent gripper that detects the stiffness of an object and then sets a programmable force for grasping the object in order to move it. The main components of the system include sensors (for measuring force and displacement), electrical components (electrical and electronic circuits, tactile data processing and the force control system), mechanical components (the gripper mechanism and its driving system) and the display unit. The system uses a rotary potentiometer for measuring gripper displacement. A microcontroller, using the feedback received from a load cell mounted on the finger of the gripper, calculates the stiffness and then commands the gripper motor to apply a certain force on the object. Results of experiments on samples with different stiffness show that the gripper works successfully. The gripper can be used in haptic interfaces or in robotic systems for object handling.
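
A minimal sketch of the stiffness-then-grip idea described above, with the hardware replaced by a simulated elastic object; the helper names (read_load_cell, read_potentiometer, set squeeze values) and all numbers are illustrative assumptions, not the paper's design:

```python
import random

# Simulated hardware: an elastic object with unknown stiffness (N/mm).
TRUE_STIFFNESS = 2.5

def read_potentiometer(step):
    """Gripper closing displacement in mm (simulated probing motion)."""
    return 0.2 * step

def read_load_cell(displacement_mm):
    """Reaction force in N measured by the load cell (simulated, with noise)."""
    return TRUE_STIFFNESS * displacement_mm + random.gauss(0.0, 0.02)

def estimate_stiffness(num_samples=10):
    """Estimate stiffness as the slope dF/dx from a small probing motion."""
    xs, fs = [], []
    for step in range(1, num_samples + 1):
        x = read_potentiometer(step)
        fs.append(read_load_cell(x))
        xs.append(x)
    # Least-squares slope through the origin: k = sum(F*x) / sum(x*x)
    return sum(f * x for f, x in zip(fs, xs)) / sum(x * x for x in xs)

def grip_force_for(stiffness, squeeze_mm=1.0, max_force=10.0):
    """Programmable grasp force: softer objects receive a gentler squeeze."""
    return min(stiffness * squeeze_mm, max_force)

if __name__ == "__main__":
    k = estimate_stiffness()
    print(f"estimated stiffness: {k:.2f} N/mm, grip force: {grip_force_for(k):.2f} N")
```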

Keywords: Gripper, haptic, stiffness, robotic.

PDF Downloads: 1153
668 A User Friendly Tool for Performance Evaluation of Different Reference Evapotranspiration Methods

Authors: Vijay Shankar

Abstract:

Evapotranspiration (ET) is a major component of the hydrologic cycle and its accurate estimation is essential for hydrological studies. In the past, various estimation methods have been developed for different climatological data, and the accuracy of these methods varies with climatic conditions. Reference crop evapotranspiration (ET0) is a key variable in procedures established for estimating the evapotranspiration rates of agricultural crops. Values of ET0 are used with crop coefficients for many aspects of irrigation and water resources planning and management, and numerous methods are used for estimating ET0. As per the internationally accepted procedures outlined in the United Nations Food and Agriculture Organization's Irrigation and Drainage Paper No. 56 (FAO-56), the Penman-Monteith equation is recommended for computing ET0 from ground-based climatological observations. In the present study, seven methods have been selected for performance evaluation. User-friendly software has been developed in the Visual Basic programming language, which makes it possible to create a graphical environment with little coding. For a given data availability, the developed software estimates reference evapotranspiration for any area and period for which data are available. The accuracy of the software has been checked against the examples given in FAO-56. The developed software is a user-friendly tool for estimating ET0 under different data availabilities and climatic conditions.
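
For reference, the FAO-56 Penman-Monteith formulation recommended above for computing ET0 from ground-based observations has the well-known daily form (quoted from the standard, not reproduced from this paper):

```latex
ET_0 \;=\; \frac{0.408\,\Delta\,(R_n - G) \;+\; \gamma\,\dfrac{900}{T + 273}\,u_2\,(e_s - e_a)}
                {\Delta \;+\; \gamma\,(1 + 0.34\,u_2)}
```

where ET0 is in mm day-1, Rn is the net radiation and G the soil heat flux (MJ m-2 day-1), T is the mean daily air temperature at 2 m (°C), u2 the wind speed at 2 m (m s-1), es and ea the saturation and actual vapour pressures (kPa), Δ the slope of the vapour pressure curve and γ the psychrometric constant (kPa °C-1).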

Keywords: Crop coefficient, Crop evapotranspiration, Field moisture, Irrigation Scheduling.

PDF Downloads: 1653
667 Inferring Hierarchical Pronunciation Rules from a Phonetic Dictionary

Authors: Erika Pigliapoco, Valerio Freschi, Alessandro Bogliolo

Abstract:

This work presents a new phonetic transcription system based on a tree of hierarchical pronunciation rules expressed as context-specific grapheme-phoneme correspondences. The tree is automatically inferred from a phonetic dictionary by incrementally analyzing deeper context levels, eventually representing a minimum set of exhaustive rules that pronounce without errors all the words in the training dictionary and that can be applied to out-of-vocabulary words. The proposed approach improves upon existing rule-tree-based techniques in that it makes use of graphemes, rather than letters, as elementary orthographic units. A new linear algorithm for the segmentation of a word into graphemes is introduced to enable out-of-vocabulary grapheme-based phonetic transcription. Exhaustive rule trees provide a canonical representation of the pronunciation rules of a language that can be used not only to pronounce out-of-vocabulary words, but also to analyze and compare the pronunciation rules inferred from different dictionaries. The proposed approach has been implemented in C and tested on Oxford British English and Basic English. Experimental results show that grapheme-based rule trees represent phonetically sound rules and provide better performance than letter-based rule trees.
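
A toy illustration of the core idea of widening the context of a grapheme until its phoneme becomes unambiguous, assuming the training dictionary has already been aligned into per-word lists of (grapheme, phoneme) pairs; the original system is implemented in C and is considerably more elaborate, so this is only a sketch of the inference principle:

```python
from collections import defaultdict

def infer_rules(aligned_words, max_context=3):
    """aligned_words: list of words, each a list of (grapheme, phoneme) pairs.
    Returns rules mapping (left_ctx, grapheme, right_ctx) -> phoneme; the
    context around each grapheme is widened only until the rule is unambiguous."""
    positions = [(wi, i) for wi, word in enumerate(aligned_words)
                 for i in range(len(word))]
    covered = set()
    rules = {}
    for width in range(max_context + 1):
        groups = defaultdict(list)            # context key -> occurrences
        for wi, i in positions:
            if (wi, i) in covered:
                continue
            graphemes = [g for g, _ in aligned_words[wi]]
            key = (tuple(graphemes[max(0, i - width):i]),
                   graphemes[i],
                   tuple(graphemes[i + 1:i + 1 + width]))
            groups[key].append((wi, i))
        for key, occs in groups.items():
            phones = {aligned_words[wi][i][1] for wi, i in occs}
            if len(phones) == 1:              # unambiguous at this context width
                rules[key] = phones.pop()
                covered.update(occs)
    return rules

# Tiny aligned example (hypothetical alignment, English-like):
words = [[("c", "k"), ("a", "a"), ("t", "t")],
         [("c", "s"), ("i", "i"), ("t", "t"), ("y", "i")]]
print(infer_rules(words))
```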

Keywords: Automatic phonetic transcription, pronunciation rules, hierarchical tree inference.

PDF Downloads: 1924
666 Recycling Construction Waste Materials to Reduce the Environmental Pollutants

Authors: Mehrdad Abkenari, Alireza Rezaei, Naghmeh Pournayeb

Abstract:

There have recently been many studies and investments in developed and developing countries regarding the possibility of recycling construction waste, and these efforts are still ongoing. Since the term 'construction waste' covers a vast spectrum of materials used in constructing buildings, roads, etc., many investigations are required to measure their technical performance in use as well as their time and place of use. Concrete is among the major and fundamental materials used in the current construction industry. Along with the rise of population in developing countries, it is necessary both to meet people's primary needs in the construction industry and to dispose of existing waste so as to reduce the amount of environmental pollutants. Restrictions on natural resources and environmental pollution are the most important problems encountered by civil engineers. Reusing construction waste is an important and economic approach that not only assists the preservation of the environment but also provides primary raw materials. In line with consistent municipal development in the disposal and reuse of construction waste, several approaches have been envisaged, including the management of construction waste and materials, materials recycling, and innovation and new inventions in materials. This article accordingly attempts to study the activities related to the recycling of construction waste and then states the economic, quantitative, qualitative and environmental results obtained.

Keywords: Civil engineering, environment, recycling, construction waste.

PDF Downloads: 2927
665 Semantic Modeling of Management Information: Enabling Automatic Reasoning on DMTF-CIM

Authors: Fernando Alonso, Rafael Fernandez, Sonia Frutos, Javier Soriano

Abstract:

CIM is the standard formalism for modeling management information developed by the Distributed Management Task Force (DMTF) in the context of its WBEM proposal, designed to provide a conceptual view of the managed environment. In this paper, we propose the inclusion of formal knowledge representation techniques, based on Description Logics (DLs) and the Web Ontology Language (OWL), in CIM-based conceptual modeling, and then we examine the benefits of such a decision. The proposal is specified as a CIM metamodel level mapping to a highly expressive subset of DLs capable of capturing all the semantics of the models. The paper shows how the proposed mapping can be used for automatic reasoning about the management information models, as a design aid, by means of new-generation CASE tools, thanks to the use of state-of-the-art automatic reasoning systems that support the proposed logic and use algorithms that are sound and complete with respect to the semantics. Such a CASE tool framework has been developed by the authors and its architecture is also introduced. The proposed formalization is not only useful at design time, but also at run time through the use of rational autonomous agents, in response to a need recently recognized by the DMTF.

Keywords: CIM, Knowledge-based Information Models, Ontology Languages, OWL, Description Logics, Integrated Network Management, Intelligent Agents, Automatic Reasoning Techniques.

PDF Downloads: 1731
664 Pectoral Muscles Suppression in Digital Mammograms Using Hybridization of Soft Computing Methods

Authors: I. Laurence Aroquiaraj, K. Thangavel

Abstract:

Breast region segmentation is an essential prerequisite in the computerized analysis of mammograms. It aims at separating the breast tissue from the background of the mammogram and includes two independent segmentations. The first separates the background region, which usually contains annotations, labels and frames, from the whole breast region, while the second removes the pectoral muscle portion (present in Medio-Lateral Oblique (MLO) views) from the rest of the breast tissue. In this paper we propose a hybridization of Connected Component Labeling (CCL), fuzzy, and straight-line methods. The proposed methods worked well for separating the pectoral region. After removal of the pectoral muscle from the mammogram, further processing is confined to the breast region alone. To demonstrate the validity of our segmentation algorithm, it is extensively tested using over 322 mammographic images from the Mammographic Image Analysis Society (MIAS) database. The segmentation results were evaluated using the Mean Absolute Error (MAE), Hausdorff Distance (HD), Probabilistic Rand Index (PRI), Local Consistency Error (LCE) and Tanimoto Coefficient (TC). The hybridization of the fuzzy and straight-line methods gave more than 96% of the curve segmentations as adequate or better. In addition, a comparison with similar approaches from the state of the art has been made, obtaining slightly improved results. Experimental results demonstrate the effectiveness of the proposed approach.

Keywords: X-ray Mammography, CCL, Fuzzy, Straight line.

PDF Downloads: 1754
663 FEM Simulation of HE Blast-Fragmentation Warhead and the Calculation of Lethal Range

Authors: G. Tanapornraweekit, W. Kulsirikasem

Abstract:

This paper presents the simulation of a fragmentation warhead using a hydrocode, Autodyn. The goal of this research is to determine the lethal range of such a warhead. This study investigates the lethal range of warheads with and without steel balls as preformed fragments. The results from the FE simulation, i.e. the initial velocities and ejected spray angles of the fragments, are further processed using an analytical approach so as to determine the fragment hit density and probability of kill of the modelled warhead. Simulating a large number of preformed fragments inside a warhead requires expensive computational resources. Therefore, this study models the problem with an alternative approach, by considering a mass of preformed fragments equivalent to the mass of the warhead casing. This approach yields approximately 7% and 20% differences in fragment velocities from the analytical results for one and two layers of preformed fragments, respectively. The lethal ranges of the simulated warheads are 42.6 m and 56.5 m for warheads with one and two layers of preformed fragments, respectively, compared to 13.85 m for a warhead without preformed fragments. These lethal ranges are based on the requirement of fragment hit density. The lethal ranges based on the probability of kill are 27.5 m, 61 m and 70 m for warheads with no preformed fragments, one layer and two layers of preformed fragments, respectively.

Keywords: Lethal Range, Natural Fragment, Preformed Fragment, Warhead.

PDF Downloads: 4308
662 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference

Authors: Hussein Alahmer, Amr Ahmed

Abstract:

Liver cancer is one of the common diseases that cause death, and early detection is important for diagnosis and for reducing the death rate. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: firstly, automatic liver segmentation and lesion detection; secondly, feature extraction; and finally, classification of liver lesions into benign and malignant using a novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions into benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
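
A minimal sketch of the contrasting feature-difference idea, assuming a grayscale slice scaled to [0, 1] and a binary lesion mask as NumPy arrays; the feature set here (mean, standard deviation and a coarse histogram) is a simplified stand-in for the intensity and texture features named above, and the ring width, kernel and classifier are illustrative choices:

```python
import numpy as np
from scipy import ndimage
from sklearn.svm import SVC

def region_features(image, mask):
    """Simple intensity statistics over the masked region (illustrative
    stand-in for the paper's intensity/texture features)."""
    vals = image[mask]
    hist, _ = np.histogram(vals, bins=16, range=(0.0, 1.0), density=True)
    return np.concatenate(([vals.mean(), vals.std()], hist))

def contrast_descriptor(image, lesion_mask, ring_width=5):
    """Feature difference between the lesion and a surrounding ring of
    normal-looking liver tissue."""
    dilated = ndimage.binary_dilation(lesion_mask, iterations=ring_width)
    ring = dilated & ~lesion_mask
    return region_features(image, lesion_mask) - region_features(image, ring)

def train(samples):
    """samples: iterable of (image, lesion_mask, label) with 0 = benign, 1 = malignant."""
    X = np.stack([contrast_descriptor(img, m) for img, m, _ in samples])
    y = np.array([label for _, _, label in samples])
    return SVC(kernel="rbf").fit(X, y)
```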

Keywords: CAD system, difference of feature, Fuzzy c means, Liver segmentation.

PDF Downloads: 1420
661 Preliminary Study of the Phonological Development in Three- and Four-Year-Old Bulgarian Children

Authors: Tsvetomira Braynova, Miglena Simonska

Abstract:

The article presents the results of research on phonological processes in three- and four-year-old children. A test created for the purpose of the study was developed and administered to 120 children. The study included three areas of research: at the level of words (96 words), at the level of sentence repetition (10 sentences), and at the level of generating the child's own speech from a picture (15 pictures). The test also gives additional information about the articulation errors of the assessed children. The main purpose of the research is to analyze all phonological processes that occur at this age in Bulgarian children and to identify which are typical and atypical for this age. The results show that the most common phonological errors that children make are: sound substitution, elision of a sound, metathesis of a sound, elision of a syllable, and elision of consonants clustered in a syllable. Measuring the correlation between the average length of repeated speech and the average length of generated speech, the analysis does not prove that the more words a child can repeat in the “repeated speech” part, the more words they can be expected to generate in the “sentence generation” part. The results of this study show that the word-naming task provides sufficient and representative information to assess a child's phonology.

Keywords: Articulation, phonology, speech, language development.

PDF Downloads: 383
660 A Trainable Neural Network Ensemble for ECG Beat Classification

Authors: Atena Sajedin, Shokoufeh Zakernejad, Soheil Faridi, Mehrdad Javadi, Reza Ebrahimpour

Abstract:

This paper illustrates the use of a combined neural network model for the classification of electrocardiogram (ECG) beats. We present a trainable neural network ensemble approach for developing a customized electrocardiogram beat classifier, in an effort to further improve the performance of ECG processing and to offer individualized health care. We apply a three-stage technique for the detection of premature ventricular contraction (PVC) beats among normal beats and other heart diseases, consisting of denoising, feature extraction and classification. First, we investigate the application of the stationary wavelet transform (SWT) for noise reduction of the ECG signals. The feature extraction module then extracts 10 ECG morphological features and one timing interval feature. A number of multilayer perceptron (MLP) neural networks with different topologies are then designed. The performance of the different combination methods as well as the efficiency of the whole system is presented. Among them, stacked generalization, as the proposed trainable combined neural network model, possesses the highest recognition rate of around 95%. Therefore, this network proves to be a suitable candidate for ECG signal diagnosis systems. ECG samples attributed to the different ECG beat types were extracted from the MIT-BIH arrhythmia database for the study.
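
A minimal sketch of the stacked-generalization step using scikit-learn, with base MLPs of different topologies and a logistic-regression meta-learner; the feature extraction (SWT denoising, morphological and timing features) is assumed to have already produced the matrix X and labels y, and none of the topology or parameter values below are taken from the paper:

```python
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def build_stacked_ensemble():
    # Base classifiers: MLPs with different hidden-layer topologies.
    base = [
        (f"mlp_{i}", make_pipeline(StandardScaler(),
                                   MLPClassifier(hidden_layer_sizes=h, max_iter=500)))
        for i, h in enumerate([(10,), (20,), (10, 10)])
    ]
    # Stacked generalization: a meta-learner combines the base predictions.
    return StackingClassifier(estimators=base,
                              final_estimator=LogisticRegression(),
                              cv=5)

# Usage (X: beats x features, y: beat class, e.g. 0 = normal, 1 = PVC):
# model = build_stacked_ensemble().fit(X_train, y_train)
# print(model.score(X_test, y_test))
```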

Keywords: ECG beat classification, combining classifiers, premature ventricular contraction (PVC), multilayer perceptrons, wavelet transform.

PDF Downloads: 2214
659 Laminar Free Convection of Nanofluid Flow in Horizontal Porous Annulus

Authors: Manal H. Saleh

Abstract:

A numerical study has been carried out to investigate heat transfer by natural convection of a nanofluid, taking Cu as the nanoparticles and water as the base fluid, in a three-dimensional annular enclosure filled with a porous medium (silica sand) between two horizontal concentric cylinders, with 12 annular fins of 2.4 mm thickness attached to the inner cylinder, under steady state conditions. The governing equations used are the continuity, momentum and energy equations under the assumptions of Darcy's law and the Boussinesq approximation, which are transformed into dimensionless form. The finite difference approach is used to obtain all the computational results using MATLAB 7. The parameters affecting the system are the modified Rayleigh number (10 ≤ Ra* ≤ 1000), the fin length Hf (3, 7 and 11 mm), the radius ratio Rr (0.293, 0.365 and 0.435) and the volume fraction (0 ≤ φ ≤ 0.35). It was found that the average Nusselt number depends on Ra*, Hf, Rr and φ. The results show that increasing the fin length decreases the heat transfer rate; for low values of Ra*, decreasing Rr causes Nu to decrease, while for Ra* greater than 100, decreasing Rr causes Nu to increase; and adding Cu nanoparticles with a volume fraction of 0.35 causes a 27.9% enhancement in heat transfer. A correlation for Nu in terms of Ra*, Hf and φ has been developed for the inner hot cylinder.

Keywords: Annular fins, laminar free convection, nanofluid, porous media, three dimensions horizontal annulus.

PDF Downloads: 2488
658 Mathematical Model of Depletion of Forestry Resource: Effect of Synthetic Based Industries

Authors: Manisha Chaudhary, Joydip Dhar, Govind Prasad Sahu

Abstract:

A mathematical model is proposed considering the forest biomass density B(t), the density of wood-based industries W(t) and the density of synthetic industries S(t). It is assumed that the forest biomass grows logistically in the absence of wood-based industries, while the depletion of forest biomass is due to the presence of wood-based industries. The growth of wood-based industries depends on B(t), whereas S(t) grows at a constant rate, independent of B(t). Further, there is competition between W(t) and S(t) according to market demand. One illustrative way of writing such a system down is given after this abstract. The proposed model has four ecologically feasible steady states, namely E1: the forest-biomass-free and wood-industries-free equilibrium; E2: the wood-industries-free equilibrium; and two coexisting equilibria E∗1, E∗2. The behavior of the system near all feasible equilibria is analyzed using the stability theory of differential equations. In the proposed model, the natural depletion rate h1 is a crucial parameter, and the system exhibits a Hopf bifurcation about the non-trivial equilibrium with respect to h1. The analytical results are verified using numerical simulation.
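
One plausible form of such a system (a hedged reconstruction from the verbal description above; the exact functional forms and parameters of the paper may differ) is:

```latex
\frac{dB}{dt} = r B\left(1-\frac{B}{K}\right) - \alpha B W, \qquad
\frac{dW}{dt} = \beta B W - h_1 W - \gamma W S, \qquad
\frac{dS}{dt} = s_0 S - \delta W S,
```

where r and K are the intrinsic growth rate and carrying capacity of the forest biomass, α and β describe the depletion of biomass by, and the growth of, the wood-based industries, h1 is the natural depletion rate, s0 is the constant growth rate of the synthetic industries, and γ, δ are competition coefficients between the two kinds of industry.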

Keywords: A mathematical model, Competition between wood based and synthetic industries, Hopf-bifurcation, Stability analysis.

PDF Downloads: 3496
657 Optimization of Strategies and Models Review for Optimal Technologies - Based On Fuzzy Schemes for Green Architecture

Authors: Ghada Elshafei, Abdelazim Negm

Abstract:

Recently, green architecture has become a significant route to a sustainable future. Green building design involves finding the balance between comfortable homebuilding and a sustainable environment. Moreover, new technologies such as artificial intelligence techniques are used to complement current practices in creating greener structures that keep the built environment more sustainable. Green buildings should be designed to minimize the overall impact of the built environment on ecosystems in general, and on human health and the natural environment in particular. This leads to protecting occupant health, improving employee productivity, reducing pollution and sustaining the environment. In green building design, multiple parameters, which may be interrelated, contradictory, vague and of a qualitative/quantitative nature, are broadly in use. This paper presents a comprehensive, critical state-of-the-art review of current practices based on fuzzy techniques and their combinations. It also presents how green architecture/buildings can be improved using the technologies that have been used for analysis, in order to seek optimal green solution strategies and models that assist in making the best possible decision out of different alternatives.

Keywords: Green architecture/building, technologies, optimization, strategies, fuzzy techniques and models.

PDF Downloads: 2521
656 A Study on Fantasy Images Represented on the Films: Focused on Mise-en-Scène Element

Authors: Somi Nah

Abstract:

The fantasy genre depicts an imaginary world that attracts popular interest through a created world view; a fantasy is defined as a story that illustrates an imaginary world in which scientific or horror elements stand at its center. This study is not focused on the narrative of fantasy, i.e. not on the adventurous story, but concentrates on fantasy imagery, examining its relationship with the intended themes and the differences among cultures due to the meanings of the materials used. We have selected films from the 2000s that are internationally recognized as expressing unique images of fantasy containing the theme of love. The five selected films are two European films, Amelie from Montmartre (2001) and The Science of Sleep (2005), and three Asian films, Citizen Dog from Thailand (2004), Memories of Matsuko from Japan (2006), and I'm a Cyborg, but That's OK from Korea (2006). These films share some common characteristics: they offer small lessons and feelings about life through fantasy imagery, as if they were fairy tales for adults, and they lead the audience to reflect on their days and revive forgotten dreams of childhood. We analyze the fantasy images in each of the films on the basis of the elements of Mise-en-Scène (setting and props; costume, hair and make-up; facial expressions and body language; lighting and color; positioning of characters; and objects within a frame).

Keywords: Mise-en-scène, fantasy images, films, visualization.

PDF Downloads: 4951
655 Double Reduction of Ada-ECATNet Representation using Rewriting Logic

Authors: Noura Boudiaf, Allaoua Chaoui

Abstract:

One major difficulty that faces developers of concurrent and distributed software is the analysis of concurrency-related faults such as deadlocks. Petri nets are used extensively in the verification of the correctness of concurrent programs. ECATNets [2] are a category of algebraic Petri nets based on a sound combination of algebraic abstract data types and high-level Petri nets. ECATNets have 'sound' and 'complete' semantics because of their integration in rewriting logic [12] and its programming language Maude [13]. Rewriting logic is considered one of the most powerful logics in terms of the description, verification and programming of concurrent systems. We proposed in [4] a method for translating Ada-95 tasking programs into the ECATNets formalism (Ada-ECATNet). In this paper, we show that the ECATNets formalism provides a more compact translation of Ada programs compared to other approaches based on simple Petri nets or Colored Petri nets (CPNs). Such a translation reduces not only the size of the program, but also the number of program states. We also show how this compact Ada-ECATNet may be reduced further by applying reduction rules to it. This double reduction of the Ada-ECATNet permits a considerable reduction of the memory space and run time of the corresponding Maude program.

Keywords: Ada tasking, ECATNets, Algebraic Petri Nets, Compact Representation, Analysis, Rewriting Logic, Maude.

PDF Downloads: 1406
654 Production of Spherical Cementite within Bainitic Matrix Microstructures in High Carbon Powder Metallurgy Steels

Authors: O. Altuntaş, A. Güral

Abstract:

The hardness-microstructure relationships of spherical cementite in a bainitic matrix, obtained by different heat treatment cycles applied to a high carbon powder metallurgy (P/M) steel, were investigated. For this purpose, 1.5 wt.% natural graphite powder was admixed into atomized iron powders, and the mixed powders were compacted under 700 MPa at room temperature and then sintered at 1150 °C under a protective argon gas atmosphere. The densities of the green and sintered samples were measured via the Archimedes method; a density of 7.4 g/cm3, corresponding to a relative density of 94%, was achieved after sintering. The sintered specimens, having primary cementite plus lamellar pearlitic structures, were fully quenched from 950 °C and then over-tempered at 705 °C for 60 minutes to produce fine spherical cementite particles in the ferritic matrix. After this treatment, the samples were annealed at 735 °C for 3 minutes and austempered in a 300 °C salt bath for periods of 1 to 5 hours. As a result of this process, spherical cementite particles could be produced in the bainitic matrix. This microstructure was designed to improve the wear resistance and toughness of P/M steels. The microstructures were characterized and analyzed by SEM and by micro- and macro-hardness measurements.

Keywords: Powder metallurgy steel, heat treatment, bainite, spherical cementite.

PDF Downloads: 994
653 Influence of Pile Radius on Inertial Response of Pile Group in Fundamental Frequency of Homogeneous Soil Medium

Authors: Faghihnia Torshizi Mostafa, Saitoh Masato

Abstract:

An efficient method is developed for the response of a group of vertical, cylindrical, fixed-head, finite-length piles embedded in a homogeneous elastic stratum, subjected to a harmonic force atop the pile group cap. Pile-to-pile interaction is represented through a simplified beam-on-dynamic-Winkler-foundation (BDWF) model with realistic frequency-dependent springs and dashpots. The pile group effect is considered through interaction factors. New closed-form expressions for interaction factors and curvature ratios atop the pile are extended by considering different boundary conditions at the tips of the piles (fixed, hinged). In order to investigate the fundamental characteristics of inertial bending strains in pile groups, the inertial bending strains at the head of each pile are expressed in terms of the slenderness ratio. The results of the parametric study give valuable insight into the behavior of fixed-head pile groups at the fundamental natural frequency of the soil stratum.

Keywords: Winkler-foundation, fundamental frequency of soil stratum, normalized inertial bending strain, harmonic excitation.

PDF Downloads: 1077
652 Health Risk Assessment for Sewer Workers using Bayesian Belief Networks

Authors: Kevin Fong-Rey Liu, Ken Yeh, Cheng-Wu Chen, Han-Hsi Liang

Abstract:

The sanitary sewerage connection rate has become an important indicator of advanced cities. Following the construction of sanitary sewerage systems, maintenance and management systems are required to keep pipelines and facilities functioning well. These maintenance tasks often require sewer workers to enter manholes and pipelines, which are confined spaces short of natural ventilation and full of hazardous substances. Working in sewers therefore carries a risk of adverse health effects. This paper proposes the use of Bayesian belief networks (BBNs) as a higher level of non-carcinogenic health risk assessment for sewer workers. On the basis of epidemiological studies, actual hospital attendance records and expert experience, the BBN is capable of capturing the probabilistic relationships between the hazardous substances in sewers and their adverse health effects, and accordingly inferring the morbidity and mortality of those adverse health effects. The provision of morbidity and mortality rates for the related diseases is more informative and can alleviate the drawbacks of conventional methods.
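
A toy numerical illustration of the kind of inference a BBN performs, with one exposure node and one health-effect node marginalized by hand; all probabilities are invented for illustration and are not taken from the study:

```python
# Toy Bayesian-network inference with two nodes: Exposure -> Disease.
# All probabilities below are hypothetical, for illustration only.
p_exposure = {"low": 0.7, "high": 0.3}          # P(Exposure)
p_disease_given = {"low": 0.02, "high": 0.15}   # P(Disease | Exposure)

# Marginal morbidity: P(Disease) = sum over e of P(Disease | e) * P(e)
p_disease = sum(p_disease_given[e] * p_exposure[e] for e in p_exposure)

# Diagnostic reasoning via Bayes' rule: P(Exposure = high | Disease)
p_high_given_disease = p_disease_given["high"] * p_exposure["high"] / p_disease

print(f"P(disease) = {p_disease:.3f}")
print(f"P(high exposure | disease) = {p_high_given_disease:.3f}")
```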

Keywords: Bayesian belief networks, sanitary sewerage, health risk assessment, hazard quotient, target organ-specific hazard index.

PDF Downloads: 1705
651 The Study of the Intelligent Fuzzy Weighted Input Estimation Method Combined with the Experiment Verification for the Multilayer Materials

Authors: Ming-Hui Lee, Tsung-Chien Chen, Tsu-Ping Yu, Horng-Yuan Jang

Abstract:

The innovative intelligent fuzzy weighted input estimation method (FWIEM) can be applied to the inverse heat conduction problem (IHCP) to estimate the unknown time-varying heat flux of multilayer materials, as presented in this paper. The feasibility of the method is verified by a temperature measurement experiment. The experimental module is constructed by stacking a copper sample with four aluminum samples of different thicknesses. The bottoms of the copper samples are heated by a standard heat source, and the temperatures on the tops of the aluminum samples are measured using thermocouples. The temperature measurements are then taken as the inputs to the presented method to estimate the heat flux at the bottoms of the copper samples. The influence on the estimation of the temperature measurements of samples with different thicknesses, the process noise covariance Q, the weighting factor γ, the sampling time interval Δt, and the spatial discretization interval Δx is investigated through the experimental verification. The results show that this method is efficient and robust in estimating the unknown time-varying heat input of multilayer materials.

Keywords: Multilayer Materials, Input Estimation Method, IHCP, Heat Flux.

PDF Downloads: 1234
650 Comparison of Compression Ability Using DCT and Fractal Technique on Different Imaging Modalities

Authors: Sumathi Poobal, G. Ravindran

Abstract:

Image compression is one of the most important applications of digital image processing. Advanced medical imaging requires the storage of large quantities of digitized clinical data. Due to constrained bandwidth and storage capacity, however, a medical image must be compressed before transmission and storage. There are two types of compression methods, lossless and lossy. In lossless compression, the original image is retrieved without any distortion; in lossy compression, the reconstructed images contain some distortion. The Discrete Cosine Transform (DCT) and Fractal Image Compression (FIC) are lossy compression methods. This work shows that lossy compression methods can be chosen for medical image compression without significant degradation of image quality. In this work, DCT and fractal compression using Partitioned Iterated Function Systems (PIFS) are applied to images from different modalities, such as CT scan, ultrasound, angiogram, X-ray and mammogram. Approximately 20 images are considered in each modality, and the average values of compression ratio and Peak Signal to Noise Ratio (PSNR) are computed and studied. The quality of the reconstructed image is assessed by the PSNR values. Based on the results, it can be concluded that DCT has higher PSNR values and FIC has a higher compression ratio. Hence, in medical image compression, DCT can be used wherever picture quality is preferred, and FIC wherever compression of images for storage and transmission is the priority, without losing diagnostic picture quality.
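
For reference, PSNR and compression ratio can be computed as in the following sketch (a generic illustration with NumPy/SciPy, not the authors' code); the block-DCT step simply discards small coefficients to mimic lossy compression, and the keep ratio and block size are arbitrary:

```python
import numpy as np
from scipy.fft import dctn, idctn

def psnr(original, reconstructed, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def dct_compress(image, keep_ratio=0.1, block=8):
    """Crude lossy DCT compression: per 8x8 block, keep only the largest
    coefficients and zero the rest. Returns the reconstruction and a
    simple coefficient-count compression ratio."""
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    kept, total = 0, 0
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            coeffs = dctn(image[i:i+block, j:j+block].astype(float), norm="ortho")
            k = max(1, int(keep_ratio * coeffs.size))
            thresh = np.sort(np.abs(coeffs), axis=None)[-k]
            coeffs[np.abs(coeffs) < thresh] = 0.0
            kept += np.count_nonzero(coeffs)
            total += coeffs.size
            out[i:i+block, j:j+block] = idctn(coeffs, norm="ortho")
    return out, total / kept

# Usage: rec, cr = dct_compress(img); print(cr, psnr(img, rec))
```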

Keywords: DCT, FIC, PIFS, PSNR.

PDF Downloads: 1823
649 Fuzzy Wavelet Packet based Feature Extraction Method for Multifunction Myoelectric Control

Authors: Rami N. Khushaba, Adel Al-Jumaily

Abstract:

The myoelectric signal (MES) is one of the biosignals utilized to help humans control equipment. Recent approaches to MES classification for the control of prosthetic devices employing pattern recognition techniques have revealed two problems: first, the classification performance of the system starts degrading when the number of motion classes to be classified increases; second, in order to solve the first problem, additional complicated methods have been utilized, which increase the computational cost of a multifunction myoelectric control system. In an effort to solve these problems and to achieve a feasible design for real-time implementation with high overall accuracy, this paper presents a new method for feature extraction in MES recognition systems. The method works by extracting features using the Wavelet Packet Transform (WPT) applied to the MES from multiple channels, and then employs the Fuzzy c-means (FCM) algorithm to generate a measure that judges the suitability of features for classification. Finally, Principal Component Analysis (PCA) is utilized to reduce the size of the data before computing the classification accuracy with a multilayer perceptron neural network. The proposed system produces powerful classification results (99% accuracy) using only a small portion of the original feature set.
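
A minimal sketch of such a feature-extraction pipeline using pywt and scikit-learn, with the FCM-based feature-scoring step replaced by a simple placeholder (an ANOVA F-score ranking), since the fuzzy c-means measure is specific to the paper; the window layout, wavelet, levels and all other values are illustrative assumptions:

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def wpt_features(window, wavelet="db4", level=3):
    """Energy of each terminal node of the wavelet packet tree, per channel.
    window: array of shape (n_channels, n_samples)."""
    feats = []
    for channel in window:
        wp = pywt.WaveletPacket(data=channel, wavelet=wavelet, maxlevel=level)
        for node in wp.get_level(level, order="natural"):
            feats.append(np.sum(node.data ** 2))       # sub-band energy
    return np.array(feats)

def build_classifier(n_selected=32, n_components=10):
    # Placeholder feature scoring (stand-in for the FCM-based measure),
    # followed by PCA and an MLP classifier.
    return make_pipeline(StandardScaler(),
                         SelectKBest(f_classif, k=n_selected),
                         PCA(n_components=n_components),
                         MLPClassifier(hidden_layer_sizes=(20,), max_iter=500))

# Usage: X = np.stack([wpt_features(w) for w in windows]); build_classifier().fit(X, y)
```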

Keywords: Biomedical signal processing, data mining and information extraction, machine learning, rehabilitation.

PDF Downloads: 1736
648 Influence of Non-Structural Elements on Dynamic Response of Multi-Storey RC Building to Mining Shock

Authors: Joanna M. Dulińska, Maria Fabijańska

Abstract:

In this paper, the results of calculations of the dynamic response of a multi-storey reinforced concrete building to a strong mining shock originating from the main region of mining activity in Poland (i.e. the Legnica-Glogow Copper District) are presented. Representative time histories of accelerations registered in three directions were used as ground motion data in the calculations of the dynamic response of the structure. Two variants of a numerical model were applied: a model including only the structural elements of the building, and a model including both structural and non-structural elements (i.e. partition walls and ventilation ducts made of brick). It turned out that the non-structural elements of multi-storey RC buildings have a small impact, of about 10%, on the natural frequencies of these structures. It was also shown that the dynamic response of the building to the mining shock obtained when all non-structural elements are included in the numerical model is about 20% smaller than when only structural elements are considered. The principal stresses obtained in the calculations of the dynamic response of the multi-storey building to the strong mining shock are at a level of about 30% of the values obtained from static analysis (dead load).

Keywords: Dynamic characteristics of buildings, mining shocks, dynamic response of buildings, non-structural elements.

PDF Downloads: 1885
647 Restrictedly-Regular Map Representation of n-Dimensional Abstract Polytopes

Authors: Antonio Breda d’Azevedo

Abstract:

Regularity has often been present in the form of regular polyhedra or tessellations; classical examples are the nine regular polyhedra, consisting of the five Platonic solids (regular convex polyhedra) and the four Kepler-Poinsot polyhedra. These polytopes can be seen as regular maps. Maps are cellular embeddings of graphs (with possibly multiple edges, loops or dangling edges) on compact connected (closed) surfaces with or without boundary. The n-dimensional abstract polytopes, particularly the regular ones, have gained popularity over recent years, and the main focus of research has been their symmetries and regularity. Planification of a polyhedron helps its spatial construction, yet it destroys its symmetries. To our knowledge there is no “planification” for n-dimensional polytopes. However, we show that it is possible to make a “surfacification” of an n-dimensional polytope, that is, to construct a restrictedly-marked map representation of the abstract polytope on some surface that describes its combinatorial structure as well as all of its symmetries. We also show that there are infinitely many ways to do this; yet there is one that is more natural, which describes reflections on the sides ((n−1)-faces) of n-simplices with reflections on the sides of n-polygons. We illustrate this construction with the 4-tetrahedron (a regular 4-polytope with automorphism group of size 120) and the 4-cube (a regular 4-polytope with automorphism group of size 384).

Keywords: Maps, representation, polytopes.

PDF Downloads: 665
646 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm

Authors: Xiang Jianhong, Wang Cong, Wang Linyu

Abstract:

With the rise of medical IoT technologies, wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. A compressed sensing (CS)-based WBAN system can avoid sampling a large amount of redundant information and reduce the complexity and computing time of data processing, but existing algorithms have poor signal compression and reconstruction performance. In this paper, a Joint Block Multi-Orthogonal Least Squares (JBMOLS) algorithm is proposed. We apply the FECG signal to the joint block sparse model (JBSM), and a comparative study of sparse transformations and measurement matrices is carried out. An FECG signal compression and transmission mode based on the Rbio5.5 wavelet, a Bernoulli measurement matrix, and the JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signals in CS-based WBANs. Experimental results show that the compression ratio (CR) required for accurate reconstruction of this transmission mode is increased by nearly 10%, and the runtime is reduced by about 30%.
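
A minimal illustration of the compressed-sensing measurement and greedy reconstruction loop, using a Bernoulli measurement matrix and scikit-learn's standard Orthogonal Matching Pursuit as a stand-in for the block-sparse JBMOLS solver described above; the signal length, sparsity and all other values are illustrative:

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)

# Synthetic sparse signal x of length n with k nonzeros (stand-in for the
# wavelet-domain representation of an FECG segment).
n, m, k = 256, 96, 8
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

# Bernoulli (+1/-1) measurement matrix and compressed measurements y = Phi x.
phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
y = phi @ x

# Greedy sparse reconstruction (standard OMP; JBMOLS additionally exploits the
# joint block-sparse structure of multi-channel FECG data).
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False).fit(phi, y)
x_hat = omp.coef_

cr = n / m                                   # compression ratio
err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(f"CR = {cr:.2f}, relative reconstruction error = {err:.3e}")
```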

Keywords: Telemedicine, fetal electrocardiogram, compressed sensing, joint sparse reconstruction, block sparse signal.

PDF Downloads: 509
645 Thermodynamic Analysis of Ventilated Façades under Operating Conditions in Southern Spain

Authors: Carlos A. D. Torres, Antonio D. Delgado

Abstract:

In this work we study the thermodynamic behavior of ventilated façades under summer operating conditions in Southern Spain. Under these climatic conditions, indoor comfort implies a high energy demand due to the high temperatures usually reached in this season in the considered geographical area.

The aim of this work is to determine whether, under summer operating conditions in Southern Spain, ventilated façades provide energy savings compared to non-ventilated façades, and to deduce their behavior patterns in terms of energy efficiency.

The modelling of the air flow in the channel has been performed using the Navier-Stokes equations for thermodynamic flows. Numerical simulations have been carried out with a 2D finite element approach.

In this way, we analyze the behavior of ventilated façades under different weather conditions, such as variable wind, variable temperature and different levels of solar irradiation.

The CFD computations show that the combined effect of the shading of the external wall and the ventilation by natural convection in the air gap achieves a reduction of the heat load during the summer period. This reduction has been evaluated by comparing the thermodynamic performances of two ventilated and two unventilated façades with the same geometry and thermophysical characteristics.

Keywords: Passive cooling, ventilated façades, energy-efficient building, CFD, FEM.

PDF Downloads: 4948
644 Low Resolution Single Neural Network Based Face Recognition

Authors: Jahan Zeb, Muhammad Younus Javed, Usman Qayyum

Abstract:

This research paper deals with the implementation of face recognition using a neural network (recognition classifier) on low-resolution images. The proposed system contains two parts, preprocessing and face classification. The preprocessing part converts the original images into blurred images using an average filter and equalizes their histograms (lighting normalization). A bicubic interpolation function is applied to the equalized images to obtain resized images. The resized image is actually a low-resolution image, providing faster processing for training and testing. The preprocessed image becomes the input to the neural network classifier, which uses the back-propagation algorithm to recognize the familiar faces. The crux of the proposed algorithm is its use of a single neural network as the classifier, which yields a straightforward approach to face recognition. The single neural network consists of three layers with log-sigmoid, hyperbolic tangent sigmoid and linear transfer functions, respectively. The training function incorporated in our work is gradient descent with momentum (adaptive learning rate) back-propagation. The proposed algorithm was trained on the ORL (Olivetti Research Laboratory) database with 5 training images. The empirical results provide accuracies of 94.50%, 93.00% and 90.25% for 20, 30 and 40 subjects, respectively, with a time delay of 0.0934 sec per image.
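
A minimal sketch of the described preprocessing chain (average filter, histogram equalization, bicubic downscaling) using OpenCV; the kernel size and target resolution are illustrative assumptions, not values from the paper:

```python
import cv2
import numpy as np

def preprocess(gray_image, target_size=(32, 32), blur_kernel=(3, 3)):
    """Blur -> histogram equalization -> bicubic resize, as described above.
    gray_image: 2D uint8 array (a grayscale face image)."""
    blurred = cv2.blur(gray_image, blur_kernel)              # average filter
    equalized = cv2.equalizeHist(blurred)                     # lighting normalization
    resized = cv2.resize(equalized, target_size,
                         interpolation=cv2.INTER_CUBIC)       # bicubic interpolation
    return resized.astype(np.float32).ravel() / 255.0         # flatten for the MLP

# Usage: x = preprocess(cv2.imread("face.pgm", cv2.IMREAD_GRAYSCALE))
```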

Keywords: Average filtering, Bicubic Interpolation, Neurons, vectorization.

PDF Downloads: 1749
643 Family-size Biogas Plant Using Manure and Urine Mixture at Ambient Temperature in Semi-arid Regions of Northwestern China

Authors: Wenguang Ding, Yang Wu, Xia Wang, Yayu Gao

Abstract:

Biogas, a clean renewable energy source, is attracting growing attention from researchers and professionals in many fields. Based on the natural and climatic conditions in the semi-arid regions of northwestern China, the present study introduces a specifically designed family-size biogas plant (with a digester of 10 m3) that uses the manure and urine of animals and humans as raw materials. The biogas plant is applicable to areas of northwestern China with altitudes of more than 2000 meters. In addition to the installation cost, the small operational expenditure, and the structure, characteristics and benefits of this small-scale biogas plant, this article introduces a wide range of specific popularization methods, such as training, financial support, guided tours of the biogas plant, community-based group study and the delivery of operational manuals. The feasibility of the biogas plant is explored on the basis of the availability of the raw materials. The simple operations involved in the current work increase the possibility of wide use of this small-scale biogas plant in similar regions of the world.

Keywords: Biogas, family-size biogas plant, northwestern China, popularization.

PDF Downloads: 2772
642 Sleep Scheduling Schemes Based on Location of Mobile User in Sensor-Cloud

Authors: N. Mahendran, R. Priya

Abstract:

Mobile cloud computing (MCC) combined with wireless sensor network (WSN) technology is attracting increasing attention from research scholars because it combines the data-gathering ability of sensors with the data-processing capacity of the cloud. This approach overcomes the limitations of the data storage capacity and computational ability of sensor nodes. Finally, the stored data are sent to mobile users when a user sends a request. Most integrated sensor-cloud schemes fail to observe the following criteria: 1) mobile users request specific data from the cloud based on their present location; 2) power consumption matters, since most nodes are equipped with non-rechargeable batteries and the sensors are mostly deployed in hazardous and remote areas. This paper focuses on the above observations and introduces an approach known as the collaborative location-based sleep scheduling (CLSS) scheme. The awake or asleep status of each sensor node is dynamically decided by schedulers, and the scheduling is done purely on the basis of the mobile users' current location; in this manner, a large amount of energy consumption is avoided in the WSN. CLSS comprises two different schemes: CLSS1 provides lower energy consumption, while CLSS2 provides scalability and robustness for the integrated WSN.
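
A toy sketch of the location-based scheduling idea: nodes within a radius of any active user's reported position stay awake and the rest sleep; the radius, coordinates and data structures are illustrative assumptions, not the CLSS1/CLSS2 algorithms themselves:

```python
import math
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    x: float
    y: float
    awake: bool = False

def schedule(nodes, user_positions, wake_radius=30.0):
    """Wake only the nodes within wake_radius of some mobile user's location;
    put every other node to sleep to save energy."""
    for node in nodes:
        node.awake = any(math.hypot(node.x - ux, node.y - uy) <= wake_radius
                         for ux, uy in user_positions)
    return nodes

# Toy usage: 4 nodes, one user standing near the first two.
nodes = [Node(0, 10, 10), Node(1, 25, 15), Node(2, 80, 80), Node(3, 120, 40)]
for n in schedule(nodes, user_positions=[(20.0, 12.0)]):
    print(n.node_id, "awake" if n.awake else "asleep")
```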

Keywords: Sleep scheduling, mobile cloud computing, wireless sensor network, integration, location, network lifetime.

PDF Downloads: 975
641 Enhancing Cache Performance Based on Improved Average Access Time

Authors: Jasim. A. Ghaeb

Abstract:

A high performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and its speed, the cache has become a common feature of high performance computers, and enhancing cache performance has proved to be essential in speeding up cache-based computers. Most enhancement approaches can be classified as either software based or hardware controlled. The performance of the cache is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance based on enhancing the cache hit ratio. The optimum cache performance is obtained by focusing on a cache hardware modification that quickly rejects the non-matching line tags in the hit-or-miss comparison stage, so that a low hit time for the wanted line in the cache is achieved. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their least significant bit (LSB). This division is exploited by the EOT technique to reject mismatched line tags in a very short time compared to the time spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. The high performance of the EOT technique compared to the familiar mapping technique FAM is shown in the simulation results.
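
A toy software model of the even/odd tag split described above: tags are stored in two banks keyed by their least significant bit, so a lookup only compares against tags of matching parity. This is an illustrative Python simulation of the idea, not the hardware design itself, and the replacement policy and sizes are arbitrary:

```python
class EOTCache:
    """Toy fully-associative cache that splits stored tags into 'even' and
    'odd' banks by the LSB of the tag, so each lookup compares against only
    the tags of matching parity."""

    def __init__(self, lines_per_bank=4):
        self.banks = {0: [], 1: []}          # LSB -> list of stored tags
        self.lines_per_bank = lines_per_bank

    def lookup(self, tag):
        bank = self.banks[tag & 1]           # quick parity-based rejection
        return tag in bank                   # compare only within this bank

    def insert(self, tag):
        bank = self.banks[tag & 1]
        if tag in bank:
            return
        if len(bank) >= self.lines_per_bank:
            bank.pop(0)                      # simple FIFO replacement
        bank.append(tag)

# Toy usage: only tags with the same parity as 0b1011 are ever compared to it.
cache = EOTCache()
for t in [0b1010, 0b1011, 0b0110, 0b0111]:
    cache.insert(t)
print(cache.lookup(0b1011), cache.lookup(0b0001))
```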

Keywords: Caches, Cache performance, Hit time, Cache hit ratio, Cache mapping, Cache memory.

PDF Downloads: 1677