Search results for: fault detection and classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5671

2011 Comparison of Rumen Microbial Analysis Pipelines Based on 16S rRNA Gene Sequencing

Authors: Xiaoxing Ye

Abstract:

To investigate complex rumen microbial communities, 16S ribosomal RNA (rRNA) sequencing is widely used. Here, we evaluated the impact of bioinformatics pipelines on the observed OTUs and the taxonomic classification of 750 cattle rumen microbial samples by comparing three commonly used pipelines (LotuS, UPARSE, and QIIME) with Usearch. In the LotuS-based analyses, 189 archaeal and 3894 bacterial OTUs were observed. The number of OTUs observed in the Usearch analysis was significantly larger than in the LotuS results: we discovered 1495 OTUs for archaea and 92665 OTUs for bacteria using Usearch. In addition, taxonomic assignments were made for the rumen microbial samples. All pipelines produced consistent taxonomic annotations from the phylum to the genus level. Differences in relative abundance were calculated at all microbial levels, including Bacteroidetes (QIIME: 72.2%, Usearch: 74.09%) and Firmicutes (QIIME: 18.3%, Usearch: 20.20%) for the bacterial phyla, Methanobacteriales (QIIME: 64.2%, Usearch: 45.7%) for the archaeal class, and Methanobacteriaceae (QIIME: 35%, Usearch: 45.7%) and Methanomassiliicoccaceae (QIIME: 35%, Usearch: 31.13%) for the archaeal families. However, the most prevalent archaeal class varied between these two annotation pipelines: Thermoplasmata was the top class according to the QIIME annotation, whereas Methanobacteria was the top class according to Usearch.
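
As a hedged illustration of the relative-abundance comparison quoted above, the following sketch computes per-phylum relative abundances from a small OTU count table. The table, taxa, and numbers are invented stand-ins, not the study's data:

```python
# Minimal sketch (not the authors' pipeline): per-phylum relative abundance
# from a hypothetical OTU count table with one taxonomic assignment per OTU.
import pandas as pd

counts = pd.DataFrame(
    {"sample1": [120, 30, 5], "sample2": [200, 80, 10]},
    index=["OTU1", "OTU2", "OTU3"],
)
taxonomy = pd.Series(
    {"OTU1": "Bacteroidetes", "OTU2": "Firmicutes", "OTU3": "Proteobacteria"}
)

# Sum counts per phylum, then normalise each sample to percentages
phylum_counts = counts.groupby(taxonomy).sum()
relative_abundance = 100 * phylum_counts / phylum_counts.sum()
print(relative_abundance.round(2))
```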

Keywords: cattle rumen, rumen microbial, 16S rRNA gene sequencing, bioinformatics pipeline

Procedia PDF Downloads 88
2010 Coal Mining Safety Monitoring Using WSN

Authors: Somdatta Saha

Abstract:

The main purpose was to provide an implementable design scenario for underground coal mines using wireless sensor networks (WSNs), the main reason being that, given the intricacies in the physical structure of a coal mine, only low-power WSN nodes can produce accurate surveillance and accident-detection data. The work mainly concentrated on designing and simulating various alternate scenarios for a typical mine and comparing them based on the obtained results to arrive at a final design. In the era of embedded technology, ZigBee protocols are used in more and more applications. Because of the rapid development of sensors, microcontrollers, and network technology, reliable technological conditions have been provided for automatic real-time monitoring of coal mines. The underground system collects temperature, humidity, and methane values through sensor nodes in the mine; it also counts the personnel inside the mine with the help of an IR sensor, and then transmits the data to an ARM-based information-processing terminal.
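
A minimal sketch of the threshold logic a sink node might apply to readings gathered over ZigBee is shown below; the quantities and safe limits are illustrative assumptions, not the authors' design values:

```python
# Minimal sketch: raise alarms when a sensor node's readings exceed safe limits.
# Limits below are illustrative assumptions, not the authors' calibrated values.
SAFE_LIMITS = {"temperature_C": 40.0, "humidity_pct": 95.0, "methane_pct": 1.0}

def check_reading(node_id: str, reading: dict) -> list:
    """Return a list of alarm strings for any quantity above its safe limit."""
    alarms = []
    for quantity, limit in SAFE_LIMITS.items():
        if reading.get(quantity, 0.0) > limit:
            alarms.append(f"node {node_id}: {quantity}={reading[quantity]} exceeds {limit}")
    return alarms

# Example packet from a sensor node inside the mine
print(check_reading("N07", {"temperature_C": 33.5, "humidity_pct": 88.0, "methane_pct": 1.4}))
```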

Keywords: ARM, embedded board, wireless sensor network (Zigbee)

Procedia PDF Downloads 340
2009 Causes of Variation Orders in the Egyptian Construction Industry: Time and Cost Impacts

Authors: A. Samer Ezeldin, Jwanda M. El Sarag

Abstract:

Variation orders are of great importance in any construction project. Variation orders are defined as any change in the scope of works of a project, whether an addition, omission, or even modification. This paper investigates the variation orders that occur during construction projects in Egypt. The literature review presents a comparison of the causes of variation orders among Egypt, Tanzania, Nigeria, Malaysia, and the United Kingdom. A classification of the occurrence of variation orders due to owner-related factors, consultant-related factors, and other factors is also set out in the literature review. The classified events that lead to variation orders were introduced in a survey of 19 events to observe their frequency of occurrence and their time and cost impacts. The survey data were obtained from 87 participants, including clients, consultants, and contractors, and a database of 42 scenarios was created. A model was then developed to assist project managers in predicting the frequency of variations, budgeting for any additional costs, and minimizing any delays that can take place. Two experts with more than 25 years of experience were given the model to verify that it was working effectively. The model was then validated on a residential compound completed in July 2016 to prove that it actually produces acceptable results.

Keywords: construction, cost impact, Egypt, time impact, variation orders

Procedia PDF Downloads 183
2008 3D-Mesh Robust Watermarking Technique for Ownership Protection and Authentication

Authors: Farhan A. Alenizi

Abstract:

Digital watermarking has evolved in the past years as an important means for data authentication and ownership protection. Image and video watermarking is well known in the field of multimedia processing; however, watermarking techniques for 3D objects have emerged as an important means for the same purposes, as 3D mesh models are in increasing use in scientific, industrial, and medical applications. Like image watermarking techniques, 3D watermarking can take place in either the spatial or the transform domain. Unlike images and video, where the frames have regular structures in both the spatial and temporal domains, 3D objects are represented as meshes that are basically irregular samplings of surfaces; moreover, meshes can undergo a large variety of alterations which may be hard to tackle. This makes the watermarking process more challenging. While transform-domain watermarking is preferable for images and videos, it is still difficult to implement in 3D meshes due to the huge number of vertices involved and the complicated topology and geometry, and hence the difficulty of performing the spectral decomposition, even though significant work has been done in the field. Spatial-domain watermarking has attracted significant attention in the past years; such methods can act on either the topology or the geometry of the model. Exploiting the statistical characteristics of 3D mesh models, from both geometrical and topological aspects, has proven useful for hiding data; doing so with minimal surface distortion to the mesh, however, has attracted significant research in the field. A 3D mesh blind watermarking technique is proposed in this research. The watermarking method depends on modifying the vertices' positions with respect to the center of the object. An optimal method will be developed to reduce the errors, minimizing the distortions that the 3D object may experience due to the watermarking process and reducing the computational complexity due to the iterations and other factors. The technique relies on displacing the vertices' locations by modifying the variances of the vertices' norms. Statistical analyses were performed to establish the distributions that best fit each mesh, and hence to establish the bin sizes. Several optimizing approaches were introduced in the realms of mesh local roughness, the statistical distributions of the norms, and the displacements of the mesh centers. To evaluate the algorithm's robustness against common geometry and connectivity attacks, the watermarked objects were subjected to uniform noise, Laplacian smoothing, vertex quantization, simplification, and cropping. Experimental results showed that the approach is robust in terms of both perceptual and quantitative qualities, and against both geometry and connectivity attacks. Moreover, the probability of true-positive detection versus the probability of false-positive detection was evaluated; to validate the accuracy of the test cases, receiver operating characteristic (ROC) curves were drawn, and they showed robustness in this respect as well. 3D watermarking is still a new field, but a promising one.
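
The following is a minimal sketch of the embedding idea described above: shifting vertex norms (distances from the object's centre) so that the variance of the norms in each bin encodes one watermark bit. The bin count, strength, and embedding rule are illustrative assumptions, not the paper's optimised method:

```python
# Minimal sketch: embed bits by growing/shrinking the variance of vertex norms
# within bins. This illustrates the idea only; it is not the paper's method.
import numpy as np

def embed(vertices: np.ndarray, bits: list, strength: float = 0.02) -> np.ndarray:
    center = vertices.mean(axis=0)
    offsets = vertices - center
    norms = np.linalg.norm(offsets, axis=1)
    order = np.argsort(norms)
    bins = np.array_split(order, len(bits))          # one bin per watermark bit
    scaled = norms.copy()
    for idx, bit in zip(bins, bits):
        mean = norms[idx].mean()
        factor = 1 + strength if bit else 1 - strength
        scaled[idx] = mean + (norms[idx] - mean) * factor   # adjust bin variance
    return center + offsets * (scaled / norms)[:, None]     # move vertices radially

rng = np.random.default_rng(0)
mesh = rng.normal(size=(500, 3))                     # stand-in vertex cloud
watermarked = embed(mesh, bits=[1, 0, 1, 1])
```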

Keywords: watermarking, mesh objects, local roughness, Laplacian smoothing

Procedia PDF Downloads 160
2007 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces

Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet

Abstract:

In the present work, we developed an image processing algorithm to measure the characteristics of water droplets during dropwise condensation on pillared surfaces. The main difficulty in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides droplets into four main groups based on their size and applies a corresponding algorithm to segment each group. These algorithms generate binary images of the droplets based on both their geometrical and intensity properties. Information on droplet evolution over time, including mean radius and number of drops per unit area, is then extracted from the binary images. The developed image processing algorithm is verified against manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.
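
A minimal sketch of one segmentation pass is given below (the authors' four-group, size-dependent algorithm is not reproduced): it thresholds a grayscale frame, labels connected components, and extracts the mean droplet radius and drop count per unit area with scikit-image:

```python
# Minimal sketch: Otsu threshold + connected-component labelling, then mean
# equivalent radius and drops per unit area. The frame is a random stand-in.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def droplet_stats(gray: np.ndarray, pixel_area_mm2: float):
    binary = gray > threshold_otsu(gray)     # droplets assumed brighter here
    regions = regionprops(label(binary))
    radii = [r.equivalent_diameter / 2 for r in regions]   # radius in pixels
    field_area = gray.size * pixel_area_mm2
    return (np.mean(radii) if radii else 0.0), len(regions) / field_area

rng = np.random.default_rng(1)
frame = rng.random((256, 256))
mean_radius_px, drops_per_mm2 = droplet_stats(frame, pixel_area_mm2=1e-4)
print(mean_radius_px, drops_per_mm2)
```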

Keywords: dropwise condensation, textured surface, image processing, watershed

Procedia PDF Downloads 223
2006 Genetic Variation among the Wild and Hatchery Raised Populations of Labeo rohita Revealed by RAPD Markers

Authors: Fayyaz Rasool, Shakeela Parveen

Abstract:

Studies on the genetic diversity of Labeo rohita using molecular markers were carried out to investigate the genetic structure, by means of RAPD markers, and the levels of polymorphism and similarity among five populations of wild and farmed types. The samples were collected from five different locations as representatives of wild and hatchery-raised populations. The RAPD data were analysed with Jaccard's coefficient, following the unweighted pair group method with arithmetic mean (UPGMA) for hierarchical clustering of similar groups on the basis of similarity among genotypes; the dendrogram generated divided the randomly selected individuals of the five populations into three classes/clusters. The variance decomposition for the optimal classification was 52.11% for within-class variation and 47.89% for between-class differences. Principal component analysis (PCA) for grouping the genotypes from the different environmental conditions was done by the Spearman varimax rotation method, generating a bi-plot of the co-occurrence of genotypes with similar genetic properties and the specificity of different primers; it indicated clearly that the increase in the number of factors or components was correlated with the decrease in eigenvalues. Under the Kaiser criterion (eigenvalues greater than one), the first two main factors accounted for 58.177% of the cumulative variability.
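
The clustering step described above can be sketched as follows with SciPy: Jaccard distances on binary RAPD band patterns followed by UPGMA (average-linkage) clustering. The band matrix here is a random stand-in, not the study's data:

```python
# Minimal sketch: Jaccard distance + UPGMA hierarchical clustering of binary
# band patterns, cut into three clusters as in the study's dendrogram.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
bands = rng.integers(0, 2, size=(25, 60))   # 25 individuals x 60 RAPD bands

dist = pdist(bands, metric="jaccard")       # 1 - Jaccard similarity
tree = linkage(dist, method="average")      # UPGMA
clusters = fcluster(tree, t=3, criterion="maxclust")  # cut into 3 clusters
print(np.bincount(clusters)[1:])            # cluster sizes
```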

Keywords: variation, clustering, PCA, wild, hatchery, RAPD, Labeo rohita

Procedia PDF Downloads 449
2005 Seismic Hazard Prediction Using Seismic Bumps: Artificial Neural Network Technique

Authors: Belkacem Selma, Boumediene Selma, Tourkia Guerzou, Abbes Labdelli

Abstract:

Natural disasters have occurred and will continue to cause human and material damage; the idea of "preventing" natural disasters outright will never be possible. However, their prediction is possible with the advancement of technology, and even if natural disasters are effectively inevitable, their consequences may be partly controlled. The rapid growth and progress of artificial intelligence (AI) has had a major impact on the prediction of natural disasters and on the risk assessment necessary for effective disaster reduction. Predicting earthquakes to prevent the loss of human lives and property damage is important, which is why it is crucial to develop techniques for predicting this natural disaster. The present study aims to analyze the ability of artificial neural networks (ANNs) to predict earthquakes that occur in a given area. The data used describe the problem of forecasting high-energy (greater than 10^4 J) seismic bumps in a coal mine, using two longwalls as an example. For this purpose, seismic bump data obtained from mines were analyzed. The results show that the ANN was able to predict earthquake parameters with high accuracy: the classification accuracy through neural networks is more than 94%, and the models developed are efficient and robust and depend only weakly on the initial database.
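
A minimal sketch of the classification task, using scikit-learn's MLPClassifier on synthetic stand-in data (the study's seismic-bumps mine data are not reproduced), follows:

```python
# Minimal sketch: an ANN classifier for "high-energy bump / no bump" labels.
# Features, labels, and the decision rule below are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))                # 12 mine-state attributes (stand-in)
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # 1 = high-energy bump (synthetic rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
ann.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, ann.predict(X_te)))
```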

Keywords: earthquake prediction, ANN, seismic bumps

Procedia PDF Downloads 127
2004 Study on Optimization Design of Pressure Hull for Underwater Vehicle

Authors: Qasim Idrees, Gao Liangtian, Liu Bo, Miao Yiran

Abstract:

In order to improve the efficiency and accuracy of pressure hull structure optimization for underwater vehicles, a design optimization method based on response surface methodology was studied. Five dimensions of the pressure shell were taken as design variables, and thin-shell theory and the Chinese Classification Society (CCS) specification were applied in the preliminary design. In order to optimize the variables over the feasible region, different methods were studied and implemented, such as the Opt LHD method (to determine the design test sample points in the feasible domain space), a parametric ABAQUS solution for the response at each sample point, and a second-order polynomial response surface model of the limit load of the structure. Based on the ultimate load of the structure and the mass of the shell, a second-generation genetic algorithm was used to solve the response surface, and the Pareto optimal solution set was obtained. The final optimized limit load was 41.68% higher than that of the initial design, and the shell mass was reduced by about 27.26%. The parametric method can ensure the accuracy of the test and improve the efficiency of optimization.
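
The response-surface step can be sketched as below: Latin hypercube sampling of two design variables and a least-squares fit of a second-order polynomial surrogate. The "limit load" function is a toy stand-in for the parametric ABAQUS runs, and the variable bounds are invented:

```python
# Minimal sketch: Latin hypercube design + second-order polynomial response
# surface. Two variables only; the real study used five and FE solutions.
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(n=30), [5.0, 0.01], [10.0, 0.05])  # radius, thickness

def limit_load(x):                          # toy stand-in for an ABAQUS solution
    r, t = x[:, 0], x[:, 1]
    return 1e3 * t / r + 50 * t**2

y = limit_load(X)
# Second-order polynomial basis: 1, r, t, r^2, r*t, t^2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 0] * X[:, 1], X[:, 1]**2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coeffs)                               # surrogate used by the GA search
```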

Keywords: parameterization, response surface, structure optimization, pressure hull

Procedia PDF Downloads 233
2003 Switching to the Latin Alphabet in Kazakhstan: A Brief Overview of Character Recognition Methods

Authors: Ainagul Yermekova, Liudmila Goncharenko, Ali Baghirzade, Sergey Sybachin

Abstract:

In this article, we address the problem of Kazakhstan's transition to the Latin alphabet. The transition process started in 2017 and is scheduled to be completed in 2025. In connection with these events, the problem of recognizing the characters of the new alphabet arises. Well-known character recognition programs such as ABBYY FineReader, FormReader, and MyScript Stylus did not recognize the specific Kazakh letters that were used in Cyrillic. The authors assess the well-known character recognition methods that could be in demand as part of the country's transition to the Latin alphabet. Three methods of character recognition (template, structured, and feature-based) are considered through their algorithms of operation. At the end of the article, a general conclusion is made about the applicability of each method to a particular recognition task: for example, a population census, recognition of typographic text in Latin, or recognition of photos of car number plates, store signs, etc.
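
A minimal sketch of the template method follows: slide a glyph template over an image and score each position by normalised cross-correlation, in plain NumPy. The images are random stand-ins:

```python
# Minimal sketch of template matching: exhaustive search with normalised
# cross-correlation. Brute force for clarity, not an optimised recognizer.
import numpy as np

def match_template(image: np.ndarray, template: np.ndarray):
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-9)
    best_score, best_pos = -np.inf, (0, 0)
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            patch = image[i:i + th, j:j + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            score = (p * t).mean()            # normalised cross-correlation
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos, best_score

rng = np.random.default_rng(3)
page, glyph = rng.random((64, 64)), rng.random((8, 8))
print(match_template(page, glyph))
```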

Keywords: text detection, template method, recognition algorithm, structured method, feature method

Procedia PDF Downloads 187
2002 Encryption and Decryption of Nucleic Acid Using Deoxyribonucleic Acid Algorithm

Authors: Iftikhar A. Tayubi, Aabdulrahman Alsubhi, Abdullah Althrwi

Abstract:

Deoxyribonucleic acid (DNA) sequence text provides a single source of high-quality cryptographic material for structural biologists. We provide an intuitive, well-organized, and user-friendly web interface that allows users to encrypt and decrypt DNA sequence text. It includes a complex securing algorithm used to encrypt and decrypt DNA sequences. The utility of this DNA sequence text is that it provides a user-friendly interface for users to encrypt, decrypt, and store information about DNA sequences. The interfaces created in this project will satisfy the demands of the scientific community by providing full encryption of DNA sequences through this website. We adopted a methodology using C# and Active Server Pages (ASP.NET) for programming, which is smart and secure. DNA sequence text is a wonderful piece of equipment for encrypting large quantities of data efficiently. Users can thus navigate from one encoding to another and store the text, depending on the field of interest. Algorithm classification allows a user to protect the DNA sequence from change, whether an alteration or error occurred during DNA sequence data transfer; the integrity of the DNA sequence data is checked during access.

Keywords: algorithm, ASP.NET, DNA, encrypt, decrypt

Procedia PDF Downloads 234
2001 Child Sexual Abuse Prevention: Evaluation of the Program “Sharing Mouth to Mouth: My Body, Nobody Can Touch It”

Authors: Faride Peña, Teresita Castillo, Concepción Campo

Abstract:

Sexual violence, and particularly child sexual abuse, is a serious problem all over the world, México included. Given its importance, there are several preventive and care programs run by the government and civil society all over the country, but most of them are developed in urban areas even though these problems are especially serious in rural areas. Yucatán, a state in southern México, ranks among the highest in child sexual abuse. Considering the above, the University Unit of Clinical Research and Victimological Attention (UNIVICT) of the Autonomous University of Yucatan designed, implemented, and is currently evaluating the program named "Sharing Mouth to Mouth: My Body, Nobody Can Touch It", a program to prevent child sexual abuse in rural communities of Yucatán, México. Its aim was to develop skills for the detection of risk situations, providing protection strategies and mechanisms for prevention through culturally relevant psycho-educative strategies to increase personal resources in children, in collaboration with parents, teachers, police, and municipal authorities. The diagnosis identified that a particularly vulnerable population was children between 4 and 10 years of age. The program ran during 2015 in primary schools in the municipality, whose inhabitants are mostly Mayan. The aim of this paper is to present the program's evaluation in terms of its effectiveness and efficiency. This evaluation included documental analysis of the work done in the field, psycho-educational and recreational activities with children, evaluation of knowledge by the participating children, and interviews with parents and teachers. The results show high effectiveness in fulfilling the tasks and achieving the primary objectives. The efficiency shows satisfactory results, but also opportunity areas that can be resolved with minor adjustments to the program. The results also show the importance of including culturally relevant strategies and activities; otherwise, possible achievements are minimized. Another highlight is the importance of participatory action research (PAR) in preventive approaches to child sexual abuse, since by becoming aware of the importance of the subject people participate more actively; it also helps in designing culturally appropriate strategies and measures so that the proposal is not distant from the people. The discussion emphasizes the methodological implications of prevention programs: the convenience of using PAR, the importance of monitoring and mediation during implementation, the development of detection-skills tools in creative ways using psycho-educational interactive techniques, and working with assessments issued by the participants themselves. It is also important to consider the holistic character this type of program should have: incorporating socially and culturally relevant characteristics according to the community's individuality and uniqueness, and considering the type of communication to be used and children's language skills, since there should be variations strongly linked to the specific cultural context.

Keywords: child sexual abuse, evaluation, PAR, prevention

Procedia PDF Downloads 295
2000 Right Atrial Tissue Morphology in Acquired Heart Diseases

Authors: Edite Kulmane, Mara Pilmane, Romans Lacis

Abstract:

Introduction: Acquired heart diseases remain one of the leading health care problems in the world. Changes in the myocardium of diseased hearts are complex, and their pathogenesis is still not fully clear. The aim of this study was to identify the appearance and distribution of apoptosis, homeostasis-regulating factors, and innervation and ischemia markers in right atrial tissue in different acquired heart diseases. Methods: Right atrial tissue fragments were taken from 12 patients during elective open-heart surgery. All patients were operated on because of acquired heart diseases: aortic valve stenosis (5 patients), coronary heart disease (5 patients), coronary heart disease and secondary mitral insufficiency (1 patient), and mitral disease (1 patient). The mean age (mean±SD) was 70.2±7.0 years (range 58-83 years). The tissues were stained with haematoxylin and eosin for routine light-microscopic examination and immunohistochemically for the detection of protein gene peptide 9.5 (PGP 9.5), human atrial natriuretic peptide (hANUP), vascular endothelial growth factor (VEGF), chromogranin A, and endothelin. Apoptosis was detected by the TUNEL method. Results: All specimens showed degeneration of cardiomyocytes with lysis of myofibrils, diffuse vacuolization, especially in the perinuclear region, and variation in the size of cells and their nuclei. Severe invasion of connective tissue was observed in most fragments. The apoptotic index ranged from 24 to 91%. One specimen showed a region of newly formed microvessels with cube-shaped endotheliocytes that were positive for PGP 9.5, endothelin, chromogranin A, and VEGF. Numerous PGP 9.5-containing nerve fibres were observed in all fragments taken from patients with coronary heart disease, except in the patient with secondary mitral insufficiency, who showed just a few PGP 9.5-positive nerves. In the majority of specimens, regions with cube-shaped VEGF-immunoreactive endocardial and epicardial cells were observed; VEGF-positive endothelial cells alone were observed in just a few specimens. There was no significant difference in hANUP-secreting cells among the specimens. In all patients operated on due to coronary heart disease, moderate to numerous chromogranin A-positive cells were seen, while tissue from patients with aortic valve stenosis demonstrated just a few factor-positive cells. Conclusions: Complex detection of different factors may selectively indicate disordered morphopathogenetic events of heart disease: a decrease in PGP 9.5 nerves suggests decreased innervation of the organ; increased apoptosis indicates cell death without ingrowth of connective tissue; the persistent presence of hANUP proves the unchanged homeostasis of cardiomyocytes, probably supported by the expression of chromogranins. Finally, a decrease in VEGF marks the regions of affected blood vessels in hearts affected by acquired heart disease.

Keywords: heart, apoptosis, protein-gene peptide 9.5, atrial natriuretic peptide, vascular endothelial growth factor, chromogranin A, endothelin

Procedia PDF Downloads 295
1999 Application of the MOOD Technique to the Steady-State Euler Equations

Authors: Gaspar J. Machado, Stéphane Clain, Raphael Loubère

Abstract:

The goal of the present work is to numerically study steady-state nonlinear hyperbolic equations in the context of the finite volume framework. We consider the unidimensional Burgers' equation as the reference case for the scalar situation and the unidimensional Euler equations for the vectorial situation. We consider two approaches to solve the nonlinear equations: a time-marching algorithm and a direct steady-state approach. We first develop the necessary and sufficient conditions to obtain the existence and uniqueness of the solution. We treat regular examples and solutions with a steady shock, and to provide very high-order finite volume approximations, we implement a method based on the MOOD technology (Multi-dimensional Optimal Order Detection). The main ingredient consists of using an 'a posteriori' limiting strategy to eliminate non-physical oscillations deriving from the Gibbs phenomenon while keeping high accuracy in the smooth parts.
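
The a posteriori limiting idea can be sketched on the unidimensional Burgers' equation as follows: advance with an unlimited second-order candidate, detect cells violating a discrete maximum principle, and recompute those cells at first order. This is a simplified illustration, not the authors' very high-order scheme:

```python
# Minimal MOOD-flavoured sketch for 1D Burgers' equation on a periodic grid.
import numpy as np

def flux(u):                          # Burgers' flux f(u) = u^2 / 2
    return 0.5 * u * u

def godunov_flux(ul, ur):             # exact Riemann flux for Burgers' equation
    f = np.where(ul > ur, np.maximum(flux(ul), flux(ur)),
                 np.minimum(flux(ul), flux(ur)))
    return np.where((ul < 0) & (ur > 0), 0.0, f)   # transonic rarefaction

def step(u, dt, dx, slopes):          # one forward-Euler finite volume step
    ul = u + 0.5 * slopes                          # left state at interface i+1/2
    ur = np.roll(u - 0.5 * slopes, -1)             # right state at interface i+1/2
    f = godunov_flux(ul, ur)
    return u - dt / dx * (f - np.roll(f, 1))

N = 200
dx, dt = 1.0 / N, 0.4 / N                          # CFL ~ 0.4 for |u| <= 1
x = (np.arange(N) + 0.5) * dx
u = np.sin(2 * np.pi * x)                          # periodic initial condition
for _ in range(200):
    slopes = 0.5 * (np.roll(u, -1) - np.roll(u, 1))   # unlimited 2nd-order candidate
    cand = step(u, dt, dx, slopes)
    lo = np.minimum(np.roll(u, 1), np.minimum(u, np.roll(u, -1)))
    hi = np.maximum(np.roll(u, 1), np.maximum(u, np.roll(u, -1)))
    bad = (cand < lo - 1e-12) | (cand > hi + 1e-12)   # a posteriori detection (DMP)
    if bad.any():
        # Simplified cellwise decrement: redo offending cells at first order.
        # (A full MOOD loop decrements the order face by face and re-checks.)
        cand = np.where(bad, step(u, dt, dx, np.zeros_like(u)), cand)
    u = cand
```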

Keywords: Euler equations, finite volume, MOOD, steady-state

Procedia PDF Downloads 277
1998 miR-200c as a Biomarker for 5-FU Chemosensitivity in Colorectal Cancer

Authors: Rezvan Najafi, Korosh Heydari, Massoud Saidijam

Abstract:

5-FU is a chemotherapeutic agent that has been used in colorectal cancer (CRC) treatment. However, it is usually associated with acquired resistance, which decreases the therapeutic effects of 5-FU. miR-200c is involved in chemotherapeutic drug resistance, but its mechanism is not fully understood. In this study, the effect of miR-200c inhibition on the sensitivity of HCT-116 CRC cells to 5-FU was evaluated. HCT-116 cells were transfected with LNA-anti-miR-200c for 48 h. The expression of miR-200c was evaluated using quantitative real-time PCR. The protein expression of phosphatase and tensin homolog (PTEN) and E-cadherin was analyzed by western blotting. Annexin V and propidium iodide staining assays were applied for apoptosis detection. Caspase-3 activation was evaluated by an enzymatic assay. The results showed that LNA-anti-miR-200c inhibited the expression of PTEN and E-cadherin protein, apoptosis, and the activation of caspase-3 compared with control cells. In conclusion, these results suggest that miR-200c, as a prognostic marker, can help overcome 5-FU chemoresistance in CRC.

Keywords: colorectal cancer, miR-200c, 5-FU resistance, E-cadherin, PTEN

Procedia PDF Downloads 166
1997 An Application for Risk of Crime Prediction Using Machine Learning

Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento

Abstract:

The increase of the world population, especially in large urban centers, has resulted in new challenges, particularly in the control and optimization of public safety. Thus, in the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical data of incidents and demographic information. The entire research and implementation is presented, starting with the data collection from its original source, the treatment and transformations applied to the data, and the choice, evaluation, and implementation of the machine learning model, up to the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors, and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformations used. The results show that the use of machine learning techniques helps to anticipate criminal occurrences, contributing to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API enabling other entities to make requests for predictions in real time. An application is also presented in which the crime predictions can be shown visually.
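
A minimal sketch of the classification setup, with a random forest predicting a binary crime-risk label for (location, time) cells, is shown below; the features and data are synthetic stand-ins for the historical incident and demographic data used in the study:

```python
# Minimal sketch: binary crime-risk classification per (time, location) cell.
# All features, labels, and the risk rule are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)
n = 5000
X = np.column_stack([
    rng.integers(0, 24, n),        # hour of day
    rng.integers(0, 7, n),         # day of week
    rng.integers(0, 100, n),       # grid-cell id
    rng.normal(size=n),            # demographic index (stand-in)
])
y = ((X[:, 0] > 20) & (X[:, 3] > 0)).astype(int)   # synthetic risk rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```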

Keywords: crime prediction, machine learning, public safety, smart city

Procedia PDF Downloads 112
1996 Impact of Instrument Transformer Secondary Connections on Performance of Protection System: Experiences from Indian POWERGRID

Authors: Pankaj Kumar Jha, Mahendra Singh Hada, Brijendra Singh, Sandeep Yadav

Abstract:

Protective relays are commonly connected to the secondary windings of instrument transformers, i.e., current transformers (CTs) and/or capacitive voltage transformers (CVTs). The purpose of CTs and CVTs is to provide galvanic isolation from high voltages and reduce primary currents and voltages to nominal quantities recognized by the protective relays. Selecting the correct instrument transformers for an application is imperative: failing to do so may compromise the relay's performance, as the output of the instrument transformer may no longer be an accurately scaled representation of the primary quantity. Having an accurately rated instrument transformer is of no use if these devices are not properly connected. The performance of a protective relay is reliant on its programmed settings and on the current and voltage inputs from the instrument transformer secondaries. This paper will help in understanding the fundamental concepts of connecting instrument transformers to protection relays and the effect of incorrect connections on the performance of protective relays. Multiple case studies of protection system maloperations due to incorrect connections of instrument transformers are discussed in detail in this paper. Apart from the connection issues of instrument transformers to protective relays, this paper also discusses the effect of multiple earthing of CT and CVT secondaries on the performance of the protection system. The case studies presented will help the reader analyse the problem through real-world challenges in complex power system networks and will also help the protection engineer in better analysis of disturbance records. CT and CVT connection errors can lead to undesired operations of protection systems; however, many of these operations can be avoided by adhering to industry standards and implementing tried-and-true field testing and commissioning practices. Understanding the effects of a missing CVT neutral, multiple earthing of the CVT secondary, and multiple grounding of CT star points on the performance of the protection system through real-world case studies will help the protection engineer better commission and maintain the protection system.

Keywords: bus reactor, current transformer, capacitive voltage transformer, distance protection, differential protection, directional earth fault, disturbance report, instrument transformer, ICT, REF protection, shunt reactor, voltage selection relay, VT fuse failure

Procedia PDF Downloads 81
1995 Offline Signature Verification Using Minutiae and Curvature Orientation

Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee

Abstract:

A signature is a behavioral biometric that is used for authenticating users in most financial and legal transactions. Signatures can be easily forged by skilled forgers; therefore, it is essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person and increase the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system to indicate whether a signature is genuine or not. The system comprises four phases: (1) the pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and connecting of ridge breaks are applied; (2) the feature extraction phase, in which global and local features are extracted: the local features are minutiae points, curvature orientation, and curve plateau, and the global features are signature area, signature aspect ratio, and Hu moments; (3) the post-processing phase, in which false minutiae are removed; and (4) the classification phase, in which features are enhanced before being fed into the classifier. k-nearest neighbors and support vector machines are used. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state-of-the-art. The accuracy of the proposed system is 92.3%.
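
The global-feature path can be sketched as follows: Hu moments, signature area, and aspect ratio from binary signature images, fed to an SVM. The images and labels are random stand-ins, and the minutiae and curvature local features are omitted:

```python
# Minimal sketch: global features (aspect ratio, area, Hu moments) + SVM.
# Signature images and genuine/forged labels below are random stand-ins.
import numpy as np
from skimage.measure import moments_central, moments_normalized, moments_hu
from sklearn.svm import SVC

def global_features(binary: np.ndarray) -> np.ndarray:
    rows, cols = np.nonzero(binary)
    aspect = (cols.ptp() + 1) / (rows.ptp() + 1)       # signature aspect ratio
    area = binary.sum() / binary.size                   # normalised signature area
    nu = moments_normalized(moments_central(binary.astype(float)))
    return np.concatenate([[aspect, area], moments_hu(nu)])

rng = np.random.default_rng(5)
images = rng.random((40, 64, 128)) > 0.7               # 40 fake binary signatures
labels = rng.integers(0, 2, 40)                        # 1 = genuine, 0 = forged
X = np.array([global_features(im) for im in images])
clf = SVC(kernel="rbf").fit(X, labels)
```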

Keywords: signature, ridge breaks, minutiae, orientation

Procedia PDF Downloads 146
1994 Impact of Flavor on Food Product Quality: A Case Study of Vanillin Stability during Biscuit Preparation

Authors: N. Yang, R. Linforth, I. Fisk

Abstract:

The influence of food processing and the choice of flavour solvent were investigated using biscuits prepared with vanillin flavour as an example. Powdered vanillin was either added directly into the dough or dissolved in a flavour solvent and then mixed into the dough. The impact of two commonly used flavour solvents on food quality was compared: propylene glycol (PG) and triacetin (TA). The analytical approach for vanillin detection was developed by chromatography (HPLC-PDA), and a standard extraction method for vanillin was also established. The results indicated the impact of solvent choice on the vanillin level during biscuit preparation. After baking, TA, as a more heat-resistant solvent, retained more vanillin than PG, so TA is a better solvent for products that undergo a heating process. The results also illustrated the impact of mixing and baking on vanillin stability in the matrices. The average loss of vanillin was 33% during mixing and 13% during baking, which indicates that the binding of vanillin to fat or flour before baking might cause a larger loss than evaporation during baking.

Keywords: biscuit, flavour stability, food quality, vanillin

Procedia PDF Downloads 508
1993 Stock Market Prediction Using Convolutional Neural Network That Learns from a Graph

Authors: Mo-Se Lee, Cheol-Hwi Ahn, Kee-Young Kwahk, Hyunchul Ahn

Abstract:

Over the past decade, deep learning has been in the spotlight among various machine learning algorithms. In particular, CNNs (Convolutional Neural Networks), known as an effective solution for recognizing and classifying images, have been popularly applied to classification and prediction problems in various fields. In this study, we try to apply a CNN to stock market prediction, one of the most challenging tasks in machine learning research. Specifically, we propose to apply a CNN as a binary classifier that predicts the stock market direction (up or down) by using a graph as its input. That is, our proposal is to build a machine learning algorithm that mimics a person who looks at the graph and predicts whether the trend will go up or down. Our proposed model consists of four steps. In the first step, it divides the dataset into 5-day, 10-day, 15-day, and 20-day windows. It then creates graphs for each interval in step 2. In the next step, CNN classifiers are trained using the graphs generated in the previous step. In step 4, the hyperparameters of the trained model are optimized using the validation dataset. To validate our model, we will apply it to the prediction of the KOSPI200 for 1,986 days over eight years (from 2009 to 2016). The experimental dataset will include 14 technical indicators, such as CCI, Momentum, and ROC, and the daily closing price of the KOSPI200 index of the Korean stock market.
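
A minimal sketch of such a classifier, a small CNN taking a rendered chart image and outputting up/down, is given below; the architecture and image size are illustrative assumptions, not the paper's tuned model:

```python
# Minimal sketch: a small CNN that maps a 64x64 chart image to 2 logits
# (down / up). Layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ChartCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, 2)   # 2 classes: down / up

    def forward(self, x):                         # x: (batch, 1, 64, 64)
        z = self.features(x)
        return self.head(z.flatten(1))

model = ChartCNN()
dummy_charts = torch.rand(8, 1, 64, 64)           # 8 rendered graph images
logits = model(dummy_charts)
print(logits.argmax(dim=1))                       # predicted direction per chart
```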

Keywords: convolutional neural network, deep learning, Korean stock market, stock market prediction

Procedia PDF Downloads 425
1992 Microbiological Analysis, Cytotoxic and Genotoxic Effects from Material Captured in PM2.5 and PM10 Filters Used in the Aburrá Valley Air Quality Monitoring Network (Colombia)

Authors: Carmen E. Zapata, Juan Bautista, Olga Montoya, Claudia Moreno, Marisol Suarez, Alejandra Betancur, Duvan Nanclares, Natalia A. Cano

Abstract:

This study aims to evaluate the diversity of microorganisms in PM2.5 and PM10 filters and to determine the genotoxic and cytotoxic activity of the complex mixture present in the PM2.5 filters used in the Aburrá Valley Air Quality Monitoring Network (Colombia). The results indicate that bacteria are present in the PM2.5 particulate matter from different monitoring stations; however, the detection of bacteria and their phylogenetic relationships is not by itself complete evidence connecting the microorganisms with pathogenic activity or with the degradation of compounds present in the air. Additionally, damage induced by the particulate material to the cell membrane, the lysosomal and endosomal membranes, and mitochondrial metabolism was demonstrated; in almost all cases, this damage was independent of the PM2.5 concentration.

Keywords: cytotoxic, genotoxic, microbiological analysis, PM10, PM2.5

Procedia PDF Downloads 348
1991 Detection of Respiratory Syncytial Virus (hRSV) by PCR Technique in Lower Respiratory Tract Infection (LRTI) in Babylon City

Authors: Amal Raqib Shameran, Ghanim Aboud Al-Mola

Abstract:

Human respiratory syncytial virus (hRSV) is a major pathogen of respiratory tract infections (RTI) among infants and children worldwide. It is classified in the family Paramyxoviridae and sub-family Pneumovirinae. The current work aimed to determine the role of hRSV in lower respiratory tract infections (LRTI) in Hilla, Iraq. Samples were collected from 50 children admitted to the Hilla Maternity and Children Hospital suffering from LRTI; 50 nasal and pharyngeal swabs were taken from the patients in the period from January 2010 to April 2011. The results showed that 24% (12/50) of the children with LRTI were infected with hRSV when tested by reverse transcription polymerase chain reaction (RT-PCR).

Keywords: respiratory syncytial virus, respiratory tract infections, infants, polymerase chain reaction (PCR)

Procedia PDF Downloads 356
1990 Using Closed Frequent Itemsets for Hierarchical Document Clustering

Authors: Cheng-Jhe Lee, Chiun-Chieh Hsu

Abstract:

Due to the rapid development of the Internet and the increased availability of digital documents, the excess of information on the Internet has led to the information overload problem. In order to solve this problem for effective information retrieval, document clustering in text mining has become a popular research topic. Clustering is the unsupervised classification of data items into groups without the need for training data. Many conventional document clustering methods perform inefficiently for large document collections because they were originally designed for relational databases; they are therefore impractical in real-world document clustering and require special handling for high dimensionality and high volume. We employ the FIHC (Frequent Itemset-based Hierarchical Clustering) method, a hierarchical clustering method developed for document clustering, whose intuition is that there exist some common words for each cluster. FIHC uses such words to cluster documents and builds a hierarchical topic tree. In this paper, we combine the FIHC algorithm with an ontology to address the semantic problem and mine the meaning behind the words in documents. Furthermore, we use closed frequent itemsets instead of all frequent itemsets, which increases efficiency and scalability. The experimental results show that our method is more accurate than well-known document clustering algorithms.
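
The closed-frequent-itemset idea can be illustrated with a brute-force sketch: enumerate frequent word sets over tiny "documents" and keep only those with no superset of equal support:

```python
# Minimal sketch: brute-force frequent itemsets over word-set "documents",
# then filter to closed itemsets (no superset with the same support).
from itertools import combinations

docs = [{"data", "mining", "cluster"}, {"data", "cluster"},
        {"data", "mining"}, {"text", "mining", "cluster"}]
min_support = 2

def support(itemset):
    return sum(itemset <= d for d in docs)

vocab = sorted(set().union(*docs))
frequent = {frozenset(c): support(set(c))
            for k in range(1, len(vocab) + 1)
            for c in combinations(vocab, k)
            if support(set(c)) >= min_support}

closed = {s: sup for s, sup in frequent.items()
          if not any(s < t and sup == frequent[t] for t in frequent)}
print(closed)   # candidate cluster labels in an FIHC-style method
```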

Keywords: FIHC, documents clustering, ontology, closed frequent itemset

Procedia PDF Downloads 399
1989 A Comparative Analysis of Machine Learning Techniques for PM10 Forecasting in Vilnius

Authors: Mina Adel Shokry Fahim, Jūratė Sužiedelytė Visockienė

Abstract:

With the growing concern over air pollution (AP), the topic has gained more prominence than ever before; awareness has increased, and knowledge must now be passed on as a duty by those informed enough to disseminate it to others. This realisation often comes after an understanding of how poor air quality indices (AQI) damage human health. The study focuses on assessing air pollution prediction models specifically for Lithuania, addressing a substantial need for empirical research within the region. Concentrating on Vilnius, it specifically examines concentrations of particulate matter 10 micrometres or less in diameter (PM10). Utilizing Gaussian Process Regression (GPR), Regression Tree Ensemble, and Regression Tree methodologies, predictive forecasting models are validated and tested using hourly data from January 2020 to December 2022. The study explores the classification of AP data into anthropogenic and natural sources, the impact of AP on human health, and its connection to cardiovascular diseases. The study revealed varying levels of accuracy among the models, with GPR achieving the highest accuracy, indicated by an RMSE of 4.14 in validation and 3.89 in testing.
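
A minimal sketch of the best-performing model class, Gaussian Process Regression evaluated by RMSE, is shown below on synthetic stand-in data (not the Vilnius measurements):

```python
# Minimal sketch: GPR forecast of an hourly PM10-like series, scored by RMSE.
# The series is a synthetic daily cycle plus noise, a stand-in for real data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
hours = np.arange(500)[:, None]
pm10 = 20 + 10 * np.sin(2 * np.pi * hours[:, 0] / 24) + rng.normal(0, 2, 500)

train, test = slice(0, 400), slice(400, 500)
gpr = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(1.0))
gpr.fit(hours[train], pm10[train])
rmse = np.sqrt(mean_squared_error(pm10[test], gpr.predict(hours[test])))
print(f"RMSE: {rmse:.2f}")
```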

Keywords: air pollution, anthropogenic and natural sources, machine learning, Gaussian process regression, tree ensemble, forecasting models, particulate matter

Procedia PDF Downloads 53
1988 Facility Data Model as Integration and Interoperability Platform

Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes

Abstract:

Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this entails increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, providing a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process is described, starting from input data acquisition, through the definition of ontology concepts, to the population of those concepts. At the beginning, a core facility ontology was developed, representing the generic facility infrastructure and comprising the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was first extended and then populated. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system was analyzed as well.
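
A minimal sketch of the ontology workflow (core concepts, extension, population) using rdflib is given below; the namespace and class names are illustrative, not the project's actual ontology:

```python
# Minimal sketch: define core facility concepts, extend for an airport, and
# populate one individual. All names here are illustrative assumptions.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

FAC = Namespace("http://example.org/facility#")
g = Graph()
g.bind("fac", FAC)

# Core facility concepts
g.add((FAC.Facility, RDF.type, RDFS.Class))
g.add((FAC.EnergyMeter, RDF.type, RDFS.Class))

# Extension: airport-specific concept derived from the core ontology
g.add((FAC.Airport, RDFS.subClassOf, FAC.Facility))

# Population: one individual with a label
g.add((FAC.Malpensa, RDF.type, FAC.Airport))
g.add((FAC.Malpensa, RDFS.label, Literal("Malpensa Airport")))

print(g.serialize(format="turtle"))
```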

Keywords: airport ontology, energy management, facility data model, ontology modeling

Procedia PDF Downloads 448
1987 Energy Management System and Interactive Functions of Smart Plug for Smart Home

Authors: Win Thandar Soe, Innocent Mpawenimana, Mathieu Di Fazio, Cécile Belleudy, Aung Ze Ya

Abstract:

Intelligent electronic equipment and automation networks are the brain of high-tech energy management systems and play a critical role in the spread of smart homes. The smart home integrates technology for greater comfort, autonomy, reduced cost, and energy saving. These services can be provided to homeowners for managing their home appliances locally or remotely, allowing them to automate their consumption intelligently and responsibly through individual or collective control systems. In this study, three smart plugs are described, and one of them is tested on typical household appliances. This article proposes to collect data over the wireless technology and to extract smart data for the energy management system. This smart data quantifies three kinds of load: intermittent load, phantom load, and continuous load. Phantom load is wasted power: the unnoticed power that each appliance draws while plugged into the mains but not in active use. Intermittent load and continuous load take into consideration the power and usage time of home appliances. By analysing this classification of loads, the smart data can be used to reduce the communication of the wireless sensor network for the energy management system.
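
A minimal sketch of the three-way load classification from a smart-plug power trace follows; the thresholds are illustrative assumptions, not the study's calibrated values:

```python
# Minimal sketch: classify a power trace as phantom, continuous, or
# intermittent load. The on-threshold and duty-cycle cutoffs are assumptions.
import numpy as np

def classify_load(power_w: np.ndarray, on_threshold_w: float = 5.0) -> str:
    on = power_w > on_threshold_w
    duty_cycle = on.mean()
    if duty_cycle == 0 and power_w.mean() > 0:
        return "phantom"       # draws a little power but is never really on
    if duty_cycle > 0.9:
        return "continuous"    # e.g. a fridge or a router
    return "intermittent"      # e.g. a kettle or a washing machine

trace = np.array([1.2] * 50 + [800.0] * 10 + [1.2] * 40)  # watts, one sample/min
print(classify_load(trace))   # -> "intermittent"
```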

Keywords: energy management, load profile, smart plug, wireless sensor network

Procedia PDF Downloads 273
1986 Improving Detection of Illegitimate Scores and Assessment in Most Advantageous Tenders

Authors: Hao-Hsi Tseng, Hsin-Yun Lee

Abstract:

The Most Advantageous Tender (MAT) has been criticized for its susceptibility to dictatorial situations and for its handling of same-score, same-rank issues. This study applies the four criteria from Arrow's Impossibility Theorem to construct a mechanism for revealing illegitimate scores in scoring methods. While commonly used to mitigate problems resulting from extreme scores, ranking methods hide significant defects, adversely affecting selection fairness. To address these shortcomings, this study relies mainly on the overall evaluated score method, using standardized scores plus a normal cumulative distribution function conversion to calculate the evaluation of vendor preference. This allows for free score evaluations, which reduces the influence of dictatorial behavior and avoids same-score, same-rank issues. Large-scale simulations confirm that this method outperforms currently used methods with respect to the Impossibility Theorem criteria.
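
The standardisation-plus-normal-CDF conversion can be sketched as follows; the raw scores are invented for illustration:

```python
# Minimal sketch: standardise each evaluator's raw scores, map them through
# the normal CDF, then aggregate into one untied score per vendor.
import numpy as np
from scipy.stats import norm

raw = np.array([[80, 75, 90],      # evaluator 1's scores for vendors A, B, C
                [60, 85, 70]])     # evaluator 2's scores

z = (raw - raw.mean(axis=1, keepdims=True)) / raw.std(axis=1, keepdims=True)
converted = norm.cdf(z)            # cumulative-normal conversion per evaluator
overall = converted.mean(axis=0)   # aggregate across evaluators
print(overall.argsort()[::-1])     # vendor ranking, best first
```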

Keywords: Arrow’s impossibility theorem, cumulative normal distribution function, most advantageous tender, scoring method

Procedia PDF Downloads 464
1985 Minimizing Unscheduled Maintenance from an Aircraft and Rolling Stock Maintenance Perspective: Preventive Maintenance Model

Authors: Adel A. Ghobbar, Varun Raman

Abstract:

Corrective maintenance of components and systems is a problem plaguing almost every industry in the world today. Train operators and the maintenance, repair, and overhaul subsidiary of the Dutch railway company are also facing this problem. A considerable portion of the maintenance activities carried out by the company are unscheduled, which in turn severely stresses and stretches the workforce and resources available. One possible solution is to have a robust preventive maintenance plan; the other is to plan maintenance based on real-time data obtained from sensor-based 'Health and Usage Monitoring Systems.' The former has been investigated in this paper. The preventive maintenance model developed for the train operator will subsequently be extended to tackle the unscheduled maintenance problem also affecting the aerospace industry. The extension of the model to the aerospace sector will be dealt with in the second part of the research and would, in turn, validate the soundness of the model developed. Thus, there are distinct areas addressed in this paper, including the mathematical modelling of preventive maintenance and optimization based on cost and system availability. The results of this research will help an organization choose the right maintenance strategy, allowing it to save considerable sums of money as opposed to overspending under the guise of maintaining high asset availability. The concept of delay time modelling was used to address the practical problem of unscheduled maintenance; delay time modelling can be used to help with support planning for a given asset. The model was run using MATLAB, and the results show that the ideal inspection interval computed with the extended model was 29 days from a minimal-cost perspective and 14 days from a minimum-downtime perspective. A risk matrix was constructed to represent the risk in terms of the probability of a fault leading to breakdown maintenance and its consequences in terms of maintenance cost. The choice of an optimal inspection interval of 29 days resulted in a cost of approximately 50 Euros, and the corresponding value of b(T) was 0.011. These values ensure that the risk associated with component X being maintained at an inspection interval of 29 days is more than acceptable. Thus, a switch in maintenance frequency from 90 days to 29 days would be optimal from the point of view of cost, downtime, and risk.
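
A basic delay-time inspection model can be sketched as below: expected cost per unit time as a function of the inspection interval T, assuming Poisson defect arrivals and exponential delay times. The parameter values are illustrative, not the study's calibrated inputs:

```python
# Minimal sketch of a basic delay-time model: b(T) is the probability that a
# defect arising in (0, T) becomes a breakdown before the next inspection.
import numpy as np

lam, mean_delay = 0.05, 10.0          # defects/day, mean delay time (days)
c_inspect, c_repair, c_breakdown = 50.0, 100.0, 1000.0

def b(T):
    # For exponential delay times: b(T) = 1 - (1 - exp(-T/m)) * m / T
    return 1 - (1 - np.exp(-T / mean_delay)) * mean_delay / T

T = np.arange(1.0, 120.0)
cost_rate = (c_inspect + lam * T * (b(T) * c_breakdown
                                    + (1 - b(T)) * c_repair)) / T
print("optimal interval (days):", T[cost_rate.argmin()])
```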

Keywords: delay time modelling, unscheduled maintenance, reliability, maintainability, availability

Procedia PDF Downloads 132
1984 LWD Acquisition of Caliper and Drilling Mechanics in a Geothermal Well: A Case Study in Sorik Marapi Field, Indonesia

Authors: Vinda B. Manurung, Laila Warkhaida, David Hutabarat, Sentanu Wisnuwardhana, Christovik Simatupang, Dhani Sanjaya, Ashadi, Redha B. Putra, Kiki Yustendi

Abstract:

The geothermal drilling environment presents many obstacles that have limited the use of directional drilling and logging-while-drilling (LWD) technologies, such as borehole washout, mud losses, severe vibration, and high temperature. The case study presented in this paper demonstrates a practice for enhancing data logging in geothermal drilling by deploying advanced telemetry and LWD technologies, aiming at continuous improvement in geothermal drilling operations. The case study covers a 12.25-in. hole section of well XX-05 in Pad XX of the Sorik Marapi Geothermal Field. The LWD string consists of electromagnetic (EM) telemetry, pressure while drilling (PWD), vibration (DDSr), and acoustic caliper (ACAL) tools. Through this tool configuration, the operator acquired drilling mechanics and caliper logs in real-time and recorded modes, enabling effective monitoring of wellbore stability. Throughout the real-time acquisition, EM-PPM telemetry provided a three-times-faster data rate to the surface unit. With the integration of caliper data and drilling mechanics data (vibration and equivalent circulating density, ECD), the borehole conditions were more visible to the directional driller, allowing for better control of drilling parameters to minimize vibration and achieve optimum hole cleaning in washed-out or tight formation sequences. After reaching well TD, the recorded data from the caliper sensor indicated an average of 8.6% washout for the entire 12.25-in. interval. Washout intervals were compared with loss occurrences, showing the potential for the caliper to be used as an indirect indicator of fractured intervals and validating the fault trend prognosis. This LWD case study has added value to geothermal borehole characterization for both drilling operations and the subsurface. Challenges identified while running LWD in this geothermal environment, such as the effect of tool eccentricity and the impact of vibration, need to be addressed for future improvements. A perusal of both real-time and recorded drilling mechanics and caliper data has opened various possibilities for maximizing sensor usage in future wells.

Keywords: geothermal drilling, geothermal formation, geothermal technologies, logging-while-drilling, vibration, caliper, case study

Procedia PDF Downloads 131
1983 Assessing Musculoskeletal Disorder Prevalence and Heat-Related Symptoms: A Cross-sectional Comparison in Indian Farmers

Authors: Makkhan Lal Meena, R. C. Bairwa, G. S. Dangayach, Rahul Jain

Abstract:

The current study examined the frequency of chronic illness conditions, accidents, health complaints, and ergonomic issues among 100 conventional and 100 greenhouse farmers. Data related to health symptoms and ergonomic problems were collected through questionnaires administered in direct interviews with the farmers. According to the findings, symptoms of heat exposure (skin rashes, headache, dizziness, and lack of appetite) were substantially more frequent among conventional farmers than greenhouse farmers. The greenhouse farmers reported much more pain, numbness, or weakness in the wrists/hands, fingers, upper back, hips, and ankles/feet than conventional farmers. The findings suggest that suitable ergonomic knowledge and awareness campaigns concentrating on safety at work, particularly low back pain, should be implemented in workplaces to allow earlier detection of symptoms among greenhouse farmers.

Keywords: accident, conventional farmer, ergonomics, health symptoms, greenhouse farmers, pesticide

Procedia PDF Downloads 271
1982 Crystallography Trials of Escherichia coli Nitrate Transporter, NarU

Authors: Naureen Akhtar

Abstract:

The stability of a protein in detergent-containing solution is key to its successful crystallisation. Fluorescence-detection size-exclusion chromatography (FSEC) is a potential approach for screening the monodispersity as well as the stability of a protein in a detergent-containing solution. In the present study, Green Fluorescent Protein (GFP) covalently linked to the bacterial nitrate transporter NarU from Escherichia coli was studied in pre-crystallisation trials by FSEC. Immobilised metal ion affinity chromatography (IMAC) and gel filtration were employed for purification. The main objectives of this study were the over-expression, detergent screening, and crystallisation of nitrate transporter proteins. This study could not produce enough protein to be realistically taken forward to achieve the objectives set for this particular research. In future work, different combinations of variables, such as vectors, tags, the creation of mutant proteins, host cells, the position of GFP (N- or C-terminal), and/or membrane proteins, would be tried to determine the best combination, as the principle of the technique is still promising.

Keywords: transporters, detergents, over-expression, crystallography

Procedia PDF Downloads 477