Search results for: array signal processing
2608 Effect of Hot Equal Channel Angular Pressing Process on Mechanical Properties of Commercial Pure Titanium
Authors: Seyed Ata Khalkhkali Sharifi, Gholamhossein Majzoubi, Farhad Abroush
Abstract:
Improving the mechanical properties of pure titanium by means of the ECAP process is reviewed in this paper. First, the experimental samples were prepared as specified in the standards. Then, pure grade 2 Ti was processed via equal-channel angular pressing (ECAP) for 2 passes following route A at 400°C. After processing, the microstructural evolution, tensile, fatigue, and hardness properties and the wear behavior were investigated. Finally, the effect of the ECAP process on these samples was analyzed. The results showed an improvement in strength values with a slight decrease in ductility. The analysis of 30 points within the sample showed a hardness increase with each pass. It was also concluded that the fatigue properties improved.
Keywords: equal-channel angular pressing, titanium, mechanical behavior, engineering materials and applications
Procedia PDF Downloads 258
2607 CookIT: A Web Portal for the Preservation and Dissemination of Traditional Italian Recipes
Authors: M. T. Artese, G. Ciocca, I. Gagliardi
Abstract:
Food is a social and cultural aspect of every individual's life. Food products, processing, and traditions have been identified as cultural objects carrying the history and identity of social groups. Traditional recipes are passed down from one generation to the next, often strengthening the link with the territory. The paper presents CookIT, a web portal developed to collect Italian traditional recipes related to regional cuisine, with the purpose of disseminating knowledge of typical Italian recipes and the Mediterranean diet, which is a significant part of Italian cuisine. The system is completed with multimodal means of browsing and data retrieval. Stored recipes can be retrieved by integrating and combining a number of different methods and keys, while the results are displayed using classical styles, such as list and mosaic, as well as maps and graphs, which users can explore through the available interaction keys.
Keywords: collaborative portal, Italian cuisine, intangible cultural heritage, traditional recipes, searching and browsing
Procedia PDF Downloads 149
2606 The Effect of Precipitation on Weed Infestation of Spring Barley under Different Tillage Conditions
Authors: J. Winkler, S. Chovancová
Abstract:
The article deals with the relation between rainfall in selected months and subsequent weed infestation of spring barley. The field experiment was performed at the Mendel University agricultural enterprise in Žabčice, Czech Republic. Weed infestation was measured in spring barley stands from 2004 to 2012. Barley was grown under three tillage variants: conventional tillage (CT), minimum tillage (MT), and no tillage (NT). Precipitation was recorded at one-day intervals, and monthly precipitation was calculated from the measured values for the months of October through April. The technique of canonical correspondence analysis was applied for further statistical processing. In total, 41 different weed species were found over the 9-year monitoring period. The results clearly show that precipitation affects the incidence of most weed species in the selected months, but acts differently under the monitored tillage variants.
Keywords: weeds, precipitation, tillage, weed infestation forecast
Procedia PDF Downloads 498
2605 Rule-Based Expert System for Headache Diagnosis and Medication Recommendation
Authors: Noura Al-Ajmi, Mohammed A. Almulla
Abstract:
With the increased utilization of technology devices around the world, healthcare and medical diagnosis have become critical concerns. Doctors do their best to avoid medical errors when diagnosing diseases or prescribing medication. Consequently, artificial intelligence applications that can be installed on mobile devices, such as rule-based expert systems, facilitate the task of assisting doctors in several ways. Owing to their many advantages, the usage of expert systems has increased recently in the health sciences. This work presents a backward-chaining rule-based expert system for headache diagnosis and medication recommendation. The structure of the system consists of three main modules: the input unit, the processing unit, and the output unit.
Keywords: headache diagnosis system, prescription recommender system, expert system, backward rule-based system
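Backward chaining works from a goal (a diagnosis or a recommendation) down to the facts that would establish it. A minimal sketch of the idea; the rules, symptoms, and medication names below are illustrative assumptions, not taken from the paper:

```python
# Minimal backward-chaining sketch of a rule-based diagnosis engine.
# Rule names, symptoms, and the triptan recommendation are hypothetical.

RULES = {
    # conclusion: list of alternative premise sets (any one set suffices)
    "migraine": [["unilateral_pain", "nausea"], ["unilateral_pain", "light_sensitivity"]],
    "tension_headache": [["bilateral_pain", "muscle_tightness"]],
    "recommend_triptan": [["migraine"]],
}

def prove(goal, facts):
    """Backward chaining: a goal holds if it is a known fact, or if every
    premise of one of its rules can itself be proven recursively."""
    if goal in facts:
        return True
    for premises in RULES.get(goal, []):
        if all(prove(p, facts) for p in premises):
            return True
    return False

patient = {"unilateral_pain", "nausea"}
print(prove("recommend_triptan", patient))  # the chain migraine -> triptan fires
```

Note that the engine reasons from the recommendation backwards: it never enumerates all conclusions, only the sub-goals needed for the one asked about.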
Procedia PDF Downloads 215
2604 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data
Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah
Abstract:
The National Statistical Institutes hold a large volume of data, generally in a format that conditions how the information it contains is published. Each household or business data collection project includes its own dissemination platform. Thus, the dissemination methods previously used do not promote rapid access to information and, in particular, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and to offer the option of linking with other external data sources. The approach is applied to data from major national surveys, such as those on employment, poverty, and child labor, and the general census of the population of Senegal.
Keywords: Semantic Web, linked open data, database, statistics
Procedia PDF Downloads 175
2603 Offset Dependent Uniform Delay Mathematical Optimization Model for Signalized Traffic Network Using Differential Evolution Algorithm
Authors: Tahseen Saad, Halim Ceylan, Jonathan Weaver, Osman Nuri Çelik, Onur Gungor Sahin
Abstract:
A new offset-dependent uniform-delay mathematical optimization problem is derived as the main objective of this study and solved using a differential evolution algorithm, in order to control the coordination problem, which depends on offset selection, and to estimate uniform delay based on the offset choice in a traffic signal network. Periodic sinusoidal functions are assumed for the arrival and departure patterns. The cycle time is optimized at the entry links, and the optimized value is used in the non-entry links as a common cycle time. The offset optimization algorithm is used to calculate the uniform delay at each link. The results are illustrated using a case study and are compared with the canonical uniform delay model derived by Webster and with the Highway Capacity Manual's model. The findings show that the new model reduces the total uniform delay to almost half that of the conventional models. The mathematical objective function is robust, and the algorithm converges quickly.
Keywords: area traffic control, traffic flow, differential evolution, sinusoidal periodic function, uniform delay, offset variable
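Differential evolution maintains a population of candidate offsets and improves it through mutation, crossover, and greedy selection. A minimal one-dimensional DE/rand/1 sketch; the quadratic delay function is a stand-in for the paper's uniform-delay objective, and all constants are illustrative:

```python
import random

# Minimal differential evolution (DE/rand/1) sketch for a 1-D offset variable.
# delay() is a hypothetical stand-in objective, not the paper's delay model.

def delay(offset):
    return (offset - 30.0) ** 2 + 5.0  # assumed minimum at offset = 30 s

def differential_evolution(f, bounds, np_=20, F=0.7, CR=0.9, gens=200, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(np_)]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = a + F * (b - c)                # mutation
            if rng.random() > CR:                  # crossover (1-D: keep parent)
                trial = pop[i]
            trial = min(max(trial, lo), hi)        # clamp inside the bounds
            if f(trial) <= f(pop[i]):              # greedy selection
                pop[i] = trial
    return min(pop, key=f)

best = differential_evolution(delay, (0.0, 120.0))
print(round(best, 2))
```

The greedy selection step makes each individual's objective value non-increasing, so the population settles on the minimizing offset.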
Procedia PDF Downloads 275
2602 Technology Computer Aided Design Simulation of Space Charge Limited Conduction in Polycrystalline Thin Films
Authors: Kunj Parikh, S. Bhattacharya, V. Natarajan
Abstract:
TCAD numerical simulation is one of the most tried and tested tools for designing devices in semiconductor foundries worldwide. It has also been used to explain conduction in organic thin films, where the processing temperature is often sufficient to make homogeneous samples (often imperfect, but homogeneously imperfect). In this report, we present the results of TCAD simulation of multi-grain thin films. The work addresses inhomogeneity in one dimension but can easily be extended to two and three dimensions. The effect of grain boundaries has mainly been approximated as barriers located at the junctions between adjacent grains. The effects of the grain boundary barrier height, the bulk traps, and the measurement temperature have been investigated.
Keywords: polycrystalline thin films, space charge limited conduction, Technology Computer-Aided Design (TCAD) simulation, traps
Procedia PDF Downloads 214
2601 Processing of Flexible Dielectric Nanocomposites Using Nanocellulose and Recycled Alum Sludge for Wearable Technology Applications
Authors: D. Sun, L. Saw, A. Onyianta, D. O’Rourke, Z. Lu, C. See, C. Wilson, C. Popescu, M. Dorris
Abstract:
With the rapid development of wearable technology (e.g., smartwatches, activity trackers and health monitoring devices), flexible dielectric materials with environmentally friendly, low-cost and energy-efficient characteristics are in increasing demand. In this work, a flexible dielectric nanocomposite was processed by incorporating two components, cellulose nanofibrils and alum sludge, in a polymer matrix. The two components act as the reinforcement phase and also enhance the dielectric properties; both were processed from waste materials that would otherwise be disposed of in landfills. Alum sludge is a by-product of the water treatment process, in which aluminum sulfate is prevalently used as the primary coagulant. According to data from a project partner, Scottish Water, approximately 10,000 tons of alum sludge are generated every year in Scotland as waste from water treatment works and sent to landfill. The industry has been facing escalating financial and environmental pressure to develop more sustainable strategies for dealing with alum sludge waste. In the available literature, some work on reusing alum sludge has been reported (e.g., aluminum recovery or agriculture and land reclamation). However, little work can be found on applying it to the processing of energy materials (e.g., dielectrics) for enhanced energy density and efficiency. The alum sludge was collected directly from a water treatment plant of Scottish Water and was heat-treated and refined before being used in preparing composites. Cellulose nanofibrils were derived from water hyacinth, an invasive aquatic weed that causes significant ecological problems in tropical regions. The harvested water hyacinth was dried and processed using a cost-effective method, a chemical extraction followed by homogenization, in order to extract the cellulose nanofibrils.
Biodegradable elastomer polydimethylsiloxane (PDMS) was used as the polymer matrix, and the nanocomposites were processed by casting the raw materials in Petri dishes. The processed composites were characterized using various methods, including scanning electron microscopy (SEM), rheological analysis, thermogravimetric analysis and X-ray diffraction (XRD). The SEM results showed that cellulose nanofibrils of approximately 20 nm in diameter and 100 nm in length were obtained, and that the alum sludge particles were approximately 200 µm in diameter. The TGA/DSC analysis showed a weight loss of up to 48% in the raw alum sludge and indicated that its crystallization starts at approximately 800°C, an observation that coincides with the XRD results. Other experiments also showed that the composites exhibit good overall mechanical and dielectric performance. This work shows that reusing such waste materials to prepare flexible, lightweight and miniature dielectric materials for wearable technology applications is a sustainable practice.
Keywords: cellulose, biodegradable, sustainable, alum sludge, nanocomposite, wearable technology, dielectric
Procedia PDF Downloads 85
2600 Comparison of Different Extraction Methods for the Determination of Polyphenols
Authors: Senem Suna
Abstract:
The extraction of bioactive compounds from foods and food products is an important topic and a new trend related to health-promoting effects. As a result of the increasing interest in natural foods, different methods are used for the acquisition of these components, especially polyphenols. However, special attention has to be paid to the selection of the proper techniques or processing technologies (supercritical fluid extraction, microwave-assisted extraction, ultrasound-assisted extraction, powdered extract production) for each kind of food, in order to maximize both the yield of phenolic compounds and the benefit obtained from them. To meet consumers' demand for healthy food and to manage quality and safety requirements, advanced research and development are needed. In this review, the advantages and disadvantages of different extraction methods, their potential uses in the food industry, and the effects of polyphenols are discussed in detail. Consequently, by evaluating the results of several studies, the selection of the most suitable method for each specific food was aimed at.
Keywords: bioactives, extraction, powdered extracts, supercritical fluid extraction
Procedia PDF Downloads 239
2599 Influence of High-Resolution Satellites Attitude Parameters on Image Quality
Authors: Walid Wahballah, Taher Bazan, Fawzy Eltohamy
Abstract:
One of the important functions of the satellite attitude control system is to provide the pointing accuracy and attitude stability required for optical remote sensing satellites to achieve good image quality. Although offering noise reduction and increased sensitivity, the time delay and integration (TDI) charge-coupled devices (CCDs) utilized in high-resolution satellites (HRS) are prone to introducing large amounts of pixel smear due to instability of the line of sight. During on-orbit imaging, as a result of the Earth's rotation and satellite platform instability, the moving direction of the TDI-CCD linear array and the imaging direction of the camera diverge. The speed of the image moving on the image (focal) plane is the image motion velocity, whereas the angle between the two directions is known as the drift angle (β). The drift angle arises from the rotation of the Earth around its axis during satellite imaging; it affects the geometric accuracy and, consequently, causes image quality degradation. Therefore, the image motion velocity vector and the drift angle are two important factors used in assessing the image quality of TDI-CCD-based optical remote sensing satellites. A model for estimating the image motion velocity and the drift angle in HRS is derived. The six satellite attitude control parameters represented in the derived model are the roll angle φ, pitch angle θ, yaw angle ψ, roll angular velocity φ̇, pitch angular velocity θ̇ and yaw angular velocity ψ̇. The influence of these attitude parameters on image quality is analyzed by establishing a relationship between the image motion velocity vector, the drift angle and the six satellite attitude parameters. The influence of the satellite attitude parameters on image quality is assessed by the presented model in terms of the modulation transfer function (MTF) in both the cross- and along-track directions.
Three different cases representing the effect of pointing accuracy (φ, θ, ψ) bias are considered using four different sets of typical pointing accuracy values, while the satellite attitude stability parameters are kept ideal. In the same manner, the influence of satellite attitude stability (φ̇, θ̇, ψ̇) on image quality is analyzed for ideal pointing accuracy parameters. The results reveal that cross-track image quality is seriously influenced by the yaw angle bias and the roll angular velocity bias, while along-track image quality is influenced only by the pitch angular velocity bias.
Keywords: high-resolution satellites, pointing accuracy, attitude stability, TDI-CCD, smear, MTF
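The link between smear and MTF can be illustrated with the classic result that a linear smear of d pixels multiplies the MTF by |sin(πfd)/(πfd)|. A hedged sketch only: the first-order drift-angle geometry used here (cross-track smear ≈ N·tan β over N TDI stages) is a common textbook approximation, not the paper's full six-parameter model:

```python
import math

# MTF degradation due to a linear image smear of d pixels:
# MTF(f) = |sin(pi * f * d) / (pi * f * d)|, the standard sinc factor.

def smear_mtf(f, d):
    x = math.pi * f * d
    return 1.0 if x == 0 else abs(math.sin(x) / x)

# First-order cross-track smear produced by a drift angle beta accumulated
# over N TDI stages (illustrative approximation, in pixels).
def cross_track_smear(n_tdi, beta_rad):
    return n_tdi * math.tan(beta_rad)

d = cross_track_smear(32, math.radians(0.2))   # 32 TDI stages, 0.2 deg drift
print(round(d, 3), round(smear_mtf(0.5, d), 3))  # smear and MTF at Nyquist
```

Even a fraction-of-a-pixel smear measurably lowers the MTF at the Nyquist frequency, which is why drift-angle compensation matters for TDI imaging.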
Procedia PDF Downloads 402
2598 Fluorescence in situ Hybridization (FISH) Detection of Bacteria and Archaea in Fecal Samples
Authors: Maria Nejjari, Michel Cloutier, Guylaine Talbot, Martin Lanthier
Abstract:
Fluorescence in situ hybridization (FISH) is a staining technique that allows the identification, detection and quantification of microorganisms without prior cultivation by means of epifluorescence and confocal laser scanning microscopy (CLSM). Oligonucleotide probes have been used to detect bacteria and archaea that colonize the cattle and swine digestive systems. These bacterial strains were obtained from fecal samples issued from cattle manure and swine slurry. The samples were collected at three different pit locations, A, B and C, at the same height, and at two collection depths: one just under the pit's surface and the second at the bottom of the pit. Cells were fixed, and FISH was performed using oligonucleotides of 15 to 25 nucleotides in length labeled with a fluorescent molecule, Cy3 or Cy5. Double hybridization using a Cy3 probe targeting bacteria (Cy3-EUB338-I) along with a Cy5 probe targeting archaea (Cy5-ARCH915) gave a better signal. The CLSM images show that there are more bacteria than archaea in swine slurry. However, the choice of fluorescent probes is critical for achieving double hybridization and a unique signature for each microorganism. The FISH technique is an easy way to detect pathogens like E. coli O157, Listeria and Salmonella that easily contaminate water streams and agricultural soils and, consequently, food products, endangering human health.
Keywords: archaea, bacteria, detection, FISH, fluorescence
Procedia PDF Downloads 387
2597 Mathematical Modelling of Bacterial Growth in Products of Animal Origin in Storage and Transport: Effects of Temperature, Use of Bacteriocins and pH Level
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cordova
Abstract:
Pathogen growth in animal-source foods is a common problem in the food industry, causing monetary losses due to the spoiling of products or food intoxication outbreaks in the community. The quality of a product is reflected by the population of deteriorating agents present in it, which are mainly bacteria. The factors most likely associated with freshness in animal-source foods are temperature and the processing, storage and transport times. The level of deterioration of products also depends on the characteristics of the bacterial population causing the decomposition or spoiling, such as pH level and toxins. Knowing the growth dynamics of the agents involved in product contamination allows monitoring for more efficient processing; this means better quality and reasonable costs, along with a better estimation of the time and temperature intervals needed for transport and storage in order to preserve product quality. The objective of this project is to design a secondary model that measures the impact of temperature on bacterial growth and the competition for pH adequacy and release of bacteriocins, in order to describe this phenomenon and thus estimate food product half-life with the least possible risk of deterioration or spoiling. To achieve this objective, the authors propose the analysis of a three-dimensional system of ordinary differential equations which includes: logistic bacterial growth, extended by the inhibitory action of bacteriocins and including the effect of the medium pH; the change in medium pH levels, through an adaptation of the Luedeking-Piret kinetic model; and the bacteriocin concentration, modeled similarly to the pH levels. All three dimensions are influenced by the temperature at all times.
This differential system is then expanded, taking into consideration variable temperature and the concentration of pulsed bacteriocins, which represent characteristics inherent to the modeled scenario, such as transport and storage, as well as the incorporation of substances that inhibit bacterial growth. The main results show that temperature changes in an early stage of transport increase the bacterial population significantly more than the same changes during the final stage. On the other hand, the incorporation of bacteriocins, as in other investigations, proved to be efficient in the short and medium term: although the bacterial population decreased, once the bacteriocins were depleted or degraded over time, the bacteria eventually returned to their regular growth rate. The efficacy of the bacteriocins decreased slightly at low temperatures, which is consistent with the fact that their natural degradation rate also decreased. In summary, the implementation of the mathematical model allowed the simulation of a set of possible bacteria present in animal-based products, along with their properties, in various transport and storage situations, which leads us to state that the optimum for inhibiting bacterial growth is the combination of constant low temperatures and the initial use of bacteriocins.
Keywords: bacterial growth, bacteriocins, mathematical modelling, temperature
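The qualitative behavior described above (an initial dip under bacteriocin pressure, then regrowth as the bacteriocin degrades) can be reproduced with a minimal forward-Euler sketch of a logistic growth equation with a decaying inhibition term. All coefficients are illustrative placeholders, not the paper's fitted parameters, and the pH dimension is omitted for brevity:

```python
# Forward-Euler sketch: logistic bacterial growth with a first-order
# decaying bacteriocin inhibition term. All rates are hypothetical.

def simulate(hours=24, dt=0.01, n0=1e3, k=1e9, r=0.4, kill=0.8, b0=1.0, decay=0.15):
    n, b = n0, b0                                  # population, bacteriocin level
    for _ in range(int(hours / dt)):
        dn = r * n * (1 - n / k) - kill * b * n    # logistic growth minus inhibition
        db = -decay * b                            # bacteriocin degrades over time
        n, b = max(n + dn * dt, 0.0), b + db * dt
    return n

with_bacteriocin = simulate(b0=1.0)
without = simulate(b0=0.0)
print(with_bacteriocin < without)  # inhibition lowers the 24 h population
```

With these placeholder rates the net growth rate is negative until the bacteriocin decays below r/kill, after which the population resumes its regular growth, matching the short-to-medium-term efficacy described in the abstract.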
Procedia PDF Downloads 135
2596 Purification and Pre-Crystallization of Recombinant PhoR Cytoplasmic Domain Protein from Mycobacterium Tuberculosis H37Rv
Authors: Oktira Roka Aji, Maelita R. Moeis, Ihsanawati, Ernawati A. Giri-Rachman
Abstract:
Globally, tuberculosis (TB) remains a leading cause of death. The emergence of multidrug-resistant and extensively drug-resistant strains has become a major public health concern. One of the potential candidates for a drug target is the cytoplasmic domain of the PhoR histidine kinase, part of the two-component system (TCS) PhoR-PhoP in Mycobacterium tuberculosis (Mtb). The TCS PhoR-PhoP relays extracellular signals to control the expression of 114 virulence-associated genes in Mtb. The 3D structure of the PhoR cytoplasmic domain is needed to screen novel drugs using structure-based drug discovery. The PhoR cytoplasmic domain from Mtb H37Rv was overexpressed in E. coli BL21(DE3) and then purified using an IMAC Ni-NTA agarose his-tag affinity column and DEAE ion-exchange column chromatography. The molecular weight of the purified protein was estimated to be 37 kDa by SDS-PAGE analysis. This sample was used for pre-crystallization screening, applying the sitting-drop vapor-diffusion method with the Natrix (HR2-116) 48-solution crystal screen kit at 25ºC. Needle-like crystals were observed after the seventh day of incubation in test solution No. 47 (0.1 M KCl, 0.01 M MgCl2.6H2O, 0.05 M Tris-Cl pH 8.5, 30% v/v PEG 4000). Further testing is required to confirm the crystals.
Keywords: tuberculosis, two component system, histidine kinase, needle-like crystals
Procedia PDF Downloads 432
2595 An Automatic Speech Recognition of Conversational Telephone Speech in Malay Language
Authors: M. Draman, S. Z. Muhamad Yassin, M. S. Alias, Z. Lambak, M. I. Zulkifli, S. N. Padhi, K. N. Baharim, F. Maskuriy, A. I. A. Rahim
Abstract:
The performance of a Malay automatic speech recognition (ASR) system for the call centre environment is presented. The system utilizes the Kaldi toolkit as the platform for the entire library and the algorithms used in performing the ASR task. The acoustic model implemented in this system uses a deep neural network (DNN) to model the acoustic signal, and a standard n-gram model is used for language modelling. With 80 hours of training data from call centre recordings, the ASR system achieves 72% accuracy, corresponding to a 28% word error rate (WER). The testing was done using 20 hours of audio data. Despite the implementation of the DNN, the system shows low accuracy owing to the variety of noises, accents and dialects that typically occur in the Malaysian call centre environment. This significant speaker variation is reflected by the large standard deviation of the average word error rate (WERav) (i.e., ~10%). The lowest WER (13.8%) was obtained from a recording sample with a standard Malay dialect (central Malaysia) spoken by a native speaker, as compared to the sample with the highest WER (49%), which contains conversation by a speaker using a non-standard Malay dialect.
Keywords: conversational speech recognition, deep neural network, Malay language, speech recognition
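The WER figures quoted above follow the standard definition: the word-level Levenshtein distance (substitutions + deletions + insertions) divided by the reference length. A minimal sketch, with an illustrative Malay phrase pair as input:

```python
# Standard word error rate: word-level edit distance / reference word count.

def wer(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return d[-1][-1] / len(ref)

# One substituted word out of five reference words -> WER = 0.2
print(wer("saya nak tanya pasal bil", "saya nak tanya pasal bill"))  # 0.2
```

Note that WER can exceed 1.0 when the hypothesis contains many insertions, which is why accuracy is usually reported as 1 − WER only informally.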
Procedia PDF Downloads 322
2594 Auditory Function in MP3 Users and Association with Hidden Hearing Loss
Authors: Nana Saralidze, Nino Sharashenidze, Zurab Kevanishvili
Abstract:
Hidden hearing loss may occur in humans exposed to prolonged high-level sound. It is the loss of the ability to hear in high-level background noise while having normal hearing in quiet. We compared the hearing of people who regularly listen to personal music players for 3 hours or more with that of people who do not. Forty participants aged 18-30 years were divided into two groups: regular users of music players and people who had never used them. A third group of 15 participants comprised elders aged 50-55 years. Pure-tone audiometry (125-16000 Hz), auditory brainstem response (ABR) (70 dB SPL) and the ability to identify speech in noise (4-talker babble with a 65-dB signal-to-noise ratio at 80 dB) were measured in all participants. All participants had normal pure-tone audiometry (all thresholds < 25 dB HL). A significant difference between groups was observed: regular users of personal audio systems correctly identified 53% of words, whereas non-users identified 74% and the elder group 63%. This contributes evidence supporting the presence of hidden hearing loss in humans and demonstrates that speech-in-noise audiometry is an effective method that can be considered the gold standard for detecting hidden hearing loss.
Keywords: mp3 player, hidden hearing loss, speech audiometry, pure tone audiometry
Procedia PDF Downloads 74
2593 Contrast Enhancement of Masses in Mammograms Using Multiscale Morphology
Authors: Amit Kamra, V. K. Jain, Pragya
Abstract:
Mammography is a widely used technique for breast cancer screening. There are various other techniques for breast cancer screening, but mammography is the most reliable and effective one. The images obtained through mammography are of low contrast, which makes them difficult for radiologists to interpret. Hence, a high-quality image is mandatory for extracting any kind of information from it. Many contrast enhancement algorithms have been developed over the years. In the present work, an efficient morphology-based technique is proposed for contrast enhancement of masses in mammographic images. The proposed method is based on multiscale morphology and takes into consideration the scale of the structuring element. The proposed method is compared with other state-of-the-art techniques. The experimental results show that the proposed method is better, both qualitatively and quantitatively, than other standard contrast enhancement techniques.
Keywords: enhancement, mammography, multi-scale, mathematical morphology
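A common multiscale morphological enhancement scheme, which the paper's method belongs to in spirit, adds bright (white) top-hats and subtracts dark (black) top-hats computed at several structuring-element scales. A minimal 1-D sketch under that assumption; the scales, weights, and combination rule here are illustrative, not the paper's:

```python
# Multiscale morphological contrast enhancement on a 1-D signal using flat
# structuring elements of half-width s: out = signal + sum(white top-hats)
#                                              - sum(black top-hats).

def erode(sig, half):
    return [min(sig[max(0, i - half):i + half + 1]) for i in range(len(sig))]

def dilate(sig, half):
    return [max(sig[max(0, i - half):i + half + 1]) for i in range(len(sig))]

def opening(sig, half):
    return dilate(erode(sig, half), half)

def closing(sig, half):
    return erode(dilate(sig, half), half)

def enhance(sig, scales=(1, 2)):
    out = list(sig)
    for s in scales:
        white = [a - b for a, b in zip(sig, opening(sig, s))]   # bright details
        black = [a - b for a, b in zip(closing(sig, s), sig)]   # dark details
        out = [o + w - b for o, w, b in zip(out, white, black)]
    return out

row = [10, 10, 12, 30, 12, 10, 10]   # a faint bright "mass" on a flat background
print(enhance(row))                  # → [8, 8, 14, 68, 14, 8, 8]
```

The bright peak is amplified far more than the background, which is the desired contrast stretch around mass-like structures; a 2-D version simply replaces the 1-D windows with square or disk neighborhoods.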
Procedia PDF Downloads 426
2592 Searching Linguistic Synonyms through Parts of Speech Tagging
Authors: Faiza Hussain, Usman Qamar
Abstract:
Synonym-based searching is recognized to be a complicated problem, as text mining from unstructured web data is challenging. Finding useful information that matches the user's need among the bulk of web pages is a cumbersome task. In this paper, a novel and practical synonym retrieval technique is proposed to address this problem. For the replacement of semantics, user intent is taken into consideration. Parts-of-speech tagging is applied for pattern generation of the query, and a thesaurus was formed and used for this experiment. In comparison with non-context-based searching, context-based searching proved to be a more efficient approach when dealing with linguistic semantics. This approach is very beneficial for intent-based searching. Finally, results and future directions are presented.
Keywords: natural language processing, text mining, information retrieval, parts-of-speech tagging, grammar, semantics
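The core idea, keying synonym lookup on the (word, POS tag) pair so that ambiguous words expand differently per role, can be sketched with a toy tagger and mini-thesaurus. Both resources below are illustrative stand-ins for the paper's tagger and thesaurus:

```python
# Toy POS-aware query expansion: synonyms are keyed by (word, tag), so
# "book/VERB" and "book/NOUN" would resolve to different replacement sets.
# TAGS and THESAURUS are hypothetical stand-ins for real NLP resources.

TAGS = {"book": "NOUN", "cheap": "ADJ", "flights": "NOUN", "find": "VERB"}
THESAURUS = {
    ("book", "VERB"): ["reserve"],
    ("book", "NOUN"): ["volume"],
    ("cheap", "ADJ"): ["inexpensive", "budget"],
}

def expand_query(words):
    expanded = []
    for w in words:
        tag = TAGS.get(w, "NOUN")                 # naive dictionary "tagger"
        expanded.append([w] + THESAURUS.get((w, tag), []))
    return expanded

print(expand_query(["find", "cheap", "flights"]))
# → [['find'], ['cheap', 'inexpensive', 'budget'], ['flights']]
```

A real system would replace the dictionary tagger with a context-sensitive one, which is precisely what makes the context-based variant outperform non-context-based lookup.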
Procedia PDF Downloads 307
2591 Mathematical Modeling of Carotenoids and Polyphenols Content of Faba Beans (Vicia faba L.) during Microwave Treatments
Authors: Ridha Fethi Mechlouch, Ahlem Ayadi, Ammar Ben Brahim
Abstract:
Given the importance of preserving polyphenols and carotenoids during thermal processing, in this study we investigated the variation of these two parameters in faba beans during microwave treatment at different power densities (1, 2 and 3 W/g), and then performed mathematical modeling using non-linear regression analysis to evaluate the model constants. The variation of the carotenoid and polyphenol ratios of faba beans was measured, and the models were tested to validate the experimental results. Exponential models were found suitable to describe the variation of the carotenoid ratio (R² = 0.945, 0.927 and 0.946 for power densities of 1, 2 and 3 W/g, respectively) and of the polyphenol ratio (R² = 0.931, 0.989 and 0.982, respectively). The effect of the microwave power density Pd (W/g) on the coefficient k of the models was also investigated. The coefficient is highly correlated (R² = 1) and can be expressed as a polynomial function.
Keywords: microwave treatment, power density, carotenoid, polyphenol, modeling
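An exponential retention model of the form C/C₀ = exp(−kt) can be fitted by simple linear least squares on the log-transformed ratios. A minimal sketch; the time points and the rate constant k = 0.12 below are synthetic placeholders, not the paper's measurements:

```python
import math

# Fit C/C0 = exp(-k * t) by linear regression on ln(ratio) vs. t:
# ln(C/C0) = -k * t, so k = -slope of the fitted line.

def fit_exponential(times, ratios):
    ys = [math.log(r) for r in ratios]
    n = len(times)
    sx, sy = sum(times), sum(ys)
    sxx = sum(t * t for t in times)
    sxy = sum(t * y for t, y in zip(times, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return -slope                                  # the rate constant k

ts = [0, 2, 4, 6, 8]                               # hypothetical treatment times
rs = [math.exp(-0.12 * t) for t in ts]             # noiseless synthetic data
print(round(fit_exponential(ts, rs), 3))           # recovers k = 0.12
```

On noisy data this log-linear fit is only an approximation to true non-linear regression (it reweights the errors), but it gives a good starting value for an iterative solver.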
Procedia PDF Downloads 259
2590 Development of Ferrous-Aluminum Alloys from Recyclable Material by High Energy Milling
Authors: Arnold S. Freitas Neto, Rodrigo E. Coelho, Erick S. Mendonça
Abstract:
This study aimed to obtain an iron-aluminum alloy with 50 at.% of each constituent. The alloys were obtained by processing recycled aluminum and chips of 1200-series carbon steel in a high-energy mill. For the experiment, the raw materials were processed through high-energy milling before being mixed. Subsequently, the mixture of 1200-series carbon steel and aluminum powder underwent a further milling process. Thereafter, hot compression was performed in a closed die in order to obtain the samples. The pieces then underwent heat treatments, sintering and aging. Lastly, the composition and the hardness of the samples were analyzed. In this paper, the results are compared with previous studies, which used high-purity iron powder instead of carbon steel in the composition.
Keywords: Fe-Al alloys, high energy milling, metallography characterization, powder metallurgy
Procedia PDF Downloads 309
2589 Hit-Or-Miss Transform as a Tool for Similar Shape Detection
Authors: Osama Mohamed Elrajubi, Idris El-Feghi, Mohamed Abu Baker Saghayer
Abstract:
This paper describes the identification of specific shapes within binary images using the morphological hit-or-miss transform (HMT). The hit-or-miss transform is a general binary morphological operation that can be used to search for particular patterns of foreground and background pixels in an image. It is, in fact, a basic operation of binary morphology, since almost all other binary morphological operators are derived from it. The input of this method is a binary image and a structuring element (a template to be searched for in the binary image), while the output is another binary image. In this paper, a modification of the hit-or-miss transform is proposed in which the accuracy of the algorithm is adjusted according to the similarity between the template and the sought shape. The method was implemented in the C language. The algorithm was tested on several images, and the results show that this new method can be used for similar-shape detection.
Keywords: hit-or-miss operator transform, HMT, binary morphological operation, shape detection, binary images processing
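The classical exact-match transform that the proposed method relaxes can be sketched in a few lines. The corner-detecting structuring element below is illustrative; 1 means required foreground, 0 required background, and None is a don't-care position:

```python
# Minimal hit-or-miss transform on a binary image (exact-match version).

def hit_or_miss(img, se):
    h, w = len(img), len(img[0])
    sh, sw = len(se), len(se[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h - sh + 1):
        for j in range(w - sw + 1):
            ok = all(se[a][b] is None or img[i + a][j + b] == se[a][b]
                     for a in range(sh) for b in range(sw))
            if ok:
                out[i][j] = 1        # mark the top-left corner of each match
    return out

image = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
corner = [[0, 0],                    # upper-left corner pattern: background
          [0, 1]]                    # above/left, foreground at (1, 1)
print(hit_or_miss(image, corner))    # only the square's upper-left corner fires
```

The paper's similarity-tolerant variant would replace the all-or-nothing `ok` test with a match score thresholded against an adjustable accuracy parameter.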
Procedia PDF Downloads 332
2588 A Custom Convolutional Neural Network with Hue, Saturation, Value Color for Malaria Classification
Authors: Ghazala Hcini, Imen Jdey, Hela Ltifi
Abstract:
Malaria should be considered and handled as a potential medical catastrophe. One of the most challenging tasks in the field of microscopy image processing stems from differences in test design and the variability of cell classifications. In this article, we focus on applying deep learning to classify patients by identifying images of infected and uninfected cells. We performed multiple experiments, including a classification approach using the Hue, Saturation, Value (HSV) color space. HSV is used because of its superior ability to represent image brightness; finally, for classification, a convolutional neural network (CNN) architecture is created. Focused clusters were used to deliver the classification, the features were made robust, and several additional noise types were included in the data. The suggested method has a precision of 99.79%, a recall of 99.55%, and an accuracy of 99.96%.
Keywords: deep learning, convolutional neural network, image classification, color transformation, HSV color, malaria diagnosis, malaria cells images
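The HSV preprocessing step can be sketched with the standard library's `colorsys` conversion, applied per pixel before the images are fed to a classifier. The sample pixel values are illustrative:

```python
import colorsys

# Convert RGB pixels in [0, 255] to HSV triples in [0, 1], the color space
# used as CNN input in the approach described above.

def rgb_image_to_hsv(pixels):
    """pixels: list of (r, g, b) tuples in 0..255; returns (h, s, v) tuples."""
    return [colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            for (r, g, b) in pixels]

# A purplish stained-parasite pixel vs. a pale background pixel (made-up values)
hsv = rgb_image_to_hsv([(128, 0, 128), (230, 230, 230)])
print([tuple(round(c, 2) for c in p) for p in hsv])
```

Separating value (brightness) from hue and saturation makes the stained-parasite color signature less sensitive to illumination differences between slides, which is the stated motivation for choosing HSV.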
Procedia PDF Downloads 88
2587 Human Posture Estimation Based on Multiple Viewpoints
Authors: Jiahe Liu, Hongyang Yu, Feng Qian, Miao Luo
Abstract:
This study aimed to address the problem of improving the confidence of key points by fusing multi-view information, thereby estimating human posture more accurately. We first obtained multi-view image information and then used the MvP algorithm to fuse this multi-view information to obtain a set of high-confidence human key points. We used these as the input for the Spatio-Temporal Graph Convolutional Network (ST-GCN). ST-GCN is a deep learning model for processing spatio-temporal data that can effectively capture spatio-temporal relationships in video sequences. By using the MvP algorithm to fuse multi-view information and inputting it into the spatio-temporal graph convolution model, this study provides an effective method to improve the accuracy of human posture estimation and provides strong support for further research and application in related fields.
Keywords: multi-view, pose estimation, ST-GCN, joint fusion
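MvP itself is a learned multi-view pose model; as a shape-level illustration of the fusion idea only, a confidence-weighted combination of per-view keypoint estimates might look like this (all names hypothetical):

```python
import numpy as np

def fuse_keypoints(views, confidences):
    """Fuse per-view 3D keypoint estimates into one high-confidence set.
    views: (V, K, 3) keypoint coordinates from V camera views.
    confidences: (V, K) per-view, per-joint confidence scores.
    Returns (K, 3) confidence-weighted keypoints and (K,) fused scores."""
    w = confidences / confidences.sum(axis=0, keepdims=True)  # normalise over views
    fused = np.einsum('vk,vkc->kc', w, views)
    return fused, confidences.max(axis=0)
```

The fused coordinates lean toward the views that are most confident about each joint, which is the intuition behind feeding fused keypoints to ST-GCN.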
Procedia PDF Downloads 70
2586 Optimization of Cutting Parameters on Delamination Using Taguchi Method during Drilling of GFRP Composites
Authors: Vimanyu Chadha, Ranganath M. Singari
Abstract:
Drilling composite materials is a frequently practiced machining process during assembly in various industries such as automotive and aerospace. However, drilling of glass fiber reinforced plastic (GFRP) composites is significantly affected by the damage tendency of these materials under cutting forces such as thrust force and torque. The aim of this paper is to investigate the influence of cutting parameters such as cutting speed and feed rate, and also the influence of the number of layers, on the delamination produced while drilling a GFRP composite. A plan of experiments based on the Taguchi technique was instituted, considering drilling with prefixed cutting parameters in a hand lay-up GFRP material. The damage induced by drilling the GFRP composites was measured. Moreover, Analysis of Variance (ANOVA) was performed to determine the drilling parameters and number of layers that minimize delamination. The optimum drilling factor combination was obtained using the analysis of the signal-to-noise ratio. The analysis revealed that feed rate was the most influential factor on delamination. The least delamination was obtained with composites having a greater number of layers, at lower cutting speeds and feed rates.
Keywords: analysis of variance, delamination, design optimization, drilling, glass fiber reinforced plastic composites, Taguchi method
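Since delamination is a smaller-the-better characteristic, the signal-to-noise analysis mentioned above uses the standard Taguchi formula SN = −10·log₁₀((1/n)·Σyᵢ²); a minimal sketch (generic, not the authors' code):

```python
import math

def sn_smaller_is_better(values):
    """Taguchi smaller-the-better signal-to-noise ratio (dB).
    A higher S/N means less delamination for that factor setting,
    so the optimum level of each factor maximises this ratio."""
    return -10.0 * math.log10(sum(y * y for y in values) / len(values))
```

For example, replicate delamination factors of 10 give S/N = −20 dB, while values near 1 give S/N near 0 dB, so the latter setting is preferred.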
Procedia PDF Downloads 258
2585 Post Growth Annealing Effect on Deep Level Emission and Raman Spectra of Hydrothermally Grown ZnO Nanorods Assisted by KMnO4
Authors: Ashish Kumar, Tejendra Dixit, I. A. Palani, Vipul Singh
Abstract:
Zinc oxide, with its interesting properties such as a large band gap (3.37 eV), high exciton binding energy (60 meV) and intense UV absorption, has been studied in the literature for various applications, viz. optoelectronics, biosensors, UV photodetectors, etc. The performance of ZnO devices is highly influenced by the morphology, size and crystallinity of the ZnO active layer and by the processing conditions. Recently, our group has shown the influence of the in situ addition of KMnO4 to the precursor solution during the hydrothermal growth of ZnO nanorods (NRs) on their near band edge (NBE) emission. In this paper, we have investigated the effect of post-growth annealing on the variations in the NBE and deep level (DL) emissions of as-grown ZnO nanorods. The observed results have been explained on the basis of X-ray Diffraction (XRD) and Raman spectroscopic analysis, which clearly show improved crystallinity and quantum confinement in the ZnO nanorods.
Keywords: ZnO, nanorods, hydrothermal, KMnO4
Procedia PDF Downloads 400
2584 Precise Determination of the Residual Stress Gradient in Composite Laminates Using a Configurable Numerical-Experimental Coupling Based on the Incremental Hole Drilling Method
Authors: A. S. Ibrahim Mamane, S. Giljean, M.-J. Pac, G. L’Hostis
Abstract:
Fiber reinforced composite laminates are particularly subject to residual stresses due to their heterogeneity and the complex chemical, mechanical and thermal mechanisms that occur during their processing. Residual stresses are now well known to cause damage accumulation, shape instability, and behavior disturbance in composite parts. Many works exist in the literature on techniques for minimizing residual stresses, mainly in thermosetting and thermoplastic composites. To study in depth the influence of processing mechanisms on the formation of residual stresses, and to minimize them by establishing a reliable correlation, it is essential to be able to measure the residual stress profile in the composite very precisely. Residual stresses are important data to consider when sizing composite parts and predicting their behavior. The incremental hole drilling method is very effective in measuring the gradient of residual stresses in composite laminates. This method is semi-destructive and consists of incrementally drilling a hole through the thickness of the material and measuring the relaxation strains around the hole for each increment using three strain gauges. These strains are then converted into residual stresses using a matrix of coefficients. These coefficients, called calibration coefficients, depend on the diameter of the hole and the dimensions of the gauges used. The reliability of incremental hole drilling depends on the accuracy with which the calibration coefficients are determined. These coefficients are calculated using a finite element model. The samples’ features and the experimental conditions must be considered in the simulation. Any mismatch can lead to inadequate calibration coefficients, thus introducing errors in the residual stresses. Several calibration coefficient correction methods exist for isotropic materials, but there is a lack of information on this subject concerning composite laminates.
In this work, a Python program was developed to automatically generate the adequate finite element model. This model allowed us to perform a parametric study to assess the influence of experimental errors on the calibration coefficients. The results highlighted the sensitivity of the calibration coefficients to the considered errors and gave an order of magnitude of the precision required of the experimental device to obtain reliable measurements. On the basis of these results, improvements were proposed to the experimental device. Furthermore, a numerical method was proposed to correct the calibration coefficients for different types of materials, including thick composite parts for which the analytical approach is too complex. This method consists of taking the experimental errors into account in the simulation. Accurate measurement of the experimental errors (such as eccentricity of the hole, angular deviation of the gauges from their theoretical position, or errors in increment depth) is therefore necessary. The aim is to determine the residual stresses more precisely and to expand the validity domain of the incremental hole drilling technique.
Keywords: fiber reinforced composites, finite element simulation, incremental hole drilling method, numerical correction of the calibration coefficients, residual stresses
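The strain-to-stress conversion described above amounts to solving a linear system built from the FE-derived calibration matrix; a schematic sketch (matrix values hypothetical, for illustration only):

```python
import numpy as np

def strains_to_stresses(C, eps):
    """Recover a residual stress profile from relieved strains.
    C:   (m, n) calibration coefficient matrix (in practice FE-derived,
         and sensitive to hole diameter, gauge geometry and placement).
    eps: (m,) relaxation strains, one reading per increment/gauge.
    Solves C @ sigma = eps in the least-squares sense."""
    sigma, *_ = np.linalg.lstsq(C, eps, rcond=None)
    return sigma
```

Because the system is inverted, small errors in C (i.e., in the calibration coefficients) propagate directly into the recovered stresses, which is why the paper's parametric error study matters.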
Procedia PDF Downloads 132
2583 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic, and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers that combines sensor data (temperature, humidity, power) with deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The model's performance is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
Keywords: anomaly detection, autoencoder, data centers, deep learning
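The reconstruction-based detection principle, reduced to its simplest form, is sketched below (the paper itself uses per-sensor LSTM autoencoders and a random forest on the difference signal rather than this plain threshold):

```python
import numpy as np

def detect_anomalies(x, x_hat, threshold):
    """Flag samples whose reconstruction error exceeds a threshold.
    x, x_hat: (N, d) inputs and their autoencoder reconstructions.
    Because the autoencoder is trained only on normal samples, abnormal
    inputs reconstruct poorly and produce a large difference signal."""
    err = np.mean(np.abs(x - x_hat), axis=1)
    return err > threshold
```

In the paper's pipeline, `err` (per sensor) is not thresholded directly but fed through feature extraction into the random forest classifier.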
Procedia PDF Downloads 194
2582 Level Set and Morphological Operation Techniques in Application of Dental Image Segmentation
Authors: Abdolvahab Ehsani Rad, Mohd Shafry Mohd Rahim, Alireza Norouzi
Abstract:
Medical image analysis is one of the major applications of computer image processing. Analyzing medical images involves several steps, of which segmentation is one of the most challenging and important. In this paper, a segmentation method is proposed for dental radiograph images. A thresholding method is applied to simplify the images, and a morphological binary opening is performed to eliminate unnecessary regions. Furthermore, horizontal and vertical integral projection techniques are used to extract each individual tooth from the radiograph. Segmentation is then completed by applying the level set method to each extracted image. Experimental results with 90% accuracy demonstrate that the proposed method achieves high accuracy and promising results.
Keywords: integral projection, level set method, morphological operation, segmentation
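The projection step described above is simply the row and column sums of the binarised radiograph; valleys in the vertical projection locate the gaps between adjacent teeth. A minimal sketch:

```python
import numpy as np

def integral_projections(binary_img):
    """Row and column sums of a binary image.
    Valleys (near-zero columns) in the vertical projection mark the
    gaps between adjacent teeth, giving cut positions for extracting
    each tooth before the level set refinement."""
    horizontal = binary_img.sum(axis=1)  # one value per row
    vertical = binary_img.sum(axis=0)    # one value per column
    return horizontal, vertical
```

For a two-row image with an empty middle column, the vertical projection is [2, 0, 2]: the zero at column 1 is the separation gap.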
Procedia PDF Downloads 317
2581 Long Memory and ARFIMA Modelling: The Case of CPI Inflation for Ghana and South Africa
Authors: A. Boateng, L. A. Gil-Alana, M. Lesaoana, Hj. Siweya, A. Belete
Abstract:
This study examines long memory, or long-range dependence, in the CPI inflation rates of Ghana and South Africa using Whittle methods and autoregressive fractionally integrated moving average (ARFIMA) models. Standard I(0)/I(1) methods, such as the Augmented Dickey-Fuller (ADF), Phillips-Perron (PP) and Kwiatkowski-Phillips-Schmidt-Shin (KPSS) tests, were also employed. Our findings indicate that long memory exists in the CPI inflation rates of both countries. After applying fractional differencing and determining the short-memory components, the models were specified as ARFIMA(4, 0.35, 2) for Ghana and ARFIMA(3, 0.49, 3) for South Africa. Consequently, the CPI inflation rates of both countries are fractionally integrated and mean reverting. These results can assist in policy formulation and in the identification of inflationary pressures in an economy.
Keywords: Consumer Price Index (CPI) inflation rates, Whittle method, long memory, ARFIMA model
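Fractional differencing applies the binomial expansion of (1 − L)^d; the weights can be generated recursively (a generic sketch of the standard recursion, not the authors' code):

```python
def frac_diff_weights(d, n):
    """Coefficients of (1 - L)^d up to lag n for fractional differencing:
    w[0] = 1, w[k] = w[k-1] * (k - 1 - d) / k.
    For 0 < d < 0.5 the weights decay hyperbolically rather than
    geometrically, the signature of a stationary, mean-reverting
    long-memory process such as the estimated d = 0.35 and d = 0.49."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (k - 1 - d) / k)
    return w
```

For d = 0.35 (the Ghana estimate), the first weights are 1, −0.35, −0.11375, ..., so distant past observations retain a small but slowly vanishing influence.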
Procedia PDF Downloads 369
2580 Tamper Resistance Evaluation Tests with Noise Resources
Authors: Masaya Yoshikawa, Toshiya Asai, Ryoma Matsuhisa, Yusuke Nozaki, Kensaku Asahi
Abstract:
Recently, side-channel attacks, which estimate secret keys from side-channel information such as the power consumption and compromising emanations of cryptographic circuits embedded in hardware, have become a serious problem. In particular, electromagnetic analysis attacks, which exploit the relationship between the information being processed in a cryptographic circuit and the electromagnetic fields it emits, both of which are related to the secret keys, are the most threatening side-channel attacks. Therefore, it is important to evaluate the tamper resistance of cryptographic circuits against electromagnetic analysis attacks. The present study performs a basic examination of the tamper resistance of cryptographic circuits using electromagnetic analysis attacks with noise resources.
Keywords: tamper resistance, cryptographic circuit, hardware security evaluation, noise resources
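As a schematic of how such attacks exploit the data-dependence of emanations, correlation-based analysis compares a hypothetical leakage model under a key guess with the measured traces, sample by sample (an illustrative sketch, not the authors' evaluation setup):

```python
import numpy as np

def cpa_correlation(traces, hypothesis):
    """Pearson correlation between a hypothetical leakage model (e.g.
    the Hamming weight of an intermediate value under a key guess) and
    measured emanation traces, computed independently per time sample.
    traces: (N, T) N traces of T samples; hypothesis: (N,) predictions.
    The correct key guess produces a correlation peak at some sample."""
    t = traces - traces.mean(axis=0)
    h = hypothesis - hypothesis.mean()
    num = h @ t
    den = np.sqrt((h @ h) * (t * t).sum(axis=0))
    return num / den
```

A tamper-resistance evaluation with noise resources then asks how much added noise is needed before this peak disappears.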
Procedia PDF Downloads 504
2579 Contextual Toxicity Detection with Data Augmentation
Authors: Julia Ive, Lucia Specia
Abstract:
Understanding and detecting toxicity is an important problem for supporting safer human interactions online. Our work focuses on the important problem of contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use “toxicity” as an umbrella term for a number of variants commonly named in the literature, including hate, abuse and offence, among others. Detecting toxicity in context is a non-trivial problem that has been addressed by very few previous studies. These studies analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better data. In our study, we start by further analysing human perception of toxicity in conversational data (i.e., tweets), in the absence versus the presence of context, in this case, previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: the contextual data available does not provide sufficient evidence that context is indeed important (even for humans). This data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity containing swear words, racist terms, etc.), so that context is not needed for a decision, or are ambiguous, vague or unclear even in the presence of context; in addition, the data contains labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious without context (i.e., covert cases), or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). On the contextual detection models, we posit that their poor performance is due to limitations of both the data they are trained on (the same problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on this, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our approach against previous models on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual, or contextual but agnostic of the conversation structure.
Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing
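As a shape-level illustration of a conversation-aware representation (mean pooling here is a toy stand-in for the learned hierarchical encoders the paper proposes; all names are hypothetical):

```python
import numpy as np

def hierarchical_encode(context_vecs, target_vec):
    """Toy hierarchy-aware representation: pool the conversational
    context (embeddings of previous utterances in the thread)
    separately, then concatenate with the target utterance embedding
    before classification, so the classifier sees the thread structure
    rather than a flat concatenation of all text."""
    if len(context_vecs):
        context = np.mean(context_vecs, axis=0)
    else:
        context = np.zeros_like(target_vec)
    return np.concatenate([context, target_vec])
```

A classifier trained on this representation can weight the context and target halves differently, which a context-agnostic model cannot do.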
Procedia PDF Downloads 170