Search results for: JM-MB-TBD filter
294 A New 3D Shape Descriptor Based on Multi-Resolution and Multi-Block CS-LBP
Authors: Nihad Karim Chowdhury, Mohammad Sanaullah Chowdhury, Muhammed Jamshed Alam Patwary, Rubel Biswas
Abstract:
In content-based 3D shape retrieval systems, achieving high search performance has become an important research problem. A challenging aspect of this problem is to find an effective shape descriptor that can adequately discriminate similar shapes. To address this problem, we propose a new shape descriptor for 3D shape models that combines multi-resolution analysis with the multi-block center-symmetric local binary pattern (CS-LBP) operator. Given an arbitrary 3D shape, we first apply pose normalization and generate a set of multi-view 2D rendered images. Second, we apply a Gaussian multi-resolution filter to generate several levels of images from each 2D rendered image. Then, overlapped sub-images are computed for each image level of a multi-resolution image. Our multi-block CS-LBP comes next: it allows the center to be composed of m-by-n rectangular pixels instead of a single pixel. This process is repeated for all the 2D rendered images, derived from both ‘depth-buffer’ and ‘silhouette’ rendering. Finally, we concatenate all the feature vectors into a one-dimensional histogram as our proposed 3D shape descriptor. Through several experiments on a benchmark dataset, we demonstrate that our proposed 3D shape descriptor outperforms previous methods.
Keywords: 3D shape retrieval, 3D shape descriptor, CS-LBP, overlapped sub-images
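As an illustrative sketch (not the authors' implementation), the core CS-LBP idea of thresholding center-symmetric neighbour pairs, rather than comparing each neighbour against the center as in plain LBP, can be written as follows; the 3x3 neighbourhood and zero threshold are assumptions for this example:

```python
import numpy as np

def cs_lbp_code(patch, threshold=0.0):
    """CS-LBP for a 3x3 patch: compare the 4 center-symmetric
    neighbour pairs instead of each neighbour against the center,
    giving a 4-bit code (0..15) rather than LBP's 8-bit code."""
    # neighbours in clockwise order starting at top-left
    n = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
         patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    code = 0
    for i in range(4):  # pair i is (n[i], n[i+4]), diametrically opposite
        if n[i] - n[i + 4] > threshold:
            code |= 1 << i
    return code
```

In the multi-block variant described in the abstract, each `patch` entry would be the mean of an m-by-n pixel block rather than a single pixel.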
Procedia PDF Downloads 445
293 Estimating X-Ray Spectra for Digital Mammography by Using the Expectation Maximization Algorithm: A Monte Carlo Simulation Study
Authors: Chieh-Chun Chang, Cheng-Ting Shih, Yan-Lin Liu, Shu-Jun Chang, Jay Wu
Abstract:
With the widespread use of digital mammography (DM), radiation dose evaluation of the breast has become important. The X-ray spectrum is one of the key factors that influence the absorbed dose of glandular tissue. In this study, we estimated the X-ray spectrum of DM using the expectation maximization (EM) algorithm with transmission measurement data. The interpolating polynomial model proposed by Boone was applied to generate the initial guess of the DM spectrum for the target/filter combination of Mo/Mo and a tube voltage of 26 kVp. The Monte Carlo N-particle code (MCNP5) was used to tally the transmission data through aluminum sheets of 0.2 to 3 mm. The X-ray spectrum was reconstructed iteratively using the EM algorithm, and the influence of the initial guess on the EM reconstruction was evaluated. The percentage error of the average energy between the reference spectrum input to the Monte Carlo simulation and the spectrum estimated by the EM algorithm was -0.14%. The normalized root mean square error (NRMSE) and the normalized root max square error (NRMaSE) between both spectra were 0.6% and 2.3%, respectively. We conclude that the EM algorithm with transmission measurement data is a convenient and useful tool for estimating X-ray spectra for DM in clinical practice.
Keywords: digital mammography, expectation maximization algorithm, X-ray spectrum, X-ray
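A minimal sketch of the kind of multiplicative EM (MLEM) update used to reconstruct a spectrum from transmission data: each transmission measurement is modeled as `m_j = sum_i A[j,i] * s[i]`, with `A[j,i] = exp(-mu_i * t_j)` for absorber thickness `t_j`. The attenuation coefficients, thicknesses, and iteration count below are illustrative assumptions, not the study's actual settings:

```python
import numpy as np

def em_spectrum(A, measured, n_iter=5000):
    """MLEM-style reconstruction: s <- s * A^T(m / As) / A^T 1.
    A[j, i] = transmission of energy bin i through absorber j."""
    s = np.ones(A.shape[1]) / A.shape[1]  # uniform initial guess
    norm = A.sum(axis=0)
    for _ in range(n_iter):
        est = A @ s                        # forward-projected transmission
        s *= (A.T @ (measured / est)) / norm
    return s / s.sum()                     # return a normalized spectrum
```

With exact (noise-free) data the true spectrum is a fixed point of this update, which is why the iteration recovers it; in practice the initial guess (here uniform, in the study Boone's polynomial model) mainly affects convergence speed.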
Procedia PDF Downloads 730
292 Building an Ontology for Researchers: An Application of Topic Maps and Social Information
Authors: Yu Hung Chiang, Hei Chia Wang
Abstract:
In academia, it is important for researchers to find a proper research domain. Many researchers refer to conference issues to find interesting or new topics. Conference issues can also help researchers recognize current research trends in their field and learn about cutting-edge developments in their specialty. However, conference information published online is widely distributed and hard to summarize. Many researchers use the search engines of journals or conference proceedings to filter information and find what they want, but such search engines have limitations; in particular, researchers cannot find associated topics that may be useful to them. Knowledge Management (KM) offers a way to resolve these issues. In KM, ontologies are widely adopted, but most existing ontology construction methods do not consider social information between target users. For effective academic KM, this study proposes a method of constructing research Topic Maps using the Open Directory Project (ODP) and Social Information Processing (SIP). By capturing social information from conference websites, i.e., co-authorship and collaborator information, research topics can be associated among related researchers. Finally, the experiments show that Topic Maps successfully help researchers find the information they need more easily and quickly, as well as construct associations between research topics.
Keywords: knowledge management, topic map, social information processing, ontology extraction
Procedia PDF Downloads 293
291 Integrated Wastewater Reuse Project of the Faculty of Sciences AinChock, Morocco
Authors: Nihad Chakri, Btissam El Amrani, Faouzi Berrada, Fouad Amraoui
Abstract:
In Morocco, water scarcity requires the exploitation of non-conventional resources. Rural areas are under-equipped with sanitation infrastructure, unlike urban areas. Decentralized, low-cost solutions could improve the quality of life of the population and the environment. In this context, the Faculty of Sciences Ain Chock (FSAC) has undertaken an integrated project to treat part of its wastewater using a decentralized compact system. The project proposes alternative solutions that are inexpensive and adapted to the context of peri-urban and rural areas in order to treat the wastewater generated and use it for irrigation, watering, and cleaning. For this purpose, several tests were carried out in the laboratory to develop a liquid waste treatment system optimized for local conditions. Based on the laboratory-scale results of the different proposed scenarios, we designed and implemented a prototype of a mini wastewater treatment plant for the Faculty. In this article, we outline the steps of sizing, construction, and monitoring of the mini-station at our Faculty.
Keywords: wastewater, purification, optimization, vertical filter, MBBR process, sizing, decentralized pilot, reuse, irrigation, sustainable development
Procedia PDF Downloads 114
290 Biodiversity Indices for Macrobenthic Community Structures of Mangrove Forests, Khamir Port, Iran
Authors: Mousa Keshavarz, Abdul-Reza Dabbagh, Maryam Soyuf Jahromi
Abstract:
The diversity of mangrove macrobenthos assemblages in the mudflat and mangrove ecosystems of Khamir Port, Iran was investigated for one year. During this period, we measured the physicochemical properties of the water (temperature, salinity, pH, DO) and the density and distribution of the macrobenthos. We sampled a total of 9 transects at three different topographic levels along the intertidal zone at three stations. Assemblages were compared at class level. The five most diverse and abundant classes were Foraminifera (54%), Gastropoda (23%), Polychaeta (10%), Bivalvia (8%), and Crustacea (5%), respectively. Overall densities were 1869 ± 424 ind/m² (26%) in spring, 2544 ± 383 ind/m² (36%) in summer, 1482 ± 323 ind/m² (21%) in autumn, and 1207 ± 80 ind/m² (17%) in winter. Along the intertidal zone, the overall relative density of individuals at high, intermediate, and low topographic levels was 40, 30, and 30%, respectively. Biodiversity indices were used to compare the classes: Gastropoda (Shannon index: 0.33) and Foraminifera (Simpson index: 0.28) had the highest scores. Other bio-indices were also calculated. With the exception of bivalves, filter feeders were associated with coarser sediments at higher intertidal levels, while deposit feeders were associated with finer sediments at lower levels. Salinity was the most important factor acting on community structure, while DO and pH had little influence.
Keywords: macrobenthos, biodiversity, mangrove forest, Khamir Port
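For reference, the two diversity indices named in the abstract can be computed directly from class abundance counts; this is the standard textbook form, not the authors' code:

```python
import math

def shannon(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over nonzero classes."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in ps)

def simpson(counts):
    """Simpson index D = sum(p_i^2); lower D means higher diversity."""
    total = sum(counts)
    return sum((c / total) ** 2 for c in counts)
```

For example, the reported class proportions (54, 23, 10, 8, 5) can be passed in directly, since both indices depend only on relative abundances.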
Procedia PDF Downloads 376
289 Signal Estimation and Closed Loop System Performance in Atrial Fibrillation Monitoring with Communication Channels
Authors: Mohammad Obeidat, Ayman Mansour
Abstract:
In this paper, a unique issue arising from feedback control of an atrial fibrillation monitoring system with embedded communication channels is investigated. One of the important factors in measuring the performance of a closed-loop feedback control system is the disturbance and noise attenuation factor: it is important that the feedback system can attenuate such disturbances on the atrial fibrillation heart rate signals. Communication channels depend on network traffic conditions and deliver different throughput, implying that the sampling intervals may change. Since the signal estimate is updated on the arrival of new data, its dynamics actually change with the sampling interval. Consequently, the interaction among sampling, signal estimation, and the controller introduces new issues in a remotely controlled atrial fibrillation system. This paper treats a remotely controlled atrial fibrillation system with one communication channel, which connects the heart rate and rhythm measurements to the remote controller. A typical and optimal signal estimation scheme is represented by a signal averaging filter with its time constant derived from the step size of the signal estimation algorithm.
Keywords: atrial fibrillation, communication channels, closed loop, estimation
Procedia PDF Downloads 378
288 Design of a Real Time Heart Sounds Recognition System
Authors: Omer Abdalla Ishag, Magdi Baker Amien
Abstract:
Physicians use the stethoscope to listen to a patient's heart sounds in order to make a diagnosis. However, determining heart conditions with an acoustic stethoscope is a difficult task that requires special training of medical staff. This study developed an accurate model for analyzing the phonocardiograph signal based on a PC and a DSP processor. The system was realized in two phases: offline and real time. In the offline phase, 30 heart sound files were collected from medical students and doctors' websites. For the experimental (real-time) phase, an electronic stethoscope was designed and implemented, and signals were recorded from 30 volunteers, of whom 17 were normal cases and 13 had various pathologies; these 30 acquired signals were preprocessed using an adaptive filter to remove lung sounds. The background noise was removed from both the offline and real data using the wavelet transform; then graphical and statistical feature vector elements were extracted; finally, a look-up table was used to classify the heart sound cases. The implemented system showed accuracies of 90% and 80% and sensitivities of 87.5% and 82.4% for offline and real data, respectively. The whole system was designed on a TMS320VC5509a DSP platform.
Keywords: code composer studio, heart sounds, phonocardiograph, wavelet transform
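To illustrate the wavelet denoising step in its simplest form, the sketch below does one-level Haar decomposition with soft thresholding of the detail coefficients; the actual wavelet family, decomposition depth, and threshold used in the study are not specified in the abstract, so these are assumptions:

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet soft-threshold denoising (sketch).
    x must have even length."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    # soft threshold: shrink details toward zero, killing small (noisy) ones
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)
    y = np.empty_like(x)                   # inverse Haar transform
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y
```

With `thresh=0` the reconstruction is exact, which is a convenient sanity check; a positive threshold suppresses high-frequency background noise while keeping the heart sound envelope.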
Procedia PDF Downloads 446
287 Continuous Blood Pressure Measurement from Pulse Transit Time Techniques
Authors: Chien-Lin Wang, Cha-Ling Ko, Tainsong Chen
Abstract:
Blood pressure (BP) is one of the vital signs and is an index that helps determine the stability of life. In this respect, some spinal cord injury patients need to take the tilt table test. During the test, the posture changes abruptly and may cause a patient’s BP to change abnormally. This may cause patients to feel discomfort, and even feel as though their life is threatened. Therefore, a continuous non-invasive BP assessment system could help to alert health care professionals during rehabilitation when the BP value is out of range. In our research, BP assessment by the pulse transit time (PTT) technique was developed. In the system, we use a self-made photoplethysmograph (PPG) sensor and filter circuit to detect two PPG signals and to calculate the time difference between them. The BP can then be assessed immediately from the trend line. According to the results of this study, the relationship between systolic BP and PTT has a highly negative linear correlation (R²=0.8). Further, we used the trend line to assess the BP value and compared it to a commercial sphygmomanometer (Omron MX3); the error rate of the system was found to be within ±10%, which is within the permissible error range of a commercial sphygmomanometer. Continuous blood pressure measurement with the pulse transit time technique may have the potential to become a convenient method for clinical rehabilitation.
Keywords: continuous blood pressure measurement, PPG, pulse transit time, transit velocity
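The "trend line" assessment amounts to a per-subject linear calibration of systolic BP against PTT. The sketch below fits such a line and predicts from it; the calibration pairs are hypothetical values chosen only to show the negative slope, not data from the study:

```python
import numpy as np

# hypothetical calibration pairs: (PTT in ms, systolic BP in mmHg);
# longer transit time corresponds to lower pressure
ptt = np.array([180.0, 200.0, 220.0, 240.0, 260.0])
sbp = np.array([135.0, 128.0, 121.0, 114.0, 107.0])

slope, intercept = np.polyfit(ptt, sbp, 1)  # least-squares trend line

def estimate_sbp(ptt_ms):
    """Estimate systolic BP from a measured pulse transit time."""
    return slope * ptt_ms + intercept
```

In practice each subject would be calibrated against cuff readings first; the R² of the fit (0.8 in the study) indicates how much of the BP variation the trend line captures.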
Procedia PDF Downloads 353
286 A Methodology for Investigating Public Opinion Using Multilevel Text Analysis
Authors: William Xiu Shun Wong, Myungsu Lim, Yoonjin Hyun, Chen Liu, Seongi Choi, Dasom Kim, Kee-Young Kwahk, Namgyu Kim
Abstract:
Recently, many users have begun to frequently share their opinions on diverse issues using various social media. Therefore, numerous governments have attempted to establish or improve national policies according to public opinion captured from social media. In this paper, we point out several limitations of the traditional approaches to analyzing public opinion on science and technology and provide an alternative methodology to overcome these limitations. First, we distinguish between the science and technology analysis phase and the social issue analysis phase to reflect the fact that public opinion can form only when a certain science and technology is applied to a specific social issue. Next, we successively apply a start list and a stop list to acquire clarified and interesting results. Finally, to identify the documents that best fit a given subject, we develop a new logical filter concept that consists not only of mere keywords but also of logical relationships among the keywords. This study then analyzes the possibilities for practical use of the proposed methodology through its application to discover core issues and public opinions from 1,700,886 documents comprising SNS posts, blogs, news, and discussions.
Keywords: big data, social network analysis, text mining, topic modeling
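A logical filter of the kind described, keywords combined through AND/OR/NOT relationships rather than a flat keyword list, might be sketched like this; the keyword sets in the usage comment are invented examples, not the study's lists:

```python
def logical_filter(doc, must_all, must_any=(), must_not=()):
    """Keep a document only if it contains every keyword in must_all,
    at least one in must_any (if given), and none in must_not."""
    text = doc.lower()
    return (all(k in text for k in must_all)
            and (not must_any or any(k in text for k in must_any))
            and not any(k in text for k in must_not))

# e.g. technology term AND social-issue term, excluding off-topic noise:
# logical_filter(doc, must_all=["drone"], must_any=["privacy", "safety"],
#                must_not=["movie"])
```

This matches the two-phase idea in the abstract: one keyword group anchors the science and technology, another anchors the social issue, and a document must satisfy both.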
Procedia PDF Downloads 294
285 A Framework for Early Differential Diagnosis of Tropical Confusable Diseases Using the Fuzzy Cognitive Map Engine
Authors: Faith-Michael E. Uzoka, Boluwaji A. Akinnuwesi, Taiwo Amoo, Flora Aladi, Stephen Fashoto, Moses Olaniyan, Joseph Osuji
Abstract:
The overarching aim of this study is to develop a soft-computing system for the differential diagnosis of tropical diseases. These conditions are of concern to health bodies, physicians, and the community at large because of their mortality rates and the difficulty of early diagnosis, since they present with overlapping symptoms and thus become ‘confusable’. We report on the first phase of our study, which focuses on the development of a fuzzy cognitive map (FCM) model for early differential diagnosis of tropical diseases. We used malaria as a case disease to show the effectiveness of the FCM technology as an aid to the medical practitioner in the diagnosis of tropical diseases. Our model takes cognizance of manifested symptoms and other non-clinical factors that could contribute to symptom manifestation. The model showed 85% accuracy in diagnosis, against the physicians’ initial hypothesis, which stood at 55% accuracy. The next stage of our study is expected to provide a multi-disease, multi-symptom model that also improves efficiency by utilizing a decision support filter whose algorithm mimics the physician’s diagnosis process.
Keywords: medical diagnosis, tropical diseases, fuzzy cognitive map, decision support filters, malaria differential diagnosis
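The generic FCM inference mechanism behind such a model is a repeated matrix-sigmoid update of concept activations (symptoms and diagnoses as concepts, causal influences as a weight matrix). This is the standard textbook update, not the authors' specific map; the weight matrix and steepness parameter here are placeholders:

```python
import numpy as np

def fcm_step(state, W, lam=1.0):
    """One FCM inference step: each concept's new activation is the
    sigmoid of the weighted sum of influences from all concepts."""
    return 1.0 / (1.0 + np.exp(-lam * (state @ W)))

def fcm_run(state, W, iters=20, lam=1.0):
    """Iterate until (in practice) the activations settle; diagnosis
    concepts with high final activation are the suggested conditions."""
    for _ in range(iters):
        state = fcm_step(state, W, lam)
    return state
```

In a diagnosis map, `W[i, j]` would encode how strongly symptom or factor `i` supports concept `j`, with values elicited from clinicians or learned from cases.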
Procedia PDF Downloads 319
284 Ambiguity Resolution for Ground-based Pulse Doppler Radars Using Multiple Medium Pulse Repetition Frequency
Authors: Khue Nguyen Dinh, Loi Nguyen Van, Thanh Nguyen Nhu
Abstract:
In this paper, we propose an adaptive method to resolve ambiguities and a ghost target removal process to extract targets detected by a ground-based pulse-Doppler radar using medium pulse repetition frequency (PRF) waveforms. The ambiguity resolution method is an adaptive implementation of the coincidence algorithm, which is implemented on a two-dimensional (2D) range-velocity matrix to resolve range and velocity ambiguities simultaneously, with a proposed clustering filter to enhance the anti-error ability of the system. Here we consider the scenario of multiple target environments. The ghost target removal process, which is based on the power after Doppler processing, is proposed to mitigate ghosting detections to enhance the performance of ground-based radars using a short PRF schedule in multiple target environments. Simulation results on a ground-based pulsed Doppler radar model will be presented to show the effectiveness of the proposed approach.
Keywords: ambiguity resolution, coincidence algorithm, medium PRF, ghosting removal
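The coincidence idea can be shown in a simplified one-dimensional (range-only) form: unfold each PRF's ambiguous measurement by all multiples of its unambiguous range and look for a value common to every PRF. The paper's method works on a full 2D range-velocity matrix with a clustering filter; this sketch, with made-up ranges and tolerance, only illustrates the unfolding step:

```python
def resolve_range(apparent, max_unamb, max_range, tol=1.0):
    """1-D coincidence method: true range = apparent range + k * R_unamb
    for some integer k; the value shared by all PRFs is the answer."""
    candidate_sets = []
    for r, ru in zip(apparent, max_unamb):
        # all unfolded candidates for this PRF up to the instrumented range
        candidate_sets.append({r + k * ru
                               for k in range(int(max_range // ru) + 1)})
    for c in sorted(candidate_sets[0]):
        if all(any(abs(c - x) <= tol for x in s)
               for s in candidate_sets[1:]):
            return c            # first coincidence across all PRFs
    return None                 # no consistent unfolding found
```

Measurement noise is why a tolerance (and, in the paper, a clustering filter) is needed: real detections never coincide exactly.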
Procedia PDF Downloads 151
283 The Trigger-DAQ System in the Mu2e Experiment
Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella
Abstract:
The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e’s trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability, providing a multi-user, web-based interface accessible through the Chrome or Firefox web browser. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data is then read over the PCIe bus to a software filter algorithm that selects events, which are finally combined with the data flux that comes from the cosmic ray veto system (CRV).
Keywords: trigger, DAQ, Mu2e, Fermilab
Procedia PDF Downloads 155
282 One-off Separation of Multiple Types of Oil-in-Water Emulsions with Surface-Engineered Graphene-Based Multilevel Structure Materials
Authors: Han Longxiang
Abstract:
In the process of treating industrial oil wastewater with complex components, the traditional treatment methods (flotation, coagulation, microwave heating, etc.) often entail high operating costs, secondary pollution, and other problems. To solve these problems, materials with high flux and stability applied to the separation of surfactant-stabilized emulsions have gained great attention in the treatment of oily wastewater. Nevertheless, four stable oil-in-water emulsion types can form with different surfactants (surfactant-free, anionic, cationic, and non-ionic), and previous advanced materials can only separate one or a few of them and cannot separate them effectively in one step. Herein, we present a facile synthesis method for graphene-based multilevel filter materials (GMFM) that can efficiently separate oil-in-water emulsions stabilized with different surfactants, driven by gravity alone. The prepared materials, with high stability over 20 cycles, show a high flux of ~5000 L m⁻² h⁻¹ with a separation efficiency of > 99.9%. GMFM can effectively separate emulsions stabilized by mixed surfactants as well as oily wastewater from factories. The results indicate that GMFM has a wide range of applications in oil-in-water emulsion separation in industry and environmental science.
Keywords: emulsion, filtration, graphene, one-step
Procedia PDF Downloads 80
281 Design of Raw Water Reservoir on Sandy Soil
Authors: Venkata Ramana Pamu
Abstract:
This paper is a case study of a 5310 ML capacity raw water reservoir (RWR) in the Indian state of Rajasthan, part of the Rajasthan Rural Water Supply & Fluorosis Mitigation Project. The RWR embankment was constructed from locally available material on the natural ground profile. The embankment height varied from 2 m to 10 m because the existing ground level varied. A reservoir depth of 9 m, including a 1.5 m freeboard, and 1V:3H slopes were provided on both the upstream and downstream sides. Soil investigations and tests confirmed that the existing soil is sandy silt. The excavated earth was used as filling material for embankment construction; because of this, controlling seepage from upstream to downstream was a challenging task. Slope stability and seismic analyses of the embankment were done by conventional methods for both the full reservoir condition and rapid drawdown. A horizontal filter at toe level was provided, along with upstream-side PCC (plain cement concrete) blocks and an HDPE (high-density polyethylene) lining, as a remedy to control seepage. HDPE lining was also provided at the bed level of the reservoir storage area. Mulching was done for downstream slope protection.
Keywords: raw water reservoir, seepage, seismic analysis, slope stability
Procedia PDF Downloads 497
280 Visibility Measurements Using a Novel Open-Path Optical Extinction Analyzer
Authors: Nabil Saad, David Morgan, Manish Gupta
Abstract:
Visibility has become a key component of air quality and is regulated in many areas by environmental laws such as the EPA Clean Air Act and the Regional Haze Rule. Typically, visibility is calculated by estimating the optical absorption and scattering of both gases and aerosols. A major component of aerosols’ climatic effect is due to their scattering and absorption of solar radiation, which are governed by their optical and physical properties. However, the accurate assessment of this effect on global warming, climate change, and air quality is made difficult by uncertainties in the calculation of single scattering albedo (SSA). Experimental complications arise in determining the single scattering albedo of an aerosol particle, since it requires the simultaneous measurement of both scattering and extinction. In fact, aerosol optical absorption in particular is a difficult measurement to perform, and it is often associated with large uncertainties when using filter methods or difference methods. In this presentation, we demonstrate the use of a new open-path optical extinction analyzer (OEA) in conjunction with a nephelometer and two particle sizers, emphasizing the benefits that co-employment of the OEA offers in deriving the complex refractive index of aerosols and their single scattering albedo. Various use cases, data reproducibility, and instrument calibration will also be presented to highlight the value proposition of this novel open-path OEA.
Keywords: aerosols, extinction, visibility, albedo
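The relationship the instrument combination exploits is simple: extinction is the sum of scattering and absorption, and SSA is the scattered fraction of extinction. Measuring extinction (OEA) and scattering (nephelometer) therefore yields SSA without a direct absorption measurement. As a worked definition:

```python
def extinction(b_scat, b_abs):
    """Extinction coefficient = scattering + absorption (same units)."""
    return b_scat + b_abs

def single_scattering_albedo(b_scat, b_abs):
    """SSA: the fraction of extinction due to scattering; SSA close to 1
    means a nearly purely scattering (weakly absorbing) aerosol."""
    return b_scat / (b_scat + b_abs)
```

Equivalently, with measured extinction `b_ext` and scattering `b_scat`, SSA is `b_scat / b_ext`, and absorption follows as the difference.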
Procedia PDF Downloads 90
279 Developing Structured Sizing Systems for Manufacturing Ready-Made Garments of Indian Females Using Decision Tree-Based Data Mining
Authors: Hina Kausher, Sangita Srivastava
Abstract:
In India, there is a lack of a standard, systematic sizing approach for producing ready-made garments. Garment manufacturing companies use their own size tables, created by modifying international sizing charts for ready-made garments. The purpose of this study is to tabulate anthropometric data covering the variety of figure proportions in both height and girth. Data from 3,000 subjects were collected in an anthropometric survey of females between the ages of 16 and 80 years from several states of India, to produce a sizing system suitable for clothing manufacture and retailing. These data were used for the statistical analysis of body measurements and the formulation of sizing systems and body measurement tables. The factor analysis technique was used to filter the control body dimensions from a large number of variables, and decision tree-based data mining was used to cluster the data. A standard, structured sizing system can facilitate pattern grading and garment production; moreover, it can improve buying ratios and upgrade size allocations to retail segments.
Keywords: anthropometric data, data mining, decision tree, garments manufacturing, sizing systems, ready-made garments
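The end product of such a decision tree is effectively a set of threshold rules that map a person's control dimensions to a size label. A toy version of such a rule set, with entirely hypothetical thresholds (the study's actual control dimensions and cut points are not given in the abstract), might look like:

```python
def assign_size(height_cm, bust_cm):
    """Toy size lookup from two control dimensions.
    All bin boundaries here are hypothetical, for illustration only."""
    if height_cm < 155:
        h = "short"
    elif height_cm < 170:
        h = "regular"
    else:
        h = "tall"
    if bust_cm < 84:
        g = "S"
    elif bust_cm < 92:
        g = "M"
    elif bust_cm < 100:
        g = "L"
    else:
        g = "XL"
    return f"{g}-{h}"
```

In the study, the thresholds would instead come from the decision-tree clustering of the 3,000 measured subjects, so that each leaf covers a real cluster of figure proportions.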
Procedia PDF Downloads 133
277 Energy Analysis of Sugarcane Production: A Case Study in Metehara Sugar Factory in Ethiopia
Authors: Wasihun Girma Hailemariam
Abstract:
Energy is one of the key elements required for every agricultural activity, especially for large-scale agricultural production such as sugarcane cultivation, which is mostly used to produce sugar and bioethanol. In such resource-intensive activities, analyzing the energy use of the production system and looking for alternatives that can reduce the energy inputs of sugarcane production are steps toward better resource management. The purpose of this study was to determine the input energy (direct and indirect) per hectare in the sugarcane production sector of the Metehara sugar factory in Ethiopia. The total energy consumption of the production system was 61,642 MJ/ha-yr, a cumulative value of the different direct and indirect inputs in the production system. The contribution of these inputs is discussed, along with a scenario of substituting the most influential input with an alternative of equivalent nutrient content. In this study, the most influential input was the application of organic fertilizer, which accounted for 50% of the total energy consumption. Filter cake, a residue from sugar production in the factory, was used to substitute the organic fertilizer, and the resulting reduction in the energy consumption of sugarcane production is discussed.
Keywords: energy analysis, organic fertilizer, resource management, sugarcane
Procedia PDF Downloads 158
276 High-Accuracy Satellite Image Analysis and Rapid DSM Extraction for Urban Environment Evaluations (Tripoli-Libya)
Authors: Abdunaser Abduelmula, Maria Luisa M. Bastos, José A. Gonçalves
Abstract:
The modeling of the earth's surface and the evaluation of urban environments with 3D models is an important research topic. New stereo capabilities of high-resolution optical satellite images, such as the tri-stereo mode of Pleiades, combined with new image matching algorithms, are now available and can be applied in urban area analysis. In addition, photogrammetry software packages have gained new, more efficient matching algorithms, such as SGM, as well as improved filters to deal with shadow areas, and can achieve denser and more precise results. This paper describes a comparison between 3D data extracted from tri-stereo and dual-stereo satellite images, combined with pixel-based matching and the Wallis filter. The aim was to improve the accuracy of 3D models, especially in urban areas, in order to assess whether satellite images are appropriate for a rapid evaluation of urban environments. The results showed that 3D models derived from Pleiades tri-stereo outperformed, both in accuracy and detail, the results obtained from a GeoEye pair. The assessment was made against reference digital surface models derived from high-resolution aerial photography. This suggests that tri-stereo images can be successfully used for the proposed urban change analyses.
Keywords: 3D models, environment, matching, pleiades
Procedia PDF Downloads 330
275 Non-Targeted Adversarial Image Classification Attack-Region Modification Methods
Authors: Bandar Alahmadi, Lethia Jackson
Abstract:
Machine learning models are used today in many real-life applications. The safety and security of such models is important, so that the results of a model are as accurate as possible. One challenge of machine learning model security is the adversarial examples attack. Adversarial examples are designed by an attacker to cause a machine learning model to misclassify its input. We propose a method to generate adversarial examples that attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back into the original image, and then run it through a classifier. The algorithm modifies the detected region using two methods. First, it overlays an abstract image matrix on the detected region matrix. Then, it performs a rotation attack that rotates the detected region around its axes and embeds the trace of the image in the image background. Finally, the attacked region is placed back in its original position, and a smoothing filter is applied to blend the background with the foreground. We tested our method on a cascade classifier, and the algorithm is efficient: the classifier confidence dropped to almost zero. We also tried it on a CNN (convolutional neural network) with higher settings, and the algorithm also worked successfully.
Keywords: adversarial examples, attack, computer vision, image processing
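A stripped-down sketch of the region-modification step: rotate only the detected region in place and leave the rest of the image untouched. This covers just the rotation part of the attack (the overlay and smoothing steps are omitted), and the bounding box is assumed to be given by a detector:

```python
import numpy as np

def region_rotation_attack(img, box):
    """Rotate the detected region by 180 degrees in place (sketch).
    box = (y0, y1, x0, x1), the detector's bounding box for the region."""
    y0, y1, x0, x1 = box
    out = img.copy()
    region = img[y0:y1, x0:x1]
    out[y0:y1, x0:x1] = np.rot90(region, 2)  # two 90-degree turns = 180
    return out
```

Because only the detected region changes, the perturbation is localized, which is the point of the method: a small, targeted modification is enough to drop the classifier's confidence.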
Procedia PDF Downloads 339
274 Inerting and Upcycling of Foundry Fines
Authors: Chahinez Aissaoui, Cecile Diliberto, Jean-Michel Mechling
Abstract:
The manufacture of metal foundry products requires the use of sand moulds, which are destroyed and made anew each time metal is poured. Recycled sand, however, requires a regeneration process that produces a polluted fine mineral phase. Particularly rich in heavy metals and organic residues, this foundry co-product is disposed of in hazardous waste landfills and requires an expensive stabilisation process. This paper presents the results of research that valorises this fine fraction of foundry sand by inerting it in a cement phase. The fines were taken from the bag filter suction systems of a foundry. The sample is a filler-like fraction below 140 µm with a D50 of 43 µm, a Blaine fineness of 3120 cm²/g, and a composition dominated by SiO₂, Al₂O₃, and Fe₂O₃. The loss on ignition at 1000°C of this material is 20%. The chosen inerting technique is to manufacture cement pastes which, once hardened, are crushed for use as artificial aggregates in new concrete formulations. Different percentages of volume substitution of Portland cement were tested: 30, 50, and 65%. The substitution rates were chosen to obtain the highest possible recycling rate while satisfying the European discharge limits (assessed by leaching). They were also optimised by adding water-reducing admixtures to increase the compressive strength of the mixes.
Keywords: leaching, upcycling, waste, residuals
Procedia PDF Downloads 68
273 Sampling and Chemical Characterization of Particulate Matter in a Platinum Mine
Authors: Juergen Orasche, Vesta Kohlmeier, George C. Dragan, Gert Jakobi, Patricia Forbes, Ralf Zimmermann
Abstract:
Underground mining poses a difficult environment for both man and machines. At more than 1000 meters beneath the surface of the earth, ores and other mineral resources are still gained by conventional and motorised mining. Adding to the hazards caused by blasting and stone-chipping, the working conditions are best described by high temperatures of 35-40°C and high humidity, at low air exchange rates. Separate ventilation shafts lead fresh air into a mine and others lead spent air back to the surface. This is essential for humans and machines working deep underground. Nevertheless, mines are widely ramified, so the air flow rate at the far end of a tunnel can be close to zero. In recent years, conventional mining has been supplemented by mining with heavy diesel machines. These very flat machines, called Load Haul Dump (LHD) vehicles, accelerate and ease work in areas favourable for heavy machines. On the other hand, they emit unfiltered diesel exhaust, which constitutes an occupational hazard for the miners. Combined with low air exchange, high humidity and inorganic dust from the mining, this leads to 'black smog' underneath the earth. This work focuses on the air quality in mines employing LHDs. We therefore performed personal sampling (samplers worn by miners during their work), stationary sampling and aethalometer (Microaeth MA200, Aethlabs) measurements in a platinum mine around 1000 meters under the earth's surface. We compared areas of high diesel exhaust emission with areas of conventional mining where no diesel machines were operated. For a better assessment of the health risks caused by air pollution, we applied a separated gas-/particle-sampling system whose first denuder section collects intermediate volatile organic compounds (IVOCs). These multi-channel silicone rubber denuders trap IVOCs while transmitting particles ranging from 10 nm to 1 µm in diameter with an efficiency of nearly 100%.
The second section is a quartz fibre filter collecting particles and adsorbed semi-volatile organic compounds (SVOCs). The third part is a graphitized carbon black adsorber, collecting the SVOCs that evaporate from the filter. The compounds collected on these three sections were analyzed in our labs with different thermal desorption techniques coupled with gas chromatography and mass spectrometry (GC-MS). VOCs and IVOCs were measured with a Shimadzu thermal desorption unit (TD20, Shimadzu, Japan) coupled to a GCMS-System QP 2010 Ultra with a quadrupole mass spectrometer (Shimadzu). The GC was equipped with a 30 m BP-20 wax column (0.25 mm ID, 0.25 µm film) from SGE (Australia). Filters were analyzed with in-situ derivatization thermal desorption gas chromatography time-of-flight mass spectrometry (IDTD-GC-TOF-MS). The IDTD unit is a modified GL Sciences Optic 3 system (GL Sciences, Netherlands). The results showed black carbon concentrations, measured with the portable aethalometers, of up to several mg per m³. The organic chemistry was dominated by very high concentrations of alkanes. Typical diesel engine exhaust markers such as alkylated polycyclic aromatic hydrocarbons were detected, as well as typical lubrication oil markers such as hopanes.Keywords: diesel emission, personal sampling, aethalometer, mining
Procedia PDF Downloads 157
272 Understanding the Experience of the Visually Impaired towards a Multi-Sensorial Architectural Design
Authors: Sarah M. Oteifa, Lobna A. Sherif, Yasser M. Mostafa
Abstract:
Visually impaired people, in their daily lives, face struggles and spatial barriers because the built environment is often designed with an extreme focus on the visual element, causing what is called architectural visual bias or ocularcentrism. The aim of the study is to holistically understand the world of the visually impaired as an attempt to extract the qualities of space that accommodate their needs, and to show the importance of multi-sensory, holistic designs for the blind. Within the framework of existential phenomenology, common themes are reached through "intersubjectivity": descriptions of experiences by blind people and blind architects, observations of how blind children learn to perceive their surrounding environment, and a personal lived blindfolded experience are analyzed. The extracted themes show how visually impaired people filter out and prioritize tactile (active, passive and dynamic touch), acoustic and olfactory spatial qualities, respectively, and how this happened during the personal lived blindfolded experience. The themes clarify that haptic and aural inclusive designs are essential to create environments suitable for the visually impaired and to empower them towards an independent, safe and efficient life.Keywords: architecture, architectural ocularcentrism, multi-sensory design, visually impaired
Procedia PDF Downloads 202
271 Development of Filling Material in 3D Printer with the Aid of Computer Software for Supported with Natural Zeolite for the Removal of Nitrogen and Phosphorus
Authors: Luís Fernando Cusioli, Leticia Nishi, Lucas Bairros, Gabriel Xavier Jorge, Sandro Rogério Lautenschalager, Celso Varutu Nakamura, Rosângela Bergamasco
Abstract:
Focusing on the elimination of nitrogen and phosphorus from sewage, this study proposes to face the challenges of eutrophication and to optimize the effectiveness of sewage treatment through biofilms and filter filling material produced by a 3D printer, seeking to identify the more effective of two filament materials, polylactic acid (PLA) and acrylonitrile butadiene styrene (ABS). The study also proposes to evaluate the nitrification process in a Submerged Aerated Biological Filter (FBAS) at pilot-plant scale, quantifying the removal of nitrogen and phosphorus. The experiment will consist of two distinct phases: a bench stage and the implementation of a pilot plant. During the bench stage, samples will be collected at five points to characterize the microbiota, which will be investigated using Fluorescence In Situ Hybridization (FISH), deepening the understanding of the performance of biofilms in the face of multiple variables. In this context, the study contributes to the search for effective solutions to mitigate eutrophication and thus strengthen initiatives to improve effluent treatment.Keywords: eutrophication, sewage treatment, biofilms, nitrogen and phosphorus removal, 3D printer, environmental efficiency
Procedia PDF Downloads 88
270 Metagenomics Composition During and After Wet Deposition and the Presence of Airborne Microplastics
Authors: Yee Hui Lim, Elena Gusareva, Irvan Luhung, Yulia Frank, Stephan Christoph Schuster
Abstract:
Environmental pollution from microplastics (MPs) is an emerging concern worldwide. While the presence of microplastics has been well established in marine and terrestrial environments, their prevalence in the atmosphere is still poorly understood. Wet depositions such as rain or snow scavenge impurities from the atmosphere as they fall to the ground, and thus serve as a useful tool for the removal of airborne particles suspended in the air. The aim of this study is therefore to investigate the presence of atmospheric microplastics and fibres through the analysis of air, rainwater and snow samples. Air samples were collected with filter-based air samplers from outdoor locations in Singapore. The sampling campaigns were conducted during and after each rain event. Rainwater samples from Singapore and Siberia were collected as well, and snow samples were also collected from Siberia as part of the ongoing study. Genomic DNA was then extracted from the samples and sequenced with a shotgun metagenomics approach. qPCR analysis was conducted to quantify the total bacteria and fungi in the air, rainwater and snow samples, and the bioaerosol profiles of all the samples were compared. To observe the presence of microplastics, scanning electron microscopy (SEM) was used. From the preliminary results, microplastics were detected. It can be concluded that a significant amount of atmospheric microplastics is present, and its occurrence should be investigated in greater detail.Keywords: atmospheric microplastics, metagenomics, scanning electron microscope, wet deposition
Procedia PDF Downloads 86
269 Same-Day Detection Method of Salmonella Spp., Shigella Spp. and Listeria Monocytogenes with Fluorescence-Based Triplex Real-Time PCR
Authors: Ergun Sakalar, Kubra Bilgic
Abstract:
Faster detection and characterization of pathogens are the basis for avoiding harm from foodborne pathogens. Salmonella spp., Shigella spp. and Listeria monocytogenes are common foodborne bacteria that are among the most life-threatening. Rapid and accurate detection of these pathogens is important to prevent food poisoning and outbreaks and to manage food chains. The present work promises to develop a sensitive, species-specific and reliable PCR-based detection system for the simultaneous detection of Salmonella spp., Shigella spp. and Listeria monocytogenes. For this purpose, three genes were selected: ompC for Salmonella spp., ipaH for Shigella spp. and hlyA for L. monocytogenes. After short pre-enrichment, the milk was passed through a vacuum filter and bacterial DNA was extracted using a commercially available kit, GIDAGEN® (Istanbul, Turkey). Detection of the amplicons was verified by examination of their melting temperatures (Tm), which are 72°C, 78°C and 82°C for Salmonella spp., Shigella spp. and L. monocytogenes, respectively. The specificity of the method was checked against a group of bacterial strains, and a sensitivity test was also carried out, yielding detection limits below 10² CFU mL⁻¹ of milk for each bacterial strain. Our results show that the fluorescence-based triplex qPCR method can be used routinely to detect Salmonella spp., Shigella spp. and L. monocytogenes during milk processing procedures in order to reduce cost, time of analysis and the risk of foodborne disease outbreaks.Keywords: EvaGreen, food-borne bacteria, pathogen detection, real-time PCR
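The melting-temperature readout described in this abstract lends itself to a simple decision rule: each amplicon is assigned to a pathogen by the Tm of its melt-curve peak. The sketch below uses the three Tm values given above; the ±1°C tolerance window is an illustrative assumption, not a figure from the paper.

```python
# Assign a melt-curve peak to a pathogen by its characteristic Tm,
# following the Tm values reported in the abstract (72, 78, 82 degrees C).
# The +/-1 degree tolerance is an illustrative assumption.

TM_SIGNATURES = {
    "Salmonella spp. (ompC)": 72.0,
    "Shigella spp. (ipaH)": 78.0,
    "Listeria monocytogenes (hlyA)": 82.0,
}

def classify_amplicon(tm_peak, tolerance=1.0):
    """Return the pathogen whose signature Tm matches the observed peak."""
    for pathogen, tm in TM_SIGNATURES.items():
        if abs(tm_peak - tm) <= tolerance:
            return pathogen
    return "no target detected"

print(classify_amplicon(78.3))  # falls within the Shigella window
```

Because the three signature temperatures are well separated, a single fluorescent dye (here EvaGreen) suffices to discriminate all three targets in one reaction.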
Procedia PDF Downloads 244
268 Meteosat Second Generation Image Compression Based on the Radon Transform and Linear Predictive Coding: Comparison and Performance
Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane
Abstract:
Image compression is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite allows the acquisition of 12 image files every 15 minutes, which results in large database sizes. The transform selected for image compression should contribute to reducing the data representing the images. The Radon transform retrieves the Radon points that represent the sum of the pixels in a given angle for each direction. Linear predictive coding (LPC) with filtering provides a good decorrelation of the Radon points using a predictor constituted by the Symmetric Nearest Neighbor (SNN) filter coefficients, which results in losses during decompression. Finally, Run Length Coding (RLC) gives a high and fixed compression ratio regardless of the input image. In this paper, a novel image compression method based on the Radon transform and linear predictive coding (LPC) for MSG images is proposed. MSG image compression based on the Radon transform and LPC provides a good compromise between compression and quality of reconstruction. A comparison of our method with three others, two based on the DCT and one on DWT bi-orthogonal filtering, is carried out to show the power of the Radon transform in its resistance to quantization noise and to evaluate the performance of our method. Evaluation criteria such as the PSNR and the compression ratio show the efficiency of our compression method.Keywords: image compression, radon transform, linear predictive coding (LPC), run length coding (RLC), meteosat second generation (MSG)
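Run-length coding, the final stage of the pipeline described above, can be sketched in a few lines. This is a generic value/run-length RLE over a symbol sequence, not necessarily the exact variant used by the authors:

```python
# Minimal run-length coding (RLC) sketch: encode a sequence as
# (value, run-length) pairs and decode it back. Long runs of equal
# symbols (e.g. zeroed prediction residuals) compress well.

def rlc_encode(data):
    runs = []
    for value in data:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([value, 1])   # start a new run
    return [tuple(r) for r in runs]

def rlc_decode(runs):
    out = []
    for value, length in runs:
        out.extend([value] * length)
    return out

residuals = [0, 0, 0, 5, 5, 0, 0, 0, 0, 2]   # e.g. quantized LPC residuals
encoded = rlc_encode(residuals)
assert rlc_decode(encoded) == residuals      # lossless round trip
```

RLC itself is lossless; in the chain above, the losses come from the LPC/quantization stage that precedes it, which is why it delivers a high ratio without further degrading reconstruction quality.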
Procedia PDF Downloads 421
267 Biochar Assisted Municipal Wastewater Treatment and Nutrient Recycling
Authors: A. Pokharel, A. Farooque, B. Acharya
Abstract:
Pyrolysis can be used for energy production from waste biomass from agriculture and forestry. Biochar is the solid byproduct of pyrolysis, and its cascading use can offset the cost of the process. A wide variety of research on biochar has highlighted its ability to adsorb nutrients, metals and complex compounds; filter suspended solids; enhance the growth of microorganisms; retain water and nutrients; and increase the carbon content of soil. In addition, sustainable biochar systems are an attractive approach to carbon sequestration and total waste management. Commercially available biochar from Sigma-Aldrich was studied for the adsorption of nitrogen from the effluent of a municipal wastewater treatment plant. The adsorption isotherm and breakthrough curve were determined for the biochar. Similarly, biochar's effects in aerobic as well as anaerobic bioreactors were studied; in both cases, the biomass increased in the presence of biochar. The amount of gas produced by anaerobic digestion of a fruit mix (apple and banana) was similar, but the rate of production was significantly faster in biochar-fed reactors. The cumulative goal of the study is to use biochar in various wastewater treatment units, such as the aeration tank, secondary clarifier and tertiary nutrient recovery system, as well as in anaerobic digestion of the sludge, to optimize utilization and add value before it is used as a soil amendment.Keywords: biochar, nutrient recycling, wastewater treatment, soil amendment
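Adsorption isotherms like the one determined in this study are commonly fitted with the Langmuir model. The sketch below evaluates a Langmuir isotherm; the two constants are hypothetical fitted values for illustration, not results from the study.

```python
# Illustrative Langmuir isotherm, a common model for nutrient adsorption on
# biochar: q = q_max * K * c / (1 + K * c). Q_MAX and K_L are hypothetical
# fitted constants, not values from the study.

Q_MAX = 5.2   # mg N per g biochar, assumed maximum adsorption capacity
K_L = 0.8     # L/mg, assumed Langmuir affinity constant

def langmuir_q(c_eq):
    """Adsorbed amount q (mg/g) at equilibrium concentration c_eq (mg/L)."""
    return Q_MAX * K_L * c_eq / (1 + K_L * c_eq)
```

The model captures the saturation behaviour seen in breakthrough experiments: at low concentration the uptake grows almost linearly, while at high concentration it plateaus at Q_MAX.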
Procedia PDF Downloads 148
266 Artificial Intelligence-Based Chest X-Ray Test of COVID-19 Patients
Authors: Dhurgham Al-Karawi, Nisreen Polus, Shakir Al-Zaidi, Sabah Jassim
Abstract:
The management of COVID-19 patients based on chest imaging is emerging as an essential tool for evaluating the spread of the pandemic, which has gripped the global community. It has already been used to monitor the situation of COVID-19 patients with compromised respiratory status. The use of chest imaging for the medical triage of patients showing moderate-to-severe clinical COVID-19 features has increased due to the fast spread of the pandemic to all continents and communities. This article demonstrates the development of machine learning techniques for testing COVID-19 patients using chest X-ray (CXR) images in near real-time, to distinguish COVID-19 infection with a significantly high level of accuracy. The testing performance has covered a combination of different datasets of CXR images of COVID-19-positive patients, patients with viral and bacterial infections, and people with clear chests. The proposed AI scheme successfully distinguishes CXR scans of COVID-19-infected patients from CXR scans of viral and bacterial pneumonia as well as normal cases, with an average accuracy of 94.43%, sensitivity of 95%, and specificity of 93.86%. Predicted decisions are supported by visual evidence to help clinicians speed up the initial assessment of new suspected cases, especially in resource-constrained environments.Keywords: COVID-19, chest x-ray scan, artificial intelligence, texture analysis, local binary pattern transform, Gabor filter
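The local binary pattern transform named in the keywords can be sketched in a few lines of pure Python. This is the textbook 8-neighbour LBP for a single pixel (threshold the neighbours against the centre and pack the comparison bits into a byte), not the authors' full texture-analysis pipeline:

```python
# Basic 8-neighbour local binary pattern (LBP) for one pixel: threshold each
# neighbour against the centre value and pack the bits, starting at the
# top-left neighbour and proceeding clockwise.

NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]

def lbp_code(image, y, x):
    """LBP code (0-255) of the pixel at (y, x) in a 2D list-of-rows image."""
    centre = image[y][x]
    code = 0
    for bit, (dy, dx) in enumerate(NEIGHBOURS):
        if image[y + dy][x + dx] >= centre:
            code |= 1 << bit
    return code

patch = [[9, 1, 9],
         [1, 5, 1],
         [9, 1, 9]]
# Corners (9) are >= centre (5), edges (1) are not, so bits 0, 2, 4, 6
# are set: lbp_code(patch, 1, 1) == 0b01010101 == 85
```

A histogram of such codes over an image (often combined with Gabor filter responses) yields the texture descriptor that classifiers like the one in this article are trained on.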
Procedia PDF Downloads 145
265 Kinetics of Hydrogen Sulfide Removal from Biogas Using Biofilm on Packed Bed of Salak Fruit Seeds
Authors: Retno A. S. Lestari, Wahyudi B. Sediawan, Siti Syamsiah, Sarto
Abstract:
Sulfur-oxidizing bacteria were isolated and then grown on salak fruit seeds, forming a biofilm on their surface. Their performance in sulfide removal was observed experimentally. To do so, the salak fruit seeds bearing the biofilm were used as packing material in a cylinder. Biogas obtained from biological treatment, containing 27.95 ppm of hydrogen sulfide, was passed through the packed bed. The hydrogen sulfide from the biogas was absorbed into the biofilm and then degraded by the microbes in the biofilm. The hydrogen sulfide concentrations at various axial positions and times were analyzed. A set of simple kinetic models for the rate of sulfide removal and for bacterial growth was proposed. Since the biofilm is very thin, the sulfide concentration in the biofilm at a given axial position is assumed to be uniform. The resulting simultaneous ordinary differential equations were then solved numerically using the Runge-Kutta method, and the values of the parameters were obtained by curve-fitting. The accuracy of the proposed model was tested by comparing the calculated results with the experimental data; it turned out that the model can describe sulfide removal in the packed-bed biofilter. The biofilter removed 89.83% of the hydrogen sulfide in the feed after 2.5 hr of operation at a biogas flow rate of 30 L/hr.Keywords: sulfur-oxidizing bacteria, salak fruit seeds, biofilm, packing material, biogas
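The coupled removal/growth equations solved in the paper are not given in the abstract, but the numerical approach can be illustrated with a hypothetical first-order removal model, dc/dz = -k c, integrated along the bed with the classical fourth-order Runge-Kutta method. The rate constant, bed length and step size below are assumed values chosen for illustration, not the fitted parameters from the study.

```python
# Illustrative classical Runge-Kutta (RK4) integration of a hypothetical
# first-order sulfide removal model dc/dz = -k*c along the packed bed.
# k, the bed length and the step size are assumptions, not fitted values.
import math

def rk4_step(f, z, c, h):
    """One RK4 step of size h for dc/dz = f(z, c)."""
    k1 = f(z, c)
    k2 = f(z + h / 2, c + h * k1 / 2)
    k3 = f(z + h / 2, c + h * k2 / 2)
    k4 = f(z + h, c + h * k3)
    return c + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

k = 2.3                        # assumed removal rate constant, 1/m
removal = lambda z, c: -k * c

c, z, h = 27.95, 0.0, 0.01     # inlet concentration (ppm), position (m), step
while z < 1.0 - 1e-9:          # integrate along an assumed 1 m bed
    c = rk4_step(removal, z, c, h)
    z += h

# RK4 closely tracks the analytical solution c0 * exp(-k * z)
assert abs(c - 27.95 * math.exp(-k)) < 1e-6
```

With the assumed k of about 2.3 per meter, the sketch predicts roughly 90% removal at the outlet, the same order as the 89.83% observed experimentally; the paper fits its own kinetic parameters to the measured axial concentration profiles.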
Procedia PDF Downloads 222