Search results for: discrete filter

480 Optrix: Energy Aware Cross Layer Routing Using Convex Optimization in Wireless Sensor Networks

Authors: Ali Shareef, Aliha Shareef, Yifeng Zhu

Abstract:

Energy minimization is of great importance in wireless sensor networks (WSNs) for extending battery lifetime. One of the key activities of nodes in a WSN is communication and the routing of their data to a centralized base station or sink. Routing using the shortest path to the sink is not the best solution, since it will cause nodes along this path to fail prematurely. We propose a cross-layer energy-efficient routing protocol, Optrix, that utilizes a convex formulation to maximize the lifetime of the network as a whole. We further propose Optrix-BW, a novel convex formulation with a bandwidth constraint that allows the channel conditions to be accounted for in routing. By considering this key channel parameter we demonstrate that Optrix-BW is capable of congestion control. Optrix is implemented in TinyOS, and we demonstrate that a relatively large topology of 40 nodes can converge to within 91% of the optimal routing solution. We describe the pitfalls and issues related to utilizing a continuous-form technique such as convex optimization with the discrete, packet-based communication systems found in WSNs. We propose a routing controller mechanism that allows for this transformation. We compare Optrix against the Collection Tree Protocol (CTP) and find that Optrix performs better than CTP in terms of convergence to an optimal routing solution, load balancing, and network lifetime maximization.
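
The lifetime-maximization idea described above can be illustrated with a small linear (hence convex) program over total per-link traffic, sketched below with cvxpy. The four-node topology, per-packet energies, generation rates, and battery capacities are invented for the example; the actual Optrix formulation and its bandwidth-constrained variant (Optrix-BW) are not reproduced here.

```python
import cvxpy as cp
import numpy as np

# nodes 0-2 are sensors, node 3 is the sink; topology, rates and energies are invented
links = [(0, 1), (0, 3), (1, 3), (2, 1), (2, 3)]
gen = np.array([1.0, 1.0, 1.0])            # packets/s generated by each sensor
e_tx, e_rx = 2e-3, 1e-3                    # assumed energy (J) per packet transmitted / received
battery = np.array([50.0, 50.0, 50.0])     # initial energy (J) of each sensor

f = cp.Variable(len(links), nonneg=True)   # total packets carried by each link over the lifetime
T = cp.Variable(nonneg=True)               # network lifetime (s), the quantity to maximize

constraints = []
for v in range(3):                         # sensors only; the sink just absorbs traffic
    out_f = sum(f[i] for i, (a, _) in enumerate(links) if a == v)
    in_f = sum(f[i] for i, (_, b) in enumerate(links) if b == v)
    constraints.append(out_f - in_f == gen[v] * T)                # flow conservation over the lifetime
    constraints.append(e_tx * out_f + e_rx * in_f <= battery[v])  # battery budget

problem = cp.Problem(cp.Maximize(T), constraints)
problem.solve()
print("max lifetime (s):", round(float(T.value), 1))
print("per-link packet totals:", np.round(f.value, 1))
```

Keeping the decision variables as total packets carried over the lifetime makes both the flow-conservation and energy constraints linear in (f, T), which is what allows a convex solver to find the optimum directly.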

Keywords: wireless sensor network, energy-efficient routing

Procedia PDF Downloads 372
479 Stereo Camera Based Speed-Hump Detection Process for Real Time Driving Assistance System in the Daytime

Authors: Hyun-Koo Kim, Yong-Hun Kim, Soo-Young Suk, Ju H. Park, Ho-Youl Jung

Abstract:

This paper presents an effective speed hump detection process for the daytime. We focus only on round types of speed humps in the daytime dynamic road environment. The proposed scheme consists mainly of two processes: stereo matching and speed hump detection; this paper focuses on the speed hump detection process, which consists of a noise reduction step, a data fusion step, and a speed hump detection step. The proposed system is tested on an Intel Core CPU at 2.80 GHz with 4 GB RAM in urban road environments. The frame rate of the test videos is 30 frames per second, and the size of each frame of the grabbed image sequences is 1280 pixels by 670 pixels. Using object-marked sequences acquired with an on-vehicle camera, we recorded speed hump and non-speed-hump samples. The test results show that the proposed method can be applied in real-time systems, with a computation time of 13 ms, and achieves a detection rate of 96.1%.

Keywords: data fusion, round-type speed hump, speed hump detection, surface filter

Procedia PDF Downloads 496
478 Comparing Two Interventions for Teaching Math to Pre-School Students with Autism

Authors: Hui Fang Huang Su, Jia Borror

Abstract:

This study compared two interventions for teaching math to preschool-aged students with autism spectrum disorder (ASD). The first is considered the business-as-usual (BAU) intervention, which uses the Strategies for Teaching Based on Autism Research (STAR) curriculum and discrete trial teaching as the instructional methodology. The second is the Math is Not Difficult (Project MIND) activity-embedded, naturalistic intervention. These interventions were randomly assigned to four preschool classrooms of students with ASD and implemented over three months for Project MIND; measurements gathered during the same three months were used for the STAR intervention. In addition, we used a quasi-experimental, pre-test/post-test design to compare the effectiveness of these two interventions in building mathematical knowledge and skills. The pre-post measures include three standardized instruments: the Test of Early Math Ability-3, the Problem Solving and Calculation subtests of the Woodcock-Johnson Test of Achievement IV, and the Bracken Test of Basic Concepts-3 Receptive. The STAR curriculum-based assessment is administered to all Baudhuin students three times per year, and we used the results in this study. We anticipated that implementing these two approaches would improve the mathematical knowledge and skills of children with ASD. Still, it is crucial to determine whether a behavioral or a naturalistic teaching approach leads to more significant results.

Keywords: early learning, autism, math for pre-schoolers, special education, teaching strategies

Procedia PDF Downloads 144
477 Numerical Simulation of Fracturing Behaviour of Pre-Cracked Crystalline Rock Using a Cohesive Grain-Based Distinct Element Model

Authors: Mahdi Saadat, Abbas Taheri

Abstract:

Understanding the cracking response of crystalline rocks at the mineralogical scale is of great importance during the design procedure of mining structures. A grain-based distinct element model (GBM) is employed to numerically study the cracking response of Barre granite at micro- and macro-scales. The GBM framework is augmented with a proposed distinct element-based cohesive model to reproduce the micro-cracking response of the inter- and intra-grain contacts. The cohesive GBM framework is implemented in the PFC2D distinct element code. The microstructural properties of Barre granite are imported into PFC2D to generate synthetic specimens. The microproperties of the model are calibrated against laboratory uniaxial compressive and Brazilian split tensile tests. The calibrated model is then used to simulate the fracturing behaviour of pre-cracked Barre granite with different flaw configurations. The numerical results of the proposed model demonstrate good agreement with their experimental counterparts. The proposed GBM framework thus appears promising for further investigation of the influence of grain microstructure and mineralogical properties on the cracking behaviour of crystalline rocks.

Keywords: discrete element modelling, cohesive grain-based model, crystalline rock, fracturing behavior

Procedia PDF Downloads 111
476 Heuristic Classification of Hydrophone Recordings

Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas

Abstract:

An unsupervised machine listening system is constructed and applied to a dataset of 17,195 30-second marine hydrophone recordings. The system is then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each of the audio files: the spectral centroid, root-mean-squared values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way both time- and frequency-domain information are contained in the features to be passed to a clustering algorithm. Classification is performed using the k-means algorithm and then a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. Hypothesized class labels are 'primarily anthrophony' and 'primarily biophony', where the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
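
The sketch below gives a rough, self-contained illustration of the feature-extraction and clustering pipeline: spectral centroid, RMS, and MFCC features are averaged over 5-second frames and the recordings are grouped with k-means, with a k-nearest-neighbour index fitted for later lookups. The synthetic 'recordings', the single broadband RMS value standing in for the 10-octave filter-bank RMS features, and parameter choices such as k and the number of MFCCs are assumptions for illustration, not the paper's configuration.

```python
import numpy as np
import librosa
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

SR = 22050

def recording_features(y, sr=SR, frame_s=5.0):
    """Mean spectral centroid, RMS and MFCCs over 5-second frames of one recording."""
    hop = int(frame_s * sr)
    feats = []
    for start in range(0, len(y) - hop + 1, hop):
        frame = y[start:start + hop]
        centroid = librosa.feature.spectral_centroid(y=frame, sr=sr).mean()
        rms = librosa.feature.rms(y=frame).mean()          # stand-in for the per-band RMS values
        mfcc = librosa.feature.mfcc(y=frame, sr=sr, n_mfcc=13).mean(axis=1)
        feats.append(np.hstack([centroid, rms, mfcc]))
    return np.mean(feats, axis=0)                          # one feature vector per recording

# stand-in for the hydrophone dataset: white noise vs. a low-frequency hum
rng = np.random.default_rng(0)
recordings = [rng.standard_normal(30 * SR) for _ in range(5)] + \
             [np.sin(2 * np.pi * 120 * np.arange(30 * SR) / SR) for _ in range(5)]
X = np.vstack([recording_features(y) for y in recordings])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)   # k is a tunable choice
nn = NearestNeighbors(n_neighbors=3).fit(X)                   # used afterwards for k-NN lookups
print(km.labels_)
```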

Keywords: anthrophony, hydrophone, k-means, machine learning

Procedia PDF Downloads 150
475 A New 3D Shape Descriptor Based on Multi-Resolution and Multi-Block CS-LBP

Authors: Nihad Karim Chowdhury, Mohammad Sanaullah Chowdhury, Muhammed Jamshed Alam Patwary, Rubel Biswas

Abstract:

In a content-based 3D shape retrieval system, achieving high search performance has become an important research problem. A challenging aspect of this problem is to find an effective shape descriptor which can discriminate similar shapes adequately. To address this problem, we propose a new shape descriptor for 3D shape models by combining multi-resolution filtering with the multi-block center-symmetric local binary pattern (CS-LBP) operator. Given an arbitrary 3D shape, we first apply pose normalization and generate a set of multi-viewed 2D rendered images. Second, we apply a Gaussian multi-resolution filter to generate several image levels from each 2D rendered image. Then, overlapped sub-images are computed for each image level of a multi-resolution image. Our unique multi-block CS-LBP comes next; it allows the center to be composed of an m-by-n block of pixels instead of a single pixel. This process is repeated for all the 2D rendered images, derived from both ‘depth-buffer’ and ‘silhouette’ rendering. Finally, we concatenate all the feature vectors into a one-dimensional histogram as our proposed 3D shape descriptor. Through several experiments, we demonstrate that our proposed 3D shape descriptor outperforms previous methods on a benchmark dataset.
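
For reference, the basic (single-pixel) CS-LBP operator compares the four centre-symmetric pixel pairs of each 3x3 neighbourhood, producing a 4-bit code per pixel that is then histogrammed. The sketch below shows only this basic operator on a random image; the threshold is an arbitrary choice, and the multi-block and multi-resolution extensions proposed in the paper are not implemented here.

```python
import numpy as np

def cs_lbp(img, threshold=0.01):
    """4-bit centre-symmetric LBP code for every interior pixel of a grayscale image."""
    img = img.astype(float)
    h, w = img.shape
    # the four centre-symmetric neighbour pairs of the 3x3 neighbourhood
    pairs = [((-1, -1), (1, 1)), ((-1, 0), (1, 0)), ((-1, 1), (1, -1)), ((0, 1), (0, -1))]
    code = np.zeros((h - 2, w - 2), dtype=int)
    for bit, ((dy1, dx1), (dy2, dx2)) in enumerate(pairs):
        a = img[1 + dy1:h - 1 + dy1, 1 + dx1:w - 1 + dx1]
        b = img[1 + dy2:h - 1 + dy2, 1 + dx2:w - 1 + dx2]
        code |= ((a - b) > threshold).astype(int) << bit
    return code

# histogram of the 16 possible codes for one (random) sub-image
hist = np.bincount(cs_lbp(np.random.rand(64, 64)).ravel(), minlength=16)
print(hist)
```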

Keywords: 3D shape retrieval, 3D shape descriptor, CS-LBP, overlapped sub-images

Procedia PDF Downloads 429
474 Iterative Dynamic Programming for 4D Flight Trajectory Optimization

Authors: Kawser Ahmed, K. Bousson, Milca F. Coelho

Abstract:

4D flight trajectory optimization is one of the key ingredients for improving flight efficiency and enhancing air traffic capacity in current air traffic management (ATM). The present paper explores iterative dynamic programming (IDP) as a potential numerical optimization method for 4D flight trajectory optimization. IDP is an iterative version of the dynamic programming (DP) method. Due to its numerical framework, DP is very suitable for dealing with nonlinear discrete dynamic systems. The 4D waypoint representation of the flight trajectory is similar to discretization by a grid system; thus DP is a natural method for 4D flight trajectory optimization. However, the computational time and space complexity demanded by DP are enormous due to the immense number of grid points required to find the optimum, which prevents the use of DP in many practical high-dimensional problems. On the other hand, IDP has shown potential to deal successfully with high-dimensional optimal control problems even with a small number of grid points at each stage, which reduces the computational effort compared with the traditional DP approach. Although IDP has been applied successfully to chemical engineering problems, it is yet to be validated on 4D flight trajectory optimization problems. In this paper, IDP is successfully used to generate a minimum-length optimal 4D trajectory that avoids any obstacle in its path, such as a no-fly zone or residential areas when flying at low altitude to reduce noise pollution.
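
The sketch below illustrates the two ingredients in a deliberately simplified 2D setting: a dynamic-programming pass over candidate waypoints at fixed stages, followed by the IDP-style step of shrinking the candidate grid around the best path and re-solving. The cost model, the no-fly rectangle, the grid sizes, and the contraction schedule are all invented, and the paper's full 4D formulation (including time and altitude) is not reproduced.

```python
import numpy as np

NOFLY = ((4.0, 6.0), (4.0, 6.0))          # x-range and y-range of a made-up no-fly zone

def leg_cost(p, q):
    """Euclidean leg length; infinite if the next waypoint lies inside the no-fly zone."""
    if NOFLY[0][0] <= q[0] <= NOFLY[0][1] and NOFLY[1][0] <= q[1] <= NOFLY[1][1]:
        return np.inf
    return float(np.hypot(q[0] - p[0], q[1] - p[1]))

def dp_pass(xs, y_grids, start=(0.0, 5.0)):
    """One dynamic-programming pass over candidate y-values at each x-stage."""
    n = len(xs)
    cost = [np.full(len(g), np.inf) for g in y_grids]
    choice = [np.zeros(len(g), dtype=int) for g in y_grids]
    cost[0] = np.array([leg_cost(start, (xs[0], y)) for y in y_grids[0]])
    for k in range(1, n):
        for j, y in enumerate(y_grids[k]):
            c = cost[k - 1] + np.array([leg_cost((xs[k - 1], yp), (xs[k], y))
                                        for yp in y_grids[k - 1]])
            choice[k][j] = int(np.argmin(c))
            cost[k][j] = c[choice[k][j]]
    j = int(np.argmin(cost[-1]))                  # backtrack the cheapest path
    path = [y_grids[-1][j]]
    for k in range(n - 1, 0, -1):
        j = choice[k][j]
        path.append(y_grids[k - 1][j])
    return list(reversed(path)), float(np.min(cost[-1]))

xs = np.linspace(1.0, 10.0, 10)                   # fixed x-stages (the "4D" part is omitted)
y_grids = [np.linspace(0.0, 10.0, 7) for _ in xs] # coarse candidate grid at every stage
for it in range(4):                               # IDP: re-solve on a grid shrunk around the best path
    path, total = dp_pass(xs, y_grids)
    half = 2.0 / (it + 1)
    y_grids = [np.linspace(max(0.0, y - half), min(10.0, y + half), 7) for y in path]
print("best y per stage:", np.round(path, 2), "length:", round(total, 2))
```

Because each refinement keeps only a handful of candidates per stage, the number of cost evaluations stays small even as the final path becomes finely resolved, which is the appeal of IDP over a single densely gridded DP pass.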

Keywords: 4D waypoint navigation, iterative dynamic programming, obstacle avoidance, trajectory optimization

Procedia PDF Downloads 142
473 Estimating X-Ray Spectra for Digital Mammography by Using the Expectation Maximization Algorithm: A Monte Carlo Simulation Study

Authors: Chieh-Chun Chang, Cheng-Ting Shih, Yan-Lin Liu, Shu-Jun Chang, Jay Wu

Abstract:

With the widespread use of digital mammography (DM), radiation dose evaluation of breasts has become important. X-ray spectra are one of the key factors that influence the absorbed dose of glandular tissue. In this study, we estimated the X-ray spectrum of DM using the expectation maximization (EM) algorithm with the transmission measurement data. The interpolating polynomial model proposed by Boone was applied to generate the initial guess of the DM spectrum with the target/filter combination of Mo/Mo and the tube voltage of 26 kVp. The Monte Carlo N-particle code (MCNP5) was used to tally the transmission data through aluminum sheets of 0.2 to 3 mm. The X-ray spectrum was reconstructed by using the EM algorithm iteratively. The influence of the initial guess for EM reconstruction was evaluated. The percentage error of the average energy between the reference spectrum inputted for Monte Carlo simulation and the spectrum estimated by the EM algorithm was -0.14%. The normalized root mean square error (NRMSE) and the normalized root max square error (NRMaSE) between both spectra were 0.6% and 2.3%, respectively. We conclude that the EM algorithm with transmission measurement data is a convenient and useful tool for estimating x-ray spectra for DM in clinical practice.
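
The iterative update behind this kind of EM spectrum reconstruction can be written as a multiplicative correction driven by the ratio of measured to predicted transmissions. The sketch below demonstrates that update on a purely synthetic problem: the attenuation curve, aluminum thicknesses, and 'reference' spectrum are toy values, not the Mo/Mo 26 kVp model or the MCNP5 transmission tallies used in the study.

```python
import numpy as np

energies = np.linspace(5.0, 26.0, 50)                 # keV bins up to the 26 kVp endpoint
mu_al = 2.0 * (10.0 / energies) ** 3                  # toy Al attenuation curve (1/cm), ~E^-3 shape
thicknesses = np.arange(0.02, 0.31, 0.02)             # Al thicknesses in cm (0.2 to 3 mm)
A = np.exp(-np.outer(thicknesses, mu_al))             # forward model: A[i, j] = exp(-mu(E_j) * t_i)

reference = np.exp(-0.5 * ((energies - 17.0) / 3.0) ** 2)   # synthetic "true" spectrum
reference /= reference.sum()
measured = A @ reference                              # noiseless transmission measurements

spectrum = np.full_like(reference, 1.0 / reference.size)    # flat initial guess
for _ in range(500):                                  # EM (MLEM-style) multiplicative updates
    ratio = measured / (A @ spectrum)
    spectrum *= (A.T @ ratio) / A.sum(axis=0)

mean_E = lambda w: float(np.sum(energies * w) / np.sum(w))
print(f"mean energy: reference {mean_E(reference):.2f} keV, EM estimate {mean_E(spectrum):.2f} keV")
```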

Keywords: digital mammography, expectation maximization algorithm, X-Ray spectrum, X-Ray

Procedia PDF Downloads 707
472 Building an Ontology for Researchers: An Application of Topic Maps and Social Information

Authors: Yu Hung Chiang, Hei Chia Wang

Abstract:

In academia, it is important for researchers to find a proper research domain. Many researchers refer to conference issues to find interesting or new topics. Furthermore, conference issues can help researchers recognize current research trends in their field and learn about cutting-edge developments in their specialty. However, conference information published online is widely distributed and not easy to consolidate. Many researchers use the search engines of journals or conference issues to filter information in order to get what they want, but such search engines have limitations; for instance, researchers cannot find the associated topics which may be useful information for them. Hence, Knowledge Management (KM) could be a way to resolve these issues. In KM, ontologies are widely adopted, but most existing ontology construction methods do not consider social information among target users. To be effective in academic KM, this study proposes a method of constructing research Topic Maps using the Open Directory Project (ODP) and Social Information Processing (SIP). By capturing social information from conference websites, i.e., information on co-authorship or collaborators, research topics can be associated among related researchers. Finally, the experiments show that the Topic Maps successfully help researchers find the information they need more easily and quickly, as well as construct associations between research topics.

Keywords: knowledge management, topic map, social information processing, ontology extraction

Procedia PDF Downloads 275
471 Integrated Wastewater Reuse Project of the Faculty of Sciences Ain Chock, Morocco

Authors: Nihad Chakri, Btissam El Amrani, Faouzi Berrada, Fouad Amraoui

Abstract:

In Morocco, water scarcity requires the exploitation of non-conventional resources. Rural areas are under-equipped with sanitation infrastructure, unlike urban areas. Decentralized and low-cost solutions could improve the quality of life of the population and the environment. In this context, the Faculty of Sciences Ain Chock "FSAC" has undertaken an integrated project to treat part of its wastewater using a decentralized compact system. The project will propose alternative solutions that are inexpensive and adapted to the context of peri-urban and rural areas in order to treat the wastewater generated and use it for irrigation, watering, and cleaning. For this purpose, several tests were carried out in the laboratory in order to develop a liquid waste treatment system optimized for local conditions. Based on the results obtained at the laboratory scale of the different proposed scenarios, we designed and implemented a prototype of a mini wastewater treatment plant for the Faculty. In this article, we will outline the steps of dimensioning, construction, and monitoring of the mini-station in our Faculty.

Keywords: wastewater, purification, optimization, vertical filter, MBBR process, sizing, decentralized pilot, reuse, irrigation, sustainable development

Procedia PDF Downloads 98
470 Biodiversity Indices for Macrobenthic Community structures of Mangrove Forests, Khamir Port, Iran

Authors: Mousa Keshavarz, Abdul-Reza Dabbagh, Maryam Soyuf Jahromi

Abstract:

The diversity of mangrove macrobenthos assemblages in the mudflat and mangrove ecosystems of Port Khamir, Iran was investigated for one year. During this period, we measured the physicochemical properties of the water (temperature, salinity, pH, and DO) and the density and distribution of the macrobenthos. We sampled a total of 9 transects, at three different topographic levels along the intertidal zone, at three stations. Assemblages at class level were compared. The five most diverse and abundant classes were Foraminifers (54%), Gastropods (23%), Polychaetes (10%), Bivalves (8%), and Crustaceans (5%). Overall densities were 1869 ± 424 ind/m2 (26%) in spring, 2544 ± 383 ind/m2 (36%) in summer, 1482 ± 323 ind/m2 (21%) in autumn, and 1207 ± 80 ind/m2 (17%) in winter. Along the intertidal zone, the overall relative densities of individuals at the high, intermediate, and low topographic levels were 40, 30, and 30%, respectively. Biodiversity indices were used to compare the different classes: Gastropoda (Shannon index: 0.33) and Foraminifera (Simpson index: 0.28) had the highest scores. Other bio-indices were also calculated. With the exception of bivalves, filter feeders were associated with coarser sediments at higher intertidal levels, while deposit feeders were associated with finer sediments at lower levels. Salinity was the most important factor acting on community structure, while DO and pH had little influence.
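
As a small worked example of the index calculations mentioned above, the sketch below computes community-level Shannon and Simpson values from the relative class abundances quoted in the abstract. This is only a formula illustration; it does not reproduce the per-class scores reported (0.33 and 0.28), which were computed from the full station-level count data.

```python
import numpy as np

# relative abundances (percent of individuals) taken from the abstract
counts = {"Foraminifers": 54, "Gastropods": 23, "Polychaetes": 10, "Bivalves": 8, "Crustaceans": 5}
p = np.array(list(counts.values()), dtype=float)
p /= p.sum()                                # convert to proportions

shannon = -np.sum(p * np.log(p))            # Shannon-Wiener H'
simpson = np.sum(p ** 2)                    # Simpson dominance index (lambda)
print(f"community-level H' = {shannon:.2f}, Simpson lambda = {simpson:.2f}")
```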

Keywords: macrobenthos, biodiversity, mangrove forest, Khamir Port

Procedia PDF Downloads 363
469 Signal Estimation and Closed Loop System Performance in Atrial Fibrillation Monitoring with Communication Channels

Authors: Mohammad Obeidat, Ayman Mansour

Abstract:

In this paper, a unique issue arising from feedback control of an atrial fibrillation monitoring system with embedded communication channels is investigated. One of the important factors in measuring the performance of a feedback-controlled closed-loop system is the disturbance and noise attenuation factor. It is important that the feedback system can attenuate such disturbances on the atrial fibrillation heart rate signals. Communication channels depend on network traffic conditions and deliver different throughput, implying that the sampling intervals may change. Since the signal estimate is updated on the arrival of new data, its dynamics actually change with the sampling interval. Consequently, the interaction among sampling, signal estimation, and the controller introduces new issues in a remotely controlled atrial fibrillation system. This paper treats a remotely controlled atrial fibrillation system with one communication channel, which connects the heart rate and rhythm measurements to the remote controller. Typical and optimal signal estimation schemes are represented by a signal-averaging filter whose time constant is derived from the step size of the signal estimation algorithm.
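
A minimal sketch of such a signal-averaging estimator is given below, assuming a fixed-step exponential update; the step size, the synthetic heart-rate trace, and the step-size-to-time-constant relation noted in the comment are illustrative assumptions rather than the paper's design.

```python
import numpy as np

def running_estimate(samples, step=0.1):
    """x_hat[k] = x_hat[k-1] + step * (y[k] - x_hat[k-1]); time constant is roughly Ts / step."""
    est = np.empty(len(samples))
    est[0] = samples[0]
    for k in range(1, len(samples)):
        est[k] = est[k - 1] + step * (samples[k] - est[k - 1])
    return est

rng = np.random.default_rng(0)
heart_rate = 120.0 + 15.0 * rng.standard_normal(200)   # noisy AF-like heart-rate samples (bpm)
smoothed = running_estimate(heart_rate, step=0.1)
print(smoothed[-5:].round(1))
```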

Keywords: atrial fibrillation, communication channels, closed loop, estimation

Procedia PDF Downloads 365
468 Design of a Real Time Heart Sounds Recognition System

Authors: Omer Abdalla Ishag, Magdi Baker Amien

Abstract:

Physicians use the stethoscope to listen to patient heart sounds in order to make a diagnosis. However, the determination of heart conditions with an acoustic stethoscope is a difficult task, so it requires special training of medical staff. This study developed an accurate model for analyzing the phonocardiograph signal based on a PC and a DSP processor. The system has been realized in two phases: an offline phase and a real-time phase. In the offline phase, 30 heart sound files were collected from medical students and the doctor's world website. For the experimental (real-time) phase, an electronic stethoscope was designed and implemented, and signals were recorded from 30 volunteers, of whom 17 were normal cases and 13 had various pathologies; these 30 acquired signals were preprocessed using an adaptive filter to remove lung sounds. Background noise was removed from both the offline and real data using the wavelet transform; then graphical and statistical feature vector elements were extracted, and finally a look-up table was used to classify the heart sound cases. The implemented system showed accuracies of 90% and 80% and sensitivities of 87.5% and 82.4% for the offline and real data, respectively. The whole system has been designed on a TMS320VC5509a DSP platform.
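
The wavelet denoising stage can be illustrated with a soft-thresholding sketch like the one below; the wavelet family (db6), decomposition level, universal threshold rule, and the toy phonocardiogram-like test signal are typical choices assumed for the example, not necessarily those used in the paper.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db6", level=5):
    """Soft-threshold the detail coefficients and reconstruct the signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise estimate from the finest scale
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))         # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

fs = 4000
t = np.arange(0, 2.0, 1.0 / fs)
pcg = np.sin(2 * np.pi * 60 * t) * (t % 0.8 < 0.1) + 0.2 * np.random.randn(len(t))  # toy PCG-like signal
clean = wavelet_denoise(pcg)
```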

Keywords: code composer studio, heart sounds, phonocardiograph, wavelet transform

Procedia PDF Downloads 424
467 Numerical Analysis of a Pilot Solar Chimney Power Plant

Authors: Ehsan Gholamalizadeh, Jae Dong Chung

Abstract:

A solar chimney power plant is a feasible solar thermal system which produces electricity from the Sun. The objective of this study is to investigate buoyancy-driven flow and heat transfer through a built pilot solar chimney system called the 'Kerman Project'. The system has a chimney with a height of 60 m and a diameter of 3 m, the average radius of its solar collector is about 20 m, and its average collector height is about 2 m. A three-dimensional simulation was conducted to analyze the system using computational fluid dynamics (CFD). In this model, the radiative transfer equation was solved using the discrete ordinates (DO) radiation model, taking into account non-gray radiation behavior. In order to model solar irradiation from the sun’s rays, the solar ray tracing algorithm was coupled to the computation via a source term in the energy equation. The model was validated by comparison with the experimental data of the Manzanares prototype and with the performance of the built pilot system. Then, based on the numerical simulations, the velocity and temperature distributions through the system, the temperature profile of the ground surface, and the system performance were presented. The analysis accurately shows the flow and heat transfer characteristics through the pilot system and predicts its performance.

Keywords: buoyancy-driven flow, computational fluid dynamics, heat transfer, renewable energy, solar chimney power plant

Procedia PDF Downloads 241
466 Continuous Blood Pressure Measurement from Pulse Transit Time Techniques

Authors: Chien-Lin Wang, Cha-Ling Ko, Tainsong Chen

Abstract:

Blood pressure (BP) is one of the vital signs and is an index that helps in determining the stability of a patient's condition. In this respect, some spinal cord injury patients need to take the tilt table test. While doing the test, the posture changes abruptly, which may cause a patient’s BP to change abnormally. This may cause patients to feel discomfort, and even to feel as though their life is threatened. Therefore, if a continuous non-invasive BP assessment system were built, it could help to alert health care professionals during rehabilitation when the BP value is out of range. In our research, a BP assessment system based on the pulse transit time (PTT) technique was developed. In the system, we use a self-made photoplethysmograph (PPG) sensor and filter circuit to detect two PPG signals and to calculate the time difference between them. The BP can immediately be assessed from the trend line. According to the results of this study, the relationship between systolic BP and PTT shows a highly negative linear correlation (R²=0.8). Further, we used the trend line to assess the value of the BP and compared it to a commercial sphygmomanometer (Omron MX3); the error rate of the system was found to be in the range of ±10%, which is within the permissible error range of a commercial sphygmomanometer. Continuous blood pressure measurement using the pulse transit time technique may have the potential to become a convenient method for clinical rehabilitation.
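
A least-squares trend line of the kind described above can be fitted in a few lines, as sketched below with made-up calibration points; the slope, intercept, and the example PTT value are purely illustrative, not the study's calibration data.

```python
import numpy as np

ptt_ms = np.array([180, 190, 200, 210, 220, 230])     # hypothetical PTT values (ms)
sbp = np.array([138, 132, 127, 120, 116, 110])        # paired cuff readings (mmHg)

slope, intercept = np.polyfit(ptt_ms, sbp, 1)         # linear trend line SBP = slope*PTT + intercept
r2 = np.corrcoef(ptt_ms, sbp)[0, 1] ** 2              # goodness of fit

estimate = slope * 205 + intercept                    # estimate SBP from a new PTT of 205 ms
print(f"SBP = {slope:.2f}*PTT + {intercept:.1f} (R2 = {r2:.2f}); estimate: {estimate:.1f} mmHg")
```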

Keywords: continuous blood pressure measurement, PPG, pulse transit time, transit velocity

Procedia PDF Downloads 333
465 A Methodology for Investigating Public Opinion Using Multilevel Text Analysis

Authors: William Xiu Shun Wong, Myungsu Lim, Yoonjin Hyun, Chen Liu, Seongi Choi, Dasom Kim, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, many users have begun to frequently share their opinions on diverse issues using various social media. Therefore, numerous governments have attempted to establish or improve national policies according to the public opinions captured from various social media. In this paper, we indicate several limitations of the traditional approaches to analyzing public opinion on science and technology and provide an alternative methodology to overcome these limitations. First, we distinguish between the science and technology analysis phase and the social issue analysis phase to reflect the fact that public opinion can be formed only when a certain science and technology is applied to a specific social issue. Next, we successively apply a start list and a stop list to acquire clarified and interesting results. Finally, to identify the most appropriate documents that fit a given subject, we develop a new logical filter concept that consists of not only mere keywords but also a logical relationship among the keywords. This study then analyzes the possibilities for the practical use of the proposed methodology through its application to discover core issues and public opinions from 1,700,886 documents comprising SNS, blogs, news, and discussions.
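
A toy version of the logical filter idea is sketched below: a document passes only when it satisfies a Boolean combination of terms rather than matching any single keyword. The keyword sets and sample sentences are invented placeholders, not the study's start and stop lists.

```python
docs = [
    "new gene editing technique raises privacy concerns",
    "gene sequencing cost drops again",
    "privacy debate over smart meters",
]

def logical_filter(text, must_all=("gene",), must_any=("privacy", "ethics"), must_not=("sports",)):
    """Keep a document only if it satisfies a logical combination of keywords."""
    t = text.lower()
    return (all(k in t for k in must_all)
            and any(k in t for k in must_any)
            and not any(k in t for k in must_not))

matches = [d for d in docs if logical_filter(d)]
print(matches)   # only the document linking the technology to the social issue survives
```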

Keywords: big data, social network analysis, text mining, topic modeling

Procedia PDF Downloads 274
464 A Framework for Early Differential Diagnosis of Tropical Confusable Diseases Using the Fuzzy Cognitive Map Engine

Authors: Faith-Michael E. Uzoka, Boluwaji A. Akinnuwesi, Taiwo Amoo, Flora Aladi, Stephen Fashoto, Moses Olaniyan, Joseph Osuji

Abstract:

The overarching aim of this study is to develop a soft-computing system for the differential diagnosis of tropical diseases. These conditions are of concern to health bodies, physicians, and the community at large because of their mortality rates and the difficulty of early diagnosis, owing to the fact that they present with overlapping symptoms and thus become ‘confusable’. We report on the first phase of our study, which focuses on the development of a fuzzy cognitive map (FCM) model for early differential diagnosis of tropical diseases. We used malaria as a case disease to show the effectiveness of the FCM technology as an aid to the medical practitioner in the diagnosis of tropical diseases. Our model takes cognizance of manifested symptoms and other non-clinical factors that could contribute to symptom manifestation. Our model showed 85% accuracy in diagnosis, as against the physicians’ initial hypothesis, which stood at 55% accuracy. It is expected that the next stage of our study will provide a multi-disease, multi-symptom model that also improves efficiency by utilizing a decision support filter that works on an algorithm which mimics the physician’s diagnostic process.
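
To make the FCM mechanism concrete, the sketch below iterates a common FCM update rule (a sigmoid of the weighted causal inputs plus a self-memory term) on a tiny invented map; the concept names and weight matrix are illustrative only and do not come from the paper's expert-derived malaria model.

```python
import numpy as np

concepts = ["fever", "chills", "travel_to_endemic_area", "malaria"]
W = np.array([                      # W[i, j]: causal influence of concept i on concept j
    [0.0, 0.0, 0.0, 0.6],
    [0.0, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.7],
    [0.4, 0.3, 0.0, 0.0],
])
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

state = np.array([1.0, 1.0, 0.0, 0.0])        # observed symptom activations
for _ in range(20):                            # iterate until the map settles
    state = sigmoid(W.T @ state + state)       # weighted causal inputs plus self-memory term
print(dict(zip(concepts, np.round(state, 2))))
```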

Keywords: medical diagnosis, tropical diseases, fuzzy cognitive map, decision support filters, malaria differential diagnosis

Procedia PDF Downloads 298
463 Ambiguity Resolution for Ground-based Pulse Doppler Radars Using Multiple Medium Pulse Repetition Frequency

Authors: Khue Nguyen Dinh, Loi Nguyen Van, Thanh Nguyen Nhu

Abstract:

In this paper, we propose an adaptive method to resolve ambiguities and a ghost-target removal process to extract targets detected by a ground-based pulse-Doppler radar using medium pulse repetition frequency (PRF) waveforms. The ambiguity resolution method is an adaptive implementation of the coincidence algorithm, applied on a two-dimensional (2D) range-velocity matrix to resolve range and velocity ambiguities simultaneously, with a proposed clustering filter to enhance the anti-error ability of the system. Here we consider the scenario of multiple-target environments. The ghost-target removal process, which is based on the power after Doppler processing, is proposed to mitigate ghost detections and enhance the performance of ground-based radars using a short PRF schedule in multiple-target environments. Simulation results on a ground-based pulsed Doppler radar model will be presented to show the effectiveness of the proposed approach.
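
A one-dimensional toy version of the coincidence idea (range only) is sketched below: each PRF's ambiguous measurement is unfolded into its possible true ranges, and the candidate on which all PRFs agree is kept. The PRF values, target range, tolerance, and unfolding count are made-up numbers, and the paper's 2D range-velocity implementation with the clustering filter is not reproduced.

```python
import numpy as np

C = 3e8                                           # speed of light (m/s)
prfs = np.array([9e3, 11e3, 13e3])                # three medium PRFs (Hz), illustrative only
r_unamb = C / (2 * prfs)                          # unambiguous range of each PRF (m)
true_range = 41_500.0                             # metres, beyond every unambiguous range

apparent = true_range % r_unamb                   # the folded range each PRF actually measures
tol = 30.0                                        # agreement tolerance (m)

# unfold each PRF's measurement, then keep a candidate only if every PRF agrees with it
unfolded = [app + np.arange(12) * ru for app, ru in zip(apparent, r_unamb)]
candidates = np.concatenate(unfolded)
resolved = [r for r in candidates
            if all(np.min(np.abs(u - r)) < tol for u in unfolded)]
print("resolved range (m):", round(float(min(resolved)), 1))
```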

Keywords: ambiguity resolution, coincidence algorithm, medium PRF, ghosting removal

Procedia PDF Downloads 131
462 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton-flavour-violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e’s trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability, and it provides a multi-user, web-based interface accessible through the Chrome or Firefox web browser. The detector readout controllers (ROCs) of the tracker and calorimeter continuously stream zero-suppressed data to the data transfer controller (DTC). Data is then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux that comes from the cosmic ray veto system (CRV).

Keywords: trigger, DAQ, Mu2e, Fermilab

Procedia PDF Downloads 141
461 One-off Separation of Multiple Types of Oil-in-Water Emulsions with Surface-Engineered Graphene-Based Multilevel Structure Materials

Authors: Han Longxiang

Abstract:

In the process of treating industrial oil wastewater with complex components, the traditional treatment methods (flotation, coagulation, microwave heating, etc.) often produce high operating costs, secondary pollution, and other problems. In order to solve these problems, materials with high flux and stability applied to the separation of surfactant-stabilized emulsions have gained huge attention in the treatment of oily wastewater. Nevertheless, four stable oil-in-water emulsions can be formed with different surfactants (surfactant-free, anionic surfactant, cationic surfactant, and non-ionic surfactant), and previously reported advanced materials can only separate one or several of them and cannot perform the separation effectively in one step. Herein, a facile synthesis method for graphene-based multilevel filter materials (GMFM) is presented; the GMFM can efficiently separate oil-in-water emulsions stabilized with different surfactants, driven by gravity alone. The prepared materials remain stable over 20 cycles and show a high flux of ~5000 L m-2 h-1 with a high separation efficiency of >99.9%. The GMFM can effectively separate emulsions stabilized by mixed surfactants as well as oily wastewater from factories. The results indicate that the GMFM has a wide range of applications in oil-in-water emulsion separation in industry and environmental science.

Keywords: emulsion, filtration, graphene, one-step

Procedia PDF Downloads 67
460 Design of Raw Water Reservoir on Sandy Soil

Authors: Venkata Ramana Pamu

Abstract:

This paper is a case study of a 5310 ML capacity Raw Water Reservoir (RWR) situated in the Indian state of Rajasthan, which is part of the Rajasthan Rural Water Supply & Fluorosis Mitigation Project. The RWR embankment was constructed from locally available material on the natural ground profile. The height of the embankment varies from 2 m to 10 m because the existing ground level varies. The reservoir depth is 9 m, including a 1.5 m freeboard, and 1V:3H slopes were provided on both the upstream and downstream sides. Proper soil investigations and tests were carried out, and it was confirmed that the existing soil is sandy silt. The excavated earth was used as fill material for embankment construction; because of this, controlling seepage from upstream to downstream was a challenging task. Slope stability and seismic analyses of the embankment were carried out by conventional methods for both the full reservoir condition and rapid drawdown. A horizontal filter at toe level was provided, along with upstream-side PCC (plain cement concrete) blocks and HDPE (high-density polyethylene) lining, as a remedy to control seepage. HDPE lining was also provided over the storage area at reservoir bed level, and mulching was done for downstream slope protection.

Keywords: raw water reservoir, seepage, seismic analysis, slope stability

Procedia PDF Downloads 483
459 Visibility Measurements Using a Novel Open-Path Optical Extinction Analyzer

Authors: Nabil Saad, David Morgan, Manish Gupta

Abstract:

Visibility has become a key component of air quality and is regulated in many areas by environmental laws such as the EPA Clean Air Act and Regional Haze Rule. Typically, visibility is calculated by estimating the optical absorption and scattering of both gases and aerosols. A major component of the aerosols’ climatic effect is due to their scattering and absorption of solar radiation, which are governed by their optical and physical properties. However, the accurate assessment of this effect on global warming, climate change, and air quality is made difficult due to uncertainties in the calculation of single scattering albedo (SSA). Experimental complications arise in the determination of the single scattering albedo of an aerosol particle since it requires the simultaneous measurement of both scattering and extinction. In fact, aerosol optical absorption, in particular, is a difficult measurement to perform, and it’s often associated with large uncertainties when using filter methods or difference methods. In this presentation, we demonstrate the use of a new open-path Optical Extinction Analyzer (OEA) in conjunction with a nephelometer and two particle sizers, emphasizing the benefits that co-employment of the OEA offers to derive the complex refractive index of aerosols and their single scattering albedo parameter. Various use cases, data reproducibility, and instrument calibration will also be presented to highlight the value proposition of this novel Open-Path OEA.
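
The quantities discussed above are related by simple ratios, as in the sketch below: the single scattering albedo is the scattering-to-extinction ratio, absorption follows by difference, and a visual range can be estimated from extinction via the Koschmieder relation. The coefficient values are placeholders rather than actual OEA or nephelometer measurements.

```python
b_scat = 0.12e-3   # aerosol scattering coefficient (1/m), e.g. from a nephelometer
b_ext = 0.15e-3    # extinction coefficient (1/m), e.g. from an open-path extinction analyzer

ssa = b_scat / b_ext                        # single scattering albedo
b_abs = b_ext - b_scat                      # absorption obtained by difference
visual_range_km = 3.912 / b_ext / 1000.0    # Koschmieder estimate (2% contrast threshold)

print(f"SSA = {ssa:.2f}, absorption = {b_abs:.2e} 1/m, visual range = {visual_range_km:.1f} km")
```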

Keywords: aerosols, extinction, visibility, albedo

Procedia PDF Downloads 78
458 Investigation of Fire Damaged Concrete Using Nonlinear Resonance Vibration Method

Authors: Kang-Gyu Park, Sun-Jong Park, Hong Jae Yim, Hyo-Gyung Kwak

Abstract:

This paper attempts to evaluate the effect of fire damage on concrete by using the nonlinear resonance vibration method, one of the nonlinear nondestructive methods. Concrete exhibits not only a nonlinear stress-strain relation but also hysteresis and discrete memory effects, which are found in consolidated materials. Hysteretic materials typically show a shift of the linear resonance frequency, and the degree of the shift changes according to the degree of micro damage; the degree of the shift can be obtained through the nonlinear resonance vibration method. Five exposure scenarios were considered in order to produce different levels of internal micro damage. The effect of post-fire curing on fire-damaged concrete was also taken into account to confirm the change in internal damage. The hysteretic nonlinearity parameter was obtained from the amplitude-dependent resonance frequency shift after specific curing periods. In addition, the splitting tensile strength was measured on each sample to characterize the variation of the residual strength. Then, a correlation between the hysteretic nonlinearity parameter and the residual strength was proposed from the test results.
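
The parameter-extraction step usually amounts to fitting the relative resonance-frequency shift against excitation amplitude, as in the sketch below; the data points and the sign convention are invented for illustration and do not come from the paper's measurements.

```python
import numpy as np

amplitude = np.array([0.2, 0.4, 0.6, 0.8, 1.0])                 # normalized excitation amplitude
freq_hz = np.array([5210.0, 5204.0, 5199.0, 5193.0, 5188.0])    # measured resonance frequencies (made up)

f0 = freq_hz[0]
rel_shift = (freq_hz - f0) / f0                      # relative frequency shift (f - f0) / f0
alpha = -np.polyfit(amplitude, rel_shift, 1)[0]      # hysteretic nonlinearity parameter (fit slope)

print(f"alpha = {alpha:.2e} (a larger alpha indicates more micro damage)")
```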

Keywords: nonlinear resonance vibration method, nonlinearity parameter, splitting tensile strength, micro damage, post-fire curing, fire-damaged concrete

Procedia PDF Downloads 251
457 Double Beta Decay Experiments in Novi Sad

Authors: Nataša Todorović, Jovana Nikolov

Abstract:

Despite the great interest in β⁻β⁻ decay, β⁺β⁺ decays are rarely investigated due to the low probability of detecting these processes with available low-level equipment. If β⁺β⁺, β⁺EC, or ECEC decay occurs in a thin sample of a material, the positrons will be stopped and annihilated inside the material, leading to the emission of two or four coincident gamma photons with an energy of 511 keV. The paper presents the results of measurements of the double beta decay of the ⁶⁴Zn, ⁵⁰Cr, and ⁵⁴Fe isotopes. In the first experiment, 511-keV gamma rays originating from the annihilation of positrons in natural zinc were measured by a coincidence technique to obtain a non-zero value for the (0ν+2ν) half-life. In the second experiment, the result of measuring the double beta decay of ⁵⁰Cr is presented, which suggests a non-zero result at 95% CL and gives the lower limit for the half-life of this process. In the third experiment, the neutrino-less ECEC decay of ⁵⁴Fe was examined. According to the decay theory, gamma rays are emitted whose energy does not coincide with the energies of gamma rays emitted by nuclei from known discrete excited states. An iron shield with an internal volume of 1 m³ and a thickness of 25 cm served as the source for measuring the (0ν+2ν) process in ⁵⁴Fe, whose yield in natural iron is 5.4%. We obtain the following lower limits for the half-life of ⁵⁴Fe: T(0ν, K, K) > 4.4×10²⁰ yr, T(0ν, K, L) > 4.1×10²⁰ yr, and T(0ν, L, L) > 5.0×10²⁰ yr. For ⁵⁰Cr, the limit for the half-life is T(0ν+2ν) > 1.3(6)×10¹⁸ yr, and for ⁶⁴Zn, T(0ν+2ν, ECβ⁺) = 1.1(0.9)×10⁹ years.

Keywords: neutrinoless double beta decay, half-life, ⁶⁴Zn, ⁵⁰Cr, ⁵⁴Fe

Procedia PDF Downloads 93
456 Developing Structured Sizing Systems for Manufacturing Ready-Made Garments of Indian Females Using Decision Tree-Based Data Mining

Authors: Hina Kausher, Sangita Srivastava

Abstract:

In India, there is a lack of a standard, systematic sizing approach for producing ready-made garments. Garment manufacturing companies use their own size tables, created by modifying international sizing charts for ready-made garments. The purpose of this study is to tabulate anthropometric data that cover the variety of figure proportions in both height and girth. Data from 3,000 subjects were collected through an anthropometric survey of females between the ages of 16 and 80 years from several states of India to produce a sizing system suitable for clothing manufacture and retailing. These data are used for the statistical analysis of body measurements, the formulation of sizing systems, and body measurement tables. A factor analysis technique is used to filter the control body dimensions from a large number of variables. Decision tree-based data mining is used to cluster the data. The standard, structured sizing system can facilitate pattern grading and garment production. Moreover, it can improve buying ratios and upgrade size allocations for retail segments.
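
A rough sketch of the clustering and rule-extraction idea is shown below: simulated body measurements are grouped into size clusters, and a shallow decision tree over the control dimensions turns the clusters into readable sizing rules. The simulated measurements, the choice of two control dimensions, and the number of size groups are illustrative assumptions, not outputs of the survey or its factor analysis.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
height = rng.normal(155, 6, 3000)            # simulated stature (cm)
bust = rng.normal(88, 8, 3000)               # simulated bust girth (cm)
X = np.column_stack([height, bust])          # assumed control dimensions after factor analysis

sizes = KMeans(n_clusters=6, n_init=10, random_state=1).fit_predict(X)   # cluster into size groups
tree = DecisionTreeClassifier(max_depth=3, random_state=1).fit(X, sizes) # rules over control dimensions
print(export_text(tree, feature_names=["height_cm", "bust_cm"]))
```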

Keywords: anthropometric data, data mining, decision tree, garments manufacturing, sizing systems, ready-made garments

Procedia PDF Downloads 119
454 Analysis of Spectral Radiative Entropy Generation in a Non-Gray Participating Medium with Heat Source (Furnaces)

Authors: Asadollah Bahrami

Abstract:

In the present study, spectral radiative entropy generation is analyzed in a furnace filled with a mixture of H₂O, CO₂, and soot at radiative equilibrium. For the angular and spatial discretization of the radiative transfer equation and the radiative entropy generation equations, the discrete ordinates method and the finite volume method are used, respectively. Spectral radiative properties are obtained using the correlated-k (CK) non-gray model with updated parameters based on the HITEMP2010 high-resolution database. In order to evaluate the effects of the location of the heat source, the boundary conditions, and the wall emissivity on radiative entropy generation, five cases with different conditions are considered. The spectral and total radiative entropy generation in the system are calculated for all cases, the effects of the aforementioned parameters on radiative entropy generation are carefully analyzed, and finally the optimum condition is presented. The most important results can be stated as follows: the wall emissivity has a considerable effect on the radiative entropy generation; irreversible radiative transfer at walls with lower temperatures is the main source of radiative entropy generation in furnaces; and the effect of the location of the heat source on total radiative entropy generation is smaller than that of the other factors. Finally, characterizing the effective parameters of radiative entropy generation provides an approach to minimizing it and enhancing furnace performance in practice.

Keywords: spectral radiative entropy generation, non-gray medium, correlated-k (CK) model, heat source

Procedia PDF Downloads 78
453 Energy Analysis of Sugarcane Production: A Case Study in Metehara Sugar Factory in Ethiopia

Authors: Wasihun Girma Hailemariam

Abstract:

Energy is one of the key elements required for every agricultural activity, especially for large-scale agricultural production such as sugarcane cultivation, which is mostly used to produce sugar and bioethanol. In such resource (energy)-intensive activities, an energy analysis of the production system and a search for alternatives that can reduce the energy inputs of the sugarcane production process are steps forward for resource management. The purpose of this study was to determine the input energy (direct and indirect) per hectare in the sugarcane production sector of the Metehara sugar factory in Ethiopia. The total energy consumption of the production system was 61,642 MJ/ha-yr. This total input energy is a cumulative value of the different inputs (direct and indirect) in the production system. The contribution of these different inputs is discussed, along with a scenario of substituting the most influential input with an alternative input that can replace the original in terms of nutrient content. In this study, the most influential input for energy consumption was the application of organic fertilizer, which accounted for 50% of the total energy consumption. Filter cake, a residue from sugar production in the factory, was used to substitute the organic fertilizer, and the resulting reduction in the energy consumption of sugarcane production is discussed.
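
As a back-of-envelope illustration of how such a substitution scenario is evaluated, the sketch below uses the total and the 50% fertilizer share quoted above; the energy coefficient assumed for filter cake (30% of the fertilizer's embodied energy) is a made-up placeholder, not a figure from the study.

```python
total_mj_per_ha = 61_642                 # total input energy reported by the study (MJ/ha-yr)
fertilizer_mj = 0.50 * total_mj_per_ha   # organic fertilizer share reported as 50%

cake_factor = 0.30                       # assumed: filter cake carries 30% of the fertilizer's energy cost
new_total = total_mj_per_ha - fertilizer_mj + cake_factor * fertilizer_mj
saving_pct = 100 * (total_mj_per_ha - new_total) / total_mj_per_ha
print(f"fertilizer: {fertilizer_mj:,.0f} MJ/ha-yr; total after substitution: {new_total:,.0f} "
      f"({saving_pct:.0f}% lower)")
```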

Keywords: energy analysis, organic fertilizer, resource management, sugarcane

Procedia PDF Downloads 136
452 Stability Characteristics of Angle Ply Bi-Stable Laminates by Considering the Effect of Resin Layers

Authors: Masih Moore, Saeed Ziaei-Rad

Abstract:

In this study, the stability characteristics of bi-stable composite plates with different asymmetric compositions are considered. The interest in bi-stable structures comes from their ability to assume two different stable equilibrium configurations, defining a discrete set of stable shapes. The structures can easily change from the first stable shape to the second by a simple snap action. The main purpose of the current research is to consider the effect of including resin layers on the stability characteristics of bi-stable laminates. To this end, and in order to determine the magnitude of the loads responsible for the snap-through and snap-back phenomena between the two stable shapes of the laminate, a non-linear finite element method (FEM) is utilized. An experimental investigation was also carried out to study the critical loads that caused snapping between the two stable shapes. Several specimens were manufactured from T300/5208 graphite-epoxy with [0/90]T, [-30/60]T, and [-20/70]T asymmetric stacking sequences. In order to create an accurate finite element model, the different thicknesses of the resin layers created during the manufacturing process of the laminate were measured and taken into account. The geometry of each lamina and of the resin layers was characterized by optical microscopy at different locations through the laminate thickness. The exact thicknesses of each lamina and resin layer in all specimens with the [0/90]T, [-30/60]T, and [-20/70]T stacking sequences were determined using an image processing technique.

Keywords: bi-stable laminates, finite element method, graphite-epoxy plate, snap behavior

Procedia PDF Downloads 227
451 High-Accuracy Satellite Image Analysis and Rapid DSM Extraction for Urban Environment Evaluations (Tripoli-Libya)

Authors: Abdunaser Abduelmula, Maria Luisa M. Bastos, José A. Gonçalves

Abstract:

The modeling of the Earth's surface and the evaluation of urban environments with 3D models are important research topics. New stereo capabilities of high-resolution optical satellite images, such as the tri-stereo mode of Pleiades, combined with new image matching algorithms, are now available and can be applied to urban area analysis. In addition, photogrammetry software packages have gained new, more efficient matching algorithms, such as SGM, as well as improved filters to deal with shadow areas, and can achieve denser and more precise results. This paper describes a comparison between 3D data extracted from tri-stereo and dual-stereo satellite images, combined with pixel-based matching and the Wallis filter. The aim was to improve the accuracy of 3D models, especially in urban areas, in order to assess whether satellite images are appropriate for a rapid evaluation of urban environments. The results showed that the 3D models obtained from Pleiades tri-stereo outperformed, in terms of both accuracy and detail, the results obtained from a Geo-eye pair. The assessment was made with reference digital surface models derived from high-resolution aerial photography. This could mean that tri-stereo images can be successfully used for the proposed urban change analyses.
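
Since the Wallis filter is named as part of the processing chain, a compact generic implementation is sketched below for reference: it pushes local brightness and contrast toward target statistics. The window size, target mean and standard deviation, and the b and c weighting factors are conventional placeholder choices, not the parameters used in the study.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def wallis(img, win=31, target_mean=127.0, target_std=50.0, b=0.6, c=0.8):
    """Classic Wallis adaptive contrast filter: push local mean/std toward target values."""
    img = img.astype(float)
    local_mean = uniform_filter(img, win)
    local_var = np.maximum(uniform_filter(img ** 2, win) - local_mean ** 2, 1e-6)
    gain = (c * target_std) / (c * np.sqrt(local_var) + (1 - c) * target_std)
    return (img - local_mean) * gain + b * target_mean + (1 - b) * local_mean

enhanced = wallis(np.random.randint(0, 256, (256, 256)))
```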

Keywords: 3D models, environment, matching, Pleiades

Procedia PDF Downloads 313