Search results for: adaptive filter and average filter
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6277

5497 High-Accuracy Satellite Image Analysis and Rapid DSM Extraction for Urban Environment Evaluations (Tripoli-Libya)

Authors: Abdunaser Abduelmula, Maria Luisa M. Bastos, José A. Gonçalves

Abstract:

The modeling of the earth's surface and the evaluation of urban environments with 3D models is an important research topic. New stereo capabilities of high-resolution optical satellite imagery, such as the tri-stereo mode of Pleiades, combined with new image matching algorithms, are now available and can be applied to urban area analysis. In addition, photogrammetry software packages have gained more efficient matching algorithms, such as semi-global matching (SGM), as well as improved filters for dealing with shadow areas, and can thus achieve denser and more precise results. This paper describes a comparison between 3D data extracted from tri-stereo and dual-stereo satellite images, combined with pixel-based matching and the Wallis filter. The aim was to improve the accuracy of 3D models, especially in urban areas, in order to assess whether satellite images are appropriate for a rapid evaluation of urban environments. The results showed that 3D models derived from Pleiades tri-stereo imagery outperformed, both in accuracy and in detail, the result obtained from a GeoEye pair. The assessment was made against reference digital surface models derived from high-resolution aerial photography. This suggests that tri-stereo images can be successfully used for the proposed urban change analyses.
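The Wallis filter mentioned above is a local contrast normalization commonly applied before dense matching, particularly to recover detail in shadow areas. As a rough illustration (not the authors' implementation), a simplified form maps each pixel's neighborhood to a target mean and standard deviation; the window size and target values below are illustrative assumptions:

```python
import numpy as np

def wallis_filter(img, win=15, target_mean=127.0, target_std=40.0, eps=1e-6):
    """Simplified Wallis filter: normalize each pixel against its local
    neighborhood mean and standard deviation, then rescale to a target
    mean and standard deviation (local contrast equalization)."""
    h, w = img.shape
    r = win // 2
    padded = np.pad(img.astype(float), r, mode="reflect")
    out = np.empty((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            patch = padded[y:y + win, x:x + win]  # window centered on (y, x)
            m, s = patch.mean(), patch.std()
            out[y, x] = (img[y, x] - m) / (s + eps) * target_std + target_mean
    return out
```

A low-contrast shadow region is thereby stretched toward the same statistics as the rest of the image, which is what makes subsequent matching denser.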

Keywords: 3D models, environment, matching, Pleiades

Procedia PDF Downloads 312
5496 Applying of an Adaptive Neuro-Fuzzy Inference System (ANFIS) for Estimation of Flood Hydrographs

Authors: Amir Ahmad Dehghani, Morteza Nabizadeh

Abstract:

This paper presents the application of an Adaptive Neuro-Fuzzy Inference System (ANFIS) to flood hydrograph modeling of the Shahid Rajaee reservoir dam, located in Iran. This was carried out using 11 flood hydrographs recorded at the Tajan river gauging station. From this dataset, 9 flood hydrographs were chosen to train the model and 2 to test it. Different architectures of the neuro-fuzzy model, varying in membership function and learning algorithm, were designed and trained with different numbers of epochs. The results were evaluated against the observed hydrographs, and the best model structure was chosen according to the lowest RMSE in each run. To evaluate the efficiency of the neuro-fuzzy model, statistical indices such as the Nash-Sutcliffe efficiency and the flood peak discharge error were calculated. In this simulation, the coordinates of a flood hydrograph, including the peak discharge, were estimated using the discharge values of earlier time steps as inputs to the neuro-fuzzy model. The results indicate the satisfactory efficiency of the neuro-fuzzy model for flood simulation and demonstrate the suitability of the approach for flood management projects.
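The evaluation criteria named above are standard and easy to state. A minimal sketch of the Nash-Sutcliffe efficiency and the flood peak discharge error (function names and interfaces are ours, not the paper's):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    predicts no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def peak_discharge_error(obs, sim):
    """Relative error of the simulated flood peak, in percent."""
    return 100.0 * (np.max(sim) - np.max(obs)) / np.max(obs)
```

For example, a simulated hydrograph that reproduces every ordinate exactly scores an efficiency of 1 and a peak error of 0%.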

Keywords: adaptive neuro-fuzzy inference system, flood hydrograph, hybrid learning algorithm, Shahid Rajaee reservoir dam

Procedia PDF Downloads 458
5495 Nuclear Power Plant Radioactive Effluent Discharge Management in China

Authors: Jie Yang, Qifu Cheng, Yafang Liu, Zhijie Gu

Abstract:

Controlled emission of effluents from nuclear power plants is an important means of ensuring environmental safety. In order to fully grasp the actual discharge levels of China's pressurized water reactor (PWR) and heavy water reactor (HWR) nuclear power plants, the global average nuclear power plant effluent discharge was used as a reference standard for analyzing the environmental discharge status of China's plants. The results show that the average normalized emission of liquid tritium from PWR nuclear power plants in China is slightly higher than the global average value, while the emissions of the other nuclides are lower than the global average values.

Keywords: radioactive effluent, HWR, PWR, nuclear power plant

Procedia PDF Downloads 225
5494 Adaptive Design of Large Prefabricated Concrete Panels Collective Housing

Authors: Daniel M. Muntean, Viorel Ungureanu

Abstract:

More than half of the urban population in Romania lives today in residential buildings made of large prefabricated reinforced concrete panels. Since their initial design dates from the 1960s, these housing units are now technically and functionally outdated, consuming large amounts of energy for heating, cooling, ventilation and lighting, while failing to meet the needs of the contemporary lifestyle. Due to their widespread use, the design of a system that improves their energy efficiency would have a real impact, not only on the energy consumption of the residential sector but also on the quality of life it offers. Furthermore, with the transition of today's power grid to a "smart grid", buildings could become an active element of future electricity networks by contributing to micro-generation and energy storage. One of the most addressed issues today is to find locally adapted strategies that can be applied considering the 20-20-20 EU policy criteria and to offer sustainable and innovative solutions for the cost-optimal energy performance of buildings, adapted to the existing local market. This paper presents a possible adaptive design scenario for the sustainable retrofitting of these housing units. The apartments are transformed to meet current living requirements, and additional extensions are placed on top of the building, replacing the unused roof space and acting not only as housing units but also as active solar energy collection systems. An adaptive building envelope ensures overall air-tightness, and an elevator system is introduced to facilitate access to the upper levels.

Keywords: adaptive building, energy efficiency, retrofitting, residential buildings, smart grid

Procedia PDF Downloads 282
5493 Computational Fluid Dynamics Simulation Study of Flow near Moving Wall of Various Surface Types Using Moving Mesh Method

Authors: Khizir Mohd Ismail, Yu Jun Lim, Tshun Howe Yong

Abstract:

The study of flow behavior in an enclosed volume using Computational Fluid Dynamics (CFD) has been pursued for decades. However, due to the limitations of adaptive grid methods, flow in an enclosed volume near a moving wall is less explored in CFD. Here, a CFD simulation of flow in an enclosed volume near a moving wall was demonstrated and studied by introducing a moving mesh method, modeled with the Unsteady Reynolds-Averaged Navier-Stokes (URANS) approach. A static enclosed volume with a controlled opening at the bottom was positioned against a moving, translating wall with sliding mesh features. Controlled variables such as the wall surface type (smooth, creviced and corrugated), the distance between the enclosed volume and the wall, and the speed of the moving wall relative to the enclosed chamber were varied to understand how the flow behaves and reacts between these two geometries. The simulations were validated against experimental results and showed good agreement with the experimental data. This study provides better insight into flow behavior in an enclosed volume when various wall types in motion are introduced at various distances, and opens potential applications involving adaptive grid methods in CFD.

Keywords: moving wall, adaptive grid methods, CFD, moving mesh method

Procedia PDF Downloads 128
5492 QCARNet: Networks for Quality-Adaptive Compression Artifact Reduction

Authors: Seung Ho Park, Young Su Moon, Nam Ik Cho

Abstract:

We propose a convolutional neural network (CNN) for quality-adaptive compression artifact reduction, named QCARNet. The proposed method differs from existing discriminative models, which learn a specific model for a certain quality level. The method is composed of a quality estimation CNN (QECNN) and a compression artifact reduction CNN (CARCNN), two functionally separate CNNs. By connecting the QECNN and CARCNN, each CARCNN layer is able to adaptively reduce compression artifacts and preserve details depending on the estimated quality level map generated by the QECNN. We experimentally demonstrate that the proposed method achieves better performance than other state-of-the-art blind compression artifact reduction methods.

Keywords: compression artifact reduction, deblocking, image denoising, image restoration

Procedia PDF Downloads 115
5491 Non-Targeted Adversarial Image Classification Attack-Region Modification Methods

Authors: Bandar Alahmadi, Lethia Jackson

Abstract:

Machine learning models are used today in many real-life applications. The safety and security of such models are important, so that their results remain as accurate as possible. One challenge to machine learning model security is the adversarial example attack: adversarial examples are designed by an attacker to cause a machine learning model to misclassify its input. We propose a method to generate adversarial examples that attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back into the original image, and then run the image through a classifier. The algorithm modifies the detected region in two ways. First, it adds an abstract image matrix behind the detected image matrix. Then, it performs a rotation attack, rotating the detected region around its axes and embedding a trace of the region in the image background. Finally, the attacked region is placed back in its original position, and a smoothing filter is applied to blend the background with the foreground. We tested our method on a cascade classifier; the algorithm is efficient, and the classifier's confidence dropped to almost zero. We also tried it on a convolutional neural network (CNN) with higher settings, and the algorithm worked successfully.
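A rough sketch of the rotation-and-smoothing step described above, assuming the important region is already given as a bounding box (the paper detects it separately, e.g. with a cascade classifier). The fixed 180-degree rotation and the 3x3 seam blur are simplifying assumptions on our part:

```python
import numpy as np

def box_blur3(img):
    """3x3 mean filter with edge padding (the 'smoothing filter' step)."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)) / 9.0

def rotate_region_attack(img, box):
    """Rotate the detected region by 180 degrees in place, then smooth the
    seam between region and background. `box` = (y0, y1, x0, x1) is assumed
    to come from a separate region detector."""
    y0, y1, x0, x1 = box
    out = img.astype(float).copy()
    out[y0:y1, x0:x1] = np.rot90(out[y0:y1, x0:x1], 2)  # 180-degree rotation
    blurred = box_blur3(out)
    seam = np.zeros(out.shape, dtype=bool)              # 1-pixel box outline
    seam[y0:y1, [x0, x1 - 1]] = True
    seam[[y0, y1 - 1], x0:x1] = True
    out[seam] = blurred[seam]                           # blend only the seam
    return out
```

The background outside the box is left untouched, so the perturbation stays localized to the region the classifier relies on.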

Keywords: adversarial examples, attack, computer vision, image processing

Procedia PDF Downloads 320
5490 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features

Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh

Abstract:

In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Due to the relatively simple recording of the electrocardiogram (ECG) signal, it is a good tool for assessing the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for researchers to develop the best method for detecting normal signals versus abnormal ones. The data are from both genders, the recording time varies between several seconds and several minutes, and all records are labeled normal or abnormal. Due to the limited recording time of the ECG signal and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate types of heart failure from one another is of interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. A new idea is presented in this paper: in addition to using the statistical characteristics of the signal, a return map is created and nonlinear characteristics of the HRV signal are extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with these distinctive features, were used to classify normal signals versus abnormal ones.
To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUC of the MLP and SVM classifiers was 0.893 and 0.947, respectively. The results also indicated that greater use of nonlinear characteristics in the classification yielded better performance. Today, research aims to quantitatively analyze the linear and nonlinear, deterministic and stochastic, nature of the heart rate variability signal, because it has been shown that these properties can indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven further research in this field. Given that the ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but that some of its information remains hidden from the physician's viewpoint, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy, and can be used as a complementary system in treatment centers.
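The return-map analysis mentioned above corresponds to the Poincaré plot of successive R-R intervals. As an illustration, a minimal sketch computing standard time-domain features (SDNN, RMSSD) and the Poincaré descriptors SD1/SD2 from R-peak times; this feature set is a common HRV selection and an assumption on our part, not necessarily the paper's exact list:

```python
import numpy as np

def hrv_features(r_peaks_s):
    """HRV features from R-peak times (seconds). SD1/SD2 describe the
    dispersion of points on the Poincare (return) map of successive
    R-R intervals, perpendicular to and along the identity line."""
    rr = np.diff(np.asarray(r_peaks_s, float)) * 1000.0  # R-R intervals, ms
    drr = np.diff(rr)                                    # successive differences
    sdnn = rr.std(ddof=1)                                # overall variability
    rmssd = np.sqrt(np.mean(drr ** 2))                   # short-term variability
    sd1 = np.sqrt(0.5) * drr.std(ddof=1)
    sd2 = np.sqrt(max(2.0 * rr.var(ddof=1) - 0.5 * drr.var(ddof=1), 0.0))
    return {"sdnn": sdnn, "rmssd": rmssd, "sd1": sd1, "sd2": sd2}
```

A perfectly regular rhythm gives all four features equal to zero, while an alternating rhythm drives RMSSD and SD1 up relative to SD2.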

Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve

Procedia PDF Downloads 240
5489 Effects of Moringa oleifera Leaf Powder on the Feed Intake and Average Weight of Pullets

Authors: Cajethan U. Ugwuoke, Hyginus O. Omeje, Emmanuel C. Osinem

Abstract:

The study was carried out to determine the effects of a Moringa oleifera leaf powder additive on the feed intake and average weight of pullets. A Completely Randomized Design (CRD) was adopted. For the experiment, 240 chicks were randomly selected from 252 Isa Brown day-old chicks and randomly allotted to 12 pens of 20 chicks each. The pens were randomly assigned to four treatment groups with three replicates each. T1 was fed the control feed, while T2, T3 and T4 were fed feeds fortified with 2.5%, 5% and 7.5% Moringa oleifera leaf powder, respectively. The chicks were fed a uniform feed up to week four; from week five, the experimental feeds were given to the pullets up to 20 weeks of age. The birds were kept under the same conditions except for the different experimental feeds given to the different groups. Data on feed intake were collected daily, while the average weight of the pullets was measured weekly using a weighing scale. The data were analyzed using means, bar charts and Analysis of Variance. The layers fed the control feed consumed the highest amount of feed in most of the weeks under study. The average weights of all treatment groups were equal from week 1 to week 4; slight variation in average weight started in week 5, with T2 leading the groups. However, there was no statistically significant difference (p > 0.05) in the feed intake or average weight of layers fed the different inclusion rates of Moringa oleifera leaf powder.
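For a completely randomized design like this one, the Analysis of Variance reduces to a one-way F test: the between-group mean square over the within-group mean square. A minimal sketch (the interface and example data are illustrative, not the study's measurements):

```python
import numpy as np

def one_way_anova(groups):
    """One-way ANOVA F statistic for a completely randomized design.
    Returns (F, df_between, df_within)."""
    groups = [np.asarray(g, float) for g in groups]
    n = sum(len(g) for g in groups)          # total observations
    k = len(groups)                          # number of treatments
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within
```

The computed F is then compared against the critical value at the chosen significance level; "no significant difference (p > 0.05)" means F fell below that critical value.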

Keywords: average weight, feed intake, Moringa oleifera, pullets

Procedia PDF Downloads 175
5488 Dynamic Analysis and Clutch Adaptive Prefill in Dual Clutch Transmission

Authors: Bin Zhou, Tongli Lu, Jianwu Zhang, Hongtao Hao

Abstract:

Dual clutch transmissions (DCTs) offer high gearshift comfort. Hydraulic multi-disk clutches are the key components of a DCT, and their engagement determines shifting comfort. The prefill of the clutches produces an initial engagement at which the clutches just contact each other but do not yet transmit substantial torque from the engine; this initial engagement point is called the touch point. Open-loop control is typically implemented for the clutch prefill, but many uncertainties, such as oil temperature and clutch wear, significantly affect the prefill, possibly resulting in an inappropriate touch point. Underfill causes engine flare during gearshifts, while overfill causes clutch tie-up, both of which deteriorate the shifting comfort of a DCT. It is therefore important to give the clutch prefill an adaptive capacity with respect to these uncertainties. In this paper, a dynamic model of the hydraulic actuator system is presented, including the variable force solenoid and the clutch piston, and validated by a test. Subsequently, the open-loop clutch prefill is simulated based on the proposed model. Two control parameters of the prefill, the fast fill time and the stable fill pressure, are analyzed with regard to their impact on the prefill: the former has a great effect on the pressure transients, while the latter directly influences the touch point. Finally, an adaptive method is proposed for the clutch prefill during gear shifting, in which the clutch fill control parameters are adjusted adaptively and continually. The adaptive strategy changes the stable fill pressure according to the current clutch slip during a gearshift, improving the next prefill process: the stable fill pressure is increased in proportion to the clutch slip in the case of underfill, and decreased by a constant value in the case of overfill. The entire strategy was designed in Simulink/Stateflow and implemented in the transmission control unit with optimization.
Road vehicle test results have shown that the strategy realizes its adaptive capability and improves shifting comfort.
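The adaptation rule described above, raise the stable fill pressure in proportion to slip on underfill and lower it by a constant on overfill, can be sketched as a single update step. All gains, steps and pressure limits below are illustrative assumptions, not calibrated values from the paper:

```python
def update_fill_pressure(p_stable, slip_rpm, overfill,
                         gain=0.002, step_down=0.01,
                         p_min=0.5, p_max=3.0):
    """One adaptation step for the stable fill pressure (bar, illustrative).
    On overfill (clutch tie-up) reduce the pressure by a constant step;
    otherwise raise it in proportion to the measured clutch slip (underfill
    shows up as engine flare, i.e. large slip). The result is clamped to a
    plausible hydraulic range."""
    if overfill:
        p_stable -= step_down
    else:
        p_stable += gain * slip_rpm
    return min(max(p_stable, p_min), p_max)
```

Run once per gearshift, this drives the prefill pressure toward the value that places the touch point correctly despite oil-temperature and wear drift.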

Keywords: clutch prefill, clutch slip, dual clutch transmission, touch point, variable force solenoid

Procedia PDF Downloads 296
5487 Neuroevolution Based on Adaptive Ensembles of Biologically Inspired Optimization Algorithms Applied for Modeling a Chemical Engineering Process

Authors: Sabina-Adriana Floria, Marius Gavrilescu, Florin Leon, Silvia Curteanu, Costel Anton

Abstract:

Neuroevolution is a subfield of artificial intelligence used to solve various problems in different application areas. Specifically, neuroevolution is a technique that applies biologically inspired methods to generate neural network architectures and optimize their parameters automatically. In this paper, we use different biologically inspired optimization algorithms in an ensemble strategy with the aim of training multilayer perceptron neural networks, resulting in regression models used to simulate the industrial chemical process of obtaining bricks from silicone-based materials. Installations in the raw ceramics industry, i.e., bricks, are characterized by significant energy consumption and large quantities of emissions. In addition, the initial conditions that were taken into account during the design and commissioning of the installation can change over time, which leads to the need to add new mixes to adjust the operating conditions for the desired purpose, e.g., material properties and energy saving. The present approach follows the study by simulation of a process of obtaining bricks from silicone-based materials, i.e., the modeling and optimization of the process. Optimization aims to determine the working conditions that minimize the emissions represented by nitrogen monoxide. We first use a search procedure to find the best values for the parameters of various biologically inspired optimization algorithms. Then, we propose an adaptive ensemble strategy that uses only a subset of the best algorithms identified in the search stage. The adaptive ensemble strategy combines the results of selected algorithms and automatically assigns more processing capacity to the more efficient algorithms. Their efficiency may also vary at different stages of the optimization process. In a given ensemble iteration, the most efficient algorithms aim to maintain good convergence, while the less efficient algorithms can improve population diversity. 
The proposed adaptive ensemble strategy outperforms the individual optimizers and the non-adaptive ensemble strategy in convergence speed, and the obtained results provide lower error values.
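The capacity-assignment idea above can be sketched as a per-iteration budget split proportional to each optimizer's recent fitness improvement, with a reserved floor that keeps the less efficient optimizers contributing population diversity. The scheme and its parameters are our assumptions, not the authors' exact rule:

```python
import numpy as np

def allocate_budget(improvements, total_evals, floor_frac=0.1):
    """Split `total_evals` function evaluations across ensemble members.
    Each optimizer gets an equal share of a reserved floor (floor_frac of
    the budget), and the remainder is divided in proportion to its recent
    improvement; with no progress anywhere, the remainder is split evenly."""
    imp = np.maximum(np.asarray(improvements, float), 0.0)
    k = len(imp)
    floor = floor_frac * total_evals / k
    remaining = total_evals - k * floor
    if imp.sum() == 0:
        shares = np.full(k, remaining / k)
    else:
        shares = remaining * imp / imp.sum()
    return floor + shares
```

Because the improvements are re-measured every iteration, an algorithm that stalls gradually loses budget while still retaining its floor share, which matches the adaptive behavior described above.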

Keywords: optimization, biologically inspired algorithm, neuroevolution, ensembles, bricks, emission minimization

Procedia PDF Downloads 88
5486 Genomics of Aquatic Adaptation

Authors: Agostinho Antunes

Abstract:

The completion of human genome sequencing in 2003 opened a new perspective on the importance of whole-genome sequencing projects, and currently multiple species are having their genomes completely sequenced, from simple organisms, such as bacteria, to more complex taxa, such as mammals. This voluminous sequencing data generated across multiple organisms also provides the framework to better understand the genetic makeup of these and related species, allowing exploration of the genetic changes underlying the evolution of diverse phenotypic traits. Here, recent results from our group, derived from comparative evolutionary genomic analyses of selected marine animal species, are considered to exemplify how gene novelty and gene enhancement by positive selection may have been determinant in the success of adaptive radiations into diverse habitats and lifestyles.

Keywords: comparative genomics, adaptive evolution, bioinformatics, phylogenetics, genome mining

Procedia PDF Downloads 512
5485 Inerting and Upcycling of Foundry Fines

Authors: Chahinez Aissaoui, Cecile Diliberto, Jean-Michel Mechling

Abstract:

The manufacture of metal foundry products requires the use of sand moulds, which are destroyed and made anew each time metal is poured. However, recycling the sand requires a regeneration process that produces a polluted fine mineral phase. Particularly rich in heavy metals and organic residues, this foundry co-product is disposed of in hazardous waste landfills and requires an expensive stabilisation process. This paper presents the results of research that valorises this fine fraction of foundry sand by inerting it in a cement phase. The fines were taken from the bag filter suction systems of a foundry. The sample is in the form of a filler with a fraction finer than 140 µm; the D50 is 43 µm. The Blaine fineness is 3120 cm²/g, and the fines are composed mainly of SiO₂, Al₂O₃ and Fe₂O₃. The loss on ignition at 1000°C of this material is 20%. The chosen inerting technique is to manufacture cement pastes which, once hardened, are crushed for use as artificial aggregates in new concrete formulations. Different volume substitution percentages of Portland cement were tested: 30, 50 and 65%. The substitution rates were chosen to obtain the highest possible recycling rate while satisfying the European discharge limits (assessed by leaching), and were optimised by adding water-reducing admixtures to increase the compressive strengths of the mixes.

Keywords: leaching, upcycling, waste, residuals

Procedia PDF Downloads 55
5484 Group Sequential Covariate-Adjusted Response Adaptive Designs for Survival Outcomes

Authors: Yaxian Chen, Yeonhee Park

Abstract:

Driven by evolving FDA recommendations, modern clinical trials demand innovative designs that strike a balance between statistical rigor and ethical considerations. Covariate-adjusted response-adaptive (CARA) designs bridge this gap by utilizing patient attributes and responses to skew treatment allocation in favor of the treatment that is best for an individual patient's profile. However, existing CARA designs for survival outcomes often hinge on specific parametric models, constraining their applicability in clinical practice. In this article, we address this limitation by introducing a CARA design for survival outcomes (CARAS) based on the Cox model and a variance estimator. This method addresses model misspecification and enhances the flexibility of the design. We also propose a group sequential overlap-weighted log-rank test to preserve the type I error rate in group sequential trials, and use extensive simulation studies to demonstrate the clinical benefit, statistical efficiency, and robustness to model misspecification of the proposed method compared to traditional randomized controlled trial designs and response-adaptive randomization designs.
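For orientation, the standard (unweighted) two-group log-rank statistic that the proposed overlap-weighted test generalizes can be sketched directly from its definition; the overlap weighting and group sequential boundaries themselves are omitted here:

```python
import numpy as np

def logrank_statistic(time, event, group):
    """Unweighted two-group log-rank chi-square statistic. `event` is 1 for
    an observed event and 0 for censoring; `group` is 0/1. At each distinct
    event time, the observed events in group 1 are compared with their
    expectation under the null hypothesis of equal hazards."""
    time = np.asarray(time, float)
    event = np.asarray(event, int)
    group = np.asarray(group, int)
    o_minus_e, var = 0.0, 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        n = at_risk.sum()                                  # total at risk
        n1 = (at_risk & (group == 1)).sum()                # group 1 at risk
        d = ((time == t) & (event == 1)).sum()             # events at t
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        o_minus_e += d1 - d * n1 / n
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var
```

Under the null, the statistic is approximately chi-square with one degree of freedom, so values above 3.84 are significant at the 5% level. An overlap-weighted variant would insert patient-level weights into the at-risk and event counts.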

Keywords: Cox model, log-rank test, optimal allocation ratio, overlap weight, survival outcome

Procedia PDF Downloads 37
5483 Security Over OFDM Fading Channels with Friendly Jammer

Authors: Munnujahan Ara

Abstract:

In this paper, we investigate the effect of friendly jamming power allocation strategies on the achievable average secrecy rate over a bank of parallel fading wiretap channels. We investigate the achievable average secrecy rate in parallel fading wiretap channels subject to Rayleigh and Rician fading. The achievable average secrecy rate, due to the presence of a line-of-sight component in the jammer channel is also evaluated. Moreover, we study the detrimental effect of correlation across the parallel sub-channels, and evaluate the corresponding decrease in the achievable average secrecy rate for the various fading configurations. We also investigate the tradeoff between the transmission power and the jamming power for a fixed total power budget. Our results, which are applicable to current orthogonal frequency division multiplexing (OFDM) communications systems, shed further light on the achievable average secrecy rates over a bank of parallel fading channels in the presence of friendly jammers.
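The quantity studied above can be illustrated with a small Monte Carlo experiment: the average secrecy rate over a bank of parallel Rayleigh subchannels, with a friendly jammer that degrades only the eavesdropper's link. The unit-mean channel gains and uniform power allocation are simplifying assumptions, not the paper's allocation strategies:

```python
import numpy as np

def avg_secrecy_rate(p_tx, p_jam, n_sub=64, n_mc=4000, noise=1.0, seed=0):
    """Monte Carlo estimate of the average secrecy rate (bits/s/Hz per
    subcarrier) over n_sub parallel Rayleigh-fading subchannels. The jammer
    interferes only with the eavesdropper; power is split uniformly."""
    rng = np.random.default_rng(seed)
    shape = (n_mc, n_sub)
    g_main = rng.exponential(1.0, shape)   # |h|^2 of legitimate link
    g_eve = rng.exponential(1.0, shape)    # |h|^2 of eavesdropper link
    g_jam = rng.exponential(1.0, shape)    # jammer -> eavesdropper link
    p = p_tx / n_sub                       # uniform transmit power split
    pj = p_jam / n_sub                     # uniform jamming power split
    rate = (np.log2(1 + p * g_main / noise)
            - np.log2(1 + p * g_eve / (noise + pj * g_jam)))
    return np.maximum(rate, 0.0).mean()    # secrecy rate is non-negative
```

Comparing the estimate with and without jamming power shows the jammer's benefit; sweeping the transmit/jamming split under a fixed total budget reproduces the tradeoff studied in the paper.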

Keywords: fading parallel channels, wire-tap channel, OFDM, secrecy capacity, power allocation

Procedia PDF Downloads 487
5482 Sampling and Chemical Characterization of Particulate Matter in a Platinum Mine

Authors: Juergen Orasche, Vesta Kohlmeier, George C. Dragan, Gert Jakobi, Patricia Forbes, Ralf Zimmermann

Abstract:

Underground mining poses a difficult environment for both man and machine. At more than 1000 meters beneath the surface of the earth, ores and other mineral resources are still extracted by conventional and motorised mining. In addition to the hazards caused by blasting and stone-chipping, the working conditions are best described by high temperatures of 35-40°C and high humidity at low air exchange rates. Separate ventilation shafts lead fresh air into a mine and others lead spent air back to the surface; this is essential for humans and machines working deep underground. Nevertheless, mines are widely ramified, so the air flow rate at the far end of a tunnel can be close to zero. In recent years, conventional mining has been supplemented by mining with heavy diesel machines. These very flat machines, called Load Haul Dump (LHD) vehicles, accelerate and ease work in areas favourable for heavy machinery. On the other hand, they emit unfiltered diesel exhaust, which constitutes an occupational hazard for the miners. Combined with low air exchange, high humidity and inorganic dust from the mining, this leads to 'black smog' underground. This work focuses on air quality in mines employing LHDs. We therefore performed personal sampling (samplers worn by miners during their work), stationary sampling and aethalometer measurements (MicroAeth MA200, AethLabs) in a platinum mine at around 1000 meters below the earth's surface. We compared areas of high diesel exhaust emission with areas of conventional mining where no diesel machines were operated. For a better assessment of the health risks caused by air pollution, we applied a separated gas-/particle-sampling system, with a first denuder section collecting intermediate-volatility organic compounds (IVOCs). These multi-channel silicone rubber denuders are able to trap IVOCs while transmitting particles ranging from 10 nm to 1 µm in diameter with an efficiency of nearly 100%.
The second section is a quartz fibre filter collecting particles and adsorbed semi-volatile organic compounds (SVOCs). The third part is a graphitized carbon black adsorber, collecting the SVOCs that evaporate from the filter. The compounds collected on these three sections were analyzed in our labs with different thermal desorption techniques coupled with gas chromatography and mass spectrometry (GC-MS). VOCs and IVOCs were measured with a Shimadzu thermal desorption unit (TD20, Shimadzu, Japan) coupled to a GCMS-QP2010 Ultra system with a quadrupole mass spectrometer (Shimadzu). The GC was equipped with a 30 m BP-20 wax column (0.25 mm ID, 0.25 µm film) from SGE (Australia). Filters were analyzed with in-situ derivatization thermal desorption gas chromatography time-of-flight mass spectrometry (IDTD-GC-TOF-MS); the IDTD unit is a modified GL Sciences Optic 3 system (GL Sciences, Netherlands). The results showed black carbon concentrations, measured with the portable aethalometers, of up to several mg per m³. The organic chemistry was dominated by very high concentrations of alkanes. Typical diesel engine exhaust markers, such as alkylated polycyclic aromatic hydrocarbons, were detected, as well as typical lubrication oil markers such as hopanes.

Keywords: diesel emission, personal sampling, aethalometer, mining

Procedia PDF Downloads 138
5481 Application of Neuro-Fuzzy Technique for Optimizing the PVC Membrane Sensor

Authors: Majid Rezayi, Sh. Shahaboddin, HNM E. Mahmud, A. Yadollah, A. Saeid, A. Yatimah

Abstract:

In this study, an adaptive neuro-fuzzy inference system (ANFIS) was applied to obtain a membrane composition model for the potential response of our previously reported polymeric PVC sensor for determining titanium (III) ions. The performance statistics of artificial neural network (ANN) and linear regression models for predicting the potential slope from the membrane composition of the titanium (III) ion-selective electrode were compared with the ANFIS technique. The results show that the ANFIS model can be used as a practical tool for obtaining the Nernstian slope of the proposed sensor.

Keywords: adaptive neuro-fuzzy inference system, PVC sensor, titanium (III) ions, Nernstian slope

Procedia PDF Downloads 259
5480 Understanding the Experience of the Visually Impaired towards a Multi-Sensorial Architectural Design

Authors: Sarah M. Oteifa, Lobna A. Sherif, Yasser M. Mostafa

Abstract:

Visually impaired people face struggles and spatial barriers in their daily lives because the built environment is often designed with an extreme focus on the visual element, causing what is called architectural visual bias, or ocularcentrism. The aim of the study is to holistically understand the world of the visually impaired as an attempt to extract the qualities of space that accommodate their needs, and to show the importance of multi-sensory, holistic designs for the blind. Within the framework of existential phenomenology, common themes are reached through 'intersubjectivity': descriptions of experience by blind people and blind architects, observation of how blind children learn to perceive their surrounding environment, and a personal lived blindfolded experience are analyzed. The extracted themes show how visually impaired people filter out and prioritize tactile (active, passive and dynamic touch), acoustic and olfactory spatial qualities, respectively, and how this happened during the personal lived blindfolded experience. The themes clarify that haptic and aural inclusive designs are essential to create environments suitable for the visually impaired and to empower them towards an independent, safe and efficient life.

Keywords: architecture, architectural ocularcentrism, multi-sensory design, visually impaired

Procedia PDF Downloads 187
5479 Development of Filling Material in a 3D Printer with the Aid of Computer Software, Supported with Natural Zeolite, for the Removal of Nitrogen and Phosphorus

Authors: Luís Fernando Cusioli, Leticia Nishi, Lucas Bairros, Gabriel Xavier Jorge, Sandro Rogério Lautenschalager, Celso Varutu Nakamura, Rosângela Bergamasco

Abstract:

Focusing on the elimination of nitrogen and phosphorus from sewage, the study proposes to face the challenges of eutrophication and to optimize the effectiveness of sewage treatment through biofilms and filling material produced by a 3D printer, seeking to identify the most effective filling material between Polylactic Acid (PLA) and Acrylonitrile Butadiene Styrene (ABS). The study also proposes to evaluate the nitrification process in a Submerged Aerated Biological Filter (FBAS) at pilot plant scale, quantifying the removal of nitrogen and phosphorus. The experiment will consist of two distinct phases, namely, a bench stage and the implementation of a pilot plant. During the bench stage, samples will be collected at five points to characterize the microbiota, which will be investigated using Fluorescence In Situ Hybridization (FISH), deepening the understanding of the performance of biofilms in the face of multiple variables. In this context, the study contributes to the search for effective solutions to mitigate eutrophication and, thus, strengthen initiatives to improve effluent treatment.

Keywords: eutrophication, sewage treatment, biofilms, nitrogen and phosphorus removal, 3D printer, environmental efficiency

Procedia PDF Downloads 61
5478 Metagenomics Composition During and After Wet Deposition and the Presence of Airborne Microplastics

Authors: Yee Hui Lim, Elena Gusareva, Irvan Luhung, Yulia Frank, Stephan Christoph Schuster

Abstract:

Environmental pollution from microplastics (MPs) is an emerging concern worldwide. While the presence of microplastics has been well established in marine and terrestrial environments, the prevalence of microplastics in the atmosphere is still poorly understood. Wet depositions such as rain or snow scavenge impurities from the atmosphere as they fall to the ground, serving as a useful means of removing airborne particles suspended in the air. Therefore, the aim of this study is to investigate the presence of atmospheric microplastics and fibres through the analysis of air, rainwater and snow samples. Air samples were collected with filter-based air samplers from outdoor locations in Singapore; the sampling campaigns were conducted during and after each rain event. Rainwater samples from Singapore and Siberia were collected as well, and snow samples were also collected from Siberia as part of the ongoing study. Genomic DNA was then extracted from the samples and sequenced using a shotgun metagenomics approach. qPCR analysis was conducted to quantify the total bacteria and fungi in the air, rainwater and snow samples, and the results were used to compare the bioaerosol profiles of all the samples. To observe the presence of microplastics, a scanning electron microscope (SEM) was used. From the preliminary results, microplastics were detected. It can be concluded that a significant amount of atmospheric microplastics is present, and their occurrence should be investigated in greater detail.

Keywords: atmospheric microplastics, metagenomics, scanning electron microscope, wet deposition

Procedia PDF Downloads 71
5477 Molecular Detection of E. coli in Treated Wastewater and Well Water Samples Collected from Al Riyadh Governorate, Saudi Arabia

Authors: Hanouf A. S. Al Nuwaysir, Nadine Moubayed, Abir Ben Bacha, Islem Abid

Abstract:

Consumption of wastewater continues to cause significant problems for human health in both developed and developing countries. Many regulations have been implemented by different world authorities to control water quality, with coliforms used as standard indicators of water quality deterioration and historically leading the health protection concept. In this study, the European directive for the detection of Escherichia coli, ISO 9308-1, was applied to examine and monitor coliforms in water samples collected from Wadi Hanifa and neighboring wells, Riyadh governorate, Kingdom of Saudi Arabia, which are used for irrigation and industrial purposes. Samples were taken from different locations for 8 consecutive months; the chlorine concentration, ranging from 0.1-0.4 mg/l, was determined using the DPD FREE CHLORINE HACH kit. Water samples were then analyzed following the ISO protocol, which relies on the membrane filtration technique (0.45 µm pore size membrane filter) and TTC, a chromogenic, lactose-based medium used for the detection and enumeration of total coliforms and E. coli. Data showed that the number of bacterial isolates ranged from 60 to 300 colonies/100 ml for well and surface water samples, respectively, with the higher numbers attributed to the surface samples. Organisms that apparently ferment lactose on TTC agar plates, appearing as orange colonies, were selected and additionally cultured on EMB and MacConkey agar for further differentiation between E. coli and other coliform bacteria. Two additional biochemical tests (cytochrome oxidase and indole from tryptophan) were also used to detect and differentiate E. coli from other coliforms; E. coli was identified in an average of 5 to 7 colonies among 25 selected colonies. On the other hand, a more rapid, specific and sensitive molecular detection method, namely single colony PCR targeting the hha gene, was also performed to detect E. coli, giving a more accurate and less time-consuming identification of colonies considered presumptively as E. coli. Comparative methodologies, such as ultrafiltration and direct DNA extraction from membrane filters (MoBio, Germany), were also applied; however, the results were not as accurate as those of membrane filtration, making it the technique of choice for the detection and enumeration of water coliforms, followed by a sufficiently specific enzymatic confirmatory stage.

Keywords: coliform, cytochrome oxidase, hha primer, membrane filtration, single colony PCR

Procedia PDF Downloads 302
5476 Same-Day Detection Method of Salmonella Spp., Shigella Spp. and Listeria Monocytogenes with Fluorescence-Based Triplex Real-Time PCR

Authors: Ergun Sakalar, Kubra Bilgic

Abstract:

Faster detection and characterization of pathogens are the basis for avoiding foodborne illness. Salmonella spp., Shigella spp. and Listeria monocytogenes are among the most life-threatening common foodborne bacteria, and their rapid and accurate detection is important to prevent food poisoning and outbreaks and to manage food chains. The present work aims to develop a sensitive, species-specific and reliable PCR-based detection system for the simultaneous detection of Salmonella spp., Shigella spp. and L. monocytogenes. For this purpose, three genes were selected: ompC for Salmonella spp., ipaH for Shigella spp. and hlyA for L. monocytogenes. After a short pre-enrichment, milk was passed through a vacuum filter, and bacterial DNA was extracted using the commercially available kit GIDAGEN® (Istanbul, Turkey). Detection of the amplicons was verified by examination of the melting temperatures (Tm), which are 72 °C, 78 °C and 82 °C for Salmonella spp., Shigella spp. and L. monocytogenes, respectively. The specificity of the method was checked against a group of bacterial strains, and a sensitivity test was also carried out, giving a detection limit under 10² CFU mL⁻¹ of milk for each bacterial strain. Our results show that the fluorescence-based triplex qPCR method can be used routinely to detect Salmonella spp., Shigella spp. and L. monocytogenes during milk processing procedures in order to reduce cost, time of analysis and the risk of foodborne disease outbreaks.
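The read-out step the abstract describes assigns each amplicon to a pathogen by its melting-temperature peak (72, 78 and 82 °C). A minimal sketch of that classification logic, assuming a ±1 °C tolerance window that the abstract does not specify:

```python
# Sketch of the Tm read-out step: map a measured melting temperature to
# the pathogen whose reference Tm it matches. Reference Tm values are
# from the abstract; the tolerance window is an assumption.

TM_TARGETS = {72.0: "Salmonella spp.",
              78.0: "Shigella spp.",
              82.0: "Listeria monocytogenes"}
TOLERANCE = 1.0  # assumed Tm window in degrees C

def classify_amplicon(tm_measured):
    """Return the pathogen whose reference Tm is within tolerance, else None."""
    for tm_ref, species in TM_TARGETS.items():
        if abs(tm_measured - tm_ref) <= TOLERANCE:
            return species
    return None
```

Because the three reference peaks are well separated, a simple nearest-peak rule with a tolerance cutoff is enough to keep the triplex channels unambiguous.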

Keywords: evagreen, food-borne bacteria, pathogen detection, real-time PCR

Procedia PDF Downloads 227
5475 Meteosat Second Generation Image Compression Based on the Radon Transform and Linear Predictive Coding: Comparison and Performance

Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane

Abstract:

Image compression is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite allows the acquisition of 12 image files every 15 minutes, which results in large database sizes. The transform selected for image compression should contribute to reducing the data representing the images. The Radon transform retrieves the Radon points, which represent the sums of pixel values along a given angle for each direction. Linear predictive coding (LPC) with filtering provides a good decorrelation of the Radon points using a predictor constituted by the Symmetric Nearest Neighbor (SNN) filter coefficients, which introduces losses during decompression. Finally, Run Length Coding (RLC) gives a high and fixed compression ratio regardless of the input image. In this paper, a novel image compression method based on the Radon transform and linear predictive coding (LPC) for MSG images is proposed. MSG image compression based on the Radon transform and LPC provides a good compromise between compression and the quality of reconstruction. A comparison of our method with other methods, two based on the DCT and one on DWT bi-orthogonal filtering, is carried out to show the robustness of the Radon transform against quantization noise and to evaluate the performance of our method. Evaluation criteria such as PSNR and the compression ratio show the efficiency of our compression method.
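The final stage of the pipeline described above, run-length coding, is simple enough to sketch directly: consecutive repeats of a symbol are stored as (symbol, count) pairs. This illustrates the general idea only; the paper's exact codec parameters are not given in the abstract.

```python
# Minimal run-length coding (RLC) sketch: losslessly encode runs of
# repeated symbols as (symbol, run_length) pairs, and decode them back.

def rlc_encode(data):
    """Encode a sequence as a list of (symbol, run_length) pairs."""
    encoded = []
    for symbol in data:
        if encoded and encoded[-1][0] == symbol:
            encoded[-1][1] += 1  # extend the current run
        else:
            encoded.append([symbol, 1])  # start a new run
    return [(s, n) for s, n in encoded]

def rlc_decode(pairs):
    """Invert rlc_encode."""
    out = []
    for symbol, count in pairs:
        out.extend([symbol] * count)
    return out
```

RLC pays off exactly when the preceding stages (here, LPC decorrelation and quantization of the Radon points) produce long runs of identical values, which is why it sits last in the chain.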

Keywords: image compression, radon transform, linear predictive coding (LPC), run length coding (RLC), meteosat second generation (MSG)

Procedia PDF Downloads 395
5474 Biochar Assisted Municipal Wastewater Treatment and Nutrient Recycling

Authors: A. Pokharel, A. Farooque, B. Acharya

Abstract:

Pyrolysis can be used for energy production from waste biomass of agriculture and forestry. Biochar is the solid byproduct of pyrolysis, and its cascading use can offset the cost of the process. A wide variety of research on biochar has highlighted its ability to adsorb nutrients, metals and complex compounds; filter suspended solids; enhance microorganisms' growth; and retain water and nutrients as well as increase the carbon content of soil. In addition, sustainable biochar systems are an attractive approach to carbon sequestration and total waste management. Commercially available biochar from Sigma-Aldrich was studied for adsorption of nitrogen from the effluent of a municipal wastewater treatment plant, and the adsorption isotherm and breakthrough curve were determined. Similarly, biochar's effects in aerobic as well as anaerobic bioreactors were studied; in both cases, the biomass increased in the presence of biochar. The amount of gas produced from anaerobic digestion of a fruit mix (apple and banana) was similar, but the rate of production was significantly faster in biochar-fed reactors. The cumulative goal of the study is to use biochar in various wastewater treatment units, such as the aeration tank, secondary clarifier and tertiary nutrient recovery system, as well as in anaerobic digestion of the sludge, to optimize utilization and add value before it is used as a soil amendment.
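The abstract reports determining an adsorption isotherm for nitrogen uptake on the biochar but gives no numbers. As a sketch of how such data are commonly reduced, the following fits the linearized Langmuir form Ce/q = Ce/qmax + 1/(KL·qmax) by least squares; both the model choice and the data are illustrative assumptions, not values from the paper.

```python
# Hypothetical Langmuir isotherm fit: q = qmax*KL*Ce / (1 + KL*Ce),
# linearized as Ce/q = Ce/qmax + 1/(KL*qmax) and fitted by least squares.

def langmuir_fit(ce, q):
    """Fit the linearized Langmuir isotherm; return (qmax, KL)."""
    x = ce
    y = [c / qi for c, qi in zip(ce, q)]  # Ce/q
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    qmax = 1.0 / a        # slope = 1/qmax
    kl = a / b            # intercept = 1/(KL*qmax)
    return qmax, kl

# Hypothetical equilibrium data generated from qmax = 10 mg/g, KL = 0.5 L/mg.
ce = [1.0, 2.0, 5.0, 10.0]                      # equilibrium conc., mg/l
q = [10 * 0.5 * c / (1 + 0.5 * c) for c in ce]  # adsorbed amount, mg/g
qmax, kl = langmuir_fit(ce, q)
```

The fitted qmax then gives the adsorption capacity used to size a biochar column, and the breakthrough curve determines when that capacity is exhausted in flow-through operation.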

Keywords: biochar, nutrient recycling, wastewater treatment, soil amendment

Procedia PDF Downloads 122
5473 Kinetics of Hydrogen Sulfide Removal from Biogas Using Biofilm on Packed Bed of Salak Fruit Seeds

Authors: Retno A. S. Lestari, Wahyudi B. Sediawan, Siti Syamsiah, Sarto

Abstract:

Sulfur-oxidizing bacteria were isolated and then grown on salak fruit seeds, forming a biofilm on the surface, and their performance in sulfide removal was experimentally observed. In doing so, the salak fruit seeds carrying the biofilm were used as packing material in a cylinder. Biogas obtained from biological treatment, containing 27.95 ppm of hydrogen sulfide, was flowed through the packed bed; the hydrogen sulfide from the biogas was absorbed into the biofilm and then degraded by the microbes in the biofilm. The hydrogen sulfide concentrations at various axial positions and various times were analyzed. A set of simple kinetic models for the rate of sulfide removal and bacterial growth was proposed. Since the biofilm is very thin, the sulfide concentration in the biofilm at a given axial position is assumed to be uniform. The resulting simultaneous ordinary differential equations were solved numerically using the Runge-Kutta method, and the values of the parameters were obtained by curve-fitting. The accuracy of the proposed model was tested by comparing the calculated results with the experimental data, and it turned out that the model can describe the removal of sulfide using a biofilter in a packed bed. The biofilter removed 89.83% of the hydrogen sulfide in the feed after 2.5 hr of operation at a biogas flow rate of 30 L/hr.
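The numerical step the abstract describes, solving the kinetic ODEs with a Runge-Kutta method, can be sketched on a deliberately simplified model: first-order sulfide decay dC/dt = -kC, integrated with classical fourth-order Runge-Kutta. The rate constant is an illustrative assumption; the paper fitted its own coupled removal-and-growth model by curve-fitting.

```python
# Classical RK4 applied to a simplified first-order sulfide removal model
# dC/dt = -k*C. The inlet concentration (27.95 ppm) and the 2.5 hr horizon
# come from the abstract; the rate constant k is assumed for illustration.

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step of size h."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h * k1 / 2)
    k3 = f(t + h / 2, y + h * k2 / 2)
    k4 = f(t + h, y + h * k3)
    return y + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

k = 0.9                      # assumed first-order rate constant, 1/hr
f = lambda t, c: -k * c      # simplified removal kinetics

c = 27.95                    # inlet H2S concentration, ppm (from abstract)
t, h = 0.0, 0.05
while t < 2.5 - 1e-12:       # integrate over 2.5 hr of operation
    c = rk4_step(f, t, c, h)
    t += h
removal = 100 * (1 - c / 27.95)  # percent H2S removed
```

With k = 0.9 hr⁻¹ this toy model predicts roughly the ~90% removal the paper reports, which is exactly the kind of agreement the authors obtained by fitting their parameters to the measured axial concentration profiles.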

Keywords: sulfur-oxidizing bacteria, salak fruit seeds, biofilm, packing material, biogas

Procedia PDF Downloads 198
5472 Reliability-Based Life-Cycle Cost Model for Engineering Systems

Authors: Reza Lotfalian, Sudarshan Martins, Peter Radziszewski

Abstract:

The effect of reliability on the life-cycle cost of a system, including its initial and maintenance costs, is studied. The failure probability of a component is used to calculate the average maintenance cost during the operation cycle of the component. The standard deviation of the life-cycle cost is also calculated as an error measure for the average life-cycle cost. As a numerical example, the model is used to study the average life-cycle cost of an electric motor.
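The structure of the cost model just described can be sketched directly: the expected maintenance cost per operating cycle is the failure probability times the repair cost, summed over cycles and added to the initial cost, with the standard deviation following from treating failures as Bernoulli trials. All numbers below are illustrative, not from the paper.

```python
# Minimal reliability-based life-cycle cost sketch. Failures per cycle are
# modeled as independent Bernoulli trials with probability p_fail, so the
# maintenance cost is a scaled binomial variable with a closed-form mean
# and standard deviation. Input values are hypothetical.

def life_cycle_cost(initial, p_fail, repair_cost, n_cycles):
    """Return (expected life-cycle cost, its standard deviation)."""
    mean_maint = n_cycles * p_fail * repair_cost
    # variance of a binomial failure count, scaled by the repair cost
    var = n_cycles * p_fail * (1 - p_fail) * repair_cost ** 2
    return initial + mean_maint, var ** 0.5

# Hypothetical electric-motor example: $5000 purchase price, 2% failure
# probability per cycle, $400 per repair, 100 operating cycles.
lcc, sigma = life_cycle_cost(initial=5000.0, p_fail=0.02,
                             repair_cost=400.0, n_cycles=100)
```

The standard deviation serves the role the abstract describes: an error bar on the average life-cycle cost that shrinks (relative to the mean) as the number of cycles grows.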

Keywords: initial cost, life-cycle cost, maintenance cost, reliability

Procedia PDF Downloads 576
5471 Enhancing Signal Reception in a Mobile Radio Network Using Adaptive Beamforming Antenna Arrays Technology

Authors: Ugwu O. C., Mamah R. O., Awudu W. S.

Abstract:

This work is aimed at enhancing signal reception and minimizing outage probability in a mobile radio network using adaptive beamforming antenna arrays. An empirical real-time drive measurement was carried out in a cellular network of Globalcom Nigeria Limited located at Ikeja, the capital of Lagos State, Nigeria, with reference base station number KJA 004. The empirical measurements include the Received Signal Strength and Bit Error Rate, which were recorded for exact prediction of the signal strength of the network at the time of this research. The Received Signal Strength and Bit Error Rate were measured with a spectrum monitoring van, with the help of a ray tracer, at intervals of 100 meters up to 700 meters from the transmitting base station. The distance and angular location measurements from the reference network were made with the help of a Global Positioning System (GPS). The other equipment used included transmitting equipment measurement software (Temsoftware), laptops and log files, which showed the received signal strength with distance from the base station. An outage of about 11% was obtained from the real-time experiment, showing that mobile radio networks are prone to signal failure, which can be minimized using an adaptive beamforming antenna array through a significant reduction in Bit Error Rate, implying improved performance of the mobile radio network. In addition, this work not only included empirical measurements but also developed and implemented enhanced mathematical models as a reference for accurate prediction. The proposed signal models were based on the analysis of continuous time and discrete space, among other assumptions. These enhanced models were validated using a MATLAB (version 7.6.3.35) program and compared with a conventional antenna for accuracy.
The outage models were used to manage the blocked-call experience in the mobile radio network. A 20% improvement was obtained when the adaptive beamforming antenna arrays were implemented on the wireless mobile radio network.
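The core mechanism behind the beamforming gains reported above can be sketched for a uniform linear array: phase-conjugate (steering) weights point the main lobe at the desired user, so signals from other directions are strongly attenuated. The array geometry and angles below are illustrative assumptions; the abstract does not specify the adaptive algorithm used.

```python
# Beamforming sketch for an N-element uniform linear array: steering
# weights maximize the response toward a chosen angle. Geometry and
# angles here are illustrative, not from the paper.
import cmath
import math

def steering_vector(n_elems, d_over_lambda, theta_deg):
    """Per-element phase response of a uniform linear array at angle theta."""
    phi = 2 * math.pi * d_over_lambda * math.sin(math.radians(theta_deg))
    return [cmath.exp(1j * k * phi) for k in range(n_elems)]

def array_gain(weights, theta_deg, d_over_lambda=0.5):
    """Magnitude of the weighted array response toward theta."""
    v = steering_vector(len(weights), d_over_lambda, theta_deg)
    return abs(sum(w.conjugate() * vk for w, vk in zip(weights, v)))

# Steer an 8-element, half-wavelength-spaced array toward 20 degrees.
w = steering_vector(8, 0.5, 20.0)
gain_desired = array_gain(w, 20.0)   # main lobe: full array gain of 8
gain_off = array_gain(w, -40.0)      # far off the main lobe
```

An adaptive beamformer updates `w` from the received data rather than fixing it, but the same response calculation explains why pointing the main lobe at the serving user reduces Bit Error Rate and outage.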

Keywords: beamforming algorithm, adaptive beamforming, simulink, reception

Procedia PDF Downloads 14
5470 Reducing Environmental Impact of Olive Oil Production in Sakaka City Using Combined Chemical, Physical, and Biological Treatment

Authors: Abdullah Alhajoj, Bassam Alowaiesh

Abstract:

This work aims to reduce the risks of discharging olive mill waste directly into the environment without treatment in Sakaka City, KSA. The organic loads, expressed as chemical oxygen demand (COD) and biological oxygen demand (BOD), of the produced wastewater (OMWW) as well as the solid waste (OMW) were evaluated. The wastes emitted from three-phase centrifuge decanters were found to be higher than those emitted from two-phase centrifuge decanters. The olive mill wastewater (OMWW) was treated using advanced oxidation combined with filtration. The results indicated that the concentrations of COD, BOD, TSS, oil and grease, and phenol were reduced by complex sand filtration from 72150, 21660, 10256, 36430, and 1470 mg/l to 980, 421, 58, 68, and 0.35 mg/l for three-phase OMWW, and from 150562, 17955, 15325, 19658, and 2153 mg/l to 1050, 501, 29, 0.75, and 0.29 mg/l, respectively. Using a modified trickling filter (packed with the necks of waste plastic bottles), the concentrations of the same parameters were reduced to 1190, 570, 55, 0.85, and 0.3 mg/l, respectively. This work supports the application of such treatment techniques for reducing the environmental threats of olive mill waste effluents in Saudi Arabia.
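The influent and effluent concentrations reported in the abstract imply removal efficiencies that are easy to compute explicitly. A small sketch for the complex sand filter on the three-phase OMWW stream, using the abstract's own figures:

```python
# Percent removal implied by the abstract's influent/effluent figures for
# the complex sand filter on three-phase olive mill wastewater (mg/l).

def removal_pct(inlet, outlet):
    """Percentage reduction from influent to effluent concentration."""
    return 100.0 * (inlet - outlet) / inlet

# (influent, effluent) pairs taken from the abstract, three-phase OMWW.
three_phase = {"COD": (72150, 980),
               "BOD": (21660, 421),
               "TSS": (10256, 58),
               "oil_grease": (36430, 68)}
efficiency = {k: removal_pct(*v) for k, v in three_phase.items()}
```

Every parameter comes out above 98% removal, which is the quantitative basis for the paper's conclusion that the combined chemical-physical-biological train is adequate for these effluents.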

Keywords: two-phase, three-phase, olive mill, olive oil, waste treatment, filtration, advanced oxidation, waste plastic bottles

Procedia PDF Downloads 138
5469 Treatment of Grey Water from Different Restaurants in FUTA Using Fungi

Authors: F. A. Ogundolie, F. Okogue, D. V. Adegunloye

Abstract:

Greywater samples were obtained from three restaurants in the Federal University of Technology, Akure, coded SSR, MGR and GGR. Fungal isolates obtained include Rhizopus stolonifer, Aspergillus niger, Mucor mucedo, Aspergillus flavus and Saccharomyces cerevisiae. Of these, R. stolonifer, A. niger and A. flavus showed significant ability to degrade greywater and were used for this research. A simple bioreactor was constructed employing a biodegradation process for the purification of the wastewater samples. The wastewater underwent primary treatment; secondary treatment, which involved the introduction of the isolated organisms into the wastewater sample; and tertiary treatment, which involved the use of a filter candle and a sand-bed filtration process, achieving the end product without the use of chemicals. A. niger brought about a significant reduction in both the bacterial and fungal loads of the greywater samples from the three restaurants, with reductions in bacterial load of 1.29 × 10⁸ to 1.57 × 10² cfu/ml, 1.04 × 10⁸ to 1.12 × 10² cfu/ml and 1.72 × 10⁸ to 1.60 × 10² cfu/ml for SSR, MGR and GGR, respectively, and reductions in fungal load of 2.01 × 10⁴ to 1.2 × 10¹, 1.72 × 10⁴ to 1.1 × 10¹, and 2.50 × 10⁴ to 1.5 × 10¹ for SSR, MGR and GGR, respectively. These degradation results show that A. niger was probably the most potent in the degradation of organic matter and hence could be used in the treatment of wastewater.
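The before-and-after counts in the abstract translate into log reductions, the standard way of reporting disinfection performance. A short sketch computing them from the abstract's bacterial-load figures:

```python
# Log10 reductions in bacterial load implied by the abstract's counts
# for the A. niger treatment of each restaurant's greywater sample.
import math

def log_reduction(before_cfu, after_cfu):
    """log10 reduction in microbial load."""
    return math.log10(before_cfu / after_cfu)

# Bacterial loads (cfu/ml) before and after treatment, from the abstract.
bacteria = {"SSR": (1.29e8, 1.57e2),
            "MGR": (1.04e8, 1.12e2),
            "GGR": (1.72e8, 1.60e2)}
reductions = {k: log_reduction(*v) for k, v in bacteria.items()}
```

All three samples show roughly a 6-log (about 99.9999%) reduction, which makes the comparison between restaurants, and between fungal strains, much easier to read than the raw counts.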

Keywords: Aspergillus niger, greywater, bacteria, fungi, microbial load, bioreactor, biodegradation, purification, organic matter, filtration

Procedia PDF Downloads 293
5468 Application of the Global Optimization Techniques to the Optical Thin Film Design

Authors: D. Li

Abstract:

Optical thin films are used in a wide variety of optical components, and many software tools have been programmed to advance multilayer thin film design. The available software packages for designing thin film structures may not provide optimum designs. Almost all current software programs obtain their final designs either by optimizing a starting guess or by techniques, which may or may not involve a pseudorandom process, that give different answers every time depending upon the initial conditions. With the increasing power of personal computers, functional methods for the optimization and synthesis of optical multilayer systems have been developed, such as DGL Optimization, Simulated Annealing, Genetic Algorithms, Needle Optimization, Inductive Optimization and Flip-Flop Optimization. Among these, DGL Optimization has proved its efficiency in optical thin film design. The application of the DGL optimization technique to the design of optical coatings is presented: the technique is described, its main features are discussed, and guidelines on applying it to various types of design problems are given. The innovative global optimization strategies used in a software tool, OnlyFilm, to optimize multilayer thin film designs through different filter designs are outlined. OnlyFilm is a powerful, versatile, and user-friendly thin film software package that combines optimization and synthesis design capabilities with powerful analytical tools for optical thin film designers. It is also the only thin film design software that offers a true global optimization function.
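One of the global-optimization strategies named in the abstract, simulated annealing, can be illustrated on a toy merit function. Real thin-film merit functions measure the deviation of a computed spectrum from a target; the one-dimensional quadratic below is only a stand-in, and all parameters are illustrative assumptions.

```python
# Toy simulated-annealing sketch: accept downhill moves always and uphill
# moves with Boltzmann probability exp(-delta/T), cooling T geometrically.
# The merit function here is a stand-in for a real spectral-error merit.
import math
import random

def simulated_annealing(merit, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=2000, seed=0):
    rng = random.Random(seed)  # seeded for reproducibility
    x = x0
    best = merit(x0)
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        delta = merit(cand) - merit(x)
        # always accept improvements; accept worsenings with prob exp(-delta/T)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
            best = min(best, merit(x))
        t *= cooling
    return x, best

# Stand-in merit function with its minimum (value 0) at x = 3.
merit = lambda x: (x - 3.0) ** 2
x_opt, f_opt = simulated_annealing(merit, x0=-5.0)
```

The early high-temperature phase lets the search escape local minima, which is the property that distinguishes such global strategies from the refine-a-starting-guess approach the abstract criticizes.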

Keywords: optical coatings, optimization, design software, thin film design

Procedia PDF Downloads 294