Search results for: Dust removing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 242

92 SIMGraph: Simplifying Contig Graph to Improve de Novo Genome Assembly Using Next-generation Sequencing Data

Authors: Chien-Ju Li, Chun-Hui Yu, Chi-Chuan Hwang, Tsunglin Liu, Darby Tien-Hao Chang

Abstract:

De novo genome assembly is always fragmented. Assembly fragmentation is more serious with the popular next-generation sequencing (NGS) data because NGS sequences are shorter than traditional Sanger sequences. As the data throughput of NGS is high, fragmentation in assemblies is usually not the result of missing data. On the contrary, the assembled sequences, called contigs, are often connected to more than one other contig in a complicated manner, and these connections lead to the fragmentation. In such a network of contig connections, called a contig graph, false connections are inevitable because of repeats and sequencing/assembly errors. Simplifying a contig graph by removing false connections directly improves genome assembly. In this work, we have developed a tool, SIMGraph, to resolve ambiguous connections between contigs using NGS data. Applying SIMGraph to the assemblies of a fungus and a fish genome, we resolved 27.6% and 60.3% of the ambiguous contig connections, respectively. These results can reduce the experimental effort needed to resolve contig connections.
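
As an illustration of the kind of graph simplification described above (not the authors' actual SIMGraph implementation), the short Python sketch below represents a contig graph as an adjacency map and flags contigs whose connections are ambiguous, i.e. contigs linked to more than one neighbour; the contig names and connections are hypothetical.

    # Hypothetical sketch: find ambiguous connections in a contig graph.
    # A contig graph maps each contig to the contigs it is connected to.
    contig_graph = {
        "contig_A": {"contig_B", "contig_C"},   # ambiguous: two outgoing connections
        "contig_B": {"contig_D"},
        "contig_C": {"contig_D"},
        "contig_D": set(),
    }

    def ambiguous_contigs(graph):
        """Return contigs connected to more than one other contig."""
        return [c for c, neighbours in graph.items() if len(neighbours) > 1]

    def remove_connection(graph, src, dst):
        """Drop a connection judged false (e.g. unsupported by read pairs)."""
        graph[src].discard(dst)

    print(ambiguous_contigs(contig_graph))   # ['contig_A']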

Keywords: Contig graph, NGS, de novo assembly, scaffold.

91 The Sequestration of Heavy Metals Contaminating the Wonderfonteinspruit Catchment Area using Natural Zeolite

Authors: P.P. Diale, S.S.L. Mkhize, E. Muzenda, J. Zimba

Abstract:

For more than 120 years, gold mining formed the backbone of South Africa's economy. The consequences of mine closure were large-scale land degradation and widespread pollution of surface water and groundwater. This paper investigates the feasibility of using natural zeolite to remove the heavy metals contaminating the Wonderfonteinspruit Catchment Area (WCA), a water stream with high levels of heavy metal and radionuclide pollution. Batch experiments were conducted to study the adsorption behavior of natural zeolite with respect to Fe2+, Mn2+, Ni2+, and Zn2+. The data were analysed using the Langmuir and Freundlich isotherms. The Langmuir isotherm was found to correlate the adsorption of Fe2+, Mn2+, Ni2+, and Zn2+ better, with adsorption capacities of 11.9 mg/g, 1.2 mg/g, 1.3 mg/g, and 14.7 mg/g, respectively. Two kinetic models, namely pseudo-first-order and pseudo-second-order, were also tested to fit the data. The pseudo-second-order equation was found to be the best fit for the adsorption of heavy metals by natural zeolite. Functionalizing the zeolite with humic acid increased its uptake ability.
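
For readers who want to reproduce this type of isotherm analysis, the sketch below fits the standard Langmuir model qe = qmax·KL·Ce/(1 + KL·Ce) to equilibrium data with SciPy; the data points are invented for illustration and are not taken from the paper.

    # Illustrative Langmuir isotherm fit (example data, not the paper's measurements).
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(ce, qmax, kl):
        """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
        return qmax * kl * ce / (1.0 + kl * ce)

    ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # equilibrium concentration, mg/L
    qe = np.array([4.1, 6.8, 9.2, 10.8, 11.5])     # uptake, mg/g

    params, _ = curve_fit(langmuir, ce, qe, p0=[12.0, 0.1])
    qmax_fit, kl_fit = params
    print(f"q_max = {qmax_fit:.1f} mg/g, K_L = {kl_fit:.3f} L/mg")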

Keywords: gold mining, natural zeolites, water pollution, West Rand.

90 Surface Roughness and MRR Effect on Manual Plasma Arc Cutting Machining

Authors: R. Bhuvenesh, M.H. Norizaman, M.S. Abdul Manan

Abstract:

Industrial surveys show that manufacturing companies judge the quality of a thermal cutting process by the dimensions and physical appearance of the cut surface. Therefore, the roughness of the surface cut by the plasma arc cutting process and the rate at which material is removed by the manual plasma arc cutting machine were the responses considered. A Selco Genesis 90 plasma arc cutter was used to cut standard AISI 1017 steel of 200 mm × 100 mm × 6 mm manually, based on the selected parameter settings. The material removal rate (MRR) was measured by determining the weight of the specimens before and after the cutting process. The surface roughness (SR) analysis was conducted using a Mitutoyo CS-3100 to determine the average roughness value (Ra). The Taguchi method was utilized to find the optimum condition for both outputs studied. The microstructure in the region of the cutting surface was analysed using SEM. The results reveal that the SR values are inversely proportional to the MRR values. The quality of the surface roughness depends on the dross peak that occurs after the cutting process.
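
As a sketch of the two response calculations mentioned above (weight-based MRR and a Taguchi signal-to-noise ratio), the following lines use invented specimen weights and cutting times; the "larger-the-better" S/N formula is the standard Taguchi form, not a value reported in the paper.

    import math

    # Hypothetical example: MRR from weight loss, plus Taguchi larger-the-better S/N.
    w_before_g, w_after_g = 940.0, 902.0      # specimen weight before/after cutting (g)
    cut_time_min = 1.5                        # cutting time (min)

    mrr = (w_before_g - w_after_g) / cut_time_min     # g/min removed
    print(f"MRR = {mrr:.1f} g/min")

    def sn_larger_is_better(values):
        """Taguchi S/N ratio: -10*log10(mean(1/y^2)), used when a larger response is better."""
        return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / len(values))

    print(f"S/N = {sn_larger_is_better([24.0, 25.3, 23.8]):.2f} dB")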

Keywords: Material removal rate, plasma arc cutting, surface roughness, Taguchi method

89 Construction of cDNA Library and EST Analysis of Tenebrio molitor Larvae

Authors: Ji Eun Jeong, Se-Won Kang, Hee-Ju Hwang, Sung-Hwa Chae, Sang-Haeng Choi, Hong-Seog Park, Yeon Soo Han, Bok-Reul Lee, Dae-Hyun Seog, Yong Seok Lee

Abstract:

To further advance research on immune-related genes from T. molitor, we constructed a cDNA library and analyzed expressed sequence tag (EST) sequences from 1,056 clones. After removing vector sequences and quality checking through the Phred program (trim_alt 0.05, P-score > 20), 1,039 sequences were generated. The average insert length was 792 bp. In addition, we identified 162 clusters, 167 contigs and 391 contigs after the clustering and assembly process using the TGICL package. EST sequences were searched against the NCBI nr database by local BLAST (blastx, E…).

Keywords: EST, innate immunity, Tenebrio molitor

88 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques

Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas

Abstract:

The reason for conducting this research is to develop an algorithm that is capable of classifying news articles from the automobile industry, according to the competitive actions that they entail, with the use of Text Mining (TM) methods. It was necessary to test how to properly preprocess the data for this research by preparing pipelines that fit each algorithm best. The pipelines are tested along with nine different classification algorithms in the realm of regression, support vector machines, and neural networks. Preliminary testing to identify the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines. The two algorithms are Logistic Regression (LR) and Artificial Neural Network (ANN). These algorithms are optimized further by testing several parameters of each. The best result is achieved with the ANN. The final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm is capable of reaching an accuracy of 94%.
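
A minimal sketch of the kind of text-classification pipeline described here, using scikit-learn's TF-IDF vectorizer with logistic regression; the example texts, labels and parameters are placeholders, not the dataset or tuned settings from the study.

    # Illustrative text-classification pipeline (placeholder data, not the study's corpus).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline

    texts = [
        "Automaker X cuts prices on its electric models",
        "Supplier Y signs new battery partnership",
        "Brand Z recalls vehicles over software fault",
    ]
    labels = ["pricing", "partnership", "recall"]   # hypothetical competitive-action classes

    pipeline = Pipeline([
        ("tfidf", TfidfVectorizer(lowercase=True, stop_words="english")),
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    pipeline.fit(texts, labels)
    print(pipeline.predict(["Automaker X announces a recall"]))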

Keywords: Artificial neural network, competitive dynamics, logistic regression, text classification, text mining.

87 Stability and Kinetic Analysis during Vermicomposting of Sewage Sludge

Authors: Ashish Kumar Nayak, Dhamodharan K., Ajay S. Kalamdhad

Abstract:

The present study aims to convert sewage sludge into a stable compost product through vermicomposting of sewage sludge mixed with cattle manure and saw dust in five different proportions based on C/N ratios (C/N 15 (R1), 20 (R2), 25 (R3) and 30 (R4); and control (R5)), employing the epigeic earthworm Eisenia fetida. The higher reductions in C/N ratio, CO2 evolution and OUR observed in R4 demonstrated the stability of the compost. In addition, R4 proved to be the best combination for the growth of the earthworms. In order to observe the optimal degradation, the kinetics of organic matter degradation during vermicomposting were quantitatively evaluated. A model was developed by assuming that the composting process is carried out in a homogeneous way and that the kinetics of the decomposition reaction are represented by a Monod-type equation. The results exhibit comparable variations in the kinetic constants Km and K3 under varying parameters during the vermicomposting process. The higher R2 value obtained for R4 indicated a better fit to the Lineweaver-Burk plot. The higher degradability coefficient (K) of R4 reveals an optimal nutrient balance, which not only enhanced the affinity of the enzymes towards the substrate but also improved the degradation process. Therefore, R4 proved to be the best feed combination for the vermicomposting process compared to the other reactors.
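
For reference, a Monod-type (Michaelis-Menten-like) rate law and its Lineweaver-Burk linearization, of the general form alluded to above, can be written as follows; the symbols are the generic ones, since the abstract does not spell out the authors' exact formulation.

    % Generic Monod-type rate law and its Lineweaver-Burk linearization.
    \[
      v \;=\; \frac{V_{\max}\, S}{K_m + S}
      \qquad\Longrightarrow\qquad
      \frac{1}{v} \;=\; \frac{K_m}{V_{\max}}\cdot\frac{1}{S} \;+\; \frac{1}{V_{\max}}
    \]
    % Plotting 1/v against 1/S gives a straight line with slope K_m/V_max and
    % intercept 1/V_max, which is how the kinetic constants are estimated.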

Keywords: Vermicomposting, Eisenia fetida, Sewage sludge, C/N ratio, Stability, Enzyme kinetics concept.

86 NiO-CeO2 Nano-Catalyst for the Removal of Priority Organic Pollutants from Wastewater through Catalytic Wet Air Oxidation at Mild Conditions

Authors: Anushree, Chhaya Sharma, Satish Kumar

Abstract:

Catalytic wet air oxidation (CWAO) is normally carried out at elevated temperature and pressure. This work investigates the potential of a NiO-CeO2 nano-catalyst in the CWAO of paper industry wastewater under the milder operating conditions of 90 °C and 1 atm. The NiO-CeO2 nano-catalysts were synthesized by a simple co-precipitation method and characterized by X-ray diffraction (XRD), before and after use, in order to study any crystallographic change during the experiment. The extent of metal leaching from the catalyst was determined using inductively coupled plasma optical emission spectrometry (ICP-OES). The catalytic activity of the nano-catalysts was studied in terms of total organic carbon (TOC), adsorbable organic halides (AOX) and chlorophenolics (CHPs) removal. Interestingly, the mixed oxide catalysts exhibited higher activity than the corresponding single-metal oxides. The maximum removal efficiency was achieved with the Ce40Ni60 catalyst. The results indicate that the CWAO process is efficient in removing the priority organic pollutants from wastewater, as it achieved up to 59% TOC, 55% AOX, and 54% CHPs removal.

Keywords: Nano-materials, NiO-CeO2, wastewater, wet air oxidation.

85 Influence of Heterogeneous Traffic on the Roadside Fine (PM2.5 and PM1) and Coarse (PM10) Particulate Matter Concentrations in Chennai City, India

Authors: Srimuruganandam B., S. M. Shiva Nagendra

Abstract:

In this paper, the influence of heterogeneous traffic on the temporal variation of ambient PM10, PM2.5 and PM1 concentrations at a busy arterial route (Sardar Patel Road) in Chennai city has been analyzed. The hourly PM concentrations, traffic counts and average vehicle speeds were monitored at the study site for one week (19th-25th January 2009). Results indicated that coarse (PM10) and fine (PM2.5 and PM1) PM concentrations at SP Road follow a similar trend during peak and non-peak hours, irrespective of the day. The PM concentrations showed two daily peaks corresponding to the morning (8 to 10 am) and evening (7 to 9 pm) peak-hour traffic flow. The PM10 concentration is dominated by fine particles (PM2.5 contributes 53% and PM1 45% of PM10). The high PM2.5/PM10 ratio indicates that the majority of PM10 particles originate from re-suspension of road dust. The analysis of traffic flow at the study site showed that 2W, 3W and 4W vehicles have a diurnal trend similar to the PM concentrations. This confirms that the 2W, 3W and 4W vehicles are the main emission sources contributing to the ambient PM concentration at SP Road. The speed measurements at SP Road showed that the average speeds of 2W, 3W, 4W, LCV and HCV are 38, 40, 38, 40 and 38 km/hr and 43, 41, 42, 40 and 41 km/hr for weekdays and weekends, respectively.

Keywords: particulate matter, heterogeneous traffic, fine particles, coarse particles, vehicle speed, weekend and weekday.

84 Effects of Increased Green Surface on a Densely Built Urban Fabric: The Case of Budapest

Authors: Viktória Sugár, Orsolya Frick, Gabriella Horváth, A. Bendegúz Vöröss, Péter Leczovics, Géza Baráth

Abstract:

Urban greenery has multiple positive effects on both the city and its residents. Apart from the visual advantages, it changes the micro-climate by cooling and shading, while also increasing vapor and oxygen and reducing dust and carbon-dioxide content. These are all critical factors in the livability of an urban fabric. Unfortunately, in a dense, historical district there are restricted possibilities for creating green surfaces. The present study collects and systematizes the applicable green solutions for a historical downtown district of Budapest. The study contains a GIS-based measurement of the surfaces eligible for greenery, and also calculates the oxygen production, carbon-dioxide reduction and cooling potential of an increased green surface. It can be concluded that increasing the green surface has measurable effects on a densely built urban fabric, including air quality, micro-climate and other environmental factors.

Keywords: Urban greenery, green roof, green wall, green surface potential, sustainable city, oxygen production, carbon-dioxide reduction, geographical information system, GIS.

83 Successful Management of a Boy with Mild Persistent Asthma (A Longitudinal Case)

Authors: Lubis A., Setiawati L., Setyoningrum A. R., Suryawan A., Irwanto

Abstract:

Asthma is a condition that causes chronic health problems in children. In addition to basic therapy against the disease, we must try to reduce the impact of these chronic health problems and also optimize the medical aspects of growth and development. A boy with mild asthma with frequent episodes did not show any improvement with medical treatment, and his asthma control test score was 11. Radiologic examination showed hyperaerated lungs and bilateral maxillary sinusitis; skin tests indicated house dust, food and pet allergies; he was overweight, had poor school grades, and had psychological and environmental problems. We followed and evaluated this boy for 6 months and treated him holistically. Although we could not do much about the environmental factors, the psychological and school problems resolved, he reached a healthy body weight, and his asthma control test score rose to 22. A case of a child with mild asthma with frequent episodes was reported. The clinical course of asthma shows no significant improvement when other predisposing factors are not well controlled, and a child's growth and development may be affected. The patient's condition can be improved with loving and caring nurturing from the parents and a supportive peer group. Therefore, continuous and consistent monitoring is required, because the prognosis of asthma is generally good when it is regularly and properly controlled.

Keywords: Asthma, chronic health problems, growth and development.

82 Pilot-scale Study of Horizontal Anaerobic Digester for Biogas Production using Food Waste

Authors: Yongsei Lee, Hyunsu Park, Youngseob Yu, Heechan Yoo, Sungin Yoo

Abstract:

A horizontal anaerobic digester was developed and tested at pilot scale for Korean food waste with high water content (>80%). The hydrogen sulfide in the biogas was removed by biological desulfurization equipment integrated into the horizontal digester. The mixer of the horizontal digester was designed to easily remove the sediment at the bottom and the scum layers on the surface of the digester. Experimental results from 120 days of operation of the pilot plant showed a high removal efficiency of 81.2% for organic matter and high stability during the whole operation period. Food waste was treated at high organic loading rates of over 4 kg VS/m³·day, and a methane production rate of 0.62 m³/kg VS removed was achieved. The biological desulfurization equipment inside the horizontal digester proved to be an economic and effective way of reducing the biogas desulfurization cost, removing more than 90% of the hydrogen sulfide without external desulfurization equipment.

Keywords: Biogas, Biological desulfurization, Horizontal anaerobic digester, Korean food waste

81 A Forward Automatic Censored Cell-Averaging Detector for Multiple Target Situations in Log-Normal Clutter

Authors: Musa'ed N. Almarshad, Saleh A. Alshebeili, Mourad Barkat

Abstract:

A challenging problem in radar signal processing is to achieve reliable target detection in the presence of interferences. In this paper, we propose a novel algorithm for automatic censoring of radar interfering targets in log-normal clutter. The proposed algorithm, termed the forward automatic censored cell averaging detector (F-ACCAD), consists of two steps: removing the corrupted reference cells (censoring) and the actual detection. Both steps are performed dynamically by using a suitable set of ranked cells to estimate the unknown background level and set the adaptive thresholds accordingly. The F-ACCAD algorithm requires neither prior information about the clutter parameters nor knowledge of the number of interfering targets. The effectiveness of the F-ACCAD algorithm is assessed by computing, using Monte Carlo simulations, the probability of censoring and the probability of detection in different background environments.
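
To make the two-step idea concrete (censoring the largest reference cells, then cell-averaging detection), here is a heavily simplified NumPy sketch; the window size, censoring depth and threshold factor are arbitrary illustrative values, and the statistics are not the log-normal formulation used by F-ACCAD.

    import numpy as np

    def censored_ca_detector(reference_cells, test_cell, n_censor=2, alpha=4.0):
        """Simplified censored cell-averaging detector (illustrative only).

        1. Rank the reference cells and discard the n_censor largest ones,
           which may be corrupted by interfering targets.
        2. Estimate the background level from the remaining cells and
           compare the test cell against an adaptive threshold.
        """
        ranked = np.sort(np.asarray(reference_cells, dtype=float))
        kept = ranked[: len(ranked) - n_censor]          # censoring step
        background = kept.mean()                         # background estimate
        return test_cell > alpha * background            # detection decision

    # Example: 16 reference cells, two of them hit by interfering targets.
    rng = np.random.default_rng(0)
    cells = rng.exponential(1.0, 16)
    cells[3], cells[9] = 25.0, 30.0                      # interferers
    print(censored_ca_detector(cells, test_cell=18.0))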

Keywords: CFAR, Log-normal clutter, Censoring, Probability of detection, Probability of false alarm, Probability of false censoring.

80 Neural Network Based Determination of Splice Junctions by ROC Analysis

Authors: S. Makal, L. Ozyilmaz, S. Palavaroglu

Abstract:

A gene, the principal unit of inheritance, is an ordered sequence of nucleotides. The genes of eukaryotic organisms include alternating segments of exons and introns. The region of deoxyribonucleic acid (DNA) within a gene containing instructions for coding a protein is called an exon. Non-coding regions called introns, on the other hand, regulate gene expression and are removed from the messenger ribonucleic acid (RNA) in a splicing process. This paper proposes to determine splice junctions, i.e. exon-intron boundaries, by analyzing DNA sequences. A splice junction can be either exon-intron (EI) or intron-exon (IE). Because of the popularity and suitability of artificial neural networks (ANN) in genetic fields, various ANN models are applied in this research. Multi-layer Perceptron (MLP), Radial Basis Function (RBF) and Generalized Regression Neural Networks (GRNN) are used to analyze and detect the splice junctions of gene sequences. 10-fold cross validation is used to demonstrate the accuracy of the networks. The real performances of these networks are found by applying Receiver Operating Characteristic (ROC) analysis.
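
As a sketch of the ROC step described above, the following lines one-hot encode short DNA windows, train a small MLP, and compute an ROC curve with scikit-learn; the sequences and labels are toy placeholders rather than the splice-junction dataset used in the paper.

    # Toy ROC analysis for a splice-junction-style classifier (placeholder data).
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import roc_curve, auc

    def one_hot(seq):
        """One-hot encode a DNA string (A, C, G, T) into a flat feature vector."""
        table = {"A": [1, 0, 0, 0], "C": [0, 1, 0, 0], "G": [0, 0, 1, 0], "T": [0, 0, 0, 1]}
        return np.array([v for base in seq for v in table[base]])

    seqs = ["ACGTGTAA", "GTAAGTCA", "CCGTACGT", "GTATGTGC", "TTCCAAGG", "GTAAGAAC"]
    labels = [0, 1, 0, 1, 0, 1]          # 1 = exon-intron boundary (hypothetical)
    X = np.vstack([one_hot(s) for s in seqs])

    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, labels)
    scores = clf.predict_proba(X)[:, 1]
    fpr, tpr, _ = roc_curve(labels, scores)
    print(f"AUC = {auc(fpr, tpr):.2f}")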

Keywords: Gene, neural networks, ROC analysis, splice junctions.

79 On-line Handwritten Character Recognition: An Implementation of Counterpropagation Neural Net

Authors: Muhammad Faisal Zafar, Dzulkifli Mohamad, Razib M. Othman

Abstract:

On-line handwritten scripts are usually dealt with as pen-tip traces from pen-down to pen-up positions. The time evolution of the pen coordinates is also considered along with the trajectory information. However, the data obtained need a lot of preprocessing, including filtering, smoothing, slant removal and size normalization, before the recognition process. Instead of doing such lengthy preprocessing, this paper presents a simple approach to extract the useful character information. This work evaluates the use of the counter-propagation neural network (CPN) and presents the feature extraction mechanism in full detail for on-line handwriting recognition. The obtained recognition rates were 60% to 94% using the CPN for different sets of character samples. This paper also describes a performance study in which a recognition mechanism with multiple thresholds is evaluated for the counter-propagation architecture. The results indicate that the application of multiple thresholds has a significant effect on the recognition mechanism. The method is applicable to off-line character recognition as well. The technique is tested on upper-case English alphabets in a number of different styles from different people.

Keywords: On-line character recognition, character digitization, counter-propagation neural networks, extreme coordinates.

78 Removal of Heavy Metals from Water in the Presence of Organic Wastes: Fruit Peels

Authors: Berk Kılıç, Derin Dalgıç, Ela Mia Sevilla Levi, Ömer Aydın

Abstract:

In this experiment our goal was to remove heavy metals from water. Generally, the removal of toxic heavy metal ions, Cu2+, Cr6+ and Fe3+, from their aqueous solutions has been investigated with different kinds of plant peels; this study focuses on banana, peach, orange, and potato peels. The first step of the experiment was to wash the peels with distilled water and then dry them in an oven for 80 h at 80 °C. The peels were washed with NaOH and dried again at 80 °C for 2 days. Once the peels were washed and dried, 0.4 grams were weighed and added to a 200 mL sample of 0.1% heavy metal solution by mass. The mixing was done with a magnetic stirrer. A sample of each was taken at 15-minute intervals and the change in absorbance of the solutions was measured using a UV-Vis spectrophotometer. Among the waste products used, orange peel showed the best results, followed by banana peel, as the most efficient for our purposes. Moreover, the amount of fruit peel, the pH of the initial heavy metal solution, and the initial concentration of the heavy metal solutions were investigated to determine the effectiveness of the fruit peels for absorbency.

Keywords: Absorbance, heavy metal, removal of heavy metals, fruit peels.

77 Conservation and Repair Works for Traditional Timber Mosque in Malaysia: A Review on Techniques

Authors: N.K.F. Mustafa, S. Johar, A.G. Ahmad, S.H. Zulkarnain, M.Y. A. Rahman, A.I. Che Ani

Abstract:

A building's life cycle will never be free from defects and deterioration. These are common problems in buildings, whether newly built or aged. Buildings constructed from wood are particularly affected by such agents, and serious defects and damage can reduce the value of a building. In repair works, it is important to identify the causes and the repair techniques that best suit the condition. This paper reviews the conservation of traditional timber mosques in Malaysia, comprising the concept, principles and approaches of mosque conservation in general. In conservation practice, wood in historic buildings can be conserved using various restoration and conservation techniques, which can be grouped as full and partial replacement, mechanical reinforcement, consolidation by impregnation and reinforcement, paint removal, and preservation of wood and control of insect invasion, so as to prolong and extend the function of timber in a building. It was found that the common techniques adopted in timber mosque conservation are conventional ones, and that a proper understanding of the repair technique requires the use of preserved wood only, to prevent premature defects in the future.

Keywords: Building conservation, conservation principles, repair works, traditional timber mosque.

76 DCBOR: A Density Clustering Based on Outlier Removal

Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan

Abstract:

Data clustering is an important data exploration technique with many applications in data mining. We present an enhanced version of the well-known single-link clustering algorithm, which we refer to as DCBOR. The proposed algorithm alleviates the chain effect by removing the outliers from the given dataset, so it provides outlier detection and data clustering simultaneously. The algorithm does not need to update the distance matrix, since it merges the k nearest objects in a single step and the cluster continues to grow as long as possible under a specified condition. The algorithm consists of two phases: in the first phase, it removes the outliers from the input dataset; in the second phase, it performs the clustering process. The algorithm discovers clusters of different shapes, sizes and densities and requires only one input parameter, which represents a threshold for outlier points. The value of the input parameter ranges from 0 to 1, and the algorithm supports the user in determining an appropriate value for it. We have tested this algorithm on different datasets that contain outliers and clusters connected by chains of density points, and the algorithm discovers the correct clusters. The results of our experiments demonstrate the effectiveness and the efficiency of DCBOR.
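
The two-phase idea (remove outliers first, then cluster) can be sketched as follows; this is not the authors' DCBOR code, just an illustration that scores each point by the distance to its k-th nearest neighbour, drops the points whose score falls in the top fraction given by a threshold parameter, and leaves the rest for clustering.

    # Illustrative two-phase "remove outliers, then cluster" sketch (not DCBOR itself).
    import numpy as np

    def knn_distance(points, k=3):
        """Distance from each point to its k-th nearest neighbour."""
        diffs = points[:, None, :] - points[None, :, :]
        dists = np.sqrt((diffs ** 2).sum(-1))
        return np.sort(dists, axis=1)[:, k]     # column 0 is the point itself

    def remove_outliers(points, threshold=0.1, k=3):
        """Phase 1: drop the fraction `threshold` of points with the largest k-NN distance."""
        scores = knn_distance(points, k)
        cutoff = np.quantile(scores, 1.0 - threshold)
        return points[scores <= cutoff]

    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(0, 0.3, (50, 2)),
                      rng.normal(5, 0.3, (50, 2)),
                      [[20.0, 20.0]]])           # one obvious outlier
    clean = remove_outliers(data, threshold=0.05)
    print(len(data), "->", len(clean), "points after outlier removal")
    # Phase 2 would then run a single-link style clustering on `clean`.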

Keywords: Data Clustering, Clustering Algorithms, Handling Noise, Arbitrary Shape of Clusters.

75 Dynamics of Mini Hydraulic Backhoe Excavator: A Lagrange-Euler (L-E) Approach

Authors: Bhaveshkumar P. Patel, J. M. Prajapati

Abstract:

Excavators are high-power machines used in the mining, agricultural and construction industries whose principal functions are digging (material removal), ground leveling and material transport operations. During the digging task there are unknown forces exerted by the bucket on the soil, and the digging operation is repetitive in nature. Automation of the digging task can be performed by an automatically controlled excavator system, which must not only control the forces but also follow the planned digging trajectories. To develop such a controller for automated excavation, it is necessary to develop a dynamic model that describes the behavior of the control system during the digging operation and the motion of the excavator with time. The presented work describes a dynamic model needed for controller design, derived by applying the Lagrange-Euler approach. The developed dynamic model is intended for further development of an automated excavation control system for light-duty construction work and can be applied to heavy-duty or all types of backhoe excavators.
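
For context, the Lagrange-Euler formulation referred to above starts from the Lagrangian L = T - V (kinetic minus potential energy) and yields one equation of motion per joint; the generic manipulator form is shown below, with the symbols being the standard ones rather than notation taken from the paper.

    % Generic Lagrange-Euler equations of motion for an n-link manipulator
    % (the boom, arm and bucket of the backhoe can be treated as such links).
    \[
      \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{q}_i}\right)
      - \frac{\partial L}{\partial q_i} = \tau_i ,
      \qquad L = T - V,
    \]
    \[
      M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + G(q) = \tau ,
    \]
    % where q are the joint variables, M the inertia matrix, C the
    % Coriolis/centrifugal terms, G the gravity vector and tau the joint torques.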

Keywords: Backhoe excavator, controller, digging, excavation, trajectory.

74 Robust Statistics Based Algorithm to Remove Salt and Pepper Noise in Images

Authors: V. R. Vijaykumar, P. T. Vanathi, P. Kanagasabapathy, D. Ebenezer

Abstract:

In this paper, a robust-statistics-based filter to remove salt and pepper noise in digital images is presented. The algorithm first detects the corrupted pixels, since impulse noise affects only certain pixels in the image while the remaining pixels are uncorrupted. The corrupted pixels are then replaced by an estimated value computed with the proposed robust-statistics-based filter. The proposed method performs well in removing low to medium density impulse noise with detail preservation up to a noise density of 70%, compared to the standard median filter, weighted median filter, recursive weighted median filter, progressive switching median filter, signal-dependent rank-ordered mean filter, adaptive median filter and a recently proposed decision-based algorithm. The visual and quantitative results show that the proposed algorithm outperforms the others in restoring the original image, with superior preservation of edges and better suppression of impulse noise.
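
A compact illustration of the detect-then-replace strategy described above: pixels at the extreme values 0 or 255 are treated as impulse-corrupted and replaced by the median of their uncorrupted neighbours. This is a simplified stand-in for the paper's robust-statistics estimator, with a hard-coded 3x3 window.

    # Simplified detect-and-replace filter for salt & pepper noise (illustrative only).
    import numpy as np

    def remove_salt_pepper(img):
        """Replace pixels stuck at 0 or 255 with the median of their 3x3 neighbourhood."""
        out = img.copy()
        padded = np.pad(img, 1, mode="edge")
        corrupted = (img == 0) | (img == 255)          # impulse pixels take extreme values
        for r, c in zip(*np.nonzero(corrupted)):
            window = padded[r:r + 3, c:c + 3].ravel()
            clean = window[(window != 0) & (window != 255)]
            out[r, c] = np.median(clean) if clean.size else np.median(window)
        return out

    noisy = np.full((5, 5), 120, dtype=np.uint8)
    noisy[1, 1], noisy[3, 2] = 255, 0                  # inject salt and pepper
    print(remove_salt_pepper(noisy))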

Keywords: Image denoising, Nonlinear filter, Robust Statistics, and Salt and Pepper Noise.

73 Experimental Study of the Metal Foam Flow Conditioner for Orifice Plate Flowmeters

Authors: B. Manshoor, N. Ihsak, Amir Khalid

Abstract:

The sensitivity of orifice plate metering to disturbed flow (either asymmetric or swirling) is a subject of great concern to flow meter users and manufacturers. The distortions caused by pipe fittings and pipe installations upstream of the orifice plate are major sources of this type of non-standard flow. These distortions can alter the accuracy of metering to an unacceptable degree. In this work, a multi-scale object known as metal foam has been used to generate a predetermined turbulent flow upstream of the orifice plate. The experimental results showed that the combination of an orifice plate and a metal foam flow conditioner is broadly insensitive to upstream disturbances. The metal foam demonstrated good performance in terms of removing swirl and producing a repeatable flow profile within a short distance downstream of the device. The results of using a combination of a metal foam flow conditioner and an orifice plate for non-standard flow conditions, including swirling and asymmetric flow, show that this package can preserve the accuracy of metering up to the level required by the standards.

Keywords: Metal foam flow conditioner, flow measurement, orifice plate.

72 A Real-Time Rendering based on Efficient Updating of Static Objects Buffer

Authors: Youngjae Chun, Kyoungsu Oh

Abstract:

Real-time 3D applications have to guarantee interactive rendering speed, and there is a restriction on the number of polygons that can be rendered due to the performance of the graphics hardware or the graphics algorithms. Generally, rendering performance increases drastically when handling only the dynamic 3D models, which are much fewer than the static ones. Since the shapes and colors of static objects don't change when the viewing direction is fixed, their information can be reused. We render huge numbers of polygons that cannot be handled by conventional rendering techniques in real time by using a static-object image and merging it with the rendering result of the dynamic objects. Performance necessarily drops when the static-object image has to be updated, which includes removing a static object that starts to move and re-rendering the other static objects overlapped by the moving ones. Based on the visibility of the object beginning to move, we can skip the updating process. As a result, we enhance rendering performance and reduce the differences in rendering speed between frames. The proposed method renders a total of 200,000,000 polygons, consisting of 500,000 dynamic polygons and the rest static polygons, at about 100 frames per second.
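
The buffer-update decision described above can be sketched in a few lines of pseudocode-style Python; the visibility test and rasterization are represented by placeholder functions, since a real implementation would use the graphics API's occlusion queries rather than these hypothetical helpers.

    # Sketch of the static-buffer update decision (placeholder rendering calls).
    def is_visible(obj):
        """Stand-in for a hardware occlusion query on the object's bounding box."""
        return obj.get("visible", True)

    def render(objects):
        """Stand-in for rasterizing a list of objects; returns their names."""
        return [o["name"] for o in objects]

    def render_frame(static_objects, dynamic_objects, static_buffer):
        started_moving = [o for o in static_objects if o.get("starts_moving")]
        # Rebuild the cached static-object image only if a newly moving object
        # was actually visible; otherwise the old buffer is still correct.
        if any(is_visible(o) for o in started_moving):
            remaining = [o for o in static_objects if not o.get("starts_moving")]
            static_buffer = render(remaining)             # expensive re-render
        return static_buffer + render(dynamic_objects), static_buffer

    statics = [{"name": "building"}, {"name": "tree", "starts_moving": True, "visible": False}]
    dynamics = [{"name": "car"}]
    frame, buf = render_frame(statics, dynamics, static_buffer=render(statics))
    print(frame)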

Keywords: Occlusion query, Real-time rendering, Temporal coherence.

71 The Use of Fractional Brownian Motion in the Generation of Bed Topography for Bodies of Water Coupled with the Lattice Boltzmann Method

Authors: Elysia Barker, Jian Guo Zhou, Ling Qian, Steve Decent

Abstract:

A method of modelling the topography used in the simulation of riverbeds is proposed in this paper, which removes the need for data points and measurements of a physical terrain. While complex scans of the contours of a surface can be achieved with other methods, these require specialised tools; the proposed method overcomes this by using fractional Brownian motion (FBM) as a basis to estimate the real surface within a 15% margin of error while attempting to optimise algorithmic efficiency. This removes the need for complex, expensive equipment and reduces the resources spent modelling bed topography. The method also accounts for the change in topography over time due to erosion, sediment transport, and other external factors that could affect the topography of the ground, by updating its parameters and generating a new bed. The lattice Boltzmann method (LBM) is used to simulate both stationary and steady flow cases in a side-by-side comparison over the bed topography generated by the proposed method and a test case taken from an external source. The method, if successful, will be incorporated into the current LBM program used in the testing phase, which will allow automatic generation of topography for a given situation in future research, removing the need for bed data to be specified.
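
One common way to generate a fractional-Brownian-motion-like bed profile is spectral synthesis, where Fourier amplitudes are scaled by a power law tied to the Hurst exponent; the short sketch below produces a 1D profile this way. The parameters are illustrative, and this need not match the generation scheme actually used in the paper.

    # Illustrative 1D fBm-style bed profile via spectral synthesis (parameters are arbitrary).
    import numpy as np

    def fbm_profile(n=256, hurst=0.7, seed=0):
        """Generate a 1D fractional-Brownian-motion-like profile of length n."""
        rng = np.random.default_rng(seed)
        freqs = np.fft.rfftfreq(n)
        amplitude = np.zeros_like(freqs)
        amplitude[1:] = freqs[1:] ** (-(hurst + 0.5))          # power-law spectrum
        phases = rng.uniform(0, 2 * np.pi, len(freqs))
        spectrum = amplitude * np.exp(1j * phases)
        profile = np.fft.irfft(spectrum, n)
        return profile / np.abs(profile).max()                 # normalise bed elevation

    bed = fbm_profile()
    print(bed[:5])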

Keywords: Bed topography, FBM, LBM, shallow water, simulations.

70 Green Lean TQM Practices in Malaysian Automotive Companies

Authors: Noor Azlina Mohd Salleh, Salmiah Kasolang, Ahmed Jaffar

Abstract:

The Green Lean Total Quality Management (TQM) System is a system comprising Environmental Management System (EMS) practices integrated into TQM with Lean Manufacturing (LM) principles. The ultimate goal of this system is to achieve total customer satisfaction and environmental care by removing the eight wastes present in any process in an organization. A survey questionnaire was developed, distributed to 30 highly active automotive vendors in Malaysia and analyzed with SPSS v.17. It was found that some vendors have been practicing TQM and LM while some have started to implement EMS. This study focuses only on highly active companies that have been involved in the MAJAICO Program and the Proton Vendor Development Program. This is the first study conducted to establish the current status of TQM, LM and EMS practices in highly active automotive companies in Malaysia. It was found that EMS has been practiced by 16 companies out of 30, and within these 16 companies the approach is more holistic and green. This is a preliminary study that combined the practices of four awards, ISO/TS16949, the Toyota Production System SAE J4000, the MAJAICO Lean Production System and EMS.

Keywords: Automotive Industry, Lean Manufacturing, Operational Engineering Management, Total Quality Management, Environmental Management System.

69 A Novel Method to Manufacture Superhydrophobic and Insulating Polyester Nanofibers via a Meso-Porous Aerogel Powder

Authors: Z. Mazrouei-Sebdani, A. Khoddami, H. Hadadzadeh, M. Zarrebini

Abstract:

In this research, waterglass-based aerogel powder was prepared by a sol–gel process and ambient pressure drying. Inspired by its limited dust release, the aerogel powder was introduced into the PET electrospinning solution in an attempt to create the bulk and surface structure required for the nanofibers to improve their hydrophobic and insulation properties. The samples were evaluated by measuring density, porosity, contact angle and heat transfer, and by FTIR, BET, and SEM. According to the results, a porous silica aerogel powder was fabricated with a mean pore diameter of 24 nm and a contact angle of 145.9º. The results indicated the usefulness of the aerogel powder confined in the nanofibers for controlling surface roughness to produce superhydrophobic nanowebs with a water contact angle of 147º. This can be attributed to a multi-scale surface roughness, created by the nanoweb structure itself and by the surface irregularity of the nanofibers in the presence of the aerogels, while a layer of fluorocarbon created low surface energy. The wettability of a solid substrate is an important property that is controlled by both the chemical composition and the geometry of the surface. Also, a decreasing trend in heat transfer was observed, from 22% for the nanofibers without any aerogel powder to 8% for the nanofibers with 4% aerogel powder. The development of thermal insulating materials has become increasingly important in view of fossil energy depletion and global warming, which call for more demanding energy-saving practices.

Keywords: Superhydrophobicity, Insulation, Sol-gel, Surface energy, Roughness.

68 Indoor and Outdoor Concentration of Particulate Matter at Domestic Homes

Authors: B. Karakas, S. Lakestani, C. Guler, B. Guciz Dogan, S. Acar Vaizoglu, A. Taner, B. Sekerel, R. Tıpırdamaz, G. Gullu

Abstract:

Particulate matter (PM) in ambient air is responsible for adverse health effects in adults and children, but relatively little is known about the concentrations, sources and health effects of PM in indoor air. A monitoring study consisting of three campaigns was conducted in Ankara in order to measure PM levels in indoor and outdoor environments and to identify and quantify associations between sources and concentrations. Approximately 82 homes (42 in the 1st campaign, 12 in the 2nd, and 28 in the 3rd), three rooms (living room, baby's room and living room used as a baby's room) and the outdoor ambient air at each home were sampled with a Grimm Environmental Dust Monitoring (EDM) 107 during different seasonal periods of 2011 and 2012. In this study, the relationship between indoor and outdoor PM levels was investigated for particulate matter smaller than 10 micrometers (µm) (PM10), smaller than 2.5 µm (PM2.5) and smaller than 1.0 µm (PM1). The mean concentrations of PM10, PM2.5, and PM1.0 in living rooms used as baby's rooms were higher than in living rooms and baby's rooms (bedrooms) in all three sampling campaigns. It is concluded that household activities and environmental conditions are very important for PM concentrations in indoor environments during the sampling periods. The number of smokers, and proximity to a main street and/or construction activities, increased the PM concentration. This study is based on assessing the relationship between indoor and outdoor PM levels and the household activities and environmental conditions.

Keywords: Indoor air quality, particulate matter (PM), PM10, PM2.5, PM1.0.

67 Adaptive Non-linear Filtering Technique for Image Restoration

Authors: S. K. Satpathy, S. Panda, K. K. Nagwanshi, S. K. Nayak, C. Ardil

Abstract:

Removing noise from processed images is very important, and noise should be removed in such a way that the important information in the image is preserved. A decision-based nonlinear algorithm for the elimination of band lines, drop lines, marks, band loss and impulses in images is presented in this paper. The algorithm performs two simultaneous operations, namely detection of corrupted pixels and evaluation of new pixels to replace the corrupted ones. Removal of these artifacts is achieved without damaging edges and details. However, the restricted window size renders the median operation less effective whenever noise is excessive; in that case the proposed algorithm automatically switches to mean filtering. The performance of the algorithm is analyzed in terms of Mean Square Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Signal-to-Noise Ratio Improvement (SNRI), Percentage of Noise Attenuated (PONA), and Percentage of Spoiled Pixels (POSP). This is compared with standard algorithms already in use, and the improved performance of the proposed algorithm is presented. The advantage of the proposed algorithm is that a single algorithm can replace several independent algorithms that would otherwise be required for the removal of different artifacts.

Keywords: Filtering, Decision Based Algorithm, noise, image restoration.

66 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory

Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi

Abstract:

One of the global combinatorial optimization problems in machine learning is feature selection, which is concerned with removing irrelevant, noisy, and redundant data while keeping the original meaning of the data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs), which combine the genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy controlled great deluge algorithm, to identify a good balance between local search and genetic search. In order to verify the proposed approaches, numerical experiments are carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other meta-heuristic approaches.

Keywords: Rough Set Theory, Attribute Reduction, Fuzzy Logic, Memetic Algorithms, Record to Record Algorithm, Great Deluge Algorithm.

65 Application of Mutual Information based Least dependent Component Analysis (MILCA) for Removal of Ocular Artifacts from Electroencephalogram

Authors: V. Krishnaveni, S. Jayaraman, K. Ramadoss

Abstract:

The electrical potentials generated during eye movements and blinks are one of the main sources of artifacts in electroencephalogram (EEG) recordings and can propagate widely across the scalp, masking and distorting brain signals. In recent times, signal separation algorithms have been used widely for removing artifacts from observed EEG data. In this paper, a recently introduced signal separation algorithm, Mutual Information based Least dependent Component Analysis (MILCA), is employed to separate ocular artifacts from EEG. The aim of MILCA is to minimize the Mutual Information (MI) between the independent components (estimated sources) under a pure rotation. The performance of this algorithm is compared with eleven popular algorithms (Infomax, Extended Infomax, Fast ICA, SOBI, TDSEP, JADE, OGWE, MS-ICA, SHIBBS, Kernel-ICA, and RADICAL) in terms of the actual independence and uniqueness of the estimated source components obtained for different sets of EEG data with ocular artifacts, using a reliable MI estimator. The results show that MILCA is best at separating the ocular artifacts from the EEG and is recommended for further analysis.
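
MILCA itself is not available in common Python packages, but the general ICA-based artifact-removal workflow the abstract describes (separate sources, drop the ocular component, re-mix) can be sketched with scikit-learn's FastICA as a stand-in; the synthetic signals and the rule for choosing which component to discard are purely illustrative.

    # ICA-based ocular-artifact removal sketch using FastICA as a stand-in for MILCA.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 2000)
    brain = np.sin(2 * np.pi * 10 * t)                                   # fake 10 Hz rhythm
    blink = (np.abs(np.sin(2 * np.pi * 0.2 * t)) > 0.99).astype(float)   # sparse blinks
    channels = np.c_[brain + 2 * blink, 0.5 * brain + 3 * blink]         # two mixed channels

    ica = FastICA(n_components=2, random_state=0)
    sources = ica.fit_transform(channels)                  # estimated source components

    # Identify the ocular component (here: the most kurtotic/spiky one) and zero it out.
    kurtosis = ((sources - sources.mean(0)) ** 4).mean(0) / sources.var(0) ** 2
    sources[:, np.argmax(kurtosis)] = 0.0
    cleaned = ica.inverse_transform(sources)               # re-mix without the artifact
    print(cleaned.shape)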

Keywords: Electroencephalogram, Ocular Artifacts (OA), Independent Component Analysis (ICA), Mutual Information (MI), Mutual Information based Least dependent Component Analysis (MILCA)

64 Investigation of the Effect of Impulse Voltage to Flashover by Using Water Jet

Authors: Harun Gülan, Muhsin Tunay Gencoglu, Mehmet Cebeci

Abstract:

The main function of the insulators used in high voltage (HV) transmission lines is to insulate the energized conductor from the pole and hence from the ground. However, when the insulators fail to perform this insulation function due to various effects, failures occur. The deterioration of the insulation results either from breakdown or from surface flashover. Surface flashover is caused by the layer of pollution that makes the surface of the insulator conductive, such as salt, carbonaceous compounds, rain, moisture, fog, dew, industrial pollution and desert dust. The source of the majority of failures and interruptions in HV lines is surface flashover, which threatens the continuity of supply and causes significant economic losses. Pollution flashover in HV insulators is still a serious problem that has not been fully resolved. In this study, a water jet test system was established in order to investigate the behavior of insulators under polluted conditions and to determine their flashover performance. The flashover behavior of the insulators is examined by applying impulse voltages in the test system. This study aims to investigate insulator behavior under high impulse voltages. For this purpose, a water jet test system was installed and experimental results were obtained on a real system and analyzed. By using the water jet test system instead of an actual insulator, damage to the insulator as a result of the flashover that would occur under impulse voltage was prevented. The results of the test system played an important role in determining the insulator behavior and provided predictability.

Keywords: Insulator, pollution flashover, high impulse voltage, water jet model.

63 Hydrogen Sulphide Removal Using a Novel Biofilter Media

Authors: Z. M. Shareefdeen, A. Aidan, W. Ahmed, M. B. Khatri, M. Islam, R. Lecheheb, F. Shams

Abstract:

Air emissions from waste treatment plants often consist of a combination of volatile organic compounds (VOCs) and odors. Hydrogen sulfide is one of the major odorous gases present in the emissions from municipal wastewater treatment facilities. Hydrogen sulfide (H2S) is odorous, highly toxic and flammable, and exposure to even low concentrations can result in eye irritation, a sore throat and cough, shortness of breath, and fluid in the lungs. Biofiltration has become a widely accepted technology for treating air streams containing H2S. Compared with non-biological technologies, a biofilter is more cost-effective for treating large volumes of air containing low concentrations of biodegradable compounds. Optimization of the biofilter media is essential for many reasons, such as providing a higher surface area for biofilm growth, a low pressure drop, physical stability, and good moisture retention. In this work, a novel biofilter media is developed and tested at a pumping station of a municipality located in the United Arab Emirates (UAE). The media is found to be very effective (>99%) in removing the H2S concentrations expected in pumping stations under steady state and shock loading conditions.

Keywords: biofilter media, hydrogen sulphide, pumping station, biofiltration
