Search results for: minimum root mean square (RMS) error matching algorithm
6063 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm
Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali
Abstract:
Optimization problems in water resources are complicated by the variety of decision-making criteria and objective functions; they are sometimes impossible to solve with conventional optimization methods, or doing so costs too much time or money. The use of modern tools and methods is therefore inevitable in resolving such problems. An accurate and practical operating policy has to be determined in order to use natural resources such as water reservoirs optimally. Water reservoir programming studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements; such studies also provide the dam's operating rule curve. The basic information applied in water reservoir programming studies generally includes meteorological, hydrological, agricultural and reservoir-related data, and the geometric characteristics of the reservoir. The Dez dam water resource system was simulated using this basic information in order to determine the capability of its reservoir to meet the objectives of the plan. A genetic algorithm, as a meta-heuristic method, was applied to derive operating rule curves (intersecting the reservoir volume), and the model was solved in MATLAB. Rule curves were first obtained through the genetic algorithm; the significance of using rule curves, and of the reduction in the number of decision variables, was then determined by simulating the system and comparing the results with the optimization results (Standard Operating Procedure). One of the most essential issues in optimizing a complicated water resource system is the growing number of variables: considerable time is required to find an optimal answer, and in some cases no desirable result is obtained. In this research, intersecting the reservoir volume has been applied as a modern device to reduce the number of variables.
Water reservoir programming studies were performed based on basic information, general hypotheses and standards, applying a monthly simulation technique over a 30-year statistical period. Results indicated that applying the rule curve prevents extreme shortages and decreases the monthly shortages.
Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir
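The rule-curve optimization described above can be sketched in miniature. The following is a hypothetical, simplified Python sketch (not the study's MATLAB model): a genetic algorithm searches for twelve monthly storage targets that minimize total supply shortage in a toy one-reservoir simulation; the inflows, demands, capacity and GA parameters are all invented for illustration.

```python
import random

random.seed(42)

MONTHS = 12
CAPACITY = 100.0
# Hypothetical monthly inflows and demands (not from the study).
INFLOW = [30, 40, 55, 60, 45, 25, 10, 5, 8, 15, 20, 25]
DEMAND = [20, 20, 25, 30, 35, 40, 45, 45, 35, 25, 20, 20]

def simulate_shortage(rule_curve):
    """Simulate one year: release toward demand but keep storage above the rule curve."""
    storage, shortage = CAPACITY / 2, 0.0
    for m in range(MONTHS):
        storage = min(storage + INFLOW[m], CAPACITY)   # receive inflow, spill at capacity
        release = min(DEMAND[m], max(storage - rule_curve[m], 0.0))
        shortage += DEMAND[m] - release                # unmet demand accumulates
        storage -= release
    return shortage

def genetic_algorithm(pop_size=40, generations=200, mut=0.2):
    """Elitist GA over 12-month rule curves, minimizing total shortage."""
    pop = [[random.uniform(0, CAPACITY) for _ in range(MONTHS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate_shortage)                # best (lowest shortage) first
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, MONTHS)          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut:                  # mutate one month's target
                child[random.randrange(MONTHS)] = random.uniform(0, CAPACITY)
            children.append(child)
        pop = survivors + children
    return min(pop, key=simulate_shortage)

best = genetic_algorithm()
print(round(simulate_shortage(best), 1))
```

The elitist selection (keeping the best half each generation) guarantees the best candidate is never lost, mirroring how the study iterates toward a rule curve with fewer decision variables than a full month-by-month release schedule.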
Procedia PDF Downloads 265
6062 Application of an Optical Method for the Calculation of Deformed Object Samples
Authors: R. Daira
Abstract:
The electronic speckle interferometry technique used to measure the deformation of scattering objects is based on the subtraction of interference patterns. A speckle image is first recorded in the RAM of a computer before deformation of the object, and a second image is recorded after deformation. The square of the difference between the two images shows correlation fringes, observable in real time directly on the monitor, and the interpretation of these fringes determines the deformation. In this paper, we present experimental results for out-of-plane deformation of samples of aluminum, electronic boards and stainless steel.
Keywords: optical method, holography, interferometry, deformation
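The subtraction principle can be illustrated numerically. Below is a hedged Python sketch with a synthetic speckle field: the deformation is modelled, purely for illustration, as a cosine modulation of the recorded intensity, and squaring the difference of the two exposures yields a fringe profile whose dark bands mark loci of zero modelled displacement.

```python
import math, random

random.seed(0)
H = W = 64

# Synthetic speckle pattern recorded before deformation.
before = [[random.random() for _ in range(W)] for _ in range(H)]

# Hypothetical deformation: modelled here simply as a cosine modulation of the
# recorded intensity, giving one fringe every 16 pixels (illustration only).
after = [[before[y][x] * (0.5 + 0.5 * math.cos(2 * math.pi * x / 16.0))
          for x in range(W)] for y in range(H)]

# Correlation fringes: the square of the difference between the two exposures.
fringes = [[(after[y][x] - before[y][x]) ** 2 for x in range(W)] for y in range(H)]

# Column-averaged fringe profile; dark bands (minima) appear where the two
# exposures still correlate, i.e. where the modelled displacement is zero.
profile = [sum(fringes[y][x] for y in range(H)) / H for x in range(W)]
print(profile.index(min(profile)))  # prints 0: the first dark fringe column
```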
Procedia PDF Downloads 404
6061 A Post-Occupancy Evaluation of the Impact of Indoor Environmental Quality on Health and Well-Being in Office Buildings
Authors: Suyeon Bae, Abimbola Asojo, Denise Guerin, Caren Martin
Abstract:
Post-occupancy evaluations (POEs) have been recognized for documenting occupant well-being and responses to indoor environmental quality (IEQ) factors such as thermal, lighting, and acoustic conditions. The Sustainable Post-Occupancy Evaluation Survey (SPOES), developed by an interdisciplinary team at a Midwest university, provides an evidence-based quantitative analysis of occupants' satisfaction in office, classroom, and residential spaces, helping to direct attention to successful areas and to areas that need improvement in buildings. SPOES is a self-administered, Internet-based questionnaire completed by building occupants. In this study, employees in three office buildings rated their satisfaction with 12 IEQ criteria: thermal conditions, indoor air quality, acoustic quality, daylighting, electric lighting, privacy, view conditions, furnishings, appearance, cleaning and maintenance, vibration and movement, and technology. Employees rated their level of satisfaction on a Likert-type scale from 1 (very dissatisfied) to 7 (very satisfied). They also rated the influence of their physical environment on their perception of their work performance, and the impact of their primary workspace on their health, on a scale from 1 (hinders) to 7 (enhances). Building A is a three-story, 55,000-square-foot building that includes private and group offices, classrooms, and conference rooms (N=75). Building B, a six-story, 14,193-square-foot building, consists of private offices, shared enclosed offices, workstations, and open desk areas for employees (N=75). Building C is a three-story, 56,000-square-foot building that includes classrooms, therapy rooms, an outdoor playground, a gym, restrooms, and training rooms for clinicians (N=76).
The results indicated that for Building A, 10 IEQs (all except acoustic quality and privacy) showed statistically significant correlations with the impact of the primary workspace on health. In Building B, 11 IEQs (all except technology) showed statistically significant correlations with the impact of the primary workspace on health. Building C had statistically significant correlations between all 12 IEQs and the employees' perception of the impact of their primary workspace on their health in two-tailed correlations (p ≤ 0.05). Of the 33 statistically significant correlations, 25 (76%) showed at least a moderate relationship (r ≥ 0.35). Across the three buildings, the daylighting, furnishings, and indoor air quality IEQs ranked highest in their impact on health. The vibration and movement, view condition, and electric lighting IEQs ranked second, followed by cleaning and maintenance and appearance. These results imply that the 12 IEQs developed in SPOES are highly related to employees' perception of how their primary workplaces affect their health, and they offer an opportunity for improving occupants' well-being and the built environment.
Keywords: post-occupancy evaluation, built environment, sustainability, well-being, indoor air quality
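The correlation analysis behind these thresholds is standard. As a small illustration (with invented 1-7 Likert ratings, not the study's data), the Pearson coefficient and the study's r ≥ 0.35 "moderate relationship" cutoff can be computed as follows:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical ratings: daylighting satisfaction vs. perceived health impact.
daylight = [6, 5, 7, 4, 6, 3, 5, 7, 2, 6]
health   = [5, 5, 7, 4, 6, 3, 4, 6, 3, 6]
r = pearson_r(daylight, health)
print(round(r, 2))  # prints 0.93, well above the r >= 0.35 "moderate" threshold
```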
Procedia PDF Downloads 289
6060 Spatio-Temporal Variability and Trends in Frost-Free Season Parameters in Finland: Influence of Climate Teleconnections
Authors: Masoud Irannezhad, Sirpa Rasmus, Saghar Ahmadian, Deliang Chen, Bjorn Klove
Abstract:
Variability and changes in thermal conditions play a crucial role in the functioning of human society, particularly in cold-climate regions like Finland. Accordingly, the frost-free season (FFS) parameters of start (FFSS), end (FFSE) and length (FFSL) have substantial effects not only on the natural environment (e.g. flora and fauna), but also on human activities (e.g. agriculture, forestry and energy generation). Applying a 0°C threshold of minimum temperature (Tmin), the FFS was defined as the period between the last spring frost (FFSS) and the first fall frost (FFSE). For this study, gridded (10 x 10 km2) daily minimum temperature datasets covering Finland during 1961-2011 were used to investigate recent spatio-temporal variations and trends in FFS parameters and their relationships with well-known large-scale climate teleconnections (CTs). The FFS in Finland naturally lengthens from north (~60 days) to south (~190 days), in association with earlier FFSS (~24 April) and later FFSE (~30 October). Statistically significant (p<0.05) trends in FFSL were all positive (increasing), ranged between 0 and 13.5 days/decade, and were mainly observed in the east, upper west, centre and upper north of Finland. Such lengthening trends in FFS were attributable to both earlier FFSS and later FFSE mostly over central and upper northern Finland, but only to later FFSE in eastern and upper western parts. Variations in both FFSL and FFSS were significantly associated with the Polar (POL) pattern over northern Finland, and with the East Atlantic (EA) pattern over eastern and upper western areas. The POL and Scandinavia (SCA) patterns were the most influential CTs for FFSE variability over northern Finland.
Keywords: climate teleconnections, Finland, frost-free season, trend analysis
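The FFS definition above (last spring frost to first autumn frost at a 0°C Tmin threshold) is easy to operationalise. The sketch below uses an invented daily Tmin series rather than the study's gridded data, chosen so that it reproduces the southern-Finland figures quoted in the abstract (start ~24 April, end ~30 October, length ~190 days):

```python
from datetime import date, timedelta

def frost_free_season(tmin_by_day):
    """Return (FFS start, FFS end, FFS length in days) from a dict mapping
    each date of one year to its daily minimum temperature in deg C.

    FFSS = day after the last spring frost (Tmin < 0 in Jan-Jun);
    FFSE = day before the first autumn frost (Tmin < 0 in Jul-Dec)."""
    days = sorted(tmin_by_day)
    spring = [d for d in days if d.month <= 6 and tmin_by_day[d] < 0.0]
    autumn = [d for d in days if d.month >= 7 and tmin_by_day[d] < 0.0]
    start = spring[-1] + timedelta(days=1) if spring else days[0]
    end = autumn[0] - timedelta(days=1) if autumn else days[-1]
    return start, end, (end - start).days + 1

# Invented Tmin series: frost every day before 24 April and from 31 October on.
series = {}
d = date(2000, 1, 1)
while d.year == 2000:
    frost = d < date(2000, 4, 24) or d >= date(2000, 10, 31)
    series[d] = -3.0 if frost else 5.0
    d += timedelta(days=1)

start, end, length = frost_free_season(series)
print(start, end, length)  # prints 2000-04-24 2000-10-30 190
```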
Procedia PDF Downloads 203
6059 Maximization of Lifetime for Wireless Sensor Networks Based on Energy Efficient Clustering Algorithm
Authors: Frodouard Minani
Abstract:
Over the last decade, wireless sensor networks (WSNs) have been used in many areas such as health care, agriculture, defense, disaster-hit areas and so on. A WSN consists of a Base Station (BS) and a number of wireless sensors that monitor temperature, pressure and motion under different environmental conditions. The key parameter in designing a protocol for WSNs is energy efficiency: energy is the scarcest resource of sensor nodes, and it determines their lifetime. Maximizing sensor node lifetime is therefore an important issue in the design of applications and protocols for WSNs, and clustering the sensor nodes is an effective topology-control approach for achieving this goal. In this paper, the researcher presents an energy-efficient protocol to prolong the network lifetime based on an energy-efficient clustering algorithm. The Low Energy Adaptive Clustering Hierarchy (LEACH) is a cluster-based routing protocol used to lower energy consumption and improve the lifetime of WSNs. Minimizing energy dissipation and maximizing network lifetime are central concerns in the design of applications and protocols for WSNs. The proposed scheme maximizes network lifetime by choosing the farthest cluster head (CH) instead of the closest CH, and by forming clusters according to metrics such as node density, residual energy and inter-cluster distance. Comparisons between the proposed protocol and comparative protocols in different scenarios have been made, and the simulation results showed that the proposed protocol outperforms the comparative protocols in various scenarios.
Keywords: base station, clustering algorithm, energy efficient, sensors, wireless sensor networks
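The cluster-head selection rule can be sketched as a weighted score over the three metrics named above. This is a hypothetical illustration, with invented weights, field size and sensing radius, not the paper's exact formulation:

```python
import math, random

random.seed(1)

class Node:
    def __init__(self, x, y, energy):
        self.x, self.y, self.energy = x, y, energy

def dist(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

def density(node, nodes, radius=25.0):
    """Number of neighbours within `radius` (a hypothetical sensing range)."""
    return sum(1 for n in nodes if n is not node and dist(n, node) <= radius)

def choose_cluster_head(nodes, base_station, w=(0.5, 0.3, 0.2)):
    """Score each node on residual energy, local density and distance to the BS;
    the paper's proposal prefers the farthest suitable CH, so distance adds to
    the score rather than subtracting.  Weights `w` are invented."""
    def score(n):
        return (w[0] * n.energy / max(m.energy for m in nodes)
                + w[1] * density(n, nodes) / max(1, len(nodes) - 1)
                + w[2] * dist(n, base_station) / max(dist(m, base_station) for m in nodes))
    return max(nodes, key=score)

bs = Node(0.0, 0.0, float("inf"))
field = [Node(random.uniform(0, 100), random.uniform(0, 100), random.uniform(0.2, 1.0))
         for _ in range(30)]
ch = choose_cluster_head(field, bs)
print(round(ch.energy, 2), round(dist(ch, bs), 1))
```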
Procedia PDF Downloads 144
6058 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features
Authors: Bushra Zafar, Usman Qamar
Abstract:
Large sample sizes and high dimensionality undermine the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for extracting knowledge from a variety of databases; they provide supervised learning in the form of classification, designing models that describe vital data classes, where the structure of the classifier is based on the class attribute. Classification efficiency and accuracy are often greatly influenced by noisy and undesirable features in real application datasets, and the inherent nature of a dataset masks its quality and leaves few practical approaches for analysis. To our knowledge, we present for the first time an approach for investigating the structure and quality of datasets by providing a targeted analysis that localizes their noisy and irrelevant features. Machine learning relies heavily on feature selection as a pre-processing step, which selects a subset of the available features, reducing the search space according to a given evaluation criterion. The primary objective of this study is to trim down the scope of a given data sample by searching for a small set of important features that may yield good classification performance. For this purpose, a heuristic for wrapper-based feature selection uses a genetic algorithm, with an external classifier for discriminative feature selection; features are selected according to their number of occurrences in the chosen chromosomes. A sample dataset has been used to demonstrate the proposed idea, and the proposed method improved the average accuracy across different datasets to about 95%. Experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
Keywords: data mining, genetic algorithm, KNN algorithms, wrapper-based feature selection
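A minimal sketch of the occurrence-counting idea follows. The fitness function here is a stand-in for the external classifier's accuracy (the paper uses an external classifier such as KNN); the informative-feature set, occurrence threshold and GA parameters are all invented for illustration:

```python
import random

random.seed(7)

N_FEATURES = 10
# Hypothetical ground truth: features 1, 3 and 6 are the "informative" ones.
INFORMATIVE = {1, 3, 6}

def fitness(chromosome):
    """Stand-in for wrapper fitness (external-classifier accuracy):
    reward informative bits, lightly penalise every selected feature."""
    hits = sum(1 for i, bit in enumerate(chromosome) if bit and i in INFORMATIVE)
    return hits - 0.05 * sum(chromosome)

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)      # one-point crossover
    return a[:cut] + b[cut:]

def mutate(c, rate=0.1):
    return [1 - bit if random.random() < rate else bit for bit in c]

def evolve(pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]            # elitist selection
        pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                       for _ in range(pop_size - len(elite))]
    return sorted(pop, key=fitness, reverse=True)

# Select features by how often they occur in the best chromosomes.
best = evolve()[:10]
counts = [sum(c[i] for c in best) for i in range(N_FEATURES)]
selected = [i for i, n in enumerate(counts) if n >= 8]   # occurrence threshold
print(selected)
```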
Procedia PDF Downloads 316
6057 The Comparison of Dismount Skill between National and International Men's Artistic Gymnastics in Parallel Bars Apparatus
Authors: Chen ChihYu, Tang Wen Tzu, Chen Kuang Hui
Abstract:
Aim — To compare the dismount skills of Taiwanese and elite international gymnasts on parallel bars under the 2017-2020 Code of Points. Methods — The gymnasts who advanced to the parallel bars event finals of four competitions (the World Championships, the Universiade, the National Games of Taiwan, and the National Intercollegiate Athletic Games of Taiwan) in both 2017 and 2019 were selected for this study. The dismount skill on parallel bars was analyzed, and the average difficulty scores were compared by one-way ANOVA. Descriptive statistics were applied to present the type and difficulty of each gymnast's dismount in these four competitions. The data from the World Championships and the Universiade were combined as the international group (INT), and the data from the Taiwanese National Games and National Intercollegiate Athletic Games were combined as the national group (NAT). The differences between INT and NAT were analyzed by the chi-square test. Statistical significance was set at α = 0.05. Results — i) One-way ANOVA showed a significant difference in the mean parallel bars dismount score across the four competitions: the dismount scores at the World Championships and the Universiade were significantly higher than at the Taiwanese National Games and the National Intercollegiate Athletic Games (0.58±0.08 and 0.56±0.08 vs. 0.42±0.06 and 0.40±0.06, p < 0.05). ii) Most gymnasts at the World Championships and the Universiade selected a 0.6-point skill as the parallel bars dismount element, whereas at the Taiwanese National Games and the National Intercollegiate Athletic Games most gymnasts performed a 0.4-point dismount skill. iii) The chi-square test showed a significant difference in the selection of parallel bars dismount skills: the INT group used E or E+ difficulty elements as the dismount, while the NAT group selected D or D- difficulty elements.
Conclusion — The level of parallel bars dismounts among Taiwanese gymnasts is inferior to that of elite international gymnasts. It is suggested that Taiwanese gymnasts practice the F-difficulty dismount (double salto forward tucked with half twist) in the future.
Keywords: Artistic Gymnastics World Championships, dismount, difficulty score, element
Procedia PDF Downloads 142
6056 Enhancing Precision in Abdominal External Beam Radiation Therapy: Exhale Breath Hold Technique for Respiratory Motion Management
Authors: Stephanie P. Nigro
Abstract:
The Exhale Breath Hold (EBH) technique presents a promising approach to enhancing the precision and efficacy of External Beam Radiation Therapy (EBRT) for abdominal tumours, including those of the liver, pancreas, kidney, and adrenal glands. These tumours are challenging to treat because of their proximity to organs at risk (OARs) and the significant motion induced by respiration and physiological variations such as stomach filling. Respiratory motion can cause up to 40 mm of displacement in abdominal organs, complicating accurate targeting. While current practices such as fasting help reduce motion related to digestive processes, they do not address respiratory motion. 4DCT scans are used to assess this motion, but they require extensive workflow time and expose patients to higher doses of radiation. The EBH technique, which involves holding the breath at end-exhale with minimal air in the lungs, stabilizes internal organ motion, thereby reducing respiratory-induced motion. The primary benefit of EBH is the reduction in treatment volume sizes, specifically the Internal Target Volume (ITV) and Planning Target Volume (PTV), as demonstrated by smaller ITVs when gated in EBH. This reduction also improves the quality of 3D Cone Beam CT (CBCT) images by minimizing respiratory artifacts, facilitating soft-tissue matching akin to stereotactic treatments. Patients suitable for EBH must meet criteria including the ability to hold their breath for at least 15 seconds and to maintain a consistent breathing pattern; for those who do not qualify, the traditional 4DCT protocol will be used. Implementation involves an EBH planning scan and additional short EBH scans to ensure reproducibility and assist in contouring and volume expansions, with a Free Breathing (FB) scan used for setup purposes. Treatment planning on EBH scans leads to smaller PTVs, though intrafractional and interfractional breath-hold variations must be accounted for in the margins.
The treatment decision process includes performing CBCT in EBH intervals, with careful matching and adjustment based on soft tissue and fiducial markers. Initial studies at two sites will evaluate the necessity of multiple CBCTs, assessing shifts and the benefits of initial versus mid-treatment CBCT. Considerations for successful implementation include thorough patient coaching, staff training, and verification of breath holds, despite potential disadvantages such as longer treatment times and patient exhaustion. Overall, the EBH technique offers significant improvements in the accuracy and quality of abdominal EBRT, paving the way for more effective and safer treatments for patients.
Keywords: abdominal cancers, exhale breath hold, radiation therapy, respiratory motion
Procedia PDF Downloads 27
6055 External Validation of Established Pre-Operative Scoring Systems in Predicting Response to Microvascular Decompression for Trigeminal Neuralgia
Authors: Kantha Siddhanth Gujjari, Shaani Singhal, Robert Andrew Danks, Adrian Praeger
Abstract:
Background: Trigeminal neuralgia (TN) is a heterogeneous pain syndrome characterised by short paroxysms of lancinating facial pain in the distribution of the trigeminal nerve, often triggered by usually innocuous stimuli. TN has a low prevalence of less than 0.1%, of which 80% to 90% is caused by compression of the trigeminal nerve by an adjacent artery or vein. The root entry zone of the trigeminal nerve is most sensitive to neurovascular conflict (NVC), which causes dysmyelination. Whilst microvascular decompression (MVD) is an effective treatment for TN with NVC, not all patients achieve long-term pain relief. Pre-operative scoring systems by Panczykowski and Hardaway have been proposed but have not been externally validated. These pre-operative scoring systems are composite scores calculated according to the subtype of TN, the presence and degree of neurovascular conflict, and the response to medical treatments. There is discordance between neurosurgeons and radiologists in the assessment of NVC identified on pre-operative magnetic resonance imaging (MRI). To the best of our knowledge, the prognostic impact for MVD of this difference of interpretation has not previously been investigated in the form of a composite scoring system such as those suggested by Panczykowski and Hardaway. Aims: This study aims to identify prognostic factors and externally validate the scoring systems proposed by Panczykowski and Hardaway for TN. A secondary aim is to investigate the prognostic difference between a neurosurgeon's interpretation of NVC on MRI and a radiologist's. Methods: This retrospective cohort study included 95 patients who underwent de novo MVD in a single neurosurgical unit in Melbourne. Data were recorded from patients' hospital records and neurosurgeons' correspondence from perioperative clinic reviews.
Patient demographics, type of TN, distribution of TN, response to carbamazepine, and the neurosurgeon's and radiologist's interpretations of NVC on MRI were clearly described prospectively and preoperatively in the correspondence. The scoring systems published by Panczykowski et al. and Hardaway et al. were used to determine composite scores, which were compared with the recurrence of TN recorded during follow-up over one year. Categorical data were analysed using Pearson chi-square testing; independent numerical and nominal data were analysed with logistic regression. Results: Logistic regression showed that a Panczykowski composite score of greater than 3 points was associated with a higher likelihood of a pain-free outcome 1 year post-MVD, with an OR of 1.81 (95% CI 1.41-2.61, p=0.032). The composite score using the neurosurgeon's impression of NVC had an OR of 2.96 (95% CI 2.28-3.31, p=0.048). A Hardaway composite score of greater than 2 points was associated with a higher likelihood of a pain-free outcome 1 year post-MVD, with an OR of 3.41 (95% CI 2.58-4.37, p=0.028). The composite score using the neurosurgeon's impression of NVC had an OR of 3.96 (95% CI 3.01-4.65, p=0.042). Conclusion: The composite scores developed by Panczykowski and Hardaway were validated for predicting the response to MVD in TN. A composite score based on the neurosurgeon's interpretation of NVC on MRI had a greater correlation with pain-free outcomes 1 year post-MVD than one based on the radiologist's.
Keywords: de novo microvascular decompression, neurovascular conflict, prognosis, trigeminal neuralgia
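For readers unfamiliar with how such odds ratios arise, the OR and its Wald confidence interval come directly from the logistic-regression coefficient. The snippet below uses an invented coefficient and standard error chosen to land near the study's Panczykowski point estimate; it is illustrative, not the study's actual fit:

```python
import math

def odds_ratio(beta, se, z=1.96):
    """Odds ratio and 95% confidence interval from a logistic-regression
    coefficient `beta` with standard error `se` (z = 1.96 for a 95% CI)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for "composite score > 3" (not the study's raw data).
or_, lo, hi = odds_ratio(beta=0.593, se=0.155)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # prints 1.81 1.34 2.45
```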
Procedia PDF Downloads 74
6054 Oxidative Stability of Methyl and Ethyl Microalgae Biodiesel with Synthetic Antioxidants
Authors: Willian L. G. Silva, Fabio R. M. Batista, Matthieu Tubino
Abstract:
Microalgae can be considered a potential source of oil for biodiesel synthesis, since these microorganisms grow rapidly in either fresh or salty water and do not compete with food production. Brazil's great abundance of water offers favorable conditions for this type of culture, and another very positive aspect is its ability to fix atmospheric CO2, contributing to the reduction of greenhouse gases and their effects on global warming. Despite these environmental advantages, biodiesel degrades over time, with changes in its physical and chemical properties. In this work, the oxidative stability of methyl and ethyl microalgae biodiesel was studied in the absence and presence of synthetic antioxidants. The synthetic antioxidants used were propyl gallate (PG) and tert-butylhydroquinone (TBHQ), at a concentration of 0.12% (w/w). The biodiesel mixture was kept in a sealed glass flask, sheltered from light, at room temperature (about 25 ºC) for 180 days. During this period, aliquots of the biodiesel were subjected to induced degradation by the Rancimat method, which determines an important quality parameter, provided in the current standards, and is used to monitor the degradation processes that occur in biodiesel over time. The induction period (IP) expresses the biodiesel's oxidative stability; the minimum accepted IP value for biodiesel is established as 8 hours. The results show that the ethyl biodiesel increased its IP value from 7.6 hours to 31 hours with PG, and to 67 hours with TBHQ, exceeding the minimum accepted IP value. When the antioxidants were added to the methyl biodiesel samples, the IP rose to 28 hours with PG and to 62 hours with TBHQ. These values were maintained throughout the entire period of study (180 days). On the other hand, the biodiesel samples without additives maintained an IP above the allowed value for only 30 days.
Therefore, in order to preserve microalgae biodiesel for longer periods of time, it is necessary to add antioxidants to both derivatives, i.e., the ethyl and methyl esters.
Keywords: biodiesel, microalgae, oxidative stability, storage, synthetic antioxidants
Procedia PDF Downloads 462
6053 High Level Synthesis of Canny Edge Detection Algorithm on Zynq Platform
Authors: Hanaa M. Abdelgawad, Mona Safar, Ayman M. Wahba
Abstract:
Real-time image and video processing is in demand in many computer vision applications, e.g. video surveillance, traffic management and medical imaging. These video applications require high computational power, so the optimal solution is the collaboration of a CPU with hardware accelerators. In this paper, a Canny edge detection hardware accelerator is proposed. Canny edge detection is one of the common blocks in the pre-processing phase of image and video processing pipelines. Our approach offloads the Canny edge detection algorithm from the processing system (PS) to the programmable logic (PL), taking advantage of the High Level Synthesis (HLS) tool flow to accelerate the implementation on the Zynq platform. The resulting implementation enables up to a 100x performance improvement through hardware acceleration: CPU utilization drops, and the frame rate reaches 60 fps on a 1080p full-HD input video stream.
Keywords: high level synthesis, Canny edge detection, hardware accelerators, computer vision
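As a software reference for the kind of kernel the PL offloads, the gradient-magnitude stage of Canny (Sobel filtering) can be modelled in a few lines of Python. This is a behavioural sketch for checking a hardware pipeline against, not the paper's HLS code:

```python
def sobel_gradient(img):
    """Gradient-magnitude stage of Canny on a 2-D grayscale list-of-lists
    (a software reference model; the paper's version runs in FPGA logic)."""
    h, w = len(img), len(img[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # Sobel horizontal kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # Sobel vertical kernel
    mag = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1] for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1] for j in range(3) for i in range(3))
            mag[y][x] = (gx * gx + gy * gy) ** 0.5
    return mag

# A vertical step edge: left half dark, right half bright.
img = [[0] * 4 + [255] * 4 for _ in range(8)]
mag = sobel_gradient(img)
edge_cols = {x for y in range(1, 7) for x in range(1, 7) if mag[y][x] > 0}
print(sorted(edge_cols))  # prints [3, 4]: the gradient responds only at the step
```

The full Canny pipeline would follow this stage with non-maximum suppression and hysteresis thresholding, which map naturally onto further pipelined PL stages.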
Procedia PDF Downloads 478
6052 The Effect of Extracts of 12 Local Medicinal Plants Against Uropathogenic Escherichia Coli
Authors: Hafida Merzouk
Abstract:
Urinary tract infections are among the most serious public health issues in all age groups. Empirical therapy should be based on local levels of resistance, as indicated in several studies from different countries, to effectively avoid the emergence of multidrug-resistant bacterial strains and recurrent infections. Numerous effective antibiotic treatments are available but can be ineffective for treating recurrent cystitis caused by a urinary tract infection, given the emergence of drug resistance. The aim of this study was therefore to assess the antibacterial and antioxidant activity of 11 medicinal plants used traditionally in Algeria against E. coli, the pathogen most responsible for urinary tract infections. First, extraction of total polyphenols with aqueous acetone showed variable yields. The highest yield was obtained from Asplenium trichomanes with 27%, followed by Petroselinum crispum and Cinnamomum cassia with an equal yield of 21%; Artemisia herba-alba gave the lowest yield (9%). The extracts of the different plants showed variable contents of phenolic compounds. Reducing power and DPPH (2,2-diphenyl-1-picrylhydrazyl) scavenging assays revealed that most of the extracts studied had significant activity. The anti-free-radical activity was very high in the extract of Asplenium adiantum-nigrum compared with the other extracts, while Petroselinum crispum and Parietaria officinalis had the lowest reducing activity. Antibacterial activity was determined on E. coli strains using the diffusion, MIC (Minimum Inhibitory Concentration) and MBC (Minimum Bactericidal Concentration) methods. The strains tested were sensitive to most extracts studied, except the Asplenium adiantum-nigrum extract, to which both strains showed resistance.
Keywords: E. coli, medicinal plants, phenolic compounds, urinary infections
Procedia PDF Downloads 64
6051 Analysing the Mesoscale Variations of 7Be and 210Pb Concentrations in a Complex Orography, Guadalquivir Valley, Southern Spain
Authors: M. A. Hernández-Ceballos, E. G. San Miguel, C. Galán, J. P. Bolívar
Abstract:
The evolution of 7Be and 210Pb activity concentrations in surface air along the Guadalquivir valley (southern Iberian Peninsula) is presented in this study. Samples collected for 48 h every fifteen days, from September 2012 to November 2013, at two sampling sites (Huelva city at the mouth of the valley and Cordoba city in its middle reaches, 250 km away) are used to 1) analyse the spatial variability and 2) understand the influence of wind conditions on 7Be and 210Pb. Similar average concentrations were registered along the valley. The mean 7Be activity concentration was 4.46 ± 0.21 mBq/m3 at Huelva and 4.33 ± 0.20 mBq/m3 at Cordoba, although higher maximum and minimum values were registered at Cordoba (9.44 mBq/m3 and 1.80 mBq/m3) than at Huelva (7.95 mBq/m3 and 1.04 mBq/m3). No significant differences were observed in the 210Pb mean activity concentrations between Cordoba (0.40 ± 0.04 mBq/m3) and Huelva (0.35 ± 0.04 mBq/m3), although the maximum (1.10 mBq/m3 and 0.87 mBq/m3) and minimum (0.02 mBq/m3 and 0.04 mBq/m3) values were recorded at Cordoba. Although similar average concentrations were obtained at both sites, the temporal evolution of the two natural radionuclides differs between them. A meteorological analysis of two sampling periods in which large differences in 7Be and 210Pb concentrations were observed indicates the differing impact of surface and upper wind dynamics. The analysis reveals the different impact of the two sea-land breeze patterns usually observed along the valley (pure and non-pure) and of the air masses at higher layers associated with each one. The pure pattern, with short development (around 30 km inland) and an increasing accumulation process, favours high concentrations of both radionuclides at Huelva (the coastal site), while the non-pure pattern, with winds sweeping the valley all the way to Cordoba (250 km away), causes high activity values at that site.
These results reveal the impact of mesoscale conditions on these two natural radionuclides, and the importance of these circulations for their spatial and temporal variability.
Keywords: 7Be, 210Pb, air masses, mesoscale process
Procedia PDF Downloads 409
6050 Resistivity Tomography Optimization Based on Parallel Electrode Linear Back Projection Algorithm
Authors: Yiwei Huang, Chunyu Zhao, Jingjing Ding
Abstract:
Electrical Resistivity Tomography has been widely used in medicine and geology, for example in imaging lung impedance and analysing soil impedance. Linear Back Projection is the core algorithm of Electrical Resistivity Tomography, but traditional Linear Back Projection cannot make full use of the information in the electric field. In this paper, a Parallel Electrode Linear Back Projection imaging method for Electrical Resistivity Tomography is proposed. By changing the connection mode of the electrodes, it generates an electric field distribution that is not linearly related to that of traditional Linear Back Projection, captures new information, and improves imaging accuracy without increasing the number of electrodes. The simulation results show that the accuracy of the image obtained by the inverse operation of Parallel Electrode Linear Back Projection can be improved by about 20%.
Keywords: electrical resistivity tomography, finite element simulation, image optimization, parallel electrode linear back projection
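The back-projection step itself is compact enough to sketch. The following Python toy (an invented 2x2 pixel grid and three hypothetical measurements, not the paper's parallel-electrode drive pattern) shows how relative boundary-measurement changes are smeared back over the pixels through a sensitivity map:

```python
def linear_back_projection(sensitivity, measured, reference):
    """Linear Back Projection: distribute each measurement's relative change
    over the pixels, weighted by the sensitivity map, then normalize."""
    n_pix = len(sensitivity[0])
    # Relative change of each boundary measurement against the homogeneous field.
    lam = [(m - r) / r for m, r in zip(measured, reference)]
    raw = [sum(lam[k] * sensitivity[k][p] for k in range(len(lam))) for p in range(n_pix)]
    norm = [sum(sensitivity[k][p] for k in range(len(lam))) for p in range(n_pix)]
    return [raw[p] / norm[p] if norm[p] else 0.0 for p in range(n_pix)]

# Toy 2x2 pixel field with 3 hypothetical measurements (not a real electrode layout).
S = [[0.6, 0.2, 0.1, 0.1],     # sensitivity of measurement k to pixel p
     [0.1, 0.6, 0.1, 0.2],
     [0.2, 0.1, 0.6, 0.1]]
reference = [1.0, 1.0, 1.0]    # homogeneous (empty-field) measurements
measured = [1.5, 1.0, 1.0]     # only measurement 0 changed: anomaly near pixel 0
image = linear_back_projection(S, measured, reference)
print(image.index(max(image)))  # prints 0: the anomaly back-projects onto pixel 0
```

The parallel-electrode variant changes how the sensitivity matrix S is generated (via the altered electrode connection mode), while this projection step stays the same.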
Procedia PDF Downloads 153
6049 Risk Assessment of Natural Gas Pipelines in Coal Mined Gobs Based on Bow-Tie Model and Cloud Inference
Authors: Xiaobin Liang, Wei Liang, Laibin Zhang, Xiaoyan Guo
Abstract:
Pipelines inevitably pass through coal mined gobs in mining areas, and the stability of these gobs has a great influence on pipeline safety. After an extensive literature study and field research, it was found that there are few risk assessment methods for coal mined gob pipelines and a lack of data on gob sites. Therefore, the fuzzy comprehensive evaluation method, based on expert opinions, is widely used. However, the subjective opinions or lack of experience of individual experts may lead to inaccurate evaluation results, so the accuracy of the results needs to be further improved. This paper presents a comprehensive approach to this purpose by combining the bow-tie model and cloud inference. The evaluation process is as follows. First, a bow-tie model composed of a fault tree and an event tree is established to graphically illustrate the probability and consequence indicators of pipeline failure. Second, experts score the indicators in the form of intervals, which improves the accuracy of the results, and the censored mean algorithm removes the maximum and minimum scores to improve the stability of the results; the golden section method is used to determine the weight of the indicators and reduce the subjectivity of the index weights. Third, the failure probability and failure consequence scores of the pipeline are converted into three numerical features by cloud inference, which better describes the ambiguity and volatility of the risk level. Finally, cloud drop graphs of failure probability and failure consequences can be drawn, intuitively and accurately illustrating the ambiguity and randomness of the results. A case study of a coal mine gob pipeline carrying natural gas has been investigated to validate the utility of the proposed method.
The evaluation results of this case show that the probability of failure of the pipeline is very low while the consequences of failure are serious, which is consistent with reality.
Keywords: bow-tie model, natural gas pipeline, coal mine gob, cloud inference
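Two of the building blocks above lend themselves to a short sketch: the censored mean used to stabilize expert scores, and the conversion of a score sample into the three numerical features of a cloud model (expectation Ex, entropy En, hyper-entropy He). The backward cloud generator shown here is a standard textbook variant; the authors' exact estimator may differ.

```python
import math

def censored_mean(scores):
    """Drop the single maximum and minimum score, then average the rest.
    This stabilizes aggregated expert ratings against outlier opinions."""
    s = sorted(scores)
    trimmed = s[1:-1] if len(s) > 2 else s
    return sum(trimmed) / len(trimmed)

def backward_cloud(samples):
    """Backward cloud generator (without certainty degrees):
    estimate the three numerical features (Ex, En, He) from score samples."""
    n = len(samples)
    ex = sum(samples) / n                                   # expectation
    en = math.sqrt(math.pi / 2.0) * sum(abs(x - ex) for x in samples) / n  # entropy
    var = sum((x - ex) ** 2 for x in samples) / (n - 1)     # sample variance
    he = math.sqrt(max(var - en ** 2, 0.0))                 # hyper-entropy
    return ex, en, he
```

The hyper-entropy He captures the volatility of the score sample beyond what a single Gaussian would, which is the "ambiguity and randomness" the cloud drop graphs visualize.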
Procedia PDF Downloads 250
6048 Antibacterial Activity of Copper Nanoparticles on Vancomycin Resistant Staphylococcus Aureus in Vitro and Animal Models
Authors: Sina Gharevali
Abstract:
Staphylococcus aureus is one of the most important causes of nosocomial, hospital-acquired infections. Methicillin-resistant strains, first reported in 1961 and now found in many parts of the world, have made vancomycin the drug of last resort for treating infections caused by Staphylococcus aureus. The aim of this study was to evaluate the antimicrobial effects of copper nanoparticles and compare them with antibiotics against vancomycin-resistant Staphylococcus aureus in vitro and in an animal model. First, the most effective antibiotic against vancomycin-resistant Staphylococcus aureus was determined by the disk diffusion method, which identified ciprofloxacin. Various concentrations of copper nanoparticles and of the antibiotic were then prepared, and the Minimum Inhibitory Concentration (MIC) and Minimum Bactericidal Concentration (MBC) of the copper nanoparticles against vancomycin-resistant Staphylococcus aureus were determined by the serial dilution method. The agar dilution method was used to assess bacterial growth at different concentrations of copper nanoparticles and ciprofloxacin. The broth dilution method was then applied to ciprofloxacin, to copper nanoparticles, and to the nanoparticle-antibiotic combination to establish synergy, and the MIC and MBC values were obtained and compared with the results of the experimental animal model. The results showed that copper nanoparticles were more effective than the antibiotic ciprofloxacin, both in vitro and in the animal model, in inhibiting the growth of Staphylococcus aureus resistant to vancomycin and ciprofloxacin, with a synergistic effect observed at lower copper nanoparticle concentrations.
Copper nanoparticles can therefore be considered as a candidate for further clinical research.
Keywords: nanoparticles, copper, Staphylococcus aureus
Procedia PDF Downloads 96
6047 25 (OH)D3 Level and Obesity Type, and Its Effect on Renal Excretory Function in Patients with a Functioning Transplant
Authors: Magdalena Barbara Kaziuk, Waldemar Kosiba, Marek Jan Kuzniewski
Abstract:
Introduction: Vitamin D3 has a proven pleiotropic effect; it is not only responsible for calcium and phosphate management but also influences the normal functioning of the whole body. Aim: Evaluation of vitamin D3 resources and their effect on nutritional status, obesity type, and glomerular filtration in kidney transplant recipients. Methods: In a group of 152 patients (81 women and 71 men, average age 47.8 ± 11.6 years) with a functioning renal transplant, body composition was assessed using the bioimpedance method (BIA) and anthropometric measurements more than 3 months after the transplant. The nutritional status and the obesity type were determined with the Waist to Height Ratio (WHtR) and the Waist to Hip Ratio (WHR). 25-Hydroxyvitamin D3 (25(OH)D3) was determined, together with its correlation with the obesity type and the glomerular filtration rate (eGFR) calculated with the MDRD formula. Results: The mean 25(OH)D3 level was 20.4 ng/ml, with 30 ng/ml considered the minimum correct level. In the study group, 22.7% of patients had a correct body weight, 56.7% had an android obesity type, and 20.6% had a gynoid type. A significant correlation was observed between 25(OH)D3 deficiency and abdominal obesity (p < 0.005). Furthermore, a statistically significant relationship was demonstrated between the 25(OH)D3 levels and eGFR in patients after a kidney transplant. Patients with an android body type had lower eGFR than those with the gynoid body type (p = 0.004). Conclusions: A correct diet in patients after a kidney transplant determines the minimum recommended serum levels of vitamin D3. Excessive fatty tissue and low levels of 25(OH)D3 may be predictors of android obesity and renal injury; therefore, a correct diet and pharmacological management, together with physical activities adapted to the patient's fitness level, are necessary.
Keywords: kidney transplantation, glomerular filtration rate, obesity, vitamin D3
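The eGFR calculation mentioned above can be sketched with the 4-variable MDRD formula. The coefficient 175 assumes IDMS-traceable creatinine calibration (the older variant uses 186); the abstract does not state which variant was used, so this is an illustrative assumption.

```python
def egfr_mdrd(scr_mg_dl, age_years, female=False, black=False):
    """4-variable MDRD estimate of GFR in ml/min/1.73 m^2.
    scr_mg_dl: serum creatinine in mg/dl; coefficient 175 assumes
    IDMS-traceable creatinine (an assumption, not stated in the abstract)."""
    egfr = 175.0 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742   # sex correction factor
    if black:
        egfr *= 1.212   # race correction factor
    return egfr
```

For example, a 50-year-old male patient with serum creatinine of 1.0 mg/dl would have an eGFR of roughly 79 ml/min/1.73 m^2 under this variant of the formula.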
Procedia PDF Downloads 278
6046 Facility Anomaly Detection with Gaussian Mixture Model
Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho
Abstract:
The Internet of Things allows one to collect data from facilities, which are then used to monitor them and even predict malfunctions in advance. Conventional quality control methods focus on setting a normal range on a sensor value, defined between a lower control limit and an upper control limit, and declaring as an anomaly anything falling outside it. However, interactions among sensor values are ignored, leading to suboptimal performance. We propose a multivariate approach which takes into account many sensor values at the same time. In particular, a Gaussian Mixture Model is used, trained to maximize the likelihood using the Expectation-Maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian Information Criterion. The negative log likelihood is used as an anomaly score. A typical usage scenario is as follows: for each instance of sensor values from a facility, an anomaly score is computed; if it is larger than a threshold, an alarm goes off and a human expert intervenes and checks the system. Real-world data from a building energy system were used to test the model.
Keywords: facility anomaly detection, Gaussian mixture model, anomaly score, expectation maximization algorithm
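The pipeline above (EM-fitted mixture, negative log likelihood as anomaly score) can be sketched in plain Python for the one-dimensional case; the paper's multivariate model and BIC-based component selection follow the same pattern, and the quantile-based initialization used here is an illustrative choice.

```python
import math

def normal_pdf(x, mu, var):
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def fit_gmm_1d(data, k, iters=200):
    """Fit a 1-D Gaussian mixture by Expectation-Maximization."""
    n = len(data)
    xs = sorted(data)
    mu = [xs[int((i + 0.5) * n / k)] for i in range(k)]  # quantile-spread init
    mean = sum(data) / n
    var = [sum((x - mean) ** 2 for x in data) / n + 1e-9] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[j] * normal_pdf(x, mu[j], var[j]) for j in range(k)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        # M-step: re-estimate weights, means, variances
        for j in range(k):
            nj = sum(r[j] for r in resp)
            w[j] = nj / n
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = sum(r[j] * (x - mu[j]) ** 2 for r, x in zip(resp, data)) / nj + 1e-9
    return w, mu, var

def anomaly_score(x, w, mu, var):
    """Negative log likelihood under the fitted mixture: large = anomalous."""
    return -math.log(sum(wj * normal_pdf(x, mj, vj)
                         for wj, mj, vj in zip(w, mu, var)))
```

A sensor reading that falls between the two operating modes of the facility then receives a much higher score than one inside a mode, which is exactly the thresholded alarm condition described above.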
Procedia PDF Downloads 272
6045 Modified Lot Quality Assurance Sampling (LQAS) Model for Quality Assessment of Malaria Parasite Microscopy and Rapid Diagnostic Tests in Kano, Nigeria
Authors: F. Sarkinfada, Dabo N. Tukur, Abbas A. Muaz, Adamu A. Yahuza
Abstract:
Appropriate Quality Assurance (QA) of parasite-based diagnosis of malaria to justify Artemisinin-based Combination Therapy (ACT) is essential for malaria programmes. In Low and Middle Income Countries (LMIC), resource constraints appear to be a major challenge in implementing the conventional QA system. We designed and implemented a modified LQAS model for QA of malaria parasite (MP) microscopy and RDT in a State Specialist Hospital (SSH) and a University Health Clinic (UHC) in Kano, Nigeria. The capacities of both facilities for MP microscopy and RDT were assessed before implementing the modified LQAS over a period of 3 months. Quality indicators comprising the quality of blood films and staining, MP positivity rates, concordance rates, error rates (in terms of false positives and false negatives), sensitivity and specificity were monitored and evaluated. Seventy-one percent (71%) of the basic requirements for malaria microscopy were available in both facilities, with the absence of certified microscopists, SOPs and quality assurance mechanisms. A daily average of 16 to 32 blood samples was tested, with a blood film staining quality of >70% recorded in both facilities. Using microscopy, the MP positivity rates were 50.46% and 19.44% in SSH and UHC respectively, while the MP positivity rates were 45.83% and 22.78% in SSH and UHC when RDT was used. Higher concordance rates of 88.90% and 93.98% were recorded in SSH and UHC respectively using microscopy, while lower rates of 74.07% and 80.58% in SSH and UHC were recorded when RDT was used. In both facilities, error rates were higher when RDT was used than with microscopy. Sensitivity and specificity were higher when microscopy was used (95% and 84% in SSH; 94% in UHC) than when RDT was used (72% and 76% in SSH; 78% and 81% in UHC).
It could be feasible to implement an integrated QA model for MP microscopy and RDT using modified LQAS in malaria control programmes in Low and Middle Income Countries that have resource constraints for parasite-based diagnosis of malaria to justify ACT treatment.
Keywords: malaria, microscopy, quality assurance, RDT
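The quality indicators monitored in the study all derive from a 2x2 confusion table of the index test (microscopy or RDT) against the reference result. A minimal sketch, with illustrative counts rather than the study's actual data:

```python
def diagnostic_indicators(tp, fp, fn, tn):
    """Quality indicators from a 2x2 confusion table:
    tp/fp/fn/tn = true/false positives and negatives (all percentages)."""
    total = tp + fp + fn + tn
    return {
        "positivity_rate":     100.0 * (tp + fp) / total,
        "concordance":         100.0 * (tp + tn) / total,
        "false_positive_rate": 100.0 * fp / (fp + tn),
        "false_negative_rate": 100.0 * fn / (fn + tp),
        "sensitivity":         100.0 * tp / (tp + fn),
        "specificity":         100.0 * tn / (tn + fp),
    }
```

For example, a facility with 45 true positives, 5 false positives, 5 false negatives and 45 true negatives would score 90% on sensitivity, specificity and concordance alike.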
Procedia PDF Downloads 222
6044 Evaluation of Complications Observed in Porcelain Fused to Metal Crowns Placed at a Teaching Institution
Authors: Shizrah Jamal, Robia Ghafoor, Farhan Raza
Abstract:
The porcelain fused to metal (PFM) crown is the most versatile variety of crown and is commonly placed worldwide. Various complications have been reported in PFM crowns with use over time. These include chipping of the porcelain, recurrent caries, loss of retention, open contacts, and tooth fracture. The objective of the present study was to determine the frequency of these complications in crowns cemented over a period of five years in a tertiary care hospital and also to report the survival of these crowns. A retrospective study was conducted in the dental clinics of Aga Khan University Hospital, in which 150 PFM crowns cemented over a period of five years were evaluated. Patient demographics, oral hygiene habits, para-functional habits, and crown insertion and follow-up dates were recorded in a specially designed proforma. All PFM crowns fulfilling the inclusion criteria were assessed both clinically and radiographically for the presence of any complication. SPSS version 22.0 was used for statistical analysis. Frequency distribution and proportion of complications were determined. The chi-square test was used to determine the association of PFM crown complications with multiple variables, including tooth wear, opposing dentition and betel nut chewing. Kaplan-Meier survival analysis was used to determine the survival of PFM crowns over the period of five years. The level of significance was kept at 0.05. A total of 107 patients, with a mean age of 43.51 ± 12.4 years, having 150 PFM crowns were evaluated. The most common complication observed was open proximal contacts (8.7%), followed by porcelain chipping (6%), decementation (5.3%), and abutment fracture (1.3%). The chi-square test showed that there was no statistically significant association of PFM crown complications with tooth wear, betel nut chewing or opposing dentition (p > 0.05). The overall success and survival rates of PFM crowns turned out to be 78.7% and 84.7% respectively.
Within the limitations of the study, it can be concluded that PFM crowns are an effective treatment modality with high success and survival rates. Since this was a single-center study, the results should be generalized with caution.
Keywords: chipping, complication, crown, survival rate
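The Kaplan-Meier estimator used above (via SPSS in the study) is the product-limit estimate of the survival curve; a minimal sketch for crown follow-up data, where a crown lost to follow-up before failing is treated as censored:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times[i]: follow-up time; events[i]: 1 = failure, 0 = censored.
    Returns (failure_times, survival_probabilities)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    ts = [times[i] for i in order]
    es = [events[i] for i in order]
    at_risk = len(ts)
    s = 1.0
    out_t, out_s = [], []
    i = 0
    while i < len(ts):
        t = ts[i]
        d = c = 0                      # failures and censorings at time t
        while i < len(ts) and ts[i] == t:
            if es[i]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            s *= 1.0 - d / at_risk     # survival drops only at failure times
            out_t.append(t)
            out_s.append(s)
        at_risk -= d + c               # both failures and censorings leave the risk set
    return out_t, out_s
```

With five crowns failing at years 1, 2 and 4 and censored at years 3 and 5, the curve steps down to 0.8, 0.6 and 0.3, matching the intuition that censored crowns shrink the risk set without dropping the curve.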
Procedia PDF Downloads 208
6043 A Near-Optimal Domain Independent Approach for Detecting Approximate Duplicates
Authors: Abdelaziz Fellah, Allaoua Maamir
Abstract:
We propose a domain-independent merging-cluster filter approach, complemented with a set of algorithms, for identifying approximate duplicate entities efficiently and accurately within a single data source and across multiple data sources. The near-optimal merging-cluster filter (MCF) approach is based on the well-tuned Monge-Elkan algorithm and extended with an affine variant of the Smith-Waterman similarity measure. We then present constant, variable, and function threshold algorithms that work conceptually in a divide-merge filtering fashion for detecting near duplicates as hierarchical clusters along with their corresponding representatives. The algorithms take recursive refinement approaches in the spirit of filtering, merging, and updating cluster representatives to detect approximate duplicates at each level of the cluster tree. Experiments show high effectiveness and accuracy of the MCF approach in detecting approximate duplicates, outperforming the seminal Monge-Elkan algorithm on several real-world benchmarks and generated datasets.
Keywords: data mining, data cleaning, approximate duplicates, near-duplicates detection, data mining applications and discovery
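The Monge-Elkan scheme with a Smith-Waterman internal similarity can be sketched as follows. For brevity this sketch uses a simple linear gap penalty; the paper's affine variant charges gap opening and extension separately, and the normalization choice here is an illustrative one.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Local alignment score between strings (linear gap penalty;
    the paper uses an affine-gap variant)."""
    rows = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    best = 0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            rows[i][j] = max(0,
                             rows[i - 1][j - 1] + sub,   # substitution/match
                             rows[i - 1][j] + gap,       # gap in b
                             rows[i][j - 1] + gap)       # gap in a
            best = max(best, rows[i][j])
    return best

def sw_norm(a, b):
    """Normalize to [0, 1] by the best possible self-match score."""
    m = max(len(a), len(b))
    return smith_waterman(a, b) / (2 * m) if m else 1.0

def monge_elkan(tokens_a, tokens_b, sim=sw_norm):
    """Monge-Elkan: average, over tokens of A, of the best
    internal similarity to any token of B."""
    if not tokens_a:
        return 0.0
    return sum(max(sim(ta, tb) for tb in tokens_b) for ta in tokens_a) / len(tokens_a)
```

Token order is ignored, so "John Smith" and "Smith John" score 1.0, while near-duplicates such as "Jon" vs. "John" still score well above chance; this tolerance to reordering and small edits is what makes the measure effective for duplicate entity detection.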
Procedia PDF Downloads 387
6042 The Analysis of Gizmos Online Program as Mathematics Diagnostic Program: A Story from an Indonesian Private School
Authors: Shofiayuningtyas Luftiani
Abstract:
Some private schools in Indonesia have started integrating the online program Gizmos into the teaching-learning process. Gizmos was developed to supplement the existing curriculum by integrating it into instructional programs. The program features inquiry-based simulations, in which students conduct explorations using a worksheet while teachers use the teacher guidelines to direct and assess students' performance. In this study, the discussion about Gizmos highlights its features as an assessment medium for mathematics learning for secondary school students. The discussion is based on a case study and literature review from the Indonesian context. The purpose of applying Gizmos as an assessment medium refers to diagnostic assessment. As a part of the diagnostic assessment, the teachers review the student exploration sheet, analyze the students' difficulties in particular, and consider the findings in planning the future learning process. This assessment is important because the teacher needs data about students' persistent weaknesses. Additionally, the program also helps build students' understanding through its interactive simulation. Currently, the assessment over-emphasizes the students' answers in the worksheet based on the provided answer keys, while students perform their skills in translating the question, doing the simulation and answering the question. The assessment should instead involve multiple perspectives and sources of students' performance, since the teacher should adjust the instructional programs to the complexity of students' learning needs and styles. Consequently, an approach to improving the assessment components is selected to challenge the current assessment. The purpose of this challenge is to involve not only cognitive diagnosis but also the analysis of skills and errors.
Concerning the selected setting for this diagnostic assessment, which combines cognitive diagnosis, skills analysis and error analysis, the teachers should create an assessment rubric. The rubric plays an important role as a guide providing a set of criteria for the assessment. Without a precise rubric, the teacher may fail to effectively document and follow up the data about students at risk of failure. Furthermore, teachers who employ Gizmos for diagnostic assessment might encounter some obstacles. Based on the conditions of assessment in the selected setting, the obstacles involve time constraints, reluctance toward a higher teaching burden, and students' behavior. Consequently, the teacher who chooses Gizmos with these approaches has to plan, implement and evaluate the assessment. The main point of this assessment is not the result of the students' worksheet. Rather, the diagnostic assessment is a two-stage process: prompting, and then effectively following up, both individual weaknesses and those of the learning process. Ultimately, the discussion of Gizmos as a medium of diagnostic assessment refers to the effort to improve the mathematical learning process.
Keywords: diagnostic assessment, error analysis, Gizmos online program, skills analysis
Procedia PDF Downloads 182
6041 Finite Element Analysis for Earing Prediction Incorporating the BBC2003 Material Model with Fully Implicit Integration Method: Derivation and Numerical Algorithm
Authors: Sajjad Izadpanah, Seyed Hadi Ghaderi, Morteza Sayah Irani, Mahdi Gerdooei
Abstract:
In this research work, a sophisticated yield criterion known as BBC2003, capable of describing the planar anisotropic behavior of aluminum alloy sheets, was integrated into the commercial finite element code ABAQUS/Standard via a user subroutine. The complete formulation of the implementation process using a fully implicit integration scheme, i.e., the classic backward Euler method, is presented, and relevant aspects of the yield criterion are introduced. In order to solve the nonlinear differential and algebraic equations, the line-search algorithm was adopted in the user-defined material subroutine (UMAT) to expand the convergence domain of the iterative Newton-Raphson method. The developed subroutine was used to simulate a challenging computational problem with complex stress states, i.e., deep drawing of the anisotropic aluminum alloy AA3105. The accuracy and stability of the developed subroutine were confirmed by comparing the numerically predicted earing and thickness variation profiles with the experimental results, which showed excellent agreement. The integration of the BBC2003 yield criterion into ABAQUS/Standard represents a significant contribution to the field of computational mechanics and provides a useful tool for analyzing the mechanical behavior of anisotropic materials subjected to complex loading conditions.
Keywords: BBC2003 yield function, plastic anisotropy, fully implicit integration scheme, line search algorithm, explicit and implicit integration schemes
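The globalization strategy described above (Newton-Raphson with a line search to enlarge the convergence domain) can be illustrated on a scalar residual. This is a sketch of the idea only, not the tensor-valued stress-update solve performed inside the actual UMAT; the backtracking factor of 0.5 is an illustrative choice.

```python
def newton_line_search(f, jac, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson on a scalar residual f(x) = 0 with a backtracking
    line search: the step is halved until the residual magnitude decreases."""
    x = x0
    for _ in range(max_iter):
        r = f(x)
        if abs(r) < tol:
            return x
        dx = -r / jac(x)          # full Newton step
        alpha = 1.0
        # backtracking: shrink the step while it fails to reduce |f|
        while abs(f(x + alpha * dx)) >= abs(r) and alpha > 1e-8:
            alpha *= 0.5
        x += alpha * dx
    return x
```

In a backward-Euler stress update the residual is the vector of return-mapping equations and the Jacobian its consistent tangent; the line search plays the same role of preventing divergence when the trial state is far from the yield surface.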
Procedia PDF Downloads 75
6040 Design and Simulation of an Inter-Satellite Optical Wireless Communication System Using Diversity Techniques
Authors: Sridhar Rapuru, D. Mallikarjunreddy, Rajanarendra Sai
Abstract:
In this era of the internet, access to any multimedia file at any time with superior quality is needed. To achieve this goal, it is very important to have a good network without any interruptions between the satellites and the various earth stations. For that purpose, a high speed inter-satellite optical wireless communication (IsOWC) system is designed with space and polarization diversity techniques. IsOWC offers high bandwidth, small size, and low power requirements, and is affordable when compared with present microwave satellite systems. To improve the efficiency and to reduce the propagation delay, an inter-satellite link is established between the satellites. Highly accurate tracking systems are required to establish a reliable connection between the satellites, as they have their own orbits. The main disadvantage of the IsOWC system is that the laser beam width is much narrower than in RF systems, which is why such a highly accurate tracking system is needed. The satellite uses the 'ephemerides data' for rough pointing and a tracking system for fine pointing to the other satellite. In the proposed IsOWC system, laser light is used as the wireless link between the source and destination, and free space acts as the channel to carry the message. The proposed system is designed, simulated and analyzed for 6000 km, with an improvement in data rate over previously existing systems. The performance parameters of the system are the Q-factor, eye opening, bit error rate, etc. The proposed inter-satellite optical wireless communication system design using diversity techniques finds huge scope of application in future generation communication purposes.
Keywords: inter-satellite optical wireless system, space and polarization diversity techniques, line of sight, bit error rate, Q-factor
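The Q-factor and bit error rate reported by optical link simulators are related by a standard closed form for on-off keying with Gaussian noise; a minimal sketch:

```python
import math

def ber_from_q(q):
    """Bit error rate from the Q-factor for OOK detection with Gaussian
    noise statistics: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2.0))
```

A Q-factor of 6 corresponds to a BER of roughly 1e-9, the classic benchmark for an acceptable optical link, and the BER falls steeply as Q improves, which is why diversity techniques that raise the received Q-factor pay off so strongly.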
Procedia PDF Downloads 269
6039 Improving the Accuracy of Stress Intensity Factors Obtained by Scaled Boundary Finite Element Method on Hybrid Quadtree Meshes
Authors: Adrian W. Egger, Savvas P. Triantafyllou, Eleni N. Chatzi
Abstract:
The scaled boundary finite element method (SBFEM) is a semi-analytical numerical method, which introduces a scaling center in each element’s domain, thus transitioning from a Cartesian reference frame to one resembling polar coordinates. Consequently, an analytical solution is achieved in radial direction, implying that only the boundary need be discretized. The only limitation imposed on the resulting polygonal elements is that they remain star-convex. Further arbitrary p- or h-refinement may be applied locally in a mesh. The polygonal nature of SBFEM elements has been exploited in quadtree meshes to alleviate all issues conventionally associated with hanging nodes. Furthermore, since in 2D this results in only 16 possible cell configurations, these are precomputed in order to accelerate the forward analysis significantly. Any cells, which are clipped to accommodate the domain geometry, must be computed conventionally. However, since SBFEM permits polygonal elements, significantly coarser meshes at comparable accuracy levels are obtained when compared with conventional quadtree analysis, further increasing the computational efficiency of this scheme. The generalized stress intensity factors (gSIFs) are computed by exploiting the semi-analytical solution in radial direction. This is initiated by placing the scaling center of the element containing the crack at the crack tip. Taking an analytical limit of this element’s stress field as it approaches the crack tip, delivers an expression for the singular stress field. By applying the problem specific boundary conditions, the geometry correction factor is obtained, and the gSIFs are then evaluated based on their formal definition. Since the SBFEM solution is constructed as a power series, not unlike mode superposition in FEM, the two modes contributing to the singular response of the element can be easily identified in post-processing. 
Compared to the extended finite element method (XFEM), this approach is highly convenient, since neither enrichment terms nor a priori knowledge of the singularity is required. Computation of the gSIFs by SBFEM permits exceptional accuracy; however, when combined with hybrid quadtrees employing linear elements, this does not always hold. Nevertheless, it has been shown that crack propagation schemes are highly effective even given very coarse discretizations, since they only rely on the ratio of mode one to mode two gSIFs. The absolute values of the gSIFs may still be subject to large errors. Hence, we propose a post-processing scheme which minimizes the error resulting from the approximation space of the cracked element, thus limiting the error in the gSIFs to the discretization error of the quadtree mesh. This is achieved by h- and/or p-refinement of the cracked element, which elevates the number of modes present in the solution. The resulting numerical description of the element is highly accurate, with the main error source now stemming from its boundary displacement solution. Numerical examples show that this post-processing procedure can significantly improve the accuracy of the computed gSIFs with negligible computational cost, even on the coarse meshes resulting from hybrid quadtrees.
Keywords: linear elastic fracture mechanics, generalized stress intensity factors, scaled boundary finite element method, hybrid quadtrees
Procedia PDF Downloads 146
6038 Optimization Techniques for Microwave Structures
Authors: Malika Ourabia
Abstract:
A new and efficient method is presented for the analysis of arbitrarily shaped discontinuities. The discontinuities are characterized using a hybrid spectral/numerical technique. The structure presents an arbitrary number of ports, each one with a different orientation and dimensions. This article presents a hybrid method based on multimode contour integral and mode matching techniques. The process is based on segmentation, dividing the structure into key building blocks. We use the multimode contour integral method to analyze the blocks, including irregularly shaped discontinuities. Finally, the multimode scattering matrix of the whole structure can be found by cascading the blocks. The new method is therefore suitable for the analysis of a wide range of waveguide problems, and the present approach can be applied easily to the analysis of any multiport junctions and cascaded blocks. The accuracy of the method is validated by comparing with results for several complex problems found in the literature. CPU times are also included to show the efficiency of the proposed method.
Keywords: segmentation, S parameters, simulation, optimization
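The cascading step above, in its simplest single-mode two-port form, combines the scattering parameters of adjacent blocks with the standard cascade formulas; the multimode case replaces each scalar below with a matrix block. A minimal sketch:

```python
def cascade_2port(sa, sb):
    """Cascade two 2-port scattering matrices, each given as a tuple
    (S11, S12, S21, S22), with port 2 of block A connected to port 1 of block B."""
    a11, a12, a21, a22 = sa
    b11, b12, b21, b22 = sb
    d = 1.0 - a22 * b11              # multiple-reflection denominator
    return (a11 + a12 * b11 * a21 / d,   # overall S11
            a12 * b12 / d,               # overall S12
            a21 * b21 / d,               # overall S21
            b22 + b21 * a22 * b12 / d)   # overall S22
```

Cascading any block with a matched thru (S11 = S22 = 0, S12 = S21 = 1) returns the block unchanged, a useful sanity check when building up the full structure block by block.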
Procedia PDF Downloads 528
6037 Performances of Type-2 Fuzzy Logic Control and Neuro-Fuzzy Control Based on DPC for Grid Connected DFIG with Fixed Switching Frequency
Authors: Fayssal Amrane, Azeddine Chaiba
Abstract:
In this paper, type-2 fuzzy logic control (T2FLC) and neuro-fuzzy control (NFC) for a doubly fed induction generator (DFIG) based on direct power control (DPC) with a fixed switching frequency are proposed for wind generation applications. First, a mathematical model of the doubly fed induction generator implemented in the d-q reference frame is derived. Then, a DPC algorithm for controlling the active and reactive power of the DFIG at a fixed switching frequency is presented using PID control. The performances of T2FLC and NFC, which are based on the DPC algorithm, are investigated and compared to those obtained with the PID controller. Finally, simulation results demonstrate that the NFC is more robust and has superior dynamic performance for wind power generation system applications.
Keywords: doubly fed induction generator (DFIG), direct power control (DPC), neuro-fuzzy control (NFC), maximum power point tracking (MPPT), space vector modulation (SVM), type 2 fuzzy logic control (T2FLC)
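The PID baseline that the fuzzy controllers are compared against regulates the active and reactive power errors through discrete control loops. A generic discrete PI regulator of the kind used in such power loops can be sketched as follows; the gains, time step and first-order test plant here are illustrative assumptions, not values from the paper.

```python
class PI:
    """Discrete PI regulator (derivative term omitted for brevity)."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, reference, measurement):
        """One control update: returns the actuation command."""
        error = reference - measurement
        self.integral += error * self.dt      # integral action removes steady-state error
        return self.kp * error + self.ki * self.integral
```

Driving a simple first-order plant with this regulator shows the measured output tracking the power reference with zero steady-state error, which is the behavior the T2FLC and NFC schemes then try to improve upon in transient robustness.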
Procedia PDF Downloads 420
6036 An Evolutionary Multi-Objective Optimization for Airport Gate Assignment Problem
Authors: Seyedmirsajad Mokhtarimousavi, Danial Talebi, Hamidreza Asgari
Abstract:
The Gate Assignment Problem (GAP) is one of the most substantial issues in airport operation. In principle, GAP intends to maintain the maximum capacity of the airport through the best possible allocation of its resources (gates) in order to reach the optimum outcome. The problem involves a wide range of dependent and independent resources and their limitations, which add to the complexity of GAP from both theoretical and practical perspectives. In this study, GAP was mathematically formulated as a three-objective problem. The preliminary goal of the multi-objective formulation was to address a higher number of objectives that can be simultaneously optimized and therefore increase the practical efficiency of the final solution. The problem is solved by applying the second version of the Non-dominated Sorting Genetic Algorithm (NSGA-II). Results showed that the proposed mathematical model could address most of the major criteria in the decision-making process in airport management, in terms of minimizing both airport/airline cost and passenger walking time. Moreover, the proposed approach could properly find acceptable possible answers.
Keywords: airport management, gate assignment problem, mathematical modeling, genetic algorithm, NSGA-II
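The core machinery of NSGA-II is fast non-dominated sorting: ranking candidate gate assignments into Pareto fronts over the objective vectors (here, minimization of all objectives is assumed). A minimal sketch of that ranking step, leaving out crowding distance and the genetic operators:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(objs):
    """Rank solutions into Pareto fronts, as in NSGA-II.
    Front 0 holds the non-dominated solutions."""
    n = len(objs)
    dominated_by = [[] for _ in range(n)]  # solutions each i dominates
    count = [0] * n                        # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif dominates(objs[j], objs[i]):
                count[i] += 1
        if count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                count[j] -= 1
                if count[j] == 0:      # all dominators already assigned
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]
```

For a three-objective GAP formulation each tuple would hold, e.g., airport cost, airline cost and passenger walking time; NSGA-II then selects and breeds preferentially from the lower-ranked fronts.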
Procedia PDF Downloads 299
6035 Sentiment Analysis of Ensemble-Based Classifiers for E-Mail Data
Authors: Muthukumarasamy Govindarajan
Abstract:
Detection of unwanted, unsolicited mail, called spam, in email is an interesting area of research. It is necessary to evaluate the performance of any new spam classifier using standard data sets. Recently, ensemble-based classifiers have gained popularity in this domain. In this research work, an efficient email filtering approach based on ensemble methods is addressed for developing an accurate and sensitive spam classifier. The proposed approach employs Naive Bayes (NB), Support Vector Machine (SVM) and Genetic Algorithm (GA) as base classifiers along with different ensemble methods. The experimental results show that the ensemble classifier performed with greater accuracy than the individual classifiers, and the hybrid model results are found to be better than the combined models for the e-mail dataset. The proposed ensemble-based classifiers turn out to be good in terms of classification accuracy, which is considered an important criterion for building a robust spam classifier.
Keywords: accuracy, arcing, bagging, genetic algorithm, Naive Bayes, sentiment mining, support vector machine
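The simplest way base classifiers such as NB, SVM and a GA-tuned model are combined is majority voting over their predicted labels; a minimal sketch with stand-in classifier callables (the paper's actual trained models are not reproduced here):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the base-classifier labels for one message by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

def ensemble_predict(classifiers, message):
    """Apply each base classifier (e.g., NB, SVM, GA-tuned) to the message
    and return the majority label."""
    return majority_vote([clf(message) for clf in classifiers])
```

Bagging and arcing (the keywords above) differ from this plain vote in how the base classifiers are trained - on bootstrap resamples or on reweighted hard examples - but both still aggregate predictions in essentially this way.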
Procedia PDF Downloads 142
6034 Biomass Carbon Credit Estimation for Sustainable Urban Planning and Micro-climate Assessment
Authors: R. Niranchana, K. Meena Alias Jeyanthi
Abstract:
As a result of the present climate change dilemma, devising energy-balancing strategies to construct a sustainable environment has become a top concern for researchers worldwide. The environment itself has always been a solution, from the earliest days of human evolution. Carbon capture begins with accurate estimation and monitoring of credit inventories, and with their efficient use. Sustainable urban planning with deliverables of re-use energy models might benefit from assessment methods like biomass carbon credit ranking. The term 'biomass energy' refers to the various ways in which living organisms can potentially be converted into a source of energy. The approaches that can be applied to biomass, and an algorithm for evaluating carbon credits, are presented in this paper. A micro-climate evaluation using Computational Fluid Dynamics was carried out across a 1 km x 1 km location at Dindigul, India (10°24'58.68" North, 77°54'1.80" East). Sustainable urban design must be carried out considering environmental and physiological convection, conduction, radiation and evaporative heat exchange due to solar access and wind intensities.
Keywords: biomass, climate assessment, urban planning, multi-regression, carbon estimation algorithm
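The multi-regression listed among the keywords - relating a carbon-credit response to several biomass predictors - can be sketched as ordinary least squares via the normal equations. The predictors and response below are illustrative placeholders, not data from the study.

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]                    # pivot for stability
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                     # back substitution
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def multi_regression(X, y):
    """Ordinary least squares via the normal equations A^T A beta = A^T y,
    with an intercept column prepended. Returns [intercept, coef1, coef2, ...]."""
    A = [[1.0] + list(row) for row in X]
    n, p = len(A), len(A[0])
    AtA = [[sum(A[i][r] * A[i][c] for i in range(n)) for c in range(p)] for r in range(p)]
    Aty = [sum(A[i][r] * y[i] for i in range(n)) for r in range(p)]
    return solve_linear(AtA, Aty)
```

Fitted on plot-level observations (e.g., canopy area and biomass density against measured carbon stock), the returned coefficients give the linear carbon-estimation model that a credit-ranking algorithm can then apply across the study grid.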
Procedia PDF Downloads 95