Search results for: type-2 fuzzy sets
460 Cosmic Muon Tomography at the Wylfa Reactor Site Using an Anti-Neutrino Detector
Authors: Ronald Collins, Jonathon Coleman, Joel Dasari, George Holt, Carl Metelko, Matthew Murdoch, Alexander Morgan, Yan-Jie Schnellbach, Robert Mills, Gareth Edwards, Alexander Roberts
Abstract:
At the Wylfa Magnox Power Plant, between 2014 and 2016, the VIDARR prototype anti-neutrino detector was deployed. It comprises extruded plastic scintillating bars measuring 4 cm × 1 cm × 152 cm and utilises wavelength-shifting fibres (WLS) and multi-pixel photon counters (MPPCs) to detect and quantify radiation. During deployment, it took cosmic muon data in accidental coincidence with the anti-neutrino measurements, with the power plant site buildings obscuring the muon sky. Cosmic muons have a significantly higher probability of being attenuated and/or absorbed by denser objects, and so one-sided cosmic muon tomography was utilised to image the reactor site buildings. In order to achieve clear building outlines, a control data set was taken at the University of Liverpool from 2016 to 2018, with minimal occlusion of the cosmic muon flux by dense objects. By taking the ratio of these two data sets and using GEANT4 simulations, it is possible to perform a one-sided cosmic muon tomography analysis. This analysis can be used to discern specific buildings, building heights, and features at the Wylfa reactor site, including the reactor core and reactor core shielding, using ∼3 hours' worth of cosmic-ray detector live time. This result demonstrates the feasibility of using cosmic muon analysis to determine a segmented detector's location with respect to surrounding buildings, assisted by aerial photography or satellite imagery.
Keywords: anti-neutrino, GEANT4, muon, tomography, occlusion
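The core of the analysis is the ratio of the angular muon flux observed at the reactor site to the open-sky control flux. A minimal Python sketch of that step follows; the binning, live-time normalisation, and GEANT4-based efficiency corrections in the paper are more involved, and all names here are illustrative assumptions rather than the authors' code.

import numpy as np

def occlusion_ratio(site_counts, control_counts, site_live_s, control_live_s):
    """Ratio of live-time-normalised muon counts per (zenith, azimuth) bin.
    Bins where the site rate falls well below the open-sky control rate
    indicate occlusion by dense structures such as buildings."""
    site_rate = site_counts / site_live_s          # counts/s per angular bin
    control_rate = control_counts / control_live_s
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(control_rate > 0, site_rate / control_rate, np.nan)
    return ratio  # ~1 means open sky; values < 1 mean attenuation

Plotting this ratio over zenith and azimuth yields the one-sided tomographic image from which building outlines can be read off.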
Procedia PDF Downloads 187
459 Poor Proficiency of English Language among Tertiary Level Students in Bangladesh and Its Effect on Employability: An Investigation to Find Facts and Solutions
Authors: Tanvir Ahmed, Nahian Fyrose Fahim, Subrata Majumder, Sarker Kibria
Abstract:
English is widely recognized as the standard second language of the world. Many people believe that English proficiency is the key to communicating effectively globally, especially for developing countries, as it can bring success on many fronts by ensuring worldwide access to education, business, and technology. Bangladesh is a developing country of about 160 million people. A notable number of students in Bangladesh are currently pursuing higher education, especially at the tertiary level, in more than 150 public and private universities. English is the dominant linguistic medium through which university instruction and lectures are delivered to students in Bangladesh. However, many students who have completed their primary and secondary education in the Bangla medium find themselves in an awkward position when, as freshmen, they must suddenly take on many unfamiliar requirements. As students, they struggle to complete at least 18 courses to acquire proficiency in English. After obtaining a tertiary education certificate, students should have the opportunity to acquire a sustainable position in the job market; unfortunately, many of them fail because of poor English proficiency. Our study focuses on students in both public and private universities (N=150) as well as education experts (N=30) in Bangladesh. We prepared two sets of questionnaires based on a literature review of this subject, collected data, identified the underlying reasons, and arrived at probable solutions for overcoming these problems. After statistical analysis, the study suggests certain remedial measures that could be taken to increase students' proficiency in English as well as to ensure their employability potential.
Keywords: tertiary education, English language proficiency, employability, unemployment problems
Procedia PDF Downloads 106
458 Maximum Likelihood Estimation Methods on a Two-Parameter Rayleigh Distribution under Progressive Type-II Censoring
Authors: Daniel Fundi Murithi
Abstract:
Data from economic, social, clinical, and industrial studies are often incomplete or incorrect due to censoring. Such data may have adverse effects if used in the estimation problem. We propose the use of Maximum Likelihood Estimation (MLE) under a progressive type-II censoring scheme to remedy this problem. In particular, maximum likelihood estimates (MLEs) for the location (µ) and scale (λ) parameters of the two-parameter Rayleigh distribution are realized under a progressive type-II censoring scheme using the Expectation-Maximization (EM) and the Newton-Raphson (NR) algorithms. These algorithms are compared because both iteratively produce satisfactory results in the estimation problem. The progressive type-II censoring scheme is used because it allows the removal of test units before the termination of the experiment. Approximate asymptotic variances and confidence intervals for the location and scale parameters are derived and constructed. The efficiency of the EM and NR algorithms is compared in terms of root mean squared error (RMSE), bias, and coverage rate. The simulation study showed that in most simulation cases, the estimates obtained using the Expectation-Maximization algorithm had smaller biases, smaller variances, narrower confidence intervals, and smaller RMSE than those generated via the Newton-Raphson (NR) algorithm. Further, the analysis of a real-life data set (data from simple experimental trials) showed that the Expectation-Maximization (EM) algorithm performs better than the Newton-Raphson (NR) algorithm under the progressive type-II censoring scheme.
Keywords: expectation-maximization algorithm, maximum likelihood estimation, Newton-Raphson method, two-parameter Rayleigh distribution, progressive type-II censoring
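Under the parameterization common in this literature, f(x; µ, λ) = 2λ(x−µ)exp(−λ(x−µ)²) for x > µ, the progressively censored log-likelihood gives λ a closed form once µ is fixed, so one way to organize a Newton-Raphson computation is to iterate on µ alone. The sketch below assumes that parameterization; the function name and profile-style iteration are illustrative, not the paper's exact scheme.

import numpy as np

def rayleigh_mle_progressive(x, R, mu0, tol=1e-8, max_iter=100):
    """Newton-Raphson sketch for the two-parameter Rayleigh MLE under
    progressive type-II censoring: x holds the m ordered observed failures,
    R[i] the number of surviving units removed at the i-th failure."""
    x = np.asarray(x, float)
    R = np.asarray(R, float)
    m = len(x)
    mu = min(mu0, x.min() - 1e-6)          # the model requires mu < min(x)
    for _ in range(max_iter):
        d = x - mu
        lam = m / np.sum((1 + R) * d**2)   # closed-form lambda given mu
        score = -np.sum(1.0 / d) + 2 * lam * np.sum((1 + R) * d)
        hess = -np.sum(1.0 / d**2) - 2 * lam * np.sum(1 + R)
        step = score / hess
        mu = min(mu - step, x.min() - 1e-6)  # keep the support constraint
        if abs(step) < tol:
            break
    return mu, lam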
Procedia PDF Downloads 163
457 High-Risk Gene Variant Profiling Models Ethnic Disparities in Diabetes Vulnerability
Authors: Jianhua Zhang, Weiping Chen, Guanjie Chen, Jason Flannick, Emma Fikse, Glenda Smerin, Yanqin Yang, Yulong Li, John A. Hanover, William F. Simonds
Abstract:
Ethnic disparities in many diseases are well recognized and reflect the consequences of genetic, behavioral, and environmental factors. However, direct scientific evidence connecting ethnic genetic variations and disease disparities has been elusive, which may have led to ethnic inequalities in large-scale genetic studies. Through genome-wide analysis of data representing 185,934 subjects, including 14,955 from our own studies of African American Diabetes Mellitus, we discovered sets of genetic variants either unique to or conserved across all ethnicities. We further developed a quantitative, gene-function-based high-risk variant index (hrVI) of 20,428 genes to establish profiles that strongly correlate with subjects' self-identified ethnicities. With respect to the ability to detect human essential and pathogenic genes, the hrVI analysis method is both comparable with and complementary to the well-known genetic analysis methods pLI and VIRlof. Application of the ethnicity-specific hrVI analysis to the type 2 diabetes mellitus (T2DM) national repository, containing 20,791 cases and 24,440 controls, identified 114 candidate T2DM-associated genes, 8.8-fold more than ethnicity-blind analysis. All the genes identified are defined as either pathogenic or likely pathogenic in the ClinVar database, with 33.3% diabetes-associated and 54.4% obesity-associated genes. These results demonstrate the utility of hrVI analysis and provide the first genetic evidence, through clustering patterns, of how genetic variations among ethnicities may impede the discovery of diabetes- and, foreseeably, other disease-associated genes.
Keywords: diabetes-associated genes, ethnic health disparities, high-risk variant index, hrVI, T2DM
Procedia PDF Downloads 137
456 Calcitonin Gene-Related Peptide Receptor Antagonists for Chronic Migraine – Real World Outcomes
Authors: B. J. Mahen, N. E. Lloyd-Gale, S. Johnson, W. P. Rakowicz, M. J. Harris, A. D. Miller
Abstract:
Background: Migraine is a leading cause of disability in the world. Calcitonin gene-related peptide (CGRP) receptor antagonists offer an approach to migraine prophylaxis by inhibiting the inflammatory and vasodilatory effects of CGRP. In recent years, NICE licensed the use of three CGRP-receptor antagonists: Fremanezumab, Galcanezumab, and Erenumab. Here, we present the outcomes of CGRP-antagonist treatment in a cohort of patients who suffer from episodic or chronic migraine and have failed at least three oral prophylactic therapies. Methods: We offered CGRP antagonists to 86 patients who met the NICE criteria to start therapy. We recorded the number of headache days per month (HDPM) at 0 weeks, 3 months, and 12 months. Of those, 26 patients were switched to an alternative treatment due to poor response or side effects, giving 112 treatment cases in total. Of these, 9 cases did not sufficiently maintain their headache diary, and 5 cases were not followed up at 3 months; we have therefore included 98 sets of data in our analysis. Results: Fremanezumab achieved a reduction in HDPM of 51.7% at 3 months (p<0.0001), with 63.7% of patients meeting NICE criteria to continue therapy. Patients trialled on Galcanezumab attained a reduction in HDPM of 47.0% (p=0.0019), with 51.6% of patients meeting NICE criteria to continue therapy. Erenumab, however, achieved a reduction in HDPM of only 17.0% (p=0.29), which was not statistically significant. Furthermore, 34.4%, 9.7%, and 4.9% of patients taking Fremanezumab, Galcanezumab, and Erenumab, respectively, continued therapy beyond 12 months. Of those who attempted drug holidays following 12 months of treatment, migraine symptoms relapsed in 100% of cases. Conclusion: We observed a significant improvement in HDPM amongst episodic and chronic migraine patients following treatment with Fremanezumab or Galcanezumab.
Keywords: migraine, CGRP, fremanezumab, galcanezumab, erenumab
Procedia PDF Downloads 95
455 Experimental Study on Strength Development of Low Cement Concrete Using Mix Design for Both Binary and Ternary Mixes
Authors: Mulubrhan Berihu, Supratic Gupta, Zena Gebriel
Abstract:
Due to its design versatility, availability, and cost efficiency, concrete continues to be the most used construction material on earth. However, the production of Portland cement, the primary component of the concrete mix, has serious environmental and economic impacts. This shows the need to study the use of supplementary cementitious materials (SCMs). The most commonly used supplementary cementitious materials are industrial wastes, and their use has technical, economic, and environmental benefits besides reducing CO2 emissions from cement production. The study aims to document the effect of low cement content on the strength of concrete when supplementary cementitious materials like fly ash or marble powder are maximized. Based on different mix proportions of pozzolana and marble powder, a range of mix designs was formulated. The first part of the project studies the strength of low cement concrete using fly ash replacement experimentally. The test results showed that using up to 85 kg/m3 of cement is possible for plain concrete works like hollow block concrete to achieve 9.8 MPa, and the experimental results indicate that strength is a function of the water-to-binder (w/b) ratio, as sketched below. In the second part, a new set of mix designs was carried out with fly ash and marble powder to study the strength of both binary and ternary mixes. In this experimental study, three groups of mix designs (c+FA, c+FA+m, and c+m) were taken up, with four sets of mixes for each group. Experimental results show that c+FA maintained the best strength and impermeability, whereas c+m showed lower compressive and split tensile strengths and poorer impermeability. c+FA shows the largest gain in compressive strength from 7 days to 28 days compared to the others, which reflects the slow rate of hydration of fly ash concrete. As the w/b ratio increases, the strength decreases significantly. At the same time, higher permeability was seen in the specimens tested for three hours than in those tested for one hour.
Keywords: efficiency factor, cement content, compressive strength, mix proportion, w/c ratio, water permeability, SCMs
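One conventional way to express this strength-w/b dependence is an Abrams-type relation; the form below is a standard textbook expression given only for illustration, not the regression fitted in this study:

f_c = \frac{A}{B^{\,w/b}}

where f_c is the compressive strength, w/b the water-to-binder ratio, and A and B empirical constants determined from trial mixes. The inverse-exponential form captures the observed pattern that strength drops significantly as w/b increases.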
Procedia PDF Downloads 211
454 Design of Low-Emission Catalytically Stabilized Combustion Chamber Concept
Authors: Annapurna Basavaraju, Andreas Marn, Franz Heitmeir
Abstract:
The Advisory Council for Aeronautics Research in Europe (ACARE) calls for an overall reduction of NOx emissions by 80% in its Vision 2020. Moreover, small turbo engines have higher fuel-specific emissions compared to large engines due to their limited combustion chamber size. In order to fulfill these requirements, novel combustion concepts are essential. This motivates research on a catalytically stabilized combustion chamber using hydrogen in small jet engines, which is designed and investigated both numerically and experimentally during this project. Catalytic combustion concepts can also be adopted for low-calorific fuels and are therefore not constrained to hydrogen alone. However, hydrogen has a high heating value and has the major advantage of producing only nitrogen oxides as pollutants during combustion, thus eliminating concern over other emissions such as carbon monoxide. In the present work, the combustion chamber is designed based on the ‘Rich catalytic Lean burn’ concept. The experiments are conducted for the characteristic operating range of an existing engine. This engine has been tested successfully at the Institute of Thermal Turbomachinery and Machine Dynamics (ITTM), Technical University Graz. Since efficient combustion is the result of proper mixing of the fuel-air mixture, considerable significance is given to the selection of an appropriate mixer. This led to the design of three diverse mixer configurations, which are investigated experimentally and numerically. Subsequently, the best mixer will be equipped in the main combustion chamber and used throughout the experimentation. Furthermore, temperatures and pressures will be recorded at various locations inside the combustion chamber, and the exhaust emissions will also be analyzed. The instrumented combustion chamber will be inspected at engine-relevant inlet conditions for nine different sets of catalysts at the Hot Flow Test Facility (HFTF) of the institute.
Keywords: catalytic combustion, gas turbine, hydrogen, mixer, NOx emissions
Procedia PDF Downloads 305
453 The Psychology of Cross-Cultural Communication: A Socio-Linguistics Perspective
Authors: Tangyie Evani, Edmond Biloa, Emmanuel Nforbi, Lem Lilian Atanga, Kom Beatrice
Abstract:
The dynamics of languages in contact necessitates a close study of how their users negotiate meanings from shared values in the process of cross-cultural communication. A transverse analysis of the situation demonstrates the existence of complex efforts at connecting cultural knowledge to cross-linguistic competencies within a widening range of communicative exchanges. This paper sets out to examine the psychology of cross-cultural communication in a multi-linguistic setting like Cameroon, where many local and international languages are in close contact. The paper equally analyses the pertinence of existing macro-sociological concepts as fundamental knowledge traits in literal and idiomatic cross-semantic mapping. From this point, the article presents a path model connecting sociolinguistics to the increasing adoption of a widening range of communicative genres piloted by the ongoing globalisation trends with their high-speed information technology machinery. By applying a cross-cultural analysis frame, the paper contributes to a better understanding of the fundamental changes in the nature and goals of cross-cultural knowledge in the pragmatics of communication and cultural acceptability. It emphasises that, in an era of increasing global interchange, a comprehensive, inclusive global culture that bridges gaps in cross-cultural communication would have significant potential to contribute to achieving global social development goals, provided inadequacies in language constructs are adjusted to create avenues that intertwine with sociocultural beliefs, ensuring that meaningful and context-bound sociolinguistic values are observed within the global arena of communication.
Keywords: cross-cultural communication, customary language, literalisms, primary meaning, subclasses, transubstantiation
Procedia PDF Downloads 285
452 A Prediction of Cutting Forces Using Extended Kienzle Force Model Incorporating Tool Flank Wear Progression
Authors: Wu Peng, Anders Liljerehn, Martin Magnevall
Abstract:
In metal cutting, tool wear gradually changes the micro-geometry of the cutting edge. Today there is a significant gap in understanding the impact these geometrical changes have on the cutting forces, which govern tool deflection and heat generation in the cutting zone. Accurate models and understanding of the interaction between the workpiece and the cutting tool lead to improved accuracy in simulation of the cutting process. These simulations are useful in several application areas, e.g., optimization of insert geometry and machine tool monitoring. This study aims to develop an extended Kienzle force model that accounts for the effects of rake angle variations and tool flank wear on the cutting forces. The starting point is cutting force measurements from orthogonal turning tests of pre-machined flanges with well-defined width, using triangular coated inserts to assure orthogonal conditions. The cutting forces were measured by a dynamometer for a set of three different rake angles, and wear progression was monitored during machining by an optical measuring collaborative robot. The method utilizes the measured cutting forces together with the inserts' flank wear progression to extend the mechanistic cutting force model with flank wear as an input parameter. The adapted cutting force model is validated in a turning process with commercial cutting tools. This adapted model shows a significant capability to predict cutting forces while accounting for tool flank wear and cutting tool inserts with different rake angles. The results of this study suggest that the nonlinear effect of tool flank wear and the interaction between the workpiece and the cutting tool can be captured by the developed cutting force model.
Keywords: cutting force, Kienzle model, predictive model, tool flank wear
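For reference, the baseline Kienzle relation expresses the main cutting force through the width of cut b and the uncut chip thickness h; one hedged way to write a wear-extended form is with an additive flank-wear term, an illustrative assumption rather than the authors' fitted formulation:

F_c = k_{c1.1}\, b\, h^{\,1-m_c} + k_w\, b\, \mathrm{VB}

Here k_{c1.1} is the specific cutting force for b = h = 1 mm, m_c is the chip-thickness exponent, VB is the flank wear land width, and k_w is an empirical wear-force coefficient that would be identified from the measured force-versus-wear data.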
Procedia PDF Downloads 109
451 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process
Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma
Abstract:
As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes which offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when scaling up the manufacturing process. This project evaluates the techno-economic value proposition of an integrated industrial-scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. This is done by developing a process model of a co-precipitation synthesis route in Aspen Plus software, validated against experimental data. The mechanism chemistry and equilibrium conditions were established based on previous literature and HSC-Chemistry software. This is then followed by integrating the energy streams, adding waste recovery and treatment processes, and testing the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs, raw materials, etc.) to calculate the cost of CAM ($/kg and $/kWh), total plant cost ($), and net present value (NPV). This work sets the foundational blueprint for future research into sustainable industrial-scale processes for CAM manufacturing.
Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis
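The NPV calculation at the heart of such an analysis discounts each year's cash flow back to present value. The short Python sketch below shows the mechanics; the discount rate, capital cost, and annual margin are placeholder numbers, not the study's plant data.

def npv(rate, cash_flows):
    """Net present value: cash_flows[0] is the year-0 outlay (negative),
    later entries are yearly net cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

capex = -250e6                 # illustrative year-0 capital expenditure ($)
annual_margin = 60e6           # illustrative revenue minus operating cost ($/yr)
flows = [capex] + [annual_margin] * 10
print(f"NPV at an assumed 8% discount rate: ${npv(0.08, flows):,.0f}")

A positive NPV under the chosen discount rate indicates the plant configuration would create value; the sensitivity analysis then repeats this calculation while varying process parameters and cost inputs.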
Procedia PDF Downloads 100
450 Intercultural Education and Changing Paradigms of Education: A Research Survey
Authors: Shalini Misra
Abstract:
The means and methods of education have been changing fast since the invention of the internet. Both ancient and modern education emphasized the holistic development of students, but a significant change has been observed in 21st-century learners. Online classes and intercultural and interdisciplinary education, which were exceptions in the past, are setting new trends in the field of education. In the modern era, intercultural and interpersonal skills are of immense importance, not only for students but for everyone. Intercultural education sets a platform for better understanding and deeper learning by ensuring the active participation and involvement of students belonging to different social and cultural backgrounds in various academic and non-academic pursuits. On October 31, 2015, on the occasion of the 140th birth anniversary of Sardar Vallabhbhai Patel, the Hon’ble Prime Minister of India, Narendra Modi, announced the initiative ‘Ek Bharat Shreshtha Bharat’, i.e. ‘One India Best India’, commonly known as ‘EBSB’. The program highlighted India’s rich culture and traditions, and its objective was to foster better understanding and healthy relationships among Indian states. Under this program, a variety of subjects were covered, like ‘Arts, Culture and Language’. It was claimed to be a successful cultural exchange in which students from diverse communities shared their thoughts and experiences with one another. Under this online cultural exchange program, the state of Uttarakhand was paired with the state of Karnataka in 2022. The present paper undertakes a survey of a total of thirty secondary-level students from Uttarakhand and the partner state Karnataka who participated in this program with the purpose of learning and embracing new ideas and cultures, thus promoting intercultural education. It aims to study and examine the role of intercultural education in shifting and establishing new paradigms of education.
Keywords: education, intercultural, interpersonal, traditions, understanding
Procedia PDF Downloads 82
449 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel
Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani
Abstract:
Agent-based systems technology has been addressed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments in solving problems. Nevertheless, it is impractical to rely on a single agent to do all the computing in solving complex problems, and an increasing number of applications lately require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems beyond the individual capacities or knowledge of each problem solver. However, a MAS still requires a main system to govern or oversee the operation of the agents in order to achieve a unified goal. We developed a fleet management system (FMS) to manage the fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS should be able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents, and generate a bathymetric map from the data received from each ASV unit. The first algorithm is developed to communicate with the ASVs via radio using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm takes care of path planning, formation, and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm makes use of the data collected by the agents to create a bathymetry map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry
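NMEA 0183 sentences are comma-delimited ASCII lines whose checksum is the XOR of every character between '$' and '*'. A minimal Python sketch of validating and parsing such a sentence is shown below; the function names and the chosen GGA fields are illustrative, not the FMS's actual code.

from functools import reduce

def nmea_checksum_ok(sentence: str) -> bool:
    """XOR of all characters between '$' and '*' must equal the
    two-hex-digit checksum that follows '*'."""
    body, _, checksum = sentence.strip().lstrip("$").partition("*")
    calc = reduce(lambda acc, ch: acc ^ ord(ch), body, 0)
    return f"{calc:02X}" == checksum.upper()

def parse_gga(sentence: str) -> dict:
    """Extract a position fix from a GGA sentence (subset of fields)."""
    f = sentence.split(",")
    return {"time_utc": f[1], "lat": f[2] + f[3],
            "lon": f[4] + f[5], "fix_quality": f[6]}

msg = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
if nmea_checksum_ok(msg):
    print(parse_gga(msg))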
Procedia PDF Downloads 273
448 Socio-Cultural Representations through Lived Religions in Dalrymple’s Nine Lives
Authors: Suman
Abstract:
In the continuous interaction between the past and the present that historiography is, each time history gets re/written a new representation emerges. This new representation is a reflection of the earlier archives and their interpretations, fragmented remembrances of the past, as well as reactions to the present. Memory, or the lack thereof, and stereotyping generally play a major role in this representation. William Dalrymple’s Nine Lives: In Search of the Sacred in Modern India (2009) is one such written account that sets out to narrate representations of the religion and culture of India and contemporary reactions to them. Dalrymple’s nine saints belong to different castes, sects, religions, and regions. By dealing with their religions and expressions of those religions, and through the lived mysticism of these nine individuals, the book engages with important issues like class, caste, and gender in the contexts provided by historical as well as present-day India. The paper studies the development of religion and the accompanying feeling of religiosity in modern as well as historical contexts through a study of these elements in the book. Since the language used in the creation of texts, and the literary texts thus produced, create a new reality that questions the stereotypes of the past and in turn often ends up creating new stereotypes or stereotypical representations, the paper seeks to actively engage with the text in order to identify and study such stereotypes, along with their changing representations. Through a detailed examination of the book, the paper seeks to unravel whether certain socio-cultural stereotypes existed earlier, and whether new stereotypes develop from Dalrymple’s point of view as an outsider writing on issues that are deeply rooted in the cultural milieu of the country. For this analysis, the paper takes help from psycho-literary theories of stereotyping and representation.
Keywords: stereotyping, representation, William Dalrymple, religion
Procedia PDF Downloads 311
447 Maintaining Experimental Consistency in Geomechanical Studies of Methane Hydrate Bearing Soils
Authors: Lior Rake, Shmulik Pinkert
Abstract:
Methane hydrate has been found in significant quantities in offshore soils within continental margins and in permafrost within arctic regions, where low temperature and high pressure are present. The mechanical parameters for geotechnical engineering are commonly evaluated in geomechanical laboratories adapted to simulate the environmental conditions of methane hydrate-bearing sediments (MHBS). Due to the complexity and high cost of natural MHBS sampling, most laboratory investigations are conducted on artificially formed samples. Artificial MHBS samples can be formed using different hydrate formation methods in the laboratory, where methane gas and water are supplied into the soil pore space under methane hydrate phase conditions. The most commonly used formation method is the excess gas method, which is considered relatively simple, time-saving, and repeatable. However, there are several differences in the procedures and techniques used to produce the hydrate with the excess gas method. As a result of the differences between the test facilities and the experimental approaches of previous studies, different measurement criteria and analyses have been proposed for MHBS geomechanics. The lack of uniformity among the various experimental investigations may adversely impact the reliability of integrating different data sets for unified mechanical model development. In this work, we address some fundamental aspects relevant to reliable MHBS geomechanical investigations, such as hydrate homogeneity in the sample, the hydrate formation duration criterion, the hydrate-saturation evaluation method, and the effect of temperature measurement accuracy. Finally, a set of recommendations for repeatable and reliable MHBS formation is suggested for future standardization of MHBS geomechanical investigation.
Keywords: experimental study, laboratory investigation, excess gas, hydrate formation, standardization, methane hydrate-bearing sediment
Procedia PDF Downloads 59
446 A Multivariate Statistical Approach for Water Quality Assessment of River Hindon, India
Authors: Nida Rizvi, Deeksha Katyal, Varun Joshi
Abstract:
River Hindon is an important river catering to the demands of the highly populated rural and industrial clusters of western Uttar Pradesh, India. The water quality of river Hindon is deteriorating at an alarming rate due to various industrial, municipal, and agricultural activities. The present study aimed at identifying the pollution sources and quantifying the degree to which these sources are responsible for the deteriorating water quality of the river. Various water quality parameters, like pH, temperature, electrical conductivity, total dissolved solids, total hardness, calcium, chloride, nitrate, sulphate, biological oxygen demand, chemical oxygen demand, and total alkalinity, were assessed. Water quality data obtained from eight study sites over one year were subjected to two multivariate techniques, namely principal component analysis and cluster analysis. Principal component analysis was applied with the aim of finding out the spatial variability and identifying the sources responsible for the water quality of the river; three varifactors were obtained after varimax rotation of the initial principal components. Cluster analysis was carried out to classify sampling stations of certain similarity, which grouped the eight sites into two clusters. The study reveals that anthropogenic influence (municipal, industrial, wastewater, and agricultural runoff) was the major source of river water pollution. Thus, this study illustrates the utility of multivariate statistical techniques for the analysis and elucidation of multifaceted data sets, recognition of pollution sources/factors, and understanding of temporal/spatial variations in water quality for effective river water quality management.
Keywords: cluster analysis, multivariate statistical techniques, river Hindon, water quality
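The pipeline described here — standardize the parameters, extract principal components, varimax-rotate them into varifactors, and cluster the sites — can be sketched in a few lines of Python. The varimax routine below is the standard textbook iteration; X_raw stands in for the sites-by-parameters matrix and is an assumption, not the study's data.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Classic varimax rotation of a (variables x factors) loading matrix."""
    p, k = loadings.shape
    rotation, d = np.eye(k), 0.0
    for _ in range(max_iter):
        L = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag((L**2).sum(axis=0))))
        rotation = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):
            break
        d = d_new
    return loadings @ rotation

X = StandardScaler().fit_transform(X_raw)   # X_raw: 8 sites x 12 parameters (assumed)
pca = PCA(n_components=3).fit(X)            # three components, as in the study
varifactors = varimax(pca.components_.T)    # rotated loadings (varifactors)
clusters = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")  # two site groups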
Procedia PDF Downloads 467
445 The Influence of Celebrity Endorsement on Consumers’ Attitude and Purchase Intention Towards Skincare Products in Malaysia
Authors: Tew Leh Ghee
Abstract:
The study's goal is to determine how celebrity endorsement affects Malaysian consumers' attitudes and intentions to buy skincare products. Celebrity endorsement is not, in reality, a new phenomenon, since customers now largely rely on it to inform purchasing decisions in almost every industry. Yet even though the market for skincare products has vast potential, corporations have yet to seize this niche via celebrity endorsement, and little research has been done to establish the significance of celebrity endorsement in this industry. This research combined descriptive and quantitative methods with a self-administered survey as the primary data-gathering tool. All of the characteristics under study were measured using a 5-point Likert scale, and the questionnaire was written in English. A convenience sampling method was used to choose respondents, and 360 sets of valid questionnaires were gathered for the study's statistical analysis. Preliminary statistical analyses were performed using SPSS version 20.0 (Statistical Package for the Social Sciences). The respondents' demographic backgrounds were examined using descriptive analysis. The validity and reliability of all construct assessments were examined using exploratory factor analysis, item-total statistics, and reliability statistics. Pearson correlation and regression analysis were used, respectively, to assess relationships and effects between the variables under study. The research showed that, apart from competence, celebrity endorsements of skincare products in Malaysia, as evaluated by attractiveness and dependability, had a favorable impact on attitudes and purchase intentions. The research indicated that the most significant element influencing attitude and purchase intention was the credibility of the celebrity endorser. The study offers implications for potential improvements to celebrity endorsement of skincare goods in Malaysia. The study's last portion covers its limitations and suggestions for future research.
Keywords: trustworthiness, influential, phenomenon, celebrity endorsement
Procedia PDF Downloads 81
444 Additive Weibull Model Using Warranty Claim and Finite Element Analysis Fatigue Analysis
Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate
Abstract:
This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product give insight into its underlying issues and are often used by reliability engineers to build prediction models that forecast the failure rates of parts. But there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product, most of the time covering only the infant mortality and useful life zones of a bathtub curve. Predicting with warranty data alone in these cases does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of the lifetime. For better predictability of the failure rate, one needs to explore its behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples till failure, but FEA fatigue analysis can provide the failure rate behavior of a part well beyond the warranty period, in less time and at lower cost. In this work, the authors propose an Additive Weibull Model, which makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets for a part, one with existing warranty claims and the other with fatigue life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two separately estimated sets of Weibull parameters are combined to form the proposed Additive Weibull Model for prediction.
Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull
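A standard way to write such an additive model is to sum the two Weibull hazard rates, so that the first term (shape β₁, scale η₁, fitted to warranty claims) dominates early life and the second (β₂, η₂, fitted to FEA fatigue life) dominates wear-out. The pairing of terms with data sources follows the abstract's description, and the notation below is ours:

h(t) = \frac{\beta_1}{\eta_1}\left(\frac{t}{\eta_1}\right)^{\beta_1-1} + \frac{\beta_2}{\eta_2}\left(\frac{t}{\eta_2}\right)^{\beta_2-1}, \qquad R(t) = \exp\!\left[-\left(\frac{t}{\eta_1}\right)^{\beta_1} - \left(\frac{t}{\eta_2}\right)^{\beta_2}\right]

With β₁ ≤ 1 and β₂ > 1, the combined hazard reproduces the decreasing-then-increasing shape of the bathtub curve.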
Procedia PDF Downloads 73
443 Misconception on Multilingualism in Glorious Quran
Authors: Muhammed Unais
Abstract:
The holy Quran is a purely Arabic book, ensured to be free of non-Arabic terms. Had it been revealed in a multilingual way, including various foreign languages besides Arabic, it could easily be misunderstood that the Arabs were helpless to compile a comparable work in response to the challenge of Allah because they lacked knowledge of the other languages in which the Quran was compiled. Based on the presence of some apparently non-Arabic terms in the Quran, like Istabrq, Saradiq, Rabbaniyyoon, etc., some orientalist scholars argued that the holy Quran is not a book revealed in Arabic. There are Muslim scholars who either support or deny the presence of foreign terms in the Quran, but all of them agree that the roots of the words suspected of being non-Arabic come from foreign languages and were assimilated into Arabic, being used there as they were in the source language. After this linguistic assimilation occurred and the assimilated non-Arabic words became familiar among the Arabs, the Quran was revealed using these words, stating that all the words it contains are Arabic, whether pure or assimilated. Hence, both opinions about the etymology of these words are right: those who argue for the presence of foreign words are right in that the roots of those words are foreign, and those who argue for their absence are right in that the words were assimilated and became pure Arabic. The possibility of multilingualism in a monolingual book is logically negative, but its significance changes according to time and place. The problem of multilingualism in the Quran is the misconception, raised by some orientalist scholars, that the Arabs were helpless to compile a book equal to the Quran not because of their weakness in Arabic but because the Quran was revealed in languages they were ignorant of. In reality, the Quran was revealed in pure Arabic, the most literate language of the Arabs, and all its words and their meanings were familiar among them. Anyone positively aware of the linguistic and cultural assimilation found in all civilizations and cultural sets will have no question in this respect. In this paper, the researcher intends to shed light on the possibility of multilingualism in a monolingual book and the debates among scholars on this issue, the foreign terms in the Quran, and the logical justifications, along with the exclusive features of the Quran.
Keywords: Quran, foreign terms, multilingualism, language
Procedia PDF Downloads 398
442 Discussion as a Means to Improve Peer Assessment Accuracy
Authors: Jung Ae Park, Jooyong Park
Abstract:
Writing is an important learning activity that cultivates higher-level thinking. Effective and immediate feedback is necessary to help improve students' writing skills. Peer assessment can be an effective method for writing tasks because it allows students not only to receive quick feedback on their writing but also to examine different perspectives on the same topic. Peer assessment can be practiced frequently and has the advantage of immediate feedback. However, there is controversy about its accuracy. In this study, we tried to demonstrate experimentally how the accuracy of peer assessment could be improved. Participants (n=76) were randomly assigned to groups of 4 members. All the participants graded two sets of 4 essays on the same topic: they graded the first set twice, and the second set, the posttest, once. After the first grading of the first set, each group in experimental condition 1 (the discussion group) was asked to discuss the results of the peer assessment and then to grade the essays again. Each group in experimental condition 2 (the reading group) was asked to read an expert's assessment of each essay and then to grade the essays again. In the control group, the participants were asked to grade the 4 essays twice in different orders. Afterwards, all the participants graded the second set of 4 essays. The mean score from the 4 participants was calculated for each essay, and the accuracy of the peer assessment was measured by the Pearson correlation with the scores of the expert. The results were analyzed by two-way repeated-measures ANOVA. A main effect of grading was observed: grading accuracy improved as the number of grading experiences increased. Analysis of posttest accuracy revealed that the score variations within a group of 4 participants decreased in both the discussion and reading conditions but not in the control condition. These results suggest that having students discuss their grading together can be an efficient means to improve peer assessment accuracy. By discussing, students can learn from others what to consider in grading and whether their own grading is too strict or too lenient. Further research is needed to examine the exact cause of the improvement in grading accuracy.
Keywords: peer assessment, evaluation accuracy, discussion, score variations
Procedia PDF Downloads 267
441 Biopolitical Border Imagery during the European Migrant Crisis: A Comparative Discourse Analysis between Mediterranean Europe and the Balkans
Authors: Mira Kaneva
Abstract:
The ongoing polemic over the migration crisis opens up a debate on the ambivalent essence of borders, given both the legality and the legitimacy of the displacement of vast masses of people across the European continent. In neoliberal terms, migration is seen as an economic opportunity or, on the contrary, as a social disparity; in realist terms, it is regarded as a security threat that calls for mobilization; from a critical standpoint, it is a matter of discourse on democratic governance. This paper sets the objective of analyzing borders through the Foucauldian prism of biopolitics. It aims at defining the specifics of the management of the human body by examining how state power, exercised through repressive practices including hate speech, produces the irregular migrant as a subject (though prevalently an object in the discourse) and a political subjectivity. The study relies on the conceptual framework of Bigo, Agamben, and Huysmans, among others, and applies the methodology of qualitative comparative analysis to the cases of borders (fences, enclaves, camps, and other forms of abnormal spatiality) in Italy, Spain, Greece, the Republic of Macedonia, Serbia, and Bulgaria. The paper thus tries to throw light on these cross- and intra-regional contexts, which share certain similarities and differences. It argues that the governmentality of the masses of refugees and economic immigrants, through the speech acts of their exclusion, leads to a temporary populist backlash; a tentative finding is that the status quo in terms of social and economic measures remains relatively balanced, whereas values such as freedom, openness, and tolerance are progressively marginalized.
Keywords: Balkans, biopolitical borders, cross- and intra-regional discourse analysis, irregular migration, Mediterranean Europe, securitization vs. humanitarianism
Procedia PDF Downloads 214
440 GNSS-Aided Photogrammetry for Digital Mapping
Authors: Muhammad Usman Akram
Abstract:
This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site which is to be used in future planning and development (P&D) or for further examination, exploration, research, and inspection. Surveying and mapping hard-to-access and hazardous areas is very difficult using traditional techniques and methodologies; it is also time-consuming, labor-intensive, and less precise, with limited data. In comparison, the advanced technique used here saves manpower and provides more precise output with a wide variety of data sets. In this experiment, aerial photogrammetry is used: a UAV flies over an area and captures geocoded images from which a three-dimensional (3-D) model is built. The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of Ground Control Points (GCPs) is observed using a Differential Global Positioning System (DGPS) in PPK or RTK mode. Furthermore, the raw data collected by the UAV and DGPS are processed in various digital image processing programs and computer-aided design software, from which we obtain as output a dense point cloud, a Digital Elevation Model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto, and the DEM is further converted into a Digital Terrain Model (DTM) for contour generation or a digital surface. As a result, we get a digital map of the area to be surveyed. In conclusion, we compared the processed data with exact measurements taken on site. The error is accepted if it does not exceed the survey accuracy limits set by the concerned institutions.
Keywords: photogrammetry, post processing kinematics, real time kinematics, manual data inquiry
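Ground sampling distance, one of the flight parameters mentioned above, follows directly from the camera geometry: GSD = pixel size × flight altitude / focal length. The Python snippet below works through the arithmetic with illustrative sensor values, not a specific survey's configuration.

def ground_sampling_distance(pixel_size_um, focal_length_mm, altitude_m):
    """GSD in cm/pixel from sensor pixel pitch, lens focal length,
    and flight altitude above ground level."""
    return (pixel_size_um * 1e-6) * altitude_m / (focal_length_mm * 1e-3) * 100

# e.g., a 4.4 um pixel behind an 8.8 mm lens flown at 100 m AGL:
print(f"{ground_sampling_distance(4.4, 8.8, 100):.1f} cm/px")  # -> 5.0 cm/px

Halving the flight altitude halves the GSD (finer detail) at the cost of covering less ground per image.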
Procedia PDF Downloads 33
439 Development of Muay Thai Competition Management for Promoting Sport Tourism in the Next Decade (2015-2024)
Authors: Supasak Ngaoprasertwong
Abstract:
The purpose of this research was to develop a model of Muay Thai competition management for promoting sport tourism in the next decade, initiated appropriately for practical use. This study combined several methodologies, both quantitative and qualitative, to cover all aspects of the data, especially the tourists' satisfaction with Muay Thai competition. Data were collected from 400 tourists watching Muay Thai competitions in 4 stadiums to create the model for Muay Thai competition to support sport tourism in the next decade. Besides, Ethnographic Delphi Futures Research (EDFR) was applied to gather data from experts in the boxing industry or those having a significant role in Muay Thai competition, in both the public sector and the private sector. The first step of data collection was in-depth interviews with 27 experts associated with Muay Thai competition, Muay Thai management, and tourism. The second and third steps were conducted to confirm the experts' opinions on various elements. When the three steps of data collection were accomplished, all data were assembled to draft the model, which was then proposed to 8 experts in a brainstorming session for confirmation. According to the results of the quantitative research, the tourists were satisfied with the competition personnel at a high level (x̄=3.87), followed by facilities, services, and safety at a high level (x̄=3.67); furthermore, they were satisfied with the operation of the competition venue at a high level (x̄=3.62). Regarding the qualitative methodology, including the literature review, theories, concepts, and the qualitative analysis for developing the model of Muay Thai competition to promote sport tourism in the next decade, the findings indicated two data sets: the first related to Muay Thai competition to encourage sport tourism, and the second associated with Muay Thai stadium management to support sport tourism. After the brainstorming, the "EE Muay Thai Model" was finally developed for promoting sport tourism in the next decade (2015-2024).
Keywords: Muay Thai competition management, Muay Thai sport tourism, Muay Thai, Muay Thai for sport tourism management
Procedia PDF Downloads 319
438 Probability Sampling in Matched Case-Control Study in Drug Abuse
Authors: Surya R. Niraula, Devendra B Chhetry, Girish K. Singh, S. Nagesh, Frederick A. Connell
Abstract:
Background: Although random sampling is generally considered the gold standard for population-based research, the majority of drug abuse research is based on non-random sampling, despite the well-known limitations of this kind of sampling. Method: We compared the statistical properties of two surveys of drug abuse in the same community: one using snowball sampling of drug users who then identified “friend controls”, and the other using a random sample of non-drug users (controls) who then identified “friend cases”. Models to predict drug abuse based on risk factors were developed for each data set using conditional logistic regression. We compared the precision of each model using the bootstrap method and the predictive properties of each model using receiver operating characteristic (ROC) curves. Results: Analysis of 100 random bootstrap samples drawn from the snowball-sample data set showed wide variation in the standard errors of the beta coefficients of the predictive model, none of which achieved statistical significance. On the other hand, bootstrap analysis of the random-sample data set showed less variation and did not change the significance of the predictors at the 5% level when compared to the non-bootstrap analysis. The area under the ROC curve for the model derived from the random-sample data set was similar when the model was fitted to either data set (0.93 for random-sample data vs. 0.91 for snowball-sample data, p=0.35); however, when the model derived from the snowball-sample data set was fitted to each of the data sets, the areas under the curve were significantly different (0.98 vs. 0.83, p < .001). Conclusion: The proposed method of random sampling of controls appears to be statistically superior to snowball sampling and may represent a viable alternative to it.
Keywords: drug abuse, matched case-control study, non-probability sampling, probability sampling
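The precision comparison described here rests on resampling subjects with replacement and refitting the model on each replicate. A minimal Python sketch of that bootstrap follows; it uses ordinary logistic regression for brevity, whereas the study fitted a conditional logistic model over matched pairs, and all names are illustrative.

import numpy as np
from sklearn.linear_model import LogisticRegression

def bootstrap_coef_se(X, y, n_boot=100, seed=0):
    """Bootstrap standard errors of logistic-regression coefficients:
    resample subjects, refit, and take the spread across replicates."""
    rng = np.random.default_rng(seed)
    n = len(y)
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)          # sample with replacement
        fit = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        coefs.append(fit.coef_.ravel())
    return np.std(coefs, axis=0, ddof=1)          # SE of each beta

Wide bootstrap standard errors, as seen for the snowball sample, signal that the coefficient estimates are unstable with respect to who happened to be recruited.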
Procedia PDF Downloads 493
437 Comparison of Different Artificial Intelligence-Based Protein Secondary Structure Prediction Methods
Authors: Jamerson Felipe Pereira Lima, Jeane Cecília Bezerra de Melo
Abstract:
The difficulty and cost of obtaining protein tertiary structure information through experimental methods, such as X-ray crystallography or NMR spectroscopy, spurred the development of computational methods. One approach is the prediction of the three-dimensional structure from the residue chain; however, this has been proved an NP-hard problem, due to the complexity of the process, as explained by the Levinthal paradox. An alternative solution is the prediction of intermediary structures, such as the secondary structure of the protein. Artificial intelligence methods, such as Bayesian statistics, artificial neural networks (ANN), and support vector machines (SVM), among others, have been used to predict protein secondary structure. Due to their good results, artificial neural networks became a standard method for the task: recent published methods that use this technique generally achieve a Q3 accuracy between 75% and 83%, whereas the theoretical accuracy limit for secondary structure prediction is 88%. To achieve better results, support vector machine prediction methods have also been developed. The statistical evaluation of methods that use different AI techniques, such as ANNs and SVMs, is not a trivial problem, since different training sets and validation techniques, as well as other variables, can influence the behavior of a prediction method. In this study, we propose a prediction method based on artificial neural networks, which is then compared with a selected SVM method: the one proposed by Huang in his work Extracting Physicochemical Features to Predict Protein Secondary Structure (2013). The developed ANN method follows the same training and testing process that Huang used to validate his method, which comprises the use of the CB513 protein data set and three-fold cross-validation, so that the comparative analysis can directly compare the statistical results of each method.
Keywords: artificial neural networks, protein secondary structure, protein structure prediction, support vector machines
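Q3, the accuracy measure cited throughout, is simply the fraction of residues whose predicted three-state label (helix H, strand E, or coil C) matches the reference assignment. A tiny Python sketch:

import numpy as np

def q3_accuracy(predicted, actual):
    """Fraction of residues with the correct H/E/C state."""
    predicted = np.asarray(list(predicted))
    actual = np.asarray(list(actual))
    return float(np.mean(predicted == actual))

# Illustrative 8-residue example:
print(q3_accuracy("HHHECCCH", "HHHECCCC"))  # 7 of 8 correct -> 0.875

In a cross-validation run, Q3 is computed over all residues of all test proteins in a fold and then averaged across folds.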
Procedia PDF Downloads 622
436 INCIPIT-CRIS: A Research Information System Combining Linked Data Ontologies and Persistent Identifiers
Authors: David Nogueiras Blanco, Amir Alwash, Arnaud Gaudinat, René Schneider
Abstract:
At a time when access to and sharing of information are crucial in the world of research, technologies such as persistent identifiers (PIDs), Current Research Information Systems (CRIS), and ontologies can create platforms for information sharing if they respond to the need for disambiguation of their data by assuring interoperability within and between systems. INCIPIT-CRIS is a continuation of the former INCIPIT project, whose goal was to set up an infrastructure for low-cost attribution of PIDs with high granularity, based on Archival Resource Keys (ARKs). INCIPIT-CRIS can be interpreted as its logical consequence and proposes a research information management system developed from scratch. The system has been created on and around the Schema.org ontology, with a further articulation of the use of ARKs; it is thus built upon the previously implemented infrastructure (i.e., INCIPIT) in order to enhance the persistence of URIs. As a consequence, INCIPIT-CRIS aims to be the hinge between the previously separated aspects of CRIS, ontologies, and PIDs, producing a system that resolves disambiguation problems by combining an ontology such as Schema.org with unique persistent identifiers such as ARKs, enabling information sharing through a dedicated platform as well as interoperability by representing the entirety of the data as RDF triples. This paper presents the implemented solution and its simulation in real life. We describe the underlying ideas and inspirations while going through the logic and the different functionalities implemented, along with their links to ARKs and Schema.org. Finally, we discuss the tests performed with our project partner, the Swiss Institute of Bioinformatics (SIB), using large, real-world data sets.
Keywords: current research information systems, linked data, ontologies, persistent identifier, schema.org, semantic web
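The combination described — Schema.org-typed entities addressed by ARK-based URIs and stored as RDF triples — can be illustrated with a few lines of rdflib. The ARK name assigning authority number (99999) and the identifiers below are placeholders, not INCIPIT's real ARKs.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

SDO = Namespace("https://schema.org/")
g = Graph()
g.bind("sdo", SDO)

article = URIRef("https://n2t.net/ark:/99999/fk4article")  # placeholder ARK
author = URIRef("https://n2t.net/ark:/99999/fk4person")    # placeholder ARK

g.add((article, RDF.type, SDO.ScholarlyArticle))
g.add((article, SDO.name, Literal("Example article title")))
g.add((article, SDO.author, author))
g.add((author, RDF.type, SDO.Person))
g.add((author, SDO.name, Literal("Jane Doe")))

print(g.serialize(format="turtle"))

Because every node is an ARK-based URI, the same triple store can be queried, exchanged, and resolved without the identifier ambiguity that plagues name-keyed systems.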
Procedia PDF Downloads 136
435 Artificial Neural Network Based Approach in Prediction of Potential Water Pollution Across Different Land-Use Patterns
Authors: M.Rüştü Karaman, İsmail İşeri, Kadir Saltalı, A.Reşit Brohi, Ayhan Horuz, Mümin Dizman
Abstract:
Considerable attention has recently been given to the environmental hazards caused by agricultural chemicals such as excess fertilizers. In this study, a neural network approach was investigated for the prediction of potential nitrate pollution across different land-use patterns, using a feedforward multilayer artificial neural network (ANN) model with proper training. Periodic concentrations of some anions, especially nitrate (NO3-), and cations were detected in drainage waters collected from drain pipes placed in an irrigated tomato field, an unirrigated wheat field, fallow land, and pasture land. Soil samples were collected from the irrigated tomato field and the unirrigated wheat field on a grid system at 20 m x 20 m intervals. Site-specific nitrate concentrations in the soil samples were measured for ANN-based simulation of the nitrate leaching potential of the land profiles. In the application of the ANN model, a multilayer feedforward network was evaluated, and data sets for training, validation, and testing, containing the measured soil nitrate values, were established based on spatial variability. On the test data, the optimal structure of 2-15-1 was obtained (R2=0.96, P < 0.01) for the unirrigated field, and the optimal structure of 2-10-1 was obtained (R2=0.96, P < 0.01) for the irrigated field. The results showed that the ANN model can be successfully used to predict potential nitrate leaching levels for different land-use patterns. However, for the most suitable results, the model should be calibrated by training with different network structures, depending on site-specific soil parameters and varied agricultural management.
Keywords: artificial intelligence, ANN, drainage water, nitrate pollution
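A 2-15-1 network of the kind reported here — two inputs, one hidden layer of 15 neurons, one output — can be reproduced with scikit-learn in a few lines. The synthetic inputs below merely stand in for the study's site-specific measurements.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.uniform(0, 200, size=(120, 2))   # two predictors per grid point (assumed)
y = 5 + 0.02 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 0.5, 120)  # synthetic nitrate

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(15,),  # the 2-15-1 topology
                   max_iter=5000, random_state=0).fit(X_tr, y_tr)
print(f"Test R^2: {net.score(X_te, y_te):.2f}")

Swapping in hidden_layer_sizes=(10,) gives the 2-10-1 structure reported for the irrigated field.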
Procedia PDF Downloads 311
434 Instructional Leadership, Information and Communications Technology Competencies and Performance of Basic Education Teachers
Authors: Jay Martin L. Dionaldo
Abstract:
This study aimed to develop a causal model of the performance of basic education teachers in the Division of Malaybalay City for the school year 2018-2019. It used the responses of 300 randomly selected basic education teachers of Malaybalay City, Bukidnon, who answered three sets of questionnaires: one patterned after the National Education Association (2018) on the instructional leadership of teachers, the questionnaire of Caluza et al. (2017) on information and communications technology competencies, and a questionnaire on teachers' performance using the Individual Performance Commitment and Review Form (IPCRF) adopted by the Department of Education (DepEd). Descriptive statistics were used: means for description, correlation for relationships, regression for the extent of influence, and path analysis for the model that best fits teachers' performance. Results showed that basic education teachers have a very satisfactory level of performance. The teachers also highly practice instructional leadership in terms of coaching and mentoring, facilitating collaborative relationships, and community awareness and engagement. On the other hand, they are proficient users of ICT in terms of technology operations and concepts, and basic users in terms of the pedagogical indicators. Furthermore, instructional leadership (coaching and mentoring, facilitating collaborative relationships, and community awareness and engagement) and information and communications technology competencies (technology operations and concepts, and pedagogy) were significantly correlated with teachers' performance. Coaching and mentoring, community awareness and engagement, and technology operations and concepts were the best predictors of teachers' performance. The model that best fits teachers' performance is anchored on coaching and mentoring of the teachers, embedded with facilitating collaborative relationships, community awareness and engagement, technology operations and concepts, and pedagogy.
Keywords: information and communications technology, instructional leadership, coaching and mentoring, collaborative relationship
Procedia PDF Downloads 116433 Development of a Multi-Locus DNA Metabarcoding Method for Endangered Animal Species Identification
Authors: Meimei Shi
Abstract:
Objectives: The identification of endangered species, especially the simultaneous detection of multiple species in complex samples, plays a critical role in alleged wildlife crime incidents and in preventing illegal trade. This study aimed to develop a multi-locus DNA metabarcoding method for endangered animal species identification. Methods: Several pairs of universal primers were designed against conserved mitochondrial gene regions. Experimental mixtures were prepared artificially from 1-16 well-characterized species, including the endangered forest musk deer, bear, tiger, pangolin, and sika deer, at DNA concentrations ranging from 1% to 100%. After multiplex PCR amplification and parameter optimization, the amplified products were analyzed by capillary electrophoresis and used for NGS library preparation. DNA metabarcoding was carried out by Illumina MiSeq amplicon sequencing. The data were processed with quality trimming, read filtering, and OTU clustering, and representative sequences were identified using BLASTn. Results: Based on the parameter optimization and multiplex PCR results, five primer sets targeting COI, Cytb, 12S, and 16S were selected as the NGS library amplification panel. High-throughput sequencing data analysis showed that the established multi-locus DNA metabarcoding method was sensitive and accurately identified all species in the artificial mixtures, including the endangered species Moschus berezovskii, Ursus thibetanus, Panthera tigris, Manis pentadactyla, and Cervus nippon at a 1% DNA concentration. In conclusion, the established species identification method provides technical support for customs and forensic scientists in preventing the illegal trade of endangered animals and their products. Keywords: DNA metabarcoding, endangered animal species, mitochondrial nucleic acid, multi-locus
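The read-processing steps named in the abstract (quality trimming, read filtering, and OTU clustering) might look, in toy form, like the sketch below. Real pipelines use dedicated tools such as QIIME 2 or VSEARCH; the file name, thresholds, and the naive exact-sequence "clustering" here are all assumptions for illustration:

```python
# Toy read filtering and naive OTU-style dereplication for amplicon
# data, using Biopython's FASTQ parser. Not a production pipeline.
from collections import Counter
from Bio import SeqIO

MIN_MEAN_Q = 25   # assumed mean-quality filter threshold
MIN_LEN = 100     # assumed minimum read length after trimming

def passes_filter(record):
    quals = record.letter_annotations["phred_quality"]
    return len(record) >= MIN_LEN and sum(quals) / len(quals) >= MIN_MEAN_Q

reads = (r for r in SeqIO.parse("amplicons.fastq", "fastq")
         if passes_filter(r))

# Collapse identical sequences; the most abundant variants become
# representative sequences to submit to BLASTn.
counts = Counter(str(r.seq) for r in reads)
for i, (seq, n) in enumerate(counts.most_common(10), start=1):
    print(f">OTU_{i};size={n}\n{seq}")
```

A real OTU step would cluster at a similarity threshold (e.g. 97%) rather than on exact identity, but the filter-dereplicate-represent flow is the same.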
Procedia PDF Downloads 140432 Assessing Denitrification-Decomposition Model’s Efficacy in Simulating Greenhouse Gas Emissions, Crop Growth, Yield, and Soil Biochemical Processes in the Moroccan Context
Authors: Mohamed Boullouz, Mohamed Louay Metougui
Abstract:
Accurate modeling of greenhouse gas (GHG) emissions, crop growth, soil productivity, and biochemical processes is crucial given escalating global concerns about climate change and the urgent need to improve agricultural sustainability. This study thoroughly investigates the application of the denitrification-decomposition (DNDC) model in the context of Morocco's distinctive agro-climate. Our main research hypothesis is that the DNDC model offers an effective tool for precisely simulating a wide range of key variables, including GHG emissions, crop growth, yield potential, and soil biogeochemical processes, under the conditions of Moroccan agriculture. To test this hypothesis, extensive field data were gathered across Morocco's agricultural regions, encompassing a range of soil types, climatic factors, and crop varieties. These experimental data sets serve as the foundation for careful model calibration and subsequent validation, ensuring the accuracy of the simulation results. The prospective findings contribute to the global conversation on climate-resilient agricultural practices and support the promotion of sustainable agricultural models in Morocco. Recognition of the DNDC model as a simulation tool tailored to Moroccan conditions may strengthen the ability of policymakers and agricultural stakeholders to make informed decisions that advance both food security and environmental stability. Keywords: greenhouse gas emissions, DNDC model, sustainable agriculture, Moroccan cropping systems
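In outline, the calibration-against-field-data step described above could look like the following sketch: a parameter search minimizing RMSE between simulated and observed values. Since DNDC is an external program, a toy surrogate function stands in for a model run here, and the observations are placeholders, not the study's data:

```python
# Illustrative calibration loop: tune parameters to minimize RMSE
# between simulated and observed values (e.g. N2O flux or yield).
import numpy as np
from scipy.optimize import minimize

observed = np.array([1.2, 1.8, 2.4, 2.9, 3.3])  # placeholder field data

def run_model(params, t):
    """Toy stand-in for a DNDC simulation evaluated at times t."""
    a, b = params
    return a * t + b

def rmse(params):
    t = np.arange(1, len(observed) + 1)
    sim = run_model(params, t)
    return np.sqrt(np.mean((sim - observed) ** 2))

result = minimize(rmse, x0=[1.0, 0.5], method="Nelder-Mead")
print("calibrated parameters:", result.x, "RMSE:", result.fun)
```

In practice `run_model` would write a DNDC input file, launch the executable, and parse its output; validation then repeats the RMSE comparison on data held out from calibration.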
Procedia PDF Downloads 66431 Identification of the Microalgae Species in a Wild Mix Culture Acclimated to Landfill Leachate and Ammonia Removal Performances in a Microbubble Assisted Photobioreactor
Authors: Neslihan Ozman Say, Jim Gilmour, Pratik Desai, William Zimmerman
Abstract:
Landfill leachate treatment has recently attracted researchers for various environmental and economic reasons. Discharging untreated leachate into receiving water bodies has serious detrimental effects, including partial oxygen depletion due to high biological oxygen demand (BOD) and chemical oxygen demand (COD), heavy-metal toxicity, and high ammonia concentrations. This study aims to demonstrate the ammonia removal performance of a wild microalgae consortium as an alternative treatment method and to determine the dominant leachate-tolerant species in this consortium. For the species identification experiments, a microalgal consortium isolated from a local pond in Sheffield was inoculated in 5% diluted raw landfill leachate and acclimated to the leachate by batch feeding for a month. To determine the most tolerant consortium, four different untreated landfill leachate samples were used at four dilutions: 5%, 10%, 20%, and 40%. Microalgae cell samples collected from all experimental sets were examined using adapted molecular biodiversity methods, namely 18S rDNA sequencing and specialised gel electrophoresis. The most leachate-tolerant consortium is being used to determine ammonia removal performance in a microbubble-assisted photobioreactor (PBR), in which a porous microbubble diffuser driven by a fluidic oscillator doses a CO₂/air mixture. The high mass transfer performance of microbubble technology is known to improve removal efficiency and mixing in the photobioreactor. Ammonia concentrations and microalgal growth in the PBR are currently being monitored, and the full results of the study will be presented in the final paper. Keywords: ammonia removal from leachate, landfill leachate treatment, microalgae species identification, microbubble assisted photobioreactors
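A small sketch, under stated assumptions, of how the ammonia removal performance being monitored in the PBR might eventually be quantified: overall removal efficiency plus a first-order rate constant fitted to an NH₄⁺ time series. The concentrations below are illustrative placeholders, not measured values from this study:

```python
# Removal efficiency and first-order rate constant from a hypothetical
# ammonia time series measured in the photobioreactor.
import numpy as np

t = np.array([0, 1, 2, 3, 4, 5])                        # days
nh4 = np.array([120.0, 95.0, 74.0, 58.0, 45.0, 36.0])   # mg N/L (placeholder)

removal_pct = 100 * (nh4[0] - nh4[-1]) / nh4[0]

# First-order fit: ln(C/C0) = -k t, solved by least squares.
k = -np.polyfit(t, np.log(nh4 / nh4[0]), 1)[0]

print(f"removal efficiency: {removal_pct:.1f}%")
print(f"first-order rate constant k = {k:.3f} /day")
```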
Procedia PDF Downloads 161