Search results for: mutant sets
453 Quest for Literary Past: A Study of Byatt’s Possession
Authors: Chen Jun
Abstract:
Antonia Susan Byatt's Possession: A Romance has been misread as a postmodern pastiche novel since its publication because epics, epigraphs, lyrics, fairy tales, epistles, and even critical articles are woven into the work. The word 'pastiche' suggests something messy, disorganized, and chaotic, which buries the novel's artistic excellence while overlooking its subtitle, A Romance. At the center of romance is the quest: the hero sets forth to conquer adversity, hardship, and danger and to accomplish a task that proves his identity or social worth. This paper argues that Byatt's Possession is not a postmodern pastiche novel but rather a postmodern romance in which characters in the academic world set forth on a quest into the Victorian literary past, nostalgically identified by Byatt as the Golden Age of English literature. In doing so, the following five issues are addressed: first, the origin of the protagonist Roland and, consequently, the nature of his quest; second, the central image of the dragon created by the fictional Victorian poet Henry Ash; third, Melusine as an image of the female serpent created by the fictional Victorian poet Christabel LaMotte; fourth, the images of the two ladies; last, the image of water that links the dragon and the serpent. In Possession, the past is reinvented not as an unfortunate fall but as a Golden Age presented in an imaginative academic adventure. The dragon, a stereotypical symbol of evil, becomes a symbol of life in Byatt's work, paralleling the image of the mythical phoenix that resurrects from its own ash. At the same time, the phoenix symbolizes Byatt's effort to revive the Victorian poetic art that is supposedly dead in post-capitalist society, in which the novel is the dominant literary genre and poetry a minority pursuit.
The fictional Victorian poet Ash is in fact Byatt's own poetic mask, through which she breathes life into the lost poetic artistry of the postmodern era.
Keywords: Byatt, possession, postmodern romance, literary past
Procedia PDF Downloads 414
452 A Preliminary Investigation on Factors that Influence Road Users Speeding Behaviors in Selected Roads of Peninsular Malaysia
Authors: Farah Fazlinda Binti Mohamad, Siti Hikmah Binti Musthar, Ahmad Saifizul Bin Abdullah, Jamilah Mohamad, Mohamed Rehan Karim
Abstract:
Road safety is a pressing issue. It affects everyone's life, as the roads are shared by all, and the most vulnerable victims are the road users who travel them every day. It is appalling that the World Health Organization ranked Malaysian road users among the worst in Asian countries, with 23 deaths for every 100,000 of population over a span of 12 years (World Health Organization, 2009). From this report, it was found that speeding contributed to 60% of all accidents in the country. This study therefore aims to elucidate the speeding behaviour of road users in selected roads of Peninsular Malaysia and to provide insight into the factors affecting it. To meet these aims, 500 sets of questionnaires were distributed among 500 respondents in selected roads of Peninsular Malaysia to obtain their opinions on the matter. The respondents came from different demographic backgrounds to give a fair account of the issue. The answers were analysed using descriptive analysis. The results indicated that psychological factors were prominent in explaining road users' speeding behaviour. Male road users were also found to speed more than female road users, increasing their vulnerability to road injuries and deaths. These findings are useful for improving driving behaviour. Relevant authorities should also revise existing countermeasures and design new ones for road users. It is nevertheless important to understand the speeding issue and the factors associated with it. The matter should be taken seriously and responsibly by every road user, as road safety is a responsibility shared by all.
Keywords: road safety, speeding, countermeasures, accidents
Procedia PDF Downloads 487
451 Classification on Statistical Distributions of a Complex N-Body System
Authors: David C. Ni
Abstract:
Contemporary models for N-body systems are based on temporal, two-body, and mass-point representations of Newtonian mechanics. Other mainstream models include 2D and 3D Ising models based on local neighborhoods of lattice structures. In quantum mechanics, theories of collective modes address superconductivity and long-range quantum entanglement. However, these models mainly target specific phenomena with a set of designated parameters. We are therefore motivated to develop a new construction directly from complex-variable N-body systems based on the extended Blaschke functions (EBF), which represent a non-temporal and nonlinear extension of the Lorentz transformation on the complex plane, i.e., the normalized momentum space. A point on the complex plane represents a normalized state of particle momentum observed from a reference frame in the theory of special relativity. There are only two key parameters for modelling: normalized momentum and nonlinearity. An algorithm similar to the Jenkins-Traub method is adopted for solving the EBF iteratively. Through iteration, the solution sets take the form σ + i[-t, t], where σ and t are real numbers, and the interval [-t, t] exhibits various distributions, such as 1-peak, 2-peak, and 3-peak distributions, some of which are analogous to the canonical distributions. The results of the numerical analysis demonstrate continuum-to-discreteness transitions, evolutional invariance of distributions, phase transitions with conjugate symmetry, etc., which mark the construction as a potential candidate for the unification of statistics. We hereby classify the observed distributions on the finite convergent domains. Continuous and discrete distributions both exist and are predictable for given partitions in different regions of the parameter pair.
We further compare these distributions with canonical distributions and address the impacts on existing applications.
Keywords: Blaschke functions, Lorentz transformation, complex variables, continuous, discrete, canonical, classification
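The extended Blaschke functions are not defined in this abstract, so the sketch below only illustrates the flavour of such an iteration: it uses the classical Blaschke factor B_a(z) = (z - a)/(1 - conj(a)·z) and plain fixed-point iteration in place of the paper's Jenkins-Traub-style solver. All function names and parameter choices are illustrative assumptions, not the authors' construction.

```python
def blaschke(z, a):
    """Classical Blaschke factor B_a(z) = (z - a) / (1 - conj(a)*z).
    For |a| < 1 it maps the unit disk onto itself; 'a' is its zero."""
    return (z - a) / (1 - a.conjugate() * z)

def iterate_to_fixed_point(z0, a, tol=1e-12, max_iter=10000):
    """Iterate z -> B_a(z) until successive points differ by < tol.
    Returns the limit point and the number of iterations used."""
    z = z0
    for n in range(1, max_iter + 1):
        z_next = blaschke(z, a)
        if abs(z_next - z) < tol:
            return z_next, n
        z = z_next
    return z, max_iter
```

For a real parameter a in (0, 1) the map is hyperbolic with attracting fixed point z = -1 on the unit circle, so the orbit of 0 converges there; the paper's EBF iteration is reported to produce much richer limit sets.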
Procedia PDF Downloads 309
450 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection
Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine
Abstract:
Atrial fibrillation (AF) is considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. Automatic, early, and fast AF detection remains a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation, but the published results do not show satisfactory classification accuracy. This work aimed to resolve this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (AF Termination Challenge Database, MIT-BIH AF, Normal Sinus Rhythm RR Interval Database, and MIT-BIH Normal Sinus Rhythm Databases) were used for assessment. All time series were segmented into 1-min RR-interval windows, and four specific features were then calculated. Two pattern recognition methods, i.e., Principal Component Analysis (PCA) and a Learning Vector Quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find the important features for discriminating between AF and normal sinus rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than existing algorithms, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detection holds several interesting properties and can be implemented with just a few arithmetical operations, which makes it a suitable choice for telecare applications.
Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine
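The abstract does not name the four features computed per 1-min RR window, so the sketch below substitutes four common heart-rate-variability features as stand-ins and reduces them with PCA via an SVD of the centered feature matrix; everything here is an illustrative assumption, not the authors' pipeline.

```python
import numpy as np

def rr_features(rr):
    """Four illustrative HRV features from one 1-min window of RR
    intervals (in seconds).  Stand-ins for the paper's unnamed features."""
    rr = np.asarray(rr, dtype=float)
    diffs = np.diff(rr)
    mean_rr = rr.mean()
    sdnn = rr.std(ddof=1)                   # overall variability
    rmssd = np.sqrt(np.mean(diffs ** 2))    # beat-to-beat variability
    pnn50 = np.mean(np.abs(diffs) > 0.05)   # fraction of diffs > 50 ms
    return np.array([mean_rr, sdnn, rmssd, pnn50])

def pca_reduce(X, n_components=2):
    """Project the (windows x features) matrix onto its leading
    principal components using an SVD of the centered data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

AF episodes show markedly irregular RR intervals, so beat-to-beat features such as RMSSD tend to separate AF windows from normal sinus rhythm even before classification.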
Procedia PDF Downloads 268
449 Evaluation of Environmental and Social Management System of Green Climate Fund's Accredited Entities: A Qualitative Approach Applied to Environmental and Social System
Authors: Sima Majnooni
Abstract:
This paper discusses the environmental and social management framework of the Green Climate Fund (GCF). The environmental and social management framework ensures that the accredited entity considers the GCF's accreditation standards and effectively implements each GCF-funded project. The GCF requires all accredited entities to meet basic transparency and accountability standards as well as environmental and social safeguards (ESMS). In doing so, the accredited entity sets up different independent units. One of these units is the grievance mechanism: when allegations of environmental and social harm are raised in association with GCF-funded activities, affected parties can contact the entity's grievance unit. One of the most challenging aspects of an accredited entity's grievance unit is the lack of available information and resources on the entity's website. Many AEs have an anti-corruption or anti-money-laundering unit, but no environmental and social unit for affected people. This paper assesses the effectiveness of the environmental and social grievance mechanisms of AEs, using a qualitative approach to indicate how many AEs have a poor or an effective GRM. Some ESMSs seem highly effective; others lack basic requirements such as a clear, transparent, uniform procedure and a definitive timetable. We examined each AE's mechanism not only in light of how the website details the grievance process but also in light of the entity's risk category. Many mechanisms appear inadequate for lower-risk-category entities (C) and, surprisingly, even for many higher-risk-category entities (A). We found that, in most cases, the grievance mechanisms of AEs seem vague.
Keywords: grievance mechanism, vague environmental and social policies, green climate fund, international climate finance, lower and higher risk category
Procedia PDF Downloads 124
448 Seismic Vulnerability Analysis of Arch Dam Based on Response Surface Method
Authors: Serges Mendomo Meye, Li Guowei, Shen Zhenzhong
Abstract:
Earthquake loading is one of the main threats to dam safety. Once a dam is damaged, it brings huge losses of life and property to the country and its people; it is therefore very important to research the seismic safety of dams. Owing to complex foundation conditions, high fortification intensity, and high scientific and technological content, reasonable methods must be adopted to evaluate the seismic safety of concrete arch dams built and under construction in strong-earthquake areas. Structural seismic vulnerability analysis can predict the probability of structural failure at all levels under earthquakes of different intensities, which provides a scientific basis for reasonable seismic safety evaluation and decision-making. In this paper, the response surface method (RSM) is applied to the seismic vulnerability analysis of arch dams, which improves the efficiency of the vulnerability analysis. Based on the central composite design of experiments, material-seismic intensity samples are established. A response surface model with arch-crown displacement as the performance index is obtained by finite element (FE) calculation of the samples, and the accuracy of the response surface model is then verified. To obtain the seismic vulnerability curves, the seismic intensity measure Sa(T1) is chosen to range from 0.1 g to 1.2 g, with an interval of 0.1 g and a total of 12 intensity levels. For each seismic intensity level, the arch-crown displacement corresponding to 100 sets of different material samples can be calculated by algebraic evaluation of the response surface model, which avoids 1200 nonlinear dynamic calculations of the arch dam; thus, the efficiency of the vulnerability analysis is greatly improved.
Keywords: high concrete arch dam, performance index, response surface method, seismic vulnerability analysis, vector-valued intensity measure
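The abstract's fitted response surface is not given, so the sketch below uses a hypothetical quadratic surrogate for arch-crown displacement in a single material parameter and an intensity measure, then builds a fragility (vulnerability) curve by purely algebraic evaluation over 100 material samples at 12 intensity levels; all coefficients, the threshold, and the variable names are illustrative assumptions.

```python
import numpy as np

# Hypothetical quadratic response surface for arch-crown displacement (mm)
# in a normalized material parameter E and intensity a = Sa(T1) in g.
# Coefficients are illustrative stand-ins, not values fitted in the paper.
B = np.array([2.0, -1.5, 18.0, -4.0, 6.0])

def rsm_displacement(E, a):
    """Algebraic surrogate: d = b0 + b1*E + b2*a + b3*E*a + b4*a**2."""
    return B[0] + B[1] * E + B[2] * a + B[3] * E * a + B[4] * a ** 2

def fragility_curve(E_samples, intensities, threshold):
    """P(displacement > threshold) at each intensity level, estimated by
    evaluating the surrogate for every material sample -- no FE runs."""
    return np.array([
        np.mean(rsm_displacement(E_samples, a) > threshold)
        for a in intensities
    ])

rng = np.random.default_rng(1)
E_samples = rng.normal(1.0, 0.1, size=100)   # 100 material samples
intensities = np.linspace(0.1, 1.2, 12)      # 0.1 g ... 1.2 g, 12 levels
curve = fragility_curve(E_samples, intensities, threshold=10.0)
```

The key efficiency point survives the simplification: 100 samples x 12 levels = 1200 surrogate evaluations replace 1200 nonlinear dynamic FE analyses.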
Procedia PDF Downloads 240
447 Defect Correlation of Computed Tomography and Serial Sectioning in Additively Manufactured Ti-6Al-4V
Authors: Bryce R. Jolley, Michael Uchic
Abstract:
This study presents initial results toward the correlative characterization of inherent defects in Ti-6Al-4V additive manufacture (AM). X-ray computed tomography (CT) defect data are compared and correlated with microscopic photographs obtained via automated serial sectioning. The metal AM specimen was manufactured from virgin Ti-6Al-4V powder to specified dimensions. A post-contour was applied during the fabrication process with a speed of 1050 mm/s, a power of 260 W, and a width of 140 µm. The specimen was stress-relief heat-treated at 16°F for 3 hours. Microfocus CT imaging was performed on the specimen within a predetermined region of the build, with parameters optimized for Ti-6Al-4V additive manufacture. After CT imaging, a modified RoboMet.3D version 2 was employed for serial sectioning and optical microscopy characterization of the same predetermined region. Automated montage capture of sub-micron-resolution, bright-field reflection, 12-bit monochrome optical images was performed. These optical images were post-processed, including thresholding and segmentation to improve visualization of defect features, to produce 2D and 3D data sets. The defects observed from optical imaging were compared and correlated with the defects observed from CT imaging over the same predetermined region of the specimen. Quantitative results for area fraction and equivalent pore diameter obtained via each method are presented for this correlation. It is shown that microfocus CT imaging does not capture all inherent defects within this Ti-6Al-4V AM sample. Best practices for this correlative effort are also presented, as well as future research directions arising from the current study.
Keywords: additive manufacture, automated serial sectioning, computed tomography, nondestructive evaluation
Procedia PDF Downloads 141
446 Driving Green Public Procurement – A Framework for a Supporting Structure for Public Authorities Based on Good Practices in Europe
Authors: Pia Moschall, Kathrin Sackmann
Abstract:
Considering a purchasing volume of around two trillion euros per year, equal to about 14% of the European Union's gross domestic product, European public authorities have significant market power. Using this market power to prioritize the procurement of green products and services offers great potential to contribute to the Green New Deal. The market demand created by Green Public Procurement (GPP) sets incentives for European producers to design and develop green products and eco-innovations. However, most procurement still does not consider environmental criteria. The goal of this work is to encourage the adoption of GPP in the European Union. To this end, the drivers of adoption were investigated across different case studies. The paper analyzes good-practice cases from European authorities from 2010 to 2020 provided by the European Commission. This analysis was guided by Philipp Mayring's method of qualitative content analysis, whereby the inductively formed categories led to the identification of nine major drivers. The most important ones are 'use of official guidelines and standards', 'political support and requirements', and 'market research and involvement'. Further, the paper discusses mutual dependencies between several drivers and how to exploit them. A supporting infrastructure was identified as a crucial factor for the successful adoption of green public procurement. In the next step, the work aims to examine at which administrative level each driver can be implemented most effectively. Practical implications of this research are recommendations on how to create a supporting structure at the municipal, federal, and national levels, including training for the responsible staff, support tools, and guidelines and standards for involved stakeholders.
Keywords: content analysis, green public procurement, public authorities, sustainable procurement
Procedia PDF Downloads 146
445 Recurrent Neural Networks for Complex Survival Models
Authors: Pius Marthin, Nihal Ata Tutkun
Abstract:
Survival analysis has become one of the paramount procedures in the modeling of time-to-event data. When we encounter complex survival problems, the traditional approach remains limited in accounting for the complex correlational structure between the covariates and the outcome, due to strong assumptions that limit the inference and prediction ability of the resulting models. Several studies exist on deep learning approaches to survival modeling; however, their application to complex survival problems still needs improvement. In addition, existing models do not fully address the complexity of the data structure and are subject to noise and redundant information. In this study, we design a deep learning technique (CmpXRnnSurv_AE) that overcomes the limitations imposed by traditional approaches and addresses the above issues to jointly predict the risk-specific probabilities and the survival function for recurrent events with competing risks. We introduce a component termed Risks Information Weights (RIW) as an attention mechanism to compute the weighted cumulative incidence function (WCIF), and an external auto-encoder (ExternalAE) as a feature selector to extract complex characteristics among the set of covariates responsible for the cause-specific events. We train our model using synthetic and real data sets and employ the appropriate metrics for complex survival models for evaluation. As benchmarks, we selected both traditional and machine learning models, and our model demonstrates better performance across all datasets.
Keywords: cumulative incidence function (CIF), risk information weight (RIW), autoencoders (AE), survival analysis, recurrent events with competing risks, recurrent neural networks (RNN), long short-term memory (LSTM), self-attention, multilayer perceptrons (MLPs)
Procedia PDF Downloads 90
444 Cosmic Muon Tomography at the Wylfa Reactor Site Using an Anti-Neutrino Detector
Authors: Ronald Collins, Jonathon Coleman, Joel Dasari, George Holt, Carl Metelko, Matthew Murdoch, Alexander Morgan, Yan-Jie Schnellbach, Robert Mills, Gareth Edwards, Alexander Roberts
Abstract:
The VIDARR prototype anti-neutrino detector was deployed at the Wylfa Magnox power plant between 2014 and 2016. It comprises extruded plastic scintillating bars measuring 4 cm × 1 cm × 152 cm and utilises wavelength-shifting fibres (WLS) and multi-pixel photon counters (MPPCs) to detect and quantify radiation. During deployment, it took cosmic muon data in accidental coincidence with the anti-neutrino measurements, with the power plant site buildings obscuring the muon sky. Cosmic muons have a significantly higher probability of being attenuated and/or absorbed by denser objects, so one-sided cosmic muon tomography was utilised to image the reactor site buildings. In order to achieve clear building outlines, a control data set with minimal occlusion of the cosmic muon flux by dense objects was taken at the University of Liverpool from 2016 to 2018. By taking the ratio of these two data sets and using GEANT4 simulations, it is possible to perform a one-sided cosmic muon tomography analysis. This analysis can discern specific buildings, building heights, and features at the Wylfa reactor site, including the reactor core and its shielding, using ∼3 hours' worth of cosmic-ray detector live time. This result demonstrates the feasibility of using cosmic muon analysis to determine a segmented detector's location with respect to surrounding buildings, assisted by aerial photography or satellite imagery.
Keywords: anti-neutrino, GEANT4, muon, tomography, occlusion
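The core of the ratio analysis can be sketched in a few lines: bin the muon counts by arrival direction for the site data and the open-sky control data, divide bin by bin, and rescale by the overall count ratio so that occluded directions stand out below unity. The binning, normalization, and omission of detector-acceptance and GEANT4 corrections are all simplifying assumptions of this sketch, not the collaboration's actual analysis chain.

```python
import numpy as np

def occlusion_map(site_counts, reference_counts):
    """Bin-by-bin ratio of angular muon counts: deployment-site data over an
    unoccluded reference exposure.  The ratio is rescaled by the overall
    count ratio, so open-sky directions sit near or above 1 while directions
    blocked by dense buildings fall well below 1."""
    site = np.asarray(site_counts, dtype=float)
    ref = np.asarray(reference_counts, dtype=float)
    ratio = np.divide(site, ref, out=np.zeros_like(site), where=ref > 0)
    return ratio / (site.sum() / ref.sum())
```

In practice the bins would be two-dimensional (zenith and azimuth) and statistical errors per bin would be propagated, but the one-dimensional toy already shows how a dense building carves a dip into the ratio.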
Procedia PDF Downloads 186
443 Poor Proficiency of English Language among Tertiary Level Students in Bangladesh and Its Effect on Employability: An Investigation to Find Facts and Solutions
Authors: Tanvir Ahmed, Nahian Fyrose Fahim, Subrata Majumder, Sarker Kibria
Abstract:
English is widely recognized as the standard second language of the world. Many people believe that English proficiency is the key to communicating effectively at a global level, especially for developing countries, for which it can bring success on many fronts by ensuring access to education, business, and technology worldwide. Bangladesh is a developing country of about 160 million people. A notable number of students in Bangladesh are currently pursuing higher education at the tertiary level in more than 150 public and private universities. English is the dominant linguistic medium through which university instruction and lectures are delivered in Bangladesh. However, many students who completed their primary and secondary education in the Bangla medium are in an awkward position when they suddenly face many unfamiliar requirements on entering university as freshmen, struggling through at least 18 courses to acquire proficiency in English. After obtaining a tertiary education certificate, students should have the opportunity to acquire a sustainable position in the job market; unfortunately, many fail because of poor English proficiency. Our study covers students at both public and private universities (N=150) as well as education experts (N=30) in Bangladesh. We prepared two sets of questionnaires based on a literature review of the subject, collected data, identified the reasons for the problem, and arrived at probable solutions.
After statistical analysis, the study suggests certain remedial measures that could be taken to increase students' proficiency in English and to strengthen their employability.
Keywords: tertiary education, English language proficiency, employability, unemployment problems
Procedia PDF Downloads 104
442 Maximum Likelihood Estimation Methods on a Two-Parameter Rayleigh Distribution under Progressive Type-II Censoring
Authors: Daniel Fundi Murithi
Abstract:
Data from economic, social, clinical, and industrial studies are often incomplete or distorted due to censoring. Such data may have adverse effects if used in estimation problems. We propose the use of Maximum Likelihood Estimation (MLE) under a progressive type-II censoring scheme to remedy this problem. In particular, maximum likelihood estimates (MLEs) for the location (µ) and scale (λ) parameters of the two-parameter Rayleigh distribution are derived under a progressive type-II censoring scheme using the Expectation-Maximization (EM) and Newton-Raphson (NR) algorithms. These algorithms are compared because both iteratively produce satisfactory results in the estimation problem. The progressive type-II censoring scheme is used because it allows the removal of test units before the termination of the experiment. Approximate asymptotic variances and confidence intervals for the location and scale parameters are derived and constructed. The efficiency of the EM and NR algorithms is compared in terms of root mean squared error (RMSE), bias, and coverage rate. The simulation study showed that in most simulation cases, the estimates obtained using the Expectation-Maximization algorithm had smaller biases, smaller variances, narrower confidence intervals, and smaller root mean squared errors than those generated via the Newton-Raphson algorithm. Further, the analysis of a real-life data set (data from simple experimental trials) showed that the Expectation-Maximization algorithm performs better than the Newton-Raphson algorithm in all simulation cases under the progressive type-II censoring scheme.
Keywords: expectation-maximization algorithm, maximum likelihood estimation, Newton-Raphson method, two-parameter Rayleigh distribution, progressive type-II censoring
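As a minimal sketch of the Newton-Raphson side, the snippet below estimates only the scale parameter λ of the two-parameter Rayleigh density f(x) = 2λ(x - µ)exp(-λ(x - µ)²), x > µ, on a complete (uncensored) sample with the location µ assumed known. The paper's actual likelihood carries extra terms for the progressively removed units and estimates µ jointly; the function name and closed-form check here are illustrative.

```python
import numpy as np

def rayleigh_lambda_mle_nr(x, mu, lam0=None, tol=1e-10, max_iter=100):
    """Newton-Raphson for the Rayleigh scale parameter lambda on a complete
    sample with known location mu.  The score is n/lam - S with
    S = sum((x - mu)**2), so the closed-form root n/S lets us verify the
    iteration (with censoring there is no such closed form)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    S = np.sum((x - mu) ** 2)
    lam = lam0 if lam0 is not None else 0.5 * n / S   # crude starting value
    for _ in range(max_iter):
        score = n / lam - S              # d log-likelihood / d lambda
        hessian = -n / lam ** 2          # d^2 log-likelihood / d lambda^2
        lam_new = lam - score / hessian  # Newton step
        if abs(lam_new - lam) < tol:
            return lam_new
        lam = lam_new
    return lam
```

Under progressive type-II censoring the score picks up survival-function terms for each removal stage, which is exactly why the iterative EM and NR schemes compared in the paper are needed.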
Procedia PDF Downloads 163
441 High-Risk Gene Variant Profiling Models Ethnic Disparities in Diabetes Vulnerability
Authors: Jianhua Zhang, Weiping Chen, Guanjie Chen, Jason Flannick, Emma Fikse, Glenda Smerin, Yanqin Yang, Yulong Li, John A. Hanover, William F. Simonds
Abstract:
Ethnic disparities in many diseases are well recognized and reflect the consequences of genetic, behavioural, and environmental factors. However, direct scientific evidence connecting ethnic genetic variation to these disease disparities has been elusive, which may have contributed to ethnic inequalities in large-scale genetic studies. Through genome-wide analysis of data representing 185,934 subjects, including 14,955 from our own studies of African America Diabetes Mellitus, we discovered sets of genetic variants either unique to or conserved across all ethnicities. We further developed a quantitative, gene-function-based high-risk variant index (hrVI) for 20,428 genes to establish profiles that strongly correlate with subjects' self-identified ethnicities. With respect to the ability to detect human essential and pathogenic genes, the hrVI analysis method is both comparable with and complementary to the well-known genetic analysis methods pLI and VIRlof. Application of the ethnicity-specific hrVI analysis to the type 2 diabetes mellitus (T2DM) national repository, containing 20,791 cases and 24,440 controls, identified 114 candidate T2DM-associated genes, 8.8-fold more than ethnicity-blind analysis. All the genes identified are classified as either pathogenic or likely pathogenic in the ClinVar database, with 33.3% diabetes-associated and 54.4% obesity-associated genes. These results demonstrate the utility of hrVI analysis and provide the first genetic evidence, through clustering patterns, of how genetic variation among ethnicities may impede the discovery of diabetes-associated and, foreseeably, other disease-associated genes.
Keywords: diabetes-associated genes, ethnic health disparities, high-risk variant index, hrVI, T2DM
Procedia PDF Downloads 137
440 Calcitonin Gene-Related Peptide Receptor Antagonists for Chronic Migraine – Real-World Outcomes
Authors: B. J. Mahen, N. E. Lloyd-Gale, S. Johnson, W. P. Rakowicz, M. J. Harris, A. D. Miller
Abstract:
Background: Migraine is a leading cause of disability worldwide. Calcitonin gene-related peptide (CGRP) receptor antagonists offer an approach to migraine prophylaxis by inhibiting the inflammatory and vasodilatory effects of CGRP. In recent years, NICE approved the use of three CGRP-receptor antagonists: Fremanezumab, Galcanezumab, and Erenumab. Here, we present the outcomes of CGRP-antagonist treatment in a cohort of patients who suffer from episodic or chronic migraine and have failed at least three oral prophylactic therapies. Methods: We offered CGRP antagonists to 86 patients who met the NICE criteria for starting therapy. We recorded the number of headache days per month (HDPM) at 0 weeks, 3 months, and 12 months. Of those, 26 patients were switched to an alternative treatment due to poor response or side effects. Of the 112 total cases, 9 did not sufficiently maintain their headache diary, and 5 were not followed up at 3 months; we have therefore included 98 sets of data in our analysis. Results: Fremanezumab achieved a reduction in HDPM of 51.7% at 3 months (p<0.0001), with 63.7% of patients meeting the NICE criteria to continue therapy. Patients trialed on Galcanezumab attained a reduction in HDPM of 47.0% (p=0.0019), with 51.6% of patients meeting the NICE criteria to continue therapy. Erenumab, however, achieved a reduction in HDPM of only 17.0% (p=0.29), which was not statistically significant. Furthermore, 34.4%, 9.7%, and 4.9% of patients taking Fremanezumab, Galcanezumab, and Erenumab, respectively, continued therapy beyond 12 months. Of those who attempted drug holidays following 12 months of treatment, migraine symptoms relapsed in 100% of cases. Conclusion: We observed a significant improvement in HDPM amongst episodic and chronic migraine patients following treatment with Fremanezumab or Galcanezumab.
Keywords: migraine, CGRP, fremanezumab, galcanezumab, erenumab
Procedia PDF Downloads 95
439 Experimental Study on Strength Development of Low Cement Concrete Using Mix Design for Both Binary and Ternary Mixes
Authors: Mulubrhan Berihu, Supratic Gupta, Zena Gebriel
Abstract:
Owing to its design versatility, availability, and cost efficiency, concrete continues to be the most used construction material on earth. However, the production of Portland cement, the primary component of the concrete mix, has serious environmental and economic impacts. This points to the need to study the use of supplementary cementitious materials (SCMs). The most commonly used SCMs are industrial wastes, and the use of these waste products has technical, economic, and environmental benefits besides reducing CO2 emissions from cement production. The study aims to document the effect on the strength of concrete of using low cement contents by maximizing supplementary cementitious materials such as fly ash or marble powder. A range of mix designs was formulated based on different proportions of pozzolana and marble powder. The first part of the project studies the strength of low-cement concrete with fly ash replacement experimentally. The test results showed that using as little as 85 kg/m3 of cement is possible for plain concrete works such as hollow-block concrete, achieving 9.8 MPa, and the experimental results indicate that strength is a function of w/b. In the second part, a new set of mix designs was carried out with fly ash and marble powder to study the strength of both binary and ternary mixes. In this experimental study, three groups of mix designs (c+FA, c+FA+m, and c+m), with four mixes per group, were taken up. Experimental results show that c+FA maintained the best strength and impermeability, whereas c+m obtained lower compressive strength, poorer permeability, and lower split tensile strength. c+FA shows a large gain in compressive strength from 7 days to 28 days compared to the others, which reflects the slow rate of hydration of fly ash concrete. As the w/b ratio increases, the strength decreases significantly.
At the same time, higher permeability was observed in the specimens tested for three hours than in those tested for one hour.
Keywords: efficiency factor, cement content, compressive strength, mix proportion, w/c ratio, water permeability, SCMs
Procedia PDF Downloads 209
438 Design of Low-Emission Catalytically Stabilized Combustion Chamber Concept
Authors: Annapurna Basavaraju, Andreas Marn, Franz Heitmeir
Abstract:
The Advisory Council for Aeronautics Research in Europe (ACARE) is cognizant for the overall reduction of NOx emissions by 80% in its vision 2020. Moreover small turbo engines have higher fuel specific emissions compared to large engines due to their limited combustion chamber size. In order to fulfill these requirements, novel combustion concepts are essential. This motivates to carry out the research on the current state of art, catalytic stabilized combustion chamber using hydrogen in small jet engines which are designed and investigated both numerically and experimentally during this project. Catalytic combustion concepts can also be adopted for low caloric fuels and are therefore not constrained to only hydrogen. However, hydrogen has high heating value and has the major advantage of producing only the nitrogen oxides as pollutants during the combustion, thus eliminating the interest on other emissions such as Carbon monoxides etc. In the present work, the combustion chamber is designed based on the ‘Rich catalytic Lean burn’ concept. The experiments are conducted for the characteristic operating range of an existing engine. This engine has been tested successfully at Institute of Thermal Turbomachinery and Machine Dynamics (ITTM), Technical University Graz. One of the facts that the efficient combustion is a result of proper mixing of fuel-air mixture, considerable significance is given to the selection of appropriate mixer. This led to the design of three diverse configurations of mixers and is investigated experimentally and numerically. Subsequently the best mixer would be equipped in the main combustion chamber and used throughout the experimentation. Furthermore, temperatures and pressures would be recorded at various locations inside the combustion chamber and the exhaust emissions will also be analyzed. 
The instrumented combustion chamber will be inspected at engine-relevant inlet conditions for nine different sets of catalysts at the Hot Flow Test Facility (HFTF) of the institute.
Keywords: catalytic combustion, gas turbine, hydrogen, mixer, NOx emissions
Procedia PDF Downloads 305
437 The Psychology of Cross-Cultural Communication: A Socio-Linguistics Perspective
Authors: Tangyie Evani, Edmond Biloa, Emmanuel Nforbi, Lem Lilian Atanga, Kom Beatrice
Abstract:
The dynamics of languages in contact necessitates a close study of how their users negotiate meanings from shared values in the process of cross-cultural communication. A transverse analysis of the situation demonstrates the existence of complex efforts at connecting cultural knowledge to cross-linguistic competencies within a widening range of communicative exchanges. This paper sets out to examine the psychology of cross-cultural communication in a multilinguistic setting like Cameroon, where many local and international languages are in close contact. The paper equally analyses the pertinence of existing macro-sociological concepts as fundamental knowledge traits in literal and idiomatic cross-semantic mapping. From this point, the article presents a path model connecting sociolinguistics to the increasing adoption of a widening range of communicative genres, piloted by the ongoing globalisation trends with their high-speed information technology machinery. By applying a cross-cultural analysis frame, the paper contributes to a better understanding of the fundamental changes in the nature and goals of cross-cultural knowledge in the pragmatics of communication and cultural acceptability. It emphasises that, in an era of increasing global interchange, a comprehensive and inclusive global culture, achieved by bridging gaps in cross-cultural communication, would have significant potential to contribute to achieving global social development goals, if inadequacies in language constructs are adjusted to create avenues that intertwine with sociocultural beliefs, ensuring that meaningful and context-bound sociolinguistic values are observed within the global arena of communication.
Keywords: cross-cultural communication, customary language, literalisms, primary meaning, subclasses, transubstantiation
Procedia PDF Downloads 285
436 A Prediction of Cutting Forces Using Extended Kienzle Force Model Incorporating Tool Flank Wear Progression
Authors: Wu Peng, Anders Liljerehn, Martin Magnevall
Abstract:
In metal cutting, tool wear gradually changes the micro-geometry of the cutting edge. Today there is a significant gap in understanding the impact these geometrical changes have on the cutting forces, which govern tool deflection and heat generation in the cutting zone. Accurate models and understanding of the interaction between the workpiece and cutting tool lead to improved accuracy in simulation of the cutting process. These simulations are useful in several application areas, e.g., optimization of insert geometry and machine tool monitoring. This study aims to develop an extended Kienzle force model to account for the effects that rake angle variations and tool flank wear have on the cutting forces. The starting point is cutting force measurements from orthogonal turning tests of pre-machined flanges with well-defined width, using triangular coated inserts to assure orthogonal conditions. The cutting forces have been measured by a dynamometer for a set of three different rake angles, and wear progression has been monitored during machining by an optical measuring collaborative robot. The method combines the measured cutting forces with the inserts’ flank wear progression to extend the mechanistic cutting force model with flank wear as an input parameter. The adapted cutting force model is validated in a turning process with commercial cutting tools and shows significant capability in predicting cutting forces while accounting for tool flank wear and inserts with different rake angles. The results suggest that the nonlinear effect of tool flank wear and the interaction between the workpiece and the cutting tool can be captured by the developed cutting force model.
Keywords: cutting force, Kienzle model, predictive model, tool flank wear
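A minimal sketch of the baseline Kienzle relation the abstract extends, F_c = k_c1.1 · b · h^(1−m_c), with an illustrative additive flank-wear correction; the coefficient values and the linear wear factor are assumptions for illustration, not the paper's fitted model:

```python
def cutting_force(b_mm, h_mm, kc11=1500.0, mc=0.25, vb_mm=0.0, kw=0.8):
    """Main cutting force [N] from the Kienzle equation
    F_c = kc1.1 * b * h**(1 - mc),
    where b is the chip width [mm] and h the uncut chip thickness [mm].
    The (1 + kw * VB) factor is a placeholder wear correction: VB is the
    flank wear-land width [mm] and kw an assumed sensitivity coefficient."""
    force_sharp = kc11 * b_mm * h_mm ** (1.0 - mc)
    return force_sharp * (1.0 + kw * vb_mm)

# A worn edge (VB = 0.2 mm) yields a higher predicted force than a sharp one
sharp = cutting_force(2.0, 0.1)
worn = cutting_force(2.0, 0.1, vb_mm=0.2)
```

In the paper's approach the wear dependence is identified from the measured force/wear pairs rather than assumed linear as here.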
Procedia PDF Downloads 108
435 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process
Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma
Abstract:
As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes which offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when scaling up the manufacturing process. This project evaluates the techno-economic value proposition of an integrated industrial-scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. This is done by developing a process model of a co-precipitation synthesis route in Aspen Plus software, validated against experimental data. The mechanism chemistry and equilibrium conditions were established based on previous literature and HSC Chemistry software. This is followed by integrating the energy streams, adding waste recovery and treatment processes, and testing the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs, raw materials, etc.) to calculate the cost of CAM ($/kg and $/kWh), the total plant cost ($), and the net present value (NPV). This work sets the foundational blueprint for future research into sustainable industrial-scale processes for CAM manufacturing.
Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis
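The NPV and unit-cost figures named at the end of the abstract reduce to standard discounted-cash-flow arithmetic; a minimal sketch, with all numbers as arbitrary placeholders rather than the study's cost data:

```python
def npv(cash_flows, rate):
    """Net present value of a series of yearly cash flows.
    cash_flows[0] occurs at t=0 (e.g. negative capital expenditure);
    later entries are yearly net revenues. rate is the discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def cam_unit_cost(total_annual_cost, annual_output_kg):
    """Cost of cathode active material in $/kg (placeholder inputs)."""
    return total_annual_cost / annual_output_kg

# Toy plant: -100 M$ capex, 60 M$/yr net revenue for 2 years at 10% discount
project_npv = npv([-100.0, 60.0, 60.0], 0.10)
```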
Procedia PDF Downloads 100
434 Intercultural Education and Changing Paradigms of Education: A Research Survey
Authors: Shalini Misra
Abstract:
The means and methods of education have been changing fast since the invention of the internet. Both ancient and modern education emphasized the holistic development of students, but a significant change has been observed in 21st-century learners. Online classes and intercultural and interdisciplinary education, which were exceptions in the past, are setting new trends in the field of education. In the modern era, intercultural and interpersonal skills are of immense importance, not only for students but for everyone. Intercultural education sets a platform for better understanding and deeper learning by ensuring the active participation and involvement of students belonging to different social and cultural backgrounds in various academic and non-academic pursuits. On October 31, 2015, on the occasion of the 140th birth anniversary of Sardar Vallabhbhai Patel, the Hon’ble Prime Minister of India, Narendra Modi, announced an initiative, ‘Ek Bharat Shreshtha Bharat’, i.e., ‘One India Best India’, commonly known as ‘EBSB’. The program highlighted India’s rich culture and traditions; its objective was to foster better understanding and healthy relationships among Indian states. Under this program, a variety of subjects were covered, such as ‘Arts, Culture and Language’. It was claimed to be a successful cultural exchange in which students from diverse communities shared their thoughts and experiences with one another. Under this online cultural exchange program, the state of Uttarakhand was paired with the state of Karnataka in the year 2022. The present paper undertakes a survey of a total of thirty secondary-level students of Uttarakhand and the partner state Karnataka who participated in this program with the purpose of learning and embracing new ideas and cultures, thus promoting intercultural education.
It aims to study and examine the role of intercultural education in shifting and establishing new paradigms of education.
Keywords: education, intercultural, interpersonal, traditions, understanding
Procedia PDF Downloads 79
433 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel
Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani
Abstract:
Agent-based systems technology has been addressed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments in solving problems. Nevertheless, it is impractical to rely on a single agent for all computing processes when solving complex problems, and an increasing number of applications require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems beyond the individual capacities or knowledge of each problem solver. However, a MAS still requires a main system to govern or oversee the operation of the agents in order to achieve a unified goal. We developed a fleet management system (FMS) to manage the fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS must be able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents, and generate a bathymetric map from the data received from each ASV unit. The first algorithm communicates with the ASVs via radio using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm takes care of path planning, formation, and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm makes use of data collected by the agents to create a bathymetry map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry
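The NMEA 0183 sentences used for the radio link carry a simple XOR checksum that any FMS receiver must verify before trusting a message; a minimal validator might look like the sketch below (the example sentence body in the usage note is made up, not from the project):

```python
def nmea_checksum_ok(sentence: str) -> bool:
    """An NMEA 0183 sentence has the form '$<body>*<hh>', where <hh> is the
    hexadecimal XOR of every character in <body> (between '$' and '*')."""
    body, sep, checksum = sentence.strip().lstrip("$").partition("*")
    if sep != "*":
        return False  # no checksum field present
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return checksum[:2].upper() == format(calc, "02X")
```

Field parsing (position, depth, fix quality) would follow only after the checksum passes, since sounder and GNSS data corrupted over radio would otherwise poison the bathymetry map.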
Procedia PDF Downloads 271
432 Socio-Cultural Representations through Lived Religions in Dalrymple’s Nine Lives
Authors: Suman
Abstract:
In the continuous interaction between the past and the present that historiography is, each time history gets re/written a new representation emerges. This new representation is a reflection of the earlier archives and their interpretations, of fragmented remembrances of the past, and of reactions to the present. Memory, or the lack thereof, and stereotyping generally play a major role in this representation. William Dalrymple’s Nine Lives: In Search of the Sacred in Modern India (2009) is one such written account, which sets out to narrate representations of the religion and culture of India and contemporary reactions to them. Dalrymple’s nine saints belong to different castes, sects, religions, and regions. By dealing with their religions and expressions of those religions, and through the lived mysticism of these nine individuals, the book engages with important issues like class, caste, and gender in the contexts provided by historical as well as present-day India. The paper studies the development of religion and the accompanying feeling of religiosity in modern as well as historical contexts through a study of these elements in the book. Since the language used in the creation of texts, and the literary texts thus produced, create a new reality that questions the stereotypes of the past, and in turn often ends up creating new stereotypes or stereotypical representations, the paper actively engages with the text in order to identify and study such stereotypes, along with their changing representations. Through a detailed examination of the book, the paper seeks to unravel whether certain socio-cultural stereotypes existed earlier, and whether new stereotypes develop from Dalrymple’s point of view as an outsider writing on issues that are deeply rooted in the cultural milieu of the country.
For this analysis, the paper draws on psycho-literary theories of stereotyping and representation.
Keywords: stereotyping, representation, William Dalrymple, religion
Procedia PDF Downloads 310
431 Maintaining Experimental Consistency in Geomechanical Studies of Methane Hydrate Bearing Soils
Authors: Lior Rake, Shmulik Pinkert
Abstract:
Methane hydrate has been found in significant quantities in soils offshore within continental margins and in permafrost within arctic regions, where low temperature and high pressure are present. The mechanical parameters for geotechnical engineering are commonly evaluated in geomechanical laboratories adapted to simulate the environmental conditions of methane hydrate-bearing sediments (MHBS). Due to the complexity and high cost of natural MHBS sampling, most laboratory investigations are conducted on artificially formed samples. Artificial MHBS samples can be formed using different hydrate formation methods in the laboratory, in which methane gas and water are supplied into the soil pore space under methane hydrate phase conditions. The most commonly used formation method is the excess-gas method, which is considered relatively simple, time-saving, and repeatable. However, there are several differences in the procedures and techniques used to produce the hydrate with the excess-gas method. As a result of the differences among the test facilities and experimental approaches of previous studies, different measurement criteria and analyses have been proposed for MHBS geomechanics. The lack of uniformity among the various experimental investigations may adversely impact the reliability of integrating different data sets for unified mechanical model development. In this work, we address some fundamental aspects of reliable MHBS geomechanical investigation, such as hydrate homogeneity in the sample, the hydrate formation duration criterion, the hydrate-saturation evaluation method, and the effect of temperature measurement accuracy. Finally, a set of recommendations for repeatable and reliable MHBS formation is suggested for future standardization of MHBS geomechanical investigation.
Keywords: experimental study, laboratory investigation, excess gas, hydrate formation, standardization, methane hydrate-bearing sediment
Procedia PDF Downloads 58
430 A Multivariate Statistical Approach for Water Quality Assessment of River Hindon, India
Authors: Nida Rizvi, Deeksha Katyal, Varun Joshi
Abstract:
River Hindon is an important river catering to the demand of the highly populated rural and industrial clusters of western Uttar Pradesh, India. The water quality of river Hindon is deteriorating at an alarming rate due to various industrial, municipal, and agricultural activities. The present study aimed at identifying the pollution sources and quantifying the degree to which these sources are responsible for the deteriorating water quality of the river. Various water quality parameters, like pH, temperature, electrical conductivity, total dissolved solids, total hardness, calcium, chloride, nitrate, sulphate, biological oxygen demand, chemical oxygen demand, and total alkalinity, were assessed. Water quality data obtained from eight study sites over one year were subjected to two multivariate techniques, namely principal component analysis and cluster analysis. Principal component analysis was applied with the aim of finding spatial variability and identifying the sources responsible for the water quality of the river; three varifactors were obtained after varimax rotation of the initial principal components. Cluster analysis was carried out to classify sampling stations of certain similarity, which grouped the eight sites into two clusters. The study reveals that anthropogenic influence (municipal, industrial, wastewater, and agricultural runoff) was the major source of river water pollution. Thus, this study illustrates the utility of multivariate statistical techniques for the analysis and elucidation of multifaceted data sets, the recognition of pollution sources/factors, and the understanding of temporal/spatial variations in water quality for effective river water quality management.
Keywords: cluster analysis, multivariate statistical techniques, river Hindon, water quality
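The PCA step described above, standardizing the parameters, decomposing the correlation matrix, and ranking components by explained variance, can be sketched with NumPy; the synthetic data stand in for the Hindon measurements, and varimax rotation/clustering would follow on the retained components:

```python
import numpy as np

def principal_components(data):
    """Rows = samples, columns = water quality parameters.
    Standardizes each column, eigen-decomposes the correlation matrix,
    and returns eigenvalues (variance explained) in descending order
    together with the matching loading vectors."""
    z = (data - data.mean(axis=0)) / data.std(axis=0)
    corr = (z.T @ z) / len(z)          # correlation matrix; trace = n_params
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]  # eigh returns ascending order
    return eigvals[order], eigvecs[:, order]
```

Libraries such as scikit-learn (PCA) and SciPy (hierarchical clustering) provide production versions of both steps.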
Procedia PDF Downloads 467
429 The Influence of Celebrity Endorsement on Consumers’ Attitude and Purchase Intention Towards Skincare Products in Malaysia
Authors: Tew Leh Ghee
Abstract:
The study's goal is to determine how celebrity endorsement affects Malaysian consumers' attitudes and intentions to buy skincare products. Celebrity endorsement is not, in fact, a new phenomenon, since customers now largely rely on it in purchasing decisions in almost every industry. Even though the market for skincare products has vast potential, corporations have yet to seize this niche via celebrity endorsement, and little research has been done to establish the significance of celebrity endorsement in this industry. This research combined descriptive and quantitative methods, with a self-administered survey as the primary data-gathering tool. All of the characteristics under study were measured using a 5-point Likert scale, and the questionnaire was written in English. A convenience sampling method was used to choose respondents, and 360 sets of valid questionnaires were gathered for the study's statistical analysis. Preliminary statistical analyses were conducted using SPSS version 20.0 (Statistical Package for the Social Sciences). The respondents' demographic background was examined using descriptive analysis. The validity and reliability of all construct assessments were examined using exploratory factor analysis, item-total statistics, and reliability statistics. Pearson correlation and regression analysis were used, respectively, to assess relationships and effects among the variables under study. The research showed that, apart from competence, celebrity endorsements of skincare products in Malaysia had a favorable impact on attitudes and purchase intentions as evaluated by attractiveness and dependability. The research indicated that the most significant element influencing attitude and purchase intention was the credibility of a celebrity endorsement. The study offers implications for potential improvements in celebrity endorsement of skincare goods in Malaysia.
The study's last section includes its limitations and suggestions for future research.
Keywords: trustworthiness, influential, phenomenon, celebrity endorsement
Procedia PDF Downloads 80
428 Additive Weibull Model Using Warranty Claim and Finite Element Analysis Fatigue Analysis
Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate
Abstract:
This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product give insight into its underlying issues and are often used by reliability engineers to build prediction models that forecast the failure rate of parts. But there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product, usually covering only the infant mortality and useful life zones of a bathtub curve. Predicting with warranty data alone therefore does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of the lifetime. For better predictability of the failure rate, one needs to explore the failure rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples until failure, but FEA fatigue analysis can provide the failure rate behavior of a part well beyond the warranty period, in less time and at lower cost. In this work, the authors propose an additive Weibull model which makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets for a part, one with existing warranty claims and the other with fatigue life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two separate sets of Weibull parameters are estimated and combined to form the proposed additive Weibull model for prediction.
Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull
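The additive construction, one Weibull term fitted to early warranty claims and another to wear-out life from FEA fatigue data, amounts to summing two Weibull hazard rates; a minimal sketch, with the shape/scale values as illustrative placeholders rather than the paper's estimates:

```python
def additive_weibull_hazard(t, shape_w, scale_w, shape_f, scale_f):
    """Hazard rate h(t) = h_warranty(t) + h_fatigue(t), each term the
    standard Weibull hazard (shape/scale)*(t/scale)**(shape-1).
    shape_w < 1 gives the decreasing infant-mortality part seen in
    warranty claims; shape_f > 1 gives the increasing wear-out part
    inferred from S-N fatigue life data."""
    h_w = (shape_w / scale_w) * (t / scale_w) ** (shape_w - 1.0)
    h_f = (shape_f / scale_f) * (t / scale_f) ** (shape_f - 1.0)
    return h_w + h_f

# Bathtub shape: high early, low in mid-life, rising again at wear-out
early = additive_weibull_hazard(1.0, 0.5, 100.0, 3.0, 500.0)
mid = additive_weibull_hazard(100.0, 0.5, 100.0, 3.0, 500.0)
late = additive_weibull_hazard(600.0, 0.5, 100.0, 3.0, 500.0)
```

In practice the two parameter pairs would come from the hazard-rate and S-N-curve estimations described in the abstract, not be chosen by hand.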
Procedia PDF Downloads 73
427 Misconception on Multilingualism in Glorious Quran
Authors: Muhammed Unais
Abstract:
The holy Quran is a purely Arabic book, ensured to be free of non-Arabic terms. If it had been revealed in a multilingual way, including various foreign languages besides Arabic, it could easily be misunderstood that the Arabs failed to compile a work responding to the challenge of Allah because they lacked knowledge of the other languages in which the Quran was compiled. Based on the presence of some apparently non-Arabic terms in the Quran, like Istabrq, Saradiq, Rabbaniyyoon, etc., some orientalist scholars argued that the holy Quran is not a book revealed in Arabic. Some Muslim scholars either support or deny the presence of foreign terms in the Quran, but all of them agree that the roots of the words suspected of being non-Arabic come from foreign languages and were assimilated into Arabic with the same usage as in the source language. After this linguistic assimilation occurred and the assimilated non-Arabic words became familiar among the Arabs, the Quran was revealed using these words, while stating that all the words it contains are Arabic, either pure or assimilated. Hence, both opinions about the etymology of these words are right: those who argue for the presence of foreign words are right in that the roots of those words are foreign, and those who argue for their absence are right in that the words were assimilated and became pure Arabic. The possibility of multilingualism in a monolingual book is logically negative, but its significance changes according to time and place. The problem of multilingualism in the Quran is the misconception, raised by some orientalist scholars, that the Arabs were unable to compile a book equal to the Quran not because of any weakness in Arabic but because the Quran was revealed in languages of which they were ignorant. In reality, the Quran was revealed in pure Arabic, the language in which the Arabs were most literate, and all of its words and their meanings were familiar to them.
If one becomes aware of the linguistic and cultural assimilation found in all civilizations and cultural sets, no question remains in this respect. In this paper, the researcher intends to shed light on the possibility of multilingualism in a monolingual book and the debates among scholars on this issue, the foreign terms in the Quran, and the logical justifications, along with the exclusive features of the Quran.
Keywords: Quran, foreign terms, multilingualism, language
Procedia PDF Downloads 397
426 Discussion as a Means to Improve Peer Assessment Accuracy
Authors: Jung Ae Park, Jooyong Park
Abstract:
Writing is an important learning activity that cultivates higher-level thinking. Effective and immediate feedback is necessary to help improve students' writing skills. Peer assessment can be an effective method for writing tasks because it allows students not only to receive quick feedback on their writing but also to examine different perspectives on the same topic. Peer assessment can be practiced frequently and has the advantage of immediate feedback. However, there is controversy about its accuracy. In this study, we tried to demonstrate experimentally how the accuracy of peer assessment could be improved. Participants (n=76) were randomly assigned to groups of 4 members. All participants graded two sets of 4 essays on the same topic. They graded the first set twice, and the second set, the posttest, once. After the first grading of the first set, each group in experimental condition 1 (the discussion group) was asked to discuss the results of the peer assessment and then to grade the essays again. Each group in experimental condition 2 (the reading group) was asked to read an expert's assessment of each essay and then to grade the essays again. In the control group, the participants were asked to grade the 4 essays twice in different orders. Afterwards, all the participants graded the second set of 4 essays. The mean score from the 4 participants was calculated for each essay. The accuracy of the peer assessment was measured by the Pearson correlation with the scores of the expert. The results were analyzed by two-way repeated measures ANOVA. A main effect of grading was observed: grading accuracy improved as the number of grading experiences increased. Analysis of posttest accuracy revealed that the score variations within a group of 4 participants decreased in both the discussion and reading conditions but not in the control condition.
These results suggest that having students discuss their grading together can be an efficient means to improve peer assessment accuracy. By discussing, students can learn from others what to consider in grading and whether their grading is too strict or too lenient. Further research is needed to identify the exact cause of the improvement in grading accuracy.
Keywords: peer assessment, evaluation accuracy, discussion, score variations
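The accuracy measure used in the study, the Pearson correlation between a group's mean essay scores and the expert's scores on the same essays, can be sketched as follows; the scores in the usage example are invented for illustration:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists,
    e.g. mean peer scores per essay vs. the expert's scores per essay."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Peer means that scale linearly with the expert's scores give r = 1
peer_means = [3.25, 4.0, 2.5, 4.75]   # mean of 4 raters per essay (invented)
expert = [60.0, 75.0, 45.0, 90.0]     # expert scores (invented)
```

With only 4 essays per set, a single discrepant grade moves r substantially, which is one reason the study also examines within-group score variations.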
Procedia PDF Downloads 267
425 Biopolitical Border Imagery during the European Migrant Crisis: A Comparative Discourse Analysis between Mediterranean Europe and the Balkans
Authors: Mira Kaneva
Abstract:
The ongoing polemic around the migration crisis opens the debate on the ambivalent essence of borders, given both the legality and the legitimacy of the displacement of vast masses of people across the European continent. In neoliberal terms, migration is seen as an economic opportunity or, on the contrary, as a social disparity; in realist terms, it is regarded as a security threat that calls for mobilization; from a critical standpoint, it is a matter of discourse on democratic governance. This paper sets the objective of analyzing borders through the Foucauldian prism of biopolitics. It aims at defining the specifics of the management of the human body, which produces both the irregular migrant as a subject (but prevalently as an object in the discourse) and political subjectivity through the exercise of state power in repressive practices, including hate speech. The study relies on the conceptual framework of Bigo, Agamben, and Huysmans, among others, and applies the methodology of qualitative comparative analysis to the cases of borders (fences, enclaves, camps, and other forms of abnormal spatiality) in Italy, Spain, Greece, the Republic of Macedonia, Serbia, and Bulgaria. The paper thus tries to throw light on these cross- and intra-regional contexts, which share certain similarities and differences. It argues that the governmentality of the masses of refugees and economic immigrants, through the speech acts of their exclusion, leads to a temporary populist backlash; a tentative finding is that the status quo in terms of social and economic measures remains relatively balanced, whereas values such as freedom, openness, and tolerance are consecutively marginalized.
Keywords: Balkans, biopolitical borders, cross- and intra-regional discourse analysis, irregular migration, Mediterranean Europe, securitization vs. humanitarianism
Procedia PDF Downloads 214
424 GNSS-Aided Photogrammetry for Digital Mapping
Authors: Muhammad Usman Akram
Abstract:
This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site to be used in future planning and development (P&D), or for further examination, exploration, research, and inspection. Surveying and mapping in hard-to-access and hazardous areas are very difficult with traditional techniques and methodologies; they are also time-consuming and labor-intensive, and deliver less precision with limited data. In comparison, the advanced techniques require less manpower and provide more precise output with a wide variety of data sets. In this experiment, aerial photogrammetry is used: a UAV flies over an area, captures geocoded images, and a three-dimensional model (3-D model) is built. The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as ground control points (GCPs) using a Differential Global Positioning System (DGPS) in PPK or RTK mode. The raw data collected by the UAV and DGPS are then processed in digital image processing programs and computer-aided design software, from which we obtain a dense point cloud, a digital elevation model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto; the DEM is further converted into a digital terrain model (DTM) for contour generation or a digital surface. As a result, we obtain a digital map of the surveyed area. In conclusion, we compared the processed data with exact measurements taken on site; the error is accepted if it does not breach the survey accuracy limits set by the institutions concerned.
Keywords: photogrammetry, post processing kinematics, real time kinematics, manual data inquiry
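The ground sampling distance (GSD) parameter used in the flight planning above follows from simple camera geometry; the sensor figures in the example are typical small-UAV values chosen for illustration, not the project's hardware:

```python
def ground_sampling_distance(sensor_width_mm, focal_length_mm,
                             altitude_m, image_width_px):
    """GSD in cm/pixel:
    GSD = (sensor width * flight altitude) / (focal length * image width).
    Smaller GSD means finer ground detail captured per pixel."""
    return (sensor_width_mm * altitude_m * 100.0) / (
        focal_length_mm * image_width_px)

# e.g. a 13.2 mm sensor, 8.8 mm lens, 5472 px wide image, flown at 100 m
gsd = ground_sampling_distance(13.2, 8.8, 100.0, 5472)
```

Flight planners invert the same relation to pick the altitude that achieves a target GSD for the map product.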
Procedia PDF Downloads 32