Search results for: automated assembly
1032 Transformer-Driven Multi-Category Classification for an Automated Academic Strand Recommendation Framework
Authors: Ma Cecilia Siva
Abstract:
This study introduces a Bidirectional Encoder Representations from Transformers (BERT)-based machine learning model aimed at improving educational counseling by automating the process of recommending academic strands for students. The framework is designed to streamline and enhance the strand selection process by analyzing students' profiles and suggesting suitable academic paths based on their interests, strengths, and goals. Data was gathered from a sample of 200 grade 10 students, which included personal essays and survey responses relevant to strand alignment. After thorough preprocessing, the text data was tokenized, label-encoded, and input into a fine-tuned BERT model set up for multi-label classification. The model was optimized for balanced accuracy and computational efficiency, featuring a multi-category classification layer with sigmoid activation for independent strand predictions. Performance metrics showed an F1 score of 88%, indicating a well-balanced model with precision at 80% and recall at 100%, demonstrating its effectiveness in providing reliable recommendations while reducing irrelevant strand suggestions. To facilitate practical use, the final deployment phase created a recommendation framework that processes new student data through the trained model and generates personalized academic strand suggestions. This automated recommendation system presents a scalable solution for academic guidance, potentially enhancing student satisfaction and alignment with educational objectives. The study's findings indicate that expanding the data set, integrating additional features, and refining the model iteratively could improve the framework's accuracy and broaden its applicability in various educational contexts.
Keywords: tokenized, sigmoid activation, transformer, multi-category classification
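The reported metrics are internally consistent: with precision P = 0.80 and recall R = 1.00, F1 = 2PR/(P + R) ≈ 0.89. A minimal sketch of the independent per-strand sigmoid decision and the metric follows; the 0.5 decision threshold is an assumption, not stated in the abstract.

```python
import math

def sigmoid(x: float) -> float:
    """Logistic function giving an independent probability per strand."""
    return 1.0 / (1.0 + math.exp(-x))

def predict_strands(logits, threshold=0.5):
    """Multi-label decision: each strand is accepted independently,
    unlike softmax, which would force a single mutually exclusive choice."""
    return [sigmoid(z) >= threshold for z in logits]

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# With the precision (80%) and recall (100%) reported in the abstract:
f1 = f1_score(0.80, 1.00)   # ≈ 0.889, consistent with the reported ~88% F1
```

Note that with sigmoid activation a student can plausibly be matched to more than one strand at once, which is the point of the multi-label setup described above.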
Procedia PDF Downloads 9
1031 An Automated Magnetic Dispersive Solid-Phase Extraction Method for Detection of Cocaine in Human Urine
Authors: Feiyu Yang, Chunfang Ni, Rong Wang, Yun Zou, Wenbin Liu, Chenggong Zhang, Fenjin Sun, Chun Wang
Abstract:
Cocaine is the most frequently used illegal drug globally, with the global annual prevalence of cocaine use ranging from 0.3% to 0.4% of the adult population aged 15–64 years. The growing consumption of cocaine and the associated drug crimes are a great concern; urine sample testing has therefore become an important noninvasive sampling method, as cocaine and its metabolites (COCs) are usually present in urine in high concentrations and with relatively long detection windows. However, direct analysis of urine samples is not feasible because the complex urine matrix often causes low sensitivity and selectivity in the determination. On the other hand, the presence of low doses of analytes in urine makes an extraction and pretreatment step important before determination. Especially in group drug-taking cases, the pretreatment step becomes more tedious and time-consuming. Developing a sensitive, rapid and high-throughput method for detection of COCs in the human body is therefore indispensable for law enforcement officers, treatment specialists and health officials. In this work, a new automated magnetic dispersive solid-phase extraction (MDSPE) sampling method followed by high performance liquid chromatography-mass spectrometry (HPLC-MS) was developed for quantitative enrichment of COCs from human urine, using prepared magnetic nanoparticles as adsorbents. The nanoparticles were prepared by silanizing magnetic Fe3O4 nanoparticles and modifying them with divinyl benzene and vinyl pyrrolidone, which gives them the ability to specifically adsorb COCs. This kind of magnetic particle facilitates the pretreatment steps through electromagnetically controlled extraction, achieving full automation. The proposed device significantly improved sample preparation efficiency, processing 32 samples in one batch within 40 min.
Optimization of the preparation procedure for the magnetic nanoparticles was explored, and the performance of the nanoparticles was characterized by scanning electron microscopy, vibrating sample magnetometry and infrared spectroscopy. Several analytical experimental parameters were studied, including the amount of particles, adsorption time, elution solvent, and extraction and desorption kinetics, and the proposed method was verified. The limits of detection for cocaine and its metabolites were 0.09–1.1 ng·mL⁻¹, with recoveries ranging from 75.1% to 105.7%. Compared to traditional sampling methods, this method is time-saving and environmentally friendly. It was confirmed that the proposed automated method is a highly effective way to analyze trace cocaine and cocaine metabolites in human urine.
Keywords: automatic magnetic dispersive solid-phase extraction, cocaine detection, magnetic nanoparticles, urine sample testing
Procedia PDF Downloads 204
1030 Numerical Investigation of the Operating Parameters of the Vertical Axis Wind Turbine
Authors: Zdzislaw Kaminski, Zbigniew Czyz, Tytus Tulwin
Abstract:
This paper describes the geometrical model, algorithm and CFD simulation of the airflow around a Vertical Axis Wind Turbine rotor. The solver ANSYS Fluent was applied for the numerical simulation. Numerical simulation, unlike experiments, enables us to validate design assumptions without the costly preparation of a model or a prototype for a bench test. This research focuses on the rotor designed according to patent no. PL 219985, whose blades are capable of modifying their working surfaces, i.e. the surfaces absorbing wind kinetic energy. The operation of this rotor is based on regulating the blade angle α between the top and bottom parts of the blades mounted on an axis. If angle α increases, the working surface which absorbs wind kinetic energy also increases. CFD calculations enable us to compare the aerodynamic characteristics of forces acting on the rotor working surfaces and to specify rotor operation parameters such as torque or turbine assembly power output. This paper is part of research to improve the efficiency of the rotor assembly, and it investigates the impact of the blade angle of the working blades on the power output as a function of rotor torque, specific rotational speed and wind speed. The simulation was made for wind speeds ranging from 3.4 m/s to 6.2 m/s and blade angles of 30°, 60° and 90°. The simulation enables us to create a mathematical model describing how the aerodynamic forces acting on each blade of the studied rotor are generated. The simulation results are also compared with wind tunnel results. This investigation enables us to estimate the growth in turbine power output when the blade angle changes. Regulating blade angle α enables a smooth change in turbine rotor power, which serves as a safety measure when the wind is strong. Decreasing blade angle α reduces the risk of damaging or destroying a turbine that is still in operation, without the complete rotor braking required in Horizontal Axis Wind Turbines.
This work has been financed by the Polish Ministry of Science and Higher Education.
Keywords: computational fluid dynamics, mathematical model, numerical analysis, power, renewable energy, wind turbine
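The link between the rotor torque and the power output discussed above is direct: shaft power is the product of torque and angular velocity. A short sketch of that conversion; the operating point below is hypothetical, not taken from the paper.

```python
import math

def shaft_power(torque_nm: float, rpm: float) -> float:
    """Mechanical shaft power P = T * omega, with omega = 2*pi*n/60."""
    omega = 2.0 * math.pi * rpm / 60.0  # rotational speed in rad/s
    return torque_nm * omega

# Hypothetical operating point for illustration: 12 N·m at 120 rpm.
p = shaft_power(12.0, 120.0)  # ≈ 150.8 W
```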
Procedia PDF Downloads 338
1029 Optimization Process for Ride Quality of a Nonlinear Suspension Model Based on Newton-Euler’ Augmented Formulation
Authors: Mohamed Belhorma, Aboubakar S. Bouchikhi, Belkacem Bounab
Abstract:
This paper addresses the modeling of a Double A-Arm suspension: a three-dimensional nonlinear model has been developed using the multibody systems formalism. A dynamical study of the responses of the different components was carried out, particularly for the wheel assembly. To validate those results, the system was constructed and simulated in RecurDyn, a professional multibody dynamics simulation software. The model has been used as the objective function in an optimization algorithm for ride quality improvement.
Keywords: double A-Arm suspension, multibody systems, ride quality optimization, dynamic simulation
Procedia PDF Downloads 138
1028 Nanobiosensor System for Aptamer Based Pathogen Detection in Environmental Waters
Authors: Nimet Yildirim Tirgil, Ahmed Busnaina, April Z. Gu
Abstract:
Environmental waters are monitored worldwide to protect people from infectious diseases primarily caused by enteric pathogens. Escherichia coli (E. coli) has long served as a good indicator of potential enteric pathogens in waters. Thus, a rapid and simple detection method for E. coli is very important for predicting pathogen contamination. In this study, to the best of our knowledge for the first time, we developed a rapid, direct and reusable SWCNT (single-walled carbon nanotube) based biosensor system for sensitive and selective E. coli detection in water samples. We use a novel, newly developed flexible biosensor device fabricated by a high-rate nanoscale offset printing process using directed assembly and transfer of SWCNTs. By simple directed assembly and non-covalent functionalization, an aptamer-based SWCNT biosensor system was designed (the aptamer is the biorecognition element that specifically distinguishes the E. coli O157:H7 strain from other pathogens) and further evaluated for environmental applications with simple and cost-effective steps. The two gold electrode terminals and the SWCNT bridge between them allow continuous resistance response monitoring for E. coli detection. The detection procedure is based on a competitive mode: a known concentration of aptamer and E. coli cells were mixed and, after a certain time, filtered, and the remaining free aptamers were injected into the system. Through hybridization of the free aptamers with their probe DNA immobilized on the SWCNT surface (complementary DNA for the E. coli aptamer), we can monitor the resistance difference, which is proportional to the amount of E. coli. Thus, we can detect E. coli without injecting it directly onto the sensing surface, protecting the electrode surface from the aggregation of target bacteria or other pollutants that may come from real wastewater samples. After optimization experiments, the linear detection range was determined as 2 cfu/ml to 10⁵ cfu/ml, with an R² value higher than 0.98.
The system was regenerated successfully with 5% SDS solution over 100 times without any significant deterioration of sensor performance. The developed system had high specificity towards E. coli (less than 20% signal with other pathogens), and it could be applied to real water samples with 86 to 101% recovery and 3 to 18% CV values (n=3).
Keywords: aptamer, E. coli, environmental detection, nanobiosensor, SWCNTs
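Since the resistance difference is reported to track the E. coli load across a range spanning several decades (2 to 10⁵ cfu/ml), such biosensor signals are commonly calibrated log-linearly against cell count. A sketch of that forward model and its inversion; the slope and intercept are invented for illustration and are not from the study.

```python
import math

# Hypothetical log-linear calibration: dR = SLOPE * log10(cfu/ml) + INTERCEPT.
SLOPE = 10.0       # ohms per decade of cell count (assumed, not from the paper)
INTERCEPT = 5.0    # ohms (assumed)

def resistance_change(cfu_per_ml: float) -> float:
    """Forward model of the competitive-mode signal versus cell count."""
    return SLOPE * math.log10(cfu_per_ml) + INTERCEPT

def estimate_cfu(delta_r: float) -> float:
    """Invert the calibration to estimate cell count from a measured signal."""
    return 10 ** ((delta_r - INTERCEPT) / SLOPE)

# Round trip over the reported linear range (2 to 1e5 cfu/ml):
est = estimate_cfu(resistance_change(1e3))   # recovers 1000 cfu/ml
```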
Procedia PDF Downloads 197
1027 Revolutionizing Autonomous Trucking Logistics with Customer Relationship Management Cloud
Authors: Sharda Kumari, Saiman Shetty
Abstract:
Autonomous trucking is just one of the numerous significant shifts impacting fleet management services. The Society of Automotive Engineers (SAE) has defined six levels of vehicle automation that have been adopted internationally, including by the United States Department of Transportation. On public highways in the United States, organizations are testing driverless vehicles with at least Level 4 automation which indicates that a human is present in the vehicle and can disable automation, which is usually done while the trucks are not engaged in highway driving. However, completely driverless vehicles are presently being tested in the state of California. While autonomous trucking can increase safety, decrease trucking costs, provide solutions to trucker shortages, and improve efficiencies, logistics, too, requires advancements to keep up with trucking innovations. Given that artificial intelligence, machine learning, and automated procedures enable people to do their duties in other sectors with fewer resources, CRM (Customer Relationship Management) can be applied to the autonomous trucking business to provide the same level of efficiency. In a society witnessing significant digital disruptions, fleet management is likewise being transformed by technology. Utilizing strategic alliances to enhance core services is an effective technique for capitalizing on innovations and delivering enhanced services. Utilizing analytics on CRM systems improves cost control of fuel strategy, fleet maintenance, driver behavior, route planning, road safety compliance, and capacity utilization. Integration of autonomous trucks with automated fleet management, yard/terminal management, and customer service is possible, thus having significant power to redraw the lines between the public and private spheres in autonomous trucking logistics.
Keywords: autonomous vehicles, customer relationship management, customer experience, autonomous trucking, digital transformation
Procedia PDF Downloads 108
1026 D-Lysine Assisted 1-Ethyl-3-(3-Dimethylaminopropyl)Carbodiimide / N-Hydroxy Succinimide Initiated Crosslinked Collagen Scaffold with Controlled Structural and Surface Properties
Authors: G. Krishnamoorthy, S. Anandhakumar
Abstract:
The effect of D-Lysine (D-Lys) on collagen cross-linking initiated with 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide (EDC)/N-hydroxysuccinimide (NHS) is evaluated using experimental and modelling tools. The results for the Coll-D-Lys-EDC/NHS scaffold indicate an increase in tensile strength (TS), percentage of elongation (%E) and denaturation temperature (Td), and a decrease in the decomposition rate, compared to L-Lys-EDC/NHS. Scanning electron microscopy (SEM) and atomic force microscopy (AFM) analyses revealed a well-ordered scaffold with a properly oriented and well-aligned structure. D-Lys stabilizes the scaffold against degradation by collagenase better than L-Lys. The cell assay showed more than 98% fibroblast (NIH3T3) viability and improved cell adhesion and protein adsorption after 72 h of culture when compared with the native scaffold. Cell attachment after 74 h was robust, with cytoskeletal analysis showing that the attached cells were aligned along the fibers in a spindle-shaped appearance. Gene expression analyses revealed no apparent alterations in mRNA levels, and cell proliferation was not adversely affected. D-Lysine plays a pivotal role in the self-assembly and conformation of collagen fibrils. D-Lys-assisted EDC/NHS-initiated cross-linking induces the formation of a carboxamide by activation of the side-chain -COOH group, followed by aminolysis of the O-acylisourea intermediates; the -NH2 groups are directly joined via an isopeptide bond. This leads to the formation of intra- and inter-helical cross-links. Modeling studies indicated that D-Lys binds with a collagen-like peptide (CLP) through multiple H-bonding and hydrophobic interactions. Orientational changes in collagenase on CLP-D-Lys are observed, which may decrease its accessibility to degradation and stabilize CLP against the action of the former.
D-Lys has the lowest binding energy and improved fibrillar assembly and staggered alignment without undesired structural stiffness or aggregation. The proteolytic machinery is not as well equipped to deal with the Coll-D-Lys scaffold as with Coll-L-Lys. The information derived from the present study could help in designing collagenolytically stable heterochiral collagen-based scaffolds for biomedical applications.
Keywords: collagen, collagenase, collagen like peptide, D-lysine, heterochiral collagen scaffold
Procedia PDF Downloads 392
1025 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis
Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García
Abstract:
Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. 
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automated by programs that propose coding paradigms based on the calculated entities and their relationships. 
Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis
Procedia PDF Downloads 226
1024 Innovative Screening Tool Based on Physical Properties of Blood
Authors: Basant Singh Sikarwar, Mukesh Roy, Ayush Goyal, Priya Ranjan
Abstract:
This work combines two bodies of knowledge: the biomedical basis of blood stain formation, and the fluid mechanics community's insight that such stain formation depends heavily on physical properties. Moreover, biomedical research shows that different patterns in blood stains are robust indicators of the blood donor's health or lack thereof. Based on these valuable insights, an innovative screening tool is proposed which can act as an aid in the diagnosis of diseases such as anemia, hyperlipidaemia, tuberculosis, blood cancer, leukemia and malaria, with enhanced confidence in the proposed analysis. To realize this technique, simple, robust and low-cost micro-fluidic devices, a micro-capillary viscometer and a pendant drop tensiometer are designed and proposed to be fabricated to measure the viscosity, surface tension and wettability of various blood samples. Once prognosis and diagnosis data have been generated, automated linear and nonlinear classifiers are applied for automated reasoning and presentation of results. A support vector machine (SVM) classifies the data linearly. Discriminant analysis and nonlinear embeddings are coupled with nonlinear manifold detection in the data, and decisions are made accordingly. In this way, physical properties can be used, via linear and non-linear classification techniques, for screening of various diseases in humans and cattle. Experiments are carried out to validate the physical-property measurement devices. This framework can be further developed into a portable real-life disease screening cum diagnostics tool. Small-scale production of screening cum diagnostic devices is proposed to carry out independent tests.
Keywords: blood, physical properties, diagnostic, nonlinear, classifier, device, surface tension, viscosity, wettability
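The linear classification step described above can be illustrated with a minimal linear learner. The sketch below uses a perceptron as a stand-in for the SVM (both learn a linear decision boundary, though an SVM additionally maximizes the margin); all measurements, labels and the feature scaling are invented for illustration.

```python
def standardize(data):
    """Zero-mean, unit-variance scaling per feature (population std)."""
    cols = list(zip(*data))
    means = [sum(c) / len(c) for c in cols]
    stds = [(sum((v - m) ** 2 for v in c) / len(c)) ** 0.5
            for c, m in zip(cols, means)]
    return [[(v - m) / s for v, m, s in zip(row, means, stds)]
            for row in data]

def train_perceptron(samples, labels, epochs=50, lr=0.1):
    """Learn a linear boundary w·x + b = 0; labels are -1 / +1."""
    w, b = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Update only on misclassified (or boundary) points.
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def classify(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Invented (viscosity mPa·s, surface tension mN/m) pairs; labels illustrative only.
raw = [[3.0, 55.0], [3.2, 54.0], [5.5, 48.0], [6.0, 47.0]]
labels = [-1, -1, 1, 1]     # -1 = "not flagged", +1 = "flagged for screening"
X = standardize(raw)
w, b = train_perceptron(X, labels)
```

Standardizing the two physical properties first matters because they live on very different numeric scales, which would otherwise dominate the linear decision.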
Procedia PDF Downloads 376
1023 Evaluation of Zr/NH₄ClO₄ and Zr/KClO₄ Compositions for Development of Igniter for Ammonium Perchlorate and Hydroxyl-Terminated Polybutadiene Based Base Bleed System
Authors: Amir Mukhtar, Habib Nasir
Abstract:
To achieve an enhanced range for large calibre artillery, a base bleed unit equipped with an ammonium perchlorate and hydroxyl-terminated polybutadiene (AP/HTPB) based composite propellant grain is installed at the bottom of a projectile; it produces a jet of hot gases and reduces base drag during the projectile's flight. Upon leaving the muzzle at very high muzzle velocity, the propellant grain gets quenched due to the sudden pressure drop. Therefore, the base bleed unit is equipped with an igniter to ensure ignition as well as reignition of the propellant grain. Pyrotechnic compositions based on Zr/NH₄ClO₄ and Zr/KClO₄ mixtures have been studied for the effect of fuel/oxidizer ratio and oxidizer type on ballistic properties. The calorific values of the mixtures were investigated by bomb calorimeter, the average burning rate was measured by the fuse wire technique at ambient conditions, and a high-pressure closed vessel was used to record the pressure-time profile, maximum pressure achieved (Pmax), time to achieve Pmax and differential pressure (dP/dt). It was observed that Zr content of 30, 40, 50 and 60 wt.% had a very significant effect on the ballistic properties of the mixtures. Compositions with NH₄ClO₄ produced higher values of Pmax, dP/dt and calorific value compared to the Zr/KClO₄ based mixtures. Compositions containing KClO₄ produced comparatively higher burning rates, and the maximum burning rate of 8.30 mm/s was recorded with 60 wt.% Zr in the Zr/KClO₄ pyrotechnic mixture. Zr/KClO₄ with 50 wt.% Zr was test fired in an igniter assembly by the electric initiation method. The igniter assembly was test fired several times, and an average burning time of 3.5 s with an igniter mass burning rate of 6.85 g/s was recorded. The igniter was finally fired at static and dynamic levels with the base bleed unit, which ignited the base bleed grain successfully, and extended range was achieved with a 155 mm artillery projectile.
Keywords: base bleed, closed vessel, igniter, zirconium
Procedia PDF Downloads 165
1022 Analysis of the Operating Load of Gas Bearings in the Gas Generator of the Turbine Engine during a Deceleration to Dash Maneuver
Authors: Zbigniew Czyz, Pawel Magryta, Mateusz Paszko
Abstract:
The paper discusses the loads acting on the drive unit of an unmanned helicopter during a deceleration to dash maneuver. Special attention was given to the loads on the bearings in the gas generator of the turbine engine with which the helicopter will be equipped. The analysis was based on speed changes as a function of time for a manned flight of the PZL W3-Falcon helicopter. The speed change during the flight was approximated by the least squares method, and the corresponding acceleration was then determined. This enabled us to specify the forces acting on the bearings of the gas generator in static and dynamic conditions. The deceleration to dash maneuver starts in steady flight at a speed of 222 km/h with horizontal braking and acceleration. When the speed reaches 92 km/h, the helicopter's inclination dynamically changes to maximum acceleration with near-maximum power, held until the initial speed is reached again. This type of maneuver is used because shots are ineffective at significant cruising speeds; it is therefore important to reduce speed to the optimum as soon as possible and, after taking a shot, to return to the initial (cruising) speed. In deceleration to dash maneuvers, we have to deal with the force of gravity of the rotor assembly, aerodynamic gas forces, and the forces caused by axial acceleration during the maneuver. While we can assume that the working components of the gas generator are designed so that the axial gas forces they create balance the aerodynamic effects, the remaining forces take values that result from the motion profile of the aircraft. Based on the analysis, we can compile the results. For this maneuver, the force of gravity (from the static calculations) equals 5.638 N for bearing A and 1.631 N for bearing B. As the overload coefficient k in this direction is 1, this force results solely from the weight of the rotor assembly.
For this maneuver, the acceleration in the longitudinal direction reached a_max = 4.36 m/s². The overload coefficient k is therefore 0.44. When we multiply the overload coefficient k by the weight of all gas generator components that act on the axial bearing, the force caused by axial acceleration during the deceleration to dash maneuver equals only 3.15 N. The results of the calculations are compared with those for other maneuvers, such as acceleration and deceleration and jump up and jump down maneuvers. This work has been financed by the Polish Ministry of Science and Higher Education.
Keywords: gas bearings, helicopters, helicopter maneuvers, turbine engines
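The arithmetic above is simply k = a/g, with the bearing load from axial acceleration equal to k times the weight (gravity force) of the supported components. A short sketch; the component weight of 7.09 N is back-calculated from the reported 3.15 N figure and is an assumption, not a value stated in the abstract.

```python
G = 9.81  # standard gravitational acceleration, m/s^2

def overload_coefficient(a_long: float) -> float:
    """Longitudinal overload coefficient k = a / g."""
    return a_long / G

def axial_inertial_force(k: float, weight_force_n: float) -> float:
    """Axial bearing load: overload coefficient times the weight (gravity
    force) of the gas generator components acting on that bearing."""
    return k * weight_force_n

k = overload_coefficient(4.36)      # ≈ 0.44, matching the abstract
# The 7.09 N component weight below is back-calculated for illustration only:
f = axial_inertial_force(k, 7.09)   # ≈ 3.15 N, the reported value
```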
Procedia PDF Downloads 340
1021 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game
Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin
Abstract:
Despite the frequently criticized disadvantages of traditionally used paper-and-pencil assessment, it is the most frequently used method in our schools. Although such assessments provide acceptable measurement, they are not capable of measuring all the aspects and the richness of learning and knowledge. Also, many assessments used in schools decontextualize assessment from learning, and they focus on learners’ standing on a particular topic but not on how student learning changes over time. For these reasons, many scholars advocate that using simulations and games (S&G) as assessment tools has significant potential to overcome the problems of traditionally used methods. S&G can benefit from changes in technology and provide a contextualized medium for assessment and teaching. Furthermore, S&G can serve as an instructional tool rather than a method to test students’ learning at a particular time point. To investigate the potential of using educational games as assessment and teaching tools, this study presents the implementation and validation of an automated embedded assessment (AEA), which can constantly monitor student learning in the game and assess performance without interrupting learning. The experiment was conducted in an undergraduate-level engineering course (Digital Circuit Design) with 99 participating students over a period of five weeks in the Spring 2016 semester. The purpose of this research study is to examine whether the proposed AEA method is valid for assessing student learning in a 3D educational game, and to present the implementation steps. To address this question, this study inspects three aspects of the AEA for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment can measure the targeted latent constructs.
Finally, the scores of the assessment were compared with an external measure (a validated test measuring student learning on digital circuit design) to evaluate the convergent validity of the assessment. The results of the confirmatory factor analysis showed that the fit of the model with three latent factors and one higher-order factor was acceptable (RMSEA < 0.00, CFI = 1, TLI = 1.013, WRMR = 0.390). All of the observed variables loaded significantly on the latent factors in the latent factor model. In the second analysis, a multiple regression analysis was used to test whether the external measure significantly predicts students’ performance in the game. The results of the regression indicated that the two predictors explained 36.3% of the variance (R² = .36, F(2,96) = 27.42.56, p < .00). It was found that students’ posttest scores significantly predicted game performance (β = .60, p < .000). The statistical results of the analyses show that the AEA can distinctly measure three major components of the digital circuit design course. We hope this study can help researchers understand how to design an AEA, and it showcases an implementation by providing an example methodology for validating this type of assessment.
Keywords: educational video games, automated embedded assessment, assessment validation, game-based assessment, assessment design
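The convergent-validity check above rests on regressing game performance on the external posttest and reading off the share of variance explained (R²). A minimal sketch of that computation for the single-predictor case, with invented scores rather than the study's data:

```python
def ols_fit(x, y):
    """Simple least-squares line y ≈ slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def r_squared(x, y, slope, intercept):
    """Proportion of variance in y explained by the fitted line."""
    my = sum(y) / len(y)
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1.0 - ss_res / ss_tot

# Toy posttest scores vs. in-game performance (invented numbers):
posttest = [55, 62, 70, 74, 81, 90]
game     = [40, 48, 55, 60, 66, 75]
slope, intercept = ols_fit(posttest, game)
r2 = r_squared(posttest, game, slope, intercept)   # near 1 for this toy data
```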
Procedia PDF Downloads 422
1020 Automated Evaluation Approach for Time-Dependent Question Answering Pairs on Web Crawler Based Question Answering System
Authors: Shraddha Chaudhary, Raksha Agarwal, Niladri Chatterjee
Abstract:
This work demonstrates a web crawler-based, generalized, end-to-end open domain Question Answering (QA) system. An efficient QA system requires a significant amount of domain knowledge to answer any question, with the aim of finding an exact and correct answer in the form of a number, a noun, a short phrase, or a brief piece of text for the user's question. Analysis of the question, searching the relevant documents, and choosing an answer are three important steps in a QA system. This work uses a web scraper (Beautiful Soup) to extract K documents from the web. The value of K can be calibrated as a trade-off between time and accuracy. This is followed by a passage-ranking process using the MS MARCO dataset, trained on 500K queries, to extract the most relevant text passages and shorten the lengthy documents. Further, a QA system is used to extract the answers from the shortened documents based on the query and return the top 3 answers. For the evaluation of such systems, accuracy is judged by the exact match between predicted answers and gold answers. But automatic evaluation methods fail due to the linguistic ambiguities inherent in the questions. Moreover, reference answers are often not exhaustive or are out of date; hence, correct answers predicted by the system are often judged incorrect according to the automated metrics. One such scenario arises from the original Google Natural Questions (GNQ) dataset, which was collected and made available in the year 2016. Any such dataset proves to be inefficient with respect to questions that have time-varying answers. For illustration, consider the query "Where will be the next Olympics?" The gold answer for this query as given in the GNQ dataset is "Tokyo". Since the dataset was collected in the year 2016, and the next Olympics after 2016 were in 2020 in Tokyo, this is absolutely correct. But if the same question is asked in 2022, then the answer is "Paris, 2024".
Consequently, any evaluation based on the GNQ dataset will be incorrect. Such erroneous predictions are usually given to human evaluators for further validation, which is quite expensive and time-consuming. To address this erroneous evaluation, the present work proposes an automated approach for evaluating time-dependent question-answer pairs. In particular, it proposes a metric using the current timestamp along with the top-n predicted answers from a given QA system. To test the proposed approach, the GNQ dataset was used, and the system achieved an accuracy of 78% on a test dataset comprising 100 QA pairs. This test data was automatically extracted using an analysis-based approach from 10K QA pairs of the GNQ dataset. The results obtained are encouraging. The proposed technique appears to have the possibility of developing into a useful scheme for gathering precise, reliable, and specific information in a real-time and efficient manner. Our subsequent experiments will be directed towards establishing the efficacy of the above system for a larger set of time-dependent QA pairs.
Keywords: web-based information retrieval, open domain question answering system, time-varying QA, QA evaluation
Procedia PDF Downloads 101
1019 A Comparative Study of the Alternatives to Land Acquisition: India
Authors: Aparna Soni
Abstract:
The much-celebrated story of Indian cities as engines driving the growth of India has been scrutinized and found to have serious consequences. A wide spectrum of scholarship has brought to light the un-equalizing effects and the need to adopt a rights-based approach to development planning in India. Notably, these concepts and discourses ubiquitously entail the study of land struggles in the making of the urban. In fact, the very progression from the theory of primitive accumulation to accumulation by dispossession, followed by ‘dispossession without development,’ thereafter ‘development without dispossession,’ and now ‘dispossession by financialization’ (noticeably, the last three developing in a span of a mere three decades), is evidence enough to trace the centrality and evolving role of land in the making of urban India. India, in the last decade, has seen its regional governments actively experimenting with alternative models of land assembly (the Amaravati and Delhi land pooling models being the loudly advertised ones). These are publicized as a replacement for the land acquisition act of 2013, presumed to be cost- and time-antagonistic and prone to litigation. It has been observed that most of the literature treats these models as a generic large bracket of land expropriation and does not, in particular, try to analyse them differentially to find granular patterns in these alternatives. To cater to this gap, this research comparatively studies these alternative land assembly models. It categorises them based on their basic architecture, spatial and sectoral application, and governance frameworks. It is found that these alternatives are ad-hoc and fragmented pieces of legislation. These are fit-for-profit models commodifying land to ease its access by the private sector for real-estate-led growth. The research augments the literature on the privatization of land use planning in India.
Further, it attempts to discuss the increasing role a landowner is expected to play in the future and suggests a way forward to safeguard them from market risks. The study involves a thematic analysis of the policy elements contained in legislative and policy documents, notifications, and office orders. The study also draws on widely circulated print media reports. Given the present field-visit limitations, the study relies on documents accessed open-source in the public domain.
Keywords: commodification, dispossession, land acquisition, landowner
Procedia PDF Downloads 166
1018 Insights into The Oversight Functions of The Legislative Power Under The Nigerian Constitution
Authors: Olanrewaju O. Adeojo
Abstract:
The constitutional system of government provides for the federating units of the Federal Republic of Nigeria, the States, and the Local Councils under a governing structure of the Executive, the Legislature, and the Judiciary, with attendant distinct powers and spheres of influence. The legislative powers of the Federal Republic of Nigeria and of a State are vested in the National Assembly and the House of Assembly of the State, respectively. The Local Council exercises legislative powers in clearly defined matters as provided by the Constitution. Though the Executive, as constituted by the President and the Governor, is charged with the powers of execution and administration, the Legislature is empowered to ensure that such powers are duly exercised in accordance with the provisions of the Constitution. These vast areas do not make the oversight functions indefinite; more importantly, the purposes for which the powers may be exercised are circumscribed. They include, among others, any matter with respect to which the Legislature has power to make laws. Indeed, the law provides for the competence of the Legislature to procure evidence, examine all persons as witnesses, summon any person to give evidence, and issue a warrant to compel attendance in matters relevant to the subject matter of its investigation. The exercise of the functions envisaged by the Constitution seems, to an extent, to be literal because the Legislature lacks the power to enforce the outcome. Furthermore, the docility of the Legislature is apparent in a situation where the agency or authority being called in to question is part of the branch of government charged with enforcing sanctions. The process allows for cover-up and obstruction of justice. The oversight functions are not functional in a situation where the Executive is overbearing. The friction that ensues between the Legislature and the Executive in an attempt by the former to project the spirit of a constitutional mandate calls for concern.
Needless to state, such a power can easily be frustrated. To an extent, the arm of government with coercive authority seems to overshadow the laid-down functions of the Legislature. Recourse to adjudication by the Judiciary has not proved to be of any serious utility, especially in a clime where the wheels of justice grind slowly, as in Nigeria, due to the nature of the legal system. Consequently, the law and the Constitution, drawing lessons from other jurisdictions, need to insulate legislative oversight from the vagaries of the Executive. A strong and virile Constitutional Court that determines, within a specific timeline, issues pertaining to the oversight functions of the legislative power is apposite.
Keywords: constitution, legislative, oversight, power
Procedia PDF Downloads 130
1017 Inverse Matrix in the Theory of Dynamical Systems
Authors: Renata Masarova, Bohuslava Juhasova, Martin Juhas, Zuzana Sutova
Abstract:
In dynamic system theory, a mathematical model is often used to describe a system's properties. In order to find the transfer matrix of a dynamic system, we need to calculate an inverse matrix. The paper combines the classical theory with the procedures used in the theory of automated control for calculating the inverse matrix. The final part of the paper models the given problem in MATLAB.
Keywords: dynamic system, transfer matrix, inverse matrix, modeling
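As a rough illustration of where the inverse matrix enters (a generic sketch, not the paper's MATLAB code): for a state-space model (A, B, C, D), the transfer matrix evaluated at a complex frequency s is G(s) = C(sI − A)⁻¹B + D. The example system below is an assumption chosen for demonstration.

```python
# Sketch: evaluate the transfer matrix G(s) = C (sI - A)^-1 B + D of a
# state-space model, using a hand-rolled Gauss-Jordan inverse.
# The example system is an assumption, not taken from the paper.

def mat_inv(M):
    """Invert a square (possibly complex) matrix by Gauss-Jordan elimination."""
    n = len(M)
    # augment with the identity matrix
    A = [list(row) + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(M)]
    for col in range(n):
        # partial pivoting for numerical stability
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        p = A[col][col]
        A[col] = [x / p for x in A[col]]
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
    return [row[n:] for row in A]

def transfer_matrix(A, B, C, D, s):
    """Evaluate G(s) = C (sI - A)^-1 B + D at a complex frequency s."""
    n = len(A)
    sIA = [[(s if i == j else 0) - A[i][j] for j in range(n)] for i in range(n)]
    inv = mat_inv(sIA)
    CinvB = [[sum(C[i][k] * sum(inv[k][m] * B[m][j] for m in range(n))
                  for k in range(n))
              for j in range(len(B[0]))] for i in range(len(C))]
    return [[CinvB[i][j] + D[i][j] for j in range(len(D[0]))]
            for i in range(len(D))]

# Example: x'' + 3x' + 2x = u, so G(s) = 1 / (s^2 + 3s + 2)
A = [[0, 1], [-2, -3]]
B = [[0], [1]]
C = [[1, 0]]
D = [[0]]
G = transfer_matrix(A, B, C, D, 1j)
print(G[0][0])  # 1/(s^2+3s+2) at s=1j -> 1/(1+3j) = 0.1-0.3j
```

In MATLAB the same step is a single `inv(s*eye(n) - A)` call; the sketch only makes the role of the inverse explicit.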
Procedia PDF Downloads 516
1016 Synthesis of Novel Metallosurfactants for Drug Delivery
Authors: Fatima Zohra Belghait, Nawal Cheikh, Oscar Palacios, Ramon Barnadas, Pau Bayon
Abstract:
Metalloporphyrins and their derivatives play an important role in different scientific areas due to the tetradentate vacant site in the center that is suitable for metal coordination. Metalosomes (MTS) are supramolecular aggregates (similar to liposomes) generated by the self-assembly of compounds similar to phospholipids (with a polar and a hydrophobic part), but incorporating, as part of their membrane, molecules that contain bound metals. The aim of our work is to synthesise metalosomes containing a cationic amphiphilic porphyrin and its complexes with Fe and Cu to study their therapeutic applications. All synthesized compounds were characterized by dynamic light scattering, elemental analysis, and ultraviolet-visible spectroscopy.
Keywords: metalloporphyrin, amphiphilic porphyrin, metalosomes, supramolecular
Procedia PDF Downloads 4
1015 Micropower Composite Nanomaterials Based on Porous Silicon for Renewable Energy Sources
Authors: Alexey P. Antropov, Alexander V. Ragutkin, Nicolay A. Yashtulov
Abstract:
The original controlled technology for power-active nanocomposite membrane-electrode assembly engineering on the basis of porous silicon is presented. The functional nanocomposites were studied by electron microscopy and cyclic voltammetry methods. The application possibility of the obtained nanocomposites as high-performance renewable energy sources for micro-power electronic devices is demonstrated.
Keywords: cyclic voltammetry, electron microscopy, nanotechnology, platinum-palladium nanocomposites, porous silicon, power activity, renewable energy sources
Procedia PDF Downloads 354
1014 Improving Security Features of Traditional Automated Teller Machines-Based Banking Services via Fingerprint Biometrics Scheme
Authors: Anthony I. Otuonye, Juliet N. Odii, Perpetual N. Ibe
Abstract:
The obvious challenges faced by most commercial bank customers while using the services of ATMs (Automated Teller Machines) across developing countries have triggered the need for an improved system with better security features. Current ATM systems are password-based, and research has proved the vulnerabilities of these systems to heinous attacks and manipulations. We have discovered through research that the security of current ATM-assisted banking services in most developing countries of the world is easily broken and maneuvered by fraudsters, largely because it is quite difficult for these systems to distinguish an impostor with privileged access from the authentic bank account owner. Again, PIN (Personal Identification Number) code passwords are easily guessed, to mention just a few of the obvious limitations of traditional ATM operations. In this research work, we have developed a system of fingerprint biometrics with PIN code authentication that seeks to improve the security features of traditional ATM installations as well as other banking services. The aim is to ensure better security at all ATM installations and raise the confidence of bank customers. It is hoped that our system will overcome most of the challenges of the current password-based ATM operation if properly applied. The researchers made use of OOADM (Object-Oriented Analysis and Design Methodology), a software development methodology that assures proper system design using modern design diagrams. Implementation and coding were carried out using Visual Studio 2010 together with other software tools. Results obtained show a working system that provides two levels of security on the client side, using a fingerprint biometric scheme combined with the existing 4-digit PIN code to guarantee the confidence of bank customers across developing countries.
Keywords: fingerprint biometrics, banking operations, verification, ATMs, PIN code
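The two-level check described above can be sketched as follows. This is a hypothetical illustration, not the authors' Visual Studio implementation: a 4-digit PIN verified against a salted hash, then a fingerprint template match, simulated here as feature-vector similarity.

```python
# Hypothetical sketch of two-factor ATM authentication: salted-hash PIN
# verification plus a toy fingerprint matcher. The account record,
# template format, and 0.90 threshold are assumptions for illustration.
import hashlib, hmac

def hash_pin(pin: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

def fingerprint_match(stored, scanned, threshold=0.90) -> bool:
    """Toy matcher: cosine similarity between minutiae feature vectors."""
    dot = sum(a * b for a, b in zip(stored, scanned))
    norm = (sum(a * a for a in stored) ** 0.5) * (sum(b * b for b in scanned) ** 0.5)
    return norm > 0 and dot / norm >= threshold

def authenticate(account, pin, scanned_fp):
    pin_ok = hmac.compare_digest(hash_pin(pin, account["salt"]), account["pin_hash"])
    fp_ok = fingerprint_match(account["fp_template"], scanned_fp)
    return pin_ok and fp_ok  # both factors must pass

salt = b"demo-salt"
account = {"salt": salt,
           "pin_hash": hash_pin("4321", salt),
           "fp_template": [0.9, 0.1, 0.4, 0.8]}
print(authenticate(account, "4321", [0.88, 0.12, 0.41, 0.79]))  # genuine user: True
print(authenticate(account, "1111", [0.88, 0.12, 0.41, 0.79]))  # wrong PIN: False
```

A real deployment would use a proper minutiae-matching engine and secure element storage; the point here is only that an impostor must defeat both factors.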
Procedia PDF Downloads 43
1013 Video Processing of a Football Game: Detecting Features of a Football Match for Automated Calculation of Statistics
Authors: Rishabh Beri, Sahil Shah
Abstract:
We applied a range of filters and processing steps in order to extract the various features of the football game, such as the field lines of a football field. Another important aspect was the detection of the players on the field and tagging them according to their teams, distinguished by their jersey colours. This extracted information about the players and the field helped us to create a virtual field that consists of the playing field and the players mapped to their locations in it.
Keywords: detect, football, players, virtual
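The team-tagging step can be illustrated with a minimal sketch (not the authors' pipeline): assign each detected player to the team whose reference jersey colour is nearest in RGB space. The team colours and sampled pixels are invented for the example.

```python
# Illustrative sketch: tag detected players by team using nearest-mean
# jersey colour in RGB space. Reference colours are assumptions.
def nearest_team(pixel, team_colours):
    """Assign an (R, G, B) jersey sample to the closest team colour."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(team_colours, key=lambda t: dist2(pixel, team_colours[t]))

team_colours = {"red_team": (200, 30, 30), "blue_team": (30, 40, 190)}

# mean jersey RGB sampled from each detected player's bounding box
detections = [(185, 45, 50), (25, 60, 200), (210, 20, 40)]
tags = [nearest_team(px, team_colours) for px in detections]
print(tags)  # ['red_team', 'blue_team', 'red_team']
```

In practice the comparison is usually done in HSV space to reduce sensitivity to lighting, but the nearest-colour idea is the same.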
Procedia PDF Downloads 332
1012 The Second Generation of Tyrosine Kinase Inhibitor Afatinib Controls Inflammation by Regulating NLRP3 Inflammasome Activation
Authors: Shujun Xie, Shirong Zhang, Shenglin Ma
Abstract:
Background: Chronic inflammation may lead to many malignancies, and inadequate resolution could play a crucial role in tumor invasion, progression, and metastasis. A randomised, double-blind, placebo-controlled trial shows that IL-1β inhibition with canakinumab could reduce incident lung cancer and lung cancer mortality in patients with atherosclerosis. The processing and secretion of the proinflammatory cytokine IL-1β are controlled by the inflammasome. Here we show the connection between the innate immune system and afatinib, a tyrosine kinase inhibitor targeting the epidermal growth factor receptor (EGFR) in non-small cell lung cancer. Methods: Murine bone marrow-derived macrophages (BMDMs), peritoneal macrophages (PMs), and THP-1 cells were used to check the effect of afatinib on the activation of the NLRP3 inflammasome. The assembly of the NLRP3 inflammasome was checked by co-immunoprecipitation of NLRP3 and apoptosis-associated speck-like protein containing a CARD (ASC), and by disuccinimidyl suberate (DSS) cross-linking of ASC. Lipopolysaccharide (LPS)-induced sepsis and alum-induced peritonitis models were used to confirm that afatinib could inhibit the activation of NLRP3 in vivo. Peripheral blood mononuclear cells (PBMCs) from non-small cell lung cancer (NSCLC) patients before and after taking afatinib were used to verify that afatinib inhibits inflammation in NSCLC therapy. Results: Our data showed that afatinib could inhibit the secretion of IL-1β in a dose-dependent manner in macrophages. Moreover, afatinib could inhibit the maturation of IL-1β and caspase-1 without affecting the precursors of IL-1β and caspase-1. Next, we found that afatinib could block the assembly of the NLRP3 inflammasome and the ASC speck by blocking the interaction of the sensor protein NLRP3 and the adaptor protein ASC. We also found that afatinib was able to alleviate LPS-induced sepsis in vivo.
Conclusion: Our study found that afatinib could inhibit the activation of the NLRP3 inflammasome in macrophages, providing new evidence that afatinib could target the innate immune system to control chronic inflammation. These investigations will provide significant experimental evidence for afatinib as a therapeutic drug for non-small cell lung cancer, other tumors, and NLRP3-related diseases, and will explore new targets for afatinib.
Keywords: inflammasome, afatinib, inflammation, tyrosine kinase inhibitor
Procedia PDF Downloads 118
1011 Application of Electrochromic Glazing for Reducing Peak Cooling Loads
Authors: Ranojoy Dutta
Abstract:
HVAC equipment capacity has a direct impact on occupant comfort and the energy consumption of a building. Glazing gains, especially in buildings with a high window area, can be a significant contributor to the total peak load on the HVAC system, leading to over-sized systems that mostly operate at poor part-load efficiency. In addition, radiant temperature, which largely drives occupant comfort in glazed perimeter zones, is often not effectively controlled despite the HVAC being designed to meet the air temperature set-point. This is due to short-wave solar radiation transmitted through windows, which is not sensed by the thermostat until much later, when the thermal mass in the room releases the absorbed solar heat to the indoor air. The implication of this phenomenon is increased cooling energy despite poor occupant comfort. EC glazing can largely eliminate direct solar transmission through windows, reducing the space cooling loads for the building and improving comfort for occupants near glazing. This paper will review the exact mechanism by which EC glazing reduces the peak load under design-day conditions, leading to reduced cooling capacity versus regular high-performance glazing. Since glazing heat transfer only affects the sensible load, system sizing will be evaluated both with and without the availability of a DOAS, to isolate the downsizing potential of the primary cooling equipment when outdoor air is conditioned separately. Given the dynamic nature of glazing gains due to the sun’s movement, effective peak load mitigation with EC requires an automated control system that can predict solar movement and radiation levels so that the right tint state with the appropriate SHGC is utilized at any given time for a given façade orientation. Such an automated EC product will be evaluated for a prototype commercial office model situated in four distinct climate zones.
Keywords: electrochromic glazing, peak sizing, thermal comfort, glazing load
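The kind of rule-based tint scheduling described above can be sketched as follows. The tint states, SHGC values, and irradiance thresholds here are illustrative assumptions, not a product specification.

```python
# Hedged sketch of an EC tint scheduler: pick a tint state (with its
# SHGC) from the predicted direct solar irradiance on a facade.
# States and thresholds below are assumed example values.
TINT_STATES = [  # (name, SHGC, minimum predicted direct irradiance, W/m^2)
    ("clear",  0.47,   0),
    ("light",  0.17, 150),
    ("medium", 0.09, 350),
    ("dark",   0.06, 600),
]

def select_tint(direct_irradiance_wm2, occupied=True):
    """Darkest state whose irradiance threshold is met; stay clear when
    the facade receives little direct sun or the zone is unoccupied."""
    if not occupied:
        return TINT_STATES[0]
    chosen = TINT_STATES[0]
    for state in TINT_STATES:
        if direct_irradiance_wm2 >= state[2]:
            chosen = state
    return chosen

print(select_tint(700))  # ('dark', 0.06, 600)
print(select_tint(200))  # ('light', 0.17, 150)
```

A production controller would feed this from a solar-position and sky model per façade orientation; the rule itself is deliberately simple.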
Procedia PDF Downloads 130
1010 Design and Study of a DC/DC Converter for High Power, 14.4 V and 300 A for Automotive Applications
Authors: Júlio Cesar Lopes de Oliveira, Carlos Henrique Gonçalves Treviso
Abstract:
The shortage of options in the automotive market for high-power car audio power supplies led to the development of this work. Thus, we developed a stabilized-voltage source with 4320 W of effective power. It is designed for a voltage of 14.4 V with a choice of two currents: a 30 A option for charging battery banks and 300 A at full load. This source can also be considered a general-purpose commercial source, with a simple analog control circuit based on discrete components. The power circuit assembly follows a methodology rated for higher power than initially stipulated.
Keywords: DC-DC power converters, converters, power conversion, pulse width modulation converters
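The quoted ratings are self-consistent, which a one-line check confirms: at 14.4 V, a 300 A full-load current gives the stated 4320 W.

```python
# Consistency check of the ratings quoted above: P = V * I.
V_OUT = 14.4    # output voltage, volts
I_FULL = 300.0  # full-load current, amperes
I_BATT = 30.0   # battery-bank charging option, amperes

p_full = V_OUT * I_FULL
p_batt = V_OUT * I_BATT
print(p_full)  # ~4320 W, matching the abstract
print(p_batt)  # ~432 W for the 30 A option
```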
Procedia PDF Downloads 385
1009 An Evolutionary Approach for Automated Optimization and Design of Vivaldi Antennas
Authors: Sahithi Yarlagadda
Abstract:
The design of an antenna is constrained by mathematical and geometrical parameters. Though there are diverse antenna structures with a wide range of feeds, there are many geometries to be tried that cannot be fitted into predefined computational methods. Antenna design and optimization qualify for an evolutionary algorithmic approach, since the antenna parameter weights depend directly on geometric characteristics. The evolutionary algorithm can be explained simply for a given quality function to be maximized. We randomly create a set of candidate solutions, elements of the function's domain, and apply the quality function as an abstract fitness measure. Based on this fitness, some of the better candidates are chosen to seed the next generation by applying recombination and mutation to them. In the conventional approach, the quality function is unaltered between iterations. But the antenna parameters and geometries are too wide to fit into a single function. So, the weight coefficients are obtained for all possible antenna electrical parameters and geometries, and the variation is learnt by mining the data obtained for an optimized algorithm. The weight and covariant coefficients of the corresponding parameters are logged as datasets for learning and future use. This paper drafts an approach to obtain the requirements to study and methodize the evolutionary approach to automated antenna design, using our past work on the Vivaldi antenna as a test candidate. Antenna parameters like gain, directivity, etc., are directly governed by geometries, materials, and dimensions. The design equations are noted and evaluated for all possible conditions to get the maxima and minima for a given frequency band. The boundary conditions are thus obtained prior to implementation, easing the optimization. The implementation mainly aimed to study the practical computational, processing, and design complexities that arise during simulations. HFSS was chosen for simulations and results.
MATLAB is used to generate the computations, combinations, and data logging. MATLAB is also used to apply machine learning algorithms and to plot the data to design the algorithm. Since the number of combinations is too large to test manually, the HFSS API is used to call HFSS functions from MATLAB itself. The MATLAB Parallel Computing Toolbox is used to run multiple simulations in parallel. The aim is to develop an add-in to antenna design software like HFSS or CST, or a standalone application, to optimize pre-identified common parameters of the wide range of antennas available. In this paper, we have used MATLAB to calculate Vivaldi antenna parameters like slot line characteristic impedance, stripline impedance, slot line width, flare aperture size, and dielectric constant; K-means and a Hamming window are applied to obtain the best test parameters. The HFSS API is used to calculate the radiation, bandwidth, directivity, and efficiency, and the data is logged for applying the evolutionary genetic algorithm in MATLAB. The paper demonstrates the computational weights and the machine learning approach for automated antenna optimization for the Vivaldi antenna.
Keywords: machine learning, Vivaldi, evolutionary algorithm, genetic algorithm
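The select/recombine/mutate loop described above can be sketched as a toy genetic algorithm (not the authors' HFSS/MATLAB tool-chain). The fitness function here is an assumed placeholder standing in for the simulated antenna quality.

```python
# Toy genetic algorithm: evolve two real-valued "design parameters"
# to maximise a placeholder quality function (peaks at (3.0, 0.5)).
# In the paper's setting, fitness would come from an HFSS simulation.
import random

random.seed(42)

def fitness(params):
    """Assumed stand-in for the simulated antenna quality function."""
    x, y = params
    return -((x - 3.0) ** 2 + (y - 0.5) ** 2)

def evolve(pop_size=30, generations=60, bounds=((0, 6), (0, 1))):
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # selection: keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(ga + gb) / 2 for ga, gb in zip(a, b)]  # recombination
            for i, (lo, hi) in enumerate(bounds):            # mutation
                if random.random() < 0.2:
                    child[i] = min(hi, max(lo, child[i] + random.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)  # should land near (3.0, 0.5)
```

Each fitness evaluation would be one HFSS run in the real workflow, which is why the parallel simulations and data logging mentioned above matter.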
Procedia PDF Downloads 110
1008 Image Based Landing Solutions for Large Passenger Aircraft
Authors: Thierry Sammour Sawaya, Heikki Deschacht
Abstract:
In commercial aircraft operations, almost half of all accidents happen during the approach or landing phases. Automatic guidance and automatic landings have proven to bring significant safety value to this challenging landing phase. This is why Airbus and ScioTeq have decided to work together to explore the capability of image-based landing solutions as additional landing aids, to further expand the possibility of performing automatic approach and landing to runways where the current guiding systems are either not fitted or not optimal. Current systems for automated landing often depend on radio signals provided by ground infrastructure on the airport or on satellite coverage. In addition, these radio signals may not always be available with the integrity and performance required for safe automatic landing. Being independent of these radio signals would widen the operational possibilities and increase the number of automated landings. Airbus and ScioTeq are joining their expertise in the field of computer vision in the European programme Clean Sky 2 Large Passenger Aircraft, in which they are leading the IMBALS (IMage BAsed Landing Solutions) project. The ultimate goal of this project is to demonstrate, develop, validate and verify a certifiable automatic landing system guiding an airplane during the approach and landing phases, based on an onboard camera system capturing images, enabling automatic landing independent of radio signals and without precision instruments for landing. In the frame of this project, ScioTeq is responsible for the development of the Image Processing Platform (IPP), while Airbus is responsible for defining the functional and system requirements as well as the testing and integration of the developed equipment in a Large Passenger Aircraft representative environment.
The aim of this paper is to describe the system as well as the associated methods and tools developed for validation and verification.
Keywords: aircraft landing system, aircraft safety, autoland, avionic system, computer vision, image processing
Procedia PDF Downloads 101
1007 Analysis of Brake System for Vehicle Off-Road
Authors: Elmo Thiago Lins Cöuras Ford, Valentina Alessandra Carvalho do Vale, José Ubiragi de Lima Mendes
Abstract:
Over the years, the automotive industry has developed ever more modern automobiles, and every year the vehicles coming off the assembly lines quickly push aside models produced only a short time before. These innovations have not gone unnoticed with respect to vehicle safety. It is at this stage of development that brake systems are being equipped with more and more sophisticated resources. Against that background, this research sought to design a brake system for an off-road vehicle and to analyze its performance in terms of braking efficiency, namely distances traveled and time, concluding with possible improvements to the system.
Keywords: brake system, off-road, vehicle performance, automotive and mechanical engineering
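The braking-efficiency metrics named above (stopping distance and time) follow from elementary kinematics; the friction coefficient and speed below are assumed example values, not the study's measurements.

```python
# Illustrative physics sketch: idealised stopping distance and time for
# a vehicle decelerating at a constant mu*g. Example inputs are assumed.
def stopping_distance(v0_kmh, mu, g=9.81):
    """Return (d, t) with d = v0^2 / (2*mu*g) and t = v0 / (mu*g)."""
    v0 = v0_kmh / 3.6  # km/h -> m/s
    a = mu * g         # constant deceleration
    return v0 ** 2 / (2 * a), v0 / a

d, t = stopping_distance(60, mu=0.55)  # off-road surface, lower grip
print(f"{d:.1f} m in {t:.1f} s")  # ~25.7 m in ~3.1 s
```

Real test data departs from this ideal (load transfer, fade, surface variation), which is exactly what the measured distance/time comparison in the study captures.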
Procedia PDF Downloads 484
1006 Development of an Optimised, Automated Multidimensional Model for Supply Chains
Authors: Safaa H. Sindi, Michael Roe
Abstract:
This project divides supply chain (SC) models into seven Eras, according to the evolution of the market’s needs over time. The five earliest Eras describe the emergence of supply chains, while the last two Eras are to be created. Research objectives: the aim is to generate the two latest Eras with their respective models, focusing on consumable goods. Era Six contains the Optimal Multidimensional Matrix (OMM), which incorporates most characteristics of the SC and allocates them into four quarters (Agile, Lean, Leagile, and Basic SC). This will help companies, especially SMEs, plan their optimal SC route. Era Seven creates an Automated Multidimensional Model (AMM), which upgrades the matrix of Era Six, as it accounts for all the supply chain factors (e.g., offshoring, sourcing, risk) in an interactive system with heuristic learning that helps larger companies and industries select the best SC model for their market. Methodologies: the data collection is based on a Fuzzy-Delphi study that analyses statements using fuzzy logic. The first round of the Delphi study contains statements (fuzzy rules) about the matrix of Era Six. The second round contains the feedback given from the first round, and so on. Preliminary findings: both models are applicable. The matrix of Era Six reduces the complexity of choosing the best SC model for SMEs by helping them identify the best strategy among Basic SC, Lean, Agile, and Leagile SC, tailored to their needs. The interactive heuristic learning in the AMM of Era Seven will help mitigate error and aid large companies in identifying and re-strategizing the best SC model and distribution system for their market and commodity, hence increasing efficiency. Potential contributions to the literature: the problematic issue facing many companies is deciding which SC model or strategy to adopt, given the many models and definitions developed over the years.
This research simplifies the choice by putting most definitions in a template and most models in the matrix of Era Six. This research is original, as the division of the SC into Eras, the matrix of Era Six (OMM) with Fuzzy-Delphi, and the heuristic learning in the AMM of Era Seven provide a synergy of tools that have not been combined before in the area of SC. Additionally, the OMM of Era Six is unique as it combines most characteristics of the SC, which is an original concept in itself.
Keywords: Leagile, automation, heuristic learning, supply chain models
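The four-quadrant idea behind the OMM can be sketched as a simple two-axis classifier. The scoring axes and the 0.5 cut-offs below are illustrative assumptions, not the matrix published by the authors.

```python
# Hedged sketch of a four-quadrant supply-chain classifier in the
# spirit of the OMM: two normalised scores place a company in the
# Agile / Lean / Leagile / Basic quadrant. Axes and cut-offs assumed.
def classify_supply_chain(demand_volatility, cost_pressure, cutoff=0.5):
    """Both scores are normalised to [0, 1]."""
    if demand_volatility >= cutoff and cost_pressure >= cutoff:
        return "Leagile"  # responsive upstream, lean downstream
    if demand_volatility >= cutoff:
        return "Agile"    # prioritise responsiveness
    if cost_pressure >= cutoff:
        return "Lean"     # prioritise waste and cost reduction
    return "Basic"

print(classify_supply_chain(0.8, 0.3))  # Agile
print(classify_supply_chain(0.7, 0.9))  # Leagile
```

The actual OMM weighs many more SC characteristics via fuzzy rules; the sketch only shows how a matrix of this kind turns scores into a strategy recommendation.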
Procedia PDF Downloads 389
1005 Theoretical and Experimental Investigation of Fe and Ni-TCNQ on Graphene
Authors: A. Shahsavar, Z. Jakub
Abstract:
Due to the outstanding properties of 2D metal-organic frameworks (MOFs), they have been the subject of intensive computational and experimental studies. However, fundamental studies of MOFs on a graphene backbone are lacking. This work studies Fe and Ni as the metals and tetracyanoquinodimethane (TCNQ), an organic linker with a high electron affinity, functionalized on graphene. Here we present DFT calculation results that unveil the electronic and magnetic properties of iron- and nickel-TCNQ physisorbed on graphene. Adsorption and Fermi energies and structural and magnetic properties will be reported. Our experimental observations show that FeTCNQ@Gr/Ir(111) and NiTCNQ@Gr/Ir(111) are thermally highly stable up to 500 °C and 250 °C, respectively, making them promising materials for single-atom catalysts or high-density storage media.
Keywords: DFT, graphene, MTCNQ, self-assembly
Procedia PDF Downloads 132
1004 Census and Mapping of Oil Palms Over Satellite Dataset Using Deep Learning Model
Authors: Gholba Niranjan Dilip, Anil Kumar
Abstract:
Accurate and reliable mapping of oil palm plantations, together with a census of individual palm trees, is a huge challenge. This study addresses this challenge with an optimized solution that implements deep learning techniques on remote sensing data. The oil palm is a very important tropical crop. To improve its productivity and land management, it is imperative to have an accurate census over large areas. Since manual census is costly and prone to approximations, a methodology for automated census using panchromatic images from the Cartosat-2, SkySat, and WorldView-3 satellites is demonstrated. Two different study sites in Indonesia were selected. A customized set of training data and ground-truth data was created for this study from Cartosat-2 images. The pre-trained Single Shot MultiBox Detector (SSD) Lite MobileNet V2 Convolutional Neural Network (CNN) model from the TensorFlow Object Detection API was subjected to transfer learning on this customized dataset. The SSD model is able to generate bounding boxes for each oil palm and count the palms with good accuracy on the panchromatic images. The detection yielded an F-score of 83.16% on seven different images. The detections were buffered and dissolved to generate polygons demarcating the boundaries of the oil palm plantations. This provided the area under the plantations and also gave maps of their location, thereby completing the automated census with a fairly high accuracy (≈100%). The trained CNN was found competent enough to detect oil palm crowns in images obtained from multiple satellite sensors and of varying temporal vintage. It helped to estimate the increase in oil palm plantations from 2014 to 2021 in the study area. The study proved that high-resolution panchromatic satellite images can successfully be used to undertake a census of oil palm plantations using CNNs.
Keywords: object detection, oil palm tree census, panchromatic images, single shot multibox detector
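For readers unfamiliar with the F-score quoted above, it combines detection precision and recall; the counts in this small sketch are made-up examples, not the study's data.

```python
# Small sketch of how a detection F-score is computed from true
# positives (correct palm detections), false positives (spurious
# boxes), and false negatives (missed palms). Counts are invented.
def f_score(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# e.g. 790 palms correctly detected, 160 spurious boxes, 160 missed
print(round(100 * f_score(790, 160, 160), 2))  # 83.16
```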
Procedia PDF Downloads 161
1003 Applying Semi-Automatic Digital Aerial Survey Technology and Canopy Characters Classification for Surface Vegetation Interpretation of Archaeological Sites
Authors: Yung-Chung Chuang
Abstract:
The cultural layers of archaeological sites are mainly affected by surface land use, land cover, and the root systems of surface vegetation. For this reason, continuous monitoring of land use and land cover change is important for archaeological site protection and management. In actual operation, however, on-site investigation and orthophotograph interpretation require a lot of time and manpower. It is therefore necessary to find a good alternative for surface vegetation surveys in an automated or semi-automated manner. In this study, we applied semi-automatic digital aerial survey technology and canopy character classification with very high-resolution aerial photographs for surface vegetation interpretation of archaeological sites. The main idea is that different landscape or forest types can easily be distinguished by canopy characters (e.g., specific texture distributions, shadow effects, and gap characters) extracted by semi-automatic image classification. A novel methodology to classify the shape of canopy characters using landscape indices and multivariate statistics was also proposed. Non-hierarchical cluster analysis was used to assess the optimal number of canopy character clusters, and canonical discriminant analysis was used to generate the discriminant functions for canopy character classification (seven categories). Therefore, people could easily predict the forest type and vegetation land cover from the specific canopy character category. The results showed that the semi-automatic classification could effectively extract the canopy characters of forest and vegetation land cover. As for forest type and vegetation type prediction, the average prediction accuracy reached 80.3%–91.7% with different test frame sizes.
This shows that the technology is useful for archaeological site surveys and can improve classification efficiency and the data update rate.
Keywords: digital aerial survey, canopy characters classification, archaeological sites, multivariate statistics
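The non-hierarchical clustering step named above is typically k-means; a minimal sketch follows. The 2-D "canopy character" feature vectors are invented toy data, not the study's landscape indices.

```python
# Minimal k-means sketch of non-hierarchical cluster analysis on
# invented 2-D canopy-character features (two obvious groups).
import random

random.seed(0)

def kmeans(points, k, iters=50):
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared distance)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # recompute each center as the mean of its cluster
        centers = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

points = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.18),
          (0.9, 0.8), (0.85, 0.9), (0.95, 0.85)]
centers, clusters = kmeans(points, k=2)
print(sorted(len(c) for c in clusters))  # the two groups: [3, 3]
```

In the study, the cluster count k itself is tuned (the "optimal number of canopy character clusters") before the discriminant functions are fitted.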
Procedia PDF Downloads 142