Search results for: comprehensive feature extraction
1521 A Comparative Analysis of Conventional and Organic Dairy Supply Chain: Assessing Transport Costs and External Effects in Southern Sweden
Authors: Vivianne Aggestam
Abstract:
Purpose: Organic dairy products have steadily grown in consumer popularity in Sweden in recent years, driving more transport activity. The main aim of this study was to compare the transport costs and environmental emissions of organic and conventional dairy production in Sweden. The objective was to evaluate the differences and environmental impacts of transport between the two production systems, allowing a more transparent understanding of the real impact of transport within the supply chain. Methods: A partial attributional Life Cycle Assessment was conducted based on a comprehensive survey of Swedish farmers, dairies, and consumers regarding their transport needs and costs. Farmers and dairies were interviewed; consumers were targeted through an online survey. Results: Higher transport inputs from conventional dairy production arise mainly via feed and soil management at the farm level. The regional organic milk brand showed a lower initial transport burden at the farm level; however, after leaving the farm, it had equal or higher transportation requirements. This was mainly due to the location of the dairy farm and shorter product expiry dates, which require more frequent retail deliveries. Organic consumers tend to use public transport more than private vehicles. Consumers using private vehicles for shopping trips primarily bought conventional products, for which price was the main deciding factor. Conclusions: Organic dairy products that emphasise their regional attributes do not ensure less transportation and may therefore not be a more “climate smart” option for the consumer. This suggests that the idea of localism needs to be analysed from a more systemic perspective. 
Fuel and regional feed efficiency can be further improved, mainly via the fuel type and the types of vehicles used for transport.
Keywords: supply chains, distribution, transportation, organic food production, conventional food production, agricultural fossil fuel use
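The transport comparison above turns on leg distances, payloads, and fuel emission factors. A minimal sketch of such a leg-by-leg comparison follows; the distances, payload, and emission factor are wholly illustrative assumptions, not figures from the study.

```python
def transport_co2_kg(distance_km, payload_t, factor_kg_per_tkm=0.062):
    """CO2 in kg for hauling payload_t tonnes over distance_km km.
    0.062 kg CO2 per tonne-km is a rough figure for a heavy truck."""
    return distance_km * payload_t * factor_kg_per_tkm

# Hypothetical farm -> dairy -> distribution -> retail legs (km)
conventional_legs = [40, 120, 30]
organic_legs = [15, 150, 30]   # shorter farm leg, longer post-farm legs

payload = 10.0  # tonnes of milk per trip
conv = sum(transport_co2_kg(d, payload) for d in conventional_legs)
org = sum(transport_co2_kg(d, payload) for d in organic_legs)
print(f"conventional: {conv:.1f} kg CO2, organic: {org:.1f} kg CO2")
```

With these made-up legs, the organic chain's lower farm-level burden is offset after the farm gate, mirroring the pattern the abstract reports.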
Procedia PDF Downloads 454
1520 Structural Analysis and Modelling in an Evolving Iron Ore Operation
Authors: Sameh Shahin, Nannang Arrys
Abstract:
Optimizing pit slope stability and reducing the strip ratio of a mining operation are two key tasks in geotechnical engineering. With a growing demand for minerals and an increasing cost associated with extraction, companies are constantly re-evaluating the viability of mineral deposits and challenging their geological understanding. Within Rio Tinto Iron Ore, the Structural Geology (SG) team investigates and collects critical data, such as point-based orientations, mapping, and geological inferences from adjacent pits, to re-model deposits where previous interpretations have failed to account for structurally controlled slope failures. Utilizing innovative data collection methods and data-driven investigation, SG aims to address the root causes of slope instability. Committing to a resource grid drill campaign as the primary source of data collection will often bias data collection towards a specific orientation and significantly reduce the capability to identify and qualify complexity. Consequently, these limitations make it difficult to construct a realistic and coherent structural model that identifies adverse structural domains. Without the consideration of complexity and the capability to capture these structural domains, mining operations run the risk of inadequately designed slopes that may fail and potentially harm people. Regional structural trends have been considered in conjunction with surface and in-pit mapping data to model multi-batter fold structures that were absent from previous iterations of the structural model. The risk is evident in newly identified dip-slope and rock-mass-controlled sectors of the geotechnical design, rather than a ubiquitous dip-slope sector across the pit. The reward is two-fold: 1) providing sectors of rock-mass-controlled design in previously interpreted structurally controlled domains, and 2) the opportunity to optimize the slope angle for mineral recovery and a reduced strip ratio. 
Furthermore, the result is a high-confidence model, with structures and geometries that can account for historic slope instabilities in structurally controlled domains where design assumptions failed.
Keywords: structural geology, geotechnical design, optimization, slope stability, risk mitigation
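Dip-slope risk of the kind described is often screened with a simple kinematic (Markland-style) test for planar failure. The sketch below is purely illustrative: the angles are hypothetical, not data from the operation, and a real assessment would use full stereonet analysis of measured discontinuity sets.

```python
def planar_failure_possible(slope_dip, slope_dip_dir,
                            joint_dip, joint_dip_dir,
                            friction_angle=30.0, tolerance=20.0):
    """True if a discontinuity daylights in the slope face (dip direction
    within `tolerance` degrees of the face) and is steeper than the
    friction angle but flatter than the face itself."""
    direction_ok = abs((joint_dip_dir - slope_dip_dir + 180) % 360 - 180) <= tolerance
    dip_ok = friction_angle < joint_dip < slope_dip
    return direction_ok and dip_ok

# A hypothetical bedding plane dipping 45 deg toward 085 in a 60/090 batter face:
print(planar_failure_possible(60, 90, 45, 85))  # kinematically feasible
```

A sector where this test fails for all mapped structures would be a candidate for rock-mass-controlled design, in the spirit of the two-fold reward described above.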
Procedia PDF Downloads 47
1519 The Role of Specificity in Mastering the English Article System
Authors: Sugene Kim
Abstract:
English articles are taught as a binary system based on nominal countability and definiteness. Despite the detailed rules of prescriptive grammar, it has been consistently reported in the literature that their correct usage is extremely difficult to master even for advanced learners of English as a second language (ESL) or a foreign language (EFL). Given that an English sentence (except for an imperative) cannot be constructed without a noun, which is always paired with one of the indefinite, definite, and zero articles, it is essential to understand specifically what causes ESL/EFL learners to misuse them. To that end, this study examined EFL learners’ article use employing a one-group pre-/post-test design. Forty-three Korean college students received instruction on correct English article usage in two 75-minute classes employing the binary schema set up for the study. They also practiced in class how to apply the rules as instructed. The participants were then assigned a forced-choice elicitation task, which had also been used as a pre-test administered three months prior to the instruction. Unlike the pre-test, on which they only chose the correct article for each of the 40 items, the post-instruction task additionally asked them to give written accounts of the decision-making procedure by which they chose each article. The participants’ performance was scored manually by checking whether each answer was correct, and their written comments were first categorized using thematic analysis and then ranked by frequency. The analyses of the performance on the two tasks and the written think-aloud data suggested that EFL learners fluctuate between specificity and definiteness, overgeneralizing the use of the definite article to almost all cataphoric references. 
It was apparent that they have trouble distinguishing between the two concepts, possibly because the former is almost never introduced in grammar books or classes designed for ESL/EFL learners. In particular, most participants were found to be unaware of the possibility of using nouns as [+specific, –definite]. Not surprisingly, the correct answer rates for such nouns averaged 33% and 46% on the pre- and post-tests, respectively, roughly half the overall mean correct answer rates of 65% on the pre-test and 81% on the post-test. In addition, correct article use for specific indefinites was the most impermeable to instruction when compared with nouns used as [–specific, –definite] or [±specific, +definite]. Such findings underline the necessity of expanding the binary schema to a ternary form that incorporates the specificity feature, albeit one not morphologically marked in the English language.
Keywords: countability, definiteness, English articles, specificity, ternary system
Procedia PDF Downloads 124
1518 Operating Characteristics of Point-of-Care Ultrasound in Identifying Skin and Soft Tissue Abscesses in the Emergency Department
Authors: Sathyaseelan Subramaniam, Jacqueline Bober, Jennifer Chao, Shahriar Zehtabchi
Abstract:
Background: Emergency physicians frequently evaluate skin and soft tissue infections in order to differentiate abscess from cellulitis, which helps determine which patients will benefit from incision and drainage. Our objective was to determine the operating characteristics of point-of-care ultrasound (POCUS) compared to clinical examination in identifying abscesses in emergency department (ED) patients with features of skin and soft tissue infections. Methods: We performed a comprehensive search in the following databases: Medline, Web of Science, EMBASE, CINAHL, and the Cochrane Library. Trials were included if they compared the operating characteristics of POCUS with clinical examination in identifying skin and soft tissue abscesses. Trials that included patients with oropharyngeal abscesses or abscesses requiring drainage in the operating room were excluded. The presence of an abscess was determined by pus drainage; the absence of an abscess was determined by no pus seen on incision or by resolution of symptoms without pus drainage at follow-up. The quality of the included trials was assessed using GRADE criteria. Operating characteristics of POCUS are reported as sensitivity, specificity, and positive (LR+) and negative (LR-) likelihood ratios, with the respective 95% confidence intervals (CI). Summary measures were calculated by generating a hierarchical summary receiver operating characteristic (HSROC) model. Results: Out of 3203 references identified, 5 observational studies with 615 patients in aggregate were included (2 adult and 3 pediatric). We rated the quality of 3 trials as low and 2 as very low. The operating characteristics of POCUS and clinical examination in identifying soft tissue abscesses are presented in the table. The HSROC for POCUS revealed a sensitivity of 96% (95% CI = 89-98%), specificity of 79% (95% CI = 71-86%), LR+ of 4.6 (95% CI = 3.2-6.8), and LR- of 0.06 (95% CI = 0.02-0.2). 
Conclusion: Existing evidence indicates that POCUS is useful in identifying abscesses in ED patients with skin or soft tissue infections.
Keywords: abscess, point-of-care ultrasound, pocus, skin and soft tissue infection
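The likelihood ratios reported above relate to pooled sensitivity and specificity through the standard formulas LR+ = sens / (1 − spec) and LR− = (1 − sens) / spec. A small sketch using the abstract's point estimates:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios from sensitivity/specificity."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

lr_pos, lr_neg = likelihood_ratios(0.96, 0.79)
# The simple ratio gives LR+ ~ 4.6 and LR- ~ 0.05; the abstract's LR- of
# 0.06 reflects the pooled HSROC model rather than this point calculation.
print(f"LR+ = {lr_pos:.1f}, LR- = {lr_neg:.2f}")
```

An LR+ of roughly 4.6 and LR− near 0.05 is the quantitative basis for the conclusion that a positive POCUS meaningfully raises, and a negative POCUS strongly lowers, the probability of abscess.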
Procedia PDF Downloads 369
1517 Uncertainty and Multifunctionality as Bridging Concepts from Socio-Ecological Resilience to Infrastructure Finance in Water Resource Decision Making
Authors: Anita Lazurko, Laszlo Pinter, Jeremy Richardson
Abstract:
Uncertain climate projections, multiple possible development futures, and a financing gap create challenges for water infrastructure decision making. In contrast to conventional predict-plan-act methods, an emerging decision paradigm that enables socio-ecological resilience supports decisions that are appropriate for uncertainty and leverage social, ecological, and economic multifunctionality. Concurrently, water infrastructure project finance plays a powerful role in sustainable infrastructure development but remains disconnected from the discourse on socio-ecological resilience. At the time of research, a project to transfer water from Lesotho to Botswana through South Africa in the Orange-Senqu River Basin was at the pre-feasibility stage. This case was analysed through documents and interviews to investigate how uncertainty and multifunctionality are conceptualised and considered in decisions for the resilience of water infrastructure, and to explore bridging concepts that might allow project finance to better enable socio-ecological resilience. Interviewees conceptualised uncertainty as risk, ambiguity, and ignorance, and multifunctionality as politically motivated shared benefits. Numerous efforts to adopt emerging decision methods that consider these terms were in use but required compromises to accommodate the persistent, conventional decision paradigm, though a range of future opportunities was identified. Bridging these findings to finance revealed opportunities to consider a more comprehensive scope of risk, to leverage risk mitigation measures, to diffuse risks and benefits over space, time, and diverse actor groups, and to clarify roles in achieving multiple objectives for resilience. 
In addition to insights into how multiple decision paradigms interact in real-world decision contexts, the research highlights untapped potential at the juncture between socio-ecological resilience and project finance.
Keywords: socio-ecological resilience, finance, multifunctionality, uncertainty
Procedia PDF Downloads 126
1516 A Comparative Study of the Proposed Models for the Components of the National Health Information System
Authors: M. Ahmadi, Sh. Damanabi, F. Sadoughi
Abstract:
The National Health Information System plays an important role in ensuring timely and reliable access to health information, which is essential for strategic and operational decisions that improve health and the quality and effectiveness of health care. In other words, a national health information system can improve the quality of the health data, information, and knowledge used to support decision making at all levels and in all areas of the health sector. Since full identification of the components of this system seems necessary for better planning and for managing the factors that influence its performance, this study comparatively explores different perspectives on those components. Methods: This is a descriptive, comparative study. The study population comprises printed and electronic documents describing the components of national health information systems in three parts: input, process, and output. Information was gathered through library resources and internet searches, and the analysis was expressed using comparative tables and qualitative data. Results: The findings showed three different perspectives on the components of a national health information system: the Lippeveld, Sauerborn, and Bodart model (2000), the Health Metrics Network (HMN) model from the World Health Organization (2008), and Gattini’s model (2009). In the input (resources and structure) section, all three models require components of management and leadership; planning and program design; and the supply of staff, software and hardware facilities, and equipment. In the process section, all three models point to actions ensuring the quality of the health information system, and in the output section, the two models other than Lippeveld’s consider information products and the use and distribution of information as components of the national health information system. 
Conclusion: The results showed that all three models discuss the components of health information in the input section only briefly. Moreover, the Lippeveld model overlooks the components of a national health information system in the process and output sections. It therefore seems that the Health Metrics Network model presents the components of the health information system comprehensively across all three sections: input, process, and output.
Keywords: National Health Information System, components of the NHIS, Lippeveld Model
Procedia PDF Downloads 421
1515 Designing the Management Plan for Health Care (Medical) Wastes in the Cities of Semnan, Mahdishahr and Shahmirzad
Authors: Rasouli Divkalaee Zeinab, Kalteh Safa, Roudbari Aliakbar
Abstract:
Introduction: Medical waste can lead to the generation and transmission of many infectious and contagious diseases due to the presence of pathogenic agents, necessitating special management to collect, decontaminate, and finally dispose of such products. This study aimed to design a centralized health care (medical) waste management program for the cities of Semnan, Mahdishahr, and Shahmirzad. Methods: This descriptive-analytical study was conducted over six months in the cities of Semnan, Mahdishahr, and Shahmirzad. The quantitative and qualitative characteristics of the generated wastes were determined by taking samples from all medical waste production centers. The equipment, devices, and machines required for separate collection of the waste from the production centers and for its subsequent decontamination were then estimated. Next, the investment costs, operating costs, and working capital required for collection, decontamination, and final disposal of the wastes were determined. Finally, the fee for proper waste management was determined for each category of medical waste-producing center. Results: A total of 1021 kilograms of medical waste is produced daily in the cities of Semnan, Mahdishahr, and Shahmirzad. It was estimated that a 1000-liter autoclave, a medical waste collection vehicle, four 60-liter bins, four 120-liter bins, and four 1200-liter bins would be required to implement the plan. The estimated total annual medical waste management cost for Semnan City was 23,283,903,720 Iranian Rials. Conclusion: The study results showed that establishing a proper management system for the medical waste generated in the three studied cities will cost the medical centers between 334,280 and 1,253,715 Iranian Rials in fees. 
The findings of this study provided comprehensive data regarding medical wastes from the generation point to the landfill site, which is vital for the government and the private sector.
Keywords: clinics, decontamination, management, medical waste
Procedia PDF Downloads 78
1514 Reduplication in Urdu-Hindi Nonsensical Words: An OT Analysis
Authors: Riaz Ahmed Mangrio
Abstract:
Reduplication in Urdu-Hindi affects all major word categories, particles, and even nonsensical words. It conveys a variety of meanings, including distribution, emphasis, iteration, and adjectival and adverbial senses. This study primarily discusses reduplicative structures of nonsensical words in Urdu-Hindi and then briefly looks at some examples from other Indo-Aryan languages to introduce the debate regarding the same structures in them. The goal of this study is to present counter-evidence against Keane (2005: 241), who claims that “the base in the cases of lexical and phrasal echo reduplication is always independently meaningful”. However, Urdu-Hindi reduplication derives meaningful compounds from nonsensical words, e.g. gũ mgũ (A) ‘silent and confused’ and d̪əb d̪əb-a (N) ‘one’s fear over others’. This needs comprehensive examination to see whether and how the various structures form patterns of a base-reduplicant relationship, or whether they are merely sublexical items joining together to form a word of any grammatical category among content words. Another interesting theoretical question arises within the Optimality Theory framework: in an OT analysis, is it necessary to identify one of the two constituents as the base and the other as the reduplicant? Or is it better to consider this a pattern, and if so, how does it fit into an OT analysis? This may be an even more interesting theoretical question, and looking for solutions to such questions can make an important contribution. In the case at hand, each of the two constituents is an independent nonsensical word, but their echo reduplication is nonetheless meaningful. This casts significant doubt upon Keane’s (2005: 241) observation, based on examples from Hindi and Tamil reduplication, that “the base in cases of lexical and phrasal echo reduplication is always independently meaningful”. The debate becomes still more interesting when the triplication of nonsensical words in Urdu-Hindi, e.g. aẽ baẽ ʃaẽ (N) ‘useless talk’, is considered, which is equally important to discuss. This example challenges Harrison’s (1973) claim that only monosyllabic verbs in their progressive forms reduplicate twice to result in triplication, which is not the case in the example presented. The study will consist of a thorough descriptive analysis of the data for the purpose of documentation, followed by an OT analysis.
Keywords: reduplication, urdu-hindi, nonsensical, optimality theory
Procedia PDF Downloads 75
1513 Reinforcement Learning for Agile CNC Manufacturing: Optimizing Configurations and Sequencing
Authors: Huan Ting Liao
Abstract:
In a typical manufacturing environment, computer numerical control (CNC) machining is essential for automating production through precise computer-controlled tool operations, significantly enhancing efficiency and ensuring consistent product quality. However, traditional CNC production lines often rely on manual loading and unloading, limiting operational efficiency and scalability. Although automated loading systems have been developed, they frequently lack sufficient intelligence and configuration efficiency, requiring extensive setup adjustments for different products and impacting overall productivity. This research addresses the job shop scheduling problem (JSSP) in CNC machining environments, aiming to minimize total completion time (makespan) and maximize CNC machine utilization. We propose a novel approach using reinforcement learning (RL), specifically the Q-learning algorithm, to optimize scheduling decisions. The study simulates the JSSP, incorporating robotic arm operations, machine processing times, and work order demand allocation to determine optimal processing sequences. The Q-learning algorithm enhances machine utilization by dynamically balancing workloads across CNC machines, adapting to varying job demands and machine states. This approach offers robust solutions for complex manufacturing environments by automating decision-making processes for job assignments. Additionally, we evaluate various layout configurations to identify the most efficient setup. By integrating RL-based scheduling optimization with layout analysis, this research aims to provide a comprehensive solution for improving manufacturing efficiency and productivity in CNC-based job shops. The proposed method's adaptability and automation potential promise significant advancements in tackling dynamic manufacturing challenges.
Keywords: job shop scheduling problem, reinforcement learning, operations sequence, layout optimization, q-learning
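The Q-learning approach described can be illustrated on a deliberately tiny, hypothetical sequencing problem: one machine, three jobs, reward equal to the negative completion time of each finished job (so the agent learns to minimize total completion time). The paper's setting, with robotic arms and multiple machines, is far richer; this is only a sketch of the core update rule.

```python
import random

proc_time = {"A": 4, "B": 2, "C": 3}       # hypothetical job processing times
jobs = tuple(sorted(proc_time))
alpha, gamma, eps = 0.1, 0.9, 0.2          # learning rate, discount, exploration
Q = {}                                     # (frozenset of remaining jobs, job) -> value

def choose(remaining):
    """Epsilon-greedy selection over the jobs still waiting."""
    if random.random() < eps:
        return random.choice(sorted(remaining))
    return max(sorted(remaining), key=lambda j: Q.get((remaining, j), 0.0))

random.seed(0)
for _ in range(2000):                      # training episodes
    remaining, t = frozenset(jobs), 0
    while remaining:
        j = choose(remaining)
        t += proc_time[j]                  # job j completes at time t
        nxt = remaining - {j}
        best_next = max((Q.get((nxt, k), 0.0) for k in nxt), default=0.0)
        old = Q.get((remaining, j), 0.0)
        # Standard Q-learning update with reward -t (negative completion time)
        Q[(remaining, j)] = old + alpha * (-t + gamma * best_next - old)
        remaining = nxt

# Greedy rollout after training; shortest-processing-time-first is optimal here.
remaining, order = frozenset(jobs), []
while remaining:
    j = max(sorted(remaining), key=lambda k: Q.get((remaining, k), 0.0))
    order.append(j)
    remaining -= {j}
print(order)
```

In this toy instance the learned policy recovers the shortest-processing-time ordering; scaling the same update to multiple machines and robot transfer operations is what the proposed framework automates.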
Procedia PDF Downloads 24
1512 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. 
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
Procedia PDF Downloads 127
1511 Exploring the Design of Prospective Human Immunodeficiency Virus Type 1 Reverse Transcriptase Inhibitors through a Comprehensive Approach of Quantitative Structure Activity Relationship Study, Molecular Docking, and Molecular Dynamics Simulations
Authors: Mouna Baassi, Mohamed Moussaoui, Sanchaita Rajkhowa, Hatim Soufi, Said Belaaouad
Abstract:
The objective of this paper is to address the challenging task of targeting Human Immunodeficiency Virus type 1 Reverse Transcriptase (HIV-1 RT) in the treatment of AIDS. Reverse Transcriptase inhibitors (RTIs) have limitations due to the development of Reverse Transcriptase mutations that lead to treatment resistance. In this study, a combination of statistical analysis and bioinformatics tools was adopted to develop a mathematical model that relates the structure of compounds to their inhibitory activities against HIV-1 Reverse Transcriptase. Our approach was based on a series of compounds recognized for their HIV-1 RT enzymatic inhibitory activities. These compounds were designed via software, with their descriptors computed using multiple tools. The most statistically promising model was chosen, and its domain of applicability was ascertained. Furthermore, compounds exhibiting biological activity comparable to existing drugs were identified as potential inhibitors of HIV-1 RT. The compounds were evaluated on their chemical absorption, distribution, metabolism, excretion, and toxicity properties and their adherence to Lipinski's rule. Molecular docking techniques were employed to examine the interaction between Reverse Transcriptase (wild type and mutant type) and the ligands, including a known drug available on the market. Molecular dynamics simulations were also conducted to assess the stability of the RT-ligand complexes. Our results reveal some of the new compounds to be promising candidates for effectively inhibiting HIV-1 Reverse Transcriptase, matching the potency of the established drug; this necessitates further experimental validation. Beyond its immediate results, this study provides a methodological foundation for future endeavors aiming to discover and design new inhibitors targeting HIV-1 Reverse Transcriptase.
Keywords: QSAR, ADMET properties, molecular docking, molecular dynamics simulation, reverse transcriptase inhibitors, HIV type 1
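The Lipinski screening step mentioned above can be sketched as a simple rule-of-five filter. The descriptor values below are hypothetical; in practice they would come from a cheminformatics toolkit such as RDKit rather than being entered by hand.

```python
def passes_lipinski(mol_weight, logp, h_donors, h_acceptors):
    """Rule-of-five filter: a compound is considered drug-like if it has
    at most one violation of the four Lipinski criteria."""
    violations = sum([
        mol_weight > 500,    # molecular weight over 500 Da
        logp > 5,            # octanol-water partition coefficient over 5
        h_donors > 5,        # more than 5 hydrogen-bond donors
        h_acceptors > 10,    # more than 10 hydrogen-bond acceptors
    ])
    return violations <= 1

# Hypothetical candidate descriptors:
print(passes_lipinski(431.9, 3.8, 2, 6))   # drug-like
print(passes_lipinski(612.3, 6.2, 4, 11))  # three violations, filtered out
```

Candidates surviving this filter (and the ADMET checks) would then proceed to docking and molecular dynamics, as described.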
Procedia PDF Downloads 92
1510 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator
Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty
Abstract:
Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The process of verification and validation helps qualify the process simulator for its intended purpose, whether for providing comprehensive training or for design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full-scope replica operator training simulator named KALBR-SIM (Kalpakkam Breeder Reactor Simulator) has been developed for the Prototype Fast Breeder Reactor (PFBR) at IGCAR, Kalpakkam, India, with the main participants being engineers/experts from the modelling team, the process design team, and the instrumentation and control design team. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used.
It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on experts' comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.
Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state
Procedia PDF Downloads 266
1509 Prioritizing Roads Safety Based on the Quasi-Induced Exposure Method and Utilization of the Analytical Hierarchy Process
Authors: Hamed Nafar, Sajad Rezaei, Hamid Behbahani
Abstract:
Safety analysis of roads through accident rates, one of the most widely used tools, derives from the direct exposure method, which is based on vehicle-kilometres travelled and vehicle-travel time. However, due to some fundamental flaws in its theory, difficulties in gaining access to the required data (such as traffic volume and the distance and duration of trips), and various problems in determining exposure for a specific time, place, and category of individuals, there is a need for a road-safety prioritization algorithm whose new exposure method resolves the problems of the previous approaches. Such an approach may lead to more realistic comparisons, and the new method would be applicable to a wider range of times, places, and categories of individuals. Therefore, an algorithm was introduced to prioritize the safety of roads using the quasi-induced exposure method and the analytical hierarchy process. For this research, 11 provinces of Iran were chosen as case study locations. A rural accidents database was created for these provinces, the validity of the quasi-induced exposure method for Iran’s accidents database was explored, and the involvement ratio for different characteristics of the drivers and the vehicles was measured. Results showed that the quasi-induced exposure method was valid in determining the real exposure in the provinces under study. Results also showed a significant difference between prioritization based on the new approach and prioritization based on the traditional approach. This difference stems mostly from the way the quasi-induced exposure method determines exposure, the opinions of experts, and the quantity of accident data.
Overall, the results of this research showed that prioritization based on the new approach is more comprehensive and reliable than prioritization based on the traditional approach, which depends on various parameters, including driver-vehicle characteristics.
Keywords: road safety, prioritizing, quasi-induced exposure, analytical hierarchy process
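The analytical hierarchy process step referred to here can be sketched briefly: given a reciprocal pairwise comparison matrix of criteria (the Saaty-scale judgments below are illustrative, not the study's), the priority vector is the normalised principal eigenvector, and the consistency ratio checks whether the judgments are usable.

```python
import numpy as np

# Illustrative 3x3 pairwise comparison matrix for three safety criteria
# (reciprocal Saaty-scale judgments; not the values used in the study).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Priority vector: principal right eigenvector, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio CR = CI / RI, with random index RI = 0.58 for n = 3;
# CR < 0.1 is the usual threshold for acceptable consistency.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58
print(np.round(w, 3), round(cr, 3))
```

The resulting weights would then score the road sections, with the quasi-induced exposure measure supplying the accident-involvement data being weighted.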
Procedia PDF Downloads 338
1508 Characterising Indigenous Chicken (Gallus gallus domesticus) Ecotypes of Tigray, Ethiopia: A Combined Approach Using Ecological Niche Modelling and Phenotypic Distribution Modelling
Authors: Gebreslassie Gebru, Gurja Belay, Minister Birhanie, Mulalem Zenebe, Tadelle Dessie, Adriana Vallejo-Trujillo, Olivier Hanotte
Abstract:
Livestock must adapt to changing environmental conditions, which can result in either phenotypic plasticity or irreversible phenotypic change. In this study, we combine Ecological Niche Modelling (ENM) and Phenotypic Distribution Modelling (PDM) to provide a comprehensive framework for understanding the ecological and phenotypic characteristics of indigenous chicken (Gallus gallus domesticus) ecotypes. This approach helped us to classify these ecotypes, differentiate their phenotypic traits, and identify associations between environmental variables and adaptive traits. We measured 297 adult indigenous chickens from various agro-ecologies, including 208 females and 89 males. A subset of the 22 measured traits was selected using stepwise selection, resulting in seven traits for each sex. Using ENM, we identified four agro-ecologies potentially harbouring distinct phenotypes of indigenous Tigray chickens. However, PDM classified these chickens into three phenotypical ecotypes. Chickens grouped in ecotype-1 and ecotype-3 exhibited superior adaptive traits compared to those in ecotype-2, with significant variance observed. This high variance suggests a broader range of trait expression within these ecotypes, indicating greater adaptation capacity and potentially more diverse genetic characteristics. Several environmental variables, such as soil clay content, forest cover, and mean temperature of the wettest quarter, were strongly associated with most phenotypic traits. This suggests that these environmental factors play a role in shaping the observed phenotypic variations. By integrating ENM and PDM, this study enhances our understanding of indigenous chickens' ecological and phenotypic diversity. It also provides valuable insights into their conservation and management in response to environmental changes.
Keywords: adaptive traits, agro-ecology, appendage, climate, environment, ImageJ, morphology, phenotypic variation
Procedia PDF Downloads 33
1507 Quantifying Uncertainties in an Archetype-Based Building Stock Energy Model by Use of Individual Building Models
Authors: Morten Brøgger, Kim Wittchen
Abstract:
Focus on reducing energy consumption in existing buildings at large scale, e.g. in cities or countries, has been increasing in recent years. In order to reduce energy consumption in existing buildings, political incentive schemes are put in place and large scale investments are made by utility companies. Prioritising these investments requires a comprehensive overview of the energy consumption in the existing building stock, as well as potential energy-savings. However, a building stock comprises thousands of buildings with different characteristics, making it difficult to model energy consumption accurately. Moreover, the complexity of the building stock makes it difficult to convey model results to policymakers and other stakeholders. In order to manage the complexity of the building stock, building archetypes are often employed in building stock energy models (BSEMs). Building archetypes are formed by segmenting the building stock according to specific characteristics. Segmenting the building stock according to building type and building age is common, among other things because this information is often easily available. This segmentation makes it easy to convey results to non-experts. However, using a single archetypical building to represent all buildings in a segment of the building stock is associated with loss of detail. Thermal characteristics are aggregated while other characteristics, which could affect the energy efficiency of a building, are disregarded. Thus, using a simplified representation of the building stock could come at the expense of the accuracy of the model. The present study evaluates the accuracy of a conventional archetype-based BSEM that segments the building stock according to building type and age. The accuracy is evaluated in terms of the archetypes’ ability to accurately emulate the average energy demands of the corresponding buildings they were meant to represent.
This is done for the buildings’ energy demands as a whole as well as for relevant sub-demands. Both are evaluated in relation to the type and age of the building. This should provide researchers who use archetypes in BSEMs with an indication of the expected accuracy of the conventional archetype model, as well as the accuracy lost in specific parts of the calculation due to use of the archetype method.
Keywords: building stock energy modelling, energy-savings, archetype
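The aggregation loss described above can be illustrated with a toy calculation: because energy demand depends on products of building parameters, the demand of an "average" (archetype) building generally differs from the average demand of the individual buildings it represents. All figures below are invented for illustration.

```python
# Toy illustration of archetype aggregation error. Demand here is a
# simplified transmission-loss estimate; all parameter values are invented.

def heat_demand(u_value, area, degree_days=3000):
    # Annual transmission heat demand [kWh/year]: U * A * degree-days * 24 h
    return u_value * area * degree_days * 24 / 1000

buildings = [(0.4, 120), (1.2, 90), (0.8, 200)]  # (U-value, envelope area)

# Mean of individual demands vs demand of the mean ("archetype") building
mean_demand = sum(heat_demand(u, a) for u, a in buildings) / len(buildings)
u_mean = sum(u for u, _ in buildings) / len(buildings)
a_mean = sum(a for _, a in buildings) / len(buildings)
archetype_demand = heat_demand(u_mean, a_mean)

print(round(mean_demand), round(archetype_demand))  # the two differ
```

Since U·A is a product, the mean of the products is not the product of the means; this is one mechanism behind the accuracy loss the study quantifies against individual building models.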
Procedia PDF Downloads 154
1506 The Impact of Nutrition Education Intervention in Improving the Nutritional Status of Sickle Cell Patients
Authors: Lindy Adoma Dampare, Marina Aferiba Tandoh
Abstract:
Sickle cell disease (SCD) is an inherited blood disorder that mostly affects individuals in sub-Saharan Africa. Nutritional deficiencies have been well established in SCD patients. In Ghana, studies have revealed the prevalence of malnutrition, especially amongst children with SCD, and hence the need to develop an evidence-based comprehensive nutritional therapy for SCD to improve their nutritional status. The aim of the study was to develop and assess the effect of a nutrition education material on the nutritional status of SCD patients in Ghana. This was a pre-post interventional study. Patients between the ages of 2 and 60 years were recruited from the Tema General Hospital. Following a baseline assessment of nutrition knowledge (NK), beliefs, sanitary practices, and dietary consumption patterns, twice-monthly nutrition education (NE) was carried out for 3 months, followed by a post-intervention assessment. The nutritional status of SCD patients was assessed using a 3-day dietary recall and anthropometric measurements. NE was given to SCD adults and caregivers of SCD children. The majority of the caregivers (69%) and SCD adults (82%) at baseline had low NK. The level of NK improved significantly in SCD adults (4.18 ± 1.83 vs. 10.00 ± 1.00, p < 0.001) and caregivers (5.58 ± 2.25 vs. 10.44 ± 0.846, p < 0.001) after NE. The increase in NK improved the dietary intake and dietary consumption pattern of SCD patients. Significant increases in weight (23.2 ± 11.6 vs. 25.9 ± 12.1, p = 0.036) and height (118.5 ± 21.9 vs. 123.5 ± 22.2, p = 0.011) were observed in SCD children post-intervention. Stunting (10.5% vs. 8.6%, p = 0.62) and wasting (22.1% vs. 14.4%, p = 0.30) reduced in SCD children after NE, although not statistically significantly. A reduction (18.2% vs. 9.1%) in underweight and an increase (18.2% vs. 27.3%) in overweight SCD adults were recorded post-intervention. Fat mass remained the same, while high muscle mass increased (18.2% vs. 27.3%) post-intervention in SCD adults.
The anaemic status of SCD patients improved post-intervention, and the improvement was statistically significant amongst SCD children. Nutrition education improved the NK of SCD caregivers and adults, hence improving the dietary consumption pattern and nutrient intake of SCD patients. Overall, NE improved the nutritional status of SCD patients. This study shows the potential of nutrition education in improving the nutritional knowledge, dietary consumption pattern, dietary intake, and nutritional status of SCD patients, and should be further explored.
Keywords: sickle cell disease, nutrition education, dietary intake, nutritional status
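The significance of a pre/post change of this kind is typically assessed with a paired test on the same subjects' scores. A minimal sketch of the paired t statistic, using invented knowledge scores rather than the study's data:

```python
import math

# Illustrative pre/post nutrition-knowledge scores for the same subjects
# (invented values; the study reports 4.18±1.83 vs. 10.00±1.00 for adults).
pre  = [4, 3, 6, 5, 2, 4, 5, 3]
post = [9, 10, 10, 11, 9, 10, 10, 9]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = sum(diffs) / n
# Sample standard deviation of the paired differences
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
t_stat = mean_d / (sd_d / math.sqrt(n))  # paired t statistic, df = n - 1
print(round(t_stat, 2))
```

A large positive t on n - 1 degrees of freedom corresponds to the small p-values the abstract reports for the knowledge-score improvement.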
Procedia PDF Downloads 103
1505 Job Satisfaction and Associated Factors of Urban Health Extension Professionals in Addis Ababa City, Ethiopia
Authors: Metkel Gebremedhin, Biruk Kebede, Guash Abay
Abstract:
Job satisfaction largely determines the productivity and efficiency of human resources for health. There is scanty evidence on factors influencing the job satisfaction of health extension professionals (HEPs) in Addis Ababa. The objective of this study was to determine the level of, and factors influencing, job satisfaction among health extension workers in Addis Ababa city. This was a cross-sectional study conducted in Addis Ababa, Ethiopia. A multistage sampling technique was employed among the public health centers under the Addis Ababa city administration health bureau: study health centers were selected randomly, and urban health extension professionals were then selected from those centers. In-depth interviews were conducted for a comprehensive understanding of the factors affecting job satisfaction among HEPs in Addis Ababa. HEPs working in Addis Ababa areas were the primary study population. Multivariate logistic regression with 95% CI at p ≤ 0.05 was used to assess factors associated with job satisfaction. The overall satisfaction rate was only 10.7%, while 89.3% were dissatisfied with their jobs. The findings revealed that variables such as marital status, staff relations, community support, supervision, and rewards have a significant influence on the level of job satisfaction. For those who were not satisfied, the working environment, job description, low salary, poor leadership, and lack of training opportunities were the major causes. Other factors influencing the level of satisfaction were lack of medical equipment, lack of transport facilities, lack of training opportunities, and poor support from woreda experts. Our study documented a very low level of overall satisfaction among health extension professionals in Addis Ababa city public health centers.
Considering the factors responsible for this state of affairs, urgent and concrete strategies must be developed to address the concerns of health extension professionals, as they represent a sensitive domain of the health system of Addis Ababa city. Improving the overall work environment, reviewing job descriptions, and offering better salaries might bring about positive change.
Keywords: job satisfaction, extension health professionals, Addis Ababa
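The building block behind associations of this kind is the odds ratio with a 95% confidence interval. The study fits a multivariate logistic regression; the simpler univariate Wald calculation below, on a hypothetical 2x2 table (counts invented, not the study's data), shows the idea.

```python
import math

# Hypothetical 2x2 table: satisfaction by supervision quality
# (counts are invented for illustration; not the study's data).
a, b = 20, 80    # good supervision: satisfied, dissatisfied
c, d = 10, 190   # poor supervision: satisfied, dissatisfied

or_ = (a * d) / (b * c)                   # odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # standard error of ln(OR), Wald
lo = math.exp(math.log(or_) - 1.96 * se)  # lower 95% confidence limit
hi = math.exp(math.log(or_) + 1.96 * se)  # upper 95% confidence limit
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

Because the interval excludes 1, these hypothetical counts would indicate a significant association, the same criterion (at p ≤ 0.05) applied in the study's regression.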
Procedia PDF Downloads 77
1504 Assessing Future Offshore Wind Farms in the Gulf of Roses: Insights from Weather Research and Forecasting Model Version 4.2
Authors: Kurias George, Ildefonso Cuesta Romeo, Clara Salueña Pérez, Jordi Sole Olle
Abstract:
With the growing prevalence of wind energy, there is a need for modeling techniques to evaluate the impact of wind farms on meteorology and oceanography. This study presents an approach that utilizes the WRF (Weather Research and Forecasting) model with a Wind Farm Parametrization to simulate the dynamics around the Parc Tramuntana project, an offshore wind farm to be located near the Gulf of Roses, off the coast of Barcelona, Catalonia. The model incorporates parameterizations for wind turbines, enabling a representation of the wind field and how it interacts with the infrastructure of the wind farm. Current results demonstrate that the model effectively captures variations in temperature, pressure, and both wind speed and direction over time, along with their resulting effects on the power output of the wind farm. These findings are crucial for optimizing turbine placement and operation, thus improving the efficiency and sustainability of the wind farm. In addition to focusing on atmospheric interactions, this study delves into the wake effects among the turbines in the farm. A range of meteorological parameters was also considered to offer a comprehensive understanding of the farm's microclimate. The model was tested under different horizontal resolutions and farm layouts to scrutinize the wind farm's effects more closely. These experimental configurations allow for a nuanced understanding of how turbine wakes interact with each other and with the broader atmospheric and oceanic conditions. This modeling approach serves as a potent tool for stakeholders in renewable energy, environmental protection, and marine spatial planning, providing a range of information regarding the environmental and socio-economic impacts of offshore wind energy projects.
Keywords: weather research and forecasting, wind turbine wake effects, environmental impact, wind farm parametrization, sustainability analysis
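The WRF wind farm parametrization itself is beyond a short sketch, but the single-turbine wake deficit it represents is often approximated in engineering practice with the classic Jensen (Park) model. The sketch below uses illustrative turbine parameters, not those of the Parc Tramuntana project.

```python
import math

def jensen_wake_speed(u_inf, x, rotor_d=150.0, ct=0.8, k=0.05):
    """Jensen (Park) top-hat wake model: wind speed at distance x [m]
    downstream of a turbine with thrust coefficient ct. k is the wake
    decay constant (roughly 0.04-0.05 offshore). All parameter values
    here are illustrative, not the project's."""
    r0 = rotor_d / 2
    deficit = (1 - math.sqrt(1 - ct)) / (1 + k * x / r0) ** 2
    return u_inf * (1 - deficit)

# Wake recovery with downstream distance for a 10 m/s free stream
for x in (300, 750, 1500):  # metres downstream
    print(x, round(jensen_wake_speed(10.0, x), 2))
```

The monotonic recovery toward the free-stream speed is the behaviour that farm-layout studies, and the WRF parametrization at much greater fidelity, trade off against turbine spacing.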
Procedia PDF Downloads 72
1503 Advancing Inclusive Curriculum Development for Special Needs Education in Africa
Authors: Onosedeba Mary Ayayia
Abstract:
Inclusive education has emerged as a critical global imperative, aiming to provide equitable educational opportunities for all, regardless of their abilities or disabilities. In Africa, the pursuit of inclusive education faces significant challenges, particularly concerning the development and implementation of inclusive curricula tailored to the diverse needs of students with disabilities. This study delves into the heart of this issue, seeking to address the pressing problem of exclusion and marginalization of students with disabilities in mainstream educational systems across the continent. The problem is complex, entailing issues of limited access to tailored curricula, shortages of qualified teachers in special needs education, stigmatization, limited research and data, policy gaps, inadequate resources, and limited community awareness. These challenges perpetuate a system where students with disabilities are systematically excluded from quality education, limiting their future opportunities and societal contributions. This research proposes a comprehensive examination of the current state of inclusive curriculum development and implementation in Africa. Through an innovative and explicit exploration of the problem, the study aims to identify effective strategies, guidelines, and best practices that can inform the development of inclusive curricula. These curricula will be designed to address the diverse learning needs of students with disabilities, promote teacher capacity building, combat stigmatization, generate essential data, enhance policy coherence, allocate adequate resources, and raise community awareness. The goal of this research is to contribute to the advancement of inclusive education in Africa by fostering an educational environment where every student, regardless of ability or disability, has equitable access to quality education. 
Through this endeavor, the study aligns with the broader global pursuit of social inclusion and educational equity, emphasizing the importance of inclusive curricula as a foundational step towards a more inclusive and just society.
Keywords: inclusive education, special education, curriculum development, Africa
Procedia PDF Downloads 64
1502 Assessing Carbon Stock and Sequestration of Reforestation Species on Old Mining Sites in Morocco Using the DNDC Model
Authors: Nabil Elkhatri, Mohamed Louay Metougui, Ngonidzashe Chirinda
Abstract:
Mining activities have left a legacy of degraded landscapes, prompting urgent efforts for ecological restoration. Reforestation holds promise as a potent tool to rehabilitate these old mining sites, with the potential to sequester carbon and contribute to climate change mitigation. This study focuses on evaluating the carbon stock and sequestration potential of reforestation species in the context of Morocco's mining areas, employing the DeNitrification-DeComposition (DNDC) model. The research is grounded in recognizing the need to connect theoretical models with practical implementation, ensuring that reforestation efforts are informed by accurate and context-specific data. Field data collection encompasses growth patterns, biomass accumulation, and carbon sequestration rates, establishing an empirical foundation for the study's analyses. By integrating the collected data with the DNDC model, the study aims to provide a comprehensive understanding of carbon dynamics within reforested ecosystems on old mining sites. The major findings reveal varying sequestration rates among different reforestation species, indicating the potential for species-specific optimization of reforestation strategies to enhance carbon capture. This research's significance lies in its potential to contribute to sustainable land management practices and climate change mitigation strategies. By quantifying the carbon stock and sequestration potential of reforestation species, the study serves as a valuable resource for policymakers, land managers, and practitioners involved in ecological restoration and carbon management. Ultimately, the study aligns with global objectives to rejuvenate degraded landscapes while addressing pressing climate challenges.
Keywords: carbon stock, carbon sequestration, DNDC model, ecological restoration, mining sites, Morocco, reforestation, sustainable land management
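Carbon stock estimates of this kind commonly convert measured dry biomass using the IPCC default carbon fraction (0.47) and express sequestration in CO2 equivalents via the 44/12 molar-mass ratio. The species and biomass figures below are purely illustrative, not the study's measurements.

```python
# Converting measured dry biomass to carbon stock and CO2 equivalent,
# using the IPCC default carbon fraction of dry biomass (0.47) and the
# CO2/C molar-mass ratio (44/12). Biomass figures are illustrative.

CARBON_FRACTION = 0.47
CO2_PER_C = 44 / 12

def carbon_stock(biomass_t_per_ha):
    """Carbon stock [t C/ha] from dry biomass [t/ha]."""
    return biomass_t_per_ha * CARBON_FRACTION

def co2_equivalent(c_stock):
    """CO2 sequestered [t CO2/ha] for a given carbon stock [t C/ha]."""
    return c_stock * CO2_PER_C

# Hypothetical reforestation plots (t dry biomass per hectare)
plots = {"species A": 38.0, "species B": 22.5}
for species, biomass in plots.items():
    c = carbon_stock(biomass)
    print(species, round(c, 1), round(co2_equivalent(c), 1))
```

A process model such as DNDC goes considerably further, simulating how soil and climate drivers change these stocks over time, but this conversion is the common accounting step underneath.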
Procedia PDF Downloads 76
1501 The Diary of Dracula, by Marin Mincu: Inquiries into a Romanian 'Book of Wisdom' as a Fictional Counterpart for Corpus Hermeticum
Authors: Lucian Vasile Bagiu, Paraschiva Bagiu
Abstract:
The novel written in Italian and published in Italy in 1992 by the Romanian scholar Marin Mincu is meant for the foreign reader, aiming apparently at a better knowledge of the historical character of Vlad the Impaler (Vlad Dracul), within the European cultural, political and historical context of 1463. Throughout the very well written tome, one comes to realize that one of the underlying levels of the fiction is the exposing of various fundamental features of Romanian culture and civilization. The author of the diary, Dracula, makes mention of Corpus Hermeticum no less than fifteen times, suggesting his own diary is some sort of a philosophical counterpart. The essay focuses on several ‘truths’ and ‘wisdom’ revealed in the fictional teachings of Dracula. The boycott of History by the Romanians is identified as an echo of the philosophical approach of the famous Romanian scholar and writer Lucian Blaga. The orality of Romanian culture is a landmark opposed to the written culture of Western Europe. The religion of the ancient Dacian god Zalmoxis is seen as the basis for the Romanian existential and/or metaphysical ethnic philosophy (a feature tackled by the famous Romanian historian of religion Mircea Eliade), with a suggestion that Hermes Trismegistus may have written his Corpus Hermeticum under the influence of Zalmoxis. The historical figure of the last Dacian king, Decebalus (d. 106 AD), is a good pretext for a tantalizing Indo-European suggestion that the prehistoric Thraco-Dacian people may have been the ancestors of the first Romans settled in Latium. The lost diary of the Emperor Trajan, De Bello Dacico, may have proved that the unknown language of the Dacians was very much like the Latin language (a secret well hidden by the Vatican). The attitude towards death of the Dacians, as described by Herodotus, may have later inspired Pythagoras, Socrates, the Eleusinian and Orphic Mysteries, etc.
All of this within the Humanist and Renaissance European context of the epoch, Dracula having a close relationship with scholars such as Nicolaus Cusanus, Cosimo de' Medici, Marsilio Ficino, Pope Pius II, etc. Thus, The Diary of Dracula turns out as exciting and stupefying as Corpus Hermeticum: a book impossible to assimilate entirely, yet a reference not wise to ignore.
Keywords: Corpus Hermeticum, Dacians, Dracula, Zalmoxis
Procedia PDF Downloads 159
1500 Canada's "Flattened Curve": A Geospatial Temporal Analysis of Canada's Amelioration of the Sars-COV-2 Pandemic Through Coordinated Government Intervention
Authors: John Ahluwalia
Abstract:
As an affluent first-world nation, Canada took swift and comprehensive action during the outbreak of the SARS-CoV-2 (COVID-19) pandemic compared to other countries in the same socio-economic cohort. The United States has struggled to overcome obstacles most developed nations have faced, which has led to significantly more per capita cases and deaths. The initial outbreaks of COVID-19 occurred in the US and Canada within days of each other and posed similar potentially catastrophic threats to public health, the economy, and governmental stability. On a macro level, events that take place in the US have a direct impact on Canada. For example, both countries tend to enter and exit economic recessions at approximately the same time, they are each other's largest trading partners, and their currencies are inextricably linked. Why is it, then, that Canada has not shared the same fate as the US (and many other nations) that have realized much worse outcomes relative to the COVID-19 pandemic? Variables intrinsic to Canada's national infrastructure have been instrumental in the country's efforts to flatten the curve of COVID-19 cases and deaths. Canada's coordinated multi-level governmental effort has allowed it to create and enforce policies related to COVID-19 at both the national and provincial levels. Canada's policy of universal healthcare is another variable. Health care and public health measures are enforced on a provincial level, and it is within each province's jurisdiction to dictate standards for public safety based on scientific evidence. Rather than introducing confusion and the possibility of competition for resources such as PPE and vaccines, Canada's multi-level chain of government authority has provided consistent policies supporting national public health and local delivery of medical care.
This paper will demonstrate that the coordinated efforts on provincial and federal levels have been the linchpin in Canada's relative success in containing the deadly spread of the COVID-19 virus.
Keywords: COVID-19, Canada, GIS, temporal analysis, ESRI
Procedia PDF Downloads 147
1499 One Health Approach: The Importance of Improving the Identification of Waterborne Bacteria in Austrian Water
Authors: Aurora Gitto, Philipp Proksch
Abstract:
The presence of various microorganisms (bacteria, fungi) in surface water and groundwater represents an important issue for human health worldwide. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF-MS) has emerged as a promising and reliable tool for bacteria identification in clinical diagnostic microbiology and for environmental strains, thanks to an ionization technique that uses a laser-energy-absorbing matrix to create ions from large molecules with minimal fragmentation. The study aims first to conceptualise and set up library information and create a comprehensive database of MALDI-TOF-MS spectra from environmental water samples. The samples were analysed over one year (2021-2022) using membrane filtration (0.45 μm and 0.22 μm), and isolates were then grown on R2A agar for a period of 5 days and on yeast extract agar at 22 °C for up to 4 days and at 37 °C for 48 hours. The organisms not detected by MALDI-TOF-MS were analysed by PCR and then sequenced. The information obtained by the sequencing was further implemented in the MALDI-TOF-MS library. Among the culturable bacteria, the results show how the incubation temperature affects the growth of some genera rather than others, as demonstrated by Pseudomonas sp., which grows at 22 °C, compared to Bacillus sp., which is abundant at 37 °C. The bacterial community also shows a variation in composition between the media used, as demonstrated by R2A agar, on which a higher proportion of organisms went undetected compared to YEA. Also of interest is the variability of the genera over one year of sampling and how seasonality impacts the bacterial community; in fact, in some sampling locations, we observed how the composition changed, moving from winter to spring and summer.
In conclusion, the bacterial community in groundwater and riverbank filtration represents important information that needs to be added to the library, to simplify future water quality analysis but mainly to prevent potential risks to human health.
Keywords: water quality, MALDI-TOF-MS, sequencing, library
Procedia PDF Downloads 83
1498 Desulphurization of Waste Tire Pyrolytic Oil (TPO) Using Photodegradation and Adsorption Techniques
Authors: Moshe Mello, Hilary Rutto, Tumisang Seodigeng
Abstract:
The nature of tires makes them extremely challenging to recycle due to the chemically cross-linked polymer they contain; they are therefore neither fusible nor soluble and, consequently, cannot be remolded into other shapes without serious degradation. Open dumping of tires pollutes the soil, contaminates underground water, and provides ideal breeding grounds for disease-carrying vermin. The thermal decomposition of tires by pyrolysis produces char, gases, and oil. The oils derived from waste tires have properties in common with commercial diesel fuel. The problem associated with the light oil derived from the pyrolysis of waste tires is its high sulfur content (> 1.0 wt.%), which means it emits harmful sulfur oxide (SOx) gases to the atmosphere when combusted in diesel engines. Desulphurization of TPO is necessary due to increasingly stringent environmental regulations worldwide. Hydrodesulphurization (HDS) is the commonly practiced technique for the removal of sulfur species from liquid hydrocarbons. However, the HDS technique fails in the presence of complex sulfur species such as dibenzothiophene (DBT), present in TPO. This study aims to investigate the viability of photodegradation (photocatalytic oxidative desulphurization) and adsorptive desulphurization technologies for the efficient removal of complex and non-complex sulfur species from TPO. This study focuses on optimizing the cleaning process (removal of impurities and asphaltenes) by varying process parameters: temperature, stirring speed, acid/oil ratio, and time. The treated TPO will then be sent for vacuum distillation to attain the desired diesel-like fuel. The effect of temperature, pressure, and time will be determined for vacuum distillation of both raw TPO and the acid-treated oil for comparison purposes.
Polycyclic sulfides present in the distilled (diesel-like) light oil will be oxidized, dominantly to the corresponding sulfoxides and sulfones, via a photocatalyzed system using TiO2 as a catalyst and hydrogen peroxide as an oxidizing agent; finally, acetonitrile will be used as an extraction solvent. Adsorptive desulphurization will then be used to adsorb the traces of sulfurous compounds remaining after the photocatalytic desulphurization step. This combined desulphurization scheme is expected to give high desulphurization efficiency with reasonable oil recovery.
Keywords: adsorption, asphaltenes, photocatalytic oxidation, pyrolysis
Procedia PDF Downloads 272
1497 Land Cover Mapping Using Sentinel-2, Landsat-8 Satellite Images, and Google Earth Engine: A Study Case of the Beterou Catchment
Authors: Ella Sèdé Maforikan
Abstract:
Accurate land cover mapping is essential for effective environmental monitoring and natural resources management. This study focuses on assessing the classification performance of two satellite datasets and evaluating the impact of different input feature combinations on classification accuracy in the Beterou catchment, situated in the northern part of Benin. Landsat-8 and Sentinel-2 images from June 1, 2020, to March 31, 2021, were utilized. Employing the Random Forest (RF) algorithm on Google Earth Engine (GEE), a supervised classification categorized the land into five classes: forest, savannas, cropland, settlement, and water bodies. GEE was chosen due to its high-performance computing capabilities, mitigating computational burdens associated with traditional land cover classification methods. By eliminating the need for individual satellite image downloads and providing access to an extensive archive of remote sensing data, GEE facilitated efficient model training on remote sensing data. The study achieved commendable overall accuracy (OA), ranging from 84% to 85%, even without incorporating spectral indices and terrain metrics into the model. Notably, the inclusion of additional input sources, specifically terrain features like slope and elevation, enhanced classification accuracy. The highest accuracy was achieved with Sentinel-2 (OA = 91%, Kappa = 0.88), slightly surpassing Landsat-8 (OA = 90%, Kappa = 0.87). This underscores the significance of combining diverse input sources for optimal accuracy in land cover mapping. The methodology presented herein not only enables the creation of precise, expeditious land cover maps but also demonstrates the prowess of cloud computing through GEE for large-scale land cover mapping with remarkable accuracy. The study emphasizes the synergy of different input sources to achieve superior accuracy. 
As a future recommendation, the application of Light Detection and Ranging (LiDAR) technology is proposed to enhance vegetation type differentiation in the Beterou catchment. Additionally, a cross-comparison between Sentinel-2 and Landsat-8 for assessing long-term land cover changes is suggested.
Keywords: land cover mapping, Google Earth Engine, random forest, Beterou catchment
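The accuracy figures reported above come from a classification error matrix. As a minimal sketch, the Python snippet below computes overall accuracy (OA) and Cohen's kappa from a hypothetical 5-class confusion matrix; the counts are illustrative only, not the study's actual error matrix, but the formulas are the standard ones behind figures such as OA = 91%, Kappa = 0.88.

```python
# Sketch: overall accuracy (OA) and Cohen's kappa from a land-cover
# confusion matrix. The 5x5 counts below are hypothetical; the class
# scheme follows the abstract (forest, savanna, cropland, settlement,
# water bodies).
confusion = [
    [120,   5,   3,   0,   0],   # rows = reference class
    [  6, 110,   8,   1,   0],   # columns = predicted class
    [  4,   9, 130,   2,   0],
    [  0,   2,   3,  60,   1],
    [  0,   0,   0,   1,  40],
]

def oa_and_kappa(m):
    n = sum(sum(row) for row in m)                 # total samples
    oa = sum(m[i][i] for i in range(len(m))) / n   # observed agreement
    # expected chance agreement from the row/column marginals
    pe = sum(sum(m[i]) * sum(r[i] for r in m) for i in range(len(m))) / n**2
    kappa = (oa - pe) / (1 - pe)
    return oa, kappa

oa, kappa = oa_and_kappa(confusion)
print(f"OA = {oa:.2%}, Kappa = {kappa:.2f}")
```

In practice these two numbers are reported together because OA alone can be inflated by chance agreement on dominant classes, which kappa corrects for.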
Procedia PDF Downloads 63
1496 The Development of an Accident Causation Model Specific to Agriculture: The Irish Farm Accident Causation Model
Authors: Carolyn Scott, Rachel Nugent
Abstract:
The agricultural industry, in Ireland and worldwide, is one of the most dangerous occupations with respect to occupational health and safety accidents and fatalities. Many accident causation models have been developed in safety research to understand the underlying and contributory factors that lead to the occurrence of an accident. Due to the uniqueness of the agricultural sector, however, existing accident causation theories cannot be applied directly. This paper presents an accident causation model, the Irish Farm Accident Causation Model (IFACM), tailored specifically to the needs of Irish farms. The IFACM is a theoretical and practical model of accident causation that arranges the causal factors into a graphic representation of the originating, shaping, and contributory factors that lead to accidents when unsafe acts and conditions arise and are not rectified by control measures. Causes of farm accidents were gathered through a thorough literature review and collated into a graphical representation of the underlying causes of a farm accident. The IFACM was validated retrospectively through case study analysis and peer review. Participants in the case study (n=10) identified causes that led to a farm accident in which they were involved. A root cause analysis was conducted to understand the contributory factors surrounding each farm accident, traced back to the ‘root cause’. Experts in farm safety and accident causation in the agricultural industry have peer reviewed the IFACM. The accident causation process is complex, and accident prevention requires a comprehensive understanding of it: to prevent accidents, their causes must be known. There is little research on the key causes and contributory factors of unsafe behaviours and accidents on Irish farms. The focus of this research is therefore to gain a deep understanding of the causality of accidents on Irish farms.
The results suggest that the IFACM framework is helpful for the analysis of the causes of accidents within the agricultural industry in Ireland. The research also suggests that there may be international applicability if further research is carried out. Furthermore, significant learning can be obtained from considering the underlying causes of accidents.
Keywords: farm safety, farm accidents, accident causation, root cause analysis
Procedia PDF Downloads 78
1495 Antibacterial Effects of Some Medicinal and Aromatic Plant Extracts on Pathogenic Bacteria Isolated from Pear Orchards
Authors: Kubilay Kurtulus Bastas
Abstract:
Bacterial diseases are very destructive and cause economic losses on pears. Plant extracts are promising for the management of plant diseases: they are environmentally safe and long-lasting, and extracts of certain plants contain alkaloids, tannins, quinones, coumarins, phenolic compounds, and phytoalexins. In this study, bacteria were isolated from different parts of pear trees exhibiting characteristic symptoms of bacterial diseases in Central Anatolia, Turkey. Pathogenic bacteria were identified by morphological, physiological, biochemical and molecular methods as fire blight (Erwinia amylovora, 39%), bacterial blossom blast and blister bark (Pseudomonas syringae pv. syringae, 22%), and crown gall (Rhizobium radiobacter, 1%) from different pear cultivars, and the virulence levels of the pathogens were determined with pathogenicity tests. The air-dried material of 25 plants was ground into fine powder, and extraction was performed at room temperature by maceration with 80% (v/v) methanol/distilled water. The minimum inhibitory concentration (MIC) values were determined using a modified disc diffusion method at five different concentrations, with streptomycin sulphate as the control chemical. Bacterial suspensions were prepared at densities of 10⁸ CFU ml⁻¹, and 100 µl of each suspension was spread on TSA medium. Antimicrobial activity was evaluated by measuring the inhibition zones for the test organisms. Among the tested plants, Origanum vulgare, Hedera helix, Satureja hortensis, Rhus coriaria, Eucalyptus globulus, Rosmarinus officinalis, Ocimum basilicum, Salvia officinalis, Cuminum cyminum and Thymus vulgaris showed good antibacterial activity, inhibiting the growth of the pathogens with inhibition zone diameters ranging from 7 to 27 mm at 20% (w/v) in absolute methanol under in vitro conditions. In vivo, the highest efficacy was 27% in reducing tumor formation by R. radiobacter, and 48% and 41% in reducing shoot blight of E. amylovora and P. s. pv.
syringae on pear seedlings, respectively. The obtained data indicate that some plant extracts may be used against bacterial diseases of pome fruits within sustainable and organic management programs.
Keywords: bacteria, eco-friendly management, organic, pear, plant extract
Procedia PDF Downloads 335
1494 Transcriptome Analysis of Saffron (Crocus sativus L.) Stigma Focusing on Identification of Genes Involved in the Biosynthesis of Crocin
Authors: Parvaneh Mahmoudi, Ahmad Moeni, Seyed Mojtaba Khayam Nekoei, Mohsen Mardi, Mehrshad Zeinolabedini, Ghasem Hosseini Salekdeh
Abstract:
Saffron (Crocus sativus L.) is one of the most important spice and medicinal plants. The three-branched style of C. sativus flowers is the most economically important part of the plant and is known as saffron, which has several medicinal properties. Despite the economic and biological significance of this plant, knowledge of its molecular characteristics is very limited. In the present study, we constructed, for the first time, a comprehensive dataset for the C. sativus stigma through de novo transcriptome sequencing using the Illumina paired-end sequencing technology. A total of 52,075,128 reads were generated and assembled into 118,075 unigenes, with an average length of 629 bp and an N50 of 951 bp. Of these, 66,171 unigenes (56%) were annotated in the non-redundant National Center for Biotechnology Information (NCBI) database, 30,938 (26%) were annotated in the Swiss-Prot database, and 10,273 (8.7%) were mapped to 141 Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways, while 52,560 (44%) and 40,756 (34%) were assigned to Gene Ontology (GO) categories and Eukaryotic Orthologous Groups of proteins (KOG), respectively. In addition, 65 candidate genes involved in three stages of crocin biosynthesis were identified. Finally, transcriptome sequencing of the saffron stigma was used to identify 6,779 potential microsatellite (SSR) molecular markers. High-throughput de novo transcriptome sequencing provided a valuable resource of C. sativus transcript sequences in public databases. Most of the candidate genes potentially involved in crocin biosynthesis were identified and could be further utilized in functional genomics studies. Furthermore, the numerous SSRs obtained may help address open questions about the origin of this amphiploid species, which probably has little genetic diversity.
Keywords: saffron, transcriptome, NGS, bioinformatics
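The N50 statistic quoted above (951 bp) is computed directly from the assembled unigene lengths: N50 is the length L such that unigenes of length ≥ L together cover at least half of the assembled bases. A minimal Python sketch with made-up lengths (not the study's actual assembly):

```python
# Sketch: computing the N50 assembly statistic from contig/unigene
# lengths. The lengths below are illustrative, not from the study.
def n50(lengths):
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        # stop at the first length where the cumulative sum reaches
        # half of the total assembled bases
        if running * 2 >= total:
            return length

unigene_lengths = [1500, 1200, 951, 800, 629, 500, 400, 300, 200]
print(n50(unigene_lengths))  # → 951
```

Unlike the mean length, N50 weights long sequences by the bases they contribute, which is why it is the standard contiguity metric for assemblies.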
Procedia PDF Downloads 100
1493 Creation and Implementation of a New Palliative Care Drug Chart via a Closed-Loop Audit
Authors: Asfa Hussain, Chee Tang, Mien Nguyen
Abstract:
Introduction: The safe use of medications depends on clear, well-documented prescribing. Medical drug charts should be checked regularly to ensure that they are fit for purpose. Aims: The purpose of this study was to evaluate whether the Isabel Hospice drug charts were effective or prone to medical errors, and to create a comprehensive palliative care drug chart in line with medico-legal guidelines that minimises drug administration and prescription errors. Methodology: 50 medical drug charts were audited from March to April 2020 in a hospice in the East of England to assess whether they complied with medico-legal guidelines. Meetings were held with the wider multi-disciplinary team (MDT), including the pharmacists, nursing staff and doctors, to raise awareness of the issue. A preliminary drug chart was created with input from the wider MDT. The chart was revised and trialled more than 15 times, and each time feedback from the MDT was incorporated into the subsequent template. In the midst of the COVID-19 pandemic, in September 2020, the finalised drug chart was introduced, and 50 new palliative drug charts were re-audited to evaluate the changes made. Results: Prescribing and administration errors were high prior to the implementation of the new chart and improved significantly after its introduction, improving patient safety and care. The percentage of inadequately documented allergies fell from 66% to 20%, and incorrect oxygen prescriptions fell from 40% to 16%. Prescribed drug-drug interactions decreased by 30%. Conclusion: It is vital to have clear, standardised drug charts, in line with medico-legal standards, to allow ease of prescription and administration of medications and ensure optimum patient-centred care.
This closed-loop audit demonstrated significant improvement in documentation and the prevention of potentially fatal drug errors and interactions.
Keywords: palliative care, drug chart, medication errors, drug-drug interactions, COVID-19, patient safety
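The before/after percentages above translate into chart counts once the audit sample size (50 charts per cycle) is applied. The short Python sketch below uses only the figures quoted in this abstract; the chart counts it derives are arithmetic illustrations, not data taken from the study itself.

```python
# Sketch: converting the audit's before/after percentages into chart
# counts, assuming 50 charts were reviewed in each audit cycle as the
# abstract states. Percentages are the abstract's; counts are derived.
n_charts = 50

metrics = {
    "inadequately documented allergies": (0.66, 0.20),
    "incorrect oxygen prescription":     (0.40, 0.16),
}

for name, (before, after) in metrics.items():
    n_before = round(before * n_charts)
    n_after = round(after * n_charts)
    drop = (before - after) * 100          # percentage points
    print(f"{name}: {n_before} -> {n_after} charts "
          f"({drop:.0f} percentage-point reduction)")
```

Reporting the change in percentage points alongside the raw counts avoids the ambiguity of a bare "decreased by 30%", which could otherwise be read as either an absolute or a relative reduction.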
Procedia PDF Downloads 176
1492 BiFormerDTA: Structural Embedding of Protein in Drug Target Affinity Prediction Using BiFormer
Authors: Leila Baghaarabani, Parvin Razzaghi, Mennatolla Magdy Mostafa, Ahmad Albaqsami, Al Warith Al Rushaidi, Masoud Al Rawahi
Abstract:
Predicting the interaction between drugs and their molecular targets is pivotal for advancing drug development. Given time and cost constraints, computational approaches have emerged as an effective means of drug-target interaction (DTI) prediction. Most computational approaches use the drug molecule and the protein sequence as input. This study not only utilizes these inputs but also introduces a protein representation derived from a masked protein language model. In this representation, every amino acid residue in the protein sequence has a corresponding probability distribution indicating the likelihood of each amino acid occurring at that position. The similarity between each pair of residues is then computed to create a similarity matrix. To encode the knowledge in the similarity matrix, Bi-Level Routing Attention (BiFormer) is utilized, which combines aspects of transformer-based models with protein sequence analysis and represents a significant advance in drug-protein interaction prediction. BiFormer can pinpoint the regions of the protein sequence most responsible for facilitating interactions between the protein and drugs, enhancing the understanding of these critical interactions. It thus appears promising in its ability to capture the local structural relationships of proteins and how they contribute to drug-protein interactions, thereby facilitating more accurate predictions. To evaluate the proposed method, it was tested on two widely recognized datasets: Davis and KIBA.
A comprehensive series of experiments was conducted to illustrate its effectiveness in comparison to cutting-edge techniques.
Keywords: BiFormer, transformer, protein language processing, self-attention mechanism, binding affinity, drug target interaction, similarity matrix, protein masked representation, protein language model
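The per-residue probability distributions described above can be turned into the residue-residue similarity matrix that is then encoded. The abstract does not specify the similarity measure, so the sketch below uses cosine similarity as an illustrative assumption, with random distributions standing in for real masked-language-model outputs.

```python
import numpy as np

# Sketch: a residue-residue similarity matrix from per-position
# amino-acid probability distributions. Cosine similarity and the
# random distributions are assumptions for illustration only; the
# study's actual measure and model outputs may differ.
rng = np.random.default_rng(0)

seq_len, n_amino_acids = 6, 20
probs = rng.random((seq_len, n_amino_acids))
probs /= probs.sum(axis=1, keepdims=True)   # each row sums to 1

# cosine similarity between every pair of residue distributions
norms = np.linalg.norm(probs, axis=1)
sim = (probs @ probs.T) / np.outer(norms, norms)

print(sim.shape)   # a (seq_len, seq_len) matrix for the encoder
```

Because the distributions are non-negative, every entry of the matrix lies in [0, 1] and the diagonal is exactly 1, giving the downstream attention mechanism a bounded, symmetric input.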
Procedia PDF Downloads 8