Search results for: data driven decision making
26417 Modelling Forest Fire Risk in the Goaso Forest Area of Ghana: Remote Sensing and Geographic Information Systems Approach
Authors: Bernard Kumi-Boateng, Issaka Yakubu
Abstract:
Forest fire, which is an uncontrolled fire occurring in nature, has become a major concern for the Forestry Commission of Ghana (FCG). The forest fires in Ghana usually result in massive destruction and take a long time for the firefighting crews to gain control over the situation. In order to assess the effect of forest fire at local scale, it is important to consider the role fire plays in vegetation composition, biodiversity, soil erosion, and the hydrological cycle. The occurrence, frequency and behaviour of forest fires vary over time and space, primarily as a result of the complicated influences of changes in land use, vegetation composition, fire suppression efforts, and other indigenous factors. One of the forest zones in Ghana with a high level of vegetation stress is the Goaso forest area. The area has experienced changes in its traditional land use such as hunting, charcoal production, inefficient logging practices and rural abandonment patterns. These factors, which were identified as major causes of forest fire, have recently modified the incidence of fire in the Goaso area. In spite of the incidence of forest fires in the Goaso forest area, most of the forest services do not provide a cartographic representation of the burned areas. This has resulted in a significant amount of information being required by the firefighting unit of the FCG to understand fire risk factors and their spatial effects. This study uses Remote Sensing and Geographic Information System techniques to develop a fire risk hazard model using the Goaso Forest Area (GFA) as a case study. From the results of the study, natural forest, agricultural lands and plantation cover types were identified as the major fuel contributing loads, whereas water bodies, roads and settlements were identified as minor fuel contributing loads. Based on the major and minor fuel contributing loads, a forest fire risk hazard model with reasonable accuracy has been developed for the GFA to assist decision making. Keywords: forest, GIS, remote sensing, Goaso
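To make the modelling step above more tangible, the sketch below shows the general shape of a weighted-overlay fire-risk index in which land-cover classes identified as major fuel contributors receive high weights and minor contributors receive low ones. The weights, layer choices and thresholds are illustrative assumptions and are not values taken from the study.

```python
import numpy as np

# Hypothetical fuel-load weights per land-cover class (illustrative, not from the study);
# integer codes 0-5 in the land-cover raster index this array in order.
FUEL_WEIGHTS = np.array([
    0.9,  # 0: natural forest     (major contributor)
    0.8,  # 1: plantation         (major contributor)
    0.7,  # 2: agricultural land  (major contributor)
    0.2,  # 3: settlement         (minor contributor)
    0.1,  # 4: road               (minor contributor)
    0.0,  # 5: water body         (minor contributor)
])

def fire_risk(landcover: np.ndarray, slope: np.ndarray, dist_to_road: np.ndarray) -> np.ndarray:
    """Weighted overlay of fuel load, slope and accessibility into a 0-1 risk index."""
    fuel = FUEL_WEIGHTS[landcover]
    slope_n = slope / slope.max()                        # steeper terrain spreads fire faster
    access = 1.0 - np.clip(dist_to_road / 5000.0, 0, 1)  # ignition more likely near roads
    return 0.6 * fuel + 0.25 * slope_n + 0.15 * access   # assumed layer weights

# Toy 3x3 study area (class codes, slope in degrees, distance to road in metres).
landcover = np.array([[0, 1, 2], [2, 3, 4], [5, 0, 1]])
slope = np.array([[5.0, 12.0, 3.0], [8.0, 2.0, 1.0], [0.5, 20.0, 15.0]])
dist_to_road = np.array([[100.0, 4000.0, 250.0], [900.0, 50.0, 30.0], [10.0, 6000.0, 3000.0]])

risk = fire_risk(landcover, slope, dist_to_road)
hazard_class = np.digitize(risk, bins=[0.33, 0.66])      # 0 = low, 1 = moderate, 2 = high
print(np.round(risk, 2))
print(hazard_class)
```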
Procedia PDF Downloads 462
26416 Software Architecture Implications on Development Productivity: A Case of Malawi Point of Care Electronic Medical Records
Authors: Emmanuel Mkambankhani, Tiwonge Manda
Abstract:
Software platform architecture includes system components, their relationships, and design, as well as evolution principles. Software architecture and documentation affect a platform's customizability and openness to external innovators, thus affecting developer productivity. The Malawi Point of Care (POC) Electronic Medical Records System (EMRS) follows some architectural design standards, but it lacks third-party innovators and is difficult to customize compared to CommCare and District Health Information System 2 (DHIS2). Improving the software architecture and documentation of the Malawi POC will increase productivity and third-party contributions. A conceptual framework based on Generativity and the Boundary Resource Model (BRM) was used to compare the three platforms. Interviews, observations, and document analysis were used to collect primary and secondary data. Themes were identified by analyzing the qualitative and quantitative data, which led to the following results: configurable, flexible, and cross-platform software platforms, together with the availability of interfaces (boundary resources) that let internal and external developers interact with the platform's core functionality, boost developer productivity. Furthermore, documentation increases developer productivity, while its absence inhibits the use of resources. The study suggests that the architecture and openness of the Malawi POC EMR software platform would be improved by standardizing web application programming interfaces (APIs) and providing interfaces that can be customized by the user. In addition, increasing the availability of documentation and training will improve the use of boundary resources, thus improving internal and third-party development productivity. Keywords: health systems, configurable platforms, software architecture, software documentation, software development productivity
Procedia PDF Downloads 93
26415 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtimes and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%. Keywords: anomaly detection, autoencoder, data centers, deep learning
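A minimal sketch of the reconstruction-plus-classification pipeline described above is given below, assuming a single sensor, a toy data set and illustrative hyperparameters (window length, layer sizes, epochs); the authors' exact architecture, correlation-based sensor selection and labelling of historical anomalies are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from tensorflow import keras
from tensorflow.keras import layers

WINDOW = 32  # samples per sequence (illustrative, not the paper's value)

def make_lstm_autoencoder(window: int, n_features: int = 1) -> keras.Model:
    """One LSTM autoencoder per sensor, trained to reconstruct normal sequences."""
    model = keras.Sequential([
        layers.Input(shape=(window, n_features)),
        layers.LSTM(16),                         # encoder
        layers.RepeatVector(window),
        layers.LSTM(16, return_sequences=True),  # decoder
        layers.TimeDistributed(layers.Dense(n_features)),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

def residual_features(model: keras.Model, x: np.ndarray) -> np.ndarray:
    """Per-window statistics of the 'difference signal' between input and reconstruction."""
    err = np.abs(x - model.predict(x, verbose=0))
    return np.column_stack([err.mean(axis=(1, 2)), err.max(axis=(1, 2)), err.std(axis=(1, 2))])

# Toy single-sensor data: normal windows for the autoencoder, labelled windows for the classifier.
rng = np.random.default_rng(0)
x_normal = rng.normal(size=(200, WINDOW, 1)).astype("float32")
x_labelled = rng.normal(size=(60, WINDOW, 1)).astype("float32")
x_labelled[30:] += 3.0                                   # injected anomalies
y_labelled = np.r_[np.zeros(30), np.ones(30)]

ae = make_lstm_autoencoder(WINDOW)
ae.fit(x_normal, x_normal, epochs=5, batch_size=32, verbose=0)  # train on normal samples only

X_feat = residual_features(ae, x_labelled)               # with several sensors: hstack their features
X_tr, X_te, y_tr, y_te = train_test_split(X_feat, y_labelled, test_size=0.4,
                                          random_state=0, stratify=y_labelled)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("accuracy on held-out windows:", clf.score(X_te, y_te))
```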
Procedia PDF Downloads 198
26414 An Interpretative Phenomenological Analysis on the Concept of Friends of Children in Conflict with the Law
Authors: Karla Kristine Bay, Jovie Ann Gabin, Allana Joyce Sasotona
Abstract:
This research employed an Interpretative Phenomenological Analysis to explore the experiences of Children in Conflict with the Law (CICL) which gave light to their concept of ‘friends’. Derived from this context are the following objectives of the study: 1) determining the differentiation of the forms of friends of the CICL; 2) presenting the process of attachment towards detachment in the formation of friendship; and 3) discussing the experiences, and reflections of the CICL on the ‘self’ out of their encounter with friendship. Using the data gathered from the individual drawings of the CICL of their representations of the self, family, friends, community, and Bahay Kalinga as subjects in the meaning-making process utilizing Filipino Psychology methods of pagtatanong-tanong (interview), and pakikipagkwentuhan (conversation), data analysis produced a synthesis of seventeen individual cases. Overall results generated three superordinate themes on the differentiation of the forms of friends which include friends with good influences, friends with bad influences, and friends within the family. While two superordinate themes were produced on the process of attachment towards detachment, namely social, emotional, and psychological experiences on the process of attachment, and emotional and psychological experiences on the process of detachment. Lastly, two superordinate themes were created on the experiences, and reflections of the CICL on the ‘self’ out of their encounter with friendship. This consists of the recognition of the ‘self’ as a responsible agent in developing healthy relationships between the self and others, and reconstruction of the self from the collective experiences of healing, forgiveness, and acceptance. These findings, together with supporting theories discussed the impact of friendship on the emergence of criminal behavior and other dispositions; springing from the child’s dissociation from the family that led to finding belongingness from an external group called friends.Keywords: children in conflict with the law, criminal behavior, friends, interpretative phenomenological analysis
Procedia PDF Downloads 237
26413 Study of Parking Demand for Offices – Case Study: Kolkata
Authors: Sanghamitra Roy
Abstract:
In recent times, India has experienced the phenomenal rise in the number of registered vehicles and vehicular trips, particularly intra-city trips in most of its urban areas. The increase in vehicle ownership and use have increased parking demand immensely and accommodating the same is now a matter of big concern. Most cities do not have adequate off-street parking facilities thus forcing people to park on the streets. This has resulted in decreased carrying capacity, decreased traffic speed, increased congestion, and increased environmental problems. While integrated multi-modal transportation system is the answer to such problems, parking issues will continue to exist. In Kolkata, only 6.4% land is devoted for roads. The consequences of this huge crunch in road spaces coupled with increased parking demand are severe particularly in the CBD and major commercial areas, making the role of off-street parking facilities in Kolkata even more critical. To meaningfully address parking issues, it is important to identify the factors that influence parking demand so that it can be assessed and comprehensive parking policies and plans for the city can be formulated. This paper aims at identifying the factors that contribute towards parking demand for offices in Kolkata and their degree of correlation with parking demand. The study is limited to home-to-work trips located within Kolkata Municipal Corporation (KMC) where parking related issues are most pronounced. The data for the study is collected through personal interviews, questionnaires and direct observations from offices across the wards of KMC. SPSS is used for classification of the data and analyses of the same. The findings of this study will help in re-assessment of the parking requirements specified in The Kolkata Municipal Corporation Building Rules as a step towards alleviating parking related issues in the city.Keywords: building rules, office spaces, parking demand, urbanization
Procedia PDF Downloads 318
26412 The Impact of the Chanpyons Credible Messenger Intervention on Breast Cancer Screening Rates among Haitian Creole Women
Authors: Zachary Bernard, Dorothy Dillard
Abstract:
Background: Haitian Creole women in Sussex County, Delaware, experience significant disparities in breast cancer outcomes, exacerbated by cultural, linguistic, and socioeconomic barriers. The Chanpyons Credible Messenger Intervention was developed to address these disparities through culturally tailored education, logistical support, and the engagement of trusted community members as credible messengers. Method: This mixed-methods study combined quantitative analysis of screening rates pre- and post-intervention, using demographic data from 85 participants, with qualitative interviews to explore participants' perceptions, barriers, and experiences. Results: Of the participants, 22.35% had received a mammogram before the program, compared to 50.59% after its implementation, marking a significant increase in screening rates. Women with private insurance had higher up-to-date screening rates (78.95%) compared to uninsured women (36.36%). Qualitative findings revealed that credible messengers effectively built trust, addressed cultural misconceptions, and alleviated fear, empowering women to prioritize preventive care. Conclusion: The study demonstrates the success of culturally specific interventions in increasing breast cancer screening rates and reducing health disparities. The Chanpyons model highlights the importance of integrating community-driven approaches in public health programs, offering a replicable framework for addressing similar challenges in underserved populations.Keywords: breast cancer, community engagement, Haitian Creole women, credible messengers, health disparities, preventive care
Procedia PDF Downloads 12
26411 Adaptive Architecture and Urbanism - A Study of Coastal Cities, Climate Change Problems, Effects, Risks and Opportunities for Making Sustainable Habitat
Authors: Santosh Kumar Ketham
Abstract:
Climate change is creating dramatic and destructive consequences: global warming and sea-level rise are flooding coastal cities around the world, creating vulnerable situations that affect the environment, the economy, and social and political life. The aim of the research is to develop cities on water, taking the problem as an opportunity to bring science, engineering, policy and design together to create a resilient and sustainable floating community, drawing on existing and new floating technologies. The quest is to make a sustainable habitat on water in which to live, work, learn and play, with sustainable energy generation and storage, while maintaining the balance of land and marine environments to conserve the ecosystem. The research would serve as a model for sustainable neighbourhoods designed in a modular way that can easily be extended or re-arranged to adapt to future socioeconomic realities. This paper focuses primarily on climate change problems, effects, risks and opportunities. It does so by analysing existing case studies, books and writings published on coastal cities and by examining their various aspects for making a sustainable habitat. Keywords: floating cities, flexible modular typologies, rising sea levels, sustainable architecture and urbanism
Procedia PDF Downloads 142
26410 Inferential Reasoning for Heterogeneous Multi-Agent Mission
Authors: Sagir M. Yusuf, Chris Baber
Abstract:
We describe issues bedeviling the coordination of heterogeneous multi-agent missions (agents carrying different sensors), such as belief conflict and situation reasoning. We applied Bayesian inference and agents' presumption-based reasoning to address these issues of belief variation and situational reasoning in heterogeneous multi-agent teams. A Bayesian Belief Network (BBN) was used to model the agents' belief conflicts arising from sensor variations. Simulation experiments were designed, and cases from agents’ missions were used to train the BBN using gradient descent and expectation-maximization algorithms. The output is a well-trained BBN for making inferences for both agents and human experts. We show that the prediction capacity of the Bayesian learning algorithm improves with the amount of training data and argue that it enhances the robustness of multi-agent teams and resolves agents’ sensor conflicts. Keywords: distributed constraint optimization problem, multi-agent system, multi-robot coordination, autonomous system, swarm intelligence
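The sketch below illustrates only the inference step behind the belief-conflict idea: two heterogeneous sensors produce conflicting likelihoods over the same hypothesis, and a Bayesian update reconciles them. The states, priors and likelihoods are made-up values, and the paper's trained BBN (gradient descent / expectation-maximization) is not reproduced.

```python
import numpy as np

# Hypothesis space for a detected object (illustrative, not the paper's network).
states = ["person", "vehicle", "clutter"]
prior = np.array([0.3, 0.3, 0.4])

# Likelihood P(observation | state) for two heterogeneous agents whose sensors disagree:
# a thermal camera strongly suggests "person", a radar return favours "vehicle".
likelihood_thermal = np.array([0.7, 0.2, 0.1])
likelihood_radar = np.array([0.2, 0.6, 0.2])

def fuse(prior: np.ndarray, *likelihoods: np.ndarray) -> np.ndarray:
    """Naive-Bayes style fusion: multiply the prior by each sensor likelihood and renormalise."""
    posterior = prior.copy()
    for lik in likelihoods:
        posterior *= lik
        posterior /= posterior.sum()
    return posterior

posterior = fuse(prior, likelihood_thermal, likelihood_radar)
for s, p in zip(states, posterior):
    print(f"P({s} | thermal, radar) = {p:.3f}")
```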
Procedia PDF Downloads 159
26409 Automatic Identification and Classification of Contaminated Biodegradable Plastics Using Machine Learning Algorithms and Hyperspectral Imaging Technology
Authors: Nutcha Taneepanichskul, Helen C. Hailes, Mark Miodownik
Abstract:
Plastic waste has emerged as a critical global environmental challenge, primarily driven by the prevalent use of conventional plastics derived from petrochemical refining and manufacturing processes in modern packaging. While these plastics serve vital functions, their persistence in the environment post-disposal poses significant threats to ecosystems. Addressing this issue necessitates several approaches, one of which involves the development of biodegradable plastics designed to degrade under controlled conditions, such as industrial composting facilities. It is imperative to note that compostable plastics are engineered for degradation within specific environments and are not suited for uncontrolled settings, including natural landscapes and aquatic ecosystems. The full benefits of compostable packaging are realized when it is subjected to industrial composting, preventing environmental contamination and waste stream pollution. Therefore, effective sorting technologies are essential to enhance composting rates for these materials and diminish the risk of contaminating recycling streams. In this study, we leverage hyperspectral imaging technology (HSI) coupled with advanced machine learning algorithms to accurately identify various types of plastics, encompassing conventional variants like polyethylene terephthalate (PET), polypropylene (PP), low-density polyethylene (LDPE), and high-density polyethylene (HDPE), and biodegradable alternatives such as polybutylene adipate terephthalate (PBAT), polylactic acid (PLA), and polyhydroxyalkanoates (PHA). The dataset is partitioned into three subsets: a training dataset comprising uncontaminated conventional and biodegradable plastics, a validation dataset encompassing contaminated plastics of both types, and a testing dataset featuring real-world packaging items in both pristine and contaminated states. Five distinct machine learning algorithms, namely Partial Least Squares Discriminant Analysis (PLS-DA), Support Vector Machine (SVM), Convolutional Neural Network (CNN), Logistic Regression, and Decision Tree, were developed and evaluated for their classification performance. Remarkably, the Logistic Regression and CNN models exhibited the most promising outcomes, achieving a perfect accuracy rate of 100% on the training and validation datasets. Notably, the testing dataset yielded an accuracy exceeding 80%. The successful implementation of this sorting technology within recycling and composting facilities holds the potential to significantly elevate recycling and composting rates. As a result, the envisioned circular economy for plastics can be established, thereby offering a viable solution to mitigate plastic pollution. Keywords: biodegradable plastics, sorting technology, hyperspectral imaging technology, machine learning algorithms
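A minimal sketch of the classification stage is shown below, assuming pixel spectra arranged as rows of a feature matrix and synthetic data standing in for the real hyperspectral cubes; only two of the five algorithms named above (logistic regression and SVM) are wired up, with illustrative parameters.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Toy stand-in for hyperspectral pixel spectra: rows = pixels, columns = spectral bands.
rng = np.random.default_rng(42)
n_per_class, n_bands = 100, 224
classes = ["PET", "PP", "LDPE", "HDPE", "PBAT", "PLA", "PHA"]
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_bands))
               for i in range(len(classes))])
y = np.repeat(np.arange(len(classes)), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "logistic_regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "svm": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))
```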
Procedia PDF Downloads 85
26408 Coastal Flood Mapping of Vulnerability Due to Sea Level Rise and Extreme Weather Events: A Case Study of St. Ives, UK
Authors: S. Vavias, T. R. Brewer, T. S. Farewell
Abstract:
Coastal floods have been identified as an important natural hazard that can cause significant damage to the populated built-up areas, related infrastructure and also ecosystems and habitats. This study attempts to fill the gap associated with the development of preliminary assessments of coastal flood vulnerability for compliance with the EU Directive on the Assessment and Management of Flood Risks (2007/60/EC). In this context, a methodology has been created by taking into account three major parameters; the maximum wave run-up modelled from historical weather observations, the highest tide according to historic time series, and the sea level rise projections due to climate change. A high resolution digital terrain model (DTM) derived from LIDAR data has been used to integrate the estimated flood events in a GIS environment. The flood vulnerability map created shows potential risk areas and can play a crucial role in the coastal zone planning process. The proposed method has the potential to be a powerful tool for policy and decision makers for spatial planning and strategic management.Keywords: coastal floods, vulnerability mapping, climate change, extreme weather events
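The core of the inundation mapping can be illustrated with a simple "bathtub" rule: any DTM cell whose elevation lies below the combined level of the highest tide, the modelled wave run-up and the projected sea-level rise is flagged as vulnerable. The sketch below uses toy elevations and assumed water levels; the study's GIS workflow is more involved.

```python
import numpy as np

def flood_mask(dtm: np.ndarray, highest_tide: float, wave_runup: float, sea_level_rise: float) -> np.ndarray:
    """Cells whose elevation falls below the combined water level are flagged as flood-prone."""
    flood_level = highest_tide + wave_runup + sea_level_rise
    return dtm <= flood_level

# Toy 1 m-resolution DTM (elevations in metres above datum); illustrative values only.
dtm = np.array([
    [0.5, 1.2, 2.8, 4.0],
    [0.8, 1.5, 3.1, 5.2],
    [1.9, 2.4, 3.9, 6.0],
])
mask = flood_mask(dtm, highest_tide=1.1, wave_runup=0.9, sea_level_rise=0.5)
print(mask.astype(int))                    # 1 = vulnerable cell
print("flooded area (m^2):", mask.sum())   # cell size assumed to be 1 m x 1 m
```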
Procedia PDF Downloads 401
26407 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R
Authors: Pavel H. Llamocca, Victoria Lopez
Abstract:
The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments. This means that data from various different sources have to be integrated. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is therefore increasing. However, each government has its own procedures for publishing its data, which results in a variety of data set formats because there are no international standards specifying the formats of data sets from Open Data bases. Due to this variety of formats, we must build a data integration process that is able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g. Data Tamer and Data Wrangler. The problem with these tools is that they require a data scientist to take part in the integration process as a final step. In our case we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve an automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time. In the same way, other governments (such as Andalucía or Bilbao) have published Open Data sets related to the environment. All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to perform and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format and the analysis process can start in a computationally better way. The tool presented in this work therefore has two goals: 1. the integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open up our software tool, as a second approach we also developed an implementation in the R language as a mature open-source technology. R is a really powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance. There are also R libraries for building a graphical interface, such as Shiny. A performance comparison between both implementations was made and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain to any developer so that they can build their own applications. Keywords: open data, R language, data integration, environmental data
Procedia PDF Downloads 317
26406 Lifelong Learning in Applied Fields (LLAF) Tempus Funded Project: A Case Study of Problem-Based Learning
Authors: Nirit Raichel, Dorit Alt
Abstract:
Although university teaching is claimed to have a special task to support students in adopting ways of thinking and producing new knowledge anchored in scientific inquiry practices, it is argued that students' habits of learning are still overwhelmingly skewed toward passive acquisition of knowledge from authority sources rather than from collaborative inquiry activities. In order to overcome this critical inadequacy between current educational goals and instructional methods, the LLAF consortium is aimed at developing updated instructional practices that put a premium on adaptability to the emerging requirements of present society. LLAF has created a practical guide for teachers containing updated pedagogical strategies based on the constructivist approach for learning, arranged along Delors’ four theoretical ‘pillars’ of education: Learning to know, learning to do, learning to live together, and learning to be. This presentation will be limited to problem-based learning (PBL), as a strategy introduced in the second pillar. PBL leads not only to the acquisition of technical skills, but also allows the development of skills like problem analysis and solving, critical thinking, cooperation and teamwork, decision- making and self-regulation that can be transferred to other contexts. This educational strategy will be exemplified by a case study conducted in the pre-piloting stage of the project. The case describes a three-fold process implemented in a postgraduate course for in-service teachers, including: (1) learning about PBL (2) implementing PBL in the participants' classes, and (3) qualitatively assessing the contributions of PBL to students' outcomes. An example will be given regarding the ways by which PBL was applied and assessed in civic education for high-school students. Two 9th-grade classes have participated the study; both included several students with learning disability. PBL was applied only in one class whereas traditional instruction was used in the other. Results showed a robust contribution of PBL to students' affective and cognitive outcomes as reflected in their motivation to engage in learning activities, and to further explore the subject. However, students with learning disability were less favorable with this "active" and "annoying" environment. Implications of these findings for the LLAF project will be discussed.Keywords: problem-based learning, higher education, pedagogical strategies
Procedia PDF Downloads 335
26405 Small and Medium Sized Ports between Specialisation and Diversification: A Framework Tool for Sustainable Development
Authors: Christopher Meyer, Laima Gerlitz
Abstract:
European ports are facing high political pressure through the implementation of initiatives such as the European Green Deal or the IMO's 2030 targets (Fit for 55). However, small and medium-sized ports face even higher challenges compared to bigger ones due to lower capacities in various fields such as investment, infrastructure, human resources, and funding opportunities. The roles of Small and Medium-Sized Ports (SMPs) in economic systems vary depending on their specific functionality in maritime ecosystems. Depending on their different situations, whether acting in multiport gateways, aligned to core ports, as regional nodes in peripheries serving the hinterland, as specialized cluster members, or as logistical nodes, different strategic business models may be applied to increase SMPs' competitiveness among other, bigger ports. Additionally, SMPs are facing more challenges for future development in terms of the digital and green transition of their operations. Thus, it is necessary for them to evaluate their own strategic position and apply management strategies alongside the regional growth and innovation strategies for the diversification or specialisation of their own port businesses. The research uses inductive perspectives to set up a transferable framework based on the case studies analysed. In line with particular research and document analysis, qualitative approaches were considered. The research is based on a deep literature review on SMPs as well as on theories of diversification and specialisation. Existing theories from different fields are evaluated for their application to the port sector and these specific maritime actors, paying attention to enabling the incorporation of innovation to enhance the digital and environmental transition with future perspectives for SMPs. The paper aims to provide a decision-making matrix for the strategic positioning of SMPs in Europe, including opportunities to access particular EU funds for future development alongside the Regional Innovation Strategies on Smart Specialisation. Keywords: strategic planning, sustainability transition, competitiveness portfolio, EU green deal
Procedia PDF Downloads 84
26404 Bio Energy from Metabolic Activity of Bacteria in Plant and Soil Using Novel Microbial Fuel Cells
Authors: B. Samuel Raj, Solomon R. D. Jebakumar
Abstract:
Microbial fuel cells (MFCs) are an emerging and promising method for achieving sustainable energy since they can remove contaminated organic matter and simultaneously generate electricity. Our approach was driven in three different ways like Bacterial fuel cell, Soil Microbial fuel cell (Soil MFC) and Plant Microbial fuel cell (Plant MFC). Bacterial MFC: Sulphate reducing bacteria (SRB) were isolated and identified as the efficient electricigens which is able to produce ±2.5V (689mW/m2) and it has sustainable activity for 120 days. Experimental data with different MFC revealed that high electricity production harvested continuously for 90 days 1.45V (381mW/m2), 1.98V (456mW/m2) respectively. Biofilm formation was confirmed on the surface of the anode by high content screening (HCS) and scanning electron Microscopic analysis (SEM). Soil MFC: Soil MFC was constructed with low cost and standard Mudwatt soil MFC was purchased from keegotech (USA). Vermicompost soil (V1) produce high energy (± 3.5V for ± 400 days) compared to Agricultural soil (A1) (± 2V for ± 150 days). Biofilm formation was confirmed by HCS and SEM analysis. This finding provides a method for extracting energy from organic matter, but also suggests a strategy for promoting the bioremediation of organic contaminants in subsurface environments. Our Soil MFC were able to run successfully a 3.5V fan and three LED continuously for 150 days. Plant MFC: Amaranthus candatus (P1) and Triticum aestivium (P2) were used in Plant MFC to confirm the electricity production from plant associated microbes, four uniform size of Plant MFC were constructed and checked for energy production. P2 produce high energy (± 3.2V for 40 days) with harvesting interval of two times and P1 produces moderate energy without harvesting interval (±1.5V for 24 days). P2 is able run 3.5V fan continuously for 10days whereas P1 needs optimization of growth conditions to produce high energy.Keywords: microbial fuel cell, biofilm, soil microbial fuel cell, plant microbial fuel cell
Procedia PDF Downloads 354
26403 An Application-Driven Procedure for Optimal Signal Digitization of Automotive-Grade Ultrasonic Sensors
Authors: Mohamed Shawki Elamir, Heinrich Gotzig, Raoul Zoellner, Patrick Maeder
Abstract:
In this work, a methodology is presented for identifying the optimal digitization parameters for the analog signal of ultrasonic sensors. These digitization parameters are the resolution of the analog to digital conversion and the sampling rate. This is accomplished through the derivation of characteristic curves based on Fano inequality and the calculation of the mutual information content over a given dataset. The mutual information is calculated between the examples in the dataset and the corresponding variation in the feature that needs to be estimated. The optimal parameters are identified in a manner that ensures optimal estimation performance while preventing inefficiency in using unnecessarily powerful analog to digital converters.Keywords: analog to digital conversion, digitization, sampling rate, ultrasonic
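A minimal sketch of the underlying idea, assuming a synthetic ultrasonic-amplitude signal and a uniform quantizer, is shown below: the mutual information between the quantized signal and the feature to be estimated is computed for several ADC resolutions, and the point where it stops growing suggests the required bit depth. The signal model and bin counts are illustrative assumptions, not the paper's dataset or exact procedure.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)

# Toy stand-in for the analog echo amplitude and the feature to estimate (e.g. distance).
distance = rng.uniform(0.2, 4.0, size=5000)                     # metres
amplitude = 1.0 / distance**2 + rng.normal(0, 0.01, size=5000)  # idealised echo strength + noise

def quantize(signal: np.ndarray, bits: int) -> np.ndarray:
    """Uniform quantization of the analog signal at the given ADC resolution."""
    levels = 2 ** bits
    lo, hi = signal.min(), signal.max()
    return np.floor((signal - lo) / (hi - lo) * (levels - 1)).astype(int)

distance_bins = np.digitize(distance, np.linspace(0.2, 4.0, 32))  # discretise the target for MI

for bits in [2, 4, 6, 8, 10, 12]:
    mi = mutual_info_score(distance_bins, quantize(amplitude, bits))
    print(f"{bits:2d}-bit ADC: mutual information = {mi:.3f} nats")
# The smallest resolution beyond which MI stops increasing is a candidate operating point.
```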
Procedia PDF Downloads 210
26402 Comparing the SALT and START Triage System in Disaster and Mass Casualty Incidents: A Systematic Review
Authors: Hendri Purwadi, Christine McCloud
Abstract:
Triage is a complex decision-making process that aims to categorize a victim’s level of acuity and the need for medical assistance. Two common triage systems widely used in Mass Casualty Incidents (MCIs) and disaster situations are START (Simple Triage And Rapid Treatment) and SALT (Sort, Assess, Lifesaving interventions, Treatment/transport). There is currently controversy regarding the effectiveness of the SALT over the START triage system. This systematic review aims to investigate and compare the effectiveness of the SALT and START triage systems in disaster and MCI settings. The literature was searched via a systematic search strategy from 2009 to 2019 in PubMed, Cochrane Library, CINAHL, Scopus, ScienceDirect, Medlib, and ProQuest. This review included simulation-based and medical record-based studies investigating the accuracy and applicability of the SALT and START triage systems in adult and child populations during MCIs and disasters. All types of studies were included. The Joanna Briggs Institute critical appraisal tools were used to assess the quality of the reviewed studies. Of the 1,450 articles identified in the search, 10 articles were included. Four themes were identified by the review: accuracy, under-triage, over-triage, and time to triage per individual victim. The START triage system has a wide-ranging and inconsistent level of accuracy compared to the SALT triage system (44% to 94.2% for START compared to 70% to 83% for SALT). The under-triage error of the START triage system ranged from 2.73% to 20%, slightly lower than that of the SALT triage system (7.6% to 23.3%). The over-triage error of the START triage system was slightly greater than that of the SALT triage system (2% to 53% for START compared to 2% to 22% for SALT). The time to apply the START triage system was shorter than for the SALT triage system (70-72.18 seconds for START compared to 78 seconds for SALT). Consequently, the START triage system has a lower level of under-triage error and is faster than the SALT triage system in classifying victims of MCIs and disasters, whereas the SALT triage system is slightly more accurate and has a lower level of over-triage. However, the magnitude of these differences is relatively small, and therefore the effect on patient outcomes is not significant. Hence, regardless of triage error, either the START or the SALT triage system is equally effective for triaging victims of disasters and MCIs. Keywords: disaster, effectiveness, mass casualty incidents, START triage system, SALT triage system
Procedia PDF Downloads 136
26401 Evolution and Merging of Double-Diffusive Layers in a Vertically Stable Compositional Field
Authors: Ila Thakur, Atul Srivastava, Shyamprasad Karagadde
Abstract:
The phenomenon of double-diffusive convection is driven by density gradients created by two different components (e.g., temperature and concentration) having different molecular diffusivities. The evolution of horizontal double-diffusive layers (DDLs) is one of the outcomes of double-diffusive convection occurring in a laterally/vertically cooled rectangular cavity having a pre-existing vertically stable composition field. The present work mainly focuses on different characteristics of the formation and merging of double-diffusive layers by imposing lateral/vertical thermal gradients in a vertically stable compositional field. A CFD-based twodimensional fluent model has been developed for the investigation of the aforesaid phenomena. The configuration containing vertical thermal gradients shows the evolution and merging of DDLs, where, elements from the same horizontal plane move vertically and mix with surroundings, creating a horizontal layer. In the configuration of lateral thermal gradients, a specially oriented convective roll was found inside each DDL and each roll was driven by the competing density change due to the already existing composition field and imposed thermal field. When the thermal boundary layer near the vertical wall penetrates the salinity interface, it can disrupt the compositional interface and can lead to layer merging. Different analytical scales were quantified and compared for both configurations. Various combinations of solutal and thermal Rayleigh numbers were investigated to get three different regimes, namely; stagnant regime, layered regime and unicellular regime. For a particular solutal Rayleigh number, a layered structure can originate only for a range of thermal Rayleigh numbers. Lower thermal Rayleigh numbers correspond to a diffusion-dominated stagnant regime. Very high thermal Rayleigh corresponds to a unicellular regime with high convective mixing. Different plots identifying these three regimes, number, thickness and time of existence of DDLs have been studied and plotted. For a given solutal Rayleigh number, an increase in thermal Rayleigh number increases the width but decreases both the number and time of existence of DDLs in the fluid domain. Sudden peaks in the velocity and heat transfer coefficient have also been observed and discussed at the time of merging. The present study is expected to be useful in correlating the double-diffusive convection in many large-scale applications including oceanography, metallurgy, geology, etc. The model has also been developed for three-dimensional geometry, but the results were quite similar to that of 2-D simulations.Keywords: double diffusive layers, natural convection, Rayleigh number, thermal gradients, compositional gradients
Procedia PDF Downloads 88
26400 A Comprehensive Survey of Artificial Intelligence and Machine Learning Approaches across Distinct Phases of Wildland Fire Management
Authors: Ursula Das, Manavjit Singh Dhindsa, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran
Abstract:
Wildland fires, also known as forest fires or wildfires, are exhibiting an alarming surge in frequency in recent times, further adding to its perennial global concern. Forest fires often lead to devastating consequences ranging from loss of healthy forest foliage and wildlife to substantial economic losses and the tragic loss of human lives. Despite the existence of substantial literature on the detection of active forest fires, numerous potential research avenues in forest fire management, such as preventative measures and ancillary effects of forest fires, remain largely underexplored. This paper undertakes a systematic review of these underexplored areas in forest fire research, meticulously categorizing them into distinct phases, namely pre-fire, during-fire, and post-fire stages. The pre-fire phase encompasses the assessment of fire risk, analysis of fuel properties, and other activities aimed at preventing or reducing the risk of forest fires. The during-fire phase includes activities aimed at reducing the impact of active forest fires, such as the detection and localization of active fires, optimization of wildfire suppression methods, and prediction of the behavior of active fires. The post-fire phase involves analyzing the impact of forest fires on various aspects, such as the extent of damage in forest areas, post-fire regeneration of forests, impact on wildlife, economic losses, and health impacts from byproducts produced during burning. A comprehensive understanding of the three stages is imperative for effective forest fire management and mitigation of the impact of forest fires on both ecological systems and human well-being. Artificial intelligence and machine learning (AI/ML) methods have garnered much attention in the cyber-physical systems domain in recent times leading to their adoption in decision-making in diverse applications including disaster management. This paper explores the current state of AI/ML applications for managing the activities in the aforementioned phases of forest fire. While conventional machine learning and deep learning methods have been extensively explored for the prevention, detection, and management of forest fires, a systematic classification of these methods into distinct AI research domains is conspicuously absent. This paper gives a comprehensive overview of the state of forest fire research across more recent and prominent AI/ML disciplines, including big data, classical machine learning, computer vision, explainable AI, generative AI, natural language processing, optimization algorithms, and time series forecasting. By providing a detailed overview of the potential areas of research and identifying the diverse ways AI/ML can be employed in forest fire research, this paper aims to serve as a roadmap for future investigations in this domain.Keywords: artificial intelligence, computer vision, deep learning, during-fire activities, forest fire management, machine learning, pre-fire activities, post-fire activities
Procedia PDF Downloads 77
26399 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule
Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu
Abstract:
The instance selection (IS) technique is used to reduce data size to improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets with the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload in the aggregation node; second and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical. Keywords: instance selection, data reduction, MapReduce, kNN
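The flavour of the condensation step can be illustrated with the classic (Hart-style) condensed nearest neighbor rule shown below; FCNN is a faster, order-independent refinement of this idea, and the MapReduce variant proposed in the paper applies the condensation to disjoint splits before merging. The sketch uses synthetic data and is not the authors' implementation.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import make_classification

def condense(X: np.ndarray, y: np.ndarray, seed: int = 0) -> np.ndarray:
    """Hart-style condensation: keep only the instances needed for 1-NN to stay consistent."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X))
    keep = [order[0]]
    changed = True
    while changed:
        changed = False
        for i in order:
            knn = KNeighborsClassifier(n_neighbors=1).fit(X[keep], y[keep])
            if knn.predict(X[i:i + 1])[0] != y[i]:
                keep.append(i)            # absorb the misclassified point into the subset
                changed = True
    return np.array(sorted(set(keep)))

X, y = make_classification(n_samples=600, n_features=10, n_informative=6, random_state=0)
subset = condense(X, y)
print(f"reduction: {len(X)} -> {len(subset)} instances "
      f"({100 * (1 - len(subset) / len(X)):.1f}% removed)")

# A MapReduce-style variant would run the condensation independently on data splits
# (the map step) and merge/re-condense the kept prototypes in the reduce step.
```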
Procedia PDF Downloads 257
26398 Intensive Biological Control in Spanish Greenhouses: Problems of the Success
Authors: Carolina Sanchez, Juan R. Gallego, Manuel Gamez, Tomas Cabello
Abstract:
Currently, biological control programs in greenhouse crops involve the simultaneous use of several natural enemies during the crop cycle. A large number of plant species are grown in greenhouses, and the range of cultivars used is also wide. However, the effects of cultivar on the efficacy of entomophagous species (predators and parasitoids) have scarcely been studied. A new method has been developed using the factitious prey or host Ephestia kuehniella. It allows us to evaluate, under greenhouse or controlled (semi-field) conditions, the effects of cultivar on the effectiveness of entomophagous species. The work was carried out in a greenhouse tomato crop. It was found that the biological and ecological activities of the predatory species (Nesidiocoris tenuis) and the egg parasitoid (Trichogramma achaeae) can be well represented with the use of the factitious prey or host, better in the former than in the latter. The data from the trial are shown and discussed. The developed method could be applied to evaluate new plant material before it is made available to farmers as commercial varieties, at low cost and with easy use. Keywords: cultivar effects, efficiency, predators, parasitoids
Procedia PDF Downloads 278
26397 Experimental Evaluation of Succinct Ternary Tree
Authors: Dmitriy Kuptsov
Abstract:
Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions. Thus, in these cases, a more advanced representation of these data structures is essential. In this paper we present the design of a compact version of the ternary tree data structure and demonstrate the results of an experimental evaluation using the static dictionary problem. We compare these results with the results for binary and regular ternary trees. The evaluation study shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree and, in certain configurations, shows performance comparable to regular ternary trees. We have evaluated the performance of the algorithms on both 32- and 64-bit operating systems. Keywords: algorithms, data structures, succinct ternary tree, performance evaluation
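For readers unfamiliar with the underlying structure, the sketch below shows a plain pointer-based ternary search tree answering static-dictionary membership queries; the paper's succinct version replaces these explicit child pointers with a compact encoding, which is not reproduced here.

```python
class TSTNode:
    __slots__ = ("ch", "lo", "eq", "hi", "is_word")
    def __init__(self, ch: str):
        self.ch, self.lo, self.eq, self.hi, self.is_word = ch, None, None, None, False

class TernarySearchTree:
    """Pointer-based ternary search tree for static dictionary membership queries."""
    def __init__(self):
        self.root = None

    def insert(self, word: str) -> None:
        self.root = self._insert(self.root, word, 0)

    def _insert(self, node, word, i):
        ch = word[i]
        if node is None:
            node = TSTNode(ch)
        if ch < node.ch:
            node.lo = self._insert(node.lo, word, i)
        elif ch > node.ch:
            node.hi = self._insert(node.hi, word, i)
        elif i + 1 < len(word):
            node.eq = self._insert(node.eq, word, i + 1)
        else:
            node.is_word = True
        return node

    def contains(self, word: str) -> bool:
        node, i = self.root, 0
        while node is not None:
            ch = word[i]
            if ch < node.ch:
                node = node.lo
            elif ch > node.ch:
                node = node.hi
            elif i + 1 < len(word):
                node, i = node.eq, i + 1
            else:
                return node.is_word
        return False

tst = TernarySearchTree()
for w in ["cat", "cap", "car", "dog", "do"]:
    tst.insert(w)
print(tst.contains("car"), tst.contains("dot"))   # True False
```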
Procedia PDF Downloads 167
26396 Two-Sided Information Dissemination in Takeovers: Disclosure and Media
Authors: Eda Orhun
Abstract:
Purpose: This paper analyzes a target firm’s decision to voluntarily disclose information during a takeover event and the effect of such disclosures on the outcome of the takeover. Such voluntary disclosures especially in the form of earnings forecasts made around takeover events may affect shareholders’ decisions about the target firm’s value and in return takeover result. This study aims to shed light on this question. Design/methodology/approach: The paper tries to understand the role of voluntary disclosures by target firms during a takeover event in the likelihood of takeover success both theoretically and empirically. A game-theoretical model is set up to analyze the voluntary disclosure decision of a target firm to inform the shareholders about its real worth. The empirical implication of model is tested by employing binary outcome models where the disclosure variable is obtained by identifying the target firms in the sample that provide positive news by issuing increasing management earnings forecasts. Findings: The model predicts that a voluntary disclosure of positive information by the target decreases the likelihood that the takeover succeeds. The empirical analysis confirms this prediction by showing that positive earnings forecasts by target firms during takeover events increase the probability of takeover failure. Overall, it is shown that information dissemination through voluntary disclosures by target firms is an important factor affecting takeover outcomes. Originality/Value: This study is the first to the author's knowledge that studies the impact of voluntary disclosures by the target firm during a takeover event on the likelihood of takeover success. The results contribute to information economics, corporate finance and M&As literatures.Keywords: takeovers, target firm, voluntary disclosures, earnings forecasts, takeover success
Procedia PDF Downloads 322
26395 Ranking the Elements of Relationship Market Orientation Banks (Case Study: Saderat Bank of Iran)
Authors: Sahar Jami, Iman Valizadeh
Abstract:
Today, banks should not only seek new customers but also focus on maintaining and retaining existing ones and establishing stable relationships with them. In this sense, relationship marketing seeks to create, maintain, and promote relationships between customers and other stakeholders in ways that benefit all involved parties. This is possible only through interactive transactions and the fulfilment of promises. Given the importance of relationship marketing in banks, creating the conditions for relationship marketing is highly important. Therefore, the present study aims to explore the conditions for relationship marketing in Saderat Bank of Iran and to prioritize its variables using the Analytic Hierarchy Process (AHP). A questionnaire was designed in this research for pairwise comparison of the relationship marketing elements. After distributing this questionnaire among the members of the statistical population, 20 experts of the bank, data analysis was carried out with the Expert Choice software. Keywords: relationship marketing, relationship market orientation, Saderat Bank of Iran, hierarchical analysis
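The prioritization step can be illustrated with the standard AHP eigenvector calculation shown below, using a hypothetical pairwise-comparison matrix for four commonly cited relationship-marketing elements; the element names, judgements and consistency threshold are illustrative assumptions, not the study's expert data.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for four relationship-marketing elements
# (trust, commitment, communication, conflict handling) on Saaty's 1-9 scale.
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
], dtype=float)

# Principal-eigenvector method: the normalised eigenvector of the largest eigenvalue
# gives the priority weights used for ranking.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Saaty's consistency check (Random Index for n = 4 is 0.90).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90
for name, w in zip(["trust", "commitment", "communication", "conflict handling"], weights):
    print(f"{name:18s} {w:.3f}")
print(f"consistency ratio = {cr:.3f} (acceptable if < 0.10)")
```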
Procedia PDF Downloads 423
26394 Cost-Effective and Optimal Control Analysis for Mitigation Strategy to Chocolate Spot Disease of Faba Bean
Authors: Haileyesus Tessema Alemneh, Abiyu Enyew Molla, Oluwole Daniel Makinde
Abstract:
Introduction: The faba bean is one of the most important plants grown worldwide for humans and animals. Despite its diverse significance, several biotic and abiotic factors have limited the output of faba beans. Many faba bean pathogens have been reported so far, of which the most important yield-limiting disease is chocolate spot disease (Botrytis fabae). The dynamics of disease transmission and the decision-making processes for disease-control intervention programs are now better understood through the use of mathematical modeling. Currently, many mathematical modeling researchers are interested in plant disease modeling. Objective: In this paper, a deterministic mathematical model for chocolate spot disease (CSD) on the faba bean plant with an optimal control model was developed and analyzed to examine the best strategy for controlling CSD. Methodology: Three control interventions, prevention (u1), quarantine (u2), and chemical control (u3), are employed to establish the optimal control model. The optimality system, the characterization of the controls, the adjoint variables, and the Hamiltonian are all generated by employing Pontryagin’s maximum principle. A cost-effective approach is chosen from a set of possible integrated strategies using the incremental cost-effectiveness ratio (ICER). The forward-backward sweep iterative approach is used to run numerical simulations. Results: The Hamiltonian, the optimality system, the characterization of the controls, and the adjoint variables were established. The numerical results demonstrate that each integrated strategy can reduce the disease within the specified period. However, due to limited resources, an integrated strategy of prevention and uprooting was found to be the most cost-effective strategy to combat CSD. Conclusion: Therefore, attention should be given to the integrated, cost-effective and environmentally friendly strategy by stakeholders and policymakers to control CSD, and the integrated intervention should be disseminated to farmers in order to fight the spread of CSD in the faba bean population and produce the expected yield from the field. Keywords: CSD, optimal control theory, Pontryagin’s maximum principle, numerical simulation, cost-effectiveness analysis
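The cost-effectiveness step can be illustrated with a short ICER calculation, shown below with hypothetical costs and cases averted for three intervention bundles; the full dominance analysis and the forward-backward sweep solution of the optimality system are beyond this sketch.

```python
# Hypothetical costs and infections averted for three intervention bundles
# (values are illustrative, not the paper's simulation output).
strategies = [
    {"name": "prevention only",        "cost": 1200.0, "averted": 3500.0},
    {"name": "prevention + uprooting", "cost": 1900.0, "averted": 5200.0},
    {"name": "prevention + chemical",  "cost": 4100.0, "averted": 5600.0},
]

# Rank by effectiveness, then compute each strategy's ICER against the next-less-effective one.
strategies.sort(key=lambda s: s["averted"])
prev_cost, prev_averted = 0.0, 0.0
for s in strategies:
    icer = (s["cost"] - prev_cost) / (s["averted"] - prev_averted)
    print(f'{s["name"]:24s} ICER = {icer:.4f} cost units per case averted')
    prev_cost, prev_averted = s["cost"], s["averted"]
# Strategies whose ICER is high relative to a more effective alternative are dominated and
# dropped; the remaining cheapest-per-case-averted option is the cost-effective choice.
```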
Procedia PDF Downloads 94
26393 Understanding What People with Epilepsy and Their Care-Partners Value about an Electronic Patient Portal
Authors: K. Power, M. White, B. Dunleavey, E. Comerford, C. Doherty, N. Delanty, R. Corbridge, M. Fitzsimons
Abstract:
Introduction: Providing people with access to their own healthcare information and engaging them as co-authors of their health record can promote better transparency, trust, and inclusivity in the healthcare system. With the advent of electronic health records, there is a move towards involving patients as partners in their healthcare by providing them with access to their own health data via electronic patient portals (ePortal). For example, a recently developed ePortal to the Irish National Epilepsy Electronic Patient Record (EPR) provides access to summary medical records, tools for Patient Reported Outcomes (PROM), health goal-setting and preparation for clinical appointments. Aim: To determine what people with epilepsy (their families/carers) value about the Irish epilepsy ePortal. Methods: A socio-technical process was employed recruiting 30 families of people with epilepsy who also have an intellectual disability (ID). Family members who are a care partner of the person with epilepsy (PWE) were invited to co-design, develop and implement the ePortal. Family members engaged in usability and utility testing which involved a face to face meeting to learn about the ePortal, register for a user account and evaluate its structure and content. Family members were instructed to login to the portal on at least two separate occasions following the meeting and to complete a self-report evaluation tool during this time. The evaluation tool, based on a Usability Questionnaire (Lewis, 1993), consists of a short assessment of comfort using technology, instructions for using the ePortal and some tasks to complete. Tasks included validating summary record details, assessing ePortal ease of use, evaluation of information presented. Participants were asked for suggestions on how to improve the portal and make it more applicable to PWE who also have an ID. Results: Family members responded positively to the ePortal and valued the ability to share information between clinicians and care partners; use the ePortal as a passport between different healthcare settings (e.g., primary care to hospital). In the context of elderly parents of PWE, the ePortal is valued as a tool for supporting shared care between family members. Participants welcomed the facility to log lists of questions and goals to discuss with the clinician at the next clinical appointment as a means of improving quality of care. Participants also suggested further enhancements to the ePortal such as access to clinic letters which can provide an aide memoir in terms of the careplan agreed with the clinical team. For example, through the ePortal, people could see what investigations or therapies are scheduled. Conclusion: The Epilepsy Patient Portal is accessible via a range of devices such as smartphones and tablets. ePortals have the potential to help personalise care, improve patient involvement in clinical decision making, engage them as quality and safety partners, and help clinicians be more responsive to patient needs. Acknowledgement: The epilepsy ePortal project is part of PISCES, a Lighthouse Project funded by eHealth Ireland and HSE to help build an understanding of the benefits of eHealth technologies in the Irish Healthcare System.Keywords: electronic patient portal, electronic patient record, epilepsy, intellectual disability, usability testing
Procedia PDF Downloads 345
26392 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement
Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti
Abstract:
Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning. Data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. Then we establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%. Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing
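A minimal sketch of the idea, assuming synthetic telemetry and a gradient-boosted quantile model in place of whatever models the authors used, is shown below: provisioning to an upper predicted quantile aims to honour the SLA while avoiding blanket over-provisioning.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(7)

# Toy stand-in for historical telemetry: hour of day and lagged demand as features.
hours = rng.integers(0, 24, size=4000)
lagged = 40 + 25 * np.sin(hours / 24 * 2 * np.pi) + rng.normal(0, 5, size=4000)
demand = lagged + rng.gamma(shape=2.0, scale=4.0, size=4000)    # right-skewed usage bursts
X = np.column_stack([hours, lagged])

# Upper-quantile model: provisioning to the predicted 95th percentile targets the SLA,
# while a mean model with a blanket safety factor illustrates naive over-provisioning.
q95 = GradientBoostingRegressor(loss="quantile", alpha=0.95, n_estimators=200).fit(X, demand)
mean_model = GradientBoostingRegressor(n_estimators=200).fit(X, demand)

next_day = np.column_stack([np.arange(24), 40 + 25 * np.sin(np.arange(24) / 24 * 2 * np.pi)])
plan_q95 = q95.predict(next_day)
plan_naive = 2.0 * mean_model.predict(next_day)                 # "allocate double, just in case"
print("capacity saved by the quantile plan vs. naive 2x:",
      round(float(np.mean(plan_naive - plan_q95)), 2), "units/hour")
```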
Procedia PDF Downloads 111
26391 The Study of Flood Resilient House in Ebo-Town
Authors: Alagie Salieu Nankey
Abstract:
The flood-resistant house is the key mechanism for withstanding flood hazards in Ebo-Town. It has emerged as a simple yet powerful way of mitigating flooding in the community of Ebo-Town. Even though there are different types of buildings, little is known yet about how and why floods affect buildings severely. In this paper, we examine three different types of flood-resistant buildings that are suitable for Ebo-Town. We gathered content and contextual features from six (6) respondents and used this data set to identify factors that are significantly associated with the flood-resistant house. Moreover, we built a suitable design concept. We found that, amongst all the building types studied in the literature review, the stilt or elevated house is the most suitable building design in Ebo-Town and the pile foundation is the most appropriate foundation type in the study area. Amongst contextual features, local materials are the most economical materials for the proposed design. This research proposes a framework that explains the theoretical relationships between flood hazard zones and flood-resistant houses in Ebo-Town. Moreover, this research informs the design of sense-making and analytics tools for the flood-resistant house. Keywords: flood-resistant, stilt, flood hazard zone, pile foundation
Procedia PDF Downloads 52
26390 Prosperous Digital Image Watermarking Approach by Using DCT-DWT
Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar
Abstract:
Every day, tons of data are embedded in digital media or distributed over the internet. The data are so widely distributed that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information known as a watermark into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on embedding a watermark in images using a combined DCT-DWT approach. The main consideration for any watermarking scheme is its robustness to various attacks. Keywords: watermarking, digital, DCT-DWT, security
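One common way to combine the two transforms, shown below as a sketch rather than the authors' exact scheme, is to take a single-level DWT of the host image, embed the watermark additively in the DCT coefficients of the LL sub-band, and invert both transforms; the wavelet, embedding strength and non-blind extraction are illustrative choices.

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

def embed_watermark(host: np.ndarray, watermark: np.ndarray, alpha: float = 0.05) -> np.ndarray:
    """Additively embed a small watermark into the DCT of the DWT LL sub-band."""
    LL, (LH, HL, HH) = pywt.dwt2(host.astype(float), "haar")
    C = dctn(LL, norm="ortho")
    h, w = watermark.shape
    C[:h, :w] += alpha * C[:h, :w].std() * watermark       # spread over low/mid coefficients
    LL_marked = idctn(C, norm="ortho")
    return pywt.idwt2((LL_marked, (LH, HL, HH)), "haar")

def extract_watermark(marked: np.ndarray, host: np.ndarray, shape: tuple, alpha: float = 0.05) -> np.ndarray:
    """Non-blind extraction: compare DCT(LL) of the marked and original images."""
    LL_m, _ = pywt.dwt2(marked.astype(float), "haar")
    LL_o, _ = pywt.dwt2(host.astype(float), "haar")
    Cm, Co = dctn(LL_m, norm="ortho"), dctn(LL_o, norm="ortho")
    h, w = shape
    return np.sign(Cm[:h, :w] - Co[:h, :w])

rng = np.random.default_rng(3)
host = rng.integers(0, 256, size=(128, 128)).astype(float)     # stand-in for a grey-scale image
wm = rng.choice([-1.0, 1.0], size=(16, 16))                    # bipolar watermark bits

marked = embed_watermark(host, wm)
recovered = extract_watermark(marked, host, wm.shape)
print("bit agreement:", float((recovered == wm).mean()))
print("mean pixel distortion:", float(np.abs(marked - host).mean()))
```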
Procedia PDF Downloads 426
26389 Machine Learning Data Architecture
Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap
Abstract:
Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output either in batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, model output vending into a data store for downstream application. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models are investing significant efforts into building the other components of the architecture, which we do not believe is the best use of scientists’ bandwidth. We propose a system architecture created using AWS services that bring industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows down the scope of scientists’ work to model building and refinement while specialized data engineers take over the deployment, pipeline orchestration, data quality, data permission system, etc. The pipeline infrastructure is built and deployed as code (using terraform, cdk, cloudformation, etc.) which makes it easy to replicate and/or extend the architecture to other models that are used in an organization.Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning
Procedia PDF Downloads 70
26388 The Influence of National Culture on Consumer Buying Behaviour: An Exploratory Study of Nigerian and British Consumers
Authors: Mohamed Haffar, Lombe Ngome Enongene, Mohammed Hamdan, Gbolahan Gbadamosi
Abstract:
Despite the considerable body of literature investigating the influence of National Culture (NC) dimensions on consumer behaviour, there is a lack of studies comparing the influence of NC in Africa with Western European countries. This study is intended to fill the vacuum in knowledge by exploring how NC affects consumer buyer behavior in Nigeria and the United Kingdom. The primary data were collected through in depth, semi-structured interviews conducted with three groups of individuals: British students, Nigerian students in the United Kingdom, and Nigerian-based students. This approach and new frontier to analyze culture and consumer behaviour could help understand residual cultural threads of people (that are ingrained in their being) irrespective of exposure to other cultures. The findings of this study show that Nigerian and British consumers differ remarkably in cultural orientations such as symbols, values and psychological standpoints. This ultimately affects the choices made at every stage of the decision building process, and proves beneficial for international retail marketing.Keywords: national culture, consumer behaviour, international business, Nigeria
Procedia PDF Downloads 282