Search results for: patch metrics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 796

376 Lessons Learnt from Industry: Achieving Net Gain Outcomes for Biodiversity

Authors: Julia Baker

Abstract:

Development plays a major role in stopping biodiversity loss. But the ‘silo species’ protection of legislation (where certain species are protected while many are not) means that development can be ‘legally compliant’ and still result in biodiversity loss. ‘Net Gain’ (NG) policies can help overcome this by making it an absolute requirement that development causes no overall loss of biodiversity and brings a benefit. However, offsetting biodiversity losses in one location with gains elsewhere is controversial because people suspect ‘offsetting’ to be an easy way for developers to buy their way out of conservation requirements. Yet the good practice principles (GPP) of offsetting provide several advantages over existing legislation for protecting biodiversity from development. This presentation describes the lessons learnt from implementing NG approaches based on GPP during major upgrades of the UK's transport networks, which involved removing vegetation in order to construct and safely operate new infrastructure. While low-lying habitats were retained, trees and other habitats disrupting the running or safety of transport networks could not be. Consequently, achieving NG within the transport corridor was not possible and offsetting was required. The first lessons learnt concerned obtaining a commitment from business leaders to go beyond legislative requirements and deliver NG, and the institutional change necessary to embed GPP within daily operations. These issues can only be addressed when the challenges that biodiversity poses for business are overcome. These challenges included: biodiversity cannot be measured as easily as other sustainability factors such as carbon and water, which have metrics for target-setting and measuring progress; and the mindset that biodiversity costs money and does not generate cash in return, the opposite of carbon or waste, for example, where people can see how ‘sustainability’ actions save money. The challenges were overcome by presenting the GPP of NG as a cost-efficient solution to specific, critical risks facing the business that also boosts industry recognition, and by using government-issued NG metrics to develop business-specific toolkits charting NG progress whilst ensuring that NG decision-making was based on rich ecological data. Institutional change was best achieved by supporting, mentoring and training sustainability/environmental managers so that these ‘frontline’ staff could embed GPP within the business. The second lesson came from implementing the GPP where the business partnered with local governments, wildlife groups and landowners to support their priorities for nature conservation, and where these partners had a say in decisions about where and how best to achieve NG. From this inclusive approach, offsetting contributed towards conservation priorities when all collaborated to manage trade-offs between:
- Delivering ecologically equivalent offsets or compensating for losses of one type of biodiversity by providing another.
- Achieving NG locally to the development whilst contributing towards national conservation priorities through landscape-level planning.
- Not just protecting the extent and condition of existing biodiversity but ‘doing more’.
The multi-sector collaborations identified practical, workable solutions to ‘in perpetuity’. But the key was strengthening linkages between biodiversity measures implemented for development and conservation work undertaken by local organizations, so that developers support NG initiatives that really count.

Keywords: biodiversity offsetting, development, nature conservation planning, net gain

Procedia PDF Downloads 195
375 Building Scalable and Accurate Hybrid Kernel Mapping Recommender

Authors: Hina Iqbal, Mustansar Ali Ghazanfar, Sandor Szedmak

Abstract:

Recommender systems use artificial intelligence techniques to filter obscure information and can predict whether a user will like a specified item. Kernel Mapping Recommender (KMR) systems are accurate, state-of-the-art algorithms that address recommender system design objectives such as the long tail, cold start, and sparsity. The aim of this research is to propose a hybrid framework that can efficiently integrate different versions of the KMR algorithm, namely item-based and user-based KMR. We have proposed various heuristic algorithms that integrate different versions of KMR into a unified framework, resulting in improved accuracy and the elimination of problems associated with conventional recommender systems. We have tested our system on a publicly available movies dataset and benchmarked it against KMR. The results (in terms of accuracy, precision, recall, F1 measure and ROC metrics) reveal that the proposed algorithm is quite accurate, especially under cold-start and sparse scenarios.
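
As a rough illustration of how item-based and user-based predictions can be integrated, the sketch below blends the two with a fixed weight and switches to the item-based score for cold-start users; the weighting and threshold are illustrative assumptions, not the heuristics evaluated in the paper.

```python
import numpy as np

def hybrid_predict(pred_user, pred_item, w_user=0.5):
    """Blend user-based and item-based KMR predictions with a fixed (hypothetical) weight.
    pred_user, pred_item: predicted ratings for the same (user, item) pairs."""
    return w_user * pred_user + (1.0 - w_user) * pred_item

def switching_predict(pred_user, pred_item, n_ratings_per_user, cold_start_threshold=5):
    """Switching heuristic: fall back to item-based predictions for cold-start users."""
    use_item = n_ratings_per_user < cold_start_threshold
    return np.where(use_item, pred_item, hybrid_predict(pred_user, pred_item))

# Toy example: predictions from the two KMR variants for five user-item pairs.
pred_user = np.array([3.8, 4.1, 2.5, 3.0, 4.6])
pred_item = np.array([3.5, 4.4, 2.9, 3.2, 4.2])
n_ratings = np.array([2, 40, 1, 15, 60])          # ratings previously given by each user
print(switching_predict(pred_user, pred_item, n_ratings))
```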

Keywords: Kernel Mapping Recommender Systems, hybrid recommender systems, cold start, sparsity, long tail

Procedia PDF Downloads 338
374 User-Based Cannibalization Mitigation in an Online Marketplace

Authors: Vivian Guo, Yan Qu

Abstract:

Online marketplaces are not only digital places where consumers buy and sell merchandise; they are also destinations for brands to connect with real consumers at the moment when customers are in the shopping mindset. For many marketplaces, brands have been important partners through advertising. There is, however, a risk of advertising impacting a consumer’s shopping journey if it hurts the user experience or takes the user away from the site. Both could lead to the loss of transaction revenue for the marketplace. In this paper, we present user-based methods for cannibalization control by selectively turning off ads to users who are likely to be cannibalized by ads, subject to business objectives. We present ways of measuring cannibalization of advertising in the context of an online marketplace and propose novel ways of measuring cannibalization through purchase propensity and uplift modeling. A/B testing has shown that our methods can significantly improve user purchase and engagement metrics while operating within business objectives. To our knowledge, this is the first paper that addresses cannibalization mitigation at the user level in the context of advertising.
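
The abstract does not spell out the uplift model, so the following sketch uses the common two-model approach on synthetic data: fit separate purchase-propensity models for ad-exposed and unexposed users, and flag users whose estimated uplift is negative as candidates for ad suppression. All features and effect sizes are made up for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 4))                 # user features (hypothetical)
ad_exposed = rng.integers(0, 2, size=n)     # 1 = user saw ads (treatment), 0 = no ads
# Synthetic purchase outcome: ads slightly cannibalize purchases for some users.
base = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
purchase = (rng.random(n) < base - 0.05 * ad_exposed * (X[:, 2] > 0)).astype(int)

# Two-model uplift: fit separate purchase models for exposed and unexposed users.
m_treat = GradientBoostingClassifier().fit(X[ad_exposed == 1], purchase[ad_exposed == 1])
m_ctrl = GradientBoostingClassifier().fit(X[ad_exposed == 0], purchase[ad_exposed == 0])

# Uplift = change in purchase propensity caused by showing ads; negative => cannibalized.
uplift = m_treat.predict_proba(X)[:, 1] - m_ctrl.predict_proba(X)[:, 1]
suppress_ads = uplift < 0                   # candidate users for turning ads off
print(f"share of users flagged for ad suppression: {suppress_ads.mean():.2%}")
```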

Keywords: cannibalization, machine learning, online marketplace, revenue optimization, yield optimization

Procedia PDF Downloads 160
373 The Relationship between Rhythmic Complexity and Listening Engagement as a Proxy for Perceptual Interest

Authors: Noah R. Fram

Abstract:

Although it has been confirmed by multiple studies, the inverted-U relationship between stimulus complexity and preference (liking) remains contentious. Research aimed at substantiating the model is largely reliant upon anecdotal self-assessments of subjects and basic measures of complexity, leaving potential confounds unresolved. This study attempts to address the topic by assessing listening time as a behavioral correlate of liking (with the assumption that engagement prolongs listening time) and by looking for latent factors underlying several measures of rhythmic complexity. Participants listened to groups of rhythms, stopping each one when they started to lose interest, and were then asked to rate each rhythm in each group in terms of interest, complexity, and preference. Subjects were not informed that the time spent listening to each rhythm was the primary measure of interest. The hypothesis that listening time demonstrates the same inverted-U relationship with complexity as verbal reports of liking was confirmed using a variety of metrics for rhythmic complexity, including meter-dependent measures of syncopation and meter-independent measures of entropy.
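
As an illustration of one possible meter-independent entropy measure and the inverted-U test, the sketch below computes the Shannon entropy of inter-onset intervals for a few hypothetical rhythms and fits listening time as a quadratic in complexity; the rhythms, listening times and the specific entropy formulation are assumptions, not the study's stimuli or metrics.

```python
import numpy as np
from collections import Counter

def ioi_entropy(onsets):
    """Shannon entropy of the inter-onset-interval distribution of a rhythm
    (one possible meter-independent complexity measure)."""
    iois = np.diff(onsets)
    counts = np.array(list(Counter(iois).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Hypothetical rhythms given as onset positions in sixteenth-note ticks.
rhythms = [np.array([0, 4, 8, 12]),                 # simple, regular
           np.array([0, 3, 4, 8, 11, 12]),          # moderately syncopated
           np.array([0, 1, 3, 6, 7, 10, 13, 15])]   # complex
complexity = np.array([ioi_entropy(r) for r in rhythms])

# Synthetic listening times (seconds) peaking at intermediate complexity.
listening_time = np.array([12.0, 31.0, 15.0])

# Fit listening time as a quadratic in complexity; a negative leading
# coefficient is consistent with an inverted-U relationship.
coeffs = np.polyfit(complexity, listening_time, deg=2)
print("quadratic coefficients (a, b, c):", coeffs)
print("inverted-U consistent:", coeffs[0] < 0)
```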

Keywords: complexity, entropy, rhythm, syncopation

Procedia PDF Downloads 174
372 Flexible Cities: A Multisided Spatial Application of Tracking Livability of Urban Environment

Authors: Maria Christofi, George Plastiras, Rafaella Elia, Vaggelis Tsiourtis, Theocharis Theocharides, Miltiadis Katsaros

Abstract:

The rapidly expanding urban areas of the world constitute a challenge of how we need to make the transition to "the next urbanization", which will be defined by new analytical tools and new sources of data. This paper is about the production of a spatial application, the ‘FUMapp’, where space and its initiative will be available literally, in meters, but also abstractly, at a sensed level. While existing spatial applications typically focus on illustrations of the urban infrastructure, the suggested application goes beyond the existing: it investigates how our perception of the environment adapts to alterations of the built environment through the construction of a dataset of biophysical measurements (eye-tracking, heart rate) and physical metrics (spatial characteristics, size of stimuli, rhythm of mobility). It explores the intersections between architecture, cognition, and computing, where future design can be improved, and identifies the flexibility and livability of the ‘available space’ of specific examined urban paths.

Keywords: biophysical data, flexibility of urban, livability, next urbanization, spatial application

Procedia PDF Downloads 142
371 Privacy Label: An Alternative Approach to Present Privacy Policies from Online Services to the User

Authors: Diego Roberto Goncalves De Pontes, Sergio Donizetti Zorzo

Abstract:

Studies show that most users do not read the privacy policies of the online services they use. Some authors claim that one of the main causes of this is that policies are long and usually hard to understand, which makes users lose interest in reading them. In this scenario, users may agree to terms without knowing what kind of data is being collected and why. Given that, we aimed to develop a model that presents the contents of privacy policies in an easy and graphical way for the user to understand. We call it the Privacy Label. Using information retrieval techniques, we propose an architecture that is able to extract information from the policies about what kind of data is being collected and to what end, and to show it to the user in an automated way. To assess our model, we calculated the precision, recall and F-measure metrics on the information extracted by our technique. The results for each metric were 68.53%, 85.61% and 76.13%, respectively, making it possible for the final user to understand which data is being collected without reading the whole policy. Also, our proposal can facilitate notice-and-choice by presenting privacy policy information in an alternative way to online users.
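
A minimal sketch of how the reported precision, recall and F-measure can be computed from extracted policy statements against hand-labelled ground truth (the labels below are toy values, not the study's data):

```python
# Minimal sketch: precision, recall and F-measure for extracted policy
# statements against a hand-labelled gold standard (toy labels only).
y_true = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # 1 = segment really describes data collection
y_pred = [1, 1, 1, 0, 0, 1, 1, 0, 1, 0]   # 1 = segment extracted by the technique

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

precision = tp / (tp + fp)
recall = tp / (tp + fn)
f_measure = 2 * precision * recall / (precision + recall)
print(f"precision={precision:.2%} recall={recall:.2%} F-measure={f_measure:.2%}")
```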

Keywords: privacy, policies, user behavior, computer human interaction

Procedia PDF Downloads 307
370 Impact of Sustainability Reporting on the Financial Performance of Deposit Money Banks: Pre-Post Analysis of Integrating Environmental, Social, and Governance Disclosure into Corporate Annual Reports

Authors: A. O. Talabi, F. M. Taib, D. J. Jalaludin

Abstract:

The influence of sustainability reporting on Deposit Money Banks' (DMBs) financial performance, both before and after mandated environmental, social, and governance (ESG) disclosure, is examined in this article. Using a sample of the top six strategically important listed banks in Nigeria, the study employed the paired-sample t-test to compare the pre-mandatory ESG period (2009-2015) and the post-mandatory ESG period (2016-2022). According to the findings, there was no discernible difference between the performance of DMBs in Nigeria before and after the requirement for ESG disclosure. In the pre-mandatory period, sustainability reporting was a major predictor of financial metrics, but in the post-mandatory period, there was no discernible change in financial performance. Market authorities ought to have unrestricted authority to impose severe fines for non-compliance and bring legal action against corporations that fail to disclose ESG. This work contributes to the literature on ESG disclosure and financial performance by considering two different periods.
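
A minimal sketch of the paired-sample t-test used to compare the two periods, run on hypothetical per-bank figures (the numbers are illustrative, not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical mean return-on-assets (%) for six banks, averaged over the
# pre-mandatory (2009-2015) and post-mandatory (2016-2022) ESG periods.
roa_pre = np.array([2.1, 1.8, 2.5, 1.6, 2.0, 2.3])
roa_post = np.array([2.0, 1.9, 2.4, 1.7, 2.1, 2.2])

t_stat, p_value = stats.ttest_rel(roa_pre, roa_post)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value >= 0.05:
    print("No discernible difference between the two periods at the 5% level.")
```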

Keywords: financial, performance, sustainability, reporting

Procedia PDF Downloads 139
369 Defining a Framework for Holistic Life Cycle Assessment of Building Components by Considering Parameters Such as Circularity, Material Health, Biodiversity, Pollution Control, Cost, Social Impacts, and Uncertainty

Authors: Naomi Grigoryan, Alexandros Loutsioli Daskalakis, Anna Elisse Uy, Yihe Huang, Aude Laurent (Webanck)

Abstract:

In response to the building and construction sectors accounting for a third of all energy demand and emissions, the European Union has placed new laws and regulations on the construction sector that emphasize material circularity, energy efficiency, biodiversity, and social impact. Existing design tools assess sustainability in early-stage design for products or buildings; however, there is no standardized methodology for measuring the circularity performance of building components. Existing assessment methods for building components focus primarily on carbon footprint but lack the comprehensive analysis required to design for circularity. The research conducted in this paper covers the parameters needed to assess sustainability in the design process of architectural products such as doors, windows, and facades. It maps a framework for a tool that assists designers with real-time sustainability metrics. Considering the life cycle of building components such as façades, windows, and doors involves the life cycle stages applied to product design and many of the methods used in the life cycle analysis of buildings. The current industry standards of sustainability assessment for metal building components follow cradle-to-grave life cycle assessment (LCA), track Global Warming Potential (GWP), and document the parameters used for an Environmental Product Declaration (EPD). Developed by the Ellen MacArthur Foundation, the Material Circularity Indicator (MCI) is a methodology utilizing the data from LCA and EPDs to rate circularity, with a "value between 0 and 1 where higher values indicate a higher circularity". Expanding on the MCI with additional indicators such as the Water Circularity Index (WCI), the Energy Circularity Index (ECI), the Social Circularity Index (SCI), and Life Cycle Economic Value (EV), and calculating biodiversity risk and uncertainty, the assessment methodology for an architectural product's impact can be targeted more specifically based on product requirements, performance, and lifespan. Broadening the scope of LCA calculation for products to incorporate aspects of building design allows product designers to account for the disassembly of architectural components. For example, the Material Circularity Indicator for architectural products such as windows and facades is typically low due to the impact of glass, as 70% of glass ends up in landfills because of damage in the disassembly process. The low MCI can be combatted by expanding beyond cradle-to-grave assessment and focusing the design process on disassembly, recycling, and repurposing with the help of real-time assessment tools. Design for Disassembly and Urban Mining have been integrated within the construction field on small scales as project-based exercises, not addressing the entire supply chain of architectural products. By adopting more comprehensive sustainability metrics and incorporating uncertainty calculations, building components can be more accurately assessed with decarbonization and disassembly in mind, addressing the large-scale commercial markets within construction, some of the most significant contributors to climate change.
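
As a sketch of the MCI calculation, the snippet below follows the commonly cited Ellen MacArthur Foundation formulation under the simplifying assumption of perfectly efficient recycling processes; the parameter values for the window example are illustrative only, not from the paper.

```python
def material_circularity_indicator(mass, virgin_frac, landfill_frac,
                                   lifetime_ratio=1.0, intensity_ratio=1.0):
    """Simplified Material Circularity Indicator (Ellen MacArthur Foundation style).

    Assumes recycling processes are perfectly efficient, so linear flow comes
    only from virgin feedstock and waste sent to landfill/energy recovery.
    lifetime_ratio and intensity_ratio form the utility X = (L/Lav) * (U/Uav).
    """
    virgin = virgin_frac * mass            # V: virgin material in the product
    waste = landfill_frac * mass           # W0: unrecoverable waste at end of life
    lfi = (virgin + waste) / (2.0 * mass)  # Linear Flow Index in [0, 1]
    utility = lifetime_ratio * intensity_ratio
    return max(0.0, 1.0 - lfi * (0.9 / utility))

# Illustrative window unit: 60% virgin content, 70% of the mass landfilled,
# service life equal to the industry average.
print(round(material_circularity_indicator(mass=50.0, virgin_frac=0.6,
                                            landfill_frac=0.7), 2))
```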

Keywords: architectural products, early-stage design, life cycle assessment, material circularity indicator

Procedia PDF Downloads 88
368 A Practical Survey on Zero-Shot Prompt Design for In-Context Learning

Authors: Yinheng Li

Abstract:

The remarkable advancements in large language models (LLMs) have brought about significant improvements in natural language processing tasks. This paper presents a comprehensive review of in-context learning techniques, focusing on different types of prompts, including discrete, continuous, few-shot, and zero-shot, and their impact on LLM performance. We explore various approaches to prompt design, such as manual design, optimization algorithms, and evaluation methods, to optimize LLM performance across diverse tasks. Our review covers key research studies in prompt engineering, discussing their methodologies and contributions to the field. We also delve into the challenges faced in evaluating prompt performance, given the absence of a single "best" prompt and the importance of considering multiple metrics. In conclusion, the paper highlights the critical role of prompt design in harnessing the full potential of LLMs and provides insights into the combination of manual design, optimization techniques, and rigorous evaluation for more effective and efficient use of LLMs in various Natural Language Processing (NLP) tasks.
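
As a simple illustration of the discrete prompt types discussed, the sketch below builds a zero-shot and a few-shot prompt for the same task; the task, wording and examples are arbitrary placeholders rather than prompts evaluated in the survey.

```python
def zero_shot_prompt(task_instruction, input_text):
    """Zero-shot: the instruction alone, with no solved examples."""
    return f"{task_instruction}\n\nInput: {input_text}\nAnswer:"

def few_shot_prompt(task_instruction, examples, input_text):
    """Few-shot: the same instruction plus a handful of solved examples."""
    demo = "\n".join(f"Input: {x}\nAnswer: {y}" for x, y in examples)
    return f"{task_instruction}\n\n{demo}\n\nInput: {input_text}\nAnswer:"

instruction = "Classify the sentiment of the sentence as positive or negative."
examples = [("The service was wonderful.", "positive"),
            ("I will never come back here.", "negative")]

print(zero_shot_prompt(instruction, "The plot was dull and predictable."))
print("---")
print(few_shot_prompt(instruction, examples, "The plot was dull and predictable."))
```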

Keywords: in-context learning, prompt engineering, zero-shot learning, large language models

Procedia PDF Downloads 81
367 Social Vulnerability Mapping in New York City to Discuss Current Adaptation Practice

Authors: Diana Reckien

Abstract:

Vulnerability assessments are increasingly used to support policy-making in complex environments like urban areas. Usually, vulnerability studies include the construction of aggregate (sub-)indices and the subsequent mapping of the indices across an area of interest. Vulnerability studies have a couple of advantages: they are great communication tools, can inform a wider general debate about environmental issues, and can help allocate and efficiently target scarce resources for adaptation policy and planning. However, they also face a number of challenges: vulnerability assessments are constructed on the basis of a wide range of methodologies, and there is no single framework or methodology that has proven to serve best in certain environments; indicators vary highly according to the spatial scale used; different variables and metrics produce different results; and aggregate or composite vulnerability indicators that are mapped easily distort or bias the picture of vulnerability, as they hide the underlying causes of vulnerability and level out conflicting reasons for vulnerability in space. So, there is an urgent need to further develop the methodology of vulnerability studies towards a common framework, which is one motivation of this paper. We introduce a social vulnerability approach, which is relatively well developed compared with bio-physical or sectoral vulnerability approaches in terms of a common methodology for index construction, guidelines for mapping, assessment of sensitivity, and verification of variables. Two approaches are commonly pursued in the literature. The first is an additive approach, in which all potentially influential variables are weighted according to their importance for the vulnerability aspect and then added to form a composite vulnerability index per unit area. The second approach includes variable reduction, mostly Principal Component Analysis (PCA), which reduces the number of interrelated variables to a smaller number of less correlated components, which are also added to form a composite index. We test these two approaches to constructing indices on the area of New York City, as well as two different metrics of input variables, and compare the outcomes for the five boroughs of NY. Our analysis shows that the mapping exercise yields particularly different results in the outer regions and parts of the boroughs, such as outer Queens and Staten Island. However, some of these parts, particularly the coastal areas, receive the highest attention in current adaptation policy. We infer from this that current adaptation policy and practice in NY might need to be discussed, as these outer urban areas show relatively low social vulnerability compared with the more central parts, i.e. the high-density areas of Manhattan, Central Brooklyn, Central Queens and the Southern Bronx. The inner urban parts receive less adaptation attention but bear a higher risk of damage in case of hazards in those areas. This is conceivable, e.g., during large heatwaves, which would affect the inner and poorer parts of the city more than the outer urban areas. In light of the recent planning practice of NY, one needs to question and discuss who in NY makes adaptation policy for whom, and the presented analyses point towards an under-representation of the needs of the socially vulnerable population, such as the poor, the elderly, and ethnic minorities, in the current adaptation practice in New York City.
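
A compact sketch of the two index-construction approaches on synthetic tract-level indicators, assuming equal weights for the additive index and two retained components for the PCA variant (both assumptions are illustrative, not the study's choices):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Hypothetical tract indicators: % poverty, % over 65, % minority, % no vehicle.
X = rng.random((200, 4))

Z = StandardScaler().fit_transform(X)           # put all variables on a common scale

# Approach 1: additive index with (here equal) importance weights.
weights = np.array([0.25, 0.25, 0.25, 0.25])
additive_index = Z @ weights

# Approach 2: PCA-based index summing the retained components.
pca = PCA(n_components=2)
components = pca.fit_transform(Z)
pca_index = components.sum(axis=1)

# Compare how the two composites rank the same tracts (Spearman-style rank correlation).
rank_corr = np.corrcoef(np.argsort(np.argsort(additive_index)),
                        np.argsort(np.argsort(pca_index)))[0, 1]
print(f"explained variance of retained components: {pca.explained_variance_ratio_.sum():.2f}")
print(f"rank correlation between additive and PCA indices: {rank_corr:.2f}")
```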

Keywords: vulnerability mapping, social vulnerability, additive approach, Principal Component Analysis (PCA), New York City, United States, adaptation, social sensitivity

Procedia PDF Downloads 395
366 A Weighted K-Medoids Clustering Algorithm for Effective Stability in Vehicular Ad Hoc Networks

Authors: Rejab Hajlaoui, Tarek Moulahi, Hervé Guyennet

Abstract:

In a highway scenario, vehicle speeds can exceed 120 km/h. Therefore, any vehicle can enter or leave the network within a very short time. This mobility adversely affects network connectivity and decreases the lifetime of all established links. To ensure effective stability in vehicular ad hoc networks with a minimum broadcasting storm, we have developed a weighted algorithm based on the k-medoids clustering algorithm (WKCA). Indeed, the number of clusters and the initial cluster heads are not selected randomly as usual, but by considering the available transmission range and the environment size. Then, to ensure optimal assignment of nodes to clusters in both k-medoids phases, the combined weight of any node is computed according to additional metrics including direction, relative speed and proximity. Empirical results prove that, in addition to the convergence speed that characterizes the k-medoids algorithm, our proposed model outperforms both the AODV-Clustering and OLSR-Clustering protocols under different densities and velocities in terms of end-to-end delay, packet delivery ratio, and throughput.
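
The exact weighting scheme is not given in the abstract; the sketch below shows one plausible combined weight built from proximity, relative speed and direction, together with a single medoid-update step that picks the lowest-weight node as cluster head. The coefficients and normalisation constants are assumptions, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30
pos = rng.uniform(0, 1000, size=(n, 2))          # vehicle positions (m)
speed = rng.uniform(80, 120, size=n)             # speeds (km/h)
heading = rng.choice([0.0, np.pi], size=n)       # two travel directions on a highway

def combined_weight(i, members, w=(0.5, 0.3, 0.2)):
    """Lower is better: closeness to cluster members, similar speed, same direction.
    The weighting coefficients are illustrative, not taken from the paper."""
    d = np.linalg.norm(pos[members] - pos[i], axis=1).mean()        # proximity
    dv = np.abs(speed[members] - speed[i]).mean()                   # relative speed
    dh = (heading[members] != heading[i]).mean()                    # direction mismatch
    return w[0] * d / 1000 + w[1] * dv / 40 + w[2] * dh

# One k-medoids-style update step: within a given cluster, the node with the
# smallest combined weight becomes the cluster head (medoid).
cluster = np.arange(0, 10)                       # indices of one cluster's members
head = min(cluster, key=lambda i: combined_weight(i, cluster))
print("selected cluster head:", head)
```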

Keywords: communication, clustering algorithm, k-medoids, sensor, vehicular ad hoc network

Procedia PDF Downloads 238
365 Malaria Parasite Detection Using Deep Learning Methods

Authors: Kaustubh Chakradeo, Michael Delves, Sofya Titarenko

Abstract:

Malaria is a serious disease which affects hundreds of millions of people around the world each year. If not treated in time, it can be fatal. Despite recent developments in malaria diagnostics, microscopy remains the most common method to detect malaria. Unfortunately, the accuracy of microscopic diagnostics depends on the skill of the microscopist and limits the throughput of malaria diagnosis. With the development of Artificial Intelligence tools and Deep Learning techniques in particular, it is possible to lower the cost while achieving an overall higher accuracy. In this paper, we present a VGG-based model and compare it with previously developed models for identifying infected cells. Our model surpasses most previously developed models across a range of accuracy metrics. The model has the advantage of being constructed from a relatively small number of layers, which reduces the required computing resources and computational time. Moreover, we test our model on two types of datasets and argue that the currently developed deep-learning-based methods cannot efficiently distinguish between infected and contaminated cells. A more precise study of suspicious regions is required.
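
A minimal sketch of a compact VGG-style classifier for infected versus uninfected cell images in Keras; the layer sizes, input resolution and training settings are illustrative, not the paper's exact architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Compact VGG-style network for 2-class (infected / uninfected) cell images.
model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),     # probability that the cell is infected
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", tf.keras.metrics.AUC()])
model.summary()
# Training would then use e.g. model.fit(train_images, train_labels,
# validation_data=(val_images, val_labels), epochs=20).
```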

Keywords: convolution neural network, deep learning, malaria, thin blood smears

Procedia PDF Downloads 130
364 Metrics and Methods for Improving Resilience in Agribusiness Supply Chains

Authors: Golnar Behzadi, Michael O'Sullivan, Tava Olsen, Abraham Zhang

Abstract:

By definition, increasing supply chain resilience improves the supply chain's ability to return to normal, or to an even more desirable situation, quickly and efficiently after being hit by a disruption. This is especially critical in agribusiness supply chains, where the products are perishable and have a short life-cycle. In this paper, we propose a resilience metric to capture and improve the recovery process of an agribusiness supply chain, in terms of both performance and time, following either a supply- or demand-side disruption. We build a model that determines optimal supply chain recovery planning decisions and selects the best resilient strategies that minimize the loss of profit during the recovery time window. The model is formulated as a two-stage stochastic mixed-integer linear programming problem and solved with a branch-and-cut algorithm. The results show that the optimal recovery schedule is highly dependent on the duration of the time window allowed for recovery. In addition, the profit loss during recovery is reduced by utilizing the proposed resilient actions.

Keywords: agribusiness supply chain, recovery, resilience metric, risk management

Procedia PDF Downloads 397
363 Dynamics and Advection in a Vortex Parquet on the Plane

Authors: Filimonova Alexandra

Abstract:

Inviscid incompressible fluid flows are considered. The object of the study is a vortex parquet – a structure consisting of distributed vortex spots of different directions, occupying the entire plane. The main attention is paid to the study of advection processes of passive particles in the corresponding velocity field. The dynamics of the vortex structures is considered in a rectangular region under the assumption that periodic boundary conditions are imposed on the stream function. Numerical algorithms are based on the solution of the initial-boundary value problem for the nonstationary Euler equations in terms of vorticity and stream function. For this, the spectral-vortex meshless method is used. It is based on the approximation of the stream function by a truncated Fourier series and the approximation of the vorticity field by the least-squares method from its values in marker particles. A vortex configuration consisting of four vortex patches is investigated. Results of a numerical study of the dynamics and interaction of the structure are presented. The influence of the patch radius and the relative position of positively and negatively directed patches on the processes of interaction and mixing is studied. The obtained results correspond to the following possible scenarios: the initial configuration does not change over time; the initial configuration forms a new structure, which is maintained for longer times; or the initial configuration returns to its initial state after a certain period of time. The processes of mass transfer of vorticity by liquid particles on the plane were calculated and analyzed. The results of a numerical analysis of the particle dynamics and trajectories on the entire plane and the field of local Lyapunov exponents are presented.
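
The paper's spectral-vortex solver is not reproduced here; as a minimal stand-in, the sketch below advects passive markers with a fourth-order Runge-Kutta scheme in a periodic, parquet-like velocity field derived from the stream function psi(x, y) = sin(x)sin(y).

```python
import numpy as np

# Illustrative periodic "parquet" of counter-rotating cells via the stream
# function psi(x, y) = sin(x) * sin(y); a stand-in configuration, not the
# paper's spectral-vortex solution.
def velocity(xy):
    x, y = xy[..., 0], xy[..., 1]
    u = np.sin(x) * np.cos(y)      # u = d(psi)/dy
    v = -np.cos(x) * np.sin(y)     # v = -d(psi)/dx
    return np.stack([u, v], axis=-1)

def rk4_step(xy, dt):
    k1 = velocity(xy)
    k2 = velocity(xy + 0.5 * dt * k1)
    k3 = velocity(xy + 0.5 * dt * k2)
    k4 = velocity(xy + dt * k3)
    return xy + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Advect a cloud of passive markers and record their trajectories.
rng = np.random.default_rng(0)
particles = rng.uniform(0, 2 * np.pi, size=(500, 2))
dt, steps = 0.05, 400
trajectory = np.empty((steps + 1,) + particles.shape)
trajectory[0] = particles
for t in range(steps):
    particles = rk4_step(particles, dt)
    trajectory[t + 1] = particles
print("final particle spread (std per axis):", trajectory[-1].std(axis=0))
```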

Keywords: ideal fluid, meshless methods, vortex structures in liquids, vortex parquet

Procedia PDF Downloads 64
362 Chitosan Doped Curcumin Gold Clusters Flexible Nanofiber for Wound Dressing and Anticancer Activities

Authors: Saravanan Govindaraju, Kyusik Yun

Abstract:

The purpose of this study is to develop chitosan-doped curcumin gold cluster nanofibers for wound healing and skin cancer drug delivery applications. Chitosan is a typical marine polysaccharide composed of glucosamine and N-acetyl glucosamine; it is a biodegradable and biocompatible polymer. Curcumin is a natural bioactive molecule obtained from Curcuma longa, which is cultivated mostly in Asian countries such as India and China. It has natural antioxidant, antimicrobial, wound-healing and anticancer properties. Owing to these advantages, we prepared nanofibers combining the natural polymer chitosan with curcumin and gold nanoclusters (CH-CUR-AuNCs nanofibers). The prepared nanofibers were characterized using Fourier transform infrared spectroscopy (FT-IR) and scanning electron microscopy (SEM). Antibacterial studies were performed with E. coli and S. aureus. Antioxidant assay, drug release test, and cytotoxicity will be evaluated. The prepared nanofibers emit low-intensity red fluorescence. The FTIR spectra confirm the presence of chitosan and curcumin in the nanofibers. In vitro studies clearly show antibacterial activity against gram-negative and gram-positive bacteria; in particular, the synthesised nanofibers provide better antibacterial activity against gram-negative than gram-positive strains. The cytotoxicity study also shows a better killing rate in cancer cells while remaining biocompatible with normal cells. The prepared CH-CUR-AuNCs nanofibers thus provide a better killing rate against bacterial strains and cancer cells. Finally, the prepared nanofibers could be used as wound-healing dressings, patches for skin cancer and in other biomedical applications.

Keywords: curcumin, chitosan, gold clusters, nanofibers

Procedia PDF Downloads 261
361 Modeling User Context Using CEAR Diagram

Authors: Ravindra Dastikop, G. S. Thyagaraju, U. P. Kulkarni

Abstract:

Even though the number of context-aware applications is increasing day by day along with their users, there is still no generic programming paradigm for context-aware applications. This situation could be remedied by designing and developing an appropriate context model and programming paradigm for context-aware applications. In this paper, we propose a static context model and metrics for validating the expressiveness and understandability of the model. The proposed context model is a way of describing the situation of a user using context entities, attributes and relationships. The model, which is an extended and hybrid version of the ER model, ontology model and graphical model, is specifically meant for expressing and understanding the user situation in a context-aware environment. The model is useful for understanding context-aware problems, preparing documentation and designing programs and databases. The model makes use of a context entity attributes relationship (CEAR) diagram to represent associations between context entities and attributes. We have identified a new set of graphical notations for improving the expressiveness and understandability of context from the end-user perspective.

Keywords: user context, context entity, context entity attributes, situation, sensors, devices, relationships, actors, expressiveness, understandability

Procedia PDF Downloads 344
360 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis

Authors: Gon Park

Abstract:

Seoul in South Korea established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan does not include detailed analyses of green infrastructure that incorporate land-cover information into many structural classes. This study maps green infrastructure networks of Seoul to complement these green plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, were identified by incorporating cadastral data of 967,502 parcels into 135 land use maps using a geographic information system. Network analyses were used to rank the hubs and links of a green infrastructure map by applying a force-directed algorithm, weighted values, and binary relationships, with metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can be used as a framework to identify and rank hubs, links, and networks for green infrastructure planning under variable scenarios of green areas in cities.
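
A small sketch of the kind of network analysis described, using networkx on a toy graph of green hubs and corridors; the hub names, edge weights and choice of centrality measures are illustrative assumptions, not the study's data.

```python
import networkx as nx

# Toy green-infrastructure graph: hubs are green areas, links are corridors,
# edge weights reflect connection cost such as distance (illustrative values).
G = nx.Graph()
G.add_weighted_edges_from([
    ("Namsan", "Han River Park", 2.0),
    ("Han River Park", "Olympic Park", 3.5),
    ("Namsan", "Bukhansan", 6.0),
    ("Bukhansan", "Seoul Forest", 4.0),
    ("Seoul Forest", "Han River Park", 1.5),
])

density = nx.density(G)
degree_rank = nx.degree_centrality(G)
betweenness_rank = nx.betweenness_centrality(G, weight="weight")

print(f"network density: {density:.2f}")
for hub, score in sorted(betweenness_rank.items(), key=lambda kv: -kv[1]):
    print(f"{hub}: betweenness = {score:.2f}, degree = {degree_rank[hub]:.2f}")
```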

Keywords: cadastral data, green Infrastructure, network analysis, parcel data

Procedia PDF Downloads 205
359 Developing a Machine Learning-based Cost Prediction Model for Construction Projects using Particle Swarm Optimization

Authors: Soheila Sadeghi

Abstract:

Accurate cost prediction is essential for effective project management and decision-making in the construction industry. This study aims to develop a cost prediction model for construction projects using Machine Learning techniques and Particle Swarm Optimization (PSO). The research utilizes a comprehensive dataset containing project cost estimates, actual costs, resource details, and project performance metrics from a road reconstruction project. The methodology involves data preprocessing, feature selection, and the development of an Artificial Neural Network (ANN) model optimized using PSO. The study investigates the impact of various input features, including cost estimates, resource allocation, and project progress, on the accuracy of cost predictions. The performance of the optimized ANN model is evaluated using metrics such as Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and R-squared. The results demonstrate the effectiveness of the proposed approach in predicting project costs, outperforming traditional benchmark models. The feature selection process identifies the most influential variables contributing to cost variations, providing valuable insights for project managers. However, this study has several limitations. Firstly, the model's performance may be influenced by the quality and quantity of the dataset used. A larger and more diverse dataset covering different types of construction projects would enhance the model's generalizability. Secondly, the study focuses on a specific optimization technique (PSO) and a single Machine Learning algorithm (ANN). Exploring other optimization methods and comparing the performance of various ML algorithms could provide a more comprehensive understanding of the cost prediction problem. Future research should focus on several key areas. Firstly, expanding the dataset to include a wider range of construction projects, such as residential buildings, commercial complexes, and infrastructure projects, would improve the model's applicability. Secondly, investigating the integration of additional data sources, such as economic indicators, weather data, and supplier information, could enhance the predictive power of the model. Thirdly, exploring the potential of ensemble learning techniques, which combine multiple ML algorithms, may further improve cost prediction accuracy. Additionally, developing user-friendly interfaces and tools to facilitate the adoption of the proposed cost prediction model in real-world construction projects would be a valuable contribution to the industry. The findings of this study have significant implications for construction project management, enabling proactive cost estimation, resource allocation, budget planning, and risk assessment, ultimately leading to improved project performance and cost control. This research contributes to the advancement of cost prediction techniques in the construction industry and highlights the potential of Machine Learning and PSO in addressing this critical challenge. However, further research is needed to address the limitations and explore the identified future research directions to fully realize the potential of ML-based cost prediction models in the construction domain.
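
As a sketch of the PSO-optimised ANN idea, the snippet below trains a tiny one-hidden-layer network on synthetic project records by letting a minimal global-best PSO search over the flattened network weights; the features, swarm parameters and network size are assumptions, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic project records: [estimated cost, resource hours, % progress].
X = rng.uniform([100, 500, 0], [900, 5000, 100], size=(120, 3))
true_w = np.array([1.1, 0.05, 2.0])
y = X @ true_w + rng.normal(0, 25, size=120)          # "actual cost"

H = 5  # hidden units of the one-hidden-layer ANN

def unpack(vec):
    i = 3 * H
    return vec[:i].reshape(3, H), vec[i:i + H], vec[i + H:i + 2 * H], vec[-1]

def predict(vec, X):
    W1, b1, W2, b2 = unpack(vec)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(vec):
    return np.mean((predict(vec, Xs) - ys) ** 2)

# Standardise inputs/outputs so the tanh units are not saturated.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()

# Minimal global-best PSO over the flattened ANN weights.
dim = 3 * H + H + H + 1
n_particles, iters = 30, 300
pos = rng.normal(0, 0.5, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_err = pos.copy(), np.array([mse(p) for p in pos])
gbest = pbest[pbest_err.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    err = np.array([mse(p) for p in pos])
    improved = err < pbest_err
    pbest[improved], pbest_err[improved] = pos[improved], err[improved]
    gbest = pbest[pbest_err.argmin()].copy()

rmse = np.sqrt(np.mean((predict(gbest, Xs) * y.std() + y.mean() - y) ** 2))
print(f"PSO-trained ANN RMSE on the synthetic data: {rmse:.1f}")
```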

Keywords: cost prediction, construction projects, machine learning, artificial neural networks, particle swarm optimization, project management, feature selection, road reconstruction

Procedia PDF Downloads 58
358 Eco Scale: A Tool for Assessing the Greenness of Pharmaceuticals Analysis

Authors: Heba M. Mohamed

Abstract:

Owing to scientific and public concern about health and the environment, and the quest for a better quality of life, “Green”, “Environmentally” and “Eco” friendly practices have been presented and implemented in different research areas. Subsequently, researchers’ attention has been drawn towards greening analytical methodologies and taking the Green Analytical Chemistry (GAC) principles into consideration. It is highly important to appraise the environmental impact of each of the implemented green approaches. Compared to the other traditional green metrics (E-factor, atom economy and the process profile), the Eco-Scale is the optimum choice for assessing the environmental impact of the analytical procedures used for pharmaceuticals analysis. For analytical methodologies, the Eco-Scale is calculated by allotting penalty points to any aspect of the analytical procedure that deviates from the model green analysis, where the perfect green analysis has an Eco-Scale value of 100. In this work, the Eco-Scale was calculated and compared for some of the reported green analytical methods to accentuate their greening potential, as the different scores reveal how green each method is compared to the ideal value. The study emphasizes that greenness measurement is not only about determining the quantity of waste but also requires a holistic scheme that considers all factors.
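
A minimal sketch of the Eco-Scale tally, assuming the usual scheme of subtracting penalty points from an ideal score of 100; the individual penalty values and the interpretation thresholds below follow commonly published guidance and are illustrative here, not a published assessment.

```python
# Illustrative Analytical Eco-Scale tally: penalty points (PPs) are assigned to
# reagents, energy use, occupational hazards and waste, then subtracted from 100.
penalty_points = {
    "acetonitrile (10-100 mL, hazardous solvent)": 8,
    "instrument energy (> 1.5 kWh per sample)": 2,
    "occupational hazard (vapours released)": 3,
    "waste (1-10 mL, no treatment)": 6,
}

eco_scale = 100 - sum(penalty_points.values())
if eco_scale > 75:
    verdict = "excellent green analysis"
elif eco_scale > 50:
    verdict = "acceptable green analysis"
else:
    verdict = "inadequate green analysis"
print(f"Eco-Scale score = {eco_scale} ({verdict})")
```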

Keywords: eco scale, green analysis, environmentally friendly, pharmaceuticals analysis

Procedia PDF Downloads 438
357 Receptor-Independent Effects of Endocannabinoid Anandamide on Contractility and Electrophysiological Properties of Rat Ventricular Myocytes

Authors: Lina T. Al Kury, Oleg I. Voitychuk, Ramiz M. Ali, Sehamuddin Galadari, Keun-Hang Susan Yang, Frank Christopher Howarth, Yaroslav M. Shuba, Murat Oz

Abstract:

A role for anandamide (N-arachidonoyl ethanolamide; AEA), a major endocannabinoid, in the cardiovascular system in various pathological conditions has been reported in earlier studies. In the present work, we have hypothesized that the antiarrhythmic effects reported for AEA are due to its negative inotropic effect and altered action potential (AP) characteristics. Therefore, we tested the effects of AEA on contractility and electrophysiological properties of rat ventricular myocytes. Video edge detection was used to measure myocyte shortening. Intracellular Ca2+ was measured in cells loaded with the fluorescent indicator fura-2 AM. Whole-cell patch-clamp technique was employed to investigate the effect of AEA on the characteristics of APs. AEA (1 μM) caused a significant decrease in the amplitudes of electrically-evoked myocyte shortening and Ca2+ transients and significantly decreased the duration of AP. The effect of AEA on myocyte shortening and AP characteristics was not altered in the presence of pertussis toxin (PTX, 2 µg/ml for 4 h), AM251 and SR141716 (cannabinoid type 1 receptor antagonists) or AM630 and SR 144528 (cannabinoid type 2 receptor antagonists). Furthermore, AEA inhibited voltage-activated inward Na+ (INa) and Ca2+ (IL,Ca) currents; major ionic currents shaping the APs in ventricular myocytes, in a voltage and PTX-independent manner. Collectively, the results suggest that AEA depresses ventricular myocyte contractility, by decreasing the action potential duration (APD), and inhibits the function of voltage-dependent Na+ and L-type Ca2+ channels in a manner independent of cannabinoid receptors. This mechanism may be importantly involved in the antiarrhythmic effects of anandamide.

Keywords: action potential, anandamide, cannabinoid receptor, endocannabinoid, ventricular myocytes

Procedia PDF Downloads 355
356 Efficient Deep Neural Networks for Real-Time Strawberry Freshness Monitoring: A Transfer Learning Approach

Authors: Mst. Tuhin Akter, Sharun Akter Khushbu, S. M. Shaqib

Abstract:

A real-time system architecture is highly effective for monitoring and detecting various damaged products or fruits that may deteriorate over time or become infected with diseases. Deep learning models have proven to be effective in building such architectures. However, building a deep learning model from scratch is a time-consuming and costly process. A more efficient solution is to utilize deep neural network (DNN) based transfer learning models in the real-time monitoring architecture. This study focuses on using a novel strawberry dataset to develop effective transfer learning models for the proposed real-time monitoring system architecture, specifically for evaluating and detecting strawberry freshness. Several state-of-the-art transfer learning models were employed, and the best performing model was found to be Xception, demonstrating higher performance across evaluation metrics such as accuracy, recall, precision, and F1-score.
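
A minimal transfer-learning sketch with a pretrained Xception backbone in Keras; the number of freshness classes, input size, augmentation and training schedule are assumptions rather than the study's exact setup.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Transfer-learning sketch: Xception pretrained on ImageNet, with a new head
# for strawberry freshness classes (class count and image size are assumptions).
NUM_CLASSES = 3          # e.g. fresh / mildly damaged / rotten
base = tf.keras.applications.Xception(weights="imagenet", include_top=False,
                                      input_shape=(299, 299, 3))
base.trainable = False   # freeze pretrained features for the first training stage

model = models.Sequential([
    layers.Input(shape=(299, 299, 3)),
    layers.RandomFlip("horizontal"),                 # simple image augmentation
    layers.RandomRotation(0.1),
    layers.Lambda(tf.keras.applications.xception.preprocess_input),
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10) would follow, after
# which precision, recall and F1-score can be computed on the held-out set.
```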

Keywords: strawberry freshness evaluation, deep neural network, transfer learning, image augmentation

Procedia PDF Downloads 90
355 Generalized Approach to Linear Data Transformation

Authors: Abhijith Asok

Abstract:

This paper presents a generalized approach for the simple linear data transformation, Y=bX, through an integration of multidimensional coordinate geometry, vector space theory and polygonal geometry. The scaling is performed by adding an additional ’Dummy Dimension’ to the n-dimensional data, which helps plot two dimensional component-wise straight lines on pairs of dimensions. The end result is a set of scaled extensions of observations in any of the 2n spatial divisions, where n is the total number of applicable dimensions/dataset variables, created by shifting the n-dimensional plane along the ’Dummy Axis’. The derived scaling factor was found to be dependent on the coordinates of the common point of origin for diverging straight lines and the plane of extension, chosen on and perpendicular to the ’Dummy Axis’, respectively. This result indicates the geometrical interpretation of a linear data transformation and hence, opportunities for a more informed choice of the factor ’b’, based on a better choice of these coordinate values. The paper follows on to identify the effect of this transformation on certain popular distance metrics, wherein for many, the distance metric retained the same scaling factor as that of the features.

Keywords: data transformation, dummy dimension, linear transformation, scaling

Procedia PDF Downloads 297
354 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods

Authors: Devatha Kalyan Kumar, R. Poovarasan

Abstract:

In this paper, we take certain important factors and health parameters of diabetes patients, especially children with congenital (pediatric) diabetes, and use three methods, namely Spearman correlation, cost optimization and control charts, to assess the importance of each attribute in the dataset, thereby determining the attribute most strongly correlated with diabetes among young patients. We use the cost optimization, control chart and Spearman methodologies for the real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology used in the software development process to identify the complexity between the various modules of the software. Identifying the complexity is important because if the complexity is higher, then there is a higher chance of occurrence of risk in the software. With the use of the control chart, the mean, variance and standard deviation of the data are calculated. With the use of the cost optimization model, we optimize the variables. Hence, we choose the Spearman, control chart and cost optimization methods to assess the data efficiency in diabetes datasets.
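
A small sketch of two of the three methods on synthetic patient attributes: a Spearman rank correlation between two attributes and control-chart statistics (mean, variance and 3-sigma limits) for one of them. The variables and values are illustrative, not the study's dataset.

```python
import numpy as np
from scipy import stats

# Hypothetical records for young (congenital) diabetes patients.
rng = np.random.default_rng(3)
hba1c = rng.normal(8.0, 1.2, size=60)                      # glycated haemoglobin (%)
fasting_glucose = 18 * hba1c + rng.normal(0, 15, size=60)  # mg/dL, correlated attribute

# Spearman rank correlation: monotonic relationship between two attributes.
rho, p_value = stats.spearmanr(hba1c, fasting_glucose)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")

# Control-chart statistics for one attribute: mean, variance and 3-sigma limits.
mean, var, sd = hba1c.mean(), hba1c.var(ddof=1), hba1c.std(ddof=1)
ucl, lcl = mean + 3 * sd, mean - 3 * sd
out_of_control = np.sum((hba1c > ucl) | (hba1c < lcl))
print(f"mean = {mean:.2f}, variance = {var:.2f}, limits = [{lcl:.2f}, {ucl:.2f}]")
print(f"observations outside the control limits: {out_of_control}")
```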

Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric

Procedia PDF Downloads 256
353 Digital Image Steganography with Multilayer Security

Authors: Amar Partap Singh Pharwaha, Balkrishan Jindal

Abstract:

In this paper, a new method is developed for hiding an image in a digital image with multilayer security. In the proposed method, the secret image is first encrypted using a flexible matrix-based symmetric key to add the first layer of security. Then another layer of security is added to the secret data by encrypting the ciphered data using the Pythagorean theorem method. The ciphered data bits (4 bits) produced after double encryption are then embedded within the digital image in the spatial domain using Least Significant Bit (LSB) substitution. To improve the image quality of the stego-image, an improved form of pixel adjustment process is proposed. To evaluate the effectiveness of the proposed method, image quality metrics including Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), entropy, correlation, mean value and Universal Image Quality Index (UIQI) are measured. It has been found experimentally that the proposed method provides higher security as well as robustness. In fact, the results of this study are quite promising.
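
A minimal sketch of the final embedding and evaluation steps: hiding 4 bits per pixel by LSB substitution in a synthetic grey-scale cover and measuring PSNR. The encryption layers and the pixel adjustment process described in the paper are omitted here.

```python
import numpy as np

def embed_lsb(cover, secret_bits, n_lsb=4):
    """Replace the n_lsb least significant bits of successive cover pixels with secret bits."""
    flat = cover.flatten()
    assert len(secret_bits) <= flat.size * n_lsb, "secret too large for this cover image"
    keep_mask = 0xFF ^ ((1 << n_lsb) - 1)          # e.g. 0xF0 keeps the upper 4 bits
    stego = flat.copy()
    for i in range(0, len(secret_bits), n_lsb):
        chunk = "".join(str(b) for b in secret_bits[i:i + n_lsb]).ljust(n_lsb, "0")
        stego[i // n_lsb] = (int(stego[i // n_lsb]) & keep_mask) | int(chunk, 2)
    return stego.reshape(cover.shape)

def psnr(original, stego):
    mse = np.mean((original.astype(float) - stego.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

rng = np.random.default_rng(4)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # grey-scale cover image
secret_bits = rng.integers(0, 2, size=4000).tolist()          # e.g. the ciphered data bits
stego = embed_lsb(cover, secret_bits)
print(f"PSNR of the stego-image: {psnr(cover, stego):.2f} dB")
```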

Keywords: Pythagorean theorem, pixel adjustment, ciphered data, image hiding, least significant bit, flexible matrix

Procedia PDF Downloads 337
352 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach

Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta

Abstract:

Across the globe, there are a lot of chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach towards this problem involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics like accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer with an accuracy rate of 96.04% in our study.
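
A minimal sketch of the Random Forest branch of the described pipeline; the file name heart.csv and the column name target are placeholders for whichever heart-disease dataset is actually used.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# The CSV path and column names are hypothetical placeholders.
df = pd.read_csv("heart.csv")
X = df.drop(columns=["target"])                    # clinical attributes
y = df["target"]                                   # 1 = heart disease present

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = RandomForestClassifier(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Random Forest accuracy: {accuracy:.2%}")

# Attribute importance, mirroring the paper's analysis of influential features.
importance = pd.Series(model.feature_importances_, index=X.columns)
print(importance.sort_values(ascending=False).head())
```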

Keywords: support vector machines, decision tree, random forest

Procedia PDF Downloads 40
351 An Efficient Resource Management Algorithm for Mobility Management in Wireless Mesh Networks

Authors: Mallikarjuna Rao Yamarthy, Subramanyam Makam Venkata, Satya Prasad Kodati

Abstract:

The main objective of the proposed work is to reduce the overall network traffic incurred by mobility management and the packet delivery cost, and to increase resource utilization. The proposed algorithm, an Efficient Resource Management Algorithm (ERMA) for mobility management in wireless mesh networks, relies on a pointer-based mobility management scheme. Whenever a mesh client moves from one mesh router to another, a pointer is set up dynamically between the previous mesh router and the current mesh router based on distance constraints. The algorithm is evaluated using signaling cost, data delivery cost and total communication cost as performance metrics. The proposed algorithm is demonstrated for both internet sessions and intranet sessions and yields significantly better performance in terms of signaling cost, data delivery cost, and total communication cost.

Keywords: data delivery cost, mobility management, pointer forwarding, resource management, wireless mesh networks

Procedia PDF Downloads 367
350 Generating Insights from Data Using a Hybrid Approach

Authors: Allmin Susaiyah, Aki Härmä, Milan Petković

Abstract:

Automatic generation of insights from data using insight mining systems (IMS) is useful in many applications, such as personal health tracking, patient monitoring, and business process management. Existing IMS face challenges in controlling insight extraction, scaling to large databases, and generalising to unseen domains. In this work, we propose a hybrid approach consisting of rule-based and neural components for generating insights from data while overcoming the aforementioned challenges. Firstly, a rule-based data2CNL component is used to extract statistically significant insights from data and represent them in a controlled natural language (CNL). Secondly, a BERTSum-based CNL2NL component is used to convert these CNLs into natural language texts. We improve the model using task-specific and domain-specific fine-tuning. Our approach has been evaluated using statistical techniques and standard evaluation metrics. We overcame the aforementioned challenges and observed significant improvement with domain-specific fine-tuning.

Keywords: data mining, insight mining, natural language generation, pre-trained language models

Procedia PDF Downloads 119
349 A Review of Routing Protocols for Mobile Ad-Hoc NETworks (MANET)

Authors: Hafiza Khaddija Saman, Muhammad Sufyan

Abstract:

The increase in availability and popularity of mobile wireless devices has led researchers to develop a wide variety of Mobile Ad-hoc Networking (MANET) protocols to exploit the unique communication opportunities presented by these devices. Devices are able to communicate directly using the wireless spectrum in a peer-to-peer fashion and to route messages through intermediate nodes; however, the nature of shared wireless communication and mobile devices results in many routing and security challenges which must be addressed before deploying a MANET. In this paper, we investigate the range of MANET routing protocols available and discuss the functionalities of several, ranging from early protocols such as DSDV to more advanced ones such as MAODV. Our protocol study focuses upon the work by Perkins in developing and improving MANET routing. A range of literature relating to the field of MANET routing was identified and reviewed; we also reviewed literature on the topic of securing AODV-based MANETs, as this may be the most popular MANET protocol. The literature review identified a number of trends within research papers, such as exclusive use of the random waypoint mobility model, excluding key metrics from simulation results, and not comparing protocol performance against available alternatives.

Keywords: protocol, MANET, ad-Hoc, communication

Procedia PDF Downloads 261
348 Leveraging Quality Metrics in Voting Model Based Thread Retrieval

Authors: Atefeh Heydari, Mohammadali Tavakoli, Zuriati Ismail, Naomie Salim

Abstract:

Seeking and sharing knowledge on online forums has made them popular in recent years. Although online forums are valuable sources of information, retrieving reliable threads with high-quality content is an issue due to the variety of message sources. The majority of existing information retrieval systems ignore the quality of retrieved documents, particularly in the field of thread retrieval. In this research, we present an approach that employs various quality features in order to investigate the quality of retrieved threads. Different aspects of content quality, including completeness, comprehensiveness, and politeness, are assessed using these features, which leads to finding not only textually but also conceptually relevant threads for a user query within a forum. To analyse the influence of the features, we used an adapted version of the voting model for thread search as a retrieval system. We equipped it with each feature individually and with various combinations of features in turn over multiple runs. The results show that incorporating the quality features significantly enhances the effectiveness of the utilised retrieval system.

Keywords: content quality, forum search, thread retrieval, voting techniques

Procedia PDF Downloads 213
347 Integrated Grey Rational Analysis-Standard Deviation Method for Handover in Heterogeneous Networks

Authors: Mohanad Alhabo, Naveed Nawaz, Mahmoud Al-Faris

Abstract:

The dense deployment of small cells is a promising solution to enhance the coverage and capacity of heterogeneous networks (HetNets). However, unplanned deployment can bring new challenges to the network, ranging from interference to unnecessary handovers and handover failures. This causes a degradation in the quality of service (QoS) delivered to the end user. In this paper, we propose an integrated Grey Rational Analysis Standard Deviation based handover method (GRA-SD) for HetNets. The proposed method integrates the Standard Deviation (SD) technique to acquire the weights of the handover metrics and the GRA method to select the best handover base station. The performance of the GRA-SD method is evaluated and compared with traditional Multiple Attribute Decision Making (MADM) methods, including the Simple Additive Weighting (SAW) and VIKOR methods. Results reveal that the proposed method outperforms the other methods in terms of minimizing the number of frequent unnecessary handovers and handover failures, in addition to improving energy efficiency.
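
A compact sketch of SD-based objective weighting followed by a grey relational ranking of candidate base stations; the metrics, their values and the distinguishing coefficient of 0.5 are illustrative assumptions (GRA is more commonly expanded as Grey Relational Analysis).

```python
import numpy as np

# Candidate base stations x handover metrics (illustrative values):
# columns = [RSRP (dBm), SINR (dB), load (%), delay (ms)]
A = np.array([
    [-85.0, 18.0, 40.0, 12.0],
    [-92.0, 22.0, 25.0, 18.0],
    [-80.0, 15.0, 70.0, 10.0],
    [-88.0, 20.0, 35.0, 15.0],
])
benefit = np.array([True, True, False, False])   # larger-is-better vs smaller-is-better

# Normalise each metric to [0, 1], taking its direction into account.
lo, hi = A.min(axis=0), A.max(axis=0)
N = np.where(benefit, (A - lo) / (hi - lo), (hi - A) / (hi - lo))

# Standard-deviation objective weighting of the metrics.
sd = N.std(axis=0, ddof=1)
w = sd / sd.sum()

# Grey relational coefficients against the ideal alternative (all ones).
delta = np.abs(1.0 - N)
zeta = 0.5                                         # distinguishing coefficient
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = coeff @ w                                  # grey relational grade per candidate
best = int(np.argmax(grade))
print("weights:", np.round(w, 3))
print("grades:", np.round(grade, 3), "-> handover target:", best)
```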

Keywords: energy efficiency, handover, HetNets, MADM, small cells

Procedia PDF Downloads 116