Search results for: finsler metrics
341 Impact of Sustainability Reporting on the Financial Performance of Deposit Money Banks: Pre-Post Analysis of Integrating Environmental, Social, and Governance Disclosure into Corporate Annual Reports
Authors: A. O. Talabi, F. M. Taib, D. J. Jalaludin
Abstract:
The influence of sustainability reporting on the financial performance of Deposit Money Banks (DMBs) both before and after mandated environmental, social, and governance (ESG) disclosure is examined in this article. Using a sample of the top six strategically important listed banks in Nigeria, the study employed the paired sample t-test to compare the pre-mandatory ESG period (2009-2015) with the post-mandatory ESG period (2016-2022). According to the findings, there was no discernible difference between the performance of DMBs in Nigeria before and after the requirement for ESG disclosure. In the pre-mandatory period, sustainability reporting was a major predictor of financial metrics, but in the post-mandatory period, there was no discernible change in financial performance. Market authorities ought to have unrestricted authority to impose severe fines for noncompliance and bring legal action against corporations that fail to disclose ESG. This work contributes to the literature on ESG disclosure and financial performance by considering two different periods.
Keywords: financial, performance, sustainability, reporting
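To make the statistical test concrete, here is a minimal sketch of the paired sample t-test the study describes, run on synthetic pre/post-period figures for six hypothetical banks; the numbers are placeholders, not the paper's data.

```python
# Minimal sketch of a paired-sample t-test on pre/post ESG-period
# performance for six banks; data are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
pre_esg = rng.normal(loc=2.1, scale=0.4, size=6)              # mean ROA %, 2009-2015
post_esg = pre_esg + rng.normal(loc=0.0, scale=0.3, size=6)   # 2016-2022

t_stat, p_value = stats.ttest_rel(pre_esg, post_esg)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# p >= 0.05 would mirror the paper's finding: no discernible
# pre/post difference in financial performance.
```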
Procedia PDF Downloads 139
340 Defining a Framework for Holistic Life Cycle Assessment of Building Components by Considering Parameters Such as Circularity, Material Health, Biodiversity, Pollution Control, Cost, Social Impacts, and Uncertainty
Authors: Naomi Grigoryan, Alexandros Loutsioli Daskalakis, Anna Elisse Uy, Yihe Huang, Aude Laurent (Webanck)
Abstract:
In response to the building and construction sectors accounting for a third of all energy demand and emissions, the European Union has placed new laws and regulations in the construction sector that emphasize material circularity, energy efficiency, biodiversity, and social impact. Existing design tools assess sustainability in early-stage design for products or buildings; however, there is no standardized methodology for measuring the circularity performance of building components. Existing assessment methods for building components focus primarily on carbon footprint but lack the comprehensive analysis required to design for circularity. The research conducted in this paper covers the parameters needed to assess sustainability in the design process of architectural products such as doors, windows, and facades. It maps a framework for a tool that assists designers with real-time sustainability metrics. Considering the life cycle of building components such as facades, windows, and doors involves the life cycle stages applied to product design and many of the methods used in the life cycle analysis of buildings. The current industry standards of sustainability assessment for metal building components follow cradle-to-grave life cycle assessment (LCA), track Global Warming Potential (GWP), and document the parameters used for an Environmental Product Declaration (EPD). Developed by the Ellen MacArthur Foundation, the Material Circularity Indicator (MCI) is a methodology utilizing the data from LCA and EPDs to rate circularity, with a value between 0 and 1 where higher values indicate higher circularity. Expanding on the MCI with additional indicators such as the Water Circularity Index (WCI), the Energy Circularity Index (ECI), the Social Circularity Index (SCI), and Life Cycle Economic Value (EV), and calculating biodiversity risk and uncertainty, the assessment of an architectural product's impact can be targeted more specifically based on product requirements, performance, and lifespan. Broadening the scope of LCA calculation for products to incorporate aspects of building design allows product designers to account for the disassembly of architectural components. For example, the Material Circularity Indicator for architectural products such as windows and facades is typically low due to the impact of glass, as 70% of glass ends up in landfills due to damage in the disassembly process. The low MCI can be counteracted by expanding beyond cradle-to-grave assessment and focusing the design process on disassembly, recycling, and repurposing with the help of real-time assessment tools. Design for Disassembly and Urban Mining have been integrated within the construction field on small scales as project-based exercises, not addressing the entire supply chain of architectural products. By adopting more comprehensive sustainability metrics and incorporating uncertainty calculations, the sustainability of building components can be more accurately assessed with decarbonization and disassembly in mind, addressing the large-scale commercial markets within construction, some of the most significant contributors to climate change.
Keywords: architectural products, early-stage design, life cycle assessment, material circularity indicator
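The core MCI arithmetic can be sketched as follows, in simplified form; the published Ellen MacArthur methodology adds terms for recycling-process waste that are omitted here, and the window example values are illustrative assumptions.

```python
# Simplified sketch of the Material Circularity Indicator (MCI):
# MCI = max(0, 1 - LFI * F(X)), with LFI the Linear Flow Index and
# F(X) = 0.9 / X a utility factor. Recycling-process terms are omitted.
def mci(virgin_mass, waste_mass, product_mass,
        lifetime_ratio=1.0, intensity_ratio=1.0):
    lfi = (virgin_mass + waste_mass) / (2 * product_mass)  # Linear Flow Index
    utility = lifetime_ratio * intensity_ratio             # X = (L/Lav)*(U/Uav)
    return max(0.0, 1.0 - lfi * (0.9 / utility))           # MCI in [0, 1]

# Window glass with poor disassembly recovery: mostly virgin input and
# 70% of the glass mass lost to landfill -> low circularity score.
print(round(mci(virgin_mass=0.8, waste_mass=0.7, product_mass=1.0), 3))
```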
Procedia PDF Downloads 88
339 A Practical Survey on Zero-Shot Prompt Design for In-Context Learning
Authors: Yinheng Li
Abstract:
The remarkable advancements in large language models (LLMs) have brought about significant improvements in natural language processing tasks. This paper presents a comprehensive review of in-context learning techniques, focusing on different types of prompts, including discrete, continuous, few-shot, and zero-shot, and their impact on LLM performance. We explore various approaches to prompt design, such as manual design, optimization algorithms, and evaluation methods, to optimize LLM performance across diverse tasks. Our review covers key research studies in prompt engineering, discussing their methodologies and contributions to the field. We also delve into the challenges faced in evaluating prompt performance, given the absence of a single "best" prompt and the importance of considering multiple metrics. In conclusion, the paper highlights the critical role of prompt design in harnessing the full potential of LLMs and provides insights into the combination of manual design, optimization techniques, and rigorous evaluation for more effective and efficient use of LLMs in various Natural Language Processing (NLP) tasks.
Keywords: in-context learning, prompt engineering, zero-shot learning, large language models
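A minimal sketch of the zero-shot prompting idea the survey covers: the task is specified entirely in the instruction, with no in-context examples. The template wording is illustrative, not taken from the paper.

```python
# Zero-shot prompt construction: no labelled examples are included,
# only an instruction and the query. Template text is illustrative.
def zero_shot_prompt(review: str) -> str:
    return (
        "Classify the sentiment of the following review as "
        "positive, negative, or neutral. Answer with one word.\n\n"
        f"Review: {review}\nSentiment:"
    )

# A few-shot variant would prepend labelled examples before the query;
# the survey compares how such design choices affect LLM performance.
print(zero_shot_prompt("The battery lasts all day and the screen is sharp."))
```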
Procedia PDF Downloads 81
338 Social Vulnerability Mapping in New York City to Discuss Current Adaptation Practice
Authors: Diana Reckien
Abstract:
Vulnerability assessments are increasingly used to support policy-making in complex environments, like urban areas. Usually, vulnerability studies include the construction of aggregate (sub-)indices and the subsequent mapping of those indices across an area of interest. Vulnerability studies have a couple of advantages: they are great communication tools, can inform a wider general debate about environmental issues, and can help allocate and efficiently target scarce resources for adaptation policy and planning. However, they also face a number of challenges: vulnerability assessments are constructed on the basis of a wide range of methodologies, and no single framework or methodology has proven to serve best in certain environments; indicators vary highly according to the spatial scale used; different variables and metrics produce different results; and aggregate or composite vulnerability indicators that are mapped easily distort or bias the picture of vulnerability, as they hide the underlying causes of vulnerability and level out conflicting reasons for vulnerability in space. So there is an urgent need to further develop the methodology of vulnerability studies towards a common framework, which is one aim of this paper. We introduce a social vulnerability approach that, compared with bio-physical or sectoral vulnerability studies, is relatively developed in terms of a common methodology for index construction, guidelines for mapping, assessment of sensitivity, and verification of variables. Two approaches are commonly pursued in the literature. The first is an additive approach, in which all potentially influential variables are weighted according to their importance for the vulnerability aspect and then added to form a composite vulnerability index per unit area. The second approach includes variable reduction, mostly Principal Component Analysis (PCA), which reduces the interrelated variables to a smaller number of less correlated components that are likewise added to form a composite index. We test these two approaches to index construction, as well as two different metrics of input variables, on the area of New York City and compare the outcomes for the five boroughs of NY. Our analysis shows that the mapping exercise yields particularly different results in the outer regions and parts of the boroughs, such as outer Queens and Staten Island. However, some of these parts, particularly the coastal areas, receive the highest attention in current adaptation policy. We infer from this that current adaptation policy and practice in NY might need to be discussed, as these outer urban areas show relatively low social vulnerability compared with the more central parts, i.e., the high-density areas of Manhattan, central Brooklyn, central Queens and the southern Bronx. The inner urban parts receive less adaptation attention but bear a higher risk of damage in case of hazards in those areas. This is conceivable, e.g., during large heat waves, which would affect the inner and poorer parts of the city more than the outer urban areas.
In light of the recent planning practice of NY, one needs to question and discuss who in NY makes adaptation policy for whom; the presented analysis points towards an underrepresentation of the needs of the socially vulnerable population, such as the poor, the elderly, and ethnic minorities, in the current adaptation practice in New York City.
Keywords: vulnerability mapping, social vulnerability, additive approach, Principal Component Analysis (PCA), New York City, United States, adaptation, social sensitivity
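To make the two index-construction approaches concrete, here is a minimal sketch on synthetic tract-level data; the variables, expert weights, and component count are illustrative assumptions, not the study's inputs.

```python
# Sketch of the two composite-index approaches compared in the paper:
# (1) a weighted additive composite and (2) PCA-based variable reduction.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# rows = census tracts; cols = % poor, % elderly, % minority, % no vehicle
X = rng.random((100, 4))
Z = StandardScaler().fit_transform(X)          # standardise before weighting

# Approach 1: additive composite with expert weights summing to 1
weights = np.array([0.4, 0.2, 0.2, 0.2])
additive_index = Z @ weights

# Approach 2: PCA reduction, then sum the retained components
pca_index = PCA(n_components=2).fit_transform(Z).sum(axis=1)

# how far apart the two resulting vulnerability maps would be
print(np.corrcoef(additive_index, pca_index)[0, 1])
```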
Procedia PDF Downloads 395
337 A Weighted K-Medoids Clustering Algorithm for Effective Stability in Vehicular Ad Hoc Networks
Authors: Rejab Hajlaoui, Tarek Moulahi, Hervé Guyennet
Abstract:
In a highway scenario, vehicle speed can exceed 120 km/h. Therefore, any vehicle can enter or leave the network within a very short time. This mobility adversely affects the network connectivity and decreases the lifetime of all established links. To ensure effective stability in vehicular ad hoc networks with minimal broadcast storm, we have developed a weighted algorithm based on the k-medoids clustering algorithm (WKCA). Indeed, the number of clusters and the initial cluster heads are not selected randomly as usual, but by considering the available transmission range and the environment size. Then, to ensure optimal assignment of nodes to clusters in both k-medoids phases, the combined weight of each node is computed from additional metrics, including direction, relative speed and proximity. Empirical results show that, in addition to the convergence speed that characterizes the k-medoids algorithm, our proposed model outperforms both the AODV-Clustering and OLSR-Clustering protocols under different densities and velocities in terms of end-to-end delay, packet delivery ratio, and throughput.
Keywords: communication, clustering algorithm, k-medoids, sensor, vehicular ad hoc network
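An illustrative sketch of the combined node weight used for cluster-head candidacy, built from the three metrics named above; the weighting coefficients and field names are assumptions, not the paper's calibrated values.

```python
# Combined weight from direction, relative speed and proximity;
# a lower weight marks a better cluster-head candidate.
import numpy as np

def combined_weight(node, neighbours, alpha=0.4, beta=0.3, gamma=0.3):
    # direction: mean angular deviation from neighbours' headings (radians)
    direction = np.mean([abs(node["heading"] - n["heading"]) for n in neighbours])
    # relative speed: mean absolute speed difference (km/h)
    rel_speed = np.mean([abs(node["speed"] - n["speed"]) for n in neighbours])
    # proximity: mean Euclidean distance to neighbours (m)
    proximity = np.mean([np.hypot(node["x"] - n["x"], node["y"] - n["y"])
                         for n in neighbours])
    return alpha * direction + beta * rel_speed + gamma * proximity

node = {"heading": 0.10, "speed": 110, "x": 0, "y": 0}
neighbours = [{"heading": 0.05, "speed": 115, "x": 50, "y": 10},
              {"heading": 0.20, "speed": 100, "x": -80, "y": 5}]
print(combined_weight(node, neighbours))
```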
Procedia PDF Downloads 238
336 Malaria Parasite Detection Using Deep Learning Methods
Authors: Kaustubh Chakradeo, Michael Delves, Sofya Titarenko
Abstract:
Malaria is a serious disease which affects hundreds of millions of people around the world each year. If not treated in time, it can be fatal. Despite recent developments in malaria diagnostics, the microscopy method to detect malaria remains the most common. Unfortunately, the accuracy of microscopic diagnostics depends on the skill of the microscopist, and the method limits the throughput of malaria diagnosis. With the development of Artificial Intelligence tools, and Deep Learning techniques in particular, it is possible to lower the cost while achieving an overall higher accuracy. In this paper, we present a VGG-based model and compare it with previously developed models for identifying infected cells. Our model surpasses most previously developed models across a range of accuracy metrics. The model has the advantage of being constructed from a relatively small number of layers, which reduces the required computer resources and computational time. Moreover, we test our model on two types of datasets and argue that the currently developed deep-learning-based methods cannot efficiently distinguish between infected and contaminated cells; a more precise study of suspicious regions is required.
Keywords: convolution neural network, deep learning, malaria, thin blood smears
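A minimal Keras sketch of a VGG-based infected-cell classifier in the spirit of the paper; the input size, head layers, and training setup are placeholders, not the authors' exact architecture.

```python
# VGG16 feature extractor with a small trainable classification head
# for binary (infected vs. uninfected) cell classification.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

base = VGG16(weights="imagenet", include_top=False, input_shape=(128, 128, 3))
base.trainable = False                       # reuse pretrained features

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),   # P(infected)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", tf.keras.metrics.Recall()])
model.summary()
# model.fit(train_images, train_labels, validation_split=0.2, epochs=10)
```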
Procedia PDF Downloads 130
335 Metrics and Methods for Improving Resilience in Agribusiness Supply Chains
Authors: Golnar Behzadi, Michael O'Sullivan, Tava Olsen, Abraham Zhang
Abstract:
By definition, increasing supply chain resilience improves the supply chain’s ability to return to normal, or to an even more desirable situation, quickly and efficiently after being hit by a disruption. This is especially critical in agribusiness supply chains, where the products are perishable and have a short life cycle. In this paper, we propose a resilience metric to capture and improve the recovery process, in terms of both performance and time, of an agribusiness supply chain following either a supply- or demand-side disruption. We build a model that determines optimal supply chain recovery planning decisions and selects the best resilient strategies that minimize the loss of profit during the recovery time window. The model is formulated as a two-stage stochastic mixed-integer linear programming problem and solved with a branch-and-cut algorithm. The results show that the optimal recovery schedule is highly dependent on the duration of the time window allowed for recovery. In addition, the profit loss during recovery is reduced by utilizing the proposed resilient actions.
Keywords: agribusiness supply chain, recovery, resilience metric, risk management
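One simple way to operationalize a recovery-window resilience metric of the kind described, profit retained during recovery relative to the undisrupted baseline, is sketched below; the discrete-time formulation and the profile values are assumptions for illustration.

```python
# Resilience as the fraction of baseline profit retained over the
# recovery window; a value of 1.0 means no loss during recovery.
import numpy as np

def resilience(performance, baseline, window):
    p = np.asarray(performance[:window], dtype=float)
    return p.sum() / (baseline * window)

# weekly profit after a supply-side disruption, recovering towards 100
profile = [40, 55, 70, 85, 95, 100, 100, 100]
print(resilience(profile, baseline=100, window=8))   # ~0.81
```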
Procedia PDF Downloads 397
334 Modeling User Context Using CEAR Diagram
Authors: Ravindra Dastikop, G. S. Thyagaraju, U. P. Kulkarni
Abstract:
Even though the number of context-aware applications is increasing day by day along with the number of users, there is still no generic programming paradigm for context-aware applications. This situation could be remedied by designing and developing an appropriate context model and programming paradigm for context-aware applications. In this paper, we propose a static context model and metrics for validating the expressiveness and understandability of the model. The proposed context model is a way of describing the situation of a user using context entities, attributes and relationships. The model, an extended and hybrid version of the ER model, ontology model and graphical model, is specifically meant for expressing and understanding the user situation in a context-aware environment. The model is useful for understanding context-aware problems, preparing documentation and designing programs and databases. The model makes use of a context entity-attribute-relationship (CEAR) diagram to represent the associations between context entities and attributes. We have identified a new set of graphical notations for improving the expressiveness and understandability of context from the end-user perspective.
Keywords: user context, context entity, context entity attributes, situation, sensors, devices, relationships, actors, expressiveness, understandability
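A small illustrative sketch of the CEAR building blocks described above: context entities carrying attributes, linked by named relationships. The class and field names are assumptions for illustration, not the paper's notation.

```python
# Context entities, attributes, and relationships as plain data classes.
from dataclasses import dataclass, field

@dataclass
class ContextEntity:
    name: str                                   # e.g. "User", "Room"
    attributes: dict = field(default_factory=dict)

@dataclass
class Relationship:
    source: ContextEntity
    target: ContextEntity
    label: str                                  # e.g. "located_in"

user = ContextEntity("User", {"activity": "reading", "mood": "calm"})
room = ContextEntity("Room", {"temperature": 22, "light": "dim"})
situation = [Relationship(user, room, "located_in")]
for r in situation:                             # a one-edge CEAR fragment
    print(f"{r.source.name} --{r.label}--> {r.target.name}")
```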
Procedia PDF Downloads 344
333 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis
Authors: Gon Park
Abstract:
Seoul in South Korea established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan does not include detailed analyses of green infrastructure that incorporate land-cover information into structural classes. This study maps green infrastructure networks of Seoul to complement these green plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, were identified by incorporating cadastral data of 967,502 parcels into 135 land use maps using a geographic information system. Network analyses were used to rank the hubs and links of a green infrastructure map by applying a force-directed algorithm, weighted values, and binary relationships, using metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can serve as the framework to identify and rank hubs, links, and networks for green infrastructure planning under variable scenarios of green areas in cities.
Keywords: cadastral data, green Infrastructure, network analysis, parcel data
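A minimal sketch of ranking green-infrastructure hubs with standard network metrics; the toy graph stands in for the parcel-derived hubs and corridors, and the names and distances are invented for illustration.

```python
# Rank hubs of a small green-infrastructure graph by degree and
# betweenness centrality, using edge distances as weights.
import networkx as nx

G = nx.Graph()
# (hub, hub, distance in metres between green areas)
edges = [("park_a", "park_b", 300), ("park_b", "river_corridor", 150),
         ("river_corridor", "park_c", 500), ("park_a", "park_c", 900)]
G.add_weighted_edges_from(edges, weight="distance")

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G, weight="distance")
for hub in G:
    print(f"{hub:15s} degree={degree[hub]:.2f} betweenness={betweenness[hub]:.2f}")
```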
Procedia PDF Downloads 205
332 Developing a Machine Learning-based Cost Prediction Model for Construction Projects using Particle Swarm Optimization
Authors: Soheila Sadeghi
Abstract:
Accurate cost prediction is essential for effective project management and decision-making in the construction industry. This study aims to develop a cost prediction model for construction projects using Machine Learning techniques and Particle Swarm Optimization (PSO). The research utilizes a comprehensive dataset containing project cost estimates, actual costs, resource details, and project performance metrics from a road reconstruction project. The methodology involves data preprocessing, feature selection, and the development of an Artificial Neural Network (ANN) model optimized using PSO. The study investigates the impact of various input features, including cost estimates, resource allocation, and project progress, on the accuracy of cost predictions. The performance of the optimized ANN model is evaluated using metrics such as Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and R-squared. The results demonstrate the effectiveness of the proposed approach in predicting project costs, outperforming traditional benchmark models. The feature selection process identifies the most influential variables contributing to cost variations, providing valuable insights for project managers. However, this study has several limitations. Firstly, the model's performance may be influenced by the quality and quantity of the dataset used. A larger and more diverse dataset covering different types of construction projects would enhance the model's generalizability. Secondly, the study focuses on a specific optimization technique (PSO) and a single Machine Learning algorithm (ANN). Exploring other optimization methods and comparing the performance of various ML algorithms could provide a more comprehensive understanding of the cost prediction problem. Future research should focus on several key areas. Firstly, expanding the dataset to include a wider range of construction projects, such as residential buildings, commercial complexes, and infrastructure projects, would improve the model's applicability. Secondly, investigating the integration of additional data sources, such as economic indicators, weather data, and supplier information, could enhance the predictive power of the model. Thirdly, exploring the potential of ensemble learning techniques, which combine multiple ML algorithms, may further improve cost prediction accuracy. Additionally, developing user-friendly interfaces and tools to facilitate the adoption of the proposed cost prediction model in real-world construction projects would be a valuable contribution to the industry. The findings of this study have significant implications for construction project management, enabling proactive cost estimation, resource allocation, budget planning, and risk assessment, ultimately leading to improved project performance and cost control. This research contributes to the advancement of cost prediction techniques in the construction industry and highlights the potential of Machine Learning and PSO in addressing this critical challenge. However, further research is needed to address the limitations and explore the identified future research directions to fully realize the potential of ML-based cost prediction models in the construction domain.
Keywords: cost prediction, construction projects, machine learning, artificial neural networks, particle swarm optimization, project management, feature selection, road reconstruction
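A compact sketch of the PSO-plus-ANN idea: particles search over two MLP hyperparameters (hidden units and learning rate) to minimise cross-validated MSE. The swarm size, iteration count, search ranges, and regression data are toy assumptions, not the study's configuration.

```python
# PSO over (hidden units, log10 learning rate) for an MLP regressor.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=200, n_features=8, noise=10, random_state=0)

def fitness(p):
    units, lr = int(p[0]), 10 ** p[1]            # decode particle position
    model = MLPRegressor(hidden_layer_sizes=(units,), learning_rate_init=lr,
                         max_iter=500, random_state=0)
    return -cross_val_score(model, X, y, cv=3,
                            scoring="neg_mean_squared_error").mean()

rng = np.random.default_rng(1)
low, high = np.array([4, -4.0]), np.array([64, -1.0])
pos = rng.uniform(low, high, (10, 2))            # 10 particles
vel = np.zeros_like(pos)
pbest, pcost = pos.copy(), np.array([fitness(p) for p in pos])
for _ in range(15):                              # PSO main loop
    gbest = pbest[pcost.argmin()]
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, low, high)
    cost = np.array([fitness(p) for p in pos])
    better = cost < pcost
    pbest[better], pcost[better] = pos[better], cost[better]
print("best (units, log10 lr):", pbest[pcost.argmin()])
```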
Procedia PDF Downloads 58
331 Eco Scale: A Tool for Assessing the Greenness of Pharmaceuticals Analysis
Authors: Heba M. Mohamed
Abstract:
Owing to scientific and public concern about health and the environment, and the quest for a better quality of life, "green", "environmentally friendly" and "eco-friendly" practices have been presented and implemented in different research areas. Subsequently, researchers' attention has been drawn towards greening analytical methodologies and taking the Green Analytical Chemistry (GAC) principles into consideration. It is highly important to appraise the environmental impact of each of the implemented green approaches. Compared to the other traditional green metrics (E-factor, atom economy and the process profile), the Eco-Scale is the optimum choice to assess the environmental impact of the analytical procedures used for pharmaceutical analysis. For analytical methodologies, the Eco-Scale is calculated by allotting penalty points to any element of the analytical procedure that does not match the model green analysis, where a perfect green analysis has an Eco-Scale value of 100. In this work, the Eco-Scale was calculated and compared for several reported green analytical methods to accentuate their greening potential; the different scores reveal how green each method is compared to the ideal value. The study emphasizes that greenness measurement is not only about determining the quantity of waste but also dictates a holistic scheme considering all factors.
Keywords: eco scale, green analysis, environmentally friendly, pharmaceuticals analysis
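The Eco-Scale arithmetic itself is simple: start at 100 and subtract penalty points. A minimal sketch follows; the example penalty values are illustrative, not a published assessment.

```python
# Analytical Eco-Scale: 100 minus penalty points, with the usual
# rating bands (>75 excellent, >50 acceptable, otherwise inadequate).
def eco_scale(penalties):
    score = 100 - sum(penalties.values())
    if score > 75:
        rating = "excellent green analysis"
    elif score > 50:
        rating = "acceptable green analysis"
    else:
        rating = "inadequate green analysis"
    return score, rating

penalties = {"acetonitrile (>10 mL)": 8, "energy (HPLC)": 1,
             "occupational hazard": 3, "waste (no treatment)": 8}
print(eco_scale(penalties))   # (80, 'excellent green analysis')
```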
Procedia PDF Downloads 438
330 Efficient Deep Neural Networks for Real-Time Strawberry Freshness Monitoring: A Transfer Learning Approach
Authors: Mst. Tuhin Akter, Sharun Akter Khushbu, S. M. Shaqib
Abstract:
A real-time system architecture is highly effective for monitoring and detecting various damaged products or fruits that may deteriorate over time or become infected with diseases. Deep learning models have proven to be effective in building such architectures. However, building a deep learning model from scratch is a time-consuming and costly process. A more efficient solution is to utilize deep neural network (DNN) based transfer learning models in the real-time monitoring architecture. This study focuses on using a novel strawberry dataset to develop effective transfer learning models for the proposed real-time monitoring system architecture, specifically for evaluating and detecting strawberry freshness. Several state-of-the-art transfer learning models were employed, and the best-performing model was found to be Xception, demonstrating higher performance across evaluation metrics such as accuracy, recall, precision, and F1-score.
Keywords: strawberry freshness evaluation, deep neural network, transfer learning, image augmentation
Procedia PDF Downloads 90
329 Generalized Approach to Linear Data Transformation
Authors: Abhijith Asok
Abstract:
This paper presents a generalized approach to the simple linear data transformation, Y = bX, through an integration of multidimensional coordinate geometry, vector space theory and polygonal geometry. The scaling is performed by adding an additional 'dummy dimension' to the n-dimensional data, which helps plot two-dimensional component-wise straight lines on pairs of dimensions. The end result is a set of scaled extensions of observations in any of the 2^n spatial divisions, where n is the total number of applicable dimensions/dataset variables, created by shifting the n-dimensional plane along the 'dummy axis'. The derived scaling factor was found to depend on the coordinates of the common point of origin of the diverging straight lines and on the plane of extension, chosen on and perpendicular to the 'dummy axis', respectively. This result indicates the geometrical interpretation of a linear data transformation and, hence, opportunities for a more informed choice of the factor b, based on a better choice of these coordinate values. The paper goes on to identify the effect of this transformation on certain popular distance metrics, wherein, for many, the distance metric retained the same scaling factor as that of the features.
Keywords: data transformation, dummy dimension, linear transformation, scaling
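The closing observation, that many distance metrics inherit the feature scaling factor, is easy to verify numerically; a minimal sketch for the Euclidean case, with invented data:

```python
# Under Y = bX, the Euclidean distance between any two observations
# scales by exactly |b|, as the abstract notes for many metrics.
import numpy as np

X = np.array([[1.0, 2.0, 3.0],
              [4.0, 0.0, 1.0]])
b = 2.5
Y = b * X                                  # component-wise linear scaling

d_before = np.linalg.norm(X[0] - X[1])
d_after = np.linalg.norm(Y[0] - Y[1])
print(d_after / d_before)                  # prints 2.5, i.e. the factor b
```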
Procedia PDF Downloads 297
328 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods
Authors: Devatha Kalyan Kumar, R. Poovarasan
Abstract:
In this paper, we consider certain important factors and health parameters of diabetes patients, especially among children diabetic by birth (pediatric congenital), and use the above three metric methods to assess the importance of each attribute in the dataset, thereby determining the attribute most highly responsible for and correlated with diabetes among young patients. We use the cost optimization, control chart and Spearman methodologies for the real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology used in the software development process to identify the complexity between the various modules of the software; identifying the complexity is important because higher complexity means a higher chance of occurrence of risk in the software. With the use of the control chart, the mean, variance and standard deviation of the data are calculated. With the use of the cost optimization model, we seek to optimize the variables. Hence we choose the Spearman, control chart and cost optimization methods to assess the data efficiency in diabetes datasets.
Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric
Procedia PDF Downloads 256
327 Digital Image Steganography with Multilayer Security
Authors: Amar Partap Singh Pharwaha, Balkrishan Jindal
Abstract:
In this paper, a new method is developed for hiding an image in a digital image with multilayer security. In the proposed method, the secret image is first encrypted using a flexible-matrix-based symmetric key to add the first layer of security. Then another layer of security is added by encrypting the ciphered data using a Pythagorean-theorem-based method. The ciphered data bits (4 bits) produced after double encryption are then embedded within the digital image in the spatial domain using Least Significant Bit (LSB) substitution. To improve the image quality of the stego-image, an improved form of the pixel adjustment process is proposed. To evaluate the effectiveness of the proposed method, image quality metrics including Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), entropy, correlation, mean value and Universal Image Quality Index (UIQI) are measured. It has been found experimentally that the proposed method provides higher security as well as robustness. In fact, the results of this study are quite promising.
Keywords: Pythagorean theorem, pixel adjustment, ciphered data, image hiding, least significant bit, flexible matrix
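A sketch of the final embedding stage described above, writing already-encrypted bits into the least significant bits of a grayscale cover image and then measuring PSNR; the two encryption layers are out of scope here, and the images are random placeholders.

```python
# LSB substitution of a ciphered bit stream plus PSNR measurement.
import numpy as np

def embed_lsb(cover, bits):
    flat = cover.flatten().copy()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(cover.shape)

def psnr(original, stego):
    mse = np.mean((original.astype(float) - stego.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255 ** 2 / mse)

rng = np.random.default_rng(3)
cover = rng.integers(0, 256, (64, 64), dtype=np.uint8)
secret_bits = rng.integers(0, 2, 1024, dtype=np.uint8)    # ciphered payload
stego = embed_lsb(cover, secret_bits)
print(f"PSNR = {psnr(cover, stego):.1f} dB")   # high PSNR -> imperceptible
```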
Procedia PDF Downloads 337
326 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach
Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta
Abstract:
Across the globe, there are a lot of chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach towards this problem involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics like accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer with an accuracy rate of 96.04% in our study.
Keywords: support vector machines, decision tree, random forest
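A minimal sketch of the three-model comparison described above, on a synthetic stand-in for the heart-disease dataset; the 96.04% figure is the paper's result, not something this toy example reproduces.

```python
# Train SVM, Decision Tree, and Random Forest; compare test accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=13, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for name, clf in [("SVM", SVC()),
                  ("Decision Tree", DecisionTreeClassifier(random_state=0)),
                  ("Random Forest", RandomForestClassifier(random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {clf.score(X_te, y_te):.3f}")
```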
Procedia PDF Downloads 40
325 An Efficient Resource Management Algorithm for Mobility Management in Wireless Mesh Networks
Authors: Mallikarjuna Rao Yamarthy, Subramanyam Makam Venkata, Satya Prasad Kodati
Abstract:
The main objective of the proposed work is to reduce the overall network traffic incurred by mobility management and the packet delivery cost, and to increase resource utilization. The proposed algorithm, An Efficient Resource Management Algorithm (ERMA) for mobility management in wireless mesh networks, relies on a pointer-based mobility management scheme. Whenever a mesh client moves from one mesh router to another, a pointer is set up dynamically between the previous mesh router and the current mesh router based on distance constraints. The algorithm is evaluated for the signaling cost, data delivery cost and total communication cost performance metrics, and is demonstrated for both internet sessions and intranet sessions. The proposed algorithm yields significantly better performance in terms of signaling cost, data delivery cost, and total communication cost.
Keywords: data delivery cost, mobility management, pointer forwarding, resource management, wireless mesh networks
Procedia PDF Downloads 367
324 Generating Insights from Data Using a Hybrid Approach
Authors: Allmin Susaiyah, Aki Härmä, Milan Petković
Abstract:
Automatic generation of insights from data using insight mining systems (IMS) is useful in many applications, such as personal health tracking, patient monitoring, and business process management. Existing IMS face challenges in controlling insight extraction, scaling to large databases, and generalising to unseen domains. In this work, we propose a hybrid approach consisting of rule-based and neural components for generating insights from data while overcoming the aforementioned challenges. Firstly, a rule-based Data2CNL component is used to extract statistically significant insights from data and represent them in a controlled natural language (CNL). Secondly, a BERTSum-based CNL2NL component is used to convert these CNLs into natural language texts. We improve the model using task-specific and domain-specific fine-tuning. Our approach has been evaluated using statistical techniques and standard evaluation metrics. We overcame the aforementioned challenges and observed significant improvement with domain-specific fine-tuning.
Keywords: data mining, insight mining, natural language generation, pre-trained language models
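A toy sketch of the rule-based first stage: test a group difference for statistical significance and, if found, emit a controlled-natural-language insight. The template wording, threshold, and health-tracking data are illustrative assumptions.

```python
# Rule-based insight extraction -> controlled natural language (CNL).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
weekday_steps = rng.normal(7000, 900, 30)
weekend_steps = rng.normal(8200, 900, 30)

t, p = stats.ttest_ind(weekday_steps, weekend_steps)
if p < 0.05:                                  # statistically significant
    direction = ("more" if weekend_steps.mean() > weekday_steps.mean()
                 else "fewer")
    # CNL uses a fixed vocabulary and structure, which a neural CNL2NL
    # component can then paraphrase into fluent natural language
    print(f"INSIGHT: user walks {direction} steps on weekends "
          f"than on weekdays (p = {p:.3f}).")
```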
Procedia PDF Downloads 119
323 A Review of Routing Protocols for Mobile Ad-Hoc NETworks (MANET)
Authors: Hafiza Khaddija Saman, Muhammad Sufyan
Abstract:
The increase in availability and popularity of mobile wireless devices has led researchers to develop a wide variety of Mobile Ad-hoc Networking (MANET) protocols to exploit the unique communication opportunities presented by these devices. Devices are able to communicate directly using the wireless spectrum in a peer-to-peer fashion and to route messages through intermediate nodes; however, the nature of shared wireless communication and mobile devices results in many routing and security challenges which must be addressed before deploying a MANET. In this paper, we investigate the range of available MANET routing protocols and discuss the functionalities of several, ranging from early protocols such as DSDV to more advanced ones such as MAODV; our protocol study focuses on the work of Perkins in developing and improving MANET routing. A range of literature relating to the field of MANET routing was identified and reviewed. We also reviewed literature on the topic of securing AODV-based MANETs, as AODV may be the most popular MANET protocol. The literature review identified a number of trends within research papers, such as exclusive use of the random waypoint mobility model, exclusion of key metrics from simulation results, and failure to compare protocol performance against available alternatives.
Keywords: protocol, MANET, ad-Hoc, communication
Procedia PDF Downloads 261
322 Leveraging Quality Metrics in Voting Model Based Thread Retrieval
Authors: Atefeh Heydari, Mohammadali Tavakoli, Zuriati Ismail, Naomie Salim
Abstract:
Seeking and sharing knowledge on online forums has made them popular in recent years. Although online forums are valuable sources of information, retrieving reliable threads with high-quality content is an issue due to the variety of sources of messages. The majority of existing information retrieval systems ignore the quality of retrieved documents, particularly in the field of thread retrieval. In this research, we present an approach that employs various quality features in order to investigate the quality of retrieved threads. Different aspects of content quality, including completeness, comprehensiveness, and politeness, are assessed using these features, which leads to finding not only textually but also conceptually relevant threads for a user query within a forum. To analyse the influence of the features, we used an adapted version of the voting model for thread search as a retrieval system. We equipped it with each feature alone, and with various combinations of features in turn, during multiple runs. The results show that incorporating the quality features significantly enhances the effectiveness of the retrieval system.
Keywords: content quality, forum search, thread retrieval, voting techniques
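A minimal sketch of the voting-model idea used as the retrieval backbone: messages matching a query "vote" for their parent thread, here with a simple CombSUM aggregation and an illustrative quality multiplier; the scores and quality values are invented for illustration.

```python
# CombSUM voting of message scores into thread scores, reweighted by
# a per-thread quality score (e.g. completeness/politeness features).
from collections import defaultdict

# (thread_id, message_relevance_score) pairs from a message-level ranker
message_scores = [("t1", 0.90), ("t1", 0.40), ("t2", 0.80),
                  ("t3", 0.50), ("t3", 0.45)]
quality = {"t1": 0.9, "t2": 0.6, "t3": 0.8}

thread_scores = defaultdict(float)
for thread, score in message_scores:
    thread_scores[thread] += score            # CombSUM voting
ranked = sorted(thread_scores,
                key=lambda t: thread_scores[t] * quality[t], reverse=True)
print(ranked)                                 # ['t1', 't3', 't2']
```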
Procedia PDF Downloads 213
321 Integrated Grey Relational Analysis-Standard Deviation Method for Handover in Heterogeneous Networks
Authors: Mohanad Alhabo, Naveed Nawaz, Mahmoud Al-Faris
Abstract:
The dense deployment of small cells is a promising solution to enhance the coverage and capacity of heterogeneous networks (HetNets). However, unplanned deployment could bring new challenges to the network, ranging from interference to unnecessary handovers and handover failures. This causes a degradation in the quality of service (QoS) delivered to the end user. In this paper, we propose an integrated Grey Relational Analysis-Standard Deviation based handover method (GRA-SD) for HetNets. The proposed method integrates the Standard Deviation (SD) technique to acquire the weights of the handover metrics and the GRA method to select the best handover base station. The performance of the GRA-SD method is evaluated and compared with traditional Multiple Attribute Decision Making (MADM) methods, including the Simple Additive Weighting (SAW) and VIKOR methods. Results reveal that the proposed method outperforms the other methods in terms of minimizing the number of frequent unnecessary handovers and handover failures, in addition to improving energy efficiency.
Keywords: energy efficiency, handover, HetNets, MADM, small cells
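A minimal numeric sketch of the GRA-SD ranking: normalise the candidate matrix, derive objective weights from standard deviations, and score candidates with grey relational coefficients against the ideal. The handover metrics and values are illustrative assumptions.

```python
# GRA with SD-based objective weights for handover target selection.
import numpy as np

# rows = candidate base stations; cols = RSSI (dBm), bandwidth (Mbps),
# user-speed penalty; larger is better except for the penalty column
raw = np.array([[-70.0, 20.0, 0.3],
                [-85.0, 35.0, 0.6],
                [-60.0, 10.0, 0.8]])
benefit = np.array([True, True, False])

lo, hi = raw.min(axis=0), raw.max(axis=0)     # min-max normalisation
norm = np.where(benefit, (raw - lo) / (hi - lo), (hi - raw) / (hi - lo))

sd = norm.std(axis=0, ddof=1)                 # SD-based weights
w = sd / sd.sum()

# grey relational coefficients against the ideal candidate (all ones),
# with the conventional distinguishing coefficient rho = 0.5
delta = np.abs(1.0 - norm)
xi = (delta.min() + 0.5 * delta.max()) / (delta + 0.5 * delta.max())
scores = (xi * w).sum(axis=1)
print("best base station:", int(scores.argmax()), scores.round(3))
```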
Procedia PDF Downloads 116
320 Comparison of Techniques for Detection and Diagnosis of Eccentricity in the Air-Gap Fault in Induction Motors
Authors: Abrahão S. Fontes, Carlos A. V. Cardoso, Levi P. B. Oliveira
Abstract:
Induction motors are used worldwide in various industries. Several maintenance techniques are applied to increase the operating time and lifespan of these motors. Among these, predictive maintenance techniques such as Motor Current Signature Analysis (MCSA), Motor Square Current Signature Analysis (MSCSA), Park's Vector Approach (PVA) and Park's Vector Square Modulus (PVSM) are used to detect and diagnose faults in electric motors, characterized by patterns in the stator current frequency spectrum. In this article, these techniques are applied and compared on a real motor which has an air-gap eccentricity fault. A theoretical model of a fault-free induction motor was used to assist the comparison between the stator current frequency spectrum patterns with and without faults. Metrics were proposed and applied to evaluate the fault detection sensitivity of each technique. The results presented here show that the above techniques are suitable for the air-gap eccentricity fault, and their comparison showed the suitability of each one.
Keywords: eccentricity in the air-gap, fault diagnosis, induction motors, predictive maintenance
Procedia PDF Downloads 350
319 Framework for Detecting External Plagiarism from Monolingual Documents: Use of Shallow NLP and N-Gram Frequency Comparison
Authors: Saugata Bose, Ritambhra Korpal
Abstract:
The internet has increased copy-paste scenarios amongst students as well as researchers, leading to different levels of plagiarized documents. For this reason, much research is focused on detecting plagiarism automatically. In this paper, an initiative is discussed where Natural Language Processing (NLP) techniques and supervised machine learning algorithms have been combined to detect plagiarized texts. Here, the major emphasis is on constructing a framework which successfully detects external plagiarism from monolingual texts. To detect the plagiarism, an n-gram frequency comparison approach has been implemented to construct the model framework. The framework is based on 120 characteristics which have been extracted during pre-processing of the documents using the NLP approach. Afterwards, filter metrics have been applied to select the most relevant characteristics, and then a supervised classification learning algorithm has been used to classify the documents into four levels of plagiarism. A confusion matrix was built to estimate the false positives and false negatives. Our plagiarism framework achieved a very high accuracy score.
Keywords: lexical matching, shallow NLP, supervised machine learning algorithm, word n-gram
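A minimal sketch of the n-gram frequency-comparison core named above: build word n-gram profiles for two documents and compare them. The containment measure here is one common choice for such a feature, not the paper's full 120-feature pipeline.

```python
# Word trigram profiles and a containment score between two documents.
from collections import Counter

def ngrams(text, n=3):
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

def containment(suspicious, source, n=3):
    s, src = ngrams(suspicious, n), ngrams(source, n)
    overlap = sum((s & src).values())          # multiset intersection
    return overlap / max(sum(s.values()), 1)

src = "the quick brown fox jumps over the lazy dog near the river bank"
sus = "a quick brown fox jumps over the lazy dog by the river"
print(f"trigram containment = {containment(sus, src):.2f}")
```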
Procedia PDF Downloads 357
318 Aggregate Fluctuations and the Global Network of Input-Output Linkages
Authors: Alexander Hempfing
Abstract:
The desire to understand business cycle fluctuations, trade interdependencies and co-movement has a long tradition in economic thinking. From input-output economics to business cycle theory, researchers have aimed to find appropriate answers from an empirical as well as a theoretical perspective. This paper empirically analyses how the production structures of the global economy and of several states developed over time, what their distributional properties are, and whether there are network-specific metrics that allow the identification of structurally important nodes on a global, national and sectoral scale. For this, the World Input-Output Database was used, and different statistical methods were applied. Empirical evidence is provided that the importance of the Eastern hemisphere in the global production network increased significantly between 2000 and 2014. Moreover, it was possible to show that the sectoral eigenvector centrality indices on a global level are power-law distributed, providing evidence that specific national sectors exist which are more critical to the world economy than others while serving as hubs within the global production network. However, further findings suggest that global production cannot be characterized as a scale-free network.
Keywords: economic integration, industrial organization, input-output economics, network economics, production networks
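A toy sketch of the centrality step on a weighted input-output graph; the paper uses eigenvector centrality indices, while the sketch uses PageRank, a damped eigenvector-style variant that converges on any directed graph. The sectors and flow values are invented.

```python
# Centrality ranking of national sectors in a tiny input-output graph.
import networkx as nx

G = nx.DiGraph()
# (supplying sector, using sector, flow value)
flows = [("CHN_electronics", "USA_machinery", 5.0),
         ("DEU_autos", "USA_machinery", 2.0),
         ("USA_machinery", "DEU_autos", 1.5),
         ("CHN_electronics", "DEU_autos", 3.0),
         ("KOR_chemicals", "CHN_electronics", 2.5)]
G.add_weighted_edges_from(flows)

centrality = nx.pagerank(G, weight="weight")   # damped eigenvector variant
for sector, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{sector:18s} {c:.3f}")
# a power-law distribution of such indices would appear as a straight
# line when rank is plotted against centrality in log-log coordinates
```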
Procedia PDF Downloads 276
317 Bibliometrics of 'Community Garden' and Associated Keywords
Authors: Guilherme Reis Ranieri, Guilherme Leite Gaudereto, Michele Toledo, Luis Fernando Amato-Lourenco, Thais Mauad
Abstract:
Given the importance of community gardens to urban sustainability and the growing relevance of the term 'community garden', this paper conducts a bibliometric analysis of the term. Using SCOPUS as the database, we analyzed 105 articles that contained the keywords 'community garden' and conducted a cluster analysis of the associated keywords. As a result, we found 205 articles and 404 different keywords. Among the keywords, 334 appear only once, 44 are repeated two times and 9 appear three times. The most frequent keywords are: community food systems (74), urban activism (14), communities of practice (6), food production (6) and public rhetoric (5). The areas containing the most articles are: social sciences (74), environmental science (29) and agricultural and biological sciences (24). The three countries concentrating the most papers are the United States (54), Canada (15) and Australia (12). The main journal with these keywords is Local Environment (10). The first publication was in 1999, and publications up to 2010 accounted for 30.5% of the total; the other 69.5% occurred from 2010 to 2015, indicating an increase in frequency. We can conclude that the papers, based on the distribution of keywords, are still scattered across various research topics and present high variability between subjects.
Keywords: bibliometrics, community garden, metrics, urban agriculture
Procedia PDF Downloads 367
316 An Application of Lean Thinking at the Cargo Transport Area
Authors: Caroline Demartin, Natalia Camaras, Nelson Maestrelli, Max Filipe Gonçalves
Abstract:
This paper presents a case study of Lean Thinking in the cargo transport area. Lean Office principles are the application of Lean Thinking to the service area and are based on Lean Production concepts. Lean Production is a philosophy that was born and gained ground after the Second World War, when the Japanese Toyota Company developed a process of identifying and eliminating waste. Many researchers show that most companies that decided to adopt the principles created at Toyota did so in the manufacturing sector, and until the 1990s there were no major applications in the service sector. Due to increased competition and the need for competitive advantage, many companies then began to observe the lean transformation and take it as a reference. In this study, a key process at a cargo transport company was analyzed using Lean Office tools and methods: a current state map was developed, main wastes were identified, some metrics were used to evaluate improvements and a priority matrix was used to identify action plans. The obtained results showed that Lean Office has great potential to be successfully applied in cargo air transport companies.
Keywords: lean production, lean office, logistic, service sector
Procedia PDF Downloads 190
315 Social Impact Evaluation in the Housing Sector
Authors: Edgard Barki, Tânia Modesto Veludo-de-Oliveira, Felipe Zambaldi
Abstract:
The social enterprise sector can be characterized as organizations that aim to solve social problems with financial sustainability and using market mechanisms. This sector has attracted increasing interest worldwide. Despite its growth and relevance, there is still a gap regarding the assessment of the social impact resulting from the initiatives of organizations in this field. A number of metrics have been designed worldwide to evaluate the impact of social enterprises (e.g., IRIS, GIIRS, BACO), and some ad hoc studies have been carried out, mainly in the microcredit sector, but there is still a gap to be filled in the development of research on social impact evaluation. Therefore, this research seeks to evaluate the social impact of two social enterprises (Terra Nova and Vivenda) in the area of housing in Brazil. To evaluate these impacts and their dimensions, we conducted exploratory research through three focus groups, thirty in-depth interviews and a survey of beneficiaries of both organizations. The results allowed us to evaluate how the two organizations were able to create a deep social impact on the populations served. Terra Nova has a more collective perspective, with a clear benefit of social inclusion and improvement of the community's infrastructure, while Vivenda has a more individualized perspective, improving self-esteem, sociability and family coexistence.
Keywords: Brazil, housing, social enterprise, social impact evaluation
Procedia PDF Downloads 442
314 Application of VE in Healthcare Services: An Overview of Healthcare Facility
Authors: Safeer Ahmad, Pratheek Sudhakran, M. Arif Kamal, Tarique Anwar
Abstract:
In healthcare facility design, efficient MEP (mechanical, electrical and plumbing) services are crucial because the built environment affects not only patients and families but also healthcare staff and their outcomes. This paper covers the basics of value engineering (VE) and the different phases that can be implemented at the MEP design stage for healthcare facility optimization; VE can also reduce the unnecessary costs associated with healthcare services. The paper explores healthcare facility services and their value engineering job plan for the successful application of the VE technique: a workshop with end-users, the design team and associated experts is carried out using concepts, tools, methods and mechanisms developed to select what is most appropriate and ideal among the many value engineering processes and tools that have long proven their ability to enhance value, following the concept of total quality management, while achieving the most efficient resource allocation to satisfy the key functions and requirements of the project without sacrificing the targeted level of service across all design metrics. A detailed study is discussed, with analysis carried out through this process to achieve a better outcome; various tools are used for the analysis of the product at different phases, and finally the results obtained after implementation of the techniques are discussed.
Keywords: value engineering, healthcare facility, design, services
Procedia PDF Downloads 197
313 Analysis of the Predictive Performance of Value at Risk Estimations in Times of Financial Crisis
Authors: Alexander Marx
Abstract:
Measuring and mitigating market risk is essential for the stability of enterprises, especially for major banking corporations and investment banking firms. For these risk measurement and mitigation processes, Value at Risk (VaR) is the risk metric most commonly used by practitioners. In past years, we have seen significant weaknesses in the predictive performance of the VaR in times of financial market crisis. To address this issue, the purpose of this study is to investigate Value at Risk (VaR) estimation models and their predictive performance by applying a series of backtesting methods to the stock market indices of the G7 countries (Canada, France, Germany, Italy, Japan, UK, US) and Europe. The study employs parametric, non-parametric, and semi-parametric VaR estimation models and is conducted over three different periods which cover the most recent financial market crises: the overall period (2006-2022), the global financial crisis period (2008-2009), and the COVID-19 period (2020-2022). Since the regulatory authorities have introduced and mandated the Conditional Value at Risk (Expected Shortfall) as an additional regulatory risk management metric, the study will also analyze and compare both risk metrics with regard to their predictive performance.
Keywords: value at risk, financial market risk, banking, quantitative risk management
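A minimal sketch of two of the ingredient classes the study compares: a non-parametric (historical) one-day 99% VaR estimate and a Kupiec proportion-of-failures backtest. The return series is simulated, and the in/out-of-sample split is an illustrative assumption.

```python
# Historical-simulation VaR plus the Kupiec POF likelihood-ratio test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
returns = rng.standard_t(df=4, size=1000) * 0.01   # heavy-tailed daily returns

alpha = 0.99
var_99 = -np.quantile(returns[:500], 1 - alpha)    # estimate on first half
exceptions = int(np.sum(returns[500:] < -var_99))  # out-of-sample breaches
n = 500

# Kupiec POF test: H0 says the breach rate equals 1 - alpha
p, phat = 1 - alpha, exceptions / n
lr = -2 * (np.log((1 - p) ** (n - exceptions) * p ** exceptions)
           - np.log((1 - phat) ** (n - exceptions) * phat ** exceptions))
print(f"VaR(99%) = {var_99:.4f}, exceptions = {exceptions}/{n}")
print(f"Kupiec LR = {lr:.2f}, p-value = {1 - stats.chi2.cdf(lr, df=1):.3f}")
```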
Procedia PDF Downloads 94
312 Near Infrared Spectrometry to Determine the Quality of Milk, Experimental Design Setup and Chemometrics: Review
Authors: Meghana Shankara, Priyadarshini Natarajan
Abstract:
Infrared (IR) spectroscopy has revolutionized the way we look at the materials around us. Unraveling the patterns in the molecular spectra of materials to analyze their composition and properties has been one of the most interesting challenges in modern science. Applications of IR spectrometry are numerous in the fields of pharmaceuticals, health, food and nutrition, oils, agriculture, construction, polymers, beverages, fabrics and much more, limited only by the curiosity of researchers. Near Infrared (NIR) spectrometry is applied robustly in analyzing solid and liquid substances because of its non-destructive analysis method. In this paper, we review the application of NIR spectrometry to milk quality analysis and present the modes of measurement applied in the NIRS measurement setup, Design of Experiments (DoE), and the classification/quantification algorithms used for milk composition prediction (fat %, protein %, lactose %, Solids Not Fat (SNF) %), along with different approaches for adulterant identification. We also discuss the important NIR ranges for the chosen milk parameters. The performance metrics used in the comparison of the various chemometric approaches include Root Mean Square Error (RMSE), R^2, slope, offset, sensitivity, specificity and accuracy.
Keywords: chemometrics, design of experiment, milk quality analysis, NIRS measurement modes
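A sketch of a typical chemometric step of the kind reviewed above: partial least squares (PLS) regression from NIR spectra to fat %, evaluated with RMSE and R^2. The spectra are simulated placeholders; the wavelength count and component number are assumptions, not values from the reviewed studies.

```python
# PLS regression from simulated NIR absorbance spectra to fat %.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(9)
wavelengths = 200                                # e.g. a ~1100-2500 nm grid
loadings = rng.normal(size=wavelengths)
fat = rng.uniform(1.0, 6.0, 150)                 # fat % of 150 milk samples
spectra = np.outer(fat, loadings) + rng.normal(0, 0.5, (150, wavelengths))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, fat, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()
print(f"RMSE = {np.sqrt(mean_squared_error(y_te, pred)):.3f}")
print(f"R^2  = {r2_score(y_te, pred):.3f}")
```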
Procedia PDF Downloads 271