Search results for: performance metrics
13022 Talent Management through Integration of Talent Value Chain and Human Capital Analytics Approaches
Authors: Wuttigrai Ngamsirijit
Abstract:
Talent management in today’s modern organizations has become data-driven due to the demand for objective human resource decision making and the development of analytics technologies. HR managers face several obstacles in exploiting data and information to reach effective talent management decisions. These include process-based data and records; insufficient human capital-related measures and metrics; a lack of capability to model data strategically; and the time consumed in adding up numbers to make decisions. This paper proposes a framework for talent management through the integration of talent value chain and human capital analytics approaches. It encompasses key data, measures, and metrics regarding strategic talent management decisions along the organizational and talent value chain. Moreover, specific predictive and prescriptive models incorporating these data and information are recommended to help managers understand the state of talent, gaps in managing talent and the organization, and ways to develop optimized talent strategies.
Keywords: decision making, human capital analytics, talent management, talent value chain
Procedia PDF Downloads 187
13021 Performance Analysis of Ad-Hoc Network Routing Protocols
Authors: I. Baddari, A. Riahla, M. Mezghich
Abstract:
The literature describes many routing algorithms, some of which have been standardized. Two broad classes of routing algorithms are defined: reactive and proactive. The aim of this work is a comparative study of several routing algorithms. Two comparisons are considered: the first focuses on protocols of the same class, the second on algorithms of different classes (one reactive, the other proactive). Since these protocols are not based on analytical models, exact evaluation of some of their aspects is challenging, so simulations have to be done in order to study their performance. Our simulation is performed in NS2 (Network Simulator 2). It yielded a classification of the routing algorithms studied according to metrics such as message loss, transmission time, and mobility.
Keywords: ad-hoc network routing protocol, simulation, NS2, delay, packet loss, wideband, mobility
Procedia PDF Downloads 400
13020 Evaluation Methods for Question Decomposition Formalism
Authors: Aviv Yaniv, Ron Ben Arosh, Nadav Gasner, Michael Konviser, Arbel Yaniv
Abstract:
This paper introduces two methods for the evaluation of Question Decomposition Meaning Representation (QDMR) as predicted by a sequence-to-sequence model and the COPYNET parser for natural language question processing, motivated by the fact that previous evaluation metrics used for this task do not take into account some characteristics of the representation, such as its partial ordering structure. To this end, several heuristics to extract such partial dependencies are formulated, followed by the proposed evaluation methods, denoted Proportional Graph Matcher (PGM) and Conversion to Normal String Representation (Nor-Str), designed to better capture the accuracy of QDMR predictions. Experiments are conducted to demonstrate the efficacy of the proposed evaluation methods and show the added value of one of them, Nor-Str, for better distinguishing between high- and low-quality QDMR when predicted by models such as COPYNET. This work represents an important step forward in the development of better evaluation methods for QDMR predictions, which will be critical for improving the accuracy and reliability of natural language question-answering systems.
Keywords: NLP, question answering, question decomposition meaning representation, QDMR evaluation metrics
Procedia PDF Downloads 78
13019 On Multiobjective Optimization to Improve the Scalability of Fog Application Deployments Using Fogtorch
Authors: Suleiman Aliyu
Abstract:
Integrating IoT applications with Fog systems presents challenges in optimization due to diverse environments and conflicting objectives. This study explores achieving Pareto optimal deployments for Fog-based IoT systems to address growing QoS demands. We introduce Pareto optimality to balance competing performance metrics. Using the FogTorch optimization framework, we propose a hybrid approach (backtracking search with branch and bound) for scalable IoT deployments. Our research highlights the advantages of Pareto optimality over single-objective methods and emphasizes the role of FogTorch in this context. Initial results show improvements in IoT deployment cost in Fog systems, promoting resource-efficient strategies.
Keywords: pareto optimality, fog application deployment, resource allocation, internet of things
Procedia PDF Downloads 88
13018 Application of a Lighting Design Method Using Mean Room Surface Exitance
Authors: Antonello Durante, James Duff, Kevin Kelly
Abstract:
The visual needs of people in modern work-based buildings are changing. The self-illuminated screens of computers, televisions, tablets and smartphones have changed the relationship between people and the lit environment. In the past, lighting design practice was primarily based on providing uniform horizontal illuminance on the working plane, but this has failed to ensure good-quality lit environments. Today's lighting standards continue to be set on a 100-year-old approach that, at its core, considers the task illuminance of utmost importance, with this task typically located on a horizontal plane. An alternative method focused on appearance has been proposed, as opposed to the traditional performance-based approach. Mean Room Surface Exitance (MRSE) and Target-Ambient Illuminance Ratio (TAIR) are two new metrics proposed to assess illumination adequacy in interiors. The hypothesis is that these metrics will be superior to the existing, horizontal-illuminance-led ones. For the past six years, research within the Dublin Institute of Technology has examined this, with a view to determining the suitability of this approach for application to general lighting practice. Since the start of this research, a number of key findings have been produced that center on how occupants react to various levels of MRSE. This paper provides a broad update on how this research has progressed. More specifically, this paper will: i) demonstrate how MRSE can be measured using HDR imaging technology, ii) illustrate how MRSE can be calculated using scripting and an open-source lighting computation engine, iii) describe experimental results that demonstrate how occupants have reacted to various levels of MRSE within experimental office environments.
Keywords: illumination hierarchy (IH), mean room surface exitance (MRSE), perceived adequacy of illumination (PAI), target-ambient illumination ratio (TAIR)
Procedia PDF Downloads 187
13017 Time and Cost Prediction Models for Language Classification Over a Large Corpus on Spark
Authors: Jairson Barbosa Rodrigues, Paulo Romero Martins Maciel, Germano Crispim Vasconcelos
Abstract:
This paper presents an investigation of the performance impacts of varying five factors (input data size, node number, cores, memory, and disks) when applying a distributed implementation of Naïve Bayes for text classification of a large corpus on the Spark big data processing framework. Problem: The algorithm's performance depends on multiple factors, and knowing the effects of each factor beforehand becomes especially critical as hardware is priced by time slice in cloud environments. Objectives: To explain the functional relationship between factors and performance, and to develop linear predictor models for time and cost. Methods: We applied the solid statistical principles of Design of Experiments (DoE), particularly the randomized two-level fractional factorial design with replications. This research involved 48 real clusters with different hardware arrangements. The metrics were analyzed using linear models for screening, ranking, and measurement of each factor's impact. Results: Our findings include prediction models and show some non-intuitive results: the small influence of cores, the neutrality of memory and disks with respect to total execution time, and the non-significant impact of input data scale on costs, even though it notably impacts execution time.
Keywords: big data, design of experiments, distributed machine learning, natural language processing, spark
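As a rough illustration of the screening step this abstract describes, the sketch below fits a linear model to execution time over two-level coded factors with statsmodels. The run matrix (a small resolution-III fraction) and the timings are invented stand-ins, not the paper's data.

```python
# Minimal sketch of a fractional-factorial screening model: each factor is
# coded -1 (low) / +1 (high); the fitted coefficients rank factor impacts.
import pandas as pd
import statsmodels.formula.api as smf

runs = pd.DataFrame({
    "data_size": [-1, 1, -1, 1, -1, 1, -1, 1],
    "nodes":     [-1, -1, 1, 1, -1, -1, 1, 1],
    "cores":     [-1, -1, -1, -1, 1, 1, 1, 1],
    "memory":    [1, -1, -1, 1, 1, -1, -1, 1],    # generator: data_size*nodes
    "disks":     [1, -1, 1, -1, -1, 1, -1, 1],    # generator: data_size*cores
    "time":      [442, 738, 283, 577, 421, 723, 258, 563],  # execution time (s)
})

model = smf.ols("time ~ data_size + nodes + cores + memory + disks", runs).fit()
print(model.summary())  # large |coef| = influential factor; near-zero = neutral
```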
Procedia PDF Downloads 120
13016 Developing a Machine Learning-based Cost Prediction Model for Construction Projects using Particle Swarm Optimization
Authors: Soheila Sadeghi
Abstract:
Accurate cost prediction is essential for effective project management and decision-making in the construction industry. This study aims to develop a cost prediction model for construction projects using Machine Learning techniques and Particle Swarm Optimization (PSO). The research utilizes a comprehensive dataset containing project cost estimates, actual costs, resource details, and project performance metrics from a road reconstruction project. The methodology involves data preprocessing, feature selection, and the development of an Artificial Neural Network (ANN) model optimized using PSO. The study investigates the impact of various input features, including cost estimates, resource allocation, and project progress, on the accuracy of cost predictions. The performance of the optimized ANN model is evaluated using metrics such as Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and R-squared. The results demonstrate the effectiveness of the proposed approach in predicting project costs, outperforming traditional benchmark models. The feature selection process identifies the most influential variables contributing to cost variations, providing valuable insights for project managers. However, this study has several limitations. Firstly, the model's performance may be influenced by the quality and quantity of the dataset used. A larger and more diverse dataset covering different types of construction projects would enhance the model's generalizability. Secondly, the study focuses on a specific optimization technique (PSO) and a single Machine Learning algorithm (ANN). Exploring other optimization methods and comparing the performance of various ML algorithms could provide a more comprehensive understanding of the cost prediction problem. Future research should focus on several key areas. Firstly, expanding the dataset to include a wider range of construction projects, such as residential buildings, commercial complexes, and infrastructure projects, would improve the model's applicability. Secondly, investigating the integration of additional data sources, such as economic indicators, weather data, and supplier information, could enhance the predictive power of the model. Thirdly, exploring the potential of ensemble learning techniques, which combine multiple ML algorithms, may further improve cost prediction accuracy. Additionally, developing user-friendly interfaces and tools to facilitate the adoption of the proposed cost prediction model in real-world construction projects would be a valuable contribution to the industry. The findings of this study have significant implications for construction project management, enabling proactive cost estimation, resource allocation, budget planning, and risk assessment, ultimately leading to improved project performance and cost control. This research contributes to the advancement of cost prediction techniques in the construction industry and highlights the potential of Machine Learning and PSO in addressing this critical challenge. However, further research is needed to address the limitations and explore the identified future research directions to fully realize the potential of ML-based cost prediction models in the construction domain.
Keywords: cost prediction, construction projects, machine learning, artificial neural networks, particle swarm optimization, project management, feature selection, road reconstruction
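The abstract does not publish its PSO-ANN configuration, so the following is only a minimal sketch of the general idea: a hand-rolled PSO searches over two MLP hyperparameters (hidden-layer size and L2 penalty) to minimize validation MSE. The data, bounds, and PSO coefficients are all assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.random((300, 5))                        # stand-ins for cost/resource features
y = X @ np.array([3.0, -2.0, 1.5, 0.5, 2.0]) + rng.normal(0, 0.1, 300)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

def fitness(p):
    """Validation MSE of an ANN built from a particle's position."""
    hidden, alpha = int(round(p[0])), 10.0 ** p[1]
    net = MLPRegressor((hidden,), alpha=alpha, max_iter=500, random_state=0)
    net.fit(X_tr, y_tr)
    return mean_squared_error(y_val, net.predict(X_val))

lb, ub = np.array([4.0, -5.0]), np.array([64.0, -1.0])  # bounds: units, log10(alpha)
n = 6                                                    # swarm size
pos = lb + rng.random((n, 2)) * (ub - lb)
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(10):                                      # PSO main loop
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lb, ub)
    f = np.array([fitness(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

print("best (hidden units, log10 alpha):", gbest, "| val MSE:", pbest_f.min())
```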
Procedia PDF Downloads 59
13015 An Approach to Physical Performance Analysis for Judo
Authors: Stefano Frassinelli, Alessandro Niccolai, Riccardo E. Zich
Abstract:
Sport performance analysis is a technique that is becoming more important every year for athletes of every level. Many techniques have been developed to measure and analyse efficiently the performance of athletes in some sports, but in combat sports these techniques often reach their limits, due to the high interaction between the two opponents during the competition. In this paper the problem will be framed. Moreover, the physical performance measurement problem will be analysed and three different techniques to manage it will be presented. All the techniques have been used to analyse the performance of 22 high-level judo athletes.
Keywords: sport performance, physical performance, judo, performance coefficients
Procedia PDF Downloads 414
13014 A Support Vector Machine Learning Prediction Model of Evapotranspiration Using Real-Time Sensor Node Data
Authors: Waqas Ahmed Khan Afridi, Subhas Chandra Mukhopadhyay, Bandita Mainali
Abstract:
The research paper presents a unique approach to evapotranspiration (ET) prediction using a Support Vector Machine (SVM) learning algorithm. The study leverages real-time sensor node data to develop an accurate and adaptable prediction model, addressing the inherent challenges of traditional ET estimation methods. The integration of the SVM algorithm with real-time sensor node data offers great potential to improve the spatial and temporal resolution of ET predictions. In the model development, key input features are measured and computed using mathematical equations such as Penman-Monteith (FAO56) and the soil water balance (SWB), which include soil-environmental parameters such as solar radiation (Rs), air temperature (T), atmospheric pressure (P), relative humidity (RH), wind speed (u2), rain (R), deep percolation (DP), soil temperature (ST), and change in soil moisture (∆SM). The one-year field data are split into three proportions, i.e., train, test, and validation sets. Kernel functions with tuned hyperparameters have been used to train and improve the accuracy of the prediction model over multiple iterations. This paper also outlines the existing methods and machine learning techniques to determine evapotranspiration, data collection and preprocessing, model construction, and evaluation metrics, highlighting the significance of SVM in advancing the field of ET prediction. The results demonstrate the robustness and high predictability of the developed model on the basis of performance evaluation metrics (R2, RMSE, MAE). The effectiveness of the proposed model in capturing complex relationships within soil and environmental parameters provides insights into its potential applications for water resource management and hydrological ecosystems.
Keywords: evapotranspiration, FAO56, KNIME, machine learning, RStudio, SVM, sensors
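A minimal sketch of this kind of pipeline, assuming scikit-learn in place of the KNIME/RStudio tooling the authors list: an RBF-kernel SVR tuned by grid search and scored with the same R2/RMSE/MAE metrics. The feature columns follow the abstract; the data is synthetic.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

rng = np.random.default_rng(1)
# columns stand in for: Rs, T, P, RH, u2, R, DP, ST, dSM (per the abstract)
X = rng.random((365, 9))
y = 2.5 * X[:, 0] + 0.8 * X[:, 1] - 0.5 * X[:, 3] + rng.normal(0, 0.05, 365)  # toy ET

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
grid = GridSearchCV(
    make_pipeline(StandardScaler(), SVR(kernel="rbf")),
    {"svr__C": [1, 10, 100], "svr__gamma": ["scale", 0.1], "svr__epsilon": [0.01, 0.1]},
    cv=5,   # tune kernel hyperparameters over multiple folds
)
grid.fit(X_tr, y_tr)
pred = grid.predict(X_te)
print("R2  :", r2_score(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
print("MAE :", mean_absolute_error(y_te, pred))
```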
Procedia PDF Downloads 69
13013 From Comfort to Safety: Assessing the Influence of Car Seat Design on Driver Reaction and Performance
Authors: Sabariah Mohd Yusoff, Qamaruddin Adzeem Muhamad Murad
Abstract:
This study investigates the impact of car seat design on driver response time, addressing a critical gap in understanding how ergonomic features influence both performance and safety. Controlled driving experiments were conducted with fourteen participants (11 male, 3 female) across three locations chosen for their varying traffic conditions to account for differences in driver alertness. Participants interacted with various seat designs while performing driving tasks, and objective metrics such as braking and steering response times were meticulously recorded. Advanced statistical methods, including regression analysis and t-tests, were employed to identify design factors that significantly affect driver response times. Subjective feedback was gathered through detailed questionnaires, focused on driving experience and knowledge of response time, and through in-depth interviews. This qualitative data was analyzed thematically to provide insights into driver comfort and usability preferences. The study aims to identify key seat design features that impact driver response time and to gain a deeper understanding of driver preferences for comfort and usability. The findings are expected to inform evidence-based guidelines for optimizing car seat design, ultimately enhancing driver performance and safety. The research offers valuable implications for automotive manufacturers and designers, contributing to the development of seats that improve driver response time and overall driving safety.
Keywords: car seat design, driver response time, cognitive driving, ergonomics optimization
Procedia PDF Downloads 24
13012 Evaluating the Location of Effective Product Advertising on Facebook Ads
Authors: Aulia F. Hadining, Atya Nur Aisha, Dimas Kurninatoro Aji
Abstract:
The utilization of social media as a marketing tool is growing rapidly, including among SMEs. Social media allows users to give product evaluations and recommendations to the public. In addition, social media facilitates word-of-mouth marketing communication. One of the social media platforms that can be used is Facebook, through Facebook Ads. This study aimed to evaluate the placement of Facebook Ads in order to obtain an appropriate advertising design. Three alternative locations were considered: desktop, right-hand column, and mobile. The effectiveness and efficiency of advertising were measured by advertising metrics such as reach, clicks, Cost per Click (CUC) and Unique Click-Through Rate (UCTR). Facebook's Ads Manager was used for seven days, with the campaign targeted by age (18-24), location (Bandung), language (Indonesian) and keywords. The result was 13,999 total reach and 342 clicks. A comparison using ANOVA showed a significant difference between placement locations on these advertising metrics. The mobile location was chosen as the most successful, because it produced the lowest CUC, at Rp 691 per click, and a 14% UCTR. The results of this study showed that Facebook Ads is a useful and cost-effective medium for promoting SME products, because it can be viewed by many people at the same time.
Keywords: marketing communication, social media, Facebook Ads, mobile location
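For concreteness, the two efficiency metrics can be computed as below, assuming CUC = spend / clicks and UCTR = unique clicks / reach; the figures are illustrative, not the study's raw data.

```python
# Worked example of the ad-efficiency metrics (assumed definitions).
def cost_per_click(spend, clicks):
    return spend / clicks

def unique_ctr(unique_clicks, reach):
    return unique_clicks / reach

spend_idr, clicks, unique_clicks, reach = 100_000, 145, 140, 1_000
print(f"CUC : Rp {cost_per_click(spend_idr, clicks):,.0f} per click")
print(f"UCTR: {unique_ctr(unique_clicks, reach):.1%}")
```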
Procedia PDF Downloads 354
13011 Evaluation of IMERG Performance at Estimating the Rainfall Properties through Convective and Stratiform Rain Events in a Semi-Arid Region of Mexico
Authors: Eric Muñoz de la Torre, Julián González Trinidad, Efrén González Ramírez
Abstract:
Rain varies greatly in its duration, intensity, and spatial coverage, so it is important to have sub-daily rainfall data for various applications, including risk prevention. However, ground measurements are limited by the low and irregular density of rain gauges. An alternative to this problem are Satellite Precipitation Products (SPPs), such as IMERG, that use passive microwave and infrared sensors to estimate rainfall; however, these SPPs have to be validated before their application. The aim of this study is to evaluate the performance of the IMERG (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement) final run V06B SPP in a semi-arid region of Mexico, using sub-daily data from 4 automatic rain gauges (pluviographs) for October 2019 and June to September 2021. The Minimum Inter-event Time (MIT) criterion, with a dry period of 10 hrs, was used to separate unique rain events for the purpose of evaluating the rainfall properties (depth, duration and intensity). Point-to-pixel analysis and continuous, categorical, and volumetric statistical metrics were used. Results show that IMERG is capable of estimating rainfall depth with a slight overestimation but is unable to identify the real duration and intensity of rain events, showing large overestimations and underestimations, respectively. The study zone presented 80 to 85% convective rain events, the rest being stratiform rain events, classified by the depth magnitude variation of IMERG pixels and pluviographs. IMERG showed poorer performance at detecting the former but performed well at estimating stratiform rain events, which originate from cold fronts.
Keywords: IMERG, rainfall, rain gauge, remote sensing, statistical evaluation
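A minimal sketch of the MIT event-separation step, assuming gauge records arrive as timestamps and an event boundary is any dry gap of at least 10 hours; the timestamps below are invented.

```python
from datetime import datetime, timedelta

MIT = timedelta(hours=10)

def split_events(timestamps, mit=MIT):
    """Group sorted rainfall timestamps into events separated by >= mit dry gaps."""
    events, current = [], [timestamps[0]]
    for t in timestamps[1:]:
        if t - current[-1] >= mit:   # dry period long enough -> new event
            events.append(current)
            current = []
        current.append(t)
    events.append(current)
    return events

ts = [datetime(2021, 7, 1, 2), datetime(2021, 7, 1, 3), datetime(2021, 7, 1, 4),
      datetime(2021, 7, 2, 1), datetime(2021, 7, 2, 2)]   # 21 h dry gap in between
for i, ev in enumerate(split_events(ts), 1):
    hours = (ev[-1] - ev[0]).total_seconds() / 3600
    print(f"event {i}: {len(ev)} records, duration {hours:.1f} h")
```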
Procedia PDF Downloads 70
13010 Load Balancing Technique for Energy-Efficiency in Cloud Computing
Authors: Rani Danavath, V. B. Narsimha
Abstract:
Cloud computing is emerging as a new paradigm of large-scale distributed computing. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing: it is required to distribute the dynamic workload across multiple nodes to ensure that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This motivates the need for new metrics, energy consumption and carbon emission, in energy-efficient load balancing techniques for cloud computing. Existing load balancing techniques mainly focus on reducing overhead, service response time and improving performance, etc., but none of them have considered energy consumption and carbon emission. In this paper we introduce a technique oriented towards energy efficiency. This energy-efficient load balancing technique can be used to improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission
Procedia PDF Downloads 449
13009 Analytical Performance of Cobas C 8000 Analyzer Based on Sigma Metrics
Authors: Sairi Satari
Abstract:
Introduction: Six-sigma is a metric that quantifies the performance of processes as a rate of Defects-Per-Million Opportunities. The sigma methodology can be applied in the chemical pathology laboratory to evaluate process performance, with evidence for process improvement in the quality assurance program. In the laboratory, these methods have been used to improve the timeliness of troubleshooting, reduce the cost and frequency of quality control, and minimize pre- and post-analytical errors. Aim: The aim of this study is to evaluate the sigma values of the Cobas 8000 analyzer based on the minimum requirement of the specification. Methodology: Twenty-one analytes were chosen for this study: alanine aminotransferase (ALT), albumin, alkaline phosphatase (ALP), amylase, aspartate transaminase (AST), total bilirubin, calcium, chloride, cholesterol, HDL-cholesterol, creatinine, creatine kinase, glucose, lactate dehydrogenase (LDH), magnesium, potassium, protein, sodium, triglyceride, uric acid and urea. The total allowable error was obtained from the Clinical Laboratory Improvement Amendments (CLIA). Bias was calculated from the end-of-cycle reports of the Royal College of Pathologists of Australasia (RCPA) cycle from July to December 2016, and the coefficient of variation (CV) from six months of internal quality control (IQC) data. Sigma was calculated using the formula: Sigma = (Total Error - Bias) / CV. Analytical performance was rated on the sigma value: sigma > 6 is world class, sigma > 5 is excellent, sigma > 4 is good, sigma 3-4 is satisfactory, and sigma < 3 is poor performance. Results: Based on the calculation, we found that 76% are world class (ALT, albumin, ALP, amylase, AST, total bilirubin, cholesterol, HDL-cholesterol, creatinine, creatine kinase, glucose, LDH, magnesium, potassium, triglyceride and uric acid), 14% are excellent (calcium, protein and urea), and 10% (chloride and sodium) require more frequent IQC performed per day. Conclusion: Based on this study, we found that IQC should be performed more frequently only for chloride and sodium to ensure accurate and reliable analysis for patient management.
Keywords: sigma metrics, analytical performance, total error, bias
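The sigma computation is simple enough to reproduce directly from the stated formula; the sketch below applies it with the paper's performance bands, using illustrative TEa/bias/CV values rather than the study's data (bias is taken as an absolute value, a common convention).

```python
# Sigma = (Total Error - Bias) / CV, per the abstract's formula.
def sigma_metric(total_allowable_error, bias, cv):
    return (total_allowable_error - abs(bias)) / cv

def performance(sigma):
    if sigma >= 6: return "world class"
    if sigma >= 5: return "excellent"
    if sigma >= 4: return "good"
    if sigma >= 3: return "satisfactory"
    return "poor"

# analyte: (allowable total error %, bias %, CV %) -- illustrative values
analytes = {"glucose": (10.0, 1.2, 1.4), "sodium": (4.0, 0.9, 1.1)}
for name, (tea, bias, cv) in analytes.items():
    s = sigma_metric(tea, bias, cv)
    print(f"{name}: sigma = {s:.1f} ({performance(s)})")
```

With these example values, glucose scores above 6 (world class) while sodium falls below 3, mirroring the abstract's conclusion that low-sigma analytes need more frequent IQC.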
Procedia PDF Downloads 171
13008 Performance Evaluation of Clustered Routing Protocols for Heterogeneous Wireless Sensor Networks
Authors: Awatef Chniguir, Tarek Farah, Zouhair Ben Jemaa, Safya Belguith
Abstract:
Optimal routing minimizes energy consumption in wireless sensor networks (WSN). Clustering has proven its effectiveness in organizing WSNs by reducing channel contention and packet collisions and enhancing network throughput under heavy load. Nowadays, with the emergence of the Internet of Things, heterogeneity is essential. The Stable Election Protocol (SEP), which increased the network stability period and lifetime, was the first clustering protocol for heterogeneous WSNs. SEP and its descendants, namely Threshold Sensitive SEP (TSEP), Enhanced TSEP (ETSSEP) and Current Energy Allotted TSEP (CEATSEP), were studied. These algorithms' performance was evaluated based on different metrics, especially first node death (FND), to compare their stability. Simulations were conducted in MATLAB considering two scenarios: the first varies the fraction of advanced nodes while fixing the total number of nodes; the second varies the number of nodes while keeping the number of advanced nodes constant. CEATSEP outperforms its antecedents by increasing stability while keeping a low throughput, and it also operates very well in a large-scale network. Consequently, CEATSEP offers a longer useful lifespan and better energy efficiency than the other routing protocols for heterogeneous WSNs.
Keywords: clustering, heterogeneous, stability, scalability, IoT, WSN
Procedia PDF Downloads 131
13007 Call-Back Laterality and Bilaterality: Possible Screening Mammography Quality Metrics
Authors: Samson Munn, Virginia H. Kim, Huija Chen, Sean Maldonado, Michelle Kim, Paul Koscheski, Babak N. Kalantari, Gregory Eckel, Albert Lee
Abstract:
In terms of screening mammography quality, it is not known what portion of reports advising call-back imaging should be bilateral versus unilateral, nor how far unilateral call-backs may appropriately diverge from 50-50 (left versus right). Many factors may affect detection laterality: display arrangement, reflections preferentially striking one display location, hanging protocols, seating positions with respect to others and displays, visual field cuts, health, etc. The call-back bilateral fraction may reflect radiologist experience (not in our data) or confidence level. Thus, the laterality and bilaterality of call-backs advised in screening mammography reports could be worthy quality metrics. Here, laterality data did not reveal a concern until drilling down to individuals. Bilateral screening mammogram report recommendations by five breast imaging attending radiologists at Harbor-UCLA Medical Center (Torrance, California) from 9/1/15 to 8/31/16 and from 9/1/16 to 8/31/17 were retrospectively reviewed. Recommended call-backs for bilateral versus unilateral, and for left versus right, findings were counted. The chi-square (χ²) statistic was applied. Year 1: of 2,665 bilateral screening mammograms, reports of 556 (20.9%) recommended call-back, of which 99 (17.8% of the 556) were for bilateral findings. Of the 457 unilateral recommendations, 222 (48.6%) regarded the left breast. Year 2: of 2,106 bilateral screening mammograms, reports of 439 (20.8%) recommended call-back, of which 65 (14.8% of the 439) were for bilateral findings. Of the 374 unilateral recommendations, 182 (48.7%) regarded the left breast. Individual ranges of call-backs that were bilateral were 13.2-23.3%, 10.2-22.5%, and 13.6-17.9%, for years 1, 2, and 1+2, respectively; these ranges were unrelated to experience level; the two-year mean was 15.8% (SD = 1.9%). The lowest χ² p value of the group's sidedness disparities for years 1, 2, and 1+2 was > 0.4. For four individual radiologists, the lowest p value was 0.42. The fifth radiologist, however, disfavored the left breast, with p values of 0.21, 0.19, and 0.07, respectively; that radiologist had the greatest number of years of experience. There was a concerning 93% likelihood that the bias against left breast findings evidenced by one of our radiologists was not random. Notably, very soon after the period under review, he retired, presented with leukemia, and died. We call for research to be done, particularly by large departments with many radiologists, on two possible new quality metrics in screening mammography: laterality and bilaterality. (Images, patient outcomes, report validity, and radiologist psychological confidence levels were not assessed. No intervention nor subsequent data collection was conducted. This uncomplicated collection of data and simple appraisal were not designed, nor was there any intention, to develop or contribute to generalizable knowledge (per U.S. DHHS 45 CFR, part 46).)
Keywords: mammography, screening mammography, quality, quality metrics, laterality
Procedia PDF Downloads 162
13006 Analyzing the Programme for International Student Assessment (PISA) Results in Uzbekistan: Insights from Organisation for Economic Co-operation and Development (OECD) Assessments
Authors: Nukarova Marjona Kayimovna
Abstract:
This article examines Uzbekistan’s participation in the Programme for International Student Assessment (PISA) 2022, as the country took part in the assessment for the first time. The analysis delves into the initial results and performance metrics reported by the Organisation for Economic Co-operation and Development (OECD). By exploring Uzbekistan’s data, the article highlights key findings, trends, and areas of strength and improvement. The aim is to provide a comprehensive understanding of how Uzbekistan’s education system compares on the international stage and to offer insights into potential implications for future educational policies and reforms.
Keywords: PISA, OECD, data analysis of Uzbekistan, results, critical thinking
Procedia PDF Downloads 11
13005 Recommender System Based on Mining Graph Databases for Data-Intensive Applications
Authors: Mostafa Gamal, Hoda K. Mohamed, Islam El-Maddah, Ali Hamdi
Abstract:
In recent years, many digital documents have been created on the web due to the rapid growth of communities around 'social applications', or 'data-intensive applications'. The evolution of online multimedia data poses new challenges in storing and querying large amounts of data for online recommender systems. Graph data models have been shown to be more efficient than relational data models for processing complex data. This paper explains the key differences between graph and relational databases, their strengths and weaknesses, and why graph databases are the best technology for building a real-time recommendation system. The paper also discusses several similarity metric algorithms that can be used to compute a similarity score for pairs of nodes based on their neighbourhoods or their properties. Finally, the paper explores how NLP strategies provide the basis for improving the accuracy and coverage of real-time recommendations by extracting information from stored unstructured knowledge, which makes up the bulk of the world's data, to enrich the graph database. As the size and number of data items are increasing rapidly, the proposed system should meet current and future needs.
Keywords: graph databases, NLP, recommendation systems, similarity metrics
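As one concrete example of a neighbourhood-based similarity metric of the kind the paper discusses, the sketch below computes the Jaccard score of two nodes' neighbour sets; the adjacency data is a hypothetical stand-in for a graph-database query result.

```python
# Jaccard similarity of two nodes based on shared neighbours:
# |N(a) & N(b)| / |N(a) | N(b)|
def jaccard(graph, a, b):
    na, nb = graph[a], graph[b]
    union = na | nb
    return len(na & nb) / len(union) if union else 0.0

graph = {                       # node -> set of neighbour nodes
    "user1": {"item1", "item2", "item3"},
    "user2": {"item2", "item3", "item4"},
    "user3": {"item5"},
}
print(jaccard(graph, "user1", "user2"))  # 0.5 -> similar tastes, recommend across
print(jaccard(graph, "user1", "user3"))  # 0.0 -> no overlap
```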
Procedia PDF Downloads 104
13004 The Mediatory Role of Innovation in the Link between Social and Financial Performance
Authors: Bita Mashayekhi, Amin Jahangard, Milad Samavat, Saeid Homayoun
Abstract:
In the modern competitive business environment, one cannot overstate the importance of corporate social responsibility. The controversial link between the social and financial performance of firms has become a topic of interest for scholars. Hence, this study examines the social and financial performance link by taking into account the mediating role of innovation performance. We applied the Covariance-based Structural Equation Modeling (CB-SEM) method to an international sample of firms provided by the ASSET4 database. In this research, to explore the black box of the social and financial performance relationship, we first examined the effect of social performance separately on financial performance and innovation; then, we measured the mediating role of innovation in the social and financial performance link. While our results indicate a positive effect of social performance on financial performance and innovation, we cannot document a positive mediating role of innovation. This possibly relates to the long-term nature of the benefits from investments in innovation.
Keywords: ESG, financial performance, innovation, social performance, structural equation modeling
Procedia PDF Downloads 102
13003 Proactive Pure Handoff Model with SAW-TOPSIS Selection and Time Series Predict
Authors: Harold Vásquez, Cesar Hernández, Ingrid Páez
Abstract:
This paper addresses cognitive radio techniques and applies a pure proactive handoff model to decrease interference between the primary user (PU) and secondary user (SU), comparing it with a reactive handoff model. The study analyzes two multivariate decision models, SAW and TOPSIS, combined with three dynamic prediction techniques: AR, MA, and ARMA. Four metrics are taken to evaluate the best model: the number of failed handoffs, the number of handoffs, the number of predictions, and the number of interference events. The results show the advantages of using this type of pure proactive model to predict changes in the PU according to the selected channel and reduce interference. The model that showed the best performance was TOPSIS-MA; although TOPSIS-AR had a higher predictive ability, this was not reflected in interference reduction.
Keywords: cognitive radio, spectrum handoff, decision making, time series, wireless networks
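A minimal numpy sketch of the TOPSIS step, assuming vector normalization and a closeness-coefficient ranking; the decision matrix, criteria, and weights are illustrative, not the paper's.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] True if higher is better."""
    norm = matrix / np.sqrt((matrix ** 2).sum(axis=0))   # vector normalization
    v = norm * weights                                   # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))      # distance to ideal
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))       # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                       # closeness coefficient

# hypothetical channels scored on: predicted idle time (benefit), interference (cost)
channels = np.array([[0.8, 0.3],
                     [0.6, 0.1],
                     [0.9, 0.6]])
scores = topsis(channels, weights=np.array([0.5, 0.5]),
                benefit=np.array([True, False]))
print("scores:", scores, "-> best channel:", scores.argmax())
```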
Procedia PDF Downloads 488
13002 An Enhanced Distributed Weighted Clustering Algorithm for Intra and Inter Cluster Routing in MANET
Authors: K. Gomathi
Abstract:
A Mobile Ad hoc Network (MANET) is defined as a collection of routable wireless mobile nodes, with no centralized administration, that communicate with each other using radio signals. MANETs deployed in hostile environments are especially vulnerable, as hackers will try to disturb the secure data transfer and drain valuable network resources. Since a MANET is a battery-operated network, preserving network resources is essential. For resource-constrained computation, efficient routing, and increased network stability, the network is divided into smaller groups called clusters. The clustering architecture consists of the Cluster Head (CH), ordinary nodes, and gateways. The CH is responsible for inter- and intra-cluster routing. CH election is a prominent research area, and many algorithms have been developed using many different metrics. A CH with a longer life sustains the network lifetime; for this purpose, a Secondary Cluster Head (SCH) is also elected, which is more economical. To nominate an efficient CH, an Enhanced Distributed Weighted Clustering Algorithm (EDWCA) has been proposed. This approach considers metrics like the battery power, degree difference, and speed of the node for CH election. The proficiency of the proposed algorithm is evaluated and compared with an existing algorithm using Network Simulator (NS-2).
Keywords: MANET, EDWCA, clustering, cluster head
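A minimal sketch of a weighted CH election of the kind EDWCA performs, using the three metrics named in the abstract; the weight coefficients, ideal degree, and node values are assumptions, not the paper's parameters.

```python
# Combined weight per node: reward battery power, penalize deviation from an
# ideal node degree and penalize mobility (speed). Highest weight becomes CH;
# the runner-up becomes the Secondary Cluster Head (SCH).
def combined_weight(node, w1=0.5, w2=0.3, w3=0.2, ideal_degree=4):
    degree_diff = abs(node["degree"] - ideal_degree)
    return w1 * node["battery"] - w2 * degree_diff - w3 * node["speed"]

nodes = {
    "n1": {"battery": 0.9, "degree": 4, "speed": 1.0},
    "n2": {"battery": 0.7, "degree": 2, "speed": 0.2},
    "n3": {"battery": 0.5, "degree": 5, "speed": 3.0},
}
ranked = sorted(nodes, key=lambda n: combined_weight(nodes[n]), reverse=True)
ch, sch = ranked[0], ranked[1]
print("cluster head:", ch, "| secondary cluster head:", sch)
```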
Procedia PDF Downloads 398
13001 Sustainable Rehabilitation of Concrete Buildings in Iran: Harnessing Sunlight and Navigating Limited Water Resources
Authors: Amin Khamoosh, Hamed Faramarzifar
Abstract:
In Tehran, the capital of Iran, numerous buildings constructed when extreme climates were not prevalent now face the need for rehabilitation, typically within their first decade. Our data delve into the performance metrics and economic advantages of sustainable rehabilitation practices compared to traditional methods. With a focus on the scarcity of water resources, we specifically scrutinize water-efficient techniques throughout construction, rehabilitation, and usage. Examining design elements that optimize natural light while efficiently managing heat transmission is crucial, given the reliance on water for cooling devices in this region. The data aim to present a comprehensive strategy, addressing immediate structural concerns while harmonizing with Iran's unique environmental conditions.
Keywords: sustainable rehabilitation, concrete buildings, Iran, solar energy, water-efficient techniques
Procedia PDF Downloads 56
13000 Evaluating Models Through Feature Selection Methods Using Data Driven Approach
Authors: Shital Patil, Surendra Bhosale
Abstract:
Cardiac diseases are the leading cause of mortality and morbidity in the world; over the recent few decades, accounting for a large number of deaths, they have emerged as the most life-threatening disorders globally. Machine learning and artificial intelligence have been playing a key role in predicting heart diseases. A relevant set of features can be very helpful in predicting the disease accurately. In this study, we propose a comparative analysis of four different feature selection methods and evaluate their performance with both a raw (unbalanced) and a sampled (balanced) dataset. The publicly available Z-Alizadeh Sani dataset has been used for this study. The four feature selection methods used are: data analysis, minimum Redundancy maximum Relevance (mRMR), Recursive Feature Elimination (RFE), and Chi-squared. These methods are tested with 8 different classification models to get the best possible accuracy. Using both the balanced and unbalanced datasets, the study shows promising results in terms of various performance metrics for accurately predicting heart disease. Experimental results obtained by the proposed method with the raw data achieve a maximum AUC of 100%, a maximum F1 score of 94%, a maximum recall of 98%, and a maximum precision of 93%, while with the balanced dataset the obtained results are a maximum AUC of 100%, an F1 score of 95%, a maximum recall of 95%, and a maximum precision of 97%.
Keywords: cardio vascular diseases, machine learning, feature selection, SMOTE
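A minimal sketch of part of this pipeline, combining two of the named selectors (RFE and Chi-squared) with SMOTE balancing and AUC/F1 scoring; synthetic data stands in for the Z-Alizadeh Sani dataset, and mRMR is omitted since it is not part of scikit-learn.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE, SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, f1_score
from imblearn.over_sampling import SMOTE  # pip install imbalanced-learn

X, y = make_classification(n_samples=300, n_features=20, weights=[0.8], random_state=0)
X = MinMaxScaler().fit_transform(X)            # chi2 requires non-negative inputs
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_tr, y_tr = SMOTE(random_state=0).fit_resample(X_tr, y_tr)   # balance classes

selectors = {
    "RFE":  RFE(LogisticRegression(max_iter=1000), n_features_to_select=8),
    "chi2": SelectKBest(chi2, k=8),
}
for name, sel in selectors.items():
    Xtr_s, Xte_s = sel.fit_transform(X_tr, y_tr), sel.transform(X_te)
    clf = LogisticRegression(max_iter=1000).fit(Xtr_s, y_tr)
    proba = clf.predict_proba(Xte_s)[:, 1]
    print(f"{name}: AUC={roc_auc_score(y_te, proba):.2f} "
          f"F1={f1_score(y_te, clf.predict(Xte_s)):.2f}")
```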
Procedia PDF Downloads 118
12999 Developing a Product Circularity Index with an Emphasis on Longevity, Repairability, and Material Efficiency
Authors: Lina Psarra, Manogj Sundaresan, Purjeet Sutar
Abstract:
In response to the global imperative for sustainable solutions, this article proposes the development of a comprehensive circularity index applicable to a wide range of products across various industries. The absence of a consensus on using a universal metric to assess circularity performance presents a significant challenge in prioritizing and effectively managing sustainable initiatives. This circularity index serves as a quantitative measure to evaluate the adherence of products, processes, and systems to the principles of a circular economy. Unlike traditional distinct metrics such as recycling rates or material efficiency, this index considers the entire lifecycle of a product in one single metric, also incorporating additional factors such as reusability, scarcity of materials, repairability, and recyclability. Through a systematic approach and by reviewing existing metrics and past methodologies, this work aims to address this gap by formulating a circularity index that can be applied to a diverse product portfolio and assist in comparing the circularity of products on a scale of 0%-100%. Project objectives include developing a formula, designing and implementing a pilot tool based on the developed Product Circularity Index (PCI), evaluating the effectiveness of the formula and tool using real product data, and assessing the feasibility of integration into various sustainability initiatives. The research methodology involves an iterative process of comprehensive research, analysis, and refinement, where key steps include defining circularity parameters, collecting relevant product data, applying the developed formula, and testing the tool in a pilot phase to gather insights and make necessary adjustments. Major findings of the study indicate that the PCI provides a robust framework for evaluating product circularity across various dimensions. The Excel-based pilot tool demonstrated high accuracy and reliability in measuring circularity, and the database proved instrumental in supporting comprehensive assessments. The PCI facilitated the identification of key areas for improvement, enabling more informed decision-making towards circularity and benchmarking across different products, essentially assisting better resource management. In conclusion, the development of the Product Circularity Index represents a significant advancement in global sustainability efforts. By providing a standardized metric, the PCI empowers companies and stakeholders to systematically assess product circularity, track progress, identify improvement areas, and make informed decisions about resource management. This project contributes to the broader discourse on sustainable development by offering a practical approach to enhance circularity within industrial systems, thus paving the way towards a more resilient and sustainable future.
Keywords: circular economy, circular metrics, circularity assessment, circularity tool, sustainable product design, product circularity index
Procedia PDF Downloads 28
12998 Working Conditions, Motivation and Job Performance of Hotel Workers
Authors: Thushel Jayaweera
Abstract:
In the performance evaluation literature, there has been no investigation of the impact of job characteristics, working conditions and motivation on job performance among hotel workers in Britain. This study tested the relationship between working conditions (physical and psychosocial working conditions) and job performance (task and contextual performance), with motivators (e.g. recognition, achievement, the work itself, the possibility for growth and work significance) as the mediating variable. A total of 254 hotel workers in 25 hotels in Bristol, United Kingdom participated in this study. Working conditions influenced job performance, and motivation mediated the relationship between working conditions and job performance. Poor workplace conditions resulted in decreased employee performance. The results point to the importance of motivators among hotel workers and highlight that work should be designed to provide recognition and a sense of autonomy on the job to enhance the job performance of hotel workers. These findings have implications for organizational interventions aimed at increasing employee job performance.
Keywords: hotel workers, working conditions, motivation, job characteristics, job performance
Procedia PDF Downloads 598
12997 Measuring Fragmentation Index of Urban Landscape: A Case Study on Kuala Lumpur City
Authors: Shagufta Tazin Shathy, Mohammad Imam Hasan Reza
Abstract:
Fragmentation due to urbanization and agricultural expansion has become the main cause of the destruction of forest areas and loss of biodiversity, particularly in the developing world. At present, the world is experiencing the largest wave of urban growth in human history, and it is estimated that this influx will mainly take place in the developing world. Therefore, the study of urban fragmentation is vital for sustainable urban development. Landscape fragmentation has been one of the most important conservation issues of the last few decades. Habitat fragmentation due to landscape alteration has caused habitat isolation and the destruction of ecosystem patterns and processes. Thus, this research analyses the spatial and temporal extent of urban fragmentation using landscape indices in Kuala Lumpur (KL), the capital and most populous city of Malaysia. The objective of this study is to examine the urban fragmentation index in KL city. The fragmentation metrics used in the study are: a) urban landscape ratio (the ratio of urban landscape area to built-up area), b) infill (development that occurred within urbanized open space), and c) extension (development of exterior open space). After analyzing all three metrics, they are combined into the urban fragmentation index (UFI), in which all three metrics are given an equal weight. Land cover/land use maps for the years 1996 and 2005 were developed from Landsat TM 30 m resolution satellite images, with 1996 taken as the reference year to analyze the changes. The UFI calculated for 1996 and 2005 shows that KL city has undergone rapid landscape changes that have adversely affected the forest ecosystem; the increase in UFI over this period indicates that developmental activities have been occupying open spaces and fragmenting natural lands and forest. This index can be applied in other unplanned and rapidly urbanizing Asian cities, for example Dhaka and Delhi, to calculate the urban fragmentation rate. The findings of the study will help stakeholders and urban planners with sustainable urban management planning in this region.
Keywords: GIS, index, sustainable urban management, urbanization
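Since the abstract states that the three metrics are combined with equal weight, the UFI can be sketched as below; the normalization of each metric to a comparable [0, 1] scale and the example values are assumptions.

```python
# Equal-weight combination of the three fragmentation metrics, each assumed
# pre-normalized to [0, 1] against the reference year.
def ufi(urban_landscape_ratio, infill, extension, weights=(1/3, 1/3, 1/3)):
    metrics = (urban_landscape_ratio, infill, extension)
    return sum(w * m for w, m in zip(weights, metrics))

# hypothetical normalized metric values for the two map years
print("UFI 1996:", round(ufi(0.42, 0.10, 0.15), 3))
print("UFI 2005:", round(ufi(0.55, 0.22, 0.31), 3))   # higher -> more fragmented
```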
Procedia PDF Downloads 365
12996 Factors Affecting Employee Performance: A Case Study in Marketing and Trading Directorate, Pertamina Ltd.
Authors: Saptiadi Nugroho, A. Nur Muhamad Afif
Abstract:
Understanding the factors that influence employee performance is very important. By finding the significant factors, an organization can intervene to improve employee performance, which will simultaneously benefit the organization itself. In this research, four aspects, consisting of PCCD training, education level, corrective action, and work location, were tested to identify their influence on employee performance. Using correlation analysis and t-tests, it was found that employee performance was significantly influenced by PCCD training, work location, and corrective action, while education level did not influence employee performance.
Keywords: employee development, employee performance, performance management system, organization
Procedia PDF Downloads 390
12995 Scalable Performance Testing: Facilitating The Assessment Of Application Performance Under Substantial Loads And Mitigating The Risk Of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement a performance testing framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools like JMeter, EC2 instances (virtual machines), CloudWatch logs (for monitoring errors and logs), EFS (a file storage system), and security groups offers several key benefits for organizations. Creating a performance testing framework with this approach optimizes resource utilization, enables effective benchmarking, increases reliability, and saves costs by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance. The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of the execution, the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, encountered errors, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle more than 5-10 million requests within 5 minutes, depending on the server capacity of the hosted website or application.
Keywords: identify crashes of application under heavy load, JMeter with cloud services, scalable performance testing, JMeter master and slave using cloud services
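A minimal Python sketch of kicking off such a distributed run from the master node, using JMeter's standard distributed-testing flags (-n, -t, -R, -l); the host addresses and file paths are placeholders, and each slave is assumed to already be running the jmeter-server process.

```python
import subprocess

SLAVES = ["10.0.1.11", "10.0.1.12", "10.0.1.13"]   # slave EC2 private IPs (placeholders)

def run_distributed_test(test_plan="load_test.jmx", results="results.jtl"):
    """Launch a JMeter master run that fans the test plan out to the slaves."""
    cmd = [
        "jmeter",
        "-n",                    # non-GUI mode on the master
        "-t", test_plan,         # test plan, e.g. shared with slaves via EFS
        "-R", ",".join(SLAVES),  # distribute execution to the remote slave nodes
        "-l", results,           # consolidated results log returned to the master
        "-e", "-o", "report",    # generate the HTML dashboard report
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    run_distributed_test()
```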
Procedia PDF Downloads 27
12994 Using Machine Learning to Enhance Win Ratio for College Ice Hockey Teams
Authors: Sadixa Sanjel, Ahmed Sadek, Naseef Mansoor, Zelalem Denekew
Abstract:
Collegiate ice hockey (NCAA) sports analytics differs from analytics for national-level hockey (NHL). We apply and compare multiple machine learning models, such as linear regression, Random Forest, and neural networks, to predict a team's win ratio based on its statistics. Data exploration helps determine which statistics are most useful in increasing the win ratio, which would be beneficial to coaches and team managers. We ran experiments to select the best model and chose Random Forest as the best performing. We conclude with how to bridge the gap between the college and national levels of sports analytics and how to use machine learning to enhance team performance despite not having many metrics or much budget for automatic tracking.
Keywords: NCAA, NHL, sports analytics, random forest, regression, neural networks, game predictions
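A minimal sketch of the model comparison described, with Random Forest and linear regression fit to synthetic team statistics; the feature names are illustrative, not NCAA data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
# hypothetical per-season team stats: goals_for, goals_against, shots,
# power_play_pct, save_pct (all scaled 0-1)
X = rng.random((200, 5))
y = 0.5 + 0.3 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * X[:, 4] + rng.normal(0, 0.03, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=7)
for model in (LinearRegression(), RandomForestRegressor(random_state=7)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "R2:", round(r2_score(y_te, model.predict(X_te)), 3))

rf = RandomForestRegressor(random_state=7).fit(X_tr, y_tr)
print("feature importances:", np.round(rf.feature_importances_, 2))  # which stats matter
```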
Procedia PDF Downloads 114
12993 Machine Learning Techniques to Develop Traffic Accident Frequency Prediction Models
Authors: Rodrigo Aguiar, Adelino Ferreira
Abstract:
Road traffic accidents are the leading cause of unnatural death and injuries worldwide, representing a significant road safety problem. In this context, the use of artificial intelligence with advanced machine learning techniques has gained prominence as a promising approach to predicting traffic accidents. This article investigates the application of machine learning algorithms to develop traffic accident frequency prediction models. The models are evaluated based on performance metrics, making it possible to carry out a comparative analysis with traditional prediction approaches. The results suggest that machine learning can provide a powerful tool for accident prediction, which will contribute to more informed decisions regarding road safety.
Keywords: machine learning, artificial intelligence, frequency of accidents, road safety
Procedia PDF Downloads 89