Search results for: performance metrics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12646

12496 The Mediatory Role of Innovation in the Link between Social and Financial Performance

Authors: Bita Mashayekhi, Amin Jahangard, Milad Samavat, Saeid Homayoun

Abstract:

In the modern competitive business environment, the importance of corporate social responsibility cannot be overstated, and the controversial link between the social and financial performance of firms has become a topic of interest for scholars. This study therefore examines the link between social and financial performance while taking into account the mediating role of innovation performance. We applied covariance-based structural equation modeling (CB-SEM) to an international sample of firms drawn from the ASSET4 database. To open the black box of the relationship between social and financial performance, we first examined the effect of social performance separately on financial performance and on innovation; we then measured the mediating role of innovation in the link between social and financial performance. While our results indicate a positive effect of social performance on both financial performance and innovation, we cannot document a positive mediating role of innovation. This possibly relates to the long-term nature of the benefits from investments in innovation.
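
The mediation decomposition behind this design can be sketched in a few lines (an illustrative sketch only, not the authors' CB-SEM model; the path names a, b, and c' and the coefficient values are hypothetical):

```python
def indirect_effect(a, b):
    """Mediation: the indirect effect is the product of path a (social ->
    innovation performance) and path b (innovation -> financial performance)."""
    return a * b

def total_effect(c_direct, a, b):
    """Total effect = direct effect c' plus the indirect (mediated) effect a*b."""
    return c_direct + indirect_effect(a, b)
```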

Keywords: ESG, financial performance, innovation, social performance, structural equation modeling

Procedia PDF Downloads 62
12495 Evaluation of IMERG Performance at Estimating the Rainfall Properties through Convective and Stratiform Rain Events in a Semi-Arid Region of Mexico

Authors: Eric Muñoz de la Torre, Julián González Trinidad, Efrén González Ramírez

Abstract:

Rain varies greatly in duration, intensity, and spatial coverage, so sub-daily rainfall data are important for various applications, including risk prevention. Ground measurements, however, are limited by the low and irregular density of rain-gauge networks. An alternative is Satellite Precipitation Products (SPPs), such as IMERG, which use passive microwave and infrared sensors to estimate rainfall; these SPPs must nevertheless be validated before application. The aim of this study is to evaluate the performance of the IMERG (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement) Final Run V06B SPP in a semi-arid region of Mexico, using sub-daily data from four automatic rain gauges (pluviographs) for October 2019 and June to September 2021. The Minimum Inter-event Time (MIT) criterion, with a dry period of 10 h, was used to separate unique rain events so that the rainfall properties (depth, duration, and intensity) could be evaluated. Point-to-pixel analysis and continuous, categorical, and volumetric statistical metrics were used. Results show that IMERG estimates rainfall depth with a slight overestimation but cannot identify the true duration and intensity of rain events, showing large overestimations and underestimations, respectively. The study zone presented 80 to 85% convective rain events, the rest being stratiform, classified by the depth-magnitude variation between IMERG pixels and pluviographs. IMERG performed more poorly at detecting the former but performed well at estimating stratiform rain events, which originate from cold fronts.
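
The MIT-based event separation described here can be sketched as follows (an illustrative sketch, not the authors' code; the function names and the 1 h fallback duration for single-record events are assumptions):

```python
from datetime import datetime, timedelta

MIT = timedelta(hours=10)  # minimum dry period separating two independent events

def split_events(records, mit=MIT):
    """Group (timestamp, depth_mm) rain-gauge records into events: a new event
    starts whenever the dry spell since the previous record reaches the MIT."""
    events, current = [], []
    for t, depth in sorted(records):
        if current and t - current[-1][0] >= mit:
            events.append(current)
            current = []
        current.append((t, depth))
    if current:
        events.append(current)
    return events

def event_properties(event):
    """Return (depth in mm, duration in h, mean intensity in mm/h) for one event."""
    depth = sum(d for _, d in event)
    hours = (event[-1][0] - event[0][0]).total_seconds() / 3600
    hours = hours if hours > 0 else 1.0  # single-record event: assume 1 h resolution
    return depth, hours, depth / hours
```

With events separated this way, the same depth/duration/intensity triplets can be computed from both the pluviograph series and the co-located IMERG pixel series for point-to-pixel comparison.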

Keywords: IMERG, rainfall, rain gauge, remote sensing, statistical evaluation

Procedia PDF Downloads 36
12494 Load Balancing Technique for Energy Efficiency in Cloud Computing

Authors: Rani Danavath, V. B. Narsimha

Abstract:

Cloud computing is emerging as a new paradigm of large-scale distributed computing. It is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service-provider interaction; the model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing: the dynamic workload must be distributed across multiple nodes so that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This motivates new metrics, energy consumption and carbon emission, for energy-efficient load balancing techniques in cloud computing. Existing load balancing techniques mainly focus on reducing overhead and response time and on improving performance, but none of them have considered energy consumption and carbon emission. In this paper, we therefore introduce a load balancing technique oriented towards energy efficiency. This technique can improve the performance of cloud computing by balancing the workload across all nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
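
The abstract does not state the balancing algorithm itself; a generic least-loaded heuristic illustrates the core goal of keeping any single node from being overloaded (an illustrative sketch under that assumption, not the proposed technique):

```python
def assign_tasks(task_loads, n_nodes):
    """Greedy least-loaded placement: each incoming task is assigned to the
    node with the smallest accumulated load, keeping utilisation balanced."""
    loads = [0.0] * n_nodes
    placement = []
    for w in task_loads:
        i = loads.index(min(loads))  # pick the currently least-loaded node
        loads[i] += w
        placement.append(i)
    return placement, loads
```

An energy-aware variant would replace `min(loads)` with a cost that also weights each node's energy consumption and carbon emission rate.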

Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission

Procedia PDF Downloads 419
12493 Analytical Performance of Cobas C 8000 Analyzer Based on Sigma Metrics

Authors: Sairi Satari

Abstract:

Introduction: Six Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. Sigma methodology can be applied in a chemical pathology laboratory to evaluate process performance, with evidence for process improvement in the quality assurance program. In the laboratory, these methods have been used to improve the timeliness of troubleshooting, reduce the cost and frequency of quality control, and minimize pre- and post-analytical errors. Aim: The aim of this study is to evaluate the sigma values of the Cobas 8000 analyzer based on the minimum requirements of the specification. Methodology: Twenty-one analytes were chosen for this study: alanine aminotransferase (ALT), albumin, alkaline phosphatase (ALP), amylase, aspartate transaminase (AST), total bilirubin, calcium, chloride, cholesterol, HDL-cholesterol, creatinine, creatinine kinase, glucose, lactate dehydrogenase (LDH), magnesium, potassium, protein, sodium, triglyceride, uric acid, and urea. Total allowable error was obtained from the Clinical Laboratory Improvement Amendments (CLIA). Bias was calculated from the end-of-cycle report of the Royal College of Pathologists of Australasia (RCPA) cycle from July to December 2016, and the coefficient of variation (CV) from six months of internal quality control (IQC). Sigma was calculated with the formula: Sigma = (Total Error - Bias) / CV. Analytical performance was rated on the sigma value: sigma > 6 is world class, 5-6 is excellent, 4-5 is good, 3-4 is satisfactory, and sigma < 3 is poor. Results: Based on the calculation, we found that 76% are world class (ALT, albumin, ALP, amylase, AST, total bilirubin, cholesterol, HDL-cholesterol, creatinine, creatinine kinase, glucose, LDH, magnesium, potassium, triglyceride, and uric acid), 14% are excellent (calcium, protein, and urea), and 10% (chloride and sodium) require IQC to be performed more frequently per day.
Conclusion: Based on this study, we found that IQC should be performed more frequently only for chloride and sodium to ensure accurate and reliable analysis for patient management.
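
The sigma formula and performance bands given above translate directly into code (a minimal sketch; the example values are hypothetical, not the study's data):

```python
def sigma_metric(total_error_pct, bias_pct, cv_pct):
    """Sigma = (Total Error - Bias) / CV, with all terms expressed as percentages."""
    return (total_error_pct - abs(bias_pct)) / cv_pct

def rate(sigma):
    """Map a sigma value onto the performance bands used in the study."""
    if sigma > 6:
        return "world class"
    if sigma > 5:
        return "excellent"
    if sigma > 4:
        return "good"
    if sigma > 3:
        return "satisfactory"
    return "poor"
```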

Keywords: sigma metrics, analytical performance, total error, bias

Procedia PDF Downloads 147
12492 A Support Vector Machine Learning Prediction Model of Evapotranspiration Using Real-Time Sensor Node Data

Authors: Waqas Ahmed Khan Afridi, Subhas Chandra Mukhopadhyay, Bandita Mainali

Abstract:

This paper presents a unique approach to evapotranspiration (ET) prediction using a Support Vector Machine (SVM) learning algorithm. The study leverages real-time sensor-node data to develop an accurate and adaptable prediction model, addressing the inherent challenges of traditional ET estimation methods. Integrating the SVM algorithm with real-time sensor-node data offers great potential to improve the spatial and temporal resolution of ET predictions. For model development, key input features are measured or computed using equations such as Penman-Monteith (FAO56) and the soil water balance (SWB); these include soil and environmental parameters such as solar radiation (Rs), air temperature (T), atmospheric pressure (P), relative humidity (RH), wind speed (u2), rain (R), deep percolation (DP), soil temperature (ST), and change in soil moisture (∆SM). One year of field data is split into training, test, and validation sets in three different proportions, and kernel functions with tuned hyperparameters are used to train and improve the accuracy of the prediction model over multiple iterations. The paper also outlines existing methods and machine learning techniques for determining evapotranspiration, data collection and preprocessing, model construction, and evaluation metrics, highlighting the significance of SVM in advancing the field of ET prediction. The results demonstrate the robustness and high predictive ability of the developed model on the basis of performance evaluation metrics (R2, RMSE, MAE). The effectiveness of the proposed model in capturing complex relationships among soil and environmental parameters provides insight into its potential applications for water resource management and hydrological ecosystems.
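
The three evaluation metrics named in the abstract (R2, RMSE, MAE) have standard definitions that can be sketched directly (a generic implementation, not the authors' evaluation code):

```python
import math

def r2(y, yhat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ym = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - ym) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot

def rmse(y, yhat):
    """Root mean squared error."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def mae(y, yhat):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)
```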

Keywords: evapotranspiration, FAO56, KNIME, machine learning, RStudio, SVM, sensors

Procedia PDF Downloads 33
12491 Working Conditions, Motivation and Job Performance of Hotel Workers

Authors: Thushel Jayaweera

Abstract:

In the performance evaluation literature, there has been no investigation of the impact of job characteristics, working conditions, and motivation on job performance among hotel workers in Britain. This study tested the relationship between working conditions (physical and psychosocial) and job performance (task and contextual performance), with motivators (e.g., recognition, achievement, the work itself, the possibility for growth, and work significance) as the mediating variable. A total of 254 hotel workers in 25 hotels in Bristol, United Kingdom participated in this study. Working conditions influenced job performance, and motivation mediated the relationship between working conditions and job performance. Poor workplace conditions resulted in decreased employee performance. The results point to the importance of motivators among hotel workers and suggest that work should be designed to provide recognition and a sense of autonomy on the job to enhance the job performance of hotel workers. These findings have implications for organizational interventions aimed at increasing employee job performance.

Keywords: hotel workers, working conditions, motivation, job characteristics, job performance

Procedia PDF Downloads 563
12490 Performance Evaluation of Clustered Routing Protocols for Heterogeneous Wireless Sensor Networks

Authors: Awatef Chniguir, Tarek Farah, Zouhair Ben Jemaa, Safya Belguith

Abstract:

Optimal routing minimizes energy consumption in wireless sensor networks (WSN). Clustering has proven effective at organizing a WSN by reducing channel contention and packet collisions and by enhancing network throughput under heavy load. With the emergence of the Internet of Things, heterogeneity is now essential. The Stable Election Protocol (SEP), which increased the network stability period and lifetime, was the first clustering protocol for heterogeneous WSNs. SEP and its descendants, namely Threshold Sensitive SEP (TSEP), Enhanced TSEP (ETSSEP), and Current Energy Allotted TSEP (CEATSEP), were studied. The algorithms' performance was evaluated on different metrics, especially first node death (FND), to compare their stability. Simulations were conducted in MATLAB considering two scenarios: the first varies the fraction of advanced nodes while fixing the total number of nodes; the second varies the total number of nodes while keeping the number of advanced nodes constant. CEATSEP outperforms its antecedents by increasing stability while keeping throughput low, and it also operates very well in a large-scale network. Consequently, CEATSEP offers a longer lifespan and better energy efficiency than the other routing protocols for heterogeneous WSN.

Keywords: clustering, heterogeneous, stability, scalability, IoT, WSN

Procedia PDF Downloads 99
12489 Factors Affecting Employee Performance: A Case Study in Marketing and Trading Directorate, Pertamina Ltd.

Authors: Saptiadi Nugroho, A. Nur Muhamad Afif

Abstract:

Understanding the factors that influence employee performance is very important: by finding the significant factors, an organization can intervene to improve employee performance, which in turn benefits the organization itself. In this research, four factors, PCCD training, education level, corrective action, and work location, were tested for their influence on employee performance. Using correlation analysis and t-tests, it was found that employee performance is significantly influenced by PCCD training, work location, and corrective action, while education level does not influence employee performance.
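
The correlation analysis mentioned here is typically the Pearson product-moment correlation, which can be sketched in a few lines (a generic implementation, not the study's analysis script):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```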

Keywords: employee development, employee performance, performance management system, organization

Procedia PDF Downloads 359
12488 Evaluating the Location of Effective Product Advertising on Facebook Ads

Authors: Aulia F. Hadining, Atya Nur Aisha, Dimas Kurninatoro Aji

Abstract:

The use of social media as a marketing tool is growing rapidly, including among SMEs. Social media allows users to share product evaluations and recommendations with the public and facilitates word-of-mouth marketing communication. One such platform is Facebook, with Facebook Ads. This study aimed to evaluate the placement location of Facebook Ads in order to obtain an appropriate advertising design. Three alternative locations were considered: desktop, right-hand column, and mobile. Advertising effectiveness and efficiency were measured with metrics such as reach, clicks, Cost per Click (CPC), and Unique Click-Through Rate (UCTR). Facebook's Ads Manager was used for seven days, targeted by age (18-24), location (Bandung), language (Indonesian), and keywords. The result was a total reach of 13,999 and 342 clicks. Based on a comparison using ANOVA, there was a significant difference between placement locations on these advertising metrics. The mobile location was chosen as the most successful, because it produced the lowest CPC, Rp 691 per click, and a 14% UCTR. The results show that Facebook Ads is a useful and cost-effective medium for promoting SME products, because an ad can be viewed by many people at the same time.
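
The two efficiency metrics used to pick the winning placement are simple ratios (a minimal sketch; the per-placement numbers below are illustrative, not the study's full dataset):

```python
def ad_metrics(spend, clicks, unique_clicks, reach):
    """Return (cost per click, unique click-through rate) for one placement."""
    return spend / clicks, unique_clicks / reach

def best_placement(stats):
    """stats: {placement: (spend, clicks, unique_clicks, reach)};
    the winner is the placement with the lowest cost per click."""
    return min(stats, key=lambda name: ad_metrics(*stats[name])[0])
```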

Keywords: marketing communication, social media, Facebook Ads, mobile location

Procedia PDF Downloads 320
12487 Proactive Pure Handoff Model with SAW-TOPSIS Selection and Time Series Predict

Authors: Harold Vásquez, Cesar Hernández, Ingrid Páez

Abstract:

This paper applies cognitive radio techniques in a pure proactive spectrum handoff model to decrease interference between the primary user (PU) and the secondary user (SU), and compares it with a reactive handoff model. The study combines two multi-criteria decision models, SAW and TOPSIS, with three dynamic prediction techniques: AR, MA, and ARMA. To evaluate the best model, four metrics are used: number of failed handoffs, number of handoffs, number of predictions, and number of interferences. The results show the advantage of using this type of pure proactive model to predict changes in the PU on the selected channel and reduce interference. The best-performing model was TOPSIS-MA; although TOPSIS-AR had a higher predictive ability, this was not reflected in interference reduction.
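
The TOPSIS step of the channel-selection stage can be sketched generically (standard TOPSIS with vector normalisation, not the paper's exact formulation; the criteria and weights are illustrative):

```python
import math

def topsis(matrix, weights, benefit):
    """Score each alternative (row) over the criteria (columns).
    benefit[j] is True if larger is better for criterion j; the best
    alternative is the one closest to the ideal and farthest from the worst."""
    m = len(weights)
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(m)]
    v = [[weights[j] * row[j] / norms[j] for j in range(m)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, ideal)))
        d_neg = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

For channel selection, each row would be a candidate channel, with benefit criteria such as predicted idle time and cost criteria such as expected interference.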

Keywords: cognitive radio, spectrum handoff, decision making, time series, wireless networks

Procedia PDF Downloads 458
12486 Call-Back Laterality and Bilaterality: Possible Screening Mammography Quality Metrics

Authors: Samson Munn, Virginia H. Kim, Huija Chen, Sean Maldonado, Michelle Kim, Paul Koscheski, Babak N. Kalantari, Gregory Eckel, Albert Lee

Abstract:

Regarding screening mammography quality, it is not known what portion of reports advising call-back imaging should be bilateral versus unilateral, nor how far the unilateral call-backs may appropriately diverge from a 50-50 left-right split. Many factors may affect detection laterality: display arrangement, reflections preferentially striking one display location, hanging protocols, seating positions with respect to others and displays, visual field cuts, health, etc. The bilateral call-back fraction may reflect radiologist experience (not in our data) or confidence level. Thus, the laterality and bilaterality of call-backs advised in screening mammography reports could be worthy quality metrics. Here, the laterality data did not reveal a concern until drilling down to individuals. Bilateral screening mammogram report recommendations by five breast-imaging attending radiologists at Harbor-UCLA Medical Center (Torrance, California) from 9/1/15 to 8/31/16 and from 9/1/16 to 8/31/17 were retrospectively reviewed. Recommended call-backs for bilateral versus unilateral, and for left versus right, findings were counted, and the chi-square (χ²) statistic was applied. Year 1: of 2,665 bilateral screening mammograms, reports of 556 (20.9%) recommended call-back, of which 99 (17.8% of the 556) were for bilateral findings. Of the 457 unilateral recommendations, 222 (48.6%) regarded the left breast. Year 2: of 2,106 bilateral screening mammograms, reports of 439 (20.8%) recommended call-back, of which 65 (14.8% of the 439) were for bilateral findings. Of the 374 unilateral recommendations, 182 (48.7%) regarded the left breast. Individual ranges of call-backs that were bilateral were 13.2–23.3%, 10.2–22.5%, and 13.6–17.9% for year(s) 1, 2, and 1+2, respectively; these ranges were unrelated to experience level; the two-year mean was 15.8% (SD = 1.9%). The lowest χ² p value of the group's sidedness disparities for years 1, 2, and 1+2 was > 0.4.
Regarding four of the individual radiologists, the lowest p value was 0.42. However, the fifth radiologist disfavored the left, with p values of 0.21, 0.19, and 0.07, respectively; that radiologist had the greatest number of years of experience. There was a concerning 93% likelihood that the bias against left-breast findings evidenced by one of our radiologists was not random. Notably, very soon after the period under review, he retired, presented with leukemia, and died. We call for research to be done, particularly by large departments with many radiologists, on two possible new quality metrics in screening mammography: laterality and bilaterality. (Images, patient outcomes, report validity, and radiologist psychological confidence levels were not assessed. No intervention or subsequent data collection was conducted. This uncomplicated collection of data and simple appraisal were not designed, nor was there any intention, to develop or contribute to generalizable knowledge (per U.S. DHHS 45 CFR, part 46).)
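
The sidedness test described above amounts to a one-degree-of-freedom chi-squared goodness-of-fit test against an even left/right split; a sketch (an illustration of the statistic only, not the authors' analysis, using the year-1 counts of 222 left out of 457 unilateral call-backs):

```python
def chi2_laterality(left, right):
    """One-degree-of-freedom chi-squared statistic testing whether unilateral
    call-backs depart from an even 50-50 left/right split."""
    n = left + right
    expected = n / 2.0
    return ((left - expected) ** 2 + (right - expected) ** 2) / expected
```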

Keywords: mammography, screening mammography, quality, quality metrics, laterality

Procedia PDF Downloads 134
12485 Evaluating Models Through Feature Selection Methods Using Data Driven Approach

Authors: Shital Patil, Surendra Bhosale

Abstract:

Cardiac diseases, the leading cause of mortality and morbidity in the world, have over recent decades accounted for a large number of deaths and emerged as the most life-threatening disorder globally. Machine learning and artificial intelligence play a key role in predicting heart disease, and a relevant set of features can be very helpful in predicting the disease accurately. In this study, we propose a comparative analysis of four different feature selection methods and evaluate their performance on both the raw (unbalanced) dataset and a sampled (balanced) dataset. The publicly available Z-Alizadeh Sani dataset was used for this study. Four feature selection methods are applied: data analysis, minimum Redundancy Maximum Relevance (mRMR), Recursive Feature Elimination (RFE), and chi-squared. These methods are tested with eight different classification models to obtain the best possible accuracy. Using the balanced and unbalanced datasets, the study shows promising results on various performance metrics for accurately predicting heart disease. With the raw data, the proposed method obtains a maximum AUC of 100%, maximum F1 score of 94%, maximum recall of 98%, and maximum precision of 93%; with the balanced dataset, it obtains a maximum AUC of 100%, F1 score of 95%, maximum recall of 95%, and maximum precision of 97%.
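
The chi-squared feature selection criterion mentioned above scores each categorical feature by its dependence on the class label (a generic implementation of the statistic, not the study's pipeline):

```python
from collections import Counter

def chi2_score(feature, target):
    """Chi-squared statistic between a categorical feature and a class label:
    larger values indicate stronger dependence, hence a more relevant feature."""
    n = len(feature)
    f_counts, t_counts = Counter(feature), Counter(target)
    joint = Counter(zip(feature, target))
    stat = 0.0
    for f in f_counts:
        for t in t_counts:
            expected = f_counts[f] * t_counts[t] / n
            stat += (joint.get((f, t), 0) - expected) ** 2 / expected
    return stat
```

Ranking features by this score and keeping the top k gives the chi-squared selection step; mRMR and RFE replace the score with redundancy-penalised relevance and model-driven elimination, respectively.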

Keywords: cardio vascular diseases, machine learning, feature selection, SMOTE

Procedia PDF Downloads 82
12484 Recommender System Based on Mining Graph Databases for Data-Intensive Applications

Authors: Mostafa Gamal, Hoda K. Mohamed, Islam El-Maddah, Ali Hamdi

Abstract:

In recent years, many digital documents have been created on the web due to the rapid growth of 'social applications' communities, or 'data-intensive applications'. The evolution of online multimedia data poses new challenges for storing and querying large amounts of data for online recommender systems. Graph data models have been shown to be more efficient than relational data models for processing complex data. This paper explains the key differences between graph and relational databases, their strengths and weaknesses, and why graph databases are the best technology for building a real-time recommendation system. The paper also discusses several similarity-metric algorithms that compute a similarity score for pairs of nodes based on their neighbourhoods or their properties. Finally, the paper explores how NLP strategies provide a basis for improving the accuracy and coverage of real-time recommendations by extracting information from stored unstructured knowledge, which makes up the bulk of the world's data, to enrich the graph database. As the size and number of data items increase rapidly, the proposed system should meet both current and future needs.
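
One common neighbourhood-based similarity metric of the kind discussed here is the Jaccard coefficient; a sketch of how it drives a recommendation ranking (illustrative only; the paper does not commit to this particular metric):

```python
def jaccard(neighbours_a, neighbours_b):
    """Neighbourhood-overlap similarity of two nodes: |A ∩ B| / |A ∪ B|."""
    a, b = set(neighbours_a), set(neighbours_b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

def recommend(graph, user, k=2):
    """Rank the other nodes of an adjacency-list graph by neighbourhood
    similarity to `user` and return the top k candidates."""
    others = [n for n in graph if n != user]
    others.sort(key=lambda n: jaccard(graph[user], graph[n]), reverse=True)
    return others[:k]
```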

Keywords: graph databases, NLP, recommendation systems, similarity metrics

Procedia PDF Downloads 73
12483 Sustainable Rehabilitation of Concrete Buildings in Iran: Harnessing Sunlight and Navigating Limited Water Resources

Authors: Amin Khamoosh, Hamed Faramarzifar

Abstract:

In Tehran, the capital of Iran, numerous buildings constructed before today's extreme climate conditions became prevalent now face the need for rehabilitation, typically within their first decade. Our data examine the performance metrics and economic advantages of sustainable rehabilitation practices compared to traditional methods. With a focus on the scarcity of water resources, we specifically scrutinize water-efficient techniques throughout construction, rehabilitation, and use. Examining design elements that optimize natural light while efficiently managing heat transmission is crucial, given the reliance on water for cooling devices in this region. The data aim to present a comprehensive strategy that addresses immediate structural concerns while harmonizing with Iran's unique environmental conditions.

Keywords: sustainable rehabilitation, concrete buildings, Iran, solar energy, water-efficient techniques

Procedia PDF Downloads 23
12482 Examining the Role of Corporate Culture in Driving Firm Performance

Authors: Lovorka Galetić, Ivana Načinović Braje, Nevenka Čavlek

Abstract:

The purpose of this paper is to analyze the relationship between corporate culture and firm performance. Extensive theoretical and empirical evidence on this issue is provided. A quantitative methodology was used to explore relationship between corporate culture and performance among large Croatian companies. Corporate culture was explored by using Denison framework. The research revealed a positive, statistically significant relationship between mission and performance. Other dimensions of corporate culture (involvement, consistency and adaptability) show only partial relationship with performance.

Keywords: corporate culture, Croatia, Denison culture model, performance

Procedia PDF Downloads 496
12481 An Enhanced Distributed Weighted Clustering Algorithm for Intra and Inter Cluster Routing in MANET

Authors: K. Gomathi

Abstract:

A Mobile Ad hoc Network (MANET) is a collection of routable wireless mobile nodes that have no centralized administration and communicate with each other using radio signals. MANETs are often deployed in hostile environments, where attackers try to disturb secure data transfer and drain valuable network resources. Since a MANET is a battery-operated network, preserving network resources is essential. For resource-constrained computation, efficient routing, and increased network stability, the network is divided into smaller groups called clusters. The clustering architecture consists of a Cluster Head (CH), ordinary nodes, and gateways; the CH is responsible for inter- and intra-cluster routing. CH election is a prominent research area, and many algorithms have been developed using different metrics. A CH with a longer life sustains the network lifetime; for this purpose, a Secondary Cluster Head (SCH) is also elected, which is more economical. To nominate an efficient CH, an Enhanced Distributed Weighted Clustering Algorithm (EDWCA) is proposed. This approach considers metrics such as battery power, degree difference, and node speed for CH election. The proficiency of the proposed algorithm is evaluated and compared with an existing algorithm using the Network Simulator (NS-2).
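
A weighted CH election over the three metrics named above can be sketched as follows (an illustrative sketch only; the weight coefficients and sign conventions are assumptions, not EDWCA's actual formulation):

```python
def ch_weight(battery, degree_diff, speed, w=(0.5, 0.3, 0.2)):
    """Combined election weight: high residual battery is rewarded, while a
    large degree difference and high mobility (speed) are penalised."""
    w1, w2, w3 = w
    return w1 * battery - w2 * degree_diff - w3 * speed

def elect(nodes):
    """nodes: {node_id: (battery, degree_diff, speed)} ->
    (cluster head, secondary cluster head) by descending weight."""
    ranked = sorted(nodes, key=lambda n: ch_weight(*nodes[n]), reverse=True)
    return ranked[0], ranked[1]
```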

Keywords: MANET, EDWCA, clustering, cluster head

Procedia PDF Downloads 364
12480 Optimizing Wind Turbine Blade Geometry for Enhanced Performance and Durability: A Computational Approach

Authors: Nwachukwu Ifeanyi

Abstract:

Wind energy is a vital component of the global renewable energy portfolio, with wind turbines serving as the primary means of harnessing this abundant resource. However, the efficiency and stability of wind turbines remain critical challenges in maximizing energy output and ensuring long-term operational viability. This study proposes a comprehensive approach utilizing computational aerodynamics and aeromechanics to optimize wind turbine performance across multiple objectives. The proposed research aims to integrate advanced computational fluid dynamics (CFD) simulations with structural analysis techniques to enhance the aerodynamic efficiency and mechanical stability of wind turbine blades. By leveraging multi-objective optimization algorithms, the study seeks to simultaneously optimize aerodynamic performance metrics such as lift-to-drag ratio and power coefficient while ensuring structural integrity and minimizing fatigue loads on the turbine components. Furthermore, the investigation will explore the influence of various design parameters, including blade geometry, airfoil profiles, and turbine operating conditions, on the overall performance and stability of wind turbines. Through detailed parametric studies and sensitivity analyses, valuable insights into the complex interplay between aerodynamics and structural dynamics will be gained, facilitating the development of next-generation wind turbine designs. Ultimately, this research endeavours to contribute to the advancement of sustainable energy technologies by providing innovative solutions to enhance the efficiency, reliability, and economic viability of wind power generation systems. The findings have the potential to inform the design and optimization of wind turbines, leading to increased energy output, reduced maintenance costs, and greater environmental benefits in the transition towards a cleaner and more sustainable energy future.
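
One of the aerodynamic performance metrics named above, the power coefficient, has a standard definition that can be computed directly (a textbook formula, not the study's optimization code; the example rotor parameters are hypothetical):

```python
def power_coefficient(power_w, air_density, rotor_area_m2, wind_speed_ms):
    """Cp = P / (0.5 * rho * A * v^3): the fraction of the wind's kinetic power
    that the rotor extracts (the Betz limit caps Cp at 16/27 ≈ 0.593)."""
    return power_w / (0.5 * air_density * rotor_area_m2 * wind_speed_ms ** 3)
```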

Keywords: computation, robotics, mathematics, simulation

Procedia PDF Downloads 14
12479 Using Machine Learning to Enhance Win Ratio for College Ice Hockey Teams

Authors: Sadixa Sanjel, Ahmed Sadek, Naseef Mansoor, Zelalem Denekew

Abstract:

Collegiate ice hockey (NCAA) sports analytics differs from analytics at the national level (NHL). We apply and compare multiple machine learning models, such as linear regression, random forest, and neural networks, to predict a team's win ratio from its statistics. Data exploration helps determine which statistics are most useful in increasing the win ratio, which would benefit coaches and team managers. We ran experiments to select the best model and chose random forest as the best performing. We conclude with how to bridge the gap between the college and national levels of sports analytics and how machine learning can enhance team performance despite the lack of extensive metrics or a budget for automatic tracking.
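
The simplest of the compared models, linear regression of win ratio on a single team statistic, can be sketched with ordinary least squares (a one-predictor illustration, not the study's multi-feature models; the sample numbers are hypothetical):

```python
def fit_line(x, y):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def predict(model, x):
    """Predicted win ratio for a team statistic value x."""
    slope, intercept = model
    return slope * x + intercept
```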

Keywords: NCAA, NHL, sports analytics, random forest, regression, neural networks, game predictions

Procedia PDF Downloads 81
12478 Measuring Fragmentation Index of Urban Landscape: A Case Study on Kuala Lumpur City

Authors: Shagufta Tazin Shathy, Mohammad Imam Hasan Reza

Abstract:

Fragmentation due to urbanization and agricultural expansion has become a main driver of forest destruction and biodiversity loss, particularly in the developing world. The world is currently experiencing the largest wave of urban growth in human history, and this influx is estimated to take place mainly in the developing world; the study of urban fragmentation is therefore vital for sustainable urban development. Landscape fragmentation has been one of the most important conservation issues of the last few decades: habitat fragmentation due to landscape alteration causes habitat isolation and disrupts ecosystem patterns and processes. This research analyses the spatial and temporal extent of urban fragmentation using landscape indices in Kuala Lumpur (KL), the capital and most populous city of Malaysia. The objective of this study is to examine the urban fragmentation index of KL city. The fragmentation metrics used in the study are: a) urban landscape ratio (the ratio of urban landscape area to built-up area), b) infill (development that occurred within urbanized open space), and c) extension (development of exterior open space). After analyzing all three metrics, they are combined, with equal weights, into a combined urban fragmentation index (UFI). Land cover/land use maps for 1996 and 2005 were developed from Landsat TM 30 m resolution satellite images, with 1996 taken as the reference year for analysing change. The UFI calculated for 1996 and 2005 shows that KL city has undergone rapid landscape changes that adversely affect the forest ecosystem: the increase in UFI from 1996 to 2005 indicates that development activities have been occupying open spaces and fragmenting natural lands and forest. 
This index can be applied in other unplanned and rapidly urbanizing Asian cities, for example Dhaka and Delhi, to calculate the urban fragmentation rate. The findings of the study will help stakeholders and urban planners towards sustainable urban management planning in this region.
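
The equal-weight combination of the three metrics into the UFI can be sketched directly (a minimal sketch; the example metric values are hypothetical, and each metric is assumed pre-normalised to [0, 1]):

```python
def ufi(urban_landscape_ratio, infill, extension, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Combined urban fragmentation index: a weighted sum of the three
    fragmentation metrics, with equal weights by default."""
    metrics = (urban_landscape_ratio, infill, extension)
    return sum(w * m for w, m in zip(weights, metrics))
```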

Keywords: GIS, index, sustainable urban management, urbanization

Procedia PDF Downloads 344
12477 Machine Learning Techniques to Develop Traffic Accident Frequency Prediction Models

Authors: Rodrigo Aguiar, Adelino Ferreira

Abstract:

Road traffic accidents are the leading cause of unnatural death and injuries worldwide, representing a major road safety problem. In this context, artificial intelligence with advanced machine learning techniques has gained prominence as a promising approach to predicting traffic accidents. This article investigates the application of machine learning algorithms to develop traffic accident frequency prediction models. The models are evaluated against performance metrics, enabling a comparative analysis with traditional prediction approaches. The results suggest that machine learning can provide a powerful tool for accident prediction, contributing to more informed decisions regarding road safety.
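The comparative evaluation described above can be illustrated by scoring a model's predicted accident counts against observed counts with standard error metrics. The counts, the "baseline", and the "ML" predictions below are invented placeholders; no particular algorithm from the paper is implied.

```python
# Comparing a hypothetical ML predictor against a naive mean-style baseline
# on held-out accident counts, using MAE and RMSE.
import math

observed = [3, 7, 2, 9, 4, 6, 5, 8]                    # hypothetical accident counts
baseline = [5.5] * len(observed)                       # constant-mean baseline
ml_model = [3.5, 6.0, 2.5, 8.0, 4.5, 6.5, 5.0, 7.5]   # hypothetical ML predictions

def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

print("baseline MAE:", mae(observed, baseline))
print("ML model MAE:", mae(observed, ml_model))
```

A lower MAE/RMSE for the ML predictions is the kind of evidence the abstract's comparative analysis relies on.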

Keywords: machine learning, artificial intelligence, frequency of accidents, road safety

Procedia PDF Downloads 50
12476 Evaluation of NASA POWER and CRU Precipitation and Temperature Datasets over a Desert-prone Yobe River Basin: An Investigation of the Impact of Drought in the North-East Arid Zone of Nigeria

Authors: Yusuf Dawa Sidi, Abdulrahman Bulama Bizi

Abstract:

The most dependable and precise source of climate data is often gauge observation. However, long-term records of gauge observations are unavailable in many regions around the world. In recent years, a number of gridded climate datasets with high spatial and temporal resolutions have emerged as viable alternatives to gauge-based measurements, but it is crucial to evaluate their performance thoroughly prior to utilising them in hydroclimatic applications. This study therefore assesses the effectiveness of the NASA Prediction of Worldwide Energy Resources (NASA POWER) and Climate Research Unit (CRU) datasets in estimating precipitation and temperature patterns within the dry region of Nigeria from 1990 to 2020. The study employs widely used statistical metrics and the Standardised Precipitation Index (SPI) to capture the monthly variability of precipitation and temperature and the inter-annual anomalies in rainfall. The findings suggest that CRU outperformed NASA POWER for monthly precipitation and minimum and maximum temperatures, demonstrating a high correlation and much lower RMSE and MAE values. Nevertheless, NASA POWER exhibited moderate agreement with gauge observations in replicating monthly precipitation. The SPI analysis reveals that the CRU product reflects inter-annual variations in rainfall anomalies more accurately than NASA POWER. Overall, the findings indicate that the CRU gridded product is the more favourable of the two for this region.
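The SPI used in the study standardizes precipitation so that dry and wet anomalies are comparable across sites. The full SPI fits a gamma distribution before transforming to a standard normal; the sketch below uses a plain z-score as a simplified stand-in, with made-up monthly totals.

```python
# Simplified SPI-like standardization of monthly precipitation totals.
# Real SPI fits a gamma distribution first; this z-score version only
# illustrates the idea of a standardized anomaly.
from statistics import mean, pstdev

monthly_precip = [12.0, 30.5, 44.0, 8.0, 0.0, 25.0, 60.0, 15.5]  # mm, hypothetical

mu = mean(monthly_precip)
sigma = pstdev(monthly_precip)
spi_like = [(p - mu) / sigma for p in monthly_precip]

# Negative values flag drier-than-normal months, positive wetter-than-normal.
print([round(z, 2) for z in spi_like])
```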

Keywords: CRU, climate change, precipitation, SPI, temperature

Procedia PDF Downloads 45
12475 Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore

Authors: Ronal Muresano, Andrea Pagano

Abstract:

Mathematical and statistical applications are being developed with ever greater complexity and accuracy, and this added complexity demands more computational power if they are to execute quickly. In this sense, multicore environments play an important role in optimizing the execution time of such applications, since they allow more parallelism within a single node. Exploiting this parallelism is not easy, however: core-to-core communication, data locality, memory sizes (cache and RAM), synchronization, and data dependencies in the model all have to be managed, and these issues become more important as we push for greater performance and scalability. Hence, this paper describes an optimization method developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool of the European Commission, based on analyzing the application's weaknesses in order to exploit the advantages of the multicore architecture. All improvements are applied automatically and transparently, with the aim of improving the tool's performance metrics. Finally, experimental evaluations show the effectiveness of the new optimized version: execution time was reduced by around 96% in the best case tested, between the original serial version and the automatic parallel version.
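The reported ~96% reduction in execution time maps directly onto the usual speedup and parallel-efficiency figures. The sketch below makes that mapping explicit; the core count is an assumption for illustration, not a detail from the paper.

```python
# Converting a time reduction into speedup and parallel efficiency.
def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_cores):
    """Fraction of ideal linear scaling achieved on n_cores."""
    return speedup(t_serial, t_parallel) / n_cores

t_serial = 100.0    # arbitrary time units
t_parallel = 4.0    # a 96% reduction, as reported for the best case

print("speedup:", speedup(t_serial, t_parallel))            # 25x faster
print("efficiency on 32 cores:", efficiency(t_serial, t_parallel, 32))
```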

Keywords: algorithm optimization, bank failures, OpenMP, parallel techniques, statistical tool

Procedia PDF Downloads 344
12474 The Impact of Environmental Social and Governance (ESG) on Corporate Financial Performance (CFP): Evidence from New Zealand Companies

Authors: Muhammad Akhtaruzzaman

Abstract:

The impact of corporate environmental, social and governance (ESG) performance on financial performance is often difficult to quantify, even though ESG-related theories predict that ESG performance improves a company's financial performance. This research examines the link between corporate ESG performance and the financial performance of companies listed on the NZX (New Zealand Stock Exchange). For this purpose, it employs a mixed methods approach to examine and understand this link. While the quantitative results found no robust evidence of such a link, the qualitative analysis of content data suggests a strong co-occurrence between ESG performance and financial performance. The findings have important implications for policymakers in supporting higher ESG-performing companies and for management practitioners in developing ESG-related strategies.
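The qualitative step above rests on counting how often ESG and financial-performance themes co-occur in coded content. A toy sketch of such a co-occurrence count, using invented coded text segments and theme labels (none of this is the study's data):

```python
# Each set holds the themes coded for one hypothetical text segment.
segments = [
    {"esg", "financial"},
    {"esg"},
    {"esg", "financial", "strategy"},
    {"financial"},
]

# Count segments where both themes appear together.
cooccurrence = sum(1 for s in segments if {"esg", "financial"} <= s)
print(cooccurrence, "of", len(segments), "segments mention both themes")
```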

Keywords: ESG, financial performance, New Zealand firms, thematic analysis, mixed methods

Procedia PDF Downloads 21
12473 Effect of Communication Pattern on Agricultural Employees' Job Performance

Authors: B. G. Abiona, E. O. Fakoya, S. O. Adeogun, J. O. Blessed

Abstract:

This study assessed the influence of communication pattern on agricultural employees' job performance. Data were collected from 61 randomly selected respondents using a structured questionnaire. Perceived communication patterns that influence job performance include the attitude of the administrators (x̅ = 3.41) and physical barriers to communication flow among employees (x̅ = 3.21). Major challenges to respondents' job performance were language differences among employees (x̅ = 3.12), employees' perceptions of organizational issues (x̅ = 3.09), networking (x̅ = 2.88), and unclear definition of work (x̅ = 2.74). A significant relationship was found between employees' perceived communication pattern (r = 0.423, p < 0.00) and job performance. Information must therefore be designed in a way that positively influences employees' job performance, as this is essential in any agricultural organization.
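The item means (x̅) and Pearson correlation (r) reported above are straightforward to compute from questionnaire responses. The sketch below uses hypothetical Likert-scale ratings, not the study's survey data.

```python
# Item mean and Pearson correlation for two hypothetical rating series.
import math

comm_scores = [4, 3, 5, 2, 4, 3, 5, 4]   # hypothetical 1-5 communication ratings
perf_scores = [4, 3, 4, 2, 5, 3, 5, 3]   # hypothetical 1-5 performance ratings

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print("mean communication score:", sum(comm_scores) / len(comm_scores))
print("r =", round(pearson_r(comm_scores, perf_scores), 3))
```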

Keywords: communication pattern, job performance, agricultural employees, constraint, administrators, attitude

Procedia PDF Downloads 322
12472 A Transformer-Based Approach for Multi-Human 3D Pose Estimation Using Color and Depth Images

Authors: Qiang Wang, Hongyang Yu

Abstract:

Multi-human 3D pose estimation is a challenging task in computer vision, which aims to recover the 3D joint locations of multiple people from multi-view images. In contrast to traditional methods, which typically use only color (RGB) images as input, our approach utilizes both the color and depth (D) information contained in RGB-D images. We also employ a transformer-based model as the backbone of our approach, which is able to capture long-range dependencies and has been shown to perform well on various sequence modeling tasks. Our method is trained and tested on the Carnegie Mellon University (CMU) Panoptic dataset, which contains a diverse set of indoor and outdoor scenes with multiple people in varying poses and clothing. We evaluate the performance of our model on the standard 3D pose estimation metric of mean per-joint position error (MPJPE). Our results show that the transformer-based approach outperforms traditional methods and achieves competitive results on the CMU Panoptic dataset. We also perform an ablation study to understand the impact of different design choices on the overall performance of the model. In summary, our work demonstrates the effectiveness of using a transformer-based approach with RGB-D images for multi-human 3D pose estimation, with potential applications in real-world scenarios such as human-computer interaction, robotics, and augmented reality.
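MPJPE, the evaluation metric named above, averages the Euclidean distance between predicted and ground-truth 3D joint positions. A minimal sketch with made-up toy joint coordinates:

```python
# Mean per-joint position error (MPJPE) over a set of 3D joints.
import math

def mpjpe(pred, gt):
    """Mean Euclidean distance over joints; each joint is an (x, y, z) tuple."""
    dists = [math.dist(p, g) for p, g in zip(pred, gt)]
    return sum(dists) / len(dists)

gt_joints = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0), (2.0, 0.0, 1.0)]
pred_joints = [(0.1, 0.0, 0.0), (1.0, 1.2, 1.0), (2.0, 0.0, 0.7)]

print(round(mpjpe(pred_joints, gt_joints), 4))
```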

Keywords: multi-human 3D pose estimation, RGB-D images, transformer, 3D joint locations

Procedia PDF Downloads 46
12471 Increasing a Computer Performance by Overclocking Central Processing Unit (CPU)

Authors: Witthaya Mekhum, Wutthikorn Malikong

Abstract:

The objective of this study is to investigate the increase in desktop computer performance after overclocking the central processing unit (CPU), i.e. running it at a higher clock rate (more clock cycles per second) than it was designed for, in steps of 0.1 GHz (100 MHz) from 4000 MHz to 4500 MHz. Computer performance is tested at each level with four programs: Hyper PI ver. 0.99b, Cinebench R15, LinX ver. 0.6.4 and WinRAR. After the CPU overclock, computer performance increased. When overclocking the CPU by 29%, performance tested with Hyper PI ver. 0.99b increased by 10.03%, with Cinebench R15 by 20.05%, and with LinX by 16.61%. However, performance increased by only 8.14% when tested with WinRAR. Performance did not increase in proportion to the overclock rate because the computer consists of many other components, such as random access memory (RAM), hard disk drive, motherboard and display card, that also constrain overall performance.
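The percentage gains quoted above come from comparing benchmark scores before and after overclocking. The calculation can be sketched as follows; the scores are invented for illustration, not the study's measurements.

```python
# Percentage improvement between two benchmark scores (higher = better).
def percent_gain(before, after):
    return (after - before) / before * 100

base_score, oc_score = 1000.0, 1200.5   # hypothetical benchmark scores
print(f"{percent_gain(base_score, oc_score):.2f}% improvement")
```

Note that for time-based benchmarks such as Hyper PI, where a lower result is better, the comparison would be inverted.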

Keywords: overclock, performance, central processing unit, computer

Procedia PDF Downloads 257
12470 A Comparative Analysis of ARIMA and Threshold Autoregressive Models on Exchange Rate

Authors: Diteboho Xaba, Kolentino Mpeta, Tlotliso Qejoe

Abstract:

This paper assesses the in-sample forecasting of South African exchange rates, comparing a linear ARIMA model and a SETAR model. The study uses monthly adjusted South African exchange rate data with 420 observations. The Akaike information criterion (AIC) and the Schwarz information criterion (SIC) are used for model selection. Mean absolute error (MAE), root mean squared error (RMSE) and mean absolute percentage error (MAPE) are the error metrics used to evaluate the forecasting capability of the models. The Diebold-Mariano (DM) test is employed to check forecast accuracy and distinguish the forecasting performance of the two models (ARIMA and SETAR). The results indicate that both models perform well when modelling and forecasting the exchange rates, but SETAR appears to outperform ARIMA.
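The Diebold-Mariano test mentioned above compares two forecast error series through their loss differential. The sketch below is a simplified version (squared-error loss, no autocorrelation correction or small-sample adjustment) on made-up one-step-ahead errors from two models.

```python
# Simplified Diebold-Mariano statistic on the squared-error loss
# differential d_t = e1_t^2 - e2_t^2.
import math

errors_arima = [0.5, -0.8, 0.3, 1.1, -0.4, 0.9, -0.6, 0.2]  # hypothetical errors
errors_setar = [0.3, -0.5, 0.2, 0.7, -0.3, 0.6, -0.4, 0.1]

def dm_statistic(e1, e2):
    d = [a * a - b * b for a, b in zip(e1, e2)]
    n = len(d)
    d_bar = sum(d) / n
    var_d = sum((x - d_bar) ** 2 for x in d) / (n - 1)
    return d_bar / math.sqrt(var_d / n)

# A significantly positive statistic favours the second model (smaller losses).
print(round(dm_statistic(errors_arima, errors_setar), 3))
```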

Keywords: ARIMA, error metrics, model selection, SETAR

Procedia PDF Downloads 217
12469 Secured Embedding of Patient’s Confidential Data in Electrocardiogram Using Chaotic Maps

Authors: Butta Singh

Abstract:

This paper presents a chaotic map based approach for secured embedding of patients' confidential data in the electrocardiogram (ECG) signal. The chaotic map generates predefined locations through the use of selective control parameters. The sample value difference method effectively hides the confidential data in ECG sample pairs at these predefined locations. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The imperceptibility of the secret data in the stego-ECG is evident through various statistical and clinical performance measures. The statistical metrics comprise Percentage Root Mean Square Difference (PRD) and Peak Signal to Noise Ratio (PSNR). Further, a comparative analysis between the proposed method and existing approaches was performed, and the results clearly demonstrate the superiority of the proposed method.
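PRD and PSNR, the distortion metrics reported above, can be sketched directly; the "cover" and "stego" sample values below are synthetic toy values, not MIT-BIH data.

```python
# Distortion between a cover signal and a stego signal after embedding.
import math

cover = [1.02, 0.98, 1.10, 0.95, 1.05, 1.00]   # hypothetical ECG samples
stego = [1.03, 0.98, 1.09, 0.95, 1.06, 1.00]   # after embedding

def prd(x, y):
    """Percentage root-mean-square difference between cover and stego."""
    num = sum((a - b) ** 2 for a, b in zip(x, y))
    den = sum(a ** 2 for a in x)
    return 100 * math.sqrt(num / den)

def psnr(x, y):
    """Peak signal-to-noise ratio in dB, using the cover signal's peak."""
    mse = sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)
    peak = max(abs(a) for a in x)
    return 10 * math.log10(peak ** 2 / mse)

print(f"PRD  = {prd(cover, stego):.4f} %")
print(f"PSNR = {psnr(cover, stego):.2f} dB")
```

A low PRD and a high PSNR indicate that the embedding is imperceptible, which is the property the abstract's evaluation verifies.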

Keywords: chaotic maps, ECG steganography, data embedding, electrocardiogram

Procedia PDF Downloads 148
12468 Firm Performance and Evolving Corporate Governance: An Empirical Study from Pakistan

Authors: Mohammed Nishat, Ahmad Ghazali

Abstract:

This study empirically examines corporate governance and firm performance, evaluating the governance-, ownership- and control-related variables hypothesized to affect firm performance. It assesses the effectiveness of corporate governance mechanisms in achieving high performance among companies listed on the Karachi Stock Exchange (KSE) over the period from 2005 to 2008. To measure firm performance, the research uses three measures: return on assets (ROA), return on equity (ROE) and Tobin's Q. To link firm performance with corporate governance, three categories of corporate governance variables are tested, covering governance-, ownership- and control-related aspects. A fixed effect regression model is used to test the link between corporate governance and firm performance for 267 KSE-listed Pakistani firms. The results show that corporate governance variables such as percentage block holding by individuals have a positive impact on firm performance. When the CEO is also the chairperson of the board, firm performance is adversely affected. A negative relationship is also found between shares held by insiders and firm performance. Leverage has a negative impact on firm performance, while firm size is positively related to firm performance.
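The three performance measures used in the study can be sketched directly. The balance-sheet figures below are invented for illustration, and the Tobin's Q formula shown is a common simplified approximation rather than necessarily the one the authors used.

```python
# The three firm-performance measures: ROA, ROE, and a simplified Tobin's Q.
def roa(net_income, total_assets):
    return net_income / total_assets

def roe(net_income, shareholders_equity):
    return net_income / shareholders_equity

def tobins_q(market_value_equity, total_liabilities, total_assets):
    """Common approximation: (market value of equity + liabilities) / assets."""
    return (market_value_equity + total_liabilities) / total_assets

# Hypothetical figures (same currency units throughout):
print(roa(50.0, 1000.0))               # 0.05
print(roe(50.0, 400.0))                # 0.125
print(tobins_q(600.0, 600.0, 1000.0))  # 1.2
```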

Keywords: corporate governance, performance, agency cost, Karachi stock market

Procedia PDF Downloads 321
12467 Product Modularity, Collaboration and the Impact on Innovation Performance in Intra-Organizational R&D Networks

Authors: Daniel Martinez, Tim de Leeuw, Stefan Haefliger

Abstract:

The challenges of managing a large and geographically dispersed R&D organization have increased further in recent years, centring on leveraging a geographically dispersed body of knowledge in an efficient and effective manner. In order to reduce complexity and improve performance, firms introduce product modularity as one key element enabling global R&D network teams to develop their products and projects in collaboration. However, empirical studies on the effects of product modularity on innovation performance remain scant. Furthermore, some researchers have suggested that product modularity promotes innovation performance, while others argue that it inhibits it. This research fills the gap by investigating the impact of product modularity on various dimensions of innovation performance, i.e. effectiveness and efficiency. In constructing its theoretical framework, this study suggests that there is an inverted U-shaped relationship between product modularity and innovation performance. Moreover, it suggests that the optimum of innovation performance efficiency lies at a higher product modularity level than that of innovation performance effectiveness.
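The hypothesized inverted U-shaped link between product modularity and innovation performance can be sketched with a concave quadratic, whose vertex marks the optimal modularity level. The coefficients below are arbitrary illustrations, not estimates from the study.

```python
# Concave quadratic as a stand-in for an inverted U-shaped relationship
# between product modularity m and innovation performance.
def innovation_performance(m, a=-1.0, b=1.2, c=0.0):
    """Performance peaks at the vertex m = -b / (2a) when a < 0."""
    return a * m ** 2 + b * m + c

optimum = -1.2 / (2 * -1.0)   # vertex of the parabola
print("optimal modularity level:", optimum)
print("performance at optimum:", innovation_performance(optimum))
```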

Keywords: modularity, innovation performance, networks, R&D, collaboration

Procedia PDF Downloads 489