Search results for: usability performance metrics
13246 A Support Vector Machine Learning Prediction Model of Evapotranspiration Using Real-Time Sensor Node Data
Authors: Waqas Ahmed Khan Afridi, Subhas Chandra Mukhopadhyay, Bandita Mainali
Abstract:
The research paper presents a unique approach to evapotranspiration (ET) prediction using a Support Vector Machine (SVM) learning algorithm. The study leverages real-time sensor node data to develop an accurate and adaptable prediction model, addressing the inherent challenges of traditional ET estimation methods. The integration of the SVM algorithm with real-time sensor node data offers great potential to improve the spatial and temporal resolution of ET predictions. In the model development, key input features are measured and computed using mathematical equations such as Penman-Monteith (FAO56) and the soil water balance (SWB), which include soil-environmental parameters such as solar radiation (Rs), air temperature (T), atmospheric pressure (P), relative humidity (RH), wind speed (u2), rain (R), deep percolation (DP), soil temperature (ST), and change in soil moisture (∆SM). The one-year field dataset is split into training, test, and validation sets in several proportions, and kernel functions with tuned hyperparameters are used to train and iteratively improve the accuracy of the prediction model. This paper also outlines the existing methods and machine learning techniques used to determine evapotranspiration, data collection and preprocessing, model construction, and evaluation metrics, highlighting the significance of SVM in advancing the field of ET prediction. The results demonstrate the robustness and high predictability of the developed model on the basis of performance evaluation metrics (R2, RMSE, MAE). The effectiveness of the proposed model in capturing complex relationships within soil and environmental parameters provides insights into its potential applications for water resource management and hydrological ecosystems.
Keywords: evapotranspiration, FAO56, KNIME, machine learning, RStudio, SVM, sensors
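As a rough illustration of the workflow this abstract describes, the sketch below fits a kernel SVM regressor with hyperparameter tuning to sensor-node features; the file name, column names, and parameter grid are assumptions, not the study's actual data or settings.

```python
# Minimal sketch of an SVR-based ET predictor (file and column names assumed).
import pandas as pd
from sklearn.metrics import r2_score
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

df = pd.read_csv("sensor_node_data.csv")  # hypothetical one-year field dataset
features = ["Rs", "T", "P", "RH", "u2", "R", "DP", "ST", "dSM"]
X, y = df[features], df["ET_FAO56"]  # target computed via Penman-Monteith (FAO56)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# RBF kernel with hyperparameters tuned over multiple iterations
search = GridSearchCV(
    make_pipeline(StandardScaler(), SVR(kernel="rbf")),
    param_grid={"svr__C": [1, 10, 100], "svr__gamma": ["scale", 0.1, 0.01]},
    cv=5,
)
search.fit(X_train, y_train)
print("R2:", r2_score(y_test, search.predict(X_test)))
```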
Procedia PDF Downloads 70
13245 Evaluating the Location of Effective Product Advertising on Facebook Ads
Authors: Aulia F. Hadining, Atya Nur Aisha, Dimas Kurninatoro Aji
Abstract:
Utilization of social media as a marketing tool is growing rapidly, including for SMEs. Social media allows users to give product evaluations and recommendations to the public. In addition, social media facilitates word-of-mouth marketing communication. One of the social media platforms that can be used is Facebook, with Facebook Ads. This study aimed to evaluate the placement location of Facebook Ads in order to obtain an appropriate advertising design. Three alternative locations were considered: desktop, right-hand column, and mobile. The effectiveness and efficiency of advertising were measured using advertising metrics such as reach, clicks, cost per unique click (CUC), and unique click-through rate (UCTR). Facebook's Ads Manager was used for seven days, targeted by age (18-24), location (Bandung), language (Indonesian), and keywords. The result was 13,999 total reach and 342 clicks. Based on a comparison using ANOVA, there was a significant difference between placement locations on the advertising metrics. The mobile location was identified as the most effective placement because it produced the lowest CUC, Rp 691 per click, and a 14% UCTR. The results of this study showed that Facebook Ads is a useful and cost-effective medium for promoting SME products, because it can be viewed by many people at the same time.
Keywords: marketing communication, social media, Facebook Ads, mobile location
Procedia PDF Downloads 355
13244 Evaluation of IMERG Performance at Estimating the Rainfall Properties through Convective and Stratiform Rain Events in a Semi-Arid Region of Mexico
Authors: Eric Muñoz de la Torre, Julián González Trinidad, Efrén González Ramírez
Abstract:
Rain varies greatly in its duration, intensity, and spatial coverage, so it is important to have sub-daily rainfall data for various applications, including risk prevention. However, ground measurements are limited by the low and irregular density of rain gauges. An alternative to this problem is Satellite Precipitation Products (SPPs), which use passive microwave and infrared sensors to estimate rainfall, such as IMERG; however, these SPPs have to be validated before their application. The aim of this study is to evaluate the performance of the IMERG (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement) Final Run V06B SPP in a semi-arid region of Mexico, using sub-daily data from 4 automatic rain gauges (pluviographs) for October 2019 and June to September 2021, applying the Minimum Inter-event Time (MIT) criterion to separate unique rain events with a dry period of 10 hours, in order to evaluate the rainfall properties (depth, duration, and intensity). Point-to-pixel analysis and continuous, categorical, and volumetric statistical metrics were used. Results show that IMERG is capable of estimating rainfall depth with a slight overestimation but is unable to identify the real duration and intensity of the rain events, showing large overestimations and underestimations, respectively. The study zone presented 80 to 85% convective rain events, the rest being stratiform rain events, classified by the depth magnitude variation of IMERG pixels and pluviographs. IMERG showed poorer performance at detecting the former but performed well at estimating stratiform rain events, which originate from cold fronts.
Keywords: IMERG, rainfall, rain gauge, remote sensing, statistical evaluation
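The MIT event-separation step lends itself to a compact sketch: any dry gap of at least 10 hours between recorded rainfall closes one event and opens the next. The timestamps and depths here are illustrative, not gauge data from the study.

```python
# Split a datetime-indexed rainfall series into events using the Minimum
# Inter-event Time (MIT) criterion: a dry period >= mit_hours starts a new event.
from datetime import timedelta
import pandas as pd

def split_events(rain: pd.Series, mit_hours: float = 10.0) -> list[pd.Series]:
    wet = rain[rain > 0]
    if wet.empty:
        return []
    events, start = [], 0
    for i in range(1, len(wet)):
        if wet.index[i] - wet.index[i - 1] >= timedelta(hours=mit_hours):
            events.append(wet.iloc[start:i])
            start = i
    events.append(wet.iloc[start:])
    return events

idx = pd.date_range("2021-06-01", periods=48, freq="h")
vals = [0.0] * 48
vals[2:5] = [1.2, 3.4, 0.8]   # event 1
vals[30:32] = [2.0, 1.0]      # event 2, more than 10 dry hours later
rain_series = pd.Series(vals, index=idx)  # mm per time step

for ev in split_events(rain_series):
    depth = ev.sum()                                              # mm
    hours = (ev.index[-1] - ev.index[0]).total_seconds() / 3600 or 1  # nominal 1 h minimum
    print(f"depth={depth:.1f} mm, duration={hours:.1f} h, intensity={depth / hours:.2f} mm/h")
```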
Procedia PDF Downloads 70
13243 Load Balancing Technique for Energy-Efficiency in Cloud Computing
Authors: Rani Danavath, V. B. Narsimha
Abstract:
Cloud computing is emerging as a new paradigm of large-scale distributed computing. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing; it is required to distribute the dynamic workload across multiple nodes to ensure that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This determines the need for two new metrics, energy consumption and carbon emission, for energy-efficient load balancing techniques in cloud computing. Existing load balancing techniques mainly focus on reducing overhead and response time and improving performance, but none of them have considered energy consumption and carbon emission. Therefore, in this paper we introduce a technique aimed at energy efficiency. This energy-efficient load balancing technique can be used to improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission
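In the spirit of the technique the abstract outlines, the sketch below routes each incoming task to the node with the lowest estimated energy cost under a load cap; the cost model, constants, and per-task increment are all illustrative assumptions, since the abstract does not specify the algorithm.

```python
# Energy-aware dispatch sketch: pick the node minimizing marginal energy cost,
# skipping overloaded nodes. Constants and the cost model are assumptions.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    load: float            # current utilization, 0..1
    watts_per_task: float  # assumed marginal power draw per task

CO2_PER_KWH = 0.475  # kg CO2e per kWh, an illustrative grid-average factor

def dispatch(nodes, max_load=0.8):
    candidates = [n for n in nodes if n.load < max_load] or nodes
    best = min(candidates, key=lambda n: n.watts_per_task * (1 + n.load))
    best.load += 0.05  # assumed per-task utilization increment
    return best

nodes = [Node("n1", 0.30, 90.0), Node("n2", 0.60, 60.0), Node("n3", 0.10, 120.0)]
for _ in range(3):
    n = dispatch(nodes)
    # one task-hour on this node, converted to estimated emissions
    print(n.name, f"{n.watts_per_task / 1000 * CO2_PER_KWH:.3f} kg CO2e/task-hour")
```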
Procedia PDF Downloads 450
13242 Studying the Effect of Hydrocarbon Solutions on the Properties of Epoxy Polymer Concrete
Authors: Mustafa Hasan Omar
Abstract:
The destructive effect of hydrocarbon solutions on concrete, together with its high permeability, has led researchers to try to improve the performance of concrete exposed to these solutions, hence improving the durability and usability of oil-industry concrete structures. Recently, polymer concrete has been considered one of the most important types of concrete, yet its behavior after exposure to oil products is still largely unknown. In the present work, an experimental study has been carried out in which prepared epoxy polymer concrete was immersed in different types of hydrocarbon exposure solutions (gasoline, kerosene, and gas oil) for 120 days and compared with reference concrete left in the air. The results for the outdoor specimens indicate that the mechanical properties increased after 120 days, but the specimens immersed in gasoline, kerosene, and gas oil for the same period show reductions in compressive strength of 21%, 27%, and 23%, and in splitting tensile strength of 19%, 24%, and 20%, respectively. The reductions in ultrasonic pulse velocity are 17%, 22%, and 19% for cubic specimens and 20%, 25%, and 22% for cylindrical specimens, respectively.
Keywords: epoxy resin, hydrocarbon solutions, mechanical properties, polymer concrete, ultrasonic pulse velocity
Procedia PDF Downloads 129
13241 Threat Modeling Methodology for Supporting Industrial Control Systems Device Manufacturers and System Integrators
Authors: Raluca Ana Maria Viziteu, Anna Prudnikova
Abstract:
Industrial control systems (ICS) have received much attention in recent years due to the convergence of information technology (IT) and operational technology (OT), which has increased the interdependence of the safety and security issues to be considered. These issues require ICS-tailored solutions. This led to the need to create a methodology for supporting ICS device manufacturers and system integrators in carrying out threat modeling of embedded ICS devices in a way that guarantees the quality of the identified threats and minimizes subjectivity in the threat identification process. To research the possibility of creating such a methodology, a set of existing standards, regulations, papers, and publications related to threat modeling in the ICS sector and other sectors was reviewed to identify the various existing methodologies and methods used in threat modeling. Furthermore, the most popular ones were tested in an exploratory phase on a specific PLC device. The outcome of this exploratory phase was used as a basis for defining specific characteristics of ICS embedded devices and their deployment scenarios, identifying the factors that introduce subjectivity in the threat modeling process of such devices, and defining metrics for evaluating the minimum quality requirements of identified threats associated with the deployment of the devices in existing infrastructures. The threat modeling methodology was then created based on the results of the previous steps. The usability of the methodology was evaluated through a set of standardized threat modeling requirements and a standardized comparison method for threat modeling methodologies. The outcomes of these verification methods confirm that the methodology is effective. The full paper includes the outcome of research on the different threat modeling methodologies that can be used in OT, their comparison, and the results of implementing each of them in practice on a PLC device. This research is further used to build a threat modeling methodology tailored to OT environments; a detailed description is included. Moreover, the paper includes the results of an evaluation of the created methodology based on a set of parameters specifically created to rate threat modeling methodologies.
Keywords: device manufacturers, embedded devices, industrial control systems, threat modeling
Procedia PDF Downloads 81
13240 Products in Early Development Phases: Ecological Classification and Evaluation Using an Interval Arithmetic Based Calculation Approach
Authors: Helen L. Hein, Joachim Schwarte
Abstract:
As a pillar of sustainable development, ecology has become an important milestone in the research community, especially due to global challenges like climate change. The ecological performance of products can be scientifically assessed with life cycle assessments. In the construction sector, significant amounts of CO2 emissions are attributed to the energy used for building heating purposes. Sustainable construction materials for insulating purposes are therefore essential, and aerogels have been explored intensively in recent years due to their low thermal conductivity. In this context, the WALL-ACE project aims to develop an aerogel-based thermal insulating plaster that would achieve particularly low thermal conductivities. However, since in the early development phases a lot of information is still missing or not yet accessible, the ecological assessment of innovative products is increasingly based on uncertain data that can lead to significant deviations in the results. To be able to predict realistically how meaningful the results are and how viable the developed products may be with regard to their respective markets, these deviations have to be considered. Therefore, a classification method is presented in this study, which may allow comparing the ecological performance of novel products with already established and competitive materials. In order to achieve this, an alternative calculation method was used that allows computing with lower and upper bounds to consider all possible values without precise data. The life cycle analysis of the considered products was conducted with an interval arithmetic based calculation method. The results lead to the conclusion that the interval solutions describing the possible environmental impacts are so wide that the usability of the results is limited. Nevertheless, further optimization in reducing the environmental impacts of aerogels seems to be needed for them to become more competitive in the future.
Keywords: aerogel-based, insulating material, early development phase, interval arithmetic
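The interval-based calculation the abstract refers to can be sketched in a few lines: every uncertain quantity is carried as a [lower, upper] pair, and arithmetic propagates the bounds. The impact factors and masses below are invented for illustration, not WALL-ACE project data.

```python
# Minimal interval arithmetic for propagating data uncertainty in an LCA-style
# calculation: quantities are carried as [lo, hi] bounds.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [a * b for a in (self.lo, self.hi) for b in (other.lo, other.hi)]
        return Interval(min(products), max(products))

# kg CO2e per kg of material, with wide bounds reflecting missing early-phase data
aerogel_gwp = Interval(2.0, 9.0)
binder_gwp = Interval(0.6, 0.9)
mass_aerogel = Interval(4.5, 5.5)    # kg per m2 of plaster, assumed
mass_binder = Interval(10.0, 12.0)

total = aerogel_gwp * mass_aerogel + binder_gwp * mass_binder
print(f"GWP per m2 lies in [{total.lo:.1f}, {total.hi:.1f}] kg CO2e")
```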
Procedia PDF Downloads 144
13239 Analytical Performance of Cobas C 8000 Analyzer Based on Sigma Metrics
Authors: Sairi Satari
Abstract:
Introduction: Six sigma is a metric that quantifies the performance of processes as a rate of defects per million opportunities. Sigma methodology can be applied in the chemical pathology laboratory for evaluating process performance, with evidence for process improvement in the quality assurance program. In the laboratory, these methods have been used to improve the timeliness of troubleshooting, reduce the cost and frequency of quality control, and minimize pre- and post-analytical errors. Aim: The aim of this study is to evaluate the sigma values of the Cobas 8000 analyzer based on the minimum requirement of the specification. Methodology: Twenty-one analytes were chosen for this study: alanine aminotransferase (ALT), albumin, alkaline phosphatase (ALP), amylase, aspartate transaminase (AST), total bilirubin, calcium, chloride, cholesterol, HDL-cholesterol, creatinine, creatinine kinase, glucose, lactate dehydrogenase (LDH), magnesium, potassium, protein, sodium, triglyceride, uric acid, and urea. Total error was obtained from the Clinical Laboratory Improvement Amendments (CLIA). The bias was calculated from the end-cycle report of the Royal College of Pathologists of Australasia (RCPA) cycle from July to December 2016, and the coefficient of variation (CV) from six months of internal quality control (IQC). Sigma was calculated based on the formula: Sigma = (Total Error - Bias) / CV. Analytical performance was rated on the sigma value: sigma > 6 is world class, 5-6 is excellent, 4-5 is good, 3-4 is satisfactory, and sigma < 3 is poor performance. Results: Based on the calculation, we found that 76% are world class (ALT, albumin, ALP, amylase, AST, total bilirubin, cholesterol, HDL-cholesterol, creatinine, creatinine kinase, glucose, LDH, magnesium, potassium, triglyceride, and uric acid), 14% are excellent (calcium, protein, and urea), and 10% (chloride and sodium) require more frequent IQC per day. Conclusion: Based on this study, we found that IQC should be performed more frequently only for chloride and sodium to ensure accurate and reliable analysis for patient management.
Keywords: sigma metrics, analytical performance, total error, bias
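A worked instance of the formula and rating bands above; the 10% total allowable error used here matches the common CLIA criterion for glucose, while the bias and CV values are invented rather than taken from the study's RCPA/IQC figures.

```python
# Worked example of Sigma = (Total Error - Bias) / CV with illustrative numbers.
def sigma_metric(total_error_pct: float, bias_pct: float, cv_pct: float) -> float:
    return (total_error_pct - abs(bias_pct)) / cv_pct

def rate(sigma: float) -> str:
    if sigma > 6: return "world class"
    if sigma > 5: return "excellent"
    if sigma > 4: return "good"
    if sigma >= 3: return "satisfactory"
    return "poor"

s = sigma_metric(total_error_pct=10.0, bias_pct=1.2, cv_pct=1.4)
print(f"sigma = {s:.1f} -> {rate(s)}")  # sigma = 6.3 -> world class
```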
Procedia PDF Downloads 172
13238 Call-Back Laterality and Bilaterality: Possible Screening Mammography Quality Metrics
Authors: Samson Munn, Virginia H. Kim, Huija Chen, Sean Maldonado, Michelle Kim, Paul Koscheski, Babak N. Kalantari, Gregory Eckel, Albert Lee
Abstract:
In terms of screening mammography quality, neither the proportion of reports advising call-back imaging that should be bilateral versus unilateral, nor how far unilateral call-backs may appropriately diverge from 50-50 (left versus right), is known. Many factors may affect detection laterality: display arrangement, reflections preferentially striking one display location, hanging protocols, seating positions with respect to others and displays, visual field cuts, health, etc. The call-back bilateral fraction may reflect radiologist experience (not in our data) or confidence level. Thus, the laterality and bilaterality of call-backs advised in screening mammography reports could be worthy quality metrics. Here, laterality data did not reveal a concern until drilling down to individuals. Bilateral screening mammogram report recommendations by five breast imaging attending radiologists at Harbor-UCLA Medical Center (Torrance, California) from 9/1/15 to 8/31/16 and from 9/1/16 to 8/31/17 were retrospectively reviewed. Recommended call-backs for bilateral versus unilateral, and for left versus right, findings were counted. The chi-square (χ²) statistic was applied. Year 1: of 2,665 bilateral screening mammograms, reports of 556 (20.9%) recommended call-back, of which 99 (17.8% of the 556) were for bilateral findings. Of the 457 unilateral recommendations, 222 (48.6%) regarded the left breast. Year 2: of 2,106 bilateral screening mammograms, reports of 439 (20.8%) recommended call-back, of which 65 (14.8% of the 439) were for bilateral findings. Of the 374 unilateral recommendations, 182 (48.7%) regarded the left breast. Individual ranges of call-backs that were bilateral were 13.2-23.3%, 10.2-22.5%, and 13.6-17.9% for years 1, 2, and 1+2, respectively; these ranges were unrelated to experience level; the two-year mean was 15.8% (SD = 1.9%). The lowest χ² p value for the group's sidedness disparities in years 1, 2, and 1+2 was > 0.4. For four individual radiologists, the lowest p value was 0.42. However, the fifth radiologist disfavored the left, with p values of 0.21, 0.19, and 0.07, respectively; that radiologist had the greatest number of years of experience. There was a concerning 93% likelihood that the bias against left breast findings evidenced by one of our radiologists was not random. Notably, very soon after the period under review, he retired, presented with leukemia, and died. We call for research to be done, particularly by large departments with many radiologists, on two possible new quality metrics in screening mammography: laterality and bilaterality. (Images, patient outcomes, report validity, and radiologist psychological confidence levels were not assessed. No intervention or subsequent data collection was conducted. This uncomplicated collection of data and simple appraisal were not designed, nor had there been any intention, to develop or contribute to generalizable knowledge (per U.S. DHHS 45 CFR, part 46).)
Keywords: mammography, screening mammography, quality, quality metrics, laterality
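Assuming the sidedness check is a chi-square goodness-of-fit test against an expected 50-50 left/right split, the sketch below reproduces it for the year-1 group counts given in the abstract (222 left of 457 unilateral call-backs).

```python
# Chi-square goodness-of-fit test of unilateral call-back laterality vs 50-50.
from scipy.stats import chisquare

left = 222   # year-1 left-breast call-backs (from the abstract)
total = 457  # year-1 unilateral call-backs
stat, p = chisquare([left, total - left], f_exp=[total / 2, total / 2])
print(f"chi2 = {stat:.2f}, p = {p:.2f}")  # p well above 0.4: no group-level bias
```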
Procedia PDF Downloads 164
13237 Performance Evaluation of Clustered Routing Protocols for Heterogeneous Wireless Sensor Networks
Authors: Awatef Chniguir, Tarek Farah, Zouhair Ben Jemaa, Safya Belguith
Abstract:
Optimal routing allows minimizing energy consumption in wireless sensor networks (WSNs). Clustering has proven its effectiveness in organizing WSNs by reducing channel contention and packet collisions and enhancing network throughput under heavy load. Nowadays, with the emergence of the Internet of Things, heterogeneity is essential. The Stable Election Protocol (SEP), which increased the network stability period and lifetime, was the first clustering protocol for heterogeneous WSNs. SEP and its descendants, namely Threshold Sensitive SEP (TSEP), Enhanced TSEP (ETSSEP), and Current Energy Allotted TSEP (CEATSEP), were studied. These algorithms' performance was evaluated based on different metrics, especially first node death (FND), to compare their stability. Simulations were conducted in MATLAB considering two scenarios: the first varies the fraction of advanced nodes while fixing the total number of nodes, and the second varies the total number of nodes while keeping the number of advanced nodes constant. CEATSEP outperforms its antecedents by increasing stability while, at the same time, keeping a low throughput. It also operates very well in a large-scale network. Consequently, CEATSEP has a longer useful lifespan and better energy efficiency compared to the other routing protocols for heterogeneous WSNs.
Keywords: clustering, heterogeneous, stability, scalability, IoT, WSN
Procedia PDF Downloads 133
13236 Analyzing the Programme for International Student Assessment (PISA) Results in Uzbekistan: Insights from Organisation for Economic Co-operation and Development (OECD) Assessments
Authors: Nukarova Marjona Kayimovna
Abstract:
This article examines Uzbekistan's participation in the Programme for International Student Assessment (PISA) 2022, as the country took part in the assessment for the first time. The analysis delves into the initial results and performance metrics reported by the Organisation for Economic Co-operation and Development (OECD). By exploring Uzbekistan's data, the article highlights key findings, trends, and areas of strength and improvement. The aim is to provide a comprehensive understanding of how Uzbekistan's education system compares on the international stage and to offer insights into potential implications for future educational policies and reforms.
Keywords: PISA, OECD, data analysis of Uzbekistan, results, critical thinking
Procedia PDF Downloads 22
13235 Recommender System Based on Mining Graph Databases for Data-Intensive Applications
Authors: Mostafa Gamal, Hoda K. Mohamed, Islam El-Maddah, Ali Hamdi
Abstract:
In recent years, many digital documents have been created on the web due to the rapid growth of 'social applications' communities, or 'data-intensive applications'. The evolution of online multimedia data poses new challenges in storing and querying large amounts of data for online recommender systems. Graph data models have been shown to be more efficient than relational data models for processing complex data. This paper explains the key differences between graph and relational databases, their strengths and weaknesses, and why using graph databases is the best technology for building a real-time recommendation system. The paper also discusses several similarity metrics that can be used to compute a similarity score for pairs of nodes based on their neighbourhoods or their properties. Finally, the paper explores how NLP strategies provide the basis for improving the accuracy and coverage of real-time recommendations: by extracting information from the stored unstructured knowledge, which makes up the bulk of the world's data, the graph database can be enriched with this information. As the size and number of data items are increasing rapidly, the proposed system should meet current and future needs.
Keywords: graph databases, NLP, recommendation systems, similarity metrics
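One of the simplest neighbourhood-based similarity metrics of the kind the abstract mentions is Jaccard similarity over adjacency sets, sketched below on a toy graph (graph databases such as Neo4j expose equivalent node-similarity procedures); the data is illustrative, not the paper's.

```python
# Jaccard similarity of two nodes based on their neighbourhood sets.
def jaccard(neighbors: dict[str, set[str]], a: str, b: str) -> float:
    na, nb = neighbors[a], neighbors[b]
    return len(na & nb) / len(na | nb) if na | nb else 0.0

graph = {
    "alice": {"item1", "item2", "item3"},
    "bob":   {"item2", "item3", "item4"},
    "carol": {"item5"},
}
print(jaccard(graph, "alice", "bob"))    # 0.5 -> overlapping taste
print(jaccard(graph, "alice", "carol"))  # 0.0 -> no overlap
```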
Procedia PDF Downloads 105
13234 Streamlining the Fuzzy Front-End and Improving the Usability of the Tools Involved
Authors: Michael N. O'Sullivan, Con Sheahan
Abstract:
Researchers have spent decades developing tools and techniques to aid teams in the new product development (NPD) process. Despite this, there is evidently a huge gap between the academic prevalence of these tools and their industry adoption. For the fuzzy front-end in particular, there is a wide range of tools to choose from, including the Kano Model, the House of Quality, and many others. In fact, there are so many tools that it can often be difficult for teams to know which ones to use and how they interact with one another. Moreover, while the benefits of using these tools are obvious to industrialists, they are rarely used, as they carry a learning curve that is too steep and they become too complex to manage over time. In essence, it is commonly believed that they are simply not worth the effort required to learn and use them. This research explores a streamlined process for the fuzzy front-end, assembling the most effective tools and making them accessible to everyone. The process was developed iteratively over the course of 3 years, following over 80 final-year NPD teams from engineering, design, technology, and construction as they carried a product from concept through to production specification. Questionnaires, focus groups, and observations were used to understand the usability issues with the tools involved, and a human-centred design approach was adopted to produce a solution to these issues. The solution takes the form of a physical toolkit, similar to a board game, which allows the team to play through an example of a new product development in order to understand the process and the tools before using it for their own product development efforts. A complementary website enhances the physical toolkit, providing more examples of the tools being used, as well as deeper discussions on each of the topics, allowing teams to adapt the process to their skills, preferences, and product type. Teams found the solution very useful and intuitive and experienced significantly less confusion and fewer mistakes with the process than teams who did not use it. Those with a design background found it especially useful for engineering principles like Quality Function Deployment, while those with an engineering or technology background found it especially useful for design and customer requirements acquisition principles, like Voice of the Customer. Products developed using the toolkit are added to the website as more examples of how it can be used, creating a loop which helps future teams understand how the toolkit can be adapted to their project, whether it be a small consumer product or a large B2B service. The toolkit unlocks the potential of these beneficial tools to those in industry, both for large, experienced teams and for inexperienced start-ups. It allows users to assess the market potential of their product concept faster and more effectively, arriving at the product design stage with technical requirements prioritized according to their customers' needs and wants.
Keywords: new product development, fuzzy front-end, usability, Kano model, quality function deployment, voice of customer
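Of the tools named above, the Kano Model is the most mechanical to illustrate: each customer answers a functional and a dysfunctional question per requirement, and the answer pair maps to a category via the standard 5x5 evaluation table, as in this sketch (the example requirement is invented).

```python
# Kano classification: map (functional, dysfunctional) answer pairs to a
# category using the standard evaluation table; answers indexed 0="like" .. 4="dislike".
KANO_TABLE = [
    # dysfunctional:  like          expect        neutral       tolerate      dislike
    ["Questionable", "Attractive",  "Attractive", "Attractive", "One-dimensional"],  # functional: like
    ["Reverse",      "Indifferent", "Indifferent", "Indifferent", "Must-be"],        # expect
    ["Reverse",      "Indifferent", "Indifferent", "Indifferent", "Must-be"],        # neutral
    ["Reverse",      "Indifferent", "Indifferent", "Indifferent", "Must-be"],        # tolerate
    ["Reverse",      "Reverse",     "Reverse",     "Reverse",     "Questionable"],   # dislike
]

ANSWERS = {"like": 0, "expect": 1, "neutral": 2, "tolerate": 3, "dislike": 4}

def kano_category(functional: str, dysfunctional: str) -> str:
    return KANO_TABLE[ANSWERS[functional]][ANSWERS[dysfunctional]]

# "I like having a backlit screen" / "I can tolerate not having one"
print(kano_category("like", "tolerate"))  # Attractive
```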
Procedia PDF Downloads 108
13233 Analysis and Comparison of Prototypes of an Ergometric Step in a Multidisciplinary Design Process
Authors: M. B. Ricardo De Oliveira, A. Borghi-Silva, L. Di Thommazo, D. Braatz
Abstract:
Prototypes can be understood as representations of a product concept. Furthermore, prototyping constitutes an important stage in product development and results in better team communication, decision making, testing, and problem solving through feedback. Although recent studies suggest several prototyping methods for designers to choose from, these methods present different advantages, such as cost and time reduction, performance, and fidelity, which should be taken into account during a product development project. In this multidisciplinary study, involving the areas of physiotherapy, engineering, and computer science (hardware and software), we compared four prototypes of an ergometric step: a virtual prototype, a 3D-printed prototype, a bricolage prototype, and a prototype manufactured by a third-party company. These prototypes were evaluated in a comparative-qualitative approach for their contribution to the maturation of the product concept, the different prototyping methods used, and the advantages and disadvantages of each one based on the product's design specifications (performance, safety, materials, cost, maintenance, usability, ergonomics, and portability). Our results indicated that although prototypes show overall advantages, all of them have limitations, making it crucial to have different methods of testing and interacting with the product. Additionally, virtual and 3D-printed prototypes were essential at the early stages of the project due to their low cost and high-fidelity representation of the product, while the prototype manufactured by a third-party company and the bricolage prototype introduced functional tests in real scenarios, allowing more detailed evaluations. This study also resulted in a patent for an ergometric step.
Keywords: product design, product development, prototypes, step
Procedia PDF Downloads 117
13232 Proactive Pure Handoff Model with SAW-TOPSIS Selection and Time Series Predict
Authors: Harold Vásquez, Cesar Hernández, Ingrid Páez
Abstract:
This paper approaches cognitive radio techniques and applies a pure proactive handoff model to decrease interference between the primary user (PU) and secondary user (SU), comparing it with a reactive handoff model. The study analyzes the multi-criteria decision-making models SAW and TOPSIS joined to three dynamic prediction techniques: AR, MA, and ARMA. Four metrics are taken to evaluate the best model: number of failed handoffs, number of handoffs, number of predictions, and number of interferences. The results present the advantages of using this type of pure proactive model to predict changes in the PU according to the selected channel and reduce interference. The model that showed the best performance was TOPSIS-MA; although TOPSIS-AR had a higher predictive ability, this was not reflected in the interference reduction.
Keywords: cognitive radio, spectrum handoff, decision making, time series, wireless networks
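A compact sketch of the TOPSIS ranking step under stated assumptions: candidate channels are scored on weighted criteria and ranked by closeness to the ideal solution. The criteria, weights, and matrix values below are illustrative, not the paper's.

```python
# TOPSIS: rank alternatives by relative closeness to the ideal solution.
import numpy as np

def topsis(matrix: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
    norm = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize columns
    v = norm * weights                                # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)               # closeness, higher = better

# rows: candidate channels; columns: e.g. availability, SNR, PU arrival rate
m = np.array([[0.8, 20.0, 0.2],
              [0.6, 25.0, 0.4],
              [0.9, 15.0, 0.1]])
scores = topsis(m, weights=np.array([0.5, 0.3, 0.2]),
                benefit=np.array([True, True, False]))  # arrival rate is a cost
print(scores.argsort()[::-1])  # channel indices, best first
```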
Procedia PDF Downloads 491
13231 The Mediatory Role of Innovation in the Link between Social and Financial Performance
Authors: Bita Mashayekhi, Amin Jahangard, Milad Samavat, Saeid Homayoun
Abstract:
In the modern competitive business environment, one cannot overstate the importance of corporate social responsibility. The controversial link between the social and financial performance of firms has become a topic of interest for scholars. Hence, this study examines the social and financial performance link by taking into account the mediating role of innovation performance. We applied the covariance-based structural equation modeling (CB-SEM) method to an international sample of firms provided by the ASSET4 database. In this research, to explore the black box of the social and financial performance relationship, we first examined the effect of social performance separately on financial performance and innovation; then, we measured the mediating role of innovation in the social and financial performance link. While our results indicate a positive effect of social performance on financial performance and innovation, we cannot document a positive mediating role of innovation. This possibly relates to the long-term nature of the benefits from investments in innovation.
Keywords: ESG, financial performance, innovation, social performance, structural equation modeling
Procedia PDF Downloads 104
13230 An Enhanced Distributed Weighted Clustering Algorithm for Intra and Inter Cluster Routing in MANET
Authors: K. Gomathi
Abstract:
Mobile Ad hoc Networks (MANETs) are defined as collections of routable wireless mobile nodes with no centralized administration that communicate with each other using radio signals. MANETs are often deployed in hostile environments where hackers try to disturb secure data transfer and drain valuable network resources. Since a MANET is a battery-operated network, preserving network resources is essential. For resource-constrained computation, efficient routing, and increased network stability, the network is divided into smaller groups called clusters. The clustering architecture consists of the cluster head (CH), ordinary nodes, and gateways. The CH is responsible for inter- and intra-cluster routing. CH election is a prominent research area, and many algorithms have been developed using different metrics. A CH with a longer life sustains the network lifetime; for this purpose, a secondary cluster head (SCH) is also elected, which is more economical. To nominate an efficient CH, an Enhanced Distributed Weighted Clustering Algorithm (EDWCA) is proposed. This approach considers metrics like battery power, degree difference, and node speed for CH election. The performance of the proposed algorithm is evaluated and compared with an existing algorithm using the Network Simulator (NS-2).
Keywords: MANET, EDWCA, clustering, cluster head
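The election step can be sketched as a weighted score over the three metrics the abstract names; the weighting factors, sign conventions, and exact combination below are assumptions, since the abstract does not give the formula.

```python
# Hedged sketch of weighted cluster-head election in the spirit of EDWCA:
# higher battery is better, while large degree difference and high speed count against.
from dataclasses import dataclass

@dataclass
class SensorNode:
    node_id: int
    battery: float      # remaining energy, normalized 0..1
    degree_diff: float  # |neighbor count - ideal degree|
    speed: float        # mobility in m/s

W1, W2, W3 = 0.5, 0.3, 0.2  # assumed weighting factors

def weight(n: SensorNode) -> float:
    # Higher combined weight = better cluster-head candidate
    return W1 * n.battery - W2 * n.degree_diff - W3 * n.speed

def elect(nodes: list[SensorNode]) -> tuple[SensorNode, SensorNode]:
    """Return (cluster head, secondary cluster head)."""
    ranked = sorted(nodes, key=weight, reverse=True)
    return ranked[0], ranked[1]

ch, sch = elect([SensorNode(1, 0.9, 2, 1.5),
                 SensorNode(2, 0.7, 1, 0.5),
                 SensorNode(3, 0.95, 4, 3.0)])
print("CH:", ch.node_id, "SCH:", sch.node_id)
```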
Procedia PDF Downloads 399
13229 Sustainable Rehabilitation of Concrete Buildings in Iran: Harnessing Sunlight and Navigating Limited Water Resources
Authors: Amin Khamoosh, Hamed Faramarzifar
Abstract:
In the capital of Iran, Tehran, numerous buildings constructed when extreme climates were not prevalent now face the need for rehabilitation, typically within their first decade. Our data delve into the performance metrics and economic advantages of sustainable rehabilitation practices compared to traditional methods. Given the scarcity of water resources, we specifically scrutinize water-efficient techniques throughout construction, rehabilitation, and usage. Examining design elements that optimize natural light while efficiently managing heat transmission is crucial, given the reliance on water for cooling devices in this region. The data aim to present a comprehensive strategy that addresses immediate structural concerns while harmonizing with Iran's unique environmental conditions.
Keywords: sustainable rehabilitation, concrete buildings, Iran, solar energy, water-efficient techniques
Procedia PDF Downloads 56
13228 Evaluating Models Through Feature Selection Methods Using Data Driven Approach
Authors: Shital Patil, Surendra Bhosale
Abstract:
Cardiac diseases are the leading causes of mortality and morbidity in the world and, over the recent few decades, have emerged as the most life-threatening disorders globally, accounting for a large number of deaths. Machine learning and artificial intelligence have been playing a key role in predicting heart diseases. A relevant set of features can be very helpful in predicting the disease accurately. In this study, we propose a comparative analysis of four different feature selection methods and evaluate their performance with both the raw (unbalanced) dataset and a sampled (balanced) dataset. The publicly available Z-Alizadeh Sani dataset has been used for this study. Four feature selection methods are used: data analysis, minimum Redundancy Maximum Relevance (mRMR), Recursive Feature Elimination (RFE), and chi-squared. These methods are tested with 8 different classification models to get the best accuracy possible. Using the balanced and unbalanced datasets, the study shows promising results in terms of various performance metrics in accurately predicting heart disease. Experimental results obtained by the proposed method with the raw data achieve a maximum AUC of 100%, a maximum F1 score of 94%, a maximum recall of 98%, and a maximum precision of 93%, while with the balanced dataset the obtained results are a maximum AUC of 100%, an F1 score of 95%, a maximum recall of 95%, and a maximum precision of 97%.
Keywords: cardio vascular diseases, machine learning, feature selection, SMOTE
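As a rough sketch of the pipeline described, assuming a SMOTE-balanced training set and a chi-squared selector (RFE is a one-line swap; mRMR needs a third-party package); the synthetic data stands in for the Z-Alizadeh Sani features.

```python
# Feature selection + class balancing sketch: SMOTE, then chi2 selection,
# then a classifier scored with AUC and F1. Data here is synthetic.
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=300, n_features=30, weights=[0.7], random_state=0)
X = MinMaxScaler().fit_transform(X)  # chi2 requires non-negative features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)  # balance classes

selector = SelectKBest(chi2, k=15).fit(X_bal, y_bal)
clf = RandomForestClassifier(random_state=0).fit(selector.transform(X_bal), y_bal)

proba = clf.predict_proba(selector.transform(X_te))[:, 1]
pred = clf.predict(selector.transform(X_te))
print("AUC:", roc_auc_score(y_te, proba), "F1:", f1_score(y_te, pred))
```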
Procedia PDF Downloads 119
13227 Developing a Product Circularity Index with an Emphasis on Longevity, Repairability, and Material Efficiency
Authors: Lina Psarra, Manogj Sundaresan, Purjeet Sutar
Abstract:
In response to the global imperative for sustainable solutions, this article proposes the development of a comprehensive circularity index applicable to a wide range of products across various industries. The absence of a consensus on a universal metric to assess circularity performance presents a significant challenge in prioritizing and effectively managing sustainable initiatives. This circularity index serves as a quantitative measure to evaluate the adherence of products, processes, and systems to the principles of a circular economy. Unlike traditional distinct metrics such as recycling rates or material efficiency, this index considers the entire lifecycle of a product in one single metric, also incorporating additional factors such as reusability, scarcity of materials, reparability, and recyclability. Through a systematic approach and a review of existing metrics and past methodologies, this work aims to address this gap by formulating a circularity index that can be applied to diverse product portfolios and assist in comparing the circularity of products on a scale of 0%-100%. The project objectives include developing a formula, designing and implementing a pilot tool based on the developed Product Circularity Index (PCI), evaluating the effectiveness of the formula and tool using real product data, and assessing the feasibility of integration into various sustainability initiatives. The research methodology involves an iterative process of comprehensive research, analysis, and refinement, where key steps include defining circularity parameters, collecting relevant product data, applying the developed formula, and testing the tool in a pilot phase to gather insights and make necessary adjustments. The major findings of the study indicate that the PCI provides a robust framework for evaluating product circularity across various dimensions. The Excel-based pilot tool demonstrated high accuracy and reliability in measuring circularity, and the database proved instrumental in supporting comprehensive assessments. The PCI facilitated the identification of key areas for improvement, enabling more informed decision-making towards circularity and benchmarking across different products, essentially assisting better resource management. In conclusion, the development of the Product Circularity Index represents a significant advancement in global sustainability efforts. By providing a standardized metric, the PCI empowers companies and stakeholders to systematically assess product circularity, track progress, identify improvement areas, and make informed decisions about resource management. This project contributes to the broader discourse on sustainable development by offering a practical approach to enhancing circularity within industrial systems, thus paving the way towards a more resilient and sustainable future.
Keywords: circular economy, circular metrics, circularity assessment, circularity tool, sustainable product design, product circularity index
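A hedged sketch of what a 0-100% index of this kind can look like: an aggregate of normalized sub-scores whose factor list mirrors the abstract (longevity, repairability, material efficiency, reusability, recyclability, material scarcity), with weights and scores that are illustrative assumptions rather than the PCI's actual formula.

```python
# Weighted aggregate of normalized circularity factors into a 0-100% index.
FACTORS = ["longevity", "repairability", "material_efficiency",
           "reusability", "recyclability", "material_scarcity"]

def circularity_index(scores: dict[str, float],
                      weights: dict[str, float] | None = None) -> float:
    """scores: each factor normalized to 0..1; returns index in percent."""
    weights = weights or {f: 1 / len(FACTORS) for f in FACTORS}
    return 100 * sum(weights[f] * scores[f] for f in FACTORS)

product = {"longevity": 0.7, "repairability": 0.4, "material_efficiency": 0.8,
           "reusability": 0.5, "recyclability": 0.9, "material_scarcity": 0.6}
print(f"PCI = {circularity_index(product):.0f}%")  # PCI = 65%
```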
Procedia PDF Downloads 30
13226 Measuring Fragmentation Index of Urban Landscape: A Case Study on Kuala Lumpur City
Authors: Shagufta Tazin Shathy, Mohammad Imam Hasan Reza
Abstract:
Fragmentation due to urbanization and agricultural expansion has become the main reason for the destruction of forest areas and the loss of biodiversity, particularly in the developing world. At present, the world is experiencing the largest wave of urban growth in human history, and it is estimated that this influx will mainly take place in the developing world. Therefore, the study of urban fragmentation is vital for sustainable urban development. Landscape fragmentation has been one of the most important conservation issues of the last few decades. Habitat fragmentation due to landscape alteration has caused habitat isolation and destruction of ecosystem patterns and processes. Thus, this research analyses the spatial and temporal extent of urban fragmentation using landscape indices in Kuala Lumpur (KL), the capital and most populous city of Malaysia. The objective of this study is to examine the urban fragmentation index in KL city. The fragmentation metrics used in the study are: a) urban landscape ratio (the ratio of urban landscape area to built-up area), b) infill (development that occurred within urbanized open space), and c) extension (development of exterior open space). After analyzing all three metrics, they are combined into the urban fragmentation index (UFI), in which all three metrics are given equal weight. Land cover/land use maps for the years 1996 and 2005 were developed from Landsat TM 30 m resolution satellite images. The year 1996 is taken as the reference year to analyze the changes. The UFI calculated for 1996 and 2005 shows that KL city has undergone rapid landscape changes that have adversely affected the forest ecosystem. The increase in UFI from 1996 to 2005 indicates that developmental activities have been occupying open spaces and fragmenting natural lands and forest. This index can be applied to other unplanned and rapidly urbanizing Asian cities, for example Dhaka and Delhi, to calculate the urban fragmentation rate. The findings of the study will help stakeholders and urban planners towards sustainable urban management planning in this region.
Keywords: GIS, index, sustainable urban management, urbanization
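Since the abstract states that the three metrics enter the combined index with equal weight, a minimal sketch is an equally weighted mean of the normalized metric values; the 1996/2005 figures below are invented for illustration, not the study's results.

```python
# Combined urban fragmentation index (UFI): equally weighted mean of the
# three normalized fragmentation metrics named in the abstract.
def ufi(urban_landscape_ratio: float, infill: float, extension: float) -> float:
    return (urban_landscape_ratio + infill + extension) / 3  # equal weights

ufi_1996 = ufi(0.42, 0.18, 0.25)
ufi_2005 = ufi(0.55, 0.27, 0.38)
print(f"UFI 1996 = {ufi_1996:.2f}, UFI 2005 = {ufi_2005:.2f}")
# A rise over time signals development consuming open space and natural land.
```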
Procedia PDF Downloads 367
13225 Working Conditions, Motivation and Job Performance of Hotel Workers
Authors: Thushel Jayaweera
Abstract:
In the performance evaluation literature, there has been no investigation of the impact of job characteristics, working conditions, and motivation on job performance among hotel workers in Britain. This study tested the relationship between working conditions (physical and psychosocial working conditions) and job performance (task and contextual performance) with motivators (e.g., recognition, achievement, the work itself, the possibility for growth, and work significance) as the mediating variable. A total of 254 hotel workers in 25 hotels in Bristol, United Kingdom, participated in this study. Working conditions influenced job performance, and motivation moderated the relationship between working conditions and job performance. Poor workplace conditions resulted in decreased employee performance. The results point to the importance of motivators among hotel workers and highlight that work should be designed to provide recognition and a sense of autonomy on the job to enhance the job performance of hotel workers. These findings have implications for organizational interventions aimed at increasing employee job performance.
Keywords: hotel workers, working conditions, motivation, job characteristics, job performance
Procedia PDF Downloads 599
13224 The ReliVR Project: Feasibility of a Virtual Reality Intervention in the Psychotherapy of Depression
Authors: Kyra Kannen, Sonja D. Roelen, Sebastian Schnieder, Jarek Krajewski, Steffen Holsteg, André Karger, Johanna Askeridis, Celina Slawik, Philip Mildner, Jens Piesk, Ruslan David, Holger Kürten, Benjamin Oster, Robert Malzan, Mike Ludemann
Abstract:
Virtual Reality (VR) is increasingly recognized for its potential in transforming mental disorder treatment, offering advantages such as cost-effectiveness, time efficiency, accessibility, reduced stigma, and scalability. While the application of VR in the context of anxiety disorders has been extensively evaluated and demonstrated to be effective, the utilization of VR as a therapeutic treatment for depression remains under-investigated. Our goal is to pioneer immersive VR therapy modules for treating major depression, alongside a web-based system for home use. We develop a modular digital therapy platform grounded in psychodynamic therapy interventions that addresses stress reduction, exploration of social situations and relationship support, social skill training, avoidance behavior analysis, and psychoeducation. In addition, an automated depression monitoring system based on acoustic voice analysis is implemented in the form of a speech-based diary to track the affective state of the user and depression severity. The use of immersive VR facilitates patient immersion into complex and realistic interpersonal interactions with high emotional engagement, which may contribute to positive treatment acceptance and satisfaction. In a proof-of-concept study, 45 depressed patients were assigned to VR or web-platform modules, and user experience, usability, and additional metrics including depression severity, mindfulness, interpersonal problems, and treatment satisfaction were evaluated. The findings provide valuable insights into the effectiveness and user-friendliness of VR and web modules for depression therapy and contribute to the refinement of more tailored digital interventions to improve mental health.
Keywords: virtual reality therapy, digital health, depression, psychotherapy
Procedia PDF Downloads 64
13223 Scalable Performance Testing: Facilitating The Assessment Of Application Performance Under Substantial Loads And Mitigating The Risk Of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement a performance testing framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools like JMeter, EC2 instances (virtual machines), CloudWatch logs (for monitoring errors and logs), EFS (file storage), and security groups offers several key benefits for organizations. Creating a performance testing framework with this approach helps optimize resource utilization, enables effective benchmarking, increases reliability, and saves costs by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance. The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of the execution, the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, errors encountered, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle more than 5-10 million requests within 5 minutes, depending on the server capacity of the hosted website or application.
Keywords: identify crashes of application under heavy load, JMeter with cloud services, scalable performance testing, JMeter master and slave using cloud services
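A minimal sketch of kicking off such a distributed run from the master node; the hostnames and file paths are assumptions, while -n, -t, -R, -l, -e, and -o are JMeter's standard CLI options for a non-GUI remote run that consolidates slave results into one report.

```python
# Drive a JMeter master-slave run from the master EC2 instance.
import subprocess

SLAVES = ["10.0.1.11", "10.0.1.12", "10.0.1.13"]  # private IPs of slave EC2s

cmd = [
    "jmeter",
    "-n",                     # non-GUI mode
    "-t", "load_test.jmx",    # test plan, e.g. shared via EFS
    "-R", ",".join(SLAVES),   # remote slave nodes to drive
    "-l", "results.jtl",      # consolidated results file on the master
    "-e", "-o", "report/",    # generate the HTML dashboard report
]
subprocess.run(cmd, check=True)
# The dashboard then reports throughput, error counts, and latency percentiles.
```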
Procedia PDF Downloads 30
13222 Factors Affecting Employee Performance: A Case Study in Marketing and Trading Directorate, Pertamina Ltd.
Authors: Saptiadi Nugroho, A. Nur Muhamad Afif
Abstract:
Understanding the factors that influence employee performance is very important. By finding the significant factors, an organization can intervene to improve employee performance, which will simultaneously affect the organization itself. In this research, four aspects, consisting of PCCD training, education level, corrective action, and work location, were tested to identify their influence on employee performance. Using correlation analysis and t-tests, it was found that employee performance is significantly influenced by PCCD training, work location, and corrective action, while education level did not influence employee performance.
Keywords: employee development, employee performance, performance management system, organization
Procedia PDF Downloads 390
13221 Using Machine Learning to Enhance Win Ratio for College Ice Hockey Teams
Authors: Sadixa Sanjel, Ahmed Sadek, Naseef Mansoor, Zelalem Denekew
Abstract:
Collegiate ice hockey (NCAA) sports analytics differs from national-level hockey (NHL) analytics. We apply and compare multiple machine learning models, such as linear regression, random forest, and neural networks, to predict the win ratio for a team based on its statistics. Data exploration helps determine which statistics are most useful in increasing the win ratio, which would be beneficial to coaches and team managers. We ran experiments to select the best model and chose random forest as the best performing. We conclude with how to bridge the gap between the college and national levels of sports analytics and how machine learning can enhance team performance despite not having many metrics or a budget for automatic tracking.
Keywords: NCAA, NHL, sports analytics, random forest, regression, neural networks, game predictions
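A minimal sketch of the win-ratio regression described above, using a random forest as the best performer; the file name and feature names are plausible team statistics, not the paper's actual dataset.

```python
# Random forest regression of team win ratio on season statistics.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("ncaa_team_stats.csv")  # hypothetical file
features = ["goals_per_game", "shots_per_game", "save_pct",
            "power_play_pct", "penalty_kill_pct", "faceoff_win_pct"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["win_ratio"], test_size=0.2, random_state=42)

model = RandomForestRegressor(n_estimators=300, random_state=42)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))

# Feature importances indicate which statistics move the win ratio most.
print(sorted(zip(model.feature_importances_, features), reverse=True))
```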
Procedia PDF Downloads 117
13220 Machine Learning Techniques to Develop Traffic Accident Frequency Prediction Models
Authors: Rodrigo Aguiar, Adelino Ferreira
Abstract:
Road traffic accidents are the leading cause of unnatural deaths and injuries worldwide, representing a significant road safety problem. In this context, the use of artificial intelligence with advanced machine learning techniques has gained prominence as a promising approach to predicting traffic accidents. This article investigates the application of machine learning algorithms to develop traffic accident frequency prediction models. Models are evaluated based on performance metrics, enabling a comparative analysis with traditional prediction approaches. The results suggest that machine learning can provide a powerful tool for accident prediction, which will contribute to more informed decisions regarding road safety.
Keywords: machine learning, artificial intelligence, frequency of accidents, road safety
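As an illustration of what such a frequency model can look like, the sketch below fits a Poisson regressor (a traditional count-model baseline) and a gradient-boosting learner to synthetic road-segment data and compares them on Poisson deviance; the features and data are invented, not the article's.

```python
# Accident-frequency sketch: classical Poisson baseline vs gradient boosting.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import PoissonRegressor
from sklearn.metrics import mean_poisson_deviance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
aadt = rng.uniform(0.1, 5.0, 500)         # traffic volume, 10^4 vehicles/day
length = rng.uniform(0.2, 5.0, 500)       # segment length, km
speed = rng.choice([0.5, 0.8, 1.0], 500)  # speed limit / 100 km/h
X = np.column_stack([aadt, length, speed])
y = rng.poisson(lam=0.8 * aadt * length)  # synthetic yearly crash counts

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (PoissonRegressor(max_iter=500), GradientBoostingRegressor(random_state=0)):
    model.fit(X_tr, y_tr)
    pred = np.clip(model.predict(X_te), 1e-6, None)  # deviance needs positives
    print(type(model).__name__, round(mean_poisson_deviance(y_te, pred), 3))
```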
Procedia PDF Downloads 89
13219 Optimizing Wind Turbine Blade Geometry for Enhanced Performance and Durability: A Computational Approach
Authors: Nwachukwu Ifeanyi
Abstract:
Wind energy is a vital component of the global renewable energy portfolio, with wind turbines serving as the primary means of harnessing this abundant resource. However, the efficiency and stability of wind turbines remain critical challenges in maximizing energy output and ensuring long-term operational viability. This study proposes a comprehensive approach utilizing computational aerodynamics and aeromechanics to optimize wind turbine performance across multiple objectives. The proposed research aims to integrate advanced computational fluid dynamics (CFD) simulations with structural analysis techniques to enhance the aerodynamic efficiency and mechanical stability of wind turbine blades. By leveraging multi-objective optimization algorithms, the study seeks to simultaneously optimize aerodynamic performance metrics such as lift-to-drag ratio and power coefficient while ensuring structural integrity and minimizing fatigue loads on the turbine components. Furthermore, the investigation will explore the influence of various design parameters, including blade geometry, airfoil profiles, and turbine operating conditions, on the overall performance and stability of wind turbines. Through detailed parametric studies and sensitivity analyses, valuable insights into the complex interplay between aerodynamics and structural dynamics will be gained, facilitating the development of next-generation wind turbine designs. Ultimately, this research endeavours to contribute to the advancement of sustainable energy technologies by providing innovative solutions to enhance the efficiency, reliability, and economic viability of wind power generation systems. The findings have the potential to inform the design and optimization of wind turbines, leading to increased energy output, reduced maintenance costs, and greater environmental benefits in the transition towards a cleaner and more sustainable energy future.
Keywords: computation, robotics, mathematics, simulation
Procedia PDF Downloads 60
13218 Analyzing Sociocultural Factors Shaping Architects’ Construction Material Choices: The Case of Jordan
Authors: Maiss Razem
Abstract:
The construction sector is considered a major consumer of materials, which undergo processes of extraction, processing, transportation, and maintenance when used in buildings. Several metrics have been devised to capture the environmental impact of the materials consumed during construction using lifecycle thinking. Rarely, however, has the materiality of this sector been explored qualitatively and systemically. This paper aims to explore the socio-cultural forces that drive the use of certain materials in the Jordanian construction industry, using practice theory as a heuristic method of analysis, more specifically Shove et al.'s three-element model. Drawing on semi-structured interviews with architects, the results unravel contextually embedded routines in determining the qualities of the three materialities highlighted herein: stone, glass, and spatial openness. The study highlights the inadequacy of using efficiency alone as a quantitative metric of sustainable materials and argues for the need to link material consumption with socio-economic, cultural, and aesthetic driving forces. The operationalization of practice theory by tracing materials' lifetimes as they integrate with competencies and meanings captures the dynamic engagements running through the analyzed routines of actors in construction practice. This study can offer policymakers a better-nuanced representation to green this sector beyond efficiency rhetoric and quantitative metrics.
Keywords: architects' practices, construction materials, Jordan, practice theory
Procedia PDF Downloads 170
13217 Comparing the Educational Effectiveness of eHealth to Deliver Health Knowledge between Higher Literacy Users and Lower Literacy Users
Authors: Yah-Ling Hung
Abstract:
eHealth is undoubtedly emerging as a promising vehicle for providing information for individual self-care management. However, the accessing ability, reading strategies, and navigating behavior of higher literacy users and lower literacy users are significantly different. Thus, tailoring eHealth to audiences' health literacy and developing appropriate content to meet their needs becomes a big challenge. The purpose of this study is to compare the educational effectiveness of eHealth in delivering health knowledge between higher literacy users and lower literacy users, thus establishing useful design strategies of eHealth for users with different levels of health literacy. The study was implemented in four stages, the first of which developed a website as the testing medium to introduce health care knowledge relating to children's allergies. Secondly, a reliability and validity test was conducted to make sure that all of the questions in the questionnaire were good indicators. Thirdly, a pre-post knowledge test was conducted with 66 participants, 33 users with higher literacy and 33 users with lower literacy. Finally, a usability evaluation survey was undertaken to explore the criteria used by users with different levels of health literacy to evaluate eHealth. The results demonstrated that the eHealth intervention had a positive outcome in both groups. There was no significant difference in the effectiveness of the eHealth intervention between users with higher literacy and users with lower literacy; however, the average mean of the lower literacy group was marginally higher than that of the higher literacy group. The findings also showed that the criteria used to evaluate eHealth can be analyzed in terms of the quality of information, appearance, appeal, and interaction, but users with lower literacy have different evaluation criteria from those with higher literacy. This is interdisciplinary research that proposes the sequential key steps incorporating the planning, development, and access issues that need to be considered when designing eHealth for patients with varying degrees of health literacy.
Keywords: eHealth, health intervention, health literacy, usability evaluation
Procedia PDF Downloads 142