Search results for: cost prediction
6977 Commuters Trip Purpose Decision Tree Based Model of Makurdi Metropolis, Nigeria and Strategic Digital City Project
Authors: Emmanuel Okechukwu Nwafor, Folake Olubunmi Akintayo, Denis Alcides Rezende
Abstract:
Decision tree models are versatile and interpretable machine learning algorithms widely used for both classification and regression tasks, which can be related to cities, whether physical or digital. The aim of this research is to assess how well decision tree algorithms can predict trip purposes in Makurdi, Nigeria, while also exploring their connection to the strategic digital city initiative. The research methodology involves formalizing household demographic and trip information datasets obtained from an extensive survey process. Modelling and prediction were carried out using the Python programming language, and evaluation metrics such as R-squared and mean absolute error (MAE) were used to assess the decision tree algorithm's performance. The results indicate that the model performed well, with accuracies of 84% and 68% and low MAE values of 0.188 and 0.314 on training and validation data, respectively. This suggests the model can be relied upon for future prediction. The conclusion reiterates that this model will assist decision-makers, including urban planners, transportation engineers, government officials, and commuters, in making informed decisions on transportation planning and management within the framework of a strategic digital city. Its application will enhance the efficiency, sustainability, and overall quality of transportation services in Makurdi, Nigeria.
Keywords: decision tree algorithm, trip purpose, intelligent transport, strategic digital city, travel pattern, sustainable transport
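The study's code is not included in the abstract; as a rough sketch of the workflow it describes (a scikit-learn decision tree fitted to household and trip attributes, evaluated with accuracy and MAE on training and validation splits), the following Python fragment may help. The feature names and synthetic data are assumptions, not the Makurdi survey dataset.

```python
# Minimal sketch of the described workflow (assumed feature names and synthetic data,
# not the Makurdi survey dataset).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, mean_absolute_error

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(1, 10, n),      # household size (assumed feature)
    rng.integers(18, 70, n),     # commuter age (assumed feature)
    rng.integers(0, 3, n),       # vehicle ownership (assumed feature)
])
y = rng.integers(0, 4, n)        # encoded trip purpose, e.g. work/school/shopping/other

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=42)
model = DecisionTreeClassifier(max_depth=6, random_state=42).fit(X_train, y_train)

for name, Xs, ys in [("training", X_train, y_train), ("validation", X_val, y_val)]:
    pred = model.predict(Xs)
    print(name, "accuracy:", round(accuracy_score(ys, pred), 3),
          "MAE:", round(mean_absolute_error(ys, pred), 3))
```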
Procedia PDF Downloads 29
6976 Role of Pulp Volume Method in Assessment of Age and Gender in Lucknow, India, an Observational Study
Authors: Anurag Tripathi, Sanad Khandelwal
Abstract:
Age and gender determination are required in forensics for victim identification. There is secondary dentine deposition throughout life, resulting in decreased pulp volume and size. Evaluation of pulp volume using Cone Beam Computed Tomography (CBCT) is a noninvasive method to estimate the age and gender of an individual. The study was done to evaluate the efficacy of the pulp volume method in the determination of age and gender. Aims/Objectives: The study was conducted to estimate age and determine sex by measuring tooth pulp volume with the help of CBCT. An observational study of one-year duration on CBCT data of individuals was conducted in Lucknow. Maxillary central incisors (CI) and maxillary canines (C) of the randomly selected samples were assessed for measurement of pulp volume using software. Statistical analysis: Chi-square test, arithmetic mean, standard deviation, Pearson's correlation, linear and logistic regression analysis. Results: The CBCT data of ninety individuals with an age range of 18-70 years were evaluated for the pulp volume of the central incisor and canine (CI & C). The Pearson correlation coefficient between tooth pulp volume (CI & C) and chronological age suggested that pulp volume decreased with age. The validation of the equations for sex determination showed higher prediction accuracy for CI (56.70%) and lower for C (53.30%). Conclusion: Pulp volume obtained from CBCT is a reliable indicator for age estimation and gender prediction.
Keywords: forensic, dental age, pulp volume, cone beam computed tomography
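For readers unfamiliar with the statistical workflow mentioned (Pearson correlation, linear regression for age, logistic regression for sex), a minimal illustrative sketch follows; the synthetic pulp-volume values and the assumed slope are placeholders, not the Lucknow CBCT measurements.

```python
# Illustrative sketch only: regression of age and sex on pulp volume using synthetic values.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(1)
age = rng.uniform(18, 70, 90)
pulp_volume = 40 - 0.3 * age + rng.normal(0, 3, 90)   # volume decreasing with age (assumed relation)
sex = rng.integers(0, 2, 90)                           # 0 = female, 1 = male (synthetic labels)

r, p = pearsonr(pulp_volume, age)
age_model = LinearRegression().fit(pulp_volume.reshape(-1, 1), age)
sex_model = LogisticRegression().fit(pulp_volume.reshape(-1, 1), sex)

print(f"Pearson r between volume and age = {r:.2f} (p = {p:.3g})")
print("predicted age at a volume of 25:", round(age_model.predict([[25]])[0], 1))
print("sex prediction accuracy:", round(sex_model.score(pulp_volume.reshape(-1, 1), sex), 3))
```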
Procedia PDF Downloads 106
6975 Performance Evaluation of Construction Projects by Earned Value Management Method, Using Primavera P6 – A Case Study in Istanbul, Turkey
Authors: Mohammad Lemar Zalmai, Osman Hurol Turkakin, Cemil Akcay, Ekrem Manisali
Abstract:
Most construction projects are exposed to time and cost overruns due to various factors, and this is a major problem. As a solution, the Earned Value Management (EVM) method is considered. EVM is a powerful and well-known method used in monitoring and controlling a project; it is a technique that project managers use to track the performance of their project against project baselines. By tracking the project, EVM gives an early indication of whether the project is delayed and whether it is over or under budget on any particular day. Thus, it helps to improve the management control system of a construction project, to detect and control problems in potential risk areas, and to demonstrate the importance and purpose of monitoring the construction work. This paper explains the main parameters of the EVM system involved in the calculation of time and cost for construction projects. In this study, the project management software Primavera P6 is used to manage the project monitoring process of a seven-storey (G+6) faculty building whose construction is in progress in Istanbul, Turkey. A comparison between the planned progress of construction activities and the actual progress is performed, and the analysis results are interpreted. This case study justifies the benefits of using EVM for project cash flow analysis and forecasting.
Keywords: earned value management (EVM), construction cost management, construction planning, Primavera P6, project management, project scheduling
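The main EVM parameters referred to above can be illustrated with a short worked sketch; the monetary figures are invented for demonstration and are not the Istanbul case-study values.

```python
# Hedged sketch of the core EVM indicators, with made-up figures (not the case-study values).
def evm_indicators(bcws, bcwp, acwp):
    """BCWS = planned value (PV), BCWP = earned value (EV), ACWP = actual cost (AC)."""
    sv = bcwp - bcws              # schedule variance
    cv = bcwp - acwp              # cost variance
    spi = bcwp / bcws             # schedule performance index (<1 means behind schedule)
    cpi = bcwp / acwp             # cost performance index (<1 means over budget)
    return {"SV": sv, "CV": cv, "SPI": round(spi, 3), "CPI": round(cpi, 3)}

# Example: at a given data date the project planned to spend 1.2M, earned 1.0M of
# budgeted work, and actually spent 1.1M (illustrative numbers only).
print(evm_indicators(bcws=1_200_000, bcwp=1_000_000, acwp=1_100_000))
```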
Procedia PDF Downloads 249
6974 Computational Fluid Dynamics Simulation of Reservoir for Dwell Time Prediction
Authors: Nitin Dewangan, Nitin Kattula, Megha Anawat
Abstract:
The hydraulic reservoir is a key component in mobile construction vehicles; most off-road earth-moving construction machinery requires large side hydraulic reservoirs. Reservoir construction is highly non-uniform, as designers use such designs to utilize the space available under the vehicle. There is no way to determine the reservoir's space utilization by oil, or the validity of the design, other than virtual simulation. Computational fluid dynamics (CFD) helps to predict reservoir space utilization through vortex mapping, path-line plots, and dwell time prediction, to make sure the design is valid and efficient for the vehicle. The dwell time acceptance criterion for effective reservoir design is 15 seconds. The paper describes a hydraulic reservoir simulation carried out with the CFD tool AcuSolve using an automated mesh strategy. Free-surface flow and a moving reference mesh are used to define the oil flow level inside the reservoir. The first baseline design was not able to meet the acceptance criterion, i.e., the dwell time fell below 15 seconds, because the oil entry and exit ports were very close together. CFD was used to redefine the port locations so that the oil dwell time in the reservoir increases. CFD also guided the baffle design for effective space utilization. The final design proposed through CFD analysis was used for physical validation on the machine.
Keywords: reservoir, turbulence model, transient model, level set, free-surface flow, moving frame of reference
Procedia PDF Downloads 156
6973 Seat Assignment Model for Student Admissions Process at Saudi Higher Education Institutions
Authors: Mohammed Salem Alzahrani
Abstract:
In this paper, the student admission process is studied to optimize the assignment of vacant seats with three main objectives: utilizing all vacant seats, satisfying all program-of-study admission requirements, and maintaining fairness among all candidates. The Seat Assignment Method (SAM) is used to build the model and solve the optimization problem with the help of the Northwest Corner Method and the Least Cost Method. A closed-form formula is derived for applying the priority of assigning a seat to a candidate based on SAM.
Keywords: admission process model, assignment problem, Hungarian Method, Least Cost Method, Northwest Corner Method, SAM
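As an illustration of how the Northwest Corner Method produces an initial feasible seat allocation, a small sketch follows; the seat capacities and candidate-group sizes are assumed values, not the actual admissions data.

```python
# Illustrative Northwest Corner Method for an initial feasible assignment, with
# assumed seat capacities and applicant-group demands (not the admissions dataset).
def northwest_corner(supply, demand):
    supply, demand = supply[:], demand[:]
    allocation = [[0] * len(demand) for _ in range(len(supply))]
    i = j = 0
    while i < len(supply) and j < len(demand):
        qty = min(supply[i], demand[j])      # assign as much as possible to cell (i, j)
        allocation[i][j] = qty
        supply[i] -= qty
        demand[j] -= qty
        if supply[i] == 0:
            i += 1                            # program i's vacant seats are exhausted
        else:
            j += 1                            # candidate group j is fully placed
    return allocation

seats_per_program = [30, 50, 20]             # vacant seats per program (assumed)
candidates_per_group = [40, 35, 25]          # qualified candidates per group (assumed)
for row in northwest_corner(seats_per_program, candidates_per_group):
    print(row)
```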
Procedia PDF Downloads 503
6972 In-Flight Aircraft Performance Model Enhancement Using Adaptive Lookup Tables
Authors: Georges Ghazi, Magali Gelhaye, Ruxandra Botez
Abstract:
Over the years, the Flight Management System (FMS) has experienced continuous improvement of its many features, to the point of becoming the pilot's primary interface for flight planning operations on the airplane. With the assistance of the FMS, the concepts of distance and time have been completely revolutionized, providing the crew members with the optimized route (or flight plan) from the departure airport to the arrival airport. To accomplish this function, the FMS needs an accurate Aircraft Performance Model (APM) of the aircraft. In general, the APMs that equip most modern FMSs are established before an individual aircraft enters service and result from the combination of a set of ordinary differential equations and a set of performance databases. Unfortunately, an aircraft in service is constantly exposed to dynamic loads that degrade its flight characteristics. These degradations have two main origins: airframe deterioration (control surface rigging, seals missing or damaged, etc.) and engine performance degradation (fuel consumption increase for a given thrust). Thus, after several years of service, the performance databases and the APM associated with a specific aircraft are no longer representative enough of the actual aircraft performance. It is important to monitor the trend of the performance deterioration and correct the uncertainties of the aircraft model in order to improve the accuracy of the flight management system predictions. The basis of this research lies in the new ability to continuously update an Aircraft Performance Model (APM) during flight using an adaptive lookup table technique. This methodology was developed and applied to the well-known Cessna Citation X business aircraft. For the purpose of this study, a level D Research Aircraft Flight Simulator (RAFS) was used as a test aircraft. According to the Federal Aviation Administration, level D is the highest certification level for flight dynamics modeling. Basically, using data available in the Flight Crew Operating Manual (FCOM), a first APM describing the variation of the engine fan speed and aircraft fuel flow with respect to flight conditions was derived. This model was then improved using the proposed methodology. To do so, several cruise flights were performed using the RAFS. An algorithm was developed to frequently sample the aircraft sensor measurements during the flight and compare the model predictions with the actual measurements. Based on these comparisons, a correction was applied to the APM in order to minimize the error between the predicted data and the measured data. In this way, as the aircraft flies, the APM is continuously enhanced, making the FMS more and more precise and the prediction of trajectories more realistic and more reliable. The results obtained are very encouraging. Indeed, using tables initialized with the FCOM data, only a few iterations were needed to reduce the fuel flow prediction error from an average relative error of 12% to 0.3%. Similarly, the FCOM prediction error for the engine fan speed was reduced from a maximum deviation of 5.0% to 0.2% after only ten flights.
Keywords: aircraft performance, cruise, trajectory optimization, adaptive lookup tables, Cessna Citation X
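A minimal sketch of the adaptive lookup-table idea is shown below: a fuel-flow table indexed by altitude and Mach number is nudged toward in-flight measurements. The grid, seed values, gain, and measurements are assumptions for illustration, not the Citation X data or the authors' exact update law.

```python
# Sketch of an adaptive lookup-table correction (assumed grid, seed values and gain).
import numpy as np

altitudes = np.array([30_000, 35_000, 40_000])        # ft (assumed grid)
machs = np.array([0.70, 0.75, 0.80])                  # Mach number (assumed grid)
fuel_flow_table = np.array([[1450, 1500, 1580],       # lb/h, FCOM-style seed values (assumed)
                            [1350, 1400, 1470],
                            [1280, 1320, 1390]], dtype=float)

def update_table(table, alt_idx, mach_idx, measured, gain=0.2):
    """Move the table entry a fraction 'gain' of the way toward the in-flight measurement."""
    error = measured - table[alt_idx, mach_idx]
    table[alt_idx, mach_idx] += gain * error
    return error

# Simulated repeated cruise samples at 35,000 ft / Mach 0.75 (assumed measurements).
for measured in [1435, 1442, 1438, 1440]:
    err = update_table(fuel_flow_table, 1, 1, measured)
    print(f"measured {measured} lb/h, residual error {err:+.1f} lb/h")
print("corrected table entry:", round(fuel_flow_table[1, 1], 1))
```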
Procedia PDF Downloads 267
6971 Development of PM2.5 Forecasting System in Seoul, South Korea Using Chemical Transport Modeling and ConvLSTM-DNN
Authors: Ji-Seok Koo, Hee‑Yong Kwon, Hui-Young Yun, Kyung-Hui Wang, Youn-Seo Koo
Abstract:
This paper presents a forecasting system for PM2.5 levels in Seoul, South Korea, leveraging a combination of chemical transport modeling and ConvLSTM-DNN machine learning technology. Exposure to PM2.5 has known detrimental impacts on public health, making its prediction crucial for establishing preventive measures. Existing forecasting models, like the Community Multiscale Air Quality (CMAQ) and Weather Research and Forecasting (WRF) models, are hindered by their reliance on uncertain input data, such as anthropogenic emissions and meteorological patterns, as well as certain intrinsic model limitations. The system we have developed specifically addresses these issues by integrating machine learning and using carefully selected input features that account for local and distant sources of PM2.5. In South Korea, the PM2.5 concentration is greatly influenced by both local emissions and long-range transport from China, and our model effectively captures these spatial and temporal dynamics. Our PM2.5 prediction system combines the strengths of advanced hybrid machine learning algorithms, ConvLSTM and DNN, to improve upon the limitations of the traditional CMAQ model. Data used in the system include forecast information from the CMAQ and WRF models, along with actual PM2.5 concentration and weather variable data from monitoring stations in China and South Korea. The system was implemented specifically for Seoul's PM2.5 forecasting.
Keywords: PM2.5 forecast, machine learning, ConvLSTM, DNN
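A hedged sketch of a ConvLSTM + DNN architecture of the kind described is given below using TensorFlow/Keras; the layer sizes, grid shape, and feature count are assumptions, not the study's configuration.

```python
# Illustrative ConvLSTM + DNN architecture (assumed shapes and layer sizes).
import tensorflow as tf

timesteps, rows, cols, channels = 24, 16, 16, 6   # e.g. 24 hourly frames of gridded CMAQ/WRF fields (assumed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(timesteps, rows, cols, channels)),
    tf.keras.layers.ConvLSTM2D(32, kernel_size=(3, 3), padding="same",
                               return_sequences=False),    # spatio-temporal encoder
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),         # DNN head
    tf.keras.layers.Dense(1)                               # next-day PM2.5 concentration at the target site
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.summary()
```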
Procedia PDF Downloads 60
6970 Mobile Devices and E-Learning Systems as a Cost-Effective Alternative for Digitizing Paper Quizzes and Questionnaires in Social Work
Authors: K. Myška, L. Pilařová
Abstract:
The article deals with the possibilities of using cheap mobile devices in combination with free or open-source software tools as an alternative to professional hardware and software equipment. Especially in social work, it is important to find a cheap yet functional solution that can compete with complex but expensive solutions for digitizing paper materials. Our research focused on the analysis of cheap and affordable solutions for digitizing the paper materials most frequently used by field workers in social work. We used comparative analysis as the research method. Social workers need to process data from paper forms quite often, and in many cases it is still more affordable, time-efficient, and cost-effective to use paper forms to get feedback. Collecting data from paper quizzes and questionnaires can be done with the help of professional scanners and software. These technologies are very powerful and have advanced options for digitizing and processing digitized data, but they are also very expensive. According to the results of our study, the combination of open-source software and a mobile phone or cheap scanner can be considered a cost-effective alternative to professional equipment.
Keywords: digitalization, e-learning, mobile devices, questionnaire
Procedia PDF Downloads 155
6969 RFID Based Indoor Navigation with Obstacle Detection Based on A* Algorithm for the Visually Impaired
Authors: Jayron Sanchez, Analyn Yumang, Felicito Caluyo
Abstract:
A visually impaired individual may use a cane, a guide dog, or ask for assistance from another person. This study implemented RFID technology consisting of a low-cost RFID reader and passive RFID tag cards. The passive RFID tag cards served as checkpoints for the visually impaired user, who was guided through audio output from the system while traversing the path. The study used an ultrasonic sensor to detect static obstacles, and the system generated an alternate path based on the A* algorithm to avoid them. Alternate paths were also generated in case the visually impaired user traversed outside the intended path to the destination. The A* algorithm generated the shortest path to the destination by calculating the total cost of movement and then selecting the tag card with the smallest movement cost as the successor to the current tag card. Several trials were conducted to determine the effect of obstacles on the traversal time of the visually impaired user. A dependent-sample t-test was applied for the statistical analysis of the study. Based on the analysis, obstacles along the path generated delays while the alternate path was being requested, because of the delay in transmission from the laptop to the device via ZigBee modules.
Keywords: A* algorithm, RFID technology, ultrasonic sensor, ZigBee module
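To make the path-planning step concrete, here is an illustrative A* search on a small grid of checkpoint positions with an obstacle cell; the grid and unit movement costs are assumptions for demonstration only.

```python
# Illustrative A* search on a toy grid of checkpoint positions (assumed layout and costs).
import heapq

def astar(grid, start, goal):
    def h(a, b):                       # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])
    open_set = [(h(start, goal), 0, start, [start])]
    seen = set()
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]:
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                # g + 1 is the movement cost; the candidate with the smallest f = g + h is expanded next
                heapq.heappush(open_set, (g + 1 + h((nr, nc), goal), g + 1, (nr, nc), path + [(nr, nc)]))
    return None

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],     # 1 marks a cell blocked by an obstacle detected by the ultrasonic sensor
        [0, 0, 0, 0]]
print(astar(grid, start=(0, 0), goal=(2, 3)))
```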
Procedia PDF Downloads 413
6968 Integrating Building Information Modeling into Facilities Management Operations
Authors: Mojtaba Valinejadshoubi, Azin Shakibabarough, Ashutosh Bagchi
Abstract:
Facilities such as residential buildings, office buildings, and hospitals house a large density of occupants. Therefore, a low-cost facility management program (FMP) should be used to provide a satisfactory built environment for these occupants. Facility management (FM) has recently been treated as a critical task in building projects and has been effective in reducing the operation and maintenance costs of these facilities. Information integration and visualization capabilities are critical for reducing the complexity and cost of FM. Building information modeling (BIM) can be used as a strong visual modeling tool and database in FM. The main objective of this study is to examine the applicability of BIM in the FM process during a building's operational phase. For this purpose, a seven-storey office building was modeled in Autodesk Revit software. The authors integrated a cloud-based environment using the visual programming tool Dynamo to provide real-time cloud-based communication between the facility managers and the participants involved in the project. An appropriate and effective integrated data source and visual model such as BIM can reduce a building's operational and maintenance costs by managing the building life cycle properly.
Keywords: building information modeling, facility management, operational phase, building life cycle
Procedia PDF Downloads 157
6967 An Improvement of ComiR Algorithm for MicroRNA Target Prediction by Exploiting Coding Region Sequences of mRNAs
Authors: Giorgio Bertolazzi, Panayiotis Benos, Michele Tumminello, Claudia Coronnello
Abstract:
MicroRNAs are small non-coding RNAs that post-transcriptionally regulate the expression levels of messenger RNAs. MicroRNA regulation activity depends on the recognition of binding sites located on mRNA molecules. ComiR (Combinatorial miRNA targeting) is a user-friendly web tool developed to predict the targets of a set of microRNAs, starting from their expression profile. ComiR incorporates miRNA expression in a thermodynamic binding model, and it associates each gene with the probability of being a target of a set of miRNAs. The ComiR algorithms were trained with information regarding binding sites in the 3'UTR region, using a reliable dataset containing the targets of endogenously expressed microRNAs in D. melanogaster S2 cells. This dataset was obtained by comparing the results from two different experimental approaches, i.e., inhibition and immunoprecipitation of the AGO1 protein; this protein is a component of the microRNA-induced silencing complex. In this work, we tested whether including coding region binding sites in the ComiR algorithm improves the performance of the tool in predicting microRNA targets. We focused the analysis on the D. melanogaster species and updated the underlying ComiR database with the currently available releases of mRNA and microRNA sequences. As a result, we find that the ComiR algorithm trained with the information related to the coding regions is more efficient in predicting microRNA targets than the algorithm trained with 3'UTR information. On the other hand, we show that 3'UTR-based predictions can be seen as complementary to the coding-region-based predictions, which suggests that both predictions, from the 3'UTR and coding regions, should be considered in a comprehensive analysis. Furthermore, we observed that the lists of targets obtained by analyzing data from only one experimental approach, that is, inhibition or immunoprecipitation of AGO1, are not reliable enough to test the performance of our microRNA target prediction algorithm. Further analysis will be conducted to investigate the effectiveness of the tool with data from other species, provided that validated datasets, as obtained from the comparison of RISC protein inhibition and immunoprecipitation experiments, become available for the same samples. Finally, we propose to upgrade the existing ComiR web tool by including the coding-region-based trained model, available together with the 3'UTR-based one.
Keywords: AGO1, coding region, Drosophila melanogaster, microRNA target prediction
Procedia PDF Downloads 455
6966 Investigating the Effect of Refinancing on Financial Behaviour of Energy Efficiency Projects
Authors: Zohreh Soltani, Seyedmohammadhossein Hosseinian
Abstract:
Reduction of energy consumption in built infrastructure, through the installation of energy-efficient technologies, is a major approach to achieving sustainability. In practice, the viability of energy efficiency projects strongly depends on cost reimbursement and profitability. These projects are subject to failure if the actual cost savings do not reimburse the project cost in a timely manner. In such cases, refinancing could be a solution for benefiting from the long-term returns of the project, if implemented wisely. However, very little is known about the effect of refinancing options on the financial performance of energy efficiency projects. To fill this gap, the present study investigates the financial behavior of energy efficiency projects with a focus on refinancing options, such as leveraged loans. A System Dynamics (SD) model is introduced, and the model application is presented using data from an actual case study. The case study results indicate that while high-interest start-up financing makes the use of a leveraged loan inevitable, refinancing can rescue the project and bring about profitability. This paper also presents some managerial implications of refinancing energy efficiency projects based on the case-study analysis. The results of this study help in implementing financially viable energy efficiency projects, so that the community can benefit widely from their environmental advantages.
Keywords: energy efficiency projects, leveraged loan, refinancing, sustainability
Procedia PDF Downloads 394
6965 Towards End-To-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated high ability in discriminating various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequence reads from the fragmented DNA sequences, stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time consuming, and rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings, which create a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrated that this original approach reaches high performance, comparable with the state-of-the-art methods applied to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, the DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine
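Steps (i) and (ii) can be illustrated with a toy sketch that splits reads into k-mers and learns k-mer embeddings with word2vec (gensim); the reads and hyperparameters are assumptions, not the metagenome2vec configuration.

```python
# Toy sketch of k-mer tokenization and k-mer/read embeddings (assumed reads and hyperparameters).
from gensim.models import Word2Vec

def kmerize(read, k=4):
    """Split a read into overlapping k-mers, treating the read as a 'sentence' of k-mer 'words'."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

reads = ["ACGTGACCTGAAC", "TTGACCGTAACGT", "ACGTGACCGTTAG"]   # toy reads, not real fastq data
sentences = [kmerize(r) for r in reads]

model = Word2Vec(sentences, vector_size=16, window=3, min_count=1, sg=1, epochs=50)

# A simple mean-pooled read embedding built from its k-mer vectors.
read_embedding = sum(model.wv[kmer] for kmer in sentences[0]) / len(sentences[0])
print(read_embedding[:5])
```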
Procedia PDF Downloads 130
6964 Comparative Analysis of Motor Insurance Claims Using Machine Learning
Authors: Francis Kwame Bukari, Maclean Acheampong Yeboah
Abstract:
From collective hunting to contemporary financial markets, the concept of risk sharing in insurance has evolved significantly. In today's insurance landscape, statistical analysis plays a pivotal role in determining premiums and assessing the likelihood of insurance claims. Accurately estimating motor insurance claims remains a challenge; underestimation forces insurance companies to draw heavily on their funds to cover claims, which in the long run affects their reserves and impacts their profitability. Advanced machine learning algorithms can enhance accuracy and profitability. The primary objectives of this study encompassed the prediction of motor insurance claims through the utilization of Artificial Neural Networks (ANN) and Random Forest (RF). Additionally, a comparative analysis was conducted to assess the performance of these two models in the domain of claim prediction. The study drew upon secondary data derived from motor insurance claims, employing a range of techniques, including data preprocessing, model training, and model evaluation. To mitigate potential biases, a random over-sampler was used to balance the target variable within the preprocessed dataset. The Random Forest model outperformed the ANN model, achieving an accuracy rate of 90.33% compared to the ANN model's accuracy of 86.33%. This study highlights the importance of modern data-driven approaches in enhancing accuracy and profitability in the insurance industry.
Keywords: risk, insurance claims, artificial neural network, random forest, over-sampler, profitability
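A sketch of this kind of comparison on synthetic claims data (not the study's portfolio) is shown below, using scikit-learn models and imbalanced-learn's RandomOverSampler.

```python
# Illustrative RF vs. ANN comparison with random over-sampling on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score
from imblearn.over_sampling import RandomOverSampler

X, y = make_classification(n_samples=2000, n_features=10, weights=[0.85, 0.15], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
X_res, y_res = RandomOverSampler(random_state=0).fit_resample(X_train, y_train)  # balance claim/no-claim classes

models = {
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "ANN (MLP)": MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0),
}
for name, clf in models.items():
    clf.fit(X_res, y_res)
    print(name, "accuracy:", round(accuracy_score(y_test, clf.predict(X_test)), 4))
```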
Procedia PDF Downloads 12
6963 Evaluation of Turbulence Prediction over Washington, D.C.: Comparison of DCNet Observations and North American Mesoscale Model Outputs
Authors: Nebila Lichiheb, LaToya Myles, William Pendergrass, Bruce Hicks, Dawson Cagle
Abstract:
Atmospheric transport of hazardous materials in urban areas is increasingly under investigation due to the potential impact on human health and the environment. In response to health and safety concerns, several dispersion models have been developed to analyze and predict the dispersion of hazardous contaminants. The models of interest usually rely on meteorological information obtained from the meteorological models of NOAA's National Weather Service (NWS). However, due to the complexity of the urban environment, NWS forecasts provide an inadequate basis for dispersion computation in urban areas. A dense meteorological network in Washington, D.C., called DCNet, has been operated by NOAA since 2003 to support the development of urban monitoring methodologies and to provide the driving meteorological observations for atmospheric transport and dispersion models. This study focuses on the comparison of wind observations from the DCNet station on the U.S. Department of Commerce Herbert C. Hoover Building against North American Mesoscale (NAM) model outputs for the period 2017-2019. The goal is to develop a simple methodology for modifying NAM outputs so that the dispersion requirements of the city and its urban area can be satisfied. This methodology will allow us to quantify the prediction errors of the NAM model and propose adjustments to key variables controlling dispersion model calculations.
Keywords: meteorological data, Washington D.C., DCNet data, NAM model
Procedia PDF Downloads 237
6962 Optimizing Scribe Resourcing to Improve Hospitalist Workloads
Authors: Ahmed Hamzi, Bryan Norman
Abstract:
Having scribes help document patient records in electronic health record systems can improve hospitalists' productivity, but hospitals need to determine the optimum number of scribes to hire in order to maximize scribe cost-effectiveness. Scribe attendance uncertainty due to planned and unplanned absences is a primary challenge. This paper presents simulation and analytical models to determine the optimum number of scribes for a hospital to hire. Scribe staffing practices vary from one context to another; different staffing scenarios are considered in which having extra attending scribes does or does not provide additional value, and in which on-call scribes are utilized to fill in for potentially absent scribes. These staffing scenarios are assessed for different scribe revenue ratios (the ratio of the value of the scribe relative to scribe costs) ranging from 100% to 300%. The optimum solution depends on the absenteeism rate, revenue ratio, and desired service level. The analytical model obtains solutions more easily and faster than the simulation model, but the simulation model is more accurate. Therefore, the analytical model's solutions are compared with the simulation model's solutions regarding both the number of scribes hired and cost-effectiveness. Additionally, an Excel tool has been developed to help decision-makers easily obtain solutions using the analytical model.
Keywords: hospitalists, workload, optimization cost, economic analysis
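A hedged sketch of the kind of staffing simulation described is given below: for a given number of hired scribes, daily attendance is simulated and the expected net value per day is estimated. The absence rate, revenue ratio, and demand figures are illustrative assumptions, not the paper's parameters.

```python
# Toy staffing-to-absenteeism simulation (assumed absence rate, revenue ratio and demand).
import numpy as np

rng = np.random.default_rng(0)

def expected_net_value(hired, demand=10, absence_rate=0.08, scribe_cost=1.0,
                       revenue_ratio=2.0, days=100_000):
    attendance = rng.binomial(hired, 1 - absence_rate, size=days)   # scribes who show up each day
    useful = np.minimum(attendance, demand)                         # attendees beyond demand add no value
    value = useful * scribe_cost * revenue_ratio                    # value generated by working scribes
    cost = hired * scribe_cost                                      # all hired scribes are paid
    return (value - cost).mean()

for hired in range(8, 14):
    print(hired, "scribes -> expected daily net value:", round(expected_net_value(hired), 3))
```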
Procedia PDF Downloads 50
6961 Prediction of Slaughter Body Weight in Rabbits: Multivariate Approach through Path Coefficient and Principal Component Analysis
Authors: K. A. Bindu, T. V. Raja, P. M. Rojan, A. Siby
Abstract:
The multivariate path coefficient approach was employed to study the effects of various production and reproduction traits on the slaughter body weight of rabbits. Information on 562 rabbits maintained at the university rabbit farm attached to the Centre for Advanced Studies in Animal Genetics and Breeding, Kerala Veterinary and Animal Sciences University, Kerala State, India, was utilized. The manifest variables used in the study were age and weight of the dam, birth weight, litter size at birth and weaning, and weight at the first, second, and third months. Linear multiple regression analysis was performed with the slaughter weight as the dependent variable and the remaining traits as independent variables. The model explained 48.60 percent of the total variation present in the market weight of the rabbits. Even though the model was significant, the standardized beta coefficients for the independent variables, viz., age and weight of the dam, birth weight, and litter sizes at birth and weaning, were less than one, indicating their negligible influence on the slaughter weight. However, the standardized beta coefficient of the second-month body weight was the largest, followed by that of the first-month weight, indicating their major role in determining the market weight. All the other factors exert their influence only indirectly, through these two variables. Hence it was concluded that the slaughter body weight can be predicted using the first- and second-month body weights. Principal components were also developed so as to achieve more accuracy in the prediction of the market weight of rabbits.
Keywords: component analysis, multivariate, slaughter, regression
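The standardized beta (path) coefficients mentioned above can be illustrated with a short sketch; the synthetic data loosely mimic the reported pattern (second-month weight dominating) but are not the farm dataset.

```python
# Illustrative computation of standardized beta coefficients on synthetic growth data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 562
birth_wt = rng.normal(55, 8, n)                        # assumed scale, grams
wt_m1 = 3 * birth_wt + rng.normal(300, 40, n)          # first-month weight (synthetic relation)
wt_m2 = 1.8 * wt_m1 + rng.normal(200, 60, n)           # second-month weight (synthetic relation)
slaughter_wt = 0.4 * wt_m1 + 1.6 * wt_m2 + rng.normal(0, 120, n)

X = np.column_stack([birth_wt, wt_m1, wt_m2])
Xz = StandardScaler().fit_transform(X)                 # standardize predictors
yz = (slaughter_wt - slaughter_wt.mean()) / slaughter_wt.std()
betas = LinearRegression().fit(Xz, yz).coef_           # standardized (path) coefficients
for name, b in zip(["birth weight", "month-1 weight", "month-2 weight"], betas):
    print(f"{name}: standardized beta = {b:.2f}")
```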
Procedia PDF Downloads 169
6960 A Study on Numerical Modelling of Rigid Pavement: Temperature and Thickness Effect
Authors: Amin Chegenizadeh, Mahdi Keramatikerman, Hamid Nikraz
Abstract:
Pavement engineering plays a significant role in developing cost-effective and efficient highway and road networks. In general, pavement structures are categorized into two core groups, namely flexible and rigid pavements. There are various benefits to the application of rigid pavement; for instance, it has a longer life and lower maintenance costs in comparison with flexible pavement. In rigid pavement design, temperature and thickness are two effective parameters that can widely affect the total cost of the project. In this study, numerical modeling using Kenpave-Kenslab was performed to investigate the effect of these two important parameters on the rigid pavement.
Keywords: rigid pavement, Kenpave, Kenslab, thickness, temperature
Procedia PDF Downloads 377
6959 Prediction Factor of Recurrence Supraventricular Tachycardia After Adenosine Treatment in the Emergency Department
Authors: Welawat Tienpratarn, Chaiyaporn Yuksen, Rungrawin Promkul, Chetsadakon Jenpanitpong, Pajit Bunta, Suthap Jaiboon
Abstract:
Supraventricular tachycardia (SVT) is an abnormally fast atrial tachycardia characterized by a narrow (≤ 120 ms) and constant QRS. Adenosine is the drug of choice; the first dose is 6 mg, and it can be repeated with second and third doses of 12 mg, with greater than 90% success. Previous work found that patients observed for 4 hours after return to normal sinus rhythm had no recurrence within 24 hours. The objective of this study was to investigate the factors that influence the recurrence of SVT after adenosine in the emergency department (ED). The study was conducted as a retrospective, exploratory, prognostic study at the Emergency Department (ED) of the Faculty of Medicine, Ramathibodi Hospital, a university-affiliated super-tertiary care hospital in Bangkok, Thailand, over a ten-year period between 2010 and 2020. The inclusion criteria were age > 15 years, visiting the ED with SVT, and treatment with adenosine. Recurrence of SVT in the ED was recorded for those patients. A multivariable logistic regression model was used to develop the predictive model and prediction score for recurrent PSVT. 264 patients met the study criteria; of those, 24 patients (10%) had recurrent PSVT. Five independent factors were predictive of recurrent PSVT, including age > 65 years, heart rate (after adenosine) > 100 per minute, structural heart disease, and the dose of adenosine. The clinical risk score developed to predict recurrent PSVT had an accuracy of 74.41%. A score of > 6 had a likelihood ratio for recurrent PSVT of 5.71. A clinical predictive score of > 6 was associated with recurrent PSVT in the ED.
Keywords: supraventricular tachycardia, recurrence, emergency department, adenosine
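As an illustration of how an additive risk score can be derived from a multivariable logistic regression, a sketch follows; the synthetic predictors, coefficients, point weights, and cut-off are assumptions for demonstration, not the validated Ramathibodi score.

```python
# Illustrative derivation of an additive clinical risk score from logistic regression (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 264
X = np.column_stack([
    (rng.uniform(15, 90, n) > 65).astype(int),     # age > 65 years
    (rng.uniform(60, 140, n) > 100).astype(int),   # heart rate after adenosine > 100/min
    rng.integers(0, 2, n),                         # structural heart disease
    rng.integers(0, 2, n),                         # repeated adenosine dosing (assumed predictor)
])
# Synthetic outcome generated so the predictors carry signal (not real patient data).
logits = -2.0 + 1.2 * X[:, 0] + 0.9 * X[:, 1] + 1.0 * X[:, 2] + 0.6 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

model = LogisticRegression().fit(X, y)
points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)  # scale betas to integer points
score = X @ points
print("point weights per predictor:", points)
print("patients above an assumed cut-off of 6:", int((score > 6).sum()))
```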
Procedia PDF Downloads 122
6958 Evaluation of a Staffing to Workload Tool in a Multispecialty Clinic Setting
Authors: Kristin Thooft
Abstract:
Increasing pressure to manage healthcare costs has resulted in shifting care towards ambulatory settings and is driving a focus on cost transparency. Few nurse staffing-to-workload models have been developed for ambulatory settings, and fewer still for multi-specialty clinics. Of the existing models, few have been evaluated against outcomes to understand any impact. This evaluation took place after the AWARD model for nurse staffing to workload was implemented in a multi-specialty clinic at a regional healthcare system in the Midwest. The multi-specialty clinic houses 26 medical and surgical specialty practices, and the AWARD model was implemented in two specialty practices in October 2020. Donabedian's Structure-Process-Outcome (SPO) model was used to evaluate outcomes based on changes to the structure and processes of care provided. The AWARD model defined and quantified the processes and recommended changes in the structure of day-to-day nurse staffing. Cost of care per patient visit, total visits, and total nurse-performed visits were used as structural and process measures influencing the outcomes of cost of care and access to care. Independent t-tests were used to compare the difference in variables pre- and post-implementation. The SPO model was useful as an evaluation tool, providing a simple framework that is understood by a diverse care team. No statistically significant changes in the cost of care, total visits, or nurse visits were observed, but there were differences: the cost of care increased and access to care decreased. Two weeks into the post-implementation period, the multi-specialty clinic paused all non-critical patient visits due to a second surge of the COVID-19 pandemic, and clinic nursing staff were re-allocated to support the inpatient areas. This negatively impacted the ability of the Nurse Manager to fully utilize the AWARD model to plan daily staffing. The SPO framework could be used for ongoing assessment of nurse staffing performance, and additional variables could be measured, giving a more complete picture of the impact of nurse staffing. Going forward, there must be a continued focus on the outcomes of care and the value of nursing.
Keywords: ambulatory, clinic, evaluation, outcomes, staffing, staffing model, staffing to workload
Procedia PDF Downloads 177
6957 Multifluid Computational Fluid Dynamics Simulation for Sawdust Gasification inside an Industrial Scale Fluidized Bed Gasifier
Authors: Vasujeet Singh, Pruthiviraj Nemalipuri, Vivek Vitankar, Harish Chandra Das
Abstract:
For the correct prediction of thermal and hydraulic performance (bed voidage, suspension density, pressure drop, heat transfer, and combustion kinetics), one should incorporate the correct parameters in the computational fluid dynamics simulation of a fluidized bed gasifier. Given the scarcity of fossil fuels and the need to fulfill the energy demand of an increasing population, researchers need to shift their attention to alternatives to fossil fuels. The current research work focuses on the hydrodynamic behavior and gasification of sawdust inside a 2D industrial-scale FBG using the Eulerian-Eulerian multifluid model. The present numerical model is validated with experimental data. Further, this model is extended to predict the gasification characteristics of sawdust by incorporating eight heterogeneous reactions (moisture release, volatile cracking, tar cracking, tar oxidation, char combustion, CO₂ gasification, steam gasification, and the methanation reaction) and five homogeneous reactions (oxidation of CO, CH₄, and H₂, and the forward and backward water gas shift (WGS) reactions). In the results section, the composition of the gasification products is analyzed, along with the hydrodynamics of the sawdust and sand phases, the heat transfer between the gas, sand, and sawdust, and the reaction rates of the different homogeneous and heterogeneous reactions along the height of the domain.
Keywords: devolatilization, Eulerian-Eulerian, fluidized bed gasifier, mathematical modelling, sawdust gasification
Procedia PDF Downloads 109
6956 Efficacy Study of Post-Tensioned I Girder Made of Ultra-High Performance Fiber Reinforced Concrete and Ordinary Concrete for IRC Loading
Authors: Ayush Satija, Ritu Raj
Abstract:
Escalating demand for elevated structures as a remedy for traffic congestion has led to a surge in the construction of viaducts and bridges predominantly employing prestressed beams. Post-tensioned I-girder superstructures in particular are gaining traction for attributes such as structural efficiency, cost-effectiveness, and ease of construction. Recently, ultra-high-performance fiber-reinforced concrete (UHPFRC) has emerged as a revolutionary material that is reshaping conventional infrastructure engineering. UHPFRC offers exceptional properties, including high compressive and tensile strength alongside enhanced durability. Its adoption in bridges yields benefits, notably a remarkable strength-to-weight ratio enabling the design of lighter and more slender structural elements, enhancing functionality and sustainability. Despite its myriad advantages, the integration of UHPFRC in construction is still evolving, hindered by factors such as cost, material availability, and design standardization. Consequently, there is a need to assess the feasibility of substituting ordinary concrete (OC) with UHPFRC in bridges, focusing on economic considerations. This research undertakes an efficacy study comparing post-tensioned I-girders fabricated from UHPFRC and OC, evaluating the cost parameters associated with concrete production, reinforcement, and erection. The study reveals that UHPFRC becomes economically viable for spans exceeding 40.0 m. This shift in cost-effectiveness is attributed to factors such as reduced girder depth, elimination of un-tensioned steel, diminished need for shear reinforcement, and decreased erection costs.
Keywords: post-tensioned I girder, superstructure, ultra-high-performance fiber reinforced concrete, ordinary concrete
Procedia PDF Downloads 52
6955 Quantifying the Impact of Climate Change on Agritourism: The Transformative Role of Solar Energy in Enhancing Growth and Resilience in Eritrea
Authors: Beyene Daniel Abrha
Abstract:
Agritourism in Eritrea is increasingly threatened by climate change, manifesting through rising temperatures, shifting rainfall patterns, and resource scarcity. This study employs quantitative methods to assess the economic and environmental impacts of climate change on agritourism, utilizing metrics such as annual income fluctuations, changes in visitor numbers, and energy consumption patterns. The methodology relies on secondary data sourced from the World Bank, government reports, and academic publications to analyze the economic viability of integrating solar energy into agritourism operations. Key variables include the Benefits from Renewable Energy (BRE), encompassing cost savings from reduced energy expenses and the monetized value of avoided greenhouse gas emissions. Using a net present value (NPV) framework, the research compares the impact of solar energy against traditional fossil fuel sources by evaluating the value of reduced greenhouse gas (CO2) emissions and the Value of Health-Related Costs (VHRC) due to air pollution. The preliminary findings indicate that the adoption of solar energy can enhance energy independence by up to 40%, reduce operational costs by 25%, and stabilize agritourism activities in climate-sensitive regions. This research aims to provide actionable insights for policymakers and stakeholders, supporting the sustainable development of agritourism in Eritrea and contributing to broader climate adaptation strategies. By employing a comprehensive cost-benefit analysis, the study highlights the economic advantages and environmental benefits of transitioning to renewable energy in the face of climate change.
Keywords: agritourism, climate change, renewable energy, cost-benefit analysis, resilience
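A simple NPV-style sketch of the cost-benefit framework described (energy savings plus monetized avoided CO2 and avoided health costs weighed against the solar investment) is shown below; all figures are illustrative assumptions, not Eritrean data.

```python
# Toy NPV cost-benefit calculation for a solar investment (all figures assumed for illustration).
def npv(cashflows, rate):
    """Net present value of a list of yearly cashflows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

years = 20
solar_capex = 50_000                 # upfront investment (assumed)
annual_energy_savings = 6_000        # reduced diesel/grid expense (assumed)
annual_co2_value = 1_200             # monetized avoided CO2 emissions (assumed)
annual_health_value = 800            # avoided health-related costs, VHRC (assumed)
discount_rate = 0.08

bre = annual_energy_savings + annual_co2_value + annual_health_value   # Benefits from Renewable Energy
cashflows = [-solar_capex] + [bre] * years
print("NPV of switching to solar:", round(npv(cashflows, discount_rate)))
```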
Procedia PDF Downloads 27
6954 Practical Method for Failure Prediction of Mg Alloy Sheets during Warm Forming Processes
Authors: Sang-Woo Kim, Young-Seon Lee
Abstract:
An important concern in metal forming, even at elevated temperatures, is whether a desired deformation can be accomplished without any failure of the material. A detailed understanding of the critical condition for crack initiation provides not only the workability limit of a material but also a guideline for process design. This paper describes the utilization of ductile fracture criteria in conjunction with the finite element method (FEM) for predicting the onset of fracture in warm metal working processes of magnesium alloy sheets. Critical damage values for various ductile fracture criteria were determined from uniaxial tensile tests and were expressed as functions of strain rate and temperature. In order to find the best criterion for failure prediction, Erichsen cupping tests under isothermal conditions and FE simulations combined with ductile fracture criteria were carried out. Based on the plastic deformation histories obtained from the FE analyses of the Erichsen cupping tests and the critical damage value curves, the initiation time and location of fracture were predicted under a bi-axial tensile condition. The results were compared with experimental results, and the best criterion was recommended. In addition, the proposed methodology was used to predict the onset of fracture in non-isothermal deep drawing processes using an irregularly shaped blank, and the results were verified experimentally.
Keywords: magnesium, AZ31 alloy, ductile fracture, FEM, sheet forming, Erichsen cupping test
Procedia PDF Downloads 379
6953 Optimizing Inanda Dam Using Water Resources Models
Authors: O. I. Nkwonta, B. Dzwairo, J. Adeyemo, A. Jaiyola, N. Sawyerr, F. Otieno
Abstract:
The effective management of water resources is of great importance to ensure the supply of water resources to support changing water requirements over a selected planning horizon in a sustainable and cost-effective way. Essentially, the purpose of the water resources planning process is to balance the available water resources in a system with the water requirements and losses to which the system is subjected. In such situations, a water resources yield and planning model can be used to overcome those difficulties. It has an advantage over other models in managing model runs, developing a representative system network, modelling incremental sub-catchments, creating a variety of standard system features, offering special modelling features, and providing run-result output options.
Keywords: complex, water resources, planning, cost effective and management
Procedia PDF Downloads 577
6952 Suboptimal Retiree Allocations with Housing
Authors: Asiye Aydilek, Harun Aydilek
Abstract:
We investigate the costs of various suboptimal allocations in the housing, consumption, bond, and stock holdings of a retiree in a setting with recursive utility, considering the extensive empirical evidence that investors make suboptimal decisions in different ways. We find that suboptimal stock holdings impose only modest costs on the retiree, which may have merit in explaining the limited stock investment seen in the data. The cost of suboptimal bond holdings is higher than that of stocks, but still small; this may partially explain why many more people hold bonds compared to stocks. We find that positive deviations from the optimal level are less costly than negative ones in suboptimal housing allocations, which may help clarify why the elderly over-consume housing, as seen in the housing data. The cost of suboptimal consumption is quite high and the highest of all. Our paper suggests that, in terms of welfare, the decisions of how much liquid wealth to use for consumption and for saving are more important than the decision about the composition of liquid savings. Suboptimal stock holdings are twice as costly under power utility, and suboptimal bond holdings are twenty times as costly under recursive utility. Recursive utility is superior to power utility in rationalizing many people's preference for bonds instead of stocks in investment.
Keywords: housing, recursive utility, retirement, suboptimal decisions, welfare cost
Procedia PDF Downloads 323
6951 Price Regulation in Domestic Market: Incentives to Collude in the Deregulated Market
Authors: S. Avdasheva, D. Tsytsulina
Abstract:
In many regulated industries around the world, price caps are replacing cost-plus pricing as a method of price regulation. A price cap is a kind of incentive regulation introduced in order to enhance productive efficiency by strengthening sellers' incentives for cost reduction as well as incentives for more efficient pricing. However, pricing under a cap is not neutral for competition in the market. We consider the influence on competition in the markets from which the benchmark for the cap is chosen when sellers are multi-market. We argue that the impact of price cap regulation on market competition depends on the design of the cap. More specifically, if the cap for one (regulated) market depends on the supplier's price in another (non-regulated) market, there is a sub-type of price cap regulation (known in Russian tariff regulation as 'netback minus') that enhances incentives to collude in the non-regulated market.
Keywords: price regulation, competition, collusion
Procedia PDF Downloads 526
6950 Budgetary Performance Model for Managing Pavement Maintenance
Authors: Vivek Hokam, Vishrut Landge
Abstract:
An ideal maintenance program for an industrial road network is one that would maintain all sections at a sufficiently high level of functional and structural condition. However, due to various constraints such as budget, manpower, and equipment, it is not possible to carry out maintenance on all the needy industrial road sections within a given planning period. A rational and systematic priority scheme needs to be employed to select and schedule industrial road sections for maintenance. Priority analysis is a multi-criteria process that determines the best ranking list of sections for maintenance based on several factors. In priority setting, difficult decisions are required for the selection of sections: is it more important to repair a section with poor functional condition (an uncomfortable ride, etc.) or one with poor structural condition, i.e., a section in danger of becoming structurally unsound? It would seem, therefore, that any rational priority-setting approach must consider the relative importance of the functional and structural condition of the section. Maintenance priority indices and pavement performance models tend to focus mainly on pavement condition, traffic criteria, etc. There is a need to develop a model that can be used effectively with limited budget provisions for pavement maintenance. Linear programming is one of the most popular and widely used quantitative techniques. A linear programming model provides an efficient method for determining an optimal decision chosen from a large number of possible decisions. The optimum decision is one that meets a specified objective of management, subject to various constraints and restrictions. The objective is mainly the minimization of the maintenance cost of roads in the industrial area. In order to determine the objective function for the analysis of the distress model, it is necessary to fit realistic data into the formulation. Each type of repair is quantified in a number of stretches by considering 1000 m as one stretch. The section considered in this study is 3750 m long. The quantities are put into an objective function for maximizing the number of repairs in a stretch related to quantity. The distresses observed in this section are potholes, surface cracks, rutting, and ravelling. The distress data are measured manually by observing each distress level on a stretch of 1000 m. The maintenance and rehabilitation measures currently followed are based on subjective judgments; hence, there is a need to adopt a scientific approach in order to use the limited resources effectively. It is also necessary to determine the pavement performance and deterioration prediction relationship more accurately, together with the economic benefits to road networks with respect to vehicle operating cost. The road network infrastructure should deliver the best results expected from the available funds. In this paper, the objective function for the distress model is determined by linear programming, and a deterioration model considering overloading is discussed.
Keywords: budget, maintenance, deterioration, priority
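A hedged sketch of a budget-constrained linear program of the kind described (choose how many 1000 m stretch-repairs of each distress type to carry out so that repairs are maximized within a fixed budget) is given below; the unit costs, quantities, and budget are illustrative assumptions, not the study's data.

```python
# Illustrative budget-constrained LP for pavement maintenance (assumed costs, quantities and budget).
from scipy.optimize import linprog

# Distress types observed: potholes, surface cracks, rutting, ravelling.
unit_cost = [120_000, 60_000, 90_000, 45_000]     # cost per stretch repaired (assumed currency units)
max_stretches = [4, 4, 4, 4]                      # 3750 m section ~ four 1000 m stretches per distress (assumed)
budget = 600_000                                  # available maintenance budget (assumed)

c = [-1, -1, -1, -1]                              # maximize total repairs -> minimize the negative sum
A_ub = [unit_cost]                                # total spend must stay within the budget
b_ub = [budget]
bounds = [(0, m) for m in max_stretches]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("stretches repaired per distress type:", [round(x, 2) for x in res.x])
print("total repairs:", round(-res.fun, 2),
      "| spend:", round(sum(u * x for u, x in zip(unit_cost, res.x))))
```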
Procedia PDF Downloads 210
6949 Passive Solar Distiller with Low Cost of Implementation, Operation and Maintenance
Authors: Valentina Alessandra Carvalho do Vale, Elmo Thiago Lins Cöuras Ford, Rudson de Sousa Lima
Abstract:
Around the planet Earth, access to clean water is a problem whose importance has increased due to population growth and misuse of water. Thus, projects that seek to transform improper water sources (salty and brackish) into drinking water sources are a current issue. However, this transformation generally requires a high cost of implementation, operation, and maintenance. In this context, the aim of this work is the development of a passive solar distiller for brackish water, made from recycled and durable materials such as aluminum, cement, glass, and PVC basins. The results reveal factors that influence the performance and viability of expanding the project.
Keywords: solar distiller, passive distiller, distiller with pyramidal roof, ecologically correct
Procedia PDF Downloads 419
6948 Intelligent Decision Support for Wind Park Operation: Machine-Learning Based Detection and Diagnosis of Anomalous Operating States
Authors: Angela Meyer
Abstract:
The operation and maintenance cost of wind parks makes up a major fraction of a park's overall lifetime cost. To minimize the cost and risk involved, an optimal operation and maintenance strategy requires continuous monitoring and analysis. In order to facilitate this, we present a decision support system that automatically scans the stream of telemetry sensor data generated by the turbines. By learning decision boundaries and normal reference operating states using machine learning algorithms, the decision support system can detect anomalous operating behavior in individual wind turbines and diagnose the involved turbine sub-systems. Operating personnel can be alerted if a normal operating state boundary is exceeded. The presented decision support system and method are applicable to any turbine type and manufacturer providing telemetry data of the turbine operating state. We demonstrate the successful detection and diagnosis of anomalous operating states in a case study at a German onshore wind park comprised of Vestas V112 turbines.
Keywords: anomaly detection, decision support, machine learning, monitoring, performance optimization, wind turbines
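A minimal sketch of learning a normal-operation reference from turbine telemetry and flagging anomalous samples is shown below; the feature set and the model choice (an isolation forest) are assumptions for illustration, not the system's actual algorithm.

```python
# Illustrative anomaly detection on synthetic turbine telemetry (assumed features and model choice).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
wind_speed = rng.uniform(4, 14, 5000)
power = 150 * wind_speed**2 + rng.normal(0, 400, 5000)       # simplified normal power curve (assumed)
gearbox_temp = 55 + 1.5 * wind_speed + rng.normal(0, 2, 5000)
X_normal = np.column_stack([wind_speed, power, gearbox_temp])  # normal reference operating states

detector = IsolationForest(contamination=0.01, random_state=0).fit(X_normal)

# A new sample with degraded power output and an elevated gearbox temperature.
suspect = np.array([[10.0, 8_000.0, 78.0]])
print("anomaly" if detector.predict(suspect)[0] == -1 else "normal operating state")
```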
Procedia PDF Downloads 170