Search results for: predictive maintenance model
17847 The Impact of Artificial Intelligence on Spare Parts Technology
Authors: Amir Andria Gad Shehata
Abstract:
Minimizing inventory cost, optimizing inventory quantities, and increasing system operational availability are the main motivations for enhancing the forecasting of spare parts demand in a major power utility company in Medina. This paper reports on an effort made to optimize the order quantities of spare parts by improving the method of forecasting the demand. The study focuses on equipment that has frequent spare parts purchase orders with uncertain demand. The demand follows a lumpy pattern, which makes conventional forecasting methods less effective. A comparison was made by benchmarking various methods of forecasting based on experts' criteria to select the most suitable method for the case study. Three actual data sets were used to make the forecast in this case study. Two neural network (NN) approaches were utilized and compared, namely long short-term memory (LSTM) and multilayer perceptron (MLP). The results, as expected, showed that the NN models gave better results than the traditional (judgmental) forecasting method. In addition, the LSTM model had higher predictive accuracy than the MLP model.
Keywords: spare part, spare part inventory, inventory model, optimization, maintenance, neural network, LSTM, MLP, forecasting demand, inventory management
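The abstract gives no code; as a hedged illustration of one preprocessing step that both MLP and LSTM forecasters typically need, a sliding-window split of a lumpy demand history into supervised (input, target) pairs might look like this (the demand values are made up):

```python
def make_windows(series, lookback):
    """Turn a demand history into (input window, next value) pairs
    suitable for supervised training of an MLP or LSTM forecaster."""
    pairs = []
    for i in range(len(series) - lookback):
        pairs.append((series[i:i + lookback], series[i + lookback]))
    return pairs

# Lumpy demand: mostly zeros with occasional spikes (hypothetical data).
demand = [0, 0, 5, 0, 0, 0, 12, 0, 3, 0]
windows = make_windows(demand, lookback=3)
print(len(windows))   # 7 training pairs
print(windows[0])     # ([0, 0, 5], 0)
```

Each pair feeds the last three observed periods to the network and asks it to predict the next one; the real study would use far longer histories and tuned lookback lengths.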
Procedia PDF Downloads 62
17846 Long-Term Indoor Air Monitoring for Students with Emphasis on Particulate Matter (PM2.5) Exposure
Authors: Seyedtaghi Mirmohammadi, Jamshid Yazdani, Syavash Etemadi Nejad
Abstract:
One of the main indoor air parameters in classrooms is dust pollution, and its effect depends on the particle size and exposure duration. However, there is a lack of data about the exposure level to PM2.5 concentrations in rural-area classrooms. The objective of the current study was exposure assessment of PM2.5 for students in the classrooms. One year of monitoring was carried out for fifteen schools by time-series sampling to evaluate the indoor air PM2.5 in the rural district of Sari city, Iran. A hygrometer and thermometer were used to measure the psychrometric parameters (temperature, relative humidity, and wind speed), and a real-time dust monitor (MicroDust Pro, Casella, UK) was used to monitor particulate matter (PM2.5) concentration. The results show that the mean indoor PM2.5 concentration in the studied classrooms was 135 µg/m3. The regression model indicated a positive correlation between indoor PM2.5 concentration and relative humidity, as well as with distance from the city center and classroom size. Meanwhile, the regression model revealed that the indoor PM2.5 concentration, the relative humidity, and the dry bulb temperature were significant at the 0.05, 0.035, and 0.05 levels, respectively. A statistical predictive model for indoor PM2.5 concentration as a function of indoor psychrometric conditions was obtained from multiple regression modeling.
Keywords: classrooms, concentration, humidity, particulate matters, regression
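As a minimal sketch (not the study's actual model), a single-predictor ordinary least squares fit shows the machinery underlying such a regression; the humidity and PM2.5 numbers below are hypothetical:

```python
def fit_simple_regression(xs, ys):
    """Ordinary least squares for y = a + b*x, the building block of a
    multiple-regression model relating PM2.5 to psychrometric readings."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Hypothetical readings: relative humidity (%) vs PM2.5 (ug/m3).
humidity = [30, 40, 50, 60, 70]
pm25 = [100, 120, 140, 160, 180]
a, b = fit_simple_regression(humidity, pm25)
print(a, b)  # 40.0 2.0 -> PM2.5 = 40 + 2 * humidity
```

The full study would regress on several predictors at once, but each coefficient is estimated by the same least-squares principle.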
Procedia PDF Downloads 333
17845 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data
Authors: Benjamin Leiby, Darryl Ahner
Abstract:
This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country-conflict dataset motivates the search to impute missing values at well over the common threshold of 20% missingness. The methodology capitalizes on correlation while using model residuals to provide the uncertainty in estimating unknown values. Examination of the methodology provides insight toward choosing linear or nonlinear modeling terms. Static tolerances common in most packages are replaced with tailorable tolerances that exploit residuals to fit each data element. The methodology evaluation includes observing computation time, model fit, and the comparison of known values to replaced values created through imputation. Overall, the country conflict dataset illustrates promise with modeling first-order interactions while presenting a need for further refinement that mimics predictive mean matching.
Keywords: correlation, country conflict, imputation, stochastic regression
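The core idea of stochastic regression imputation can be sketched in a few lines (this is an illustrative simplification, not the authors' implementation): add a randomly drawn model residual to the regression prediction, so imputed values carry the model's uncertainty instead of collapsing onto the fitted line.

```python
import random

def stochastic_impute(predicted, residuals, rng):
    """Stochastic regression imputation: regression prediction plus a
    randomly drawn residual from the fitted model."""
    return predicted + rng.choice(residuals)

rng = random.Random(0)           # seeded for reproducibility
residuals = [-2.0, -1.0, 0.0, 1.0, 2.0]   # hypothetical model residuals
imputed = stochastic_impute(10.0, residuals, rng)
print(imputed)  # 10.0 plus one of the residuals
```

Repeating the draw yields multiple plausible imputations, which is what allows downstream analyses to reflect imputation uncertainty.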
Procedia PDF Downloads 118
17844 Methadone Maintenance Treatment Patients' and Medical Students' Common Trait: Low Mindfulness Trait Associated with High Perceived Stress
Authors: Einat Peles, Anat Sason, Ariel Claman, Gabriel Barkay, Miriam Adelson
Abstract:
Individuals with opioid addiction are characterized by disturbed stress responses, including in the hypothalamic-pituitary-adrenal (HPA) axis and autonomic nervous system function. The HPA axis is known to stabilize during methadone maintenance treatment (MMT). Mindfulness (present-oriented, nonjudgmental awareness of cognitions, emotions, perceptions, and habitual behavioral reactions in daily life) counteracts stress. To our knowledge, the relation between perceived stress and the mindfulness trait among MMT patients has never been studied. To measure indices of mindfulness and their relation to perceived stress among MMT patients, a cross-sectional random sample of current MMT patients was surveyed using questionnaires for perceived stress (PSS) and the mindfulness trait (FFMQ, which yields a total score and individual scores for five internally consistent mindfulness factors: observing, describing, acting with awareness and consciousness, non-judging of the inner experience, and non-reactivity to the inner experience). Two additional groups were studied as reference groups: medical students, who are known to suffer from stress, and patients with an Axis II psychiatric diagnosis, who are known to be characterized by a poor mindfulness trait. Results: The groups included 41 MMT patients, 27 Axis II patients, and 36 medical students. High perceived stress (PSS ≥ 18) was found in 61% of the MMT patients and 50% of the medical students. The highest mindfulness score was observed among non-stressed MMT patients (153.5±17.2), followed by the groups of stressed MMT patients and non-stressed students (128.9±17.0 and 130.5±13.3, respectively), with the lowest score among stressed students (116.3±17.9) (multivariate analyses, corrected model p (F=14.3) < 0.0005, p (group) < 0.0005, p (stress) < 0.0005, p (interaction) = 0.2). Linear inverse correlations were found between the perceived stress score and the mindfulness score among MMT patients (R=-0.65, p < 0.0005) and students (R=-0.51, p=0.002).
Axis II patients had the lowest mindfulness score (103.4±25.3). Conclusion: A high prevalence of high perceived stress, accompanied by a poor mindfulness trait, was observed in both MMT patients and medical students, two different population groups. The effectiveness of mindfulness treatment in reducing stress and improving the mindfulness trait should be evaluated to improve the rehabilitation of MMT patients and the success of students.
Keywords: mindfulness, stress, methadone maintenance treatment, medical students
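The reported inverse correlations (R=-0.65, R=-0.51) are Pearson coefficients; a minimal sketch of that computation, on made-up stress/mindfulness scores rather than the study's data, is:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient, as used to relate perceived
    stress (PSS) scores to total mindfulness (FFMQ) scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores: stress up, mindfulness down -> negative r.
stress = [10, 14, 18, 22, 26]
mindfulness = [150, 140, 130, 120, 110]
print(round(pearson_r(stress, mindfulness), 2))  # -1.0
```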
Procedia PDF Downloads 182
17843 Multiscale Modeling of Damage in Textile Composites
Authors: Jaan-Willem Simon, Bertram Stier, Brett Bednarcyk, Evan Pineda, Stefanie Reese
Abstract:
Textile composites, in which the reinforcing fibers are woven or braided, have become very popular in numerous applications in aerospace, automotive, and maritime industry. These textile composites are advantageous due to their ease of manufacture, damage tolerance, and relatively low cost. However, physics-based modeling of the mechanical behavior of textile composites is challenging. Compared to their unidirectional counterparts, textile composites introduce additional geometric complexities, which cause significant local stress and strain concentrations. Since these internal concentrations are primary drivers of nonlinearity, damage, and failure within textile composites, they must be taken into account in order for the models to be predictive. The macro-scale approach to modeling textile-reinforced composites treats the whole composite as an effective, homogenized material. This approach is very computationally efficient, but it cannot be considered predictive beyond the elastic regime because the complex microstructural geometry is not considered. Further, this approach can, at best, offer a phenomenological treatment of nonlinear deformation and failure. In contrast, the mesoscale approach to modeling textile composites explicitly considers the internal geometry of the reinforcing tows, and thus, their interaction, and the effects of their curved paths can be modeled. The tows are treated as effective (homogenized) materials, requiring the use of anisotropic material models to capture their behavior. Finally, the micro-scale approach goes one level lower, modeling the individual filaments that constitute the tows. This paper will compare meso- and micro-scale approaches to modeling the deformation, damage, and failure of textile-reinforced polymer matrix composites. 
For the mesoscale approach, the woven composite architecture will be modeled using the finite element method, and an anisotropic damage model for the tows will be employed to capture the local nonlinear behavior. For the micro-scale, two different models will be used: one based on the finite element method, and one that makes use of an embedded semi-analytical approach. The goal will be the comparison and evaluation of these approaches to modeling textile-reinforced composites in terms of accuracy, efficiency, and utility.
Keywords: multiscale modeling, continuum damage model, damage interaction, textile composites
Procedia PDF Downloads 352
17842 Passive Solar Distiller with Low Cost of Implementation, Operation and Maintenance
Authors: Valentina Alessandra Carvalho do Vale, Elmo Thiago Lins Cöuras Ford, Rudson de Sousa Lima
Abstract:
Around planet Earth, access to clean water is a problem whose importance has increased due to population growth and water misuse. Thus, projects that seek to transform improper water sources (salty and brackish) into drinking water sources are a current issue. However, this transformation generally requires a high cost of implementation, operation, and maintenance. In this context, the aim of this work is the development of a passive solar distiller for brackish water, made from recycled and durable materials such as aluminum, cement, glass, and PVC basins. The results reveal factors that influence the performance and viability of expanding the project.
Keywords: solar distiller, passive distiller, distiller with pyramidal roof, ecologically correct
Procedia PDF Downloads 412
17841 Development of Programmed Cell Death Protein 1 Pathway-Associated Prognostic Biomarkers for Bladder Cancer Using Transcriptomic Databases
Authors: Shu-Pin Huang, Pai-Chi Teng, Hao-Han Chang, Chia-Hsin Liu, Yung-Lun Lin, Shu-Chi Wang, Hsin-Chih Yeh, Chih-Pin Chuu, Jiun-Hung Geng, Li-Hsin Chang, Wei-Chung Cheng, Chia-Yang Li
Abstract:
The emergence of immune checkpoint inhibitors (ICIs) targeting proteins like PD-1 and PD-L1 has changed the treatment paradigm of bladder cancer. However, not all patients benefit from ICIs, with some experiencing early death. There is a significant need for biomarkers associated with the PD-1 pathway in bladder cancer. Current biomarkers focus on tumor PD-L1 expression, but a more comprehensive understanding of PD-1-related biology is needed. Our study has developed a seven-gene risk score panel, employing a comprehensive bioinformatics strategy, which could serve as a potential prognostic and predictive biomarker for bladder cancer. This panel incorporates the FYN, GRAP2, TRIB3, MAP3K8, AKT3, CD274, and CD80 genes. Additionally, we examined the relationship between this panel and immune cell function, utilizing validated tools such as ESTIMATE, TIDE, and CIBERSORT. Our seven-gene panel was found to be significantly associated with bladder cancer survival in two independent cohorts. The panel was also significantly correlated with tumor-infiltrating lymphocytes, immune scores, and tumor purity. These factors have been previously reported to have clinical implications for ICIs. The findings suggest the potential of a PD-1 pathway-based transcriptomic panel as a prognostic and predictive biomarker in bladder cancer, which could help optimize treatment strategies and improve patient outcomes.
Keywords: bladder cancer, programmed cell death protein 1, prognostic biomarker, immune checkpoint inhibitors, predictive biomarker
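Gene-panel risk scores of this kind are commonly computed as a weighted sum of expression values; the sketch below is purely illustrative, and the gene weights and expression values are hypothetical, not the published coefficients:

```python
def risk_score(expression, weights):
    """A gene-panel risk score as a weighted sum of per-gene
    expression values (weights here are made up for illustration)."""
    return sum(weights[g] * expression[g] for g in weights)

# Hypothetical weights for three of the seven panel genes.
weights = {"FYN": 0.4, "GRAP2": -0.2, "TRIB3": 0.3}
patient = {"FYN": 1.0, "GRAP2": 2.0, "TRIB3": 1.5}
score = risk_score(patient, weights)
print(round(score, 2))  # 0.45
```

Patients would then be stratified into high- and low-risk groups by thresholding this score before survival analysis.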
Procedia PDF Downloads 76
17840 Modeling of Crack Growth in Railway Axles under Static Loading
Authors: Zellagui Redouane, Bellaouar Ahmed, Lachi Mohammed
Abstract:
Railway axles are essential parts of a train's bogie, and their failure creates serious problems for railway transport; premature deterioration of these parts has been observed in service. The aim is to present a predictive model allowing identification of the probable causes of this premature deterioration. The results are employed to predict fatigue crack growth in the railway axle, and the variation of the stress intensity factor at different positions along the elliptical crack tip is also presented. The axle is modeled in the SolidWorks software and imported into ANSYS.
Keywords: crack growth, static load, railway axle, lifetime
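Fatigue crack growth driven by the stress intensity factor is commonly described by the Paris law; the sketch below integrates it numerically under made-up material constants (C, m, Y, and the stress range are illustrative values, not measured axle-steel properties):

```python
import math

def paris_growth(a0, delta_sigma, C, m, Y, n_cycles, step):
    """Euler integration of the Paris law da/dN = C*(dK)^m with
    dK = Y * delta_sigma * sqrt(pi * a); a in metres, dK in MPa*sqrt(m)."""
    a = a0
    for _ in range(n_cycles // step):
        dK = Y * delta_sigma * math.sqrt(math.pi * a)
        a += C * dK ** m * step
    return a

# Hypothetical constants: 1 mm initial crack, 100 MPa stress range.
a_final = paris_growth(a0=1e-3, delta_sigma=100.0, C=1e-11, m=3.0,
                       Y=1.12, n_cycles=100_000, step=1_000)
print(a_final > 1e-3)  # the crack has grown
```

A FEM model such as the one in the paper would supply the stress intensity factor directly at each crack-front position instead of the analytical dK used here.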
Procedia PDF Downloads 363
17839 Measurement of Blood Phenobarbital Concentration Within Newborns Admitted to the NICU of Imam Reza Hospital and Received the Drug by Intravenous Mode
Authors: Ahmad Shah Farhat, Anahita Alizadeh Qamsari, Ashraf Mohammadzadeh, Hamid Reza Goldouzian, Ezat Khodashenas
Abstract:
Introduction: Newborns may be treated with phenobarbital for many reasons. Because in each region, depending on race and genetic factors, different pharmacokinetic conditions govern the drug, it is essential to control blood levels of certain drugs, especially phenobarbital, and to maintain these levels during treatment. Methods: In this study, venous blood was collected from 50 neonates who had received intravenous phenobarbital at a loading dose of 20 mg/kg body weight, at least three days after starting the maintenance dose of 5 mg/kg body weight per 24 hours, and sent to the laboratory. Phenobarbital blood levels were measured, and the results were analyzed descriptively. Results: In this study, the average weight of the newborns was 9.93 ± 2.58. The mean blood concentration of phenobarbital three days after starting the maintenance dose was 3.33 ± 9.1 micrograms/liter in the group of infants weighing more than 2.5 kg, 5.9 ± 9.5 micrograms/liter in the group weighing less than two and a half kilograms (LBW), and 14.4 ± 15.46 micrograms/liter in the group weighing less than 1.5 kg (VLBW). There was no significant difference between the groups (p>0.05). Three days after starting the maintenance dose, across all three groups, the mean blood phenobarbital concentration was 9.86 ± 0.86 micrograms/liter. Conclusion: Blood phenobarbital levels in our newborns are below therapeutic levels, so phenobarbital levels should be evaluated.
Keywords: poisoning, neonates, phenobarbital, drug
Procedia PDF Downloads 61
17838 Logistic Regression Model versus Additive Model for Recurrent Event Data
Authors: Entisar A. Elgmati
Abstract:
Recurrent infant diarrhea is studied using daily data collected in Salvador, Brazil, over one year and three months. A logistic regression model is fitted instead of Aalen's additive model, using the same covariates that were used in the analysis with the additive model. The model gives reasonably similar results to those of the additive regression model. In addition, the problem of the estimated conditional probabilities not being constrained between zero and one in the additive model is solved here. Also, martingale residuals, which have been used to judge the goodness of fit of the additive model, are shown to be useful for judging the goodness of fit of the logistic model.
Keywords: additive model, cumulative probabilities, infant diarrhoea, recurrent event
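The boundedness property the abstract highlights comes from the logistic link itself: whatever value the linear predictor takes, the mapped probability lies strictly in (0, 1). A minimal sketch:

```python
import math

def logistic(eta):
    """The logistic link maps any linear predictor eta to (0, 1),
    avoiding the additive model's out-of-range 'probabilities'."""
    return 1.0 / (1.0 + math.exp(-eta))

# An additive (linear) model could emit 1.3 as a conditional
# probability; the logistic link keeps every estimate in (0, 1).
for eta in (-10.0, 0.0, 1.3, 10.0):
    p = logistic(eta)
    assert 0.0 < p < 1.0
print(round(logistic(0.0), 2))  # 0.5
```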
Procedia PDF Downloads 633
17837 Improving the Performance of Road Salt on Anti-Icing
Authors: Mohsen Abotalebi Esfahani, Amin Rahimi
Abstract:
Maintenance and management of road infrastructure is one of the most important and fundamental responsibilities of any country. Several methods have been under investigation for many years as preventive measures for the maintenance of asphalt pavements. Using a mixture of salt, sand, and gravel is the most common method of de-icing, which can have numerous harmful consequences. An icy or snow-covered road is one of the major causes of accidents in rainy seasons, producing substantial damages such as loss of time and energy, environmental pollution, destruction of buildings, traffic congestion, and a rising possibility of accidents. Accordingly, every year governments incur enormous costs to keep routes safe. In this study, asphalt pavement samples were evaluated in terms of compressive strength, tensile strength, and resilient modulus under the influence of magnesium chloride, calcium chloride, sodium chloride, urea, and pure water. The results showed that de-icing with calcium chloride solution and urea has the least negative effect, while de-icing with pure water has the most negative effect on the laboratory specimens. Hence, simple techniques, new equipment, and reduced use of sand and salt can significantly cut the risks and harmful effects of excessive salt, sand, and gravel use while keeping roads safer.
Keywords: maintenance, sodium chloride, icy road, calcium chloride
Procedia PDF Downloads 282
17836 A High Content Screening Platform for the Accurate Prediction of Nephrotoxicity
Authors: Sijing Xiong, Ran Su, Lit-Hsin Loo, Daniele Zink
Abstract:
The kidney is a major target for the toxic effects of drugs, industrial and environmental chemicals, and other compounds. Typically, nephrotoxicity is detected late during drug development, and regulatory animal models have not solved this problem. Validated or accepted in silico or in vitro methods for the prediction of nephrotoxicity are not available. We have established the first and currently only pre-validated in vitro models for the accurate prediction of nephrotoxicity in humans, and the first predictive platforms based on renal cells derived from human pluripotent stem cells. In order to further improve the efficiency of our predictive models, we recently developed a high content screening (HCS) platform. This platform employs automated imaging in combination with automated quantitative phenotypic profiling and machine learning methods. 129 image-based phenotypic features were analyzed with respect to their predictive performance in combination with 44 compounds with different chemical structures, including drugs, environmental and industrial chemicals, and herbal and fungal compounds. The nephrotoxicity of these compounds in humans is well characterized. A combination of chromatin and cytoskeletal features resulted in high predictivity with respect to nephrotoxicity in humans. Test balanced accuracies of 82% or 89% were obtained with human primary or immortalized renal proximal tubular cells, respectively. Furthermore, our results revealed that a DNA damage response is commonly induced by different PTC-toxicants with diverse chemical structures and injury mechanisms. Together, the results show that the automated HCS platform allows efficient and accurate nephrotoxicity prediction for compounds with diverse chemical structures.
Keywords: high content screening, in vitro models, nephrotoxicity, toxicity prediction
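The reported 82%/89% figures are balanced accuracies, i.e. the mean of sensitivity and specificity; a minimal sketch (with hypothetical confusion counts, not the study's) is:

```python
def balanced_accuracy(tp, fn, tn, fp):
    """Balanced accuracy: mean of sensitivity and specificity, which is
    robust when toxic and non-toxic classes are imbalanced."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return (sensitivity + specificity) / 2

# Hypothetical counts for a nephrotoxic-vs-safe classifier.
ba = balanced_accuracy(tp=18, fn=2, tn=20, fp=5)
print(round(ba, 2))  # (0.9 + 0.8) / 2 = 0.85
```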
Procedia PDF Downloads 311
17835 Uncertainty Estimation in Neural Networks through Transfer Learning
Authors: Ashish James, Anusha James
Abstract:
The impressive predictive performance of deep learning techniques on a wide range of tasks has led to their widespread use. Estimating the confidence of these predictions is paramount for improving the safety and reliability of such systems. However, the uncertainty estimates provided by neural networks (NNs) tend to be overconfident and unreasonable. Ensembles of NNs typically produce good predictions, but their uncertainty estimates tend to be inconsistent. Inspired by these observations, this paper presents a framework that can quantitatively estimate the uncertainties by leveraging advances in transfer learning through slight modification of existing training pipelines. This promising algorithm is developed with the intention of deployment in real-world problems that already boast good predictive performance, by reusing those pretrained models. The idea is to capture the behavior of the trained NNs for the base task by augmenting them with uncertainty estimates from a supplementary network. A series of experiments with known and unknown distributions shows that the proposed approach produces well-calibrated uncertainty estimates with high-quality predictions.
Keywords: uncertainty estimation, neural networks, transfer learning, regression
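For context, the ensemble baseline the abstract compares against boils down to aggregating member predictions: the mean is the point estimate and the spread serves as an uncertainty estimate. A minimal sketch on made-up member outputs:

```python
import statistics

def ensemble_predict(predictions):
    """Ensemble aggregation: mean as the point estimate, sample standard
    deviation of member predictions as a simple uncertainty estimate."""
    return statistics.mean(predictions), statistics.stdev(predictions)

# Hypothetical outputs of four networks for one input.
mean, uncertainty = ensemble_predict([2.0, 2.2, 1.8, 2.0])
print(round(mean, 3))  # 2.0, with a small positive spread
```

The paper's approach replaces the cost of training many members by attaching a supplementary uncertainty network to a single pretrained model.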
Procedia PDF Downloads 134
17834 What the Future Holds for Social Media Data Analysis
Authors: P. Wlodarczak, J. Soar, M. Ally
Abstract:
The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews of products and services they bought, write about their interests, share ideas, or give their opinions and views on political issues. There is growing interest among organisations in the analysis of SM data for detecting new trends, obtaining user opinions on their products and services, or finding out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators of historic SM data are represented as time series and correlated with a variety of real-world phenomena, like the outcome of elections, the development of financial indicators, box office revenue, and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.
Keywords: social media, text mining, knowledge discovery, predictive analysis, machine learning
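At its simplest, the sentiment analysis step the paper surveys can be lexicon-based: each word carries a polarity and a post's score is the sum. This toy sketch (the lexicon is made up; real pipelines use much richer models) illustrates the idea:

```python
def sentiment_score(text, lexicon):
    """Minimal lexicon-based sentiment: positive words add one,
    negative words subtract one, unknown words contribute nothing."""
    score = 0
    for word in text.lower().split():
        score += lexicon.get(word, 0)
    return score

lexicon = {"great": 1, "love": 1, "bad": -1, "terrible": -1}
print(sentiment_score("I love this product, great battery", lexicon))  # 2
```

Aggregating such scores per day yields the sentiment time series that studies then correlate with elections, stock movements, or disease outbreaks.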
Procedia PDF Downloads 422
17833 Predicting Radioactive Waste Glass Viscosity, Density and Dissolution with Machine Learning
Authors: Joseph Lillington, Tom Gout, Mike Harrison, Ian Farnan
Abstract:
The vitrification of high-level nuclear waste within borosilicate glass and its incorporation within a multi-barrier repository deep underground is widely accepted as the preferred disposal method. However, for this to happen, any safety case will require validation that the initially localized radionuclides will not be considerably released into the near/far-field. Therefore, accurate mechanistic models are necessary to predict glass dissolution, and these should be robust to a variety of incorporated waste species and leaching test conditions, particularly given substantial variations across international waste-streams. Here, machine learning is used to predict glass material properties (viscosity, density) and glass leaching model parameters from large-scale industrial data. A variety of different machine learning algorithms have been compared to assess performance. Density was predicted solely from composition, whereas viscosity additionally considered temperature. To predict suitable glass leaching model parameters, a large simulated dataset was created by coupling MATLAB and the chemical reactive-transport code HYTEC, considering the state-of-the-art GRAAL model (glass reactivity in allowance of the alteration layer). The trained models were then subsequently applied to the large-scale industrial, experimental data to identify potentially appropriate model parameters. Results indicate that ensemble methods can accurately predict viscosity as a function of temperature and composition across all three industrial datasets. Glass density prediction shows reliable learning performance with predictions primarily being within the experimental uncertainty of the test data. 
Furthermore, machine learning can predict the behavior of glass dissolution model parameters, demonstrating potential value in GRAAL model development and in assessing suitable model parameters for large-scale industrial glass dissolution data.
Keywords: machine learning, predictive modelling, pattern recognition, radioactive waste glass
Procedia PDF Downloads 115
17832 Shear Stress and Effective Structural Stress Fields of an Atherosclerotic Coronary Artery
Authors: Alireza Gholipour, Mergen H. Ghayesh, Anthony Zander, Stephen J. Nicholls, Peter J. Psaltis
Abstract:
A three-dimensional numerical model of an atherosclerotic coronary artery is developed for the determination of high-risk situations and hence heart attack prediction. Employing the finite element method (FEM) using ANSYS, a fluid-structure interaction (FSI) model of the artery is constructed to determine the shear stress distribution as well as the von Mises stress field. A flexible model of an atherosclerotic coronary artery conveying pulsatile blood is developed incorporating three-dimensionality, the artery's tapered shape via a linear function for the artery wall distribution, motion of the artery, blood viscosity via non-Newtonian flow theory, blood pulsation via use of a one-period heartbeat, hyperelasticity via the Mooney-Rivlin model, viscoelasticity via the Prony series shear relaxation scheme, and micro-calcification inside the plaque. The material properties used to relate the stress field to the strain field have been extracted from clinical data from previous in-vitro studies. The determined stress fields have potential to be used as a predictive tool for plaque rupture and dissection. The results show that stress concentration due to micro-calcification increases the von Mises stress significantly; the chance of developing a crack inside the plaque therefore increases. Moreover, blood pulsation varies the stress distribution substantially in some cases.
Keywords: atherosclerosis, fluid-structure interaction, coronary arteries, pulsatile flow
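The von Mises stress reported by such FEM solvers is a standard scalar combination of the principal stresses; a minimal sketch of the formula (values below are illustrative, not artery data) is:

```python
import math

def von_mises(s1, s2, s3):
    """Von Mises equivalent stress from the three principal stresses:
    sqrt(((s1-s2)^2 + (s2-s3)^2 + (s3-s1)^2) / 2)."""
    return math.sqrt(((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2) / 2)

# Uniaxial check: only s1 nonzero -> von Mises equals s1.
print(von_mises(100.0, 0.0, 0.0))  # 100.0
# Hydrostatic check: equal principal stresses -> zero (no distortion).
print(von_mises(50.0, 50.0, 50.0))  # 0.0
```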
Procedia PDF Downloads 172
17831 A Mega-Analysis of the Predictive Power of Initial Contact within Minimal Social Network
Authors: Cathal Ffrench, Ryan Barrett, Mike Quayle
Abstract:
It is accepted in social psychology that categorization leads to ingroup favoritism, with little thought given to the processes that may co-occur with or even precede categorization. These categorizations move away from the conceptualization of the self as a unique social being toward an increasingly collective identity. Subsequently, many individuals derive much of their self-evaluation from these collective identities. The seminal literature on this topic argues that it is primarily categorization that evokes instances of ingroup favoritism. In response to these theories, we argue that categorization acts to enhance and further intergroup processes rather than defining them. More accurately, we propose that categorization aids initial ingroup contact, and that this first contact is predictive of subsequent favoritism on individual and collective levels. This analysis focuses on studies based on the Virtual Interaction APPLication (VIAPPL), a software interface that addresses the flaws of the original minimal group studies. The VIAPPL allows the exchange of tokens in an intra- and inter-group manner. This token exchange is how we classified first contact. The study involves binary longitudinal analysis to better understand the subsequent exchanges of individuals based on whom they first interacted with. Studies were selected on the criteria of evidence of explicit first interactions and two-group designs. Our findings paint a compelling picture in support of a motivated contact hypothesis, which suggests that an individual's first motivated contact toward another has strong predictive capability for future behavior. This contact can lead to habit formation and specific favoritism toward individuals with whom contact has been established. This has important implications for understanding how group conflict occurs and how intra-group individual bias can develop.
Keywords: categorization, group dynamics, initial contact, minimal social networks, momentary contact
Procedia PDF Downloads 146
17830 The Prognostic Prediction Value of Positive Lymph Nodes Numbers for the Hypopharyngeal Squamous Cell Carcinoma
Authors: Wendu Pang, Yaxin Luo, Junhong Li, Yu Zhao, Danni Cheng, Yufang Rao, Minzi Mao, Ke Qiu, Yijun Dong, Fei Chen, Jun Liu, Jian Zou, Haiyang Wang, Wei Xu, Jianjun Ren
Abstract:
We aimed to compare the prognostic prediction value of the positive lymph node number (PLNN) to the American Joint Committee on Cancer (AJCC) tumor, lymph node, and metastasis (TNM) staging system for patients with hypopharyngeal squamous cell carcinoma (HPSCC). A total of 826 patients with HPSCC from the Surveillance, Epidemiology, and End Results database (2004–2015) were identified and split into two independent cohorts: training (n=461) and validation (n=365). Univariate and multivariate Cox regression analyses were used to evaluate the prognostic effects of PLNN in patients with HPSCC. We further applied six Cox regression models to compare the survival predictive values of the PLNN and the AJCC TNM staging system. PLNN showed a significant association with overall survival (OS) and cancer-specific survival (CSS) (P < 0.001) in both univariate and multivariable analyses, and was divided into three groups (PLNN 0, PLNN 1-5, and PLNN > 5). In the training cohort, multivariate analysis revealed that an increased PLNN in HPSCC gave rise to significantly poorer OS and CSS after adjusting for age, sex, tumor size, and cancer stage; this trend was also verified in the validation cohort. Additionally, the survival model incorporating a composite of PLNN and TNM classification (C-index, 0.705, 0.734) performed better than the PLNN and AJCC TNM models alone. PLNN can serve as a powerful survival predictor for patients with HPSCC and is a surrogate supplement for cancer staging systems.
Keywords: hypopharyngeal squamous cell carcinoma, positive lymph nodes number, prognosis, prediction models, survival predictive values
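The C-index figures (0.705, 0.734) measure concordance: the fraction of comparable patient pairs in which the higher-risk patient dies sooner. A hedged sketch of Harrell's C-index, simplified to ignore censoring (real survival data requires it), on hypothetical times and risk scores:

```python
from itertools import combinations

def concordance_index(times, scores):
    """Harrell's C-index, simplified (no censoring): fraction of pairs
    with distinct survival times where the higher risk score belongs
    to the patient with the shorter survival time."""
    concordant, comparable = 0.0, 0
    for (t1, s1), (t2, s2) in combinations(zip(times, scores), 2):
        if t1 == t2:
            continue
        comparable += 1
        if s1 == s2:
            concordant += 0.5   # tied scores count as half
        elif (s1 > s2) == (t1 < t2):
            concordant += 1
    return concordant / comparable

# Hypothetical survival times (months) and model risk scores.
times = [10, 20, 30, 40]
scores = [0.9, 0.7, 0.4, 0.1]
print(concordance_index(times, scores))  # perfectly concordant -> 1.0
```

A value of 0.5 corresponds to random ranking, so the paper's 0.705-0.734 indicates useful discrimination.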
Procedia PDF Downloads 152
17829 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation
Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner
Abstract:
This article discusses the impact of digitisation on business valuation. In order to become and remain 'digital', investments are necessary whose return on investment (ROI) often remains vague. This uncertainty is at odds with a valuation approach that relies on predictable cash flows, fixed capital structures, and the steady state. However, digitisation does not make a company valuation impossible; traditional approaches must merely be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition: in the future, company valuation will be neither art nor science, but craft. This requires not intuition, but experience and good tools. Digital valuation tools beyond Excel will therefore gain in importance. (2) Real-time instead of deadline: at present, company valuations are always carried out on a case-by-case basis and for a specific key date. This will change with digitalization and the introduction of web-based valuation tools. Company valuations can thus not only be carried out faster and more efficiently, but can also be offered more frequently. Instead of calculating the value for a previous key date, current and real-time valuations can be carried out. (3) Predictive planning instead of analysis of the past: past data will still be needed in the future, but its use will not be limited to monovalent time series or key-figure analyses. The images of 'black swans' and the 'turkey illusion' have made clear that we build forecasts on too few data points of the past and underestimate the power of chance. Predictive planning can help here. (4) Convergence instead of residual value: digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.
Keywords: business valuation, corporate finance, digitisation, disruption
Procedia PDF Downloads 132
17828 Evaluating Psychologist Practice Competencies through Multisource Feedback: An International Research Design
Authors: Jac J. W. Andrews, James B. Hale
Abstract:
Effective practicing psychologists require ongoing skill development that is constructivist and recursive in nature, with mentor, colleague, co-worker, and patient feedback critical to successful acquisition and maintenance of professional competencies. This paper will provide an overview of the nature and scope of psychologist skill development through multisource feedback (MSF) or 360-degree evaluation, present a rationale for its use for assessing practicing psychologist performance, and advocate its use in psychology given the demonstrated model utility in other health professions. The paper will conclude that an international research design is needed to assess the feasibility, reliability, and validity of MSF system ratings intended to solicit feedback from mentors, colleagues, coworkers, and patients about psychologist competencies. If adopted, the MSF model could lead to enhanced skill development that fosters patient satisfaction within and across countries.
Keywords: psychologist, multisource feedback, psychologist competency, professionalism
Procedia PDF Downloads 444
17827 The Design of a Vehicle Traffic Flow Prediction Model for a Gauteng Freeway Based on an Ensemble of Multi-Layer Perceptron
Authors: Tebogo Emma Makaba, Barnabas Ndlovu Gatsheni
Abstract:
The cities of Johannesburg and Pretoria, both located in the Gauteng province, are separated by a distance of 58 km. The traffic queues on the Ben Schoeman freeway, which connects these two cities, can stretch for almost 1.5 km. Vehicle traffic congestion impacts negatively on business and on commuters’ quality of life. The goal of this paper is to identify variables that influence the flow of traffic and to design a vehicle traffic prediction model that predicts the traffic flow pattern in advance. The model will enable motorists to make appropriate travel decisions ahead of time. The data used was collected by Mikro’s Traffic Monitoring (MTM). A multilayer perceptron (MLP) was used on its own to construct the model, and the MLP was also combined with the bagging ensemble method to train on the data. Cross-validation was used for evaluating the models. The results obtained from the techniques were compared using predictive accuracy and prediction costs. The cost was computed using a combination of the loss matrix and the confusion matrix. The models designed show that the status of the traffic flow on the freeway can be predicted using the following parameters: travel time, average speed, traffic volume, and day of the month. The implication of this work is that commuters will be able to spend less time travelling on the route and more time with their families. The logistics industry will save more than twice what it is currently spending.
Keywords: bagging ensemble methods, confusion matrix, multi-layer perceptron, vehicle traffic flow
Procedia PDF Downloads 343
17826 Derivation of a Risk-Based Level of Service Index for Surface Street Network Using Reliability Analysis
Authors: Chang-Jen Lan
Abstract:
The current Level of Service (LOS) index adopted in the Highway Capacity Manual (HCM) for signalized intersections on surface streets is based on the intersection average delay. The delay thresholds for defining LOS grades are subjective and unrelated to critical traffic conditions. For example, an intersection delay of 80 sec per vehicle for the failing LOS grade F does not necessarily correspond to the intersection capacity. Also, a specific measure of average delay may result from delay minimization, delay equality, or other meaningful optimization criteria. To that end, a reliability version of the intersection critical degree of saturation (v/c) is introduced as the LOS index. Traditionally, the level of saturation at a signalized intersection is defined as the ratio of the critical volume sum (per lane) to the average saturation flow (per lane) during all available effective green time within a cycle. The critical sum is the sum of the maximal conflicting movement-pair volumes in the northbound-southbound and eastbound-westbound rights of way. In this study, both movement volume and saturation flow are assumed to follow log-normal distributions. Because, when the conditions of the central limit theorem hold, multiplication of independent, positive random variables tends toward a log-normal distribution in the limit, the critical degree of saturation is expected to be log-normally distributed as well. Derivation of the risk index predictive limits is complex due to the maximum and absolute value operators, as well as the ratio of random variables. A fairly accurate functional form for the predictive limit at a user-specified significance level is obtained. The predictive limit is then compared with the designated LOS thresholds for the intersection critical degree of saturation (denoted as X).
Keywords: reliability analysis, level of service, intersection critical degree of saturation, risk based index
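For intuition only, here is a toy numerical sketch of the log-normal reasoning above: if volume V and saturation flow S are log-normal and independent, then X = V/S is log-normal, and an upper predictive limit follows directly from a normal quantile on the log scale. The parameter values are invented for illustration and are not from the paper.

```python
import math
from statistics import NormalDist

# Assumed log-scale parameters for movement volume and saturation flow (per lane).
mu_v, sigma_v = math.log(1500), 0.15
mu_s, sigma_s = math.log(1800), 0.10

# log X = log V - log S  =>  normal with these parameters (independence assumed).
mu_x = mu_v - mu_s
sigma_x = math.hypot(sigma_v, sigma_s)

alpha = 0.05
z = NormalDist().inv_cdf(1 - alpha)
upper_limit = math.exp(mu_x + z * sigma_x)   # 95% upper predictive limit on v/c
print(round(upper_limit, 3))                 # prints 1.121
```

A limit above 1.0, as here, would flag a non-negligible risk of the intersection operating over capacity even though the mean v/c is below 1.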
Procedia PDF Downloads 130
17825 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation
Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang
Abstract:
Data assimilation is a hybrid model- and data-driven method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used in state prediction and parameter inference for continuous systems. Because of the non-linearity and non-Gaussianity of discrete event systems, the traditional Kalman filter, which is based on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become the technical approach of choice for discrete event simulation data assimilation. Hence, we propose a particle filter-based discrete event simulation data assimilation method and take an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept for simulation experiments. The experimental results show that the filtered state data is closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulations.
Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven
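The particle filter loop at the heart of such a method (predict, weight by the observation likelihood, resample) can be sketched on a toy scalar system. This is our simplification for illustration, not the UAV maintenance service model:

```python
import numpy as np

rng = np.random.default_rng(42)
T, N = 50, 1000                      # time steps, number of particles
true_x = 0.0
particles = rng.normal(0.0, 1.0, N)
estimates = []

for t in range(T):
    true_x = 0.9 * true_x + rng.normal(0, 0.5)          # hidden system dynamics
    obs = true_x + rng.normal(0, 1.0)                   # noisy observation

    particles = 0.9 * particles + rng.normal(0, 0.5, N) # predict step
    weights = np.exp(-0.5 * (obs - particles) ** 2)     # Gaussian likelihood weights
    weights /= weights.sum()
    idx = rng.choice(N, size=N, p=weights)              # resample step
    particles = particles[idx]
    estimates.append(particles.mean())                  # filtered state estimate

print(f"final truth {true_x:.2f}, final estimate {estimates[-1]:.2f}")
```

In a discrete event setting the "predict" step would advance each particle's simulation replica between observation times instead of applying a closed-form transition.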
Procedia PDF Downloads 11
17824 The Interactions of Attentional Bias for Food, Trait Self-Control, and Motivation: A Model Testing Study
Authors: Hamish Love, Navjot Bhullar, Nicola Schutte
Abstract:
Self-control and related psychological constructs have been shown to play a large role in the improvement and maintenance of healthful dietary behaviour. However, to the authors’ best knowledge, self-control for diet and related constructs, such as motivation, the level of conflict between tempting desires and dietary goals, and attentional bias for tempting food, have not been studied together to establish their relationships. Therefore, the aim of this paper is to conduct model testing on these constructs and evaluate how they relate to dietary outcomes. 400 Australian adult participants will be recruited via the Qualtrics platform and will be representative across age and gender. They will complete self-report and reaction-time measures to gather data on the five target constructs: Trait Self-Control, Attentional Bias for Food, Dietary Goal-Desire Incongruence, Motivation for Dietary Self-Control, and Satisfaction with Dietary Behaviour. A model of moderated mediation is predicted, whereby the initial predictor (Dietary Goal-Desire Incongruence) predicts the level of the outcome variable, Satisfaction with Dietary Behaviour. We hypothesise that the relationship between these two variables will be mediated by Trait Self-Control, and that the extent to which Trait Self-Control is allowed to mediate the dietary outcome is moderated by both Attentional Bias for Food and Motivation for Dietary Self-Control. The analysis will be conducted using the PROCESS module in SPSS 23. The results of the model testing in this study will be valuable for directing future research and identifying which constructs could be important intervention targets for improving dietary outcomes.
Keywords: self-control, diet, model testing, attentional bias, motivation
Procedia PDF Downloads 168
17823 Effectiveness of the Lacey Assessment of Preterm Infants to Predict Neuromotor Outcomes of Premature Babies at 12 Months Corrected Age
Authors: Thanooja Naushad, Meena Natarajan, Tushar Vasant Kulkarni
Abstract:
Background: The Lacey Assessment of Preterm Infants (LAPI) is used in clinical practice to identify premature babies at risk of neuromotor impairments, especially cerebral palsy. This study attempted to determine the validity of the Lacey assessment for predicting neuromotor outcomes of premature babies at 12 months corrected age and to compare its predictive ability with that of brain ultrasound. Methods: This prospective cohort study included 89 preterm infants (45 females and 44 males) born below 35 weeks gestation who were admitted to the neonatal intensive care unit of a government hospital in Dubai. Initial assessment was done using the Lacey assessment after the babies reached 33 weeks postmenstrual age. Follow-up assessment of neuromotor outcomes was done at 12 months (± 1 week) corrected age using two standardized outcome measures, i.e., the Infant Neurological International Battery and the Alberta Infant Motor Scale. Brain ultrasound data were collected retrospectively. Data were statistically analyzed, and the diagnostic accuracy of the Lacey Assessment of Preterm Infants (LAPI) was calculated, both when used alone and in combination with brain ultrasound. Results: In comparison with brain ultrasound, the Lacey assessment showed superior specificity (96% vs. 77%), a higher positive predictive value (57% vs. 22%), and a higher positive likelihood ratio (18 vs. 3) for predicting neuromotor outcomes at one year of age. The sensitivity of the Lacey assessment was lower than that of brain ultrasound (66% vs. 83%), whereas specificity was similar (97% vs. 98%). A combination of the Lacey assessment and brain ultrasound results showed higher sensitivity (80%), positive (66%) and negative (98%) predictive values, positive likelihood ratio (24), and test accuracy (95%) than the Lacey assessment alone in predicting neurological outcomes. The negative predictive value of the Lacey assessment was similar to that of its combination with brain ultrasound (96%).
Conclusion: The results of this study suggest that the Lacey Assessment of Preterm Infants can be used as a supplementary assessment tool for premature babies in the neonatal intensive care unit. Owing to its high specificity, the Lacey assessment can be used to identify babies at low risk of abnormal neuromotor outcomes at a later age. When used along with the findings of brain ultrasound, the Lacey assessment has better sensitivity for identifying preterm babies at particular risk. These findings have applications in identifying premature babies who may benefit from early intervention services.
Keywords: brain ultrasound, lacey assessment of preterm infants, neuromotor outcomes, preterm
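The accuracy statistics quoted above (sensitivity, specificity, predictive values, likelihood ratios) all derive from a 2x2 confusion matrix, and a small helper makes the definitions explicit. The counts below are hypothetical, chosen only so the total matches the cohort size of 89; they are not the study’s data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy metrics from a 2x2 confusion matrix."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),              # positive predictive value
        "npv": tn / (tn + fn),              # negative predictive value
        "lr_plus": sens / (1 - spec),       # positive likelihood ratio
        "lr_minus": (1 - sens) / spec,      # negative likelihood ratio
    }

m = diagnostic_metrics(tp=8, fp=3, fn=4, tn=74)   # hypothetical counts, n = 89
print({k: round(v, 2) for k, v in m.items()})
```

With these made-up counts the sensitivity works out to 0.67 and the specificity to 0.96, close in spirit to the figures reported for the Lacey assessment.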
Procedia PDF Downloads 137
17822 Modelling Fluidization by Data-Based Recurrence Computational Fluid Dynamics
Authors: Varun Dongre, Stefan Pirker, Stefan Heinrich
Abstract:
Over the last decades, the numerical modelling of fluidized bed processes has become feasible even for industrial processes. Commonly, continuous two-fluid models are applied to describe large-scale fluidization. In order to allow for coarse grids, novel two-fluid models account for unresolved sub-grid heterogeneities. However, computational efforts remain high – on the order of several hours of compute time for a few seconds of real time – thus preventing the representation of long-term phenomena such as heating or particle conversion processes. In order to overcome this limitation, data-based recurrence computational fluid dynamics (rCFD) has been put forward in recent years. rCFD can be regarded as a data-based method that relies on the numerical predictions of a conventional short-term simulation. This data is stored in a database and then used by rCFD to efficiently time-extrapolate the flow behavior at high spatial resolution. This study compares the numerical predictions of rCFD simulations with those of corresponding full CFD reference simulations for lab-scale and pilot-scale fluidized beds. In assessing the predictive capabilities of rCFD simulations, we focus on solid mixing and secondary gas holdup. We observed that predictions made by rCFD simulations are highly sensitive to numerical parameters such as the diffusivity associated with face swaps. We achieved a computational speed-up of four orders of magnitude (10,000 times faster than a classical TFM simulation), eventually allowing for real-time simulations of fluidized beds. In the next step, we apply the checkerboarding technique by introducing gas tracers subjected to convection and diffusion. We then analyze the concentration profiles by observing the mixing and transport of the gas tracers, gaining insights into their convective and diffusive patterns, and work towards heat and mass transfer methods.
Finally, we run rCFD simulations and calibrate their numerical and physical parameters against a conventional two-fluid model (full CFD) simulation. As a result, this study gives a clear indication of the applicability, predictive capabilities, and existing limitations of rCFD in the realm of fluidization modelling.
Keywords: multiphase flow, recurrence CFD, two-fluid model, industrial processes
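The recurrence idea behind rCFD can be caricatured in a few lines: store states from a short reference run, then time-extrapolate by repeatedly jumping to the successor of the nearest stored state. This is a drastically simplified toy on a two-component state, not the authors’ solver:

```python
import numpy as np

rng = np.random.default_rng(0)

# Short reference run: a noisy quasi-periodic state (stand-in for stored CFD fields).
t = np.linspace(0, 4 * np.pi, 200)
database = np.column_stack([np.sin(t), np.cos(t)]) + rng.normal(0, 0.02, (200, 2))

def recurrence_step(state, database):
    """Return the stored successor of the database state nearest to `state`."""
    dists = np.linalg.norm(database[:-1] - state, axis=1)
    return database[np.argmin(dists) + 1]

# Long "simulation": replay recurrent states far beyond the stored run length.
state = database[0]
trajectory = [state]
for _ in range(1000):                        # 5x longer than the reference run
    state = recurrence_step(state, database)
    trajectory.append(state)

print(f"extrapolated {len(trajectory)} steps from {len(database)} stored states")
```

The speed-up comes from replacing the solution of the momentum equations with this cheap database lookup; only passive quantities (tracers, heat) still need to be transported on the replayed flow fields.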
Procedia PDF Downloads 73
17821 Psychological Testing in Industrial/Organizational Psychology: Validity and Reliability of Psychological Assessments in the Workplace
Authors: Melissa C. Monney
Abstract:
Psychological testing has been of interest to researchers for many years, providing useful tools for assessing and diagnosing various disorders as well as for understanding human behavior. However, for over 20 years now, researchers and laypersons alike have been interested in using psychological tests for other purposes, such as informing decisions on employee selection, promotion, and even termination. In recent years, psychological assessments have been useful in facilitating workplace decision processes regarding employee movement within organizations. This literature review explores four of the most commonly used psychological tests in workplace environments, namely cognitive ability, emotional intelligence, integrity, and personality tests, as organizations have used these tests to assess different factors of human behavior as predictive measures of future employee behavior. The findings suggest that, while there is much controversy and debate regarding the validity and reliability of these tests in workplace settings, as they were not originally designed for these purposes, their use in the workplace has helped decrease costs and employee turnover and increase job satisfaction by ensuring the right employees are selected for their roles.
Keywords: cognitive ability, personality testing, predictive validity, workplace behavior
Procedia PDF Downloads 241
17820 Development of Graph-Theoretic Model for Ranking Top of Rail Lubricants
Authors: Subhash Chandra Sharma, Mohammad Soleimani
Abstract:
Selection of the correct lubricant for a top-of-rail application is a complex process. In this paper, a method for selecting the proper lubricant for a top-of-rail (TOR) lubrication system, based on graph theory and the matrix approach, is developed. The attributes influencing the selection process, and their influence on each other, are represented through a digraph and an equivalent matrix. A matrix function called the permanent function is derived. By substituting, qualitatively, the level of inherent contribution of the influencing parameters and their influence on each other, a criterion called the suitability index is derived. Based on these indices, lubricants can be ranked by their suitability. The proposed model can be useful for maintenance engineers in selecting the best lubricant for a TOR application. The proposed methodology is illustrated step by step through an example.
Keywords: lubricant selection, top of rail lubrication, graph theory, ranking of lubricants
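The permanent function named above is the determinant-like expansion with all signs positive. A brute-force version for a small attribute matrix looks like this; the 3x3 values below are hypothetical, not the paper’s attribute digraph:

```python
from itertools import permutations
from math import prod

def permanent(m):
    """Permanent of a square matrix: a determinant with every sign positive."""
    n = len(m)
    return sum(prod(m[i][p[i]] for i in range(n)) for p in permutations(range(n)))

# Hypothetical 3-attribute matrix for one candidate lubricant:
# diagonal = inherent contribution of each attribute, off-diagonal = mutual influence.
A = [[4, 2, 1],
     [3, 5, 2],
     [1, 2, 3]]
suitability_index = permanent(A)   # higher index suggests a more suitable lubricant
print(suitability_index)           # prints 109
```

Unlike the determinant, the permanent never cancels terms, which is why it is used here: every attribute contribution and interaction adds to the index. Brute force is O(n!); for larger attribute sets, Ryser’s formula is the standard speed-up.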
Procedia PDF Downloads 293
17819 Revolutionary Wastewater Treatment Technology: An Affordable, Low-Maintenance Solution for Wastewater Recovery and Energy-Saving
Authors: Hady Hamidyan
Abstract:
As the global population continues to grow, the demand for clean water and effective wastewater treatment becomes increasingly critical. By 2030, global water demand is projected to exceed supply by 40%, driven by population growth, increased water usage, and climate change. Currently, about 4.2 billion people lack access to safely managed sanitation services. The wastewater treatment sector faces numerous challenges, including the need for energy-efficient solutions, cost-effectiveness, ease of use, and low maintenance requirements. This abstract presents a groundbreaking wastewater treatment technology that addresses these challenges by offering an energy-saving approach, wastewater recovery capabilities, and a ready-made, affordable, and user-friendly package with minimal maintenance costs. The unique design of this ready-made package made it possible to eliminate the need for pumps, filters, airlift, and other common equipment. Consequently, it enables sustainable wastewater treatment management with exceptionally low energy and cost requirements, minimizing investment and maintenance expenses. The operation of these packages is based on continuous aeration, which involves injecting oxygen gas or air into the aeration chamber through a tubular diffuser with very small openings. This process supplies the necessary oxygen for aerobic bacteria. The recovered water, which amounts to almost 95% of the input, can be treated to meet specific quality standards, allowing safe reuse for irrigation, industrial processes, or even potable purposes. This not only reduces the strain on freshwater resources but also provides economic benefits by offsetting the costs associated with freshwater acquisition and wastewater discharge. The ready-made, affordable, and user-friendly nature of this technology makes it accessible to a wide range of users, including small communities, industries, and decentralized wastewater treatment systems. 
The system incorporates user-friendly interfaces, simplified operational procedures, and integrated automation, facilitating easy implementation and operation. Additionally, the use of durable materials, efficient equipment, and advanced monitoring systems significantly reduces maintenance requirements, resulting in low overall life-cycle costs and alleviating the burden on operators and maintenance personnel. In conclusion, the presented wastewater treatment technology offers a comprehensive solution to the challenges faced by the industry. Its energy-saving approach, combined with wastewater recovery capabilities, ensures sustainable resource management and enhances environmental stewardship. This affordable, ready-made, and low-maintenance package promotes broad adoption across various sectors and communities, contributing to a more sustainable future for water and wastewater management.
Keywords: wastewater treatment, energy saving, wastewater recovery, affordable package, low maintenance costs, sustainable resource management, environmental stewardship
Procedia PDF Downloads 91
17818 Quantitative Assessment of Road Infrastructure Health Using High-Resolution Remote Sensing Data
Authors: Wang Zhaoming, Shao Shegang, Chen Xiaorong, Qi Yanan, Tian Lei, Wang Jian
Abstract:
This study conducts a comparative analysis of the spectral curves of asphalt pavements at various aging stages to improve road information extraction from high-resolution remote sensing imagery. By examining the distinguishing capabilities and spectral characteristics, the research aims to establish a pavement information extraction methodology based on China's high-resolution satellite images. The process begins by analyzing the spectral features of asphalt pavements to construct a spectral assessment model suitable for evaluating pavement health. This model is then tested at a national highway traffic testing site in China, validating its effectiveness in distinguishing different pavement aging levels. The study's findings demonstrate that the proposed model can accurately assess road health, offering a valuable tool for road maintenance planning and infrastructure management.
Keywords: spectral analysis, asphalt pavement aging, high-resolution remote sensing, pavement health assessment
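A minimal sketch of the kind of spectral assessment step described above might assign an aging class by nearest reference spectrum, exploiting the fact that weathered asphalt is generally brighter than fresh asphalt in the visible-NIR range. The band centers and reflectance curves below are invented for illustration and are not the study’s model:

```python
import numpy as np

wavelengths = np.linspace(400, 1000, 7)          # nm, assumed band centers

# Synthetic mean reflectance curves (fractions) for three assumed aging stages.
curves = {
    "new":      np.array([0.04, 0.05, 0.05, 0.06, 0.06, 0.07, 0.07]),
    "moderate": np.array([0.08, 0.10, 0.11, 0.13, 0.14, 0.15, 0.16]),
    "aged":     np.array([0.15, 0.18, 0.21, 0.24, 0.26, 0.28, 0.30]),
}

def classify(pixel_spectrum, curves):
    """Assign the aging class whose reference curve is nearest (Euclidean)."""
    return min(curves, key=lambda k: np.linalg.norm(curves[k] - pixel_spectrum))

sample = np.array([0.09, 0.11, 0.12, 0.13, 0.15, 0.16, 0.17])  # unknown pixel
print(classify(sample, curves))
```

Applied per pixel over a road mask extracted from the imagery, such a classifier yields a map of aging levels that maintenance planners could prioritize against.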
Procedia PDF Downloads 19