Search results for: time delayed SIR epidemic model
30554 Comparison of Accumulated Stress Based Pore Pressure Model and Plasticity Model in 1D Site Response Analysis
Authors: Saeedullah J. Mandokhail, Shamsher Sadiq, Meer H. Khan
Abstract:
This paper presents a comparison of the excess pore water pressure ratio (ru) predicted by an accumulated stress based pore pressure model and by a plasticity model. One-dimensional effective stress site response analyses were performed on a 30 m deep sand column (consisting of a liquefiable layer between non-liquefiable layers) using the accumulated stress based pore pressure model in Deepsoil and the PDMY2 (PressureDependentMultiYield02) model in OpenSees. Three input motions with different peak ground acceleration (PGA) levels of 0.357 g, 0.124 g, and 0.11 g were used in this study. The excess pore pressure ratios predicted by the two models were compared and analyzed along the depth. The time histories of ru at the middle of the liquefiable and the non-liquefiable layers were also compared. The comparisons show that the two models predict mostly similar ru values. The predicted ru is also consistent with the PGA level of the input motions.
Keywords: effective stress, excess pore pressure ratio, pore pressure model, site response analysis
Procedia PDF Downloads 226
30553 Predictive Analytics in Traffic Flow Management: Integrating Temporal Dynamics and Traffic Characteristics to Estimate Travel Time
Authors: Maria Ezziani, Rabie Zine, Amine Amar, Ilhame Kissani
Abstract:
This paper introduces a predictive model for urban transportation engineering, which is vital for efficient traffic management. Utilizing comprehensive datasets and advanced statistical techniques, the model accurately forecasts travel times by considering temporal variations and traffic dynamics. Machine learning algorithms, including regression trees and neural networks, are employed to capture sequential dependencies. Results indicate significant improvements in predictive accuracy, particularly during peak hours and holidays, with the incorporation of traffic flow and speed variables. Future enhancements may integrate weather conditions and traffic incidents. The model's applications range from adaptive traffic management systems to route optimization algorithms, facilitating congestion reduction and enhancing journey reliability. Overall, this research extends beyond travel time estimation, offering insights into broader transportation planning and policy-making realms, empowering stakeholders to optimize infrastructure utilization and improve network efficiency.
Keywords: predictive analytics, traffic flow, travel time estimation, urban transportation, machine learning, traffic management
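A minimal sketch of the regression-tree branch of such a model, using temporal and traffic variables as predictors (Python/scikit-learn; the column names and the synthetic data are illustrative assumptions, not the authors' dataset):

```python
# Regression-tree travel-time predictor with temporal and traffic features.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "hour": rng.integers(0, 24, n),           # temporal variation
    "is_holiday": rng.integers(0, 2, n),      # calendar effect
    "flow_veh_h": rng.uniform(200, 2000, n),  # traffic flow
    "speed_kmh": rng.uniform(10, 90, n),      # traffic speed
})
# Synthetic target: travel time grows with flow and drops with speed.
df["travel_time_min"] = (
    30 * df["flow_veh_h"] / 2000 + 600 / df["speed_kmh"]
    + 5 * df["is_holiday"] + rng.normal(0, 2, n)
)

X = df.drop(columns="travel_time_min")
y = df["travel_time_min"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = DecisionTreeRegressor(max_depth=8, random_state=0).fit(X_tr, y_tr)
print("MAE [min]:", mean_absolute_error(y_te, model.predict(X_te)))
```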
Procedia PDF Downloads 82
30552 Impact of Enhanced Business Models on Technology Companies in the Pandemic: A Case Study about the Revolutionary Change in Management Styles
Authors: Murat Colak, Berkay Cakir Saridogan
Abstract:
Since the dawn of modern corporations, almost every employee has worked in the same loop, which contains three basic steps: going to work, providing what the work requires, and getting back home. Only a small number of people were able to break that standard and live outside the box. As the 2019 pandemic hit and most companies shut down their physical offices, that loop had to change for everyone. This means that the old management styles had to be significantly rearranged into "work from home" business methods. The methods include online conferences and meetings, time and task tracking using algorithms, globalization of the work, and, most importantly, remote working. After the global epidemic started, even the tech giants were concerned. Now, it can be seen that technology companies have had an incredible step-up in their shares compared to other companies because they know how to manage such situations better than any other industry. This study aims to take the old traditional management styles in big companies and compare them with the post-Covid methods (2019-2022). As a result of this comparison, made using annual reports and shared statistics, this study aims to explain why the winners of this crisis are the technology companies.
Keywords: Covid-19, technology companies, business models, remote work
Procedia PDF Downloads 63
30551 Forming Simulation of Thermoplastic Pre-Impregnated Textile Composite
Authors: Masato Nishi, Tetsushi Kaburagi, Masashi Kurose, Tei Hirashima, Tetsusei Kurasiki
Abstract:
The process of thermoforming carbon fiber reinforced thermoplastic (CFRTP) has increased its presence in the automotive industry because of its wide applicability to mass-produced cars. Non-isothermal forming of CFRTP can shorten the cycle time to less than one minute. In this paper, the textile reinforcement FE model which the authors proposed in a previous work is extended to a CFRTP model for non-isothermal forming simulation. The effect of the thermoplastic is introduced by adding shell elements, which account for thermal effects, to the textile reinforcement model. By applying the Reuss model to the stress calculation of the thermoplastic, the proposed model can accurately predict in-plane shear behavior, which is the key deformation mode during forming, over the range of process temperatures. Using the proposed model, a thermoforming simulation was conducted, and the results are in good agreement with the experimental results.
Keywords: carbon fiber reinforced thermoplastic, finite element analysis, pre-impregnated textile composite, non-isothermal forming
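The Reuss (iso-stress, series-coupling) estimate mentioned for the thermoplastic stress calculation reduces to the inverse rule of mixtures. A minimal sketch, with illustrative property values rather than those of the studied CFRTP:

```python
# Reuss (iso-stress) rule of mixtures for a fibre/matrix composite.
def reuss_modulus(v_f: float, e_f: float, e_m: float) -> float:
    """Reuss lower bound: 1/E = v_f/E_f + (1 - v_f)/E_m (series coupling)."""
    return 1.0 / (v_f / e_f + (1.0 - v_f) / e_m)

# Example: 55% fibre volume fraction; stiff carbon fibre, soft molten matrix.
print(reuss_modulus(v_f=0.55, e_f=230e9, e_m=1.5e9))  # effective modulus [Pa]
```

Because the Reuss bound is matrix-dominated, it captures how the softened thermoplastic controls in-plane shear at process temperature.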
Procedia PDF Downloads 428
30550 Variability Management of Contextual Feature Model in Multi-Software Product Line
Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz
Abstract:
The Software Product Line (SPL) paradigm is used for the development of a family of software products that share common and variable features. A feature model is a domain artifact of SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs consist of a number of similar common and variable features, such as mobile phones and tablets. Reusing common and variable features from different SPL domains is a complex task due to the external relationships and constraints between features in the feature model. To increase the reusability of feature model resources from domain engineering, it is necessary to manage the commonality of features at the level of SPL application development. In this research, we have proposed an approach that combines multiple SPLs into a single domain and converts them to a common feature model. Extracting the common features from different feature models is more effective and reduces cost and time to market for application development. For extracting features from multiple SPLs, the proposed framework consists of three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of the variation points and constraints. By using this approach, the reusability of features from multiple feature models can be increased. The impact of this research is to reduce development cost and time to market and to increase the number of SPL products.
Keywords: software product line, feature model, variability management, multi-SPLs
Procedia PDF Downloads 65
30549 Nonlinear Modeling of the PEMFC Based on NNARX Approach
Authors: Shan-Jen Cheng, Te-Jen Chang, Kuang-Hsiung Tan, Shou-Ling Kuo
Abstract:
The Polymer Electrolyte Membrane Fuel Cell (PEMFC) is a time-varying nonlinear dynamic system. Traditional linear modeling approaches can hardly estimate the structure of a PEMFC system correctly. For this reason, this paper presents nonlinear modeling of the PEMFC using the Neural Network Auto-regressive model with eXogenous inputs (NNARX) approach. A multilayer perceptron (MLP) network is applied to evaluate the structure of the NNARX model of the PEMFC. The validity and accuracy of the NNARX model are tested by one-step-ahead prediction relating output voltage to input current, using experimental measurements from the PEMFC. The results show that the obtained nonlinear NNARX model can efficiently approximate the dynamics of the PEMFC, with the model output and the measured system output in consistent agreement.
Keywords: PEMFC, neural network, nonlinear modeling, NNARX
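A minimal NNARX sketch in this spirit: an MLP maps lagged voltages and currents to the next voltage for one-step-ahead prediction (Python/scikit-learn; the synthetic current-voltage trace is an assumption standing in for the PEMFC measurements):

```python
# NNARX: MLP regression on lagged outputs (voltage) and inputs (current).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
t = np.arange(2000)
i_cell = 10 + 2 * np.sin(2 * np.pi * t / 200) + rng.normal(0, 0.1, t.size)
v_cell = 0.9 - 0.02 * i_cell + rng.normal(0, 0.005, t.size)  # toy polarization

na = nb = 3                       # output and input lags (ARX orders)
rows, targets = [], []
for k in range(max(na, nb), len(v_cell)):
    past_v = v_cell[k - na:k]     # V(k-3..k-1)
    past_i = i_cell[k - nb:k]     # I(k-3..k-1)
    rows.append(np.concatenate([past_v, past_i]))
    targets.append(v_cell[k])
X, y = np.asarray(rows), np.asarray(targets)

mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(X[:1500], y[:1500])
print("one-step-ahead R^2:", mlp.score(X[1500:], y[1500:]))
```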
Procedia PDF Downloads 380
30548 Verification and Proposal of Information Processing Model Using EEG-Based Brain Activity Monitoring
Authors: Toshitaka Higashino, Naoki Wakamiya
Abstract:
Human beings perform a task by perceiving information from outside, recognizing it, and responding to it. There have been various attempts to analyze and understand the internal processes behind the reaction to a given stimulus by conducting psychological experiments and analyses from multiple perspectives. Among these, we focused on the Model Human Processor (MHP). However, it was built on psychological experiments, and thus its relation to brain activity has been unclear so far. To verify the validity of the MHP and propose our own model from a neuroscience viewpoint, EEG (electroencephalography) measurements were performed during experiments in this study. More specifically, first, experiments were conducted in which Latin alphabet characters were used as visual stimuli. In addition to response time, ERPs (event-related potentials) such as N100 and P300 were measured by using EEG. By comparing the cycle time predicted by the MHP and the latency of the ERPs, it was found that N100, related to perception of stimuli, appeared at the end of the perceptual processor. Furthermore, an additional experiment revealed that P300, related to decision making, appeared during the response decision process, not at its end. Second, experiments using Japanese Hiragana characters, i.e., Japan's own phonetic symbols, confirmed those findings. Finally, Japanese Kanji characters were used as more complicated visual stimuli. A Kanji character usually has several readings and several meanings. Despite the difference, a reading-related task and a meaning-related task exhibited similar results, meaning that they involved similar information processing in the brain. Based on those results, our model was proposed, which reflects response time and ERP latency. It consists of three processors: the perception processor, from the input of a stimulus to the appearance of N100; the cognitive processor, from N100 to P300; and the decision-action processor, from P300 to the response. Using our model, an application system which reflects brain activity can be established.
Keywords: brain activity, EEG, information processing model, model human processor
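A minimal sketch of the proposed three-processor decomposition, reading stage durations off ERP latencies (the millisecond values are illustrative, not the study's measurements):

```python
# Stage durations of the proposed model from ERP latencies and response time.
def stage_durations(n100_ms, p300_ms, response_ms):
    return {
        "perception (stimulus -> N100)": n100_ms,
        "cognition (N100 -> P300)": p300_ms - n100_ms,
        "decision-action (P300 -> response)": response_ms - p300_ms,
    }

for stage, ms in stage_durations(n100_ms=100, p300_ms=350,
                                 response_ms=520).items():
    print(f"{stage}: {ms} ms")
```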
Procedia PDF Downloads 96
30547 Real Time Detection, Prediction and Reconstitution of Rain Drops
Authors: R. Burahee, B. Chassinat, T. de Laclos, A. Dépée, A. Sastim
Abstract:
The purpose of this paper is to propose a solution to detect, predict and reconstitute rain drops in real time – during the night – using an embedded system with an infrared camera. To keep the hardware requirements low, simple models are used in an image-processing algorithm that considerably reduces calculation time, implemented with the OpenCV library. Using a smart model, drops are matched through a process running over two consecutive pictures, implementing a tracking system. With this system, each drop's computed trajectory provides the information needed to predict its future location. Thanks to this technique, the processing load can be reduced. The hardware system, built around a Raspberry Pi, is optimized to host this code efficiently for real-time execution.
Keywords: reconstitution, prediction, detection, rain drop, real time, raspberry, infrared
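A minimal sketch of the two-frame matching idea with OpenCV: moving drops are detected by differencing consecutive frames, and a constant-velocity assumption would predict each drop's next position (the camera index and thresholds are assumptions):

```python
# Two-frame drop detection by frame differencing on an infrared stream.
import cv2

cap = cv2.VideoCapture(0)               # infrared camera (assumed device 0)
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)      # moving drops light up
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            # Matching (cx, cy) to the nearest drop in the previous frame
            # yields a velocity vector; next position = position + velocity.
    prev = gray

cap.release()
```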
Procedia PDF Downloads 417
30546 Statistical Inferences for GQARCH-Itô-Jumps Model Based on the Realized Range Volatility
Authors: Fu Jinyu, Lin Jinguan
Abstract:
This paper introduces a novel approach that unifies two types of models: the continuous-time jump-diffusion used to model high-frequency data, and the discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the "GQARCH-Itô-Jumps model." We adopt realized range-based threshold estimation for the high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information on the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for the parametric estimation. The asymptotic theory is mainly established for the proposed estimators in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite sample performance of the proposed methodology. Specifically, it is demonstrated how our proposed approaches can be practically used on financial data.
Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate
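A minimal sketch of a realized range-based variance estimator: per intraday block, the squared log range is scaled by its Brownian expectation 4·ln 2 and summed (the simulated price path and block sizes are illustrative; the paper's threshold correction for jumps is omitted):

```python
# Realized range variance: sum of scaled squared log ranges over blocks.
import numpy as np

rng = np.random.default_rng(2)
n_steps, n_intervals = 23400, 78          # 1-second ticks, 5-minute blocks
log_p = np.cumsum(rng.normal(0, 0.0002, n_steps))  # toy log-price path

blocks = np.array_split(log_p, n_intervals)
ranges = np.array([b.max() - b.min() for b in blocks])
rrv = np.sum(ranges ** 2) / (4 * np.log(2))        # realized range variance
print("daily realized-range volatility:", np.sqrt(rrv))
```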
Procedia PDF Downloads 154
30545 Comparative Analysis of Dissimilarity Detection between Binary Images Based on Equivalency and Non-Equivalency of Image Inversion
Authors: Adnan A. Y. Mustafa
Abstract:
Image matching is a fundamental problem that arises frequently in many aspects of robot and computer vision. It can become a time-consuming process when matching images against a database consisting of hundreds of images, especially if the images are big. One approach to reducing the time complexity of the matching process is to reduce the search space in a pre-matching stage by quickly removing dissimilar images. The Probabilistic Matching Model for Binary Images (PMMBI) showed that dissimilarity detection between binary images can be accomplished quickly by random pixel mapping and is size invariant. The model is based on the gamma binary similarity distance, which recognizes an image and its inverse as containing the same scene and hence considers them to be the same image. However, in many applications, an image and its inverse are not treated as the same but rather as dissimilar. In this paper, we present a comparative analysis of dissimilarity detection between the PMMBI, based on the gamma binary similarity distance, and a modified PMMBI, based on a similarity distance that does distinguish between an image and its inverse and treats them as dissimilar.
Keywords: binary image, dissimilarity detection, probabilistic matching model for binary images, image mapping
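A minimal sketch of the contrast being analyzed: dissimilarity by random pixel mapping, once with an inversion-equivalent distance (gamma-style) and once with a modified distance that treats an image and its inverse as dissimilar (sample size and data are illustrative assumptions):

```python
# Random-pixel-mapping dissimilarity, with and without inversion equivalence.
import numpy as np

def mismatch_rate(a: np.ndarray, b: np.ndarray, n_samples=500, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, a.size, n_samples)      # random pixel mapping
    return np.mean(a.ravel()[idx] != b.ravel()[idx])

def gamma_distance(a, b):       # inversion-equivalent (original PMMBI style)
    return min(mismatch_rate(a, b), mismatch_rate(a, 1 - b))

def modified_distance(a, b):    # an image and its inverse are dissimilar
    return mismatch_rate(a, b)

img = np.random.default_rng(1).integers(0, 2, (128, 128))
print(gamma_distance(img, 1 - img))     # 0.0 -> same scene
print(modified_distance(img, 1 - img))  # 1.0 -> dissimilar
```

Because only a fixed number of randomly mapped pixels is sampled, the cost is independent of image size, which is the size-invariance property the abstract mentions.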
Procedia PDF Downloads 151
30544 Defining a Holistic Approach for Model-Based System Engineering: Paradigm and Modeling Requirements
Authors: Hycham Aboutaleb, Bruno Monsuez
Abstract:
Current systems complexity has reached a degree that requires addressing conception and design issues while taking into account all the necessary aspects. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost and time investment of complex systems in the modeling phase emphasize the need for a paradigm, a framework and an environment to handle the system model complexity. For that, it is necessary to understand the expectations of the human users of the model and their limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and defines the refined functional as well as non-functional requirements that modeling tools need to meet to be useful in model-based system engineering.
Keywords: system modeling, modeling language, modeling requirements, framework
Procedia PDF Downloads 529
30543 Time-Dependent Analysis of Composite Steel-Concrete Beams Subjected to Shrinkage
Authors: Rahal Nacer, Beghdad Houda, Tehami Mohamed, Souici Abdelaziz
Abstract:
The shrinkage of concrete causes undesirable parasitic effects in a structure and can harm its resistance and appearance. Long-term behaviour modelling of steel-concrete composite beams requires the use of the time variable and taking into account the entire sustained stress history of the concrete slab constituting the cross section. The work introduced in this article is a theoretical study of the behaviour of composite beams with respect to the phenomenon of concrete shrinkage. Using the theory of the linear viscoelasticity of concrete and the rate of creep method, an analytical model is proposed, made up of a system of two linear differential equations, emphasizing the effects caused by shrinkage on the resistance of steel-concrete composite beams. Results obtained from the application of the suggested model to a steel-concrete composite beam are satisfactory.
Keywords: composite beams, shrinkage, time, rate of creep method, viscoelasticity theory
Procedia PDF Downloads 527
30542 Information Construction of Higher Education in Teaching Practice
Authors: Yang Meng, James L. Patnao
Abstract:
With the rapid development of information technology and the impact of the epidemic environment, the traditional teaching model can no longer meet the requirements of the times. The development of informatized teaching mechanisms is the inevitable trend of the future development of higher education. We must further promote the informatization of higher education in teaching practice, let modern information technology penetrate classroom teaching, and provide promising opportunities for the high-quality development of higher education. This article mainly uses questionnaires distributed to teachers of colleges and universities to understand the degree of informatization in college and university teaching. On the basis of domestic and foreign scholars' research on higher education informatization, it analyzes the existing problems and seeks the optimal solution based on the needs of education and teaching development. According to the survey results, most college teachers use information technology in teaching practice, but the tools they use are relatively simple, and most of them only use slides. In addition, backward informatization infrastructure and scarce informatization training are the main challenges facing the current construction of teaching informatization. If colleges and universities can make good use of information and multimedia technology and combine them with traditional teaching, this will promote the development of college education and further advance the modernization and informatization of higher education.
Keywords: higher education, teaching practice, informatization construction, e-education
Procedia PDF Downloads 120
30541 WhatsApp as Part of a Blended Learning Model to Help Programming Novices
Authors: Tlou J. Ramabu
Abstract:
Programming is one of the challenging subjects in the field of computing. In the higher education sphere, some programming novices' performance, retention rate, and success rate are not improving. Most of the time, the problem is caused by the slow pace of learning, difficulty in grasping the syntax of the programming language, and poor logical skills. More importantly, programming forms part of major subjects within the field of computing. As a result, specialized pedagogical methods and innovation are highly recommended. Little research has been done on the potential productivity of the WhatsApp platform as part of a blended learning model. In this article, the authors discuss the WhatsApp group as part of a blended learning model incorporated for a group of programming novices. We discuss possible administrative activities for productive utilisation of the WhatsApp group within the blended learning model. The aim is to take advantage of the popularity of WhatsApp and the time students spend on it for their educational purposes. We believe that blended learning featuring a WhatsApp group may ease novices' cognitive load and strengthen their foundational programming knowledge and skills. This is a work in progress, as the proposed blended learning model with WhatsApp incorporated is yet to be implemented.
Keywords: blended learning, higher education, WhatsApp, programming, novices, lecturers
Procedia PDF Downloads 171
30540 A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model
Authors: Fariba Azizi, Firoozeh Haghighi, Viliam Makis
Abstract:
In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test where the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value. No assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure time of products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is only known to belong to a certain subset of all possible failures. This case is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of parameters is studied through a Monte-Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.
Keywords: cause of failure, linear degradation path, reliability function, expectation-maximization algorithm, intensity, masked data
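A toy sketch of the EM idea for masked causes, simplified to two constant (exponential) cause-specific hazards: the E-step splits each masked failure between causes in proportion to the current hazards, and the M-step re-estimates each hazard as expected events over total time at risk. The step-stress TFR structure and the degradation path of the paper are omitted:

```python
# EM for two competing exponential risks with 30% masked failure causes.
import numpy as np

rng = np.random.default_rng(3)
lam_true = np.array([0.02, 0.05])
n = 400
latent = rng.exponential(1 / lam_true, size=(n, 2))  # latent failure times
t_fail = latent.min(axis=1)
cause = latent.argmin(axis=1)
masked = rng.random(n) < 0.3                         # cause unknown for these

lam = np.array([0.01, 0.01])                         # initial guess
for _ in range(200):                                 # EM iterations
    w = np.zeros((n, 2))
    w[~masked, cause[~masked]] = 1.0                 # known causes
    w[masked] = lam / lam.sum()                      # E-step: posterior weights
    lam = w.sum(axis=0) / t_fail.sum()               # M-step: events / exposure
print("estimated hazards:", lam, "true:", lam_true)
```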
Procedia PDF Downloads 329
30539 VeriFy: A Solution to Implement Autonomy Safely and According to the Rules
Authors: Michael Naderhirn, Marco Pavone
Abstract:
Problem statement, motivation, and aim of work: So far, control algorithms have been developed by control engineers so that the controller fits a specification by testing. When it comes to certifying an autonomous car in highly complex scenarios, the challenge is much greater, since such a controller must mathematically guarantee that it implements the rules of the road while also guaranteeing aspects like safety and real-time executability. What if it became possible to solve this demanding problem by combining formal verification and system theory? The aim of this work is to present a workflow to solve the above-mentioned problem. Summary of the presented results / main outcomes: We show the use of an English-like language to transform the rules of the road into system specifications for an autonomous car. The language-based specifications are used to define system functions and interfaces. Based on that, a formal model is developed which correctly captures the specifications. On the other side, a mathematical model describing the system's dynamics is used to calculate the system's reachability set, which is further used to determine the system input boundaries. Then a motion planning algorithm is applied inside the system boundaries to find an optimized trajectory, in combination with the formal specification model, while satisfying the specifications. The result is a control strategy which can be applied in real time, independent of the scenario, with a mathematical guarantee of satisfying a predefined specification. We demonstrate the applicability of the method in simulated driving scenarios and a potential certification. Originality, significance, and benefit: To the authors' best knowledge, it is the first time that an automated workflow can be shown which combines a specification in an English-like language and a mathematical model in a formally verified way to synthesize a controller for potential real-time applications like autonomous driving.
Keywords: formal system verification, reachability, real time controller, hybrid system
Procedia PDF Downloads 241
30538 Assessing the Survival Time of Hospitalized Patients in Eastern Ethiopia During 2019–2020 Using the Bayesian Approach: A Retrospective Cohort Study
Authors: Chalachew Gashu, Yoseph Kassa, Habtamu Geremew, Mengestie Mulugeta
Abstract:
Background and Aims: Severe acute malnutrition remains a significant health challenge, particularly in low- and middle-income countries. The aim of this study was to determine the survival time of under-five children with severe acute malnutrition. Methods: A retrospective cohort study was conducted at a hospital, focusing on under-five children with severe acute malnutrition. The study included 322 inpatients admitted to the Chiro hospital in Chiro, Ethiopia, between September 2019 and August 2020, whose data were obtained from medical records. Survival functions were analyzed using Kaplan–Meier plots and log-rank tests. The survival time of severe acute malnutrition was further analyzed using the Cox proportional hazards model and Bayesian parametric survival models, employing integrated nested Laplace approximation methods. Results: Among the 322 patients, 118 (36.6%) died as a result of severe acute malnutrition. The estimated median survival time for inpatients was found to be 2 weeks. Model selection criteria favored the Bayesian Weibull accelerated failure time model, which demonstrated that age, body temperature, pulse rate, nasogastric (NG) tube usage, hypoglycemia, anemia, diarrhea, dehydration, malaria, and pneumonia significantly influenced the survival time of severe acute malnutrition. Conclusions: This study revealed that children below 24 months, those with altered body temperature and pulse rate, NG tube usage, hypoglycemia, and comorbidities such as anemia, diarrhea, dehydration, malaria, and pneumonia had a shorter survival time when affected by severe acute malnutrition under the age of five. To reduce the death rate of children under 5 years of age, it is necessary to design community management for acute malnutrition to ensure early detection and improve access to and coverage for children who are malnourished.
Keywords: Bayesian analysis, severe acute malnutrition, survival data analysis, survival time
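A minimal sketch of the survival workflow on synthetic data: Kaplan–Meier estimation followed by a Weibull accelerated failure time model. The paper fits the Weibull AFT in a Bayesian way via INLA; the maximum-likelihood fitter from the lifelines package is used here as a stand-in, and the covariates and effect sizes are invented:

```python
# Kaplan-Meier curve and Weibull AFT fit on a toy censored dataset.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, WeibullAFTFitter

rng = np.random.default_rng(8)
n = 200
age_under_24m = rng.integers(0, 2, n)
anemia = rng.integers(0, 2, n)
# Shorter survival for younger and anemic children (toy effect sizes).
scale = 6.0 * np.exp(-0.5 * age_under_24m - 0.3 * anemia)
t = rng.weibull(1.5, n) * scale                  # true event times [weeks]
censor = rng.uniform(1, 12, n)                   # administrative censoring
df = pd.DataFrame({
    "weeks": np.minimum(t, censor),
    "died": (t <= censor).astype(int),
    "age_under_24m": age_under_24m,
    "anemia": anemia,
})

km = KaplanMeierFitter().fit(df["weeks"], df["died"])
print("median survival [weeks]:", km.median_survival_time_)

aft = WeibullAFTFitter().fit(df, duration_col="weeks", event_col="died")
aft.print_summary()
```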
Procedia PDF Downloads 44
30537 Numerical Simulation of Urea Water Solution Evaporation Behavior inside the Diesel Selective Catalytic Reduction System
Authors: Kumaresh Selvakumar, Man Young Kim
Abstract:
Selective catalytic reduction (SCR) converts nitrogen oxides with the aid of a catalyst by adding aqueous urea into the exhaust stream. In this work, the urea water droplets sprayed into the exhaust gases are treated with Lagrangian particle tracking. The evaporation of ammonia from a single droplet of urea water solution is investigated computationally with a convection-diffusion controlled model. The conversion to ammonia due to thermolysis of the urea water droplets is measured downstream at different sections using a finite-rate/eddy-dissipation model. In this paper, the mixer installed upstream enhances the distribution of ammonia over the entire domain, which is calculated for different time steps. Calculations are made within the respective duration such that complete decomposition of urea is possible at a much shorter residence time.
Keywords: convection-diffusion controlled model, Lagrangian particle tracking, selective catalytic reduction, thermolysis
Procedia PDF Downloads 403
30536 Housing Price Dynamics: Comparative Study of 1980-1999 and the New Millennium
Authors: Janne Engblom, Elias Oikarinen
Abstract:
The understanding of housing price dynamics is of importance to a great number of agents: portfolio investors, banks, real estate brokers and construction companies, as well as policy makers and households. A panel dataset is one that follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models which form a wide range of linear models. A special case of panel data models is dynamic in nature. A complication regarding a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Common Correlated Effects (CCE) estimator for dynamic panel data, which also accounts for the cross-sectional dependence caused by common structures of the economy. In the presence of cross-sectional dependence, standard OLS gives biased estimates. In this study, U.S. housing price dynamics were examined empirically using the dynamic CCE estimator, with the first difference of housing prices as the dependent variable and the first differences of per capita income, interest rate, housing stock and lagged price, together with the deviation of housing prices from their long-run equilibrium level, as independent variables. These deviations were also estimated from the data. The aim of the analysis was to provide estimates and to compare them between 1980-1999 and 2000-2012. Based on data for 50 U.S. cities over 1980-2012, the differences in short-run housing price dynamics estimates were mostly significant when the two time periods were compared. Significance tests of the differences were provided by the model containing interaction terms of the independent variables and a time dummy variable. Residual analysis showed very low cross-sectional correlation of the model residuals compared with the standard OLS approach, which indicates a good fit of the CCE estimator model. The estimates of the dynamic panel data model were in line with the theory of housing price dynamics. The results also suggest that the dynamics of the housing market evolve over time.
Keywords: dynamic model, panel data, cross-sectional dependence, interaction model
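A minimal sketch of the Common Correlated Effects idea: each city's regression is augmented with the cross-sectional averages of the dependent and independent variables, which proxy the unobserved common factors, and the unit slopes are averaged (a mean-group variant on simulated data; the variable set is a simplified stand-in for the paper's specification):

```python
# CCE mean-group estimator on a simulated panel with a common factor.
import numpy as np

rng = np.random.default_rng(4)
N, T = 50, 33                                 # cities, years
factor = rng.normal(size=T)                   # common macro factor
x = rng.normal(size=(N, T)) + factor          # e.g. income growth
y = 0.5 * x + rng.normal(size=(N, 1)) * factor + 0.3 * rng.normal(size=(N, T))

x_bar, y_bar = x.mean(axis=0), y.mean(axis=0) # cross-sectional averages

betas = []
for i in range(N):                            # unit-by-unit CCE regressions
    X = np.column_stack([np.ones(T), x[i], x_bar, y_bar])
    beta, *_ = np.linalg.lstsq(X, y[i], rcond=None)
    betas.append(beta[1])
print("CCE mean-group slope estimate:", np.mean(betas))  # close to 0.5
```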
Procedia PDF Downloads 251
30535 Near-Infrared Spectrometry as an Alternative Method for Determination of Oxidation Stability for Biodiesel
Authors: R. Velvarska, A. Vrablik, M. Fiedlerova, R. Cerny
Abstract:
Near-infrared spectrometry (NIR) was tested as a rapid alternative tool for the determination of biodiesel oxidation stability. The PetroOxy method is standardly used for the determination, but it is hazardous due to the possibility of explosion and ignition of flammable fuels; its second disadvantage is that it is time-consuming. Near-infrared spectrometry served for the development of a calibration model that was composed of 133 real samples (calibration standards). The reference values of these standards were obtained by the PetroOxy method. Many chemometric diagnostics were used in the development of the final NIR model, with the aim of accurately predicting the oxidation stability. The final NIR model was validated with 30 validation standards. The repeatability was determined as well, with an acceptable residual standard deviation (8.59%). NIR spectrometry has proved to be an accurate alternative method for the determination of biodiesel oxidation stability, with advantages such as time and cost savings, the non-destructive character of the analysis, and the possibility of safe online monitoring.
Keywords: biodiesel, fatty acid methyl ester, NIR, oxidation stability
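A minimal sketch of an NIR calibration workflow. The abstract does not name the regression technique, so partial least squares is assumed here as the common chemometric choice; the spectra and reference values are synthetic stand-ins for the 133 PetroOxy-referenced standards:

```python
# PLS calibration of oxidation stability against NIR spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_samples, n_wavelengths = 133, 700
spectra = rng.normal(size=(n_samples, n_wavelengths))       # toy NIR spectra
stability_h = 5 + 0.8 * spectra[:, 100] + rng.normal(0, 0.2, n_samples)

pls = PLSRegression(n_components=8)
scores = cross_val_score(pls, spectra, stability_h, cv=10)  # default R^2 score
print("10-fold CV R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```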
Procedia PDF Downloads 174
30534 Time to Pancreatic Surgery after Preoperative Biliary Drainage in Periampullary Cancers: A Systematic Review and Meta-Analysis
Authors: Maatouk Mohamed, Nouira Mariem, Hamdi Kbir Gh, Mahjoubi M. F., Ben Moussa M.
Abstract:
Background and aim: Preoperative biliary drainage (PBD) has been introduced to lower bilirubin levels and to control the negative effects of obstructive jaundice in patients with malignant obstructive jaundice undergoing pancreaticoduodenectomy (PD). The optimal time interval between PBD and PD is still not clear. Delaying surgery by 4 to 6 weeks is the commonly accepted practice. However, delayed PD has been shown to decrease the rate of resection and adversely affect the tumor grading and prognosis. Thus, the purpose of our systematic review and meta-analysis was to evaluate the optimal period for PBD prior to PD, short or prolonged, in terms of postoperative morbidity and survival outcomes. Methods: Trials were searched in PubMed, Science Direct, Google Scholar, and the Cochrane Library until November 2022. Studies using PBD in patients with malignant obstructive jaundice that compared a short-duration group (SDG) (surgery performed within 3-4 weeks) with a prolonged-duration group (PDG) (surgery at least 3-4 weeks after PBD) were included in this study. The risk of bias was assessed using the RoB 2 and ROBINS-I tools. The a priori protocol was published in PROSPERO (ID: CRD42022381405). Results: Seven studies comprising 1625 patients (SDG 870, PDG 882) were included. All studies were non-randomized, and only one was prospective. No significant differences were observed between the SDG and PDG in mortality (OR = 0.59; 95% CI [0.30, 1.17], p = 0.13), major morbidity (Chi² = 30.28, p < 0.00001; I² = 87%), pancreatic fistula (Chi² = 6.61, p = 0.25; I² = 24%), post-pancreatectomy haemorrhage (OR = 1.16; 95% CI [0.67, 2.01], p = 0.59), positive drainage culture (OR = 0.36; 95% CI [0.10, 1.32], p = 0.12), septic complications (OR = 0.78; 95% CI [0.23, 2.72], p = 0.70), wound infection (OR = 0.08, p = 0.07), or operative time (MD = 0.21, p = 0.21). Conclusion: Early surgery within 3 or 4 weeks after biliary drainage is both safe and effective. Thus, it is reasonable to suggest early surgery following PBD for patients with resectable periampullary cancers.
Keywords: preoperative biliary drainage, pancreatic cancer, pancreatic surgery, complication
Procedia PDF Downloads 66
30533 A Model for Solid Transportation Problem with Three Hierarchical Objectives under Uncertain Environment
Authors: Wajahat Ali, Shakeel Javaid
Abstract:
In this study, we have developed a mathematical programming model for a solid transportation problem with three objective functions arranged in hierarchical order. A mathematical programming model with more than one objective function to be solved in hierarchical order is termed a multi-level programming model. Our study explores a Multi-Level Solid Transportation Problem with Uncertain Parameters (MLSTPWU). The proposed MLSTPWU model consists of three objective functions, viz. minimization of transportation cost, minimization of total transportation time, and minimization of deterioration during transportation. These three objective functions are supposed to be solved by decision-makers at three consecutive levels. Three constraint functions are added to the model, restricting the total availability, the total demand, and the capacity of the modes of transportation. All the parameters involved in the model are assumed to be uncertain in nature. A solution method based on fuzzy logic is also discussed to obtain a compromise solution for the proposed model. Further, a simulated numerical example is discussed to establish the efficiency and applicability of the proposed model.
Keywords: solid transportation problem, multi-level programming, uncertain variable, uncertain environment
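A minimal sketch of the hierarchical (lexicographic) idea on a tiny crisp solid transportation instance: cost is minimized first, then time is minimized without worsening the optimal cost. The uncertainty handling, the fuzzy compromise method, and the third objective (deterioration) are omitted, and all numbers are invented:

```python
# Two-level lexicographic solid transportation problem with PuLP.
import pulp

I = J = K = [0, 1]                                # sources, sinks, conveyances
cost = {(i, j, k): 2 + i + j + k for i in I for j in J for k in K}
time_ = {(i, j, k): 5 - k + i for i in I for j in J for k in K}
supply, demand, cap = [60, 40], [50, 50], [70, 70]

x = pulp.LpVariable.dicts("x", (I, J, K), lowBound=0)

def solve_level(obj, extra=None):
    m = pulp.LpProblem("level", pulp.LpMinimize)
    m += pulp.lpSum(obj[i, j, k] * x[i][j][k] for i in I for j in J for k in K)
    for i in I:                                   # availability at sources
        m += pulp.lpSum(x[i][j][k] for j in J for k in K) <= supply[i]
    for j in J:                                   # demand at destinations
        m += pulp.lpSum(x[i][j][k] for i in I for k in K) >= demand[j]
    for k in K:                                   # conveyance capacity
        m += pulp.lpSum(x[i][j][k] for i in I for j in J) <= cap[k]
    if extra is not None:
        m += extra
    m.solve(pulp.PULP_CBC_CMD(msg=False))
    return pulp.value(m.objective)

best_cost = solve_level(cost)                     # level 1: cost
best_time = solve_level(time_, extra=(pulp.lpSum( # level 2: time, cost held
    cost[i, j, k] * x[i][j][k]
    for i in I for j in J for k in K) <= best_cost + 1e-6))
print("lexicographic optimum: cost", best_cost, "time", best_time)
```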
Procedia PDF Downloads 81
30532 Joint Training Offer Selection and Course Timetabling Problems: Models and Algorithms
Authors: Gianpaolo Ghiani, Emanuela Guerriero, Emanuele Manni, Alessandro Romano
Abstract:
In this article, we deal with a variant of the classical course timetabling problem that has a practical application in many areas of education. In particular, in this paper we are interested in high school remedial courses. The purpose of such courses is to provide under-prepared students with the skills necessary to succeed in their studies. A student might be under-prepared in an entire course, or only in a part of it. The limited availability of funds, as well as the limited amount of time and teachers at their disposal, often requires schools to choose which courses and/or which teaching units to activate. Thus, schools need to model the training offer and the related timetabling, with the goal of ensuring the highest possible teaching quality while meeting the above-mentioned financial, time and resource constraints. Moreover, there are some prerequisites between the teaching units that must be satisfied. We first present a Mixed-Integer Programming (MIP) model to solve this problem to optimality. However, the presence of many peculiar constraints inevitably increases the complexity of the mathematical model. Thus, solving it with a general-purpose solver is feasible for small instances only, while solving real-life-sized instances of such a model requires specific techniques or heuristic approaches. For this purpose, we also propose a heuristic approach, in which we make use of a fast constructive procedure to obtain a feasible solution. To assess our exact and heuristic approaches, we perform extensive computational experiments on both real-life instances (obtained from a high school in Lecce, Italy) and randomly generated instances. Our tests show that the MIP model is never solved to optimality, with an average optimality gap of 57%. On the other hand, the heuristic algorithm is much faster (in about 50% of the considered instances it converges in approximately half of the time limit), and in many cases it achieves an improvement on the objective function value obtained by the MIP model. Such an improvement ranges between 18% and 66%.
Keywords: heuristic, MIP model, remedial course, school, timetabling
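A minimal sketch of a fast constructive procedure in the spirit described: teaching units are activated greedily by benefit per unit cost, respecting the budget and the prerequisites between units (data, scores, and costs are invented; the authors' actual heuristic and the timetabling step are not detailed in this abstract):

```python
# Greedy activation of teaching units under budget and prerequisites.
units = {  # unit: (benefit, cost, prerequisites)
    "algebra_1": (30, 10, []),
    "algebra_2": (25, 10, ["algebra_1"]),
    "grammar_1": (20, 8, []),
    "essay_lab": (15, 12, ["grammar_1"]),
}
budget, active = 30, []

candidates = sorted(units, key=lambda u: units[u][0] / units[u][1],
                    reverse=True)
for u in candidates:
    benefit, cost, prereqs = units[u]
    if cost <= budget and all(p in active for p in prereqs):
        active.append(u)
        budget -= cost
print("activated units:", active)  # timetabling would follow on these
```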
Procedia PDF Downloads 604
30531 Implementation and Validation of a Damage-Friction Constitutive Model for Concrete
Authors: L. Madouni, M. Ould Ouali, N. E. Hannachi
Abstract:
Two constitutive models for concrete are available in ABAQUS/Explicit, the Brittle Cracking Model and the Concrete Damaged Plasticity Model, and their suitability and limitations are well known. The aim of the present paper is to implement a damage-friction concrete constitutive model and to evaluate the performance of this model by comparing the predicted response with experimental data. The constitutive formulation of this material model is reviewed. In order to have consistent results, parameter identification and calibration for the model have been performed. Several numerical simulations are presented in this paper, whose results allow for validating the capability of the proposed model to reproduce the typical nonlinear behavior of concrete structures under different monotonic and cyclic load conditions. The results of the evaluation will be used for recommendations concerning the application and further improvement of the investigated model.
Keywords: Abaqus, concrete, constitutive model, numerical simulation
Procedia PDF Downloads 362
30530 On Periodic Integer-Valued Moving Average Models
Authors: Aries Nawel, Bentarzi Mohamed
Abstract:
This paper deals with the study of some probabilistic and statistical properties of a Periodic Integer-Valued Moving Average Model (PINMA_S(q)). Closed forms of the mean, the second moment and the periodic autocovariance function are obtained. Furthermore, the time reversibility of the model is discussed in detail. Moreover, estimates of the underlying parameters are obtained by the Yule-Walker method, the Conditional Least Squares (CLS) method and the Weighted Conditional Least Squares (WCLS) method. A simulation study is carried out to evaluate the performance of the estimation methods. Moreover, an application to a real data set is provided.
Keywords: periodic integer-valued moving average, periodically correlated process, time reversibility, count data
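A minimal sketch of a periodic INMA(1) under the usual binomial-thinning convention, X_t = α_s ∘ ε_{t−1} + ε_t, with season-dependent thinning probability α_s and Poisson innovations, plus a moment check of the periodic mean (the model order and parameter values are illustrative assumptions):

```python
# Simulate a periodic INMA(1) count series and check the periodic mean.
import numpy as np

rng = np.random.default_rng(6)
S = 4                                   # period
alpha = np.array([0.2, 0.5, 0.7, 0.4])  # seasonal thinning probabilities
lam = np.array([3.0, 1.0, 2.0, 4.0])    # seasonal Poisson innovation means

T = 40000
eps = rng.poisson(lam[np.arange(T) % S])
x = np.empty(T, dtype=int)
x[0] = eps[0]
for t in range(1, T):
    # Binomial thinning: each of the eps[t-1] counts survives with prob alpha.
    x[t] = rng.binomial(eps[t - 1], alpha[t % S]) + eps[t]

# Periodic mean: E[X_t] = alpha_s * lam_{s-1} + lam_s (moment check).
for s in range(S):
    emp = x[np.arange(T) % S == s].mean()
    theo = alpha[s] * lam[(s - 1) % S] + lam[s]
    print(f"season {s}: empirical {emp:.2f}, theoretical {theo:.2f}")
```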
Procedia PDF Downloads 200
30529 Integration of Big Data to Predict Transportation for Smart Cities
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Intelligent transportation systems are essential to building smarter cities. Machine-learning-based transportation prediction could be a highly promising approach, making invisible aspects visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. In detail, among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic transportation conditions; thus, bus delays often occur. To overcome this problem, a prediction model is presented to find patterns of bus delay by using machine learning on the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is fed with real-time bus data. The data are gathered through public data portals and a real-time Application Program Interface (API) provided by the government. These data are fundamental resources for organizing interval pattern models of bus operations with traffic environment factors (road speeds, station conditions, weather, and real-time bus operating information). The prototype model is designed with a machine learning tool (RapidMiner Studio), and tests for bus delay prediction were conducted. This research presents experiments to increase the prediction accuracy of bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on this analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
Keywords: big data, machine learning, smart city, social cost, transportation network
Procedia PDF Downloads 260
30528 Supplemental Visco-Friction Damping for Dynamical Structural Systems
Authors: Sharad Singh, Ajay Kumar Sinha
Abstract:
Coupled dampers, such as viscoelastic-frictional dampers for supplemental damping, are a newer technique. In this paper, innovative visco-frictional damping models are presented and investigated. This paper attempts to couple frictional and fluid viscous dampers into a single unit of supplemental dampers. The visco-frictional damping models are developed by series and parallel coupling of frictional and fluid viscous dampers using the Maxwell and Kelvin-Voigt models. The time-history analysis has been performed using numerical simulation on an SDOF system with varying fundamental periods, subject to a set of 12 ground motions. The simulation was performed using the direct time integration method. The MATLAB programming tool was used to carry out the numerical simulation. The response behavior has been analyzed for the varying time period and added damping. This paper compares the response reduction behavior of the two modes of coupling and highlights the performance efficiency of the suggested damping models. It also presents a mathematical modeling approach to visco-frictional dampers and suggests the suitable mode of coupling between the two sub-units.
Keywords: hysteretic damping, Kelvin model, Maxwell model, parallel coupling, series coupling, viscous damping
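A minimal sketch of the parallel (Kelvin-Voigt-type) coupling on an SDOF system under base excitation, written in Python as a stand-in for the MATLAB simulation described: the supplemental force is the sum of a viscous term c·v and a friction term F_s·sign(v) (all parameter values are illustrative, not the paper's):

```python
# SDOF system with a parallel visco-friction damper, semi-implicit Euler.
import numpy as np

m, k = 1000.0, 4.0e5            # mass [kg], stiffness [N/m]
c_v, f_s = 2000.0, 800.0        # damper constant [N*s/m], slip force [N]
dt, n = 0.001, 20000
ag = 2.0 * np.sin(2 * np.pi * 3.0 * np.arange(n) * dt)  # toy ground accel

u = np.zeros(n); v = np.zeros(n)
for i in range(n - 1):
    f_damp = c_v * v[i] + f_s * np.sign(v[i])      # visco-friction force
    a = (-m * ag[i] - k * u[i] - f_damp) / m
    v[i + 1] = v[i] + a * dt
    u[i + 1] = u[i] + v[i + 1] * dt
print("peak displacement [m]:", np.abs(u).max())
```

A series (Maxwell-type) coupling would instead carry an internal state for the damper deformation, so the friction and viscous elements see the same force rather than the same velocity.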
Procedia PDF Downloads 157
30527 A Support Vector Machine Learning Prediction Model of Evapotranspiration Using Real-Time Sensor Node Data
Authors: Waqas Ahmed Khan Afridi, Subhas Chandra Mukhopadhyay, Bandita Mainali
Abstract:
This research paper presents a unique approach to evapotranspiration (ET) prediction using a Support Vector Machine (SVM) learning algorithm. The study leverages real-time sensor node data to develop an accurate and adaptable prediction model, addressing the inherent challenges of traditional ET estimation methods. The integration of the SVM algorithm with real-time sensor node data offers great potential to improve the spatial and temporal resolution of ET predictions. In the model development, key input features are measured and computed using mathematical equations such as Penman-Monteith (FAO56) and the soil water balance (SWB), which include soil-environmental parameters such as solar radiation (Rs), air temperature (T), atmospheric pressure (P), relative humidity (RH), wind speed (u2), rain (R), deep percolation (DP), soil temperature (ST), and change in soil moisture (∆SM). The one-year field data are split into combinations of three proportions, i.e., training, test, and validation sets, while kernel functions with tuned hyperparameters have been used to train and improve the accuracy of the prediction model through multiple iterations. This paper also outlines the existing methods and machine learning techniques for determining evapotranspiration, data collection and preprocessing, model construction, and evaluation metrics, highlighting the significance of SVM in advancing the field of ET prediction. The results demonstrate the robustness and high predictive performance of the developed model on the basis of the evaluation metrics (R², RMSE, MAE). The effectiveness of the proposed model in capturing complex relationships within soil and environmental parameters provides insights into its potential applications for water resource management and hydrological ecosystems.
Keywords: evapotranspiration, FAO56, KNIME, machine learning, RStudio, SVM, sensors
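A minimal sketch of the SVM regression step: FAO56-style features predict daily ET, with kernel hyperparameters tuned by cross-validated grid search as the abstract describes (the feature set and the synthetic values are assumptions standing in for the sensor-node data):

```python
# SVR with grid-searched kernel hyperparameters for ET prediction.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(7)
n = 365
X = np.column_stack([
    rng.uniform(5, 30, n),     # Rs, solar radiation [MJ/m2/day]
    rng.uniform(5, 35, n),     # T, air temperature [deg C]
    rng.uniform(20, 90, n),    # RH, relative humidity [%]
    rng.uniform(0, 6, n),      # u2, wind speed [m/s]
])
et = (0.2 * X[:, 0] + 0.1 * X[:, 1] - 0.02 * X[:, 2] + 0.3 * X[:, 3]
      + rng.normal(0, 0.3, n))  # toy daily ET target [mm/day]

svr = make_pipeline(StandardScaler(), SVR())
grid = GridSearchCV(svr, {"svr__kernel": ["rbf"], "svr__C": [1, 10, 100],
                          "svr__epsilon": [0.05, 0.1]}, cv=5, scoring="r2")
grid.fit(X, et)
print("best CV R^2:", grid.best_score_, grid.best_params_)
```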
Procedia PDF Downloads 68
30526 Investigating Acute and Chronic Pain after Bariatric Surgery
Authors: Patti Kastanias, Wei Wang, Karyn Mackenzie, Sandra Robinson, Susan Wnuk
Abstract:
Obesity is a worldwide epidemic and is recognized as a chronic disease. Pain in the obese individual is a multidimensional issue. An increase in BMI is positively correlated with pain incidence and severity, especially in central obesity, where individuals are twice as likely to have chronic pain. Both obesity and chronic pain are also associated with mood disorders. Pain is worse among obese individuals with depression and anxiety. Bariatric surgery provides patients with an effective solution for long-term weight loss and associated health problems. However, not much is known about acute and chronic pain after bariatric surgery and its contributing factors, including mood disorders. Nurse practitioners (NPs) at one large multidisciplinary bariatric surgery centre led two studies to examine acute and chronic pain and pain management over time after bariatric surgery. The purpose of the initial study was to examine the incidence and severity of acute and chronic pain after bariatric surgery. The aim of the secondary study was to further examine chronic pain, specifically looking at psychological factors that influence the severity or incidence of both neuropathic and somatic pain, as well as changes in opioid use. The initial study was a prospective, longitudinal study in which patients having bariatric surgery at one surgical center were followed up to 6 months postoperatively. Data were collected at 7 time points using validated instruments for pain severity, pain interference, and patient satisfaction. In the second study, subjects were followed longitudinally, starting preoperatively and then at 6 months and 1 year postoperatively, to capture changes in chronic pain and influencing variables over time. Valid and reliable instruments were utilized for all major study outcomes. In the first study, there was a trend towards decreased acute postoperative pain over time. The incidence and severity of chronic pain were found to be significantly reduced at 6 months post bariatric surgery. Interestingly, the interference of chronic pain in daily life, such as normal work, mood, and walking ability, was significantly improved at 6 months postoperatively; however, this was not the case with sleep. Preliminary results of the secondary study indicate that pain severity, pain interference, anxiety, and depression are significantly improved at 6 months postoperatively. In addition, preoperative anxiety, depression, and emotional regulation were predictive of pain interference, but not pain severity. The results of our regression analyses provide evidence for the impact of pre-existing psychological factors on pain, particularly anxiety, in obese populations.
Keywords: bariatric surgery, mood disorders, obesity, pain
Procedia PDF Downloads 303
30525 Influence of Low and Extreme Heat Fluxes on Thermal Degradation of Carbon Fibre-Reinforced Polymers
Authors: Johannes Bibinger, Sebastian Eibl, Hans-Joachim Gudladt
Abstract:
This study considers the influence of different irradiation scenarios on the thermal degradation of carbon fiber-reinforced polymers (CFRP). Real threats are simulated, such as fires with long-lasting low heat fluxes and nuclear heat flashes with short-lasting high heat fluxes. For this purpose, coated and uncoated quasi-isotropic samples of the commercially available CFRP HexPly® 8552/IM7 are thermally irradiated from one side by a cone calorimeter and a xenon short-arc lamp with heat fluxes between 5 and 175 W/cm² over varying time intervals. The specimen temperature is recorded on the front and back sides as well as at different laminate depths. The CFRP is non-destructively tested with ultrasonic testing, infrared spectroscopy (ATR-FTIR), scanning electron microscopy (SEM), and micro-focused computed X-ray tomography (μCT). Destructive tests are performed to evaluate the mechanical properties in terms of interlaminar shear strength (ILSS), compressive strength, and tensile strength. The irradiation scenarios vary significantly in heat flux and exposure time; thus, different heating rates, radiation effects, and temperature distributions occur. This leads to unequal decomposition processes, which affect the sensitivity of the different strength types and the damage behaviour of the specimens. However, with the use of surface coatings, thermal degradation of composite materials can be delayed.
Keywords: CFRP, one-sided thermal damage, high heat flux, heating rate, non-destructive and destructive testing
Procedia PDF Downloads 110