Search results for: traditional models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10912

9652 A Machine Learning Approach for Intelligent Transportation System Management on Urban Roads

Authors: Ashish Dhamaniya, Vineet Jain, Rajesh Chouhan

Abstract:

Traffic management is one of the biggest issues on urban roads in almost all metropolitan cities in India. Speed is one of the critical traffic parameters for effective Intelligent Transportation System (ITS) implementation, as it decides the arrival rate of vehicles at intersections, which are the major points of congestion. The study aimed to leverage Machine Learning (ML) models to produce precise predictions of speed on urban roadway links. The research objective was to assess how categorized traffic volume and road width, serving as variables, influence speed prediction. Four tree-based regression models, namely Decision Tree (DT), Random Forest (RF), Extra Tree (ET), and Extreme Gradient Boost (XGB), are employed for this purpose. The models' performances were validated using test data, and the results demonstrate that Random Forest surpasses other machine learning techniques and a conventional utility theory-based model in speed prediction. The study is useful for managing urban roadway network performance under mixed traffic conditions and for effective implementation of ITS.
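As an illustration of the modelling step described above, the short sketch below fits three of the tree-based regressors named in the abstract (scikit-learn's Decision Tree, Random Forest, and Extra Trees; xgboost.XGBRegressor would be added in the same loop) to synthetic data. The feature names, value ranges, and data are assumptions for illustration only, not the study's dataset.

# Minimal sketch: comparing tree-based regressors for stream speed prediction.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, ExtraTreesRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(200, 4000, n),   # categorized traffic volume (veh/h), assumed range
    rng.uniform(5.5, 14.0, n),    # road width in metres, assumed range
])
y = 60 - 0.008 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 3, n)  # synthetic stream speed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
models = {
    "DT": DecisionTreeRegressor(random_state=0),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "ET": ExtraTreesRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, round(r2_score(y_te, model.predict(X_te)), 3))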

Keywords: stream speed, urban roads, machine learning, traffic flow

Procedia PDF Downloads 59
9651 Heritage Preservation and Cultural Tourism: The 'Pueblos Mágicos' Program and Its Role in Preserving Traditional Architecture in Mexico

Authors: Claudia Rodríguez Espinosa, Erika Elizabeth Pérez Múzquiz

Abstract:

The Pueblos Mágicos federal program seeks to preserve the traditional environment of small towns (under 20,000 inhabitants) through economic investments, legislation, and legal aid. To join the program, towns must meet 8 requirements; the fourth of these calls for the 'Promotion of symbolic and differentiated touristic attractions, such as architecture, emblematic buildings, festivities and traditions, artisan production, traditional cuisine, and touristic services that guarantee their commercialization along with assistantship and security services'. With this objective in mind, the Federal government of Mexico has developed local programs to protect emblematic public buildings in each of the 83 towns included in the Pueblos Mágicos program, involving federal and local administrations as well as local civil associations, such as Adopte una Obra de Arte. In this paper, we present 3 different intervention cases: first, the restoration project (now concluded) of the 16th century monastery of Santa María Magdalena in Cuitzeo, an enormous building which took 6 years to be completely restored. The second case, the public spaces intervention in Pátzcuaro, included the Plaza Grande or Vasco de Quiroga square and the access to the arts and crafts house known as Casa de los once patios or eleven backyards house. The third case is the recovery project of the 16th century atrium of the Tzintzuntzan monastery, which included the original olive trees brought by Franciscan monks to this town in the mid-1500s. This paper presents successful preservation projects at 3 different scales: building, urban space, and landscape, and in 3 different towns, with the objective of preserving public architecture, public spaces, and cultural traditions, and of learning from these experiences different ways to manage preservation projects focused on public architecture and public spaces.

Keywords: cultural tourism, heritage preservation, traditional architecture, public policies

Procedia PDF Downloads 283
9650 Thermal Method for Testing Small Chemisorbent Samples on the Base of Potassium Superoxide

Authors: Pavel V. Balabanov, Daria A. Liubimova, Aleksandr P. Savenkov

Abstract:

The increasing number of technogenic and natural accidents accompanied by air pollution, for example by combustion products, makes respiratory protection necessary. This work is devoted to the development of a calorimetric method and a device which allow the kinetics of carbon dioxide sorption by chemisorbents based on potassium superoxide to be investigated quickly, in order to assess the protective properties of closed-circuit respiratory protective apparatus. The features of the traditional approach for determining the sorption properties in a thin layer of chemisorbent are described, as well as methods and devices which can be used for studying sorption kinetics. The authors developed an approach (as opposed to the traditional approach) based on measuring the power of internal heat sources in the chemisorbent layer. These heat sources arise from the exothermic reaction of carbon dioxide sorption. This approach eliminates the need for chemical analysis of samples and can significantly reduce the time and material expenses of chemisorbent testing. The error in determining the volume fraction of adsorbed carbon dioxide by the developed method does not exceed 12%. Taking into account the efficiency of the method, we consider it a good alternative to traditional methods of chemical analysis for assessing the quality of protective sorbents.

Keywords: carbon dioxide chemisorption, exothermic reaction, internal heat sources, respiratory protective apparatus

Procedia PDF Downloads 401
9649 The Model Establishment and Analysis of TRACE/FRAPTRAN for Chinshan Nuclear Power Plant Spent Fuel Pool

Authors: J. R. Wang, H. T. Lin, Y. S. Tseng, W. Y. Li, H. C. Chen, S. W. Chen, C. Shih

Abstract:

TRACE is developed by the U.S. NRC for nuclear power plant (NPP) safety analysis. In this research, we focus on the establishment and application of TRACE/FRAPTRAN/SNAP models for the Chinshan NPP (BWR/4) spent fuel pool. The geometry of the spent fuel pool is 12.17 m × 7.87 m × 11.61 m. In this study, there are three TRACE/SNAP models: a one-channel, a two-channel, and a multi-channel TRACE/SNAP model. Additionally, a cooling system failure of the spent fuel pool was simulated and analyzed using the above models. According to the analysis results, the peak cladding temperature response was more accurate in the multi-channel TRACE/SNAP model. The results showed that uncovery of the fuel occurred 2.7 days after the cooling system failed. In order to estimate the detailed fuel rod performance, the FRAPTRAN code was used in this research. According to the FRAPTRAN results, the highest cladding temperature was located at node 21 of the fuel rod (the topmost node being node 23), and the cladding burst after roughly 3.7 days.

Keywords: TRACE, FRAPTRAN, BWR, spent fuel pool

Procedia PDF Downloads 350
9648 Analytical Description of Disordered Structures in Continuum Models of Pattern Formation

Authors: Gyula I. Tóth, Shaho Abdalla

Abstract:

Even though numerical simulations indeed have a significant precursory/supportive role in exploring the disordered phase displaying no long-range order in pattern formation models, studying the stability properties of this phase and determining the order of the ordered-disordered phase transition in these models necessitate an analytical description of the disordered phase. First, we will present the results of a comprehensive statistical analysis of a large number (1,000-10,000) of numerical simulations in the Swift-Hohenberg model, where the bulk disordered (or amorphous) phase is stable. We will show that the average free energy density (over configurations) converges, while the variance of the energy density vanishes with increasing system size in numerical simulations, which suggests that the disordered phase is a thermodynamic phase (i.e., its properties are independent of the configuration in the macroscopic limit). Furthermore, the structural analysis of this phase in Fourier space suggests that the phase can be modeled by a colored isotropic Gaussian noise, where any instant of the noise describes a possible configuration. Based on these results, we developed a general mathematical framework for finding a pool of solutions to partial differential equations in the sense of a continuous probability measure, which we will present briefly. Applying the general idea to the Swift-Hohenberg model, we show that the amorphous phase can be found and its properties can be determined analytically. As the general mathematical framework is not restricted to continuum theories, we hope that the proposed methodology will open a new chapter in studying disordered phases.
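For readers who want to reproduce the kind of simulation campaign described above, the sketch below integrates the Swift-Hohenberg equation with a standard semi-implicit pseudo-spectral step from a random initial condition; configuration averages would then be accumulated over many such runs with different seeds. The grid size, time step, parameter value, and cubic nonlinearity are illustrative assumptions, not the settings used in the paper.

# Minimal sketch: semi-implicit pseudo-spectral integration of the Swift-Hohenberg
# equation u_t = r*u - (1 + Laplacian)^2 u - u^3, started from small random noise.
import numpy as np

N, L, r, dt, steps = 128, 32 * np.pi, 0.2, 0.1, 2000
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
kx, ky = np.meshgrid(k, k, indexing="ij")
k2 = kx**2 + ky**2
lin = r - (1.0 - k2) ** 2          # linear operator in Fourier space

rng = np.random.default_rng(1)
u = 0.1 * rng.standard_normal((N, N))
for _ in range(steps):
    nonlin = -u**3                 # explicit nonlinear term
    u_hat = (np.fft.fft2(u) + dt * np.fft.fft2(nonlin)) / (1.0 - dt * lin)
    u = np.real(np.fft.ifft2(u_hat))

# configuration statistics of the kind discussed above, for one realization
print("mean:", u.mean(), "variance:", u.var())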

Keywords: fundamental theory, mathematical physics, continuum models, analytical description

Procedia PDF Downloads 123
9647 An Augmented Reality Based Self-Learning Support System for Skills Training

Authors: Chinlun Lai, Yu-Mei Chang

Abstract:

In this paper, an augmented reality learning support system is proposed to replace traditional teaching tools and thus help students improve their learning motivation, effectiveness, and efficiency. The system can not only reduce the consumption of educational hardware and physical materials, but also provide an eco-friendly, self-directed practical environment available at any time and anywhere, with immediate feedback on practical experience. To achieve this, an interactive self-training methodology containing step-by-step operation directions is designed using virtual 3D scenarios and wearable device platforms. The nasogastric tube care course in nursing skills is selected as the test example for self-learning and online testing. From the experimental results, it is observed that the support system not only increases students' learning interest but also improves learning performance compared with traditional teaching methods. Thus, it fulfills the strategy of learning by practice while reducing the related cost and effort significantly, and it is applicable in various fields.

Keywords: augmented reality technology, learning support system, self-learning, simulation learning method

Procedia PDF Downloads 161
9646 Optimal Bayesian Control of the Proportion of Defectives in a Manufacturing Process

Authors: Viliam Makis, Farnoosh Naderkhani, Leila Jafari

Abstract:

In this paper, we present a model and an algorithm for calculating the optimal control limit, average cost, sample size, and sampling interval of an optimal Bayesian chart to control the proportion of defective items produced, using a semi-Markov decision process approach. The traditional p-chart has been widely used for controlling the proportion of defectives in various kinds of production processes for many years. It is well known that traditional non-Bayesian charts are not optimal, but very few optimal Bayesian control charts have been developed in the literature, mostly considering a finite horizon. The objective of this paper is to develop a fast computational algorithm to obtain the optimal parameters of a Bayesian p-chart. The decision problem is formulated in the partially observable framework, and the developed algorithm is illustrated by a numerical example.
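For context, the sketch below computes the limits of the traditional (non-Bayesian) p-chart that serves as the baseline above, together with a simple conjugate Beta-Binomial update of the defective proportion. It is not the paper's semi-Markov decision process algorithm, and all numerical values are illustrative assumptions.

# Minimal sketch: traditional 3-sigma p-chart limits and a Beta-Binomial posterior update.
import numpy as np

p_bar, n = 0.04, 200                      # assumed in-control proportion and sample size
sigma = np.sqrt(p_bar * (1 - p_bar) / n)
ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
print(f"traditional p-chart: LCL={lcl:.4f}, UCL={ucl:.4f}")

# Bayesian update of the defective proportion after observing d defectives in n items
alpha, beta = 2.0, 48.0                   # assumed Beta prior with mean 0.04
d = 13
alpha_post, beta_post = alpha + d, beta + n - d
print("posterior mean of p:", alpha_post / (alpha_post + beta_post))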

Keywords: Bayesian control chart, semi-Markov decision process, quality control, partially observable process

Procedia PDF Downloads 314
9645 Numerical Investigation of the Jacketing Method of Reinforced Concrete Column

Authors: S. Boukais, A. Nekmouche, N. Khelil, A. Kezmane

Abstract:

The first aim of this study is to develop a finite element model that can correctly predict the behavior of a reinforced concrete column. The second aim is to use the finite element model to investigate and evaluate the effect of strengthening the reinforced concrete column by jacketing, considering different interface contacts between the old and the new concrete. Four models were evaluated: one assuming perfect contact, and three others using friction coefficients of 0.1, 0.3, and 0.5. The simulation was carried out using the Abaqus software. The obtained results show that the jacketing reinforcement led to a significant increase in the global performance of the simulated reinforced concrete column.

Keywords: strengthening, jacketing, reinforced concrete column, Abaqus, simulation

Procedia PDF Downloads 141
9644 Seismic Hazard Assessment of Offshore Platforms

Authors: F. D. Konstandakopoulou, G. A. Papagiannopoulos, N. G. Pnevmatikos, G. D. Hatzigeorgiou

Abstract:

This paper examines the effects of pile-soil-structure interaction on the dynamic response of offshore platforms under the action of near-fault earthquakes. Two offshore platform models are investigated: one with completely fixed supports and one with piles clamped into deformable layered soil. The soil deformability in the second model is simulated using non-linear springs. These platform models are subjected to near-fault seismic ground motions. The role of the fault mechanism in the platforms' response is additionally investigated, while the study also examines the effects of different angles of incidence of the seismic records on the maximum response of each platform.

Keywords: hazard analysis, offshore platforms, earthquakes, safety

Procedia PDF Downloads 141
9643 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations

Authors: Ramon Santana

Abstract:

The use of biometric identifiers in the field of information security, access control to resources, and authentication in ATMs and banking, among others, raises great concern about the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system; six of them allow the minutiae template to be obtained in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed. However, vulnerabilities in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase the cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of data using two levels of security. The first level is the data transformation level, which generates data invariant to rotation and translation; the subsequent transformation is irreversible. The second level is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model aims at mitigating known vulnerabilities of previously proposed models, basing its security on the infeasibility of polynomial reconstruction.
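As a toy analogue of the two-level idea (a rotation/translation-invariant transformation followed by evaluation under a key-derived polynomial), the sketch below derives invariant pairwise features from synthetic minutiae and maps them through a secret polynomial with a modular reduction. It is purely illustrative and is not the author's actual transformation or evaluation function.

# Illustrative sketch only: invariant features from minutiae pairs, then a key-derived
# polynomial evaluation. Not the proposed model; all values are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(7)
minutiae = rng.uniform(0, 300, size=(20, 3))      # (x, y, theta) triples, synthetic
key_coeffs = rng.integers(1, 97, size=4)          # secret polynomial coefficients

def invariant_features(m):
    feats = []
    for i in range(len(m)):
        for j in range(i + 1, len(m)):
            d = np.hypot(m[i, 0] - m[j, 0], m[i, 1] - m[j, 1])   # pairwise distance: invariant
            a = (m[i, 2] - m[j, 2]) % 360                        # relative angle: invariant
            feats.append((d, a))
    return np.array(feats)

def protect(feats, coeffs):
    # evaluation level: map each invariant pair through the key-defined polynomial
    x = feats[:, 0] + feats[:, 1]
    return np.polyval(coeffs, x) % 9973           # modular reduction hinders inversion

template = protect(invariant_features(minutiae), key_coeffs)
print(template[:5])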

Keywords: fingerprint, template protection, bio-cryptography, minutiae protection

Procedia PDF Downloads 164
9642 A Comparative Analysis of Traditional and Advanced Methods in Evaluating Anti-corrosion Performance of Sacrificial and Barrier Coatings

Authors: Kazem Sabet-Bokati, Ilia Rodionov, Marciel Gaier, Kevin Plucknett

Abstract:

Protective coatings play a pivotal role in mitigating corrosion and preserving the integrity of metallic structures exposed to harsh environmental conditions. The diversity of corrosive environments necessitates the development of protective coatings suitable for various conditions. Accurately selecting and interpreting analysis methods is crucial in identifying the most suitable protective coatings for the various corrosive environments. This study conducted a comprehensive comparative analysis of traditional and advanced methods to assess the anti-corrosion performance of sacrificial and barrier coatings. The protective performance of pure epoxy, zinc-rich epoxy, and cold galvanizing coatings was evaluated using salt spray tests, together with electrochemical impedance spectroscopy (EIS) and potentiodynamic polarization methods. The performance of each coating was thoroughly differentiated under both atmospheric and immersion conditions. The distinct protective performance of each coating against atmospheric corrosion was assessed using traditional standard methods. Additionally, the electrochemical responses of these coatings in immersion conditions were systematically studied, and a detailed discussion on interpreting the electrochemical responses is provided. Zinc-rich epoxy and cold galvanizing coatings offer superior anti-corrosion performance against atmospheric corrosion, while the pure epoxy coating excels in immersion conditions.

Keywords: corrosion, barrier coatings, sacrificial coatings, salt-spray, EIS, polarization

Procedia PDF Downloads 52
9641 A Quantitative Case Study Analysis of Store Format Contributors to U.S. County Obesity Prevalence in Virginia

Authors: Bailey Houghtaling, Sarah Misyak

Abstract:

Food access (the availability, affordability, convenience, and desirability of food and beverage products within communities) influences consumers' purchasing and consumption decisions. These variables may contribute to the lower dietary quality scores and higher obesity prevalence documented among rural and disadvantaged populations in the United States (U.S.). Current research assessing linkages between food access and obesity outcomes has primarily focused on distance to a traditional grocery/supermarket store as a measure of optimality. However, low-income consumers especially, including U.S. Department of Agriculture's Supplemental Nutrition Assistance Program (SNAP) participants, seem to utilize non-traditional food store formats with greater frequency for household dietary needs. Non-traditional formats have been associated with less nutritious food and beverage options and with consumer purchases that are high in saturated fats, added sugars, and sodium. The authors' formative research indicated differences by U.S. region and rurality in the distribution of traditional and non-traditional SNAP-authorized food store formats. Therefore, using Virginia as a case study, the purpose of this research was to determine whether a relationship between store format, rurality, and obesity exists. This research applied SNAP-authorized food store data (food access points for SNAP as well as non-SNAP consumers) and obesity prevalence data by Virginia county using publicly available databases: (1) the SNAP Retailer Locator, and (2) U.S. County Health Rankings. The alpha level was set a priori at 0.05. All Virginia SNAP-authorized stores (n=6,461) were coded by format: grocery, drug, mass merchandiser, club, convenience, dollar, supercenter, specialty, farmers market, independent grocer, and non-food store. Simple linear regression was applied first to assess the relationship between store format and obesity. Thereafter, multiple variables were added to the regression to account for potential moderating relationships (e.g., county income, rurality). Convenience, dollar, non-food or restaurant, mass merchandiser, farmers market, and independent grocer formats were significantly, positively related to obesity prevalence. Upon controlling for urban-rural status and income, results indicated the following formats to be significantly related to county obesity prevalence with a small, positive effect: convenience (p=0.010), accounting for 0.3% of the variance in obesity prevalence; dollar (p=0.005; 0.5% of the variance), and non-food (p=0.030; 1.3% of the variance) formats. These results align with current literature on consumer behavior at non-traditional formats. For example, consumers' food and beverage purchases at convenience and dollar stores are documented to be high in saturated fats, added sugars, and sodium. Further, non-food stores (i.e., quick-serve restaurants) often contribute a large portion of U.S. consumers' dietary intake and thus to poor dietary quality scores. Current food access research investigates grocery/supermarket access and obesity outcomes; these results suggest more research is needed that focuses on non-traditional food store formats. Nutrition interventions within convenience, dollar, and non-food stores, for example, that aim to enhance not only healthy food access but also the affordability, convenience, and desirability of nutritious food and beverage options may impact obesity rates in Virginia. More research is warranted utilizing the presented investigative framework in other U.S. and global regions to explore the role and the potential of non-traditional food store formats in preventing and reducing obesity.
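A minimal sketch of the analytic approach follows: regress county-level obesity prevalence on the count of one store format, then add income and rurality to the model, mirroring the adjustment step described above. The data frame is synthetic and the variable names are assumptions, not the SNAP Retailer Locator or County Health Rankings data.

# Minimal sketch: simple and covariate-adjusted linear regression of obesity prevalence.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 133  # roughly the number of Virginia counties and independent cities, assumed
df = pd.DataFrame({
    "obesity_pct": rng.normal(31, 4, n),
    "dollar_stores": rng.poisson(8, n),
    "median_income": rng.normal(55000, 12000, n),
    "rural": rng.integers(0, 2, n),
})

simple = smf.ols("obesity_pct ~ dollar_stores", data=df).fit()
adjusted = smf.ols("obesity_pct ~ dollar_stores + median_income + rural", data=df).fit()
print(simple.params["dollar_stores"], adjusted.params["dollar_stores"])
print("adjusted R-squared:", adjusted.rsquared_adj)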

Keywords: food access, food store format, non-traditional food stores, obesity prevalence

Procedia PDF Downloads 128
9640 Segregation Patterns of Trees and Grass Based on a Modified Age-Structured Continuous-Space Forest Model

Authors: Jian Yang, Atsushi Yagi

Abstract:

The tree-grass coexistence system is of great importance for forest ecology. Mathematical models have been proposed to study the dynamics of tree-grass coexistence and the stability of such systems. However, few of the models concentrate on the spatial dynamics of tree-grass coexistence. In this study, we modified an age-structured continuous-space population model for forests, obtaining an age-structured continuous-space model for tree-grass competition. In the model, for thermal competitions, adult trees can out-compete grass, and grass can out-compete seedlings. We studied the model mathematically to make sure that tree-grass coexistence solutions exist. Numerical experiments demonstrated that the fraction of area that trees or grass occupy can affect whether the coexistence is stable or not. We also varied the mortality of adult trees while the other parameters and the fractions of area occupied by trees and grass were held fixed; the results show that the mortality of adult trees is also a factor affecting the stability of tree-grass coexistence in this model.

Keywords: population-structured models, stabilities of ecosystems, thermal competitions, tree-grass coexistence systems

Procedia PDF Downloads 146
9639 Communication Tools Used in Teaching and Their Effects: An Empirical Study on the T. C. Selcuk University Samples

Authors: Sedat Simsek, Tugay Arat

Abstract:

Communication, which underwent a great revolution with the printing press invented by Gutenberg, today has no boundaries thanks to advanced communication devices and the internet. Computers, first produced for military purposes, can now be used to advantage in many areas, from medicine to the social sciences and from mathematics to education. The use of these developing technologies in the field of education has created great changes in both teaching and learning. Materials that can be considered basic communication resources in traditional education have begun to lose their significance, and technologies such as the internet, computers, smart boards, projection devices, and mobile phones have begun to replace them. University students use virtual books instead of traditional printed books, use cell phones instead of notebooks, and use the internet and virtual databases instead of the library for research. They even submit their homework through interactive methods rather than on printed materials. These technologies, which increase productivity, have brought a new dimension to the traditional education system. The aim of this study is to determine the influence of these technologies on students' learning processes and to find whether there are any similarities and differences arising from the faculty in which the students are educated. In addition, it is aimed to determine the level of ICT usage of students studying at the university level. In this context, the advantages and conveniences of the technology used by students are also scrutinized. In this study, we used surveys to collect data, and the data were analyzed using the SPSS 16 statistical program with appropriate tests.

Keywords: education, communication technologies, role of technology, teaching

Procedia PDF Downloads 298
9638 Comparison of Applicability of Time Series Forecasting Models VAR, ARCH and ARMA in Management Science: Study Based on Empirical Analysis of Time Series Techniques

Authors: Muhammad Tariq, Hammad Tahir, Fawwad Mahmood Butt

Abstract:

Purpose: This study attempts to determine the best forecasting methodologies for time series. The time series forecasting models VAR, ARCH, and ARMA are considered for the analysis. Methodology: Benchmark parameters such as adjusted R-squared, F-statistics, Durbin-Watson, and the location of the roots were critically and empirically analyzed. The empirical analysis uses time series data on the Consumer Price Index and closing stock prices. Findings: The results show that the VAR model performed better than the other models. Both the reliability and the significance of the VAR model are highly appreciable. In contrast, the ARCH model showed very poor forecasting results. The results of the ARMA model, however, appeared ambiguous: the AR roots showed that the model is stationary, while the MA roots showed that the model is invertible; therefore, forecasts made on the basis of the ARMA model would remain doubtful. It was concluded that the VAR model provides the best forecasting results. Practical Implications: This paper provides empirical evidence for the application of time series forecasting models and therefore provides a basis for selecting the best time series forecasting model.
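The sketch below fits one model from each of the three families compared above on synthetic CPI and closing-price series, using statsmodels and the separate arch package; the series, lag orders, and the single diagnostic shown are assumptions for illustration, not the study's data or final specifications.

# Minimal sketch: fitting VAR, ARMA, and ARCH models and one benchmark diagnostic.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.stattools import durbin_watson
from arch import arch_model   # separate 'arch' package

rng = np.random.default_rng(0)
T = 300
cpi = np.cumsum(rng.normal(0.2, 0.5, T))
price = 100 + 0.8 * cpi + np.cumsum(rng.normal(0, 1.0, T))
data = pd.DataFrame({"cpi": cpi, "price": price})

var_fit = VAR(data.diff().dropna()).fit(maxlags=4, ic="aic")
arma_fit = ARIMA(data["price"].diff().dropna(), order=(1, 0, 1)).fit()
arch_fit = arch_model(data["price"].pct_change().dropna() * 100, vol="ARCH", p=1).fit(disp="off")

print("AIC:", var_fit.aic, arma_fit.aic, arch_fit.aic)
print("Durbin-Watson (ARMA residuals):", durbin_watson(arma_fit.resid))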

Keywords: forecasting, time series, auto regression, ARCH, ARMA

Procedia PDF Downloads 336
9637 A Simple Fluid Dynamic Model for Slippery Pulse Pattern in Traditional Chinese Pulse Diagnosis

Authors: Yifang Gong

Abstract:

Pulse diagnosis is one of the most important diagnostic methods in traditional Chinese medicine. It is also the trickiest method to learn; it is said that it can only be sensed, not explained. This has become a serious threat to the survival of this diagnostic method. However, a large amount of experience has accumulated during the several thousand years of practice by Chinese doctors. A pulse pattern called the 'slippery pulse' is one of the indications of pregnancy. A simple fluid dynamic model is proposed to simulate the effects of the existence of a placenta. The placenta is modeled as an extra plenum in an extremely simplified fluid network model. It is found that, because of the existence of the extra plenum, the pulse pattern indeed shows a secondary peak within one pulse period. To the author's knowledge, this work is the first to show a link between pulse diagnosis and basic physical principles. Key parameters which might affect the pattern are also investigated.
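The lumped flow-network style of model described above can be sketched as follows: a pulsatile inflow drives an arterial compartment to which an extra compliant plenum is attached, and the resulting pressure waveform is integrated with SciPy. The network layout and parameter values are illustrative assumptions, not the author's model.

# Illustrative sketch: a two-compartment lumped flow network with an extra plenum (C2).
import numpy as np
from scipy.integrate import solve_ivp

R1, R2, C1, C2 = 1.0, 1.5, 1.2, 0.8           # resistances and compliances (arbitrary units)

def inflow(t, period=1.0):
    phase = t % period
    return np.sin(np.pi * phase / 0.3) if phase < 0.3 else 0.0   # short systolic pulse

def rhs(t, y):
    p1, p2 = y
    q12 = (p1 - p2) / R1                      # flow into the extra plenum
    dp1 = (inflow(t) - q12) / C1
    dp2 = (q12 - p2 / R2) / C2
    return [dp1, dp2]

sol = solve_ivp(rhs, (0, 10), [0.0, 0.0], max_step=0.01)
last = sol.t > 9.0                            # inspect the last simulated pulse period
print(np.round(sol.y[0][last][:10], 3))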

Keywords: Chinese medicine, flow network, pregnancy, pulse

Procedia PDF Downloads 372
9636 Use of SUDOKU Design to Assess the Implications of the Block Size and Testing Order on Efficiency and Precision of Dulce De Leche Preference Estimation

Authors: Jéssica Ferreira Rodrigues, Júlio Silvio De Sousa Bueno Filho, Vanessa Rios De Souza, Ana Carla Marques Pinheiro

Abstract:

This study aimed to evaluate the implications of block size and testing order on the efficiency and precision of preference estimation for Dulce de leche samples. Efficiency was defined as the inverse of the average variance of pairwise comparisons among treatments. Precision was defined as the inverse of the variance of the estimates of treatment means (or effects). The experiment was originally designed to test 16 treatments as a series of 8 Sudoku 16x16 designs, 4 of them randomized independently and 4 others in the reverse order, to yield balance in testing order. Linear mixed models were fitted to the whole experiment, with 112 testers and all their grades, as well as to partially balanced subgroups, namely: a) the experiment with the four initial EU; b) the experiment with EU 5 to 8; c) the experiment with EU 9 to 12; and d) the experiment with EU 13 to 16. Responses were recorded on a nine-point hedonic scale, and a mixed linear model analysis was assumed, with random tester and treatment effects and a fixed test-order effect. The analysis with a cumulative random-effects probit link model was very similar, with essentially the same conclusions, and for simplicity we present the results under the Gaussian assumption. The R (CRAN) library lme4 and its function lmer (fit linear mixed-effects models) were used for the mixed models, and the libraries Bayesthresh (default Gaussian threshold function) and ordinal, with its function clmm (cumulative link mixed model), were used to check the Bayesian analysis of threshold models and cumulative link probit models. It was noted that the number of samples tested in the same session can influence the acceptance level, underestimating acceptance. However, providing a large number of samples can help to improve sample discrimination.
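The study's analysis was carried out in R with lme4 and ordinal; purely as an illustration of the same kind of model in Python, the sketch below fits a linear mixed model with a random tester effect and fixed treatment and testing-order effects (a simplification: the study also treated treatment effects as random) to synthetic hedonic scores. The data frame and column names are assumptions.

# Minimal sketch: mixed model with random tester effect and a fixed testing-order effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_testers, n_treatments = 112, 16
rows = []
for tester in range(n_testers):
    tester_effect = rng.normal(0, 0.5)
    for order, treatment in enumerate(rng.permutation(n_treatments), start=1):
        score = 6 + tester_effect + 0.1 * treatment - 0.05 * order + rng.normal(0, 1)
        rows.append({"tester": tester, "treatment": treatment,
                     "order": order, "score": int(np.clip(round(score), 1, 9))})
df = pd.DataFrame(rows)

model = smf.mixedlm("score ~ C(treatment) + order", df, groups=df["tester"])
print(model.fit().summary())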

Keywords: acceptance, block size, mixed linear model, testing order

Procedia PDF Downloads 315
9635 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors

Authors: Sudhir Kumar Singh, Debashish Chakravarty

Abstract:

Slope stability analysis is an important aspect of geotechnical engineering. It is also important from a safety and economic point of view, as any slope failure leads to the loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely degree of saturation, rainfall intensity, and seismic coefficients. A supervised machine learning approach has been utilized for making accurate and reliable predictions regarding the stability of slopes based on the value of the Factor of Safety. Numerous cases have been studied for analyzing the stability of slopes using the popular Finite Element Method, and the data thus obtained have been used as training data for the supervised machine learning models. The input data have been trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training data have been used for measuring the performance and accuracy of the different models. Although all models performed well on the test dataset, Random Forest stands out due to its high accuracy of greater than 95%, providing a valuable tool at our disposal that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.
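As a sketch of the classification step described above, the example below trains and compares three of the named classifiers on synthetic features, labelling a case as stable when its Factor of Safety is at least 1. The features, value ranges, and the FoS relation are assumptions for illustration, not the paper's FEM-derived dataset.

# Minimal sketch: comparing classifiers for slope stability (1 = stable, 0 = unstable).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.uniform(0.0, 1.0, n),     # degree of saturation
    rng.uniform(0.0, 150.0, n),   # rainfall intensity (mm/h), assumed range
    rng.uniform(0.0, 0.3, n),     # horizontal seismic coefficient
    rng.integers(1, 5, n),        # joint-set index, assumed encoding
])
fos = 1.6 - 0.4 * X[:, 0] - 0.003 * X[:, 1] - 1.2 * X[:, 2] - 0.05 * X[:, 3] + rng.normal(0, 0.1, n)
y = (fos >= 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
for name, clf in [("DT", DecisionTreeClassifier(random_state=0)),
                  ("RF", RandomForestClassifier(n_estimators=300, random_state=0)),
                  ("SVM", SVC())]:
    clf.fit(X_tr, y_tr)
    print(name, round(accuracy_score(y_te, clf.predict(X_te)), 3))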

Keywords: finite element method, geotechnical engineering, machine learning, slope stability

Procedia PDF Downloads 94
9634 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques

Authors: Soheila Sadeghi

Abstract:

In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.
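A minimal sketch of the evaluation loop described above follows: a cross-validated comparison of several regressors and a small hyperparameter grid search, here on scikit-learn's GradientBoostingRegressor (xgboost.XGBRegressor would slot in the same way). The synthetic features, their ranges, and the cost-impact relation are assumptions, not the study's project dataset.

# Minimal sketch: cross-validated model comparison and a small grid search for cost impact.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import GridSearchCV, cross_val_score

rng = np.random.default_rng(1)
n = 400
X = np.column_stack([
    rng.uniform(0.5, 1.5, n),      # productivity rate
    rng.uniform(0.0, 0.5, n),      # scope change magnitude (fraction of baseline)
    rng.uniform(0.0, 1.0, n),      # scope change timing (fraction of schedule elapsed)
    rng.integers(1, 10, n),        # number of task dependencies
])
y = 1000 * X[:, 1] * (1 + X[:, 2]) + 50 * X[:, 3] / X[:, 0] + rng.normal(0, 30, n)  # cost impact

for name, model in [("Linear", LinearRegression()), ("Ridge", Ridge()),
                    ("RF", RandomForestRegressor(random_state=0)),
                    ("GB", GradientBoostingRegressor(random_state=0))]:
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(name, round(scores.mean(), 3))

grid = GridSearchCV(GradientBoostingRegressor(random_state=0),
                    {"n_estimators": [100, 300], "max_depth": [2, 3]},
                    cv=5, scoring="neg_mean_squared_error").fit(X, y)
print("best params:", grid.best_params_)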

Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes

Procedia PDF Downloads 27
9633 Churn Prediction for Savings Bank Customers: A Machine Learning Approach

Authors: Prashant Verma

Abstract:

Commercial banks are facing immense pressure, including financial disintermediation, interest rate volatility, and digital modes of finance. Retaining an existing customer is 5 to 25 times less expensive than acquiring a new one. This paper explores customer churn prediction based on various statistical and machine learning models and uses under-sampling to improve the predictive power of these models. The results show that, of the various machine learning models, Random Forest, which predicts churn with 78% accuracy, was found to be the most powerful model for this scenario. Customer vintage, customer age, average balance, occupation code, population code, average withdrawal amount, and average number of transactions were found to be the variables with high predictive power for the churn prediction model. The model can be deployed by commercial banks in order to reduce customer churn so that they may retain the funds kept by savings bank (SB) customers. The article suggests a customized campaign to be initiated by commercial banks to avoid SB customer churn. Hence, by giving better customer satisfaction and experience, commercial banks can limit customer churn and maintain their deposits.
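The under-sampling step mentioned above can be sketched as follows, here using the imbalanced-learn package together with a Random Forest classifier; the synthetic customer features, the churn mechanism, and the choice of library are illustrative assumptions, not the paper's data or pipeline.

# Minimal sketch: under-sampling the majority (non-churn) class before fitting Random Forest.
import numpy as np
from imblearn.under_sampling import RandomUnderSampler
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(9)
n = 5000
vintage = rng.integers(1, 30, n)          # years with the bank
age = rng.integers(18, 80, n)
balance = rng.lognormal(8, 1, n)          # average balance
txns = rng.poisson(12, n)                 # average number of transactions
X = np.column_stack([vintage, age, balance, txns])

logit = -2.2 - 0.05 * vintage - 0.08 * (txns - 12)   # churn more likely for new, inactive customers
churn = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, churn, stratify=churn, random_state=0)
X_bal, y_bal = RandomUnderSampler(random_state=0).fit_resample(X_tr, y_tr)  # balance the classes

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_bal, y_bal)
pred = clf.predict(X_te)
print("accuracy:", round(accuracy_score(y_te, pred), 3),
      "churn recall:", round(recall_score(y_te, pred), 3))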

Keywords: savings bank, customer churn, customer retention, random forests, machine learning, under-sampling

Procedia PDF Downloads 133
9632 Communication and Management of Incidental Pathology in a Cohort of 1,214 Consecutive Appendicectomies

Authors: Matheesha Herath, Ned Kinnear, Bridget Heijkoop, Eliza Bramwell, Alannah Frazetto, Amy Noll, Prajay Patel, Derek Hennessey, Greg Otto, Christopher Dobbins, Tarik Sammour, James Moore

Abstract:

Background: Important incidental pathology requiring further action is commonly found during appendicectomy, macro- and microscopically. It is unknown whether the acute surgical unit (ASU) model affects the management and disclosure of these findings. Methods: An ASU model was introduced at our institution on 01/08/2012. In this retrospective cohort study, all patients undergoing appendicectomy 2.5 years before (traditional group) or after (ASU group) this date were compared. The primary outcomes were rates of appropriate management of the incidental findings and communication of the findings to the patient and to their general practitioner (GP). Results: 1,214 patients underwent emergency appendicectomy; 465 in the traditional group and 749 in the ASU group. 80 (6.6%) patients (25 and 55 in each respective period) had important incidental findings. There were 24 patients with benign polyps, 15 with neuro-endocrine tumour, 11 with endometriosis, 8 with pelvic inflammatory disease, 8 Enterobius vermicularis infection, 7 with low grade mucinous cystadenoma, 3 with inflammatory bowel disease, 2 with diverticulitis, 2 with tubo-ovarian mass, 1 with secondary appendiceal malignancy and none with primary appendiceal adenocarcinoma. One patient had dual pathologies. There was no difference between the traditional and ASU group with regards to communication of the findings to the patient (p=0.44) and their GP (p=0.27), and there was no difference in the rates of appropriate management (p=0.21). Conclusions: The introduction of an ASU model did not change rates of surgeon-to-patient and surgeon-to-GP communication nor affect rates of appropriate management of important incidental pathology during an appendectomy.

Keywords: acute care surgery, appendicitis, appendicectomy, incidental

Procedia PDF Downloads 137
9631 The Posthuman Condition and a Translational Ethics of Entanglement

Authors: Shabnam Naderi

Abstract:

Traditional understandings of ethics considered translators, translations, technologies, and other agents as separate, and prioritized human agents; in effect, ethics was equated with morality. This disengaged understanding of ethics is superseded by an ethics of relation/entanglement in posthuman philosophy. According to this ethics of entanglement, human and nonhuman agents are in constant 'intra-action'. The human is not separate from nature, from technology, or from other nonhuman entities, and an ethics of translation in this regard cannot be separated from technology and ecology or be defined merely within the realm of the human-human encounter. As such, a posthuman ethics offers opportunities for change and responds to the changing nature of reality; it is negotiable and reveals itself as a moment-by-moment practice (i.e., as temporally emergent and beyond determinacy and permanence). Far from linguistic, cultural, or individual concerns, posthuman translational ethics discusses how formerly rigid norms and laws are challenged in a process ontology which puts emphasis on activity and activation and considers ethics as surfacing in activity, not as a predefined set of rules and values. In this sense, traditional ethical principles like faithfulness, accuracy, and representation are superseded by principles of privacy, sustainability, multiplicity, and decentralization. The present conceptual study, drawing on Ferrando's philosophical posthumanism (as a post-humanism, a post-dualism, and a post-anthropocentrism), Deleuze-Guattarian philosophy of immanence, and Barad's physics-philosophy, strives to destabilize traditional understandings of translation ethics and bring into the picture an ethics that has loose ends and revolves around multiplicity and decentralization.

Keywords: ethics of entanglement, post-anthropocentrism, post-dualism, post-humanism, translation

Procedia PDF Downloads 67
9630 Exploring and Evaluating the Current Style of Teaching Biology in Saudi Universities from Teachers' Points of View

Authors: Ibraheem Alzahrani

Abstract:

The Saudi Arabian Ministry of Higher Education has established 24 universities across various cities in the kingdom. The universities have the mandate of sustaining technological progress in both teaching and learning. The present study explores the status of teaching in Saudi universities, focusing on biology, a critical curriculum. The paper explores biology teachers' points of view in several Saudi higher education institutions through questionnaires disseminated via email. According to the findings, the current teaching methods are traditional, and the teachers believe that it is critical to change them. This study also reviews how biology has been taught in the kingdom in the past, as well as how it is taught at present. In addition, some aspects of biology teaching are considered, including the biology curriculum and the learning objectives of biology in higher education.

Keywords: higher education, teaching style, traditional learning, electronic learning, web 2.0 applications, blended learning

Procedia PDF Downloads 374
9629 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction

Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan

Abstract:

Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk is very helpful for reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. The data were related to patients admitted to the Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011, and were collected using a four-sectioned data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and the A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential for facilitating the management of a patient with a specific disease. Therefore, health interventions or lifestyle changes can be undertaken based on these models to improve the health conditions of individuals at risk.
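As an illustration of the comparison and the reported metrics, the sketch below trains a decision tree and a small neural network on synthetic binary risk factors and computes sensitivity, specificity, and the positive and negative predictive values from the confusion matrix; the data generation and model settings are assumptions, not the hospital dataset or the Clementine workflow.

# Minimal sketch: neural network vs. decision tree with confusion-matrix-derived metrics.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(2)
n = 350
X = rng.integers(0, 2, size=(n, 5))    # hypertension, DLP, smoking, diabetes, A+ blood group
logit = -2.0 + X @ np.array([0.9, 0.7, 0.8, 1.0, 0.3])
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("Decision tree", DecisionTreeClassifier(random_state=0)),
                  ("Neural network", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))]:
    tn, fp, fn, tp = confusion_matrix(y_te, clf.fit(X_tr, y_tr).predict(X_te)).ravel()
    print(name,
          "sensitivity:", round(tp / (tp + fn), 2),
          "specificity:", round(tn / (tn + fp), 2),
          "PPV:", round(tp / (tp + fp), 2),
          "NPV:", round(tn / (tn + fn), 2))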

Keywords: decision trees, neural network, myocardial infarction, data mining

Procedia PDF Downloads 421
9628 Signs-Only Compressed Row Storage Format for Exact Diagonalization Study of Quantum Fermionic Models

Authors: Michael Danilov, Sergei Iskakov, Vladimir Mazurenko

Abstract:

The present paper describes a high-performance parallel realization of an exact diagonalization solver for quantum-electron models in a shared memory computing system. The proposed algorithm contains a storage format for efficiently computing the eigenvalues and eigenvectors of a quantum electron Hamiltonian matrix. The results of the test calculations carried out for a 15-site Hubbard model demonstrate a reduction in the required memory and good multiprocessor scalability, while maintaining performance of the same order as compressed row storage.
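For orientation, the sketch below builds a small Hamiltonian in the ordinary compressed sparse row (CSR) format that the paper uses as its performance baseline and computes a few of the lowest eigenpairs with a Lanczos-type solver; the signs-only format described above would replace the stored values with their sign pattern. The toy spinless tight-binding chain stands in for the 15-site Hubbard Hamiltonian.

# Minimal sketch: a CSR Hamiltonian and its lowest eigenpairs via SciPy's Lanczos solver.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

n_sites, t = 12, 1.0
hop = -t * np.ones(n_sites - 1)
H = diags([hop, hop], offsets=[-1, 1], format="csr")   # nearest-neighbour hopping

print("stored nonzeros:", H.nnz, "of", n_sites * n_sites)
print("CSR arrays:", H.data[:4], H.indices[:4], H.indptr[:4])

vals, vecs = eigsh(H, k=3, which="SA")                 # few lowest eigenvalues/eigenvectors
print("lowest eigenvalues:", np.round(vals, 4))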

Keywords: sparse matrix, compressed format, Hubbard model, Anderson model

Procedia PDF Downloads 390
9627 Optimizing the Passenger Throughput at an Airport Security Checkpoint

Authors: Kun Li, Yuzheng Liu, Xiuqi Fan

Abstract:

High security standards and high screening efficiency seem to be contradictory in the airport security check process, so improving efficiency as far as possible while maintaining the same security standard is highly meaningful. This paper utilizes knowledge of operations research and stochastic processes to establish mathematical models to explore this problem. We analyze the current airport security check process and use the M/G/1 and M/G/k models from queuing theory to describe it. We then find that the least efficient part, and the bottleneck of the queuing system, is the pre-check lane. To improve passenger throughput and reduce the variance of passengers' waiting times, we adjust our models, apply the Monte Carlo method, and put forward three modifications: adjust the ratio of Pre-Check lanes to regular lanes flexibly, determine the optimal number of security screening lines based on cost analysis, and adjust the distribution of arrival and service times based on Monte Carlo simulation results. We also analyze the impact of cultural differences as a sensitivity analysis. Finally, we give recommendations for the current airport security check process.
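A minimal sketch of the stochastic modeling step follows: a Monte Carlo simulation of a single M/G/1 screening lane via the Lindley recursion, checked against the Pollaczek-Khinchine mean waiting time. The arrival rate and service-time distribution are illustrative assumptions, not measured checkpoint data.

# Minimal sketch: Monte Carlo simulation of an M/G/1 lane vs. the Pollaczek-Khinchine formula.
import numpy as np

rng = np.random.default_rng(0)
lam = 1 / 12.0                       # arrivals per second (assumed)
n = 200_000
interarrival = rng.exponential(1 / lam, n)
service = rng.gamma(shape=4.0, scale=2.5, size=n)   # general service time, mean 10 s

wait = np.zeros(n)                   # Lindley recursion: W_{k+1} = max(0, W_k + S_k - A_{k+1})
for k in range(n - 1):
    wait[k + 1] = max(0.0, wait[k] + service[k] - interarrival[k + 1])

rho = lam * service.mean()
pk = lam * np.mean(service**2) / (2 * (1 - rho))    # Pollaczek-Khinchine mean waiting time
print("simulated mean wait:", round(wait.mean(), 2), "P-K formula:", round(pk, 2))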

Keywords: queue theory, security check, stochastic process, Monte Carlo simulation

Procedia PDF Downloads 196
9626 Inhibitory Effect on TNF-Alpha Release of Dioscorea membranacea and Its Compounds

Authors: Arunporn Itharat, Srisopa Ruangnoo, Pakakrong Thongdeeying

Abstract:

The rhizomes of Dioscorea membranacea (DM) have long been used in Thai traditional medicine to treat cancer and inflammatory conditions such as rheumatism. The objective of this study was to investigate anti-inflammatory activity by determining the inhibitory effect of crude extracts and pure isolated compounds from DM on LPS-induced TNF-α release from RAW264.7 cells. Three known dihydrophenanthrene compounds were isolated from the DM ethanolic extract by a bioassay-guided isolation method [2,4-dimethoxy-5,6-dihydroxy-9,10-dihydrophenanthrene (1), 5-hydroxy-2,4,6-trimethoxy-9,10-dihydrophenanthrene (2), and 5,6,2-trihydroxy-3,4-methoxy-9,10-dihydrophenanthrene (3)]. Compound 1 showed the highest inhibitory effect on PGE2, followed by 3 and 1 (IC50 = 2.26, 4.97 and >20 μg/ml, or 8.31, 17.25 and >20 µM, respectively). These findings suggest that this plant exerts anti-inflammatory effects by inhibiting TNF-α release; hence, this result supports the use of Thai traditional medicine to treat inflammation-related diseases.

Keywords: Dioscorea membranacea, anti-inflammatory activity, TNF-alpha, dihydrophenanthrene compound

Procedia PDF Downloads 497
9625 Application of Signature Verification Models for Document Recognition

Authors: Boris M. Fedorov, Liudmila P. Goncharenko, Sergey A. Sybachin, Natalia A. Mamedova, Ekaterina V. Makarenkova, Saule Rakhimova

Abstract:

In modern economic conditions, the question of whether a signature on digital documents can be correctly recognized, in order to verify an expression of will or confirm a certain operation, is highly relevant. Additional processing complexity lies in the dynamic variability of the signature for each individual, as well as in the way the information is processed, because the signature is biometric data. The article discusses the use of artificial intelligence models to improve the quality of signature confirmation in document recognition. Several possible options for using such models are analyzed. The results of the study are given, showing that it is possible to correctly determine the authenticity of a signature on small samples.

Keywords: signature recognition, biometric data, artificial intelligence, neural networks

Procedia PDF Downloads 139
9624 The Influence of Islamic Arts in Omani Weaving Motifs

Authors: Zahra Ahmed Al-zadjali

Abstract:

The influence of Islam on the arts can be found primarily in calligraphy, arabesque designs, and architecture. Geometric designs were also used quite extensively: Muslim craftsmen produced stunning designs based on simple geometric principles and traditional motifs, which were used to decorate many surfaces. The idea of interlacing simple rectilinear lines to form patterns impressed the Arabs, and the nomads of Persia, the Turks, and the Mongols were equally impressed with the designs, so they began to use them in their homes in carpet weaving. The Islamic designs, motifs, and colours that were used became commonplace and served to influence people's tastes. Modern lifestyles and contemporary products have changed people's daily lives; however, people still long for the nomadic way of life. This is clearly reflected in people's homes: in a great many Muslim homes, Islamic decorative motifs can be seen along with traditional 'Bedouin' style furnishing, especially in homes of the Arabian Peninsula.

Keywords: art, craft, design, Oman, weaving

Procedia PDF Downloads 461
9623 Analog Input Output Buffer Information Specification Modelling Techniques for Single Ended Inter-Integrated Circuit and Differential Low Voltage Differential Signaling I/O Interfaces

Authors: Monika Rawat, Rahul Kumar

Abstract:

Input Output Buffer Information Specification (IBIS) models are used to describe the analog behavior of the Input Output (I/O) buffers of a digital device. They are widely used to perform signal integrity analysis. Advantages of using IBIS models include their simple structure, IP protection, and fast simulation time with reasonable accuracy. As the design complexity of drivers and receivers increases, capturing the exact behavior of the transistor-level model in the IBIS model becomes an essential task for achieving better accuracy. In this paper, an improvement to the existing methodology for generating IBIS models for complex I/O interfaces such as Inter-Integrated Circuit (I2C) and Low Voltage Differential Signaling (LVDS) is proposed. Furthermore, the accuracy and computational performance of the standard method and the proposed approach with respect to SPICE are presented. The investigations will be useful for further improving the accuracy of IBIS models and enhancing their wider acceptance.

Keywords: IBIS, signal integrity, open-drain buffer, low voltage differential signaling, behavior modelling, transient simulation

Procedia PDF Downloads 186