Search results for: diffusive Johnson-Segalman model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16807

15397 The Effects of Different Parameters of Wood Floating Debris on Scour Rate Around Bridge Piers

Authors: Muhanad Al-Jubouri

Abstract:

Local scour is the most important of the several scour types affecting bridge performance and safety. Although scour is widespread in bridges, especially during flood seasons, experimental tests cannot be applied to many standard highway bridges. A computational fluid dynamics (CFD) numerical model was therefore used to calculate local scour and deposition for non-cohesive silt and clear-water conditions near single and double cylindrical piers, including the effect of floating debris. The FLOW-3D software is employed with the RNG turbulence model, the Nilsson bed-load transport equation, and a fine mesh size. The numerical findings for single cylindrical piers correspond well with the physical model results. Furthermore, after a parameter-sensitivity study examines the range of outcomes produced by user inputs such as the bed-load equation, mesh cell size, and turbulence model, the final numerical predictions are compared to experimental data. When the findings are compared, the error in the maximum scour depth is 3.8% for the single-pier case.

Keywords: local scouring, non-cohesive, clear water, computational fluid dynamics, turbulence model, bed-load equation, debris

Procedia PDF Downloads 68
15396 The Role of Group Size, Public Employees’ Wages and Control Corruption Institutions in a Game-Theoretical Model of Public Corruption

Authors: Pablo J. Valverde, Jaime E. Fernandez

Abstract:

This paper shows under which conditions public corruption can emerge. The theoretical model includes variables such as the public employee wage (w), a corruption control parameter (c), and the group size (GS) of interactions between clusters of public officers and contractors. The system behavior is analyzed using phase diagrams based on combinations of these parameters (c, w, GS). Numerical simulations are implemented in order to contrast the analytic results based on the Nash equilibria of the theoretical model. Major findings include the functional relationship between wages and network topology, which can be used to reduce the emergence of corrupt behavior.

Keywords: public corruption, game theory, complex systems, Nash equilibrium

Procedia PDF Downloads 241
15395 Evaluating the Suitability and Performance of Dynamic Modulus Predictive Models for North Dakota’s Asphalt Mixtures

Authors: Duncan Oteki, Andebut Yeneneh, Daba Gedafa, Nabil Suleiman

Abstract:

Most agencies lack the equipment required to measure the dynamic modulus (|E*|) of asphalt mixtures, necessitating the use of predictive models. This study compared measured |E*| values for nine North Dakota asphalt mixes with predictions from the original Witczak, modified Witczak, and Hirsch models. The influence of temperature on the |E*| models was investigated, and Pavement ME simulations were conducted using both the measured |E*| and the predictions from the most accurate |E*| model. The results revealed that the original Witczak model yielded the lowest Se/Sy and highest R² values, indicating the lowest bias and highest accuracy, while the Hirsch model exhibited the poorest overall performance. Using predicted |E*| as input to Pavement ME generated conservative distress predictions compared to using measured |E*|. The original Witczak model is recommended for predicting |E*| for low-reliability pavements in North Dakota.
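
As an illustration (not part of the original study), the sketch below shows one common way to compute the Se/Sy ratio and R² used to rank predictive models against measured |E*|; the exact statistical formulation and the sample values are assumptions.

```python
import numpy as np

def goodness_of_fit(measured, predicted):
    """Se/Sy ratio and R^2 for predicted vs. measured |E*| values.

    A lower Se/Sy and a higher R^2 indicate lower bias and higher accuracy,
    the criteria used to rank the predictive models.
    """
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    n = measured.size
    se = np.sqrt(np.sum((predicted - measured) ** 2) / (n - 1))  # standard error of estimate
    sy = np.std(measured, ddof=1)                                # std. dev. of measured |E*|
    ss_res = np.sum((measured - predicted) ** 2)
    ss_tot = np.sum((measured - measured.mean()) ** 2)
    return se / sy, 1.0 - ss_res / ss_tot

# Hypothetical measured vs. predicted |E*| values (MPa) for one mix
se_sy, r2 = goodness_of_fit([4500, 9800, 15200, 21000], [4300, 10400, 14800, 20100])
print(f"Se/Sy = {se_sy:.3f}, R^2 = {r2:.3f}")
```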

Keywords: asphalt mixture, binder, dynamic modulus, MEPDG, pavement ME, performance, prediction

Procedia PDF Downloads 44
15394 Efficiency of Secondary Schools by ICT Intervention in Sylhet Division of Bangladesh

Authors: Azizul Baten, Kamrul Hossain, Abdullah-Al-Zabir

Abstract:

The objective of this study is to develop an appropriate stochastic frontier model of secondary school efficiency under ICT intervention and to examine the impact of ICT challenges on secondary school efficiency in the Sylhet division of Bangladesh using stochastic frontier analysis. The Translog stochastic frontier model was found more appropriate than the Cobb-Douglas model for measuring secondary school efficiency under ICT intervention. Based on the results of the Cobb-Douglas model, the coefficients of the number of teachers, the number of students, and teaching ability had a positive effect on the level of efficiency, indicating that these variables are related to technical efficiency. In the inefficiency effects for both the Cobb-Douglas and Translog models, the coefficient of the ICT lab decreased secondary school inefficiency, but online classes in school were found to increase the level of inefficiency. The coefficient of teachers' preference for ICT tools such as multimedia projectors contributed to decreasing secondary school inefficiency in the Sylhet division of Bangladesh. The interaction effects of the number of teachers and the number of classrooms, the number of students and the number of classrooms, the number of students and teaching ability, and the number of classrooms and teaching ability were positive and thus increase secondary school efficiency. The overall mean efficiency of urban secondary schools was 84.66% for the Translog model and 83.63% for the Cobb-Douglas model, while the overall mean efficiency of rural secondary schools was 80.98% for the Translog model and 81.24% for the Cobb-Douglas model. Thus, urban secondary schools performed better than rural secondary schools in the Sylhet division. The results of the Tobit model show that the teacher-student ratio had a positive influence on secondary school efficiency, whereas teaching experience of 1 to 5 years or above 10 years, MPO-type schools, and conventional teaching methods had a negative and significant influence on secondary school efficiency. The estimated value of σ-square (0.0625) was different from zero, indicating a good fit, and the value of γ (0.9872) was positive, which can be interpreted as follows: 98.72 percent of the random variation in secondary school outcomes is due to inefficiency.
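
For illustration only, the sketch below estimates a Cobb-Douglas production frontier with the standard normal/half-normal composed error by maximum likelihood; the Translog specification, the inefficiency-effects terms, and the school data used in the study are not reproduced, and the simulated inputs are placeholders.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def neg_loglik(theta, y, X):
    """Negative log-likelihood of a production frontier y = X @ beta + v - u,
    with v ~ N(0, sigma_v^2) and u ~ half-normal(sigma_u^2)."""
    k = X.shape[1]
    beta, ln_sv, ln_su = theta[:k], theta[k], theta[k + 1]
    sv, su = np.exp(ln_sv), np.exp(ln_su)
    sigma = np.sqrt(sv**2 + su**2)
    lam = su / sv
    eps = y - X @ beta                                  # composed error v - u
    ll = (np.log(2) - np.log(sigma)
          + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

# Placeholder data: X holds an intercept and log inputs (teachers, students, teaching ability)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 3))])
y = X @ np.array([1.0, 0.4, 0.3, 0.2]) + rng.normal(0, 0.1, 200) - np.abs(rng.normal(0, 0.2, 200))

theta0 = np.zeros(X.shape[1] + 2)
fit = minimize(neg_loglik, theta0, args=(y, X), method="BFGS")
print(fit.x[:X.shape[1]])                               # estimated frontier coefficients
```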

Keywords: efficiency, secondary schools, ICT, stochastic frontier analysis

Procedia PDF Downloads 149
15393 Disentangling Biological Noise in Cellular Images with a Focus on Explainability

Authors: Manik Sharma, Ganapathy Krishnamurthi

Abstract:

The cost of some drugs and medical treatments has risen so much in recent years that many patients are having to go without them. One of the more surprising reasons behind the cost is how long it takes to bring new treatments to market: despite improvements in technology and science, research and development continues to lag, and finding a new treatment takes, on average, more than 10 years and costs hundreds of millions of dollars. A cellular image classification project could make researchers more efficient. If successful, it could dramatically improve the industry's ability to model cellular images according to their relevant biology, in turn greatly decreasing the cost of treatments and ensuring these treatments get to patients faster. This work aims at solving a part of this problem by creating a cellular image classification model that can decipher the genetic perturbations in cells (occurring naturally or artificially). Another interesting question addressed is what makes the deep-learning model decide in a particular fashion, which can further help in demystifying the mechanism of action of certain perturbations and paves the way towards the explainability of the deep-learning model.
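
As a hedged illustration of the explainability question (the abstract does not specify the method used), the sketch below shows occlusion sensitivity, a simple model-agnostic probe: a patch is slid over the cellular image and the drop in predicted class probability is recorded. The `model`, image shape, and patch size are assumptions.

```python
import numpy as np

def occlusion_map(model, image, patch=16, baseline=0.0, target_class=0):
    """Slide a grey patch over the image and record the drop in the predicted
    probability of `target_class`; large drops mark regions the model relies on.
    `model.predict` is assumed to return class probabilities for a batch of images
    (e.g., a Keras classifier)."""
    h, w = image.shape[:2]
    base_prob = model.predict(image[None])[0, target_class]
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = baseline
            prob = model.predict(occluded[None])[0, target_class]
            heat[i // patch, j // patch] = base_prob - prob
    return heat
```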

Keywords: cellular images, genetic perturbations, deep-learning, explainability

Procedia PDF Downloads 110
15392 Cognitive Model of Analogy Based on Operation of the Brain Cells: Glial, Axons and Neurons

Authors: Ozgu Hafizoglu

Abstract:

Analogy is an essential tool of human cognition that enables connecting diffuse and diverse systems with attributional, deep structural, and causal relations that are essential to learning, to innovation in artificial worlds, and to discovery in science. The Cognitive Model of Analogy (CMA) leads and creates information-pattern transfer within and between domains and disciplines in science. This paper presents the Cognitive Model of Analogy (CMA) as an evolutionary approach to scientific research. The model addresses the challenges of deep uncertainty about the future, emphasizing the need for flexibility of the system in order to enable the reasoning methodology to adapt to changing conditions. In this paper, the model of analogical reasoning is built on brain cells and their fractal and operational forms within the system itself. Visualization techniques are used to show correspondences. Distinct phases of the problem-solving process are distinguished: encoding, mapping, inference, and response. The system is related to brain activation in each of these phases, with an emphasis on achieving a better visualization of the brain cells (glial cells, axons, axon terminals, and neurons) relative to the matching conditions of analogical reasoning and relational information. It is found that the encoding, mapping, inference, and response processes in four-term analogical reasoning correspond with the fractal and operational forms of brain cells: glial cells, axons, and neurons.

Keywords: analogy, analogical reasoning, cognitive model, brain and glial cells

Procedia PDF Downloads 184
15391 Efficient Model Selection in Linear and Non-Linear Quantile Regression by Cross-Validation

Authors: Yoonsuh Jung, Steven N. MacEachern

Abstract:

The check loss function is used to define quantile regression. In the context of cross-validation, it is also employed as a validation function when the underlying truth is unknown. However, our empirical study indicates that validation with the check loss often leads to choosing an overfitted model. In this work, we suggest a modified, L2-adjusted check loss which rounds the sharp corner in the middle of the check loss. This guards against overfitted models to some extent. Through various simulation settings of linear and non-linear regression, the improvement of the check loss by the L2 adjustment is empirically examined. The adjustment is devised to shrink to zero as the sample size grows.
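
A minimal sketch of the idea follows: the standard check loss and one possible L2-adjusted (corner-rounded) variant. The specific quadratic form of the adjustment and the smoothing width `delta` are assumptions for illustration; the paper's exact adjustment may differ.

```python
import numpy as np

def check_loss(u, tau):
    """Standard check loss rho_tau(u) = u * (tau - 1{u < 0})."""
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0).astype(float))

def l2_adjusted_check_loss(u, tau, delta):
    """Check loss with the sharp corner at u = 0 replaced by a quadratic piece
    on [-delta, delta] (a Huber-like rounding assumed here for illustration);
    delta is meant to shrink to zero as the sample size grows."""
    u = np.asarray(u, dtype=float)
    out = check_loss(u, tau)
    inside = np.abs(u) <= delta
    weight = np.where(u[inside] < 0, 1.0 - tau, tau)
    out[inside] = (u[inside] ** 2 / (2.0 * delta) + delta / 2.0) * weight  # matches check loss at |u| = delta
    return out

u = np.linspace(-1.0, 1.0, 9)
print(check_loss(u, tau=0.25))
print(l2_adjusted_check_loss(u, tau=0.25, delta=0.2))
```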

Keywords: cross-validation, model selection, quantile regression, tuning parameter selection

Procedia PDF Downloads 435
15390 Uncertainty in Risk Modeling

Authors: Mueller Jann, Hoffmann Christian Hugo

Abstract:

Conventional quantitative risk management in banking is a risk factor of its own, because it rests on assumptions such as independence and availability of data which do not hold when rare events of extreme consequence are involved. There is a growing recognition of the need for alternative risk measures that do not make these assumptions. We propose a novel method for modeling the risk associated with investment products, in particular derivatives, by using a formal language for specifying financial contracts. Expressions in this language are interpreted in the category of values annotated with (a formal representation of) uncertainty. The choice of uncertainty formalism thus becomes a parameter of the model, so it can be adapted to the particular application and is not constrained to classical probabilities. We demonstrate our approach using a simple logic-based uncertainty model and a case study in which we assess the risk of counterparty default in a portfolio of collateralized loans.
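
Purely as an illustration of the idea of interpreting contract expressions over uncertainty-annotated values, the sketch below uses intervals as a stand-in uncertainty formalism and a few toy combinators; the authors' actual contract language, combinators, and uncertainty monad are not specified in the abstract and are not reproduced here.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Uncertain:
    """A value annotated with an interval of uncertainty (one possible formalism;
    the paper treats the uncertainty formalism as a swappable parameter)."""
    lo: float
    hi: float

    def __add__(self, other):
        return Uncertain(self.lo + other.lo, self.hi + other.hi)

    def scaled(self, k):
        return Uncertain(min(k * self.lo, k * self.hi), max(k * self.lo, k * self.hi))

# Tiny contract combinators in the spirit of a formal contract language (hypothetical names)
def zero():                  return Uncertain(0.0, 0.0)
def one(price: Uncertain):   return price            # receive one unit of an asset
def both(c1, c2):            return c1 + c2          # hold two contracts
def scale(k, c):             return c.scaled(k)      # k copies of a contract

# Illustrative valuation of a collateralized-loan leg under uncertain recovery
collateral = Uncertain(80.0, 120.0)                  # uncertain collateral value
exposure = both(scale(0.6, one(collateral)), zero())
print(exposure)                                      # Uncertain(lo=48.0, hi=72.0)
```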

Keywords: risk model, uncertainty monad, derivatives, contract algebra

Procedia PDF Downloads 575
15389 Comparison Analysis of CFD Turbulence Fluid Numerical Study for Quick Coupling

Authors: JoonHo Lee, KyoJin An, JunSu Kim, Young-Chul Park

Abstract:

In this study, a CFD model of a non-split quick coupling for flow control in hydraulic system equipment for the aerospace sector is used to predict the fluid flow characteristics and performance. Several turbulence models were considered for the CFD model of the non-split quick coupling. In addition, the adequacy of the CFD model was verified by comparison with reference values. Based on this analysis, the fluid flow characteristics can be predicted accurately. The results therefore contribute to the reliable design, in terms of flow characteristics, of the quick couplings required in industry.

Keywords: CFD, FEM, quick coupling, turbulence

Procedia PDF Downloads 382
15388 Deepfake Detection for Compressed Media

Authors: Sushil Kumar Gupta, Atharva Joshi, Ayush Sonawale, Sachin Naik, Rajshree Khande

Abstract:

The use of artificially created videos and audio generated by deep learning is a major problem in the current media landscape, as it fuels misinformation and distrust. The objective of this work is therefore to build a reliable deepfake detection model using deep learning that helps detect forged videos accurately. In this work, CelebDF v1, one of the largest deepfake benchmark datasets in the literature, is adopted to train and test the proposed models. The data include authentic and synthetic videos of high quality, allowing an assessment of the model's performance against realistic distortions.
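
A hedged sketch of the kind of frame-level classifier the keywords suggest is given below, using a pretrained Xception backbone with Keras transfer learning. The input size, head layers, and training schedule are assumptions, and the extraction of face frames from the CelebDF v1 videos is not shown.

```python
import tensorflow as tf

def build_deepfake_classifier(input_shape=(299, 299, 3)):
    """Binary real/fake classifier built on a pretrained Xception backbone."""
    base = tf.keras.applications.Xception(
        include_top=False, weights="imagenet", input_shape=input_shape)
    base.trainable = False                                   # optionally fine-tune later
    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.applications.xception.preprocess_input(inputs)
    x = base(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.3)(x)
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

model = build_deepfake_classifier()
# model.fit(train_frames, train_labels, validation_data=(val_frames, val_labels), epochs=5)
```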

Keywords: deepfake detection, CelebDF v1, convolutional neural network (CNN), xception model, data augmentation, media manipulation

Procedia PDF Downloads 5
15387 Designing a Model for Preparing Reports on the Automatic Earned Value Management Progress by the Integration of Primavera P6, SQL Database, and Power BI: A Case Study of a Six-Storey Concrete Building in Mashhad, Iran

Authors: Hamed Zolfaghari, Mojtaba Kord

Abstract:

Project planners and controllers are frequently faced with the challenge of inadequate software for the preparation of automatic project progress reports based on actual project information updates. They usually build dashboards in Microsoft Excel, which are local and cannot be accessed online, and which are not linked to planning software such as Microsoft Project, which in turn lacks the database required for data storage. This study aimed to propose a model for the preparation of automatic online project progress reports based on actual project information updates by integrating Primavera P6, an SQL database, and Power BI for a construction project. The designed model enables project planners and controllers to prepare project reports automatically and immediately after updating the project schedule with actual information. To develop the model, the data were entered into P6, and the information was stored in the SQL database. The proposed model can prepare a wide range of reports, such as earned value management, HR, financial, physical, and risk reports, automatically in the Power BI application. Furthermore, the reports can be published and shared online.
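
As an illustration of the earned value calculations such a report would surface (not the authors' implementation), the sketch below computes the standard EVM indicators from activity-level data assumed to have been pulled from the P6 schedule via the SQL database; the column names and figures are hypothetical.

```python
import pandas as pd

# Hypothetical activity-level data exported from the P6 schedule / SQL database
df = pd.DataFrame({
    "activity": ["Foundations", "Columns L1", "Slab L1"],
    "planned_value": [120_000, 80_000, 95_000],   # PV: budgeted cost of work scheduled
    "earned_value":  [120_000, 64_000, 40_000],   # EV: budgeted cost of work performed
    "actual_cost":   [130_000, 70_000, 45_000],   # AC: actual cost of work performed
})

# Standard EVM indicators per activity
df["schedule_variance"] = df["earned_value"] - df["planned_value"]   # SV = EV - PV
df["cost_variance"]     = df["earned_value"] - df["actual_cost"]     # CV = EV - AC
df["spi"] = df["earned_value"] / df["planned_value"]                 # SPI = EV / PV
df["cpi"] = df["earned_value"] / df["actual_cost"]                   # CPI = EV / AC

# Project-level indices, as would be shown on the Power BI dashboard
project_spi = df["earned_value"].sum() / df["planned_value"].sum()
project_cpi = df["earned_value"].sum() / df["actual_cost"].sum()
print(df[["activity", "spi", "cpi"]])
print(f"project SPI = {project_spi:.2f}, project CPI = {project_cpi:.2f}")
```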

Keywords: primavera P6, SQL, Power BI, EVM, integration management

Procedia PDF Downloads 106
15386 Artificial Neural Network Based Parameter Prediction of Miniaturized Solid Rocket Motor

Authors: Hao Yan, Xiaobing Zhang

Abstract:

The working mechanism of miniaturized solid rocket motors (SRMs) is not yet fully understood, and it is imperative to explore their unique features. However, there are many disadvantages to using common multi-objective evolutionary algorithms (MOEAs) in predicting the parameters of a miniaturized SRM during its conceptual design phase. Initially, the design variables and objectives are constrained in a lumped parameter model (LPM) of this SRM, which leads to local optima in MOEAs. In addition, MOEAs require a large number of calculations due to their population strategy. Although the calculation time for simulating an LPM once is usually less than that of a CFD simulation, the number of function evaluations (NFEs) is usually large in MOEAs, which makes the total time cost unacceptably long. Moreover, the accuracy of the LPM is relatively low compared to that of a CFD model due to its assumptions. CFD simulations or experiments are required for comparison and verification of the optimal results obtained by MOEAs with an LPM. The conceptual design phase based on MOEAs is therefore a lengthy process, and its results are not precise enough due to the above shortcomings. An artificial neural network (ANN) based parameter prediction is proposed as a way to reduce time costs and improve prediction accuracy. In this method, an ANN is used to build a surrogate model that is trained with a 3D numerical simulation. In the design study, the original LPM is replaced by the surrogate model; both cases use the same MOEAs, the calculation times of the two models are compared, and their optimization results are compared with 3D simulation results. Using the surrogate model for the parameter prediction process of the miniaturized SRMs results in a significant increase in computational efficiency and an improvement in prediction accuracy. Thus, the ANN-based surrogate model does provide faster and more accurate parameter prediction for an initial design scheme. Moreover, even when the MOEAs converge to local optima, the time cost of the ANN-based surrogate model is much lower than that of the simplified physical model (LPM). This means that designers can save a lot of time during code debugging and parameter tuning in a complex design process. Designers can reduce repeated calculation costs and obtain accurate optimal solutions by combining an ANN-based surrogate model with MOEAs.
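
A minimal sketch of the surrogate idea follows: an MLP is trained on precomputed 3D-simulation samples and then evaluated inside an objective function in place of the expensive model. The design variables, the synthetic training data, and the network size are placeholders, not the SRM model from the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# X_sim: design variables (e.g., grain geometry, throat diameter) sampled beforehand;
# y_sim: corresponding performance metric from the 3D numerical simulation (placeholders here).
rng = np.random.default_rng(1)
X_sim = rng.uniform(0.0, 1.0, size=(300, 4))
y_sim = 1.0 + 2.0 * X_sim[:, 0] - X_sim[:, 1] ** 2 + 0.5 * X_sim[:, 2] * X_sim[:, 3]

surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0))
surrogate.fit(X_sim, y_sim)

def objective(x):
    """Objective evaluated by the MOEA: the ANN surrogate replaces an expensive 3D run."""
    return -surrogate.predict(np.atleast_2d(x))[0]   # negate to maximize performance

print(objective(np.array([0.5, 0.2, 0.8, 0.1])))
```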

Keywords: artificial neural network, solid rocket motor, multi-objective evolutionary algorithm, surrogate model

Procedia PDF Downloads 89
15385 Verification of a Simple Model for Rolling Isolation System Response

Authors: Aarthi Sridhar, Henri Gavin, Karah Kelly

Abstract:

Rolling Isolation Systems (RISs) are simple and effective means to mitigate earthquake hazards to equipment in critical and precious facilities, such as hospitals, network collocation facilities, supercomputer centers, and museums. The RIS works by isolating components from floor accelerations, reducing the inertial forces felt by the subsystem. The RIS consists of two platforms with counter-facing concave surfaces (dishes) in each corner. Steel balls lie inside the dishes and allow relative motion between the top and bottom platforms. Previously, a mathematical model for the dynamics of RISs was developed using Lagrange's equations (LE) and experimentally validated. A new mathematical model was developed using Gauss's Principle of Least Constraint (GPLC) and verified by comparing impulse response trajectories of the GPLC model and the LE model in terms of the peak displacements and accelerations of the top platform. Mathematical models for the RIS are tedious to derive because of the non-holonomic rolling constraints imposed on the system. However, using Gauss's Principle of Least Constraint to find the equations of motion removes some of the obscurity and yields a system that can be easily extended. Though the GPLC model requires more state variables, the equations of motion are far simpler. The non-holonomic constraint is enforced in terms of accelerations and therefore requires additional constraint stabilization methods to prevent numerical integration from driving the system unstable. The GPLC model allows the incorporation of more physical aspects of the RIS, such as the contribution of the vertical velocity of the platform to the kinetic energy and the mass of the balls. This mathematical model of the RIS is a tool to predict the motion of the isolation platform. The ability to statistically quantify the expected responses of the RIS is critical to the implementation of earthquake hazard mitigation.
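
To make the GPLC idea concrete (as a generic sketch, not the paper's RIS model), the snippet below solves the small quadratic program that Gauss's principle defines at each time step through its KKT system; the mass matrix, applied forces, and acceleration-level constraint Jacobian are placeholders.

```python
import numpy as np

def gplc_accelerations(M, f, A, b):
    """Gauss's Principle of Least Constraint: choose accelerations a that
    minimize (a - M^{-1} f)^T M (a - M^{-1} f) subject to A a = b, i.e. stay
    as close as possible to the unconstrained accelerations while satisfying
    the acceleration-level (rolling) constraints. Solved via the KKT system
    [[M, A^T], [A, 0]] [a; lambda] = [f; b]."""
    n, m = M.shape[0], A.shape[0]
    K = np.block([[M, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([f, b])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]          # accelerations and constraint-force multipliers

# Placeholder 3-DOF system with one acceleration-level constraint a1 + a2 = 0
M = np.diag([2.0, 2.0, 0.5])
f = np.array([0.0, -9.81 * 2.0, 0.1])
A = np.array([[1.0, 1.0, 0.0]])
b = np.array([0.0])
a, lam = gplc_accelerations(M, f, A, b)
print(a, lam)
```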

Keywords: earthquake hazard mitigation, earthquake isolation, Gauss’s Principle of Least Constraint, nonlinear dynamics, rolling isolation system

Procedia PDF Downloads 248
15384 Assessment of Modern RANS Models for the C3X Vane Film Cooling Prediction

Authors: Mikhail Gritskevich, Sebastian Hohenstein

Abstract:

The paper presents the results of a detailed assessment of several modern Reynolds-Averaged Navier-Stokes (RANS) turbulence models for the prediction of C3X vane film cooling at various injection regimes. Three models are considered, namely the Shear Stress Transport (SST) model, a modification of the SST model accounting for streamline curvature (SST-CC), and the Explicit Algebraic Reynolds Stress Model (EARSM). It is shown that all the considered models face a problem in predicting the adiabatic effectiveness in the vicinity of the cooling holes; however, accounting for the Reynolds stress anisotropy within the EARSM model noticeably increases the solution accuracy. Further downstream, all the models provide a reasonable agreement with the experimental data for the adiabatic effectiveness, and among the considered models the most accurate results are obtained with EARSM.

Keywords: discrete holes film cooling, Reynolds Averaged Navier-Stokes (RANS), Reynolds stress tensor anisotropy, turbulent heat transfer

Procedia PDF Downloads 417
15383 Islamic Finance: What is the Outlook for Italy?

Authors: Paolo Pietro Biancone

Abstract:

The spread of Islamic financial instruments is an opportunity to offer integration for the immigrant population and to attract, through specific products, the wealth of sovereign funds from Arab countries. However, it is important to consider the possibility of comparing a traditional finance model, which in recent times has given rise to many doubts, with an "alternative" finance model, where the ethical aspect arising from religious principles is very important.

Keywords: banks, Europe, Islamic finance, Italy

Procedia PDF Downloads 268
15382 The BL-5D Model: The Development of a Model of Instructional Design for Blended Learning Activities

Authors: Damian Gordon, Paul Doyle, Anna Becevel, Júlia Vilafranca Molero, Cinta Gascon, Arianna Vitiello, Tina Baloh

Abstract:

It has long been recognized that the creation of any teaching content can be enhanced if the development process follows a pre-defined approach, which is often referred to as an instructional design methodology. These methodologies typically define a number of stages, or phases, that an educator should undertake to help ensure the quality of the final teaching content that is developed. In this paper, we present an instructional design methodology that is focused specifically on the introduction of blended resources into a heretofore bricks-and-mortar course. To achieve this, research was undertaken concerning a range of models of instructional design, as well as literature covering some of the key challenges and “pain points” of blending. Following this, our model, the BL-5D model, is presented, which incorporates some key questions at each stage of this five-stage methodology to guide the development process. Finally, a discussion of some of the key themes and issues that have been uncovered in this work is presented, as well as a template for a blended learning case study that emerged from this approach.

Keywords: blended learning, challenges of blended learning, design methodologies, instructional design

Procedia PDF Downloads 117
15381 Numerical Simulation of a Three-Dimensional Framework under the Action of Two-Dimensional Moving Loads

Authors: Jia-Jang Wu

Abstract:

The objective of this research is to develop a general technique so that one may predict the dynamic behaviour of a three-dimensional scale crane model subjected to time-dependent moving point forces by means of conventional finite element computer packages. To this end, the whole scale crane model is divided into two parts: the stationary framework and the moving substructure. In such a case, the dynamic responses of a scale crane model can be predicted from the forced vibration responses of the stationary framework due to actions of the four time-dependent moving point forces induced by the moving substructure. Since the magnitudes and positions of the moving point forces are dependent on the relative positions between the trolley, moving substructure and the stationary framework, it can be found from the numerical results that the time histories for the moving speeds of the moving substructure and the trolley are the key factors affecting the dynamic responses of the scale crane model.

Keywords: moving load, moving substructure, dynamic responses, forced vibration responses

Procedia PDF Downloads 350
15380 Social Collaborative Learning Model Based on Proactive Involvement to Promote the Global Merit Principle in Cultivating Youths' Morality

Authors: Wera Supa, Panita Wannapiroon

Abstract:

This paper reports on the design of a social collaborative learning model based on proactive involvement to promote the global merit principle in cultivating youths' morality. The research proceeds in two phases: the first phase is to design the social collaborative learning model based on proactive involvement to promote the global merit principle in cultivating youths' morality, and the second is to evaluate the model. The sample group in this study consists of 15 experts in proactive participation, the moral merit principle, and youth morality cultivation, drawn from executives, lecturers, and professionals in information and communication technology and selected using the purposive sampling method. Data were analyzed using the arithmetic mean and standard deviation. This study found four significant factors in promoting hands-on collaboration under the global merit scheme in order to instill virtues in adolescents: 1) information and communication technology usage; 2) proactive involvement; 3) morality cultivation policy; and 4) the global merit principle. The experts agree that the social collaborative learning model based on proactive involvement is highly appropriate.

Keywords: social collaborative learning, proactive involvement, global merit principle, morality

Procedia PDF Downloads 385
15379 Two Concurrent Convolution Neural Networks TC*CNN Model for Face Recognition Using Edge

Authors: T. Alghamdi, G. Alaghband

Abstract:

In this paper, we develop a model that couples two concurrent convolutional neural networks with different filters (TC*CNN) for face recognition and compare its performance to an existing sequential CNN (base model). We also test and compare the quality and performance of the models on three datasets with various levels of complexity (easy, moderate, and difficult) and show that for the most complex datasets, edges produce the most accurate and efficient results. We further show that in such cases, while Support Vector Machine (SVM) models are fast, they do not produce accurate results.
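
For illustration, the sketch below builds a generic two-branch ("two concurrent") convolutional network in Keras, one branch on the face image and one on an edge map, with the branch features concatenated before classification. The filter sizes, depths, number of identities, and the edge-extraction step are assumptions, not the authors' exact TC*CNN.

```python
import tensorflow as tf
from tensorflow.keras import layers

def conv_branch(inputs, filters, kernel_size):
    """One convolutional branch with its own filter size."""
    x = layers.Conv2D(filters, kernel_size, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(filters * 2, kernel_size, activation="relu", padding="same")(x)
    x = layers.MaxPooling2D()(x)
    return layers.Flatten()(x)

image_in = tf.keras.Input(shape=(64, 64, 1), name="face")
edge_in = tf.keras.Input(shape=(64, 64, 1), name="edges")        # e.g., a Sobel/Canny edge map

features = layers.Concatenate()([conv_branch(image_in, 32, 3),
                                 conv_branch(edge_in, 32, 5)])   # different filters per branch
x = layers.Dense(128, activation="relu")(features)
outputs = layers.Dense(10, activation="softmax")(x)              # 10 identities as a placeholder

model = tf.keras.Model([image_in, edge_in], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```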

Keywords: convolutional neural network, edges, face recognition, support vector machine

Procedia PDF Downloads 152
15378 Mathematical Modeling of the Water Bridge Formation in Porous Media: PEMFC Microchannels

Authors: N. Ibrahim-Rassoul, A. Kessi, E. K. Si-Ahmed, N. Djilali, J. Legrand

Abstract:

The static and dynamic formation of liquid water bridges is analyzed using a combination of visualization experiments in a microchannel with a mathematical model. This paper presents experimental and theoretical findings on water plug/capillary bridge formation in a 250 μm square microchannel. The approach combines mathematical and numerical modeling with experimental visualization and measurements. The generality of the model is also illustrated for flow conditions encountered in the manipulation of polymeric materials and the formation of liquid bridges between patterned surfaces. The predictions of the model agree favorably with the observations as well as with the experimental recordings.

Keywords: green energy, mathematical modeling, fuel cell, water plug, gas diffusion layer, surface of revolution

Procedia PDF Downloads 528
15377 Automatic Classification of Lung Diseases from CT Images

Authors: Abobaker Mohammed Qasem Farhan, Shangming Yang, Mohammed Al-Nehari

Abstract:

Pneumonia is a kind of lung disease that creates congestion in the chest, and severe congestion in such pneumonic conditions can lead to loss of life. Pneumonic lung disease is caused by viral pneumonia, bacterial pneumonia, or Covid-19-induced pneumonia. The early prediction and classification of such lung diseases help to reduce the mortality rate. We propose an automatic Computer-Aided Diagnosis (CAD) system in this paper using a deep learning approach. The proposed CAD system takes raw computerized tomography (CT) scans of the patient's chest as input and automatically predicts the disease classification. We designed the Hybrid Deep Learning Algorithm (HDLA) to improve accuracy and reduce processing requirements. The raw CT scans are pre-processed first to enhance their quality for further analysis. We then apply a hybrid model that consists of automatic feature extraction and classification. We propose a robust 2D Convolutional Neural Network (CNN) model to extract automatic features from the pre-processed CT image. This CNN model ensures feature learning with extremely effective 1D feature extraction for each input CT image. The outcome of the 2D CNN model is then normalized using the Min-Max technique. The second step of the proposed hybrid model concerns training and classification using different classifiers. The simulation outcomes using a publicly available dataset prove the robustness and efficiency of the proposed model compared to state-of-the-art algorithms.
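
A hedged sketch of the hybrid pipeline outlined above is given below: a small 2D CNN acts as a feature extractor, the resulting feature vectors are Min-Max normalized, and a classical classifier is trained on top. The architecture, the SVM back-end, and the random placeholder data are illustrative choices, not the paper's exact HDLA.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

def feature_extractor(input_shape=(128, 128, 1)):
    """Small 2D CNN that maps a pre-processed CT slice to a 1D feature vector."""
    inputs = tf.keras.Input(shape=input_shape)
    x = layers.Conv2D(16, 3, activation="relu")(inputs)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(32, 3, activation="relu")(x)
    x = layers.GlobalAveragePooling2D()(x)          # 1D features per image
    return tf.keras.Model(inputs, x)

extractor = feature_extractor()

# ct_train: pre-processed CT slices; y_train: disease labels (random placeholders here)
ct_train = np.random.rand(20, 128, 128, 1).astype("float32")
y_train = np.random.randint(0, 3, 20)

features = extractor.predict(ct_train)
features = MinMaxScaler().fit_transform(features)   # Min-Max normalization step
clf = SVC(kernel="rbf").fit(features, y_train)      # one possible second-stage classifier
print(clf.predict(features[:3]))
```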

Keywords: CT scan, Covid-19, deep learning, image processing, lung disease classification

Procedia PDF Downloads 153
15376 Supervised-Component-Based Generalised Linear Regression with Multiple Explanatory Blocks: THEME-SCGLR

Authors: Bry X., Trottier C., Mortier F., Cornu G., Verron T.

Abstract:

We address component-based regularization of a Multivariate Generalized Linear Model (MGLM). A set of random responses Y is assumed to depend, through a GLM, on a set X of explanatory variables, as well as on a set T of additional covariates. X is partitioned into R conceptually homogeneous blocks X1, ..., XR, viewed as explanatory themes. The variables in each Xr are assumed to be many and redundant; thus, Generalised Linear Regression (GLR) demands regularization with respect to each Xr. By contrast, the variables in T are assumed to be selected so as to demand no regularization. Regularization is performed by searching each Xr for an appropriate number of orthogonal components that both contribute to modeling Y and capture relevant structural information in Xr. We propose a very general criterion to measure the structural relevance (SR) of a component in a block, and show how to take SR into account within a Fisher-scoring-type algorithm in order to estimate the model. We also show how to deal with mixed-type explanatory variables. The method, named THEME-SCGLR, is tested on simulated data.

Keywords: Component-Model, Fisher Scoring Algorithm, GLM, PLS Regression, SCGLR, SEER, THEME

Procedia PDF Downloads 392
15375 A Model Towards Creating Positive Accounting Classroom Conditions That Supports Successful Learning at School

Authors: Vine Petzer, Mirna Nel

Abstract:

An explanatory mixed-methods design was used to investigate accounting classroom conditions in the Further Education and Training (FET) phase in South Africa. A descriptive survey study with a heterogeneous group of learners and teachers was conducted in the first phase. In the qualitative phase, semi-structured individual interviews with learners and teachers, as well as observations in the accounting classroom, were employed to gain a more in-depth understanding of the learning conditions in the accounting classroom. The findings of the empirical research informed the development of a model for teachers of accounting, supporting them in using more effective teaching methods and creating positive learning conditions in which all learners can experience successful learning. A model towards creating positive accounting classroom conditions that support successful learning was developed and is recommended to education policy- and decision-makers for use as a classroom intervention and capacity-building tool. The model identifies and delineates classroom practices that exert a significant effect on learners' attainment of quality education.

Keywords: accounting classroom conditions, positive education, successful learning, teaching accounting

Procedia PDF Downloads 145
15374 Resistance and Sub-Resistances of RC Beams Subjected to Multiple Failure Modes

Authors: F. Sangiorgio, J. Silfwerbrand, G. Mancini

Abstract:

Geometric and mechanical properties all influence the resistance of RC structures and may, in certain combinations of property values, increase the risk of a brittle failure of the whole system. This paper presents a statistical and probabilistic investigation of the resistance of RC beams designed according to Eurocodes 2 and 8 and subjected to multiple failure modes, under both the natural variation of material properties and the uncertainty associated with cross-section and transverse reinforcement geometry. A full probabilistic model based on the JCSS Probabilistic Model Code is derived. Different beams are studied through material nonlinear analysis via Monte Carlo simulations. The resistance model is consistent with Eurocode 2. Both a multivariate statistical evaluation and a data clustering analysis of the outcomes are then performed. The results show that the ultimate load behaviour of RC beams subjected to flexural and shear failure modes seems to be mainly influenced by the combination of the mechanical properties of both the longitudinal reinforcement and the stirrups, and the tensile strength of the concrete, with the latter appearing to affect the overall response of the system in a nonlinear way. The model uncertainty of the resistance model used in the analysis undoubtedly plays an important role in interpreting the results.
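
As a much-reduced illustration of the Monte Carlo idea (the study itself uses material nonlinear analysis, not a closed-form section check), the sketch below samples random material and geometric properties and evaluates a simplified rectangular-stress-block flexural resistance; the distributions, parameters, and section are assumptions, not the JCSS models used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Random basic variables (illustrative distributions and parameters)
fy = rng.lognormal(mean=np.log(550.0), sigma=0.05, size=n)   # steel yield strength [MPa]
fc = rng.lognormal(mean=np.log(38.0), sigma=0.15, size=n)    # concrete strength [MPa]
b  = rng.normal(300.0, 5.0, size=n)                          # section width [mm]
d  = rng.normal(450.0, 10.0, size=n)                         # effective depth [mm]
As = 4 * np.pi * 16.0**2 / 4                                 # 4 bars of 16 mm [mm^2]

# Simplified rectangular stress block: M_R = As*fy*(d - a/2), a = As*fy/(0.85*fc*b)
a = As * fy / (0.85 * fc * b)
MR = As * fy * (d - a / 2) / 1e6                             # flexural resistance [kNm]

print(f"mean resistance = {MR.mean():.1f} kNm, c.o.v. = {MR.std() / MR.mean():.3f}")
print(f"5% fractile     = {np.quantile(MR, 0.05):.1f} kNm")
```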

Keywords: modelling, Monte Carlo simulations, probabilistic models, data clustering, reinforced concrete members, structural design

Procedia PDF Downloads 471
15373 Soft Computing Employment to Optimize Safety Stock Levels in Supply Chain Dairy Product under Supply and Demand Uncertainty

Authors: Riyadh Jamegh, Alla Eldin Kassam, Sawsan Sabih

Abstract:

In order to overcome uncertainty and the inability to meet customers' requests that it causes, organizations tend to reserve a certain safety stock level (SSL). This level must be chosen carefully in order to avoid an increase in holding cost due to an excessive SSL or a shortage cost due to a too low SSL. This paper uses soft computing fuzzy logic to identify the optimal SSL; the fuzzy model uses a dynamic concept to cope with a highly complex environment. The proposed model deals with three input variables, i.e., demand stability level, raw material availability level, and on-hand inventory level, and uses dynamic fuzzy logic to obtain the best SSL as an output. In this model, the demand stability, raw material, and on-hand inventory levels are described linguistically and then treated by the inference rules of the fuzzy model to extract the best level of safety stock. The aim of this research is to provide a dynamic approach for identifying the safety stock level that can be implemented in different industries. A numerical case study in the dairy industry, with a 200 g yogurt cup product, is presented to validate the proposed model. The obtained results are compared with the current safety stock level, which is calculated using the traditional approach. The importance of the proposed model is demonstrated by the significant reduction in the safety stock level.
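
A minimal hand-rolled Mamdani-style sketch with the three inputs named above and SSL as the output is given below, using triangular membership functions and centroid defuzzification. The membership parameters, the three-rule base, and the output scale are illustrative assumptions, not the paper's fuzzy model.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def safety_stock_level(demand_stability, material_avail, on_hand):
    """Inputs on a 0-1 scale; output SSL expressed as a fraction of cycle demand."""
    ssl = np.linspace(0.0, 0.5, 501)                           # output universe
    low_out = tri(ssl, -0.1, 0.0, 0.2)                         # low / medium / high SSL sets
    mid_out = tri(ssl, 0.1, 0.25, 0.4)
    high_out = tri(ssl, 0.3, 0.5, 0.6)

    low = lambda v: tri(v, -0.5, 0.0, 0.5)                     # illustrative input sets
    high = lambda v: tri(v, 0.5, 1.0, 1.5)

    # Illustrative rule base (min for AND, clipping for implication, max for aggregation)
    r1 = min(high(demand_stability), high(material_avail))     # stable demand, material available -> low SSL
    r2 = min(low(demand_stability), high(on_hand))             # unstable demand, stock on hand    -> medium SSL
    r3 = min(low(demand_stability), low(material_avail))       # unstable demand, scarce material  -> high SSL

    aggregated = np.maximum.reduce([np.minimum(r1, low_out),
                                    np.minimum(r2, mid_out),
                                    np.minimum(r3, high_out)])
    return float(np.sum(ssl * aggregated) / (np.sum(aggregated) + 1e-12))   # centroid defuzzification

print(safety_stock_level(demand_stability=0.2, material_avail=0.3, on_hand=0.7))
```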

Keywords: inventory optimization, soft computing, safety stock optimization, dairy industries inventory optimization

Procedia PDF Downloads 123
15372 Integrating Machine Learning and Rule-Based Decision Models for Enhanced B2B Sales Forecasting and Customer Prioritization

Authors: Wenqi Liu, Reginald Bailey

Abstract:

This study explores an advanced approach to enhancing B2B sales forecasting by integrating machine learning models with a rule-based decision framework. The methodology begins with the development of a machine learning classification model to predict conversion likelihood, aiming to improve accuracy over traditional methods like logistic regression. The classification model's effectiveness is measured using metrics such as accuracy, precision, recall, and F1 score, alongside a feature importance analysis to identify key predictors. Following this, a machine learning regression model is used to forecast sales value, with the objective of reducing mean absolute error (MAE) compared to linear regression techniques. The regression model's performance is assessed using MAE, root mean square error (RMSE), and R-squared metrics, emphasizing feature contribution to the prediction. To bridge the gap between predictive analytics and decision-making, a rule-based decision model is introduced that prioritizes customers based on predefined thresholds for conversion probability and predicted sales value. This approach significantly enhances customer prioritization and improves overall sales performance by increasing conversion rates and optimizing revenue generation. The findings suggest that this combined framework offers a practical, data-driven solution for sales teams, facilitating more strategic decision-making in B2B environments.
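
A minimal sketch of the combined framework follows: a conversion classifier, a deal-value regressor, and a rule layer that prioritizes leads whose predicted conversion probability and sales value exceed preset thresholds. The features, thresholds, labels, and model choices are placeholders for illustration.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

rng = np.random.default_rng(7)
X = pd.DataFrame({"num_contacts": rng.integers(1, 20, 500),
                  "company_size": rng.integers(10, 5000, 500),
                  "past_purchases": rng.integers(0, 15, 500)})
converted = (rng.random(500) < 0.3).astype(int)                 # placeholder conversion labels
deal_value = rng.gamma(2.0, 10_000, 500) * converted            # value realized only on conversion

clf = GradientBoostingClassifier().fit(X, converted)            # conversion-likelihood model
reg = GradientBoostingRegressor().fit(X[converted == 1], deal_value[converted == 1])

def prioritize(leads, p_threshold=0.5, value_threshold=20_000):
    """Rule-based layer on top of the two models: flag leads whose predicted
    conversion probability and sales value both clear predefined thresholds."""
    p = clf.predict_proba(leads)[:, 1]
    value = reg.predict(leads)
    return pd.DataFrame({"p_convert": p, "pred_value": value,
                         "priority": (p >= p_threshold) & (value >= value_threshold)})

print(prioritize(X.head(5)))
```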

Keywords: sales forecasting, machine learning, rule-based decision model, customer prioritization, predictive analytics

Procedia PDF Downloads 14
15371 Learning Algorithms for Fuzzy Inference Systems Composed of Double- and Single-Input Rule Modules

Authors: Hirofumi Miyajima, Kazuya Kishida, Noritaka Shigei, Hiromi Miyajima

Abstract:

Most self-tuning fuzzy systems, which are automatically constructed from learning data, are based on the steepest descent method (SDM). However, this approach often requires a long convergence time and gets stuck in shallow local minima. One solution is to use fuzzy rule modules with a small number of inputs, such as DIRMs (Double-Input Rule Modules) and SIRMs (Single-Input Rule Modules). In this paper, we consider a (generalized) DIRMs model composed of double- and single-input rule modules. Further, in order to reduce the redundant modules of the (generalized) DIRMs model, pruning and generative learning algorithms for the model are suggested. To show their effectiveness, numerical simulations for function approximation, Box-Jenkins, and obstacle avoidance problems are performed.

Keywords: Box-Jenkins's problem, double-input rule module, fuzzy inference model, obstacle avoidance, single-input rule module

Procedia PDF Downloads 352
15370 Improving Reading Comprehension Skills of Elementary School Students through Cooperative Integrated Reading and Composition Model Using Padlet

Authors: Neneng Hayatul Milah

Abstract:

The most important reading skill for students is comprehension, and understanding the reading text has an impact on learning outcomes. However, reading comprehension instruction in Indonesian elementary schools is lacking, and a more effective learning model is needed to enhance students' reading comprehension. This study aimed to evaluate the effectiveness of the CIRC (Cooperative Integrated Reading and Composition) model with Padlet integration in improving the reading comprehension skills of grade IV students in elementary schools in Cimahi City, Indonesia. The research methodology was quantitative, of the pre-experimental type, with a one-group pretest-posttest design. The sample of this study consisted of 30 students. The results of the statistical analysis showed that the CIRC learning model using Padlet had a significant effect on improving students' reading comprehension of narrative text. The mean pretest score was 67.41, while the mean posttest score increased to 84.82. The paired-samples t-test resulted in a t value of -13.706 with a significance of <0.001, which is smaller than α = 0.05. This research is expected to provide useful insights for educational practitioners on how the CIRC model with Padlet can improve the reading comprehension skills of elementary school students.
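
As a brief illustration of the paired-samples t-test reported above (not the study's data), the sketch below runs the same kind of test with scipy on placeholder pretest/posttest scores.

```python
import numpy as np
from scipy import stats

# Placeholder pretest/posttest reading-comprehension scores for 30 students
rng = np.random.default_rng(3)
pretest = np.clip(rng.normal(67.4, 8.0, 30), 0, 100)
posttest = np.clip(pretest + rng.normal(17.4, 5.0, 30), 0, 100)

t_stat, p_value = stats.ttest_rel(pretest, posttest)   # paired-samples t-test
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")           # p < 0.05 indicates a significant gain
```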

Keywords: reading comprehension skills, CIRC, Padlet, narrative text

Procedia PDF Downloads 31
15369 CFD Study of Subcooled Boiling Flow at Elevated Pressure Using a Mechanistic Wall Heat Partitioning Model

Authors: Machimontorn Promtong, Sherman C. P. Cheung, Guan H. Yeoh, Sara Vahaji, Jiyuan Tu

Abstract:

The wide range of industrial applications involving boiling flows underlines the necessity of establishing fundamental knowledge of boiling flow phenomena. For this purpose, a number of experimental and numerical studies have been performed to elucidate the underlying physics of this flow. In this paper, improved wall boiling models, implemented in ANSYS CFX 14.5, are introduced to study subcooled boiling flow at elevated pressure. At the heated wall boundary, a fractal model, a force-balance approach, and a mechanistic frequency model are used to predict the nucleation site density, bubble departure diameter, and bubble departure frequency, respectively. The presented wall heat flux partitioning closures were modified to consider the influence of bubbles sliding along the wall before lift-off, which usually happens in flow boiling. The simulation was performed based on the two-fluid model, with the standard k-ω SST model selected for turbulence modelling. Existing experimental data at around 5 bar were chosen to evaluate the accuracy of the presented mechanistic approach. The void fraction and Interfacial Area Concentration (IAC) are in good agreement with the experimental data. However, the bubble velocity and Sauter Mean Diameter (SMD) are over-predicted. This over-prediction may be caused by considering only dispersed, spherical bubbles in the simulations. In future work, important physical mechanisms of bubbles, such as merging and shrinking during sliding on the heated wall, will be incorporated into this mechanistic model to enhance its capability over a wider range of flow predictions.

Keywords: subcooled boiling flow, computational fluid dynamics (CFD), mechanistic approach, two-fluid model

Procedia PDF Downloads 317
15368 Multi-Objective Multi-Period Allocation of Temporary Earthquake Disaster Response Facilities with Multi-Commodities

Authors: Abolghasem Yousefi-Babadi, Ali Bozorgi-Amiri, Aida Kazempour, Reza Tavakkoli-Moghaddam, Maryam Irani

Abstract:

All over the world, natural disasters (e.g., earthquakes, floods, volcanoes, and hurricanes) cause many deaths. Earthquakes in particular are catastrophic events, triggered by unusual phenomena, that lead to great losses around the world. Such disasters strongly demand extensive long-term help and relief, which can be hard to manage. Supplies and facilities are a major challenge after any earthquake and should be prepared for the disaster regions to satisfy the demands of the people suffering from the earthquake. This paper formulates the disaster response facility allocation problem for disaster relief operations as a mathematical programming model. Earthquake victims need not only consumable commodities (e.g., food and water) but also non-consumable commodities (e.g., clothes) to protect themselves. Therefore, paying attention to disaster points and people's demands is essential, and both consumable and non-consumable commodities are considered in the presented model. The paper presents a multi-objective, multi-period mathematical programming model that simultaneously minimizes the average of the weighted response times and the total operational cost plus the penalty costs of unmet demand and unused commodities. Furthermore, a Chebycheff multi-objective solution procedure is applied as a powerful algorithm to solve the proposed model. Finally, to illustrate the model's applicability, a case study of the Tehran earthquake is presented, and a sensitivity analysis is carried out to validate the model.

Keywords: facility location, multi-objective model, disaster response, commodity

Procedia PDF Downloads 257