Search results for: term-matching model
12165 Optimization of Springback Prediction in U-Channel Process Using Response Surface Methodology
Authors: Muhamad Sani Buang, Shahrul Azam Abdullah, Juri Saedon
Abstract:
There is little effective guidance on the selection of design parameters for springback of advanced high strength steel sheet metal in the U-channel cold forming process. This paper presents the development of a predictive model for springback in the U-channel process on advanced high strength steel sheet employing Response Surface Methodology (RSM). The experiments were performed on dual phase steel sheet DP590 in the U-channel forming process, while a design of experiments (DoE) approach was used to investigate the effects of four input factors, namely blank holder force (BHF), clearance (C), punch travel (Tp) and rolling direction (R), each at two levels, using a full factorial design (2⁴). The analysis of variance (ANOVA) showed that blank holder force (BHF), clearance (C) and punch travel (Tp) have a significant effect on the springback of the flange angle (β2) and wall opening angle (β1), while the rolling direction (R) factor is insignificant. The significant parameters were optimized to reduce the springback behavior using a Central Composite Design (CCD) in RSM, and the optimum parameters were determined. A regression model for springback was developed. The effect of the individual parameters and their response was also evaluated. The results obtained from the optimum model are in agreement with the experimental values.
Keywords: advanced high strength steel, U-channel process, springback, design of experiment, optimization, response surface methodology (RSM)
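For readers unfamiliar with the design-of-experiments workflow behind this abstract, the following minimal Python sketch (not the authors' code) builds a coded 2⁴ full factorial design for the four factors named above (BHF, C, Tp, R) and fits a main-effects regression; the synthetic response values are illustrative assumptions standing in for measured springback angles.

```python
# Hedged sketch: a 2^4 full factorial design with a least-squares main-effects model.
# The response below is synthetic and only mimics the reported pattern (R insignificant).
import itertools
import numpy as np

factors = ["BHF", "C", "Tp", "R"]
# Coded levels -1/+1 for each factor, 2^4 = 16 runs
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))), dtype=float)

rng = np.random.default_rng(0)
y = 6.0 - 1.2 * design[:, 0] + 0.8 * design[:, 1] + 0.5 * design[:, 2] \
    + 0.0 * design[:, 3] + rng.normal(0, 0.1, len(design))   # hypothetical springback angle

# Main-effects regression: y = b0 + sum(bi * xi)
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["intercept"] + factors, coef):
    print(f"{name:9s} effect estimate: {b:+.3f}")
```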
Procedia PDF Downloads 540
12164 Speed Optimization Model for Reducing Fuel Consumption Based on Shipping Log Data
Authors: Ayudhia P. Gusti, Semin
Abstract:
It is known that the total operating cost of a vessel is dominated by the cost of fuel. The question that arises is how to reduce a ship's fuel consumption so that operational fuel costs are minimized. As the basis of this kind of problem, sailing speed determination is an important factor to be considered by a shipping company. Optimal speed determination has a significant influence on the route and berth schedule of ships, which also affects vessel operating costs. The purpose of this paper is to clarify some important issues about ship speed optimization. Sailing speed, displacement, sailing time, and specific fuel consumption were obtained from shipping log data and further analyzed for modeling the speed optimization. The presented speed optimization model is expected to reduce fuel consumption and the associated fuel cost.
Keywords: maritime transportation, reducing fuel, shipping log data, speed optimization
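As a rough illustration of the kind of model the abstract describes (and not the paper's actual formulation), the sketch below assumes the common power-law relation F(v) ≈ a·v^b between daily fuel consumption and sailing speed, fits the exponent from log-style records, and searches for a cost-minimising speed; the log values, voyage distance, fuel price and daily operating cost are all hypothetical.

```python
# Hedged sketch: fit a power-law fuel-speed curve from log-style records, then trade
# fuel cost against time-related daily cost over a fixed-distance voyage.
import numpy as np

speeds_log = np.array([10.5, 11.2, 12.0, 12.8, 13.5, 14.1])   # knots (stand-in log data)
fuel_log = np.array([18.0, 21.5, 26.0, 31.5, 37.0, 42.5])     # tonnes/day (stand-in log data)

b, log_a = np.polyfit(np.log(speeds_log), np.log(fuel_log), 1)  # log F = log a + b log v
a = np.exp(log_a)

distance_nm = 3000.0      # voyage length in nautical miles (assumed)
fuel_price = 600.0        # USD per tonne (assumed)
daily_cost = 20000.0      # other time-related daily cost, e.g. charter hire (assumed)

v = np.linspace(9.0, 15.0, 121)
days = distance_nm / (v * 24.0)
total_cost = fuel_price * a * v**b * days + daily_cost * days

print(f"fitted exponent b = {b:.2f}")
print(f"cost-minimising speed = {v[np.argmin(total_cost)]:.1f} kn")
```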
Procedia PDF Downloads 566
12163 Coherencing a Diametrical Interests between the State, Adat Community and Private Interests in Utilising the Land for Investment in Indonesia
Authors: L. M. Hayyan ul Haq, Lalu Sabardi
Abstract:
This research is aimed at exploring an appropriate regulatory model for coherencing the diametrical interests of the state, the Adat legal community, and private parties in utilising and optimising land in Indonesia. This work is also highly relevant to the obligation of the state to respect, to fulfil and to protect the fundamental rights of the people, especially the communal or Adat community rights to the land. In visualising these ideas, this research uses normative legal research to elaborate the normative problems in land use, as well as to redesign and create an appropriate regulatory model for bridging and protecting all interested parties, especially the state, the Adat legal community, and private parties. In addition, it also employs empirical legal research to identify operational problems in protecting and optimising the land. In detail, this research identifies not only problems at the normative level, such as conflicting norms, the absence of norms, and unclear norms in land law, but also problems at the operational level, such as the institutional relationships in managing land use. At the end, this work offers an appropriate regulatory model at the systems level, which covers values and norms in land use, as well as an appropriate mechanism for managing the utilisation of land for the state, the Adat legal community, and the private sector. By achieving this objective, the government will not only fulfil its obligation to regulate land for the people and the private sector, but also protect the fundamental rights of the people, as mandated by the Indonesian 1945 Constitution.
Keywords: adat community rights, fundamental rights, investment, land law, private sector
Procedia PDF Downloads 514
12162 TMIF: Transformer-Based Multi-Modal Interactive Fusion for Rumor Detection
Authors: Jiandong Lv, Xingang Wang, Cuiling Shao
Abstract:
The rapid development of social media platforms has made them an important news source. While they provide people with convenient real-time communication channels, fake news and rumors also spread rapidly through social media, misleading the public and even causing harmful social impact, while manual rumor detection is slow and inconsistent. We propose an end-to-end rumor detection model, TMIF, which captures the dependencies between multimodal data based on an interactive attention mechanism, uses a transformer for cross-modal feature sequence mapping, and combines hybrid fusion strategies to obtain decision results. This paper evaluates the model on two multi-modal rumor detection datasets and demonstrates the superior overall performance and early detection performance of the proposed model.
Keywords: hybrid fusion, multimodal fusion, rumor detection, social media, transformer
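A minimal PyTorch sketch of the general idea, cross-modal interactive attention followed by a hybrid (concatenation) fusion head, is given below; it is an assumption-laden stand-in rather than the authors' TMIF architecture, and the feature dimensions and class count are illustrative.

```python
# Hedged sketch: text features attend to image features and vice versa, then the
# pooled representations are concatenated (hybrid fusion) and classified.
import torch
import torch.nn as nn

class CrossModalFusion(nn.Module):
    def __init__(self, dim=256, heads=4, num_classes=2):
        super().__init__()
        self.txt2img = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.img2txt = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classifier = nn.Linear(2 * dim, num_classes)

    def forward(self, text_feats, image_feats):
        # text queries attend over image keys/values, and vice versa
        t, _ = self.txt2img(text_feats, image_feats, image_feats)
        i, _ = self.img2txt(image_feats, text_feats, text_feats)
        fused = torch.cat([t.mean(dim=1), i.mean(dim=1)], dim=-1)  # hybrid fusion
        return self.classifier(fused)

model = CrossModalFusion()
logits = model(torch.randn(8, 32, 256), torch.randn(8, 49, 256))  # (batch, tokens, dim)
print(logits.shape)  # torch.Size([8, 2]) -> rumor / non-rumor scores
```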
Procedia PDF Downloads 244
12161 A Simulation Model to Analyze the Impact of Virtual Responsiveness in an E-Commerce Supply Chain
Authors: T. Godwin
Abstract:
The design of a supply chain always entails a trade-off between responsiveness and efficiency. The launch of e-commerce has not only changed the way of shopping but also altered supply chain design while trading off efficiency with responsiveness. A concept called ‘virtual responsiveness’ is introduced in the context of the e-commerce supply chain. A simulation model is developed to compare actual responsiveness and virtual responsiveness to the customer in an e-commerce supply chain. The simulation is restricted to the movement of goods from the e-tailer to the customer. Customer demand follows a statistical distribution and is generated using the inverse transformation technique. The two responsiveness schemes of the supply chain are compared in terms of the minimum inventory required at the e-tailer to fulfill the orders. Computational results show the savings achieved through virtual responsiveness. The insights gained from this study could be used to redesign e-commerce supply chains by incorporating virtual responsiveness. A part of the achieved cost savings could be passed back to the customer, thereby making the supply chain both effective and competitive.
Keywords: e-commerce, simulation modeling, supply chain, virtual responsiveness
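The demand-generation step mentioned above can be illustrated with a short Python sketch of the inverse transformation technique; the exponential distribution, mean demand and review period used here are hypothetical choices, not values from the paper.

```python
# Hedged illustration: inverse transform sampling of exponentially distributed daily
# demand, followed by a simple 95th-percentile inventory requirement over 3-day periods.
import numpy as np

rng = np.random.default_rng(42)
mean_demand = 20.0                        # assumed mean daily demand (hypothetical)
u = rng.random(10_000)
demand = -mean_demand * np.log(1.0 - u)   # inverse CDF of Exp(1/mean_demand)

period_days = 3
period_demand = demand[: (len(demand) // period_days) * period_days]
period_demand = period_demand.reshape(-1, period_days).sum(axis=1)
print("95th-percentile 3-day demand:", round(np.quantile(period_demand, 0.95), 1))
```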
Procedia PDF Downloads 341
12160 LEDs Based Indoor Positioning by Distances Derivation from Lambertian Illumination Model
Authors: Yan-Ren Chen, Jenn-Kaie Lain
Abstract:
This paper proposes a novel indoor positioning algorithm based on visible light communications, implemented with light-emitting diode fixtures. In the proposed algorithm, the distances between the light-emitting diode fixtures and the mobile terminal are derived under the assumption of an ideal Lambertian optical radiation model, and the trilateration positioning method is then applied to obtain the coordinates of the mobile terminal. The proposed positioning algorithm obtains distance information directly from the optical signal model, and therefore the statistical distribution of received signal strength at different positions in the interior space does not need to be pre-established. Simulation results have shown that the proposed indoor positioning algorithm provides accurate location coordinate estimates.
Keywords: indoor positioning, received signal strength, trilateration, visible light communications
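A hedged Python sketch of the described pipeline is shown below, assuming a simplified geometry in which the LED fixtures point straight down and the receiver is horizontal at a known height h: received power is inverted through the Lambertian channel model to obtain distances, which are then fed to a least-squares trilateration step. All numerical parameters are illustrative.

```python
# Hedged sketch: RSS -> distance via the Lambertian model, then trilateration.
import numpy as np

m, A, h, Pt = 1.0, 1e-4, 2.5, 1.0   # Lambertian order, PD area (m^2), height (m), Tx power (W)
leds = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])  # fixture x,y (m)
true_pos = np.array([1.2, 2.7])

def received_power(d):
    # Pr = Pt*(m+1)*A*h^(m+1) / (2*pi*d^(m+3)) under the assumed geometry
    return Pt * (m + 1) * A * h**(m + 1) / (2 * np.pi * d**(m + 3))

d_true = np.sqrt(np.sum((leds - true_pos) ** 2, axis=1) + h**2)
Pr = received_power(d_true)                                    # ideal RSS at the receiver

d_est = (Pt * (m + 1) * A * h**(m + 1) / (2 * np.pi * Pr)) ** (1.0 / (m + 3))
r2 = d_est**2 - h**2                                           # squared horizontal ranges

# Trilateration: subtract the first circle equation to obtain a linear system
A_mat = 2 * (leds[1:] - leds[0])
b_vec = (r2[0] - r2[1:]) + np.sum(leds[1:] ** 2 - leds[0] ** 2, axis=1)
xy, *_ = np.linalg.lstsq(A_mat, b_vec, rcond=None)
print("estimated position:", np.round(xy, 3))                  # ~ [1.2, 2.7]
```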
Procedia PDF Downloads 409
12159 Analysis of Bending Abilities of Soft Pneumatic Actuator
Authors: Jeevan Balaji, Shreyas Chigurupati
Abstract:
A pneumatic gripper uses compressed air to operate its actuators (fingers). Unlike a conventional metallic gripper, a soft pneumatic actuator (SPA) can be used for relocating fragile objects. An added advantage of this gripper is that the pressure exerted on the object can be varied by changing the dimensions of the air chambers and also by the number of chambers. SPAs have many benefits over conventional robots in military and medical fields because of their compliant nature, and they are easily produced using the 3D printing process. In this paper, an SPA is proposed to perform pick-and-place tasks. A design was developed for the actuators that is convenient for gripping fragile objects. Thermoplastic polyurethane (TPU) is used for 3D printing the actuators. The actuator model behaves differently as parameters such as chamber height and number of chambers change. A detailed FEM model of the actuator is drafted for different pressure inputs using ABAQUS CAE software, and a safe loading pressure range is found.
Keywords: soft robotics, pneumatic actuator, design and modelling, bending analysis
Procedia PDF Downloads 165
12158 On Generalized Cumulative Past Inaccuracy Measure for Marginal and Conditional Lifetimes
Authors: Amit Ghosh, Chanchal Kundu
Abstract:
Recently, the notion of the cumulative past inaccuracy (CPI) measure has been proposed in the literature as a generalization of cumulative past entropy (CPE) in the univariate as well as the bivariate setup. In this paper, we introduce the notion of CPI of order α (alpha) and study the proposed measure for conditionally specified models of two components failed at different time instants, called the generalized conditional CPI (GCCPI). We provide some bounds using the usual stochastic order and investigate several properties of GCCPI. The effect of monotone transformations on this proposed measure has also been examined. Furthermore, we characterize some bivariate distributions under the assumption of the conditional proportional reversed hazard rate model. Moreover, the role of GCCPI in reliability modeling has also been investigated for a real-life problem.
Keywords: cumulative past inaccuracy, marginal and conditional past lifetimes, conditional proportional reversed hazard rate model, usual stochastic order
Procedia PDF Downloads 250
12157 A Comparison between Modelled and Actual Thermal Performance of Load Bearing Rammed Earth Walls in Egypt
Authors: H. Hafez, A. Mekkawy, R. Rostom
Abstract:
Around 10% of the world’s CO₂ emissions can be attributed to the operational energy of buildings; that is why more research is directed towards the use of rammed earth walls, which are claimed to have enhanced thermal properties compared to conventional building materials. The objective of this paper is to outline how the thermal performance of rammed earth walls compares to conventional reinforced concrete skeleton and red brick infill walls. For this purpose, the indoor temperature and relative humidity of a classroom built with rammed earth walls and a vaulted red brick roof in the area of Behbeit, Giza, Egypt were measured hourly over 6 months using smart sensors. These parameters for the rammed earth walls were later also compared against the values obtained using a 'DesignBuilder v5' model to verify the model assumptions. The thermal insulation of the rammed earth walls was found to be 30% better than that of the red brick infill, and the recorded data were found to be almost 90% similar to the modelled values.
Keywords: rammed earth, thermal insulation, indoor air quality, design builder
Procedia PDF Downloads 144
12156 Sustainability Model for Rural Telecenter Using Business Intelligence Technique
Authors: Razak Rahmat, Azizah Ahmad, Rafidah Razak, Roshidi Din, Azizi Abas
Abstract:
A telecenter is a place where communities can access computers, the Internet, and other digital technologies to enable them to gather information, create, learn, and communicate with others. However, previous studies found that sustainability issues related to economic, political and institutional, social, and technological factors are among the major problems faced by telecenters. Based on that problem, this research plans to design a possible solution for rural telecenter sustainability with the support of business intelligence (BI). The empirical study will be conducted through qualitative and quantitative methods, including interviews and observations with a range of stakeholders, including ministry officers, telecenter managers, and operators. Results from the data collection will be analysed using the causal modeling approach of SEM (SmartPLS) for validity. The expected finding from this research is a Business Intelligence Requirement Model as a guide for the sustainability of rural telecenters.
Keywords: Rural ICT Telecenter (RICTT), business intelligence, sustainability, requirement analysis model
Procedia PDF Downloads 483
12155 An Interactive Institutional Framework for Evolution of Enterprise Technological Innovation Capabilities System: A Complex Adaptive Systems Approach
Authors: Sohail Ahmed, Ke Xing
Abstract:
This research theoretically explores the evolution mechanism of the enterprise technological innovation capability system (ETICS) from the perspective of complex adaptive systems (CAS). It proposes an analytical framework for ETICS, its concepts, and theory by integrating CAS methodology into the management of the technological innovation capability of enterprises and discusses how to use the principles of complexity to analyze the composition, evolution, and realization of technological innovation capabilities in complex dynamic environments. This paper introduces the concept and interaction of multi-agents and the theoretical background of CAS, and summarizes the sources of technological innovation, the elements of each subject, and the main clusters of adaptive interactions and innovation activities. The concept of multi-agents is applied through the linkages of enterprises, research institutions, and government agencies with leading enterprises in industrial settings. The study was exploratory and based on CAS theory. The theoretical model is built by considering technological and innovation literature from foundational to state-of-the-art projects of technological enterprises. On this basis, the theoretical model is developed to measure the evolution mechanism of the enterprise's technological innovation capability system. This paper concludes that the enterprise's research and development personnel, investments in technological processes, and innovation resources are the main drivers of the evolution of enterprise technological innovation performance. The research specifically enriches the application process of technological innovation in institutional networks related to enterprises.
Keywords: complex adaptive system, echo model, enterprise technological innovation capability system, research institutions, multi-agents
Procedia PDF Downloads 136
12154 Online Teacher Professional Development: An Extension of the Unified Theory of Acceptance and Use of Technology Model
Authors: Lovemore Motsi
Abstract:
The rapid pace of technological innovation, along with a global fascination with the internet, continues to result in a dominating call to integrate internet technologies in institutions of learning. However, the pressing question remains: how can online in-service training for teachers support quality and success in professional development programmes? The aim of this study was to examine an integrated model that extends the Unified Theory of Acceptance and Use of Technology (UTAUT) with additional constructs, including attitude and behavioural intention, adopted from the Theory of Planned Behaviour (TPB) to answer this question. Data were collected from secondary school teachers at 10 selected schools in the Tshwane South district and analysed quantitatively using the Statistical Package for the Social Sciences (SPSS v23.0). The findings are congruent with model testing under conditions of volitional usage behaviour. In this regard, the role of the facilitating condition variables is insignificant as a determinant of usage behaviour. Social norm variables also proved to be a weak determinant of behavioural intentions. The findings demonstrate that effort expectancy is the key determinant of online INSET usage. Based on these findings, the variables social influence and facilitating conditions are important factors in ensuring the acceptance of online INSET among teachers in selected secondary schools in the Tshwane South district.
Keywords: unified theory of acceptance and use of technology (UTAUT), teacher professional development, secondary schools, online INSET
Procedia PDF Downloads 214
12153 Application of Grey Theory in the Forecast of Facility Maintenance Hours for Office Building Tenants and Public Areas
Authors: Yen Chia-Ju, Cheng Ding-Ruei
Abstract:
This study took a case office building as its subject and explored the responsive work-order repair requests for facilities and equipment in offices and public areas using grey theory, with the purpose of providing related office building owners, executive managers, property management companies, and mechanical and electrical companies a reference for selecting and assessing forecast models. The important conclusions of this study are summarized as follows. 1. Grey Relational Analysis ranks the importance of repair numbers for six categories, namely power systems, building systems, water systems, air conditioning systems, fire systems and manpower dispatch, in that order. In terms of facilities maintenance hours, the order of importance is power systems, building systems, water systems, air conditioning systems, manpower dispatch and fire systems. 2. GM(1,N) and the regression method took maintenance hours as the dependent variable and repair number, leased area and tenant number as independent variables, and conducted single-month forecasts based on 12 data points from January to December 2011. The mean absolute error and average accuracy of GM(1,N) from the verification results were 6.41% and 93.59%; the mean absolute error and average accuracy of the regression model were 4.66% and 95.34%, indicating that both have highly accurate forecast capability.
Keywords: grey theory, forecast model, Taipei 101, office buildings, property management, facilities, equipment
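The regression half of the comparison can be sketched as follows (the abstract's percentage "mean absolute error" is computed here as a mean absolute percentage error, with average accuracy as its complement); the twelve monthly records are synthetic stand-ins, not the study's Taipei 101 data.

```python
# Hedged sketch: maintenance hours regressed on repair number, leased area and tenant
# number for 12 monthly records, evaluated with MAPE and its complementary accuracy.
import numpy as np

rng = np.random.default_rng(1)
repairs = rng.integers(80, 160, 12).astype(float)      # hypothetical monthly repair counts
area = rng.uniform(50_000, 55_000, 12)                 # leased area (m^2), hypothetical
tenants = rng.integers(60, 75, 12).astype(float)
hours = 0.9 * repairs + 0.002 * area + 1.5 * tenants + rng.normal(0, 8, 12)

X = np.column_stack([np.ones(12), repairs, area, tenants])
beta, *_ = np.linalg.lstsq(X, hours, rcond=None)
pred = X @ beta

mape = np.mean(np.abs((hours - pred) / hours)) * 100
print(f"mean absolute percentage error: {mape:.2f}%  average accuracy: {100 - mape:.2f}%")
```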
Procedia PDF Downloads 443
12152 From Values to Sustainable Actions: A Dual-Theory Approach to Green Consumerism
Authors: Jiyeon Kim
Abstract:
This conceptual paper examines the psychological drivers of green consumerism and sustainable consumption by integrating the Value-Belief-Norm (VBN) Theory and the Theory of Reasoned Action (TRA). With growing environmental concerns, green consumerism promotes eco-friendly choices such as purchasing sustainable products and supporting environmentally responsible companies. However, there remains a need for research that effectively guides strategies to encourage sustainable behaviors. This paper evaluates VBN Theory’s role in driving pro-environmental behaviors. By incorporating TRA, the paper proposes an enhanced model that improves understanding of the factors driving sustained pro-environmental actions. Focusing on values, beliefs, and norms, this integrated model provides a deeper understanding of the cognitive and motivational factors that influence sustainable consumption. The findings offer valuable theoretical and practical insights for developing strategies to support long-term responsible consumer behavior.
Keywords: green consumerism, sustainable behavior, TRA, VBN
Procedia PDF Downloads 4
12151 Model Updating-Based Approach for Damage Prognosis in Frames via Modal Residual Force
Authors: Gholamreza Ghodrati Amiri, Mojtaba Jafarian Abyaneh, Ali Zare Hosseinzadeh
Abstract:
This paper presents an effective model updating strategy for damage localization and quantification in frames by defining the damage detection problem as an optimization problem. A generalized version of the Modal Residual Force (MRF) is employed to present a new damage-sensitive cost function. Then, the Grey Wolf Optimization (GWO) algorithm is utilized to solve the suggested inverse problem, and the global extrema are reported as damage detection results. The applicability of the presented method is investigated by studying different damage patterns on the benchmark problem of the IASC-ASCE, as well as a planar shear frame structure. The obtained results emphasize the good performance of the method not only in noise-free cases, but also when the input data are contaminated with different levels of noise.
Keywords: frame, grey wolf optimization algorithm, modal residual force, structural damage detection
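A compact sketch of the optimization stage is given below: a basic Grey Wolf Optimizer searching over per-element damage ratios. The cost function here is a simple placeholder; the paper's actual damage-sensitive cost is built from the generalized Modal Residual Force and is not reproduced.

```python
# Hedged sketch: standard GWO loop (alpha/beta/delta leaders, decreasing coefficient a)
# minimising a stand-in cost over damage ratios in [0, 1].
import numpy as np

def cost(x):
    # placeholder damage-sensitive cost: assumed true damage 30% in element 2, 15% in element 5
    target = np.array([0.0, 0.3, 0.0, 0.0, 0.15, 0.0])
    return np.sum((x - target) ** 2)

def gwo(cost, dim=6, wolves=20, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.random((wolves, dim))                       # candidate damage ratios
    for t in range(iters):
        fitness = np.apply_along_axis(cost, 1, X)
        alpha, beta, delta = X[np.argsort(fitness)[:3]]
        a = 2.0 * (1.0 - t / iters)                     # linearly decreasing coefficient
        moves = []
        for leader in (alpha, beta, delta):             # encircling update toward each leader
            r1, r2 = rng.random(X.shape), rng.random(X.shape)
            A, C = 2 * a * r1 - a, 2 * r2
            moves.append(leader - A * np.abs(C * leader - X))
        X = np.clip(np.mean(moves, axis=0), 0.0, 1.0)   # average of the three moves
    return X[np.argmin(np.apply_along_axis(cost, 1, X))]

print(np.round(gwo(cost), 3))   # should approach [0, 0.3, 0, 0, 0.15, 0]
```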
Procedia PDF Downloads 387
12150 Single Pass Design of Genetic Circuits Using Absolute Binding Free Energy Measurements and Dimensionless Analysis
Authors: Iman Farasat, Howard M. Salis
Abstract:
Engineered genetic circuits reprogram cellular behavior to act as living computers with applications in detecting cancer, creating self-controlling artificial tissues, and dynamically regulating metabolic pathways. Phenomenological models are often used to simulate and design genetic circuit behavior towards a desired behavior. While such models assume that each circuit component’s function is modular and independent, even small changes in a circuit (e.g. a new promoter, a change in transcription factor expression level, or even a new media) can have significant effects on the circuit’s function. Here, we use statistical thermodynamics to account for the several factors that control transcriptional regulation in bacteria, and experimentally demonstrate the model’s accuracy across 825 measurements in several genetic contexts and hosts. We then employ our first-principles model to design, experimentally construct, and characterize a family of signal-amplifying genetic circuits (genetic OpAmps) that expand the dynamic range of cell sensors. To develop these models, we needed a new approach to measuring the in vivo binding free energies of transcription factors (TFs), a key ingredient of statistical thermodynamic models of gene regulation. We developed a new high-throughput assay to measure RNA polymerase and TF binding free energies, requiring the construction and characterization of only a few constructs and data analysis (Figure 1A). We experimentally verified the assay on 6 TetR-homolog repressors and a CRISPR/dCas9 guide RNA. We found that our binding free energy measurements quantitatively explain why changing TF expression levels alters circuit function. Altogether, by combining these measurements with our biophysical model of translation (the RBS Calculator) as well as other measurements (Figure 1B), our model can account for changes in TF binding sites, TF expression levels, circuit copy number, host genome size, and host growth rate (Figure 1C). Model predictions correctly accounted for how these 8 factors control a promoter’s transcription rate (Figure 1D). Using the model, we developed a design framework for engineering multi-promoter genetic circuits that greatly reduces the number of degrees of freedom (8 factors per promoter) to a single dimensionless unit. We propose the Ptashne (Pt) number to encapsulate the 8 co-dependent factors that control transcriptional regulation into a single number. Therefore, a single number controls a promoter’s output rather than these 8 co-dependent factors, and designing a genetic circuit with N promoters requires specification of only N Pt numbers. We demonstrate how to design genetic circuits in Pt number space by constructing and characterizing 15 2-repressor OpAmp circuits that act as signal amplifiers when within an optimal Pt region. We experimentally show that OpAmp circuits using different TFs and TF expression levels will only amplify the dynamic range of input signals when their corresponding Pt numbers are within the optimal region. Thus, the use of the Pt number greatly simplifies genetic circuit design, which is particularly important as circuits employ more TFs to perform increasingly complex functions.
Keywords: transcription factor, synthetic biology, genetic circuit, biophysical model, binding energy measurement
Procedia PDF Downloads 472
12149 Optimization Modeling of the Hybrid Antenna Array for the DoA Estimation
Authors: Somayeh Komeylian
Abstract:
The direction of arrival (DoA) estimation is a crucial aspect of radar technologies for detecting and separating several signal sources. In this scenario, the antenna array output modeling involves numerous parameters, including noise samples, signal waveform, signal directions, signal number, and signal-to-noise ratio (SNR), and thereby the methods of DoA estimation rely heavily on generalization for establishing a large number of training data sets. Hence, we have comparatively represented two different optimization models of DoA estimation: (1) the implementation of a decision directed acyclic graph (DDAG) for the multiclass least-squares support vector machine (LS-SVM), and (2) the optimization method of a deep neural network (DNN) with radial basis functions (RBF). We have rigorously verified that the LS-SVM DDAG algorithm is capable of accurately classifying DoAs for the three classes. However, the accuracy and robustness of DoA estimation are still highly sensitive to technological imperfections of the antenna arrays, such as non-ideal array design and manufacture, array implementation, mutual coupling effects, and background radiation, and thereby the method may fail to deliver high precision for DoA estimation. Therefore, this work makes a further contribution by developing the DNN-RBF model for DoA estimation to overcome the limitations of non-parametric and data-driven methods in terms of array imperfection and generalization. The numerical results of implementing the DNN-RBF model have confirmed better DoA estimation performance compared with the LS-SVM algorithm. Consequently, we have comparatively evaluated the performance of the two aforementioned optimization methods for DoA estimation using the mean squared error (MSE).
Keywords: DoA estimation, adaptive antenna array, deep neural network, LS-SVM optimization model, radial basis function, MSE
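As an illustration of the RBF-based, data-driven branch (not the paper's DNN-RBF model), the sketch below regresses the DoA from sample-covariance features of a uniform linear array with a Gaussian RBF network and reports the test MSE; the array size, SNR, kernel width and ridge term are assumptions.

```python
# Hedged sketch: ULA covariance features -> Gaussian RBF regression of the DoA, MSE-evaluated.
import numpy as np

rng = np.random.default_rng(3)
M, snapshots, wavelength, d = 8, 200, 1.0, 0.5     # sensors, snapshots, normalised spacing

def covariance_features(theta_deg, snr_db=10):
    theta = np.deg2rad(theta_deg)
    a = np.exp(-2j * np.pi * d / wavelength * np.arange(M) * np.sin(theta))  # steering vector
    s = (rng.normal(size=snapshots) + 1j * rng.normal(size=snapshots)) / np.sqrt(2)
    noise = (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots))) \
            / np.sqrt(2) * 10 ** (-snr_db / 20)
    X = np.outer(a, s) + noise
    R = X @ X.conj().T / snapshots                 # sample covariance matrix
    iu = np.triu_indices(M)
    return np.concatenate([R[iu].real, R[iu].imag])

angles = rng.uniform(-60, 60, 400)
F = np.array([covariance_features(t) for t in angles])

centers, gamma, lam = F[:80], 0.05, 1e-3           # RBF centres, kernel width, ridge term
def design(F):
    d2 = ((F[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

Phi = design(F[:300])
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(centers)), Phi.T @ angles[:300])
pred = design(F[300:]) @ w
print("test MSE (deg^2):", round(np.mean((pred - angles[300:]) ** 2), 3))
```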
Procedia PDF Downloads 99
12148 Two-Sided Information Dissemination in Takeovers: Disclosure and Media
Authors: Eda Orhun
Abstract:
Purpose: This paper analyzes a target firm’s decision to voluntarily disclose information during a takeover event and the effect of such disclosures on the outcome of the takeover. Such voluntary disclosures, especially in the form of earnings forecasts made around takeover events, may affect shareholders’ decisions about the target firm’s value and, in turn, the takeover result. This study aims to shed light on this question. Design/methodology/approach: The paper tries to understand the role of voluntary disclosures by target firms during a takeover event in the likelihood of takeover success, both theoretically and empirically. A game-theoretical model is set up to analyze the voluntary disclosure decision of a target firm to inform shareholders about its real worth. The empirical implication of the model is tested by employing binary outcome models, where the disclosure variable is obtained by identifying the target firms in the sample that provide positive news by issuing increasing management earnings forecasts. Findings: The model predicts that a voluntary disclosure of positive information by the target decreases the likelihood that the takeover succeeds. The empirical analysis confirms this prediction by showing that positive earnings forecasts by target firms during takeover events increase the probability of takeover failure. Overall, it is shown that information dissemination through voluntary disclosures by target firms is an important factor affecting takeover outcomes. Originality/Value: To the author's knowledge, this study is the first to examine the impact of voluntary disclosures by the target firm during a takeover event on the likelihood of takeover success. The results contribute to the information economics, corporate finance and M&A literatures.
Keywords: takeovers, target firm, voluntary disclosures, earnings forecasts, takeover success
Procedia PDF Downloads 317
12147 Development of a Congestion Controller of Computer Network Using Artificial Intelligence Algorithm
Authors: Mary Anne Roa
Abstract:
Congestion in a network occurs when aggregate demand exceeds the accessible capacity of the resources. Network congestion will increase as network speeds increase, and new, effective congestion control methods are needed, especially for today’s very high speed networks. To address this undeniably global issue, the study focuses on the development of a fuzzy-based congestion control model concerned with allocating the resources of a computer network such that the system can operate at an adequate performance level when demand exceeds or is near the capacity of the resources. Fuzzy logic based models have proven capable of accurately representing a wide variety of processes. The model built is based on bandwidth, the aggregate incoming traffic and the waiting time. The theoretical analysis and simulation results show that the proposed algorithm provides not only good utilization but also low packet loss.
Keywords: congestion control, queue management, computer networks, fuzzy logic
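A toy version of such a fuzzy controller is sketched below: triangular membership functions over normalised bandwidth utilisation, incoming traffic and waiting time drive a small Sugeno-style rule base whose weighted-average defuzzification yields a packet-drop probability. The membership breakpoints and rule consequents are illustrative, not the paper's.

```python
# Hedged sketch: a tiny fuzzy inference step mapping three normalised inputs in [0, 1]
# to a packet-drop probability.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def congestion_controller(utilisation, traffic, wait):
    low = lambda x: tri(x, -0.01, 0.0, 0.6)
    high = lambda x: tri(x, 0.4, 1.0, 1.01)
    # rule base: (firing strength via min-AND, consequent drop probability)
    rules = [
        (min(low(utilisation), low(traffic), low(wait)), 0.0),
        (min(high(utilisation), low(traffic), low(wait)), 0.2),
        (min(high(utilisation), high(traffic), low(wait)), 0.6),
        (min(high(utilisation), high(traffic), high(wait)), 0.9),
        (min(low(utilisation), high(traffic), high(wait)), 0.5),
    ]
    weights = np.array([w for w, _ in rules])
    outputs = np.array([o for _, o in rules])
    return float((weights * outputs).sum() / (weights.sum() + 1e-9))  # weighted-average defuzzification

print(congestion_controller(0.85, 0.9, 0.3))   # heavily loaded link -> increased dropping
print(congestion_controller(0.30, 0.2, 0.1))   # lightly loaded link -> no dropping
```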
Procedia PDF Downloads 396
12146 Effect of Realistic Lubricant Properties on Thermal Electrohydrodynamic Lubrication Behavior in Circular Contacts
Authors: Puneet Katyal, Punit Kumar
Abstract:
A great deal of effort has been devoted to the field of thermal effects in electrohydrodynamic lubrication (TEHL) during the last five decades. The focus was primarily on the development of efficient numerical schemes to deal with the computational challenges involved in the solution of TEHL models; however, some important aspects related to the accurate description of lubricant properties such as viscosity, rheology and thermal conductivity in EHL point contact analysis remain largely neglected. The few studies available in this regard are based upon highly complex mathematical models that are difficult to formulate and execute. Using a simplified thermal EHL model for point contacts, this work sheds some light on the importance of accurate characterization of lubricant properties and demonstrates that the computed TEHL characteristics are highly sensitive to lubricant properties. It also emphasizes the use of appropriate mathematical models with experimentally determined parameters to account for correct lubricant behaviour.
Keywords: TEHL, shear thinning, rheology, conductivity
Procedia PDF Downloads 200
12145 Transformation to M-Learning at the Nursing Institute in the Armed Force Hospital Alhada, in Saudi Arabia Based on Activity Theory
Authors: Rahimah Abdulrahman, A. Eardle, Wilfred Alan, Abdel Hamid Soliman
Abstract:
With the rapid development in technology and advances in learning technologies, m-learning has begun to occupy a great part of our lives. The pace of modern life, together with the need for learning, gave rise to the mobile learning (m-learning) concept. In 2008, Saudi Arabia requested a national plan for the adoption of information technology (IT) across the country. Part of the recommendations of this plan concerns the implementation of mobile learning (m-learning) as well as its prospective applications to higher education within the Kingdom of Saudi Arabia. The overall aim of the research is to explore the main issues that impact the deployment of m-learning in nursing institutes in Saudi Arabia, at the Armed Force Hospitals (AFH), Alhada, in order to develop a generic model that enables and assists educational policy makers and implementers of m-learning to comprehend and treat those issues effectively. Specifically, the research will explore the concept of m-learning; identify and analyse the main organisational, technological and cultural issues that relate to the adoption of m-learning; develop a model of m-learning; investigate the perception of the students of the nursing institutes of the use of m-learning technologies for their nursing diploma programmes based on their experiences; conduct a validation of the m-learning model with the nursing institute of the AFH, Alhada in Saudi Arabia; and evaluate the research project as a learning experience and as a contribution to the body of knowledge. Activity Theory (AT) will be adopted for the study because it provides a conceptual framework that engenders an understanding of the structure, development and context of computer-supported activities. The study will adopt a set of data collection methods which engage nursing students in a quantitative survey, while nurse teachers are engaged through in-depth qualitative studies to get first-hand information about the organisational, technological and cultural issues that impact the deployment of m-learning. The original contribution will be a model for developing m-learning material for classroom-based learning in the nursing institute that can have a general application.
Keywords: activity theory (AT), mobile learning (m-learning), nursing institute, Saudi Arabia (SA)
Procedia PDF Downloads 352
12144 Application of Bayesian Model Averaging and Geostatistical Output Perturbation to Generate Calibrated Ensemble Weather Forecast
Authors: Muhammad Luthfi, Sutikno Sutikno, Purhadi Purhadi
Abstract:
Weather forecasts need to be improved to provide communities with accurate and objective predictions. To overcome this issue, numerical weather prediction has been extensively developed to reduce the subjectivity of forecasts. Yet the outputs of Numerical Weather Prediction (NWP) models are unfortunately issued without taking dynamic weather behavior and local terrain features into account. Thus, NWP outputs are not able to accurately forecast weather quantities, particularly for medium and long range forecasts. The aim of this research is to aid and extend the development of ensemble forecasting for the Meteorology, Climatology, and Geophysics Agency of Indonesia. The ensemble method is an approach that combines various deterministic forecasts to produce a more reliable one. However, such a forecast is biased and uncalibrated due to its underdispersive or overdispersive nature. As a parametric method, Bayesian Model Averaging (BMA) generates a calibrated ensemble forecast and constructs a predictive PDF for a specified period. The method can utilize an ensemble of any size but does not take spatial correlation into account, although spatial dependencies between the site of interest and nearby sites are influenced by dynamic weather behavior. Meanwhile, Geostatistical Output Perturbation (GOP) accounts for spatial correlation to generate future weather quantities and is also able to generate an ensemble of any size, although it is built from merely a single deterministic forecast. This research applies both BMA and GOP to generate calibrated ensemble forecasts of daily temperature at a few meteorological sites near an Indonesian international airport.
Keywords: Bayesian Model Averaging, ensemble forecast, geostatistical output perturbation, numerical weather prediction, temperature
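The BMA step can be sketched roughly as follows (after the standard Raftery-style formulation, not the study's implementation): member forecasts are bias-corrected and combined into a mixture of normals whose weights and common spread are estimated with a short EM loop on synthetic temperature data.

```python
# Hedged sketch: BMA weights and predictive spread estimated by EM over a training window.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
T, K = 200, 3                                     # training days, ensemble members
truth = 27 + 2 * rng.standard_normal(T)           # observed temperature (synthetic)
members = truth[:, None] + np.array([0.5, -0.3, 1.0]) + \
          rng.standard_normal((T, K)) * np.array([1.0, 1.5, 2.0])

# simple per-member bias correction (mean removal)
f = members - members.mean(axis=0) + truth.mean()

w, sigma = np.full(K, 1.0 / K), 1.5
for _ in range(200):                               # EM iterations
    dens = w * norm.pdf(truth[:, None], loc=f, scale=sigma)
    z = dens / dens.sum(axis=1, keepdims=True)     # responsibilities
    w = z.mean(axis=0)
    sigma = np.sqrt((z * (truth[:, None] - f) ** 2).sum() / T)

print("BMA weights:", np.round(w, 3), " predictive std:", round(sigma, 2))
# predictive PDF at a new time: sum_k w[k] * Normal(bias-corrected forecast_k, sigma)
```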
Procedia PDF Downloads 278
12143 Parameter Estimation with Uncertainty and Sensitivity Analysis for the SARS Outbreak in Hong Kong
Authors: Afia Naheed, Manmohan Singh, David Lucy
Abstract:
This work is based on a mathematical as well as statistical study of an SEIJTR deterministic model for the interpretation of the transmission of severe acute respiratory syndrome (SARS). Based on the SARS epidemic in 2003, the parameters are estimated using Runge-Kutta (Dormand-Prince pairs) and least squares methods. Graphical and numerical techniques are used to validate the estimates. Then the effect of the model parameters on the dynamics of the disease is examined using sensitivity and uncertainty analysis. Sensitivity and uncertainty analytical techniques are used to analyze the effect of the uncertainty in the obtained parameter estimates and to determine which parameters have the largest impact on controlling the disease dynamics.
Keywords: infectious disease, severe acute respiratory syndrome (SARS), parameter estimation, sensitivity analysis, uncertainty analysis, Runge-Kutta methods, Levenberg-Marquardt method
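Because the full SEIJTR equations are not reproduced in the abstract, the sketch below fits a simpler SEIR stand-in, but it mirrors the stated methods: Dormand-Prince integration (scipy's RK45) for the forward model and least squares for parameter estimation. The population size, initial conditions and "observations" are assumptions.

```python
# Hedged sketch: forward-simulate a compartmental model with RK45, fit parameters by least squares.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

N = 6.8e6                                    # rough Hong Kong population (assumption)

def seir(t, y, beta, sigma, gamma):
    S, E, I, R = y
    return [-beta * S * I / N, beta * S * I / N - sigma * E, sigma * E - gamma * I, gamma * I]

def simulate(params, t_eval):
    beta, sigma, gamma = params
    sol = solve_ivp(seir, (t_eval[0], t_eval[-1]), [N - 50, 30, 20, 0],
                    t_eval=t_eval, args=(beta, sigma, gamma), method="RK45")
    return sol.y[2]                          # infectious compartment

t_obs = np.arange(0, 60, 1.0)
true_I = simulate([0.35, 1 / 5.0, 1 / 7.0], t_obs)
obs = true_I * (1 + 0.05 * np.random.default_rng(0).standard_normal(len(t_obs)))

fit = least_squares(lambda p: simulate(p, t_obs) - obs, x0=[0.2, 0.2, 0.1],
                    bounds=([0.01, 0.05, 0.01], [2.0, 1.0, 1.0]))
print("estimated (beta, sigma, gamma):", np.round(fit.x, 3))
```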
Procedia PDF Downloads 358
12142 Modelling of Moisture Loss and Oil Uptake during Deep-Fat Frying of Plantain
Authors: James A. Adeyanju, John O. Olajide, Akinbode A. Adedeji
Abstract:
A predictive mathematical model based on the fundamental principles of mass transfer was developed to simulate the moisture content and oil content during the deep-fat frying (DFF) of dodo. The resulting governing equation, a partial differential equation that describes the rates of moisture loss and oil uptake, was solved numerically using an explicit Finite Difference Technique (FDT). Computer codes were written in the MATLAB environment to implement the FDT at different frying conditions and to simulate moisture loss and oil uptake during the DFF of dodo. Plantain samples were sliced to 5 mm thickness and fried at different frying oil temperatures (150, 160 and 170 ⁰C) for periods varying from 2 to 4 min. The comparison between the predicted results and experimental data for the validation of the model showed reasonable agreement. The correlation coefficients between the predicted and experimental values of the moisture and oil transfer models ranged from 0.912 to 0.947 and from 0.895 to 0.957, respectively. The predicted results could be further used for the design, control and optimization of the deep-fat frying process.
Keywords: frying, moisture loss, modelling, oil uptake
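A Python analogue of the explicit finite-difference step (the authors worked in MATLAB) is sketched below for one-dimensional moisture diffusion across a 5 mm slice; the effective diffusivity, initial and equilibrium moisture contents are assumed values chosen only to make the example run.

```python
# Hedged sketch: explicit finite-difference solution of 1D Fickian moisture diffusion,
# with surface nodes held at an assumed equilibrium moisture content.
import numpy as np

L, nx = 5e-3, 51                       # slice thickness (m), grid nodes
D = 5e-9                               # effective moisture diffusivity (m^2/s), assumed
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D                   # satisfies explicit stability limit D*dt/dx^2 <= 0.5
r = D * dt / dx**2

M = np.full(nx, 1.8)                   # initial moisture content (dry basis), assumed
M_eq = 0.3                             # equilibrium moisture at the frying surface, assumed
t_end, t = 180.0, 0.0                  # simulate 3 minutes of frying

while t < t_end:
    M[0] = M[-1] = M_eq                # surface boundary condition
    M[1:-1] = M[1:-1] + r * (M[2:] - 2 * M[1:-1] + M[:-2])
    t += dt

print("average moisture after 3 min:", round(M.mean(), 3))
```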
Procedia PDF Downloads 446
12141 XAI Implemented Prognostic Framework: Condition Monitoring and Alert System Based on RUL and Sensory Data
Authors: Faruk Ozdemir, Roy Kalawsky, Peter Hubbard
Abstract:
Accurate estimation of remaining useful life (RUL) provides a basis for effective predictive maintenance, reducing unexpected downtime for industrial equipment. However, while models such as the Random Forest have effective predictive capabilities, they are so-called ‘black box’ models, whose limited interpretability hinders the critical diagnostic decisions required in industries such as aviation. The purpose of this work is to present a prognostic framework that embeds Explainable Artificial Intelligence (XAI) techniques to provide essential transparency in the decision-making mechanisms of machine learning methods based on sensor data, with the objective of procuring actionable insights for the aviation industry. Sensor readings have been gathered from critical equipment such as turbofan jet engines and landing gear, and the RUL is predicted by a Random Forest model. The workflow involves data gathering, feature engineering, model training, and evaluation, with the datasets for these critical components trained and evaluated independently. Although the models deliver suitable predictions with reasonably good performance metrics, such complex models obscure the reasoning behind their predictions and may undermine the confidence of decision-makers and maintenance teams. This is followed, in the second phase, by global explanations using SHAP and local explanations using LIME to bridge the reliability gap in industrial contexts. These tools analyze model decisions, highlighting feature importance and explaining how each input variable affects the output. This dual approach offers a general comprehension of the overall model behavior and detailed insight into specific predictions. In its third component, the proposed framework incorporates causal analysis in the form of Granger causality tests in order to move beyond correlation toward causation. This allows the model not only to predict failures but also to present the reasons, from the key sensor features linked to possible failure mechanisms, to the relevant personnel. Establishing the causality between sensor behaviors and equipment failures creates considerable value for maintenance teams through better root-cause identification and more effective preventive measures, and makes the system more explainable. In a further stage, several simple surrogate models, including decision trees and linear models, can be used to approximate the complex Random Forest model. These simpler models act as backups, replicating important aspects of the original model's behavior. If the feature explanations obtained from the surrogate models are cross-validated with the primary model, the derived insights become more reliable and provide an intuitive sense of how the input variables affect the predictions. An iterative explainable feedback loop is then created, where the knowledge learned from the explainability methods feeds back into the training of the models. This drives a cycle of continuous improvement in both model accuracy and interpretability over time. By systematically integrating new findings, the model is expected to adapt to changed conditions and further develop its prognostic capability. These components are then presented to decision-makers through a fully transparent condition monitoring and alert system. The system provides a holistic tool for maintenance operations by leveraging RUL predictions, feature importance scores, persistent sensor threshold values, and autonomous alert mechanisms. Since the system provides explanations for its predictions, along with active alerts, maintenance personnel can make informed decisions about the correct interventions to extend the life of critical machinery.
Keywords: predictive maintenance, explainable artificial intelligence, prognostic, RUL, machine learning, turbofan engines, C-MAPSS dataset
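The first two stages of the framework, RUL regression with a Random Forest followed by a global SHAP explanation, can be sketched as follows on synthetic degradation-style features; this is not the authors' code, and the real study uses turbofan and landing-gear sensor data (e.g., C-MAPSS) rather than the toy inputs here.

```python
# Hedged sketch: Random Forest RUL regression on synthetic sensor features, then a
# global SHAP summary of which features drive the predictions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
import shap   # pip install shap

rng = np.random.default_rng(0)
n = 2000
sensors = rng.standard_normal((n, 5))
# synthetic RUL driven mostly by "sensor 0" and "sensor 2", standing in for real degradation data
rul = 200 - 40 * sensors[:, 0] - 25 * sensors[:, 2] + 5 * rng.standard_normal(n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(sensors[:1500], rul[:1500])
print("test R^2:", round(model.score(sensors[1500:], rul[1500:]), 3))

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(sensors[1500:])
importance = np.abs(shap_values).mean(axis=0)          # global feature importance
for i, v in enumerate(importance):
    print(f"sensor {i}: mean |SHAP| = {v:.1f}")
```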
Procedia PDF Downloads 5
12140 Socio-Economic Child’s Wellbeing Impasse in South Africa: Towards a Theory-Based Solution Model
Authors: Paulin Mbecke
Abstract:
Research Issue: Under economic constraints, the socio-economic conditions of households worsen, pushing child wellbeing to the bottom of many governments' and households' priority lists. In such a situation, many governments fail to rebalance priorities in providing services such as education, housing and social security, which are the prerequisites for the wellbeing of children. Consequently, many households struggle to respond to basic needs, especially those of children. Although economic conditions play a crucial role in creating prosperity or poverty in households and therefore wellbeing or misery for children, they are not the sole cause. Research Insights: The review of the South African Index of Multiple Deprivation and the South African Child Gauge establishes the extent to which economic conditions impact the wellbeing or misery of children. The analysis of social, cultural, environmental and structural theories demonstrates that non-economic factors contribute equally to the wellbeing or misery of children, yet they are disregarded. In addition, the assessment of a child abuse database shows a weak correlation between economic factors (prosperity or poverty) and child wellbeing or misery. Theoretical Implications: Through critical social research theory and modelling, the paper proposes a theory-based model that combines different factors to facilitate the understanding of child wellbeing or misery. Policy Implications: The proposed model assists in broad policy and decision making and review processes in promoting child wellbeing and in preventing, intervening in, and managing child misery with regard to education, housing, and social security.
Keywords: children, child’s misery, child’s wellbeing, household’s despair, household’s prosperity
Procedia PDF Downloads 282
12139 Study and Simulation of a Severe Dust Storm over West and South West of Iran
Authors: Saeed Farhadypour, Majid Azadi, Habibolla Sayyari, Mahmood Mosavi, Shahram Irani, Aliakbar Bidokhti, Omid Alizadeh Choobari, Ziba Hamidi
Abstract:
In recent decades, the frequency of dust events has increased significantly over the west and south-west of Iran. First, dust events during the period 1990-2013 are surveyed using historical dust data collected at 6 weather stations scattered over the west and south-west of Iran. After statistical analysis of the observational data, one of the most severe dust storm events, which occurred in the region from 3rd to 6th July 2009, is selected and analyzed. The WRF-Chem model is used to simulate the amount of PM10 and how it is transported to these areas. The initial and lateral boundary conditions for the model were obtained from GFS data with 0.5°×0.5° spatial resolution. In the simulation, two aerosol schemes (GOCART and MADE/SORGAM) with 3 options (chem_opt=106, 300 and 303) were evaluated. Results of the statistical analysis of the historical data showed that the south-west of Iran has a high frequency of dust events, with Bushehr station having the highest frequency among the stations and Urmia station the lowest. Also, in the period 1990 to 2013, the years 2009 and 1998, with 3221 and 100 events respectively, had the highest and lowest numbers of dust events, and according to the monthly variation, June and July had the highest frequency of dust events and December had the lowest. Besides, the model results showed that the MADE/SORGAM scheme predicted the values and trends of PM10 better than the other schemes and showed better performance in comparison with the observations. Finally, the distribution of PM10 and the surface wind maps obtained from the numerical modeling showed the formation of dust plumes in Iraq and Syria and their transportation to the west and south-west of Iran. In addition, comparing the MODIS satellite image acquired on 4th July 2009 with the model output at the same time showed the good ability of WRF-Chem in simulating the spatial distribution of dust.
Keywords: dust storm, MADE/SORGAM scheme, PM10, WRF-Chem
Procedia PDF Downloads 269
12138 Prediction of Terrorist Activities in Nigeria using Bayesian Neural Network with Heterogeneous Transfer Functions
Authors: Tayo P. Ogundunmade, Adedayo A. Adepoju
Abstract:
Terrorist attacks in liberal democracies bring about several negative results, for example, undermining public support for the governments they target, disturbing the peace of a protected environment underwritten by the state, and limiting individuals from contributing to the advancement of the country, among others. Hence, seeking techniques to understand the different factors involved in terrorism and how to deal with those factors in order to completely stop or reduce terrorist activities is the topmost priority of the government in every country. The aim of this research is to develop an efficient deep learning-based predictive model for the prediction of future terrorist activities in Nigeria, addressing the low prediction accuracy associated with existing solution methods. The proposed AI-based predictive model, as a counterterrorism tool, will be useful to governments and law enforcement agencies to protect the lives of individuals in society and to improve the quality of life in general. A Heterogeneous Bayesian Neural Network (HETBNN) model was derived with a Gaussian error distribution. Three primary transfer functions (HOTTFs), as well as two derived transfer functions (HETTFs) arising from the convolution of the HOTTFs, are used, namely: the Symmetric Saturated Linear transfer function (SATLINS), the Hyperbolic Tangent transfer function (TANH), the Hyperbolic Tangent Sigmoid transfer function (TANSIG), the Symmetric Saturated Linear and Hyperbolic Tangent transfer function (SATLINS-TANH), and the Symmetric Saturated Linear and Hyperbolic Tangent Sigmoid transfer function (SATLINS-TANSIG). Data on terrorist activities in Nigeria, gathered through questionnaires for the purpose of this study, were used. Mean Square Error (MSE), Mean Absolute Error (MAE) and Test Error are the forecast prediction criteria. The results showed that the HETTFs performed better in terms of prediction, and the factors associated with terrorist activities in Nigeria were determined. The proposed deep learning-based predictive model will be useful to governments and law enforcement agencies as an effective counterterrorism mechanism to understand the parameters of terrorism and to design strategies to deal with terrorism before an incident actually happens and potentially causes the loss of precious lives. The proposed AI-based predictive model will reduce the chances of terrorist activities and is particularly helpful for security agencies to predict future terrorist activities.
Keywords: activation functions, Bayesian neural network, mean square error, test error, terrorism
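The transfer functions named above can be written down directly; the sketch below gives NumPy versions of SATLINS, TANH and TANSIG and, as a stand-in for the paper's convolution-derived HETTFs, a simple averaged composite. The Bayesian treatment of the network weights is not reproduced, and the covariates are random placeholders.

```python
# Hedged sketch: primary transfer functions plus an assumed composite, used in a
# one-hidden-layer forward pass on placeholder covariates.
import numpy as np

satlins = lambda x: np.clip(x, -1.0, 1.0)                 # symmetric saturating linear
tanh = np.tanh                                            # hyperbolic tangent
tansig = lambda x: 2.0 / (1.0 + np.exp(-2.0 * x)) - 1.0   # classic tansig (numerically = tanh)

# assumed composite forms standing in for the paper's convolution-derived HETTFs
satlins_tanh = lambda x: 0.5 * (satlins(x) + tanh(x))
satlins_tansig = lambda x: 0.5 * (satlins(x) + tansig(x))

def forward(x, W1, b1, W2, b2, act=satlins_tanh):
    """One hidden layer with a chosen heterogeneous transfer function."""
    return act(x @ W1 + b1) @ W2 + b2

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 6))                           # 4 samples, 6 placeholder covariates
W1, b1 = rng.standard_normal((6, 10)) * 0.3, np.zeros(10)
W2, b2 = rng.standard_normal((10, 1)) * 0.3, np.zeros(1)
print(forward(x, W1, b1, W2, b2).ravel())
```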
Procedia PDF Downloads 163
12137 Material Characterization and Numerical Simulation of a Rubber Bumper
Authors: Tamás Mankovits, Dávid Huri, Imre Kállai, Imre Kocsis, Tamás Szabó
Abstract:
Non-linear FEM calculations are indispensable when important technical information such as the operating performance of a rubber component is desired. Rubber bumpers built into air-spring structures may undergo large deformations under load, which in itself shows non-linear behavior. The changing contact range between the parts and the incompressibility of the rubber increase this non-linear behavior further. The material characterization of an elastomeric component is also a demanding engineering task. In this paper, a comprehensive investigation is introduced, including laboratory measurements, mesh density analysis and complex finite element simulations, to obtain the load-displacement curve of the chosen rubber bumper. Contact and friction effects are also taken into consideration. The aim of this research is to elaborate an FEM model which is accurate and competitive for a future shape optimization task.
Keywords: rubber bumper, finite element analysis, compression test, Mooney-Rivlin material model
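The material-characterization step can be illustrated with a hedged sketch of fitting Mooney-Rivlin constants to uniaxial compression data, using the standard incompressible relation P = 2(λ - λ⁻²)(C10 + C01/λ) for nominal stress; the "measured" stresses below are synthetic, not the paper's test data.

```python
# Hedged sketch: linear least-squares fit of Mooney-Rivlin constants C10, C01 from
# uniaxial compression (stretch < 1) nominal-stress data.
import numpy as np

lam = np.linspace(0.95, 0.70, 12)                              # compression stretches
C10_true, C01_true = 0.4, 0.1                                  # assumed "true" constants (MPa)
P = 2.0 * (lam - lam**-2) * (C10_true + C01_true / lam)        # nominal stress (MPa)
P += np.random.default_rng(0).normal(0, 0.005, lam.size)       # measurement noise

# rearrange:  P / (2*(lam - lam^-2)) = C10 + C01*(1/lam)
y = P / (2.0 * (lam - lam**-2))
A = np.column_stack([np.ones_like(lam), 1.0 / lam])
(C10, C01), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"fitted C10 = {C10:.3f} MPa, C01 = {C01:.3f} MPa")
```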
Procedia PDF Downloads 506
12136 Behavior of Steel Moment Frames Subjected to Impact Load
Authors: Hyungoo Kang, Minsung Kim, Jinkoo Kim
Abstract:
This study investigates the performance of 2D and 3D steel moment frames subjected to vehicle collision at a first-story column using LS-DYNA. The finite element models of vehicles provided by the National Crash Analysis Center (NCAC) are used for the numerical analysis. Nonlinear dynamic time history analyses of the 2D and 3D model structures are carried out based on the arbitrary column removal scenario, and the vertical displacement of the damaged structures is compared with that obtained from the collision analysis. The analysis results show that the model structure remains stable when the speed of the vehicle is 40 km/h. However, at speeds of 80 and 120 km/h, both the 2D and 3D structures fail by progressive collapse. The vertical displacement of the damaged joint obtained from the collision analysis is significantly larger than the displacement computed based on the arbitrary column removal scenario.
Keywords: vehicle collision, progressive collapse, FEM, LS-DYNA
Procedia PDF Downloads 341