Search results for: coupled Markov random field (MRF)
11435 Fully Coupled Porous Media Model
Authors: Nia Mair Fry, Matthew Profit, Chenfeng Li
Abstract:
This work focuses on the development and implementation of a fully implicit-implicit coupled mechanical deformation and porous flow finite element software tool. The fully implicit software accurately reproduces classical fundamental analytical solutions such as the Terzaghi consolidation problem. Furthermore, it can capture other analytical solutions less well known in the literature, such as Gibson’s sedimentation rate problem and Coussy’s problems investigating wellbore stability in poroelastic rocks. The mechanical volume strains are transferred to the porous flow governing equation within an implicit framework. This overcomes a common industrial practice in which explicit solvers are used for the mechanical governing equations and implicit solvers only on the porous flow side, a combination that can lead to instability and non-convergence in the coupled system and introduces an unquantified degree of error into the results. A fully monolithic implicit-implicit coupled porous media code solves both the seepage and mechanical equations in one matrix system under a unified time-stepping scheme, which makes the problem definition much easier. When an explicit solver is used, additional input such as a damping coefficient and a mass scaling factor is required; these are circumvented by a fully implicit solution. Further, improved accuracy is achieved because the solution does not depend on predictor-corrector methods for the pore fluid pressure, although potentially at the cost of reduced stability. In testing this fully monolithic porous media code, the fully implicit coupled scheme is compared against an existing staggered explicit-implicit coupled scheme across a range of geotechnical problems. These cases include 1) Biot coefficient calculation, 2) consolidation theory with the Terzaghi analytical solution, 3) sedimentation theory with the Gibson analytical solution, and 4) Coussy wellbore poroelastic analytical solutions.
Keywords: coupled, implicit, monolithic, porous media
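To make the monolithic "one matrix system" idea concrete, the block system below is a minimal sketch of a standard Biot-type u-p finite element discretisation with backward-Euler time stepping. The matrix names (stiffness K, coupling Q, storage S, permeability H) follow common poromechanics convention and are an assumption, not necessarily the authors' exact formulation.

```latex
% Semi-discrete Biot system (displacement u, pore pressure p):
%   K u - Q p = f_u ,   Q^{T}\dot{u} + S\dot{p} + H p = f_p .
% Backward-Euler over a step \Delta t gives one monolithic linear system per step:
\begin{bmatrix} K & -Q \\ Q^{T} & S + \Delta t\, H \end{bmatrix}
\begin{bmatrix} u_{n+1} \\ p_{n+1} \end{bmatrix}
=
\begin{bmatrix} f_u \\ \Delta t\, f_p + Q^{T} u_n + S\, p_n \end{bmatrix}
```

Solving displacements and pressures together in this single matrix, rather than staggering an explicit mechanical solve with an implicit flow solve, is what removes the need for damping coefficients and mass scaling factors mentioned in the abstract.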
Procedia PDF Downloads 138
11434 A New Concept for Deriving the Expected Value of Fuzzy Random Variables
Authors: Liang-Hsuan Chen, Chia-Jung Chang
Abstract:
Fuzzy random variables (FRVs) have been introduced as an imprecise representation of numeric values for characterizing imprecise knowledge. Descriptive parameters can be used to describe the primary features of a set of fuzzy random observations. In fuzzy environments, expected values are usually represented as fuzzy-valued, interval-valued or numeric-valued descriptive parameters using various metrics. Instead of the area metric usually adopted in the relevant studies, this study proposes a numeric expected value based on a distance metric that reflects the two characteristics (fuzziness and randomness) of FRVs. Compared with the existing measures, the results show that the proposed numeric expected value coincides with those obtained using the other metric when only triangular membership functions are used. However, the proposed approach has the advantages of intuitiveness and computational efficiency when the membership functions are not triangular. An example with three datasets is provided to verify the proposed approach.
Keywords: fuzzy random variables, distance measure, expected value, descriptive parameters
Procedia PDF Downloads 343
11433 Radio Frequency Identification Encryption via Modified Two Dimensional Logistic Map
Authors: Hongmin Deng, Qionghua Wang
Abstract:
A modified two-dimensional (2D) logistic map based on cross feedback control is proposed. In a statistical characteristics analysis, this 2D map exhibits more random chaotic dynamical properties than the classic one-dimensional (1D) logistic map. It is therefore utilized as a pseudo-random (PN) sequence generator: the obtained real-valued PN sequence is first quantized and then applied to a radio frequency identification (RFID) communication system. The system is experimentally validated on a Cortex-M0 development board, which demonstrates its effectiveness in key generation, key space size and security. Finally, further cryptanalysis is carried out with the statistical test suite of the National Institute of Standards and Technology (NIST).
Keywords: chaos encryption, logistic map, pseudo-random sequence, RFID
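The abstract does not give the exact cross-feedback equations, so the sketch below uses a generic cross-coupled pair of logistic maps and a simple threshold quantizer purely to illustrate the PN-generation step; the map form, parameters and quantization rule are assumptions, not the authors' construction.

```python
import numpy as np

def coupled_logistic_bits(n_bits, x=0.37, y=0.61, mu1=3.99, mu2=3.98, eps=0.1, burn_in=1000):
    """Bit sequence from a generic cross-coupled 2D logistic map.

    Each coordinate is driven by its own logistic map plus a small cross-feedback
    term from the other coordinate; the real-valued orbit is quantized by thresholding.
    """
    bits = np.empty(n_bits, dtype=np.uint8)
    for i in range(burn_in + n_bits):
        x_new = (mu1 * x * (1.0 - x) + eps * y) % 1.0   # logistic update + cross feedback
        y_new = (mu2 * y * (1.0 - y) + eps * x) % 1.0
        x, y = x_new, y_new
        if i >= burn_in:
            bits[i - burn_in] = 1 if x >= 0.5 else 0
    return bits

if __name__ == "__main__":
    stream = coupled_logistic_bits(10_000)
    print("fraction of ones:", stream.mean())   # quick sanity check before NIST SP 800-22 testing
```

A real implementation would feed such a bit stream into the NIST SP 800-22 suite, as the abstract describes, before using it for RFID key material.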
Procedia PDF Downloads 400
11432 Uncertainty Quantification of Crack Widths and Crack Spacing in Reinforced Concrete
Authors: Marcel Meinhardt, Manfred Keuser, Thomas Braml
Abstract:
Cracking of reinforced concrete is a complex phenomenon induced by direct loads or restraints affecting reinforced concrete structures as soon as the tensile strength of the concrete is exceeded. Hence it is important to predict where cracks will be located and how they will propagate. The bond theory and the crack formulas in the current design codes, for example DIN EN 1992-1-1, are all based on the assumption that the reinforcement bars are embedded in homogeneous concrete, without taking into account the influence of transverse reinforcement and the real stress situation. However, it can often be observed that real structures such as walls, slabs or beams show a crack spacing that is oriented to the transverse reinforcement bars or to the stirrups. In most finite element analysis studies, the smeared crack approach is used for crack prediction. The disadvantage of this model is that the typical strain localization of a crack at element level cannot be seen. Crack propagation in concrete is a discontinuous process characterized by different factors such as the initial random distribution of defects or the scatter of material properties. Such behavior calls for adequate models and simulation methods, because traditional mechanical approaches deal mainly with average material parameters. This paper is concerned with modelling the initiation and propagation of cracks in reinforced concrete structures, considering the influence of transverse reinforcement and the real stress distribution in reinforced concrete (R/C) beams/plates in bending. Therefore, a parameter study was carried out to investigate: (I) the influence of the transverse reinforcement on the stress distribution in concrete in bending and (II) crack initiation as a function of the diameter and spacing of the transverse reinforcement. The numerical investigations on crack initiation and propagation were carried out with a 2D reinforced concrete structure subjected to quasi-static loading and given boundary conditions. To model the uncertainty in the tensile strength of concrete in the finite element analysis, correlated normally and lognormally distributed random fields with different correlation lengths were generated. The paper also presents and discusses different methods to generate random fields, e.g. the Covariance Matrix Decomposition Method. For all computations, a plastic constitutive law with softening was used to model crack initiation and the damage of the concrete in tension. It was found that the distributions of crack spacing and crack widths are highly dependent on the random field used. These distributions are validated against experimental studies on R/C panels which were carried out at the Laboratory for Structural Engineering at the University of the German Armed Forces in Munich. Also, a recommendation for the parameters of the random field for realistically modelling the uncertainty of the tensile strength is given. The aim of this research was to show a method in which the localization of strains and cracks, as well as the influence of transverse reinforcement on crack initiation and propagation, can be seen in finite element analysis.
Keywords: crack initiation, crack modelling, crack propagation, cracks, numerical simulation, random fields, reinforced concrete, stochastic
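The Covariance Matrix Decomposition Method mentioned in the abstract can be sketched compactly: build a correlation matrix from a correlation model and a correlation length, take its Cholesky factor, and colour independent standard normal variables with it. The sketch below samples a 1D lognormal tensile-strength field; the exponential correlation model, mean and coefficient of variation are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def lognormal_random_field(x, mean=3.0, cov_of_var=0.2, corr_length=0.5, seed=0):
    """Correlated lognormal random field (e.g. concrete tensile strength) on points x."""
    rng = np.random.default_rng(seed)
    # exponential correlation model rho(d) = exp(-d / corr_length)
    d = np.abs(x[:, None] - x[None, :])
    rho = np.exp(-d / corr_length)
    # lognormal parameters matching the target mean and coefficient of variation
    sigma_ln = np.sqrt(np.log(1.0 + cov_of_var**2))
    mu_ln = np.log(mean) - 0.5 * sigma_ln**2
    L = np.linalg.cholesky(rho + 1e-10 * np.eye(len(x)))   # covariance matrix decomposition
    z = L @ rng.standard_normal(len(x))                    # correlated N(0, 1) field
    return np.exp(mu_ln + sigma_ln * z)

field = lognormal_random_field(np.linspace(0.0, 5.0, 200))
print(field.mean(), field.std())
```

In the study, such per-element strength values feed the softening plasticity model so that crack initiation localizes where the sampled tensile strength is lowest.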
Procedia PDF Downloads 157
11431 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest
Authors: Bharatendra Rai
Abstract:
Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. Too many features in machine learning are known not only to slow algorithms down but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios; their impact on model accuracy is captured using the coefficient of determination (R-squared) and the root mean square error (RMSE).
Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error
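A minimal sketch of the workflow described here (Boruta wrapper selection around a random forest, then a predictive random forest evaluated by R-squared and RMSE) is shown below. It assumes the BorutaPy package and a pandas DataFrame X of features with a target y; the hyperparameters and the 0.7 partitioning ratio are placeholders, not the study's settings.

```python
import numpy as np
from boruta import BorutaPy                      # assumed installed: pip install Boruta
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

def select_and_model(X, y, train_fraction=0.7, seed=42):
    """Boruta feature selection followed by a predictive random forest model."""
    rf_fs = RandomForestRegressor(n_jobs=-1, max_depth=5, random_state=seed)
    boruta = BorutaPy(rf_fs, n_estimators="auto", random_state=seed)
    boruta.fit(X.values, y.values)                # BorutaPy expects numpy arrays
    confirmed = X.columns[boruta.support_]        # keep only confirmed features

    X_tr, X_te, y_tr, y_te = train_test_split(
        X[confirmed], y, train_size=train_fraction, random_state=seed)
    model = RandomForestRegressor(n_estimators=500, random_state=seed).fit(X_tr, y_tr)
    pred = model.predict(X_te)
    return confirmed, r2_score(y_te, pred), np.sqrt(mean_squared_error(y_te, pred))

# the study's partitioning experiment could loop over ratios such as 0.5 ... 0.9
```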
Procedia PDF Downloads 323
11430 Solving Process Planning, Weighted Apparent Tardiness Cost Dispatching, and Weighted Processing plus Weight Due-Date Assignment Simultaneously Using a Hybrid Search
Authors: Halil Ibrahim Demir, Caner Erden, Abdullah Hulusi Kokcam, Mumtaz Ipek
Abstract:
Process planning, scheduling, and due date assignment are three important manufacturing functions which are usually studied independently in the literature. There are hundreds of works on IPPS and SWDDA problems, but only a few works on the IPPSDDA problem. Integrating these three functions is crucial because of the strong relationship between them. Since the scheduling problem is already in the NP-hard class without any integration, the integrated problem is even harder to solve. This study focuses on the integration of these functions. The sum of weighted tardiness, earliness, and due-date-related costs is used as the penalty function. Random search and hybrid metaheuristics are used to solve the integrated problem. The marginal improvement of random search is very high in the early iterations and drops sharply in later iterations; at that point, directed search contributes more to the marginal improvement than random search. In this study, random and genetic search methods are therefore combined to find better solutions. Results show that overall performance improves as the integration level increases.
Keywords: process planning, genetic algorithm, hybrid search, random search, weighted due-date assignment, weighted scheduling
Procedia PDF Downloads 361
11429 Classification of State Transition by Using a Microwave Doppler Sensor for Wandering Detection
Authors: K. Shiba, T. Kaburagi, Y. Kurihara
Abstract:
With global aging, the number of people who require care, such as people with dementia (PwD), is increasing in many developed countries. PwDs may wander and unconsciously set foot outdoors, which can lead to serious accidents such as traffic accidents. Round-the-clock monitoring by caregivers is therefore necessary, which can be a burden for the caregivers. Hence, an automatic wandering detection system is required: when an elderly person wanders outdoors, the detection system should report a ‘moving’ state followed by an ‘absence’ state. In this paper, we focus on the transition from the ‘resting’ state to the ‘absence’ state via the ‘moving’ state as one of the wandering transitions. To capture the transitions among the three states, we build a method based on the hidden Markov model (HMM). In our method, the constraint that the ‘resting’ state and the ‘absence’ state cannot transition directly to each other is applied. To validate the method, we conducted an experiment with 10 subjects. Our results show that the method can classify the three states with 0.92 accuracy.
Keywords: wander, microwave Doppler sensor, respiratory frequency band, state transition, hidden Markov model (HMM)
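The constraint described above can be encoded directly in the HMM transition matrix by setting the resting-to-absence (and absence-to-resting) entries to zero, which forces any decoded path to pass through ‘moving’. The sketch below uses invented transition, emission and prior probabilities over a toy discretised Doppler observation, purely to show the mechanics; the authors' actual features and trained parameters are not reproduced here.

```python
import numpy as np

# States: 0 = resting, 1 = moving, 2 = absence.
A = np.array([[0.90, 0.10, 0.00],    # resting -> absence forbidden (0 probability)
              [0.15, 0.70, 0.15],
              [0.00, 0.10, 0.90]])   # absence -> resting forbidden (0 probability)
pi = np.array([0.80, 0.15, 0.05])
# Toy emissions over 3 observation symbols (strong respiratory band / body motion / near-zero return).
B = np.array([[0.80, 0.15, 0.05],
              [0.20, 0.70, 0.10],
              [0.05, 0.10, 0.85]])

def viterbi(obs, A, B, pi):
    """Most likely hidden-state path for a discrete observation sequence."""
    n, m = len(obs), len(pi)
    with np.errstate(divide="ignore"):
        la, lb, lpi = np.log(A), np.log(B), np.log(pi)
    logd = np.full((n, m), -np.inf)
    back = np.zeros((n, m), dtype=int)
    logd[0] = lpi + lb[:, obs[0]]
    for t in range(1, n):
        scores = logd[t - 1][:, None] + la          # previous state x next state
        back[t] = scores.argmax(axis=0)
        logd[t] = scores.max(axis=0) + lb[:, obs[t]]
    path = [int(logd[-1].argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

print(viterbi([0, 0, 1, 1, 2, 2, 2], A, B, pi))   # expected: resting -> moving -> absence
```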
Procedia PDF Downloads 183
11428 Spillage Prediction Using Fluid-Structure Interaction Simulation with Coupled Eulerian-Lagrangian Technique
Authors: Ravi Soni, Irfan Pathan, Manish Pande
Abstract:
The current product development process needs simultaneous consideration of different physics. The performance of the product needs to be considered under both structural and fluid loads. Examples include ducts and valves where structural behavior affects fluid motion and vice versa. Simulation of fluid-structure interaction involves modeling interaction between moving components and the fluid flow. In these scenarios, it is difficult to calculate the damping provided by fluid flow because of dynamic motions of components and the transient nature of the flow. Abaqus Explicit offers general capabilities for modeling fluid-structure interaction with the Coupled Eulerian-Lagrangian (CEL) method. The Coupled Eulerian-Lagrangian technique has been used to simulate fluid spillage through fuel valves during dynamic closure events. The technique to simulate pressure drops across Eulerian domains has been developed using stagnation pressure. Also, the fluid flow is calculated considering material flow through elements at the outlet section of the valves. The methodology has been verified on Eaton products and shows a good correlation with the test results.
Keywords: Coupled Eulerian-Lagrangian Technique, fluid structure interaction, spillage prediction, stagnation pressure
Procedia PDF Downloads 379
11427 Metaphysics of the Unified Field of the Universe
Authors: Santosh Kaware, Dnyandeo Patil, Moninder Modgil, Hemant Bhoir, Debendra Behera
Abstract:
The Unified Field Theory has been an area of intensive research for many decades. This paper focuses on the philosophy and metaphysics of unified field theory at the Planck scale and its relationship with superstring theory and Quantum Vacuum Dynamic Physics. We examine the epistemology of questions such as: (1) What is the Unified Field of the universe? (2) Can it actually (a) permeate the complete universe, or (b) be localized in bound regions of the universe, or (c) extend into the extra dimensions, or (d) live only in extra dimensions? (3) What should be the emergent ontological properties of the Unified Field? (4) How does the universe manifest through its Quantum Vacuum energies? (5) How is the space-time metric coupled to the Unified Field? We present a number of ansätze, which we outline below. It is proposed that the unified field possesses consciousness as well as a memory - a recording of past history - analogous to the ‘Consistent Histories’ interpretation of quantum mechanics. We propose a Planck-scale geometry of the Unified Field with circle-like topology, having 32 energy points on its periphery which are connected to each other by 10-dimensional meta-strings; these are sources for the manifestation of the different fundamental forces and particles of the universe through its Quantum Vacuum energies. It is also proposed that the sub-energy levels of the ‘Conscious Unified Field’ are used for the processes of creation, preservation and rejuvenation of the universe over a period of time by means of negentropy. These epochs can be for the complete universe, or for localized regions such as galaxies or clusters of galaxies. It is proposed that the Unified Field operates through geometric patterns of its Quantum Vacuum energies, manifesting as various elementary particles by giving spins to zero-point energy elements. The epistemological relationship between unified field theory and superstring theories is examined. Properties of ‘consciousness’ and ‘memory’ cascade from the universe into macroscopic objects, and further onto the elementary particles, via a fractal pattern. Other properties of fundamental particles, such as mass, charge, spin and isospin, also spill out of such a cascade. The manifestations of the unified field can reach into the parallel universes or the ‘multi-verse’ and essentially have an existence independent of space-time. It is proposed that the mass, length and time scales of the unified theory lie below even the Planck scale, at a level which we call ‘Super Quantum Gravity’ (SQG).
Keywords: super string theory, Planck scale geometry, negentropy, super quantum gravity
Procedia PDF Downloads 274
11426 Credit Risk Prediction Based on Bayesian Estimation of Logistic Regression Model with Random Effects
Authors: Sami Mestiri, Abdeljelil Farhat
Abstract:
The aim of this paper is to predict the credit risk of banks in Tunisia over the period 2000-2005. For this purpose, two methods for the estimation of the logistic regression model with random effects are applied: the Penalized Quasi-Likelihood (PQL) method and the Gibbs sampler algorithm. Using information on a sample of 528 Tunisian firms and 26 financial ratios, we show that the Bayesian approach improves the quality of model predictions in terms of correct classification as well as in terms of the ROC curve.
Keywords: forecasting, credit risk, Penalized Quasi-Likelihood, Gibbs sampler, logistic regression with random effects, ROC curve
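For readers unfamiliar with a logistic regression model with random effects, the sketch below fits a firm-level random-intercept logistic model with the PyMC library and scores it with an ROC AUC. This is a hedged illustration only: it uses PyMC's default NUTS sampler rather than the PQL or custom Gibbs sampler of the paper, and the variable names (X, y, firm_idx) are placeholders for the Tunisian-firm data.

```python
import numpy as np
import pymc as pm                           # assumed available: pip install pymc
from sklearn.metrics import roc_auc_score

def fit_random_intercept_logit(X, y, firm_idx, n_firms):
    """Bayesian logistic regression with a firm-level random intercept.

    X: (n, p) matrix of financial ratios, y: 0/1 default indicator,
    firm_idx: integer firm index for each observation.
    """
    with pm.Model():
        beta = pm.Normal("beta", mu=0.0, sigma=2.0, shape=X.shape[1])
        sigma_u = pm.HalfNormal("sigma_u", sigma=1.0)
        u = pm.Normal("u", mu=0.0, sigma=sigma_u, shape=n_firms)   # random effects
        eta = pm.math.dot(X, beta) + u[firm_idx]
        pm.Bernoulli("y", logit_p=eta, observed=y)
        idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.9)

    # posterior-mean linear predictor -> predicted probabilities -> ROC AUC
    post = idata.posterior
    beta_hat = post["beta"].mean(("chain", "draw")).values
    u_hat = post["u"].mean(("chain", "draw")).values
    p_hat = 1.0 / (1.0 + np.exp(-(X @ beta_hat + u_hat[firm_idx])))
    return idata, roc_auc_score(y, p_hat)
```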
Procedia PDF Downloads 542
11425 A Study on the Etching Characteristics of High aspect ratio Oxide Etching Using C4F6 Plasma in Inductively Coupled Plasma with Low Frequency Bias
Authors: ByungJun Woo
Abstract:
In this study, high-aspect-ratio (HAR) oxide etching characteristics in inductively coupled plasma were investigated using low frequency (2 MHz) bias power with C4F6 gas. An experiment was conducted using CF4/C4F6/He as the mixed gas. Line patterns with a 100 nm etch area and a 500 nm mask area were used, and the etch cross-section and the etch selectivity of the amorphous carbon layer thin film were derived using a scanning electron microscope. Ion density was extracted using a double Langmuir probe, and CFx and F neutral species were observed via optical emission spectroscopy. Based on these results, the feasibility of HAR oxide etching using C4F6 gas chemistry is suggested. These etching results also indicate that the use of C4F6 gas can significantly contribute to the development of next-generation HAR oxide etching.
Keywords: plasma, etching, C4F6, high aspect ratio, inductively coupled plasma
Procedia PDF Downloads 73
11424 House Price Index Predicts a Larger Impact of Habitat Loss than Primary Productivity on the Biodiversity of North American Avian Communities
Authors: Marlen Acosta Alamo, Lisa Manne, Richard Veit
Abstract:
Habitat loss due to land use change is one of the leading causes of biodiversity loss worldwide. This form of habitat loss is a non-random phenomenon, since the same environmental factors that make an area suitable for supporting high local biodiversity overlap with those that make it attractive for urban development. We aimed to compare the effect of two non-random habitat loss predictors on the richness, abundance, and rarity of nature-affiliated and human-affiliated North American breeding birds. For each group of birds, we simulated the non-random habitat loss using two predictors: the House Price Index as a measure of the attractiveness of an area for humans and the Normalized Difference Vegetation Index as a proxy for primary productivity. We compared the results of the two non-random simulation sets and one set of random habitat loss simulations using an analysis of variance and followed up with a Tukey-Kramer test when appropriate. The attractiveness of an area for humans predicted greater richness loss and a larger increase in rarity than primary productivity and random habitat loss did, for both nature-affiliated and human-affiliated birds. For example, at 50% habitat loss, the attractiveness of an area for humans produced estimates of richness at least 5% lower and of rarity at least 40% higher than primary productivity and random habitat loss for both groups of birds. Only for the species abundance of nature-affiliated birds did the attractiveness of an area for humans fail to outperform primary productivity as a predictor of biodiversity following habitat loss. We demonstrated the value of the House Price Index, which can be used in conservation assessments as an index of the risks of habitat loss for natural communities. Thus, our results have relevant implications for sustainable urban land-use planning practices and can guide stakeholders and developers in their efforts to conserve local biodiversity.
Keywords: biodiversity loss, bird biodiversity, house price index, non-random habitat loss
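The core simulation design, removing cells in the order given by a predictor versus removing them at random and comparing the species richness that remains, can be sketched with synthetic data as below. The community matrix, the noisy HPI/NDVI proxies and the 50% loss fraction are all illustrative assumptions, not the study's bird survey data.

```python
import numpy as np

rng = np.random.default_rng(7)

def richness_after_loss(presence, ranking, loss_fraction):
    """Species richness remaining after removing the top-ranked cells.

    presence: (n_cells, n_species) 0/1 occurrence matrix;
    ranking: per-cell score ordering habitat loss (highest scores lost first).
    """
    n_lost = int(loss_fraction * presence.shape[0])
    keep = np.argsort(ranking)[: presence.shape[0] - n_lost]   # surviving cells
    return int((presence[keep].sum(axis=0) > 0).sum())

# toy community: 1000 cells x 50 species, occurrence tied to a suitability gradient
suitability = rng.uniform(size=1000)
presence = (rng.uniform(size=(1000, 50)) < suitability[:, None] * 0.3).astype(int)

hpi = suitability + rng.normal(0, 0.1, 1000)     # stands in for the House Price Index
ndvi = suitability + rng.normal(0, 0.3, 1000)    # stands in for NDVI / primary productivity
random_order = rng.uniform(size=1000)

for name, rank in [("HPI", hpi), ("NDVI", ndvi), ("random", random_order)]:
    print(name, richness_after_loss(presence, rank, loss_fraction=0.5))
```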
Procedia PDF Downloads 86
11423 The Cost of Non-Communicable Diseases in the European Union: A Projection towards the Future
Authors: Desiree Vandenberghe, Johan Albrecht
Abstract:
Non-communicable diseases (NCDs) are responsible for the vast majority of deaths in the European Union (EU) and represent a large share of total health care spending. A future increase in this health and financial burden is likely to be driven by population ageing, lifestyle changes and technological advances in medicine. Without adequate prevention measures, this burden can severely threaten population health and economic development. To tackle this challenge, a correct assessment of the current burden of NCDs is required, as well as a projection of potential increases in this burden. The contribution of this paper is to offer perspective on the evolution of the NCD burden towards the future and to give an indication of the potential of prevention policy. A non-homogeneous semi-Markov model for the EU was constructed, which allowed for a projection of the cost burden of the four main NCDs (cancer, cardiovascular disease, chronic respiratory disease and diabetes mellitus) towards 2030 and 2050. This simulation is based on multiple baseline scenarios that vary in demand and supply factors such as health status, population structure, and technological advances. Finally, in order to assess the potential of preventive measures to curb the cost explosion of NCDs, a simulation is executed which includes increased efforts for preventive health care measures. According to the Markov model, by 2030 and 2050, total costs (direct and indirect costs) in the EU could increase by 30.1% and 44.1% respectively, compared to 2015 levels. An ambitious prevention policy framework for NCDs will be required if the EU wants to meet this challenge of rising costs. To conclude, significant cost increases due to non-communicable diseases are likely to occur due to demographic and lifestyle changes. Nevertheless, an ambitious prevention program throughout the EU can aid in making this cost burden manageable for future generations.
Keywords: non-communicable diseases, preventive health care, health policy, Markov model, scenario analysis
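To show how a Markov projection of NCD costs works in principle, the sketch below runs a simple time-homogeneous three-state cohort model with a prevention scenario that scales down the incidence transition. This is deliberately simpler than the paper's non-homogeneous semi-Markov model, and every transition probability, population figure and cost value is invented for illustration.

```python
import numpy as np

# States: 0 = healthy, 1 = living with an NCD, 2 = dead (annual transition probabilities, assumed).
P = np.array([[0.96, 0.03, 0.01],
              [0.00, 0.93, 0.07],
              [0.00, 0.00, 1.00]])
annual_cost = np.array([500.0, 6000.0, 0.0])     # EUR per person per year, illustrative only

def project_costs(start_pop, years, P, annual_cost, prevention_effect=0.0):
    """Accumulate cohort costs; prevention scales down the healthy -> NCD transition."""
    Pm = P.copy()
    Pm[0, 1] *= (1.0 - prevention_effect)         # fewer incident NCD cases
    Pm[0, 0] = 1.0 - Pm[0, 1] - Pm[0, 2]          # keep the row summing to one
    pop, total = np.asarray(start_pop, dtype=float), 0.0
    for _ in range(years):
        total += pop @ annual_cost
        pop = pop @ Pm                            # one Markov step
    return total

baseline = project_costs([4.5e8, 5.0e7, 0.0], years=15, P=P, annual_cost=annual_cost)
prevented = project_costs([4.5e8, 5.0e7, 0.0], years=15, P=P, annual_cost=annual_cost,
                          prevention_effect=0.2)
print(f"cost reduction from prevention: {100 * (1 - prevented / baseline):.1f}%")
```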
Procedia PDF Downloads 138
11422 The Dynamics of Unsteady Squeezing Flow between Parallel Plates (Two-Dimensional)
Authors: Jiya Mohammed, Ibrahim Ismail Giwa
Abstract:
The unsteady squeezing flow of a viscous fluid between parallel plates is considered. The two plates approach each other symmetrically, causing the squeezing flow. A two-dimensional rectangular Cartesian coordinate system is used. The Navier-Stokes equations were reduced, using a similarity transformation, to a single fourth-order non-linear ordinary differential equation, and the energy equation was transformed into a second-order coupled differential equation. We obtained solutions to the resulting ordinary differential equations via the Homotopy Perturbation Method (HPM). HPM deforms a differential problem into a set of problems that are easier to solve, and it produces an approximate analytic expression in the form of an infinite power series; only the first six and five terms are used for the velocity and temperature, respectively. The results reveal that the proposed method is very effective and simple. Comparisons between the present and existing solutions are provided, and it is shown that the proposed method is in good agreement with the Variation of Parameter Method (VPM). The effects of the appropriate dimensionless parameters on the velocity profiles and temperature field are demonstrated with the aid of comprehensive graphs and tables.
Keywords: coupled differential equation, Homotopy Perturbation Method, plates, squeezing flow
Procedia PDF Downloads 474
11421 Building a Stochastic Simulation Model for Blue Crab Population Evolution in Antinioti Lagoon
Authors: Nikolaos Simantiris, Markos Avlonitis
Abstract:
This work builds a simulation platform that models the spatial diffusion of the invasive species Callinectes sapidus (blue crab) as a random walk and incorporates generation, fatality, and fishing rates to model the time evolution of its population. Antinioti lagoon in West Greece was used as a testbed for applying the simulation model. Field measurements from June 2020 to June 2021 on the lagoon’s setting, bathymetry, and blue crab juveniles provided the initial blue crab population for the simulation, while biological parameters from the current literature were used to calibrate the simulation parameters. The scope of this study is to enable prediction of the evolution of the blue crab population in confined environments of the Ionian Islands region in West Greece. The first results of the simulation experiments show the possibility of a robust prediction of blue crab population evolution in the Antinioti lagoon.
Keywords: antinioti lagoon, blue crab, stochastic simulation, random walk
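A stripped-down version of such a model, individuals taking a random lattice step each day while per-individual generation, fatality and fishing probabilities change the population size, is sketched below. The grid size, daily rates and reflecting boundary are illustrative assumptions, not the calibrated Antinioti parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_population(n0=200, steps=365, grid=100,
                        birth=0.004, death=0.002, fishing=0.001):
    """Random-walk population model with generation, fatality and fishing rates."""
    pos = rng.integers(0, grid, size=(n0, 2))      # initial crab positions on the lagoon grid
    history = [len(pos)]
    for _ in range(steps):
        # spatial diffusion: one random step per crab, clipped at the lagoon boundary
        pos = np.clip(pos + rng.integers(-1, 2, size=pos.shape), 0, grid - 1)
        n = len(pos)
        births = rng.binomial(n, birth)
        lost = rng.random(n) < (death + fishing)   # natural mortality or harvested
        pos = pos[~lost]
        if births:                                 # recruits appear near surviving adults
            if len(pos):
                parents = pos[rng.integers(0, len(pos), births)]
            else:
                parents = rng.integers(0, grid, (births, 2))
            pos = np.vstack([pos, parents])
        history.append(len(pos))
    return np.array(history)

print(simulate_population()[-1])   # population size after one simulated year
```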
Procedia PDF Downloads 229
11420 Multilevel Modeling of the Progression of HIV/AIDS Disease among Patients under HAART Treatment
Authors: Awol Seid Ebrie
Abstract:
HIV results in an incurable disease, AIDS. After a person is infected with the virus, the virus gradually destroys the infection-fighting cells called CD4 cells and makes the individual susceptible to opportunistic infections which cause severe or fatal health problems. Several studies show that the CD4 cell count is the most important indicator of the effectiveness of the treatment and of the progression of the disease. The objective of this paper is to investigate the progression of the disease over time among patients under HAART treatment. Two main approaches of generalized multilevel ordinal models, namely the proportional odds model and the nonproportional odds model, have been applied to the HAART data. The multilevel part of both models includes random intercepts and random coefficients. In total, four models are explored in the analysis and then compared using the deviance information criterion (DIC). Of these models, the random coefficients nonproportional odds model is selected as the best model for the HAART data, as it has the smallest DIC value. The selected model shows that the progression of the disease increases as the time under treatment increases. In addition, it reveals that gender, baseline clinical stage and functional status of the patient have a significant association with the progression of the disease.
Keywords: nonproportional odds model, proportional odds model, random coefficients model, random intercepts model
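For readers less familiar with these two model classes, the equations below write out a cumulative-logit (proportional odds) mixed model with a random intercept and a random time slope, and the nonproportional relaxation in which covariate effects vary by category. This is the standard textbook form; the paper's exact covariates and parameterisation may differ.

```latex
% Proportional odds mixed model for the ordinal disease stage Y_{ij} of patient i at visit j,
% with covariates x_{ij}, time t_{ij}, random intercept u_{0i} and random slope u_{1i}:
\mathrm{logit}\, P(Y_{ij} \le k \mid u_i)
  = \theta_k - \left( \beta^{\top} x_{ij} + u_{0i} + u_{1i}\, t_{ij} \right),
  \qquad k = 1,\dots,K-1 .

% Nonproportional odds model: the covariate effects are allowed to depend on the category k,
\mathrm{logit}\, P(Y_{ij} \le k \mid u_i)
  = \theta_k - \left( \beta_k^{\top} x_{ij} + u_{0i} + u_{1i}\, t_{ij} \right).
```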
Procedia PDF Downloads 421
11419 Using Hidden Markov Chain for Improving the Dependability of Safety-Critical Wireless Sensor Networks
Authors: Issam Alnader, Aboubaker Lasebae, Rand Raheem
Abstract:
Wireless sensor networks (WSNs) are distributed network systems used in a wide range of applications, including safety-critical systems. The latter provide critical services, often concerned with human life or assets. Therefore, ensuring the dependability requirements of safety-critical systems is of paramount importance. The purpose of this paper is to utilize the Hidden Markov Model (HMM) to prolong the service availability of WSNs by increasing the time it takes a node to become obsolete, via optimal load balancing. We propose an HMM algorithm that, given a WSN, analyses and predicts undesirable situations, notably nodes dying unexpectedly or prematurely. We apply this technique to improve on C. Liu’s algorithm, a scheduling-based algorithm which has served to improve the lifetime of WSNs. Our experiments show that our HMM technique improves the lifetime of the network, achieved by detecting nodes that die early and rebalancing their load. Our technique can also be used for diagnosis and to provide maintenance warnings to WSN system administrators. Finally, our technique can be used to improve algorithms other than C. Liu’s.
Keywords: wireless sensor networks, IoT, dependability of safety WSNs, energy conservation, sleep awake schedule
Procedia PDF Downloads 100
11418 Bayesian Flexibility Modelling of the Conditional Autoregressive Prior in a Disease Mapping Model
Authors: Davies Obaromi, Qin Yongsong, James Ndege, Azeez Adeboye, Akinwumi Odeyemi
Abstract:
The basic model usually used in disease mapping is the Besag, York and Mollie (BYM) model, which combines spatially structured and spatially unstructured priors as random effects. The Bayesian Conditional Autoregressive (CAR) model is a disease mapping method commonly used for smoothing the relative risk of a disease, as in the BYM model. This model (CAR), which is also usually assigned as a prior to one of the spatial random effects in the BYM model, successfully uses information from adjacent sites to improve estimates for individual sites. To our knowledge, there are some unrealistic or counter-intuitive consequences for the posterior covariance matrix of the CAR prior for the spatial random effects. In the conventional BYM model, the spatially structured and the unstructured random components cannot be identified separately, which challenges the prior definitions for the hyperparameters of the two random effects. Therefore, the main objective of this study is to construct and utilize an extended Bayesian spatial CAR model for studying tuberculosis patterns in the Eastern Cape Province of South Africa, and then to compare its flexibility with some existing CAR models. The results of the study revealed the flexibility and robustness of this alternative extended CAR model in comparison with the commonly used CAR models, using the deviance information criterion. The extended Bayesian spatial CAR model is shown to be a useful and robust tool for disease modeling and as a prior for the structured spatial random effects, because of the inclusion of an extra hyperparameter.
Keywords: Besag2, CAR models, disease mapping, INLA, spatial models
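As background, the equations below state the conventional BYM decomposition and the intrinsic CAR conditional prior for the structured effect; this is the standard formulation the abstract refers to, not the authors' extended model.

```latex
% BYM model for area i with observed counts y_i ~ Poisson(E_i \theta_i), expected counts E_i,
% unstructured effect v_i and spatially structured effect u_i:
\log \theta_i = \alpha + x_i^{\top}\beta + u_i + v_i ,
\qquad v_i \sim N(0, \sigma_v^2).

% Intrinsic CAR prior: conditional on its neighbours (j \sim i, n_i neighbours),
% u_i is normal around the neighbourhood mean with variance shrinking in n_i:
u_i \mid u_{-i} \sim N\!\left( \frac{1}{n_i}\sum_{j \sim i} u_j ,\; \frac{\sigma_u^2}{n_i} \right).
```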
Procedia PDF Downloads 279
11417 Experimental Performance and Numerical Simulation of Double Glass Wall
Authors: Thana Ananacha
Abstract:
This paper reports the numerical and experimental performance of the Double Glass Wall. Two configurations were considered, namely the Double Clear Glass Wall (DCGW) and the Double Translucent Glass Wall (DTGW). The coupled governing equations as well as the boundary conditions are solved using the finite element method (FEM) via COMSOL Multiphysics. Temperature profiles and flow fields of the DCGW and DTGW are reported and discussed. Different constant heat fluxes were considered, namely 400 and 800 W.m-2; the corresponding initial condition temperatures were set to 30.5 and 38.5 ºC, respectively. The results show that the simulation results are in agreement with the experimental data. Conclusively, the model considered in this study could reasonably be used to simulate the thermal and ventilation performance of the DCGW and DTGW configurations.
Keywords: thermal simulation, Double Glass Wall, velocity field, finite element method (FEM)
Procedia PDF Downloads 359
11416 Issues and Challenges in Social Work Field Education: The Field Coordinator's Perspective
Authors: Tracy B.E. Omorogiuwa
Abstract:
Understanding the role of social work in improving societal well-being cannot be separated from the place of field education, which is an integral aspect of social work education. Field learning provides students with knowledge and opportunities to experience solving issues in the field, giving them a sense of the practice situation. Despite being a crucial component of the social work curriculum, field education occupies a large space in learning outcomes, given the issues and challenges pertaining to its purpose and significance in society. The aim of this paper is to provide insight into the specific ways in which field education has been conceived, realized and valued in society. Emphasis is on the significance of field instruction, the link with classroom learning, and the structure of field experience in social work education. Drawing on documented analysis and experience, this study intends to contribute to the development of the social work curriculum by analyzing the pattern, issues and challenges confronting social work field education at the University of Benin, Nigeria.
Keywords: challenges, curriculum, field education, social work education
Procedia PDF Downloads 297
11415 Predictive Modeling of Bridge Conditions Using Random Forest
Authors: Miral Selim, May Haggag, Ibrahim Abotaleb
Abstract:
The aging of transportation infrastructure presents significant challenges, particularly concerning the monitoring and maintenance of bridges. This study investigates the application of Random Forest algorithms for predictive modeling of bridge conditions, utilizing data from the US National Bridge Inventory (NBI). The research is significant as it aims to improve bridge management through data-driven insights that can enhance maintenance strategies and contribute to overall safety. Random Forest is chosen for its robustness, ability to handle complex, non-linear relationships among variables, and its effectiveness in feature importance evaluation. The study begins with comprehensive data collection and cleaning, followed by the identification of key variables influencing bridge condition ratings, including age, construction materials, environmental factors, and maintenance history. Random Forest is utilized to examine the relationships between these variables and the predicted bridge conditions. The dataset is divided into training and testing subsets to evaluate the model's performance. The findings demonstrate that the Random Forest model effectively enhances the understanding of factors affecting bridge conditions. By identifying bridges at greater risk of deterioration, the model facilitates proactive maintenance strategies, which can help avoid costly repairs and minimize service disruptions. Additionally, this research underscores the value of data-driven decision-making, enabling better resource allocation to prioritize maintenance efforts where they are most necessary. In summary, this study highlights the efficiency and applicability of Random Forest in predictive modeling for bridge management. Ultimately, these findings pave the way for more resilient and proactive management of bridge systems, ensuring their longevity and reliability for future use.
Keywords: data analysis, random forest, predictive modeling, bridge management
Procedia PDF Downloads 21
11414 Ballistic Transport in One-Dimensional Random Dimer Photonic Crystals
Authors: Samira Cherid, Samir Bentata, F. Zahira Meghoufel, Sabria Terkhi, Yamina Sefir, Fatima Bendahma, Bouabdellah Bouadjemi, Ali Z. Itouni
Abstract:
In this work, the propagation of light in one-dimensional systems is examined by means of the random dimer model. The introduction of defect elements, placed randomly in the studied system, breaks down Anderson localization and provides a set of propagating delocalized modes at the corresponding conventional dimer resonances. Moreover, by suitably tuning the defect dimer resonance onto the host one (or vice versa), the transmission magnitudes can be enhanced, providing an optimized ballistic transmission regime as an average response. Hence, ballistic optical filters can be conceived at desired wavelengths.
Keywords: photonic crystals, random dimer model, ballistic resonance, localization and transmission
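Transmission through such a random stack is conventionally computed with the transfer-matrix (characteristic-matrix) method at normal incidence. The sketch below builds a random dimer stack, where defect layers are always inserted in pairs, and ensemble-averages the transmission; all refractive indices, thicknesses and the defect probability are illustrative assumptions rather than the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

def layer_matrix(n, d, lam):
    """Characteristic matrix of a homogeneous layer at normal incidence."""
    delta = 2.0 * np.pi * n * d / lam
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(indices, d, lam, n_in=1.0, n_out=1.0):
    """Intensity transmission of a 1D multilayer via the transfer-matrix method."""
    M = np.eye(2, dtype=complex)
    for n in indices:
        M = M @ layer_matrix(n, d, lam)
    denom = n_in * M[0, 0] + n_in * n_out * M[0, 1] + M[1, 0] + n_out * M[1, 1]
    t = 2.0 * n_in / denom
    return (n_out / n_in) * abs(t) ** 2

def random_dimer_stack(n_layers, n_host=1.5, n_defect=2.3, p_defect=0.3):
    """Host layers with defect layers inserted strictly in pairs (the dimer constraint)."""
    stack = []
    while len(stack) < n_layers:
        if rng.random() < p_defect:
            stack += [n_defect, n_defect]   # defects always come as a dimer
        else:
            stack.append(n_host)
    return stack[:n_layers]                 # truncation may clip the last dimer; negligible here

lam0 = 0.6                                  # wavelength in micrometres (illustrative)
d = lam0 / (4 * 1.5)                        # quarter-wave-like thickness (illustrative)
stacks = [random_dimer_stack(200) for _ in range(50)]
print("ensemble-averaged transmission:",
      np.mean([transmittance(s, d, lam0) for s in stacks]))
```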
Procedia PDF Downloads 529
11413 Assessing Effects of an Intervention on Bottle-Weaning and Reducing Daily Milk Intake from Bottles in Toddlers Using Two-Part Random Effects Models
Authors: Yungtai Lo
Abstract:
Two-part random effects models have been used to fit semi-continuous longitudinal data where the response variable has a point mass at 0 and a continuous right-skewed distribution for positive values. We review methods proposed in the literature for analyzing data with excess zeros. A two-part logit-log-normal random effects model, a two-part logit-truncated normal random effects model, a two-part logit-gamma random effects model, and a two-part logit-skew normal random effects model were used to examine effects of a bottle-weaning intervention on reducing bottle use and daily milk intake from bottles in toddlers aged 11 to 13 months in a randomized controlled trial. We show in all four two-part models that the intervention promoted bottle-weaning and reduced daily milk intake from bottles in toddlers drinking from a bottle. We also show that there are no differences in model fit using either the logit link function or the probit link function for modeling the probability of bottle-weaning in all four models. Furthermore, prediction accuracy of the logit or probit link function is not sensitive to the distribution assumption on daily milk intake from bottles in toddlers not off bottles.
Keywords: two-part model, semi-continuous variable, truncated normal, gamma regression, skew normal, Pearson residual, receiver operating characteristic curve
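A two-part model separates the probability of any intake from the amount of intake given that it is positive. The sketch below fits a deliberately simplified cross-sectional logit-log-normal two-part model with statsmodels, without the random effects used in the paper; the variable names and the log-normal mean correction are stated assumptions of this illustration.

```python
import numpy as np
import statsmodels.api as sm

def fit_two_part(X, y):
    """Simplified two-part (logit-log-normal) model for a semi-continuous outcome y >= 0:
    part 1 models P(y > 0) with a logit; part 2 models log(y) for the positive values."""
    X = sm.add_constant(np.asarray(X, dtype=float))
    y = np.asarray(y, dtype=float)

    any_intake = (y > 0).astype(int)
    logit_part = sm.Logit(any_intake, X).fit(disp=0)          # probability of positive intake

    pos = y > 0
    lognormal_part = sm.OLS(np.log(y[pos]), X[pos]).fit()     # intensity, given intake > 0

    # expected outcome combines both parts (with the log-normal mean correction exp(sigma^2/2))
    sigma2 = lognormal_part.scale
    expected = logit_part.predict(X) * np.exp(lognormal_part.predict(X) + 0.5 * sigma2)
    return logit_part, lognormal_part, expected
```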
Procedia PDF Downloads 349
11412 Quantifying Spatiotemporal Patterns of Past and Future Urbanization Trends in El Paso, Texas and Their Impact on Electricity Consumption
Authors: Joanne Moyer
Abstract:
El Paso, Texas is a southwest border city that has experienced continuous growth over the last 15 years. Understanding the urban growth trends and patterns using data from the National Land Cover Database (NLCD) and landscape metrics provides a quantitative description of that growth. Past urban growth provided a basis for predicting the 2031 future land use of El Paso using the CA-Markov model. As a consequence of growth, an increase in the demand for resources follows. Using panel data analysis, the relation between landscape metrics and electricity consumption is further analyzed. The study’s findings indicate that past growth was concentrated within three districts of the City of El Paso. The landscape metrics suggest that, as the city has grown, fragmentation has decreased. Alternatively, the landscape metrics for the projected 2031 land use indicate possible fragmentation within one of these districts. The panel data suggest that electricity consumption and the mean patch area landscape metric are positively correlated. The study enables local decision makers to make informed decisions on policies and urban planning to ensure a future sustainable community.
Keywords: landscape metrics, CA-Markov, El Paso, Texas, panel data
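The Markov half of the CA-Markov model amounts to estimating a class-transition matrix from two land-cover maps and projecting class shares forward; the cellular-automaton step that spatially allocates those shares is omitted from the sketch below. The toy maps, class labels and single projection step are illustrative assumptions, not NLCD data.

```python
import numpy as np

def transition_matrix(map_t0, map_t1, n_classes):
    """Row-stochastic land-cover transition matrix from two co-registered class maps."""
    counts = np.zeros((n_classes, n_classes))
    for a, b in zip(map_t0.ravel(), map_t1.ravel()):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def project_shares(map_t1, P, steps, n_classes):
    """Markov projection of class area shares 'steps' periods beyond the second map."""
    shares = np.bincount(map_t1.ravel(), minlength=n_classes) / map_t1.size
    return shares @ np.linalg.matrix_power(P, steps)

# toy example with 3 classes (0 = open land, 1 = low-intensity urban, 2 = high-intensity urban)
rng = np.random.default_rng(0)
m2001 = rng.integers(0, 3, size=(100, 100))
m2016 = np.where(rng.random((100, 100)) < 0.1, np.minimum(m2001 + 1, 2), m2001)  # some urbanisation

P = transition_matrix(m2001, m2016, 3)
print("projected future class shares:", project_shares(m2016, P, steps=1, n_classes=3))
```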
Procedia PDF Downloads 143
11411 Localising Gauss’s Law and the Electric Charge Induction on a Conducting Sphere
Authors: Sirapat Lookrak, Anol Paisal
Abstract:
Space debris has numerous manifestations, including ferro-metallic and non-ferrous objects. An external electric field will induce negative charges to separate from positive charges inside the space debris. In this research, we focus only on conducting materials. The assumption is that the electric charge density on a conducting surface is proportional to the electric field on that surface, due to Gauss's law. We are trying to find the induced charge density from an external electric field perpendicular to a conducting spherical surface. The object is a sphere on which the external electric field is not uniform, so the electric field is considered locally. The localised spherical surface is treated as a tangent plane, so the Gaussian surface is a very small cylinder, and every point on the spherical surface has its own cylinder. The electric field from a circular electrode has been calculated in the near-field and far-field approximations, which is relevant to explaining the touchless maneuvering of space debris orbit properties. The electric charge density is then calculated from the near-field and far-field approximations.
Keywords: near-field approximation, far-field approximation, localized Gauss’s law, electric charge density
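The "small cylinder" argument the abstract describes is the standard pillbox form of Gauss's law at a conductor surface, written out below. The closing uniform-field result is a textbook check of the same idea, added here for orientation; it is not a result claimed by the paper.

```latex
% Pillbox Gauss's law at a conductor surface: the field just outside is normal to the
% surface with magnitude E_n, the field inside vanishes, so for a pillbox of face area \Delta A
\oint \vec{E}\cdot d\vec{A} = E_n\,\Delta A = \frac{\sigma\,\Delta A}{\varepsilon_0}
\quad\Longrightarrow\quad \sigma = \varepsilon_0 E_n .

% Textbook check: a conducting sphere in a uniform external field E_0 acquires
\sigma(\theta) = 3\,\varepsilon_0 E_0 \cos\theta .
```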
Procedia PDF Downloads 131
11410 Design and Implementation of Pseudorandom Number Generator Using Android Sensors
Authors: Mochamad Beta Auditama, Yusuf Kurniawan
Abstract:
A smartphone or tablet requires strong randomness to establish secure encrypted communication, encrypt files, etc. Therefore, random number generation is one of the main keys to providing secrecy. Android devices are equipped with hardware-based sensors, such as an accelerometer, gyroscope, etc. Each of these sensors provides a stochastic process which has the potential to be used as an extra randomness source, in addition to the /dev/random and /dev/urandom pseudorandom number generators. Android sensors can provide randomness automatically. To obtain randomness from Android sensors, each sensor is used to construct an entropy source. After all entropy sources are constructed, the outputs from these entropy sources are combined to provide more entropy. Then, a deterministic process is used to produce a sequence of random bits from the combined output. All of these processes are carried out in accordance with NIST SP 800-22 and the NIST SP 800-90 series. The operating conditions are 1) the process runs in Android user space, and 2) the Android device is placed motionless on a desk.
Keywords: Android hardware-based sensor, deterministic process, entropy source, random number generation/generators
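The combining-and-conditioning step can be illustrated with a hash over samples from several sources, in the spirit of the hash-based conditioning components in the NIST SP 800-90 series. The sketch below is not an on-device implementation (which would run in Java/Kotlin against android.hardware.Sensor) and not a validated DRBG; the mocked readings and the use of SHA-256 are assumptions of this illustration only.

```python
import hashlib
import struct

def condition(samples_per_source):
    """Combine raw sensor samples from several entropy sources and condition them with SHA-256."""
    h = hashlib.sha256()
    for source_name, samples in samples_per_source.items():
        h.update(source_name.encode())
        for value in samples:                      # e.g. accelerometer/gyroscope axis readings
            h.update(struct.pack("<d", value))     # pack each float sample as 8 bytes
    return h.digest()                              # 32 bytes of conditioned seed material

# mocked readings standing in for Android sensor outputs captured in user space
seed = condition({
    "accelerometer": [0.0123, -0.0047, 9.8101, 9.8099],
    "gyroscope":     [0.00021, -0.00018, 0.00005],
})
print(seed.hex())
```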
Procedia PDF Downloads 374
11409 Finite Element Modeling of Heat and Moisture Transfer in Porous Material
Authors: V. D. Thi, M. Li, M. Khelifa, M. El Ganaoui, Y. Rogaume
Abstract:
This paper presents a two-dimensional model to study heat and moisture transfer through porous building materials. Dynamic and static coupled models of heat and moisture transfer in porous material at low temperature are presented; the coupled models, together with variable initial and boundary conditions, have been considered both analytically and using the finite element method. The resulting coupled model is converted to two nonlinear partial differential equations, which are then numerically solved by an implicit iterative scheme. The numerical results for temperature and moisture potential changes are compared with the experimental measurements available in the literature. The predicted results demonstrate the validity of the theoretical model and the effectiveness of the developed numerical algorithms. The work is expected to provide useful information for porous building material design based on heat and moisture transfer modelling.
Keywords: finite element method, heat transfer, moisture transfer, porous materials, wood
Procedia PDF Downloads 400
11408 An Efficient Motion Recognition System Based on LMA Technique and a Discrete Hidden Markov Model
Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier
Abstract:
Human motion recognition has received increasing attention in recent years due to its importance in a wide range of applications, such as human-computer interaction, intelligent surveillance, augmented reality, content-based video compression and retrieval, etc. However, it is still regarded as a challenging task, especially in realistic scenarios. It can be seen as a general machine learning problem which requires an effective human motion representation and an efficient learning method. In this work, we introduce a descriptor based on the Laban Movement Analysis technique, a formal and universal language for human movement, to capture both quantitative and qualitative aspects of movement. We use a Discrete Hidden Markov Model (DHMM) for training and classifying motions. We improve the classification algorithm by proposing two DHMMs for each motion class to process the motion sequence in two different directions, forward and backward. Such a modification helps avoid the misclassification that can happen when recognizing similar motions. Two experiments are conducted. In the first one, we evaluate our method on a public dataset, the Microsoft Research Cambridge-12 Kinect gesture dataset (MSRC-12), which is widely used for evaluating action/gesture recognition methods. In the second experiment, we build a dataset composed of 10 gestures (introduce yourself, waving, dance, move, turn left, turn right, stop, sit down, increase velocity, decrease velocity) performed by 20 persons. The evaluation of the system includes testing the efficiency of our descriptor vector based on LMA with the basic DHMM method and comparing the recognition results of the modified DHMM with the original one. Experimental results demonstrate that our method outperforms most existing methods that used the MSRC-12 dataset, and achieves a near-perfect classification rate on our dataset.
Keywords: human motion recognition, motion representation, Laban Movement Analysis, Discrete Hidden Markov Model
Procedia PDF Downloads 207
11407 Advances in Artificial intelligence Using Speech Recognition
Authors: Khaled M. Alhawiti
Abstract:
This research study presents a retrospective review of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. More precisely, speech recognition facilitates its users and helps them perform their daily routine tasks in a more convenient and effective manner. This research presents an overview of recent technological advancements associated with artificial intelligence. Recent research has shown that accurate decoding of speech remains the central challenge in speech recognition. In order to overcome these issues, different statistical models have been developed. Some of the most prominent statistical models include the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMMs). This research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are being utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic-phonetic, and artificial intelligence approaches. It has been recognized that artificial intelligence is among the most efficient and reliable methods used in speech recognition.
Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden markov models (HMM), statistical models of speech recognition, human machine performance
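The relationship between the acoustic model, the language model and decoding mentioned above is captured by the standard statistical speech recognition equation, reproduced below as general background (it is not specific to this study).

```latex
% Given the acoustic observation sequence X, the decoder searches for the word
% sequence W maximising the posterior, combining the acoustic model P(X | W)
% (typically HMM-based) with the language model P(W):
\hat{W} = \arg\max_{W} P(W \mid X) = \arg\max_{W} \; P(X \mid W)\, P(W).
```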
Procedia PDF Downloads 477
11406 Precise Identification of Clustered Regularly Interspaced Short Palindromic Repeats-Induced Mutations via Hidden Markov Model-Based Sequence Alignment
Authors: Jingyuan Hu, Zhandong Liu
Abstract:
CRISPR genome editing technology has transformed molecular biology by accurately targeting and altering an organism’s DNA. Despite the state-of-the-art precision of CRISPR genome editing, imprecise mutation outcomes and off-target effects present considerable risks, potentially leading to unintended genetic changes. Targeted deep sequencing, combined with bioinformatics sequence alignment, can detect such unwanted mutations. Nevertheless, the classical Needleman-Wunsch (NW) algorithm may produce false alignment outcomes, resulting in inaccurate mutation identification. The key to precisely identifying CRISPR-induced mutations lies in determining optimal parameters for the sequence alignment algorithm. Hidden Markov models (HMMs) are ideally suited for this task, offering flexibility across CRISPR systems by leveraging forward-backward algorithms for parameter estimation. In this study, we introduce CRISPR-HMM, a statistical software tool to precisely call CRISPR-induced mutations. We demonstrate that the software significantly improves precision in identifying CRISPR-induced mutations compared to NW-based alignment, thereby enhancing the overall understanding of the CRISPR gene-editing process.
Keywords: CRISPR, HMM, sequence alignment, gene editing
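For reference, the sketch below implements the classical Needleman-Wunsch baseline that the abstract compares against (not the CRISPR-HMM method itself); the match/mismatch/gap scores and the toy read with a small deletion near the cut site are illustrative assumptions.

```python
import numpy as np

def needleman_wunsch(a, b, match=2, mismatch=-1, gap=-2):
    """Global alignment with the Needleman-Wunsch algorithm (linear gap penalty).
    Returns the alignment score and the two aligned strings."""
    n, m = len(a), len(b)
    score = np.zeros((n + 1, m + 1))
    score[:, 0] = gap * np.arange(n + 1)
    score[0, :] = gap * np.arange(m + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1, j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i, j] = max(diag, score[i - 1, j] + gap, score[i, j - 1] + gap)

    # traceback from the bottom-right corner
    ai, bi, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i, j] == score[i - 1, j - 1] + (
                match if a[i - 1] == b[j - 1] else mismatch):
            ai.append(a[i - 1]); bi.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and score[i, j] == score[i - 1, j] + gap:
            ai.append(a[i - 1]); bi.append("-"); i -= 1
        else:
            ai.append("-"); bi.append(b[j - 1]); j -= 1
    return score[n, m], "".join(reversed(ai)), "".join(reversed(bi))

# reference amplicon vs. a read carrying a small deletion near the cut site
print(needleman_wunsch("ACGTACGTTAGC", "ACGTACTTAGC"))
```

The ambiguity NW faces in placing such gaps around the cut site is exactly where an HMM-based aligner with estimated indel parameters, as the abstract argues, can be more precise.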
Procedia PDF Downloads 51