Search results for: multivariate distribution theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9924

8664 Breaking Stress Criterion that Changes Everything We Know About Materials Failure

Authors: Ali Nour El Hajj

Abstract:

Background: Persistent deficiencies in the failure models of the materials field have profoundly affected all technical fields that depend on accurate failure predictions. Many preeminent scientists of an earlier era of groundbreaking discoveries attempted to solve the problem of material failure; nevertheless, a thorough understanding of material failure has remained elusive. Objective: The heart of this study is a methodology that identifies a newly derived one-parameter criterion as the only general failure theory for incompressible, homogeneous, and isotropic materials subjected to multiaxial states of stress and various boundary conditions, providing a solution to this longstanding problem. The theory is the counterpart and companion piece to the theory of elasticity and is cast in a formalism suitable for broad application. Methods: Using advanced finite-element analysis, the maximum internal breaking stress corresponding to the maximum applied external force is identified as a unified, universal failure criterion for determining the structural capacity of any system, regardless of its geometry or architecture. Results: A comparison of the proposed criterion and methodology against design codes reveals that current provisions may underestimate structural capacity by a factor of 2.17 or overestimate it by a factor of 2.096. It also shows that existing standards may underestimate structural capacity by a factor of 1.4 or overestimate it by a factor of 2.49. Conclusion: The proposed failure criterion and methodology pave the way for a new era in designing unconventional structural systems composed of unconventional materials.

Keywords: failure criteria, strength theory, failure mechanics, materials mechanics, rock mechanics, concrete strength, finite-element analysis, mechanical engineering, aeronautical engineering, civil engineering

Procedia PDF Downloads 74
8663 Differences in Distributor Selection Factors for Food and Non-Food OTOP Entrepreneurs in Thailand

Authors: Phutthiwat Waiyawuththanapoom

Abstract:

This study has a single objective: to identify the differences in the factors used to choose a distributor between food and non-food OTOP entrepreneurs in Thailand. In this research, OTOP products are divided into two groups: food and non-food. The sample for the food type was processed fruit and vegetables from Nakorn Pathom province, and the sample for the non-food type was court dolls from Ang Thong province. The research was divided into three parts: a study of the distribution pattern and distributor selection for the food-type OTOP product, a study of the distribution pattern and distributor selection for the non-food-type OTOP product, and a comparison between the two product types to identify the differences in distributor-selection factors. Data were collected through interviews. The population consisted of five producers of processed fruit and vegetables from Nakorn Pathom province and five producers of court dolls from Ang Thong province. The significant factors in choosing a distributor for the food-type OTOP product are material-handling efficiency and on-time delivery, whereas for the non-food type the focus is on the channel of distribution and the cost of the distributor.

Keywords: distributor, OTOP, food and non-food, selection

Procedia PDF Downloads 352
8662 Full Mini Nutritional Assessment Questionnaire and the Risk of Malnutrition and Mortality in Elderly, Hospitalized Patients: A Cross-Sectional Study

Authors: Christos E. Lampropoulos, Maria Konsta, Tamta Sirbilatze, Ifigenia Apostolou, Vicky Dradaki, Konstantina Panouria, Irini Dri, Christina Kordali, Vaggelis Lambas, Georgios Mavras

Abstract:

Objectives: The full Mini Nutritional Assessment (MNA) questionnaire is one of the most useful tools for diagnosing malnutrition in hospitalized patients, a condition related to increased morbidity and mortality. The purpose of our study was to assess the nutritional status of elderly, hospitalized patients and examine the hypothesis that MNA may predict mortality and extended hospitalization. Methods: One hundred fifty patients (78 men, 72 women, mean age 80±8.2) were included in this cross-sectional study. The following data were taken into account in the analysis: anthropometric and laboratory data, physical activity (International Physical Activity Questionnaire, IPAQ), smoking status, dietary habits, cause and duration of the current admission, and medical history (co-morbidities, previous admissions). Primary endpoints were mortality (from admission until 6 months afterwards) and duration of admission. The latter was compared to national guidelines for closed consolidated medical expenses. Logistic regression and linear regression analyses were performed to identify independent predictors of mortality and extended hospitalization, respectively. Results: According to the MNA, nutrition was normal in 54/150 (36%) of patients, 46/150 (30.7%) were at risk of malnutrition, and the remaining 50/150 (33.3%) were malnourished. Multivariate logistic regression analysis showed that the odds of death decreased by 20% per unit increase of the full MNA score (OR=0.8, 95% CI 0.74-0.89, p < 0.0001). Patients admitted due to cancer were 23 times more likely to die than those admitted with infection (OR=23, 95% CI 3.8-141.6, p=0.001). Similarly, patients admitted due to stroke were 7 times more likely to die (OR=7, 95% CI 1.4-34.5, p=0.02), while those admitted for all other causes were less likely to die (OR=0.2, 95% CI 0.06-0.8, p=0.03) than patients with infection.
According to multivariate linear regression analysis, each unit increase of the full MNA score decreased the admission duration by 0.3 days on average (b: -0.3, 95% CI -0.45 to -0.15, p < 0.0001). Patients admitted due to cancer stayed on average 6.8 days longer than those admitted for infection (b: 6.8, 95% CI 3.2-10.3, p < 0.0001). Conclusion: Mortality and length of hospitalization are significantly increased in elderly, malnourished patients. The full MNA score is a useful diagnostic tool for malnutrition.
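The reported odds ratio of 0.8 per MNA unit can be sanity-checked with a couple of lines. This sketch only restates the abstract's reported figure and converts it to the underlying logistic coefficient and a multi-unit effect; it is not the study's model.

```python
import math

# OR = 0.8 per one-unit increase of the full MNA score (value from the abstract)
or_per_unit = 0.80

beta = math.log(or_per_unit)       # the logistic regression coefficient behind OR = exp(beta)
or_five_units = or_per_unit ** 5   # cumulative OR for a 5-unit higher MNA score

print(round(beta, 3))              # -0.223
print(round(or_five_units, 3))     # 0.328: odds of death roughly one-third
```

In other words, a patient scoring five MNA points higher has roughly one-third the odds of death, holding the other covariates fixed.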

Keywords: duration of admission, malnutrition, mini nutritional assessment score, prognostic factors for mortality

Procedia PDF Downloads 308
8661 Bird-Adapted Filter for Avian Species and Individual Identification Systems Improvement

Authors: Ladislav Ptacek, Jan Vanek, Jan Eisner, Alexandra Pruchova, Pavel Linhart, Ludek Muller, Dana Jirotkova

Abstract:

One of the essential steps of avian song processing is signal filtering. The current standard methods are the mel filter bank or a linear filter distribution. In this article, a new type of filter bank called the Bird-Adapted Filter is introduced, in which the filtering is modifiable based on a new mathematical description of audiograms for a particular bird species or order, named the Avian Audiogram Unified Equation. With this method, filters may be deliberately distributed over frequency: they are concentrated in bands of higher hearing sensitivity, where more information is expected to be transmitted, and sparser elsewhere. A comparison of the filters for automatic individual recognition of the chiffchaff (Phylloscopus collybita) is demonstrated. The average Equal Error Rate (EER) was 16.23% for the linear filter bank, 18.71% for the mel filter bank, 14.29% for the Bird-Adapted Filter, and 12.95% for the Bird-Adapted Filter with 1/3 modification. This approach is useful for practical automatic systems for avian species and individual identification. Since the Bird-Adapted Filter is based on the measured audiograms of particular species or orders, selecting the distribution according to the avian vocalization provides the most precise filter distribution to date.
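The idea of concentrating filters in bands of higher sensitivity can be sketched as inverse-CDF placement of filter centers over a sensitivity curve. The Gaussian-shaped sensitivity function below is a hypothetical stand-in for a measured avian audiogram, not the Avian Audiogram Unified Equation itself, and the frequency range is illustrative.

```python
import math

def sensitivity(f_hz):
    # hypothetical audiogram: peak sensitivity near 3 kHz, typical of many songbirds
    return math.exp(-((f_hz - 3000.0) / 1500.0) ** 2)

def filter_centers(n_filters, f_lo=500.0, f_hi=8000.0, grid=2000):
    # integrate the sensitivity curve, then invert the cumulative numerically
    step = (f_hi - f_lo) / grid
    freqs = [f_lo + i * step for i in range(grid + 1)]
    cum = [0.0]
    for f in freqs[1:]:
        cum.append(cum[-1] + sensitivity(f) * step)
    total = cum[-1]
    targets = [(k + 0.5) / n_filters * total for k in range(n_filters)]
    centers, j = [], 0
    for t in targets:
        while cum[j] < t:
            j += 1
        centers.append(freqs[j])
    return centers

centers = filter_centers(8)
# centers cluster near 3 kHz, where the model audiogram is most sensitive,
# and spread out toward the band edges
```

Equal-area placement under the sensitivity curve is one plausible reading of "more filters where hearing is sensitive"; the paper's exact distribution rule may differ.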

Keywords: avian audiogram, bird individual identification, bird song processing, bird species recognition, filter bank

Procedia PDF Downloads 383
8660 Metaphysics of the Unified Field of the Universe

Authors: Santosh Kaware, Dnyandeo Patil, Moninder Modgil, Hemant Bhoir, Debendra Behera

Abstract:

The unified field theory has been an area of intensive research for decades. This paper focuses on the philosophy and metaphysics of unified field theory at the Planck scale, and its relationship with superstring theory and quantum vacuum dynamics. We examine the epistemology of questions such as: (1) What is the unified field of the universe? (2) Can it actually (a) permeate the complete universe, (b) be localized in bound regions of the universe, (c) extend into the extra dimensions, or (d) live only in extra dimensions? (3) What should be the emergent ontological properties of the unified field? (4) How does the universe manifest through its quantum vacuum energies? (5) How is the space-time metric coupled to the unified field? We present a number of ansätze, outlined below. It is proposed that the unified field possesses consciousness as well as a memory, a recording of past history, analogous to the 'consistent histories' interpretation of quantum mechanics. We propose a Planck-scale geometry of the unified field with circle-like topology and 32 energy points on its periphery, connected to each other by 10-dimensional meta-strings, which are the sources of the manifestation of the different fundamental forces and particles of the universe through its quantum vacuum energies. It is also proposed that the sub-energy levels of the 'conscious unified field' drive the creation, preservation, and rejuvenation of the universe over time by means of negentropy. These epochs can be for the complete universe or for localized regions such as galaxies or clusters of galaxies. It is proposed that the unified field operates through geometric patterns of its quantum vacuum energies, manifesting as various elementary particles by giving spins to zero-point energy elements. The epistemological relationship between unified field theory and superstring theories is examined.
The properties of 'consciousness' and 'memory' cascade from the universe into macroscopic objects, and further onto the elementary particles, via a fractal pattern. Other properties of fundamental particles, such as mass, charge, spin, and isospin, also spill out of such a cascade. The manifestations of the unified field can reach into parallel universes, or the 'multiverse', and essentially have an existence independent of space-time. It is proposed that the mass, length, and time scales of the unified theory lie below even the Planck scale, at a level we call 'Super Quantum Gravity (SQG)'.

Keywords: super string theory, Planck scale geometry, negentropy, super quantum gravity

Procedia PDF Downloads 265
8659 Self-Determination Theory at the Workplace: Associations between Need Satisfaction and Employment Outcomes

Authors: Wendy I. E. Wesseling

Abstract:

The unemployment rate has been on the rise since the outbreak of the global financial crisis in 2008, and labor market entrants in particular have suffered from the economic downturn. Despite the abundance of programs and agencies that help reintegrate unemployed youth, considerably less research attention has been paid to the 'fit' between these programs and their participants that ensures a durable labor market transition. According to Self-Determination Theory, need satisfaction is associated with better (mental) adjustment. Accordingly, three hypotheses were formulated: when workers' needs for competence (H1), relatedness (H2), and autonomy (H3) are satisfied in the workplace, they are more likely to remain employed at the same employer. To test these assumptions, a sample of approximately 800 young people enrolled in a youth unemployment policy participated in a longitudinal study. The policy was aimed at the development of generic and vocational competences and had a maximum duration of six months. Need satisfaction during the program was measured, as well as employment outcomes up to 12 months after completion of the policy. All hypotheses were (partly) supported. Some limitations should be noted. First, since our sample consisted primarily of highly educated white graduates, it remains to be tested whether our results generalize to other groups of unemployed youth. Moreover, because of the lack of a control group, we are unable to conclude whether the results are due to the intervention, the participants (a selection effect), or both.

Keywords: need satisfaction, person-job fit, self-determination theory, youth unemployment policy

Procedia PDF Downloads 250
8658 From a Distance: A Grounded Theory Study of Incarcerated Filipino Elderly's Separation Anxiety

Authors: Allan B. de Guzman, Rochelle Gabrielle R. Gatan, Ira Bianca Mae G. Gesmundo, Astley Justine H. Golosinda

Abstract:

Background: While in prison, the elderly, like younger prisoners, face specific problems and deprivations arising directly from their imprisonment, one of which is forced separation from family and loved ones. Despite the numerous studies that have examined the impact of separation and separation anxiety on the emotions and behavior of young individuals, little is known about separation anxiety in the elderly population. Objective: This grounded theory study describes the process of separation anxiety among incarcerated Filipino elderly men. Method: Individual interviews and participant observations were conducted with 25 incarcerated elderly Filipino men who were first-time prisoners sentenced to lifetime imprisonment, and the data were analyzed using the constant comparative method. Results: Following Strauss and Corbin's protocol, a four-part process emerged to describe the studied layer of human experience. The Tectonic Model of Separation Anxiety among incarcerated Filipino elderly men comprises four phases: Winkling, Wilting, Weeding, and Weaving. Conclusion: This study has inductively and creatively explored the process of separation anxiety among incarcerated Filipino elderly men. The findings invite nurses and other clinicians to identify developmentally appropriate strategies and interventions for this vulnerable and neglected sector of society.

Keywords: elderly, grounded theory, separation anxiety, Filipino, incarcerated

Procedia PDF Downloads 355
8657 An Application of Modified M-out-of-N Bootstrap Method to Heavy-Tailed Distributions

Authors: Hannah F. Opayinka, Adedayo A. Adepoju

Abstract:

This study is an extension of prior work on the modification of the existing m-out-of-n (moon) bootstrap method for heavy-tailed distributions, in which the modified m-out-of-n (mmoon) bootstrap was proposed as an alternative to the existing moon technique. In this study, both the moon and mmoon techniques were applied to two real income datasets, which followed lognormal and Pareto distributions respectively, with finite variances. The performance of the two techniques was compared using the Standard Error (SE) and Root Mean Square Error (RMSE). The findings showed that mmoon outperformed the moon bootstrap, with smaller SEs and RMSEs for all the sample sizes considered in the two datasets.
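For orientation, the plain m-out-of-n (moon) bootstrap can be sketched in a few lines; the paper's modified version (mmoon) is not reproduced here, and the Pareto-distributed sample below is a synthetic stand-in for the income data.

```python
import random
import statistics

random.seed(42)
# synthetic heavy-tailed sample: Pareto(alpha = 2.5) via inverse-transform sampling
data = [(1.0 - random.random()) ** (-1 / 2.5) for _ in range(500)]

def moon_bootstrap_se(sample, m, n_boot=2000):
    # resample m < n observations with replacement, track the statistic of interest
    means = []
    for _ in range(n_boot):
        resample = [random.choice(sample) for _ in range(m)]
        means.append(statistics.fmean(resample))
    return statistics.stdev(means)

n = len(data)
se_moon = moon_bootstrap_se(data, m=int(n ** 0.7))  # a common choice: m = n^0.7
```

The choice m = n^0.7 is one conventional rate, not the paper's; the mmoon modification presumably alters how m (or the resampling) is chosen for heavy tails.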

Keywords: bootstrap, income data, lognormal distribution, Pareto distribution

Procedia PDF Downloads 184
8656 A Nonlocal Means Algorithm for Poisson Denoising Based on Information Geometry

Authors: Dongxu Chen, Yipeng Li

Abstract:

This paper presents an information-geometry nonlocal means (NLM) algorithm for Poisson denoising. NLM estimates a noise-free pixel as a weighted average of image pixels, where each pixel is weighted according to the similarity between image patches in Euclidean space. In this work, every pixel is modeled as a Poisson distribution locally estimated by maximum likelihood (ML), and the set of all such distributions constitutes a statistical manifold. The NLM denoising algorithm is conducted on this statistical manifold, where the Fisher information matrix defines geodesic distances between distributions that serve as the similarity measure between patches. This approach was demonstrated to be competitive with related state-of-the-art methods.
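The core similarity can be made concrete: for the one-parameter Poisson family, the Fisher-Rao geodesic distance between rates l1 and l2 is 2·|sqrt(l1) − sqrt(l2)|. The sketch below shows how such per-pixel geodesics could feed an NLM weight; the aggregation and kernel bandwidth are illustrative assumptions, not the paper's exact formulation.

```python
import math

def poisson_geodesic(l1, l2):
    # Fisher-Rao distance between Poisson(l1) and Poisson(l2)
    return 2.0 * abs(math.sqrt(l1) - math.sqrt(l2))

def patch_distance(patch_a, patch_b):
    # sum of squared geodesic distances over corresponding pixels
    # (each pixel value is the locally ML-estimated Poisson rate)
    return sum(poisson_geodesic(a, b) ** 2 for a, b in zip(patch_a, patch_b))

def nlm_weight(patch_a, patch_b, h=1.0):
    # NLM kernel: similar patches (small geodesic distance) get weight near 1
    return math.exp(-patch_distance(patch_a, patch_b) / (h * h))

print(round(poisson_geodesic(4.0, 9.0), 3))  # 2.0, i.e. 2 * |sqrt(4) - sqrt(9)|
```

Replacing the Euclidean patch distance of classical NLM with this geodesic distance is exactly what makes the method noise-model-aware: equal intensity differences count for more in dark (low-rate) regions.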

Keywords: image denoising, Poisson noise, information geometry, nonlocal-means

Procedia PDF Downloads 282
8655 Cyber Violence Behaviors Among Social Media Users in Ghana: An Application of Self-Control Theory and Social Learning Theory

Authors: Aisha Iddrisu

Abstract:

The proliferation of cyber violence in a wave of increased social media consumption calls for immediate attention at both the local and global levels. With over 4.70 billion social media users worldwide and 8.8 million social media users in Ghana, various forms of violence have become the order of the day in most countries and communities. Cyber violence is defined as producing, retrieving, and sharing hurtful or dangerous online content to cause emotional, psychological, or physical harm. The urgency and severity of cyber violence have led to the enactment of laws in various countries, though much still needs to be done, especially in Ghana. In Ghana, cyber violence has not been studied extensively: existing studies concentrate on only one form or another, namely cybercrime or cyberbullying. Moreover, most studies in Africa have not explored the forms of cyber violence using empirical theories; the few that exist are qualitative, and others examine the effects of cyber violence rather than why those who engage in it behave the way they do. It is against this backdrop that this study examines various cyber violence behaviours among social media users in Ghana by applying self-control theory and social learning theory. This study is important for the following reasons. The outcome will aid policymaking at both the national and international levels by adding to the knowledge of cyber violence and why people engage in its various forms. It will also help expose the ways in which such behaviours are reinforced, thereby serving as a guide for the enactment of appropriate rules and laws to curb them, and it will add to the literature on the consequences of new media. The study seeks to confirm or reject the following research hypotheses:
H1: Social media usage has a direct significant effect on cyber violence behaviours. H2: Ineffective parental management has a direct significant positive relation to low self-control. H3: Low self-control has a direct significant positive effect on cyber violence behaviours among social media users in Ghana. H4: Differential association has a significant positive effect on cyber violence behaviour among social media users in Ghana. H5: Definitions have a significant positive effect on cyber violence behaviour among social media users in Ghana. H6: Imitation has a significant positive effect on cyber violence behaviour among social media users in Ghana. H7: Differential reinforcement has a significant positive effect on cyber violence behaviour among social media users in Ghana. H8: Differential association has a significant positive effect on definitions. H9: Differential association has a significant positive effect on imitation. H10: Differential association has a significant positive effect on differential reinforcement. H11: Differential association has significant indirect positive effects on cyber violence through the learning process.

Keywords: cyberviolence, social media users, self-control theory, social learning theory

Procedia PDF Downloads 71
8654 Biopolitics and Race in the Age of a Global Pandemic: Interactions and Transformations

Authors: Aistis Zekevičius

Abstract:

Biopolitical theory, first developed by Michel Foucault, takes as its subject the administration of life through a style of government based on the regulation of populations. The intensification of the #BlackLivesMatter movement and popular outcries against racial discrimination in the US health system prompt us to reconsider the relationship between biopolitics and race in the face of the COVID-19 pandemic. Drawing on works by Foucault, Achille Mbembe, and Nicholas Mirzoeff that transcend the boundaries of poststructuralism, critical theory, and postcolonial studies, the paper suggests that the global pandemic has highlighted new aspects of the interplay between biopower and race by encouraging the search for scapegoats, deepening structural racial inequality, and thus producing necropolitical regimes of exclusion.

Keywords: biopolitics, biopower, necropolitics, pandemic, race

Procedia PDF Downloads 249
8653 Fairly Irrigation Water Distribution between Upstream and Downstream Water Users in Water Shortage Periods

Authors: S. M. Hashemy Shahdany

Abstract:

Equitable water delivery has become one of the main concerns for water authorities in arid regions. Due to water scarcity, providing a reliable amount of water is not possible for most irrigation districts in arid regions. In this paper, water level difference control is applied to keep the water level errors equal in adjacent reaches. Distant downstream decentralized configurations of the control method are designed and tested under a realistic scenario representing canal operation under water shortage. The simulation results show that the difference controllers share the water level error among all of the users in a fair way; the water deficit therefore has a similar influence on downstream and upstream water offtakes.
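The principle can be illustrated with a toy two-reach simulation: rather than driving each reach's level error to zero (infeasible under shortage), a difference controller drives the *difference* between adjacent errors to zero, so the deficit is shared. The gain and dynamics below are illustrative only, not the paper's canal model.

```python
def simulate(steps=200, k=0.2):
    e = [0.5, 0.1]  # initial water level errors (m) in two adjacent reaches
    for _ in range(steps):
        # gate adjustment shifts water toward the reach with the larger error,
        # proportional to the error difference
        u = k * (e[0] - e[1])
        e[0] -= u
        e[1] += u
    return e

errors = simulate()
# both reaches converge to the same error (the shared deficit), about 0.3 m each
```

Because the control action only redistributes water (the total error is conserved here), equalizing the errors is the fairest outcome this simple loop can achieve.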

Keywords: equitable water distribution, precision agriculture, sustainable agriculture, water shortage

Procedia PDF Downloads 457
8652 Optimal Planning of Dispatchable Distributed Generators for Power Loss Reduction in Unbalanced Distribution Networks

Authors: Mahmoud M. Othman, Y. G. Hegazy, A. Y. Abdelaziz

Abstract:

This paper proposes a novel heuristic algorithm that determines the best size and location of distributed generators in unbalanced distribution networks. The proposed algorithm can handle planning cases where power loss is to be optimized without violating the system's practical constraints. The distributed generation units are modeled as voltage-controlled nodes, with the flexibility to be converted to constant-power-factor nodes in case of a reactive power limit violation. The algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results obtained show the effectiveness of the proposed algorithm.

Keywords: distributed generation, heuristic approach, optimization, planning

Procedia PDF Downloads 514
8651 Moroccan Human Ecological Behavior: Grounded Theory Approach

Authors: Dalal Tarfaoui, Salah Zkim

Abstract:

Today, environmental sustainability is everyone's concern, as it contributes in many ways to a country's development. Morocco is aware of the increasing threats to its natural resources. Accordingly, many projects and studies have addressed water security, pollution, desertification, and land degradation, but few have examined human demeanor to disclose its ecological dimension, even though human behavior is accountable for environmental deterioration in the first place; we keep fighting the symptoms instead of limiting the root causes. Within the conceptual framework presented in this article, semi-structured interviews were conducted using a grounded theory approach. This study will initially serve as a pilot study and a cornerstone for a larger project now in progress. Going beyond existing general ecological measures (GEM), the grounded theory approach was chosen to bring out firsthand insights and probe to what extent an ecological dimension exists in Morocco as a developing country. The discourse of ecological behavior within the Moroccan context is seen in a more realist, social, and community philosophy. The study revealed an appreciative ecological behavior that is unfortunately repressed by variables beyond people's control, which prevent people's good environmental intentions from being translated into real ecological actions.

Keywords: ecological behavior, ecological dimension, variables beyond people’s control, Morocco

Procedia PDF Downloads 485
8650 Reductive Control in the Management of Redundant Actuation

Authors: Mkhinini Maher, Knani Jilani

Abstract:

In this work, we present the performance of a mobile omnidirectional robot by evaluating its management of actuation redundancy, leading to the predictive control implemented. The distribution of the wrench over the robot's actuators, through the Moore-Penrose pseudo-inverse, corresponds to a geometric distribution of efforts. We show that the load on the vehicle's wheels is not equally distributed, owing to the wheel configuration and the robot's movement; thus, the sliding threshold is not the same for the three wheels of the vehicle. We suggest exploiting the redundancy of actuation to reduce the risk of wheel sliding and thereby improve the accuracy of displacement. This kind of approach has previously been studied for legged robots.
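The pseudo-inverse effort distribution can be sketched for a redundant case. The four-omni-wheel geometry below is a common textbook configuration chosen to exhibit redundancy (four actuators, three body degrees of freedom); it is not the paper's specific three-wheel platform, and the dimensions are illustrative.

```python
import numpy as np

angles = np.deg2rad([45.0, 135.0, 225.0, 315.0])  # wheel drive directions
R = 0.2                                            # wheel mounting radius (m)

# rows: body x-force, y-force, and torque produced per unit wheel force
A = np.vstack([-np.sin(angles), np.cos(angles), np.full(4, R)])

wrench = np.array([1.0, 0.0, 0.0])     # desired body force/torque
f = np.linalg.pinv(A) @ wrench         # minimum-norm wheel forces

# the pseudo-inverse picks, among the infinitely many wheel-force vectors
# realizing the wrench, the one with the smallest Euclidean norm
assert np.allclose(A @ f, wrench)
```

The paper's point is precisely that this minimum-norm (geometric) distribution need not match the wheels' actual sliding thresholds, which motivates redistributing efforts within the null space of A.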

Keywords: mobile robot, actuation, redundancy, omnidirectional, Moore-Penrose pseudo-inverse, reductive control

Procedia PDF Downloads 504
8649 The Use of the Matlab Software as the Best Way to Recognize Penumbra Region in Radiotherapy

Authors: Alireza Shayegan, Morteza Amirabadi

Abstract:

The γ (gamma) tool was developed to quantitatively compare dose distributions, either measured or calculated. Before computing γ, the dose and distance scales of the two distributions, referred to as evaluated and reference, are re-normalized by dose and distance criteria, respectively. The re-normalization allows the dose distribution comparison to be conducted simultaneously along the dose and distance axes. Several two-dimensional images were acquired using a Scanning Liquid Ionization Chamber EPID and Extended Dose Range (EDR2) films for regular and irregular radiation fields. The raw images were then converted into two-dimensional dose maps. Translational and rotational manipulations were performed on the images using Matlab software. These evaluated dose distribution maps were then compared with the corresponding original dose maps as the reference dose maps.
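The γ comparison described above can be sketched in 1-D for clarity: for each reference point, search the evaluated profile for the minimum combined dose-difference / distance-to-agreement metric. The 3% / 3 mm criteria are the usual clinical defaults, assumed here for illustration.

```python
import math

def gamma_index(ref, ev, spacing_mm=1.0, dose_crit=0.03, dist_crit_mm=3.0):
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_ev in enumerate(ev):
            dd = (d_ev - d_ref) / dose_crit           # re-normalized dose axis
            dx = (j - i) * spacing_mm / dist_crit_mm  # re-normalized distance axis
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

ref = [0.1, 0.5, 1.0, 0.5, 0.1]
ev  = [0.1, 0.5, 1.0, 0.5, 0.1]
g = gamma_index(ref, ev)
# identical profiles agree everywhere: every gamma value is 0 (pass means gamma <= 1)
```

The re-normalization by the two criteria is what lets a single number combine dose error and spatial error; in 2-D the inner search simply runs over image pixels instead of profile points.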

Keywords: energetic electron, gamma function, penumbra, Matlab software

Procedia PDF Downloads 292
8648 Statistical Analysis for Overdispersed Medical Count Data

Authors: Y. N. Phang, E. F. Loh

Abstract:

Many researchers have suggested the use of zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models for over-dispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. Studies indicate that ZIP and ZINB consistently provide a better fit than the standard Poisson and negative binomial models for such data. In this study, we propose the use of zero-inflated inverse trinomial (ZIIT), zero-inflated Poisson inverse Gaussian (ZIPIG), and zero-inflated strict arcsine (ZISA) models for over-dispersed medical count data. These models are not widely used, especially in the medical field. The results show that the three suggested models can serve as alternatives for modeling over-dispersed medical count data, supported by their application to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian, and strict arcsine are discrete distributions with a cubic variance function of the mean; therefore, ZIIT, ZIPIG, and ZISA can accommodate data with excess zeros and very heavy tails. They are recommended for modeling over-dispersed medical count data when ZIP and ZINB are inadequate.
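To illustrate why zero inflation matters, here is a minimal method-of-moments fit of the simplest such model, the ZIP, on toy data; it is a sketch of the general idea, not the paper's estimation procedure. For ZIP with inflation probability pi and rate lam, mean = (1 − pi)·lam and var/mean = 1 + pi·lam, which invert to the closed-form estimators below (valid when the sample is overdispersed).

```python
import statistics

def zip_moments(counts):
    m = statistics.fmean(counts)
    v = statistics.pvariance(counts)
    lam = m + v / m - 1.0   # lam_hat = xbar + s^2/xbar - 1
    pi = 1.0 - m / lam      # pi_hat  = 1 - xbar/lam_hat
    return pi, lam

# toy overdispersed counts: many extra zeros around a Poisson-like cluster
counts = [0] * 50 + [1, 2, 2, 3, 3, 3, 4, 4, 5, 6] * 5
pi_hat, lam_hat = zip_moments(counts)
# pi_hat ~ 0.43 approximates the share of structural zeros,
# lam_hat ~ 2.91 the Poisson rate of the non-inflated component
```

A plain Poisson fit to these counts would be forced to a mean of 1.65 and badly underpredict both the zeros and the tail, which is the failure mode the zero-inflated families repair.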

Keywords: zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit

Procedia PDF Downloads 533
8647 Uncertainty Quantification of Crack Widths and Crack Spacing in Reinforced Concrete

Authors: Marcel Meinhardt, Manfred Keuser, Thomas Braml

Abstract:

Cracking of reinforced concrete is a complex phenomenon induced by direct loads or restraints affecting reinforced concrete structures as soon as the tensile strength of the concrete is exceeded. Hence, it is important to predict where cracks will be located and how they will propagate. The bond theory and the crack formulas in current design codes, for example DIN EN 1992-1-1, are all based on the assumption that the reinforcement bars are embedded in homogeneous concrete, without taking into account the influence of transverse reinforcement and the real stress situation. However, it can often be observed that real structures such as walls, slabs, or beams show a crack spacing oriented to the transverse reinforcement bars or to the stirrups. In most finite element analysis studies, the smeared crack approach is used for crack prediction. The disadvantage of this model is that the typical strain localization of a crack at the element level cannot be seen. Crack propagation in concrete is a discontinuous process characterized by different factors, such as the initial random distribution of defects or the scatter of material properties. Such behavior requires adequate models and simulation methods, because traditional mechanical approaches deal mainly with average material parameters. This paper is concerned with modelling the initiation and propagation of cracks in reinforced concrete structures, considering the influence of transverse reinforcement and the real stress distribution in reinforced concrete (R/C) beams/plates in bending. A parameter study was carried out to investigate (I) the influence of the transverse reinforcement on the stress distribution in concrete in bending mode and (II) the crack initiation in dependence of the diameter of the transverse reinforcement and its spacing.
The numerical investigations of crack initiation and propagation were carried out on a 2D reinforced concrete structure subjected to quasi-static loading and given boundary conditions. To model the uncertainty in the tensile strength of concrete in the finite element analysis, correlated normally and lognormally distributed random fields with different correlation lengths were generated. The paper also presents and discusses different methods to generate random fields, e.g. the covariance matrix decomposition method. For all computations, a plastic constitutive law with softening was used to model crack initiation and the damage of the concrete in tension. It was found that the distributions of crack spacing and crack widths are highly dependent on the random field used. These distributions were validated against experimental studies on R/C panels carried out at the Laboratory for Structural Engineering at the University of the German Armed Forces in Munich. A recommendation for the parameters of the random field for realistic modelling of the uncertainty of the tensile strength is also given. The aim of this research was to show a method in which the localization of strains and cracks, as well as the influence of transverse reinforcement on crack initiation and propagation, can be seen in finite element analysis.
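The covariance matrix decomposition method mentioned above can be sketched in 1-D: build an exponential covariance over a line of points, Cholesky-factorize it, and multiply standard normals to obtain a correlated (here normally distributed) tensile-strength field. Grid size, correlation length, and the exponential kernel are illustrative assumptions.

```python
import math
import random

def cholesky(C):
    # lower-triangular L with L @ L^T == C (standard Cholesky-Banachiewicz)
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(C[i][i] - s)
            else:
                L[i][j] = (C[i][j] - s) / L[j][j]
    return L

def correlated_field(xs, corr_len, sigma=1.0, seed=0):
    rng = random.Random(seed)
    # exponential covariance: nearby points get similar strength values
    C = [[sigma**2 * math.exp(-abs(a - b) / corr_len) for b in xs] for a in xs]
    L = cholesky(C)
    z = [rng.gauss(0.0, 1.0) for _ in xs]
    # field = L @ z has covariance L @ L^T = C by construction
    return [sum(L[i][k] * z[k] for k in range(len(xs))) for i in range(len(xs))]

xs = [0.1 * i for i in range(30)]            # 30 points along the member
field = correlated_field(xs, corr_len=0.5)   # smooth over ~0.5 length units
```

A lognormal field, as also used in the paper, would simply exponentiate such a Gaussian field; varying `corr_len` reproduces the study's "different correlation lengths".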

Keywords: crack initiation, crack modelling, crack propagation, cracks, numerical simulation, random fields, reinforced concrete, stochastic

Procedia PDF Downloads 147
8646 Hydrochemical Contamination Profiling and Spatial-Temporal Mapping with the Support of Multivariate and Cluster Statistical Analysis

Authors: Sofia Barbosa, Mariana Pinto, José António Almeida, Edgar Carvalho, Catarina Diamantino

Abstract:

The aim of this work was to test a methodology able to generate spatial-temporal maps that synthesize simultaneously the trends of distinct hydrochemical indicators in an old radium-uranium tailings dam deposit. Dimensionality reduction by principal component analysis, followed by data aggregation through clustering analysis, makes it possible to identify distinct hydrochemical behavioural profiles and to generate synthetic evolutionary hydrochemical maps.
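A minimal sketch of the "K-means of PCA scores" pipeline for two indicators is given below: standardize each variable, project onto the first principal component (which for two standardized variables has the closed-form direction (1, sign(r))/sqrt(2)), then cluster the scores with 1-D k-means. The indicator names and the split into two clusters are illustrative assumptions, not the study's data.

```python
import math, random

def standardize(col):
    """Center and scale one variable to zero mean, unit variance."""
    m = sum(col) / len(col)
    s = math.sqrt(sum((v - m) ** 2 for v in col) / len(col))
    return [(v - m) / s for v in col]

def first_pc_scores(x, y):
    """Scores on the first principal component of two standardized
    variables; the leading eigenvector of a 2x2 correlation matrix
    is (1, sign(r)) / sqrt(2)."""
    r = sum(a * b for a, b in zip(x, y)) / len(x)
    s = 1.0 if r >= 0 else -1.0
    return [(a + s * b) / math.sqrt(2) for a, b in zip(x, y)]

def kmeans_1d(scores, k=2, iters=50):
    """Plain 1-D k-means on the PCA scores."""
    centers = sorted(random.sample(scores, k))
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in scores:
            groups[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return [min(range(k), key=lambda i: abs(v - centers[i])) for v in scores]
```

With, e.g., sulphate and uranium concentrations measured at a set of monitoring points, the cluster labels returned by `kmeans_1d(first_pc_scores(standardize(s), standardize(u)))` group the points into behavioural profiles that can then be posted on a map per sampling campaign.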

Keywords: contamination plume migration, k-means of PCA scores, groundwater and mine water monitoring, spatial-temporal hydrochemical trends

Procedia PDF Downloads 222
8645 Effectiveness of Self-Learning Module on the Academic Performance of Students in Statistics and Probability

Authors: Aneia Rajiel Busmente, Renato Gunio Jr., Jazin Mautante, Denise Joy Mendoza, Raymond Benedict Tagorio, Gabriel Uy, Natalie Quinn Valenzuela, Ma. Elayza Villa, Francine Yezha Vizcarra, Sofia Madelle Yapan, Eugene Kurt Yboa

Abstract:

COVID-19’s rapid spread caused a dramatic change in the nation, especially in the educational system. The Department of Education was forced to adopt a practical learning platform without neglecting health: printed modular distance learning. The Philippines' K–12 curriculum includes Statistics and Probability as one of the key courses, as it offers students the knowledge to evaluate and comprehend data. However, students have difficulty understanding the concepts of the Normal Distribution in Statistics and Probability, and the Self-Learning Module on the Normal Distribution created by the Department of Education has several problems, including too many activities, unclear illustrations, and insufficient examples of concepts, which make the module difficult for learners to accomplish. The purpose of this study is to determine the effectiveness of a self-learning module on the academic performance of students in the subject Statistics and Probability; it will also explore students’ perception of the quality of the created Self-Learning Module in Statistics and Probability. Despite the availability of Self-Learning Modules in Statistics and Probability in the Philippines, there is still little literature that discusses their effectiveness in improving the performance of Senior High School students in Statistics and Probability. In this study, a Self-Learning Module on the Normal Distribution is evaluated using a quasi-experimental design. Grade 11 STEM students from National University's Nazareth School will be the study's participants, chosen by purposive sampling. Google Forms will be utilized to find at least 100 Grade 11 STEM students. The research instrument consists of a 20-item pre- and post-test to assess participants' knowledge and performance regarding the Normal Distribution, and a Likert scale survey to evaluate how the students perceived the self-learning module.
Pre-test, post-test, and Likert scale surveys will be utilized to gather data, with Jeffreys' Amazing Statistics Program (JASP) software being used for analysis.

Keywords: self-learning module, academic performance, statistics and probability, normal distribution

Procedia PDF Downloads 100
8644 Analytical Model of Locomotion of a Thin-Film Piezoelectric 2D Soft Robot Including Gravity Effects

Authors: Zhiwu Zheng, Prakhar Kumar, Sigurd Wagner, Naveen Verma, James C. Sturm

Abstract:

Soft robots have drawn great interest recently due to the rich range of possible shapes and motions they can take on to address new applications, compared to traditional rigid robots. Large-area electronics (LAE) provides a unique platform for creating soft robots by leveraging thin-film technology to enable the integration of a large number of actuators, sensors, and control circuits on flexible sheets. However, the rich shapes and motions possible, especially when interacting with complex environments, pose significant challenges to forming well-generalized and robust models necessary for robot design and control. In this work, we describe an analytical model for predicting the shape and locomotion of a flexible (steel-foil-based) piezoelectric-actuated 2D robot based on Euler-Bernoulli beam theory. Nominally (unpowered), the robot lies flat on the ground; when powered, its shape is controlled by an array of piezoelectric thin-film actuators. Key features of the model are its ability to incorporate the significant effects of gravity on the shape and to precisely predict the spatial distribution of friction against the contacting surfaces, which is necessary for determining inchworm-type motion. We verified the model by developing a distributed discrete-element representation of a continuous piezoelectric actuator and by comparing its analytical predictions to discrete-element robot simulations using PyBullet. Without gravity, predicting the shape of a sheet with a linear array of piezoelectric actuators at arbitrary voltages is straightforward. However, gravity significantly distorts the shape of the sheet, causing some segments to flatten against the ground. Our work includes the following contributions: (i) A self-consistent approach was developed to exactly determine which parts of the soft robot are lifted off the ground, and the exact shape of these sections, for an arbitrary array of piezoelectric voltages and configurations.
(ii) Inchworm-type motion relies on controlling the relative friction with the ground surface in different sections of the robot. By adding torque balance to our model and analyzing shear forces, the model can determine the exact spatial distribution of the vertical force that the ground exerts on the soft robot. From this, the spatial distribution of friction forces between ground and robot can be determined. (iii) By combining this spatial friction distribution with the shape of the soft robot, as a function of time as the piezoelectric actuator voltages are changed, the inchworm-type locomotion of the robot can be determined. As a practical example, we calculated the performance of a 5-actuator system on a 50-µm-thick steel foil. Piezoelectric properties of commercially available thin-film piezoelectric actuators were assumed. The model predicted inchworm motion of up to 200 µm per step. For independent verification, we also modelled the system using PyBullet, a discrete-element robot simulator. To model a continuous thin-film piezoelectric actuator, we broke each actuator into multiple segments, each of which consisted of two rigid arms with appropriate mass connected by a 'motor' whose torque was set by the applied actuator voltage. Excellent agreement between our analytical model and the discrete-element simulator was shown for both the full deformation shape and the motion of the robot.
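The gravity-free part of such a shape prediction can be sketched by integrating slope and deflection segment by segment, with each actuator imposing a constant curvature. This is a hedged illustration only: the segment length, the curvatures, and the linear voltage-to-curvature gain below are assumed, and gravity and ground contact (the paper's self-consistent lift-off analysis) are deliberately ignored.

```python
import math

def beam_shape(kappas, seg_len, n_sub=20):
    """Integrate the slope and deflection of a thin sheet made of actuator
    segments, each bent to a constant curvature kappa [1/m]; exact arc
    geometry via d(theta)/ds = kappa."""
    xs, zs = [0.0], [0.0]
    theta = 0.0
    ds = seg_len / n_sub
    for kappa in kappas:
        for _ in range(n_sub):
            theta += kappa * ds            # slope accumulates curvature
            xs.append(xs[-1] + ds * math.cos(theta))
            zs.append(zs[-1] + ds * math.sin(theta))
    return xs, zs

# Hypothetical 5-actuator sheet, 2 cm per segment, curvature set by voltage
# through an assumed linear gain (not the paper's calibrated values):
kappas = [0.8 * v for v in (1.0, 0.5, 0.0, -0.5, -1.0)]
xs, zs = beam_shape(kappas, seg_len=0.02)
```

With zero curvature everywhere the sheet stays flat at its full length; nonzero curvatures curl it up or down, which is the starting point before gravity flattens some segments against the ground.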

Keywords: analytical modeling, piezoelectric actuators, soft robot locomotion, thin-film technology

Procedia PDF Downloads 172
8643 Impact of Diabetes Mellitus Type 2 on Clinical In-Stent Restenosis in First Elective Percutaneous Coronary Intervention Patients

Authors: Leonard Simoni, Ilir Alimehmeti, Ervina Shirka, Endri Hasimi, Ndricim Kallashi, Verona Beka, Suerta Kabili, Artan Goda

Abstract:

Background: Diabetes Mellitus type 2, small vessel calibre, stented vessel length, complex lesion morphology, and prior bypass surgery have been identified as risk factors for In-Stent Restenosis (ISR). However, there are some contradictory results about body mass index (BMI) as a risk factor for ISR. Purpose: We aimed to identify clinical, lesional, and procedural factors that can predict clinical ISR in our patients. Methods: We enrolled 759 patients who underwent first-time elective PCI with Bare Metal Stents (BMS) from September 2011 to December 2013 in our Department of Cardiology and followed them for at least 1.5 years, with a median of 862 days (2 years and 4 months). Only the patients re-admitted with ischemic heart disease underwent control coronary angiography; no routine angiographic control was performed. Patients were categorized into ISR and non-ISR groups and compared between them. Multivariate analysis (binary logistic regression, forward conditional method) was used to identify independent predictive risk factors. P < 0.05 was considered statistically significant. Results: ISR compared to non-ISR individuals had a significantly lower BMI (25.7±3.3 vs. 26.9±3.7, p=0.004), higher-risk anatomy (LM + 3-vessel CAD) (23% vs. 14%, p=0.03), a higher number of stents per person (2.1±1.1 vs. 1.75±0.96, p=0.004), a greater length of stents per person (39.3±21.6 vs. 33.3±18.5, p=0.01), and a lower use of clopidogrel and ASA together (95% vs. 99%, p=0.012). They also had a higher, although not statistically significant, prevalence of Diabetes Mellitus (42% vs. 32%, p=0.072) and a greater number of treated vessels (1.36±0.5 vs. 1.26±0.5, p=0.08). In the multivariate analysis, Diabetes Mellitus type 2 and multiple stents were independent predictive risk factors for In-Stent Restenosis, OR 1.66 [1.03-2.68], p=0.039, and OR 1.44 [1.16-1.78], p=0.001, respectively.
On the other hand, higher BMI and the use of clopidogrel and ASA together were protective factors, OR 0.88 [0.81-0.95], p=0.001 and OR 0.2 [0.06-0.72], p=0.013, respectively. Conclusion: Diabetes Mellitus and multiple stents are strong predictive risk factors, whereas the use of clopidogrel and ASA together is protective against clinical In-Stent Restenosis. Paradoxically, high BMI is a protective factor for In-Stent Restenosis, probably related to a larger diameter of vessels and consequently a larger diameter of stents implanted in these patients. Further studies are needed to clarify this finding.
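For readers unfamiliar with how an odds ratio and its confidence interval are obtained from raw counts, the following sketch computes a crude (unadjusted) odds ratio with a Wald interval from a 2x2 table. The counts are hypothetical, not the study's data, and this univariate calculation is simpler than the multivariate logistic regression the authors used.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a 95% Wald confidence interval from a 2x2 table:
       a = exposed with event,   b = exposed without event,
       c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: diabetics vs. non-diabetics, ISR vs. no ISR
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
```

An OR above 1 with a confidence interval excluding 1 (as for diabetes in the study, OR 1.66 [1.03-2.68]) indicates a statistically significant risk factor; an OR below 1 (as for BMI) indicates a protective factor.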

Keywords: body mass index, diabetes mellitus, in-stent restenosis, percutaneous coronary intervention

Procedia PDF Downloads 203
8642 Optimal Design of Step-Stress Partially Life Test Using Multiply Censored Exponential Data with Random Removals

Authors: Showkat Ahmad Lone, Ahmadur Rahman, Ariful Islam

Abstract:

The major assumption in accelerated life tests (ALT) is that the mathematical model relating the lifetime of a test unit and the stress is known or can be assumed. In some cases, such life-stress relationships are not known and cannot be assumed, i.e. ALT data cannot be extrapolated to use conditions. In such cases, a partially accelerated life test (PALT), in which tested units are subjected to both normal and accelerated conditions, is more suitable. This study deals with estimating information about the failure times of items under step-stress partially accelerated life tests using progressive failure-censored hybrid data with random removals. The life data of the units under test are assumed to follow an exponential life distribution, and the removals from the test are assumed to have binomial distributions. Point and interval maximum likelihood estimates are obtained for the unknown distribution parameters and the tampering coefficient. An optimum test plan is developed using the D-optimality criterion. The performance of the resulting estimators of the developed model parameters is evaluated and investigated by using a simulation algorithm.
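Under censoring, the exponential MLE has a simple closed form: observed failures divided by total time on test. The sketch below simulates type-I (time-truncated) censoring with an assumed rate and censoring time; it illustrates the kind of simulation used to evaluate estimator performance, not the paper's full step-stress design with random removals.

```python
import random

def simulate_censored_exponential(n, rate, tau, rng):
    """Draw n exponential lifetimes; units still running at time tau are
    right-censored (type-I censoring)."""
    times, events = [], []
    for _ in range(n):
        t = rng.expovariate(rate)
        times.append(min(t, tau))
        events.append(t <= tau)     # True = failure observed before tau
    return times, events

def exponential_mle(times, events):
    """Censored-exponential MLE: observed failures / total time on test."""
    return sum(events) / sum(times)

# Hypothetical check: true rate 0.5, test truncated at tau = 3.0
rng = random.Random(1)
times, events = simulate_censored_exponential(5000, 0.5, 3.0, rng)
rate_hat = exponential_mle(times, events)
```

Repeating such simulations over many replications gives the bias and variance of the estimator, which is how the performance of an optimal test plan is typically assessed.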

Keywords: binomial distribution, d-optimality, multiple censoring, optimal design, partially accelerated life testing, simulation study

Procedia PDF Downloads 315
8641 Comparison of Power Generation Status of Photovoltaic Systems under Different Weather Conditions

Authors: Zhaojun Wang, Zongdi Sun, Qinqin Cui, Xingwan Ren

Abstract:

Based on multivariate statistical analysis theory, this paper uses the principal component analysis method, the Mahalanobis distance analysis method and a fitting method to establish a photovoltaic health model to evaluate the health of photovoltaic panels. First, according to weather conditions, the photovoltaic panel variable data are classified into five categories: sunny, cloudy, rainy, foggy, and overcast. The health of photovoltaic panels in these five types of weather is studied. Second, a scatterplot of the relationship between the amount of electricity produced in each kind of weather and the other variables was plotted. It was found that the amount of electricity generated by photovoltaic panels has a significant nonlinear relationship with time. The fitting method was used to fit the relationship between the amount of electricity generated and time, and a nonlinear equation was obtained. Then, principal component analysis was applied to the independent variables under the five weather conditions. According to the Kaiser-Meyer-Olkin test, three types of weather, namely overcast, foggy, and sunny, meet the conditions for factor analysis, while cloudy and rainy weather do not. Through the principal component analysis, the main components of overcast weather are temperature, AQI, and PM2.5; the main component of foggy weather is temperature; and the main components of sunny weather are temperature, AQI, and PM2.5. Cloudy and rainy weather require analysis of all of their variables, namely temperature, AQI, PM2.5, solar radiation intensity, and time. Finally, taking the variable values in sunny weather as observed values and the main components of cloudy, foggy, overcast and rainy weather as sample data, the Mahalanobis distances between the observed values and these sample values are obtained.
A comparative analysis was carried out to compare the degree of deviation of the Mahalanobis distance in order to determine the health of the photovoltaic panels under different weather conditions. Ranked from the smallest to the largest Mahalanobis distance fluctuations, the weather conditions were: foggy, cloudy, overcast and rainy.
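The Mahalanobis distance underlying this comparison measures how far a point lies from a sample, scaled by the sample's own covariance. A minimal two-variable sketch is given below (the sample points are hypothetical; the paper's analysis was carried out in SPSS and MATLAB):

```python
import math

def mahalanobis_2d(point, sample):
    """Mahalanobis distance of a 2-variable point from a sample:
    sqrt((x - mu)^T S^{-1} (x - mu)), with the 2x2 covariance S
    inverted in closed form."""
    n = len(sample)
    mx = sum(p[0] for p in sample) / n
    my = sum(p[1] for p in sample) / n
    sxx = sum((p[0] - mx) ** 2 for p in sample) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in sample) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in sample) / (n - 1)
    det = sxx * syy - sxy * sxy         # invert the 2x2 covariance matrix
    ixx, iyy, ixy = syy / det, sxx / det, -sxy / det
    dx, dy = point[0] - mx, point[1] - my
    return math.sqrt(ixx * dx * dx + 2 * ixy * dx * dy + iyy * dy * dy)
```

Unlike the Euclidean distance, this accounts for the scale and correlation of the variables, so a panel's observed values can be compared fairly against reference samples from different weather regimes.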

Keywords: fitting, principal component analysis, Mahalanobis distance, SPSS, MATLAB

Procedia PDF Downloads 140
8640 Development of an Instructional Model for Health Education Based On Social Cognitive Theory and Strategic Life Planning to Enhance Self-Regulation and Learning Achievement of Lower Secondary School Students

Authors: Adisorn Bansong, Walai Isarankura Na Ayudhaya, Aumporn Makanong

Abstract:

The aim of this study was to develop and evaluate the effectiveness of an instructional model for health education to enhance the self-regulation and learning achievement of lower secondary school students. The study used a quasi-experimental, single-group interrupted time-series design and was conducted in two phases: 1. developing an instructional model based on Social Cognitive Theory and Strategic Life Planning; 2. trialling and evaluating the effectiveness of the instructional model. The results were as follows: i. The Instructional Model for Health Education consists of five main components: a) Attention, b) Forethought, c) Tactic Planning, d) Execution, and e) Reflection. ii. After the Instructional Model for Health Education had been used for a one-semester trial, the sample's self-regulation was 4.07 percent higher, and post-test learning achievement was significantly higher than the pre-test at the .05 level (p = .033 and p < .001, respectively).

Keywords: social cognitive theory, strategic life planning, self-regulation, learning achievement

Procedia PDF Downloads 455
8639 Quantile Smoothing Splines: Application on Productivity of Enterprises

Authors: Semra Turkan

Abstract:

In this paper, we have examined the factors that affect the productivity of Turkey’s Top 500 Industrial Enterprises in 2014. Labor productivity is taken as an indicator of the productivity of industrial enterprises. When the relationships between some financial ratios and labor productivity are examined, it is seen that there is a nonparametric relationship between labor productivity and return on sales. In addition, the distribution of labor productivity across enterprises is right-skewed. If the distribution of the dependent variable is skewed, quantile regression is more suitable for the data. Hence, the nonparametric relationship between labor productivity and return on sales is modelled by quantile smoothing splines.
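The reason quantile methods suit skewed data can be illustrated with the pinball (quantile) loss: minimizing it over a constant yields the corresponding sample quantile, so upper quantiles track the right tail rather than being dragged by it as the mean is. A minimal sketch with made-up productivity figures (not the study's data):

```python
def pinball_loss(c, ys, tau):
    """Average pinball (quantile) loss of the constant prediction c
    at quantile level tau."""
    return sum(tau * (y - c) if y >= c else (1 - tau) * (c - y)
               for y in ys) / len(ys)

def best_constant(ys, tau, grid):
    """Grid-search minimizer of the pinball loss; for a fine enough
    grid this lands on the tau-th sample quantile."""
    return min(grid, key=lambda c: pinball_loss(c, ys, tau))

# Hypothetical right-skewed labor-productivity figures:
ys = [1.0, 2.0, 3.0, 4.0, 100.0]
grid = [0.5 * i for i in range(250)]
median_fit = best_constant(ys, 0.5, grid)   # tracks the middle
upper_fit = best_constant(ys, 0.9, grid)    # tracks the right tail
```

Quantile smoothing splines generalize this: instead of a constant, a smooth function of the covariate (here, return on sales) is fitted by minimizing the same pinball loss plus a roughness penalty.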

Keywords: quantile regression, smoothing spline, labor productivity, financial ratios

Procedia PDF Downloads 296
8638 Optimum Stratification of a Skewed Population

Authors: D. K. Rao, M. G. M. Khan, K. G. Reddy

Abstract:

The focus of this paper is to develop a technique for solving the combined problem of determining the Optimum Strata Boundaries (OSB) and the Optimum Sample Size (OSS) of each stratum, when the population under study is skewed and the study variable has a Pareto frequency distribution. The problem of determining the OSB is formulated as a Mathematical Programming Problem (MPP), which is then solved by the dynamic programming technique. A numerical example is presented to illustrate the computational details of the proposed method. The proposed technique is useful for obtaining the OSB and OSS for a Pareto-type skewed population, which minimizes the variance of the estimate of the population mean.
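The paper solves the MPP by dynamic programming. As a simpler, classical point of comparison, the Dalenius-Hodges cumulative square-root-of-frequency rule below sketches how strata boundaries can be placed on a skewed (here, simulated Pareto) population; this is an illustrative alternative heuristic, not the authors' method, and the shape parameter and stratum count are assumed.

```python
import math, random

def cum_sqrt_f_boundaries(values, n_bins=50, n_strata=4):
    """Dalenius-Hodges cum-sqrt(f) rule: histogram the data, accumulate
    sqrt(frequency), and cut at equal increments of the accumulated total."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    freq = [0] * n_bins
    for v in values:
        freq[min(int((v - lo) / width), n_bins - 1)] += 1
    cum, total = [], 0.0
    for f in freq:
        total += math.sqrt(f)
        cum.append(total)
    step = total / n_strata
    bounds, target = [], step
    for i, c in enumerate(cum):
        if c >= target and len(bounds) < n_strata - 1:
            bounds.append(lo + (i + 1) * width)   # right edge of the bin
            target += step
    return bounds

# Simulated skewed population (Pareto, shape alpha = 2, scale 1):
rng = random.Random(2)
population = [rng.paretovariate(2.0) for _ in range(2000)]
boundaries = cum_sqrt_f_boundaries(population, n_bins=50, n_strata=4)
```

For a heavy-tailed population, the resulting boundaries are packed tightly near the low end and spread out in the tail, which is the qualitative behaviour an optimal stratification should also show.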

Keywords: stratified sampling, optimum strata boundaries, optimum sample size, pareto distribution, mathematical programming problem, dynamic programming technique

Procedia PDF Downloads 447
8637 Application of Neutron Stimulated Gamma Spectroscopy for Soil Elemental Analysis and Mapping

Authors: Aleksandr Kavetskiy, Galina Yakubova, Nikolay Sargsyan, Stephen A. Prior, H. Allen Torbert

Abstract:

Determining soil elemental content and distribution (mapping) within a field are key features of modern agricultural practice. While traditional chemical analysis is a time-consuming and labor-intensive multi-step process (e.g., sample collection, transport to the laboratory, physical preparation, and chemical analysis), neutron-gamma soil analysis can be performed in-situ. This analysis is based on the registration of gamma rays emitted from nuclei upon interaction with neutrons. Soil elements such as Si, C, Fe, O, Al, K, and H (moisture) can be assessed with this method. Data received from the analysis can be directly used for creating soil elemental distribution maps (based on ArcGIS software) suitable for agricultural purposes. The neutron-gamma analysis system developed for field application consisted of an MP320 Neutron Generator (Thermo Fisher Scientific, Inc.), 3 sodium iodide gamma detectors (SCIONIX, Inc.) with a total volume of 7 liters, 'split electronics' (XIA, LLC), a power system, and an operational computer. Paired with GPS, this system can be used in scanning mode to acquire gamma spectra while traversing a field. Using the acquired spectra, soil elemental content can be calculated. These data can be combined with geographical coordinates in a geographical information system (i.e., ArcGIS) to produce elemental distribution maps suitable for agricultural purposes. Special software has been developed that will acquire gamma spectra, process and sort data, calculate soil elemental content, and combine these data with measured geographic coordinates to create soil elemental distribution maps. For example, 5.5 hours was needed to acquire the data necessary for creating a carbon distribution map of an 8.5 ha field. This paper will briefly describe the physics behind the neutron-gamma analysis method, the physical construction of the measurement system, and its main characteristics and modes of operation when conducting field surveys.
Soil elemental distribution maps resulting from field surveys will be presented and discussed. Comparison of these maps with maps created on the basis of chemical analysis, and with soil moisture measurements determined by soil electrical conductivity, showed similar results. The maps created by neutron-gamma analysis were reproducible as well. Based on these facts, it can be asserted that neutron-stimulated soil gamma spectroscopy paired with a GPS system is fully applicable for elemental mapping of agricultural fields.

Keywords: ArcGIS mapping, neutron gamma analysis, soil elemental content, soil gamma spectroscopy

Procedia PDF Downloads 130
8636 Measuring of the Volume Ratio of Two Immiscible Liquids Using Electrical Impedance Tomography

Authors: Jiri Primas, Michal Malik, Darina Jasikova, Michal Kotek, Vaclav Kopecky

Abstract:

The authors of this paper discuss the measurement of the volume ratio of two immiscible liquids in a homogeneous mixture using the industrial Electrical Impedance Tomography (EIT) system ITS p2+. In the first part of the paper, the principle of EIT and the basic theory of the conductivity of a mixture of two components are stated. In the next part, an experiment with water and olive oil mixed with a Rushton turbine is described, and the measured results are used to verify the theory. In the conclusion, the results are discussed in detail, and the accuracy of the measuring method and its advantages are also mentioned.
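A standard starting point for the conductivity of such a mixture is Maxwell's relation for non-conducting spheres (here, oil droplets) dispersed in a conducting continuous phase (water). Whether the paper uses exactly this relation is not stated, so the sketch below should be read as illustrative: measuring the mixture conductivity and inverting the relation yields the oil volume fraction.

```python
def maxwell_mixture_conductivity(sigma_c, phi):
    """Maxwell's relation for non-conducting spheres dispersed at volume
    fraction phi in a continuous phase of conductivity sigma_c:
    sigma_mix = sigma_c * 2(1 - phi) / (2 + phi)."""
    return sigma_c * 2.0 * (1.0 - phi) / (2.0 + phi)

def oil_fraction_from_conductivity(sigma_mix, sigma_c):
    """Invert Maxwell's relation: estimate the dispersed-phase (oil)
    volume fraction from the measured mixture conductivity."""
    k = sigma_mix / sigma_c
    return 2.0 * (1.0 - k) / (2.0 + k)

# Hypothetical numbers: water at 5.0 mS/cm, 20 % oil by volume
sigma_mix = maxwell_mixture_conductivity(5.0, 0.2)
phi_hat = oil_fraction_from_conductivity(sigma_mix, 5.0)
```

In an EIT setting, each reconstructed pixel conductivity can be converted to a local volume fraction this way, and averaging over the tomogram gives the overall volume ratio.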

Keywords: conductivity, electrical impedance tomography, homogenous mixture, mixing process

Procedia PDF Downloads 395
8635 A Heteroskedasticity Robust Test for Contemporaneous Correlation in Dynamic Panel Data Models

Authors: Andreea Halunga, Chris D. Orme, Takashi Yamagata

Abstract:

This paper proposes a heteroskedasticity-robust Breusch-Pagan test of the null hypothesis of zero cross-section (or contemporaneous) correlation in linear panel-data models, without necessarily assuming independence of the cross-sections. The procedure allows for either fixed, strictly exogenous and/or lagged dependent regressor variables, as well as quite general forms of both non-normality and heteroskedasticity in the error distribution. The asymptotic validity of the test procedure is predicated on the number of time series observations, T, being large relative to the number of cross-section units, N, in that: (i) either N is fixed as T→∞; or, (ii) N²/T→0, as both T and N diverge, jointly, to infinity. Given this, it is not expected that asymptotic theory would provide an adequate guide to finite sample performance when T/N is "small". Because of this, we also propose, and establish the asymptotic validity of, a number of wild bootstrap schemes designed to provide improved inference when T/N is small. Across a variety of experimental designs, a Monte Carlo study suggests that the predictions from asymptotic theory do, in fact, provide a good guide to the finite sample behaviour of the test when T is large relative to N. However, when T and N are of similar orders of magnitude, discrepancies between the nominal and empirical significance levels occur, as predicted by the first-order asymptotic analysis. On the other hand, for all the experimental designs, the proposed wild bootstrap approximations do improve agreement between nominal and empirical significance levels when T/N is small, with a recursive-design wild bootstrap scheme performing best, in general, and providing quite close agreement between the nominal and empirical significance levels of the test even when T and N are of similar size.
Moreover, in comparison with the wild bootstrap version of the original Breusch-Pagan test, our experiments indicate that the corresponding version of the heteroskedasticity-robust Breusch-Pagan test appears reliable. As an illustration, the proposed tests are applied to a dynamic growth model for a panel of 20 OECD countries.
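The core wild bootstrap idea, multiplying each residual by an independent Rademacher weight (+1 or -1 with probability 1/2) so that each observation keeps its own variance under heteroskedasticity, can be sketched in a few lines. The example below applies it to a simple one-sample t statistic on simulated heteroskedastic data, not to the paper's panel Breusch-Pagan statistic or its recursive design.

```python
import math, random

def t_stat(x):
    """One-sample t statistic for H0: mean = 0."""
    n = len(x)
    m = sum(x) / n
    s2 = sum((v - m) ** 2 for v in x) / (n - 1)
    return m / math.sqrt(s2 / n)

def wild_bootstrap_pvalue(e, stat, n_boot=999, rng=random):
    """Two-sided wild-bootstrap p-value: resample by flipping the sign of
    each observation with probability 1/2 (Rademacher weights), which
    preserves each observation's own variance under heteroskedasticity."""
    t_obs = abs(stat(e))
    exceed = 0
    for _ in range(n_boot):
        e_star = [v if rng.random() < 0.5 else -v for v in e]
        if abs(stat(e_star)) >= t_obs:
            exceed += 1
    return (exceed + 1) / (n_boot + 1)
```

Because the bootstrap distribution is built from the observed data themselves, its critical values adapt to the actual heteroskedasticity pattern, which is why such schemes improve on first-order asymptotic critical values when T/N is small.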

Keywords: cross-section correlation, time-series heteroskedasticity, dynamic panel data, heteroskedasticity robust Breusch-Pagan test

Procedia PDF Downloads 424