Search results for: invariant moments
418 Hyperelastic Formulation for Orthotropic Materials
Authors: Daniel O'Shea, Mario M. Attard, David C. Kellermann
Abstract:
In this paper, we propose a hyperelastic strain energy function that maps isotropic hyperelastic constitutive laws to orthotropic materials without the use of structural tensors, fiber vectors of any kind, or the standard invariants. In particular, we focus on the neo-Hookean class of models and represent them in an invariant-free formulation. To achieve this, we revise the invariant-free formulation of isotropic hyperelasticity. The formulation uses quadruple contractions between fourth-order tensors rather than products of scalar invariants. We also propose a new decomposition of the orthotropic Hookean stiffness tensor into two fourth-order Lamé tensors that collapse down to the classic Lamé parameters for isotropic continua. The resulting orthotropic hyperelastic model naturally retains all of the advanced properties of its isotropic counterparts and likewise collapses back to its isotropic form by nothing more than equality of parameters in all directions (isotropy). Comparisons are made with large-strain experimental results for transversely isotropic rubber-type materials under tension.
Keywords: finite strain, hyperelastic, invariants, orthotropic
Procedia PDF Downloads 446
417 On Modeling Data Sets by Means of a Modified Saddlepoint Approximation
Authors: Serge B. Provost, Yishan Zhang
Abstract:
A moment-based adjustment to the saddlepoint approximation is introduced in the context of density estimation. First applied to univariate distributions, this methodology is extended to the bivariate case. It then entails estimating the density function associated with each marginal distribution by means of the saddlepoint approximation and applying a bivariate adjustment to the product of the resulting density estimates. The connection to the distribution of empirical copulas will be pointed out. As well, a novel approach is proposed for estimating the support of a distribution. As these results rely solely on sample moments and empirical cumulant-generating functions, they are particularly well suited for modeling massive data sets. Several illustrative applications will be presented.
Keywords: empirical cumulant-generating function, endpoints identification, saddlepoint approximation, sample moments, density estimation
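As a concrete illustration of the univariate building block, here is a minimal numpy sketch of a saddlepoint density estimate driven entirely by the empirical cumulant-generating function of a sample. The bivariate adjustment and the support estimation described above are beyond this sketch, and the sample itself is simulated.

```python
import numpy as np

def empirical_saddlepoint_density(sample, x, newton_steps=50):
    """Saddlepoint density estimate at x from the empirical CGF.

    K(t)  = log( mean(exp(t * X)) )          (empirical CGF)
    Solve K'(s) = x for the saddlepoint s, then
    f(x) ~= exp(K(s) - s*x) / sqrt(2*pi*K''(s)).
    """
    sample = np.asarray(sample, dtype=float)

    def K_derivs(t):
        w = np.exp(t * sample)
        m0 = w.mean()
        m1 = (sample * w).mean()
        m2 = (sample**2 * w).mean()
        K1 = m1 / m0                      # K'(t)
        K2 = m2 / m0 - K1**2              # K''(t), positive
        return np.log(m0), K1, K2

    s = 0.0
    for _ in range(newton_steps):         # Newton's method for K'(s) = x
        _, K1, K2 = K_derivs(s)
        s -= (K1 - x) / K2
    K0, _, K2 = K_derivs(s)
    return np.exp(K0 - s * x) / np.sqrt(2 * np.pi * K2)

rng = np.random.default_rng(0)
sample = rng.standard_normal(20000)
f0 = empirical_saddlepoint_density(sample, 0.0)
```

For a standard normal sample the estimate at zero lands near the true density 1/sqrt(2*pi), roughly 0.399, since only sample moments enter the computation.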
Procedia PDF Downloads 162
416 Classifications of Images for the Recognition of People’s Behaviors by SIFT and SVM
Authors: Henni Sid Ahmed, Belbachir Mohamed Faouzi, Jean Caelen
Abstract:
Behavior recognition has been studied for realizing driver-assistance systems and automated navigation, and is an important field of study in intelligent buildings. In this paper, a method for recognizing behavior from real images was studied. Images were divided into several categories according to the actual weather, distance, angle of view, etc. SIFT (Scale Invariant Feature Transform) was first used to detect and describe key points, because SIFT features are invariant to image scale and rotation and robust to changes in viewpoint and illumination. Our goal is to develop a robust and reliable system composed of two fixed cameras in every room of an intelligent building, connected to a computer for the acquisition of video sequences. Using these video sequences as inputs, SIFT represents the different images, and SVM Light (a support vector machine implementation) is used as the classification tool, in order to classify people’s behaviors in the intelligent building and thereby provide maximum comfort with optimized energy consumption.
Keywords: video analysis, people behavior, intelligent building, classification
Procedia PDF Downloads 378
415 From Transference Love to Self Alienation in the Therapeutic Relationship: A Case Study
Authors: Efi Koutantou
Abstract:
The foundation of successful therapy is the bond between the psychotherapist and the patient, psychoanalysis would argue. The present study explores lived experiences of a psychotherapeutic relationship at different moments, initial and final, with special reference to the transference love developed through the process. The fight between moments of 'leaving a self' behind and following 'lines of flight' in the process of creating a new subjectivity and 'becoming-other' will be explored. Moments of de-territorialisation (surpassing given constraints such as gender, family, religion, and kinship bonds), freeing the space in favor of re-territorialisation (the creation of oneself), will also be analyzed. The generation of new possibilities of being, new ways of self-actualization for this patient, will be discussed. The second part of this study will explore the extent to which this 'transference love' results for this specific patient in becoming 'the discourse of the other'; it is a desideratum whether the patient finally becomes a subject of his/her own through his/her own self-exploration of new possibilities of existence or becomes alienated within the thought of the therapist. The way in which the patient uses or is (ab)used by the transference love in order to experience and undergo alienation from an 'authority', which may or may not sacrifice his/her own thought in favor of satisfying the therapist, will be investigated. Finally, from an observer's perspective and from the analysis of the results of this therapeutic relationship, the counter-transference will also be analyzed, in terms of an attempt of the analyst to relive and satisfy his/her own desires through the life of the analysand.
The ascension and fall of an idealized self will be analyzed; the turn of transference love into 'hate' will conclude this case study through a lived experience in the therapeutic procedure: a relationship which can be called a mixture of a real relationship and remnants of a past object relationship.
Keywords: alienation, authority, counter-transference, hate, transference love
Procedia PDF Downloads 211
414 Determining the Octanol-Water Partition Coefficient for Armchair Polyhex BN Nanotubes Using Topological Indices
Authors: Esmat Mohammadinasab
Abstract:
The aim of this paper is to investigate theoretically and establish a predictive model for the determination of the LogP of armchair polyhex BN nanotubes by using simple descriptors. The relationship between the octanol-water partition coefficient (LogP) and quantum chemical descriptors, electric moments, and topological indices of some armchair polyhex BN nanotubes with various lengths and fixed circumference is represented. The electric moments and physico-chemical properties of those nanotubes are calculated based on density functional theory (DFT). The DFT calculations were performed with Becke's three-parameter formulation and the Lee-Yang-Parr functional (B3LYP) using the 3-21G standard basis set. For the first time, the relationship between the partition coefficient and different properties of polyhex BN nanotubes is investigated.
Keywords: topological indices, quantum descriptors, DFT method, nanotubes
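The predictive model sought here is essentially a regression of LogP on the computed descriptors. Below is a minimal least-squares sketch with entirely hypothetical descriptor and LogP values; in the paper the descriptor values come from the DFT calculations and topological indices.

```python
import numpy as np

# Hypothetical descriptor matrix: each row is one nanotube, columns are
# illustrative descriptors (e.g., a topological index and a dipole moment).
# Values are made up for illustration, not taken from the paper.
X = np.array([[10.0, 1.2],
              [12.0, 1.5],
              [14.0, 1.9],
              [16.0, 2.1],
              [18.0, 2.6]])
logp = np.array([2.1, 2.5, 2.9, 3.3, 3.7])   # hypothetical LogP values

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, logp, rcond=None)

# Goodness of fit of the linear QSPR model.
pred = A @ coef
r2 = 1 - ((logp - pred) ** 2).sum() / ((logp - logp.mean()) ** 2).sum()
```

With real data, the coefficient of determination r2 would quantify how well the chosen descriptors predict LogP.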
Procedia PDF Downloads 335
413 The Effect of "Trait" Variance of Personality on Depression: Application of the Trait-State-Occasion Modeling
Authors: Pei-Chen Wu
Abstract:
Both preexisting cross-sectional and longitudinal studies of the personality-depression relationship have suffered from one main limitation: they ignored that the stability of the constructs of interest (e.g., personality and depression) can be expected to influence the estimate of the association between personality and depression. To address this limitation, Trait-State-Occasion (TSO) modeling was adopted to analyze the sources of variance of the focal constructs. A TSO model operates by partitioning a state variance into time-invariant (trait) and time-variant (occasion) components. Within a TSO framework, it is possible to predict change in the part of the construct that really changes (i.e., the time-variant variance) while controlling for the trait variances. 750 high school students were followed for 4 waves over six-month intervals. The baseline data (T1) were collected from senior high schools (students aged 14 to 15 years). Participants were given the Beck Depression Inventory and the Big Five Inventory at each assessment. TSO modeling revealed that 70-78% of the variance in personality (five constructs) was stable over the follow-up period, whereas 57-61% of the variance in depression was stable. For the personality constructs, 7.6% to 8.4% of the total variance came from the autoregressive occasion factors; for the depression construct, 15.2% to 18.1% of the total variance came from the autoregressive occasion factors. Additionally, results showed that when controlling for initial symptom severity, the time-invariant components of all five dimensions of personality were predictive of change in depression (Extraversion: B = .32, Openness: B = -.21, Agreeableness: B = -.27, Conscientiousness: B = -.36, Neuroticism: B = .39). Because the five dimensions of personality share some variance, models in which all five dimensions simultaneously predict change in depression were also investigated.
The time-invariant components of the five dimensions were still significant predictors of change in depression (Extraversion: B = .30, Openness: B = -.24, Agreeableness: B = -.28, Conscientiousness: B = -.35, Neuroticism: B = .42). In sum, the majority of the variability in personality was stable over 2 years. Individuals with a greater tendency toward Extraversion and Neuroticism have higher degrees of depression; individuals with a greater tendency toward Openness, Agreeableness, and Conscientiousness have lower degrees of depression.
Keywords: assessment, depression, personality, trait-state-occasion model
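The trait/occasion variance split that a TSO model estimates with latent factors can be illustrated with a small simulation: for scores generated as a time-invariant trait plus independent wave-specific occasion noise, the stable share of variance is recovered from the average cross-wave covariance. The variance values below are illustrative, not the paper's estimates, and the simulation omits the autoregressive occasion structure.

```python
import numpy as np

rng = np.random.default_rng(1)
n, waves = 750, 4
trait_var, occasion_var = 0.7, 0.3    # illustrative, not the paper's values

trait = rng.normal(0, np.sqrt(trait_var), size=(n, 1))            # time-invariant
occasion = rng.normal(0, np.sqrt(occasion_var), size=(n, waves))  # time-variant
scores = trait + occasion                                         # observed waves

# The stable (trait) share of variance is the average cross-wave covariance
# divided by the average within-wave variance -- the logic a TSO model
# formalizes with latent trait and occasion factors.
cov = np.cov(scores, rowvar=False)
off_diag = cov[~np.eye(waves, dtype=bool)].mean()
trait_share = off_diag / np.diag(cov).mean()
```

With the illustrative variances above, about 70% of each wave's variance is stable, which the cross-wave covariance recovers.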
Procedia PDF Downloads 177
412 A Speeded up Robust Scale-Invariant Feature Transform Currency Recognition Algorithm
Authors: Daliyah S. Aljutaili, Redna A. Almutlaq, Suha A. Alharbi, Dina M. Ibrahim
Abstract:
All currencies around the world look very different from each other; for instance, the size, color, and pattern of the paper differ. With the development of modern banking services, automatic methods for paper currency recognition have become important in many applications, such as vending machines. One of the phases of a currency recognition architecture is feature detection and description. Many algorithms are used for this phase, but they still have some disadvantages. This paper proposes a feature detection algorithm that merges the advantages of the current SIFT and SURF algorithms, which we call the Speeded up Robust Scale-Invariant Feature Transform (SR-SIFT) algorithm. Our proposed SR-SIFT algorithm overcomes the problems of both the SIFT and SURF algorithms. The proposed algorithm aims to speed up SIFT feature detection while keeping it robust. Simulation results demonstrate that the proposed SR-SIFT algorithm decreases the average response time, especially for small and minimum numbers of best key points, and increases the distribution of the number of best key points on the surface of the currency. Furthermore, the proposed algorithm increases the accuracy of the true best point distribution inside the currency edge compared with the other two algorithms.
Keywords: currency recognition, feature detection and description, SIFT algorithm, SURF algorithm, speeded up and robust features
Procedia PDF Downloads 235
411 Aspects Concerning Flame Propagation of Various Fuels in Combustion Chamber of Four Valve Engines
Authors: Zoran Jovanovic, Zoran Masonicic, S. Dragutinovic, Z. Sakota
Abstract:
In this paper, results concerning the flame propagation of various fuels in a particular combustion chamber with four tilted valves are elucidated. Flame propagation was represented by the evolution of the spatial distribution of temperature in various cut-planes within the combustion chamber, while the flame front location was determined by means of the zones with maximum temperature gradient. The results presented are only a small part of a broader on-going scrutiny in the field of multidimensional modeling of reactive flows in combustion chambers with complicated geometries, encompassing various models of turbulence, different fuels, and combustion models. Two different turbulence models were applied, i.e., the standard k-ε model and the k-ξ-f model. In this paper, flame propagation results were analyzed and presented for two different hydrocarbon fuels, CH4 and C8H18. In the case of combustion, all differences ensuing from the different turbulence models, obvious for non-reactive flows, are annihilated entirely. The interplay between the fluid flow pattern and flame propagation is invariant with regard to both the turbulence models and the fuels applied, indicating that the flame propagation through the unburned mixture of CH4 and C8H18 fuels is not chemically controlled.
Keywords: automotive flows, flame propagation, combustion modelling, CNG
Procedia PDF Downloads 292
410 The Probability Foundation of Fundamental Theoretical Physics
Authors: Quznetsov Gunn
Abstract:
In the study of the logical foundations of probability theory, it was found that the terms and equations of fundamental theoretical physics represent terms and theorems of classical probability theory; more precisely, of that part of the theory which considers the probability of dot events in 3 + 1 space-time. In particular, masses, moments, energies, spins, etc. turn out to be parameters of probability distributions of such events. The terms and equations of the electroweak and quark-gluon theories turn out to be theoretical-probabilistic terms and theorems. Here the relation of a neutrino to its lepton becomes clear, the W and Z boson masses turn out to be dynamic, and the cause of the asymmetry between particles and antiparticles is the impossibility of the birth of single antiparticles. In addition, phenomena such as confinement and asymptotic freedom receive a probabilistic explanation. And here we have the logical foundations of the theory of gravity with the phenomena of dark energy and dark matter.
Keywords: classical theory of probability, logical foundation of fundamental theoretical physics, masses, moments, energies, spins
Procedia PDF Downloads 296
409 Refined Procedures for Second Order Asymptotic Theory
Authors: Gubhinder Kundhi, Paul Rilstone
Abstract:
Refined procedures for higher-order asymptotic theory for non-linear models are developed. These include a new method for deriving stochastic expansions of arbitrary order, new methods for evaluating the moments of polynomials of sample averages, and a new method for deriving the approximate moments of the stochastic expansions; an application of these techniques to obtaining improved inferences under the weak instruments problem is considered. It is well established that Instrumental Variable (IV) estimators in the presence of weak instruments can be poorly behaved; in particular, they can be quite biased in finite samples. In our application, finite sample approximations to the distributions of these estimators are obtained using Edgeworth and saddlepoint expansions. Departures from normality of the distributions of these estimators are analyzed using higher-order analytical corrections in these expansions. In a Monte Carlo experiment, the performance of these expansions is compared to the first-order approximation and other methods commonly used in finite samples, such as the bootstrap.
Keywords: Edgeworth expansions, higher order asymptotics, saddlepoint expansions, weak instruments
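As a small illustration of the kind of higher-order correction used here, the following sketch implements the standard one-term Edgeworth refinement of the normal approximation to the CDF of a standardized sample mean; in practice the skewness argument would come from the estimator's approximate moments.

```python
import math

def edgeworth_cdf(x, n, skew):
    """One-term Edgeworth approximation to the CDF of a standardized
    sample mean of n observations with population skewness `skew`:

        F(x) ~= Phi(x) - phi(x) * skew * (x**2 - 1) / (6 * sqrt(n))

    With skew = 0 this reduces to the first-order normal approximation.
    """
    phi = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)   # normal pdf
    Phi = 0.5 * (1 + math.erf(x / math.sqrt(2)))          # normal cdf
    return Phi - phi * skew * (x * x - 1) / (6 * math.sqrt(n))
```

For a positively skewed population, the correction shifts probability mass relative to the plain normal approximation, which is how the departures from normality are analyzed above.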
Procedia PDF Downloads 277
408 Conformal Invariance and F(R,T) Gravity
Authors: P. Y. Tsyba, O. V. Razina, E. Güdekli, R. Myrzakulov
Abstract:
In this paper, we consider the equations of motion for F(R,T) gravity with regard to their conformal invariance. It is shown that in the general case such a theory is not conformally invariant. Special cases for the functions v and u, in which the properties of the theory can appear, were studied.
Keywords: conformal invariance, gravity, space-time, metric
Procedia PDF Downloads 663
407 Defining the Turbulent Coefficients with the Effect of Atmospheric Stability in the Wake of a Wind Turbine
Authors: Mohammad A. Sazzad, Md M. Alam
Abstract:
Wind energy is one of the cleanest forms of renewable energy. Although the wind industry is growing faster than ever, there are some roadblocks to improvement. One of the difficulties the industry faces is insufficient knowledge about wakes within wind farms. Energy is generated in the lowest layer of the atmospheric boundary layer (ABL). The interaction between the wind turbine (WT) blades and the wind introduces a low-speed wind region, which is defined as the wake. This wake region shows different characteristics under each stability condition of the ABL. It is therefore fundamental to understand this wake region, which is defined mainly by turbulence transport and wake shear. Defining the wake recovery length and width is crucial for a wind farm to optimize generation and reduce the waste of power to the grid. Therefore, in order to obtain the turbulent coefficients of velocity and length, this research focused on large eddy simulation (LES) data for a neutral ABL (NABL). According to turbulence theory, if we can express the velocity defect and Reynolds stress in terms of local length and velocity scales, they become invariant. In our study, the velocity and length coefficients are 0.4867 and 0.4794, respectively, which is close to the theoretical value of 0.5 for the NABL. Some profiles deviate from invariance: because of the presence of thermal and wind shear, the power coefficients varied a little from the ideal condition.
Keywords: atmospheric boundary layer, renewable energy, turbulent coefficient, wind turbine, wake
Procedia PDF Downloads 132
406 Implementation of a Multimodal Biometrics Recognition System with Combined Palm Print and Iris Features
Authors: Rabab M. Ramadan, Elaraby A. Elgallad
Abstract:
With extensive application, the performance of unimodal biometric systems has to face a diversity of problems such as signal and background noise, distortion, and environment differences. Therefore, multimodal biometric systems have been proposed to solve the above-stated problems. This paper introduces a bimodal biometric recognition system based on features extracted from the human palm print and iris. Palm print biometrics is a fairly new, evolving technology that is used to identify people by their palm features. The iris is a strong competitor, together with the face and fingerprints, for presence in multimodal recognition systems. In this research, we introduce an algorithm for combining the palm- and iris-extracted features using a texture-based descriptor, the Scale Invariant Feature Transform (SIFT). Since the feature sets are non-homogeneous, as features of different biometric modalities are used, these features are concatenated to form a single feature vector. Particle swarm optimization (PSO) is used as a feature selection technique to reduce the dimensionality of the features. The proposed algorithm is applied to the Indian Institute of Technology Delhi (IITD) database, and its performance is compared with various iris recognition algorithms found in the literature.
Keywords: iris recognition, particle swarm optimization, feature extraction, feature selection, palm print, the Scale Invariant Feature Transform (SIFT)
Procedia PDF Downloads 235
405 Review for Mechanical Tests of Corner Joints on Wooden Windows and Effects to the Stiffness
Authors: Milan Podlena, Stepan Hysek, Jiri Prochazka, Martin Bohm, Jan Bomba
Abstract:
Corner joints are the weakest part of windows, where the members are connected together. As window dimensions have become bigger, the strength requirements for corner joints have increased as well. Therefore, the aim of this study was to test samples of corner joints of wooden windows. The moisture content of the test specimens was stabilized in a climate chamber. After conditioning, the test specimens were loaded under laboratory conditions on a universal testing machine, and the failure load was measured. The data were recalculated, using goniometric, bending moment, and stiffness equations, into stiffness coefficients, and the bending moments were investigated. The results showed a difference between the mortise-and-tenon joint and the dowel joint. This difference was explained by the varied adhesive bond area, which is related to the dimensions of the dowels (diameter and length). The bending moments and stiffness were (apart from the type of corner joint) also affected by the type of adhesive used, the type of dowels, and the wood species.
Keywords: corner joint, wooden window, bending moment, stiffness
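The recalculation from failure load to bending moment and stiffness can be sketched as follows. The exact goniometric relation depends on the test rig; this sketch assumes a diagonal-compression setup with the load acting at 45 degrees to the joint members, and all numbers are hypothetical, not the study's measurements.

```python
import math

def corner_joint_moment_and_stiffness(load_N, arm_m, rotation_rad):
    """Bending moment at the corner joint and a stiffness coefficient.

    Assumes a diagonal-compression setup where the load acts at 45 degrees
    to the joint members, so only the cos(45 deg) component bends the joint.
    The real goniometric relation is rig-specific; this is illustrative.
    """
    moment = load_N * math.cos(math.radians(45)) * arm_m   # N*m
    stiffness = moment / rotation_rad                      # N*m per radian
    return moment, stiffness

# Hypothetical test result: 2 kN failure load, 0.25 m lever arm,
# 0.02 rad measured rotation at that load.
M, k = corner_joint_moment_and_stiffness(2000.0, 0.25, 0.02)
```

Comparing such stiffness coefficients across joint types is what reveals the mortise-and-tenon versus dowel difference reported above.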
Procedia PDF Downloads 219
404 First Cracking Moments of Hybrid Fiber Reinforced Polymer-Steel Reinforced Concrete Beams
Authors: Saruhan Kartal, Ilker Kalkan
Abstract:
The present paper reports the cracking moment estimates of a set of steel-reinforced, Fiber Reinforced Polymer (FRP)-reinforced, and hybrid steel-FRP reinforced concrete beams, calculated from different analytical formulations in the codes, together with the experimental cracking load values. A total of three steel-reinforced, four FRP-reinforced, 12 hybrid FRP-steel over-reinforced, and five hybrid FRP-steel under-reinforced concrete beam tests were analyzed within the scope of the study. Glass FRP (GFRP) and Basalt FRP (BFRP) bars were used as the FRP reinforcement. In the under-reinforced hybrid beams, rupture of the FRP bars preceded crushing of the concrete, while concrete crushing preceded FRP rupture in the over-reinforced beams. In both types, steel yielding took place long before FRP rupture and concrete crushing. The cracking moment mainly depends on two quantities, namely the moment of inertia of the section at the initiation of cracking and the flexural tensile strength of concrete, i.e., the modulus of rupture. In the present study, two different definitions of the uncracked moment of inertia, i.e., the gross and the uncracked transformed moments of inertia, were adopted. Two analytical equations for the modulus of rupture (ACI 318M and Eurocode 2) were utilized in the calculations, as well as the experimental tensile strength of concrete from prismatic specimen tests. The ACI 318M modulus of rupture expression produced cracking moment estimates closer to the experimental cracking moments of the FRP-reinforced and hybrid FRP-steel reinforced concrete beams when used in combination with the uncracked transformed moment of inertia, yet the Eurocode 2 modulus of rupture expression gave more accurate cracking moment estimates for the steel-reinforced concrete beams.
All of the analytical definitions produced values considerably different from the experimental cracking loads of the solely FRP-reinforced concrete beam specimens.
Keywords: polymer reinforcement, four-point bending, hybrid use of reinforcement, cracking moment
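For reference, a minimal sketch of the ACI 318M-style cracking moment calculation for a rectangular section, using the gross moment of inertia (the simpler of the two definitions discussed above). The transformed-section variant would add the contribution of the reinforcing bars, and the section dimensions below are hypothetical.

```python
import math

def cracking_moment_kNm(b_mm, h_mm, fc_MPa):
    """Cracking moment of an uncracked rectangular section.

    Uses the gross moment of inertia and the ACI 318M modulus of rupture
    fr = 0.62 * sqrt(fc') for normal-weight concrete (lambda = 1).
    Reinforcement is ignored here; this is an illustrative sketch.
    """
    fr = 0.62 * math.sqrt(fc_MPa)      # modulus of rupture, MPa = N/mm^2
    Ig = b_mm * h_mm**3 / 12.0         # gross moment of inertia, mm^4
    yt = h_mm / 2.0                    # distance to extreme tension fiber, mm
    Mcr_Nmm = fr * Ig / yt             # Mcr = fr * I / yt
    return Mcr_Nmm / 1e6               # convert N*mm to kN*m

# Hypothetical 200 x 400 mm section, fc' = 30 MPa.
Mcr = cracking_moment_kNm(200, 400, 30)
```

For these dimensions the estimate comes out around 18 kN·m; swapping in the Eurocode 2 modulus of rupture or the transformed inertia changes only `fr` or `Ig` in the same formula.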
Procedia PDF Downloads 140
403 Synthesis of Balanced 3-RRR Planar Parallel Manipulators
Authors: Arakelian Vigen, Geng Jing, Le Baron Jean-Paul
Abstract:
The paper deals with the design of parallel manipulators with balanced inertia forces and moments. The balancing of the resultant of the inertia forces of 3-RRR planar parallel manipulators is carried out through mass redistribution and minimization of the acceleration of the centre of mass. The proposed balancing technique is achieved in two steps: first, an optimal redistribution of the masses of the input links is accomplished, which ensures the similarity of the end-effector trajectory and the trajectory of the manipulator's common centre of mass; then, optimal trajectory planning of the end-effector via a 'bang-bang' profile is carried out. In this way, minimization of the magnitude of the acceleration of the manipulator's centre of mass brings about a minimization of the shaking force. To minimize the resultant of the inertia moments (the shaking moment), active balancing via an inertia flywheel is applied. However, in this case, the active balancing is quite different from previous applications because it provides only a partial cancellation of the shaking moment, due to the incomplete balancing of the shaking force.
Keywords: dynamic balancing, inertia force minimization, inertia moment minimization, 3-RRR planar parallel manipulator
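The 'bang-bang' profile mentioned above, for a rest-to-rest move, switches between constant acceleration and equal deceleration at mid-motion, which bounds the peak acceleration of the centre of mass. A simple numerical sketch with illustrative parameters and plain Euler integration:

```python
def bang_bang_profile(distance, total_time, steps=1000):
    """Rest-to-rest 'bang-bang' move: accelerate at +a for the first half
    of the motion, decelerate at -a for the second half. For distance d
    covered in time T, the required acceleration is a = 4*d/T**2."""
    a = 4.0 * distance / total_time**2
    dt = total_time / steps
    x = v = 0.0
    for k in range(steps):
        acc = a if k < steps // 2 else -a   # switch sign at mid-motion
        v += acc * dt                       # semi-implicit Euler step
        x += v * dt
    return x, v                             # final position and velocity

x_end, v_end = bang_bang_profile(1.0, 2.0)  # 1 m move in 2 s
```

The profile reaches the target with zero final velocity, and its constant-magnitude acceleration is what makes it attractive for bounding the shaking force.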
Procedia PDF Downloads 462
402 Singular Perturbed Vector Field Method Applied to the Problem of Thermal Explosion of Polydisperse Fuel Spray
Authors: Ophir Nave
Abstract:
In our research, we present the concept of the singularly perturbed vector field (SPVF) method and its application to the thermal explosion of diesel spray combustion. Given a system of governing equations consisting of hidden multi-scale variables, the SPVF method transfers and decomposes such a system into fast and slow singularly perturbed subsystems (SPS). The SPVF method enables us to understand the complex system and simplify the calculations. Powerful analytical, numerical, and asymptotic methods (e.g., the method of integral (invariant) manifolds (MIM), the homotopy analysis method (HAM), etc.) can subsequently be applied to each subsystem. We compare the results obtained by the method of integral invariant manifolds and by SPVF applied to a spray droplet combustion model. The research deals with the development of an innovative method for extracting fast and slow variables in physical-mathematical models. The method we developed, called the singularly perturbed vector field, is based on a numerical algorithm for global quasi-linearization applied to a given physical model. The SPVF method has been applied successfully to combustion processes, and our results were compared to experimental results. The SPVF is a general numerical and asymptotic method that reveals the hierarchy (multi-scale structure) of a given system.
Keywords: polydisperse spray, model reduction, asymptotic analysis, multi-scale systems
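The fast/slow decomposition that SPVF automates can be illustrated on a linearized system, where a large gap in the magnitudes of the Jacobian eigenvalues separates fast (quickly decaying) from slow modes. The matrix below is illustrative, not taken from the spray combustion model.

```python
import numpy as np

# Jacobian of an illustrative linearized model. Eigenvalue magnitudes set
# the time scales; a large gap in the sorted spectrum marks the split
# between fast and slow subsystems that SPVF-style decompositions exploit.
J = np.diag([-200.0, -150.0, -2.0, -0.5])

lam = np.sort(np.abs(np.linalg.eigvals(J)))[::-1]   # magnitudes, descending

# Largest ratio between consecutive magnitudes locates the spectral gap.
ratios = lam[:-1] / lam[1:]
split = int(np.argmax(ratios)) + 1   # first `split` modes form the fast SPS
```

Here the two stiff modes (|lambda| = 200, 150) separate cleanly from the slow pair, so `split` identifies a two-dimensional fast subsystem.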
Procedia PDF Downloads 220
401 Effects of Umbilical Cord Clamping on Puppies Neonatal Vitality
Authors: Maria L. G. Lourenço, Keylla H. N. P. Pereira, Viviane Y. Hibaru, Fabiana F. Souza, Joao C. P. Ferreira, Simone B. Chiacchio, Luiz H. A. Machado
Abstract:
In veterinary medicine, the standard procedure during a caesarean section is clamping the umbilical cord immediately after birth. In human neonates, when the umbilical cord is kept intact after birth, blood continues to flow from the cord to the newborn, but this procedure may prove difficult in dogs due to the shorter umbilical cord and the number of newborns in the litter. However, with the umbilical cord kept intact, detachment of the placenta may allow the residual blood to flow to the neonate. This study compared the effects on neonatal vitality of clamping versus not clamping the umbilical cord of dogs born through cesarean section, assessed through Apgar and reflex scores. Fifty puppies delivered from 16 bitches were randomly allocated to receive clamping of the umbilical cord immediately (n=25) or no clamping until the onset of breathing (n=25). The neonates were assessed during the first five min of life and once again 10 min after the first assessment. The differences observed between the two moments were significant (p < 0.01) for both the Apgar and reflex scores. The differences observed between the groups (clamped vs. not clamped) were not significant for the Apgar score at the 1st moment (p=0.1), but were significant (p < 0.01) at the 2nd moment in favor of the unclamped group, as well as significant (p < 0.05) for the reflex score at both the 1st and 2nd moments, revealing higher neonatal vitality in the unclamped group. The differences observed between the moments (1st vs. 2nd) of each group were significant (p < 0.01), revealing higher neonatal vitality at the 2nd moment. In the unclamped group, after removing the neonates together with the umbilical cord and the placenta, we observed that the umbilical cords were full of blood at the time of birth and later became whitish and collapsed, demonstrating the blood transfer.
The results suggest that keeping the umbilical cord intact for at least three minutes after the onset of breathing is not detrimental and may contribute to increased neonatal vitality in puppies delivered by cesarean section.
Keywords: puppy vitality, newborn dog, cesarean section, Apgar score
Procedia PDF Downloads 153
400 Regionalization of IDF Curves with L-Moments for Storm Events
Authors: Noratiqah Mohd Ariff, Abdul Aziz Jemain, Mohd Aftar Abu Bakar
Abstract:
The construction of Intensity-Duration-Frequency (IDF) curves is one of the most common and useful tools for designing hydraulic structures and for providing a mathematical relationship between rainfall characteristics. IDF curves, especially those in Peninsular Malaysia, are often built using moving windows of rainfall. However, these windows do not represent actual rainfall events, since the duration of rainfall is usually prefixed. Hence, instead of using moving windows, this study aims to find regionalized distributions for IDF curves of extreme rainfall based on storm events. A homogeneity test is performed on the annual maxima of storm intensities to identify homogeneous regions of storms in Peninsular Malaysia. The L-moment method is then used to regionalize the Generalized Extreme Value (GEV) distribution of these annual maxima, and subsequently IDF curves are constructed using the regional distributions. The differences between the IDF curves obtained and the IDF curves found using at-site GEV distributions are observed through the computation of the coefficient of variation of the root mean square error, the mean percentage difference, and the coefficient of determination. The small differences imply that the construction of IDF curves can be simplified by finding a general probability distribution for each region. This will also help in constructing IDF curves for sites with no rainfall station.
Keywords: IDF curves, L-moments, regionalization, storm events
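The at-site step of the L-moment method computes sample L-moments from probability-weighted moments; the regional GEV fit is then built on these ratios. A minimal sketch of the sample L-moment step:

```python
import numpy as np

def sample_l_moments(x):
    """First three sample L-moments (l1, l2, l3) and the L-skewness ratio
    t3 = l3/l2, computed from the unbiased probability-weighted moments
    b0, b1, b2 of the ordered sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(n)                                       # ranks 0 .. n-1
    b0 = x.mean()
    b1 = np.sum(i / (n - 1) * x) / n
    b2 = np.sum(i * (i - 1) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0                       # location (mean)
    l2 = 2 * b1 - b0              # scale (L-scale)
    l3 = 6 * b2 - 6 * b1 + b0     # third L-moment
    return l1, l2, l3, l3 / l2

l1, l2, l3, t3 = sample_l_moments([1, 2, 3, 4, 5])
```

In a regional analysis, these L-moment ratios are averaged (weighted by record length) across the sites of a homogeneous region before fitting the regional GEV distribution.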
Procedia PDF Downloads 528
399 Contrasted Mean and Median Models in Egyptian Stock Markets
Authors: Mai A. Ibrahim, Mohammed El-Beltagy, Motaz Khorshid
Abstract:
Emerging market return distributions have shown significant departures from normality: they are characterized by fatter tails relative to the normal distribution and exhibit levels of skewness and kurtosis that constitute a significant departure from normality. Therefore, the classical Markowitz mean-variance framework is not applicable to emerging markets, since it assumes normally-distributed returns (with zero skewness and excess kurtosis) and a quadratic utility function. Moreover, Markowitz mean-variance analysis can be used in cases of moderate non-normality, where it still provides a good approximation of expected utility, but it may be ineffective under large departures from normality. Higher-moment models and median models have been suggested in the literature for asset allocation in this case. Higher-moment models were introduced to account for the insufficiency of describing a portfolio by only its first two moments, while the median model was introduced as a robust statistic that is less affected by outliers than the mean. Tail risk measures such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) have been introduced instead of variance to capture the effect of risk. In this research, higher-moment models, including Mean-Variance-Skewness (MVS) and Mean-Variance-Skewness-Kurtosis (MVSK), are formulated as single-objective non-linear programming (NLP) problems, and median models, including Median-Value-at-Risk (MedVaR) and Median-Mean Absolute Deviation (MedMAD), are formulated as single-objective mixed-integer linear programming (MILP) problems. The higher-moment models and median models are compared to some benchmark portfolios and tested on real financial data from the main Egyptian index, EGX30. The results show that all the median models outperform the higher-moment models, as they provide higher final wealth for the investor over the entire period of study.
In addition, the results have confirmed the inapplicability of the classical Markowitz Mean-Variance to the Egyptian stock market as it resulted in very low realized profits.Keywords: Egyptian stock exchange, emerging markets, higher moment models, median models, mixed-integer linear programming, non-linear programming
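As a hedged sketch of how an MVS model can be cast as a single-objective problem: the score below trades off the sample mean, variance, and skewness of the portfolio return series. The toy asset returns, the trade-off weights `lam` and `gam`, and the random-search solver are all illustrative assumptions, not the paper's exact NLP formulation or EGX30 data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy daily returns for 4 assets (rows = days); stands in for real EGX30 data.
R = rng.normal(0.0005, 0.01, size=(500, 4)) + rng.gamma(1.0, 0.002, size=(500, 4))

def mvs_utility(w, lam=3.0, gam=1.0):
    """Single-objective MVS score: mean - lam*variance + gam*skewness of the
    portfolio return series (lam, gam are illustrative trade-off weights)."""
    p = R @ w
    mu, sd = p.mean(), p.std()
    skew = ((p - mu) ** 3).mean() / sd**3
    return mu - lam * p.var() + gam * skew

# Random search over the simplex (long-only, fully invested portfolios),
# starting from the equal-weight portfolio.
best_w = np.ones(4) / 4
best_u = mvs_utility(best_w)
for w in rng.dirichlet(np.ones(4), size=5000):
    u = mvs_utility(w)
    if u > best_u:
        best_w, best_u = w, u

print(best_w.round(3))
```

A production formulation would hand the same objective to an NLP solver with the budget and no-short-selling constraints made explicit; the random search merely shows the objective at work.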
Procedia PDF Downloads 315
398 Creating Moments and Memories: An Evaluation of the Starlight 'Moments' Program for Palliative Children, Adolescents and Their Families
Authors: C. Treadgold, S. Sivaraman
Abstract:
The Starlight Children's Foundation (Starlight) is an Australian non-profit organisation that delivers programs, in partnership with health professionals, to support children, adolescents, and their families who are living with a serious illness. While supporting children and adolescents with life-limiting conditions has always been a feature of Starlight's work, providing a dedicated program, specifically targeting and meeting the needs of the paediatric palliative population, is a recent area of focus. Recognising the challenges in providing children’s palliative services, Starlight initiated a research and development project to better understand and meet the needs of this group. The aim was to create a program which enhances the wellbeing of children, adolescents, and their families receiving paediatric palliative care in their community through the provision of on-going, tailored, positive experiences or 'moments'. This paper will present the results of the formative evaluation of this unique program, highlighting the development processes and outcomes of the pilot. The pilot was designed using an innovation methodology, which included a number of research components. There was a strong belief that it needed to be delivered in partnership with a dedicated palliative care team, helping to ensure the best interests of the family were always represented. This resulted in Starlight collaborating with both the Victorian Paediatric Palliative Care Program (VPPCP) at the Royal Children's Hospital, Melbourne, and the Sydney Children's Hospital Network (SCHN) to pilot the 'Moments' program. As experts in 'positive disruption', with a long history of collaborating with health professionals, Starlight was well placed to deliver a program which helps children, adolescents, and their families to experience moments of joy, connection and achieve their own sense of accomplishment. 
Building on Starlight’s evidence-based approach and experience in creative service delivery, the program aims to use the power of 'positive disruption' to brighten the lives of this group and create important memories. The clinical and Starlight team members collaborate to ensure that the child and family are at the centre of the program. The design of each experience is specific to their needs and ensures the creation of positive memories and family connection. It aims for each moment to enhance quality of life. The partnership with the VPPCP and SCHN has allowed the program to reach families across metropolitan and regional locations. In late 2019, a formative evaluation of the pilot was conducted utilising both quantitative and qualitative methodologies to document both the delivery and the outcomes of the program. Central to the evaluation were the interviews conducted with both clinical teams and families in order to gain a comprehensive understanding of the impact of, and satisfaction with, the program. The findings, which will be shared in this presentation, provide practical insight into the delivery of the program, the key elements of its success with families, and areas which could benefit from additional research and focus. Stories and case studies from the pilot will be used to highlight the impact of the program and to discuss the opportunities, challenges, and learnings that emerged.
Keywords: children, families, memory making, pediatric palliative care, support
Procedia PDF Downloads 99
397 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches
Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez
Abstract:
Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code applies. Several methods can be used to estimate such reliability levels. Many of them require an explicit limit state function (LSF). When the LSF is not available as a closed-form expression, simulation techniques are often employed. Simulation methods are computationally intensive and time-consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM), or computational mechanics may be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) can be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, the resulting moments of the LSF must be fitted to a probability density function (PDF). In the present study, a very simple alternative is employed which allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained by Monte Carlo simulation (MCS) are included. To overcome the problem of fitting the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM with another classic reliability method (the first-order reliability method, FORM).
The results of the present study are in good agreement with those computed by MCS. Therefore, mixing the reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or when numerical schemes, the FEM, or computational mechanics are employed.
Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, Monte Carlo simulation
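A minimal sketch of the point estimate method the abstract relies on, in Rosenblueth's classical 2^n-point form for uncorrelated, symmetric inputs. The limit state g = R - S and its input statistics are assumed here for illustration; the paper instead evaluates g through a numerical bridge model, which is exactly where the PEM's few-evaluations property pays off.

```python
import itertools
import math
import random

# Hypothetical limit state g = R - S (capacity minus demand). The paper's
# bridges need FEM runs to evaluate g; PEM only needs g at 2^n points.
def g(r, s):
    return r - s

means, sds = [12.0, 8.0], [1.5, 2.0]  # assumed statistics of R and S

# Rosenblueth's 2^n point estimate method: evaluate g at every combination
# of mu_i +/- sigma_i, each combination weighted 1/2^n.
n = len(means)
pts = []
for signs in itertools.product((-1.0, 1.0), repeat=n):
    x = [m + sgn * sd for m, sd, sgn in zip(means, sds, signs)]
    pts.append(g(*x))
m1 = sum(pts) / 2**n
m2 = sum(p * p for p in pts) / 2**n
mean_g, std_g = m1, math.sqrt(m2 - m1 * m1)
beta = mean_g / std_g  # second-moment reliability index

# Crude Monte Carlo check with normally distributed inputs.
random.seed(1)
sims = [g(random.gauss(means[0], sds[0]), random.gauss(means[1], sds[1]))
        for _ in range(200_000)]
mc_mean = sum(sims) / len(sims)
print(round(mean_g, 3), round(std_g, 3), round(beta, 3), round(mc_mean, 2))
```

In the mixed approach described above, the PEM moments would then be fitted to a well-known distribution before evaluating the failure probability, rather than using beta directly.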
Procedia PDF Downloads 346
396 Frequency Analysis Using Multiple Parameter Probability Distributions for Rainfall to Determine Suitable Probability Distribution in Pakistan
Authors: Tasir Khan, Yejuan Wang
Abstract:
The study of extreme rainfall events is very important for flood management in river basins and for the design of water conservancy infrastructure. Evaluation of quantiles of annual maximum rainfall (AMRF) is required in different environmental fields, agricultural operations, renewable energy sources, climatology, and the design of different structures. Therefore, AMRF frequency analysis was performed at different stations in Pakistan. Multiple probability distributions, log-normal (LN), generalized extreme value (GEV), Gumbel (max), and Pearson type 3 (P3), were used to find the most appropriate distributions at the different stations. The L-moments method was used to estimate the distribution parameters. The Anderson-Darling, Kolmogorov-Smirnov, and chi-square tests showed that two distributions, namely Gumbel (max) and LN, were the most appropriate. The quantile estimate of a multi-parameter probability distribution describes extreme rainfall at a specific location and is therefore important for decision-makers and planners who design and construct different structures. This result indicates the consequences of these multi-parameter distributions for site studies, peak-flow prediction, and the design of hydrological maps. Therefore, this finding can support hydraulic structure design and flood management.
Keywords: RAMSE, multiple frequency analysis, annual maximum rainfall, L-moments
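The L-moments fitting step can be sketched for the Gumbel (max) case, using the standard relations l2 = scale * ln 2 and l1 = loc + 0.5772 * scale. The synthetic station record below is an assumption standing in for a real AMRF series.

```python
import math
import random

GAMMA_E = 0.5772156649015329  # Euler-Mascheroni constant

def sample_l_moments(x):
    """First two sample L-moments via unbiased probability-weighted moments."""
    x = sorted(x)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i / (n - 1)) * xi for i, xi in enumerate(x)) / n
    return b0, 2 * b1 - b0  # l1, l2

def fit_gumbel_lmom(x):
    """Gumbel (max) parameters from L-moments:
    l2 = scale * ln 2,  l1 = loc + GAMMA_E * scale."""
    l1, l2 = sample_l_moments(x)
    scale = l2 / math.log(2)
    return l1 - GAMMA_E * scale, scale

def gumbel_quantile(F, loc, scale):
    """Gumbel (max) quantile, e.g. F = 0.99 for the 100-year event."""
    return loc - scale * math.log(-math.log(F))

# Synthetic 200-year AMRF record (mm), drawn from Gumbel(loc=60, scale=15)
# by inverse transform; a real station series would be used in practice.
random.seed(42)
amrf = [60 - 15 * math.log(-math.log(random.random())) for _ in range(200)]
loc, scale = fit_gumbel_lmom(amrf)
print(round(loc, 1), round(scale, 1), round(gumbel_quantile(0.99, loc, scale), 1))
```

The same `sample_l_moments` output feeds the other candidate distributions (LN, GEV, P3) through their own L-moment relations, after which the goodness-of-fit tests select among them.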
Procedia PDF Downloads 82
395 Crack Propagation Effect at the Interface of a Composite Beam
Authors: Mezidi Amar
Abstract:
In this research work, crack propagation at the interface of a composite beam is considered. The behavior of composite beams (CB) is governed by a law relating tangential or normal efforts to inelastic propagation. Throughout this study, composite beams are classified either as composite beams with partial connection or as three-layer sandwich beams. These structural systems are governed by differential equations of the same nature for both in-plane and out-of-plane behavior. Multi-layer elements with partial connection are typically met in timber construction, where the elements are assembled by joining. The formalism for the in-plane and out-of-plane behavior of these composite beams is obtained, and results of engineering relevance and simple interpretation are proposed for simply supported composite beams of rectangular cross-section. An apparent analytical peculiarity, or paradox, exists in the bending behavior of elastic composite beams with interlayer slip, sandwich beams, or other similar problems subjected to boundary moments. For a fully composite beam subjected to end moments, the partial composite model renders a non-vanishing uniform value for the normal force in the individual sub-elements. The results obtained are similar to those for in-plane vibrations, both for composite beams and for sandwich beams, where eigenfrequencies increase with the related rigidity.
Keywords: composite beam, behaviour, interface, deflection, propagation
Procedia PDF Downloads 302
394 Relevance Feedback within CBIR Systems
Authors: Mawloud Mosbah, Bachir Boucheham
Abstract:
We present here the results of a comparative study of some techniques available in the literature related to the relevance feedback mechanism in the case of short-term learning. Only one of the methods considered here, the K-Nearest Neighbours algorithm (KNN), belongs to the data mining field, while the rest are related purely to information retrieval and fall under three major axes: query shifting, feature weighting, and optimization of the parameters of the similarity metric. As a contribution, and in addition to the comparative purpose, we propose a new version of the KNN algorithm, referred to as incremental KNN, which is distinct from the original version in that, besides the influence of the seeds, the rating of the actual target image is also influenced by the images already rated. The results presented here were obtained after experiments conducted on the Wang database for one iteration, utilizing colour moments in the RGB space. This compact descriptor, colour moments, is adequate for the efficiency required in interactive systems. The results obtained allow us to claim that the proposed algorithm yields good results; it even outperforms a wide range of techniques available in the literature.
Keywords: CBIR, category search, relevance feedback, query point movement, standard Rocchio’s formula, adaptive shifting query, feature weighting, original KNN, incremental KNN
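A small sketch of the colour-moments descriptor mentioned above, assuming the common nine-feature form (mean, standard deviation, and skewness per RGB channel) with an L1 similarity metric; the pixel data and the distance choice are illustrative, not the study's exact setup.

```python
import math

def colour_moments(pixels):
    """9-D colour-moments descriptor: per-RGB-channel mean, standard
    deviation, and signed cube root of the third central moment."""
    n = len(pixels)
    feats = []
    for c in range(3):
        vals = [p[c] for p in pixels]
        mean = sum(vals) / n
        sd = math.sqrt(sum((v - mean) ** 2 for v in vals) / n)
        third = sum((v - mean) ** 3 for v in vals) / n
        skew = math.copysign(abs(third) ** (1 / 3), third)  # cube-root form
        feats += [mean, sd, skew]
    return feats

def l1_distance(f, g):
    """City-block distance between two descriptors (one possible metric)."""
    return sum(abs(a - b) for a, b in zip(f, g))

# Tiny synthetic "images": lists of (R, G, B) pixels.
reddish = [(200, 30, 30), (220, 40, 20), (210, 35, 25), (205, 45, 15)]
bluish = [(30, 40, 200), (20, 50, 220), (25, 45, 210), (35, 30, 205)]
f_r, f_b = colour_moments(reddish), colour_moments(bluish)
print(l1_distance(f_r, f_b) > l1_distance(f_r, f_r))
```

With only nine numbers per image, ranking a database against a query is cheap, which is the efficiency property the abstract credits to this descriptor for interactive feedback loops.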
Procedia PDF Downloads 280
393 Lattice Twinning and Detwinning Processes in Phase Transformation in Shape Memory Alloys
Authors: Osman Adiguzel
Abstract:
Shape memory effect is a peculiar property exhibited by certain alloy systems, based on martensitic transformation, and shape memory properties are closely related to the microstructure of the material. The shape memory effect is linked with martensitic transformation, a solid-state phase transformation that occurs through the cooperative movement of atoms, by means of lattice invariant shears, on cooling from the high-temperature parent phase. Lattice twinning and detwinning can be considered elementary processes activated during the transformation. Thermally induced martensite occurs as martensite variants, in a self-accommodating manner, and consists of lattice twins; this martensite is called twinned martensite or multivariant martensite. Deformation of shape memory alloys in the martensitic state proceeds through martensite variant reorientation. The martensite variants turn into reoriented single variants with deformation, and the reorientation process has great importance for the shape memory behavior. Copper-based alloys exhibit this property in the metastable β-phase region, which has a DO3-type ordered lattice in the ternary case at high temperature; these structures transform martensitically into layered complex structures through a lattice twinning mechanism on cooling from the high-temperature parent phase region. The twinning occurs as martensite variants with lattice invariant shears in two opposite directions, <110>-type directions on the {110}-type plane of the austenite matrix. The lattice invariant shear is not uniform in copper-based ternary alloys and gives rise to the formation of unusual layered structures, such as 3R, 9R, or 18R, depending on the stacking sequences on the close-packed planes of the ordered lattice. The unit cell and periodicity are completed through 18 atomic layers in the case of the 18R structure. On the other hand, the deformed material recovers its original shape on heating above the austenite finish temperature.
Meanwhile, the material returns to the twinned martensite structure (thermally induced martensite) in the one-way (irreversible) shape memory effect on cooling below the martensite finish temperature, whereas it returns to the detwinned martensite structure (deformed martensite) in the two-way (reversible) shape memory effect. In short, the microstructural mechanisms responsible for the shape memory effect are the twinning and detwinning processes, as well as the martensitic transformation. In the present contribution, X-ray diffraction, transmission electron microscopy (TEM), and differential scanning calorimetry (DSC) studies were carried out on two copper-based ternary alloys, CuZnAl and CuAlMn.
Keywords: shape memory effect, martensitic transformation, twinning and detwinning, layered structures
Procedia PDF Downloads 428
392 Static and Dynamic Analysis of Hyperboloidal Helix Having Thin Walled Open and Close Sections
Authors: Merve Ermis, Murat Yılmaz, Nihal Eratlı, Mehmet H. Omurtag
Abstract:
The static and dynamic analyses of a hyperboloidal helix having closed and open square box sections are investigated via a mixed finite element formulation based on Timoshenko beam theory. The Frenet triad is taken as the local coordinate system for the helix geometry. The helix domain is discretized with two-noded curved elements, and linear shape functions are used. Each node of the curved element has 12 degrees of freedom, namely three translations, three rotations, two shear forces, one axial force, two bending moments, and one torque. Finite element matrices are derived using exact nodal values of curvature and arc length, interpolated linearly over the element's axial length. The torsional moments of inertia for the closed and open square box sections are obtained by finite element solution of the St. Venant torsion formulation. With the proposed method, the torsional rigidity of simply and multiply connected cross-sections can also be calculated in the same manner. The influence of the closed and open square box cross-sections on the static and dynamic analyses of the hyperboloidal helix is investigated. Benchmark problems are presented for the literature.
Keywords: hyperboloidal helix, squared cross section, thin walled cross section, torsional rigidity
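The contrast between the closed and open square box sections can be illustrated with the classical thin-walled torsion-constant formulas, as a hand-calculation benchmark; the paper's finite element St. Venant solution handles general cross-sections, whereas these formulas assume a uniform thin wall. The dimensions below are illustrative.

```python
# Classical thin-walled St. Venant torsion constants for a square box of
# midline side b and uniform wall thickness t (hand-calculation sketch,
# not the paper's FEM solution).

def j_closed_box(b, t):
    """Bredt's formula for a single closed cell: J = 4*Am^2 / (p/t),
    where Am is the area enclosed by the wall midline, p its perimeter."""
    area_m = b * b
    perimeter = 4.0 * b
    return 4.0 * area_m**2 * t / perimeter  # reduces to b**3 * t

def j_open_box(b, t):
    """Open (slit) thin-walled section: J = (1/3) * sum(length * t^3)."""
    return (1.0 / 3.0) * (4.0 * b) * t**3

b, t = 100.0, 5.0  # mm, illustrative dimensions
jc, jo = j_closed_box(b, t), j_open_box(b, t)
print(jc, jo, jc / jo)  # the closed cell is far stiffer in torsion
```

The ratio works out to (3/4)(b/t)^2, i.e. a factor of several hundred for realistic wall slenderness, which is why opening the box section changes the helix's static and dynamic response so strongly.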
Procedia PDF Downloads 377
391 Modelling Hydrological Time Series Using Wakeby Distribution
Authors: Ilaria Lucrezia Amerise
Abstract:
The statistical modelling of precipitation data for a given portion of territory is fundamental for monitoring climatic conditions and for Hydrogeological Management Plans (HMP). This modelling is rendered particularly complex by changes in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the five-parameter Wakeby distribution as a theoretical reference model. The number and quality of its parameters indicate that this distribution may be the appropriate choice for interpolating hydrological variables; moreover, the Wakeby is particularly suitable for describing phenomena with heavy tails. The proposed estimation methods for the Wakeby parameters are the same as those used for density functions with heavy tails. The commonly used procedure is the classic method of probability-weighted moments (PWM), although this has often shown difficulty of convergence, or convergence to an inappropriate parameter configuration. In this paper, we analyze the problem of likelihood estimation for a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation settings. The reasons for this lie in the sampling and asymptotic properties of the maximum likelihood estimators, which improve the estimates by providing indications of their variability and, therefore, of their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.
Keywords: generalized extreme values, likelihood estimation, precipitation data, Wakeby distribution
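Since the Wakeby distribution is defined only through its quantile function, a sketch of that function and of inverse-transform sampling may help fix ideas. The parameter values below are illustrative assumptions, not fitted to any precipitation series.

```python
import random

def wakeby_quantile(F, xi, alpha, beta, gamma, delta):
    """Five-parameter Wakeby quantile function:
    x(F) = xi + (alpha/beta)*(1 - (1-F)**beta)
              - (gamma/delta)*(1 - (1-F)**(-delta)).
    delta > 0 produces the heavy upper tail the abstract refers to."""
    return (xi
            + (alpha / beta) * (1.0 - (1.0 - F) ** beta)
            - (gamma / delta) * (1.0 - (1.0 - F) ** (-delta)))

# With no closed-form density, simulation is by inverse transform:
# draw u ~ U(0,1) and evaluate x(u). Illustrative parameters only.
params = dict(xi=0.0, alpha=5.0, beta=1.5, gamma=1.0, delta=0.2)
random.seed(7)
sample = [wakeby_quantile(random.random(), **params) for _ in range(10_000)]
print(round(min(sample), 2), round(max(sample), 2))
```

The absence of an explicit density is precisely why likelihood evaluation for the Wakeby is harder than usual: the density at an observation must be recovered through the quantile function, f(x) = 1 / Q'(F(x)).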
Procedia PDF Downloads 139
390 One vs. Rest and Error Correcting Output Codes Principled Rebalancing Schemes for Solving Imbalanced Multiclass Problems
Authors: Alvaro Callejas-Ramos, Lorena Alvarez-Perez, Alexander Benitez-Buenache, Anibal R. Figueiras-Vidal
Abstract:
This contribution presents a promising formulation which allows one to extend the principled binary rebalancing procedures, also known as neutral rebalancing mechanisms in the sense that they do not alter the likelihood ratio.
Keywords: Bregman divergences, imbalanced multiclass classification, informed re-balancing, invariant likelihood ratio
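The abstract is truncated here, but the two multiclass decompositions named in the title can still be sketched: one-vs-rest viewed as an error-correcting output codes (ECOC) scheme, decoded by minimum Hamming distance. This is a generic illustration of the decompositions, not the authors' rebalancing formulation.

```python
# One-vs-rest (OvR) as an ECOC coding matrix: each class codeword has one
# +1 (its own binary problem) and -1 elsewhere. Decoding picks the class
# whose codeword is nearest, in Hamming distance, to the vector of binary
# classifier outputs.

def ovr_code_matrix(n_classes):
    """OvR coding matrix: row i is the codeword of class i."""
    return [[1 if j == i else -1 for j in range(n_classes)]
            for i in range(n_classes)]

def hamming_decode(bits, code_matrix):
    """Return the class whose codeword disagrees least with `bits`."""
    dists = [sum(b != c for b, c in zip(bits, row)) for row in code_matrix]
    return dists.index(min(dists))

M = ovr_code_matrix(4)
# Two of the four one-vs-rest classifiers fire positive, so a plain argmax
# over votes is ambiguous, yet Hamming decoding still returns one class.
print(hamming_decode([1, -1, 1, -1], M))
```

Generic ECOC designs use longer codewords than OvR, so that individual binary errors can be absorbed; the per-binary-problem class imbalance those codewords induce is what rebalancing schemes like the one in the title address.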
Procedia PDF Downloads 216
389 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data
Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer
Abstract:
This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average model of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) model are derived iteratively from some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is also proposed, in which BINARMA(1,1) count data are generated using a multivariate Poisson routine in R for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment, and the mean estimates of the model parameters are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights, and roundabouts. The BINARMA(1,1) model is applied to the accident data, and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step-ahead forecasts.
Keywords: non-stationary, BINARMA(1,1) model, Poisson innovations, conditional maximum likelihood, CML
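A stationary, single-series sketch of the building blocks named in the abstract: the binomial thinning operator and common-shock correlated Poisson innovations driving a BINARMA(1,1)-type recursion. The rates, coefficients, and common-shock construction are assumptions for illustration; the paper's model is bivariate and allows non-stationary moments.

```python
import math
import random

random.seed(3)

def poisson(lam):
    """Knuth's Poisson sampler; adequate for the small rates used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def thin(alpha, x):
    """Binomial thinning operator: alpha o X ~ Binomial(X, alpha)."""
    return sum(random.random() < alpha for _ in range(x))

def correlated_poisson(lam1, lam2, lam0):
    """Correlated Poisson pair via a common shock Z0 (trivariate
    reduction): R1 = Z1 + Z0, R2 = Z2 + Z0, so Cov(R1, R2) = lam0."""
    z0 = poisson(lam0)
    return poisson(lam1) + z0, poisson(lam2) + z0

# One series of the recursion X_t = a o X_{t-1} + R_t + b o R_{t-1};
# in the bivariate model r2 would drive the second series the same way.
a, b = 0.4, 0.3
x, r_prev, xs = 0, 0, []
for _ in range(500):
    r1, r2 = correlated_poisson(1.0, 1.0, 0.5)
    x = thin(a, x) + r1 + thin(b, r_prev)
    r_prev = r1
    xs.append(x)
print(sum(xs) / len(xs))  # long-run mean approx 1.5 * (1 + b) / (1 - a)
```

Because thinning maps counts to counts, every simulated value stays a non-negative integer, which is the property that makes INARMA-type models suitable for accident counts.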
Procedia PDF Downloads 129