Search results for: continuum approximation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 718

508 A Robust Theoretical Elastoplastic Continuum Damage T-H-M Model for Rock Surrounding a Wellbore

Authors: Nikolaos Reppas, Yilin Gui, Ben Wetenhall, Colin Davie

Abstract:

Injection of CO2 into a wellbore can induce different kinds of loading that can lead to thermal, hydraulic, and mechanical changes in the surrounding rock. A dual-porosity theoretical constitutive model will be presented for the stability analysis of the wellbore during CO2 injection. An elastoplastic damage response will be considered, and a bounding yield surface accounting for damage effects on sandstone will be presented. The main target of the research paper is to present a theoretical constitutive model that can help industry to safely store CO2 in geological rock formations and to forecast changes in the rock surrounding the wellbore. The fully coupled elastoplastic damage Thermo-Hydraulic-Mechanical theoretical model will be validated against existing experimental data for sandstone after simulating several scenarios using FEM in MATLAB.

Keywords: carbon capture and storage, rock mechanics, THM effects on rock, constitutive model

Procedia PDF Downloads 124
507 Curve Designing Using an Approximating 4-Point C^2 Ternary Non-Stationary Subdivision Scheme

Authors: Muhammad Younis

Abstract:

A ternary 4-point approximating non-stationary subdivision scheme is introduced that generates a family of C^2 limit curves. The theory of asymptotic equivalence is used to analyze the convergence and smoothness of the scheme. The proposed scheme is compared, using different examples, with existing 4-point ternary approximating schemes; the comparison shows that the limit curves of the proposed scheme behave more pleasantly and can also generate conic sections.

Keywords: ternary, non-stationary, approximating subdivision scheme, convergence and smoothness

Procedia PDF Downloads 448
506 A Multistep Broyden’s-Type Method for Solving Systems of Nonlinear Equations

Authors: M. Y. Waziri, M. A. Aliyu

Abstract:

The paper proposes an approach to improve the performance of Broyden's method for solving systems of nonlinear equations. Rather than a single preceding iterate, information from two preceding iterates is used to update the Broyden matrix, producing a better approximation of the Jacobian matrix at each iteration. The numerical results verify that the proposed method clearly enhances the numerical performance of Broyden's method.
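The single-step Broyden update that the multistep variant builds on can be sketched as follows (a minimal illustration, not the authors' multistep scheme; the example system and starting point are arbitrary):

```python
import numpy as np

def broyden(F, x0, tol=1e-10, max_iter=200):
    """Classic Broyden's method ("good" update) for F(x) = 0.

    The paper's multistep variant reuses two preceding iterates in the
    secant update; this sketch shows only the standard single-step rule
    B_{k+1} = B_k + ((dF - B_k s) s^T) / (s^T s).
    """
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))                    # initial Jacobian approximation
    Fx = F(x)
    for _ in range(max_iter):
        s = np.linalg.solve(B, -Fx)       # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        if np.linalg.norm(F_new) < tol:
            return x_new
        dF = F_new - Fx
        B += np.outer(dF - B @ s, s) / (s @ s)   # rank-one secant update
        x, Fx = x_new, F_new
    return x

# Example: x0^2 + x1^2 = 2 and x0 = x1, with root (1, 1).
root = broyden(lambda x: np.array([x[0]**2 + x[1]**2 - 2, x[0] - x[1]]),
               [1.2, 1.1])
```

The multistep idea replaces the single secant pair (s, dF) with information from two preceding iterates, aiming at a more accurate Jacobian surrogate per iteration.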

Keywords: multi-step Broyden, nonlinear systems of equations, computational efficiency, iterate

Procedia PDF Downloads 604
505 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Model assessment, in the Bayesian context, involves evaluation of the goodness-of-fit and the comparison of several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction that penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive CV variant, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to exact LOO-CV; they reuse the existing MCMC results and so avoid the expensive computation. The reciprocals of the predictive densities calculated over posterior draws for each observation are treated as the raw importance weights, which are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior densities. In IS-LOO, the raw weights are used directly; in TIS-LOO and PSIS-LOO, the larger weights are replaced by modified truncated weights. Although information criteria and LOO-CV cannot reflect goodness-of-fit in an absolute sense, their differences can be used to measure the relative performance of the models of interest.
However, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. The models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of exact LOO-CV, the study observed some drastic deviations in the results. However, there are some interesting relationships among the logarithms of pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles were observed for models with equal posterior variances in lppds. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
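The raw importance-weight computation described above can be sketched as follows (a minimal illustration: the reciprocal predictive densities serve as weights, and TIS caps the large ones; the Pareto-smoothing step of PSIS is omitted, and the sqrt(S)-times-mean truncation shown is the standard rule, an assumption about the exact variant used):

```python
import numpy as np

def is_loo(loglik):
    """IS-LOO elpd from an (S draws x N obs) log-likelihood matrix.

    Raw weights are the reciprocals of the pointwise predictive densities;
    each elpd_i is the log of the weighted average of those densities, which
    for raw weights reduces to the log harmonic mean over draws.
    """
    logw = -loglik                          # log raw weights
    logw = logw - logw.max(axis=0)          # stabilise before exponentiating
    w = np.exp(logw)
    elpd = np.log(np.sum(w * np.exp(loglik), axis=0) / np.sum(w, axis=0))
    return elpd.sum()

def tis_loo(loglik):
    """Truncated IS-LOO: cap each weight at sqrt(S) times the mean weight
    before averaging (this tames the heavy right tail of the raw weights)."""
    S = loglik.shape[0]
    logw = -loglik
    logw = logw - logw.max(axis=0)
    w = np.exp(logw)
    w = np.minimum(w, np.sqrt(S) * w.mean(axis=0))   # truncate large weights
    elpd = np.log(np.sum(w * np.exp(loglik), axis=0) / np.sum(w, axis=0))
    return elpd.sum()

# Sanity check: constant log-likelihood gives elpd = N * loglik exactly.
elpd_is = is_loo(np.full((100, 5), -1.3))
```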

Keywords: cross-validation, importance sampling, information criteria, predictive accuracy

Procedia PDF Downloads 364
504 Cosmic Background Reduction in the Radiocarbon Measurements by Liquid Scintillation Spectrometry

Authors: Natasa Todorovic, Jovana Nikolov

Abstract:

Guard detector efficiency, cosmic background, and its variation were determined using the ultra-low-level liquid scintillation spectrometer Quantulus 1220, equipped with an anti-Compton guard detector, in the surface laboratory at the University of Novi Sad, Serbia. Atmospheric pressure variation has an observable effect on the anti-Compton guard detector count rate, and the cosmic muon flux is lower during high-pressure periods. The guard detector Compton continuum also provides a good view of the level of gamma radiation in the laboratory environment. The efficiency of the guard detector in the channel interval from 750 to 1024 was assessed at 93.45%; the efficiency in the entire window (channels 1 to 1024) was 75.23%, in good agreement with literature data.

Keywords: cosmic radiation, background reduction, liquid scintillation counting, guard detector efficiency

Procedia PDF Downloads 132
503 A Simplified Distribution for Nonlinear Seas

Authors: M. A. Tayfun, M. A. Alkhalidi

Abstract:

The exact theoretical expression describing the probability distribution of nonlinear sea-surface elevations derived from the second-order narrowband model has a cumbersome form that requires numerical computation and is not well suited to theoretical or practical applications. Here, the same narrowband model is re-examined to develop a simpler closed-form approximation suitable for theoretical and practical applications. The salient features of the approximate form are explored, and its relative validity is verified by comparisons with other readily available approximations and with oceanic data.

Keywords: ocean waves, probability distributions, second-order nonlinearities, skewness coefficient, wave steepness

Procedia PDF Downloads 408
502 Solving the Transportation Problem for Warehouses and Dealers in Bangalore City

Authors: S. Aditya, K. T. Nideesh, N. Guruprasad

Abstract:

As a subclass of linear programming problems, the transportation problem is a classic operations research problem in which the objective is to determine a schedule for transporting goods from sources to destinations that minimizes the shipping cost while satisfying supply and demand constraints. In this paper, we formulate the transportation problem for several warehouses and dealers situated in Bangalore city in order to reduce the transportation cost they currently incur. The problem is solved by obtaining an initial basic feasible solution through various methods and then proceeding to the optimal cost.
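One standard way to obtain the initial basic feasible solution mentioned above is the northwest-corner rule (listed alongside Vogel's approximation method in the keywords); a minimal sketch with illustrative supply and demand figures:

```python
def northwest_corner(supply, demand):
    """Initial basic feasible solution by the northwest-corner rule.

    Starting at the top-left cell, allocate as much as possible, then move
    down when a supply is exhausted or right when a demand is exhausted.
    Vogel's approximation method typically yields a cheaper starting plan,
    since it also looks at the cost penalties.
    """
    supply, demand = list(supply), list(demand)
    m, n = len(supply), len(demand)
    alloc = [[0] * n for _ in range(m)]
    i = j = 0
    while i < m and j < n:
        q = min(supply[i], demand[j])     # largest feasible shipment
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            i += 1                        # row exhausted: move down
        else:
            j += 1                        # column exhausted: move right
    return alloc

# Illustrative balanced instance: 3 warehouses, 4 dealers (totals = 75).
plan = northwest_corner([20, 30, 25], [10, 25, 20, 20])
```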

Keywords: NW method, optimum utilization, transportation problem, Vogel’s approximation method

Procedia PDF Downloads 404
501 Searching the Efficient Frontier for the Coherent Covering Location Problem

Authors: Felipe Azocar Simonet, Luis Acosta Espejo

Abstract:

In this article, we approximate the efficient frontier of the bi-objective location problem with coherent coverage for two levels of hierarchy (CCLP). We present the mathematical formulation of the model used. Supported and unsupported efficient solutions are obtained by solving the bi-objective combinatorial problem through the weighting method using a Lagrangian heuristic. Subsequently, the results are validated through DEA analysis with the global efficiency measurement (GEM) index.

Keywords: coherent covering location problem, efficient frontier, Lagrangian relaxation, data envelopment analysis

Procedia PDF Downloads 306
500 Investigating the Stylistic Features of Advertising: Ad Design and Creation

Authors: Asma Ben Abdallah

Abstract:

Language has a powerful influence over people and their actions. The language of advertising has a very great impact on the consumer. It makes use of different features from the linguistic continuum. The present paper attempts to apply the theories of stylistics to the analysis of advertising texts. In order to decipher the stylistic features of the advertising discourse, 30 advertising text samples designed by MA Business students have been selected. These samples have been analyzed at the level of design and content. The study brings insights into the use of stylistic devices in advertising, and it reveals that both linguistic and non-linguistic features of advertisements are frequently employed to develop a well-thought-out design and content. The practical significance of the study is to highlight the specificities of the advertising genre so that people interested in the language of advertising (Business students and ESP teachers) will have a better understanding of the nature of the language used and the techniques of writing and designing ads. Similarly, those working in the advertising sphere (ad designers) will appreciate the specificities of the advertising discourse.

Keywords: the language of advertising, advertising discourse, ad design, stylistic features

Procedia PDF Downloads 205
499 Dominant Correlation Effects in Atomic Spectra

Authors: Hubert Klar

Abstract:

High double excitation of two-electron atoms has been investigated using hyperspherical coordinates within a modified adiabatic expansion technique. This modification creates a novel fictitious force leading to spontaneous exchange-symmetry breaking at high double excitation. The Pauli principle must therefore be regarded as an approximation valid only at low excitation energy. Threshold electron scattering from high Rydberg states shows an unexpected time-reversal symmetry breaking. At the threshold for double escape, we discover a broad (few eV) Cooper pair.

Keywords: correlation, resonances, threshold ionization, Cooper pair

Procedia PDF Downloads 317
498 Block Implicit Adams Type Algorithms for Solution of First Order Differential Equation

Authors: Asabe Ahmad Tijani, Y. A. Yahaya

Abstract:

The paper considers the derivation of implicit Adams-Moulton-type methods with k = 4 and 5. We adopt interpolation and collocation of a power series approximation to generate a continuous formula, which is evaluated at off-grid and some grid points within the step length to generate the proposed block schemes. The schemes are investigated and found to be consistent and zero-stable. Finally, the methods are tested with numerical experiments to ascertain their level of accuracy.
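The paper's block schemes are for k = 4 and 5; as a lower-order illustration of the same Adams-Moulton family, here is the k = 2 member with an Adams-Bashforth predictor and functional iteration for the implicit stage (the RK4 start-up and the iteration strategy are assumptions for the sketch, not the authors' scheme):

```python
import numpy as np

def adams_moulton2(f, t0, y0, h, steps):
    """Two-step Adams-Moulton corrector with an AB2 predictor for y' = f(t, y).

    The k = 2 member of the Adams-Moulton family is
        y_{n+2} = y_{n+1} + h/12 * (5 f_{n+2} + 8 f_{n+1} - f_n),
    with the implicit f_{n+2} resolved here by a few fixed-point sweeps.
    """
    t = [t0, t0 + h]
    # One classical RK4 step supplies the second start-up value.
    k1 = f(t0, y0); k2 = f(t0 + h/2, y0 + h/2*k1)
    k3 = f(t0 + h/2, y0 + h/2*k2); k4 = f(t0 + h, y0 + h*k3)
    y = [y0, y0 + h/6*(k1 + 2*k2 + 2*k3 + k4)]
    for _ in range(steps - 1):
        f0, f1 = f(t[-2], y[-2]), f(t[-1], y[-1])
        yp = y[-1] + h/2 * (3*f1 - f0)          # AB2 predictor
        tn2 = t[-1] + h
        for _ in range(3):                       # fixed-point correction
            yp = y[-1] + h/12 * (5*f(tn2, yp) + 8*f1 - f0)
        t.append(tn2); y.append(yp)
    return np.array(t), np.array(y)

# y' = -y, y(0) = 1 on [0, 1]: exact solution e^{-t}.
t, y = adams_moulton2(lambda t, y: -y, 0.0, 1.0, 0.01, 100)
```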

Keywords: Adams-Moulton type (AMT), off-grid, block method, consistent and zero-stable

Procedia PDF Downloads 455
497 A Non-linear Damage Model For The Annulus Of the Intervertebral Disc Under Cyclic Loading, Including Recovery

Authors: Shruti Motiwale, Xianlin Zhou, Reuben H. Kraft

Abstract:

Military and sports personnel are often required to wear heavy helmets for extended periods of time, which leads to excessive cyclic loads on the neck and an increased chance of injury. Computational models offer one approach to understanding and predicting the time progression of disc degeneration under severe cyclic loading. In this paper, we apply an analytic nonlinear damage evolution model to estimate damage evolution in an intervertebral disc due to cyclic loads over decade-long time periods. We also propose a novel strategy for the inclusion of recovery in the damage model. Our results show that damage grows by only 20% over the initial 75% of the life and then grows exponentially in the remaining 25%. The analysis also shows that it is crucial to include recovery in a damage model.

Keywords: cervical spine, computational biomechanics, damage evolution, intervertebral disc, continuum damage mechanics

Procedia PDF Downloads 536
496 Normalized Compression Distance Based Scene Alteration Analysis of a Video

Authors: Lakshay Kharbanda, Aabhas Chauhan

Abstract:

In this paper, an application of the normalized compression distance (NCD) to detect notable scene alterations in videos is presented. Several research groups have developed methods for image classification using NCD, a computable approximation to the normalized information distance (NID), by studying the degree of similarity between images. The timeframes where significant aberrations between the frames of a video occur are identified by obtaining a threshold NCD value, using two compressors (LZMA and BZIP2), and defining scene alterations using a pixel difference percentage metric.
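The NCD itself is straightforward to compute with one of the compressors named above (BZIP2 here; the byte strings are toy stand-ins for serialized video frames):

```python
import bz2

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance using BZIP2:
        NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C(.) is the compressed length in bytes."""
    cx, cy = len(bz2.compress(x)), len(bz2.compress(y))
    cxy = len(bz2.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Frames with shared content compress well together, giving a small NCD;
# a scene change shows up as a jump in NCD between consecutive frames.
same = ncd(b"abcabcabc" * 200, b"abcabcabc" * 200)
diff = ncd(b"abcabcabc" * 200, bytes(range(256)) * 8)
```

In practice, each frame's pixel buffer is the byte string, and frame pairs whose NCD exceeds the learned threshold are flagged as scene alterations.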

Keywords: image compression, Kolmogorov complexity, normalized compression distance, root mean square error

Procedia PDF Downloads 304
495 Application of Regularized Low-Rank Matrix Factorization in Personalized Targeting

Authors: Kourosh Modarresi

Abstract:

The Netflix problem brought the topic of “recommendation systems” into the mainstream of computer science, mathematics, and statistics. Though much progress has been made, the available algorithms do not obtain satisfactory results; their success rate is rarely above 5%. This work is based on the belief that the main challenge is to come up with “scalable personalization” models. This paper uses an adaptive regularization of inverse singular value decomposition (SVD) that applies adaptive penalization to the singular vectors. The results show far better matching for recommender systems when compared with those from state-of-the-art models in the industry.
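The mechanics of a regularized SVD can be sketched with the simpler uniform soft-thresholding of singular values (the paper's adaptive, per-vector penalization is not reproduced here; the threshold value and toy matrix are illustrative):

```python
import numpy as np

def soft_thresholded_svd(R, lam):
    """Low-rank reconstruction by soft-thresholding the singular values.

    A LASSO-style shrinkage s -> max(s - lam, 0) on the spectrum; the
    paper's adaptive variant would instead penalize each singular vector
    with its own data-driven penalty.
    """
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)     # shrink (and possibly zero) modes
    return U @ np.diag(s_shrunk) @ Vt

# Toy "users x items" matrix: shrinkage discards weak noise directions.
rng = np.random.default_rng(0)
R = rng.normal(size=(20, 8))
R_hat = soft_thresholded_svd(R, lam=1.0)
```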

Keywords: convex optimization, LASSO, regression, recommender systems, singular value decomposition, low rank approximation

Procedia PDF Downloads 421
494 Density Functional Theory (DFT) Study of the Structural and Phase Transitions of ThC and ThN: LDA vs. GGA Computations

Authors: Hamza Rekab Djabri, Salah Daoud

Abstract:

The present paper deals with the computation of the structural and electronic properties of ThC and ThN compounds using density functional theory within the generalized gradient approximation (GGA) and the local density approximation (LDA). We employ the full-potential linear muffin-tin orbital (FP-LMTO) method as implemented in the LmtART code. Structural parameters were examined in eight different structures: NaCl (B1), CsCl (B2), zinc blende (B3), NiAs (B8), PbO (B10), wurtzite (B4), HCP (A3), and β-Sn (A5). The equilibrium lattice parameter, bulk modulus, and its pressure derivative are presented for all calculated phases. The calculated ground-state properties are in good agreement with available experimental and theoretical results.

Keywords: DFT, GGA, LDA, structural properties, ThC, ThN

Procedia PDF Downloads 68
493 Effective Medium Approximations for Modeling Ellipsometric Responses from Zinc Dialkyldithiophosphates (ZDDP) Tribofilms Formed on Sliding Surfaces

Authors: Maria Miranda-Medina, Sara Salopek, Andras Vernes, Martin Jech

Abstract:

Sliding lubricated surfaces induce the formation of tribofilms that reduce friction and wear and prevent large-scale damage of contact parts. Engine oils and lubricants use anti-wear and antioxidant additives such as zinc dialkyldithiophosphate (ZDDP), from which protective tribofilms are formed by degradation. ZDDP tribofilms are described as a two-layer structure composed of inorganic polymer material: on the top surface, the long-chain polyphosphate is a zinc phosphate, and in the bulk, the short-chain polyphosphate is a mixed Fe/Zn phosphate with a gradient concentration. The polyphosphate chains are partially adherent to the steel surface through a sulfide and work as anti-wear pads. In this contribution, ZDDP tribofilms formed on gray cast iron surfaces are studied. The tribofilms were generated in a reciprocating sliding tribometer with a piston ring-cylinder liner configuration. Fully formulated oil of SAE grade 5W-30 was used as the lubricant during two tests, at 40 Hz and 50 Hz. Spectroscopic ellipsometry was used to estimate the tribofilm thicknesses because of its high accuracy and non-destructive nature. Ellipsometry works on an optical principle whereby the change in polarisation of light reflected by the surface is associated with the refractive index of the surface material or with the thickness of the layer deposited on top. The ellipsometric responses of the tribofilms are modelled by an effective medium approximation (EMA), which includes the refractive indices of the materials involved, the homogeneity of the film, and its thickness. The material composition was obtained from X-ray photoelectron spectroscopy studies, where the presence of ZDDP, O, and C was confirmed. From the EMA models it was concluded that the tribofilms formed at 40 Hz are thicker and more homogeneous than those formed at 50 Hz.
In addition, the refractive indices of the individual materials are mixed to derive an effective refractive index that describes the optical composition of the tribofilm; it exhibits a maximum response in the UV range, which is characteristic of glassy semitransparent films.
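A minimal sketch of one standard effective medium approximation, the Maxwell Garnett mixing rule (the abstract does not state which EMA variant was fitted, so the formula choice and permittivity values here are illustrative):

```python
def maxwell_garnett(eps_m, eps_i, f):
    """Maxwell Garnett effective permittivity of spherical inclusions
    (permittivity eps_i, volume fraction f) in a host medium (eps_m).

    Ellipsometric EMA fits mix the constituents' optical constants in the
    same spirit; the refractive index is n = sqrt(eps) for non-absorbing
    materials.
    """
    num = eps_i + 2*eps_m + 2*f*(eps_i - eps_m)
    den = eps_i + 2*eps_m - f*(eps_i - eps_m)
    return eps_m * num / den

# Illustrative values: a glassy host (eps ~ 2.25) with denser inclusions.
eps_eff = maxwell_garnett(eps_m=2.25, eps_i=4.0, f=0.3)
```

The rule interpolates correctly between the limits: at f = 0 it returns the host permittivity, at f = 1 the inclusion permittivity.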

Keywords: effective medium approximation, reciprocating sliding tribometer, spectroscopic ellipsometry, zinc dialkyldithiophosphate

Procedia PDF Downloads 224
492 Finite Sample Inferences for Weak Instrument Models

Authors: Gubhinder Kundhi, Paul Rilstone

Abstract:

It is well established that instrumental variable (IV) estimators in the presence of weak instruments can be poorly behaved and, in particular, quite biased in finite samples. Finite-sample approximations to the distributions of these estimators are obtained using Edgeworth and saddlepoint expansions. Departures from normality of the distributions of these estimators are analyzed using higher-order analytical corrections in these expansions. In a Monte Carlo experiment, the performance of these expansions is compared with the first-order approximation and with other methods commonly used in finite samples, such as the bootstrap.
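A one-term Edgeworth correction to the normal approximation, for a generic standardized mean, can be sketched as follows (an illustration of the expansion machinery only; the paper applies higher-order versions to the distributions of IV estimators):

```python
import math

def edgeworth_cdf(x, skew, n):
    """One-term Edgeworth approximation to the CDF of a standardized mean
    of n i.i.d. variables with skewness `skew`:
        G(x) = Phi(x) - phi(x) * skew * (x^2 - 1) / (6 * sqrt(n)).
    The correction vanishes as n grows, recovering the CLT limit Phi(x).
    """
    phi = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)   # N(0,1) density
    Phi = 0.5 * math.erfc(-x / math.sqrt(2))              # N(0,1) CDF
    return Phi - phi * skew * (x * x - 1) / (6 * math.sqrt(n))
```

For a symmetric parent distribution (skew = 0) the approximation collapses to the normal CDF; for positive skewness it puts more mass below zero, matching the median-below-mean behaviour of right-skewed sums.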

Keywords: bootstrap, Instrumental Variable, Edgeworth expansions, Saddlepoint expansions

Procedia PDF Downloads 284
491 Design of a Fuzzy Luenberger Observer for Fault Nonlinear System

Authors: Mounir Bekaik, Messaoud Ramdani

Abstract:

We present in this work a new stabilization technique for faulty nonlinear systems. The approach we adopt focuses on a fuzzy Luenberger observer. The T-S approximation of the nonlinear observer is based on the fuzzy C-means clustering algorithm to find local linear subsystems. The MOESP identification approach is applied to design an empirical model describing the subsystems' state variables. The gain of the observer is obtained by minimizing the estimation error through a Lyapunov-Krasovskii functional and an LMI approach. A three-tank hydraulic system is considered as an illustrative example.
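The fuzzy C-means step used to find the local linear subsystems can be sketched as follows (plain FCM on toy 2-D data; in the paper the clustering would run on the system's measured state variables):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=50, seed=0):
    """Plain fuzzy C-means: alternate the membership and centre updates
        u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)),
        v_i  = sum_k u_ik^m x_k / sum_k u_ik^m.
    Returns the (c x n) membership matrix U and the (c x d) centres V.
    """
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                                   # columns sum to 1
    for _ in range(iters):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)     # weighted centres
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        U = 1.0 / (D ** (2/(m-1)) * np.sum(D ** (-2/(m-1)), axis=0))
    return U, V

# Two well-separated toy clusters around (0, 0) and (5, 5).
X = np.vstack([np.zeros((10, 2)), np.full((10, 2), 5.0)])
U, V = fuzzy_c_means(X, 2)
```

Each cluster then supplies the data for one local linear model, and the memberships become the T-S blending weights.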

Keywords: nonlinear system, fuzzy, faults, TS, Lyapunov-Krasovskii, observer

Procedia PDF Downloads 301
490 BER Estimate of WCDMA Systems with MATLAB Simulation Model

Authors: Suyeb Ahmed Khan, Mahmood Mian

Abstract:

Simulation plays an important role during all phases of the design and engineering of communication systems, from the early stages of conceptual design through the various stages of implementation, testing, and fielding of the system. In the present paper, a simulation model is constructed for the WCDMA system in order to evaluate its performance. The model describes multi-user effects and the calculation of the BER (bit error rate) in 3G mobile systems using Simulink in MATLAB 7.1. A Gaussian approximation describes the multi-user effect on system performance. The BER is analyzed by comparing transmitted and received data.
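The standard Gaussian approximation treats the interference from the other K-1 users as extra Gaussian noise; a sketch (the spreading gain N = 64 and Eb/N0 values are illustrative, and the Pursley-style variance term (K-1)/(3N) is the textbook form for asynchronous DS-CDMA, an assumption about the exact model simulated):

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def cdma_ber_gaussian(K, N, ebn0_db):
    """BER of asynchronous DS-CDMA under the standard Gaussian approximation:
    multi-access interference adds variance (K - 1) / (3N) to the effective
    noise, so BER ~ Q(sqrt(SINR))."""
    ebn0 = 10 ** (ebn0_db / 10)
    sinr = 1.0 / ((K - 1) / (3 * N) + 1.0 / (2 * ebn0))
    return q_func(math.sqrt(sinr))

ber_1 = cdma_ber_gaussian(K=1, N=64, ebn0_db=10)    # single user: plain BPSK
ber_30 = cdma_ber_gaussian(K=30, N=64, ebn0_db=10)  # 30 users: MAI dominates
```

With K = 1 the expression reduces to the BPSK bound Q(sqrt(2 Eb/N0)); adding users degrades the BER toward an interference-limited floor.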

Keywords: WCDMA, simulations, BER, MATLAB

Procedia PDF Downloads 553
489 Mechanical Properties of Ternary Metal Nitride Ti1-xTaxN Alloys from First-Principles

Authors: M. Benhamida, Kh. Bouamama, P. Djemia

Abstract:

We investigate, by first-principles pseudopotential calculations, the composition dependence of the lattice parameter, hardness, and elastic properties of the ternary disordered solid solutions Ti(1-x)Ta(x)N (0 <= x <= 1) with the B1 rocksalt structure. The calculations use the coherent potential approximation with exact muffin-tin orbitals (EMTO) and a proposed hardness formula for multicomponent covalent solid solutions. The bulk modulus B shows nearly linear behaviour, whereas C44 and C' = (C11 - C12)/2 are not monotonic. The influence of vacancies on the hardness of the off-stoichiometric transition-metal nitrides TiN(1-x) and TaN(1-x) is also considered.

Keywords: transition metal nitride materials, elastic constants, hardness, EMTO

Procedia PDF Downloads 402
488 Timely Palliative Screening and Interventions in Oncology

Authors: Jaci Marie Mastrandrea, Rosario Haro

Abstract:

Background: The National Comprehensive Cancer Network (NCCN) recommends that healthcare institutions have established processes for integrating palliative care (PC) into cancer treatment and that all cancer patients be screened for PC needs upon initial diagnosis as well as throughout the entire continuum of care (National Comprehensive Cancer Network, 2021). Early PC screening and intervention are directly associated with improved patient outcomes. The Sky Lakes Cancer Treatment Center (SLCTC) is an institution that has access to PC services yet does not have protocols in place for identifying patients with palliative needs or a standardized referral process. The aim of this quality improvement project was to improve early access to PC services by establishing a standardized screening and referral process for outpatient oncology patients. Method: The sample population included all adult patients with an oncology diagnosis who presented to the SLCTC for treatment during the project timeline. The “Palliative and Supportive Needs Assessment” (PSNA) screening tool was developed from validated, evidence-based PC referral criteria. The tool was initially implemented using paper forms, and data were collected over a period of eight weeks. Patients were screened by nurses on the SLCTC oncology treatment team; the nurses responsible for screening received an educational in-service prior to implementation. Patients with a PSNA score of three or higher received an educational handout on PC and education about symptom management. A score of five or higher indicates that PC referral is strongly recommended, and the patient's EHR is flagged for the oncology provider to review orders for PC referral. The PSNA tool was approved by Sky Lakes administration for full integration into Epic-Beacon.
The project lead collaborated with the Sky Lakes information systems team and representatives from Epic on the tool's aesthetics and functionality within the Epic system. SLCTC nurses and physicians were educated on how to document the PSNA within Epic and where to view results. Results: Prior to the implementation of the PSNA screening tool, the SLCTC had zero referrals to PC in the past year, excluding referrals to hospice. Data were collected from the completed screening assessments of 100 patients under active treatment at the SLCTC. Seventy-three percent of patients met the criteria for PC referral with a score greater than or equal to three. Of those patients, 53.4% (39 patients) were referred for a palliative and supportive care consultation. Patients who were not referred to PC upon meeting criteria were flagged in Epic for re-screening within one to three months. Patients with lung cancer, chronic hematologic malignancies, breast cancer, and gastrointestinal malignancy most frequently met the criteria for PC referral and scored highest overall on the scale of 0-12. Conclusion: The implementation of a standardized PC screening tool at the SLCTC significantly increased awareness of PC needs among cancer patients in the outpatient setting. Additionally, data derived from this quality improvement project support the national recommendation for PC to be an integral component of cancer treatment across the entire continuum of care.

Keywords: oncology, palliative and supportive care, symptom management, outpatient oncology, palliative screening tool

Procedia PDF Downloads 82
487 Variational Evolutionary Splines for Solving a Model of Temporomandibular Disorders

Authors: Alberto Hananel

Abstract:

The aim of this work is to model the occlusion of a person with temporomandibular disorders as an evolutionary equation and to approximate its solution by constructing and characterizing discrete variational splines. To formulate the problem, certain boundary conditions are considered. After showing the existence and uniqueness of the solution of the problem, a convergence result for a discrete variational evolutionary spline is shown. A stress analysis of the occlusion of a human jaw with temporomandibular disorders is carried out by finite elements in FreeFem++ in order to demonstrate the validity of the presented method.

Keywords: approximation, evolutionary PDE, Finite Element Method, temporomandibular disorders, variational spline

Procedia PDF Downloads 345
486 From Modern to Contemporary Art: Transformations of Art Market in Istanbul

Authors: Cem Ozatalay, Senem Ornek

Abstract:

The Artprice Contemporary Art Market Annual Report 2014 notes that Istanbul, with an art market volume of $3.6 million, has become the first city of the Middle East and North Africa region and the 14th city in the world. Indeed, the period 2004–2014 was significant in terms of the growth of the art market; it is the period during which the majority of contemporary art galleries and museums in Istanbul were inaugurated. This boom means that, with the joining of new agents, the structure of the art market has changed dramatically. To use Nathalie Heinich's terminology, three art genres coexist in the current art field, namely classical art, modern art, and contemporary art, but in Istanbul, as in many art cities around the world, the latter genre has become increasingly dominant. This presentation aims to show how power shifts away from classical art agents to contemporary art agents, and the effects produced by the conflicts between the old and new agents of the current art field. Based on data obtained from ongoing field research in Istanbul among art market agents such as art dealers, curators, art critics, and artists, it will be shown that even if the agents of different art genres are in conflict with each other, there is at the same time a continuum between the three art worlds.

Keywords: contemporary art market, economic sociology of art, Istanbul art market, structure of the art field in Istanbul

Procedia PDF Downloads 223
485 Nonlinear Analysis of Reinforced Concrete Arched Structures Considering Soil-Structure Interaction

Authors: Mohamed M. El Gendy, Ibrahim A. El Arabi, Rafeek W. Abdel-Missih, Omar A. Kandil

Abstract:

Nonlinear analysis is one of the most important design and safety tools in structural engineering. Based on the finite element method, a geometrically and materially nonlinear analysis of large-span reinforced concrete arches is carried out considering soil-structure interaction. The concrete section details and reinforcement distribution are taken into account. The behavior of the soil is considered via Winkler and continuum models. A computer program (NARC II) was specially developed in order to follow the structural behavior of large-span reinforced concrete arches up to failure. The results obtained by the proposed model are compared with the available literature for verification. This work confirms that the geometrical and material nonlinearities, as well as soil-structure interaction, have a considerable influence on the structural response of reinforced concrete arches.

Keywords: nonlinear analysis, reinforced concrete arched structure, soil-structure interaction, geotechnical engineering

Procedia PDF Downloads 412
484 Combining a Continuum of Hidden Regimes and a Heteroskedastic Three-Factor Model in Option Pricing

Authors: Rachid Belhachemi, Pierre Rostan, Alexandra Rostan

Abstract:

This paper develops a discrete-time option pricing model for index options. The model consists of two key ingredients. First, daily stock return innovations are driven by a continuous hidden threshold mixed skew-normal (HTSN) distribution which generates conditional non-normality that is needed to fit daily index return. The most important feature of the HTSN is the inclusion of a latent state variable with a continuum of states, unlike the traditional mixture distributions where the state variable is discrete with little number of states. The HTSN distribution belongs to the class of univariate probability distributions where parameters of the distribution capture the dependence between the variable of interest and the continuous latent state variable (the regime). The distribution has an interpretation in terms of a mixture distribution with time-varying mixing probabilities. It has been shown empirically that this distribution outperforms its main competitor, the mixed normal (MN) distribution, in terms of capturing the stylized facts known for stock returns, namely, volatility clustering, leverage effect, skewness, kurtosis and regime dependence. Second, heteroscedasticity in the model is captured by a threeexogenous-factor GARCH model (GARCHX), where the factors are taken from the principal components analysis of various world indices and presents an application to option pricing. The factors of the GARCHX model are extracted from a matrix of world indices applying principal component analysis (PCA). The empirically determined factors are uncorrelated and represent truly different common components driving the returns. Both factors and the eight parameters inherent to the HTSN distribution aim at capturing the impact of the state of the economy on price levels since distribution parameters have economic interpretations in terms of conditional volatilities and correlations of the returns with the hidden continuous state. 
The PCA identifies statistically independent factors affecting the random evolution of a given pool of assets (in our paper, a pool of international stock indices) and sorts them by order of relative importance. The PCA computes a historical cross-asset covariance matrix and identifies principal components representing independent factors. In our paper, the factors are used to calibrate the HTSN-GARCHX model and are ultimately responsible for the nature of the distribution of the random variables being generated. We benchmark our model against the MN-GARCHX model, following the same PCA methodology, and against the standard Black-Scholes model. We show that our model outperforms the MN-GARCHX benchmark in terms of RMSE in dollar losses for put and call options, which in turn outperforms the analytical Black-Scholes model by capturing the stylized facts known for index returns, namely volatility clustering, the leverage effect, skewness, kurtosis, and regime dependence.
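The PCA step described above can be sketched in a few lines of NumPy. The index names and return data below are invented for illustration; the point is that projecting centered returns onto the leading eigenvectors of the cross-asset covariance matrix yields mutually uncorrelated factor series, which is what the GARCHX specification relies on.

```python
import numpy as np

# Hypothetical stand-in for the matrix of world-index returns used in
# the paper: 500 trading days x 6 indices (synthetic data).
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=(500, 6))

# Center the returns and compute the cross-asset covariance matrix.
centered = returns - returns.mean(axis=0)
cov = np.cov(centered, rowvar=False)

# Eigendecomposition; sort components by explained variance (descending).
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project onto the first three principal components: these play the
# role of the three exogenous GARCHX factors.
factors = centered @ eigvecs[:, :3]

# The factor series are mutually uncorrelated: the off-diagonal entries
# of their covariance matrix vanish up to floating-point error.
factor_cov = np.cov(factors, rowvar=False)
print(np.max(np.abs(factor_cov - np.diag(np.diag(factor_cov)))))
```

The printed maximum off-diagonal covariance is numerically zero, confirming that the extracted factors are uncorrelated by construction.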

Keywords: continuous hidden threshold, factor models, GARCHX models, option pricing, risk-premium

Procedia PDF Downloads 276
483 A Constitutive Model of Ligaments and Tendons Accounting for Fiber-Matrix Interaction

Authors: Ratchada Sopakayang, Gerhard A. Holzapfel

Abstract:

In this study, a new constitutive model is developed to describe the hyperelastic behavior of collagenous tissues with a parallel arrangement of collagen fibers such as ligaments and tendons. The model is formulated using a continuum approach incorporating the structural changes of the main tissue components: collagen fibers, proteoglycan-rich matrix and fiber-matrix interaction. The mechanical contribution of the interaction between the fibers and the matrix is simply expressed by a coupling term. The structural change of the collagen fibers is incorporated in the constitutive model to describe the activation of the fibers under tissue straining. Finally, the constitutive model can easily describe the stress-stretch nonlinearity which occurs when a ligament/tendon is axially stretched. This study shows that the interaction between the fibers and the matrix contributes to the mechanical tissue response. Therefore, the model may lead to a better understanding of the physiological mechanisms of ligaments and tendons under axial loading.
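The additive structure described above (matrix, fibers, and a fiber-matrix coupling term) can be sketched as a strain-energy decomposition; the notation below is hypothetical, since the abstract does not give the authors' specific functional forms:

```latex
\Psi(\mathbf{C}, \mathbf{a}_0)
  = \Psi_{\mathrm{m}}(I_1)
  + \Psi_{\mathrm{f}}(I_4)
  + \Psi_{\mathrm{int}}(I_1, I_4),
\qquad
I_1 = \operatorname{tr}\mathbf{C},
\quad
I_4 = \mathbf{a}_0 \cdot \mathbf{C}\,\mathbf{a}_0,
```

where \(\mathbf{C}\) is the right Cauchy-Green tensor, \(\mathbf{a}_0\) the reference fiber direction, \(\Psi_{\mathrm{m}}\) the proteoglycan-rich matrix contribution, \(\Psi_{\mathrm{f}}\) the collagen-fiber contribution (active only under fiber stretch, \(I_4 > 1\)), and \(\Psi_{\mathrm{int}}\) the coupling term that expresses the fiber-matrix interaction.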

Keywords: constitutive model, fiber-matrix, hyperelasticity, interaction, ligament, tendon

Procedia PDF Downloads 267
482 Vector-Based Analysis in Cognitive Linguistics

Authors: Chuluundorj Begz

Abstract:

This paper presents a dynamic, psycho-cognitive approach to the study of human verbal thinking on the basis of typologically different languages (Mongolian, English, and Russian). Topological equivalence in verbal communication serves as a basis for the universality of mental structures, and therefore of deep structures. The mechanism of verbal thinking consists, at the deep level, of basic concepts, rules for integration and classification, and neural networks of the vocabulary. In the neurocognitive study of language, the neural architecture and the neuropsychological mechanisms of verbal cognition form the basis of vector-based modeling. Verbal perception and interpretation of the infinite set of meanings and propositions in the mental continuum can be modeled by applying tensor methods. Euclidean and non-Euclidean spaces are applied to describe the human semantic vocabulary and higher-order structures.
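The vector-space view of the mental lexicon can be illustrated with a toy example: words as vectors, and semantic relatedness as the angle between them. The vectors below are invented purely for illustration and do not come from the paper.

```python
import numpy as np

# Toy mental lexicon: each word is a point in a 3-dimensional
# (Euclidean) semantic space. The coordinates are made up.
lexicon = {
    "dog":  np.array([0.9, 0.1, 0.0]),
    "wolf": np.array([0.8, 0.2, 0.1]),
    "book": np.array([0.0, 0.1, 0.9]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up with a smaller angle (higher
# similarity) than unrelated ones.
print(cosine(lexicon["dog"], lexicon["wolf"]))
print(cosine(lexicon["dog"], lexicon["book"]))
```

In this toy space "dog" and "wolf" are nearly parallel while "dog" and "book" are nearly orthogonal, which is the geometric intuition behind vector-based models of semantic memory.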

Keywords: Euclidean spaces, isomorphism and homomorphism, mental lexicon, mental mapping, semantic memory, verbal cognition, vector space

Procedia PDF Downloads 490
481 Ratio Type Estimators for the Estimation of Population Coefficient of Variation under Two-Stage Sampling

Authors: Muhammad Jabbar

Abstract:

In this paper, we propose a ratio and a ratio-type exponential estimator for the estimation of the population coefficient of variation using auxiliary information under two-stage sampling. The properties of these estimators are derived up to the first order of approximation. The conditions under which the suggested estimators are more efficient are obtained. Numerical and simulation studies are conducted to support the superiority of the estimators. Both theoretically and numerically, we find that our proposed estimator is always more efficient than its competitor.
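The general idea of borrowing strength from auxiliary information can be seen in a Monte Carlo sketch of the classical single-stage ratio estimator of a population mean. This is the textbook version, not the authors' two-stage coefficient-of-variation estimators; it only illustrates why a correlated auxiliary variable with a known population mean improves efficiency.

```python
import numpy as np

rng = np.random.default_rng(1)
N, n, reps = 10_000, 200, 500

# Synthetic finite population: study variable y strongly correlated
# with auxiliary variable x, whose population mean X_bar is known.
x = rng.gamma(5.0, 2.0, size=N)
y = 3.0 * x + rng.normal(0.0, 2.0, size=N)
Y_bar, X_bar = y.mean(), x.mean()

err_srs, err_ratio = [], []
for _ in range(reps):
    idx = rng.choice(N, size=n, replace=False)   # simple random sample
    y_bar, x_bar = y[idx].mean(), x[idx].mean()
    err_srs.append(y_bar - Y_bar)                # plain sample mean
    err_ratio.append(y_bar * (X_bar / x_bar) - Y_bar)  # ratio estimator

mse_srs = np.mean(np.square(err_srs))
mse_ratio = np.mean(np.square(err_ratio))
print(mse_ratio < mse_srs)
```

Because y and x are highly correlated, the empirical MSE of the ratio estimator is far below that of the plain sample mean, which is the same efficiency mechanism the proposed two-stage estimators exploit.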

Keywords: two-stage sampling, coefficient of variation, ratio type exponential estimator

Procedia PDF Downloads 492
480 Mathematical Reconstruction of an Object Image Using X-Ray Interferometric Fourier Holography Method

Authors: M. K. Balyan

Abstract:

The main principles of the X-ray Fourier interferometric holography method are discussed. The object image is reconstructed by the mathematical method of Fourier transformation. Three methods are presented: the approximation method, the iteration method, and the step-by-step method. As an example, the reconstruction of the complex amplitude transmission coefficient of a beryllium wire is considered. The results reconstructed by the three presented methods are compared; the best results are obtained with the step-by-step method.
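The core reconstruction idea, recovering an object term from a Fourier hologram by a single inverse Fourier transform, can be shown in a minimal 1-D sketch. This is illustrative only; the paper's X-ray interferometric setup and its approximation, iteration, and step-by-step methods are not reproduced here, and the "wire" below is a made-up absorbing segment.

```python
import numpy as np

N = 256
obj = np.zeros(N, dtype=complex)
obj[120:136] = 0.8          # toy "wire": reduced transmission amplitude
ref = np.zeros(N, dtype=complex)
ref[10] = 1.0               # off-axis point reference source

# Recorded Fourier hologram: intensity of the Fourier transform of
# (object + reference) -- no phase is recorded.
field = np.fft.fft(obj + ref)
hologram = np.abs(field) ** 2

# Reconstruction by inverse Fourier transformation: the result is the
# autocorrelation of (object + reference); the cross term reproduces
# the object amplitude, shifted away from the central peak by the
# object-reference separation (here, to indices 110..125).
recon = np.fft.ifft(hologram)
print(np.round(np.abs(recon[110:126]), 6))
```

The recovered segment has amplitude 0.8, matching the object's transmission coefficient; in practice the reference must sit far enough from the object that this image term clears the central autocorrelation peak.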

Keywords: dynamical diffraction, hologram, object image, X-ray holography

Procedia PDF Downloads 364
479 The Evaluation of the Performance of Different Filtering Approaches in Tracking Problem and the Effect of Noise Variance

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

The performance of different filtering approaches depends on the modeling of the dynamical system and on the algorithm structure. For modeling and smoothing the data, the evaluation of the posterior distribution in each filtering approach should be chosen carefully. In this paper, different filtering approaches (the Kalman filter, EKF, UKF, and EKS, together with the RTS smoother) are simulated on several trajectory-tracking problems, and the accuracy and limitations of these approaches are explained. The model probability under the different filters is then compared, and finally the effect of the noise variance on the estimation is described with simulation results.
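The simplest member of the family compared above is the linear Kalman filter; the EKF, UKF, and RTS variants all extend the same predict/update recursion. A minimal 1-D constant-velocity tracking sketch (synthetic data, illustrative parameter values) looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition: [position, velocity]
H = np.array([[1.0, 0.0]])              # we observe position only
Q = 0.01 * np.eye(2)                    # process-noise covariance
R = np.array([[1.0]])                   # measurement-noise variance

# Simulate a true trajectory and noisy position measurements.
truth = np.array([0.0, 1.0])
xs, zs = [], []
for _ in range(50):
    truth = F @ truth + rng.multivariate_normal([0.0, 0.0], Q)
    xs.append(truth.copy())
    zs.append(H @ truth + rng.normal(0.0, np.sqrt(R[0, 0]), 1))

# Kalman recursion: predict, then update with each measurement.
x, P = np.array([0.0, 0.0]), np.eye(2)
est = []
for z in zs:
    x, P = F @ x, F @ P @ F.T + Q                 # predict
    S = H @ P @ H.T + R                           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x = x + K @ (z - H @ x)                       # update state
    P = (np.eye(2) - K @ H) @ P                   # update covariance
    est.append(x.copy())

err_raw = np.mean([(z[0] - t[0]) ** 2 for z, t in zip(zs, xs)])
err_kf = np.mean([(e[0] - t[0]) ** 2 for e, t in zip(est, xs)])
print(err_kf < err_raw)
```

Increasing `R` relative to `Q` makes the filter trust the model more and the measurements less, which is exactly the noise-variance effect the paper examines through simulation.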

Keywords: Gaussian approximation, Kalman smoother, parameter estimation, noise variance

Procedia PDF Downloads 403