Search results for: generalized extreme values

8154 Nonlocal Beam Models for Free Vibration Analysis of Double-Walled Carbon Nanotubes with Various End Supports

Authors: Babak Safaei, Ahmad Ghanbari, Arash Rahmani

Abstract:

In the present study, the free vibration characteristics of double-walled carbon nanotubes (DWCNTs) are investigated. The small-scale effects are taken into account using Eringen's nonlocal elasticity theory. The nonlocal elasticity equations are implemented into different classical beam theories, namely the Euler-Bernoulli beam theory (EBT), Timoshenko beam theory (TBT), Reddy beam theory (RBT), and Levinson beam theory (LBT), to analyze the free vibrations of DWCNTs, in which each wall of the nanotube is considered as an individual beam coupled by van der Waals interaction forces. The generalized differential quadrature (GDQ) method is utilized to discretize the governing differential equations of each nonlocal beam model along with four commonly used boundary conditions. Molecular dynamics (MD) simulations are then performed for a series of armchair and zigzag DWCNTs with different aspect ratios and boundary conditions, and the results are matched with those of the nonlocal beam models to extract the appropriate value of the nonlocal parameter corresponding to each type of chirality, nonlocal beam model, and boundary condition. It is found that the present nonlocal beam models, with the proposed values of the nonlocal parameter, predict the vibrational behavior of DWCNTs well, especially at higher aspect ratios.
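
As a rough, self-contained illustration of the GDQ discretization step mentioned above, the sketch below builds first-derivative weighting coefficients on a Chebyshev-Gauss-Lobatto grid using Shu's formulation; the grid size and interval are illustrative and not taken from the paper.

```python
import numpy as np

def gdq_weights(x):
    """First-derivative GDQ weighting coefficients (Shu's formulation)."""
    n = len(x)
    # M(x_i) = prod_{k != i} (x_i - x_k)
    M = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    a = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                a[i, j] = M[i] / ((x[i] - x[j]) * M[j])
        a[i, i] = -a[i].sum()  # rows sum to zero (derivative of a constant)
    return a

# Chebyshev-Gauss-Lobatto points on [0, 1], commonly paired with GDQ
n = 11
x = 0.5 * (1.0 - np.cos(np.pi * np.arange(n) / (n - 1)))
D = gdq_weights(x)
print(np.allclose(D @ x**2, 2 * x))  # exact for low-degree polynomials: True
```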

Keywords: double-walled carbon nanotubes, nonlocal continuum elasticity, free vibrations, molecular dynamics simulation, generalized differential quadrature method

Procedia PDF Downloads 272
8153 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions

Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen

Abstract:

Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) has been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes nearly all of the most frequently used distributions, such as the gamma, exponential, Weibull, and log-normal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide a marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or log-normal alternatives.
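
A minimal sketch of the nesting property the abstract relies on, using scipy's gengamma (shape parameters a and c; c = 1 recovers the gamma, a = 1 the Weibull, and the log-normal arises as a limit). The data are simulated stand-ins for skewed cost outcomes, not the Iranian records, and the joint/two-part structure of the MTJM is not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated right-skewed "cost" data; Weibull is the GG special case a = 1
costs = stats.weibull_min.rvs(c=1.4, scale=500.0, size=2000, random_state=rng)

# Fit the three-parameter generalized gamma (location pinned at 0)
a, c, loc, scale = stats.gengamma.fit(costs, floc=0)
print(f"fitted GG: a={a:.2f}, c={c:.2f}, scale={scale:.1f}")

# Compare in-sample fit against the nested gamma special case (c = 1)
ll_gg = stats.gengamma.logpdf(costs, a, c, loc=0, scale=scale).sum()
k, _, th = stats.gamma.fit(costs, floc=0)
ll_gamma = stats.gamma.logpdf(costs, k, loc=0, scale=th).sum()
print(f"log-likelihood: GG {ll_gg:.1f} vs gamma {ll_gamma:.1f}")
```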

Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma

Procedia PDF Downloads 157
8152 3D Printing Perceptual Models of Preference Using a Fuzzy Extreme Learning Machine Approach

Authors: Xinyi Le

Abstract:

In this paper, 3D printing orientations were determined through our perceptual model. Some FDM (fused deposition modeling) 3D printers, which are widely used in universities and industries, often require support structures during additive manufacturing. After removing the residual material, some surface artifacts remain at the contact points. These artifacts damage the function and visual appearance of the model. To limit the impact of these artifacts, we present a fuzzy extreme learning machine approach to find printing directions that avoid placing supports in perceptually significant regions. The proposed approach is able to solve the evaluation problem by combining both subjective knowledge and objective information. Our method combines the advantages of fuzzy theory, auto-encoders, and extreme learning machines. Fuzzy set theory is applied to deal with subjective preference information, and an auto-encoder step is used to extract good features without supervised labels before the extreme learning machine. An extreme learning machine is then trained to learn the perceptual models. The performance of this perceptual model is demonstrated on both natural and man-made objects. It is a good human-computer interaction practice that draws on supporting knowledge from both the machine side and the human side.
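
The fuzzy and auto-encoder stages are specific to the paper, but the extreme learning machine core is standard: a random hidden layer whose output weights are obtained in closed form. A minimal regression sketch, with illustrative sizes and data:

```python
import numpy as np

def elm_fit(X, y, n_hidden=200, reg=1e-3, seed=None):
    """Basic ELM: random hidden layer, ridge least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # random feature map
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy task: learn a preference-like score from two features
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2
W, b, beta = elm_fit(X, y, seed=1)
print(np.abs(elm_predict(X, W, b, beta) - y).mean())  # small training error
```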

Keywords: 3D printing, perceptual model, fuzzy evaluation, data-driven approach

Procedia PDF Downloads 416
8151 Explicit Iterative Scheme for Approximating a Common Solution of Generalized Mixed Equilibrium Problem and Fixed Point Problem for a Nonexpansive Semigroup in Hilbert Space

Authors: Mohammad Farid

Abstract:

In this paper, we introduce and study an explicit iterative method based on the hybrid extragradient method to approximate a common solution of a generalized mixed equilibrium problem and a fixed point problem for a nonexpansive semigroup in Hilbert space. Further, we prove that the sequence generated by the proposed iterative scheme converges strongly to this common solution. The common solution is the unique solution of a variational inequality problem and is the optimality condition for a minimization problem. The results presented in this paper supplement, extend, and generalize previously known results in this area.
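
The semigroup and equilibrium components require a proper Hilbert-space setting, but the extragradient building block itself is easy to show in finite dimensions: a prediction step followed by a correction step, both projected onto the feasible set. A sketch with an illustrative monotone affine operator and a ball constraint:

```python
import numpy as np

def project_ball(x, r=1.0):
    n = np.linalg.norm(x)
    return x if n <= r else r * x / n

# Monotone affine operator F(x) = A x + c (symmetric part of A is positive)
A = np.array([[2.0, 1.0], [-1.0, 2.0]])
c = np.array([-1.0, 0.5])
F = lambda x: A @ x + c

x = np.zeros(2)
tau = 0.1                                   # step size below 1 / Lip(F)
for _ in range(500):
    y = project_ball(x - tau * F(x))        # prediction step
    x = project_ball(x - tau * F(y))        # correction (extragradient) step

residual = np.linalg.norm(x - project_ball(x - tau * F(x)))
print(x, residual)                          # solves the variational inequality
```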

Keywords: generalized mixed equilibrium problem, fixed-point problem, nonexpansive semigroup, variational inequality problem, iterative algorithms, hybrid extragradient method

Procedia PDF Downloads 451
8150 Investigation on Machine Tools Energy Consumptions

Authors: Shiva Abdoli, Daniel T. Semere

Abstract:

Several studies have been conducted on the consumption of energy in the cutting process. Most of them focus on measuring the consumption and proposing methods to reduce it. In this work, the relation between the cutting parameters and the consumption is investigated in order to establish a generalized energy consumption model that can be used for process and production planning in real production lines. Using the generalized model, process planning can be carried out by taking energy into account as a function of the selected process parameters. Similarly, the generalized model can be used in production planning to select the right operational parameters, such as batch sizes, routing, and buffer size, in a production line. The description and derivation of the model as well as a case study are given in this paper to illustrate the applicability and validity of the model.
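
One way such a generalized model can look in practice is a specific-energy law fitted to measured cuts; the Gutowski-style form below (SEC = e0 + P_idle/MRR) and all numbers are hypothetical stand-ins, since the paper derives its own model form.

```python
import numpy as np

# Hypothetical specific-energy model: SEC = e0 + P_idle / MRR
rng = np.random.default_rng(3)
mrr = rng.uniform(5, 100, size=40)                # material removal rate, mm^3/s
sec = 2.0 + 180.0 / mrr + rng.normal(0, 0.2, 40)  # "measured" J/mm^3, with noise

# Linear least squares in the basis [1, 1/MRR]
X = np.column_stack([np.ones_like(mrr), 1.0 / mrr])
coef, *_ = np.linalg.lstsq(X, sec, rcond=None)
print(f"e0 ~ {coef[0]:.2f} J/mm^3, P_idle ~ {coef[1]:.1f} W")
```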

Keywords: process parameters, cutting process, energy efficiency, Material Removal Rate (MRR)

Procedia PDF Downloads 479
8149 Analysis of the Engineering Judgement Influence on the Selection of Geotechnical Parameters Characteristic Values

Authors: K. Ivandic, F. Dodigovic, D. Stuhec, S. Strelec

Abstract:

A characteristic value of a geotechnical parameter results from an engineering assessment. Its selection has to be based on technical principles and standards of engineering practice. It has been shown that the results of engineering assessments by different authors for the same problem and input data are significantly dispersed. A survey was conducted in which participants had to estimate the force that causes a 10 cm displacement at the top of an axially compressed in-situ pile. Fifty experts from all over the world took part in it. The lowest estimated force was 42% and the highest 133% of the force measured in the corresponding static pile load test. These extreme values result in significantly different technical solutions to the same engineering task. When selecting a characteristic value of a geotechnical parameter, the influence of the engineering assessment can be reduced by using statistical methods. An informative annex of Eurocode 1 prescribes the method of selecting the characteristic values of material properties. This is followed by Eurocode 7, with certain specificities linked to selecting characteristic values of geotechnical parameters. The paper shows the procedure of selecting characteristic values of a geotechnical parameter by using a statistical method with different initial conditions. The aim of the paper is to quantify the engineering assessment in the example of determining a characteristic value of a specific geotechnical parameter. It is assumed that this assessment is a random variable and that its statistical features will be determined. For this purpose, a survey was conducted among relevant experts from the field of geotechnical engineering. Finally, the results of the survey and of the application of the statistical method were compared.
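
For the statistical selection step, EN 1990 Annex D gives kn factors for a 5% fractile; in the "coefficient of variation unknown" case they match a Student-t prediction bound. A sketch, with expert estimates that mimic the survey spread (42%..133% of the measured force) rather than the actual survey data:

```python
import numpy as np
from scipy import stats

def characteristic_value(x, fractile=0.05):
    """5% fractile estimate (EN 1990 Annex D style, coeff. of variation unknown)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k_n = stats.t.ppf(1 - fractile, n - 1) * np.sqrt(1 + 1 / n)
    return x.mean() - k_n * x.std(ddof=1)

# Hypothetical expert estimates as fractions of the measured pile force
estimates = np.array([0.42, 0.65, 0.80, 0.95, 1.00, 1.05, 1.10, 1.20, 1.33])
print(f"mean = {estimates.mean():.2f}, X_k = {characteristic_value(estimates):.2f}")
```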

Keywords: characteristic values, engineering judgement, Eurocode 7, statistical methods

Procedia PDF Downloads 275
8148 A Comparison of Transdiagnostic Components in Generalized Anxiety Disorder, Unipolar Mood Disorder and Nonclinical Population

Authors: Imaneh Abbasi, Ladan Fata, Majid Sadeghi, Sara Banihashemi, Abolfazl Mohammadee

Abstract:

Background: Dimensional and transdiagnostic approaches, prompted by the high comorbidity among mental disorders, have captured researchers' and clinicians' interest in exploring the latent factors of the development and maintenance of some psychological disorders. The goal of the present study is to compare some of these common factors between generalized anxiety disorder and unipolar mood disorder. Methods: 27 patients with generalized anxiety disorder and 29 patients with depressive disorder were recruited using the SCID-I, and 69 nonclinical participants were selected using a GHQ cut-off point. MANCOVA was used to analyze the data. Results: The results show that worry, rumination, intolerance of uncertainty, maladaptive metacognitive beliefs, and experiential avoidance were all significantly different between the GAD and unipolar mood disorder groups. However, there were no significant differences in difficulties in emotion regulation and neuroticism between the two groups. Discussion: The results indicate that although there are some transdiagnostic and common factors in GAD and unipolar mood disorder, there may be some specific vulnerability factors for each disorder. Further study is needed to answer these questions.

Keywords: transdiagnostic, depression, generalized anxiety disorder, emotion regulation

Procedia PDF Downloads 476
8147 Modelling Hydrological Time Series Using Wakeby Distribution

Authors: Ilaria Lucrezia Amerise

Abstract:

The statistical modelling of precipitation data for a given portion of territory is fundamental for the monitoring of climatic conditions and for Hydrogeological Management Plans (HMP). This modelling is rendered particularly complex by the changes taking place in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the Wakeby distribution (with 5 parameters) as a theoretical reference model. The number and the quality of its parameters indicate that this distribution may be an appropriate choice for interpolating hydrological variables; moreover, the Wakeby is particularly suitable for describing phenomena producing heavy tails. The proposed estimation methods for determining the values of the Wakeby parameters are the same as those used for density functions with heavy tails. The commonly used procedure is the classic method of probability-weighted moments (PWM), although this has often shown difficulty of convergence, or rather convergence to a configuration of inappropriate parameters. In this paper, we analyze the problem of likelihood estimation for a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation settings. The reasons for adopting it lie in the sampling and asymptotic properties of maximum likelihood estimators, which improve the estimates obtained by providing indications of their variability and, therefore, of their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.
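
Since the Wakeby distribution is defined only through its quantile function (which is why quantile-based likelihood is the estimation route discussed above), a sketch of that function and of inverse-transform sampling may help; parameter values are illustrative, not fitted to any precipitation series.

```python
import numpy as np

def wakeby_quantile(F, xi, alpha, beta, gamma, delta):
    """Wakeby quantile function x(F) (Hosking-Wallis parameterization)."""
    F = np.asarray(F, dtype=float)
    return (xi
            + (alpha / beta) * (1.0 - (1.0 - F) ** beta)
            - (gamma / delta) * (1.0 - (1.0 - F) ** (-delta)))

# Inverse-transform sampling: U ~ Uniform(0, 1), x = Q(U)
rng = np.random.default_rng(7)
u = rng.uniform(size=10_000)
x = wakeby_quantile(u, xi=0.0, alpha=5.0, beta=2.0, gamma=1.0, delta=0.3)

# delta > 0 yields the heavy upper tail typical of extreme precipitation
print(np.percentile(x, [50, 90, 99, 99.9]))
```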

Keywords: generalized extreme values, likelihood estimation, precipitation data, Wakeby distribution

Procedia PDF Downloads 121
8146 A Stochastic Approach to Extreme Wind Speed Conditions on a Small Axial Wind Turbine

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, to model a real-life wind turbine, a probabilistic approach is proposed to model the dynamics of the blade elements of a small axial wind turbine under extreme stochastic wind speed conditions. It was found that the power and torque probability density functions decrease at these extreme wind speeds but remain finite. Moreover, we also found that it is possible to stabilize the power coefficient (stabilizing the output power) above rated wind speeds by tuning some control parameters. This method helps to explain the effect of turbulence on the quality and quantity of the harnessed power and aerodynamic torque.
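
The blade-element model itself is not reproduced here, but the stochastic flavour of the approach can be sketched by pushing Weibull-distributed wind speeds through the cube-law power relation; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
rho, radius, cp = 1.225, 1.5, 0.40            # air density, blade radius, power coeff.
area = np.pi * radius ** 2
v = rng.weibull(2.0, size=100_000) * 8.0      # Weibull(k=2) winds, scale 8 m/s

p = 0.5 * rho * area * cp * v ** 3            # aerodynamic power, W
extreme = v > 20.0                            # extreme-speed subsample
print(f"P(v > 20 m/s) = {extreme.mean():.4f}, "
      f"mean extreme power = {p[extreme].mean() / 1e3:.1f} kW")
```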

Keywords: probability, probability density function, stochastic, turbulence

Procedia PDF Downloads 560
8145 Concerns for Extreme Climate Conditions and Their Implications in Southwest Nigeria

Authors: Oyenike Eludoyin

Abstract:

Extreme climate conditions are deviations from the norms and are capable of causing upsets in many important environmental parameters, including disruption of the water balance and the air temperature balance. Studies have shown that extreme climate conditions can foretell disaster in regions with inadequate early warning systems. In this paper, we combined geographical information systems, statistics, and social surveys to evaluate physiologic indices [dewpoint temperature (Td), effective temperature index (ETI), and relative strain index (RSI)] and extreme climate conditions in different parts of southwest Nigeria. This was with a view to assessing the nature and the impact of the conditions on the people and their coping strategies. The results indicate that minimum, mean, and maximum temperatures were higher in the 1960-1990 period than in the 1991-2013 period in most areas, and more than 80% of the people adapt to thermal stress by changing clothing, installing air conditioners and fans at home and/or at the workplace, and sleeping outside at certain periods of the night and day. With respect to livelihoods, about 52% of the interviewed farmers indicated that too-early rainfall, late rainfall, prolonged dryness after an initial rainfall, excessive rainfall, and windstorms caused low crop yields. The main coping strategies (76%) were changing planting dates, diversifying crops, and practising mulching and intercropping. Government or institutional support reached less than 20%.

Keywords: coping strategies, extreme climate, livelihoods, physiologic comfort

Procedia PDF Downloads 258
8144 Object Recognition Approach Based on Generalized Hough Transform and Color Distribution Serving in Generating Arabic Sentences

Authors: Nada Farhani, Naim Terbeh, Mounir Zrigui

Abstract:

The recognition of objects contained in images has always presented a challenge in the field of research because of several difficulties arising from the variability of the shape, position, and contrast of objects, among others. In this paper, we are interested in the recognition of objects. The classical Hough transform (HT) provides a tool for detecting straight line segments in images. The HT technique has been generalized (GHT) for the detection of arbitrary forms. With the GHT, the forms sought are not necessarily defined analytically but rather by a particular silhouette. For more precision, we propose to combine the results from the GHT with the results of a similarity calculation between the histograms and the spatiograms of the images. The main purpose of our work is to use the recognition results to generate sentences in Arabic that summarize the content of the image.
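
The abstract does not specify which similarity measure is computed between histograms and spatiograms; as one common choice, the sketch below evaluates the Bhattacharyya coefficient between two grey-level histograms on synthetic images.

```python
import numpy as np

def bhattacharyya(h1, h2):
    """Similarity between two histograms after normalization (1 = identical)."""
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    return float(np.sum(np.sqrt(h1 * h2)))

rng = np.random.default_rng(5)
img_a = rng.integers(0, 256, size=(64, 64))                # synthetic image
img_b = np.clip(img_a + rng.integers(-10, 10, size=(64, 64)), 0, 255)

ha, _ = np.histogram(img_a, bins=32, range=(0, 256))
hb, _ = np.histogram(img_b, bins=32, range=(0, 256))
print(f"similarity = {bhattacharyya(ha.astype(float), hb.astype(float)):.3f}")
```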

Keywords: recognition of shape, generalized Hough transform, histogram, spatiogram, learning

Procedia PDF Downloads 133
8143 Evaluation of Newly Synthesized Steroid Derivatives Using In silico Molecular Descriptors and Chemometric Techniques

Authors: Milica Ž. Karadžić, Lidija R. Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Z. Kovačević, Anamarija I. Mandić, Katarina Penov-Gaši, Andrea R. Nikolić, Aleksandar M. Oklješa

Abstract:

This study considered the selection of in silico molecular descriptors and models for describing newly synthesized steroid derivatives and characterizing them using chemometric techniques. Multiple linear regression (MLR) models were established and gave the best molecular descriptors for quantitative structure-retention relationship (QSRR) modeling of the retention of the investigated molecules. The MLR models were free of multicollinearity among the selected molecular descriptors according to the variance inflation factor (VIF) values. The molecular descriptors used were ranked using the generalized pair correlation method (GPCM). With this method, significant differences between independent variables can be detected even when their correlations with the dependent variable are almost equal. The generated MLR models were statistically validated and cross-validated, and the best models were kept. The models were ranked using the sum of ranking differences (SRD) method. This method identifies the most consistent QSRR model and reveals similarities or dissimilarities between the models. In this study, SRD was performed using the average values of the experimentally observed data as a gold standard. The chemometric analysis was conducted in order to characterize the newly synthesized steroid derivatives for further investigation regarding their potential biological activity and further synthesis. This article is based upon work from COST Action (CM1105), supported by COST (European Cooperation in Science and Technology).
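
The multicollinearity check quoted above reduces to a short computation: the VIF of descriptor j is 1/(1-R^2) from regressing it on the remaining descriptors. A sketch on synthetic descriptors, one pair of which is nearly collinear:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the descriptor matrix X."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        r2 = 1.0 - (y - Z @ beta).var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(2)
d1 = rng.normal(size=100)
d2 = 0.95 * d1 + rng.normal(scale=0.3, size=100)   # nearly collinear descriptor
d3 = rng.normal(size=100)
print(vif(np.column_stack([d1, d2, d3])))          # first two VIFs are inflated
```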

Keywords: generalized pair correlation method, molecular descriptors, regression analysis, steroids, sum of ranking differences

Procedia PDF Downloads 326
8142 Assessing Missouri State Park Employee Perceptions of Vulnerability and Resilience to Extreme Weather Events

Authors: Ojetunde Ojewola, Mark Morgan, Sonja Wilhelm-Stanis

Abstract:

State parks and historic sites are vulnerable to extreme weather events, which can affect visitor experiences, management priorities, and legislative requests for disaster relief funds. Recently, global attention has been focused on perceptions of global warming and how extreme weather events might impact protected areas, both now and in the future. The effects of climate change are not equally distributed across the United States, leading to varied perceptions based on personal experience with extreme weather events. This study describes employee perceptions of vulnerability and resilience in Missouri State Parks & Historic Sites to extreme weather events that occur across the state, grouped according to physiographic provinces. Using a four-point rating scale, perceptions of vulnerability and resilience were divided into high and low sub-groups, allowing researchers to construct a two-by-two typology of employee responses. Subsequently, these data were used to develop a three-point continuum of environmental concern (higher scores meant more concern). Employee scores were then compared against a statewide assessment that combined social, economic, infrastructural, and environmental indicators of vulnerability and resilience. State park employees thought the system was less vulnerable and more resilient to climate change than the statewide assessment indicated. This result was consistent in three out of five physiographic regions across Missouri. The implications suggest that Missouri State Parks should develop a climate change adaptation strategy for emergency preparedness.

Keywords: extreme weather events, resilience, state parks, vulnerability

Procedia PDF Downloads 104
8141 Pressure Distribution, Load Capacity, and Thermal Effect with Generalized Maxwell Model in Journal Bearing Lubrication

Authors: M. Guemmadi, A. Ouibrahim

Abstract:

This numerical investigation aims to evaluate how a viscoelastic lubricant described by a generalized Maxwell model affects the pressure distribution, the load capacity, and the thermal effect in journal bearing lubrication. For this purpose, we use a CFD software package supplemented with adapted user-defined functions (UDFs) to solve the coupled equations of momentum, energy, and the viscoelastic (generalized Maxwell) model. Two parameters, viscosity and relaxation time, are varied to show how viscoelasticity substantially affects the pressure distribution, the load capacity, and the thermal transfer in comparison with a Newtonian lubricant. These results were also compared with the available published results.
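
The CFD/UDF implementation is beyond a short sketch, but the constitutive model itself is compact: a generalized Maxwell model is a parallel set of Maxwell branches whose stress relaxation modulus is a Prony series. The branch moduli and relaxation times below are illustrative, not those of the lubricant studied.

```python
import numpy as np

def maxwell_relaxation_modulus(t, G_i, tau_i):
    """Generalized Maxwell model: G(t) = sum_i G_i * exp(-t / tau_i)."""
    t = np.asarray(t, dtype=float)[:, None]
    return np.sum(G_i * np.exp(-t / tau_i), axis=1)

# Two-branch illustrative lubricant: branch moduli (Pa) and relaxation times (s)
G_i = np.array([800.0, 200.0])
tau_i = np.array([1e-4, 1e-3])
t = np.array([0.0, 1e-4, 1e-3, 1e-2])
print(maxwell_relaxation_modulus(t, G_i, tau_i))   # stress relaxes toward zero
```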

Keywords: journal bearing, lubrication, Maxwell model, viscoelastic fluids, computational modelling, load capacity

Procedia PDF Downloads 522
8140 Population Size Estimation Based on the GPD

Authors: O. Anan, D. Böhning, A. Maruotti

Abstract:

The purpose of the study is to estimate an elusive target population size under a truncated count model that accounts for heterogeneity. The proposed estimator is based on the generalized Poisson distribution (GPD), which extends the Poisson distribution by adding a dispersion parameter. It is thus a useful model for capture-recapture data where concurrent events are not homogeneous; in addition, it can account for over-dispersion and under-dispersion. The ratios of neighboring frequency counts are used as a tool for investigating whether the generalized Poisson or the plain Poisson distribution is valid. Since capture-recapture approaches do not provide the zero counts, the parameters are estimated by adapting the EM-algorithm technique to the zero-truncated generalized Poisson distribution. The properties and the comparative performance of the proposed estimator were investigated through simulation studies. Furthermore, some empirical examples are presented to provide insights into the behavior of the estimators.
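
The GPD extension with its EM step is the paper's contribution; the underlying logic is easiest to see in the plain Poisson zero-truncation case, where the truncated-mean equation gives lambda and a Horvitz-Thompson-style correction gives N. An illustrative simulation:

```python
import numpy as np

def ztp_lambda(xbar, iters=100):
    """MLE of Poisson lambda from zero-truncated data: lam/(1-exp(-lam)) = xbar."""
    lam = xbar
    for _ in range(iters):
        lam = xbar * (1.0 - np.exp(-lam))   # convergent fixed-point iteration
    return lam

# Capture-recapture style simulation: units with count 0 are never observed
rng = np.random.default_rng(9)
N_true, lam_true = 1000, 0.8
counts = rng.poisson(lam_true, size=N_true)
observed = counts[counts > 0]

lam = ztp_lambda(observed.mean())
N_hat = len(observed) / (1.0 - np.exp(-lam))   # correct for the unseen zeros
print(f"n observed = {len(observed)}, N_hat = {N_hat:.0f} (true N = {N_true})")
```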

Keywords: capture-recapture methods, ratio plot, heterogeneous population, zero-truncated count

Procedia PDF Downloads 419
8139 On a Single Server Queue with Arrivals in Batches of Variable Size, Generalized Coxian-2 Service and Compulsory Server Vacations

Authors: Kailash C. Madan

Abstract:

We study the steady-state behaviour of a batch arrival single server queue in which a first service with general service times is compulsory and a second service with general service times is optional. We term such a two-phase service generalized Coxian-2 service. Just after completing a service, the server must take a vacation of random length with generally distributed vacation times. We obtain steady-state probability generating functions for the queue size, as well as the steady-state mean queue size at a random epoch, in explicit and closed form. Some particular cases of interest, including some known results, have been derived.

Keywords: batch arrivals, compound Poisson process, generalized Coxian-2 service, steady state

Procedia PDF Downloads 434
8138 Classical and Bayesian Inference of the Generalized Log-Logistic Distribution with Applications to Survival Data

Authors: Abdisalam Hassan Muse, Samuel Mwalili, Oscar Ngesa

Abstract:

A generalized log-logistic distribution with variable shapes of the hazard rate is introduced and studied, extending the log-logistic distribution by adding an extra parameter to the classical distribution and leading to greater flexibility in analysing and modeling various data types. The proposed distribution has a large number of well-known lifetime distributions as special sub-models, such as the Weibull, log-logistic, exponential, and Burr XII distributions. Its basic mathematical and statistical properties are derived. The method of maximum likelihood is adopted for estimating the unknown parameters of the proposed distribution, and a Monte Carlo simulation study is carried out to assess the behavior of the estimators. The importance of this distribution lies in its ability to model both monotone (increasing and decreasing) and non-monotone (unimodal, bathtub, and reversed-bathtub shape) hazard rate functions, which are quite common in survival and reliability data analysis. Furthermore, the flexibility and usefulness of the proposed distribution are illustrated on a real-life data set, comparing it to its sub-models (the Weibull, log-logistic, and Burr XII distributions) and to other three-parameter parametric survival distributions, such as the exponentiated Weibull, the three-parameter log-normal, the three-parameter gamma, the three-parameter Weibull, and the three-parameter log-logistic (also known as shifted log-logistic) distributions. The proposed distribution provided a better fit than all of the competing distributions based on the goodness-of-fit tests, the log-likelihood, and information criterion values. Finally, a Bayesian analysis and an assessment of the performance of Gibbs sampling for the data set are also carried out.
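
The hazard-shape flexibility described above is easiest to see against the classical two-parameter log-logistic baseline, whose hazard is decreasing for beta <= 1 and unimodal for beta > 1 (the proposed extra parameter, whose exact form the abstract does not give, adds the remaining shapes). A sketch:

```python
import numpy as np

def loglogistic_hazard(t, alpha, beta):
    """Hazard of the log-logistic distribution (alpha = scale, beta = shape)."""
    z = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1.0) / (1.0 + z)

t = np.linspace(0.01, 10, 5)
print(loglogistic_hazard(t, alpha=2.0, beta=0.8))  # monotone decreasing
print(loglogistic_hazard(t, alpha=2.0, beta=2.5))  # unimodal
```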

Keywords: hazard rate function, log-logistic distribution, maximum likelihood estimation, generalized log-logistic distribution, survival data, Monte Carlo simulation

Procedia PDF Downloads 175
8137 Generalized Approach to Linear Data Transformation

Authors: Abhijith Asok

Abstract:

This paper presents a generalized approach to the simple linear data transformation, Y = bX, through an integration of multidimensional coordinate geometry, vector space theory, and polygonal geometry. The scaling is performed by adding an additional 'dummy dimension' to the n-dimensional data, which helps plot two-dimensional component-wise straight lines on pairs of dimensions. The end result is a set of scaled extensions of observations in any of the 2n spatial divisions, where n is the total number of applicable dimensions/dataset variables, created by shifting the n-dimensional plane along the 'dummy axis'. The derived scaling factor was found to depend on the coordinates of the common point of origin of the diverging straight lines and on the plane of extension, chosen on and perpendicular to the 'dummy axis', respectively. This result indicates the geometrical interpretation of a linear data transformation and hence opportunities for a more informed choice of the factor b, based on a better choice of these coordinate values. The paper goes on to identify the effect of this transformation on certain popular distance metrics, wherein for many, the distance metric retained the same scaling factor as that of the features.
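
One illustrative reading of the dummy-dimension construction (not the paper's exact derivation, which allows a general origin and plane): lift each observation x to (x, 1) and intersect the ray from the origin with the hyperplane where the dummy coordinate equals b; dropping the dummy coordinate recovers Y = bX.

```python
import numpy as np

def scale_via_dummy_axis(X, b):
    """Scale each row of X by b using an appended constant 'dummy' coordinate."""
    lifted = np.column_stack([X, np.ones(len(X))])   # add the dummy dimension
    t = b / lifted[:, -1]                            # ray parameter hitting dummy = b
    return (lifted * t[:, None])[:, :-1]             # drop the dummy coordinate

X = np.array([[1.0, 2.0], [3.0, -1.0]])
print(np.allclose(scale_via_dummy_axis(X, 2.5), 2.5 * X))  # True
```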

Keywords: data transformation, dummy dimension, linear transformation, scaling

Procedia PDF Downloads 282
8136 A Fundamental Functional Equation for Lie Algebras

Authors: Ih-Ching Hsu

Abstract:

Inspired by the so-called Jacobi identity (xy)z + (yz)x + (zx)y = 0, the following class of functional equations, EQ I: F[F(x, y), z] + F[F(y, z), x] + F[F(z, x), y] = 0, is proposed, researched, and generalized. The research methodology begins with classical methods for functional equations and then evolves into the discovery of any implicit algebraic structures. One of this paper's major findings is that EQ I, under the two additional conditions F(x, x) = 0 and F(x, y) + F(y, x) = 0, proves to be a fundamental functional equation for Lie algebras. The existence of non-trivial solutions of EQ I can be proven by defining F(p, q) = [p, q] = pq - qp, where p and q are quaternions and pq is the quaternion product of p and q. EQ I can be generalized to the following class of functional equations, EQ II: F[G(x, y), z] + F[G(y, z), x] + F[G(z, x), y] = 0. Concluding statement: with a major finding proven and non-trivial solutions derived, this research paper illustrates and provides a new functional equation scheme for studies in two major areas: (1) What underlying algebraic structures can be defined and/or derived from EQ I or EQ II? (2) What conditions can be imposed so that conditional general solutions to EQ I and EQ II can be found, investigated, and applied?
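
The existence argument quoted above can be checked numerically: the quaternion commutator F(p, q) = pq - qp satisfies EQ I (the Jacobi identity) along with the two side conditions. A self-contained verification on random quaternions:

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def F(p, q):                       # commutator bracket F(p, q) = pq - qp
    return qmul(p, q) - qmul(q, p)

rng = np.random.default_rng(1)
x, y, z = (rng.normal(size=4) for _ in range(3))
jacobi = F(F(x, y), z) + F(F(y, z), x) + F(F(z, x), y)
print(np.allclose(jacobi, 0), np.allclose(F(x, x), 0))  # True True
```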

Keywords: fundamental functional equation, generalized functional equations, Lie algebras, quaternions

Procedia PDF Downloads 203
8135 Analysis of Risk Factors Affecting the Motor Insurance Pricing with Generalized Linear Models

Authors: Puttharapong Sakulwaropas, Uraiwan Jaroengeratikun

Abstract:

In the casualty insurance business, optimal premium pricing and adequate costing are important to an insurance company's risk management. Normally, the insurance pure premium can be determined by multiplying the claim frequency by the claim cost. The aim of this research was to study the application of generalized linear models in selecting the risk factors for the claim frequency and claim cost models used to estimate a pure premium. In this study, the data set consisted of comprehensive motor insurance claims provided by one of the insurance companies in Thailand. The results of this study found that the risk factors significantly related to the pure premium at the 0.05 level were the no-claim bonus (NCB) and the use of the car (car code).
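
A hedged sketch of the frequency half of the pure-premium calculation: a Poisson GLM with a log link and an exposure offset, fitted with statsmodels on synthetic policies. The variables ncb and car_code mirror the factors named above, but the data and coefficients are invented for illustration; severity would typically be a separate gamma GLM.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 5000
df = pd.DataFrame({
    "ncb": rng.integers(0, 2, n),            # 1 = no-claim bonus
    "car_code": rng.integers(0, 3, n),       # use-of-car class
    "exposure": rng.uniform(0.5, 1.0, n),    # policy years
})
lam = df.exposure * np.exp(-1.2 - 0.6 * df.ncb + 0.3 * (df.car_code == 2))
df["claims"] = rng.poisson(lam)

# Claim-frequency GLM: Poisson, log link, log-exposure offset
X = sm.add_constant(pd.get_dummies(df[["ncb", "car_code"]], columns=["car_code"],
                                   drop_first=True, dtype=float))
freq = sm.GLM(df.claims, X, family=sm.families.Poisson(),
              offset=np.log(df.exposure)).fit()
print(freq.summary().tables[1])
# pure premium = E[frequency] * E[severity]
```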

Keywords: generalized linear models, risk factor, pure premium, regression model

Procedia PDF Downloads 446
8134 Modelling Volatility of Cryptocurrencies: Evidence from GARCH Family of Models with Skewed Error Innovation Distributions

Authors: Timothy Kayode Samson, Adedoyin Isola Lawal

Abstract:

The past five years have shown a sharp increase in public interest in the crypto market, with its market capitalization growing from $100 billion in June 2017 to $2158.42 billion on April 5, 2022. Despite the extreme volatility of cryptocurrencies, the use of skewed error innovation distributions in modelling their volatility behaviour has not been given much research attention. Hence, this study models the volatility of the 5 largest cryptocurrencies by market capitalization (Bitcoin, Ethereum, Tether, Binance coin, and USD Coin) using four variants of GARCH models (GJR-GARCH, sGARCH, EGARCH, and APARCH), each estimated under three skewed error innovation distributions (skewed normal, skewed Student-t, and skewed generalized error distributions). Daily closing prices of these currencies were obtained from the Yahoo Finance website. The findings reveal that Binance coin reported higher mean returns compared with the other digital currencies, while the skewness indicates that Binance coin, Tether, and USD Coin increased more than they decreased in value within the period of study. For both Bitcoin and Ethereum, negative skewness was obtained, meaning that within the period of study, the returns of these currencies decreased more than they increased in value. Returns from these cryptocurrencies were found to be stationary but not normally distributed, with evidence of the ARCH effect. The skewness parameters in all best forecasting models were significant (p<.05), justifying the use of skewed error innovation distributions, which have fatter tails than the normal, Student-t, and generalized error innovation distributions. For Binance coin, EGARCH-sstd outperformed the other volatility models, while for Bitcoin, Ethereum, Tether, and USD Coin, the best forecasting models were EGARCH-sstd, APARCH-sstd, EGARCH-sged, and GJR-GARCH-sstd, respectively. This suggests the superiority of the skewed Student-t and skewed generalized error distributions over the skewed normal distribution.
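
As a sketch of how one of the winning specifications can be estimated, the snippet below fits an EGARCH(1,1) with skewed Student-t errors using the third-party arch package on simulated returns; the data are synthetic stand-ins for the crypto series, and arch's 'skewt' (Hansen) distribution is assumed to correspond to the paper's skewed Student-t.

```python
import numpy as np
from arch import arch_model

# Simulated heavy-tailed "returns" in percent units (helps the optimizer)
rng = np.random.default_rng(4)
returns = rng.standard_t(df=5, size=2000) * 2.0

# EGARCH(1,1) with skewed Student-t innovations (EGARCH-sstd)
am = arch_model(returns, mean="Constant", vol="EGARCH",
                p=1, o=1, q=1, dist="skewt")
res = am.fit(disp="off")
print(res.params[["eta", "lambda"]])   # tail thickness and skewness parameters
```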

Keywords: skewed generalized error distribution, skewed normal distribution, skewed Student-t distribution, APARCH, EGARCH, sGARCH, GJR-GARCH

Procedia PDF Downloads 77
8133 Active Space Debris Removal by Extreme Ultraviolet Radiation

Authors: A. Anandha Selvan, B. Malarvizhi

Abstract:

In recent years, the problem of space debris has become very serious. The mass of artificial objects in orbit has increased quite steadily, at a rate of about 145 metric tons annually, leading to a total tally of approximately 7000 metric tons. About 97% of space debris objects now orbit in the LEO region, where catastrophic collisions mostly occur and where each collision generates new debris. Thus, we propose a concept for cleaning space debris in the thermosphere region by directing extreme ultraviolet (EUV) radiation from a re-orbiter to the region in front of a space debris object. In our concept, the EUV radiation causes the thermosphere to expand by reacting with atmospheric gas particles, so that drag is produced in front of the space debris object. This drag force is high enough to slow down the object's relative velocity; the object therefore gradually loses altitude and finally enters the earth's atmosphere. After the first target is removed, the re-orbiter can move on to the next target. This method removes space debris objects without catching them, so it can be applied to a wide range of objects without regard to their shape or rotation. This paper discusses the operation of the re-orbiter for removing space debris in the thermosphere region.

Keywords: active space debris removal, space debris, LEO, extreme ultraviolet, re-orbiter, thermosphere

Procedia PDF Downloads 441
8132 Impact of Weather Conditions on Generalized Frequency Division Multiplexing over Gamma Gamma Channel

Authors: Muhammad Sameer Ahmed, Piotr Remlein, Tansal Gucluoglu

Abstract:

Generalized frequency division multiplexing (GFDM) over the free-space optical channel can be a good option for implementing free-space optical communication systems. This technique has several strengths, e.g., good spectral efficiency, a low peak-to-average power ratio (PAPR), adaptability, and low co-channel interference. In this paper, the impact of weather conditions such as haze, rain, and fog on GFDM over the gamma-gamma channel model is discussed. A trade-off between link distance and system performance under intense weather conditions is also analysed. The symbol error probability (SEP) of GFDM over the gamma-gamma turbulence channel is derived and verified with computer simulations.
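
The gamma-gamma turbulence model referenced above treats the received irradiance as the product of two independent unit-mean gamma factors (large- and small-scale eddies). A sampling sketch with illustrative alpha and beta, checked against the known scintillation-index formula:

```python
import numpy as np

rng = np.random.default_rng(12)
alpha, beta = 4.0, 2.0                          # illustrative turbulence parameters
X = rng.gamma(shape=alpha, scale=1.0 / alpha, size=1_000_000)
Y = rng.gamma(shape=beta, scale=1.0 / beta, size=1_000_000)
I = X * Y                                       # unit-mean irradiance

si2 = I.var() / I.mean() ** 2                   # scintillation index
print(f"mean = {I.mean():.3f}, SI = {si2:.3f}")
# theory: SI = 1/alpha + 1/beta + 1/(alpha*beta) = 0.875 here
```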

Keywords: free space optics, generalized frequency division multiplexing, weather conditions, gamma gamma distribution

Procedia PDF Downloads 148
8131 Analysis of Factors Affecting the Number of Infant and Maternal Mortality in East Java with Geographically Weighted Bivariate Generalized Poisson Regression Method

Authors: Luh Eka Suryani, Purhadi

Abstract:

Poisson regression is a non-linear regression model with a response variable in the form of count data that follows the Poisson distribution. A pair of count variables showing high correlation can be modeled with bivariate Poisson regression. The numbers of infant deaths and maternal deaths are count data of this kind. The Poisson regression assumption is equidispersion, where the mean and variance are equal. However, actual count data often have a variance that is greater or less than the mean (overdispersion or underdispersion). Violations of this assumption can be overcome by applying generalized Poisson regression. The characteristics of each regency can also affect the number of cases that occur; this issue can be addressed by a spatial analysis called geographically weighted regression. This study analyzes the numbers of infant deaths and maternal deaths based on conditions in East Java in 2016 using the Geographically Weighted Bivariate Generalized Poisson Regression (GWBGPR) method. Modeling is done with adaptive bisquare kernel weighting, which produces 3 regency groups based on the infant mortality rate and 5 regency groups based on the maternal mortality rate. Variables that significantly influence the numbers of infant and maternal deaths are the percentages of pregnant women who visit health workers at least 4 times during pregnancy, pregnant women who receive Fe3 tablets, obstetric complications handled, households with clean and healthy behavior, and married women whose first marriage occurred under the age of 18.
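
The adaptive bisquare kernel named above admits a two-line definition: the bandwidth at each site is the distance to its k-th nearest neighbour, and weights decay quartically to zero at that distance. A sketch with illustrative inter-regency distances:

```python
import numpy as np

def adaptive_bisquare_weights(dists, k):
    """GWR adaptive bisquare kernel: w = (1 - (d/b)^2)^2 for d < b, else 0,
    with bandwidth b = distance to the k-th nearest site."""
    b = np.sort(dists)[k - 1]
    return np.where(dists < b, (1.0 - (dists / b) ** 2) ** 2, 0.0)

# Distances from one regency centroid to the others (illustrative, km)
d = np.array([0.0, 30.0, 55.0, 80.0, 120.0, 200.0])
print(adaptive_bisquare_weights(d, k=4))   # zero weight beyond the 4th neighbour
```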

Keywords: adaptive bisquare kernel, GWBGPR, infant mortality, maternal mortality, overdispersion

Procedia PDF Downloads 137
8130 Development of Generalized Correlation for Liquid Thermal Conductivity of N-Alkane and Olefin

Authors: A. Ishag Mohamed, A. A. Rabah

Abstract:

The objective of this research is to develop a generalized correlation for the prediction of the thermal conductivity of n-alkanes and alkenes. There is little research on, and a lack of correlations for, the thermal conductivity of liquids in the open literature. The available experimental data were collected, covering the n-alkane and alkene groups. The data were assumed to correlate with temperature using the Filippov correlation. Nonparametric regression with the Grace algorithm was used to develop the generalized correlation model. A spreadsheet program based on Microsoft Excel was used to plot and calculate the values of the coefficients. The results obtained were compared with the data found in Perry's Chemical Engineers' Handbook. The experimental data correlated with temperature over the range 273.15 to 673.15 K, with R2 = 0.99. The developed correlation reproduced experimental data that were not included in the regression with an absolute average percent deviation (AAPD) of less than 7%. Thus the spreadsheet is quite accurate and produces reliable data.
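
The validation metric quoted above (AAPD) is worth pinning down, since the 7% claim rests on it; a sketch with hypothetical held-out conductivity values (the Filippov-type correlation itself is not reproduced here):

```python
import numpy as np

def aapd(pred, obs):
    """Absolute average percent deviation."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return 100.0 * np.mean(np.abs(pred - obs) / np.abs(obs))

# Hypothetical held-out measurements vs. correlation output, W/(m K)
obs = np.array([0.137, 0.128, 0.117, 0.105])
pred = np.array([0.135, 0.129, 0.115, 0.107])
print(f"AAPD = {aapd(pred, obs):.2f} %")   # < 7 % would match the paper's claim
```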

Keywords: N-Alkanes, N-Alkenes, nonparametric, regression

Procedia PDF Downloads 636
8129 Spirometric Reference Values in 236,606 Healthy, Non-Smoking Chinese Aged 4–90 Years

Authors: Jiashu Shen

Abstract:

Objectives: Spirometry is a basic reference for health evaluation that is widely used in clinical practice. Previous spirometric references are no longer applicable because of drastic changes in social and natural circumstances in China, so new reference values for the spirometry of the Chinese population are urgently needed. Method: Spirometric reference values were established using the statistical modeling method Generalized Additive Models for Location, Scale and Shape (GAMLSS) for forced expiratory volume in 1 s (FEV1), forced vital capacity (FVC), FEV1/FVC, and maximal mid-expiratory flow (MMEF). Results: Data from 236,606 healthy non-smokers aged 4-90 years were collected from the MJ Health Check database. Spirometry equations for FEV1, FVC, MMEF, and FEV1/FVC were established, including the predicted values and lower limits of normal (LLNs) by sex. The predictive equations developed for the spirometric results describe the relationship between spirometry and age while eliminating the effects of height as a variable. Most previous predictive equations for Chinese spirometry significantly overestimated (to be exact, with mean differences of 22.21% in FEV1 and 31.39% in FVC for males, along with differences of 26.93% in FEV1 and 35.76% in FVC for females) or underestimated (with mean differences of -5.81% in MMEF and -14.56% in FEV1/FVC for males, along with a difference of -14.54% in FEV1/FVC for females) the lung function measurements found in this study. Cross-validation showed that our equations fit well, and the means of the measured and estimated values were compared, with good agreement. Conclusions: Our study updates the spirometric reference equations for Chinese people of all ages and provides comprehensive values for both physical examination and clinical diagnosis.
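
GAMLSS-based references of this kind are commonly summarized by LMS (Box-Cox Cole-Green) curves, from which a z-score and the lower limit of normal follow directly; the L, M, S values below are hypothetical, not the study's, and the study's exact parameterization may differ.

```python
import numpy as np

def lms_zscore(y, L, M, S):
    """z-score from LMS reference values at a given age/height/sex."""
    if abs(L) < 1e-8:
        return np.log(y / M) / S
    return ((y / M) ** L - 1.0) / (L * S)

# Hypothetical reference values for FEV1 at one age/height/sex combination
L, M, S = 0.9, 3.2, 0.12        # skewness, median (litres), coeff. of variation
z = lms_zscore(2.4, L, M, S)    # a measured FEV1 of 2.4 L
lln = "below LLN" if z < -1.645 else "within normal limits"
print(f"z = {z:.2f}: {lln}")    # LLN = 5th percentile, z = -1.645
```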

Keywords: Chinese, GAMLSS model, reference values, spirometry

Procedia PDF Downloads 113
8128 A Generalization of Planar Pascal’s Triangle to Polynomial Expansion and Connection with Sierpinski Patterns

Authors: Wajdi Mohamed Ratemi

Abstract:

The very well-known stacked sets of numbers referred to as Pascal's triangle present the coefficients of the binomial expansion of the form (x+y)^n. This paper presents an approach (the Staircase Horizontal Vertical, or SHV, method) to generalizing the planar Pascal's triangle to polynomial expansions of the form (x+y+z+w+r+⋯)^n. The presented generalization of Pascal's triangle is different from other generalizations given in the literature. The coefficients of the generalized Pascal's triangles presented in this work are generated by inspection, using embedded Pascal's triangles. The coefficients of the I-variable expansion are generated by horizontally laying out the Pascal elements of the (I-1)-variable expansion in a staircase manner and multiplying them by the relevant columns of vertically laid out classical Pascal elements, hence avoiding factorial calculations when generating the coefficients of the polynomial expansion. Furthermore, the classical Pascal's triangle has a pattern built into it regarding its odd and even numbers; this pattern is known as the Sierpinski triangle. In this study, a presentation of Sierpinski-like patterns of the generalized Pascal's triangles is given. Applications of the coefficients of the binomial expansion (Pascal's triangle) or polynomial expansion (generalized Pascal's triangles) can be found in the areas of combinatorics and probability.
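
A convolution-style implementation of the same idea (not the authors' SHV tableau): the coefficients of the I-variable expansion are built from the (I-1)-variable ones by splitting one exponent with a Pascal row generated by the multiplicative recurrence, so no factorials are evaluated.

```python
def multinomial_coeffs(n_vars, n):
    """Coefficients of (x1 + ... + x_{n_vars})^n, one variable at a time."""
    coeffs = {(n,): 1}                      # start with (x1)^n
    for _ in range(n_vars - 1):
        new = {}
        for expo, c in coeffs.items():
            k = expo[-1]                    # split the last exponent in two
            row = [1]                       # Pascal row: C(k,j+1) = C(k,j)*(k-j)/(j+1)
            for j in range(k):
                row.append(row[-1] * (k - j) // (j + 1))
            for j, b in enumerate(row):
                key = expo[:-1] + (k - j, j)
                new[key] = new.get(key, 0) + c * b
        coeffs = new
    return coeffs

# (x+y+z)^2: squares get coefficient 1, cross terms get 2
print(multinomial_coeffs(3, 2))
```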

Keywords: pascal’s triangle, generalized pascal’s triangle, polynomial expansion, sierpinski’s triangle, combinatorics, probabilities

Procedia PDF Downloads 344
8127 Extreme Rainfall Frequency Analysis for Meteorological Sub-Division 4 of India Using L-Moments

Authors: Arti Devi, Parthasarthi Choudhury

Abstract:

Extreme rainfall frequencies for Meteorological Sub-Division 4 of India were analysed using the L-moments approach. Serial correlation and Mann-Kendall tests were conducted to check the serial independence and stationarity of the observations. The discordancy measure was computed for the sites to detect discordant ones. Regional homogeneity was tested by comparison with 500 simulated homogeneous regions generated using a four-parameter kappa distribution. From the five extreme value distributions (GPD, GLO, GEV, P3, and LP3), the best-fit distribution was selected based on the ZDIST statistic and the L-moment ratio diagram. The LN3 distribution was selected, and a regional rainfall frequency relationship was established using the index-rainfall procedure. A regional mean rainfall relationship was developed using multiple linear regression with the latitude and longitude of the sites as variables.
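
The L-moment machinery above starts from sample probability-weighted moments; a compact sketch of the first four sample L-moments and the ratios t3 and t4 (illustrative Gumbel sample, not station data):

```python
import numpy as np

def sample_lmoments(x):
    """First four sample L-moments via unbiased PWMs (Hosking)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) * x) / (n * (n - 1))
    b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    l1, l2 = b0, 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2       # mean, L-scale, L-skewness, L-kurtosis

rng = np.random.default_rng(6)
amax = rng.gumbel(loc=100, scale=25, size=60)   # annual-maximum-like sample
print([round(v, 3) for v in sample_lmoments(amax)])
# Gumbel parent: t3 ~ 0.170, t4 ~ 0.150 in large samples
```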

Keywords: L-moments, ZDIST statistic, serial correlation, Mann-Kendall test

Procedia PDF Downloads 422
8126 Short-Term Effects of Extreme Temperatures on Cause-Specific Cardiovascular Admissions in Beijing, China

Authors: Deginet Aklilu, Tianqi Wang, Endwoke Amsalu, Wei Feng, Zhiwei Li, Xia Li, Lixin Tao, Yanxia Luo, Moning Guo, Xiangtong Liu, Xiuhua Guo

Abstract:

Extreme temperature-related cardiovascular diseases (CVDs) have become a growing public health concern. However, the impact of temperature on cause-specific CVDs has not been well studied in the study area. The objective of this study was to assess the impact of temperature on cause-specific cardiovascular hospital admissions in Beijing, China. We obtained data covering 16 districts in Beijing from 2013 to 2017 from 172 large general hospitals in the Beijing Public Health Information Center Cardiovascular Case Database and from the China Meteorological Administration. We used a time-stratified case-crossover design with a distributed lag non-linear model (DLNM) to derive the impact of temperature on CVD admissions with lags of up to 27 days. The temperature data were stratified as cold (extreme and moderate) and hot (moderate and extreme). Within five years (January 2013-December 2017), a total of 460,938 CVD admission cases (54.9% male, 45.1% female) were reported. The exposure-response relationship for hospitalization was described by a "J" shape for total and cause-specific admissions. An increase in the six-day moving average temperature from moderately hot (30.2 °C) to extremely hot (36.9 °C) resulted in a significant increase in CVD admissions of 16.1% (95% CI = 12.8%-28.9%). However, the effect of cold temperature exposure on CVD admissions over a lag of 0-27 days was found to be non-significant, with a relative risk of 0.45 (95% CI = 0.378-0.55) for extreme cold (-8.5 °C) and 0.53 (95% CI = 0.47-0.60) for moderate cold (-5.6 °C). The results of this study indicate that exposure to extremely high temperatures is strongly associated with an increase in cause-specific CVD admissions. These findings may help raise awareness among the general population, government, and private sectors regarding the effects of weather conditions on CVD.

Keywords: admission, Beijing, cardiovascular diseases, distributed lag non linear model, temperature

Procedia PDF Downloads 39
8125 An Efficient Algorithm of Time Step Control for Error Correction Method

Authors: Youngji Lee, Yonghyeon Jeon, Sunyoung Bu, Philsu Kim

Abstract:

The aim of this paper is to construct an algorithm of time step control for the error correction method recently developed by one of the authors for solving stiff initial value problems. It is achieved with the generalized Chebyshev polynomial and the corresponding error correction method. The main idea of the proposed scheme lies in the use of duplicated node points in the generalized Chebyshev polynomials of two different degrees, adding the necessary sample points instead of re-sampling all points. At each integration step, the proposed method comprises two equations, one for the solution and one for the error. The constructed algorithm controls both the error and the time step size simultaneously and offers good computational performance compared with the original method. Two stiff problems are numerically solved to assess the effectiveness of the proposed scheme.
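
The accept/reject step-size logic described above is generic enough to sketch; below, an RK2/Euler pair stands in for the paper's Chebyshev error-correction pair, and the controller constants are conventional illustrative choices.

```python
import numpy as np

def controlled_steps(f, y0, t0, t_end, step, h0, order, tol=1e-6):
    """Advance y' = f(t, y), accepting a step only when its error estimate <= tol."""
    t, y, h = t0, np.asarray(y0, dtype=float), h0
    while t < t_end:
        h = min(h, t_end - t)
        y_new, err = step(f, t, y, h)        # candidate solution + error estimate
        if err <= tol:
            t, y = t + h, y_new              # accept the step
        # standard controller: steer h toward the tolerance, with safety limits
        h *= min(5.0, max(0.2, 0.9 * (tol / max(err, 1e-16)) ** (1.0 / (order + 1))))
    return t, y

def rk2_step(f, t, y, h):
    """Midpoint RK2 with an Euler-based error estimate (a stand-in pair)."""
    k1 = f(t, y)
    y_mid = y + h * f(t + h / 2, y + h / 2 * k1)
    return y_mid, float(np.max(np.abs(y_mid - (y + h * k1))))

f = lambda t, y: -50.0 * (y - np.cos(t))     # mildly stiff test problem
t, y = controlled_steps(f, [0.0], 0.0, 2.0, rk2_step, 1e-3, order=1)
print(t, y)  # y(2) ~ (2500*cos 2 + 50*sin 2) / 2501 ~ -0.398
```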

Keywords: stiff initial value problem, error correction method, generalized Chebyshev polynomial, node points

Procedia PDF Downloads 548