Search results for: closed form method of variance estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25220

25130 Efficient Modeling Technique for Microstrip Discontinuities

Authors: Nassim Ourabia, Malika Ourabia

Abstract:

A new and efficient method is presented for the analysis of arbitrarily shaped discontinuities. The technique obtains closed-form expressions for the equivalent circuits used to model these discontinuities, making it straightforward to handle and characterize complicated structures such as T and Y junctions, truncated junctions, arbitrarily shaped junctions, cascaded junctions, and, more generally, planar multiport junctions. Another advantage of this method is that the edge-line concept for arbitrarily shaped junctions operates with equivalent circuits whose parameters are real. The validity of the method was further confirmed by comparing our results for various discontinuities (bends, filters) with those from HFSS as well as from other published sources.

Keywords: CAD analysis, contour integral approach, microwave circuits, s-parameters

Procedia PDF Downloads 473
25129 Brain Age Prediction Based on Brain Magnetic Resonance Imaging by 3D Convolutional Neural Network

Authors: Leila Keshavarz Afshar, Hedieh Sajedi

Abstract:

Estimation of biological brain age from MR images is a topic that has received much attention in recent years because of its importance for the early diagnosis of diseases such as Alzheimer's. In this paper, we use a 3D Convolutional Neural Network (CNN) to estimate the biological age of the brain. The 3D-CNN model is trained on MRI data that has been normalized. In addition, to reduce computation while preserving overall performance, a subset of informative slices is selected for age estimation. With this method, the biological age of individuals was estimated from the selected normalized data with a Mean Absolute Error (MAE) of 4.82 years.

Keywords: brain age estimation, biological age, 3D-CNN, deep learning, T1-weighted image, SPM, preprocessing, MRI, canny, gray matter

Procedia PDF Downloads 117
25128 [Keynote Talk]: Discovering Liouville-Type Problems for p-Energy Minimizing Maps in Closed Half-Ellipsoids by the Calculus of Variations Method

Authors: Lina Wu, Jia Liu, Ye Li

Abstract:

The goal of this project is to investigate constant properties (the Liouville-type problem) for a p-stable map as a local or global minimum of a p-energy functional, where the domain is a Euclidean space and the target space is a closed half-ellipsoid. The first and second variation formulas for the p-energy functional have been applied in the calculus of variations method as computation techniques. Stokes' Theorem, the Cauchy-Schwarz Inequality, Hardy-Sobolev type inequalities, and the Bochner Formula have been used as estimation techniques to bound the derived p-Harmonic Stability Inequality from below and above. One challenging point in this project is to construct a family of variation maps whose images are guaranteed to remain in a closed half-ellipsoid. The other challenging point is to derive a contradiction between the lower bound and the upper bound in the analysis of the p-Harmonic Stability Inequality when a p-energy minimizing map is not constant. The possibility of a non-constant p-energy minimizing map is therefore ruled out, and the constant property for a p-energy minimizing map is obtained. Our main finding is the constant property for a p-stable map from a Euclidean space into a closed half-ellipsoid in a certain range of p. This range of p is determined by the dimensions of the Euclidean space (the domain) and the ellipsoid (the target space), and it is also bounded by the curvature values on the ellipsoid (that is, the ratio of the longest axis to the shortest axis). Regarding Liouville-type results for a p-stable map, our finding on an ellipsoid generalizes mathematicians' results on a sphere. Our result also extends mathematicians' Liouville-type results from a special ellipsoid with only one parameter to any ellipsoid with (n+1) parameters in the general setting.
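
For orientation, a minimal sketch of the objects involved, using the standard definition of the p-energy functional and of p-stability (a textbook formulation, not quoted from the paper):

E_p(u) = \frac{1}{p}\int_{\mathbb{R}^n} |\nabla u|^{p}\,dx, \qquad p \ge 2,

and a critical point u is p-stable if, for every compactly supported variation u_t with u_0 = u,

\frac{d}{dt}\,E_p(u_t)\Big|_{t=0} = 0 \quad\text{and}\quad \frac{d^{2}}{dt^{2}}\,E_p(u_t)\Big|_{t=0} \ge 0 .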

Keywords: Bochner formula, Stokes' Theorem, Cauchy-Schwarz Inequality, first and second variation formulas, Liouville-type problem, p-harmonic map

Procedia PDF Downloads 246
25127 Monte Carlo Estimation of Heteroscedasticity and Periodicity Effects in a Panel Data Regression Model

Authors: Nureni O. Adeboye, Dawud A. Agunbiade

Abstract:

This research investigates the effects of heteroscedasticity and periodicity in a Panel Data Regression Model (PDRM) by extending previous work on balanced panel data estimation in the context of fitting a PDRM for banks' audit fees. Estimation of such a model was achieved through the derivation of a joint Lagrange Multiplier (LM) test for homoscedasticity and zero serial correlation, a conditional LM test for zero serial correlation given heteroscedasticity of varying degrees, and a conditional LM test for homoscedasticity given first-order positive serial correlation via a two-way error component model. Monte Carlo simulations were carried out for 81 different variations, whose design assumed a uniform distribution under a linear heteroscedasticity function. Each variation was iterated 1,000 times, and the three estimators considered were assessed on the basis of the variance, absolute bias (ABIAS), mean square error (MSE), and root mean square error (RMSE) of the parameter estimates. Eighteen different models were fitted under different specified conditions, and the best-fitted model is that of the within estimator when heteroscedasticity is severe, at either zero or positive serial correlation. LM test results showed that the tests have good size and power, as all three tests are significant at 5% for the specified linear form of the heteroscedasticity function, which establishes that banks' operations are severely heteroscedastic in nature with little or no periodicity effects.
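
As an illustration of the assessment criteria used above, a minimal Monte Carlo sketch (not the authors' code): it simulates a simple linear model with a linear heteroscedasticity function and reports the variance, ABIAS, MSE, and RMSE of a pooled OLS slope estimate; all data-generating values are hypothetical.

import numpy as np

rng = np.random.default_rng(0)
N, T, R = 20, 10, 1000            # cross-sections, periods, Monte Carlo iterations
beta_true, theta = 1.5, 0.8       # hypothetical slope and heteroscedasticity degree

estimates = np.empty(R)
for r in range(R):
    x = rng.uniform(0, 1, size=(N, T))
    sigma2 = 1.0 + theta * x                      # linear heteroscedasticity function
    y = beta_true * x + rng.normal(0, np.sqrt(sigma2))
    xc, yc = x.ravel() - x.mean(), y.ravel() - y.mean()
    estimates[r] = (xc @ yc) / (xc @ xc)          # pooled OLS slope (one illustrative estimator)

bias = estimates.mean() - beta_true
print("variance:", estimates.var(ddof=1))
print("ABIAS   :", abs(bias))
print("MSE     :", np.mean((estimates - beta_true) ** 2))
print("RMSE    :", np.sqrt(np.mean((estimates - beta_true) ** 2)))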

Keywords: audit fee, heteroscedasticity, Lagrange multiplier test, Monte Carlo scheme, periodicity

Procedia PDF Downloads 114
25126 Applicability of Cameriere’s Age Estimation Method in a Sample of Turkish Adults

Authors: Hatice Boyacioglu, Nursel Akkaya, Humeyra Ozge Yilanci, Hilmi Kansu, Nihal Avcu

Abstract:

The strong relationship between the reduction in the size of the pulp cavity and increasing age has been reported in the literature. This relationship can be exploited to estimate the age of an individual by measuring the pulp cavity size on dental radiographs as a non-destructive method. The purpose of this study is to develop a population-specific regression model for age estimation in a sample of Turkish adults by applying Cameriere's method to panoramic radiographs. The sample consisted of 100 panoramic radiographs of Turkish patients (40 men, 60 women) aged between 20 and 70 years. Pulp and tooth area ratios (AR) of the maxillary canines were measured by two maxillofacial radiologists, and the results were then subjected to regression analysis. There were no statistically significant intra-observer or inter-observer differences. The correlation coefficient between age and the AR of the maxillary canines was -0.71, and the following regression equation was derived: Estimated Age = 77.365 - (351.193 × AR). The mean prediction error was 4 years, which is within acceptable error limits for age estimation. This shows that the pulp/tooth area ratio is a useful variable for assessing age with reasonable accuracy. Based on the results of this research, it was concluded that Cameriere's method is suitable for dental age estimation and can be used in forensic procedures for Turkish adults.
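
The derived regression can be applied directly; a minimal sketch using the coefficients reported above (the AR value in the example is hypothetical):

def estimated_age(ar: float) -> float:
    # Age estimate from the pulp/tooth area ratio (AR) using the regression derived in this study.
    return 77.365 - 351.193 * ar

print(estimated_age(0.12))   # AR = 0.12 -> about 35.2 years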

Keywords: age estimation by teeth, forensic dentistry, panoramic radiograph, Cameriere's method

Procedia PDF Downloads 422
25125 Identification of Wiener Model Using Iterative Schemes

Authors: Vikram Saini, Lillie Dewan

Abstract:

This paper presents iterative schemes based on least squares, hierarchical least squares, and the stochastic approximation gradient method for the identification of the Wiener model with a parametric structure. A gradient method based on stochastic approximation is presented for parameter estimation of the Wiener model under noisy conditions. Simulation results are presented for the Wiener model structure with different static non-linear elements in the presence of colored noise to provide a comparative analysis of the iterative methods. The stochastic gradient method shows improvement in estimation performance and provides fast convergence of the parameter estimates.

Keywords: hard non-linearity, least square, parameter estimation, stochastic approximation gradient, Wiener model

Procedia PDF Downloads 366
25124 Optimization Modeling of the Hybrid Antenna Array for the DoA Estimation

Authors: Somayeh Komeylian

Abstract:

Direction of arrival (DoA) estimation is a crucial aspect of radar technologies for detecting and resolving several signal sources. In this scenario, the antenna array output model involves numerous parameters, including noise samples, signal waveform, signal directions, number of signals, and signal-to-noise ratio (SNR), and DoA estimation methods therefore rely heavily on their generalization over a large number of training data sets. Hence, we present two different optimization models for DoA estimation: (1) a decision directed acyclic graph (DDAG) implementation of the multiclass least-squares support vector machine (LS-SVM), and (2) an optimization method based on a deep neural network (DNN) with radial basis functions (RBF). We have rigorously verified that the LS-SVM DDAG algorithm is capable of accurately classifying DoAs for the three classes. However, the accuracy and robustness of DoA estimation remain highly sensitive to technological imperfections of the antenna array, such as non-ideal array design and manufacture, array implementation, mutual coupling effects, and background radiation, and the method may therefore fail to deliver high precision. This work consequently makes a further contribution by developing the DNN-RBF model for DoA estimation, to overcome the limitations of non-parametric and data-driven methods with respect to array imperfection and generalization. The numerical results of implementing the DNN-RBF model confirm better DoA estimation performance compared with the LS-SVM algorithm. The performance of the two optimization methods is evaluated in terms of the mean squared error (MSE).

Keywords: DoA estimation, adaptive antenna array, deep neural network, LS-SVM optimization model, radial basis function, MSE

Procedia PDF Downloads 70
25123 A Simplified Distribution for Nonlinear Seas

Authors: M. A. Tayfun, M. A. Alkhalidi

Abstract:

The exact theoretical expression describing the probability distribution of nonlinear sea-surface elevations derived from the second-order narrowband model has a cumbersome form that requires numerical computation and is not well suited to theoretical or practical applications. Here, the same narrowband model is re-examined to develop a simpler closed-form approximation suitable for theoretical and practical applications. The salient features of the approximate form are explored, and its relative validity is verified through comparisons with other readily available approximations and with oceanic data.

Keywords: ocean waves, probability distributions, second-order nonlinearities, skewness coefficient, wave steepness

Procedia PDF Downloads 408
25122 Estimation of OPC, Fly Ash and Slag Contents in Blended and Composite Cements by Selective Dissolution Method

Authors: Suresh Palla

Abstract:

This research paper presents the results of a study on the estimation of fly ash, slag, and cement contents in blended and composite cements by a novel selective dissolution method. The cement samples investigated include OPC with fly ash as a performance improver, OPC with slag as a performance improver, PPC, PSC, and composite cement conforming to the respective Indian Standards. Slag and OPC contents in PSC were estimated by selectively dissolving OPC in stage 1 and selectively dissolving slag in stage 2. In the case of the composite cement sample, the percentages of cement, slag, and fly ash were estimated systematically by selective dissolution of cement, slag, and fly ash in three stages. In the first stage, the cement is dissolved and separated, leaving a residue of slag and fly ash designated R1. The second stage involves gravimetric estimation of the OPC fraction and the residue, together with selective dissolution of the fly ash and slag; the fly ash content, R2, was estimated gravimetrically. Thereafter, the difference between R1 and R2 is taken as the slag content. The cement, fly ash, and slag contents obtained by the selective dissolution method agreed with the corresponding constituent percentages within a 10% standard deviation. The results suggest that this novel selective dissolution method can be successfully used for the estimation of OPC and SCM contents in different types of cement.
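
A short worked sketch of the three-stage mass balance implied above (the percentages are hypothetical; R1 is the residue left after dissolving the cement, R2 the gravimetrically determined fly ash):

def composite_cement_fractions(r1_pct: float, r2_pct: float) -> dict:
    # Stage-wise mass balance: cement dissolves first (residue R1 = slag + fly ash),
    # fly ash (R2) is then determined gravimetrically, and slag follows by difference.
    return {
        "cement %": 100.0 - r1_pct,
        "fly ash %": r2_pct,
        "slag %": r1_pct - r2_pct,
    }

print(composite_cement_fractions(r1_pct=45.0, r2_pct=20.0))
# {'cement %': 55.0, 'fly ash %': 20.0, 'slag %': 25.0}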

Keywords: selective dissolution method, fly ash, GGBFS slag, EDTA

Procedia PDF Downloads 129
25121 Closed-Form Sharma-Mittal Entropy Rate for Gaussian Processes

Authors: Septimia Sarbu

Abstract:

The entropy rate of a stochastic process is a fundamental concept in information theory. It provides a limit to the amount of information that can be transmitted reliably over a communication channel, as stated by Shannon's coding theorems. Recently, researchers have focused on developing new measures of information that generalize Shannon's classical theory. The aim is to design more efficient information encoding and transmission schemes. This paper continues the study of generalized entropy rates by deriving a closed-form solution to the Sharma-Mittal entropy rate for Gaussian processes. Using the squeeze theorem, we solve the limit in the definition of the entropy rate for different values of alpha and beta, the parameters of the Sharma-Mittal entropy. Finally, we compare it with the Shannon and Rényi entropy rates for Gaussian processes.
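
For orientation, the two-parameter entropy underlying this rate is commonly written as follows for a continuous density f (a standard definition, not the paper's derivation); the Rényi entropy is recovered as β → 1 and the Shannon entropy as α, β → 1:

H_{\alpha,\beta}(f) = \frac{1}{1-\beta}\left[\left(\int f(x)^{\alpha}\,dx\right)^{\frac{1-\beta}{1-\alpha}} - 1\right], \qquad \alpha > 0,\ \alpha \neq 1,\ \beta \neq 1.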

Keywords: generalized entropies, Sharma-Mittal entropy rate, Gaussian processes, eigenvalues of the covariance matrix, squeeze theorem

Procedia PDF Downloads 476
25120 Copula Markov Switching Multifractal Models for Forecasting Value-at-Risk

Authors: Giriraj Achari, Malay Bhattacharyya

Abstract:

In this paper, the effectiveness of Copula Markov Switching Multifractal (MSM) models at forecasting the Value-at-Risk of a two-stock portfolio is studied. The innovations are allowed to be drawn from distributions that can capture skewness and leptokurtosis, which are well-documented empirical characteristics of financial returns. The candidate distributions considered for this purpose are the Johnson-SU, Pearson Type-IV and α-Stable distributions. The two univariate marginal distributions are combined using the Student-t copula. The estimation of all parameters is performed by maximum likelihood estimation. Finally, the models are compared in terms of accurate Value-at-Risk (VaR) forecasts using tests of unconditional coverage and independence. It is found that Copula-MSM models with leptokurtic innovation distributions perform slightly better than the Copula-MSM model with normal innovations. Copula-MSM models, in general, produce better VaR forecasts than traditional methods such as the Historical Simulation method, the Variance-Covariance approach and Copula-Generalized Autoregressive Conditional Heteroscedasticity (Copula-GARCH) models.
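
A minimal sketch of the unconditional coverage check mentioned above (Kupiec's likelihood-ratio test; a generic illustration, not the authors' implementation, with hypothetical violation counts):

import numpy as np
from scipy.stats import chi2

def kupiec_uc_test(n_violations, n_obs, p):
    # Likelihood-ratio test of unconditional coverage for VaR forecasts.
    x, t = n_violations, n_obs
    pi_hat = x / t
    log_l0 = (t - x) * np.log(1 - p) + x * np.log(p)
    log_l1 = (t - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)
    lr = -2.0 * (log_l0 - log_l1)
    return lr, chi2.sf(lr, df=1)      # test statistic and p-value

print(kupiec_uc_test(n_violations=9, n_obs=500, p=0.01))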

Keywords: Copula, Markov Switching, multifractal, value-at-risk

Procedia PDF Downloads 142
25119 Blind Super-Resolution Reconstruction Based on PSF Estimation

Authors: Osama A. Omer, Amal Hamed

Abstract:

Successful blind image Super-Resolution algorithms require exact estimation of the Point Spread Function (PSF). In the absence of any prior information about the imaging system and the true image, this estimation is normally done by trial-and-error experimentation until an acceptable restored image quality is obtained. Multi-frame blind Super-Resolution algorithms often suffer from slow convergence and sensitivity to complex noise. This paper presents a Super-Resolution image reconstruction algorithm based on estimating the PSF that yields the optimum restored image quality. The PSF is estimated by the knife-edge method, implemented by measuring the spreading of edges in the reconstructed HR image itself during the reconstruction process. The proposed image reconstruction approach uses L1-norm minimization and robust regularization based on a bilateral prior to deal with different data and noise models. A series of experimental results shows that the proposed method outperforms previous work robustly and efficiently.

Keywords: blind, PSF, super-resolution, knife-edge, blurring, bilateral, L1 norm

Procedia PDF Downloads 336
25118 Estimation and Restoration of Ill-Posed Parameters for Underwater Motion Blurred Images

Authors: M. Vimal Raj, S. Sakthivel Murugan

Abstract:

Underwater images suffer quality degradation due to the conditions of the underwater environment. One of the major problems in an underwater image is motion blur caused by the imaging device or the movement of the object. To rectify this in post-processing, the parameters of the blurred image must be estimated. The point spread function is therefore estimated from the properties of the image spectrum. To improve the estimation accuracy of the parameters, the Optimized Polynomial Lagrange Interpolation (OPLI) method is applied after measuring the angle and length of the motion blur. Initially, the data were collected from real-time environments in Chennai and processed. The proposed OPLI method shows better accuracy than the existing classical cepstral, Hough, and Radon transform estimation methods for underwater images.
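
The OPLI details are not given here, so as a generic illustration only, the sketch below refines a coarse blur-angle estimate by plain polynomial Lagrange interpolation of a (hypothetical) spectral response over candidate angles:

import numpy as np
from scipy.interpolate import lagrange

# Hypothetical response values measured at a few coarse candidate blur angles (degrees).
angles = np.array([28.0, 30.0, 32.0, 34.0])
response = np.array([0.61, 0.78, 0.74, 0.52])

poly = lagrange(angles, response)                 # degree-3 interpolating polynomial
fine = np.linspace(angles[0], angles[-1], 601)    # dense grid between the coarse angles
refined_angle = fine[np.argmax(poly(fine))]
print(f"refined blur-angle estimate: {refined_angle:.2f} degrees")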

Keywords: image restoration, motion blur, parameter estimation, radon transform, underwater

Procedia PDF Downloads 152
25117 Estimation and Comparison of Delay at Signalized Intersections Based on Existing Methods

Authors: Arpita Saha, Satish Chandra, Indrajit Ghosh

Abstract:

Delay refers to the time lost by a traveler while crossing an intersection, and the efficiency of traffic operation at signalized intersections is assessed in terms of the delay caused to an individual vehicle. The Highway Capacity Manual (HCM) method and Webster's method are the most widely used in India for delay estimation. However, traffic in India is highly heterogeneous in nature with extremely poor lane discipline. Therefore, a comparison was made to identify the best delay estimation technique for Indian conditions. In this study, seven signalized intersections from three different cities were chosen, and data were collected during both morning and evening peak hours. Only undersaturated cycles were considered. Delay was estimated from the field data: using Simpson's 1/3 rule, the delay of undersaturated cycles was estimated by measuring the area under the curve of queue length against cycle time. The field-observed delay was then compared with the delay estimated using the HCM, Webster, probabilistic, Taylor's expansion, and regression methods. The drawbacks of the existing delay estimation methods under Indian heterogeneous traffic conditions were identified, and the best method was proposed. It was observed that direct estimation of delay using field-measured data is more accurate than the existing conventional and modified methods.
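
A minimal sketch of the field-delay computation described above: the area under a queue-length versus time curve, integrated with Simpson's 1/3 rule, gives the vehicle-time of delay accumulated over a cycle (the sampled counts and the number of vehicles served are hypothetical):

import numpy as np
from scipy.integrate import simpson

t = np.arange(0, 65, 5)                                           # seconds within one undersaturated cycle
queue = np.array([0, 2, 5, 9, 12, 14, 13, 10, 7, 4, 2, 1, 0])     # queued vehicles at each sample

total_delay = simpson(queue, x=t)                                 # vehicle-seconds of delay in the cycle
avg_delay = total_delay / 40                                      # divided by vehicles served in the cycle
print(f"total delay: {total_delay:.0f} veh-s, average delay: {avg_delay:.1f} s/veh")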

Keywords: delay estimation technique, field delay, heterogeneous traffic, signalised intersection

Procedia PDF Downloads 270
25116 Multivariate Control Chart to Determine Efficiency Measurements in Industrial Processes

Authors: J. J. Vargas, N. Prieto, L. A. Toro

Abstract:

Control charts are commonly used to monitor processes involving either variable or attribute quality characteristics, and determining the control limits is a critical task for quality engineers seeking to improve processes. Nonetheless, in some applications it is necessary to include an estimate of efficiency. In this paper, the ability to assess the efficiency of an industrial process was added to a control chart by incorporating a data envelopment analysis (DEA) approach. Specifically, Bayesian estimation was performed to calculate the posterior probability distribution of the parameters, namely the means and the variance-covariance matrix. This technique allows the data set to be analysed without relying on the hypothetically large sample implied by the problem and can be treated as an approximation to the finite-sample distribution. A rejection simulation method was carried out to generate random variables from the parameter functions. Each resulting vector was used by the stochastic DEA model over several cycles to establish the distribution of the efficiency measure for each DMU (decision-making unit). A control limit was calculated with the model obtained, and if a DMU exhibits a low level of efficiency, the system efficiency is declared out of control. A global optimum was reached in the efficiency calculation, which ensures the reliability of the model.

Keywords: data envelopment analysis, DEA, Multivariate control chart, rejection simulation method

Procedia PDF Downloads 352
25115 Comparative Analysis of Spectral Estimation Methods for Brain-Computer Interfaces

Authors: Rafik Djemili, Hocine Bourouba, M. C. Amara Korba

Abstract:

In this paper, we present a method to classify EEG signals for Brain-Computer Interfaces (BCI). The EEG signals are first processed by spectral estimation methods to derive reliable features before the classification step. The spectral estimation methods used are the standard periodogram and the periodogram calculated by the Welch method; both are compared with logarithm of band power (logBP) features. In the proposed method, we apply Linear Discriminant Analysis (LDA) followed by a Support Vector Machine (SVM). The classification accuracy reached as high as 85%, which demonstrates the effectiveness of spectral methods for the classification of EEG signals in BCI.
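
A compact sketch of such a pipeline, with Welch-periodogram band-power features followed by LDA and an SVM (the channel data, labels, sampling rate, and band choices below are hypothetical, not the paper's dataset):

import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
fs = 250                                    # sampling rate in Hz
trials = rng.normal(size=(120, 3, 1000))    # 120 trials x 3 EEG channels x 4 s of samples
labels = rng.integers(0, 2, size=120)       # two motor-imagery classes

def band_power_features(x):
    f, pxx = welch(x, fs=fs, nperseg=250, axis=-1)
    band = (f >= 8) & (f <= 30)             # mu and beta bands
    return np.log(pxx[..., band].mean(axis=-1)).reshape(len(x), -1)

X = band_power_features(trials)
clf = make_pipeline(LinearDiscriminantAnalysis(), SVC(kernel="rbf"))
clf.fit(X[:80], labels[:80])
print("held-out accuracy:", clf.score(X[80:], labels[80:]))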

Keywords: brain-computer interface, motor imagery, electroencephalogram, linear discriminant analysis, support vector machine

Procedia PDF Downloads 471
25114 Reliability Estimation of Bridge Structures with Updated Finite Element Models

Authors: Ekin Ozer

Abstract:

Assessment of structural reliability is essential for the efficient use of civil infrastructure that is subjected to hazardous events. Dynamic analysis of finite element models is a commonly used tool to simulate structural behavior and estimate its performance accordingly. However, theoretical models based purely on preliminary assumptions and design drawings may deviate from the actual behavior of the structure. This study proposes up-to-date reliability estimation procedures that use actual bridge vibration data to update finite element models and then perform reliability estimation accordingly. The proposed method utilizes vibration response measurements of bridge structures to identify modal parameters, and then uses these parameters to calibrate finite element models originally based on design drawings. The proposed method not only shows that reliability estimates based on updated models differ from those of the original models, but also suggests that non-updated models may overestimate the structural capacity.

Keywords: earthquake engineering, engineering vibrations, reliability estimation, structural health monitoring

Procedia PDF Downloads 178
25113 Aesthetic Modification with Combined Orthognathic Surgery and Closed Rhinoplasty

Authors: Alessandro Marano

Abstract:

Aim: The author describes aesthetic modification using orthognathic surgery and closed rhinoplasty. Methods: A case series. After orthognathic surgery, a dramatic change in aesthetics can be observed, especially in midface and nasal projection. Advancement of the maxillary bone through a Le Fort I osteotomy changes the nasal tip projection and lip roundness; combining orthognathic surgery with closed-approach rhinoplasty manages both the function and the aesthetics of the whole midface. Results: Combining the Le Fort I osteotomy with closed-approach rhinoplasty produced good objective results with high patient satisfaction. The Le Fort I osteotomy increases the projection of the midface, and the closed-approach rhinoplasty modifies the nasal shape to be more harmonious with the new maxillary position. The scars are not visible because they are hidden inside the mouth and nose. Conclusions: Orthognathic surgery combined with closed-approach rhinoplasty is very effective in changing the aesthetics of the midface. The results illustrate the difference between using orthognathic surgery alone and using it in association with closed-approach rhinoplasty; combining both yields long-lasting and pleasing results.

Keywords: orthognathic, rhinoplasty, aesthetic, face

Procedia PDF Downloads 71
25112 A Comparative Study to Evaluate Chronological Age and Dental Age in the North Indian Population Using Cameriere's Method

Authors: Ranjitkumar Patil

Abstract:

Age estimation is important in forensic dentistry. Dental age estimation has emerged as an alternative to skeletal age determination, and methods based on the stages of tooth formation, as seen on radiographs, appear more appropriate for assessing age than those based on skeletal development. The study was undertaken to evaluate dental age in the north Indian population using Cameriere's method. Aims/Objectives: The study was conducted to assess the dental age of north Indian children using Cameriere's method and to compare chronological age with dental age in order to validate Cameriere's method in the north Indian population. A comparative study of two years' duration was carried out on the OPG data (acquired with a PLANMECA Promax 3D unit) of 497 individuals aged 5 to 15 years, selected by a simple random technique, with ethical approval obtained from the institutional ethics committee. The data were obtained according to inclusion and exclusion criteria and analyzed by software for dental age estimation. Statistical analysis: Student's t-test was used to compare the morphological variables of males with those of females and to compare observed age with estimated age. The regression formula was also calculated. Results: The present study comprised 497 subjects, both male and female, whose dental age was assessed on panoramic radiographs following the widely accepted method described by Cameriere. Statistical analysis indicated that gender does not have a significant influence on age estimation (R² = 0.787). Conclusion: This suggests that Cameriere's method can be effectively applied to the north Indian population.

Keywords: forensic, dental age, skeletal age, chronological age, Cameriere’s method

Procedia PDF Downloads 93
25111 Stability Indicating RP-HPLC Method Development, Validation and Kinetic Study for Amiloride Hydrochloride and Furosemide in Pharmaceutical Dosage Form

Authors: Jignasha Derasari, Patel Krishna M, Modi Jignasa G.

Abstract:

Chemical stability of pharmaceutical molecules is a matter of great concern, as it affects the safety and efficacy of the drug product. Stability testing data provide the basis for understanding how the quality of a drug substance and drug product changes with time under the influence of various environmental factors. They also help in selecting a proper formulation and package, as well as proper storage conditions and shelf life, which is essential for regulatory documentation. The ICH guideline states that stress testing is intended to identify the likely degradation products, which further helps in determining the intrinsic stability of the molecule, establishing degradation pathways, and validating the stability-indicating procedures. A simple, accurate and precise stability-indicating RP-HPLC method was developed and validated for the simultaneous estimation of Amiloride Hydrochloride and Furosemide in tablet dosage form. Separation was achieved on a Phenomenex Luna ODS C18 column (250 mm × 4.6 mm i.d., 5 µm particle size) using a mobile phase consisting of orthophosphoric acid:acetonitrile (50:50 % v/v, pH 3.5 adjusted with 0.1% TEA in water) at a flow rate of 1.0 mL/min in isocratic mode, with an injection volume of 20 µL and a detection wavelength of 283 nm. The retention times of Amiloride Hydrochloride and Furosemide were 1.810 min and 4.269 min, respectively. Linearity of the proposed method was obtained in the ranges of 40-60 µg/mL and 320-480 µg/mL, with correlation coefficients of 0.999 and 0.998 for Amiloride Hydrochloride and Furosemide, respectively. A forced degradation study was carried out on the combined dosage form under various stress conditions, including hydrolysis (acid and base), oxidative and thermal conditions, as per ICH guideline Q2 (R1). The RP-HPLC method showed adequate separation of Amiloride Hydrochloride and Furosemide from their degradation products. The proposed method was validated as per ICH guidelines for specificity, linearity, accuracy, precision and robustness for the estimation of Amiloride Hydrochloride and Furosemide in commercially available tablet dosage form, and the results were found to be satisfactory and significant. The developed and validated stability-indicating RP-HPLC method can be used successfully for marketed formulations. Forced degradation studies generate degradants in a much shorter span of time, usually a few weeks, and can be used to develop the stability-indicating method, which can later be applied to the analysis of samples generated from accelerated and long-term stability studies. Further, a kinetic study was also performed for the different forced degradation conditions of the same combination, which helps in determining the order of reaction.

Keywords: amiloride hydrochloride, furosemide, kinetic study, stability indicating RP-HPLC method validation

Procedia PDF Downloads 438
25110 Age Estimation Using Destructive and Non-Destructive Dental Methods on an Archeological Human Sample from the Poor Claire Nunnery in Brussels, Belgium

Authors: Pilar Cornejo Ulloa, Guy Willems, Steffen Fieuws, Kim Quintelier, Wim Van Neer, Patrick Thevissen

Abstract:

Dental age estimation can be performed in both living and deceased individuals. In anthropology, few studies have tested the reliability of dental age estimation methods as a complement to the usually applied osteological methods. Objectives: In this study, destructive and non-destructive dental age estimation methods were applied to an archeological sample in order to compare them with the previously obtained anthropological age estimates. Materials and Methods: One hundred and thirty-four teeth from 24 individuals were analyzed using the methods of Kvaal, Kvaal and Solheim, Bang and Ramm, Lamendin, Gustafson, Maples, Dalitz and Johanson. Results: High variability and wider age ranges than those previously obtained by the anthropologist were observed. Destructive methods showed slightly higher agreement than non-destructive ones. Discussion: Due to the heterogeneity of the sample and the absence of a known age at death, the results obtained were not representative, and it was not possible to recommend one dental age estimation method over another.

Keywords: archeology, dental age estimation, forensic anthropology, forensic dentistry

Procedia PDF Downloads 332
25109 Point Estimation for the Type II Generalized Logistic Distribution Based on Progressively Censored Data

Authors: Rana Rimawi, Ayman Baklizi

Abstract:

Skewed distributions are important models that are frequently used in applications. Generalized distributions form a class of skewed distributions and have gained widespread use in applications because of their flexibility in data analysis. More specifically, the Generalized Logistic Distribution, with its different types, has received considerable attention recently. In this study, based on progressively Type-II censored data, we consider point estimation for the Type II Generalized Logistic Distribution (Type II GLD). We develop several estimators for its unknown parameters, including maximum likelihood estimators (MLE), Bayes estimators, and best linear unbiased estimators (BLUE). The estimators are compared by simulation using the criteria of bias and mean square error (MSE). An illustrative example with a real data set is given.

Keywords: point estimation, type II generalized logistic distribution, progressive censoring, maximum likelihood estimation

Procedia PDF Downloads 167
25108 Blind Channel Estimation for Frequency Hopping System Using Subspace Based Method

Authors: M. M. Qasaymeh, M. A. Khodeir

Abstract:

Subspace channel estimation methods have been studied widely. They depend on a subspace decomposition of the covariance matrix to separate the signal subspace from the noise subspace. The decomposition is normally done by either an Eigenvalue Decomposition (EVD) or a Singular Value Decomposition (SVD) of the Auto-Correlation Matrix (ACM). However, the subspace decomposition process is computationally expensive. In this paper, the multipath channel estimation problem for a Slow Frequency Hopping (SFH) system is considered using a noise-subspace-based method. An efficient method to estimate the multipath time delays is proposed, applying the MUltiple SIgnal Classification (MUSIC) algorithm with the null space extracted by the Rank-Revealing LU (RRLU) factorization. The RRLU provides accurate information about the rank and the numerical null space, which makes it a valuable tool in numerical linear algebra. The proposed novel method decreases the computational complexity to approximately half that of RRQR-based methods while keeping the same performance. Computer simulations are also included to demonstrate the effectiveness of the proposed scheme.
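
A generic sketch of noise-subspace delay estimation in this spirit; for brevity it extracts the noise subspace with an eigenvalue decomposition rather than the RRLU factorization proposed here, and the channel, delays, and noise level are hypothetical:

import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(1)
n_sub, true_delays = 64, [3, 11]                  # subcarriers; path delays in samples
k = np.arange(n_sub)

def steering(d):
    return np.exp(-2j * np.pi * k * d / n_sub)    # frequency-domain delay signature

# Simulated frequency-domain snapshots of a two-path channel plus noise.
snapshots = np.stack([
    sum(rng.normal() * steering(d) for d in true_delays) + 0.05 * rng.standard_normal(n_sub)
    for _ in range(200)
])
acm = snapshots.conj().T @ snapshots / len(snapshots)        # auto-correlation matrix

eigvals, eigvecs = np.linalg.eigh(acm)
noise_space = eigvecs[:, :-len(true_delays)]                 # smallest eigenvalues -> noise subspace

delays = np.linspace(0, 20, 2001)
spectrum = np.array([1.0 / np.linalg.norm(noise_space.conj().T @ steering(d)) ** 2 for d in delays])
peaks, _ = find_peaks(spectrum, prominence=spectrum.max() * 0.1)
top = peaks[np.argsort(spectrum[peaks])[-2:]]
print("estimated delays:", np.sort(delays[top]))             # should be close to [3, 11]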

Keywords: frequency hopping, channel model, time delay estimation, RRLU, RRQR, MUSIC, LS-ESPRIT

Procedia PDF Downloads 386
25107 Product Form Bionic Design Based on Eye Tracking Data: A Case Study of Desk Lamp

Authors: Huan Lin, Liwen Pang

Abstract:

In order to reduce the ambiguity and uncertainty of product form bionic design, a product form bionic design method based on eye tracking is proposed. An eye-tracking experiment is designed to calculate the average fixation-time ranking of the specific parts of the bionic shape that the subjects look at. The key bionic shape is identified through the experiment and then applied to a desk lamp bionic design. In the design case, the FAHP (Fuzzy Analytic Hierarchy Process) and SD (Semantic Differential) methods are first used to identify a consumer emotional perception model for desk lamps before product design. By investigating different desk lamp design elements and consumer views, the form design factors of the desk lamp product are derived and all design schemes are ranked after calculation. The desk lamp form bionic design method combines the key bionic shape extracted from the eye-tracking experiment with the priority ranking of the desk lamp design schemes. This study provides an objective and rational method for product form bionic design.

Keywords: bionic design, form, eye tracking, FAHP, desk lamp

Procedia PDF Downloads 185
25106 Estimation and Forecasting with a Quantile AR Model for Financial Returns

Authors: Yuzhi Cai

Abstract:

This talk presents a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. We establish that the joint posterior distribution of the model parameters and future values is well defined. The associated MCMC algorithm for parameter estimation and forecasting converges to the posterior distribution quickly. We also present a forecast-combining technique to produce more accurate out-of-sample forecasts by using a weighted sequence of fitted QAR models. A moving-window method to check the quality of the estimated conditional quantiles is developed. We verify our methodology using simulation studies and then apply it to currency exchange rate data; an application of the method to the USD to GBP daily currency exchange rates will also be discussed. The results obtained show that an unequally weighted combining method performs better than the other forecasting methodologies.
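
As a simple frequentist stand-in for the QAR estimation described here (not the authors' Bayesian MCMC scheme), a minimal QAR(1) fit and one-step conditional quantile forecast with statsmodels on simulated returns:

import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)
r = rng.standard_t(df=5, size=1000) * 0.01          # simulated daily returns
y, y_lag = r[1:], r[:-1]

X = sm.add_constant(y_lag)
fits = {q: QuantReg(y, X).fit(q=q) for q in (0.05, 0.5, 0.95)}

# One-step-ahead conditional quantile forecasts given the last observed return.
x_next = np.array([1.0, r[-1]])
for q, res in fits.items():
    print(f"q={q:.2f} forecast: {res.params @ x_next:.5f}")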

Keywords: combining forecasts, MCMC, quantile modelling, quantile forecasting, predictive density functions

Procedia PDF Downloads 315
25105 Asymptotic Confidence Intervals for the Difference of Coefficients of Variation in Gamma Distributions

Authors: Patarawan Sangnawakij, Sa-Aat Niwitpong

Abstract:

In this paper, we propose two new confidence intervals, CIw and CIs, for the difference of coefficients of variation in two independent gamma distributions. These confidence intervals use the closed-form method of variance estimation presented by Donner and Zou (2010), based on the Wald and score confidence interval concepts, respectively. A Monte Carlo simulation study is used to evaluate the performance, coverage probability and expected length, of these confidence intervals. The results indicate that the coverage probabilities of the new confidence intervals based on Wald and score satisfy the nominal coverage and are close to the nominal level of 0.95 in various situations; in particular, the Wald-based confidence interval is better when sample sizes are small. Moreover, the expected lengths of the proposed confidence intervals differ only slightly when sample sizes are moderate to large. Therefore, in this study, the confidence interval for the difference of coefficients of variation based on Wald is preferable to the other confidence interval.
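
A generic sketch of the closed-form method of variance estimation (MOVER) of Donner and Zou (2010) for the difference of two parameters, which underlies both intervals above: given point estimates and individual confidence limits for each coefficient of variation, the combined interval is formed in closed form. The numbers below are hypothetical, and this is not the paper's exact construction.

def mover_difference_ci(est1, l1, u1, est2, l2, u2):
    # Closed-form (MOVER) confidence interval for theta1 - theta2,
    # given point estimates and individual confidence limits (Donner & Zou, 2010).
    diff = est1 - est2
    lower = diff - ((est1 - l1) ** 2 + (u2 - est2) ** 2) ** 0.5
    upper = diff + ((u1 - est1) ** 2 + (est2 - l2) ** 2) ** 0.5
    return lower, upper

# Hypothetical individual 95% limits for the two coefficients of variation.
print(mover_difference_ci(0.45, 0.38, 0.53, 0.30, 0.25, 0.36))   # roughly (0.058, 0.244)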

Keywords: confidence interval, score's interval, Wald's interval, coefficient of variation, gamma distribution, simulation study

Procedia PDF Downloads 390
25104 Forecast of the Small Wind Turbines Sales with Replacement Purchases and with or without Account of Price Changes

Authors: V. Churkin, M. Lopatin

Abstract:

The purpose of this paper is to estimate the US market potential for small wind turbines and to forecast small wind turbine sales in the US. The forecasting method is based on the Bass model and the generalized Bass model of innovation diffusion under replacement purchases. An exponential distribution is used to model replacement purchases; its single parameter is determined by the average lifetime of small wind turbines. The model parameters are identified by nonlinear regression analysis on the annual sales statistics published by the American Wind Energy Association (AWEA) from 2001 to 2012. The estimated US average market potential of small wind turbines (for adoption purchases), without accounting for price changes, is 57,080 (confidence interval 49,294 to 64,866 at P = 0.95) for an average turbine lifetime of 15 years, and 62,402 (confidence interval 54,154 to 70,648 at P = 0.95) for an average lifetime of 20 years. In the first case the explained variance is 90.7%, in the second 91.8%. The effect of wind turbine price changes on sales was estimated using the generalized Bass model. This required a price forecast, for which a polynomial regression function based on the Berkeley Lab statistics was used. In that case, the estimated US average market potential of small wind turbines (for adoption purchases) is 42,542 (confidence interval 32,863 to 52,221 at P = 0.95) for an average turbine lifetime of 15 years, and 47,426 (confidence interval 36,092 to 58,760 at P = 0.95) for an average lifetime of 20 years. The explained variance is 95.3% in both cases.
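
For reference, a minimal sketch of Bass-model fitting by nonlinear regression on cumulative sales (the data below are synthetic placeholders generated from hypothetical parameters, not the AWEA statistics):

import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, m, p, q):
    # Cumulative adoptions N(t) under the Bass diffusion model:
    # m = market potential, p = innovation coefficient, q = imitation coefficient.
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

rng = np.random.default_rng(0)
years = np.arange(1, 13)                                           # 2001-2012 -> t = 1..12
cum_sales = bass_cumulative(years, 50000, 0.012, 0.42) * rng.normal(1.0, 0.03, size=years.size)

(m, p, q), _ = curve_fit(bass_cumulative, years, cum_sales, p0=[60000, 0.01, 0.3], maxfev=10000)
print(f"market potential m = {m:.0f}, p = {p:.4f}, q = {q:.4f}")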

Keywords: Bass model, generalized Bass model, replacement purchases, sales forecasting of innovations, statistics of sales of small wind turbines in the United States

Procedia PDF Downloads 324
25103 Rational Probabilistic Method for Calculating Thermal Cracking Risk of Mass Concrete Structures

Authors: Naoyuki Sugihashi, Toshiharu Kishi

Abstract:

The probability of occurrence of thermal cracks in mass concrete in Japan is evaluated with a cracking probability diagram that represents the relationship between the thermal cracking index and the probability of occurrence of cracks in the actual structure. In this paper, we propose a method to calculate the cracking probability directly, following probabilistic theory, by modeling the variances of the tensile stress and the tensile strength. In this method, the relationship between the variances of tensile stress and tensile strength, the thermal cracking index, and the cracking probability is formulated and presented. In addition, the standard deviations of tensile stress and tensile strength were identified, and the method of calculating the cracking probability in a generally controlled construction environment was also demonstrated.
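
As a sketch of such a direct calculation: if the tensile stress and tensile strength are modeled as independent normal variables with given means and standard deviations, the cracking probability follows in closed form (the numbers and the normality assumption below are illustrative, not the paper's calibration):

from scipy.stats import norm

def cracking_probability(mu_stress, sd_stress, mu_strength, sd_strength):
    # P(crack) = P(strength - stress < 0) for independent normal stress and strength.
    margin_mean = mu_strength - mu_stress
    margin_sd = (sd_stress ** 2 + sd_strength ** 2) ** 0.5
    return norm.cdf(-margin_mean / margin_sd)

# Thermal cracking index (mean strength / mean stress) = 1.2 in this hypothetical case.
print(cracking_probability(mu_stress=2.5, sd_stress=0.35, mu_strength=3.0, sd_strength=0.30))   # about 0.14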

Keywords: thermal crack control, mass concrete, thermal cracking probability, durability of concrete, calculating method of cracking probability

Procedia PDF Downloads 300
25102 Stability Indicating Method Development and Validation for Estimation of Antiasthmatic Drugs in Combined Dosage Form by RP-HPLC

Authors: Laxman H. Surwase, Lalit V. Sonawane, Bhagwat N. Poul

Abstract:

A simple stability-indicating high-performance liquid chromatographic method has been developed for the simultaneous determination of Levosalbutamol Sulphate and Ipratropium Bromide in bulk and pharmaceutical dosage form using a reverse-phase Zorbax Eclipse Plus C8 column (250 mm × 4.6 mm), with a mobile phase of phosphate buffer (0.05 M KH2PO4):acetonitrile (55:45 v/v, pH 3.5 adjusted with ortho-phosphoric acid). The flow rate was 1.0 mL/min and detection was carried out at 212 nm. The retention times of Levosalbutamol Sulphate and Ipratropium Bromide were 2.2007 and 2.6611 min, respectively. The correlation coefficients of Levosalbutamol Sulphate and Ipratropium Bromide were found to be 0.997 and 0.998. Calibration plots were linear over the concentration range 10-100 µg/mL for both Levosalbutamol Sulphate and Ipratropium Bromide. The LOD and LOQ of Levosalbutamol Sulphate were 2.520 µg/mL and 7.638 µg/mL, while those of Ipratropium Bromide were 1.201 µg/mL and 3.640 µg/mL. The accuracy of the proposed method was determined by recovery studies and found to be 100.15% for Levosalbutamol Sulphate and 100.19% for Ipratropium Bromide, respectively. The method was validated for accuracy, linearity, sensitivity, precision, robustness and system suitability. The proposed method could be utilized for routine analysis of Levosalbutamol Sulphate and Ipratropium Bromide in bulk and pharmaceutical capsule dosage form.

Keywords: levosalbutamol sulphate, ipratropium bromide, RP-HPLC, phosphate buffer, acetonitrile

Procedia PDF Downloads 324
25101 A Novel Stator Resistance Estimation Method and Control Design of Speed-Sensorless Induction Motor Drives

Authors: N. Ben Si Ali, N. Benalia, N. Zarzouri

Abstract:

Speed-sensorless systems have been intensively studied in recent years, mainly because of their economic benefit, the fragility of mechanical sensors, and the difficulty of installing this type of sensor in many applications. Such systems suffer from instability problems and sensitivity to parameter mismatch at low-speed operation. In this paper, an analysis of adaptive observer stability with stator resistance estimation is given.

Keywords: motor drive, sensorless control, adaptive observer, stator resistance estimation

Procedia PDF Downloads 349