Search results for: insurance estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2231

1721 A Procedure for Post-Earthquake Damage Estimation Based on Detection of High-Frequency Transients

Authors: Aleksandar Zhelyazkov, Daniele Zonta, Helmut Wenzel, Peter Furtner

Abstract:

In the current research, structural health monitoring is considered for addressing the critical issue of post-earthquake damage detection. A non-standard approach to damage detection via acoustic emission is presented: acoustic emissions are monitored in the low-frequency range (up to 120 Hz), and such emissions are termed high-frequency transients. Further, a damage indicator, defined as the Time-Ratio Damage Indicator, is introduced. The indicator relies on time-instance measurements of damage initiation and deformation peaks. Based on these time-instance measurements, a procedure for estimating the maximum drift ratio is proposed. Monitoring data from a shaking-table test of a full-scale reinforced concrete bridge pier are used. Damage of the experimental column is successfully detected and the proposed damage indicator is calculated.

Keywords: acoustic emission, damage detection, shaking table test, structural health monitoring

Procedia PDF Downloads 231
1720 Estimation and Validation of Free Lime Analysis of Clinker by Quantitative Phase Analysis Using X-Ray Diffraction

Authors: Suresh Palla, Kalpna Sharma, Gaurav Bhatnagar, S. K. Chaturvedi, B. N. Mohapatra

Abstract:

Determining the free lime content is especially important for judging the reactivity of the raw materials and the quality of the clinker. The free lime limit is not the same for all cements; it depends on several factors, especially the temperature reached during burning and the grain size distribution of the cement after grinding. Estimation of free lime by the conventional method is influenced by the presence of portlandite and misrepresents the actual free lime content of the clinker for quality-control purposes. To check product quality against the standard specifications, i.e. whether it lies within the quality limits or not, a reliable, precise, and highly reproducible way to quantify the relative phase abundances in Portland cement clinker and Portland cements is to use X-ray diffraction (XRD) in combination with the Rietveld method. In the present study, a methodology using XRD was proposed to validate the free lime results obtained by the conventional method. The XRD and TG/DTA results confirm the presence of portlandite in the clinker, supporting the decision taken on the free lime results obtained by the conventional method.

Keywords: free lime, quantitative phase analysis, conventional method, X-ray diffraction

Procedia PDF Downloads 136
1719 Elastic Deformation of Multistory RC Frames under Lateral Loads

Authors: Hamdy Elgohary, Majid Assas

Abstract:

Estimation of lateral displacement and interstory drifts represents a major step in the design of multistory frames. In the preliminary design stage, it is essential to perform a fast check of the expected values of lateral deformations. This step helps to ensure that the expected values comply with the design code requirements. Also, in some cases during or after the detailed design stage, a design reviewer may need to carry out a fast check of lateral deformations. In the present paper, a parametric study is carried out on the factors affecting the lateral displacements of multistory frame buildings. Based on the results of the parametric study, simplified empirical equations are recommended for the direct determination of the lateral deflection of multistory frames. The results obtained using the recommended equations have been compared with the results obtained by finite element analysis. The comparison shows that the proposed equations lead to a good approximation for the estimation of the lateral deflection of multistory RC frame buildings.

Keywords: lateral deflection, interstory drift, approximate analysis, multistory frames

Procedia PDF Downloads 271
1718 Estimation of Human Absorbed Dose Using Compartmental Model

Authors: M. Mousavi-Daramoroudi, H. Yousefnia, F. Abbasi-Davani, S. Zolghadri

Abstract:

Dosimetry is an indispensable and valuable part of patient treatment planning, used to minimize the absorbed dose in vital tissues. In this study, a compartmental model was used to estimate the human absorbed dose of 177Lu-DOTATOC from biodistribution data in wild-type rats. For this purpose, 177Lu-DOTATOC was prepared under optimized conditions and its biodistribution was studied in male Syrian rats up to 168 h. The compartmental model was applied to describe mathematically the behaviour of the compound in each tissue at different times. Dosimetric estimation of the complex was performed using the Radiation Absorbed Dose Assessment Resource (RADAR). The biodistribution data showed high accumulation in the adrenal and pancreas, the major expression sites for the somatostatin receptor (SSTR). The kidneys, as the major route of excretion, receive 0.037 mSv/MBq, while the pancreas and adrenal receive 0.039 and 0.028 mSv/MBq, respectively. Using this method increased the number of accumulated-activity data points and provided further information on tissue uptake, which improves the precision of the dosimetric calculations.
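
As an illustration of the kind of compartmental fit described above (a hedged sketch only, not the authors' model: the time points, activities, and starting values below are hypothetical), a biexponential time-activity curve can be fitted to biodistribution samples and integrated to obtain the cumulated activity that feeds an absorbed-dose tool such as RADAR.

```python
# Hedged sketch (hypothetical data, not the authors' model): fit a biexponential
# time-activity curve to biodistribution samples, then integrate it analytically
# to obtain the cumulated activity used in absorbed-dose calculations.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([1, 4, 24, 48, 96, 168.0])       # hours post-injection (hypothetical)
a = np.array([3.1, 2.6, 1.8, 1.2, 0.6, 0.3])  # hypothetical %ID/g in a tissue

def biexp(t, a1, l1, a2, l2):
    """A(t) = a1*exp(-l1*t) + a2*exp(-l2*t), a two-compartment washout form."""
    return a1 * np.exp(-l1 * t) + a2 * np.exp(-l2 * t)

popt, _ = curve_fit(biexp, t, a, p0=[2.0, 0.1, 1.5, 0.01], bounds=(0, np.inf))
a1, l1, a2, l2 = popt

# Cumulated activity = integral of A(t) from 0 to infinity (closed form for exponentials).
cumulated = a1 / l1 + a2 / l2
print("fitted parameters:", np.round(popt, 4))
print("cumulated activity (%ID/g h):", round(cumulated, 1))
```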

Keywords: compartmental modeling, human absorbed dose, ¹⁷⁷Lu-DOTATOC, Syrian rats

Procedia PDF Downloads 195
1717 PostureCheck with the Kinect and Proficio: Posture Modeling for Exercise Assessment

Authors: Elham Saraee, Saurabh Singh, Margrit Betke

Abstract:

Evaluation of a person’s posture while exercising is important in physical therapy. During a therapy session, a physical therapist or a monitoring system must assure that the person is performing an exercise correctly to achieve the desired therapeutic effect. In this work, we introduce a system called POSTURECHECK for exercise assessment in physical therapy. POSTURECHECK assesses the posture of a person who is exercising with the Proficio robotic arm while being recorded by the Microsoft Kinect interface. POSTURECHECK extracts unique features from the person’s upper body during the exercise, and classifies the sequence of postures as correct or incorrect using Bayesian estimation and majority voting. If POSTURECHECK recognizes an incorrect posture, it specifies what the user can do to correct it. The result of our experiment shows that POSTURECHECK is capable of recognizing the incorrect postures in real time while the user is performing an exercise.

Keywords: Bayesian estimation, majority voting, Microsoft Kinect, PostureCheck, Proficio robotic arm, upper body physical therapy

Procedia PDF Downloads 284
1716 Comparison of Petrophysical Relationship for Soil Water Content Estimation at Peat Soil Area Using GPR Common-Offset Measurements

Authors: Nurul Izzati Abd Karim, Samira Albati Kamaruddin, Rozaimi Che Hasan

Abstract:

An appropriate petrophysical relationship is needed for soil water content (SWC) estimation, especially when using ground penetrating radar (GPR). Ground penetrating radar is a geophysical tool that provides SWC indirectly. This paper examines the performance of a few published petrophysical relationships in obtaining SWC estimates from in-situ GPR common-offset survey measurements, compared with gravimetric measurements, at a peat soil area. Gravimetric measurements were conducted to support the GPR measurements for the accuracy assessment. Further, GPR with dual frequencies (250 MHz and 700 MHz) was used in the survey measurements to obtain the dielectric permittivity. Three empirical equations (i.e., Roth’s equation, Schaap’s equation and Idi’s equation) were selected for the study and used to compute the soil water content from the dielectric permittivity of the GPR profile. The results indicate that Schaap’s equation provides a strong correlation between the SWC measured from the GPR data sets and the gravimetric measurements.
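
The conversion step has the same shape in all such relationships: the permittivity recovered from the GPR wave velocity is passed through an empirical polynomial or mixing model. The sketch below is illustrative only; Topp's widely published polynomial is used purely as a stand-in, since the Roth/Schaap/Idi coefficients are not quoted in the abstract, and the velocities are hypothetical.

```python
# Minimal sketch: GPR wave velocity -> dielectric permittivity -> volumetric SWC.
# Topp's polynomial stands in for the study's petrophysical relationships.
import numpy as np

def swc_topp(eps):
    """Volumetric water content (m3/m3) from relative dielectric permittivity."""
    return -5.3e-2 + 2.92e-2 * eps - 5.5e-4 * eps**2 + 4.3e-6 * eps**3

def permittivity_from_velocity(v):
    """Relative permittivity from GPR wave velocity v (m/ns): eps = (c/v)^2."""
    c = 0.3  # speed of light in m/ns
    return (c / v) ** 2

v = np.array([0.06, 0.08, 0.10])          # hypothetical velocities from common-offset data
eps = permittivity_from_velocity(v)
print(np.round(swc_topp(eps), 3))          # estimated SWC for each measurement
```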

Keywords: common-offset measurements, ground penetrating radar, petrophysical relationship, soil water content

Procedia PDF Downloads 252
1715 Estimation of Mobility Parameters and Threshold Voltage of an Organic Thin Film Transistor Using an Asymmetric Capacitive Test Structure

Authors: Rajesh Agarwal

Abstract:

Carrier mobility at the organic/insulator interface is essential to the performance of organic thin film transistors (OTFTs). The present work describes the estimation of field-dependent mobility (FDM) parameters and the threshold voltage of an OTFT using a simple, easy-to-fabricate, two-terminal asymmetric capacitive test structure and admittance measurements. Conventionally, transfer characteristics are used to estimate the threshold voltage in an OTFT with field-independent mobility (FIDM). Yet, this technique fails to give accurate results for devices with high contact resistance and field-dependent mobility. In this work, a new technique is presented for the characterization of a long channel organic capacitor (LCOC). The proposed technique helps in the accurate estimation of the mobility enhancement factor (γ), the threshold voltage (V_th) and the band mobility (µ₀) using capacitance-voltage (C-V) measurements on the OTFT. This technique also removes the need to fabricate short-channel OTFTs or metal-insulator-metal (MIM) structures for C-V measurements. To understand the behavior of the devices and ease the analysis, a transmission-line compact model is developed. A 2-D numerical simulation was carried out to illustrate the correctness of the model. Results show that the proposed technique estimates device parameters accurately even in the presence of contact resistance and field-dependent mobility. Pentacene/poly(4-vinyl phenol) based top-contact bottom-gate OTFTs were fabricated to illustrate the operation and advantages of the proposed technique. A small signal with frequency varying from 1 kHz to 5 kHz and a gate potential ranging from +40 V to -40 V were applied to the devices for the measurements.

Keywords: capacitance, mobility, organic, thin film transistor

Procedia PDF Downloads 165
1714 A Targeted Maximum Likelihood Estimation for a Non-Binary Causal Variable: An Application

Authors: Mohamed Raouf Benmakrelouf, Joseph Rynkiewicz

Abstract:

Targeted maximum likelihood estimation (TMLE) is a well-established method for causal effect estimation with desirable statistical properties. TMLE is a doubly robust, maximum likelihood based approach that includes a secondary targeting step that optimizes the target statistical parameter. A causal interpretation of the statistical parameter requires the assumptions of the Rubin causal framework. The causal effect of a binary variable, E, on an outcome, Y, is defined in terms of comparisons between two potential outcomes, as E[YE=1 − YE=0]. Our aim in this paper is to present an adaptation of the TMLE methodology to estimate the causal effect of a non-binary categorical variable, together with a large application. We propose a coding of the initial data that binarizes the variable of interest. For each category, we obtain a transformation of the non-binary variable of interest into a binary variable, taking the value 1 to indicate the presence of the category (or group of categories) for an individual and 0 otherwise. Such a dummy variable makes it possible to have a pair of potential outcomes and to oppose one category (or group of categories) to another category (or group of categories). Let E be a non-binary variable of interest. We propose a complete disjunctive coding of the variable E. We transform the initial variable to obtain a set of binary vectors (dummy variables), E = (Ee : e ∈ {1, ..., |E|}), where each vector (variable), Ee, takes the value 0 when its category is not present and the value 1 when its category is present, which allows a pairwise TMLE to be computed, comparing the difference in the outcome between one category and all remaining categories. To illustrate the application of our strategy, we first present the implementation of TMLE to estimate the causal effect of a non-binary variable on an outcome using simulated data. Secondly, we apply our TMLE adaptation to survey data from the French Political Barometer (CEVIPOF) to estimate the causal effect of education level (a five-level variable) on a potential vote in favor of the French extreme right candidate Jean-Marie Le Pen. Counterfactual reasoning requires us to consider some causal questions (additional causal assumptions), leading to a different coding of E as a set of binary vectors, E = (Ee : e ∈ {2, ..., |E|}), where each vector (variable), Ee, takes the value 0 when the first category (the reference category) is present and the value 1 when its own category is present, which allows a pairwise TMLE to be applied, comparing the difference in the outcome between the first level (fixed) and each remaining level. We confirmed that an increase in the level of education decreases the voting rate for the extreme right party.
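
The coding step described above amounts to building one dummy exposure per category and then running a standard binary TMLE for each contrast. A minimal sketch of that coding step is given below; the education levels, outcomes, and the crude comparison are hypothetical and only stand in for the targeted estimation itself.

```python
# Sketch of complete disjunctive (one-hot) coding of a non-binary variable E,
# producing one binary exposure per category for pairwise contrasts.
import pandas as pd

df = pd.DataFrame({
    "education": ["none", "primary", "secondary", "bachelor", "master",
                  "primary", "secondary", "none"],          # hypothetical 5-level E
    "vote_far_right": [1, 1, 0, 0, 0, 1, 0, 1],             # hypothetical outcome Y
})

dummies = pd.get_dummies(df["education"], prefix="E").astype(int)
data = pd.concat([df, dummies], axis=1)

# Each column E_cat equals 1 if the category is present, 0 otherwise; each would
# then be passed (with covariates W) to a standard binary TMLE to contrast that
# category against all the others. Here only a crude difference is printed.
for col in dummies.columns:
    crude = data.groupby(col)["vote_far_right"].mean()
    diff = crude.get(1, float("nan")) - crude.get(0, float("nan"))
    print(col, "crude outcome difference:", round(diff, 3))
```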

Keywords: statistical inference, causal inference, super learning, targeted maximum likelihood estimation

Procedia PDF Downloads 103
1713 Role of Spatial Variability in the Service Life Prediction of Reinforced Concrete Bridges Affected by Corrosion

Authors: Omran M. Kenshel, Alan J. O'Connor

Abstract:

Estimating the service life of Reinforced Concrete (RC) bridge structures located in corrosive marine environments is of great importance to their owners/engineers. Traditionally, bridge owners/engineers relied mostly on subjective engineering judgment, e.g. visual inspection, in their estimation approach. However, because financial resources are often limited, rational calculation methods of estimation are needed to aid in making reliable and more accurate predictions for the service life of RC structures, in order to direct funds to the bridges found to be the most critical. Criticality of the structure can be considered either from the Structural Capacity (i.e. Ultimate Limit State) or from the Serviceability viewpoint, whichever is adopted. This paper considers the service life of the structure only from the Structural Capacity viewpoint. Considering the great variability associated with the parameters involved in the estimation process, a probabilistic approach is most suited. The probabilistic modelling adopted here used the Monte Carlo simulation technique to estimate the Reliability (i.e. Probability of Failure) of the structure under consideration. In this paper the authors used their own experimental data for the Correlation Length (CL) of the most important deterioration parameters. The CL is a parameter of the Correlation Function (CF) by which the spatial fluctuation of a certain deterioration parameter is described. The CL data used here were produced by analyzing 45 chloride profiles obtained from a 30-year-old RC bridge located in a marine environment. The service life of the structure was predicted in terms of the load carrying capacity of an RC bridge beam girder. The analysis showed that the influence of spatial variability (SV) is only evident if the reliability of the structure is governed by Flexure failure rather than by Shear failure.
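
The Monte Carlo step referred to above reduces, in its simplest form, to sampling the uncertain resistance and load variables and counting how often the corroded resistance falls below the load effect. The sketch below uses purely illustrative distributions, not the paper's calibrated models or correlation structure.

```python
# Illustrative Monte Carlo reliability sketch: probability of failure is the
# fraction of samples in which corroded flexural resistance < load effect.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 200_000

section_loss = rng.beta(2, 8, n)          # hypothetical fraction of rebar area lost to corrosion
resistance = rng.lognormal(mean=np.log(1800), sigma=0.10, size=n) * (1 - section_loss)
load_effect = rng.normal(loc=900, scale=150, size=n)   # hypothetical load effect (same units)

pf = np.mean(resistance < load_effect)     # probability of failure
beta = -norm.ppf(pf)                       # corresponding reliability index
print(f"Pf = {pf:.4f}, beta = {beta:.2f}")
```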

Keywords: Chloride-induced corrosion, Monte-Carlo simulation, reinforced concrete, spatial variability

Procedia PDF Downloads 473
1712 Parameter Estimation of Additive Genetic and Unique Environment (AE) Model on Diabetes Mellitus Type 2 Using Bayesian Method

Authors: Andi Darmawan, Dewi Retno Sari Saputro, Purnami Widyaningsih

Abstract:

Diabetes mellitus (DM) is a chronic disease in humans that occurs when the pancreas cannot produce enough insulin or when the body uses insulin ineffectively, which causes an increased level of glucose in the blood, called hyperglycemia. In Indonesia, DM is a serious health problem because it can cause blindness, kidney disease, diabetic foot (gangrene), and stroke. DM can also be divided into types based on the main causes: type 1, type 2, and gestational. Diabetes type 1, previously known as insulin-dependent diabetes, is due to a lack of insulin production. Diabetes type 2, previously known as non-insulin-dependent diabetes, is due to ineffective use of insulin, while gestational diabetes is hyperglycemia found during pregnancy. The type most commonly found in patients is DM type 2. The main factors of this disease are genetic (A) and life style (E). A disease with these two factors can be described with the additive genetic and unique environment (AE) model. This article discusses parameter estimation of the AE model using the Bayesian method and a simulation of character inheritance from parent to offspring. In the AE model, there are a response variable, predictor variables, and parameters that represent the population under study. The population can be measured through a random sample. The response and predictor variables can be determined from the sample, while the parameters are unknown, so the parameters must be estimated based on the sample. Estimation of the AE model parameters was obtained from a joint posterior distribution. The simulation was conducted to obtain the values of the genetic variance and the life style variance. The results of the simulation are 0.3600 for the genetic variance and 0.0899 for the life style variance. Therefore, the variance of the genetic factor in DM type 2 is greater than that of the life style factor.

Keywords: AE model, Bayesian method, diabetes mellitus type 2, genetic, life style

Procedia PDF Downloads 284
1711 Approximating Maximum Speed on Road from Curvature Information of Bezier Curve

Authors: M. Yushalify Misro, Ahmad Ramli, Jamaludin M. Ali

Abstract:

Bezier curves have useful properties for the path generation problem; for instance, they can generate the reference trajectory for vehicles so as to satisfy the path constraints. Such algorithms join cubic Bezier curve segments smoothly to generate the path. One of the useful properties of a Bezier curve is its curvature. In mathematics, curvature is the amount by which a geometric object deviates from being flat, or straight in the case of a line. An extrinsic example of curvature is a circle, where the curvature is equal to the reciprocal of its radius at any point on the circle. The smaller the radius, the higher the curvature, and thus the more sharply the vehicle needs to turn. In this study, we use a Bezier curve to fit a highway-like curve. We use a different approach to find the best approximation for the curve so that it resembles a highway-like curve. We compute the curvature value by analytical differentiation of the Bezier curve. We then compute the maximum driving speed using the curvature information obtained. Our research rests on some assumptions; first, the Bezier curve estimates the real shape of the curve, which can be verified visually. Even though the fitting process of the Bezier curve does not interpolate exactly the curve of interest, we believe that the estimation of speed is acceptable. We verified our result against a manual calculation of the curvature from the map.
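
A short worked sketch of that computation follows; the control points and the lateral-acceleration limit are illustrative assumptions, not the paper's road data. The curvature of a cubic Bezier curve is κ(t) = |B'(t) × B''(t)| / |B'(t)|³, and a comfort limit a_lat on lateral acceleration gives v_max = sqrt(a_lat / κ_max).

```python
# Sketch: analytical curvature of a cubic Bezier curve and the speed limit it implies.
import numpy as np

P = np.array([[0.0, 0.0], [40.0, 5.0], [80.0, 40.0], [120.0, 45.0]])  # control points (m), illustrative

def bezier_derivatives(t, P):
    """First and second derivatives of a cubic Bezier curve at parameter t."""
    P0, P1, P2, P3 = P
    d1 = 3*(1-t)**2*(P1-P0) + 6*(1-t)*t*(P2-P1) + 3*t**2*(P3-P2)
    d2 = 6*(1-t)*(P2 - 2*P1 + P0) + 6*t*(P3 - 2*P2 + P1)
    return d1, d2

def curvature(t, P):
    d1, d2 = bezier_derivatives(t, P)
    return abs(d1[0]*d2[1] - d1[1]*d2[0]) / np.linalg.norm(d1)**3

a_lat = 2.0                                    # assumed comfortable lateral acceleration (m/s^2)
ts = np.linspace(0, 1, 200)
kappa_max = max(curvature(t, P) for t in ts)
v_max = np.sqrt(a_lat / kappa_max)             # from v^2 * kappa <= a_lat
print(f"max curvature: {kappa_max:.5f} 1/m, max speed: {v_max*3.6:.1f} km/h")
```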

Keywords: speed estimation, path constraints, reference trajectory, Bezier curve

Procedia PDF Downloads 375
1710 The Impact of Board Characteristics on Firm Performance: Evidence from Banking Industry in India

Authors: Manmeet Kaur, Madhu Vij

Abstract:

The board of directors in a firm performs the primary role of an internal control mechanism. This study seeks to understand the relationship between internal governance and the performance of banks in India. The paper investigates the effect of board structure (proportion of non-executive directors, gender diversity, board size and meetings per year) on firm performance. It evaluates the impact of corporate governance mechanisms on banks’ financial performance using panel data for 28 banks listed on the National Stock Exchange of India for the period 2008-2014. Return on Assets, Return on Equity, Tobin’s Q and Net Interest Margin were used as the financial performance indicators. To estimate the relationship between governance and bank performance, the study initially uses Pooled Ordinary Least Squares (OLS) and Generalized Least Squares (GLS) estimation. A well-developed panel Generalized Method of Moments (GMM) estimator is then applied to investigate the dynamic nature of the performance and governance relationship. The study empirically confirms that the two-step system GMM approach controls for the problems of unobserved heterogeneity and endogeneity, as compared to the OLS and GLS approaches. The results suggest that banks with small boards, boards with female members, and boards that meet more frequently tend to be more efficient and subsequently have a positive impact on the performance of banks. The study offers insights to policy makers interested in enhancing the quality of governance of banks in India. The findings also suggest that board structure plays a vital role in the improvement of the corporate governance mechanism for financial institutions. There is a need for efficient boards in banks to improve the overall health of financial institutions and the economic development of the country.

Keywords: board of directors, corporate governance, GMM estimation, Indian banking

Procedia PDF Downloads 260
1709 Count Data Regression Modeling: An Application to Spontaneous Abortion in India

Authors: Prashant Verma, Prafulla K. Swain, K. K. Singh, Mukti Khetan

Abstract:

Objective: In India, around 20,000 women die every year due to abortion-related complications. In the modelling of count variables, there is sometimes a preponderance of zero counts. This article concerns the estimation of various count regression models to predict the average number of spontaneous abortions among women in the Punjab state of India. It also assesses the factors associated with the number of spontaneous abortions. Materials and methods: The study included 27,173 married women of Punjab obtained from the DLHS-4 survey (2012-13). Poisson regression (PR), negative binomial (NB) regression, zero hurdle negative binomial (ZHNB), and zero-inflated negative binomial (ZINB) models were employed to predict the average number of spontaneous abortions and to identify the determinants affecting the number of spontaneous abortions. Results: Statistical comparisons among the four estimation methods revealed that the ZINB model provides the best prediction for the number of spontaneous abortions. Antenatal care (ANC) place, place of residence, total children born to a woman, and the woman's education and economic status were found to be the most significant factors affecting the occurrence of spontaneous abortion. Conclusions: The study offers a practical demonstration of techniques designed to handle count variables. Statistical comparisons among the four estimation models revealed that the ZINB model provided the best prediction for the number of spontaneous abortions and is recommended for predicting the number of spontaneous abortions. The study suggests that women should receive institutional antenatal care to attain limited parity. It also advocates promoting higher education among women in Punjab, India.
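
Such a comparison is typically set up by fitting the candidate count models and ranking them on an information criterion. The sketch below uses synthetic data only (not the DLHS-4 analysis); the covariates and the zero-inflation mechanism are invented for illustration, and statsmodels' count-model classes stand in for the software actually used.

```python
# Hedged sketch: fit Poisson, negative binomial, and zero-inflated negative
# binomial count models on synthetic data and compare them on AIC.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "educ_years": rng.integers(0, 16, n),
    "children": rng.integers(0, 6, n),
})
# hypothetical zero-inflated outcome: most women report zero spontaneous abortions
df["abortions"] = np.where(rng.random(n) < 0.8, 0, rng.poisson(1.5, n))

X = sm.add_constant(df[["educ_years", "children"]])
fits = {
    "Poisson": sm.Poisson(df["abortions"], X).fit(disp=0),
    "NegBin": sm.NegativeBinomial(df["abortions"], X).fit(disp=0),
    "ZINB": ZeroInflatedNegativeBinomialP(df["abortions"], X, exog_infl=X).fit(disp=0, maxiter=500),
}
for name, fit in fits.items():       # lower AIC indicates the better-fitting specification
    print(name, round(fit.aic, 1))
```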

Keywords: count data, spontaneous abortion, Poisson model, negative binomial model, zero hurdle negative binomial, zero-inflated negative binomial, regression

Procedia PDF Downloads 155
1708 Wind Energy Resources Assessment and Micrositing on Different Areas of Libya: The Case Study in Darnah

Authors: F. Ahwide, Y. Bouker, K. Hatem

Abstract:

This paper presents a long-term wind data analysis in terms of annual and diurnal variations at different areas of Libya. Wind speed and direction data, recorded every ten minutes over a period of at least two years, are used in the analysis. The ‘WindPRO’ software and an Excel workbook were used for the wind statistics and energy calculations. As for Derna, the average speeds at 10 m, 20 m, and 40 m are respectively 6.57 m/s, 7.18 m/s, and 8.09 m/s. The highest wind speeds are observed at SSW, followed by the S, WNW and NW sectors. The lowest wind speeds are observed between the N and E sectors. The most frequent wind directions are NW and NNW; hence, wind turbines can be installed facing these directions. The most powerful sector is NW (29.4% of the total expected wind energy), followed by 19.9% SSW, 11.9% NNW, 8.6% WNW and 8.2% S. Furthermore, in Al-Maqrun the most powerful sector is W (26.8% of the total expected wind energy), followed by 12.3% WSW and 9.5% WNW, while in Goterria the most powerful sector is S (14.8% of the total expected wind energy), followed by SSE, SE, and WSW. In Misalatha, the most powerful sector is S, which by far represents 28.5% of the expected power, followed by SSE and SE. As for Tarhuna, it is by far SSE and SE, each representing two times the expected energy of the third most powerful sector (NW). In Al-Asaaba, it is SSE, which by far represents 50% of the expected power, followed by S. It can be noted that the high frequency of south-direction winds, which come from the desert, could cause a high frequency of dust episodes. This fact should therefore be taken into account in order to take appropriate measures to prevent wind turbine deterioration. In the Excel workbook, an estimation of the annual energy yield at the positions of the Derna, Al-Maqrun, Tarhuna, and Al-Asaaba meteorological masts has been done, considering a generic wind turbine of 1.65 MW (mtORRES, TWT 82-1.65MW) at the position of the meteorological mast. Three other turbines have been tested. At 80 m, the estimated energy yield for Derna, Al-Maqrun, Tarhuna, and Al-Asaaba is 6.78 GWh or 3390 equivalent hours, 5.80 GWh or 2900 equivalent hours, 4.91 GWh or 2454 equivalent hours and 5.08 GWh or 2541 equivalent hours respectively. This seems a fair value in the context of a possible development of a wind energy project in these areas, considering a value of 2400 equivalent hours as an approximate limit for a wind farm to be economically profitable. Furthermore, an estimation of the annual energy yield at the positions of the Misalatha, Azizyah and Goterria meteorological masts has been done, considering a generic wind turbine of 2 MW. We found that, at 80 m, the estimated energy yield is 3.12 GWh or 1557 equivalent hours, 4.47 GWh or 2235 equivalent hours and 4.07 GWh or 2033 equivalent hours respectively. This seems a very poor value in the context of a possible development of a wind energy project in these areas, considering a value of 2400 equivalent hours as an approximate limit for a wind farm to be economically profitable. In any case, more data and a detailed wind farm study would be necessary to draw conclusions.
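
The "equivalent hours" figures above are simply the annual energy yield divided by the turbine's rated power, compared against the quoted 2400 h profitability threshold. A tiny worked example using the 2 MW sites quoted above (rounding aside):

```python
# Worked example: equivalent (full-load) hours = annual energy yield / rated power.
def equivalent_hours(aep_gwh, rated_mw):
    return aep_gwh * 1000.0 / rated_mw          # GWh -> MWh, divided by MW

for site, aep, rated in [("Misalatha", 3.12, 2.0),
                         ("Azizyah", 4.47, 2.0),
                         ("Goterria", 4.07, 2.0)]:
    h = equivalent_hours(aep, rated)
    print(f"{site}: {h:.0f} h -> {'above' if h >= 2400 else 'below'} the 2400 h threshold")
```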

Keywords: wind turbines, wind data, energy yield, micrositing

Procedia PDF Downloads 188
1707 In vitro Estimation of Genotoxic Lesions in Peripheral Blood Lymphocytes of Rat Exposed to Organophosphate Pesticides

Authors: A. Ojha, Y. K. Gupta

Abstract:

Organophosphate (OP) pesticides are among the most widely used synthetic chemicals for controlling a wide variety of pests throughout the world. Chlorpyrifos (CPF), methyl parathion (MPT), and malathion (MLT) are among the most extensively used OP pesticides in India. DNA strand breaks and DNA-protein crosslinks (DPC) are toxic lesions associated with the mechanisms of toxicity of genotoxic compounds. In the present study, we examined the potential of CPF, MPT, and MLT, individually and in combination, to cause DNA strand breakage and DPC formation. Peripheral blood lymphocytes of rats were exposed to 1/4 and 1/10 of the LC50 dose of CPF, MPT, and MLT for 2, 4, 8, and 12 h. DNA strand breaks were measured by the comet assay and expressed as a DNA damage index, while DPC estimation was done by fluorescence emission. There was a significant, marked increase in DNA damage and DNA-protein crosslink formation in a time- and dose-dependent manner. It was also observed that MPT caused the highest level of DNA damage compared to the other OP compounds studied. Thus, from the present study, we can conclude that the studied pesticides have genotoxic potential. The pesticide mixture did not potentiate the toxicity of the individual compounds. Nonetheless, additional in vivo data are required before a definitive conclusion can be drawn regarding hazard prediction in humans.

Keywords: organophosphate, pesticides, DNA damage, DNA protein crosslink, genotoxic

Procedia PDF Downloads 356
1706 Parameter Estimation for the Oral Minimal Model and Parameter Distinctions Between Obese and Non-obese Type 2 Diabetes

Authors: Manoja Rajalakshmi Aravindakshan, Devleena Ghosh, Chittaranjan Mandal, K. V. Venkatesh, Jit Sarkar, Partha Chakrabarti, Sujay K. Maity

Abstract:

The Oral Glucose Tolerance Test (OGTT) is the primary test used to diagnose type 2 diabetes mellitus (T2DM) in a clinical setting. Analysis of OGTT data using the Oral Minimal Model (OMM), along with the rate of appearance of ingested glucose (Ra), is performed to study differences in model parameters between control and T2DM groups. The differences in the model parameters give insight into the behaviour and physiology of T2DM. The model is also used to find parameter differences between obese and non-obese T2DM subjects, and the sensitive parameters were correlated with known physiological findings. Sensitivity analysis is performed to understand the changes in parameter values with model output, and appropriate statistical tests are done to support the findings. This appears to be the first preliminary application of the OMM with obesity as a distinguishing factor in understanding T2DM from the estimated parameters of the insulin-glucose model and in relating the statistical differences in parameters to diabetes pathophysiology.
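
For readers unfamiliar with the model, the OMM couples plasma glucose G and remote insulin action X through dG/dt = −(SG + X)·G + SG·Gb + Ra(t)/V and dX/dt = −p2·[X − SI·(I(t) − Ib)]. The sketch below simulates these equations with nominal, purely illustrative parameter values and an assumed Ra(t) and insulin profile (not the estimates reported by the authors); in the estimation step these parameters would instead be fitted to each subject's OGTT data.

```python
# Illustrative simulation of the Oral Minimal Model with nominal parameters.
import numpy as np
from scipy.integrate import solve_ivp

p = dict(SG=0.02, SI=7e-4, p2=0.02, V=1.6, Gb=90.0, Ib=10.0)  # nominal values (assumed)

def Ra(t):
    """Assumed rate of appearance of ingested glucose (mg/kg/min): a lagged pulse."""
    return 6.0 * (t / 30.0) * np.exp(1.0 - t / 30.0)

def insulin(t):
    """Assumed plasma insulin excursion (uU/ml) during the OGTT."""
    return p["Ib"] + 50.0 * (t / 40.0) * np.exp(1.0 - t / 40.0)

def omm(t, y):
    G, X = y
    dG = -(p["SG"] + X) * G + p["SG"] * p["Gb"] + Ra(t) / p["V"]
    dX = -p["p2"] * (X - p["SI"] * (insulin(t) - p["Ib"]))
    return [dG, dX]

sol = solve_ivp(omm, (0, 180), [p["Gb"], 0.0], t_eval=np.arange(0, 181, 30))
print(np.round(sol.y[0], 1))   # simulated glucose; fitting this model to OGTT data
                               # would recover SG, SI, p2 and V for each subject
```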

Keywords: oral minimal model, OGTT, obese and non-obese T2DM, mathematical modeling, parameter estimation

Procedia PDF Downloads 92
1705 An Application of Sinc Function to Approximate Quadrature Integrals in Generalized Linear Mixed Models

Authors: Altaf H. Khan, Frank Stenger, Mohammed A. Hussein, Reaz A. Chaudhuri, Sameera Asif

Abstract:

This paper discusses a novel approach to approximating the quadrature integrals that arise in the estimation of likelihood parameters for generalized linear mixed models (GLMM). Bayesian methodology likewise requires the computation of multidimensional integrals with respect to posterior distributions; such computations are not only tedious and cumbersome but in some situations impossible because of singularities, irregular domains, etc. An attempt has been made in this work to apply Sinc function based quadrature rules to approximate intractable integrals, as Sinc based methods have several advantages: for example, the order of convergence is exponential, the rules work very well in the neighborhood of singularities, and they are in general quite stable and provide highly accurate, double-precision estimates. To our knowledge, the Sinc function based approach is utilized here for the first time in the statistical domain, and its viability and future scope for parameter estimation in GLMMs as well as in some other statistical areas are discussed.
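
To make the idea concrete, a Sinc-type quadrature on an infinite interval is the trapezoidal rule on a uniform grid, h·Σ f(kh), whose error decays exponentially for suitable analytic integrands. A minimal illustration follows; the step-size rule used here is one common choice, shown only for demonstration.

```python
# Minimal illustration of a Sinc-type quadrature rule on the real line:
# h * sum_k f(k*h) converges exponentially fast for suitable analytic integrands.
import numpy as np

def sinc_quadrature(f, h, N):
    """Approximate the integral of f over (-inf, inf) by h * sum_{k=-N}^{N} f(k*h)."""
    k = np.arange(-N, N + 1)
    return h * np.sum(f(k * h))

f = lambda x: np.exp(-x**2)            # exact integral over the real line is sqrt(pi)
exact = np.sqrt(np.pi)
for N in (4, 8, 16, 32):
    h = np.pi / np.sqrt(N)             # one common step-size scaling, h ~ c/sqrt(N)
    err = abs(sinc_quadrature(f, h, N) - exact)
    print(f"N = {N:2d}, error = {err:.2e}")
```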

Keywords: generalized linear mixed model, likelihood parameters, quadrature, Sinc function

Procedia PDF Downloads 395
1704 Deep Learning Based Fall Detection Using Simplified Human Posture

Authors: Kripesh Adhikari, Hamid Bouchachia, Hammadi Nait-Charif

Abstract:

Falls are one of the major causes of injury and death among elderly people aged 65 and above. A support system to identify such abnormal activities has become extremely important with the increase in the ageing population. Pose estimation is a challenging task, and it is even more challenging when it is performed on the unusual poses that may occur during a fall. The location of the body provides a clue to where the person is at the time of the fall. This paper presents a vision-based tracking strategy in which the available joints are grouped into three different feature points depending upon the section of the body in which they are located. The three feature points derived from different joint combinations represent the upper or head region, the mid region or torso, and the lower or leg region. Tracking is always challenging when motion is involved. Hence the idea is to locate these regions of the body in every frame and use that as the tracking strategy. Grouping the joints can be beneficial in achieving a stable region for tracking. The location of the body parts provides crucial information for distinguishing normal activities from falls.

Keywords: fall detection, machine learning, deep learning, pose estimation, tracking

Procedia PDF Downloads 189
1703 Technological Innovations and African Export Performances

Authors: Lukman Oyelami

Abstract:

Studies have identified trade as a veritable tool for inclusive economic growth and poverty reduction in developing countries. However, contrary to the overwhelming evidence of the Asian tigers as a success story of beneficial trade, many African countries still experience poverty unabated despite active engagement in trade. Consequently, this study investigates the contributory effect of technological innovation on the total export performance, and specifically the manufacturing exports, of African countries, with a view to exploring manufacturing exports as a viable option for diversification. For the empirical investigation, the system Generalized Method of Moments (sys-GMM) estimation technique was adopted, based on the econometric realities inherent in the data utilized, while the static Fixed Effects (FE) panel estimator was used for the baseline analysis and robustness checks. The conclusion from this study is that innovation generally impacts the export performance of African countries positively; however, manufacturing exports show more sensitivity to innovation than total exports. This provides a clear pathway for export diversification for the many African countries that run resource-based economies.

Keywords: innovation, export, GMM, Africa

Procedia PDF Downloads 220
1702 Missing Link Data Estimation with Recurrent Neural Network: An Application Using Speed Data of Daegu Metropolitan Area

Authors: JaeHwan Yang, Da-Woon Jeong, Seung-Young Kho, Dong-Kyu Kim

Abstract:

In intelligent transportation systems (ITS), information on link characteristics is an essential factor for planning and operation. In practical cases, however, not every link has sensors installed on it. A link that does not have data on it is called a “missing link”. The purpose of this study is to impute the data of these missing links. To obtain these data, this study applies machine learning; in particular, with deep learning, missing link data can be estimated from the data of the links that are present. For the deep learning process, this study uses a Recurrent Neural Network to handle the time-series data of the road. As input data, Dedicated Short-Range Communications (DSRC) data from Dalgubul-daero of the Daegu Metropolitan Area were fed into the learning process. The neural network structure takes the 17 links with present data as input and uses 2 hidden layers to estimate the data of 1 missing link. As a result, the forecasted data of the target link show about 94% accuracy compared with the actual data.
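
A hedged sketch of such a network is given below; the window length, layer sizes, and synthetic data are assumptions for illustration, since the abstract only specifies 17 inputs, 2 hidden layers, and 1 output.

```python
# Hedged sketch: a small recurrent network mapping speeds of 17 instrumented
# links over a time window to the speed of 1 missing link.
import numpy as np
import tensorflow as tf

timesteps, n_links = 12, 17                      # e.g. 12 past intervals, 17 input links (assumed)
X = np.random.rand(1000, timesteps, n_links).astype("float32")   # hypothetical DSRC speed sequences
y = np.random.rand(1000, 1).astype("float32")                    # hypothetical missing-link speeds

model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(32, return_sequences=True,
                              input_shape=(timesteps, n_links)),  # hidden layer 1
    tf.keras.layers.SimpleRNN(16),                                # hidden layer 2
    tf.keras.layers.Dense(1),                                     # estimated missing-link speed
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.predict(X[:3], verbose=0))            # compared against ground truth to assess accuracy
```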

Keywords: data estimation, link data, machine learning, road network

Procedia PDF Downloads 510
1701 Frailty Patterns in the US and Implications for Long-Term Care

Authors: Joelle Fong

Abstract:

Older persons are at the greatest risk of becoming frail. As survival to the age of 80 and beyond continues to increase, the health and frailty of older Americans have garnered much recent attention among policy makers and healthcare administrators. This paper examines patterns in old-age frailty within a multistate actuarial model that characterizes the stochastic process of biological ageing. Using aggregate population-level U.S. mortality data, we implement a stochastic aging model to examine cohort trends and gender differences in frailty distributions for older Americans born 1865–1894. The stochastic ageing model, which draws from the fields of actuarial science and gerontology, is well established in the literature. The implications for public health insurance programs are also discussed. Our results suggest that, on average, women tend to be frailer than men at older ages and reveal useful insights about the magnitude of the male-female differential at critical age points. Specifically, we note that the frailty statuses of males and females are actually quite comparable from ages 65 to 80. Beyond age 80, however, the frailty levels start to diverge considerably, implying that women move more quickly into worse states of health than men. Tracking average frailty by gender over 30 successive birth cohorts, we also find that frailty levels for both genders follow a distinct peak-and-trough pattern. For instance, frailty among 85-year-old American survivors increased in the years 1954-1963, decreased in 1964-1971, and again started to increase in 1972-1979. A number of factors may have accounted for these cohort differences, including differences in cohort life histories, differences in disease prevalence, differences in lifestyle and behavior, differential access to medical advances, as well as changes in environmental risk factors over time. We conclude with a discussion of the implications of our findings on spending for long-term care programs within the broader health insurance system.

Keywords: actuarial modeling, cohort analysis, frail elderly, health

Procedia PDF Downloads 244
1700 Formulation of Extended-Release Gliclazide Tablet Using a Mathematical Model for Estimation of Hypromellose

Authors: Farzad Khajavi, Farzaneh Jalilfar, Faranak Jafari, Leila Shokrani

Abstract:

The formulation of gliclazide as an extended-release tablet in 30 and 60 mg dosage forms was performed using hypromellose (HPMC K4M) as a retarding agent. Drug-release profiles were investigated in comparison with the reference Diamicron MR 30 and 60 mg tablets. The effects of powder particle size, the amount of hypromellose in the formulation, tablet hardness, and halving the tablets on the drug release profile were investigated. A mathematical model that describes the behavior of hypromellose during the initial phase of drug release was proposed for the estimation of the hypromellose content of the modified-release gliclazide 60 mg tablet. This model is based on the erosion of hypromellose in the dissolution medium. The model is applicable for describing the release profiles of insoluble drugs. Therefore, using the amount of drug dissolved in the initial phase of dissolution together with the model, the amount of hypromellose in the formulation can be predicted. The model was also used to predict the HPMC K4M content in modified-release gliclazide 30 mg and extended-release quetiapine 200 mg tablets.

Keywords: Gliclazide, hypromellose, drug release, modified-release tablet, mathematical model

Procedia PDF Downloads 222
1699 Downtime Estimation of Building Structures Using Fuzzy Logic

Authors: M. De Iuliis, O. Kammouh, G. P. Cimellaro, S. Tesfamariam

Abstract:

Community resilience has gained significant attention due to recent unexpected natural and man-made disasters. Resilience is the process of maintaining livable conditions in the event of interruptions to normally available services. Estimating the resilience of systems, ranging from individuals to communities, is a formidable task due to the complexity involved in the process. The most challenging parameter involved in the resilience assessment is the 'downtime'. Downtime is the time needed for a system to recover its services following a disaster event. Estimating the exact downtime of a system requires many inputs and resources that are not always obtainable. The uncertainties in the downtime estimation are usually handled using probabilistic methods, which necessitate acquiring large amounts of historical data. The estimation process also involves ignorance, imprecision, vagueness, and subjective judgment. In this paper, a fuzzy-based approach to estimate the downtime of building structures following earthquake events is proposed. Fuzzy logic can integrate descriptive (linguistic) knowledge and numerical data into the fuzzy system. This ability allows the use of walk-down surveys, which collect data in a linguistic or a numerical form. The use of fuzzy logic permits a fast and economical estimation of parameters that involve uncertainties. The first step of the method is to determine the building’s vulnerability. A rapid visual screening is designed to acquire information about the analyzed building (e.g. year of construction, structural system, site seismicity, etc.). Then, fuzzy logic is implemented using a hierarchical scheme to determine the building damageability, which is the main ingredient for estimating the downtime. Generally, the downtime can be divided into three main components: downtime due to the actual damage (DT1); downtime caused by rational and irrational delays (DT2); and downtime due to utilities disruption (DT3). In this work, DT1 is computed by relating the building damageability results obtained from the visual screening to already-defined component repair times available in the literature. DT2 and DT3 are estimated using the REDi Guidelines. The downtime of the building is finally obtained by combining the three components. The proposed method also allows identifying the downtime corresponding to each of the three recovery states: re-occupancy, functional recovery, and full recovery. Future work is aimed at improving the current methodology to move from the downtime to the resilience of buildings. This will provide a simple tool that can be used by the authorities for decision making.

Keywords: resilience, restoration, downtime, community resilience, fuzzy logic, recovery, damage, built environment

Procedia PDF Downloads 160
1698 Artificial Neural Network and Satellite Derived Chlorophyll Indices for Estimation of Wheat Chlorophyll Content under Rainfed Condition

Authors: Muhammad Naveed Tahir, Wang Yingkuan, Huang Wenjiang, Raheel Osman

Abstract:

Numerous models are used in prediction and decision-making processes, but most of them are linear, and linear models reach their limitations with non-linearity in the data; accurate estimation then becomes difficult. Artificial Neural Networks (ANNs) have found extensive acceptance for modeling the complex, non-linear real world. ANNs have more general and flexible functional forms than traditional statistical methods can effectively deal with. The link between information technology and agriculture will become firmer in the near future. Monitoring crop biophysical properties non-destructively can provide a rapid and accurate understanding of a crop's response to various environmental influences. Crop chlorophyll content is an important indicator of crop health and therefore of the expected crop yield. In recent years, remote sensing has been accepted as a robust tool for site-specific management by detecting crop parameters at both local and large scales. The present research combined an ANN model with satellite-derived chlorophyll indices from LANDSAT 8 imagery for real-time estimation of wheat chlorophyll. Cloud-free LANDSAT 8 scenes were acquired (February-March 2016-17) at the same time as a ground-truthing campaign in which chlorophyll was measured using a SPAD-502 meter. Different vegetation indices were derived from the LANDSAT 8 imagery using ERDAS Imagine (v.2014) software for chlorophyll determination. The vegetation indices included the Normalized Difference Vegetation Index (NDVI), Green Normalized Difference Vegetation Index (GNDVI), Chlorophyll Absorption Ratio Index (CARI), Modified Chlorophyll Absorption Ratio Index (MCARI) and Transformed Chlorophyll Absorption Ratio Index (TCARI). For ANN modeling, MATLAB and SPSS (ANN) tools were used. The Multilayer Perceptron (MLP) in MATLAB provided very satisfactory results. For training the MLP, 61.7% of the data were used; 28.3% of the data were used for validation, and the remaining 10% were used to evaluate and validate the ANN model results. For error evaluation, the sum of squares error and the relative error were used. The ANN model summary showed a sum of squares error of 10.786 and an average overall relative error of 0.099. MCARI and NDVI were revealed to be the more sensitive indices for assessing wheat chlorophyll content, with the highest coefficients of determination, R² = 0.93 and 0.90 respectively. The results suggest that the use of high spatial resolution satellite imagery for the retrieval of crop chlorophyll content with an ANN model provides an accurate, reliable assessment of crop health status at a large scale, which can help in managing crop nutrition requirements in real time.
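
The index computation and the regression step look roughly as follows. This is a hedged sketch with synthetic reflectances and SPAD values, showing only NDVI and GNDVI of the indices listed (LANDSAT 8: band 3 green, band 4 red, band 5 NIR), and scikit-learn's MLP stands in for the MATLAB/SPSS tools actually used.

```python
# Hedged sketch: compute NDVI/GNDVI from band reflectances and regress
# hypothetical SPAD chlorophyll readings on them with a small MLP.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
green = rng.uniform(0.05, 0.20, 300)
red = rng.uniform(0.03, 0.15, 300)
nir = rng.uniform(0.20, 0.50, 300)

ndvi = (nir - red) / (nir + red)
gndvi = (nir - green) / (nir + green)
spad = 20 + 60 * ndvi + rng.normal(0, 2, 300)      # hypothetical SPAD-502 readings

X = np.column_stack([ndvi, gndvi])
mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
mlp.fit(X[:200], spad[:200])
print(round(mlp.score(X[200:], spad[200:]), 2))    # held-out R^2, analogous to the reported values
```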

Keywords: ANN, chlorophyll content, chlorophyll indices, satellite images, wheat

Procedia PDF Downloads 146
1697 Analysis of the Predictive Performance of Value at Risk Estimations in Times of Financial Crisis

Authors: Alexander Marx

Abstract:

Measuring and mitigating market risk is essential for the stability of enterprises, especially for major banking corporations and investment banks. To support these risk measurement and mitigation processes, the Value at Risk (VaR) is the risk metric most commonly used by practitioners. In the past years, we have seen significant weaknesses in the predictive performance of the VaR in times of financial market crisis. To address this issue, the purpose of this study is to investigate value-at-risk (VaR) estimation models and their predictive performance by applying a series of backtesting methods to the stock market indices of the G7 countries (Canada, France, Germany, Italy, Japan, UK, US) and Europe. The study employs parametric, non-parametric, and semi-parametric VaR estimation models and is conducted over three different periods which cover the most recent financial market crises: the overall period (2006-2022), the global financial crisis period (2008-2009), and the COVID-19 period (2020-2022). Since the regulatory authorities have introduced and mandated the Conditional Value at Risk (Expected Shortfall) as an additional regulatory risk management metric, the study also analyzes and compares both risk metrics with respect to their predictive performance.
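
As a reminder of what the non-parametric (historical simulation) variant of these estimators computes, the sketch below derives a 99% VaR and Expected Shortfall from a synthetic return series and counts backtest exceptions; in practice the backtest would use rolling out-of-sample windows rather than in-sample quantiles.

```python
# Minimal sketch: historical-simulation VaR and Expected Shortfall on synthetic
# returns, with a simple exceedance count as a stand-in for a full backtest.
import numpy as np

rng = np.random.default_rng(7)
returns = rng.standard_t(df=4, size=2500) * 0.01      # hypothetical daily index returns

alpha = 0.99
var_99 = -np.quantile(returns, 1 - alpha)             # loss exceeded on ~1% of days
es_99 = -returns[returns <= -var_99].mean()           # average loss beyond the VaR

exceptions = int(np.sum(returns < -var_99))           # in-sample VaR breaches
print(f"99% VaR: {var_99:.4f}, 99% ES: {es_99:.4f}, "
      f"exceptions: {exceptions} (expected about {0.01 * len(returns):.0f})")
```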

Keywords: value at risk, financial market risk, banking, quantitative risk management

Procedia PDF Downloads 95
1696 Repeatable Scalable Business Models: Can Innovation Drive an Entrepreneur's Un-Validated Business Model?

Authors: Paul Ojeaga

Abstract:

Can the level of innovation use drive un-validated business models across regions? To what extent does industrial sector attractiveness drive firms’ success across regions at the time of start-up? This study examines the role of innovation in start-up success in six regions of the world (namely Sub-Saharan Africa, the Middle East and North Africa, Latin America, South East Asia Pacific, the European Union and the United States representing North America) using macroeconomic variables. While there have been studies using firm-level data, results from such studies are not suitable for national policy decisions. The need to drive a regional innovation policy also begs for an answer, therefore providing room for this study. Results using dynamic panel estimation show that innovation counts in the early infancy stage of the new-business life cycle. The results are robust even after controlling for time fixed effects, and the study presents variance-covariance-robust standard errors.

Keywords: industrial economics, un-validated business models, scalable models, entrepreneurship

Procedia PDF Downloads 282
1695 Ultra-High Frequency Passive Radar Coverage for Cars Detection in Semi-Urban Scenarios

Authors: Pedro Gómez-del-Hoyo, Jose-Luis Bárcena-Humanes, Nerea del-Rey-Maestre, María-Pilar Jarabo-Amores, David Mata-Moya

Abstract:

A study of the achievable coverage of passive radar systems in terrestrial traffic monitoring applications is presented. The study includes the estimation of the bistatic radar cross section of different commercial vehicle models, which provide challengingly low values that make detection really difficult. A semi-urban scenario is selected to evaluate the impact of the excess propagation losses generated by an irregular relief. A bistatic passive radar exploiting the UHF frequencies radiated by digital video broadcasting transmitters is assumed. A general method of coverage estimation using electromagnetic simulators in combination with the estimated average bistatic radar cross section of cars is applied. In order to reduce the computational cost, a hybrid solution is implemented, assuming free space for the target-receiver path but estimating the excess propagation losses for the transmitter-target one.
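
At its core, the coverage criterion compares the bistatic-radar-equation SNR at each candidate target position with a detection threshold. The sketch below evaluates that criterion on a toy geometry; every numerical value (transmit power, gains, RCS, noise figure, losses, threshold) is an illustrative assumption, not a figure from the study, and the excess propagation losses discussed above are ignored here.

```python
# Hedged sketch: free-space bistatic radar equation evaluated on a small grid of
# candidate car positions; a cell is "covered" when SNR exceeds a threshold.
import numpy as np

c = 3e8
f = 650e6                        # UHF DVB-T carrier (illustrative)
lam = c / f
Pt, Gt, Gr = 1e4, 10.0, 100.0    # assumed transmit power and antenna gains (linear)
sigma_b = 1.0                    # assumed average car bistatic RCS, m^2
k, T, B = 1.38e-23, 290.0, 8e6   # noise terms (DVB-T channel bandwidth ~8 MHz)
F, L = 10 ** 0.5, 10 ** 0.3      # assumed noise figure (5 dB) and system losses (3 dB)

tx = np.array([0.0, 0.0])        # transmitter position (m)
rx = np.array([5000.0, 0.0])     # receiver position (m)

def snr_db(target):
    rt = np.linalg.norm(target - tx)
    rr = np.linalg.norm(target - rx)
    pr = Pt * Gt * Gr * lam**2 * sigma_b / ((4 * np.pi) ** 3 * rt**2 * rr**2 * L)
    return 10 * np.log10(pr / (k * T * B * F))

grid = [np.array([x, y]) for x in np.arange(500.0, 10001.0, 2500.0) for y in (1000.0, 3000.0)]
covered = [(p[0], p[1]) for p in grid if snr_db(p) > 15.0]   # assumed 15 dB detection threshold
print(covered)
```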

Keywords: bistatic radar cross section, passive radar, propagation losses, radar coverage

Procedia PDF Downloads 336
1694 Estimating the Parameter of the Mean in a Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods

Authors: Autcha Araveeporn

Abstract:

This paper compares parameter estimation of the mean of a normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data, the Bayes estimator is derived from the prior distribution, and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After the parameter is estimated by each method, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variances of 4, 9, and 16, with sample sizes of 10, 20, 30, and 50. From the results, it can be seen that the ML and MCMC estimates are perceivably different from the true parameter when the sample size is 10 or 20 and the variance is 16. Furthermore, the Bayes estimator, computed from a prior distribution with mean 1 and variance 12, showed a significant difference in the mean when the variance is 9 and the sample size is 10 or 20.
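
For the known-variance case with a normal prior on the mean, the three estimators can be written down directly: the ML estimate is the sample mean, the Bayes estimate is the conjugate posterior mean, and the MCMC estimate averages draws from that posterior (a one-parameter Gibbs step). A minimal sketch using the study's prior mean of 1 and prior variance of 12 (the data are re-simulated here):

```python
# Minimal sketch of the three estimators for one simulated sample
# (known variance, conjugate normal prior).
import numpy as np

rng = np.random.default_rng(3)
mu_true, sigma2, n = 2.0, 16.0, 20
x = rng.normal(mu_true, np.sqrt(sigma2), n)

mu_ml = x.mean()                                   # maximum likelihood: the sample mean

mu0, tau2 = 1.0, 12.0                              # prior mean and variance
post_var = 1.0 / (n / sigma2 + 1.0 / tau2)         # conjugate posterior variance
post_mean = post_var * (x.sum() / sigma2 + mu0 / tau2)   # Bayes estimator (posterior mean)

draws = rng.normal(post_mean, np.sqrt(post_var), 5000)   # Gibbs-style draws from the posterior
mu_mcmc = draws.mean()                             # MCMC estimator

print(round(mu_ml, 3), round(post_mean, 3), round(mu_mcmc, 3))
```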

Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution

Procedia PDF Downloads 356
1693 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection

Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye

Abstract:

Skew detection and correction form an important part of digital document analysis. This is because uncompensated skew can deteriorate document features and can complicate further document image processing steps. Efficient text document analysis and digitization can rarely be achieved when a document is skewed even at a small angle. Once the documents have been digitized through the scanning system and binarization has been achieved, document skew correction is required before further image analysis. Research efforts have been put into this area, with algorithms developed to eliminate document skew. Skew angle correction algorithms can be compared based on performance criteria. The most important performance criteria are the accuracy of skew angle detection, the range of skew angles detectable, the speed of processing the image, the computational complexity and, consequently, the memory space used. The standard Hough Transform has successfully been implemented for text document skew angle estimation. However, the accuracy of the standard Hough Transform algorithm depends largely on how fine the angle step size is. Increasing the accuracy consequently consumes more time and memory space, especially where the number of pixels is considerably large. Whenever the Hough Transform is used, there is always a trade-off between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough Transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified Hough Transform algorithm resolves the contradiction between memory space, running time and accuracy. Our algorithm starts with an angle estimate accurate to zero decimal places obtained using the standard Hough Transform, achieving minimal running time and space but limited accuracy. Then, to increase accuracy, if the estimated angle found using the basic Hough algorithm is x degrees, the basic algorithm is run again over a range centred on x degrees with an accuracy of one decimal place. The same process is iterated until the desired level of accuracy is achieved. The skew estimation and correction procedure for text images is implemented using MATLAB. Memory space and processing time are also tabulated, assuming skew angles between 0° and 45°. The simulation results in MATLAB show the high performance of our algorithm, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity.
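
The coarse-to-fine idea can be sketched compactly: each candidate skew angle corresponds to one column of the Hough accumulator (the histogram of rho = y·cos θ − x·sin θ over the foreground pixels), a first pass with a 1-degree step picks a rough angle, and a second pass with a 0.1-degree step refines it over a narrow band around that angle. The Python sketch below is illustrative only; the synthetic "document" and all parameters are assumptions, and it is not the authors' MATLAB implementation.

```python
# Hedged sketch of coarse-to-fine Hough-based skew estimation on a synthetic page.
import numpy as np

def peak_height(xs, ys, skew_deg, rho_res=1.0):
    """Strongest peak of the rho-histogram for candidate skew angle skew_deg.
    This is one column of the Hough accumulator (normal angle = skew + 90 deg);
    the peak is highest when skew_deg matches the true text-line orientation."""
    t = np.deg2rad(skew_deg)
    rho = ys * np.cos(t) - xs * np.sin(t)
    bins = max(int(np.ptp(rho) / rho_res) + 1, 1)
    hist, _ = np.histogram(rho, bins=bins)
    return hist.max()

def best_angle(xs, ys, candidates):
    return candidates[int(np.argmax([peak_height(xs, ys, a) for a in candidates]))]

# Synthetic "document": pixels along four text baselines skewed by 12.3 degrees.
true_skew = 12.3
xs = np.tile(np.arange(200), 4)
ys = np.concatenate([np.round(b + np.tan(np.deg2rad(true_skew)) * np.arange(200))
                     for b in (40, 80, 120, 160)])

coarse = best_angle(xs, ys, np.arange(-45.0, 46.0, 1.0))              # integer-degree pass
fine = best_angle(xs, ys, np.arange(coarse - 1, coarse + 1.05, 0.1))  # refine around the coarse estimate
print(f"coarse: {coarse:.0f} deg, refined: {fine:.1f} deg")
```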

Keywords: hough-transform, skew-detection, skew-angle, skew-correction, text-document

Procedia PDF Downloads 159
1692 Management Support, Role Conflict and Role Ambiguity among Professional Nurses at National Health Insurance Pilot Sites in South Africa: An Interpretive Phenomenology

Authors: Nomcebo N. Mpili, Cynthia Z. Madlabana

Abstract:

The South African Primary Health Care (PHC) system has undergone a number of transformations, such as the introduction of National Health Insurance (NHI), to bring about easily accessible universal health coverage and to meet the health needs of all its citizens. This provides ongoing challenges to ensure that health workers are equipped with the appropriate knowledge, support, and skills to meet these changes. It is therefore crucial to understand the experiences and challenges of nurses, the backbone of PHC, in providing quality healthcare services. In addition, there has been a need to understand nurses’ experiences with management support, role ambiguity and role conflict, amongst other challenges, in light of the current reforms in healthcare. Indeed, these constructs are notorious for having a detrimental impact on the outcomes of change initiatives within any organisation; healthcare is no different. This draws a discussion on professional nurses within the South African health care system, especially since they have been labelled the backbone of PHC, meaning any healthcare backlog falls on them. The study made use of semi-structured interviews and adopted the interpretative phenomenological approach (IPA), as the researcher aimed to explore the lived experiences of the participants (n = 18). The study discovered that professional nurses experienced a lack of management support within PHC facilities and that management mainly played an administrative and disciplinary role. Although participants mainly held positive perceptions of the changes happening in healthcare, they also reported negative experiences of how change initiatives were introduced, resulting in role conflict and role ambiguity. Participants mentioned a shortage of staff, inadequate training, and a lack of management support as some of the key challenges faced in facilities. This study offers unique findings, as participants have not only experienced the various reforms within the PHC system but have also been part of the NHI pilot. The authors are not aware of any other published studies that examine management support, role conflict and role ambiguity together, especially in South African PHC facilities. In conclusion, understanding these challenges may provide insight and opportunities to improve the current landscape of PHC, not only in South Africa but internationally.

Keywords: management support, professional nurse, role ambiguity, role conflict

Procedia PDF Downloads 144