Search results for: missing data estimation
26037 The Impact of Board Characteristics on Firm Performance: Evidence from Banking Industry in India
Authors: Manmeet Kaur, Madhu Vij
Abstract:
The Board of Directors in a firm performs the primary role of an internal control mechanism. This study seeks to understand the relationship between internal governance and the performance of banks in India. The paper investigates the effect of board structure (proportion of non-executive directors, gender diversity, board size and meetings per year) on firm performance. It evaluates the impact of corporate governance mechanisms on banks' financial performance using panel data for 28 banks listed on the National Stock Exchange of India for the period 2008-2014. Return on Assets, Return on Equity, Tobin's Q and Net Interest Margin were used as the financial performance indicators. To estimate the relationship between governance and bank performance, the study initially uses Pooled Ordinary Least Squares (OLS) and Generalized Least Squares (GLS) estimation. A panel Generalized Method of Moments (GMM) estimator is then developed to investigate the dynamic nature of the performance-governance relationship. The study empirically confirms that the two-step system GMM approach controls for unobserved heterogeneity and endogeneity better than the OLS and GLS approaches. The results suggest that banks with small boards, boards with female members, and boards that meet more frequently tend to be more efficient and subsequently have a positive impact on bank performance. The study offers insights to policy makers interested in enhancing the quality of governance of banks in India. The findings also suggest that board structure plays a vital role in the improvement of corporate governance mechanisms for financial institutions. Efficient boards are needed in banks to improve the overall health of financial institutions and the economic development of the country.
Keywords: board of directors, corporate governance, GMM estimation, Indian banking
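A minimal sketch of the pooled OLS step described above is shown below; the bank-year panel, the column names, and the coefficients are synthetic placeholders for illustration, not the authors' dataset.

```python
# Sketch of pooled OLS on a governance-performance panel (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 28 * 7                                             # 28 banks x 7 years (2008-2014)
panel = pd.DataFrame({
    "board_size": rng.integers(6, 16, n),
    "pct_nonexec": rng.uniform(0.3, 0.8, n),
    "female_dummy": rng.integers(0, 2, n),
    "meetings_per_year": rng.integers(4, 15, n),
})
panel["roa"] = (1.5 - 0.05 * panel["board_size"] + 0.3 * panel["female_dummy"]
                + rng.normal(scale=0.2, size=n))       # synthetic Return on Assets

X = sm.add_constant(panel[["board_size", "pct_nonexec",
                           "female_dummy", "meetings_per_year"]])
pooled_ols = sm.OLS(panel["roa"], X).fit(cov_type="HC1")   # robust standard errors
print(pooled_ols.params.round(3))
```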
Procedia PDF Downloads 262
26036 Don't Just Guess and Slip: Estimating Bayesian Knowledge Tracing Parameters When Observations Are Scant
Authors: Michael Smalenberger
Abstract:
Intelligent tutoring systems (ITS) are computer-based platforms which can incorporate artificial intelligence to provide step-by-step guidance as students practice problem-solving skills. ITS can replicate and even exceed some benefits of one-on-one tutoring, foster transactivity in collaborative environments, and lead to substantial learning gains when used to supplement the instruction of a teacher or when used as the sole method of instruction. A common facet of many ITS is their use of Bayesian Knowledge Tracing (BKT) to estimate parameters necessary for the implementation of the artificial intelligence component, and for the probability of mastery of a knowledge component relevant to the ITS. While various techniques exist to estimate these parameters and probability of mastery, none directly and reliably ask the user to self-assess these. In this study, 111 undergraduate students used an ITS in a college-level introductory statistics course for which detailed transaction-level observations were recorded, and users were also routinely asked direct questions that would lead to such a self-assessment. Comparisons were made between these self-assessed values and those obtained using commonly used estimation techniques. Our findings show that such self-assessments are particularly relevant at the early stages of ITS usage while transaction-level data are scant. Once a user’s transaction-level data become available after sufficient ITS usage, these can replace the self-assessments in order to eliminate the identifiability problem in BKT. We discuss how these findings are relevant to the number of exercises necessary to lead to mastery of a knowledge component, the associated implications on learning curves, and its relevance to instruction time.
Keywords: Bayesian Knowledge Tracing, Intelligent Tutoring System, in vivo study, parameter estimation
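For reference, the standard BKT update that these parameters feed into can be sketched as below; the parameter values are placeholders, and starting from a self-assessed prior (as the abstract suggests) is shown only as an assumption of how such a self-assessment could be used.

```python
# Minimal sketch of the standard Bayesian Knowledge Tracing update (prior, learn, guess, slip).
def bkt_update(p_know: float, correct: bool,
               p_learn: float, p_guess: float, p_slip: float) -> float:
    """Return the updated probability of mastery after one observed response."""
    if correct:
        posterior = p_know * (1 - p_slip) / (
            p_know * (1 - p_slip) + (1 - p_know) * p_guess)
    else:
        posterior = p_know * p_slip / (
            p_know * p_slip + (1 - p_know) * (1 - p_guess))
    # account for the chance of learning at this practice opportunity
    return posterior + (1 - posterior) * p_learn

# e.g. start from a self-assessed prior while transaction-level data are scant
p = 0.4
for obs in [True, False, True, True]:
    p = bkt_update(p, obs, p_learn=0.15, p_guess=0.2, p_slip=0.1)
print(round(p, 3))
```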
Procedia PDF Downloads 174
26035 Support Vector Machine Based Retinal Therapeutic for Glaucoma Using Machine Learning Algorithm
Authors: P. S. Jagadeesh Kumar, Mingmin Pan, Yang Yung, Tracy Lin Huan
Abstract:
Glaucoma is a group of visual disorders characterized by progressive optic nerve neuropathy that leads to a gradual narrowing of the visual field and, ultimately, loss of sight. In this paper, a support vector machine based retinal therapeutic approach for glaucoma using a machine learning algorithm is proposed. The algorithm builds on a correlation clustering mode and performs its computations in a multi-dimensional feature space. Support vector clustering is comparable to the scale-space approach, which investigates the cluster structure by means of a kernel density estimation of the probability distribution, where cluster centres are identified by the local maxima of the density. The proposed approach attains a 91% detection rate on a data set consisting of 500 retinal images of normal and glaucomatous retinas; the computational benefit of relying on the cluster overlapping system based on the machine learning algorithm therefore gives strong performance for glaucoma therapeutics.
Keywords: machine learning algorithm, correlation clustering mode, cluster overlapping system, glaucoma, kernel density estimation, retinal therapeutic
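Purely as an illustration of the classification stage, the sketch below trains an RBF-kernel SVM on pre-extracted retinal feature vectors; the features and labels are synthetic stand-ins, and the feature extraction itself is not specified here.

```python
# Illustrative SVM classifier on placeholder retinal feature vectors (synthetic data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 32))          # placeholder feature vectors (500 images)
y = rng.integers(0, 2, size=500)        # 0 = normal, 1 = glaucoma (synthetic labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```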
Procedia PDF Downloads 255
26034 Enhancement of Primary User Detection in Cognitive Radio by Scattering Transform
Authors: A. Moawad, K. C. Yao, A. Mansour, R. Gautier
Abstract:
Detecting an occupied frequency band is a major issue in cognitive radio systems. The detection process becomes difficult if the signal occupying the band of interest has faded amplitude due to multipath effects, which make it hard for an occupying user to be detected. This work mitigates the missed-detection problem in the context of cognitive radio in a frequency-selective fading channel by proposing a blind channel estimation method based on the scattering transform. By initially applying conventional energy detection, the missed-detection probability is evaluated, and if it is greater than or equal to 50%, channel estimation is applied to the received signal followed by channel equalization to reduce the channel effects. In the proposed channel estimator, we modify the Morlet wavelet by using its first derivative for better frequency resolution. A mathematical description of the modified function and its frequency resolution is formulated in this work. The improved frequency resolution is required to follow the spectral variation of the channel. The channel estimation error is evaluated in the mean-square sense for different channel settings, and energy detection is applied to the equalized received signal. The simulation results show an improvement in reducing the missed-detection probability as compared to detection based on principal component analysis. This improvement is achieved at the expense of increased estimator complexity, which depends on the number of wavelet filters as related to the channel taps. The detection performance also shows an improvement in detection probability for low signal-to-noise scenarios over principal component analysis-based energy detection.
Keywords: channel estimation, cognitive radio, scattering transform, spectrum sensing
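A sketch of the conventional energy-detection stage mentioned above is given below; the signal, noise level, and the Gaussian approximation of the chi-square threshold are illustrative assumptions rather than the paper's exact detector.

```python
# Energy detection sketch: compare normalized received energy with a noise-floor threshold.
import numpy as np
from scipy.stats import norm

def energy_detect(received: np.ndarray, noise_var: float, p_fa: float = 0.05) -> bool:
    """Declare the band occupied if the normalized energy exceeds the threshold."""
    n = received.size
    test_stat = np.sum(np.abs(received) ** 2) / noise_var
    threshold = n + np.sqrt(2.0 * n) * norm.ppf(1.0 - p_fa)   # Gaussian approx. of chi-square
    return bool(test_stat > threshold)

rng = np.random.default_rng(1)
noise = rng.normal(scale=1.0, size=1000)
signal = 0.5 * np.sin(2 * np.pi * 0.05 * np.arange(1000))
print(energy_detect(noise, 1.0), energy_detect(noise + signal, 1.0))
```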
Procedia PDF Downloads 197
26033 Credit Risk Prediction Based on Bayesian Estimation of Logistic Regression Model with Random Effects
Authors: Sami Mestiri, Abdeljelil Farhat
Abstract:
The aim of this paper is to predict the credit risk of banks in Tunisia over the period 2000-2005. For this purpose, two methods for estimating the logistic regression model with random effects are applied: the Penalized Quasi-Likelihood (PQL) method and the Gibbs sampler algorithm. Using information on a sample of 528 Tunisian firms and 26 financial ratios, we show that the Bayesian approach improves the quality of model predictions in terms of correct classification as well as the ROC curve result.
Keywords: forecasting, credit risk, Penalized Quasi Likelihood, Gibbs Sampler, logistic regression with random effects, ROC curve
Procedia PDF Downloads 542
26032 Estimation of Reservoirs Fracture Network Properties Using an Artificial Intelligence Technique
Authors: Reda Abdel Azim, Tariq Shehab
Abstract:
The main objective of this study is to develop a subsurface fracture map of naturally fractured reservoirs by overcoming the limitations associated with different data sources in characterising fracture properties. Some of these limitations are overcome by employing a nested neuro-stochastic technique to establish the inter-relationship between different data sources, such as conventional well logs, borehole images (FMI), core descriptions, and seismic attributes, and then characterise fracture properties in terms of fracture density and fractal dimension for each data source. Fracture density is an important property of a fracture network system, as it is a measure of the cumulative area of all the fractures in a unit volume of the system; fractal dimension is used to characterise self-similar objects such as fractures. At the wellbore locations, fracture density and fractal dimension can only be estimated for the limited sections where FMI data are available. Therefore, an artificial intelligence technique is applied to approximate these quantities at locations along the wellbore where hard data are not available. Artificial intelligence techniques have proven their effectiveness in this domain of application.
Keywords: naturally fractured reservoirs, artificial intelligence, fracture intensity, fractal dimension
Procedia PDF Downloads 256
26031 Plot Scale Estimation of Crop Biophysical Parameters from High Resolution Satellite Imagery
Authors: Shreedevi Moharana, Subashisa Dutta
Abstract:
The present study focuses on the estimation of crop biophysical parameters such as chlorophyll, nitrogen, and water stress at plot scale in crop fields. To achieve this, we used high-resolution LISS IV satellite imagery. A new methodology is proposed in this research work: the spectral shape function of the paddy crop is employed to identify the significant wavelengths sensitive to paddy crop parameters. From the shape functions, regression index models were established for the critical wavelength together with the minimum and maximum wavelengths of the multi-spectral high-resolution LISS IV data, and these functional relationships were utilized to develop the index models. From these index models, crop biophysical parameters were estimated and mapped from LISS IV imagery at plot scale at the crop field level. The results showed that the nitrogen content of the paddy crop varied from 2-8%, chlorophyll from 1.5-9%, and water content from 40-90%. It was observed that the variability in the rice agriculture system in India was purely a function of field topography.
Keywords: crop parameters, index model, LISS IV imagery, plot scale, shape function
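The index-model idea can be sketched as below: a normalized-difference index is built from two sensitive bands and the crop parameter is regressed on it. The band reflectances, the choice of bands, and the synthetic target values are illustrative assumptions, not the study's data.

```python
# Sketch of a normalized-difference index model regressed against a crop parameter.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
band_min = rng.uniform(0.05, 0.25, size=200)        # reflectance at a "minimum" wavelength
band_max = rng.uniform(0.30, 0.60, size=200)        # reflectance at a "maximum" wavelength
chlorophyll = (1.5 + 8.0 * (band_max - band_min) / (band_max + band_min)
               + rng.normal(scale=0.3, size=200))   # synthetic target (1.5-9 % range)

index = ((band_max - band_min) / (band_max + band_min)).reshape(-1, 1)
model = LinearRegression().fit(index, chlorophyll)
print("R^2:", round(model.score(index, chlorophyll), 3))
```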
Procedia PDF Downloads 168
26030 Deep Learning Based 6D Pose Estimation for Bin-Picking Using 3D Point Clouds
Authors: Hesheng Wang, Haoyu Wang, Chungang Zhuang
Abstract:
Estimating the 6D pose of objects is a core step for robot bin-picking tasks. The problem is that various objects are usually randomly stacked with heavy occlusion in real applications. In this work, we propose a method to regress 6D poses by predicting three points for each object in the 3D point cloud through deep learning. To solve the ambiguity of symmetric pose, we propose a labeling method to help the network converge better. Based on the predicted pose, an iterative method is employed for pose optimization. In real-world experiments, our method outperforms the classical approach in both precision and recall.
Keywords: pose estimation, deep learning, point cloud, bin-picking, 3D computer vision
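One common way to turn three predicted keypoints into a rigid 6D pose is to align them with the corresponding points on the object model via the Kabsch (SVD) algorithm; the sketch below illustrates that step under the assumption of exact point correspondences, and is not necessarily the network head used in the paper.

```python
# Sketch: recover rotation R and translation t from three corresponding 3D points (Kabsch/SVD).
import numpy as np

def pose_from_points(model_pts: np.ndarray, pred_pts: np.ndarray):
    """model_pts, pred_pts: (3, 3) arrays of corresponding 3D points (non-collinear)."""
    mc, pc = model_pts.mean(axis=0), pred_pts.mean(axis=0)
    H = (model_pts - mc).T @ (pred_pts - pc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # enforce a proper rotation (no reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = pc - R @ mc
    return R, t

model = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
pred = model @ R_true.T + np.array([0.5, 0.2, 1.0])     # synthetic "predicted" keypoints
R, t = pose_from_points(model, pred)
print(np.allclose(R, R_true), np.round(t, 3))
```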
Procedia PDF Downloads 161
26029 Big Data in Telecom Industry: Effective Predictive Techniques on Call Detail Records
Authors: Sara ElElimy, Samir Moustafa
Abstract:
Mobile network operators are starting to face many challenges in the digital era, especially with high demands from customers. Mobile network operators are a major source of big data, and traditional techniques are not effective in the new era of big data, the Internet of Things (IoT), and 5G; as a result, handling different big datasets effectively becomes a vital task for operators with the continuous growth of data and the move from long term evolution (LTE) to 5G. There is therefore an urgent need for effective big data analytics to predict future demands, traffic, and network performance to fulfil the requirements of the fifth generation of mobile network technology. In this paper, we introduce data science techniques using machine learning and deep learning algorithms: the autoregressive integrated moving average (ARIMA), Bayesian-based curve fitting, and a recurrent neural network (RNN) are employed for a data-driven application for mobile network operators. The main framework of the models includes identification of each model's parameters, estimation, prediction, and the final data-driven application of these predictions to business and network performance. These models are applied to the Telecom Italia Big Data challenge call detail records (CDRs) datasets. The performance of these models, evaluated using well-known criteria, shows that ARIMA (a machine learning-based model) is more accurate as a predictive model on this dataset than the RNN (a deep learning model).
Keywords: big data analytics, machine learning, CDRs, 5G
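The ARIMA step can be sketched as below; a synthetic hourly traffic series stands in for the CDR-derived series used in the paper, and the model order is an arbitrary illustrative choice.

```python
# Illustrative ARIMA fit and one-day-ahead forecast on a synthetic traffic series.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
t = np.arange(24 * 14)                                   # two weeks of hourly samples
traffic = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(scale=5, size=t.size)
series = pd.Series(traffic,
                   index=pd.date_range("2024-01-01", periods=t.size, freq="h"))

model = ARIMA(series, order=(2, 0, 1)).fit()
forecast = model.forecast(steps=24)                      # next-day hourly prediction
print(forecast.head())
```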
Procedia PDF Downloads 140
26028 Mothers, the Missing Link: A Critical Discourse Analysis of the Women-Centric Counterterrorism Measures
Authors: Bukola Solomon
Abstract:
In counterterrorism, policymakers typically design a confined role for women as family members and nurturers. In recent years, they have embraced the idea of mothers as the missing link to preventing and countering violent extremism. This ‘programmed’ role of women is derived from the conviction that women’s central roles in the family and community afford them the ‘unique set of skills’ to detect early signs of radicalization and extremism. This paper focuses on the ‘mother’ narrative that frames women’s agency as mothers of ‘terrorists’ and ‘potential’ terrorists. The general underlying assumption of the ‘mother’ narrative is that, naturally, every ‘terrorist’ has or once had a mother, and their radicalization is a maternal ‘oversight.’ By deconstructing the notion of motherhood as a social construct instead of an inherent female desire and ability, this paper argues that the assumption of ‘mothers know best’ is invalid. It also suggests that the ‘mother’ narrative is a deliberate effort to restrict women’s participation in counterterrorism to that of ‘preventers.’ Finally, this paper notes a global trend in which mothers are contesting the dominant view of women's empowerment that restricts their agency by seeking alternative versions in terrorist organizations, and, as such, create parallel terror cells. Thus, the overemphasis on the role women play as mothers in counterterrorism limits the scope and potential of counterterrorism programs by marginalizing gender issues and reinforcing gender disparities to the extent that the programs become counterproductive.
Keywords: countering violent extremism, counterterrorism, gender, gender roles, terrorism, women
Procedia PDF Downloads 117
26027 3D Model Completion Based on Similarity Search with Slim-Tree
Authors: Alexis Aldo Mendoza Villarroel, Ademir Clemente Villena Zevallos, Cristian Jose Lopez Del Alamo
Abstract:
With the advancement of technology, it is now possible to scan entire objects and obtain their digital representation by using point clouds or polygon meshes. However, some objects may be broken or have missing parts; thus, several methods focused on this problem have been proposed based on Geometric Deep Learning, such as GCNN, ACNN, and PointNet, among others. In this article, an approach from a different paradigm is proposed: metric data structures are used to index global descriptors in the spectral domain and allow the retrieval of a set of similar models in polynomial time; the Iterative Closest Point algorithm is then used to recover the missing parts of the incomplete model using the geometry and topology of the retrieved model with the smallest Hausdorff distance.
Keywords: 3D reconstruction method, point cloud completion, shape completion, similarity search
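A sketch of the retrieval-side check is given below: candidate models are ranked by symmetric Hausdorff distance between point clouds. The point clouds are synthetic, and the alignment step (ICP) is omitted; this is only an illustration of the distance criterion, assuming SciPy is available.

```python
# Rank candidate point clouds by symmetric Hausdorff distance to a query cloud.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff(a: np.ndarray, b: np.ndarray) -> float:
    """Symmetric Hausdorff distance between two (n, 3) point clouds."""
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

rng = np.random.default_rng(4)
query = rng.normal(size=(1000, 3))
candidates = {"model_a": query + rng.normal(scale=0.01, size=(1000, 3)),
              "model_b": rng.normal(size=(1000, 3)) + 2.0}
best = min(candidates, key=lambda k: hausdorff(query, candidates[k]))
print("closest candidate:", best)
```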
Procedia PDF Downloads 122
26026 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering
Authors: Emiel Caron
Abstract:
Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g., to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references, i.e., they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules in combination with string similarity measures are used to detect pairs of records that are potential duplicates. Due to the scoring, different rules can be combined to join scientific references, i.e., the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that are above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set with highly cited papers, shows on average a 99% precision and a 95% recall. The method is therefore accurate but careful, i.e., it weighs precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g., in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics
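A much-simplified sketch of the pair-scoring and single-linkage step is shown below; the rules, weights, and threshold are invented for illustration (the paper's actual rules are based on expert knowledge), and networkx is assumed for the connected components.

```python
# Toy rule-based pair scoring, thresholding, and connected-component clustering.
from difflib import SequenceMatcher
import networkx as nx

def pair_score(a: dict, b: dict) -> float:
    score = 0.0
    if a.get("year") and a.get("year") == b.get("year"):
        score += 0.3                                   # matching publication year
    if a.get("journal") and a.get("journal") == b.get("journal"):
        score += 0.2                                   # matching journal name
    score += 0.5 * SequenceMatcher(None, a["title"], b["title"]).ratio()
    return score

refs = [{"id": 1, "title": "deep learning for nlp", "year": 2018, "journal": "jmlr"},
        {"id": 2, "title": "deep learning for n.l.p.", "year": 2018, "journal": "jmlr"},
        {"id": 3, "title": "survey of robotics", "year": 2010, "journal": None}]

g = nx.Graph()
g.add_nodes_from(r["id"] for r in refs)
for i in range(len(refs)):
    for j in range(i + 1, len(refs)):
        if pair_score(refs[i], refs[j]) >= 0.8:        # illustrative threshold
            g.add_edge(refs[i]["id"], refs[j]["id"])
print(list(nx.connected_components(g)))                # e.g. [{1, 2}, {3}]
```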
Procedia PDF Downloads 194
26025 Tensile Force Estimation for Real-Size Pre-Stressed Concrete Girder using Embedded Elasto-Magnetic Sensor
Authors: Junkyeong Kim, Jooyoung Park, Aoqi Zhang, Seunghee Park
Abstract:
The tensile force of a Pre-Stressed Concrete (PSC) girder is the most important factor for evaluating the performance of PSC girder bridges. Several NDT methods have been studied to measure the tensile force of PSC girders. However, conventional NDT methods cannot be applied to real-size PSC girders because the PS tendons cannot be accessed. To measure the tensile force of a real-size PSC girder, this study proposes an embedded EM-sensor-based tensile force estimation method. The embedded EM sensor can be installed inside the PSC girder as a sheath joint before concrete casting. After the curing process, the PS tendons were installed, and the tensile force was induced step by step using a hydraulic jacking machine. The B-H loop was measured using the embedded EM sensor at each tensile force step, and to compare with the actual tensile force, a load cell was installed at each end of the girder. The magnetization energy loss, that is, the enclosed area of the B-H loop, decreased with increasing tensile force in a regular pattern. Thus, the tensile force can be estimated by tracking the change in the magnetization energy loss of the PS tendons. The experimental results show that the proposed method can be used to estimate the tensile force of an in-situ real-size PSC girder bridge.
Keywords: tensile force estimation, embedded EM sensor, magnetization energy loss, PSC girder
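The post-processing idea can be sketched as follows: the enclosed B-H loop area (the magnetization energy loss) is computed with the shoelace formula and mapped to tensile force through a calibration fit. All numbers below are synthetic placeholders, not the experimental calibration.

```python
# Sketch: B-H loop area via the shoelace formula, then a linear area-to-force calibration.
import numpy as np

def loop_area(h: np.ndarray, b: np.ndarray) -> float:
    """Enclosed area of a closed B-H loop given ordered loop samples."""
    return 0.5 * abs(np.dot(h, np.roll(b, -1)) - np.dot(b, np.roll(h, -1)))

# synthetic calibration: loop area shrinks roughly linearly as tension increases
forces_kN = np.array([0, 200, 400, 600, 800])
areas = np.array([10.0, 9.1, 8.3, 7.4, 6.6])
slope, intercept = np.polyfit(areas, forces_kN, 1)

# synthetic measured loop (an ellipse with area ~ 2.5*pi)
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
h = np.cos(theta)
b = 0.8 * np.cos(theta) + 2.5 * np.sin(theta)
measured_area = loop_area(h, b)
print("estimated tension [kN]:", round(slope * measured_area + intercept, 1))
```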
Procedia PDF Downloads 338
26024 Estimation of Natural Convection Heat Transfer from Plate-Fin Heat Sinks in a Closed Enclosure
Authors: Han-Taw Chen, Chung-Hou Lai, Tzu-Hsiang Lin, Ge-Jang He
Abstract:
This study applies the inverse method and three-dimensional commercial CFD software in conjunction with experimental temperature data to investigate the heat transfer and fluid flow characteristics of a plate-fin heat sink in a closed rectangular enclosure for various values of fin height. The inverse method, together with the finite difference method and the experimental temperature data, is applied to determine the heat transfer coefficient. The k-ε turbulence model is used to obtain the heat transfer and fluid flow characteristics within the fins. To validate the accuracy of the results obtained, a comparison of the average heat transfer coefficients is made. The calculated temperature at selected measurement locations on the plate-fin is also compared with experimental data.
Keywords: inverse method, FLUENT, k-ε model, heat transfer characteristics, plate-fin heat sink
Procedia PDF Downloads 460
26023 Age Estimation and Sex Determination by CT-Scan Analysis of the Hyoid Bone: Application on a Tunisian Population
Authors: N. Haj Salem, M. Belhadj, S. Ben Jomâa, R. Dhouieb, S. Saadi, M. A. Mesrati, A. Chadly
Abstract:
Introduction: The hyoid bone is considered one of the many bones used to identify a missing person, and each population group has its own specificities in human identification. Objective: To analyze the relationship between age, sex, and metric parameters of the hyoid bone in a Tunisian population sample, using CT scans. Materials and Methods: A prospective study was conducted in the Department of Forensic Medicine of the Fattouma Bourguiba Hospital of Monastir, Tunisia, over 4 years. A total of 240 samples of hyoid bone were studied. The age of the cases ranged from 18 days to 81 years. The specimens were collected only from the deceased of known age. Once dried, each hyoid bone was scanned using a CT scanner. For each specimen, 10 measurements were taken using a computer program; the measurements consisted of 6 lengths and 4 widths. A regression analysis was used to estimate the relationship between age, sex, and the different measurements. For age estimation, a multiple regression was carried out for samples ≤ 35 years. For sex determination, a ROC curve analysis was performed, and the discriminant value finally retained was based on the best specificity combined with the best sensitivity. Results: The correlation between real age and estimated age was good (r²=0.72) for samples aged 35 years or less. The unstandardised canonical function equation was estimated using three variables: the maximum length of the right greater cornua, the length from the middle of the left joint space to the middle of the right joint space, and the perpendicular length from the centre point of a line between the distal ends of the right and left greater cornua to the centre point of the anterior view of the body of the hyoid bone. For sex determination, the ROC curve analysis reveals that the area under the curve was 81.8%. The discriminant value was 0.451, with a specificity of 73% and a sensitivity of 79%. The function was estimated based on two variables: the maximum length of the greater cornua and the maximum length of the hyoid bone. Conclusion: The findings of the current study suggest that metric analysis of the hyoid bone may predict age for individuals ≤ 35 years. Sex estimation seems to be more reliable. Further studies dealing with the fusion of the hyoid bone, together with the current study, could help to achieve more accurate age estimation rates.
Keywords: anthropology, age estimation, CT scan, sex determination, Tunisia
Procedia PDF Downloads 174
26022 Nonlinear Estimation Model for Rail Track Deterioration
Authors: M. Karimpour, L. Hitihamillage, N. Elkhoury, S. Moridpour, R. Hesami
Abstract:
Rail transport authorities around the world have been facing a significant challenge when predicting rail infrastructure maintenance work over long periods of time. Generally, maintenance monitoring and prediction are conducted manually. With economic restrictions, rail transport authorities are in pursuit of improved modern methods that can provide precise predictions of rail maintenance time and location. The expectation of such a method is to develop models that minimize the human error strongly related to manual prediction. Such models will help them understand how track degradation occurs over time under changing conditions (e.g., rail load, rail type, rail profile). They need a well-structured technique to identify the precise time at which rail tracks fail in order to minimize maintenance cost and time and to secure the vehicles. The rail track characteristics that have been collected over the years will be used in developing rail track degradation prediction models. Since these data have been collected in large volumes, and collection is done both electronically and manually, some errors are possible; sometimes these errors make it impossible to use the data in prediction model development. This is one of the major drawbacks in rail track degradation prediction. An accurate model can play a key role in estimating the long-term behavior of rail tracks; accurate models increase track safety and decrease the cost of maintenance in the long term. In this research, a short review of rail track degradation prediction models is presented before estimating rail track degradation for the curved sections of the Melbourne tram track system using an Adaptive Network-based Fuzzy Inference System (ANFIS) model.
Keywords: ANFIS, MGT, prediction modeling, rail track degradation
Procedia PDF Downloads 337
26021 Application of an Analytical Model to Obtain Daily Flow Duration Curves for Different Hydrological Regimes in Switzerland
Authors: Ana Clara Santos, Maria Manuela Portela, Bettina Schaefli
Abstract:
This work assesses the performance of an analytical model framework to generate daily flow duration curves, FDCs, based on climatic characteristics of the catchments and on their streamflow recession coefficients. According to the analytical model framework, precipitation is considered to be a stochastic process, modeled as a marked Poisson process, and recession is considered to be deterministic, with parameters that can be computed based on different models. The analytical model framework was tested for three case studies with different hydrological regimes located in Switzerland: pluvial, snow-dominated, and glacier. For that purpose, five time intervals were analyzed (the four meteorological seasons and the civil year) and two developments of the model were tested: one considering a linear recession model and the other adopting a nonlinear recession model. Those developments were combined with recession coefficients obtained from two different approaches: forward and inverse estimation. The performance of the analytical framework when considering forward parameter estimation is poor in comparison with the inverse estimation for both linear and nonlinear models. For the pluvial catchment, the inverse estimation shows exceptionally good results, especially for the nonlinear model, clearly suggesting that the model has the ability to describe FDCs. For the snow-dominated and glacier catchments, the seasonal results are better than the annual ones, suggesting that the model can describe streamflows in those conditions and that future efforts should focus on improving and combining seasonal curves instead of considering single annual ones.
Keywords: analytical streamflow distribution, stochastic process, linear and non-linear recession, hydrological modelling, daily discharges
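The generating mechanism described above can be illustrated with a toy simulation: rainfall as a marked Poisson process, streamflow recession as a linear reservoir, and the FDC as the empirical exceedance curve of the simulated daily discharges. All parameter values are illustrative and not calibrated to any of the Swiss catchments.

```python
# Toy daily-flow simulation: marked Poisson rainfall + linear recession, then an FDC.
import numpy as np

rng = np.random.default_rng(5)
n_days, lam, mean_depth, k = 3650, 0.3, 10.0, 0.05   # illustrative parameters

q = np.empty(n_days)
q[0] = 1.0
for t in range(1, n_days):
    rain = rng.exponential(mean_depth) if rng.random() < lam else 0.0
    q[t] = q[t - 1] * np.exp(-k) + 0.1 * rain        # linear recession + recharge

q_sorted = np.sort(q)[::-1]
exceedance = np.arange(1, n_days + 1) / (n_days + 1)  # Weibull plotting position
print("Q exceeded 95% of the time:", round(q_sorted[int(0.95 * n_days)], 3))
```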
Procedia PDF Downloads 164
26020 The Response of the Central Bank to the Exchange Rate Movement: A Dynamic Stochastic General Equilibrium-Vector Autoregressive Approach for Tunisian Economy
Authors: Abdelli Soulaima, Belhadj Besma
Abstract:
The paper examines the choice of the central bank with respect to movements of the nominal exchange rate and evaluates its effects on the volatility of output growth and inflation. The novel hybrid dynamic stochastic general equilibrium method called DSGE-VAR is proposed for analyzing this policy experiment in a small-scale open economy, in particular Tunisia. The contribution to the empirical literature is that we apply this model, which is rarely used in this context, to Tunisian data. Additionally, the issue of the degree of response of the central bank to the exchange rate in Tunisia is a special one. To improve the estimation, the Bayesian technique is applied to the sample 1980:Q1 to 2011:Q4. Our results reveal that the central bank should not react, or should react only softly, to the exchange rate. The variance decomposition shows that overall inflation volatility is more pronounced under the fixed exchange rate regime for most shocks, except for the productivity and interest rate shocks. Output volatility is also higher under this regime for the majority of shocks, except for the foreign interest rate and interest rate shocks.
Keywords: DSGE-VAR modeling, exchange rate, monetary policy, Bayesian estimation
Procedia PDF Downloads 300
26019 The Study of Periodontal Health Status in Menopausal Women with Osteoporosis Referred to Rheumatology Clinics in Yazd and Healthy People
Authors: Mahboobe Daneshvar
Abstract:
Introduction: Clinical studies on the effect of systemic conditions on periodontal diseases have shown that some systemic deficiencies may provide grounds for the onset of periodontal diseases. One of these systemic problems is osteoporosis, which may be a risk factor for the onset and exacerbation of periodontitis. This study aims to evaluate periodontal indices in osteoporotic menopausal women and compare them with healthy controls. Materials and Methods: In this case-control study, participants included 45-75-year-old menopausal women referred to the rheumatology wards of the Khatamolanbia Clinic and Shahid Sadoughi Hospital in Yazd; their bone density was determined by DEXA scan and by imaging the femoral-lumbar bone. Thirty patients with osteoporosis and 30 subjects with normal BMD were selected, and informed consent for participation in the study was obtained. During the clinical examinations, tooth loss (TL), plaque index (PI), gingival recession, pocket probing depth (PPD), clinical attachment loss (CAL), and tooth mobility (TM) were measured to evaluate the periodontal status. These clinical examinations were performed to determine the periodontal status by catheter, mirror and probe. Results: During the evaluation, there was no significant difference in PPD, PI, TM, gingival recession, and CAL between the case and control groups (P-value > 0.05); that is, osteoporosis had no effect on the above factors, and these periodontal factors were almost the same in both the healthy and patient groups. For missing teeth, the following results were obtained: the mean of missing teeth was 22.173% of the total teeth in the case group and 18.583% of the total teeth in the control group, and there was a significant difference between the case and control groups (P-value = 0.025). Conclusion: Since periodontal disease is multifactorial and microbial plaque is the main cause, osteoporosis is considered a predisposing factor in the exacerbation or persistence of periodontal disease. In patients with osteoporosis, pathological fractures, hormonal changes, and aging usually lead to reduced physical activity and affect oral health, which leads to the manifestation of periodontal disease; the disease also increases tooth loss by changing the shape and structure of bone trabeculae and weakening them. Osteoporosis does not seem to be a determining factor in the incidence of periodontal disease, since it affects bone quality rather than bone quantity.
Keywords: plaque index, osteoporosis, tooth mobility, periodontal pocket
Procedia PDF Downloads 73
26018 Electromagnetic Source Direction of Arrival Estimation via Virtual Antenna Array
Authors: Meiling Yang, Shuguo Xie, Yilong Zhu
Abstract:
Nowadays, due to diverse electronic products and complex electromagnetic environments, the localization and troubleshooting of electromagnetic radiation sources is urgent and necessary, especially under far-field conditions. However, systems and devices based on existing DOA positioning methods are complex, bulky, and expensive. To address this issue, this paper proposes a single-antenna radiation source localization method: a single antenna moves to form a virtual antenna array, which is combined with DOA estimation and the MUSIC algorithm to achieve accurate positioning while reducing cost and simplifying the equipment. As shown by the results of simulations and experiments, the virtual antenna array DOA estimation modeling is correct and its positioning is credible.
Keywords: virtual antenna array, DOA, localization, far field
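A bare-bones MUSIC sketch for a uniform linear virtual array with half-wavelength spacing is given below; the array size, source angles, snapshot count, and noise level are illustrative assumptions, with the moving single antenna assumed to synthesize the array positions.

```python
# MUSIC pseudospectrum sketch on a synthetic uniform linear (virtual) array.
import numpy as np

M, d, snapshots, n_src = 8, 0.5, 200, 2        # virtual elements, spacing in wavelengths
true_doas = np.deg2rad([-20.0, 35.0])
rng = np.random.default_rng(6)

def steering(theta):
    return np.exp(-2j * np.pi * d * np.arange(M)[:, None] * np.sin(theta))

A = steering(true_doas)                                        # (M, n_src)
S = rng.normal(size=(n_src, snapshots)) + 1j * rng.normal(size=(n_src, snapshots))
N = 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))
X = A @ S + N

R = X @ X.conj().T / snapshots                                 # sample covariance
_, eigvec = np.linalg.eigh(R)                                  # ascending eigenvalues
En = eigvec[:, :M - n_src]                                     # noise subspace

grid = np.deg2rad(np.linspace(-90, 90, 721))
p_music = 1.0 / np.linalg.norm(En.conj().T @ steering(grid), axis=0) ** 2

is_peak = (p_music[1:-1] > p_music[:-2]) & (p_music[1:-1] > p_music[2:])
peaks = np.where(is_peak)[0] + 1
top = peaks[np.argsort(p_music[peaks])[-n_src:]]
print("estimated DOAs [deg]:", np.sort(np.round(np.rad2deg(grid[top]), 1)))
```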
Procedia PDF Downloads 374
26017 Proficient Estimation Procedure for a Rare Sensitive Attribute Using Poisson Distribution
Authors: S. Suman, G. N. Singh
Abstract:
The present manuscript addresses the estimation of a population parameter using the Poisson probability distribution when the characteristic under study is a rare sensitive attribute. A generalized form of the unrelated randomized response model is suggested in order to obtain truthful responses from respondents. Estimators are proposed for two situations: when information on an unrelated rare non-sensitive characteristic is known and when it is unknown. The properties of the proposed estimators are derived, and a measure of respondent confidentiality is also suggested. Empirical studies are carried out in support of the discussed theory.
Keywords: Poisson distribution, randomized response model, rare sensitive attribute, non-sensitive attribute
Procedia PDF Downloads 268
26016 Prediction of Physical Properties and Sound Absorption Performance of Automotive Interior Materials
Authors: Un-Hwan Park, Jun-Hyeok Heo, In-Sung Lee, Seong-Jin Cho, Tae-Hyeon Oh, Dae-Kyu Park
Abstract:
The sound absorption coefficient is considered important in design because noise affects the emotional quality of a car. In practice, it is tuned through many field experiments, because predicting it for multi-layer materials is unreliable. In this paper, we present the design of sound absorption for multi-layer automotive interior materials using software that estimates the sound absorption coefficient in a reverberation chamber. Additionally, we introduce a method for estimating the physical properties required to predict the sound absorption coefficient of multi-layer car interior materials; these are calculated by an inverse algorithm. It is very economical to obtain information about physical properties without expensive equipment. A correlation test is carried out to ensure reliability and accuracy; the data used for the correlation are sound absorption coefficients measured in the reverberation chamber. In this way, designing automotive interior materials becomes economical and efficient, and design optimization for the sound absorption coefficient is also easy to implement.
Keywords: sound absorption coefficient, optimization design, inverse algorithm, automotive interior material, multiple layers nonwoven, scaled reverberation chamber, sound impedance tubes
Procedia PDF Downloads 309
26015 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators
Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros
Abstract:
Nowadays, poor data quality is considered one of the major costs of a data project. A data project with data quality awareness devotes almost as much time to data quality processes, while a data project without data quality awareness negatively impacts financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because expectations differ according to the purpose of each data project. This is especially true for big data projects, which may involve many datasets and stakeholders and take a long time to discuss and define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators that describe the overall data quality of each dataset to allow quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that can describe the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, five steps were followed to develop the Dataset Quality Index (SDQI). First, we define standard data quality expectations. Second, we identify indicators that can be measured directly from the data within datasets. Third, the indicators are aggregated into dimensions using factor analysis. Next, the indicators and dimensions are weighted by data preparation effort and usability. Finally, the dimensions are aggregated into the composite indicator. The results of these analyses showed that: (1) the developed set of indicators and measurements contained ten indicators; (2) based on statistical characteristics, the ten indicators could be reduced to four dimensions; and (3) the resulting composite indicator, the SDQI, can describe the overall quality of each dataset and separate datasets into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall, meaningful description of data quality within datasets. It can be used to assess all data in a data project, for effort estimation, and for prioritization. The SDQI also works well with agile methods: it can be used for assessment in the first sprint, and after passing the initial evaluation, more specific data quality indicators can be added in the next sprint.
Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis
Procedia PDF Downloads 142
26014 Gender Estimation by Means of Quantitative Measurements of Foramen Magnum: An Analysis of CT Head Images
Authors: Thilini Hathurusinghe, Uthpalie Siriwardhana, W. M. Ediri Arachchi, Ranga Thudugala, Indeewari Herath, Gayani Senanayake
Abstract:
The foramen magnum is more likely to be preserved than other skeletal remains during high-impact and severely disruptive injuries. Therefore, it is worthwhile to explore whether its measurements can be used to determine gender, which is vital in forensic and anthropological studies. The idea was to assess the ability of quantitative measurements of the foramen magnum to serve as an anatomical indicator for human gender estimation and to evaluate its gender-dependent variations. 113 randomly selected subjects who underwent CT head scans at Sri Jayawardhanapura General Hospital of Sri Lanka within a period of six months were included in the study. The sample contained 58 males (48.76 ± 14.7 years old) and 55 females (47.04 ± 15.9 years old). The maximum length of the foramen magnum (LFM), maximum width of the foramen magnum (WFM), minimum distance between occipital condyles (MnD) and maximum interior distance between occipital condyles (MxID) were measured. Further, AreaT and AreaR were also calculated. Gender was estimated using binomial logistic regression. The mean values of all explanatory variables (LFM, WFM, MnD, MxID, AreaT, and AreaR) were greater among males than females. All explanatory variables except MnD (p = 0.669) were statistically significant (p < 0.05). Significant bivariate correlations were demonstrated by AreaT and AreaR with the explanatory variables. The results showed that WFM and MxID were the best measurements for predicting gender according to binomial logistic regression. The estimated model was log(p/(1-p)) = 10.391 - 0.136×MxID - 0.231×WFM, where p is the probability of being female. The classification accuracy given by the above model was 65.5%. Quantitative measurements of the foramen magnum can be used as a reliable anatomical marker for human gender estimation in the Sri Lankan context.
Keywords: foramen magnum, forensic and anthropological studies, gender estimation, logistic regression
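As a worked example, the reported model can be applied directly to new measurements: compute the probability of the subject being female and compare it with a cut-off of 0.5 (the measurement values below are hypothetical, not from the study).

```python
# Evaluate the reported logistic model for sex classification from foramen magnum measurements.
import math

def p_female(mxid_mm: float, wfm_mm: float) -> float:
    logit = 10.391 - 0.136 * mxid_mm - 0.231 * wfm_mm
    return 1.0 / (1.0 + math.exp(-logit))

# hypothetical measurements in millimetres
for mxid, wfm in [(24.0, 28.0), (30.0, 33.0)]:
    p = p_female(mxid, wfm)
    print(f"MxID={mxid}, WFM={wfm} -> p(female)={p:.2f},",
          "female" if p >= 0.5 else "male")
```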
Procedia PDF Downloads 151
26013 Estimation of Population Mean under Random Non-Response in Two-Phase Successive Sampling
Authors: M. Khalid, G. N. Singh
Abstract:
In this paper, we consider the problem of estimating the population mean on the current (second) occasion in the presence of random non-response in two-occasion successive sampling under a two-phase set-up. Modified exponential-type estimators are proposed, and their properties are studied under the assumption that the numbers of sampling units follow a distribution arising from random non-response. The performances of the proposed estimators are compared with a linear combination of two estimators: (a) the sample mean estimator for the fresh sample and (b) the ratio estimator for the matched sample, under complete response. Results are demonstrated through empirical studies, which show the effectiveness of the proposed estimators. Suitable recommendations are made for survey practitioners.
Keywords: successive sampling, random non-response, auxiliary variable, bias, mean square error
Procedia PDF Downloads 522
26012 Analysis of Translational Ship Oscillations in a Realistic Environment
Authors: Chen Zhang, Bernhard Schwarz-Röhr, Alexander Härting
Abstract:
To acquire accurate ship motions at the center of gravity, a single low-cost inertial sensor is utilized on board to measure the ship's oscillating motions. The three-axis accelerations and three-axis rotational rates provided by the sensor are used as observations. The mathematical model for processing the observation data includes the determination of the distance vector between the sensor and the center of gravity in the x, y, and z directions. After setting up the transfer matrix from the sensor's own coordinate system to the ship's body frame, an extended Kalman filter is applied to deal with the nonlinearities between the ship motion in the body frame and the observation information in the sensor's frame. As a side effect, the method eliminates sensor noise and other unwanted errors. The results are not only roll and pitch, but also linear motions, in particular heave and surge at the center of gravity. For testing, we resort to measurements recorded on a small vessel in a well-defined sea state. With response amplitude operators computed numerically by commercial software (Seaway), motion characteristics are estimated; these agree well with the measurements after processing with the suggested method.
Keywords: extended Kalman filter, nonlinear estimation, sea trial, ship motion estimation
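The lever-arm correction implied by the distance vector can be sketched with the standard rigid-body relation a_cg = a_s − dω/dt × r − ω × (ω × r), where r points from the center of gravity to the sensor; all numerical values below are placeholders, and the sign/axis conventions are an assumption, not the paper's exact formulation.

```python
# Refer a measured acceleration from the sensor location back to the center of gravity.
import numpy as np

def accel_at_cg(a_sensor, omega, omega_dot, r):
    """All inputs are 3-vectors in the ship body frame; r points from CG to sensor."""
    return (np.asarray(a_sensor)
            - np.cross(omega_dot, r)
            - np.cross(omega, np.cross(omega, r)))

a_s = [0.20, -0.05, 9.90]          # measured specific force, m/s^2
omega = [0.02, 0.10, 0.01]         # roll/pitch/yaw rates, rad/s
omega_dot = [0.001, -0.002, 0.0]   # angular accelerations, rad/s^2
r = [5.0, 0.0, -2.0]               # sensor 5 m forward and 2 m above the CG (z-down)
print(np.round(accel_at_cg(a_s, omega, omega_dot, r), 4))
```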
Procedia PDF Downloads 524
26011 Using Deep Learning for the Detection of Faulty RJ45 Connectors on a Radio Base Station
Authors: Djamel Fawzi Hadj Sadok, Marrone Silvério Melo Dantas, Pedro Henrique Dreyer, Gabriel Fonseca Reis de Souza, Daniel Bezerra, Ricardo Souza, Silvia Lins, Judith Kelner
Abstract:
A radio base station (RBS), part of the radio access network, is a particular type of equipment that supports the connection between a wide range of cellular user devices and an operator network access infrastructure. Nowadays, most RBS maintenance is carried out manually, resulting in a time-consuming and costly task. A suitable candidate for RBS maintenance automation is repairing faulty links between devices caused by missing or unplugged connectors. This paper proposes and compares two deep learning solutions to identify attached RJ45 connectors on network ports. We name the solution based on object detection "connector detection" and the one based on object classification "connector classification". With connector detection, we obtain an accuracy of 0.934 and a mean average precision of 0.903. Connector classification achieves a maximum accuracy of 0.981 and an AUC of 0.989. Although connector detection was outperformed in this study, this should not be viewed as an overall result, as connector detection is more flexible in scenarios where there is no precise information about the environment and the possible devices, whereas connector classification requires that information to be well-defined.
Keywords: radio base station, maintenance, classification, detection, deep learning, automation
Procedia PDF Downloads 202
26010 Model Estimation and Error Level for Okike’s Merged Irregular Transposition Cipher
Authors: Okike Benjamin, Garba E. J. D.
Abstract:
The researcher has developed a new encryption technique known as the Merged Irregular Transposition Cipher. In this method of encryption, a message to be encrypted is split into parts, and each part is encrypted separately. Before the encrypted message is transmitted to the recipient(s), the positions of the splits in the encrypted message can be swapped to ensure more security. This work seeks to develop a model by considering the split number, S, and the average number of characters per split, L, as the message under consideration is split into 2 through 10 parts. After developing the model, the error level in the model is determined.
Keywords: merged irregular transposition, error level, model estimation, message splitting
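A toy sketch of the splitting and position-swapping stage described above is shown below; the per-part encryption is abstracted away as a placeholder function, and the swap order and message are invented for illustration.

```python
# Split a message into S parts, encrypt each part (placeholder), and swap positions.
import math

def split_message(msg: str, s: int) -> list[str]:
    """Split msg into s parts of (roughly) equal length L = ceil(len(msg)/s)."""
    size = math.ceil(len(msg) / s)
    return [msg[i:i + size] for i in range(0, len(msg), size)]

def encrypt_part(part: str) -> str:          # stand-in for the real cipher
    return part[::-1]

message = "TRANSFER THE FUNDS AT NOON"
parts = [encrypt_part(p) for p in split_message(message, 4)]
swap_order = [2, 0, 3, 1]                    # positions swapped before transmission
ciphertext = "".join(parts[i] for i in swap_order)
print(f"S=4, average characters per split L={len(message) / 4:.1f}")
print("transmitted:", ciphertext)
```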
Procedia PDF Downloads 314
26009 Model Predictive Control with Unscented Kalman Filter for Nonlinear Implicit Systems
Authors: Takashi Shimizu, Tomoaki Hashimoto
Abstract:
The class of implicit systems is a more general class of systems than the class of explicit systems. To establish a control method for such a generalized class of systems, we adopt the model predictive control method, which is a kind of optimal feedback control with a performance index that has a moving initial time and terminal time. However, the model predictive control method is inapplicable to systems in which not all state variables are exactly known; in other words, it is inapplicable to systems with limited measurable states. In fact, the state variables of systems are usually measured through outputs; hence, only limited parts of them can be used directly, and output signals are usually disturbed by process and sensor noise. Hence, it is important to establish a state estimation method for nonlinear implicit systems that takes process noise and sensor noise into consideration. To this end, we apply the model predictive control method and the unscented Kalman filter to solve the optimization and estimation problems of nonlinear implicit systems, respectively. The objective of this study is to establish a model predictive control method with an unscented Kalman filter for nonlinear implicit systems.
Keywords: optimal control, nonlinear systems, state estimation, Kalman filter
Procedia PDF Downloads 202
26008 Modelling the Indonesian Government Securities Yield Curve Using Nelson-Siegel-Svensson and Support Vector Regression
Authors: Jamilatuzzahro, Rezzy Eko Caraka
Abstract:
The yield curve is the plot of the yield to maturity of zero-coupon bonds against maturity. In practice, the yield curve is not observed but must be extracted from observed bond prices for a set of (usually) incomplete maturities. Many methodologies and theories exist for analyzing the yield curve. We use the Nelson-Siegel, Svensson, and SVR methods in order to construct and compare our zero-coupon yield curves. The objectives of this research were: (i) to study the adequacy of the NSS model and SVR for Indonesian government bond data, and (ii) to choose the best optimization or estimation method for the NSS model and SVR. To achieve these objectives, the research followed these steps: data preparation, data cleaning or filtering, modeling, and model evaluation.
Keywords: support vector regression, Nelson-Siegel-Svensson, yield curve, Indonesian government
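The NSS fitting step can be sketched as a non-linear least-squares fit of the Svensson form to observed zero yields; the maturities, yields, starting values, and bounds below are made up for illustration and do not come from the Indonesian data used in the paper.

```python
# Fit the Nelson-Siegel-Svensson yield curve to synthetic zero-coupon yields.
import numpy as np
from scipy.optimize import curve_fit

def nss(tau, b0, b1, b2, b3, l1, l2):
    x1, x2 = tau / l1, tau / l2
    f1 = (1 - np.exp(-x1)) / x1
    return (b0 + b1 * f1 + b2 * (f1 - np.exp(-x1))
            + b3 * ((1 - np.exp(-x2)) / x2 - np.exp(-x2)))

maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10, 15, 20])          # years
yields = np.array([5.9, 6.0, 6.2, 6.6, 6.9, 7.3, 7.5, 7.7, 7.9, 8.0])  # percent

p0 = [8.0, -2.0, 1.0, 1.0, 1.5, 8.0]
lb = [-np.inf, -np.inf, -np.inf, -np.inf, 0.1, 0.1]
ub = [np.inf, np.inf, np.inf, np.inf, 30.0, 30.0]
params, _ = curve_fit(nss, maturities, yields, p0=p0, bounds=(lb, ub), max_nfev=20000)
print(np.round(nss(np.array([4.0, 8.0]), *params), 3))   # interpolated yields
```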
Procedia PDF Downloads 246