Search results for: statistical methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17253

17103 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

Data arising in many applied settings frequently have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset. Therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes, and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted to a real-life dataset using the four methods, and the performance of the test was compared for each model.
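
As a rough illustration of the kind of data involved (not the paper's MLwiN workflow), the sketch below simulates clustered binary responses from a random-intercept logistic model with a chosen latent-scale intra-cluster correlation; the parameter values are assumptions for demonstration only.

```python
# Minimal sketch: clustered binary data from a random-intercept logistic model,
# the type of data on which MQL/PQL estimators and the GOF test are compared.
import numpy as np

rng = np.random.default_rng(1)

def simulate_binary_multilevel(n_clusters=50, cluster_size=20, icc=0.1, beta=(-0.5, 1.0)):
    """Generate (x, y, cluster_id) with a given latent-scale intra-cluster correlation."""
    sigma2_u = icc * (np.pi ** 2 / 3) / (1 - icc)        # random-intercept variance
    u = rng.normal(0.0, np.sqrt(sigma2_u), n_clusters)   # cluster effects
    cluster_id = np.repeat(np.arange(n_clusters), cluster_size)
    x = rng.normal(size=n_clusters * cluster_size)
    eta = beta[0] + beta[1] * x + u[cluster_id]          # linear predictor
    p = 1.0 / (1.0 + np.exp(-eta))
    y = rng.binomial(1, p)
    return x, y, cluster_id

x, y, cid = simulate_binary_multilevel()
print(y.mean(), len(np.unique(cid)))
```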

Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error

Procedia PDF Downloads 114
17102 Forecasting the Influences of Information and Communication Technology on the Structural Changes of Japanese Industrial Sectors: A Study Using Statistical Analysis

Authors: Ubaidillah Zuhdi, Shunsuke Mori, Kazuhisa Kamegai

Abstract:

The purpose of this study is to forecast the influences of Information and Communication Technology (ICT) on the structural changes of the Japanese economy based on Leontief Input-Output (IO) coefficients. This study establishes a statistical analysis to predict the future interrelationships among industries. We employ the Constrained Multivariate Regression (CMR) model to analyze the historical changes of input-output coefficients. The statistical significance of the model is then tested by the Likelihood Ratio Test (LRT). In our model, ICT is represented by two explanatory variables, i.e., computers (including main parts and accessories) and telecommunications equipment. A previous study, which analyzed the influences of these variables on the structural changes of Japanese industrial sectors from 1985 to 2005, concluded that these variables had significant influences on the changes in the business circumstances of the Japanese commerce, business services and office supplies, and personal services sectors. The projected future Japanese economic structure based on the above forecast generates the differentiated direct and indirect outcomes of ICT penetration.
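
The likelihood ratio test step can be illustrated with a small hedged sketch that compares a regression of a single IO coefficient on the two ICT variables against an intercept-only model; the data are synthetic and this is not the paper's constrained multivariate specification.

```python
# Illustration only: LRT between nested regression models for one IO coefficient.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)
computers = rng.normal(size=21)      # hypothetical ICT capital series (21 years)
telecom = rng.normal(size=21)
io_coeff = 0.3 + 0.05 * computers + 0.02 * telecom + rng.normal(scale=0.01, size=21)

X_full = sm.add_constant(np.column_stack([computers, telecom]))
X_null = np.ones((21, 1))            # intercept-only model

full = sm.OLS(io_coeff, X_full).fit()
null = sm.OLS(io_coeff, X_null).fit()

lr_stat = 2 * (full.llf - null.llf)              # likelihood ratio statistic
p_value = stats.chi2.sf(lr_stat, df=2)           # 2 restrictions removed
print(f"LR = {lr_stat:.2f}, p = {p_value:.4f}")
```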

Keywords: forecast, ICT, industrial structural changes, statistical analysis

Procedia PDF Downloads 337
17101 An Adjusted Network Information Criterion for Model Selection in Statistical Neural Network Models

Authors: Christopher Godwin Udomboso, Angela Unna Chukwu, Isaac Kwame Dontwi

Abstract:

In selecting a Statistical Neural Network model, the Network Information Criterion (NIC) has been observed to be sample biased because it does not account for sample size. The selection of a model from a set of fitted candidate models requires objective, data-driven criteria. In this paper, we derive and investigate the Adjusted Network Information Criterion (ANIC), based on Kullback’s symmetric divergence, which is designed to be an asymptotically unbiased estimator of the expected Kullback-Leibler information of a fitted model. The analyses show that, in general, the ANIC improves model selection over a wider range of sample sizes than the NIC does.

Keywords: statistical neural network, network information criterion, adjusted network information criterion, transfer function

Procedia PDF Downloads 524
17100 Weighted Rank Regression with Adaptive Penalty Function

Authors: Kang-Mo Jung

Abstract:

The use of regularization for statistical methods has become popular. The least absolute shrinkage and selection operator (LASSO) framework has become the standard tool for sparse regression. However, it is well known that the LASSO is sensitive to outliers and leverage points. We consider a new robust estimator composed of a weighted loss function of the pairwise differences of residuals and an adaptive penalty function regulating the tuning parameter for each variable. Rank regression is resistant to regression outliers, but not to leverage points. By adopting a weighted loss function, the proposed method is robust to leverage points in the predictor variables. Furthermore, the adaptive penalty function gives good statistical properties in variable selection, such as the oracle property and consistency. We develop an efficient algorithm to compute the proposed estimator using basic functions in the R program. The tuning parameter is chosen by the Bayesian information criterion (BIC). Numerical simulation shows that the proposed estimator is effective for analyzing real and contaminated data sets.

Keywords: adaptive penalty function, robust penalized regression, variable selection, weighted rank regression

Procedia PDF Downloads 429
17099 A New Conjugate Gradient Method with Guaranteed Descent

Authors: B. Sellami, M. Belloufi

Abstract:

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have been studied extensively in recent years. In this paper, we propose a new two-parameter family of conjugate gradient methods for unconstrained optimization. The two-parameter family not only includes three existing practical nonlinear conjugate gradient methods, but also contains other families of conjugate gradient methods as subfamilies. The two-parameter family of methods with the Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the family. The numerical results show that the method is efficient for the given test problems. In addition, the methods related to this family are discussed in a unified way.
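
For orientation, the sketch below shows a standard Polak-Ribiere nonlinear conjugate gradient loop with a Wolfe line search on the Rosenbrock test function; it illustrates the class of methods discussed, not the paper's two-parameter family or its specific update rule.

```python
# Minimal nonlinear CG (Polak-Ribiere+) with a Wolfe line search from SciPy.
import numpy as np
from scipy.optimize import line_search

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

x = np.array([-1.2, 1.0])
g = rosenbrock_grad(x)
d = -g                                                  # initial search direction
for _ in range(200):
    alpha = line_search(rosenbrock, rosenbrock_grad, x, d)[0]  # Wolfe conditions
    if alpha is None:
        break
    x_new = x + alpha * d
    g_new = rosenbrock_grad(x_new)
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))      # Polak-Ribiere+ coefficient
    d = -g_new + beta * d
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-6:
        break
print(x)  # should end up close to the minimizer [1, 1]
```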

Keywords: unconstrained optimization, conjugate gradient method, line search, global convergence

Procedia PDF Downloads 413
17098 Diagnosis of Diabetes Using Computer Methods: Soft Computing Methods for Diabetes Detection Using Iris

Authors: Piyush Samant, Ravinder Agarwal

Abstract:

Complementary and Alternative Medicine (CAM) techniques are quite popular and effective for chronic diseases. Iridology is a CAM technique, more than 150 years old, that analyzes the patterns, tissue weakness, color, shape, structure, etc. of the iris for disease diagnosis. The objective of this paper is to validate the use of iridology for the diagnosis of diabetes. The suggested model was applied to a systemic disease with ocular effects. Data from 200 subjects, 100 diabetic and 100 non-diabetic, were evaluated. The complete procedure was kept simple and free from the involvement of any iridologist. From the normalized iris, the region of interest was cropped. A total of 63 features were extracted using statistical measures, texture analysis, and the two-dimensional discrete wavelet transformation. A comparison of the accuracies of six different classifiers is presented. The results show 89.66% accuracy for the random forest classifier.
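
A simplified sketch of the feature-extraction idea (statistical moments plus 2D discrete wavelet sub-band energies) followed by random forest classification is given below; the synthetic stand-in data, the reduced feature set, and the parameter choices are assumptions, not the paper's 63-feature pipeline.

```python
# Sketch: basic statistical + wavelet features from an iris ROI, classified with a random forest.
import numpy as np
import pywt
from scipy.stats import skew, kurtosis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def extract_features(roi):
    """roi: 2D grayscale array cropped from the normalised iris."""
    cA, (cH, cV, cD) = pywt.dwt2(roi, "haar")            # one-level 2D DWT
    moments = [roi.mean(), roi.std(), skew(roi.ravel()), kurtosis(roi.ravel())]
    energies = [np.sum(band ** 2) for band in (cA, cH, cV, cD)]
    return np.array(moments + energies)

# Synthetic stand-in data: 200 fake ROIs, 100 per class.
rng = np.random.default_rng(0)
rois = rng.random((200, 64, 64)) + np.repeat([0.0, 0.05], 100)[:, None, None]
labels = np.repeat([0, 1], 100)                          # 0 = non-diabetic, 1 = diabetic

X = np.array([extract_features(r) for r in rois])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())      # cross-validated accuracy
```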

Keywords: complementary and alternative medicine, classification, iridology, iris, feature extraction, disease prediction

Procedia PDF Downloads 364
17097 Image Encryption Using Eureqa to Generate an Automated Mathematical Key

Authors: Halima Adel Halim Shnishah, David Mulvaney

Abstract:

Applying traditional symmetric cryptography algorithms for encryption and decryption provides secret keys with immunity against different attacks. One popular technique for generating automated secret keys is evolutionary computing using the Eureqa API tool, which gained attention in 2013. In this paper, we generate automated secret keys for image encryption and decryption using the Eureqa API (a tool used in evolutionary computing). The Eureqa API models pseudo-random input data obtained from a suitable source to generate secret keys. The validity of the generated secret keys is investigated by performing various statistical tests (histogram, chi-square, correlation of two adjacent pixels, correlation between original and encrypted images, entropy, and key sensitivity). Experimental results obtained from methods including histogram analysis, correlation coefficient, entropy, and key sensitivity show that the proposed image encryption algorithms are secure and reliable, with the potential to be adapted for secure image communication applications.
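
Three of the statistical checks mentioned can be sketched directly in code: chi-square uniformity of the histogram, correlation of adjacent pixels, and Shannon entropy of a grayscale cipher image. The example below uses a random array as a stand-in cipher image; key generation with Eureqa is not shown.

```python
# Hedged sketch of histogram chi-square, adjacent-pixel correlation, and entropy checks.
import numpy as np
from scipy.stats import chisquare

def histogram_chisquare(img):
    counts = np.bincount(img.ravel(), minlength=256)
    return chisquare(counts)                 # expected frequencies default to uniform

def adjacent_pixel_correlation(img):
    x = img[:, :-1].ravel().astype(float)    # horizontally adjacent pixel pairs
    y = img[:, 1:].ravel().astype(float)
    return np.corrcoef(x, y)[0, 1]           # near 0 for a good cipher image

def shannon_entropy(img):
    p = np.bincount(img.ravel(), minlength=256) / img.size
    p = p[p > 0]
    return -np.sum(p * np.log2(p))           # ideally close to 8 bits per pixel

cipher = np.random.default_rng(0).integers(0, 256, (256, 256), dtype=np.uint8)
print(histogram_chisquare(cipher).pvalue,
      adjacent_pixel_correlation(cipher),
      shannon_entropy(cipher))
```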

Keywords: image encryption algorithms, Eureqa, statistical measurements, automated key generation

Procedia PDF Downloads 451
17096 Investigating Visual Statistical Learning during Aging Using the Eye-Tracking Method

Authors: Zahra Kazemi Saleh, Bénédicte Poulin-Charronnat, Annie Vinter

Abstract:

This study examines the effects of aging on visual statistical learning, using eye-tracking techniques to investigate this cognitive phenomenon. Visual statistical learning is a fundamental brain function that enables the automatic and implicit recognition, processing, and internalization of environmental patterns over time. Some previous research has suggested the robustness of this learning mechanism throughout the aging process, underscoring its importance in the context of education and rehabilitation for the elderly. The study included three groups of participants: 21 young adults (Mage: 19.73), 20 young-old adults (Mage: 67.22), and 17 old-old adults (Mage: 79.34). Participants were exposed to a series of 12 arbitrary black shapes organized into 6 pairs, each with a different spatial configuration and orientation (horizontal, vertical, or oblique). These pairs were not explicitly revealed to the participants, who were instructed to passively observe 144 grids presented sequentially on the screen for a total duration of 7 min. In the subsequent test phase, participants performed a two-alternative forced-choice task in which they had to identify the most familiar pair in 48 trials, each consisting of a base pair and a non-base pair. Behavioral analysis using t-tests revealed notable findings. The mean score of the first group was significantly above chance, indicating the presence of visual statistical learning. Similarly, the second group also performed significantly above chance, confirming the persistence of visual statistical learning in young-old adults. Conversely, the third group, consisting of old-old adults, showed a mean score that was not significantly above chance. This lack of statistical learning in the old-old adult group suggests a decline in this cognitive ability with age. Preliminary eye-tracking results showed a decrease in the number and duration of fixations during the exposure phase for all groups. The main difference was that older participants fixated on empty cells more often than younger participants, likely due to a decline in the ability to ignore irrelevant information, resulting in a decrease in statistical learning performance.
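
The core behavioural analysis is a one-sample t-test of group accuracy against the 0.5 chance level of the two-alternative forced-choice task; a minimal sketch is shown below with illustrative scores, not the study's data.

```python
# Sketch: testing whether mean 2AFC accuracy exceeds chance (0.5) with a one-sample t-test.
from scipy import stats

group_scores = [0.54, 0.71, 0.62, 0.58, 0.67, 0.60, 0.73, 0.55]   # proportion correct per participant
t_stat, p_two_sided = stats.ttest_1samp(group_scores, popmean=0.5)
p_one_sided = p_two_sided / 2 if t_stat > 0 else 1 - p_two_sided / 2
print(f"t = {t_stat:.2f}, one-sided p = {p_one_sided:.4f}")
```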

Keywords: aging, eye tracking, implicit learning, visual statistical learning

Procedia PDF Downloads 45
17095 An Application of Sinc Function to Approximate Quadrature Integrals in Generalized Linear Mixed Models

Authors: Altaf H. Khan, Frank Stenger, Mohammed A. Hussein, Reaz A. Chaudhuri, Sameera Asif

Abstract:

This paper discusses a novel approach to approximating the quadrature integrals that arise in the estimation of likelihood parameters for generalized linear mixed models (GLMM). Bayesian methodology likewise requires the computation of multidimensional integrals with respect to posterior distributions; such computations are not only tedious and cumbersome but in some situations impossible because of singularities, irregular domains, etc. An attempt has been made in this work to apply Sinc function based quadrature rules to approximate these intractable integrals, as Sinc based methods have several advantages: the order of convergence is exponential, they work very well in the neighborhood of singularities, they are in general quite stable, and they provide highly accurate, double-precision estimates. To our knowledge, the Sinc function based approach is utilized here for the first time in the statistical domain, and its viability and future scope are discussed for the estimation of GLMM parameters as well as for some other statistical areas.
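
A small sketch of the basic Sinc (trapezoidal) quadrature rule on the real line illustrates the exponential convergence claim on a toy Gaussian integrand; the step-size choice is an assumption, and the GLMM likelihood integrals of the paper would replace the toy function.

```python
# Sketch: Sinc/trapezoidal quadrature on (-inf, inf) with exponential error decay.
import numpy as np

def sinc_quadrature(f, h, N):
    """Approximate the integral of f over the real line by h * sum_{k=-N}^{N} f(k*h)."""
    k = np.arange(-N, N + 1)
    return h * np.sum(f(k * h))

f = lambda x: np.exp(-x ** 2)                 # exact integral is sqrt(pi)
for N in (4, 8, 16, 32):
    h = np.sqrt(np.pi / N)                    # step size balancing truncation and discretization error
    approx = sinc_quadrature(f, h, N)
    print(N, abs(approx - np.sqrt(np.pi)))    # error decreases roughly exponentially with N
```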

Keywords: generalized linear mixed model, likelihood parameters, quadrature, Sinc function

Procedia PDF Downloads 366
17094 Employer Learning, Statistical Discrimination and University Prestige

Authors: Paola Bordon, Breno Braga

Abstract:

This paper investigates whether firms use university prestige to statistically discriminate among college graduates. The test is based on the employer learning literature which suggests that if firms use a characteristic for statistical discrimination, this variable should become less important for earnings as a worker gains labor market experience. In this framework, we use a regression discontinuity design to estimate a 19% wage premium for recent graduates of two of the most selective universities in Chile. However, we find that this premium decreases by 3 percentage points per year of labor market experience. These results suggest that employers use college selectivity as a signal of workers' quality when they leave school. However, as workers reveal their productivity throughout their careers, they become rewarded based on their true quality rather than the prestige of their college.

Keywords: employer learning, statistical discrimination, college returns, college selectivity

Procedia PDF Downloads 545
17093 Review of Downscaling Methods in Climate Change and Their Role in Hydrological Studies

Authors: Nishi Bhuvandas, P. V. Timbadiya, P. L. Patel, P. D. Porey

Abstract:

Recent perceived climate variability raises concerns about unprecedented hydrological phenomena and extremes. The distribution and circulation of the waters of the Earth become increasingly difficult to determine because of the additional uncertainty related to anthropogenic emissions. According to the sixth Intergovernmental Panel on Climate Change (IPCC) Technical Paper on Climate Change and Water, changes in the large-scale hydrological cycle have been related to an increase in the observed temperature over several decades. Although much previous research on the effect of climate change on hydrology provides a general picture of possible global hydrological change, new tools and frameworks for modelling hydrological series with nonstationary characteristics at finer scales are required for assessing climate change impacts. Of the downscaling techniques, dynamic downscaling is usually based on the use of Regional Climate Models (RCMs), which generate finer resolution output based on atmospheric physics over a region using General Circulation Model (GCM) fields as boundary conditions. However, RCMs are not expected to capture the observed spatial precipitation extremes at a fine cell scale or at a basin scale. Statistical downscaling derives a statistical or empirical relationship between the variables simulated by the GCMs, called predictors, and station-scale hydrologic variables, called predictands. The main focus of the paper is on the need for using statistical downscaling techniques for the projection of local hydrometeorological variables under climate change scenarios. The projections can then serve as an input source to various hydrologic models to obtain streamflow, evapotranspiration, soil moisture, and other hydrological variables of interest.

Keywords: climate change, downscaling, GCM, RCM

Procedia PDF Downloads 372
17092 Effects of a Student-Centered Approach to Assessment on Students' Attitudes towards 'Applied Statistics' Course

Authors: Anduela Lile

Abstract:

The purpose of this cross-sectional study was to investigate the effectiveness of teaching and learning Statistics from a student-centered perspective in higher education institutions. Statistics education has emphasized the application of tangible and interesting examples in order to motivate students’ learning of statistical concepts. Participants in this study were 112 bachelor students enrolled in the ‘Applied Statistics’ course at the Sports University of Tirana. Experimental group students received a student-centered teaching approach; control group students received an instructor-centered teaching approach. The study found a statistically significant difference in assessment scores at the end of the evaluation between the student-centered group (52.1 ± 18.9) and the instructor-centered group (61.8 ± 16.4), (t(108) = 2.848, p = 0.005). The results indicate that a student-centered perspective can improve students’ positive attitudes toward statistical methods and motivate project work. Therefore, the findings of this study may be very useful to higher education institutions in establishing their learning strategies, especially for courses related to Statistics.

Keywords: student-centered, instructor-centered, course assessment, learning outcomes, applied statistics

Procedia PDF Downloads 245
17091 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization

Authors: Angad Arora

Abstract:

In production operations/manufacturing, a cell or line is typically a group of similar machines (computer numerical control (CNC) machines, advanced cutting, 3D printing, or special purpose machines). Ideally, for qualifying a typical manufacturing line/cell/new process, we need a sample of parts that can be run through the process so that we can make a judgment on the health of the line/cell. However, with huge volumes and mass-production scope, such as in the mobile phone industry, for example, the actual cells or lines can number in the thousands, and qualifying each one of them with statistical confidence means utilizing samples that are very large, which eventually adds to product/manufacturing cost plus huge waste if the parts are not intended to be shipped to customers. To solve this, we propose a two-step statistical approach: we start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we see those samples early, then there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this hypothesis and developed a two-step binomial testing approach. Further, the results show that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
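
The logic can be sketched as a simple two-stage binomial plan: stop after a small first sample if early failures already make the target yield implausible, otherwise test a second sample. The sample sizes, cutoffs, and defect rate below are illustrative assumptions, not the paper's plan.

```python
# Illustrative two-stage binomial qualification plan and its expected sample size.
from scipy.stats import binom

p_bad = 0.10          # assumed true defect rate of an unhealthy line
n1, c1 = 20, 1        # stage 1: reject the line early if failures exceed c1
n2 = 30               # stage 2 sample size if stage 1 passes

# Probability of stopping (rejecting) after stage 1 for an unhealthy line.
p_stop_early = 1 - binom.cdf(c1, n1, p_bad)

# Expected number of parts used, versus always running the full single-stage sample.
expected_n = n1 + (1 - p_stop_early) * n2
print(f"stop early with prob {p_stop_early:.2f}; "
      f"expected samples {expected_n:.1f} vs {n1 + n2} single-stage")
```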

Keywords: statistics, data science, manufacturing process qualification, production planning

Procedia PDF Downloads 62
17090 The Metacognition Levels of Students: A Research at the School of Physical Education and Sports at Anadolu University

Authors: Dilek Yalız Solmaz

Abstract:

Meta-cognition is an important factor in educating conscious individuals who are aware of their cognitive processes. In this respect, the purposes of this article are to find out the perceived metacognition level of Physical Education and Sports School students at Anadolu University and to identify whether metacognition levels display significant differences in terms of various variables. The research universe consisted of 416 Anadolu University Physical Education and Sports School students. The Meta-Cognitions Questionnaire, developed by Cartwright-Hatton and Wells and later shortened into the 30-item form (MCQ-30), was used. The MCQ-30, which was adapted into Turkish by Tosun and Irak, is a four-point agreement scale. In the data analysis, the arithmetic mean, standard deviation, t-test, and ANOVA were used. There is no statistical difference between the mean scores of girl and boy students on uncontrollableness and danger, cognitive awareness, cognitive confidence, and positive beliefs. There is a statistical difference between their mean scores on the need to control thinking. There is no statistical difference according to students' departments between the mean scores on uncontrollableness and danger, cognitive awareness, cognitive confidence, need to control thinking, and positive beliefs. There is no statistical difference according to students' grade level between the mean scores on positive beliefs, cognitive confidence, and need to control thinking, but there is a statistical difference between the mean scores on uncontrollableness and danger and cognitive awareness.

Keywords: meta cognition, physical education, sports school students, thinking

Procedia PDF Downloads 356
17089 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status at the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the outcomes of the analyzed results, well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that aim at homogeneity in the number of population and households. Compared to the outcomes of the traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select an appropriate dissemination level for publishing statistical data. This paper compares the results of respectively using TGSC and the township unit on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) at the township level ranges from 571 to 1757 per 100,000 persons, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment). The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the processing of death certificates, the geocoding of street addresses, the quality assurance of geocoded results, the automatic calculation of statistical measures, the standardized encoding of measures, and the geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to avoid wrong decision making.
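
The ASDR values quoted above come from direct age standardization; the sketch below shows the calculation with made-up age groups, counts, and standard population weights, purely for illustration.

```python
# Sketch: direct age standardization of an all-cause death rate (per 100,000 persons).
import numpy as np

deaths = np.array([2, 5, 14, 40, 120])                       # deaths per age group in one area
population = np.array([8000, 9500, 7000, 5200, 3100])        # resident population per age group
std_pop_weights = np.array([0.28, 0.26, 0.22, 0.15, 0.09])   # standard population shares (sum to 1)

age_specific_rates = deaths / population
asdr_per_100k = np.sum(age_specific_rates * std_pop_weights) * 100_000
print(f"ASDR = {asdr_per_100k:.0f} per 100,000 persons")
```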

Keywords: mortality map, spatial patterns, statistical area, variation

Procedia PDF Downloads 220
17088 Direct Translation vs. Pivot Language Translation for Persian-Spanish Low-Resourced Statistical Machine Translation System

Authors: Benyamin Ahmadnia, Javier Serrano

Abstract:

In this paper, we compare two different approaches for translating from Persian to Spanish, a language pair with a scarce parallel corpus. The first approach involves direct transfer using a statistical machine translation system, which is available for this language pair. The second approach involves translation through English as a pivot language, which has more translation resources and more advanced translation systems available. The results show that it is possible to achieve better translation quality using English as a pivot language: the pivot approach outperforms direct translation from Persian to Spanish. Our best result is the pivot system, which scores 1.12 BLEU points higher than direct translation.
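
The evaluation step can be sketched by scoring a direct-translation output and a pivot-translation output against the same references with corpus BLEU; the sentences below are placeholders, and NLTK's BLEU implementation is used here rather than the paper's toolkit.

```python
# Sketch: comparing two system outputs with smoothed corpus BLEU.
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

references = [[["el", "gato", "está", "en", "la", "mesa"]]]   # one reference list per sentence
direct_hyp = [["el", "gato", "es", "en", "mesa"]]             # hypothetical direct-system output
pivot_hyp = [["el", "gato", "está", "en", "mesa"]]            # hypothetical pivot-system output

smooth = SmoothingFunction().method1
bleu_direct = corpus_bleu(references, direct_hyp, smoothing_function=smooth)
bleu_pivot = corpus_bleu(references, pivot_hyp, smoothing_function=smooth)
print(f"direct: {bleu_direct:.3f}, pivot: {bleu_pivot:.3f}")
```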

Keywords: statistical machine translation, direct translation approach, pivot language translation approach, parallel corpus

Procedia PDF Downloads 457
17087 The Concept of Neurostatistics as a Neuroscience

Authors: Igwenagu Chinelo Mercy

Abstract:

This study is on the concept of Neurostatistics in relation to neuroscience. Neuroscience, also known as neurobiology, is the scientific study of the nervous system. In the study of neuroscience, it has been noted that brain function and its relation to the processes of acquiring knowledge and behaviour can be better explained by the use of various interrelated methods. The scope of neuroscience has broadened over time to include different approaches used to study the nervous system at different scales. In this study, on the other hand, Neurostatistics is viewed as a statistical concept that uses techniques similar to neuron mechanisms to solve problems, especially in the field of life science. This study is imperative in this era of artificial intelligence and machine learning in the sense that a clear understanding of the technique and its proper application could assist in solving some medical disorders that are mainly associated with the nervous system. It will also help the layman's understanding of the workings of the nervous system in order to overcome some of the health challenges associated with it. For this concept to be well understood, an illustrative example using a brain-associated disorder was used for demonstration. Structural equation modelling was adopted in the analysis. The results clearly show the link between statistical modelling techniques and the nervous system. Hence, based on this study, the appropriate application of Neurostatistics in relation to neuroscience rests on an understanding of the behavioural patterns of both concepts.

Keywords: brain, neurons, neuroscience, neurostatistics, structural equation modeling

Procedia PDF Downloads 36
17086 Detection of Clipped Fragments in Speech Signals

Authors: Sergei Aleinik, Yuri Matveev

Abstract:

In this paper, a novel method for the detection of clipping in speech signals is described. It is shown that the new method has better performance than known clipping detection methods, is easy to implement, and is robust to changes in signal amplitude, size of data, etc. Statistical simulation results are presented.
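
For context, a simple baseline clipping detector (not the authors' method) flags a frame as clipped when an unusually large share of samples sit at the extreme amplitude values of the signal; the thresholds below are illustrative assumptions.

```python
# Baseline sketch: detect clipping from the share of samples near the peak amplitude.
import numpy as np

def is_clipped(frame, margin=0.999, max_share=0.05):
    """Return True if too many samples lie within `margin` of the peak amplitude."""
    peak = np.max(np.abs(frame))
    if peak == 0:
        return False
    share_at_peak = np.mean(np.abs(frame) >= margin * peak)
    return share_at_peak > max_share

t = np.linspace(0, 1, 16000)
clean = 0.5 * np.sin(2 * np.pi * 220 * t)
clipped = np.clip(2.0 * clean, -0.6, 0.6)           # hard-clipped copy of the same tone
print(is_clipped(clean), is_clipped(clipped))       # expected: False True
```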

Keywords: clipping, clipped signal, speech signal processing, digital signal processing

Procedia PDF Downloads 358
17085 Assessment of the Electrical, Mechanical, and Thermal Nociceptive Thresholds for Stimulation and Pain Measurements at the Bovine Hind Limb

Authors: Samaneh Yavari, Christiane Pferrer, Elisabeth Engelke, Alexander Starke, Juergen Rehage

Abstract:

Background: Three nociceptive thresholds, thermal, electrical, and mechanical, are commonly used to evaluate local anesthesia in many species, for instance, cows, horses, cats, dogs, rabbits, and so on. Due to the lack of investigations evaluating and/or validating these nociceptive thresholds, our plan was to compare two foot local anesthesia methods: Intravenous Regional Anesthesia (IVRA) and our modified four-point Nerve Block Anesthesia (NBA). Materials and Methods: Eight healthy, nonpregnant, nondairy Holstein Friesian cows were selected for this study in a cross-over design. The cows were divided into two groups to receive the two local anesthesia techniques, IVRA and our modified four-point NBA. Thermal, electrical, and mechanical force and pinprick stimuli were applied to evaluate the quality of the local anesthesia methods before and after local anesthesia application. Results: The statistical evaluation demonstrated that our four-point NBA qualifies for selection as a standard foot local anesthesia. However, the recorded results revealed no significant difference between the two local anesthesia techniques, IVRA and the modified four-point NBA, with respect to the quality and duration of anesthesia stimulated by electrical, mechanical, and thermal nociceptive stimuli. Conclusion and Discussion: All three nociceptive stimuli, electrical, mechanical, and thermal, can be applied to measure and evaluate the efficacy of foot local anesthesia in dairy cows. However, our study revealed no superiority of any of the three nociceptive methods for evaluating the duration and quality of bovine foot local anesthesia methods. Veterinarians can use any of the thermal, mechanical, and electrical methods to investigate the duration and quality of their selected anesthesia method.

Keywords: mechanical, thermal, electrical threshold, IVRA, NBA, hind limb, dairy cow

Procedia PDF Downloads 217
17084 Statistical Analysis of the Impact of Maritime Transport Gross Domestic Product (GDP) on Nigeria’s Economy

Authors: Kehinde Peter Oyeduntan, Kayode Oshinubi

Abstract:

Nigeria is referred to as the ‘Giant of Africa’ due to its high population, land mass, and large economy. However, it still trails far behind many smaller economies on the continent in terms of maritime operations. Since the maritime industry is the spark plug for national growth, housing the most crucial infrastructure that generates wealth for a nation, it is worrisome that a nation with six seaports lags in maritime activities. In this research, we studied how the Gross Domestic Product (GDP) of maritime transport influences the Nigerian economy. To do this, we applied Simple Linear Regression (SLR), Support Vector Machine (SVM), Polynomial Regression Model (PRM), Generalized Additive Model (GAM), and Generalized Linear Mixed Model (GLMM) to model the relationship between the nation’s Total GDP (TGDP) and the Maritime Transport GDP (MGDP) using a time series of 20 years. The results showed that the MGDP is statistically significant for the Nigerian economy. Amongst the statistical tools applied, the PRM of order 4 describes the relationship best. The recommendations presented in this study will guide policy makers and help improve the economy of Nigeria in terms of its GDP.
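
A hedged sketch of the order-4 polynomial regression step is given below, fitting TGDP on MGDP and comparing it with a simple linear fit via R²; the 20-point series is synthetic, not the study's data.

```python
# Sketch: fourth-order polynomial regression of total GDP on maritime transport GDP.
import numpy as np

rng = np.random.default_rng(0)
mgdp = np.linspace(1.0, 4.0, 20)                      # maritime transport GDP (arbitrary units)
tgdp = 5 + 2.0 * mgdp + 0.6 * mgdp ** 2 + rng.normal(scale=0.5, size=20)   # total GDP

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

for degree in (1, 4):
    coeffs = np.polyfit(mgdp, tgdp, degree)
    fitted = np.polyval(coeffs, mgdp)
    print(f"degree {degree}: R^2 = {r_squared(tgdp, fitted):.4f}")
```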

Keywords: maritime transport, economy, GDP, regression, port

Procedia PDF Downloads 114
17083 Development and Validation of Selective Methods for Estimation of Valaciclovir in Pharmaceutical Dosage Form

Authors: Eman M. Morgan, Hayam M. Lotfy, Yasmin M. Fayez, Mohamed Abdelkawy, Engy Shokry

Abstract:

Two simple, selective, economical, safe, accurate, precise, and environmentally friendly methods were developed and validated for the quantitative determination of valaciclovir (VAL) in the presence of its related substances R1 (acyclovir) and R2 (guanine) in bulk powder and in the commercial pharmaceutical product containing the drug. Method A is a colorimetric method in which VAL selectively reacts with ferric hydroxamate, and the developed color is measured at 490 nm over a concentration range of 0.4-2 mg/mL with a percentage recovery of 100.05 ± 0.58 and a correlation coefficient of 0.9999. Method B is a reversed-phase ultra-performance liquid chromatography (UPLC) technique, which is considered technologically superior to high-performance liquid chromatography with respect to speed, resolution, solvent consumption, time, and cost of analysis. Efficient separation was achieved on an Agilent Zorbax CN column using ammonium acetate (0.1%) and acetonitrile as a mobile phase in a linear gradient program. The elution time for the separation was less than 5 min, and ultraviolet detection was carried out at 256 nm over a concentration range of 2-50 μg/mL with a mean percentage recovery of 100.11 ± 0.55 and a correlation coefficient of 0.9999. The proposed methods were fully validated as per International Conference on Harmonization specifications and effectively applied for the analysis of valaciclovir in pure form and in the tablet dosage form. Statistical comparison of the results obtained by the proposed and official or reported methods revealed no significant difference in the performance of these methods regarding accuracy and precision, respectively.

Keywords: hydroxamic acid, related substances, UPLC, valaciclovir

Procedia PDF Downloads 216
17082 Adaptive Process Monitoring for Time-Varying Situations Using Statistical Learning Algorithms

Authors: Seulki Lee, Seoung Bum Kim

Abstract:

Statistical process control (SPC) is a practical and effective method for quality control. The most important and widely used technique in SPC is the control chart. The main goal of a control chart is to detect any assignable changes that affect the quality output. Most conventional control charts, such as Hotelling’s T2 charts, are based on the assumption that the quality characteristics follow a multivariate normal distribution. However, in modern, complicated manufacturing systems, appropriate control chart techniques that can efficiently handle nonnormal processes are required. To overcome the shortcomings of conventional control charts for nonnormal processes, several methods have been proposed that combine statistical learning algorithms and multivariate control charts. Statistical learning-based control charts, such as support vector data description (SVDD)-based charts and k-nearest neighbors-based charts, have proven their improved performance in nonnormal situations compared to that of the T2 chart. Besides the nonnormal property, time-varying operations are also quite common in real manufacturing fields because of various factors such as product and set-point changes, seasonal variations, catalyst degradation, and sensor drifting. However, traditional control charts cannot accommodate future condition changes of the process because they are formulated based on the data recorded in the early stage of the process. In the present paper, we propose an SVDD algorithm-based control chart, which is capable of adaptively monitoring time-varying and nonnormal processes. We reformulated the SVDD algorithm into a time-adaptive SVDD algorithm by adding a weighting factor that reflects time-varying situations. Moreover, we defined an updating region for an efficient model-updating structure of the control chart. The proposed control chart simultaneously allows efficient model updates and timely detection of out-of-control signals. The effectiveness and applicability of the proposed chart were demonstrated through experiments with simulated data and real data from the metal frame process in mobile device manufacturing.
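
A rough sketch of the time-adaptive weighting idea is shown below using a one-class SVM (which, with an RBF kernel, is closely related to SVDD): recent observations receive larger sample weights so that the boundary tracks a slowly drifting in-control process. The exponential weighting scheme and all parameter values are illustrative assumptions, not the paper's formulation.

```python
# Sketch: exponentially weighted one-class SVM as a stand-in for a time-adaptive SVDD boundary.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
n = 300
drift = np.linspace(0, 1.5, n)[:, None]               # slow mean drift over time
X_train = rng.normal(size=(n, 2)) + drift             # in-control but time-varying data

decay = 0.99
weights = decay ** np.arange(n - 1, -1, -1)           # newest sample gets weight 1

model = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.05)
model.fit(X_train, sample_weight=weights)

X_new_ok = rng.normal(size=(5, 2)) + 1.5              # near the recent operating point
X_new_shift = rng.normal(size=(5, 2)) + 5.0           # out-of-control shift
print(model.predict(X_new_ok), model.predict(X_new_shift))   # +1 = inlier, -1 = outlier
```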

Keywords: multivariate control chart, nonparametric method, support vector data description, time-varying process

Procedia PDF Downloads 270
17081 Improvement of the Numerical Integration's Quality in Meshless Methods

Authors: Ahlem Mougaida, Hedi Bel Hadj Salah

Abstract:

Several methods are suggested to improve the numerical integration in the Galerkin weak form for meshless methods. In fact, integrating without taking into account the characteristics of the shape functions reproduced by meshless methods (rational functions with compact support, etc.) causes a large integration error that influences the approximate solution of the PDE. Different methods of numerical integration for rational functions are discussed and compared. The algorithms are implemented in Matlab. Finally, numerical results are presented to prove the efficiency of our algorithms in improving results.

Keywords: adaptive methods, meshless, numerical integration, rational quadrature

Procedia PDF Downloads 321
17080 Mathematical Modeling for Diabetes Prediction: A Neuro-Fuzzy Approach

Authors: Vijay Kr. Yadav, Nilam Rathi

Abstract:

Accurate prediction of glucose levels in diabetes mellitus is required to avoid affecting the functioning of major organs of the human body. This study describes the fundamental assumptions and two different methodologies for blood glucose prediction. The first is based on the back-propagation algorithm of the Artificial Neural Network (ANN), and the second is based on a neuro-fuzzy technique called the Fuzzy Inference System (FIS). Errors of the proposed methods are further discussed through various statistical measures such as the mean square error (MSE) and the normalised mean absolute error (NMAE). The main objective of the present study is to develop a mathematical model for blood glucose prediction 12 hours in advance, using a data set of three patients over 60 days. Comparative studies of the accuracy against other existing models are also made with the same data set.

Keywords: back-propagation, diabetes mellitus, fuzzy inference system, neuro-fuzzy

Procedia PDF Downloads 221
17079 Storage System Validation Study for Raw Cocoa Beans Using Minitab® 17 and R (R-3.3.1)

Authors: Anthony Oppong Kyekyeku, Sussana Antwi-Boasiako, Emmanuel De-Graft Johnson Owusu Ansah

Abstract:

In this observational study, the performance of a known conventional storage system was tested and evaluated for fitness for its intended purpose. The system's scope extends to the storage of dry cocoa beans. System sensitivity, reproducibility, and uncertainties are not known in detail. This study discusses the system performance in the context of existing literature on factors that influence the quality of cocoa beans during storage. Controlled conditions were defined precisely for the system to give a reliable baseline within specific established procedures. Minitab® 17 and the R statistical software (R-3.3.1) were used for the statistical analyses. The approach to the storage system testing was to observe and compare, through laboratory test methods, the quality of the cocoa bean samples before and after storage. The samples were kept in Kilner jars, and the temperature of the storage environment was controlled and monitored over a period of 408 days. Standard test methods used in the international cocoa trade, such as cut test analysis, moisture determination with the Aqua Boy KAM III model, and bean count determination, were used for quality assessment. The data analysis treated the entire population as a sample in order to establish a reliable baseline for the data collected. The study concluded that mean values were statistically significant at the 95% Confidence Interval (CI) for the performance data analysed before and after storage for all variables observed. Correlational graphs showed a strong positive correlation for all variables investigated, with the exception of All Other Defect (AOD). The weak relationship between the before and after data for AOD had an explained variability of 51.8%, with the unexplained variability attributable to the uncontrolled condition of hidden infestation before storage. The current study concluded with a high performance criterion for the storage system.

Keywords: benchmarking performance data, cocoa beans, hidden infestation, storage system validation

Procedia PDF Downloads 140
17078 Management and Agreement Protocol in Computer Security

Authors: Abdulameer K. Hussain

Abstract:

When dealing with a cryptographic system, we note that many activities are performed by the parties of the system, and the most prominent of these activities is the process of agreement among the parties involved on how to perform the cryptographic system's tasks so that they are more secure, more confident, and reliable. The most common agreement among parties is key agreement, alongside other types of agreements. Although there have been attempts from some quarters to find other effective agreement methods, these methods are limited to the traditional agreements. This paper presents different parameters for performing the task of agreement more effectively, including the key alternative, the agreement on the encryption method used, and the agreement to prevent denial of service. To manage and achieve these goals, this method proposes a control and monitoring entity that manages these agreements by collecting different statistical information on the opinions of the authorized parties in the cryptographic system. These statistics help this entity make the proper decisions about the agreement factors. This entity is called the Agreement Manager (AM).

Keywords: agreement parameters, key agreement, key exchange, security management

Procedia PDF Downloads 382
17077 Microbiota Effect with Cytokine in HL and NHL Patient Groups

Authors: Ekin Ece Gürer, Tarık Onur Tiryaki, Sevgi Kalayoğlu Beşışık, Fatma Savran Oğuz, Uğur Sezerman, Fatma Erdem, Gülşen Günel, Dürdane Serap Kuruca, Zerrin Aktaş, Oral Öncül

Abstract:

Aim: Chemotherapy treatment in Hodgkin Lymphoma (HL) and Non-Hodgkin Lymphoma (NHL) causes gastrointestinal epithelial damage, disrupts the intestinal microbiota balance, and causes dysbiosis. Our study aimed to show the effect of the damage caused by chemotherapy on the microbiota and the effect of the changing microbiota flora on the course of the disease. Materials and Methods: Seven adult HL and seven adult NHL patients to be treated with chemotherapy were included in the study. Stool samples were taken twice, before chemotherapy treatment and after the third course of treatment. Samples were sequenced using the Next Generation Sequencing (NGS) method after nucleic acid isolation. OTU tables were prepared using NCBI blastn version 2.0.12 according to the NCBI general 16S bacterial taxonomy reference dated 10.08.2021. Alpha diversity was calculated from the generated OTU tables with the R Statistical Computer Language version 4.0.4 (readr, phyloseq, microbiome, vegan, descr, and ggplot2 packages), and the corresponding graphics were created. Statistical analyses were also performed using the R Statistical Computer Language version 4.0.4 and RStudio IDE 1.4 (tidyverse, readr, xlsx, and ggplot2 packages). Expression of IL-12 and IL-17 cytokines was measured by rtPCR twice, before and after treatment. Results: In HL patients, a significant decrease was observed in the microbiota flora for the Ruminococcaceae_UCG-014 genus (p: 0.036) and an undefined Ruminococcaceae_UCG-014 species (p: 0.036) compared to pre-treatment. When the post-treatment microbiota of HL patients was compared with healthy controls, a significant decrease was found for the Prevotella_7 genus (p: 0.049) and Butyricimonas (p: 0.006). In NHL patients, a significant decrease was observed in the microbiota flora for the Coprococcus_3 genus (p: 0.015) and an undefined Ruminoclostridium_5 species (p: 0.046) compared to pre-treatment. When the post-treatment microbiota of NHL patients was compared with healthy controls, a significant abundance of the Bacilli class (p: 0.029) and a significant decrease in an undefined Alistipes species (p: 0.047) were observed. A decrease in IL-12 cytokine expression and an increase in IL-17 cytokine expression were detected relative to pre-treatment. Discussion: Monitoring the intestinal flora after chemotherapy treatment shows that it can be a guide in the treatment of the disease. It is thought that increasing the diversity of commensal bacteria can also positively affect the prognosis of the disease.
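
For illustration, one common alpha-diversity measure (the Shannon index) can be computed directly from an OTU count vector, analogous to what the phyloseq/vegan packages do in the workflow described; the example below is in Python with made-up counts.

```python
# Sketch: Shannon alpha diversity from an OTU count vector.
import numpy as np

def shannon_index(otu_counts):
    counts = np.asarray(otu_counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

pre_treatment = [120, 80, 60, 30, 10, 5, 2]      # hypothetical OTU counts, one sample
post_treatment = [200, 150, 10, 3, 1, 0, 0]      # fewer, less even taxa after chemotherapy
print(shannon_index(pre_treatment), shannon_index(post_treatment))
```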

Keywords: Hodgkin lymphoma, non-Hodgkin lymphoma, microbiota, cytokines

Procedia PDF Downloads 72
17076 Statistical Randomness Testing of Some Second Round Candidate Algorithms of CAESAR Competition

Authors: Fatih Sulak, Betül A. Özdemir, Beyza Bozdemir

Abstract:

In order to improve symmetric key research, several competitions have been arranged by organizations such as the National Institute of Standards and Technology (NIST) and the International Association for Cryptologic Research (IACR). In recent years, the importance of authenticated encryption has rapidly increased because of the necessity of simultaneously enabling integrity, confidentiality, and authenticity. Therefore, in January 2013, IACR announced the Competition for Authenticated Encryption: Security, Applicability, and Robustness (CAESAR Competition), which will select secure and efficient algorithms for authenticated encryption. Cryptographic algorithms are anticipated to behave like random mappings; hence, it is important to apply statistical randomness tests to the outputs of the algorithms. In this work, the statistical randomness tests in the NIST Test Suite and other recently designed randomness tests are applied to six second-round algorithms of the CAESAR Competition. It is observed that AEGIS achieves randomness after 3 rounds, the Ascon permutation function achieves randomness after 1 round, the Joltik encryption function achieves randomness after 9 rounds, the Morus state update function achieves randomness after 3 rounds, Pi-cipher achieves randomness after 1 round, and Tiaoxin achieves randomness after 1 round.
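
The simplest test in the NIST suite, the monobit frequency test, gives a flavour of what such randomness testing involves: the normalized bit-sum statistic is compared against the standard normal tail via the complementary error function.

```python
# Sketch: NIST SP 800-22 monobit frequency test on a bit sequence.
import math
import random

def monobit_frequency_test(bits):
    """bits: sequence of 0/1 values. Returns the frequency-test p-value."""
    n = len(bits)
    s_n = sum(1 if b else -1 for b in bits)        # +1/-1 partial sum
    s_obs = abs(s_n) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

random.seed(1)
bits = [random.getrandbits(1) for _ in range(10_000)]
print(monobit_frequency_test(bits))   # should be well above 0.01 for random-looking input
```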

Keywords: authenticated encryption, CAESAR competition, NIST test suite, statistical randomness tests

Procedia PDF Downloads 284
17075 Statistical Characteristics of Distribution of Radiation-Induced Defects under Random Generation

Authors: P. Selyshchev

Abstract:

We consider fluctuations of defect density, taking into account defect interaction. A stochastic field of the displacement generation rate gives a random defect distribution. We determine the statistical characteristics (mean and dispersion) of the random field of the point defect distribution as functions of the defect generation parameters, the temperature, and the properties of the irradiated crystal.

Keywords: irradiation, primary defects, interaction, fluctuations

Procedia PDF Downloads 296
17074 Deterioration Prediction of Pavement Load Bearing Capacity from FWD Data

Authors: Kotaro Sasai, Daijiro Mizutani, Kiyoyuki Kaito

Abstract:

Expressways in Japan have been built at an accelerating pace since the 1960s with the aid of rapid economic growth. About 40 percent of the length of expressways in Japan is now 30 years old or older and has become superannuated. Time-related deterioration has therefore reached a degree at which administrators, from the standpoint of operation and maintenance, are forced to take prompt, large-scale measures aimed at repairing inner damage deep in pavements. Such measures have already been performed for bridge management in Japan and are also expected to be embodied in pavement management; thus, planning methods for these measures are increasingly in demand. Deterioration of the layers around the road surface, such as the surface course and binder course, is brought about in the early stages of the whole pavement deterioration process, around 10 to 30 years after construction. These layers have been repaired primarily because inner damage usually becomes significant after outer damage, and because surveys for measuring inner damage, such as Falling Weight Deflectometer (FWD) surveys and open-cut surveys, are costly and time-consuming, which has made it difficult for administrators to focus on inner damage as much as they are supposed to. As expressways today have serious time-related deterioration within them, deriving from the long time span since they entered service, the idea of repairing layers deep in pavements, such as the base course and subgrade, must clearly be taken into consideration when planning maintenance on a large scale. This sort of maintenance requires precisely predicting the degree of deterioration as well as grasping the present condition of pavements. Methods for predicting deterioration are either mechanical or statistical. While few mechanical models have been presented, as far as the authors know, previous studies have presented statistical methods for predicting deterioration in pavements: one describes the deterioration process by estimating a Markov deterioration hazard model, while another illustrates it by estimating a proportional deterioration hazard model. Both studies analyze deflection data obtained from FWD surveys and present statistical methods for predicting the deterioration process of the layers around the road surface. However, the base course and subgrade layers remain unanalyzed. In this study, data collected from FWD surveys are analyzed to predict the deterioration process of layers deep in pavements, in addition to surface layers, by estimating a deterioration hazard model using continuous indexes. This model avoids the loss of information that occurs when rating categories are set in a Markov deterioration hazard model for evaluating degrees of deterioration in roadbeds and subgrades. By portraying continuous indexes, the model can predict deterioration in each layer of the pavement and evaluate it quantitatively. Additionally, as the model can also depict the probability distribution of the indexes at an arbitrary point and establish a risk control level arbitrarily, it is expected that this study will provide knowledge such as life cycle cost and other informative content for the decision-making process on where and when to perform maintenance.

Keywords: deterioration hazard model, falling weight deflectometer, inner damage, load bearing capacity, pavement

Procedia PDF Downloads 353