Search results for: fisher scoring method
19202 Numerical Solutions of Generalized Burger-Fisher Equation by Modified Variational Iteration Method
Authors: M. O. Olayiwola
Abstract:
Numerical solutions of the generalized Burger-Fisher equation are obtained using a Modified Variational Iteration Method (MVIM) with minimal computational effort. The results computed with this technique have been compared with other results. The present method is a very reliable alternative to some existing techniques for such nonlinear problems.
Keywords: burger-fisher, modified variational iteration method, lagrange multiplier, Taylor's series, partial differential equation
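For context, the generalized Burger-Fisher equation and the generic variational iteration correction functional (the starting point for MVIM; λ is the Lagrange multiplier named in the keywords) take the standard forms below. This is background notation, not the author's specific modification:

```latex
% Generalized Burger-Fisher equation (standard form)
u_t + \alpha\, u^{\delta} u_x - u_{xx} = \beta\, u\,\big(1 - u^{\delta}\big)

% VIM correction functional; \tilde{u}_n denotes a restricted variation
u_{n+1}(x,t) = u_n(x,t) + \int_0^{t} \lambda(\tau)
  \Big[ (u_n)_{\tau} + \alpha\, \tilde{u}_n^{\delta}\, (\tilde{u}_n)_x
        - (\tilde{u}_n)_{xx} - \beta\, \tilde{u}_n \big(1 - \tilde{u}_n^{\delta}\big) \Big]\, d\tau
```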
Procedia PDF Downloads 429
19201 Supervised-Component-Based Generalised Linear Regression with Multiple Explanatory Blocks: THEME-SCGLR
Authors: Bry X., Trottier C., Mortier F., Cornu G., Verron T.
Abstract:
We address component-based regularization of a Multivariate Generalized Linear Model (MGLM). A set of random responses Y is assumed to depend, through a GLM, on a set X of explanatory variables, as well as on a set T of additional covariates. X is partitioned into R conceptually homogeneous blocks X1, ..., XR, viewed as explanatory themes. The variables in each Xr are assumed to be numerous and redundant, so Generalised Linear Regression (GLR) demands regularization with respect to each Xr. By contrast, the variables in T are assumed to have been selected so as to demand no regularization. Regularization is performed by searching each Xr for an appropriate number of orthogonal components that both contribute to modeling Y and capture relevant structural information in Xr. We propose a very general criterion to measure the structural relevance (SR) of a component in a block, and show how to take SR into account within a Fisher-scoring-type algorithm in order to estimate the model. We also show how to deal with mixed-type explanatory variables. The method, named THEME-SCGLR, is tested on simulated data.
Keywords: Component-Model, Fisher Scoring Algorithm, GLM, PLS Regression, SCGLR, SEER, THEME
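The Fisher-scoring-type estimation step referred to above follows the classical update (the alternation with component extraction is specific to THEME-SCGLR and not reproduced here):

```latex
\beta^{(t+1)} = \beta^{(t)} + \mathcal{I}\big(\beta^{(t)}\big)^{-1}\, U\big(\beta^{(t)}\big),
\qquad
U(\beta) = \frac{\partial \ell}{\partial \beta},
\qquad
\mathcal{I}(\beta) = \mathbb{E}\!\left[ -\frac{\partial^2 \ell}{\partial \beta\, \partial \beta^{\top}} \right]
```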
Procedia PDF Downloads 392
19200 The Beta-Fisher Snedecor Distribution with Applications to Cancer Remission Data
Authors: K. A. Adepoju, O. I. Shittu, A. U. Chukwu
Abstract:
In this paper, a new four-parameter generalized version of the Fisher Snedecor distribution, called the Beta-F distribution, is introduced. A comprehensive account of the statistical properties of the new distribution is given. Formal expressions for the cumulative distribution function, moments, moment generating function and maximum likelihood estimation, as well as its Fisher information, are obtained. The flexibility and robustness of this distribution are demonstrated using cancer remission time data. The new distribution can be used in most applications where the assumption underlying the use of other lifetime distributions is violated.
Keywords: fisher-snedecor distribution, beta-f distribution, outlier, maximum likelihood method
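Assuming the Beta-F distribution follows the standard beta-generated construction (the abstract does not spell it out), its CDF and density are obtained from the baseline F-distribution CDF G(x) with density g(x):

```latex
F(x) = I_{G(x)}(a, b) = \frac{1}{B(a,b)} \int_0^{G(x)} t^{a-1} (1-t)^{b-1}\, dt,
\qquad
f(x) = \frac{g(x)}{B(a,b)}\, G(x)^{a-1} \big[ 1 - G(x) \big]^{b-1}
```

The two shape parameters a, b added to the two degrees of freedom of the F distribution would give the four parameters in total.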
Procedia PDF Downloads 346
19199 Base Change for Fisher Metrics: Case of the q-Gaussian Inverse Distribution
Authors: Gabriel I. Loaiza Ossa, Carlos A. Cadavid Moreno, Juan C. Arango Parra
Abstract:
It is known that the Riemannian manifold determined by the family of inverse Gaussian distributions endowed with the Fisher metric has negative constant curvature κ = -1/2, as does the family of usual Gaussian distributions. In the present paper, we first arrive at this result by following a different path, much simpler than the previous ones: we put the family in exponential form, thus endowing it with a new set of parameters, or coordinates, θ₁, θ₂; we then determine the matrix of the Fisher metric in terms of these parameters; and finally we compute this matrix in the original parameters. Secondly, we define the inverse q-Gaussian distribution family (q < 3) as the family obtained by replacing the usual exponential function with the Tsallis q-exponential function in the expression for the inverse Gaussian distribution, and observe that it supports two possible geometries, the Fisher and the q-Fisher geometry. Finally, we apply our strategy to obtain results about the Fisher and q-Fisher geometry of the inverse q-Gaussian distribution family, similar to the ones obtained in the case of the inverse Gaussian distribution family.
Keywords: base of changes, information geometry, inverse Gaussian distribution, inverse q-Gaussian distribution, statistical manifolds
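The simplification exploited above is a standard fact of information geometry: once a family is written in exponential form, the Fisher metric is the Hessian of the log-partition (cumulant) function ψ in the natural coordinates:

```latex
p(x; \theta) = \exp\!\Big( \sum_i \theta_i F_i(x) - \psi(\theta) \Big)
\;\;\Longrightarrow\;\;
g_{ij}(\theta) = \frac{\partial^2 \psi}{\partial \theta_i\, \partial \theta_j}(\theta)
```

Computing the metric thus reduces to differentiating ψ twice and then pulling the result back to the original parameters by the chain rule (the base change of the title).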
Procedia PDF Downloads 243
19198 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and inverse-Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small to moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method and confirmed that it provides precise estimates of the regression parameters. Note that this approach can be applied to a dataset only if goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses the Fisher scoring algorithm as the iteration method to obtain the maximum likelihood estimates (MLE) of the regression parameters.
Keywords: maximum likelihood estimation, fisher scoring method, non-linear regression models, composite distributions
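As a minimal sketch of the iteration named in the keywords (standing in for the Mathematica implementation, and using a simple Poisson log-link likelihood rather than the paper's composite-distribution likelihood), Fisher scoring for regression parameters looks as follows in Python:

```python
import numpy as np

def fisher_scoring_poisson(X, y, tol=1e-8, max_iter=50):
    """Fisher scoring (IRLS) for Poisson regression with log link.

    Update: beta <- beta + I(beta)^{-1} U(beta), where
    U = X^T (y - mu) and I = X^T diag(mu) X with mu = exp(X beta).
    """
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        mu = np.exp(X @ beta)                 # mean under current fit
        score = X.T @ (y - mu)                # score vector U(beta)
        info = X.T @ (X * mu[:, None])        # expected Fisher information
        step = np.linalg.solve(info, score)
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Small synthetic check
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
y = rng.poisson(np.exp(X @ np.array([0.5, -0.3])))
print(fisher_scoring_poisson(X, y))  # should be close to [0.5, -0.3]
```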
Procedia PDF Downloads 32
19197 Weighted Risk Scores Method Proposal for Occupational Safety Risk Assessment
Authors: Ulas Cinar, Omer Faruk Ugurlu, Selcuk Cebi
Abstract:
Occupational safety risk management is the most important element of a safe working environment, and effective risk management is only possible with accurate analysis and evaluation. Scoring-based risk assessment methods offer considerable ease of application, as they convert linguistic expressions into numerical results, and they can be easily adapted to any field. Despite all these advantages, important problems with scoring-based methods are frequently discussed, and effective measurability is one of the most critical. Existing methods ask experts to choose a single score for each parameter, so experts tend to pick the score of the most likely outcome, and all other possible consequences are neglected. Assessments made with the existing methods therefore express the most probable level of risk, not the real risk of the enterprise. This study develops a method that presents a more comprehensive evaluation than the existing ones by weighting the probability and severity scores over all sub-parameters and potential outcomes; a new scoring-based method is thus proposed to the literature.
Keywords: occupational health and safety, risk assessment, scoring based risk assessment method, underground mining, weighted risk scores
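To make the contrast concrete, here is a small illustrative computation (the numbers and the 1-5 scales are hypothetical, not the paper's): a conventional score uses only the most likely consequence, while a weighted score averages over every potential consequence.

```python
# Hypothetical hazard: probability distribution over possible consequence
# severities (1 = minor ... 5 = catastrophic). Values are illustrative only.
outcomes = {1: 0.50, 2: 0.25, 3: 0.15, 4: 0.07, 5: 0.03}
likelihood = 4  # probability score of the hazardous event itself (1-5 scale)

# Conventional scoring: take only the most likely consequence.
most_likely = max(outcomes, key=outcomes.get)
conventional_risk = likelihood * most_likely

# Weighted scoring: average severity over *all* potential consequences.
weighted_severity = sum(s * p for s, p in outcomes.items())
weighted_risk = likelihood * weighted_severity

print(conventional_risk)        # 4    -- ignores the severe tail
print(round(weighted_risk, 2))  # 7.52 -- reflects all outcomes
```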
Procedia PDF Downloads 134
19196 A Study on the Performance of 2-PC-D Classification Model
Authors: Nurul Aini Abdul Wahab, Nor Syamim Halidin, Sayidatina Aisah Masnan, Nur Izzati Romli
Abstract:
There are many applications of the principal component method for reducing a large set of variables in various fields, and Fisher's Discriminant function is a popular tool for classification. In this research, we study the performance of a Principal Component-Fisher's Discriminant function in classifying rice kernels into their defined classes. The data were collected on the smell, or odour, of the rice kernels using an odour-detection sensor, the Cyranose; 32 variables were captured by this electronic nose (e-nose). The objective of this research is to measure how well a combination model, between principal components and a linear discriminant, performs as a classification model. The principal component method was used to reduce the 32 variables to a smaller and more manageable set of components, and the reduced components were then used to develop the Fisher's Discriminant function. There are four defined classes of rice kernel: Aromatic, Brown, Ordinary and Others. Based on the output of the principal component method, the 32 variables were reduced to only 2 components. Based on the classification table from the discriminant analysis, 40.76% of the total observations were correctly classified into their classes by the PC-Discriminant function; in other words, the model misclassified more than 50% of the observations. In conclusion, the Fisher's Discriminant function built on 2 components from PCA (2-PC-D) is not satisfactory for classifying the rice kernels into their defined classes.
Keywords: classification model, discriminant function, principal component analysis, variable reduction
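A minimal sketch of the 2-PC-D pipeline in Python (synthetic data stands in for the 32-variable Cyranose readings; class labels are random here, so the accuracy is near chance):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the 32-variable e-nose readings (4 rice classes).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 32))
y = rng.integers(0, 4, size=200)  # Aromatic, Brown, Ordinary, Others

# 2-PC-D: reduce 32 variables to 2 principal components,
# then classify with Fisher's linear discriminant.
model = make_pipeline(PCA(n_components=2), LinearDiscriminantAnalysis())
model.fit(X, y)
print(f"apparent accuracy: {model.score(X, y):.2%}")
```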
Procedia PDF Downloads 330
19195 Quantum Fisher Information of Bound Entangled W-Like States
Authors: Fatih Ozaydin
Abstract:
Quantum Fisher information (QFI) is a multipartite entanglement witness, and recently it has been studied extensively with separability and entanglement in focus. On the other hand, bound entanglement is a special phenomenon observed in mixed entangled states. In this work, we study the QFI of W states under a four-dimensional entanglement binding channel. Starting with initially pure W states of several qubits, we find how the QFI decreases as two qubits of the W state are subjected to entanglement binding. We also show that as the size of the W state increases, the effect of entanglement binding decreases.
Keywords: Quantum Fisher information, W states, bound entanglement, entanglement binding
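For background (standard results, not specific to this paper): for a pure state and a collective spin operator, the QFI reduces to four times the variance, and exceeding the shot-noise bound witnesses multipartite entanglement:

```latex
F_Q\big[\, |\psi\rangle,\, J_{\vec n} \,\big] = 4\, (\Delta J_{\vec n})^2,
\qquad
J_{\vec n} = \tfrac{1}{2} \sum_{k=1}^{N} \vec n \cdot \vec\sigma^{(k)},
\qquad
\frac{F_Q}{N} > 1 \;\Rightarrow\; \text{entanglement}
```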
Procedia PDF Downloads 481
19194 An Information System for Strategic Performance Scoring in Municipal Management
Authors: Emin Gundogar, Aysegul Yilmaz
Abstract:
Strategic performance scoring is a significant procedure in management. There are various methods to improve this procedure. This study introduces an information system that is developed to score performance for municipal management. The application of the system is clarified by exemplifying municipal processes.
Keywords: management information system, municipal management, performance scoring
Procedia PDF Downloads 767
19193 Application of Scoring Rubrics by Lecturers towards Objective Assessment of Essay Questions in the Department of Social Science Education, University of Calabar, Nigeria
Authors: Donald B. Enu, Clement O. Ukpor, Abigail E. Okon
Abstract:
Unreliable scoring of students' performance by lecturers short-changes students' assessment by underequipping the school authority with the facts intended by society through the curriculum; the learners, the school and the society are thus cheated, because the usefulness of testing is defeated. This study therefore examined lecturers' scoring objectivity for essay items in the Department of Social Science Education, University of Calabar, Nigeria. Specifically, it assessed lecturers' perception of the relevance of scoring rubrics and their level of application. Data were collected from all 36 lecturers in the Department (28 members and 8 non-members attached to the department) through a 20-item questionnaire and checklist instruments. A case-study design was adopted. Descriptive statistics of frequency counts, weighted means, standard deviations, and percentages were used to analyze the data gathered. A mean score of 2.5, or 60 percent and above, formed the acceptance or significance level in decision taking. It was found that lecturers perceived the use of scoring rubrics as a relevant practice to ensure fair and reliable treatment of examinees' scripts, particularly in marking essay items, and that there is a moderately high level of adherence to the application of scoring rubrics. It was also observed that some criteria necessary for scoring objectivity of essay items were not fully in place in the department. It was strongly recommended that students' identities be hidden while marking, that a pre-determined marking scheme be prepared centrally and strictly adhered to during marking and recording of scores, and that conference marking be enforced in the department.
Keywords: essay items, objective scoring, scorers reliability, scoring rubrics
Procedia PDF Downloads 179
19192 A Multi-Dimensional Neural Network Using the Fisher Transform to Predict the Price Evolution for Algorithmic Trading in Financial Markets
Authors: Cristian Pauna
Abstract:
Trading the financial markets is a widespread activity today. A large number of investors, companies, and public or private funds are buying and selling every day in order to make a profit. Algorithmic trading has been the prevalent method of making trade decisions since the advent of electronic trading: orders are sent almost instantly by computers using mathematical models. This paper presents a price prediction methodology based on a multi-dimensional neural network. Using the Fisher transform, the neural network is trained in a low-latency, auto-adaptive process in order to predict the price evolution for the next period of time. The model is designed especially for algorithmic trading and uses real-time price series. It was found that the characteristics of the Fisher function applied at the node scale level can generate reliable trading signals within the neural network methodology. Real-time tests showed that this method can be applied in any timeframe to trade the financial markets. The paper also includes the steps to implement the presented methodology in an automated trading system, and real trading results are displayed and analyzed in order to qualify the model. In conclusion, the compared results reveal that the neural network methodology, applied together with the Fisher transform at the node level, can generate good price predictions and build reliable trading signals for algorithmic trading.
Keywords: algorithmic trading, automated trading systems, financial markets, high-frequency trading, neural network
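For reference, the Fisher transform commonly used in technical analysis maps a series normalized into (-1, 1) through y = 0.5·ln((1+x)/(1-x)), making the output distribution closer to Gaussian and sharpening turning points. A minimal Python sketch (the rolling min-max normalization is an assumption; the paper's node-level application is not reproduced here):

```python
import numpy as np

def fisher_transform(prices, window=10):
    """Fisher transform of prices normalized into (-1, 1) over a rolling window."""
    prices = np.asarray(prices, dtype=float)
    out = np.full(prices.shape, np.nan)
    for t in range(window - 1, len(prices)):
        lo = prices[t - window + 1 : t + 1].min()
        hi = prices[t - window + 1 : t + 1].max()
        # Position of the last price inside the window range, rescaled to (-1, 1)
        x = 0.0 if hi == lo else 2 * (prices[t] - lo) / (hi - lo) - 1
        x = np.clip(x, -0.999, 0.999)  # keep the log argument finite
        out[t] = 0.5 * np.log((1 + x) / (1 - x))
    return out

print(fisher_transform(np.sin(np.linspace(0, 6, 40)) + 2)[-5:])
```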
Procedia PDF Downloads 160
19191 Review and Comparison of Iran's Sixteenth Topic of the Building with the Ranking System of the Water Sector Lead to Improve the Criteria of the Sixteenth Topic
Authors: O. Fatemi
Abstract:
Considering the growing building construction industry in developing countries, the concept of sustainable development, and the importance of taking care of future generations, codifying building scoring systems based on environmental criteria has always been a subject for discussion. The existing systems cannot be used in all regions for several reasons, including but not limited to the variety of regional variables. In this article, the most important global environmental scoring systems, LEED (Leadership in Energy and Environmental Design), BREEAM (Building Research Establishment Environmental Assessment Method) and CASBEE (Comprehensive Assessment System for Built Environment Efficiency), used in the USA, the UK, and Japan, respectively, are discussed and compared, with a special focus on CASBEE, with respect to credit assignment (weighting and scoring systems) as well as the sustainable development criteria in each system. Then, the converging and distinct fields of the foregoing systems are examined against the National Iranian Building Code, and the common credits in the said systems not mentioned in the Code are identified. These credits, which generally belong to well-known fundamental principles of sustainable development, may be considered as candidate options for an Iranian building environmental scoring system. It is suggested that one of the globally accepted systems be chosen according to national priorities in order to offer an effective method for environmental scoring of buildings, and then a part of the credits be added and/or removed, or certain credit scores changed, so that a new scoring system with a new title is eventually developed for the country. Evidently, the building construction industry highly affects the environment, economy, efficiency, and health of the occupants.
Keywords: scoring system, sustainability assessment, water efficiency, national Iranian building code
Procedia PDF Downloads 181
19190 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering
Authors: Emiel Caron
Abstract:
Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about the scientific references cited by patents; for example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g. to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references: they are often ambiguous, unstructured, and incomplete, and a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Thanks to the scoring, different rules can be combined to join scientific references, i.e. the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set with highly cited papers, shows on average 99% precision and 95% recall. The method is therefore accurate but careful, i.e. it weighs precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g. in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics
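A toy sketch of the scheme in Python (the rules, weights, and threshold here are invented for illustration, not the paper's values; single-linkage clusters at a fixed threshold are exactly the connected components of the match graph):

```python
from difflib import SequenceMatcher
import networkx as nx  # single-linkage clusters = connected components

# Toy raw references; in practice the fields are extracted from the single
# raw attribute by regular expressions.
refs = [
    {"id": 0, "title": "fisher scoring in glms", "year": "1998"},
    {"id": 1, "title": "fisher scoring in g.l.m.s", "year": "1998"},
    {"id": 2, "title": "deep learning", "year": "2015"},
]

def pair_score(a, b):
    """Rule-based score: rules reinforce each other by summing."""
    score = 0
    if a["year"] == b["year"]:
        score += 30                                  # matching-year rule
    sim = SequenceMatcher(None, a["title"], b["title"]).ratio()
    if sim > 0.8:
        score += 70                                  # title-similarity rule
    return score

THRESHOLD = 80
g = nx.Graph()
g.add_nodes_from(r["id"] for r in refs)
for i in range(len(refs)):
    for j in range(i + 1, len(refs)):
        if pair_score(refs[i], refs[j]) >= THRESHOLD:
            g.add_edge(i, j)

print(list(nx.connected_components(g)))  # [{0, 1}, {2}]
```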
Procedia PDF Downloads 193
19189 Improving Detection of Illegitimate Scores and Assessment in Most Advantageous Tenders
Authors: Hao-Hsi Tseng, Hsin-Yun Lee
Abstract:
The Most Advantageous Tender (MAT) has been criticized for its susceptibility to dictatorial situations and for its handling of same-score, same-rank issues. This study applies the four criteria from Arrow's Impossibility Theorem to construct a mechanism for revealing illegitimate scores in scoring methods. While commonly used to mitigate problems resulting from extreme scores, ranking methods hide significant defects that adversely affect selection fairness. To address these shortcomings, this study relies mainly on the overall evaluated score method, using standardized scores plus a normal cumulative distribution function conversion to calculate the evaluation of vendor preference. This allows for free score evaluations, which reduces the influence of dictatorial behavior and avoids same-score, same-rank issues. Large-scale simulations confirm that this method outperforms currently used methods under the Impossibility Theorem criteria.
Keywords: Arrow's impossibility theorem, cumulative normal distribution function, most advantageous tender, scoring method
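The standardize-then-CDF conversion is straightforward; a minimal Python sketch with invented raw scores:

```python
import numpy as np
from scipy.stats import norm

# Raw scores given by one evaluator to four tenderers (illustrative values).
raw = np.array([62.0, 75.0, 88.0, 97.0])

# Standardize within the evaluator, then map through the normal CDF so an
# extreme (possibly dictatorial) score cannot dominate the aggregate.
z = (raw - raw.mean()) / raw.std(ddof=0)
preference = norm.cdf(z)

print(np.round(z, 2))           # standardized scores
print(np.round(preference, 3))  # bounded (0, 1) preference values
```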
Procedia PDF Downloads 462
19188 Object-Scene: Deep Convolutional Representation for Scene Classification
Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang
Abstract:
Traditional image classification is based on an encoding scheme (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) over low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) carry richer information but lack geometric invariance. In scene classification, scattered objects differ in size, category, layout, and number, so it is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs, trained on the ImageNet and Places datasets separately, are used as pre-trained models to extract deep convolutional features at multiple scales, producing dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges; a scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and the representations are then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences; hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at each. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, a simple yet efficient way to classify the scene categories. Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which proves that the representation can be applied to other visual recognition tasks.
Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization
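For reference, a compact Python sketch of the standard Fisher Vector encoding over a diagonal-covariance GMM, with the usual power and L2 normalizations; random features stand in for the multi-scale CNN activations used in the paper:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fisher_vector(descriptors, gmm):
    """First- and second-order Fisher Vector statistics for a diagonal GMM."""
    n, d = descriptors.shape
    q = gmm.predict_proba(descriptors)            # posteriors, shape (n, K)
    mu, var, w = gmm.means_, gmm.covariances_, gmm.weights_
    fv = []
    for k in range(gmm.n_components):
        diff = (descriptors - mu[k]) / np.sqrt(var[k])
        g_mu = (q[:, k, None] * diff).sum(0) / (n * np.sqrt(w[k]))
        g_var = (q[:, k, None] * (diff**2 - 1)).sum(0) / (n * np.sqrt(2 * w[k]))
        fv.extend([g_mu, g_var])
    fv = np.concatenate(fv)
    fv = np.sign(fv) * np.sqrt(np.abs(fv))        # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)      # L2 normalization

rng = np.random.default_rng(0)
local_feats = rng.normal(size=(500, 64))          # stand-in for CNN activations
gmm = GaussianMixture(n_components=8, covariance_type="diag").fit(local_feats)
print(fisher_vector(local_feats, gmm).shape)      # (2 * 8 * 64,) = (1024,)
```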
Procedia PDF Downloads 331
19187 A Robust Spatial Feature Extraction Method for Facial Expression Recognition
Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda
Abstract:
This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher Discriminant Analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification but also reduces the feature space dimensions of the pattern samples. In this method, each gray-scale image is first considered in its entirety as the measurement matrix; the principal components (PCs) of its row vectors and the variance of these row vectors along the PCs are then estimated. This method therefore preserves the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved: FDA defines an inter-class scatter matrix and an intra-class scatter matrix so as to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match a test image with the training set, a cosine-similarity-based Bayesian classifier was used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods.
Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, f-measure
Procedia PDF Downloads 423
19186 Navigating the Complexity of Guillain-Barré Syndrome and Miller Fisher Syndrome Overlap Syndrome: A Pediatric Case Report
Authors: Kamal Chafiq, Youssef Hadzine, Adel Elmekkaoui, Othmane Benlenda, Houssam Rajad, Soukaina Wakrim, Hicham Nassik
Abstract:
Guillain-Barré syndrome/Miller Fisher syndrome (GBS/MFS) overlap syndrome is an extremely rare variant of Guillain-Barré syndrome (GBS) in which Miller Fisher syndrome (MFS) coexists with other characteristics of GBS, such as limb weakness, paresthesia, and facial paralysis. We report the clinical case of a 12-year-old patient, with no pathological history, who acutely presented with ophthalmoplegia, areflexia, facial diplegia, and swallowing and phonation disorders, followed by progressive, descending, and symmetrical paresis affecting first the upper limbs and then the lower limbs. Albuminocytological dissociation was found on cerebrospinal fluid study. Magnetic resonance imaging of the spinal cord showed enhancement and thickening of the cauda equina roots. The patient was treated with immunoglobulins, with a favorable clinical outcome.
Keywords: Guillain-Barré syndrome, Miller Fisher syndrome, overlap syndrome, anti-GQ1b antibodies
Procedia PDF Downloads 76
19185 Prediction and Analysis of Human Transmembrane Transporter Proteins Based on SCM
Authors: Hui-Ling Huang, Tamara Vasylenko, Phasit Charoenkwan, Shih-Hsiang Chiu, Shinn-Ying Ho
Abstract:
Knowledge of human transporters is still limited due to the technically demanding procedure of crystallization required for their structural characterization by spectroscopic methods. It is therefore desirable to develop bioinformatics tools for effective analysis of available sequences in order to identify human transmembrane transporter proteins (HMTPs). This study proposes a scoring card method (SCM) based approach for predicting HMTPs. We estimated a set of propensity scores of dipeptides to be HMTPs using SCM on the training dataset (HTS732), consisting of 366 HMTPs and 366 non-HMTPs. SCM, using the estimated propensity scores of 20 amino acids and 400 dipeptides to be HMTPs, has a training accuracy of 87.63% and a test accuracy of 66.46%. The five top-ranked dipeptides are LD, NV, LI, KY, and MN, with scores 996, 992, 989, 987, and 985, respectively. The five amino acids with the highest propensity scores are Ile, Phe, Met, Gly, and Leu; hydrophobic residues are thus mostly highly scored. Furthermore, the obtained propensity scores were used to analyze the physicochemical properties of human transporters.
Keywords: dipeptide composition, physicochemical property, human transmembrane transporter proteins, human transmembrane transporters binding propensity, scoring card method
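A minimal Python sketch of SCM-style scoring, seeded with the five dipeptide scores reported above (the remaining 395 dipeptide scores are not published in the abstract, so a neutral placeholder stands in for them):

```python
# Propensity scores from the abstract for the five top-ranked dipeptides;
# a full SCM model carries scores for all 400 dipeptides and 20 amino acids,
# estimated from the training set. Those are unknown here, so the rest
# default to a hypothetical neutral value.
propensity = {"LD": 996, "NV": 992, "LI": 989, "KY": 987, "MN": 985}
NEUTRAL = 500  # placeholder for the unpublished scores

def scm_score(sequence):
    """Average dipeptide propensity of a protein sequence (SCM-style)."""
    dipeptides = [sequence[i:i + 2] for i in range(len(sequence) - 1)]
    return sum(propensity.get(dp, NEUTRAL) for dp in dipeptides) / len(dipeptides)

print(scm_score("MLDNVLIKYMN"))  # higher score -> more HMTP-like
```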
Procedia PDF Downloads 368
19184 A Medical Vulnerability Scoring System Incorporating Health and Data Sensitivity Metrics
Authors: Nadir A. Carreon, Christa Sonderer, Aakarsh Rao, Roman Lysecky
Abstract:
With the advent of complex software and increased connectivity, the security of life-critical medical devices is becoming an increasing concern, particularly given their direct impact on human safety. Security is essential, but it is impossible to develop completely secure and impenetrable systems at design time. Therefore, it is important to assess the potential impact on security and safety of exploiting a vulnerability in such critical medical systems. The common vulnerability scoring system (CVSS) calculates the severity of exploitable vulnerabilities; however, for medical devices it does not consider the unique challenges of impacts to human health and privacy. Thus, a medical device on which human life depends (e.g., pacemakers, insulin pumps) can score very low, while a system on which human life does not depend (e.g., hospital archiving systems) might score very high. In this paper, we propose a medical vulnerability scoring system (MVSS) that extends CVSS to address the health and privacy concerns of medical devices. We propose incorporating two new parameters, namely health impact and sensitivity impact. Sensitivity refers to the type of information that can be stolen from the device, and health represents the impact on the safety of the patient if the vulnerability is exploited (e.g., potential harm, life-threatening). We evaluate fifteen different known vulnerabilities in medical devices and compare MVSS against two state-of-the-art medical-device-oriented vulnerability scoring systems and the foundational CVSS.
Keywords: common vulnerability system, medical devices, medical device security, vulnerabilities
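To illustrate the idea only (the weights, impact categories, and blending rule below are invented for this sketch, not the MVSS definition from the paper), here is how two extra parameters could adjust a CVSS-style base score:

```python
# Hypothetical weightings -- illustrative, not the paper's MVSS parameters.
HEALTH_WEIGHT = {"none": 0.0, "harm": 0.5, "life-threatening": 1.0}
SENSITIVITY_WEIGHT = {"none": 0.0, "personal": 0.5, "medical": 1.0}

def mvss_score(cvss_base, health, sensitivity, alpha=2.0, beta=1.0):
    """Hypothetical blend: raise the score for health/privacy impact, cap at 10."""
    adjusted = (cvss_base
                + alpha * HEALTH_WEIGHT[health]
                + beta * SENSITIVITY_WEIGHT[sensitivity])
    return min(round(adjusted, 1), 10.0)

# An insulin-pump flaw vs. an archiving-system flaw with equal CVSS base:
print(mvss_score(6.0, "life-threatening", "medical"))  # 9.0
print(mvss_score(6.0, "none", "personal"))             # 6.5
```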
Procedia PDF Downloads 165
19183 Development of a Rating Scale for Elementary EFL Writing
Authors: Mohammed S. Assiri
Abstract:
In EFL programs, rating scales used in writing assessment are often constructed by intuition. Intuition-based scales tend to provide inaccurate and divisive ratings of learners' writing performance. Hence, following an empirical approach, this study attempted to develop a rating scale for elementary-level writing at an EFL program in Saudi Arabia. Towards this goal, 98 students' essays were scored and then coded using a comprehensive taxonomy of writing constructs and their measures. Automatic linear modeling was run to find out which measures would best predict essay scores. A nonparametric ANOVA, the Kruskal-Wallis test, was then used to determine which measures could best differentiate among scoring levels. Findings indicated that certain measures could serve as good predictors of essay scores, as good differentiators among scoring levels, or both. The main conclusion was that a rating scale can be empirically developed using predictive and discriminative statistical tests.
Keywords: analytic scoring, rating scales, writing assessment, writing constructs, writing performance
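The differentiation step can be reproduced in a few lines; a sketch with invented values for one hypothetical measure, using SciPy's Kruskal-Wallis test:

```python
from scipy.stats import kruskal

# Hypothetical values of one candidate measure (e.g., clause density)
# grouped by scoring level; real values would come from the 98 coded essays.
level_1 = [0.8, 1.0, 0.9, 1.1]
level_2 = [1.4, 1.3, 1.5, 1.2]
level_3 = [1.9, 2.1, 1.8, 2.0]

h, p = kruskal(level_1, level_2, level_3)
print(f"H = {h:.2f}, p = {p:.4f}")  # small p -> measure separates the levels
```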
Procedia PDF Downloads 462
19182 Syntax and Words as Evolutionary Characters in Comparative Linguistics
Authors: Nancy Retzlaff, Sarah J. Berkemer, Trudie Strauss
Abstract:
In the last couple of decades, the digitalization of all kinds of data has probably been one of the major advances across fields of study. It paves the way for analysing data computationally even in disciplines with no initial computational necessity to do so. Linguistics, especially, has a rather manual tradition; still, studies that involve the history of language families show striking similarities to bioinformatic (phylogenetic) approaches. Alignments of words are a fairly well-studied example of applying bioinformatics methods to historical linguistics. In this paper we consider not only alignments of strings, i.e. words, but also alignments of syntax trees of selected Indo-European languages. Based on initial, crude alignments, a sophisticated scoring model is trained on both letters and syntactic features. The aim is to gain a better understanding of which features in two languages are related, i.e. most likely to share the same root. Initially, all words in two languages are pre-aligned with a basic scoring model that primarily selects consonants and adjusts them before fitting in the vowels. Mixture models are subsequently used to filter 'good' alignments, depending on the alignment length and the number of inserted gaps. Using these selected word alignments, it is possible to align the given syntax trees and consequently find sentences that correspond rather well to each other across languages. The syntax alignments are then filtered for meaningful scores; 'good' scores contain evolutionary information and are therefore used to train the sophisticated scoring model. Further iterations of alignment and training are performed until the scoring model saturates, i.e. barely changes anymore. An evaluation of the trained scoring model and of how it captures evolutionarily meaningful information is given, along with an assessment of sentence alignment compared to possible phrase structure. The method described here may have flaws owing to limited prior information; it may, however, offer a good starting point for studying languages where only little prior knowledge is available and a detailed, unbiased study is needed.
Keywords: alignments, bioinformatics, comparative linguistics, historical linguistics, statistical methods
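A toy version of the consonant-first pre-alignment in Python (the scores and gap penalty are invented; the paper's trained model replaces this hand-set scheme). It is standard Needleman-Wunsch dynamic programming with a bonus for consonant matches:

```python
def is_consonant(c):
    return c.isalpha() and c.lower() not in "aeiou"

def score(a, b):
    """Crude pre-alignment score: consonant matches anchor the alignment."""
    if a == b:
        return 3 if is_consonant(a) else 1
    return -1

def align(w1, w2, gap=-2):
    """Needleman-Wunsch global alignment score of two cognate candidates."""
    n, m = len(w1), len(w2)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i][j] = max(dp[i - 1][j - 1] + score(w1[i - 1], w2[j - 1]),
                           dp[i - 1][j] + gap,
                           dp[i][j - 1] + gap)
    return dp[n][m]

print(align("pater", "vater"))   # Latin/German 'father': consonants align well
print(align("pater", "mother"))  # unrelated-looking pair scores lower
```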
Procedia PDF Downloads 152
19181 Labyrinthine Venous Vasculature Ablation for the Treatment of Sudden Sensorineural Hearing Loss: Two Case Reports
Authors: Kritin K. Verma, Bailey Duhon, Patrick W. Slater
Abstract:
Objective: To introduce the possible etiological role of the labyrinthine venous vasculature (LVV) in venous congestion of the cochlear system in sudden sensorineural hearing loss (SSNHL) patients. Patients: Two patients (a 62-year-old female and a 50-year-old male) presented within twenty-four hours of onset of SSNHL. Intervention: Following failed conservative and salvage techniques, the patients underwent ablation of the labyrinthine venous vasculature ipsilateral to the side of the loss. Main Outcome Measures: Improvement of SSNHL based on improvement of pure-tone audiometric (PTA) low-tone averages at 250, 500, and 1000 Hz; word recognition scoring using the NU-6 word list was used to assess quality of life. Results: Case 1 experienced a 51.7 dB improvement in low-tone PTA and an increased word recognition score of 90%. Case 2 experienced a 33.4 dB improvement in low-tone PTA and a 60% increase in word recognition score. No major complications were noted. Conclusion: Two patients experienced significant improvement in their low-tone PTA and word recognition scores following labyrinthine venous vasculature ablation.
Keywords: case report, sudden sensorineural hearing loss, venous congestion, vascular ablation
Procedia PDF Downloads 135
19180 Text Based Shuffling Algorithm on Graphics Processing Unit for Digital Watermarking
Authors: Zayar Phyo, Ei Chaw Htoon
Abstract:
In a New-LSB based steganography method, the Fisher-Yates algorithm is used to randomly permute an existing array. However, that algorithm's performance becomes slow, and memory overflow problems occur, when processing images of large dimensions. The Text-Based Shuffling algorithm therefore selects only the necessary pixels as hiding positions at specific locations of an image, according to the length of the input text. In this paper, an enhanced text-based shuffling algorithm is presented, powered by the GPU to achieve better performance. The proposed algorithm employs the OpenCL Aparapi framework, along with an XORShift kernel serving as the Pseudo-Random Number Generator (PRNG) kernel; the PRNG produces random numbers inside the OpenCL kernel. Experiments show that the GPU implementation achieves faster processing speed and better efficiency, without disruption from unnecessary operating system tasks.
Keywords: LSB based steganography, Fisher-Yates algorithm, text-based shuffling algorithm, OpenCL, XORShift kernel
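A CPU-side Python sketch of the ingredients (the GPU/Aparapi kernel itself is not reproduced): a classic XORShift32 generator with Marsaglia's (13, 17, 5) shift triple, the full-array Fisher-Yates shuffle it replaces, and a text-based variant that draws only as many pixel positions as the message needs:

```python
def xorshift32(seed):
    """Minimal XORShift PRNG, the generator family used inside the kernels."""
    state = seed & 0xFFFFFFFF
    while True:
        state ^= (state << 13) & 0xFFFFFFFF
        state ^= state >> 17
        state ^= (state << 5) & 0xFFFFFFFF
        yield state

def fisher_yates(items, rng):
    """Classic in-place Fisher-Yates shuffle of the *whole* array."""
    for i in range(len(items) - 1, 0, -1):
        j = next(rng) % (i + 1)
        items[i], items[j] = items[j], items[i]
    return items

def text_based_positions(n_pixels, text_len, rng):
    """Text-based variant: draw only as many distinct pixel positions as the
    message needs, instead of permuting every pixel (illustrative)."""
    chosen, seen = [], set()
    while len(chosen) < text_len:
        p = next(rng) % n_pixels
        if p not in seen:
            seen.add(p)
            chosen.append(p)
    return chosen

rng = xorshift32(2463534242)
print(fisher_yates(list(range(10)), rng))
print(text_based_positions(512 * 512, 16, rng))  # 16 hiding positions only
```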
Procedia PDF Downloads 148
19179 Automatic Classification for the Degree of Disc Narrowing from X-Ray Images Using CNN
Authors: Kwangmin Joo
Abstract:
An automatic lumbar vertebra detection and classification method is proposed for evaluating the degree of disc narrowing. Prior to classification, deep-learning-based segmentation is applied to detect the individual lumbar vertebrae: M-net is applied to segment the five lumbar vertebrae, and fine-tuning segmentation is employed to improve the segmentation accuracy. Using the features extracted in the previous step, k-means clustering is applied to estimate the degree of disc space narrowing under a four-grade scoring system. As a preliminary study, the techniques proposed in this research could help build an automatic scoring system to diagnose the severity of disc narrowing from X-ray images.
Keywords: disc space narrowing, degenerative disc disorders, deep learning based segmentation, clustering technique
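A minimal sketch of the grading step in Python (the disc-space features are invented stand-ins for whatever the segmentation step actually produces):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical disc-space features (e.g., normalized inter-vertebral heights)
# extracted from the segmented vertebrae; real values come from the M-net step.
rng = np.random.default_rng(3)
heights = np.concatenate([rng.normal(m, 0.02, 25) for m in (0.30, 0.22, 0.15, 0.08)])

# Four clusters for the four-grade disc narrowing scoring system.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(heights.reshape(-1, 1))

# Order cluster labels so grade 0 = widest disc space, grade 3 = narrowest.
order = np.argsort(-km.cluster_centers_.ravel())
grade = {c: g for g, c in enumerate(order)}
print([grade[c] for c in km.labels_[:5]])  # grades of the first five discs
```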
Procedia PDF Downloads 124
19178 Computerized Adaptive Testing for Ipsative Tests with Multidimensional Pairwise-Comparison Items
Authors: Wen-Chung Wang, Xue-Lan Qiu
Abstract:
Ipsative tests have been widely used in vocational and career counseling (e.g., the Jackson Vocational Interest Survey), and pairwise-comparison items are a typical item format of ipsative tests. When the two statements in a pairwise-comparison item measure two different constructs, the item is referred to as a multidimensional pairwise-comparison (MPC) item. A typical MPC item would be: Which activity do you prefer? (A) playing with young children, or (B) working with tools and machines. These two statements aim at the constructs of social interest and investigative interest, respectively. Recently, new item response theory (IRT) models for ipsative tests with MPC items have been developed. Among them, the Rasch ipsative model (RIM) deserves special attention because it has good measurement properties: the log-odds of preferring statement A to statement B are defined as a competition between two parts, the sum of a person's latent trait on the construct statement A measures and statement A's utility, versus the sum of the person's latent trait on the construct statement B measures and statement B's utility. The RIM has been extended to polytomous responses, such as preferring statement A strongly, preferring statement A, preferring statement B, and preferring statement B strongly. To promote these new initiatives, in this study we developed computerized adaptive testing algorithms for MPC items and evaluated their performance using simulations and two real tests. Both the RIM and its polytomous extension are multidimensional, which calls for multidimensional computerized adaptive testing (MCAT). A particular issue in MCAT for MPC items is within-person statement exposure (WPSE); that is, a respondent may keep seeing the same statement (e.g., my life is empty) many times, which is certainly annoying. In this study, we implemented two methods to control the WPSE rate. In the first control method, items were frozen when their statements had been administered more than a prespecified number of times. In the second control method, a random component was added to control the contribution of the information at different stages of the MCAT; this second method was found to outperform the first in our simulation studies. In addition, we investigated four item selection methods: (a) random selection (as a baseline), (b) the maximum Fisher information method without WPSE control, (c) the maximum Fisher information method with the first control method, and (d) the maximum Fisher information method with the second control method. These four methods were applied to two real tests: a work survey with dichotomous MPC items and a career interests survey with polytomous MPC items. There were three dependent variables: the bias and root mean square error across person measures, and measurement efficiency, defined as the number of items needed to achieve the same degree of test reliability. Both applications indicated that the proposed MCAT algorithms were successful and that there was no loss in measurement efficiency when the control methods were implemented; among the four methods, the last performed best.
Keywords: computerized adaptive testing, ipsative tests, item response theory, pairwise comparison
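In symbols, the RIM response function described above is, for a dichotomous MPC item pairing constructs a and b:

```latex
\log \frac{P(A \succ B)}{P(B \succ A)} = (\theta_a + \delta_A) - (\theta_b + \delta_B)
```

where θ_a, θ_b are the person's latent traits on the two constructs and δ_A, δ_B are the statement utilities. Writing p = P(A ≻ B), the item information along the contrast θ_a − θ_b is p(1 − p), as for any Rasch-type dichotomous item, so maximum-Fisher-information selection favors items whose two competing sums are close.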
Procedia PDF Downloads 246
19177 Rubric in Vocational Education
Authors: Azmanirah Ab Rahman, Jamil Ahmad, Ruhizan Muhammad Yasin
Abstract:
Rubrics are important tools for both teachers and students, serving a variety of purposes: teachers use rubrics to evaluate student work, while students use them for self-assessment. This paper therefore examines the scoring rubric as a scoring tool for teachers in the Competency Based Education and Training (CBET) environment of Malaysian Vocational Colleges. A total of three teachers in the fields of electrical and electronics engineering were interviewed to identify how rubrics have been used since the vocational transformation was implemented in 2012. Overall, holistic rubrics were used to determine students' performance in the skills area.
Keywords: rubric, vocational education, teachers, CBET
Procedia PDF Downloads 505
19176 Discrimination and Classification of Vestibular Neuritis Using Combined Fisher and Support Vector Machine Model
Authors: Amine Ben Slama, Aymen Mouelhi, Sondes Manoubi, Chiraz Mbarek, Hedi Trabelsi, Mounir Sayadi, Farhat Fnaiech
Abstract:
Vertigo is a sensation of feeling off balance; the cause of this symptom is very difficult to interpret and requires complementary examination. Vertigo is generally caused by an ear problem; some of the most common causes include benign paroxysmal positional vertigo (BPPV), Meniere's disease, and vestibular neuritis (VN). In clinical practice, different tests of the videonystagmographic (VNG) technique are used to detect the presence of vestibular neuritis. The topographical diagnosis of this disease shows large diversity in its characteristics, which poses problems for the usual etiological analysis methods. In this study, a vestibular neuritis analysis method using videonystagmography is proposed, based on an estimation of pupil movements during uncontrolled motion, to obtain efficient and reliable diagnosis results. First, the pupil displacement vectors are estimated using the Hough Transform (HT) to approximate the location of the pupil region. Then, temporal and frequency features are computed from the rotation angle variation of the pupil motion. Finally, optimized features are selected using Fisher criterion evaluation for discrimination and classification of the VN disease. Experimental results are analyzed over two categories, normal and pathologic: classifying the reduced features with a Support Vector Machine (SVM) achieves 94% classification accuracy. Compared to recent studies, the proposed expert system is highly effective in resolving the problem of VNG analysis and provides accurate diagnostics for medical devices.
Keywords: nystagmus, vestibular neuritis, videonystagmographic system, VNG, Fisher criterion, support vector machine, SVM
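A minimal Python sketch of Fisher-criterion feature selection followed by an SVM (synthetic features stand in for the temporal/frequency VNG features; the ranking formula is the standard two-class Fisher ratio):

```python
import numpy as np
from sklearn.svm import SVC

def fisher_ratio(feature, labels):
    """Two-class Fisher criterion of one feature:
    (m1 - m2)^2 / (v1 + v2) -- larger means better class separation."""
    a, b = feature[labels == 0], feature[labels == 1]
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var() + 1e-12)

# Synthetic stand-in for the VNG temporal/frequency features (normal vs VN).
rng = np.random.default_rng(4)
X = rng.normal(size=(120, 12))
y = rng.integers(0, 2, 120)
X[y == 1, :3] += 1.5  # make the first three features informative

ratios = np.array([fisher_ratio(X[:, j], y) for j in range(X.shape[1])])
top = np.argsort(ratios)[::-1][:3]          # keep the best-separating features
clf = SVC(kernel="rbf").fit(X[:, top], y)
print(top, f"train acc = {clf.score(X[:, top], y):.2%}")
```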
Procedia PDF Downloads 135
19175 Feature Extraction of MFCC Based on Fisher-Ratio and Correlated Distance Criterion for Underwater Target Signal
Authors: Han Xue, Zhang Lanyue
Abstract:
In order to seek more effective feature extraction technology, a feature extraction method based on MFCC combined with a vector hydrophone is presented in this paper. The sound pressure signal and particle velocity signal of two kinds of ships are processed using MFCC and its differential forms, and the extracted features are fused using the Fisher ratio and the correlated distance criterion. The fused features are then identified by a BP neural network. The results showed that MFCC, first-order differential MFCC, and second-order differential MFCC can serve as effective features for the recognition of underwater targets, and that the fused feature improves the recognition rate. Moreover, the recognition rate for the particle velocity signal is higher than that for the sound pressure signal, reflecting the superiority of vector signal processing.
Keywords: vector information, MFCC, differential MFCC, fusion feature, BP neural network
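A sketch of the feature extraction stage in Python using librosa (a synthetic tone stands in for a hydrophone channel; the Fisher-ratio/correlated-distance fusion itself is only indicated by a comment):

```python
import numpy as np
import librosa

# Synthetic stand-in for a hydrophone pressure channel (1 s at 16 kHz);
# a real recording would be loaded with librosa.load(path, sr=None).
sr = 16000
y = np.sin(2 * np.pi * 300 * np.arange(sr) / sr).astype(np.float32)

mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # static MFCC
d1 = librosa.feature.delta(mfcc)                     # first-order differential
d2 = librosa.feature.delta(mfcc, order=2)            # second-order differential

# Fuse by stacking; a Fisher-ratio / correlated-distance weighting (as in the
# paper) would then rank or reweight these rows before the BP network.
features = np.vstack([mfcc, d1, d2])
print(features.shape)  # (39, n_frames)
```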
Procedia PDF Downloads 527
19174 Screening Methodology for Seismic Risk Assessment of Aging Structures in Oil and Gas Plants
Authors: Mohammad Nazri Mustafa, Pedram Hatami Abdullah, M. Fakhrur Razi Ahmad Faizul
Abstract:
With the issuance of the Malaysian National Annex 2017 as a part of MS EN 1998-1:2015, the seismic mapping of Peninsular Malaysia, as well as Sabah and Sarawak, has undergone changes in the Peak Ground Acceleration (PGA) values. The revision of the PGA has raised concern about the safety of onshore oil and gas structures, as these structures were not designed to accommodate the new PGA values, which are much higher than those used in the original designs. In view of the high number of structures and buildings to be re-assessed, a risk assessment methodology has been developed to prioritize and rank the assets in terms of their criticality against the new seismic loading. To date, such a risk assessment method for onshore oil and gas structures has been lacking, and the main intention of this technical paper is to share the risk assessment methodology and the risk element scoring finalized via the Delphi method. The finalized methodology and the values used to rank the risk elements were established based on years of relevant experience with the subject matter and a series of rigorous discussions with professionals in the industry. The risk scoring is mapped onto the risk matrix (i.e., likelihood of failure, LOF, versus consequence of failure, COF), and hence the overall risk for each asset can be obtained. The overall risk can be used to prioritize and optimize integrity assessment, repair, and strengthening work against the new seismic mapping of the country.
Keywords: methodology, PGA, risk, seismic
Procedia PDF Downloads 150
19173 Critical Behaviour and Field Dependence of Magnetic Entropy Change in K-Doped Manganites Pr₀.₈Na₀.₂₋ₓKₓMnO₃ (x = 0.10 and 0.15)
Authors: H. Ben Khlifa, W. Cheikhrouhou-Koubaa, A. Cheikhrouhou
Abstract:
The orthorhombic Pr₀.₈Na₀.₂₋ₓKₓMnO₃ (x = 0.10 and 0.15) manganites are prepared using solid-state reaction at high temperatures. The critical exponents (β, γ, δ) are investigated through various techniques, such as the modified Arrott plot, the Kouvel-Fisher method, and critical isotherm analysis, based on magnetic measurements recorded around the Curie temperature. The critical exponents, derived from the magnetization data using the Kouvel-Fisher method, are found to be β = 0.32(4) and γ = 1.29(2) at TC ~ 123 K for x = 0.10, and β = 0.31(1) and γ = 1.25(2) at TC ~ 133 K for x = 0.15. The critical exponent values obtained for both samples are comparable to those predicted by the 3D Ising model and have also been verified by the scaling equation of state. Such results demonstrate the existence of ferromagnetic short-range order in our materials. The magnetic entropy changes of these polycrystalline samples, which exhibit a second-order phase transition, are then investigated. A large magnetic entropy change, deduced from isothermal magnetization curves, is observed in our samples, with a peak centered on their respective Curie temperatures (TC). The field dependence of the magnetic entropy change is analyzed and shows a power-law dependence, ΔSmax ≈ a(μ0H)^n, at the transition temperature. The values of n obey the Curie-Weiss law above the transition temperature. It is shown that, for the investigated materials, the magnetic entropy change follows a master-curve behavior: the rescaled magnetic entropy change curves for different applied fields collapse onto a single curve for both samples.
Keywords: manganites, critical exponents, magnetization, magnetocaloric, master curve
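For reference, the Kouvel-Fisher method linearizes the critical behaviour so that β, γ, and TC can be read off straight-line fits, and the field-dependence exponent at TC is commonly related to the critical exponents as follows:

```latex
M_S(T)\left[\frac{dM_S(T)}{dT}\right]^{-1} = \frac{T - T_C}{\beta},
\qquad
\chi_0^{-1}(T)\left[\frac{d\chi_0^{-1}(T)}{dT}\right]^{-1} = \frac{T - T_C}{\gamma},
\qquad
n\big|_{T = T_C} = 1 + \frac{\beta - 1}{\beta + \gamma}
```

Plotting each left-hand side against T yields straight lines with slopes 1/β and 1/γ that intersect the temperature axis at TC; with β ≈ 0.32 and γ ≈ 1.29, the commonly used relation above would give n(TC) ≈ 0.58, consistent with a second-order transition.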
Procedia PDF Downloads 164