Search results for: optimality verification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 597

567 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty

Abstract:

Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The verification and validation process helps qualify the process simulator for its intended purpose, whether that is comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements, to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode; a small sketch of this comparison is given below. A full-scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, with the main participants being engineers/experts from the modeling, process design, and instrumentation and control design teams. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts' comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.
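
As a minimal illustration of this validation step, the sketch below compares a simulated transient against recorded plant data and flags parameters that drift outside an acceptance band; the parameter names, sample values, and the 2% tolerance are our assumptions, not project figures.

```python
import numpy as np

def validate_transient(simulated, reference, rel_tol=0.02):
    """Per-parameter pass/fail for a simulated transient vs. plant data."""
    report = {}
    for name, sim in simulated.items():
        ref = np.asarray(reference[name], float)
        err = np.max(np.abs(np.asarray(sim, float) - ref)
                     / np.maximum(np.abs(ref), 1e-9))
        report[name] = (err <= rel_tol, err)
    return report

# illustrative plant-recorded and simulated transients
reference = {"core_outlet_temp_C": [547, 548, 552, 560, 558],
             "primary_flow_pct":   [100, 98, 93, 90, 91]}
simulated = {"core_outlet_temp_C": [547, 549, 553, 561, 559],
             "primary_flow_pct":   [100, 97, 92, 89, 90]}

for name, (ok, err) in validate_transient(simulated, reference).items():
    print(f"{name}: {'PASS' if ok else 'FAIL'} (max rel. error {err:.3%})")
```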

Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state

Procedia PDF Downloads 233
566 Urban Rail Transit CBTC Computer Interlocking Subsystem Relying on Multi-Template Pen Point Tracking Algorithm

Authors: Xinli Chen, Xue Su

Abstract:

In the urban rail transit CBTC system, interlocking is considered one of the most basic subsystems; it is logically complex and carries high safety requirements. The development and verification of traditional interlocking subsystems are entirely manual processes that rely too heavily on the designer, which often hides many uncertain factors. To solve this problem, this article builds and verifies the model based on the multi-template pen-point tracking algorithm, capturing the main safety attributes and using SCADE for formal verification. Experimental results show that this method helps to improve the quality and efficiency of interlocking software.

Keywords: computer interlocking subsystem, penpoint tracking, communication-based train control system, multi-template tip tracking

Procedia PDF Downloads 133
565 Developed Text-Independent Speaker Verification System

Authors: Mohammed Arif, Abdessalam Kifouche

Abstract:

Speech is a very convenient way of communication between people and machines, and it conveys information about the identity of the talker. Since speaker recognition technology increasingly secures our everyday lives, the objective of this paper is to develop two automatic text-independent speaker verification (TI-SV) systems using low-level spectral features and machine learning methods: (i) the first system is based on a support vector machine (SVM), which has been widely used in voice signal processing for speaker recognition, i.e., verifying the identity of a speaker from his or her voice characteristics; and (ii) the second is based on a Gaussian Mixture Model (GMM) with a Universal Background Model (UBM), which combines features from different sources to complement the SVM-based system.
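
A minimal sketch of the GMM-UBM scoring idea follows; as a simplification (ours, not the authors'), the speaker model is trained directly rather than MAP-adapted from the UBM, and the feature vectors are synthetic stand-ins for cepstral features.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
bg = rng.normal(0.0, 1.0, (2000, 12))     # background population features
spk = rng.normal(0.5, 1.0, (300, 12))     # enrollment features, one speaker

ubm = GaussianMixture(n_components=8, random_state=0).fit(bg)
spk_gmm = GaussianMixture(n_components=8, random_state=0).fit(spk)

def llr(utterance):
    """Average log-likelihood ratio of speaker model vs. UBM."""
    return spk_gmm.score(utterance) - ubm.score(utterance)

genuine = rng.normal(0.5, 1.0, (100, 12))   # trial from the same speaker
impostor = rng.normal(0.0, 1.0, (100, 12))  # trial from someone else
print(f"genuine LLR: {llr(genuine):.2f}, impostor LLR: {llr(impostor):.2f}")
# accept when the LLR exceeds a threshold tuned on development data
```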

Keywords: speaker verification, text-independent, support vector machine, Gaussian mixture model, cepstral analysis

Procedia PDF Downloads 22
564 End to End Monitoring in Oracle Fusion Middleware for Data Verification

Authors: Syed Kashif Ali, Usman Javaid, Abdullah Chohan

Abstract:

In large enterprises, multiple departments use different sorts of information systems and databases according to their needs. These systems are independent and heterogeneous in nature, and sharing information/data between them is not an easy task. The use of middleware technologies has made data sharing between systems much easier. However, monitoring the exchange of data/information between target and source systems for verification purposes is often complex or impossible for the maintenance department, due to security/access privileges on the target and source systems. In this paper, we present our experience with an end-to-end data monitoring approach at the middleware level, implemented in Oracle BPEL for data verification without the help of any monitoring tool.

Keywords: service level agreement, SOA, BPEL, oracle fusion middleware, web service monitoring

Procedia PDF Downloads 454
563 Bias-Corrected Estimation Methods for Receiver Operating Characteristic Surface

Authors: Khanh To Duc, Monica Chiogna, Gianfranco Adimari

Abstract:

With three diagnostic categories, assessment of the performance of diagnostic tests is achieved by the analysis of the receiver operating characteristic (ROC) surface, which generalizes the ROC curve for binary diagnostic outcomes. The volume under the ROC surface (VUS) is a summary index usually employed for measuring the overall diagnostic accuracy. When the true disease status can be exactly assessed by means of a gold standard (GS) test, unbiased nonparametric estimators of the ROC surface and VUS are easily obtained. In practice, unfortunately, disease status verification via the GS test could be unavailable for all study subjects, due to the expensiveness or invasiveness of the GS test. Thus, often only a subset of patients undergoes disease verification. Statistical evaluations of diagnostic accuracy based only on data from subjects with verified disease status are typically biased. This bias is known as verification bias. Here, we consider the problem of correcting for verification bias when continuous diagnostic tests for three-class disease status are considered. We assume that selection for disease verification does not depend on disease status, given test results and other observed covariates, i.e., we assume that the true disease status, when missing, is missing at random. Under this assumption, we discuss several solutions for ROC surface analysis based on imputation and re-weighting methods. In particular, verification bias-corrected estimators of the ROC surface and of VUS are proposed, namely, full imputation, mean score imputation, inverse probability weighting and semiparametric efficient estimators. Consistency and asymptotic normality of the proposed estimators are established, and their finite sample behavior is investigated by means of Monte Carlo simulation studies. Two illustrations using real datasets are also given.
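
The inverse probability weighting idea can be sketched numerically as follows; the data are simulated, the verification propensity is taken as known, and the function name is ours (the paper also covers imputation-based and semiparametric efficient estimators).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1500
d = rng.integers(1, 4, size=n)            # true disease class 1..3
t = rng.normal(d.astype(float), 1.0)      # continuous test result
pi = 1.0 / (1.0 + np.exp(-t))             # P(verified | T): MAR mechanism
v = rng.random(n) < pi                    # verification indicator

def vus_ipw(t, d, v, pi):
    """IPW estimate of P(T1 < T2 < T3) using verified subjects only."""
    sel = [v & (d == c) for c in (1, 2, 3)]
    (t1, w1), (t2, w2), (t3, w3) = [(t[s], 1.0 / pi[s]) for s in sel]
    less12 = (t1[:, None] < t2[None, :]).astype(float)   # n1 x n2
    less23 = (t2[:, None] < t3[None, :]).astype(float)   # n2 x n3
    num = np.sum(w2 * (less12.T @ w1) * (less23 @ w3))
    return num / (w1.sum() * w2.sum() * w3.sum())

print(f"IPW-corrected VUS: {vus_ipw(t, d, v, pi):.3f}")
# the naive complete-case estimate (all weights equal) is biased:
print(f"naive VUS:         {vus_ipw(t, d, v, np.ones(n)):.3f}")
```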

Keywords: imputation, missing at random, inverse probability weighting, ROC surface analysis

Procedia PDF Downloads 387
562 A Formal Verification Approach for Linux Kernel Designing

Authors: Zi Wang, Xinlei He, Jianghua Lv, Yuqing Lan

Abstract:

The kernel, though widely used, is complicated, and errors caused by bugs are often costly. Statistically, more than half of the mistakes occur in the design phase. Thus, we introduce a modeling method, KMVM (Linux Kernel Modeling and Verification Method), based on type theory, for the proper specification and correct exploitation of the kernel. In the model, the kernel is separated into six levels: subsystem, dentry, file, struct, func, and base. Each level is treated as a type, and the types are specified by their structure and relationships. At the same time, we use a demand path to express the function to be implemented. The correctness of the design is verified by recursively checking the type relationships and type existence. The method has been applied to verify the OPEN operation of the VFS (virtual file system) in the Linux kernel. We have also designed and developed a set of secure communication mechanisms in the kernel with verification.
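
The recursive check can be given a small flavor in code; the containment rules and path examples below are hypothetical, since the paper's type relations are richer than a strict level hierarchy.

```python
LEVELS = ["subsystem", "dentry", "file", "struct", "func", "base"]

# hypothetical containment rules: each type may contain the next level down
ALLOWED = {LEVELS[i]: {LEVELS[i + 1]} for i in range(len(LEVELS) - 1)}
ALLOWED["base"] = set()

def check_path(path):
    """Recursively verify that a demand path respects the type relations."""
    if len(path) <= 1:
        return True                      # a single node is trivially well-typed
    head, nxt = path[0], path[1]
    if nxt not in ALLOWED.get(head, set()):
        return False                     # type relationship violated
    return check_path(path[1:])          # recurse on the rest of the path

print(check_path(["subsystem", "dentry", "file"]))   # True
print(check_path(["subsystem", "file"]))             # False: skips a level
```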

Keywords: formal approach, type theory, Linux Kernel, software program

Procedia PDF Downloads 96
561 Triangular Geometric Feature for Offline Signature Verification

Authors: Zuraidasahana Zulkarnain, Mohd Shafry Mohd Rahim, Nor Anita Fairos Ismail, Mohd Azhar M. Arsad

Abstract:

Handwritten signatures are widely accepted as a biometric characteristic for personal authentication. The use of appropriate features plays an important role in determining the accuracy of signature verification; therefore, this paper presents a feature based on a geometrical concept. To achieve this aim, triangle attributes are exploited to design a new feature, since a triangle possesses orientation, angles, and transformations that can improve accuracy. The proposed feature uses a triangulation geometric set comprising the sides, angles, and perimeter of a triangle derived from the center of gravity of a signature image. For classification, a Euclidean distance classifier together with a voting-based classifier is used to detect likely forgeries. This classification process is tested using the proposed triangular geometric feature and selected global features. In an experiment validated on the Grupo de Senales 960 (GPDS-960) signature database, the proposed triangular geometric feature achieves a lower Average Error Rate (AER) of 34%, compared to 43% for the selected global features. In conclusion, the proposed triangular geometric feature proves to be a more reliable feature for accurate signature verification.
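
The triangle attributes themselves are straightforward to compute; the sketch below derives sides, interior angles, and perimeter for a triangle anchored at the center of gravity of the foreground pixels (the exact triangle construction used in the paper is our assumption here).

```python
import numpy as np

def centroid(points):
    """Center of gravity of the signature's foreground pixels."""
    return np.mean(np.asarray(points, float), axis=0)

def triangle_features(p1, p2, p3):
    """Side lengths, interior angles, and perimeter of a triangle."""
    q1, q2, q3 = (np.asarray(p, float) for p in (p1, p2, p3))
    a = np.linalg.norm(q2 - q3)          # side opposite p1
    b = np.linalg.norm(q1 - q3)          # side opposite p2
    c = np.linalg.norm(q1 - q2)          # side opposite p3
    # interior angles via the law of cosines
    A = np.arccos((b**2 + c**2 - a**2) / (2 * b * c))
    B = np.arccos((a**2 + c**2 - b**2) / (2 * a * c))
    C = np.pi - A - B
    return np.array([a, b, c, A, B, C, a + b + c])

# toy usage: the centroid plus two extreme pixels form one triangle;
# verification then compares feature vectors, e.g. by Euclidean distance
pixels = [(10, 12), (40, 15), (25, 60), (12, 55)]
cg = centroid(pixels)
print(np.round(triangle_features(cg, pixels[0], pixels[2]), 3))
```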

Keywords: biometrics, euclidean classifier, features extraction, offline signature verification, voting-based classifier

Procedia PDF Downloads 349
560 Offline Signature Verification in Punjabi Based On SURF Features and Critical Point Matching Using HMM

Authors: Rajpal Kaur, Pooja Choudhary

Abstract:

Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, has the capability to reliably distinguish between an authorized person and an imposter. Signature recognition systems can be categorized as offline (static) and online (dynamic). This paper presents a SURF-feature-based recognition system for offline signatures that is trained with low-resolution scanned signature images. The signature of a person is an important biometric attribute of a human being, which can be used to authenticate human identity. A signature can be handled as an image and recognized using computer vision and HMM techniques. With modern computers, there is a need to develop fast algorithms for signature recognition, and multiple techniques have been proposed, leaving a lot of scope for research. In this paper, offline (static) signature recognition and verification using SURF features with an HMM is proposed, where the signature is captured and presented to the user in an image format. Signatures are verified based on parameters extracted from the signature using various image processing techniques. The offline signature verification and recognition is implemented on the MATLAB platform. The work has been tested and found suitable for its purpose, and the proposed method performs better than other recently proposed methods.

Keywords: offline signature verification, offline signature recognition, signatures, SURF features, HMM

Procedia PDF Downloads 359
559 Offline Signature Verification Using Minutiae and Curvature Orientation

Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee

Abstract:

A signature is a behavioral biometric that is used for authenticating users in most financial and legal transactions. Signatures can be easily forged by skilled forgers; therefore, it is essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person and to increase the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system that indicates whether a signature is genuine or not. The system comprises four phases: (1) the pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and connecting of ridge breaks are applied; (2) the feature extraction phase, in which global and local features are extracted: the local features are minutiae points, curvature orientation, and curve plateau, while the global features are signature area, signature aspect ratio, and Hu moments; (3) the post-processing phase, in which false minutiae are removed; and (4) the classification phase, in which features are enhanced before being fed into the classifier; k-nearest neighbors and support vector machines are used. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state of the art. The accuracy of the proposed system is 92.3%.

Keywords: signature, ridge breaks, minutiae, orientation

Procedia PDF Downloads 122
558 Joint Training Offer Selection and Course Timetabling Problems: Models and Algorithms

Authors: Gianpaolo Ghiani, Emanuela Guerriero, Emanuele Manni, Alessandro Romano

Abstract:

In this article, we deal with a variant of the classical course timetabling problem that has a practical application in many areas of education. In particular, we are interested in high school remedial courses, whose purpose is to provide under-prepared students with the skills necessary to succeed in their studies; a student might be under-prepared in an entire course or only in part of it. The limited availability of funds, as well as the limited amount of time and teachers at disposal, often requires schools to choose which courses and/or which teaching units to activate. Thus, schools need to model the training offer and the related timetabling, with the goal of ensuring the highest possible teaching quality while meeting the above-mentioned financial, time, and resource constraints. Moreover, some prerequisites between the teaching units must be satisfied. We first present a Mixed-Integer Programming (MIP) model to solve this problem to optimality; a toy instance is sketched below. However, the many peculiar constraints inevitably increase the complexity of the mathematical model, so a general-purpose solver can handle only small instances, while solving real-life-sized instances requires specific techniques or heuristic approaches. For this purpose, we also propose a heuristic approach, in which we use a fast constructive procedure to obtain a feasible solution. To assess our exact and heuristic approaches, we perform extensive computational experiments on both real-life instances (obtained from a high school in Lecce, Italy) and randomly generated instances. Our tests show that the MIP model is never solved to optimality, with an average optimality gap of 57%. On the other hand, the heuristic algorithm is much faster (in about 50% of the considered instances it converges in approximately half of the time limit) and in many cases achieves an improvement on the objective function value obtained by the MIP model; this improvement ranges between 18% and 66%.
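
The toy MIP below, written with PuLP, conveys the flavor of the unit-activation side of the model (selection under a resource budget with prerequisite coupling); the data, variable names, and constraints are illustrative assumptions, not the authors' formulation.

```python
import pulp

units = ["U1", "U2", "U3", "U4"]
quality = {"U1": 8, "U2": 5, "U3": 7, "U4": 4}   # teaching-quality score
hours = {"U1": 10, "U2": 6, "U3": 8, "U4": 4}    # teacher-hours required
budget_hours = 18
prereq = [("U3", "U1")]                          # U3 requires U1

prob = pulp.LpProblem("training_offer", pulp.LpMaximize)
x = pulp.LpVariable.dicts("activate", units, cat="Binary")

prob += pulp.lpSum(quality[u] * x[u] for u in units)            # objective
prob += pulp.lpSum(hours[u] * x[u] for u in units) <= budget_hours
for later, earlier in prereq:
    prob += x[later] <= x[earlier]               # prerequisite coupling

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([u for u in units if x[u].value() == 1])   # -> ['U1', 'U3']
```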

Keywords: heuristic, MIP model, remedial course, school, timetabling

Procedia PDF Downloads 583
557 I²C Master-Slave Integration

Authors: Rozita Borhan, Lam Kien Sieng

Abstract:

This paper describes an I²C slave implementation using an I²C master obtained from the OpenCores website. This website provides free Verilog and VHDL code to users. The design implementation of the I²C slave is in the Verilog language and uses ModelSim from Mentor Graphics, an EDA tool for ASIC design, for simulation and verification purposes. A common application of this I²C master-slave integration is also included. The paper also addresses the advantages and limitations of the design.

Keywords: I²C, master, OpenCores, slave, Verilog, verification

Procedia PDF Downloads 410
556 Development of Algorithms for Solving and Analyzing Special Problems of Transport Type

Authors: Dmitri Terzi

Abstract:

The article presents the results of an algorithmic study of a special optimization problem of the transport type (the traveling salesman problem): 1) To solve the problem, a new natural algorithm has been developed based on the decomposition of the initial data into convex hulls. It has a number of advantages: it is applicable to fairly large dimensions, does not require a large amount of memory, and has fairly good performance. The relevance of the algorithm lies in the fact that, in practice, programs are widely used only for problems with no more than about twenty traversal points; for large-scale problems, algorithms and programs of this kind are scarce. The proposed algorithm is called natural because the optimal solution found by an exact algorithm is not always feasible in practice, due to many other factors that may impose additional restrictions. 2) Another, inverse, problem solved here is to describe a class of traveling salesman problems that have a predetermined optimal solution. The constructed Algorithm 2 allows us to characterize the structure of traveling salesman problems, as well as to construct test problems for evaluating the effectiveness of algorithms and for other purposes. 3) The appendix presents a software implementation of Algorithm 1 (in MATLAB), which can be used to solve practical problems, as well as in teaching operations research and optimization methods.
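
For flavor, here is one classic convex-hull construction heuristic for the TSP (start the tour on the hull, then insert interior points at cheapest cost); the paper's decomposition into convex hulls differs in its details, so this is an illustration of the general idea rather than Algorithm 1 itself.

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_insertion_tour(points):
    pts = np.asarray(points, float)
    tour = list(ConvexHull(pts).vertices)        # initial tour: the hull cycle
    rest = [i for i in range(len(pts)) if i not in tour]
    dist = lambda i, j: np.linalg.norm(pts[i] - pts[j])
    while rest:
        # cheapest insertion: the point and edge minimizing the added length
        _, k, i = min((dist(tour[k], i)
                       + dist(i, tour[(k + 1) % len(tour)])
                       - dist(tour[k], tour[(k + 1) % len(tour)]), k, i)
                      for i in rest for k in range(len(tour)))
        tour.insert(k + 1, i)
        rest.remove(i)
    return tour

rng = np.random.default_rng(1)
cities = rng.random((30, 2))
tour = hull_insertion_tour(cities)
length = sum(np.linalg.norm(cities[tour[i]] - cities[tour[(i + 1) % len(tour)]])
             for i in range(len(tour)))
print(f"tour length: {length:.3f}")
```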

Keywords: traveling salesman problem, solution construction algorithm, convex hulls, optimality verification

Procedia PDF Downloads 35
555 A Verification Intellectual Property for Multi-Flow Rate Control on Any Single Flow Bus Functional Model

Authors: Pawamana Ramachandra, Jitesh Gupta, Saranga P. Pogula

Abstract:

In the verification of high-volume and complex packet-processing IPs, finer control of flow management aspects (for example, rate in bits/sec) per flow class (or virtual channel, or software thread) is needed. When thread arbitration in software/Universal Verification Methodology (UVM) is left to the simulator (e.g., Verilog Compiler Simulator (VCS) or the Incisive Enterprise Simulator core simulation engine (NCSIM)), it is hard to predict the resulting distribution of bandwidth produced by the simulator's thread arbitration. In many cases, the patterns desired in a test scenario may not be achieved, as the simulator might produce a different distribution than what was required. This can lead to missing multiple traffic scenarios, specifically deadlock- and starvation-related ones. We invented a component (namely, the Flow Manager Verification IP) that intervenes between the application (test case) and the protocol VIP (with the UVM sequencer) to control the bandwidth per thread/virtual channel/flow. The Flow Manager has knobs visible to the UVM sequence/test to configure the required distribution of rate per thread/virtual channel/flow. This works seamlessly and produces rate stimuli that further harness the Design Under Test (DUT) with asymmetric inputs compared to the programmed bandwidth/Quality of Service (QoS) distributions in the Design Under Test.
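
The rate-distribution knob can be illustrated in ordinary Python with a smooth weighted round-robin arbiter: each flow accumulates credit at its configured weight, and the flow with the most credit wins the next transaction. The knob names and the credit scheme are our assumptions; the actual Flow Manager operates inside a UVM environment.

```python
def rate_arbiter(weights, n_txn):
    """Yield flow ids so each flow's share of n_txn grants approaches
    its configured weight (smooth weighted round-robin)."""
    credit = {f: 0.0 for f in weights}
    total = sum(weights.values())
    for _ in range(n_txn):
        for f, w in weights.items():
            credit[f] += w                       # accrue credit per round
        winner = max(credit, key=credit.get)     # most accumulated credit
        credit[winner] -= total                  # pay for the grant
        yield winner

# configure 50% / 30% / 20% of the bandwidth to three virtual channels
grants = list(rate_arbiter({"vc0": 5, "vc1": 3, "vc2": 2}, 1000))
print({f: grants.count(f) / 1000 for f in ("vc0", "vc1", "vc2")})
# -> {'vc0': 0.5, 'vc1': 0.3, 'vc2': 0.2}
```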

Keywords: flow manager, UVM sequencer, rated traffic generation, quality of service

Procedia PDF Downloads 70
554 Software Verification of Systematic Resampling for Optimization of Particle Filters

Authors: Osiris Terry, Kenneth Hopkinson, Laura Humphrey

Abstract:

Systematic resampling is the most popular resampling method in particle filters. This paper seeks to further the understanding of systematic resampling by defining a formula made up of variables from the sampling equation and the particle weights. The formula is then verified via SPARK, a software verification language. The verified systematic resampling formula states that the minimum/maximum number of samples taken of a particle equals the floor/ceiling, respectively, of the particle's weight divided by the sampling interval. This allows for the creation of a randomness spectrum within which each resampling method falls. Methods on the lower end, e.g., systematic resampling, have less randomness and are thus quicker to reach an estimate. Lower randomness, however, introduces a larger bias toward the size of the weights, and this bias creates vulnerabilities to noise in the environment, e.g., jamming. In conclusion, this is a first step toward characterizing each resampling method, which will allow target-tracking engineers to pick the best resampling method for their environment instead of simply choosing the most popular one.
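
The verified bound is easy to exercise at runtime; the sketch below implements systematic resampling in numpy and asserts that each particle's copy count lies between floor(w_i/u) and ceil(w_i/u) for sampling interval u = 1/N (the paper's actual proof is carried out in SPARK, not by testing).

```python
import math
import numpy as np

def systematic_resample(weights, rng):
    w = np.asarray(weights, float)
    w = w / w.sum()
    n = len(w)
    u = 1.0 / n                                    # sampling interval
    positions = (rng.random() + np.arange(n)) * u  # one draw, evenly spaced
    cum = np.cumsum(w)
    cum[-1] = 1.0                                  # guard against rounding
    counts = np.zeros(n, dtype=int)
    i = 0
    for p in positions:
        while p >= cum[i]:
            i += 1
        counts[i] += 1
    # the verified property: floor(w_i/u) <= counts_i <= ceil(w_i/u)
    assert all(math.floor(wi / u) <= c <= math.ceil(wi / u)
               for wi, c in zip(w, counts))
    return counts

rng = np.random.default_rng(7)
print(systematic_resample([0.5, 0.25, 0.15, 0.1], rng))  # e.g. [2 1 1 0]
```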

Keywords: SPARK, software verification, resampling, systematic resampling, particle filter, tracking

Procedia PDF Downloads 53
553 A Formal Property Verification for Aspect-Oriented Programs in Software Development

Authors: Moustapha Bande, Hakima Ould-Slimane, Hanifa Boucheneb

Abstract:

Software development for complex systems requires efficient and automatic tools that can be used to verify the satisfiability of critical properties such as security ones. With the emergence of Aspect-Oriented Programming (AOP), considerable work has been done to better modularize the separation of concerns in software design and implementation. The goal is to prevent cross-cutting concerns from being scattered across the multiple modules of the program and tangled with other modules. One of the key challenges for aspect-oriented programs is to ensure that all the pieces put together at weaving time satisfy the overall system requirements. Our paper focuses on this problem and proposes a formal approach for verifying a given property on the woven program. The approach is based on the control flow graph (CFG) of the woven program and the use of a satisfiability modulo theories (SMT) solver to check whether each property (represented by one aspect) is satisfied once the weaving is done.
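
A minimal taste of the CFG-plus-SMT step, using the z3 solver: encode one path condition through a woven program and ask for a counterexample to the property along that path. The variables and the property are invented for illustration; the paper's encoding is more general.

```python
from z3 import Int, Bool, Solver, And, Not, sat

balance = Int("balance")
amount = Int("amount")
authorized = Bool("authorized")   # set by a security aspect woven in

# path condition for one woven withdraw() path:
# the aspect ran (authorized) and the guard on the amount passed
path = And(authorized, amount > 0, amount <= balance)

# property to verify along this path: the new balance is never negative
prop = balance - amount >= 0

s = Solver()
s.add(path, Not(prop))            # search for a counterexample
print("violated" if s.check() == sat else "property holds on this path")
```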

Keywords: aspect-oriented programming, control flow graph, property verification, satisfiability modulo theories

Procedia PDF Downloads 148
552 Wind Farm Power Performance Verification Using Non-Parametric Statistical Inference

Authors: M. Celeska, K. Najdenkoski, V. Dimchev, V. Stoilkov

Abstract:

Accurate determination of wind turbine performance is necessary for the economic operation of a wind farm. At present, the procedure for power performance verification of wind turbines is based on a standard of the International Electrotechnical Commission (IEC). In this paper, nonparametric statistical inference is applied to the design of a simple, inexpensive method for verifying the power performance of a wind turbine. A statistical test is explained and examined, and its adequacy is tested on real data. The method uses the information collected by the SCADA (Supervisory Control and Data Acquisition) system from sensors embedded in the wind turbines to carry out the power performance verification of a wind farm. The study used data on the monthly output of a wind farm in the Republic of Macedonia, measured from January 1, 2016, to December 31, 2016. Finally, it is concluded whether the power performance of a wind turbine differed significantly from what would be expected. The results showed that the power performance of the specific wind farm under assessment was acceptable.
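
As one concrete nonparametric check in the same spirit (our choice of test, not necessarily the one in the paper), twelve monthly outputs can be compared with the values expected from the warranted power curve using a Wilcoxon signed-rank test:

```python
import numpy as np
from scipy.stats import wilcoxon

measured = np.array([310, 295, 280, 240, 200, 180,
                     170, 185, 215, 260, 290, 305])   # MWh, illustrative
expected = np.array([300, 300, 270, 245, 205, 175,
                     175, 180, 220, 255, 285, 300])   # from the power curve

stat, p = wilcoxon(measured, expected)
print(f"p-value = {p:.3f}")
if p < 0.05:
    print("performance differs significantly from expectation")
else:
    print("no significant deviation: performance acceptable")
```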

Keywords: canonical correlation analysis, power curve, power performance, wind energy

Procedia PDF Downloads 308
551 Optimal Design of Step-Stress Partially Accelerated Life Test Using Multiply Censored Exponential Data with Random Removals

Authors: Showkat Ahmad Lone, Ahmadur Rahman, Ariful Islam

Abstract:

The major assumption in accelerated life tests (ALT) is that the mathematical model relating the lifetime of a test unit to the stress is known or can be assumed. In some cases, such life-stress relationships are not known and cannot be assumed, i.e., ALT data cannot be extrapolated to use conditions. In such cases, a partially accelerated life test (PALT) is more suitable, in which tested units are subjected to both normal and accelerated conditions. This study deals with estimating information about failure times of items under step-stress partially accelerated life tests using progressive failure-censored hybrid data with random removals. The life data of the units under test are assumed to follow the exponential distribution, and the removals from the test are assumed to follow a binomial distribution. Point and interval maximum likelihood estimates are obtained for the unknown distribution parameters and the tampering coefficient. An optimum test plan is developed using the D-optimality criterion. The performance of the resulting estimators of the developed model parameters is evaluated and investigated by means of a simulation algorithm.
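
A stripped-down simulation of the step-stress idea under an exponential lifetime (the tampered-hazard model: rate lam before the stress-change time tau, beta*lam after it) is sketched below; censoring, random removals, and the D-optimal search are omitted for brevity, and the closed-form MLEs follow from the two-phase exponential likelihood.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, beta, tau, n = 0.5, 2.0, 1.0, 5000

# if a unit survives to tau, its remaining life is exponential with
# the accelerated rate beta * lam
t0 = rng.exponential(1 / lam, n)
t = np.where(t0 <= tau, t0, tau + rng.exponential(1 / (beta * lam), n))

n1 = np.sum(t <= tau)                    # failures under normal stress
s1 = np.sum(np.minimum(t, tau))          # total exposure before tau
s2 = np.sum(np.maximum(t - tau, 0.0))    # total exposure after tau
lam_hat = n1 / s1                        # MLE of the normal-stress rate
beta_hat = (n - n1) / (lam_hat * s2)     # MLE of the tampering coefficient
print(f"lam_hat = {lam_hat:.3f}, beta_hat = {beta_hat:.3f}")
# -> close to the true values 0.5 and 2.0
```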

Keywords: binomial distribution, d-optimality, multiple censoring, optimal design, partially accelerated life testing, simulation study

Procedia PDF Downloads 296
550 Verification and Application of Finite Element Model Developed for Flood Routing in Rivers

Authors: A. L. Qureshi, A. A. Mahessar, A. Baloch

Abstract:

Flood wave propagation in river channel flow can be described by the nonlinear equations of motion for unsteady flow. However, it is difficult to find analytical solutions of these complex nonlinear equations; hence, a numerical model should be verified against field data and other numerical predictions. This paper presents the verification of a finite element model developed for unsteady flow in open channels. The results of the proposed model show good agreement with both the Preissmann scheme and the HEC-RAS model for discharge hydrographs over a 29 km river reach, at both sites (15 km from the upstream end, and at the downstream end). The model also compares well with the Preissmann scheme for the flow depth (stage) hydrographs. The proposed model has also been applied to forecast daily discharges 400 km downstream of Sukkur Barrage, where it demonstrates accurate predictions of the observed daily discharges. Hence, this model may be used for predicting floods and issuing flood warnings in advance.
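
For contrast with the finite element treatment, a far simpler routing baseline is sketched below: Muskingum routing of an inflow hydrograph, a standard tool in flood forecasting. The K, X, and hydrograph values are made up; this is not the paper's model.

```python
import numpy as np

def muskingum(inflow, K=2.0, X=0.2, dt=1.0):
    """Route an inflow hydrograph (m^3/s, hourly) through a reach."""
    d = K * (1 - X) + dt / 2
    c0 = (dt / 2 - K * X) / d
    c1 = (dt / 2 + K * X) / d
    c2 = (K * (1 - X) - dt / 2) / d
    out = [inflow[0]]                     # assume an initial steady state
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return np.array(out)

inflow = np.array([100, 140, 300, 600, 900, 750, 500, 320, 200, 140, 110])
print(np.round(muskingum(inflow), 1))     # attenuated and delayed peak
```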

Keywords: finite element method, Preissmann scheme, HEC-RAS, flood forecasting, Indus river

Procedia PDF Downloads 474
549 Automatic Checkpoint System Using Face and Card Information

Authors: Kriddikorn Kaewwongsri, Nikom Suvonvorn

Abstract:

In the deep south of Thailand, checkpoints for person verification are necessary for the security management of risk zones, such as official buildings in the conflict area. In this paper, we propose an automatic checkpoint system that verifies persons using information from ID cards and facial features. Methods for extracting and verifying a person's information are introduced, based on useful information such as the ID number and name extracted from official cards, together with facial images from videos. The proposed system shows promising results and has a real impact on the local society.

Keywords: face comparison, card recognition, OCR, checkpoint system, authentication

Procedia PDF Downloads 299
548 Optimal Tetra-Allele Cross Designs Including Specific Combining Ability Effects

Authors: Mohd Harun, Cini Varghese, Eldho Varghese, Seema Jaggi

Abstract:

Hybridization crosses play a vital role in breeding experiments for evaluating the combining abilities of individual parental lines or crosses, with the aim of creating lines with desirable qualities. There are various ways of obtaining progenies and further studying the combining ability effects of the lines taken in a breeding programme; some of the most common methods are the diallel or two-way cross, the triallel or three-way cross, and the tetra-allele or four-way cross. These techniques help breeders improve quantitative traits of economic as well as nutritional importance in crops and animals. Amongst these methods, the tetra-allele cross provides extra information in terms of higher specific combining ability (sca) effects, and the hybrids thus produced exhibit individual as well as population buffering mechanisms because of their broad genetic base. Most of the common commercial hybrids in corn are either three-way or four-way cross hybrids. The tetra-allele cross came out as the most practical and acceptable scheme for the production of slaughter pigs having fast growth rate, good feed efficiency, and carcass quality, and tetra-allele crosses are widely used for the exploitation of heterosis in commercial silkworm production. Experimental designs involving tetra-allele crosses have been studied extensively in the literature, and the optimality of such designs has also been considered a researchable issue. In practical situations, it is advisable to include sca effects in the model, as this information is needed by the breeder to improve economically and nutritionally important quantitative traits. Thus, a model that provides information regarding specific traits by utilizing sca effects along with general combining ability (gca) effects may help breeders deal with the problem of various stresses. In this paper, a model for experimental designs involving tetra-allele crosses that incorporates both gca and sca has been defined; a sketch of such a model is given below. Optimality aspects of such designs have been discussed incorporating sca effects in the model. Orthogonality conditions have been derived for block designs, ensuring estimation of contrasts among the gca effects, after eliminating the nuisance factors, independently of the sca effects. A user-friendly SAS macro and a web solution (webPTC) have been developed for the generation and analysis of such designs.
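
As a hedged sketch of the kind of linear model meant here (our notation and a simplified effect structure, not necessarily the authors' exact formulation), the mean response of a tetra-allele cross (i x j) x (k x l) can be written as:

```latex
y_{(ij)(kl)} = \mu + g_i + g_j + g_k + g_l + s_{ij} + s_{kl} + e_{(ij)(kl)}
```

where \mu is the general mean, the g terms are the gca effects of the four parental lines, the s terms are two-line sca effects of the component single crosses, and e is the error; orthogonality conditions of the kind derived in the paper are what make contrasts among the g's estimable independently of the s's.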

Keywords: general combining ability, optimality, specific combining ability, tetra-allele cross, webPTC

Procedia PDF Downloads 108
547 Student Records Management System Using Smart Cards and Biometric Technology for Educational Institutions

Authors: Patrick O. Bobbie, Prince S. Attrams

Abstract:

In recent times, rapid changes in technology have transformed the way records are handled in educational institutions. There is also a need for reliable access to and ease of use of these records, resulting in increased productivity in organizations. In academic institutions, such benefits help in quality assessments, institutional performance, and assessments of teaching and evaluation methods. Students in educational institutions benefit the most when advanced technologies are deployed in accessing records. This research paper discusses the use of biometric technologies, coupled with smartcard technologies, to provide a unique way of identifying students and matching their data to financial records to grant them access to restricted areas such as examination halls. The system developed in this paper has an identity verification component as part of its main functionality. A systematic software development cycle of analysis, design, coding, testing, and support was used. The system provides a secure way of verifying a student's identity and real-time verification of financial records. An advanced prototype version of the system has been developed for testing purposes.

Keywords: biometrics, smartcards, identity-verification, fingerprints

Procedia PDF Downloads 388
546 Modified Form of Margin Based Angular Softmax Loss for Speaker Verification

Authors: Jamshaid ul Rahman, Akhter Ali, Adnan Manzoor

Abstract:

Learning-based systems have received increasing interest in recent years, and recognition structures, including end-to-end speaker recognition, are among the hot topics in this area. A well-known work on end-to-end speaker verification using the angular softmax loss gained significant importance; it directly trains a discriminative model instead of the traditionally adopted i-vector approach. The margin-based strategy in angular softmax helps learn discriminative speaker embeddings, but the random selection of margin values is a big issue in both additive angular margin and multiplicative angular margin. As a better solution to this matter, we present an alternative approach that introduces an additive parameter similar in form to one originally introduced for face recognition; it has the capacity to adjust automatically to the corresponding margin values and can learn more discriminative features than the softmax. Experiments are conducted on part of the Fisher dataset, where it is observed that using the additive parameter with angular softmax to train the front-end, with probabilistic linear discriminant analysis (PLDA) in the back-end, boosts the performance of the structure.
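
A sketch of an additive-margin angular softmax head in PyTorch follows (AM-softmax in the face recognition style; the paper's automatically adjusting parameter is a variation on this, and the details below are assumed):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AMSoftmaxHead(nn.Module):
    def __init__(self, emb_dim, n_speakers, s=30.0, m=0.35):
        super().__init__()
        self.w = nn.Parameter(torch.randn(n_speakers, emb_dim))
        self.s, self.m = s, m                 # scale and additive margin

    def forward(self, emb, labels):
        # cosine similarity between normalized embeddings and weights
        cos = F.linear(F.normalize(emb), F.normalize(self.w))
        # subtract the margin from the target-class cosine only
        onehot = F.one_hot(labels, cos.size(1)).float()
        logits = self.s * (cos - self.m * onehot)
        return F.cross_entropy(logits, labels)

head = AMSoftmaxHead(emb_dim=128, n_speakers=10)
emb = torch.randn(4, 128)                     # a batch of embeddings
labels = torch.tensor([0, 3, 3, 7])
print(head(emb, labels))                      # training loss
```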

Keywords: additive parameter, angular softmax, speaker verification, PLDA

Procedia PDF Downloads 68
545 Forensic Speaker Verification in Noisy Environments by Enhancing the Speech Signal Using ICA Approach

Authors: Ahmed Kamil Hasan Al-Ali, Bouchra Senadji, Ganesh Naik

Abstract:

We propose a system to address real environmental noise and channel mismatch in forensic speaker verification. The method is based on suppressing various types of real environmental noise using an independent component analysis (ICA) algorithm. The enhanced speech signal is then passed to mel frequency cepstral coefficients (MFCC) or MFCC feature warping to extract the essential characteristics of the speech signal. Channel effects are reduced using an intermediate vector (i-vector) and probabilistic linear discriminant analysis (PLDA) approach for classification. The proposed algorithm is evaluated on an Australian forensic voice comparison database combined with car, street, and home noises from QUT-NOISE, at signal-to-noise ratios (SNR) ranging from -10 dB to 10 dB. Experimental results indicate that MFCC feature warping with ICA achieves reductions in equal error rate of about 48.22%, 44.66%, and 50.07% over MFCC feature warping alone when the test speech signals are corrupted with random sessions of street, car, and home noise at -10 dB SNR.
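
The enhancement step can be sketched with scikit-learn's FastICA on a synthetic two-channel mixture; real forensic audio, MFCC feature warping, and the i-vector/PLDA back-end are omitted, and the signals are stand-ins.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000)
speech = np.sin(2 * np.pi * 150 * t) * (1 + 0.5 * np.sin(2 * np.pi * 3 * t))
noise = rng.laplace(size=t.size)              # stand-in for street noise

# two microphones observe different mixtures of speech and noise
X = np.c_[0.8 * speech + 0.4 * noise,
          0.3 * speech + 0.9 * noise]

ica = FastICA(n_components=2, random_state=0)
S = ica.fit_transform(X)                      # estimated sources

# for the demo, pick the component closest to the clean reference
# (a real system would select by speech-likeness, not by the reference)
corr = [abs(np.corrcoef(S[:, i], speech)[0, 1]) for i in range(2)]
enhanced = S[:, int(np.argmax(corr))]
print(f"correlation with clean speech: {max(corr):.3f}")
```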

Keywords: noisy forensic speaker verification, ICA algorithm, MFCC, MFCC feature warping

Procedia PDF Downloads 381
544 Uvulars Alternation in Hasawi Arabic: A Harmonic Serialism Approach

Authors: Huda Ahmed Al Taisan

Abstract:

This paper investigates a phonological phenomenon that exhibits variation ('alternation') between the uvular consonants [q] and [ʁ] in Hasawi Arabic. This dialect is spoken in Alahsa city, located in the Eastern Province of Saudi Arabia. To the best of our knowledge, no research has systematically studied this phenomenon in the Hasawi Arabic dialect, so this paper is significant in filling that gap in the literature on this understudied dialect. A large part of the data is extracted from several interviews the author conducted with 10 participants, native speakers of the dialect, complemented by additional forms from social media; the latter method of collecting data adds to the significance of the research. The analysis of the data is carried out in Harmonic Serialism Optimality Theory (HS-OT), a version of the Optimality Theoretic (OT) framework, which holds that linguistic forms are the outcome of the interaction among violable universal constraints, and in the recent development of OT into a model that accounts for linguistic variation in harmonic derivational steps. This alternation process is assumed to be phonologically unconditioned and in free variation in other Arabic dialects of the area. The goal of this paper is to investigate whether this phenomenon is free or governed, what governs the alternation between [q] and [ʁ], and whether the alternation is phonological or driven by other linguistic constraints. The results show that the [q] and [ʁ] alternation is not free and occurs due to different assimilation processes; positional, segmental-sequence, and vowel-adjacency factors are at work in Hasawi Arabic.

Keywords: harmonic serialism, Hasawi, uvular, variation

Procedia PDF Downloads 477
543 On the Optimality Assessment of Nano-Particle Size Spectrometry and Its Association to the Entropy Concept

Authors: A. Shaygani, R. Saifi, M. S. Saidi, M. Sani

Abstract:

Particle size distribution, the most important characteristic of aerosols, is obtained through electrical characterization techniques. The dynamics of charged nano-particles under the influence of an electric field in an electrical mobility spectrometer (EMS) reveals the size distribution of these particles. The accuracy of this measurement is influenced by flow conditions, geometry, the electric field, and the particle charging process, and therefore by the transfer function (transfer matrix) of the instrument. In this work, a wire-cylinder corona charger was designed, and the combined field-diffusion charging process of injected poly-disperse aerosol particles was numerically simulated as a prerequisite for the study of a multi-channel EMS. The result, a cloud of particles with non-uniform charge distribution, was introduced to the EMS. The flow pattern and electric field in the EMS were simulated using computational fluid dynamics (CFD) to obtain particle trajectories in the device and thus to calculate the signal reported by each electrometer. Based on the output signals (resulting from the bombardment of particles and the transfer of their charges as currents), we proposed a modification to the size of the detecting rings (which are connected to the electrometers) in order to evaluate particle size distributions more accurately. Based on the capability of the system to transfer information about the size distribution of the injected particles, we proposed a benchmark for assessing the optimality of the design. This method applies the concept of von Neumann entropy and borrows the definition of entropy from information theory (Shannon entropy) to measure optimality; entropy, in Shannon's sense, is the average amount of information contained in an event, sample, or character extracted from a data stream. Evaluating the responses (signals) obtained via various configurations of detecting rings, the configuration that gave the best predictions of the size distributions of injected particles was the modified configuration; it was also the one with the maximum amount of entropy. A reasonable consistency was also observed between the accuracy of the predictions and the entropy content of each configuration. In this method, entropy is extracted from the transfer matrix of the instrument for each configuration. Ultimately, various clouds of particles were introduced into the simulations, and the predicted size distributions were compared to the exact ones.
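
The entropy benchmark can be illustrated directly; the sketch below builds a density-like matrix from a transfer matrix and computes its von Neumann entropy S = -tr(rho ln rho). The particular construction of rho, and the two toy matrices, are our assumptions.

```python
import numpy as np

def von_neumann_entropy(T):
    rho = T @ T.T                       # symmetric, positive semidefinite
    rho = rho / np.trace(rho)           # unit trace, like a density matrix
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]        # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log(evals)))

# two hypothetical 4-channel transfer matrices: one nearly degenerate
# (all rings respond alike), one with well-separated responses
T_degenerate = np.ones((4, 4)) + 0.01 * np.eye(4)
T_separated = np.eye(4) + 0.1 * np.ones((4, 4))
print(von_neumann_entropy(T_degenerate))   # close to 0
print(von_neumann_entropy(T_separated))    # close to ln(4) ~ 1.386
```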

Keywords: aerosol nano-particle, CFD, electrical mobility spectrometer, von neumann entropy

Procedia PDF Downloads 312
542 Numerical Verification of a Backfill-Rectangular Tank-Fluid System

Authors: Ramazan Livaoğlu, Tufan Çakır

Abstract:

The performance of rectangular tanks during earthquakes has been observed to depend significantly on the presence of water in the container and of backfill acting on the tank wall. Therefore, in the design of rectangular tanks, fluid-structure-backfill interaction and the determination of the modal characteristics of the interacting system have traditionally been subjects of great theoretical and practical controversy. Although the finite element method has been, and will continue to be, used to a significant extent in treating the response of the system, experimental verification of numerical models remains a prerequisite for their adoption and reliable application in practice. Thus, in this study, numerical and experimental investigations were performed on the backfill-exterior wall-fluid interaction system. First, a three-dimensional finite element model (3D-FEM) was developed in ANSYS to acquire the modal frequencies and mode shapes of the system. Second, a series of in-situ tests were carried out to determine the modal characteristics of the same system and to assess the applicability of the FEM to a real physical situation under field conditions. Finally, comparing the theoretical predictions from the model with the results of the experimental measurements, close agreement was found between theory and experiment. Experimental verification thus provides strong support for the use of the proposed model in further investigations.
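
The modal analysis at the heart of such studies reduces to the generalized eigenproblem K x = omega^2 M x; a tiny sketch with scipy on a made-up 3-DOF stiffness/mass pair (stand-ins for the matrices ANSYS assembles for the backfill-wall-fluid system) follows.

```python
import numpy as np
from scipy.linalg import eigh

K = np.array([[ 4.0, -2.0,  0.0],
              [-2.0,  4.0, -2.0],
              [ 0.0, -2.0,  2.0]]) * 1e6   # stiffness, N/m (illustrative)
M = np.diag([1000.0, 1000.0, 500.0])       # lumped masses, kg

evals, evecs = eigh(K, M)                  # solves K x = lambda M x
freqs = np.sqrt(evals) / (2 * np.pi)       # natural frequencies, Hz
print(np.round(freqs, 2))                  # compare against in-situ tests
```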

Keywords: fluid-structure interaction, modal analysis, rectangular tank, soil structure interaction

Procedia PDF Downloads 365
541 Design and Application of NFC-Based Identity and Access Management in Cloud Services

Authors: Shin-Jer Yang, Kai-Tai Yang

Abstract:

In response to a changing world and the fast growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming increasingly important, especially with Software as a Service (SaaS). This in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag; this type of verification does not support mobile-device login or user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) that addresses this shortcoming. Data from simulation tests analyzed with Key Performance Indicators (KPIs) suggest that the NFC-IAM not only takes less time in identity identification but also cuts two-factor authentication time by 80% and improves verification accuracy to 99.9% or better. In the functional performance analyses, NFC-IAM performed better in scalability and portability. The NFC-IAM app and back-end system, to be developed and deployed on mobile devices, will support IAM features and offer users a more user-friendly experience and stronger security protection. In the future, our NFC-IAM can be employed in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.

Keywords: cloud service, multi-tenancy, NFC, IAM, mobile device

Procedia PDF Downloads 407
540 Modelling of the Fire Pragmatism in the Area of Military Management and Its Experimental Verification

Authors: Ivana Mokrá

Abstract:

The article deals with modelling of fire pragmatism in the area of military management and its experimental verification. The potential approaches are based on the synergy of mathematical and theoretical ideas, operational and tactical requirements, and the military decision-making process. This issue has taken on importance in recent times, particularly with the increasing trend toward a digitized battlefield, the development of C4ISR systems, and the intention to streamline the command and control process at the lowest levels of command. From a fundamental and philosophical point of view, these new approaches seek to significantly upgrade and enhance the decision-making process of tactical commanders.

Keywords: military management, decision-making process, strike modeling, experimental evaluation, pragmatism, tactical strike modeling

Procedia PDF Downloads 365
539 A New Verification Based Congestion Control Scheme in Mobile Networks

Authors: P. K. Guha Thakurta, Shouvik Roy, Bhawana Raj

Abstract:

A congestion control scheme for mobile networks is proposed in this paper through a verification-based model. The model is expressed in terms of performance metrics such as buffer occupancy, latency, and packet loss rate. Based on pre-defined values, each metric is described in terms of three different states. A Markov chain based model is introduced to monitor the occurrence of the corresponding state transitions; an estimate of the network status is thus obtained in terms of the performance metrics. In addition, the improved performance of the proposed model over existing works is shown with experimental results.
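
The state-transition monitoring can be sketched for a single metric, say buffer occupancy, with three states (low/medium/high); the transition matrix below is invented, and the stationary distribution summarizes the long-run congestion status.

```python
import numpy as np

P = np.array([[0.80, 0.15, 0.05],    # transitions from "low"
              [0.20, 0.60, 0.20],    # transitions from "medium"
              [0.05, 0.35, 0.60]])   # transitions from "high"

# stationary distribution: left eigenvector of P for eigenvalue 1
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()
print(dict(zip(["low", "medium", "high"], np.round(pi, 3))))
# a large "high" mass would signal congestion onset
```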

Keywords: congestion, mobile networks, buffer, delay, call drop, markov chain

Procedia PDF Downloads 413
538 Trusted Neural Network: Reversibility in Neural Networks for Network Integrity Verification

Authors: Malgorzata Schwab, Ashis Kumer Biswas

Abstract:

In this concept paper, we explore the topic of reversibility in neural networks leveraged for network integrity verification, and we coin the term "Trusted Neural Network" (TNN), paired with an API abstraction around it, to embrace the idea formally. This newly proposed, high-level, generalizable TNN model builds upon the invertible neural network architecture, trained simultaneously in both the forward and reverse directions. This allows the original system inputs to be compared with the ones reconstructed from the outputs in the reversed flow, to assess the integrity of the end-to-end inference flow. The outcome of that assessment is captured as an integrity score. Concrete implementations reflecting the needs of specific problem domains can be derived from this general approach, as demonstrated in the experiments. The model aspires to become a useful practice in drafting high-level system architectures that incorporate AI capabilities.
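
One additive coupling layer (NICE-style) is enough to see the mechanism: the layer is exactly invertible, so the input can be reconstructed from the output and the reconstruction error reported as an integrity score. The architecture and the score definition below are our illustrative assumptions, not the TNN API itself.

```python
import torch
import torch.nn as nn

class AdditiveCoupling(nn.Module):
    """Invertible layer: y = [x1, x2 + f(x1)] for any function f."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim // 2, 64), nn.ReLU(),
                                 nn.Linear(64, dim // 2))

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=1)
        return torch.cat([x1, x2 + self.net(x1)], dim=1)

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=1)
        return torch.cat([y1, y2 - self.net(y1)], dim=1)

layer = AdditiveCoupling(dim=8)
x = torch.randn(3, 8)
y = layer(x)                          # forward inference
x_rec = layer.inverse(y)              # reversed flow
integrity_score = (x - x_rec).abs().max().item()
print(f"integrity score (max reconstruction error): {integrity_score:.2e}")
# near zero -> the end-to-end flow is self-consistent
```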

Keywords: trusted, neural, invertible, API

Procedia PDF Downloads 118