Search results for: error metrics
875 Node Pair Selection Scheme in Relay-Aided Communication Based On Stable Marriage Problem
Authors: Tetsuki Taniguchi, Yoshio Karasawa
Abstract:
This paper describes a node pair selection scheme for relay-aided multiple-source multiple-destination communication systems based on the stable marriage problem. A general case is assumed in which the source, relay and destination nodes are all equipped with multiple antennas and carry out multistream transmission. Based on several metrics derived from the inter-node channel conditions, a preference order is determined over all source-relay and relay-destination relations, and the node pairs are then matched using the Gale-Shapley algorithm. Computer simulations show that the effectiveness of node pair selection is greater in multihop communication. Some additional aspects that differ from the relay-less case are also investigated.
Keywords: Relay, multiple input multiple output (MIMO), multiuser, amplify and forward, stable marriage problem, Gale-Shapley algorithm.
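As a concrete illustration of the matching step the abstract describes, below is a minimal sketch of the Gale-Shapley stable-matching algorithm in Python. The preference lists are hypothetical stand-ins for the channel-metric-derived preference orders used in the paper.

```python
def gale_shapley(proposer_prefs, acceptor_prefs):
    """Return a stable matching {proposer: acceptor}.

    proposer_prefs / acceptor_prefs map each node to a list of the
    other side's nodes, ordered from most to least preferred.
    """
    # rank[a][p] = position of proposer p in acceptor a's preference list
    rank = {a: {p: i for i, p in enumerate(prefs)}
            for a, prefs in acceptor_prefs.items()}
    free = list(proposer_prefs)          # proposers not yet matched
    next_choice = {p: 0 for p in free}   # index of next acceptor to try
    engaged = {}                         # acceptor -> proposer

    while free:
        p = free.pop(0)
        a = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if a not in engaged:
            engaged[a] = p
        elif rank[a][p] < rank[a][engaged[a]]:
            free.append(engaged[a])      # a prefers p; dump current partner
            engaged[a] = p
        else:
            free.append(p)               # a rejects p; p tries again
    return {p: a for a, p in engaged.items()}

# Hypothetical source-relay preferences derived from channel metrics.
sources = {"S1": ["R1", "R2"], "S2": ["R1", "R2"]}
relays = {"R1": ["S2", "S1"], "R2": ["S1", "S2"]}
print(gale_shapley(sources, relays))  # e.g. {'S2': 'R1', 'S1': 'R2'}
```

The classic guarantee is that the resulting matching contains no source-relay pair that would both prefer each other over their assigned partners.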
874 A Comparison of Signal Processing Techniques for the Extraction of Breathing Rate from the Photoplethysmogram
Authors: Susannah G. Fleming, Lionel Tarassenko
Abstract:
The photoplethysmogram (PPG) is the pulsatile waveform produced by the pulse oximeter, which is widely used for monitoring arterial oxygen saturation in patients. Various methods for extracting the breathing rate from the PPG waveform have been compared using a consistent data set, and a novel technique using autoregressive modelling is presented. This novel technique is shown to outperform the existing techniques, with a mean error in breathing rate of 0.04 breaths per minute.
Keywords: Autoregressive modelling, breathing rate, photoplethysmogram, pulse oximetry.
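To make the autoregressive idea concrete, here is a minimal sketch that fits an AR model to a PPG-like signal by least squares and reads a breathing rate off the dominant low-frequency pole. The model order, sampling rate and breathing band are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def ar_breathing_rate(x, fs, order=10):
    """Estimate a respiratory rate (breaths/min) from signal x.

    Fits an AR(order) model by least squares, then picks the pole
    whose angle lies in a plausible breathing band (0.1-0.5 Hz).
    """
    x = x - np.mean(x)
    # Least-squares system: x[n] = sum_k a[k] * x[n-1-k]
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    a, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    poles = np.roots(np.concatenate(([1.0], -a)))
    freqs = np.abs(np.angle(poles)) * fs / (2 * np.pi)
    band = (freqs > 0.1) & (freqs < 0.5)          # 6-30 breaths/min
    if not band.any():
        return None
    # Choose the in-band pole closest to the unit circle (strongest resonance)
    best = np.argmax(np.abs(poles)[band])
    return 60.0 * freqs[band][best]

fs = 4.0                                          # Hz, assumed resample rate
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.random.randn(t.size)
print(ar_breathing_rate(x, fs))                   # ~15 breaths/min
```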
873 Redesigning Business Processes: A Method Based on Simulation and Process Mining Techniques
Authors: Zahra Mohammadnazari, Fateme Rostambeygi, Fatemeh Dehrouyeh, Hwang Ki-Soon, Amir Aghsami
Abstract:
Corporations have always prioritized efforts to examine and improve processes. Various metrics, such as the cost and time required to execute the process, can be specified in this regard, and process improvement can be defined as an improvement of these indicators. This is accomplished by looking at prospective adjustments to the current executive process model or the resources allotted to it. The research in this paper aims to improve the procurement process and to explore assessment prospects in the project using a combination of process mining and simulation (benefiting from Play-In and Play-Out methodologies). To run the simulation, we need to complete the control flow diagram, institution settings, resource settings, and activity settings. Mining the event logs yields the process control flow; however, both the arrival of institutions and the distribution of resources must be modeled. The arrival rate of institutions and the distribution of activity execution times are determined in the next step.
Keywords: Business reengineering, Petri net, process-based simulation, process mining.
872 Object-Oriented Cognitive-Spatial Complexity Measures
Authors: Varun Gupta, Jitender Kumar Chhabra
Abstract:
Software maintenance, and mainly software comprehension, poses the largest costs in the software lifecycle. In order to assess the cost of software comprehension, various complexity measures have been proposed in the literature. This paper proposes new cognitive-spatial complexity measures, which combine the spatial and architectural aspects of the software to compute its complexity. The spatial aspect is taken into account using the lexical distances (in number of lines of code) between different program elements, and the architectural aspect is taken into consideration using the cognitive weights of the control structures present in the control flow of the program. The proposed measures are evaluated using standard axiomatic frameworks and then compared with the corresponding existing cognitive complexity measures as well as the spatial complexity measures for object-oriented software. This study establishes that the proposed measures are better indicators of the cognitive effort required for software comprehension than the other existing complexity measures for object-oriented software.
Keywords: Cognitive complexity, software comprehension, software metrics, spatial complexity, object-oriented software.
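A hypothetical sketch of how a combined cognitive-spatial measure of this kind might be computed is given below; the cognitive weights and the aggregation rule are illustrative assumptions, not the authors' exact formulation.

```python
# Hypothetical cognitive-spatial complexity sketch: combine the lexical
# distance (in lines) between a definition and each of its uses with a
# cognitive weight for the control structure enclosing the use.
COGNITIVE_WEIGHT = {"sequence": 1, "branch": 2, "loop": 3, "nested": 4}  # assumed weights

def cognitive_spatial_complexity(uses):
    """uses: list of (def_line, use_line, structure) tuples."""
    return sum(abs(use - dfn) * COGNITIVE_WEIGHT[structure]
               for dfn, use, structure in uses)

# Example: an attribute defined at line 10, used at lines 25 (inside a
# loop) and 40 (inside a branch).
print(cognitive_spatial_complexity([(10, 25, "loop"), (10, 40, "branch")]))  # 105
```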
871 Measures and Influence of a BAW Filter on Digital Radio-Communications Signals
Authors: A. Diet, M. Villegas, G. Baudoin
Abstract:
This work concerns the measurement of the S parameters of a Bulk Acoustic Wave (BAW) emission filter and their comparison with simulated prototypes. Using HP-ADS, a co-simulation of the filter's characteristics in a digital radio-communication chain is performed. Four modulation schemes are studied in order to illustrate the impact of the spectral occupation of the modulated signal. Results of the simulations and co-simulations are given in terms of error vector measurements, which are useful for a general sensitivity analysis of 3rd/4th generation emitters (wideband QAM and OFDM signals).
Keywords: RF architectures, BAW filters.
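For reference, the error-vector figure of merit used in such co-simulations can be computed from reference and received constellation symbols as in the short sketch below; the QPSK symbols and distortion model are illustrative assumptions.

```python
import numpy as np

def evm_percent(received, reference):
    """RMS error vector magnitude, as a percentage of the RMS reference power."""
    err = received - reference
    return 100 * np.sqrt(np.mean(np.abs(err) ** 2) / np.mean(np.abs(reference) ** 2))

# Illustrative QPSK symbols distorted by a filter ripple plus noise.
ref = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.random.randint(0, 4, 1000)))
rx = 0.98 * ref + 0.02 * (np.random.randn(1000) + 1j * np.random.randn(1000))
print(f"EVM = {evm_percent(rx, ref):.2f}%")
```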
870 Adaption Model for Building Agile Pronunciation Dictionaries Using Phonemic Distance Measurements
Authors: Akella Amarendra Babu, Rama Devi Yellasiri, Natukula Sainath
Abstract:
While human beings can easily learn and adopt pronunciation variations, machines need training before being put into use. Humans also keep a minimal vocabulary, with its pronunciation variations stored at the front of their memory for ready reference, whereas machines keep the entire pronunciation dictionary. Supervised methods used for preparing pronunciation dictionaries take large amounts of manual effort, cost and time, and are not suitable for real-time use. This paper presents an unsupervised adaptation model for building agile and dynamic pronunciation dictionaries online. These methods mimic the human approach of learning new pronunciations in real time. A new algorithm for measuring sound distances, called Dynamic Phone Warping, is presented and tested. The performance of the system is measured using an adaptation model, and the precision metric is found to be better than 86 percent.
Keywords: Pronunciation variations, dynamic programming, machine learning, natural language processing.
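The abstract does not specify Dynamic Phone Warping in detail, so the following is only a generic dynamic-programming alignment between two phone sequences of the kind such a distance builds on; the unit substitution and indel costs are assumed.

```python
def phone_distance(seq_a, seq_b, sub_cost=1.0, indel_cost=1.0):
    """Edit-distance-style DP alignment between two phone sequences."""
    m, n = len(seq_a), len(seq_b)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * indel_cost
    for j in range(1, n + 1):
        d[0][j] = j * indel_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub_cost
            d[i][j] = min(d[i - 1][j - 1] + sub,      # substitute/match
                          d[i - 1][j] + indel_cost,   # delete
                          d[i][j - 1] + indel_cost)   # insert
    return d[m][n]

# Two pronunciations of "tomato" in an ARPAbet-like notation.
print(phone_distance("T AH M EY T OW".split(), "T AH M AA T OW".split()))  # 1.0
```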
869 A Study of Agile-Based Approaches to Improve Software Quality
Authors: Gurmeet Kaur, Jyoti Pruthi
Abstract:
Agile software development approaches and techniques are considered efficient, effective, and popular methods for developing software. They are useful for developing high-quality software that meets client requirements with zero defects and within a short delivery period. In agile methodology, quality is related to coding, meaning that quality is managed through approaches like refactoring, pair programming, test-driven development, behavior-driven development, acceptance test-driven development, and demand-driven development. The quality of software is measured using metrics like the number of defects found during the development and improvement of the software. Using the above-mentioned approaches reduces the possibility of defects in the developed software and hence improves quality. This paper focuses on the study of agile-based quality methods and approaches for software development that ensure improved software quality as well as reduced cost and customer satisfaction.
Keywords: Agile software development, ASD, acceptance test-driven development, ATDD, behavior-driven development, BDD, demand-driven development, DDD, test-driven development, TDD.
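As a minimal illustration of the test-driven development style listed above, here is a hypothetical red-green example in Python: the test is written first and just enough implementation follows to make it pass.

```python
# Step 1 (red): write the failing test first.
def test_discount_caps_at_100_percent():
    assert apply_discount(price=50.0, percent=150) == 0.0

# Step 2 (green): write just enough code to make it pass.
def apply_discount(price: float, percent: float) -> float:
    """Return the price after a discount, clamped to the 0-100% range."""
    percent = max(0.0, min(100.0, percent))
    return price * (1 - percent / 100)

test_discount_caps_at_100_percent()  # passes; refactor safely afterwards
```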
868 A Framework to Support Reuse in Object-Oriented Software Development
Authors: Fathi Taibi
Abstract:
Reusability is a desired quality attribute in software products. Generally, it can be achieved by adopting development methods that promote it and by achieving the software qualities that have been linked with high reusability proneness. With the exponential growth in mobile application development, software reuse has become an integral part of a substantial number of projects. Similarly, software reuse has become widely practiced in start-up companies. However, this has led to new emerging problems: firstly, the reused code does not meet the required quality, and secondly, the reuse intentions are dubious. This work proposes a framework to support reuse in Object-Oriented (OO) software development. The framework comprises a process that uses a proposed reusability assessment metric and a formal foundation to specify the elements of the reused code and the relationships between them. The framework is empirically evaluated using a wide range of open-source projects and mobile applications. The results are analyzed to help understand the reusability proneness of OO software and the possible means of improving it.
Keywords: Software reusability, software metrics, object-oriented software, modularity, low complexity, understandability.
867 A Case Study to Assess the Validity of Function Points
Authors: Neelam Bawane nee' Singhal, C. V. Srikrishna
Abstract:
Many metrics have been proposed to evaluate the characteristics of the analysis and design model of a given product, which in turn help to assess the quality of the product. The function point metric is a measure of the 'functionality' delivered by the software. This paper presents an analysis, through the function point metric, of a set of programs from a project developed in Cµ. Function points are measured for a Data Flow Diagram (DFD) of the case developed at the initial stage. Lines of Code (LOC) and possible errors are calculated with the help of the measured Function Points (FPs), using suitable established functions. The calculated LOC and errors are compared with the actual LOC and errors found at the time of analysis and design review, implementation and testing. It has been observed that the actual errors found exceed the calculated errors. On the basis of this analysis and these observations, the authors conclude that function points provide useful insight and help to analyze the drawbacks in the development process.
Keywords: Function Points, Data Flow Diagram, Lines of Code.
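To illustrate the kind of calculation the abstract describes, the sketch below converts a function point count into estimated LOC and errors. The conversion factors are assumptions for illustration (roughly the commonly quoted backfiring figure for C and an assumed defect density); the paper's own 'established functions' may differ.

```python
# Hypothetical FP-based estimation. The factors below are assumptions,
# not the paper's own functions.
LOC_PER_FP_C = 128          # assumed average LOC per function point for C
DEFECTS_PER_FP = 0.05       # assumed defect density per function point

def estimate_from_fp(function_points: float):
    loc = function_points * LOC_PER_FP_C
    errors = function_points * DEFECTS_PER_FP
    return loc, errors

loc, errors = estimate_from_fp(42)
print(f"estimated LOC: {loc:.0f}, estimated errors: {errors:.1f}")
```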
866 Watermark-based Counter for Restricting Digital Audio Consumption
Authors: Mikko Löytynoja, Nedeljko Cvejic, Tapio Seppänen
Abstract:
In this paper we introduce three watermarking methods that can be used to count the number of times that a user has played some content. The proposed methods are tested with audio content in our experimental system using the most common signal processing attacks. The test results show that the watermarking methods used enable the watermark to be extracted under the most common attacks with a low bit error rate.
Keywords: Digital rights management, restricted usage, content protection, spread spectrum, audio watermarking.
865 Dual Construction of Stern-based Signature Scheme
Authors: Pierre-Louis Cayrel, Sidi Mohamed El Yousfi Alaoui
Abstract:
In this paper, we propose a dual version of the first threshold ring signature scheme based on error-correcting codes, proposed by Aguilar et al. in [1]. Our scheme uses an improvement of Véron's zero-knowledge identification scheme, which provides smaller public and private key sizes and better computational complexity than Stern's. This scheme is secure in the random oracle model.
Keywords: Stern algorithm, Véron algorithm, threshold ring signature, post-quantum cryptography.
864 Wasting Human and Computer Resources
Authors: Mária Csernoch, Piroska Biró
Abstract:
The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This approach has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement. (1) The lack of a definition of correctly edited, formatted documents. Consequently, end-users do not know whether their methods and results are correct or not; they are unaware of their own ignorance, which prevents them from realizing their lack of knowledge. (2) The end-users' problem-solving methods. We have found that in non-traditional programming environments end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which have proved less effective than deep-approach methods. Based on these findings, we have developed deep-approach methods that are based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical document, the text-based document. We have provided a definition of correctly edited text and, based on this definition, adapted the debugging method known from programming. According to the method, before any real text editing, a thorough debugging of already existing texts and a categorization of errors are carried out. In this way, users learn the requirements of text-based documents and of correctly formatted text in advance of real text editing. The method has proved much more effective than the previously applied surface-approach methods. Its advantages are that real text handling requires far fewer human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and that data retrieval is much more effective than from error-prone documents.
Keywords: Deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources.
863 Base Change for Fisher Metrics: Case of the q−Gaussian Inverse Distribution
Authors: Gabriel I. Loaiza O., Carlos A. Cadavid M., Juan C. Arango P.
Abstract:
It is known that the Riemannian manifold determined by the family of inverse Gaussian distributions endowed with the Fisher metric has negative constant curvature κ = −1/2, as does the family of usual Gaussian distributions. In the present paper, we first arrive at this result by following a different path, much simpler than the previous ones: we put the family in exponential form, thus endowing it with a new set of parameters, or coordinates, θ1, θ2; we then determine the matrix of the Fisher metric in terms of these parameters; and finally we compute this matrix in the original parameters. Secondly, we define the inverse q−Gaussian distribution family (q < 3) as the family obtained by replacing the usual exponential function with the Tsallis q−exponential function in the expression for the inverse Gaussian distribution, and observe that it supports two possible geometries, the Fisher and the q−Fisher geometry. Finally, we apply our strategy to obtain results about the Fisher and q−Fisher geometry of the inverse q−Gaussian distribution family, similar to those obtained in the case of the inverse Gaussian distribution family.
Keywords: Base of Changes, Information Geometry, Inverse Gaussian distribution, Inverse q-Gaussian distribution, Statistical Manifolds.
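For readers unfamiliar with the construction, the Fisher metric entering these results is the standard one; in the exponential-form coordinates θ1, θ2 mentioned above it can be written as follows (a standard definition, not a result specific to this paper):

```latex
g_{ij}(\theta) = \mathbb{E}_\theta\!\left[
  \frac{\partial \log f(x;\theta)}{\partial \theta^i}\,
  \frac{\partial \log f(x;\theta)}{\partial \theta^j}
\right]
= \frac{\partial^2 \psi(\theta)}{\partial \theta^i \, \partial \theta^j},
```

where ψ is the cumulant (log-partition) function of the exponential family; the second equality is what makes the exponential-form computation simpler than working directly in the original parameters.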
862 Variable Step-Size APA with Decorrelation of AR Input Process
Authors: Jae Wook Shin, Ju-man Song, Hyun-Taek Choi, Poo Gyeon Park
Abstract:
This paper introduces a new variable step-size affine projection algorithm (APA) with decorrelation of the AR input process, based on mean-square deviation (MSD) analysis. To achieve a fast convergence rate and a small steady-state estimation error, the proposed algorithm uses a variable step size determined by minimising the MSD. Experimental results show that the proposed algorithm achieves better performance than the other algorithms.
Keywords: adaptive filter, affine projection algorithm, variable step size.
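For context, a minimal fixed-step affine projection update is sketched below on a system-identification problem with an AR(1) input. The variable step-size rule derived in the paper via MSD minimisation is not reproduced; the step size mu and regularization delta here are fixed assumptions.

```python
import numpy as np

def apa_update(w, X, d, mu=0.5, delta=1e-3):
    """One affine projection step.

    w: current weights (N,), X: last K input vectors as rows (K, N),
    d: last K desired samples (K,). Returns updated weights.
    """
    e = d - X @ w                                  # a priori errors
    # w += mu * X^T (X X^T + delta I)^(-1) e  (regularized projection)
    return w + mu * X.T @ np.linalg.solve(X @ X.T + delta * np.eye(len(d)), e)

# Tiny system-identification demo with a correlated (AR) input.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -0.5, 0.25, 0.0])
x = np.zeros(2000)
for n in range(1, 2000):                           # AR(1) input, a = 0.9
    x[n] = 0.9 * x[n - 1] + rng.standard_normal()
w, K, N = np.zeros(4), 4, 4
for n in range(N + K, 2000):
    X = np.array([x[n - k - N:n - k][::-1] for k in range(K)])
    d = X @ true_w + 0.01 * rng.standard_normal(K)
    w = apa_update(w, X, d)
print(np.round(w, 3))                              # close to true_w
```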
861 BER Performance of UWB Modulations through S-V Channel Model
Authors: Risanuri Hidayat
Abstract:
BER analysis of Impulse Radio Ultra Wideband (IR-UWB) pulse modulations over the S-V channel model is presented in this paper. The UWB pulse is a Gaussian monocycle modulated using Pulse Amplitude Modulation (PAM) and Pulse Position Modulation (PPM). The channel model is generated from a modified S-V model. The bit error rate (BER) is measured over several bit rates. The results show that both modulations are appropriate for LOS as well as NLOS channels, but PAM gives better performance in terms of bit rate and SNR. Moreover, given the speed standards set for UWB, communication at high bit rates is appropriate in the LOS channel.
Keywords: IR-UWB, S-V channel model, LOS, NLOS, PAM, PPM.
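A minimal Monte-Carlo BER check for antipodal 2-PAM over AWGN is sketched below as a baseline for the kind of measurement reported; the S-V multipath channel itself is not modelled here.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bits, snr_db = 200_000, 6.0
bits = rng.integers(0, 2, n_bits)
symbols = 2.0 * bits - 1.0                              # antipodal 2-PAM: {-1, +1}
noise_std = np.sqrt(1.0 / (2 * 10 ** (snr_db / 10)))    # Eb/N0 -> sigma
received = symbols + noise_std * rng.standard_normal(n_bits)
ber = np.mean((received > 0).astype(int) != bits)
# Theory: Q(sqrt(2 Eb/N0)); at 6 dB this is about 2.4e-3.
print(f"simulated BER at {snr_db} dB: {ber:.4f}")
```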
860 Determination of Water Pollution and Water Quality with Decision Trees
Authors: Çiğdem Bakır, Mecit Yüzkat
Abstract:
With the increasing emphasis on water quality worldwide, the search for new and intelligent monitoring systems, and the market for them, has expanded. The current method is the laboratory process, where samples are taken from bodies of water and tests are carried out in laboratories. This method is time-consuming, wasteful of manpower and uneconomical. To solve this problem, we used machine learning methods to detect water pollution in our study. We created decision trees with the Orange3 software and tried to determine all the factors that cause water pollution. An automatic prediction model based on water quality was developed, taking many model inputs such as water temperature, pH, transparency, conductivity, dissolved oxygen, and ammonia nitrogen. The proposed approach consists of three stages: preprocessing of the data, feature detection, and classification. We evaluated our study with different accuracy metrics and presented the results comparatively. In addition, we achieved approximately 98% accuracy with the decision tree.
Keywords: Decision tree, water quality, water pollution, machine learning.
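The study used Orange3; as a hedged equivalent, here is how the same decision-tree classification could be set up with scikit-learn on the feature set the abstract lists. The data are randomly generated stand-ins, so the printed accuracy says nothing about the paper's 98% result.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Stand-in data: columns = temperature, pH, transparency, conductivity,
# dissolved oxygen, ammonia nitrogen; label = polluted (1) / clean (0).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = (X[:, 5] - X[:, 4] + 0.3 * rng.normal(size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print(f"accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```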
859 A Subtractive Clustering Based Approach for Early Prediction of Fault Proneness in Software Modules
Authors: Ramandeep S. Sidhu, Sunil Khullar, Parvinder S. Sandhu, R. P. S. Bedi, Kiranbir Kaur
Abstract:
In this paper, a subtractive clustering based fuzzy inference system approach is used for the early detection of faults in function-oriented software systems. The approach has been tested with real defect datasets of the NASA software projects PC1 and CM1. Both the code-based model and the joined model (a combination of the requirement-based and code-based metrics) of the datasets are used for training and testing the proposed approach. The performance of the models is recorded in terms of accuracy, MAE and RMSE values, and is better in the case of the joined model. As evidenced by the results obtained, it can be concluded that clustering and fuzzy logic together provide a simple yet powerful means to model the early detection of faults in function-oriented software systems.
Keywords: Subtractive clustering, fuzzy inference system, fault proneness.
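For orientation, below is a simplified sketch of Chiu-style subtractive clustering, the technique the fuzzy inference system is built on: each point's potential is a sum of Gaussian kernels, the highest-potential point becomes a center, and nearby potentials are then suppressed. The radii and the single stopping threshold are assumptions; the full method uses a more elaborate accept/reject rule.

```python
import numpy as np

def subtractive_clustering(X, ra=0.5, rb=0.75, stop_ratio=0.25):
    """Simplified subtractive clustering; X is assumed roughly normalized."""
    alpha, beta = 4 / ra**2, 4 / rb**2
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
    P = np.exp(-alpha * sq).sum(1)                        # potential of each point
    centers, p_first = [], P.max()
    while P.max() > stop_ratio * p_first:
        k = int(P.argmax())
        centers.append(X[k])
        P = np.maximum(P - P[k] * np.exp(-beta * sq[k]), 0)  # suppress neighbors
    return np.array(centers)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.2, 0.05, (50, 2)), rng.normal(0.8, 0.05, (50, 2))])
print(subtractive_clustering(X))   # two centers, near (0.2, 0.2) and (0.8, 0.8)
```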
858 Variable Regularization Parameter Normalized Least Mean Square Adaptive Filter
Authors: Young-Seok Choi
Abstract:
We present a normalized LMS (NLMS) algorithm with robust regularization. Unlike conventional NLMS with a fixed regularization parameter, the proposed approach dynamically updates the regularization parameter. By exploiting a gradient descent direction, we derive a computationally efficient and robust update scheme for the regularization parameter. In simulations, we demonstrate that the proposed algorithm outperforms conventional NLMS algorithms in terms of convergence rate and misadjustment error.
Keywords: Regularization, normalized LMS, system identification, robustness.
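For reference, the conventional fixed-regularization NLMS update that the paper improves on looks like the sketch below; the paper's dynamic rule for the regularization parameter delta is not reproduced here.

```python
import numpy as np

def nlms_step(w, x, d, mu=1.0, delta=1e-2):
    """One NLMS update: w <- w + mu * e * x / (delta + ||x||^2)."""
    e = d - w @ x
    return w + mu * e * x / (delta + x @ x), e

# System identification of a 4-tap FIR filter.
rng = np.random.default_rng(0)
true_w = np.array([0.5, -1.0, 0.25, 0.1])
w = np.zeros(4)
for _ in range(5000):
    x = rng.standard_normal(4)
    d = true_w @ x + 0.01 * rng.standard_normal()
    w, _ = nlms_step(w, x, d)
print(np.round(w, 3))   # approaches true_w
```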
857 An Improved Method to Watermark Images Sensitive to Blocking Artifacts
Authors: Afzel Noore
Abstract:
A new digital watermarking technique for images that are sensitive to blocking artifacts is presented. Experimental results show that the proposed MDCT based approach produces highly imperceptible watermarked images and is robust to attacks such as compression, noise, filtering and geometric transformations. The proposed MDCT watermarking technique is applied to fingerprints for ensuring security. The face image and demographic text data of an individual are used as multiple watermarks. An AFIS system was used to quantitatively evaluate the matching performance of the MDCT-based watermarked fingerprint. The high fingerprint matching scores show that the MDCT approach is resilient to blocking artifacts. The quality of the extracted face and extracted text images was computed using two human visual system metrics and the results show that the image quality was high.
Keywords: Digital watermarking, data hiding, modified discrete cosine transformation (MDCT).
856 Video-On-Demand QoE Evaluation across Different Age-Groups and Its Significance for Network Capacity
Authors: Mujtaba Roshan, John A. Schormans
Abstract:
Quality of Experience (QoE) drives churn in the broadband networks industry, and good QoE plays a large part in the retention of customers. QoE is known to be affected by the Quality of Service (QoS) factors packet loss probability (PLP), delay and delay jitter caused by the network. Earlier results have shown that the relationship between these QoS factors and QoE is non-linear, and may vary from application to application. We use the network emulator Netem as the basis for experimentation, and evaluate how QoE varies as we change the emulated QoS metrics. Focusing on Video-on-Demand, we discovered that the reported QoE may differ widely for users of different age groups, and that the most demanding age group (the youngest) can require an order of magnitude lower PLP than the most widely studied age group to achieve the same QoE. We then used a bottleneck TCP model to evaluate the capacity cost of achieving an order of magnitude decrease in PLP, and found it to be (almost always) a 3-fold increase in the required link capacity.
Keywords: Quality of experience, quality of service, packet loss probability, network capacity.
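The reported 3-fold figure is consistent with a standard bottleneck TCP throughput model of the Mathis form T ≈ (MSS/RTT)·C/√p, under which a tenfold PLP reduction costs √10 ≈ 3.2 times the capacity. The sketch below illustrates this; the model choice is our assumption, not necessarily the authors' exact one.

```python
import math

def tcp_throughput(mss_bytes, rtt_s, plp, c=math.sqrt(3 / 2)):
    """Mathis-style TCP throughput model: T = (MSS/RTT) * C / sqrt(p)."""
    return mss_bytes / rtt_s * c / math.sqrt(plp)

base = tcp_throughput(1460, 0.05, 1e-2)
better = tcp_throughput(1460, 0.05, 1e-3)      # PLP reduced 10x
print(f"capacity ratio: {better / base:.2f}")  # ~3.16
```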
855 Union is Strength in Lossy Image Compression
Authors: Mario Mastriani
Abstract:
In this work, we present a comparison between different techniques of image compression. First, the image is divided into blocks which are organized according to a certain scan. Then, several compression techniques are applied, combined or alone. Such techniques are: wavelets (Haar's basis), the Karhunen-Loève Transform, etc. Simulations show that the combined versions are the best, with lower Mean Squared Error (MSE), higher Peak Signal to Noise Ratio (PSNR) and better image quality, even in the presence of noise.
Keywords: Haar's basis, image compression, Karhunen-Loève Transform, Morton's scan, row-rafter scan.
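The two quality metrics used in the comparison can be computed as in this short sketch, assuming 8-bit images; the arrays here are random stand-ins for an original/compressed pair.

```python
import numpy as np

def mse(a, b):
    return np.mean((a.astype(float) - b.astype(float)) ** 2)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * np.log10(peak ** 2 / m)

original = np.random.default_rng(0).integers(0, 256, (64, 64))
compressed = np.clip(original + np.random.default_rng(1).integers(-5, 6, (64, 64)), 0, 255)
print(f"MSE = {mse(original, compressed):.2f}, PSNR = {psnr(original, compressed):.2f} dB")
```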
854 Energy Efficient Cooperative Caching in WSN
Authors: Narottam Chand
Abstract:
Wireless sensor networks (WSNs) consist of a number of tiny, low-cost and low-power sensor nodes that monitor some physical phenomenon. The major limitation in these networks is the use of non-rechargeable batteries with a limited power supply, and the main cause of energy consumption is the communication subsystem. This paper presents an energy-efficient Cluster Cooperative Caching at Sensor (C3S) scheme based upon grid-type clustering. Sensor nodes belonging to the same cluster/grid form a cooperative cache system for a node, since the cost of communicating with them is low both in terms of energy consumption and message exchanges. The proposed scheme uses cache admission control and a utility-based data replacement policy to ensure that more useful data is retained in the local cache of a node. Simulation results demonstrate that the C3S scheme performs better in various performance metrics than NICoCa, an existing cooperative caching protocol for WSNs.
Keywords: Cooperative caching, cache replacement, admission control, WSN, clustering.
853 Bootstrap and MLS Methods-based Individual Bioequivalence Assessment
Authors: Kongsheng Zhang, Li Ge
Abstract:
Assessing bioequivalence is a one-sided hypothesis testing process. Bootstrap and modified large-sample (MLS) methods are considered for studying individual bioequivalence (IBE); the type I error and power of the hypothesis tests are simulated and compared with FDA (2001). The results show that the modified large-sample method is equivalent to the method of FDA (2001).
Keywords: Individual bioequivalence, bootstrap, Bayesian bootstrap, modified large-sample.
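As a generic illustration of the resampling machinery involved, here is a minimal nonparametric bootstrap for a one-sided confidence bound on a mean difference. The actual IBE criterion aggregates within- and between-subject variance terms and is considerably more involved; the data below are hypothetical.

```python
import numpy as np

def bootstrap_upper_bound(x, y, n_boot=10_000, level=0.95, seed=0):
    """One-sided bootstrap upper confidence bound for mean(x) - mean(y)."""
    rng = np.random.default_rng(seed)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        xs = rng.choice(x, size=len(x), replace=True)
        ys = rng.choice(y, size=len(y), replace=True)
        diffs[b] = xs.mean() - ys.mean()
    return np.quantile(diffs, level)

test = np.array([101.2, 98.7, 103.4, 99.0, 100.8, 97.6])
reference = np.array([100.1, 99.5, 102.2, 98.3, 101.0, 99.9])
print(f"95% upper bound: {bootstrap_upper_bound(test, reference):.2f}")
```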
852 Burstiness Reduction of a Doubly Stochastic AR-Modeled Uniform Activity VBR Video
Authors: J. P. Dubois
Abstract:
Stochastic modeling of network traffic is an area of significant research activity for current and future broadband communication networks. Multimedia traffic is statistically characterized by a bursty variable bit rate (VBR) profile. In this paper, we develop an improved model for uniform activity level video sources in ATM, using a doubly stochastic autoregressive model driven by an underlying spatial point process. We then examine a number of burstiness metrics such as the peak-to-average ratio (PAR), the temporal autocovariance function (ACF) and the traffic measurements histogram, and find that the first of these is most suitable for capturing the burstiness of single-scene video traffic. In the last phase of this work, we analyse the statistical multiplexing of several constant-scene video sources. This proved, as expected, to be advantageous with respect to reducing the burstiness of the traffic, as long as the sources are statistically independent. We observed that the burstiness diminished rapidly, with the largest gain occurring when only around 5 sources are multiplexed. The novel model used in this paper for characterizing uniform activity video was thus found to be accurate.
Keywords: AR, ATM, burstiness, doubly stochastic, statistical multiplexing.
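The peak-to-average ratio favoured by the study is straightforward to compute from a bit-rate trace, as in this sketch; the AR(1) traces are illustrative stand-ins for VBR video sources, and the aggregate illustrates the multiplexing gain the abstract mentions.

```python
import numpy as np

def ar1_trace(n, rng, a=0.9, mean=5.0, sigma=0.3):
    """Synthetic AR(1) bit-rate trace, a stand-in for one VBR video source."""
    r = np.empty(n)
    r[0] = mean
    for k in range(1, n):
        r[k] = mean + a * (r[k - 1] - mean) + sigma * rng.standard_normal()
    return r

def par(rates):
    """Peak-to-average ratio of a rate trace."""
    return rates.max() / rates.mean()

rng = np.random.default_rng(0)
single = ar1_trace(10_000, rng)
aggregate = sum(ar1_trace(10_000, rng) for _ in range(5))
print(f"single-source PAR:      {par(single):.3f}")
print(f"5-source aggregate PAR: {par(aggregate):.3f}")   # smaller: multiplexing gain
```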
851 Fusion Filters Weighted by Scalars and Matrices for Linear Systems
Authors: Seok Hyoung Lee, Vladimir Shin
Abstract:
Optimal mean-square fusion formulas with scalar and matrix weights are presented, and the relationship between them is established. The fusion formulas are compared on the continuous-time filtering problem. The basic differential equation for the cross-covariance of the local errors, the key quantity for distributed fusion, is derived. It is shown that the fusion filters are effective for multi-sensor systems containing different types of sensors. An example demonstrating the reasonably good accuracy of the proposed filters is given.
Keywords: Kalman filtering, fusion formula, multi-sensor, mean-square error.
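A minimal sketch of matrix-weighted fusion of two unbiased local estimates with uncorrelated errors is shown below. In the paper's setting the local errors are correlated and the weights also involve the cross-covariance, which is omitted here for brevity.

```python
import numpy as np

def fuse_two(x1, P1, x2, P2):
    """Matrix-weighted fusion of two unbiased estimates with
    uncorrelated errors; the weights minimize the fused MSE."""
    P1_inv, P2_inv = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(P1_inv + P2_inv)        # fused covariance
    x = P @ (P1_inv @ x1 + P2_inv @ x2)       # fused estimate
    return x, P

x1, P1 = np.array([1.0, 2.0]), np.diag([0.5, 2.0])
x2, P2 = np.array([1.2, 1.8]), np.diag([2.0, 0.5])
x, P = fuse_two(x1, P1, x2, P2)
print(x)           # each component leans toward the more certain sensor
print(np.diag(P))  # fused variances are smaller than either input's
```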
850 A Black-box Approach for Response Quality Evaluation of Conversational Agent Systems
Authors: Ong Sing Goh, C. Ardil, Wilson Wong, Chun Che Fung
Abstract:
The evaluation of conversational agent or chatterbot question answering systems is a major research area that needs much attention. Before the rise of domain-oriented conversational agents based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when chatterbots began to become more domain-specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and, at the same time, to achieve high-quality responses. This paper discusses the inappropriateness of the existing measures for response quality evaluation, and a call for new standard measures and related considerations is brought forward. As a short-term solution for evaluating the response quality of conversational agents, and to demonstrate the challenges in evaluating systems of different natures, this research proposes a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems: AnswerBus, START and AINI.
Keywords: Evaluation, conversational agents, response quality, chatterbots.
849 Study of Syntactic Errors for Deep Parsing at Machine Translation
Authors: Yukiko Sasaki Alam, Shahid Alam
Abstract:
Syntactic parsing is vital for semantic treatment by many applications related to natural language processing (NLP), because form and content coincide in many cases. However, it has not yet reached reliable levels of performance. By manually examining and analyzing individual machine translation output errors that involve syntax as well as semantics, this study attempts to discover what is required for improving syntactic and semantic parsing.
Keywords: Machine translation, error analysis, syntactic errors, knowledge required for parsing.
848 The Evaluation and Application of FMEA in Sepahan Oil Co
Authors: Hekmatpanah, M., Fadavinia, A.
Abstract:
Failure modes and effects analysis (FMEA) is an effective technique for preventing potential problems and identifying the actions needed to remove the causes of errors. Oil producing companies play a critical role in the oil industry of Iran as a developing country, and among them Sepahan Oil Co. makes a considerable contribution. The aim of this research is to show how FMEA can be applied to improve the quality of products at Sepahan Oil Co. For this purpose, the four-liter production line of the company was selected for investigation. The findings imply that the application of FMEA has reduced scrap from 50000 ppm to 5000 ppm and has resulted in a 0.92 percent decrease in oil waste.
Keywords: FMEA, Iran, Sepahan Oil Co., canning, waste, scrap.
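FMEA prioritises failure modes by a risk priority number, RPN = severity × occurrence × detection, each typically scored from 1 to 10. The sketch below shows the calculation on hypothetical canning-line failure modes; the entries are illustrative, not the company's data.

```python
# Hypothetical FMEA worksheet rows: (failure mode, severity, occurrence, detection)
rows = [
    ("lid seam leak",   8, 4, 3),
    ("underfilled can", 5, 6, 2),
    ("label misprint",  2, 5, 4),
]

# Risk priority number: RPN = S * O * D (each rated 1-10); highest first.
for mode, s, o, d in sorted(rows, key=lambda r: -(r[1] * r[2] * r[3])):
    print(f"{mode:<16} RPN = {s * o * d}")
```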
847 A Characterized and Optimized Approach for End-to-End Delay Constrained QoS Routing
Authors: P. S. Prakash, S. Selvan
Abstract:
QoS routing aims to find paths between senders and receivers that satisfy the QoS requirements of the application while efficiently using the network resources; the underlying routing algorithm must be able to find low-cost paths that satisfy given QoS constraints. The problem of finding least-cost routing is known to be NP-hard or NP-complete, and some algorithms have been proposed to find a near-optimal solution. But these heuristics or algorithms either impose relationships among the link metrics to reduce the complexity of the problem, which may limit the general applicability of the heuristic, or are too costly in terms of execution time to be applicable to large networks. In this paper, we analyzed two algorithms, namely Characterized Delay Constrained Routing (CDCR) and Optimized Delay Constrained Routing (ODCR). The CDCR algorithm deals with an approach for delay-constrained routing that captures the trade-off between cost minimization and the risk level regarding the delay constraint. ODCR uses an adaptive path weight function together with an additional constraint imposed on the path cost to restrict the search space, and hence finds a near-optimal solution in much less time.
Keywords: QoS, delay, routing, optimization.
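As a baseline for the problem both algorithms address, here is a small exact sketch for the delay-constrained least-cost path, implemented as a label-setting search over (node, accumulated delay) states. It is exponential in the worst case, which is why heuristics such as CDCR and ODCR exist; the graph is a hypothetical example.

```python
import heapq

def dcl_path(graph, src, dst, max_delay):
    """Least-cost src->dst path with total delay <= max_delay.

    graph: {u: [(v, cost, delay), ...]} with integer delays.
    """
    best = {}                                  # (node, delay) -> cost
    heap = [(0, 0, src, [src])]                # (cost, delay, node, path)
    while heap:
        cost, delay, u, path = heapq.heappop(heap)
        if u == dst:
            return cost, path                  # first pop of dst is cheapest
        if best.get((u, delay), float("inf")) < cost:
            continue                           # stale heap entry
        for v, c, d in graph.get(u, []):
            nc, nd = cost + c, delay + d
            if nd <= max_delay and nc < best.get((v, nd), float("inf")):
                best[(v, nd)] = nc
                heapq.heappush(heap, (nc, nd, v, path + [v]))
    return None

# Hypothetical network: cheap path A-B-D is slow; A-C-D is fast but costly.
g = {"A": [("B", 1, 5), ("C", 4, 1)], "B": [("D", 1, 5)], "C": [("D", 4, 1)]}
print(dcl_path(g, "A", "D", max_delay=10))  # (2, ['A', 'B', 'D'])
print(dcl_path(g, "A", "D", max_delay=4))   # (8, ['A', 'C', 'D'])
```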
846 Simulating Action Potential as a Linear Combination of Gating Dynamics
Authors: S. H. Sabzpoushan
Abstract:
In this research, we show that the dynamics of an action potential in a cell can be modeled as a linear combination of the dynamics of the gating state variables. It is shown that the modeling error is negligible. Our findings can be used for simplifying cell models and reducing the computational burden, i.e., they are useful for simulating action potential propagation in large-scale computations like tissue modeling. We have verified our finding with the use of several cell models.
Keywords: Linear model, action potential, gating dynamics.
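A minimal sketch of the idea: fit the membrane potential trace as a linear combination of gating-variable traces by least squares and check the residual. The traces below are synthetic stand-ins; the cell models verified in the paper are more detailed.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 50, 2000)                     # ms

# Synthetic gating-variable traces (stand-ins for m, h, n of a cell model).
m = 1 / (1 + np.exp(-(t - 10)))
h = np.exp(-t / 20)
n = 1 / (1 + np.exp(-(t - 15) / 3))

# Toy "action potential" that is (nearly) a linear mix of the gates.
v = 80 * m - 60 * n - 20 * h + 0.5 * rng.standard_normal(t.size)

G = np.column_stack([m, h, n, np.ones_like(t)])  # gates + offset
coef, *_ = np.linalg.lstsq(G, v, rcond=None)
residual = v - G @ coef
print(np.round(coef, 2))                         # ~[80, -20, -60, 0]
print(f"RMS error: {residual.std():.3f} mV")     # small modeling error
```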