Search results for: quality of converters "Biometrics - the code"
3571 A Novel Approach to Iris Localization for Iris Biometric Processing
Authors: Somnath Dey, Debasis Samanta
Abstract:
Iris-based biometric systems are gaining importance in several applications. However, processing iris biometrics is a challenging and time-consuming task. Detecting the iris region in an eye image poses a number of challenges, such as inferior image quality and occlusion by eyelids and eyelashes. Because of these problems, no iris-based biometric authentication system achieves a 100% accuracy rate. Further, iris detection is a computationally intensive part of overall iris biometric processing. In this paper, we address these two problems and propose a technique to localize the iris efficiently and accurately. For pupil boundary detection we propose scaling and a color level transform followed by thresholding and the extraction of pupil boundary points; for iris boundary detection we propose dilation, thresholding, vertical edge detection, and removal of unnecessary edges present in the eye image. Scaling reduces the search space significantly, and the intensity level transform is helpful for image thresholding. Experimental results show that our approach is comparable with existing approaches: it detects the iris region with 95-99% accuracy, as substantiated by our experiments on the CASIA Ver-3.0, ICE 2005, UBIRIS, Bath and MMU iris image databases.
Keywords: Iris recognition, iris localization, biometrics, image processing.
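For illustration only, here is a minimal pupil-localization sketch in the spirit of the thresholding-based pipeline described above (scaling, intensity thresholding, extraction of pupil boundary points). It is a generic OpenCV example under assumed parameters, not the authors' exact method.

```python
# Minimal pupil-boundary sketch (not the authors' exact pipeline): scale the
# eye image down, threshold the dark pupil region, and fit a circle to the
# largest dark blob. Threshold value and scale factor are assumptions.
import cv2
import numpy as np

def locate_pupil(eye_bgr, scale=0.5, thresh=60):
    small = cv2.resize(eye_bgr, None, fx=scale, fy=scale)      # scaling shrinks the search space
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)  # pupil is the darkest region
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)                 # boundary points of the pupil blob
    (cx, cy), r = cv2.minEnclosingCircle(pupil)
    return cx / scale, cy / scale, r / scale                   # map back to full resolution

# Usage: cx, cy, r = locate_pupil(cv2.imread("eye.png"))
```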
3570 Evaluating 8D Reports Using Text-Mining
Authors: Benjamin Kuester, Bjoern Eilert, Malte Stonis, Ludger Overmeyer
Abstract:
Increasing quality requirements make reliable and effective quality management indispensable. This includes complaint handling, in which the 8D method is widely used. As the written documentation of the 8D method, the 8D report is one of the key quality documents: it secures quality standards internally and acts as a communication medium to the customer. In practice, however, 8D reports are often faulty and of poor quality, and there is no quality control of 8D reports today. This paper describes the use of natural language processing for the automated evaluation of 8D reports. Based on semantic analysis and text-mining algorithms, the presented system is able to uncover content-related and formal quality deficiencies and thus increases the quality of complaint processing in the long term.
Keywords: 8D report, complaint management, evaluation system, text-mining.
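As a toy illustration of automated formal checks on 8D reports (not the semantic system described above), the sketch below flags missing or empty discipline sections D1-D8; the heading format is an assumption.

```python
# Toy formal-quality check for an 8D report: verify that each discipline
# section D1..D8 is present and non-empty. Heading format "D<n>:" is an assumption.
import re

def check_8d_report(text: str) -> dict:
    findings = {}
    for n in range(1, 9):
        match = re.search(rf"D{n}:(.*?)(?=D{n+1}:|$)", text, re.S)
        body = match.group(1).strip() if match else ""
        findings[f"D{n}"] = "missing" if match is None else ("empty" if not body else "ok")
    return findings

report = "D1: team list\nD2: problem description\nD3:\nD4: root cause ..."
print(check_8d_report(report))   # D3 is flagged as empty, D5-D8 as missing
```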
3569 Video Quality Assessment Methods: A Bird’s-Eye View
Authors: P. M. Arun Kumar, S. Chandramathi
Abstract:
The proliferation of multimedia technology and services in today’s world provides ample research scope in the frontiers of visual signal processing. The widespread usage of video-based applications in heterogeneous environments calls for viable methods of Video Quality Assessment (VQA). The evaluation of video quality not only depends on high QoS requirements but also emphasizes the need for the novel term ‘QoE’ (Quality of Experience), which treats video quality as user-centric. This paper discusses the two principal classes of video quality assessment methods, namely subjective and objective assessment. The evolution of various video quality metrics, their classification models and applications are reviewed in this work. Mean Opinion Score (MOS) based subjective measurements and algorithm-based objective metrics are discussed and their challenges are outlined. Further, this paper explores the recent progress of VQA in emerging technologies such as mobile video and 3D video.
Keywords: 3D-Video, no reference metric, quality of experience, video quality assessment, video quality metrics.
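As an illustration of the algorithm-based objective metrics mentioned above, a minimal full-reference PSNR computation over frames is sketched below; PSNR is only one of the many metrics the survey covers, and the synthetic 8-bit frames are an assumption.

```python
# Minimal full-reference objective metric: mean PSNR over video frames.
# Frames are assumed to be 8-bit numpy arrays of identical shape.
import numpy as np

def psnr(ref: np.ndarray, dist: np.ndarray, peak: float = 255.0) -> float:
    mse = np.mean((ref.astype(np.float64) - dist.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def mean_video_psnr(ref_frames, dist_frames) -> float:
    return float(np.mean([psnr(r, d) for r, d in zip(ref_frames, dist_frames)]))

# Usage with synthetic frames: the distorted copy adds mild Gaussian noise.
rng = np.random.default_rng(0)
ref = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(5)]
dist = [np.clip(f + rng.normal(0, 2, f.shape), 0, 255).astype(np.uint8) for f in ref]
print(round(mean_video_psnr(ref, dist), 2), "dB")
```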
3568 An Efficient Segmentation Method Based on Local Entropy Characteristics of Iris Biometrics
Authors: Ali Shojaee Bakhtiari, Ali Asghar Beheshti Shirazi, Amir Sepasi Zahmati
Abstract:
An efficient iris segmentation method based on analyzing the local entropy characteristics of the iris image is proposed in this paper, and the strengths and weaknesses of the method are analyzed for practical purposes. The method is particularly strong in providing designers with an adequate degree of freedom in choosing the proper sections of the iris for their application purposes.
Keywords: Iris segmentation, entropy, biocryptosystem, biometric identification.
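A minimal sketch of a local-entropy map, the quantity this segmentation method analyzes, is given below; it uses simple non-overlapping blocks and an assumed threshold, and is not the authors' algorithm.

```python
# Block-wise local entropy of a grayscale image: high-entropy blocks tend to
# coincide with the richly textured iris region. Block size and threshold are assumptions.
import numpy as np

def block_entropy(gray: np.ndarray, block: int = 8) -> np.ndarray:
    h, w = gray.shape
    out = np.zeros((h // block, w // block))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = gray[i * block:(i + 1) * block, j * block:(j + 1) * block]
            hist, _ = np.histogram(patch, bins=256, range=(0, 256))
            p = hist[hist > 0] / patch.size
            out[i, j] = -np.sum(p * np.log2(p))        # Shannon entropy of the block
    return out

rng = np.random.default_rng(1)
gray = rng.integers(0, 256, (128, 128)).astype(np.uint8)
ent = block_entropy(gray)
mask = ent > 5.0                                        # candidate "textured" blocks
print(ent.shape, int(mask.sum()), "high-entropy blocks")
```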
3567 Validation of Reverse Engineered Web Application Models
Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini
Abstract:
Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused attention on Web application design, development, analysis and testing by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. Traditional static source code analysis may be very difficult because of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutation techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation-based source code analysis applied to Web software to build application models. Mutation-generated models may contain more information than necessary, so we need a pruning mechanism.
Keywords: Validation, Dynamic Analysis, Mutation Analysis, Reverse Engineering, Web Applications.
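As a toy illustration of the mutation idea used here (generating variants of server-side code to drive dynamic analysis), the sketch below produces mutants of a code string by swapping relational operators; the operator set and the string-level approach are assumptions, not the WAAT implementation, which would work on real server-side code.

```python
# Toy mutant generation: produce one variant per swapped relational operator.
# Operating on raw source text is a simplification; real tools mutate the AST.
import re

MUTATIONS = {"==": "!=", "!=": "==", "<": ">=", ">": "<="}

def generate_mutants(source: str):
    pattern = re.compile("|".join(re.escape(op) for op in ("==", "!=", "<", ">")))
    for match in pattern.finditer(source):
        op = match.group(0)
        yield source[:match.start()] + MUTATIONS[op] + source[match.end():]

server_snippet = "if user.role == 'admin' and attempts < 3: grant()"
for mutant in generate_mutants(server_snippet):
    print(mutant)        # one mutant per mutated operator occurrence
```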
3566 New Approach for Constructing a Secure Biometric Database
Authors: A. Kebbeb, M. Mostefai, F. Benmerzoug, Y. Chahir
Abstract:
Multimodal biometric identification combines several biometric systems; the challenge of this combination is to reduce the limitations of systems based on a single modality while significantly improving performance. In this paper, we propose a new approach to the construction and protection of a multimodal biometric database dedicated to an identification system. We use topological watermarking to hide the relation between the face image and the registered descriptors extracted from the other modalities of the same person, for more secure user identification.
Keywords: Biometric databases, Multimodal biometrics, security authentication, Digital watermarking.
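For context only, the sketch below shows a generic least-significant-bit watermark that hides a record identifier inside a face image; it illustrates the general idea of binding modalities through hidden data but is not the topological watermarking scheme proposed above.

```python
# Generic LSB watermarking sketch: hide a 32-bit record identifier (linking the
# face image to descriptors of other modalities) in the image's least significant bits.
# This illustrates data hiding only, not the paper's topological scheme.
import numpy as np

def embed_id(img: np.ndarray, record_id: int) -> np.ndarray:
    bits = [(record_id >> i) & 1 for i in range(32)]
    out = img.copy().ravel()
    out[:32] = (out[:32] & 0xFE) | bits          # overwrite the LSB of the first 32 pixels
    return out.reshape(img.shape)

def extract_id(img: np.ndarray) -> int:
    bits = img.ravel()[:32] & 1
    return int(sum(int(b) << i for i, b in enumerate(bits)))

face = np.random.default_rng(2).integers(0, 256, (64, 64), dtype=np.uint8)
marked = embed_id(face, record_id=123456)
assert extract_id(marked) == 123456
```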
3565 Secure Distance Bounding Protocol on Ultra-WideBand Based Mapping Code
Authors: Jamel Miri, Bechir Nsiri, Ridha Bouallegue
Abstract:
Ultra-wideband impulse radio (IR-UWB) physical layer technology has seen great development during the last decade, which makes it a promising candidate for short-range wireless communications, as it brings considerable benefits in terms of connectivity and mobility. However, like all wireless communications, it suffers from security vulnerabilities because of the open nature of the radio channel. To face these attacks, distance bounding protocols are the most popular countermeasures. In this paper, we present a protocol based on distance bounding to thwart the most popular attacks: distance fraud, mafia fraud and terrorist fraud. We study how to adapt the most secure distance bounding protocols to the mapping code of time-hopping ultra-wideband (TH-UWB) radios. Indeed, to improve the security of communication in TH-UWB, we combine the modified protocol with impulse radio ultra-wideband technology. The security and the respective merits of the protocols are analyzed.
Keywords: Distance bounding, mapping code ultra-wideband, terrorist fraud.
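To make the rapid-bit-exchange idea behind distance bounding concrete, the sketch below simulates timed challenge-response rounds against a round-trip-time bound; the timing threshold, round count and simplified response rule are illustrative assumptions, not the proposed protocol.

```python
# Conceptual distance-bounding simulation: the verifier accepts only if every
# challenge is answered correctly within a round-trip-time bound.
# The 30 ns threshold (~4.5 m at the speed of light) and 32 rounds are assumptions;
# a real protocol mixes each challenge with a pre-committed secret.
import random

C = 3e8                       # propagation speed, m/s
T_MAX = 30e-9                 # accepted round-trip time, s
ROUNDS = 32

def run_protocol(distance_m: float, prover_knows_key: bool = True) -> bool:
    for _ in range(ROUNDS):
        challenge = random.getrandbits(1)
        rtt = 2 * distance_m / C                          # idealized channel delay
        response = challenge if prover_knows_key else random.getrandbits(1)
        if rtt > T_MAX or response != challenge:          # too far away or wrong answer
            return False
    return True

print(run_protocol(3.0))                            # legitimate, nearby prover -> True
print(run_protocol(100.0))                          # distance fraud attempt -> False
print(run_protocol(3.0, prover_knows_key=False))    # guessing attacker -> almost surely False
```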
3564 Reduction of Multiple User Interference for Optical CDMA Systems Using Successive Interference Cancellation Scheme
Authors: Tawfig Eltaif, Hesham A. Bakarman, N. Alsowaidi, M. R. Mokhtar, Malek Harbawi
Abstract:
Multiple user interference (MUI) is the primary problem in optical code-division multiple access (OCDMA); it results from the overlapping among users. In this article we aim to mitigate this problem by studying an interference cancellation scheme called successive interference cancellation (SIC). This scheme is tested on two different detection schemes, spectral amplitude coding (SAC) and direct detection systems (DS), using partial modified prime (PMP) sequences as the signature codes. It was found that the SIC scheme based on both the SAC and DS methods has the potential to suppress intensity noise, that is, to mitigate MUI noise. Furthermore, the SIC/DS scheme showed a much lower bit error rate (BER) than the SIC/SAC scheme for different magnitudes of effective power. Hence, many more users can be supported by the SIC/DS receiver system.
Keywords: Multiple User Interference (MUI), Optical Code-Division Multiple Access (OCDMA), Partial Modified Prime Code (PMP), Spectral Amplitude Coding (SAC), Successive Interference Cancellation (SIC).
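A minimal baseband sketch of successive interference cancellation follows: the receiver repeatedly detects the strongest user, regenerates its contribution and subtracts it. The random ±1 codes and the synchronous, noiseless channel are simplifying assumptions; the paper itself uses partial modified prime codes with SAC and DS detection.

```python
# Successive interference cancellation (SIC), baseband illustration:
# detect the strongest user, regenerate its spread signal, subtract, repeat.
# Random +/-1 signature codes and a synchronous noiseless channel are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_users, code_len = 4, 64
codes = rng.choice([-1.0, 1.0], size=(n_users, code_len))
data = rng.choice([-1.0, 1.0], size=n_users)
powers = np.array([4.0, 2.0, 1.0, 0.5])                 # unequal received powers
received = (np.sqrt(powers)[:, None] * data[:, None] * codes).sum(axis=0)

residual = received.copy()
decisions = np.zeros(n_users)
for _ in range(n_users):
    corr = residual @ codes.T / code_len                # despread against every code
    k = int(np.argmax(np.abs(corr)))                    # strongest remaining user
    decisions[k] = np.sign(corr[k])
    residual -= corr[k] * codes[k]                      # cancel its regenerated signal

print("bit errors:", int(np.sum(decisions != data)))
```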
3563 High-Efficiency Comparator for Low-Power Application
Authors: M. Yousefi, N. Nasirzadeh
Abstract:
In this paper, a dynamic comparator structure employing two power-reduction methods, with applications in low-power high-speed analog-to-digital converters, is presented. The proposed comparator has low power consumption thanks to these methods and offers offset adjustment. The comparator consumes 14.3 μW at 100 MHz, which is equal to 11.8 fJ. The comparator has been designed and simulated in 180 nm CMOS; the layout occupies 210 μm².
Keywords: Comparator, low power, efficiency.
3562 New Approach to Spectral Analysis of High Bit Rate PCM Signals
Authors: J. P. Dubois
Abstract:
Pulse code modulation is a widely used technique in digital communication with significant impact on existing modern and proposed future communication technologies. Its widespread utilization is due to its simplicity and attractive spectral characteristics. In this paper, we present a new approach to the spectral analysis of PCM signals using Riemann-Stieltjes integrals, which is very accurate for high bit rates. This approach can serve as a model for similar spectral analysis of other competing modulation schemes.
Keywords: Coding, discrete Fourier, power spectral density, pulse code modulation, Riemann-Stieltjes integrals.
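As a numerical counterpart to the analytical approach above, the short sketch below estimates the power spectral density of a random binary NRZ PCM waveform with Welch's method; the bit rate, sampling rate and rectangular pulse shape are assumptions, and the result exhibits the familiar sinc-squared envelope with its first null at the bit rate.

```python
# Estimate the PSD of a random binary NRZ (rectangular-pulse) PCM waveform.
# Bit rate, oversampling factor and signal length are illustrative assumptions;
# the estimated spectrum follows the classical sinc^2 envelope with nulls at
# multiples of the bit rate.
import numpy as np
from scipy.signal import welch

bit_rate = 1e6                 # 1 Mbit/s
samples_per_bit = 16
fs = bit_rate * samples_per_bit
bits = np.random.default_rng(4).integers(0, 2, 100_000)
waveform = np.repeat(2.0 * bits - 1.0, samples_per_bit)   # map {0,1} -> {-1,+1}, NRZ pulses

freqs, psd = welch(waveform, fs=fs, nperseg=4096)
mask = freqs < 1.5 * bit_rate                              # search below the second null
first_null = freqs[np.argmin(psd[mask])]
print(f"first spectral null near {first_null/1e6:.2f} MHz (expected ~{bit_rate/1e6:.2f} MHz)")
```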
3561 The Quality of Accounting Information of Private Companies in the Czech Republic
Authors: Kateřina Struhařová
Abstract:
The paper gives evidence on the quality of accounting information of Czech private companies. In general, private companies in the Czech Republic do not see the benefits of providing high-quality accounting information. Based on research into the financial statements of entrepreneurs and companies in the Zlin region, it was confirmed that the quality of accounting information differs among private entities and that the major determinants of accounting information quality are whether the financial statements are audited and the size of the entity. Foreign shareholders and lenders also have some impact on accounting information quality.
Keywords: Accounting information quality, Financial Statements, Czech Republic.
3560 A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection
Authors: Niloofar Yousefi, Marie Alaghband, Ivan Garibay
Abstract:
With the increase in credit card usage, the volume of credit card misuse has also increased significantly, which may cause appreciable financial losses for both credit card holders and the financial organizations issuing credit cards. As a result, financial organizations are working hard on developing and deploying credit card fraud detection methods, in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible to protect themselves and their customers. Compounding the complex nature of such adverse strategies, credit card fraudulent activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified and, as a consequence, credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on some static physical characteristics, such as what the user knows (knowledge-based methods) or what he/she has access to (object-based methods). In the second part of our survey, we review more advanced techniques of user authentication, which use behavioral biometrics to identify an individual based on his/her unique behavior while he/she is interacting with his/her electronic devices. These approaches rely on how people behave (instead of what they do), which cannot be easily forged. By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable and scalable models of credit card fraud detection.
Keywords: credit card fraud detection, user authentication, behavioral biometrics, machine learning, literature survey
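As a minimal illustration of the class-imbalance issue highlighted above, the sketch below trains a weighted logistic regression on synthetic transaction features; the data, features and model choice are assumptions and only indicate how the rarity of fraud is commonly handled.

```python
# Minimal imbalanced-classification sketch for fraud detection: synthetic
# "transaction" features with ~1% fraud, handled via class weighting.
# Data, features and the logistic-regression model are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(5)
n = 20_000
X = rng.normal(size=(n, 4))                       # e.g. amount, hour, merchant risk, velocity
y = (rng.random(n) < 0.01).astype(int)            # ~1% fraudulent transactions
X[y == 1] += 1.5                                  # make fraud slightly separable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```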
3559 Beyond Taguchi’s Concept of the Quality Loss Function
Authors: Atul Dev, Pankaj Jha
Abstract:
Dr. Genichi Taguchi looked at quality in a broader sense and gave an excellent definition of quality in terms of loss to society. However, the scope of this definition is limited to the losses imparted by a poor-quality product to the customer only, considered over the useful life of the product; further, in certain situations this loss can even be zero. In this paper, it is proposed that the scope of product quality be further enhanced by considering the losses imparted by a poor-quality product to society at large, due to associated environmental and safety-related factors, over the complete life cycle of the product. Moreover, though these losses can be further minimized with the use of techno-safety interventions, the net losses to society can never be made zero. This paper proposes an entirely new approach towards defining product quality and is based on Taguchi’s definition of quality.
Keywords: Existing concept, goal post philosophy, life cycle, proposed concept, quality loss function.
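For reference, the classical quadratic quality loss function that the paper takes as its starting point is L(y) = k(y - m)²; a one-line implementation with assumed parameter values is sketched below.

```python
# Classical (nominal-is-best) Taguchi quality loss function: L(y) = k * (y - m)^2,
# where m is the target value and k is a cost coefficient. Example values are assumptions.
def taguchi_loss(y: float, target: float, k: float) -> float:
    return k * (y - target) ** 2

# k chosen so that a deviation equal to the tolerance (0.5 mm) costs $20 per unit.
tolerance, cost_at_tolerance = 0.5, 20.0
k = cost_at_tolerance / tolerance ** 2
print(taguchi_loss(10.3, target=10.0, k=k))   # $7.2 loss for a 0.3 mm deviation
```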
3558 Learners’ Perceptions of Tertiary Level Teachers’ Code Switching: A Vietnamese Perspective
Authors: Hoa Pham
Abstract:
The literature on language teaching and second language acquisition has been largely driven by monolingual ideology with a common assumption that a second language (L2) is best taught and learned in the L2 only. The current study challenges this assumption by reporting learners' positive perceptions of tertiary level teachers' code switching practices in Vietnam. The findings of this study contribute to our understanding of code switching practices in language classrooms from a learners' perspective. Data were collected from student participants who were working towards a Bachelor degree in English within the English for Business Communication stream through the use of focus group interviews. The literature has documented that this method of interviewing has a number of distinct advantages over individual student interviews. For instance, group interactions generated by focus groups create a more natural environment than that of an individual interview because they include a range of communicative processes in which each individual may influence or be influenced by others - as they are in their real life. The process of interaction provides the opportunity to obtain the meanings and answers to a problem that are "socially constructed rather than individually created" leading to the capture of real-life data. The distinct feature of group interaction offered by this technique makes it a powerful means of obtaining deeper and richer data than those from individual interviews. The data generated through this study were analysed using a constant comparative approach. Overall, the students expressed positive views of this practice indicating that it is a useful teaching strategy. Teacher code switching was seen as a learning resource and a source supporting language output. This practice was perceived to promote student comprehension and to aid the learning of content and target language knowledge. This practice was also believed to scaffold the students' language production in different contexts. However, the students indicated their preference for teacher code switching to be constrained, as extensive use was believed to negatively impact on their L2 learning and trigger cognitive reliance on the L1 for L2 learning. The students also perceived that when the L1 was used to a great extent, their ability to develop as autonomous learners was negatively impacted. This study found that teacher code switching was supported in certain contexts by learners, thus suggesting that there is a need for the widespread assumption about the monolingual teaching approach to be re-considered.
Keywords: Code switching, L1 use, L2 teaching, Learners’ perception.
3557 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are created using source code, processed metrics from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data are available. We analyze seven datasets from the NASA Metrics Data Program which offer design as well as code metrics. Overall, the cross-project results are comparable to learning from within-company data.
Keywords: Software Metrics, Fault prediction, Cross project, Within project.
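A minimal sketch of the cross-project setup with Naïve Bayes follows: a model fitted on one project's design metrics is applied to another project. The synthetic metric values are assumptions standing in for the NASA MDP data.

```python
# Cross-project fault prediction sketch: train Gaussian Naive Bayes on design
# metrics of project A, evaluate on project B. Synthetic metrics stand in for
# the NASA MDP datasets used in the paper.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import recall_score, precision_score

def make_project(seed, n_modules=500, fault_rate=0.2):
    rng = np.random.default_rng(seed)
    y = (rng.random(n_modules) < fault_rate).astype(int)
    X = rng.normal(size=(n_modules, 3))            # e.g. fan-in, fan-out, branch count (design-level)
    X[y == 1] += 1.0                               # faulty modules have higher metric values on average
    return X, y

X_a, y_a = make_project(seed=1)                    # project with historical fault data
X_b, y_b = make_project(seed=2)                    # new project without fault data

pred = GaussianNB().fit(X_a, y_a).predict(X_b)
print("recall:", round(recall_score(y_b, pred), 2),
      "precision:", round(precision_score(y_b, pred), 2))
```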
3556 Quality Monitoring and Dynamic Pricing in Cold Chain Management
Authors: Myo Min Aung, Yoon Seok Chang, Woo Ram Kim
Abstract:
This paper presents a cold chain monitoring system which focuses on the assessment of quality and dynamic pricing information about food in the cold chain. A cold chain is composed of many actors and stages; however, it can be seen as a single entity, since a breakdown in temperature control at any stage can impact the final quality of the product. In a cold chain, the shelf life, quality and safety of perishable food throughout the supply chain are greatly impacted by environmental factors, especially temperature. In this paper, a prototype application is implemented to retrieve the time-temperature history, the current quality and a dynamically set price that reflects quality changes caused by temperature fluctuations, in real time.
Keywords: Cold chain, monitoring, quality, temperature, traceability.
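To illustrate how a time-temperature history can drive dynamic pricing, the sketch below depletes remaining shelf life at a temperature-dependent rate and prices the item in proportion to what remains; the kinetic constants and pricing rule are illustrative assumptions only, not the prototype described above.

```python
# Shelf-life depletion from a time-temperature history, with a price proportional
# to remaining quality. The Q10-style kinetics and pricing rule are assumptions.
SHELF_LIFE_DAYS_AT_4C = 10.0      # nominal shelf life at the reference temperature
Q10 = 2.0                         # spoilage roughly doubles per +10 degC (assumed)
BASE_PRICE = 5.0

def remaining_quality(history):
    """history: list of (hours, temperature_C) intervals from sensor logs."""
    used = 0.0
    for hours, temp_c in history:
        rate = Q10 ** ((temp_c - 4.0) / 10.0)          # relative spoilage rate at this temperature
        used += (hours / 24.0) * rate / SHELF_LIFE_DAYS_AT_4C
    return max(0.0, 1.0 - used)

def dynamic_price(history):
    return round(BASE_PRICE * remaining_quality(history), 2)

history = [(48, 4.0), (6, 12.0), (24, 4.0)]            # two days cold, a 6-hour excursion, one more day
print(remaining_quality(history), dynamic_price(history))
```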
3555 HaskellFL: A Tool for Detecting Logical Errors in Haskell
Authors: Vanessa Vasconcelos, Mariza A. S. Bigonha
Abstract:
Understanding and using the functional paradigm is a challenge for many programmers. Looking for logical errors in code may take a lot of a developer’s time when a program grows in size. In order to facilitate both processes, this paper presents HaskellFL, a tool that uses fault localization techniques to locate a logical error in Haskell code. The Haskell subset used in this work is sufficiently expressive for those studying Functional Programming to get immediate help debugging their code and to answer questions about key concepts associated with the functional paradigm. HaskellFL was tested against Functional Programming assignments submitted by students enrolled in the Functional Programming class at the Federal University of Minas Gerais and against exercises from the Exercism Haskell track that are publicly available on GitHub. This work also evaluated the effectiveness of two fault localization techniques, Tarantula and Ochiai, in the Haskell context. Furthermore, the EXAM score was chosen to evaluate the tool’s effectiveness, and the results showed that HaskellFL reduced the effort needed to locate an error in all tested scenarios. The results also showed that the Ochiai method was more effective than Tarantula.
Keywords: Debug, fault localization, functional programming, Haskell.
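The two suspiciousness formulas evaluated in the paper are standard; a compact, language-agnostic sketch is given below, where each code element is scored from the counts of passing and failing tests that execute it. The toy coverage matrix is an assumption.

```python
# Tarantula and Ochiai suspiciousness scores for fault localization.
# ef/ep: failing/passing tests that execute the element; tf/tp: totals.
import math

def tarantula(ef, ep, tf, tp):
    fail_ratio = ef / tf if tf else 0.0
    pass_ratio = ep / tp if tp else 0.0
    return fail_ratio / (fail_ratio + pass_ratio) if (fail_ratio + pass_ratio) else 0.0

def ochiai(ef, ep, tf, tp):
    denom = math.sqrt(tf * (ef + ep))
    return ef / denom if denom else 0.0

# Toy coverage data: element -> (executed by failing, executed by passing),
# with 2 failing and 8 passing tests in total (assumed numbers).
coverage = {"exprA": (2, 1), "exprB": (1, 6), "exprC": (0, 8)}
tf, tp = 2, 8
for elem, (ef, ep) in coverage.items():
    print(elem, round(tarantula(ef, ep, tf, tp), 3), round(ochiai(ef, ep, tf, tp), 3))
```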
3554 A Side-Peak Cancellation Scheme for CBOC Code Acquisition
Authors: Youngpo Lee, Seokho Yoon
Abstract:
In this paper, we propose a side-peak cancellation scheme for code acquisition of composite binary offset carrier (CBOC) signals. We first model the family of CBOC signals in a generic form and then propose a side-peak cancellation scheme that combines correlation functions between the divided sub-carrier and the received signals. Numerical results show that the proposed scheme removes the side peaks completely and, moreover, the resulting correlation function demonstrates better power ratio performance than the CBOC autocorrelation.
Keywords: CBOC, side-peak, ambiguity problem, synchronization.
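To illustrate the ambiguity problem the scheme targets, the sketch below compares the autocorrelation of a plain PRN code with that of the same code modulated by a square-wave sub-carrier (a BOC(1,1)-like signal), whose extra side peaks are what side-peak cancellation removes; it does not implement the proposed CBOC combining scheme, and the random codes are an assumption.

```python
# Ambiguity problem illustration: a binary-offset-carrier style signal has
# side peaks in its autocorrelation that a plain PRN code does not.
# Random +/-1 chips and 8 samples/chip stand in for a real CBOC signal.
import numpy as np

rng = np.random.default_rng(6)
chips = rng.choice([-1.0, 1.0], size=1023)
sps = 8
prn = np.repeat(chips, sps)                                   # rectangular chip pulses
subcarrier = np.tile(np.r_[np.ones(sps // 2), -np.ones(sps // 2)], chips.size)
boc = prn * subcarrier                                        # BOC(1,1)-like waveform

def circ_acf(x, lag):
    return float(np.dot(x, np.roll(x, lag))) / x.size

print(" lag    PRN  BOC-like")
for lag in (0, 2, 4, 6, 8):                                   # 0 .. one chip, in samples
    print(f"{lag:4d} {circ_acf(prn, lag):6.2f} {circ_acf(boc, lag):8.2f}")
# The BOC-like curve reaches a secondary extremum of magnitude ~0.5 at half a
# chip (lag 4): the "side peak" that causes ambiguous acquisition. The plain
# PRN correlation decays monotonically to zero over one chip.
```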
3553 A New Categorization of Image Quality Metrics Based On a Model of Human Quality Perception
Authors: Maria Grazia Albanesi, Riccardo Amadeo
Abstract:
This study presents a new model of the human image quality assessment process. The aim is to highlight the foundations of the image quality metrics proposed in the literature by identifying the cognitive/physiological or mathematical principles of their development and their relation with the actual human quality assessment process. The model allows the creation of a novel categorization of objective and subjective image quality metrics. Our work includes an overview of the most used or most effective objective metrics in the literature and, for each of them, we underline its main characteristics with reference to the rationale of the proposed model and categorization. From the results of this operation, we underline a problem that affects all the presented metrics: many aspects of human biases are not taken into account at all. We then propose a possible methodology to address this issue.
Keywords: Eye-Tracking, image quality assessment metric, MOS, quality of user experience, visual perception.
3552 Pre and Post IFRS Loss Avoidance in France and the United Kingdom
Authors: T. Miková
Abstract:
This paper analyzes the effect of a single uniform accounting rule on reporting quality by investigating the influence of IFRS on earnings management. This paper examines whether earnings management is reduced after IFRS adoption through the use of “loss avoidance thresholds”, a method that has been verified in earlier studies. This paper concentrates on two European countries: one that represents the continental code law tradition with weak protection of investors (France) and one that represents the Anglo-American common law tradition, which typically implies a strong enforcement system (the United Kingdom).
The research investigates a sample of 526 companies (6,822 firm-year observations) during the years 2000-2013. The results are different for the two jurisdictions. This study demonstrates that a single set of accounting standards contributes to better reporting quality and reduces the pervasiveness of earnings management in France. In contrast, there is no evidence that a reduction in earnings management followed the implementation of IFRS in the United Kingdom. Given that IFRS benefit France but not the United Kingdom, other political and economic factors, such as the legal system or capital market strength, must play a significant role in influencing the comparability and transparency of cross-border companies’ financial statements. Overall, the results suggest that IFRS moderately contribute to the accounting quality of reported financial statements and bring benefits for stakeholders, though the role played by other economic factors cannot be discounted.
Keywords: Accounting Standards, Earnings Management, International Financial Reporting Standards, Loss Avoidance, Reporting Quality.
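The "loss avoidance threshold" test referenced above inspects the earnings distribution around zero; a minimal sketch under assumed bin widths is shown below, comparing the frequency of small losses with that of small profits. The simulated earnings are purely synthetic.

```python
# Loss-avoidance check (illustrative): firms managing earnings to avoid losses
# show unusually few small losses relative to small profits around zero.
# Synthetic scaled earnings and the 0.01 bin width are assumptions.
import numpy as np

rng = np.random.default_rng(7)
earnings = rng.normal(0.0, 0.05, 6822)                 # earnings scaled by total assets
shift = (earnings > -0.01) & (earnings < 0.0) & (rng.random(6822) < 0.6)
earnings[shift] += 0.012                               # simulate firms pushed just above zero

bin_width = 0.01
small_loss = int(np.sum((earnings >= -bin_width) & (earnings < 0.0)))
small_profit = int(np.sum((earnings >= 0.0) & (earnings < bin_width)))
print(f"small losses: {small_loss}, small profits: {small_profit}, "
      f"ratio: {small_profit / max(small_loss, 1):.2f}")   # ratio >> 1 suggests loss avoidance
```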
3551 An Evaluation of ISO 9001:2008 and ISO 9001:2015 Standard Changes in Quality Management System
Authors: Filiz Ersoz, Deniz Merdin, Taner Ersoz
Abstract:
This study provides insight for enterprises that need to sustain themselves amid changing competition conditions, technology and laws, with regard to ISO 9001:2015. In the study, ISO 9001:2015, which at the time existed as a draft planned to be put into force, was examined and its differences from the previous standard, ISO 9001:2008, were determined. To explore these differences, a survey was conducted among enterprises that implement a quality system. According to the findings obtained at the end of the study, the enterprises attach importance to quality, follow developments in quality management systems, and find the changes in the new draft document necessary.
Keywords: ISO 9001, quality, quality management system, quality revision.
3550 Simulation of Non-Linear Behavior of Shear Wall under Seismic Loading
Authors: M. A. Ghorbani, M. Pasbani Khiavi
Abstract:
The seismic response of a steel shear wall system, considering nonlinearity effects, is investigated in this paper using the finite element method. With the availability of computer technology, nonlinear finite element analysis has become a usable and reliable means of analyzing civil structures. In this research, a numerical model for the seismic analysis of shear walls, accounting for large displacements and materially nonlinear behavior, is presented through the development of a finite element code. To develop the code, the standard Galerkin weighted residual formulation is used. A two-dimensional plane stress model with a total Lagrangian formulation is employed to capture the shear wall response, and the Newton-Raphson method is applied for the solution of the nonlinear transient equations. The presented model can be extended to the analysis of civil engineering structures with different material behavior and complicated geometry.
Keywords: Finite element, steel shear wall, nonlinear, earthquake.
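As a pocket-sized analogue of the solution strategy described above, the sketch below applies Newton-Raphson iteration to a one-degree-of-freedom system with an assumed nonlinear (softening) stiffness; a real shear-wall model would run the same iteration on the assembled residual vector and tangent stiffness matrix.

```python
# Newton-Raphson iteration for a single-DOF nonlinear system r(u) = f_int(u) - f_ext = 0,
# with an assumed softening internal force f_int(u) = k*u / (1 + a*u^2).
def f_int(u, k=100.0, a=5.0):
    return k * u / (1.0 + a * u ** 2)

def tangent(u, k=100.0, a=5.0):
    return k * (1.0 - a * u ** 2) / (1.0 + a * u ** 2) ** 2   # d f_int / d u

def newton_raphson(f_ext, u0=0.0, tol=1e-10, max_iter=50):
    u = u0
    for i in range(max_iter):
        residual = f_int(u) - f_ext
        if abs(residual) < tol:
            return u, i
        u -= residual / tangent(u)                            # Newton update
    raise RuntimeError("did not converge")

u, iters = newton_raphson(f_ext=15.0)
print(f"u = {u:.6f} after {iters} iterations, check f_int(u) = {f_int(u):.6f}")
```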
3549 Effects of Level Densities and Those of a-Parameter in the Framework of Preequilibrium Model for 63,65Cu(n,xp) Reactions in Neutrons at 9 to 15 MeV
Authors: L. Yettou
Abstract:
In this study, proton emission spectra produced by the 63Cu(n,xp) and 65Cu(n,xp) reactions are calculated in the framework of preequilibrium models using the EMPIRE and TALYS codes. Exciton model predictions combined with the Kalbach angular distribution systematics and the Hybrid Monte Carlo Simulation (HMS) were used. The effects of level densities and of the level-density a-parameter on our calculations have been investigated. The comparison with experimental data shows a clear improvement over the exciton model and HMS calculations.
Keywords: Preequilibrium models, level density, level density a-parameter, 63Cu(n, xp) and 65Cu(n, xp) reactions.
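For orientation, the textbook Fermi-gas form of the nuclear level density, in which the a-parameter studied above enters, is sketched below; the simple form without spin and parity factors and the example a values are assumptions, not the specific parameterizations used in EMPIRE or TALYS.

```python
# Textbook Fermi-gas level density (spin/parity factors omitted):
#   rho(E) ~ sqrt(pi)/(12 * a^(1/4) * E^(5/4)) * exp(2*sqrt(a*E))
# illustrating how the level-density a-parameter controls the growth with
# excitation energy E. The a values below are illustrative assumptions.
import math

def fermi_gas_level_density(E_MeV: float, a: float) -> float:
    return (math.sqrt(math.pi) / (12.0 * a ** 0.25 * E_MeV ** 1.25)) * math.exp(2.0 * math.sqrt(a * E_MeV))

for a in (7.0, 8.0, 9.0):                     # MeV^-1, roughly A/8 for A ~ 63-65
    print(a, f"{fermi_gas_level_density(10.0, a):.3e}")   # density at E = 10 MeV
```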
3548 Packing and Covering Radii of Linear Error-Block Codes
Authors: Rabiî Dariti, El Mamoun Souidi
Abstract:
Linear error-block codes are a natural generalization of linear error correcting codes. The purpose of this paper is to generalize some results on the packing and the covering radii to the error-block case. We study their properties when a code undergoes some specific modifications and combinations with another code. We give a few bounds on the packing and the covering radii of these codes.
Keywords: Linear error-block codes, π-distance, Correction capacity, Packing radius, Covering radius.
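As a small concrete companion to the definitions used above, the sketch below computes the packing and covering radii of a tiny binary linear code by brute force, using the ordinary Hamming distance; this is the classical special case that error-block codes (with their π-distance) generalize, and the [7,4] Hamming code is chosen only as a standard example.

```python
# Brute-force packing and covering radii of a small binary linear code
# (classical Hamming metric). Generator matrix: the [7,4] Hamming code.
import itertools
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
n, k = G.shape[1], G.shape[0]
codewords = np.array([(np.array(m) @ G) % 2 for m in itertools.product([0, 1], repeat=k)])

d_min = min(int(((c1 + c2) % 2).sum())                        # weight of the difference
            for i, c1 in enumerate(codewords)
            for c2 in codewords[i + 1:])
packing_radius = (d_min - 1) // 2                             # correction capacity

covering_radius = max(min(int(((np.array(w) + c) % 2).sum()) for c in codewords)
                      for w in itertools.product([0, 1], repeat=n))

print("d_min:", d_min, "packing radius:", packing_radius, "covering radius:", covering_radius)
```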
3547 A Study on the Relation between Auditor Rotation and Audit Quality in Iranian Firms
Authors: Bita Mashayekhi, Marjan Fayyazi, Parisa Sefati
Abstract:
Audit quality is a popular topic in accounting and auditing research because the financial crises of recent decades have reduced the reliability of financial reports to public investors and cast significant doubt on the audit profession. Therefore, research to identify the factors effective in improving audit quality is necessary for restoring public investors’ trust in financial statements as well as audit reports. In this study, we explore the relationship between auditor rotation and audit quality. For this purpose, we employ the Duff (2009) model of audit quality to measure audit quality and use a questionnaire survey of 27 audit service quality attributes. Our results show that there is a negative relationship between auditor rotation and audit quality when we consider the auditor’s reputation, capability, assurance, experience, and responsiveness as surrogates for audit quality. There is no evidence of the same relationship when we use the auditor’s independence and expertise to measure audit quality.
Keywords: Audit quality, auditor’s rotation, reputation, capability, assurance, experience, responsiveness, independence, expertise.
3546 Study of Unsteady Swirling Flow in a Hydrodynamic Vortex Chamber
Authors: Sergey I. Shtork, Aleksey P. Vinokurov, Sergey V. Alekseenko
Abstract:
The paper reports the results of an experimental and numerical study of nonstationary swirling flow in an isothermal model of a vortex burner. It has been identified that the main source of the instability is related to the precessing vortex core (PVC) phenomenon. The PVC-induced flow pulsation characteristics, such as the precession frequency and its variation as a function of flow rate and swirl number, have been explored using acoustic probes. Additionally, pressure transducers were used to measure the pressure drops across the working chamber and across the vortex flow. The experiments also included mean velocity measurements using laser-Doppler anemometry. The features of the instantaneous flow field generated by the PVC were analyzed employing a commercial CFD code (Star-CCM+) based on the detached eddy simulation (DES) approach. The validity of the numerical model was checked by comparing the calculated flow field with the experimental results; in particular, it was confirmed that the applied CFD code correctly reproduces the flow features.
Keywords: Acoustic probes, detached eddy simulation (DES), laser-Doppler anemometry (LDA), precessing vortex core (PVC).
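The precession frequency is typically extracted as the dominant peak in the spectrum of a pressure or acoustic probe signal; a minimal sketch with a synthetic record is given below, where the 40 Hz tone, noise level and the chamber diameter and bulk velocity used for the Strouhal number are assumptions.

```python
# Extracting a precession frequency from a probe signal: take the FFT of the
# pressure record and pick the dominant spectral peak. The synthetic 40 Hz
# PVC tone, sampling rate and noise level are illustrative assumptions.
import numpy as np

fs = 2000.0                                   # sampling rate, Hz
t = np.arange(0, 10.0, 1.0 / fs)
rng = np.random.default_rng(8)
signal = np.sin(2 * np.pi * 40.0 * t) + 0.5 * rng.normal(size=t.size)   # PVC tone + turbulent "noise"

spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
f_pvc = freqs[np.argmax(spectrum)]
strouhal = f_pvc * 0.1 / 5.0                  # St = f*D/U with assumed D = 0.1 m, U = 5 m/s
print(f"precession frequency ~ {f_pvc:.1f} Hz, Strouhal number ~ {strouhal:.2f}")
```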
3545 Total Quality Management: The Socio-Demographic and Operational-Financial Determinants for Users’ Perception of the Services Quality
Authors: H. Silvestre
Abstract:
The aim of this paper is to identify the socio-demographic and operational-financial determinants of the service quality perceived by users of the national health services. Using a survey conducted by the Ministry of Health, comprising 16,936 interviews in 2006, we intend to find out whether any characteristic determines the 2006 survey results. Through a review of the literature, we also want to know whether operational-financial results have implications for hospital users’ perception of the quality of the services received. In order to achieve our main goals, we make use of regression analysis to find the possible dimensions that determine those results.
Keywords: Management by Results, Quality Approach, Tableau de Bord, Total Quality Management, Services quality.
3544 Ottoman Script Recognition Using Hidden Markov Model
Authors: Ayşe Onat, Ferruh Yildiz, Mesut Gündüz
Abstract:
In this study, an OCR system for the segmentation, feature extraction and recognition of Ottoman script has been developed using handwritten characters. Recognition of characters handwritten by humans is a difficult process. The segmentation and feature extraction stages are based on geometrical feature analysis, followed by the chain code transformation of the main strokes of each character. The output of segmentation is a set of well-defined segments that can be fed into any classification approach. The classes of the main strokes are identified through a left-right Hidden Markov Model (HMM).
Keywords: Chain Code, HMM, Ottoman Script Recognition, OCR.
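The Freeman chain code transformation mentioned above maps successive boundary points to one of eight direction symbols; a minimal sketch of that step is shown below, with a hand-made contour standing in for a segmented stroke. In the described system, the resulting symbol sequence would then be fed to the left-right HMM classifier.

```python
# Freeman 8-direction chain code of a stroke boundary: each step between
# consecutive contour points is mapped to a direction symbol 0-7.
# The tiny hand-made contour below stands in for a segmented character stroke.
DIRECTIONS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
              (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(points):
    code = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        step = (x1 - x0, y1 - y0)             # assumed to be a unit step between 8-neighbors
        code.append(DIRECTIONS[step])
    return code

stroke = [(0, 0), (1, 0), (2, 0), (3, 1), (3, 2), (2, 3), (1, 3), (0, 2), (0, 1), (0, 0)]
print(chain_code(stroke))                      # -> [0, 0, 1, 2, 3, 4, 5, 6, 6]
```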
3543 Quality Culture Framework Proposal for Libyan Industrial Companies
Authors: Mostafa Ahmed Shokshok
Abstract:
Libyan industrial companies face many challenges in today’s competitive market. Adopting a quality management culture is one of these challenges; it may pave the road for Libyan industrial companies to effectively empower their employees and improve their ability to respond to international competition. The primary objective of this paper is to design a practical approach to guide Libyan industrial companies toward successful quality culture implementation.
Keywords: Libyan manufacturing industries, TQM, Quality culture, Quality framework.
3542 A New Precautionary Method for Measurement and Improvement of the Data Quality
Authors: Seyed Mohammad Hossein Moossavizadeh, Mehran Mohsenzadeh, Nasrin Arshadi
Abstract:
Data quality is a complex and unstructured concept that concerns information systems managers. The reason for this attention is the high cost of maintaining and cleaning deficient data. Beyond these costs, such data cause wrong statistics, analyses and decisions in organizations. Therefore, managers intend to improve the quality of their information systems’ data. One of the basic elements of quality improvement is measuring it. In this paper, we present a precautionary method whose application gives information systems’ data better quality. Our method covers different dimensions of data quality and therefore has the necessary integrity. The presented method has been tested on the three dimensions of accuracy, value-added and believability, and the results confirm the improvement and integrity of the method.
Keywords: Data quality, precaution, information system, measurement, improvement.