Search results for: User acceptance testing. Software reliability growth modelling. Split Poisson process. Bayesian methods.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13330

13270 Computer Aided Assembly Attributes Retrieval Methods for Automated Assembly Sequence Generation

Authors: M. V. A. Raju Bahubalendruni, Bibhuti Bhusan Biswal, B. B. V. L. Deepak

Abstract:

Achieving an appropriate assembly sequence requires thorough verification of its physical feasibility. For this purpose, industrial engineers use several assembly predicates, namely liaison, geometric feasibility, stability and mechanical feasibility. However, testing an assembly sequence against these predicates requires a large amount of assembly information. Extracting such assembly information from an assembled product is a time-consuming and highly skilled task involving complex reasoning methods. In this paper, computer aided methods are proposed to extract all the necessary assembly information from the computer aided design (CAD) environment in order to perform assembly sequence planning efficiently. These methods use basic capabilities of the three-dimensional solid modelling and assembly modelling tools available in CAD software, taking into account the equilibrium laws of physical bodies.
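
The liaison predicate mentioned above, for instance, reduces to a simple connectivity check on a part-contact graph. A minimal Python sketch of that check, assuming a hypothetical four-part product (the liaison pairs and part names are illustrations, not data from the paper):

    # Liaison predicate: every newly added part must contact at least
    # one part already assembled (hypothetical 4-part product).
    liaison = {("base", "shaft"), ("shaft", "gear"), ("gear", "cover"), ("base", "cover")}

    def in_contact(a, b):
        return (a, b) in liaison or (b, a) in liaison

    def liaison_feasible(sequence):
        """True if each part touches the partial assembly built so far."""
        assembled = [sequence[0]]
        for part in sequence[1:]:
            if not any(in_contact(part, p) for p in assembled):
                return False
            assembled.append(part)
        return True

    print(liaison_feasible(["base", "shaft", "gear", "cover"]))  # True
    print(liaison_feasible(["shaft", "cover", "gear", "base"]))  # False: cover does not touch shaft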

Keywords: Assembly automation, assembly attributes, assembly sequence generation, computer aided design.

13269 Knowledge Modelling for a Hotel Recommendation System

Authors: B. A. Gobin, R. K. Subramanian

Abstract:

Knowledge modelling, a main activity in the development of Knowledge Based Systems, has no set standards and is mostly done in an ad hoc way. There is a lack of support for the transition from the abstract level to implementation. In this paper, a methodology for the development of the knowledge model, inspired by both Software and Knowledge Engineering, is proposed. The use of UML, the de facto standard for modelling in the software engineering arena, is explored for knowledge modelling. The proposed methodology is used to develop a knowledge model for a knowledge based system that recommends suitable hotels to tourists visiting Mauritius.

Keywords: Domain Modelling, Knowledge Based Systems, Knowledge Modelling, UML.

13268 Improving the Effectiveness of Software Testing through Test Case Reduction

Authors: R. P. Mahapatra, Jitendra Singh

Abstract:

This paper proposes a new technique for improving the efficiency of software testing, based on a conventional attempt to reduce the test cases that have to be executed for any given software. The approach exploits the advantage of regression testing, where fewer test cases lessen the time consumed by testing as a whole. The technique also offers a means to perform test case generation automatically; compared to a technique in the literature where the tester has no option but to generate test cases manually, it provides a better option. For test case reduction, the technique uses simple algebraic conditions to assign fixed values to variables (maximum, minimum and constant values). By doing this, the variable values are limited to a definite range, resulting in fewer possible test cases to process. The technique can also be applied to program loops and arrays.
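
A minimal Python sketch of the reduction idea: each variable's admissible range is collapsed to its boundary representatives (plus any constants) before the input combinations are formed. The variable names and ranges below are hypothetical:

    from itertools import product

    ranges = {"speed": (0, 120), "load": (1, 50)}  # hypothetical input ranges
    constants = {"mode": 3}                        # variable fixed to a constant

    def reduced_test_cases(ranges, constants):
        """Keep only the minimum and maximum of each variable's range."""
        names = list(ranges)
        boundary_values = [ranges[n] for n in names]
        for combo in product(*boundary_values):
            case = dict(zip(names, combo))
            case.update(constants)
            yield case

    for case in reduced_test_cases(ranges, constants):
        print(case)  # 4 cases instead of 121 * 50 exhaustive combinations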

Keywords: Software Testing, Test Case Generation, Test Case Reduction.

13267 Requirements Gathering for Improved Software Usability and the Potential for Usage-Centred Design

Authors: Kholod J. Alotaibi, Andrew M. Gravell

Abstract:

Usability is an important software quality that is often neglected at the design stage. Although methods exist to incorporate elements of usability engineering, there is a need for more balanced usability-focused methods that can enhance the experience of software usability for users. In this regard, the potential of Usage-Centred Design (UgCD) is explored with respect to requirements gathering and is shown to lead to high software usability, besides other benefits. It achieves this through its focus on usage: defining essential use cases, conducting task modeling, encouraging user collaboration, refining requirements, and so on. The requirements gathering process in UgCD is described in detail.

Keywords: Requirements gathering, Usability, Usage-Centred Design.

13266 Spatial Econometric Approaches for Count Data: An Overview and New Directions

Authors: Paula Simões, Isabel Natário

Abstract:

This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches available to model data collected with reference to location in space, from classical spatial econometrics to recent developments on modelling count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature with those from hierarchical modelling and analysis of spatial data, in order to identify possible new directions for processing count data in a spatial hierarchical Bayesian econometric context.
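
As a non-spatial baseline for the count-data setting reviewed here, a Poisson GLM can be fitted with statsmodels and a spatially lagged covariate added by hand. This is only a sketch of the simplest specification on simulated data, not the Bayesian hierarchical estimators the paper surveys:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 50
    x = rng.normal(size=n)                      # one regressor
    W = np.eye(n, k=1) + np.eye(n, k=-1)        # toy adjacency weight matrix
    W = W / W.sum(axis=1, keepdims=True)        # row-standardise
    y = rng.poisson(np.exp(0.5 + 0.8 * x))      # simulated counts

    X = sm.add_constant(np.column_stack([x, W @ x]))  # include a spatial lag of x
    res = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    print(res.params)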

Keywords: Spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data.

13265 Analysis of the Impact of NVivo and EndNote on Academic Research Productivity

Authors: Sujit K. Basak

Abstract:

The aim of this paper is to analyze the impact of literature review software on researchers by analyzing models in terms of perceived usefulness, perceived ease of use, and acceptance level. Collected data were analyzed using the WarpPLS 4.0 software. The study used two theoretical frameworks, namely the Technology Acceptance Model and the Training Needs Assessment Model. It was experimental and was conducted at a public university in South Africa. The results show that acceptance level has the strongest impact on research productivity, followed by perceived usefulness and perceived ease of use.

Keywords: Technology acceptance model, training needs assessment model, literature review software, research productivity.

13264 A Development of a Weight-Balancing Control System Based On Android Operating System

Authors: Rattanathip Rattanachai, Piyachai Petchyen, Kunyanuth Kularbphettong

Abstract:

This paper describes the development of a Weight-Balancing Control System based on the Android Operating System. It provides recommendations on ways of balancing the user's weight based on the daily metabolism process and needs, so that users can make informed decisions about their weight control, and it also displays detailed nutrition information. Furthermore, it was designed to suggest to users what kinds of food they should eat and how to exercise in the right way. We describe the design methods and functional components of this prototype. To evaluate system performance, questionnaires for system usability and Black Box Testing were used to measure expert and user satisfaction. The results were satisfactory: mean scores for experts and users were 3.94 and 4.07, respectively.
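
The abstract does not state which metabolism formula the system implements; a commonly used choice for estimating daily energy needs is the Mifflin-St Jeor equation, sketched below in Python purely as an assumption about the kind of calculation involved:

    def bmr_mifflin_st_jeor(weight_kg, height_cm, age_yr, male=True):
        """Basal metabolic rate in kcal/day (Mifflin-St Jeor equation)."""
        base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
        return base + 5 if male else base - 161

    def daily_needs(bmr, activity_factor=1.4):
        """Scale BMR by an activity factor for total daily expenditure."""
        return bmr * activity_factor

    print(daily_needs(bmr_mifflin_st_jeor(70, 175, 30)))  # ~2308 kcal/day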

Keywords: Weight-Balancing Control, Android Operating System, daily metabolism, Black Box Testing.

13263 Suitability of Black Box Approaches for the Reliability Assessment of Component-Based Software

Authors: Anjushi Verma, Tirthankar Gayen

Abstract:

Although reliability is an important attribute of quality, especially for mission-critical systems, there still does not exist any versatile model for the reliability assessment of component-based software. The existing Black Box models make various assumptions which may not always be realistic and may be quite contrary to the actual behaviour of software. They focus on observing the manner in which the system behaves without considering the structure of the system, the components composing the system, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessments made with these models is very high. Although there are some models based on the operational profile, it sometimes becomes extremely difficult to obtain the exact operational profile for a given operation. This paper discusses the drawbacks, deficiencies and limitations of Black Box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.

Keywords: Black Box, faults, failure, software reliability.

13262 CompleX-Machine: An Automated Testing Tool Using X-Machine Theory

Authors: E. K. A. Ogunshile

Abstract:

This paper is aimed at creating an automatic Java X-Machine testing tool for software development. The nature of software development is changing; thus, the type of software testing tools required is also changing. Software is growing increasingly complex and, in part due to the commercial impetus for faster software releases with new features and value, increasingly in danger of containing faults. These faults can incur huge costs for software development organisations and users; Cambridge Judge Business School's research estimated the cost of software bugs to the global economy at $312 billion. Beyond the cost, faster software development methodologies and increasing expectations on developers to become testers are driving demand for faster, automated, and effective tools to prevent potential faults as early as possible in the software development lifecycle. Using X-Machine theory, this paper explores a new tool to address software complexity, changing expectations on developers, and faster development pressures and methodologies, with a view to reducing the huge cost of fixing software bugs.
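
An X-machine extends a finite state machine with a memory and processing functions attached to transitions. A minimal Python sketch of the idea, not of the tool's own notation (the account example is hypothetical):

    # Minimal stream X-machine: transitions carry processing functions that
    # read an input, update the memory, and emit an output.
    def deposit(mem, x):  return mem + x, f"balance={mem + x}"
    def withdraw(mem, x): return (mem - x, f"balance={mem - x}") if mem >= x else None

    transitions = {
        ("idle", "dep"): ("idle", deposit),
        ("idle", "wd"):  ("idle", withdraw),
    }

    def run(inputs, state="idle", mem=0):
        outputs = []
        for label, x in inputs:
            next_state, fn = transitions[(state, label)]
            result = fn(mem, x)
            if result is None:  # processing function not applicable
                raise ValueError(f"illegal input {label}({x}) in state {state}")
            mem, out = result
            outputs.append(out)
            state = next_state
        return outputs, state, mem

    print(run([("dep", 10), ("wd", 3)]))  # (['balance=10', 'balance=7'], 'idle', 7)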

Keywords: Conformance testing, finite state machine, software testing, X-Machine.

13261 Adaptation of State/Transition-Based Methods for Embedded System Testing

Authors: Abdelaziz Guerrouat, Harald Richter

Abstract:

In this paper, test generation methods and appropriate fault models for the testing and analysis of embedded systems described as (extended) finite state machines ((E)FSMs) are presented. Compared to simple FSMs, EFSMs specify not only the control flow but also the data flow. Thus, we define a two-level fault model to cover both aspects. The goal of this paper is to reuse well-known FSM-based test generation methods for the automation of embedded system testing. These methods have been widely used in the testing and validation of protocols and communicating systems. In particular, (E)FSM-based specification and testing is advantageous because (E)FSMs underlie the formal semantics of already standardised formal description techniques (FDTs), which are popular in the design of hardware and software systems.
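
The data-flow aspect noted above comes from guards and update actions on EFSM transitions, which plain FSMs lack. A small Python sketch of one guarded transition (the counter protocol is a hypothetical example, not the paper's case study):

    # EFSM transition = (source, event, guard, update, target).
    # Context variables capture the data flow that plain FSMs cannot express.
    transitions = [
        ("wait", "recv", lambda v: v["count"] < 3,
         lambda v: {**v, "count": v["count"] + 1}, "wait"),
        ("wait", "recv", lambda v: v["count"] >= 3,
         lambda v: v, "done"),
    ]

    def step(state, event, vars_):
        for src, ev, guard, update, dst in transitions:
            if src == state and ev == event and guard(vars_):
                return dst, update(vars_)
        raise ValueError("no enabled transition")  # a control-level fault

    state, vars_ = "wait", {"count": 0}
    for _ in range(4):
        state, vars_ = step(state, "recv", vars_)
    print(state, vars_)  # done {'count': 3}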

Keywords: Formal methods, testing and validation, finite state machines, formal description techniques.

13260 An Exploratory Study in Nursing Education: Factors Influencing Nursing Students’ Acceptance of Mobile Learning

Authors: R. Abdulrahman, A. Eardley, A. Soliman

Abstract:

The proliferation in the development of mobile learning (m-learning) has played a vital role in the rapidly growing electronic learning market. This relatively new technology can help to encourage engagement in learning and to aid knowledge transfer in a number of areas, by familiarizing students with innovative information and communications technologies (ICT). M-learning plays a substantial role in the deployment of learning methods for nursing students by using the Internet and portable devices to access learning resources 'anytime and anywhere'. However, acceptance of m-learning by students is critical to the successful use of m-learning systems. Thus, there is a need to study the factors that influence students' intention to use m-learning. This paper addresses this issue. It outlines the outcomes of a study that evaluates the unified theory of acceptance and use of technology (UTAUT) model as applied to user acceptance of m-learning activity in nurse education. The model integrates the significant components of eight prominent user acceptance models and therefore offers a standard measure with core determinants of user behavioural intention. The research model extends the UTAUT in the context of m-learning acceptance by modifying the original structure of the UTAUT and adding individual innovativeness (II) and quality of service (QoS). The study also adds the factors of previous experience (of using mobile devices in similar applications) and the nursing students' readiness (to use the technology) as influences on their behavioural intention to use m-learning. The study uses convenience sampling, with student volunteers as participants, to collect numerical data. A quantitative method of data collection was selected, involving an online survey questionnaire containing 33 questions that measure the six constructs on a 5-point Likert scale. A total of 42 respondents participated, all from the Nursing Institute at the Armed Forces Hospital in Saudi Arabia. The gathered data were then tested using structural equation modelling (SEM), including confirmatory factor analysis (CFA). The results of the CFA show that the UTAUT model has the ability to predict student behavioural intention and to adapt m-learning activity to specific learning activities. It also demonstrates satisfactory, dependable and valid scales for the model constructs. This suggests further analysis to confirm the model as a valuable instrument for evaluating user acceptance of m-learning activity.

Keywords: Mobile learning, nursing institute, unified theory of acceptance and use of technology model.

13259 Analyzing Data on Breastfeeding Using Dispersed Statistical Models

Authors: Naushad Mamode Khan, Cheika Jahangeer, Maleika Heenaye-Mamode Khan

Abstract:

Exclusive breastfeeding is the feeding of a baby on no milk other than breast milk. Exclusive breastfeeding during the first 6 months of life is very important, as it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and problems. Moreover, it helps to reduce the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. In this paper, we survey the factors that influence exclusive breastfeeding and use two dispersed statistical models to analyze the data: the Generalized Poisson regression model and the Com-Poisson regression model.
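
The Com-Poisson model generalizes the Poisson pmf to P(X = k) proportional to lambda^k / (k!)^nu, where the extra parameter nu controls dispersion (nu > 1 under-dispersed, nu < 1 over-dispersed, nu = 1 Poisson). A minimal Python sketch of the truncated pmf, computed in log space for numerical stability:

    from math import exp, lgamma, log

    def com_poisson_pmf(k, lam, nu, terms=100):
        """P(X=k) = lam**k / (k!)**nu / Z(lam, nu), Z truncated at `terms`."""
        logw = [j * log(lam) - nu * lgamma(j + 1) for j in range(terms)]
        m = max(logw)
        log_z = m + log(sum(exp(w - m) for w in logw))  # log-sum-exp
        return exp(k * log(lam) - nu * lgamma(k + 1) - log_z)

    print(com_poisson_pmf(2, lam=1.5, nu=1.0))  # ~0.251, matches Poisson(1.5)
    print(com_poisson_pmf(2, lam=1.5, nu=2.0))  # under-dispersed variant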

Keywords: Exclusive breastfeeding, regression model, Generalized Poisson, Com-Poisson.

13258 Design and Development of Architectural Model Darul Ridzuan Museum

Authors: Jafreezal Jaafar, Hasiah Mohamed, Hazida Razali

Abstract:

This paper focuses on the 3D reconstruction of the architectural design of the Darul Ridzuan Museum. It concentrates on designing the exterior of the building according to colored digital photos of the real museum. Besides viewing the architecture, walkthroughs are generated so that the user can navigate the model more easily. Users can travel through the museum to get a feel for the environment and to explore the design of the museum as a whole, both exterior and interior. The evaluation showed positive results in terms of realism, navigation, collision detection, suitability, usability and user acceptance. In brief, the 3D virtual museum provides an alternative way of presenting a real museum.

Keywords: Virtual Heritage, 3D Modelling, Virtual Museum, Usability Evaluation

13257 The Gerber-Shiu Functions of a Risk Model with Two Classes of Claims and Random Income

Authors: Shan Gao

Abstract:

In this paper, we consider a risk model involving two independent classes of insurance risks and random premium income. We assume that the premium income process is a Poisson process, and that the claim number processes are independent Poisson and generalized Erlang(n) processes, respectively. Both the Gerber-Shiu functions with zero initial surplus and the probability generating functions (p.g.f.) of the Gerber-Shiu functions are obtained.
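
For reference, the Gerber-Shiu expected discounted penalty function being computed is conventionally defined (in LaTeX) as

    m(u) = \mathbb{E}\left[ e^{-\delta \tau}\, w\bigl(U(\tau^{-}),\, |U(\tau)|\bigr)\, \mathbf{1}\{\tau < \infty\} \,\middle|\, U(0) = u \right],

where U(t) is the surplus process, tau the time of ruin, delta >= 0 the force of interest, and w a penalty function of the surplus immediately before ruin and the deficit at ruin.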

Keywords: Poisson process, generalized Erlang risk process, Gerber-Shiu function, generating function, generalized Lundberg equation.

13256 Sensitivity Analysis of Real-Time Systems

Authors: Benjamin Gorry, Andrew Ireland, Peter King

Abstract:

Verification of real-time software systems can be expensive in terms of time and resources. Testing is the main method of proving correctness but has been shown to be a long and time-consuming process. In everyday practice, engineers are usually unwilling to adopt formal approaches to correctness because of the overhead associated with developing their knowledge of such techniques. Performance modelling techniques allow systems to be evaluated with respect to timing constraints. This paper describes PARTES, a framework which guides the extraction of performance models from programs written in an annotated subset of C.

Keywords: Performance Modelling, Real-Time, Sensitivity Analysis.

13255 Analyzing the Factors Affecting Passenger Car Breakdowns Using Com-Poisson GLM

Authors: N. Mamode Khan, V. Jowaheer

Abstract:

The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim at estimating the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over a year. We remark that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using the Com-Poisson regression model, with the parameters estimated by a quasi-likelihood approach. The under-dispersion parameter is estimated to be 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.
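
A quick empirical check of under-dispersion, before fitting a Com-Poisson GLM, is the sample variance-to-mean ratio (a value below 1 indicates under-dispersion). A minimal Python sketch on simulated counts, not the Mauritian study's data:

    import numpy as np

    rng = np.random.default_rng(1)
    # Simulated under-dispersed counts: binomial counts have variance < mean.
    breakdowns = rng.binomial(n=6, p=0.5, size=300)

    ratio = breakdowns.var(ddof=1) / breakdowns.mean()
    print(f"variance/mean = {ratio:.2f}")  # < 1 suggests under-dispersion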

Keywords: Breakdowns, under-dispersion, Com-Poisson, generalized linear model, quasi-likelihood estimation.

13254 Linking Business Process Models and System Models Based on Business Process Modelling

Authors: Faisal A. Aburub

Abstract:

Organizations today need to invest in software in order to run their businesses, and to meet the organizations' objectives the software should be in line with the business processes. This research presents an approach for linking process models and system models. In particular, the new approach aims to synthesize a sequence diagram based on a role activity diagram (RAD) model. The approach comprises four steps: create the business process model using RAD, identify computerized activities, identify entities in the sequence diagram, and identify messages in the sequence diagram. The new approach has been validated using the process of student registration at the University of Petra as a case study. Further research is required to validate the new approach in different domains.

Keywords: Business process modelling, system models, role activity diagrams, sequence diagrams.

13253 Concurrent Testing of ADC for Embedded System

Authors: Y. B. Gandole

Abstract:

Compaction testing methods allow at-speed detection of errors while having a low cost of implementation. Owing to this distinctive feature, compaction methods have been widely used for built-in testing as well as external testing. In the latter case, the bandwidth requirements on the automated test equipment employed are relaxed, which reduces the overall cost of testing. Concurrent compaction testing methods use operational signals to detect misbehavior of the device under test and do not require input test stimuli. These methods have so far been employed for digital systems only. In the present work, we extend the use of compaction methods to the concurrent testing of analog-to-digital converters. We estimate tolerance bounds for the result of compaction and evaluate the aliasing rate.
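
A minimal sketch of the general compaction idea being extended here: the operational output stream is compacted to one signature (below, a windowed sum of output codes) and compared against precomputed tolerance bounds. The window length and bounds are hypothetical, not the paper's derived bounds:

    def compacted_signature(samples, window=64):
        """Sum ADC output codes over a window into one signature value."""
        return sum(samples[:window])

    def check(samples, lo, hi, window=64):
        """Flag misbehaviour when the signature leaves its tolerance band."""
        return lo <= compacted_signature(samples, window) <= hi

    # Hypothetical: a healthy 8-bit ADC digitising a known operational signal
    # is expected to yield a windowed sum within [8200, 9000].
    codes = [128 + (i % 16) for i in range(64)]
    print(compacted_signature(codes), check(codes, 8200, 9000))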

Keywords: Analog-to-Digital Converter, Embedded System, Concurrent Testing.

13252 Towards an Enhanced Stochastic Simulation Model for Risk Analysis in Highway Construction

Authors: Anshu Manik, William G. Buttlar, Kasthurirangan Gopalakrishnan

Abstract:

Over the years, there has been a growing trend towards quality-based specifications in highway construction. In many Quality Control/Quality Assurance (QC/QA) specifications, the contractor is primarily responsible for quality control of the process, whereas the highway agency is responsible for acceptance testing of the product. A cooperative investigation was conducted in Illinois over several years to develop a prototype End-Result Specification (ERS) for asphalt pavement construction. The final characteristics of the product are stipulated in the ERS, and the contractor is given considerable freedom in achieving those characteristics. The risk to the contractor or agency depends on how the acceptance limits and processes are specified. Stochastic simulation models are very useful in estimating and analyzing payment risk in ERS systems, and they form an integral part of Illinois's prototype ERS system. This paper describes the development of an innovative methodology to estimate the variability components in in-situ density, air voids and asphalt content data from ERS projects. The information gained would be crucial in simulating these ERS projects for the estimation and analysis of payment risks associated with asphalt pavement construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the present ERS implemented in Illinois.

Keywords: Asphalt Pavement, Risk Analysis, Stochastic Simulation, QC/QA.

13251 Testing Object-Oriented Framework Applications Using FIST2 Tool: A Case Study

Authors: Jehad Al Dallal

Abstract:

An application framework provides a reusable design and implementation for a family of software systems. Frameworks are introduced to reduce the cost of a product line (i.e., a family of products that share common features). Software testing is a time-consuming and costly ongoing activity during the application software development process. Generating reusable test cases for framework applications during the framework development stage, and then providing and using those test cases to test parts of the framework application whenever the framework is used, reduces application development time and cost considerably. This paper introduces the Framework Interface State Transition Tester (FIST2), a tool for the automated unit testing of Java framework applications. During the framework development stage, given the formal descriptions of the framework hooks, the specifications of the methods of the framework's extensible classes, and the illegal behavior description of the Framework Interface Classes (FICs), FIST2 generates unit-level test cases for the classes. At the framework application development stage, given the customized method specifications of the implemented FICs, FIST2 automates the use, execution, and evaluation of the already generated test cases to test the implemented FICs. The paper illustrates the use of the FIST2 tool by testing several applications that use the SalesPoint framework.

Keywords: Automated testing, class testing, FICs, FIST2, object-oriented framework, object-oriented testing.

13250 Manual to Automated Testing: An Effort-Based Approach for Determining the Priority of Software Test Automation

Authors: Peter Sabev, Katalina Grigorova

Abstract:

Test automation allows difficult and time-consuming manual software testing tasks to be performed efficiently, quickly and repeatedly. However, the development and maintenance of automated tests is expensive, so proper prioritization of what to automate first is needed. This paper describes a simple yet efficient approach to such prioritization of test cases, based on the effort needed for both manual execution and software test automation. The suggested approach is very flexible because it allows working with a variety of assessment methods, and adding or removing new candidates at any time. The theoretical ideas presented in this article have been successfully applied in real-world situations in several software companies by the authors and their colleagues, including the testing of real estate websites, cryptographic and authentication solutions, and an OSGi-based middleware framework applied in various systems for smart homes, connected cars, production plants, sensors, home appliances, car head units and engine control units (ECUs), vending machines, medical devices, industrial equipment and other devices that either contain or are connected to an embedded service gateway.
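
A minimal Python sketch of an effort-based priority score of the kind described: rank each candidate by manual effort saved per hour invested in automation. The scoring formula and numbers are illustrative assumptions, not the authors' exact metric:

    # Each candidate: manual execution effort (h), expected runs per release,
    # and estimated automation (development + maintenance) effort (h).
    candidates = {
        "login smoke test":   {"manual": 0.5, "runs": 40, "automation": 6},
        "report export":      {"manual": 2.0, "runs": 5,  "automation": 20},
        "payment regression": {"manual": 1.0, "runs": 30, "automation": 12},
    }

    def priority(c):
        """Manual effort saved per hour invested in automation."""
        return c["manual"] * c["runs"] / c["automation"]

    for name, c in sorted(candidates.items(), key=lambda kv: -priority(kv[1])):
        print(f"{name}: score {priority(c):.2f}")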

Keywords: Automated Testing, Manual Testing, Test Automation, Software Testing, Test Prioritization.

13249 A User Friendly Tool for Performance Evaluation of Different Reference Evapotranspiration Methods

Authors: Vijay Shankar

Abstract:

Evapotranspiration (ET) is a major component of the hydrologic cycle, and its accurate estimation is essential for hydrological studies. In the past, various estimation methods have been developed for different climatological data, and the accuracy of these methods varies with climatic conditions. Reference crop evapotranspiration (ET0) is a key variable in procedures established for estimating the evapotranspiration rates of agricultural crops. Values of ET0 are used with crop coefficients for many aspects of irrigation and water resources planning and management. Numerous methods are used for estimating ET0. As per the internationally accepted procedures outlined in the United Nations Food and Agriculture Organization's Irrigation and Drainage Paper No. 56 (FAO-56), use of the Penman-Monteith equation is recommended for computing ET0 from ground-based climatological observations. In the present study, seven methods have been selected for performance evaluation. User-friendly software has been developed in the Visual Basic programming language, which makes it possible to create a graphical environment with little coding. Given the available data, the developed software estimates reference evapotranspiration for any area and period for which data exist. The accuracy of the software has been checked against the examples given in FAO-56. The developed software is a user-friendly tool for estimating ET0 under different data availability and climatic conditions.
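
For reference, the FAO-56 Penman-Monteith equation recommended above can be written directly in Python (inputs must already be in the FAO-56 units noted; the example values are illustrative daily inputs, not the paper's data):

    def et0_fao56(delta, rn, g, gamma, t_mean, u2, es, ea):
        """FAO-56 Penman-Monteith reference evapotranspiration (mm/day).

        delta : slope of the vapour pressure curve (kPa/degC)
        rn, g : net radiation and soil heat flux (MJ/m2/day)
        gamma : psychrometric constant (kPa/degC)
        t_mean: mean daily air temperature (degC)
        u2    : wind speed at 2 m (m/s)
        es, ea: saturation and actual vapour pressure (kPa)
        """
        num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
        den = delta + gamma * (1.0 + 0.34 * u2)
        return num / den

    print(round(et0_fao56(0.122, 13.28, 0.0, 0.0666, 16.9, 2.078, 1.997, 1.409), 2))  # ~3.88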

Keywords: Crop coefficient, Crop evapotranspiration, Field moisture, Irrigation Scheduling.

13248 Bootstrap and MLS Methods-based Individual Bioequivalence Assessment

Authors: Kongsheng Zhang, Li Ge

Abstract:

Assessing bioequivalence is a one-sided hypothesis testing process. Bootstrap and modified large-sample (MLS) methods are considered for studying individual bioequivalence (IBE); the type I error and power of the hypothesis tests are simulated and compared with FDA (2001). The results show that the modified large-sample method is equivalent to the method of FDA (2001).
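
A minimal Python sketch of the nonparametric bootstrap step underlying such an assessment: resample subject-level responses with replacement and form a percentile interval for the statistic of interest. The statistic here is a simple mean difference on simulated data, not the full FDA IBE criterion:

    import numpy as np

    rng = np.random.default_rng(42)
    test = rng.normal(100, 12, size=24)       # simulated test-formulation responses
    reference = rng.normal(102, 12, size=24)  # simulated reference responses

    stats = []
    for _ in range(5000):
        t = rng.choice(test, size=test.size, replace=True)
        r = rng.choice(reference, size=reference.size, replace=True)
        stats.append(t.mean() - r.mean())

    lo, hi = np.percentile(stats, [2.5, 97.5])
    print(f"95% bootstrap CI for the mean difference: [{lo:.2f}, {hi:.2f}]")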

Keywords: Individual bioequivalence, bootstrap, Bayesian bootstrap, modified large-sample.

13247 ECG Based Reliable User Identification Using Deep Learning

Authors: R. N. Begum, Ambalika Sharma, G. K. Singh

Abstract:

Identity theft has serious ramifications beyond data and personal information loss. This necessitates the implementation of robust and efficient user identification systems. Therefore, automatic biometric recognition systems are the need of the hour, and electrocardiogram (ECG)-based systems are unquestionably the best choice due to their appealing inherent characteristics. Convolutional Neural Networks (CNNs) are the recent state-of-the-art techniques for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and types of heartbeats in the dataset grows. As a result, this study proposes a highly accurate and resilient ECG-based person identification system using the dense learning framework of CNNs. The proposed research explicitly explores the caliber of dense CNNs in the field of ECG-based human recognition. The study tests four different configurations of dense CNNs, trained on a dataset of recordings collected from eight popular ECG databases. With a highest False Acceptance Rate (FAR) of 0.04% and a highest False Rejection Rate (FRR) of 5%, the best performing network achieved an identification accuracy of 99.94%. The best network was also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable but also highly efficient. Thus, they might also be implemented in real-time ECG-based human recognition systems.
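
The FAR/FRR figures quoted are threshold-dependent. A small Python sketch of how these two rates are computed from genuine and impostor match scores in general (the scores below are simulated, not the paper's CNN outputs):

    import numpy as np

    rng = np.random.default_rng(7)
    genuine = rng.normal(0.8, 0.1, 1000)    # scores for matching identities
    impostor = rng.normal(0.4, 0.1, 1000)   # scores for non-matching identities

    def far_frr(threshold):
        far = np.mean(impostor >= threshold)  # impostors wrongly accepted
        frr = np.mean(genuine < threshold)    # genuine users wrongly rejected
        return far, frr

    for thr in (0.5, 0.6, 0.7):
        far, frr = far_frr(thr)
        print(f"threshold {thr}: FAR {far:.3%}, FRR {frr:.3%}")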

Keywords: Biometrics, dense networks, identification rate, train/test split ratio.

13246 Probabilistic Modelling of Marine Bridge Deterioration

Authors: P. C. Ryan, A. J. O'Connor

Abstract:

Chloride-induced corrosion of steel reinforcement is the main cause of deterioration of reinforced concrete marine structures. This paper investigates the relative performance of alternative repair options with respect to the deterioration of reinforced concrete bridge elements in marine environments. Focus is placed on the initiation phase of reinforcement corrosion. A laboratory study is described which involved exposing concrete samples to accelerated chloride-ion ingress. The study examined the relative efficiencies of two repair methods, namely an Ordinary Portland Cement (OPC) concrete and a concrete which utilised Ground Granulated Blastfurnace Slag (GGBS) as a partial cement replacement. The mix designs and materials utilised were identical to those implemented in the repair of a marine bridge on the south-east coast of Ireland in 2007. The results of this testing regime serve to inform the input variables employed in probabilistic modelling of deterioration for subsequent reliability-based analysis comparing the relative performance of the studied repair options.
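
Initiation-phase models of this kind typically describe chloride ingress with the error-function solution of Fick's second law, C(x,t) = Cs[1 - erf(x / (2 sqrt(Dt)))]. A minimal Python sketch comparing two apparent diffusion coefficients (the parameter values are illustrative, not the paper's measurements):

    from math import erf, sqrt

    def chloride(x_mm, t_years, cs, d_mm2_per_year):
        """Chloride content at depth x after time t (Fick's 2nd law solution)."""
        return cs * (1.0 - erf(x_mm / (2.0 * sqrt(d_mm2_per_year * t_years))))

    cover, cs = 50.0, 0.35          # cover depth (mm), surface chloride (% binder)
    for label, d in (("OPC", 30.0), ("GGBS", 8.0)):  # illustrative D values
        c = chloride(cover, 25.0, cs, d)
        print(f"{label}: C(50 mm, 25 yr) = {c:.3f} % binder")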

Keywords: Deterioration, Marine Bridges, Reinforced Concrete, Reliability, Chloride-ion Ingress

13245 Bayesian Decision Approach to Protection on the Flood Event in Upper Ayeyarwady River, Myanmar

Authors: Min Min Swe Zin

Abstract:

This paper introduces the foundations of Bayesian probability theory and the Bayesian decision method. The main goal of Bayesian decision theory is to minimize the expected loss of a decision, or equivalently the expected risk. The purposes of this study are to review the decision process on the issue of flood occurrences and to suggest a possible process for decision improvement. The study examines the problem structure of flood occurrences and theoretically explicates the decision-analytic approach based on Bayesian decision theory and its application to flood occurrences in environmental engineering. We discuss flood occurrences based on annual maximum water levels (in cm), a 43-year record available from 1965 to 2007 at the Sagaing gauging station on the Ayeyarwady River, with a drainage area of 120,193 sq km, using the Bayesian decision method. As a result, we discuss the loss and risk associated with whether vast areas of agricultural land will be inundated in the coming year, based on the two standard maximum water levels observed during the 43 years, and we also forecast whether these lands will be safe from flood water during the next 10 years.
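
A minimal Python sketch of the beta-binomial updating that the keywords (prior beta distribution, conditional binomial distribution) point to: a Beta prior on the annual exceedance probability is updated with the 43-year record, and the decision minimises posterior expected loss. The prior parameters, exceedance count and losses are illustrative assumptions:

    # Beta prior on p = P(annual maximum exceeds the danger level).
    a, b = 1.0, 1.0            # illustrative uniform prior
    n, s = 43, 6               # hypothetical: 6 exceedances in 43 years

    a_post, b_post = a + s, b + n - s
    p_flood = a_post / (a_post + b_post)   # posterior predictive for next year

    loss_protect = 10.0                    # cost of protection works (arbitrary units)
    loss_no_protect = 120.0                # crop loss if the land is inundated

    risk_protect = loss_protect
    risk_no_protect = p_flood * loss_no_protect
    print(f"P(flood next year) = {p_flood:.3f}")
    print("decision:", "protect" if risk_protect < risk_no_protect else "do not protect")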

Keywords: Bayesian decision method, conditional binomial distribution, minimax rules, prior beta distribution.

13244 Dengue Disease Mapping with Standardized Morbidity Ratio and Poisson-gamma Model: An Analysis of Dengue Disease in Perak, Malaysia

Authors: N. A. Samat, S. H. Mohd Imam Ma’arof

Abstract:

Dengue disease is an infectious vector-borne viral disease that is commonly found in tropical and sub-tropical regions around the world, especially in urban and semi-urban areas, including in Malaysia. There is currently no available vaccine or chemotherapy for the prevention or treatment of dengue disease, so prevention and treatment depend on vector surveillance and control measures. Disease risk mapping has been recognized as an important tool in prevention and control strategies for diseases. The choice of statistical model used for relative risk estimation is important, as a good model will subsequently produce a good disease risk map. Therefore, the aim of this study is to estimate the relative risk for dengue disease based initially on the most common statistic used in disease mapping, the Standardized Morbidity Ratio (SMR), and on one of the earliest applications of Bayesian methodology, the Poisson-gamma model. This paper begins by providing a review of the SMR method, which we then apply to dengue data from Perak, Malaysia. We then fit an extension of the SMR method, the Poisson-gamma model. Both sets of results are displayed and compared using graphs, tables and maps. The analysis shows that the latter method gives better relative risk estimates than the SMR. The Poisson-gamma model has been demonstrated to overcome the problem that arises with the SMR when there are no observed dengue cases in certain regions. However, covariate adjustment in this model is difficult, and there is no possibility of allowing for spatial correlation between risks in adjacent areas. These drawbacks have motivated many researchers to propose other alternative methods for estimating the risk.
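
The two estimators compared reduce to simple closed forms: SMR_i = O_i / E_i and, under a Gamma(a, b) prior on the relative risk, the Poisson-gamma posterior mean (a + O_i) / (b + E_i), which shrinks the unstable SMR toward the prior mean and gives a non-degenerate estimate when O_i = 0. A small Python sketch with illustrative counts, not the Perak data:

    # Observed and expected dengue counts for four hypothetical districts.
    observed = [12, 0, 5, 30]
    expected = [10.0, 2.5, 6.0, 24.0]

    a, b = 2.0, 2.0  # illustrative Gamma prior (prior mean relative risk = 1)

    for o, e in zip(observed, expected):
        smr = o / e
        pg = (a + o) / (b + e)   # Poisson-gamma posterior mean relative risk
        print(f"O={o:>2}, E={e:>4}: SMR={smr:.2f}, Poisson-gamma={pg:.2f}")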

Keywords: Dengue disease, Disease mapping, Standardized Morbidity Ratio, Poisson-gamma model, Relative risk.

13243 Determination of Poisson’s Ratio and Elastic Modulus of Compression Textile Materials

Authors: Chongyang Ye, Rong Liu

Abstract:

Compression textiles such as compression stockings (CSs) have been extensively applied for the prevention and treatment of chronic venous insufficiency of the lower extremities. The involvement of multiple mechanical factors, such as interface pressure, frictional force, and elastic materials, makes the interactions between the lower limb and CSs complex. Determination of the Poisson's ratio and elastic moduli of CS materials is critical for constructing finite element (FE) models to numerically simulate the complex interactive system of CS and lower limb. In this study, a mixed approach, combining an analytic model based on the orthotropic Hooke's law with an experimental study (uniaxial tension testing and pure shear testing), is proposed to determine the Young's modulus, Poisson's ratio, and shear modulus of CS fabrics. The results indicated a linear relationship between the stress and strain properties of the studied CS samples under controlled stretch ratios (< 100%). The proposed method and the determined key mechanical properties of elastic orthotropic CS fabrics facilitate FE modelling for an in-depth analysis of the effects of compression material design on the resultant biomechanical function in compression therapy.
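
Under plane stress, the orthotropic Hooke's law referred to above takes the compliance form below, with the symmetry constraint nu12/E1 = nu21/E2. A minimal numpy sketch (the moduli are illustrative placeholders, not the measured CS values):

    import numpy as np

    # Illustrative orthotropic constants for a knitted fabric (plane stress).
    E1, E2, nu12, G12 = 0.60, 0.35, 0.30, 0.10   # MPa; course/wale directions
    nu21 = nu12 * E2 / E1                        # symmetry of the compliance matrix

    S = np.array([
        [1 / E1,     -nu21 / E2, 0],
        [-nu12 / E1, 1 / E2,     0],
        [0,          0,          1 / G12],
    ])

    stress = np.array([0.02, 0.01, 0.0])         # [sigma_1, sigma_2, tau_12] in MPa
    strain = S @ stress
    print(strain)                                # [eps_1, eps_2, gamma_12]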

Keywords: Elastic compression stockings, Young’s modulus, Poisson’s ratio, shear modulus, mechanical analysis.

13242 Testing Loaded Programs Using Fault Injection Technique

Authors: S. Manaseer, F. A. Masooud, A. A. Sharieh

Abstract:

Fault tolerance is critical in many of today's large computer systems. This paper focuses on improving fault tolerance through testing, concentrating on memory faults: how to access the editable part of a process's memory space and how this part is affected. A special Software Fault Injection Technique (SFIT) is proposed for this purpose. This is done by sequentially scanning the memory of the target process and trying to edit the maximum number of bytes inside that memory. The technique was implemented and tested on a group of programs in software packages such as jet-audio, Notepad, Microsoft Word, Microsoft Excel, and Microsoft Outlook. The results from the test sample processes indicate that the size of the scanned area depends on several factors: process size, process type, and the virtual memory size of the machine under test. The results show that increasing the process size increases the scanned memory space. They also show that input-output processes have a larger scanned area than other processes. Increasing the virtual memory size also affects the size of the scanned area, but only up to a certain limit.
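
The paper's experiments targeted Windows applications; purely to illustrate the scanning step, on Linux the writable (editable) part of a process's address space can be enumerated from /proc/<pid>/maps. A minimal sketch inspecting the current process (the injection itself is omitted):

    import os

    def writable_regions(pid):
        """Yield (start, end) of writable mappings from /proc/<pid>/maps."""
        with open(f"/proc/{pid}/maps") as maps:
            for line in maps:
                addr, perms = line.split()[:2]
                if "w" in perms:  # writable region
                    start, end = (int(p, 16) for p in addr.split("-"))
                    yield start, end

    total = sum(end - start for start, end in writable_regions(os.getpid()))
    print(f"writable address space: {total / 1024:.0f} KiB")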

Keywords: Complex software systems, Error detection, Fault tolerance, Injection and testing methodology, Memory faults, Process and virtual memory.

13241 Reliability Modeling and Data Analysis of Vacuum Circuit Breaker Subject to Random Shocks

Authors: Rafik Medjoudj, Rabah Medjoudj, D. Aissani

Abstract:

Electrical substation components are often subject to degradation due to over-voltage or over-current caused by a short circuit or lightning. Particular interest is given to the circuit breaker, given the importance of its function and the danger posed by its failure. This component degrades gradually with use, and it is also subject to the shock process resulting from the stress of isolating the fault when a short circuit occurs in the system. In this paper, based on developments in failure mechanisms, the wear-out of the circuit breaker contacts is modelled. The aim of this work is to evaluate the breaker's reliability and consequently its residual lifetime. The shock process is based on two random variables: the arrival times of shocks and their magnitudes. The arrival of shocks is modelled using a homogeneous Poisson process (HPP). By simulation, the dates of short-circuit arrivals were generated, accompanied by their magnitudes. The same simulation principle is applied to the amount of cumulative wear of the contacts. The objective is to formulate the wear function in terms of the number of solicitations of the circuit breaker.
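
A minimal Python sketch of the simulation described: exponential inter-arrival times generate the HPP of short-circuit shocks, each shock draws a random magnitude, and wear accumulates until a threshold is crossed. The rate, magnitude law and threshold are illustrative assumptions, not the paper's fitted values:

    import numpy as np

    rng = np.random.default_rng(3)
    rate = 0.8          # shocks per year (HPP intensity, illustrative)
    threshold = 25.0    # cumulative contact wear at which the breaker fails
    horizon = 60.0      # years simulated

    t, wear = 0.0, 0.0
    while t < horizon and wear < threshold:
        t += rng.exponential(1.0 / rate)            # HPP inter-arrival time
        wear += rng.lognormal(mean=0.0, sigma=0.5)  # random shock magnitude

    status = "failed" if wear >= threshold else "survived"
    print(f"{status} at t = {t:.1f} yr with cumulative wear {wear:.1f}")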

Keywords: Reliability, short-circuit, shock models.
