Search results for: Software reliability models.
4854 Application of Artificial Neural Network for Predicting Maintainability Using Object-Oriented Metrics
Authors: K. K. Aggarwal, Yogesh Singh, Arvinder Kaur, Ruchika Malhotra
Abstract:
The importance of software quality is increasing, leading to the development of new sophisticated techniques that can be used to construct models for predicting quality attributes. One such technique is the Artificial Neural Network (ANN). This paper examines the application of ANN for software quality prediction using Object-Oriented (OO) metrics. Quality estimation includes estimating the maintainability of software. The dependent variable in our study was maintenance effort. The independent variables were principal components of eight OO metrics. The results showed that the Mean Absolute Relative Error (MARE) of the ANN model was 0.265. We therefore found the ANN method useful in constructing software quality models.
Keywords: Software quality, Measurement, Metrics, Artificial neural network, Coupling, Cohesion, Inheritance, Principal component analysis.
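As a worked illustration of the error measure reported above, the sketch below computes the Mean Absolute Relative Error (MARE) for a handful of hypothetical effort values; the study's actual dataset, network configuration and the 0.265 figure are not reproduced here.

```python
# Mean Absolute Relative Error (MARE) between observed and predicted effort.
# Illustrative values only; the study's data are not reproduced here.

def mare(observed, predicted):
    """MARE = mean(|observed - predicted| / observed)."""
    return sum(abs(o - p) / o for o, p in zip(observed, predicted)) / len(observed)

observed_effort = [120.0, 85.0, 300.0, 40.0]     # hypothetical maintenance effort
predicted_effort = [100.0, 90.0, 240.0, 52.0]    # hypothetical ANN predictions

print(f"MARE = {mare(observed_effort, predicted_effort):.3f}")
```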
4853 Reliability Assessment of Bangladesh Power System Using Recursive Algorithm
Authors: Nahid-Al-Masood, Jubaer Ahmed, Amina Hasan Abedin, S. R. Deeba, Faeza Hafiz, Mahmuda Begum
Abstract:
An electric utility's main concern is to plan, design, operate and maintain its power supply to provide an acceptable level of reliability to its users. This clearly requires that standards of reliability be specified and used in all three sectors of the power system, i.e., generation, transmission and distribution. That is why the reliability of a power system is always a major concern to power system planners. This paper presents the reliability analysis of the Bangladesh Power System (BPS). The reliability index, loss of load probability (LOLP), of BPS is evaluated using a recursive algorithm, considering no de-rated states of generators. BPS has sixty-one generators and a total installed capacity of 5275 MW. The maximum demand of BPS is about 5000 MW. The relevant data of the generators and hourly load profiles are collected from the National Load Dispatch Center (NLDC) of Bangladesh, and the reliability index LOLP is assessed for the period of the last ten years.
Keywords: Recursive algorithm, LOLP, forced outage rate, cumulative probability.
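For readers unfamiliar with the recursive technique mentioned in the abstract, the sketch below builds a capacity outage probability table unit by unit (two-state units, no de-rated states) and evaluates LOLP against an hourly load profile. The generator fleet and load values are placeholders, not the BPS data.

```python
# A sketch of the classical recursive capacity-outage algorithm and an hourly
# LOLP evaluation. Unit data and the load profile are hypothetical.

def build_copt(units):
    """units: list of (capacity_mw, forced_outage_rate). Returns a dict
    {outage_mw: probability that capacity on outage >= outage_mw}."""
    exact = {0.0: 1.0}                       # exact probability of each outage level
    for cap, q in units:                     # add one unit at a time
        updated = {}
        for out, p in exact.items():
            updated[out] = updated.get(out, 0.0) + p * (1.0 - q)      # unit available
            updated[out + cap] = updated.get(out + cap, 0.0) + p * q  # unit on outage
        exact = updated
    cumulative, running = {}, 0.0            # convert to cumulative P(outage >= X)
    for out in sorted(exact, reverse=True):
        running += exact[out]
        cumulative[out] = running
    return cumulative

def lolp(copt, installed_mw, hourly_load_mw):
    """Expected fraction of hours in which available capacity cannot meet the load."""
    levels = sorted(copt)
    risk = 0.0
    for load in hourly_load_mw:
        reserve = installed_mw - load
        # probability that capacity on outage exceeds the reserve margin
        risk += next((copt[x] for x in levels if x > reserve), 0.0)
    return risk / len(hourly_load_mw)

units = [(200.0, 0.05)] * 10 + [(100.0, 0.08)] * 5   # hypothetical generator fleet
installed = sum(c for c, _ in units)
load_profile = [1800.0, 2000.0, 2200.0, 2100.0]      # hypothetical hourly loads (MW)

print(f"LOLP = {lolp(build_copt(units), installed, load_profile):.6f}")
```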
4852 Stepwise Refinement in Executable-UML for Embedded System Design: A Preliminary Study
Authors: Nurul Azma Zakaria, Masahiro Kimura, Noriko Matsumoto, Norihiko Yoshida
Abstract:
The fast growth in complexity, coupled with requests for shorter development periods for embedded systems, is creating demand for a more effective, i.e. higher-abstraction, design process for hardware/software integrated design. In the software engineering area, Model Driven Architecture (MDA) and Executable UML (xUML) have been accepted as bringing further improvement to software design. This paper constructs MDA and xUML stepwise transformations from an abstract specification model to a more concrete implementation model using the refactoring technique for hardware/software integrated design. This approach provides clear and structured models which enable quick exploration and synthesis, and early-stage verification.
Keywords: Hardware/software integrated design, model driven architecture, executable UML, refactoring.
4851 Towards Model-Driven Communications
Authors: Antonio Natali, Ambra Molesini
Abstract:
In modern distributed software systems, the issue of communication among composing parts represents a critical point, but the idea of extending conventional programming languages with general-purpose communication constructs seems difficult to realize. As a consequence, there is a (growing) gap between the abstraction level required by distributed applications and the concepts provided by platforms that enable communication. This work discusses how the Model Driven Software Development approach can be considered a mature technology to automatically generate the schematic part of applications related to communication, while providing high-level specialized languages useful in all the phases of software production. To achieve this goal, a stack of languages (meta-metamodels) has been introduced in order to describe, at different levels of abstraction, the collaborative behavior of generic entities in terms of communication actions related to a taxonomy of messages. Finally, the generation of platforms for communication is viewed as a form of specification of language semantics, which provides executable models of applications together with model-checking support and effective runtime environments.
Keywords: Interactions, specific languages, meta-models, model driven development.
4850 Data Mining Classification Methods Applied in Drug Design
Authors: Mária Stachová, Lukáš Sobíšek
Abstract:
Data mining incorporates a group of statistical methods used to analyze a set of information, or a data set. It operates with models and algorithms, which are powerful tools with great potential. They can help people understand the patterns in a certain chunk of information, so it is obvious that data mining tools have a wide area of applications. For example, in theoretical chemistry data mining tools can be used to predict molecule properties or improve computer-assisted drug design. Classification analysis is one of the major data mining methodologies. The aim of this contribution is to create a classification model that would be able to deal with a huge data set with high accuracy. For this purpose, logistic regression, Bayesian logistic regression and random forest models were built using the R software. A Bayesian logistic regression model in the Latent GOLD software was created as well. These classification methods belong to supervised learning methods. It was necessary to reduce the data matrix dimension before constructing the models, and thus factor analysis (FA) was used. The models were applied to predict the biological activity of molecules, potential new drug candidates.
Keywords: Data mining, classification, drug design, QSAR.
4849 Application of Reliability Prediction Model Adapted for the Analysis of the ERP System
Authors: F. Urem, K. Fertalj, Ž. Mikulić
Abstract:
This paper presents the possibilities of using the Weibull statistical distribution in modeling the distribution of defects in ERP systems. There follows a case study, which examines helpdesk records of defects that were reported as the result of one ERP subsystem upgrade. The applied modeling yields the reliability of the ERP system from a user perspective, with estimated parameters such as the expected maximum number of defects in one day or the predicted minimum number of defects between two upgrades. The applied measurement-based analysis framework proved to be suitable for predicting future states of the reliability of the observed ERP subsystems.
Keywords: ERP, reliability, Weibull
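A minimal sketch of this kind of Weibull modelling is shown below, assuming hypothetical inter-arrival times between helpdesk defect reports; it uses SciPy's `weibull_min` to estimate the shape and scale parameters and a survival probability. The case-study data and the paper's estimated parameters are not reproduced.

```python
# Weibull-based modelling of times between reported defects (illustrative only).

import numpy as np
from scipy.stats import weibull_min

# hypothetical days between consecutive defect reports after an upgrade
interarrival_days = np.array([0.5, 0.8, 1.2, 1.9, 2.5, 3.4, 4.1, 6.0, 7.5, 9.8])

# fit a two-parameter Weibull (location fixed at zero)
shape, _, scale = weibull_min.fit(interarrival_days, floc=0)

# probability that the subsystem survives t days without a new defect report
t = 5.0
reliability = weibull_min.sf(t, shape, loc=0, scale=scale)

print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.2f}")
print(f"P(no defect within {t} days) = {reliability:.3f}")
```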
4848 Multi-view Description of Real-Time Systems' Architecture
Authors: A. Bessam, M. T. Kimour
Abstract:
Real-time embedded systems should benefit from component-based software engineering to handle complexity and deal with dependability. In these systems, applications should not only be logically correct but also behave within time windows. However, among current component-based software engineering approaches, few component models handle time properties in a manner that allows efficient analysis and checking at the architectural level. In this paper, we present a meta-model for component-based software description that integrates timing issues. To achieve a complete functional model of software components, our meta-model focuses on four functional aspects: interface, static behavior, dynamic behavior, and interaction protocol. With each aspect we have explicitly associated a time model. Such a time model can be used to check a component's design against certain properties and to compute the timing properties of component assemblies.
Keywords: Real-time systems, Software architecture, software component, dependability, time properties, ADL, metamodeling.
4847 Software Product Quality Evaluation Model with Multiple Criteria Decision Making Analysis
Authors: C. Ardil
Abstract:
This paper presents a software product quality evaluation model based on the ISO/IEC 25010 quality model. The evaluation characteristics and sub characteristics were identified from the ISO/IEC 25010 quality model. The multidimensional structure of the quality model is based on characteristics such as functional suitability, performance efficiency, compatibility, usability, reliability, security, maintainability, and portability, and their associated sub characteristics. Random numbers are generated to establish the decision maker's importance weights for each sub characteristic. Random numbers are also generated to establish the decision matrix of the decision maker's final scores for each software product against each sub characteristic. Thus, objective criteria importance weights and index scores for the datasets were obtained from the random numbers. In the proposed model, five different software product quality evaluation datasets under three different weight vectors were applied to the multiple criteria decision analysis method preference analysis for reference ideal solution (PARIS), for comparison and a sensitivity analysis procedure. This study contributes to a better understanding of the application of MCDMA methods and ISO/IEC 25010 quality model guidelines in the software product quality evaluation process.
Keywords: ISO/IEC 25010 quality model, multiple criteria decisions making, multiple criteria decision making analysis, MCDMA, PARIS, Software Product Quality Evaluation Model, Software Product Quality Evaluation, Software Evaluation, Software Selection, Software
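The data-generation step described in the abstract can be illustrated as follows: random, normalised importance weights and a random decision matrix of product scores. The PARIS aggregation itself is not detailed in the abstract, so a plain weighted sum serves here only as a stand-in ranking step.

```python
# Random weights and a random decision matrix for an MCDMA-style evaluation.
# The PARIS method is not reproduced; a weighted sum is a placeholder ranking.

import numpy as np

rng = np.random.default_rng(seed=42)

n_products = 5          # alternatives (software products)
n_criteria = 8          # e.g. one criterion per ISO/IEC 25010 characteristic

# random importance weights, normalised so they sum to one
weights = rng.random(n_criteria)
weights /= weights.sum()

# random decision matrix: score of each product against each criterion
scores = rng.random((n_products, n_criteria))

# vector normalisation of each criterion column, then weighted aggregation
normalised = scores / np.linalg.norm(scores, axis=0)
overall = normalised @ weights

for rank, idx in enumerate(np.argsort(-overall), start=1):
    print(f"rank {rank}: product {idx + 1} (score {overall[idx]:.3f})")
```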
4846 Optimal Sizing of a Hybrid Wind/PV Plant Considering Reliability Indices
Authors: S. Dehghan, B. Kiani, A. Kazemi, A. Parizad
Abstract:
The utilization of renewable energy sources in electric power systems is increasing quickly because of public apprehension about unpleasant environmental impacts and the increase in the energy costs involved with the use of conventional energy sources. Although the application of these energy sources can considerably diminish system fuel costs, they can also have a significant influence on system reliability. Therefore, an appropriate combination of the system reliability indices level and the capital investment costs of the system is vital. This paper presents a hybrid wind/photovoltaic plant, with the aim of supplying the IEEE reliability test system load pattern while the plant capital investment costs are minimized by applying a hybrid particle swarm optimization (PSO) / harmony search (HS) approach, and the system fulfills the appropriate level of reliability.
Keywords: Distributed Generation, Fuel Cell, HS, Hybrid Power Plant, PSO, Photovoltaic, Reliability.
4845 Harnessing Nigeria's Forestry Potential for Structural Applications: Structural Reliability of Nigerian Grown Opepe Timber
Authors: J. I. Aguwa, S. Sadiku, M. Abdullahi
Abstract:
This study examined the structural reliability of Nigerian grown Opepe timber as a bridge beam material. The strength of a particular species of timber depends greatly on factors such as the soil and environment in which it is grown. The steps involved are collection of the Opepe timber samples, seasoning/preparation of the test specimens, determination of the strength properties/statistical analysis, development of a computer programme in the FORTRAN language and finally structural reliability analysis using the FORM 5 software. The results revealed that Nigerian grown Opepe is a reliable and durable structural bridge beam material for a span of 5000 mm, depth of 400 mm, breadth of 250 mm and end bearing length of 150 mm. The probabilities of failure in bending parallel to the grain, compression perpendicular to the grain, shear parallel to the grain and deflection are 1.61 × 10⁻⁷, 1.43 × 10⁻⁸, 1.93 × 10⁻⁴ and 1.51 × 10⁻¹⁵ respectively. The paper recommends the establishment of Opepe plantations in various Local Government Areas in Nigeria for structural applications such as bridges and railway sleepers, as well as for generating income for the nation and creating employment for the numerous unemployed youths.
Keywords: Bending and deflection, Bridge beam, Compression, Nigerian Opepe, Shear, Structural reliability.
4844 Physical Parameters for Reliability Evaluation
Abstract:
This paper presents ageing experiments controlled by the evolution of junction parameters. The deterioration of the device is related to high injection effects which modify the transport mechanisms in the space charge region of the junction. Physical phenomena linked to the degradation of junction parameters that affect device reliability are reported and discussed. We have used a method based on numerical analysis of the experimental current-voltage characteristic of the junction in order to extract the electrical parameters. The simultaneous follow-up of the evolution of the series resistance and of the transition voltage allows us to introduce a new parameter for reliability evaluation.
Keywords: High injection, junction, parameters, reliability
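As a rough illustration of parameter extraction from a current-voltage characteristic, the sketch below fits the ideality factor and saturation current from the low-current region of a synthetic I-V curve and then estimates the series resistance from the high-current slope. This is a common textbook-style approximation under assumed ideal-diode behaviour, not the authors' numerical procedure.

```python
# Extracting junction parameters (n, Is, Rs) from a synthetic I-V characteristic.

import numpy as np

k_T_over_q = 0.02585  # thermal voltage at ~300 K, in volts

# synthetic measurement: ideal diode (Is, n) plus a series-resistance voltage drop
Is_true, n_true, Rs_true = 1e-12, 1.8, 2.0
I = np.logspace(-9, -1, 50)                              # current, ampere
V = n_true * k_T_over_q * np.log(I / Is_true + 1) + Rs_true * I

# low-current region: ln(I) vs V is a straight line -> ideality factor and Is
low = I < 1e-5
slope, intercept = np.polyfit(V[low], np.log(I[low]), 1)
n_est = 1.0 / (slope * k_T_over_q)
Is_est = np.exp(intercept)

# high-current region: dV/dI minus the diode term approaches the series resistance
high = I > 1e-2
Rs_est = np.mean(np.gradient(V[high], I[high]) - n_est * k_T_over_q / I[high])

print(f"n ~ {n_est:.2f}, Is ~ {Is_est:.2e} A, Rs ~ {Rs_est:.2f} ohm")
```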
4843 Software Maintenance Severity Prediction with Soft Computing Approach
Authors: E. Ardil, Erdem Uçar, Parvinder S. Sandhu
Abstract:
As the majority of faults are found in only a few of a system's modules, there is a need to investigate the modules that are affected severely as compared to other modules, and proper maintenance needs to be done on time, especially for critical applications. In this paper, we have explored different predictor models on NASA's public domain defect dataset coded in the Perl programming language. Different machine learning algorithms belonging to the different learner categories of the WEKA project, including a Mamdani-based fuzzy inference system and a neuro-fuzzy based system, have been evaluated for modeling maintenance severity or the impact of fault severity. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). The results show that the neuro-fuzzy based model provides relatively better prediction accuracy as compared to the other models and hence can be used for maintenance severity prediction of the software.
Keywords: Software Metrics, Fuzzy, Neuro-Fuzzy, Software Faults, Accuracy, MAE, RMSE.
4842 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data
Authors: Ruchika Malhotra, Megha Khanna
Abstract:
The development of change prediction models can help software practitioners in planning testing and inspection resources at early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of software quality data. A dataset with very few minority outcome categories leads to an inefficient learning process, and a classification model developed from the imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change prone whereas a majority of classes may be non-change prone. This study explores various alternatives for adeptly handling the imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics while dealing with imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of the open-source Android data set and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
Keywords: Change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics.
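One of the sampling alternatives mentioned above, random oversampling of the minority (change-prone) class before training, can be sketched as follows; the data are synthetic stand-ins for the Android change data, and the evaluation uses imbalance-robust measures.

```python
# Random oversampling of the minority class plus imbalance-robust evaluation.
# Synthetic data only; not the study's datasets or learners.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, balanced_accuracy_score

rng = np.random.default_rng(0)

# synthetic "OO metrics" with a small change-prone minority class
X = rng.normal(size=(1000, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=2.0, size=1000) > 2.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# random oversampling: duplicate minority-class rows until the classes are balanced
minority = np.flatnonzero(y_tr == 1)
extra = rng.choice(minority, size=(y_tr == 0).sum() - minority.size, replace=True)
X_bal = np.vstack([X_tr, X_tr[extra]])
y_bal = np.concatenate([y_tr, y_tr[extra]])

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_bal, y_bal)
probs = model.predict_proba(X_te)[:, 1]

print(f"ROC AUC           = {roc_auc_score(y_te, probs):.3f}")
print(f"balanced accuracy = {balanced_accuracy_score(y_te, (probs > 0.5).astype(int)):.3f}")
```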
4841 Requirement Engineering and Software Product Line Scoping Paradigm
Authors: Ahmed Mateen, Zhu Qingsheng, Faisal Shahzad
Abstract:
Requirement Engineering (RE) is the part of the software development lifecycle in which the structure of the software to be built is established. Software product line development is a new topic area within the domain of software engineering. It also plays an important role in decision making, and it is ultimately helpful, in a growing business environment, for productive software development. Decisions are central to engineering processes, and they hold them together. It is argued that better decisions will lead to better engineering. Achieving better decisions requires that they are understood in detail. In order to address these issues, companies are moving towards Software Product Line Engineering (SPLE), which helps in providing large varieties of products with minimum development effort and cost. This paper proposes a new framework for software product lines and compares it with other models. The results can help to understand the needs in SPL testing by identifying points that still require additional investigation. In future work, we will combine this model in a controlled environment with industrial SPL projects, which will be the new horizon for SPL process management testing strategies.
Keywords: Requirements engineering, software product lines, scoping, process structure, domain specific language.
4840 Software Development Processes Maturity versus Software Processes and Products Measurement
Authors: Beata Czarnacka-Chrobot
Abstract:
The unsatisfactory effectiveness of software system development and enhancement projects is one of the main reasons why, in software engineering, attempts are being made to use experience from other engineering disciplines. In spite of the specificity of software products and processes, a belief has emerged that the execution of software projects could be more effective if these objects were subject to measurement, as is true in other engineering disciplines for which measurement is an immanent feature. Thus objective and reliable approaches to the measurement of software processes and products have been sought in software engineering for several decades already. This may be proved, among others, by the current version of the CMMI for Development model. This paper is aimed at analyzing the approach to software process and product measurement proposed in the latest version of this model, indicating growing acceptance of this issue in software engineering.
Keywords: CMMI for Development (1.3), ISO/IEC standards, measurement and analysis process area, software process measurement, software product measurement.
4839 Reasons for Non-Applicability of Software Entropy Metrics for Bug Prediction in Android
Authors: Arvinder Kaur, Deepti Chopra
Abstract:
Software entropy metrics for bug prediction have been validated on various software systems by different researchers. In our previous research, we validated that software entropy metrics calculated for Mozilla subsystems predict future bugs reasonably well. In this study, the software entropy metrics are calculated for a subsystem of Android, and it is noticed that these metrics are not suitable for bug prediction. The results are compared with a subsystem of Mozilla, and a comparison is made between the two software systems to determine the reasons why software entropy metrics are not applicable for Android.
Keywords: Android, bug prediction, mining software repositories, Software Entropy.
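Software entropy metrics of the kind referred to above are typically based on the Shannon entropy of how changes are distributed over the files of a subsystem in a period; the sketch below shows that calculation on hypothetical change counts (the exact metric definitions used in the study are not reproduced).

```python
# Shannon entropy of the change distribution across files in one period.
# File names and change counts are hypothetical.

import math

def change_entropy(changes_per_file):
    """Normalised entropy of the change distribution, between 0 and 1."""
    total = sum(changes_per_file.values())
    if total == 0 or len(changes_per_file) <= 1:
        return 0.0
    probs = [c / total for c in changes_per_file.values() if c > 0]
    h = -sum(p * math.log2(p) for p in probs)
    return h / math.log2(len(changes_per_file))   # divide by maximum possible entropy

focused_period = {"A.java": 9, "B.java": 1, "C.java": 0}     # changes concentrated
scattered_period = {"A.java": 4, "B.java": 3, "C.java": 3}   # changes scattered

print(f"focused   period entropy = {change_entropy(focused_period):.2f}")
print(f"scattered period entropy = {change_entropy(scattered_period):.2f}")
```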
4838 Application of Life Data Analysis for the Reliability Assessment of Numerical Overcurrent Relays
Authors: Mohd Iqbal Ridwan, Kerk Lee Yen, Aminuddin Musa, Bahisham Yunus
Abstract:
Protective relays are components of a protection system in the power system domain that provide the decision-making element for correct protection and fault clearing operations. Failure of the protection devices may reduce the integrity and reliability of the power system protection, which will impact the overall performance of the power system. Hence it is imperative for power utilities to assess the reliability of protective relays to ensure they will perform their intended function without failure. This paper discusses the application of reliability analysis using a statistical method called Life Data Analysis in Tenaga Nasional Berhad (TNB), a government-linked power utility company in Malaysia, namely its Transmission Division, to assess and evaluate the reliability of numerical overcurrent protective relays from two different manufacturers.
Keywords: Life data analysis, Protective relays, Reliability, Weibull Distribution.
4837 An Efficient Algorithm for Reliability Lower Bound of Distributed Systems
Authors: Mohamed H. S. Mohamed, Yang Xiao-zong, Liu Hong-wei, Wu Zhi-bo
Abstract:
The reliability of distributed systems and computer networks has been modeled by a probabilistic network or a graph G. Computing the residual connectedness reliability (RCR), denoted by R(G), under the node fault model is very useful, but it is an NP-hard problem. Since it may need time exponential in the network size to compute the exact value of R(G), it is important to calculate a tight approximate value, especially a lower bound, in moderate calculation time. In this paper, we propose an efficient algorithm for the reliability lower bound of distributed systems with unreliable nodes. We also apply our algorithm to several typical classes of networks to evaluate the lower bounds and show the effectiveness of our algorithm.
Keywords: Distributed systems, probabilistic network, residual connectedness reliability, lower bound.
4836 Development of an Attitude Scale Towards Social Networking Sites
Authors: Münevver Başman, Deniz Gülleroğlu
Abstract:
The purpose of this study is to develop a scale to determine attitudes towards social networking sites. 45 tryout items, prepared for this aim, were administered to 342 students studying at Marmara University, Faculty of Education. The reliability and validity analyses of the scale were conducted with the help of these students. As a result of exploratory factor analysis with Varimax rotation, 41 items grouped into a structure with three factors (interest, reality and negative effects) were obtained. The alpha reliability of the scale was obtained as .899, while the reliabilities of the factors were obtained as .899, .799 and .775, respectively.
Keywords: Attitude, reliability, social networking sites, validity.
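The reported alpha reliability corresponds to Cronbach's alpha; a minimal sketch of that computation on a made-up response matrix (respondents × items) is shown below. The study's actual item responses are not reproduced.

```python
# Cronbach's alpha for a small, made-up Likert response matrix.

import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: respondents x items matrix of scale responses."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1).sum()
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
])

print(f"Cronbach's alpha = {cronbach_alpha(responses):.3f}")
```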
4835 GEP Considering Purchase Prices, Profits of IPPs and Reliability Criteria Using Hybrid GA and PSO
Authors: H. Shayeghi, H. Hosseini, A. Shabani, M. Mahdavi
Abstract:
In this paper, optimal generation expansion planning (GEP) is investigated considering purchase prices, profits of independent power producers (IPPs) and reliability criteria, using a new method based on a hybrid coded Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). In this approach, the optimal purchase price of each IPP is obtained by the HCGA and the reliability criteria are calculated by the PSO technique. It should be noted that reliability criteria and the rate of carbon dioxide (CO2) emission have been considered as constraints of the GEP problem. Finally, the proposed method has been tested on the case study system. The evaluation of the results shows that the proposed method can simply obtain the optimal purchase prices of IPPs and is a fast method for the calculation of reliability criteria in expansion planning. Also, considering the optimal purchase prices and profits of IPPs in generation expansion planning causes the expansion costs to decrease and the problem to be solved more exactly.
Keywords: GEP Problem, IPPs, Reliability Criteria, GA, PSO.
4834 Strongly Adequate Software Architecture
Authors: Pradip Peter Dey
Abstract:
Components of a software system may be related in a wide variety of ways. These relationships need to be represented in the software architecture in order to develop quality software. In practice, software architecture is immensely challenging, strikingly multifaceted, extravagantly domain based, perpetually changing, rarely cost-effective, and deceptively ambiguous. This paper analyses relations among the major components of software systems and argues for using several broad categories of software architecture for assessment purposes: strongly adequate, weakly adequate and functionally adequate software architectures, among other categories. These categories are intended for formative assessments of architectural designs.
Keywords: Components, Model Driven Architecture, Graphical User Interfaces.
4833 Operation Strategy of Multi-Energy Storage System Considering Power System Reliability
Authors: Wook-Won Kim, Je-Seok Shin, Jin-O Kim
Abstract:
As the penetration of Energy Storage Systems (ESS) in the power system increases due to higher performance and lower cost than ever, ESS is expanding its role to ancillary services as well as the storage of extra energy from intermittent renewable energy resources. For multiple ESS units with different capacities and SOC levels, it is required to make an optimal schedule of SOC levels to use the multi-ESS effectively. This paper proposes an energy allocation method for a multiple-battery ESS with a reliability constraint, in order to make the ESS discharge the required energy for as long as possible. A simple but effective method is proposed in this paper to satisfy the power required for the spinning reserve while improving system reliability. Modelling of the ESS is also proposed, and reliability is evaluated by using a combined reliability model which includes the proposed ESS model and a conventional generation model. In the case study, it can be observed that the required power is distributed to each ESS adequately and, accordingly, the SOC is scheduled to improve reliability indices such as Loss of Load Probability (LOLP) and Loss of Load Expectation (LOLE).
Keywords: Multiple energy storage system, energy allocation method, SOC schedule, reliability constraints.
4832 The Characterisation of TLC NAND Flash Memory, Leading to a Definable Endurance/Retention Trade-Off
Authors: Sorcha Bennett, Joe Sullivan
Abstract:
Triple-Level Cell (TLC) NAND Flash memory at, and below, 20 nm is still largely unexplored by researchers, and with the ever more commonplace existence of Flash in consumer and enterprise applications there is a need for such gaps in knowledge to be filled. At the time of writing, there was little published data or literature on TLC, and more specifically on reliability testing, with a further emphasis on both endurance and retention. This paper will give an introduction to NAND Flash memory, followed by an overview of the relevant current research on the reliability of Flash memory, along with the planned future work which will provide results to help characterise the reliability of TLC memory.
Keywords: TLC NAND flash memory, reliability, endurance, retention, trade-off, raw flash, patterns.
4831 The Analysis of the Software Industry in Thailand
Authors: Danuvasin Charoen
Abstract:
The software industry has been considered a critical infrastructure for any nation. Several studies have indicated that national competitiveness increasingly depends upon Information and Communication Technology (ICT), and software is one of the major components of ICT, important for both large and small enterprises. Even though there has been strong growth in the software industry in Thailand, the industry has faced many challenges and problems that need to be resolved. For example, the amount of pirated software has been rising, and Thailand still has a large gap in the digital divide. Additionally, adoption among SMEs has been slow. This paper investigates various issues in the software industry in Thailand, using information acquired through analysis of secondary sources, observation, and focus groups. The results of this study can be used as "lessons learned" for the development of the software industry in any developing country.
Keywords: Software industry, developing nations.
4830 A Systematic Literature Review on Security and Privacy Design Patterns
Authors: Ebtehal Aljedaani, Maha Aljohani
Abstract:
Privacy and security patterns are both important for developing software that protects users' data and privacy. Privacy patterns are designed to address common privacy problems, such as unauthorized data collection and disclosure. Security patterns are designed to protect software from attack and ensure reliability and trustworthiness. Using privacy and security patterns, software engineers can implement security and privacy by design principles, which means that security and privacy are considered throughout the software development process. These patterns are available to translate "security and privacy-by-design" into practical advice for software engineering. Previous research on privacy and security patterns has typically focused on one category of patterns at a time. This paper aims to bridge this gap by merging the two categories and identifying their similarities and differences. To do this, we conducted a systematic literature review of 40 research papers on privacy and security patterns. The papers were analyzed based on the category of the pattern, the classification of the pattern, and the security requirements that the pattern addresses. This paper presents the results of a comprehensive review of privacy and security design patterns. The review is intended to help future IT designers understand the relationship between the two types of patterns and how to use them to design secure and privacy-preserving software. The paper provides a clear classification of privacy and security design patterns, along with examples of each type. We found that there is only one widely accepted classification of privacy design patterns, while there are several competing classifications of security design patterns. Three types of security design patterns were found to be the most used.
Keywords: Design patterns, security, privacy, classification of patterns, security patterns, privacy patterns.
4829 Probabilistic Electrical Power Generation Modeling Using Decimal to Binary Conversion
Authors: Ahmed S. Al-Abdulwahab
Abstract:
Generation system reliability assessment is an important task which can be performed using deterministic or probabilistic techniques. The probabilistic approaches have significant advantages over the deterministic methods; however, more complicated modeling is required by the probabilistic approaches. A power generation model is a basic requirement for this assessment. One form of generation model is the well-known capacity outage probability table (COPT). Different analytical techniques have been used to construct the COPT. These approaches require considerable mathematical modeling of the generating units, and the units' models are combined to build the COPT, which adds further burden to the process of creating it. The Decimal to Binary Conversion (DBC) technique is widely and commonly applied in electronic systems and computing. This paper proposes a novel utilization of DBC to create the COPT without engaging in analytical modeling or time-consuming simulations. The simple binary representation, "0" and "1", is used to model the states of the generating units. The proposed technique is proven to be an effective approach to build the generation model.
Keywords: Decimal to Binary, generation, reliability.
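The decimal-to-binary idea can be illustrated directly: every integer from 0 to 2^n − 1 is read as a bit pattern marking which units are on outage, and the corresponding probabilities are accumulated into a COPT. The unit data below are hypothetical, and this brute-force enumeration is only a small-scale illustration of the concept.

```python
# Capacity outage probability table (COPT) built by decimal-to-binary enumeration.

from collections import defaultdict

units = [(50, 0.02), (80, 0.04), (100, 0.05)]   # (capacity MW, forced outage rate)

copt = defaultdict(float)                       # outage capacity -> probability
n = len(units)

for state in range(2 ** n):                     # decimal counter ...
    outage, prob = 0, 1.0
    for i, (cap, q) in enumerate(units):
        if (state >> i) & 1:                    # ... read as a binary state vector
            outage += cap                       # bit set: unit i is on outage
            prob *= q
        else:
            prob *= 1.0 - q                     # bit clear: unit i is available
    copt[outage] += prob

for outage in sorted(copt):
    print(f"outage {outage:4d} MW : probability {copt[outage]:.6f}")
```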
4828 Object-Oriented Cognitive-Spatial Complexity Measures
Authors: Varun Gupta, Jitender Kumar Chhabra
Abstract:
Software maintenance, and mainly software comprehension, poses the largest costs in the software lifecycle. In order to assess the cost of software comprehension, various complexity measures have been proposed in the literature. This paper proposes new cognitive-spatial complexity measures, which combine the impact of the spatial as well as the architectural aspect of the software to compute the software complexity. The spatial aspect of the software complexity is taken into account using the lexical distances (in number of lines of code) between different program elements, and the architectural aspect is taken into consideration using the cognitive weights of control structures present in the control flow of the program. The proposed measures are evaluated using standard axiomatic frameworks and then compared with the corresponding existing cognitive complexity measures as well as the spatial complexity measures for object-oriented software. This study establishes that the proposed measures are better indicators of the cognitive effort required for software comprehension than the other existing complexity measures for object-oriented software.
Keywords: Cognitive complexity, software comprehension, software metrics, spatial complexity, object-oriented software.
4827 Reliability Factors Based Fuzzy Logic Scheme for Spectrum Sensing
Authors: Tallataf Rasheed, Adnan Rashdi, Ahmad Naeem Akhtar
Abstract:
Accurate spectrum sensing is a fundamental requirement of dynamic spectrum access for deployment of a Cognitive Radio Network (CRN). To achieve this requirement, a Reliability Factors based Fuzzy Logic (RFL) scheme for spectrum sensing is proposed in this paper. A Cognitive Radio User (CRU) predicts the presence or absence of the Primary User (PU) using an energy detector and calculates the reliability factors, which are the SNR of the sensing node, the threshold of the energy detector and the decision difference of each node with the other nodes in a cooperative spectrum sensing environment. Then the decision of the energy detector is combined with the reliability factors of the sensing node using fuzzy logic. These reliability factors used in the RFL scheme describe the reliability of the decision made by a CRU in order to improve local spectrum sensing. This fuzzy combining scheme improves the accuracy of the decision made by a sensing node. The simulation results have shown that the proposed technique provides better PU detection probability than existing spectrum sensing techniques.
Keywords: Cognitive radio, spectrum sensing, energy detector, reliability factors, fuzzy logic.
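A rough sketch of the sensing pipeline described above is shown below: each node's energy-detector decision is weighted by an SNR-based reliability factor. The abstract does not give the RFL fuzzy rule base, so a simple weighted soft combination stands in for the fuzzy inference step; SNR values, the threshold and sample counts are all assumed.

```python
# Energy detection per node, fused with SNR-based reliability weights
# (a stand-in for the paper's fuzzy combining, which is not specified here).

import numpy as np

rng = np.random.default_rng(1)

def sense(snr_db, n_samples=1000, pu_present=True):
    """Energy detector statistic for one sensing node (unit noise power)."""
    noise = rng.normal(size=n_samples)
    signal = np.sqrt(10 ** (snr_db / 10)) * rng.normal(size=n_samples) if pu_present else 0.0
    return np.mean((noise + signal) ** 2)

node_snrs_db = [-2.0, 3.0, 8.0]          # hypothetical per-node SNRs
threshold = 1.1                          # energy threshold relative to noise power

energies = [sense(snr, pu_present=True) for snr in node_snrs_db]
decisions = [1.0 if e > threshold else 0.0 for e in energies]

# reliability factor: nodes with higher SNR get more say in the fused decision
weights = np.array([10 ** (snr / 10) for snr in node_snrs_db])
weights /= weights.sum()

fused = float(np.dot(weights, decisions))
print(f"local decisions = {decisions}, fused score = {fused:.2f}, "
      f"PU declared {'present' if fused >= 0.5 else 'absent'}")
```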
4826 Some Pertinent Issues and Considerations on CBSE
Authors: Anil Kumar Tripathi, Ratneshwer
Abstract:
All software engineering research and best industry practices aim at providing software products with a high degree of quality and functionality at low cost and in less time. These requirements are addressed by Component Based Software Engineering (CBSE) as well. CBSE, which deals with software construction by components' assembly, is a revolutionary extension of software engineering. CBSE must define and describe processes to assure the timely completion of high quality software systems that are composed of a variety of pre-built software components. Though these features provide distinct and visible benefits in software design and programming, they also raise some challenging problems. The aim of this work is to summarize the pertinent issues and considerations in CBSE and to build an understanding, in the form of concepts and observations, that may lead to the development of newer ways of dealing with the problems and challenges in CBSE.
Keywords: Software Component, Component Based Software Engineering, Software Process, Testing, Maintenance.
4825 Comparison of Reliability Systems Based Uncertainty
Authors: A. Aissani, H. Benaoudia
Abstract:
Stochastic comparison has been an important direction of research in various areas. It can be done by the use of the notion of stochastic ordering, which gives a qualitative rather than purely quantitative estimation of the system under study. In this paper we present applications of comparison based on uncertainty related to entropy in reliability analysis, for example to design better systems. These results can be used as a priori information in simulation studies.
Keywords: Uncertainty, Stochastic comparison, Reliability, series system, imperfect repair.