Search results for: Software process improvement; Software process optimization.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9195

9045 Modeling Metrics for Monitoring Software Project Performance Based On the GQM Model

Authors: Mariayee Doraisamy, Suhaimi Bin Ibrahim, Mohd Naz’ri Mahrin

Abstract:

There are several methods for monitoring software projects, and the objective of monitoring is to ensure that software projects are developed and delivered successfully. Performance measurement is closely associated with monitoring and can be examined through two important attributes, efficiency and effectiveness, both of which are important for the success of a software project. Consequently, successful steering is achieved by monitoring and controlling a software project via performance measurement criteria and metrics. Hence, this paper aims to identify the performance measurement criteria and the metrics for monitoring the performance of a software project using the Goal Question Metrics (GQM) approach. The GQM approach is used to ensure that the identified metrics are reliable and useful, and the identified metrics serve as guidelines for project managers monitoring the performance of their software projects.
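
As a minimal illustration of the GQM idea (the goal, questions and metric names below are hypothetical examples, not the validated set from the paper), a monitoring goal can be decomposed into questions and candidate metrics:

# Minimal illustrative sketch of a GQM hierarchy for project monitoring.
# The goal, questions and metrics below are hypothetical examples.
gqm = {
    "goal": "Monitor the efficiency of the software project",
    "questions": [
        {
            "question": "Is the project progressing according to schedule?",
            "metrics": ["schedule variance", "milestone completion rate"],
        },
        {
            "question": "Is the effort spent within the planned budget?",
            "metrics": ["effort variance", "cost performance index"],
        },
    ],
}

for q in gqm["questions"]:
    print(q["question"], "->", ", ".join(q["metrics"]))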

Keywords: Software project performance, Goal Question Metrics, Performance Measurement Criteria, Metrics.

9044 Computer Software for Calculating Electron Mobility of Semiconductor Compounds: Case Study for n-GaN

Authors: Emad A. Ahmed

Abstract:

Computer software to calculate electron mobility with respect to different scattering mechanisms has been developed. The software adopts a fully Graphical User Interface (GUI) approach, with its interface designed in Microsoft Visual Basic 6.0. As a case study, the electron mobility of n-GaN was computed using this software. The behavior of the mobility of n-GaN due to elastic scattering processes and its dependence on temperature and doping concentration are discussed. The results agree with other available theoretical and experimental data.
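
The paper's own code is not given here; as a rough sketch of how mobility contributions from independent elastic scattering mechanisms are commonly combined (Matthiessen's rule), with placeholder per-mechanism values:

# Sketch: combine mobility limits from independent scattering mechanisms
# via Matthiessen's rule (1/mu_total = sum_i 1/mu_i). The individual
# mobility values below are placeholders, not the paper's models.

def total_mobility(mobility_components_cm2_per_Vs):
    """Combine per-mechanism mobilities (cm^2/V.s) using Matthiessen's rule."""
    return 1.0 / sum(1.0 / mu for mu in mobility_components_cm2_per_Vs)

# Hypothetical contributions at a given temperature and doping level:
mu_ionized_impurity = 900.0
mu_acoustic_phonon = 1500.0
mu_piezoelectric = 2500.0

print(total_mobility([mu_ionized_impurity, mu_acoustic_phonon, mu_piezoelectric]))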

Keywords: Electron mobility, relaxation time, GaN, scattering, computer software, computational physics.

9043 Methodologies, Systems Development Life Cycle and Modeling Languages in Agile Software Development

Authors: I. D. Arroyo

Abstract:

This article seeks to integrate different concepts from contemporary software engineering with an agile development approach. We clarify some definitions and uses, distinguish between the Systems Development Life Cycle (SDLC) and methodologies, and differentiate types of frameworks such as methodological, philosophical and behavioral frameworks, standards and documentation. We define relationships based on the documentation of the development process through formal and ad hoc models, and we describe the usefulness of DevOps and Agile Modeling as integrative methodologies of principles and best practices.

Keywords: Methodologies, SDLC, modeling languages, agile modeling, DevOps, UML, agile software development.

9042 Study of the Effect of Inclusion of TiO2 in Active Flux on Submerged Arc Welding of Low Carbon Mild Steel Plate and Parametric Optimization of the Process by Using DEA Based Bat Algorithm

Authors: Sheetal Kumar Parwar, J. Deb Barma, A. Majumder

Abstract:

Submerged arc welding is a complex, highly efficient and high-performance welding process. In the present study, an attempt has been made to reduce welding distortion by increasing the amount of oxide flux through TiO2 addition in the submerged arc welding process. Care has been taken to avoid an excessive amount of the additive in order to obtain significant results. A Data Envelopment Analysis (DEA) based BAT algorithm is used for parametric optimization, in which DEA converts the multiple response parameters into a single response parameter. The study also assesses the effectiveness of adding TiO2 to the active flux during the submerged arc welding process.

Keywords: BAT algorithm, design of experiment, optimization, submerged arc welding.

9041 Improving the Effectiveness of Software Testing through Test Case Reduction

Authors: R. P. Mahapatra, Jitendra Singh

Abstract:

This paper proposes a new technique for improving the efficiency of software testing, based on reducing the test cases that have to be executed for any given software. The approach exploits the advantage of regression testing, where fewer test cases reduce the time consumed by testing as a whole. The technique also offers a means to generate test cases automatically. Compared with a technique in the literature in which the tester has no option but to generate test cases manually, the proposed technique provides a better option. For test case reduction, the technique uses simple algebraic conditions to assign fixed values to variables (maximum, minimum and constant values). By doing this, the variable values are limited to a definite range, resulting in fewer possible test cases to process. The technique can also be applied to program loops and arrays.
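
A small sketch of the reduction idea, assuming each input variable is represented only by its minimum, maximum and an optional constant value (the variable names and ranges are illustrative):

# Sketch: reduce the input space by testing only boundary/representative
# values per variable instead of every value in the range.
from itertools import product

def representative_values(lo, hi, constant=None):
    vals = {lo, hi}
    if constant is not None:
        vals.add(constant)
    return sorted(vals)

# Hypothetical input variables with their ranges:
variables = {
    "age": representative_values(0, 120, constant=18),
    "items_in_cart": representative_values(0, 100),
}

reduced_test_cases = [dict(zip(variables, combo))
                      for combo in product(*variables.values())]
print(len(reduced_test_cases), "test cases instead of 121 * 101 exhaustive combinations")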

Keywords: Software Testing, Test Case Generation, Test Case Reduction

9040 Assessing and Improving Ramp-Up Capability

Authors: Sebastian Tschöpe, Konja Knüppel, Peter Nyhuis

Abstract:

In times when product life cycles are decreasing, while market demands are increasing, manufacturing enterprises are confronted with the challenge of more frequent and more complex ramp-ups. Thus it becomes obvious that ramp-up management is going to be a topic enterprises have to focus on in the future. Since each ramp-up is unique concerning the product, the process, the technology, the circumstances and the coaction of these four factors, the knowledge of the ramp-up situation and the current ramp-up capability of the enterprise are fundamental requirements for the subsequent improvement of the ramp-up capability of the production system.

In this article, a methodology is presented that can be used to define typical production ramp-up situations, to identify the current ramp-up capability of a production system and to improve it with respect to a specific situation. Additionally, the functionality of a software tool developed on the basis of this methodology is described.

Keywords: Assessment methodology, ramp-up, ramp-up capability, software-tool.

9039 Bioprocess Optimization Based On Relevance Vector Regression Models and Evolutionary Programming Technique

Authors: R. Simutis, V. Galvanauskas, D. Levisauskas, J. Repsyte

Abstract:

This paper proposes a bioprocess optimization procedure based on Relevance Vector Regression models and an evolutionary programming technique. The Relevance Vector Regression scheme allows a compact and stable data-based process model to be developed, avoiding time-consuming modeling expenses. The model building and process optimization procedure can be carried out in a semi-automated way and repeated after every new cultivation run. The proposed technique was tested on a simulated mammalian cell cultivation process. The obtained results are promising and could be attractive for the optimization of industrial bioprocesses.
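
An illustrative sketch of the two-stage procedure follows: a cheap data-based surrogate model is fitted to cultivation data and an evolutionary programming loop searches its input. A quadratic polynomial stands in for the Relevance Vector Regression model, and all data are hypothetical:

# Sketch: (1) fit a cheap surrogate model to cultivation data,
# (2) search its input with a (mu+lambda)-style evolutionary programming loop.
# The quadratic surrogate stands in for the Relevance Vector Regression model.
import random
import numpy as np

feed_rate = np.array([0.5, 1.0, 1.5, 2.0, 2.5])        # hypothetical data
product_titer = np.array([1.1, 1.9, 2.4, 2.2, 1.6])

surrogate = np.poly1d(np.polyfit(feed_rate, product_titer, deg=2))

population = [random.uniform(0.5, 2.5) for _ in range(10)]
for _ in range(50):                                     # EP: mutate and select
    offspring = [min(2.5, max(0.5, x + random.gauss(0.0, 0.1)))
                 for x in population]
    population = sorted(population + offspring,
                        key=surrogate, reverse=True)[:10]

print("best feed rate:", round(population[0], 3))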

Keywords: Bioprocess optimization, Evolutionary programming, Relevance Vector Regression.

9038 A Comparative Analysis of Zotero and Mendeley Reference Management Software

Authors: Sujit K. Basak

Abstract:

This paper presents a comparison of the reference management software packages Zotero and Mendeley, with results drawn by comparing the two tools. The novelty of this paper is its comparative analysis, which shows that Mendeley can import more information from Google Scholar for researchers. This finding can help researchers choose a reference management tool.

Keywords: Analysis, comparative analysis, Zotero, researchers, Mendeley.

9037 A Survey of Baseband Architecture for Software Defined Radio

Authors: M. A. Fodha, H. Benfradj, A. Ghazel

Abstract:

This paper surveys recent works that propose baseband processor architectures for software defined radio. A classification of the different approaches is proposed. The performance of each architecture is also discussed in order to clarify which approaches meet software-defined radio constraints.

Keywords: Multi-core architectures, reconfigurable architecture, software defined radio.

9036 The Spiral_OWL Model – Towards Spiral Knowledge Engineering

Authors: Hafizullah A. Hashim, Aniza. A

Abstract:

The Spiral development model has been used successfully in many commercial systems and in a good number of defense systems, owing to its cost-effective incremental commitment of funds (via an analogy of the spiral model to stud poker) and its applicability to developing hardware or to integrating software, hardware and systems. To support adaptive, semantic collaboration between domain experts and knowledge engineers, a new knowledge engineering process, called Spiral_OWL, is proposed. This model is based on the idea of iterative refinement, annotation and structuring of the knowledge base, and it is generated by combining the spiral model with knowledge engineering methodology. A central paradigm of the Spiral_OWL model is the concentration on risk-driven determination of the knowledge engineering process. The collaboration aspect comes into play during the knowledge acquisition and knowledge validation phases. The design rationale for the Spiral_OWL model is an easy-to-implement, well-organized and iterative development cycle shaped as an expanding spiral.

Keywords: Domain Expert, Knowledge Base, Ontology, Software Process.

9035 Development of Configuration Software of Space Environment Simulator Control System Based on Linux

Authors: Zhan Haiyang, Zhang Lei, Ning Juan

Abstract:

This paper presents a configuration software solution on Linux that is used for the control of a space environment simulator. After introducing the structure and basic principle, the development of the Qt software framework and the dynamic data exchange between the PLC and the computer are described. An OPC driver for Linux is also developed; this driver realizes many-to-many communication between hardware devices and the SCADA software. Moreover, an algorithm named "Scan PRI" is put forward, which is more optimizable and efficient than the "scan in sequence" approach used on Windows. This software has been used in a practical project, shows a good control effect and achieves the expected goal.

Keywords: Linux OS, configuration software, OPC server driver, MySQL database.

9034 Redesigning Business Processes: A Method Based on Simulation and Process Mining Techniques

Authors: Zahra Mohammadnazari, Fateme Rostambeygi, Fatemeh Dehrouyeh, Hwang Ki-Soon, Amir Aghsami

Abstract:

Corporations have always prioritized efforts to examine and improve their processes. Various metrics, such as the cost and time required to implement the process, can be specified in this regard, and process improvement can be defined as an improvement of these indicators. This is accomplished by looking at prospective adjustments to the current executive process model or to the resources allotted to it. The research in this paper aims to improve the procurement process and to explore assessment prospects in the project using a combination of process mining and simulation (benefiting from the Play-In and Play-Out methodologies). To run the simulation, the control flow diagram, institution settings, resource settings and activity settings need to be completed. Mining the event logs yields the process control flow; however, both the arrival of institutions and the distribution of resources must be modeled. The arrival rate of institutions and the distribution of activity execution times are determined in the next step.
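
A tiny sketch of the Play-In direction, deriving directly-follows relations (the raw material of a discovered control-flow model) from hypothetical procurement traces:

# Sketch: derive directly-follows relations from an event log, the raw
# material for a discovered control-flow (e.g. Petri net) model.
from collections import Counter

# Hypothetical procurement traces (one list of activities per case):
event_log = [
    ["create request", "approve request", "send order", "receive goods"],
    ["create request", "approve request", "send order", "receive invoice"],
    ["create request", "reject request"],
]

directly_follows = Counter()
for trace in event_log:
    for a, b in zip(trace, trace[1:]):
        directly_follows[(a, b)] += 1

for (a, b), count in directly_follows.most_common():
    print(f"{a} -> {b}: {count}")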

Keywords: Business reengineering, Petri net, process-based simulation, process mining.

9033 Elections Management Information Communication System Voter Ballot

Authors: Zaza Tabagari, Zaza Sanikidze, George Giorgobiani

Abstract:

The presented work deals with a new application of information and communication technologies for the improvement of the election process in a biased environment. We introduce a new concept for the construction of an information-communication system for an election participant. It consists of four main components: software, physical infrastructure, structured information and trained staff. The structured information is the basis of the whole system; it is the collection of all possible events (irregularities among them) at the polling stations, which are structured in special templates and forms and integrated in mobile devices. The software represents a package of analytic modules that operate on a dynamic database. The application of modern communication technologies facilitates the immediate exchange of information and relevant documents between the polling stations and the server of the participant. No less important is the training of the staff for the proper functioning of the system; an e-training system with various modules should be applied in this respect. The presented methodology is primarily focused on election processes in countries of emerging democracies. It can be regarded as a tool for the monitoring of the election process by political organizations and as one of the instruments to foster the spread of democracy in these countries.

Keywords: ICT, elections, structured information, dynamic databases, e-training.

9032 A Genetic Algorithm Based Classification Approach for Finding Fault Prone Classes

Authors: Parvinder S. Sandhu, Satish Kumar Dhiman, Anmol Goyal

Abstract:

Fault-proneness of a software module is the probability that the module contains faults. A correlation exists between the fault-proneness of software and measurable attributes of the code (static metrics) and of the testing (dynamic metrics). Early detection of fault-prone software components enables verification experts to concentrate their time and resources on the problem areas of the software system under development. This paper introduces Genetic Algorithm based software fault prediction models with object-oriented metrics. The contribution of this paper is that metric values of the JEdit open source software are used to generate rules for classifying software modules into the categories of faulty and non-faulty modules, and the rules are then validated empirically. The results show that the genetic algorithm approach can be used to find fault-proneness in object-oriented software components.
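
A minimal sketch of the rule-generation idea, assuming the GA evolves per-metric thresholds that separate faulty from non-faulty modules; the metric values and rule form are illustrative and not the JEdit data:

# Sketch: evolve per-metric thresholds; a module is predicted faulty when
# any metric exceeds its threshold. Data and thresholds are hypothetical.
import random

modules = [((12, 3), 0), ((45, 9), 1), ((8, 1), 0), ((60, 14), 1)]  # ((WMC, CBO), faulty)

def accuracy(thresholds):
    correct = 0
    for metrics, faulty in modules:
        predicted = int(any(m > t for m, t in zip(metrics, thresholds)))
        correct += predicted == faulty
    return correct / len(modules)

population = [(random.uniform(0, 70), random.uniform(0, 20)) for _ in range(20)]
for _ in range(100):
    parents = sorted(population, key=accuracy, reverse=True)[:10]
    population = parents + [tuple(max(0.0, g + random.gauss(0, 2)) for g in p)
                            for p in parents]

print("best threshold rule:", sorted(population, key=accuracy, reverse=True)[0])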

Keywords: Genetic Algorithms, Software Fault, Classification, Object Oriented Metrics.

9031 Development of a Simulator for Explaining Organic Chemical Reactions Based on Qualitative Process Theory

Authors: Alicia Y. C. Tang, Rukaini Hj. Abdullah, Sharifuddin M. Zain

Abstract:

This paper discusses the development of a qualitative simulator (abbreviated QRiOM) for predicting the behaviour of organic chemical reactions. The simulation technique is based on the qualitative process theory (QPT) ontology. The modelling constructs of QPT embody notions of causality that can be used to explain the behaviour of a chemical system. The major theme of this work is that, in a qualitative simulation environment, students are able to articulate their knowledge through the inspection of explanations generated by the software. The implementation languages are Java and Prolog. The software produces explanations in various forms that stress the causal theories in the chemical system, which can be effectively used to support learning.

Keywords: Chemical reactions, explanation, qualitative process theory, simulation

9030 A Growing Neural Gas Approach for Evaluating Quality of Software Modules

Authors: Parvinder S. Sandhu, Sandeep Khimta, Kiranpreet Kaur

Abstract:

Predicting software quality during the development life cycle of a software project helps the development organization make efficient use of available resources to produce a product of the highest quality. A "whether a module is faulty or not" approach can be used to predict the quality of a software module. A number of software quality prediction models described in the literature are based on genetic algorithms, artificial neural networks and other data mining algorithms. One promising approach to quality prediction is based on clustering techniques. Most quality prediction models based on clustering use the K-means, Mixture-of-Gaussians, Self-Organizing Map, Neural Gas or fuzzy K-means algorithms. All of these techniques require a predefined structure, i.e. the number of neurons or clusters must be known before the clustering process starts. In the case of Growing Neural Gas, however, there is no need to predetermine the number of neurons or the topology of the structure; it starts with a minimal neuron structure that is incremented during training until it reaches a user-defined maximum number of clusters. Hence, in this work Growing Neural Gas is used as the underlying clustering algorithm: it produces an initial set of labeled clusters from the training data set, and this set of clusters is then used to predict the quality of the software modules in the test data set. The best testing results show 80% accuracy in evaluating the quality of software modules. Hence, the proposed technique can be used by programmers to evaluate the quality of modules during software development.
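
The prediction stage can be sketched as follows, assuming a Growing Neural Gas run (not shown) has already produced a set of prototype vectors; each prototype is labelled by majority vote over the training modules it attracts, and a test module inherits the label of its nearest prototype. All numbers are illustrative:

# Sketch of the labelling/prediction stage. The prototypes are assumed to
# come from a Growing Neural Gas run (not shown); all numbers are illustrative.
from collections import Counter
import math

def nearest(prototypes, x):
    return min(range(len(prototypes)),
               key=lambda i: math.dist(prototypes[i], x))

prototypes = [(10.0, 2.0), (55.0, 12.0)]                  # assumed GNG output
train = [((12, 3), "not faulty"), ((8, 1), "not faulty"),
         ((50, 11), "faulty"), ((60, 14), "faulty")]

votes = [Counter() for _ in prototypes]
for metrics, label in train:
    votes[nearest(prototypes, metrics)][label] += 1
labels = [v.most_common(1)[0][0] for v in votes]

test_module = (48, 10)
print("predicted:", labels[nearest(prototypes, test_module)])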

Keywords: Growing Neural Gas, data clustering, fault prediction.

9029 Using Genetic Algorithm for Distributed Generation Allocation to Reduce Losses and Improve Voltage Profile

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

This paper presents a method for the optimal allocation of distributed generation in distribution systems. The aim is optimal distributed generation allocation for voltage profile improvement and loss reduction in the distribution network. A Genetic Algorithm (GA) is used as the solving tool: with reference to the two stated aims, the problem is defined and an objective function is introduced. Because of the sensitivity of fitness values in the genetic algorithm process, a load flow calculation is needed for decision-making, and the load flow algorithm is combined appropriately with the GA until acceptable results are obtained. The MATPOWER package is used for the load flow algorithm and is combined with our Genetic Algorithm. The suggested method is programmed in MATLAB, and ETAP software is used to verify the correctness of the results. It was implemented on part of the Tehran electricity distribution grid. The results of applying this method to some test systems show improvement of the voltage profile and of the loss reduction indexes.
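
A sketch of how the fitness evaluation can couple the GA with a load-flow solver is given below; run_load_flow is a hypothetical stand-in for a MATPOWER-style call and returns dummy values, so the weights and numbers are purely illustrative:

# Sketch: GA fitness for a candidate DG placement. run_load_flow is a
# hypothetical helper standing in for a MATPOWER-style solver call.

def run_load_flow(dg_buses, dg_sizes_mw):
    """Hypothetical stand-in: returns (total loss in MW, bus voltages in p.u.)."""
    return 0.12, [0.98, 1.01, 0.99, 0.97]   # dummy illustrative values

def fitness(dg_buses, dg_sizes_mw, loss_weight=1.0, voltage_weight=10.0):
    loss_mw, voltages_pu = run_load_flow(dg_buses, dg_sizes_mw)
    # Penalise both real power loss and deviation from a flat 1.0 p.u. profile.
    voltage_penalty = sum((v - 1.0) ** 2 for v in voltages_pu)
    return loss_weight * loss_mw + voltage_weight * voltage_penalty   # lower is better

print(fitness(dg_buses=[5, 12], dg_sizes_mw=[0.5, 1.0]))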

Keywords: Distributed Generation, Allocation, Voltage Profile, losses, Genetic Algorithm.

9028 Optimization of Petroleum Refinery Configuration Design with Logic Propositions

Authors: Cheng Seong Khor, Xiao Qi Yeoh

Abstract:

This work concerns the topological optimization problem for determining the optimal petroleum refinery configuration. We are interested in further investigating and hopefully advancing the existing optimization approaches and strategies employing logic propositions to conceptual process synthesis problems. In particular, we seek to contribute to this increasingly exciting area of chemical process modeling by addressing the following potentially important issues: (a) how the formulation of design specifications in a mixed-logical-and-integer optimization model can be employed in a synthesis problem to enrich the problem representation by incorporating past design experience, engineering knowledge, and heuristics; and (b) how structural specifications on the interconnectivity relationships by space (states) and by function (tasks) in a superstructure should be properly formulated within a mixed-integer linear programming (MILP) model. The proposed modeling technique is illustrated on a case study involving the alternative processing routes of naphtha, in which significant improvement in the solution quality is obtained.
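
A toy sketch of how a logic proposition ("if unit A is selected then unit B must also be selected") enters an MILP as a linear constraint on binary selection variables, written with the PuLP library; the units, profit figures and constraint are illustrative and not the refinery case study:

# Toy sketch (PuLP): encode the proposition "select A => select B" as yA <= yB.
from pulp import LpProblem, LpMaximize, LpVariable, LpBinary, lpSum, value

units = ["A", "B", "C"]
profit = {"A": 5.0, "B": -2.0, "C": 3.0}          # illustrative contributions
y = {u: LpVariable(f"select_{u}", cat=LpBinary) for u in units}

model = LpProblem("toy_refinery_configuration", LpMaximize)
model += lpSum(profit[u] * y[u] for u in units)   # objective
model += y["A"] <= y["B"]                         # logic proposition: A implies B

model.solve()
print({u: int(value(y[u])) for u in units})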

Keywords: Mixed-integer linear programming (MILP), petroleum refinery, process synthesis, superstructure.

9027 Control Improvement of a C Sugar Cane Crystallization Using an Auto-Tuning PID Controller Based on Linearization of a Neural Network

Authors: S. Beyou, B. Grondin-Perez, M. Benne, C. Damour, J.-P. Chabriat

Abstract:

The industrial process of sugar cane crystallization produces a residual that still contains a large amount of soluble sucrose, and the objective of the factory is to improve its extraction; the substantial losses justify the search for optimization of the process. The crystallization process studied at the industrial site is based on the "three massecuites process". The third step of this process constitutes the final stage of exhaustion of the sucrose dissolved in the mother liquor. During the third crystallization step (C crystallization), the phase that is studied, and whose control is to be improved, is the growing phase (crystal growth phase). The study of this process at the industrial site is a problem in its own right. A control scheme is proposed to improve the standard PID control law used in the factory: an auto-tuning PID controller based on instantaneous linearization of a neural network.
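
A rough sketch of the retuning idea: the process model is linearized around the current operating point (here by a finite difference on a stand-in model function) and the PID gains are rescaled by the resulting local gain. The model function and the gain-scheduling rule are illustrative assumptions, not the authors' scheme:

# Sketch: rescale PID gains using the local process gain obtained by
# linearizing a (neural-network) process model at the operating point.
# nn_model is a placeholder for the trained network's prediction.

def nn_model(u):
    """Hypothetical steady-state prediction of the trained neural network."""
    return 2.0 * u - 0.05 * u ** 2   # stand-in nonlinearity

def local_gain(model, u0, eps=1e-4):
    return (model(u0 + eps) - model(u0 - eps)) / (2 * eps)

def retuned_gains(kp_ref, ki_ref, kd_ref, gain_ref, gain_now):
    scale = gain_ref / gain_now       # keep the loop gain roughly constant
    return kp_ref * scale, ki_ref * scale, kd_ref * scale

k_now = local_gain(nn_model, u0=5.0)
print(retuned_gains(1.2, 0.4, 0.05, gain_ref=2.0, gain_now=k_now))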

Keywords: Auto-tuning, PID, instantaneous linearization, neural network, nonlinear process, C crystallization.

9026 Optimization of Process Parameters Affecting on Spring-Back in V-Bending Process for High Strength Low Alloy Steel HSLA 420 Using FEA (HyperForm) and Taguchi Technique

Authors: Navajyoti Panda, R. S. Pawar

Abstract:

In this study, process parameters such as punch angle, die opening, grain direction, and pre-bend condition of the strip are investigated for the deep drawing of high strength low alloy steel HSLA 420. The finite element method (FEM), in association with the Taguchi and analysis of variance (ANOVA) techniques, is used to investigate the degree of importance of the process parameters in the V-bending process for HSLA 420 and ST12 grade materials. From the results, it is observed that punch angle has a major influence on the spring-back, and die opening also plays a very significant role. On the other hand, grain direction has the least impact on spring-back; however, a strip taken from a flat sheet is less prone to spring-back than a strip taken from a sheet metal coil. HyperForm software is used for the FEM simulation and the experiments are designed using the Taguchi method. The percentage contribution of the parameters is obtained through the ANOVA technique.
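
The percentage-contribution computation can be sketched as a per-factor sum of squares over an orthogonal array; the spring-back values below are illustrative, not the HSLA 420 measurements:

# Sketch: percentage contribution of each factor from a Taguchi L4-style
# experiment (two levels per factor). All response values are illustrative.

# Columns: (punch_angle_level, die_opening_level, grain_direction_level), spring_back
runs = [((1, 1, 1), 2.1), ((1, 2, 2), 3.0), ((2, 1, 2), 4.2), ((2, 2, 1), 5.0)]

grand_mean = sum(y for _, y in runs) / len(runs)
total_ss = sum((y - grand_mean) ** 2 for _, y in runs)

for f, name in enumerate(["punch angle", "die opening", "grain direction"]):
    ss = 0.0
    for level in (1, 2):
        ys = [y for levels, y in runs if levels[f] == level]
        ss += len(ys) * (sum(ys) / len(ys) - grand_mean) ** 2
    print(f"{name}: {100 * ss / total_ss:.1f}% contribution")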

Keywords: Bending, V-bending, FEM, spring-back, Taguchi, HyperForm, profile projector, HSLA 420 & St12 materials.

9025 An Ontology Model for Systems Engineering Derived from ISO/IEC/IEEE 15288: 2015: Systems and Software Engineering - System Life Cycle Processes

Authors: Lan Yang, Kathryn Cormican, Ming Yu

Abstract:

ISO/IEC/IEEE 15288: 2015, Systems and Software Engineering - System Life Cycle Processes, is an international standard that provides generic top-level process descriptions to support systems engineering (SE). However, the processes defined in the standard need improvement in terms of integrity and consistency. The goal of this research is to address this by building an ontology model for the SE standard to manage SE knowledge. The ontology model gives a whole picture of the SE knowledge domain by building connections between SE concepts. Moreover, it creates a hierarchical classification of the concepts to fulfil different requirements for displaying and analysing SE knowledge.

Keywords: Knowledge management, model-based systems engineering, ontology modelling, systems engineering ontology.

9024 A Linear Use Case Based Software Cost Estimation Model

Authors: Hasan.O. Farahneh, Ayman A. Issa

Abstract:

Software development is moving towards agility, with use cases and scenarios being used for requirements stories. Estimates of software costs are becoming even more important than before, as the effect of delays is much larger in the context of the successive short releases of agile development. Thus, this paper reports on the development of a new linear use case based software cost estimation model that is applicable in the very early stages of software development and is based on a simple metric. Evaluation showed that the accuracy of the estimates varies between 43% and 55% of the actual effort of historical test projects. These results outperformed those of well-known models applied in the same context. Further work is being carried out to improve the performance of the proposed model by considering the effect of non-functional requirements.
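
The linear model idea can be sketched as a least-squares fit of effort against a simple use-case count; the historical data points and the new-project estimate are hypothetical:

# Sketch: fit effort = a * use_case_count + b on historical projects and
# estimate a new project. Data points are hypothetical.
import numpy as np

use_cases = np.array([12, 20, 35, 50, 8])            # historical projects
effort_ph = np.array([950, 1500, 2600, 3900, 700])   # person-hours

a, b = np.polyfit(use_cases, effort_ph, deg=1)
print(f"effort ~ {a:.1f} * use_cases + {b:.1f}")
print("estimate for 28 use cases:", round(a * 28 + b), "person-hours")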

Keywords: Metrics, Software Cost Estimation, Use Cases

9023 JREM: An Approach for Formalising Models in the Requirements Phase with JSON and NoSQL Databases

Authors: Aitana Alonso-Nogueira, Helia Estévez-Fernández, Isaías García

Abstract:

This paper presents an approach to reduce some of the current flaws in the requirements phase of the software development process. It takes the software requirements of an application, builds a conceptual model of them and formalizes the model in JSON documents. This formal model is stored in a document-oriented NoSQL database, MongoDB, chosen because of its advantages in flexibility and efficiency. In addition, the paper underlines the contributions of the detailed approach and shows some applications and benefits for future work in the field of automatic code generation using model-driven engineering tools.
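
A minimal sketch of storing one formalized requirement as a JSON document in MongoDB with pymongo; the field names, database name and connection URI are illustrative choices, not the JREM schema:

# Sketch: persist a formalized requirement as a JSON/BSON document.
# Field names, database name and the local connection URI are illustrative.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
requirements = client["jrem"]["requirements"]

requirements.insert_one({
    "id": "RQ-001",
    "type": "functional",
    "actor": "registered user",
    "description": "The user can reset the account password by e-mail.",
    "priority": "high",
    "related_entities": ["User", "PasswordResetToken"],
})

print(requirements.find_one({"id": "RQ-001"})["description"])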

Keywords: Conceptual modeling, JSON, NoSQL databases, requirements engineering, software development.

9022 Effect of Impurities in the Chlorination Process of TiO2

Authors: Seok Hong Min, Tae Kwon Ha

Abstract:

With increasing interest in Ti alloys, the process of extracting Ti from its typical ore, TiO2, has long been and will remain an important issue. As an intermediate product for the production of pigment or titanium metal sponge, titanium tetrachloride (TiCl4) is produced in a fluidized bed using high-grade TiO2 feedstock. The purity of the TiCl4 after chlorination depends on the quality of the titanium feedstock. Since the impurities in the TiCl4 report to the final products, purification of the crude TiCl4 is required. The purification process includes fractional distillation and chemical treatment, which depend on the nature of the impurities present and the required quality of the final product. In this study, a thermodynamic analysis of the impurity effect in the chlorination process, which is the first step in the extraction of Ti from TiO2, has been conducted. All thermodynamic calculations were performed using the FactSage thermodynamic software.

Keywords: Rutile, titanium, chlorination process, impurities, thermodynamic calculation, FactSage.

9021 Expert System for Sintering Process Control Based on the Information about Solid-Fuel Flow Composition

Authors: Yendiyarov Sergei, Zobnin Boris, Petrushenko Sergei

Abstract:

Usually, the solid-fuel flow of an iron ore sinter plant consists of different types of solid fuels, which differ from each other. Information about the composition of the solid-fuel flow usually arrives only every 8-24 hours, so it clearly cannot be used to control the sintering process in real time. For this reason, we propose an expert system that uses indirect measurements from the process to obtain the composition of the solid-fuel flow by solving an optimization task. This information can then be used to control the sintering process. The proposed technique can be used successfully to improve sinter quality and reduce the amount of solid fuel consumed by the process.
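
A sketch of the composition-estimation idea: the mix fractions of known fuel types are chosen so that a predicted aggregate property matches an indirectly measured value. The fuel properties and measurement are illustrative, and a gradient-based solver is used here where the paper applies particle swarm optimization:

# Sketch: estimate the mix fractions of known fuel types from an indirectly
# measured aggregate property (here a calorific value), by least squares.
# Fuel properties and the measured value are illustrative; the paper uses
# particle swarm optimization, a gradient-free alternative to SLSQP.
import numpy as np
from scipy.optimize import minimize

fuel_calorific_mj_per_kg = np.array([29.0, 24.5, 27.0])   # coke, coal A, coal B
measured_mix_value = 26.2

def mismatch(fractions):
    return (fuel_calorific_mj_per_kg @ fractions - measured_mix_value) ** 2

result = minimize(
    mismatch,
    x0=np.full(3, 1 / 3),
    bounds=[(0.0, 1.0)] * 3,
    constraints=[{"type": "eq", "fun": lambda f: f.sum() - 1.0}],
    method="SLSQP",
)
print("estimated fractions:", np.round(result.x, 3))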

Keywords: sintering process, particle swarm optimization, optimal control, expert system, solid-fuel

9020 A Comparative Analysis of Fuzzy, Neuro-Fuzzy and Fuzzy-GA Based Approaches for Software Reusability Evaluation

Authors: Parvinder Singh Sandhu, Dalwinder Singh Salaria, Hardeep Singh

Abstract:

Software reusability is a primary attribute of software quality. There are metrics for identifying the quality of reusable components, but the function that makes use of these metrics to determine the reusability of software components is still not clear. If identified in the design phase, or even in the coding phase, these metrics can help to reduce rework by improving the quality of reuse of the component and hence improve productivity through a probabilistic increase in the reuse level. In this paper, we have devised a framework of metrics that uses McCabe's Cyclomatic Complexity Measure for complexity measurement, the Regularity Metric, Halstead's Software Science Indicator for volume indication, the Reuse Frequency metric and the Coupling Metric values of a software component as input attributes, and calculates the reusability of the software component. A comparative analysis of the fuzzy, neuro-fuzzy and fuzzy-GA approaches is performed to evaluate the reusability of software components, and the fuzzy-GA results outperform the other approaches. The developed reusability model produced high-precision results, as expected by the human experts.

Keywords: Software Reusability, Software Metrics, Neural Networks, Genetic Algorithm, Fuzzy Logic.

9019 A Proposed Technique for Software Development Risks Identification by using FTA Model

Authors: Hatem A. Khater, A. Baith Mohamed, Sara M. Kamel

Abstract:

Software Development Risks Identification (SDRI) using Fault Tree Analysis (FTA) is a proposed technique to identify not only the risk factors but also the causes of the appearance of those risk factors in the software development life cycle. The method is based on analyzing the probable causes of software development failures before they become problems and adversely affect a project. It uses Fault Tree Analysis (FTA) to determine the probability of particular system-level failures, which are defined by A Taxonomy for Sources of Software Development Risk, and to deduce a failure analysis in which an undesired state of the system is reached, using Boolean logic to combine a series of lower-level events. The major purpose of this paper is to use the probabilistic calculations of the Fault Tree Analysis approach to determine all possible causes that lead to the occurrence of software development risk.
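
The probabilistic part of the approach can be sketched with the standard gate formulas, assuming independent basic events: an AND gate multiplies the child probabilities and an OR gate combines them as 1 minus the product of the complements. The tree and the probabilities below are illustrative:

# Sketch: probability of a top event from a small fault tree, assuming
# independent basic events. Tree structure and probabilities are illustrative.

def gate_probability(node):
    if "probability" in node:                     # basic event
        return node["probability"]
    child_ps = [gate_probability(c) for c in node["children"]]
    if node["gate"] == "AND":
        p = 1.0
        for cp in child_ps:
            p *= cp
        return p
    # OR gate
    p = 1.0
    for cp in child_ps:
        p *= (1.0 - cp)
    return 1.0 - p

top_event = {                                     # "project schedule slips"
    "gate": "OR",
    "children": [
        {"probability": 0.10},                    # unstable requirements
        {"gate": "AND", "children": [
            {"probability": 0.30},                # key developer leaves
            {"probability": 0.20},                # no documented design
        ]},
    ],
}
print(round(gate_probability(top_event), 4))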

Keywords: Software Development Risks Identification (SDRI), Fault Tree Analysis (FTA), Taxonomy for Software Development Risks (TSDR), Probabilistic Risk Assessment (PRA).

9018 Management Software for the Elaboration of an Electronic File in the Pharmaceutical Industry Following Mexican Regulations

Authors: M. Peña Aguilar Juan, Ríos Hernández Ezequiel, R. Valencia Luis

Abstract:

For the certification of certain goods of public interest, such as medicines and food, the preparation and delivery of a dossier is required. Its elaboration requires legal and administrative knowledge, as well as organization of the documents of the process and an order that allows the file to be verified. Therefore, a virtual platform was developed to support the process of managing and elaborating the dossier, providing access to the information and interfaces that allow the user to know the status of projects. Developing the dossier system in the cloud allows the inclusion of the technical requirements for software management, including validation and manufacturing in the field industry. The platform guides and facilitates the elaboration of the dossier (report, file or history), taking Mexican legislation and regulations into account, and it also has auxiliary tools for its management. This technological alternative provides organizational support for documents and access to the information required for the successful development of a dossier. The platform is divided into the following modules: system control, catalog, dossier and enterprise management. The modules are designed according to the structure required in a dossier in those areas. However, the structure allows for flexibility, as its goal is to become a tool that facilitates and does not obstruct processes. The architecture and development of the software allow flexibility for future expansion to other fields, which would imply feeding the system with new regulations.

Keywords: Electronic dossier, technologies for management, web software, dossier elaboration, pharmaceutical industry.

9017 Time-Cost-Quality Trade-off Software by using Simplified Genetic Algorithm for Typical Repetitive Construction Projects

Authors: Refaat H. Abd El Razek, Ahmed M. Diab, Sherif M. Hafez, Remon F. Aziz

Abstract:

Time-Cost Optimization (TCO) is one of the greatest challenges in construction project planning and control, since the optimization of either time or cost would usually be at the expense of the other. Because there is a hidden trade-off relationship between project time and cost, it may be difficult to predict whether the total cost will increase or decrease as a result of schedule compression. Recently, a third dimension, the quality of the project, has been taken into consideration in trade-off analysis, but few of the existing algorithms have been applied to a construction project with a three-dimensional trade-off analysis of time-cost-quality relationships. The objective of this paper is to present the development of a practical software system named Automatic Multi-objective Typical Construction Resource Optimization System (AMTCROS). This system incorporates the basic concepts of Line of Balance (LOB) and the Critical Path Method (CPM) in a multi-objective Genetic Algorithm (GA) model. The main objective of this system is to provide practical support for typical construction planners who need to optimize resource utilization in order to minimize project cost and duration while simultaneously maximizing quality. The application of these research developments in planning typical construction projects holds a strong promise to: 1) increase the efficiency of resource use in typical construction projects; 2) reduce the construction duration; 3) minimize construction cost (direct cost plus indirect cost); and 4) improve the quality of new construction projects. A general description of the proposed software for the Time-Cost-Quality Trade-Off (TCQTO) is presented; its main inputs and outputs are outlined, its main subroutines and inference engine are detailed, and its complexity is analyzed. In addition, the verification and complexity of the proposed software are proved and tested using a real case study.

Keywords: Project management, typical (repetitive) large scale projects, line of balance, multi-objective optimization, genetic algorithms, time-cost-quality trade-offs.

9016 Optimal Allocation of DG Units for Power Loss Reduction and Voltage Profile Improvement of Distribution Networks using PSO Algorithm

Authors: K. Varesi

Abstract:

This paper proposes a Particle Swarm Optimization (PSO) based technique for the optimal allocation of Distributed Generation (DG) units in power systems. The aim is to decide the optimal number, type, size and location of DG units for voltage profile improvement and power loss reduction in the distribution network. Two types of DG are considered, and a distribution load flow is used to calculate the exact losses. The load flow algorithm is combined appropriately with PSO until acceptable results are obtained. The suggested method is programmed in MATLAB. Test results indicate that the PSO method can obtain better results than a simple heuristic search method on the 30-bus and 33-bus radial distribution systems. It obtains the maximum loss reduction for each of the two types of optimally placed multi-DGs, and voltage profile improvement is also achieved.
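
The core PSO step used to search candidate DG sizes can be sketched as the standard velocity and position update; the inertia and acceleration coefficients are generic textbook values, and the fitness evaluation via load flow is omitted:

# Sketch: one standard PSO velocity/position update for a particle encoding
# a candidate DG size (MW) at each of two candidate buses. Coefficients are
# textbook defaults; fitness evaluation via load flow is omitted here.
import random

def pso_update(position, velocity, personal_best, global_best,
               w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 5.0)):
    new_velocity, new_position = [], []
    for x, v, pb, gb in zip(position, velocity, personal_best, global_best):
        v_next = (w * v
                  + c1 * random.random() * (pb - x)
                  + c2 * random.random() * (gb - x))
        x_next = min(bounds[1], max(bounds[0], x + v_next))
        new_velocity.append(v_next)
        new_position.append(x_next)
    return new_position, new_velocity

print(pso_update([2.0, 1.0], [0.0, 0.0], [2.5, 1.2], [3.0, 0.8]))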

Keywords: Distributed Generation (DG), Optimal Allocation, Particle Swarm Optimization (PSO), Power Loss Minimization, Voltage Profile Improvement.
