Search results for: software agents
2071 Self-Healing Phenomenon Evaluation in Cementitious Matrix with Different Water/Cement Ratios and Crack Opening Age
Authors: V. G. Cappellesso, D. M. G. da Silva, J. A. Arndt, N. dos Santos Petry, A. B. Masuero, D. C. C. Dal Molin
Abstract:
Concrete elements are subject to cracking, which can be an access point for deleterious agents that trigger pathological manifestations and reduce the service life of these structures. Finding ways to minimize or eliminate the effects of these aggressive agents' penetration, such as sealing the cracks, is a way of contributing to the durability of the structures. The cementitious self-healing phenomenon can be classified into two different processes. Autogenous self-healing is a natural process in which the cracks seal without the stimulation of external agents, that is, without different materials being added to the mixture; the autonomous self-healing phenomenon, on the other hand, depends on a specific engineered material added to the cement matrix in order to promote its recovery. This work aims to evaluate the autogenous self-healing of concretes produced with different water/cement ratios and exposed to wet/dry cycles, considering two crack opening ages, 3 days and 28 days. The self-healing phenomenon was evaluated using two techniques: crack healing measurement using ultrasonic waves and image analysis performed with an optical microscope. Both methods made it possible to observe the self-healing of the cracks. For young crack opening ages and lower water/cement ratios, the self-healing capacity is higher than for advanced crack opening ages and higher water/cement ratios. Regardless of the crack opening age, these concretes were found to stabilize the self-healing process after 80 to 90 days.
Keywords: Self-healing, autogenous, water/cement ratio, curing cycles, test methods.
2070 Load Modeling for Power Flow and Transient Stability Computer Studies at BAKHTAR Network
Authors: M. Sedighizadeh, A. Rezazadeh
Abstract:
A method has been developed for preparing load models for power flow and stability studies. The load modeling (LOADMOD) computer software transforms data on load class mix, composition, and characteristics into the form required by commonly used power flow and transient stability simulation programs. Typical default data have been developed for load composition and characteristics. This paper describes the LOADMOD software, the dynamic and static load modeling techniques used in it, and the results of initial testing on the BAKHTAR power system.
Keywords: Load Modelling, Static, Power Flow.
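As a rough illustration of the kind of transformation such load-modeling software performs, the sketch below aggregates a load class mix into the coefficients of a polynomial (ZIP) static load model. It is a minimal Python sketch, not the LOADMOD implementation; the class shares and ZIP coefficients are invented example values.

    # Hypothetical illustration: aggregate a load class mix into one ZIP model.
    class_mix = {            # fraction of the bus load contributed by each class
        "residential": 0.5,
        "commercial": 0.3,
        "industrial": 0.2,
    }
    zip_coeffs = {           # (Z, I, P) shares for each class, summing to 1.0
        "residential": (0.4, 0.3, 0.3),
        "commercial": (0.3, 0.4, 0.3),
        "industrial": (0.1, 0.2, 0.7),
    }

    def composite_zip(mix, coeffs):
        """Weight each class's ZIP shares by its fraction of the total load."""
        z = sum(mix[c] * coeffs[c][0] for c in mix)
        i = sum(mix[c] * coeffs[c][1] for c in mix)
        p = sum(mix[c] * coeffs[c][2] for c in mix)
        return z, i, p

    def load_power(p0, v_pu, z, i, p):
        """Static ZIP load model: P = P0 * (Z*V^2 + I*V + P), V in per unit."""
        return p0 * (z * v_pu ** 2 + i * v_pu + p)

    z, i, p = composite_zip(class_mix, zip_coeffs)
    print(load_power(100.0, 0.95, z, i, p))   # composite MW demand at 0.95 pu voltage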
2069 Software Maintenance Severity Prediction for Object Oriented Systems
Authors: Parvinder S. Sandhu, Roma Jaswal, Sandeep Khimta, Shailendra Singh
Abstract:
Since the majority of faults are found in only a few modules, the modules that are severely affected need to be identified so that proper maintenance can be done in time, especially for critical applications. Neural networks have already been applied in software engineering to build reliability growth models and to predict gross change or reusability metrics. Neural networks are non-linear, sophisticated modeling techniques that are able to model complex functions; they are used when the exact nature of the relationship between inputs and outputs is not known, and a key feature is that they learn this relationship through training. In the present work, various neural-network-based techniques are explored and a comparative analysis is performed for predicting the level of maintenance needed, by predicting the severity level of the faults present in NASA's public domain defect dataset. The algorithms are compared on the basis of Mean Absolute Error, Root Mean Square Error and Accuracy values. It is concluded that the Generalized Regression Neural Network is the best algorithm for classifying software components into different levels of severity of fault impact. The algorithm can be used to develop a model for identifying modules that are heavily affected by faults.
Keywords: Neural Network, Software faults, Software Metric.
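Since the abstract singles out the Generalized Regression Neural Network, the following minimal NumPy sketch shows the kernel-weighted prediction a GRNN performs. The module metrics, severity labels and smoothing parameter are invented for illustration; this is not the authors' implementation.

    import numpy as np

    def grnn_predict(X_train, y_train, x, sigma=0.5):
        """Generalized Regression NN: Gaussian-kernel weighted average of targets."""
        d2 = np.sum((X_train - x) ** 2, axis=1)          # squared distances
        w = np.exp(-d2 / (2.0 * sigma ** 2))             # pattern-layer activations
        return np.dot(w, y_train) / np.sum(w)            # summation/output layers

    # Toy module metrics (e.g. size, complexity) and fault-severity labels 1..3.
    X = np.array([[10, 2.0], [50, 8.0], [80, 15.0], [20, 3.0]], dtype=float)
    y = np.array([1, 2, 3, 1], dtype=float)

    pred = grnn_predict(X, y, np.array([60, 10.0]))
    print(round(pred))   # rounded to the nearest severity level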
2068 Case Study of the Exercise Habits and Aging Anxiety of Taiwanese Insurance Agents
Authors: W. T. Hsu, H. L. Tsai
Abstract:
The rapid aging of the population is a common trend around the world, as the progress of modern medical technology has increased average life expectancy. The global population structure has changed dramatically, and the elderly population has risen rapidly. In the face of this rapid growth, the physiological, psychological, and social problems associated with aging must be faced. This study investigates how insurance agents are actively dealing with an aging society, their own aging anxiety, and their exercise habits. Purposive sampling was used; a total of 204 respondents were surveyed and 204 valid surveys were returned, a valid return ratio of 100%. Statistical methods included descriptive statistics, t-tests, and one-way ANOVA. The results show that aging anxiety differs significantly with the insurance agents' age, seniority, and exercise habits.
Keywords: Insurance agent, aging anxiety, exercise habits, elderly.
2067 Design of Domain-Specific Software Systems with Parametric Code Templates
Authors: Kostyantyn Yermashov, Karsten Wolke, Karl Hayo Siemsen
Abstract:
Domain-specific languages describe specific solutions to problems in the application domain. Traditionally they form a solution by composing black-box abstractions together, which usually involves only shallow transformations over the target model. In this paper we argue that it is potentially powerful to operate with grey-box abstractions to build a domain-specific software system. We present parametric code templates as grey-box abstractions, together with conceptual tools to encapsulate and manipulate these templates. The manipulations introduce template merging routines and can be defined in a generic way; this involves reasoning mechanisms at the level of the code templates. We introduce the Neurath Modelling Language (NML), which operates with parametric code templates and specifies a visualisation mapping mechanism for target models. Finally we provide an example of constructing a domain-specific software system from predefined NML elements.
Keywords: software design, code templates, domain-specific languages, modelling languages, generic tools
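As a rough illustration of what a parametric ("grey-box") code template and a generic merging routine might look like, the sketch below uses Python's string.Template. The template contents and parameter names are invented for illustration and do not correspond to NML itself.

    from string import Template

    # Two hypothetical parametric templates exposing named parameters.
    entity_tpl = Template("class $name:\n    def __init__(self, $field):\n        self.$field = $field\n")
    repo_tpl   = Template("class ${name}Repository:\n    def save(self, obj): ...\n")

    def merge(templates, **params):
        """Generic merge routine: instantiate each template with shared parameters."""
        return "\n".join(t.substitute(**params) for t in templates)

    print(merge([entity_tpl, repo_tpl], name="Customer", field="email"))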
2066 Comparison of the Distillation Curve Obtained Experimentally with the Curve Extrapolated by a Commercial Simulator
Authors: Lívia B. Meirelles, Erika C. A. N. Chrisman, Flávia B. de Andrade, Lilian C. M. de Oliveira
Abstract:
True Boiling Point (TBP) distillation is one of the most common experimental techniques for determining petroleum properties. The TBP curve provides information about the behaviour of the petroleum in terms of its cuts. The experiment takes several days to perform, so faster techniques are used in which software calculates the distillation curve from limited information about the crude oil. In order to evaluate the accuracy of distillation curve prediction, eight points of the TBP curve and the specific gravity curve (348 K and 523 K) were inserted into the HYSYS Oil Manager, and the extended curve was evaluated up to 748 K. The methods were able to predict the curve with errors of 0.6%-9.2% (software versus ASTM) and 0.2%-5.1% (software versus Spaltrohr).
Keywords: Distillation curve, petroleum distillation, simulation, true boiling point curve.
2065 File Format of Flow Chart Simulation Software - CFlow
Authors: Syahanim Mohd Salleh, Zaihosnita Hood, Hairulliza Mohd Judi, Marini Abu Bakar
Abstract:
CFlow is flow chart software; it contains facilities to draw and evaluate a flow chart. Flow chart evaluation applies a simulation method to enable presentation of the work flow in a flow chart solution. Flow chart simulation in CFlow is executed by manipulating the CFlow data file, which is saved in a graphical vector format. These text-based data are organised using a data classification technique based on a library classification scheme. This paper describes the file format of the CFlow flow chart simulation software.
Keywords: CFlow, flow chart, file format.
2064 Software Reengineering Tool for Traffic Accident Data
Authors: Jagdeep Kaur, Parvinder S. Sandhu, Birinderjit Singh, Amit Verma, Sanyam Anand
Abstract:
In today's fast-paced world, where everyone is short of time and works haphazardly, a similar scene is common on the roads. To mitigate the fatal consequences of speeding traffic on busy lanes, software to analyse and keep account of traffic and the resulting congestion is used in developed countries. This software has been implemented and used with the help of a support tool called the Critical Analysis Reporting Environment, of which two versions exist. This paper examines the issues and problems encountered while using these two versions in practice. Further, a hybrid architecture is proposed that retains the quality and performance of both and is better in terms of coupling of components, maintenance and many other features.
Keywords: Critical Analysis Reporting Environment, coupling, hybrid architecture.
2063 A Subtractive Clustering Based Approach for Early Prediction of Fault Proneness in Software Modules
Authors: Ramandeep S. Sidhu, Sunil Khullar, Parvinder S. Sandhu, R. P. S. Bedi, Kiranbir Kaur
Abstract:
In this paper, a subtractive clustering based fuzzy inference system approach is used for early detection of faults in function-oriented software systems. The approach has been tested with real defect datasets of the NASA software projects PC1 and CM1. Both the code-based model and a joined model (a combination of the requirement-based and code-based metrics) of the datasets are used for training and testing of the proposed approach. The performance of the models is recorded in terms of Accuracy, MAE and RMSE values, and is better in the case of the joined model. As evidenced by the results obtained, it can be concluded that clustering and fuzzy logic together provide a simple yet powerful means of modelling the early detection of faults in function-oriented software systems.
Keywords: Subtractive clustering, fuzzy inference system, fault proneness.
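A minimal sketch of the subtractive clustering step is shown below, using the standard potential-function formulation on made-up metric data (NumPy only; it is not the authors' PC1/CM1 pipeline, and the radii are arbitrary). The resulting cluster centres would seed the rules of a fuzzy inference system.

    import numpy as np

    def subtractive_clustering(X, ra=0.5, rb=0.75, eps=0.15):
        """Chiu-style subtractive clustering: repeatedly pick the point with the
        highest potential and subtract its influence from the others."""
        alpha, beta = 4.0 / ra ** 2, 4.0 / rb ** 2
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
        potential = np.exp(-alpha * d2).sum(axis=1)
        centres, first = [], potential.max()
        while potential.max() > eps * first:
            c = potential.argmax()
            centres.append(X[c])
            potential -= potential[c] * np.exp(-beta * np.sum((X - X[c]) ** 2, axis=1))
        return np.array(centres)

    # Toy normalised module metrics (two features per module).
    X = np.array([[0.1, 0.2], [0.15, 0.22], [0.8, 0.9], [0.78, 0.85], [0.5, 0.1]])
    print(subtractive_clustering(X))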
2062 Web Application for University Internship Program Management
Authors: Prasanth Sabarish Nair, Thomas Binu, Madiajagan Muthaiyan
Abstract:
This paper discusses a software application to aid the smooth functioning of a university internship program, including student, faculty and administration modules. The software can also calculate the most suitable assignment of students to stations and allocate them accordingly.
Keywords: Academic evaluation, administration monitoring, automatic allocation system, internship, student preferences.
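The abstract does not specify the allocation algorithm, but one common way to compute an optimal student-to-station assignment from preference scores is the Hungarian method; the sketch below (an assumed approach for illustration, using SciPy, with invented preference ranks) minimises the total rank.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Hypothetical preference ranks: rows = students, columns = stations (1 = best).
    ranks = np.array([
        [1, 3, 2],
        [2, 1, 3],
        [3, 2, 1],
    ])

    rows, cols = linear_sum_assignment(ranks)   # minimise the summed ranks
    for student, station in zip(rows, cols):
        print(f"student {student} -> station {station}")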
2061 Manual Testing of Web Software Systems Supported by Direct Guidance of the Tester Based On Design Model
Authors: Karel Frajtak, Miroslav Bures, Ivan Jelinek
Abstract:
Software testing is an important stage of the software development cycle. The current testing process involves the tester and electronic documents with test case scenarios. In this paper we focus on a new approach to the testing process using automated test case generation and guidance of the tester through the system, based on a model of the system. Test case generation and model-based testing are not possible without a proper system model. We aim to provide better feedback from the testing process, thus eliminating unnecessary paper work.
Keywords: Model based testing, test automation, test generating, tester support.
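A toy illustration of model-based test case generation (not the authors' tool) is given below: the web system is modelled as a page-navigation graph, and test cases are generated as page sequences that cover every transition. The page names are invented.

    # Minimal sketch: derive test paths that cover every transition of a page model.
    model = {
        "login": ["dashboard"],
        "dashboard": ["profile", "orders"],
        "profile": ["dashboard"],
        "orders": ["login"],
    }

    def generate_test_cases(model, start="login"):
        """Depth-first walk emitting one test case (page sequence) per new transition."""
        cases, visited = [], set()
        def walk(page, path):
            for nxt in model.get(page, []):
                edge = (page, nxt)
                if edge in visited:
                    continue
                visited.add(edge)
                cases.append(path + [nxt])
                walk(nxt, path + [nxt])
        walk(start, [start])
        return cases

    for case in generate_test_cases(model):
        print(" -> ".join(case))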
2060 Increasing Profitability Supported by Innovative Methods and Designing Monitoring Software in Condition-Based Maintenance: A Case Study
Authors: Nasrin Farajiparvar
Abstract:
In the present article, a new method has been developed to enhance the application of equipment monitoring, which in turn improves the economic impact of condition-based maintenance in an automobile parts manufacturing factory. This study also describes how effective software with a simple database can be utilized to achieve cost-effective improvements in maintenance performance. The most important results of this project are: 1. a 63% reduction in direct and indirect maintenance costs; 2. creation of a proper database to analyse failures; 3. creation of a method to control system performance and extend it to similar systems; 4. design of software to analyse the database and consequently build the technical knowledge needed to face unusual conditions of the system. Moreover, the results of this study have shown that the concept and philosophy of maintenance have not been well understood in most Iranian industries. Thus, more investment is strongly required to improve maintenance conditions.
Keywords: Condition-based maintenance, Economic savings, Iran industries, Machine life prediction software.
2059 Design, Implementation and Testing of Mobile Agent Protection Mechanism for MANETS
Authors: Khaled E. A. Negm
Abstract:
In the current research, we present an operational framework and protection mechanism that provide a secure environment to protect mobile agents against tampering. The system depends on the presence of an authentication authority. The advantage of the proposed system is that security measures are an integral part of the design, so common security retrofitting problems do not arise. This is due to the use of the ElGamal encryption mechanism to protect the agent's confidential content and any data collected by the agent from the visited hosts, so that eavesdropping on information from the agent can no longer reveal any confidential information. The inherent security constraints within the framework also allow the system to operate as an intrusion detection system for any mobile agent environment. The mechanism was tested against most of the well-known severe attacks on agents and networked systems. The scheme showed promising performance, which makes it highly recommended for the types of transactions that need highly secure environments, e.g., business to business.
Keywords: Mobile agent security, mobile accesses, agent encryption.
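For readers unfamiliar with the underlying primitive, a textbook ElGamal sketch over a small prime is shown below. The toy parameters and key sizes are for illustration only and are nowhere near what a real agent platform would need; this is not the paper's protection mechanism.

    import random

    # Toy ElGamal over a small prime; real deployments use large safe primes.
    p, g = 467, 2                      # public group parameters
    x = random.randrange(2, p - 1)     # authority's private key
    h = pow(g, x, p)                   # corresponding public key

    def encrypt(m, h, p=p, g=g):
        k = random.randrange(2, p - 1)            # fresh ephemeral key per message
        return pow(g, k, p), (m * pow(h, k, p)) % p

    def decrypt(c1, c2, x, p=p):
        s = pow(c1, x, p)                         # shared secret
        return (c2 * pow(s, p - 2, p)) % p        # divide by s via Fermat inverse

    c1, c2 = encrypt(123, h)                      # e.g. data collected by the agent
    print(decrypt(c1, c2, x))                     # -> 123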
2058 Software-Defined Radio Based Channel Measurement System of Wideband HF Communication System in Low-Latitude Region
Authors: P. H. Mukti, I. Kurniawati, F. Oktaviansyah, A. D. Adhitya, N. Rachmadani, R. Corputty, G. Hendrantoro, T. Fukusako
Abstract:
HF communication is an attractive field for many researchers since it can reach long-distance areas at low cost. This long-distance communication is achieved by exploiting the ionosphere as a transmission medium for the HF radio wave. However, due to the dynamic nature of the ionosphere, the channel characteristics of HF communication have to be investigated in order to obtain better performance. Many techniques for characterizing the HF channel are available in the literature, but none of them describe the HF channel characteristics in low-latitude regions, especially equatorial areas. Since the ionosphere around the equatorial region exhibits the equatorial spread F (ESF) phenomenon, characterizing the wideband HF channel in the low-latitude region becomes an important investigation. On the other hand, the appearance of software-defined radio has attracted the interest of many researchers. Accordingly, in this paper an SDR-based channel measurement system is proposed for characterizing the HF channel in the low-latitude region.
Keywords: Channel Characteristic, HF Communication System, LabVIEW, Software-Defined Radio, Universal Software Radio Peripheral.
2057 JREM: An Approach for Formalising Models in the Requirements Phase with JSON and NoSQL Databases
Authors: Aitana Alonso-Nogueira, Helia Estévez-Fernández, Isaías García
Abstract:
This paper presents an approach to reducing some of the current flaws of the requirements phase of the software development process. It takes the software requirements of an application, builds a conceptual model of them and formalizes it within JSON documents. This formal model is lodged in a document-oriented NoSQL database, MongoDB, because of its advantages in flexibility and efficiency. In addition, the paper underlines the contributions of the detailed approach and shows some applications and benefits for future work in the field of automatic code generation using model-driven engineering tools.
Keywords: Conceptual modeling, JSON, NoSQL databases, requirements engineering, software development.
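As a rough sketch of what formalising a requirement as a JSON document and lodging it in MongoDB could look like, the snippet below uses pymongo against a local MongoDB instance. The database name and field names are illustrative assumptions, not the JREM schema.

    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017/")
    reqs = client["jrem_demo"]["requirements"]          # hypothetical database/collection

    requirement = {                                      # illustrative schema, not JREM's
        "id": "REQ-001",
        "type": "functional",
        "description": "The system shall let a user reset a forgotten password.",
        "actors": ["user", "mail service"],
        "depends_on": [],
        "priority": "high",
    }

    reqs.insert_one(requirement)
    print(reqs.find_one({"id": "REQ-001"})["description"])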
2056 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data
Authors: Ruchika Malhotra, Megha Khanna
Abstract:
The development of change prediction models can help software practitioners in planning testing and inspection resources at early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of the software quality data. A dataset with very few instances of the minority outcome categories leads to an inefficient learning process, and a classification model developed from the imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change prone whereas a majority of classes may be non-change prone. This study explores various alternatives for adeptly handling the imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics while dealing with the imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android dataset and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
Keywords: Change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics.
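One simple resampling alternative of the kind the study compares is random oversampling of the minority (change-prone) class before training. The sketch below uses scikit-learn on synthetic data purely for illustration; it is not the authors' experimental setup.

    import numpy as np
    from sklearn.utils import resample
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                        # synthetic OO metrics
    y = (rng.random(200) < 0.1).astype(int)              # ~10% change-prone classes

    # Oversample the minority class until both classes are the same size.
    X_min, X_maj = X[y == 1], X[y == 0]
    X_up = resample(X_min, replace=True, n_samples=len(X_maj), random_state=0)
    X_bal = np.vstack([X_maj, X_up])
    y_bal = np.hstack([np.zeros(len(X_maj)), np.ones(len(X_up))])

    clf = LogisticRegression().fit(X_bal, y_bal)
    print(clf.predict_proba(X[:3])[:, 1])                # change-proneness scores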
2055 A Model for Collaborative COTS Software Acquisition (COSA)
Authors: Torsti Rantapuska, Sariseelia Sore
Abstract:
Acquiring commercial off-the-shelf (COTS) software applications is becoming routine in organizations. However, eliciting user requirements, finding candidate COTS products and making the decision are complex tasks, especially for SMEs that do not have the time and knowledge needed to do them properly. The existing models intended to help decision makers were originally designed for professional use, so SMEs are obligated to rely on the software vendor's ability to solve the problem with the systems provided. In this paper, we develop a model for SMEs for the acquisition of COTS software products. A leading idea of the model is that the ICT investment is basically a change initiative and should therefore also be taken as a process of organizational learning. The model is designed bearing three objectives in mind: 1) business orientation, 2) agility, and 3) learning and knowledge management orientation. The model can be applied to ICT investments in SMEs that have a professional team leader with basic business and IT knowledge.
Keywords: COTS acquisition, ICT investment, organizational learning, ICT adoption.
2054 Bug Localization on Single-Line Bugs of Apache Commons Math Library
Authors: Cherry Oo, Hnin Min Oo
Abstract:
Software bug localization is one of the most costly tasks in program repair. Therefore, there is high demand for automated bug localization techniques that can guide programmers to the locations of bugs with little human intervention. Spectrum-based bug localization aims to help software developers discover bugs rapidly by investigating abstractions of the program traces to produce a ranked list of the most likely buggy modules. Using the Apache Commons Math library project, we study the diagnostic accuracy of our spectrum-based bug localization metric. Our results show that a specific similarity coefficient, used to inspect the program spectra, is particularly effective at localizing single-line bugs.
Keywords: Software testing, fault localization, program spectra.
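The abstract does not name the similarity coefficient, but a commonly used one in spectrum-based fault localization is the Ochiai coefficient; the sketch below ranks statements by it over toy spectra. The statement names and counts are invented for illustration.

    import math

    # Toy program spectra: for each statement, (failed tests covering it,
    # passed tests covering it); total_failed is the number of failing tests.
    spectra = {"stmt_10": (4, 1), "stmt_11": (1, 6), "stmt_12": (4, 4)}
    total_failed = 4

    def ochiai(ef, ep, total_failed):
        """Ochiai suspiciousness: ef / sqrt(total_failed * (ef + ep))."""
        denom = math.sqrt(total_failed * (ef + ep))
        return ef / denom if denom else 0.0

    ranking = sorted(spectra, key=lambda s: ochiai(*spectra[s], total_failed), reverse=True)
    print(ranking)   # most suspicious statement first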
2053 Impact Analysis Based on Change Requirement Traceability in Object Oriented Software Systems
Authors: Sunil Tumkur Dakshinamurthy, Mamootil Zachariah Kurian
Abstract:
Change requirement traceability in object-oriented software systems is one of the challenging areas of research. The trace links between different artifacts need to be automated or semi-automated across the software development life cycle (SDLC). The aim of this paper is to discuss and implement aspects of dynamically linking artifacts such as requirements, high-level design, code and test cases through the Extensible Markup Language (XML), or by dynamically generating object-oriented (OO) metrics. Non-functional requirement (NFR) aspects such as stability, completeness, clarity, validity, feasibility and precision are also discussed. We discuss this as a fifth taxonomy, which is a system vulnerability concern.
Keywords: Artifacts, NFRs, OO metrics, SDLC, XML.
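A toy example of how artifacts might be linked through XML for impact analysis is given below, using Python's standard library. The element and attribute names are assumptions for illustration, not the authors' schema.

    import xml.etree.ElementTree as ET

    # Hypothetical trace links from a requirement to design, code and test artifacts.
    trace = ET.Element("trace", requirement="REQ-042")
    ET.SubElement(trace, "design", id="UC-7", artifact="checkout.uml")
    ET.SubElement(trace, "code", id="CartService.java", method="applyDiscount")
    ET.SubElement(trace, "test", id="CartServiceTest.java", case="testDiscountApplied")

    print(ET.tostring(trace, encoding="unicode"))

    # Impact analysis: when REQ-042 changes, list every linked artifact.
    impacted = [child.get("id") for child in trace]
    print(impacted)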
2052 Evaluation of Model Evaluation Criterion for Software Development Effort Estimation
Authors: S. K. Pillai, M. K. Jeyakumar
Abstract:
Estimation of model parameters is necessary to predict the behavior of a system. Model parameters are estimated using optimization criteria, and most algorithms use historical data to estimate them: the known (actual) target values and the output produced by the model are compared, and the differences between the two form the basis for estimating the parameters. To compare different models developed using the same data, different criteria are used. Data obtained from small-scale projects are used here. We consider the software effort estimation problem using a radial basis function network. The accuracy comparison is made using various existing criteria for one and two predictors. We then propose a new evaluation criterion based on linear least squares and compare the results for one and two predictors. We have also considered another dataset and evaluated prediction accuracy using the new criterion. The new criterion is easier to comprehend than a single statistic. Although software effort estimation is considered here, the method is applicable to any modeling and prediction problem.
Keywords: Software effort estimation, accuracy, Radial Basis Function, linear least squares.
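A minimal RBF-network sketch for effort estimation is shown below (NumPy only, with invented project data). Picking a few training points as centres and fitting the output weights by linear least squares is one common way to train such a network, not necessarily the authors' exact procedure.

    import numpy as np

    # Toy data: one predictor (size in KLOC) and actual effort (person-months).
    X = np.array([[2.0], [5.0], [8.0], [12.0], [20.0]])
    y = np.array([6.0, 14.0, 25.0, 40.0, 70.0])

    centres = X[[0, 2, 4]]                      # pick a few training points as centres
    sigma = 5.0

    def design_matrix(X, centres, sigma):
        """Gaussian RBF activations for every (sample, centre) pair."""
        d2 = np.sum((X[:, None, :] - centres[None, :, :]) ** 2, axis=2)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    Phi = design_matrix(X, centres, sigma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # output weights by linear least squares
    print(design_matrix(np.array([[10.0]]), centres, sigma) @ w)  # predicted effort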
2051 The Impacts of Local Decision Making on Customisation Process Speed across Distributed Boundaries: A Case Study
Authors: A. M. Qahtani, G. B. Wills, A. M. Gravell
Abstract:
Communicating and managing customers' requirements plays a vital role in the software development process. While this is difficult to do locally, it is even more difficult to communicate requirements over distributed boundaries and to convey them to multiple distributed customers. This paper discusses the communication of multiple distributed customers' requirements in the context of customised software products. The main purpose is to understand the challenges of communicating and managing customisation requirements across distributed boundaries. We propose a model for Communicating Customisation Requirements of Multi-Clients in a Distributed Domain (CCRD). Thereafter, we evaluate the model by presenting the findings of a case study conducted with a company running customisation projects for 18 distributed customers. Then, we compare the outputs of the real case process with the outputs of the CCRD model using simulation methods. Our conjecture is that the CCRD model can reduce the challenge of communicating requirements over distributed organisational boundaries, as well as the delay in decision making and in the overall customisation process time.
Keywords: Customisation Software Products, Global Software Engineering, Local Decision Making, Requirement Engineering, Simulation Model.
2050 A Utilitarian Approach to Modeling Information Flows in Social Networks
Authors: Usha Sridhar, Sridhar Mandyam
Abstract:
We propose a multi-agent based utilitarian approach to model and understand information flows in social networks that lead to Pareto optimal informational exchanges. We model the individual expected utility function of the agents to reflect the net value of information received. We show how this model, adapted from a theorem by Karl Borch dealing with an actuarial Risk Exchange concept in the Insurance industry, can be used for social network analysis. We develop a utilitarian framework that allows us to interpret Pareto optimal exchanges of value as potential information flows, while achieving a maximization of a sum of expected utilities of information of the group of agents. We examine some interesting conditions on the utility function under which the flows are optimal. We illustrate the promise of this new approach to attach economic value to information in networks with a synthetic example.
Keywords: Borch's Theorem, Economic value of information, Information Exchange, Pareto Optimal Solution, Social Networks, Utility Functions.
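For readers unfamiliar with the Borch-style result being adapted, the classical risk-exchange theorem says that with exponential utilities the Pareto optimal exchange gives each agent a share of the pooled total proportional to its risk tolerance. The tiny sketch below illustrates only that actuarial building block on made-up numbers; it is not the authors' information-flow model.

    # Pareto optimal risk exchange with exponential utilities (Borch):
    # each agent carries a fixed fraction tolerance_i / sum(tolerances)
    # of the pooled outcome, regardless of who originally held which risk.
    tolerances = {"agent_a": 2.0, "agent_b": 1.0, "agent_c": 3.0}   # made-up values
    pooled_outcome = 120.0                                          # total value to share

    total = sum(tolerances.values())
    shares = {a: t / total * pooled_outcome for a, t in tolerances.items()}
    print(shares)   # {'agent_a': 40.0, 'agent_b': 20.0, 'agent_c': 60.0}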
2049 Implementation of an On-Line PD Measurement System Using HFCT
Authors: F. Haghjoo, M. Sarlak, S.M. Shahrtash
Abstract:
In order to perform on-line measurement and detection of PD signals, a total solution comprising an HFCT, an A/D converter and a complete software package is proposed. The software package includes compensation of the HFCT contribution, filtering and noise reduction using the wavelet transform, and soft calibration routines. The results have shown good performance and high accuracy.
Keywords: Partial Discharge, Measurement, On-line, HFCT.
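As an illustration of the wavelet-based noise reduction step, the sketch below denoises a synthetic PD-like pulse with PyWavelets. The wavelet, decomposition level and universal soft threshold are common defaults, not necessarily those used in the measurement system.

    import numpy as np
    import pywt

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 1024)
    pd_pulse = np.exp(-((t - 0.5) ** 2) / 1e-4) * np.sin(2 * np.pi * 200 * t)
    noisy = pd_pulse + 0.2 * rng.normal(size=t.size)      # synthetic noisy PD signal

    coeffs = pywt.wavedec(noisy, "db4", level=5)          # multilevel decomposition
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate, finest level
    thr = sigma * np.sqrt(2 * np.log(noisy.size))         # universal threshold
    denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(denoised_coeffs, "db4")

    print(float(np.std(noisy - pd_pulse)), float(np.std(denoised[:t.size] - pd_pulse)))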
2048 A Comparative Performance Evaluation Model of Mobile Agent Versus Remote Method Invocation for Information Retrieval
Authors: Yousry El-Gamal, Khalid El-Gazzar, Magdy Saeb
Abstract:
The development of distributed systems has been affected by the need to accommodate an increasing degree of flexibility, adaptability, and autonomy. Mobile agent technology is emerging as an alternative for building a smart generation of highly distributed systems. In this work, we investigate the performance aspect of agent-based technologies for information retrieval. We present a comparative performance evaluation model of Mobile Agents versus Remote Method Invocation by means of an analytical approach. We demonstrate the effectiveness of mobile agents for dynamic code deployment and remote data processing in reducing total latency while producing minimal network traffic. We argue that exploiting agent-based technologies significantly enhances the performance of distributed systems in the domain of information retrieval.
Keywords: Mobile Agent, performance evaluation, RMI, information retrieval, distributed systems, database.
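A toy version of this kind of analytical comparison contrasts n remote invocations over the network with shipping an agent once and filtering the data at its source. The parameters below are invented; this is a back-of-the-envelope sketch rather than the authors' model.

    def rmi_latency(n_requests, rtt, reply_bytes, bandwidth):
        """n request/reply round trips, each shipping the full reply back."""
        return n_requests * (rtt + reply_bytes / bandwidth)

    def agent_latency(agent_bytes, result_bytes, bandwidth, rtt, remote_proc):
        """Ship the agent once, process remotely, return only the filtered result."""
        return rtt + agent_bytes / bandwidth + remote_proc + result_bytes / bandwidth

    # Made-up numbers: 50 queries, 80 ms RTT, 1 Mbit/s link (125000 bytes/s).
    print(rmi_latency(50, 0.08, 20_000, 125_000))            # seconds, RMI
    print(agent_latency(50_000, 5_000, 125_000, 0.08, 0.5))  # seconds, mobile agent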
2047 Implementation and Demonstration of Software-Defined Traffic Grooming
Authors: Lei Guo, Xu Zhang, Weigang Hou
Abstract:
Since the traditional network is closed and has no architecture for creating applications, it has been unable to evolve with changing demands under rapid innovation in services. Additionally, due to the lack of a whole-network profile, quality of service cannot be well guaranteed in the traditional network. The Software Defined Network (SDN) utilizes global resources to support on-demand applications/services via open, standardized and programmable interfaces. In this paper, we implement a traffic grooming application in a real SDN environment and analyse it. In our SDN: 1) we use the OpenFlow protocol to control the entire network through software applications running on the network operating system; 2) several virtual switches are combined into the data forwarding plane through Open vSwitch; 3) an OpenFlow controller, NOX, serves as a logically centralized control plane that dynamically configures the data forwarding plane; 4) traffic grooming based on SDN is demonstrated by dynamically modifying the idle time of flow entries. The experimental results demonstrate that the SDN-based traffic grooming effectively reduces the end-to-end delay, with an improvement ratio reaching 99%.
Keywords: NOX, OpenFlow, software defined network, traffic grooming.
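The flow-entry manipulation at the heart of the demonstration can be pictured with a small simulation of idle timeouts. This is a conceptual Python sketch only, with invented flow matches; the actual work drives NOX and Open vSwitch through OpenFlow rather than a toy table like this.

    # Toy flow table: entries expire when idle longer than their idle_timeout,
    # which is the knob a grooming application could tune dynamically.
    flow_table = {
        ("10.0.0.1", "10.0.0.2"): {"idle_timeout": 10, "last_packet": 0.0},
        ("10.0.0.3", "10.0.0.4"): {"idle_timeout": 60, "last_packet": 0.0},
    }

    def expire_idle_flows(flow_table, now):
        """Drop entries whose idle time exceeds their (tunable) idle_timeout."""
        return {k: v for k, v in flow_table.items()
                if now - v["last_packet"] <= v["idle_timeout"]}

    def set_idle_timeout(flow_table, match, seconds):
        """Grooming decision: keep a lightly used flow alive longer (or shorter)."""
        flow_table[match]["idle_timeout"] = seconds

    set_idle_timeout(flow_table, ("10.0.0.1", "10.0.0.2"), 120)
    print(expire_idle_flows(flow_table, now=30.0))   # both entries survive at t=30s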
2046 Automated Java Testing: JUnit versus AspectJ
Authors: Manish Jain, Dinesh Gopalani
Abstract:
Mankind's growing dependency on software technology increases the need for thorough testing of software applications and for automated testing techniques that support testing activities. We outline our testing strategy for performing various types of automated testing of Java applications using AspectJ, which has become the de facto standard for Aspect-Oriented Programming (AOP). Likewise, JUnit, a unit testing framework, is the most popular Java testing tool. In this paper, we evaluate our proposed AOP approach for automated testing against JUnit on various parameters. First we describe the similarities between the two approaches, and then we present a detailed comparison of the two testing techniques on factors such as lines of testing code, learning curve, and testing of private members. We establish that our AOP testing approach using AspectJ has several advantages and is thus particularly more effective than JUnit.
Keywords: Aspect oriented programming, AspectJ, Aspects, JUnit, software testing.
2045 Studying on ARINC653 Partition Run-time Scheduling and Simulation
Authors: Dongliang Wang, Jun Han, Dianfu Ma, Xianqi Zhao
Abstract:
Avionics software is safety-critical embedded software, and its architecture is evolving from traditional federated architectures to Integrated Modular Avionics (IMA) to improve resource usability. ARINC 653 (Avionics Application Standard Software Interface) is a software specification for space and time partitioning in safety-critical avionics real-time operating systems. ARINC 653 uses two-level scheduling strategies, but current modeling tools only apply to simple ARINC 653 two-level scheduling problems that involve only timing properties. In the avionics industry, tasks are still allocated and the timing table of a real-time system calculated manually to ensure it runs as designed. In this paper we present an automatic generation strategy that applies to the two scheduling problems with dependency constraints in the ARINC 653 partition run-time environment. It provides automatic generation from the task and partition models to a scheduling policy by allocating the tasks to the partitions while following the constraints, and we then design a simulation mechanism to check whether our policy is schedulable.
Keywords: ARINC 653, scheduling, task allocation, simulation.
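A highly simplified sketch of the two-level idea (partitions scheduled cyclically inside a major frame, tasks scheduled inside their partition's window) is given below. The partitions, windows and tasks are invented, and real ARINC 653 configuration involves far more constraints than this placement check.

    # Level 1: fixed cyclic partition windows inside one major frame (ms).
    major_frame = 100
    partition_windows = [("P1", 0, 40), ("P2", 40, 70), ("P3", 70, 100)]

    # Level 2: tasks per partition as (name, execution time in ms), placed in order.
    tasks = {
        "P1": [("nav_update", 15), ("nav_log", 10)],
        "P2": [("display_refresh", 25)],
        "P3": [("health_monitor", 20)],
    }

    def build_timetable(windows, tasks):
        """Place each partition's tasks back-to-back inside its window, if they fit."""
        timetable = []
        for part, start, end in windows:
            cursor = start
            for name, wcet in tasks.get(part, []):
                if cursor + wcet > end:
                    raise ValueError(f"{part} window overflows at task {name}")
                timetable.append((part, name, cursor, cursor + wcet))
                cursor += wcet
        return timetable

    for row in build_timetable(partition_windows, tasks):
        print(row)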
2044 Analysis of the Impact of NVivo and EndNote on Academic Research Productivity
Authors: Sujit K. Basak
Abstract:
The aim of this paper is to analyze the impact of literature review software on researchers. This aim was achieved by analyzing models in terms of perceived usefulness, perceived ease of use, and acceptance level. The collected data were analyzed using WarpPLS 4.0 software. The study used two theoretical frameworks, namely the Technology Acceptance Model and the Training Needs Assessment Model. The study was experimental and was conducted at a public university in South Africa. The results showed that acceptance level has the highest impact on research productivity, followed by perceived usefulness and perceived ease of use.
Keywords: Technology acceptance model, training needs assessment model, literature review software, research productivity.
2043 Moving From Problem Space to Solution Space
Authors: Bilal Saeed Raja, M. Ali Iqbal, Imran Ihsan
Abstract:
Extracting and elaborating software requirements and transforming them into a viable software architecture is still an intricate task. This paper defines a solution architecture that is based on a blurred amalgamation of the problem space and the solution space. The dependencies between domain constraints, requirements and architecture, and their importance, are described; these have to be considered collectively while evolving from the problem space to the solution space. The paper proposes a revised version of the Twin Peaks Model, named the Win Peaks Model, that reconciles software requirements and architecture in a more consistent and adaptable manner. Further, conflicts between stakeholders' win-requirements are resolved by the proposed voting methodology, which is a simple adaptation of the win-win requirements negotiation model and QARCC.
Keywords: Functional Requirements, Non Functional Requirements, Twin Peaks Model, QARCC.
2042 Underlying Cognitive Complexity Measure Computation with Combinatorial Rules
Authors: Benjapol Auprasert, Yachai Limpiyakorn
Abstract:
Measuring the complexity of software has been an insoluble problem in software engineering. Complexity measures can be used to predict critical information about the testability, reliability, and maintainability of software systems from automatic analysis of the source code. During the past few years, many complexity measures have been invented based on the emerging Cognitive Informatics discipline. These software complexity measures, including cognitive functional size, lend themselves to the approach of totalling the cognitive weights of basic control structures such as loops and branches. This paper shows that the existing calculation method can generate different results that are algebraically equivalent. However, analysis of the combinatorial meaning of this calculation method reveals significant flaws of the measure, which also explain why it does not satisfy Weyuker's properties. Based on the findings, improvement directions, such as measure fusion and a cumulative variable counting scheme, are suggested to enhance the effectiveness of cognitive complexity measures.
Keywords: Cognitive Complexity Measure, Cognitive Weight of Basic Control Structure, Counting Rules, Cumulative Variable Counting Scheme.
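As background for the calculation being analysed, cognitive-weight measures assign a weight to each basic control structure (in Wang's scheme, e.g., sequence 1, branch 2, iteration 3, call 2) and combine them by adding the weights of sequential blocks and multiplying across nesting levels. The recursive sketch below illustrates that combination rule on a toy structure tree; it shows the general scheme, not the paper's revised counting rules.

    # Cognitive weights of basic control structures (Wang's scheme, abridged).
    WEIGHTS = {"sequence": 1, "branch": 2, "iteration": 3, "call": 2}

    def cognitive_weight(node):
        """Sequential children add; a structure's weight multiplies what it nests."""
        kind, children = node
        inner = sum(cognitive_weight(c) for c in children) if children else 1
        return WEIGHTS[kind] * inner

    # Toy structure tree: an iteration nesting a branch and a call,
    # followed by a plain sequence.
    program = ("sequence", [
        ("iteration", [("branch", []), ("call", [])]),
        ("sequence", []),
    ])
    print(cognitive_weight(program))   # -> 13 = 3*(2+2) + 1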