Search results for: Software Estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3037

2707 Joint Adaptive Block Matching Search (JABMS) Algorithm

Authors: V. K. Ananthashayana, Pushpa M. K.

Abstract:

In this paper, a new Joint Adaptive Block Matching Search (JABMS) algorithm is proposed to generate motion vectors and search for the best-matching macroblock by classifying motion vector movement based on prediction error. The Diamond Search (DS) algorithm achieves high estimation accuracy when the motion vector is small, while the Adaptive Rood Pattern Search (ARPS) algorithm can handle large motion vectors but is less accurate. The proposed JABMS algorithm, which can handle both small and large motions, gives improved estimation accuracy and reduces computational cost by a factor of 15.2 compared with the Exhaustive Search (ES) algorithm and by a factor of 1.3 compared with the Diamond Search algorithm.
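
As a rough illustration of the block matching machinery discussed above (not the authors' JABMS implementation), the sketch below scores candidate displacements with the sum of absolute differences (SAD) and refines the motion vector with a simplified diamond search; the frames, block size and search range are assumed placeholders.

```python
import numpy as np

def sad(ref_block, cand_block):
    # Sum of absolute differences: the usual block-matching cost.
    return np.abs(ref_block.astype(int) - cand_block.astype(int)).sum()

def diamond_search(ref, cur, x, y, bs=16, max_range=7):
    """Estimate the motion vector of the bs x bs block at (y, x) in `cur`
    by a (simplified) diamond search over the reference frame `ref`."""
    large = [(0, 0), (0, 2), (0, -2), (2, 0), (-2, 0),
             (1, 1), (1, -1), (-1, 1), (-1, -1)]
    small = [(0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)]
    cy, cx = 0, 0                      # current motion vector candidate
    block = cur[y:y + bs, x:x + bs]

    def cost(dy, dx):
        yy, xx = y + cy + dy, x + cx + dx
        if yy < 0 or xx < 0 or yy + bs > ref.shape[0] or xx + bs > ref.shape[1]:
            return np.inf
        return sad(block, ref[yy:yy + bs, xx:xx + bs])

    # Large diamond pattern until the best point stays at the center.
    while True:
        dy, dx = min(large, key=lambda d: cost(*d))
        if (dy, dx) == (0, 0) or max(abs(cy + dy), abs(cx + dx)) > max_range:
            break
        cy, cx = cy + dy, cx + dx
    # One refinement step with the small diamond pattern.
    dy, dx = min(small, key=lambda d: cost(*d))
    return (cy + dy, cx + dx)          # (dy, dx) motion vector

# Smooth synthetic frames with a known (3, -2) shift, so the true match
# for the block in `cur` lies at offset (-3, 2) in `ref`.
yy, xx = np.mgrid[0:64, 0:64]
ref = (128 + 60 * np.sin(xx / 6.0) * np.cos(yy / 9.0)).astype(np.uint8)
cur = np.roll(ref, (3, -2), axis=(0, 1))
print(diamond_search(ref, cur, x=24, y=24))  # expect (-3, 2)
```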

Keywords: Adaptive rood pattern search, Block matching, Diamond search, Joint Adaptive search, Motion estimation.

2706 Stature Estimation Based On Lower Limb Dimensions in the Malaysian Population

Authors: F. M. Nor, N. Abdullah, Al-M. Mustapa, L. Q. Wen, N. A. Faisal, D. A. A. Ahmad Nazari

Abstract:

Estimation of stature is an important step in developing a biological profile for human identification, and it may provide a valuable indicator for an unknown individual in a population. The aim of this study was to analyse the relationship between stature and lower limb dimensions in the Malaysian population. The sample comprised 100 corpses, 69 males and 31 females, aged between 20 and 90 years. The parameters measured were stature, thigh length, lower leg length, leg length, foot length, foot height and foot breadth. Results showed that mean values in males were significantly higher than those in females (P < 0.05), and there were significant correlations between lower limb dimensions and stature. Cross-validation of the equation on 100 individuals showed close agreement between known and estimated stature. It was concluded that lower limb dimensions are useful for estimation of stature, which should be validated in future studies.
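
To make the estimation step concrete, here is a minimal sketch, assuming made-up foot-length and stature values, of fitting the kind of linear regression equation such studies derive:

```python
import numpy as np

# Hypothetical sample: foot lengths (cm) and statures (cm); the values are
# illustrative placeholders, not the study's data.
foot_length = np.array([24.1, 25.3, 26.0, 24.8, 27.2, 25.9])
stature     = np.array([158.0, 165.5, 171.2, 162.3, 178.0, 169.4])

# Simple linear regression: stature = a * foot_length + b.
a, b = np.polyfit(foot_length, stature, 1)
print(f"stature = {a:.2f} * foot_length + {b:.2f}")

# Estimate the stature of a new individual from foot length alone.
print("estimated stature:", a * 25.5 + b)
```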

Keywords: Forensic anthropology population data, lower leg length, Malaysian, stature.

2705 A Cost-Effective Approach to Developing Mid-size Enterprise Software Adopting the Waterfall Model

Authors: M. N. Hasnine, M. K. H. Chayon, M. M. Rahman

Abstract:

Organizational tendencies towards computer-based information processing have been noticeable in third-world countries, where many enterprises are taking major initiatives towards computerized working environments because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under tight budgets and strict deadlines is always challenging for software engineers. We therefore introduce a cost-effective approach to designing mid-size enterprise software using the Waterfall model, one of the SDLC (Software Development Life Cycle) models. To fulfill the research objectives, we developed a mid-size enterprise application named “BSK Management System” that assists enterprise clients with information resource management and performs complex organizational tasks. The Waterfall model phases were applied to ensure that all functions, user requirements, strategic goals, and objectives were met. In addition, Rich Picture, Structured English, and Data Dictionary techniques were implemented and investigated in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that the system features simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.

Keywords: End-user Application Development, Enterprise Software Design, Information Resource Management, Usability.

2704 Estimation of OPC, Fly Ash and Slag Contents in Blended and Composite Cements by Selective Dissolution Method

Authors: Suresh Palla, Suresh Vanguri, Anitha, B. N. Mohapatra

Abstract:

This paper presents the results of a study on the estimation of fly ash, slag and cement contents in blended and composite cements by the selective dissolution method. The cement samples investigated include Ordinary Portland Cement (OPC) with fly ash as a performance improver, OPC with slag as a performance improver, Portland Pozzolana Cement (PPC), Portland Slag Cement (PSC) and composite cement conforming to the respective Indian Standards. Slag and OPC contents in PSC were estimated by selectively dissolving OPC in stage 1 and selectively dissolving slag in stage 2. In the case of the composite cement sample, the percentages of cement, slag and fly ash were estimated systematically by selective dissolution of cement, slag and fly ash in three stages. In the first stage, cement is dissolved and separated, leaving a residue of slag and fly ash, designated R1. The second stage involves gravimetric estimation of the OPC fraction and residue, and the selective dissolution of fly ash and slag. The fly ash content, R2, was estimated through gravimetric analysis, and the difference between R1 and R2 is taken as the slag content. The cement, fly ash and slag contents obtained by the selective dissolution method showed a standard deviation within 10% of the corresponding percentages of the respective constituents. The results suggest that this selective dissolution method can be successfully used for estimating OPC and Supplementary Cementitious Material (SCM) contents in different types of cement.
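
A minimal sketch of the bookkeeping behind the three-stage procedure, assuming hypothetical residue masses (the function and its arguments are illustrative, not the paper's laboratory protocol):

```python
def composite_cement_contents(sample_mass, r1_mass, r2_mass):
    """Back-calculate constituent percentages from the residue masses of the
    dissolution stages described above (all masses in grams).

    r1_mass: residue after stage 1 (cement dissolved) -> slag + fly ash
    r2_mass: residue attributed to fly ash after the later stages
    """
    fly_ash = 100.0 * r2_mass / sample_mass
    slag    = 100.0 * (r1_mass - r2_mass) / sample_mass  # difference R1 - R2
    cement  = 100.0 - fly_ash - slag                     # remainder is OPC
    return cement, fly_ash, slag

# Illustrative numbers only:
print(composite_cement_contents(sample_mass=1.000, r1_mass=0.45, r2_mass=0.25))
# -> (55.0, 25.0, 20.0) percent OPC, fly ash, slag
```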

Keywords: Selective dissolution method, fly ash, Ground Granulated blast furnace slag, EDTA.

2703 Software Product Quality Evaluation Model with Multiple Criteria Decision Making Analysis

Authors: C. Ardil

Abstract:

This paper presents a software product quality evaluation model based on the ISO/IEC 25010 quality model. The evaluation characteristics and sub-characteristics were identified from the ISO/IEC 25010 quality model. The multidimensional structure of the quality model is based on characteristics such as functional suitability, performance efficiency, compatibility, usability, reliability, security, maintainability, and portability, and their associated sub-characteristics. Random numbers are generated to establish the decision maker’s importance weights for each sub-characteristic, and again to establish the decision matrix of the decision maker’s final scores for each software product against each sub-characteristic. Thus, objective criteria importance weights and index scores for the datasets were obtained from the random numbers. In the proposed model, five different software product quality evaluation datasets under three different weight vectors were applied to the multiple criteria decision analysis method Preference Analysis for Reference Ideal Solution (PARIS) for comparison, together with a sensitivity analysis procedure. This study contributes to a better understanding of the application of MCDMA methods and ISO/IEC 25010 quality model guidelines in the software product quality evaluation process.
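
A sketch of the random-weight setup described above, assuming a plain weighted-sum aggregation as a stand-in for PARIS (whose internal steps the abstract does not detail):

```python
import numpy as np

rng = np.random.default_rng(25010)  # seed is arbitrary

n_products, n_subchars = 5, 8  # e.g., the 8 ISO/IEC 25010 characteristics

# Random importance weights, normalized so they sum to 1.
w = rng.random(n_subchars)
w /= w.sum()

# Random decision matrix: score of each product on each sub-characteristic.
scores = rng.random((n_products, n_subchars))

# Simple weighted-sum aggregation (a stand-in for PARIS), then rank
# products best-first.
overall = scores @ w
ranking = np.argsort(-overall)
print("overall scores:", np.round(overall, 3))
print("ranking (best first):", ranking)
```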

Keywords: ISO/IEC 25010 quality model, multiple criteria decisions making, multiple criteria decision making analysis, MCDMA, PARIS, Software Product Quality Evaluation Model, Software Product Quality Evaluation, Software Evaluation, Software Selection, Software

2702 The Study of the Intelligent Fuzzy Weighted Input Estimation Method Combined with the Experiment Verification for the Multilayer Materials

Authors: Ming-Hui Lee, Tsung-Chien Chen, Tsu-Ping Yu, Horng-Yuan Jang

Abstract:

The innovative intelligent fuzzy weighted input estimation method (FWIEM) can be applied to the inverse heat conduction problem (IHCP) to estimate the unknown time-varying heat flux of multilayer materials, as presented in this paper. The feasibility of the method is verified by a temperature measurement experiment. The experimental module was designed using a copper sample stacked with four aluminum samples of different thicknesses. The bottoms of the copper samples are heated by a standard heat source, and the temperatures on the tops of the aluminum samples are measured by thermocouples. The temperature measurements are then used as inputs to the presented method to estimate the heat flux at the bottoms of the copper samples. The influence on the estimation of the temperature measurement of samples with different thicknesses, the process noise covariance Q, the weighting factor γ, the sampling time interval Δt, and the spatial discretization interval Δx is investigated through the experimental verification. The results show that this method is efficient and robust in estimating the unknown time-varying heat input of multilayer materials.

Keywords: Multilayer Materials, Input Estimation Method, IHCP, Heat Flux.

2701 Development of Material Analyzing Software Using X-Ray Diffraction

Authors: Le Chi Cuong

Abstract:

X-ray diffraction is an effective means of analyzing material properties. This paper presents newly developed computational software for determining the properties of crystalline materials, such as elastic constants, residual stresses, surface hardness, and phase composition. The results computed by the X-ray diffraction method were compared with those from traditional methods and agree within the 95% confidence limits, showing that the newly developed software has high reproducibility and opening up the possibility of its commercialization.
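
Such analyses rest on Bragg's law, nλ = 2d sin θ. A minimal sketch, assuming the standard Cu Kα wavelength, of recovering an interplanar spacing from a measured diffraction angle:

```python
import math

def d_spacing(two_theta_deg, wavelength_angstrom=1.5406, order=1):
    """Interplanar spacing d from Bragg's law n*lambda = 2*d*sin(theta).
    Default wavelength is Cu K-alpha (1.5406 Å); two_theta is in degrees."""
    theta = math.radians(two_theta_deg / 2.0)
    return order * wavelength_angstrom / (2.0 * math.sin(theta))

# The alpha-Fe (110) reflection sits near 44.7 degrees 2-theta with Cu K-alpha,
# giving d of roughly 2.03 Å.
print(f"d = {d_spacing(44.7):.4f} Å")
```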

Keywords: X-ray diffraction, Nondestructive evaluation, Hardness, Residual stress, Phase determination.

2700 Flood Prediction in Karkheh River Basin Using Stochastic ARIMA Model

Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh

Abstract:

Floods have huge environmental and economic impacts, so flood prediction receives a great deal of attention. This study analysed the annual maximum streamflow (discharge, AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we used the Box-Jenkins approach, which comprises a four-stage method: model identification, parameter estimation, diagnostic checking and forecasting (prediction). The main tools used for ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. The SAS software computed the model parameters using the ML, CLS and ULS methods. The diagnostic checking tests, the AIC criterion and the RACF and RPACF graphs, were used to verify the selected model. In this study, the best ARIMA model for the annual maximum discharge (AMD) time series was ARIMA(4,1,1), with an AIC value of 88.87; the RACF and RPACF showed that the residuals were independent. Forecasting AMD for 10 future years demonstrated the ability of the model to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R²).
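
A minimal sketch of fitting the selected model with statsmodels, assuming a synthetic annual-maximum-discharge series in place of the Karkheh data:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
amd = 500 + np.cumsum(rng.normal(0, 40, size=50))  # synthetic AMD series

model = ARIMA(amd, order=(4, 1, 1))   # the paper's selected (p, d, q)
fit = model.fit()
print("AIC:", fit.aic)                # compare candidate orders by AIC
print(fit.forecast(steps=10))         # 10-year-ahead forecast
```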

Keywords: Time series modelling, stochastic processes, ARIMA model, Karkheh River.

2699 Performance Evaluation of Complex Electrical Bio-impedance from V/I Four-electrode Measurements

Authors: Towfeeq Fairooz, Salim Istyaq

Abstract:

The passive electrical properties of a tissue depend on its intrinsic constituents and structure, so by measuring the complex electrical impedance of the tissue it might be possible to obtain indicators of the tissue state or physiological activity [1]. Complete bio-impedance information relevant to the physiology and pathology of a human body and the functional states of body tissues or organs can be extracted using a four-electrode measurement setup. This work presents an estimation measurement setup based on the four-electrode technique. First, the complex impedance is estimated by three different estimation techniques: Fourier, sine correlation and digital de-convolution. Estimation errors for the magnitude, phase, reactance and resistance are then calculated and analyzed for different levels of disturbance in the observations. The absolute values of the relative errors are plotted and the graphical performance of each technique is compared.
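
A sketch of the Fourier variant, assuming a synthetic single-frequency drive: the complex impedance is taken as the ratio of the voltage and current FFT bins at the excitation frequency.

```python
import numpy as np

fs, f0, n = 100_000, 10_000, 4000        # sample rate, drive freq, samples
t = np.arange(n) / fs
z_true = 50 * np.exp(1j * np.deg2rad(-30))  # "tissue": 50 ohms at -30 degrees

rng = np.random.default_rng(2)
i = np.cos(2 * np.pi * f0 * t)                            # injected current
v = np.abs(z_true) * np.cos(2 * np.pi * f0 * t + np.angle(z_true))
v += 0.05 * rng.normal(size=n)                            # measurement noise

k = int(round(f0 * n / fs))      # FFT bin of the drive frequency (here 400)
z_hat = np.fft.rfft(v)[k] / np.fft.rfft(i)[k]             # complex ratio V/I
print(f"|Z| = {abs(z_hat):.2f} ohm, "
      f"phase = {np.degrees(np.angle(z_hat)):.1f} deg")
```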

Keywords: Electrical Impedance, Fast Fourier Transform, Additive White Gaussian Noise, Total Least Square, Digital De-Convolution, Sine-Correlation.

2698 Development of a Catchment Water Quality Model for Continuous Simulations of Pollutants Build-up and Wash-off

Authors: Iqbal Hossain, Dr. Monzur Imteaz, Dr. Shirley Gato-Trinidad, Prof. Abdallah Shanableh

Abstract:

Estimation of runoff water quality parameters is required to determine appropriate water quality management options. Various models are used to estimate runoff water quality parameters; however, most provide event-based estimates for specific sites. The work presented in this paper describes the development of a model that continuously simulates the accumulation and wash-off of water quality pollutants in a catchment. The model allows estimation of pollutants build-up during dry periods and pollutants wash-off during storm events. The model was developed by integrating two individual models: a rainfall-runoff model and a catchment water quality model. The rainfall-runoff model is based on the time-area runoff estimation method. The model allows users to estimate the time of concentration using a range of established methods, and to estimate the continuing runoff losses using any of the available estimation methods (i.e., constant, linearly varying or exponentially varying). Pollutants build-up in a catchment was represented by one of three pre-defined functions: power, exponential, or saturation. Similarly, pollutants wash-off was represented by one of three functions: power, rating-curve, or exponential. The developed runoff water quality model was set up to simulate the build-up and wash-off of total suspended solids (TSS), total phosphorus (TP) and total nitrogen (TN). The application of the model was demonstrated using available runoff and TSS field data from road and roof surfaces in the Gold Coast, Australia. The model provided an excellent representation of the field data, demonstrating the simplicity yet effectiveness of the proposed model.
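
A minimal sketch of the exponential build-up and wash-off options named above, in SWMM-style form with placeholder coefficients (not the paper's calibrated values):

```python
import numpy as np

def buildup_exponential(days_dry, b_max=50.0, k_b=0.4):
    # Pollutant mass accumulated after `days_dry` antecedent dry days (kg/ha).
    return b_max * (1.0 - np.exp(-k_b * days_dry))

def washoff_exponential(mass_on_surface, runoff_mm_per_h, dt_h, k_w=0.05, n=1.2):
    # Mass washed off during one timestep, proportional to the surface load
    # and a power of the runoff rate (exponential wash-off form).
    rate = k_w * runoff_mm_per_h ** n * mass_on_surface   # kg/ha per hour
    return min(rate * dt_h, mass_on_surface)

# Continuous simulation over a storm: deplete the built-up load step by step.
load = buildup_exponential(days_dry=7)
for q in [2.0, 8.0, 5.0, 1.0]:            # hourly runoff intensities (mm/h)
    w = washoff_exponential(load, q, dt_h=1.0)
    load -= w
    print(f"runoff {q:4.1f} mm/h -> washed off {w:6.2f}, "
          f"remaining {load:6.2f} kg/ha")
```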

Keywords: Catchment, continuous pollutants build-up, pollutants wash-off, runoff, runoff water quality model.

2697 Investigating the Relation between the Correctness and the Number of Versions of Fault Tolerant Software Systems

Authors: Pham Ba Quang, Nguyen Tien Dat, Huynh Quyet Thang

Abstract:

In this paper, we generalize several techniques for developing fault tolerant software. We introduce the property "correctness" for evaluating N-version systems and compare it to commonly used properties such as reliability and availability. We also derive the relation between this property and the number of versions in the system. Our experiments to verify the correctness and the applicability of the relation are also presented.
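
One common way to relate correctness to the number of versions is majority voting over independent versions; a sketch, assuming each version is independently correct with probability p (an idealization that the paper's own relation need not share):

```python
from math import comb

def majority_correct(n_versions, p_single):
    """Probability that a majority vote over n independent versions is
    correct, each version being correct with probability p_single."""
    k_min = n_versions // 2 + 1
    return sum(comb(n_versions, k)
               * p_single**k * (1 - p_single)**(n_versions - k)
               for k in range(k_min, n_versions + 1))

for n in (1, 3, 5, 7):
    print(n, "versions:", round(majority_correct(n, 0.9), 4))
# With p = 0.9: 0.9, 0.972, 0.9914, 0.9973 — correctness grows with N.
```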

Keywords: Correctness, Fault Tolerant Software, N-version Systems

2696 Time-Derivative Estimation of Noisy Movie Data using Adaptive Control Theory

Authors: Soon-Hyun Park, Takami Matsuo

Abstract:

This paper presents an adaptive differentiator for sequential data based on adaptive control theory. The algorithm is applied to detect moving objects by estimating the temporal gradient of sequential data at a specified pixel. We adopt two nonlinear intensity functions to reduce the influence of noise. The derivatives of the nonlinear intensity functions are estimated by an adaptive observer with a σ-modification update law.

Keywords: Adaptive estimation, parameter adjustment law, motion detection, temporal gradient, differential filter.

2695 Modelling Hydrological Time Series Using Wakeby Distribution

Authors: Ilaria Lucrezia Amerise

Abstract:

The statistical modelling of precipitation data for a given portion of territory is fundamental for the monitoring of climatic conditions and for Hydrogeological Management Plans (HMP). This modelling is rendered particularly complex by changes in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the Wakeby distribution (with 5 parameters) as a theoretical reference model. The number and the quality of its parameters indicate that this distribution may be an appropriate choice for interpolating hydrological variables; moreover, the Wakeby is particularly suitable for describing phenomena with heavy tails. The proposed estimation methods for determining the Wakeby parameter values are the same as those used for density functions with heavy tails. The commonly used procedure is the classic method of probability weighted moments (PWM), although this has often shown difficulty of convergence, or convergence to an inappropriate parameter configuration. In this paper, we analyze the problem of likelihood estimation for a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation situations. The motivation lies in the sampling and asymptotic properties of maximum likelihood estimators, which improve the estimates by providing indications of their variability and, therefore, of their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.
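
Because the Wakeby distribution is defined through its quantile function, inverse-transform sampling is immediate; a sketch with illustrative (not fitted) parameter values:

```python
import numpy as np

def wakeby_quantile(F, xi, alpha, beta, gamma, delta):
    """Wakeby quantile function:
    x(F) = xi + (alpha/beta) * (1 - (1-F)**beta)
              - (gamma/delta) * (1 - (1-F)**(-delta))."""
    F = np.asarray(F, dtype=float)
    return (xi + (alpha / beta) * (1.0 - (1.0 - F) ** beta)
               - (gamma / delta) * (1.0 - (1.0 - F) ** (-delta)))

# Inverse-transform sampling: feed uniform variates through the quantile
# function. Parameter values below are placeholders for demonstration.
rng = np.random.default_rng(1)
u = rng.uniform(size=10_000)
x = wakeby_quantile(u, xi=0.0, alpha=5.0, beta=2.0, gamma=1.0, delta=0.3)
print("mean:", x.mean(), "99th percentile:", np.quantile(x, 0.99))
# delta > 0 produces the heavy right tail the abstract refers to.
```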

Keywords: Generalized extreme values (GEV), likelihood estimation, precipitation data, Wakeby distribution.

2694 Bi-linear Complementarity Problem

Authors: Chao Wang, Ting-Zhu Huang, Chen Jia

Abstract:

In this paper, we propose a new variant of the linear complementarity problem, the bi-linear complementarity problem (BLCP), together with a method for solving it. In addition, an algorithm for the error estimation of the BLCP is given. Numerical experiments show that the algorithm is efficient.

Keywords: Bi-linear complementarity problem, Linear complementarity problem, Extended linear complementarity problem, Error estimation, P-matrix, M-matrix.

2693 A Systematic Review on the Integration of Project Management with Organizational Flows

Authors: Maurício Covolan Rosito, Ricardo Melo Bastos

Abstract:

Software projects are very dynamic and require recurring adjustments of their project plans. These adjustments can be understood as reconfigurations of the schedule, the resource allocation and other design elements. Moreover, during the planning and execution of a software project, the integration of project-specific activities with the activities that take part in the organization's common activity flow should be considered. This article presents the results of a systematic review of aspects related to the dynamic reconfiguration of software projects, emphasizing the integration of project management with organizational flows. A series of studies from the year 2000 to the present was analyzed. The results show that there is a diversity of techniques and strategies for the dynamic reconfiguration of software projects; however, few approaches consider the integration of software project activities with the activities that take part in the organization's common workflow.

Keywords: Dynamic Reconfiguration, Organizational workflows, Project Management, Systematic Review

2692 Investigation of the Unbiased Characteristic of Doppler Frequency to Different Antenna Array Geometries

Authors: Somayeh Komeylian

Abstract:

Array signal processing techniques have recently been developed for a variety of applications that enhance receiver performance by restraining the power of jamming and interference signals. In this scenario, biases induced in the antenna array receiver significantly degrade the accurate estimation of the carrier phase. Since the carrier phase is the integral of the Doppler frequency, an unbiased Doppler frequency estimate enables high-precision estimation of the carrier phase. The unbiasedness of the Doppler frequency with respect to jamming and other interference signals therefore allows highly accurate carrier phase estimation. In this study, we rigorously investigate whether the Doppler frequency remains unbiased under variations of the antenna array geometry. The simulation results verify that the Doppler frequency remains unbiased and accurate as the antenna array geometry varies.

Keywords: Array signal processing, unbiased Doppler frequency, GNSS, carrier phase, slowly fluctuating point target.

2691 A Study on the Application of TRIZ to CAD/CAM System

Authors: Yuan L. Lai, Jian H. Chen, Jui P. Hung

Abstract:

This study created new graphical icons and operating functions for a CAD/CAM software system by analyzing icons in some popular systems, such as AutoCAD, AlphaCAM, Mastercam and the 1st edition of LiteCAM. These software systems all focus on geometric design and editing, so how intuitively an icon transmits its message to users is an important property of graphical icons. The primary purpose of this study is to design innovative icons and commands for new software. The study employed the TRIZ method, an innovative design method, to generate new concepts systematically. Through a literature review, it then investigated and analyzed the relationship between TRIZ and idea development. The Contradiction Matrix and the 40 Principles were used to develop an assisting tool suitable for icon design in software development. We first gathered icon samples from the selected CAD/CAM systems, then grouped these icons by meaningful functions and compared their useful and harmful properties. Finally, we developed new icons for new software systems in order to avoid intellectual property problems.

Keywords: Icon, TRIZ, CAD/CAM.

2690 AJcFgraph - AspectJ Control Flow Graph Builder for Aspect-Oriented Software

Authors: Reza Meimandi Parizi, Abdul Azim Abdul Ghani

Abstract:

The ever-growing use of the aspect-oriented development methodology in software engineering requires tool support for both research environments and industry. So far, tool support for many activities in aspect-oriented software development has been proposed to automate and facilitate development; for instance, AJaTS provides a transformation system to support aspect-oriented development and refactoring. In particular, it is well established that the abstract interpretation of programs, in any paradigm, pursued in static analysis is best served by a high-level program representation such as the Control Flow Graph (CFG): such analysis can more easily locate common programmatic idioms for which helpful transformations are already known, and the association between the input program and the intermediate representation can be more closely maintained. However, although current research defines, to some extent, sound concepts and foundations for control flow analysis of aspect-oriented programs, it does not provide a concrete tool that can by itself construct the CFG of these programs. Furthermore, most of this work focuses on other issues in Aspect-Oriented Software Development (AOSD), such as testing or data flow analysis, rather than on the CFG itself. This study is therefore dedicated to building an aspect-oriented control flow graph construction tool called AJcFgraph Builder. The tool can be applied in many software engineering tasks in the context of AOSD, such as software testing, software metrics, and so forth.
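
As a toy illustration of the representation such a tool produces (not AJcFgraph itself), the sketch below builds and walks a tiny CFG in which a before() advice node is woven ahead of a method body:

```python
from dataclasses import dataclass, field

@dataclass
class CFGNode:
    label: str
    successors: list = field(default_factory=list)

def build_cfg():
    # CFG for: if (cond) { /* advice fires */ } body(); return;
    entry  = CFGNode("entry")
    cond   = CFGNode("if (cond)")
    advice = CFGNode("before() advice")   # aspect-woven node
    body   = CFGNode("body()")
    exit_  = CFGNode("exit")
    entry.successors  = [cond]
    cond.successors   = [advice, body]    # true branch runs the advice first
    advice.successors = [body]
    body.successors   = [exit_]
    return entry

def dfs(node, seen=None):
    # Print every edge once, depth-first.
    seen = set() if seen is None else seen
    if id(node) in seen:
        return
    seen.add(id(node))
    for s in node.successors:
        print(f"{node.label} -> {s.label}")
        dfs(s, seen)

dfs(build_cfg())
```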

Keywords: Aspect-Oriented Software Development, AspectJ, Control Flow Graph, Data Flow Analysis

2689 A Study on the Developing Method of the BIM (Building Information Modeling) Software Based On Cloud Computing Environment

Authors: Byung-Kon Kim

Abstract:

As Architecture, Engineering and Construction (AEC) industry projects have grown larger and more complex, the use of BIM for 3D design and simulation has increased significantly. Typical applications of BIM, such as clash detection and alternative measures based on 3-dimensional planning, have accordingly expanded to process management, cost and quantity management, structural analysis, regulation checking, and various other domains of virtual design and construction. At present, commercial BIM software operates in a single-user environment, so the initial cost is high and the investment is frequently wasted. Cloud computing, a next-generation internet technology, enables simple internet devices (such as PCs, tablets, and smartphones) to use the services and resources of BIM software. In this paper, we suggest a method for developing BIM software based on a cloud computing environment in order to expand the utilization of BIM and reduce the cost of BIM software. First, as benchmarking, we surveyed successful cases of BIM and cloud computing and analyzed the needs and opportunities for BIM and cloud computing in the AEC industry. Finally, we propose the main functions of BIM software based on a cloud computing environment and develop a simple prototype of cloud computing BIM software for basic BIM model viewing.

Keywords: Construction IT, BIM(Building Information Modeling), Cloud Computing, BIM Service Based Cloud Computing, Viewer Based BIM Server, 3D Design.

2688 Estimation of R = P[Y < X] for Two-Parameter Burr Type XII Distribution

Authors: H. Panahi, S. Asadi

Abstract:

In this article, we consider the estimation of P[Y < X] when the strength X and the stress Y are two independent variables following the Burr Type XII distribution. The MLE of R, based on a simple iterative procedure, is obtained. Assuming that the common parameter is known, the maximum likelihood estimator, the uniformly minimum variance unbiased estimator and the Bayes estimator of P[Y < X] are discussed. The exact confidence interval of R is also obtained. Monte Carlo simulations are performed to compare the different proposed methods.
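
A quick Monte Carlo check of R = P[Y < X], assuming Burr XII variates drawn by inverting the CDF F(x) = 1 − (1 + x^c)^(−k); when the shape c is common to X and Y, the closed form k_y/(k_x + k_y) should be reproduced:

```python
import numpy as np

def rburr12(rng, size, c, k):
    """Draw Burr XII variates via the inverse CDF F(x) = 1 - (1 + x**c)**(-k)."""
    u = rng.uniform(size=size)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

rng = np.random.default_rng(7)
c, k_x, k_y = 2.0, 3.0, 1.5        # common shape c; different k's (illustrative)
x = rburr12(rng, 200_000, c, k_x)  # strength
y = rburr12(rng, 200_000, c, k_y)  # stress

print("Monte Carlo R :", (y < x).mean())
print("closed form R :", k_y / (k_x + k_y))   # holds when c is common
```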

Keywords: Stress-Strength model, Maximum likelihood estimator, Bayes estimator, Burr type XII distribution.

2687 Implementation of Channel Estimation and Timing Synchronization Algorithms for MIMO-OFDM System Using NI USRP 2920

Authors: Ali Beydoun, Hamzé H. Alaeddine

Abstract:

The MIMO-OFDM communication system is a key solution for the next generation of mobile communication due to its high spectral efficiency, high data rate and robustness against multi-path fading channels. However, a MIMO-OFDM system requires perfect knowledge of the channel state information and good synchronization between the transmitter and the receiver to achieve the expected performance. Recently, we proposed two algorithms for channel estimation and timing synchronization with good performance and very low implementation complexity compared with those proposed in the literature. In order to validate and evaluate the efficiency of these algorithms in real environments, this paper presents in detail the implementation of a 2 × 2 MIMO-OFDM system based on LabVIEW and the USRP 2920. Implementation results show good agreement with the simulation results under different configuration parameters.
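
For orientation, here is a sketch of the baseline per-subcarrier least-squares channel estimate with pilot interpolation (a textbook method, not the authors' proposed algorithm, whose details the abstract does not give):

```python
import numpy as np

rng = np.random.default_rng(3)
n_sc = 64                                   # OFDM subcarriers
pilots = np.arange(0, n_sc, 8)              # comb-type pilot positions

x = np.ones(n_sc, dtype=complex)            # known pilot symbols (all 1s here)
h_true = (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc)) / np.sqrt(2)
noise = 0.05 * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))
y = h_true * x + noise                      # received frequency-domain symbols

# Least-squares estimate at pilot tones, then linear interpolation between
# them (constant beyond the last pilot).
h_ls = y[pilots] / x[pilots]
h_hat = (np.interp(np.arange(n_sc), pilots, h_ls.real)
         + 1j * np.interp(np.arange(n_sc), pilots, h_ls.imag))

print("MSE at pilots:", np.mean(np.abs(h_ls - h_true[pilots]) ** 2))
```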

Keywords: MIMO-OFDM system, timing synchronization, channel estimation, STBC, USRP 2920.

2686 DRE - A Quality Metric for Component based Software Products

Authors: K. S. Jasmine, R. Vasantha

Abstract:

The overriding goal of software engineering is to provide a high quality system, application or product. To achieve this goal, software engineers must apply effective methods coupled with modern tools within the context of a mature software process [2]. In addition, it is necessary to assure that high quality is actually realized. Although many quality measures can be collected at the project level, the important measures are errors and defects. Deriving a quality measure for reusable components has proven to be a challenging task nowadays. The results obtained in this study are based on empirical evidence of reuse practices, as it emerged from the analysis of industrial projects; both large and small companies, working in a variety of business domains and using object-oriented and procedural development approaches, contributed to the study. This paper proposes a quality metric that provides benefit at both the project and the process level, namely defect removal efficiency (DRE).
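
DRE is conventionally computed as E/(E + D), where E counts errors found before release and D counts defects found after; a minimal helper:

```python
def defect_removal_efficiency(errors_before_release, defects_after_release):
    """DRE = E / (E + D): the fraction of all injected problems caught
    before the product (or reusable component) is released."""
    e, d = errors_before_release, defects_after_release
    return e / (e + d) if (e + d) else 1.0

print(defect_removal_efficiency(94, 6))   # 0.94 -> 94% caught pre-release
```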

Keywords: Software Reuse, Defect density, Reuse metrics, Defect Removal efficiency.

2685 Automatic Reusability Appraisal of Software Components using Neuro-fuzzy Approach

Authors: Parvinder S. Sandhu, Hardeep Singh

Abstract:

Automatic reusability appraisal can help in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. But the issue of how to identify reusable components in existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes of a component as well as its usability or relevancy to a particular domain. Latent semantic analysis is used for the feature-vector representation of the various software domains. It exploits the fact that feature-vector codes can be seen as documents containing terms (the identifiers present in the components), so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a neuro-fuzzy hybrid inference system, which takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the neuro-fuzzy system. The results obtained are convincing enough to propose the system for the economical identification and retrieval of reusable software components.
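
A sketch of the LSA step, assuming a made-up identifier-by-component count matrix: a truncated SVD projects components into a low-dimensional space where co-occurrence similarity can be measured.

```python
import numpy as np

# Rows: identifier terms; columns: software components. Counts are made up.
terms = ["open", "read", "socket", "matrix", "invert"]
A = np.array([[3, 2, 0],
              [4, 1, 0],
              [2, 3, 0],
              [0, 0, 5],
              [0, 1, 4]], dtype=float)

# Truncated SVD: keep k latent dimensions (the LSA step).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
docs = (np.diag(s[:k]) @ Vt[:k]).T        # components in latent space

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print("similarity(comp0, comp1):", round(cos(docs[0], docs[1]), 3))  # I/O pair
print("similarity(comp0, comp2):", round(cos(docs[0], docs[2]), 3))  # unrelated
```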

Keywords: Clustering, ID3, LSA, Neuro-fuzzy System, SVD

2684 State Estimation Based on Unscented Kalman Filter for Burgers’ Equation

Authors: Takashi Shimizu, Tomoaki Hashimoto

Abstract:

Controlling the flow of fluids is a challenging problem that arises in many fields. Burgers’ equation is a fundamental equation for several flow phenomena such as traffic, shock waves, and turbulence. The optimal feedback control method, so-called model predictive control, has been proposed for Burgers’ equation. However, the model predictive control method is inapplicable to systems whose state variables are not all exactly known. From a practical point of view, it is unusual for all the state variables of a system to be exactly known, because the state variables are measured through output sensors and only limited parts of them are available; in fact, the flow velocities of fluid systems usually cannot be measured over the whole spatial domain. Hence, any practical feedback controller for fluid systems must incorporate some type of state estimator. To apply model predictive control to the fluid systems described by Burgers’ equation, a state estimation method for Burgers’ equation with limited measurable state variables must be established. To this end, we apply the unscented Kalman filter to estimate the state variables of fluid systems described by Burgers’ equation. The objective of this study is to establish a state estimation method based on the unscented Kalman filter for Burgers’ equation, and the effectiveness of the proposed method is verified by numerical simulations.
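
A sketch of the process model such an estimator propagates: one explicit finite-difference step of viscous Burgers' equation u_t + u·u_x = ν·u_xx, with illustrative grid and viscosity values (the UKF machinery itself is omitted):

```python
import numpy as np

def burgers_step(u, dx, dt, nu=0.05):
    """One explicit finite-difference step of u_t + u*u_x = nu*u_xx
    (periodic boundary), i.e., the process model a UKF would propagate."""
    ux  = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)       # central u_x
    uxx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2  # central u_xx
    return u + dt * (-u * ux + nu * uxx)

n = 128
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u = np.sin(x)                   # initial velocity profile
dx, dt = x[1] - x[0], 0.005
for _ in range(200):            # propagate the flow state forward in time
    u = burgers_step(u, dx, dt)
print("state after 1.0 time unit, max |u|:", np.abs(u).max())
```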

Keywords: State estimation, fluid systems, observer systems, unscented Kalman filter.

2683 Empirical Exploration of Correlations between Software Design Measures: A Replication Study

Authors: Jehad Al Dallal

Abstract:

Software engineers apply different measures to quantify the quality of software design. These measures consider artifacts developed in the low- or high-level software design phases, and the results are used to point to design weaknesses and to indicate design points that have to be restructured. Understanding the relationships among the quality measures, and among the design quality aspects they consider, is important for interpreting the impact that a measure for one quality aspect has on other, potentially related aspects. In addition, exploring the relationships between quality measures helps to explain the impact of different quality measures on external quality aspects, such as reliability and maintainability. In this paper, we report a replication study that empirically explores the correlation between six well-known and commonly applied design quality measures. These measures consider several quality aspects, including complexity, cohesion, coupling, and inheritance. The results indicate that the inheritance measures are weakly correlated with the other measures, whereas the complexity, coupling, and cohesion measures are mostly strongly correlated.
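
A sketch of the correlation computation itself, assuming made-up per-class measure values:

```python
import numpy as np
from scipy.stats import spearmanr

# Made-up per-class measures: e.g., a coupling metric and a complexity metric.
coupling   = np.array([3, 7, 2, 9, 5, 6, 1, 8])
complexity = np.array([10, 18, 8, 31, 16, 22, 5, 27])

rho, p = spearmanr(coupling, complexity)
print(f"Spearman rho = {rho:.3f}, p = {p:.4f}")  # strong monotone association
```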

Keywords: Quality attribute, quality measure, software design quality, Spearman correlation.

2682 Motion Parameter Estimation via Dopplerlet-Transform-Based Matched Field Processing

Authors: Hongyan Dai

Abstract:

This work presents a matched field processing (MFP) algorithm based on the Dopplerlet transform for estimating the motion parameters of a sound source moving along a straight line at constant speed, using a piecewise strategy that significantly reduces the computational burden. Monte Carlo simulation results and an experimental result are presented to verify the effectiveness of the proposed algorithm.

Keywords: Matched field processing, Dopplerlet transform, motion parameter estimation, piecewise strategy.

2681 Software to Encrypt Messages Using Public-Key Cryptography

Authors: E. Inzunza-González, C. Cruz-Hernández, R. M. López-Gutiérrez, E. E. García-Guerrero, L. Cardoza-Avendaño, H. Serrano-Guerrero

Abstract:

This paper presents the development of software for encrypting messages with asymmetric cryptography. In particular, the RSA (Rivest, Shamir and Adleman) algorithm is used to encrypt alphanumeric information. The software allows the user to generate different public keys from two prime numbers provided by the user; the user must then select a public key to generate the corresponding private key. To encrypt the information, the user must provide the recipient's public key as well as the message to be encrypted. The generated ciphertext can be sent through an insecure channel, where it would be very difficult for an intruder or attacker to interpret. At the end of the communication, the recipient can decrypt the original message by providing his/her public key and the corresponding private key.
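
A toy-scale sketch of the RSA flow just described, using the textbook primes 61 and 53 (far too small for real use):

```python
from math import gcd

p, q = 61, 53                  # the two user-supplied primes (toy-sized!)
n, phi = p * q, (p - 1) * (q - 1)

e = 17                         # public exponent; must satisfy gcd(e, phi) == 1
assert gcd(e, phi) == 1
d = pow(e, -1, phi)            # private exponent: modular inverse of e mod phi

public_key, private_key = (e, n), (d, n)

message = 65                   # a numeric (alphanumeric-encoded) message < n
cipher = pow(message, e, n)    # encrypt with the recipient's public key
plain  = pow(cipher, d, n)     # recipient decrypts with the private key
print(cipher, plain)           # plain == 65
```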

Keywords: Asymmetric cryptography, Prime number, Public-key, Private-key, Software.

2680 Communication and Quality in Distributed Agile Development: An Empirical Case Study

Authors: R. Green, T. Mazzuchi, S. Sarkani

Abstract:

Intuitively, we expect distributed software development to increase the risks associated with achieving cost, schedule, and quality goals. To compound this problem, agile software development (ASD) insists that one of the main ingredients of its success is the cohesive communication attributed to collocation of the development team. The following study identified the degree of communication richness needed to achieve software quality (in terms of reduced pre-release defects) comparable between distributed and collocated teams. This paper explores the relevance of communication richness in the various development phases and its impact on quality. Through examination of a large distributed agile development project, this investigation seeks to understand the level of communication required within each ASD phase to produce quality results comparable to those achieved by collocated teams. Obviously, a multitude of factors affects the outcome of software projects; however, within distributed agile software development teams, the mode of communication is one of the critical components required to achieve team cohesiveness and effectiveness. As such, this study constructs a distributed agile communication model (DAC-M) for potential application to similar distributed agile development efforts, using measurement of the suitable level of communication. The results of the study show that less rich communication methods, in the appropriate phases, may be satisfactory for achieving equivalent quality in distributed ASD efforts.

Keywords: Agile software development (ASD), distributed software teams, media richness theory, software development.

2679 Energy Consumption Analysis of Design Patterns

Authors: Andreas Litke, Kostas Zotos, Alexander Chatzigeorgiou, George Stephanides

Abstract:

The importance of low power consumption is widely acknowledged due to the increasing use of portable devices, which require minimizing the consumption of energy. Energy dissipation is heavily dependent on the software used in the system. Applying design patterns in object-oriented designs is a common practice nowadays. In this paper we analyze six design patterns and explore the effect of them on energy consumption and performance.

Keywords: Design Patterns, Embedded Systems, Energy Consumption, Performance Evaluation, Software Design and Development, Software Engineering.

2678 A Study on a Research and Development Cost-Estimation Model in Korea

Authors: Babakina Alexandra, Yong Soo Kim

Abstract:

In this study, we analyzed the factors that affect research funds using linear regression analysis to increase the effectiveness of investments in national research projects. We collected 7,916 items of data on research projects that were in the process of being finished or were completed between 2010 and 2011. Data pre-processing and visualization were performed to derive statistically significant results. We identified factors that affected funding using analysis of fit distributions and estimated increasing or decreasing tendencies based on these factors.

Keywords: R&D funding, Cost estimation, Linear regression, Preliminary feasibility study.
