Search results for: multi model software process improvement
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 34953

34803 A Description Logics Based Approach for Building Multi-Viewpoints Ontologies

Authors: M. Hemam, M. Djezzar, T. Djouad

Abstract:

We are interested in the problem of building an ontology in a heterogeneous organization by taking into account the different viewpoints and terminologies of the communities within the organization. Such an ontology, which we call a multi-viewpoint ontology, provides several partial descriptions of the same universe of discourse, each relative to a particular viewpoint. In addition, these partial descriptions share, at the global level, ontological elements that constitute a consensus among the various viewpoints. To address this problem, we define a multi-viewpoints knowledge model based on the notions of viewpoint and ontology. The multi-viewpoints knowledge model is then used to formalize the multi-viewpoint ontology in a description logics language.

Keywords: description logic, knowledge engineering, ontology, viewpoint

Procedia PDF Downloads 289
34802 A Super-Efficiency Model for Evaluating Efficiency in the Presence of Time Lag Effect

Authors: Yanshuang Zhang, Byungho Jeong

Abstract:

In many cases, there is a time lag between the consumption of inputs and the production of outputs. This time lag effect should be considered in evaluating the performance of organizations. Recently, a couple of DEA models were developed to account for the time lag effect in the efficiency evaluation of research activities. The Multi-periods Input (MpI) and Multi-periods Output (MpO) models are integrated models that calculate simple efficiency while considering the time lag effect. However, these models cannot discriminate among efficient DMUs because, in the basic DEA model, efficiency scores are capped at 1, so all efficient DMUs receive the same score. Thus, this paper suggests a super-efficiency model for efficiency evaluation under the consideration of the time lag effect, based on the MpO model. A case example using a long-term research project is given to compare the suggested model with the MpO model.
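To make the super-efficiency idea concrete, the sketch below scores each DMU with a standard input-oriented, constant-returns-to-scale (Andersen-Petersen) super-efficiency model: the DMU under evaluation is excluded from its own reference set, so efficient units can obtain scores above 1 and become distinguishable. This is a generic illustration rather than the MpO-based model proposed in the abstract, and the input/output data are hypothetical.

```python
# Sketch of an input-oriented, constant-returns-to-scale super-efficiency DEA model
# (Andersen-Petersen style). Generic illustration, not the MpO-based model of the
# abstract; the data below are hypothetical.
import numpy as np
from scipy.optimize import linprog

def super_efficiency(X, Y, o):
    """Super-efficiency score of DMU o.

    X: (n_dmu, n_inputs) input matrix, Y: (n_dmu, n_outputs) output matrix.
    DMU o is excluded from the reference set, so efficient units can score > 1.
    """
    n, m = X.shape
    s = Y.shape[1]
    others = [j for j in range(n) if j != o]
    # Decision variables: theta, then lambda_j for j != o
    c = np.zeros(1 + len(others))
    c[0] = 1.0                                    # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                            # sum_j lambda_j x_ij <= theta * x_io
        row = np.zeros(1 + len(others))
        row[0] = -X[o, i]
        row[1:] = [X[j, i] for j in others]
        A_ub.append(row); b_ub.append(0.0)
    for r in range(s):                            # sum_j lambda_j y_rj >= y_ro
        row = np.zeros(1 + len(others))
        row[1:] = [-Y[j, r] for j in others]
        A_ub.append(row); b_ub.append(-Y[o, r])
    bounds = [(0, None)] * (1 + len(others))
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds,
                  method="highs")
    return res.x[0] if res.success else np.nan

# Hypothetical data: 4 DMUs, 2 inputs, 1 output
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
for o in range(len(X)):
    print(f"DMU {o}: super-efficiency = {super_efficiency(X, Y, o):.3f}")
```

Because the evaluated unit is removed from the reference set, two efficient DMUs that would both score exactly 1 in the basic model now receive different scores and can be ranked.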

Keywords: DEA, super-efficiency, time lag, multi-periods input

Procedia PDF Downloads 450
34801 Numerical Simulation of Wishart Diffusion Processes

Authors: Raphael Naryongo, Philip Ngare, Anthony Waititu

Abstract:

This paper deals with the numerical simulation of Wishart processes for a single-asset risky pricing model whose volatility is described by a Wishart affine diffusion process. The multi-factor specification of volatility makes the model flexible enough to fit stock market data for short or long maturities. The Wishart process is a stochastic process taking values in the positive semi-definite matrices; it is a matrix-valued generalization of the square-root process. The aim of the study is to model log asset returns under the double Wishart stochastic volatility model. The solution of the log-asset return dynamics for bi-Wishart processes is obtained through Euler-Maruyama discretization schemes. The numerical results on asset returns are compared to those of existing models such as the Heston and double Heston stochastic volatility models.
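As a rough illustration of the simulation step, the sketch below applies a plain Euler-Maruyama scheme to a Wishart-type matrix SDE, with a symmetric positive semi-definite projection at every step because the naive scheme does not preserve positive semi-definiteness. All parameter values are hypothetical, and the full double Wishart asset-return dynamics of the abstract are not reproduced here.

```python
# Minimal Euler-Maruyama sketch for a Wishart variance process
#   dS_t = (beta * Q^T Q + M S_t + S_t M^T) dt + sqrt(S_t) dW_t Q + Q^T dW_t^T sqrt(S_t)
# Parameter values are hypothetical; negative eigenvalues are clipped at each step.
import numpy as np

def psd_sqrt_and_project(S):
    """Return the matrix square root of S after clipping negative eigenvalues."""
    w, V = np.linalg.eigh((S + S.T) / 2)
    w = np.clip(w, 0.0, None)
    S_proj = V @ np.diag(w) @ V.T
    return V @ np.diag(np.sqrt(w)) @ V.T, S_proj

def simulate_wishart(S0, beta, Q, M, T=1.0, n_steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    d = S0.shape[0]
    S = S0.copy()
    path = [S0.copy()]
    for _ in range(n_steps):
        sqrtS, S = psd_sqrt_and_project(S)
        dW = rng.standard_normal((d, d)) * np.sqrt(dt)
        drift = beta * Q.T @ Q + M @ S + S @ M.T
        diffusion = sqrtS @ dW @ Q + Q.T @ dW.T @ sqrtS
        S = S + drift * dt + diffusion
        path.append(S.copy())
    return np.array(path)

# Hypothetical 2x2 example with mean-reverting drift
d = 2
S0 = np.eye(d) * 0.04
Q = np.eye(d) * 0.1
M = -0.5 * np.eye(d)
path = simulate_wishart(S0, beta=3.0, Q=Q, M=M)
print("terminal variance matrix:\n", path[-1])
```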

Keywords: Euler schemes, log-asset return, infinitesimal generator, Wishart affine diffusion processes

Procedia PDF Downloads 351
34800 Multi-Level Security Measures in Cloud Computing

Authors: Shobha G. Ranjan

Abstract:

Cloud computing is an emerging, on-demand, internet-based technology. A variety of services such as software, hardware, data storage and infrastructure can be shared through cloud computing. The technology is highly reliable, cost-effective and scalable in nature. It is essential that only authorized users access these services, and the time granted to access them should be taken into account for proper accounting purposes. Currently, many organizations apply security measures in many different ways to provide the best cloud infrastructure to their clients, but this is not sufficient. This paper presents a multi-level security measure technique that is in accordance with the OSI model. Details of the proposed multi-level security measures technique are presented along with the architecture, activities, algorithms and the probability of success in breaking authentication.

Keywords: cloud computing, cloud security, integrity, multi-tenancy, security

Procedia PDF Downloads 479
34799 Multi-Point Dieless Forming Product Defect Reduction Using Reliability-Based Robust Process Optimization

Authors: Misganaw Abebe Baye, Ji-Woo Park, Beom-Soo Kang

Abstract:

The product quality of multi-point dieless forming (MDF) is known to depend on the process parameters. Moreover, variation in friction and material properties may have a substantially adverse influence on the final product quality. This study proposes how to compensate for MDF product defects by minimizing the sensitivity to noise parameter variations. This can be attained by a reliability-based robust optimization (RRO) technique that obtains the optimal process settings of the controllable parameters. Initially, two MDF finite element (FE) simulations of an AA3003-H14 saddle shape showed a substantial amount of dimpling, wrinkling, and shape error. FE analyses were consequently carried out in the commercial software ABAQUS to obtain the correlation between the control process settings and the noise variation with regard to the product defects. The best prediction models were chosen from a family of metamodels to replace the computationally expensive FE simulation. A genetic algorithm (GA) was applied to determine the optimal settings of the control parameters, and Monte Carlo analysis (MCA) was executed to determine how noise parameter variation affects the final product quality. Finally, the RRO FE simulation and the experimental results show that amending the control parameters in the final forming process leads to a considerably better-quality product.
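The pipeline described above (metamodel in place of the FE run, a genetic algorithm over the control parameters, Monte Carlo sampling of the noise) can be summarized in a few dozen lines. The sketch below is illustrative only: the surrogate function, bounds and noise distribution are hypothetical stand-ins, not the metamodels fitted in the study.

```python
# Illustrative reliability-based robust optimization loop: a cheap surrogate
# ("metamodel") stands in for the FE model, a simple genetic algorithm searches
# the control parameters, and Monte Carlo sampling of the noise parameters
# (friction, material scatter) scores robustness. Everything here is hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def surrogate_shape_error(x, z):
    """Hypothetical metamodel: shape error as a function of controls x and noise z."""
    return (x[0] - 2.0) ** 2 + 0.5 * (x[1] - 1.0) ** 2 + z[0] * x[0] + 0.3 * z[1]

def robust_objective(x, n_mc=200):
    """Mean + 3*std of the response under Monte Carlo noise sampling."""
    z = rng.normal(0.0, 0.1, size=(n_mc, 2))
    vals = np.array([surrogate_shape_error(x, zi) for zi in z])
    return vals.mean() + 3.0 * vals.std()

def genetic_algorithm(bounds, pop_size=30, n_gen=50, mut_sigma=0.1):
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    for _ in range(n_gen):
        fitness = np.array([robust_objective(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]     # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            alpha = rng.random()
            child = alpha * a + (1 - alpha) * b                  # blend crossover
            child += rng.normal(0.0, mut_sigma, size=dim)        # Gaussian mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, np.array(children)])
    fitness = np.array([robust_objective(ind) for ind in pop])
    return pop[np.argmin(fitness)], fitness.min()

best_x, best_f = genetic_algorithm(bounds=[(0.0, 4.0), (0.0, 3.0)])
print("robust optimum:", best_x, "objective:", best_f)
```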

Keywords: dimpling, multi-point dieless forming, reliability-based robust optimization, shape error, variation, wrinkling

Procedia PDF Downloads 228
34798 Object-Oriented Program Comprehension by Identification of Software Components and Their Connexions

Authors: Abdelhak-Djamel Seriai, Selim Kebir, Allaoua Chaoui

Abstract:

During the last decades, object-oriented programming has been massively used to build large-scale systems. However, evolution and maintenance of such systems become laborious tasks because object-oriented programming does not offer a precise view of the functional building blocks of the system. This shortcoming is caused by the fine granularity of classes and objects. In this paper, we use a post-object-oriented technology, namely software components, to propose an approach based on identifying the functional building blocks of an object-oriented system by analyzing its source code. These functional blocks are specified as software components, and the result is a multi-layer component-based software architecture.

Keywords: software comprehension, software component, object oriented, software architecture, reverse engineering

Procedia PDF Downloads 390
34797 Keypoint Detection Method Based on Multi-Scale Feature Fusion of Attention Mechanism

Authors: Xiaoxiao Li, Shuangcheng Jia, Qian Li

Abstract:

Keypoint detection has always been a challenge in the field of image recognition. This paper proposes a novel keypoint detection method called the Multi-Scale Feature Fusion Convolutional Network with Attention (MFFCNA). We verified that multi-scale features combined with an attention mechanism module have better feature expression capability. Feature fusion between different scales enriches the information the network model can express and makes the network easier to converge. On our self-made street sign corner dataset, the MFFCNA model achieves an accuracy of 97.8% and a recall of 81%, which are 5 and 8 percentage points higher than the HRNet network, respectively. On the COCO dataset, the AP is 71.9% and the AR is 75.3%, which are 3 and 2 points higher than HRNet, respectively. Extensive experiments show that our method yields a remarkable improvement in keypoint recognition tasks and that its recognition performance is better than that of existing methods. Moreover, our method can be applied not only to keypoint detection but also to image classification and semantic segmentation with good generality.
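The MFFCNA architecture itself is not specified in the abstract, but the two ingredients it combines, multi-scale feature fusion and channel attention, can be illustrated with a small PyTorch module that fuses features from three resolutions, re-weights channels with a squeeze-and-excitation block, and predicts per-keypoint heatmaps. Every layer choice below is a hypothetical example, not the published network.

```python
# Generic sketch of multi-scale feature fusion with channel attention for keypoint
# heatmap prediction. NOT the MFFCNA architecture; it only illustrates the two ideas
# the abstract combines: fusing features across scales and channel attention.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))           # global average pool -> channel weights
        return x * w[:, :, None, None]

class MultiScaleFusionHead(nn.Module):
    def __init__(self, in_channels=3, width=32, num_keypoints=17):
        super().__init__()
        self.stem = nn.Conv2d(in_channels, width, 3, stride=2, padding=1)
        self.down1 = nn.Conv2d(width, width, 3, stride=2, padding=1)
        self.down2 = nn.Conv2d(width, width, 3, stride=2, padding=1)
        self.attn = ChannelAttention(3 * width)
        self.head = nn.Conv2d(3 * width, num_keypoints, 1)

    def forward(self, x):
        f1 = F.relu(self.stem(x))                 # 1/2 resolution
        f2 = F.relu(self.down1(f1))               # 1/4 resolution
        f3 = F.relu(self.down2(f2))               # 1/8 resolution
        size = f1.shape[2:]
        fused = torch.cat(
            [f1,
             F.interpolate(f2, size=size, mode="bilinear", align_corners=False),
             F.interpolate(f3, size=size, mode="bilinear", align_corners=False)], dim=1)
        return self.head(self.attn(fused))        # per-keypoint heatmaps

model = MultiScaleFusionHead()
heatmaps = model(torch.randn(1, 3, 256, 256))
print(heatmaps.shape)                             # torch.Size([1, 17, 128, 128])
```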

Keywords: keypoint detection, feature fusion, attention, semantic segmentation

Procedia PDF Downloads 102
34796 Green Supply Chain Management and Corporate Performance: The Mediation Mechanism of Information Sharing among Firms

Authors: Seigo Matsuno, Yasuo Uchida, Shozo Tokinaga

Abstract:

This paper proposes and empirically tests a model of the relationships between green supply chain management (GSCM) activities and corporate performance. From the literature review, we identified five constructs, namely environmental commitment, supplier collaboration, supplier assessment, information sharing among suppliers, and business process improvement. These explanatory variables are used to form a structural model explaining environmental and economic performance. The model was analyzed using data from a survey of a sample of manufacturing firms in Japan. The results suggest that the degree of supplier collaboration influences environmental performance directly, while the impact of supplier assessment on environmental performance is mediated by information sharing and/or business process improvement, and that environmental performance is positively related to economic performance. Academic and managerial implications of our findings are discussed.

Keywords: corporate performance, empirical study, green supply chain management, path modeling

Procedia PDF Downloads 376
34795 Induction Heating Process Design Using Comsol® Multiphysics Software Version 4.2a

Authors: K. Djellabi, M. E. H. Latreche

Abstract:

Induction heating computer simulation is a powerful tool for process design and optimization, induction coil design, equipment selection, as well as education and business presentations. The authors share their extensive experience in the practical use of computer simulation for different induction heating and heat treating processes. This paper deals with the mathematical modeling and numerical simulation of induction heating furnaces with axisymmetric geometries. For the numerical solution, we propose finite element methods (FEM) combined with a boundary formulation for the electromagnetic model, using the COMSOL Multiphysics software. Some numerical results for an industrial furnace operating at high frequency are shown.

Keywords: numerical methods, induction furnaces, induction heating, finite element method, Comsol multiphysics software

Procedia PDF Downloads 426
34794 Finding DEA Targets Using Multi-Objective Programming

Authors: Farzad Sharifi, Raziyeh Shamsi

Abstract:

In this paper, we obtain the projection of inefficient units in data envelopment analysis (DEA) in the case of stochastic inputs and outputs using a multi-objective programming (MOP) structure. In some problems the inputs may be stochastic while the outputs are deterministic, and vice versa. In such cases, we propose a multi-objective DEA-R model, because in some situations (e.g., when unnecessary and irrational weights in the BCC model reduce the efficiency score) a DMU that is efficient under the DEA-R model is classified as inefficient by the BCC model. In other cases, only ratios of the stochastic data may be available (e.g., the ratio of stochastic inputs to stochastic outputs). Thus, we provide a multi-objective DEA model without explicit outputs and prove that the input-oriented MOP DEA-R model in the constant-returns-to-scale case can be replaced by the MOP DEA model without explicit outputs in the variable-returns-to-scale case, and vice versa. Using interactive methods to solve the proposed model yields a projection corresponding to the viewpoint of the DM and the analyst, which is nearer to reality and more practical. Finally, an application is provided.

Keywords: DEA, MOLP, stochastic, DEA-R

Procedia PDF Downloads 382
34793 Optimization of Plastic Injection Molding Parameters by Altering Gate and Runner of Feeding System

Authors: Ali Ramezani

Abstract:

Balancing the feeding system of plastic injection molding is of overriding importance, as it minimizes product defects such as weld lines, shrinkage, sink marks and warpage. This article presents the difference between optimizing the feeding system in identical multi-cavity molding and in family molding using the Moldflow Plastic Insight software. In this work, the effects of the dimensions, shape, position and type of gates and runners on product quality were studied. The optimization was carried out by analyzing plastic injection molding process parameters, including melt temperature, mold temperature, cooling time, cooling temperature, packing time and packing pressure. It was found that a symmetrical feeding system is the most efficient shape for diminishing defects in identical multi-cavity molding. However, the same conclusion did not hold for family molding, due to the differences in volume, mass, thickness and shape between cavities.

Keywords: balancing feeding system, family molding, multi-cavity, Moldflow, plastic injection

Procedia PDF Downloads 113
34792 Software User Experience Enhancement through Collaborative Design

Authors: Shan Wang, Fahad Alhathal, Daniel Hobson

Abstract:

User-centered design skills play an important role in crafting a positive and intuitive user experience for software applications. Embracing a user-centric design approach involves understanding the needs, preferences, and behaviors of the end-users throughout the design process. This mindset not only enhances the usability of the software but also fosters a deeper connection between the digital product and its users. This paper covers a six-month knowledge exchange collaboration project between an academic institution and an industry partner in 2023, which aimed to improve the user experience of a digital platform used as a knowledge management tool, to understand users' preferences for features, identify sources of frustration, and pinpoint areas for enhancement. The research applied one of the most effective methods of implementing user-centered design, co-design workshops for testing user onboarding experiences, which involve the active participation of users in the design process. More specifically, in January 2023, we organized eight workshops with a diverse group of 11 individuals. Throughout these sessions, we accumulated a total of 11 hours of qualitative data in both video and audio formats. Subsequently, we conducted an analysis of user journeys, identifying common issues and potential areas for improvement. This analysis was pivotal in guiding the knowledge management software team in prioritizing feature enhancements and design improvements. Employing a user-centered design thinking process, we developed a series of graphic design solutions in collaboration with the software management tool company. These solutions targeted onboarding user experiences, workplace interfaces, and interaction design, and some of them were translated into tangible interfaces for the knowledge management tool. By actively involving users in the design process and valuing their input, developers can create products that are not only functional but also resonate with end-users, ultimately leading to greater success in the competitive software landscape. In conclusion, this paper contributes insights into designing onboarding user experiences for software within a co-design approach and presents key theories on leveraging the user-centered design process in software design to enhance overall user experiences.

Keywords: user experiences, co-design, design process, knowledge management tool, user-centered design

Procedia PDF Downloads 39
34791 Urban Rail Transit CBTC Computer Interlocking Subsystem Relying on Multi-Template Pen Point Tracking Algorithm

Authors: Xinli Chen, Xue Su

Abstract:

In the urban rail transit CBTC system, interlocking is considered one of the most basic subsystems; it is characterized by logical complexity and high safety requirements. The development and verification of traditional interlocking subsystems are entirely manual processes and rely too heavily on the designer, which often hides many uncertain factors. To solve this problem, this article builds and verifies the model on the basis of a multi-template pen point tracking algorithm, achieving the main safety attributes and using SCADE for formal verification. Experimental results show that this method helps to improve the quality and efficiency of interlocking software.

Keywords: computer interlocking subsystem, pen point tracking, communication-based train control system, multi-template tip tracking

Procedia PDF Downloads 139
34790 Importance of Hardware Systems and Circuits in Secure Software Development Life Cycle

Authors: Mir Shahriar Emami

Abstract:

Although it is impossible to ensure that a software system is completely secure, developing an acceptably secure software system on a suitable platform is not unreachable. In this paper, we analyze software development life cycle (SDLC) models from the point of view of hardware systems and circuits. To date, SDLC models have paid attention to software security merely from the software perspective. In this paper, we present new features for the SDLC stages that emphasize the role of hardware systems and circuits in developing a secure software system throughout the development stages, a point that has not been considered previously in SDLC models.

Keywords: SDLC, SSDLC, software security, software process engineering, hardware systems and circuits security

Procedia PDF Downloads 243
34789 Multi-Robotic Partial Disassembly Line Balancing with Robotic Efficiency Difference via HNSGA-II

Authors: Tao Yin, Zeqiang Zhang, Wei Liang, Yanqing Zeng, Yu Zhang

Abstract:

To accelerate the remanufacturing of electronic waste products, this study designs a partial disassembly line with multi-robotic stations to dispose of excessive waste effectively. The multi-robotic partial disassembly line is a technical upgrade of the existing manual disassembly line, and balancing optimization can make the disassembly line smoother and more efficient. For partial disassembly line balancing with multi-robotic stations (PDLBMRS), a mixed-integer programming model (MIPM) considering robotic efficiency differences is established to minimize cycle time, energy consumption and hazard index and to compute their optimal global values. Besides, an enhanced NSGA-II algorithm (HNSGA-II) is proposed to optimize PDLBMRS efficiently. Finally, the MIPM and HNSGA-II are applied to an actual mixed disassembly case involving two types of computers; a comparison of the results obtained by Gurobi and HNSGA-II verifies the correctness of the model and the excellent performance of the algorithm, and the obtained Pareto solution set provides multiple options for decision-makers.
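At the heart of NSGA-II, and hence of enhanced variants such as HNSGA-II, is fast non-dominated sorting of the population by its objective vectors (here cycle time, energy consumption and hazard index, all minimized). The sketch below shows that sorting step only; the objective values are hypothetical, and the crowding-distance and genetic operators are omitted.

```python
# Minimal sketch of the fast non-dominated sorting step at the core of NSGA-II,
# shown for three minimization objectives. The objective values are hypothetical.
import numpy as np

def dominates(a, b):
    """True if solution a dominates b (all objectives minimized)."""
    return np.all(a <= b) and np.any(a < b)

def fast_non_dominated_sort(F):
    """Return a list of Pareto fronts (lists of solution indices)."""
    n = len(F)
    S = [[] for _ in range(n)]              # solutions dominated by i
    dominated_count = np.zeros(n, dtype=int)
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(F[i], F[j]):
                S[i].append(j)
            elif dominates(F[j], F[i]):
                dominated_count[i] += 1
        if dominated_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in S[i]:
                dominated_count[j] -= 1
                if dominated_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

# Hypothetical objective values: [cycle time, energy, hazard index]
F = np.array([[30, 5.0, 2], [28, 6.0, 3], [35, 4.0, 1], [30, 5.0, 4], [27, 6.5, 2]])
print(fast_non_dominated_sort(F))
```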

Keywords: waste disposal, disassembly line balancing, multi-robot station, robotic efficiency difference, HNSGA-II

Procedia PDF Downloads 207
34788 Failure Analysis and Verification Using an Integrated Method for Automotive Electric/Electronic Systems

Authors: Lei Chen, Jian Jiao, Tingdi Zhao

Abstract:

Failures of automotive electric/electronic systems, which are universally considered to be safety-critical and software-intensive, may cause catastrophic accidents. Analysis and verification of failures in these kinds of systems is a big challenge as system complexity increases. Model checking is often employed to allow formal verification by ensuring that the system model conforms to specified safety properties; the system-level effects of failures are established, and their effects on system behavior are observed through the formal verification. A hazard analysis technique called Systems-Theoretic Process Analysis is capable of identifying design flaws that may cause potential failure hazards, including software and system design errors and unsafe interactions among multiple system components. This paper provides a concept of how to use model checking integrated with Systems-Theoretic Process Analysis to perform failure analysis and verification of automotive electric/electronic systems. As a result, safety requirements are optimized, and failure propagation paths are found. Finally, an automotive electric/electronic system case study is used to verify the effectiveness and practicability of the method.

Keywords: failure analysis and verification, model checking, system-theoretic process analysis, automotive electric/electronic system

Procedia PDF Downloads 101
34787 The Effect of Improvement Programs in the Mean Time to Repair and in the Mean Time between Failures on Overall Lead Time: A Simulation Using the System Dynamics-Factory Physics Model

Authors: Marcel Heimar Ribeiro Utiyama, Fernanda Caveiro Correia, Dario Henrique Alliprandini

Abstract:

The importance of correctly allocating improvement programs has attracted growing interest in recent years. Because of their limited resources, companies must ensure that their financial resources are directed to the right workstations in order to be effective and to survive strong competition. However, to the best of our knowledge, the literature on the allocation of improvement programs does not analyze this problem in depth when the flow shop process has two capacity constrained resources; this is the research gap studied in this work. The purpose of this work is to identify the best strategy for allocating improvement programs in a flow shop with two capacity constrained resources. Data were collected from a flow shop process with seven workstations in an industrial control and automation company, which processes 13,690 units per month on average. The data were used to conduct a simulation with the System Dynamics-Factory Physics model. The main variables considered, due to their importance for lead time reduction, were the mean time between failures and the mean time to repair, and lead time reduction was the output measure of the simulations. Ten different strategies were created: (i) focused time to repair improvement, (ii) focused time between failures improvement, (iii) distributed time to repair improvement, (iv) distributed time between failures improvement, (v) focused time to repair and time between failures improvement, (vi) distributed time to repair and time between failures improvement, (vii) hybrid time to repair improvement, (viii) hybrid time between failures improvement, (ix) time to repair improvement directed at the two capacity constrained resources, and (x) time between failures improvement directed at the two capacity constrained resources. The ten strategies tested are variations of the three main strategies for improvement programs, named focused, distributed and hybrid. Several comparisons of the effect of the ten strategies on lead time reduction were performed. The results indicated that, for the flow shop analyzed, the focused strategies delivered the best results; when it is not possible to make a large investment in the capacity constrained resources, companies should use hybrid approaches. An important contribution to the academy is the hybrid approach, which proposes a new way to direct improvement efforts. In addition, the study of a flow shop with two strongly capacity constrained resources (more than 95% utilization) is an important contribution to the literature, as is the allocation problem with two CCRs and the possibility of floating capacity constrained resources. The results provide the best improvement strategies considering the different strategies for allocating improvement programs and different positions of the capacity constrained resources. Finally, both the hybrid time to repair improvement and hybrid time between failures improvement strategies delivered better results than the respective distributed strategies. The main limitations of this study concern the specific flow shop analyzed; future work can investigate different flow shop configurations, such as a varying number of workstations, a different number of products, or different positions of the two capacity constrained resources.
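To show how mean time to repair and mean time between failures feed into lead time, the sketch below uses the standard Factory Physics (Hopp and Spearman) approximations: availability inflates the effective process time and its variability, and the Kingman (VUT) equation turns utilization and variability into queue time at each station. It is a simplified stand-in for the System Dynamics-Factory Physics model used in the study, and every parameter value is hypothetical.

```python
# Minimal Factory Physics-style sketch (not the authors' model) of how MTTR and
# MTBF improvements propagate to lead time in a serial line with unreliable
# stations. All parameter values are hypothetical.
import numpy as np

def station_cycle_time(ra, ca2, t0, c02, mf, mr, cr2=1.0):
    """Return (cycle time, departure SCV) for one station subject to failures.

    ra: arrival rate, ca2: arrival SCV, t0: natural process time,
    c02: natural process SCV, mf: mean time between failures,
    mr: mean time to repair, cr2: SCV of repair times.
    """
    A = mf / (mf + mr)                              # availability
    te = t0 / A                                     # effective process time
    ce2 = c02 + (1 + cr2) * A * (1 - A) * mr / t0   # effective process SCV
    u = ra * te                                     # utilization (must be < 1)
    ctq = ((ca2 + ce2) / 2) * (u / (1 - u)) * te    # Kingman (VUT) queue time
    cd2 = (1 - u**2) * ca2 + u**2 * ce2             # departure SCV linking equation
    return ctq + te, cd2

def line_lead_time(stations, ra=0.42, ca2=1.0):
    """Sum station cycle times along a serial line; stations is a list of dicts."""
    total = 0.0
    for st in stations:
        ct, ca2 = station_cycle_time(ra, ca2, **st)
        total += ct
    return total

base = [dict(t0=1.8, c02=0.5, mf=200.0, mr=20.0),   # capacity constrained resource 1
        dict(t0=1.0, c02=0.5, mf=300.0, mr=10.0),
        dict(t0=1.9, c02=0.5, mf=150.0, mr=25.0)]   # capacity constrained resource 2

focused = [dict(st) for st in base]
focused[0]["mr"] = 10.0                             # focused MTTR improvement on CCR 1

print("baseline lead time :", round(line_lead_time(base), 2))
print("focused MTTR impr. :", round(line_lead_time(focused), 2))
```

Comparing the two printed lead times indicates how a focused MTTR improvement at one capacity constrained resource propagates to the whole line; the distributed and hybrid strategies can be evaluated by editing the station parameters in the same way.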

Keywords: allocation of improvement programs, capacity constrained resource, hybrid strategy, lead time, mean time to repair, mean time between failures

Procedia PDF Downloads 100
34786 Multi-Criteria Inventory Classification Process Based on Logical Analysis of Data

Authors: Diana López-Soto, Soumaya Yacout, Francisco Ángel-Bello

Abstract:

Although inventories are considered stocks of money sitting on shelves, they are needed in order to secure constant and continuous production. Therefore, companies need control over the amount of inventory in order to find the balance between excess and shortage. Classifying items according to criteria such as price, usage rate and lead time before arrival allows a company to concentrate its inventory investment according to a ranking or priority of items, which makes the decision-making process for inventory management easier and more justifiable. The purpose of this paper is to present a new approach for the classification of new items based on already existing criteria. This approach, called Logical Analysis of Data (LAD), is used in this paper to assist the process of ABC item classification based on multiple criteria. LAD is a data mining technique based on Boolean theory that is used for pattern recognition; it has been tested in medicine, industry, credit risk analysis, and engineering with remarkable results. An application to ABC inventory classification is presented for the first time, and the results are compared with those obtained using the well-known AHP and ANN techniques. The results show that LAD achieves very good classification accuracy.

Keywords: ABC multi-criteria inventory classification, inventory management, multi-class LAD model, multi-criteria classification

Procedia PDF Downloads 854
34785 Design and Implementation of LabVIEW Based Relay Autotuning Controller for Level Setup

Authors: Manoj M. Sarode, Sharad P. Jadhav, Mukesh D. Patil, Pushparaj S. Suryawanshi

Abstract:

Even though the PID controller is widely used in industrial processes, tuning of the PID parameters is not easy; it is time consuming and requires experts. Another drawback of the PID controller is that the process dynamics may change over time, for example due to variation of the process load or normal wear and tear. To compensate for changes in process behavior over time, expert users are required to recalibrate the PID gains. Implementing model-based controllers usually requires a process model; identifying a process model is a time-consuming job with no guarantee of model accuracy, and if the identified model is not accurate, the performance of the controller may degrade. Model-based controllers are also quite expensive, and the whole implementation procedure is sometimes tedious. To eliminate such issues, an autotuning PID controller becomes a vital element. A software-based relay feedback autotuning controller proves to be an efficient, upgradable and maintenance-free controller, and with relay feedback autotuning the PID parameters can be obtained in a very short span of time. This paper presents the real-time implementation of a LabVIEW-based relay feedback autotuning PID controller. It has been successfully developed and implemented to control the level of a laboratory setup, and its performance, analyzed for different setpoints, was found satisfactory.
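The relay feedback idea (Astrom-Hagglund) can be illustrated with a short simulation: an ideal relay replaces the controller, the sustained oscillation it induces gives the ultimate gain Ku = 4d/(pi*a) and ultimate period Pu, and classic Ziegler-Nichols rules convert these into PID settings. The process below is a hypothetical first-order-plus-dead-time level model, not the laboratory setup used in the paper.

```python
# Minimal sketch of relay-feedback autotuning on a simulated first-order-plus-
# dead-time level process. Process parameters and relay amplitude are hypothetical.
import numpy as np

# Hypothetical FOPDT process K / (tau*s + 1) with dead time theta, Euler-discretized
K, tau, theta, dt = 2.0, 10.0, 2.0, 0.01
n_steps = int(200 / dt)
delay_buf = int(theta / dt)

d = 1.0                                   # relay amplitude
y = 0.0
u_hist = [0.0] * delay_buf                # dead-time buffer
ys = []
for k in range(n_steps):
    u = d if y < 0 else -d                # ideal relay around the setpoint (0)
    u_hist.append(u)
    u_delayed = u_hist.pop(0)
    y += dt * (-y + K * u_delayed) / tau  # Euler step of the first-order lag
    ys.append(y)

# Measure the sustained oscillation over the second half of the run
tail = np.array(ys[len(ys) // 2:])
a = (tail.max() - tail.min()) / 2         # oscillation amplitude
zero_up = np.where((tail[:-1] < 0) & (tail[1:] >= 0))[0]
Pu = np.mean(np.diff(zero_up)) * dt       # ultimate period from upward zero crossings

Ku = 4 * d / (np.pi * a)                  # ultimate gain from the relay test
Kp, Ti, Td = 0.6 * Ku, 0.5 * Pu, 0.125 * Pu   # classic Ziegler-Nichols PID rules
print(f"a={a:.3f}  Pu={Pu:.2f}s  Ku={Ku:.3f}  ->  Kp={Kp:.3f}, Ti={Ti:.2f}, Td={Td:.2f}")
```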

Keywords: autotuning, PID, liquid level control, recalibrate, labview, controller

Procedia PDF Downloads 371
34784 Protein Extraction by Enzyme-Assisted Extraction followed by Alkaline Extraction from Red Seaweed Eucheuma denticulatum (Spinosum) Used in Carrageenan Production

Authors: Alireza Naseri, Susan L. Holdt, Charlotte Jacobsen

Abstract:

In 2014, global carrageenan production was 60,000 tons, with a value of US$ 626 million. From this figure, it can be estimated that the total dried seaweed consumption for this production was at least 300,000 tons per year. The protein content of these types of seaweed is 5-25%. If just half of this protein could be extracted, 18,000 tons per year of a high-value protein product would be obtained. The overall aim of this study was to develop a technology that ensures further utilization of seaweed that at present serves only as a raw material for single-extraction carrageenan production. More specifically, proteins should be extracted from the seaweed either before or after extraction of the carrageenan, with a focus on maintaining the quality of the carrageenan as the main product. Different mechanical, chemical and enzymatic technologies were evaluated. The optimized process was implemented at lab scale and, based on its results, new experiments were performed at pilot and larger scale. In order to calculate the efficiency of the new upstream multi-extraction process, the protein content was tested before and after extraction. After this step, the carrageenan was extracted, and the carrageenan content and the effect of extraction on yield were evaluated. The functionality and quality of the carrageenan were measured based on rheological parameters. The results showed that with the new multi-extraction process (patent submitted) it is possible to extract almost 50% of the total protein without any negative impact on carrageenan quality. Moreover, compared to the routine carrageenan extraction process, the new multi-extraction process could increase the yield of carrageenan, and rheological properties such as the gel strength of the final carrageenan showed a promising improvement. The extracted protein has initially been screened as a plant protein source in typical food applications. Further work will be carried out to improve properties such as color, solubility, and taste.

Keywords: carrageenan, extraction, protein, seaweed

Procedia PDF Downloads 255
34783 Neural Network Based Approach of Software Maintenance Prediction for Laboratory Information System

Authors: Vuk M. Popovic, Dunja D. Popovic

Abstract:

The software maintenance phase starts once a software project has been developed and delivered; after that, any modification to it corresponds to maintenance. Software maintenance involves modifications that keep a software product usable in a changed or changing environment, correct discovered faults, and improve performance or maintainability. Software maintenance and the management of software maintenance are recognized as two of the most important and most expensive processes in the life of a software product. This research bases the prediction of maintenance on risk and working time evaluations and uses them as data sets for training neural networks. The aim of this paper is to provide support to project maintenance managers: they will be able to pass the issues planned for the next software service patch to the experts for risk and working time evaluation, and afterwards feed all the data to neural networks in order to obtain a software maintenance prediction. This process leads to a more accurate prediction of the working hours needed for the software service patch, which eventually leads to better budget planning for software maintenance projects.
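As a minimal sketch of the prediction step, the code below trains a small neural network regressor that maps expert risk and working time evaluations of planned service-patch issues to the hours actually spent. The feature layout and the synthetic data are hypothetical; the real model would be trained on historical laboratory information system maintenance records.

```python
# Minimal sketch: a small MLP regressor predicting maintenance effort (hours)
# from expert risk and time evaluations. Features and data are hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
n = 400
# Hypothetical features per issue: [risk score 1-5, estimated hours, modules touched]
X = np.column_stack([rng.integers(1, 6, n),
                     rng.uniform(1, 40, n),
                     rng.integers(1, 10, n)]).astype(float)
# Synthetic "actual hours": the estimate inflated by risk and scope, plus noise
y = X[:, 1] * (1 + 0.15 * X[:, 0]) + 1.5 * X[:, 2] + rng.normal(0, 2, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                                   random_state=0))
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("MAE on held-out issues:", round(mean_absolute_error(y_test, pred), 2), "hours")
```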

Keywords: laboratory information system, maintenance engineering, neural networks, software maintenance, software maintenance costs

Procedia PDF Downloads 329
34782 Identifying Mitigation Plans in Reducing Usability Risk Using Delphi Method

Authors: Jayaletchumi T. Sambantha Moorthy, Suhaimi bin Ibrahim, Mohd Naz’ri Mahrin

Abstract:

Most quality models define usability as a significant factor that leads to improved product acceptability, increased user satisfaction, improved product reliability, and financial benefits for companies. Usability is also the factor that best balances the technical and human aspects of a software product, which is an important aspect in defining quality during the software development process. A usability risk can be defined as a potential factor whereby a chosen action or activity may lead to a possible loss or an undesirable outcome; this could impact the usability of a software product, contribute to negative user experiences and cause a possible software product failure. Hence, it is important to mitigate and reduce usability risks in the software development process itself. By managing the usability risks involved in the software development process, failure of the software product can be reduced. Therefore, this research uses the Delphi method to identify mitigation plans for reducing potential usability risks. The Delphi study was conducted with seven experts from the fields of risk management and software development.

Keywords: usability, usability risk, risk management, risk mitigation, delphi study

Procedia PDF Downloads 446
34781 The Cost of Innovation in Software Development Projects

Authors: Mihai Liviu Despa

Abstract:

The paper tackles the topic of determining the cost of innovation in software development projects. Innovation can be achieved either in a planned or an unplanned manner; the paper addresses scenarios where innovation is planned for. As a starting point, an innovative software development project is analyzed and depicted step by step as it was implemented, from inception to delivery. Costs that are specific to innovation in software development are isolated based on the author's personal experience in managing the above-mentioned project. The innovation cost components identified by the author are then validated through open discussions with software development professionals and project managers in LinkedIn groups; in order to receive relevant feedback, only groups that focus on software development and innovation management were targeted. Additional innovation cost components suggested by these professionals and project managers are also considered. Based on the identified cost components, an indicator is built that formalizes the process of determining the cost of innovation in a software development project. The indicator aggregates all the innovation cost components identified in the research process, and the process of calculating each cost component is described. Conclusions are formulated and new related research topics are submitted for debate.

Keywords: innovation cost, IT project management, software development, innovation management

Procedia PDF Downloads 434
34780 A Software Engineering Methodology for Developing Secure Obfuscated Software

Authors: Carlos Gonzalez, Ernesto Linan

Abstract:

We propose a methodology to reconcile two apparently contradictory goals in software development: producing secure obfuscated software and producing well-engineered software. In our methodology, the system designers first define the security level required for the software; we distinguish four types of attackers: casual attackers, hackers, institutional attackers, and government attackers. Depending on the level of threat, the proposed methodology uses five or six teams to accomplish the task. One software engineering team, one or two software obfuscation teams, and a compiler team develop and compile the secure obfuscated software; a code breakers team tests the results of the previous teams to see whether the software can be broken at the required security level; and an intrusion analysis team analyzes the results of the code breakers team and proposes solutions to the development teams to prevent the detected intrusions. We also present an analytical model to show that our methodology is not only easier to use but also provides an economical way of producing secure obfuscated software.

Keywords: development methodology, obfuscated software, secure software development, software engineering

Procedia PDF Downloads 230
34779 Back to Basics: Redefining Quality Measurement for Hybrid Software Development Organizations

Authors: Satya Pradhan, Venky Nanniyur

Abstract:

As the software industry transitions from a license-based model to a subscription-based Software-as-a-Service (SaaS) model, many software development groups are using a hybrid development model that incorporates Agile and Waterfall methodologies in different parts of the organization. The traditional metrics used for measuring software quality in the Waterfall or Agile paradigms do not apply to this new hybrid methodology. In addition, to respond to higher quality demands from customers and to gain a competitive advantage in the market, many companies are starting to prioritize quality as a strategic differentiator. As a result, quality metrics are included in decision-making activities all the way up to the executive level, including board of directors reviews. This paper presents the key challenges associated with measuring software quality in organizations using the hybrid development model. We introduce a framework called Prevention-Inspection-Evaluation-Removal (PIER) to provide a comprehensive metric definition for hybrid organizations. The framework includes quality measurements, quality enforcement, and quality decision points at different organizational levels and project milestones. The metrics framework defined in this paper is being used for all Cisco Systems products deployed on customer premises. We present several field metrics for one product portfolio (enterprise networking) to show the effectiveness of the proposed measurement system. As the results show, this metrics framework has significantly improved in-process defect management as well as field quality.

Keywords: quality management system, quality metrics framework, quality metrics, agile, waterfall, hybrid development system

Procedia PDF Downloads 151
34778 Finite Element Modelling and Analysis of Human Knee Joint

Authors: R. Ranjith Kumar

Abstract:

Computer modeling and simulation of human movement play an important role in sports and rehabilitation. Accurate modeling and analysis of the human knee joint is complex because of its complicated structure, whose geometry is not easy to represent with a solid model. In this project, surface reconstruction of the human knee joint is carried out from a set of CT scan images using 3D Slicer, an open-source software package. From this surface reconstruction, triangular meshes are created on the reconstructed surface using MeshLab (another open-source package). The final triangular mesh model is imported into SolidWorks, a 3D mechanical CAD modeling package, and then into ABAQUS, a finite element analysis package, for analyzing the knee joint. The results obtained are encouraging and provide an accurate way of modeling and analyzing biological parts without human intervention.

Keywords: solid works, CATIA, Pro-e, CAD

Procedia PDF Downloads 104
34777 Choosing the Right Projects with Multi-Criteria Decision Making to Ensure the Sustainability of the Projects

Authors: Saniye Çeşmecioğlu

Abstract:

The importance of project sustainability and success has increased significantly due to the proliferation of external environmental factors that have reduced project resilience in recent times. The primary way to forestall project failure is to ensure long-term viability through the strategic selection of projects, by creating a judicious project selection framework within the organization. Decision-makers require precise decision contexts (models) that conform to the company's business objectives and sustainability expectations during the project selection process. Establishing a rational model for project selection enables organizations to create a distinctive and objective framework for the selection process. Additionally, for the optimal implementation of this decision-making model, it is crucial to establish a Project Management Office (PMO) team and a Project Steering Committee within the organizational structure to oversee the framework. These teams enable updating of the project selection criteria and weights in response to changing conditions, ensure alignment with the company's business goals, and facilitate the selection of potentially viable projects. This paper presents a multi-criteria decision model for selecting project sustainability and project success criteria that ensure timely project completion and retention. The model was developed using MACBETH (Measuring Attractiveness by a Categorical Based Evaluation Technique) and was based on broadcasting companies' expectations. The results of this study provide a model that supports the objective selection of appropriate projects by utilizing project selection and sustainability criteria along with their respective weights. Additionally, the study offers suggestions that may prove helpful in future endeavors.

Keywords: project portfolio management, project selection, multi-criteria decision making, project sustainability and success criteria, MACBETH

Procedia PDF Downloads 44
34776 Study of the Process of Climate Change According to Data Simulation Using LARS-WG Software during 2010-2030: Case Study of Semnan Province

Authors: Leila Rashidian

Abstract:

The temperature rise on Earth has had harmful effects on the Earth's surface and has led to changes in precipitation patterns around the world. The present research aimed to study the process of climate change by simulating future data and comparing these parameters with the current situation at the studied stations in Semnan province, namely Garmsar, Shahrood and Semnan. For this purpose, the LARS-WG software, the HADCM3 model and the A2 scenario were used for the 2010-2030 period. In this model, climatic parameters such as maximum and minimum temperature, precipitation and radiation were used on a daily basis. The results indicate that there will be a 4.4% increase in precipitation in Semnan province compared with the observed data and, in general, a 1.9% increase in temperature. This temperature rise has a significant impact on precipitation patterns, and most precipitation will fall as rain (torrential in some cases). According to the results, from west to east the country will experience a greater temperature rise and will be warmer.

Keywords: climate change, Semnan province, LARS-WG model, climate parameters, HADCM3 model

Procedia PDF Downloads 230
34775 Designing a Model to Increase the Flow of Circular Economy Startups Using a Systemic and Multi-Generational Approach

Authors: Luís Marques, João Rocha, Andreia Fernandes, Maria Moura, Cláudia Caseiro, Filipa Figueiredo, João Nunes

Abstract:

The implementation of circularity strategies other than recycling, such as reducing the amount of raw material and reusing or sharing existing products, remains marginal. The European Commission has announced that the transition towards a more circular economy could lead to the net creation of about 700,000 jobs in Europe by 2030, through additional labour demand from recycling plants, repair services and other circular activities. Efforts to create new circular business models built on fully circular processes, as opposed to linear ones, have increased considerably in recent years. In order to create a societal circular economy transition model, it is necessary to include innovative solutions, and startups play a key role; however, early-stage startups based on new circular business models often struggle to create enough impact. The StartUp Zero Program designs a model and approach to increase the flow of startups in the circular economy field, focusing on a systemic decision analysis and multi-generational approach. It relies on multi-criteria decision analysis to support a decision-making tool, using a combination of the Analytical Hierarchy Process and Multi-Attribute Value Theory methods. We define principles, criteria and indicators for evaluating startup prerogatives, quantifying the evaluation process in a single result. Additionally, this entrepreneurship program, spanning 16 months, involved more than 2,400 young people aged 14 to 23 in more than 200 interaction activities.
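As an illustration of the weighting step such a framework can use, the sketch below derives criterion weights from a Saaty-style pairwise comparison matrix with the Analytical Hierarchy Process and checks the consistency ratio. The criteria and judgments are hypothetical examples rather than the ones used in the StartUp Zero Program, and the Multi-Attribute Value Theory aggregation is omitted.

```python
# Minimal AHP sketch: criterion weights from the principal eigenvector of a
# pairwise comparison matrix, plus Saaty's consistency ratio. The criteria and
# judgments are hypothetical.
import numpy as np

criteria = ["circularity impact", "team capability", "market potential"]
# Saaty-scale pairwise comparisons (A[i, j] = importance of i relative to j)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # normalized priority weights

n = len(A)
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)                # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # random index (Saaty)
CR = CI / RI                                   # acceptable if below ~0.10

for name, w in zip(criteria, weights):
    print(f"{name:20s} {w:.3f}")
print("consistency ratio:", round(CR, 3))
```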

Keywords: circular economy, entrepreneurship, startups, multi-criteria decision analysis

Procedia PDF Downloads 75
34774 R Software for Parameter Estimation of Spatio-Temporal Model

Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan

Abstract:

In this paper, we propose an application package to estimate the parameters of a spatiotemporal model based on multivariate time series analysis using the open-source R software. We build packages mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model; GSTAR is a combination of time series and spatial models whose parameters vary per location. We use the Ordinary Least Squares (OLS) method to fit the model and the Mean Absolute Percentage Error (MAPE) to assess the fit to real spatiotemporal phenomena. As case studies, we use oil production data from a volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. The R software is very user-friendly: it makes calculations easier and processes the data accurately and quickly. A limitation is that the R scripts built for estimating the parameters of the spatiotemporal GSTAR model are still restricted to stationary time series models. Therefore, the R program under Windows can be further developed for both theoretical studies and applications.
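To make the estimation step concrete, here is a minimal sketch of GSTAR(1;1) parameter estimation by OLS together with a MAPE check. It is written in Python purely for illustration (the package described above is implemented in R), and the spatial weight matrix and the simulated data are hypothetical.

```python
# Minimal GSTAR(1;1) OLS sketch. The model (around a mean level mu) is
#   Z_t - mu = Phi0 (Z_{t-1} - mu) + Phi1 W (Z_{t-1} - mu) + e_t,
# with Phi0, Phi1 diagonal, so each location i has its own pair (phi0_i, phi1_i).
# The weight matrix W and the simulated data are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
n_loc, n_t = 3, 200
W = np.array([[0.0, 0.5, 0.5],            # row-standardized spatial weight matrix
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

# Simulate data around a mean level from known diagonal parameters
mu = 50.0
phi0_true, phi1_true = np.array([0.5, 0.3, 0.4]), np.array([0.2, 0.3, 0.1])
Z = np.full((n_t, n_loc), mu)
for t in range(1, n_t):
    dev = Z[t - 1] - mu
    Z[t] = mu + phi0_true * dev + phi1_true * (W @ dev) + rng.normal(0, 0.5, n_loc)

# OLS estimation location by location on the mean-centered series
mu_hat = Z.mean(axis=0)
D = Z - mu_hat
phi0_hat, phi1_hat = np.zeros(n_loc), np.zeros(n_loc)
for i in range(n_loc):
    y = D[1:, i]
    X = np.column_stack([D[:-1, i], (D[:-1] @ W.T)[:, i]])
    phi0_hat[i], phi1_hat[i] = np.linalg.lstsq(X, y, rcond=None)[0]

# In-sample fit measured by MAPE, as in the abstract
fitted = mu_hat + D[:-1] * phi0_hat + (D[:-1] @ W.T) * phi1_hat
mape = np.mean(np.abs((Z[1:] - fitted) / Z[1:])) * 100
print("phi0 estimates:", phi0_hat.round(3))
print("phi1 estimates:", phi1_hat.round(3))
print("in-sample MAPE (%):", round(mape, 3))
```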

Keywords: GSTAR Model, MAPE, OLS method, oil production, R software

Procedia PDF Downloads 220