Search results for: software/hardware codesign
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2319

2049 Suitability of Black Box Approaches for the Reliability Assessment of Component-Based Software

Authors: Anjushi Verma, Tirthankar Gayen

Abstract:

Although reliability is an important attribute of quality, especially for mission-critical systems, there still does not exist any versatile model for the reliability assessment of component-based software. The existing Black Box models make various assumptions which may not always be realistic and may be quite contrary to the actual behaviour of the software. They focus on observing how the system behaves without considering the structure of the system, the components composing it, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessments made with these models is high. Although there are some models based on the operation profile, it can be extremely difficult to obtain the exact operation profile for a given operation. This paper discusses the drawbacks, deficiencies and limitations of Black Box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.

Keywords: Black Box, faults, failure, software reliability.

2048 Wireless Sensor Networks for Swiftlet Farms Monitoring

Authors: Al-Khalid Othman, Wan A. Wan Zainal Abidin, Kee M. Lee, Hushairi Zen, Tengku. M. A. Zulcaffle, Kuryati Kipli

Abstract:

This paper provides an in-depth study of a Wireless Sensor Network (WSN) application to monitor and control the swiftlet habitat. A complete system is designed and developed, including the hardware design of the nodes, Graphical User Interface (GUI) software, the sensor network, and interconnectivity for remote data access and management. A system architecture is proposed to address the requirements of habitat monitoring. Such an application-driven design identifies important areas of further work in data sampling, communications and networking. For this monitoring system, MTS400 sensor nodes, IRIS and MICAz radio transceivers, and a USB-interfaced gateway base station from Crossbow (Xbow) Technology are employed. The GUI of the monitoring system is written in Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) along with the Xbow Technology drivers provided by National Instruments. As a result, the monitoring system is capable of collecting data and presenting it in both tables and waveform charts for further analysis. The system can also send notification messages by email, provided Internet connectivity is available, whenever habitat changes occur at remote sites (swiftlet farms). Other functions implemented in the system are a database for record keeping and management purposes and remote access through the Internet using the LogMeIn software. Finally, this research concludes that a WSN for monitoring swiftlet habitat can be used effectively to monitor and manage the swiftlet farming industry in Sarawak.
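
The notify-on-change behaviour described above can be illustrated with a minimal Python sketch. This is not the paper's LabVIEW implementation; the thresholds, SMTP host and addresses are assumptions made purely for illustration.

    # Illustrative sketch only: the paper's system is built in LabVIEW with Crossbow
    # hardware; this fragment merely shows the notify-on-change idea with assumed
    # thresholds and a hypothetical SMTP account.
    import smtplib
    from email.message import EmailMessage

    TEMP_RANGE = (26.0, 30.0)      # assumed acceptable temperature range (deg C)
    HUMIDITY_RANGE = (80.0, 95.0)  # assumed acceptable relative humidity range (%)

    def out_of_range(value, low_high):
        low, high = low_high
        return value < low or value > high

    def notify_if_needed(node_id, temperature, humidity,
                         smtp_host="smtp.example.com", sender="farm@example.com",
                         recipient="owner@example.com"):
        """Send an e-mail alert when a sensor reading leaves its expected range."""
        problems = []
        if out_of_range(temperature, TEMP_RANGE):
            problems.append(f"temperature={temperature:.1f} C")
        if out_of_range(humidity, HUMIDITY_RANGE):
            problems.append(f"humidity={humidity:.1f} %")
        if not problems:
            return False
        msg = EmailMessage()
        msg["Subject"] = f"Swiftlet farm alert: node {node_id}"
        msg["From"] = sender
        msg["To"] = recipient
        msg.set_content("Out-of-range readings: " + ", ".join(problems))
        with smtplib.SMTP(smtp_host) as server:
            server.send_message(msg)
        return True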

Keywords: Swiftlet, WSN, Habitat Monitoring, Networking.

2047 Evolving Digital Circuits for Early Stage Breast Cancer Detection Using Cartesian Genetic Programming

Authors: Zahra Khalid, Gul Muhammad Khan, Arbab Masood Ahmad

Abstract:

Cartesian Genetic Programming (CGP) is explored to design an optimal circuit capable of early-stage breast cancer detection. CGP is used to evolve simple multiplexer circuits for detecting malignancy in Fine Needle Aspiration (FNA) samples of the breast. The data set used is extracted from the Wisconsin Breast Cancer Database (WBCD). A range of experiments was performed, each with a different set of network parameters. The best evolved network detected malignancy with an accuracy of 99.14%, which is higher than that produced by most contemporary non-linear techniques, which are computationally more expensive than the proposed system. The evolved network comprises simple multiplexers and, being a digital circuit, can be implemented easily in hardware without further complications or loss of accuracy.
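
To make the multiplexer-based encoding concrete, the following Python sketch evaluates a tiny CGP-style genotype of 2:1 multiplexers on binary inputs. The genotype, the feature binarization and the output interpretation are invented for illustration; this is not the authors' evolved circuit.

    # Illustrative CGP-style evaluation only; node connections and features are
    # made up, not the authors' evolved network.
    def eval_mux_cgp(inputs, genotype, output_index):
        """inputs: list of 0/1 primary inputs (e.g. thresholded FNA features).
        genotype: list of (sel, a, b) triples; each index refers to a primary
        input or to the output of an earlier node (CGP feed-forward constraint).
        A node implements a 2:1 multiplexer: out = b if signals[sel] else a."""
        signals = list(inputs)
        for sel, a, b in genotype:
            signals.append(signals[b] if signals[sel] else signals[a])
        return signals[output_index]

    # Tiny usage example with 4 binary features and 2 multiplexer nodes.
    features = [1, 0, 1, 1]
    genotype = [(0, 1, 2),   # node 4: feature0 ? feature2 : feature1
                (3, 4, 0)]   # node 5: feature3 ? feature0 : node4
    print(eval_mux_cgp(features, genotype, output_index=5))  # -> 1 ("malignant" in this toy encoding)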

Keywords: Breast cancer detection, cartesian genetic programming, evolvable hardware, fine needle aspiration (FNA).

2046 Requirements Gathering for Improved Software Usability and the Potential for Usage-Centred Design

Authors: Kholod J. Alotaibi, Andrew M. Gravell

Abstract:

Usability is an important software quality that is often neglected at the design stage. Although methods exist to incorporate elements of usability engineering, there is a need for more balanced, usability-focused methods that can enhance the experience of software usability for users. In this regard, the potential of Usage-Centred Design (UgCD) is explored with respect to requirements gathering and is shown to lead to high software usability, besides other benefits. It achieves this through its focus on usage: defining essential use cases, conducting task modeling, encouraging user collaboration, refining requirements, and so on. The requirements gathering process in UgCD is described in detail.

Keywords: Requirements gathering, Usability, Usage-Centred Design.

2045 SBTAR: An Enhancing Method for Automated Test Tools

Authors: Noppakit Nawalikit, Pattarasinee Bhattarakosol

Abstract:

Since software testing has become an important part of software development for improving software quality, many automation tools have been created to help test software functionality. These tools have a few usability issues. One is that the result log generated by the tools contains so much useless information that the tester cannot use it to communicate efficiently, or the result log requires a specific application to open. This paper introduces a new method, SBTAR, which improves the usability of automated test tools with respect to the result log. In practice, the capability of the IBM Rational Robot tool is used to create a customized function that generates a new result log format containing useful information that is faster and easier to understand than the original result log generated by the tool. This result log also increases flexibility, as it can be opened with Microsoft Word or WordPad.
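
The general idea of condensing a verbose tool log into a plain-text report can be sketched as follows. This is purely illustrative: SBTAR itself is a customized function inside IBM Rational Robot, and the semicolon-separated input format assumed here is an invention for the example.

    # Purely illustrative: converts a verbose raw result log into a short
    # plain-text report readable in Word/WordPad. The input format
    # (status;case;message per line) is an assumption, not SBTAR's format.
    def summarize_result_log(raw_path, report_path):
        passed, failed = [], []
        with open(raw_path, encoding="utf-8") as raw:
            for line in raw:
                parts = line.strip().split(";", 2)
                if len(parts) < 2:
                    continue  # skip noise the tester does not need
                status, case = parts[0].upper(), parts[1]
                message = parts[2] if len(parts) == 3 else ""
                (passed if status == "PASS" else failed).append((case, message))
        with open(report_path, "w", encoding="utf-8") as report:
            report.write(f"Test run summary: {len(passed)} passed, {len(failed)} failed\n\n")
            for case, message in failed:
                report.write(f"FAILED  {case}  {message}\n")
        return len(failed)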

Keywords: Software Automation Testing, Automated test tool, IBM Rational Robot.

2044 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools

Authors: M. Kaya, M. Eris

Abstract:

Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases the number of crimes committed in the digital environment. Therefore, digital forensics, which deals with the various crimes committed in the digital environment, has become an important research topic. It is within the scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. Many software and hardware tools have been developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data contained in the digital evidence that matches specified criteria and presenting it to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner. In addition, because the overall result depends on the examiner's experience, relevant evidence may be overlooked in some cases. In this study, a hash-based matching and digital evidence evaluation method is proposed, which aims to automatically classify the evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human errors.
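
A minimal sketch of hash-based block matching is shown below. The block size and the source of the known hash list are assumptions for illustration; the paper's exact block segmentation and hash algorithm are not reproduced here.

    # Sketch of hash-based block matching against a known hash list.
    import hashlib

    BLOCK_SIZE = 4096  # assumed block size in bytes

    def block_hashes(path, block_size=BLOCK_SIZE):
        """Yield (offset, SHA-256 hex digest) for each fixed-size block of a file
        extracted from a digital evidence image."""
        with open(path, "rb") as f:
            offset = 0
            while True:
                block = f.read(block_size)
                if not block:
                    break
                yield offset, hashlib.sha256(block).hexdigest()
                offset += len(block)

    def match_against_hash_list(path, known_hashes):
        """Return offsets of blocks whose hashes appear in a known crime-related
        hash list, pointing the examiner directly at relevant regions."""
        return [off for off, digest in block_hashes(path) if digest in known_hashes]

    # Usage: known_hashes would be loaded from a reference database of
    # crime-related material; here it is just a placeholder.
    # hits = match_against_hash_list("evidence.dd", known_hashes=set(...))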

Keywords: Block matching, digital evidence, hash list.

2043 Development of Software Complex for Digitalization of Enterprise Activities

Authors: G. T. Balakayeva, K. K. Nurlybayeva, M. B. Zhanuzakov

Abstract:

In the proposed work, we have developed software and designed a software architecture for the implementation of enterprise business processes. The proposed software has a multi-level architecture built with a domain-specific tool. The developed architecture guarantees the availability, reliability and security of the system and the implementation of the business processes, which are the basis for effective enterprise management. Automating business processes and the algorithmic stages of an enterprise, developing optimal algorithms for managing activities, control and monitoring, reducing risks and improving results help organizations achieve strategic goals quickly and efficiently. The software described in this article can connect to the corporate information system via two methods: a desktop client and a web client. The desktop client program connects to the information system on the company's work PCs over a local network by making requests to the application server. Outside the organization, the user can interact with the information system via a web browser, which acts as a web client and connects to a web server. The developed software consists of several integrated modules that share resources and interact with each other through an API. The following technology stack was used during development: Node.js, React.js, MongoDB, Nginx, cloud technologies, Python.

Keywords: Algorithms, document processing, automation, integrated modules, software architecture, software design, information system.

2042 A Quantitative Approach to Strategic Design of Component-Based Business Process Models

Authors: Eakong Atiptamvaree, Twittie Senivongse

Abstract:

A new paradigm for software design and development models software by its business process, translates the model into a process execution language, and has it run by a supporting execution engine. This process-oriented paradigm promotes modeling of software by less technical users or business analysts, as well as rapid development. Since business process models may be shared by different organizations, and sometimes even by different business domains, it is interesting to apply a technique used in traditional software component technology to design reusable business processes. This paper discusses an approach that applies a technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, with the aim that the process components can be reused in different process-based software models. The approach is quantitative because the quality of a process component design is measured from technical features of the process components. The approach is also strategic because the measured quality is evaluated against business-oriented component management goals. A software tool has been developed to measure how good a process component design is according to the required managerial goals and in comparison with other designs. We also discuss how we benefit from reusable process components.
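
The idea of measuring design quality against weighted managerial goals can be sketched as a simple weighted score. The metric names, weights and normalization below are assumptions for illustration and do not reproduce the paper's measurement model.

    # Hedged sketch of scoring a process-component design against weighted
    # component-management goals; metric names and weights are invented.
    def score_design(metrics, goal_weights):
        """metrics: dict of normalized technical measures in [0, 1]
        (e.g. cohesion, low coupling, granularity fit).
        goal_weights: dict mapping the same names to weights encoding the
        management goals; weights are normalized to sum to 1 here."""
        total_weight = sum(goal_weights.values())
        return sum(metrics[name] * weight / total_weight
                   for name, weight in goal_weights.items())

    design_a = {"cohesion": 0.8, "low_coupling": 0.6, "granularity_fit": 0.7}
    design_b = {"cohesion": 0.6, "low_coupling": 0.9, "granularity_fit": 0.8}
    goals = {"cohesion": 2, "low_coupling": 3, "granularity_fit": 1}  # reuse-centred goals

    print(score_design(design_a, goals), score_design(design_b, goals))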

Keywords: Business process model, process component, component management goals, measurement

2041 Hardware Implementation of Stack-Based Replacement Algorithms

Authors: Hassan Ghasemzadeh, Sepideh Mazrouee, Hassan Goldani Moghaddam, Hamid Shojaei, Mohammad Reza Kakoee

Abstract:

Block replacement algorithms for increasing the hit ratio have been used extensively in cache memory management. Among basic replacement schemes, LRU and FIFO have been shown to be effective replacement algorithms in terms of hit rates. In this paper, we introduce a flexible stack-based circuit which can be employed in hardware implementations of both the LRU and FIFO policies. We propose a simple and efficient architecture such that stack-based replacement algorithms can be implemented without the drawbacks of the traditional architectures. The stack is modular and hence a set of stack rows can be cascaded depending on the number of blocks in each cache set. Our circuit can be implemented in conjunction with the cache controller and static/dynamic memories to form a cache system. Experimental results show that our proposed circuit provides an average improvement of 26% in storage bits and that its maximum operating frequency is increased by a factor of two.
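
The stack behaviour that such a circuit realizes can be modelled in a few lines of Python: the bottom of the stack always holds the victim, LRU promotes a block to the top on every hit, and FIFO reorders only on insertion. This is a software illustration of the policy, not the hardware design.

    # Software model of the stack behaviour (illustration only).
    class StackReplacement:
        def __init__(self, ways, policy="LRU"):
            self.ways = ways          # number of blocks in the cache set
            self.policy = policy      # "LRU" or "FIFO"
            self.stack = []           # index 0 = most recently inserted/used

        def access(self, tag):
            """Return the evicted tag on a miss, or None on a hit."""
            if tag in self.stack:                       # hit
                if self.policy == "LRU":
                    self.stack.remove(tag)
                    self.stack.insert(0, tag)           # promote to top
                return None
            victim = None
            if len(self.stack) == self.ways:            # set full: evict bottom
                victim = self.stack.pop()
            self.stack.insert(0, tag)                   # new block enters at top
            return victim

    lru = StackReplacement(ways=2, policy="LRU")
    for t in ["A", "B", "A", "C"]:
        print(t, "->", lru.access(t))   # "C" evicts "B" under LRU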

Keywords: Cache Memory, Replacement Algorithms, Least Recently Used Algorithm, First In First Out Algorithm.

2040 Factors of Effective Business Software Systems Development and Enhancement Projects Work Effort Estimation

Authors: Beata Czarnacka-Chrobot

Abstract:

The majority of Business Software Systems (BSS) Development and Enhancement Projects (D&EP) fail to meet the criteria of their effectiveness, which leads to considerable financial losses. One of the fundamental reasons for such projects' exceptionally low success rate is improperly derived estimates of their costs and time. In the case of BSS D&EP these attributes are determined by the work effort, while reliable and objective effort estimation still appears to be a great challenge to software engineering. This paper is therefore aimed at presenting the most important synthetic conclusions coming from the author's own studies concerning the main factors of effective BSS D&EP work effort estimation. Thanks to rational investment decisions made on the basis of reliable and objective criteria it is possible to reduce losses caused not only by abandoned projects but also by the large scale of time and cost overruns in BSS D&EP execution.
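
As a minimal sketch of the kind of benchmark-based estimate these factors feed into, effort can be derived from functional size and a productivity rate taken from benchmarking data. The numbers and the risk buffer below are placeholders, not values reported by the author or by any benchmarking repository.

    # Hedged sketch: effort from functional size times a benchmark productivity
    # rate (person-hours per function point). All figures are placeholders.
    def estimate_effort(function_points, hours_per_fp, risk_buffer=0.15):
        """Return (nominal, buffered) effort in person-hours."""
        nominal = function_points * hours_per_fp
        return nominal, nominal * (1.0 + risk_buffer)

    nominal, buffered = estimate_effort(function_points=450, hours_per_fp=8.0)
    print(f"nominal: {nominal:.0f} h, with risk buffer: {buffered:.0f} h")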

Keywords: Benchmarking data, business software systems development and enhancement projects, effort estimation, software engineering economics, software functional size measurement.

2039 Blueprinting of Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems

Authors: Bassam Istanbouli

Abstract:

With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise changes will be extremely expensive, slow and accompanied by many combinatorial effects. Those combinatorial effects impact the whole organizational structure from a management, financial, documentation and logistics perspective, and especially from the perspective of the Enterprise Resource Planning (ERP) information system. By applying the normalized systems concept/theory to segments of the supply chain, we believe these effects can be minimized, especially at the time of launching an organization-wide software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement an existing ERP software for its business needs, and if its business processes are normalized and modular, then most probably this will yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of the business processes in a software implementation project. If the blueprints created are normalized, then the software developers and configurators will use those modular blueprints to map them into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring and/or implementing a software system for an organization using two methods: the Software Development Lifecycle method (SDLC) and the Accelerated SAP implementation method (ASAP). Both methods start with the customer requirements, then blueprinting of the business processes, and finally mapping those processes into a software system. Since those requirements and processes are the starting point of the implementation process, normalizing those processes will result in normalized software.

Keywords: Blueprint, ERP, SDLC, Modular.

2038 Prediction of Reusability of Object Oriented Software Systems using Clustering Approach

Authors: Anju Shri, Parvinder S. Sandhu, Vikas Gupta, Sanyam Anand

Abstract:

In the literature, there are metrics for identifying the quality of reusable components, but a framework that makes use of these metrics to precisely predict the reusability of software components still needs to be worked out. If identified in the design phase, or even in the coding phase, these reusability metrics can help reduce rework by improving the quality of reuse of the software component and hence improve productivity due to the probabilistic increase in the reuse level. As the CK metric suite is the most widely used set of metrics for extracting structural features of object-oriented (OO) software, in this study the tuned CK metric suite, i.e. WMC, DIT, NOC, CBO and LCOM, is used to obtain the structural analysis of OO-based software components. An algorithm has been proposed in which the tuned metric values of an OO software component are given as inputs to a K-Means clustering system, and a decision tree is formed with 10-fold cross-validation of the data to evaluate the component in terms of a linguistic reusability value. The developed reusability model has produced high-precision results, as desired.
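
The clustering step can be illustrated with scikit-learn on vectors of tuned CK metric values, as in the minimal sketch below. The metric vectors and the number of clusters are invented for illustration and are not the study's data or parameters.

    # Minimal sketch of the clustering step, assuming scikit-learn is available.
    import numpy as np
    from sklearn.cluster import KMeans

    # Each row: tuned [WMC, DIT, NOC, CBO, LCOM] values for one OO component.
    components = np.array([
        [12, 2, 1,  4, 10],
        [45, 5, 0, 18, 60],
        [ 8, 1, 3,  2,  5],
        [30, 4, 1, 12, 40],
    ])

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(components)
    print(kmeans.labels_)   # cluster id per component; clusters are then mapped
                            # to linguistic reusability values (e.g. low/high)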

Keywords: CK-Metric, Decision Tree, K-Means, Reusability.

2037 A Survey of Field Programmable Gate Array-Based Convolutional Neural Network Accelerators

Authors: Wei Zhang

Abstract:

With the rapid development of deep learning, neural networks and deep learning algorithms play a significant role in various practical applications. Due to their high accuracy and good performance, Convolutional Neural Networks (CNNs) in particular have become a research hotspot in the past few years. However, the size of the networks is becoming increasingly large due to the demands of practical applications, which poses a significant challenge to constructing high-performance implementations of deep learning neural networks. Meanwhile, many of these application scenarios also place strict requirements on the performance and power consumption of hardware devices. Therefore, it is particularly critical to choose a suitable computing platform for hardware acceleration of CNNs. This article surveys the recent advances in Field Programmable Gate Array (FPGA)-based acceleration of CNNs. Various designs and implementations of accelerators based on FPGAs with different devices and network models are reviewed and compared with their Graphics Processing Unit (GPU), Application Specific Integrated Circuit (ASIC) and Digital Signal Processor (DSP) counterparts to present our own critical analysis and comments. Finally, we discuss different perspectives on these acceleration and optimization methods on FPGA platforms to further explore the opportunities and challenges for future research, and we give a prospect for the future development of FPGA-based accelerators.

Keywords: Deep learning, field programmable gate array, FPGA, hardware acceleration, convolutional neural networks, CNN.

2036 Life Time Based Analysis of MAC Protocols of Wireless Ad Hoc Networks in WSN Applications

Authors: R. Alageswaran, S. Selvakumar, P. Neelamegam

Abstract:

Wireless Sensor Networks (WSNs) are emerging because of developments in wireless communication technology and the miniaturization of hardware. A WSN consists of a large number of low-cost, low-power, multifunctional sensor nodes that monitor physical conditions such as temperature, sound, vibration, pressure, motion, etc. The MAC protocol used in sensor networks must be energy efficient and should aim at conserving energy during operation. In this paper, focusing on the analysis of MAC protocols of wireless ad hoc networks as applied to WSNs, simulation experiments were conducted in the Global Mobile Simulator (GloMoSim) software. The number of packets sent by regular nodes and received by the sink node under different deployment strategies, the total energy spent, and the network lifetime have been chosen as the metrics for comparison. From the simulation results, it is evident that the IEEE 802.11 protocol performs better than the CSMA and MACA protocols.

Keywords: CSMA, DCF, MACA, TelosB

2035 Using Design Sprint for Software Engineering Undergraduate Student Projects: A Method Paper

Authors: Sobhani U. Pilapitiya, Tharanga Peiris

Abstract:

Software engineering curriculums generally include industry-based practices such as project-based learning (PBL), which mainly focuses on efficient and innovative product development. These approaches can be tailored and used in project-based modules in software engineering curriculums. However, there are very limited attempts in this area, especially related to the Sri Lankan context. This paper describes a tailored pedagogical approach, and the results of using design sprints, that can be used for project-based modules in software engineering (SE) curriculums. A controlled group of second-year software engineering students was selected for the study. The study results indicate that all of the students agreed that the design sprint approach is effective in group-based projects, and 83% of students stated that it minimized rework compared to traditional project approaches. The tailored process was effective, easy to implement and produced the desired results at the end of the session while providing students an enjoyable experience.

Keywords: design sprint, project-based learning, software engineering, curriculum

2034 A Robust Software for Advanced Analysis of Space Steel Frames

Authors: Viet-Hung Truong, Seung-Eock Kim

Abstract:

This paper presents a robust software package for practical advanced analysis of space steel framed structures. The pre- and post-processors of the presented software package are coded in the C++ programming language, while the solver is written in FORTRAN. A user-friendly graphical interface is developed to facilitate the modeling process and the interpretation of results. The solver employs stability functions for capturing second-order effects in order to minimize modeling and computational time. Both plastic-hinge and fiber-hinge beam-column elements are available in the presented software. The generalized displacement control method is adopted to solve the nonlinear equilibrium equations.

Keywords: Advanced analysis, beam-column, fiber-hinge, plastic hinge, steel frame.

2033 Server Virtualization Using User Behavior Model Focus on Provisioning Concept

Authors: D. Prangchumpol

Abstract:

Server provisioning is one of the most attractive topics in virtualization systems. Virtualization is a method of running multiple independent virtual operating systems on a single physical computer. It is a way of maximizing physical resources and thereby the investment in hardware. Additionally, it can help to consolidate servers, improve hardware utilization and reduce the consumption of power and physical space in the data center. However, the management of heterogeneous workloads, especially with respect to the resource utilization of the servers, or so-called provisioning, becomes a challenge. In this paper, a new concept for managing workloads based on user behavior is presented. The experimental results show that user behaviors differ for each type of service workload and over time. Understanding user behaviors may improve the efficiency of management under the provisioning concept. This preliminary study may be an approach to improving the management of data centers running heterogeneous workloads for provisioning in virtualization systems.

Keywords: association rule, provisioning, server virtualization.

2032 FPGA Implementation of Adaptive Clock Recovery for TDMoIP Systems

Authors: Semih Demir, Anil Celebi

Abstract:

Circuit-switched networks, widely used until the end of the 20th century, have been transformed into packet-switched networks. Time Division Multiplexing over Internet Protocol (TDMoIP) is a system that enables Time Division Multiplexing (TDM) traffic to be carried over packet-switched networks (PSN). In TDMoIP systems, the devices that send TDM data to the PSN and those that receive it from the network must operate with the same clock frequency. In this study, the aim was to implement the clock synchronization process in Field Programmable Gate Array (FPGA) chips using timing information attached to the packets received from the PSN. The designed hardware is verified using datasets obtained for different carrier types and by comparing the results with a software model. Field tests are also performed using a real-time TDMoIP system.
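
The adaptive clock recovery idea can be sketched in software: compare the timestamps carried in the packets against the local clock and steer the local frequency with a small proportional correction. The control law, gain and nominal rate below are simplified assumptions for illustration; they do not reproduce the FPGA design or its reference model.

    # Simplified software model of adaptive clock recovery (illustration only).
    def recover_clock(packet_timestamps, arrival_times, nominal_hz=8000.0, gain=0.05):
        """packet_timestamps: sender timestamps in ticks of the nominal clock.
        arrival_times: local arrival times in seconds.
        Returns the list of estimated local frequencies after each packet."""
        freq = nominal_hz
        estimates = []
        t0, ts0 = arrival_times[0], packet_timestamps[0]
        for ts, t in zip(packet_timestamps[1:], arrival_times[1:]):
            expected_ticks = (t - t0) * freq          # ticks our clock would count
            actual_ticks = ts - ts0                   # ticks the sender counted
            error = actual_ticks - expected_ticks
            freq += gain * error / (t - t0)           # proportional correction (Hz)
            estimates.append(freq)
        return estimates

    # Example: 8 kHz nominal clock, packets every 5 ms, sender running 50 ppm fast.
    ts = [i * 40 for i in range(200)]                 # sender ticks (40 ticks / 5 ms)
    arr = [i * 0.005 / 1.00005 for i in range(200)]   # our clock sees slightly less time
    print(recover_clock(ts, arr)[-1])                 # converges near 8000.4 Hz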

Keywords: Clock recovery on TDMoIP, FPGA, MATLAB reference model, clock synchronization.

2031 Analytics Model in a Telehealth Center Based on Cloud Computing and Local Storage

Authors: L. Ramirez, E. Guillén, J. Sánchez

Abstract:

Some of the main goals of telecare, such as monitoring, treatment and telediagnosis, are achieved by integrating applications with specific appliances. In order to achieve a coherent model that integrates software, hardware, and healthcare systems, different telehealth models with Internet of Things (IoT), cloud computing, artificial intelligence, etc. have been implemented, and their advantages are still under analysis. In this paper, we propose an integrated model based on an IoT architecture and a cloud computing telehealth center. An analytics module is presented as a solution to support the diagnosis of some diseases. Specific features are then compared with recently deployed conventional models in telemedicine. The main advantages of this model are the ability to control the security and privacy of patient information and the optimization of the processing and acquisition of clinical parameters according to technical characteristics.

Keywords: Analytics, telemedicine, internet of things, cloud computing.

2030 A Survey on Usage and Diffusion of Project Risk Management Techniques and Software Tools in the Construction Industry

Authors: Muhammad Jamaluddin Thaheem, Alberto De Marco

Abstract:

The area of Project Risk Management (PRM) has been extensively researched, and the utilization of various tools and techniques for managing risk in several industries has been sufficiently reported. Formal and systematic PRM practices have been made available for the construction industry. Based on such a body of knowledge, this paper seeks to establish a global picture of PRM practices and approaches with the help of a survey looking into the usage of PRM techniques and the diffusion of software tools, their level of maturity, and their usefulness in the construction sector. Results show that, despite existing techniques and tools, their usage is limited: software tools are used only by a minority of respondents, and their cost is one of the largest hurdles to adoption. Finally, the paper provides some important guidelines for future research regarding quantitative risk analysis techniques and suggestions for the development and improvement of PRM software tools.

Keywords: Construction industry, Project risk management, Software tools, Survey study.

2029 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to be able to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal the remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
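
The signal-handling idea can be sketched in Python: install handlers that, on a fault or on demand, dump a report with stack traces of all threads without attaching a debugger. The actual DAQ Debugger is integrated into the C++ iFDAQ processes and its reports are far richer; the handler, file names and signals (POSIX) below are assumptions for illustration.

    # Hedged sketch of signal-based report generation (not the iFDAQ tool itself).
    import datetime
    import faulthandler
    import signal
    import sys
    import traceback

    def write_report(signum, frame):
        """On a signal, dump a small report with stack traces of all threads."""
        name = signal.Signals(signum).name
        with open(f"daq_report_{name}.txt", "w") as report:
            report.write(f"signal {name} at {datetime.datetime.now().isoformat()}\n\n")
            for thread_id, stack in sys._current_frames().items():
                report.write(f"thread {thread_id}\n")
                report.write("".join(traceback.format_stack(stack)))
                report.write("\n")

    # Dump on request (SIGUSR1) and on fatal errors, without stopping the process.
    signal.signal(signal.SIGUSR1, write_report)
    faulthandler.enable(file=open("daq_fatal.txt", "w"))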

Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.

2028 FPGA Hardware Implementation and Evaluation of a Micro-Network Architecture for Multi-Core Systems

Authors: Yahia Salah, Med Lassaad Kaddachi, Rached Tourki

Abstract:

This paper presents the design, implementation and evaluation of a micro-network, or Network-on-Chip (NoC), based on a generic pipeline router architecture. The router is designed to efficiently support the traffic generated by multimedia applications on embedded multi-core systems. It employs a simple routing mechanism and implements a round-robin scheduling strategy to resolve output port contentions and minimize latency. Virtual channel flow control is applied to avoid the head-of-line blocking problem and enhance performance in the NoC. The hardware design of the router architecture has been implemented at the register transfer level; its functionality is evaluated for the two-dimensional Mesh/Torus topology, and performance results are derived with the ModelSim simulator and the Xilinx ISE 9.2i synthesis tool. An example multi-core image processing system utilizing the NoC structure has been implemented and validated to demonstrate the capability of the proposed micro-network architecture. To reduce the complexity of the image compression and decompression architecture, the system uses an image processing algorithm based on the classical discrete cosine transform with an efficient zonal processing approach. The experimental results confirm that both the proposed image compression scheme and the NoC architecture can achieve reasonable image quality with lower processing time.
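
Round-robin arbitration of output port contention can be illustrated with a short software model: requests are granted starting from the position after the last winner, which bounds waiting time. This is a sketch of the scheduling policy only, not the register-transfer-level design.

    # Illustrative round-robin arbiter for one NoC output port.
    class RoundRobinArbiter:
        def __init__(self, n_inputs):
            self.n = n_inputs
            self.last = self.n - 1          # so that input 0 has priority first

        def grant(self, requests):
            """requests: list of booleans, one per input/virtual channel.
            Returns the granted input index, or None if nobody requests."""
            for i in range(1, self.n + 1):
                candidate = (self.last + i) % self.n
                if requests[candidate]:
                    self.last = candidate   # winner gets lowest priority next round
                    return candidate
            return None

    arb = RoundRobinArbiter(4)
    print(arb.grant([True, False, True, False]))  # -> 0
    print(arb.grant([True, False, True, False]))  # -> 2 (fairness: 0 just won)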

Keywords: Generic Pipeline Network-on-Chip Router Architecture, JPEG Image Compression, FPGA Hardware Implementation, Performance Evaluation.

2027 A Reusability Evaluation Model for OO-Based Software Components

Authors: Parvinder S. Sandhu, Hardeep Singh

Abstract:

The requirement to improve software productivity has promoted research on software metric technology. There are metrics for identifying the quality of reusable components, but the function that makes use of these metrics to determine the reusability of software components is still not clear. If identified in the design phase, or even in the coding phase, these metrics can help reduce rework by improving the quality of reuse of the component and hence improve productivity due to the probabilistic increase in the reuse level. The CK metric suite is the most widely used set of metrics for object-oriented (OO) software; we critically analyzed the CK metrics, tried to remove the inconsistencies, and devised a framework of metrics to obtain the structural analysis of OO-based software components. A neural network can learn new relationships from new input data and can be used to refine fuzzy rules to create a fuzzy adaptive system. Hence, a neuro-fuzzy inference engine can be used to evaluate the reusability of an OO-based component using its structural attributes as inputs. In this paper, an algorithm has been proposed in which the tuned WMC, DIT, NOC, CBO and LCOM values of the OO software component are given as inputs to the neuro-fuzzy system, and the output is obtained in terms of reusability. The developed reusability model has produced high-precision results, as expected by the human experts.

Keywords: CK-Metric, ID3, Neuro-fuzzy, Reusability.

2026 A Functional Framework for Large Scale Application Software Systems

Authors: Han-hua Lu, Shun-yi Zhang, Yong Zheng, Ya-shi Wang, Li-juan Min

Abstract:

From the perspective of systems of systems (SoS) and emergent behaviors, this paper describes large-scale application software systems and proposes framework methods to further depict the systems' functional and non-functional characteristics. In addition, this paper specifically discusses some functional frameworks. Finally, the framework's applications to system disintegration, system architecture and stable intermediate forms in the building, deployment and maintenance of large-scale software applications are additionally dealt with.

Keywords: application software system, framework methods, system of systems, emergent behaviors

2025 Evaluation Framework for Agent-Oriented Methodologies

Authors: Zohreh O. Akbari, Ahmad Faraahi

Abstract:

Many agent-oriented software engineering methodologies have been proposed for software development; however, their application is still limited due to their lack of maturity. Evaluating the strengths and weaknesses of these methodologies plays an important role in improving them and in developing new, stronger methodologies. This paper presents an evaluation framework for agent-oriented methodologies which addresses six major areas: concepts, notation, process, pragmatics, support for software engineering and marketability. The framework is then used to evaluate the Gaia methodology to identify its strengths and weaknesses, and to demonstrate the framework's ability to promote agent-oriented methodologies by detecting their weaknesses in detail.

Keywords: Agent-Oriented Software Engineering, Evaluation Framework, Methodology.

2024 A New Approach for Assertions Processing during Assertion-Based Software Testing

Authors: Ali M. Alakeel

Abstract:

Assertion-based software testing has been shown to be a promising tool for generating test cases that reveal program faults. Because the number of assertions may be very large for industry-size programs, one of the main concerns regarding the applicability of assertion-based testing is the amount of search time required to explore a large number of assertions. This paper presents a new approach for assertion exploration during the process of assertion-based software testing. Our initial experiments with the proposed approach show that the performance of assertion-based testing may be improved, therefore making this approach more efficient when applied to programs with a large number of assertions.
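
The exploration problem can be illustrated with a simplified stand-in: candidate inputs are generated and each still-unviolated assertion receives a bounded share of the search budget, so one hard assertion cannot consume the whole run. This bounded-budget loop is an illustration of the problem, not the paper's actual exploration strategy.

    # Simplified stand-in for assertion exploration during assertion-based testing.
    import random

    def explore_assertions(assertions, input_generator, budget_per_assertion=200):
        """assertions: list of (name, violated_by) where violated_by(x) is True
        when input x violates the assertion. Returns {name: violating_input}."""
        revealed = {}
        for name, violated_by in assertions:
            for _ in range(budget_per_assertion):    # bounded budget per assertion
                x = input_generator()
                if violated_by(x):
                    revealed[name] = x                # fault-revealing test case found
                    break
        return revealed

    asserts = [("non_negative", lambda x: x < 0),
               ("below_limit", lambda x: x > 95)]
    found = explore_assertions(asserts, lambda: random.randint(-100, 100))
    print(found)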

Keywords: Software testing, assertion-based testing, program assertions.

2023 Requirements Management as a Competitive Factor in the IT Mid-Tier Business Concerning the Implementation of ERP Software

Authors: Oliver Grün

Abstract:

The success of IT projects concerning the implementation of business application software strongly depends on the application of efficient requirements management in order to understand the business requirements and to realize them in IT. In fact, however, the potential of requirements management is not fully exploited by small and medium-sized enterprises (SMEs) of the IT sector. To work out recommendations for action and, furthermore, a possible solution allowing this potential to be better exploited, a scientific research project shall examine which problems occur and from which causes. At the same time, the storage of knowledge from requirements management and its later reuse are important for achieving sustainable improvements in the competitiveness of IT SMEs. Requirements engineering is one of the most important topics in product management for software in order to optimize the success of the software product.

Keywords: ERP, Requirements Management

2022 Lessons from Applying XP Methodology to Business Requirements Engineering in Developing Countries Context

Authors: Olugbara O.O., Adebiyi A.A.

Abstract:

Most standard software development methodologies are often not applied to software projects in many developing countries of the world. The approach generally practiced is close to what eXtreme Programming (XP) promotes: just keep coding and testing as the requirements evolve. XP is an agile software development methodology that has an inherent capability for improving the efficiency of Business Software Development (BSD). XP can facilitate the Business-to-Development (B2D) relationship due to its customer-oriented advocacy. From a practitioner's point of view, we applied XP to BSD, and the results show that customer involvement has a positive impact on productivity but can also frustrate the success of the project. In an effort to promote software engineering practice in developing countries of Africa, we present the experiment performed, the lessons learned, the problems encountered and the solutions adopted in applying the XP methodology to BSD.

Keywords: Requirements engineering, Requirements elicitation, Extreme programming, Mobile Work force

2021 A Review on Cloud Computing and Internet of Things

Authors: Sahar S. Tabrizi, Dogan Ibrahim

Abstract:

Cloud Computing is a convenient model for on-demand networks that uses shared pools of virtual configurable computing resources, such as servers, networks, storage devices, applications, etc. The cloud serves as an environment for companies and organizations to use infrastructure resources without making any purchases and they can access such resources wherever and whenever they need. Cloud computing is useful to overcome a number of problems in various Information Technology (IT) domains such as Geographical Information Systems (GIS), Scientific Research, e-Governance Systems, Decision Support Systems, ERP, Web Application Development, Mobile Technology, etc. Companies can use Cloud Computing services to store large amounts of data that can be accessed from anywhere on Earth and also at any time. Such services are rented by the client companies where the actual rent depends upon the amount of data stored on the cloud and also the amount of processing power used in a given time period. The resources offered by the cloud service companies are flexible in the sense that the user companies can increase or decrease their storage requirements or the processing power requirements at any time, thus minimizing the overall rental cost of the service they receive. In addition, the Cloud Computing service providers offer fast processors and applications software that can be shared by their clients. This is especially important for small companies with limited budgets which cannot afford to purchase their own expensive hardware and software. This paper is an overview of the Cloud Computing, giving its types, principles, advantages, and disadvantages. In addition, the paper gives some example engineering applications of Cloud Computing and makes suggestions for possible future applications in the field of engineering.

Keywords: Cloud computing, cloud services, IaaS, PaaS, SaaS, IoT.

2020 Real-Time Image Encryption Using a 3D Discrete Dual Chaotic Cipher

Authors: M. F. Haroun, T. A. Gulliver

Abstract:

In this paper, an encryption algorithm is proposed for real-time image encryption. The scheme employs a dual chaotic generator based on a three dimensional (3D) discrete Lorenz attractor. Encryption is achieved using non-autonomous modulation where the data is injected into the dynamics of the master chaotic generator. The second generator is used to permute the dynamics of the master generator using the same approach. Since the data stream can be regarded as a random source, the resulting permutations of the generator dynamics greatly increase the security of the transmitted signal. In addition, a technique is proposed to mitigate the error propagation due to the finite precision arithmetic of digital hardware. In particular, truncation and rounding errors are eliminated by employing an integer representation of the data which can easily be implemented. The simple hardware architecture of the algorithm makes it suitable for secure real-time applications.
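
A greatly simplified illustration of a chaos-based XOR keystream is given below: a forward-Euler discretization of the Lorenz equations drives a byte stream that is XORed with the data. This is not the paper's dual-generator, non-autonomous scheme, and the state here stays in floating point (the paper advocates an integer representation to avoid precision mismatch); the parameters, step size and scaling are assumptions.

    # Simplified chaos-based XOR keystream sketch (illustration only).
    SCALE = 1 << 16                      # 16 fractional bits used when quantizing
    SIGMA, RHO, BETA = 10, 28, 8 / 3     # classical Lorenz parameters
    DT = 0.005                           # assumed Euler step

    def lorenz_keystream(length, x=0.1, y=0.0, z=0.0):
        stream = bytearray()
        for _ in range(length):
            dx = SIGMA * (y - x)
            dy = x * (RHO - z) - y
            dz = x * y - BETA * z
            x, y, z = x + DT * dx, y + DT * dy, z + DT * dz
            # quantize the state only to derive one keystream byte
            stream.append((int(x * SCALE) ^ int(y * SCALE) ^ int(z * SCALE)) & 0xFF)
        return bytes(stream)

    def xor_cipher(data: bytes, key_state=(0.1, 0.0, 0.0)) -> bytes:
        ks = lorenz_keystream(len(data), *key_state)
        return bytes(d ^ k for d, k in zip(data, ks))

    pixels = bytes(range(16))                    # stand-in for image data
    encrypted = xor_cipher(pixels)
    assert xor_cipher(encrypted) == pixels       # XOR stream cipher is symmetric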

Keywords: Chaotic systems, image encryption, 3D Lorenz attractor, non-autonomous modulation, FPGA.
