Search results for: software security verification validation and test
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16430

16340 A Reduced Distributed State Space for Modular Petri Nets

Authors: Sawsen Khlifa, Chiheb Ameur Abid, Belhassan Zouari

Abstract:

Modular verification approaches have been widely attempted to cope with the well-known state explosion problem. This paper deals with the modular verification of modular Petri nets. We propose a reduced version of the modular state space of a given modular Petri net. The new structure allows the creation of smaller modular graphs, each of which captures the behavior of the corresponding module and outlines some global information. Hence, this version helps to overcome the explosion problem and to use less memory space. In this condensed structure, the verification of some generic properties concerning one module is limited to the exploration of its associated graph.

Keywords: distributed systems, modular verification, Petri nets, state space explosion

Procedia PDF Downloads 81
16339 A Novel Software Model for Enhancement of System Performance and Security through an Optimal Placement of PMU and FACTS

Authors: R. Kiran, B. R. Lakshmikantha, R. V. Parimala

Abstract:

Secure operation of power systems requires monitoring of the system operating conditions. Phasor measurement units (PMUs) are devices that use synchronized signals from GPS satellites to provide phasor information on voltages and currents at a given substation. The optimal locations for the PMUs must be determined in order to avoid redundant use of PMUs. The objective of this paper is to make the system observable using a minimum number of PMUs and to implement stability software in a 220 kV grid for on-line estimation of the power system transfer capability, based on voltage and thermal limitations, and for security monitoring. This software utilizes State Estimator (SE) and synchrophasor PMU data sets to determine the power system operational margin under normal and contingency conditions. It improves the security of the transmission system by continuously monitoring the operational margin, expressed in MW or in bus voltage angles, and alarms the operator if the margin violates a pre-defined threshold.
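
A minimal sketch of how the minimum-PMU observability problem described above can be posed, assuming the standard rule that a bus is observable if it or one of its neighbours hosts a PMU; the 6-bus adjacency list is purely illustrative and not taken from the paper.

from itertools import combinations

# Hypothetical 6-bus network: bus -> neighbouring buses (illustrative only).
adjacency = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 5], 4: [2, 5, 6], 5: [3, 4], 6: [4]}
buses = list(adjacency)

def observable(pmu_buses):
    # A bus is observed if it carries a PMU or is adjacent to one.
    covered = set()
    for b in pmu_buses:
        covered.add(b)
        covered.update(adjacency[b])
    return covered == set(buses)

# Smallest PMU set found by exhaustive search (fine for small systems;
# larger grids would use integer programming or heuristics instead).
for k in range(1, len(buses) + 1):
    placements = [c for c in combinations(buses, k) if observable(c)]
    if placements:
        print("minimum number of PMUs:", k, "e.g. at buses", placements[0])
        break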

Keywords: state estimator (SE), flexible ac transmission systems (FACTS), optimal location, phasor measurement units (PMU)

Procedia PDF Downloads 383
16338 Application of Learning Media Based Augmented Reality on Molecular Geometry Concept

Authors: F. S. Irwansyah, I. Farida, Y. Maulana

Abstract:

Studying chemistry requires understanding at three levels: macroscopic, submicroscopic, and symbolic. A lack of emphasis on the submicroscopic level leaves the understanding of chemical concepts incomplete, owing to the limited availability of tools capable of visualizing submicroscopic concepts. The purpose of this study is to describe the stages of developing augmented reality (AR) learning media for the concept of molecular geometry and to analyze the results of its feasibility tests. This research uses the Research and Development (R&D) method, which produces an AR learning media product on the molecular geometry concept and tests the effectiveness of the product. The research stages include concept and learning indicator analysis, design development, validation, feasibility assessment, and limited testing. The validation and limited-trial stages aim to obtain feedback in the form of assessments, suggestions, and improvements on the learning aspect, material substance aspect, visual communication aspect, and software engineering aspect, as well as on the media's feasibility for its intended use in learning. The overall feasibility test yielded calculated r values of 0.7-0.9, interpreted as high feasibility, whereas the limited trial produced eligibility percentages averaging 70.83-92.5%. These results indicate that the AR learning media product on the concept of molecular geometry deserves to be used as a learning resource.

Keywords: android, augmented reality, chemical learning, geometry

Procedia PDF Downloads 178
16337 The Verification Study of Computational Fluid Dynamics Model of the Aircraft Piston Engine

Authors: Lukasz Grabowski, Konrad Pietrykowski, Michal Bialy

Abstract:

This paper presents the results of research to verify the combustion model of the Asz62-IR aircraft piston engine. This engine was modernized and a new type of ignition system was developed for it. Due to the high cost of experiments on a nine-cylinder 1,000 hp aircraft engine, a simulation technique should be applied. Therefore, using computational fluid dynamics (CFD) to simulate the combustion process is a reasonable solution. Accordingly, tests for varied ignition advance angles were carried out and the optimal value to be tested on a real engine was specified. The CFD model was created with the AVL Fire software. The engine in this research has two spark plugs for each cylinder, and the ignition advance angles had to be set up separately for each spark plug. The results of the simulation were verified by comparing the pressure in the cylinder. The courses of the indicated pressure of the engine mounted on a test stand were compared. The real pressure course was measured with an optical sensor mounted in a specially drilled hole between the valves. It was an OPTRAND pressure sensor, designed especially for engine combustion research. The indicated pressure was measured in cylinder no. 3 while the engine was running at take-off power, loaded by a propeller on a special test bench. The verification of the CFD simulation results was based on the results of these test bench studies. The simulated pressure course obtained is within the measurement error of the optical sensor; this error is 1% and reflects the hysteresis and nonlinearity of the sensor. The real indicated pressure measured in the cylinder and the pressure taken from the simulation were compared, so the verification of the CFD simulation based on pressure can be considered a success. The next step was to investigate the impact of changing the ignition advance timing of spark plugs 1 and 2 on the combustion process. Shifting the ignition timing between spark plugs 1 and 2 results in longer and more uneven burning of the mixture. The optimal point in terms of indicated power occurs when ignition is simultaneous for both spark plugs, but slightly separated ignitions are used to ensure that ignition occurs at all engine speeds and loads. This should be confirmed by a bench experiment on the engine. Nevertheless, this simulation research enabled the determination of the optimal ignition advance angle to be implemented in the ignition control system. This knowledge allows the ignition point of the two spark plugs to be set up so as to achieve the highest possible power.
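
The verification step described above, comparing simulated and measured in-cylinder pressure against the 1% sensor error band, can be sketched roughly as follows; the pressure traces are placeholders, not data from the study.

import numpy as np

# Placeholder traces: indicated pressure vs. crank angle (bar), measured and simulated.
crank_angle = np.linspace(-180, 180, 721)
p_measured = 20 + 40 * np.exp(-((crank_angle - 10) / 25) ** 2)    # illustrative only
p_simulated = 20 + 40.3 * np.exp(-((crank_angle - 11) / 25) ** 2)

# Relative deviation of the simulation from the measurement.
rel_error = np.abs(p_simulated - p_measured) / np.max(p_measured)

sensor_error = 0.01  # 1% hysteresis/nonlinearity band of the optical sensor
within_band = np.max(rel_error) <= sensor_error
print(f"max relative deviation: {np.max(rel_error):.3%}, within sensor band: {within_band}")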

Keywords: CFD model, combustion, engine, simulation

Procedia PDF Downloads 327
16336 Using the Weakest Precondition to Achieve Self-Stabilization in Critical Networks

Authors: Antonio Pizzarello, Oris Friesen

Abstract:

Networks, such as the electric power grid, must demonstrate exemplary performance and integrity. Integrity depends on the quality of both the system design model and the deployed software. Integrity of the deployed software is key, for both the original versions and the many versions that appear through numerous maintenance activities. Current software engineering technology and practice do not produce adequate integrity. Distributed systems utilize networks in which each node is an independent computer system. The connections between nodes are realized via a network that is normally redundantly connected, to guarantee the presence of a path between two nodes in case some branch fails. Furthermore, at each node there is software which may fail. Self-stabilizing protocols are usually present that recognize failure in the network and perform a repair action that will bring the node back to a correct state. These protocols, first introduced by E. W. Dijkstra, are currently present in almost all Ethernets. Super-stabilizing protocols, capable of reacting to a change in the network topology due to the removal or addition of a branch, are less common but are theoretically defined and available. This paper describes how to use the Software Integrity Assessment (SIA) methodology to analyze self-stabilizing software. SIA is based on the UNITY formalism for parallel and distributed programming, which allows the analysis of code for verifying the progress property p leads-to q, describing the progress of all computations starting in a state satisfying p to a state satisfying q via the execution of one or more system modules. As opposed to demonstrably inadequate test and evaluation methods, SIA allows the analysis and verification of any network self-stabilizing software, as well as any other software that is designed to recover from failure without external intervention by maintenance personnel. The model to be analyzed is obtained by automatic translation of the system code to a transition system that is based on the use of the weakest precondition.
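
A small illustration of the weakest-precondition idea underlying the translation mentioned above: for an assignment x := e, wp is obtained by substituting e for x in the postcondition. The sketch below uses sympy for the substitution and is not the SIA tool itself.

import sympy as sp

x, y = sp.symbols("x y")

def wp_assign(var, expr, post):
    # wp(var := expr, post) = post with var replaced by expr
    return sp.simplify(post.subs(var, expr))

# Postcondition q: x > y. Weakest precondition of "x := x + 1" establishing q:
post = sp.Gt(x, y)
print(wp_assign(x, x + 1, post))     # x + 1 > y

# Sequencing composes right to left: wp(S1; S2, q) = wp(S1, wp(S2, q)).
print(wp_assign(y, y - 1, wp_assign(x, x + 1, post)))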

Keywords: network, power grid, self-stabilization, software integrity assessment, UNITY, weakest precondition

Procedia PDF Downloads 196
16335 Cyber Security Enhancement via Software Defined Pseudo-Random Private IP Address Hopping

Authors: Andre Slonopas, Zona Kostic, Warren Thompson

Abstract:

Obfuscation is one of the most useful tools to prevent network compromise. Previous research focused on the obfuscation of network communications between external-facing edge devices. This work proposes the use of two edge devices, one external facing and one internal facing, which communicate via private IPv4 addresses in a software-defined pseudo-random IP hopping scheme. This methodology does not require additional IP addresses and/or resources to implement. Statistical analyses demonstrate that the hopping surface must be at least 1e3 IP addresses in size, with a broad standard deviation, to minimize the possibility of coincidence between monitored and communication IPs. Breaking the hopping algorithm requires a collection of at least 1e6 samples, which for large hopping surfaces would take years to collect. The probability of dropped packets is controlled via memory buffers and the frequency of hops, and can be reduced to levels acceptable for video streaming. This methodology provides an impenetrable layer of security ideal for information systems and supervisory control and data acquisition (SCADA) systems.
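
A rough sketch of the software-defined pseudo-random private IP hopping idea, assuming both edge devices share a secret seed and a hop interval and draw addresses from a private /16 block; the block, seed, and interval below are illustrative, not values from the paper.

import hashlib
import ipaddress
import time

SHARED_SEED = b"illustrative-shared-secret"          # provisioned on both edge devices
HOP_SPACE = ipaddress.ip_network("10.77.0.0/16")     # well above the 1e3-address minimum
HOP_INTERVAL_S = 5

def hop_address(epoch_seconds: int) -> ipaddress.IPv4Address:
    """Deterministically map the current hop window to a private IPv4 address."""
    window = epoch_seconds // HOP_INTERVAL_S
    digest = hashlib.sha256(SHARED_SEED + window.to_bytes(8, "big")).digest()
    offset = int.from_bytes(digest[:4], "big") % HOP_SPACE.num_addresses
    return HOP_SPACE.network_address + offset

# Both sides compute the same address for the same window, so no extra IPs are consumed.
print(hop_address(int(time.time())))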

Keywords: moving target defense, cybersecurity, network security, hopping randomization, software defined network, network security theory

Procedia PDF Downloads 154
16334 Invention of Novel Technique of Process Scale Up by Using Solid Dosage Form

Authors: Shashank Tiwari, S. P. Mahapatra

Abstract:

The aim of this technique is to reduce the steps of process scale-up and to save the time and cost of the industries. This technique minimizes the steps of process scale-up. The new steps are: Novel Lab Scale, Novel Lab Scale Trials, Novel Trial Batches, Novel Exhibit Batches, and Novel Validation Batches. In these steps, the validation batches are not divided into three parts; instead, the data of the trial batches, exhibit batches, and validation batches are used and compiled for production and for validation. The technique also increases the batch size of the trial and exhibit batches. The new size of the trial batches is not less than fifty thousand, the exhibit batches increase up to two lakh, and the validation batches up to five lakh. After preparing the batches, all their data and drugs are used for stability studies, the validation record is maintained, and the data are compiled for technology transfer to the production department for preparing marketed-size batches.

Keywords: batches, technique, preparation, scale up, validation

Procedia PDF Downloads 321
16333 Multi-Level Security Measures in Cloud Computing

Authors: Shobha G. Ranjan

Abstract:

Cloud computing is an emerging, on-demand, internet-based technology. A variety of services, such as software, hardware, data storage, and infrastructure, can be shared through cloud computing. This technology is highly reliable, cost effective, and scalable in nature. It is essential that only authorized users access these services. Further, the time granted to access these services should be taken into account for proper accounting purposes. Currently, many organizations implement security measures in many different ways to provide the best cloud infrastructure to their clients, but that is not enough. This paper presents a multi-level security measures technique which is in accordance with the OSI model. Details of the proposed multi-level security measures technique are presented, along with the architecture, activities, algorithms, and the probability of success in breaking authentication.

Keywords: cloud computing, cloud security, integrity, multi-tenancy, security

Procedia PDF Downloads 473
16332 Design and Realization of Computer Network Security Perception Control System

Authors: El Miloudi Djelloul

Abstract:

Based on an analysis of applications of perception control technology to computer network security status and security protection measures, and from the angles of network physical environment security and network software system environment security, this paper provides a network security perception control solution using the Internet of Things (IoT), telecom, and other perception technologies. The Security Perception Control System operates in the computer network environment, utilizing Radio Frequency Identification (RFID) from IoT and telecom integration technology to carry out the integrated design of the system. In the network physical security environment, RFID temperature, humidity, gas, and other perception technologies are used to monitor environmental data; dynamic perception technology is used for the network system security environment; user-defined security parameters and security logs are used for quick data analysis; and control is extended to the I/O interface. Through the development of an API and AT commands, Computer Network Security Perception Control based on the Internet and GSM/GPRS is achieved, which enables users to carry out interactive perception and control of the network security environment via the web, e-mail, PDA, mobile phone short message, and the Internet. In system testing through a middleware server, real-time perception of security information data with a deviation of 3-5% was achieved, which proves the feasibility of the Computer Network Security Perception Control System.

Keywords: computer network, perception control system, security strategy, Radio Frequency Identification (RFID)

Procedia PDF Downloads 410
16331 Secure Content Centric Network

Authors: Syed Umair Aziz, Muhammad Faheem, Sameer Hussain, Faraz Idris

Abstract:

A content centric network is a network based on the mechanism of sending and receiving data according to interest, with data requests directed to a specified node (which has cached the data). In this network, security is bound to the content rather than to the host, making it host independent and secure. Security is applied by taking the content's MAC (message authentication code) and encrypting it with the public key of the receiver. On the receiver end, the message is first verified, and after verification the message is saved and decrypted using the receiver's private key.
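
A minimal sketch of the scheme described above, using the pyca/cryptography package: the content's MAC is computed with a MAC key (assumed shared for this illustration), encrypted under the receiver's RSA public key, then decrypted and verified on the receiving side. Key sizes and algorithm choices here are assumptions, not the paper's.

from cryptography.hazmat.primitives import hashes, hmac
from cryptography.hazmat.primitives.asymmetric import rsa, padding

content = b"named data object"
mac_key = b"shared-mac-key-for-illustration"

# Receiver's key pair (the public key is assumed to be known to the sender).
receiver_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
receiver_pub = receiver_priv.public_key()

# Sender: MAC the content, then encrypt the MAC for the receiver.
h = hmac.HMAC(mac_key, hashes.SHA256())
h.update(content)
tag = h.finalize()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
encrypted_tag = receiver_pub.encrypt(tag, oaep)

# Receiver: decrypt the MAC with the private key and verify it against the content.
recovered_tag = receiver_priv.decrypt(encrypted_tag, oaep)
v = hmac.HMAC(mac_key, hashes.SHA256())
v.update(content)
v.verify(recovered_tag)          # raises InvalidSignature if the content was tampered with
print("content verified and accepted")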

Keywords: content centric network, client-server, host security threats, message authentication code, named data network, network caching, peer-to-peer

Procedia PDF Downloads 610
16330 Foreign Artificial Intelligence Investments and National Security Exceptions in International Investment Law

Authors: Ying Zhu

Abstract:

Recent years have witnessed a boom in foreign investment in the field of artificial intelligence (AI). Foreign investments provide critical capital for AI development but also trigger national security concerns in host states. A notable example is the increasing number of cases in which the Committee on Foreign Investment in the United States (CFIUS) has denied Chinese acquisitions of US technology companies on national security grounds. On July 19, 2018, Congress reached a deal on the final draft of a new provision to strengthen CFIUS's authority to review overseas transactions involving sensitive US technology. The question is: how to reconcile the emerging tension between, on the one hand, foreign AI investors' expectations of a predictable investment environment, and on the other hand, host states' regulatory power over national security? This paper provides a methodology to reconcile this tension under international investment law. Based on an examination of the national security exception clauses in international investment treaties and the application of national security justifications in investor-state arbitration jurisprudence, the paper argues that a traditional interpretation of the national security exception, based on the necessity concept in customary international law, fails to take into account new risks faced by countries, including security concerns over strategic industries such as AI. To overcome this shortcoming, the paper proposes to incorporate an integrated national security clause in international investment treaties, which includes a two-tier test: a 'self-judging' test in the pre-establishment period and a 'proportionality' test in the post-establishment period. Finally, the paper drafts a model national security clause for future treaty-drafting practice.

Keywords: foreign investment, artificial intelligence, international investment law, national security exception

Procedia PDF Downloads 121
16329 Wally Feelings Test: Validity and Reliability Study

Authors: Gökhan Kayili, Ramazan Ari

Abstract:

This research aims to adapt the Wally Feelings Test for Turkish children and to perform reliability and validity analyses of the test. The sample of the research was composed of 699 three- to five-year-old Turkish preschoolers attending official and private nursery schools. The schools were selected with the simple random sampling method by considering different socio-economic conditions and different central districts in Konya. In order to determine the reliability of the Wally Feelings Test, internal consistency coefficients (KR-20), split-half reliability, and test-retest reliability analyses were performed. During the validation process, construct validity, content/scope validity, and concurrent/criterion validity were used. When the validity and reliability of the test are examined, it is seen that the Wally Feelings Test is a valid and reliable instrument to evaluate three- to five-year-old Turkish children's ability to understand feelings.
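
For reference, a small sketch of the KR-20 internal consistency computation mentioned above for dichotomously scored items; the response matrix is fabricated for illustration and is not the Wally Feelings Test data.

import numpy as np

# Rows = children, columns = dichotomous items (1 = correct/expected response).
scores = np.array([
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 1, 0],
    [1, 1, 1, 0, 1],
])

k = scores.shape[1]                         # number of items
p = scores.mean(axis=0)                     # proportion answering each item correctly
q = 1 - p
total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores

kr20 = (k / (k - 1)) * (1 - (p * q).sum() / total_var)
print(f"KR-20 = {kr20:.3f}")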

Keywords: reliability, validity, wally feelings test, social sciences

Procedia PDF Downloads 499
16328 Biomechanical Study of a Type II Superior Labral Anterior to Posterior Lesion in the Glenohumeral Joint Using Finite Element Analysis

Authors: Javier A. Maldonado E., Duvert A. Puentes T., Diego F. Villegas B.

Abstract:

The SLAP lesion (Superior Labral Anterior to Posterior) involves the labrum, causing pain and mobility problems in the glenohumeral joint. This injury is common in athletes practicing sports that require throwing or those who receive traumatic impacts to the shoulder area. This paper determines the biomechanical behavior of the soft tissues of the glenohumeral joint when a type II SLAP lesion is present. This pathology is characterized by a tear in the superior labrum, which is simulated in a 3D model of the shoulder joint. A 3D model of the glenohumeral joint was obtained using the free software Slice. Then, a finite element analysis was done using a general-purpose software package that simulates a compression test with external rotation. First, a validation was done assuming a healthy shoulder joint, against a previous study. Once the initial model was validated, a lesion of the labrum was built using CAD software and the same test was done again. The results obtained were the stress and strain distributions of the synovial capsule and the injured labrum. ANOVA was performed for the healthy and injured glenohumeral joints, finding significant differences between them. This study will help orthopedic surgeons to understand the biomechanics involved in this type of lesion and also the other surrounding structures affected by loading the injured joint.

Keywords: biomechanics, computational model, finite elements, glenohumeral joint, superior labral anterior to posterior lesion

Procedia PDF Downloads 175
16327 Climate Change and Its Impact on Water Security and Health in Coastal Community: A Gender Outlook

Authors: Soorya Vennila

Abstract:

The present study answers two questions: how does climate change affect water security in the drought-prone Ramanathapuram district, and what has water insecurity done to the health of the coastal community? The study area chosen is Devipattinam in Ramanathapuram district. Climate change has evidently wreaked havoc on the community through saltwater intrusion, water quality degradation, and water scarcity, with eventual economic and social consequences, such as power inequality within the family and community, and health hazards. Climatological data such as rainfall and minimum and maximum temperature were statistically analyzed for trend using the Mann-Kendall test, which was applied to 14 years (1989-2002) of rainfall and maximum and minimum temperature data. At the outset, water samples were collected from Devipattinam to test their physical and chemical parameters and their spatial variation, and the results were mapped in ArcGIS. Using the water quality tests, a water quality index was framed. Finally, key informant interviews and questionnaires were conducted to capture gender perceptions and problems. The data collected were thereafter interpreted using SPSS software, leading to recommendations and suggestions to overcome water scarcity and health problems.
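
A compact sketch of the Mann-Kendall trend test applied above to the rainfall and temperature series; the annual series below is synthetic, and the variance formula assumes no tied values.

import math
import numpy as np

def mann_kendall(series):
    """Return the Mann-Kendall S statistic and the normal-approximation Z (no tie correction)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

annual_rainfall = [812, 845, 790, 760, 805, 742, 730, 765, 720, 700, 715, 690, 680, 705]  # mm, synthetic
s, z = mann_kendall(annual_rainfall)
print(f"S = {s}, Z = {z:.2f}  (|Z| > 1.96 indicates a significant trend at the 5% level)")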

Keywords: health, water security, water quality, climate change

Procedia PDF Downloads 41
16326 Validation Study of Radial Aircraft Engine Model

Authors: Lukasz Grabowski, Tytus Tulwin, Michal Geca, P. Karpinski

Abstract:

This paper presents a radial aircraft engine model created in the AVL Boost software. This is a one-dimensional physical model of the engine, which enables investigation of the impact of ignition system design on engine performance (power, torque, fuel consumption). In addition, the model allows research under variable environmental conditions to reflect varied flight conditions (altitude, humidity, cruising speed). Before the simulation research, parameter identification and validation of the model were studied. In order to verify the take-off power of the gasoline radial aircraft engine model, a validation study was carried out. The first stage of the identification was completed with reference to the technical documentation provided by the engine manufacturer and experiments on a test stand with the real engine. The second stage involved a comparison of simulation results with the results of engine stand tests performed on a WSK 'PZL-Kalisz'. The engine was loaded by a propeller on a special test bench. Identification of the model parameters involved a comparison of the test results with the simulation in terms of: pressure behind the throttles, pressure in the inlet pipe, the time course of pressure in the first inlet pipe, power, and specific fuel consumption. Accordingly, the required coefficients and the error of the simulation calculations relative to the real-object experiments were determined. The time course of pressure and its values obtained are compatible with the experimental results. Additionally, the engine power and specific fuel consumption are largely compatible with the bench tests. The mapping error does not exceed 1.5%, which positively verifies the combustion model and allows prediction of engine performance if the combustion process is modified. Further tests verified the complete model. The maximum mapping error for the pressure behind the throttles and the inlet pipe pressure is 4%, which proves that the model of the inlet duct in the engine with the charging compressor is correct.

Keywords: 1D-model, aircraft engine, performance, validation

Procedia PDF Downloads 305
16325 A New Approach for Assertions Processing during Assertion-Based Software Testing

Authors: Ali M. Alakeel

Abstract:

Assertion-based software testing has been shown to be a promising tool for generating test cases that reveal program faults. Because the number of assertions may be very large for industry-size programs, one of the main concerns regarding the applicability of assertion-based testing is the amount of search time required to explore a large number of assertions. This paper presents a new approach for assertion exploration during the process of assertion-based software testing. Our initial experiments with the proposed approach show that the performance of assertion-based testing may be improved, therefore making this approach more efficient when applied to programs with a large number of assertions.

Keywords: software testing, assertion-based testing, program assertions, generating test

Procedia PDF Downloads 422
16324 Design and Development of an Algorithm for Prioritizing Test Cases Using a Neural Network as Classifier

Authors: Amit Verma, Simranjeet Kaur, Sandeep Kaur

Abstract:

Test Case Prioritization (TCP) has gained widespread acceptance as it often results in good-quality software free from defects. Due to the increasing rate of faults in software, traditional prioritization techniques result in increased cost and time. The main challenge in TCP is the difficulty of manually validating the priorities of different test cases due to the large size of test suites, and little emphasis has been placed on automating the TCP process. The objective of this paper is to determine the priorities of different test cases using an artificial neural network, which helps to predict the correct priorities with the help of the back-propagation algorithm. In our proposed work, one such method is implemented in which priorities are assigned to different test cases based on their frequency. After assigning the priorities, the ANN predicts whether the correct priority is assigned to every test case; otherwise it generates an interrupt when a wrong priority is assigned. In order to classify the different priority test cases, classifiers are used. The proposed algorithm is very effective, as it reduces complexity with robust efficiency and automates the process of prioritizing test cases.
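
A rough sketch of the pipeline implied above: TF-IDF features extracted from test case descriptions feed a small back-propagation neural network that classifies priority levels. The toy test cases and labels are invented for illustration, and scikit-learn's MLPClassifier stands in for the paper's ANN.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Toy test-case descriptions and assigned priority classes (1 = highest), illustrative only.
test_cases = [
    "verify login with valid credentials",
    "verify login fails with wrong password",
    "check report export to pdf",
    "check ui colour of footer text",
    "verify payment gateway timeout handling",
    "check tooltip spelling on settings page",
]
priorities = [1, 1, 2, 3, 1, 3]

model = make_pipeline(
    TfidfVectorizer(),                                # frequency-based text features
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),  # back-propagation ANN
)
model.fit(test_cases, priorities)

# Predicted priority flags whether a new case should be run early.
print(model.predict(["verify login lockout after failed attempts"]))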

Keywords: test case prioritization, classification, artificial neural networks, TF-IDF

Procedia PDF Downloads 355
16323 Numerical Simulation and Experimental Validation of the Tire-Road Separation in Quarter-car Model

Authors: Quy Dang Nguyen, Reza Nakhaie Jazar

Abstract:

The paper investigates the vibration dynamics of tire-road separation for a quarter-car model; this separation model is developed to be close to the real situation by considering that the tire is able to separate from the ground plane. A set of piecewise linear mathematical models is developed that matches the in-contact and no-contact states, to be considered as mother models for further investigations. The bound dynamics are numerically simulated as time responses and phase portraits. The separation analysis may determine which values of the suspension parameters can delay or avoid the no-contact phenomenon, which results in improved ride comfort and the elimination of potentially dangerous oscillations. Finally, model verification is carried out in the MSC ADAMS environment.
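
A condensed sketch of a piecewise-linear quarter-car model with tire-road separation: the tire spring can only push, so its force drops to zero once the wheel leaves the road. Parameter values and the road input are illustrative, not those used in the paper.

import numpy as np
from scipy.integrate import solve_ivp

ms, mu, g = 300.0, 40.0, 9.81            # sprung / unsprung masses (kg), illustrative
ks, cs, kt = 20e3, 1.2e3, 180e3          # suspension stiffness, damping, tire stiffness
static_load = (ms + mu) * g              # tire preload at static equilibrium

def road(t):
    return 0.05 * np.sin(2 * np.pi * 9.0 * t)   # harsh 9 Hz, 50 mm road input to provoke separation

def tire_force_dev(t, xu):
    # Piecewise law: linear while in contact, total force clipped at zero (-preload in
    # deviation coordinates) once the tire would have to pull, i.e. the no-contact state.
    total = static_load + kt * (road(t) - xu)
    return kt * (road(t) - xu) if total > 0.0 else -static_load

def rhs(t, y):
    xs, vs, xu, vu = y                   # deviations from static equilibrium
    f_susp = ks * (xu - xs) + cs * (vu - vs)
    return [vs, f_susp / ms, vu, (-f_susp + tire_force_dev(t, xu)) / mu]

sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0, 0.0, 0.0], max_step=1e-3)
in_air = static_load + kt * (road(sol.t) - sol.y[2]) <= 0.0
print(f"fraction of time in the no-contact state: {in_air.mean():.1%}")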

Keywords: quarter-car vibrations, tire-road separation, separation analysis, separation dynamics, ride comfort, ADAMS validation

Procedia PDF Downloads 55
16322 A Microwave Heating Model for Endothermic Reaction in the Cement Industry

Authors: Sofia N. Gonçalves, Duarte M. S. Albuquerque, José C. F. Pereira

Abstract:

Microwave technology has been gaining importance in contributing to decarbonization processes in high energy demand industries. Despite the several numerical models presented in the literature, a proper Verification and Validation exercise is still lacking. This is important and required to evaluate the accuracy and adequacy of the physical process model. Another issue concerns impedance matching, which is an important mechanism used in microwave experiments to increase electromagnetic efficiency. Such a mechanism is not available in current computational tools, thus requiring an external numerical procedure. A numerical model was implemented to study the continuous processing of limestone with microwave heating. This process requires the material to be heated until a certain temperature that will prompt a highly endothermic reaction. Both a 2D and a 3D model were built in COMSOL Multiphysics to solve the two-way coupling between the Maxwell and energy equations, along with the coupling between both heat transfer phenomena and the limestone endothermic reaction. The 2D model was used to study and evaluate the required numerical procedure, being also a benchmark test allowing other authors to implement impedance matching procedures. To achieve this goal, a controller built in MATLAB was used to continuously match the cavity impedance and predict the required energy for the system, thus successfully avoiding energy inefficiencies. The 3D model reproduces realistic results and therefore supports the main conclusions of this work. Limestone was modeled as a continuous flow under the transport of concentrated species, whose material and kinetic properties were taken from the literature. Verification and Validation of the coupled model was carried out separately from the chemical kinetic model. The chemical kinetic model was found to correctly describe the chosen kinetic equation by comparing numerical results with experimental data. A solution verification was made for the electromagnetic interface, where second-order and fourth-order accurate schemes were found for linear and quadratic elements, respectively, with a numerical uncertainty lower than 0.03%. Regarding the coupled model, it was demonstrated that the numerical error would diverge for the heat transfer interface with the mapped mesh. Results showed numerical stability for the triangular mesh, and the numerical uncertainty was less than 0.1%. This study evaluated the influence of limestone velocity, heat transfer, and load on thermal decomposition and overall process efficiency. The velocity and heat transfer coefficient were studied with the 2D model, while different loads of material were studied with the 3D model. Both models proved to be highly unstable when solving non-linear temperature distributions. High velocity flows exhibited a propensity to thermal runaways, and the thermal efficiency showed a tendency to stabilize for the higher velocities and higher filling ratios. Microwave efficiency exhibited an optimal velocity for each heat transfer coefficient, pointing out that electromagnetic efficiency is a consequence of energy distribution uniformity. The 3D results indicated inefficient development of the electric field for low filling ratios. Thermal efficiencies higher than 90% were found for the higher loads, and microwave efficiencies of up to 75% were accomplished. The 80% fill ratio was demonstrated to be the optimal load, with an associated global efficiency of 70%.
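
As a side note on the solution verification reported above, the observed order of accuracy on systematically refined meshes is commonly estimated by Richardson extrapolation; the sketch below shows that textbook calculation on made-up values and is not the paper's data.

import math

# Scalar result (e.g. absorbed microwave power) on coarse, medium and fine meshes,
# with a constant refinement ratio r. Values are illustrative placeholders.
f_coarse, f_medium, f_fine = 102.10, 100.52, 100.13
r = 2.0

p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
f_extrap = f_fine + (f_fine - f_medium) / (r**p - 1)        # Richardson-extrapolated value
uncertainty = abs(f_extrap - f_fine) / abs(f_extrap)        # relative numerical uncertainty

print(f"observed order of accuracy: {p:.2f}")
print(f"relative numerical uncertainty on the fine mesh: {uncertainty:.4%}")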

Keywords: multiphysics modeling, microwave heating, verification and validation, endothermic reactions modeling, impedance matching, limestone continuous processing

Procedia PDF Downloads 112
16321 Investigation of the Effects of Processing Parameters on PLA-Based 3D Printed Tensile Samples

Authors: Saifullah Karimullah

Abstract:

Additive manufacturing techniques are becoming more common with the latest technological advancements. They are poised to bring a revolution in the way products are designed, planned, manufactured, and distributed to end users. Fused deposition modeling (FDM) based 3D printing is one of those promising techniques that have revolutionized prototyping processes. The purpose of this design and study project is to design a customized laboratory-scale FDM-based 3D printer from locally available sources. The primary goal is to design and fabricate the FDM-based 3D printer. After fabrication, a tensile test specimen is designed in SolidWorks or Creo computer-aided design (CAD) software. An .stl file of the tensile test specimen is generated through slicing software, and the G-codes are sent to the printer for the test specimen to be printed. Different parameters were studied, such as printing speed, layer thickness, and infill density of the printed object. Some parameters were kept constant, such as temperature, extrusion rate, and raster orientation. Different tensile test specimens were printed for different sets of parameters of the FDM-based 3D printer. The tensile test specimens were subjected to tensile tests using a universal testing machine (UTM). Design Expert software was used for the analyses, so different results were obtained from the different tensile test specimens. The best, average, and worst specimens were also observed under a compound microscope to investigate the layer bonding between them.

Keywords: additive manufacturing techniques, 3D printing, CAD software, UTM machine

Procedia PDF Downloads 74
16320 BAN Logic Proof of E-passport Authentication Protocol

Authors: Safa Saoudi, Souheib Yousfi, Riadh Robbana

Abstract:

The e-passport is a relatively new electronic document which maintains the features of a passport and provides better security. It deploys new technologies such as biometrics and Radio Frequency Identification (RFID). The International Civil Aviation Organization (ICAO) and the European Union define mechanisms and protocols to provide security, but their solutions present many threats. In this paper, a new mechanism is presented to strengthen e-passport security and the authentication process. We propose a new protocol based on elliptic curves, identity-based encryption, and a shared secret between entities. Authentication in our contribution is formally proved with the BAN Logic verification language. This proposal aims to provide secure data storage and authentication.
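
An elementary sketch of the elliptic-curve shared-secret step such a protocol builds on, using the pyca/cryptography package; the curve choice and key-derivation parameters are assumptions for illustration, and the identity-based encryption and BAN-logic parts are not shown.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# E-passport chip and inspection terminal each generate an EC key pair.
chip_priv = ec.generate_private_key(ec.SECP256R1())
terminal_priv = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the peer's public key (ECDH).
chip_shared = chip_priv.exchange(ec.ECDH(), terminal_priv.public_key())
terminal_shared = terminal_priv.exchange(ec.ECDH(), chip_priv.public_key())
assert chip_shared == terminal_shared

# Derive a session key from the shared secret for subsequent secure messaging.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"epassport-session").derive(chip_shared)
print("session key established:", session_key.hex()[:16], "...")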

Keywords: e-passport, elliptic curve cryptography, identity based encryption, shared secret, BAN Logic

Procedia PDF Downloads 401
16319 A Three Tier Secure KQML Interface with Novel Performatives

Authors: Dimple Juneja, Aarti Singh, Renu Hooda

Abstract:

Knowledge Query and Manipulation Language (KQML) and FIPA ACL are the two prime communication languages existing in multi-agent systems (MAS). Both languages are more or less similar in terms of semantics (based on speech act theory) and offer cutting-edge competition in establishing agent communication across the Internet. Despite the fact that software agents operating on the Internet are required to be safeguarded from their counter-peers, both protocols lack security performatives. This paper proposes a three-tier security interface with a few novel security-related performatives enhancing the basic architecture of KQML. The three levels are attestation, certification, and trust establishment, which enforce tight security and hence reduce security breaches.
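
For illustration only, a sketch of what a security performative wrapping an ordinary KQML message might look like; the performative name "attest" and every field value below are hypothetical, since the paper's exact performative set is not reproduced here.

# Hypothetical example of an attestation-tier performative wrapping a standard KQML "ask-one".
inner = '(ask-one :sender agentA :receiver agentB :language KIF :ontology stock :content (PRICE IBM ?p))'

def wrap_attest(message: str, credential: str, signature: str) -> str:
    # "attest" is an assumed name for a security performative, not a standard KQML one.
    return (f'(attest :sender agentA :receiver agentB '
            f':credential {credential} :signature {signature} '
            f':content {message})')

print(wrap_attest(inner, credential="cert-agentA-2024", signature="sig-0xABCD"))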

Keywords: multiagent systems, KQML, FIPA ACL, performatives

Procedia PDF Downloads 384
16318 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing

Authors: Arjun Kumar Rath, Titus Dhanasingh

Abstract:

Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing in larger quantities. As components become more integrated, devices are tested for their full functionality using advanced software tools. Benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, these tests are custom built for every product and remain unusable for other variants. A majority of the tests go undocumented, are not updated, and become unusable when the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all of these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, Autotest, KernelCI, etc.) designed for testing embedded system devices, each having several unique good features, but no single tool or framework may satisfy all of the testing needs of embedded systems; hence the need for an extensible framework with a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods include developing test libraries and support components for every new hardware platform that belongs to the same domain with identical hardware architecture. This approach has drawbacks such as non-reusability, where platform-specific libraries cannot be reused, the need to maintain source infrastructure for individual hardware platforms, and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in environment set-up for testing, scalability, and maintenance. A desirable strategy is certainly one that is focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle, during the phases of testing, and across a family of products. To overcome the stated challenges of the conventional method and offer the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, is designed, which can be deployed in embedded-system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing different hardware, including microprocessors and microcontrollers. It offers benefits such as (1) Time-to-market: it accelerates board bring-up time with prepackaged test suites supporting all necessary peripherals, which can speed up the design and development stages (board bring-up, manufacturing, and device drivers); (2) Reusability: framework components isolated from platform-specific hardware initialization and configuration make the adaptation of test cases across various platforms quick and simple; (3) An effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) Continuous integration: pre-integration with Jenkins enables continuous testing and an automated software update feature. Applying the embedded test framework accelerator throughout the design and development phases enables the development of well-tested systems before functional verification and improves time to market to a large extent.

Keywords: board diagnostics software, embedded system, hardware testing, test frameworks

Procedia PDF Downloads 114
16317 Student Records Management System Using Smart Cards and Biometric Technology for Educational Institutions

Authors: Patrick O. Bobbie, Prince S. Attrams

Abstract:

In recent times, the rapid change in new technologies has transformed the way and manner in which records are handled in educational institutions. There is also a need for reliable access to and ease of use of these records, resulting in increased productivity in organizations. In academic institutions, such benefits help in quality assessments, institutional performance, and assessments of teaching and evaluation methods. Students in educational institutions benefit the most when advanced technologies are deployed in accessing records. This research paper discusses the use of biometric technologies coupled with smartcard technologies to provide a unique way of identifying students and matching their data to financial records in order to grant them access to restricted areas such as examination halls. The system developed in this paper has an identity verification component as part of its main functionality. A systematic software development cycle of analysis, design, coding, testing, and support was used. The system provides a secure way of verifying a student's identity and real-time verification of financial records. An advanced prototype version of the system has been developed for testing purposes.

Keywords: biometrics, smartcards, identity-verification, fingerprints

Procedia PDF Downloads 386
16316 An Operators’ Real-sense-based Fire Simulation for Human Factors Validation in Nuclear Power Plants

Authors: Sa-Kil Kim, Jang-Soo Lee

Abstract:

On March 31, 1993, a severe fire accident took place in a nuclear power plant located in Narora in North India. The event involved a major fire in the turbine building of NAPS unit 1 and resulted in a total loss of power to the unit for 17 hours. In addition, there was a heavy ingress of smoke into the control room, mainly through the intake of the ventilation system, forcing the operators to vacate the control room. The Narora fire accident provides lessons indicating that operators may lose their composure and predictable behavior during a fire. After the Fukushima accident, which resulted from a natural disaster, unanticipated external events are also required to be prepared for and controlled for the ultimate safety of nuclear power plants. Since last year, our research team has been developing a test and evaluation facility that can simulate external events such as earthquakes and fires based on the operators' real sense. As one of the results of the project, we propose a unit real-sense-based facility that can simulate fire events in a control room, to be utilized as a test-bed for human factors validation. The test-bed has the shape of the operator's workstation and functions to simulate fire conditions such as smoke, heat, and auditory alarms in accordance with prepared fire scenarios. Furthermore, the test-bed can be used for operator training and experience.

Keywords: human behavior in fire, human factors validation, nuclear power plants, real-sense-based fire simulation

Procedia PDF Downloads 253
16315 Wind Farm Power Performance Verification Using Non-Parametric Statistical Inference

Authors: M. Celeska, K. Najdenkoski, V. Dimchev, V. Stoilkov

Abstract:

Accurate determination of wind turbine performance is necessary for the economic operation of a wind farm. At present, the procedure for carrying out the power performance verification of wind turbines is based on a standard of the International Electrotechnical Commission (IEC). In this paper, non-parametric statistical inference is applied to design a simple, inexpensive method of verifying the power performance of a wind turbine. A statistical test is explained and examined, and its adequacy is tested over real data. The method uses the information collected by the SCADA (Supervisory Control and Data Acquisition) system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. The study used data on the monthly output of a wind farm in the Republic of Macedonia, with a measuring interval from January 1, 2016, to December 31, 2016. At the end, it is concluded whether the power performance of a wind turbine differed significantly from what would be expected. The results of the implementation of the proposed methods showed that the power performance of the specific wind farm under assessment was acceptable.
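
One simple non-parametric check in the spirit of the method above: compare each month's measured energy with the value expected from the warranted power curve and wind data using the Wilcoxon signed-rank test. The monthly figures below are fabricated for illustration.

from scipy.stats import wilcoxon

# Monthly energy output (MWh): measured by SCADA vs. expected from the power curve (illustrative).
measured = [410, 388, 402, 365, 350, 330, 342, 355, 372, 395, 405, 398]
expected = [415, 392, 399, 370, 356, 333, 340, 360, 376, 398, 402, 401]

stat, p_value = wilcoxon(measured, expected)
alpha = 0.05
print(f"Wilcoxon statistic = {stat}, p = {p_value:.3f}")
print("performance differs significantly from expectation" if p_value < alpha
      else "no significant deviation: power performance is acceptable")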

Keywords: canonical correlation analysis, power curve, power performance, wind energy

Procedia PDF Downloads 302
16314 Multi-Agent TeleRobotic Security Control System: Requirements Definitions of Multi-Agent System Using The Behavioral Patterns Analysis (BPA) Approach

Authors: Assem El-Ansary

Abstract:

This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing a Multi-Agent TeleRobotic Security Control System (MTSCS). The event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
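
A brief sketch of the AHP step inside a tool like DECISION: a pairwise comparison matrix is reduced to priority weights via its principal eigenvector, and a consistency ratio checks the judgments. The 3x3 matrix below is invented for illustration and is not taken from the paper.

import numpy as np

# Pairwise comparisons of three criteria on Saaty's 1-9 scale (illustrative judgments).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # priority vector

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)               # consistency index
ri = 0.58                                     # random index for n = 3 (Saaty)
cr = ci / ri                                  # consistency ratio; < 0.10 is acceptable

print("criteria weights:", np.round(weights, 3), " CR =", round(cr, 3))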

Keywords: analysis, multi-agent, TeleRobotics control, security, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases

Procedia PDF Downloads 404
16313 Considering Partially Developed Artifacts in Change Impact Analysis Implementation

Authors: Nazri Kama, Sufyan Basri, Roslina Ibrahim

Abstract:

It is important to manage changes in software to meet the evolving needs of the customer. Accepting too many changes causes delays in completion and incurs additional cost. One type of information that helps in making this decision comes from change impact analysis. Current impact analysis approaches assume that all classes in the class artifact are completely developed and that the class artifact is used as the source of analysis. However, these assumptions are impractical for impact analysis in the software development phase, as some classes in the class artifact are still under development or only partially developed, which leads to inaccuracy. This paper presents a novel impact analysis approach to be used in the software development phase. The significant achievements of the approach are demonstrated through an extensive experimental validation using three case studies.

Keywords: software development, impact analysis, traceability, static analysis

Procedia PDF Downloads 578
16312 [Keynote Talk]: Software Reliability Assessment and Fault Tolerance: Issues and Challenges

Authors: T. Gayen

Abstract:

Although there are several software reliability models existing today, there does not exist any versatile model that can be used for the reliability assessment of software. Complex software has a large number of states (unlike hardware), so it becomes practically difficult to test the software completely. Irrespective of the amount of testing one does, it sometimes becomes extremely difficult to assure that the final software product is fault free. Black-box software reliability models are found to be quite uncertain for the reliability assessment of various systems. Mission-critical applications need to be highly reliable, and since it is not always possible to ensure the development of a highly reliable system, one develops mechanisms to handle faults remaining in the system even after development, in order to achieve fault-free operation of the software. Although several such techniques are currently in use to achieve fault tolerance, these mechanisms may not always be suitable for various systems. Hence, this discussion is focused on analyzing the issues and challenges faced with the existing techniques for reliability assessment and fault tolerance of various software systems.

Keywords: black box, fault tolerance, failure, software reliability

Procedia PDF Downloads 394
16311 Software Cloning and Agile Environment

Authors: Ravi Kumar, Dhrubajit Barman, Nomi Baruah

Abstract:

Software cloning has grown into an active area in the software engineering research community, yielding numerous techniques, various tools, and other methods for clone detection and removal. Copying and modifying a block of code is identified as cloning, as it is the most basic means of software reuse. Agile software development is an approach currently being used in various software projects, as it helps to respond to the unpredictability of building software through incremental, iterative work cadences. Software cloning has been introduced into the agile environment, and many agile software development approaches use the concept of software cloning. This paper discusses the various agile software development approaches. It also discusses the degree to which the software cloning concept is being introduced in these agile software development approaches.

Keywords: agile environment, refactoring, reuse, software cloning

Procedia PDF Downloads 496