Search results for: Model Checking.
7501 Development of A Meta Description Language for Software/Hardware Cooperative Design and Verification for Model-Checking Systems
Authors: Katsumi Wasaki, Naoki Iwasaki
Abstract:
Model-checking tools such as the Symbolic Model Verifier (SMV) and NuSMV are available for checking hardware designs. These tools can automatically check the formal correctness of a design. However, NuSMV is too low-level for describing a complete hardware design, so a system defined in a language such as Verilog or VHDL must be translated into a language such as NuSMV for validation. In this paper, we present a meta hardware description language, Melasy, that contains a code generator for existing hardware description languages (HDLs) and for model-checking languages, thereby solving this problem.
Keywords: Meta description language, software/hardware codesign, co-verification, formal verification, hardware compiler, model checking.
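Melasy itself is not specified in this abstract, so the following Python sketch only illustrates the general code-generation idea behind such a meta language: a small, dictionary-based register description is translated into a minimal NuSMV module with an attached property. The description format and the example property are assumptions, not Melasy syntax.

```python
# Hypothetical sketch: generate a minimal NuSMV module from a tiny
# register description, illustrating the HDL-to-model-checker idea.
def to_nusmv(name, regs, spec):
    """regs: {reg_name: (init_value, next_expr)}; spec: a CTL property string."""
    lines = [f"MODULE {name}", "VAR"]
    lines += [f"  {r} : boolean;" for r in regs]
    lines.append("ASSIGN")
    for r, (init, nxt) in regs.items():
        lines.append(f"  init({r}) := {init};")
        lines.append(f"  next({r}) := {nxt};")
    lines.append(f"SPEC {spec}")
    return "\n".join(lines)

# A toggling flag that is never stuck at TRUE for two consecutive steps.
print(to_nusmv("main",
               {"flag": ("FALSE", "!flag")},
               "AG (flag -> AX !flag)"))
```

The emitted text can be fed directly to NuSMV; a real generator would of course cover registers of arbitrary width, modules, and wiring.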
7500 On Analysis of Boundness Property for ECATNets by Using Rewriting Logic
Authors: Noura Boudiaf, Allaoua Chaoui
Abstract:
To analyze the behavior of Petri nets, the accessibility graph and Model Checking are widely used. However, if the analyzed Petri net is unbounded, the accessibility graph becomes infinite and Model Checking cannot be used, even for small Petri nets. ECATNets [2] are a category of algebraic Petri nets. The main feature of ECATNets is their sound and complete semantics based on rewriting logic [8] and its language Maude [9]. ECATNets analysis may be done by using the accessibility analysis and Model Checking techniques defined in Maude. However, these two techniques supported by Maude do not work with infinite-state systems either. As a category of Petri nets, ECATNets can be unbounded and thus infinite-state systems. In order to know whether we can apply Maude's accessibility analysis and Model Checking to an ECATNet, we propose in this paper an algorithm that detects whether the ECATNet is bounded. Moreover, we propose a rewriting-logic-based tool implementing this algorithm. We show that the development of this tool using the Maude system is facilitated by the reflectiveness of rewriting logic. Indeed, the self-interpretation of this logic allows us both to model an ECATNet and to act on it.
Keywords: ECATNets, rewriting logic, Maude, finite-state systems, infinite-state systems, boundedness property checking.
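The abstract does not give the algorithm itself, which is defined on ECATNets at the Maude meta-level. As a rough classical analogue only, the sketch below applies a Karp-Miller-style covering test to an ordinary place/transition net in Python: if a reachable marking strictly covers one of its ancestors, some place can grow without limit and the net is unbounded. The net encoding, the path-only ancestor check, and the exploration limit are simplifying assumptions for illustration.

```python
# Simplified, classical analogue of the boundedness test (the paper works on
# ECATNets in Maude; this sketch uses plain place/transition nets in Python).
def fire(marking, pre, post):
    """Return the successor marking if the transition is enabled, else None."""
    if all(marking.get(p, 0) >= n for p, n in pre.items()):
        new = dict(marking)
        for p, n in pre.items():
            new[p] = new.get(p, 0) - n
        for p, n in post.items():
            new[p] = new.get(p, 0) + n
        return new
    return None

def is_bounded(initial, transitions, limit=10000):
    """Heuristic check: report unbounded if a marking strictly covers an ancestor."""
    stack = [(initial, [])]          # (marking, ancestors on this firing path)
    explored = 0
    while stack and explored < limit:
        marking, ancestors = stack.pop()
        explored += 1
        for anc in ancestors:
            if all(marking[p] >= anc.get(p, 0) for p in marking) and marking != anc:
                return False         # strict covering: some place can grow forever
        for pre, post in transitions:
            succ = fire(marking, pre, post)
            if succ is not None and succ not in ancestors:
                stack.append((succ, ancestors + [marking]))
    return True

# Unbounded example: the transition keeps adding a token to p2.
net = [({"p1": 1}, {"p1": 1, "p2": 1})]
print(is_bounded({"p1": 1, "p2": 0}, net))   # False
```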
7499 Formal Modeling and Verification of Software Models
Authors: Siamak Rasulzadeh
Abstract:
Graph transformation has recently become more and more popular as a general visual modeling language for formally stating the dynamic semantics of designed models. In particular, it is a very natural formalism for languages whose models are essentially graphs (e.g., UML). Using this technique, we present a highly understandable yet precise approach to formally model and analyze the behavioral semantics of UML 2.0 Activity diagrams. In our proposal, AGG is used to design Activities; then, using our previous approach to model checking graph transformation systems, designers can verify and analyze the designed Activity diagrams by checking the properties of interest, expressed as combinations of graph rules and LTL (Linear Temporal Logic) formulas, on the Activities.
Keywords: UML 2.0 Activity, verification, model checking, graph transformation, dynamic semantics.
7498 Verification and Validation for Java Classes Using Design by Contract: The Modular External Approach
Authors: Dario Ramirez de Leon, Oscar Chavez Bosquez, Julian J. Francisco Leon
Abstract:
Since the conception of JML, many tools, applications and implementations have been developed. In this context, users or developers who want to use JML can find themselves surrounded by all of these tools and applications. Looking for a common infrastructure and an independent language to provide a bridge between these tools and JML, we developed an approach to embed contracts in XML for Java: XJML. This approach offers the ability to separate preconditions, postconditions and class invariants using JML and XML, so we built a front-end that can perform Runtime Assertion Checking, Extended Static Checking and Full Static Program Verification. Moreover, the capabilities of this front-end can be easily extended and implemented thanks to XML. We believe that XJML is an easy starting point for building a graphical user interface, delivering a friendly, IDE-independent environment to the developer community that wants to work with JML.
Keywords: Model checking, verification and validation, JML, XML, Java, runtime assertion checking, extended static checking, full static program verification.
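The XJML schema is not given in the abstract, so the sketch below is only a guess at the general idea: contracts stored in XML and enforced at run time through a decorator. The element names (<method>, <requires>, <ensures>) and the contract helper are hypothetical, and the real front-end targets Java/JML rather than Python.

```python
# Hypothetical sketch of the XJML idea: contracts kept in XML, checked at runtime.
import xml.etree.ElementTree as ET

CONTRACTS = ET.fromstring("""
<class name="Account">
  <method name="withdraw">
    <requires>amount &gt; 0</requires>
    <ensures>result &gt;= 0</ensures>
  </method>
</class>
""")

def contract(method_name):
    spec = CONTRACTS.find(f".//method[@name='{method_name}']")
    pre, post = spec.findtext("requires"), spec.findtext("ensures")
    def wrap(fn):
        def checked(*args, **kwargs):
            env = dict(zip(fn.__code__.co_varnames, args), **kwargs)
            assert eval(pre, {}, env), f"precondition violated: {pre}"
            result = fn(*args, **kwargs)
            assert eval(post, {}, dict(env, result=result)), f"postcondition violated: {post}"
            return result
        return checked
    return wrap

@contract("withdraw")
def withdraw(balance, amount):
    return balance - amount

print(withdraw(100, 30))   # 70; withdraw(100, -1) would raise AssertionError
```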
7497 Validation of Automation Systems Using Temporal Logic Model Checking and Groebner Bases
Authors: Quoc-Nam Tran, Anjib Mulepati
Abstract:
Validation of an automation system is an important issue. The goal is to check that the system under investigation, modeled by a Petri net, never enters undesired states. Usually, tools dedicated to Petri nets, such as DESIGN/CPN, are used to perform reachability analysis. The biggest problem with this approach is that it is impossible to generate the full occurrence graph of the system because it is too large. In this paper, we show how computational methods such as temporal logic model checking and Groebner bases can be used to verify the correctness of the design of an automation system. We report our experimental results with two automation systems: the Automated Guided Vehicle (AGV) system and the traffic light system. Validation of these two systems took 10 to 30 seconds on a PC, depending on the optimization parameters.
Keywords: Computational Intelligence, Temporal Logic Reasoning, Model Checking, Groebner Bases.
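The abstract does not spell out the polynomial encoding, so the following sympy sketch is only a simplified illustration of how Groebner bases can answer such questions: Boolean state variables are constrained by x(x-1)=0, an invariant is added as polynomial equations, and if adjoining the undesired state reduces the Groebner basis to [1], that state is algebraically infeasible. The particular invariant and variables are made up for the example and are not the paper's encoding.

```python
# Simplified illustration of the Groebner-basis idea: test whether adding the
# "undesired state" makes the polynomial system infeasible (reduced basis == [1]).
from sympy import symbols, groebner

x, y = symbols("x y")                   # two Boolean state variables
invariant = [x*(x - 1), y*(y - 1),      # x, y take values in {0, 1}
             x + y - 1]                 # mutual exclusion: exactly one holds
undesired = [x - 1, y - 1]              # the bad state: both hold at once

basis = groebner(invariant + undesired, x, y, order="lex")
print(list(basis))                      # [1] => the undesired state is infeasible
```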
7496 An Automation of Check Focusing on CRUD for Requirements Analysis Model in UML
Authors: Shinpei Ogata, Yoshitaka Aoki, Hirotaka Okuda, Saeko Matsuura
Abstract:
A key to successful development of high-quality software is defining a valid and feasible requirements specification. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface mock-up from the UML requirements analysis model so that we can confirm the validity of the input/output data for each page and of the page transitions by directly operating the mock-up. This paper proposes a support method for checking the validity of a data life cycle by using the model checking tool UPPAAL, focusing on CRUD (Create, Read, Update and Delete). Exhaustive checking improves the quality of the requirements analysis model, which is validated by the customers through the automatically generated mock-up. The effectiveness of our method is discussed through a case study of requirements modeling for two small projects: a library management system and a textbook sales support system at a university.
Keywords: CRUD, model checking, model-driven development, requirements analysis, Unified Modeling Language, UPPAAL.
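The paper encodes the data life cycle as properties checked with UPPAAL; that model is not reproduced here. As a plain-Python stand-in, the sketch below checks a few obvious CRUD rules on per-entity operation sequences extracted from the pages. The entity names and the specific rules are illustrative assumptions.

```python
# Rough sketch of the CRUD completeness idea (the paper checks this with UPPAAL).
def check_crud(entity_ops):
    """entity_ops: {entity: sequence of 'C','R','U','D' operations over the pages}."""
    problems = []
    for entity, ops in entity_ops.items():
        missing = {"C", "R", "U", "D"} - set(ops)
        if missing:
            problems.append(f"{entity}: missing {sorted(missing)}")
        if ops and ops[0] != "C":
            problems.append(f"{entity}: used before it is created")
        if "D" in ops and set(ops[ops.index('D') + 1:]) & {"R", "U"}:
            problems.append(f"{entity}: read/updated after deletion")
    return problems

pages = {"Book": ["C", "R", "U", "D"], "Member": ["R", "U"]}
print(check_crud(pages))   # flags Member: missing C/D and used before creation
```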
7495 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search
Authors: Wenbo Wang, Yi-fang Brook Wu
Abstract:
The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. To address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study presents a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context from general-purpose, comprehensive and authoritative data; 2) developing a search function to automatically select relevant, new and credible references; and 3) focusing on the parts of the representations of a claim and its references that are most relevant to the fact-checking task. The experimental results demonstrate that: 1) augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, improves fact-checking performance; and 2) SAC with auto-selected references outperforms existing fact-checking approaches with manually selected references. Future directions of this study include I) exploring the knowledge graph in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, and II) exploring semantic relations in claims and references to further enhance fact-checking.
Keywords: Fact checking, claim verification, Deep Learning, Natural Language Processing.
7494 The Performance of the Character-Access on the Checking Phase in String Searching Algorithms
Authors: Mahmoud M. Mhashi
Abstract:
A new algorithm called Character-Comparison to Character-Access (CCCA) is developed to test the effect of both 1) converting character-comparison and number-comparison into character-access and 2) the starting point of checking on the performance of the checking operation in string searching. An experiment is performed and the results are compared with five algorithms, namely Naive, BM, Inf_Suf_Pref, Raita, and Circle. With the CCCA algorithm, the results suggest that the average number of comparisons is improved by up to 74.0%. Furthermore, the results suggest that the clock time required by the other algorithms is improved by between 28% and 68% by the new CCCA algorithm.
Keywords: Pattern matching, string searching, character-comparison, character-access, checking.
7493 Model Checking Consistency of UML Diagrams Using Alloy
Authors: Akie Nimiya, Tomoyuki Yokogawa, Hisashi Miyazaki, Sousuke Amasaki, Yoichiro Sato, Michiyoshi Hayase
Abstract:
In this paper, we propose a method for detecting consistency violations between UML state machine diagrams and communication diagrams using Alloy. Using the input language of Alloy, the proposed method expresses the system behaviors described by state machine diagrams, the message sequences described by communication diagrams, and a consistency property. Applying the method to an example system, we confirmed that consistency violations could be detected correctly using Alloy.
Keywords: Model checking, UML, state machine diagrams, communication diagrams, Alloy.
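The Alloy encoding itself is not shown in the abstract. As a rough, plain-Python rendering of the consistency property only, the sketch below replays a communication-diagram message sequence against each receiver's state machine and reports the first message that no transition accepts. The transition-table format and the lamp example are assumptions.

```python
# Plain-Python illustration of the consistency idea (the paper encodes it in Alloy):
# every message must be accepted by the receiver's state machine in its current state.
def consistent(state_machines, messages):
    """state_machines: {obj: {(state, event): next_state}}, with a ('_start', '_init') entry."""
    current = {obj: sm[("_start", "_init")] for obj, sm in state_machines.items()}
    for receiver, event in messages:
        key = (current[receiver], event)
        if key not in state_machines[receiver]:
            return False, f"{receiver} cannot accept '{event}' in state {current[receiver]}"
        current[receiver] = state_machines[receiver][key]
    return True, "all messages accepted"

lamp = {("_start", "_init"): "off", ("off", "on"): "lit", ("lit", "off"): "off"}
print(consistent({"Lamp": lamp}, [("Lamp", "on"), ("Lamp", "on")]))
# -> (False, "Lamp cannot accept 'on' in state lit")
```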
7492 Symbolic Model Checking of Interactions in Sequence Diagrams with Combined Fragments by SMV
Authors: Yuka Kawakami, Tomoyuki Yokogawa, Hisashi Miyazaki, Sousuke Amasaki, Yoichiro Sato, Michiyoshi Hayase
Abstract:
In this paper, we propose a method for detecting consistency violations between state machine diagrams and a sequence diagram defined in UML 2.0 using SMV. We extended a method that expresses these diagrams, as defined in UML 1.0, with Boolean formulas so that it can also express a sequence diagram with the combined fragments introduced in UML 2.0. This extension makes it possible to represent three types of combined fragment: alternative, option and parallel. As a result of an experiment, we confirmed that the proposed method could detect consistency violations correctly with SMV.
Keywords: UML, model checking, SMV, sequence diagram.
7491 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model
Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh
Abstract:
Floods have a huge environmental and economic impact, so flood prediction receives considerable attention. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we used the Box-Jenkins approach, a four-stage method comprising model identification, parameter estimation, diagnostic checking and forecasting. The main tools used in ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. SAS computed the model parameters using the ML, CLS and ULS methods. The diagnostic checking tests, the AIC criterion, and the RACF and RPACF graphs were used to verify the selected model. In this study, the best ARIMA model for the Annual Maximum Discharge (AMD) time series was (4,1,1), with an AIC value of 88.87. The RACF and RPACF showed the residuals' independence. Forecasting AMD for 10 future years demonstrated the ability of the model to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R2).
Keywords: Time series modelling, stochastic processes, ARIMA model, Karkheh River.
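The original Karkheh discharge series is not included in the abstract, so the sketch below only shows how the reported ARIMA(4,1,1) fit and 10-year forecast could be reproduced with statsmodels on a placeholder series; the random data stand in for the actual annual maximum discharges.

```python
# Sketch of fitting the ARIMA(4,1,1) model named in the abstract with statsmodels.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
amd = pd.Series(1000 + rng.normal(0, 150, 50).cumsum())  # placeholder AMD series

model = ARIMA(amd, order=(4, 1, 1)).fit()
print(model.aic)                   # compare candidate orders by AIC
print(model.forecast(steps=10))    # 10-year-ahead forecast
```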
7490 Research and Development of Intelligent Cooling Channels Design System
Authors: Q. Niu, X. H. Zhou, W. Liu
Abstract:
The cooling channels of an injection mould play a crucial role in determining the productivity of the moulding process and the product quality, and designing high-quality cooling channels is not a simple task. In this paper, an intelligent cooling channel design system including automatic layout of cooling channels, interference checking and assembly of accessories is studied. Automatic layout of cooling channels using a genetic algorithm is analyzed. By integrating empirical criteria for designing cooling channels and considering factors such as mould temperature and interference checking, the automatic layout of cooling channels is implemented. A method for checking interference based on a distance constraint algorithm and a function for automatic and continuous assembly of accessories are developed and integrated into the system. Case studies demonstrate the feasibility and practicality of the intelligent design system.
Keywords: Injection mould, cooling channel, automatic layout, interference checking.
7489 The Negative Effect of Traditional Loops Style on the Performance of Algorithms
Authors: Mahmoud Moh'd Mhashi
Abstract:
A new algorithm called Character-Comparison to Character-Access (CCCA) is developed to test the effect of both 1) converting character-comparison and number-comparison into character-access and 2) the starting point of checking on the performance of the checking operation in string searching. An experiment is performed using both English text and DNA text of different sizes. The results are compared with five algorithms, namely Naive, BM, Inf_Suf_Pref, Raita, and Cycle. With the CCCA algorithm, the results suggest that the average number of total comparisons is improved by up to 35%. Furthermore, the results suggest that the clock time required by the other algorithms is improved by between 22.13% and 42.33% by the new CCCA algorithm.
Keywords: Pattern matching, string searching, character-comparison, character-access, text type, and checking.
7488 BDD Package Based on Boolean NOR Operation
Authors: M. Raseen, A. Assi, P. W. C. Prasad, A. Harb
Abstract:
Binary Decision Diagrams (BDDs) are useful data structures for symbolic Boolean manipulation. BDDs are used in many VLSI/CAD tasks, such as equivalence checking, property checking, logic synthesis, and false-path analysis. In this paper we describe a new approach to the realization of a BDD package. To perform manipulations of Boolean functions, the proposed approach does not depend on the recursive If-Then-Else (ITE) synthesis operation. Instead of using the ITE operation, the basic synthesis algorithm is built on the Boolean NOR operation.
Keywords: Binary Decision Diagram (BDD), ITE operation, Boolean function, NOR operation.
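Because NOR is functionally complete, every synthesis operation a BDD package needs can be rewritten in terms of that single primitive. The toy Python sketch below verifies the relevant identities on plain Boolean values; the actual package applies the same identities to BDD nodes, which is not shown here.

```python
# NOR is functionally complete: NOT, OR, AND (and hence XOR) all reduce to it.
def nor(a, b):
    return not (a or b)

def NOT(a):    return nor(a, a)
def OR(a, b):  return nor(nor(a, b), nor(a, b))
def AND(a, b): return nor(nor(a, a), nor(b, b))
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

# Check the derived operators against Python's own Boolean semantics.
for a in (False, True):
    for b in (False, True):
        assert AND(a, b) == (a and b) and OR(a, b) == (a or b) and XOR(a, b) == (a != b)
print("NOR-derived operators verified on all input combinations")
```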
7487 Formal Verification of Cache System Using a Novel Cache Memory Model
Authors: Guowei Hou, Lixin Yu, Wei Zhuang, Hui Qin, Xue Yang
Abstract:
Formal verification is proposed to ensure the correctness of a design and to make functional verification more efficient. The cache plays a vital role in System on Chip (SoC) design, and a cache with a Memory Management Unit (MMU) and a cache memory unit makes the state space too large to verify by simulation, so a formal verification approach is presented for such a design. In this paper, a formal model checking verification flow is suggested and a new cache memory model, called the "exhaustive search model", is proposed. Instead of using a large RAM to represent the whole cache memory, the exhaustive search model employs just two cache blocks. For a cache system containing a data cache (Dcache) and an instruction cache (Icache), the Dcache and Icache memory models are established separately using the same mechanism. Finally, the novel model is employed in the verification of a cache module of a custom-built SoC system that has been applied in practice; the results show that the cache system is verified correctly using the exhaustive search model, which makes the verification much more manageable and flexible.
Keywords: Cache system, formal verification, novel model, System on Chip (SoC).
7486 An Automatic Model Transformation Methodology Based on Semantic and Syntactic Comparisons and the Granularity Issue Involved
Authors: Tiexin Wang, Sebastien Truptil, Frederick Benaben
Abstract:
Model transformation, as a pivotal aspect of model-driven engineering, attracts more and more attention from both researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also be used to bridge the gap between different domains by sharing and exchanging knowledge. As model transformation becomes widely used, new requirements emerge: defining the transformation process effectively and efficiently and reducing the manual effort involved. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons, and focuses particularly on the granularity issue that arises in the transformation process. Compared to traditional model transformation methodologies, this methodology serves a general, cross-domain purpose. Semantic and syntactic checking measures are combined into a refined transformation process, which solves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool, replacing manual effort.
Keywords: Automatic model transformation, granularity issue, model-driven engineering, semantic and syntactic comparisons.
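The abstract does not define the comparison measures, so the sketch below is only a toy combination of a syntactic score (difflib string similarity) and a semantic score (a hand-made synonym table) into one weighted match score between element names. The synonym entries and weights are illustrative assumptions, not the paper's measures.

```python
# Toy sketch: combine syntactic and semantic matching between model element names.
from difflib import SequenceMatcher

SYNONYMS = {"client": {"customer"}, "order": {"purchase"}}

def syntactic(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic(a, b):
    a, b = a.lower(), b.lower()
    return 1.0 if a == b or b in SYNONYMS.get(a, set()) or a in SYNONYMS.get(b, set()) else 0.0

def match_score(a, b, w_syn=0.4, w_sem=0.6):
    return w_syn * syntactic(a, b) + w_sem * semantic(a, b)

print(match_score("Client", "Customer"))     # low syntactic, high semantic score
print(match_score("OrderLine", "OrderItem")) # no synonym hit, decent syntactic score
```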
7485 Transmission Expansion Planning with Economic Dispatch and N-1 Constraints
Authors: A. Charlangsut, M. Boonthienthong, N. Rugthaicharoencheep
Abstract:
This paper proposes a mathematical model for transmission expansion employing an optimization method with a scenario analysis approach. Economic transmission planning seeks investment opportunities such that network expansions generate more economic benefits than their costs. The approach can be used as a decision model for building new transmission lines added to the existing transmission system, minimizing the costs of the entire system subject to various system constraints while considering transmission losses and N-1 checking. The results show that the proposed model can be applied efficiently to larger-scale power system topologies.
Keywords: Transmission Expansion Planning, Economic Dispatch, Scenario Analysis, Contingency.
7484 Specifying a Timestamp-based Protocol for Multi-step Transactions Using LTL
Authors: Rafat Alshorman, Walter Hussak
Abstract:
Most concurrent transactional protocols consider serializability as the correctness criterion for transaction execution. Usually, serializability is established by mathematical proofs for a fixed, finite number of transactions. In this paper, we introduce a protocol that deals with an infinite number of transactions which are iterated infinitely often. We specify the serializability of the transactions and the protocol using a specification language based on temporal logics. It is worthwhile using temporal logics such as LTL (Linear-time Temporal Logic) to specify transactions, in order to gain fully automatic verification by using model checkers.
Keywords: Multi-step transactions, LTL specifications, model checking.
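The abstract does not reproduce the specification, so the LTL template below is only a hedged illustration of the kind of formula involved: every time transaction T_i begins, its two steps and its commit eventually follow in order, and the transaction is iterated infinitely often. The proposition names are assumptions.

```latex
% Illustrative LTL template, not the paper's actual specification.
\[
  \mathbf{G}\bigl(\mathit{begin}_i \rightarrow
      \mathbf{F}\,(\mathit{step}^{1}_{i} \wedge
      \mathbf{F}\,(\mathit{step}^{2}_{i} \wedge
      \mathbf{F}\,\mathit{commit}_i))\bigr)
  \;\wedge\;
  \mathbf{G}\,\mathbf{F}\,\mathit{begin}_i
\]
```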
7483 Multi-View Description of Real-Time Systems' Architecture
Authors: A. Bessam, M. T. Kimour
Abstract:
Real-time embedded systems should benefit from component-based software engineering to handle complexity and deal with dependability. In these systems, applications must not only be logically correct but must also behave within time windows. However, among current component-based software engineering approaches, few component models handle time properties in a manner that allows efficient analysis and checking at the architectural level. In this paper, we present a meta-model for component-based software description that integrates timing issues. To achieve a complete functional model of software components, our meta-model focuses on four functional aspects: interface, static behavior, dynamic behavior, and interaction protocol. With each aspect we explicitly associate a time model. Such a time model can be used to check a component's design against certain properties and to compute the timing properties of component assemblies.
Keywords: Real-time systems, software architecture, software component, dependability, time properties, ADL, metamodeling.
7482 An Automatic Tool for Checking Consistency between Data Flow Diagrams (DFDs)
Authors: Rosziati Ibrahim, Siow Yen Yen
Abstract:
The system development life cycle (SDLC) is a process used during the development of any system. The SDLC consists of four main phases: analysis, design, implementation and testing. During the analysis phase, a context diagram and data flow diagrams are used to produce the process model of a system. Consistency between the context diagram and the lower-level data flow diagrams is very important for a smooth development process. However, manually checking consistency from the context diagram down to the lower-level data flow diagrams using a checklist is a time-consuming process, and the limits of human ability to spot errors are among the factors that influence the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor to draw the diagrams and as a checker to check the correctness of the diagrams drawn. The consistency check from the context diagram to the lower-level data flow diagrams is embedded inside the tool to overcome the manual checking problem.
Keywords: Data Flow Diagram, Context Diagram, Consistency Check, Syntax and Semantic Rules.
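The tool's rule set is not listed in the abstract; the sketch below only illustrates the classic balancing rule that such a checker automates: the external inputs and outputs of a child diagram must match those of the parent process it decomposes. The flow labels and the 'EXT' convention are assumptions for the example.

```python
# Minimal sketch of the DFD balancing rule between a parent process and its child diagram.
def balanced(parent_inputs, parent_outputs, child_flows):
    """child_flows: list of (source, target, label); external endpoints are 'EXT'."""
    child_in = {label for src, tgt, label in child_flows if src == "EXT"}
    child_out = {label for src, tgt, label in child_flows if tgt == "EXT"}
    issues = []
    if child_in != set(parent_inputs):
        issues.append(f"input mismatch: {child_in ^ set(parent_inputs)}")
    if child_out != set(parent_outputs):
        issues.append(f"output mismatch: {child_out ^ set(parent_outputs)}")
    return issues or ["consistent"]

child = [("EXT", "P1.1", "loan request"), ("P1.1", "P1.2", "validated request"),
         ("P1.2", "EXT", "approval letter")]
print(balanced(["loan request"], ["approval letter", "rejection letter"], child))
# -> ["output mismatch: {'rejection letter'}"]
```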
7481 Verification of Protocol Design Using UML-SMV
Authors: Prashanth C.M., K. Chandrashekar Shet
Abstract:
In the recent past, the Unified Modeling Language (UML) has become the de facto industry standard for object-oriented modeling of software systems. The syntax- and semantics-rich UML has encouraged industry to develop several supporting tools, including those capable of generating deployable products (code) from UML models. As a consequence, ensuring the correctness of the model/design has become a challenging and extremely important task. In this paper, we present an approach for automatic verification of protocol models/designs. As a case study, the Session Initiation Protocol (SIP) design is verified for the property "the CALLER will not converse with the CALLEE before the connection is established between them". SIP is modeled using UML statechart diagrams and the desired properties are expressed in temporal logic. Our prototype verifier, UML-SMV, is used to carry out the verification. We subjected an erroneous SIP model to UML-SMV; the verifier successfully detected the error (in 76.26 ms) and generated the error trace.
Keywords: Unified Modeling Language, Statechart, Verification, Protocol Design, Model Checking.
7480 Economic Loss due to Ganoderma Disease in Oil Palm
Authors: K. Assis, K. P. Chong, A. S. Idris, C. M. Ho
Abstract:
Oil palm, or Elaeis guineensis, is considered the golden crop in Malaysia, but the oil palm industry in this country is now facing a devastating disease known as Ganoderma basal stem rot. The objective of this paper is to analyze the economic loss due to this disease. Three commercial oil palm sites were selected for collecting the data required for the economic analysis. The yield parameter used to measure the loss was the total weight of fresh fruit bunches over six months. The predictors include disease severity, change in disease severity, number of infected neighboring palms, age of palm, planting generation, topography, and first-order interaction variables. The yield loss estimation model was identified by using a backward-elimination-based regression method. Diagnostic checking was conducted on the residuals of the best yield loss model. The mean absolute percentage error (MAPE) was used to measure the forecast performance of the model. The best yield loss model was then used to estimate the economic loss by using the current monthly price of fresh fruit bunches at the mill gate.
Keywords: Ganoderma, oil palm, regression model, yield loss, economic loss.
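The plantation data are not published in the abstract, so the sketch below uses a random placeholder data frame just to show the two modelling steps named there: backward elimination on an ordinary least squares yield-loss model, followed by MAPE as the accuracy check. The predictor names mirror the abstract, but their values and the significance threshold are assumptions.

```python
# Sketch: backward elimination on an OLS yield-loss model, then MAPE, on placeholder data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "severity": rng.uniform(0, 4, 120),
    "infected_neighbors": rng.integers(0, 8, 120),
    "palm_age": rng.uniform(5, 25, 120),
})
df["yield_loss"] = 2.0 * df["severity"] + 0.5 * df["infected_neighbors"] + rng.normal(0, 1, 120)

def backward_eliminate(y, X, alpha=0.05):
    """Drop the least significant predictor until all remaining p-values are below alpha."""
    X = sm.add_constant(X)
    while True:
        model = sm.OLS(y, X).fit()
        worst = model.pvalues.drop("const").idxmax()
        if model.pvalues[worst] <= alpha:
            return model
        X = X.drop(columns=[worst])

model = backward_eliminate(df["yield_loss"], df[["severity", "infected_neighbors", "palm_age"]])
pred = model.predict()
mape = np.mean(np.abs((df["yield_loss"] - pred) / df["yield_loss"])) * 100
print(model.params, f"MAPE = {mape:.1f}%")
```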
7479 Modernization of the Economic Price Adjustment Software
Authors: Roger L Goodwin
Abstract:
The US Consumer Price Indices (CPIs) measure hundreds of items in the US economy, and many social programs and government benefits are indexed to the CPIs. The purpose of this project is to modernize an existing process. This paper shows the development of a small visual software product that documents the Economic Price Adjustment (EPA) for long-term contracts. The existing workbook does not provide the flexibility to calculate EPAs where the base month and the option month are different, nor does it provide automated error checking. The small visual software product provides the additional flexibility and error checking. This paper presents the feedback on the project.
Keywords: Consumer Price Index, Economic Price Adjustment, contracts, visualization tools, database, reports, forms, event procedures.
7478 Patients' Perceptions of Receiving a Diagnosis of a Hematological Malignancy, Following the SPIKES Protocol
Abstract:
Objective: Sharing devastating news with patients is often considered the most difficult task of doctors. This study aimed to explore patients' perceptions of receiving bad news, including which features improve the experience and which areas need refining. Methods: A questionnaire was written based on the steps of the SPIKES model for breaking bad news. Twenty patients receiving treatment for a hematological malignancy completed the questionnaire. Results: Overall, the results are promising, as most patients praised their consultation. 'Poor' was more commonly rated by women and by participants aged 45-64. The main differences between the 'excellent' and 'poor' consultations were the doctor's sensitivity and checking the patient's understanding. Only 35% of patients were asked about their existing knowledge, and 85% of consultations failed to discuss the impact of the diagnosis on daily life. Conclusion: This study agreed with the consensus of the existing literature. The commended aspects include consultation set-up and the information given. Areas patients felt needed improvement include doctors determining the patient's existing knowledge and checking that new information has been understood. Doctors should also explore how the diagnosis will affect the patient's life. With a poorer prognosis, doctors should work on conveying appropriate hope. The study was limited by a small sample size and potential recall bias.
Keywords: Communication, diagnosis, hematology, patients.
7477 ISCS (Information Security Check Service) for the Safety and Reliability of Communications
Authors: Jong-Whoi Shin, Jin-Tae Lee, Sang-Soo Jang, Jae-II Lee
Abstract:
The recent widespread use of information and communication technology has greatly changed the information security risks that businesses and institutions encounter. In this situation, to ensure security and confidence in electronic trading, it has become important for organizations to take competent information security measures that provide international confidence that sensitive information is secure. Against this backdrop, the approach to information security checking has become an important issue, one believed to be common to all countries. The purpose of this paper is to introduce the new information security checking program in Korea and to propose synthetic information security countermeasures under domestic circumstances in order to protect physical equipment, security management and technology, and the operation of security checks for securing services on ISPs (Internet Service Providers), IDCs (Internet Data Centers), and e-commerce (shopping malls, etc.).
Keywords: Information Security Check Service, safety criteria, object enterpriser.
7476 A Paradigm for Characterization and Checking of a Human Noise Behavior
Authors: Himanshu Dehra
Abstract:
This paper presents a paradigm for the characterization and checking of human noise behavior. Definitions of 'noise' and 'noise behavior' are devised. The concept of characterizing and examining noise behavior is derived from the proposed paradigm of psychoacoustics. The measurement of human noise behavior is discussed through definitions of noise sources and noise measurements. The noise sources, noise measurement equations and noise filters are further illustrated through examples. The theory and significance of solar energy acoustics are presented for life and its activities. Human comfort and health are correlated with the human brain through physiological responses and noise protection. Examples of heat stress, intense heat, sweating and evaporation are also enumerated.
Keywords: Human brain, noise behavior, noise characterization, noise filters, physiological responses, psychoacoustics.
7475 Similarity Detection in Collaborative Development of Object-Oriented Formal Specifications
Authors: Fathi Taibi, Fouad Mohammed Abbou, Md. Jahangir Alam
Abstract:
The complexity of today's software systems makes collaborative development necessary to accomplish tasks. Frameworks are necessary to allow developers to perform their tasks independently yet collaboratively. Similarity detection is one of the major issues to consider when developing such frameworks: it allows developers to mine existing repositories when developing their own views of a software artifact, and it is necessary for identifying the correspondences between the views in order to merge them and check their consistency. Given the importance of the requirements specification stage in software development, this paper proposes a framework for collaborative development of Object-Oriented formal specifications, along with a similarity detection approach to support the creation, merging and consistency checking of specifications. The paper also explores the impact of using additional concepts on improving the matching results. Finally, the proposed approach is empirically evaluated.
Keywords: Collaborative development, formal methods, Object-Oriented, similarity detection.
7474 Design and Implementation of 4 Bit Multiplier Using Fault Tolerant Hybrid Full Adder
Authors: C. Kalamani, V. Abishek Karthick, S. Anitha, K. Kavin Kumar
Abstract:
Fault-tolerant systems play a crucial role in the critical applications in use today, since a fault may change the functionality of a circuit. The aim of this paper is to design a multiplier using a fault-tolerant hybrid full adder. The fault-tolerant hybrid full adder is designed to check and repair any fault in the circuit using a self-checking circuit and a self-repairing circuit. Furthermore, the use of conventional logic circuits may result in greater area, delay and power consumption. In order to reduce these parameters, GDI (Gate Diffusion Input) techniques, which use fewer transistors than the conventional full adder circuit, are employed. This reduces the area, delay and power consumption. The proposed method addresses the major problems occurring in the most crucial and critical applications.
Keywords: Gate diffusion input, hybrid full adder, self-checking, fault tolerant.
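The paper's adder is a transistor-level GDI design, which cannot be reproduced from the abstract; the Python sketch below only illustrates the behavioural idea of self-checking by duplication and repair by switching to a spare module. The fault model (a stuck-at-0 sum bit) and the module names are assumptions.

```python
# Behavioural illustration of duplicate-and-compare self-checking with a spare module.
def full_adder(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def faulty_full_adder(a, b, cin):      # stuck-at-0 fault injected on the sum bit
    s, cout = full_adder(a, b, cin)
    return 0, cout

def self_checking_adder(primary, duplicate, spare, a, b, cin):
    out1, out2 = primary(a, b, cin), duplicate(a, b, cin)
    if out1 == out2:
        return out1                    # outputs agree: assume fault-free
    return spare(a, b, cin)            # mismatch detected: switch to the spare

print(self_checking_adder(faulty_full_adder, full_adder, full_adder, 1, 0, 0))
# -> (1, 0), the correct sum/carry despite the faulty primary module
```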
7473 Towards Model-Driven Communications
Authors: Antonio Natali, Ambra Molesini
Abstract:
In modern distributed software systems, communication among the composing parts is a critical point, yet extending conventional programming languages with general-purpose communication constructs seems difficult to realize. As a consequence, there is a growing gap between the abstraction level required by distributed applications and the concepts provided by the platforms that enable communication. This work discusses how the Model Driven Software Development approach can be considered a mature technology for automatically generating the schematic, communication-related part of applications, while providing high-level specialized languages useful in all phases of software production. To achieve this goal, a stack of languages (meta-metamodels) is introduced in order to describe, at different levels of abstraction, the collaborative behavior of generic entities in terms of communication actions related to a taxonomy of messages. Finally, the generation of communication platforms is viewed as a form of specification of language semantics that provides executable models of applications together with model-checking support and effective runtime environments.
Keywords: Interactions, specific languages, meta-models, model driven development.
7472 A Model of Market Segmentation for the Customers of Mellat Bank in Iran
Authors: Nader Gharibnavaz, Hossein Yazdi
Abstract:
If an organization like Mellat Bank wants to identify its customer market completely in order to reach its specified goals, it can segment the market so as to offer the right product package to the right segment. Our objective is to offer a segmentation model for the Iranian banking market from Mellat Bank's point of view. The methodology of this project combines 'segmentation on the basis of four part-quality variables' and 'segmentation on the basis of differences in means'. The required data are gathered from E-Systems and the researcher's personal observation. Finally, the research recommends that the organization first form a four-dimensional matrix with 756 segments using four variables, named value-based, behavioral, activity style, and activity level, and then calculate the mean profit for every cell of the matrix at two distinguished work levels (α1: normal condition and α2: high-pressure condition) and compare the segments by checking two conditions: 1) homogeneity of every segment with its sub-segments and 2) heterogeneity with other segments; in this way the necessary segmentation process can be carried out. The last recommendation (further explained by an operational example and a feedback algorithm) is to test and update the model in response to the dynamic environment, technology, and banking system.
Keywords: Market segmentation model, banking system, Mellat Bank.