Search results for: Knowledge based systems in medicine

13673 AGENTMAP: A Conceptual Meta-Model of Interacting Simulations

Authors: Thomas M. Prinz, Wilhelm R. Rossak, Kai Gebhardt

Abstract:

A straightforward and intuitive combination of single simulations into an aggregated master simulation is not trivial. Many problems trigger specific difficulties during the modeling and execution of such a simulation. In this paper we identify these problems and aim to solve them by mapping the task to the field of multi-agent systems. The solution is a new meta-model named AGENTMAP, which is able to mitigate most of the problems and to support intuitive modeling at the same time. This meta-model is introduced and explained on the basis of an example from the e-commerce domain.

Keywords: Multi Agent System, Agent-based Simulation, Distributed Systems, Meta-models.

13672 A Cohesive Lagrangian Swarm and Its Application to Multiple Unicycle-like Vehicles

Authors: Jito Vanualailai, Bibhya Sharma

Abstract:

Swarm principles are increasingly being used to design controllers for the coordination of multi-robot systems or, in general, multi-agent systems. This paper proposes a two-dimensional Lagrangian swarm model that enables the planar agents, modeled as point masses, to swarm whilst effectively avoiding each other and obstacles in the environment. A novel method, based on an extended Lyapunov approach, is used to construct the model. Importantly, the Lyapunov method ensures a form of practical stability that guarantees an emergent behavior, namely, a cohesive and well-spaced swarm with a constant arrangement of individuals about the swarm centroid. Computer simulations illustrate this basic feature of collective behavior. As an application, we show how multiple planar mobile unicycle-like robots swarm to eventually form patterns in which their velocities and orientations stabilize.
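
The attractive-repulsive mechanism underlying such individual-based swarm models can be sketched in a few lines of Python. This is only an illustrative sketch: the cohesion and repulsion gains a, b, c and the simple Euler integration below are assumed values for demonstration, not the Lyapunov-based construction of the paper.

import numpy as np

def swarm_step(x, dt=0.01, a=1.0, b=0.5, c=0.2):
    """One Euler step of a generic attractive-repulsive swarm.
    x: (N, 2) array of planar agent positions.
    a: attraction gain towards the swarm centroid.
    b, c: strength and range of the short-range pairwise repulsion."""
    centroid = x.mean(axis=0)
    v = -a * (x - centroid)                        # cohesion term
    diff = x[:, None, :] - x[None, :, :]           # pairwise displacements
    dist2 = (diff ** 2).sum(-1) + np.eye(len(x))   # avoid division by zero on the diagonal
    v += b * (diff * np.exp(-dist2 / c)[..., None] / dist2[..., None]).sum(axis=1)
    return x + dt * v

# usage: 20 agents contracting into a cohesive, well-spaced cluster
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, size=(20, 2))
for _ in range(2000):
    x = swarm_step(x)
print("mean spread about centroid:", np.linalg.norm(x - x.mean(axis=0), axis=1).mean())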

Keywords: Attractive-repulsive swarm model, individual-based swarm model, Lagrangian swarm model, Lyapunov stability, Lyapunov-like function, practical stability, unicycle.

13671 High Impedance Faults Detection Technique Based on Wavelet Transform

Authors: Ming-Ta Yang, Jin-Lung Guan, Jhy-Cherng Gu

Abstract:

The purpose of this paper is to solve the problem of protecting aerial lines from high impedance faults (HIFs) in distribution systems. This investigation successfully applies the 3I0 zero-sequence current to the HIF problem. A feature extraction system based on the discrete wavelet transform (DWT) and a feature identification technique founded on statistical confidence are then applied to discriminate effectively between HIFs and switch operations. A pattern recognition scheme for HIFs based on the continuous wavelet transform (CWT) is also proposed. Staged fault testing results demonstrate that the proposed wavelet-based algorithm is feasible and performs well.
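
As a hedged illustration of the DWT feature-extraction step (not the authors' exact implementation), the sketch below decomposes a zero-sequence current window with PyWavelets and uses the detail-band energies as features; the wavelet family, decomposition level and the synthetic signals are assumptions.

import numpy as np
import pywt

def dwt_features(i0, wavelet="db4", level=4):
    """Decompose a window of zero-sequence current (3I0) with the DWT and
    return the energy of each detail band as a feature vector.
    The wavelet family and the decomposition level are illustrative choices."""
    coeffs = pywt.wavedec(i0, wavelet, level=level)
    return np.array([float(np.sum(c ** 2)) for c in coeffs[1:]])  # detail-band energies

# usage: compare a clean fundamental-frequency window with a window containing
# a noisy, intermittent HIF-like distortion (synthetic data, for illustration only)
fs = 3840
t = np.arange(0, 0.2, 1 / fs)
rng = np.random.default_rng(0)
normal = np.sin(2 * np.pi * 60 * t)
hif = normal + 0.05 * rng.standard_normal(t.size) * (t > 0.1)
print("normal window:", dwt_features(normal).round(4))
print("HIF window:   ", dwt_features(hif).round(4))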

Keywords: Continuous wavelet transform, discrete wavelet transform, high impedance faults, statistical confidence.

13670 Robust Adaptive ELS-QR Algorithm for Linear Discrete Time Stochastic Systems Identification

Authors: Ginalber L. O. Serra

Abstract:

This work proposes a recursive weighted ELS algorithm for system identification that applies numerically robust orthogonal Householder transformations. The properties of the proposed algorithm show that it obtains acceptable results in a noisy environment: fast convergence and asymptotically unbiased estimates. Comparative analyses with other robust methods well known from the literature are also presented.
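
A simplified batch analogue of the idea can be sketched with NumPy, whose QR factorization is computed via Householder reflections. This is not the recursive ELS-QR algorithm of the paper (no noise-model extension and no recursive update); the model orders, forgetting factor and test system below are assumptions.

import numpy as np

def arx_qr_identify(u, y, na=2, nb=2, lam=0.98):
    """Weighted least-squares ARX identification solved through a QR
    factorization (Householder-based in NumPy). lam is a forgetting factor
    giving exponentially larger weight to recent samples."""
    n = max(na, nb)
    rows, targets = [], []
    for k in range(n, len(y)):
        rows.append(np.r_[-y[k - na:k][::-1], u[k - nb:k][::-1]])
        targets.append(y[k])
    Phi, Y = np.asarray(rows), np.asarray(targets)
    w = np.sqrt(lam ** np.arange(len(Y))[::-1])     # exponential weighting
    Q, R = np.linalg.qr(Phi * w[:, None])           # numerically robust QR step
    theta = np.linalg.solve(R, Q.T @ (Y * w))
    return theta                                    # [a1..a_na, b1..b_nb]

# usage: identify y[k] = 1.5 y[k-1] - 0.7 y[k-2] + u[k-1] + 0.5 u[k-2] + noise
rng = np.random.default_rng(1)
u = rng.standard_normal(500)
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + u[k-1] + 0.5 * u[k-2] + 0.05 * rng.standard_normal()
print(arx_qr_identify(u, y).round(3))   # expect roughly [-1.5, 0.7, 1.0, 0.5]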

Keywords: Stochastic Systems, Robust Identification, Parameter Estimation, Systems Identification.

13669 Comparison of Two Maintenance Policies for a Two-Unit Series System Considering General Repair

Authors: Seyedvahid Najafi, Viliam Makis

Abstract:

In recent years, maintenance optimization has attracted special attention due to the growth of industrial systems complexity. Maintenance costs are high for many systems, and preventive maintenance is effective when it increases operations' reliability and safety at a reduced cost. The novelty of this research is to consider general repair in the modeling of multi-unit series systems and solve the maintenance problem for such systems using the semi-Markov decision process (SMDP) framework. We propose an opportunistic maintenance policy for a series system composed of two main units. Unit 1, which is more expensive than unit 2, is subjected to condition monitoring, and its deterioration is modeled using a gamma process. The hazard rate of unit 1 is estimated by the proportional hazards model (PHM), and two hazard rate control limits are considered as the thresholds of maintenance interventions for unit 1. Maintenance is performed on unit 2 considering an age control limit. The objective is to find the optimal control limits and minimize the long-run expected average cost per unit time. The proposed algorithm is applied to a numerical example to compare the effectiveness of the proposed policy (policy Ⅰ) with policy Ⅱ, which is similar to policy Ⅰ but performs replacement instead of general repair. Results show that policy Ⅰ leads to a lower average cost than policy Ⅱ.
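
Only as an illustration of the condition-monitoring side of such a policy (not the SMDP optimization itself), the sketch below simulates gamma-process deterioration for unit 1 and triggers maintenance when a proportional-hazards (PHM) hazard rate crosses a control limit; the Weibull baseline, covariate coefficient, gamma-process parameters and the limit value are assumed numbers.

import numpy as np

def phm_hazard(t, z, beta=2.0, eta=100.0, gamma=0.02):
    """Weibull baseline hazard modulated by a degradation covariate (PHM):
    h(t|z) = (beta/eta) * (t/eta)**(beta-1) * exp(gamma * z).
    All parameter values here are illustrative assumptions."""
    return (beta / eta) * (t / eta) ** (beta - 1) * np.exp(gamma * z)

# usage: unit 1 degradation as a stationary gamma process, maintenance
# triggered when the PHM hazard crosses the control limit d
rng = np.random.default_rng(2)
dt, limit = 1.0, 0.08
t, z = 0.0, 0.0
while True:
    t += dt
    z += rng.gamma(shape=0.5, scale=1.5)     # gamma-process degradation increment
    if phm_hazard(t, z) >= limit:
        print(f"maintenance triggered at t = {t:.0f}, degradation level z = {z:.1f}")
        break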

Keywords: Condition-based maintenance, proportional hazards model, semi-Markov decision process, two-unit series systems.

13668 Leveraging Quality Metrics in Voting Model Based Thread Retrieval

Authors: Atefeh Heydari, Mohammadali Tavakoli, Zuriati Ismail, Naomie Salim

Abstract:

Seeking and sharing knowledge on online forums have made them popular in recent years. Although online forums are valuable sources of information, retrieving reliable threads with high-quality content is an issue due to the variety of message sources. The majority of existing information retrieval systems ignore the quality of retrieved documents, particularly in the field of thread retrieval. In this research, we present an approach that employs various quality features in order to investigate the quality of retrieved threads. Different aspects of content quality, including completeness, comprehensiveness, and politeness, are assessed using these features, which leads to finding not only textually but also conceptually relevant threads for a user query within a forum. To analyse the influence of the features, we used an adapted version of the voting model for thread search as the retrieval system. We equipped it with each feature individually, and with various combinations of features in turn, during multiple runs. The results show that incorporating the quality features significantly enhances the effectiveness of the retrieval system.
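
A minimal sketch of a quality-aware voting model for thread retrieval follows; the CombSUM-style vote aggregation and the linear blend with a thread-level quality score are illustrative assumptions, not the exact formulation used in the paper.

from collections import defaultdict

def score_threads(message_scores, thread_of, quality, alpha=0.7):
    """Voting-model thread retrieval: each retrieved message 'votes' for its
    parent thread with its relevance score, and the accumulated vote is blended
    with a thread quality score (e.g. completeness, politeness). alpha and the
    linear blend are assumed here for illustration."""
    votes = defaultdict(float)
    for msg_id, rel in message_scores.items():
        votes[thread_of[msg_id]] += rel
    return sorted(((alpha * v + (1 - alpha) * quality.get(tid, 0.0), tid)
                   for tid, v in votes.items()), reverse=True)

# usage with toy data
message_scores = {"m1": 0.9, "m2": 0.4, "m3": 0.7}   # message retrieval scores
thread_of = {"m1": "T1", "m2": "T1", "m3": "T2"}     # message -> thread mapping
quality = {"T1": 0.3, "T2": 0.9}                     # thread quality features
print(score_threads(message_scores, thread_of, quality))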

Keywords: Content quality, Forum search, Thread retrieval, Voting techniques.

13667 Concept for a Multidisciplinary Design Process–An Application on High Lift Systems

Authors: P. Zamov, H. Spangenberg

Abstract:

This paper presents a concept for a multidisciplinary process supporting effective task transitions between different technical domains during the architectural design stage. A challenge in system configuration is the solution space enlarged by multifunctional requirements. As a consequence, more iterations are needed to find a global optimum, i.e. a compromise between the involved disciplines, without a negative impact on development time. Since state-of-the-art standards like ISO 15288 and VDI 2206 do not provide a detailed methodology for the multidisciplinary design process, higher uncertainties regarding final specifications arise. This leads to the need for more detailed and standardized concepts or processes which could mitigate risks. The work performed is based on an analysis of multidisciplinary interactions and of modeling and simulation techniques. To demonstrate and prove the applicability of the presented concept, it is applied to the design of aircraft high lift systems, in the context of the engineering disciplines of kinematics, actuation, monitoring, installation and structure design.

Keywords: Systems engineering, multidisciplinary, architectural design, high lift system.

13666 The Management of Large Emergency Situations – A Best Practice Case Study Based on GIS for Management of Evacuation

Authors: Ion Baş, Claudiu Zoicaş, Angela Ioniţâ

Abstract:

In most cases, natural disasters lead to the necessity of evacuating people. The quality of evacuation management is dramatically improved by the use of information provided by decision support systems, which become indispensable in large-scale evacuation operations. This paper presents a best practice case study. In November 2007, officers from the Emergency Situations Inspectorate "Crisana" of Bihor County, Romania, participated in a cross-border evacuation exercise in which 700 people were evacuated from the Netherlands to Belgium. One of the main objectives of the exercise was the testing of four different decision support systems. Afterwards, based on that experience, a software system called TEVAC (Trans-Border Evacuation) was developed in-house by the experts of this institution. This original software system was successfully tested in September 2008 during the international exercise EU-HUROMEX 2008, whose scenario involved the real evacuation of 200 persons from Hungary to Romania. Based on the lessons learned and the results, the TEVAC software has been used by all Emergency Situations Inspectorates across Romania since April 2009.

Keywords: Emergency evacuation, Searching Features, TEVAC (Trans-Border Evacuation) software system, User Interface Design.

13665 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems

Authors: Belkacem Laimouche

Abstract:

With the field of Artificial Intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing their performance in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.

Keywords: Artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, inter-laboratory comparison, data analysis, data reliability, bias impact assessment, bias measurement.

13664 Heuristics Analysis for Distributed Scheduling using MONARC Simulation Tool

Authors: Florin Pop

Abstract:

Simulation is a very powerful method for high-performance and high-quality design in distributed systems, and may now be the only one, considering the heterogeneity, complexity and cost of distributed systems. In Grid environments, for example, it is hard and even impossible to perform scheduler performance evaluation in a repeatable and controllable manner, as resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately sized test-bed is expensive and time consuming. Scalability, reliability and fault-tolerance become important requirements for distributed systems in order to support distributed computation. A distributed system with such characteristics is called dependable. Large environments, like the Cloud, offer unique advantages, such as low cost and dependability, and satisfy QoS for all users. Resource management in large environments requires performant scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed in order to satisfy user constraints while considering, at the same time, the independent capabilities of resources. This analysis acts as a profiling step for algorithm calibration. The performance evaluation is based on simulation. The simulator is MONARC, a powerful tool for large-scale distributed systems simulation. The novelty of this paper consists in synthetic analysis results that offer guidelines for scheduler service configuration and support empirically based decisions. The results could be used in decisions regarding optimizations to existing Grid DAG scheduling and for selecting the proper algorithm for DAG scheduling in various actual situations.

Keywords: Scheduling, Simulation, Performance Evaluation, QoS, Distributed Systems, MONARC

13663 Congestion Management in a Deregulated Power System with Micro Grid

Authors: Guguloth Ramesh, T. K. Sunil Kumar

Abstract:

This paper presents congestion management in deregulated power systems. In a deregulated environment, every buyer wants to buy power from the cheapest generator available, irrespective of the relative geographical locations of buyer and seller. As a consequence, the transmission corridors evacuating the power of cheaper generators would become overloaded if all such transactions were approved. Congestion management is a mechanism to prioritize the transactions and commit to a schedule which does not overload the network. The congestion in the transmission lines is determined by an Optimal Power Flow (OPF) solution, which is carried out by a primal linear programming method. Congestion in the transmission lines is alleviated by connecting Distributed Generation (DG) of a microgrid at the load bus. A method to determine the optimal location of the DG unit is suggested, based on a transmission line relief sensitivity approach. The effectiveness of the proposed method is demonstrated on modified IEEE 14-bus and 30-bus test systems.

Keywords: Congestion management, Distribution Generation (DG), Transmission Line Relief (TLR) sensitivity index, OPF.

13662 Hardware Error Analysis and Severity Characterization in Linux-Based Server Systems

Authors: N. Georgoulopoulos, A. Hatzopoulos, K. Karamitsios, K. Kotrotsios, A. I. Metsai

Abstract:

Current server systems are responsible for critical applications that run in different infrastructures, such as the cloud, physical machines, and virtual machines. A common challenge that these systems face is the various hardware faults that may occur due to high load, among other reasons, which translate into errors resulting in malfunctions or even server downtime. The hardware parts causing most of the errors are the CPU, the RAM, and the hard disk drive (HDD). In this work, we investigate selected CPU, RAM, and HDD errors, observed or simulated in kernel ring buffer log files from GNU/Linux servers. Moreover, a severity characterization is given for each error type. Understanding these errors is crucial for the efficient analysis of kernel logs that are usually utilized for monitoring servers and diagnosing faults. In addition, to support the previous analysis, we present possible ways of simulating hardware errors in RAM and HDD, aiming to facilitate the testing of methods for detecting and tackling the above issues in a server running on GNU/Linux.
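
As a hedged illustration of the kind of kernel ring buffer analysis described, the sketch below matches dmesg-style lines against a few CPU/RAM/HDD error patterns and assigns a severity label. The regexes, sample log lines and severity labels are assumptions, since real kernel messages vary across kernel versions and hardware.

import re

# Illustrative patterns only: not an exhaustive or authoritative catalogue.
PATTERNS = [
    (re.compile(r"Machine Check Exception|mce: .*Hardware Error", re.I), "CPU", "critical"),
    (re.compile(r"EDAC .*([CU]E) memory error", re.I),                   "RAM", "warning/critical"),
    (re.compile(r"I/O error|critical medium error|UncorrectableError", re.I), "HDD", "critical"),
]

def classify(line):
    """Return (component, severity) for a kernel ring buffer line, or None."""
    for pattern, component, severity in PATTERNS:
        if pattern.search(line):
            return component, severity
    return None

# usage: feed it lines from `dmesg` or /var/log/kern.log (samples below are synthetic)
for line in [
    "mce: [Hardware Error]: CPU 3: Machine Check Exception: 4 Bank 2",
    "EDAC MC0: 1 CE memory error on DIMM_A1",
    "blk_update_request: I/O error, dev sda, sector 123456",
]:
    print(classify(line), "<-", line)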

Keywords: Hardware errors, kernel logs, GNU/Linux servers, RAM, HDD, CPU.

13661 Generalized Rough Sets Applied to Graphs Related to Urban Problems

Authors: Mihai Rebenciuc, Simona Mihaela Bibic

Abstract:

A branch of modern mathematics, graph theory provides instruments for optimization and for solving practical applications in various fields such as economic networks, engineering, network optimization, the geometry of social action and, generally, complex systems, including contemporary urban problems (path or transport efficiencies, biourbanism, etc.). This paper studies the interconnection of urban networks, which leads to the problem of simulating one digraph by another. The simulation may be univocal or, more generally, multivocal. The concepts of fragment and atom are very useful in the study of connectivity in the simulating digraph, including an alternative evaluation of k-connectivity. The rough set approach to (bi)digraphs, proposed for the first time in this paper, contributes to a significantly improved evaluation of k-connectivity. This approach is based on generalized rough sets, whose basic facts are also presented in the paper.

Keywords: (Bi)digraphs, rough set theory, systems of interacting agents, complex systems.

13660 Information Retrieval: Improving Question Answering Systems by Query Reformulation and Answer Validation

Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour

Abstract:

Question answering (QA) aims at retrieving precise information from a large collection of documents. Most question answering systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems by reformulating questions. The answer processing module, moreover, is an emerging topic in QA systems, where these systems are often required to rank and validate candidate answers. These techniques, aiming at finding short and precise answers, are often based on semantic relations and keyword co-occurrence. This paper discusses a new model for question answering which improves the two main modules, question processing and answer processing, both of which affect the evaluation of system operation. Two important components form the basis of question processing. The first component is question classification, which specifies the types of question and answer. The second is reformulation, which converts the user's question into a question understandable by the QA system in a specific domain. The objective of an answer validation task is thus to judge the correctness of an answer returned by a QA system according to the text snippet given to support it. For validating answers, we apply candidate answer filtering and candidate answer ranking, followed by a final validation stage based on user voting. The paper also describes the new architecture of the question and answer processing modules together with the modeling, implementation and evaluation of the system. The system differs from most question answering systems in its answer validation model, which makes it more suitable for finding exact answers. Evaluation of the model on a total of 50 questions shows a 92% improvement in the system's decisions.

Keywords: Answer processing, answer validation, classification, question answering, query reformulation.

13659 Actual Nursing Competency among Nurses in Hospital in Vietnam

Authors: Do Thi Ha, Khanitta Nuntaboot

Abstract:

Background: Competency of nurses is vital to safe nursing practice as well as an essential component in driving the quality of nursing services. There exists little up-to-date information concerning actual competency among Vietnamese nurses. Purposes: The purpose of this study is to identify the actual nursing competency among nurses in clinical settings in Vietnam. Methods: A qualitative, ethnographic study comprising participant observation, in-depth interviews, and focus group discussions with multidisciplinary groups of nurses employed in Cho Ray Hospital, Vietnam, managers/administrators, nurse teachers, medical doctors, other health care providers, patients and family members, selected using a purposeful sampling technique. Content analysis was used for data analysis. Results: Five essential themes of nursing competencies among nurses were identified: (1) knowledge, (2) skills, (3) attitude and value-based nursing practice, (4) legal and ethical competencies, and (5) transcultural competencies. Basic and advanced knowledge were identified as two further dimensions of knowledge. Five subthemes were identified as further dimensions of skills: technical skills, communication skills, organizing and management skills, teamwork and interrelationship, and critical thinking skills. Conclusions: The findings from this study provide valuable information and understanding of the actual competency among nurses in clinical settings in Vietnam. It is expected that this understanding will assist in developing a guide to nursing education and training, nursing practice and relevant policy regulation used for promoting nursing competency among nurses.

Keywords: Nursing competency, qualitative design, ethnographic method, Vietnam.

13658 A World Map of Seabed Sediment Based on 50 Years of Knowledge

Authors: T. Garlan, I. Gabelotaud, S. Lucas, E. Marchès

Abstract:

Production of a global sedimentological seabed map was initiated in 1995 to provide the necessary tool for searches of aircraft and boats lost at sea, to give sedimentary information for nautical charts, and to provide input data for acoustic propagation modelling. This approach had already been initiated one century ago, when the French hydrographic service and the University of Nancy produced maps of the distribution of marine sediments of the French coasts and then sediment maps of the continental shelves of Europe and North America. The current map of ocean sediments presented here was initiated from UNESCO's general map of the deep ocean floor. This map was adapted using a single sediment classification to present all types of sediments: from beaches to the deep seabed and from glacial deposits to tropical sediments. In order to allow good visualization and to be adapted to the different applications, only the granularity of sediments is represented. Published seabed maps are studied; if they are of interest, the nature of the seabed is extracted from them, the sediment classification is transcribed and the resulting map is integrated into the world map. Data also come from interpretations of Multibeam Echo Sounder (MES) imagery from large deep-ocean hydrographic surveys. These allow very high-quality mapping of areas that until then were represented as homogeneous. The third and principal source of data comes from the integration of regional maps produced specifically for this project. These regional maps are prepared using all the bathymetric and sedimentary data of a region. This step makes it possible to produce a regional synthesis map, with generalizations in the case of over-precise data. 86 regional maps of the Atlantic Ocean, the Mediterranean Sea, and the Indian Ocean have been produced and integrated into the world sedimentary map. This work is ongoing and yields a digital version every two years, with the integration of new maps. This article describes the choices made in terms of sediment classification, the scale of source data and the zonation of the variability of the quality. This map is the final step in a system comprising the Shom Sedimentary Database, enriched by more than one million punctual and surface items of data, and four series of coastal seabed maps at 1:10,000, 1:50,000, 1:200,000 and 1:1,000,000. This step-by-step approach makes it possible to take into account the progress in knowledge made in the field of seabed characterization during the last decades. Thus, the arrival of new classification systems for the seafloor has improved the recent seabed maps, and the compilation of these new maps with those previously published allows a gradual enrichment of the world sedimentary map. However, much work remains to enhance some regions, which are still based on data acquired more than half a century ago.

Keywords: Marine sedimentology, seabed map, sediment classification, World Ocean.

13657 Reliability Analysis of Press Unit using Vague Set

Authors: S. P. Sharma, Monica Rani

Abstract:

In conventional reliability assessment, the reliability data of system components are treated as crisp values. The collected data have some uncertainties due to errors by human beings or machines, or from other sources. These uncertainty factors limit the understanding of system component failure because of incomplete data. In such situations, we need to generalize classical methods to the fuzzy environment for studying and analyzing the systems of interest. Fuzzy set theory has been proposed to handle such vagueness by generalizing the notion of membership in a set. Essentially, in a Fuzzy Set (FS) each element is associated with a point value selected from the unit interval [0, 1], which is termed the grade of membership in the set. A Vague Set (VS), as well as an Intuitionistic Fuzzy Set (IFS), is a further generalization of an FS. Instead of the point-based membership used in an FS, interval-based membership is used in a VS. The interval-based membership in a VS is more expressive in capturing the vagueness of data. In the present paper, vague set theory coupled with the conventional Lambda-Tau method is presented for the reliability analysis of repairable systems. The methodology uses Petri nets (PN) to model the system instead of a fault tree because they allow efficient simultaneous generation of minimal cut and path sets. The presented method is illustrated with the press unit of a paper mill.

Keywords: Lambda-Tau methodology, Petri nets, repairable system, vague fuzzy set.

13656 Patterns of Sports Supplement Use among Iranian Female Athletes

Authors: A. Golshanraz, L. Hakemi, L. Pourkazemi, E. Dadgostar, F. Moradzandi, R. Tabatabaee, F. Moradi, K. Hosseinihajiagha, N. Jazayeri, H. Abedifar, R. Fouladi, M. Khooban, H. Saboori, M. Kiani, M. Sajedi, E. Karooninejad, S. Moeen, M. Ghavam, F. Beiranvand, S. Mansoori, F. Gheisari, H. Barzegari

Abstract:

Supplement use is common among athletes. Besides their cost, supplements may have side effects on health and performance. 250 questionnaires were distributed among female athletes (mean age 27.08 years). The questionnaire aimed to explore the frequency, type, beliefs, attitudes and knowledge regarding dietary supplements. Knowledge was good in 30.3%, fair in 60.2%, and poor in 9.1% of respondents. 65.3% of athletes did not use supplements regularly. The most widely used supplements were vitamins (48.4%), minerals (42.9%), energy supplements (21.3%), and herbals (20.9%). 68.9% of athletes believed in their efficacy, 34.4% experienced performance enhancement, and 6.8% reported side effects. 68.2% reported little knowledge and 60.9% were eager to learn more. In conclusion, many of the female athletes believe in the efficacy of supplements and think they are an unavoidable part of competitive sports. However, their knowledge is not sufficient, and emphasis must be placed on education, counselling sessions, and rational prescription.

Keywords: Athlete, female, sports, supplement.

13655 Analysis of Thermal Damping in Si Based Torsional Micromirrors

Authors: R. Resmi, M. R. Baiju

Abstract:

The thermal damping of a dynamically vibrating micromirror is an important factor affecting the design of MEMS-based actuator systems. In the development of new micromirror systems, accurately assessing the extent of energy loss due to thermal damping and predicting the performance of the system is essential. In this paper, the depth of the thermal penetration layer at different eigenfrequencies and the temperature variation distribution surrounding a vibrating micromirror are analyzed. The thermal penetration depth, which corresponds to the thermal boundary layer in which energy is lost and which is a measure of the thermal damping, is determined. The energy is mainly dissipated in the thermal boundary layer, and the thickness of this layer is an important parameter. Detailed thermoacoustics is used to model the air domain surrounding the micromirror. The thickness of the boundary layer, the temperature variations and the thermal power dissipation are analyzed for a Si-based torsional-mode micromirror. It is found that the thermal penetration depth decreases with eigenfrequency, and hence operating the micromirror at higher frequencies is essential for reducing thermal damping. The temperature variations and thermal power dissipation at different eigenfrequencies are also analyzed. Both frequency-response and eigenfrequency analyses are performed using the COMSOL Multiphysics software.
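
For reference, the thermal penetration depth used in such analyses is commonly taken as delta = sqrt(2*alpha/omega) with alpha = k/(rho*cp); the short sketch below evaluates it for air at room temperature over a range of assumed frequencies, reproducing the reported decrease with eigenfrequency. The property values and frequencies are assumptions, not the paper's micromirror data.

import numpy as np

def thermal_penetration_depth(f, k=0.026, rho=1.2, cp=1005.0):
    """Thermal penetration (boundary-layer) depth delta = sqrt(2*alpha/omega),
    with alpha = k/(rho*cp). Property values approximate air near room temperature."""
    alpha = k / (rho * cp)       # thermal diffusivity of the gas
    omega = 2 * np.pi * f        # angular frequency of the vibration
    return np.sqrt(2 * alpha / omega)

for f in (1e3, 1e4, 1e5, 1e6):   # assumed frequencies in Hz
    print(f"{f:9.0f} Hz -> {thermal_penetration_depth(f) * 1e6:7.2f} um")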

Keywords: Eigen frequency analysis, micromirrors, thermal damping, thermoacoustic interactions.

13654 A UML Statechart Diagram-Based MM-Path Generation Approach for Object-Oriented Integration Testing

Authors: Ruilian Zhao, Ling Lin

Abstract:

MM-Path, an acronym for Method/Message Path, describes the dynamic interactions between methods in object-oriented systems. This paper discusses the classification of MM-Paths based on the characteristics of object-oriented software. We categorize them according to the generation reasons, the effect scope and the composition of the MM-Path. A formalized representation of MM-Path is also proposed, which takes into account the influence of state on the response method sequences of messages. Moreover, an automatic MM-Path generation approach based on the UML Statechart diagram is presented, which resolves the difficulties in identifying and generating MM-Paths. As a result, it provides a solid foundation for further research on test case generation based on MM-Paths.

Keywords: MM-Path, Message Sequence, Object-Oriented Integration Testing, Response Method Sequence, UML Statechart Diagram.

13653 An Enhanced Floor Estimation Algorithm for Indoor Wireless Localization Systems Using Confidence Interval Approach

Authors: Kriangkrai Maneerat, Chutima Prommak

Abstract:

Indoor wireless localization systems have played an important role in enhancing context-aware services. Determining the position of mobile objects in complex indoor environments, such as multi-floor buildings, is a very challenging problem. This paper presents an effective floor estimation algorithm which can accurately determine the floor where a mobile object is located. The proposed algorithm is based on the confidence interval of the summation of the online Received Signal Strength (RSS) obtained from IEEE 802.15.4 Wireless Sensor Networks (WSNs). We compare the performance of the proposed algorithm with those of other floor estimation algorithms in the literature by conducting a real implementation of a WSN in our facility. The experimental results and analysis show that the proposed floor estimation algorithm outperformed the other algorithms and provided the highest floor accuracy, up to 100%, with a 95% confidence interval.
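
A minimal sketch of the confidence-interval idea follows: for each floor, a Student-t confidence interval is built around the calibrated sum of RSS values, and the floor whose interval contains the online summed RSS is selected, falling back to the nearest mean. The t-based interval, the 95% level and the toy calibration values are illustrative assumptions, not the paper's data.

import numpy as np
from scipy import stats

def estimate_floor(online_rss_sum, calib):
    """Pick the floor whose 95% confidence interval of the summed RSS contains
    the online summed RSS; fall back to the floor with the nearest mean."""
    candidates, nearest = [], []
    for floor, samples in calib.items():
        m, s = np.mean(samples), stats.sem(samples)
        lo, hi = stats.t.interval(0.95, len(samples) - 1, loc=m, scale=s)
        nearest.append((abs(online_rss_sum - m), floor))
        if lo <= online_rss_sum <= hi:
            candidates.append((abs(online_rss_sum - m), floor))
    return min(candidates or nearest)[1]

# usage: summed RSS (dBm) of all anchor nodes heard during calibration on each floor
calib = {1: [-310, -305, -312, -308], 2: [-265, -270, -268, -262], 3: [-220, -226, -224, -219]}
print(estimate_floor(-266, calib))   # -> 2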

Keywords: Floor estimation algorithm, floor determination, multi-floor building, indoor wireless systems.

13652 A Literature Review on the Effect of Industrial Clusters and the Absorptive Capacity on Innovation

Authors: Enrique Claver Cortés, Bartolomé Marco Lajara, Eduardo Sánchez García, Pedro Seva Larrosa, Encarnación Manresa Marhuenda, Lorena Ruiz Fernández, Esther Poveda Pareja

Abstract:

In recent decades, the analysis of the effects of clustering as an essential factor for the development of innovations and the competitiveness of enterprises has raised great interest in different areas. Nowadays, companies have access to almost all tangible and intangible resources located and/or developed in any country in the world. However, despite the obvious advantages that this situation entails for companies, their geographical location has shown itself, increasingly clearly, to be a fundamental factor that positively influences their innovative performance and competitiveness. Industrial clusters could represent a unique level of analysis, positioned between the individual company and the industry, which makes them an ideal unit of analysis to determine the effects derived from company membership of a cluster. Also, the absorptive capacity (hereinafter 'AC') can mediate the process of innovation development by companies located in a cluster. The transformation and exploitation of knowledge could have a mediating effect between knowledge acquisition and innovative performance. The main objective of this work is to determine the key factors that affect the degree of generation and use of knowledge from the environment by companies and, consequently, their innovative performance and competitiveness. The elements analyzed are the companies' membership of a cluster and the AC. To this end, the 30 most relevant papers published on this subject in the Web of Science database have been reviewed. Our findings show that, within a cluster, the knowledge coming from the companies' environment can significantly influence their innovative performance and competitiveness, although in this relationship the degree of access to, and exploitation of, this knowledge by the companies plays a fundamental role, which depends on a series of elements both internal and external to the company.

Keywords: Absorptive capacity, clusters, innovation, knowledge.

13651 Mobile Phone Banking Applies and Customer Intention - A Case Study in Libya

Authors: Iman E. Bouthahab, Badea B. Geador

Abstract:

The aim of this paper is to explore the prospects of a new approach to mobile phone banking in Libya. This study evaluates customer knowledge of commercial mobile banking in Libya. To examine the relationship between age, occupation and the intention to use mobile banking for commercial purposes, a survey was conducted to gather information from one hundred Libyan bank clients. The results indicate that Libyan customers have accepted the new technology and are ready to use it. No significant joint relationship between age and occupation was found in the intention to use mobile banking in Libya. On the other hand, customers' knowledge about mobile banking has a stronger relationship with intention. This study has implications for demographic research and the consumer behaviour discipline. It also has profitable implications for banks and managers in Libya, as it will assist in a better understanding of Libyan consumers and their activities when they develop their market strategies and new services.

Keywords: Banks in Libya, Customer Knowledge, Intention, Mobile banking.

13650 Fundamental Theory of the Evolution Force: Gene Engineering utilizing Synthetic Evolution Artificial Intelligence

Authors: L. K. Davis

Abstract:

The effects of the evolution force are observable in nature at all structural levels, ranging from small molecular systems to enormous biospheric systems. However, the evolution force and the work associated with the formation of biological structures have yet to be described mathematically or theoretically. In addressing this conundrum, we consider evolution from a unique perspective and in doing so introduce the “Fundamental Theory of the Evolution Force: FTEF”. We utilized synthetic evolution artificial intelligence (SYN-AI) to identify genomic building blocks and to engineer 14-3-3 ζ docking proteins by transforming gene sequences into time-based DNA codes derived from protein hierarchical structural levels. The aforementioned served as templates for random DNA hybridizations and genetic assembly. The application of hierarchical DNA codes allowed us to fast-forward evolution while dampening the effect of point mutations. Natural selection was performed at each hierarchical structural level and mutations were screened using Blosum 80 mutation frequency-based algorithms. Notably, SYN-AI engineered a set of three architecturally conserved docking proteins that retained the motion and vibrational dynamics of native Bos taurus 14-3-3 ζ.

Keywords: 14-3-3 docking genes, synthetic protein design, time based DNA codes, writing DNA code from scratch.

13649 Feature Selection with Kohonen Self Organizing Classification Algorithm

Authors: Francesco Maiorana

Abstract:

In this paper, a one-dimensional Self-Organizing Map (SOM) algorithm to perform feature selection is presented. The algorithm is based on a first classification of the input dataset in a similarity space. From this classification, a set of positive and negative features is computed for each class. This set of features is selected as the result of the procedure. The procedure is evaluated on an in-house dataset from a Knowledge Discovery from Text (KDT) application and on a set of publicly available datasets used in international feature selection competitions. These datasets come from KDT applications, drug discovery, as well as other applications. The knowledge of the correct classification available for the training and validation datasets is used to optimize the parameters for positive and negative feature extraction. The process becomes feasible for large and sparse datasets, such as the ones obtained in KDT applications, by using both compression techniques to store the similarity matrix and speed-up techniques for the Kohonen algorithm that take advantage of the sparsity of the input matrix. These improvements, together with the use of the grid, make it feasible to apply the methodology to massive datasets.
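
A plain NumPy sketch of the general technique is shown below: a one-dimensional Kohonen map is trained on a document-term matrix and, for each map unit, the highest- and lowest-weight terms are flagged as candidate positive and negative features. The unit count, learning-rate and neighbourhood schedules, and the positive/negative extraction rule are assumptions made for illustration, not the paper's settings.

import numpy as np

def train_1d_som(X, n_units=5, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a one-dimensional Kohonen SOM on the rows of X (documents x terms)."""
    rng = np.random.default_rng(seed)
    W = rng.random((n_units, X.shape[1]))
    for e in range(epochs):
        lr = lr0 * (1 - e / epochs)
        sigma = max(sigma0 * (1 - e / epochs), 0.5)
        for x in rng.permutation(X):
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))                      # best matching unit
            h = np.exp(-((np.arange(n_units) - bmu) ** 2) / (2 * sigma ** 2))  # neighbourhood
            W += lr * h[:, None] * (x - W)
    return W

def positive_negative_features(W, top=3):
    """For each map unit, flag the terms with the largest (positive) and
    smallest (negative) weights as candidate selected/rejected features."""
    return [(np.argsort(w)[-top:][::-1], np.argsort(w)[:top]) for w in W]

X = np.random.default_rng(1).random((100, 20))   # toy document-term matrix
W = train_1d_som(X)
for unit, (pos, neg) in enumerate(positive_negative_features(W)):
    print(f"unit {unit}: positive terms {pos}, negative terms {neg}")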

Keywords: Clustering algorithm, Data mining, Feature selection, Grid, Kohonen Self Organizing Map.

13648 Concept Abduction in Description Logics with Cardinality Restrictions

Authors: Viet-Hoang Vu, Nhan Le-Thanh

Abstract:

Recently, the usefulness of Concept Abduction, a novel non-monotonic inference service for Description Logics (DLs), has been argued in the context of ontology-based applications such as semantic matchmaking and resource retrieval. Based on the tableau calculus, a method has been proposed to realize this reasoning task in ALN, a description logic that supports simple cardinality restrictions as well as other basic constructors. However, in many ontology-based systems where the representation of the ontology requires expressive formalisms for capturing domain-specific constraints, this language is not sufficient. In order to increase the applicability of the abductive reasoning method in such contexts, we present in this paper an extension of the tableaux-based algorithm for dealing with concepts represented in ALCQ, the description logic that extends ALN with full concept negation and qualified number restrictions.

Keywords: Abductive reasoning, description logics, semantic matchmaking, non-monotonic inference, tableaux-based method.

13647 Delay-Dependent H∞ Performance Analysis for Markovian Jump Systems with Time-Varying Delays

Authors: Yucai Ding, Hong Zhu, Shouming Zhong, Yuping Zhang

Abstract:

This paper considers H∞ performance for Markovian jump systems with time-varying delays. The systems under consideration involve disturbance signals, Markovian switching and time-varying delays. By using a new Lyapunov-Krasovskii functional and a convex optimization approach, a delay-dependent stability condition in terms of a linear matrix inequality (LMI) is derived, which guarantees asymptotic stability in mean square and a prescribed H∞ performance index for the considered systems. Two numerical examples are given to illustrate the effectiveness and the reduced conservatism of the proposed results. All these results are expected to be of use in the study of stochastic systems with time-varying delays.
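
To illustrate how LMI conditions of this kind are checked numerically, the sketch below solves a much simpler, delay-free Lyapunov LMI feasibility problem with CVXPY. The system matrix is an arbitrary stable example, and the paper's actual delay-dependent condition involves the full Lyapunov-Krasovskii terms, which are not reproduced here.

import cvxpy as cp
import numpy as np

# Simplified illustration: feasibility of A^T P + P A < 0, P > 0 for one mode.
A = np.array([[-2.0, 1.0],
              [0.0, -1.5]])          # arbitrary stable example matrix
P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),
               A.T @ P + P @ A << -eps * np.eye(2)]
prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve()
print(prob.status)
print(P.value)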

Keywords: H∞ performance, Markovian switching, delay-dependent stability, linear matrix inequality (LMI).

13646 Detection of Action Potentials in the Presence of Noise Using Phase-Space Techniques

Authors: Christopher Paterson, Richard Curry, Alan Purvis, Simon Johnson

Abstract:

Emerging bio-engineering fields such as brain-computer interfaces, neuroprosthetic devices, and the modeling and simulation of neural networks have led to increased research activity in algorithms for the detection, isolation and classification of action potentials (APs) from noisy data trains. Current techniques in the field of 'unsupervised no-prior-knowledge' biosignal processing include energy operators, wavelet detection and adaptive thresholding. These tend to be biased towards larger AP waveforms; APs may be missed due to deviations in spike shape and frequency, and correlated noise spectra can cause false detections. Such algorithms also tend to suffer from large computational expense. A new signal detection technique based upon the ideas of phase-space diagrams and trajectories is proposed, using a delayed copy of the AP signal to highlight discontinuities relative to background noise. This idea has been used to create algorithms that are computationally inexpensive and address the above problems. Distinct APs have been picked out and manually classified from real physiological data recorded from a cockroach. To facilitate testing of the new technique, an Auto-Regressive Moving Average (ARMA) noise model has been constructed based upon the background noise of the recordings. Together with the AP classifications, this model enables the generation of realistic neuronal data sets at arbitrary signal-to-noise ratios (SNR).
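
A compact sketch of the delayed-copy phase-space idea is given below: the trajectory (s(t), s(t-d)) is summarized by its radius, and excursions beyond a robust noise threshold are flagged as candidate action potentials. The delay, merge window, threshold factor and synthetic spikes are assumptions for illustration, not the authors' parameters.

import numpy as np

def phase_space_detect(sig, delay=3, win=16, k=8.0):
    """Detect AP-like events from the phase-space trajectory built from the
    signal and a delayed copy of itself: large excursions of the radius
    r(t) = sqrt(s(t)^2 + s(t-delay)^2) mark discontinuities relative to noise."""
    s0, s1 = sig[delay:], sig[:-delay]            # signal and its delayed copy
    r = np.hypot(s0, s1)                          # phase-space radius
    mad = np.median(np.abs(r - np.median(r)))
    thr = np.median(r) + k * mad                  # robust threshold above the noise cloud
    above = np.flatnonzero(r > thr)
    groups = np.split(above, np.where(np.diff(above) > win)[0] + 1)
    return [int(g[0]) + delay for g in groups if g.size]   # one index per event

# usage: noisy trace with two synthetic spike-shaped events
rng = np.random.default_rng(3)
x = 0.2 * rng.standard_normal(2000)
for pos in (500, 1400):
    x[pos:pos + 20] += np.hanning(20) * 3.0
print(phase_space_detect(x))   # indices near 500 and 1400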

Keywords: Action potential detection, Low SNR, Phase-space diagrams/trajectories, Unsupervised/no-prior knowledge.

13645 The Guaranteed Detection of the Seismoacoustic Emission Source in the C-OTDR Systems

Authors: Andrey V. Timofeev

Abstract:

A method is proposed for the stable detection of seismoacoustic sources in C-OTDR systems that guarantees given upper bounds on the probabilities of type I and type II errors. The properties of the proposed method are rigorously proved. The results of practical applications of the proposed method in a real C-OTDR system are presented.

Keywords: Guaranteed detection, C-OTDR systems, change point, interval estimation.

13644 A Usability Testing Approach to Evaluate User-Interfaces in Business Administration

Authors: Salaheddin Odeh, Ibrahim O. Adwan

Abstract:

This interdisciplinary study investigates the evaluation of user interfaces in business administration. The study is carried out on two computerized business administration systems with two distinctive user interfaces, so that differences between the two systems can be determined. Both systems, a commercial one and a prototype developed for the purpose of this study, deal with the ordering of supplies, tendering procedures, issuing purchase orders, controlling the movement of stock against its actual balances on the shelves, and editing the corresponding tabulations. In the second, suggested system, modern computer graphics and multimedia issues were taken into consideration to cover the drawbacks of the first system. To highlight differences between the two investigated systems regarding some chosen standard quality criteria, the study employs various statistical techniques and methods to evaluate the users' interaction with both systems. The study variables are divided into two groups: independent variables representing the interfaces of the two systems, and dependent variables embracing efficiency, effectiveness, satisfaction, error rate, etc.

Keywords: Evaluation and usability testing, software prototyping, statistical methods, user-interface design.
