Search results for: pessimistic approach.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4977

4617 Hybrid Approach for Memory Analysis in Windows System

Authors: Khairul Akram Zainol Ariffin, Ahmad Kamil Mahmood, Jafreezal Jaafar, Solahuddin Shamsuddin

Abstract:

Random Access Memory (RAM) is an important device in a computer system: it provides a snapshot of how the computer has been used. As its importance has grown, computer memory has become a widely discussed topic in digital forensics. A number of tools have been developed to retrieve information from memory; however, most of them are limited in their ability to recover the important information. This paper therefore discusses the limitations and setbacks of two main techniques, process signature search and process enumeration. A new hybrid approach is then presented to minimize the setbacks of each individual technique. This approach combines both techniques in order to retrieve information from the process blocks and other objects in the computer memory. The basic theory of address translation for x86 platforms is also demonstrated in this paper.
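
As a rough illustration of the address-translation theory mentioned above, the sketch below decomposes a 32-bit x86 virtual address under classic non-PAE paging into its page-directory index, page-table index and byte offset; the field widths are the standard ones for that mode, while the example address is arbitrary.

```python
# Minimal sketch: splitting a 32-bit x86 virtual address (non-PAE paging)
# into the indices walked during page-table translation in memory analysis.

def split_virtual_address(va: int):
    """Return (page directory index, page table index, offset) for a 32-bit VA."""
    pdi = (va >> 22) & 0x3FF   # bits 31..22 -> page directory entry (1024 entries)
    pti = (va >> 12) & 0x3FF   # bits 21..12 -> page table entry (1024 entries)
    offset = va & 0xFFF        # bits 11..0  -> offset inside the 4 KiB page
    return pdi, pti, offset

if __name__ == "__main__":
    va = 0x7FFDF004            # arbitrary example address
    pdi, pti, off = split_virtual_address(va)
    print(f"PDI={pdi}, PTI={pti}, offset={off:#x}")
```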

Keywords: Algorithms, Digital Forensics, Memory Analysis, Signature Search.

4616 An Approach to Construct Criteria for Evaluating Alternatives in Decision-Making

Authors: Niina M. Nissinen

Abstract:

This paper introduces an approach to constructing a set of criteria for evaluating alternative options. Content analysis was used to collect criterion elements. The elements were then classified and organized, yielding a hierarchic structure. The reliability of the constructed criteria was evaluated in an experiment. Finally, the criteria were used to evaluate alternative options in decision-making.

Keywords: Conceptual analysis, Content Analysis, Criteria, Decision-Making, Evaluation of Candidates

4615 An Integrated Operational Research and System Dynamics Approach for Planning Decisions in Container Terminals

Authors: A. K. Abdel-Fattah, A. B. El-Tawil, N. A. Harraz

Abstract:

This paper focuses on the operational and strategic planning decisions related to the quayside of container terminals. We introduce an integrated operational research (OR) and system dynamics (SD) approach to solve the Berth Allocation Problem (BAP) and the Quay Crane Assignment Problem (QCAP). A BAP-QCAP optimization modeling approach is discussed that considers practical aspects of the BAP-QCAP integration not studied before. A conceptual SD model is developed to determine the long-term effect of optimization on system behavior factors such as resource utilization, attractiveness to the port, the number of incoming vessels, and port profits. The framework can be used to improve the operational efficiency of container terminals and to provide a strategic view after applying optimization.

Keywords: Operational research, system dynamics, container terminal, quayside operational problems, strategic planning decisions.

4614 Analytical Formulae for the Approach Velocity Head Coefficient

Authors: Abdulrahman Abdulrahman

Abstract:

Critical depth meters, such as a broad-crested weir, a Venturi flume, and a combined control flume, are standard devices for measuring flow in open channels. The discharge relation for these devices cannot be solved directly; it requires an iterative process to account for the approach velocity head. In this paper, an analytical solution is developed to calculate the discharge in a combined critical-depth meter, namely a hump combined with a lateral contraction in a rectangular channel with subcritical approach flow, including energy losses. Analytical formulae are also derived for the approach velocity head coefficient for different types of critical depth meters. The solution is obtained by solving a standard cubic equation, accounting for energy loss, on the basis of a trigonometric identity. The advantage of this technique is that it avoids the iterative process adopted when measuring flow with these devices. Numerical examples are given to demonstrate the proposed solution.
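
The abstract's central idea, replacing iteration with a closed-form cubic solution, rests on the standard trigonometric (Viète) method. The sketch below applies that method to a generic depressed cubic with three real roots; it is a generic illustration under that assumption, not the authors' specific discharge formula.

```python
import math

def cubic_roots_trig(p: float, q: float):
    """Real roots of t^3 + p*t + q = 0 (three-real-root case) via Viete's
    trigonometric substitution -- no iteration required."""
    if not (p < 0 and 4 * p**3 + 27 * q**2 < 0):
        raise ValueError("expects the casus irreducibilis (three distinct real roots)")
    r = 2.0 * math.sqrt(-p / 3.0)
    phi = math.acos(3.0 * q / (2.0 * p) * math.sqrt(-3.0 / p))
    return [r * math.cos(phi / 3.0 - 2.0 * math.pi * k / 3.0) for k in range(3)]

# Example: t^3 - 7t + 6 = 0 has roots 2, 1 and -3.
print(sorted(cubic_roots_trig(-7.0, 6.0)))
```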

Keywords: Broad crested weir, combined control meter, control structures, critical flow, discharge measurement, flow control, hydraulic engineering, hydraulic structures, open channel flow.

4613 An Approach to Concerns and Aspects Mining for Web Applications

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering), and the scientific community has focused its attention on Web application design, development, analysis, and testing by studying and proposing methodologies and tools. This paper proposes an approach to automatic multi-dimensional concern mining for Web applications, based on concept analysis, impact analysis, and token-based concern identification. The approach lets the user analyse and traverse the Web software relevant to a particular concern (concept, goal, purpose, etc.) via multi-dimensional separation of concerns, in order to document, understand, and test Web applications. The technique was developed in the context of the WAAT (Web Applications Analysis and Testing) project. A semi-automatic tool to support it is currently under development.

Keywords: Aspect Mining, Concepts Analysis, Concerns Mining, Multi-Dimensional Separation of Concerns, Impact Analysis.

4612 Verification and Validation for Java Classes Using Design by Contract: The Modular External Approach

Authors: Dario Ramirez de Leon, Oscar Chavez Bosquez, Julian J. Francisco Leon

Abstract:

Since the conception of JML, many tools, applications, and implementations have been developed, and users or developers who want to adopt JML can find themselves surrounded by this multitude of tools. Looking for a common infrastructure and an independent language to provide a bridge between these tools and JML, we developed an approach to embed contracts in XML for Java: XJML. This approach offers the ability to separate preconditions, postconditions, and class invariants using JML and XML, so we built a front-end that can process Runtime Assertion Checking, Extended Static Checking, and Full Static Program Verification. Moreover, the capabilities of this front-end can be easily extended thanks to XML. We believe that XJML is an easy starting point for building a graphical user interface, thus delivering a friendly and IDE-independent environment to the developer community that wants to work with JML.

Keywords: Model checking, verification and validation, JML, XML, java, runtime assertion checking, extended static checking, full static program verification.

4611 Student and Group Activity Level Assessment in the ELARS Recommender System

Authors: Martina Holenko Dlab, Natasa Hoic-Bozic

Abstract:

This paper presents an original approach to student and group activity level assessment that relies on certainty factors theory. The activity level represents the quantity and continuity of a student's contributions in individual and collaborative e-learning activities (e-tivities) and is calculated to assist teachers in assessing the quantitative aspects of a student's achievements. Calculated activity levels are also used to raise awareness and provide recommendations during the learning process. The proposed approach was implemented within the educational recommender system ELARS and validated using data obtained from e-tivities realized during a blended learning course. The results showed that the proposed approach can be used to estimate activity level in the context of e-tivities realized using Web 2.0 tools, as well as to facilitate the assessment of the quantitative aspects of students' participation in e-tivities.
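
The paper does not spell out its exact formula, but evidence expressed with certainty factors is typically aggregated with the classic certainty-factor combination rule. The sketch below shows that standard rule applied to hypothetical per-e-tivity evidence values; the numbers are made up.

```python
from functools import reduce

def combine_cf(cf1: float, cf2: float) -> float:
    """Classic certainty-factor combination rule (MYCIN-style), inputs in [-1, 1]."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Hypothetical evidence of a student's activity gathered from three e-tivities.
evidence = [0.4, 0.6, 0.2]
activity_level = reduce(combine_cf, evidence)
print(round(activity_level, 3))   # 0.4 -> 0.76 -> 0.808
```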

Keywords: Assessment, ELARS, e-learning, recommender systems, student model.

4610 A Quantitative Approach to Strategic Design of Component-Based Business Process Models

Authors: Eakong Atiptamvaree, Twittie Senivongse

Abstract:

A new paradigm for software design and development models software by its business process, translates the model into a process execution language, and has it run by a supporting execution engine. This process-oriented paradigm promotes modeling of software by less technical users or business analysts, as well as rapid development. Since business process models may be shared by different organizations, and sometimes even by different business domains, it is interesting to apply a technique used in traditional software component technology to the design of reusable business processes. This paper discusses an approach that applies a technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, with the aim that the process components can be reused in different process-based software models. The approach is quantitative because the quality of a process component design is measured from technical features of the process components. The approach is also strategic because the measured quality is evaluated against business-oriented component management goals. A software tool has been developed to measure how good a process component design is according to the required managerial goals and compared to other designs. We also discuss how we benefit from reusable process components.

Keywords: Business process model, process component, component management goals, measurement

4609 Enhanced Genetic Algorithm Approach for Security Constrained Optimal Power Flow Including FACTS Devices

Authors: R. Narmatha Banu, D. Devaraj

Abstract:

This paper presents a genetic algorithm based approach for solving the security constrained optimal power flow (SCOPF) problem including FACTS devices. The optimal locations of the FACTS devices are identified using an index called the overload index, and the optimal settings are obtained using an enhanced genetic algorithm. The optimal allocation obtained by the proposed method optimizes the investment while taking into account its effect on security in terms of the alleviation of line overloads. The proposed approach has been tested on the IEEE 30-bus system to show the effectiveness of the proposed algorithm for solving the SCOPF problem.
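
For illustration only (the paper does not give its exact definition), a common way to score line overloads for FACTS placement is a severity index that sums even powers of line loading ratios, so heavily loaded lines dominate. A minimal sketch with made-up flows and limits:

```python
# Illustrative overload/severity index: not the authors' exact formulation.
def severity_index(line_flows_mva, line_limits_mva, n: int = 2) -> float:
    return sum((flow / limit) ** (2 * n)
               for flow, limit in zip(line_flows_mva, line_limits_mva))

flows  = [85.0, 120.0, 40.0]    # hypothetical post-contingency MVA flows
limits = [100.0, 100.0, 80.0]   # corresponding thermal limits
print(round(severity_index(flows, limits), 3))   # the overloaded second line dominates
```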

Keywords: Optimal Power Flow, Genetic Algorithm, Flexible AC transmission system (FACTS) devices, Severity Index (SI), Security Enhancement, Thyristor controlled series capacitor (TCSC).

4608 Worm Gearing Design Improvement by Considering Varying Mesh Stiffness

Authors: A. H. Elkholy, A. H. Falah

Abstract:

A new approach has been developed to estimate the load share and distribution of worm gear drives and to calculate the instantaneous tooth meshing stiffness. In this approach, the worm gear drive is modelled as a series of spur gear slices, and each slice is analyzed separately using the well-established formulae for spur gear loading and stresses. By combining the results obtained for all slices, the loading and stressing of the entire involute worm gear set is obtained. The geometric modelling method presented allows the tooth elastic deformation and tooth root stresses of worm gear drives under different load conditions to be investigated. Based on the slicing method introduced in this study, the instantaneous meshing stiffness and load share are obtained. In comparison with existing methods, this approach offers both good analysis accuracy and lower computing time.

Keywords: Gear, load/stress distribution, worm, wheel, tooth stiffness, contact line.

4607 Optical Multicast over OBS Networks: An Approach Based On Code-Words and Tunable Decoders

Authors: Maha Sliti, Walid Abdallah, Noureddine Boudriga

Abstract:

In this work, we present an optical multicasting approach based on optical code-words. Our approach associates, in the edge node, an optical code-word with a group multicast address. In the core node, a set of tunable decoders is used to forward traffic data to multiple destinations based on the received code-word. The use of code-words, which correspond to the combination of an input port and a set of output ports, allows the implementation of an optical switching matrix. When a burst is received, it is delayed in an optical memory and the received optical code-word is distributed to the set of tunable optical decoders. When it matches a configured code-word, the delayed burst is switched to the corresponding set of output ports.
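
A toy model of the code-word idea described above: each configured code-word stands for an input port and the set of output ports a burst should be copied to, and a matching decoder triggers the multicast switching. The table and port numbers below are hypothetical.

```python
codeword_table = {                       # hypothetical configuration in a core node
    "101100": {"input": 1, "outputs": [2, 3, 5]},
    "110010": {"input": 2, "outputs": [4]},
}

def switch_burst(received_codeword: str, burst: bytes):
    entry = codeword_table.get(received_codeword)
    if entry is None:                    # no decoder matched: burst stays delayed/dropped
        return []
    return [(port, burst) for port in entry["outputs"]]

print(switch_burst("101100", b"payload"))   # burst copied to output ports 2, 3 and 5
```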

Keywords: Optical multicast, optical burst switching networks, optical code-words, tunable decoder, virtual optical memory.

4605 Inverse Problem Methodology for the Measurement of the Electromagnetic Parameters Using MLP Neural Network

Authors: T. Hacib, M. R. Mekideche, N. Ferkha

Abstract:

This paper presents an approach based on a supervised feed-forward neural network, namely a multilayer perceptron (MLP), and the finite element method (FEM) to solve the inverse problem of parameter identification. The approach is used to identify unknown parameters of ferromagnetic materials. The methodology consists of simulating a large number of parameter combinations for a material under test using the FEM. Variations in both the relative magnetic permeability and the electrical conductivity of the material under test are considered. The obtained results are then used to generate a set of vectors for training the MLP neural network. Finally, the trained neural network is used to evaluate a group of new materials, simulated by the FEM but not belonging to the original dataset. Noisy data added to the probe measurements are used to enhance the robustness of the method. The results demonstrate the efficiency of the proposed approach and encourage future work on this subject.
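
A minimal sketch of the identification pipeline: a forward model generates training pairs (probe response, material parameters) and an MLP learns the inverse mapping. A toy analytic function stands in for the FEM solver here, and the parameter ranges and network size are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def forward_model(mu_r, sigma):
    """Toy stand-in for the FEM probe response (two synthetic features)."""
    return np.column_stack([np.log(mu_r) + 0.01 * sigma,
                            np.sqrt(sigma) / mu_r])

mu_r  = rng.uniform(100, 2000, 500)        # relative permeability samples
sigma = rng.uniform(1.0, 10.0, 500)        # electrical conductivity samples (MS/m)
X = forward_model(mu_r, sigma)
X += 0.01 * rng.standard_normal(X.shape)   # noise added for robustness
y = np.column_stack([mu_r, sigma])

inverse_net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                           random_state=0).fit(X, y)
print(inverse_net.predict(forward_model(np.array([800.0]), np.array([5.0]))))
```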

Keywords: Inverse problem, MLP neural network, parameters identification, FEM.

4604 Hybrid Machine Learning Approach for Text Categorization

Authors: Nerijus Remeikis, Ignas Skucas, Vida Melninkaite

Abstract:

Text categorization - the assignment of natural language documents to one or more predefined categories based on their semantic content - is an important component in many information organization and management tasks. The performance of neural network learning is known to be sensitive to the initial weights and architecture. This paper discusses the use of multilayer neural network initialization with a decision tree classifier to improve text categorization accuracy. An adaptation of the algorithm is proposed in which a decision tree, from the root node to a final leaf, is used to initialize a multilayer neural network. The experimental evaluation demonstrates that this approach provides better classification accuracy on the Reuters-21578 corpus, one of the standard benchmarks for text categorization tasks. We present results comparing the accuracy of this approach with a multilayer neural network initialized by the traditional random method and with decision tree classifiers.
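
The paper does not detail its exact weight mapping, but one simple way to seed a network from a decision tree is to let every internal split node define one first-layer unit (weight on the split feature, bias equal to the negated threshold). The sketch below shows only that initialization step on synthetic data; training of the seeded network is omitted.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

X = np.random.default_rng(0).random((200, 30))   # stand-in for TF-IDF features
y = (X[:, 3] + X[:, 17] > 1.0).astype(int)       # synthetic labels
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

internal = tree.tree_.children_left != -1        # internal (split) nodes only
features = tree.tree_.feature[internal]
thresholds = tree.tree_.threshold[internal]

n_hidden, n_features = len(features), X.shape[1]
W1 = np.zeros((n_hidden, n_features))
W1[np.arange(n_hidden), features] = 1.0          # one hidden unit per tree split
b1 = -thresholds                                 # unit activates past the split threshold
print(W1.shape, b1.shape)
```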

Keywords: Text categorization, decision trees, neural networks, machine learning.

4603 Optimization of the Fuzzy C-Means Clustering Algorithm Using GA

Authors: Mohanad Alata, Mohammad Molhim, Abdullah Ramini

Abstract:

The fuzzy c-means clustering algorithm (FCM) is a method frequently used in pattern recognition. It has the advantage of giving good modeling results in many cases, although it is not capable of specifying the number of clusters by itself. In the FCM algorithm, most researchers fix the weighting exponent (m) to a conventional value of 2, which might not be appropriate for all applications. Consequently, the main objective of this paper is to use the subtractive clustering algorithm to provide the optimal number of clusters needed by the FCM algorithm, by optimizing the parameters of the subtractive clustering algorithm with an iterative search approach, and then to find an optimal weighting exponent (m) for the FCM algorithm. In order to obtain an optimal number of clusters, the iterative search approach is used to find the optimal single-output Sugeno-type Fuzzy Inference System (FIS) model by optimizing the parameters of the subtractive clustering algorithm that give the minimum least-square error between the actual data and the Sugeno fuzzy model. Once the number of clusters is optimized, two approaches are proposed to optimize the weighting exponent (m) in the FCM algorithm, namely the iterative search approach and genetic algorithms. The proposed approach is tested on data generated from the original function, and optimal fuzzy models are obtained with minimum error between the real data and the obtained fuzzy models.
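
A minimal fuzzy c-means sketch with an explicit weighting exponent m, the parameter the paper tunes instead of fixing it at the conventional value 2; the data, cluster count and m value below are arbitrary, and the subtractive-clustering and GA stages are not shown.

```python
import numpy as np

def fcm(X, c, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                  # fuzzy memberships
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))             # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 4])
centers, _ = fcm(X, c=2, m=1.8)
print(np.round(centers, 2))
```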

Keywords: Fuzzy clustering, Fuzzy C-Means, Genetic Algorithm, Sugeno fuzzy systems.

4602 Surveillance of Super-Extended Objects: Bimodal Approach

Authors: Andrey V. Timofeev, Dmitry Egorov

Abstract:

This paper describes an effective solution to the task of remote monitoring of super-extended objects (oil and gas pipelines, railways, national frontiers). The suggested solution is based on the principle of simultaneous monitoring of the seismoacoustic and optical/infrared physical fields. This principle is not new, but in contrast to known solutions, the suggested approach allows super-extended objects to be monitored at very limited operational cost. So-called C-OTDR (Coherent Optical Time Domain Reflectometer) systems are used to monitor the seismoacoustic field, and far-CCTV systems are used to monitor the optical/infrared field. Simultaneous processing of the data provided by both systems allows target activities that appear in the monitored object's vicinity to be detected and classified effectively. The results of practical usage have shown the high effectiveness of the suggested approach.

Keywords: Bimodal processing, C-OTDR monitoring system, LPboost, SVM.

4601 Cognitive Landscape of Values – Understanding the Information Contents of Mental Representations

Authors: J. Maksimainen

Abstract:

The values of managers and employees in organizations are phenomena that have captured the interest of researchers at large. Despite this attention, there continues to be a lack of agreement on what values are, how they influence individuals, or how they are constituted in individuals' minds. In this article, a content-based approach is presented as an alternative reference frame for exploring values. In the content-based approach, human thinking in different contexts is set at the focal point. Differences in valuations can be explained through the information contents of mental representations. In addition to the information contents, attention is devoted to the cognitive processes through which mental representations of values are constructed. Such informational contents play a decisive role in understanding human behavior. By applying content-based analysis to an examination of values as mental representations, it is possible to reach deeper into the motivational foundation of behaviors, such as decision making in organizational procedures, through understanding the structure and meanings of the specific values at play.

Keywords: Content-based Approach, Mental Content, Mental Representations, Organizational values, Values

4600 A New Approach for Image Segmentation using Pillar-Kmeans Algorithm

Authors: Ali Ridho Barakbah, Yasushi Kiyoki

Abstract:

This paper presents a new approach to image segmentation by applying the Pillar K-means algorithm. The segmentation process includes a new mechanism for clustering the elements of high-resolution images in order to improve precision and reduce computation time. The system applies K-means clustering to the image segmentation after optimization by the Pillar algorithm. The Pillar algorithm considers the placement of pillars, which should be located as far as possible from each other to withstand the pressure distribution of a roof, as analogous to the placement of centroids within the data distribution. This algorithm is able to optimize K-means clustering for image segmentation in terms of precision and computation time. It designates the initial centroid positions by calculating the accumulated distance metric between each data point and all previous centroids, and then selects the data points with the maximum distance as new initial centroids. In this way, all initial centroids are distributed according to the maximum accumulated distance metric. The proposed approach is evaluated for image segmentation by comparing it with the K-means and Gaussian Mixture Model algorithms and involving the RGB, HSV, HSL and CIELAB color spaces. The experimental results demonstrate the effectiveness of our approach in improving segmentation quality in terms of precision and computation time.
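
A toy version of the accumulated-distance idea described above: each new initial centroid is the data point with the largest accumulated distance to the centroids chosen so far. The published Pillar algorithm adds refinements (e.g. outlier handling) that this sketch omits, and the data are random stand-ins for image pixels.

```python
import numpy as np

def pillar_like_init(X, k):
    chosen = [int(np.argmax(np.linalg.norm(X - X.mean(axis=0), axis=1)))]
    acc = np.zeros(len(X))
    while len(chosen) < k:
        acc += np.linalg.norm(X - X[chosen[-1]], axis=1)   # accumulate distances
        masked = acc.copy()
        masked[chosen] = -np.inf                           # never re-pick a centroid
        chosen.append(int(np.argmax(masked)))
    return X[chosen]

rng = np.random.default_rng(1)
pixels = rng.random((200, 3))            # e.g. RGB pixels of an image, flattened
print(pillar_like_init(pixels, k=4))     # initial centroids handed to K-means
```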

Keywords: Image segmentation, K-means clustering, Pillar algorithm, color spaces.

4599 A Mathematical Representation for Mechanical Model Assessment: Numerical Model Qualification Method

Authors: Keny Ordaz-Hernandez, Xavier Fischer, Fouad Bennis

Abstract:

This article illustrates a model selection management approach for virtual prototypes in interactive simulations. In those numerical simulations, the virtual prototype and its environment are modelled as a multiagent system, where every entity (prototype, human, etc.) is modelled as an agent. In particular, virtual prototyping agents that provide mathematical models of mechanical behaviour in the form of computational methods are considered. This work argues that the selection of an appropriate model in a changing environment, supported by the models' characteristics, can be managed by determining a priori specific exploitation and performance measures of the virtual prototype models. As different models exist to represent a single phenomenon, it is not always possible to select the best one under all possible circumstances of the environment. Instead, the most appropriate one should be selected according to the use case. The proposed approach consists in identifying relevant metrics or indicators for each group of models (e.g. entity models, global model), formulating their qualification, analysing the performance, and applying the qualification criteria. A model can then be selected based on the performance prediction obtained from its qualification. The authors hope that this approach will not only inform engineers and researchers about another way of selecting virtual prototype models, but also assist virtual prototype engineers in systematic or automatic model selection.

Keywords: Virtual prototype models, domain, qualification criterion, model qualification, model assessment, environmental modelling.

4598 Optimal Path Planner for Autonomous Vehicles

Authors: M. Imran Akram, Ahmed Pasha, Nabeel Iqbal

Abstract:

In this paper, a real-time trajectory generation algorithm for computing 2-D optimal paths for autonomous aerial vehicles is discussed. A dynamic programming approach is adopted to compute the k-best paths by minimizing a cost function. Collision detection is implemented to detect intersection of the paths with obstacles. Our contribution is a novel approach to the problem of trajectory generation that is computationally efficient and offers considerable gain over existing techniques.
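
To make the dynamic-programming idea concrete, the toy planner below finds the minimum-cost path across a cost grid moving only right or down, with obstacles modelled as infinite-cost cells. The paper's planner (k-best paths, full collision detection) is far richer; this only illustrates the DP recurrence, and the grid is made up.

```python
import math

def min_cost_path(grid):
    rows, cols = len(grid), len(grid[0])
    dp = [[math.inf] * cols for _ in range(rows)]
    dp[0][0] = grid[0][0]
    for r in range(rows):
        for c in range(cols):
            if r > 0:
                dp[r][c] = min(dp[r][c], dp[r - 1][c] + grid[r][c])
            if c > 0:
                dp[r][c] = min(dp[r][c], dp[r][c - 1] + grid[r][c])
    return dp[-1][-1]

OB = math.inf                      # obstacle cell
grid = [[1, 1, 1, 1],
        [1, OB, OB, 1],
        [1, 1, 1, 1]]
print(min_cost_path(grid))         # 6: down the left edge, then across the bottom
```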

Keywords: dynamic programming, graph search, path planning.

4597 Laser Registration and Supervisory Control of neuroArm Robotic Surgical System

Authors: Hamidreza Hoshyarmanesh, Hosein Madieh, Sanju Lama, Yaser Maddahi, Garnette R. Sutherland, Kourosh Zareinia

Abstract:

This paper illustrates the concept of an algorithm to register specified markers on the neuroArm surgical manipulators, an image-guided, MR-compatible, tele-operated robot for microsurgery and stereotaxy. Two range-finding approaches, namely time-of-flight and phase-shift, are evaluated for registration and supervisory control. The time-of-flight approach is implemented in a semi-field experiment to determine the precise position of a tiny retro-reflective moving object. The moving object simulates a surgical tool tip; the tool is a target that would be connected to the neuroArm end-effector during surgery inside the magnet bore of the MR imaging system. In order to apply the time-of-flight approach, a 905-nm pulsed laser diode and an avalanche photodiode are utilized as the transmitter and receiver, respectively. For the experiment, a high-frequency time-to-digital converter was designed using a field-programmable gate array. In the phase-shift approach, a continuous green laser beam with a wavelength of 530 nm was used as the transmitter. Results showed that a positioning error of 0.1 mm occurred when the scanner-target distance was set in the range of 2.5 to 3 meters. The effectiveness of this non-contact approach showed that the method could be employed as an alternative to a conventional mechanical registration arm. Furthermore, the approach is not limited by physical contact or by the extension of joint angles.
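
The back-of-the-envelope range equations behind the two approaches are shown below: time-of-flight ranges from the round-trip delay, while phase-shift ranges from the phase difference of a modulated continuous beam (unambiguous only within half a modulation wavelength). The numeric values are illustrative, not taken from the experiment.

```python
import math

C = 299_792_458.0                      # speed of light, m/s

def tof_distance(round_trip_time_s: float) -> float:
    return C * round_trip_time_s / 2.0

def phase_shift_distance(phase_rad: float, mod_freq_hz: float) -> float:
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

print(tof_distance(20e-9))             # ~3.0 m for a 20 ns round trip
print(phase_shift_distance(1.0, 50e6)) # ~0.48 m at an assumed 50 MHz modulation
```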

Keywords: 3D laser scanner, intraoperative MR imaging, neuroArm, real time registration, robot-assisted surgery, supervisory control.

4596 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach

Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh

Abstract:

This study presents a hybrid pre-processing approach along with a conceptual model to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), the Discrete Wavelet Transform (DWT) and Mutual Information (MI) were employed as a hybrid pre-processing approach coupled with a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely a multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein is capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI before they were employed in the multi-station model. In the proposed feature selection method, some useless sub-series were omitted to achieve better performance. The results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.

Keywords: River stage-discharge process, LSSVM, discrete wavelet transform (DWT), ensemble empirical mode decomposition (EEMD), multi-station modeling.

4595 Value Analysis Dashboard in Supply Chain Management: Real Case Study from Iran

Authors: Seyedehfatemeh Golrizgashti, Seyedali Dalil

Abstract:

The goal of this paper is to propose a supply chain value dashboard for home appliance manufacturing firms to create more value for all stakeholders via the balanced scorecard approach. The balanced scorecard is an effective approach that managers have used to evaluate supply chain performance in many fields, but insufficient attention has been paid to all supply chain stakeholders, to improving value creation, and to quantitatively defining the correlation between value indicators and performance measures. In this research, the key stakeholders in the home appliance supply chain, the value indicators for creating more value for stakeholders, and the most important metrics for evaluating supply chain value performance based on the balanced scorecard approach were selected via a literature review. The most important indicators were then identified based on experts' judgment, acquired through a survey focused on creating more value for stakeholders. Structural equation modelling was used to disclose the relations between the value indicators and the balanced scorecard metrics. The important result of this research is the identification of an effective value dashboard for creating more value for all stakeholders in the supply chain via the balanced scorecard approach, based on an empirical study covering ten home appliance manufacturing firms in Iran. Home appliance manufacturing firms can increase their stakeholders' satisfaction by using this value dashboard.

Keywords: Supply chain management, balanced scorecard, value, Structural modeling, Stakeholders.

4594 A Study of Lurking Behavior: The Desire Perspective

Authors: Hsiu-Hua Cheng, Chi-Wei Chen

Abstract:

Lurking behavior is common in information-seeking oriented communities. Converting users who lurk into contributors can help virtual communities obtain competitive advantages. Based on the ecological cognition framework, this study proposes a model to examine the antecedents of lurking behavior in information-seeking oriented virtual communities. This study argues that desire for emotional support, desire for information support, desire for performance-approach, desire for performance-avoidance, desire for mastery-approach, desire for mastery-avoidance, desire for ability trust, desire for benevolence trust, and desire for integrity trust affect lurking behavior. This study offers an approach to understanding the determinants of lurking behavior in online contexts.

Keywords: Lurking behavior, the ecological cognition framework, Information-seeking oriented virtual communities, Desire.

4593 New Approach for Constructing a Secure Biometric Database

Authors: A. Kebbeb, M. Mostefai, F. Benmerzoug, Y. Chahir

Abstract:

Multimodal biometric identification is the combination of several biometric systems; the challenge of this combination is to reduce some limitations of systems based on a single modality while significantly improving performance. In this paper, we propose a new approach to the construction and protection of a multimodal biometric database dedicated to an identification system. We use topological watermarking to hide the relation between the face image and the registered descriptors extracted from the other modalities of the same person, for more secure user identification.

Keywords: Biometric databases, Multimodal biometrics, security authentication, Digital watermarking.

4592 Integrating Visual Modeling throughout the Computer Science Curriculum

Authors: Carol B. Collins, M. H. N. Tabrizi

Abstract:

The purposes of this paper are to (1) promote excellence in computer science by suggesting a cohesive, innovative approach to fill well-documented deficiencies in current computer science education, (2) justify (using the authors' and others' anecdotal evidence from both the classroom and the real world) why this approach holds great potential to successfully eliminate the deficiencies, and (3) invite other professionals to join the authors in proof-of-concept research. The authors' experiences, though anecdotal, strongly suggest that a new approach involving visual modeling technologies should allow computer science programs to retain a greater percentage of prospective and declared majors, as students become more engaged learners, more successful problem-solvers, and better prepared programmers. In addition, the graduates of such computer science programs will make greater contributions to the profession as skilled problem-solvers. Instead of wearily re-memorizing code as they move to the next course, students will have the problem-solving skills to think and work in more sophisticated and creative ways.

Keywords: Algorithms, CASE, Problem-solving, UML.

4591 BDD Package Based on Boolean NOR Operation

Authors: M. Raseen, A. Assi, P. W. C. Prasad, A. Harb

Abstract:

Binary Decision Diagrams (BDDs) are useful data structures for symbolic Boolean manipulation. BDDs are used in many VLSI/CAD tasks, such as equivalence checking, property checking, logic synthesis, and false-path analysis. In this paper, we describe a new approach to the realization of a BDD package. To perform manipulations of Boolean functions, the proposed approach does not depend on the recursive If-Then-Else (ITE) synthesis operation. Instead of the ITE operation, the basic synthesis algorithm is built on the Boolean NOR operation.
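
The viability of a NOR-only synthesis primitive rests on NOR being functionally complete. The toy check below recovers NOT, OR and AND from NOR on plain truth values; the actual package performs the equivalent composition on BDD nodes rather than booleans.

```python
def nor(a: bool, b: bool) -> bool:
    return not (a or b)

def not_(a):    return nor(a, a)
def or_(a, b):  return nor(nor(a, b), nor(a, b))
def and_(a, b): return nor(nor(a, a), nor(b, b))

for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert or_(a, b) == (a or b)
        assert and_(a, b) == (a and b)
print("NOT, OR and AND all recovered from NOR alone")
```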

Keywords: Binary Decision Diagram (BDD), ITE Operation, Boolean Function, NOR operation.

4590 A Combined Fuzzy Decision Making Approach to Supply Chain Risk Assessment

Authors: P. Moeinzadeh, A. Hajfathaliha

Abstract:

Many firms have implemented initiatives such as outsourced manufacturing, which can make a supply chain (SC) more vulnerable to various types of disruptions, so managing risk has become a critical component of SC management. Different types of SC vulnerability management methodologies have been proposed for managing SC risk, but most offer only point-based solutions that deal with a limited set of risks. This research aims to reinforce SC risk management by proposing an integrated approach. SC risks are identified and a risk index classification structure is created. We then develop an SC risk assessment approach based on the analytic network process (ANP) and the VIKOR method under a fuzzy environment, where vagueness and subjectivity are handled with linguistic terms parameterized by triangular fuzzy numbers. Using FANP, risk weights are calculated and then fed into FVIKOR to rank the SC members and find the riskiest partner.
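
As a simplified illustration of the ranking stage, the sketch below runs crisp VIKOR on a made-up decision matrix with given criteria weights; the paper itself works with triangular fuzzy numbers and FANP-derived weights, which after defuzzification reduce to a step of this form.

```python
import numpy as np

def vikor(F, w, v=0.5):
    """F: alternatives x criteria (benefit criteria), w: criteria weights."""
    f_best, f_worst = F.max(axis=0), F.min(axis=0)
    regret = (f_best - F) / (f_best - f_worst)   # normalized distance from the best
    S = (w * regret).sum(axis=1)                 # group utility
    R = (w * regret).max(axis=1)                 # individual regret
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return np.argsort(Q)                         # best compromise (lowest Q) first

F = np.array([[7.0, 0.6, 8.0],                   # hypothetical SC-partner scores
              [5.0, 0.9, 6.0],
              [8.0, 0.4, 7.0]])
w = np.array([0.5, 0.3, 0.2])                    # e.g. weights coming from FANP
print(vikor(F, w))
```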

Keywords: Analytic network process (ANP), Fuzzy sets, Supply chain risk management (SCRM), VIšekriterijumsko KOmpromisno Rangiranje (VIKOR)

4589 An MCDM Approach to Scheduling Rule Selection in Robotic Flexible Assembly Cells

Authors: Khalid Abd, Kazem Abhary, Romeo Marian

Abstract:

Multiple criteria decision making (MCDM) is an approach to ranking solutions and finding the best one when two or more solutions are available. In this study, an MCDM approach is proposed to select the most suitable scheduling rule for robotic flexible assembly cells (RFACs). Two MCDM methods, the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), are proposed for solving the scheduling rule selection problem. The AHP method is employed to determine the weights of the evaluation criteria, while the TOPSIS method is employed to obtain the final ranking order of the scheduling rules. Four criteria are used to evaluate the scheduling rules, and four scheduling policies of the RFAC are examined to choose the most appropriate one for this purpose. A numerical example illustrates the application of the suggested methodology. The results show that the methodology is practical and works in RFAC settings.
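
A compact sketch of the two-stage procedure described above: AHP turns a pairwise comparison matrix into criteria weights via its principal eigenvector, and TOPSIS then ranks the scheduling rules by closeness to the ideal solution. All judgments, scores and criterion directions below are made up for illustration.

```python
import numpy as np

def ahp_weights(pairwise):
    """Principal-eigenvector weights of an AHP pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def topsis(decision, weights, benefit):
    """Rank alternatives; benefit[j] is True if criterion j is to be maximized."""
    v = decision / np.linalg.norm(decision, axis=0) * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    closeness = (np.linalg.norm(v - anti, axis=1) /
                 (np.linalg.norm(v - ideal, axis=1) + np.linalg.norm(v - anti, axis=1)))
    return np.argsort(-closeness)                 # best scheduling rule first

pairwise = np.array([[1, 3, 5, 7],                # hypothetical AHP judgments
                     [1/3, 1, 3, 5],
                     [1/5, 1/3, 1, 3],
                     [1/7, 1/5, 1/3, 1]])
decision = np.array([[0.90, 12.0, 4.0, 0.80],     # four rules x four criteria
                     [0.85, 10.0, 5.0, 0.90],
                     [0.92, 15.0, 3.0, 0.70],
                     [0.88, 11.0, 4.5, 0.85]])
benefit = np.array([True, False, False, True])    # e.g. maximize utilization, minimize tardiness
print(topsis(decision, ahp_weights(pairwise), benefit))
```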

Keywords: AHP, TOPSIS, Scheduling rules selection

4588 A Weighted Approach to Unconstrained Iris Recognition

Authors: Yao-Hong Tsai

Abstract:

This paper presents a weighted approach to unconstrained iris recognition. Nowadays, commercial systems are usually characterized by strong acquisition constraints based on the subject's cooperation, but such cooperation is not always achievable in real scenarios of daily life. Researchers have therefore focused on reducing these constraints while maintaining system performance through new techniques. To cope with the large variation in the environment, two main improvements are developed for the proposed iris recognition system. First, to handle extremely uneven lighting conditions, statistic-based illumination normalization is applied to the eye region to increase the accuracy of the iris features; the detection of the iris image is based on the AdaBoost algorithm. Second, the weighting is designed with Gaussian functions of the distance to the center of the iris. A local binary pattern (LBP) histogram is then applied to texture classification with these weights. Experiments showed that the proposed system provides users a more flexible and feasible way to interact with the verification system through iris recognition.
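
A minimal sketch of the weighting idea: each pixel's LBP code contributes to the texture histogram with a Gaussian weight that decays with distance from the (assumed already detected) iris centre, so texture near the centre counts more. The image, centre and sigma below are placeholders.

```python
import numpy as np

def lbp_codes(img):
    """Basic 3x3 local binary pattern codes for the interior pixels."""
    c = img[1:-1, 1:-1]
    neighbours = [img[:-2, :-2], img[:-2, 1:-1], img[:-2, 2:], img[1:-1, 2:],
                  img[2:, 2:],   img[2:, 1:-1],  img[2:, :-2], img[1:-1, :-2]]
    code = np.zeros(c.shape, dtype=np.int32)
    for bit, n in enumerate(neighbours):
        code += (n >= c).astype(np.int32) * (1 << bit)
    return code

def weighted_lbp_histogram(img, centre, sigma=20.0):
    codes = lbp_codes(img)
    ys, xs = np.mgrid[1:img.shape[0] - 1, 1:img.shape[1] - 1]
    d2 = (ys - centre[0]) ** 2 + (xs - centre[1]) ** 2
    w = np.exp(-d2 / (2.0 * sigma ** 2))               # Gaussian weight per pixel
    hist = np.bincount(codes.ravel(), weights=w.ravel(), minlength=256)
    return hist / hist.sum()

eye = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(np.int32)
print(weighted_lbp_histogram(eye, centre=(32, 32))[:8])
```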

Keywords: Authentication, iris recognition, Adaboost, local binary pattern.
