Search results for: hybrid approach
3884 Runtime Monitoring Using Policy Based Approach to Control Information Flow for Mobile Apps
Authors: M. Sarrab, H. Bourdoucen
Abstract:
Mobile applications are verified to check their correctness or evaluated to check their performance with respect to specific security properties such as availability, integrity and confidentiality. Such verification before the applications are made available to end users is achievable only to a limited degree using static software engineering verification techniques. The more sensitive the information processed by a mobile application, such as credit card data, personal medical information or personal emails, the more important it is to ensure the confidentiality of this information. Monitoring an untrusted mobile application during execution in an environment where sensitive information is present is difficult and error-prone. The paper addresses the issue of monitoring and controlling the flow of confidential information during the execution of untrusted mobile applications. The approach concentrates on providing a dynamic and usable information security solution by interacting with mobile users during application runtime in response to information flow events.
Keywords: Mobile application, Run-time verification, Usable security, Direct information flow.
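The kind of policy check such a runtime monitor performs can be sketched as follows; the event, source and sink names and the user-decision callback are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch of a policy-based runtime monitor for direct information flow,
# assuming hypothetical source/sink names and a user-decision callback.

SENSITIVE_SOURCES = {"credit_card", "medical_record", "email"}
UNTRUSTED_SINKS = {"network_socket", "sms", "third_party_api"}

def ask_user(source, sink):
    """Stand-in for the runtime dialog shown to the mobile user."""
    print(f"App wants to send data from '{source}' to '{sink}'. Allow? (y/n)")
    return False  # deny by default in this sketch

def on_flow_event(source, sink, policy):
    """Called by the (hypothetical) instrumentation layer for each flow event."""
    if source in SENSITIVE_SOURCES and sink in UNTRUSTED_SINKS:
        decision = policy.get((source, sink))
        if decision is None:           # no stored policy: ask the user at runtime
            decision = ask_user(source, sink)
            policy[(source, sink)] = decision
        return decision                # True = allow the flow, False = block it
    return True                        # non-confidential flows pass through

policy = {}
print(on_flow_event("credit_card", "network_socket", policy))  # blocked
print(on_flow_event("photo", "network_socket", policy))        # allowed
```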
3883 Application of Stabilized Polyaniline Microparticles for Better Protective Ability of Zinc Coatings
Authors: N. Boshkova, K. Kamburova, N. Tabakova, N. Boshkov, Ts. Radeva
Abstract:
Coatings based on polyaniline (PANI) can improve the resistance of steel against corrosion. In this work, the preparation of stable suspensions of colloidal PANI-SiO2 particles, suitable for obtaining composite anticorrosive coatings on steel, is described. Electrokinetic data as a function of pH are presented, showing that the zeta potentials of the PANI-SiO2 particles are governed primarily by the charged groups at the silica oxide surface. Electrosteric stabilization of the PANI-SiO2 particle suspension against aggregation is achieved at pH > 5.5 (EB form of PANI) by adsorption of positively charged polyelectrolyte molecules onto the negatively charged PANI-SiO2 particles. The PANI-SiO2 particles are incorporated by electrodeposition into the zinc metal matrix in order to obtain composite (hybrid) coatings. The latter are intended to ensure sacrificial protection of steel, mainly in aggressive media leading to local corrosion damage. The surface morphology of the composite zinc coatings is investigated with SEM. The influence of the PANI-SiO2 particles on the cathodic and anodic processes occurring in the starting electrolyte for obtaining the coatings is followed with cyclic voltammetry. The electrochemical and corrosion behavior is evaluated with potentiodynamic polarization curves and polarization resistance measurements. The beneficial effect of the stabilized PANI-SiO2 particles on the increased protective ability of the composites is discussed.
Keywords: Corrosion, polyaniline particles, zinc, protective ability.
3882 Residence Time Distribution in a Two Impinging Streams Cyclone Reactor: CFD Prediction and Experimental Validation
Authors: Nahid Ghasemi, Morteza Sohrabi, Yasan Soleymani
Abstract:
The quantified residence time distribution (RTD) provides a numerical characterization of mixing in a reactor, thus allowing the process engineer to better understand the mixing performance of the reactor. This paper discusses computational studies to investigate flow patterns in a two impinging streams cyclone reactor (TISCR). Flow in the reactor was modeled with computational fluid dynamics (CFD). Utilizing the Eulerian-Lagrangian approach, implemented in FLUENT (V6.3.22), particle trajectories were obtained by solving the particle force balance equations. From simulation results obtained at different Δt values, the mean residence time (tm) and the mean square deviation (σ²) were calculated. A good agreement can be observed between predicted and experimental data. Simulation results indicate that the behavior of complex reactor systems can be predicted using the CFD technique with a minimum of data required for validation.
Keywords: Impinging streams reactor, Residence time distribution, CFD, Eulerian-Lagrangian approach.
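The RTD moments mentioned above follow directly from the sampled E(t) curve; the sketch below shows the arithmetic on a synthetic placeholder curve rather than the TISCR data.

```python
# Minimal sketch: mean residence time and mean square deviation from a sampled
# RTD curve E(t), as would be extracted from particle-tracking or tracer data.
import numpy as np

t = np.linspace(0.0, 10.0, 201)            # time samples, s
E = t * np.exp(-t)                          # placeholder E(t) curve (unnormalized)

dt = t[1] - t[0]
E = E / (E.sum() * dt)                      # normalize so that sum E(t) dt = 1
t_m = (t * E).sum() * dt                    # mean residence time t_m = sum t E(t) dt
var = ((t - t_m) ** 2 * E).sum() * dt       # mean square deviation sigma^2

print(f"t_m = {t_m:.3f} s, sigma^2 = {var:.3f} s^2")
```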
3881 Usage of Military Spending, Debt Servicing and Growth for Dealing with Emergency Plan of Indian External Debt
Authors: Sahbi Farhani
Abstract:
This study investigates the relationship between external debt and military spending in the case of India over the period 1970–2012. In doing so, we have applied structural break unit root tests to examine the stationarity properties of the variables. The Auto-Regressive Distributed Lag (ARDL) bounds testing approach is used to test whether cointegration exists in the presence of structural breaks in the series. Our results indicate cointegration among external debt, military spending, debt servicing, and economic growth. Moreover, military spending and debt servicing add to external debt, while economic growth helps in lowering external debt. The Vector Error Correction Model (VECM) analysis and Granger causality tests reveal that military spending and economic growth cause external debt. A feedback effect also exists between external debt and debt servicing in the case of India.
Keywords: External debt, military spending, ARDL approach, structural breaks, India.
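As a purely illustrative sketch of the ARDL idea (not the paper's full specification, which adds debt servicing, growth, structural-break tests and the bounds test), a single-regressor ARDL(1,1) can be estimated by ordinary least squares and its long-run coefficient recovered; the data below are synthetic.

```python
# Illustrative ARDL(1,1) regression of external debt on military spending,
# estimated by OLS on synthetic data, with the implied long-run coefficient.
import numpy as np

rng = np.random.default_rng(0)
T = 43                                      # 1970-2012
x = np.cumsum(rng.normal(0.5, 1.0, T))      # military spending (synthetic)
y = np.empty(T)
y[0] = x[0]
for t in range(1, T):                       # debt responding to spending
    y[t] = 0.6 * y[t - 1] + 0.3 * x[t] + 0.2 * x[t - 1] + rng.normal(0, 0.5)

# ARDL(1,1):  y_t = c + phi*y_{t-1} + b0*x_t + b1*x_{t-1} + e_t
X = np.column_stack([np.ones(T - 1), y[:-1], x[1:], x[:-1]])
c, phi, b0, b1 = np.linalg.lstsq(X, y[1:], rcond=None)[0]
long_run = (b0 + b1) / (1.0 - phi)          # long-run effect of spending on debt
print(f"phi = {phi:.2f}, long-run coefficient = {long_run:.2f}")
```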
3880 Preparing Project Managers to Achieve Project Success - Human Management Perspective
Authors: E. Muneera, A. Anuar, A. S. Zulkiflee
Abstract:
The evolution of project management was triggered by changes in management philosophy and practices aimed at maintaining competitive advantage and continuous success in the field. The purpose of this paper is to highlight the practicality of cognitive style and the unlearning approach in influencing the achievement of project success by project managers. It introduces the concepts of the planning, knowing and creating styles from the cognitive style field in the light of achieving time, cost, quality and stakeholder appreciation in the context of project success. It further discusses the unlearning approach as a moderator enhancing the relationship between cognitive style and project success. The paper is based on a literature review from established disciplines such as psychology, sociology and philosophy regarding cognitive style, unlearning and project success in general. Through the analysis and synthesis of the literature in the subject area, a conceptual paper is developed as the basis of future research to form a comprehensive framework for project managers to enhance project management competency.
Keywords: Cognitive Style, Project Managers, Project Success, Unlearning.
3879 Genetic Algorithm Based Wavelength Division Multiplexing Networks Planning
Authors: S.Baskar, P.S.Ramkumar, R.Kesavan
Abstract:
This paper presents a new heuristic algorithm useful for long-term planning of survivable WDM networks. A multi-period model is formulated that combines network topology design and capacity expansion. The ability to determine network expansion schedules of this type becomes increasingly important to the telecommunications industry and to its customers. The solution technique consists of a Genetic Algorithm that allows generating several network alternatives for each time period simultaneously, and shortest-path techniques to deduce from these alternatives a least-cost network expansion plan over all time periods. The multi-period planning approach is illustrated on a realistic network example. Extensive simulations on a wide range of problem instances are carried out to assess the cost savings that can be expected by choosing a multi-period planning approach instead of an iterative network expansion design method.
Keywords: Wavelength Division Multiplexing, Genetic Algorithm, Network topology, Multi-period reliable network planning.
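A toy genetic algorithm in the spirit described above is sketched below: a binary chromosome decides which candidate links are installed in each period, and a placeholder fitness trades installation cost against a crude survivability penalty. The cost model and penalty are hypothetical, not the paper's WDM model.

```python
# Toy multi-period expansion GA: tournament selection, one-point crossover,
# bit-flip mutation. Costs and the penalty term are made-up placeholders.
import random

N_LINKS, N_PERIODS = 8, 3
COST = [[random.randint(1, 9) for _ in range(N_LINKS)] for _ in range(N_PERIODS)]

def fitness(chrom):
    cost = sum(COST[p][l] for p in range(N_PERIODS) for l in range(N_LINKS)
               if chrom[p * N_LINKS + l])
    links_per_period = [sum(chrom[p * N_LINKS:(p + 1) * N_LINKS])
                        for p in range(N_PERIODS)]
    penalty = sum(50 for n in links_per_period if n < 3)   # crude survivability proxy
    return cost + penalty                                   # lower is better

def evolve(pop_size=40, generations=100, p_mut=0.02):
    n = N_LINKS * N_PERIODS
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)                    # tournament selection
            p1 = min(a, b, key=fitness)
            a, b = random.sample(pop, 2)
            p2 = min(a, b, key=fitness)
            cut = random.randrange(1, n)                    # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=fitness)

best = evolve()
print("best cost:", fitness(best))
```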
3878 An Efficient Graph Query Algorithm Based on Important Vertices and Decision Features
Authors: Xiantong Li, Jianzhong Li
Abstract:
Graphs have become increasingly important in modeling complicated structures and schemaless data such as proteins, chemical compounds, and XML documents. Given a graph query, it is desirable to retrieve graphs quickly from a large database via graph-based indices. Different from existing methods, our approach, called VFM (Vertex to Frequent Feature Mapping), makes use of vertices and decision features as the basic indexing features. VFM constructs two mappings between vertices and frequent features to answer graph queries. The VFM approach not only provides an elegant solution to the graph indexing problem, but also demonstrates how database indexing and query processing can benefit from data mining, especially frequent pattern mining. The results show that the proposed method not only avoids enumerating the subgraphs of the query graph, but also effectively reduces the number of subgraph isomorphism tests between the query graph and the graphs in the candidate answer set during the verification stage.
Keywords: Decision Feature, Frequent Feature, Graph Dataset, Graph Query.
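The general filter-and-verify scheme that VFM belongs to can be sketched as follows; an inverted index from vertex labels (a simple stand-in for the paper's frequent features) prunes the database, and only the surviving candidates undergo the expensive subgraph isomorphism test. The graphs are toy examples and the sketch assumes the networkx library.

```python
# Filter-then-verify sketch: label index for pruning, subgraph isomorphism
# (networkx GraphMatcher) only on the candidate set.
import networkx as nx
from networkx.algorithms import isomorphism

def make_graph(edges, labels):
    g = nx.Graph()
    for v, lab in labels.items():
        g.add_node(v, label=lab)
    g.add_edges_from(edges)
    return g

database = {
    "g1": make_graph([(0, 1), (1, 2)], {0: "C", 1: "O", 2: "C"}),
    "g2": make_graph([(0, 1)], {0: "N", 1: "N"}),
}
query = make_graph([(0, 1)], {0: "C", 1: "O"})

# Filtering: a candidate must contain every vertex label of the query.
index = {}                                    # label -> set of graph ids
for gid, g in database.items():
    for _, lab in g.nodes(data="label"):
        index.setdefault(lab, set()).add(gid)
candidates = set.intersection(*(index.get(lab, set())
                                for _, lab in query.nodes(data="label")))

# Verification: subgraph isomorphism tests on the pruned candidate set only.
nm = isomorphism.categorical_node_match("label", None)
answers = [gid for gid in candidates
           if isomorphism.GraphMatcher(database[gid], query,
                                       node_match=nm).subgraph_is_isomorphic()]
print(answers)   # ['g1']
```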
3877 A Study of Growth Factors on Sustainable Manufacturing in Small and Medium-Sized Enterprises: Case Study of Japan Manufacturing
Authors: Tadayuki Kyoutani, Shigeyuki Haruyama, Ken Kaminishi, Zefry Darmawan
Abstract:
Japan’s semiconductor industries have developed greatly in recent years. Many started as small and medium-sized enterprises (SMEs) that found themselves in favorable circumstances and have now become prosperous industries worldwide. Sustainable growth factors that support the creation of spirit value inside Japanese companies are strongly embedded in their performance, yet these factors have not been clearly defined by each company. A literature study was conducted to explore, through quantitative text mining, the definition of sustainable growth factors. Sustainability criteria were developed from previous research to verify the definition of the factors. A typical framework was proposed as a systematic approach to developing sustainable growth factors in a specific company. A review of the approach over a certain period shows that the factors influencing sustainable growth are important for the company to achieve its goals.
Keywords: SME, manufacture, sustainable, growth factor.
3876 Creating Customer Value through SOA and Outsourcing: A NEBIC Approach
Authors: Benazeer Md. Shahzada, Verelst Jan, Van Grembergen Wim, Mannaert Herwig
Abstract:
This article is an extension and practical application of Wheeler's NEBIC theory (Net Enabled Business Innovation Cycle). NEBIC theory is a new approach in IS research and can be used in dynamic environments related to new technology. Firms can follow market changes rapidly with the support of IT resources. Flexible firms adapt their market strategies and respond more quickly to customers' changing behaviors. When every leading firm in an industry has access to the same IT resources, the way these IT resources are managed will determine the competitive advantages or disadvantages of a firm. From the dynamic capabilities perspective and from the newly introduced NEBIC theory by Wheeler, we know that IT resources alone cannot deliver customer value, but a good configuration of those resources can guarantee customer value by choosing the right emerging technology and grasping the right economic opportunities through business innovation and growth. We found evidence in the literature that SOA (Service Oriented Architecture) is a promising emerging technology which can deliver the desired economic opportunity through modularity, flexibility and loose coupling. SOA can also help firms connect in networks, which can open a new window of opportunity to collaborate in innovation and in the right kind of outsourcing. Many articles and research reports indicate that the failure rate in outsourcing is very high, but at the same time research indicates that successful outsourcing projects add tangible and intangible benefits to the service consumer. Business executives and policy makers in the West should not be afraid of outsourcing, but they should choose the right strategy, through the use of emerging technology, to significantly reduce the failure rate in outsourcing.
Keywords: Absorptive capacity, Dynamic Capability, Net-enabled business innovation cycle, Service oriented architecture.
3875 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach
Authors: D. Tedesco, G. Feletti, P. Trucco
Abstract:
The present study aims to develop a Decision Support System (DSS) to support operational decisions in Emergency Medical Service (EMS) systems regarding the assignment of medical emergency requests to Emergency Departments (ED). This problem is called “hospital selection” and concerns the definition of policies for selecting the ED to which patients who require further treatment are transported by ambulance. The research methodology consists of a first phase reviewing the technical-scientific literature on DSSs that support EMS management and, in particular, the hospital selection decision. The literature analysis showed that current studies mainly focus on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a mission. Therefore, all ED-related issues are excluded and considered part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes travel time and frees up the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that also considers the expected time performance in the subsequent phases of the process, such as the case mix, the expected service throughput times, and the operational capacity of the different EDs. To this end, a Discrete Event Simulation (DES) model was created to compare different hospital selection policies. The model was implemented with the AnyLogic software and validated on a realistic case. The hospital selection policy that returned the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the start of the clinical evaluation by the ED doctor. Finally, two approaches were compared: a static approach, based on a retrospective estimation of the TTP, and a dynamic approach, based on a predictive estimation of the TTP determined with a constantly updated Winters forecasting model. Findings reveal that minimizing the TTP is the best hospital selection policy: it significantly reduces service throughput times in the ED with a negligible increase in travel time. Furthermore, it provides an immediate view of the saturation state of the ED and takes into account the case mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. Besides, a predictive approach is more reliable for TTP estimation than a retrospective approach. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
Keywords: Emergency medical services, hospital selection, discrete event simulation, forecast model.
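A minimal sketch of the dynamic TTP-based selection rule follows: the expected TTP of each ED is its travel time plus a forecast of the door-to-provider wait, the latter obtained here with simple exponential smoothing as a lightweight stand-in for the Winters model used in the study. All numbers are hypothetical.

```python
# TTP-minimizing hospital selection with a toy wait-time forecast.

def smooth_forecast(history, alpha=0.4):
    """One-step-ahead forecast by simple exponential smoothing."""
    level = history[0]
    for obs in history[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level

eds = {
    # ED id: (travel time from scene [min], recent door-to-provider waits [min])
    "ED_A": (8.0,  [35, 42, 50, 47, 55]),
    "ED_B": (15.0, [20, 18, 22, 19, 21]),
    "ED_C": (11.0, [30, 28, 33, 29, 31]),
}

def select_ed(eds):
    ttp = {ed: travel + smooth_forecast(waits) for ed, (travel, waits) in eds.items()}
    return min(ttp, key=ttp.get), ttp

best, ttp = select_ed(eds)
print(best, {k: round(v, 1) for k, v in ttp.items()})
# Proximity would pick ED_A; the TTP rule picks ED_B despite the longer trip.
```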
3874 Countercurrent Flow Simulation of Gas-Solid System in a Purge Column Using Computational Fluid Dynamics Techniques
Authors: T. J. Jamaleddine
Abstract:
Purge columns or degasser vessels are widely used in the polyolefin process for removing trapped hydrocarbons and excess catalyst residues from the polymer particles. A uniform distribution of purge gases coupled with a plug-flow characteristic inside the column is desirable to obtain optimum desorption of trapped hydrocarbons and catalyst residues. The Computational Fluid Dynamics (CFD) approach is a promising tool for the design optimization of these vessels. The success of this approach depends profoundly on the solution strategy and on the choice of geometrical layout at the vessel outlet. Filling the column with solids and initially solving for the solids flow minimized numerical diffusion substantially. Adopting a cylindrical configuration at the vessel outlet resulted in less numerical instability and reproduced the hydrodynamics of the solids flow in the hopper segment reasonably well.
Keywords: CFD, gas-solids flow, gas purging, species transport, purge column, degasser vessel.
3873 Semi-automatic Background Detection in Microscopic Images
Authors: Alessandro Bevilacqua, Alessandro Gherardi, Ludovico Carozza, Filippo Piccinini
Abstract:
Recent years have seen an increasing use of image analysis techniques in the field of biomedical imaging, in particular in microscopic imaging. The basic step of most image analysis techniques relies on a background image free of objects of interest, whether they are cells or histological samples, in order to perform further analysis such as segmentation or mosaicing. Commonly, this image consists of an empty field acquired in advance. However, acquiring an empty field is often not feasible. Moreover, it may differ from the background region of the sample actually being studied, because of the interaction with the organic matter. Finally, it can be expensive, for instance in the case of live cell analyses. We propose a non-parametric and general-purpose approach in which the background is built automatically from a sequence of images that may contain objects of interest. The amount of object-free area in each image only affects the overall speed of obtaining the background. Experiments with different kinds of microscopic images prove the effectiveness of our approach.
Keywords: Microscopy, flat field correction, background estimation, image segmentation.
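The idea of building a background from a sequence that still contains objects can be illustrated with a per-pixel temporal median, which discards objects as long as each pixel is object-free in most frames; the median is a simple stand-in for the non-parametric estimator proposed in the paper, and the data are synthetic.

```python
# Temporal-median background estimation on a synthetic image sequence.
import numpy as np

rng = np.random.default_rng(1)
h, w, n_frames = 64, 64, 11

background_true = rng.normal(100.0, 2.0, (h, w))          # synthetic flat field
frames = np.repeat(background_true[None], n_frames, axis=0)
for k in range(n_frames):                                  # add a moving "cell"
    r, c = 5 + 4 * k, 10 + 3 * k
    frames[k, r:r + 8, c:c + 8] += 80.0

background_est = np.median(frames, axis=0)                 # per-pixel median
print("max abs error:", np.abs(background_est - background_true).max())
```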
3872 One-DOF Precision Position Control using the Combined Piezo-VCM Actuator
Authors: Yung-Tien Liu, Chun-Chao Wang
Abstract:
This paper presents the control performance of a high-precision positioning device using a hybrid actuator composed of a piezoelectric (PZT) actuator and a voice-coil motor (VCM). The combined piezo-VCM actuator features two main characteristics: a large operating range due to the long stroke of the VCM, and high-precision, heavy-load positioning ability due to the PZT impact force. A one-degree-of-freedom (DOF) experimental setup was configured to examine the fundamental characteristics, and the control performance was effectively demonstrated using a switching controller. In the rough positioning state, an integral variable structure controller (IVSC) was used for the VCM to cover a long range of operation; in the precision positioning state, an impact force controller (IFC) for the PZT actuator, coupled with the presliding states of the sliding table, was used to obtain high-precision position control and achieve both forward and backward actuation. The experimental results showed that the sliding table, having a mass of 881 g and a preload of 10 N, was successfully positioned within a positioning accuracy of 10 nm in both forward and backward position control.
Keywords: Integral variable structure controller (IVSC), impact force, precision positioning, presliding, PZT actuator, voice-coil motor (VCM).
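The switching strategy itself can be illustrated with a toy loop in which a coarse stage (standing in for the VCM under IVSC) drives the table close to the target and a fine stage (standing in for PZT impact actuation) advances in fixed nanometre steps in either direction; the plant model, gains and thresholds are hypothetical, not the IVSC/IFC control laws.

```python
# Toy coarse/fine switching position control loop.

TARGET = 5.0e-3          # target position, m (5 mm)
HANDOVER = 1.0e-6        # switch to fine mode within 1 um of the target
TOLERANCE = 10.0e-9      # required accuracy, 10 nm
PZT_STEP = 10.0e-9       # displacement per PZT impact, 10 nm (assumed)

position = 0.0
mode = "coarse"
for _ in range(10_000):
    error = TARGET - position
    if abs(error) <= TOLERANCE:
        break
    if mode == "coarse" and abs(error) > HANDOVER:
        position += 0.3 * error                             # proportional VCM move (toy model)
    else:
        mode = "fine"
        position += PZT_STEP if error > 0 else -PZT_STEP    # impact step, forward or backward

print(f"mode={mode}, final error = {abs(TARGET - position) * 1e9:.1f} nm")
```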
3871 Face Recognition Based On Vector Quantization Using Fuzzy Neuro Clustering
Authors: Elizabeth B. Varghese, M. Wilscy
Abstract:
A face recognition system is a computer application for automatically identifying or verifying a person from a digital image or a video frame. Many algorithms have been proposed for face recognition. Vector Quantization (VQ) based face recognition is a novel approach to the problem. Here, a new codebook generation method for VQ-based face recognition using Integrated Adaptive Fuzzy Clustering (IAFC) is proposed. IAFC is a fuzzy neural network which incorporates a fuzzy learning rule into a competitive neural network. The performance of the proposed algorithm is demonstrated using the publicly available AT&T database, Yale database, Indian Face database and a small face database, the DCSKU database, created in our lab. In all the databases the proposed approach achieved a higher recognition rate than most of the existing methods. In terms of Equal Error Rate (EER), the proposed codebook is also better than the existing methods.
Keywords: Face Recognition, Vector Quantization, Integrated Adaptive Fuzzy Clustering, Self Organization Map.
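The VQ classification step can be sketched with k-means codebooks as a simple stand-in for the IAFC codebook generation proposed in the paper: each subject gets a codebook of image-block vectors, and a test face is assigned to the subject whose codebook gives the smallest average quantization distortion. The data are random placeholders instead of face images, and scikit-learn is assumed.

```python
# VQ-based recognition sketch with per-subject k-means codebooks.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def to_blocks(img, bs=4):
    """Split an image into non-overlapping bs x bs blocks, one row per block."""
    h, w = img.shape
    return (img[:h - h % bs, :w - w % bs]
            .reshape(h // bs, bs, w // bs, bs)
            .swapaxes(1, 2).reshape(-1, bs * bs))

# Two "subjects", three training images each (random stand-ins for faces).
train = {s: [rng.normal(loc=10 * s, scale=1.0, size=(32, 32)) for _ in range(3)]
         for s in (1, 2)}

codebooks = {}
for s, imgs in train.items():
    blocks = np.vstack([to_blocks(img) for img in imgs])
    codebooks[s] = KMeans(n_clusters=16, n_init=10, random_state=0).fit(blocks)

def recognize(img):
    blocks = to_blocks(img)
    distortion = {s: np.min(np.linalg.norm(
        blocks[:, None, :] - km.cluster_centers_[None], axis=2), axis=1).mean()
        for s, km in codebooks.items()}
    return min(distortion, key=distortion.get)   # smallest quantization error wins

test = rng.normal(loc=20.0, scale=1.0, size=(32, 32))   # resembles subject 2
print(recognize(test))                                   # -> 2
```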
3870 Capacitor Placement in Radial Distribution System for Loss Reduction Using Artificial Bee Colony Algorithm
Authors: R. Srinivasa Rao
Abstract:
This paper presents a new method which applies an artificial bee colony (ABC) algorithm to capacitor placement in distribution systems, with the objective of improving the voltage profile and reducing power loss. The ABC algorithm is a new population-based metaheuristic approach inspired by the intelligent foraging behavior of honeybee swarms. One advantage of the ABC algorithm is that it does not require external parameters such as crossover rate and mutation rate, as in the case of genetic algorithms and differential evolution, which are hard to determine in advance. Another advantage is that the global search ability of the algorithm is implemented by introducing a neighborhood source production mechanism, which is similar to a mutation process. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on a 69-bus system and the results are compared with another approach available in the literature. The proposed method outperforms the other methods in terms of solution quality and computational efficiency.
Keywords: Distribution system, Capacitor Placement, Loss reduction, Artificial Bee Colony Algorithm.
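A compact sketch of the ABC loop (employed, onlooker and scout phases) is given below, minimizing a toy objective over capacitor sizes at three candidate buses; the objective is a placeholder, not a 69-bus load-flow, and the onlooker probability rule is simplified.

```python
# Artificial Bee Colony sketch on a toy capacitor-sizing objective.
import random

DIM, LOW, HIGH = 3, 0.0, 1.0          # per-unit capacitor sizes at 3 buses
N_SOURCES, LIMIT, CYCLES = 10, 20, 200

def loss(x):                           # toy surrogate for power loss + cost
    return sum((xi - 0.37) ** 2 for xi in x) + 0.05 * sum(x)

def neighbor(x, others):
    k = random.choice(others)          # v_j = x_j + phi * (x_j - x_k,j)
    j = random.randrange(DIM)
    v = list(x)
    v[j] = min(HIGH, max(LOW, x[j] + random.uniform(-1, 1) * (x[j] - k[j])))
    return v

sources = [[random.uniform(LOW, HIGH) for _ in range(DIM)] for _ in range(N_SOURCES)]
trials = [0] * N_SOURCES
for _ in range(CYCLES):
    for phase in ("employed", "onlooker"):
        fits = [1.0 / (1.0 + loss(s)) for s in sources]
        total = sum(fits)
        for i in range(N_SOURCES):
            if phase == "onlooker" and random.random() > fits[i] / total * N_SOURCES:
                continue                               # onlookers prefer good sources
            cand = neighbor(sources[i], sources[:i] + sources[i + 1:])
            if loss(cand) < loss(sources[i]):          # greedy selection
                sources[i], trials[i] = cand, 0
            else:
                trials[i] += 1
    for i in range(N_SOURCES):                          # scout phase
        if trials[i] > LIMIT:
            sources[i] = [random.uniform(LOW, HIGH) for _ in range(DIM)]
            trials[i] = 0

best = min(sources, key=loss)
print([round(v, 3) for v in best], round(loss(best), 4))
```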
3869 An Application of the Sinc-Collocation Method to a Three-Dimensional Oceanography Model
Authors: Y. Mohseniahouei, K. Abdella, M. Pollanen
Abstract:
In this paper, we explore the applicability of the Sinc-Collocation method to a three-dimensional (3D) oceanography model. The model describes a wind-driven current with depth-dependent eddy viscosity in the complex-velocity system. In general, Sinc-based methods excel over traditional numerical methods due to their exponentially decaying errors, rapid convergence, and ability to handle problems with singularities at the end-points. Together with these advantages, the Sinc-Collocation approach that we utilize exploits first-derivative interpolation, whose integration is much less sensitive to numerical errors. We present several model problems to demonstrate the accuracy, stability, and computational efficiency of the method. The approximate solutions obtained by the Sinc-Collocation technique are compared to exact solutions and to those obtained by the Sinc-Galerkin approach in earlier studies. Our findings indicate that the Sinc-Collocation method outperforms the other Sinc-based methods of past studies.
Keywords: Boundary Value Problems, Differential Equations, Sinc Numerical Methods, Wind-Driven Currents.
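For reference, the standard sinc cardinal basis on which sinc-collocation methods are built is recalled below (a general definition, not a formula taken from the paper).

```latex
% Sinc cardinal basis and collocation expansion with step size h.
S(j,h)(x) = \operatorname{sinc}\!\left(\frac{x - jh}{h}\right)
          = \frac{\sin\!\big(\pi (x - jh)/h\big)}{\pi (x - jh)/h},
\qquad
u(x) \approx \sum_{j=-N}^{N} c_j \, S(j,h)(x).
```

At the sinc points x_k = kh the basis satisfies S(j,h)(kh) = δ_jk, so the coefficients c_j are simply the nodal values of the approximate solution, which is what makes collocation at these points convenient.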
3868 Energy-Aware Scheduling in Real-Time Systems: An Analysis of Fair Share Scheduling and Priority-Driven Preemptive Scheduling
Authors: Su Xiaohan, Jin Chicheng, Liu Yijing, Burra Venkata Durga Kumar
Abstract:
Energy-aware scheduling in real-time systems aims to minimize energy consumption, but issues related to resource reservation and timing constraints remain challenging. This study analyzes two scheduling algorithms, Fair-Share Scheduling (FFS) and Priority-Driven Preemptive Scheduling (PDPS), with respect to these issues and to energy-aware scheduling in real-time systems. The analysis of both algorithms shows that FFS ensures a fair allocation of resources but struggles under an imbalanced system load, while PDPS prioritizes tasks based on criticality and meets timing constraints through preemption, but relies heavily on task prioritization and may not be energy efficient. Therefore, improvements to both algorithms with energy-aware features are proposed. Future work should focus on developing hybrid scheduling techniques that minimize energy consumption through intelligent task prioritization, resource allocation, and meeting timing constraints.
Keywords: Energy-aware scheduling, fair-share scheduling, priority-driven preemptive scheduling, real-time systems, optimization, resource reservation, timing constraints.
3867 Flexible Development and Calculation of Contract Logistics Services
Authors: T. Spiegel, J. Siegmann, C. F. Durach
Abstract:
Challenges resulting from an international and dynamic business environment are increasingly being passed on from manufacturing companies to external service providers. Especially providers of complex, customer-specific industry services have to cope with continuously changing requirements. This is particularly true for contract logistics service providers. They are forced to develop efficient and highly flexible structures and strategies to meet their customer’s needs. One core element they have to focus on is the reorganization of their service development and sales process. Based on an action research approach, this study develops and tests a concept to streamline tender management for contract logistics service providers. The concept of modularized service architecture is deployed in order to derive a practice-oriented approach for the modularization of complex service portfolios and the design of customized quotes. These findings are evaluated regarding their applicability in other service sectors and practical recommendations are given.
Keywords: Contract Logistics, Modularization, Service Development, Tender Management.
3866 Competence-Based Human Resources Selection and Training: Making Decisions
Authors: O. Starineca, I. Voronchuk
Abstract:
Human Resources (HR) selection and training have various implementation possibilities depending on an organization’s abilities and peculiarities. We propose to base HR selection and training decisions on a competence-based approach. HR selection and the training of employees are topical, as there is room for improvement in this field; therefore, the aim of the research is to propose rational decision-making approaches for an organization's HR selection and training choices. Our proposals are based on the training development and competence-based selection approaches created in previous research, i.e., the Analytic Hierarchy Process (AHP) and Linear Programming. A literature review on non-formal education, competence-based selection, and AHP forms our theoretical background. Some educational service providers in Latvia offer employee training, e.g., motivation, computer skills, accounting, law, ethics, stress management, etc., that is topical for public administration. A competence-based approach is a rational basis for decision-making in both HR selection and HR training.
Keywords: Competence-based selection, human resource, training, decision-making.
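The AHP step referred to above reduces to standard eigenvector arithmetic: priorities are the principal eigenvector of a pairwise comparison matrix, and the consistency ratio checks the judgments. The comparison values below are hypothetical.

```python
# AHP priority vector and consistency ratio for a 3x3 comparison matrix.
import numpy as np

# Pairwise comparisons of three training options (Saaty's 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                                  # priority vector

n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)                     # consistency index
cr = ci / 0.58                                   # random index for n = 3
print(np.round(w, 3), f"CR = {cr:.3f}")          # CR < 0.1 => judgments acceptable
```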
3865 Manufacturing Dispersions Based Simulation and Synthesis of Design Tolerances
Authors: Nassima Cheikh, Abdelmadjid Cheikh, Said Hamou
Abstract:
The objective of this work, which is based on a simultaneous engineering approach, is to contribute to the development of a CIM tool for the synthesis of functional design dimensions expressed by average values and tolerance intervals. In this paper, the dispersions method known as the Δl method, which has proved reliable in the simulation of manufacturing dimensions, is used to develop a methodology for automating the simulation. This methodology is built around three procedures. The first procedure verifies the functional requirements by automatically extracting the functional dimension chains in the mechanical sub-assembly. A second procedure then optimizes the dispersions on the basis of the unknown variables. The third procedure uses the optimized values of the dispersions to compute the optimized average values and tolerances of the functional dimensions in the chains. A statistical and cost-based approach is integrated into the methodology in order to take account of the capabilities of the manufacturing processes and to distribute optimal values among the individual components of the chains.
Keywords: functional tolerances, manufacturing dispersions, simulation, CIM.
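The underlying chain arithmetic can be illustrated with the classic worst-case and statistical (RSS) stack-up rules for a functional dimension chain; this is generic tolerance-chain arithmetic used for illustration, not the Δl optimization itself, and the dimension values are made up.

```python
# Worst-case and statistical stack-up of a functional dimension chain.
import math

# (average value, dispersion/tolerance) of each manufactured dimension in the chain
chain = [(40.0, 0.10), (25.0, 0.08), (12.5, 0.05)]
signs = [+1, -1, -1]                              # orientation of each link

mean = sum(s * m for s, (m, _) in zip(signs, chain))
worst_case = sum(t for _, t in chain)             # arithmetic (worst-case) stack-up
rss = math.sqrt(sum(t ** 2 for _, t in chain))    # statistical (RSS) stack-up

print(f"functional dimension = {mean} +/- {worst_case} (worst case)")
print(f"functional dimension = {mean} +/- {rss:.3f} (statistical)")
```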
3864 Design Based Performance Prediction of Component Based Software Products
Authors: K. S. Jasmine, R. Vasantha
Abstract:
Component-based software engineering provides an opportunity for better quality and increased productivity in software development by using reusable software components [10]. One of the most critical aspects of the quality of a software system is its performance. The systematic application of software performance engineering techniques throughout the development process can help to identify design alternatives that preserve desirable qualities such as extensibility and reusability while meeting performance objectives [1]. In the present scenario, software engineering methodologies strongly focus on the functionality of the system, while applying a “fix-it-later” approach to software performance aspects [3]. As a result, lengthy fine-tuning, expensive extra hardware, or even redesigns are necessary for the system to meet the performance requirements. In this paper, we propose a design-based, implementation-independent performance prediction approach to reduce the overhead incurred in the later phases of developing a performance-guaranteed software product, with the help of the Unified Modeling Language (UML).
Keywords: Software Reuse, Component-based development, Unified Modeling Language, Software performance, Software components, Performance engineering, Software engineering.
3863 Artificial Neural Network Approach for Inventory Management Problem
Authors: Govind Shay Sharma, Randhir Singh Baghel
Abstract:
The stock management of raw materials and finished goods is a significant issue for industries in fulfilling customer demand. Optimization of inventory strategies is crucial to enhancing customer service, reducing lead times and costs, and meeting market demand. This paper proposes an approach to predict the optimum stock level by utilizing past stock data and forecasting the required quantities. We utilize an Artificial Neural Network (ANN) to determine the optimal value. The objective of this paper is to discuss an optimized ANN that can find the best solution for the inventory model. In this context, the k-means algorithm is employed to create homogeneous groups of items; these groups exhibit similar characteristics or attributes that make them suitable for being managed under uniform inventory control policies. The paper proposes a method that uses a neural fitting algorithm to control the cost of inventory.
Keywords: Artificial Neural Network, inventory management, optimization, distributor center.
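The two steps described above can be sketched with scikit-learn stand-ins: k-means groups items with similar demand behaviour, and a small neural network (MLPRegressor standing in for the paper's neural-fit model) predicts next-period requirements from recent history. The data are synthetic placeholders.

```python
# Item grouping with k-means, then a small ANN forecast within one group.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# 30 items x 12 periods of past demand (synthetic)
demand = np.vstack([rng.normal(mu, 5.0, size=(10, 12))
                    for mu in (20.0, 60.0, 120.0)])

# Step 1: homogeneous item groups for uniform inventory policies.
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(demand)

# Step 2: within one group, predict the next-period requirement from the
# last three observations (sliding window).
items = demand[groups == groups[0]]
X = np.vstack([row[i:i + 3] for row in items for i in range(12 - 3)])
y = np.hstack([row[i + 3] for row in items for i in range(12 - 3)])

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                     random_state=0).fit(X, y)
print("predicted requirement:", round(model.predict(items[:1, -3:])[0], 1))
```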
3862 Estimation of Skew Angle in Binary Document Images Using Hough Transform
Authors: Nandini N., Srikanta Murthy K., G. Hemantha Kumar
Abstract:
This paper presents two novel techniques for skew estimation of binary document images. These algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate-and-thin approach, the selected characters are blocked and dilated to get word blocks, and thinning is then applied. The final image fed to the Hough transform contains the thinned coordinates of the word blocks in the image. The methods have been successful in reducing the computational complexity of Hough transform based skew estimation algorithms. Promising experimental results are also provided to prove the effectiveness of the proposed methods.
Keywords: Dilation, Document processing, Hough transform, Optical Character Recognition, Skew estimation, Thinning.
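The word-centroid idea can be sketched compactly: only the centroids vote in a (theta, rho) Hough accumulator, and the angle producing the strongest alignment is the skew estimate. The centroids below are generated synthetically from a known 3-degree skew instead of being extracted from a document image.

```python
# Hough-based skew estimation from word centroids (synthetic centroids).
import numpy as np

rng = np.random.default_rng(0)
true_skew = np.deg2rad(3.0)

# Synthetic word centroids on 5 text lines, rotated by the true skew.
xs = np.tile(np.linspace(50, 550, 20), 5)
ys = np.repeat(np.arange(5) * 40.0 + 100.0, 20)
x = xs * np.cos(true_skew) - ys * np.sin(true_skew)
y = xs * np.sin(true_skew) + ys * np.cos(true_skew) + rng.normal(0, 0.5, xs.size)

thetas = np.deg2rad(np.arange(-15.0, 15.01, 0.1))
rho_res = 2.0
best_theta, best_votes = 0.0, -1
for th in thetas:
    # Centroids on one text line share rho = -x*sin(th) + y*cos(th) at the true angle.
    rho = -x * np.sin(th) + y * np.cos(th)
    votes = np.bincount(((rho - rho.min()) / rho_res).astype(int)).max()
    if votes > best_votes:
        best_theta, best_votes = th, votes

print(f"estimated skew = {np.rad2deg(best_theta):.1f} degrees")   # ~3.0
```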
3861 A Materialized Approach to the Integration of XML Documents: the OSIX System
Authors: H. Ahmad, S. Kermanshahani, A. Simonet, M. Simonet
Abstract:
The data exchanged on the Web are of a different nature from those treated by classical database management systems; they are called semi-structured data, since they do not have the regular and static structure of data found in a relational database: their schema is dynamic and may contain missing data or types. Therefore, the need has arisen to develop further techniques and algorithms to exploit and integrate such data and to extract relevant information for the user. In this paper we present the OSIX system (Osiris based System for Integration of XML Sources). This system has a data warehouse model designed for the integration of semi-structured data and, more precisely, for the integration of XML documents. The architecture of OSIX relies on the Osiris system, a DL-based model designed for the representation and management of databases and knowledge bases. Osiris is a view-based data model whose indexing system supports semantic query optimization. We show that query processing on an XML source is optimized by the indexing approach proposed by Osiris.
Keywords: Data integration, semi-structured data, views, XML.
3860 Quality as an Approach to Organizational Change and Its Role in the Reorganization of Enterprises: Case of Four Moroccan Small and Medium-Sized Enterprises
Authors: A. Boudiaf
Abstract:
The purpose of this paper is to analyze and understand, through four case studies, the value of implementing a quality management system (QMS) at four Moroccan small and medium-sized enterprises (SMEs). Such a project can generate significant organizational change and improve the functioning of the organization. In fact, quality is becoming a necessity in the current business world and is considered a major component of companies’ competitive strategies. It should be noted that quality management is characterized by a set of methods and techniques that can be used to solve malfunctions and reorganize companies. It is useful to point out that the choice to adopt a quality approach can be influenced by the circumstances of the business context or derived from the company's strategic vision; this means that the choice can be characterized as either strategic or reactive. This is likely to have a major impact on the functioning of the QMS and also on the perception of the quality issue by company managers and their employees.
Keywords: Business context, organizational change, quality, reorganization.
3859 A Finite Difference Calculation Procedure for the Navier-Stokes Equations on a Staggered Curvilinear Grid
Authors: R. M. Barron, B. Zogheib
Abstract:
A new numerical method for solving the two-dimensional, steady, incompressible, viscous flow equations on a curvilinear staggered grid is presented in this paper. The proposed methodology is finite difference based, but essentially takes advantage of the best features of two well-established numerical formulations, the finite difference and finite volume methods. Some weaknesses of the finite difference approach are removed by exploiting the strengths of the finite volume method. In particular, the issue of velocity-pressure coupling is dealt with in the proposed finite difference formulation by developing a pressure correction equation in a manner similar to the SIMPLE approach commonly used in finite volume formulations. However, since this is purely a finite difference formulation, numerical approximation of fluxes is not required. Results obtained from the present method are based on the first-order upwind scheme for the convective terms, but the methodology can easily be modified to accommodate higher-order differencing schemes.
Keywords: Curvilinear, finite difference, finite volume, SIMPLE.
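For context, the standard SIMPLE-type pressure-correction relations are recalled below in their usual Cartesian staggered-grid form; the paper develops the analogous correction equation for the curvilinear finite difference formulation.

```latex
% Standard SIMPLE pressure correction (Cartesian form, for illustration).
p = p^* + p', \qquad u = u^* + u', \qquad v = v^* + v',
\\
u'_e \approx d_e\,(p'_P - p'_E), \qquad v'_n \approx d_n\,(p'_P - p'_N),
\\
a_P\,p'_P = a_E\,p'_E + a_W\,p'_W + a_N\,p'_N + a_S\,p'_S + b_P,
\qquad
b_P = -\big[(\rho u^* A)_e - (\rho u^* A)_w + (\rho v^* A)_n - (\rho v^* A)_s\big].
```

Substituting the velocity corrections into the discrete continuity equation yields the Poisson-type equation for p' above, whose source term b_P vanishes when the starred velocity field satisfies continuity.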
3858 High Speed Video Transmission for Telemedicine using ATM Technology
Authors: J. P. Dubois, H. M. Chiu
Abstract:
In this paper, we study the statistical multiplexing of VBR video in ATM networks. ATM promises to provide high-speed, real-time, multi-point-to-central video transmission for telemedicine applications in rural hospitals and in emergency medical services. Video coders are known to produce variable bit rate (VBR) signals, and the effects of aggregating these VBR signals need to be determined in order to design a telemedicine network infrastructure capable of carrying them. We first model the VBR video signal and simulate it using a generic continuous-data autoregressive (AR) scheme. We carry out the queueing analysis using the Fluid Approximation Model (FAM) and the Markov Modulated Poisson Process (MMPP). The study has shown a trade-off: multiplexing VBR signals reduces burstiness and improves resource utilization; however, the buffer size needs to be increased, with an associated economic cost. We also show that the MMPP model and the fluid approximation model fit best the cell region and the burst region, respectively. Therefore, a hybrid of MMPP and FAM completely characterizes the overall performance of the ATM statistical multiplexer. The ramifications of this technology are clear: speed, reliability (lower loss rate and jitter), and increased capacity in video transmission for telemedicine. With migration to full IP-based networks still a long way from achieving both high speed and high quality of service, the proposed ATM architecture will remain of significant use for telemedicine.
Keywords: ATM, multiplexing, queueing, telemedicine, VBR.
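The classic first-order autoregressive model for a VBR video source, and the burstiness/buffer trade-off discussed above, can be sketched as follows; the AR parameters, trunk capacity and buffer limit are illustrative rather than fitted to real video.

```python
# AR(1) VBR sources multiplexed into a simple discrete-time buffer.
import numpy as np

rng = np.random.default_rng(0)
N_SOURCES, N_FRAMES = 10, 5000
a, mean_rate, sigma = 0.88, 1.0, 0.25         # AR(1) parameters (illustrative)

rates = np.empty((N_SOURCES, N_FRAMES))
rates[:, 0] = mean_rate
for n in range(1, N_FRAMES):
    rates[:, n] = (a * rates[:, n - 1] + (1 - a) * mean_rate
                   + sigma * np.sqrt(1 - a * a) * rng.normal(size=N_SOURCES))
rates = np.clip(rates, 0.0, None)

aggregate = rates.sum(axis=0)                 # statistically multiplexed input
service = 1.05 * N_SOURCES * mean_rate        # trunk capacity (5% headroom)
buffer_limit = 20.0                           # buffering, in units of frame-rate

q, losses = 0.0, 0.0
for arr in aggregate:
    q = max(0.0, q + arr - service)           # fluid-style buffer recursion
    if q > buffer_limit:
        losses += q - buffer_limit
        q = buffer_limit
print(f"loss fraction = {losses / aggregate.sum():.4f}")
```

Increasing buffer_limit lowers the loss fraction at the cost of more buffering, which is the trade-off the abstract refers to.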
3857 Application of Process Approach to Evaluate the Information Security Risk and its Implementation in an Iranian Private Bank
Authors: Isa Nakhai Kamal Abadi, Esmaeel Saberi, Ehsan Mirjafari
Abstract:
Every organization is continually subject to new damages and threats, which can result from its operations or from the accomplishment of its goals. Methods of securing the workspace and the tools applied have changed widely with the increasing application and development of information technology (IT). From this viewpoint, information security management systems have evolved to build on, rather than repeat, previously experienced methods. In general, the correct response in information security management systems requires correct decision making, which in turn requires the comprehensive effort of managers and of everyone involved in each plan or decision. Obviously, not all aspects of a task or decision are defined in every decision-making situation; therefore, the possible or certain risks should be considered when making decisions. This is the subject of risk management, and it can influence the decisions. An investigation of the different approaches in the field of risk management demonstrates their progress from quantitative to qualitative methods with a process approach.
Keywords: Risk Management, Information Security, Methodology, Probability.
3856 An Evaluation of Barriers to Implement Reverse Logistics: A Case Study of Indian Fastener Industry
Authors: D. Garg, S. Luthra, A. Haleem
Abstract:
Reverse logistics (RL) is a systematic procedure that helps in reducing environmental hazards and maintaining business sustainability for industries. Industries in India are now opting for the adoption of RL techniques in their business. However, RL practices are not popular in Indian industries because of the many barriers to their successful implementation. Therefore, the need arises to identify and evaluate the barriers to implementing RL practices from an Indian industry perspective. A literature review approach and a case study approach have been adopted to identify the relevant barriers to implementing RL practices. Further, the Fuzzy Decision Making Trial and Evaluation Laboratory methodology has been used to evaluate the causal relationships among these barriers. Seven of the ten barriers have been categorized into the cause group and the remaining ones into the effect group. This research will help Indian industries to manage these barriers towards effectively implementing RL practices.
Keywords: Barriers, decision making trial and evaluation laboratory, fuzzy set theory, Indian industries, reverse logistics.
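The core (crisp) DEMATEL computation behind the cause/effect split is sketched below: normalize the direct-relation matrix, compute the total-relation matrix T = X(I - X)^{-1}, and classify factors by the sign of r - c. The 4x4 matrix is a made-up stand-in for the study's ten-barrier fuzzy matrix, whose fuzzy variant adds a defuzzification step first.

```python
# Crisp DEMATEL: total-relation matrix and cause/effect grouping.
import numpy as np

A = np.array([[0, 3, 2, 4],       # expert ratings of direct influence (0-4)
              [1, 0, 3, 2],
              [1, 2, 0, 3],
              [0, 1, 1, 0]], dtype=float)

X = A / A.sum(axis=1).max()                     # normalized direct-relation matrix
T = X @ np.linalg.inv(np.eye(4) - X)            # total-relation matrix

r = T.sum(axis=1)                                # influence given
c = T.sum(axis=0)                                # influence received
for i, (prom, rel) in enumerate(zip(r + c, r - c)):
    group = "cause" if rel > 0 else "effect"
    print(f"barrier {i + 1}: prominence={prom:.2f}, relation={rel:+.2f} ({group})")
```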
3855 An Intelligent Human-Computer Interaction System for Decision Support
Authors: Chee Siong Teh, Chee Peng Lim
Abstract:
This paper proposes a novel architecture for developing decision support systems. Unlike conventional decision support systems, the proposed architecture endeavors to reveal the decision-making process so that humans' subjectivity can be incorporated into a computerized system while, at the same time, preserving the capability of the computerized system to process information objectively. A number of techniques used in developing the decision support system are elaborated to make the decision-making process transparent. These include procedures for high-dimensional data visualization, pattern classification, prediction, and evolutionary computational search. An artificial data set is first employed to compare the proposed approach with other methods. A simulated handwritten data set and a real data set on liver disease diagnosis are then employed to evaluate the efficacy of the proposed approach. The results are analyzed and discussed. The potential of the proposed architecture as a useful decision support system is demonstrated.
Keywords: Interactive evolutionary computation, multivariate data projection, pattern classification, topographic map.