Search results for: Direct approach
4251 Preparing Project Managers to Achieve Project Success - Human Management Perspective
Authors: E. Muneera, A. Anuar, A. S. Zulkiflee
Abstract:
The evolution in project management was triggered by changes in management philosophy and practices in order to maintain competitive advantage and continuous success in the field. The purpose of this paper is to highlight the practicality of cognitive style and the unlearning approach in influencing the achievement of project success by project managers. It introduces the concepts of the planning, knowing and creating styles from the cognitive style field in the light of achieving time, cost, quality and stakeholders' appreciation in the project success context. It further discusses the unlearning approach as a moderator in enhancing the relationship between cognitive style and project success. The paper bases itself on a literature review from established disciplines such as psychology, sociology and philosophy regarding cognitive style, unlearning and project success in general. Through the analysis and synthesis of this literature, the conceptual paper provides a basis for future research to form a comprehensive framework for project managers to enhance project management competency.
Keywords: Cognitive Style, Project Managers, Project Success, Unlearning.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2030
4250 Genetic Algorithm Based Wavelength Division Multiplexing Networks Planning
Authors: S.Baskar, P.S.Ramkumar, R.Kesavan
Abstract:
This paper presents a new heuristic algorithm useful for long-term planning of survivable WDM networks. A multi-period model is formulated that combines network topology design and capacity expansion. The ability to determine network expansion schedules of this type becomes increasingly important to the telecommunications industry and to its customers. The solution technique consists of a Genetic Algorithm that allows generating several network alternatives for each time period simultaneously and shortest-path techniques to deduce from these alternatives a least-cost network expansion plan over all time periods. The multi-period planning approach is illustrated on a realistic network example. Extensive simulations on a wide range of problem instances are carried out to assess the cost savings that can be expected by choosing a multi-period planning approach instead of an iterative network expansion design method.
Keywords: Wavelength Division Multiplexing, Genetic Algorithm, Network topology, Multi-period reliable network planning.
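The GA-based topology planning described in this abstract can be illustrated with a toy sketch (an editorial illustration, not the authors' code): a hypothetical 5-node network, binary link-selection chromosomes, and a connectivity penalty standing in for the paper's survivability and multi-period constraints.

```python
import random

# Hypothetical 5-node network: candidate links as (u, v, installation cost).
NODES = range(5)
LINKS = [(0, 1, 4), (0, 2, 3), (1, 2, 2), (1, 3, 5),
         (2, 3, 3), (2, 4, 6), (3, 4, 2), (0, 4, 9)]

def connected(chosen):
    """Check that the selected links span all nodes (simple graph search)."""
    adj = {n: [] for n in NODES}
    for (u, v, _), used in zip(LINKS, chosen):
        if used:
            adj[u].append(v)
            adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        for m in adj[stack.pop()]:
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return len(seen) == len(NODES)

def fitness(chosen):
    """Total link cost, heavily penalised if the network is disconnected."""
    cost = sum(c for (_, _, c), used in zip(LINKS, chosen) if used)
    return cost + (0 if connected(chosen) else 1000)

def evolve(pop_size=30, generations=60, seed=1):
    """Tiny GA: elitist selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in LINKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(LINKS))
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:          # occasional mutation
                i = rng.randrange(len(LINKS))
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    best = min(pop, key=fitness)
    return best, fitness(best)
```

The penalty term is one common way to keep infeasible (disconnected) topologies in the population while steering the search toward feasible ones; the real planner would evaluate one chromosome per time period.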
4249 An Efficient Graph Query Algorithm Based on Important Vertices and Decision Features
Authors: Xiantong Li, Jianzhong Li
Abstract:
Graphs have become increasingly important in modeling complicated structures and schemaless data such as proteins, chemical compounds, and XML documents. Given a graph query, it is desirable to retrieve graphs quickly from a large database via graph-based indices. Different from the existing methods, our approach, called VFM (Vertex to Frequent Feature Mapping), makes use of vertices and decision features as the basic indexing features. VFM constructs two mappings between vertices and frequent features to answer graph queries. The VFM approach not only provides an elegant solution to the graph indexing problem, but also demonstrates how database indexing and query processing can benefit from data mining, especially frequent pattern mining. The results show that the proposed method not only avoids enumerating the subgraphs of the query graph, but also effectively reduces the subgraph isomorphism tests between the query graph and the graphs in the candidate answer set in the verification stage.
Keywords: Decision Feature, Frequent Feature, Graph Dataset, Graph Query.
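The filtering idea behind feature-based graph indexing can be sketched as follows (an editorial illustration under simplifying assumptions: labelled edges stand in for VFM's frequent features, and the graph names are invented). A database graph survives filtering only if it contains every feature of the query; only survivors need the expensive subgraph isomorphism verification.

```python
# Each "graph" is summarised by the set of features it contains (here,
# labelled edge pairs stand in for VFM's mined frequent features).
DATABASE = {
    "g1": {("C", "C"), ("C", "O")},
    "g2": {("C", "C"), ("C", "N"), ("C", "O")},
    "g3": {("C", "N")},
}

def build_index(db):
    """Inverted index: feature -> set of graph ids containing it."""
    index = {}
    for gid, feats in db.items():
        for f in feats:
            index.setdefault(f, set()).add(gid)
    return index

def candidates(query_feats, index, db):
    """Graphs containing every query feature; only these candidates
    proceed to the subgraph isomorphism verification stage."""
    result = set(db)
    for f in query_feats:
        result &= index.get(f, set())
    return result
```

The point of the index is that set intersections over the inverted lists are far cheaper than isomorphism tests against every graph in the database.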
4248 A Study of Growth Factors on Sustainable Manufacturing in Small and Medium-Sized Enterprises: Case Study of Japan Manufacturing
Authors: Tadayuki Kyoutani, Shigeyuki Haruyama, Ken Kaminishi, Zefry Darmawan
Abstract:
Japan’s semiconductor industries have developed greatly in recent years. Many started as Small and Medium-sized Enterprises (SMEs) that found themselves in favourable circumstances and have now become prosperous industries worldwide. Sustainable growth factors that support the creation of spirit value inside Japanese companies were strongly embedded through performance, but those factors were not clearly defined in each company. A series of literature studies was conducted to explore, through quantitative text mining, the definition of sustainable growth factors. Sustainability criteria were developed from previous research to verify the definition of the factors. A typical framework is proposed as a systematic approach to developing sustainable growth factors in a specific company. A review of the approach over a certain period shows that the factors influencing sustainable growth are important for the company to achieve its goal.
Keywords: SME, manufacture, sustainable, growth factor.
4247 Incineration of Sludge in a Fluidized-Bed Combustor
Authors: Chien-Song Chyang, Yu-Chi Wang
Abstract:
For sludge disposal, incineration is considered to be better than direct burial because of regulations and space limitations in Taiwan. Additionally, burial after incineration can effectively prolong the lifespan of a landfill. Therefore, it is at present the most satisfactory method for treating sludge. Of the various incineration technologies, the fluidized bed incinerator is a suitable choice due to its fuel flexibility. In this work, sludge generated from industrial plants was treated in a pilot-scale vortexing fluidized bed. The moisture content of the sludge was 48.53%, and its LHV was 454.6 kcal/kg. Primary gas and secondary gas were fixed at 3 Nm3/min and 1 Nm3/min, respectively. Diesel burners with on-off controllers were used to control the temperature; the bed temperature was set to 750±20 °C, and the freeboard temperature to 850±20 °C. The experimental data show that the NO emission increased with bed temperature. The maximum NO emission is 139 ppm, which complies with the regulation. The CO emission remained below 100 ppm throughout the operation period. The mean particle size of the fly ash collected from the baghouse decreased with operating time. The ratio of bottom ash to fly ash is about 3. Compared with the bottom ash, the potassium content in the fly ash is much higher, which implies that potassium content is not the key factor in the aggregation of bottom ash.
Keywords: Sludge incineration, fluidized bed combustion, fly ash, bottom ash.
4246 Creating Customer Value through SOA and Outsourcing: A NEBIC Approach
Authors: Benazeer Md. Shahzada, Verelst Jan, Van Grembergen Wim, Mannaert Herwig
Abstract:
This article is an extension and practical application of Wheeler's NEBIC theory (Net Enabled Business Innovation Cycle). NEBIC theory is a new approach in IS research and can be used in dynamic environments related to new technology. Firms can follow market changes rapidly with the support of IT resources. Flexible firms adapt their market strategies and respond more quickly to customers' changing behaviors. When every leading firm in an industry has access to the same IT resources, the way these IT resources are managed will determine the competitive advantages or disadvantages of a firm. From the dynamic capabilities perspective and from Wheeler's newly introduced NEBIC theory, we know that IT resources alone cannot deliver customer value, but a good configuration of those resources can guarantee customer value by choosing the right emerging technology and grasping the right economic opportunities through business innovation and growth. We found evidence in the literature that Service Oriented Architecture (SOA) is a promising emerging technology which can deliver the desired economic opportunity through modularity, flexibility and loose coupling. SOA can also help firms connect in networks, which can open a new window of opportunity to collaborate in innovation and in the right kind of outsourcing. Many articles and research reports indicate that the failure rate in outsourcing is very high, but at the same time research indicates that successful outsourcing projects add tangible and intangible benefits to the service consumer. Business executives and policy makers in the West should not be afraid of outsourcing, but should choose the right strategy, through the use of emerging technology, to significantly reduce the failure rate in outsourcing.
Keywords: Absorptive capacity, Dynamic Capability, Net-enabled business innovation cycle, Service oriented architecture.
4245 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach
Authors: D. Tedesco, G. Feletti, P. Trucco
Abstract:
The present study aims to develop a Decision Support System (DSS) to support operational decisions in Emergency Medical Service (EMS) systems regarding the assignment of medical emergency requests to Emergency Departments (ED). This problem is called “hospital selection” and concerns the definition of policies for the selection of the ED to which patients who require further treatment are transported by ambulance. The employed research methodology consists of a first phase reviewing the technical-scientific literature on DSSs that support EMS management and, in particular, the hospital selection decision. From the literature analysis, it emerged that current studies mainly focus on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a mission. Therefore, all ED-related issues are excluded and considered part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes the travel time and frees up the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that also considers the expected time performance in the subsequent phases of the process, such as the case mix, the expected service throughput times, and the operational capacity of the different EDs. To this end, a Discrete Event Simulation (DES) model was created to compare different hospital selection policies. The model was implemented with the AnyLogic software and validated on a realistic case. The hospital selection policy that returned the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the ED until the beginning of the clinical evaluation by the doctor.
Finally, two approaches were compared: a static approach, based on a retrospective estimation of the TTP, and a dynamic approach, based on a predictive estimation of the TTP determined with a constantly updated Winters forecasting model. Findings reveal that minimizing the TTP is the best hospital selection policy: it significantly reduces service throughput times in the ED with a negligible increase in travel time. Furthermore, it produces an immediate view of the saturation state of the EDs and considers the case mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. Besides, a predictive approach is certainly more reliable for TTP estimation than a retrospective one. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
Keywords: Emergency medical services, hospital selection, discrete event simulation, forecast model.
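The dynamic policy above relies on a constantly updated Winters (triple exponential smoothing) forecast of the TTP. A compact additive Holt-Winters sketch, with invented TTP values in minutes and illustrative smoothing constants (an editorial illustration, not the authors' model):

```python
def holt_winters(series, season, alpha=0.4, beta=0.1, gamma=0.3):
    """Additive Holt-Winters (level, trend, seasonal components);
    returns a one-step-ahead forecast. Requires len(series) >= 2*season."""
    level = sum(series[:season]) / season
    trend = (sum(series[season:2 * season]) - sum(series[:season])) / season ** 2
    seasonals = [series[i] - level for i in range(season)]
    for i, x in enumerate(series):
        last_level = level
        s = seasonals[i % season]
        level = alpha * (x - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonals[i % season] = gamma * (x - level) + (1 - gamma) * s
    return level + trend + seasonals[len(series) % season]
```

Re-running the fit as each new observation arrives gives the "constantly updated" predictive TTP estimate the abstract describes.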
4244 Countercurrent Flow Simulation of Gas-Solid System in a Purge Column Using Computational Fluid Dynamics Techniques
Authors: T. J. Jamaleddine
Abstract:
Purge columns or degasser vessels are widely used in the polyolefin process for removing trapped hydrocarbons and excess catalyst residues from the polymer particles. A uniform distribution of purged gases coupled with a plug-flow characteristic inside the column system is desirable to obtain optimum desorption of trapped hydrocarbons and catalyst residues. The Computational Fluid Dynamics (CFD) approach is a promising tool for design optimization of these vessels. The success of this approach is profoundly dependent on the solution strategy and the choice of geometrical layout at the vessel outlet. Filling the column with solids and initially solving for the solids flow minimized numerical diffusion substantially. Adopting a cylindrical configuration at the vessel outlet resulted in less numerical instability and resembled the hydrodynamic flow of solids in the hopper segment reasonably well.
Keywords: CFD, gas-solids flow, gas purging, species transport, purge column, degasser vessel.
4243 Semi-automatic Background Detection in Microscopic Images
Authors: Alessandro Bevilacqua, Alessandro Gherardi, Ludovico Carozza, Filippo Piccinini
Abstract:
Recent years have seen an increasing use of image analysis techniques in the field of biomedical imaging, in particular in microscopic imaging. The basic step for most image analysis techniques relies on a background image free of objects of interest, whether cells or histological samples, to perform further analysis such as segmentation or mosaicing. Commonly, this image consists of an empty field acquired in advance. However, acquiring an empty field is often not feasible. Alternatively, the empty field could differ from the background region of the sample actually being studied, because of the interaction with the organic matter. Finally, it could be expensive, for instance in the case of live cell analyses. We propose a non-parametric, general-purpose approach in which the background is built automatically from a sequence of images that may even contain objects of interest. The amount of object-free area in each image only affects the overall speed of obtaining the background. Experiments with different kinds of microscopic images prove the effectiveness of our approach.
Keywords: Microscopy, flat field correction, background estimation, image segmentation.
4242 Face Recognition Based On Vector Quantization Using Fuzzy Neuro Clustering
Authors: Elizabeth B. Varghese, M. Wilscy
Abstract:
A face recognition system is a computer application for automatically identifying or verifying a person from a digital image or a video frame. Many algorithms have been proposed for face recognition. Vector Quantization (VQ) based face recognition is a novel approach to face recognition. Here a new codebook generation for VQ-based face recognition using Integrated Adaptive Fuzzy Clustering (IAFC) is proposed. IAFC is a fuzzy neural network which incorporates a fuzzy learning rule into a competitive neural network. The performance of the proposed algorithm is demonstrated using the publicly available AT&T database, Yale database, Indian Face database and a small face database, the DCSKU database, created in our lab. In all the databases the proposed approach achieved a higher recognition rate than most of the existing methods. In terms of Equal Error Rate (EER), the proposed codebook is also better than the existing methods.
Keywords: Face Recognition, Vector Quantization, Integrated Adaptive Fuzzy Clustering, Self Organization Map.
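A VQ-based recognizer of the kind described above decides by quantization distortion: the probe's feature vectors are matched against each enrolled person's codebook, and the identity whose codebook reproduces the probe with the least mean distortion wins. A minimal sketch (toy 2-D features and invented codebooks; the paper's IAFC codebook generation is not reproduced here):

```python
import math

def quantization_error(vectors, codebook):
    """Mean Euclidean distance from each feature vector to its
    nearest codeword in the codebook."""
    total = 0.0
    for v in vectors:
        total += min(math.dist(v, c) for c in codebook)
    return total / len(vectors)

def identify(probe_vectors, codebooks):
    """Return the enrolled identity whose codebook quantizes the
    probe's feature vectors with the smallest distortion."""
    return min(codebooks,
               key=lambda who: quantization_error(probe_vectors, codebooks[who]))
```

In the paper, the codebooks themselves would come from IAFC clustering of training-face feature vectors rather than being specified by hand.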
4241 Capacitor Placement in Radial Distribution System for Loss Reduction Using Artificial Bee Colony Algorithm
Authors: R. Srinivasa Rao
Abstract:
This paper presents a new method which applies an artificial bee colony (ABC) algorithm for capacitor placement in distribution systems with the objective of improving the voltage profile and reducing power loss. The ABC algorithm is a new population-based metaheuristic approach inspired by the intelligent foraging behavior of honeybee swarms. One advantage of the ABC algorithm is that it does not require external parameters such as crossover rate and mutation rate, as in the case of genetic algorithms and differential evolution, which are hard to determine a priori. The other advantage is that the global search ability of the algorithm is implemented by introducing a neighborhood source production mechanism similar to the mutation process. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on a 69-bus system and the results are compared with another approach available in the literature. The proposed method has outperformed the other methods in terms of solution quality and computational efficiency.
Keywords: Distribution system, Capacitor Placement, Loss reduction, Artificial Bee Colony Algorithm.
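The ABC metaheuristic summarized above can be sketched generically (an editorial illustration, not the authors' implementation): employed bees refine food sources, onlookers favor good sources proportionally to fitness, and scouts abandon stale ones. A toy quadratic stands in for the power-loss objective, and the sketch assumes a non-negative objective for the onlooker probabilities.

```python
import random

def abc_minimize(f, bounds, n_food=10, limit=20, iters=100, seed=3):
    """Compact artificial bee colony for a continuous objective f >= 0."""
    rng = random.Random(seed)
    dim = len(bounds)
    def rand_food():
        return [rng.uniform(lo, hi) for lo, hi in bounds]
    foods = [rand_food() for _ in range(n_food)]
    fits = [f(x) for x in foods]
    trials = [0] * n_food
    def neighbour(i):
        # perturb one coordinate toward/away from a random other source
        k, j = rng.randrange(n_food), rng.randrange(dim)
        x = foods[i][:]
        x[j] += rng.uniform(-1, 1) * (x[j] - foods[k][j])
        lo, hi = bounds[j]
        x[j] = min(max(x[j], lo), hi)
        return x
    def try_improve(i):
        cand = neighbour(i)
        fc = f(cand)
        if fc < fits[i]:
            foods[i], fits[i], trials[i] = cand, fc, 0
        else:
            trials[i] += 1
    for _ in range(iters):
        for i in range(n_food):                 # employed bee phase
            try_improve(i)
        total = sum(1 / (1 + ft) for ft in fits)
        for _ in range(n_food):                 # onlooker phase (roulette)
            r, acc, i = rng.uniform(0, total), 0.0, 0
            for i, ft in enumerate(fits):
                acc += 1 / (1 + ft)
                if acc >= r:
                    break
            try_improve(i)
        for i in range(n_food):                 # scout phase
            if trials[i] > limit:
                foods[i] = rand_food()
                fits[i], trials[i] = f(foods[i]), 0
    best = min(range(n_food), key=lambda i: fits[i])
    return foods[best], fits[best]
```

In the paper's setting, the decision vector would encode capacitor sizes/locations and f would be the load-flow power loss plus voltage-profile penalties.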
4240 An Application of the Sinc-Collocation Method to a Three-Dimensional Oceanography Model
Authors: Y. Mohseniahouei, K. Abdella, M. Pollanen
Abstract:
In this paper, we explore the applicability of the Sinc-Collocation method to a three-dimensional (3D) oceanography model. The model describes a wind-driven current with depth-dependent eddy viscosity in the complex-velocity system. In general, Sinc-based methods excel over other traditional numerical methods due to their exponentially decaying errors, rapid convergence, and handling of problems in the presence of singularities at end-points. Together with these advantages, the Sinc-Collocation approach that we utilize exploits first-derivative interpolation, whose integration is much less sensitive to numerical errors. We present several model problems to demonstrate the accuracy, stability, and computational efficiency of the method. The approximate solutions determined by the Sinc-Collocation technique are compared to exact solutions and to those obtained by the Sinc-Galerkin approach in earlier studies. Our findings indicate that the Sinc-Collocation method outperforms other Sinc-based methods from past studies.
Keywords: Boundary Value Problems, Differential Equations, Sinc Numerical Methods, Wind-Driven Currents.
4239 Modelling of Heating and Evaporation of Biodiesel Fuel Droplets
Authors: Mansour Al Qubeissi, Sergei S. Sazhin, Cyril Crua, Morgan R. Heikal
Abstract:
This paper presents the application of the Discrete Component Model for heating and evaporation to multi-component biodiesel fuel droplets in direct injection internal combustion engines. This model takes into account the effects of temperature gradient, recirculation and species diffusion inside droplets. A distinctive feature of the model used in the analysis is that it is based on the analytical solutions to the temperature and species diffusion equations inside the droplets. Nineteen types of biodiesel fuels are considered. It is shown that a simplistic model, based on the approximation of biodiesel fuel by a single component or ignoring the diffusion of components of biodiesel fuel, leads to noticeable errors in predicted droplet evaporation time and time evolution of droplet surface temperature and radius.
Keywords: Heat/Mass Transfer, Biodiesel, Multi-component Fuel, Droplet, Evaporation.
4238 Diagnosis on Environmental Impacts of Tourism at Caju Beach in Palmas, Tocantins, Brazil
Authors: Mary L. G. S. Senna, Veruska, C. Dutra, Jr., Keity L. F. Oliveira, Patrícia A. Santos, Alana C. M. Santana
Abstract:
Environmental impacts are changes in the physical, chemical or biological properties of natural areas that are most often caused by human actions on the environment and which have consequences for human health, society and the elements of nature. The identification of environmental impacts is important so that they can be mitigated and, above all, so that mitigating measures are applied in the area. This work aims to identify the environmental impacts generated in the Praia do Caju area in the city of Palmas, Brazil, and to show that the lack of structure on the beach intensifies those impacts. The work was carried out using exploratory, descriptive and quantitative research, applying a matrix of environmental impacts based on direct observation and registration. The study took place during the holidays from August to December 2016, with photographic records of the impacts. From the collected data it was possible to verify that Caju beach suffers constant degradation due to irregular deposition.
Keywords: Leisure, tourism, environmental impacts, Brazil.
4237 Voltage-Controllable Liquid Crystals Lens
Authors: Wen-Chi Hung, Tung-Kai Liu, Ming-Shan Tsai, Chun-Che Lee, I-Min Jiang
Abstract:
This study investigates a voltage-controllable liquid crystal lens with a Fresnel zone electrode. When a proper voltage is applied to the liquid crystal cell, a Fresnel-zone-distributed electric field is induced that directs the liquid crystals to align in a concentric structure. Owing to the concentrically aligned liquid crystals, a Fresnel lens is formed. We probe the Fresnel liquid crystal lens using a polarized incident beam with a wavelength of 632.8 nm, finding that the diffraction efficiency depends on the applied voltage. A remarkable diffraction efficiency of ~39.5% is measured at a voltage of 0.9 V. Additionally, a dual-focus lens is fabricated by attaching a plano-convex lens to the Fresnel liquid crystal cell. The Fresnel LC lens and the dual-focus lens may be applied in DVD/CD pick-up heads, confocal microscopy systems, or electrically controlled optical systems.
Keywords: Liquid Crystals Lens, Fresnel Lens, Dual focus.
4236 Mechanical Design and Theoretical Analysis of a Skip-Cycle Mechanism for an Internal Combustion Engine
Authors: Ismail Gerzeli, Cemal Baykara, Osman Akin Kutlar
Abstract:
Skip cycle is a working strategy for spark ignition engines which allows changing the effective stroke of an engine by skipping some of the four-stroke cycles. This study proposes a new mechanism to achieve the desired skip-cycle strategy for internal combustion engines. The air and fuel leakage which occurs during gas exchange negatively affects the efficiency of the engine at high speeds and loads. Absolute sealing is assured by the direct use of poppet valves, which are kept fully closed during the skipped mode. All components of the mechanism were designed according to the real dimensions of Anadolu Motor's gasoline engine and modeled in 3D by means of CAD software. As the mechanism operates in two modes, two dynamically equivalent models are established to obtain the force and strength analysis of the critical components.
Keywords: Dynamic Model, Mechanical Design, Skip Cycle System (SCS), Valve Disabling Mechanism.
4235 An Efficient Hardware Implementation of Extended and Fast Physical Addressing in Microprocessor-Based Systems Using Programmable Logic
Authors: Mountassar Maamoun, Abdelhamid Meraghni, Abdelhalim Benbelkacem, Daoud Berkani
Abstract:
This paper describes an efficient hardware implementation of a new technique for interfacing the data exchange between microprocessor-based systems and external devices. This technique, based on the use of a software/hardware system and a reduced physical address, enlarges the interfacing capacity of microprocessor-based systems, uses Direct Memory Access (DMA) to increase the frequency of the new bus, and improves the speed of data exchange. When this architecture is used in a microprocessor-based system or in a computer, the input of the hardware part of our system is connected to the system bus, and the output, which is a new bus, is connected to an external device. The new bus is composed of a data bus, a control bus and an address bus. A Xilinx Integrated Software Environment (ISE) 7.1i has been used for the programmable logic implementation.
Keywords: Interfacing, Software/hardware System, CPLD, programmable logic, DMA.
4234 Flexible Development and Calculation of Contract Logistics Services
Authors: T. Spiegel, J. Siegmann, C. F. Durach
Abstract:
Challenges resulting from an international and dynamic business environment are increasingly being passed on from manufacturing companies to external service providers. Especially providers of complex, customer-specific industry services have to cope with continuously changing requirements. This is particularly true for contract logistics service providers. They are forced to develop efficient and highly flexible structures and strategies to meet their customers' needs. One core element they have to focus on is the reorganization of their service development and sales process. Based on an action research approach, this study develops and tests a concept to streamline tender management for contract logistics service providers. The concept of modularized service architecture is deployed in order to derive a practice-oriented approach for the modularization of complex service portfolios and the design of customized quotes. These findings are evaluated regarding their applicability in other service sectors, and practical recommendations are given.
Keywords: Contract Logistics, Modularization, Service Development, Tender Management.
4233 Competence-Based Human Resources Selection and Training: Making Decisions
Authors: O. Starineca, I. Voronchuk
Abstract:
Human Resources (HR) selection and training have various implementation possibilities depending on an organization's abilities and peculiarities. We propose to base HR selection and training decisions on a competence-based approach. HR selection and training of employees are topical issues, as there is room for improvement in this field; therefore, the aim of the research is to propose rational decision-making approaches for an organization's HR selection and training choices. Our proposals are based on the training development and competence-based selection approaches created within previous research, i.e., the Analytic Hierarchy Process (AHP) and Linear Programming. A literature review on non-formal education, competence-based selection, and AHP forms our theoretical background. Some educational service providers in Latvia offer employee training, e.g. motivation, computer skills, accounting, law, ethics, stress management, etc., that is topical for Public Administration. A competence-based approach is a rational basis for decision-making in both HR selection and HR training.
Keywords: Competence-based selection, human resource, training, decision-making.
4232 Manufacturing Dispersions Based Simulation and Synthesis of Design Tolerances
Authors: Nassima Cheikh, Abdelmadjid Cheikh, Said Hamou
Abstract:
The objective of this work, which is based on the simultaneous engineering approach, is to contribute to the development of a CIM tool for the synthesis of functional design dimensions expressed by average values and tolerance intervals. In this paper, the dispersions method, known as the Δl method, which proved reliable in the simulation of manufacturing dimensions, is used to develop a methodology for the automation of the simulation. This methodology is constructed around three procedures. The first procedure verifies the functional requirements by automatically extracting the functional dimension chains in the mechanical sub-assembly. A second procedure then performs an optimization of the dispersions on the basis of unknown variables. The third procedure uses the optimized values of the dispersions to compute the optimized average values and tolerances of the functional dimensions in the chains. A statistical and cost-based approach is integrated into the methodology in order to take account of the capabilities of the manufacturing processes and to distribute optimal values among the individual components of the chains.
Keywords: functional tolerances, manufacturing dispersions, simulation, CIM.
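The accumulation of individual manufacturing dispersions along a dimension chain, which is central to the Δl method mentioned above, can be illustrated with invented values: worst-case summation versus the statistical root-sum-square distribution referenced in the abstract.

```python
import math

def stack_up(dispersions):
    """Accumulate the individual dispersions Δl_i of a dimension chain:
    worst-case (arithmetic sum) and statistical (root-sum-square)."""
    worst = sum(dispersions)
    rss = math.sqrt(sum(d * d for d in dispersions))
    return worst, rss
```

The statistical figure is smaller because independent dispersions rarely all reach their extremes simultaneously, which is what allows wider, cheaper tolerances to be distributed among the chain's components.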
4231 Design Based Performance Prediction of Component Based Software Products
Authors: K. S. Jasmine, R. Vasantha
Abstract:
Component-based software engineering provides an opportunity for better quality and increased productivity in software development by using reusable software components [10]. One of the most critical aspects of the quality of a software system is its performance. The systematic application of software performance engineering techniques throughout the development process can help to identify design alternatives that preserve desirable qualities such as extensibility and reusability while meeting performance objectives [1]. In the present scenario, software engineering methodologies strongly focus on the functionality of the system, while applying a “fix-it-later” approach to software performance aspects [3]. As a result, lengthy fine-tuning, expensive extra hardware, or even redesigns are necessary for the system to meet the performance requirements. In this paper, we propose a design-based, implementation-independent performance prediction approach to reduce the overhead associated with the later phases of developing a performance-guaranteed software product, with the help of the Unified Modeling Language (UML).
Keywords: Software Reuse, Component-based development, Unified Modeling Language, Software performance, Software components, Performance engineering, Software engineering.
4230 Artificial Neural Network Approach for Inventory Management Problem
Authors: Govind Shay Sharma, Randhir Singh Baghel
Abstract:
The stock management of raw materials and finished goods is a significant issue for industries in fulfilling customer demand. Optimization of inventory strategies is crucial to enhancing customer service, reducing lead times and costs, and meeting market demand. This paper suggests an approach to predict the optimum stock level by utilizing past stock data and forecasting the required quantities. We utilize an Artificial Neural Network (ANN) to determine the optimal value; the objective of this paper is to discuss an optimized ANN that can find the best solution for the inventory model. The k-means algorithm is employed to create homogeneous groups of items; these groups exhibit similar characteristics or attributes that make them suitable for being managed under uniform inventory control policies. The paper proposes a method that uses the neural fit algorithm to control the cost of inventory.
Keywords: Artificial Neural Network, inventory management, optimization, distributor center.
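The k-means grouping step described in the abstract can be sketched in a few lines of plain Python. The item features (mean demand, demand variability) and the data below are hypothetical illustrations, and the paper's ANN forecasting stage is not reproduced here.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Minimal k-means: group items into k clusters by feature similarity."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        # Assign each item to its nearest centroid (squared Euclidean distance).
        assign = [min(range(k),
                      key=lambda c: sum((p - q) ** 2
                                        for p, q in zip(pt, centroids[c])))
                  for pt in points]
        # Recompute each centroid as the mean of its assigned items.
        for c in range(k):
            members = [pt for pt, a in zip(points, assign) if a == c]
            if members:
                centroids[c] = tuple(sum(d) / len(members)
                                     for d in zip(*members))
    return assign, centroids

# Hypothetical items described by (mean monthly demand, demand variability):
# three slow movers and three fast movers.
items = [(10, 2), (12, 3), (11, 2), (90, 20), (95, 25), (100, 22)]
groups, centers = kmeans(items, k=2)
```

Each resulting group could then be handled by one uniform inventory control policy, as the abstract suggests.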
4229 Finite Element Approach to Evaluate Time Dependent Shear Behavior of Connections in Hybrid Steel-PC Girder under Sustained Loading
Authors: Mohammad Najmol Haque, Takeshi Maki, Jun Sasaki
Abstract:
Headed stud shear connections are widely used in the junction or embedded zone of hybrid girders to achieve full composite action with continuity, sustaining the tensile and shear forces at the steel-concrete interface. In Japan, hybrid girders are designed to the Japan Road Association (JRA) specifications, which assume a much lower stud capacity than the American Institute of Steel Construction (AISC) specifications, the Japan Society of Civil Engineers (JSCE) specifications, or the Eurocode. Because such a low design shear strength is adopted, the time-dependent shear behavior of the connections under sustained external loading is not considered in design, and indeed has not been fully studied. In this study, a finite element approach was used to evaluate the time-dependent shear behavior of the headed studs used as connections at the junction. The study clarifies how sustained loading distinctly changes the interfacial shear of the connections over time, with sensitivity to the loading history, the positions of the flanges, neighboring studs, the positions of the prestressing and reinforcing bars, the concrete strength, etc., and it identifies a shear influence area. Stud strength was also confirmed through push-out tests. The outcomes may provide an important basis and reference data for designing connections of hybrid girders with enhanced stud capacity, with due consideration of their long-term shear behavior.
Keywords: Finite element approach, hybrid girder, headed stud shear connections, sustained loading, time dependent shear behavior.
4228 Banks Profitability Indicators in CEE Countries
Abstract:
The aim of the present article is to determine the impact of external and internal factors of bank performance on the profitability indicators of CEE countries' banks in the period from 2006 to 2012. Building on research conducted abroad on bank and macroeconomic profitability indicators, the authors evaluated the return on average assets (ROAA) and return on average equity (ROAE) indicators of the CEE countries' banks. The authors analyzed the profitability indicators using descriptive methods, SPSS data analysis methods, and correlation and linear regression analysis. The authors conclude that most internal and external indicators of bank performance have no direct influence on the profitability of banks in the CEE countries. The only exceptions are credit risk and bank size, which affect one of the measures of bank profitability, return on average equity.
Keywords: Banks, CEE countries, profitability, ROAA, ROAE.
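The correlation and regression analysis the authors describe can be illustrated with a minimal ordinary-least-squares sketch in plain Python. The bank-level figures below are invented for illustration and do not come from the study.

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ols(x, y):
    """Simple linear regression y = a + b*x fitted by least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((a - mx) * (c - my) for a, c in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
    return my - b * mx, b  # intercept, slope

# Hypothetical bank-level data: size (log assets) against ROAE (%).
size = [8.0, 8.5, 9.0, 9.5, 10.0]
roae = [4.1, 5.0, 6.2, 6.9, 8.0]
r = pearson(size, roae)
intercept, slope = ols(size, roae)
```

A positive slope here would be read as the abstract reads it: larger banks in the sample earn a higher return on average equity.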
4227 Novel Direct Flux and Torque Control of Optimally Designed 6 Phase Reluctance Machine with Special Current Waveform
Authors: E. T. Rakgati, E. Matlotse
Abstract:
In this paper, the principle, basic torque theory, and design optimisation of a six-phase reluctance dc machine are considered. A trapezoidal phase current waveform for the machine drive is proposed and evaluated to minimise ripple torque. Low-cost, normally laminated salient-pole rotors with and without slits and chamfered poles are investigated. The six-phase machine is optimised in multiple dimensions by linking the finite-element analysis method directly with an optimisation algorithm; the objective function is to maximise the torque per copper losses of the machine. The armature reaction effect is investigated in detail and found to be severe. The measured and calculated torque performances of a 35 kW optimally designed six-phase reluctance dc machine drive are presented.
Keywords: Reluctance dc machine, current waveform, design optimisation, finite element analysis, armature reaction effect.
4226 Estimation of Skew Angle in Binary Document Images Using Hough Transform
Authors: Nandini N., Srikanta Murthy K., G. Hemantha Kumar
Abstract:
This paper presents two novel techniques for skew estimation of binary document images. The algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate-and-thin approach, the selected characters are blocked and dilated to obtain word blocks, and thinning is then applied; the final image fed to the Hough transform contains the thinned coordinates of the word blocks. Both methods succeed in reducing the computational complexity of Hough-transform-based skew estimation algorithms. Promising experimental results are provided to demonstrate the effectiveness of the proposed methods.
Keywords: Dilation, document processing, Hough transform, optical character recognition, skew estimation, thinning.
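The voting idea behind the word-centroid approach can be sketched briefly: at the true skew angle, centroids belonging to the same text line collapse into the same distance bin of the accumulator, producing a sharp peak. The angle range, bin width, and synthetic centroids below are assumptions for illustration, not the paper's parameters.

```python
import math
from collections import Counter

def estimate_skew(centroids, angle_range=5.0, step=0.1, rho_bin=2.0):
    """Hough-style vote over candidate skew angles (in degrees)."""
    best_angle, best_peak = 0.0, -1
    a = -angle_range
    while a <= angle_range + 1e-9:
        t = math.radians(a)
        # Perpendicular distance of each centroid from a line tilted by `a`;
        # collinear centroids share one quantized rho bin at the right angle.
        votes = Counter(round((y * math.cos(t) - x * math.sin(t)) / rho_bin)
                        for x, y in centroids)
        peak = max(votes.values())
        if peak > best_peak:
            best_peak, best_angle = peak, a
        a += step
    return best_angle

# Synthetic word centroids: two text lines skewed by +2 degrees.
t2 = math.radians(2.0)
centroids = [(x, x * math.tan(t2) + line)
             for line in (0, 40) for x in range(0, 400, 25)]
angle = estimate_skew(centroids)
```

Feeding only word centroids (a few dozen points) into the accumulator, rather than every foreground pixel, is precisely the data reduction the abstract emphasizes.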
4225 A Materialized Approach to the Integration of XML Documents: the OSIX System
Authors: H. Ahmad, S. Kermanshahani, A. Simonet, M. Simonet
Abstract:
The data exchanged on the Web differ in nature from the data handled by classical database management systems; they are called semi-structured data because they lack the regular, static structure of data in a relational database: their schema is dynamic and may contain missing data or types. This raises the need for techniques and algorithms to exploit and integrate such data and to extract the information relevant to the user. In this paper we present the OSIX system (Osiris-based System for Integration of XML Sources). The system has a data warehouse model designed for the integration of semi-structured data, and more precisely for the integration of XML documents. The architecture of OSIX relies on the Osiris system, a DL-based model designed for the representation and management of databases and knowledge bases. Osiris is a view-based data model whose indexing system supports semantic query optimization. We show that query processing on an XML source is optimized by the indexing approach proposed by Osiris.
Keywords: Data integration, semi-structured data, views, XML.
4224 Quality as an Approach to Organizational Change and Its Role in the Reorganization of Enterprises: Case of Four Moroccan Small and Medium-Sized Enterprises
Authors: A. Boudiaf
Abstract:
The purpose of this paper is to analyze and apprehend, through four case studies, the interest of implementing a quality management system (QMS) at four Moroccan small and medium-sized enterprises (SMEs). Such a project can generate significant organizational change and improve the functioning of the organization. Quality is becoming a necessity in the current business world and is considered a major component of companies' competitive strategies. It should be noted that quality management is characterized by a set of methods and techniques that can be used to solve malfunctions and reorganize companies. The choice to adopt a quality approach may be influenced by the circumstances of the business context, or it may derive from the company's strategic vision; that is, the choice can be either strategic or reactive in character. This, in turn, is likely to have a major impact on the functioning of the QMS and on how the quality issue is perceived by company managers and their employees.
Keywords: Business context, organizational change, quality, reorganization.
4223 A Finite Difference Calculation Procedure for the Navier-Stokes Equations on a Staggered Curvilinear Grid
Authors: R. M. Barron, B. Zogheib
Abstract:
A new numerical method for solving the two-dimensional, steady, incompressible, viscous flow equations on a curvilinear staggered grid is presented in this paper. The proposed methodology is finite-difference based, but it takes advantage of the best features of two well-established numerical formulations, the finite difference and finite volume methods. Some weaknesses of the finite difference approach are removed by exploiting the strengths of the finite volume method. In particular, velocity-pressure coupling is handled in the proposed finite difference formulation by developing a pressure correction equation in a manner similar to the SIMPLE approach commonly used in finite volume formulations. However, since this is purely a finite difference formulation, numerical approximation of fluxes is not required. The results presented here use the first-order upwind scheme for the convective terms, but the methodology can easily be modified to accommodate higher-order differencing schemes.
Keywords: Curvilinear, finite difference, finite volume, SIMPLE.
4222 A Hybrid Particle Swarm Optimization Solution to Ramping Rate Constrained Dynamic Economic Dispatch
Authors: Pichet Sriyanyong
Abstract:
This paper presents the application of an enhanced Particle Swarm Optimization (EPSO) combined with Gaussian Mutation (GM) for solving the Dynamic Economic Dispatch (DED) problem subject to the operating constraints of the generators. The EPSO consists of the standard PSO and a modified heuristic search approach: the ability of the traditional PSO is enhanced by applying the modified heuristic search to prevent solutions from violating the constraints. In addition, Gaussian mutation increases the diversity of the global search and prevents the search from becoming trapped at suboptimal points. To illustrate its efficiency and effectiveness, the developed EPSO-GM approach is tested on 3-unit and 10-unit 24-hour systems considering the valve-point effect. From the experimental results, it can be concluded that the proposed EPSO-GM provides accurate solutions, efficiency, and robust computation compared with the other algorithms under consideration.
Keywords: Particle Swarm Optimization (PSO), Gaussian Mutation (GM), Dynamic Economic Dispatch (DED).
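A generic sketch of PSO with Gaussian mutation on a toy dispatch problem may clarify the idea. The two-unit quadratic fuel costs, the penalty handling of the demand-balance constraint, and all parameter values below are assumptions; the paper's EPSO-GM uses a modified heuristic repair and valve-point costs that are not reproduced here.

```python
import random

def pso_gm(cost, bounds, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5,
           mut_rate=0.1, sigma=0.05, seed=1):
    """Minimal PSO with Gaussian mutation and clipping to variable bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    clip = lambda v, lo, hi: max(lo, min(hi, v))
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = clip(pos[i][d] + vel[i][d], *bounds[d])
                # Gaussian mutation keeps diversity and helps escape
                # suboptimal points, as in the abstract.
                if rng.random() < mut_rate:
                    span = bounds[d][1] - bounds[d][0]
                    pos[i][d] = clip(pos[i][d] + rng.gauss(0.0, sigma * span),
                                     *bounds[d])
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Hypothetical 2-unit dispatch: quadratic fuel costs plus a penalty
# enforcing the demand balance p1 + p2 = demand.
demand = 300.0
fuel = lambda p: 0.004 * p[0] ** 2 + 2.0 * p[0] + 0.006 * p[1] ** 2 + 1.5 * p[1]
cost = lambda p: fuel(p) + 1000.0 * abs(p[0] + p[1] - demand)
best, best_cost = pso_gm(cost, bounds=[(50.0, 250.0), (50.0, 250.0)])
```

For this toy problem the analytical optimum (equal incremental costs) lies at roughly (155, 145) MW, so the swarm's result can be checked directly.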