Search results for: first order plus dead time process
12179 Assessing the Ways of Improving the Power Saving Modes in the Ore-Grinding Technological Process
Authors: Baghdasaryan Marinka
Abstract:
The distribution of electric power consumption in the ore-grinding technological process was monitored. As a result, the impacts of the mill filling rate, the productivity of the ore supply, the volumetric density of the grinding balls, the specific density of the ground ore, and the relative speed of the mill rotation on the specific consumption of electric power have been studied. The power and technological factors affecting the reactive power generated by the synchronous motors operating within the technological scheme are also studied. A block diagram for evaluating the power consumption modes of the technological process is presented, which includes the analysis of the technological scheme, the determination of the place and volumetric density of the ore-grinding mill, the evaluation of the technological and power factors affecting the energy-saving process, as well as the assessment of the electric power standards.
Keywords: Electric power standard, factor, ore grinding, power consumption, reactive power, technological.
12178 Optimal Bayesian Control of the Proportion of Defectives in a Manufacturing Process
Authors: Viliam Makis, Farnoosh Naderkhani, Leila Jafari
Abstract:
In this paper, we present a model and an algorithm for the calculation of the optimal control limit, average cost, sample size, and sampling interval for an optimal Bayesian chart to control the proportion of defective items produced, using a semi-Markov decision process approach. The traditional p-chart has been widely used for controlling the proportion of defectives in various kinds of production processes for many years. It is well known that traditional non-Bayesian charts are not optimal, but very few optimal Bayesian control charts have been developed in the literature, mostly considering a finite horizon. The objective of this paper is to develop a fast computational algorithm to obtain the optimal parameters of a Bayesian p-chart. The decision problem is formulated in the partially observable framework, and the developed algorithm is illustrated by a numerical example.
Keywords: Bayesian control chart, semi-Markov decision process, quality control, partially observable process.
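The semi-Markov decision formulation and the optimal control limit above are specific to the paper; as background only, the conjugate Beta-Binomial update that underlies Bayesian monitoring of a proportion can be sketched as follows (the prior parameters, control limit, and sample data below are invented for illustration, not outputs of the paper's algorithm).

```python
# Minimal sketch of the Beta-Binomial update behind Bayesian monitoring of a
# proportion defective. Prior, control limit, and samples are illustrative only.

def update_posterior(a, b, defectives, sample_size):
    """Beta(a, b) prior -> Beta posterior after observing a binomial sample."""
    return a + defectives, b + (sample_size - defectives)

a, b = 1.0, 9.0               # weak prior with mean defective rate 0.10 (assumed)
control_limit = 0.15          # hypothetical limit on the posterior mean

for defectives, n in [(2, 50), (4, 50), (20, 50)]:
    a, b = update_posterior(a, b, defectives, n)
    p_hat = a / (a + b)
    status = "signal" if p_hat > control_limit else "in control"
    print(f"posterior mean = {p_hat:.3f} -> {status}")
```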
12177 Prioritization Assessment of Housing Development Risk Factors: A Fuzzy Hierarchical Process-Based Approach
Authors: Yusuf Garba Baba
Abstract:
The construction industry and its housing subsector are fraught with risks that have the potential of negatively impacting the achievement of project objectives. The success or otherwise of most construction projects depends to a large extent on how well these risks have been managed. The recent paradigm shift by the subsector to a formal risk management approach, in contrast to hitherto developed rules of thumb, means that risks must not only be identified but also properly assessed and responded to in a systematic manner. The study focused on identifying risks associated with housing development projects and on a prioritisation assessment of the identified risks in order to provide a basis for informed decisions. The study used a three-step identification framework: a review of literature on similar projects, expert consultation, and a questionnaire-based survey to identify potential risk factors. The Delphi survey method was employed in carrying out the relative prioritisation assessment of the risk factors using computer-based Analytical Hierarchical Process (AHP) software. The results show that 19 out of the 50 risks significantly impact housing development projects. The study concludes that although a significant number of risk factors have been identified as relevant to and impacting housing construction projects, the economic risk group and, in particular, ‘changes in demand for houses’ is prioritised by most developers as posing a threat to the achievement of their housing development objectives. Unless these risks are carefully managed, their effects will continue to impede success in these projects. The study recommends the adoption and use of the combination of a multi-technique identification framework and AHP prioritisation assessment methodology as a suitable model for the assessment of risks in housing development projects.
Keywords: Risk identification, risk assessment, analytical hierarchical process, multi-criteria decision.
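As a rough illustration of the AHP step used in the prioritisation above, the sketch below derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks the consistency ratio; the 3x3 matrix and its values are made up for the example and are not data from the survey.

```python
import numpy as np

# Illustrative reciprocal pairwise comparison matrix for three risk groups (made-up values).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP priority weights: normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3 (Saaty).
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print("priorities:", np.round(w, 3), "consistency ratio:", round(CR, 3))
```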
12176 Using Agility in Building Business Process Management Solutions
Authors: Krešimir Fertalj, Mladen Matejaš
Abstract:
In a turbulent modern economy, companies need to properly manage their business processes. Well-defined and stable business processes ensure the security of crucial data and applications and provide a quality product or service to the end customer. On the other side, constant changes in the market, new regulatory provisions, and emerging new technologies require prompt and effective changes to business processes. In this article, we explore the use of agile principles in working with business process management (BPM) solutions. We deal with difficulties in the BPM development cycle, review the benefits of using agility, and choose the basic agile principles that ensure the success of a BPM project.
Keywords: Agile development, BPM environment, Kanban, SCRUM, XP.
12175 Description of Kinetics of Propane Fragmentation with a Support of Ab Initio Simulation
Authors: Amer Al Mahmoud Alsheikh, Jan Žídek, František Krčma
Abstract:
Using ab initio theoretical calculations, we present an analysis of the propane fragmentation process. The analysis is performed in two steps. The first step is the calculation of fragmentation energies by ab initio methods. The second step is the application of these energies to a kinetic description of the process. The energies of the fragments are presented in this paper. The kinetics of the fragmentation process can be described by numerical models; the method for this kinetic analysis is described here. The result, the composition of the fragmentation products, will be calculated in future work. The results from the model can then be compared to the concentrations of fragments from the mass spectrum.
Keywords: Ab initio, Density functional theory, Fragmentation energy, Geometry optimization.
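In the usual ab initio convention, the fragmentation energy referred to above is a total-energy difference between the optimized parent molecule and its optimized fragments; a generic form (not the paper's specific channels) is

\[ \Delta E_{\mathrm{frag}} = \sum_i E(\mathrm{fragment}_i) - E(\mathrm{C_3H_8}), \]

where the energies are obtained from the ab initio (e.g., density functional theory) calculations after geometry optimization.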
12174 Effective Digital Music Retrieval System through Content-based Features
Authors: Bokyung Sung, Kwanghyo Koo, Jungsoo Kim, Myung-Bum Jung, Jinman Kwon, Ilju Ko
Abstract:
In this paper, we propose an effective system for digital music retrieval. The proposed system is divided into a client and a server. The client part consists of pre-processing and content-based feature extraction stages. In the pre-processing stage, we minimized the time-code gap that occurs among copies of the same music content. As content-based features, first-order differentiated MFCCs were used; these approximate the envelope of the music feature sequences. The server part includes the music server and the music matching stage. Features extracted from 1,000 digital music files were stored in the music server. In the music matching stage, the retrieval result is found through a DTW-based similarity measure. In the experiment, we used 450 queries, made by mixing different compression standards and sound qualities from 50 digital music files. Retrieval accuracy was 97%, and the average retrieval time was 15 ms per query. Our experiments proved that the proposed system is effective for digital music retrieval and robust in various web user environments.
Keywords: Music Retrieval, Content-based, Music Feature and Digital Music.
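A minimal sketch of the matching stage described above is given below: plain dynamic time warping over first-order differenced MFCC frames. It assumes the MFCC matrices (frames by coefficients) have already been extracted, and it is an illustration rather than the authors' client-server implementation.

```python
import numpy as np

def delta(mfcc):
    """First-order difference of an MFCC matrix (frames x coefficients)."""
    return np.diff(mfcc, axis=0)

def dtw_distance(x, y):
    """Plain O(N*M) DTW with Euclidean frame distance."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def best_match(query_mfcc, database):
    """Return the key of the stored track whose delta-MFCC sequence is closest to the query."""
    q = delta(query_mfcc)
    return min(database, key=lambda key: dtw_distance(q, delta(database[key])))
```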
12173 Mining Sequential Patterns Using I-PrefixSpan
Authors: Dhany Saputra, Dayang R. A. Rambli, Oi Mean Foong
Abstract:
In this paper, we propose an improvement of the pattern-growth-based PrefixSpan algorithm, called I-PrefixSpan. The general idea of I-PrefixSpan is to use a sufficient data structure for the Seq-Tree framework and a separator database to reduce the execution time and memory usage. Thus, with I-PrefixSpan there is no in-memory database stored after the index set is constructed. The experimental results show that, using Java 2, this method improves the speed of PrefixSpan by almost two orders of magnitude and its memory usage by more than one order of magnitude.
Keywords: ArrayList, ArrayIntList, minimum support, sequence database, sequential patterns.
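For orientation, the sketch below is a plain, unoptimized PrefixSpan over sequences of single items; it shows the pattern-growth idea that I-PrefixSpan builds on, but it includes neither the Seq-Tree data structure nor the separator database that distinguish the proposed method.

```python
from collections import Counter

def prefixspan(sequences, min_support, prefix=None):
    """Recursive pattern-growth mining for sequences of single items.
    Returns (pattern, support) pairs for all frequent sequential patterns."""
    prefix = prefix or []
    patterns = []
    counts = Counter()
    for seq in sequences:
        counts.update(set(seq))          # count each item once per sequence
    for item, support in counts.items():
        if support < min_support:
            continue
        new_prefix = prefix + [item]
        patterns.append((new_prefix, support))
        # Project the database: suffix after the first occurrence of `item`.
        projected = [seq[seq.index(item) + 1:] for seq in sequences
                     if item in seq and seq[seq.index(item) + 1:]]
        patterns.extend(prefixspan(projected, min_support, new_prefix))
    return patterns

db = [['a', 'b', 'c'], ['a', 'c', 'b', 'c'], ['b', 'a', 'c']]   # toy sequence database
print(prefixspan(db, min_support=2))
```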
12172 Obsession of Time and the New Musical Ontologies: The Concert for Saxophone, Daniel Kientzy and Orchestra by Myriam Marbe
Authors: Luminiţa Duţică
Abstract:
For the composer Myriam Marbe, musical time and memory represent two complementary phenomena with a conclusive impact on the settlement of new musical ontologies. Summarizing the most important achievements of contemporary techniques of composition, her vision of the microform presented in The Concert for Daniel Kientzy, saxophone and orchestra transcends linear and unidirectional time in favour of a flexible, multivectorial discourse with spiral developments, where the sound substance is auto(re)generated by analogy with the fundamental processes of memory. The conceptual model is of an archetypal essence, the composer being concerned with identifying the mechanisms of the creation process, especially those specific to collective creation (of oral tradition). Hence the spontaneity of expression, the improvisational tint, free rhythm, micro-interval intonation, and a coloristic-timbral universe dominated by multiphonics and unique sound effects; hence the atmosphere of ritual, however purged of its primary connotations and reprojected into a wonderful spectacular space. The Concert is a work of artistic maturity and commands respect, among other things, for the timbral diversity of the three species of saxophone required by the composer (baritone, sopranino, and alto); in Part III, Daniel Kientzy performs on two saxophones concomitantly. Myriam Marbe's score contains deeply spiritualized music, full of archetypal symbols, a music whose drama suggests a real cinematographic movement.
Keywords: Archetype, chronogenesis, concert, multiphonics.
12171 Haemocompatibility of Surface Modified AISI 316L Austenitic Stainless Steel Tested in Artificial Plasma
Authors: W. Walke, J. Przondziono, K. Nowińska
Abstract:
The study comprises an evaluation of the suitability of the passive layer created on the surface of AISI 316L stainless steel for products that are intended to have contact with blood. For that purpose, prior to and after chemical passivation, samples were subjected to a 7-day exposure in artificial plasma at the temperature of T=37°C. Next, tests of metallic ion infiltration from the surface into the solution were performed. The tests were performed with the JY 2000 spectrometer by Jobin Yvon, employing Inductively Coupled Plasma Atomic Emission Spectrometry (ICP-AES). In order to characterize the physical and chemical features of the electrochemical processes taking place during exposure of the samples to artificial plasma, tests with electrochemical impedance spectroscopy were proposed. The tests were performed with a measuring unit equipped with a PGSTAT 302N potentiostat with an FRA2 attachment for impedance tests. Measurements were made in an environment simulating human blood at the temperature of T=37°C. The performed tests proved that the application of a chemical passivation process for AISI 316L stainless steel used for the production of goods intended to have contact with blood is well-grounded and useful in order to improve the safety of their usage.
Keywords: AISI 316L stainless steel, chemical passivation, artificial plasma, ions infiltration, EIS.
12170 Identifying Areas on the Pavement Where Rain Water Runoff Affects Motorcycle Behavior
Authors: Panagiotis Lemonakis, Theodoros Αlimonakis, George Kaliabetsos, Nikos Eliou
Abstract:
It is very well known that certain vertical and longitudinal slopes have to be assured in order to achieve adequate rainwater runoff from the pavement. The selection of longitudinal slopes between the turning points of the vertical curves that meet the afore-mentioned requirement does not ensure adequate drainage, because the same condition must also be applied at the transition curves. In this way, none of the pavement edges' slopes (or any other spot that lies on the pavement) will be opposite to the longitudinal slope of the rotation axis. Horizontal and vertical alignment must be properly combined in order to form a road whose resultant slope does not take small values, and hence checks must be performed in every cross section and at every chainage of the road. The present research investigates rainwater runoff from the road surface in order to identify the conditions under which areas of inadequate drainage are created, to analyze the rainwater behavior in such areas, to provide design examples of good and bad drainage zones, and to track down certain motorcycle types which might encounter hazardous situations due to the presence of a water film between the pavement and both of their tires, resulting in loss of traction. Moreover, it investigates the combination of longitudinal and cross slope values in critical pavement areas. It should be pointed out that the drainage gradient is analytically calculated for the whole road width and not just for an oblique slope per chainage (combination of longitudinal grade and cross slope). Lastly, various combinations of horizontal and vertical design are presented, indicating the crucial zones of bad pavement drainage. The key conclusion of the study is that any type of motorcycle will travel inside the area of improper runoff for a certain time frame, which depends on the speed and the trajectory that the rider chooses along the transition curve. Taking into account that on this section the rider will have to lean his motorcycle and hence reduce the contact area of his tire with the pavement, it is apparent that any variation in the friction value due to the presence of a water film may lead to serious problems regarding his safety. The water runoff from the road pavement is improved when, between reverse longitudinal slopes, a crest instead of a sag curve is chosen, and particularly when its edges coincide with the edges of the horizontal curve. Lastly, the results of the investigation have shown that variation of the longitudinal slope involves the vertical shift of the center of the poor water runoff area. The magnitude of this area increases as the length of the transition curve increases.
Keywords: Drainage, motorcycle safety, superelevation, transition curves, vertical grade.
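For reference, at any point of the pavement the drainage (resultant) gradient combining the longitudinal grade s_l and the cross slope s_c mentioned above is commonly taken as

\[ s_r = \sqrt{s_l^2 + s_c^2}, \]

and the areas of inadequate drainage investigated in the study arise where this resultant slope drops below the minimum value required for runoff, which is why the check has to be repeated at every cross section and chainage.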
12169 Normalizing Scientometric Indicators of Individual Publications Using Local Cluster Detection Methods on Citation Networks
Authors: Levente Varga, Dávid Deritei, Mária Ercsey-Ravasz, Răzvan Florian, Zsolt I. Lázár, István Papp, Ferenc Járai-Szabó
Abstract:
One of the major shortcomings of widely used scientometric indicators is that different disciplines cannot be compared with each other. The issue of cross-disciplinary normalization has long been discussed, but even the classification of publications into scientific domains poses problems. Structural properties of citation networks offer new possibilities; however, the large size and constant growth of these networks call for precaution. Here we present a new tool that relies on the structural properties of citation networks to perform cross-field normalization of scientometric indicators of individual publications. Due to the large size of the networks, a systematic procedure for identifying scientific domains based on a local community detection algorithm is proposed. The algorithm is tested with different benchmark and real-world networks. Then, by the use of this algorithm, the mechanism of the scientometric indicator normalization process is shown for a few indicators, such as the citation number, the P-index, and a local version of the PageRank indicator. The fat-tail trend of the article indicator distribution enables us to successfully perform the indicator normalization process.
Keywords: Citation networks, scientometric indicator, cross-field normalization, local cluster detection.
12168 Studying on ARINC653 Partition Run-time Scheduling and Simulation
Authors: Dongliang Wang, Jun Han, Dianfu Ma, Xianqi Zhao
Abstract:
Avionics software is safety-critical embedded software, and its architecture is evolving from traditional federated architectures to Integrated Modular Avionics (IMA) to improve resource usability. ARINC 653 (Avionics Application Standard Software Interface) is a software specification for space and time partitioning in safety-critical avionics real-time operating systems. ARINC 653 uses two-level scheduling strategies, but current modeling tools only apply to simple ARINC 653 two-level scheduling problems that contain only timing properties. In the avionics industry, tasks are always allocated manually and the timing table of a real-time system is calculated by hand to ensure it runs as designed. In this paper we present an automatic generation strategy that applies to two-level scheduling problems with dependency constraints in the ARINC 653 partition run-time environment. It provides automatic generation from the task and partition models to a scheduling policy by allocating the tasks to the partitions while following the constraints, and we then design a simulation mechanism to check whether our policy is schedulable or not.
Keywords: ARINC 653, scheduling, task allocation, simulation.
12167 Nonlinear Dynamic Analysis of Base-Isolated Structures Using a Mixed Integration Method: Stability Aspects and Computational Efficiency
Authors: Nicolò Vaiana, Filip C. Filippou, Giorgio Serino
Abstract:
In order to reduce numerical computations in the nonlinear dynamic analysis of seismically base-isolated structures, a Mixed Explicit-Implicit time integration Method (MEIM) has been proposed. Adopting the explicit, conditionally stable central difference method to compute the nonlinear response of the base isolation system, and the implicit, unconditionally stable Newmark constant average acceleration method to determine the superstructure linear response, the proposed MEIM, which is conditionally stable due to the use of the central difference method, makes it possible to avoid the iterative procedure generally required by conventional monolithic solution approaches within each time step of the analysis. The main aim of this paper is to investigate the stability and computational efficiency of the MEIM when employed to perform the nonlinear time history analysis of base-isolated structures with sliding bearings. Indeed, in this case, the critical time step could become smaller than the one used to accurately define the earthquake excitation, due to the very high initial stiffness values of such devices. The numerical results obtained from nonlinear dynamic analyses of a base-isolated structure with a friction pendulum bearing system, performed using the proposed MEIM, are compared to those obtained with a conventional monolithic solution approach, i.e. the implicit unconditionally stable Newmark constant acceleration method employed in conjunction with the iterative pseudo-force procedure. According to the numerical results, in the presented numerical application the MEIM does not have stability problems, since the critical time step is larger than that of the ground acceleration record despite the high initial stiffness of the friction pendulum bearings. In addition, compared to the conventional monolithic solution approach, the proposed algorithm preserves its computational efficiency even when it is adopted to perform the nonlinear dynamic analysis using a smaller time step.
Keywords: Base isolation, computational efficiency, mixed explicit-implicit method, partitioned solution approach, stability.
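The two integrators combined in the MEIM are standard. In the usual notation, the explicit central difference approximations applied to the base-isolation degrees of freedom are

\[ \dot{u}_n \approx \frac{u_{n+1} - u_{n-1}}{2\,\Delta t}, \qquad \ddot{u}_n \approx \frac{u_{n+1} - 2u_n + u_{n-1}}{\Delta t^2}, \]

which are conditionally stable (Δt ≤ 2/ω_max), while Newmark's constant average acceleration method (β = 1/4, γ = 1/2) advances the superstructure implicitly and unconditionally stably through

\[ u_{n+1} = u_n + \Delta t\,\dot{u}_n + \frac{\Delta t^2}{4}\left(\ddot{u}_n + \ddot{u}_{n+1}\right), \qquad \dot{u}_{n+1} = \dot{u}_n + \frac{\Delta t}{2}\left(\ddot{u}_n + \ddot{u}_{n+1}\right). \]

How the two sub-steps are coupled within each time step is specific to the proposed MEIM and is not reproduced here.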
12166 Role-Governed Categorization and Category Learning as a Result from Structural Alignment: The RoleMap Model
Authors: Yolina A. Petrova, Georgi I. Petkov
Abstract:
The paper presents a symbolic model for category learning and categorization (called RoleMap). Unlike other models, which implement learning in a separate working mode, role-governed category learning and categorization emerge in RoleMap while it does its usual reasoning. The model is based on several basic mechanisms known to reflect the sub-processes of analogy-making. It rests on the assumption that in their everyday life people constantly compare what they experience with what they know. Various commonalities between the incoming information (current experience) and the stored one (long-term memory) emerge from those comparisons. Some of those commonalities are considered to be highly important, and they are transformed into concepts for further use. This process constitutes category learning. When there is missing knowledge in the incoming information (i.e., the perceived object is still not recognized), the model makes anticipations about what is missing, based on similar episodes from its long-term memory. Various such anticipations may emerge for different reasons. However, with time only one of them wins and is transformed into a category member. This process constitutes the act of categorization.
Keywords: Categorization, category learning, role-governed category, analogy-making, cognitive modeling.
12165 The Effect of Leadership Styles on Continuous Improvement Teams
Authors: Paul W. Murray
Abstract:
This research explores the relationship between leadership style and continuous improvement (CI) teams. CI teams have several features that are not always found in other types of teams, including multi-functional members, a short time period for performance, positive and actionable results, and exposure to senior leadership. There is no one best style of leadership for these teams. Instead, it is important to select the best leadership style for the situation. The leader must have the flexibility to change styles and the skill to use the chosen style effectively in order to ensure the team’s success.
Keywords: Leadership style, Lean Manufacturing, Teams, Cross-functional.
12164 Optimization of Process Parameters Affecting on Spring-Back in V-Bending Process for High Strength Low Alloy Steel HSLA 420 Using FEA (HyperForm) and Taguchi Technique
Authors: Navajyoti Panda, R. S. Pawar
Abstract:
In this study, process parameters such as punch angle, die opening, grain direction, and the pre-bend condition of the strip for the deep drawing of high strength low alloy steel HSLA 420 are investigated. The finite element method (FEM), in association with the Taguchi and analysis of variance (ANOVA) techniques, is used to investigate the degree of importance of the process parameters in the V-bending process for HSLA 420 and St12 grade materials. From the results, it is observed that punch angle has a major influence on spring-back. Die opening also plays a very significant role in spring-back. On the other hand, grain direction has the least impact on spring-back; however, a strip taken from a flat sheet is less prone to spring-back than a strip taken from a sheet metal coil. HyperForm software is used for the FEM simulation, and the experiments are designed using the Taguchi method. The percentage contribution of the parameters is obtained through the ANOVA technique.
Keywords: Bending, V-bending, FEM, spring-back, Taguchi, HyperForm, profile projector, HSLA 420 & St12 materials.
12163 Pull-In Instability Determination of Microcapacitive Sensor for Measuring Special Range of Pressure
Authors: Yashar Haghighatfar, Shahrzad Mirhosseini
Abstract:
Pull-in instability is a nonlinear and crucial effect that is important for the design of microelectromechanical system devices. In this paper, the appropriate electrostatic voltage range is determined for measuring fluid flow pressure via a microbeam-based micro pressure sensor. The microbeam deflection consists of two parts: the static deflection and a perturbation about the static deflection. A second-order equation involving the equivalent stiffness, mass, and damping matrices, based on the Galerkin method, is introduced to predict pull-in instability due to the external voltage. The reduced-order method is also used to solve the second-order nonlinear equation of motion. Furthermore, in the present study, the micro capacitive pressure sensor is designed for measuring a specific fluid flow pressure range. The results show that the measurable pressure range can be optimized with respect to the damping field and the external voltage.
Keywords: MEMS, pull-in instability, electrostatically actuated microbeam, reduced order method.
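As a point of reference only (the classical lumped parallel-plate estimate, not the distributed microbeam model analyzed in the paper), the pull-in voltage for an electrode of area A, initial gap d_0 and suspension stiffness k is

\[ V_{PI} = \sqrt{\frac{8\,k\,d_0^{3}}{27\,\varepsilon_0 A}}, \]

with the instability occurring when the static deflection reaches one third of the initial gap.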
12162 Optimization of Control Parameters for MRR in Injection Flushing Type of EDM on Stainless Steel 304 Workpiece
Authors: M. S. Reza, M. Hamdi, A.S. Hadi
Abstract:
The operating control parameters of the injection flushing type of electrical discharge machining process on a stainless steel 304 workpiece with copper tools are optimized according to an individual machining characteristic, namely the material removal rate (MRR). A lower MRR during the EDM machining process may decrease machining productivity. Hence, the quality characteristic for MRR is set to higher-the-better to achieve the optimum machining productivity. The Taguchi method has been used for the construction, layout, and analysis of the experiment for this machining characteristic, and its use saves considerable time and cost in preparing and machining the experimental samples. An L18 orthogonal array, a fundamental component in the statistical design of experiments, has been used to plan the experiments, and analysis of variance (ANOVA) is used to determine the optimum machining parameters for this machining characteristic. The control parameters selected for these optimization experiments are polarity, pulse-on duration, discharge current, discharge voltage, machining depth, machining diameter, and dielectric liquid pressure. The results show that the higher the discharge voltage, the higher the MRR.
Keywords: ANOVA, EDM, injection flushing, L18 orthogonal array, MRR, stainless steel 304.
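The higher-the-better setting mentioned above corresponds to the standard Taguchi signal-to-noise ratio; a short sketch of its computation for MRR replicates is given below (the replicate values are invented for illustration and are not the experimental data).

```python
import math

def sn_larger_the_better(values):
    """Taguchi S/N ratio (dB) for a higher-the-better characteristic such as MRR."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in values) / len(values))

# Hypothetical MRR replicates for one row of the L18 orthogonal array.
print(round(sn_larger_the_better([12.4, 13.1, 12.8]), 2))
```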
12161 Development of a Wiki-based Feature Library for a Process Planning System
Authors: Hendry Muljadi, Hideaki Takeda, Koichi Ando
Abstract:
A manufacturing feature can be defined simply as a geometric shape together with the manufacturing information needed to create that shape. In a feature-based process planning system, the feature library plays an important role in the extraction of manufacturing features with their proper manufacturing information. However, to manage the manufacturing information flexibly, it is important to build a feature library that is easy to modify. In this paper, a Wiki-based feature library is proposed.
Keywords: Manufacturing feature, feature library, feature ontology, process planning, Wiki, MediaWiki.
12160 Work Structuring and the Feasibility of Application to Construction Projects in Vietnam
Authors: Viet-Hung Nguyen, Luh-Maan Chang
Abstract:
Design should be viewed concurrently in three ways: as transformation, flow, and value generation. An innovative approach to solving design-related problems is integrated product-process design. As a foundation for a formal framework consisting of organizing principles and techniques, Work Structuring has been developed to guide efforts in the integration that enhances the development of operation and process design in alignment with product design. Vietnamese construction projects are facing many delays and cost overruns caused mostly by design-related problems. Better design management that integrates product and process design could resolve these problems. A questionnaire survey and in-depth interviews were used to investigate the feasibility of applying Work Structuring to construction projects in Vietnam. The purpose of this paper is to present the research results and to illustrate the possible problems and potential solutions when Work Structuring is implemented in construction projects in Vietnam.
Keywords: Integrated product-process design, Work Structuring, construction projects, Vietnam.
12159 Bayesian Belief Networks for Test Driven Development
Authors: Vijayalakshmy Periaswamy S., Kevin McDaid
Abstract:
Testing accounts for a major percentage of the technical effort in the software development process. Typically, it consumes more than 50 percent of the total cost of developing a piece of software. The selection of software tests is a very important activity within this process to ensure that the software reliability requirements are met. Generally, tests are run to achieve maximum coverage of the software code, and very little attention is given to the achieved reliability of the software. Using an existing methodology, this paper describes how to use Bayesian Belief Networks (BBNs) to select unit tests based on their contribution to the reliability of the module under consideration. In particular, the work examines how the approach can enhance test-first development by assessing the quality of test suites resulting from this development methodology and by providing insight into additional tests that can significantly improve the achieved reliability. In this way the method can produce an optimal selection of inputs and the order in which the tests are executed to maximize the software reliability. To illustrate this approach, a belief network is constructed for a modern software system, incorporating expert opinion, expressed through probabilities of the relative quality of the elements of the software, and the potential effectiveness of the software tests. The steps involved in constructing the Bayesian network are explained, as is a method to allow for the test suite resulting from test-driven development.
Keywords: Software testing, Test Driven Development, Bayesian Belief Networks.
12158 A Case Study on Product Development Performance Measurement
Authors: Liv Gingnell, Evelina Ericsson, Joakim Lilliesköld, Robert Langerström
Abstract:
In recent years, increased competition and lower profit margins have necessitated a focus on improving the performance of the product development process, an area that traditionally has been excluded from detailed steering and evaluation. A systematic improvement requires a good understanding of the current performance, and the interest in product development performance measurement has therefore increased dramatically. This paper presents a case study that evaluates the performance of the product development performance measurement system used in a Swedish company that is part of a global corporate group. The study is based on internal documentation and eighteen in-depth interviews with stakeholders involved in the product development process. The results from the case study include a description of what metrics are in use, how these are employed, and their effect on the quality of the performance measurement system. In particular, the importance of having a well-defined process proved to have a major impact on the quality of the performance measurement system in this particular case.
Keywords: Outcome metric, Performance driver, Performance measurement, Product development process.
12157 N-Grams: A Tool for Repairing Word Order Errors in Ill-formed Texts
Authors: Theologos Athanaselis, Stelios Bakamidis, Ioannis Dologlou, Konstantinos Mamouras
Abstract:
This paper presents an approach for repairing word order errors in English text by reordering the words in a sentence and choosing the version that maximizes the number of trigram hits according to a language model. A possible way to reorder the words is to use all permutations. The problem is that for a sentence of length N words the number of permutations is N!. The novelty of this method concerns the use of an efficient confusion matrix technique for reordering the words. The confusion matrix technique has been designed to reduce the search space among permuted sentences. The reduction of the search space is achieved using the statistical inference of N-grams. The results of this technique are very interesting and prove that the number of permuted sentences can be reduced by 98.16%. For experimental purposes a test set of TOEFL sentences was used, and the results show that more than 95% of the sentences can be repaired using the proposed method.
Keywords: Permutations filtering, Statistical language model N-grams, Word order errors, TOEFL
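A minimal sketch of the scoring step described above: every candidate permutation of the sentence is ranked by the number of its trigrams found in a language-model trigram set. The confusion-matrix filtering that makes the search tractable is not reproduced here, and the tiny trigram set is invented for the example.

```python
from itertools import permutations

def trigrams(words):
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

def best_reordering(words, lm_trigrams):
    """Exhaustive reordering by trigram hits (no confusion-matrix pruning)."""
    return max(permutations(words),
               key=lambda cand: len(trigrams(cand) & lm_trigrams))

# Toy language-model trigram set (illustrative only).
lm = {("i", "would", "like"), ("would", "like", "to"), ("like", "to", "go")}
print(best_reordering(["like", "i", "to", "would", "go"], lm))
```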
12156 A Constructivist Approach and Tool for Autonomous Agent Bottom-up Sequential Learning
Authors: Jianyong Xue, Olivier L. Georgeon, Salima Hassas
Abstract:
During the initial phase of cognitive development, infants exhibit amazing abilities to generate novel behaviors in unfamiliar situations and explore actively to learn the best they can while lacking extrinsic rewards from the environment. These abilities set them apart from even the most advanced autonomous robots. This work seeks to contribute to understanding and replicating some of these abilities. We propose the Bottom-up hiErarchical sequential Learning algorithm with Constructivist pAradigm (BEL-CA) to design agents capable of learning autonomously and continuously through interactions. The algorithm makes no assumption about the semantics of the input and output data, nor does it rely upon a model of the world given a priori in the form of a set of states and transitions. Besides, we propose a toolkit to analyze the learning process at run time called GAIT (Generating and Analyzing Interaction Traces). We use GAIT to report and explain the detailed learning process and the structured behaviors that the agent has learned at each decision-making step. We report an experiment in which the agent learned to successfully interact with its environment and to avoid unfavorable interactions using regularities discovered through interaction.
Keywords: Cognitive development, constructivist learning, hierarchical sequential learning, self-adaptation.
12155 A Parallel Implementation of k-Means in MATLAB
Authors: Dimitris Varsamis, Christos Talagkozis, Alkiviadis Tsimpiris, Paris Mastorocostas
Abstract:
The aim of this work is the parallel implementation of k-means in MATLAB, in order to reduce the execution time. Specifically, a new MATLAB function for the serial k-means algorithm is developed, which meets all the requirements for conversion to a MATLAB function with parallel computations. Additionally, two different variants for the definition of the initial values are presented. In the sequel, the parallel approach is presented. Finally, performance tests of the computation times with respect to the number of features and classes are illustrated.
Keywords: K-means algorithm, clustering, parallel computations, MATLAB.
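For orientation, a compact NumPy version of the serial k-means baseline is sketched below, written in Python rather than MATLAB and without the parallel-pool conversion or the two initialization variants discussed in the paper.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain Lloyd's k-means; X has shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Vectorized sample-to-center distances, shape (n_samples, k).
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

X = np.random.default_rng(1).normal(size=(300, 2))
labels, centers = kmeans(X, k=3)
print(centers)
```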
12154 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using Six Sigma methodologies to improve the surface roughness of a manufactured part produced by a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum must be improved to reduce or eliminate defects and to improve the process capability indices Cp and Cpk for a CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that led to the targeted surface roughness specified by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct. The new settings also improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be efficiently used to phase out defects and improve the process capability index of the CNC milling process.
Keywords: CNC machining, Six Sigma, Surface roughness, Taguchi methodology.
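The capability indices tracked in the study follow the usual definitions; the snippet below computes Cp and Cpk for a hypothetical set of surface roughness measurements and specification limits (the numbers are illustrative, not the case-study data).

```python
import statistics

def cp_cpk(samples, lsl, usl):
    """Process capability indices from the sample mean and standard deviation."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical Ra measurements and specification limits (same roughness units).
ra = [19.2, 20.1, 18.7, 19.8, 20.4, 19.5]
print(cp_cpk(ra, lsl=15.0, usl=25.0))
```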
12153 Comparison of Numerical and Theoretical Friction Effect in the Wire Winding for Reinforced Structures with Wire Winding
Authors: Amer Ezoji, Mohammad Sedighi
Abstract:
In this article, the wire winding process for reinforcing a pressure vessel frame is studied. First, the importance of the wire winding method is explained. The main steps in the design process are the axial force control methodology and the wire winding process. The hot isostatic press and the wire winding process are introduced. Using the equilibrium condition for the pressure vessel and frame, the stresses in the frame wires are analyzed. A case study frame is examined to control the axial force in the hot isostatic press. The frame and its wires are simulated, and the friction effect and the effect of the wires on the elastic yoke are considered in the simulation model. The theoretical and simulated results are then compared, and the vessel pressure is applied to the frame to ensure that the wound wires do not reach the yielding point.
Keywords: Wire winding, Frame, stress, friction.
12152 New Fourth Order Explicit Group Method in the Solution of the Helmholtz Equation
Authors: Norhashidah Hj. Mohd Ali, Teng Wai Ping
Abstract:
In this paper, the formulation of a new group explicit method with fourth-order accuracy is described for solving the two-dimensional Helmholtz equation. The formulation is based on the nine-point fourth-order compact finite difference approximation formula. The complexity analysis of the developed scheme is also presented. Several numerical experiments were conducted to test the feasibility of the developed scheme. Comparisons with other existing schemes are reported and discussed. Preliminary results indicate that this method is a viable high-accuracy alternative solver for the Helmholtz equation.
Keywords: Explicit group method, finite difference, Helmholtz equation, five-point formula, nine-point formula.
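For context, the nine-point fourth-order compact approximation that such schemes build on can be written, for the Poisson part of the operator on a uniform mesh of spacing h, as

\[ \frac{1}{6h^2}\Big[4\,(u_{i+1,j}+u_{i-1,j}+u_{i,j+1}+u_{i,j-1}) + u_{i+1,j+1}+u_{i+1,j-1}+u_{i-1,j+1}+u_{i-1,j-1} - 20\,u_{i,j}\Big] = f_{i,j} + \frac{h^2}{12}\,(f_{xx}+f_{yy})_{i,j}; \]

the compact Helmholtz formula used in the paper carries additional wavenumber-dependent terms, and the explicit group construction itself is not reproduced here.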
12151 A Particle Swarm Optimal Control Method for DC Motor by Considering Energy Consumption
Authors: Yingjie Zhang, Ming Li, Ying Zhang, Jing Zhang, Zuolei Hu
Abstract:
In the actual start-up process of DC motors, the DC drive system often faces a conflict between energy consumption and acceleration performance. To resolve this conflict, this paper proposes a comprehensive performance index in which an energy consumption index is added to the classical control performance index for the DC motor starting process. Taking the comprehensive performance index as the cost function, a particle swarm optimization algorithm is designed to optimize the comprehensive performance. Simulations are then conducted on the optimization of the comprehensive performance of the DC motor, provided that the weight coefficient of the energy consumption index is properly designed. The simulation results show that as the weight of energy consumption increased, the energy efficiency was significantly improved at the expense of a slight sacrifice in the fastness indicators under the comprehensive performance index method. Compared with a traditional proportional-integral-derivative (PID) controller in terms of energy saving, the energy efficiency increased from 63.18% to 68.48% while the response time was simultaneously reduced from 0.2875 s to 0.1736 s.
Keywords: Comprehensive performance index, energy consumption, acceleration performance, particle swarm optimal control.
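A minimal particle swarm sketch of the idea above: the swarm searches controller parameters that minimize a weighted sum of a tracking-error term and an energy term. The cost function, parameter bounds, and weights below are placeholders standing in for the paper's DC-motor model, not a reproduction of it.

```python
import numpy as np

def pso(cost, dim, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimization."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([cost(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

def comprehensive_cost(params, w_energy=0.3):
    """Placeholder comprehensive index: weighted error term plus energy term."""
    kp, ki = params
    error_term = (kp - 2.0) ** 2 + (ki - 0.5) ** 2    # stand-in for a simulated tracking error
    energy_term = 0.1 * (kp ** 2 + ki ** 2)           # stand-in for consumed energy
    return (1 - w_energy) * error_term + w_energy * energy_term

print(pso(comprehensive_cost, dim=2, bounds=(0.0, 5.0)))
```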
12150 A Scatter Search and Help Policies Approaches for a New Mixed Model Assembly Lines Sequencing Problem
Authors: N. Manavizadeh , M. Rabbani , H. Sotudian , F. Jolai
Abstract:
Mixed model production is the practice of assembling several distinct and different models of a product on the same assembly line without changeovers and then sequencing those models in a way that smoothes the demand for upstream components. In this paper, we consider an objective function that minimizes total stoppage time and total idle time, and sequence-dependent setup times are taken into account. Many studies have been done on mixed model assembly lines, but in this paper we focus specifically on reducing idle times, which is possible through various help policies. To improve the solutions, several cases were developed and about 40 test problems were considered. We use scatter search for optimization, and the experimental results show the behavior of the method and demonstrate its efficiency. Scatter search and help policies can produce high-quality solutions, which is why they are used in this paper.
Keywords: Mixed model assembly lines, scatter search, help policies, idle time, stoppage time.