Search results for: process component
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6159

5559 Optimization of Process Parameters Affecting on Spring-Back in V-Bending Process for High Strength Low Alloy Steel HSLA 420 Using FEA (HyperForm) and Taguchi Technique

Authors: Navajyoti Panda, R. S. Pawar

Abstract:

In this study, process parameters such as punch angle, die opening, grain direction, and the pre-bend condition of the strip are investigated for the deep drawing of high strength low alloy steel HSLA 420. The finite element method (FEM), in association with the Taguchi and analysis of variance (ANOVA) techniques, is used to determine the relative importance of the process parameters in the V-bending process for HSLA 420 and St12 grade materials. The results show that punch angle has the major influence on spring-back, and die opening also plays a very significant role. Grain direction, on the other hand, has the least impact on spring-back; however, a strip cut from flat sheet is less prone to spring-back than a strip taken from sheet metal coil. HyperForm software is used for the FEM simulation, and the experiments are designed using the Taguchi method. The percentage contribution of each parameter is obtained through the ANOVA technique.

Keywords: Bending, V-bending, FEM, spring-back, Taguchi, HyperForm, profile projector, HSLA 420 & St12 materials.
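
As a rough, hypothetical illustration of how ANOVA percentage contributions are obtained from a Taguchi-style experiment, the sketch below computes factor sums of squares and contribution ratios for an invented L9-type data set; the factor levels and spring-back values are assumptions, not the authors' measurements.

```python
import numpy as np

# Hypothetical L9-style results: 9 runs, 3 factors at 3 levels, spring-back angle as response.
levels = np.array([
    [1, 1, 1], [1, 2, 2], [1, 3, 3],
    [2, 1, 2], [2, 2, 3], [2, 3, 1],
    [3, 1, 3], [3, 2, 1], [3, 3, 2],
])
response = np.array([2.1, 1.8, 1.5, 2.4, 2.0, 1.7, 2.6, 2.2, 1.9])  # degrees, invented values
factors = ["punch angle", "die opening", "grain direction"]

grand_mean = response.mean()
total_ss = ((response - grand_mean) ** 2).sum()

# Factor sum of squares: for each level, (number of runs at that level) times the squared
# deviation of the level mean from the grand mean; contribution = SS_factor / SS_total.
for j, name in enumerate(factors):
    ss = sum(
        (levels[:, j] == lv).sum() * (response[levels[:, j] == lv].mean() - grand_mean) ** 2
        for lv in (1, 2, 3)
    )
    print(f"{name}: SS = {ss:.4f}, contribution = {100 * ss / total_ss:.1f}%")
```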

5558 Learning Classifier Systems Approach for Automated Discovery of Censored Production Rules

Authors: Suraiya Jabin, Kamal K. Bharadwaj

Abstract:

In the recent past, Learning Classifier Systems have been successfully used for data mining. A Learning Classifier System (LCS) is a machine learning technique which combines evolutionary computing, reinforcement learning, supervised or unsupervised learning, and heuristics to produce adaptive systems. An LCS learns by interacting with an environment from which it receives feedback in the form of numerical reward; learning is achieved by trying to maximize the amount of reward received. All LCS models, more or less, comprise four main components: a finite population of condition-action rules, called classifiers; the performance component, which governs the interaction with the environment; the credit assignment component, which distributes the reward received from the environment to the classifiers accountable for the rewards obtained; and the discovery component, which is responsible for discovering better rules and improving existing ones through a genetic algorithm (GA). When the concatenation of the production rules forms the genotype, the GA operates on a population of classifier systems; this approach is known as the 'Pittsburgh' style of classifier system. LCSs that apply the GA at the rule level within a single population are known as 'Michigan' style classifier systems. The most common representation of the discovered knowledge is the standard production rule (PR) of the form IF P THEN D. PRs, however, are unable to handle exceptions and do not exhibit variable precision. Censored Production Rules (CPRs), an extension of PRs proposed by Michalski and Winston, exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form IF P THEN D UNLESS C, where the censor C is an exception to the rule. Such rules are employed in situations in which the conditional statement IF P THEN D holds frequently and the assertion C holds rarely. By using a rule of this type, the exception condition can be ignored when the resources needed to establish its presence are tight or when there is simply no information available as to whether it holds or not. Thus, the IF P THEN D part of a CPR expresses the important information, while the UNLESS C part acts only as a switch that changes the polarity of D to ~D. In this paper, the Pittsburgh-style LCS approach is used for the automated discovery of CPRs. An encoding scheme is suggested to represent a chromosome consisting of a fixed-size set of CPRs, suitable genetic operators are designed for both the set of CPRs and individual CPRs, and an appropriate fitness function that incorporates the basic constraints on CPRs is proposed. Experimental results are presented to demonstrate the performance of the proposed learning classifier system.

Keywords: Censored Production Rule, Data Mining, Genetic Algorithm, Learning Classifier System, Machine Learning, Pittsburgh Approach, Reinforcement Learning.
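
To make the chromosome encoding idea concrete, here is a minimal, hypothetical sketch of a Censored Production Rule (IF P THEN D UNLESS C) applied to examples, with a toy fitness function; the attribute encoding, censor handling, and fitness terms are illustrative assumptions, not the authors' exact representation or operators.

```python
from dataclasses import dataclass

@dataclass
class CPR:
    premise: dict   # attribute -> required value (P)
    decision: str   # class label D
    censor: dict    # attribute -> value acting as the exception (C)

    def predict(self, example):
        if all(example.get(a) == v for a, v in self.premise.items()):
            # The censor acts as a switch: if C holds, the polarity of D is reversed.
            if self.censor and all(example.get(a) == v for a, v in self.censor.items()):
                return "not " + self.decision
            return self.decision
        return None   # rule does not fire

def fitness(rule, examples):
    """Toy fitness: accuracy on the examples the rule fires on, minus a complexity penalty."""
    fired = [(rule.predict(x), y) for x, y in examples if rule.predict(x) is not None]
    if not fired:
        return 0.0
    accuracy = sum(pred == actual for pred, actual in fired) / len(fired)
    return accuracy - 0.05 * (len(rule.premise) + len(rule.censor))

# Usage on a hypothetical weather-style data set.
rule = CPR(premise={"sky": "overcast"}, decision="play", censor={"wind": "storm"})
data = [({"sky": "overcast", "wind": "calm"}, "play"),
        ({"sky": "overcast", "wind": "storm"}, "not play")]
print(fitness(rule, data))
```

In a Pittsburgh-style setup, a chromosome would hold a fixed-size list of such CPR objects and the GA would recombine and mutate whole rule sets rather than single rules.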

5557 Development of a Wiki-based Feature Library for a Process Planning System

Authors: Hendry Muljadi, Hideaki Takeda, Koichi Ando

Abstract:

A manufacturing feature can be defined simply as a geometric shape together with the manufacturing information needed to create that shape. In a feature-based process planning system, the feature library plays an important role in the extraction of manufacturing features with their proper manufacturing information. To manage the manufacturing information flexibly, however, it is important to build a feature library that is easy to modify. In this paper, a Wiki-based feature library is proposed.

Keywords: Manufacturing feature, feature library, feature ontology, process planning, Wiki, MediaWiki.

5556 Membrane Distillation Process Modeling: Dynamical Approach

Authors: Fadi Eleiwi, Taous Meriem Laleg-Kirati

Abstract:

This paper presents a complete dynamic model of a membrane distillation process. The model consists of two consistent dynamic sub-models: a 2D advection-diffusion equation for the whole process and a modified heat equation for the membrane itself. The complete model describes the temperature diffusion phenomenon across the feed container, the membrane, the permeate container, and the membrane boundary layers, and it gives an online, complete temperature profile for each point in the domain. It explains the heat conduction and convection mechanisms that take place inside the process in terms of mathematical parameters and justifies the process behavior during the transient and steady-state phases. The process can be monitored for any sudden change in performance at any instant of time; in addition, the model assists in maintaining production rates as desired and provides recommendations during the membrane fabrication stages. System performance and parameters can be optimized and controlled using this complete dynamic model. The evolution of the membrane boundary temperature with time, the vapor mass transfer along the process, and the temperature difference between the membrane boundary layers are depicted and included. Simulations were performed on the complete model with real membrane specifications. The plots show consistency between the 2D advection-diffusion model, the expected behavior of the system, and the literature. The evolution of heat inside the membrane, from the transient response until the steady-state response is reached, is illustrated for fixed and varying times.

Keywords: Membrane distillation, Dynamical modeling, Advection-diffusion equation, Thermal equilibrium, Heat equation.
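
For reference, the standard form of a 2D advection-diffusion equation for a temperature field T(x, y, t), on which such a model is typically built, is shown below; the actual coefficients, source terms, and boundary conditions of the authors' model are not given in the abstract.

```latex
\frac{\partial T}{\partial t}
  + u\,\frac{\partial T}{\partial x}
  + v\,\frac{\partial T}{\partial y}
  = \alpha \left( \frac{\partial^{2} T}{\partial x^{2}}
                + \frac{\partial^{2} T}{\partial y^{2}} \right)
```

Here u and v are the advective (flow) velocity components and α is the thermal diffusivity.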

5555 Work Structuring and the Feasibility of Application to Construction Projects in Vietnam

Authors: Viet-Hung Nguyen, Luh-Maan Chang

Abstract:

Design should be viewed concurrently in three ways: as transformation, flow, and value generation. An innovative approach to solving design-related problems is integrated product-process design. As a foundation for a formal framework consisting of organizing principles and techniques, Work Structuring has been developed to guide integration efforts that enhance the development of operation and process design in alignment with product design. Construction projects in Vietnam face many delays and cost overruns caused mostly by design-related problems, and better design management that integrates product and process design could resolve these problems. A questionnaire survey and in-depth interviews were used to investigate the feasibility of applying Work Structuring to construction projects in Vietnam. The purpose of this paper is to present the research results and to illustrate the possible problems and potential solutions when Work Structuring is implemented in construction projects in Vietnam.

Keywords: Integrated product-process design, Work Structuring, construction projects, Vietnam.

5554 A Case Study on Product Development Performance Measurement

Authors: Liv Gingnell, Evelina Ericsson, Joakim Lilliesköld, Robert Langerström

Abstract:

In recent years, increased competition and lower profit margins have necessitated a focus on improving the performance of the product development process, an area that has traditionally been excluded from detailed steering and evaluation. A systematic improvement requires a good understanding of the current performance, and the interest in product development performance measurement has therefore increased dramatically. This paper presents a case study that evaluates the product development performance measurement system used in a Swedish company that is part of a global corporate group. The study is based on internal documentation and eighteen in-depth interviews with stakeholders involved in the product development process. The results from the case study include a description of the metrics in use, how they are employed, and their effect on the quality of the performance measurement system. In particular, the importance of having a well-defined process proved to have a major impact on the quality of the performance measurement system in this case.

Keywords: Outcome metric, Performance driver, Performance measurement, Product development process.

5553 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum must be improved in order to reduce or eliminate defects and to raise the process capability indices Cp and Cpk of the CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied to improve the process, reduce defects, and ultimately reduce costs. A Taguchi-based Six Sigma approach was used to identify the optimized processing parameters that achieve the target surface roughness specified by the customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. A confirmation run with the optimal parameters verified that the new parameter settings are correct, and the new settings also improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be efficiently used to phase out defects and improve the process capability index of a CNC milling process.

Keywords: CNC machining, Six Sigma, Surface roughness, Taguchi methodology.
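
As a hedged illustration of the quantities tracked in such a study, the snippet below computes the smaller-the-better Taguchi signal-to-noise ratio for surface roughness together with the Cp and Cpk capability indices, using invented measurements and specification limits rather than the paper's data.

```python
import numpy as np

# Hypothetical surface roughness measurements (micro-inches) and assumed specification limits.
ra = np.array([14.2, 13.8, 15.1, 14.6, 14.0, 13.9, 14.8, 14.3])
usl, lsl = 20.0, 8.0

# Smaller-the-better Taguchi signal-to-noise ratio for surface roughness.
sn_ratio = -10 * np.log10(np.mean(ra ** 2))

# Process capability indices from the sample mean and standard deviation.
mu, sigma = ra.mean(), ra.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

print(f"S/N = {sn_ratio:.2f} dB, Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```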

5552 Comparison of Numerical and Theoretical Friction Effect in the Wire Winding for Reinforced Structures with Wire Winding

Authors: Amer Ezoji, Mohammad Sedighi

Abstract:

In this article, the wire winding process used to reinforce a pressure vessel frame is studied. First, the importance of the wire winding method is explained. The main steps in the design process are the control of the axial force and the wire winding process itself; the hot isostatic press and the wire winding process are introduced. Using the equilibrium condition between the pressure vessel and the frame, the stresses in the frame wires are analyzed. A case-study frame is examined to control the axial force in the hot isostatic press. The frame and its wires are simulated, and the effects of friction and of the wires on the elastic yoke are included in the simulation model. The theoretical and simulation results are then compared, and the vessel pressure is applied to the frame to ensure that the wound wires do not reach the yielding point.

Keywords: Wire winding, Frame, stress, friction.

5551 A Hypercube Social Feature Extraction and Multipath Routing in Delay Tolerant Networks

Authors: S. Balaji, M. Rajaram, Y. Harold Robinson, E. Golden Julie

Abstract:

Delay Tolerant Networks (DTNs) that have sufficient state information, including trajectory and contact information, can protect routing efficiency. However, state information is dynamic and hard to obtain without a global and/or long-term collection process. To deal with these problems, the internal social features of each node are used to perform the routing process. This type of application is motivated by several human contact networks, where people contact each other more frequently if they have more social features in common. Two processes are developed for this purpose: social feature extraction and multipath routing. The routing method then becomes a hypercube-based feature matching process. Furthermore, the effectiveness of multipath routing is evaluated and compared to that of single-path routing.

Keywords: Delay tolerant networks, entropy, human contact networks, hypercubes, multipath routing, social features.
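
One plausible reading of "hypercube-based feature matching" is sketched below: each node's agreement with the destination on a set of social features is encoded as a corner of a binary hypercube, and the relay closest (in Hamming distance) to full agreement is preferred as next hop. The feature set, profiles, and forwarding rule are illustrative assumptions, not the authors' algorithm.

```python
def feature_vector(profile, feature_keys, destination):
    """Encode which social features a node shares with the destination (a hypercube corner)."""
    return tuple(int(profile.get(k) == destination.get(k)) for k in feature_keys)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Hypothetical profiles: prefer the relay sharing the most features with the destination,
# i.e. the corner closest (in Hamming distance) to the all-ones corner.
features = ["city", "workplace", "language"]
destination = {"city": "Oslo", "workplace": "uni", "language": "en"}
neighbors = {
    "n1": {"city": "Oslo", "workplace": "lab", "language": "en"},
    "n2": {"city": "Rome", "workplace": "uni", "language": "it"},
}
target = (1,) * len(features)
best = min(neighbors,
           key=lambda n: hamming(feature_vector(neighbors[n], features, destination), target))
print("forward to", best)
```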

5550 The Ethics of Dissent: The Case of David Kelly

Authors: A. Kayes, D. Christopher Kayes

Abstract:

In this paper, we rely on the story of the late British weapons inspector David Kelly to illustrate how sensemaking can inform the study of the ethics of the suppression of dissent. Using archival data, we reconstruct Dr. Kelly's key responsibilities as a weapons inspector and government employee. We begin by clarifying the concept of dissent and how it is a useful organizational process. We identify the various ways that dissent has been discussed in the organizational literature and reconsider the process of sensemaking. We conclude that the suppression of opinions that deviate from the majority is part of the identity maintenance of the sensemaking process, and we illustrate that the prevention of dissent in organizations consists of a set of unsatisfactory trade-offs.

Keywords: ethics, dissent, suppression, sensemaking

5549 Elaboration and Optimization of Pellets Used for Precise Glass Grinding

Authors: N. Belkhir, A. Chorfa, D. Bouzid

Abstract:

In this work, grinding or micro-cutting tools in the form of pellets were manufactured using bonded alumina abrasive grains. The bond used is a vitreous material containing quartz, feldspars, kaolinite, and a quantity of hematite. The pellets were used in a glass grinding process to replace the free-abrasive-grain lapping process. The elaborated pellets were studied to determine their effectiveness in the grinding process and to optimize the influence of the pellet elaboration parameters. The results show the existence of an optimal combination of the pellet elaboration parameters for each glass grinding phase (coarse to fine grinding). The final roughness (rms) reached by the elaborated pellets on a BK7 glass surface was about 0.392 μm.

Keywords: Abrasive grain, glass, grinding, pellet.

5548 The Effects of Negative Electronic Word-of-Mouth and Webcare on Thai Online Consumer Behavior

Authors: Pongsatorn Tantrabundit, Lersak Phothong, Ong-art Chanprasitchai

Abstract:

Due to the emergence of the Internet, traditional Word-of-Mouth (WOM) has been extended into a new form called Electronic Word-of-Mouth (eWOM). Unlike traditional WOM, eWOM is able to present information in various ways by applying different components, and each eWOM component generates different effects on online consumer behavior. This research investigates the effects of Webcare (responding messages) from product/service providers on negative eWOM for two types of products (search and experience). The proposed conceptual model was developed by combining the stages of the consumer decision-making process, the theory of reasoned action (TRA), the theory of planned behavior (TPB), the technology acceptance model (TAM), information integration theory, and the elaboration likelihood model. The methodological techniques used in this study included multivariate analysis of variance (MANOVA) and multiple regression analysis. The results suggest that Webcare slightly increases Thai online consumers' perceptions of eWOM trustworthiness, information diagnosticity, and quality. For negative eWOM, we also found that perceived eWOM trustworthiness, diagnosticity, and quality have a positive relationship with eWOM influence, whereas perceived valence has a negative relationship with eWOM influence among Thai online consumers.

Keywords:

5547 Improving the Decision-Making Process and Transparency of Corporate Governance Using XBRL

Authors: Claudiu Brandas

Abstract:

Several recent studies have shown that the transparency of financial reporting has a significant influence on investors' decisions. Thus, regulation authorities and professional organizations (such as IFAC) have emphasized the role of XBRL (eXtensible Business Reporting Language) and interactive data as a means of promoting transparency and monitoring corporate reporting. In this context, the objective of this paper is to analyze interactive reporting through XBRL and its use as a support for decision making in corporate governance, namely the potential of interactive XBRL reports to increase the transparency and monitoring of corporate governance.

Keywords: Corporate Governance, decision, financial reporting, transparency, XBRL.

5546 Development of the Structure of the Knowledgebase for Countermeasures in the Knowledge Acquisition Process for Trouble Prediction in Healthcare Processes

Authors: Shogo Kato, Daisuke Okamoto, Satoko Tsuru, Yoshinori Iizuka, Ryoko Shimono

Abstract:

Healthcare safety is perceived as important, and preventing troubles in healthcare processes is essential to achieving it. Trouble prevention is based on trouble prediction, which uses accumulated knowledge on processes, troubles, and countermeasures. However, information on troubles has not been accumulated in hospitals in an appropriate structure, and it has not been utilized effectively to prevent troubles. In a previous study, a detailed knowledge acquisition process for trouble prediction was proposed; however, the knowledgebase for countermeasures was not included. In this paper, we aim to propose the structure of the knowledgebase for countermeasures within the knowledge acquisition process for trouble prediction in healthcare processes. We first design the structure of countermeasures and propose a knowledge representation form for countermeasures. We then evaluate the validity of the proposal by applying it in an actual hospital.

Keywords: Trouble prevention, knowledge structure, structured knowledge, reusable knowledge.

5545 Latent Topic Based Medical Data Classification

Authors: Jian-hua Yeh, Shi-yi Kuo

Abstract:

This paper discusses the classification process for medical data, using the data from ACM KDD Cup 2008 to demonstrate a classification process based on latent topic discovery. In this data set, the target set and the outliers are quite different in nature: the target set makes up only 0.6% of the data, while the outliers constitute the remaining 99.4%. We use this data set as an example to show how we dealt with an extremely biased data set using latent topic discovery and noise reduction techniques. The experiment faces two major challenges: (1) extremely distributed outliers, and (2) far fewer positive samples than negative ones. We propose a suitable process flow to deal with these issues and obtain a best AUC result of 0.98.

Keywords: classification, latent topics, outlier adjustment, feature scaling
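
A minimal sketch of the general pipeline described (latent topic features feeding a classifier evaluated with AUC on heavily imbalanced data), using scikit-learn components as stand-ins for the authors' own latent topic discovery and noise reduction steps; the synthetic data only mimics the roughly 0.6% positive rate.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic count features with roughly 0.6% positives, mimicking the extreme imbalance.
rng = np.random.default_rng(0)
X = rng.poisson(1.0, size=(5000, 50))
y = (rng.random(5000) < 0.006).astype(int)
X[y == 1, :5] += 3                      # give the rare positives a weak signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Latent topic features (a stand-in for the paper's latent topic discovery step).
topics = LatentDirichletAllocation(n_components=10, random_state=0)
Z_tr = topics.fit_transform(X_tr)
Z_te = topics.transform(X_te)

# Class weighting as a simple counterweight to the biased class distribution.
clf = LogisticRegression(class_weight="balanced", max_iter=1000).fit(Z_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(Z_te)[:, 1]))
```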

5544 Computer Aided Design of Reshaping Process of Circular Pipes into Square Pipes

Authors: Parviz Alinezhad, Ali Sanati, Koorosh Naser Momtahen

Abstract:

Square pipes (pipes with square cross sections) are used for various industrial purposes, such as machine structure components and housing/building elements, and their utilization is expanding rapidly and widely. Hence, the output of these pipes is increasing and new application fields are continually developing. Due to various recent demands, the products have to satisfy difficult specifications with high dimensional accuracy. The design of the reshaping process for pipes with square cross sections, however, is performed by trial and error, based on expert experience. In this paper, a computer-aided simulation is developed based on the 2-D elastic-plastic method, with consideration of shear deformation, to analyze the reshaping process. The effect of various parameters, such as the diameter of the circular pipe and the mechanical properties of the metal, on product dimensions and quality can be evaluated using this simulation. Moreover, the design of the reshaping process, including the determination of the shrinkage of the cross section, the necessary number of stands, the radius of the rolls, and the height of the pipe at each stand, is investigated. Further, it is shown that there is good agreement between the results of the design method and the experimental results.

Keywords: Circular Pipes, Square Pipes, Shear Deformation, Reshaping Process, Numerical Simulation.

5543 The Composting Process from a Waste Management Method to a Remediation Procedure

Authors: G. Petruzzelli, F. Pedron, M. Grifoni, F. Gorini, I. Rosellini, B. Pezzarossa

Abstract:

Composting is a controlled technology that enhances the natural aerobic degradation of organic wastes. The resulting product is a humified material that is principally recyclable for agricultural purposes. Under European Community legislation, the composting process is one of the most important tools for waste management. In recent years, composting has been increasingly used as a remediation technology to remove biodegradable contaminants from soil and to modulate heavy metal bioavailability in phytoremediation strategies. Optimizing the recovery of resources from wastes through composting could enhance soil fertility and promote its use in the remediation biotechnologies of contaminated soils.

Keywords: Agriculture, biopile, compost, soil clean-up, waste recycling.

5542 Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring

Authors: Youngji Yoo, Cheong-Sool Park, Jun Seok Kim, Young-Hak Lee, Sung-Shick Kim, Jun-Geol Baek

Abstract:

In the semiconductor manufacturing process, large amounts of data are collected from various sensors in multiple facilities. The data collected from the sensors have several different characteristics due to variables such as product type, preceding processes, and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data in order to detect out-of-control states of processes. Because the collected data have different characteristics, using them directly as inputs to SQC increases the variation of the data, requires wide control limits, and decreases the ability to detect out-of-control states. Therefore, it is necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree that uses a split algorithm based on the Pearson distribution system to handle non-normal distributions in a parametric way. The regression tree finds similar properties of the data across different variables. Experiments using real semiconductor manufacturing process data show improved fault detection performance.

Keywords: Semiconductor, non-normal mixed process data, clustering, Statistical Quality Control (SQC), regression tree, Pearson distribution system.
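
The authors' split criterion is based on the Pearson distribution system; as a loose stand-in only, the sketch below scores candidate splits with a simple moment-based non-normality measure (squared skewness plus squared excess kurtosis). It illustrates the idea of splitting mixed data into more homogeneous, distribution-wise groups, without reproducing the actual Pearson-system test.

```python
import numpy as np
from scipy import stats

def nonnormality(x):
    """Toy criterion: squared skewness + squared excess kurtosis (near zero for normal-looking data)."""
    return stats.skew(x) ** 2 + stats.kurtosis(x) ** 2

def best_split(sensor, context):
    """Pick the context variable whose grouping leaves the most normal-looking child groups."""
    scores = {}
    for var, labels in context.items():
        labels = np.asarray(labels)
        groups = [sensor[labels == c] for c in np.unique(labels)]
        scores[var] = sum(len(g) * nonnormality(g) for g in groups) / len(sensor)
    return min(scores, key=scores.get), scores

# Hypothetical readings mixing two recipes with different means (non-normal when pooled).
rng = np.random.default_rng(1)
sensor = np.concatenate([rng.normal(0, 1, 200), rng.normal(4, 1, 200)])
context = {
    "recipe": ["A"] * 200 + ["B"] * 200,   # explains the mixture
    "shift": ["day", "night"] * 200,       # unrelated to the mixture
}
print(best_split(sensor, context))
```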

5541 A Framework for Product Development Process including HW and SW Components

Authors: Namchul Do, Gyeongseok Chae

Abstract:

This paper proposes a framework for product development that includes both hardware and software components. It provides for the separation of hardware-dependent software, modifications to the current product development process, and the integration of software modules with existing product configuration models and assembly product structures. In order to identify the dependent software, the framework considers product configuration modules and engineering changes of the associated software and hardware components. To support efficient integration of the two different hardware and software development activities, a modified product development process is proposed; the process integrates the dependent software development into product development through exchanges of specific product information. By using existing product data models in Product Data Management (PDM), the framework represents software as modules for product configurations and as software parts for the product structure. The framework is applied to the development of a robot system in order to show its effectiveness.

Keywords: HW and SW Development Integration, Product Development with Software.

5538 Impact of Process Parameters on Tensile Strength of Fused Deposition Modeling Printed Crisscross Polylactic Acid

Authors: Shilpesh R. Rajpurohit, Harshit K. Dave

Abstract:

Additive manufacturing has gained popularity in recent times due to its capability to create prototypes as well as functional end-use products directly from CAD data, without any specific tooling requirement. Fused deposition modeling (FDM) is one of the widely used additive manufacturing techniques for creating functional end-use polymer parts that are comparable with injection-molded parts. FDM-printed parts have applications in various fields such as automobile, aerospace, medical, and electronics. However, the application of FDM parts is greatly limited by their poor mechanical properties, and proper selection of the process parameters could enhance the mechanical performance of the printed part. In the present study, an experimental investigation has been carried out to study the mechanical performance of the printed part with respect to the process variables. Three process variables, viz. raster angle, raster width, and layer height, have been varied to understand their effect on tensile strength. Further, the effect of the process variables on the fractured surface has also been investigated.

Keywords: 3D printing, fused deposition modeling, layer height, raster angle, raster width, tensile strength.

5539 Location Update Cost Analysis of Mobile IPv6 Protocols

Authors: Brahmjit Singh

Abstract:

Mobile IP has been developed to provide continuous information network access to mobile users. In IP-based mobile networks, location management is an important component of mobility management: it enables the system to track the location of a mobile node between consecutive communications and includes two important tasks, location update and call delivery. Location update is associated with signaling load, and frequent updates lead to degradation of the overall network performance and underutilization of resources. It is, therefore, necessary to devise mechanisms that minimize the update rate. Mobile IPv6 (MIPv6) and Hierarchical MIPv6 (HMIPv6) have been the main candidates for deployment in mobile IP networks for mobility management, and studies have shown HMIPv6 to perform better than MIPv6 because it reduces the signaling overhead traffic by making the registration process local. In this paper, we present a performance analysis of MIPv6 and HMIPv6 using an analytical model. A location update cost function is formulated based on the fluid flow mobility model, and the impact of cell residence time, cell residence probability, and user mobility is investigated. Numerical results are obtained and presented in graphical form. It is shown that HMIPv6 outperforms MIPv6 only for high-mobility users; for low-mobility users, the performance of the two schemes is almost equivalent.

Keywords: Wireless networks, Mobile IP networks, Mobility management, performance analysis, Handover.
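
To indicate the kind of cost function such an analysis builds on, the snippet below uses the standard fluid-flow estimate of the cell boundary crossing rate and compares a flat, MIPv6-like cost (every crossing triggers a home registration) with a two-level, HMIPv6-like cost (local registrations, with home updates only on domain crossings). The unit costs and cell geometry are assumed values, not the paper's model; which scheme wins depends on those relative costs, which is exactly what the paper's analysis examines.

```python
import math

def crossing_rate(speed, perimeter, area):
    """Fluid-flow model: boundary crossing rate = v * L / (pi * S)."""
    return speed * perimeter / (math.pi * area)

def update_costs(speed, cell_radius, cells_per_domain, c_home=10.0, c_local=2.0):
    """Per-unit-time signaling cost: the flat scheme registers every cell crossing at the
    home agent; the hierarchical scheme registers locally and contacts the home agent
    only on domain crossings. Unit costs c_home and c_local are illustrative assumptions."""
    cell_rate = crossing_rate(speed, 2 * math.pi * cell_radius, math.pi * cell_radius ** 2)
    domain_radius = cell_radius * math.sqrt(cells_per_domain)   # domain modeled as a larger circle
    domain_rate = crossing_rate(speed, 2 * math.pi * domain_radius, math.pi * domain_radius ** 2)
    flat = cell_rate * c_home
    hierarchical = cell_rate * c_local + domain_rate * c_home
    return flat, hierarchical

for v in (1, 5, 20):   # mean speeds in arbitrary units
    flat, hier = update_costs(speed=v, cell_radius=1.0, cells_per_domain=16)
    print(f"speed={v}: flat={flat:.2f}, hierarchical={hier:.2f}")
```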

5538 QoS Routing in Wired Sensor Networks with Partial Updates

Authors: Arijit Ghos, Tony Gigargis

Abstract:

QoS routing is an important component of traffic engineering in networks that provide QoS guarantees. QoS routing depends on link state information, which is typically flooded across the network; this affects both the quality of the routing and the utilization of network resources. In this paper, we examine establishing QoS routes with partial state updates in wired sensor networks.

Keywords:

5537 Six Sigma-Based Optimization of Shrinkage Accuracy in Injection Molding Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to reach the desired shrinkage of a manufactured high-density polyethylene (HDPE) part produced by an injection molding machine. It presents a case study in which the correct shrinkage is required in order to reduce or eliminate defects and to improve the process capability indices Cp and Cpk of the injection molding process. To improve this process and keep the product within specifications, the Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was implemented in this study. The Six Sigma approach was paired with the Taguchi methodology to identify the optimized processing parameters that keep the shrinkage rate within the specifications set by the customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors consist of the cooling time, melt temperature, holding time, and metering stroke; the noise factor is the difference between material brand 1 and material brand 2. Measurements from the confirmation run verified that the new parameter settings are optimal, and with the new settings the process capability index improved dramatically. This study shows that the Six Sigma and Taguchi methodologies can be efficiently used to determine the important factors that improve the process capability index of the injection molding process.

Keywords: Injection molding, shrinkage, six sigma, Taguchi parameter design.

5536 Designing Information Systems in Education as Prerequisite for Successful Management Results

Authors: Vladimir Simovic, Matija Varga, Tonco Marusic

Abstract:

This research paper presents matrix technology models and examples of information systems in education (in the Republic of Croatia and in Germany) in support of business, education (learning and teaching), and e-learning. We researched and described the aims and objectives of the main processes in education and technology, together with the main matrix classes of data. The paper gives an example of matrix technology with a detailed description of the processes related to specific data classes in the processes of education, and an example module that supports the processes 'filling in the directory and the diary of work' and 'evaluation'. At the lower level, we also researched and described all activities that take place within the lower-level processes in education, as well as the characteristics and functioning of the modules 'fill in the directory and the diary of work' and 'evaluation'. For the analysis of the affinity between the aforementioned processes and/or sub-processes, we used our application model created in Visual Basic, which is based on an algorithm for analyzing the affinity between the observed processes and/or sub-processes.

Keywords: Designing, education management, information systems, matrix technology, process affinity.

5535 Solving Single Machine Total Weighted Tardiness Problem Using Gaussian Process Regression

Authors: Wanatchapong Kongkaew

Abstract:

This paper proposes an application of a probabilistic technique, namely Gaussian process regression, for estimating an optimal sequence for the single machine total weighted tardiness (SMTWT) scheduling problem. In this work, the Gaussian process regression (GPR) model is utilized to predict an optimal sequence for the SMTWT problem, and its solution is improved by using an iterated local search based on a simulated annealing scheme, called the GPRISA algorithm. The results show that the proposed GPRISA method achieves very good performance and a reasonable trade-off between solution quality and time consumption. Moreover, in terms of deviation from the best-known solution, the proposed mechanism noticeably outperforms recently existing approaches.

Keywords: Gaussian process regression, iterated local search, simulated annealing, single machine total weighted tardiness.
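
To fix terminology, the sketch below computes the total weighted tardiness of a job sequence and applies a plain simulated-annealing swap search to improve a starting sequence; it covers only the objective and the local-search stage, not the GPR prediction step, and the job data are invented.

```python
import math, random

def total_weighted_tardiness(seq, proc, due, weight):
    """Sum of w_j * max(0, C_j - d_j) when jobs are processed in the given order."""
    t = twt = 0
    for j in seq:
        t += proc[j]
        twt += weight[j] * max(0, t - due[j])
    return twt

def sa_improve(seq, proc, due, weight, iters=5000, temp=10.0, alpha=0.999):
    """Simple swap-neighborhood simulated annealing starting from `seq`."""
    random.seed(0)
    cur, cur_val = list(seq), total_weighted_tardiness(seq, proc, due, weight)
    best, best_val = list(cur), cur_val
    for _ in range(iters):
        i, j = random.sample(range(len(cur)), 2)
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]
        val = total_weighted_tardiness(cand, proc, due, weight)
        if val < cur_val or random.random() < math.exp((cur_val - val) / temp):
            cur, cur_val = cand, val
            if val < best_val:
                best, best_val = cand, val
        temp *= alpha
    return best, best_val

# Hypothetical 6-job instance: processing times, due dates and weights.
proc = [4, 2, 6, 3, 5, 1]
due = [6, 4, 15, 8, 10, 3]
weight = [3, 1, 2, 4, 2, 5]
print(sa_improve(list(range(6)), proc, due, weight))
```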

5534 An Evaluation of Average Run Length of MaxEWMA and MaxGWMA Control Charts

Authors: S. Phanyaem

Abstract:

The exponentially weighted moving average (EWMA) control chart is a popular chart used in quality control for detecting shifts in the mean of a process parameter. The objective of this paper is to compare the efficiency of control charts in detecting an increase in the process mean. In particular, we compare the Maximum Exponentially Weighted Moving Average (MaxEWMA) and Maximum Generally Weighted Moving Average (MaxGWMA) control charts when the observations follow an exponential distribution. The criterion used to evaluate the performance of a control chart is the Average Run Length (ARL). The comparison shows that, for small sample sizes, the MaxEWMA control chart is more efficient at detecting shifts in the process mean than the MaxGWMA control chart. For large sample sizes, the MaxEWMA control chart is more sensitive in detecting small shifts in the process mean than the MaxGWMA chart, whereas for large shifts in the mean the MaxGWMA control chart is more sensitive than the MaxEWMA chart.

Keywords: Maximum Exponentially Weighted Moving Average, Maximum General Weighted Moving Average, Average Run Length.
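
For readers unfamiliar with ARL, the snippet below estimates the average run length of a basic one-sided EWMA chart by Monte Carlo simulation with exponential observations; the chart constants are illustrative, and the MaxEWMA/MaxGWMA statistics compared in the paper are not reproduced here.

```python
import numpy as np

def ewma_run_length(mean_shift=0.0, lam=0.1, L=2.7, n_max=100_000, rng=None):
    """Simulate one EWMA sequence on exponential data and return the first signal time."""
    if rng is None:
        rng = np.random.default_rng()
    mu0, sigma0 = 1.0, 1.0                              # in-control exponential mean and std
    ucl = mu0 + L * sigma0 * np.sqrt(lam / (2 - lam))   # asymptotic upper control limit
    z = mu0
    for t in range(1, n_max + 1):
        x = rng.exponential(mu0 + mean_shift)           # shifted process mean when mean_shift > 0
        z = lam * x + (1 - lam) * z
        if z > ucl:
            return t
    return n_max

def average_run_length(mean_shift, reps=500, seed=0):
    rng = np.random.default_rng(seed)
    return np.mean([ewma_run_length(mean_shift, rng=rng) for _ in range(reps)])

print("ARL0 (no shift):  ", average_run_length(0.0))
print("ARL1 (mean +50%): ", average_run_length(0.5))
```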

5533 Theoretical and Experimental Analysis of Hard Material Machining

Authors: Rajaram Kr. Gupta, Bhupendra Kumar, T. V. K. Gupta, D. S. Ramteke

Abstract:

Machining of hard materials is a recent technology for the direct production of work-pieces. The primary challenge in machining these materials is the selection of cutting tool inserts that facilitate an extended tool life and high-precision machining of the component. These materials are widely used for making precision parts for the aerospace industry. Nickel-based alloys are typically used in extreme-environment applications where a combination of strength, corrosion resistance, and oxidation resistance is required. The present paper reports the theoretical and experimental investigations carried out to understand the influence of the machining parameters on the response parameters. Considering the basic machining parameters (speed, feed, and depth of cut), a study has been conducted to observe their influence on material removal rate, surface roughness, cutting forces, and the corresponding tool wear. Experiments were designed and conducted with the help of the Central Composite Rotatable Design technique. The results reveal that, for the given range of process parameters, higher depths of cut favor the material removal rate and low feed rates are favorable with respect to cutting forces, while low feed rates and high rotational speeds are suitable for a better finish and longer tool life.

Keywords: Speed, feed, depth of cut, roughness, cutting force, flank wear.

5532 Approximate Frequent Pattern Discovery Over Data Stream

Authors: Kittisak Kerdprasop, Nittaya Kerdprasop

Abstract:

Frequent pattern discovery over a data stream is a hard problem because the continuously generated nature of a stream does not allow revisiting each data element. Furthermore, the pattern discovery process must be fast in order to produce timely results. Based on these requirements, we propose an approximate approach to tackle the problem of discovering frequent patterns over a continuous stream. Our approximation algorithm is intended to be applied to process a stream prior to the pattern discovery process. The results of the approximate frequent pattern discovery are reported in the paper.

Keywords: Frequent pattern discovery, Approximate algorithm, Data stream analysis.
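
The abstract does not name the specific approximation used; as one concrete member of this family, here is a compact version of the well-known lossy counting scheme for frequent items over a stream, which bounds the counting error by a user-chosen ε while keeping memory bounded.

```python
def lossy_count(stream, epsilon=0.01, support=0.3):
    """Approximate frequent items (lossy counting): every item with true frequency
    >= support * N is reported, and counts are under-estimated by at most epsilon * N."""
    bucket_width = int(1 / epsilon)
    counts, deltas = {}, {}
    n = 0
    for item in stream:
        n += 1
        bucket = (n - 1) // bucket_width + 1        # current bucket id, ceil(n / width)
        if item in counts:
            counts[item] += 1
        else:
            counts[item] = 1
            deltas[item] = bucket - 1               # maximum possible undercount so far
        if n % bucket_width == 0:                   # prune entries that cannot be frequent
            for key in [k for k in counts if counts[k] + deltas[k] <= bucket]:
                del counts[key], deltas[key]
    return {k: v for k, v in counts.items() if v >= (support - epsilon) * n}

# Usage on a hypothetical stream of items.
stream = ["a"] * 400 + ["b"] * 100 + list("cdefghij") * 50
print(lossy_count(stream, epsilon=0.01, support=0.3))
```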

5531 Evaluation of Risks in New Product Innovation

Authors: Emre Alptekin, Damla Yalçınyiğit, Gülfem Alptekin

Abstract:

In highly competitive environments, a growing number of companies must regularly launch new products speedily and successfully. A company's success is based on a systematic, conscious product design method that meets the market requirements and takes risks as well as resources into consideration. Research has found that developing and launching new products are inherently risky endeavors. Hence, in this research we aim at introducing a risk evaluation framework for the new product innovation process. Our framework is based on the fuzzy analytic hierarchy process (FAHP) methodology. We have applied all the stages of the framework to the risk evaluation process of a pharmaceuticals company.

Keywords: Evaluation, risks, product innovation.
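
To indicate what an FAHP calculation looks like, the sketch below derives crisp criterion weights from a small triangular-fuzzy pairwise comparison matrix using the geometric-mean (Buckley-style) method with centroid defuzzification; the comparison values are invented, and the authors' exact FAHP variant may differ.

```python
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) for three hypothetical risk criteria.
# Row i vs. column j; the diagonal is (1, 1, 1) and lower entries are reciprocals.
M = [
    [(1, 1, 1),         (2, 3, 4),       (4, 5, 6)],
    [(1/4, 1/3, 1/2),   (1, 1, 1),       (1, 2, 3)],
    [(1/6, 1/5, 1/4),   (1/3, 1/2, 1),   (1, 1, 1)],
]
n = len(M)

# Fuzzy geometric mean of each row, computed component-wise (Buckley's method).
geo = [tuple(np.prod([M[i][j][k] for j in range(n)]) ** (1 / n) for k in range(3))
       for i in range(n)]

# Fuzzy weights: divide each geometric mean by the total, reversing l/u in the divisor.
total = tuple(sum(g[k] for g in geo) for k in range(3))
fuzzy_w = [(g[0] / total[2], g[1] / total[1], g[2] / total[0]) for g in geo]

# Centroid defuzzification and normalization to crisp weights.
crisp = np.array([(l + m + u) / 3 for l, m, u in fuzzy_w])
weights = crisp / crisp.sum()
print({f"risk criterion {i + 1}": round(w, 3) for i, w in enumerate(weights)})
```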

5530 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel

Authors: F. M. Pisano, M. Ciminello

Abstract:

Aerospace, mechanical, and civil engineering infrastructures can take advantage of damage detection and identification strategies, in terms of maintenance cost reduction and operational life improvement, as well as for safety purposes. The challenge is to detect so-called "barely visible impact damage" (BVID), due to low/medium energy impacts, which can progressively compromise the structural integrity. The occurrence of any local change in material properties that can degrade the structural performance is monitored using Structural Health Monitoring (SHM) systems, which compare the structure's states before and after damage occurs. SHM looks for any "anomalous" response collected by means of sensor networks and then analyzed using appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables, and graphs describing possible outlier coordinates and damage severity are usually provided as artifacts to be elaborated for information extraction about the current health condition of the structure under investigation. Visual Analytics can support the processing of monitored measurements by offering data navigation and exploration tools that leverage the native human capability of understanding images faster than texts and tables. Herein, the enrichment of an SHM system by integrating a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, so that a useful Visual Analytics tool is provided to structural analysts for exploring the structural health conditions examined by a Principal Component Analysis based algorithm.

Keywords: Interactive dashboards, optical fibers, structural health monitoring, visual analytics.
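
A minimal sketch of the kind of PCA-based outlier scoring such a dashboard would sit on top of: baseline (healthy) sensor readings define a principal subspace, and a new reading is flagged when its reconstruction error exceeds a threshold learned from the baseline. The sensor count, threshold, and data are assumptions, not the project's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Healthy baseline: a few latent load patterns drive 12 hypothetical fiber-optic sensors, plus noise.
latent = rng.normal(0, 1, size=(500, 3))
mixing = rng.normal(0, 1, size=(3, 12))
baseline = latent @ mixing + 0.05 * rng.normal(0, 1, size=(500, 12))

# Principal subspace fitted on the healthy data.
mean = baseline.mean(axis=0)
_, _, vt = np.linalg.svd(baseline - mean, full_matrices=False)
P = vt[:4].T                                  # keep 4 components (12 x 4)

def q_statistic(x):
    """Squared reconstruction (residual) error of a reading outside the principal subspace."""
    r = (x - mean) - P @ (P.T @ (x - mean))
    return float(r @ r)

# Alarm threshold: e.g. the 99th percentile of Q over the healthy baseline.
threshold = np.percentile([q_statistic(row) for row in baseline], 99)

new_reading = baseline[0] + np.array([0, 0, 0, 3, 3, 0, 0, 0, 0, 0, 0, 0])  # local anomaly
q = q_statistic(new_reading)
print(f"Q = {q:.2f}, threshold = {threshold:.2f} ->",
      "damage suspected" if q > threshold else "healthy")
```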
