Search results for: Measurement Process
5639 Improving the Software Homologation Process through Peer Review: An Experience Report on Android Development Environment
Authors: Camila Bernardon, Diana Lemos, Mario Garcia, Thiago Souto, Bruno Bonifacio
Abstract:
In the current technological market, ensuring the quality of new products has become a complex challenge. In this scenario, companies have been investing in solutions that aim to reduce software testing execution time and improve cost efficiency. However, companies with complex and specialized testing environments usually face barriers related to costly testing processes, especially in distributed settings. The Sidia Institute of Technology works on research and development for the Android platform for mobile devices in Latin America. Because we work in a global software development (GSD) scope, we have faced barriers caused by failures detected late, which have delayed the homologation release process in Android projects. We therefore adopted an Internal Review process as an alternative to reduce these failures. This paper presents the experience of a homologation team adopting an Internal Review process in order to increase performance by improving test efficiency. Using this approach, we achieved a substantial improvement in the quality, reliability and timeliness of our deliveries. Through quantitative analysis, we identified a 6% increase in homologation efficiency after adoption of the process. In addition, we performed a qualitative analysis of data collected through an online questionnaire. In particular, the results show that the association between failure reduction and adoption of the review process yields higher quality, which has a positive effect on project milestones. We hope this report can help other companies and the scientific community improve their processes and thereby increase their competitive advantages.
Keywords: Android, GSD, improvement quality process, mobile products.
5638 Application of Fuzzy Logic Approach for an Aircraft Model with and without Winglet
Authors: Altab Hossain, Ataur Rahman, Jakir Hossen, A.K.M. P. Iqbal, SK. Hasan
Abstract:
The measurement of aerodynamic forces and moments acting on an aircraft model is important for the development of wind tunnel measurement technology to predict the performance of the full-scale vehicle. The potential of an aircraft model with and without winglet, and its aerodynamic characteristics with NACA wing No. 65-3-218, have been studied using the subsonic wind tunnel of 1 m × 1 m rectangular test section and 2.5 m length at the Aerodynamics Laboratory, Faculty of Engineering, University Putra Malaysia. Focusing on the aerodynamic characteristics of the aircraft model, two main issues are studied in this paper. First, a six-component wind tunnel external balance is used to measure lift, drag and pitching moment. Secondly, tests are conducted on the aircraft model with and without winglet in two configurations at Reynolds numbers 1.7×10^5, 2.1×10^5, and 2.5×10^5 for different angles of attack. The fuzzy logic approach is found to be efficient for the representation, manipulation and utilization of aerodynamic characteristics. Therefore, the primary purpose of this work was to investigate the relationship of the lift and drag coefficients with free-stream velocity and angle of attack, and to illustrate how fuzzy logic might play an important role in the study of the lift characteristics of an aircraft model with the addition of certain winglet configurations. Results of the developed fuzzy logic system were compared with the experimental results. For the lift coefficient analysis, the means of the actual and predicted values were 0.62 and 0.60, respectively. The correlation between the actual and predicted values (from the FLS model) of the lift coefficient at different angles of attack was found to be 0.99. The mean relative error between actual and predicted values was 5.18% for the velocity of 26.36 m/s, which is less than the acceptable limit (10%). The goodness of fit of the predicted values was 0.95, which is close to 1.0.
Keywords: Wind tunnel, winglet, lift coefficient, fuzzy logic.
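To make the reported error measures concrete, the following minimal Python sketch computes the mean relative error, correlation, and goodness of fit (R^2) between actual and predicted lift coefficients; the arrays are hypothetical placeholders, not the paper's data.

```python
import numpy as np

# Hypothetical actual vs. FLS-predicted lift coefficients at several angles of attack
cl_actual = np.array([0.15, 0.32, 0.48, 0.63, 0.78, 0.91])
cl_predicted = np.array([0.16, 0.30, 0.47, 0.61, 0.75, 0.88])

# Mean relative error (%), the quantity judged against the 10% acceptability limit
mre = np.mean(np.abs(cl_actual - cl_predicted) / np.abs(cl_actual)) * 100

# Correlation between actual and predicted values
corr = np.corrcoef(cl_actual, cl_predicted)[0, 1]

# Goodness of fit (coefficient of determination, R^2)
ss_res = np.sum((cl_actual - cl_predicted) ** 2)
ss_tot = np.sum((cl_actual - np.mean(cl_actual)) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"mean relative error = {mre:.2f}%, correlation = {corr:.3f}, R^2 = {r2:.3f}")
```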
5637 Device for 3D Analysis of Basic Movements of the Lower Extremity
Authors: Jiménez Villanueva Mayra Alejandra, Ortíz Casallas Diana Carolina, Luengas Contreras Lely Adriana
Abstract:
This document details the process of developing a wireless device that captures the basic movements of the foot (plantar flexion, dorsal flexion, abduction, adduction) and the knee (flexion). It implements a motion capture system using hardware based on optical fiber sensors, chosen for their advantages in terms of range, noise immunity, and speed of data transmission and reception. The operating principle of this system is the detection and transmission of joint movement by mechanical elements and its measurement by optical ones (in this case, infrared). Likewise, Visual Basic software is used for the reception, analysis and processing of the data acquired by the device, generating a 3D graphical representation of each movement in real time. The result is a boot that captures the movement, a transmission module (implementing XBee technology) and a receiver module that receives the information and sends it to the PC for processing. The main aim of this device is to support fields such as bioengineering and medicine by helping to improve quality of life and movement analysis.
Keywords: Abduction, adduction, A/D converter, Autodesk 3DMax, infrared diode, driver, extension, flexion, infrared LEDs, interface, OpenGL modeling, optical fiber, USB CDC (Communications Device Class), virtual reality.
5636 Optimization of Process Parameters Affecting Spring-Back in V-Bending Process for High Strength Low Alloy Steel HSLA 420 Using FEA (HyperForm) and Taguchi Technique
Authors: Navajyoti Panda, R. S. Pawar
Abstract:
In this study, process parameters such as punch angle, die opening, grain direction, and pre-bend condition of the strip are investigated for deep drawing of high strength low alloy steel HSLA 420. The finite element method (FEM), in association with the Taguchi and analysis of variance (ANOVA) techniques, is used to investigate the degree of importance of the process parameters in the V-bending process for HSLA 420 and St12 grade materials. From the results, it is observed that punch angle has the major influence on spring-back. Die opening also plays a very significant role in spring-back. On the other hand, grain direction has the least impact on spring-back; however, a strip taken from flat sheet is less prone to spring-back than a strip taken from sheet metal coil. HyperForm software is used for the FEM simulation, and the experiments are designed using the Taguchi method. The percentage contribution of each parameter is obtained through the ANOVA technique.
Keywords: Bending, V-bending, FEM, spring-back, Taguchi, HyperForm, profile projector, HSLA 420 & St12 materials.
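As a rough illustration of how the Taguchi analysis ranks parameters, the sketch below computes the smaller-the-better signal-to-noise ratio for a hypothetical set of spring-back measurements and compares the main effects of two of the factors; the level assignments and values are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical spring-back (degrees) for a 9-run Taguchi design.
# Each array gives the level (1-3) of a factor for the corresponding run.
punch_angle = np.array([1, 1, 1, 2, 2, 2, 3, 3, 3])
die_opening = np.array([1, 2, 3, 1, 2, 3, 1, 2, 3])
springback  = np.array([2.4, 2.1, 1.9, 2.9, 2.6, 2.3, 3.4, 3.0, 2.8])

# Smaller-the-better signal-to-noise ratio per run: S/N = -10*log10(y^2)
sn = -10.0 * np.log10(springback ** 2)

# Main effect of a factor = mean S/N at each of its levels; a larger range
# across levels indicates a more influential factor (what ANOVA expresses
# as a percentage contribution).
for name, levels in [("punch angle", punch_angle), ("die opening", die_opening)]:
    effects = [sn[levels == lv].mean() for lv in (1, 2, 3)]
    print(name, "mean S/N by level:", np.round(effects, 2),
          "range:", round(max(effects) - min(effects), 2))
```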
5635 Development of a Wiki-based Feature Library for a Process Planning System
Authors: Hendry Muljadi, Hideaki Takeda, Koichi Ando
Abstract:
A manufacturing feature can be defined simply as a geometric shape together with the manufacturing information needed to create that shape. In a feature-based process planning system, the feature library plays an important role in the extraction of manufacturing features with their proper manufacturing information. However, to manage the manufacturing information flexibly, it is important to build a feature library that is easy to modify. In this paper, a Wiki-based feature library is proposed.
Keywords: Manufacturing feature, feature library, feature ontology, process planning, Wiki, MediaWiki.
5634 Membrane Distillation Process Modeling: Dynamical Approach
Authors: Fadi Eleiwi, Taous Meriem Laleg-Kirati
Abstract:
This paper presents a complete dynamic model of a membrane distillation process. The model consists of two consistent dynamic sub-models: a 2D advection-diffusion equation for the whole process and a modified heat equation for the membrane itself. The complete model describes the temperature diffusion phenomenon across the feed container, membrane, permeate container and the membrane boundary layers. It gives an online and complete temperature profile for each point in the domain. It explains the heat conduction and convection mechanisms that take place inside the process in terms of mathematical parameters, and justifies the process behavior during the transient and steady-state phases. The process can be monitored for any sudden change in performance at any instant of time. In addition, the model assists in maintaining production rates as desired and gives recommendations during the membrane fabrication stages. System performance and parameters can be optimized and controlled using this complete dynamic model. The evolution of the membrane boundary temperature with time, the vapor mass transfer along the process, and the temperature difference between the membrane boundary layers are depicted. Simulations were performed on the complete model with real membrane specifications. The plots show consistency between the 2D advection-diffusion model and the expected behavior of the system as well as the literature. The evolution of heat inside the membrane, from the transient response until the steady-state response, is illustrated for fixed and varying times.
Keywords: Membrane distillation, Dynamical modeling, Advection-diffusion equation, Thermal equilibrium, Heat equation.
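For reference, the standard textbook forms assumed here for the two sub-models are written below in LaTeX; the paper's actual boundary conditions, coefficients and membrane modification may differ.

```latex
% 2D advection-diffusion of temperature in the feed/permeate channels
\frac{\partial T}{\partial t}
  + u_x \frac{\partial T}{\partial x} + u_y \frac{\partial T}{\partial y}
  = \alpha \left( \frac{\partial^2 T}{\partial x^2}
                + \frac{\partial^2 T}{\partial y^2} \right)

% Heat equation inside the membrane; the source term q_v standing for the
% vapor-flux contribution is an assumed form of the authors' "modification"
\frac{\partial T_m}{\partial t}
  = \alpha_m \left( \frac{\partial^2 T_m}{\partial x^2}
                  + \frac{\partial^2 T_m}{\partial y^2} \right) + q_v
```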
5633 Work Structuring and the Feasibility of Application to Construction Projects in Vietnam
Authors: Viet-Hung Nguyen, Luh-Maan Chang
Abstract:
Design should be viewed concurrently in three ways: as transformation, flow and value generation. An innovative approach to solving design-related problems is integrated product-process design. As a foundation for a formal framework consisting of organizing principles and techniques, Work Structuring has been developed to guide integration efforts that enhance the development of operation and process design in alignment with product design. Construction projects in Vietnam are facing many delays and cost overruns, caused mostly by design-related problems. Better design management that integrates product and process design could resolve these problems. A questionnaire survey and in-depth interviews were used to investigate the feasibility of applying Work Structuring to construction projects in Vietnam. The purpose of this paper is to present the research results and to illustrate the possible problems and potential solutions when Work Structuring is implemented in construction projects in Vietnam.
Keywords: Integrated product-process design, Work Structuring, construction projects, Vietnam.
5632 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum must be improved to reduce or eliminate defects and to raise the process capability indices Cp and Cpk of the CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that achieve the target surface roughness specified by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable (noise) factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. A confirmation run with the optimal parameters verified that the new parameter settings are correct. The new settings also improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be used efficiently to phase out defects and improve the process capability index of the CNC milling process.
Keywords: CNC machining, Six Sigma, Surface roughness, Taguchi methodology.
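For readers unfamiliar with the capability indices mentioned above, the short sketch below shows the standard Cp/Cpk calculation on a set of hypothetical roughness measurements with assumed specification limits; it is not the study's data.

```python
import numpy as np

# Hypothetical surface roughness measurements (Ra, micrometers) after a confirmation run,
# with illustrative customer specification limits.
ra = np.array([1.58, 1.62, 1.55, 1.60, 1.57, 1.63, 1.59, 1.61])
usl, lsl = 2.0, 1.2   # assumed upper / lower specification limits

mean, sigma = ra.mean(), ra.std(ddof=1)

cp = (usl - lsl) / (6 * sigma)                   # potential capability (spread only)
cpk = min(usl - mean, mean - lsl) / (3 * sigma)  # capability accounting for centering

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```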
5631 Comparison of Numerical and Theoretical Friction Effect in the Wire Winding for Reinforced Structures with Wire Winding
Authors: Amer Ezoji, Mohammad Sedighi
Abstract:
In this article, the wire winding process used to reinforce a pressure vessel frame is studied. First, the importance of the wire winding method is explained. The main steps in the design process are the axial force control methodology and the wire winding process itself. The hot isostatic press and the wire winding process are introduced. Using the equilibrium condition between the pressure vessel and the frame, the stresses in the frame wires are analyzed. A case-study frame is examined to control the axial force in the hot isostatic press. The frame and its wires are simulated, and the friction effect and the effect of the wires on the elastic yoke are considered in the simulation model. The theoretical and simulated results are then compared, and the vessel pressure is applied to the frame to verify that the wound wires do not reach the yield point.
Keywords: Wire winding, Frame, stress, friction.
5630 A Hypercube Social Feature Extraction and Multipath Routing in Delay Tolerant Networks
Authors: S. Balaji, M. Rajaram, Y. Harold Robinson, E. Golden Julie
Abstract:
Delay Tolerant Networks (DTNs) require sufficient state information, including trajectory and contact information, to ensure routing efficiency. However, state information is dynamic and hard to obtain without a global and/or long-term collection process. To deal with these problems, the internal social features of each node are used to perform the routing process. This approach is motivated by several human contact networks, in which people contact each other more frequently if they have more social features in common. Two processes were developed for this purpose: social feature extraction and multipath routing. The routing method then becomes a hypercube-based feature matching process. Furthermore, the effectiveness of multipath routing is evaluated and compared to that of single-path routing.
Keywords: Delay tolerant networks, entropy, human contact networks, hyper cubes, multipath Routing, social features.
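A minimal sketch of the feature-matching idea, assuming a handful of hypothetical social features: each node sits at a corner of a feature hypercube, and the number of differing features acts as the distance used to rank relay candidates. The feature names and values below are invented, and the entropy-based feature selection described in the paper is not reproduced.

```python
# Hypothetical social feature profiles of two candidate relays and the destination.
node_a = {"city": "NYC", "affiliation": "univ1", "language": "en", "role": "staff"}
node_b = {"city": "NYC", "affiliation": "univ2", "language": "en", "role": "staff"}
dest   = {"city": "NYC", "affiliation": "univ2", "language": "en", "role": "student"}

def feature_distance(x, y):
    """Number of differing features: the Hamming-like distance between
    two corners of the social-feature hypercube."""
    return sum(1 for k in x if x[k] != y[k])

# A relay closer to the destination in feature space is preferred;
# multipath routing would keep several such candidates instead of only one.
for name, node in [("A", node_a), ("B", node_b)]:
    print(f"node {name}: distance to destination = {feature_distance(node, dest)}")
```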
5629 The Ethics of Dissent: The Case of David Kelly
Authors: A. Kayes, D. Christopher Kayes
Abstract:
In this paper, we rely on the story of the late British weapons inspector David Kelly to illustrate how sensemaking can inform the study of the ethics of the suppression of dissent. Using archival data, we reconstruct Dr. Kelly's key responsibilities as a weapons inspector and government employee. We begin by clarifying the concept of dissent and how it can be a useful organizational process. We identify the various ways that dissent has been discussed in the organizational literature and reconsider the process of sensemaking. We conclude that the suppression of opinions that deviate from the majority is part of the identity maintenance of the sensemaking process. We illustrate that the prevention of dissent in organizations consists of a set of unsatisfactory trade-offs.
Keywords: Ethics, dissent, suppression, sensemaking.
5628 Sensing Pressure for Authentication System Using Keystroke Dynamics
Authors: Hidetoshi Nonaka, Masahito Kurihara
Abstract:
In this paper, an authentication system using keystroke dynamics is presented. We introduce pressure sensing to improve measurement accuracy and robustness against intrusion by key-loggers and similar attacks, although an additional instrument is needed. As a result, it has been found that pressure sensing is also effective for estimating the real moment of a keystroke.
Keywords: Biometric authentication, Keystroke dynamics, Pressure sensing, Time-frequency analysis.
5627 Elaboration and Optimization of Pellets Used for Precise Glass Grinding
Authors: N. Belkhir, A. Chorfa, D. Bouzid
Abstract:
In this work, grinding or micro-cutting tools in the form of pellets were manufactured using bonded alumina abrasive grains. The bond used is a vitreous material containing quartz, feldspars, kaolinite and a quantity of hematite. The pellets were used in a glass grinding process to replace the free-abrasive-grain lapping process. The elaborated pellets were studied to determine their effectiveness in the grinding process and to optimize the influence of the pellet elaboration parameters. The obtained results show the existence of an optimal combination of the pellet elaboration parameters for each glass grinding phase (coarse to fine grinding). The final roughness (rms) reached by the elaborated pellets on a BK7 glass surface was about 0.392 μm.
Keywords: Abrasive grain, glass, grinding, pellet.
5626 Improving the Decision-Making Process and Transparency of Corporate Governance Using XBRL
Authors: Claudiu Brandas
Abstract:
Several recent studies have shown that the transparency of financial reporting has a significant influence on investors' decisions. Thus, regulatory authorities and professional organizations (such as IFAC) have emphasized the role of XBRL (eXtensible Business Reporting Language) and interactive data as a means of promoting transparency and monitoring corporate reporting. In this context, the objective of this paper is to analyze interactive reporting through XBRL and its use as support for decision-making in corporate governance, namely the potential of interactive reports in XBRL to increase the transparency and monitoring of corporate governance.
Keywords: Corporate governance, decision, financial reporting, transparency, XBRL.
5625 Development of the Structure of the Knowledgebase for Countermeasures in the Knowledge Acquisition Process for Trouble Prediction in Healthcare Processes
Authors: Shogo Kato, Daisuke Okamoto, Satoko Tsuru, Yoshinori Iizuka, Ryoko Shimono
Abstract:
Healthcare safety is perceived as important. Preventing troubles in healthcare processes is essential for healthcare safety. Trouble prevention is based on trouble prediction, which uses accumulated knowledge on processes, troubles, and countermeasures. However, information on troubles has not been accumulated in hospitals in an appropriate structure, and it has not been utilized effectively to prevent troubles. In a previous study, a detailed knowledge acquisition process for trouble prediction was proposed; however, the knowledgebase for countermeasures was not included. In this paper, we propose the structure of the knowledgebase for countermeasures in the knowledge acquisition process for trouble prediction in healthcare processes. We first design the structure of countermeasures and propose a knowledge representation form for countermeasures. Then, we evaluate the validity of the proposal by applying it in an actual hospital.
Keywords: Trouble prevention, knowledge structure, structured knowledge, reusable knowledge.
5624 Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks
Authors: K. Indra Gandhi
Abstract:
Wireless Sensor Networks (WSNs) have always been characterized by application-specific sensing, relaying and collection of information for further analysis. However, software development has not been considered a separate entity in this data collection process, which has imposed severe limitations on software development for WSNs. Software development for WSNs is a complex process, since the components involved are data-driven, network-driven and application-driven in nature. This implies that there is a tremendous need for separation of concerns from the software development perspective. A layered approach for developing the data acquisition design based on Model Driven Development (MDD) is proposed, as the sensed data collection process itself varies depending upon the application under consideration. This work focuses on a layered view of the data acquisition process so as to ease software development. A metamodel is proposed that enables reusability and the realization of the software as an adaptable component for WSN systems. Further, observation of users' perception indicates that the proposed model helps improve programmer productivity by realizing the collaborative system involved.
Keywords: Model-driven development, wireless sensor networks, data acquisition, separation of concern, layered design.
5623 Latent Topic Based Medical Data Classification
Authors: Jian-hua Yeh, Shi-yi Kuo
Abstract:
This paper discusses the classification process for medical data. We use the data from the ACM KDDCup 2008 to demonstrate our classification process based on latent topic discovery. In this data set, the target set and the outliers are quite different in nature: the target set makes up only 0.6% of the data, while the outliers constitute the remaining 99.4%. We use this data set as an example to show how we dealt with an extremely biased data set using latent topic discovery and noise reduction techniques. Our experiment faces two major challenges: (1) extremely distributed outliers, and (2) positive samples that are far fewer than negative ones. We propose a suitable process flow to deal with these issues and obtain a best AUC result of 0.98.
Keywords: classification, latent topics, outlier adjustment, feature scaling
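The sketch below illustrates one conventional way to score a classifier with AUC on a comparably imbalanced synthetic set, using class weighting in scikit-learn; it is a stand-in for, not a reproduction of, the latent-topic pipeline described above.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for an extremely biased set (positives ~0.6% of samples),
# not the KDD Cup 2008 data itself.
X, y = make_classification(n_samples=20000, n_features=20, weights=[0.994],
                           random_state=0)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Class weighting is one simple way to keep the tiny positive class from being
# ignored; the paper additionally relies on latent topic features, feature
# scaling and noise reduction.
clf = LogisticRegression(max_iter=1000, class_weight="balanced").fit(X_tr, y_tr)

auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.3f}")
```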
5622 Computer Aided Design of Reshaping Process of Circular Pipes into Square Pipes
Authors: Parviz Alinezhad, Ali Sanati, Koorosh Naser Momtahen
Abstract:
Square pipes (pipes with square cross sections) are used for various industrial purposes, such as machine structure components and housing/building elements. Their utilization is extending rapidly and widely; hence, the output of these pipes is increasing and new application fields are continually developing. Due to various recent demands, the products have to satisfy difficult specifications with high dimensional accuracy. The design of the reshaping process for pipes with square cross sections, however, is performed by trial and error, based on experts' experience. In this paper, a computer-aided simulation is developed based on the 2D elastic-plastic method, with consideration of shear deformation, to analyze the reshaping process. The effects of various parameters, such as the diameter of the circular pipe and the mechanical properties of the metal, on product dimensions and quality can be evaluated using this simulation. Moreover, the design of the reshaping process, including the determination of the shrinkage of the cross section, the necessary number of stands, the radius of the rolls and the height of the pipe at each stand, is investigated. Further, it is shown that there is good agreement between the results of the design method and the experimental results.
Keywords: Circular pipes, square pipes, shear deformation, reshaping process, numerical simulation.
5621 The Composting Process from a Waste Management Method to a Remediation Procedure
Authors: G. Petruzzelli, F. Pedron, M. Grifoni, F. Gorini, I. Rosellini, B. Pezzarossa
Abstract:
Composting is a controlled technology that enhances the natural aerobic degradation of organic wastes. The resulting product is a humified material that is mainly recyclable for agricultural purposes. Under European Community legislation, the composting process is one of the most important tools for waste management. In recent years, composting has been increasingly used as a remediation technology to remove biodegradable contaminants from soil and to modulate heavy metal bioavailability in phytoremediation strategies. Optimizing the recovery of resources from wastes through composting could enhance soil fertility and promote its use in the remediation biotechnologies of contaminated soils.
Keywords: Agriculture, biopile, compost, soil clean-up, waste recycling.
5620 Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring
Authors: Youngji Yoo, Cheong-Sool Park, Jun Seok Kim, Young-Hak Lee, Sung-Shick Kim, Jun-Geol Baek
Abstract:
In the semiconductor manufacturing process, large amounts of data are collected from the sensors of multiple facilities. The data collected from the sensors have several different characteristics due to variables such as product type, preceding processes and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data to detect out-of-control states of processes. Although the collected data have different characteristics, using them directly as inputs to SQC increases the variation of the data, requires wide control limits, and decreases the ability to detect out-of-control states. Therefore, it is necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree whose split algorithm is based on the Pearson distribution system, to handle non-normal distributions in a parametric way. The regression tree finds similar properties of the data from the different variables. Experiments using real semiconductor manufacturing process data show improved fault detection performance.
Keywords: Semiconductor, non-normal mixed process data, clustering, Statistical Quality Control (SQC), regression tree, Pearson distribution system.
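As a small illustration of why a Pearson-system view helps, the sketch below computes the moment coordinates (beta1 = squared skewness, beta2 = kurtosis) for two hypothetical sensor groups; groups that sit far apart on this plane arguably should not share one set of control limits, which is the kind of difference the proposed tree split is meant to separate. The data and grouping are invented, not the paper's.

```python
import numpy as np
from scipy.stats import skew, kurtosis

# Hypothetical sensor readings from two recipes mixed into one stream.
rng = np.random.default_rng(0)
recipe_a = rng.gamma(shape=2.0, scale=1.5, size=500)    # skewed, non-normal
recipe_b = rng.normal(loc=10.0, scale=0.5, size=500)    # roughly normal

# The Pearson system characterizes a distribution by beta1 = skewness^2 and
# beta2 = (non-excess) kurtosis.
for name, x in [("recipe A", recipe_a), ("recipe B", recipe_b)]:
    b1 = skew(x) ** 2
    b2 = kurtosis(x, fisher=False)
    print(f"{name}: beta1 = {b1:.2f}, beta2 = {b2:.2f}")
```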
5619 High Sensitivity Crack Detection and Locating with Optimized Spatial Wavelet Analysis
Authors: A. Ghanbari Mardasi, N. Wu, C. Wu
Abstract:
In this study, a spatial wavelet-based crack localization technique for a thick beam is presented. The wavelet scale in the spatial wavelet transformation is optimized to enhance crack detection sensitivity. A windowing function is also employed to erase the edge effect of the wavelet transformation, which enables the method to detect and localize cracks near the beam/measurement boundaries. A theoretical model and vibration analysis considering the crack effect are first proposed and performed in MATLAB, based on the Timoshenko beam model. The Gabor wavelet family is applied to the beam vibration mode shapes derived from the theoretical beam model to magnify the crack effect so as to locate the crack. The relative wavelet coefficient is obtained for sensitivity analysis by comparing the coefficient values at different positions of the beam with the lowest value in the intact area of the beam. Afterward, the optimal wavelet scale corresponding to the highest relative wavelet coefficient at the crack position is obtained for each vibration mode through numerical simulations. The same procedure is performed for cracks with different sizes and positions in order to find the optimal scale range for the Gabor wavelet family. Finally, a Hanning window is applied to the different vibration mode shapes in order to overcome the edge-effect problem of the wavelet transformation and its influence on the localization of cracks close to the measurement boundaries. Comparison of the wavelet coefficient distributions of the windowed and initial mode shapes demonstrates that the window function eases the identification of cracks close to the boundaries.
Keywords: Edge effect, scale optimization, small crack locating, spatial wavelet.
5618 A Framework for Product Development Process including HW and SW Components
Authors: Namchul Do, Gyeongseok Chae
Abstract:
This paper proposes a framework for the development of products that include hardware and software components. It provides separation of hardware-dependent software, modifications of the current product development process, and integration of software modules with existing product configuration models and assembly product structures. In order to identify the dependent software, the framework considers product configuration modules and the engineering changes of the associated software and hardware components. In order to support efficient integration of the two different hardware and software development streams, a modified product development process is proposed. The process integrates the dependent software development into product development through exchanges of specific product information. By using existing product data models in Product Data Management (PDM), the framework represents software as modules for product configurations and as software parts in the product structure. The framework is applied to the development of a robot system in order to show its effectiveness.
Keywords: HW and SW development integration, product development with software.
5617 The Effects of Increasing Unsaturation in Palm Oil and Incorporation of Carbon Nanotubes on Resinous Properties
Authors: Muhammad R. Islam, Mohammad Dalour H. Beg, Saidatul S. Jamari
Abstract:
Because palm oil is considered a non-drying oil owing to its low iodine value, an attempt was made to increase the unsaturation in its fatty acid chains for the preparation of alkyds. To increase the unsaturation of the palm oil, sulphuric acid (SA) and para-toluene sulphonic acid (PTSA) were used for the dehydration process prior to alcoholysis. The iodine number of the oil samples was checked by the Wijs method as a measure of unsaturation. Alkyd resin was prepared from the dehydrated palm oil through alcoholysis and esterification reactions. To improve the film properties, 0.5 wt.% multi-wall carbon nanotubes (MWCNTs) were used to manufacture the polymeric films. The resins were characterized by various physico-chemical properties such as density, viscosity, iodine value, and saponification value. Structural elucidation was confirmed by Fourier transform infrared spectroscopy and proton nuclear magnetic resonance; the surfaces of the films were examined by field-emission scanning electron microscopy. In addition, pencil hardness and chemical resistivity were measured using standard methods. The effect of enhancing the unsaturation in the fatty acid chains was found to be significant and encouraging. The resin prepared with dehydrated palm oil showed improved properties in the hardness and chemical resistivity tests. The incorporation of MWCNTs enhanced the thermal stability and hardness of the films as well.
Keywords: Alkyd resin, nano-coatings, dehydration, palm oil.
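For context, the Wijs iodine value mentioned above is normally computed from titration data with the standard formula shown in the small sketch below; the titration figures in the example are hypothetical.

```python
def iodine_value(blank_ml, sample_ml, thio_normality, sample_g):
    """Wijs iodine value: grams of iodine absorbed per 100 g of oil.
    12.69 = molar mass of iodine (126.9 g/mol) divided by 10."""
    return (blank_ml - sample_ml) * thio_normality * 12.69 / sample_g

# Hypothetical titration figures for untreated vs. dehydrated palm oil
print(iodine_value(46.0, 37.5, 0.1, 0.20))   # ~53.9, typical of palm oil
print(iodine_value(46.0, 33.0, 0.1, 0.20))   # higher value after dehydration
```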
5616 Impact of Process Parameters on Tensile Strength of Fused Deposition Modeling Printed Crisscross Polylactic Acid
Authors: Shilpesh R. Rajpurohit, Harshit K. Dave
Abstract:
Additive manufacturing has gained popularity in recent times due to its capability to create prototypes as well as functional end-use products directly from CAD data, without any specific tooling requirement. Fused deposition modeling (FDM) is one of the widely used additive manufacturing techniques for creating functional end-use polymer parts that are comparable with injection-molded parts. FDM-printed parts have applications in various fields such as automotive, aerospace, medical, and electronics. However, the application of FDM parts is greatly limited by their poor mechanical properties. Proper selection of the process parameters could enhance the mechanical performance of the printed parts. In the present study, an experimental investigation has been carried out to study the mechanical performance of the printed parts with respect to the process variables. Three process variables, viz. raster angle, raster width and layer height, have been varied to understand their effect on tensile strength. Further, the effect of the process variables on the fractured surface has also been investigated.
Keywords: 3D printing, fused deposition modeling, layer height, raster angle, raster width, tensile strength.
5615 Six Sigma-Based Optimization of Shrinkage Accuracy in Injection Molding Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using Six Sigma methodologies to reach the desired shrinkage of a high-density polyethylene (HDPE) part produced by an injection molding machine. It presents a case study where the correct shrinkage is required to reduce or eliminate defects and to improve the process capability indices Cp and Cpk of an injection molding process. To improve this process and keep the product within specifications, the Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was implemented in this study. The Six Sigma approach was paired with the Taguchi methodology to identify the optimized processing parameters that keep the shrinkage rate within the specifications set by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable (noise) factor. The four controllable factors consist of cooling time, melt temperature, holding time, and metering stroke. The noise factor is the difference between material brand 1 and material brand 2. After the confirmation run was completed, measurements verified that the new parameter settings are optimal. With the new settings, the process capability index improved dramatically. This study shows that the Six Sigma and Taguchi methodologies can be used efficiently to determine the important factors that improve the process capability index of the injection molding process.
Keywords: Injection molding, shrinkage, six sigma, Taguchi parameter design.
5614 Designing Information Systems in Education as Prerequisite for Successful Management Results
Authors: Vladimir Simovic, Matija Varga, Tonco Marusic
Abstract:
This research paper presents matrix technology models and examples of information systems in education (in the Republic of Croatia and in Germany) that support business, education (learning and teaching) and e-learning. We researched and described the aims and objectives of the main processes in education and technology, together with the main matrix classes of data. The paper gives an example of matrix technology with a detailed description of the processes related to specific data classes in education, and an example module that supports the processes 'filling in the directory and the diary of work' and 'evaluation'. At the lower level, we also researched and described all activities which take place within the lower-level processes in education, as well as the characteristics and functioning of the modules 'fill in the directory and the diary of work' and 'evaluation'. For the analysis of the affinity between the aforementioned processes and/or sub-processes, we used our application model created in Visual Basic, which is based on an algorithm for analyzing the affinity between the observed processes and/or sub-processes.
Keywords: Designing, education management, information systems, matrix technology, process affinity.
5613 Solving Single Machine Total Weighted Tardiness Problem Using Gaussian Process Regression
Authors: Wanatchapong Kongkaew
Abstract:
This paper proposes an application of a probabilistic technique, namely Gaussian process regression, for estimating an optimal sequence for the single machine total weighted tardiness (SMTWT) scheduling problem. In this work, the Gaussian process regression (GPR) model is utilized to predict an optimal sequence for the SMTWT problem, and its solution is improved by using an iterated local search based on a simulated annealing scheme, called the GPRISA algorithm. The results show that the proposed GPRISA method achieves very good performance and a reasonable trade-off between solution quality and time consumption. Moreover, in terms of deviation from the best-known solution, the proposed mechanism noticeably outperforms recent existing approaches.
Keywords: Gaussian process regression, iterated local search, simulated annealing, single machine total weighted tardiness.
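The sketch below spells out the SMTWT objective and a toy simulated-annealing swap search of the kind the ILS/SA refinement stage performs; the instance data are invented, and the GPR prediction step itself is not reproduced here.

```python
import math
import random

# Hypothetical 5-job instance: processing time p, due date d, weight w per job.
p = [4, 3, 7, 2, 5]
d = [6, 8, 12, 4, 16]
w = [3, 1, 4, 2, 2]

def twt(seq):
    """Total weighted tardiness of a job sequence on a single machine."""
    t, total = 0, 0
    for j in seq:
        t += p[j]
        total += w[j] * max(0, t - d[j])
    return total

def improve(seq, iters=2000, temp=5.0, cooling=0.995):
    """Toy simulated-annealing swap search standing in for the ILS/SA stage;
    the GPR-predicted sequence would be used as the starting point."""
    best = cur = list(seq)
    for _ in range(iters):
        i, k = random.sample(range(len(cur)), 2)
        cand = list(cur)
        cand[i], cand[k] = cand[k], cand[i]
        delta = twt(cand) - twt(cur)
        # accept improvements always, worse moves with a temperature-dependent probability
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            cur = cand
            if twt(cur) < twt(best):
                best = list(cur)
        temp *= cooling
    return best

start = [0, 1, 2, 3, 4]
best = improve(start)
print("start TWT:", twt(start), "-> improved TWT:", twt(best))
```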
5612 An Evaluation of Average Run Length of MaxEWMA and MaxGWMA Control Charts
Authors: S. Phanyaem
Abstract:
The exponentially weighted moving average (EWMA) control chart is a popular chart used in quality control for detecting shifts in the mean of a process parameter. The objective of this paper is to compare the efficiency of control charts in detecting an increase in the mean of a process. In particular, we compared the Maximum Exponentially Weighted Moving Average (MaxEWMA) and Maximum Generally Weighted Moving Average (MaxGWMA) control charts when the observations follow an exponential distribution. The criterion used to evaluate the performance of a control chart is the Average Run Length (ARL). The results of the comparison show that, for small sample sizes, the MaxEWMA control chart is more efficient at detecting shifts in the process mean than the MaxGWMA control chart. For large sample sizes, the MaxEWMA control chart is more sensitive to small shifts in the process mean than the MaxGWMA control chart, whereas for large shifts in the mean, the MaxGWMA control chart is more sensitive than the MaxEWMA control chart.
Keywords: Maximum Exponentially Weighted Moving Average, Maximum General Weighted Moving Average, Average Run Length.
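To make the ARL criterion concrete, the following sketch estimates the average run length of a plain one-sided EWMA chart on exponential observations by Monte Carlo simulation; the chart constants are illustrative, and the MaxEWMA/MaxGWMA charts compared in the paper monitor mean and variability jointly, which this simplified chart does not.

```python
import numpy as np

def ewma_run_length(lam=0.1, L=2.7, shift=0.5, mean0=1.0, max_n=100000, rng=None):
    """Run length of a one-sided EWMA chart on exponential observations.
    The in-control mean is mean0; after the shift the mean becomes mean0 + shift."""
    rng = rng or np.random.default_rng()
    sigma0 = mean0                      # exponential distribution: std = mean
    # asymptotic upper control limit for detecting an upward mean shift
    ucl = mean0 + L * sigma0 * np.sqrt(lam / (2 - lam))
    z = mean0
    for i in range(1, max_n + 1):
        x = rng.exponential(mean0 + shift)
        z = lam * x + (1 - lam) * z     # EWMA recursion
        if z > ucl:
            return i
    return max_n

# Average Run Length estimated over many simulated runs; a smaller ARL means
# the chart detects the shift faster.
rng = np.random.default_rng(1)
arl = np.mean([ewma_run_length(shift=0.5, rng=rng) for _ in range(2000)])
print(f"estimated ARL at shift 0.5: {arl:.1f}")
```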
5611 Approximate Frequent Pattern Discovery Over Data Stream
Authors: Kittisak Kerdprasop, Nittaya Kerdprasop
Abstract:
Frequent pattern discovery over a data stream is a hard problem because the continuously generated nature of a stream does not allow revisiting each data element. Furthermore, the pattern discovery process must be fast to produce timely results. Based on these requirements, we propose an approximate approach to tackle the problem of discovering frequent patterns over a continuous stream. Our approximation algorithm is intended to be applied to the stream prior to the pattern discovery process. The results of the approximate frequent pattern discovery are reported in the paper.
Keywords: Frequent pattern discovery, approximate algorithm, data stream analysis.
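As one concrete example of an approximate one-pass stream summary, the sketch below implements classic lossy counting for frequent items; the paper's own approximation algorithm for full patterns is not specified here, so this is only an illustration of the general idea.

```python
from math import ceil

def lossy_count(stream, epsilon=0.01):
    """Classic lossy-counting sketch for frequent items in a single pass.
    Reported counts under-estimate true counts by at most epsilon * N."""
    bucket_width = ceil(1 / epsilon)
    counts, deltas = {}, {}
    for n, item in enumerate(stream, start=1):
        bucket = ceil(n / bucket_width)
        if item in counts:
            counts[item] += 1
        else:
            counts[item] = 1
            deltas[item] = bucket - 1
        if n % bucket_width == 0:          # periodic pruning keeps memory bounded
            for key in [k for k in counts if counts[k] + deltas[k] <= bucket]:
                del counts[key], deltas[key]
    return counts

stream = ["a", "b", "a", "c", "a", "b", "a", "d"] * 500
print(sorted(lossy_count(stream, epsilon=0.05).items()))
```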
5610 Evaluation of Risks in New Product Innovation
Authors: Emre Alptekin, Damla Yalçınyiğit, Gülfem Alptekin
Abstract:
In highly competitive environments, a growing number of companies must regularly launch new products speedily and successfully. A company's success is based on a systematic, conscious product design method that meets market requirements and takes risks as well as resources into consideration. Research has found that developing and launching new products are inherently risky endeavors. Hence, in this research we introduce a risk evaluation framework for the new product innovation process. Our framework is based on the fuzzy analytical hierarchy process (FAHP) methodology. We applied all stages of the framework to the risk evaluation process of a pharmaceutical company.
Keywords: Evaluation, risks, product innovation.
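A minimal sketch of a Buckley-style fuzzy AHP weight calculation with triangular fuzzy numbers is given below; the risk factors and pairwise judgments are hypothetical and do not come from the pharmaceutical case study, and the paper's exact FAHP variant may differ.

```python
import numpy as np

# Hypothetical risk factors and a reciprocal pairwise comparison matrix of
# triangular fuzzy numbers (l, m, u): comparison[i][j] ~ "risk i vs risk j".
risks = ["technical", "market", "regulatory"]
comparison = [
    [(1, 1, 1),       (2, 3, 4),       (1, 2, 3)],
    [(1/4, 1/3, 1/2), (1, 1, 1),       (1/3, 1/2, 1)],
    [(1/3, 1/2, 1),   (1, 2, 3),       (1, 1, 1)],
]

n = len(risks)
# Fuzzy geometric mean of each row (computed component-wise on l, m, u)
geo = [tuple(np.prod([comparison[i][j][k] for j in range(n)]) ** (1 / n)
             for k in range(3)) for i in range(n)]

# Fuzzy weights: each row mean times the inverse of the total (1/sum_u, 1/sum_m, 1/sum_l)
totals = [sum(g[k] for g in geo) for k in range(3)]
fuzzy_w = [(g[0] / totals[2], g[1] / totals[1], g[2] / totals[0]) for g in geo]

# Defuzzify by the centroid and normalize to crisp weights
crisp = np.array([(l + m + u) / 3 for (l, m, u) in fuzzy_w])
crisp /= crisp.sum()

for name, wgt in zip(risks, crisp):
    print(f"{name}: weight = {wgt:.3f}")
```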