Search results for: process mining.
5145 Influence of Deep Cold Rolling and Low Plasticity Burnishing on Surface Hardness and Surface Roughness of AISI 4140 Steel
Authors: P. R. Prabhu, S. M. Kulkarni, S. S. Sharma
Abstract:
Deep cold rolling (DCR) and low plasticity burnishing (LPB) are cold working processes that readily produce a smooth, work-hardened surface by plastic deformation of surface irregularities. The present study focuses on the surface roughness and surface hardness of AISI 4140 work material, using a fractional factorial design of experiments. The surface integrity aspects of the work material were assessed in order to identify the predominant factors among the selected parameters. These were then ranked in order of significance, and the factor levels were set to minimize surface roughness and/or maximize surface hardness. In the present work, the influence of the main process parameters (force, feed rate, number of tool passes/overruns, initial roughness of the workpiece, ball material, ball diameter, and lubricant used) on the surface roughness and hardness of AISI 4140 steel was studied for both the LPB and DCR processes, and the results are compared. Surface hardness was improved by 167% with the LPB process and by 442% with the DCR process. It was also found that force, ball diameter, number of tool passes, and initial roughness of the workpiece are the most pronounced parameters, with a significant effect on the workpiece surface during deep cold rolling and low plasticity burnishing.
Keywords: Deep cold rolling, burnishing, surface roughness, surface hardness, design of experiments, AISI4140 steel.
5144 Bioprocess Optimization Based On Relevance Vector Regression Models and Evolutionary Programming Technique
Authors: R. Simutis, V. Galvanauskas, D. Levisauskas, J. Repsyte
Abstract:
This paper proposes a bioprocess optimization procedure based on Relevance Vector Regression models and evolutionary programming technique. Relevance Vector Regression scheme allows developing a compact and stable data-based process model avoiding time-consuming modeling expenses. The model building and process optimization procedure could be done in a half-automated way and repeated after every new cultivation run. The proposed technique was tested in a simulated mammalian cell cultivation process. The obtained results are promising and could be attractive for optimization of industrial bioprocesses.
Keywords: Bioprocess optimization, Evolutionary programming, Relevance Vector Regression.
5143 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program
Authors: Ming Wen, Nasim Nezamoddini
Abstract:
Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process from model setup to post-processing tasks that require replication of time-consuming steps. Utilizing FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis set up, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework to design a Python based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. Analytical hierarchy process (AHP) and technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives considering the feedback received from experts and program users.
Keywords: FEA, random vibration fatigue, process automation, AHP, TOPSIS, multiple-criteria decision-making, MCDM.
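The TOPSIS step of the framework above can be sketched in a few lines. This is a generic illustration of the technique, not the authors' implementation; the decision matrix, weights, and criteria below are hypothetical.

```python
import math

def topsis(matrix, weights, benefit):
    # matrix: rows = design alternatives, columns = criteria scores
    # weights: criterion weights; benefit[j]: True if larger is better
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each column, then apply the criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    # Ideal (best) and anti-ideal (worst) points per criterion
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) if benefit[j]
             else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_best = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_worst = math.sqrt(sum((v[i][j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_worst / (d_best + d_worst))  # closeness to ideal
    return scores

# Hypothetical example: three program designs scored on
# usability (benefit), speed (benefit), development cost (cost)
scores = topsis([[7, 9, 3], [9, 7, 5], [6, 8, 2]],
                [0.5, 0.3, 0.2], [True, True, False])
```

The alternative with the highest closeness score is preferred; AHP would typically supply the weight vector from pairwise expert comparisons.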
5142 Mechanical Properties of Die-Cast Nonflammable Mg Alloy
Authors: Myoung-Gon Yoon, Jung-Ho Moon, Tae Kwon Ha
Abstract:
Tensile specimens of nonflammable AZ91D Mg alloy were fabricated in this study via cold chamber die-casting process. Dimensions of tensile specimens were 25mm in length, 4mm in width, and 0.8 or 3.0mm in thickness. Microstructure observation was conducted before and after tensile tests at room temperature. In the die casting process, various injection distances from 150 to 260mm were employed to obtain optimum process conditions. Distribution of Al12Mg17 phase was the key factor to determine the mechanical properties of die-cast Mg alloy. Specimens with 3mm of thickness showed superior mechanical properties to those with 0.8mm of thickness. Closed networking of Al12Mg17 phase along grain boundary was found to be detrimental to mechanical properties of die-cast Mg alloy.
Keywords: Non-flammable magnesium alloy, AZ91D, die-casting, microstructure, mechanical properties.
5141 Modelling the Occurrence of Defects and Change Requests during User Acceptance Testing
Authors: Kevin McDaid, Simon P. Wilson
Abstract:
Software developed for a specific customer under contract typically undergoes a period of testing by the customer before acceptance. This is known as user acceptance testing, and the process can reveal both defects in the system and requests for changes to the product. This paper uses nonhomogeneous Poisson processes to model a real user acceptance data set from a recently developed system. In particular, a split Poisson process is shown to provide an excellent fit to the data. The paper explains how this model can be used to aid the allocation of resources through the accurate prediction of occurrences both during the acceptance testing phase and before this activity begins.
5140 Annual Power Load Forecasting Using Support Vector Regression Machines: A Study on Guangdong Province of China 1985-2008
Authors: Zhiyong Li, Zhigang Chen, Chao Fu, Shipeng Zhang
Abstract:
Load forecasting has always been an essential part of efficient power system operation and planning. A novel approach based on support vector machines is proposed in this paper for annual power load forecasting. Different kernel functions are selected to construct a combinatorial algorithm. The performance of the new model is evaluated on a real-world dataset and compared with two neural networks and some traditional forecasting techniques. The results show that the proposed method exhibits superior performance.
Keywords: Combinatorial algorithm, data mining, load forecasting, support vector machines.
5139 Effective Work Roll Cooling toward Stand Reduction in Hot Strip Process
Authors: Temsiri Sapsaman, Anocha Bhocarattanahkul
Abstract:
The maintenance of work rolls in hot strip processing is a lengthy and difficult task for hot strip manufacturers, because the heavy work rolls have to be taken out of the production line, which can take hours. One way to increase the time between maintenance is to improve the effectiveness of the work roll cooling system so that wear occurs more slowly while the operating cost is kept low. Therefore, this study aims to improve the work roll cooling system by providing the manufacturer with the relationship between the work-roll temperature reduction achieved by cooling and the water flow, which helps the manufacturer determine a more effective water flow for the cooling system. The relationship is found using simulation with a systematic process adjustment so that satisfactory product quality is achieved. Results suggest that the manufacturer could reduce the water flow by 9% with roughly the same performance. With the same process adjustment, the feasibility of finishing-mill-stand reduction is also investigated, and the results suggest it is possible.
Keywords: Work-roll cooling system, hot strip process adjustment, feasibility study.
5138 Torrefaction of Biomass Pellets: Modeling of the Process in a Fixed Bed Reactor
Authors: Ekaterina Artiukhina, Panagiotis Grammelis
Abstract:
Torrefaction of biomass pellets is considered as a useful pretreatment technology in order to convert them into a high quality solid biofuel that is more suitable for pyrolysis, gasification, combustion, and co-firing applications. In the course of torrefaction, the temperature varies across the pellet, and therefore chemical reactions proceed unevenly within the pellet. However, the uniformity of the thermal distribution along the pellet is generally assumed. The torrefaction process of a single cylindrical pellet is modeled here, accounting for heat transfer coupled with chemical kinetics. The drying sub-model was also introduced. The nonstationary process of wood pellet decomposition is described by the system of non-linear partial differential equations over the temperature and mass. The model captures well the main features of the experimental data.
Keywords: Torrefaction, biomass pellets, model, heat and mass transfer.
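The coupled heat transfer and chemical kinetics described above can be written, in a generic form for a cylindrical pellet with first-order Arrhenius decomposition (a sketch under common simplifying assumptions; the symbols are generic, not the authors' exact notation):

```latex
\frac{\partial (\rho c_p T)}{\partial t}
  = \frac{1}{r}\frac{\partial}{\partial r}\!\left( r\,\lambda \frac{\partial T}{\partial r}\right)
  + \lambda \frac{\partial^2 T}{\partial z^2}
  + Q_{\mathrm{reac}} + Q_{\mathrm{evap}},
\qquad
\frac{\partial \rho_i}{\partial t} = -k_i(T)\,\rho_i,
\quad k_i(T) = A_i \exp\!\left(-\frac{E_i}{R\,T}\right)
```

Here $T$ is temperature, $\rho_i$ the density of solid component $i$, $\lambda$ the thermal conductivity, and $Q_{\mathrm{reac}}$, $Q_{\mathrm{evap}}$ the reaction and evaporation heat source terms; the drying sub-model enters through $Q_{\mathrm{evap}}$.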
5137 Robust Fractional-Order PI Controller with Ziegler-Nichols Rules
Authors: Mazidah Tajjudin, Mohd Hezri Fazalul Rahiman, Norhashim Mohd Arshad, Ramli Adnan
Abstract:
In process control applications, more than 90% of controllers are of the PID type. This paper proposes a robust PI controller with a fractional-order integrator. The PI parameters were obtained using classical Ziegler-Nichols rules, enhanced by an error filter cascaded to the fractional-order PI. The controller was applied to a steam temperature process described by a FOPDT transfer function. The process can be classified as lag-dominant with a very small relative dead-time. The proposed control scheme was compared with other PI controllers tuned using the Ziegler-Nichols and AMIGO rules. Another PI controller with a fractional-order integrator, known as F-MIGO, was also considered. All the controllers were subjected to set-point change and load disturbance tests. Performance was measured using the Integral of Squared Error (ISE) and Integral of Control Signal (ICO). The proposed controller produced the best performance in all tests, with the lowest ISE index.
Keywords: PID controller, fractional-order PID controller, PI control tuning, steam temperature control, Ziegler-Nichols tuning.
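For reference, the classical Ziegler-Nichols ultimate-sensitivity rules that the paper starts from can be computed directly. This is a sketch of the standard PI rules only (the fractional-order integrator and error filter are not reproduced here), and the ultimate gain and period below are hypothetical, not the paper's steam-temperature loop values.

```python
def zn_pi(Ku, Pu):
    """Classical Ziegler-Nichols PI tuning from the ultimate gain Ku and
    ultimate period Pu found in a closed-loop oscillation test."""
    Kp = 0.45 * Ku   # proportional gain
    Ti = Pu / 1.2    # integral time
    Ki = Kp / Ti     # integral gain, parallel PI form
    return Kp, Ti, Ki

# Hypothetical loop: Ku = 4.0, Pu = 90 s
Kp, Ti, Ki = zn_pi(4.0, 90.0)
```

The fractional-order variant keeps these Kp and Ti values but replaces the integrator 1/s with 1/s^a for a non-integer order a.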
5136 Six Sigma Process and its Impact on the Organizational Productivity
Authors: Masoud Hekmatpanah, Mohammad Sadroddin, Saeid Shahbaz, Farhad Mokhtari, Farahnaz Fadavinia
Abstract:
The six sigma method is a project-driven management approach to improving an organization's products, services, and processes by continually reducing defects. Understanding the key features, obstacles, and shortcomings of the six sigma method allows organizations to better support their strategic directions and their growing needs for coaching, mentoring, and training. It also provides opportunities to better implement six sigma projects. The purpose of this paper is to survey the six sigma process and its impact on organizational productivity. Key concepts and the six sigma problem-solving process are studied, along with important fields such as DMAIC, applied six sigma and productivity programmes, and other advantages of six sigma. The paper concludes that there is a direct and positive relation between six sigma and productivity.
Keywords: Six sigma, project management, quality, theory, productivity.
5135 Integrating Life Cycle Uncertainties for Evaluating a Building Overall Cost
Authors: M. Arja, G. Sauce, B. Souyri
Abstract:
Overall cost is a significant consideration in any decision-making process. Although many studies have been carried out on overall cost in construction, few have treated the uncertainties of real life cycle development. On the basis of several case studies, a feedback process was performed on the historical data of the studied buildings. This process made it possible to identify some factors causing uncertainty during the operational period. As a result, the research proposes a new method for assessing the overall cost during part of the building's life cycle, taking into account the building's actual value, its end-of-life value, and the influence of the identified life cycle uncertainty factors. The findings are a step towards a higher level of reliability in overall cost evaluation, accounting for some usually unexpected uncertainty factors.
Keywords: Asset management, building life cycle uncertainty, building value, overall cost.
5134 Mining News Sites to Create Special Domain News Collections
Authors: David B. Bracewell, Fuji Ren, Shingo Kuroiwa
Abstract:
We present a method to create special domain collections from news sites. The method only requires a single sample article as a seed. No prior corpus statistics are needed and the method is applicable to multiple languages. We examine various similarity measures and the creation of document collections for English and Japanese. The main contributions are as follows. First, the algorithm can build special domain collections from as little as one sample document. Second, unlike other algorithms it does not require a second "general" corpus to compute statistics. Third, in our testing the algorithm outperformed others in creating collections made up of highly relevant articles.
Keywords: Information retrieval, news, special domain collections.
5133 Influence of [Emim][OAc] and Water on Gelatinization Process and Interactions with Starch
Authors: Shajaratuldur Ismail, Nurlidia Mansor, Zakaria Man
Abstract:
Thermoplastic starch (TPS) plasticized by 1-ethyl-3-methylimidazolium acetate [Emim][OAc] was obtained through a gelatinization process. The gelatinization process occurred in the presence of water and [Emim][OAc] as plasticizer at high temperature (90˚C). The influence of [Emim][OAc] and water on the gelatinization and on the interactions with starch was studied over a range of compositions. A homogeneous mass was obtained for the samples containing 35, 40, and 43.5% water, which showed that water plays an important role in the gelatinization process. Detailed IR spectroscopy analysis showed a decrease in hydrogen bonding intensity and a strong interaction between the acetate anion in [Emim][OAc] and the starch hydroxyl groups in the presence of [Emim][OAc]. The starch-[Emim][OAc]-water mixture at a 10-3-8.7 ratio presented a homogeneous mass, lower hydrogen bonding intensity, and strong interaction between the acetate anion in [Emim][OAc] and the starch hydroxyl groups.
Keywords: Starch, ionic liquid, 1-ethyl-3-methylimidazolium acetate, plasticizer, gelatinization, IR spectroscopy.
5132 Prediction of a Human Facial Image by ANN using Image Data and its Content on Web Pages
Authors: Chutimon Thitipornvanid, Siripun Sanguansintukul
Abstract:
Choosing the right metadata is critical, as good information (metadata) attached to an image will facilitate its visibility in a pile of other images. The image's value is enhanced not only by the quality of the attached metadata but also by the search technique. This study proposes a simple but efficient technique to predict a single human image from a website using the basic image data and the embedded metadata of the image's content appearing on web pages. The result is very encouraging, with a prediction accuracy of 95%. This technique may become a great assist to librarians, researchers, and many others for automatically and efficiently identifying a set of human images out of a greater set of images.
Keywords: Metadata, prediction, multi-layer perceptron, human facial image, image mining.
5131 Vertically Grown p–Type ZnO Nanorod on Ag Thin Film
Authors: Jihyun Park, Tae Il Lee, Jae-Min Myoung
Abstract:
A silver (Ag) thin film is introduced as a template and doping source for vertically aligned p–type ZnO nanorods. The ZnO nanorods were grown using an ammonium hydroxide based hydrothermal process. During the hydrothermal process, the Ag thin film was dissolved to generate Ag ions in the solution. The Ag ions can contribute to doping in the wurtzite structure of ZnO, and the (111) grain of the Ag thin film can serve as an epitaxial temporary template for the (0001) plane of ZnO. Hence, Ag–doped p–type ZnO nanorods were successfully grown on the substrate, which can be an electrode or semiconductor for device applications. To demonstrate the potential of this idea, a p–n diode was fabricated and its electrical characteristics were demonstrated.
Keywords: Ag–doped ZnO nanorods, Hydrothermal process, p–n homo–junction diode, p–type ZnO.
5130 Pattern Recognition Using Feature Based Die-Map Clustering in the Semiconductor Manufacturing Process
Authors: Seung Hwan Park, Cheng-Sool Park, Jun Seok Kim, Youngji Yoo, Daewoong An, Jun-Geol Baek
Abstract:
As big data analysis becomes important, yield prediction using data from the semiconductor process is essential. In general, yield prediction and analysis of the causes of failure are closely related. The purpose of this study is to analyze the patterns that affect the final test results using die map based clustering. Many studies have been conducted using die data from the semiconductor test process. However, such analysis has limitations, as the test data are less directly related to the final test results. Therefore, this study proposes a framework for analysis through clustering using more detailed data than the existing die data. The study consists of three phases. In the first phase, a die map is created from the fail bit data in each sub-area of the die. In the second phase, clustering using the map data is performed. The third phase is to find the patterns that affect the final test result. Finally, the proposed three steps are applied to actual industrial data, and the experimental results show the potential for field application.
Keywords: Die-Map Clustering, Feature Extraction, Pattern Recognition, Semiconductor Manufacturing Process.
5129 European and International Bond Markets Integration
Authors: Dimitris Georgoutsos, Petros M. Migiakis
Abstract:
The current era is characterised by strengthened interactions among financial markets and increased capital mobility globally. In this frame, we examine the effects the international financial integration process has on the European bond markets. We perform a comparative study of the interactions of the European and international bond markets and exploit cointegration analysis results on the elimination of stochastic trends and the decomposition of the underlying long-run equilibria and short-run causal relations. Our investigation provides evidence on the relation between the European integration process and that of globalisation, viewed through the bond markets sector. Additionally, the structural formulation applied offers significant implications of the findings. All in all, our analysis offers a number of answers to crucial queries concerning the European bond markets integration process.
Keywords: financial integration, bond markets, cointegration
5128 Iterative Clustering Algorithm for Analyzing Temporal Patterns of Gene Expression
Authors: Seo Young Kim, Jae Won Lee, Jong Sung Bae
Abstract:
Microarray experiments are information rich; however, extensive data mining is required to identify the patterns that characterize the underlying mechanisms of action. For biologists, a key aim when analyzing microarray data is to group genes based on the temporal patterns of their expression levels. In this paper, we used an iterative clustering method to find temporal patterns of gene expression. We evaluated the performance of this method by applying it to real sporulation data and simulated data. The patterns obtained using the iterative clustering were found to be superior to those obtained using existing clustering algorithms.
Keywords: Clustering, microarray experiment, temporal pattern of gene expression data.
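As a rough illustration of grouping genes by their temporal expression patterns, a plain k-means iteration (assignment step, then centroid update) can stand in for the iterative clustering described above; the paper's actual algorithm is not reproduced, and the expression profiles below are toy data.

```python
import math
import random

def kmeans(profiles, k, iters=50, seed=0):
    """Plain k-means over temporal expression profiles (lists of floats).
    Illustrative stand-in for an iterative clustering procedure."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(profiles, k)]
    labels = [0] * len(profiles)
    for _ in range(iters):
        # Assignment step: nearest center by Euclidean distance
        labels = [min(range(k), key=lambda c: math.dist(p, centers[c]))
                  for p in profiles]
        # Update step: each center becomes the mean of its members
        for c in range(k):
            members = [p for p, l in zip(profiles, labels) if l == c]
            if members:
                centers[c] = [sum(x) / len(members) for x in zip(*members)]
    return labels, centers

# Two obvious temporal patterns: rising vs falling expression over 3 time points
data = [[0, 1, 2], [0, 1.1, 2.2], [2, 1, 0], [2.2, 1, 0.1]]
labels, _ = kmeans(data, 2)
```

On this toy data the rising and falling profiles end up in separate clusters regardless of the random initialization.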
5127 The Effect of Solution Density on the Synthesis of Magnesium Borate from Boron-Gypsum
Authors: N. Tugrul, E. Sariburun, F. T. Senberber, A. S. Kipcak, E. Moroydor Derun, S. Piskin
Abstract:
Boron-gypsum is a waste product of the boric acid production process. In this study, the boron content of this waste is evaluated for use in the synthesis of magnesium borates; such use of this kind of waste is more beneficial than storage or disposal. Magnesium borates, a sub-class of boron minerals, are useful additive materials for industry due to their remarkable thermal and mechanical properties. Magnesium borates were obtained hydrothermally at different temperatures. The novelty of this study is the investigation of the effect of solution density on the magnesium borate synthesis process, to increase the possibility of using boron-gypsum as a raw material. After the synthesis process, the products are subjected to XRD and FT-IR to identify and characterize their crystal structure, respectively.
Keywords: Boron-gypsum, hydrothermal synthesis, magnesium borate, solution density.
5126 E-Voting: A Trustworthiness In Democratic; A View from Technology, Political and Social Issue
Authors: Sera Syarmila Sameon, Rohaini Ramli
Abstract:
A trustworthy voting process in a democracy requires that each vote be recorded with accuracy and impartiality. Accuracy and impartiality can be achieved at a high rate with a biometric system. One such biometric is the fingerprint. Fingerprint recognition is still a challenging problem because of the distortions among different impressions of the same finger. Given the trustworthiness of biometric voting technologies, they may have a great effect on voter participation numbers and on the outcomes of the democratic process. Hence, in this study, the authors design and analyze an electronic voting system and the participation of its users. The system is based on fingerprint minutiae with the addition of a personal ID number, in order to enhance the accuracy and speed of the voting process. The new design is analyzed by conducting a pilot election among a class of students selecting their representative.
Keywords: Biometric, FAR and FRR, democracy, voting.
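The FAR and FRR named in the keywords are simple ratios over match scores. A minimal sketch, with hypothetical score samples and threshold (higher score meaning a stronger match is an assumption of this example, not stated by the paper):

```python
def far_frr(genuine, impostor, threshold):
    """False Acceptance Rate and False Rejection Rate at a score threshold.
    genuine: scores from same-finger comparisons;
    impostor: scores from different-finger comparisons."""
    false_accepts = sum(1 for s in impostor if s >= threshold)
    false_rejects = sum(1 for s in genuine if s < threshold)
    return false_accepts / len(impostor), false_rejects / len(genuine)

# Hypothetical match-score samples
genuine = [0.91, 0.85, 0.78, 0.60, 0.95]
impostor = [0.10, 0.35, 0.55, 0.20, 0.65]
far, frr = far_frr(genuine, impostor, 0.7)
```

Sweeping the threshold trades FAR against FRR; the operating point is chosen from the system's tolerance for wrongly accepted versus wrongly rejected voters.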
5125 Establishment and Evaluation of Information System for Chemotherapy Care
Authors: Yi-Ting Liu, Pei-Ying Wen
Abstract:
In order to improve the overall safety of chemotherapy, a safety-protecting net was established for the whole process, from prescribing by physicians, through transcribing by nurses and dispensing by pharmacists, to administering by nurses. The information system was used to check and monitor the whole administration process, and the related sheets were computerized to simplify the paperwork.
Keywords: Chemotherapy, Bar Code Medication Administration (BCMA), Medication Safety.
5124 Analysis of Noodle Production Process at Yan Hu Food Manufacturing: Basis for Production Improvement
Authors: Rhadinia Tayag-Relanes, Felina C. Young
Abstract:
This study was conducted to analyze the noodle production process at Yan Hu Food Manufacturing as a basis for production improvement. The study utilized the Plan, Do, Check, Act (PDCA) approach and record review to gather data for the calendar year 2019, specifically from August to October, focusing on the noodle products miki, canton, and misua. A causal-comparative research design was employed to establish cause-effect relationships among the variables, using descriptive statistics and correlation to analyze the data gathered. The findings indicate that miki, canton, and misua production have distinct cycle times and production outputs in each set of production processes, as well as varying levels of wastage. The company has not yet established a formal allowable rejection rate for wastage; instead, this paper used a 1% wastage limit. We recommend the following: the machines used for each process of each noodle product must be consistently maintained and monitored; all production operators should be assessed statistically based on their output and the machine performance; a root cause analysis must be conducted to identify solutions to production issues; and an improved recording system for the input and output of the production process of each noodle product should be established to eliminate poor recording of data.
Keywords: Production, continuous improvement, process, operations, Plan, Do, Check, Act approach.
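The 1% wastage limit used in the study reduces to a simple input-output ratio check per product line. A minimal sketch; the batch figures below are hypothetical, not the company's data:

```python
def wastage_rate(input_kg, output_kg):
    """Wastage as a share of material input."""
    return (input_kg - output_kg) / input_kg

# Hypothetical batch figures (input kg, output kg) per noodle line
batches = {"miki": (500.0, 496.5), "canton": (400.0, 393.0), "misua": (300.0, 298.2)}
LIMIT = 0.01  # the 1% allowable wastage limit adopted in the paper

# For each line: (wastage rate, within the 1% limit?)
report = {name: (wastage_rate(inp, out), wastage_rate(inp, out) <= LIMIT)
          for name, (inp, out) in batches.items()}
```

Lines exceeding the limit would then be candidates for the root cause analysis the study recommends.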
5123 Studies on Lucrative Process Layout for Medium Scale Industries
Authors: Balamurugan Baladhandapani, Ganesh Renganathan, V. R. Sanal Kumar
Abstract:
In this paper, a comprehensive review of various factory layouts has been carried out with the aim of designing a lucrative process layout for medium scale industries. Industry databases reveal that the end-product rejection rate is on the order of 10%, amounting to a large profit loss. In order to avoid these rejection rates and to increase quality production, an intermediate non-destructive testing facility (INDTF) is recommended for increasing overall profit. We observed through detailed case studies that when INDTF is introduced in medium scale industries, expensive production processing of defective products can be avoided well before they reach their final shape. Additionally, the defective products identified during the intermediate stage can be effectively utilized for other applications or recycling, thereby reducing the overall wastage of raw materials and increasing profit. We conclude that the prudent design of a factory layout through the critical path method, together with INDTF, will warrant a profitable outcome.
Keywords: Intermediate Non-destructive testing, Medium scale industries, Process layout design.
5122 Asymmetric Tukey’s Control Chart Robust to Skew and Non-Skew Process Observation
Authors: S. Sukparungsee
Abstract:
In practice, process observations often depart from the assumption that they are normally distributed. The observations may follow skewed distributions, for which an asymmetric chart should be used rather than a symmetric one. Consequently, this research aims to study the robustness of the asymmetric Tukey’s control chart for skewed and non-skewed distributions, namely the lognormal and Laplace distributions. Furthermore, the performance in detecting a parameter change of the asymmetric and symmetric Tukey’s control charts is compared using the average of the average run length (AARL). The results show that the asymmetric chart performs better than the symmetric Tukey’s control chart in both cases of skewed and non-skewed process observations.
Keywords: Asymmetric control limit, average of average run length, Tukey’s control chart and skew distributions.
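Tukey's control chart derives its limits from the quartiles of the observations, which is what makes it robust to departures from normality. A minimal sketch of the symmetric variant (the asymmetric chart studied above uses different coefficients on each side, which is not reproduced here); the data are toy values:

```python
def tukey_limits(data, k=1.5):
    """Symmetric Tukey control limits: [Q1 - k*IQR, Q3 + k*IQR]."""
    xs = sorted(data)
    n = len(xs)

    def quantile(q):  # simple linear-interpolation quantile
        pos = q * (n - 1)
        lo = int(pos)
        frac = pos - lo
        return xs[lo] if lo + 1 >= n else xs[lo] * (1 - frac) + xs[lo + 1] * frac

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

# Toy in-control observations plus one outlying point (30)
lcl, ucl = tukey_limits([2, 4, 4, 5, 5, 6, 7, 30])
```

A point outside [lcl, ucl] signals an out-of-control process; the asymmetric version replaces the single k with separate lower and upper coefficients tuned to the skew direction.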
5121 Nugget Formation during Resistance Spot Welding using Finite Element Model
Authors: Jawad Saleem, Abdul Majid, Kent Bertilsson, Torbjörn Carlberg, Nazar Ul Islam
Abstract:
The resistance spot welding process comprises electrical, thermal, and mechanical phenomena, which makes the process complex and highly non-linear and thus difficult to model. In order to obtain a good weld nugget during spot welding, trial-and-error welds are usually made, which is very costly. Therefore, numerical simulation research has been conducted to understand the whole process. In this paper, three different cases were analyzed by varying the tip contact area, and it was observed that varying the tip contact area affects the nugget formation at the faying surface. The tip contact area of the welding electrode becomes large over long welding cycles. Therefore, in order to maintain consistency of nugget formation during the welding process, current compensation in the control feedback is required. If the contact area of the welding electrode tip is reduced, a large amount of current flows through the faying surface, as a result of which sputtering occurs.
Keywords: Resistance spot welding, finite element modeling, nugget formation, welding electrode, numerical simulation.
5120 Improvement of GVPI Insulation System Characteristics by Curing Process Modification
Authors: M. Shadmand
Abstract:
The curing process of the insulation system for electrical machines plays a determining role in its durability and reliability. The polar structure of the insulating resin molecules and the filler used in the insulation system offer an opportunity to enhance the overall characteristics of the insulation system, both mechanically and electrically. The curing regime plays an important role in the mechanical and electrical characteristics by arranging the polymerization of the resin chain structure. In this research, the effect of applying an electrical field during curing of the insulation system for the Global Vacuum Pressurized Impregnation (GVPI) system of a traction motor was considered by performing dissipation factor, polarization and depolarization current (PDC), and voltage endurance (aging) measurements on sample test objects. The results showed an obvious improvement in the mechanical strength of the insulation system as well as better electrical characteristics in routine and long-term (aging) electrical tests. Taken together, polarization of the insulation system during the curing process would enhance the machine lifetime.
Keywords: Insulation system, GVPI, PDC, aging.
5119 WebGD: A CORBA-based Document Classification and Retrieval System on the Web
Authors: Fuyang Peng, Bo Deng, Chao Qi, Mou Zhan
Abstract:
This paper presents the design and implementation of WebGD, a CORBA-based document classification and retrieval system on the Internet. WebGD makes use of techniques such as the Web, CORBA, Java, NLP, fuzzy techniques, knowledge-based processing, and database technology. A unified classification and retrieval model, classification and retrieval with one reasoning engine, and flexible working mode configuration are some of its main features. The architecture of WebGD, the unified classification and retrieval model, the components of the WebGD server, and the fuzzy inference engine are discussed in this paper in detail.
Keywords: Text mining, document classification, knowledge processing, fuzzy logic, Web, CORBA.
5118 The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics
Authors: Pantaleon Lutta, Mohamed Sedky, Mohamed Hassan
Abstract:
The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities remains a risk concern. Policy makers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and no or only limited historical data are stored on the devices. This paper highlights why IoT forensics is a unique adventure and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects in IoT forensics. The model analyses the effectiveness of the forensic investigation process versus the admissibility and integrity of the evidence, taking into account user privacy and the providers’ compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes and hence resolve the issues of data protection (privacy and confidentiality).
Keywords: Cloud forensics, data protection laws, GDPR, IoT forensics, machine learning.
5117 An Overview of Construction and Demolition Waste as Coarse Aggregate in Concrete
Authors: S. R. Shamili, J. Karthikeyan
Abstract:
Rapid growth of the world's population and widespread urbanization have remarkably expanded the construction industry. As a result of these activities, old structures are being demolished to make way for new buildings. Due to these large-scale demolitions, a huge amount of debris is generated all over the world, much of which ends up in landfill. The use of construction and demolition waste as landfill causes groundwater contamination, which is hazardous. Using construction and demolition waste as aggregate can reduce the use of natural aggregates and the problems of mining. The objective of this study is to provide a detailed overview of how construction and demolition waste material has been used as aggregate in structural concrete. In this study, the preparation, classification, and composition of construction and demolition wastes are also discussed.
Keywords: Aggregate, construction and demolition waste, landfill, large scale demolition.
5116 The Study of the Discrete Risk Model with Random Income
Authors: Peichen Zhao
Abstract:
In this paper, we extend the compound binomial model to the case where the premium income process, based on a binomial process, is no longer a linear function. First, a recursive formula is derived for the non-ruin probability. Second, we show that the expected discounted penalty function satisfies a defective renewal equation. Third, an asymptotic estimate for the expected discounted penalty function is given. Finally, we give two examples of ruin quantities to illustrate applications of the recursive formula and of the asymptotic estimate for the penalty function.
Keywords: Discounted penalty function, compound binomial process, recursive formula, discrete renewal equation, asymptotic estimate.
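Ruin quantities of this kind can also be approximated by simulation when a recursive formula is not at hand. The Monte Carlo sketch below uses a discrete-time model with Bernoulli random premium income; it is an illustration only, not the paper's exact model or its recursive formula, and all parameters are hypothetical.

```python
import random

def ruin_prob_mc(u0, p_claim, claim_dist, p_income,
                 horizon=200, runs=20000, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability for a
    discrete-time risk model: in each period a unit premium arrives with
    probability p_income (random income), and a claim drawn uniformly
    from claim_dist arrives with probability p_claim."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(runs):
        u = u0  # surplus starts at the initial capital
        for _ in range(horizon):
            if rng.random() < p_income:
                u += 1
            if rng.random() < p_claim:
                u -= rng.choice(claim_dist)
            if u < 0:  # ruin: surplus falls below zero
                ruined += 1
                break
    return ruined / runs

# Hypothetical parameters: positive drift (mean income 0.7 > mean claim 0.6)
psi = ruin_prob_mc(u0=5, p_claim=0.3, claim_dist=[1, 2, 3], p_income=0.7)
```

An exact recursion over the surplus levels, as derived in the paper, replaces such simulation with a deterministic computation.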