Search results for: reconfigurable machine tool
1732 Steady State Thermal Analysis and Design of a Cooling System in an AFPM Motor
Authors: K. Sarrafan, A. Darabi
Abstract:
In this paper, the steady-state temperature of a sample 500 kW two-rotor, one-stator non-slotted axial flux permanent magnet motor is calculated using a finite element simulation software package. Due to the high temperature in various parts of the machine, especially in the stator winding, a cooling system is designed for the motor and the temperature is recalculated. The results show that the temperature obtained for the parts is within the permissible range.
Keywords: Axial Flux, Cooling System, Permanent Magnet, Thermal Analysis.
1731 Identification of Printed Punjabi Words and English Numerals Using Gabor Features
Authors: Rajneesh Rani, Renu Dhir, G. S. Lehal
Abstract:
Script identification is one of the challenging steps in the development of an optical character recognition system for bilingual or multilingual documents. In this paper, an attempt is made to identify English numerals at the word level in Punjabi documents using Gabor features. A support vector machine (SVM) classifier with five-fold cross-validation is used to classify the word images. The results obtained are quite encouraging: average accuracy with the RBF, polynomial and linear kernel functions is greater than 99%.
Keywords: Script identification, Gabor features, support vector machines.
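As a rough illustration of the classification step described above (a sketch, not the authors' implementation), an SVM can be trained with five-fold cross-validation on precomputed word-image feature vectors; the Gabor feature extraction itself is assumed to have been done upstream, and the RBF, polynomial and linear kernels are compared. The feature dimensions and class sizes below are placeholders.

    # Hedged sketch: SVM with five-fold cross-validation over precomputed
    # word-image feature vectors (Gabor extraction assumed done upstream).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 64))          # placeholder Gabor feature vectors
    y = rng.integers(0, 2, size=400)        # 0 = Punjabi word, 1 = English numeral

    for kernel in ("rbf", "poly", "linear"):
        scores = cross_val_score(SVC(kernel=kernel), X, y, cv=5)
        print(kernel, scores.mean())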
1730 Evaluation of the Analytic for Hemodynamic Instability as A Prediction Tool for Early Identification of Patient Deterioration
Authors: Bryce Benson, Sooin Lee, Ashwin Belle
Abstract:
Unrecognized or delayed identification of patient deterioration is a key cause of in-hospital adverse events. Clinicians rely on vital signs monitoring to recognize patient deterioration. However, due to ever-increasing nursing workloads and the manual effort required, vital signs tend to be measured and recorded intermittently and inconsistently, causing large gaps during patient monitoring. Additionally, during deterioration, the body’s autonomic nervous system activates compensatory mechanisms, causing the vital signs to be lagging indicators of underlying hemodynamic decline. This study analyzes the predictive efficacy of the Analytic for Hemodynamic Instability (AHI) system, an automated tool designed to help clinicians in the early identification of deteriorating patients. The lead time analysis in this retrospective observational study assesses how far in advance AHI predicted deterioration before an episode of hemodynamic instability (HI) became evident through vital signs. Results indicate that of the 362 episodes of HI in this study, 308 episodes (85%) were correctly predicted by the AHI system with a median lead time of 57 minutes and an average of 4 hours (240.5 minutes). Of the 54 episodes not predicted, AHI detected 45 of them while the episode of HI was ongoing. Of the 9 undetected, 5 were not detected by AHI due to either missing or noisy input ECG data during the episode of HI. In total, AHI was able to either predict or detect 98.9% of all episodes of HI in this study. These results suggest that AHI could provide an additional ‘pair of eyes’ on patients, continuously filling the monitoring gaps and consequently giving the patient care team the ability to be far more proactive in patient monitoring and adverse event management.
Keywords: Clinical deterioration prediction, decision support system, early warning system, hemodynamic status, physiologic monitoring.
1729 Per Flow Packet Scheduling Scheme to Improve the End-to-End Fairness in Mobile Ad Hoc Wireless Network
Authors: K. Sasikala, R. S. D Wahidabanu
Abstract:
Various fairness models and criteria proposed by academia and industry for wired networks can be applied to ad hoc wireless networks. Achieving end-to-end fairness in an ad hoc wireless network is more challenging than in a wired network, and it has not been addressed effectively. Most of the traffic in an ad hoc network consists of transport layer flows, and thus the fairness of transport layer flows has attracted the interest of researchers. Factors such as the MAC protocol, routing protocol, route length, buffer size, active queue management algorithm and congestion control algorithm affect the fairness of transport layer flows. In this paper, we consider the rate of data transmission, queue management and the packet scheduling technique. An ad hoc network is dynamic in nature; factors such as control packet transmission, multihop forwarding, changes in source and destination nodes, and changes in the routing path influence the throughput and fairness among concurrent flows. In addition, the interaction between the data link and transport layer protocols also plays a role in determining the data transmission rate. We maintain a queue for each flow, and the delay information of each flow is maintained accordingly. The pre-processing of a flow is done up to the network layer only: the source and destination address information is used for separating flows and the transport layer information is not used, which minimizes delay in the network. Each flow is attached to a timer that is updated dynamically. A Finite State Machine (FSM) is proposed for the queue and transmission control mechanism. The performance of the proposed approach is evaluated in the ns-2 simulation environment, with throughput and fairness under mobility for different flows used as performance metrics. We have compared the proposed approach with ATP and MC-MLAS, and the performance of the proposed approach is encouraging.
Keywords: ATP, End-to-End fairness, FSM, MAC, QoS.
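A minimal sketch of the per-flow queuing idea described above, assuming flows are separated only by source and destination addresses and stale flows are expired by a timer; the FSM-based scheduler and the ns-2 evaluation are not reproduced here, and the timeout value is an assumption.

    # Hedged sketch: per-flow queues keyed by (source, destination) only,
    # with a per-flow timer used to expire inactive flows.
    import time
    from collections import defaultdict, deque

    FLOW_TIMEOUT = 5.0                      # assumed timeout in seconds
    queues = defaultdict(deque)             # one FIFO queue per flow
    last_seen = {}                          # per-flow timer

    def enqueue(src, dst, packet):
        flow = (src, dst)                   # transport-layer info is not used
        queues[flow].append(packet)
        last_seen[flow] = time.monotonic()

    def expire_stale_flows():
        now = time.monotonic()
        for flow in [f for f, t in last_seen.items() if now - t > FLOW_TIMEOUT]:
            queues.pop(flow, None)
            last_seen.pop(flow, None)

    enqueue("10.0.0.1", "10.0.0.9", b"payload")
    expire_stale_flows()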
1728 Analysis of Surface Hardness, Surface Roughness, and Near Surface Microstructure of AISI 4140 Steel Worked with Turn-Assisted Deep Cold Rolling Process
Authors: P. R. Prabhu, S. M. Kulkarni, S. S. Sharma, K. Jagannath, Achutha Kini U.
Abstract:
In the present study, response surface methodology has been used to optimize the turn-assisted deep cold rolling process of AISI 4140 steel. A regression model is developed to predict surface hardness and surface roughness using response surface methodology and a central composite design. In the development of the predictive model, deep cold rolling force, ball diameter, initial roughness of the workpiece, and number of tool passes are considered as model variables. The rolling force and ball diameter are the significant factors for surface hardness, while ball diameter and number of tool passes are found to be significant for surface roughness. The predicted surface hardness and surface roughness values and the subsequent verification experiments under the optimal operating conditions confirmed the validity of the predicted model. The absolute average error between the experimental and predicted values at the optimal combination of parameter settings is 0.16% for surface hardness and 1.58% for surface roughness. Using the optimal processing parameters, the surface hardness is improved from 225 to 306 HV, an increase in the near surface hardness of about 36%, and the surface roughness is improved from 4.84 µm to 0.252 µm, a decrease of about 95%. The depth of compression is found to be more than 300 µm from the microstructure analysis, which correlates with the results obtained from the microhardness measurements. A Taylor Hobson Talysurf tester, a micro Vickers hardness tester, optical microscopy and X-ray diffraction are used to characterize the modified surface layer.
Keywords: Surface hardness, response surface methodology, microstructure, central composite design, deep cold rolling, surface roughness.
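As an illustration of the regression step above, a second-order response surface for surface hardness can be fitted from the four factors named in the abstract. This is a sketch on assumed placeholder data, not the study's actual central composite design matrix or coefficients.

    # Hedged sketch: second-order response surface fit for surface hardness
    # from rolling force, ball diameter, initial roughness and number of passes.
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.uniform(size=(30, 4))            # placeholder factor settings in coded units
    y = 250 + 40 * X[:, 0] + 15 * X[:, 1] + rng.normal(scale=2.0, size=30)  # fake hardness

    model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    model.fit(X, y)
    print(model.predict([[0.5, 0.5, 0.5, 0.5]]))   # predicted hardness at a mid-level setting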
1727 Analysis of Palm Perspiration Effect with SVM for Diabetes in People
Authors: Hamdi Melih Saraoğlu, Muhlis Yıldırım, Abdurrahman Özbeyaz, Feyzullah Temurtas
Abstract:
In this research, the diabetes condition of subjects (healthy, prediabetic and diabetic) was identified using noninvasive palm perspiration measurements. Two data clusters gathered from 200 subjects were used: (1) an individual attributes cluster and (2) a palm perspiration attributes cluster. To decrease the dimensions of these data clusters, Principal Component Analysis was used. The data clusters prepared in this way were classified with Support Vector Machines. The highest classification accuracies were 82% for the glucose parameters and 84% for the HbA1c parameters.
Keywords: Palm perspiration, Diabetes, Support Vector Machine, Classification.
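A minimal sketch of the reported processing chain, assuming placeholder measurement vectors in place of the real palm perspiration and individual-attribute data: dimensions are reduced with PCA and the reduced data are classified with an SVM.

    # Hedged sketch: PCA for dimensionality reduction followed by SVM
    # classification into healthy / prediabetic / diabetic classes.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 12))          # placeholder attribute vectors for 200 subjects
    y = rng.integers(0, 3, size=200)        # 0 healthy, 1 prediabetic, 2 diabetic

    clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
    print(cross_val_score(clf, X, y, cv=5).mean())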
1726 Enhancing Visual Basic GUI Applications using VRML Scenes
Authors: Bala Dhandayuthapani Veerasamy
Abstract:
Rapid Application Development (RAD) addresses the ever-expanding need for speedy development of computer application programs that are sophisticated, reliable, and full-featured. Visual Basic was the first RAD tool for the Windows operating system, and many people still say it is the best. To make Visual Basic 6 applications more visually attractive, this paper describes how to use VRML scenes within the Visual Basic environment.
Keywords: Cortona Control, Interpolator, Route, Sensor, Visual Basic, VRML.
1725 Extended Least Squares LS–SVM
Authors: József Valyon, Gábor Horváth
Abstract:
Among neural models, Support Vector Machine (SVM) solutions are attracting increasing attention, mostly because they eliminate certain crucial questions involved in neural network construction. The main drawback of the standard SVM is its high computational complexity; therefore, a new technique, the Least Squares SVM (LS–SVM), has recently been introduced. In this paper we present an extended view of Least Squares Support Vector Regression (LS–SVR), which enables us to develop new formulations and algorithms for this regression technique. Based on manipulating the linear equation set, which embodies all information about the regression in the learning process, some new methods are introduced to simplify the formulations, speed up the calculations and/or provide better results.
Keywords: Function estimation, Least Squares Support Vector Machines, Regression, System Modeling.
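The linear equation set mentioned above can be written down concretely for standard LS-SVM regression. The sketch below is the generic textbook formulation, not the authors' extended method: it builds an RBF kernel matrix K and solves [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y] for the bias b and the support values alpha, with the kernel width and regularization value chosen arbitrarily.

    # Hedged sketch: standard LS-SVM regression solved as one linear system.
    import numpy as np

    def rbf_kernel(A, B, sigma=1.0):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    rng = np.random.default_rng(3)
    X = rng.uniform(-3, 3, size=(50, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=50)

    gamma, n = 10.0, len(y)
    K = rbf_kernel(X, X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]

    X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
    y_pred = rbf_kernel(X_test, X) @ alpha + b     # f(x) = sum_i alpha_i k(x, x_i) + b
    print(y_pred)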
1724 Analysis of Modified Heap Sort Algorithm on Different Environment
Authors: Vandana Sharma, Parvinder S. Sandhu, Satwinder Singh, Baljit Saini
Abstract:
In the fields of Computer Science and Mathematics, a sorting algorithm is an algorithm that puts the elements of a list in a certain order, i.e. ascending or descending. Sorting is perhaps the most widely studied problem in computer science and is frequently used as a benchmark of a system's performance. This paper presents a comparative performance study of four sorting algorithms on different platforms. For each machine, it is found that the relative performance of the algorithms depends upon the number of elements to be sorted. In addition, as expected, the results show that the relative performance of the algorithms differed on the various machines. Thus, algorithm performance depends on data size, and hardware also has an impact.
Keywords: Algorithm, Analysis, Complexity, Sorting.
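For reference, a plain heap sort and the kind of size-scaling timing loop this comparison relies on can be sketched as follows; this is a generic illustration, not the paper's modified algorithm or its test harness.

    # Hedged sketch: textbook heap sort plus a simple timing loop over data sizes.
    import random, time

    def heapify(a, n, i):
        largest = i
        left, right = 2 * i + 1, 2 * i + 2
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest != i:
            a[i], a[largest] = a[largest], a[i]
            heapify(a, n, largest)

    def heap_sort(a):
        n = len(a)
        for i in range(n // 2 - 1, -1, -1):   # build a max-heap
            heapify(a, n, i)
        for i in range(n - 1, 0, -1):         # repeatedly move the max to the end
            a[0], a[i] = a[i], a[0]
            heapify(a, i, 0)

    for size in (1_000, 10_000, 100_000):
        data = [random.random() for _ in range(size)]
        t0 = time.perf_counter()
        heap_sort(data)
        print(size, round(time.perf_counter() - t0, 4), "s")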
1723 Path Planning of a Robot Manipulator using Retrieval RRT Strategy
Authors: K. Oh, J. P. Hwang, E. Kim, H. Lee
Abstract:
This paper presents an algorithm which extends the rapidly-exploring random tree (RRT) framework to deal with changes in the task environment. This algorithm, called the Retrieval RRT Strategy (RRS), combines a support vector machine (SVM) and RRT and plans the robot motion in the presence of changes in the surrounding environment. The algorithm consists of two levels. At the first level, the SVM is built and selects a proper path from the bank of RRTs for a given environment. At the second level, a real path is planned by the RRT planners for the given environment. The suggested method is applied to the control of KUKA™, a commercial 6 DOF robot manipulator, and its feasibility and efficiency are demonstrated via the co-simulation of MatLab™ and RecurDyn™.
Keywords: Path planning, RRT, 6 DOF manipulator, SVM.
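A bare-bones 2-D RRT of the kind each planner in the bank would run is sketched below, with an assumed single circular obstacle and step size; the SVM-based retrieval level and the 6 DOF manipulator kinematics are omitted.

    # Hedged sketch: minimal 2-D rapidly-exploring random tree with one circular obstacle.
    import math, random

    START, GOAL = (0.0, 0.0), (9.0, 9.0)
    OBSTACLE, RADIUS, STEP = (5.0, 5.0), 1.5, 0.5

    def collision_free(p):
        return math.dist(p, OBSTACLE) > RADIUS

    def steer(a, b, step=STEP):
        d = math.dist(a, b)
        t = min(1.0, step / d) if d > 0 else 0.0
        return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

    nodes, parent = [START], {START: None}
    for _ in range(5000):
        sample = (random.uniform(0, 10), random.uniform(0, 10))
        nearest = min(nodes, key=lambda n: math.dist(n, sample))
        new = steer(nearest, sample)
        if collision_free(new):
            nodes.append(new)
            parent[new] = nearest
            if math.dist(new, GOAL) < STEP:       # close enough: trace the path back
                path = [new]
                while parent[path[-1]] is not None:
                    path.append(parent[path[-1]])
                print("path found with", len(path), "nodes")
                break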
1722 Data Preprocessing for Supervised Learning
Authors: S. B. Kotsiantis, D. Kanellopoulos, P. E. Pintelas
Abstract:
Many factors affect the success of Machine Learning (ML) on a given task. The representation and quality of the instance data is first and foremost: if there is much irrelevant and redundant information present, or the data is noisy and unreliable, then knowledge discovery during the training phase is more difficult. It is well known that data preparation and filtering steps take a considerable amount of processing time in ML problems. Data pre-processing includes data cleaning, normalization, transformation, feature extraction and selection, etc. The product of data pre-processing is the final training set. It would be ideal if a single sequence of data pre-processing algorithms had the best performance for every data set, but this is not the case. Thus, we present the most well-known algorithms for each step of data pre-processing so that one can achieve the best performance for their data set.
Keywords: Data mining, feature selection, data cleaning.
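One possible sequence of the pre-processing steps listed above, sketched with scikit-learn on placeholder data; the survey itself covers many alternative algorithms for each step, so the particular choices here (mean imputation, standardization, a univariate filter) are only an example.

    # Hedged sketch: one cleaning -> normalization -> feature-selection sequence
    # producing a final training set.
    import numpy as np
    from sklearn.impute import SimpleImputer
    from sklearn.preprocessing import StandardScaler
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(4)
    X = rng.normal(size=(100, 20))
    X[rng.random(X.shape) < 0.05] = np.nan   # simulate missing / unreliable entries
    y = rng.integers(0, 2, size=100)

    prep = make_pipeline(
        SimpleImputer(strategy="mean"),      # data cleaning
        StandardScaler(),                    # normalization
        SelectKBest(f_classif, k=8),         # feature selection
    )
    X_train = prep.fit_transform(X, y)       # the product is the final training set
    print(X_train.shape)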
1721 Investigating Transformations in the Cartesian Plane Using Spreadsheets
Authors: D. Allison, A. Didenko, G. Miller
Abstract:
The link between coordinate transformations in the plane and their effects on the graph of a function can be difficult for students studying college level mathematics to comprehend. To solidify this conceptual link in the mind of a student, Microsoft Excel can serve as a convenient graphing tool and pedagogical aid. The authors of this paper describe how various transformations and their related functional symmetry properties can be graphically displayed with an Excel spreadsheet.
Keywords: Mathematics education, Microsoft Excel spreadsheet, technology.
1720 A Study of Cooperative Co-evolutionary Genetic Algorithm for Solving Flexible Job Shop Scheduling Problem
Authors: Lee Yih Rou, Hishammuddin Asmuni
Abstract:
The Flexible Job Shop Problem (FJSP) is an extension of the classical Job Shop Problem (JSP). The FJSP extends the routing flexibility of the JSP, i.e., the assignment of a machine to an operation, which makes it more difficult than the JSP. In this study, a Cooperative Co-evolutionary Genetic Algorithm (CCGA) is presented to solve the FJSP. Makespan (the time needed to complete all jobs) is used as the performance measure for the CCGA. In order to test the performance and efficiency of our CCGA, benchmark problems are solved. Computational results show that the proposed CCGA is comparable with other approaches.
Keywords: Co-evolution, Genetic Algorithm (GA), Flexible Job Shop Problem (FJSP).
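The makespan objective used above can be evaluated for any candidate solution once operations have been assigned to machines and placed in a dispatch order. The small sketch below uses assumed toy job data and shows only the fitness evaluation, not the CCGA itself.

    # Hedged sketch: makespan of one FJSP schedule, i.e. a machine assignment plus
    # an operation sequence respecting job precedence and machine availability.
    # Each operation: (job, assigned machine, processing time), in dispatch order.
    schedule = [
        ("J1", "M1", 3), ("J2", "M2", 2), ("J1", "M2", 4),
        ("J2", "M1", 1), ("J3", "M1", 5), ("J3", "M2", 2),
    ]

    machine_free = {}   # time at which each machine becomes idle
    job_ready = {}      # time at which each job's previous operation finishes

    for job, machine, p in schedule:
        start = max(machine_free.get(machine, 0), job_ready.get(job, 0))
        finish = start + p
        machine_free[machine] = finish
        job_ready[job] = finish

    makespan = max(machine_free.values())    # time needed to complete all jobs
    print("makespan:", makespan)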
1719 Chatter Stability Characterization of Full-Immersion End-Milling Using a Generalized Modified Map of the Full-Discretization Method, Part 1: Validation of Results and Study of Stability Lobes by Numerical Simulation
Authors: Chigbogu G. Ozoegwu, Sam N. Omenyi
Abstract:
The objective of this work is to generate and discuss the stability results of the fully-immersed end-milling process with parameters: tool mass m = 0.0431 kg, tool natural frequency ωn = 5700 rad s^-1, damping factor ξ = 0.002 and workpiece cutting coefficient C = 3.5x10^7 Nm^-7/4. Different numbers of teeth are considered for the end-milling. Both 1-DOF and 2-DOF chatter models of the system are generated on the basis of a non-linear force law. Chatter stability analysis is carried out using a modified form (generalized for both the 1-DOF and 2-DOF models) of the recently developed method called full-discretization. The full-immersion three-tooth end-milling, together with higher-toothed end-milling processes, has secondary Hopf bifurcation lobes (SHBL’s) that exhibit one turning (minimum) point each. Each such SHBL is demarcated by its minimum point into two portions: (i) the Lower Spindle Speed Portion (LSSP), in which bifurcations occur in the right half of the unit circle centred at the origin of the complex plane, and (ii) the Higher Spindle Speed Portion (HSSP), in which bifurcations occur in the left half of the unit circle. Comments are made regarding why bifurcation lobes should generally get bigger and more visible with increase in spindle speed and why flip bifurcation lobes (FBL’s) could be invisible in the low-speed stability chart but visible in the high-speed stability chart of the fully-immersed three-tooth miller.
Keywords: Chatter, flip bifurcation, modified full-discretization map, stability lobe, secondary Hopf bifurcation.
1718 A Tool for Creation of Artificial Symbiotic Associations of Wheat
Authors: Zilya R. Vershinina, Andrei K. Baymiev, Aleksei K. Baymiev, Aleksei V. Chemeris
Abstract:
This paper reports the optimization of the bioballistic transformation of spring soft wheat (Triticum aestivum L., cultivar Raduga) and the production of transgenic plants carrying the pea lectin gene. This gene will allow the creation of new associative symbioses between wheat and the nodule bacteria of field pea, which have growth-promoting, fungistatic and other useful characteristics.
Keywords: transgenic wheat, pea lectin, rhizobia root colonization, symbiosis.
1717 Continuous Text Translation Using Text Modeling in the Thetos System
Authors: Nina Suszczanska, Przemyslaw Szmal, Slawomir Kulikow
Abstract:
In this paper a method of modeling text for Polish is discussed. The method is aimed at transforming continuous input text into a text consisting of sentences in so-called canonical form, which is characterized, among other things, by a complete structure and the absence of anaphora or ellipses. The transformation is lossless with respect to the content of the text being transformed. The modeling method has been worked out for the needs of the Thetos system, which translates Polish written texts into Polish sign language. We believe that the method can also be used in various applications that deal with natural language, e.g. in a text summary generator for Polish.
Keywords: anaphora, machine translation, NLP, sign language, text syntax.
1716 Support Vector Fuzzy Based Neural Networks For Exchange Rate Modeling
Authors: Chokri Slim
Abstract:
A novel fuzzy neural network combined with a support vector learning mechanism, called the support-vector-based fuzzy neural network (SVBFNN), is proposed. The SVBFNN combines the capability of minimizing the empirical risk (training error) and expected risk (testing error) of support vector learning in high-dimensional data spaces with the efficient human-like reasoning of the FNN.
Keywords: Neural network, fuzzy inference, machine learning, fuzzy modeling and rule extraction, support vector regression.
1715 Symbolic Model Checking of Interactions in Sequence Diagrams with Combined Fragments by SMV
Authors: Yuka Kawakami, Tomoyuki Yokogawa, Hisashi Miyazaki, Sousuke Amasaki, Yoichiro Sato, Michiyoshi Hayase
Abstract:
In this paper, we propose a method for detecting consistency violations between state machine diagrams and a sequence diagram defined in UML 2.0 using SMV. We extended a method that expresses these diagrams as defined in UML 1.0 with boolean formulas so that it can express a sequence diagram with the combined fragments introduced in UML 2.0. This extension made it possible to represent three types of combined fragment: alternative, option and parallel. As a result of the experiment, we confirmed that the proposed method could correctly detect consistency violations with SMV.
Keywords: UML, model checking, SMV, sequence diagram.
1714 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru
Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar
Abstract:
Nowadays, Heritage Building Information Modeling (HBIM) is considered an efficient tool to represent and manage information of Cultural Heritage (CH). This tool relies on a 3D model generally obtained from a Cloud-to-BIM procedure. There are different methods to create an HBIM model, ranging from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection of these methods depends on the desired Level of Development (LOD), Level of Information (LOI), Grade of Generation (GOG) as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit and the Dynamo interface following a three-step methodology. The first step consists of the manual modeling of simple structural (e.g., regular walls, columns, floors, wall openings, etc.) and architectural (e.g., cornices, moldings and other minor details) elements using the point cloud as reference. Then, Dynamo is used for generative modeling of complex structural elements such as vaults, infills and domes. Finally, semantic information (e.g., materials, typology, state of conservation, etc.) and pathologies are added within the HBIM model as text parameters and generic model families, respectively. The application of this methodology allows the documentation of CH following a relatively simple process that ensures adequate LOD, LOI and GOG levels. In addition, the easy implementation of the method, as well as the fact that only one BIM software package with its respective plugin is used for the scan-to-BIM modeling process, means that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license.
Keywords: Cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit.
1713 Building Relationship Network for Machine Analysis from Wear Debris Measurements
Authors: Qurban A Memon, Mohammad S. Laghari
Abstract:
The main focus of this paper is the integration of system process information, obtained through an image processing system, with an evolving knowledge database to improve the accuracy and predictability of wear debris analysis. The objective is to intelligently automate the analysis of wear particles using classification via self-organizing maps. This is achieved using relationship measurements among corresponding attributes of various wear debris measurements. Finally, a visualization technique is proposed that helps the viewer understand and utilize these relationships to enable accurate diagnostics.
Keywords: Relationship Network, Relationship Measurement, Self-organizing Clusters, Wear Debris Analysis, Kohonen Network.
1712 Steady-State Performance of a New Model for UPFC Applied to Multi-Machines System with Nonlinear Load
Authors: S.Ali Al-Mawsawi
Abstract:
In this paper, a newly developed construction model of the UPFC is proposed. The construction of this model consists of one shunt compensation block and two series compensation blocks. The UPFC with the new construction model is investigated when it is installed in a multi-machine system with a nonlinear load model. In addition, the steady-state performance of the new model operating as impedance compensation is presented and compared with that obtained from the system without compensation.
Keywords: UPFC, PWM, Nonlinear load, Multi-Machines system.
1711 Ranking - Convex Risk Minimization
Authors: Wojciech Rejchel
Abstract:
The problem of ranking (rank regression) has become popular in the machine learning community. This theory relates to problems in which one has to predict (guess) the order between objects on the basis of vectors describing their observed features. In many ranking algorithms a convex loss function is used instead of the 0-1 loss, which makes these procedures computationally efficient. Hence, convex risk minimizers and their statistical properties are investigated in this paper. Fast rates of convergence are obtained under conditions that look similar to the ones from classification theory. The methods used in this paper come from the theory of U-processes as well as empirical processes.
Keywords: Convex loss function, empirical risk minimization, empirical process, U-process, boosting, Euclidean family.
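In the pairwise setting above, replacing the 0-1 loss on a pair of objects with a convex surrogate such as the hinge loss gives an empirical risk that is straightforward to minimize. The sketch below is a generic illustration on synthetic data, not the paper's estimator or its theoretical analysis; the linear scoring function and the use of scipy's general-purpose minimizer are assumptions.

    # Hedged sketch: empirical convex (hinge) ranking risk of a linear scoring
    # function over all ordered object pairs, minimized numerically.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(5)
    X = rng.normal(size=(60, 4))
    y = X @ np.array([1.0, -0.5, 0.2, 0.0]) + rng.normal(scale=0.1, size=60)

    def pairwise_hinge_risk(w):
        s = X @ w
        score_diff = s[:, None] - s[None, :]          # s(x_i) - s(x_j) for every pair
        ordered = y[:, None] > y[None, :]             # pairs where object i should rank above j
        return np.maximum(0.0, 1.0 - score_diff[ordered]).mean()

    result = minimize(pairwise_hinge_risk, np.zeros(4))
    print("learned weights:", np.round(result.x, 2),
          "empirical convex risk:", round(pairwise_hinge_risk(result.x), 3))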
1710 Quantitative Analysis of PCA, ICA, LDA and SVM in Face Recognition
Authors: Liton Jude Rozario, Mohammad Reduanul Haque, Md. Ziarul Islam, Mohammad Shorif Uddin
Abstract:
Face recognition is a technique to automatically identify or verify individuals. It receives great attention in identification, authentication, security and many more applications. Diverse methods have been proposed for this purpose and many comparative studies have been performed; however, researchers have not reached a unified conclusion. In this paper, we report an extensive quantitative accuracy analysis of four of the most widely used face recognition algorithms: Principal Component Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA) and Support Vector Machine (SVM), using the AT&T, Sheffield and Bangladeshi people face databases under diverse situations such as illumination, alignment and pose variations.
Keywords: PCA, ICA, LDA, SVM, face recognition, noise.
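The kind of head-to-head accuracy comparison described above can be scripted compactly. The sketch below is illustrative only: it uses scikit-learn's small digits dataset as a stand-in for the face databases named in the abstract and compares PCA, ICA and LDA front ends with an SVM back end, plus an SVM alone.

    # Hedged sketch: quantitative accuracy comparison of PCA/ICA/LDA front ends
    # with an SVM classifier, on a stand-in image dataset.
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA, FastICA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    X, y = load_digits(return_X_y=True)
    candidates = {
        "PCA + SVM": make_pipeline(PCA(n_components=30), SVC()),
        "ICA + SVM": make_pipeline(FastICA(n_components=30, max_iter=1000), SVC()),
        "LDA + SVM": make_pipeline(LinearDiscriminantAnalysis(), SVC()),
        "SVM alone": SVC(),
    }
    for name, clf in candidates.items():
        print(name, round(cross_val_score(clf, X, y, cv=5).mean(), 3))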
1709 Electronic Tool that Helps in Learning How to Play a Flute
Authors: Galeano R. Katherine, Rincon L. David, Luengas C. Lely
Abstract:
This paper describes the development of an electronic instrument that looks like a flute and is able to sense the basic musical notes being executed by a specific user. The principal function of the instrument is to teach how to play a flute. This device is expected to generate a significant academic impact in the field of interactive virtual reality, which combines art and technology. This example is also expected to contribute to the research and implementation of teaching devices around the world.
Keywords: Flute, Hardware, Learning, Virtual Reality.
1708 Wear and Mechanical Properties of Nodular Iron Modified with Copper
Authors: J. Ramos, V. Gil, A. F. Torres
Abstract:
In this research, nodular iron with three different percentages of copper (residual, 0.5% and 1.2%) was obtained using an induction furnace process. Chemical analysis was performed by mass spectrometry, and the microstructures were characterized by optical microscopy (ASTM E3) and Scanning Electron Microscopy (SEM). The mechanical behavior was studied in a mechanical testing machine (ASTM E8), and a pin-on-disk tribometer (ASTM G99) was used to assess wear resistance. It is observed that the dissolution of copper in the crystal lattice increases the pearlite fraction of the structure, improving wear behavior and hardness but producing a contrary effect on energy absorption.
Keywords: Ferritic and pearlitic structure, mechanical properties, nodular iron, wear.
1707 Optimization of Structure of Section-Based Automated Lines
Authors: R. Usubamatov, M. Z. Abdulmuin
Abstract:
Automated production lines with so-called 'hard structures' are widely used in manufacturing. Designers segment these lines into sections by placing buffers between series of machine tools to increase productivity. In real production conditions the capacity of a buffer system is limited, and a real production line can compensate for only part of the productivity losses of an automated line. The productivity of such production lines cannot be readily determined. This paper presents a mathematical approach to determining the structure of section-based automated production lines by the criterion of maximum productivity.
Keywords: production line optimization, productivity, sections.
1706 A Novel Fuzzy-Neural Based Medical Diagnosis System
Authors: S. Moein, S. A. Monadjemi, P. Moallem
Abstract:
In this paper, the application of artificial neural networks to typical disease diagnosis is investigated. The real procedure of medical diagnosis usually employed by physicians was analyzed and converted into a machine-implementable format. After selecting some symptoms of eight different diseases, a data set containing the information of a few hundred cases was configured and applied to an MLP neural network. The results of the experiments and the advantages of using a fuzzy approach are discussed as well. The outcomes suggest the role of effective symptom selection and the advantages of data fuzzification in a neural-network-based automatic medical diagnosis system.
Keywords: Artificial Neural Networks, Fuzzy Logic, Medical Diagnosis, Symptoms, Fuzzification.
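A bare sketch of the network part of such a system, assuming binary symptom vectors and eight disease labels as placeholders for the study's real cases; the fuzzification step is left out.

    # Hedged sketch: MLP classifier mapping symptom vectors to one of eight diseases.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(6)
    X = rng.integers(0, 2, size=(600, 25)).astype(float)   # placeholder symptom presence/absence
    y = rng.integers(0, 8, size=600)                        # eight disease classes

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    net = MLPClassifier(hidden_layer_sizes=(40,), max_iter=1000, random_state=0)
    net.fit(X_tr, y_tr)
    print("test accuracy:", round(net.score(X_te, y_te), 3))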
1705 Imputation Technique for Feature Selection in Microarray Data Set
Authors: Younies Mahmoud, Mai Mabrouk, Elsayed Sallam
Abstract:
Analyzing DNA microarray data sets is a great challenge for bioinformaticians due to the complexity of applying statistical and machine learning techniques. The challenge is doubled when the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing values. One of the most important data analysis processes on a microarray data set is feature selection, which finds the most important genes that affect a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
Keywords: DNA microarray, feature selection, missing data, bioinformatics.
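One common way to combine the two steps discussed above, sketched on synthetic data (the paper's own imputation technique may differ): missing expression values are filled in with a k-nearest-neighbour imputer, and genes are then ranked with a univariate score.

    # Hedged sketch: impute missing microarray values, then select the
    # top-ranked genes with a univariate test.
    import numpy as np
    from sklearn.impute import KNNImputer
    from sklearn.feature_selection import SelectKBest, f_classif

    rng = np.random.default_rng(7)
    X = rng.normal(size=(80, 500))                  # 80 samples x 500 genes
    X[rng.random(X.shape) < 0.03] = np.nan          # simulate missing expression values
    y = rng.integers(0, 2, size=80)                 # disease vs. control labels

    X_full = KNNImputer(n_neighbors=5).fit_transform(X)
    selector = SelectKBest(f_classif, k=20).fit(X_full, y)
    print("indices of the 20 selected genes:", selector.get_support(indices=True))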
1704 Classification Influence Index and its Application for k-Nearest Neighbor Classifier
Authors: Sejong Oh
Abstract:
Classification is an important topic in machine learning and bioinformatics. Many datasets have been introduced for classification tasks. A dataset contains multiple features, and the quality of the features influences the classification accuracy on the dataset. The classification power of each feature differs. In this study, we suggest the Classification Influence Index (CII) as an indicator of the classification power of each feature. The CII enables evaluation of the features in a dataset and improvement of classification accuracy by transformation of the dataset. By conducting experiments using the CII and the k-nearest neighbor classifier on real datasets, we confirmed that the proposed index provided a meaningful improvement in classification accuracy.
Keywords: accuracy, classification, dataset, data preprocessing.
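The idea of scoring each feature by its classification power can be illustrated with a plain per-feature k-NN accuracy; this is a stand-in indicator on a standard dataset, not the paper's CII definition, which is not reproduced here.

    # Hedged sketch: rate each feature by the cross-validated accuracy a k-NN
    # classifier reaches using that feature alone (a stand-in, not the CII itself).
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    X, y = load_iris(return_X_y=True)
    for j in range(X.shape[1]):
        acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                              X[:, [j]], y, cv=5).mean()
        print("feature", j, "single-feature accuracy:", round(acc, 3))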
1703 Collaborative Planning and Forecasting
Authors: Neha Asthana, Vishal Krishna Prasad
Abstract:
Collaborative Planning and Forecasting is an innovative and systematic approach to the productive integration and assimilation of data synthesized into information. Changing and variable market dynamics have persuaded global business chains to adopt Collaborative Planning and Forecasting as an imperative tool. It is therefore essential for supply chains to constantly improve, update their practices, and adapt to the changing global environment.
Keywords: Information transfer, Forecasting, Optimization.