Search results for: process developed data warehouse.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13876

13636 Real Time Force Sensing Mat for Human Gait Analysis

Authors: Darwin Gouwanda, S. M. N. Arosha Senanayake, M. M. Danushka Ranjana Marasinghe, Mervin Chandrapal, Jeya Mithra Kumar, Tung Mun Hon, Yulius

Abstract:

This paper presents a real-time force sensing instrument designed for human gait analysis. The instrument consists of three main elements: the force sensing mat, the signal conditioning and switching circuit, and the data acquisition device. To control and process the incoming signals from the force sensing mat, the Force-Logger and Force-Reloader programs were developed using LabVIEW 8.0. This paper describes the architecture of the force sensing mat, the signal conditioning and switching circuit, and the real-time streaming of the incoming data from the force sensing mat.

Keywords: Force platform, Force sensing resistor, human gait analysis

13635 Rapid Detection System of Airborne Pathogens

Authors: Shigenori Togashi, Kei Takenaka

Abstract:

We developed new processes that can rapidly collect and detect airborne pathogens, such as the avian flu virus, for pandemic prevention. The fluorescence antibody technique is known as one of the most sensitive detection methods for viruses, but it needs up to a few hours to bind enough fluorescent dye to the viruses for detection. In this paper, we developed a mist-labeling process that can detect substitute viruses in a short time by improving the binding rate between the fluorescent dyes and the substitute viruses through a microreaction process. Moreover, we developed a rapid detection system built around this mist labeling. The detection system, fitted with a sampling bag for collecting the patient's breath and a cartridge, can detect pathogens automatically within 10 minutes.

Keywords: Viruses, Sampler, Mist, Detection, Fluorescent dyes, Microreaction.

13634 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools

Authors: M. Kaya, M. Eris

Abstract:

Internet use, intelligent communication tools, and social media have become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use also increases the number of crimes committed in the digital environment. Therefore, digital forensics, which deals with the various crimes committed in the digital environment, has become an important research topic. It is within the scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc. and to report whether it contains any crime-related elements. Many software and hardware tools have been developed for use in the digital evidence acquisition process. The most widely used digital evidence investigation tools today are based on the principle of finding all the data on the digital evidence that match specified criteria and presenting them to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then analyze these data to figure out whether they are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner, and evidence relevant to a case may be overlooked. In this study, a hash-based matching and digital evidence evaluation method is proposed that aims to automatically classify evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human error.
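As a rough, hedged illustration of the hash-list matching described above, the sketch below hashes fixed-size blocks of an evidence image and flags those appearing in a known hash list; the 4 KiB block size, SHA-256, and the file names are assumptions, not details from the paper.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size; the paper does not specify one

def block_hashes(path, block_size=BLOCK_SIZE):
    """Yield the SHA-256 digest of each fixed-size block in an image file."""
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            yield hashlib.sha256(block).hexdigest()

def match_against_hash_list(image_path, known_hashes):
    """Return indices of blocks whose hashes appear in a known hash list."""
    return [i for i, h in enumerate(block_hashes(image_path))
            if h in known_hashes]

# Usage: known_hashes would come from a reference database of
# crime-related material (file names here are hypothetical).
# known = {line.strip() for line in open("known_bad_hashes.txt")}
# hits = match_against_hash_list("evidence.dd", known)
```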

Keywords: Block matching, digital evidence, hash list.

13633 Accurate Position Electromagnetic Sensor Using Data Acquisition System

Authors: Z. Ezzouine, A. Nakheli

Abstract:

This paper presents a high-precision electromagnetic position sensor system (HPESS) applicable to moving-object detection. The authors have developed a high-performance position sensor prototype dedicated to students' laboratories. The challenge was to obtain a highly accurate, real-time sensor able to measure position, length, or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion into electrical energy through direct contact, and the output signal can then be fed to an electronic circuit. The voltage change at the sensor output is captured by a data acquisition system using LabVIEW software, from which the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ device (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specially for monitoring the sensor signal. The data are then recorded and viewed through a user interface written in National Instruments LabVIEW, and online displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, and inexpensive transducer for highly sophisticated control systems.
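The paper's acquisition front end is built in LabVIEW; as a textual stand-in for the same step, this hedged sketch reads a finite voltage block with NI's Python nidaqmx package. The device/channel name, sampling rate, and voltage-to-position gain are all assumptions.

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 1000   # Hz, assumed; the paper does not state its rate
N_SAMPLES = 1000     # one second of data

# Read one block of voltages from the sensor output.
# "Dev1/ai0" is a hypothetical device/channel name.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(SAMPLE_RATE,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=N_SAMPLES)
    voltages = task.read(number_of_samples_per_channel=N_SAMPLES)

# A calibration curve (assumed linear here) maps voltage to displacement.
positions = [v * 1.0 for v in voltages]  # hypothetical 1 mm/V gain
```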

Keywords: Electromagnetic sensor, data acquisition, accuracy, position measurement.

13632 Online Monitoring Rheological Property of Polymer Melt during Injection Molding

Authors: Chung-Chih Lin, Chien-Liang Wu

Abstract:

Detecting the state of the polymer melt during the manufacturing process is regarded as an efficient way to control molded part quality in advance, and online monitoring of the rheological properties of the polymer melt during processing provides a way to understand the melt state immediately. Rheological properties reflect the state of the polymer melt under different processing parameters and are especially important in injection molding. This study proposes an approach that demonstrates how to calculate the rheological properties of a polymer melt from in-process measurements, using injection molding as an example. The system consists of two sensors and a data acquisition module that processes the measured data, from which the rheological properties of the polymer melt are calculated. The rheological properties discussed in this study are shear rate and viscosity, which are investigated with respect to injection speed and melt temperature. The results show that the effect of injection speed on the rheological properties is apparent, especially at high melt temperature, and should be considered in precision molding processes.
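The paper does not spell out its sensor equations; as a hedged illustration of how a pressure drop and a volumetric flow rate yield in-process rheological quantities, the sketch below applies the classical round-channel (capillary) relations. The channel geometry and readings are placeholder assumptions.

```python
import math

def apparent_shear_rate(q_m3_s, radius_m):
    """Apparent wall shear rate in a round channel: 4Q / (pi R^3)."""
    return 4.0 * q_m3_s / (math.pi * radius_m ** 3)

def wall_shear_stress(dp_pa, radius_m, length_m):
    """Wall shear stress from the pressure drop over a channel: dP*R/(2L)."""
    return dp_pa * radius_m / (2.0 * length_m)

def apparent_viscosity(dp_pa, q_m3_s, radius_m, length_m):
    """Apparent viscosity = wall shear stress / apparent shear rate."""
    return wall_shear_stress(dp_pa, radius_m, length_m) / \
           apparent_shear_rate(q_m3_s, radius_m)

# Hypothetical in-process readings: 20 MPa drop over a 50 mm long,
# 2 mm radius runner at 10 cm^3/s volumetric flow.
eta = apparent_viscosity(dp_pa=20e6, q_m3_s=10e-6, radius_m=2e-3, length_m=50e-3)
print(f"apparent viscosity ~ {eta:.0f} Pa.s")
```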

Keywords: Injection molding, melt viscosity, shear rate, monitoring.

13631 A Framework for Product Development Process including HW and SW Components

Authors: Namchul Do, Gyeongseok Chae

Abstract:

This paper proposes a framework for the development of products that include hardware and software components. It provides separation of hardware-dependent software, modifications to the current product development process, and integration of software modules with existing product configuration models and assembly product structures. In order to identify the dependent software, the framework considers product configuration modules and engineering changes of the associated software and hardware components. To support efficient integration of the two different hardware and software development streams, a modified product development process is proposed that integrates the dependent software development into product development through exchanges of specific product information. Using existing product data models in Product Data Management (PDM), the framework represents software as modules for product configurations and as software parts in the product structure. The framework is applied to the development of a robot system in order to show its effectiveness.

Keywords: HW and SW Development Integration, Product Development with Software.

13630 Dynamic Performance Indicators for Aged-Care Construction Projects

Authors: Norman Wu, Darren Sun

Abstract:

Key performance indicators (KPIs) are used for after-the-fact evaluation in the construction industry and normally make no provision for change. This paper proposes a set of dynamic key performance indicators (d-KPIs) that predict the future performance of the activity being measured and present the opportunity to change practice accordingly. Critical to the predictability of a construction project is the ability to achieve automated data collection, and this paper proposes an effective way to collect the process and engineering management data from an integrated construction management system. The d-KPI matrix developed in this study, consisting of various indicators under seven categories, can be applied to close monitoring of aged-care facility development projects. The d-KPI matrix also enables performance measurement and comparison at both the project and organization levels.

Keywords: Aged-care project, construction, dynamic KPI, healthcare system.

13629 Effect of Inventory Management on Financial Performance: Evidence from Nigerian Conglomerate Companies

Authors: Adamu Danlami Ahmed

Abstract:

Inventory management is a determinant of effective and efficient work for any manager. This study examined the relationship between inventory management and financial performance. The population of the study comprises all conglomerate companies quoted on the Nigerian Stock Exchange as at 31st December 2010, and the study covers the period from 2010 to 2014. Descriptive statistics, Pearson correlation, and multiple regression were used to analyze the data. It was found that inventory management is significantly related to the profitability of the company: efficient management of the inventory cycle enhances profitability, while poor management of it hinders the financial performance of the organization. Based on the results, it is recommended that conglomerate companies keep inventories to a minimum, maintain proper checks so that only needed inventories are in the store, and track the movement of goods in order to avoid unnecessary delay of finished and work-in-progress (WIP) goods in the store and warehouse.
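As a hedged sketch of the statistical treatment described (Pearson correlation plus multiple regression), the snippet below relates a profitability proxy to an inventory-efficiency proxy; the column names and numbers are hypothetical, not the study's data.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical firm-year panel: an inventory-efficiency proxy
# (e.g. inventory turnover) and a profitability proxy (e.g. ROA).
df = pd.DataFrame({
    "inventory_turnover": [4.1, 3.2, 5.6, 2.8, 4.9, 3.7],
    "firm_size":          [6.2, 5.8, 6.9, 5.5, 6.4, 6.0],  # log assets
    "roa":                [0.08, 0.05, 0.12, 0.03, 0.10, 0.06],
})

print(df.corr(method="pearson"))  # Pearson correlations, as in the study

# Multiple regression of profitability on the explanatory variables.
X = sm.add_constant(df[["inventory_turnover", "firm_size"]])
model = sm.OLS(df["roa"], X).fit()
print(model.summary())
```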

Keywords: Finished goods, work in progress, financial performance, inventory.

13628 Vibration Induced Fatigue Assessment in Vehicle Development Process

Authors: Fatih Kagnici

Abstract:

Improvements in CAE methods play an important role in shortening vehicle product development time, making it possible to validate the design and improve durability without producing hardware prototypes. In recent years, several different methods have been developed to investigate fatigue damage in vehicles, the common goal being prediction of fatigue damage in a short time and at reduced cost. This study developed a new fatigue damage prediction method for the automotive sector that uses the power spectral densities of accelerations. The study also confirmed that weak regions of the vehicle can be easily detected with the developed method, whose results were compared with those of the conventional method.
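The paper does not disclose its damage model; the hedged sketch below illustrates the general PSD-based approach by computing spectral moments of a stress PSD and the classical narrow-band (Rayleigh) damage estimate under an assumed S-N curve. All numbers are placeholders.

```python
import numpy as np
from math import gamma, sqrt

def spectral_moment(f, psd, n):
    """n-th spectral moment m_n = integral of f^n * G(f) df (trapezoid rule)."""
    y = (f ** n) * psd
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(f)))

def narrowband_damage(f, psd, duration_s, sn_C, sn_b):
    """Classical narrow-band fatigue damage estimate for a stress PSD,
    assuming an S-N curve of the form N = C * S^(-b) and Miner's rule."""
    m0 = spectral_moment(f, psd, 0)
    m2 = spectral_moment(f, psd, 2)
    nu0 = sqrt(m2 / m0)        # rate of zero up-crossings
    sigma = sqrt(m0)           # RMS stress
    # Expected damage with Rayleigh-distributed peaks.
    return (nu0 * duration_s / sn_C) * (sqrt(2.0) * sigma) ** sn_b \
           * gamma(1.0 + sn_b / 2.0)

# Placeholder stress PSD: flat 100 MPa^2/Hz between 10 and 100 Hz.
f = np.linspace(10, 100, 500)
psd = np.full_like(f, 100.0)
D = narrowband_damage(f, psd, duration_s=3600, sn_C=1e17, sn_b=5.0)
print(f"estimated damage over 1 h: {D:.3e}")
```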

Keywords: Fatigue damage, Power spectrum density, Vibration induced fatigue, Vehicle development

13627 Classifying Bio-Chip Data using an Ant Colony System Algorithm

Authors: Minsoo Lee, Yearn Jeong Kim, Yun-mi Kim, Sujeung Cheong, Sookyung Song

Abstract:

Bio-chips are used for experiments on genes and contain various information such as genes, samples, and so on. Two-dimensional bio-chips, in which one axis represents genes and the other samples, are widely used these days. Instead of experimenting with real genes, which costs a lot of money and takes much time to obtain results, bio-chips are used for biological experiments, and extracting data from the bio-chips with high accuracy and finding patterns or useful information in such data is very important. Bio-chip analysis systems extract data from various kinds of bio-chips and mine the data in order to obtain useful information. One of the commonly used mining methods is classification, and the algorithm used to classify the data may vary depending on the data types, numeric characteristics, and so on. Considering that bio-chip data are extremely large, an algorithm that imitates an ecosystem, such as an ant algorithm, is suitable for classification. This paper focuses on finding classification rules from bio-chip data using the Ant Colony System algorithm, which imitates an ecosystem. The developed system takes the accuracy of the discovered rules into consideration when applying them to the bio-chip data in order to predict the classes.

Keywords: Ant Colony System, DNA chip data, Classification.

13626 System for Monitoring Marine Turtles Using Unstructured Supplementary Service Data

Authors: Luís Pina

Abstract:

The conservation of marine biodiversity keeps ecosystems in balance and ensures the sustainable use of resources. In this context, technological resources have been used to monitor marine species and allow biologists to obtain data in real time. Different mobile applications have been developed for data collection for monitoring purposes, but these systems are designed to be used only on third-generation (3G) phones or smartphones with Internet access, and in rural parts of developing countries Internet services and smartphones are scarce. Thus, the objective of this work is to develop a system to monitor marine turtles using Unstructured Supplementary Service Data (USSD), which users can access through basic mobile phones. The system aims to improve the data collection mechanism and enhance the effectiveness of current systems for monitoring sea turtles using any type of mobile device without Internet access. The system will be able to report information related to the biological activities of marine turtles and will also serve as a platform through which marine conservation entities can receive reports of illegal sales of sea turtles. The system can further be used as an educational tool for communities, providing knowledge and allowing communities to be included in the process of monitoring marine turtles. This work may therefore contribute information to decision-making and to the implementation of contingency plans for marine conservation programs.
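As a hedged sketch of how such a USSD service could be served, the snippet below implements a minimal menu handler in Flask. The "CON"/"END" reply convention follows one common USSD gateway style (e.g. Africa's Talking); the route, form fields, and menu wording are assumptions rather than this system's actual design.

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/ussd", methods=["POST"])
def ussd_callback():
    # The gateway posts the user's full input so far, '*'-separated.
    # Field name and reply prefixes are gateway-specific assumptions.
    text = request.form.get("text", "")
    steps = text.split("*") if text else []

    if not steps or steps == [""]:
        return ("CON Marine turtle monitor:\n"
                "1. Report sighting\n"
                "2. Report illegal sale")
    if steps[0] == "1":
        if len(steps) == 1:
            return "CON Enter beach/zone code:"
        return f"END Sighting recorded for zone {steps[1]}. Thank you."
    if steps[0] == "2":
        return "END Report noted. A conservation officer will follow up."
    return "END Invalid option."

if __name__ == "__main__":
    app.run(port=5000)
```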

Keywords: GSM, marine biology, marine turtles, USSD.

13625 Nonparametric Control Chart Using Density Weighted Support Vector Data Description

Authors: Myungraee Cha, Jun Seok Kim, Seung Hwan Park, Jun-Geol Baek

Abstract:

In manufacturing industries, advances in measurement have increased the number of monitored variables, bringing multivariate control to the fore. Statistical process control (SPC) charts are among the most widely used multivariate control tools. Nevertheless, SPC is restricted in its application because it assumes that the data follow a specific distribution; in practice, process data are composed of a mixture of several processes and are hard to model with any one distribution. As an alternative to conventional SPC, nonparametric control charts are attractive because of their key strength, the absence of parameter estimation. The SVDD-based control chart is one such nonparametric chart and has the advantage of a flexible control boundary. However, the basic SVDD formulation overlooks an important data characteristic, the density distribution. We therefore propose DW-SVDD (Density-Weighted SVDD) to address this weakness of conventional SVDD. DW-SVDD takes the density of the data into account by introducing the notion of a density weight. We extend the proposed SVDD into a control chart, and a simulation study on data from various distributions demonstrates the improvement in performance.
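The exact DW-SVDD formulation is not reproduced here; as a hedged approximation of the idea, the sketch below uses scikit-learn's OneClassSVM (equivalent to SVDD for an RBF kernel) with kernel-density estimates as per-sample weights, so the boundary favors dense regions. The weighting scheme is one interpretation, not the paper's formula.

```python
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))          # in-control training data (placeholder)

# Density weight per sample: higher weight in dense regions.
kde = KernelDensity(bandwidth=0.5).fit(X)
weights = np.exp(kde.score_samples(X))
weights /= weights.mean()

# OneClassSVM with an RBF kernel plays the role of SVDD; sample_weight
# biases the boundary toward dense regions.
clf = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.01)
clf.fit(X, sample_weight=weights)

# Monitoring: negative decision values fall outside the control boundary.
new_points = np.array([[0.1, -0.2], [4.0, 4.0]])
print(clf.decision_function(new_points))   # second point should signal
```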

Keywords: Density estimation, multivariate control chart, one-class classification, support vector data description (SVDD).

13624 Determination of Electromagnetic Properties of Human Tissues

Authors: Iliana Marinova, Valentin Mateev

Abstract:

In this paper, a computer system for measuring electromagnetic properties is designed. The system employs an Agilent 4294A precision impedance analyzer to measure the amplitude and phase of a signal applied across a biological tissue sample under test. Data measured by the developed computer system can be used for tissue characterization over a wide frequency range, from 40 Hz to 110 MHz. The computer system can interface with output devices, providing a flexible testing process.
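As a hedged sketch of driving such an instrument from the controlling PC, the snippet below opens a GPIB session with the pyvisa package and issues the IEEE 488.2 *IDN? query. The GPIB address is hypothetical, and the 4294A's sweep and trace readout commands are deliberately omitted rather than guessed; they are documented in its programming manual.

```python
import pyvisa

# GPIB address is a hypothetical placeholder; check the instrument setup.
rm = pyvisa.ResourceManager()
analyzer = rm.open_resource("GPIB0::17::INSTR")

# IEEE 488.2 common identification query.
print(analyzer.query("*IDN?"))

# Instrument-specific sweep setup and trace readout commands belong here;
# see the 4294A programming manual rather than guessing them.
analyzer.close()
```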

Keywords: Electromagnetic properties, human tissue, bioimpedance, measurement system.

13623 A Framework of Monte Carlo Simulation for Examining the Uncertainty-Investment Relationship

Authors: George Yungchih Wang

Abstract:

This paper argues that increased uncertainty may, in certain situations, actually encourage investment. Since earlier studies mostly base their arguments on the assumption of geometric Brownian motion, this study extends the assumption to alternative stochastic processes, such as the mixed diffusion-jump, mean-reverting, and jump amplitude processes. A general Monte Carlo simulation approach is developed to derive the optimal investment trigger in situations where a closed-form solution cannot readily be obtained under an alternative process. The main finding is that the overall effect of uncertainty on investment is captured by the probability of investing, and the relationship between uncertainty and investment appears to be an inverted U-shaped curve. The implication is that uncertainty does not always discourage investment, even under several sources of uncertainty. Furthermore, high-risk projects are not always dominated by low-risk projects, because high-risk projects may have a positive realization effect that encourages investment.
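A minimal, hedged sketch of the simulation idea under the baseline geometric Brownian motion assumption: estimate the probability that the project value reaches an assumed investment trigger within a horizon, for several volatilities. The trigger, drift, and horizon are placeholders, and the paper's alternative processes are not reproduced.

```python
import numpy as np

def prob_hit_trigger(v0, trigger, mu, sigma, T,
                     n_steps=252, n_paths=20_000, seed=1):
    """Monte Carlo estimate of the probability that a GBM project value
    reaches an investment trigger within horizon T (years)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    log_v = np.full(n_paths, np.log(v0))
    hit = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        z = rng.standard_normal(n_paths)
        log_v += (mu - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z
        hit |= log_v >= np.log(trigger)
    return hit.mean()

# Hypothetical numbers: compare the investing probability across volatilities.
for sigma in (0.1, 0.3, 0.5):
    p = prob_hit_trigger(v0=100, trigger=140, mu=0.04, sigma=sigma, T=5)
    print(f"sigma={sigma:.1f}: P(invest within 5y) = {p:.3f}")
```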

Keywords: Real options, geometric Brownian motion, mixed diffusion-jump process, mean-reverting process, jump amplitude process.

13622 Comparison of Developed Statokinesigram and Marker Data Signals by Model Approach

Authors: Boris Barbolyas, Kristina Buckova, Tomas Volensky, Cyril Belavy, Ladislav Dedik

Abstract:

Background: Human balance control is often studied on the basis of the statokinesigram. In this study, the analysis of human postural reactions is based on combining the stabilometry output signal with the processing, analysis, and understanding of retroreflective marker data. The study also shows another original application of the Method of Developed Statokinesigram Trajectory (MDST). Methods: Participants maintained quiet bipedal standing on a stabilometry platform for 10 s; subsequently, bilateral vibration stimuli were applied to the Achilles tendons for a 20 s interval, causing the human postural system to assume a new pseudo-steady state. Vibration frequencies were 20, 60, and 80 Hz. The participants' body segments (head, shoulders, hips, knees, ankles, and little fingers) were marked with 12 retroreflective markers, whose positions were scanned by the six-camera BTS SMART DX system. Registration of the postural reaction lasted 60 s, with a sampling frequency of 100 Hz. The measured data were processed with the Method of Developed Statokinesigram Trajectory. Regression analysis of the developed statokinesigram trajectory (DST) data and the retroreflective marker developed trajectory (DMT) data was used to find out which marker trajectories correlate most with the stabilometry platform output signals, and scaling coefficients (λ) between DST and DMT were evaluated by linear regression analysis. Results: Scaling coefficients were identified for the marker trajectories of all body segments. The head marker trajectories reached the maximal value of the scaling coefficient and the ankle marker trajectories the minimal value. The hip, knee, and ankle markers were approximately symmetrical in terms of the scaling coefficient, whereas notable asymmetries of the scaling coefficient were detected in the head and shoulder marker trajectories. The model of postural system behavior was identified by MDST. Conclusion: The value of the scaling factor identifies which body segment is predisposed to postural instability. Hypothetically, if the statokinesigram represents the overall response of the human postural system to the vibration stimuli, then the marker data represent the particular postural responses, and it can be assumed that the cumulative sum of the particular marker postural responses equals the statokinesigram.
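As a hedged sketch of evaluating a scaling coefficient λ between DST and DMT, the snippet below computes a zero-intercept least-squares fit between two trajectories; the paper's exact regression form is not specified, and the signals here are synthetic placeholders.

```python
import numpy as np

def scaling_coefficient(dst, dmt):
    """Least-squares scaling coefficient lambda minimizing ||dmt - lambda*dst||.
    dst: developed statokinesigram trajectory samples (1-D array)
    dmt: developed marker trajectory samples (same length)."""
    dst = np.asarray(dst, dtype=float)
    dmt = np.asarray(dmt, dtype=float)
    return float(np.dot(dst, dmt) / np.dot(dst, dst))

# Placeholder signals standing in for 100 Hz, 60 s recordings.
t = np.linspace(0, 60, 6000)
dst = np.sin(0.5 * t)                 # platform-derived trajectory
noise = 0.05 * np.random.default_rng(0).standard_normal(t.size)
dmt_head = 1.8 * dst + noise          # hypothetical head-marker trajectory
print(f"lambda(head) ~ {scaling_coefficient(dst, dmt_head):.2f}")
```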

Keywords: Center of pressure (CoP), method of developed statokinesigram trajectory (MDST), model of postural system behavior, retroreflective marker data.

13621 A Review on Soft Computing Technique in Intrusion Detection System

Authors: Noor Suhana Sulaiman, Rohani Abu Bakar, Norrozila Sulaiman

Abstract:

Intrusion detection systems are significant in network security: they detect and identify intrusion behavior or intrusion attempts in a computer system by monitoring and analyzing the network packets in real time. In recent years, intelligent algorithms applied in intrusion detection systems (IDS) have been of increasing interest with the rapid growth of network security concerns. IDS data comprise a huge amount of data containing irrelevant and redundant features, which cause slow training and testing, higher resource consumption, and poor detection rates. Since the amount of audit data that an IDS needs to examine is very large even for a small network, classification by hand is impossible. Hence, the primary objective of this review is to survey the techniques, applied prior to the classification process, that suit IDS data.

Keywords: Intrusion Detection System, security, soft computing, classification.

13620 Access to Higher Education in Nigeria: The University of Calabar Pre-Degree Program Experience

Authors: Eni I. Eni, James Okon, Ashang J. Ashang

Abstract:

The pre-degree program of the University of Calabar was introduced to help increase access to tertiary education in science-related courses. Its main objective was to provide access to candidates from educationally less developed states (ELDS) and states within the university's catchment area. An impact evaluation of the program was conducted, and the aspect concerning access to university education is reported here. Two research questions were formulated, and an ex post facto research design and purposive sampling technique were adopted for the study. The data collected were analyzed using descriptive statistics in terms of frequencies and percentages. The results of the data analysis showed that the pre-degree program of the University of Calabar has provided educational access, especially to Nigerians from educationally less developed states, in science-related courses. It is therefore recommended that the program be sustained and further improved upon to facilitate its continued provision of access to university education in Nigeria.

Keywords: Educationally Less Developed States, Higher Education, Pre-Degree Program, University of Calabar.

13619 Data Transformation Services (DTS): Creating Data Mart by Consolidating Multi-Source Enterprise Operational Data

Authors: J. D. D. Daniel, K. N. Goh, S. M. Yusop

Abstract:

Trends in business intelligence, e-commerce, and remote access make it necessary and practical to store data in different ways on multiple systems with different operating systems. As businesses evolve and grow, they require an efficient computerized solution to perform data updates and to access data from diverse enterprise business applications. The objective of this paper is to demonstrate the capability of DTS [1] as a database solution for automatic data transfer and update in solving a business problem. The DTS package described here was developed for the sales of a variety of plants and eventually expanded into a commercial supply and landscaping business. Dimensional data modeling is used in the DTS package to extract, transform, and load data from heterogeneous database systems such as MySQL, Microsoft Access, and Oracle into a consolidated data mart residing in SQL Server. The data transfer from the various databases is scheduled to run automatically every quarter of the year for review of the sales analysis. DTS is therefore an attractive solution for the automatic data transfer and update that today's businesses need.
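DTS itself is configured inside SQL Server tooling rather than written as portable code; as a hedged textual analogue of the extract-transform-load flow described, the sketch below consolidates two source systems into a SQL Server fact table with pandas and SQLAlchemy. Connection strings, table names, and columns are hypothetical.

```python
import pandas as pd
from sqlalchemy import create_engine

# Connection strings, table names, and columns are hypothetical placeholders.
src_mysql = create_engine("mysql+pymysql://user:pw@mysql-host/sales")
src_oracle = create_engine("oracle+oracledb://user:pw@ora-host/?service_name=ORCL")
dst_mart = create_engine("mssql+pyodbc://user:pw@mart_dsn")

# Extract: pull the quarter's sales rows from each source system.
query = "SELECT order_date, plant_id, qty, amount FROM orders"
frames = [pd.read_sql(query, src_mysql), pd.read_sql(query, src_oracle)]

# Transform: align to the mart's dimensional model (star schema assumed).
fact = pd.concat(frames, ignore_index=True)
fact["quarter"] = pd.to_datetime(fact["order_date"]).dt.to_period("Q").astype(str)

# Load: append into the fact table of the SQL Server data mart.
fact.to_sql("fact_sales", dst_mart, if_exists="append", index=False)
```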

Keywords: Data Transformation Services (DTS), Object Linking and Embedding Database (OLEDB), Data Mart, Online Analytical Processing (OLAP), Online Transactional Processing (OLTP).

13618 An Intelligent Human-Computer Interaction System for Decision Support

Authors: Chee Siong Teh, Chee Peng Lim

Abstract:

This paper proposes a novel architecture for developing decision support systems. Unlike conventional decision support systems, the proposed architecture endeavors to reveal the decision-making process so that humans' subjectivity can be incorporated into a computerized system while, at the same time, preserving the system's capability to process information objectively. A number of techniques used in developing the decision support system are elaborated to make the decision-making process transparent, including procedures for high-dimensional data visualization, pattern classification, prediction, and evolutionary computational search. An artificial data set is first employed to compare the proposed approach with other methods; a simulated handwritten data set and a real data set on liver disease diagnosis are then employed to evaluate its efficacy. The results are analyzed and discussed, and the potential of the proposed architecture as a useful decision support system is demonstrated.

Keywords: Interactive evolutionary computation, multivariate data projection, pattern classification, topographic map.

13617 Proposing an Efficient Method for Frequent Pattern Mining

Authors: Vaibhav Kant Singh, Vijay Shah, Yogendra Kumar Jain, Anupam Shukla, A.S. Thoke, Vinay Kumar Singh, Chhaya Dule, Vivek Parganiha

Abstract:

Data mining is the exploration of knowledge from the large sets of data generated as a result of various data processing activities, and frequent pattern mining is a very important task in data mining. Previous approaches to generating frequent sets generally adopt candidate generation and pruning techniques to satisfy the desired objective. This paper shows how the different approaches achieve the objective of frequent pattern mining, along with the complexities required to perform the job, and it also considers a hardware approach based on cache coherence to improve the efficiency of the process. Data mining is helpful in building support systems for management, bioinformatics, biotechnology, medical science, statistics, mathematics, banking, networking, and other computer-related applications. This paper proposes the use of both the upward and downward closure properties for the extraction of frequent item sets, which reduces the total number of scans required for the generation of candidate sets.
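For concreteness, the hedged sketch below shows the classical Apriori baseline that the candidate generation and pruning discussion refers to, using the downward closure property; the paper's own variant additionally exploits upward closure, which is not reproduced here.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal Apriori: frequent itemsets via candidate generation and the
    downward closure property (every subset of a frequent set is frequent)."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    items = {i for t in transactions for i in t}
    current = [frozenset([i]) for i in items
               if support(frozenset([i])) >= min_support]
    frequent = list(current)

    k = 2
    while current:
        # Join step: candidates of size k from frequent (k-1)-sets.
        candidates = {a | b for a, b in combinations(current, 2)
                      if len(a | b) == k}
        # Prune step (downward closure), then support counting.
        known = set(frequent)
        current = [c for c in candidates
                   if all(frozenset(s) in known
                          for s in combinations(c, k - 1))
                   and support(c) >= min_support]
        frequent.extend(current)
        k += 1
    return frequent

tx = [{"bread", "milk"}, {"bread", "butter"},
      {"milk", "butter", "bread"}, {"milk"}]
print(apriori(tx, min_support=0.5))
```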

Keywords: Data Mining, Candidate Sets, Frequent Item set, Pruning.

13616 Peakwise Smoothing of Data Models using Wavelets

Authors: D Sudheer Reddy, N Gopal Reddy, P V Radhadevi, J Saibaba, Geeta Varadan

Abstract:

Smoothing or filtering of data is the first preprocessing step for noise suppression in many applications involving data analysis. The moving average is the most popular smoothing method, and its generalization led to the development of the Savitzky-Golay filter. Many window smoothing methods have been developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and Kaiser windows. Function approximation of the data by polynomial regression, Fourier expansion, or wavelet expansion also gives smoothed data, and wavelets smooth data to a great extent through thresholding of the wavelet coefficients. Almost all smoothing methods destroy peaks and flatten them as the support of the window is increased, yet in certain applications it is desirable to retain the peaks while smoothing the data as much as possible. In this paper we present a methodology, called peakwise smoothing, that smooths the data to any desired level without losing the major peak features.
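The peakwise method itself is not reproduced here; the hedged sketch below shows the Savitzky-Golay baseline the paper builds on, plus one naive peak-retention idea (keep a lightly smoothed version near detected peaks) purely as an illustration of the goal, not the authors' algorithm.

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 1000)
signal = np.exp(-((x - 3) ** 2) / 0.01) + np.exp(-((x - 7) ** 2) / 0.02)
noisy = signal + 0.05 * rng.standard_normal(x.size)

# Baseline: Savitzky-Golay smoothing (a wide window flattens the peaks).
smooth = savgol_filter(noisy, window_length=101, polyorder=3)

# Naive peak retention: keep a lightly smoothed signal near detected peaks.
light = savgol_filter(noisy, window_length=11, polyorder=3)
peaks, _ = find_peaks(light, prominence=0.5)
result = smooth.copy()
for p in peaks:
    lo, hi = max(0, p - 30), min(x.size, p + 30)  # assumed peak half-width
    result[lo:hi] = light[lo:hi]
```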

Keywords: Smoothing, moving average, peakwise smoothing, spatial density models, planar shape models, wavelets.

13615 Improved K-Modes for Categorical Clustering Using Weighted Dissimilarity Measure

Authors: S.Aranganayagi, K.Thangavel

Abstract:

K-Modes is an extension of the K-Means clustering algorithm, developed to cluster categorical data, with the mean replaced by the mode. The similarity measure proposed by Huang is the simple matching or mismatching measure. The weights of attribute values contribute much to clustering; thus, in this paper we propose a new weighted dissimilarity measure for K-Modes, based on the ratio of the frequency of attribute values in the cluster to that in the data set. The new weighted measure is tested on data sets obtained from the UCI data repository, and the results, compared with K-Modes and K-representatives, show that the new measure generates clusters with high purity.
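As a hedged sketch of a frequency-weighted mismatch measure in the spirit described, the function below discounts matches by the relative frequency of the matched value within the cluster, so matches on rare mode values count as less similar. This is a simplification; the paper's cluster-to-dataset frequency ratio is not reproduced exactly.

```python
def weighted_dissimilarity(record, mode, value_freq_in_cluster):
    """Weighted mismatch dissimilarity between a record and a cluster mode.
    A mismatch costs 1; a match is discounted by the relative frequency of
    the matched value within the cluster (simplified frequency weight)."""
    d = 0.0
    for j, (r, m) in enumerate(zip(record, mode)):
        if r != m:
            d += 1.0
        else:
            d += 1.0 - value_freq_in_cluster[j].get(r, 0.0)
    return d

# Hypothetical 2-attribute cluster: value 'a' covers 80% of the cluster on
# attribute 0, 'x' covers 50% on attribute 1.
freq = [{"a": 0.8}, {"x": 0.5}]
print(weighted_dissimilarity(("a", "y"), ("a", "x"), freq))  # 0.2 + 1.0 = 1.2
```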

Keywords: Clustering, categorical data, K-Modes, weighted dissimilarity measure

13614 Load Modeling for Power Flow and Transient Stability Computer Studies at BAKHTAR Network

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

A method has been developed for preparing load models for power flow and stability studies. The load modeling (LOADMOD) computer software transforms data on load class mix, composition, and characteristics into the form required by commonly used power flow and transient stability simulation programs. Typical default data have been developed for load composition and characteristics. This paper describes the LOADMOD software and the dynamic and static load modeling techniques used in it, along with the results of initial testing on the BAKHTAR power system.
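The paper's model internals are not given; as a hedged illustration, the snippet below implements the standard polynomial (ZIP) static load representation that load-modeling tools of this kind commonly target, with placeholder coefficients.

```python
def zip_load(p0, v_pu, a_z, a_i, a_p):
    """Polynomial (ZIP) static load model:
    P = P0 * (a_z * V^2 + a_i * V + a_p), with a_z + a_i + a_p = 1.
    V is per-unit voltage; the coefficients split the load into constant-
    impedance, constant-current, and constant-power parts."""
    assert abs(a_z + a_i + a_p - 1.0) < 1e-9
    return p0 * (a_z * v_pu ** 2 + a_i * v_pu + a_p)

# Placeholder composition: 30% constant impedance, 30% current, 40% power.
for v in (1.0, 0.95, 0.9):
    print(f"V={v:.2f} pu -> P={zip_load(100.0, v, 0.3, 0.3, 0.4):.1f} MW")
```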

Keywords: Load modeling, static, power flow.

13613 Computational Evaluation of a C-A Heat Pump

Authors: Young-Jin Baik, Minsung Kim, Young-Soo Lee, Ki-Chang Chang, Seong-Ryong Park

Abstract:

The compression-absorption heat pump (C-A HP), one of the promising heat recovery technologies for producing process hot water from the low-temperature heat of wastewater, was evaluated by computer simulation. A simulation program was developed based on the continuity equation and the first and second laws of thermodynamics. Both the absorber and desorber were modeled using the UA-LMTD method; in order to prevent unfeasible temperature profiles and to reduce calculation errors arising from the curved temperature profile of a mixture, the heat loads were divided into many segments. A single-stage compressor was considered, the compressor cooling load was taken into account, and the isentropic efficiency was computed from map data. Simulation conditions were given based on a system consisting of ordinarily designed components. The simulation results show that most of the total entropy generation occurs during the compression and cooling process, suggesting that system performance could be enhanced if a rectifier were introduced.
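As a hedged sketch of the segmented UA-LMTD treatment described, the snippet below splits an exchanger into segments and sums the segment duties; the even UA split and the temperature profiles are placeholder assumptions, not the paper's data.

```python
import math

def lmtd(dt_hot_end, dt_cold_end):
    """Log-mean temperature difference between two end differences."""
    if abs(dt_hot_end - dt_cold_end) < 1e-9:
        return dt_hot_end
    return (dt_hot_end - dt_cold_end) / math.log(dt_hot_end / dt_cold_end)

def segmented_duty(t_hot, t_cold, ua_total):
    """Heat duty of an exchanger whose temperature profiles are given as
    matching lists of segment-boundary temperatures (hot and cold sides).
    UA is split evenly over segments -- an assumption for illustration."""
    n = len(t_hot) - 1
    ua_seg = ua_total / n
    q = 0.0
    for i in range(n):
        q += ua_seg * lmtd(t_hot[i] - t_cold[i],
                           t_hot[i + 1] - t_cold[i + 1])
    return q

# Placeholder curved profiles (K): the mixture side is non-linear.
t_hot = [380.0, 372.0, 361.0, 347.0, 330.0]
t_cold = [320.0, 318.0, 315.0, 311.0, 306.0]
print(f"Q ~ {segmented_duty(t_hot, t_cold, ua_total=5000.0) / 1000:.1f} kW")
```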

Keywords: Waste heat recovery, Heat Pump.

13612 Optimization of Machining Parametric Study on Electrical Discharge Machining

Authors: Rakesh Prajapati, Purvik Patel, Hardik Patel

Abstract:

Productivity and quality are two important aspects that have become great concerns in today's competitive global market, and every production/manufacturing unit focuses on them in relation to both the process and the product developed. The electrical discharge machining (EDM) process is still largely experience-driven: the selected parameters are often far from optimal, while selecting optimal parameters experimentally is costly and time consuming. The material removal rate (MRR) during the process has been considered a productivity estimate, with the aim of maximizing it, while minimizing surface roughness is taken as the most important output requirement. These two requirements, opposite in nature, have been satisfied simultaneously by selecting an optimal process environment (optimal parameter setting). The objective function is obtained by regression analysis and analysis of variance, and is then optimized using a genetic algorithm. The model is shown to be effective: MRR and surface roughness improved with the optimized machining parameters.

Keywords: Material removal rate, TWR, OC, DOE, ANOVA, MINITAB.

13611 Study of the Effect of Inclusion of TiO2 in Active Flux on Submerged Arc Welding of Low Carbon Mild Steel Plate and Parametric Optimization of the Process by Using DEA Based Bat Algorithm

Authors: Sheetal Kumar Parwar, J. Deb Barma, A. Majumder

Abstract:

Submerged arc welding is a very complex but efficient and high-performance welding process. In the present study, an attempt has been made to reduce welding distortion by increasing the amount of oxide flux through TiO2 addition in the submerged arc welding process, with care taken to avoid an excessive amount of the added agent so that significant results could be attained. A Data Envelopment Analysis (DEA) based BAT algorithm is used for the parametric optimization, in which DEA converts the multi-response parameters into a single response parameter. The present study also assesses the effectiveness of adding TiO2 to the active flux during the submerged arc welding process.

Keywords: BAT algorithm, design of experiment, optimization, submerged arc welding.

13610 Determining a Suitable Maintenance Measure for Gentelligent Components Using Case-Based Reasoning

Authors: M. Winkens, P. Nyhuis

Abstract:

Components with sensory properties, such as the gentelligent components developed at Collaborative Research Centre 653, offer a new angle on fully utilizing the remaining service life as well as on preventive maintenance. The developed methodology of component-status-driven maintenance analyzes the stress data obtained during the component's useful life and, on the basis of this knowledge, assesses the type of maintenance required in each case. The procedure is derived from the case-based reasoning method and is explained in detail. The method's functionality is demonstrated with real-life data obtained during test runs of a racing car prototype.

Keywords: Gentelligent components, preventive maintenance, case-based reasoning.

13609 Stealth Laser Dicing Process Improvement via Shuffled Frog Leaping Algorithm

Authors: Pongchanun Luangpaiboon, Wanwisa Sarasang

Abstract:

In this paper, the performance of the shuffled frog leaping algorithm was investigated on the stealth laser dicing process. The test problem was based on the tolerance of the meandering data, which the customer specification requires to be less than five microns, with a target of zero microns; the current meandering levels are unsatisfactory when compared to this specification. Firstly, a two-level factorial design was applied to a preliminary study of the statistically significant effects of five process variables, one of which is integer-valued. From the experimental results, the new operating condition found by the algorithm was superior to the current manufacturing condition.

Keywords: Stealth Laser Dicing Process, Meandering, Metaheuristics, Shuffled Frog Leaping Algorithm.

13608 Predictions of Values in a Causticizing Process

Authors: R. Andreola, O. A. A. Santos, L. M. M. Jorge

Abstract:

An industrial system for the production of white liquor at a paper mill, Klabin Paraná Papéis, formed by ten reactors, was modeled, simulated, and analyzed. The developed model considered possible water losses by evaporation and reaction, in addition to variations in the volumetric flow of lime mud across the reactors due to composition variations. The model predictions agreed well with the process measurements at the plant, and the results showed that the slaking reaction is nearly complete at the third causticizing reactor, while causticizing ends by the seventh reactor. Water loss due to the slaking reaction and evaporation occurs more pronouncedly in the slaking reactor than in the final causticizing reactors; nevertheless, the lime mud flow remains nearly constant across the reactors.

Keywords: Causticizing, lime, prediction, process.

13607 An Experimental Design Approach to Determine Effects of The Operating Parameters on The Rate of Ru promoted Ir Carbonylation of Methanol

Authors: Vahid Hosseinpour, Mohammad Kazemini, Alireza Mohammadrezaee

Abstract:

Carbonylation of methanol in the homogeneous phase is one of the major routes for the production of acetic acid, and amongst the group VIII metal catalysts used in this process, iridium has displayed the best capabilities. To investigate the effects of operating parameters such as temperature, pressure, and methyl iodide, methyl acetate, iridium, ruthenium, and water concentrations on the reaction rate, an experimental design for this system based on the central composite design (CCD) was utilized. The statistical rate equation developed by this method contained the individual, interaction, and curvature effects of the parameters on the reaction rate. The model, with a p-value less than 0.0001 and R2 values greater than 0.9, confirmed a satisfactory fit between the experimental and theoretical studies; in other words, the developed model and the experimental data obtained passed all diagnostic tests, establishing the model as statistically significant.

Keywords: Acetic Acid, Carbonylation of Methanol, Central Composite Design, Experimental Design, Iridium/Ruthenium
