Search results for: data centre cooling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7888


6958 Analysis of DNA Microarray Data using Association Rules: A Selective Study

Authors: M. Anandhavalli Gauthaman

Abstract:

DNA microarrays allow the measurement of expression levels for a large number of genes, perhaps all genes of an organism, across a number of different experimental samples. Extracting biologically meaningful information from this huge amount of expression data is essential for understanding the current state of the cell, because most cellular processes are regulated by changes in gene expression. Association rule mining techniques help to find association relationships between genes, and numerous association rule mining algorithms have been developed to analyze this huge amount of gene expression data. This paper focuses on some of the popular association rule mining algorithms developed to analyze gene expression data.
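As a concrete illustration (not taken from the paper), the sketch below mines association rules from a small binarized gene expression table using the Apriori implementation in the mlxtend library; the gene names, expression values and the threshold are illustrative placeholders.

```python
# Sketch: mining association rules from binarized gene expression data.
# Gene names, values and the expression threshold are illustrative placeholders.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Toy expression matrix: rows = samples, columns = genes (expression levels).
expr = pd.DataFrame(
    {"geneA": [5.1, 0.2, 4.8, 4.9], "geneB": [4.7, 0.1, 4.5, 0.3], "geneC": [0.2, 3.9, 0.1, 4.2]},
    index=["s1", "s2", "s3", "s4"],
)

# Binarize: a gene is "expressed" in a sample if its level exceeds a threshold.
items = expr > 1.0

# Frequent itemsets and rules of the form {geneA} -> {geneB}.
frequent = apriori(items, min_support=0.5, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```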

Keywords: DNA microarray, gene expression, association rule mining.

Downloads: 2144
6957 Prospects, Problems of Marketing Research and Data Mining in Turkey

Authors: Sema Kurtuluş, Kemal Kurtuluş

Abstract:

The objective of this paper is to review and assess the methodological issues and problems in marketing research and in data and knowledge mining in Turkey. In summary, academic marketing research publications in Turkey have significant problems. The most critical problem appears to be related to modeling, as most of the publications had major weaknesses in this area. There were also serious problems regarding measurement and scaling, sampling, and analysis. Analysis myopia seems to be the most important problem for young academics in Turkey. Another very important finding is the lack of publications on data and knowledge mining in the academic world.

Keywords: Marketing research, data mining, knowledge mining, research modeling, analyses.

Downloads: 1966
6956 Analysis and Comparison of Image Encryption Algorithms

Authors: İsmet Öztürk, İbrahim Soğukpınar

Abstract:

With the rapid growth of electronic data exchange, information security is becoming more important in data storage and transmission. Because images are widely used in industrial processes, it is important to protect confidential image data from unauthorized access. In this paper, we analyze current image encryption algorithms, and compression is added to two of them (mirror-like image encryption and visual cryptography). Implementations of these two algorithms have been realized for experimental purposes, and the results of the analysis are given in this paper.

Keywords: image encryption, image cryptosystem, security, transmission

Downloads: 4956
6955 Risk Classification of SMEs by Early Warning Model Based on Data Mining

Authors: Nermin Ozgulbas, Ali Serhan Koyuncugil

Abstract:

One of the biggest problems of SMEs is their tendency toward financial distress because of an insufficient financial background. In this study, an Early Warning System (EWS) model based on data mining for financial risk detection is presented. The CHAID algorithm has been used for development of the EWS. Thanks to its automated nature, the developed EWS can serve as a tailor-made financial advisor in the decision-making process of firms that lack an adequate financial background. In addition, an application of the model was implemented covering 7,853 SMEs, based on Turkish Central Bank (TCB) 2007 data. Using the EWS model, 31 risk profiles, 15 risk indicators, 2 early warning signals, and 4 financial road maps have been determined for financial risk mitigation.
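For readers who want to see what a tree-based early-warning classifier looks like in code, here is a minimal sketch. CHAID itself is not available in scikit-learn, so a CART decision tree stands in for it, and the financial ratios and distress labels are synthetic placeholders rather than TCB data.

```python
# Sketch: a tree-based early-warning classifier for SME financial risk.
# CHAID is not available in scikit-learn, so a CART decision tree is used as a
# stand-in; the financial ratios and labels below are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.normal(1.5, 0.5, n),   # hypothetical current ratio
    rng.normal(0.4, 0.2, n),   # hypothetical debt ratio
    rng.normal(0.05, 0.1, n),  # hypothetical return on assets
])
y = ((X[:, 1] > 0.6) | (X[:, 2] < -0.05)).astype(int)  # 1 = financially distressed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X_tr, y_tr)

print("test accuracy:", tree.score(X_te, y_te))
# Each leaf of the tree corresponds to a risk profile with its own indicators.
print(export_text(tree, feature_names=["current_ratio", "debt_ratio", "roa"]))
```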

Keywords: Early Warning Systems, Data Mining, Financial Risk, SMEs.

Downloads: 3386
6954 The Analysis of Defects Prediction in Injection Molding

Authors: Mehdi Moayyedian, Kazem Abhary, Romeo Marian

Abstract:

This paper presents an evaluation of a plastic defect in injection molding, known as the short shot defect, before it occurs in the process. The aim is to evaluate the different parameters that affect the possibility of the short shot defect. The analysis of short shot possibility is conducted via SolidWorks Plastics and the Taguchi method to determine the most significant parameters. The Finite Element Method (FEM) is employed to analyze two circular flat polypropylene plates of 1 mm thickness. Filling time, part cooling time, pressure holding time and melt temperature are chosen as the process parameters, and gate type as the geometric parameter. A methodology is presented herein to predict the possibility of short shot occurrence. The analysis determined that melt temperature is the most influential parameter, with a contribution of 74.25%, followed by filling time with a contribution of 22% and gate type with a contribution of 3.69%. It was also determined that the optimum levels of each parameter, leading to a reduction in the possibility of short shot, are gate type at level 1, filling time at level 3 and melt temperature at level 3. Overall, the most significant parameters affecting the possibility of short shot were melt temperature, filling time, and gate type.
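The percentage contributions reported above come from a Taguchi-style analysis of means. The sketch below shows how such contributions can be computed from an L9 orthogonal array; the response values are placeholders, not the SolidWorks Plastics results from the paper.

```python
# Sketch: percentage contribution of each factor in a Taguchi-style analysis,
# computed from sums of squares. The L9 layout and the response values are
# illustrative placeholders, not the data reported in the paper.
import numpy as np

# Taguchi L9 orthogonal array for three factors at three levels.
L9 = np.array([
    [0, 0, 0], [0, 1, 1], [0, 2, 2],
    [1, 0, 1], [1, 1, 2], [1, 2, 0],
    [2, 0, 2], [2, 1, 0], [2, 2, 1],
])
response = np.array([8.1, 7.4, 6.9, 7.8, 6.5, 7.0, 6.2, 6.8, 6.0])  # e.g. short-shot possibility

grand_mean = response.mean()
total_ss = ((response - grand_mean) ** 2).sum()

for name, col in zip(["melt temperature", "filling time", "gate type"], range(3)):
    level_means = [response[L9[:, col] == lvl].mean() for lvl in range(3)]
    factor_ss = sum(3 * (m - grand_mean) ** 2 for m in level_means)
    print(f"{name}: contribution = {100 * factor_ss / total_ss:.1f}%")
```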

Keywords: Injection molding, plastic defects, short shot, Taguchi method.

Downloads: 1531
6953 Using Data from Foursquare Web Service to Represent the Commercial Activity of a City

Authors: Taras Agryzkov, Almudena Nolasco-Cirugeda, Jos´e L. Oliver, Leticia Serrano-Estrada, Leandro Tortosa, Jos´e F. Vicent

Abstract:

This paper aims to represent the commercial activity of a city, taking the social network Foursquare as its data source. The city of Murcia is selected as a case study, and the location-based social network Foursquare is the main source of information. After reorganising the user-generated data extracted from Foursquare, it is possible to graphically display on a map the various city spaces and venues, especially those related to commercial, food and entertainment sector businesses. The obtained visualisation provides information about activity patterns in the city of Murcia according to people's interests and preferences and, moreover, interesting facts about certain characteristics of the town itself.

Keywords: Social networks, Foursquare, spatial analysis, data visualization, geocomputation.

Downloads: 2675
6952 Long-Range Dependence of Financial Time Series Data

Authors: Chatchai Pesee

Abstract:

This paper examines long-range dependence, or long memory, of financial time series on exchange rate data using fractional Brownian motion (fBm). The principle of the spectral density function is used to find the range of the Hurst parameter (H) of the fBm. If 0 < H < 1/2, the process has short-range dependence (SRD); it exhibits long memory, or long-range dependence (LRD), if 1/2 < H < 1. The curve of the exchange rate data is fBm because of the specific appearance of the Hurst parameter (H). Some definitions of fBm, long-range dependence and self-similarity are reviewed in Section II. Our results in Section III indicate that there exists long memory, or long-range dependence (LRD), for the exchange rate data. Long-range dependence of the exchange rate data and estimation of the Hurst parameter (H) are discussed in Section IV, while the conclusion is given in Section V.
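A minimal sketch of a spectral (periodogram-based) estimate of the Hurst parameter, assuming the series of exchange-rate increments behaves like fractional Gaussian noise whose spectral density is proportional to f^(1-2H) near zero; the input series here is random placeholder data, not the exchange-rate data used in the paper.

```python
# Sketch: periodogram-based Hurst parameter estimate for a series of increments
# (fractional Gaussian noise). Near zero frequency the spectral density behaves
# like f**(1 - 2H), so a log-log regression of the periodogram on frequency
# gives H = (1 - slope) / 2. The input is placeholder white noise, so the
# estimate should come out close to H = 0.5.
import numpy as np

rng = np.random.default_rng(1)
increments = rng.standard_normal(4096)          # stand-in for exchange-rate increments

freqs = np.fft.rfftfreq(len(increments))[1:]    # drop the zero frequency
periodogram = np.abs(np.fft.rfft(increments))[1:] ** 2 / len(increments)

# Use only the low-frequency part, where the power-law behaviour holds.
low = freqs < 0.1
slope, _ = np.polyfit(np.log(freqs[low]), np.log(periodogram[low]), 1)
H = (1.0 - slope) / 2.0
print(f"estimated Hurst parameter H = {H:.2f}")
print("long-range dependence" if H > 0.5 else "short-range dependence")
```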

Keywords: Fractional Brownian motion, long-range dependence, memory, short-range dependence.

Downloads: 1883
6951 Meta Random Forests

Authors: Praveen Boinee, Alessandro De Angelis, Gian Luca Foresti

Abstract:

Leo Breiman's Random Forests (RF) is a recent development in tree-based classifiers and has quickly proven to be one of the most important algorithms in the machine learning literature. It has shown robust and improved classification results on standard data sets. Ensemble learning algorithms such as AdaBoost and Bagging have been an active area of research and have shown improvements in classification results for several benchmark data sets, mainly with decision trees as their base classifiers. In this paper, we apply these meta-learning techniques to random forests. We investigate the behaviour of ensembles of random forests on the standard data sets available in the UCI repository, compare the original random forest algorithm with its ensemble counterparts, and discuss the results.
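A minimal sketch of the idea with scikit-learn: bagging and AdaBoost act as the meta-learners with a random forest as the base classifier, and the iris data set bundled with scikit-learn stands in for the UCI benchmarks used in the paper.

```python
# Sketch: ensembles of random forests. Bagging and AdaBoost are the
# meta-learners, a random forest is the base classifier, and a small
# UCI-style benchmark (iris) stands in for the UCI data sets.
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
base_rf = RandomForestClassifier(n_estimators=25, random_state=0)

models = {
    "random forest": base_rf,
    "bagged random forests": BaggingClassifier(base_rf, n_estimators=10, random_state=0),
    "boosted random forests": AdaBoostClassifier(base_rf, n_estimators=10, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```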

Keywords: Random Forests (RF), ensembles, UCI.

Downloads: 2708
6950 Economic Effects and Energy Use Efficiency of Incorporating Alfalfa and Fertilizer into Grass-Based Pasture Systems

Authors: M. Khakbazan, S. L. Scott, H. C. Block, C. D. Robins, W. P. McCaughey

Abstract:

A ten-year grazing study was conducted at the Agriculture and Agri-Food Canada Brandon Research Centre in Manitoba to study the effect of alfalfa inclusion and fertilizer (N, P, K, and S) addition on economics and efficiency of non-renewable energy use in meadow brome grass-based pasture systems for beef production. Fertilizing grass-only or alfalfa-grass pastures to full soil test recommendations improved pasture productivity, but did not improve profitability compared to unfertilized pastures. Fertilizing grass-only pastures resulted in the highest net loss of any pasture management strategy in this study. Adding alfalfa at the time of seeding, with no added fertilizer, was economically the best pasture improvement strategy in this study. Because of moisture limitations, adding commercial fertilizer to full soil test recommendations is probably not economically justifiable in most years, especially with the rising cost of fertilizer. Improving grass-only pastures by adding fertilizer and/or alfalfa required additional non-renewable energy inputs; however, the additional energy required for unfertilized alfalfa-grass pastures was minimal compared to the fertilized pastures. Of the four pasture management strategies, adding alfalfa to grass pastures without adding fertilizer had the highest efficiency of energy use. Based on energy use and economic performance, the unfertilized alfalfa-grass pasture was the most efficient and sustainable pasture system.

Keywords: Alfalfa, grass, fertilizer, pasture systems, economics, energy.

Downloads: 1675
6949 Time Series Regression with Meta-Clusters

Authors: Monika Chuchro

Abstract:

This paper presents a preliminary attempt to apply classification of time series using meta-clusters in order to improve the quality of regression models. Clustering was performed to obtain subgroups of time series data with normal distributions from data on the inflow into a wastewater treatment plant, which is composed of several groups differing by mean value. Two simple algorithms, K-means and EM, were chosen as clustering methods, and the Rand index was used to measure similarity. After simple meta-clustering, a regression model was built for each subgroup, and the final model was the sum of the subgroup models. The quality of the obtained model was compared with a regression model built using the same explanatory variables but with no clustering of the data. Results were compared using the coefficient of determination (R2), the mean absolute percentage error (MAPE) as a measure of prediction accuracy, and comparison on a linear chart. Preliminary results allow us to foresee the potential of the presented technique.
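A minimal sketch of the cluster-then-regress idea, assuming K-means as the clustering step and ordinary linear regression per subgroup; the inflow series and explanatory variables are synthetic placeholders, and the EM variant and the Rand-index comparison are omitted.

```python
# Sketch: cluster-then-regress modelling of a time series. K-means groups the
# observations, one linear regression is fitted per cluster, and MAPE compares
# the clustered model with a single global model. The inflow series below is
# synthetic placeholder data, not wastewater treatment plant measurements.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
t = np.arange(730, dtype=float)                      # two years of daily data
regime = (t // 365 % 2)                              # two regimes differing by mean value
inflow = 100 + 40 * regime + 5 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 3, t.size)
X = np.column_stack([t % 7, regime])                 # simple explanatory variables

def mape(y_true, y_pred):
    return 100 * np.mean(np.abs((y_true - y_pred) / y_true))

# Global model without clustering.
global_pred = LinearRegression().fit(X, inflow).predict(X)

# Cluster the series, then fit one regression per cluster and sum the pieces.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(inflow.reshape(-1, 1))
clustered_pred = np.empty_like(inflow)
for k in np.unique(labels):
    idx = labels == k
    clustered_pred[idx] = LinearRegression().fit(X[idx], inflow[idx]).predict(X[idx])

print(f"MAPE without clustering: {mape(inflow, global_pred):.2f}%")
print(f"MAPE with meta-clusters: {mape(inflow, clustered_pred):.2f}%")
```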

Keywords: Clustering, Data analysis, Data mining, Predictive models.

Downloads: 1950
6948 ROSA/LSTF Test on Pressurized Water Reactor Steam Generator Tube Rupture Accident Induced by Main Steam Line Break with Recovery Actions

Authors: Takeshi Takeda

Abstract:

An experiment was performed for the OECD/NEA ROSA-2 Project employing the ROSA/LSTF (rig of safety assessment/large-scale test facility), which simulated a steam generator tube rupture (SGTR) accident induced by a main steam line break (MSLB) with operator recovery actions in a pressurized water reactor (PWR). The primary pressure decreased to a level nearly equal to the intact steam generator (SG) secondary-side pressure, even with coolant injection from the high-pressure injection (HPI) system of the emergency core cooling system (ECCS) into the cold legs. Multi-dimensional coolant behavior appeared, such as thermal stratification in both the hot and cold legs of the intact loop. The RELAP5/MOD3.3 code gave insufficient predictions of the primary pressure, the SGTR break flow rate, and the HPI flow rate, and failed to predict the fluid temperatures in the intact loop hot and cold legs. Results obtained from the comparison among three LSTF SGTR-related tests clarified that thermal stratification occurs in the horizontal legs by different mechanisms.

Keywords: LSTF, SGTR, thermal stratification, RELAP5.

Downloads: 786
6947 Development of Sustainable Farming Compartment with Treated Wastewater in Abu Dhabi

Authors: Jongwan Eun, Sam Helwany, Lakshyana K. C.

Abstract:

The United Arab Emirates (UAE) is significantly dependent on desalinated water and groundwater resources, which are expensive and highly energy intensive. Despite the scarce water resources, only 54% of the recycled water was reused in 2012, and due to the lack of infrastructure to reuse recycled water, this portion is expected to decrease with growing water usage. In this study, an "Oasis" complex comprised of Sustainable Farming Compartments (SFC) was proposed for reusing treated wastewater. The wastewater is used to decrease the ambient temperature of the SFC via an evaporative cooler. The SFC prototype was designed, built, and tested in an environmentally controlled laboratory and at a field site to evaluate the feasibility and effectiveness of the SFC subjected to various climatic conditions in Abu Dhabi. Based on the experimental results, the temperature drops achieved in the SFC in the laboratory and at the field site were 5 °C from 22 °C and 7-15 °C (from 33-45 °C to an average of 28 °C at relative humidity < 50%), respectively. An energy simulation using TRNSYS was performed to extend and validate the results obtained from the experiment. The results from the energy simulation and the experiments show statistically close agreement. The total power consumption of the SFC system was approximately three and a half times lower than that of an electrical air conditioner. Therefore, by using treated wastewater, the SFC has a promising prospect for addressing Abu Dhabi's ecological concerns related to desertification and wind erosion.

Keywords: Ecological farming system, energy simulation, evaporative cooling system, treated wastewater, temperature, humidity.

Downloads: 1314
6946 Studies on Determination of the Optimum Distance Between the Tmotes for Optimum Data Transfer in a Network with WLL Capability

Authors: N C Santhosh Kumar, N K Kishore

Abstract:

Using mini modules of Tmotes, it is possible to automate a small personal area network. This idea can be extended to large networks too by implementing multi-hop routing. By linking the various Tmotes using programming languages such as NesC and Java, with transmitter and receiver sections, a network can be monitored. It is foreseen that, depending on the application, a long range at a low data transfer rate or average throughput may be an acceptable trade-off. To reduce the overall costs involved, the optimum number of Tmotes to be used under various conditions (indoor/outdoor) is to be deduced. By analyzing the data rates or throughputs at various locations of the Tmotes, it is possible to deduce an optimal number of Tmotes for a specific network. This paper deals with the determination of optimum distances to reduce the cost and increase the reliability of an entire sensor network with Wireless Local Loop (WLL) capability.

Keywords: Average throughput, data rate, multi-hop routing, optimum data transfer, throughput, Tmotes, wireless local loop.

Downloads: 1365
6945 Unified Structured Process for Health Analytics

Authors: Supunmali Ahangama, Danny Chiang Choon Poo

Abstract:

Health analytics (HA) is used in healthcare systems for effective decision making, management and planning of healthcare and related activities. However, user resistance, the unique nature of medical data content and structure (including heterogeneous and unstructured data) and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose a HA process model with features from the rational unified process (RUP) model and agile methodology.

Keywords: Agile methodology, health analytics, unified process model, UML.

Downloads: 2329
6944 Energy Efficient Plant Design Approaches: Case Study of the Sample Building of the Energy Efficiency Training Facilities

Authors: Idil Kanter Otcu

Abstract:

Nowadays, due to the growing problems of energy supply and the drastic reduction of natural non-renewable resources, the development of new applications in the energy sector and steps towards greater efficiency in energy consumption are required. Buildings account for a large share of energy consumption, so increasing the structural density of buildings causes an increase in energy consumption. This increase means that energy efficiency approaches to building design and the integration of new systems using emerging technologies become necessary in order to curb consumption. Alongside new systems for the productive use of generated energy, buildings that require less energy to operate and make rational use of resources need to be developed. One solution for reducing the energy requirements of buildings is landscape planning, design and application. Requirements such as heating, cooling and lighting can be met with lower energy consumption through planting design, which can help to achieve more efficient and rational use of resources. Within this context, planting design should not consider only the ecological and aesthetic features of plants; it should also extend to spatial organization, taking into account the relationship between the site and open spaces in the context of climatic elements. In this way, the planting design can serve an additional purpose. In this study, a landscape design which takes into consideration location, local climate morphology and solar angle will be illustrated on a sample building project.

Keywords: Energy efficiency, landscape design, plant design, xeriscape landscape.

Downloads: 1803
6943 Performance Evaluation of Conventional and Wiper Carbide Tools When Turning 6060 Aluminium Alloy: Analysis of Surface Roughness

Authors: Salah Gariani, Taher Dao, Khaled Jegandi

Abstract:

Wiper inserts are widely used nowadays, particularly in turning and milling operations, due to their unique geometric characteristics, which generate a superb surface finish and improve productivity. Wiper inserts can run at double the feed rate while preserving surface roughness comparable to that produced by conventional cutting tools. This paper reports an experimental investigation of the surface quality generated in the precision dry turning of 6060 aluminium alloy using conventional and wiper inserts at different cutting conditions. The Taguchi L9 array, Analysis of Means (AOM) and analysis of variance (ANOVA) were employed to develop the experimental design and to optimise the response considered, the average surface roughness (Ra). The experimental results show that the wiper inserts substantially improved the surface quality of the machined samples, by a factor of two compared to the conventional insert, under all cutting conditions. The ANOVA and AOM analyses showed that the type of insert is the most significant factor affecting surface roughness, with a Percentage Contribution Ratio (PCR) value of 67.41%. Feed rate also significantly affected surface roughness but contributed less to its variation. No significant difference was found between the Ra values obtained with wiper inserts under dry and wet cooling modes when turning 6060 aluminium alloy.

Keywords: 6060 Aluminium alloy, conventional and wiper carbide tools, dry turning, average surface roughness.

Downloads: 319
6942 The Classification Model for Hard Disk Drive Functional Tests under Sparse Data Conditions

Authors: S. Pattanapairoj, D. Chetchotsak

Abstract:

This paper proposes classification models to be used as a proxy for hard disk drive (HDD) functional tests, which require more than two weeks to classify HDD status as either "Pass" or "Fail". These models were constructed using a committee network consisting of a number of single neural networks. The paper also includes a method to solve the problem of sparse data for failed parts, called the "enforce learning method". Our results reveal that the classification models constructed with the proposed method perform well under sparse data conditions, and thus the models, which take only a few seconds for HDD classification, can be used to substitute for the HDD functional tests.
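A minimal sketch of a committee of small neural networks combined by majority vote, in the spirit of the approach described above; the features and labels are synthetic placeholders, and the paper's "enforce learning" treatment of the sparse failed class is not reproduced here.

```python
# Sketch: a committee network for HDD pass/fail classification, built from
# several small neural networks trained on bootstrap samples and combined by
# majority vote. Features and labels are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                       # hypothetical functional-test features
y = (X[:, 0] + 0.5 * X[:, 3] > 1.5).astype(int)     # 1 = "Fail", 0 = "Pass"

committee = []
for seed in range(5):                               # five single networks form the committee
    idx = np.random.default_rng(seed).integers(0, len(y), len(y))   # bootstrap sample
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=seed)
    committee.append(net.fit(X[idx], y[idx]))

votes = np.stack([net.predict(X) for net in committee])
decision = (votes.mean(axis=0) >= 0.5).astype(int)  # majority vote across the committee
print("agreement with the individual labels:", (decision == y).mean())
```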

Keywords: Sparse data, Classifications, Committee network

Downloads: 1735
6941 Accurate Position Electromagnetic Sensor Using Data Acquisition System

Authors: Z. Ezzouine, A. Nakheli

Abstract:

This paper presents a highly accurate position electromagnetic sensor system (HPESS) that is applicable to moving object detection. The authors have developed a high-performance position sensor prototype dedicated to students' laboratories. The challenge was to obtain a highly accurate, real-time sensor able to calculate position, length or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion to electric energy with direct contact, and the output signal can then be fed to an electronic circuit. The voltage change output by the sensor is captured by a data acquisition system using LabVIEW software, and the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specially for sensor signal monitoring. The data are then recorded and viewed using a user interface written in National Instruments LabVIEW software. On-line displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable and inexpensive transducer for highly sophisticated control systems.

Keywords: Electromagnetic sensor, data acquisition, accuracy, position measurement.

Downloads: 960
6940 Calculus Logarithmic Function for Image Encryption

Authors: Adil AL-Rammahi

Abstract:

When we want to secure data against various attacks and to ensure data integrity, we must encrypt the data before it is transmitted or stored. This paper introduces a new effective and lossless image encryption algorithm using a natural logarithmic function. The new algorithm encrypts an image through a three-stage process. In the first stage, a reference natural logarithmic function is generated as the foundation for the encrypted image. The image numeral matrix is then analyzed into five integer numbers, and the positions of these numbers are transformed into matrices. The method is useful for efficiently encrypting a variety of digital images, such as binary images, gray images, and RGB images, without any quality loss. The principles of the presented scheme could be applied to provide complexity, and hence security, for a variety of data systems such as images and others.

Keywords: Linear Systems, Image Encryption, Calculus.

Downloads: 2400
6939 Challenging the Stereotypes: A Critical Study of Chotti Munda and His Arrow and Sula

Authors: Khushboo Gokani, Renu Josan

Abstract:

Mahasweta Devi and Toni Morrison are two stalwarts of Indian English and Afro-American literature respectively. The writings of these two novelists are authentic and powerful records of the lives of the people, because much of their personal experience has gone into the making of their works. Devi, a representative force in Indian English literature, is also a social activist working with the tribals of Bihar, Jharkhand, Orissa and West Bengal. Most of her works echo the lives and struggles of the subalterns, as is evident in her "best beloved book" Chotti Munda and His Arrow. The novelist focuses on the struggle of the tribals against the colonial and feudal powers to create their own identity, thereby embarking on the ideological project of 'setting the record straight'. The Nobel Laureate Toni Morrison, on the other hand, brings to the fore the crucial issues of gender, race and class in many of her significant works. In one of her representative works, Sula, the protagonist emerges as a non-conformist and directly confronts the notion of a 'good woman' nurtured by the community of Blacks. In addition to this, the struggle of the Blacks against White domination also becomes an important theme of the text. The thrust of the paper lies in making a critical analysis of the portrayal of the heroic attempts of the subaltern protagonists and the artistic endeavor of the novelists in challenging the stereotypes.

Keywords: Subaltern, the centre and the periphery, struggle of the muted groups.

Downloads: 3694
6938 Intelligent BRT in Tehran

Authors: P. Parvizi, S. Mohammadi

Abstract:

An intelligent BRT system is necessary when communities are looking for new ways to use high-capacity rapid transit at a reduced cost. This paper describes an intelligent control system that works with a data center. With the help of a GPS system, the data center can monitor the situation of each bus and bus station. Through RFID technology, the bus stations and traffic lights can exchange data with the buses, and with WiMAX communication technology all of the parts can talk to each other; the data center learns all information about the location of each bus, the arrival of buses at each station, and the number of passengers at each station and on each bus. Finally, the paper presents a case study of these ideas in the Tehran BRT.

Keywords: Tehran BRT, RFID, intelligent transportation.

Downloads: 2450
6937 Spread Spectrum Image Watermarking for Secured Multimedia Data Communication

Authors: Tirtha S. Das, Ayan K. Sau, Subir K. Sarkar

Abstract:

Digital watermarking provides the facility of secure multimedia data communication in addition to its copyright protection role. The spread spectrum modulation principle is widely used in digital watermarking to give multimedia signals robustness against various signal-processing operations. Several SS watermarking algorithms have been proposed for multimedia signals, but very few works have discussed the issues responsible for secure data communication and its robustness improvement. The current paper critically analyzes a few such factors, namely the properties of the spreading codes, proper signal decomposition suitable for data embedding, the security provided by the key, the successive bit cancellation method applied at the decoder, which has a great impact on detection reliability, and the secure communication of a significant signal under the camouflage of insignificant signals. Based on this analysis, a robust SS watermarking scheme for secure data communication is proposed in the wavelet domain, and improvements in secure communication and robustness performance are reported through experimental results. The reported results also show improvement in the visual and statistical invisibility of the hidden data.
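A minimal sketch of additive spread-spectrum embedding in a wavelet sub-band, assuming a key-seeded pseudo-noise spreading sequence and a simple correlation detector; the wavelet, embedding gain and bit layout are illustrative choices, not those of the proposed scheme.

```python
# Sketch: additive spread-spectrum watermark embedding in the wavelet domain.
# A key-seeded PN sequence spreads the watermark bits into a detail sub-band of
# a one-level DWT. The host image, wavelet, and gain (deliberately exaggerated
# so the toy detector is reliable) are illustrative placeholders.
import numpy as np
import pywt

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(128, 128)).astype(float)   # placeholder host image

key = 42
bits = np.array([1, 0, 1, 1])                                  # watermark bits
gain = 10.0                                                    # embedding strength

cA, (cH, cV, cD) = pywt.dwt2(image, "haar")
chips_per_bit = cD.size // bits.size
pn = np.random.default_rng(key).choice([-1.0, 1.0], size=bits.size * chips_per_bit)

# Spread each bit over its own block of PN chips (bit 1 -> +1, bit 0 -> -1).
spread = np.repeat(2.0 * bits - 1.0, chips_per_bit) * pn
flat = cD.ravel().copy()
flat[: spread.size] += gain * spread
watermarked = pywt.idwt2((cA, (cH, cV, flat.reshape(cD.shape))), "haar")

# Blind correlation detector: despread with the same key and take the sign.
_, (_, _, cD_rx) = pywt.dwt2(watermarked, "haar")
rx = cD_rx.ravel()[: spread.size] * pn
recovered = (rx.reshape(bits.size, chips_per_bit).mean(axis=1) > 0).astype(int)
print("recovered bits:", recovered)
```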

Keywords: Spread spectrum modulation, spreading code, signal decomposition, security, successive bit cancellation.

Downloads: 2780
6936 Comparison of Hough Transform and Mean Shift Algorithm for Estimation of the Orientation Angle of Industrial Data Matrix Codes

Authors: Ion-Cosmin Dita, Vasile Gui, Franz Quint, Marius Otesteanu

Abstract:

In the automatic manufacturing and assembly of mechanical, electrical and electronic parts, one needs to reliably identify the position of components and to extract the information they carry. Data Matrix Codes (DMC) are well established these days in many areas of industrial manufacturing thanks to their concentration of information in small spaces. In today's largely order-driven industry, where increased tracing requirements prevail, they offer further advantages over other identification systems. This underlines in an impressive way the necessity of a robust code reading system for detecting DMC on components in factories. This paper compares two methods for estimating the angle of orientation of Data Matrix Codes: one based on the Hough transform and the other based on the mean shift algorithm. We concentrate on Data Matrix Codes in industrial environments, punched, milled, lasered or etched on different materials in arbitrary orientations.
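A minimal sketch of the Hough-transform side of such a comparison, assuming OpenCV's standard HoughLines applied to a synthetic L-shaped finder pattern rotated by a known angle; a real industrial DMC image would replace the drawn pattern, and the paper's own pipeline is not reproduced.

```python
# Sketch: orientation estimation of a Data Matrix-like pattern via the Hough
# transform (OpenCV). A synthetic "L" finder pattern rotated by a known angle
# stands in for a real punched/lasered DMC image.
import cv2
import numpy as np

img = np.zeros((200, 200), dtype=np.uint8)
cv2.line(img, (50, 150), (150, 150), 255, 3)   # horizontal solid border of the finder "L"
cv2.line(img, (50, 150), (50, 50), 255, 3)     # vertical solid border
angle_true = 25.0                              # arbitrary test rotation in degrees
M = cv2.getRotationMatrix2D((100, 100), angle_true, 1.0)
img = cv2.warpAffine(img, M, (200, 200))

edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=50)

if lines is None:
    print("no lines found; lower the accumulator threshold")
else:
    # Each detected line is (rho, theta); theta is the angle of the line normal.
    thetas = np.degrees(lines[:, 0, 1])
    print("dominant line-normal angles (deg):", np.round(np.sort(thetas), 1))
```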

Keywords: Industrial data matrix code, Hough transform, mean shift.

Downloads: 1334
6935 An Intelligent Human-Computer Interaction System for Decision Support

Authors: Chee Siong Teh, Chee Peng Lim

Abstract:

This paper proposes a novel architecture for developing decision support systems. Unlike conventional decision support systems, the proposed architecture endeavors to reveal the decision-making process such that humans' subjectivity can be incorporated into a computerized system while, at the same time, preserving the capability of the computerized system to process information objectively. A number of techniques used in developing the decision support system are elaborated to make the decision-making process transparent. These include procedures for high-dimensional data visualization, pattern classification, prediction, and evolutionary computational search. An artificial data set is first employed to compare the proposed approach with other methods. A simulated handwritten data set and a real data set on liver disease diagnosis are then employed to evaluate the efficacy of the proposed approach. The results are analyzed and discussed, and the potential of the proposed architecture as a useful decision support system is demonstrated.

Keywords: Interactive evolutionary computation, multivariate data projection, pattern classification, topographic map.

Downloads: 1453
6934 Implementation of Security Algorithms for u-Health Monitoring System

Authors: Jiho Park, Yong-Gyu Lee, Gilwon Yoon

Abstract:

Data security in a u-Health system can be an important issue because wireless networks are vulnerable to hacking. However, it is not easy to implement a proper security algorithm in an embedded u-Health monitoring system because of hardware constraints such as low performance, power consumption and limited memory size. To secure data that contain personal and biosignal information, we implemented several security algorithms, namely Blowfish, the data encryption standard (DES), the advanced encryption standard (AES) and Rivest Cipher 4 (RC4), for our u-Health monitoring system, and the results were successful. We compared these algorithms under the same experimental conditions. RC4 had the fastest execution time, and memory usage was the most efficient for DES. Considering both performance and safety capability, however, we concluded that AES was the most appropriate algorithm for a personal u-Health monitoring system.
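A minimal sketch of such a timing comparison on a desktop machine using the PyCryptodome library; the key sizes, cipher modes and payload size are illustrative assumptions, and the timings produced are those of the machine running the sketch, not of the embedded u-Health hardware evaluated in the paper.

```python
# Sketch: timing AES, DES, Blowfish and RC4 on a block of simulated biosignal
# data with PyCryptodome. Keys, modes and payload size are illustrative only.
import time
from Crypto.Cipher import AES, ARC4, Blowfish, DES
from Crypto.Random import get_random_bytes

payload = get_random_bytes(64 * 1024)          # 64 KiB of simulated biosignal data

ciphers = {
    "AES-128 (CTR)": lambda: AES.new(get_random_bytes(16), AES.MODE_CTR),
    "DES (CTR)": lambda: DES.new(get_random_bytes(8), DES.MODE_CTR),
    "Blowfish (CTR)": lambda: Blowfish.new(get_random_bytes(16), Blowfish.MODE_CTR),
    "RC4": lambda: ARC4.new(get_random_bytes(16)),
}

for name, make_cipher in ciphers.items():
    cipher = make_cipher()
    start = time.perf_counter()
    cipher.encrypt(payload)                    # encrypt once and time it
    elapsed = (time.perf_counter() - start) * 1000
    print(f"{name}: {elapsed:.2f} ms")
```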

Keywords: biosignal, data encryption, security measures, u-health

Downloads: 2129
6933 A Symbol by Symbol Clustering Based Blind Equalizer

Authors: Kristina Georgoulakis

Abstract:

A new blind symbol-by-symbol equalizer is proposed. The operation of the proposed equalizer is based on the geometric properties of the two-dimensional data constellation. An unsupervised clustering technique is used to locate the clusters formed by the received data. The symmetry properties of the cluster labels are subsequently utilized in order to label the clusters. Following this step, the received data are compared to the clusters and decisions are made on a symbol-by-symbol basis, by assigning to each data point the label of the nearest cluster. The operation of the equalizer is investigated in both linear and nonlinear channels. The performance of the proposed equalizer is compared to that of a CMA-based blind equalizer.
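A minimal sketch of the clustering-and-nearest-cluster decision stage, assuming a QPSK constellation, a toy linear channel and k-means clustering; the paper's symmetry-based cluster labelling is replaced here by a labelling shortcut for illustration only.

```python
# Sketch: clustering-based symbol-by-symbol decisions for a blind equalizer.
# Received QPSK samples distorted by a toy channel are grouped with k-means and
# each sample gets the label of its nearest cluster. Channel, constellation and
# the cluster-labelling shortcut are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
symbols = rng.integers(0, 4, 2000)
tx = qpsk[symbols]

# Toy channel: complex gain, mild ISI from the previous symbol, additive noise.
rx = 0.9 * np.exp(1j * 0.3) * tx + 0.1 * np.roll(tx, 1) + 0.05 * (
    rng.standard_normal(tx.size) + 1j * rng.standard_normal(tx.size))

# Unsupervised clustering of the received constellation (real/imag as features).
points = np.column_stack([rx.real, rx.imag])
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(points)

# Label each cluster by the transmitted symbol that dominates it (sketch only;
# a truly blind receiver would exploit the constellation symmetry instead).
cluster_to_symbol = {k: np.bincount(symbols[kmeans.labels_ == k]).argmax() for k in range(4)}

decisions = np.array([cluster_to_symbol[k] for k in kmeans.labels_])
print("symbol error rate:", np.mean(decisions != symbols))
```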

Keywords: Blind equalization, channel equalization, cluster based equalisers

Downloads: 1434
6932 Zero Inflated Models for Overdispersed Count Data

Authors: Y. N. Phang, E. F. Loh

Abstract:

Zero inflated models are usually used in modeling count data with excess zeros, where the excess zeros could be structural zeros or zeros which occur by chance. These types of data are commonly found in various disciplines such as finance, insurance, biomedicine, econometrics, ecology, and the health sciences, including sex and dental health epidemiology. The most popular zero inflated models used by many researchers are the zero inflated Poisson and zero inflated negative binomial models. In addition, zero inflated generalized Poisson and zero inflated double Poisson models are also discussed and found in some of the literature. Recently, the zero inflated inverse trinomial model and the zero inflated strict arcsine model have been advocated and proven to serve as alternative models for modeling overdispersed count data caused by excessive zeros and unobserved heterogeneity. The purpose of this paper is to review some related literature and provide a variety of examples from different disciplines of the application of zero inflated models. Different model selection methods used in model comparison are discussed.
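A minimal sketch of fitting and comparing a standard Poisson model and a zero-inflated Poisson model with statsmodels, on simulated counts that contain structural zeros; the variable names and simulation settings are illustrative assumptions, not data from any of the studies reviewed above.

```python
# Sketch: Poisson vs zero-inflated Poisson on simulated over-dispersed counts
# (a mix of structural zeros and Poisson counts), compared by AIC.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)
X = sm.add_constant(x)

# Simulate counts: 30% structural zeros, otherwise Poisson with a log-link mean.
structural_zero = rng.random(n) < 0.3
y = np.where(structural_zero, 0, rng.poisson(np.exp(0.5 + 0.8 * x)))

poisson_fit = sm.Poisson(y, X).fit(disp=False)
zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=False)

print("Poisson AIC:              ", round(poisson_fit.aic, 1))
print("Zero-inflated Poisson AIC:", round(zip_fit.aic, 1))
```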

Keywords: Overdispersed count data, model selection methods, likelihood ratio, AIC, BIC.

Downloads: 4531
6931 Formalizing a Procedure for Generating Uncertain Resource Availability Assumptions Based On Real Time Logistic Data Capturing with Auto-ID Systems for Reactive Scheduling

Authors: Lars Laußat, Manfred Helmus, Kamil Szczesny, Markus König

Abstract:

As one result of the project “Reactive Construction Project Scheduling using Real Time Construction Logistic Data and Simulation”, a procedure for using data about uncertain resource availability assumptions in reactive scheduling processes has been developed. Prediction data about resource availability are generated in a formalized way using real-time monitoring data, e.g. from auto-ID systems on the construction site and in the supply chains. The paper focuses on the formalization of the procedure for monitoring construction logistic processes, for the detection of disturbances, and for the generation of new, uncertain scheduling assumptions for the reactive resource-constrained simulation procedure, which is and will be further described in other papers.

Keywords: Auto-ID, Construction Logistic, Fuzzy, Monitoring, RFID, Scheduling.

Downloads: 1776
6930 Nuclear Data Evaluation for 217Po

Authors: Sherif S. Nafee, Amir K. Al-Ramady, Salem S. Shaheen

Abstract:

Evaluated nuclear decay data for the 217Po nuclide are presented in this work. These data include recommended values for the half-life T1/2 and for the α-, β−- and γ-ray emission energies and probabilities. Decay data from 221Rn α decay and 217Bi β− decay are presented. Q(α) has been updated based on the recently published Atomic Mass Evaluation AME2012. In addition, the log ft values were calculated using the Logft program from the ENSDF evaluation package. Moreover, the total internal conversion electrons and the K/L, L/M and L/N conversion electron ratios (K-shell to L-shell, L-shell to M-shell and L-shell to N-shell) have been calculated using the BrIcc program. Meanwhile, recommended values for the multipolarities have been assigned based on recent measurements, which yield a better intensity balance at the 254 keV and 264 keV gamma transitions.

Keywords: Atomic Mass Evaluation, Nuclear Data Evaluation, Total Internal Conversion Electrons.

Downloads: 2253
6929 Detection of Keypoint in Press-Fit Curve Based on Convolutional Neural Network

Authors: Shoujia Fang, Guoqing Ding, Xin Chen

Abstract:

The quality of press-fit assembly is closely related to the reliability and safety of the product. This paper proposes a keypoint detection method based on a convolutional neural network to improve the accuracy of keypoint detection in press-fit curves, which provides an auxiliary basis for judging the quality of press-fit assembly. A press-fit curve is a curve of press-fit force versus displacement; both the force data and the displacement data are time series. Therefore, a one-dimensional convolutional neural network is used to process the press-fit curve. After the obtained press-fit data are filtered, a multi-layer one-dimensional convolutional neural network performs automatic learning of the press-fit curve features, which are then sent to a multi-layer perceptron that finally outputs the keypoint of the curve. We used data from press-fit assembly equipment in the actual production process to train the CNN model, and different data from the same equipment to evaluate the detection performance. Compared with existing research results, the detection performance was significantly improved. This method can provide a reliable basis for the judgment of press-fit quality.
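A minimal sketch of a one-dimensional CNN with an MLP head that regresses the keypoint position of a force/displacement curve, written in PyTorch; the architecture (channel counts, kernel sizes, curve length) and the synthetic training batch are illustrative assumptions, not the network or data from the paper.

```python
# Sketch: 1D CNN feature extractor + MLP head regressing the keypoint position
# of a press-fit force/displacement curve. Architecture and data are illustrative.
import torch
import torch.nn as nn

class PressFitKeypointNet(nn.Module):
    def __init__(self, curve_len=256):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.head = nn.Sequential(                 # multi-layer perceptron output stage
            nn.Flatten(),
            nn.Linear(64 * (curve_len // 8), 64), nn.ReLU(),
            nn.Linear(64, 1),                      # normalised keypoint position in [0, 1]
        )

    def forward(self, x):                          # x: (batch, 2 channels, curve_len)
        return self.head(self.features(x)).squeeze(-1)

# Synthetic batch: channel 0 = force, channel 1 = displacement; target = keypoint position.
torch.manual_seed(0)
curves = torch.rand(8, 2, 256)
targets = torch.rand(8)

model = PressFitKeypointNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.MSELoss()(model(curves), targets)
loss.backward()
optimizer.step()
print("one training step done, loss =", float(loss))
```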

Keywords: Keypoint detection, curve feature, convolutional neural network, press-fit assembly.

Downloads: 940