Search results for: Data cutting and sorting method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13808

12518 IFDewey: A New Insert-Friendly Labeling Schema for XML Data

Authors: S. Soltan, A. Zarnani, R. AliMohammadzadeh, M. Rahgozar

Abstract:

XML has become a popular standard for information exchange via the web. Each XML document can be represented as a rooted, ordered, labeled tree, in which a node's label encodes its exact position in the original document. Region encoding and Dewey encoding are two well-known tree-labeling methods. In this paper, we propose a new insert-friendly labeling method, named IFDewey, based on a recently proposed scheme called Extended Dewey. In Extended Dewey, many labels must be modified when a new node is inserted into the XML tree. Our method eliminates this problem by reserving even numbers for future insertions: whereas Extended Dewey may generate both even and odd numbers, IFDewey modifies it so that only odd numbers are generated, leaving the even numbers free for much easier node insertion.

Keywords: XML, tree labeling, query processing.
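
The even/odd reservation idea can be illustrated with a short sketch (a minimal illustration of the general principle, not the authors' implementation; the label representation and helper names are assumptions):

```python
# Sketch of an odd-only Dewey-style labeling: each component of a label is
# kept odd, so the even values between two siblings remain free for later
# insertions without relabeling existing nodes. Illustrative only.

def initial_labels(parent_label, n_children):
    """Assign odd component values 1, 3, 5, ... to n_children of parent_label."""
    return [parent_label + [2 * i + 1] for i in range(n_children)]

def insert_between(left_label, right_label):
    """Insert a new sibling between two existing siblings by reusing the
    reserved even value between their last components (if one exists)."""
    gap = right_label[-1] - left_label[-1]
    if gap >= 2:                      # an unused (even) value is available
        return left_label[:-1] + [left_label[-1] + 1]
    raise ValueError("no reserved value left; further subdivision needed")

root = []                             # label of the document root
children = initial_labels(root, 3)    # [[1], [3], [5]]
new_node = insert_between(children[0], children[1])   # [2], no relabeling
print(children, new_node)
```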

12517 Lithofacies Classification from Well Log Data Using Neural Networks, Interval Neutrosophic Sets and Quantification of Uncertainty

Authors: Pawalai Kraipeerapun, Chun Che Fung, Kok Wai Wong

Abstract:

This paper proposes a novel approach to lithofacies classification based on an assessment of the uncertainty in the classification results. The proposed approach uses multiple neural networks (NN) and interval neutrosophic sets (INS) to classify the input well log data into multiple lithofacies classes. A pair of n-class neural networks is used to predict n degrees of truth membership and n degrees of false membership. Indeterminacy memberships, or uncertainties in the predictions, are estimated using a multidimensional interpolation method. These three memberships form the INS used to support confidence in the results of the multiclass classification. Based on the experimental data, our approach improves classification performance compared to an existing technique applied only to the truth membership. In addition, our approach can provide a measure of uncertainty in the multiclass classification problem.

Keywords: Multiclass classification, feed-forward backpropagation neural network, interval neutrosophic sets, uncertainty.

12516 Numerical Study of Some Coupled PDEs Using the Differential Transformation Method

Authors: Reza Abazari, Rasool Abazari

Abstract:

In this paper, the two-dimensional differential transformation method (DTM) is employed to obtain closed-form solutions of three well-known coupled partial differential equations of physical interest: the coupled Korteweg-de Vries (KdV) equations, the coupled Burgers equations, and the coupled nonlinear Schrödinger equation. We begin by showing how the differential transformation method applies to the linear and nonlinear parts of any PDE, and then apply it to these coupled PDEs to illustrate the sufficiency of the method for this class of nonlinear differential equations. The results obtained are in good agreement with the exact solutions, showing that the technique introduced here is accurate and easy to apply.

Keywords: Coupled Korteweg-de Vries (KdV) equation, Coupled Burgers equation, Coupled Schrödinger equation, differential transformation method.
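
For reference, the two-dimensional differential transform that the method rests on is, in the usual notation about the origin,

\[
W(k,h) = \frac{1}{k!\,h!}\left[\frac{\partial^{k+h} w(x,y)}{\partial x^{k}\,\partial y^{h}}\right]_{(0,0)},
\qquad
w(x,y) = \sum_{k=0}^{\infty}\sum_{h=0}^{\infty} W(k,h)\,x^{k}y^{h},
\]

so that derivatives of w become algebraic recurrences in W(k,h); this is what reduces the coupled KdV, Burgers and Schrödinger systems to recursive relations that can be solved term by term.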

12515 An Overview of the Application of Fuzzy Inference System for the Automation of Breast Cancer Grading with Spectral Data

Authors: Shabbar Naqvi, Jonathan M. Garibaldi

Abstract:

Breast cancer is one of the most frequently occurring cancers in women throughout the world, including the U.K. The grading of this cancer plays a vital role in the prognosis of the disease. In this paper we present an overview of the use of the fuzzy inference system, an advanced computational method, as a tool for the automation of breast cancer grading. A new spectral data set obtained from Fourier Transform Infrared Spectroscopy (FTIR) of cancer patients has been used for this study. Finally, the potential areas in which fuzzy systems can be used for the automation of breast cancer grading are outlined as future work.

Keywords: Breast cancer, FTIR, fuzzy inference system, principal component analysis

12514 Effect of Type of Pile and Its Installation Method on Pile Bearing Capacity by Physical Modeling in Frustum Confining Vessel

Authors: Seyed Abolhasan Naeini, M. Mortezaee

Abstract:

Various factors, such as the method of installation, the pile type, the pile material and the pile shape, can affect the final bearing capacity of a pile executed in the soil; among them, the method of installation is of special importance. Physical modeling is among the best options for the laboratory study of pile behavior. Therefore, the current paper first presents and reviews the frustum confining vessel (FCV) as a suitable tool for physical modeling of deep foundations. Then, by describing the loading tests of two steel piles, one open-ended and one closed-ended, each installed by two methods, "with displacement" and "without displacement", the effect of end conditions and installation method on the final bearing capacity of the pile is investigated. The soil used in the current paper is silty sand from Firuzkuh, Iran. The results of the experiments show that, in general, the without-displacement installation method gives a larger bearing capacity for both piles, and that for a given installation method the closed-ended pile shows a slightly higher bearing capacity.

Keywords: physical modeling, frustum confining vessel, pile, bearing capacity, installation method

12513 Aircraft Selection Using Multiple Criteria Decision Making Analysis Method with Different Data Normalization Techniques

Authors: C. Ardil

Abstract:

This paper presents an original application of multiple criteria decision making analysis theory to the aircraft selection problem. The selection of an optimal, efficient and reliable fleet, network and operations planning policy is one of the most important factors in the aircraft selection problem. Given that decision making in aircraft selection involves a number of conflicting criteria and possible solutions, such a selection can be treated as a multiple criteria decision making analysis problem. This study presents a new integrated approach to decision making that considers multiple criteria utility theory and maximal regret minimization theory, as well as technical, economic, and environmental aspects of the aircraft. The multiple criteria decision making analysis method uses different normalization techniques to allow criteria with qualitative and quantitative data to be aggregated, so selecting a suitable normalization technique for the model is itself a challenge in providing data aggregation for the aircraft selection problem. To compare the impact of different normalization techniques on the decision problem, the vector, linear (sum), linear (max), and linear (max-min) data normalization techniques were used to evaluate the aircraft selection problem. As a logical implication, the proposed approach enhances the decision-making process by enabling the decision maker to (i) use higher-level knowledge regarding the selection of criteria weights and the proposed technique, and (ii) estimate the ranking of an alternative under different data normalization techniques and integrated criteria weights through a posteriori analysis of the final rankings of alternatives. A set of commercial passenger aircraft was considered to illustrate the proposed approach, and the results were compared using Spearman's rho tests. An analysis of the stability of the final ranks with respect to changes in criteria weights was also performed to assess the sensitivity of the alternative rankings obtained under the different data normalization techniques.

Keywords: Normalization Techniques, Aircraft Selection, Multiple Criteria Decision Making, Multiple Criteria Decision Making Analysis, MCDMA
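
The four normalization techniques named in the abstract are standard; as a generic illustration (not the author's code), they act column-wise on a decision matrix of alternatives by criteria as follows:

```python
import numpy as np

def normalize(X, method):
    """Column-wise normalization of a decision matrix X (alternatives x criteria)."""
    X = np.asarray(X, dtype=float)
    if method == "vector":            # r_ij = x_ij / sqrt(sum_i x_ij^2)
        return X / np.sqrt((X ** 2).sum(axis=0))
    if method == "linear_sum":        # r_ij = x_ij / sum_i x_ij
        return X / X.sum(axis=0)
    if method == "linear_max":        # r_ij = x_ij / max_i x_ij
        return X / X.max(axis=0)
    if method == "linear_max_min":    # r_ij = (x_ij - min_i) / (max_i - min_i)
        return (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    raise ValueError(f"unknown method: {method}")

# Example: 3 hypothetical aircraft scored on 4 criteria.
X = [[180, 0.82, 5600, 3.1],
     [210, 0.80, 6100, 2.8],
     [160, 0.85, 5200, 3.4]]
for m in ("vector", "linear_sum", "linear_max", "linear_max_min"):
    print(m, "\n", np.round(normalize(X, m), 3))
```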

12512 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques

Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian

Abstract:

Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses and material damage. Traditional studies of road traffic accidents in urban zones require a very large investment of time and money, and their results are quickly outdated. Nowadays, however, in many countries crowdsourced GPS-based traffic and navigation apps have emerged as an important low-cost source of information for studies of road traffic accidents and the urban congestion they cause. In this article we identify the zones, roads and specific times in Mexico City (CDMX) in which the largest number of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network known as Waze. The methodology employed was Knowledge Discovery in Databases (KDD) for the discovery of patterns in the accident reports, using data mining techniques with the help of Weka. The selected algorithms were Expectation-Maximization (EM), to obtain the ideal number of clusters for the data, and k-means as the grouping method. Finally, the results were visualized with the QGIS Geographic Information System.

Keywords: Data mining, K-means, road traffic accidents, Waze, Weka.
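
The clustering stage described above, EM to choose the number of clusters and k-means as the grouping method, was performed in Weka; a rough scikit-learn equivalent on synthetic accident reports (a stand-in only, since the Waze data are not reproduced here) might look like this:

```python
# Sketch of the clustering stage: EM (Gaussian mixture) model selection to pick
# a cluster count, then k-means for the final grouping. scikit-learn stand-in
# for the Weka workflow described in the abstract; data are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical accident reports: latitude, longitude, hour of day.
reports = np.column_stack([
    rng.normal(19.43, 0.05, 500),   # latitude around CDMX
    rng.normal(-99.13, 0.05, 500),  # longitude around CDMX
    rng.integers(0, 24, 500),       # hour of the report
])

# Choose the number of clusters by the lowest BIC over EM fits.
bics = {k: GaussianMixture(n_components=k, random_state=0)
            .fit(reports).bic(reports) for k in range(2, 11)}
best_k = min(bics, key=bics.get)

# Final grouping with k-means using the EM-selected cluster count.
labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(reports)
print("clusters:", best_k, "sizes:", np.bincount(labels))
```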

12511 Stable Tending Control of Complex Power Systems: An Example of Localized Design of Power System Stabilizers

Authors: Wenjuan Du

Abstract:

The phase compensation method was proposed based on the concept of damping torque analysis (DTA). It is a method for designing a PSS (power system stabilizer) to suppress local-mode power oscillations in a single-machine infinite-bus power system. This paper presents the application of the phase compensation method to the design of a PSS in a multi-machine power system. The application is achieved by examining the direct damping contribution of the stabilizer to the power oscillations. Using the linearized equal-area criterion, a theoretical proof of this application to PSS design is presented. Hence, the PSS design in this paper is an example of stable tending control by a localized method.

Keywords: Phase compensation method, power system small-signal stability, power system stabilizer.

12510 Identification of a Mechanism System Using the Modified PSO Method

Authors: Chih-Cheng Kao, Hsin-Hua Chu

Abstract:

This paper proposes an efficient modified particle swarm optimization (MPSO) method to identify a slider-crank mechanism driven by a field-oriented PM synchronous motor. In system identification, we adopt the MPSO method to find the parameters of the slider-crank mechanism. The new algorithm adds a "distance" term to the traditional PSO fitness function to avoid convergence to a local optimum. Comparisons of numerical simulations and experimental results show that the MPSO identification method for the slider-crank mechanism is feasible.

Keywords: Slider-crank mechanism, distance, system identification, modified particle swarm optimization.
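
The key idea, adding a "distance" term to the PSO fitness so the swarm does not converge prematurely, can be sketched as follows; the exact form of the term is not given in the abstract, so the repulsion-from-global-best penalty below is only an assumed illustration:

```python
import numpy as np

def identification_error(params, t, y_measured, model):
    """Ordinary PSO fitness: squared error between measured and simulated response."""
    return np.sum((y_measured - model(params, t)) ** 2)

def mpso_fitness(params, t, y_measured, model, gbest, weight=1e-3):
    """Modified fitness (assumed form): the usual error minus a term that grows
    with the distance from the current global best, so particles far from gbest
    are penalized less harshly and the swarm keeps exploring."""
    distance = np.linalg.norm(np.asarray(params) - np.asarray(gbest))
    return identification_error(params, t, y_measured, model) - weight * distance

# Toy model standing in for the slider-crank response: y = a * sin(b * t).
model = lambda p, t: p[0] * np.sin(p[1] * t)
t = np.linspace(0, 1, 100)
y = model([2.0, 3.0], t)
print(mpso_fitness([1.8, 2.9], t, y, model, gbest=[1.9, 3.1]))
```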

12509 Burst on Hurst Algorithm for Detecting Activity Patterns in Networks of Cortical Neurons

Authors: G. Stillo, L. Bonzano, M. Chiappalone, A. Vato, F. Davide, S. Martinoia

Abstract:

Electrophysiological signals were recorded from primary cultures of dissociated rat cortical neurons coupled to Micro-Electrode Arrays (MEAs). The neuronal discharge patterns may change under varying physiological and pathological conditions. For this reason, we developed a new burst detection method able to identify bursts with particular features under different experimental conditions (i.e., spontaneous activity and the effect of specific drugs). The main feature of our algorithm, Burst On Hurst, which is based on the self-similarity (fractal) property of the recorded signal, is its independence from the chosen spike detection method, since it works directly on the raw data.

Keywords: Burst detection, cortical neuronal networks, Micro-Electrode Array (MEA), wavelets.
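
The burst detector itself is not detailed in the abstract; the self-similarity measure it builds on, the Hurst exponent, can be estimated for instance by rescaled-range (R/S) analysis, sketched below on a synthetic signal (an illustration of the underlying quantity, not the Burst On Hurst algorithm):

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent of a 1D signal by rescaled-range analysis:
    slope of log(R/S) versus log(window size)."""
    x = np.asarray(x, dtype=float)
    rs = []
    for n in window_sizes:
        segments = x[: len(x) // n * n].reshape(-1, n)
        dev = np.cumsum(segments - segments.mean(axis=1, keepdims=True), axis=1)
        R = dev.max(axis=1) - dev.min(axis=1)    # range of cumulative deviations
        S = segments.std(axis=1)                 # standard deviation per window
        rs.append(np.mean(R[S > 0] / S[S > 0]))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(1)
noise = rng.normal(size=4096)                    # white noise: H close to 0.5
print("estimated H:", round(hurst_rs(noise), 2))
```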

12508 Semantic Support for Hypothesis-Based Research from Smart Environment Monitoring and Analysis Technologies

Authors: T. S. Myers, J. Trevathan

Abstract:

Improvements in the data fusion and data analysis phase of research are imperative due to the exponential growth of sensed data. Currently, there are developments in the Semantic Sensor Web community to explore efficient methods for reuse, correlation and integration of web-based data sets and live data streams. This paper describes the integration of remotely sensed data with web-available static data for use in observational hypothesis testing and the analysis phase of research. The Semantic Reef system combines semantic technologies (e.g., well-defined ontologies and logic systems) with scientific workflows to enable hypothesis-based research. A framework is presented for how the data fusion concepts from the Semantic Reef architecture map to the Smart Environment Monitoring and Analysis Technologies (SEMAT) intelligent sensor network initiative. The data collected via SEMAT and the inferred knowledge from the Semantic Reef system are ingested to the Tropical Data Hub for data discovery, reuse, curation and publication.

Keywords: Information architecture, Semantic technologies, Sensor networks, Ontologies.

12507 2D Rigid Registration of MR Scans Using 1D Binary Projections

Authors: Panos D. Kotsas

Abstract:

This paper presents the application of a signal-intensity-independent registration criterion for 2D rigid-body registration of medical images using 1D binary projections. The criterion is defined as the weighted ratio of two projections. The ratio is computed on a pixel-per-pixel basis, and weighting is performed by setting the ratios between one and zero pixels to a standard high value. The mean squared value of the weighted ratio is computed over the union of the one areas of the two projections and is minimized using a Chebyshev polynomial approximation with n=5 points. The sum of the x and y projections is used for translational adjustment and a 45° projection for rotational adjustment. Twenty T1-T2 registration experiments were performed and gave mean errors of 1.19° and 1.78 pixels. The method is suitable for contour/surface matching. Further research is necessary to determine the robustness of the method with regard to threshold, shape and missing data.

Keywords: Medical image, projections, registration, rigid.

12506 Data Migration between Document-Oriented and Relational Databases

Authors: Bogdan Walek, Cyril Klimes

Abstract:

Current tools for data migration between document-oriented and relational databases have several disadvantages. We propose a new approach for data migration between document-oriented and relational databases. During data migration, the relational schema of the target (relational) database is automatically created from a collection of XML documents. The proposed approach is verified on data migration between the document-oriented database IBM Lotus Notes/Domino and a relational database implemented in the relational database management system (RDBMS) MySQL.

Keywords: data migration, database, document-oriented database, XML, relational schema
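
As a rough illustration of the schema-derivation step, the sketch below unions the child-element names of a collection of flat XML documents into one table definition; it is a simplified stand-in (nesting, types and keys ignored), not the authors' tool:

```python
# Sketch: derive a single relational table from a collection of flat XML
# documents by taking the union of their child-element names as columns.
# Simplified illustration; nested structures, data types and keys are ignored.
import xml.etree.ElementTree as ET

docs = [
    "<document><title>Note A</title><author>Smith</author></document>",
    "<document><title>Note B</title><created>2011-05-02</created></document>",
]

columns = []
for xml_text in docs:
    for child in ET.fromstring(xml_text):
        if child.tag not in columns:
            columns.append(child.tag)          # preserve first-seen order

ddl = "CREATE TABLE document (\n  id INT PRIMARY KEY,\n" + \
      ",\n".join(f"  {c} TEXT" for c in columns) + "\n);"
print(ddl)

# Each document then becomes one INSERT, with NULL for missing elements.
```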

12505 Identity Verification Using k-NN Classifiers and Autistic Genetic Data

Authors: Fuad M. Alkoot

Abstract:

DNA data have been used in forensics for decades. However, current research looks at using DNA as a biometric identity verification modality, with the goal of improving the speed of identification. We aim to use gene data that were initially collected for autism detection to find whether, and how accurately, these data can be used for identification applications. Mainly, our goal is to find whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with using the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1, and that the classification rate remains close to optimal for noise standard deviations up to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).

Keywords: Biometrics, identity verification, genetic data, k-nearest neighbor.
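
The evaluation protocol, corrupting the test set with zero-mean Gaussian noise of increasing standard deviation and classifying with k-NN, is straightforward to reproduce; below is a scikit-learn sketch on synthetic features (standing in for the preprocessed gene data, which are not available here):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
# Synthetic stand-in for the preprocessed genetic feature vectors.
X, y = make_classification(n_samples=400, n_features=30, n_informative=20,
                           n_classes=4, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=42)
knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

# Corrupt the test set with zero-mean Gaussian noise of increasing std. dev.
for sd in (0.0, 1.0, 2.0, 3.0, 5.0):
    noisy = X_test + rng.normal(0.0, sd, X_test.shape)
    print(f"noise sd={sd}: accuracy={knn.score(noisy, y_test):.3f}")
```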

12504 The Influencing Factors and the Approach to Enhance the Standard of E-Commerce for Small and Medium Enterprises in Bangkok

Authors: Wanida Suwunniponth

Abstract:

The objectives of this research were to study the factors that contribute to the success of electronic commerce (e-commerce) and to study the approach to enhancing the standard of e-commerce for small and medium enterprises (SMEs). The study focused only on sole-proprietorship SMEs in Bangkok, Thailand. The factors contributing to the success of SMEs included business management, learning in the organization, business collaboration, and the quality of the website. A mixed quantitative and qualitative research methodology was used. For the quantitative part, a questionnaire was used to collect data from 251 sole proprietorships, and structural equation modeling (SEM) was utilized as the tool for data analysis. For the qualitative part, in-depth interviews, a dialogue with experts in the field of e-commerce for SMEs, and content analysis were used. Using the adjusted causal relationship structure model, the factors affecting the success of e-commerce for SMEs were found to be congruent with the empirical data. The hypothesis testing indicated that business management influenced learning in the organization, learning in the organization influenced business collaboration and the quality of the website, and these factors, in turn, influenced the success of the SMEs. Moreover, regarding the approach to enhancing the standard of SMEs, the majority of respondents wanted to enhance the standard of SMEs to a high level in the categories of safety of the e-commerce system, basic e-commerce infrastructure, development of staff potential, budget assistance and tax reduction, and improvement of e-commerce law, respectively.

Keywords: Electronic Commerce, Influencing Factors, Small and Medium Enterprises.

12503 Predicting Dietary Practice Behavior among Type 2 Diabetics Using the Theory of Planned Behavior and Mixed Methods Design

Authors: D.O. Omondi, M.K. Walingo, G.M. Mbagaya, L.O.A. Othuon

Abstract:

This study applied the Theory of Planned Behavior model to predicting dietary behavior among Type 2 diabetics in a Kenyan setting. The study was conducted over three months within the diabetic clinic at Kisii Hospital in Nyanza Province, Kenya, and adopted a sequential mixed methods design combining qualitative and quantitative phases. Qualitative data were analyzed using the grounded theory analysis method. Structural equation modeling using maximum likelihood was used to analyze the quantitative data. The results, based on the common fit indices, revealed that the Theory of Planned Behavior fitted the data acceptably well for Type 2 diabetics and dietary behavior {χ2 = 223.3, df = 77, p = .02, χ2/df = 2.9, n = 237; TLI = .93; CFI = .91; RMSEA (90% CI) = .090 (.039, .146)}. This implies that the Theory of Planned Behavior holds and forms a framework for promoting dietary practice among Type 2 diabetics.

Keywords: Dietary practice, Kenya, Theory of Planned Behavior, Type 2 diabetes, Mixed Methods Design.

12502 Evaluation of Solid Phase Micro-extraction with Standard Testing Method for Formaldehyde Determination

Authors: Y. L. Yung, Kong Mun Lo

Abstract:

In this study, solid phase micro-extraction (SPME) was optimized to improve the sensitivity and accuracy of formaldehyde determination for plywood panels. Further work was carried out to compare the newly developed technique with the existing method, in which formaldehyde collected in desiccators is reacted with acetylacetone reagent (DC-AA). In SPME, formaldehyde was first derivatized with O-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine hydrochloride (PFBHA), and analysis was then performed by gas chromatography in combination with mass spectrometry (GC-MS). SPME data for various wood species gave satisfactory results, with relative standard deviations (RSDs) in the range of 3.1-10.3%. The results were also well correlated with DC values, giving a correlation coefficient, RSQ, of 0.959. The quantitative analysis of formaldehyde by SPME is an alternative with great potential for the wood industry.

Keywords: Formaldehyde, GC-MS, plywood, SPME.

12501 Increased Capacity of Information Hiding in the LSB Method for Text and Image

Authors: H. B. Kekre, Archana Athawale, Pallavi N. Halarnkar

Abstract:

Steganography, derived from Greek, literally means "covered writing". It includes a vast array of secret communication methods that conceal the very existence of the message. These methods include invisible inks, microdots, character arrangement, digital signatures, covert channels, and spread spectrum communications. This paper proposes a new, improved version of the Least Significant Bit (LSB) method. The proposed approach is simple to implement when compared to the Pixel Value Differencing (PVD) method and yet achieves high embedding capacity and imperceptibility. The proposed method can also be applied to 24-bit color images, achieving an embedding capacity much higher than PVD.

Keywords: Information Hiding, LSB Matching, PVD Steganography.
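
For context, the baseline LSB technique that the paper improves upon can be sketched in a few lines; this is plain single-bit LSB substitution, not the authors' higher-capacity variant:

```python
import numpy as np

def lsb_embed(pixels, bits):
    """Replace the least significant bit of each pixel with one message bit."""
    pixels = pixels.copy()
    flat = pixels.ravel()
    if len(bits) > flat.size:
        raise ValueError("message longer than cover capacity")
    flat[: len(bits)] = (flat[: len(bits)] & ~np.uint8(1)) | np.array(bits, dtype=np.uint8)
    return pixels

def lsb_extract(pixels, n_bits):
    """Read back the first n_bits least significant bits."""
    return (pixels.ravel()[:n_bits] & 1).tolist()

cover = np.random.default_rng(0).integers(0, 256, size=(8, 8), dtype=np.uint8)
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = lsb_embed(cover, message)
assert lsb_extract(stego, len(message)) == message
print("max per-pixel change:", int(np.abs(stego.astype(int) - cover.astype(int)).max()))
```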

12500 Rapid Determination of Biochemical Oxygen Demand

Authors: Mayur Milan Kale, Indu Mehrotra

Abstract:

Biochemical Oxygen Demand (BOD) is a measure of the oxygen used in the bacteria-mediated oxidation of organic substances in water and wastewater. Theoretically, an infinite time is required for complete biochemical oxidation of organic matter, but the measurement is made over a 5-day test period at 20 °C or a 3-day test period at 27 °C, with or without dilution. Researchers have worked to further reduce the time of measurement. The objective of this paper is to review advances made in BOD measurement, primarily to minimize the measurement time and overcome the measurement difficulties. The literature on four such techniques is surveyed, namely BOD-BART™, biosensors, the ferricyanide-mediated approach, and the luminous bacteria immobilized chip method. The basic principle, method of determination, data validation, and the advantages and disadvantages of each of the methods are covered. In the BOD-BART™ method, the time lag is calculated for the system to change from an oxidative to a reductive state. Biosensors are biological sensing elements coupled with a transducer that produces a signal proportional to the analyte concentration. Each microbial species has its own metabolic deficiencies; co-immobilization of bacteria in a sol-gel biosensor increases the range of substrates. In the ferricyanide-mediated approach, ferricyanide is used as the electron acceptor instead of oxygen. In the luminous bacterial cells immobilized chip method, bacterial bioluminescence, which is caused by the lux genes, is observed; the physiological response, measured as a reduction or emission of light, is correlated to BOD. There is scope for further work on the rapid estimation of BOD.

Keywords: BOD, Four methods, Rapid estimation

12499 Improvement of a Label Extraction Method for a Risk Search System

Authors: Shigeaki Sakurai, Ryohei Orihara

Abstract:

This paper proposes a method for improving the classification efficiency of a classification model. The model is used in a risk search system and extracts specific labels from articles posted at bulletin board sites, so that the system can analyze the important discussions composed of the articles. The improvement method introduces ensemble learning methods that use multiple classification models, and it introduces expressions related to the specific labels into the generation of word vectors. The paper applies the improvement method to articles collected from three bulletin board sites selected by users and verifies its effectiveness.

Keywords: Text mining, Risk search system, Corporate reputation, Bulletin board site, Ensemble learning

12498 Reversible Watermarking for H.264/AVC Videos

Authors: Yih-Chuan Lin, Jung-Hong Li

Abstract:

In this paper, we propose a reversible watermarking scheme based on histogram shifting (HS) to embed watermark bits into H.264/AVC standard videos by modifying the last nonzero level in the context adaptive variable length coding (CAVLC) domain. The proposed method collects all of the last nonzero coefficients (also called last-level coefficients) of the 4×4 sub-macroblocks in a macroblock and utilizes predictions of the current last level from the neighboring blocks' last levels to embed the watermark bits. The proposed method has low computational cost and the ability of reversible recovery. The experimental results demonstrate that our proposed scheme causes acceptable degradation in video quality and output bit-rate for most test videos.

Keywords: Reversible data hiding, H.264/AVC standard, CAVLC, Histogram shifting
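
Histogram shifting itself, outside the CAVLC context, works by picking a peak value and an empty value in the coefficient histogram, shifting the values between them by one to free the bin next to the peak, and embedding one bit per peak-valued coefficient. A generic sketch on an integer coefficient array (not the H.264-specific scheme):

```python
import numpy as np

def hs_embed(coeffs, bits):
    """Generic histogram-shifting embedding on integer coefficients.
    Returns the marked coefficients plus the (peak, zero) pair needed to revert."""
    c = np.asarray(coeffs).copy()
    hist = np.bincount(c - c.min())
    peak = int(np.argmax(hist)) + c.min()          # most frequent value
    zero_candidates = np.where(hist == 0)[0]
    if zero_candidates.size == 0:
        raise ValueError("no empty bin available")
    zero = int(zero_candidates[0]) + c.min()       # an unused value
    lo, hi = (peak, zero) if peak < zero else (zero, peak)
    step = 1 if zero > peak else -1
    c[(c > lo) & (c < hi)] += step                 # make room next to the peak
    bit_iter = iter(bits)
    for i in range(c.size):                        # embed one bit per peak value
        if c[i] == peak:
            b = next(bit_iter, None)
            if b is None:
                break
            c[i] += step * b
    return c, (peak, zero)

coeffs = np.array([0, 1, 1, 2, 1, 0, 5, 1, 2, 1])
marked, key = hs_embed(coeffs, [1, 0, 1, 1])
print(marked, key)
```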

12497 An Efficient Collocation Method for Solving the Variable-Order Time-Fractional Partial Differential Equations Arising from the Physical Phenomenon

Authors: Haniye Dehestani, Yadollah Ordokhani

Abstract:

In this work, we present an efficient approach, based on Legendre and Laguerre polynomials, for solving variable-order time-fractional partial differential equations. First, we introduce the pseudo-operational matrices of integer-order and variable fractional-order integration by using some properties of the Riemann-Liouville fractional integral. These matrices are then applied, together with the collocation method and Legendre-Laguerre functions, to solve variable-order time-fractional partial differential equations. An estimate of the error is also presented. Finally, we investigate numerical examples arising in physics to demonstrate the accuracy of the present method. A comparison of the results obtained by the present method with the exact solution and with other methods reveals that the method is very effective.

Keywords: Collocation method, fractional partial differential equations, Legendre-Laguerre functions, pseudo-operational matrix of integration.
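
One common form of the variable-order Riemann-Liouville fractional integral that underlies the pseudo-operational matrices (conventions differ in whether the order is evaluated at t or at the integration variable) is

\[
\bigl(I^{\alpha(t)} f\bigr)(t) = \frac{1}{\Gamma\bigl(\alpha(t)\bigr)} \int_{0}^{t} (t-\tau)^{\alpha(t)-1} f(\tau)\,\mathrm{d}\tau, \qquad \alpha(t) > 0,
\]

and, roughly speaking, the pseudo-operational matrices approximate the action of this operator on the Legendre-Laguerre basis functions.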

12496 Adaptive Nonparametric Approach for Guaranteed Real-Time Detection of Targeted Signals in Multichannel Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

An adaptive nonparametric method is proposed for stable real-time detection of seismoacoustic sources in multichannel C-OTDR systems with a significant number of channels. This method guarantees given upper bounds for the probabilities of Type I and Type II errors. Properties of the proposed method are rigorously proved. The results of practical applications of the proposed method in a real C-OTDR system are presented.

Keywords: Adaptive detection, change point, interval estimation, guaranteed detection, multichannel monitoring systems.

12495 Research on a Maintenance Design Method Based on Virtual Maintenance

Authors: Yunbin Yang, Liangli He, Fengjun Wang

Abstract:

The importance of maintenance assessment and maintenance optimization in the design stage is analyzed, and the problems of conventional maintenance design methods are described. MDMVM (Maintenance Design Method based on Virtual Maintenance) is introduced, its process is established, and its architecture is given. The key techniques of MDMVM are analyzed, including maintenance design based on KBE (Knowledge Based Engineering) and virtual maintenance based on physical attributes. With respect to physical properties, physically based modeling, visual object movement control, the simulation of operating forces, and the maintenance sequence planning method are described in detail. A maintenance design system based on virtual maintenance is established on the foundation of this maintenance design method.

Keywords: Digital mock-up, virtual maintenance, knowledge engineering, maintenance sequence planning.

12494 A General Framework for Knowledge Discovery Using High Performance Machine Learning Algorithms

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and the user level. Techniques such as the active contour model for identifying the cardiac chambers, pixel classification for segmenting the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks, such as clustering and classification, with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Coral image database, and the results show that their performance is better than previously reported results.

Keywords: Active Contour, Bayesian, Echocardiographic image, Feature vector.

12493 Assessment of Aminopolyether on 18F-FDG Samples

Authors: Renata L. C. Leão, João E. Nascimento, Natalia C. E. S. Nascimento, Elaine S. Vasconcelos, Mércia L. Oliveira

Abstract:

The quality control procedures of a radiopharmaceutical include the assessment of its chemical purity. The method suggested by international pharmacopeias consists of a thin layer chromatographic run. In this paper, the method proposed by the United States Pharmacopeia (USP) is compared to a direct method to determine the final concentration of aminopolyether in Fludeoxyglucose (18F-FDG) preparations. The approach (no chromatographic run) was achieved by placing the thin-layer chromatography (TLC) plate directly on an iodine vapor chamber. Both methods were validated and they showed adequate results to determine the concentration of aminopolyether in 18F-FDG preparations. However, the direct method is more sensitive, faster and simpler when compared to the reference method (with chromatographic run), and it may be chosen for use in routine quality control of 18F-FDG.

Keywords: Chemical purity, Kryptofix 222, thin layer chromatography, validation.

12492 Guaranteed Detection of the Seismoacoustic Emission Source in C-OTDR Systems

Authors: Andrey V. Timofeev

Abstract:

A method is proposed for the stable detection of seismoacoustic sources in C-OTDR systems that guarantees given upper bounds for the probabilities of Type I and Type II errors. Properties of the proposed method are rigorously proved. The results of practical applications of the proposed method in a real C-OTDR system are presented.

Keywords: Guaranteed detection, C-OTDR systems, change point, interval estimation.

12491 Typical Day Prediction Model for Output Power and Energy Efficiency of a Grid-Connected Solar Photovoltaic System

Authors: Yan Su, L. C. Chan

Abstract:

A novel typical day prediction model has been built and validated against measured data from a grid-connected solar photovoltaic (PV) system in Macau. Unlike the conventional statistical method used in previous studies of PV systems, which obtains results by averaging nearby continuous points, the present typical-day statistical method obtains the value at every minute of a typical day by averaging discontinuous points at the same minute on different days. This typical-day statistical method based on discontinuous-point averaging makes it possible to obtain the Gaussian-shaped dynamic distributions of solar irradiance and output power in a yearly or monthly typical day. Based on the yearly typical-day statistical analysis, the maximum possible accumulated output energy in a year under on-site climate conditions and the corresponding optimal PV system running time are obtained. Periodic Gaussian-shape prediction models for solar irradiance, output energy and system energy efficiency have been built, and their coefficients have been determined from the yearly and the maximum and minimum monthly typical-day Gaussian distribution parameters, which are obtained by iterating for the minimum Root Mean Squared Deviation (RMSD). With the present model, the dynamic effects due to the time of day are kept, while the day-to-day uncertainty due to changing weather is smoothed but still included. The periodic Gaussian-shape correlations for solar irradiance, output power and system energy efficiency compare favorably with data from the PV system in Macau and prove to be an improvement over previous models.

Keywords: Grid Connected, RMSD, Solar PV System, Typical Day.
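
The core typical-day statistic, averaging readings taken at the same minute of the day across different days, and the Gaussian-shape fit can be sketched on hypothetical minute-resolution data as follows (illustrative only; the data and fitting details are assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical one-year record of PV output power (kW) at one-minute resolution,
# stored day by day: 365 days x 1440 minutes. Illustrative data only.
rng = np.random.default_rng(0)
minute = np.arange(1440)
clear_sky = 40 * np.exp(-((minute - 740) ** 2) / (2 * 180.0 ** 2))
power = clear_sky * rng.uniform(0.3, 1.0, (365, 1))    # day-to-day weather factor

# Typical-day statistic: average the discontinuous points that share the same
# minute of the day across different days (not nearby points within one day).
typical_day = power.mean(axis=0)

# Fit a Gaussian shape P(m) = A * exp(-(m - m0)^2 / (2 s^2)) to the typical day
# and report the RMSD of the fit.
gauss = lambda m, A, m0, s: A * np.exp(-((m - m0) ** 2) / (2 * s ** 2))
(A, m0, s), _ = curve_fit(gauss, minute, typical_day, p0=(30, 720, 200))
rmsd = np.sqrt(np.mean((typical_day - gauss(minute, A, m0, s)) ** 2))
print(f"peak {A:.1f} kW at minute {m0:.0f}, width {s:.0f} min, RMSD {rmsd:.3f} kW")
```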

12490 Power Saving System in Green Data Center

Authors: Joon-young Jung, Dong-oh Kang, Chang-seok Bae

Abstract:

Power consumption is increasing rapidly in data centers because the number of data centers is growing and their scale is becoming larger. Therefore, reducing power consumption in data centers is a key research topic. The peak power of a typical server is around 250 watts. When a server is idle, it continues to use around 60% of the power consumed when in use, though vendors are putting effort into reducing this "idle" power load. Servers tend to work at only around a 5% to 20% utilization rate, partly because of response time concerns, and an average of 10% of servers in data centers are unused. For these reasons, we propose a dynamic power management system to reduce power consumption in a green data center. Experimental results show that about 55% of power consumption is reduced at idle time.

Keywords: Data Center, Green IT, Management Server, Power Saving.

12489 The Classification Performance of Parametric and Nonparametric Discriminant Analysis for Class-Unbalanced Data of Diabetes Risk Groups

Authors: Lily Ingsrisawang, Tasanee Nacharoen

Abstract:

The problems arising from unbalanced data sets generally appear in real-world applications. Due to unequal class distributions, many researchers have found that the performance of existing classifiers tends to be biased towards the majority class. The k-nearest neighbors nonparametric discriminant analysis is a method that has been proposed for classifying unbalanced classes with good performance. In this study, discriminant analysis methods are of interest for investigating misclassification error rates for class-imbalanced data of three diabetes risk groups. The purpose of this study was to compare the classification performance of parametric and nonparametric discriminant analysis in a three-class classification of class-imbalanced data of diabetes risk groups. Data were obtained from a project maintaining healthy conditions for 599 employees of a government hospital in Bangkok. The employees were divided into three diabetes risk groups: non-risk (90%), risk (5%), and diabetic (5%). The original data, including the variables of diabetes risk group, age, gender, blood glucose, and BMI, were analyzed and bootstrapped into 50 and 100 samples, with 599 observations per sample, for additional estimation of the misclassification error rate. Each data set was explored for departure from multivariate normality and for equality of the covariance matrices of the three risk groups. Both the original data and the bootstrap samples showed non-normality and unequal covariance matrices. The parametric linear discriminant function, the quadratic discriminant function, and the nonparametric k-nearest neighbors discriminant function were applied to the 50 and 100 bootstrap samples and to the original data. In searching for the optimal classification rule, the prior probabilities were set to equal proportions (0.33:0.33:0.33) and to unequal proportions of (0.90:0.05:0.05), (0.80:0.10:0.10), and (0.70:0.15:0.15). The results from the 50 and 100 bootstrap samples indicated that the k-nearest neighbors approach with k=3 or k=4 and prior probabilities of non-risk:risk:diabetic set to 0.90:0.05:0.05 or 0.80:0.10:0.10 gave the smallest misclassification error rate. The k-nearest neighbors approach is therefore suggested for classifying three-class imbalanced data of diabetes risk groups.

Keywords: Bootstrap, diabetes risk groups, error rate, k-nearest neighbors.
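
A compact sketch of the comparison described above, LDA, QDA and 3-NN evaluated on bootstrap samples of an imbalanced three-class data set, could look like this in scikit-learn (synthetic 90:5:5 data stand in for the hospital data, and the k-NN prior weighting used in the study is not reproduced):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for the diabetes risk data: three classes in 90:5:5 proportions.
X, y = make_classification(n_samples=599, n_features=4, n_informative=4,
                           n_redundant=0, n_classes=3, weights=[0.90, 0.05, 0.05],
                           random_state=1)

classifiers = {
    "LDA (0.90/0.05/0.05 priors)": LinearDiscriminantAnalysis(priors=[0.90, 0.05, 0.05]),
    "QDA (0.90/0.05/0.05 priors)": QuadraticDiscriminantAnalysis(priors=[0.90, 0.05, 0.05]),
    "3-NN": KNeighborsClassifier(n_neighbors=3),
}

# Estimate misclassification error over 50 bootstrap samples (train on the
# bootstrap sample, evaluate on the out-of-bag observations).
rng = np.random.default_rng(1)
for name, clf in classifiers.items():
    errors = []
    for _ in range(50):
        idx = rng.integers(0, len(y), len(y))
        oob = np.setdiff1d(np.arange(len(y)), idx)
        clf.fit(X[idx], y[idx])
        errors.append(1 - clf.score(X[oob], y[oob]))
    print(f"{name}: mean misclassification rate {np.mean(errors):.3f}")
```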
