Search results for: Extraction and data integration
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8495


6035 Hybrid Approach for Country’s Performance Evaluation

Authors: C. Slim

Abstract:

This paper presents an integrated model that hybridizes data envelopment analysis (DEA) and support vector machines (SVM) to classify countries according to their efficiency and performance. The model takes into account multi-dimensional indicators, the decision-making hierarchy and the relativity of measurement. Starting from a set of performance indicators as exhaustive as possible, a process of successive aggregations has been developed to attain an overall evaluation of a country's competitiveness.
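A minimal illustrative sketch of the hybrid idea follows (not the authors' implementation). It assumes DEA efficiency scores have already been computed for each country and feeds them, together with aggregated indicators, to a scikit-learn SVM classifier; the synthetic data, the threshold and all variable names are hypothetical placeholders.

```python
# Hypothetical sketch: SVM classification of countries from aggregated
# performance indicators plus precomputed DEA efficiency scores.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in data: 50 countries, 6 aggregated indicators each.
indicators = rng.normal(size=(50, 6))
# DEA efficiency scores in (0, 1]; random here, in practice solved by DEA LPs.
dea_scores = rng.uniform(0.4, 1.0, size=50)

# Label countries as "efficient" (1) if their DEA score exceeds a threshold.
labels = (dea_scores > 0.8).astype(int)

# Features = aggregated indicators augmented with the DEA score.
X = np.column_stack([indicators, dea_scores])

# RBF-kernel SVM classifies countries by performance class.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X, labels)
print("training accuracy:", model.score(X, labels))
```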

Keywords: Artificial neural networks, support vector machine, data envelopment analysis, aggregations, indicators of performance.

6034 Performance Comparison of Situation-Aware Models for Activating Robot Vacuum Cleaner in a Smart Home

Authors: Seongcheol Kwon, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

We assume an IoT-based smart-home environment where the on-off status of each of the electrical appliances, including the room lights, can be recognized in real time by monitoring and analyzing the smart meter data. At any moment in such an environment, we can recognize what the household or the user is doing by referring to the status data of the appliances. In this paper, we focus on a smart-home service that activates a robot vacuum cleaner at the right time by recognizing the user situation, which requires a situation-aware model that can distinguish the situations that allow vacuum cleaning (Yes) from those that do not (No). As candidate models we train a few classifiers, such as naïve Bayes, decision tree, and logistic regression, that map the appliance-status data into Yes and No situations. Our training and test data are obtained from simulations of user behaviors, in which a sequence of user situations such as cooking, eating, dish washing, and so on is generated, with the status of the relevant appliances changed in accordance with the situation changes. During the simulation, both the situation transition and the resulting appliance status are determined stochastically. To compare the performances of the aforementioned classifiers, we obtain their learning curves for different types of users through simulations. The result of our empirical study reveals that naïve Bayes achieves a slightly better classification accuracy than the other compared classifiers.
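The comparison described above can be sketched with scikit-learn as follows. The simulated appliance-status data and the toy labelling rule are hypothetical stand-ins for the paper's behaviour simulator, not its actual setup.

```python
# Hypothetical sketch: comparing classifiers on simulated appliance-status data.
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Simulated status vectors for 8 appliances (1 = on, 0 = off); 2000 samples.
X = rng.integers(0, 2, size=(2000, 8))
# Toy rule standing in for the simulator: cleaning is allowed ("Yes" = 1)
# when the TV (col 0) and the stove (col 1) are both off, with 5% label noise.
y = ((X[:, 0] == 0) & (X[:, 1] == 0)).astype(int)
flip = rng.random(2000) < 0.05
y[flip] = 1 - y[flip]

for name, clf in [("naive Bayes", BernoulliNB()),
                  ("decision tree", DecisionTreeClassifier(max_depth=5)),
                  ("logistic regression", LogisticRegression(max_iter=1000))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```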

Keywords: Situation-awareness, Smart home, IoT, Machine learning, Classifier.

6033 Correspondence between Function and Interaction in Protein Interaction Network of Saccharomyces cerevisiae

Authors: Nurcan Tuncbag, Turkan Haliloglu, Ozlem Keskin

Abstract:

Understanding the cell's large-scale organization is an interesting task in computational biology, and protein-protein interactions can reveal important aspects of the organization and function of the cell. Here, we investigated the correspondence between protein interactions and function in yeast. We obtained the correlations among the set of proteins, and these correlations were clustered using both hierarchical and biclustering methods. A detailed analysis of the proteins in each cluster was carried out by making use of their functional annotations. As a result, we found that some functional classes appear together in almost all biclusters, whereas in hierarchical clustering the dominance of one functional class is observed. In the light of the clustering data, we have verified some interactions which were not identified as core interactions in DIP, and we have also characterized some functionally unknown proteins according to the interaction data and functional correlation. In brief, going from interaction data to function, we observe correlations between interaction and function which might give clues about the organization of the proteins and help to predict new interactions and to characterize the functions of unknown proteins.

Keywords: Pair-wise protein interactions, DIP database, functional correlations, biclustering.

6032 New Simultaneous High Performance Liquid Chromatographic Method for Determination of NSAIDs and Opioid Analgesics in Advanced Drug Delivery Systems and Human Plasma

Authors: Asad Ullah Madni, Mahmood Ahmad, Naveed Akhtar, Muhammad Usman

Abstract:

A new and cost-effective RP-HPLC method was developed and validated for the simultaneous analysis of the non-steroidal anti-inflammatory drugs diclofenac sodium (DFS) and flurbiprofen (FLP) and the opioid analgesic tramadol (TMD) in advanced drug delivery systems (liposomes and microcapsules), marketed brands and human plasma. An isocratic system was employed for the flow of the mobile phase, consisting of 10 mM sodium dihydrogen phosphate buffer and acetonitrile in a molar ratio of 67:33 with the pH adjusted to 3.2. The stationary phase was a Hypersil ODS column (C18, 250×4.6 mm i.d., 5 μm) at a controlled temperature of 30 °C. DFS in liposomes, microcapsules and marketed drug products was determined in the range of 99.76-99.84%. FLP and TMD in microcapsules and brand formulations were 99.78-99.94% and 99.80-99.82%, respectively. A single-step liquid-liquid extraction procedure using a combination of acetonitrile and trichloroacetic acid (TCA) as protein-precipitating agents was employed. The detection limits (at S/N ratio 3) of quality control solutions and plasma samples were 10, 20, and 20 ng/ml for DFS, FLP and TMD, respectively. The assay was acceptable over the linear dynamic range. All other validation parameters were within the limits of the FDA and ICH method validation guidelines. The proposed method is sensitive, accurate and precise and could be applied for routine analysis in the pharmaceutical industry as well as to human plasma samples for bioequivalence and pharmacokinetic studies.

Keywords: Diclofenac Sodium, Flurbiprofen, Tramadol, HPLC-UV detection, Validation.

6031 Genetic Programming Based Data Projections for Classification Tasks

Authors: César Estébanez, Ricardo Aler, José M. Valls

Abstract:

In this paper we present a GP-based method to automatically evolve projections, so that data can be more easily classified in the projected spaces. At the same time, our approach can reduce dimensionality by constructing more relevant attributes. The fitness of each projection measures how easy it is to classify the dataset after applying the projection; this is quickly computed by a simple linear perceptron. We have tested our approach in three domains. The experiments show that it obtains good results, compared to other machine learning approaches, while reducing dimensionality in many cases.
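The fitness evaluation step can be sketched as follows. Random linear projections stand in for GP-evolved expressions, and scikit-learn's Perceptron supplies the simple-linear-perceptron fitness; this illustrates only the scoring idea, not the authors' GP system.

```python
# Hypothetical sketch: scoring candidate projections by perceptron accuracy.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(2)

def fitness(projection_matrix):
    """Project the data and return the cross-validated accuracy of a
    simple linear perceptron on the projected space (higher is fitter)."""
    Z = X @ projection_matrix
    return cross_val_score(Perceptron(max_iter=1000), Z, y, cv=5).mean()

# Stand-in for GP individuals: random 4-D -> 2-D linear projections.
population = [rng.normal(size=(4, 2)) for _ in range(20)]
scores = [fitness(p) for p in population]
best = int(np.argmax(scores))
print(f"best projection fitness: {scores[best]:.3f}")
```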

Keywords: Classification, genetic programming, projections.

6030 Molecular Identification of ESBL Genes blaGES-1, blaVEB-1, blaCTX-M, blaOXA-1, blaOXA-4, blaOXA-10 and blaPER-1 in Pseudomonas aeruginosa Strains Isolated from Burn Patients by PCR, RFLP and Sequencing Techniques

Authors: Fereshteh Shacheraghi, Mohammad Reza Shakibaie, Hanieh Noveiri

Abstract:

Forty-one strains of ESBL-producing P. aeruginosa, previously isolated from burn patients in the Kerman University general hospital, Iran, were subjected to PCR, RFLP and sequencing in order to determine the type of extended-spectrum β-lactamase (ESBL), the restriction digestion pattern and the possibility of mutation among the detected genes. DNA extraction was carried out by the phenol-chloroform method. PCR for detection of bla genes was performed using specific primers for each gene. Restriction Fragment Length Polymorphism (RFLP) analysis of the ESBL genes was carried out using the EcoRI, NheI, PvuII, EcoRV, DdeI, and PstI restriction enzymes. The PCR products were subjected to direct sequencing of both strands for identification of the ESBL genes. The blaCTX-M, blaVEB-1, blaPER-1, blaGES-1, blaOXA-1, blaOXA-4 and blaOXA-10 genes were detected in 2.43% (n=1), 100% (n=41), 68.3% (n=28), 24.4% (n=10), 70.7% (n=29), 17.1% (n=7) and 92.7% (n=38) of the ESBL-producing isolates, respectively. The RFLP analysis showed that each ESBL gene had an identical digestion pattern among the isolated strains. Sequencing of the ESBL genes confirmed the identity of the PCR products and revealed no mutation in the restriction sites of the above genes. From the results of the present investigation it can be concluded that blaVEB-1 and blaCTX-M were the most and the least frequently isolated ESBL genes among the P. aeruginosa strains isolated from burn patients. The RFLP and sequencing analysis revealed that the same clones of the bla genes indeed existed among the antibiotic-resistant strains.

Keywords: ESBL genes, PCR, RFLP, Sequencing, P. aeruginosa

6029 Development of a Serial Signal Monitoring Program for Educational Purposes

Authors: Jungho Moon, Lae-Jeong Park

Abstract:

This paper introduces a signal monitoring program developed with a view to helping electrical engineering students become familiar with sensors that have digital output. Because the output of digital sensors cannot simply be monitored by a measuring instrument such as an oscilloscope, students tend to have a hard time dealing with digital sensors. The monitoring program runs on a PC and communicates with an MCU that reads the output of digital sensors via an asynchronous communication interface. Receiving the sensor data from the MCU, the monitoring program shows time and/or frequency domain plots of the data in real time. In addition, the monitoring program provides a serial terminal that enables the user to exchange text information with the MCU while the received data are plotted. The user can easily observe the output of digital sensors and configure them in real time, which helps students who do not have enough experience with digital sensors. Although the monitoring program was written in the MATLAB programming language, it runs without MATLAB since it was compiled as a standalone executable.

Keywords: Digital sensor, MATLAB, MCU, signal monitoring program.

6028 A Novel Fuzzy Logic Based Controller to Adjust the Brightness of the Television Screen with Respect to Surrounding Light

Authors: A. V. Sai Balasubramanian, N. Ravi Shankar, S. Subbaraman, R. Rengaraj

Abstract:

One of the major causes of eye strain and other problems while watching television is the relative illumination between the screen and its surroundings. This can be overcome by adjusting the brightness of the screen with respect to the surrounding light. A controller based on fuzzy logic is proposed in this paper. The fuzzy controller takes the intensity of the light surrounding the screen and the present brightness of the screen as inputs. The output of the fuzzy controller is the grid voltage corresponding to the required brightness. This voltage is applied to the CRT and the brightness is controlled dynamically. For the given test system data, different de-fuzzifier methods have been implemented and the results are compared. In order to validate the effectiveness of the proposed approach, a fuzzy controller has been designed using test data obtained from a real-time system. The simulations are performed in MATLAB and are verified with standard system data. The proposed approach can be implemented for real-time applications.
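The inference and defuzzification idea can be sketched as below with triangular membership functions and centroid (center-of-gravity) defuzzification. The membership ranges and the three-rule base are hypothetical, not the rule base or de-fuzzifier set used in the paper.

```python
# Hypothetical sketch: fuzzy brightness control with centroid defuzzification.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

ambient = 0.7   # surrounding light intensity, normalized to [0, 1]
current = 0.3   # present screen brightness, normalized to [0, 1]

# Toy rule base (not the paper's): dark room -> low grid voltage,
# medium room light -> medium voltage, bright room & dim screen -> high voltage.
w_low    = tri(ambient, -0.5, 0.0, 0.5)
w_medium = tri(ambient, 0.2, 0.5, 0.8)
w_high   = min(tri(ambient, 0.5, 1.0, 1.5), tri(current, -0.5, 0.0, 0.5))

# Output universe: normalized grid voltage with low/medium/high fuzzy sets.
v = np.linspace(0.0, 1.0, 201)
aggregated = np.maximum.reduce([
    np.minimum(w_low,    tri(v, -0.4, 0.0, 0.4)),
    np.minimum(w_medium, tri(v, 0.2, 0.5, 0.8)),
    np.minimum(w_high,   tri(v, 0.6, 1.0, 1.4)),
])

# Centroid defuzzification gives the crisp grid-voltage command.
voltage = np.sum(v * aggregated) / np.sum(aggregated)
print(f"normalized grid voltage: {voltage:.2f}")
```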

Keywords: Fuzzy controller, Grid voltage

6027 Estimation of the Upper Tail Dependence Coefficient for Insurance Loss Data Using an Empirical Copula-Based Approach

Authors: Adrian O’Hagan, Robert McLoughlin

Abstract:

Considerable focus in the world of insurance risk quantification is placed on modeling loss values from lines of business (LOBs) that possess upper tail dependence. Copulas such as the Joe, Gumbel and Student-t copula may be used for this purpose. The copula structure imparts a desired level of tail dependence on the joint distribution of claims from the different LOBs. Alternatively, practitioners may possess historical or simulated data that already exhibit upper tail dependence, through the impact of catastrophe events such as hurricanes or earthquakes. In these circumstances, it is not desirable to induce additional upper tail dependence when modeling the joint distribution of the loss values from the individual LOBs. Instead, it is of interest to accurately assess the degree of tail dependence already present in the data. The empirical copula and its associated upper tail dependence coefficient are presented in this paper as robust, efficient means of achieving this goal.
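A minimal sketch of such a nonparametric estimate follows, using pseudo-observations (normalized ranks) to form the empirical copula and evaluating lambda_U(u) = (1 - 2u + C_n(u,u)) / (1 - u) near u = 1. This is one common estimator form applied to synthetic data; it is not necessarily the exact estimator or data used in the paper.

```python
# Hypothetical sketch: empirical-copula estimate of upper tail dependence.
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for two LOBs' losses sharing common catastrophe shocks.
common = rng.pareto(3.0, size=5000)
x = rng.pareto(3.0, size=5000) + common
y = rng.pareto(3.0, size=5000) + common

# Pseudo-observations (normalized ranks) define the empirical copula.
n = len(x)
u = np.argsort(np.argsort(x)) / (n + 1.0)
v = np.argsort(np.argsort(y)) / (n + 1.0)

def empirical_copula(t):
    """C_n(t, t): fraction of points with both pseudo-observations <= t."""
    return np.mean((u <= t) & (v <= t))

def lambda_upper(t):
    """Nonparametric upper tail dependence estimate at threshold t < 1."""
    return (1.0 - 2.0 * t + empirical_copula(t)) / (1.0 - t)

for t in (0.90, 0.95, 0.99):
    print(f"lambda_U estimate at u={t}: {lambda_upper(t):.3f}")
```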

Keywords: Empirical copula, extreme events, insurance loss reserving, upper tail dependence coefficient.

6026 Investigation of Scour Depth at Bridge Piers Using the BRI-STARS Model in Iran

Authors: Gh. Saeidifar, F. Raeiszadeh

Abstract:

The BRI-STARS (BRIdge Stream Tube model for Alluvial River Simulation) program was used to investigate the scour depth around bridge piers in some of the major river systems in Iran. Model calibration was performed by collecting different field data. The field data are cataloged in three categories: bridges whose river beds are formed of fine material, bridges whose river beds are formed of sand, and bridges whose river beds are formed of gravel or cobble materials. Verification was performed with field data from Fars Province. Results show that for wide piers, the computed scour depth is greater than the measured one. In gravel-bed streams, the computed scour depth is greater than the measured scour depth because an armor layer forms on the channel bed; once this layer is eroded, the computed scour depth is close to the measured one.

Keywords: BRI-STARS, local scour, bridge, computer modeling

6025 Effects of Global Warming on Climate Change in Udon Thani Province over the 60-Year Period A.D. 1951-2010

Authors: T. Santiboon

Abstract:

This research investigated, determined, and analyzed the characteristic climate changes in Udon Thani Province over the 60-year period from 1951 to 2010 A.D., using climatological data to assess the effects of global warming. No statistically significant trend was found for the raw 60-year data (R2 < 0.81). Statistically significant results were found, at the 0.001 level (R2 = 1.00), after the data were adjusted to follow the 11-year sunspot cycle. These results indicate that Udon Thani's weather has changed: temperatures and evaporation increased, while rainfall, the number of rainy days, cyclone storms, wind speed, humidity, and the forest assessment decreased. The effects of thermal energy from solar radiation and from human activities, which follow the sunspot cycle, can be projected from the past into the future to anticipate climate change and the global warming effect on the world.

Keywords: Climate Change, Global Warming, Udon Thani Province Weather

6024 Intelligent Assistive Methods for Diagnosis of Rheumatoid Arthritis Using Histogram Smoothing and Feature Extraction of Bone Images

Authors: SP. Chokkalingam, K. Komathy

Abstract:

Advances in the field of image processing envision a new era of evaluation techniques and application of procedures in various fields, one of which is the biomedical field for the prognosis as well as diagnosis of diseases. Although this plethora of methods provides a wide range of options to select from, it also creates confusion in selecting the most suitable process. Our objective is to use a series of techniques on bone scans so as to detect the occurrence of rheumatoid arthritis (RA) as accurately as possible. Amongst the techniques existing in the field, our proposed system tends to be more effective as it depends on new methodologies that have proved to be better and more consistent than others. Computer-aided diagnosis provides a more accurate and consistent rate of evaluation that helps to improve the efficiency of the system. The image first undergoes histogram smoothing and specification, a morphing operation, boundary detection by an edge-following algorithm, and finally image subtraction to determine the presence of rheumatoid arthritis in a more efficient and effective way. During preprocessing, noise is removed from the images; segmentation then locates the region of interest, and histogram smoothing is applied to a specific portion of the images. Gray-level co-occurrence matrix (GLCM) features such as mean, median, energy and correlation, together with bone mineral density (BMD), are then extracted and stored in a database. This dataset is trained with inflamed and non-inflamed values, and with the help of a neural network all new images are checked for their status; a rough-set approach is implemented for further reduction.
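The GLCM feature extraction step can be sketched with scikit-image as below. The function names (graycomatrix, graycoprops) follow recent scikit-image releases, the region of interest is a synthetic placeholder, mean and median are taken directly from the region as first-order statistics, and BMD is omitted since it is not a GLCM texture property.

```python
# Hypothetical sketch: GLCM texture features for a region of interest.
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

rng = np.random.default_rng(4)

# Stand-in for a preprocessed, histogram-smoothed bone-scan ROI (8-bit).
roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# Co-occurrence matrix for one pixel offset and four directions.
glcm = graycomatrix(roi, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)

features = {
    "energy":      graycoprops(glcm, "energy").mean(),
    "correlation": graycoprops(glcm, "correlation").mean(),
    "contrast":    graycoprops(glcm, "contrast").mean(),
    # First-order statistics taken directly from the ROI.
    "mean":        float(roi.mean()),
    "median":      float(np.median(roi)),
}
print(features)
```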

Keywords: Computer Aided Diagnosis, Edge Detection, Histogram Smoothing, Rheumatoid Arthritis.

6023 Optimal Current Control of Externally Excited Synchronous Machines in Automotive Traction Drive Applications

Authors: Oliver Haala, Bernhard Wagner, Maximilian Hofmann, Martin Marz

Abstract:

The excellent suitability of the externally excited synchronous machine (EESM) for automotive traction drive applications is justified by its high efficiency over the whole operating range and the high availability of materials. Usually, maximum efficiency is obtained by modelling each single loss and minimizing the sum of all losses. As a result, the quality of the optimization highly depends on the precision of the model. Moreover, it requires accurate knowledge of the saturation-dependent machine inductances. Therefore, the present contribution proposes a method to minimize the overall losses of a salient-pole EESM and its inverter in steady-state operation based on measurement data only. Since this method does not require any manufacturer data, it is well suited for automated measurement data evaluation and inverter parametrization. The field-oriented control (FOC) of an EESM provides three current components, i.e. three degrees of freedom (DOF). An analytic minimization of the copper losses in the stator and the rotor (assuming constant inductances) is performed and serves as a first approximation of how to choose the optimal current reference values. After a numeric offline minimization of the overall losses based on measurement data, the results are compared to a control strategy that satisfies cos(ϕ) = 1.

Keywords: Current control, efficiency, externally excited synchronous machine, optimization.

6022 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average process of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) model are derived iteratively from some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is also proposed, in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment, and the mean estimates of the model parameters obtained are all efficient based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrols, speed cameras, traffic lights and roundabouts. The BINARMA(1,1) model is applied to the accident data, and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step-ahead forecasts.
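The full bivariate, non-stationary BINARMA(1,1) with correlated Poisson innovations and its CML estimator are beyond a short listing, but the binomial thinning operator at its core can be sketched as below with a simplified univariate INARMA(1,1)-type recursion; all parameter values are hypothetical.

```python
# Hypothetical sketch: binomial thinning and a univariate INARMA(1,1)-type
# recursion with Poisson innovations (a simplified stand-in for BINARMA(1,1)).
import numpy as np

rng = np.random.default_rng(5)

def thin(alpha, count):
    """Binomial thinning operator: alpha o count ~ Binomial(count, alpha)."""
    return rng.binomial(count, alpha)

a, b, lam, T = 0.5, 0.3, 2.0, 500   # hypothetical parameters
x = np.zeros(T, dtype=int)
eps_prev = rng.poisson(lam)

for t in range(1, T):
    eps = rng.poisson(lam)                          # Poisson innovation
    x[t] = thin(a, x[t - 1]) + eps + thin(b, eps_prev)
    eps_prev = eps

print("sample mean:", x.mean(), "sample variance:", x.var())
print("lag-1 autocorrelation:", np.corrcoef(x[:-1], x[1:])[0, 1])
```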

Keywords: Non-stationary, BINARMA(1, 1) model, Poisson Innovations, CML

6021 Development of an ArcGIS Toolbar for Trend Analysis of Climatic Data

Authors: Arnab Bandyopadhyay, Anubhab Pal, Subhajit Debnath

Abstract:

Climate change is a cumulative change in weather patterns over a period of time. Trend analysis using the non-parametric Mann-Kendall test may help to determine the existence and magnitude of any statistically significant trend in climatic data, and another index, the Sen slope, may be used to quantify the magnitude of such trends. A toolbar extension to ESRI ArcGIS named Arc Trends has been developed in this study for performing the above-mentioned tasks. To study the temporal trend of meteorological parameters, 32 years (1971-2002) of monthly meteorological data were collected for 133 selected stations over different agro-ecological regions of India. Both the maximum and minimum temperatures were found to be rising. A significant increasing trend in the relative humidity and a consistent significant decreasing trend in the wind speed all over the country were found. However, a general increase in rainfall was not found in recent years.
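The two statistics behind such a toolbar can be sketched as below (the ArcGIS toolbar itself is not reproduced). The Mann-Kendall Z score uses the normal approximation without tie correction, and the example series is synthetic.

```python
# Hypothetical sketch: Mann-Kendall trend test and Sen's slope for a series.
import numpy as np
from scipy.stats import norm

def mann_kendall(y):
    """Return the Mann-Kendall S statistic, Z score and two-sided p-value
    (normal approximation, no tie correction)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    s = sum(np.sign(y[j] - y[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, z, 2 * (1 - norm.cdf(abs(z)))

def sen_slope(y):
    """Median of all pairwise slopes (y[j] - y[i]) / (j - i), j > i."""
    y = np.asarray(y, dtype=float)
    slopes = [(y[j] - y[i]) / (j - i)
              for i in range(len(y) - 1) for j in range(i + 1, len(y))]
    return np.median(slopes)

# Example: a noisy warming-like series of 32 annual mean temperatures.
rng = np.random.default_rng(6)
series = 25.0 + 0.03 * np.arange(32) + rng.normal(0, 0.2, 32)
print("S, Z, p:", mann_kendall(series))
print("Sen slope (deg/yr):", sen_slope(series))
```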

Keywords: Temporal trend, climate change, ArcGIS, Mann-Kendall test, Sen slope

6020 A Survey of Various Algorithms for VLSI Physical Design

Authors: Rajine Swetha R, B. Shekar Babu, Sumithra Devi K.A

Abstract:

Electronic systems are at the core of everyday life. They form an integral part of financial networks, mass transit, telephone systems, power plants and personal computers, and they are increasingly based on complex VLSI (Very Large Scale Integration) integrated circuits. Electronic design automation is concerned with the design and production of VLSI systems. The next important step in creating a VLSI circuit is physical design. The input to physical design is a logical representation of the system under design; the output of this step is the layout of a physical package that optimally or near-optimally realizes the logical representation. Physical design problems are combinatorial in nature and of large problem size. Darwin observed that, as variations are introduced into a population with each new generation, the less-fit individuals tend to die out in the competition for basic necessities, and this survival-of-the-fittest principle leads to the evolution of species. The objective of Genetic Algorithms (GAs) is to find an optimal solution to a problem. Since GAs are heuristic procedures that can function as optimizers, they are not guaranteed to find the optimum, but they are able to find acceptable solutions for a wide range of problems. This survey paper aims at a study of efficient algorithms for VLSI physical design and observes the common traits of the superior contributions.

Keywords: Genetic Algorithms, Physical Design, VLSI.

6019 Asymmetrical Informative Estimation for Macroeconomic Model: Special Case in the Tourism Sector of Thailand

Authors: Chukiat Chaiboonsri, Satawat Wannapan

Abstract:

This paper applies an asymmetric-information concept to the estimation of a macroeconomic model of the tourism sector in Thailand. The variables analyzed statistically are Thailand's international and domestic tourism revenues, the expenditures of foreign and domestic tourists, service investments by the private sector, service investments by the government of Thailand, Thailand's service imports and exports, and net service income transfers. All data are time-series indices observed between 2002 and 2015. Empirically, the tourism multiplier and accelerator were estimated by two statistical approaches. The first was the Generalized Method of Moments (GMM) model, based on the assumption that the tourism market in Thailand has perfect information (symmetrical data). The second was the Maximum Entropy Bootstrapping approach (MEboot), based on a process that attempts to deal with imperfect information and to reduce uncertainty in the data observations (asymmetrical data). In addition, tourism leakages were investigated by a simple model based on the injections-and-leakages concept. The empirical findings show that the parameters computed from the MEboot approach differ from those of the GMM method. However, both the MEboot estimation and the GMM model suggest that Thailand's tourism sector is in a period capable of stimulating the economy.

Keywords: Thailand tourism, maximum entropy bootstrapping approach, macroeconomic model, asymmetric information.

6018 Experimental Evaluation of Methane Adsorption on Granular Activated Carbon (GAC) and Determination of Model Isotherm

Authors: M. Delavar, A.A. Ghoreyshi, M. Jahanshahi, M. Irannejad

Abstract:

This study investigates the capacity of granular activated carbon (GAC) for the storage of methane through equilibrium adsorption. An experimental apparatus consisting of a dual adsorption vessel was set up for the measurement of the equilibrium adsorption of methane on GAC using the volumetric technique (pressure decay). Experimental isotherms of methane adsorption were determined by measuring the equilibrium uptake of methane at different pressures (0-50 bar) and temperatures (285.15-328.15 K). The experimental data were fitted to the Freundlich and Langmuir equations to determine the model isotherm. The results show that the experimental data are equally well fitted by both model isotherms. Using the experimental data obtained at different temperatures, the isosteric heat of methane adsorption was also calculated by the Clausius-Clapeyron equation from the Sips isotherm model. The results show that decreasing the temperature or increasing the methane uptake by GAC decreases the isosteric heat of methane adsorption.
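The isotherm fitting step can be sketched with SciPy as below. The pressure-uptake points are synthetic placeholders, not the measured data, and the residual sum of squares is used as a simple goodness-of-fit indicator.

```python
# Hypothetical sketch: fitting Langmuir and Freundlich isotherms to uptake data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(p, q_max, b):
    """Langmuir isotherm: q = q_max * b * p / (1 + b * p)."""
    return q_max * b * p / (1.0 + b * p)

def freundlich(p, k, n):
    """Freundlich isotherm: q = k * p**(1/n)."""
    return k * p ** (1.0 / n)

# Synthetic equilibrium data (pressure in bar, uptake in mmol/g) as placeholders.
pressure = np.array([1, 5, 10, 15, 20, 30, 40, 50], dtype=float)
uptake = langmuir(pressure, 8.0, 0.12) + np.random.default_rng(7).normal(0, 0.1, 8)

(lq, lb), _ = curve_fit(langmuir, pressure, uptake, p0=[10.0, 0.1])
(fk, fn), _ = curve_fit(freundlich, pressure, uptake, p0=[1.0, 2.0])

for name, model, params in [("Langmuir", langmuir, (lq, lb)),
                            ("Freundlich", freundlich, (fk, fn))]:
    resid = uptake - model(pressure, *params)
    print(f"{name}: params = {params}, RSS = {np.sum(resid**2):.4f}")
```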

Keywords: Methane adsorption, Activated carbon, Model isotherm, Isosteric heat

6017 Efficient Real-time Remote Data Propagation Mechanism for a Component-Based Approach to Distributed Manufacturing

Authors: V. Barot, S. McLeod, R. Harrison, A. A. West

Abstract:

Manufacturing industries face a crucial change as products and processes are required to be easily and efficiently reconfigurable and reusable. In order to stay competitive and flexible, the situation also demands the global distribution of enterprises, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners via a circular-buffer-based data processing mechanism. The work highlighted in this paper includes the implementation of this mechanism in the domain of the manufacturing industry. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply chain client resources such as an HMI, VRML-based 3D views and remote client instances, regardless of their distribution and/or their mechanisms. This approach is presented together with a set of evaluation results. The authors' main concern is the reliability and the performance of the adopted approach. Performance evaluation is carried out in terms of the response times taken to process the data in this domain and compared with an alternative data processing implementation such as a linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved by the proposed implementation and highlight further research work to be carried out.
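The circular-buffer idea underlying the data processing mechanism can be sketched as below; the Broadcaster itself is not reproduced, and the producer/consumer usage is hypothetical.

```python
# Hypothetical sketch: a fixed-size circular (ring) buffer for machine data,
# where the newest sample overwrites the oldest once the buffer is full.
class CircularBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = [None] * capacity
        self.head = 0      # index of the next write
        self.count = 0     # number of valid items

    def push(self, item):
        """Write an item, overwriting the oldest entry when full."""
        self.data[self.head] = item
        self.head = (self.head + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def snapshot(self):
        """Return items from oldest to newest for distribution to clients."""
        start = (self.head - self.count) % self.capacity
        return [self.data[(start + i) % self.capacity] for i in range(self.count)]

# Producer (machine controller) writes; consumers (HMI, 3D view) read snapshots.
buf = CircularBuffer(capacity=4)
for sample in ["s1", "s2", "s3", "s4", "s5", "s6"]:
    buf.push(sample)
print(buf.snapshot())   # ['s3', 's4', 's5', 's6']
```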

Keywords: Broadcaster, circular buffer, Component-based, distributed manufacturing, remote data propagation.

6016 Hiding Data in Images Using PCP

Authors: Souvik Bhattacharyya, Gautam Sanyal

Abstract:

In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media needs to be transmitted conveniently over the network. Attacks, misuse or unauthorized access to information are of great concern today, which makes the protection of documents transmitted through digital media a priority problem. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. It is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver such that no one except the authenticated receiver knows of the existence of the hidden information. A considerable amount of work has been carried out by different researchers on steganography. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a gray-scale image. The proposed approach selects the embedding pixels using a mathematical function, finds the 8-neighborhood of each selected pixel, and maps each bit of the secret message to a neighbor pixel coordinate position in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or its neighbors lie on the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
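The general embedding idea can be sketched as below. The pixel-selection function and the bit-to-neighbor mapping here are illustrative stand-ins, not the paper's actual PCP scheme; selected pixels are kept away from the image border so the whole 8-neighborhood exists.

```python
# Hypothetical sketch: hiding message bits in the 8-neighborhood of selected
# pixels of a gray-scale image (illustrative, not the exact PCP mapping).
import numpy as np

# Offsets of the 8 neighbors around a pixel, in a fixed order.
NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]

def embed(cover, message_bits, step=97):
    """Embed one byte (8 bits) per selected pixel into its neighbors' LSBs."""
    stego = cover.copy()
    h, w = cover.shape
    # Simple deterministic selection function (stand-in for the paper's),
    # constrained to interior pixels so all 8 neighbors are inside the image.
    positions = [((k * step) % (h - 2) + 1, (k * step // h) % (w - 2) + 1)
                 for k in range(len(message_bits) // 8)]
    for k, (r, c) in enumerate(positions):
        for i, (dr, dc) in enumerate(NEIGHBORS):
            bit = message_bits[8 * k + i]
            rr, cc = r + dr, c + dc
            stego[rr, cc] = (stego[rr, cc] & 0xFE) | bit  # set neighbor's LSB
    return stego

rng = np.random.default_rng(8)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
bits = [1, 0, 1, 1, 0, 0, 1, 0] * 4          # a 4-byte secret message
stego = embed(cover, bits)
print("changed pixels:", int(np.sum(cover != stego)))
```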

Keywords: Cover Image, LSB, Pixel Coordinate Position (PCP), Stego Image.

6015 Accelerating Quantum Chemistry Calculations: Machine Learning for Efficient Evaluation of Electron-Repulsion Integrals

Authors: Nishant Rodrigues, Nicole Spanedda, Chilukuri K. Mohan, Arindam Chakraborty

Abstract:

A crucial objective in quantum chemistry is the computation of the energy levels of chemical systems. This task requires electron-repulsion integrals as inputs and the steep computational cost of evaluating these integrals poses a major numerical challenge in efficient implementation of quantum chemical software. This work presents a moment-based machine learning approach for the efficient evaluation of electron-repulsion integrals. These integrals were approximated using linear combinations of a small number of moments. Machine learning algorithms were applied to estimate the coefficients in the linear combination. A random forest approach was used to identify promising features using a recursive feature elimination approach, which performed best for learning the sign of each coefficient, but not the magnitude. A neural network with two hidden layers was then used to learn the coefficient magnitudes, along with an iterative feature masking approach to perform input vector compression, identifying a small subset of orbitals whose coefficients are sufficient for the quantum state energy computation. Finally, a small ensemble of neural networks (with a median rule for decision fusion) was shown to improve results when compared to a single network.

Keywords: Quantum energy calculations, atomic orbitals, electron-repulsion integrals, ensemble machine learning, random forests, neural networks, feature extraction.

6014 Oil Debris Signal Detection Based on Integral Transform and Empirical Mode Decomposition

Authors: Chuan Li, Ming Liang

Abstract:

Oil debris signal generated from the inductive oil debris monitor (ODM) is useful information for machine condition monitoring but is often spoiled by background noise. To improve the reliability in machine condition monitoring, the high-fidelity signal has to be recovered from the noisy raw data. Considering that the noise components with large amplitude often have higher frequency than that of the oil debris signal, the integral transform is proposed to enhance the detectability of the oil debris signal. To cancel out the baseline wander resulting from the integral transform, the empirical mode decomposition (EMD) method is employed to identify the trend components. An optimal reconstruction strategy including both de-trending and de-noising is presented to detect the oil debris signal with less distortion. The proposed approach is applied to detect the oil debris signal in the raw data collected from an experimental setup. The result demonstrates that this approach is able to detect the weak oil debris signal with acceptable distortion from noisy raw data.

Keywords: Integral transform, empirical mode decomposition, oil debris, signal processing, detection.

6013 Semi-Automatic Method to Assist Expert for Association Rules Validation

Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen

Abstract:

In order to help the expert validate association rules extracted from data, several quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, so manually reviewing the rules is very hard. To solve this problem, we propose in this paper a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize the association rules with their classification quality to give the expert an overview and to assist him during the validation process.

Keywords: Association rules, Rule-based classification, Classification quality, Validation.

6012 Crack Opening Investigation in Fiberconcrete

Authors: Arturs Macanovskis, Vitalijs Lusis, Andrejs Krasnikovs

Abstract:

This work had three stages. In the first stage, the pull-out process was examined for a steel fiber embedded in concrete by one end and pulled out of the concrete at an angle to the pulling-force direction; the angle was varied. Jumps were observed on the obtained force-displacement diagrams. To explain this mechanical behavior, a microscopic experimental investigation of the fiber channel in the concrete surface was performed using a KEYENCE VHX2000 microscope. In the second stage, load versus crack-opening-displacement diagrams were obtained by breaking homogeneously reinforced and layered fiberconcrete prisms (with dimensions 10x10x40 cm) subjected to 4-point bending; after testing, the main crack was analyzed. In the third stage, a prediction model was elaborated for fiberconcrete beam failure under bending, using the following data: a) diagrams for fibers pulled out at different angles; b) experimental data about the locations of straight steel fibers in the main crack. Experimental and theoretical (modeling) data were compared.

Keywords: Fiberconcrete, pull-out, fiber channel, layered fiberconcrete.

6011 Validation of Automotive Centrals Using Hardware in the Loop-Body Control Unit and Lights

Authors: Marley Rosa Luciano, Rodney Rezende Saldanha

Abstract:

The race for electrification and the need for innovation to attract customers have led the automotive industry to do something different with vehicles. New emissions-control challenges and the availability of efficient technology are the pillars of this development. The growing demand to upgrade industrial manufacturing systems creates actions that directly impact vehicle production. With this comes the search for new prototyping methods and virtual tools for component testing and validation, which have established themselves for vehicle systems. The demand for Electronic Control Units (ECUs) is increasing due to the intelligence and safety available in today's vehicles, directly affecting their development, performance, and functional testing. In order to keep up with global changes, the automotive industry uses different virtual environments to produce, verify and validate its vehicles and to test the prototypes used during development. Therefore, in this paper, integration and validation were performed using the Hardware-in-the-Loop (HIL) test platform, focusing on the ECU Body Control Module (BCM). A brief commentary then reviews other test-medium platforms, such as the Plywood Buck (PWB), and examines the reliability, flexibility, installation time, and cost of the three test platforms, namely Software in the Loop (SIL), Model in the Loop (MIL), and HIL, reviewing their benefits, challenges, and issues in use, along with information to optimize the use of each platform and test medium.

Keywords: Automotive, Electronic Central Unit, xIL, Hardware in the loop.

6010 An Attribute-Centre Based Decision Tree Classification Algorithm

Authors: Gökhan Silahtaroğlu

Abstract:

Decision tree algorithms occupy a very important place among the classification models of data mining. In the literature, these algorithms use the entropy concept or the Gini index to form the tree. The shape of the classes and their closeness to each other are some of the factors that affect the performance of the algorithm. In this paper we introduce a new decision tree algorithm which employs a data (attribute) folding method and the variation of the class variable over the branches to be created. A comparative performance analysis has been carried out between the proposed algorithm and C4.5.
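Since the proposed algorithm cannot be reconstructed from the abstract alone, the sketch below only illustrates the two split measures it refers to, entropy and the Gini index, and the resulting information gain of a candidate split.

```python
# Hypothetical sketch: entropy and Gini impurity of a class distribution,
# the two split measures the abstract refers to.
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of the class labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def gini(labels):
    """Gini impurity of the class labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(1.0 - np.sum(p ** 2))

parent = np.array([0, 0, 0, 0, 1, 1, 1, 1])
left, right = parent[:3], parent[3:]          # a candidate split

# Information gain of the split = parent entropy - weighted child entropy.
gain = entropy(parent) - (len(left) / len(parent) * entropy(left)
                          + len(right) / len(parent) * entropy(right))
print(f"entropy={entropy(parent):.3f}, gini={gini(parent):.3f}, gain={gain:.3f}")
```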

Keywords: Classification, decision tree, split, pruning, entropy, gini.

6009 Innovation in Traditional Game: A Case Study of Trainee Teachers' Learning Experiences

Authors: Malathi Balakrishnan, Cheng Lee Ooi, Chander Vengadasalam

Abstract:

The purpose of this study is to explore trainee teachers' learning experiences in innovating traditional games during a traditional game carnival. It explores issues arising from multiple case studies of trainee teachers' learning experiences in innovating traditional games. A qualitative methodology was adopted through observations, semi-structured interviews and content analysis of reflective journals on the trainee teachers' experiences of creating and implementing innovative traditional games. Twelve groups of 36 trainee teachers who registered for the Sports and Physical Education Management course were the participants in this research during the traditional game carnival. Semi-structured interviews were administered after the trainee teachers' learning experiences in creating innovative traditional games. Reflective journals were collected after the carnival day and their content analyzed. Inductive data analysis was used to evaluate the various data sources, and all collected data were evaluated through the NVivo data analysis process. The inductive reasoning was interpreted based on Self-Determination Theory (SDT). The findings showed that the trainee teachers had positive game participation experiences, game knowledge about traditional games and positive motivation to innovate the games. The data also revealed the influence of themes such as cultural significance and creativity. It can be concluded from the findings that the organized game carnival, as a coursework requirement of the Institute of Teacher Training Malaysia, was able to enhance trainee teachers' innovative thinking skills. SDT, as a multidimensional approach to motivation, was utilized; therefore, teacher trainers may gain further learning experiences using SDT.

Keywords: Learning experiences, innovation, traditional games, trainee teachers.

6008 Combining Bagging and Boosting

Authors: S. B. Kotsiantis, P. E. Pintelas

Abstract:

Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data; however, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using a voting methodology over bagging and boosting ensembles with 10 sub-classifiers in each one. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
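The combining scheme can be sketched with scikit-learn as below: a bagging ensemble and a boosting ensemble, each with 10 sub-classifiers (the library's default tree-based base learners), joined by a voting classifier. The dataset and hyperparameters are placeholders, not those of the paper's benchmark study.

```python
# Hypothetical sketch: voting over a bagging ensemble and a boosting ensemble,
# each built from 10 sub-classifiers.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

bagging = BaggingClassifier(n_estimators=10, random_state=0)    # default tree base learners
boosting = AdaBoostClassifier(n_estimators=10, random_state=0)  # default stump base learners

# Soft voting averages the class probabilities of the two ensembles.
combined = VotingClassifier(estimators=[("bag", bagging), ("boost", boosting)],
                            voting="soft")

for name, clf in [("bagging", bagging), ("boosting", boosting),
                  ("bagging+boosting vote", combined)]:
    print(name, cross_val_score(clf, X, y, cv=5).mean().round(3))
```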

Keywords: data mining, machine learning, pattern recognition.

6007 Semantic Mobility Channel (SMC): Ubiquitous and Mobile Computing Meets the Semantic Web

Authors: José M. Cantera, Miguel Jiménez, Genoveva López, Javier Soriano

Abstract:

With the advent of emerging personal computing paradigms such as ubiquitous and mobile computing, Web contents are becoming accessible from a wide range of mobile devices. Since these devices do not have the same rendering capabilities, Web contents need to be adapted for transparent access from a variety of client agents. Such content adaptation is exploited for either an individual element or a set of consecutive elements in a Web document and results in better rendering and faster delivery to the client device. Nevertheless, Web content adaptation sets new challenges for semantic markup. This paper presents an advanced component platform, called SMC, enabling the development of mobility applications and services according to a channel model based on the principles of Service-Oriented Architecture (SOA). It then goes on to describe the potential for integration with the Semantic Web through a novel framework of external semantic annotation that prescribes a scheme for representing semantic markup files and a way of associating Web documents with these external annotations. The role of semantic annotation in this framework is to describe the contents of individual documents themselves, assuring the preservation of the semantics during the process of adapting content rendering. Semantic Web content adaptation is a way of adding value to Web contents and facilitates the repurposing of Web contents (enhanced browsing, Web Services location and access, etc.).

Keywords: Semantic Web, ubiquitous and mobile computing, Web content transcoding, semantic mark-up, middleware and services.

6006 The Expression of Lipoprotein Lipase Gene with Fat Accumulations and Serum Biochemical Levels in Betong (KU Line) and Broiler Chickens

Authors: W. Loongyai, N. Saengsawang, W. Danvilai, C. Kridtayopas, P. Sopannarath, C. Bunchasak

Abstract:

The Betong chicken is a slow-growing, lean strain of chicken, while the rapid growth of the broiler is accompanied by increased fat. We investigated the growth performance, fat accumulation, serum lipid biochemical levels and lipoprotein lipase (LPL) gene expression of female Betong (KU line) chickens at the ages of 4 and 6 weeks. A total of 80 female Betong (KU line) chickens and 80 female broiler chickens were reared under an open system (each group had 4 replicates of 20 chicks per pen). The results showed that the feed intake and average daily gain (ADG) of the broiler chickens were significantly higher than those of the Betong (KU line) (P < 0.01), while the feed conversion ratio (FCR) of the Betong (KU line) was significantly lower than that of the broiler chickens at 6 weeks (P < 0.01). At 4 and 6 weeks, two birds per replicate were randomly selected and slaughtered. Carcass weight did not differ significantly between treatments; the percentages of abdominal fat and subcutaneous fat yield were higher in the broiler (P < 0.01) at 4 and 6 weeks. Total cholesterol and LDL levels of the broiler were higher than those of the Betong (KU line) at 4 and 6 weeks (P < 0.05). Abdominal fat samples were collected for total RNA extraction. The cDNA was amplified using primers specific for LPL gene expression and analysed using real-time PCR. The results showed that the expression of the LPL gene did not differ between Betong (KU line) and broiler chickens at the ages of 4 and 6 weeks (P > 0.05). Our results indicate that broiler chickens had a high growth rate and fat accumulation compared with Betong (KU line) chickens, whereas LPL gene expression did not differ between the breeds.

Keywords: Lipoprotein lipase gene, Betong (KU line), broiler, abdominal fat, gene expression.
