Search results for: Data Structures
6390 Experimental Evaluation of Methane Adsorption on Granular Activated Carbon (GAC) and Determination of Model Isotherm
Authors: M. Delavar, A.A. Ghoreyshi, M. Jahanshahi, M. Irannejad
Abstract:
This study investigates the capacity of granular activated carbon (GAC) for the storage of methane through equilibrium adsorption. An experimental apparatus consisting of a dual adsorption vessel was set up for the measurement of equilibrium adsorption of methane on GAC using the volumetric technique (pressure decay). Experimental isotherms of methane adsorption were determined by measuring the equilibrium uptake of methane at different pressures (0-50 bar) and temperatures (285.15-328.15 K). The experimental data were fitted to the Freundlich and Langmuir equations to determine the model isotherm. The results show that the experimental data are equally well fitted by both model isotherms. Using the experimental data obtained at different temperatures, the isosteric heat of methane adsorption was also calculated by the Clausius-Clapeyron equation from the Sips isotherm model. The results show that decreasing temperature or increasing methane uptake by GAC decreases the isosteric heat of methane adsorption.
Keywords: Methane adsorption, Activated carbon, Model isotherm, Isosteric heat.
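As an illustration of the isotherm-fitting step described above, the following Python sketch fits Langmuir and Freundlich models to uptake data with scipy; the numerical pressure and uptake values are placeholders, not the measured data from the paper.

```python
# Minimal sketch of fitting Langmuir and Freundlich isotherms to equilibrium
# uptake data; the pressure/uptake values below are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(P, q_max, b):
    # q = q_max * b * P / (1 + b * P)
    return q_max * b * P / (1.0 + b * P)

def freundlich(P, k, n):
    # q = k * P**(1/n)
    return k * P ** (1.0 / n)

P = np.array([1, 5, 10, 20, 30, 40, 50], dtype=float)   # bar (illustrative)
q = np.array([0.5, 1.8, 2.9, 4.1, 4.8, 5.2, 5.5])       # mmol/g (illustrative)

popt_L, _ = curve_fit(langmuir, P, q, p0=[6.0, 0.1])
popt_F, _ = curve_fit(freundlich, P, q, p0=[1.0, 2.0])
print("Langmuir  q_max, b:", popt_L)
print("Freundlich k, n   :", popt_F)
```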
6389 Efficient Real-time Remote Data Propagation Mechanism for a Component-Based Approach to Distributed Manufacturing
Authors: V. Barot, S. McLeod, R. Harrison, A. A. West
Abstract:
Manufacturing industries face a crucial change as products and processes are required to be easily and efficiently reconfigurable and reusable. In order to stay competitive and flexible, situations also demand the global distribution of enterprises, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners via a circular-based data processing mechanism. The work highlighted in this paper includes the implementation of this mechanism in the domain of the manufacturing industry. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply chain client resources, such as an HMI, VRML-based 3D views and remote client instances, regardless of their distribution nature and/or their mechanisms. This approach is presented together with a set of evaluation results. The authors' main concentration surrounds the reliability and the performance metrics of the adopted approach. Performance evaluation is carried out in terms of the response times taken to process the data in this domain, compared with an alternative data processing implementation such as the linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved from the proposed implementation and highlight further research work to be carried out.
Keywords: Broadcaster, circular buffer, Component-based, distributed manufacturing, remote data propagation.
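A minimal sketch of the circular-buffer style of data propagation contrasted with an unbounded linear queue, assuming a fixed-capacity ring where the newest machine readings overwrite the oldest; the class and variable names are illustrative, not the Broadcaster's actual API.

```python
# Ring-buffer sketch: the newest machine data overwrites the oldest entry once
# capacity is reached, so remote clients always read the most recent window
# without the unbounded growth of a plain linear queue.
from collections import deque

class RingBroadcaster:
    def __init__(self, capacity=64):
        self.buffer = deque(maxlen=capacity)   # old items dropped automatically

    def publish(self, reading):
        self.buffer.append(reading)            # O(1) insertion

    def snapshot(self):
        return list(self.buffer)               # latest window for remote clients

broadcaster = RingBroadcaster(capacity=4)
for value in range(10):
    broadcaster.publish({"machine_state": value})
print(broadcaster.snapshot())                   # only the 4 most recent readings
```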
6388 Hiding Data in Images Using PCP
Authors: Souvik Bhattacharyya, Gautam Sanyal
Abstract:
In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media needs to be transmitted conveniently over the network. Attacks, misuse or unauthorized access of information is of great concern today, which makes the protection of documents through digital media a priority problem. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. It is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver such that no one except the authenticated receiver knows of the existence of the information. A considerable amount of work has been carried out by different researchers on steganography. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of the gray scale image. The proposed approach works by selecting the embedding pixels using a mathematical function, finding the 8-neighborhood of each selected pixel, and mapping each bit of the secret message to a neighbor pixel coordinate position in a specified manner. Before embedding, a check is performed to find out whether the selected pixel or its neighbors lie at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
Keywords: Cover Image, LSB, Pixel Coordinate Position (PCP), Stego Image.
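The following Python sketch illustrates the general idea of coordinate-position embedding described above, under simplifying assumptions of my own (a fixed selection function and a plain LSB write into the chosen neighbor); it is not the authors' exact PCP mapping rule.

```python
# Illustrative sketch: pick embedding pixels with a simple selection function,
# then write one message bit into the LSB of one of the 8 neighbors whose index
# is derived from the bit position. This mimics the coordinate-position idea
# only in spirit; the paper's actual mapping rule is not reproduced here.
import numpy as np

NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]

def embed(cover, bits, step=7):
    stego = cover.copy()
    h, w = stego.shape
    for i, bit in enumerate(bits):
        r = (i * step) % (h - 2) + 1               # selection function (assumed)
        c = (i * step * 3) % (w - 2) + 1
        dr, dc = NEIGHBORS[i % 8]                  # neighbor chosen by bit index
        if 0 < r + dr < h - 1 and 0 < c + dc < w - 1:   # boundary check
            stego[r + dr, c + dc] = (stego[r + dr, c + dc] & 0xFE) | bit  # LSB write
    return stego

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
message_bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, message_bits)
print(int(np.sum(cover != stego)), "pixels changed")
```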
6387 Conventional Design and Simulation of an Urban Hybrid Bus
Authors: A. Khanipour, K. M. Ebrahimi, W. J. Seale
Abstract:
Due to heightened concerns over environmental and economic issues, the growing impact of air pollution, and the importance of conserving fossil fuel resources in the world, the automotive industry is now forced to produce more fuel efficient, low emission vehicles and new drive system technologies. One of the most promising technologies to receive attention is the hybrid electric vehicle (HEV), which consists of two or more energy sources that supply energy to electric traction motors that in turn drive the wheels. This paper presents the various structures of HEV systems, the basic theoretical knowledge for describing their operation, and the general behaviour of the HEV in the acceleration, cruise and deceleration phases. The conventional design and sizing of a series HEV is studied. A conventional bus and its series configuration are defined and evaluated using ADVISOR. In this section the simulation of a standard driving cycle and the prediction of the fuel consumption and emissions of the HEV are discussed. Finally, the bus performance is investigated to establish whether it can satisfy the performance, fuel consumption and emissions requirements. The validity of the simulation has been established by the close conformity between the fuel consumption of the conventional bus reported by the manufacturer and that obtained from the simulation.
Keywords: Hybrid Electric Vehicle, Hybridization, LEV, HEV.
6386 Lowering Error Floors by Concatenation of Low-Density Parity-Check and Array Code
Authors: Cinna Soltanpur, Mohammad Ghamari, Behzad Momahed Heravi, Fatemeh Zare
Abstract:
Low-density parity-check (LDPC) codes have been shown to deliver capacity-approaching performance; however, problematic graphical structures (e.g. trapping sets) in the Tanner graph of some LDPC codes can cause high error floors in bit-error-ratio (BER) performance under the conventional sum-product algorithm (SPA). This paper presents a serial concatenation scheme to avoid the trapping sets and to lower the error floors of the LDPC code. The outer code in the proposed concatenation is the LDPC code, and the inner code is a high-rate array code. This approach applies an iterative hybrid process between BCJR decoding for the array code and the SPA for the LDPC code, together with bit-pinning and bit-flipping techniques. The Margulis code of size (2640, 1320) has been used for the simulation, and it has been shown that the proposed concatenation and decoding scheme can considerably improve the error floor performance with minimal rate loss.
Keywords: Concatenated coding, low-density parity-check codes, array code, error floors.
6385 A Rigid Point Set Registration of Remote Sensing Images Based on Genetic Algorithms and Hausdorff Distance
Authors: F. Meskine, N. Taleb, M. Chikr El-Mezouar, K. Kpalma, A. Almhdie
Abstract:
Image registration is the process of establishing point-by-point correspondence between images obtained from the same scene. This process is very useful in remote sensing, medicine, cartography, computer vision, etc. The task of registration is to place the data into a common reference frame by estimating the transformations between the data sets. In this work, we develop a rigid point registration method based on the application of genetic algorithms and the Hausdorff distance. First, we extract the feature points from both images based on the global and local curvature corner detection algorithm. After refining the feature points, we use the Hausdorff distance as the similarity measure between the two data sets, and to optimize the search space we use genetic algorithms, which achieve high computation speed due to their inherent parallelism. The results show the efficiency of this method for the registration of satellite images.
Keywords: Feature extraction, Genetic algorithms, Hausdorff distance, Image registration, Point registration.
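A compact sketch of the optimization loop implied above, assuming a 2-D rigid transform (rotation plus translation) and a very small, hand-rolled genetic algorithm; the population settings and the point sets are illustrative only, not the paper's configuration.

```python
# Tiny genetic algorithm minimizing the symmetric Hausdorff distance between a
# reference point set and a transformed sensed point set over the rigid
# parameters (theta, tx, ty). All settings and data are illustrative.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(0)
reference = rng.uniform(0, 100, (40, 2))
true_theta, true_t = 0.3, np.array([5.0, -3.0])
R = np.array([[np.cos(true_theta), -np.sin(true_theta)],
              [np.sin(true_theta),  np.cos(true_theta)]])
sensed = reference @ R.T + true_t                      # synthetic "moving" points

def fitness(params):
    theta, tx, ty = params
    Rm = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
    moved = (sensed - [tx, ty]) @ Rm                   # apply inverse candidate transform
    return max(directed_hausdorff(reference, moved)[0],
               directed_hausdorff(moved, reference)[0])

pop = rng.uniform([-1, -10, -10], [1, 10, 10], (30, 3))
for _ in range(60):                                     # generations
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[:10]]              # selection of the fittest
    children = parents[rng.integers(0, 10, 20)] + rng.normal(0, 0.2, (20, 3))  # mutation
    pop = np.vstack([parents, children])
best = pop[np.argmin([fitness(p) for p in pop])]
print("estimated (theta, tx, ty):", best)
```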
6384 Oil Debris Signal Detection Based on Integral Transform and Empirical Mode Decomposition
Authors: Chuan Li, Ming Liang
Abstract:
The oil debris signal generated by the inductive oil debris monitor (ODM) provides useful information for machine condition monitoring but is often spoiled by background noise. To improve the reliability of machine condition monitoring, the high-fidelity signal has to be recovered from the noisy raw data. Considering that the noise components with large amplitude often have higher frequency than the oil debris signal, an integral transform is proposed to enhance the detectability of the oil debris signal. To cancel out the baseline wander resulting from the integral transform, the empirical mode decomposition (EMD) method is employed to identify the trend components. An optimal reconstruction strategy including both de-trending and de-noising is presented to detect the oil debris signal with less distortion. The proposed approach is applied to detect the oil debris signal in raw data collected from an experimental setup. The result demonstrates that this approach is able to detect the weak oil debris signal with acceptable distortion from noisy raw data.
Keywords: Integral transform, empirical mode decomposition, oil debris, signal processing, detection.
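A sketch of the de-trending idea, assuming the third-party PyEMD (EMD-signal) package is available: integrate the noisy signal to attenuate high-frequency noise, then remove the resulting baseline wander by subtracting the slowest EMD components. The signal values are synthetic stand-ins, and taking the last two IMFs as the trend is an illustrative choice rather than the paper's reconstruction rule.

```python
# Integral transform + EMD de-trending sketch on a synthetic debris pulse.
import numpy as np
from PyEMD import EMD   # assumes the EMD-signal (PyEMD) package is installed

t = np.linspace(0, 1, 2000)
debris = 0.5 * np.exp(-((t - 0.5) ** 2) / 1e-4) * np.sin(2 * np.pi * 40 * t)  # weak pulse
noise = 0.3 * np.sin(2 * np.pi * 300 * t) + 0.05 * np.random.randn(t.size)
raw = debris + noise

integrated = np.cumsum(raw) / 2000.0           # "integral transform": attenuates high frequencies
imfs = EMD().emd(integrated)                   # decompose into IMFs (last rows are slowest)
trend = imfs[-2:].sum(axis=0)                  # take the slowest components as baseline wander
detrended = integrated - trend                 # candidate oil-debris signature
print("peak location (s):", t[np.argmax(np.abs(detrended))])
```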
6383 Semi-Automatic Method to Assist Expert for Association Rules Validation
Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen
Abstract:
In order to help the expert validate association rules extracted from data, some quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, and manually examining them is a very hard task. To solve this problem, we propose, in this paper, a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize association rules with their classification quality to give the expert an overview and assist him during the validation process.
Keywords: Association rules, Rule-based classification, Classification quality, Validation.
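A minimal sketch of steps (i) and (ii), assuming rules are stored as (antecedent items, predicted class) pairs and applied in order; the rules, records and accuracy score below are illustrative placeholders, not the authors' data or exact quality measure.

```python
# Turn association rules of the form {items} -> class into an ordered
# rule-based classifier, then score each rule by its accuracy on the data.
rules = [
    ({"bread", "butter"}, "weekday"),
    ({"beer"}, "weekend"),
]
records = [({"bread", "butter", "milk"}, "weekday"),
           ({"beer", "chips"}, "weekend"),
           ({"beer"}, "weekday")]

def classify(items, rules, default="weekday"):
    for antecedent, label in rules:            # first matching rule wins
        if antecedent <= items:
            return label
    return default

for antecedent, label in rules:
    covered = [(x, y) for x, y in records if antecedent <= x]
    correct = sum(1 for _, y in covered if y == label)
    accuracy = correct / len(covered) if covered else 0.0
    print(sorted(antecedent), "->", label, "accuracy:", round(accuracy, 2))
```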
6382 Crack Opening Investigation in Fiberconcrete
Authors: Arturs Macanovskis, Vitalijs Lusis, Andrejs Krasnikovs
Abstract:
This work had three stages. In the first stage, the pull-out process was examined for a steel fiber embedded into concrete by one end and pulled out of the concrete at an angle to the pulling-out force direction; the angle was varied. Jumps were observed on the obtained force-displacement diagrams. To explain this mechanical behavior, a microscopic experimental investigation of the fiber channel in the concrete surface was performed using a KEYENCE VHX2000 microscope. In the second stage, load versus crack opening displacement diagrams were obtained by breaking homogeneously reinforced and layered fiberconcrete prisms (with dimensions 10x10x40 cm) subjected to 4-point bending. After testing, the main crack was analyzed. In the third stage, a prediction model was elaborated for fiberconcrete beam failure under bending, using the following data: a) diagrams for fibers pulled out at different angles; b) experimental data about the locations of straight steel fibers in the main crack. Experimental and theoretical (modeling) data were compared.
Keywords: Fiberconcrete, pull-out, fiber channel, layered fiberconcrete.
6381 A Similarity Function for Global Quality Assessment of Retinal Vessel Segmentations
Authors: Arturo Aquino, Manuel Emilio Gegundez, Jose Manuel Bravo, Diego Marin
Abstract:
Retinal vascularity assessment plays an important role in the diagnosis of ophthalmic pathologies. The employment of digital images for this purpose makes a computerized approach possible and has motivated the development of many methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms and, concretely, accuracy has mostly been used as the measure of global performance in this topic. However, this metric shows very poor matching with human perception as well as other notable deficiencies. Here, a new similarity function for measuring the quality of retinal vessel segmentations is proposed. This similarity function is based on characterizing the vascular tree as a connected structure with a measurable area and length. The tests made indicate that this new approach behaves better than the current one. Generalizing, this concept of measuring descriptive properties may be used for designing functions that measure more successfully the segmentation quality of other complex structures.
Keywords: Retinal vessel segmentation, quality assessment, performance evaluation, similarity function.
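An illustrative Python sketch of the idea of comparing the area and skeleton length of a segmented vascular tree against a reference; the way the two ratios are combined into one score is my assumption for illustration, not the authors' exact similarity function.

```python
# Compare a candidate vessel segmentation to a reference by two descriptive
# properties of the connected structure: its area (pixel count) and its length
# (skeleton pixel count). The averaging of the two ratios is an assumption.
import numpy as np
from skimage.morphology import skeletonize

def descriptive_similarity(segmentation, reference):
    seg, ref = segmentation.astype(bool), reference.astype(bool)
    area_ratio = 1.0 - abs(int(seg.sum()) - int(ref.sum())) / max(int(ref.sum()), 1)
    len_seg = int(skeletonize(seg).sum())
    len_ref = int(skeletonize(ref).sum())
    length_ratio = 1.0 - abs(len_seg - len_ref) / max(len_ref, 1)
    return 0.5 * (max(area_ratio, 0.0) + max(length_ratio, 0.0))

reference = np.zeros((64, 64), dtype=bool)
reference[32, 5:60] = True                  # a thin horizontal "vessel"
candidate = reference.copy()
candidate[31, 5:40] = True                  # slightly thicker segmentation
print("similarity:", round(descriptive_similarity(candidate, reference), 3))
```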
6380 HEXAFLY-INT Project: Design of a High Speed Flight Experiment
Authors: S. Di Benedetto, M. P. Di Donato, A. Rispoli, S. Cardone, J. Riehmer, J. Steelant, L. Vecchione
Abstract:
Thanks to coordinated funding by the European Space Agency (ESA) and the European Commission (EC) within the 7th Framework Programme, the High-Speed Experimental Fly Vehicles – International (HEXAFLY-INT) project is aimed at the flight validation of hypersonic technologies enabling future trans-atmospheric flights. The project, which currently involves partners from Europe, the Russian Federation and Australia operating under ESA/ESTEC coordination, will achieve the goal of designing, manufacturing, assembling and flight testing an unpowered high speed vehicle in a glider configuration by 2018. The main technical challenges of the project are specifically related to the design of the vehicle gliding configuration and to the complexity of integrating breakthrough technologies with standard aeronautical technologies, e.g. the high temperature protection system and airframe cold structures. Also, the sonic boom impact, which is one of the environmental challenges of high speed flight, will be assessed. This paper provides a comprehensive and detailed update on all the current project activities carried out to date on both the vehicle and mission design.
Keywords: Design, flight testing, hypersonics, integration.
6379 An Attribute-Centre Based Decision Tree Classification Algorithm
Authors: Gökhan Silahtaroğlu
Abstract:
Decision tree algorithms have a very important place among the classification models of data mining. In the literature, algorithms use the entropy concept or the Gini index to form the tree. The shape of the classes and their closeness to each other are some of the factors that affect the performance of the algorithm. In this paper we introduce a new decision tree algorithm which employs a data (attribute) folding method and the variation of the class variables over the branches to be created. A comparative performance analysis has been held between the proposed algorithm and C4.5.
Keywords: Classification, decision tree, split, pruning, entropy, gini.
6378 Wind Tunnel for Aerodynamic Development Testing
Authors: E. T. L. Cöuras Ford, V. A. C. Vale, J. U. L. Mendes, F. A. Ribeiro
Abstract:
The study of aerodynamics is related to improving the performance of airplanes and automobiles, with the objective of reducing the effect of air friction on structures, providing higher speeds and lower fuel consumption. The application of aerodynamic knowledge is no longer limited to the aeronautical and automobile industries. This research therefore aims at the design and construction of a wind tunnel to perform aerodynamic analyses of car bodies, seeking greater efficiency. For this, a methodology for selecting the type of wind tunnel to be built was designed, taking into account the various existing configurations; an open circuit tunnel was chosen, due to the lower complexity of construction and installation, operational simplicity and low cost. The guidelines for the project were educational: to study the boundary layer and to analyze specimens with different geometries. For the measurement of pressure variation in the test section, a pitot tube connected to a gauge was used. Thus, it was possible to obtain quantitative and qualitative results, which proved to be satisfactory.
Keywords: Wind tunnel, Aerodynamics, Air.
6377 An Experimental Investigation on the Droplet Behavior Impacting a Hot Surface above the Leidenfrost Temperature
Authors: Khaleel Sami Hamdan, Dong-Eok Kim, Sang-Ki Moon
Abstract:
An appropriate model to predict the size of the droplets resulting from break-up with structures will help in a better understanding and modeling of the two-phase flow calculations in the simulation of a reactor core loss-of-coolant accident (LOCA). The behavior of a droplet impacting a hot surface above the Leidenfrost temperature was investigated. Droplets of known size and velocity were impacted onto an inclined hot plate, and the behavior of the droplets was observed by a high-speed camera. It was found that, for droplets with a Weber number higher than a certain value, the higher the Weber number of the droplet the smaller the secondary droplets. The COBRA-TF model over-predicted the measured secondary droplet sizes obtained in the present experiment. A simple model for the secondary droplet size was proposed using the mass conservation equation. The maximum spreading diameter of the droplets was also compared to previous correlations and a fairly good agreement was found. A better prediction of the heat transfer in the case of a LOCA can be obtained with the presented model.
Keywords: Break-up, droplet, impact, inclined hot plate, Leidenfrost temperature, LOCA.
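To make the mass-conservation argument concrete, the sketch below computes the droplet Weber number and the secondary droplet diameter that follows from splitting the parent volume into N equal fragments; the fluid properties and the fragment count are illustrative assumptions, not the paper's model constants.

```python
# Weber number of the impacting droplet and the secondary droplet size implied
# by mass conservation if the parent breaks into n equal fragments:
#   We = rho * v**2 * D / sigma,  (pi/6) D**3 = n * (pi/6) d**3  =>  d = D / n**(1/3)
rho = 998.0      # kg/m^3, water (assumed)
sigma = 0.072    # N/m, surface tension of water (assumed)
D = 2.0e-3       # m, parent droplet diameter (illustrative)
v = 3.0          # m/s, impact velocity (illustrative)
n = 8            # number of equal secondary droplets (illustrative)

weber = rho * v ** 2 * D / sigma
d_secondary = D / n ** (1.0 / 3.0)
print(f"We = {weber:.1f}, secondary droplet diameter = {d_secondary * 1e3:.2f} mm")
```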
6376 Development and Structural Performance Evaluation on Slit Circular Shear Panel Damper
Authors: Daniel Y. Abebe, Jaehyouk Choi
Abstract:
There are several types of metal-based devices conceived as dampers for seismic energy absorption, whereby damage to the major structural components can be minimized for both new and existing structures. This paper aims to develop and evaluate the structural performance of a slit circular shear panel damper for passive seismic energy protection by inelastic deformation. The structural evaluation was done using a commercially available nonlinear FE simulation program. The main parameters considered are the diameter-to-thickness (D/t) ratio and the slit length-to-width ratio (l/w). Depending on these parameters, three different buckling modes and hysteretic behaviors were found: yielding prior to buckling without strength degradation, yielding prior to buckling with strength degradation, and yielding with buckling and strength degradation which forms pinching at the initial displacement. The susceptible location at which a possible crack is initiated was also identified for selected specimens using the rupture index.
Keywords: Slit circular shear panel damper, Hysteresis characteristics, Slit length-to-width ratio, D/t ratio, FE analysis.
6375 Innovation in Traditional Game: A Case Study of Trainee Teachers' Learning Experiences
Authors: Malathi Balakrishnan, Cheng Lee Ooi, Chander Vengadasalam
Abstract:
The purpose of this study is to explore trainee teachers' learning experiences in innovating traditional games during a traditional game carnival, and the issues arising from multiple case studies of those experiences. A qualitative methodology was adopted through observations, semi-structured interviews, and content analysis of reflective journals on trainee teachers' learning experiences in creating and implementing innovative traditional games. Twelve groups of 36 trainee teachers who registered for the Sports and Physical Education Management Course were the participants of this research during the traditional game carnival. Semi-structured interviews were administered after the trainee teachers' learning experiences in creating innovative traditional games. Reflective journals were collected after the carnival day and their content analyzed. Inductive data analysis was used to evaluate the various data sources, and all the collected data were then evaluated through the NVivo data analysis process. Inductive reasoning was interpreted based on Self-Determination Theory (SDT). The findings showed that the trainee teachers had positive game participation experiences, game knowledge about traditional games, and positive motivation to innovate the games. The data also revealed the influence of themes such as cultural significance and creativity. It can be concluded from the findings that the organized game carnival, as a requirement of course work by the Institute of Teacher Training Malaysia, was able to enhance the trainee teachers' innovative thinking skills. The SDT, as a multidimensional approach to motivation, was utilized; therefore, teacher trainers may gain more from such learning experiences using the SDT.
Keywords: Learning experiences, innovation, traditional games, trainee teachers.
6374 Combining Bagging and Boosting
Authors: S. B. Kotsiantis, P. E. Pintelas
Abstract:
Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using a voting methodology over bagging and boosting ensembles with 10 sub-classifiers in each one. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as other well known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
Keywords: data mining, machine learning, pattern recognition.
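A minimal scikit-learn sketch of the voting-over-ensembles idea, assuming decision trees as base classifiers and a toy synthetic dataset; it reproduces the general construction (bagging plus boosting combined by voting), not the paper's exact experimental setup.

```python
# Combine a bagging ensemble and a boosting ensemble (10 base classifiers each)
# through voting, and compare with each ensemble alone via cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=10, random_state=0)
boosting = AdaBoostClassifier(n_estimators=10, random_state=0)
combined = VotingClassifier([("bag", bagging), ("boost", boosting)], voting="soft")

for name, model in [("bagging", bagging), ("boosting", boosting), ("voting", combined)]:
    print(name, cross_val_score(model, X, y, cv=5).mean().round(3))
```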
6373 A Developed Power and Free Conveyor for Light Loads in Intra-Logistics
Authors: Batin Latif Aylak, Bernd Noche
Abstract:
Nowadays there are many applications of power and free conveyors in logistics; they are the most frequently used conveyor systems worldwide. Overhead conveyor technologies such as power and free systems are used in most intra-logistics applications in trade and industry. The automotive, food, beverage and textile industries, as well as aeronautic catering and engineering, are among the applications. In manufacturing as well as in production, power and free systems serve as temporary stores and buffers between different manufacturing intervals. Depending on the application area, power and free conveyors are equipped with target controls enabling complex distribution and sorting tasks. This article introduces a new power and free conveyor design for intra-logistics and explains its components. Based on the description of the components, a model is created by means of their technical characteristics. Through CAD software, the model is visualized. After that, a static analysis is evaluated; this analysis supports the calculation of the required state of the structures under force action. This powerful model helps companies achieve lower development costs as well as quicker market maturity.
Keywords: Intra-logistics, material flow, power and free conveyor.
6372 Trend Analysis for Extreme Rainfall Events in New South Wales, Australia
Authors: Evan Hajani, Ataur Rahman, Khaled Haddad
Abstract:
Climate change will affect the hydrological cycle in many different ways, such as increases in evaporation and rainfall. There has been growing interest among researchers in identifying the nature of trends in historical rainfall data in many different parts of the world. This paper examines the trends in annual maximum rainfall data from 30 stations in New South Wales, Australia, using two non-parametric tests, Mann-Kendall (MK) and Spearman's Rho (SR). Rainfall data were analyzed for fifteen different durations ranging from 6 min to 3 days. It is found that the sub-hourly durations (6, 12, 18, 24, 30 and 48 minutes) show statistically significant positive (upward) trends, whereas longer duration (sub-daily and daily) events generally show a statistically significant negative (downward) trend. It is also found that the MK test and SR test provide notably different results for some rainfall event durations considered in this study. Since shorter duration sub-hourly rainfall events show positive trends at many stations, the design rainfall data based on stationary frequency analysis for these durations need to be adjusted to account for the impact of climate change. These shorter durations are more relevant to many urban development projects based on smaller catchments having a much shorter response time.
Keywords: Climate change, Mann-Kendall test, Spearman’s Rho test, trends, design rainfall.
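A compact sketch of the Mann-Kendall test statistic on an annual maximum series, written directly from the standard S and variance formulas; the series is synthetic and ties are ignored for brevity.

```python
# Mann-Kendall trend test (no tie correction): S sums the signs of all pairwise
# differences, Var(S) = n(n-1)(2n+5)/18, and Z is the standardized statistic
# compared against the standard normal distribution.
import numpy as np
from scipy.stats import norm

def mann_kendall(series):
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0   # continuity correction
    p = 2 * (1 - norm.cdf(abs(z)))                             # two-sided p-value
    return s, z, p

rng = np.random.default_rng(1)
annual_max = 30 + 0.4 * np.arange(40) + rng.normal(0, 3, 40)   # synthetic upward trend
print("S, Z, p =", mann_kendall(annual_max))
```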
6371 Application of Artificial Neural Network to Forecast Actual Cost of a Project to Improve Earned Value Management System
Authors: Seyed Hossein Iranmanesh, Mansoureh Zarezadeh
Abstract:
This paper presents an application of an Artificial Neural Network (ANN) to forecast the actual cost of a project based on the earned value management system (EVMS). For this purpose, some projects were randomly selected from a standard data set, and the necessary progress data, such as actual cost, actual percent complete, baseline cost and percent complete, were produced for five periods of each project. Then an ANN with five inputs, five outputs and one hidden layer was trained to produce forecasted actual costs. The comparison between real and forecasted data shows good performance based on the Mean Absolute Percentage Error (MAPE) criterion. This approach could be applicable to better forecasting of project cost, resulting in a decreased risk of project cost overrun, and is therefore beneficial for planning preventive actions.
Keywords: Earned Value Management System (EVMS), Artificial Neural Network (ANN), Estimate At Completion, Forecasting Methods, Project Performance Measurement.
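A sketch of the forecasting setup using scikit-learn's MLPRegressor, assuming five progress-related inputs and five future-period actual costs as outputs; the data here are randomly generated placeholders rather than the EVMS data set used in the paper.

```python
# One-hidden-layer neural network mapping five EVM progress indicators to
# forecasted actual costs for five periods, evaluated with MAPE.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (200, 5))               # e.g. actual cost, % complete, baseline cost, ...
W = rng.uniform(0.5, 1.5, (5, 5))
Y = X @ W + rng.normal(0, 0.05, (200, 5))     # synthetic "actual cost" targets

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X_tr, Y_tr)

pred = model.predict(X_te)
mape = np.mean(np.abs((Y_te - pred) / Y_te)) * 100
print(f"MAPE: {mape:.2f}%")
```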
6370 Overview of Development of a Digital Platform for Building Critical Infrastructure Protection Systems in Smart Industries
Authors: Bruno Vilić Belina, Ivan Župan
Abstract:
Smart industry concepts and digital transformation are very popular in many industries, which develop their own digital platforms with an important role in innovations and transactions. The main idea of smart industry digital platforms is central data collection, industrial data integration and data usage for smart applications and services. This paper presents the development of a digital platform for building critical infrastructure protection systems in smart industries. It researches different service contracting modalities in Service Level Agreements (SLAs), Customer Relationship Management (CRM) relations, and trends and changes in business architectures (especially process business architecture) for the purpose of developing infrastructural production and distribution networks, as well as the information infrastructure meta-models and generic processes demanded of critical infrastructure owners by critical infrastructure law, satisfying cybersecurity requirements and taking hybrid threats into account.
Keywords: Cybersecurity, critical infrastructure, smart industries, digital platform.
6369 Customer Need Type Classification Model using Data Mining Techniques for Recommender Systems
Authors: Kyoung-jae Kim
Abstract:
Recommender systems are usually regarded as an important marketing tool in e-commerce. They use important information about users to facilitate accurate recommendation. This information includes user context, such as location, time and interest, for the personalization of mobile users. We can easily collect information about location and time because mobile devices communicate with the base station of the service provider. However, information about user interest cannot be easily collected because user interest cannot be captured automatically without the user's approval. User interest is usually represented as a need. In this study, we classify needs into two types according to prior research, and we investigate the usefulness of data mining techniques for classifying user need type for recommendation systems. We employ several data mining techniques, including artificial neural networks, decision trees, case-based reasoning, and multivariate discriminant analysis. Experimental results show that the CHAID algorithm outperforms the other models for classifying user need type. This study performs the McNemar test to examine the statistical significance of the differences between the classification results. The results of the McNemar test also show that CHAID performs better than the other models with statistical significance.
Keywords: Customer need type, Data mining techniques, Recommender system, Personalization, Mobile user.
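A short sketch of the McNemar comparison step, assuming the predictions of two classifiers on the same test set are available; the labels and predictions below are illustrative, and the statsmodels implementation of the test is used.

```python
# McNemar's test on the 2x2 agreement/disagreement table of two classifiers
# evaluated on the same test samples.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1])
pred_a = np.array([1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 0, 1])   # e.g. CHAID (illustrative)
pred_b = np.array([1, 1, 0, 0, 0, 1, 1, 1, 1, 0, 0, 1])   # e.g. ANN (illustrative)

a_ok, b_ok = pred_a == y_true, pred_b == y_true
table = [[np.sum(a_ok & b_ok),  np.sum(a_ok & ~b_ok)],
         [np.sum(~a_ok & b_ok), np.sum(~a_ok & ~b_ok)]]

result = mcnemar(table, exact=True)      # exact binomial version for small counts
print("statistic:", result.statistic, "p-value:", round(result.pvalue, 3))
```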
6368 Innovation Knowledge and Capability, Work Efficiency of Accountants and the Success of SME Registered in Nakorn Pathom Province
Authors: Autjira Songan, Supattra Kanchanopast
Abstract:
The objectives of this research were to compare the success of SMEs registered in Nakorn Pathom Province according to personal data, to study the relations between innovation knowledge and capability and the success of SMEs registered in Nakorn Pathom Province, and to study the relations between work efficiency and the success of SMEs registered in Nakorn Pathom Province. A questionnaire was utilized as a tool to collect data. Statistics utilized in this research included frequency, percentage, mean, standard deviation, and multiple regression analysis. Data were analyzed using the Statistical Package for the Social Sciences. The findings revealed that the majority of respondents were male, aged between 25-34 years old, holding an undergraduate degree, married and living together. The average income of the respondents was between 10,001-20,000 baht. It was also found that, in terms of innovation knowledge and capability, two variables had an influence on innovation knowledge and capability and innovation evaluation, namely physical characteristics and the innovation process.
Keywords: Accountants, Innovation, Knowledge, Work Efficiency.
6367 A Generalized Sparse Bayesian Learning Algorithm for Near-Field Synthetic Aperture Radar Imaging: By Exploiting Impropriety and Noncircularity
Authors: Pan Long, Bi Dongjie, Li Xifeng, Xie Yongle
Abstract:
Near-field synthetic aperture radar (SAR) imaging is an advanced nondestructive testing and evaluation (NDT&E) technique. This paper investigates the complex-valued signal processing related to the near-field SAR imaging system, where the measurement data turn out to be noncircular and improper, meaning that the complex-valued data are correlated with their complex conjugate. Furthermore, we discover that the degree of impropriety of the measurement data and that of the target image can be highly correlated in near-field SAR imaging. Based on these observations, a modified generalized sparse Bayesian learning algorithm is proposed, taking impropriety and noncircularity into account. Numerical results show that the proposed algorithm provides a performance gain with the help of the noncircular assumption on the signals.
Keywords: Complex-valued signal processing, synthetic aperture radar (SAR), 2-D radar imaging, compressive sensing, sparse Bayesian learning.
6366 Machine Learning Approach for Identifying Dementia from MRI Images
Authors: S. K. Aruna, S. Chitra
Abstract:
This research paper presents a framework for classifying Magnetic Resonance Imaging (MRI) images for dementia. Dementia, an age-related cognitive decline, is indicated by degeneration of cortical and sub-cortical structures. Characterizing morphological changes helps understand disease development and contributes to early prediction and prevention of the disease. Modelling that captures the brain's structural variability and is valid in disease classification and interpretation is very challenging. Features are extracted using Gabor filters with orientations of 0, 30, 60 and 90 degrees and the Gray Level Co-occurrence Matrix (GLCM). It is proposed to normalize and fuse the features. Independent Component Analysis (ICA) selects features, and a Support Vector Machine (SVM) classifier with different kernels is evaluated for its efficiency in classifying dementia. This study evaluates the presented framework using MRI images from the OASIS dataset for identifying dementia. Results showed that the proposed feature fusion classifier achieves higher classification accuracy.
Keywords: Magnetic resonance imaging, dementia, Gabor filter, gray level co-occurrence matrix, support vector machine.
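A rough sketch of the feature pipeline described above, assuming scikit-image for the Gabor and GLCM computations and scikit-learn for ICA and the SVM; the images are random stand-ins for MRI slices, and the feature fusion here is a simple concatenation rather than the authors' exact scheme.

```python
# Gabor responses at four orientations plus GLCM texture properties per image,
# fused, reduced with FastICA, and classified with an SVM.
import numpy as np
from skimage.filters import gabor
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
from sklearn.decomposition import FastICA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def features(img):
    feats = []
    for theta in np.deg2rad([0, 30, 60, 90]):
        real, _ = gabor(img, frequency=0.2, theta=theta)
        feats += [real.mean(), real.var()]                        # Gabor statistics
    glcm = graycomatrix((img * 255).astype(np.uint8), [1], [0],
                        levels=256, symmetric=True, normed=True)
    feats += [graycoprops(glcm, p)[0, 0] for p in ("contrast", "homogeneity", "energy")]
    return feats

images = rng.uniform(0, 1, (40, 32, 32))                 # placeholder "MRI" slices
labels = rng.integers(0, 2, 40)                          # placeholder dementia labels
X = np.array([features(im) for im in images])
X = FastICA(n_components=5, random_state=0).fit_transform(X)   # ICA-based reduction
print("CV accuracy:", cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean())
```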
6365 Validity and Reliability of Competency Assessment Implementation (CAI) Instrument Using Rasch Model
Authors: Nurfirdawati Muhamad Hanafi, Azmanirah Ab Rahman, Marina Ibrahim Mukhtar, Jamil Ahmad, Sarebah Warman
Abstract:
This study was conducted to generate empirical evidence on the validity and reliability of the items of the Competency Assessment Implementation (CAI) instrument using the Rasch model for polytomous data, aided by Winstep software version 3.68. The construct validity was examined by analyzing the point-measure correlation index (PTMEA) and the infit and outfit MNSQ values; meanwhile, the reliability was examined by analyzing the item reliability index. A survey technique was used as the major method, with the CAI instrument administered to 156 teachers from vocational schools. The results show that the reliability of the CAI instrument items is between 0.80 and 0.98. The PTMEA correlations are positive, indicating that the items are able to distinguish between the abilities of the respondents. The statistical data obtained show that, out of 154 items, 12 items of the instrument are suggested to be omitted. This study is hoped to bring a new direction to the process of data analysis in educational research.
Keywords: Competency Assessment, Reliability, Validity, Item Analysis.
6364 A Delay-Tolerant Distributed Query Processing Architecture for Mobile Environment
Authors: T.P. Andamuthu, Dr. P. Balasubramanie
Abstract:
Intermittent connectivity breaks the "always on" network assumption made by all distributed query processing systems; in modern-day systems, the absence of network connectivity is considered a fault. It might not be feasible to transmit all the data accumulated since the last upload right away over the available connection, and vital information may be delayed excessively when less important information takes its place. Owing to the restricted and uneven bandwidth, it is vital that the mobile nodes make the most advantageous use of the connectivity when it arrives. Hence, in order to select the data that need to be transmitted first, some sort of data prioritization is essential. A continuous query processing system for intermittently connected mobile networks, comprising a delay-tolerant continuous query processor distributed across the mobile hosts, is proposed in this paper. In addition, a mechanism for prioritizing query results has been designed that guarantees enhanced accuracy and reduced delay. Extensive simulation results illustrate that our architecture reduces the client power consumption and increases query efficiency.
Keywords: Broadcast, Location, Mobile host, Mobility, Query.
6363 Automated Thickness Measurement of Retinal Blood Vessels for Implementation of Clinical Decision Support Systems in Diagnostic Diabetic Retinopathy
Authors: S.Jerald Jeba Kumar, M.Madheswaran
Abstract:
The structure of the retinal vessels is a prominent feature that reveals information on the state of disease, reflected in the form of measurable abnormalities in thickness and colour. The vascular structure of the retina, for the implementation of a clinical diabetic retinopathy decision making system, is presented in this paper. The retinal vascular structure consists of thin blood vessels, so the accuracy of the measurement is highly dependent upon the vessel segmentation. In this paper the blood vessel thickness is automatically detected using preprocessing techniques and a vessel segmentation algorithm. First the captured image is binarized to bring out the blood vessel structure clearly, then it is skeletonised to get the overall structure of all the terminal and branching nodes of the blood vessels. By identifying the terminal nodes and the branching points automatically, the thickness of the main and branching blood vessels is estimated. Results are presented and compared with those provided by clinical classification on 50 vessels collected from Bejan Singh Eye Hospital.
Keywords: Diabetic retinopathy, Binarization, Segmentation, Clinical Decision Support Systems.
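A small Python sketch of the binarize-skeletonize-measure sequence described above; the distance-transform estimate of thickness and the synthetic vessel image are illustrative choices, not the paper's exact procedure.

```python
# Binarize, skeletonize, and estimate local vessel thickness: the Euclidean
# distance transform of the binary vessel map gives the half-width at every
# skeleton pixel, so thickness ~ 2 * distance along the skeleton.
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.morphology import skeletonize

vessel = np.zeros((64, 64), dtype=float)
vessel[30:34, 5:60] = 1.0                      # synthetic 4-pixel-wide "vessel"
vessel[5:30, 20:23] = 1.0                      # a thinner 3-pixel-wide branch

binary = vessel > 0.5                          # binarization
skeleton = skeletonize(binary)                 # one-pixel-wide centreline
half_width = distance_transform_edt(binary)    # distance to background
thickness = 2.0 * half_width[skeleton]         # thickness sampled on the skeleton

print("mean thickness (px):", round(thickness.mean(), 2),
      "max thickness (px):", round(thickness.max(), 2))
```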
6362 Web Traffic Mining using Neural Networks
Authors: Farhad F. Yusifov
Abstract:
With the explosive growth of data available on the Internet, personalization of this information space has become a necessity. With the rapidly increasing popularity of the WWW, websites play a crucial role in conveying knowledge and information to end users. Discovering hidden and meaningful information about Web users' usage patterns is critical for determining effective marketing strategies and optimizing Web server usage to accommodate future growth. The task of mining useful information becomes more challenging when the Web traffic volume is enormous and keeps on growing. In this paper, we propose an intelligent model to discover and analyze useful knowledge from the available Web log data.
Keywords: Clustering, Self organizing map, Web log files, Web traffic.
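Since the keywords point to a self-organizing map over Web log data, here is a brief sketch assuming the third-party MiniSom package and illustrative session features (page views, session length, sections visited); it is not the author's actual model or feature set.

```python
# Cluster Web log sessions with a small self-organizing map (SOM).
import numpy as np
from minisom import MiniSom        # assumes the MiniSom package is installed

rng = np.random.default_rng(0)
sessions = np.column_stack([
    rng.poisson(5, 300),           # page views (illustrative feature)
    rng.exponential(120, 300),     # session length in seconds (illustrative)
    rng.integers(1, 6, 300),       # distinct site sections visited (illustrative)
]).astype(float)
sessions = (sessions - sessions.mean(0)) / sessions.std(0)    # normalize features

som = MiniSom(4, 4, sessions.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(sessions, 1000)                              # unsupervised training

winners = [som.winner(s) for s in sessions]                   # best-matching unit per session
print("sessions mapped to cell (0, 0):", sum(w == (0, 0) for w in winners))
```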
6361 Effect of Single Overload Ratio and Stress Ratio on Fatigue Crack Growth
Authors: M. Benachour, N. Benachour, M. Benguediab
Abstract:
In this investigation, the effect of cyclic loading variation on fatigue crack growth is studied. The study is performed on 2024-T351 and 7050-T74 aluminum alloys, used in aeronautical structures. The propagation model used in this study is the NASGRO model. Under constant amplitude loading (CA), the effect of the stress ratio has been investigated; fatigue life and fatigue crack growth rate were affected by this factor. The results show an increase in fatigue crack growth rates (FCGRs) with increasing stress ratio. Variable amplitude loading (VAL) can take many forms, i.e. with a single overload, an overload band, etc. The shape of these loads strongly affects the fracture life and the FCGRs. The application of a single overload (ORL) decreases the FCGR and increases the delay crack length, caused by the formation of a larger plastic zone compared to the plastic zone without VAL. The fatigue behavior of both materials under a single overload has been compared.
Keywords: Fatigue crack growth, overload ratio, stress ratio, Generalized Willenborg model, retardation, Al-alloys.
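To illustrate the stress ratio effect under constant amplitude loading, the sketch below evaluates one common form of the NASGRO crack growth equation at two stress ratios; the material constants and the crack-opening function value f are illustrative placeholders, not calibrated values for 2024-T351 or 7050-T74.

```python
# One common form of the NASGRO crack growth equation,
#   da/dN = C * [((1 - f) / (1 - R)) * dK]^n * (1 - dK_th/dK)^p / (1 - K_max/K_crit)^q,
# evaluated for two stress ratios with placeholder constants.
def nasgro_rate(dK, R, C=1e-10, n=3.0, p=0.5, q=0.5, f=0.3, dK_th=2.0, K_crit=70.0):
    K_max = dK / (1.0 - R)
    term = ((1.0 - f) / (1.0 - R)) * dK
    return C * term ** n * (1.0 - dK_th / dK) ** p / (1.0 - K_max / K_crit) ** q

for R in (0.1, 0.5):
    rate = nasgro_rate(dK=10.0, R=R)            # dK in MPa*sqrt(m) (illustrative)
    print(f"R = {R}: da/dN = {rate:.3e} m/cycle")
# Higher R gives a higher growth rate, consistent with the trend reported above.
```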