Search results for: data safety.
6229 A New Classification of Risk-Reduction Options to Improve the Risk-Reduction Readiness of the Railway Industry
Authors: Eberechi Weli, Michael Todinov
Abstract:
The gap between the selection of risk-reduction options in the railway industry and the task of their effective implementation results in compromised safety and substantial losses. Effective risk management must integrate the evaluation phases with the implementation phase. This paper proposes an essential categorisation of risk-reduction measures that best addresses a standard railway industry portfolio. By categorising the risk-reduction options into design, operational, procedural and technical options, the efforts of the implementation facilitators (people, processes and supporting systems) are systematically harmonised. The classification is based on an integration of fundamental principles of risk reduction in the railway industry with the systems engineering approach.
This paper argues that the use of a similar classification approach is an attribute of organisations possessing a superior level of risk-reduction readiness. The integration of the proposed rational classification structure provides a solid ground for effective risk reduction.
Keywords: Cost effectiveness, organisational readiness, risk reduction, railway, systems engineering.
6228 Experimental Evaluation of Methane Adsorption on Granular Activated Carbon (GAC) and Determination of Model Isotherm
Authors: M. Delavar, A.A. Ghoreyshi, M. Jahanshahi, M. Irannejad
Abstract:
This study investigates the capacity of granular activated carbon (GAC) for the storage of methane through equilibrium adsorption. An experimental apparatus consisting of a dual adsorption vessel was set up for the measurement of equilibrium adsorption of methane on GAC using the volumetric technique (pressure decay). Experimental isotherms of methane adsorption were determined by measuring the equilibrium uptake of methane at different pressures (0-50 bar) and temperatures (285.15-328.15 K). The experimental data were fitted to the Freundlich and Langmuir equations to determine the model isotherm. The results show that the experimental data are equally well fitted by both model isotherms. Using the experimental data obtained at different temperatures, the isosteric heat of methane adsorption was also calculated by the Clausius-Clapeyron equation from the Sips isotherm model. The results show that decreasing temperature or increasing methane uptake by GAC decreases the isosteric heat of methane adsorption.
Keywords: Methane adsorption, activated carbon, model isotherm, isosteric heat.
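As a rough illustration of the fitting step described above (not the authors' code; the pressure and uptake values below are hypothetical placeholders), the Langmuir and Freundlich isotherms can be fitted to equilibrium data with scipy:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical equilibrium data: pressure (bar) and methane uptake (mmol/g)
p = np.array([1, 5, 10, 20, 30, 40, 50], dtype=float)
q = np.array([0.9, 2.8, 4.1, 5.6, 6.4, 6.9, 7.2])

def langmuir(p, qm, b):
    # q = qm * b * p / (1 + b * p)
    return qm * b * p / (1.0 + b * p)

def freundlich(p, k, n):
    # q = k * p**(1/n)
    return k * p ** (1.0 / n)

(qm, b), _ = curve_fit(langmuir, p, q, p0=[8.0, 0.1])
(k, n), _ = curve_fit(freundlich, p, q, p0=[1.0, 2.0])
print(f"Langmuir: qm={qm:.2f}, b={b:.3f}")
print(f"Freundlich: k={k:.2f}, n={n:.2f}")
```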
6227 Efficient Real-time Remote Data Propagation Mechanism for a Component-Based Approach to Distributed Manufacturing
Authors: V. Barot, S. McLeod, R. Harrison, A. A. West
Abstract:
Manufacturing industries face a crucial change, as products and processes are required to be easily and efficiently reconfigurable and reusable. In order to stay competitive and flexible, situations also demand the global distribution of enterprises, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners via a circular-based data processing mechanism. The work highlighted in this paper includes the implementation of this mechanism in the domain of the manufacturing industry. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply chain client resources, such as an HMI, VRML-based 3D views and remote client instances, regardless of their distribution nature and/or their mechanisms. This approach is presented together with a set of evaluation results. The authors' main focus is the reliability and the performance of the adopted approach. Performance evaluation is carried out in terms of the response times taken to process the data in this domain, compared with an alternative data processing implementation such as the linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved by the proposed implementation and highlight further research work to be carried out.
Keywords: Broadcaster, circular buffer, Component-based, distributed manufacturing, remote data propagation.
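The circular buffer underlying the Broadcaster's data processing mechanism is a standard structure; the following is a minimal Python sketch of the idea (not the prototype's actual implementation), in which a fixed-capacity ring overwrites the oldest entries so slow consumers never block the producer:

```python
class CircularBuffer:
    """Fixed-capacity ring buffer: the writer overwrites the oldest
    entry when full, so slow readers never block the producer."""
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.capacity = capacity
        self.head = 0      # next write position
        self.count = 0     # number of valid entries

    def put(self, item):
        self.buf[self.head] = item
        self.head = (self.head + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def snapshot(self):
        # Oldest-to-newest view for a consumer (e.g. an HMI client)
        start = (self.head - self.count) % self.capacity
        return [self.buf[(start + i) % self.capacity] for i in range(self.count)]

rb = CircularBuffer(4)
for v in range(6):
    rb.put(v)
print(rb.snapshot())   # [2, 3, 4, 5] -- the two oldest values were overwritten
```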
6226 Hiding Data in Images Using PCP
Authors: Souvik Bhattacharyya, Gautam Sanyal
Abstract:
In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media needs to be transmitted conveniently over the network. Attacks, misuse and unauthorized access to information are of great concern today, which makes the protection of documents transmitted through digital media a priority problem. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier, such as an image or audio file. The word is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver such that no one except the authenticated receiver knows of the existence of the information. A considerable amount of work has been carried out by different researchers on steganography. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a gray scale image. The proposed approach works by selecting the embedding pixels using some mathematical function, finding the 8-neighborhood of each selected pixel, and mapping each bit of the secret message to a neighbor pixel coordinate position in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or its neighbors lie on the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
Keywords: Cover image, LSB, pixel coordinate position (PCP), stego image.
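A toy sketch of the embedding pattern described in the abstract follows; the pixel-selection function, the LSB write and the image size are illustrative stand-ins, not the authors' exact PCP scheme:

```python
import numpy as np

def embed(cover, bits, step=37):
    """Toy spatial-domain embedding: select pixels with a simple
    mathematical function, then write one message bit into the LSB of
    each of the 8 neighbors of every selected pixel. The 'step'
    selection and LSB choice are illustrative, not the PCP scheme."""
    stego = cover.copy()
    h, w = stego.shape
    # 8-neighborhood offsets around a pixel
    nbrs = [(-1,-1),(-1,0),(-1,1),(0,-1),(0,1),(1,-1),(1,0),(1,1)]
    k = 0
    for idx in range(0, h * w, step):
        r, c = divmod(idx, w)
        # Boundary check: skip pixels whose neighborhood leaves the image
        if r == 0 or c == 0 or r == h - 1 or c == w - 1:
            continue
        for dr, dc in nbrs:
            if k >= len(bits):
                return stego
            stego[r + dr, c + dc] = (stego[r + dr, c + dc] & 0xFE) | bits[k]
            k += 1
    return stego

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, message)
```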
6225 A Rigid Point Set Registration of Remote Sensing Images Based on Genetic Algorithms and Hausdorff Distance
Authors: F. Meskine, N. Taleb, M. Chikr El-Mezouar, K. Kpalma, A. Almhdie
Abstract:
Image registration is the process of establishing point-by-point correspondence between images obtained from the same scene. This process is very useful in remote sensing, medicine, cartography, computer vision, etc. The task of registration is to place the data into a common reference frame by estimating the transformations between the data sets. In this work, we develop a rigid point registration method based on the application of genetic algorithms and the Hausdorff distance. First, we extract the feature points from both images using a global and local curvature corner detection algorithm. After refining the feature points, we use the Hausdorff distance as the similarity measure between the two data sets, and we use genetic algorithms to optimize the search over the transformation space, achieving high computation speed thanks to their inherent parallelism. The results show the efficiency of this method for the registration of satellite images.
Keywords: Feature extraction, genetic algorithms, Hausdorff distance, image registration, point registration.
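The Hausdorff similarity measure at the core of this approach is readily computed with scipy; the sketch below (hypothetical data, an illustrative rigid-transform parameterization, and no actual GA loop) shows the fitness function a genetic algorithm would minimize:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff(A, B):
    # Symmetric Hausdorff distance between two 2-D point sets
    return max(directed_hausdorff(A, B)[0], directed_hausdorff(B, A)[0])

def fitness(params, src, dst):
    """Cost of a rigid transform (rotation theta, translation tx, ty)
    applied to the source feature points -- the quantity a genetic
    algorithm would minimize. The parameterization is illustrative."""
    theta, tx, ty = params
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return hausdorff(src @ R.T + [tx, ty], dst)

src = np.random.rand(50, 2)
R = np.array([[np.cos(0.3), -np.sin(0.3)], [np.sin(0.3), np.cos(0.3)]])
dst = src @ R.T + [1.0, -2.0]
print(fitness((0.3, 1.0, -2.0), src, dst))   # ~0 at the true transform
```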
6224 Formation of Vasoactive Amines in Dry Fermented Sausage Petrovská Klobása during Drying and Ripening in Traditional and Industrial Conditions
Authors: Tatjana A. Tasić, Predrag M. Ikonić, Ljiljana S. Petrović, Marija R. Jokanović, Vladimir M. Tomović, Branislav V. Šojić, Snežana B. Škaljac
Abstract:
The formation of histamine, tryptamine, phenylethylamine and tyramine (vasoactive amines) in the dry fermented sausage Petrovská klobása during drying and ripening in a traditional room (B1) and an industrial ripening chamber (B3) was investigated. Dansyl chloride derivatized vasoactive amines were determined using HPLC-DAD on an Eclipse XDB-C18 column.
Histamine, the most important amine from the food safety point of view, was not detected in any analyzed sample. Unlike most other fermented sausages, where tyramine is reported as the most abundant amine, in Petrovská klobása tryptamine was the most abundant vasoactive amine in both groups of sausages, even though the concentrations of tryptamine and tyramine in B3 sausages at the end of ripening were nearly the same (39.8 versus 39.6 mg/kg). The sum of vasoactive amines in samples varied from not detected (ND, B3) to 176 mg/kg (B1), with concentrations of 36.1 (B3) and 73.6 (B1) mg/kg at the end of drying and 96 (B3) and 176 (B1) mg/kg at the end of the ripening period. Although the sum of vasoactive amines increased from the end of drying (days 45 and 90) to the end of the ripening period (day 120), during the whole production period these values did not exceed 200 mg/kg, proposed as a possible indicator of hygienic conditions and GMP in sausage production.
Keywords: Vasoactive amines, traditional dry fermented sausage Petrovská klobása.
6223 Oil Debris Signal Detection Based on Integral Transform and Empirical Mode Decomposition
Authors: Chuan Li, Ming Liang
Abstract:
The oil debris signal generated by an inductive oil debris monitor (ODM) provides useful information for machine condition monitoring but is often spoiled by background noise. To improve the reliability of machine condition monitoring, a high-fidelity signal has to be recovered from the noisy raw data. Considering that the noise components with large amplitude often have higher frequencies than the oil debris signal, an integral transform is proposed to enhance the detectability of the oil debris signal. To cancel out the baseline wander resulting from the integral transform, the empirical mode decomposition (EMD) method is employed to identify the trend components. An optimal reconstruction strategy including both de-trending and de-noising is presented to detect the oil debris signal with less distortion. The proposed approach is applied to detect the oil debris signal in raw data collected from an experimental setup. The result demonstrates that this approach is able to detect the weak oil debris signal with acceptable distortion from noisy raw data.
Keywords: Integral transform, empirical mode decomposition, oil debris, signal processing, detection.
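A minimal sketch of the two-step idea on synthetic data: cumulative integration attenuates the high-frequency noise relative to the lower-frequency debris pulse, and the resulting baseline wander is then removed. A linear detrend stands in here for the paper's EMD-based trend identification, and all amplitudes are illustrative:

```python
import numpy as np
from scipy.signal import detrend

# Synthetic raw ODM data: a weak low-frequency debris pulse buried in
# high-frequency noise (amplitudes are illustrative).
t = np.linspace(0, 1, 2000)
debris = 0.5 * np.exp(-((t - 0.5) / 0.02) ** 2) * np.sin(2 * np.pi * 20 * t)
raw = debris + 0.4 * np.sin(2 * np.pi * 400 * t) + 0.2 * np.random.randn(t.size)

# Integral transform: cumulative integration attenuates the
# high-frequency noise relative to the lower-frequency debris signal.
integrated = np.cumsum(raw) * (t[1] - t[0])

# The integration introduces baseline wander; the paper removes it with
# EMD-identified trend components -- here a linear detrend stands in.
recovered = detrend(integrated)
```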
6222 Semi-Automatic Method to Assist Expert for Association Rules Validation
Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen
Abstract:
In order to help the expert validate association rules extracted from data, some quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, which makes the manual rule-validation task very hard. To solve this problem, we propose, in this paper, a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize the association rules with their classification quality to give the expert an overview and to assist him during the validation process.
Keywords: Association rules, rule-based classification, classification quality, validation.
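A toy sketch of step (i), turning an association rule into a classification rule; the rule representation is illustrative, not the authors' formalism:

```python
# Toy sketch of step (i): turning an association rule into a
# classification rule (classifier). The rule format is illustrative.
def rule_to_classifier(antecedent, consequent):
    """antecedent: dict of attribute -> required value;
    consequent: class label predicted when the antecedent matches."""
    def classify(record):
        if all(record.get(a) == v for a, v in antecedent.items()):
            return consequent
        return None   # rule does not fire on this record
    return classify

clf = rule_to_classifier({"bread": 1, "butter": 1}, "buys_milk")
print(clf({"bread": 1, "butter": 1, "jam": 0}))  # buys_milk
print(clf({"bread": 1, "butter": 0}))            # None

# Step (ii) would classify the data with each generated classifier and
# record its classification quality (e.g. accuracy on records where the
# rule fires), which step (iii) visualizes for the expert.
```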
6221 Crack Opening Investigation in Fiberconcrete
Authors: Arturs Macanovskis, Vitalijs Lusis, Andrejs Krasnikovs
Abstract:
This work had three stages. In the first stage, the pull-out process was examined for a steel fiber embedded in concrete by one end and pulled out at an angle to the pulling force direction; the angle was varied. Jumps were observed on the obtained force-displacement diagrams. To explain this mechanical behavior, a microscopic experimental investigation of the fiber channel in the concrete surface was performed using a KEYENCE VHX2000 microscope. In the second stage, load versus crack opening displacement diagrams were obtained by breaking homogeneously reinforced and layered fiberconcrete prisms (with dimensions 10x10x40 cm) subjected to 4-point bending. After testing, the main crack was analyzed. In the third stage, a prediction model for fiberconcrete beam failure under bending was elaborated using the following data: a) diagrams for fibers pulled out at different angles; b) experimental data about the locations of straight steel fibers in the main crack. Experimental and theoretical (modeling) data were compared.
Keywords: Fiberconcrete, pull-out, fiber channel, layered fiberconcrete.
6220 An Attribute-Centre Based Decision Tree Classification Algorithm
Authors: Gökhan Silahtaroğlu
Abstract:
Decision tree algorithms occupy an important place among the classification models of data mining. In the literature, algorithms use the entropy concept or the Gini index to form the tree. The shape of the classes and their closeness to each other are some of the factors that affect the performance of an algorithm. In this paper we introduce a new decision tree algorithm which employs a data (attribute) folding method and the variation of the class variables over the branches to be created. A comparative performance analysis has been carried out between the proposed algorithm and C4.5.
Keywords: Classification, decision tree, split, pruning, entropy, Gini.
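For reference, the two split criteria named above can be computed as follows (a generic sketch of the standard measures; the paper's own attribute-folding criterion is not reproduced here):

```python
import numpy as np

def entropy(labels):
    # Shannon entropy of a class distribution
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(labels):
    # Gini impurity of a class distribution
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

y = np.array(["a", "a", "a", "b", "b", "c"])
print(entropy(y), gini(y))   # a split is chosen to reduce these values
```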
6219 Serviceability of Fabric-Formed Concrete Structures
Authors: Yadgar Tayfur, Antony Darby, Tim Ibell, Mark Evernden, John Orr
Abstract:
Fabric formwork is a technique for casting concrete structures with the great advantage of saving up to 40% of concrete material. This technique is particularly associated with optimized concrete structures, which usually have smaller cross-section dimensions than equivalent prismatic members. However, this can leave the structural system produced from these members with smaller serviceability safety margins. Therefore, it is very important to understand the serviceability of non-prismatic concrete structures. In this paper, an analytical computer-based model to optimize concrete beams and to predict the load-deflection behaviour of both prismatic and non-prismatic concrete beams is presented. The model was developed based on the method of sectional analysis and integration of curvatures. Results from the analytical model were compared to the load-deflection behaviour of a number of beams with different geometric and material properties reported by other researchers. The comparison shows that the analytical program can accurately predict the load-deflection response of concrete beams with medium reinforcement ratios, although it over-estimates deflection values for lightly reinforced specimens. Finally, the analytical program acceptably predicted the load-deflection behaviour of non-prismatic concrete beams.
Keywords: Concrete beams, deflections, fabric formwork, optimisation, serviceability.
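The integration-of-curvatures step can be illustrated with a minimal numerical sketch, assuming a simply supported prismatic beam under a uniformly distributed load with hypothetical stiffness values; curvature is integrated twice and the support conditions fix the integration constants:

```python
import numpy as np

def deflection_from_curvature(x, kappa):
    """Integrate curvature twice (trapezoidal rule) to get deflection,
    then enforce the simply supported conditions v(0) = v(L) = 0 by
    removing the linear term left by the integration constant."""
    dx = np.diff(x)
    theta = np.concatenate(([0.0], np.cumsum(0.5 * (kappa[1:] + kappa[:-1]) * dx)))
    v = np.concatenate(([0.0], np.cumsum(0.5 * (theta[1:] + theta[:-1]) * dx)))
    return v - v[-1] * x / x[-1]

L, EI, w = 4.0, 20e6, 10e3              # hypothetical span, stiffness, UDL
x = np.linspace(0.0, L, 401)
M = w * x * (L - x) / 2.0               # bending moment, simply supported beam
v = deflection_from_curvature(x, M / EI)

# Midspan deflection magnitude agrees with the closed form 5wL^4/(384EI)
print(abs(v[200]), 5 * w * L**4 / (384 * EI))
```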
6218 Innovation in Traditional Game: A Case Study of Trainee Teachers' Learning Experiences
Authors: Malathi Balakrishnan, Cheng Lee Ooi, Chander Vengadasalam
Abstract:
The purpose of this study is to explore trainee teachers' learning experiences in innovating traditional games during a traditional game carnival. It explores issues arising from multiple case studies of trainee teachers' learning experiences in innovating traditional games. A qualitative methodology was adopted through observations, semi-structured interviews and content analysis of reflective journals on trainee teachers' experiences creating and implementing innovative traditional games. Twelve groups of 36 trainee teachers who registered for the Sports and Physical Education Management Course were the participants in this research during the traditional game carnival. Semi-structured interviews were administered after the trainee teachers' learning experiences in creating innovative traditional games. Reflective journals were collected after the carnival day and their content analyzed. Inductive data analysis was used to evaluate the various data sources, and all the collected data were then processed with NVivo data analysis software. Inductive reasoning was interpreted based on Self-Determination Theory (SDT). The findings showed that the trainee teachers had positive game participation experiences, game knowledge about traditional games and positive motivation to innovate the games. The data also revealed the influence of themes such as cultural significance and creativity. It can be concluded from the findings that the organized game carnival, a coursework requirement of the Institute of Teacher Training Malaysia, was able to enhance the trainee teachers' innovative thinking skills. SDT, as a multidimensional approach to motivation, was utilized; therefore, teacher trainers may design further learning experiences using SDT.
Keywords: Learning experiences, innovation, traditional games, trainee teachers.
6217 Combining Bagging and Boosting
Authors: S. B. Kotsiantis, P. E. Pintelas
Abstract:
Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using a voting methodology over bagging and boosting ensembles, with 10 sub-classifiers in each. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as with other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
Keywords: Data mining, machine learning, pattern recognition.
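A minimal sketch of the voting-over-ensembles setup using scikit-learn; the decision-tree base learner and the dataset are assumptions, while the paper's 10 sub-classifiers per ensemble are kept:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Bagging and boosting ensembles, 10 sub-classifiers each, combined by
# majority voting -- mirroring the paper's setup.
vote = VotingClassifier([
    ("bag", BaggingClassifier(DecisionTreeClassifier(), n_estimators=10)),
    ("bst", AdaBoostClassifier(n_estimators=10)),
])
print(cross_val_score(vote, X, y, cv=5).mean())
```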
6216 Trend Analysis for Extreme Rainfall Events in New South Wales, Australia
Authors: Evan Hajani, Ataur Rahman, Khaled Haddad
Abstract:
Climate change will affect the hydrological cycle in many different ways, such as increases in evaporation and rainfall. There has been growing interest among researchers in identifying the nature of trends in historical rainfall data in many different parts of the world. This paper examines the trends in annual maximum rainfall data from 30 stations in New South Wales, Australia using two non-parametric tests, Mann-Kendall (MK) and Spearman's Rho (SR). Rainfall data were analyzed for fifteen different durations ranging from 6 min to 3 days. It is found that the sub-hourly durations (6, 12, 18, 24, 30 and 48 minutes) show statistically significant positive (upward) trends, whereas longer duration (sub-daily and daily) events generally show a statistically significant negative (downward) trend. It is also found that the MK and SR tests provide notably different results for some rainfall event durations considered in this study. Since shorter duration sub-hourly rainfall events show positive trends at many stations, the design rainfall data based on stationary frequency analysis for these durations need to be adjusted to account for the impact of climate change. These shorter durations are more relevant to many urban development projects based on smaller catchments having a much shorter response time.
Keywords: Climate change, Mann-Kendall test, Spearman’s Rho test, trends, design rainfall.
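Both non-parametric tests are straightforward to run; below is a sketch with a hypothetical annual maximum series, using a hand-rolled Mann-Kendall statistic (normal approximation, no tie correction) and scipy's Spearman's Rho against time:

```python
import numpy as np
from scipy.stats import norm, spearmanr

def mann_kendall(x):
    """Mann-Kendall trend test: S counts concordant minus discordant
    pairs; the normal approximation (without tie correction) gives Z."""
    x = np.asarray(x)
    n = len(x)
    s = np.sum([np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n)])
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return z, 2 * (1 - norm.cdf(abs(z)))

# Hypothetical annual maximum rainfall series for one station (mm)
am = np.array([42, 55, 48, 61, 58, 66, 59, 72, 70, 81], dtype=float)
z, p_mk = mann_kendall(am)
rho, p_sr = spearmanr(np.arange(am.size), am)
print(f"MK: Z={z:.2f}, p={p_mk:.3f};  SR: rho={rho:.2f}, p={p_sr:.3f}")
```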
6215 Application of Artificial Neural Network to Forecast Actual Cost of a Project to Improve Earned Value Management System
Authors: Seyed Hossein Iranmanesh, Mansoureh Zarezadeh
Abstract:
This paper presents an application of an Artificial Neural Network (ANN) to forecast the actual cost of a project based on the earned value management system (EVMS). For this purpose, some projects were randomly selected from a standard data set, and the necessary progress data were produced, such as actual cost, actual percent complete, baseline cost and percent complete for five periods of each project. Then an ANN with five inputs, five outputs and one hidden layer was trained to produce forecasted actual costs. The comparison between real and forecasted data shows good performance based on the Mean Absolute Percentage Error (MAPE) criterion. This approach could be applied to better forecast project cost, resulting in a decreased risk of project cost overrun, and is therefore beneficial for planning preventive actions.
Keywords: Earned Value Management System (EVMS), Artificial Neural Network (ANN), Estimate At Completion, Forecasting Methods, Project Performance Measurement.
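A minimal sketch of the five-input/five-output network described above, using scikit-learn's MLPRegressor on synthetic stand-in data (the hidden layer size and the data-generating assumptions are illustrative):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical EVMS training data: for each project, 5 inputs (e.g.
# baseline cost or percent complete per period) and 5 outputs (actual
# cost per period). Real data would come from the project archive.
rng = np.random.default_rng(0)
X = rng.uniform(0.1, 1.0, size=(60, 5))
Y = X * rng.uniform(0.9, 1.3, size=(60, 5))   # actual cost ~ baseline * factor

# One hidden layer, as in the paper; its size is an assumption.
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
ann.fit(X, Y)

mape = np.mean(np.abs((Y - ann.predict(X)) / Y)) * 100
print(f"training MAPE: {mape:.1f}%")
```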
6214 Overview of Development of a Digital Platform for Building Critical Infrastructure Protection Systems in Smart Industries
Authors: Bruno Vilić Belina, Ivan Župan
Abstract:
Smart industry concepts and digital transformation are very popular in many industries, which develop their own digital platforms that play an important role in innovations and transactions. The main idea of smart industry digital platforms is central data collection, industrial data integration and data usage for smart applications and services. This paper presents the development of a digital platform for building critical infrastructure protection systems in smart industries. It researches different service contracting modalities in Service Level Agreements (SLAs), Customer Relationship Management (CRM) relations, and trends and changes in business architectures (especially process business architecture), for the purpose of developing infrastructural production and distribution networks, information infrastructure meta-models and the generic processes demanded of critical infrastructure owners by critical infrastructure law, while satisfying cybersecurity requirements and taking hybrid threats into account.
Keywords: Cybersecurity, critical infrastructure, smart industries, digital platform.
6213 Using Artificial Neural Network to Predict Collisions on Horizontal Tangents of 3D Two-Lane Highways
Authors: Omer F. Cansiz, Said M. Easa
Abstract:
The purpose of this study is mainly to predict collision frequency on horizontal tangents combined with vertical curves using artificial neural network methods. The proposed ANN models are compared with existing regression models. First, the variables that affect collision frequency were investigated. It was found that only the annual average daily traffic, section length, access density, the rate of vertical curvature, and the smaller curve radius before and after the tangent were statistically significant for the related combinations. Second, three statistical models (negative binomial, zero-inflated Poisson and zero-inflated negative binomial) were developed using the significant variables for three alignment combinations. Third, ANN models were developed by applying the same variables for each combination. The results clearly show that the ANN models have lower mean square error values than the statistical models. Similarly, the AIC values of the ANN models are smaller than those of the regression models for all the combinations. Consequently, the ANN models have better statistical performance than the statistical models for estimating collision frequency. The ANN models presented in this paper are recommended for evaluating the safety impacts of 3D alignment elements on horizontal tangents.
Keywords: Collision frequency, horizontal tangent, 3D two-lane highway, negative binomial, zero-inflated Poisson, artificial neural network.
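As a sketch of the statistical side of the comparison, a negative binomial model can be fitted with statsmodels; the data below are synthetic stand-ins and the covariates are limited to two of the significant variables named above:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical section-level data: AADT, section length (km), and
# observed collision counts on horizontal tangents.
rng = np.random.default_rng(1)
aadt = rng.uniform(2000, 15000, 200)
length = rng.uniform(0.2, 3.0, 200)
mu = np.exp(-6.0 + 0.8 * np.log(aadt) + 0.9 * np.log(length))
y = rng.poisson(mu)                      # toy counts

X = sm.add_constant(np.column_stack([np.log(aadt), np.log(length)]))
nb = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()
print(nb.params, nb.aic)                 # AIC can be compared against ANN models
```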
6212 Customer Need Type Classification Model using Data Mining Techniques for Recommender Systems
Authors: Kyoung-jae Kim
Abstract:
Recommender systems are usually regarded as an important marketing tool in e-commerce. They use important information about users to facilitate accurate recommendation. This information includes user context, such as location, time and interest, for the personalization of mobile users. We can easily collect information about location and time because mobile devices communicate with the base station of the service provider. However, information about user interest can't be easily collected because user interest cannot be captured automatically without the user's approval. User interest is usually represented as a need. In this study, we classify needs into two types according to prior research. This study investigates the usefulness of data mining techniques for classifying user need type for recommender systems. We employ several data mining techniques, including artificial neural networks, decision trees, case-based reasoning, and multivariate discriminant analysis. Experimental results show that the CHAID algorithm outperforms the other models in classifying user need type. This study performs the McNemar test to examine the statistical significance of the differences in classification results. The results of the McNemar test also show that CHAID performs better than the other models with statistical significance.
Keywords: Customer need type, data mining techniques, recommender system, personalization, mobile user.
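CHAID itself is not part of the common Python libraries, but the McNemar comparison step is; a sketch with a hypothetical 2x2 agreement table between two classifiers evaluated on the same test set:

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical agreement table: rows = model A correct/wrong,
# columns = model B correct/wrong, counted over the same test records.
table = np.array([[520, 32],
                  [15, 433]])
result = mcnemar(table, exact=False, correction=True)
print(result.statistic, result.pvalue)   # difference significant if p < 0.05
```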
6211 Innovation Knowledge and Capability, Work Efficiency of Accountants and the Success of SME Registered in Nakorn Pathom Province
Authors: Autjira Songan, Supattra Kanchanopast
Abstract:
The objectives of this research were to compare the success of SMEs registered in Nakorn Pathom Province classified by personal data, to study the relations between innovation knowledge and capability and the success of SMEs registered in Nakorn Pathom Province, and to study the relations between work efficiency and the success of SMEs registered in Nakorn Pathom Province. A questionnaire was utilized as a tool to collect data. Statistics utilized in this research included frequency, percentage, mean, standard deviation, and multiple regression analysis. Data were analyzed using the Statistical Package for the Social Sciences. The findings revealed that the majority of respondents were male, between 25-34 years old, held an undergraduate degree, and were married and staying together. The average income of respondents was between 10,001-20,000 baht. It was also found that, in terms of innovation knowledge and capability, two variables had an influence on innovation knowledge and capability and innovation evaluation: physical characteristics and the innovation process.
Keywords: Accountants, innovation, knowledge, work efficiency.
6210 A Generalized Sparse Bayesian Learning Algorithm for Near-Field Synthetic Aperture Radar Imaging: By Exploiting Impropriety and Noncircularity
Authors: Pan Long, Bi Dongjie, Li Xifeng, Xie Yongle
Abstract:
Near-field synthetic aperture radar (SAR) imaging is an advanced nondestructive testing and evaluation (NDT&E) technique. This paper investigates the complex-valued signal processing related to the near-field SAR imaging system, where the measurement data turn out to be noncircular and improper, meaning that the complex-valued data are correlated with their complex conjugate. Furthermore, we discover that the degree of impropriety of the measurement data and that of the target image can be highly correlated in near-field SAR imaging. Based on these observations, a modified generalized sparse Bayesian learning algorithm is proposed, taking impropriety and noncircularity into account. Numerical results show that the proposed algorithm provides a performance gain with the help of the noncircular assumption on the signals.
Keywords: Complex-valued signal processing, synthetic aperture radar (SAR), 2-D radar imaging, compressive sensing, sparse Bayesian learning.
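The degree of impropriety mentioned above can be quantified by the circularity coefficient, the ratio of the pseudo-variance magnitude |E[z^2]| to the variance E[|z|^2]; a short check on synthetic proper and improper signals:

```python
import numpy as np

def circularity_coefficient(z):
    """Degree of impropriety of a zero-mean complex signal:
    |pseudo-variance E[z^2]| / variance E[|z|^2]. It is 0 for a proper
    (circular) signal and approaches 1 for a maximally improper one."""
    z = z - z.mean()
    return np.abs(np.mean(z * z)) / np.mean(np.abs(z) ** 2)

rng = np.random.default_rng(0)
proper = rng.standard_normal(10000) + 1j * rng.standard_normal(10000)
x = rng.standard_normal(10000)
improper = x + 1j * (0.9 * x + 0.1 * rng.standard_normal(10000))

print(circularity_coefficient(proper))    # ~0: real and imaginary parts independent
print(circularity_coefficient(improper))  # close to 1: strongly improper
```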
6209 Validity and Reliability of Competency Assessment Implementation (CAI) Instrument Using Rasch Model
Authors: Nurfirdawati Muhamad Hanafi, Azmanirah Ab Rahman, Marina Ibrahim Mukhtar, Jamil Ahmad, Sarebah Warman
Abstract:
This study was conducted to generate empirical evidence on the validity and reliability of the items of the Competency Assessment Implementation (CAI) instrument using the Rasch model for polytomous data, aided by Winstep software version 3.68. The construct validity was examined by analyzing the point-measure correlation index (PTMEA) and the infit and outfit MNSQ values, while the reliability was examined by analyzing the item reliability index. A survey technique was used as the major method, administering the CAI instrument to 156 teachers from vocational schools. The results show that the reliability of the CAI instrument items was between 0.80 and 0.98. The PTMEA correlations are positive, indicating that the items are able to distinguish between respondents' abilities. The statistical data obtained show that, out of 154 items, 12 items are suggested to be omitted from the instrument. It is hoped that this study can bring a new direction to the process of data analysis in educational research.
Keywords: Competency Assessment, Reliability, Validity, Item Analysis.
6208 A Delay-Tolerant Distributed Query Processing Architecture for Mobile Environment
Authors: T.P. Andamuthu, Dr. P. Balasubramanie
Abstract:
Intermittent connectivity modifies the "always on" network assumption made by all distributed query processing systems. In modern-day systems, the absence of network connectivity is considered a fault. It might not be feasible to transmit all the data accumulated since the last upload right away over the available connection, and vital information may be delayed excessively when less important information takes its place. Owing to the restricted and uneven bandwidth, it is vital that the mobile nodes make the most advantageous use of the connectivity when it arrives. Hence, in order to select the data that need to be transmitted first, some sort of data prioritization is essential. A continuous query processing system for intermittently connected mobile networks, comprising a delay-tolerant continuous query processor distributed across the mobile hosts, is proposed in this paper. In addition, a mechanism for prioritizing query results has been designed that guarantees enhanced accuracy and reduced delay. Extensive simulation results illustrate that our architecture reduces client power consumption and increases query efficiency.
Keywords: Broadcast, location, mobile host, mobility, query.
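The prioritization idea can be sketched with a simple max-priority queue: results accumulate while disconnected, and when a connection window opens, the most important ones are drained first. The priority values and class design below are illustrative, not the paper's actual mechanism:

```python
import heapq

class ResultUploader:
    """Toy prioritization of accumulated query results: when a
    connection window opens, the most important results are sent
    first. The priority function is an illustrative stand-in."""
    def __init__(self):
        self.heap = []          # (negative priority, seq, result)
        self.seq = 0

    def accumulate(self, result, priority):
        heapq.heappush(self.heap, (-priority, self.seq, result))
        self.seq += 1

    def drain(self, budget):
        # Transmit at most 'budget' results over the short-lived link
        sent = []
        while self.heap and len(sent) < budget:
            sent.append(heapq.heappop(self.heap)[2])
        return sent

up = ResultUploader()
up.accumulate("routine reading", priority=1)
up.accumulate("threshold alarm", priority=9)
up.accumulate("location update", priority=5)
print(up.drain(budget=2))   # ['threshold alarm', 'location update']
```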
6207 Web Traffic Mining using Neural Networks
Authors: Farhad F. Yusifov
Abstract:
With the explosive growth of data available on the Internet, personalization of this information space becomes a necessity. With the rapidly increasing popularity of the WWW, websites play a crucial role in conveying knowledge and information to end users. Discovering hidden and meaningful information about Web users' usage patterns is critical for determining effective marketing strategies and for optimizing Web server usage to accommodate future growth. The task of mining useful information becomes more challenging when the Web traffic volume is enormous and keeps on growing. In this paper, we propose an intelligent model to discover and analyze useful knowledge from the available Web log data.
Keywords: Clustering, self-organizing map, Web log files, Web traffic.
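Given the self-organizing map keyword, a sketch of SOM-based clustering of Web-log sessions using the open-source minisom package (a stand-in implementation, not the paper's model; the session features are hypothetical):

```python
import numpy as np
from minisom import MiniSom   # third-party package: pip install minisom

# Hypothetical Web-log features per session, already normalized to
# [0, 1]: pages viewed, session duration, depth, and return-visit flag.
rng = np.random.default_rng(0)
sessions = rng.random((500, 4))

som = MiniSom(6, 6, 4, sigma=1.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(sessions)
som.train_random(sessions, 1000)   # 1000 training iterations

# Each session maps to its best-matching unit -- a usage-pattern cluster
clusters = [som.winner(s) for s in sessions]
print(clusters[:5])
```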
6206 Practical Guidelines and Examples for the Users of the TMS320C6713 DSK
Authors: Abdullah A Wardak
Abstract:
This paper describes how the correct endian mode of the TMS320C6713 DSK board can be identified. It also explains how the TMS320C6713 DSK board can be used in the little endian and big endian modes for assembly language programming in particular and for signal processing in general. Similarly, it discusses how crucially important it is for a user of the TMS320C6713 DSK board to identify the mode of operation and then use it correctly during the development stages of assembly language programming; otherwise, it will cause unnecessary confusion and erroneous results as far as storing data into the memory and loading data from the memory are concerned. Furthermore, it highlights and strongly recommends that users of the TMS320C6713 DSK board be aware of the availability and importance of the various display options in Code Composer Studio (CCS) for correctly interpreting and displaying the desired data in memory. The information presented in this paper will be of great importance and interest to those practitioners and developers who want to use the TMS320C6713 DSK board for assembly language programming as well as input-output signal processing manipulations. Finally, examples that clearly illustrate the concept are presented.
Keywords: Assembly language programming, big endian mode, little endian mode, signal processing.
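Endianness is easy to demonstrate on any host: packing the same 32-bit value with explicit byte orders, and unpacking it with the wrong assumed order, produces exactly the kind of scrambled result the paper warns about when the DSK's mode is misidentified (a Python illustration, not DSK code):

```python
import struct
import sys

# Host byte order (the DSK's mode must be checked on the board itself;
# this just illustrates what endianness means for stored data).
print(sys.byteorder)                     # 'little' or 'big'

value = 0x12345678
little = struct.pack("<I", value)        # explicit little-endian layout
big = struct.pack(">I", value)           # explicit big-endian layout
print(little.hex())                      # 78563412
print(big.hex())                         # 12345678

# Reading the bytes back with the wrong assumed order scrambles the
# value -- the same class of error as storing/loading data on the
# TMS320C6713 in the wrong mode.
print(hex(struct.unpack(">I", little)[0]))   # 0x78563412, not 0x12345678
```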
6205 A Formulation of the Latent Class Vector Model for Pairwise Data
Authors: Tomoya Okubo, Kuninori Nakamura, Shin-ichi Mayekawa
Abstract:
In this research, a latent class vector model for pairwise data is formulated. As compared to the basic vector model, this model yields consistent estimates of the parameters since the number of parameters to be estimated does not increase with the number of subjects. The result of the analysis reveals that the model was stable and could classify each subject to the latent classes representing the typical scales used by these subjects.
Keywords: Finite mixture models, latent class analysis, Thurstone's paired comparison method, vector model.
6204 Comparison and Characterization of Dyneema™ HB-210 and HB-212 for Accelerated UV Aging
Authors: Jonmichael A. Weaver, David A. Miller
Abstract:
Ultra High Molecular Weight Polyethylene (UHMWPE) presents several distinct advantages as a material, including a high strength-to-weight ratio, durability, and neutron stability. Understanding the change in the mechanical performance of UHMWPE due to environmental exposure is key to safety in future applications. Dyneema® HB-210, a 15 µm diameter UHMWPE multi-filament fiber laid up in a polyurethane matrix in a [0/90]2 orientation with a thickness of 0.17 mm, is compared to the same fiber and orientation system, HB-212, with a rubber-based matrix, under UV aging conditions. UV aging tests according to ASTM G154 were performed on both HB-210 and HB-212 to interrogate the change in mechanical properties, as measured through dynamic mechanical analysis and imaged using a scanning electron microscope. The results showed a decrease in both the storage modulus and the loss modulus of the aged material compared to the unaged, even though tan δ slightly increased. Material degradation occurred at a higher rate in Dyneema® HB-212 than in HB-210. The HB-210 was characterized for the effects of 100 hours of UV aging via dynamic mechanical analysis. Scanning electron microscope images were taken of the HB-210 and HB-212 to identify the primary damage mechanisms in the matrix. Embrittlement and matrix spall were the products of prolonged UV exposure and erosion, resulting in decreased mechanical properties.
Keywords: Composite materials, material characterization, UV aging, UHMWPE.
6203 Detecting Circles in Image Using Statistical Image Analysis
Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee
Abstract:
The aim of this work is to detect geometrical shape objects in an image. In this paper, the object is considered to be a circle. The identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, this paper presents an algorithm that combines statistical approaches with image analysis techniques. The algorithm was evaluated using simulated data, where it yielded good results, and was then applied to real data.
Keywords: Image processing, median filter, projection, scale-space, segmentation, threshold.
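A sketch of one such statistical pipeline on synthetic data, following the keywords (median filter, threshold, segmentation); the parameters are illustrative and this is not the authors' exact algorithm:

```python
import numpy as np
from scipy import ndimage

# Synthetic test image: two bright circles on a noisy background
h = w = 128
yy, xx = np.mgrid[:h, :w]
img = 20 * np.random.rand(h, w)
for (cy, cx, r) in [(40, 40, 10), (90, 80, 15)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] += 100

# Median filter -> threshold -> connected components, a simple pipeline
# in the spirit of the paper (filter size and threshold are illustrative)
smooth = ndimage.median_filter(img, size=5)
binary = smooth > 60
labels, num = ndimage.label(binary)
centers = ndimage.center_of_mass(binary, labels, range(1, num + 1))
sizes = ndimage.sum(binary, labels, range(1, num + 1))
radii = np.sqrt(np.array(sizes) / np.pi)   # radius from area, assuming circles
print(num, centers, radii)                 # number, locations, sizes
```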
6202 Idiopathic Constipation can be Subdivided in Clinical Subtypes: Data Mining by Cluster Analysis on a Population based Study
Authors: Mauro Giacomini, Stefania Bertone, Carlo Mansi, Pietro Dulbecco, Vincenzo Savarino
Abstract:
The prevalence of non-organic constipation differs from country to country, and the reliability of the estimated rates is uncertain. Moreover, the clinical relevance of subdividing the heterogeneous functional constipation disorders into pre-defined subgroups is largely unknown. Aim: to estimate the prevalence of constipation in a population-based sample and determine whether clinical subgroups can be identified. An age- and gender-stratified sample population from 5 Italian cities was evaluated using a previously validated questionnaire. Data mining by cluster analysis was used to determine constipation subgroups. Results: 1,500 complete interviews were obtained from 2,083 contacted households (72%). Self-reported constipation correlated poorly with symptom-based constipation, which was found in 496 subjects (33.1%). Cluster analysis identified four constipation subgroups, which correlated with subgroups identified according to pre-defined symptom criteria. Significant differences in socio-demographics and lifestyle were observed among subgroups.
Keywords: Cluster analysis, constipation, data mining, statistical analysis.
6201 Manifold Analysis by Topologically Constrained Isometric Embedding
Authors: Guy Rosman, Alexander M. Bronstein, Michael M. Bronstein, Ron Kimmel
Abstract:
We present a new algorithm for nonlinear dimensionality reduction that consistently uses global information, and that enables understanding the intrinsic geometry of non-convex manifolds. Compared to methods that consider only local information, our method appears to be more robust to noise. Unlike most methods that incorporate global information, the proposed approach automatically handles non-convexity of the data manifold. We demonstrate the performance of our algorithm and compare it to state-of-the-art methods on synthetic as well as real data.
Keywords: Dimensionality reduction, manifold learning, multidimensional scaling, geodesic distance, boundary detection.
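For comparison, the classical Isomap pipeline (geodesic distances on a neighborhood graph followed by multidimensional scaling) is available in scikit-learn; unlike the proposed method, it does not explicitly handle non-convexity, which is precisely the gap the paper addresses:

```python
import numpy as np
from sklearn.manifold import Isomap

# Swiss-roll-like synthetic manifold data
rng = np.random.default_rng(0)
t = 3 * np.pi * (1 + 2 * rng.random(800)) / 2
X = np.column_stack([t * np.cos(t), 20 * rng.random(800), t * np.sin(t)])

# Isomap: geodesic distances on a k-nearest-neighbor graph + classical MDS
emb = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(emb.shape)   # (800, 2)
```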
6200 Empirical Roughness Progression Models of Heavy Duty Rural Pavements
Authors: Nahla H. Alaswadko, Rayya A. Hassan, Bayar N. Mohammed
Abstract:
Empirical deterministic models have been developed to predict the roughness progression of heavy duty spray sealed pavements for a dataset representing rural arterial roads. The dataset provides a good representation of the relevant network and covers a wide range of operating and environmental conditions. A large sample of historical time series data for many pavement sections was collected and prepared for use in multilevel regression analysis. The modelling parameters include road roughness as the performance parameter, and traffic loading, time, initial pavement strength, reactivity level of the subgrade soil, climate condition, and condition of the drainage system as predictor parameters. The purpose of this paper is to report the approaches adopted for model development and validation. The study presents multilevel models that can account for the correlation among time series data of the same section and capture the effect of unobserved variables. Study results show that the models fit the data very well. The contribution and significance of the relevant influencing factors in predicting roughness progression are presented and explained. The paper concludes that the analysis approach used for developing the models confirmed their accuracy and reliability, with the models fitting the validation data well.
Keywords: Roughness progression, empirical model, pavement performance, heavy duty pavement.
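A multilevel (mixed-effects) specification of the kind described can be sketched with statsmodels; the panel below is synthetic, with a random intercept per pavement section capturing the correlation among repeated surveys of the same section:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: repeated roughness surveys on many pavement sections
rng = np.random.default_rng(0)
n_sections, n_years = 40, 8
df = pd.DataFrame({
    "section": np.repeat(np.arange(n_sections), n_years),
    "age": np.tile(np.arange(n_years), n_sections),
})
sec_eff = rng.normal(0, 0.3, n_sections)          # unobserved section effects
df["traffic"] = np.repeat(rng.uniform(0.5, 2.0, n_sections), n_years)
df["iri"] = (1.8 + sec_eff[df["section"]] + 0.08 * df["age"] * df["traffic"]
             + rng.normal(0, 0.05, len(df)))

# Random intercept per section accounts for within-section correlation
model = smf.mixedlm("iri ~ age * traffic", df, groups=df["section"]).fit()
print(model.summary())
```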