Search results for: monitoring data
7485 Using Metacognitive Strategies in Reading Comprehension by EFL Students
Authors: Simin Sadeghi-Saeb
Abstract:
Metacognitive strategies consistently play important roles in reading comprehension. They involve the active monitoring and consequent regulation and orchestration of cognitive processes in relation to the cognitive objects or data on which they bear. This paper studies the effect of instruction in metacognitive strategies on reading academic materials, the types of metacognitive strategies college students used most before and after the instruction, and the level at which they used those strategies before and after the instruction. For these aims, 50 female college students were chosen and divided randomly into two groups, experimental and control. At the first session, students in both groups took a standard TOEFL exam. After the pre-test had been administered, the instruction began. After the treatment, a post-test was taken. After both the pre-test and the post-test, the same questionnaire was handed to the students of the experimental group. The results of this research show that instruction in metacognitive strategies has a positive effect on students' scores in reading comprehension tests. Furthermore, the students' usage of metacognitive strategies changed between before and after the instruction, and the instruction affected the level of the students' metacognitive strategy usage.
Keywords: EFL students, English reading comprehension, instruction, metacognitive strategies.
7484 ADABeV: Automatic Detection of Abnormal Behavior in Video-surveillance
Authors: Nour Charara, Iman Jarkass, Maria Sokhn, Elena Mugellini, Omar Abou Khaled
Abstract:
Intelligent Video-Surveillance (IVS) systems are becoming more and more popular in security applications. The analysis and recognition of abnormal behaviours in a video sequence has gradually drawn attention in the field of IVS, since it allows a large amount of useless information to be filtered out, which guarantees high efficiency in security protection and saves considerable human and material resources. We present in this paper ADABeV, an intelligent video-surveillance framework for event recognition in crowded scenes to detect abnormal human behaviour. This framework is intended to achieve real-time alarming, reducing the lags in traditional monitoring systems. The proposed architecture addresses four main challenges: behaviour understanding in crowded scenes, hard lighting conditions, multiple kinds of input sensors, and context-based adaptability to recognize the active context of the scene.
Keywords: Behavior recognition, Crowded scene, Data fusion, Pattern recognition, Video-surveillance
7483 Human Growth Curve Estimation through a Combination of Longitudinal and Cross-sectional Data
Authors: Sedigheh Mirzaei S., Debasis Sengupta
Abstract:
Parametric models have been quite popular for studying human growth, particularly in relation to biological parameters such as peak size velocity and age at peak size velocity. Longitudinal data are generally considered to be vital for fitting a parametric model to individual-specific data and for studying the distribution of these biological parameters in a human population. However, cross-sectional data are easier to obtain than longitudinal data. In this paper, we present a method of combining longitudinal and cross-sectional data for the purpose of estimating the distribution of the biological parameters. We demonstrate, through simulations in the special case of the Preece-Baines model, how estimates based on longitudinal data can be improved upon by harnessing the information contained in cross-sectional data. We study the extent of improvement for different mixes of the two types of data, and finally illustrate the use of the method through data collected by the Indian Statistical Institute.
Keywords: Preece-Baines growth model, MCMC method, Mixed effect model
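For reference, Model 1 of the Preece-Baines family, the parametric form named in the abstract, is usually written as follows (a standard statement of the model, as in Preece and Baines, 1978; the abstract does not repeat the formula):

```latex
\begin{equation}
  h(t) = h_1 - \frac{2\,(h_1 - h_\theta)}{\exp\!\bigl[s_0 (t-\theta)\bigr] + \exp\!\bigl[s_1 (t-\theta)\bigr]}
\end{equation}
% h(t): size at age t;  h_1: adult size;  h_\theta: size at age \theta;
% s_0, s_1: rate constants controlling pre- and post-spurt growth velocity.
```

Peak size velocity and age at peak size velocity, the biological parameters of interest, are smooth functions of (h_1, h_theta, theta, s_0, s_1).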
7482 Data Migration between Document-Oriented and Relational Databases
Authors: Bogdan Walek, Cyril Klimes
Abstract:
Current tools for data migration between document-oriented and relational databases have several disadvantages. We propose a new approach for data migration between document-oriented and relational databases. During data migration, the relational schema of the target (relational database) is automatically created from a collection of XML documents. The proposed approach is verified on data migration between the document-oriented database IBM Lotus Notes/Domino and a relational database implemented in the relational database management system (RDBMS) MySQL.
Keywords: data migration, database, document-oriented database, XML, relational schema
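A minimal sketch of the schema-derivation step, under assumptions: each top-level element of the XML documents maps to a column, columns are unioned across the collection, and every type defaults to TEXT. The table name and typing rule are hypothetical; the paper's actual mapping algorithm is not reproduced here.

```python
# Sketch: derive a flat relational schema from a collection of XML documents.
import xml.etree.ElementTree as ET

def infer_schema(xml_docs, table_name="migrated_docs"):
    columns = {}
    for doc in xml_docs:
        root = ET.fromstring(doc)
        for child in root:
            # Union element names across documents; default every type to TEXT.
            columns.setdefault(child.tag, "TEXT")
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns.items())
    return f"CREATE TABLE {table_name} (\n  id INTEGER PRIMARY KEY,\n  {cols}\n);"

docs = ["<note><title>a</title><body>b</body></note>",
        "<note><title>c</title><author>d</author></note>"]
print(infer_schema(docs))  # columns: title, body, author
```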
7481 Identity Verification Using k-NN Classifiers and Autistic Genetic Data
Authors: Fuad M. Alkoot
Abstract:
DNA data have been used in forensics for decades. However, current research looks at using DNA as a biometric identity verification modality, with the goal of improving the speed of identification. We aim to use gene data that was initially collected for autism detection to find whether, and how accurately, these data can be used for identification applications. Mainly, our goal is to find whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with using the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1, and that the classification rate remains close to optimal as the noise standard deviation increases up to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
Keywords: Biometrics, identity verification, genetic data, k-nearest neighbor.
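A minimal sketch of the noise-corruption experiment described above, under assumptions: synthetic feature vectors stand in for the preprocessed gene data, one reference vector per subject, and k = 1 (the abstract fixes neither k nor the feature dimensionality).

```python
# Sketch: identify subjects with 1-NN after corrupting probes with Gaussian noise.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))     # stand-in gallery: one vector per subject
y = np.arange(50)                 # subject identities

knn = KNeighborsClassifier(n_neighbors=1).fit(X, y)
for sigma in (0.5, 1.0, 3.0):
    probes = X + rng.normal(scale=sigma, size=X.shape)  # noise-corrupted test set
    print(f"sigma={sigma}: identification rate={knn.score(probes, y):.2f}")
```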
7480 Modeling, Simulation and Monitoring of Nuclear Reactor Using Directed Graph and Bond Graph
Authors: A. Badoud, M. Khemliche, S. Latreche
Abstract:
The main objective of this paper is to develop a graphical technique for the modeling, simulation and diagnosis of industrial systems. This is especially important for a complex system such as a pressurized water nuclear reactor, which exhibits several non-linearities and time scales; in that case the analytical approach is cumbersome and does not give quick insight into the evolution of the system. The Bond Graph tool enabled us to transform the analytical model into a graphical model, and the simulation software SYMBOLS 2000, specific to Bond Graphs, made it possible to validate the model and obtain the results given by the technical specifications. We analyze the problem of fault localization and identification in complex industrial processes, propose a fault detection method applied to diagnosis and to the assessment of the severity of a detected fault, and show how the new diagnosis approaches can be applied to complex system control. Industrial systems have become increasingly complex, and fault diagnosis procedures for physical systems become very involved as soon as the systems considered are no longer elementary. Faced with this complexity, we resort to the Fault Detection and Isolation (FDI) method, analyzing the associated control problem and designing a reliable diagnosis system capable of handling spatially distributed complex dynamic systems, applied to the standard pressurized water nuclear reactor.
Keywords: Bond Graph, Modeling, Simulation, Monitoring, Analytical Redundancy Relations, Pressurized Water Reactor, Directed Graph.
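A hedged sketch of the residual idea behind analytical redundancy relations: a residual is the discrepancy between a measured output and the output reconstructed from the model, and a fault is flagged when the residual crosses a threshold. The first-order nominal response, noise level and threshold below are illustrative assumptions, not the reactor model used in the paper.

```python
# Sketch: residual-based fault detection from an analytical redundancy relation.
import numpy as np

def detect_faults(measured, model_output, threshold):
    residual = measured - model_output       # redundancy residual
    return np.abs(residual) > threshold      # True where a fault is flagged

t = np.linspace(0.0, 10.0, 200)
model_output = 1.0 - np.exp(-t)              # assumed nominal response
measured = model_output + 0.01 * np.random.default_rng(3).normal(size=t.size)
measured[120:] += 0.2                        # injected sensor fault
alarms = detect_faults(measured, model_output, threshold=0.1)
print("first alarm at t =", float(t[alarms][0]))
```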
7479 Power Saving System in Green Data Center
Authors: Joon-young Jung, Dong-oh Kang, Chang-seok Bae
Abstract:
Power consumption in data centers is increasing rapidly because the number of data centers is growing and their scale is becoming larger. Therefore, reducing power consumption in the data center is one of the key research topics. The peak power of a typical server is around 250 watts. When a server is idle, it continues to use around 60% of the power consumed when in use, though vendors are putting effort into reducing this "idle" power load. Servers tend to work at only around a 5% to 20% utilization rate, partly because of response time concerns, and on average 10% of the servers in data centers are unused. For these reasons, we propose a dynamic power management system to reduce power consumption in the green data center. Experimental results show that power consumption at idle time is reduced by about 55%.
Keywords: Data Center, Green IT, Management Server, Power Saving.
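An illustrative sketch of one dynamic power management policy consistent with the abstract (not the authors' system): suspend servers whose utilization sits below a floor while keeping a spare idle server on for response-time concerns. The threshold and spare margin are assumptions; the 250 W peak and ~60% idle fraction come from the abstract.

```python
# Sketch: suspend persistently idle servers, keeping `spare` idle servers on.
PEAK_W, IDLE_FRACTION = 250, 0.60   # figures quoted in the abstract

def plan_power(utilizations, idle_threshold=0.05, spare=1):
    decisions, idle_kept = [], 0
    for u in utilizations:
        if u < idle_threshold and idle_kept >= spare:
            decisions.append("suspend")
        else:
            decisions.append("on")
            if u < idle_threshold:
                idle_kept += 1      # this idle server stays on as the margin
    saved = decisions.count("suspend") * PEAK_W * IDLE_FRACTION
    return decisions, saved

decisions, saved_w = plan_power([0.15, 0.02, 0.01, 0.20, 0.03])
print(decisions, f"~{saved_w:.0f} W saved")
```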
7478 Invasion of Pectinatella magnifica in Freshwater Resources of the Czech Republic
Authors: J. Pazourek, K. Šmejkal, P. Kollár, J. Rajchard, J. Šinko, Z. Balounová, E. Vlková, H. Salmonová
Abstract:
Pectinatella magnifica (Leidy, 1851) is an invasive freshwater animal that lives in colonies. A colony of Pectinatella magnifica (a gelatinous blob) can be up to several feet in diameter, and under favorable conditions it exhibits an extreme growth rate. Recently, European countries along the Elbe, Oder, Danube, Rhine and Vltava rivers have confirmed the invasion of Pectinatella magnifica, including freshwater reservoirs in South Bohemia (Czech Republic). Our project (Czech Science Foundation, GAČR P503/12/0337) focuses on the biology and chemistry of Pectinatella magnifica. We have monitored the organism's occurrence in selected South Bohemian ponds and sandpits over recent years, collecting information about the physical properties of the surrounding water and sampling the colonies for various analyses (classification, maps of secondary metabolites, toxicity tests). Because the gelatinous matrix also hosts algae, bacteria and cyanobacteria (co-habitants) during the colony's lifetime, in this contribution we also applied a high performance liquid chromatography (HPLC) method for the determination of potentially present cyanobacterial toxins (microcystin-LR, microcystin-RR, nodularin). Results from the last three years of monitoring show that these toxins are below the limit of detection (LOD), so they do not yet represent a danger. The final goal of our study is to assess toxicity risks related to freshwater resources invaded by Pectinatella magnifica, and to understand the process of invasion, which may make it possible to control it.
Keywords: Cyanobacteria, freshwater resources, Pectinatella magnifica invasion, toxicity monitoring.
7477 Spatial Econometric Approaches for Count Data: An Overview and New Directions
Authors: Paula Simões, Isabel Natário
Abstract:
This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches that are available to model data collected with reference to location in space, from the classical spatial econometrics approaches to the recent developments on spatial econometrics to model count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modeling and analysis of spatial data, in order to look for new possible directions on the processing of count data in a spatial hierarchical Bayesian econometric context.
Keywords: Spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data.
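As a concrete instance of the model class the review covers, one standard Bayesian hierarchical formulation of a spatial-lag count model can be sketched as follows (a generic textbook form, not a specific result of the paper):

```latex
\begin{align}
  y_i \mid \mu_i &\sim \mathrm{Poisson}(\mu_i), \\
  \log \mu_i &= \mathbf{x}_i^{\top}\boldsymbol{\beta} + \phi_i, \\
  \boldsymbol{\phi} &= \rho W \boldsymbol{\phi} + \boldsymbol{\varepsilon},
    \qquad \boldsymbol{\varepsilon} \sim \mathrm{N}(\mathbf{0}, \sigma^2 I),
\end{align}
% W is the spatial weight matrix and \rho the spatial lag autocorrelation, so
% the latent effects satisfy
% \phi \sim N(0, \sigma^2 [(I-\rho W)^\top (I-\rho W)]^{-1});
% priors on \beta, \rho and \sigma^2 complete the hierarchy.
```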
7476 Nonlinear Estimation Model for Rail Track Deterioration
Authors: M. Karimpour, L. Hitihamillage, N. Elkhoury, S. Moridpour, R. Hesami
Abstract:
Rail transport authorities around the world have long faced a significant challenge in predicting rail infrastructure maintenance work. Generally, maintenance monitoring and prediction are conducted manually. Under economic constraints, rail transport authorities are pursuing improved modern methods that can provide precise prediction of rail maintenance time and location. The expectation from such methods is to develop models that minimize the human error strongly associated with manual prediction. Such models will help authorities understand how track degradation occurs over time under changes in different conditions (e.g. rail load, rail type, rail profile). They need a well-structured technique to identify the precise time at which rail tracks fail, in order to minimize maintenance cost and time and to keep vehicles safe. The rail track characteristics that have been collected over the years will be used in developing rail track degradation prediction models. Since these data have been collected in large volumes, both electronically and manually, some errors are possible, and sometimes these errors make the data unusable in prediction model development. This is one of the major drawbacks in rail track degradation prediction. An accurate model can play a key role in estimating the long-term behavior of rail tracks: accurate models increase track safety and decrease the cost of maintenance in the long term. In this research, a short review of rail track degradation prediction models is given before estimating rail track degradation for the curve sections of the Melbourne tram track system using an Adaptive Network-based Fuzzy Inference System (ANFIS) model.
Keywords: ANFIS, MGT, Prediction modeling, rail track degradation.
7475 Web Application for University Internship Program Management
Authors: Prasanth Sabarish Nair, Thomas Binu, Madiajagan Muthaiyan
Abstract:
This paper discusses a software application to aid in the smooth functioning of a university internship program, including student, faculty and administration modules. The software can also compute the most apt assignment of students to stations and allocate them accordingly.
Keywords: Academic evaluation, administration monitoring, automatic allocation system, internship, student preferences.
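One way to compute such a "most apt" allocation is to treat it as an assignment problem over a preference cost matrix and solve it with the Hungarian algorithm; the sketch below assumes rank-style preferences, since the paper does not specify its allocation method.

```python
# Sketch: optimal student-to-station allocation via the Hungarian algorithm.
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j] = student i's rank for station j (lower = more preferred)
cost = np.array([[1, 2, 3],
                 [2, 1, 3],
                 [3, 2, 1]])
students, stations = linear_sum_assignment(cost)
for s, st in zip(students, stations):
    print(f"student {s} -> station {st} (preference rank {cost[s, st]})")
```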
7474 A Model Predicting the Microbiological Quality of Aquacultured Sea Bream (Sparus aurata) According to Physicochemical Data: An Application in Western Greece Fish Aquaculture
Authors: Joan Iliopoulou-Georgudaki, Chris Theodoropoulos, Danae Venieri, Maria Lagkadinou
Abstract:
Monitoring of the microbial flora of aquacultured sea bream, in relation to the physicochemical parameters of the rearing seawater, led to a model describing the influence of the latter on the quality of the fisheries. Fish were sampled over eight months from four aqua farms in Western Greece and analyzed for psychrotrophic and H2S-producing bacteria, Salmonella sp., and heterotrophic plate count (PCA), with simultaneous physical evaluation. Temperature, dissolved oxygen, pH, conductivity, TDS, salinity, NO3- and NH4+ ions were recorded. Temperature, dissolved oxygen and conductivity were correlated, respectively, to PCA, Pseudomonas sp. and Shewanella sp. counts. These parameters were the inputs of the model, which produced as outputs the predicted PCA, Vibrio sp., Pseudomonas sp. and Shewanella sp. counts, and the fish microbiological quality. The present study provides, for the first time, a ready-to-use predictive model of fisheries hygiene, leading to an effective management system for the optimization of aquaculture fisheries quality.
Keywords: Microbiological, model, physicochemical, Seabream.
7473 Performance of the Aptima® HIV-1 Quant Dx Assay on the Panther System
Authors: Siobhan O’Shea, Sangeetha Vijaysri Nair, Hee Cheol Kim, Charles Thomas Nugent, Cheuk Yan William Tong, Sam Douthwaite, Andrew Worlock
Abstract:
The Aptima® HIV-1 Quant Dx Assay is a fully automated assay on the Panther system, based on Transcription-Mediated Amplification and real-time detection technologies. The assay is intended for monitoring HIV-1 viral load in plasma specimens and for the detection of HIV-1 in plasma and serum specimens. Nine hundred and seventy-nine specimens selected at random from routine testing at St Thomas' Hospital, London were anonymised and used to compare the performance of the Aptima HIV-1 Quant Dx assay and the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 Test, v2.0. Two hundred and thirty-four specimens gave quantitative HIV-1 viral load results in both assays. The quantitative results reported by the Aptima assay were comparable to those reported by the Roche COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0, with a linear regression slope of 1.04 and an intercept of -0.097. The Aptima assay detected HIV-1 in more samples than the COBAS assay. This was not due to a lack of specificity of the Aptima assay, because the assay gave 99.83% specificity on testing plasma specimens from 600 HIV-1 negative individuals. To understand the reason for this higher detection rate, low-level panels made from the HIV-1 3rd International Standard (NIBSC 10/152) and clinical samples of various subtypes were tested side by side in both assays; the Aptima assay was more sensitive than the COBAS assay. The good sensitivity, specificity and agreement with other commercial assays make the HIV-1 Quant Dx Assay appropriate for both viral load monitoring and detection of HIV-1 infections.
Keywords: HIV viral load, Aptima, Roche, Panther system.
7472 Low-Latency and Low-Overhead Path Planning for In-band Network-Wide Telemetry
Authors: Penghui Zhang, Hua Zhang, Jun-Bo Wang, Cheng Zeng, Zijian Cao
Abstract:
With the development of software-defined networks and programmable data planes, in-band network telemetry (INT) has become an emerging technology in communications because it can obtain accurate and real-time network information. However, due to the expansion of network scale, existing telemetry systems, to the best of the authors' knowledge, have difficulty meeting the common requirements of low overhead, low latency and full coverage for traffic measurement. This paper proposes a network-wide telemetry system with low-latency, low-overhead path planning (INT-LLPP). We build a mathematical model to analyze the telemetry overhead and latency of INT systems, and then adopt a greedy path planning algorithm to reduce the overhead and latency of network telemetry while retaining full network coverage. The simulation results show that network-wide telemetry is achieved and that the telemetry overhead is reduced significantly compared with existing INT systems, while INT-LLPP keeps the system latency low enough to obtain real-time network information.
Keywords: Network telemetry, network monitoring, path planning, low latency.
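A hedged sketch of the greedy idea: grow each telemetry probe path along uncovered links until every link is covered, so that fewer, longer probes reduce overhead. This is a generic greedy edge-cover heuristic assumed for illustration; the paper's INT-LLPP cost model is not reproduced here.

```python
# Sketch: greedily build probe paths that cover every link of the topology.
def greedy_telemetry_paths(adjacency):
    uncovered = {frozenset((n, m)) for n, nbrs in adjacency.items() for m in nbrs}
    paths = []
    while uncovered:
        u, v = tuple(next(iter(uncovered)))      # seed with any uncovered link
        uncovered.discard(frozenset((u, v)))
        path = [u, v]
        extended = True
        while extended:                          # greedily extend at the tail
            extended = False
            for nxt in adjacency[path[-1]]:
                if frozenset((path[-1], nxt)) in uncovered:
                    uncovered.discard(frozenset((path[-1], nxt)))
                    path.append(nxt)
                    extended = True
                    break
        paths.append(path)
    return paths

topology = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}
print(greedy_telemetry_paths(topology))  # fewer paths -> lower telemetry overhead
```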
7471 Principal Component Analysis using Singular Value Decomposition of Microarray Data
Authors: Dong Hoon Lim
Abstract:
A series of microarray experiments produces observations of differential expression for thousands of genes across multiple conditions. Principal component analysis (PCA) has been widely used in multivariate data analysis to reduce the dimensionality of the data, in order to simplify subsequent analysis and allow summarization of the data in a parsimonious manner. PCA, which can be implemented via a singular value decomposition (SVD), is useful for the analysis of microarray data. For the application of PCA using SVD, we use the DNA microarray data for the small round blue cell tumors (SRBCT) of childhood by Khan et al. (2001). To decide the number of components that account for a sufficient amount of information, we draw a scree plot. The biplot, a graphical display associated with PCA, reveals important features, exhibiting the relationships between variables and also between variables and observations.
Keywords: Principal component analysis, singular value decomposition, microarray data, SRBCT
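A minimal sketch of PCA via SVD as described above: center the expression matrix, take its SVD, read the variance ratios off the singular values (for the scree plot) and project the samples onto the leading components. A random matrix with the SRBCT training dimensions (63 samples x 2308 genes) stands in for the real data.

```python
# Sketch: PCA of a (samples x genes) matrix via singular value decomposition.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(63, 2308))            # stand-in for the SRBCT data

Xc = X - X.mean(axis=0)                    # center each gene
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)            # variance ratios for the scree plot
scores = U[:, :2] * s[:2]                  # sample coordinates on first two PCs
print(explained[:5], scores.shape)
```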
7470 Hydrolysis of Hull-Less Pumpkin Oil Cake Protein Isolate by Pepsin
Authors: Ivan Živanović, Žužana Vaštag, Senka Popović, Ljiljana Popović, Draginja Peričin
Abstract:
The present work investigates the hydrolysis of hull-less pumpkin (Cucurbita pepo L.) oil cake protein isolate (PuOC PI) by pepsin. To examine the effectiveness and suitability of pepsin towards PuOC PI, the kinetic parameters for pepsin on PuOC PI were determined, and the hydrolysis process was then studied using Response Surface Methodology (RSM). The hydrolysis was carried out at a temperature of 30°C and pH 3.00. Time and initial enzyme/substrate ratio (E/S), at three levels each, were selected as the independent parameters. The degree of hydrolysis, DH, was measured after 20, 30 and 40 minutes, at initial E/S of 0.7, 1 and 1.3 mA/mg proteins. Since the proposed second-order polynomial model showed good fit with the experimental data (R2 = 0.9822), the obtained mathematical model can be used for monitoring the hydrolysis of PuOC PI by pepsin under the studied experimental conditions, varying time and initial E/S. To achieve the highest value of DH (39.13%), the obtained optimum conditions for time and initial E/S were 30 min and 1.024 mA/mg proteins.
Keywords: Enzymatic hydrolysis, Pepsin, Pumpkin (Cucurbita pepo L.) oil cake protein isolate, Response surface methodology.
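The fitted model referred to above has the general second-order RSM form below, with x1 = time and x2 = initial E/S; the coefficient values estimated for this data set are not reproduced here:

```latex
\begin{equation}
  \mathrm{DH} = b_0 + b_1 x_1 + b_2 x_2 + b_{11} x_1^2 + b_{22} x_2^2 + b_{12} x_1 x_2
\end{equation}
```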
7469 Speech Data Compression using Vector Quantization
Authors: H. B. Kekre, Tanuja K. Sarode
Abstract:
Transforms, which are lossy algorithms, are mostly used for speech data compression. Such algorithms are tolerable for speech data compression since the loss in quality is not perceived by the human ear. However, vector quantization (VQ) has the potential to give higher data compression while maintaining the same quality. In this paper we propose a speech data compression algorithm using the vector quantization technique, using the VQ algorithms LBG, KPE and FCG. The results table shows the computational complexity of these three algorithms. We have also introduced a new performance parameter, the Average Fractional Change in Speech Sample (AFCSS). Our FCG algorithm gives far better performance considering mean absolute error, AFCSS and complexity as compared to the others.
Keywords: Vector Quantization, Data Compression, Encoding, Speech coding.
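A hedged sketch of LBG codebook training, the first of the three VQ algorithms compared (KPE and FCG are not sketched): split every codeword, then refine by nearest-neighbor assignment and centroid update. The frame size, codebook size and random stand-in frames are illustrative assumptions.

```python
# Sketch: LBG vector quantization - codebook splitting plus Lloyd refinement.
import numpy as np

def lbg(vectors, size, eps=1e-3, iters=20):
    codebook = vectors.mean(axis=0, keepdims=True)      # start with one codeword
    while len(codebook) < size:
        codebook = np.vstack([codebook * (1 + eps),     # split each codeword
                              codebook * (1 - eps)])
        for _ in range(iters):                          # Lloyd refinement
            d = ((vectors[:, None, :] - codebook[None]) ** 2).sum(-1)
            nearest = d.argmin(axis=1)
            for k in range(len(codebook)):
                members = vectors[nearest == k]
                if len(members):
                    codebook[k] = members.mean(axis=0)
    return codebook

frames = np.random.default_rng(1).normal(size=(1000, 8))  # stand-in speech frames
print(lbg(frames, size=16).shape)  # (16, 8): each frame coded by a 4-bit index
```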
7468 Ontology and CDSS Based Intelligent Health Data Management in Health Care Server
Authors: Eun-Jung Ko, Hyung-Jik Lee, Jeun-Woo Lee
Abstract:
In a ubiquitous healthcare environment, users' health data are transferred to a remote healthcare server by the user's wearable system or mobile phone. These collected health data should be managed and analyzed in the healthcare server, so that the caregiver or the user can monitor the user's physiological state. In this paper, we design and develop an intelligent healthcare server that manages the user's health data using a CDSS and an ontology. Our system can analyze the user's health data semantically using the CDSS and ontology, and report the results derived from the user's raw physiological data to the user and caregiver.
Keywords: u-healthcare, CDSS, healthcare server, health data, ontology.
7467 A Genetic Algorithm for Clustering on Image Data
Authors: Qin Ding, Jim Gasvoda
Abstract:
Clustering is the process of subdividing an input data set into a desired number of subgroups so that members of the same subgroup are similar and members of different subgroups have diverse properties. Many heuristic algorithms have been applied to the clustering problem, which is known to be NP-hard. Genetic algorithms have been used in a wide variety of fields to perform clustering; however, the technique normally has a long running time in terms of input set size. This paper proposes an efficient genetic algorithm for clustering on very large data sets, especially image data sets. The genetic algorithm uses the most time-efficient techniques along with preprocessing of the input data set. We test our algorithm on both artificial and real image data sets, both of large size. The experimental results show that our algorithm outperforms the k-means algorithm in terms of running time as well as the quality of the clustering.
Keywords: Clustering, data mining, genetic algorithm, image data.
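A hedged sketch of genetic-algorithm clustering in its generic form: each chromosome encodes k centroids, fitness is the negated total within-cluster distance, and the population evolves by truncation selection and Gaussian mutation. Crossover and the paper's preprocessing and efficiency techniques are omitted; all parameters are illustrative.

```python
# Sketch: evolve sets of k centroids toward a low within-cluster distance.
import numpy as np

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(6, 1, (100, 2))])
k, pop_size, generations = 2, 20, 50

def fitness(chrom):
    d = ((data[:, None, :] - chrom[None]) ** 2).sum(-1)
    return -d.min(axis=1).sum()              # smaller total distance = fitter

population = [data[rng.choice(len(data), k)] for _ in range(pop_size)]
for _ in range(generations):
    population.sort(key=fitness, reverse=True)
    elite = population[: pop_size // 2]      # truncation selection
    children = [p + rng.normal(0, 0.1, p.shape) for p in elite]  # mutation
    population = elite + children
print("best centroids:\n", max(population, key=fitness))
```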
7466 A Holistic Framework for Unifying Data Security and Management in Modern Enterprises
Authors: Ashly Joseph
Abstract:
Modern businesses struggle significantly to secure and manage their data properly as the volume and complexity of their data both expand exponentially. Through the use of a multi-layered defense strategy, a centralized management platform, and cutting-edge technologies like AI, this research paper presents a comprehensive framework to integrate data security and management. The constraints of current data protection and management strategies, technological advancements, and the evolving threat landscape are all examined in this article. It suggests best practices for putting into practice integrated data security and governance models, placing an emphasis on ongoing adaptation. The advantages mentioned include a strengthened security posture, simpler procedures, lower costs, and reduced complexity. Additionally, issues including skill shortages, antiquated systems, and cultural obstacles are examined. Security executives and Chief Information Security Officers are given practical advice on how to evaluate, plan, and put into place strong data-centric security and management capabilities. The goal of the paper is to provide a thorough study of the data security and management landscape and to arm contemporary businesses with the knowledge they need to be proactive in protecting their data assets.
Keywords: Data security, security management, cloud computing, cybersecurity, data governance, security architecture, data management.
7465 Post Mining - Discovering Valid Rules from Different Sized Data Sources
Authors: R. Nedunchezhian, K. Anbumani
Abstract:
A big organization may have multiple branches spread across different locations. Processing the data from these branches becomes a huge task when innumerable transactions take place. Also, branches may be reluctant to forward their data for centralized processing but are ready to pass on their association rules. Local mining may also generate a large number of rules. Further, it is not practically possible for all local data sources to be of the same size. A model is proposed for discovering valid rules from different-sized data sources, where the valid rules are high-weighted rules. These rules can be obtained from the high-frequency rules generated at each of the data sources. A data source selection procedure is considered in order to synthesize rules efficiently. Support Equalization is another proposed method, which focuses on eliminating low-frequency rules at the local sites themselves, thus reducing the number of rules significantly.
Keywords: Association rules, multiple data stores, synthesizing, valid rules.
7464 RFID-ready Master Data Management for Reverse Logistics
Authors: Jincheol Han, Hyunsun Ju, Jonghoon Chun
Abstract:
Sharing consistent and correct master data among disparate applications in a reverse-logistics chain has long been recognized as an intricate problem. Although a master data management (MDM) system can surely assume that responsibility, applications that need to cooperate with it must comply with the proprietary query interfaces provided by the specific MDM system. In this paper, we present an RFID-ready MDM system that makes master data readily available to any participating application in a reverse-logistics chain. We propose an RFID-wrapper as part of our MDM system; it acts as a gateway between any data retrieval request and the query interfaces that process it. With the RFID-wrapper, any participating application in a reverse-logistics chain can easily retrieve master data in a way that is analogous to the retrieval of any other RFID-based logistics transactional data.
Keywords: Reverse Logistics, Master Data Management, RFID.
7463 Dynamic Models versus Frailty Models for Recurrent Event Data
Authors: Entisar A. Elgmati
Abstract:
Recurrent event data are a special type of multivariate survival data. Dynamic and frailty models are among the approaches that deal with this kind of data. A comparison between these two models is studied using the empirical standard deviation of the standardized martingale residual processes as a way of assessing their fit, based on the Aalen additive regression model. We found that both approaches take heterogeneity into account and produce residual standard deviations close to each other, both in the simulation study and in the real data set.
Keywords: Dynamic, frailty, misspecification, recurrent events.
7462 A Study of RSCMAC Enhanced GPS Dynamic Positioning
Authors: Ching-Tsan Chiang, Sheng-Jie Yang, Jing-Kai Huang
Abstract:
The purpose of this research is to develop and apply the RSCMAC to enhance the dynamic accuracy of the Global Positioning System (GPS). GPS devices provide accurate positioning, speed detection and a highly precise time standard for over 98% of the area of the earth. The overall operation of the Global Positioning System includes 24 GPS satellites in space; signal transmission that includes two frequency carrier waves (Link 1 and Link 2) and two sets of random telegraphic codes (C/A code and P code); and on-earth monitoring stations or client GPS receivers. Using only four satellites, the client's position and elevation can be determined rapidly, and the more satellites are receivable, the more accurately the position can be decoded. The standard positioning accuracy of the simplified GPS receiver has been greatly increased, but because of satellite clock error, troposphere delay and ionosphere delay, current measurement accuracy is at the level of 5-15 m. To increase dynamic GPS positioning accuracy, most researchers mainly use an inertial navigation system (INS) or install other sensors or maps for assistance. This research utilizes the RSCMAC advantages of fast learning, assured learning convergence, and the capability of solving time-related dynamic system problems, together with the static positioning calibration structure, to improve and increase GPS dynamic accuracy. The increase in GPS dynamic positioning accuracy is achieved by using the RSCMAC system with GPS receivers: dynamic error data are collected for error prediction, and the predicted error is then used to correct the GPS dynamic positioning data. The ultimate purpose of this research is to improve the dynamic positioning error of cheap GPS receivers; the economic benefits will be enhanced while the accuracy is increased.
Keywords: Dynamic Error, GPS, Prediction, RSCMAC.
7461 Issues and Architecture for Supporting Data Warehouse Queries in Web Portals
Authors: Minsoo Lee, Yoon-kyung Lee, Hyejung Yoon, Soo-kyung Song, Sujeong Cheong
Abstract:
Data Warehousing tools have become very popular and currently many of them have moved to Web-based user interfaces to make it easier to access and use the tools. The next step is to enable these tools to be used within a portal framework. The portal framework consists of pages having several small windows that contain individual data warehouse query results. There are several issues that need to be considered when designing the architecture for a portal enabled data warehouse query tool. Some issues need special techniques that can overcome the limitations that are imposed by the nature of data warehouse queries. Issues such as single sign-on, query result caching and sharing, customization, scheduling and authorization need to be considered. This paper discusses such issues and suggests an architecture to support data warehouse queries within Web portal frameworks.
Keywords: Data Warehousing tools, data warehousing queries, web portal frameworks.
7460 Determination of Soil Loss by Erosion in Different Land Covers Categories and Slope Classes in Bovilla Watershed, Tirana, Albania
Authors: Valmir Baloshi, Fran Gjoka, Nehat Çollaku, Elvin Toromani
Abstract:
As a sediment production mechanism, soil erosion is the main environmental threat to the Bovilla watershed, including the decline in water quality of the Bovilla reservoir, which provides drinking water to Tirana city (the capital of Albania). Therefore, an experiment with 25 erosion plots for soil erosion monitoring has been running since June 2017. The aim was to determine the soil loss at plot and watershed scale in the Bovilla watershed (Tirana region) for the implementation of soil and water protection measures or payments for ecosystem services (PES) programs. The results of erosion monitoring for the period June 2017 - May 2018 showed that the highest values of surface runoff were noted in bare land, 38829.91 liters on a slope of 74%, and the lowest values in forest land, 12840.6 liters on a slope of 64%, while the highest values of soil loss were found in bare land, 595.15 t/ha on a slope of 62%, and the lowest values in forest land, 18.99 t/ha on a slope of 64%. These values are much higher than the average rate of soil loss in the European Union (2.46 t/ha/year). Within the same slope class, soil loss decreased from orchard or bare land to forest land, and within the same land use category, soil loss increased with increasing land slope. It is necessary to conduct chemical analyses of the sediments to determine the amount of chemical elements leached out of the soil that end up in the Bovilla reservoir. It is concluded that PES programs with valuable conservation practices should be implemented for the rehabilitation of the Ranxe, Vilez and Zall-Bastar sub-watersheds of the Bovilla watershed.
Keywords: ANOVA, Bovilla, land cover, slope, soil loss, watershed management.
7459 Lamb Wave Wireless Communication in Healthy Plates Using Coherent Demodulation
Authors: Rudy Bahouth, Farouk Benmeddour, Emmanuel Moulin, Jamal Assaad
Abstract:
Guided ultrasonic waves are used in Non-Destructive Testing and Structural Health Monitoring for inspection and damage detection. Recently, wireless data transmission using ultrasonic waves in solid metallic channels has gained popularity in industrial applications such as nuclear, aerospace and smart vehicles. The idea is to find a good substitute for electromagnetic waves, since these are highly attenuated near metallic components due to Faraday shielding. The proposed solution is to use ultrasonic guided waves such as Lamb waves as an information carrier, owing to their capability of propagating over long distances; in addition, valuable information about the health of the structure can be extracted simultaneously. In this work, the reliable frequency bandwidth for communication is first extracted experimentally from dispersion curves. An experimental platform for wireless communication using Lamb waves is then described and built, and the coherent demodulation algorithm used in telecommunications is tested for the Amplitude Shift Keying, On-Off Keying and Binary Phase Shift Keying modulation techniques. Signal processing parameters such as threshold choice, number of cycles per bit and bit rate are optimized, and experimental results are compared based on the average bit error percentage. The results show high sensitivity to threshold selection for the Amplitude Shift Keying and On-Off Keying techniques, resulting in a bit rate decrease; the Binary Phase Shift Keying technique shows the highest stability and data rate among all tested modulation techniques.
Keywords: Lamb Wave Communication, wireless communication, coherent demodulation, bit error percentage.
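A minimal sketch of coherent demodulation for the best-performing scheme above (BPSK): multiply the received waveform by a synchronized reference carrier, integrate over each bit period, and decide by sign. The carrier frequency, bit rate and noise level are illustrative, not the paper's ultrasonic parameters.

```python
# Sketch: coherent BPSK demodulation by per-bit correlation with the carrier.
import numpy as np

fs, fc, bit_rate = 1_000_000, 100_000, 10_000   # sample, carrier, bit rates (Hz)
spb = fs // bit_rate                            # samples per bit
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])

t = np.arange(len(bits) * spb) / fs
carrier = np.cos(2 * np.pi * fc * t)
tx = np.repeat(2 * bits - 1, spb) * carrier     # BPSK: carrier phase 0 or pi
rx = tx + 0.5 * np.random.default_rng(2).normal(size=tx.size)  # noisy channel

correl = (rx * carrier).reshape(len(bits), spb).sum(axis=1)    # coherent step
decoded = (correl > 0).astype(int)
print("bit errors:", int(np.sum(decoded != bits)))
```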
7458 Data Mining Using Learning Automata
Authors: M. R. Aghaebrahimi, S. H. Zahiri, M. Amiri
Abstract:
In this paper, a data miner based on learning automata, called LA-miner, is proposed. The LA-miner extracts classification rules from data sets automatically. The proposed algorithm is built on function optimization using learning automata. Experimental results on three benchmarks indicate that the performance of the proposed LA-miner is comparable with (and sometimes better than) that of Ant-miner (a data mining algorithm based on the ant colony optimization algorithm) and CN2 (a well-known data mining algorithm for classification).
Keywords: Data mining, Learning automata, Classification rules, Knowledge discovery.
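For reference, the standard linear reward-inaction (L_RI) update that drives a learning automaton's search (a generic statement of the scheme, not necessarily the exact variant used by LA-miner): when the chosen action i is rewarded,

```latex
\begin{align}
  p_i(n+1) &= p_i(n) + \lambda \bigl(1 - p_i(n)\bigr), \\
  p_j(n+1) &= (1 - \lambda)\, p_j(n), \qquad j \neq i,
\end{align}
% 0 < \lambda < 1 is the learning rate; on a penalty the action
% probabilities are left unchanged.
```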
7457 Secure and Efficient Transmission of Aggregated Data for Mobile Wireless Sensor Networks
Authors: A. Krishna Veni, R.Geetha
Abstract:
Wireless Sensor Networks (WSNs) are suitable for many real-world scenarios. The retrieval of data is made efficient by data aggregation techniques. Many data aggregation techniques have been proposed, but most existing schemes are neither energy efficient nor secure, and they use the traditional clustering approach, in which packet transmission is delayed because there is no proper scheduling. The presented system uses the Velocity Energy-efficient and Link-aware Cluster-Tree (VELCT) scheme, in which a Data Collection Tree (DCT) improves the lifetime of the network. The VELCT scheme and the construction of the DCT reduce delay and traffic, and the network lifetime is increased by avoiding frequent changes in cluster topology. Secure and Efficient Transmission of Aggregated data (SETA) improves the security of data transmission via the trust values of the nodes prior to data aggregation. Since SETA considers only the data from trustworthy nodes for aggregation, it is more secure in transmitting the data, thereby improving the accuracy of the aggregated data.
Keywords: Aggregation, lifetime, network security, wireless sensor network.
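A hedged sketch of the trust-filtering step described above: only readings from nodes whose trust value exceeds a threshold enter the aggregate. The trust values, the threshold and the mean as the aggregation function are illustrative assumptions; the paper's trust computation is not reproduced.

```python
# Sketch: aggregate sensor readings only from nodes deemed trustworthy.
def aggregate_trusted(readings, trust, threshold=0.7):
    trusted = [v for node, v in readings.items() if trust.get(node, 0.0) >= threshold]
    if not trusted:
        raise ValueError("no trustworthy readings to aggregate")
    return sum(trusted) / len(trusted)      # mean as the aggregation function

readings = {"n1": 21.0, "n2": 20.5, "n3": 80.0}   # n3 reports an outlier
trust = {"n1": 0.9, "n2": 0.8, "n3": 0.2}         # n3 has a low trust value
print(aggregate_trusted(readings, trust))          # 20.75 - outlier excluded
```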
7456 An Approach to Measure Snow Depth of Winter Accumulation at Basin Scale Using Satellite Data
Authors: M. Geetha Priya, D. Krishnaveni
Abstract:
Snow depth estimation and monitoring studies have been carried out for decades using empirical relationships or the extrapolation of point measurements made in the field. With the development of advanced satellite-based remote sensing techniques, a modified approach is proposed in the present study to estimate winter-accumulated snow depth at basin scale. Snow depth can be assessed by differencing Digital Elevation Models (DEMs) generated at the beginning and end of the winter season for the region of interest (Himalayan and polar regions), accounting for winter accumulation (solid precipitation). The proposed approach is based on the existing geodetic method used for glacier mass balance estimation. Using satellite datasets acquired strictly at the beginning and end of the winter season, it is possible to estimate the change in depth or thickness of the snow accumulated during the winter, since it takes one year for snow to be transformed into firn (snow that has survived one summer, i.e., one-year-old snow).
Keywords: Digital elevation model, snow depth, geodetic method, snow cover.
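A minimal sketch of the geodetic differencing idea: subtracting the beginning-of-winter DEM from the end-of-winter DEM yields a per-cell snow depth map of the winter accumulation. The 3x3 grids below are toy stand-ins for satellite-derived DEM rasters.

```python
# Sketch: winter snow depth as the difference of two co-registered DEMs.
import numpy as np

dem_onset = np.array([[3000.0, 3010.0, 3020.0],
                      [3005.0, 3015.0, 3025.0],
                      [3010.0, 3020.0, 3030.0]])   # beginning of winter (m)
dem_end = dem_onset + np.array([[1.2, 0.8, 0.5],
                                [1.0, 0.9, 0.7],
                                [1.5, 1.1, 0.6]])  # end of winter (m)

snow_depth = dem_end - dem_onset                   # per-cell accumulation (m)
print(f"mean basin snow depth: {snow_depth.mean():.2f} m")
```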