Search results for: test data.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9349

7729 Development of a Numerical Model to Predict Wear in Grouted Connections for Offshore Wind Turbine Generators

Authors: Paul Dallyn, Ashraf El-Hamalawi, Alessandro Palmeri, Bob Knight

Abstract:

In order to better understand the long-term implications of the grout wear failure mode in large-diameter plain-sided grouted connections, a numerical model has been developed and calibrated that can take advantage of existing operational plant data to predict the wear accumulation for the actual load conditions experienced over a given period, thus limiting the requirement for expensive monitoring systems. This model has been derived and calibrated based on site structural condition monitoring (SCM) data and supervisory control and data acquisition (SCADA) data for two operational wind turbine generator substructures afflicted with this challenge, along with experimentally derived wear rates.

Keywords: Grouted Connection, Numerical Model, Offshore Structure, Wear, Wind Energy.

7728 Numerical Analysis of Thermal Conductivity of Non-Charring Material Ablation Carbon-Carbon and Graphite, Considering Chemical Reaction Effects, Mass Transfer and Surface Heat Transfer

Authors: H. Mohammadiun, A. Kianifar, A. Kargar

Abstract:

Nowadays, little reliable information is available concerning heat shield systems; for example, precise calculations cannot be made for various materials. In addition, full-scale testing has two disadvantages: high cost and low flexibility, since each case requires a new test. Hence, a numerical modeling program that calculates the surface recession rate and interior temperature distribution is necessary. A numerical solution of the governing equation for non-charring material ablation is presented in order to predict the recession rate and the heat response of non-charring heat shields. The governing equation is nonlinear, and the Newton-Raphson method along with the TDMA algorithm is used to solve the resulting nonlinear equation system. Using the Newton-Raphson method is one of the advantages of the solution approach because it is simple and easily generalized to more difficult problems. The obtained results were compared with reliable sources in order to examine the accuracy of the compiled code.
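
As a concrete illustration of the solution approach named above, the following is a minimal Newton-Raphson sketch for a generic scalar nonlinear equation; the test function, starting point and tolerance are illustrative, not taken from the paper.

    def newton_raphson(f, df, x0, tol=1e-8, max_iter=50):
        # Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until the step is small.
        x = x0
        for _ in range(max_iter):
            step = f(x) / df(x)
            x -= step
            if abs(step) < tol:
                return x
        raise RuntimeError("Newton-Raphson did not converge")

    # Example: solve x**3 - 2 = 0; the root is 2**(1/3).
    root = newton_raphson(lambda x: x**3 - 2, lambda x: 3 * x**2, x0=1.0)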

Keywords: Ablation rate, surface recession, interior temperature distribution, non-charring material ablation, Newton-Raphson method.

7727 Modality and Redundancy Effects on Music Theory Learning Among Pupils of Different Anxiety Levels

Authors: Soon Fook Fong, Osamah M. Aldalalah

Abstract:

The purpose of this study was to investigate the effects of the modality and redundancy principles on music theory learning among pupils of different anxiety levels. The music theory lesson was developed in three different modes: audio with image (AI), text with image (TI), and audio with image and text (AIT). The independent variables were the three courseware modes, the moderator variable was the anxiety level, and the dependent variable was the post-test score. The study sample consisted of 405 third-grade pupils. Descriptive and inferential statistics were conducted to analyze the collected data. Analysis of covariance (ANCOVA) and post hoc tests were carried out to examine the main effects as well as the interaction effects of the independent variables on the dependent variable. The findings of this study showed that medium-anxiety pupils performed significantly better than low- and high-anxiety pupils in all three treatment modes. The AI mode was found to help pupils with high anxiety significantly more than the TI and AIT modes.

Keywords: Modality, Redundancy, Music theory, Cognitive theory of multimedia learning, Cognitive load theory, Anxiety.

7726 A Novel Implementation of Application Specific Instruction-set Processor (ASIP) using Verilog

Authors: M. Kamaraju, K. Lal Kishore, A. V. N. Tilak

Abstract:

The general-purpose processors used in embedded systems must support constraints such as execution time, power consumption and code size. An Application Specific Instruction-set Processor (ASIP), on the other hand, has advantages in terms of power consumption, performance and flexibility. In this paper, a 16-bit Application Specific Instruction-set Processor for sensor data transfer is proposed. The designed processor architecture consists of on-chip transmitter and receiver modules along with the processing and controlling units, enabling data transmission and reception on a single die. The data transfer is accomplished with fewer instructions than a general-purpose processor requires. The ASIP core operates at a maximum clock frequency of 1.132 GHz with a delay of 0.883 ns and consumes 569.63 mW at an operating voltage of 1.2 V. The ASIP is implemented in Verilog HDL using the Xilinx platform on Virtex-4.

Keywords: ASIP, Data transfer, Instruction set, Processor.

7725 Data Mining Applied to the Predictive Model of Triage System in Emergency Department

Authors: Wen-Tsann Lin, Yung-Tsan Jou, Yih-Chuan Wu, Yuan-Du Hsiao

Abstract:

The Emergency Department of a medical center in Taiwan cooperated in conducting this research. A predictive model of the triage system is constructed, covering the procedure from parameter selection to sample screening. 2,000 patient records were chosen randomly by computer. After three data mining classification methods (multi-group discriminant analysis, multinomial logistic regression, back-propagation neural networks) were applied, it was found that back-propagation neural networks can best distinguish patients' degree of emergency, with an accuracy rate as high as 95.1%. The back-propagation neural network with the highest accuracy rate was built into the triage acuity expert system in this research. Data mining applied to the predictive model of the triage acuity expert system allows the model to be updated regularly, both to improve the system and for education and training, and is not affected by subjective factors.
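
A minimal sketch of the kind of back-propagation classifier the study selected, using scikit-learn; the feature matrix, labels, network size and split below are placeholders, not the study's screened records.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 8))       # placeholder triage features
    y = rng.integers(1, 6, size=2000)    # placeholder 5-level acuity labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500)  # back-propagation NN
    clf.fit(X_tr, y_tr)
    accuracy = clf.score(X_te, y_te)     # the study reports 95.1% on real data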

Keywords: Back-propagation Neural Networks, Data Mining, Emergency Department, Triage System.

7724 Dynamic Metadata Schemes in the Neutron and Photon Science Communities: A Case Study of X-Ray Photon Correlation Spectroscopy

Authors: Amir Tosson, Mohammad Reza, Christian Gutt

Abstract:

Metadata is one of the most important aspects for advancing data management practices within all research communities. Definitions and schemes of metadata are inter alia of particular significance in the domain of neutron and photon scattering experiments covering a broad area of different scientific disciplines. The demand of describing continuously evolving highly non-standardized experiments, including the resulting processed and published data, constitutes a considerable challenge for a static definition of metadata. Here, we present the concept of dynamic metadata for the neutron and photon scientific community, which enriches a static set of defined basic metadata. We explore the idea of dynamic metadata with the help of the use case of X-ray Photon Correlation Spectroscopy (XPCS), which is a synchrotron-based scattering technique that allows the investigation of nanoscale dynamic processes. It serves here as a demonstrator of how dynamic metadata can improve data acquisition, sharing, and analysis workflows. Our approach enables researchers to tailor metadata definitions dynamically and adapt them to the evolving demands of describing data and results from a diverse set of experiments. We demonstrate that dynamic metadata standards yield advantages that enhance data reproducibility, interoperability, and the dissemination of knowledge.
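
To make the idea concrete, a static base record enriched with experiment-specific dynamic fields might look like the sketch below; every field name here is illustrative, not a community standard.

    # Static, community-agreed base metadata for an XPCS measurement.
    base_metadata = {
        "facility": "example-synchrotron",   # hypothetical value
        "technique": "XPCS",
        "sample_id": "sample-001",
    }

    # Dynamic, experiment-specific extension chosen by the researcher.
    dynamic_fields = {
        "q_range_inv_nm": [0.01, 0.10],
        "lag_times_s": [1e-6, 1.0],
    }

    record = {**base_metadata, **dynamic_fields}   # enriched metadata record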

Keywords: Big data, metadata, schemas, XPCS, X-ray Photon Correlation Spectroscopy.

7723 A Novel Compression Algorithm for Electrocardiogram Signals based on Wavelet Transform and SPIHT

Authors: Sana Ktata, Kaïs Ouni, Noureddine Ellouze

Abstract:

An electrocardiogram (ECG) data compression algorithm is needed that reduces the amount of data to be transmitted, stored and analyzed without losing the clinical information content. A wavelet ECG data codec based on the Set Partitioning In Hierarchical Trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm has achieved notable success in still image coding. We modified the algorithm for the one-dimensional (1-D) case and applied it to the compression of ECG data. With this compression method, a small percent root-mean-square difference (PRD) and a high compression ratio with low implementation complexity are achieved. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and computation than previously proposed ECG compression schemes. Compression ratios of up to 48:1 for ECG signals lead to acceptable results for visual inspection.
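
The two figures of merit quoted, PRD and compression ratio, can be computed as in the sketch below; the non-mean-subtracted PRD definition is one common convention and is an assumption here.

    import numpy as np

    def prd_percent(original, reconstructed):
        # Percent root-mean-square difference between the original and
        # reconstructed ECG (non-mean-subtracted form).
        x = np.asarray(original, float)
        y = np.asarray(reconstructed, float)
        return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum(x ** 2))

    def compression_ratio(bits_in, bits_out):
        # A 48:1 ratio means bits_in / bits_out == 48.
        return bits_in / bits_out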

Keywords: Discrete Wavelet Transform, ECG compression, SPIHT.

7722 Field Trial of Resin-Based Composite Materials for the Treatment of Surface Collapses Associated with Former Shallow Coal Mining

Authors: Philip T. Broughton, Mark P. Bettney, Isla L. Smail

Abstract:

Effective treatment of ground instability is essential when managing the impacts associated with historic mining. A field trial was undertaken by the Coal Authority to investigate the geotechnical performance and potential use of composite materials comprising resin and fill or stone to safely treat surface collapses, such as crown-holes, associated with shallow mining. Test pits were loosely filled with various granular fill materials. The fill material was injected with commercially available silicate and polyurethane resin foam products. In situ and laboratory testing was undertaken to assess the geotechnical properties of the resultant composite materials. The test pits were subsequently excavated to assess resin permeation. Drilling and resin injection were easiest through clean limestone fill materials. Recycled building waste fill material proved difficult to inject with resin; this material is thus considered unsuitable for use in resin composites. Incomplete resin permeation in several of the test pits created irregular ‘blocks’ of composite. Injected resin foams significantly improve the stiffness and resistance (strength) of the un-compacted fill material. The stiffness of the treated fill material appears to be a function of the stone particle size, its associated compaction characteristics (under loose tipping) and the proportion of resin foam matrix. The type of fill material is more critical than the type of resin to the geotechnical properties of the composite materials. Resin composites can effectively support typical design imposed loads. Compared to other traditional treatment options, such as cement grouting, the use of resin composites is potentially less disruptive, particularly for sites with limited access, and thus likely to achieve significant reinstatement cost savings. The use of resin composites is considered a suitable option for the future treatment of shallow mining collapses.

Keywords: Composite material, ground improvement, mining legacy, resin.

7721 A Software Tool Design for Cerebral Infarction of MR Images

Authors: Kyoung-Jong Park, Woong-Gi Jeon, Hee-Cheol Kim, Dong-Eog Kim, Heung-Kook Choi

Abstract:

A brain MR imaging-based clinical research and analysis system was built, targeting the development of a large-scale database and using the general clinical data available. For selection of the lesion ROI, a region-growing algorithm was used, and the Mesh-warp algorithm was implemented for matching; matching errors were corrected individually. In addition, large volumes of ROI research data can be accumulated with our compression method. On this basis, decision criteria for interpreting the research results are suggested. Records can easily be queried by the experimental group variables: age, sex, MR type, patient ID and smoking. The resulting data were visualized as overlapped images using a color table and analyzed with a statistical package. The system's utility was evaluated for assessing chronic ischemic damage in patients with acute cerebral infarction; the lesions responsible for neurologic disability were located in the region facing the central portion of the lateral ventricle, where the corona radiata is found. Finally, system reliability was measured by both inter-user and intra-user registration correlation.

Keywords: Software tool design, Cerebral infarction, Brain MR image, Registration.

7720 Deformation Characteristics of Fire Damaged and Rehabilitated Normal Strength Concrete Beams

Authors: Yeo Kyeong Lee, Hae Won Min, Ji Yeon Kang, Hee Sun Kim, Yeong Soo Shin

Abstract:

In recent years, fire accidents have steadily increased and the amount of property damage caused by them has gradually risen. Beyond such property damage, fire incidents cause strength degradation and member deformation in building structures, undermining their structural capacity. Examining the degradation and deformation is very important because reusing a building is more economical than reconstruction; engineers therefore need to investigate strength degradation and member deformation carefully and apply the right rehabilitation methods. This study aims at evaluating the deformation characteristics of fire-damaged and rehabilitated normal strength concrete beams through both experiments and finite element analyses. For the experiments, control beams, fire-damaged beams and rehabilitated beams are tested to examine deformation characteristics. Ten test beam specimens with a compressive strength of 21 MPa are fabricated, and the main test variables are cover thicknesses of 40 mm and 50 mm and fire exposure times of 1 hour or 2 hours. After heating, the fire-damaged beams are air-cured for 2 months, while the rehabilitated beams are repaired with polymeric cement mortar after removal of the fire-damaged concrete cover. All beam specimens are tested under four-point loading. FE analyses are executed to investigate the effects of the main parameters applied in the experimental study. Test results show that both the maximum load and the stiffness of the rehabilitated beams are higher than those of the fire-damaged beams. In addition, the structural behaviors predicted by the analyses also show a good rehabilitation effect, and the predicted load-deflection curves are similar to the experimental results. The proposed analytical method can therefore be used to predict the deformation characteristics of fire-damaged and rehabilitated concrete beams without the time and cost of the experimental process.

Keywords: Fire, Normal strength concrete, Rehabilitation, Reinforced concrete beam.

7719 Data Collection with Bounded-Sized Messages in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In this paper, we study the data collection problem in Wireless Sensor Networks (WSNs) under two interference models: the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise-Ratio (SINR). The main issue is to compute schedules with the minimum number of timeslots, that is, minimum latency schedules, such that data from every node can be collected without any collision or interference at a sink node. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate the problem with the bounded-sized message model and introduce a constant factor approximation algorithm. To the best of our knowledge, this is the first result for the data collection problem with the bounded-sized message model in both interference models.
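
For reference, a transmission under the physical interference model succeeds when its SINR clears a threshold; a sketch follows, with path-loss exponent, noise floor and threshold values chosen for illustration only.

    def sinr(p_tx, dist, interferer_dists, alpha=3.0, noise=1e-9):
        # Received power decays as dist**(-alpha); interference is the sum
        # of the concurrently transmitting nodes' received powers.
        signal = p_tx * dist ** (-alpha)
        interference = sum(p_tx * d ** (-alpha) for d in interferer_dists)
        return signal / (noise + interference)

    # The link in this timeslot is interference-free if SINR >= beta.
    beta = 2.0
    ok = sinr(1.0, 10.0, [40.0, 55.0]) >= beta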

Keywords: Data collection, collision-free, interference-free, physical interference model, SINR, approximation, bounded-sized message model, wireless sensor networks, WSN.

7718 New Data Reuse Adaptive Filters with Noise Constraint

Authors: Young-Seok Choi

Abstract:

We present a new framework of the data-reusing (DR) adaptive algorithms by incorporating a constraint on noise, referred to as a noise constraint. The motivation behind this work is that the use of the statistical knowledge of the channel noise can contribute toward improving the convergence performance of an adaptive filter in identifying a noisy linear finite impulse response (FIR) channel. By incorporating the noise constraint into the cost function of the DR adaptive algorithms, the noise constrained DR (NC-DR) adaptive algorithms are derived. Experimental results clearly indicate their superior performance over the conventional DR ones.
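
A sketch of a data-reusing LMS update is given below; the scaling applied when the noise variance is supplied merely gestures at the role of the noise constraint and is an assumption, not the authors' exact NC-DR recursion.

    import numpy as np

    def dr_lms(x, d, num_taps=8, mu=0.01, reuses=3, noise_var=None):
        # Data-reusing LMS: each (regressor, desired) pair updates the
        # weights 'reuses' times per sample.
        w = np.zeros(num_taps)
        for n in range(num_taps, len(x)):
            u = x[n - num_taps:n][::-1]       # regressor vector
            for _ in range(reuses):
                e = d[n] - w @ u
                step = mu
                if noise_var is not None:
                    # Simplified noise-constrained scaling (assumption):
                    # damp the update when e**2 nears the noise floor.
                    step = mu * max(e * e - noise_var, 0.0) / (e * e + 1e-12)
                w += step * e * u
        return w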

Keywords: Adaptive filter, data-reusing, least-mean square (LMS), affine projection (AP), noise constraint.

7717 Mining Genes Relations in Microarray Data Combined with Ontology in Colon Cancer Automated Diagnosis System

Authors: A. Gruzdz, A. Ihnatowicz, J. Siddiqi, B. Akhgar

Abstract:

The MATCH project [1] entails the development of an automatic diagnosis system that aims to support the treatment of colon cancer by discovering mutations that occur in tumour suppressor genes (TSGs) and contribute to the development of cancerous tumours. The system is based on (a) colon cancer clinical data and (b) biological information derived by data mining techniques from genomic and proteomic sources. The core mining module will consist of popular, well-tested hybrid feature extraction methods and new combined algorithms designed especially for the project. Elements of rough sets, evolutionary computing, cluster analysis, self-organizing maps and association rules will be used to discover the annotations between genes and their influence on tumours [2]-[11]. The methods used to process the data have to address its high complexity, potential inconsistency and missing values. They must integrate all the useful information necessary to solve the expert's question. For this purpose, the system has to learn from data, or allow a domain specialist to interactively specify, the part of the knowledge structure it needs to answer a given query. The program should also take into account the importance/rank of the particular parts of the data it analyses and adjust the algorithms used accordingly.

Keywords: Bioinformatics, gene expression, ontology, self-organizing maps.

7716 On Methodologies for Analysing Sickness Absence Data: An Insight into a New Method

Authors: Xiaoshu Lu, Päivi Leino-Arjas, Kustaa Piha, Akseli Aittomäki, Peppiina Saastamoinen, Ossi Rahkonen, Eero Lahelma

Abstract:

Sickness absence represents a major economic and social issue. Analysis of sick leave data is a recurrent challenge to analysts because of the complexity of the data structure, which is often time dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient and misguided, and traditional approaches do not address these problems. In this study, we discuss model methodologies in terms of statistical techniques for addressing the difficulties with sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism, using as a working example a large registration dataset from the Helsinki Health Study covering municipal employees in Finland during the period 1990-1999. We present a comparative study on model selection and a critical analysis of the temporal trends and the occurrence and degree of long-term sickness absences among municipal employees. The strengths of this working example include the large sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to select an appropriate model, to introduce a new methodology for analysing sickness absence data, and to demonstrate the model's applicability to complicated longitudinal data.

Keywords: Sickness absence, longitudinal data, methodologies, mix-distribution model.

7715 Phase Jitter Transfer in High Speed Data Links

Authors: Tsunwai Gary Yip

Abstract:

Phase locked loops in 10 Gb/s and faster data links are low phase noise devices. Characterization of their phase jitter transfer functions is difficult because the intrinsic noise of the PLLs is comparable to the phase noise of the reference clock signal. The problem is solved by using a linear model to account for the intrinsic noise. This study also introduces a novel technique for measuring the transfer function. It involves the use of the reference clock as a source of wideband excitation, in contrast to the commonly used sinusoidal excitations at discrete frequencies. The data reported here include the intrinsic noise of a PLL for 10 Gb/s links and the jitter transfer function of a PLL for 12.8 Gb/s links. The measured transfer function suggests that the PLL responded like a second order linear system to a low noise reference clock.
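
The closing observation, that the PLL responded like a second order linear system, corresponds to the standard two-parameter jitter transfer function sketched below; the natural frequency and damping values are illustrative, not the measured ones.

    import numpy as np
    from scipy import signal

    # H(s) = (2*zeta*wn*s + wn**2) / (s**2 + 2*zeta*wn*s + wn**2)
    wn = 2 * np.pi * 4e6          # natural frequency, rad/s (assumed)
    zeta = 0.8                    # damping ratio (assumed)
    num = [2 * zeta * wn, wn ** 2]
    den = [1.0, 2 * zeta * wn, wn ** 2]
    w, h = signal.freqs(num, den, worN=2 * np.pi * np.logspace(5, 9, 400))
    jitter_transfer_db = 20 * np.log10(np.abs(h))   # magnitude response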

Keywords: Intrinsic phase noise, jitter in data link, PLL jitter transfer function, high speed clocking in electronic circuits.

7714 A Review in Recent Development of Network Threats and Security Measures

Authors: Roza Dastres, Mohsen Soori

Abstract:

Networks are vulnerable due to their basic feature of facilitating remote access and data communication. The information in networks needs to be kept secure in order to provide an effective communication and sharing device in the web of data. Owing to the challenges and threats to data in networks, network security is one of the most important considerations in information technology infrastructures. As a result, security measures are applied in the network in order to decrease the probability of hackers accessing the secured data. The purpose of network security is to protect the network and its components from unauthorized access and abuse in order to provide a safe and secure communication device for users. In the present research work, a review of recent developments in network threats and security measures is presented, and future research directions are also suggested. Different attacks on networks and the security measures against them are discussed in order to increase security in the web of data. In this way, new ideas in network security systems can be developed by analyzing the published literature in order to move the research field forward.

Keywords: Network threats, network security, security measures, firewalls.

7713 Human Resources and Business Result: An Empirical Approach Based On RBV Theory

Authors: Xhevrie Mamaqi

Abstract:

Organizational learning capacity is a process referring to the sum total of individual and collective learning through training programs, experience and experimentation, among others. Today, ongoing in-business training is one of the most important strategies for human capital development, and it is crucial for sustaining and improving workers’ knowledge and skills. Many organizations, firms and businesses are adopting a strategy of continuous learning, encouraging employees to learn new skills continually, to be innovative and to try new processes and ways of working in order to achieve a competitive advantage and superior business results. This paper uses the Resource Based View and Capacities (RBV) approach to construct a hypothetical model of relationships between training and business results. The model is tested on cross-sectional data from a sample of 266 businesses in the Spanish service sector. A Structural Equation Model (SEM) is used to estimate the relationship between ongoing training, represented by two latent dimensions denominated Human and Social Capital resources, and economic business results. The estimated coefficients show the effect of several training aspects in explaining the variation in business results.

Keywords: Business results, Human and Social Capital resources, training, RBV Theory, SEM.

7712 Surface Elevation Dynamics Assessment Using Digital Elevation Models, Light Detection and Ranging, GPS and Geospatial Information Science Analysis: Ecosystem Modelling Approach

Authors: Ali K. M. Al-Nasrawi, Uday A. Al-Hamdany, Sarah M. Hamylton, Brian G. Jones, Yasir M. Alyazichi

Abstract:

Surface elevation dynamics have always responded to disturbance regimes. Creating Digital Elevation Models (DEMs) to detect surface dynamics has led to the development of several methods, devices and data clouds. DEMs can provide accurate and quick results cost-efficiently in comparison to traditional geomatics survey techniques. Nowadays, remote sensing datasets have become a primary source for creating DEMs, including LiDAR point clouds processed with GIS analytic tools. However, these data need to be tested for error detection and correction. This paper evaluates various DEMs from different data sources over time for Apple Orchard Island, a coastal site in southeastern Australia, in order to detect surface dynamics. Subsequently, 30 chosen locations were examined in the field to test the error of the DEM surface detection using high resolution global positioning systems (GPS). Results show significant surface elevation changes on Apple Orchard Island: accretion occurred on most of the island, while surface elevation loss due to erosion was limited to the northern and southern parts. Concurrently, a differential correction and validation method was applied to identify errors in the dataset. The resultant DEMs demonstrated a small error ratio (≤ 3%) against the fieldwork survey using RTK-GPS. As modern modelling approaches need to become more effective and accurate, applying several tools to create different DEMs on a multi-temporal scale allows predictions within practical time and cost frames, with more comprehensive coverage and greater accuracy. With a DEM technique for the eco-geomorphic context, such insights into ecosystem dynamics at a coastal intertidal system are valuable for assessing the accuracy of the predicted eco-geomorphic risk for sustainable conservation management. This framework for evaluating historical and current anthropogenic and environmental stressors on coastal surface elevation dynamics could be profitably applied worldwide.
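
The elevation-change detection underlying such an assessment reduces to differencing co-registered DEM grids, as in the sketch below; the limit of detection used to suppress noise is an illustrative value, not the study's.

    import numpy as np

    def dem_change(dem_t1, dem_t2, lod=0.10):
        # Per-cell elevation change between two co-registered DEM grids:
        # positive = accretion, negative = erosion. Changes below the
        # limit of detection (lod, metres; assumed value) are zeroed.
        diff = np.asarray(dem_t2, float) - np.asarray(dem_t1, float)
        diff[np.abs(diff) < lod] = 0.0
        return diff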

Keywords: DEMs, eco-geomorphic-dynamic processes, geospatial information science, remote sensing, surface elevation changes.

7711 A Review on Soft Computing Technique in Intrusion Detection System

Authors: Noor Suhana Sulaiman, Rohani Abu Bakar, Norrozila Sulaiman

Abstract:

Intrusion detection systems are significant in network security. They detect and identify intrusion behavior or intrusion attempts in a computer system by monitoring and analyzing network packets in real time. In recent years, intelligent algorithms applied in intrusion detection systems (IDS) have been of increasing concern with the rapid growth of network security. IDS data comprise a huge amount of data containing irrelevant and redundant features, causing slow training and testing, higher resource consumption and poor detection rates. Since the amount of audit data that an IDS needs to examine is very large even for a small network, classification by hand is impossible. Hence, the primary objective of this review is to survey the techniques applied prior to the classification process that suit IDS data.

Keywords: Intrusion Detection System, security, soft computing, classification.

7710 Analysis of Formyl Peptide Receptor 1 Protein Value as an Indicator of Neutrophil Chemotaxis Dysfunction in Aggressive Periodontitis

Authors: Prajna Metta, Yanti Rusyanti, Nunung Rusminah, Bremmy Laksono

Abstract:

The decrease of neutrophil chemotaxis function may cause increased susceptibility to aggressive periodontitis (AP). Neutrophil chemotaxis is affected by formyl peptide receptor 1 (FPR1), which when activated responds to the bacterial chemotactic peptide formyl-methionyl-leucyl-phenylalanine (fMLP). FPR1 protein value is decreased in response to a wide number of inflammatory stimuli in AP patients. This study aimed to assess the alteration of FPR1 protein value in AP patients and whether FPR1 protein value could be used as an indicator of neutrophil chemotaxis dysfunction in AP. This is a case-control study with 20 AP patients and 20 control subjects. Three milliliters of peripheral blood were drawn and analyzed for FPR1 protein value with ELISA. The data were statistically analyzed with the Mann-Whitney test. Results showed that the mean FPR1 protein value is 0.353 pg/mL (0.11 to 1.18 pg/mL) in the AP group and 0.296 pg/mL (0.05 to 0.88 pg/mL) in the control group. The p-value of 0.787 (> 0.05) indicates no significant difference between the two groups. The present study suggests that FPR1 protein value shows no significant alteration in AP patients and could not be used as an indicator of neutrophil chemotaxis dysfunction.
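
The group comparison reported here is a standard Mann-Whitney test; a sketch with scipy follows, where the sample values are placeholders within the reported ranges rather than the study's raw measurements.

    from scipy.stats import mannwhitneyu

    # FPR1 protein values in pg/mL (placeholder numbers, not study data).
    ap_group = [0.11, 0.25, 0.31, 0.40, 1.18]
    control_group = [0.05, 0.22, 0.28, 0.35, 0.88]

    stat, p_value = mannwhitneyu(ap_group, control_group, alternative="two-sided")
    significant = p_value < 0.05   # the study found p = 0.787: not significant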

Keywords: Aggressive periodontitis, chemotaxis dysfunction, FPR1 protein value, neutrophil.

7709 The Significant Effect of Wudu’ and Zikr in Controlling Emotional Pressure Using the Biofeedback emWave Technique

Authors: Mohd Anuar Awang Idris, Muhammad Nubli Abdul Wahab, Nora Yusma Mohamed Yusoff

Abstract:

Wudu’ (ablution) and Zikr are among the spiritual tools that may help an individual control his mind, emotion and attitude. These tools are deemed able to deliver a positive impact on an individual’s psychophysiology. The main objective of this research is to determine the effects of Wudu’ and Zikr therapy using the biofeedback emWave application and technology. For this research, 13 students were selected as samples from the students’ representative body at the University Tenaga National, Malaysia. The DASS (Depression Anxiety Stress Scale) questionnaire was used to help assess and measure each student’s ability to control his or her emotions before and after the therapies. The biofeedback emWave technology was utilized to monitor each student’s psychophysiology level. In addition, the data obtained from the heart rate variability (HRV) test were used to affirm that Wudu’ and Zikr had significant impacts on the students’ success in controlling their emotional pressure.

Keywords: Biofeedback emWave, emotion, psychophysiology, wudu’, zikr.

7708 Modeling and Simulation of Acoustic Link Using Mackenzie Propagation Speed Equation

Authors: Christhu Raj M. R., Rajeev Sukumaran

Abstract:

Underwater acoustic networks have attracted great attention in the last few years because of their numerous applications. High data rates can be achieved by efficiently modeling the physical layer in the network protocol stack. In the acoustic medium, the propagation speed of acoustic waves depends on many parameters, such as temperature, salinity, density and depth, and it cannot be modeled using standard empirical formulas such as the Urick and Thorp descriptions. In this paper, we have modeled the acoustic channel using real-time temperature, salinity and speed data for the Bay of Bengal (Indian coastal region), applying the Mackenzie speed equation to real-time data obtained from the National Institute of Oceanography and Technology. It is found that the acoustic propagation speed varies between 1503 m/s and 1544 m/s as temperature and depth differ. The simulation results show that temperature, salinity and depth play a major role in acoustic propagation, and the data rate increases with appropriate data sets substituted into the simulated model.
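
For reference, the Mackenzie (1981) nine-term sound speed equation used in the paper is reproduced below as a sketch; the example inputs are illustrative values for warm shallow water, not the institute's dataset.

    def mackenzie_speed(T, S, D):
        # Mackenzie (1981) nine-term equation for sound speed in sea water.
        # T: temperature (deg C), S: salinity (parts per thousand),
        # D: depth (m); valid roughly for 2-30 degC, 25-40 ppt, 0-8000 m.
        return (1448.96 + 4.591 * T - 5.304e-2 * T**2 + 2.374e-4 * T**3
                + 1.340 * (S - 35.0) + 1.630e-2 * D + 1.675e-7 * D**2
                - 1.025e-2 * T * (S - 35.0) - 7.139e-13 * T * D**3)

    c = mackenzie_speed(T=28.0, S=33.0, D=50.0)   # ~1540 m/s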

Keywords: Underwater Acoustics, Mackenzie Speed Equation, Temperature, Salinity.

7707 A Keyword-Based Filtering Technique of Document-Centric XML using NFA Representation

Authors: Changwoo Byun, Kyounghan Lee, Seog Park

Abstract:

XML is becoming a de facto standard for online data exchange. Existing XML filtering techniques based on a publish/subscribe model are focused on highly structured data marked up with XML tags. These techniques are efficient in filtering documents of data-centric XML but are not effective in filtering the element contents of document-centric XML. In this paper, we propose an extended XPath specification which includes the special matching character '%' used in the LIKE operation of SQL, in order to overcome the difficulty of writing queries that adequately filter element contents under the previous XPath specification. We also present a novel technique for filtering a collection of document-centric XML documents, called Pfilter, which is able to exploit the extended XPath specification. We present several performance studies of efficiency and scalability using the multi-query processing time (MQPT).
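
The effect of the proposed '%' matching character can be sketched by translating a LIKE-style value predicate into an anchored regular expression; the function name and the sample predicate below are hypothetical, not part of Pfilter.

    import re

    def like_to_regex(pattern):
        # '%' matches any (possibly empty) substring, as in SQL LIKE.
        return re.compile("^" + ".*".join(map(re.escape, pattern.split("%"))) + "$")

    # e.g. a value-based predicate //book/title[text() LIKE "%XML%"]
    matcher = like_to_regex("%XML%")
    assert matcher.match("Streaming XML filtering") is not None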

Keywords: XML Data Stream, Document-centric XML, Filtering Technique, Value-based Predicates.

7706 A Mixture Model of Two Different Distributions Approach to the Analysis of Heterogeneous Survival Data

Authors: Ülkü Erişoğlu, Murat Erişoğlu, Hamza Erol

Abstract:

In this paper we propose mixtures of two different distributions, such as Exponential-Gamma, Exponential-Weibull and Gamma-Weibull, to model heterogeneous survival data. Various properties of the proposed mixtures of two different distributions are discussed. Maximum likelihood estimates of the parameters are obtained using the EM algorithm. An illustrative example based on real data is also given.
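
A compact EM sketch follows; it is simplified to a mixture of two exponential components so the M-step stays closed-form, whereas the paper pairs different families (Exponential-Gamma, Exponential-Weibull, Gamma-Weibull).

    import numpy as np

    def em_two_exponentials(t, iters=200):
        # EM for a two-component exponential mixture on survival times t.
        t = np.asarray(t, float)
        pi, lam1, lam2 = 0.5, 1.0 / t.mean(), 2.0 / t.mean()   # initial guesses
        for _ in range(iters):
            # E-step: responsibility of component 1 for each observation.
            p1 = pi * lam1 * np.exp(-lam1 * t)
            p2 = (1.0 - pi) * lam2 * np.exp(-lam2 * t)
            r = p1 / (p1 + p2)
            # M-step: weighted maximum likelihood updates.
            pi = r.mean()
            lam1 = r.sum() / (r * t).sum()
            lam2 = (1.0 - r).sum() / ((1.0 - r) * t).sum()
        return pi, lam1, lam2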

Keywords: Exponential-Gamma, Exponential-Weibull, Gamma-Weibull, EM Algorithm, Survival Analysis.

7705 Motion Recognition Based On Fuzzy WP Feature Extraction Approach

Authors: Keun-Chang Kwak

Abstract:

This paper is concerned with motion recognition based on a fuzzy WP (Wavelet Packet) feature extraction approach applied to Vicon physical data sets. For this purpose, we use an efficient fuzzy mutual-information-based WP transform for feature extraction. This method estimates the required mutual information using a novel approach based on a fuzzy membership function. The physical action data set includes 10 normal and 10 aggressive physical actions that measure human activity. The data have been collected from 10 subjects using the Vicon 3D tracker. The experiments cover running, sitting and walking as physical activity motions among various activities. The experimental results reveal that the presented feature extraction approach shows good recognition performance.
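
Wavelet packet subband energies of the kind used as candidate features can be extracted as sketched below with PyWavelets; the wavelet, depth and energy feature are illustrative, and the fuzzy mutual-information selection step is not shown.

    import numpy as np
    import pywt

    signal = np.random.default_rng(0).normal(size=1024)   # placeholder motion channel
    wp = pywt.WaveletPacket(data=signal, wavelet="db4", maxlevel=3)
    # One energy value per terminal subband at level 3 (frequency order).
    features = [float(np.sum(node.data ** 2)) for node in wp.get_level(3, order="freq")]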

Keywords: Motion recognition, fuzzy wavelet packet, Vicon physical data.

7704 Resilient Modulus and Deformation Responses of Waste Glass in Flexible Pavement System

Authors: M. Al-Saedi, A. Chegenizadeh, H. Nikraz

Abstract:

Experimental investigations are conducted to assess a layered structure of glass (G) - rock (R) blends under the impact of repeated loading. Laboratory tests include sieve analyses, the modified compaction test and the repeated load triaxial test (RLTT), conducted on different structures of stratified GR samples to meet the objectives of this study. Waste materials are essential considerations in the climate system and are commonly used to minimise the need for natural materials in many countries. Glass is one of the most widely used groups of waste materials and has been used extensively in road applications. A full range of glass particle sizes and colours is collected and mixed at different ratios with natural rock material, with the aim of using the blends in pavement layers. Each whole subsurface specimen consists of a single layer of R and a layer of G-R blend. Mix ratios of 12G/88R and 45G/55R are employed in this research, the thickness of the G-R layer is varied, and the results are compared between the pure rock and the layered specimens. The relations between resilient modulus (Mr) and permanent deformation with sequence number are presented. During the earlier stages of RLTT, the results indicate that the 45G/55R specimen shows higher moduli than the R specimen.
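
The headline quantity from an RLTT is the resilient modulus, the ratio of cyclic deviator stress to recoverable axial strain; the numbers in the sketch below are illustrative, not the test results.

    def resilient_modulus(deviator_stress_kpa, recoverable_strain):
        # Mr = cyclic deviator stress / recoverable (resilient) axial strain.
        return deviator_stress_kpa / recoverable_strain

    Mr = resilient_modulus(200.0, 4.0e-4)   # 500,000 kPa = 500 MPa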

Keywords: Rock base course, layered structure, glass, resilient modulus.

7703 Empirical Analytical Modelling of Average Bond Stress and Anchorage of Tensile Bars in Reinforced Concrete

Authors: Maruful H. Mazumder, Raymond I. Gilbert

Abstract:

The design specifications for calculating development and lapped splice lengths of reinforcement in concrete are derived from a conventional empirical modelling approach that correlates experimental test data using a single mathematical equation. This paper describes part of a recently completed experimental research program to assess the effects of different structural parameters on the development length requirements of modern high strength steel reinforcing bars, including the case of lapped splices in large-scale reinforced concrete members. The normalized average bond stresses for the different variations of anchorage lengths are assessed according to the general form of a typical empirical analytical model of bond and anchorage. Improved analytical modelling equations are developed in the paper that better correlate the normalized bond strength parameters with the structural parameters of an empirical model of bond and anchorage.
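
The normalization rests on force equilibrium over the anchored length, A_b f_s = τ π d_b l_d, giving the average bond stress in the sketch below; the example values are illustrative.

    def average_bond_stress(f_s, d_b, l_d):
        # Equilibrium of an anchored bar: A_b * f_s = tau * pi * d_b * l_d,
        # with A_b = pi * d_b**2 / 4, so tau = f_s * d_b / (4 * l_d).
        # Use consistent units (e.g. MPa and mm).
        return f_s * d_b / (4.0 * l_d)

    tau = average_bond_stress(f_s=500.0, d_b=16.0, l_d=600.0)   # ~3.33 MPa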

Keywords: Bond stress, Development length, Lapped splice length, Reinforced concrete.

7702 Generating State-Based Testing Models for Object-Oriented Framework Interface Classes

Authors: Jehad Al Dallal, Paul Sorenson

Abstract:

An application framework provides a reusable design and implementation for a family of software systems. Application developers extend the framework to build their particular applications using hooks. Hooks are the places identified to show how to use and customize the framework. Hooks define the Framework Interface Classes (FICs) and the specifications of their methods. As part of the development life cycle, it is required to test the implementations of the FICs. Building a testing model to express the behavior of a class is an essential step in the generation of class-based test cases. The testing model has to be consistent with the specifications provided for the hooks. State-based models consisting of states and transitions are testing models well suited to object-oriented software. Typically, hand-construction of a state-based model of a class's behavior is expensive and error-prone, and may result in a model that is inconsistent with the specifications of the class methods, which misleads verification results. In this paper, a technique is introduced to automatically synthesize a state-based testing model for FICs using the specifications provided for the hooks. A tool that supports the proposed technique is also introduced.
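
A state-based testing model of the kind synthesized here is, in essence, a transition map keyed by (state, method); the states, methods and replay helper below are hypothetical placeholders, not the paper's tool.

    # (state, method_call) -> next_state; all names are hypothetical.
    fic_model = {
        ("Created", "init"): "Ready",
        ("Ready", "process"): "Ready",
        ("Ready", "close"): "Closed",
    }

    def run_sequence(model, start, calls):
        # Replay a call sequence; hitting a missing transition flags an
        # inconsistency between implementation and hook specifications.
        state = start
        for call in calls:
            if (state, call) not in model:
                return False, state
            state = model[(state, call)]
        return True, state

    ok, final_state = run_sequence(fic_model, "Created", ["init", "process", "close"])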

Keywords: Framework interface classes, hooks, state-based testing, testing model.

7701 DCBOR: A Density Clustering Based on Outlier Removal

Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan

Abstract:

Data clustering is an important data exploration technique with many applications in data mining. We present an enhanced version of the well-known single-link clustering algorithm, which we refer to as DCBOR. The proposed algorithm alleviates the chain effect by removing the outliers from the given dataset, so it provides outlier detection and data clustering simultaneously. The algorithm does not need to update the distance matrix, since it depends on merging the k-nearest objects in one step, and a cluster continues to grow as long as possible under a specified condition. The algorithm consists of two phases: in the first phase, it removes the outliers from the input dataset; in the second phase, it performs the clustering process. It discovers clusters of different shapes, sizes and densities and requires only one input parameter, a threshold for outlier points ranging from 0 to 1; the algorithm supports the user in determining an appropriate value for it. We have tested this algorithm on different datasets containing outliers and clusters connected by chains of density points, and the algorithm discovers the correct clusters. The results of our experiments demonstrate the effectiveness and efficiency of DCBOR.
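
The two-phase idea can be sketched as below: score each point by its k-nearest-neighbour distance, drop the most isolated fraction, then single-link cluster the remainder; k, the fraction and the cluster count are assumptions, not the paper's exact scheme.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import cdist

    def dcbor_sketch(X, outlier_frac=0.05, k=4, n_clusters=3):
        # Phase 1: density-based outlier removal via k-NN distance.
        D = cdist(X, X)
        knn_dist = np.sort(D, axis=1)[:, k]     # distance to k-th neighbour
        keep = knn_dist <= np.quantile(knn_dist, 1.0 - outlier_frac)
        # Phase 2: single-link clustering on the cleaned data.
        Z = linkage(X[keep], method="single")
        labels = fcluster(Z, t=n_clusters, criterion="maxclust")
        return keep, labels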

Keywords: Data Clustering, Clustering Algorithms, Handling Noise, Arbitrary Shape of Clusters.

7700 Alternative to M-Estimates in Multisensor Data Fusion

Authors: Nga-Viet Nguyen, Georgy Shevlyakov, Vladimir Shin

Abstract:

This paper addresses the problem of multisensor data fusion under non-Gaussian channel noise. The advanced M-estimates are known to be a robust solution while trading off some accuracy. In order to improve the estimation accuracy while maintaining equivalent robustness, a two-stage robust fusion algorithm is proposed, using preliminary rejection of outliers followed by an optimal linear fusion. The numerical experiments show that the proposed algorithm is equivalent to the M-estimates in the case of uncorrelated local estimates and significantly outperforms the M-estimates when local estimates are correlated.
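
The two stages can be sketched as follows: a simple robust screen rejects outlying local estimates (the rejection rule here is an assumption), and the survivors are combined by inverse-variance weighting, the minimum-variance linear fusion for uncorrelated estimates.

    import numpy as np

    def robust_linear_fusion(estimates, variances, z_cut=3.0):
        x = np.asarray(estimates, float)
        v = np.asarray(variances, float)
        # Stage 1: reject outliers via a median/MAD z-score screen.
        med = np.median(x)
        mad = np.median(np.abs(x - med)) + 1e-12
        keep = np.abs(x - med) / (1.4826 * mad) <= z_cut
        # Stage 2: optimal (inverse-variance weighted) linear fusion.
        w = 1.0 / v[keep]
        return np.sum(w * x[keep]) / np.sum(w)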

Keywords: Data fusion, estimation, robustness, M-estimates.
