Search results for: multifractal time series analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 40457

39857 Audience Perceptions and Attitudes Towards the Representation of Tribal South African Culture in Drama Series

Authors: Oluwayemisi Mary Onyenanakeya, Kevin Onyenankeya

Abstract:

Commercial media entertainment offerings in South Africa, especially mainstream soap operas, are progressively infusing dominant social values and ideas that are alien to South African tribal societies. In most commodified television drama series, people who hold tight to traditional beliefs and values are characterised as traditionalists, while those who have imbibed Western-defined dicta and the ideology of modernity are seen as progressives. This study therefore sought to ascertain how South African tribal language, traditional institutions, values, social norms and ancestral beliefs are portrayed in the television drama Generations: The Legacy, what viewers think about those constructions, and the implications for cultural identity. A mixed methods approach was employed, involving the administration of a questionnaire to 350 participants selected through random sampling and a content analysis of 20 episodes of Generations: The Legacy. The findings showed that the values and traditions represented in Generations: The Legacy do not significantly reflect South African tribal traditions and values (p-value > 0.05). In most instances where traditional values are represented, they tend to be portrayed as old-fashioned (p-value > 0.05) and as inferior and backward (p-value > 0.05). In addition, the findings indicate that Generations: The Legacy is a vehicle for promoting a dominant culture.

Keywords: identity, soap opera, South Africa, television

Procedia PDF Downloads 280
39856 Efficient Wind Fragility Analysis of Concrete Chimney under Stochastic Extreme Wind Incorporating Temperature Effects

Authors: Soumya Bhattacharjya, Avinandan Sahoo, Gaurav Datta

Abstract:

Wind fragility analysis of chimneys is often carried out disregarding the temperature effect. However, the combined effect of wind and temperature is the most critical limit state for chimney design. Hence, in the present paper, an efficient fragility analysis of a concrete chimney is explored under combined wind and temperature effects. Wind time histories are generated from Davenport's power spectral density function using the Weighted Amplitude Wave Superposition technique. Fragility analysis is often carried out in a full Monte Carlo simulation framework, which requires extensive computational time. Thus, in the present paper, an efficient adaptive metamodelling technique is adopted to judiciously approximate the limit state function, which is subsequently used in the simulation framework. This saves substantial computational time and makes the approach computationally efficient. Uncertainty in wind speed, wind-load-related parameters, and resistance-related parameters is considered. The results of the full simulation approach, the conventional metamodelling approach and the proposed adaptive metamodelling approach are compared, and the effect of disregarding temperature in wind fragility analysis is highlighted.
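
The wave-superposition step can be illustrated with a minimal sketch: a zero-mean stationary record is synthesised as a sum of cosines whose amplitudes are weighted by a one-sided PSD, with random phases. The flat spectrum and every number below are illustrative assumptions, not Davenport's spectrum or the paper's parameters.

```python
import math
import random

def waws_time_history(psd, f_max, n_freq, duration, dt, seed=0):
    """Superpose cosines with PSD-weighted amplitudes and random phases
    (spectral representation / weighted amplitude wave superposition)."""
    rng = random.Random(seed)
    df = f_max / n_freq
    freqs = [(k + 0.5) * df for k in range(n_freq)]
    amps = [math.sqrt(2.0 * psd(f) * df) for f in freqs]   # a_k = sqrt(2 S df)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in freqs]
    n_t = round(duration / dt)
    return [sum(a * math.cos(2.0 * math.pi * f * i * dt + p)
                for a, f, p in zip(amps, freqs, phases))
            for i in range(n_t)]

# Illustrative band-limited white spectrum, not Davenport's form
flat_psd = lambda f: 1.0
x = waws_time_history(flat_psd, f_max=2.0, n_freq=64, duration=30.0, dt=0.05)
```

The target variance of the record is the integral of the PSD over the simulated band; longer records converge to it.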

Keywords: adaptive metamodelling technique, concrete chimney, fragility analysis, stochastic extreme wind load, temperature effect

Procedia PDF Downloads 210
39855 Percentile Norms of Heart Rate Variability (HRV) of Indian Sportspersons Withdrawn from Competitive Games and Sports

Authors: Pawan Kumar, Dhananjoy Shaw

Abstract:

Heart rate variability (HRV) is the physiological phenomenon of variation in the time interval between heartbeats; it is alterable with fitness, age and various medical conditions, including withdrawal/retirement from games/sports. The objectives of the study were to develop (a) percentile norms of HRV variables derived from time domain analysis of Indian sportspersons withdrawn from competitive games/sports, pertaining to sympathetic and parasympathetic activity, and (b) percentile norms of HRV variables derived from frequency domain analysis of the same population. The study was conducted on 430 males aged 30 to 35 years of the same socio-economic status. Data were collected using ECG polygraphs, processed and extracted using frequency domain and time domain analysis, and percentiles from one to hundred were computed. The findings established the following percentile ranges for the time domain variables: NN50 count, 1 to 189; pNN50, 0.24 to 60.80; SDNN, 17.34 to 167.29; SDSD, 11.14 to 120.46; RMSSD, 11.19 to 120.24; and SDANN, 4.02 to 88.75. For the frequency domain variables, the percentile ranges were: low frequency (normalized power), 20.68 to 90.49; high frequency (normalized power), 14.37 to 81.60; LF/HF ratio, 0.26 to 9.52; LF (absolute power), 146.79 to 5669.33; HF (absolute power), 102.85 to 10735.71; and total power (absolute power), 471.45 to 25879.23. Conclusion: the analysis documented percentile norms for time domain and frequency domain variables for versatile use and evaluation.
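
The time-domain measures named above follow directly from a sequence of NN (RR) intervals. A minimal sketch in Python, with made-up interval values rather than the study's data:

```python
import math
import statistics

def hrv_time_domain(rr_ms):
    """Standard time-domain HRV metrics from RR (NN) intervals in ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    nn50 = sum(1 for d in diffs if abs(d) > 50)   # successive diffs > 50 ms
    return {
        "SDNN": statistics.stdev(rr_ms),          # overall variability
        "RMSSD": math.sqrt(sum(d * d for d in diffs) / len(diffs)),
        "NN50": nn50,
        "pNN50": 100.0 * nn50 / len(diffs),
    }

rr = [800, 810, 790, 870, 860, 800, 805]   # hypothetical intervals, ms
m = hrv_time_domain(rr)
```

Percentile norms are then obtained by computing each metric per subject and ranking across the sample.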

Keywords: RMSSD, percentile, SDANN, HF, LF

Procedia PDF Downloads 412
39854 Axle Load Estimation of Moving Vehicles Using BWIM Technique

Authors: Changgil Lee, Seunghee Park

Abstract:

Although a vehicle driving test is necessary for the development of a BWIM system, it requires considerable cost and time, in addition to the application of various driving conditions. Thus, a numerical simulation method is needed that resolves the cost and time problems of the vehicle driving test and provides a way of measuring the response of the bridge under various driving conditions. Using a precision analysis model that reflects the dynamic characteristics of the bridge contributes to increased accuracy in the numerical simulation. In this paper, we conduct a numerical simulation applying a precision analysis model that reflects the dynamic characteristics of the bridge using the Bridge Weigh-in-Motion (BWIM) technique, and we suggest an overload vehicle enforcement technology based on this precision analysis model.

Keywords: bridge weigh-in-motion(BWIM) system, precision analysis model, dynamic characteristic of bridge, numerical simulation

Procedia PDF Downloads 283
39853 Quantitative Structure-Activity Relationship Analysis of Binding Affinity of a Series of Anti-Prion Compounds to Human Prion Protein

Authors: Strahinja Kovačević, Sanja Podunavac-Kuzmanović, Lidija Jevrić, Milica Karadžić

Abstract:

The present study is based on quantitative structure-activity relationship (QSAR) analysis of eighteen compounds with anti-prion activity. The structures and anti-prion activities (expressed in response units, RU%) of the analyzed compounds were taken from the CHEMBL database. In the first step of the analysis, 85 molecular descriptors were calculated, and based on them, hierarchical cluster analysis (HCA) and principal component analysis (PCA) were carried out in order to detect potentially significant similarities or dissimilarities among the studied compounds. The calculated molecular descriptors were physicochemical, lipophilicity and ADMET (absorption, distribution, metabolism, excretion and toxicity) descriptors. The first stage of the QSAR analysis was simple linear regression modeling. It resulted in one acceptable model that correlates Henry's law constant with RU%. The obtained 2D-QSAR model was validated by cross-validation as an internal validation method. The validation procedure confirmed the model's quality, so it can be used for prediction of anti-prion activity. The next stage of the analysis will include 3D-QSAR and molecular docking approaches in order to select the most promising compounds for the treatment of prion diseases. These results are part of project No. 114-451-268/2016-02, financially supported by the Provincial Secretariat for Science and Technological Development of AP Vojvodina.
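
The internal-validation step described here, a simple linear regression checked by cross-validation, can be sketched with leave-one-out PRESS and a Q² statistic. The descriptor/activity pairs below are hypothetical, not the CHEMBL values:

```python
def fit_slr(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def loo_press(xs, ys):
    """Leave-one-out predicted residual sum of squares."""
    press = 0.0
    for i in range(len(xs)):
        a, b = fit_slr(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (a + b * xs[i])) ** 2
    return press

# Hypothetical descriptor/activity pairs
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = fit_slr(x, y)
ss_tot = sum((v - sum(y) / len(y)) ** 2 for v in y)
q2 = 1.0 - loo_press(x, y) / ss_tot   # cross-validated Q-squared
```

A Q² close to 1 indicates the model predicts held-out compounds well, which is what "confirmed the model's quality" amounts to numerically.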

Keywords: anti-prion activity, chemometrics, molecular modeling, QSAR

Procedia PDF Downloads 294
39852 Integrated Dynamic Analysis of Semi-Submersible Flap Type Concept

Authors: M. Rafiur Rahman, M. Mezbah Uddin, Mohammad Irfan Uddin, M. Moinul Islam

Abstract:

With the rapid development of the offshore renewable energy industry, research activities on harnessing power from offshore wind and wave energy are increasing day by day. Integration of wind turbines and wave energy converters into one combined semi-submersible platform may be a cost-effective and beneficial option. In this paper, the coupled integrated dynamic analysis in the time domain (TD) of a simplified semi-submersible flap type concept (SFC) is accomplished via the state-of-the-art numerical code referred to as Simo-Riflex-Aerodyn (SRA). This concept is a combined platform consisting of a semi-submersible floater supporting a 5 MW horizontal axis wind turbine (WT) and three elliptical flap-type wave energy converters (WECs) on three pontoons. The main focus is to validate the numerical model of the SFC against experimental results and to perform frequency domain (FD) and TD response analyses. The numerical analysis uses potential flow theory for the hydrodynamics and blade element momentum (BEM) theory for the aerodynamics. A variety of environmental conditions, encompassing the functional and survival conditions for short-term seas (1-hour simulations), are tested to evaluate the sustainability of the SFC. The numerical analysis is performed at full scale. Finally, the time domain analysis of the heave, pitch and surge motions is performed numerically using SRA and compared with the experimental results. Due to the simplification of the model, there are some discrepancies, which are discussed in brief.

Keywords: coupled integrated dynamic analysis, SFC, time domain analysis, wave energy converters

Procedia PDF Downloads 213
39851 Modification of Rubber Swab Tool with Brush to Reduce Rubber Swab Fraction Fishing Time

Authors: T. R. Hidayat, G. Irawan, F. Kurniawan, E. H. I. Prasetya, Suharto, T. F. Ridwan, A. Pitoyo, A. Juniantoro, R. T. Hidayat

Abstract:

Swabbing is an activity to lift fluid from inside a well using a sand line, with the aim either of determining fluid influx after perforation or of reducing the fluid level to create a difference between formation pressure and hydrostatic pressure in the well for underbalanced perforation. During swab activities, problems frequently occur with the rubber swab: it often breaks and becomes fish inside the well. Fishing for these rubber swab fractions makes the rig operation take longer, delays the swab result data and creates potential losses of well operation for the company. The average time needed for fishing the fractions of the rubber swab, plus the swab work itself, was 42 hours. The innovation made for this problem is a modification of the rubber swab tool. The tool is modified by providing a series of brushes at its end part, with a threaded connection, in order to improve work safety: when the rubber swab breaks, the broken swab is lifted by the brush underneath, which reduces the time lost to rubber swab fishing. This tool has been applied, and it is proven that with this modification the rig operation becomes more efficient, because the rubber swab fishing activity is no longer carried out. The fish fractions of the rubber swab are lifted to the surface, which saves fuel cost, and well production potentials are obtained. The average time to do swab work after the application of this modified tool is 8 hours.

Keywords: rubber swab, swab modification, brush, rubber swab fishing, cost saving

Procedia PDF Downloads 163
39850 Does Pakistan Stock Exchange Offer Diversification Benefits to Regional and International Investors: A Time-Frequency (Wavelets) Analysis

Authors: Syed Jawad Hussain Shahzad, Muhammad Zakaria, Mobeen Ur Rehman, Saniya Khaild

Abstract:

This study examines the co-movement between the Pakistani, Indian, S&P 500 and Nikkei 225 stock markets using weekly data from 1998 to 2013. The time-frequency relationship between the selected stock markets is examined using measures of the continuous wavelet power spectrum, the cross-wavelet transform and cross (squared) wavelet coherency. The empirical evidence suggests strong dependence between the Pakistani and Indian stock markets. The co-movement of the Pakistani index with the U.S. and Japanese developed markets varies over time and frequency, with the long-run relationship dominant. The results of the cross-wavelet and wavelet coherence analysis indicate moderate covariance and correlation between the stock indexes, and the markets are in phase (i.e., cyclical in nature) over varying durations. The Pakistani stock market lagged the Indian stock market during the entire period at the 8-32 and then the 64-256 week scales. Similar findings are evident for the S&P 500 and Nikkei 225 indexes; however, the relationship occurs during the later period of the study. All three wavelet indicators suggest strong evidence of higher co-movement during the 2008-09 global financial crisis. The empirical analysis reveals strong evidence that portfolio diversification benefits vary across frequencies and time. This analysis is unique and has several practical implications for regional and international investors assigning optimal weights to different assets in portfolio formation.
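
Wavelet coherence itself needs a dedicated wavelet library, but the underlying idea of time-varying co-movement can be previewed with a much cruder, single-scale stand-in: a rolling-window Pearson correlation of two return series. This is an illustrative proxy, not the paper's method, and the series below are invented.

```python
def rolling_corr(a, b, window):
    """Pearson correlation over a sliding window: a fixed-scale proxy for
    the time-varying co-movement that wavelet coherence resolves across
    many scales at once."""
    out = []
    for i in range(len(a) - window + 1):
        xa, xb = a[i:i + window], b[i:i + window]
        ma, mb = sum(xa) / window, sum(xb) / window
        cov = sum((p - ma) * (q - mb) for p, q in zip(xa, xb))
        va = sum((p - ma) ** 2 for p in xa)
        vb = sum((q - mb) ** 2 for q in xb)
        out.append(cov / (va * vb) ** 0.5)
    return out

# Perfectly synchronised toy return series: correlation 1 in every window
r1 = [0.01, -0.02, 0.015, 0.03, -0.01, 0.02, -0.005, 0.01]
r2 = [2 * v for v in r1]
c = rolling_corr(r1, r2, window=4)
```

Repeating this at several window lengths mimics, very roughly, reading coherence along different wavelet scales.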

Keywords: co-movement, Pakistan stock exchange, S&P 500, Nikkei 225, wavelet analysis

Procedia PDF Downloads 351
39849 Discriminant Analysis as a Function of Predictive Learning to Select Evolutionary Algorithms in Intelligent Transportation System

Authors: Jorge A. Ruiz-Vanoye, Ocotlán Díaz-Parra, Alejandro Fuentes-Penna, Daniel Vélez-Díaz, Edith Olaco García

Abstract:

In this paper, we present the use of discriminant analysis to select the evolutionary algorithm that best solves instances of the vehicle routing problem with time windows. We use indicators as independent variables to obtain the classification criteria, and the best algorithm among the generic genetic algorithm (GA), random search (RS), steady-state genetic algorithm (SSGA), and sexual genetic algorithm (SXGA) as the dependent variable for the classification. The discriminant classifier was trained with classic instances of the vehicle routing problem with time windows obtained from the Solomon benchmark, and achieved a classification accuracy of 66.7%.
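
The core idea, using an indicator to discriminate which algorithm to apply, can be reduced to a one-dimensional sketch: a midpoint discriminant between two classes, scored by classification accuracy. The indicator values and the two-algorithm setup are hypothetical simplifications of the paper's multi-class analysis.

```python
def fisher_threshold(class_a, class_b):
    """Midpoint between class means: a minimal 1-D stand-in for the
    discriminant functions used to pick a solver per instance."""
    ma = sum(class_a) / len(class_a)
    mb = sum(class_b) / len(class_b)
    return (ma + mb) / 2.0, ma < mb

def accuracy(class_a, class_b):
    """Fraction of training points the threshold classifies correctly."""
    t, a_is_lower = fisher_threshold(class_a, class_b)
    correct = sum(1 for v in class_a if (v < t) == a_is_lower)
    correct += sum(1 for v in class_b if (v < t) != a_is_lower)
    return correct / (len(class_a) + len(class_b))

# Hypothetical indicator values for instances best solved by GA vs. SSGA
ga = [1.2, 1.5, 1.1, 1.7, 2.0]
ssga = [2.8, 3.1, 2.5, 3.4, 2.2]
acc = accuracy(ga, ssga)
```

Real discriminant analysis generalises this to many indicators and classes, which is where figures like 66.7% come from.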

Keywords: Intelligent Transportation Systems, data-mining techniques, evolutionary algorithms, discriminant analysis, machine learning

Procedia PDF Downloads 456
39848 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Model

Authors: Alam Ali, Ashok Kumar Pathak

Abstract:

Path analysis is a statistical technique used to evaluate the strength of the direct and indirect effects of variables. One or more structural regression equations are used to estimate a series of parameters in order to find a better fit to the data. Sometimes, exogenous variables do not show significant direct and indirect effects when the assumptions of classical regression (ordinary least squares (OLS)) are violated by the nature of the data. The main motive of this article is to investigate the efficacy of the copula-based regression approach over the classical regression approach and to calculate the direct and indirect effects of variables when the data violate the OLS assumptions and the variables are linked through an elliptical copula. We perform this study using a well-organized numerical scheme. Finally, a real data application is presented to demonstrate the superiority of the copula approach.
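
In the classical OLS setting the paper takes as its baseline, the effects in a one-mediator path model decompose exactly as total = direct + a·b, where a and b are the two path coefficients. A small sketch with synthetic data (the copula-based estimation itself is beyond a few lines):

```python
def slope(y, x):
    """Simple OLS slope of y on x (with intercept)."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    return sum((p - mx) * (q - my) for p, q in zip(x, y)) / \
           sum((p - mx) ** 2 for p in x)

def ols2(y, x1, x2):
    """OLS of y on x1 and x2 (with intercept) via centred normal equations."""
    n = len(y)
    mean = lambda v: sum(v) / n
    cov = lambda u, v: sum((p - mean(u)) * (q - mean(v)) for p, q in zip(u, v))
    s11, s22, s12 = cov(x1, x1), cov(x2, x2), cov(x1, x2)
    s1y, s2y = cov(x1, y), cov(x2, y)
    det = s11 * s22 - s12 * s12
    return (s1y * s22 - s2y * s12) / det, (s2y * s11 - s1y * s12) / det

# Toy path model X -> M -> Y plus a direct X -> Y path; the perturbations
# keep X and M from being perfectly collinear
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
e = [0.3, -0.2, 0.1, -0.4, 0.5, -0.3]
M = [2.0 * x + d for x, d in zip(X, e)]
Y = [3.0 * m + 1.0 * x for m, x in zip(M, X)]   # b = 3, direct effect = 1

a = slope(M, X)               # X -> M path coefficient
direct, b = ols2(Y, X, M)     # direct X -> Y effect and M -> Y path
indirect = a * b
# For OLS these decompose exactly: slope(Y, X) == direct + indirect
```

The copula approach replaces the OLS fits with regression functions derived from the joint (elliptical) copula, but the direct/indirect bookkeeping is the same.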

Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique

Procedia PDF Downloads 64
39847 Exposing Latent Fingermarks on Problematic Metal Surfaces Using Time of Flight Secondary Ion Mass Spectroscopy

Authors: Tshaiya Devi Thandauthapani, Adam J. Reeve, Adam S. Long, Ian J. Turner, James S. Sharp

Abstract:

Fingermarks are a crucial form of evidence for identifying a person at a crime scene. However, visualising latent (hidden) fingermarks can be difficult, and the correct choice of techniques is essential to develop and preserve any fingermarks that might be present. Knives, firearms and other metal weapons have proven to be challenging substrates (stainless steel in particular) from which to reliably obtain fingermarks. In this study, time of flight secondary ion mass spectroscopy (ToF-SIMS) was used to image fingermarks on metal surfaces. This technique was compared to a conventional superglue-based fuming technique, accompanied by a series of contrast-enhancing dyes (basic yellow 40 (BY40), crystal violet (CV) and Sudan black (SB)), on three different metal surfaces. The conventional techniques showed little to no evidence of fingermarks being present on the metal surfaces after a few days. However, ToF-SIMS images revealed fingermarks on the same and similar substrates with an exceptional level of detail, demonstrating clear ridge definition as well as details about sweat pore position and shape that persist for over 26 days after deposition when the samples are stored under ambient conditions.

Keywords: conventional techniques, latent fingermarks, metal substrates, time of flight secondary ion mass spectroscopy

Procedia PDF Downloads 153
39846 Optimization of Electrical Discharge Machining Parameters in Machining AISI D3 Tool Steel by Grey Relational Analysis

Authors: Othman Mohamed Altheni, Abdurrahman Abusaada

Abstract:

This study presents the optimization of multiple performance characteristics [material removal rate (MRR), surface roughness (Ra), and overcut (OC)] of hardened AISI D3 tool steel in electrical discharge machining (EDM) using the Taguchi method and grey relational analysis. The machining process parameters selected were pulse current Ip, pulse-on time Ton, pulse-off time Toff and gap voltage Vg. Based on ANOVA, pulse current is found to be the most significant factor affecting the EDM process. The optimized process parameters, simultaneously leading to a higher MRR, lower Ra, and lower OC, are then verified through a confirmation experiment. The validation experiment shows improved MRR, Ra and OC when the Taguchi method and grey relational analysis are used.
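
The grey relational step reduces the multi-response problem to a single score per run: normalise each response, compute grey relational coefficients against the ideal sequence, and average them into a grade; the run with the highest grade wins. The EDM response values below are invented for illustration, and the distinguishing coefficient ζ = 0.5 is the conventional default.

```python
def grey_relational_grades(table, larger_better, zeta=0.5):
    """Grey relational grades for runs x responses; delta_min = 0 and
    delta_max = 1 after [0, 1] normalisation, so the coefficient is
    zeta / (delta + zeta)."""
    n_runs, n_resp = len(table), len(table[0])
    norm = []
    for j in range(n_resp):
        col = [row[j] for row in table]
        lo, hi = min(col), max(col)
        if larger_better[j]:                      # larger-the-better
            norm.append([(v - lo) / (hi - lo) for v in col])
        else:                                     # smaller-the-better
            norm.append([(hi - v) / (hi - lo) for v in col])
    grades = []
    for i in range(n_runs):
        coeffs = [zeta / ((1.0 - norm[j][i]) + zeta) for j in range(n_resp)]
        grades.append(sum(coeffs) / n_resp)
    return grades

# Hypothetical EDM runs: columns = (MRR up, Ra down, OC down)
runs = [(10.0, 3.2, 0.12), (14.0, 2.8, 0.10), (8.0, 4.0, 0.15)]
g = grey_relational_grades(runs, larger_better=[True, False, False])
best = g.index(max(g))
```

In a Taguchi design, the grades would then be fed into the usual means/ANOVA analysis to rank parameter levels.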

Keywords: EDM parameters, grey relational analysis, Taguchi method, ANOVA

Procedia PDF Downloads 285
39845 Machine Learning Approach for Yield Prediction in Semiconductor Production

Authors: Heramb Somthankar, Anujoy Chakraborty

Abstract:

This paper presents a classification study on yield prediction in semiconductor production using machine learning approaches. A complicated semiconductor production process is generally monitored continuously by signals acquired from sensors and measurement sites. A monitoring system contains a variety of signals, all of which contain useful information, irrelevant information, and noise. With each signal considered a feature, feature selection is used to find the most relevant signals. The open-source UCI SECOM dataset provides 1567 such samples, out of which 104 fail quality assurance. Feature extraction and selection are performed on the dataset, and the useful signals are considered for further study. Afterward, common machine learning algorithms are employed to predict whether a sample passes or fails quality assurance. The most suitable algorithm is selected for prediction based on the accuracy and loss of the ML model.
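
One elementary form of the feature-selection step is to drop constant sensor channels, which carry no discriminative information. A sketch with a toy matrix rather than the SECOM data:

```python
def variance_filter(rows, threshold=1e-8):
    """Return indices of features whose variance exceeds `threshold`;
    near-constant sensor channels are discarded."""
    n = len(rows)
    keep = []
    for j in range(len(rows[0])):
        col = [r[j] for r in rows]
        mean = sum(col) / n
        var = sum((v - mean) ** 2 for v in col) / n
        if var > threshold:
            keep.append(j)
    return keep

# Toy sensor matrix: feature 1 is constant and gets dropped
X = [[0.1, 5.0, 3.2],
     [0.4, 5.0, 2.9],
     [0.3, 5.0, 3.1]]
kept = variance_filter(X)
```

Real pipelines follow this with stronger criteria (correlation with the pass/fail label, model-based importances) before classifier training.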

Keywords: deep learning, feature extraction, feature selection, machine learning classification algorithms, semiconductor production monitoring, signal processing, time-series analysis

Procedia PDF Downloads 101
39844 The Reenactment of Historic Memory and the Ways to Read past Traces through Contemporary Architecture in European Urban Contexts: The Case Study of the Medieval Walls of Naples

Authors: Francesco Scarpati

Abstract:

Because of their long history, ranging from ancient times to the present day, European cities feature many historical layers, whose individual identities are represented by traces surviving in the urban design. However, urban transformations, in particular those produced by the property speculation of the 20th century, have often compromised the readability of these traces, resulting in a loss of the historical identities of the individual layers. The purpose of this research is therefore a reflection on the theme of the reenactment of historical memory in stratified European contexts and on how contemporary architecture can help to reveal the past signs of cities. The research starts from an analysis of a series of emblematic examples that have already provided original solutions to the described problem, ranging from the scale of the architectural detail to the urban and landscape scale. The results of these analyses are then applied to the case study of the city of Naples, an emblematic example of a stratified city of ancient Greek origin, where it is possible to read most of the traces of its transformations. Particular consideration is given to the trace of the medieval walls of the city, which long ago clearly divided the city from the surrounding fields and which is no longer readable today. Finally, solutions and methods of intervention are proposed to ensure that the trace of the walls, read as a boundary, can be revealed through the contemporary project.

Keywords: contemporary project, historic memory, historic urban contexts, medieval walls, Naples, stratified cities, urban traces

Procedia PDF Downloads 258
39843 Unified Power Quality Conditioner Presentation and Dimensioning

Authors: Abderrahmane Kechich, Othmane Abdelkhalek

Abstract:

Static converters behave as nonlinear loads that inject harmonic currents into the grid and increase the consumption of reactive power. On the other hand, the increased use of sensitive equipment requires sinusoidal supply voltages. As a result, electrical power quality control has become a major concern in the field of power electronics. In this context, the unified power quality conditioner (UPQC) was developed. It combines series and parallel structures: the series filter can protect sensitive loads and compensate for voltage disturbances such as voltage harmonics, voltage dips or flicker, while the shunt filter compensates for current disturbances such as current harmonics, reactive currents and imbalance. This dual capability makes it one of the most appropriate devices. Calculating its parameters is an important step and, at the same time, not an easy one; for that reason, several researchers have relied on trial and error, but that method is difficult for beginning researchers, especially for the controller parameters. This paper therefore gives a mathematical way to calculate almost all of the UPQC parameters without resorting to trial and error. It also gives a new approach for calculating the parameters of the PI regulators, with the purpose of obtaining a stable UPQC able to compensate for disturbances acting on the waveforms of the line voltage and load current, in order to improve the electrical power quality.

Keywords: UPQC, shunt active filter, series active filter, PI controller, PWM control, dual-loop control

Procedia PDF Downloads 394
39842 A Decadal Flood Assessment Using Time-Series Satellite Data in Cambodia

Authors: Nguyen-Thanh Son

Abstract:

Floods are among the most frequent and costliest natural hazards. Flood disasters especially affect poor people in rural areas, who are heavily dependent on agriculture and have lower incomes. Cambodia is identified as one of the most climate-vulnerable countries in the world, ranked 13th out of 181 countries most affected by the impacts of climate change. Flood monitoring is thus a strategic priority at national and regional levels, because policymakers need reliable spatial and temporal information on flood-prone areas to form successful monitoring programs that reduce possible impacts on the country's economy and people's livelihoods. This study aims to develop methods for flood mapping and assessment from MODIS data in Cambodia. We processed the data for the period from 2000 to 2017, following three main steps: (1) data pre-processing to construct smooth time series of vegetation and water surface indices, (2) delineation of flood-prone areas, and (3) accuracy assessment. The results of the flood mapping were verified against ground reference data, indicating an overall accuracy of 88.7% and a Kappa coefficient of 0.77. These results were reaffirmed by close agreement between the mapped flood area and the ground reference data, with a coefficient of determination (R²) of 0.94. The seasonally flooded areas observed for 2010, 2015, and 2016 were remarkably smaller than in other years, mainly attributed to the El Niño weather phenomenon exacerbated by the impacts of climate change. Although several sources potentially lowered the mapping accuracy of flood-prone areas, including image cloud contamination, mixed-pixel issues, and the low-resolution bias between the mapping results and the ground reference data, our methods gave satisfactory results for delineating the spatiotemporal evolution of floods. The results, in the form of quantitative information on spatiotemporal flood distributions, could be beneficial to policymakers in evaluating their management strategies for mitigating the negative effects of floods on agriculture and people's livelihoods in the country.
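
The accuracy-assessment statistics quoted (overall accuracy and a Kappa coefficient) are both computed from a confusion matrix. A minimal sketch with an invented flood/non-flood matrix, not the paper's counts:

```python
def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = mapped class, columns = reference class)."""
    total = sum(sum(row) for row in confusion)
    diag = sum(confusion[i][i] for i in range(len(confusion)))
    p_o = diag / total                                   # observed agreement
    p_e = sum(sum(confusion[i]) * sum(row[i] for row in confusion)
              for i in range(len(confusion))) / total ** 2   # chance agreement
    return p_o, (p_o - p_e) / (1 - p_e)

# Hypothetical flood / non-flood counts
cm = [[80, 10],   # mapped flood: 80 correct, 10 false alarms
      [12, 98]]   # mapped dry:   12 misses,  98 correct
oa, kappa = accuracy_and_kappa(cm)
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside overall accuracy.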

Keywords: MODIS, flood, mapping, Cambodia

Procedia PDF Downloads 116
39841 Application of Local Mean Decomposition for Rolling Bearing Fault Diagnosis Based On Vibration Signals

Authors: Toufik Bensana, Slimane Mekhilef, Kamel Tadjine

Abstract:

Vibration analysis has been frequently applied in the condition monitoring and fault diagnosis of rolling element bearings. Unfortunately, the vibration signals collected from a faulty bearing are generally nonstationary and nonlinear, with strong noise interference, so it is essential to extract the fault features correctly. In this paper, a novel numerical analysis method based on local mean decomposition (LMD) is proposed. LMD decomposes the signal into a series of product functions (PFs), each of which is the product of an envelope signal and a purely frequency modulated (FM) signal. The envelope of a PF is the instantaneous amplitude (IA), and the derivative of the unwrapped phase of the FM signal is the instantaneous frequency (IF). The fault characteristic frequency of the roller bearing can then be extracted by performing spectrum analysis on the instantaneous amplitude of the PF component containing the dominant fault information. The results show the effectiveness of the proposed technique in fault detection and diagnosis of rolling element bearings.
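
The final step, spectrum analysis of the instantaneous amplitude, can be sketched on a synthetic amplitude-modulated signal. A full LMD sift is beyond a few lines, so the known analytic envelope stands in for the PF envelope here, and all frequencies are invented:

```python
import math

def dft_mag(x):
    """Naive DFT magnitude spectrum (fine for short illustrative signals)."""
    n = len(x)
    return [abs(sum(x[t] * complex(math.cos(2 * math.pi * k * t / n),
                                   -math.sin(2 * math.pi * k * t / n))
                    for t in range(n))) / n
            for k in range(n // 2)]

# Synthetic bearing-like signal: a 64 Hz carrier amplitude-modulated at an
# 8 Hz "fault frequency"; LMD would estimate env from sig, here it is known.
fs, n = 256, 256
fault, carrier = 8.0, 64.0
env = [1.0 + 0.8 * math.cos(2 * math.pi * fault * t / fs) for t in range(n)]
sig = [e * math.cos(2 * math.pi * carrier * t / fs)
       for t, e in zip(range(n), env)]
spec = dft_mag([e - sum(env) / n for e in env])  # remove DC, then DFT
peak_bin = max(range(len(spec)), key=spec.__getitem__)
peak_hz = peak_bin * fs / n
```

The spectral peak of the envelope sits at the modulation (fault) frequency, not at the carrier, which is exactly why the envelope spectrum exposes bearing defects.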

Keywords: fault diagnosis, condition monitoring, local mean decomposition, rolling element bearing, vibration analysis

Procedia PDF Downloads 384
39840 Generating Real-Time Visual Summaries from Located Sensor-Based Data with Chorems

Authors: Z. Bouattou, R. Laurini, H. Belbachir

Abstract:

This paper describes a new approach for the automatic generation of visual summaries, combining cartographic visualization methods with real-time modeling of sensor data. The concept of chorems seems an interesting candidate for visualizing real-time summaries of geographic databases. Chorems were defined by Roger Brunet (1980) as schematized visual representations of territories. However, time information is not yet handled in existing chorematic map approaches, an issue discussed in this paper. Our approach is based on spatial analysis: by interpolating the values recorded at the same time by the available sensors, we obtain a number of distributed observations over the study areas, and we use spatial interpolation methods to find the concentration fields. From these fields, by applying spatial data mining procedures on the fly, it is possible to extract important patterns as geographic rules. Those patterns are then visualized as chorems.
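
The spatial-interpolation step admits many choices; inverse distance weighting is one simple option, sketched here with hypothetical sensor readings (the paper does not specify which interpolator it uses):

```python
def idw(sample_points, query, power=2.0):
    """Inverse-distance-weighted estimate at `query` from (x, y, value)
    sensor observations."""
    num = den = 0.0
    for x, y, v in sample_points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v                      # query sits exactly on a sensor
        w = d2 ** (-power / 2.0)          # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den

sensors = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 30.0)]
mid = idw(sensors, (0.5, 0.0))   # between the 10 and 20 sensors
```

Evaluating this on a grid yields the concentration field from which patterns can then be mined.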

Keywords: geovisualization, spatial analytics, real-time, geographic data streams, sensors, chorems

Procedia PDF Downloads 393
39839 Advanced Numerical and Analytical Methods for Assessing Concrete Sewers and Their Remaining Service Life

Authors: Amir Alani, Mojtaba Mahmoodian, Anna Romanova, Asaad Faramarzi

Abstract:

Pipelines are extensively used engineering structures that convey fluid from one place to another. Most of the time, pipelines are placed underground and are loaded by soil weight and traffic loads. Corrosion of the pipe material is the most common form of pipeline deterioration and should be considered in both the strength and serviceability analysis of pipes. This research focuses on concrete pipes in sewage systems (concrete sewers). It first investigates how to incorporate the effect of corrosion, as a time-dependent deterioration process, into the structural and failure analysis of this type of pipe. Three probabilistic time-dependent reliability analysis methods, namely the first passage probability theory, the gamma distributed degradation model and the Monte Carlo simulation technique, are then discussed and developed. Sensitivity indexes that can be used to identify the parameters that most affect pipe failure are also discussed. The reliability analysis methods developed in this paper serve as rational tools for decision makers with regard to the strengthening and rehabilitation of existing pipelines, and the results can be used to obtain a cost-effective strategy for the management of the sewer system.
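
A bare-bones Monte Carlo version of the time-dependent failure probability can be sketched as follows; the corrosion model and every number are illustrative assumptions, not the paper's limit state:

```python
import random

def prob_failure(t_years, n_sim=20000, seed=1):
    """Crude Monte Carlo estimate of P(failure by time t) for a pipe whose
    wall loses thickness at a random corrosion rate (illustrative numbers)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sim):
        wall = rng.gauss(50.0, 3.0)       # initial wall thickness, mm
        rate = rng.gauss(0.40, 0.10)      # corrosion rate, mm/year
        critical = 30.0                   # failure once wall < 30 mm
        if wall - max(rate, 0.0) * t_years < critical:
            failures += 1
    return failures / n_sim

pf_20 = prob_failure(20)   # early in service life: failure unlikely
pf_60 = prob_failure(60)   # late in service life: failure likely
```

Sweeping t and reading off where the failure probability crosses an acceptable threshold gives a remaining-service-life estimate; the first passage and gamma-process methods answer the same question analytically.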

Keywords: reliability analysis, service life prediction, Monte Carlo simulation method, first passage probability theory, gamma distributed degradation model

Procedia PDF Downloads 448
39838 Analysis of Vibratory Signals Based on Local Mean Decomposition (LMD) for Rolling Bearing Fault Diagnosis

Authors: Toufik Bensana, Medkour Mihoub, Slimane Mekhilef

Abstract:

The use of vibration analysis has been established as the most common and reliable method of analysis in the field of condition monitoring and diagnostics of rotating machinery. Rolling bearings are found in a broad range of rotary machines and play a crucial role in the modern manufacturing industry. Unfortunately, the vibration signals collected from a faulty bearing are generally nonstationary and nonlinear, with strong noise interference, so it is essential to extract the fault features correctly. In this paper, a novel numerical analysis method based on local mean decomposition (LMD) is proposed. LMD decomposes the signal into a series of product functions (PFs), each of which is the product of an envelope signal and a purely frequency modulated (FM) signal. The envelope of a PF is the instantaneous amplitude (IA), and the derivative of the unwrapped phase of the FM signal is the instantaneous frequency (IF). The fault characteristic frequency of the roller bearing can then be extracted by performing spectrum analysis on the instantaneous amplitude of the PF component containing the dominant fault information. The results show the effectiveness of the proposed technique in fault detection and diagnosis of rolling element bearings.

Keywords: fault diagnosis, rolling element bearing, local mean decomposition, condition monitoring

Procedia PDF Downloads 380
39837 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway and presents the design of a framework for a data privacy model and for real-time data analytics using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained for the data privacy model show that combining two or more privacy models yields stronger protection of the data, and that a fog storage gateway has several advantages over traditional cloud storage: the results show that fog storage reduces latency, bandwidth consumption and energy usage compared with cloud storage, and can therefore lessen excessive cost. The paper dwells on the system descriptions; the researchers focused on the research design and the framework design for the data privacy model, data storage, and real-time analytics. The paper also presents the major system components and their framework specifications. Lastly, the overall system architecture, its structure and its interrelationships are shown.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 89
39836 Mathematical Modeling and Analysis of Forced Vibrations in Micro-Scale Microstretch Thermoelastic Simply Supported Beam

Authors: Geeta Partap, Nitika Chugh

Abstract:

The present paper deals with the flexural vibrations of homogeneous, isotropic, generalized micropolar microstretch thermoelastic thin Euler-Bernoulli beam resonators due to an exponential time-varying load. Both axial ends of the beam are assumed to be simply supported. The governing equations have been solved analytically by applying the Laplace transform twice, with respect to the time and space variables, respectively. The inversion of the Laplace transform in the time domain has been performed by using the calculus of residues to obtain the deflection. The analytical results have been numerically analyzed with the help of MATLAB software for a magnesium-like material. Graphical representations and interpretations are discussed for the deflection of the beam under the simply supported boundary condition and for distinct values of time and space. The obtained results are easy to implement in engineering analysis and in the design of resonators (sensors), modulators and actuators.

Keywords: microstretch, deflection, exponential load, Laplace transforms, residue theorem, simply supported

Procedia PDF Downloads 301
39835 Spatial-Temporal Clustering Characteristics of Dengue in the Northern Region of Sri Lanka, 2010-2013

Authors: Sumiko Anno, Keiji Imaoka, Takeo Tadono, Tamotsu Igarashi, Subramaniam Sivaganesh, Selvam Kannathasan, Vaithehi Kumaran, Sinnathamby Noble Surendran

Abstract:

Dengue outbreaks are affected by biological, ecological, socio-economic and demographic factors that vary over time and space. These factors have been examined separately and still require systematic clarification. The present study aimed to investigate the spatial-temporal clustering relationships between these factors and dengue outbreaks in the northern region of Sri Lanka. Remote sensing (RS) data gathered from multiple satellites were used to develop an index comprising rainfall, humidity and temperature data. RS data gathered by ALOS/AVNIR-2 were used to detect urbanization, and a digital land cover map was used to extract land cover information. Other data on relevant factors and dengue outbreaks were collected through institutions and existing databases. The analyzed RS data and databases were integrated into geographic information systems, enabling temporal analysis, spatial statistical analysis and space-time clustering analysis. Our results showed that combinations of ecological, socio-economic and demographic factors at above-average levels, or simply present, contribute to significantly high rates of space-time dengue clusters.

Keywords: ALOS/AVNIR-2, dengue, space-time clustering analysis, Sri Lanka

Procedia PDF Downloads 470
39834 Performance Analysis of the Time-Based and Periodogram-Based Energy Detector for Spectrum Sensing

Authors: Sadaf Nawaz, Adnan Ahmed Khan, Asad Mahmood, Chaudhary Farrukh Javed

Abstract:

Classically, an energy detector is implemented in the time domain (TD). However, the frequency domain (FD) based energy detector has demonstrated improved performance. This paper presents a comparison between the two approaches to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and probability of detection (Pd) are derived for both approaches. The derived expressions naturally lead to analytical as well as intuitive reasoning for the improved Pf and Pd in different scenarios. Our analysis suggests that the improvement depends on the buffer size: Pf is improved in FD, whereas Pd is enhanced in TD energy detectors. Finally, Monte Carlo simulation results corroborate the analysis based on the derived expressions.
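A minimal Monte Carlo sketch of the time-domain energy detector, with the threshold set empirically from noise-only statistics for a target Pf; the buffer size, SNR and target Pf are illustrative assumptions:

```python
import numpy as np

def energy_detector_mc(n_samples=64, snr_db=0.0, target_pf=0.1,
                       trials=5000, seed=0):
    """Monte Carlo sketch of a TD energy detector: estimate Pf and Pd when
    the test statistic is the summed energy over one buffer."""
    rng = np.random.default_rng(seed)
    # Noise-only (H0) statistics: T = sum |x|^2 over the buffer
    noise = rng.normal(size=(trials, n_samples))
    t_h0 = np.sum(noise ** 2, axis=1)
    threshold = np.quantile(t_h0, 1.0 - target_pf)
    # Signal-plus-noise (H1) statistics at the given SNR
    amp = 10 ** (snr_db / 20.0)
    sig = amp * np.sin(2 * np.pi * 0.1 * np.arange(n_samples))
    t_h1 = np.sum((sig + rng.normal(size=(trials, n_samples))) ** 2, axis=1)
    pf = np.mean(t_h0 > threshold)   # empirical false-alarm probability
    pd = np.mean(t_h1 > threshold)   # empirical detection probability
    return pf, pd

pf, pd = energy_detector_mc()
```

Rerunning the sketch for several buffer sizes makes the dependence of the (Pf, Pd) trade-off on the buffer length directly visible.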

Keywords: cognitive radio, energy detector, periodogram, spectrum sensing

Procedia PDF Downloads 369
39833 A Fundamental Study for Real-Time Safety Evaluation System of Landing Pier Using FBG Sensor

Authors: Heungsu Lee, Youngseok Kim, Jonghwa Yi, Chul Park

Abstract:

A landing pier is subjected to safety assessment by visual inspection and design data, but it is difficult to check damage in real time. In this study, real-time damage detection and safety evaluation methods were studied. Structural analysis of an arbitrary landing pier structure showed that the inflection points of deformation and moment occur at 10%, 50%, and 90% of the pile length. The critical value for the Fiber Bragg Grating (FBG) sensor was set according to the safety factor, and an FBG sensor application method for real-time safety evaluation was derived.

Keywords: FBG sensor, harbor structure, maintenance, safety evaluation system

Procedia PDF Downloads 204
39832 New-Born Children and Marriage Stability: An Evaluation of Divorce Risk Based on 2010-2018 China Family Panel Studies Data

Authors: Yuchao Yao

Abstract:

As two of the main characteristics of Chinese demographic trends, increasing divorce rates and decreasing fertility rates have both shaped the population structure in the recent decade. Figuring out to what extent having a child can affect a couple's divorce risk will not only draw a picture of Chinese families but also bring a new perspective for evaluating Chinese child-breeding policies. Based on China Family Panel Studies (CFPS) data 2010-2018, this paper provides a systematic evaluation of how children influence a couple's marital stability through a series of empirical models. Using survival analysis and a propensity score matching (PSM) model, this paper finds that the number and age of the children a couple has matter in consolidating the marital relationship, and that these effects vary little over time; during the last decade, the recent birth of a child can in fact decrease the probability of divorce for Chinese couples, and this decreasing effect is largely due to the birth of a second child. As this is a comprehensive attempt to study and compare not only the effects but also the causality of children on divorce risk in the last decade, the results of this research provide a useful summary of the status quo of divorce in China. Furthermore, this paper provides implications for further reforming the current marriage and child-breeding policies.
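The matching step of PSM can be sketched as follows; the propensity scores and outcomes are hypothetical toy values, and this shows only the nearest-neighbour matching and ATT computation, not the authors' full specification:

```python
def att_nearest_neighbor(treated, control):
    """One-to-one nearest-neighbour matching on (hypothetical, precomputed)
    propensity scores, then the average treatment effect on the treated (ATT).
    Each item is a (propensity_score, outcome) pair."""
    att_terms = []
    for p_t, y_t in treated:
        # Match each treated unit to the control unit with the closest score
        _, y_c = min(control, key=lambda c: abs(c[0] - p_t))
        att_terms.append(y_t - y_c)
    return sum(att_terms) / len(att_terms)

# Toy data: (propensity, outcome); e.g. outcome could be a divorce indicator
treated = [(0.8, 1.0), (0.6, 0.5), (0.7, 0.8)]
control = [(0.81, 0.2), (0.59, 0.1), (0.72, 0.3)]
att = att_nearest_neighbor(treated, control)
```

Matching on the propensity score rather than on raw covariates is what lets the comparison approximate a randomized contrast between otherwise similar couples.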

Keywords: divorce risk, fertility, China, survival analysis, propensity score matching

Procedia PDF Downloads 70
39831 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference

Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade

Abstract:

In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion for which the limit in probability is identical to that of the normalized log-likelihood. This includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance metrics hold as in the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is accounting for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. This is attempted in the IID case by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, whence the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
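The upper quantile of the minimum GIC can also be approximated by direct simulation rather than exact multivariate Gaussian integration; a sketch with an illustrative mean vector and covariance matrix (not values from the paper):

```python
import numpy as np

def min_gic_upper_quantile(mean, cov, q=0.95, draws=200000, seed=1):
    """Monte Carlo estimate of an upper quantile of the minimum of a jointly
    Gaussian vector of (asymptotic) GIC values -- the quantity the paper
    computes with multivariate Gaussian integrals via "mvtnorm"."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mean, cov, size=draws)
    return np.quantile(samples.min(axis=1), q)

# Three correlated candidate-model criteria (illustrative parameters)
mean = np.array([0.0, 0.5, 1.0])
cov = np.array([[1.0, 0.6, 0.3],
                [0.6, 1.0, 0.6],
                [0.3, 0.6, 1.0]])
u = min_gic_upper_quantile(mean, cov)
# u bounds how far up the ranked list one should look with confidence q
```

Because the minimum is stochastically smaller than any single component, this quantile is necessarily below the marginal 95% quantile of the best-ranked model's criterion.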

Keywords: model selection inference, generalized information criteria, post-model selection, asymptotic theory

Procedia PDF Downloads 76
39830 Electrical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features. Appliance features are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the times at which each selected appliance changes state. In order to fit the capabilities of existing smart meters, we work with low-sampling-rate data at a frequency of 1/60 Hz. The data are simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that simulates the behaviour of people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in contrast to the many supervised techniques used for such cases. We extract the power interval within which the selected appliance operates, along with a time vector for the values delimiting the state transitions of the appliance. Appliance signatures are then formed from the extracted power, geometrical and statistical features. Afterwards, these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real-world Reference Energy Disaggregation Dataset (REDD). For that, we compute performance metrics based on the confusion matrix, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
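The DTW distance used for comparing appliance signatures can be sketched with the classic dynamic-programming recursion; the power profiles below are toy values, not REDD or LPG data:

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two sequences -- the
    unsupervised distance used to compare appliance power signatures."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of insertion, deletion and match moves
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# Two toy power profiles with the same shape but shifted in time:
# DTW warps the time axis, so the shift costs nothing
x = [0, 0, 1, 2, 1, 0, 0]
y = [0, 1, 2, 1, 0, 0, 0]
```

This time-shift invariance is precisely why DTW suits low-sampling-rate data, where the same appliance cycle can appear stretched or displaced between observations.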

Keywords: electrical disaggregation, DTW, general appliance modeling, event detection

Procedia PDF Downloads 66
39829 Machine Learning-Enabled Classification of Climbing Using Small Data

Authors: Nicholas Milburn, Yu Liang, Dalei Wu

Abstract:

Athlete performance scoring within the climbing domain presents interesting challenges, as the sport does not have an objective way to assign skill. Assessing skill levels within any sport is valuable, as it can be used to mark progress while training, and it can help an athlete choose appropriate climbs to attempt. Machine learning-based methods are popular for complex problems like this. The dataset available was composed of dynamic force data recorded during climbing; however, it came with challenges such as data scarcity, class imbalance and temporal heterogeneity. Solutions investigated for these challenges include data augmentation, temporal normalization, conversion of the time series to the spectral domain, and cross-validation strategies. Solutions investigated for the classification problem included the lightweight classifiers KNN and SVM, as well as deep learning with a CNN. The best performing model reached 80% accuracy. In conclusion, there seems to be enough information within climbing force data to accurately categorize climbers by skill.
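The conversion of a time series to the spectral domain followed by a lightweight k-NN classifier can be sketched as follows; the feature length and the toy "slow"/"fast" classes are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

def spectral_features(series, n_fft=64):
    """Convert a time series into a fixed-length, normalized magnitude
    spectrum -- one small-data trick: phase shifts vanish and all inputs
    share one feature length."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-9)       # temporal normalization
    spec = np.abs(np.fft.rfft(x, n=n_fft))      # zero-pad/truncate to n_fft
    return spec / (np.linalg.norm(spec) + 1e-9)

def knn_predict(train, labels, query, k=1):
    """k-NN on spectral features with Euclidean distance."""
    feats = np.array([spectral_features(s) for s in train])
    q = spectral_features(query)
    idx = np.argsort(np.linalg.norm(feats - q, axis=1))[:k]
    votes = [labels[i] for i in idx]
    return max(set(votes), key=votes.count)

# Toy classes: low-frequency vs high-frequency force oscillations
t = np.arange(64)
train = [np.sin(2 * np.pi * f * t / 64 + p) for f in (2, 10) for p in (0.0, 0.5)]
labels = ["slow", "slow", "fast", "fast"]
```

Because the magnitude spectrum ignores phase, time-shifted repetitions of the same movement land near each other, which is what makes so small a training set usable at all.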

Keywords: classification, climbing, data imbalance, data scarcity, machine learning, time sequence

Procedia PDF Downloads 135
39828 Remaining Useful Life Estimation of Bearings Based on Nonlinear Dimensional Reduction Combined with Timing Signals

Authors: Zhongmin Wang, Wudong Fan, Hengshan Zhang, Yimin Zhou

Abstract:

In data-driven prognostic methods, the accuracy of remaining useful life estimation for bearings mainly depends on the performance of the health indicators, which are usually fused from statistical features extracted from vibration signals. However, existing health indicators have two drawbacks: (1) statistical features with different ranges contribute differently to the health indicator, and expert knowledge is required to extract them; (2) when convolutional neural networks are used to extract time-frequency features of the signals, the temporal ordering of the signals is not considered. To overcome these drawbacks, this study proposes a method combining a convolutional neural network with a gated recurrent unit to extract time-frequency image features. The extracted features are used to construct a health indicator and predict the remaining useful life of bearings. First, the original signals are converted into time-frequency images using the continuous wavelet transform to form the original feature sets. Second, with the convolutional and pooling layers of a convolutional neural network, the most sensitive features of the time-frequency images are selected from the original feature sets. Finally, these selected features are fed into the gated recurrent unit to construct the health indicator. The results show that the proposed method outperforms related studies that used the same bearing dataset provided by PRONOSTIA.
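A minimal forward pass of a gated recurrent unit over a feature sequence shows how temporal order enters the state; the weights here are random and illustrative, not a trained model:

```python
import numpy as np

def gru_sequence(x_seq, Wz, Uz, Wr, Ur, Wh, Uh):
    """Minimal GRU forward pass over a sequence of feature vectors (e.g.
    per-frame time-frequency features); the final hidden state summarizes
    the whole history in order."""
    def sigmoid(v):
        return 1.0 / (1.0 + np.exp(-v))
    h = np.zeros(Uz.shape[0])
    for x in x_seq:
        z = sigmoid(Wz @ x + Uz @ h)              # update gate
        r = sigmoid(Wr @ x + Ur @ h)              # reset gate
        h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
        h = (1 - z) * h + z * h_tilde             # blend old state & candidate
    return h

rng = np.random.default_rng(0)
d_in, d_hid, seq_len = 4, 3, 10
weights = [rng.normal(scale=0.5, size=s) for s in
           [(d_hid, d_in), (d_hid, d_hid)] * 3]   # Wz, Uz, Wr, Ur, Wh, Uh
h_final = gru_sequence(rng.normal(size=(seq_len, d_in)), *weights)
```

Unlike pooling CNN features over time, the recurrence makes the output depend on the order of the frames, which is the property the paper exploits for degradation trends.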

Keywords: continuous wavelet transform, convolutional neural network, gated recurrent unit, health indicators, remaining useful life

Procedia PDF Downloads 123