Search results for: continuous wavelet analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28669

28519 Real Interest Rates and Real Returns of Agricultural Commodities in the Context of Quantitative Easing

Authors: Wei Yao, Constantinos Alexiou

Abstract:

In the existing literature, many studies have focused on the implementation and effectiveness of quantitative easing (QE) since 2008, but only a few have evaluated QE's effect on commodity prices. In this context, following Frankel's (1986) commodity price overshooting model, we study the dynamic covariation between expected real interest rates and the real returns of six agricultural commodities over the period 2000:1 to 2018 for the US economy. We use wavelet analysis to investigate the causal relationship and co-movement of the time series by calculating the coefficient of determination at different frequencies. We find that a) US unconventional monetary policy may induce more positive and significant covariation between expected real interest rates and agricultural commodities' real returns over short horizons; b) a lead-lag relationship runs from agricultural commodities' real returns to expected real short-term interest rates over long horizons; and c) a lead-lag relationship runs from agricultural commodities' real returns to expected real long-term interest rates over short horizons. In the realm of monetary policy, we argue that QE may shift the negative relationship between most commodities' real returns and expected real interest rates to a positive one over a short horizon.
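As a rough illustration of the coherence tool this abstract relies on, the following numpy sketch computes squared wavelet coherence between two series. It is an assumption-laden simplification: a Morlet mother wavelet, a plain moving-average smoother, and synthetic series standing in for the interest-rate and commodity-return data.

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """FFT-based continuous wavelet transform with a Morlet mother wavelet."""
    n = len(x)
    omega = 2 * np.pi * np.fft.fftfreq(n)  # angular frequency, rad/sample
    X = np.fft.fft(x)
    W = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * omega - w0) ** 2)
        psi_hat *= (omega > 0)  # analytic wavelet: keep positive frequencies
        W[i] = np.fft.ifft(X * psi_hat * np.sqrt(s))
    return W

def wavelet_coherence(x, y, scales, smooth=9):
    """Squared wavelet coherence, smoothed in time by a moving average."""
    Wx, Wy = morlet_cwt(x, scales), morlet_cwt(y, scales)
    k = np.ones(smooth) / smooth
    sm = lambda a: np.array([np.convolve(r, k, mode="same") for r in a])
    Sxy = sm(Wx * np.conj(Wy))
    Sxx, Syy = sm(np.abs(Wx) ** 2), sm(np.abs(Wy) ** 2)
    return np.abs(Sxy) ** 2 / (Sxx * Syy + 1e-12)

# Two noisy series sharing a common 8-sample cycle: coherence should be
# high at the matching scale (s = w0 * period / (2*pi), about 7.6 samples).
rng = np.random.default_rng(0)
t = np.arange(512)
x = np.sin(2 * np.pi * t / 8) + 0.3 * rng.standard_normal(512)
y = np.sin(2 * np.pi * t / 8 + 0.5) + 0.3 * rng.standard_normal(512)
coh = wavelet_coherence(x, y, scales=[4.0, 7.6, 16.0, 32.0])
```

High values of `coh` at a given scale and time indicate co-movement there; lead-lag relationships of the kind the abstract reports are read from the phase of the cross-spectrum `Sxy`, which this sketch only computes internally.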

Keywords: QE, commodity price, interest rate, wavelet coherence

Procedia PDF Downloads 65
28518 On a Continuous Formulation of Block Method for Solving First Order Ordinary Differential Equations (ODEs)

Authors: A. M. Sagir

Abstract:

The aim of this paper is to investigate the performance of a linear multistep block method developed for solving first order initial value problems of Ordinary Differential Equations (ODEs). The method calculates the numerical solution at three points simultaneously, producing three new equally spaced solution values within each block. The continuous formulation enables us to differentiate and evaluate at selected points to obtain three discrete schemes, which are used in block form for parallel or sequential solution of the problems. The stability and efficiency of the block method are tested on ordinary differential equations arising in practical applications, and the results obtained compare favorably with the exact solutions. Furthermore, an error analysis was carried out with the aid of computer software.
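The three-point block idea can be sketched as follows. This is a generic collocation-based reconstruction of the approach, not the authors' exact schemes: the continuous interpolant of f is integrated up to each of the three block points, giving three simultaneous discrete formulas that are solved together by fixed-point iteration.

```python
import numpy as np

def block_weights(k=3):
    """w[j-1, i] = integral of the Lagrange basis L_i over [0, j] on the
    nodes 0..k -- a continuous formulation evaluated at the block points."""
    nodes = np.arange(k + 1)
    w = np.zeros((k, k + 1))
    for i in range(k + 1):
        others = np.delete(nodes, i)
        c = np.poly(others) / np.prod(nodes[i] - others)  # L_i coefficients
        C = np.polyint(c)
        for j in range(1, k + 1):
            w[j - 1, i] = np.polyval(C, j) - np.polyval(C, 0)
    return w

def block_step(f, x0, y0, h, k=3, iters=10):
    """Advance one block: y at x0+h, ..., x0+k*h computed simultaneously by
    fixed-point iteration on the implicit block equations."""
    w = block_weights(k)
    xs = x0 + h * np.arange(k + 1)
    ys = np.full(k + 1, float(y0))
    for _ in range(iters):
        fs = np.array([f(x, y) for x, y in zip(xs, ys)])
        ys[1:] = y0 + h * (w @ fs)
    return xs[1:], ys[1:]

# Illustrative test problem: y' = -y, y(0) = 1, exact solution exp(-x).
xs, ys = block_step(lambda x, y: -y, 0.0, 1.0, h=0.1)
```

The last row of `block_weights()` reproduces the classical Simpson 3/8 weights, which is one way to check that the derived schemes are consistent. The method is self-starting in the sense the abstract describes: each block needs only the value at its left endpoint.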

Keywords: block method, first order ordinary differential equations, linear multistep, self-starting

Procedia PDF Downloads 288
28517 Reversible and Adaptive Watermarking for MRI Medical Images

Authors: Nisar Ahmed Memon

Abstract:

A new medical image watermarking scheme delivering high embedding capacity is presented in this paper. It combines the Integer Wavelet Transform (IWT), a companding technique, and adaptive thresholding. The proposed scheme embeds and recovers the hidden information and restores the input image to its pristine state at the receiving end. Magnetic Resonance Imaging (MRI) images are used for experimental purposes. The scheme first segments the MRI image into non-overlapping blocks and then inserts the watermark into the high-frequency wavelet coefficients of each block. It uses block-based watermarking with iterative optimization of the companding threshold in order to avoid histogram pre- and post-processing. Results show that the proposed scheme performs better than other reversible medical image watermarking schemes reported in the literature for MRI images.
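The reversibility at the heart of such schemes comes from integer-to-integer transforms. The sketch below is a deliberately minimal stand-in, not the paper's method: a lifting-based integer Haar step plus difference expansion on a single pixel pair, with no companding, adaptive thresholding, or overflow handling. It shows why the original data can be restored bit-exactly.

```python
def haar_iwt(a, b):
    """Forward integer Haar via lifting: maps ints to ints, exactly invertible."""
    d = b - a            # detail (high-frequency) coefficient
    s = a + (d >> 1)     # integer approximation (floor average)
    return s, d

def haar_iwt_inv(s, d):
    a = s - (d >> 1)
    return a, a + d

def embed_bit(a, b, bit):
    """Difference expansion: hide one bit in the expanded detail coefficient."""
    s, d = haar_iwt(a, b)
    return haar_iwt_inv(s, 2 * d + bit)

def extract_bit(a, b):
    """Recover the hidden bit and restore the original pair exactly."""
    s, d = haar_iwt(a, b)
    return d & 1, haar_iwt_inv(s, d >> 1)
```

A real scheme must additionally reject or compress pairs whose expanded values overflow the pixel range; the paper's companding and iteratively optimized threshold address exactly this capacity/distortion trade-off.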

Keywords: adaptive thresholding, companding technique, data authentication, reversible watermarking

Procedia PDF Downloads 275
28516 Subsurface Structures Related to the Hydrocarbon Migration and Accumulation in the Afghan Tajik Basin, Northern Afghanistan: Insights from Seismic Attribute Analysis

Authors: Samim Khair Mohammad, Takeshi Tsuji, Chanmaly Chhun

Abstract:

The Afghan Tajik (foreland) basin, located in the depression zone between mountain axes, has been under compression and deformation since the collision of the Indian and Eurasian plates. The southern part of the basin, in northern Afghanistan, has not been well studied or explored but is considered to have significant potential for oil and gas resources. The basin's depositional environments (< 8 km) resulted from mixed terrestrial and marine systems, giving potential prospects in Jurassic (deep) and Tertiary (shallow) petroleum systems. We used 2D regional seismic profiles with a total length of 674.8 km (over an area of 2500 km²) in the southern part of the basin. To characterize hydrocarbon systems and structures in the study area, we applied advanced seismic attributes such as spectral decomposition (10 - 60 Hz) based on time-frequency analysis with the continuous wavelet transform. The spectral decomposition results yield a spectral amplitude anomaly (averaged over the 20 - 30 Hz group). Based on this anomaly, together with seismic and structural interpretation, potential hydrocarbon accumulations were inferred around the main thrust folds in the Tertiary (Paleogene+Neogene) petroleum systems, which appear to be concentrated around the central study area. Furthermore, hydrocarbons seem to have migrated dominantly along the main thrusts and then accumulated around anticline fold systems that could be sealed by mudstone/carbonate rocks.
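For intuition, the kind of CWT-based spectral decomposition described here can be sketched in a few lines of numpy. This is an illustrative toy on a synthetic trace, not the software used in the study; the Morlet wavelet, sampling rate, and frequency band are assumptions.

```python
import numpy as np

def cwt_spectral_decomposition(trace, fs, freqs, w0=6.0):
    """Time-frequency amplitude of a trace via Morlet CWT.
    Scale for centre frequency f: s = w0 * fs / (2*pi*f)."""
    n = len(trace)
    f_hz = np.fft.fftfreq(n, d=1.0 / fs)
    X = np.fft.fft(trace)
    amp = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        s = w0 * fs / (2 * np.pi * f)
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * 2 * np.pi * f_hz / fs - w0) ** 2)
        psi_hat *= (f_hz > 0)  # analytic wavelet
        amp[i] = np.abs(np.fft.ifft(X * psi_hat * np.sqrt(s)))
    return amp

# Synthetic trace: a 20 Hz arrival at t = 0.5 s should light up the 20 Hz row.
fs = 500.0
t = np.arange(0, 1, 1 / fs)
trace = np.exp(-((t - 0.5) ** 2) / 0.002) * np.sin(2 * np.pi * 20 * (t - 0.5))
amp = cwt_spectral_decomposition(trace, fs, freqs=np.arange(10, 61, 10))
```

Averaging a group of rows of `amp` (e.g. the 20 - 30 Hz rows) gives the kind of spectral amplitude map the abstract uses to flag anomalies along the seismic lines.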

Keywords: The Afghan Tajik basin, seismic lines, spectral decomposition, thrust folds, hydrocarbon reservoirs

Procedia PDF Downloads 78
28515 Bivariate Generalization of q-α-Bernstein Polynomials

Authors: Tarul Garg, P. N. Agrawal

Abstract:

We propose to define the q-analogue of the α-Bernstein Kantorovich operators and then introduce the q-bivariate generalization of these operators to study the approximation of functions of two variables. We obtain the rate of convergence of these bivariate operators by means of the total modulus of continuity, the partial modulus of continuity and Peetre's K-functional for continuous functions. Further, in order to study the approximation of functions of two variables in a space bigger than the space of continuous functions, i.e. the Bögel space, the GBS (Generalized Boolean Sum) of the q-bivariate operators is considered, and the degree of approximation is discussed for Bögel continuous and Bögel differentiable functions with the aid of the Lipschitz class and the mixed modulus of smoothness.

Keywords: Bögel continuous, Bögel differentiable, generalized Boolean sum, K-functional, mixed modulus of smoothness

Procedia PDF Downloads 364
28514 Use of Six-sigma Concept in Discrete Manufacturing Industry

Authors: Ignatio Madanhire, Charles Mbohwa

Abstract:

Efficiency in manufacturing is critical to raising the value of exports and trading gainfully on regional and international markets. Continuous improvement strategies appear to be increasingly popular among manufacturing entities, but this research study established that the Six Sigma methodology has not enjoyed similar popularity. This work was therefore conducted to investigate the applicability, effectiveness, and suitability of the Six Sigma methodology as a competitiveness option for discrete manufacturing entities. Developing a Six Sigma center in the country to disseminate continuous improvement information would go a long way toward benefiting the entire industry.

Keywords: discrete manufacturing, six-sigma, continuous improvement, efficiency, competitiveness

Procedia PDF Downloads 436
28513 Acclimatization of Bacterial Communities for Biohydrogen Production by Co-Digestion Process in Batch and Continuous Systems

Authors: Gómez Romero Jacob, García Peña Elvia Inés

Abstract:

The co-digestion of crude cheese whey (CCW) with fruit and vegetable waste (FVW) for biohydrogen production was investigated in batch and continuous systems, in stirred 1.8 L bioreactors at 37°C. Five C/N ratios (7, 17, 21, 31, and 46) were tested in the batch systems, while in the continuous system eight conditions were evaluated by varying the hydraulic retention time (from 60 to 10 h) and the organic loading rate (from 21.96 to 155.87 g COD/L d). Data from the batch tests showed a maximum specific biohydrogen production rate of 10.68 mmol H2/L h and a biohydrogen yield of 449.84 mL H2/g COD at a C/N ratio of 21. In the continuous co-digestion system, the optimum hydraulic retention time and organic loading rate were 17.5 h and 80.02 g COD/L d, respectively. Under these conditions, the highest volumetric hydrogen production rate (VPHR) and hydrogen yield were 11.02 mmol H2/L h and 800 mL H2/g COD, respectively. A pyrosequencing analysis showed that the acclimated microbial community for the co-digestion studies was dominated by Bifidobacterium, with 85.4% predominance. Hydrogen-producing bacteria such as Klebsiella (9.1%), Lactobacillus (0.97%), Citrobacter (0.21%), Enterobacter (0.27%), and Clostridium (0.18%) were less abundant in this culture period. The microbial population structure was correlated with the lactate, acetate, and butyrate profiles obtained. Results demonstrated that the co-digestion of CCW with FVW improves biohydrogen production due to a better nutrient balance and an improvement of the system's buffering capacity.

Keywords: acclimatization, biohydrogen, co-digestion, microbial community

Procedia PDF Downloads 535
28512 Scattering Operator and Spectral Clustering for Ultrasound Images: Application on Deep Venous Thrombi

Authors: Thibaud Berthomier, Ali Mansour, Luc Bressollette, Frédéric Le Roy, Dominique Mottier, Léo Fréchier, Barthélémy Hermenault

Abstract:

Deep Venous Thrombosis (DVT) occurs when a thrombus forms within a deep vein (most often in the legs). The disease can be deadly if part or all of the thrombus reaches the lung and causes a Pulmonary Embolism (PE). This disorder, often asymptomatic, has multifactorial causes: immobilization, surgery, pregnancy, age, cancers, and genetic variations. Our project aims to relate thrombus epidemiology (origins, patient predispositions, PE) to thrombus structure using ultrasound images. Ultrasonography and elastography images were collected using a Toshiba Aplio 500 at Brest Hospital. This manuscript compares two classification approaches: spectral clustering and the scattering operator. The former is based on graph and matrix theory, while the latter cascades wavelet convolutions with nonlinear modulus and averaging operators.

Keywords: deep venous thrombosis, ultrasonography, elastography, scattering operator, wavelet, spectral clustering

Procedia PDF Downloads 458
28511 Robust and Transparent Spread Spectrum Audio Watermarking

Authors: Ali Akbar Attari, Ali Asghar Beheshti Shirazi

Abstract:

In this paper, we propose a blind and robust audio watermarking scheme based on spread spectrum in the Discrete Wavelet Transform (DWT) domain. Watermarks are embedded in the low-frequency coefficients, where they are less audible. The key idea is to divide the audio signal into small frames and modify the magnitude of the 6th-level DWT approximation coefficients using the Direct Sequence Spread Spectrum (DSSS) technique. A psychoacoustic model is used to enhance imperceptibility, and a Savitzky-Golay filter to increase extraction accuracy. The experimental results illustrate high robustness against the most common attacks, i.e. Gaussian noise addition, low-pass filtering, resampling, requantization, and MP3 compression, without significant perceptual distortion (ODG higher than -1). The proposed scheme has a data payload of about 83 bps.
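The embedding idea can be illustrated with a bare-bones numpy sketch. It is an assumption-heavy toy, not the paper's scheme: Haar filters instead of the paper's wavelet, a fixed embedding strength in place of the psychoacoustic model, one bit per frame, and no Savitzky-Golay post-processing.

```python
import numpy as np

def haar_dwt(x):
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_idwt(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def embed(frame, bit, pn, alpha=0.05, levels=6):
    """DSSS-embed one bit into the level-6 DWT approximation of a frame."""
    a, details = frame, []
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    a = a + alpha * (2 * bit - 1) * pn  # spread the bit over all coefficients
    for d in reversed(details):
        a = haar_idwt(a, d)
    return a

def detect(frame, pn, levels=6):
    """Blind detection: sign of the correlation with the PN sequence."""
    a = frame
    for _ in range(levels):
        a, _ = haar_dwt(a)
    return int(a @ pn > 0)

rng = np.random.default_rng(0)
host = 0.1 * rng.standard_normal(8192)            # stand-in for an audio frame
pn = rng.choice([-1.0, 1.0], size=8192 // 2 ** 6) # pseudo-noise spreading code
```

Detection is blind because the correlator needs only the shared PN sequence, not the original audio; the spreading gain of the 128-coefficient code is what lets a small `alpha` survive correlation against the host signal.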

Keywords: audio watermarking, spread spectrum, discrete wavelet transform, psychoacoustic, Savitzky-Golay filter

Procedia PDF Downloads 179
28510 The Core Obstacles of Continuous Improvement Implementation: Some Key Findings from Health and Education Sectors

Authors: Abdullah Alhaqbani

Abstract:

Purpose: Implementing continuous improvement is a challenge that public sector organisations must overcome to become successful. Many obstacles hinder public organisations from successfully implementing continuous improvement. This paper aims to highlight the core obstacles that public organisations face in implementing continuous improvement programmes. Approach: This paper reviews 66 papers published between 2000 and 2013 that focused on the concept of continuous improvement and improvement methodologies in the context of public sector organisations. The methodologies covered in these papers include Total Quality Management, Six Sigma, process re-engineering, lean thinking and Kaizen. Findings: Of the 24 obstacles found in the literature, 11 were seen as core barriers that frequently occur in public sector organisations. The findings indicate that lack of top management commitment, organisational culture, political issues and resistance to change are significant obstacles for improvement programmes. Moreover, the review found that improvement methodologies share some core barriers to successful implementation within public organisations, and that these barriers are common across different geographic areas. For instance, the lack of top management commitment and training found in the education sector in Albania is a common barrier in improvement studies in Kuwait, Saudi Arabia, Spain, the UK and the US. Practical implications: Understanding these core issues and barriers will help managers of public organisations to improve their strategies with respect to continuous improvement. Thus, this review highlights the core issues that prevent a successful continuous improvement journey within the public sector.
Value: Identifying and understanding the common obstacles to successfully implementing continuous improvement in the public sector will help public organisations to learn how to improve in launching and successfully sustaining such programmes. However, this is not the end; rather, it is just the beginning of a longer improvement journey. Thus, it is intended that this review will identify key learning opportunities for public sector organisations in developing nations which will then be tested via further research.

Keywords: continuous improvement, total quality management, obstacles, public sector

Procedia PDF Downloads 317
28509 Parametric Study on the Behavior of Reinforced Concrete Continuous Beams Flexurally Strengthened with FRP Plates

Authors: Mohammed A. Sakr, Tarek M. Khalifa, Walid N. Mansour

Abstract:

External bonding of fiber reinforced polymer (FRP) plates to reinforced concrete (RC) beams is an effective technique for flexural strengthening. This paper presents an analytical parametric study, conducted using a simple uniaxial nonlinear finite element model (UNFEM), on the behavior of RC continuous beams flexurally strengthened with FRP plates externally bonded to the upper and lower fibers. UNFEM is able to estimate the load-carrying capacity, the different failure modes, and the interfacial stresses of such beams. The study investigated the effect of five key parameters on the behavior and moment redistribution of FRP-strengthened continuous beams: the length of the FRP plate, the width and thickness of the FRP plate, the ratio of the FRP plate area to the concrete area, the cohesive shear strength of the adhesive layer, and the concrete compressive strength. The investigation resulted in a number of important conclusions reflecting the effects of the studied parameters on the behavior of RC continuous beams flexurally strengthened with externally bonded FRP plates.

Keywords: continuous beams, parametric study, finite element, fiber reinforced polymer

Procedia PDF Downloads 353
28508 Continuous Improvement Model for Creative Industries Development

Authors: Rolandas Strazdas, Jurate Cerneviciute

Abstract:

Creative industries are defined as those industries which produce tangible or intangible artistic and creative output and have a potential for income generation by exploiting cultural assets and producing knowledge-based goods and services (both traditional and contemporary). With the emergence of an entire sector of creative industries triggered by the development of creative products, managing creativity-based business processes becomes a critical issue. Diverse managerial practices and models for the effective management of creativity have been examined in the scholarly literature. Even though these studies suggest how creativity in organisations can be nourished, they do not sufficiently relate the proposed practices to the underlying business processes. The article analyses a range of business process improvement methods such as PDCA, DMAIC, DMADV and TOC, and identifies the strengths and weaknesses of these methods for improving the innovation development process. Based on the analysis of the existing improvement methods, a continuous improvement model was developed and is presented in the article.

Keywords: continuous improvement, creative industries, improvement model, process mapping

Procedia PDF Downloads 443
28507 ACBM: Attention-Based CNN and Bi-LSTM Model for Continuous Identity Authentication

Authors: Rui Mao, Heming Ji, Xiaoyu Wang

Abstract:

Keystroke dynamics are widely used in identity recognition, with the advantage that an individual's typing rhythm is difficult to imitate. They also support continuous authentication through the keyboard without extra devices. Existing keystroke dynamics authentication methods based on machine learning struggle to support relatively complex scenarios with massive data, with drawbacks in both feature extraction and model optimization. To overcome these weaknesses, an authentication model for keystroke dynamics based on deep learning is proposed. The model uses feature vectors formed from keystroke content and keystroke timing, and ensures efficient continuous authentication by coupling an attention mechanism with a combination of CNN and Bi-LSTM networks. The model has been tested on the Open Data Buffalo dataset, achieving an FRR of 3.09%, an FAR of 3.03%, and an EER of 4.23%. This shows that the model is efficient and accurate for continuous authentication.

Keywords: keystroke dynamics, identity authentication, deep learning, CNN, LSTM

Procedia PDF Downloads 131
28506 Continuous Improvement Programme as a Strategy for Technological Innovation in Developing Nations: Nigeria as a Case Study

Authors: Sefiu Adebowale Adewumi

Abstract:

A continuous improvement programme (CIP) adopts an approach that improves organizational performance in small incremental steps over time. In this approach, it is not the size of each step that is important, but the likelihood that the improvements will be ongoing. Many companies in developing nations are now complementing continuous improvement with innovation, the successful exploitation of new ideas. The focus areas of CIP varied both with the size of the organizations and with their generic classification. Product quality was prevalent in the manufacturing industry, while manpower training and retraining and marketing strategy were emphasized for improvement in the service, transport and supply industries. However, a focus on innovation in raw materials, processes and methods is needed, because these are the critical factors that influence product quality in the manufacturing industries.

Keywords: continuous improvement programme, developing countries, generic classifications, technological innovation

Procedia PDF Downloads 160
28505 Physical Properties of Alkali Resistant-Glass Fibers in Continuous Fiber Spinning Conditions

Authors: Ji-Sun Lee, Soong-Keun Hyun, Jin-Ho Kim

Abstract:

In this study, a glass fiber is fabricated using a continuous spinning process from alkali-resistant (AR) glass with 4 wt% zirconia. In order to confirm the melting properties of the marble glass, the raw material is placed into a Pt crucible, melted at 1650 ℃ for 2 h, and then annealed. The transparency of the clear marble glass is confirmed by measuring the visible transmittance, and the fiber spinning conditions are investigated using high-temperature viscosity measurements. A change in fiber diameter is observed as a function of winding speed in the range of 100–900 rpm and of fiberizing temperature in the range of 1200–1260 ℃. The optimum winding speed and spinning temperature are 500 rpm and 1240 ℃, respectively. The properties of the spun fiber are confirmed using optical microscopy and tensile strength, modulus, and alkali-resistance tests.

Keywords: glass composition, fiber diameter, continuous filament fiber, continuous spinning, physical properties

Procedia PDF Downloads 296
28504 [Keynote Talk]: Morphological Analysis of Continuous Graphene Oxide Fibers Incorporated with Carbon Nanotube and MnCl₂

Authors: Nuray Ucar, Pelin Altay, Ilkay Ozsev Yuksek

Abstract:

Graphene oxide fibers have recently received increasing attention due to their excellent properties, such as high specific surface area, high mechanical strength, good thermal properties and high electrical conductivity. They have shown notable potential in various applications, including batteries, sensors, filtration and separation, and wearable electronics. Carbon nanotubes (CNTs) have unique structural, mechanical, and electrical properties and can be used together with graphene oxide fibers in several application areas such as lithium-ion batteries and wearable electronics. Metal salts that can be converted into metal ions and metal oxides can also be used in several application areas such as batteries, natural gas purification, filtration, and absorption. This study investigates the effects of CNT and a metal salt (MnCl₂) on the morphological structure of graphene oxide fibers. The graphene oxide dispersion was manufactured by the modified Hummers method, and continuous graphene oxide fibers were produced by wet spinning. The CNT and MnCl₂ were incorporated into the coagulation baths during the wet spinning process. The produced composite continuous fibers were analyzed with SEM, SEM-EDS and AFM microscopies, and as-spun fiber counts were measured.

Keywords: continuous graphene oxide fiber, Hummers' method, CNT, MnCl₂

Procedia PDF Downloads 150
28503 Concept Analysis of Professionalism in Teachers and Faculty Members

Authors: Taiebe Shokri, Shahram Yazdani, Leila Afshar, Soleiman Ahmadi

Abstract:

Introduction: The importance of professionalism in higher education lies not only in determining appropriate and inappropriate behaviors and guiding faculty members in the implementation of professional responsibilities; it also guarantees faculty members' adherence to professional principles and values, ensures the quality of teaching, facilitates the teaching-learning process in universities, increases the commitment to meet the needs of students, and supports the development of an ethical culture. Therefore, considering the important role of medical education teachers in preparing future teachers and students, and the need to determine the concept of the professional teacher and the characteristics of teacher professionalism, we explain the concept of professionalism in teachers in this study. Methods: The concept analysis method used in this study was the Walker and Avant method, which has eight steps. Walker and Avant state the purpose of concept analysis as follows: the process of distinguishing between the defining features of a concept and its unrelated features. The process of concept analysis includes selecting a concept, determining the purpose of the analysis, identifying the uses of the concept, determining the defining features of the concept, identifying a model case, identifying borderline and contrary cases, identifying the antecedents and consequences of the concept, and defining empirical referents. Results: Professionalism, in its general sense, requires deep knowledge and insight, creating a healthy and safe environment, honesty and trust, impartiality, commitment to the profession and continuous improvement, punctuality, openness to criticism, professional competence, responsibility, and individual accountability, especially in social interactions. It is an effort toward continuous improvement; acquiring these characteristics is not easy and requires education, especially continuous learning.
Professionalism is a set of values, behaviors, and relationships that underpin public trust in teachers.

Keywords: concept analysis, medical education, professionalism, faculty members

Procedia PDF Downloads 133
28502 A Study of Closed Sets and Maps with Ideals

Authors: Asha Gupta, Ramandeep Kaur

Abstract:

The purpose of this paper is to study a class of closed sets, called generalized pre-closed sets with respect to an ideal (briefly, Igp-closed sets), which is an extension of generalized pre-closed sets in general topology. Using these sets, the concepts of Igp-compact spaces, along with some classes of maps like continuous and closed maps via ideals, are introduced, and analogues of some known results for compact spaces, continuous maps and closed maps in general topology are obtained.

Keywords: ideal, gp-closed sets, gp-closed maps, gp-continuous maps

Procedia PDF Downloads 201
28501 CT Medical Images Denoising Based on New Wavelet Thresholding Compared with Curvelet and Contourlet

Authors: Amir Moslemi, Amir Movafeghi, Shahab Moradi

Abstract:

One of the most important challenging factors in medical images is noise. Image denoising refers to the recovery of a digital medical image that has been contaminated by noise, such as Additive White Gaussian Noise (AWGN). A digital medical image or video can be affected by different types of noise: impulse noise, Poisson noise and AWGN. Computed tomography (CT) images suffer from low quality due to noise. The quality of CT images depends directly on the dose absorbed by the patient, in the sense that increasing the absorbed radiation, and consequently the absorbed dose to the patient (ADP), enhances CT image quality. Hence, noise reduction techniques that enhance image quality without exposing the patient to excess radiation are one of the challenging problems in CT image processing. In this work, noise reduction in CT images was performed using two directional two-dimensional (2D) transforms, the curvelet and the contourlet, and the Discrete Wavelet Transform (DWT) thresholding methods BayesShrink and AdaptShrink, compared with each other. We also propose a new threshold in the wavelet domain for both noise reduction and edge retention; the proposed method retains the significant modified coefficients, resulting in good visual quality. Evaluations were carried out using two criteria: peak signal-to-noise ratio (PSNR) and structural similarity (SSIM).
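For reference, the baseline the paper improves on can be sketched compactly. The snippet below is a 1-D simplification with Haar filters (the paper works on 2-D CT images with other transforms), showing the standard BayesShrink rule: soft-threshold each detail subband with T = sigma² / sigma_x, where sigma is the noise level and sigma_x the estimated signal deviation of the subband.

```python
import numpy as np

def haar_dwt(x):
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_idwt(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(x, t):
    """Soft thresholding: shrink toward zero, kill coefficients below t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def bayes_shrink(x, levels=3):
    a, details = x.astype(float), []
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    sigma = np.median(np.abs(details[0])) / 0.6745  # noise level, finest band
    for i, d in enumerate(details):
        sx = np.sqrt(max(np.mean(d ** 2) - sigma ** 2, 0.0))
        t = sigma ** 2 / sx if sx > 0 else np.max(np.abs(d))
        details[i] = soft(d, t)
    for d in reversed(details):
        a = haar_idwt(a, d)
    return a

# Noisy smooth signal: thresholding should reduce the mean-squared error.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 4 * t)
noisy = clean + 0.3 * rng.standard_normal(1024)
denoised = bayes_shrink(noisy)
```

The paper's contribution is a modified threshold that preserves edge-carrying coefficients better than this rule; the sketch only fixes the baseline against which such a threshold is compared.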

Keywords: computed tomography (CT), noise reduction, curvelet, contourlet, peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absorbed dose to patient (ADP)

Procedia PDF Downloads 421
28500 Vibration Analysis of Functionally Graded Engesser-Timoshenko Beams Subjected to Axial Load Located on a Continuous Elastic Foundation

Authors: M. Karami Khorramabadi, A. R. Nezamabadi

Abstract:

This paper studies the free vibration of functionally graded beams subjected to an axial load, simply supported at both ends and resting on a continuous elastic foundation. The displacement field of the beam is assumed based on Engesser-Timoshenko beam theory. The Young's modulus of the beam is assumed to be graded continuously across the beam thickness. Applying Hamilton's principle, the governing equation is established and solved using Euler's equation. The effects of the constituent volume fractions and the foundation coefficient on the vibration frequency are presented. To investigate the accuracy of the present analysis, a comparison study is carried out against known data.

Keywords: functionally graded beam, free vibration, elastic foundation, Engesser-Timoshenko beam theory

Procedia PDF Downloads 391
28499 The Effects of Continuous and Interval Aerobic Exercises with Moderate Intensity on Serum Levels of Glial Cell Line-Derived Neurotrophic Factor and Aerobic Capacity in Obese Children

Authors: Ali Golestani, Vahid Naseri, Hossein Taheri

Abstract:

Recently, some studies have examined the effect of exercise on neurotrophic factors influencing the growth, protection, plasticity and function of central and peripheral nerve cells. The aim of this study was to investigate the effects of continuous and interval aerobic exercise of moderate intensity on serum levels of glial cell line-derived neurotrophic factor (GDNF) and on aerobic capacity in obese children. Twenty-one obese students with an average age of 13.6 ± 0.5 years, height of 171 ± 5 cm and BMI of 32 ± 1.2 were randomly divided into control, continuous aerobic and interval aerobic groups. The training protocol consisted of continuous or interval aerobic exercise at moderate intensity (50-65% MHR), three times per week for 10 weeks. Blood samples were taken from the participants 48 hours before and after execution of the protocol, and GDNF serum levels were measured by ELISA. Aerobic power was estimated using the shuttle-run test. T-test results indicated a small, statistically non-significant increase in GDNF serum levels (p = 0.11). In addition, ANOVA results did not show any significant difference between continuous and interval aerobic training in serum GDNF levels, but aerobic capacity significantly increased (p = 0.012). Although continuous and interval aerobic exercise improved aerobic power in obese children, they had no significant effect on serum levels of GDNF.

Keywords: aerobic power, continuous aerobic training, glial cell line-derived neurotrophic factor (GDNF), interval aerobic training, obese children

Procedia PDF Downloads 158
28498 Enhancement of Primary User Detection in Cognitive Radio by Scattering Transform

Authors: A. Moawad, K. C. Yao, A. Mansour, R. Gautier

Abstract:

Detecting an occupied frequency band is a major issue in cognitive radio systems. The detection process becomes difficult if the signal occupying the band of interest has faded amplitude due to multipath effects, which make an occupying user hard to detect. This work mitigates the missed-detection problem in the context of cognitive radio in a frequency-selective fading channel by proposing a blind channel estimation method based on the scattering transform. Conventional energy detection is applied first and the missed-detection probability is evaluated; if it is greater than or equal to 50%, channel estimation is applied to the received signal, followed by channel equalization to reduce the channel effects. In the proposed channel estimator, we modify the Morlet wavelet by using its first derivative for better frequency resolution. A mathematical description of the modified function and its frequency resolution is formulated in this work. The improved frequency resolution is required to follow the spectral variation of the channel. The channel estimation error is evaluated in the mean-square sense for different channel settings, and energy detection is applied to the equalized received signal. The simulation results show an improvement in reducing the missed-detection probability compared to detection based on principal component analysis. This improvement is achieved at the expense of increased estimator complexity, which depends on the number of wavelet filters as related to the channel taps. The detection performance also shows an improvement in detection probability in low signal-to-noise scenarios over principal-component-analysis-based energy detection.

Keywords: channel estimation, cognitive radio, scattering transform, spectrum sensing

Procedia PDF Downloads 178
28497 An Adaptive Decomposition for the Variability Analysis of Observation Time Series in Geophysics

Authors: Olivier Delage, Thierry Portafaix, Hassan Bencherif, Guillaume Guimbretiere

Abstract:

Most observation data sequences in geophysics can be interpreted as resulting from the interaction of several physical processes at several time and space scales. As a consequence, measurement time series in geophysics often have characteristics of non-linearity and non-stationarity, exhibit strong fluctuations at all time scales, and require a time-frequency representation for the analysis of their variability. Empirical Mode Decomposition (EMD) is a relatively recent technique that is part of a more general signal processing method called the Hilbert-Huang transform. This analysis method turns out to be particularly suitable for non-linear and non-stationary signals; it decomposes a signal in a self-adaptive way into a sum of oscillating components named IMFs (Intrinsic Mode Functions) and thereby acts as a bank of bandpass filters. The advantages of EMD are that it is entirely data-driven and that it provides the principal variability modes of the dynamics represented by the original time series. However, its main limiting factor is the frequency resolution, which may give rise to the mode-mixing phenomenon, where the spectral contents of some IMFs overlap each other. To overcome this problem, J. Gilles proposed an alternative, the Empirical Wavelet Transform (EWT), which builds a bank of filters from a segmentation of the original signal's Fourier spectrum. The method is based on the idea used in the construction of both the Littlewood-Paley and Meyer wavelets. The heart of the method lies in the segmentation of the Fourier spectrum based on local maxima detection, in order to obtain a set of non-overlapping segments. Because it is linked to the Fourier spectrum, the frequency resolution provided by EWT is higher than that provided by EMD and therefore makes it possible to overcome the mode-mixing problem.
On the other hand, while the EWT technique is able to detect the frequencies involved in the fluctuations of the original time series, it cannot associate the detected frequencies with a specific mode of variability as EMD does. Because EMD is closer to the observation of physical phenomena than EWT, we propose here a new technique called EAWD (Empirical Adaptive Wavelet Decomposition), based on the coupling of EMD and EWT, which uses the spectral density content of the IMFs to optimize the segmentation of the Fourier spectrum required by EWT. In this study, the EMD and EWT techniques are described, and then the EAWD technique is presented. A comparison of the results obtained with EMD, EWT and EAWD on time series of ozone total columns recorded at Reunion Island over the 1978-2019 period is discussed. This study was carried out as part of the SOLSTYCE project, dedicated to the characterization and modeling of the underlying dynamics of time series issued from complex systems in atmospheric sciences.
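The segmentation step at the heart of EWT can be sketched with NumPy; the rule below (keep the strongest local maxima of the Fourier magnitude spectrum and place boundaries midway between consecutive maxima) is a simplified stand-in for Gilles' boundary-detection rule:

```python
import numpy as np

def segment_spectrum(x, n_segments):
    """Sketch of the EWT boundary-detection step: locate the n_segments
    strongest local maxima of the Fourier magnitude spectrum and place
    segment boundaries midway between consecutive maxima."""
    mag = np.abs(np.fft.rfft(x))
    # strict local maxima of the magnitude spectrum (interior bins only)
    peaks = np.flatnonzero((mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:])) + 1
    # keep the n_segments strongest peaks, sorted by frequency
    top = np.sort(peaks[np.argsort(mag[peaks])[-n_segments:]])
    bounds = (top[:-1] + top[1:]) // 2  # midpoints between adjacent peaks
    return np.concatenate(([0], bounds, [len(mag) - 1]))

# synthetic signal with two well-separated oscillating components
t = np.linspace(0.0, 1.0, 1024, endpoint=False)
x = np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
print(segment_spectrum(x, 2))  # boundaries isolating the 30 Hz and 120 Hz peaks
```

Each resulting segment would then be assigned a bandpass (empirical wavelet) filter; in EAWD, the IMF spectra would guide where these boundaries are placed instead.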

Keywords: adaptive filtering, empirical mode decomposition, empirical wavelet transform, filter banks, mode-mixing, non-linear and non-stationary time series, wavelet

Procedia PDF Downloads 116
28496 Noninvasive Continuous Glucose Monitoring Device Using a Photon-Assisted Tunneling Photodetector Based on a Quantum Metal-Oxide-Semiconductor

Authors: Wannakorn Sangthongngam, Melissa Huerta, Jaewoo Kim, Doyeon Kim

Abstract:

Continuous glucose monitoring systems are essential for diabetics to avoid health complications, but they are costly, especially when insurance does not fully cover the required testing kits. This paper proposes a noninvasive continuous glucose monitoring system to provide an accessible, low-cost, and painless alternative method of obtaining accurate glucose measurements and thereby help improve quality of life. A light source with a wavelength of 850 nm illuminates the fingertip, and a photodetector detects the transmitted light. Using SeeDevice's photon-assisted tunneling photodetector (PAT-PD)-based QMOS™ sensor, voltage fluctuations based on photon absorption in blood cells are comparable to traditional glucose measurements. The performance of the proposed method was validated by comparing four test participants' transmitted-voltage readings with measurements obtained from an Accu-Chek glucometer. The proposed method successfully measured glucose concentrations from linear regression calculations.
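The final calibration step can be sketched as a linear regression mapping photodetector voltage to a glucose estimate; the voltage/glucose pairs below are invented for illustration and are not the participants' data:

```python
import numpy as np

# Hypothetical calibration pairs: photodetector voltage (V) against
# reference glucose readings (mg/dL) from a standard glucometer.
voltage = np.array([0.82, 0.88, 0.95, 1.03, 1.10, 1.18])
glucose = np.array([78.0, 90.0, 104.0, 120.0, 134.0, 150.0])

# Fit the linear calibration model: glucose = a * voltage + b
a, b = np.polyfit(voltage, glucose, 1)

def estimate_glucose(v):
    """Map a new voltage reading to a glucose estimate via the fitted line."""
    return a * v + b

print(round(estimate_glucose(1.0), 1))  # estimate for a 1.00 V reading
```

In practice the calibration would be refit per sensor (and possibly per user), since the voltage-to-glucose slope depends on the optical path through the fingertip.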

Keywords: continuous glucose monitoring, non-invasive continuous glucose monitoring, NIR, photon-assisted tunneling photodetector, QMOS™, wearable device

Procedia PDF Downloads 68
28495 Assessing the Pre-Service and In-Service Teachers’ Continuation of Use of Technology After Participation in Professional Development

Authors: Ayoub Kafyulilo, Petra Fisser, Joke Voogt

Abstract:

This study was conducted to assess the continued use of technology in science and mathematics teaching by the pre-service and in-service teachers who attended a professional development programme. It also assessed the professional development, personal, institutional, and technological factors contributing to the continuous use of technology in teaching. The study involved 42 teachers: thirteen pre-service and twenty-nine in-service. A mixed-methods research approach was used to collect data. Findings showed that the continuous use of technology in teaching after the termination of the professional development arrangement was high among the pre-service teachers and differed among the in-service teachers. The regression model showed that knowledge and skills, access to technology, and ease of use were strong predictors (R2 = 55.3%) of the teachers' continuous use of technology after the professional development arrangement. The professional development factor did not have a direct effect on the continuous use of technology; rather, it influenced personal factors (knowledge and skills). In turn, the personal factors influenced the institutional factors (access to technology) and technological factors (ease of use), which together had an effect on the teachers' continuous use of technology in teaching.

Keywords: technology, professional development, teachers, science and mathematics

Procedia PDF Downloads 138
28494 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform

Authors: Omaima N. Ahmad AL-Allaf

Abstract:

Over communication networks, images can easily be copied and distributed illegally, so copyright protection for authors and owners is necessary. Digital watermarking techniques therefore play an important role as a valid solution to such ownership problems. Digital image watermarking techniques hide watermarks in images to achieve copyright protection and prevent illegal copying; the watermarks need to be robust to attacks while maintaining image quality. We therefore discuss in this paper two approaches to image watermarking: the first is based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The discrete wavelet transform (DWT) is used with each approach separately to transform the cover image for the embedding process. Both PSO and GA use the correlation coefficient to detect the high-energy coefficients of the original image in which to hide the watermark bits. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. In the experiments, the PSO approach obtained better results, with a PSNR of 53 and an MSE of 0.0039, whereas the GA approach obtained a PSNR of 50.5 and an MSE of 0.0048 when using a population size of 100, 150 iterations, and 3×3 blocks. From these results, we note that a small block size can affect the quality of PSO/GA-based image watermarking because it increases the search area within the watermarked image. The best PSO results were obtained with a swarm size of 100.
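As a rough illustration of the optimization engine, the following is a minimal global-best PSO loop in NumPy; the sphere function stands in for the correlation-based fitness used in the embedding-position search, whose exact form the abstract does not specify:

```python
import numpy as np

rng = np.random.default_rng(1)

def pso(f, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer (global-best topology):
    each particle is pulled toward its personal best and the swarm best."""
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Sphere function as a stand-in fitness; the paper's objective would score
# candidate embedding positions by correlation coefficient instead.
best, best_val = pso(lambda x: float(np.sum(x ** 2)), dim=4)
print(best_val)  # converges toward 0
```

A GA would explore the same search space with selection, crossover, and mutation instead of velocity updates, which is the comparison the paper makes.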

Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform

Procedia PDF Downloads 203
28493 Comparative Study for Biodiesel Production Using a Batch and a Semi-Continuous Flow Reactor

Authors: S. S. L. Andrade, E. A. Souza, L. C. L. Santos, C. Moraes, A. K. C. L. Lobato

Abstract:

Biodiesel may be produced through the transesterification (alcoholysis) reaction, that is, the transformation of long-chain fatty acids into alkyl esters. This reaction can occur in the presence of acid, alkaline, or enzymatic catalysts. Currently, industrial processes produce biodiesel by the alkaline route, and the alkalis most commonly used are the hydroxides and methoxides of sodium and potassium. In this work, biodiesel production was conducted in two different systems. The first consisted of a batch reactor operating with a traditional washing system, and the second of a semi-continuous flow reactor operating with a membrane separation system. Potassium hydroxide was used as the catalyst at a concentration of 1% by weight, the oil/alcohol molar ratio was 1/9, and the temperature was 55 °C. Tests were performed using soybean and palm oil, and the ester conversion results were compared for both systems. The results for both oils are similar whether the batch reactor or the semi-continuous flow reactor is used. The semi-continuous flow reactor, however, allows the removal of the products as they form. Thus, for a reversible reaction, removing the reaction products raises the relative concentration of the reagents and shifts the equilibrium towards the formation of more products. The highest ester conversion with soybean and palm oil in the batch reactor was approximately 98%, whereas a conversion of 99% was observed under the same operating conditions in the semi-continuous flow reactor.
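The equilibrium-shift argument can be illustrated with a toy mass-action model of a reversible reaction A ⇌ B; the rate constants and removal rate below are invented for illustration and are not fitted to transesterification kinetics:

```python
def simulate(kf=1.0, kr=0.5, removal=0.0, dt=0.01, steps=2000):
    """Euler integration of A <-> B mass-action kinetics; `removal` is a
    first-order withdrawal of product B, mimicking membrane separation."""
    a, b = 1.0, 0.0
    for _ in range(steps):
        r = kf * a - kr * b          # net forward reaction rate
        a -= r * dt
        b += r * dt
        b -= removal * b * dt        # continuous product withdrawal
    return 1.0 - a                   # fractional conversion of A

batch = simulate(removal=0.0)        # closed batch reactor: stops at equilibrium
continuous = simulate(removal=0.3)   # product removal pulls the reaction forward
print(round(batch, 3), round(continuous, 3))
```

The closed reactor settles at the equilibrium conversion kf/(kf + kr), while the reactor with product withdrawal keeps converting past it, which is the effect the membrane separation system exploits.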

Keywords: biodiesel, batch reactor, semi-continuous flow reactor, transesterification

Procedia PDF Downloads 355
28492 Continuous Land Cover Change Detection in Subtropical Thicket Ecosystems

Authors: Craig Mahlasi

Abstract:

The Subtropical Thicket Biome has been in peril of transformation: estimates indicate that as much as 63% of it is severely degraded, with agricultural expansion as the main driver. While several studies have sought to document and map the long-term transformations, there is a lack of information on disturbance events that would allow timely intervention by authorities. Furthermore, tools for continuous land cover change detection are often developed for forests and thus tend to perform poorly in thicket ecosystems. This study investigates the utility of Earth observation data for continuous land cover change detection in Subtropical Thicket ecosystems. Temporal neural networks are applied to a time series of Sentinel-2 observations. The model obtained an accuracy of 0.93, a recall of 0.93, and a precision of 0.91 in detecting thicket disturbances. The study demonstrates the potential of continuous land cover change detection in Subtropical Thicket ecosystems.
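For reference, the reported scores follow the standard confusion-matrix definitions; the counts below are hypothetical and chosen only to make the arithmetic concrete:

```python
# Hypothetical confusion-matrix counts for the binary "disturbance" class.
# Only the metric definitions, not the counts, correspond to the study.
tp, fp, fn, tn = 93, 9, 7, 891   # true/false positives and negatives

accuracy = (tp + tn) / (tp + fp + fn + tn)
precision = tp / (tp + fp)       # fraction of flagged disturbances that are real
recall = tp / (tp + fn)          # fraction of real disturbances that are flagged

print(round(accuracy, 3), round(precision, 3), round(recall, 3))
```

In an imbalanced setting like rare disturbance events, precision and recall are more informative than accuracy, since a model that flags nothing can still score a high accuracy.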

Keywords: remote sensing, land cover change detection, subtropical thickets, near-real time

Procedia PDF Downloads 138
28491 Efficient Feature Fusion for Noise Iris in Unconstrained Environment

Authors: Yao-Hong Tsai

Abstract:

This paper presents an efficient fusion algorithm for iris images that generates stable features for recognition in unconstrained environments. Recently, iris recognition systems have focused on real scenarios in daily life that proceed without the subject's cooperation. Under large variations in the environment, the objective of this paper is to combine information from multiple images of the same iris. The result of image fusion is a new image that is more stable for subsequent iris recognition than each original noisy iris image. A wavelet-based approach to multi-resolution image fusion is applied in the fusion process. Iris detection is based on the AdaBoost algorithm, and a local binary pattern (LBP) histogram is then applied for texture classification with a weighting scheme. Experiments showed that the features generated by the proposed fusion algorithm can improve the performance of a verification system based on iris recognition.
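A minimal sketch of the basic 8-neighbour LBP code and its histogram (without the paper's weighting scheme, which the abstract does not detail) is:

```python
import numpy as np

def lbp_histogram(img):
    """Basic 8-neighbour local binary pattern: threshold each 3x3
    neighbourhood at its centre pixel, pack the comparisons into an
    8-bit code, and histogram the codes as a texture feature."""
    h, w = img.shape
    centre = img[1:-1, 1:-1]
    codes = np.zeros((h - 2, w - 2), dtype=int)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes += (neighbour > centre).astype(int) << bit
    return np.bincount(codes.ravel(), minlength=256)

# A flat patch maps every interior pixel to code 0 (no neighbour exceeds
# its centre), so the histogram concentrates in a single bin.
flat = np.full((8, 8), 7, dtype=np.uint8)
hist = lbp_histogram(flat)
print(hist[0], hist.sum())   # 36 interior pixels, all code 0
```

Textured iris regions spread mass across many bins, and comparing these histograms (here, with per-region weights) is what drives the verification decision.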

Keywords: image fusion, iris recognition, local binary pattern, wavelet

Procedia PDF Downloads 350
28490 External Strengthening of RC Continuous Beams Using FRP Plates: Finite Element Model

Authors: Mohammed A. Sakr, Tarek M. Khalifa, Walid N. Mansour

Abstract:

Fiber reinforced polymer (FRP) installation is a very effective way to repair and strengthen structures that have become structurally weak over their life span, and the technique has attracted the attention of researchers over the last two decades. This paper presents a simple uniaxial nonlinear finite element model (UNFEM) able to accurately estimate the load-carrying capacity, the different failure modes, and the interfacial stresses of reinforced concrete (RC) continuous beams flexurally strengthened with FRP plates externally bonded to the upper and lower fibers. Results of the proposed finite element (FE) model are verified by comparing them with experimental measurements available in the literature; the agreement between numerical and experimental results is very good. Accounting for the fracture energy of the adhesive is necessary to obtain a realistic load-carrying capacity of continuous RC beams strengthened with FRP. This simple UNFEM can help design engineers model their strengthened structures and solve their problems.

Keywords: continuous beams, debonding, finite element, fibre reinforced polymer

Procedia PDF Downloads 460