Search results for: time series fractal analysis


39469 Determination of Effect Factor for Effective Parameter on Saccharification of Lignocellulosic Material by Concentrated Acid

Authors: Sina Aghili, Ali Arasteh Nodeh

Abstract:

The use of tamarisk as a new lignocellulosic material for producing fermentable sugars in the bio-ethanol process was studied. The overall aim of this work was to establish the optimum conditions for acid hydrolysis of this new material and a mathematical model predicting glucose release as a function of the operating variables. Sulfuric acid concentrations in the range of 20 to 60% (w/w), process temperatures between 60 and 95°C, hydrolysis times from 120 to 240 min and solid contents of 5, 10 and 15% (w/w) were used as hydrolysis conditions. HPLC was used to analyse the product. The analysis indicated that glucose was the main fermentable sugar; its yield increased with time, temperature and solid content, while acid concentration had a parabolic influence on glucose production. The process was modelled by a quadratic equation. Examination of the fitted curves showed that 42% acid concentration, 15% solid content and 90°C were the optimum conditions.
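
The modelling step above (a quadratic, second-order response-surface equation fitted to the hydrolysis runs) can be sketched as follows. This is a minimal Python illustration assuming an ordinary least-squares fit; the run data are synthetic placeholders spanning the reported factor ranges, not the authors' measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic hydrolysis runs spanning the ranges given in the abstract:
# acid 20-60 % (w/w), temperature 60-95 degC, time 120-240 min, solids 5-15 % (w/w)
X = np.column_stack([rng.uniform(20, 60, 40), rng.uniform(60, 95, 40),
                     rng.uniform(120, 240, 40), rng.uniform(5, 15, 40)])
# Made-up glucose response used only to exercise the fit
# (parabolic in acid concentration, increasing in the other factors)
y = (-0.03 * (X[:, 0] - 42) ** 2 + 0.4 * X[:, 1] + 0.05 * X[:, 2]
     + 1.2 * X[:, 3] + rng.normal(0, 1.0, 40))

def quadratic_design(X):
    """Design matrix with intercept, linear, squared and pairwise interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)] + [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
print(np.round(beta, 3))  # coefficients of the fitted quadratic model
```

The stationary point of such a fitted surface is what yields an optimum of the kind reported (about 42% acid, 15% solids, 90°C).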

Keywords: fermentable sugar, saccharification, wood, hydrolysis

Procedia PDF Downloads 330
39468 An Automatic Generating Unified Modelling Language Use Case Diagram and Test Cases Based on Classification Tree Method

Authors: Wassana Naiyapo, Atichat Sangtong

Abstract:

The object-oriented software development process involves many stages that take time and incur high cost. An undetected error in the system analysis stage will affect the design and implementation stages, and the unexpected output forces revision of the earlier work; each rollback adds expense and delay. A good test process from the early phases therefore yields implemented software that is efficient, reliable and meets the user's requirements. The Unified Modelling Language (UML) is a tool that uses symbols to describe the work process in Object-Oriented Analysis (OOA). This paper presents an approach for automatically generating a UML use case diagram and test cases. The UML use case diagram is generated from the event table, and test cases are generated from the use case specifications and the Graphical User Interfaces (GUI). Test cases are derived with the Classification Tree Method (CTM), which classifies data into nodes of a hierarchical structure. Moreover, this paper describes a program that generates the use case diagram and test cases. As a result, it can reduce working time and increase efficiency.

Keywords: classification tree method, test case, UML use case diagram, use case specification

Procedia PDF Downloads 157
39467 Climate Changes Impact on Artificial Wetlands

Authors: Carla Idely Palencia-Aguilar

Abstract:

Artificial wetlands play an important role in Guasca Municipality, Colombia, not only because they are used for agroindustry but also because more than 45 species were found there, some of which are endemic or migratory birds. Remote sensing was used to determine the changes in the area occupied by water in the artificial wetlands by means of ASTER and MODIS images for different time periods. Evapotranspiration was also determined by three methods: the Surface Energy Balance System (SEBS, Su) algorithm, the Surface Energy Balance (SEBAL, Bastiaanssen) algorithm, and FAO potential evapotranspiration. Empirical equations were also developed to relate the Normalized Difference Vegetation Index (NDVI) to net radiation, ambient temperature and rainfall, with an R² of 0.83. Groundwater level fluctuations were studied on a daily basis as well. Data from a piezometer placed next to the wetland were fitted to rainfall changes (from two weather stations located near the wetlands) by means of multiple regression and time series analysis; the R² between calculated and measured values was higher than 0.98. Data from nearby weather stations provided the input for ordinary kriging, together with the Digital Elevation Model (DEM) developed using PCI software. Standard models (exponential, spherical, circular, Gaussian, linear) describing spatial variation were tested. Ordinary cokriging between the height and rain variables was also tested to determine whether the accuracy of the interpolation would increase. The results showed no significant differences: ordinary kriging of the rain samples with a spherical function gave a mean of 58.06 and a standard deviation of 18.06, while cokriging, using a spherical function for rain, a power function for height and a spherical function for the cross variable (rain and height), gave a mean of 57.58 and a standard deviation of 18.36. Threats of eutrophication were also studied, given the carelessness of neighbouring residents and deficient government oversight. Water quality was determined over the years; different parameters were studied to characterize the water chemistry. In addition, 600 pesticides were screened by gas and liquid chromatography. Results showed that coliforms, nitrogen, phosphorus and prochloraz were the most significant contaminants.
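
As an illustration of the ordinary kriging step with a spherical variogram mentioned above, the sketch below uses the PyKrige package (an assumption; the study does not name a kriging library) with made-up rain-gauge coordinates and values rather than the station data of the study.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # assumes the PyKrige package is installed

# Illustrative rain-gauge data: easting, northing (m) and monthly rainfall (mm)
x = np.array([1000., 2500., 4000., 5200., 6100., 7300.])
y = np.array([800., 2300., 1500., 4100., 3600., 500.])
rain = np.array([62., 58., 71., 49., 55., 66.])

# Spherical variogram, one of the standard models compared in the abstract
ok = OrdinaryKriging(x, y, rain, variogram_model="spherical")
gridx = np.linspace(0., 8000., 40)
gridy = np.linspace(0., 5000., 25)
rain_grid, kriging_variance = ok.execute("grid", gridx, gridy)
print(rain_grid.shape, float(rain_grid.mean()))
```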

Keywords: DEM, evapotranspiration, geostatistics, NDVI

Procedia PDF Downloads 115
39466 About the Interface Bonding Safety of Adhesively Bonded Concrete Joints Under Cracking: A Fracture Energetic Approach

Authors: Brandtner-Hafner Martin

Abstract:

Adhesives are increasingly being used in the construction sector. On the one hand, this concerns dowel reinforcements using chemical anchors. On the other hand, the sealing and repair of cracks in structural concrete components are still on the rise. In the field of bonding, the interface between the joined materials is the most critical area. Therefore, it is of immense importance to characterize and investigate this section sufficiently by fracture analysis. Since standardized mechanical test methods are not sufficiently capable of doing this, recourse is made to an innovative concept based on fracture energy. Therefore, a series of experimental tests were performed using the so-called GF-principle to study the interface bonding safety of adhesively bonded concrete joints. Several different structural adhesive systems based on epoxy, CA/A hybrid, PUR, MS polymer, dispersion, and acrylate were selected for bonding concrete substrates. The results show that stable crack propagation and prevention of uncontrolled failure in bonded concrete joints depend very much on the adhesive system used, and only fracture analytical evaluation methods can provide empirical information on this.
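
The fracture-energetic evaluation referred to above rests on integrating a measured load-displacement record into a work of fracture and normalising it by the bonded area. The sketch below shows only that generic computation (it is not the specific GF-principle evaluation), and the softening curve and bond area are invented for illustration.

```python
import numpy as np

# Illustrative load-displacement record from a bonded-concrete fracture test
displacement_mm = np.linspace(0.0, 1.2, 120)                      # crack-opening displacement
load_N = 900 * displacement_mm * np.exp(-3.2 * displacement_mm)   # synthetic softening curve

work_of_fracture_J = np.trapz(load_N, displacement_mm) / 1000.0   # N*mm -> J
bonded_area_m2 = 0.05 * 0.05                                      # assumed 50 mm x 50 mm joint
G_F = work_of_fracture_J / bonded_area_m2                         # specific fracture energy, J/m^2
print(f"G_F = {G_F:.1f} J/m^2")
```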

Keywords: interface bonding safety, adhesively bonded concrete joints, GF-principle, fracture analysis

Procedia PDF Downloads 299
39465 Dynamics Analyses of Swing Structure Subject to Rotational Forces

Authors: Buntheng Chhorn, WooYoung Jung

Abstract:

Large-scale swings have been used in entertainment and performance, especially in the circus, for a very long time. To increase the safety of this type of structure, a thorough analysis of displacement and bearing stress was performed for the extreme condition in which a full-cycle swing occurs. Different masses, ranging from 40 kg to 220 kg, and different velocities were applied to the swing. Then, based on the solution of the differential equation of motion, the swing velocity response to harmonic forcing was obtained. Moreover, the resistance capacity was estimated based on the ACI steel structure design guide. Subsequently, numerical analysis was performed in ABAQUS to obtain the stress in each frame of the swing. Finally, the analysis showed that enlargement of the swing frame section was required for masses greater than 150 kg.
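
A minimal sketch of the dynamic solution step: integrating the nonlinear pendulum equation for a full revolution and recovering the peak load transmitted to the pivot. The arm length and launch conditions are assumptions made for illustration; only the 150 kg mass comes from the abstract.

```python
import numpy as np
from scipy.integrate import solve_ivp

g, L = 9.81, 5.0          # gravity (m/s^2) and an assumed swing arm length (m)
m = 150.0                 # rider-plus-seat mass (kg), the critical mass cited in the abstract

def pendulum(t, state):
    theta, omega = state                  # angle from the downward vertical, angular velocity
    return [omega, -(g / L) * np.sin(theta)]

# Launch just below the top with enough angular velocity for full revolutions
sol = solve_ivp(pendulum, (0.0, 20.0), [np.pi - 0.05, 2.5], max_step=0.01)
theta, omega = sol.y

# Force along the arm at the pivot: centripetal term plus the gravity component
bearing_force = m * (L * omega**2 + g * np.cos(theta))
print(f"peak pivot force = {bearing_force.max() / 1000:.1f} kN")
```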

Keywords: swing structure, displacement, bearing stress, dynamic loads response, finite element analysis

Procedia PDF Downloads 375
39464 The Connection Between the Semiotic Theatrical System and the Aesthetic Perception

Authors: Păcurar Diana Istina

Abstract:

The indissoluble link between aesthetics and semiotics, and the harmonization and semiotic understanding of the interactions between the viewer and the object being looked at, are the basis of this practical demonstration of the importance of aesthetic perception within the theatre performance. The design of a theatre performance includes several structures, some conceived from the beginning as art forms (i.e., the text), others represented by simple, common objects (e.g., scenographic elements), which, when brought together, can trigger a certain aesthetic perception. The team involved in the performance delivers to the audience a series of auditory and visual signs with which the audience interacts. It is necessary to explain some notions about the physiological support for the transformation of different types of stimuli at the level of the cerebral hemispheres. The cortex, considered the superior integration centre of external and internal stimuli, permanently processes the information received, but even if that information is delivered at a constant rate, the generated response is individualized and conditioned by a number of factors. Each changing situation represents a new opportunity for the viewer to cope with, developing feelings of different intensities that influence the generation of meanings and, therefore, the management of interactions. In this sense, aesthetic perception depends on the detection of the “correctness” of signs, the forms of which are associated with an aesthetic property. Correctness and aesthetic properties can have positive or negative values. Evaluating the emotions that generate judgment and, implicitly, aesthetic perception, whether we refer to visual or auditory emotions, involves the integration of three areas of interest: valence, arousal and context control. In this context, superior human cognitive processes (memory, interpretation, learning, attribution of meanings, etc.) help trigger the mechanism of anticipation and, no less important, the identification of error. This ability to locate a short circuit produced in a series of successive events is fundamental to the process of forming an aesthetic perception. Our main purpose in this research is to investigate the possible conditions under which aesthetic perception and its minimum content are generated by all these structures and, in particular, by interactions with forms that are not commonly considered aesthetic forms. In order to demonstrate the quantitative and qualitative importance of the categories of signs used to construct a code for reading a certain message, and also to emphasize the importance of the order in which these indices are used, we structured a mathematical analysis centred on the percentage of signs used in a theatre performance.

Keywords: semiology, aesthetics, theatre semiotics, theatre performance, structure, aesthetic perception

Procedia PDF Downloads 83
39463 Vulnerability Analysis for Risk Zones Boundary Definition to Support a Decision Making Process at CBRNE Operations

Authors: Aliaksei Patsekha, Michael Hohenberger, Harald Raupenstrauch

Abstract:

An effective emergency response to accidents involving chemical, biological, radiological, nuclear, or explosive (CBRNE) materials, which represent highly dynamic situations, requires immediate action with limited time, information and resources. The aim of the study is to provide the foundation for dividing the unsafe area into risk zones according to the impact of hazardous parameters (heat radiation, thermal dose, overpressure, chemical concentrations). The decision on the boundary values for the three risk zones is based on a vulnerability analysis that covered a variety of accident scenarios involving the release of a toxic or flammable substance which either evaporates, ignites and/or explodes. Critical values are selected for the boundary definition of the Red, Orange and Yellow risk zones upon examination of the harmful effects that are likely to cause injuries of varying severity to people and different levels of damage to structures. The obtained results provide the basis for creating a comprehensive real-time risk map for decision support at CBRNE operations.
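
To make the zoning idea concrete, the snippet below maps hazard intensities to the three zones once boundary values have been chosen. The threshold numbers are illustrative placeholders only, not the critical values derived in the study.

```python
def risk_zone(heat_kw_m2, overpressure_kpa, concentration_ppm,
              red=(37.5, 30.0, 1000.0), orange=(12.5, 17.0, 500.0)):
    """Assign a Red/Orange/Yellow zone from hazard intensities.

    The default threshold tuples (heat flux, overpressure, concentration) are
    placeholders, not the boundary values selected in the paper."""
    if heat_kw_m2 >= red[0] or overpressure_kpa >= red[1] or concentration_ppm >= red[2]:
        return "Red"
    if heat_kw_m2 >= orange[0] or overpressure_kpa >= orange[1] or concentration_ppm >= orange[2]:
        return "Orange"
    return "Yellow"

print(risk_zone(15.0, 5.0, 120.0))   # -> "Orange"
```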

Keywords: boundary values, CBRNE threats, decision making process, hazardous effects, vulnerability analysis, risk zones

Procedia PDF Downloads 205
39462 Reliability and Probability Weighted Moment Estimation for Three Parameter Mukherjee-Islam Failure Model

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

The Mukherjee-Islam model is commonly used as a simple lifetime distribution to assess system reliability. The model exhibits a better fit to failure data and provides more appropriate information about the hazard rate and other reliability measures, as shown by various authors. It is possible to introduce a location parameter (i.e., a time before which failure cannot occur), which makes it a more useful failure distribution than the existing ones. Even after shifting the location of the distribution, it can represent decreasing, constant and increasing failure rates. It has been shown to represent the appropriate lower tail of the distribution of random variables having a fixed lower bound. This study presents the reliability computations and probability weighted moment estimation for the three-parameter model. A comparative analysis is carried out between the three-parameter finite-range model and some existing bathtub-shaped curve-fitting models. Since the probability weighted moment method is used, the results obtained can also be applied to small samples. The maximum likelihood estimation method is also applied in this study.
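
Probability weighted moment (PWM) estimation starts from the unbiased sample PWMs b_r; a minimal sketch of that first step is given below with invented failure times. Matching these sample moments to the three-parameter Mukherjee-Islam expressions (not shown here) then yields the parameter estimates.

```python
import numpy as np
from math import comb

def sample_pwm(x, r):
    """Unbiased sample estimator of b_r = E[X F(X)^r] using the order statistics."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    weights = np.array([comb(i, r) / comb(n - 1, r) for i in range(n)])  # i = 0..n-1
    return float(np.mean(weights * x))

# Illustrative failure times (arbitrary units)
times = [1.2, 1.9, 2.4, 2.8, 3.1, 3.5, 3.9, 4.4, 5.0, 5.8]
b0, b1, b2 = (sample_pwm(times, r) for r in range(3))
print(b0, b1, b2)   # b0 is the sample mean; b1 and b2 feed the higher moment conditions
```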

Keywords: comparative analysis, maximum likelihood estimation, Mukherjee-Islam failure model, probability weighted moment estimation, reliability

Procedia PDF Downloads 267
39461 Disaster Preparedness and Management in Saudi Arabia: An Empirical Investigation

Authors: Shougi Suliman Abosuliman, Arun Kumar, Firoz Alam

Abstract:

Disaster preparedness is a key success factor for any effective disaster management practice. This paper evaluates disaster preparedness and management in Saudi Arabia using an empirical investigation approach. It presents the results of a survey conducted by interviewing Saudi decision-makers and administrators responsible for disaster control in Jeddah before, during and after the flooding in 2009 and 2010. First, the demographics of the respondents are presented, followed by a quantitative analysis of their views and experiences regarding the Kingdom's readiness before and after each flood, expressed as a series of dependent and independent variables. Following this is a list of respondents' priorities for disaster preparation in the Kingdom.

Keywords: disaster response policy, crisis management, effective service delivery, Jeddah

Procedia PDF Downloads 453
39460 Modification of Aliphatic-Aromatic Copolyesters with Polyether Block for Segmented Copolymers with Elastothemoplastic Properties

Authors: I. Irska, S. Paszkiewicz, D. Pawlikowska, E. Piesowicz, A. Linares, T. A. Ezquerra

Abstract:

Due to a number of advantages, such as high tensile strength, sensitivity to hydrolytic degradation and biocompatibility, poly(lactic acid) (PLA) is one of the most common polyesters for biomedical and pharmaceutical applications. However, PLA is a rigid, brittle polymer with a low heat distortion temperature and a slow crystallization rate. In order to broaden the range of PLA applications, it is necessary to improve these properties. In recent years a number of new strategies have been developed to obtain PLA-based materials with improved characteristics, including manipulation of crystallinity, plasticization, blending, and incorporation into block copolymers. Among these methods, the synthesis of aliphatic-aromatic copolyesters has attracted considerable attention, as such copolyesters may combine the mechanical performance of aromatic polyesters with the biodegradability known from aliphatic ones. Given the need for highly flexible biodegradable polymers, in this contribution a series of aromatic-aliphatic copolyesters based on poly(butylene terephthalate) and poly(lactic acid) (PBT-b-PLA), exhibiting superior mechanical properties, was copolymerized with an additional poly(tetramethylene oxide) (PTMO) soft block. The structure and properties of both series were characterized by means of attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR), nuclear magnetic resonance spectroscopy (¹H NMR), differential scanning calorimetry (DSC), wide-angle X-ray scattering (WAXS) and dynamic mechanical thermal analysis (DMTA). Moreover, the related changes in tensile properties have been evaluated and discussed. Lastly, the viscoelastic properties of the synthesized poly(ester-ether) copolymers were investigated in detail by step-cycle tensile tests. The block lengths decreased as the treatment advanced, and block-random terpolymers of (PBT-ran-PLA)-b-PTMO were obtained. DSC and DMTA analyses confirmed unambiguously that the synthesized poly(ester-ether) copolymers are microphase-separated systems. The introduction of polyether co-units resulted in a decrease in the degree of crystallinity and the melting temperature. X-ray diffraction patterns revealed that only the PBT blocks are able to crystallize. The mechanical properties of the (PBT-ran-PLA)-b-PTMO copolymers result from a unique arrangement of immiscible hard and soft blocks, providing both strength and elasticity.

Keywords: aliphatic-aromatic copolymers, multiblock copolymers, phase behavior, thermoplastic elastomers

Procedia PDF Downloads 134
39459 Transperineal Repair Is Ideal for the Management of Rectocele with Faecal Incontinence

Authors: Tia Morosin, Marie Shella De Robles

Abstract:

Rectocele may be associated with symptoms of both obstructed defecation and faecal incontinence. Numerous operative techniques currently exist to treat patients with rectocele; however, no single technique has emerged as the optimal approach in patients with post-partum faecal incontinence. The purpose of this study was to evaluate the clinical outcome in a consecutive series of patients who underwent transperineal repair of rectocele with faecal incontinence as the predominant symptom. Twenty-three consecutive patients with symptomatic rectocele underwent transperineal repair by a single surgeon between April 2000 and July 2015. All patients had a history of vaginal delivery, with or without evidence of associated anal sphincter injury at the time. The median age of the cohort was 53 years (range 21 to 90 years). The median operating time and length of hospital stay were 2 hours and 7 days, respectively. Two patients developed urinary retention post-operatively, which required temporary bladder catheterization. One patient had wound dehiscence, which was managed with absorbent dressings applied by the patient and her carer. There was no operative mortality. In all patients with rectocele there was a concomitant anal sphincter disruption. All patients had satisfactory improvement in faecal incontinence on follow-up. This study suggests that the method provides excellent anatomic and physiologic results with minimal morbidity. However, because none of the patients regained full continence postoperatively, pelvic floor rehabilitation might also be needed to achieve better sphincter function in patients with incontinence.

Keywords: anal sphincter defect, faecal incontinence, rectocele, transperineal repair

Procedia PDF Downloads 124
39458 Life Time Improvement of Clamp Structural by Using Fatigue Analysis

Authors: Pisut Boonkaew, Jatuporn Thongsri

Abstract:

In the hard disk drive manufacturing industry, reducing unnecessary parts and qualifying part quality before assembly is important. A clamp was therefore designed and fabricated as a holding fixture for the testing process. Improving such a part by trial-and-error testing takes a long time, so simulation was used to improve the part and reduce the time required. The problem is that the present clamp has a low life expectancy because of the critical stress that occurs in it. Hence, simulation was used to study the stress and compressive-force behaviour and improve the clamp life expectancy over all candidate designs, of which there are 27 after excluding repeated designs. The candidates were enumerated following the full factorial rules of the Six Sigma methodology (see the sketch after this abstract). Six Sigma is a well-structured method for improving quality by detecting and reducing process variability, so that defects decrease while process capability increases. This research focuses on reducing stress and fatigue while the compressive force remains within the acceptable range set by the company. In the simulation, ANSYS models the 3D CAD geometry under the same conditions as the experiment, and the force at each displacement from 0.01 to 0.1 mm is recorded. The ANSYS setup was verified by a mesh convergence study, and the percentage error against the experimental result was required not to exceed the acceptable range. The design improvement therefore focuses on the angle, radius and length that reduce stress while keeping the force within the acceptable range. Fatigue analysis is then performed in ANSYS to confirm that the lifetime is extended, and the simulation setup is also verified against the actual clamp in order to observe the difference in fatigue between the two designs. This yields a lifetime improvement of up to 57% compared with the clamp currently used in manufacturing. The study provides a setting precise and trustworthy enough to serve as a reference methodology for future designs. By combining and adapting the Six Sigma method with finite element, fatigue and linear regression analysis, leading to accurate calculations, this project is expected to save up to 60 million dollars annually.
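
The 27 candidate geometries correspond to a three-level, three-factor full factorial enumeration over the angle, radius and length, which can be sketched as below; the factor names and level values are assumptions for illustration.

```python
from itertools import product

# Three geometric factors at three levels each (illustrative values: degrees, mm, mm)
angles  = [30, 45, 60]
radii   = [0.5, 1.0, 1.5]
lengths = [10, 12, 14]

designs = list(product(angles, radii, lengths))   # 3^3 = 27 candidate clamp geometries
print(len(designs))                               # 27, matching the abstract

# Each candidate would then be meshed and solved (e.g. in ANSYS) and screened for
# peak stress and compressive force before fatigue analysis of the shortlisted designs.
```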

Keywords: clamp, finite element analysis, structural, six sigma, linear regressive analysis, fatigue analysis, probability

Procedia PDF Downloads 232
39457 Fast and Efficient Algorithms for Evaluating Uniform and Nonuniform Lagrange and Newton Curves

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Newton-Lagrange interpolations are widely used in numerical analysis; however, their construction requires quadratic computational time. In computer-aided geometric design (CAGD) there are polynomial curves, namely the Wang-Ball, DP and Dejdumrong curves, that have linear-time algorithms. Thus, the computational time for Newton-Lagrange interpolation can be reduced by applying the algorithms of the Wang-Ball, DP and Dejdumrong curves. In order to use these algorithms, it is first necessary to convert Newton-Lagrange polynomials into Wang-Ball, DP or Dejdumrong polynomials. In this work, algorithms for converting both uniform and non-uniform Newton-Lagrange polynomials into Wang-Ball, DP and Dejdumrong polynomials are investigated. The computational time for representing Newton-Lagrange polynomials can thus be reduced to linear complexity. In addition, other uses of CAGD curves to modify Newton-Lagrange curves become available.
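
For reference, the classical Newton construction whose quadratic cost motivates the conversion is sketched below: building the divided-difference table costs O(n²), while each evaluation of the nested (Horner-like) form is O(n). The conversion to the Wang-Ball, DP or Dejdumrong bases itself is not shown.

```python
import numpy as np

def newton_coefficients(x, y):
    """Divided-difference coefficients; the loop over j is the O(n^2) construction."""
    c = np.array(y, dtype=float)
    n = len(x)
    for j in range(1, n):
        c[j:n] = (c[j:n] - c[j - 1:n - 1]) / (x[j:n] - x[0:n - j])
    return c

def newton_eval(x, c, t):
    """Nested (Horner-like) evaluation of the Newton form at t, O(n) per point."""
    result = c[-1]
    for k in range(len(c) - 2, -1, -1):
        result = result * (t - x[k]) + c[k]
    return result

x = np.array([0.0, 1.0, 2.5, 4.0])        # non-uniform nodes
y = np.array([1.0, 2.0, 0.5, 3.0])
c = newton_coefficients(x, y)
print([newton_eval(x, c, t) for t in x])  # reproduces y at the interpolation nodes
```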

Keywords: Lagrange interpolation, linear complexity, monomial matrix, Newton interpolation

Procedia PDF Downloads 223
39456 Time and Kinematics of Moving Bodies

Authors: Muhammad Omer Farooq Saeed

Abstract:

The purpose of the proposal is to find out what time actually is, and to understand the natural behaviour of time and light corresponding to the motion of bodies at relatively high speeds. The main concern of the paper is to address possible shortcomings in the equations of relativity, thereby providing some extensions to those equations and concepts. The idea developed rests on the most basic conception of the relative motion of a body with respect to space, and on an understanding of time and of the variation of the energy of the body in different frames of reference. The results present a new understanding of time, relative motion and energy, along with some extensions to the equations of special relativity, most importantly time dilation and the mass-energy relationship, intended to explain all frames of a body in one treatment. The proposal also raises serious questions about the validity of the principle of equivalence on which general relativity is based, most importantly a case of light bending that appears to conflict with the theory's own governing concepts of space-time. The results also predict the existence of a new field that explains how and why bodies acquire energy in space-time; this field explains the production of gravitational waves based on time. All in all, this proposal challenges the formulas and conceptions of special and general relativity.

Keywords: time, relative motion, energy, speed, frame of reference, photon, curvature, space-time, time-differentials

Procedia PDF Downloads 64
39455 Metal Layer Based Vertical Hall Device in a Complementary Metal Oxide Semiconductor Process

Authors: Se-Mi Lim, Won-Jae Jung, Jin-Sup Kim, Jun-Seok Park, Hyung-Il Chae

Abstract:

This paper presents a current-mode vertical Hall device (VHD) structure using metal layers in a CMOS process. The proposed metal layer based vertical Hall device (MLVHD) utilizes vertical connections among metal layers (from M1 to the top metal) to produce the Hall effect. The vertical metal structure unit carries a bias current Ibias from top to bottom, and an external magnetic field changes the current distribution through the Lorentz force. The asymmetric current distribution can be detected by two differential-mode current outputs, one on each side at the bottom (M1), with each output sinking Ibias/2 ± Ihall. A single vertical metal structure generates only a small Hall signal Ihall, due to the short length from M1 to the top metal as well as the low conductivity of the metal, and a series connection of thousands of vertical structure units can solve the problem by providing N × Ihall. The series connection between two units is another vertical metal structure carrying current in the opposite direction, which generates a negative Hall effect. To mitigate the negative Hall effect from the series connection, the differential current outputs at the bottom (M1) of one unit merge at the top metal level of the other unit. The proposed MLVHD is simulated in a three-dimensional model in COMSOL Multiphysics with 0.35 μm CMOS process parameters. The simulated MLVHD unit size is (W) 10 μm × (L) 6 μm × (D) 10 μm. In this paper, we use an MLVHD with 10 units; the overall Hall device size is (W) 10 μm × (L) 78 μm × (D) 10 μm. The COMSOL simulation results are as follows: the maximum Hall current is approximately 2 μA with a 12 μA bias current and a 100 mT magnetic field. This work was supported by an Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIP) (No. R7117-16-0165, Development of Hall Effect Semiconductor for Smart Car and Device).

Keywords: CMOS, vertical hall device, current mode, COMSOL

Procedia PDF Downloads 297
39454 Use of Artificial Intelligence Based Models to Estimate the Use of a Spectral Band in Cognitive Radio

Authors: Danilo López, Edwin Rivas, Fernando Pedraza

Abstract:

Currently, one of the major challenges in wireless networks is the optimal use of the radio spectrum, which is managed inefficiently. One solution to this problem converges on the use of Cognitive Radio (CR), which makes it possible for secondary users to use the available licensed spectrum well above the occupancy levels currently detected, thus allowing opportunistic use of the channel in the absence of primary users (PUs). This article presents the results of estimating, or predicting, the future use of a spectral transmission band (from the perspective of the PU) for a chaotic type of channel-arrival behaviour. The time series (which represents the PU activity) is predicted with ANFIS (Adaptive Neuro-Fuzzy Inference System). The results obtained were compared to those delivered by an RNA (artificial neural network) algorithm. The results show better performance in the characterization (modelling and prediction) with the ANFIS methodology.
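
There is no standard ANFIS implementation in the common Python scientific stack, so the sketch below only illustrates the shared framing of the prediction task: a sliding window of past occupancy samples predicts the next one, with a small neural network standing in for the ANN (RNA) baseline. The chaotic series is a synthetic logistic map, not measured PU activity.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic, chaotic-looking occupancy series (logistic map) standing in for PU activity
x = np.empty(600)
x[0] = 0.4
for t in range(599):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

lags = 4
X = np.column_stack([x[i:len(x) - lags + i] for i in range(lags)])  # window of past samples
y = x[lags:]                                                        # next sample to predict

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
model.fit(X[:500], y[:500])
print("test MSE:", np.mean((model.predict(X[500:]) - y[500:]) ** 2))
```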

Keywords: ANFIS, cognitive radio, prediction primary user, RNA

Procedia PDF Downloads 415
39453 Tracking the Effect of Ibutilide on Amplitude and Frequency of Fibrillatory Intracardiac Electrograms Using the Regression Analysis

Authors: H. Hajimolahoseini, J. Hashemi, D. Redfearn

Abstract:

Background: Catheter ablation is an effective therapy for symptomatic atrial fibrillation (AF). The intracardiac electrogram (IEGM) collected during this procedure contains valuable information that has not been explored to its full capacity. Novel processing techniques allow these recordings to be examined from different perspectives, which can lead to improved therapeutic approaches. In our previous study, we showed that variation in amplitude, measured through Shannon entropy, could be used as an AF recurrence risk stratification factor in patients who received Ibutilide before the electrograms were recorded. The aim of this study is to further investigate the effect of Ibutilide on the characteristics of signals recorded from the left atrium (LA) of patients with persistent AF before and after administration of the drug. Methods: The IEGMs collected from different intra-atrial sites of 12 patients were studied and compared before and after Ibutilide administration. First, the before- and after-Ibutilide IEGMs recorded within a Euclidean distance of 3 mm in the LA were selected as pairs for comparison. For every selected pair of IEGMs, the probability distribution function (PDF) of the amplitude in the time domain and of the magnitude in the frequency domain was estimated using regression analysis. The PDF represents the relative likelihood of a variable falling within a specific range of values. Results: Our observations showed that in the time domain the PDF of the amplitudes fitted a Gaussian distribution, while in the frequency domain it fitted a Rayleigh distribution. Our observations also revealed that after Ibutilide administration the IEGMs have significantly narrower, shorter-tailed PDFs in both the time and frequency domains. Conclusion: This study shows that the PDFs of the IEGMs before and after administration of Ibutilide exhibit significantly different properties, both in the time and frequency domains. Hence, by fitting the PDF of the IEGMs in the time domain to a Gaussian distribution, or in the frequency domain to a Rayleigh distribution, the effect of Ibutilide can easily be tracked using the statistics of the fitted PDF (e.g., the standard deviation), whereas this is difficult from the IEGM waveform itself.
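
A minimal sketch of the distribution-fitting step for one electrogram pair, using SciPy's maximum-likelihood fits: a Gaussian for the time-domain amplitudes and a Rayleigh for the FFT magnitudes. The samples are synthetic stand-ins, not patient data; tracking the drug effect then amounts to comparing the fitted spread parameters before and after administration.

```python
import numpy as np
from scipy.stats import norm, rayleigh

rng = np.random.default_rng(2)
# Stand-ins for one IEGM: amplitude samples (time domain) and their FFT magnitudes
amplitude = rng.normal(0.0, 0.35, 4000)          # mV, synthetic
magnitude = np.abs(np.fft.rfft(amplitude))       # frequency-domain magnitudes

mu, sigma = norm.fit(amplitude)                  # Gaussian fit in the time domain
loc, scale = rayleigh.fit(magnitude, floc=0.0)   # Rayleigh fit in the frequency domain
print(f"time domain: mu={mu:.3f}, sigma={sigma:.3f}; frequency domain: scale={scale:.2f}")
```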

Keywords: atrial fibrillation, catheter ablation, probability distribution function, time-frequency characteristics

Procedia PDF Downloads 156
39452 Online Self-Help Metacognitive Therapy for OCD: A Case Series

Authors: C. Pearcy, C. Rees

Abstract:

Cognitive behavioural therapy (CBT) and exposure and response prevention (ERP) are currently the most efficacious treatments for obsessive-compulsive disorder (OCD). Many clients, however, remain symptomatic following treatment, and refusal of treatment, withdrawal from treatment and partial adherence are common with ERP. Such limitations have led few professionals to actually engage in ERP therapy, which has warranted the exploration of alternative treatments. This study evaluated an online self-help treatment program for OCD (the OCD Doctor Online), a 4-week metacognitive therapy (MCT) program implementing strategies from Wells' metacognitive model of OCD. The aim of the present study was to investigate whether an online self-help treatment using MCT would reduce symptoms of OCD, reduce unhelpful metacognitions and improve quality of life. Treatment effectiveness was assessed using a case series methodology in 3 consecutively referred individuals. At post-treatment, all participants showed reductions in unhelpful metacognitive beliefs (MCQ-30) and improvements in quality of life (Q-LES-Q), which were maintained through to 4-week follow-up. Two of the three participants showed reductions in OCD symptomatology (OCI-R), which were further reduced at 4-week follow-up. The present study suggests that internet-based self-help treatment may be an effective means of delivering MCT to adults with OCD.

Keywords: internet-based, metacognitive therapy, obsessive-compulsive disorder, self-help

Procedia PDF Downloads 428
39451 Comparative Study of the Effects of Process Parameters on the Yield of Oil from Melon Seed (Cococynthis citrullus) and Coconut Fruit (Cocos nucifera)

Authors: Ndidi F. Amulu, Patrick E. Amulu, Gordian O. Mbah, Callistus N. Ude

Abstract:

A comparative analysis of the properties of melon seed, coconut fruit and their oil yields was carried out in this work using standard AOAC analytical techniques. The analysis revealed that the moisture contents of the samples studied are 11.15% (melon) and 7.59% (coconut), and the crude lipid contents are 46.10% (melon) and 55.15% (coconut). The treatment combinations used (leaching time, leaching temperature and solute:solvent ratio) showed significant differences (p < 0.05) in yield between the samples, with melon seed flour giving a higher oil-yield range (41.30-52.90%) than coconut (36.25-49.83%). Physical characterization of the extracted oils was also carried out: the refractive indices are 1.487 (melon seed oil) and 1.361 (coconut oil), and the viscosities are 0.008 (melon seed oil) and 0.002 (coconut oil). Chemical analysis of the extracted oils gave acid values of 1.00 mg NaOH/g oil (melon oil) and 10.050 mg NaOH/g oil (coconut oil), saponification values of 187.00 mg KOH/g (melon oil) and 183.26 mg KOH/g (coconut oil), and iodine values of 75.00 mg I2/g (melon oil) and 81.00 mg I2/g (coconut oil). The standard statistical package Minitab version 16.0 was used for the regression analysis and analysis of variance (ANOVA), and also to optimize the leaching process. Both samples gave high oil yields at the same optimal conditions: the optimal conditions to obtain the highest oil yields, ≥ 52% (melon seed) and ≥ 48% (coconut), are a solute-solvent ratio of 40 g/ml, a leaching time of 2 hours and a leaching temperature of 50°C. Both samples have oil-yielding potential, with melon seed giving the higher yield.

Keywords: coconut, melon, optimization, processing

Procedia PDF Downloads 434
39450 Determination of Small Shear Modulus of Clayey Sand Using Bender Element Test

Authors: R. Sadeghzadegan, S. A. Naeini, A. Mirzaii

Abstract:

In this article, the results of a carefully conducted laboratory test programme are presented to determine the small-strain shear modulus of sand mixed with kaolinite contents ranging from zero to 30%. This was achieved experimentally using a triaxial cell equipped with bender elements. Results indicate that the small-strain shear modulus tends to increase as clay content decreases and effective confining pressure increases. The stress exponent in the power-model regression analysis was not sensitive to the clay content for any of the sand-clay mixtures, while the coefficient A was directly affected by changes in clay content.
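
The power model referred to above has the form G_max = A (p')^n, so A and the stress exponent n follow from a straight-line fit in log-log space. The sketch below uses invented pressure-modulus pairs purely to show the fitting step.

```python
import numpy as np

# Illustrative bender-element results: effective confining pressure (kPa) vs G_max (MPa)
p_eff = np.array([50., 100., 200., 300., 400.])
G_max = np.array([45., 63., 90., 110., 126.])

# G_max = A * p'^n  ->  ln G_max = ln A + n ln p'
n_exp, logA = np.polyfit(np.log(p_eff), np.log(G_max), 1)
A = np.exp(logA)
print(f"A = {A:.2f}, stress exponent n = {n_exp:.2f}")
```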

Keywords: small shear modulus, bender element test, plastic fines, sand

Procedia PDF Downloads 466
39449 Estimation and Forecasting with a Quantile AR Model for Financial Returns

Authors: Yuzhi Cai

Abstract:

This talk presents a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. We establish that the joint posterior distribution of the model parameters and future values is well defined. The associated MCMC algorithm for parameter estimation and forecasting converges to the posterior distribution quickly. We also present a forecast-combining technique to produce more accurate out-of-sample forecasts by using a weighted sequence of fitted QAR models. A moving-window method for checking the quality of the estimated conditional quantiles is developed. We verify our methodology using simulation studies and then apply it to currency exchange rate data; an application of the method to the USD to GBP daily exchange rates will also be discussed. The results obtained show that an unequally weighted combining method performs better than the other forecasting methods.
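
The QAR idea, modelling a conditional quantile of the series as a function of its own lags, can be illustrated with the frequentist quantile regression available in statsmodels; this is only a stand-in for the Bayesian MCMC estimation described in the abstract, and the return series is simulated rather than exchange-rate data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
# Simulated daily returns with mild AR(1) dependence, standing in for an exchange-rate series
y = np.empty(1000)
y[0] = 0.0
eps = 0.01 * rng.standard_normal(1000)
for t in range(1, 1000):
    y[t] = 0.2 * y[t - 1] + eps[t]

X = sm.add_constant(y[:-1])                    # lagged value as the QAR(1) regressor
fit_tail = sm.QuantReg(y[1:], X).fit(q=0.05)   # 5% conditional quantile (VaR-style)
fit_med = sm.QuantReg(y[1:], X).fit(q=0.50)    # conditional median
print(fit_tail.params, fit_med.params)
```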

Keywords: combining forecasts, MCMC, quantile modelling, quantile forecasting, predictive density functions

Procedia PDF Downloads 342
39448 Design of 900 MHz High Gain SiGe Power Amplifier with Linearity Improved Bias Circuit

Authors: Guiheng Zhang, Wei Zhang, Jun Fu, Yudong Wang

Abstract:

A 900 MHz three-stage SiGe power amplifier (PA) with high power gain is presented in this paper. A Volterra series analysis is applied to clearly identify the nonlinearity sources of the SiGe HBT device model, and the influence of the operating current on IMD3 is discussed. A β-helper current mirror bias circuit is then applied to improve linearity, since this bias circuit offers a stable base biasing voltage. It can also work as a predistortion circuit when the biasing voltages of the three bias circuits are fine-tuned; in this way, the power gain and operating current of the PA are optimized for best linearity. The three power stages, fabricated in a 0.18 μm SiGe technology, are bonded to the printed circuit board (PCB), their impedances are obtained with a load-pull system, and matching networks are then realized for best linearity with discrete passive components on the PCB. The final measured three-stage PA exhibits 21.1 dBm of output power at the 1 dB compression point (OP1dB) with a power-added efficiency (PAE) of 20.6% and 33 dB power gain under a 3.3 V supply voltage.

Keywords: high gain power amplifier, linearization bias circuit, SiGe HBT model, Volterra series

Procedia PDF Downloads 333
39447 Modeling the Time-Dependent Rheological Behavior of Clays Used in Fabrication of Ceramic

Authors: Larbi Hammadi, N. Boudjenane, N. Benhallou, R. Houjedje, R. Reffis, M. Belhadri

Abstract:

Many clays exhibit thixotropic behaviour, in which the apparent viscosity of the material decreases with shearing time at constant shear rate. The structural kinetic model (SKM) was used to characterize the thixotropic behaviour of two different kinds of clays used in ceramic fabrication. The clays selected for analysis represent fluid and semi-solid clay materials. The SKM postulates that the change in rheological behaviour is associated with shear-induced breakdown of the internal structure of the clays. The model describes the structure decay with time at constant shear rate by assuming nth-order kinetics for the decay of the material structure, governed by a rate constant.
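
A minimal sketch of fitting the SKM to a viscosity-decay record at constant shear rate, assuming the common second-order case (n = 2), for which the model reduces to eta(t) = eta_e + (eta_0 - eta_e) / (1 + k t). The data points below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative apparent-viscosity decay at constant shear rate: time (s) vs viscosity (Pa.s)
t = np.array([0., 30., 60., 120., 240., 480., 900.])
eta = np.array([18.0, 13.6, 11.5, 9.6, 8.0, 7.1, 6.6])

def skm_second_order(t, eta0, eta_e, k):
    """SKM with second-order structure decay (n = 2)."""
    return eta_e + (eta0 - eta_e) / (1.0 + k * t)

(eta0, eta_e, k), _ = curve_fit(skm_second_order, t, eta, p0=(18.0, 6.0, 0.01))
print(f"eta0 = {eta0:.1f} Pa.s, eta_e = {eta_e:.1f} Pa.s, k = {k:.4f} 1/s")
```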

Keywords: ceramic, clays, structural kinetic model, thixotropy, viscosity

Procedia PDF Downloads 400
39446 Determination of Hydrocarbon Path Migration from Gravity Data Analysis (Ghadames Basin, Southern Tunisia, North Africa)

Authors: Mohamed Dhaoui, Hakim Gabtni

Abstract:

The migration of hydrocarbons is a fairly complicated process that depends on several parameters, both structural and sedimentological. In this study, we try to determine the secondary migration paths which convey hydrocarbons from their main source rock to the largest reservoir of the Paleozoic petroleum system in the Tunisian part of the Ghadames basin. The Silurian source rock is the main source rock of the Paleozoic petroleum system of the Ghadames basin, while the most exploited reservoir in this area is the Triassic TAGI reservoir (Trias Argilo-Gréseux Inférieur). Several geochemical studies have confirmed that the oil produced from the TAGI comes mainly from the Silurian Tannezuft source rock, which implies that secondary migration occurs through the fault system affecting the post-Silurian series. Our study is based on the analysis and interpretation of gravity data. The gravity modelling was conducted in the northern part of the Ghadames basin and over the Telemzane uplift. We noted a close relationship between the location of producing oil fields and the gravity gradients which separate positive and negative gravity anomalies. The analysis and transformation of the Bouguer anomaly map and the residual gravity map allowed us to understand the architecture of the Precambrian basement in the study area; gravimetric models were then established that allowed the probable migration paths to be determined.
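
One common way to obtain a residual gravity map and the gradient zones mentioned above is to subtract a smoothed (regional) field from the Bouguer anomaly and then compute the horizontal gradient magnitude. The sketch below does this on a synthetic grid; it is an assumed stand-in for the processing actually used in the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(4)
# Synthetic Bouguer anomaly grid (mGal): a smooth regional trend plus a local basement feature
ny, nx = 120, 160
yy, xx = np.mgrid[0:ny, 0:nx]
regional = 0.05 * xx - 0.03 * yy
local = 4.0 * np.exp(-((xx - 90) ** 2 + (yy - 60) ** 2) / 200.0)
bouguer = regional + local + 0.2 * rng.standard_normal((ny, nx))

regional_est = gaussian_filter(bouguer, sigma=25)   # low-pass field approximates the regional
residual = bouguer - regional_est                   # highlights local basement highs and lows
gy, gx = np.gradient(bouguer)
horizontal_gradient = np.hypot(gx, gy)              # gradient zones used to trace structures
print(residual.max(), horizontal_gradient.max())
```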

Keywords: basement, Ghadames, gravity, hydrocarbon, migration path

Procedia PDF Downloads 354
39445 Factors Affecting in Soil Analysis Technique Adopted by the Southern Region Farmers, Syria

Authors: Moammar Dayoub

Abstract:

The study aimed to understand farmers' actual practice, determine the extent of adoption of fertilizer recommendations, and identify the difficulties and problems farmers face. The study was conducted on a random sample of 95 farmers who had had their field soil analysed at scientific research centres in the southern agricultural region, using a form specially prepared for this purpose. The results showed that the overall rate of adoption of the fertilizer recommendations averaged 36.9% in the southern region, with a degree of adoption of 34.7%. The results also showed that 41% of farmers did not implement the recommendations because the analysis was not convenient, 34% because of neglect, 15% because of weather and environmental conditions, and 10% because manure was not available at the suitable time. The study also revealed that the independent factors affecting continued adoption of soil analysis are farm experience, the sampling method taught in farmers' schools, irrigated area, and farmers' personal knowledge of soil analysis. It also showed that applying the fertilizer recommendations increased production by 15-20%, which emphasizes the importance of soil analysis and of adherence to the recommendations of the research centres.

Keywords: adoption, recommendations of the fertilizer, soil analysis, southern region

Procedia PDF Downloads 164
39444 Spatio-Temporal Analysis of Land Use and Land Cover Change in the Cocoa Belt of Ondo State, southwestern Nigeria

Authors: Emmanuel Dada, Adebayo-Victoria Tobi Dada

Abstract:

The study evaluates land use and land cover changes in the cocoa belt of Ondo State in order to quantify their effect on the expanse of land occupied by cocoa plantations in the region most suitable for cocoa cultivation in Nigeria. Time series of satellite imagery from Landsat-7 ETM+ and Landsat-8 TIRS, covering the years 2000 and 2015 respectively, were used. The study area was classified into six land use themes: cocoa plantation; settlement; water body; light forest and grassland; forest; and bare surface and rock outcrop. The analyses revealed that, out of the total study area of 997,714 hectares, cocoa plantation land use increased by 10.3% in 2015 from 312,260.6 ha in 2000. Forest land use also increased, by 6.3%, from 152,144.1 ha in 2000; water body decreased by 0.1% from 2,954.5 ha; settlement land use increased by 3% from 15,194.6 ha; light forest and grassland decreased by 10.4%; and bare surface and rock outcrop decreased by 9.1% between 2000 and 2015. The different magnitudes of change observed in land use and land cover in the study area could be due to increased incentives to cocoa farmers from both government and non-governmental organizations, newly developed cocoa varieties that thrive better in light forest, rapid growth in the population of cocoa farmers' settlements, and the government's promulgation of forest reserve laws.

Keywords: satellite imagery, land use and land cover change, area of land

Procedia PDF Downloads 226
39443 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain

Authors: Bita Payami-Shabestari, Dariush Eslami

Abstract:

The purpose of this paper is to develop a multi-product economic production quantity model under a vendor-managed inventory policy, with restrictions including limited warehouse space, budget and number of orders, average shortage time and maximum permissible shortage. Since the costs cannot be predicted with certainty, the data are assumed to behave under an uncertain environment. The problem is first formulated as a bi-objective multi-product economic production quantity model. It is then solved with three multi-objective decision-making (MODM) methods, and the three methods are compared on the optimal values of the two objective functions and on the central processing unit (CPU) time, using statistical analysis and multi-attribute decision-making (MADM). The results of the study demonstrate that the augmented epsilon-constraint method performs better than the global criteria and goal programming methods in terms of the optimal values of the two objective functions and the CPU time. A sensitivity analysis is performed to illustrate the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product economic production quantity model under a vendor-managed inventory policy with several constraints.
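
The epsilon-constraint idea behind the best-performing MODM method can be sketched on a toy bi-objective lot-sizing problem: minimise one objective while the other is bounded, and sweep the bound to trace Pareto points. The objective functions, parameters and bounds below are invented placeholders, not the paper's model.

```python
import numpy as np
from scipy.optimize import minimize

def total_cost(q):        # objective 1: setup plus holding cost for two products (toy surrogate)
    return 500 / q[0] + 0.4 * q[0] + 700 / q[1] + 0.3 * q[1]

def shortage_time(q):     # objective 2: shortage-time surrogate, decreasing in the lot sizes
    return 120 / q[0] + 150 / q[1]

def epsilon_constraint(eps):
    """Minimise total_cost subject to shortage_time <= eps (simplified epsilon-constraint)."""
    cons = [{"type": "ineq", "fun": lambda q: eps - shortage_time(q)}]
    res = minimize(total_cost, x0=[30.0, 30.0], bounds=[(1, 500), (1, 500)], constraints=cons)
    return res.x

for eps in [4.0, 6.0, 8.0]:
    q = epsilon_constraint(eps)
    print(f"eps={eps}: q={np.round(q, 1)}, cost={total_cost(q):.1f}, shortage={shortage_time(q):.2f}")
```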

Keywords: economic production quantity, random cost, supply chain management, vendor-managed inventory

Procedia PDF Downloads 122
39442 Time-Frequency Modelling and Analysis of Faulty Rotor

Authors: B. X. Tchomeni, A. A. Alugongo, T. B. Tengen

Abstract:

In this paper, a de Laval rotor system is characterized by a hinge model and its transient response is treated numerically to obtain a dynamic solution. The effects of the ensuing non-linear disturbances, namely rub and a breathing crack, are numerically simulated. Subsequently, three analysis methods, orbit analysis, the Fast Fourier Transform (FFT) and the Wavelet Transform (WT), are employed to extract features of the vibration signal of the faulty system. An analysis of the system response orbits clearly indicates the perturbations due to rotor-to-stator contact. The sensitivity of the WT to variation in system speed has been investigated using the Continuous Wavelet Transform (CWT). The analysis reveals that crack, rub and unbalance features in the vibration response can be useful for condition monitoring. The WT demonstrates its ability to detect non-linear signals, and the results obtained provide a useful method for detecting machinery faults.
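
A minimal sketch of the two signal-level tools named above, applied to a synthetic rotor vibration signal (an unbalance tone plus weak higher harmonics standing in for crack and rub nonlinearity): the FFT gives the overall spectrum, and a continuous wavelet transform gives the time-frequency picture. The CWT here uses the PyWavelets package, which is an assumption, not the authors' toolchain.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

fs = 2000.0                         # sampling rate (Hz), illustrative
t = np.arange(0.0, 2.0, 1.0 / fs)
f_rot = 30.0                        # shaft speed (Hz)
rng = np.random.default_rng(5)

# Unbalance (1x) plus weak 2x/3x content mimicking breathing-crack and rub nonlinearity
x = (1.0 * np.sin(2 * np.pi * f_rot * t)
     + 0.3 * np.sin(2 * np.pi * 2 * f_rot * t)
     + 0.15 * np.sin(2 * np.pi * 3 * f_rot * t)
     + 0.05 * rng.standard_normal(t.size))

# FFT: frequency content of the whole record
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
spectrum = np.abs(np.fft.rfft(x)) / t.size
print("dominant component at", freqs[np.argmax(spectrum)], "Hz")

# CWT: time-frequency map able to follow variation in running speed
coefs, cwt_freqs = pywt.cwt(x, scales=np.arange(1, 64), wavelet="morl", sampling_period=1.0 / fs)
print(coefs.shape)   # (number of scales, number of samples)
```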

Keywords: Continuous wavelet, crack, discrete wavelet, high acceleration, low acceleration, nonlinear, rotor-stator, rub

Procedia PDF Downloads 342
39441 Grief and Repenting: The Engaging Remembrance in Thomas Hardy’s ‘Poems of 1912-13’

Authors: Chih-Chun Tang

Abstract:

Nostalgia, to some people, may seem foolhardy in a way. However, nostalgia is an intensely private yet also social, collective emotion. It has continuing consequences for our lives as social actors. It leads people to hunt through and explore remembrances of persons and places of the past in an effort to confer meaning on persons and places of the present. In the ‘Poems of 1912-13’, the British poet Thomas Hardy composed a series of poems after the unexpected death of his long-estranged wife, Emma. The series traces the cognitive and emotional impact of Emma’s death on Hardy, both in his mind and in his actual visit to the landscape of Cornwall, England. Both spaces perform the author’s innermost thoughts toward his late wife and toward the landscape, and they present an apparent counterpart to the poet and his afflicted conscience. After Emma died, Hardy kept her memory alive by roaming, in actual visits and in imagination, the land they had once drifted and meandered through. This paper highlights the nostalgia and grief that seem endlessly to crop up.

Keywords: Thomas Hardy, remembrance, psychological, poems 1912-13, Fred Davis, nostalgia

Procedia PDF Downloads 266
39440 Application of Principle Component Analysis for Classification of Random Doppler-Radar Targets during the Surveillance Operations

Authors: G. C. Tikkiwal, Mukesh Upadhyay

Abstract:

During surveillance operations in war or peacetime, the radar operator sees a scatter of targets on the screen. A target may be a tracked vehicle such as a tank (e.g., T72 or BMP), a wheeled vehicle such as an ALS, TATRA, 2.5-tonne or Shaktiman truck, or moving troops and convoys. The radar operator selects one of the promising targets and places it in Single Target Tracking (STT) mode. Once the target is locked, the operator hears a characteristic audible signal in the headphones. Drawing on experience and training gained over time, the operator then identifies the unknown target. This process is cumbersome and depends solely on the skill of the operator, and may therefore lead to misclassification of the object. In this paper we present a technique using mathematical and statistical methods, namely the Fast Fourier Transform (FFT) and Principal Component Analysis (PCA), to identify such random targets. The classification process is based on transforming the audible signature of the target into musical octave-notes. The whole methodology is then automated in suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. The whole study is based on live data.
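
A minimal sketch of the FFT-plus-PCA feature pipeline described above, using synthetic audible Doppler signatures (two invented target classes with different dominant tones) rather than live radar audio; the octave-note mapping step is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
fs = 8000.0                          # audio sampling rate (Hz)
t = np.arange(0.0, 0.25, 1.0 / fs)

def doppler_audio(base_hz):
    """Synthetic audible signature: a base tone, one harmonic and noise (illustrative only)."""
    return (np.sin(2 * np.pi * base_hz * t)
            + 0.5 * np.sin(2 * np.pi * 2 * base_hz * t)
            + 0.3 * rng.standard_normal(t.size))

# Two illustrative classes (e.g. tracked vs wheeled) with different dominant tones
signals = [doppler_audio(hz) for hz in [400] * 20 + [700] * 20]
features = np.array([np.abs(np.fft.rfft(s))[:400] for s in signals])   # FFT magnitude features

pca = PCA(n_components=2)
scores = pca.fit_transform(features)        # low-dimensional features for classification
print(pca.explained_variance_ratio_, scores.shape)
```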

Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP

Procedia PDF Downloads 341