Search results for: RR interval time series
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19781

19451 Time Series Analysis of Air Pollution in Suceava County (North-East of Romania)

Authors: Lazurca Liliana Gina

Abstract:

Different time series analyses of yearly air pollution in Suceava County, North-East of Romania, have been performed in this study. The trends in the atmospheric concentrations of the main gaseous and particulate pollutants in urban, industrial, and rural environments across Suceava County were estimated for the period 2008-2014. The non-parametric Mann-Kendall test was used to determine the trends in the annual average concentrations of air pollutants (NO2, NO, NOx, SO2, CO, PM10, O3, C6H6). The slope was estimated using the non-parametric Sen’s method. Trend significance was assumed at the 5% significance level (p < 0.05) in the current study. During the 7-year period, trends in atmospheric concentrations were not always monotonic; in some instances, concentrations of species increased and subsequently decreased. The overall tendency in Suceava County is to keep concentrations of pollutants in ambient air low, respecting the limit values. All the results we obtained show that Romania has taken many regulatory measures to decrease the concentrations of air pollutants in the last decade; in Suceava County, air quality monitoring highlights decreasing trends for most of the analyzed pollutants. For the analyzed period, we observed considerable improvements in background air quality in Suceava County.
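
The two non-parametric tools named above are compact enough to sketch in Python; this is an illustrative implementation (normal approximation, no tie correction in the variance term), not the authors' code:

```python
import math

def mann_kendall(x):
    """Mann-Kendall S statistic and normal-approximation Z (no tie correction)."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

def sens_slope(x):
    """Sen's estimator: median of all pairwise slopes of the series."""
    slopes = sorted((x[j] - x[i]) / (j - i)
                    for i in range(len(x) - 1) for j in range(i + 1, len(x)))
    m = len(slopes)
    mid = m // 2
    return slopes[mid] if m % 2 else 0.5 * (slopes[mid - 1] + slopes[mid])
```

For a monotonically decreasing 7-year series, |Z| exceeds 1.96, so the decreasing trend is significant at the 5% level used in the study.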

Keywords: pollutant, trend, air quality monitoring, Mann-Kendall

Procedia PDF Downloads 343
19450 Uncontrollable Inaccuracy in Inverse Problems

Authors: Yu Menshikov

Abstract:

In this paper, the influence of errors in function derivatives at the initial time, obtained by experiment (uncontrollable inaccuracy), on the results of inverse problem solutions was investigated. It was shown that these errors, as a rule, distort the inverse problem solution near the beginning of the interval on which the solution is analyzed. Several methods for removing the influence of this uncontrollable inaccuracy are suggested.

Keywords: inverse problems, filtration, uncontrollable inaccuracy

Procedia PDF Downloads 486
19449 Analysis of Extreme Rainfall Trends in Central Italy

Authors: Renato Morbidelli, Carla Saltalippi, Alessia Flammini, Marco Cifrodelli, Corrado Corradini

Abstract:

The trend of the magnitude and frequency of extreme rainfalls seems to differ depending on the investigated area of the world. In this work, the impact of climate change on extreme rainfalls in Umbria, an inland region of central Italy, is examined using data recorded during the period 1921-2015 by 10 representative rain gauge stations. The study area is characterized by a complex orography, with altitude ranging from 200 to more than 2000 m a.s.l. The climate differs considerably from zone to zone, with mean annual rainfall ranging from 650 to 1450 mm and mean annual air temperature from 3.3 to 14.2°C. Over the past 15 years, this region has been affected by four significant droughts as well as by six dangerous flood events, all with very large economic impact. A least-squares linear trend analysis of the annual maxima of 60 time series, selected considering six durations (1 h, 3 h, 6 h, 12 h, 24 h, 48 h), showed about 50% positive and 50% negative cases. For the same time series, the non-parametric Mann-Kendall test at the 0.05 significance level evidenced only 3% of cases characterized by a negative trend and no positive cases. Further investigations demonstrated that the variance and covariance of each time series can be considered almost stationary. Therefore, the analysis of the magnitude of extreme rainfalls indicates that no evident trend in the Umbria region exists. However, the frequency of rainfall events with particularly high rainfall depths occurring during a fixed period also has to be considered. For all selected stations, the 2-day rainfall events exceeding 50 mm were counted for each year, from the first monitored year to the end of 2015. This analysis did not show predominant trends either.
Specifically, for all selected rain gauge stations, the annual number of 2-day rainfall events exceeding the threshold value (50 mm) was slowly decreasing in time, while the annual cumulated rainfall depths corresponding to the same events showed trends that were not statistically significant. Overall, using a wide available dataset and adopting simple methods, no influence of climate change on the heavy rainfalls of the Umbria region is detected.
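
The event-counting and least-squares trend steps described above can be sketched as follows; the daily depths in the example are placeholders, not the Umbria records:

```python
def two_day_events(daily, threshold=50.0):
    """Count rolling 2-day rainfall totals exceeding `threshold` (mm)
    in one year of daily depths."""
    sums = [daily[i] + daily[i + 1] for i in range(len(daily) - 1)]
    return sum(s > threshold for s in sums)

def ls_slope(y):
    """Least-squares linear trend slope of a yearly series indexed 0..n-1."""
    n = len(y)
    xm = (n - 1) / 2.0
    ym = sum(y) / n
    num = sum((i - xm) * (yi - ym) for i, yi in enumerate(y))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den
```

A negative slope of the annual counts corresponds to the slowly decreasing event frequency reported above.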

Keywords: climate changes, rainfall extremes, rainfall magnitude and frequency, central Italy

Procedia PDF Downloads 211
19448 Minimizing Unscheduled Maintenance from an Aircraft and Rolling Stock Maintenance Perspective: Preventive Maintenance Model

Authors: Adel A. Ghobbar, Varun Raman

Abstract:

Corrective maintenance of components and systems is a problem plaguing almost every industry in the world today. Train operators and the maintenance, repair, and overhaul subsidiary of the Dutch railway company are also facing this problem. A considerable portion of the maintenance activities carried out by the company are unscheduled, which in turn severely stresses and stretches the available workforce and resources. One possible solution is to have a robust preventive maintenance plan; another is to plan maintenance based on real-time data obtained from sensor-based ‘Health and Usage Monitoring Systems.’ The former is investigated in this paper. The preventive maintenance model developed for the train operator will subsequently be extended to tackle the unscheduled maintenance problem also affecting the aerospace industry. The extension of the model to the aerospace sector will be dealt with in the second part of the research and will, in turn, validate the soundness of the model developed. Thus, distinct areas are addressed in this paper, including the mathematical modelling of preventive maintenance and optimization based on cost and system availability. The results of this research will help an organization choose the right maintenance strategy, allowing it to save considerable sums of money as opposed to overspending under the guise of maintaining high asset availability. The concept of delay time modelling was used to address the practical problem of unscheduled maintenance; delay time modelling can help with support planning for a given asset. The model was run in MATLAB, and the results show that the ideal inspection interval computed with the extended model was 29 days from a minimal-cost perspective and 14 days from a minimum-downtime perspective.
A risk matrix was constructed to represent the risk in terms of the probability of a fault leading to breakdown maintenance and its consequences in terms of maintenance cost. The choice of an optimal inspection interval of 29 days resulted in a cost of approximately 50 Euros, and the corresponding value of b(T) was 0.011. These values ensure that the risk associated with component X being maintained at an inspection interval of 29 days is more than acceptable. Thus, a switch in maintenance frequency from 90 days to 29 days would be optimal from the point of view of cost, downtime, and risk.
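
The delay-time trade-off behind such an inspection-interval choice can be sketched as follows, assuming uniform defect arrivals and an exponential delay-time distribution; all rates and costs here are hypothetical placeholders, not the company's figures:

```python
import math

def b_of_t(t, lam):
    """Probability that a defect arising in (0, t) becomes a breakdown before
    the next inspection, for uniform defect arrivals and an exponential
    delay-time distribution with rate lam (hypothetical parameters)."""
    return 1.0 - (1.0 - math.exp(-lam * t)) / (lam * t)

def cost_rate(t, lam, k, c_insp, c_break, c_repair):
    """Expected maintenance cost per day for inspection interval t (days);
    k is the defect arrival rate per day."""
    b = b_of_t(t, lam)
    return (c_insp + k * t * (b * c_break + (1.0 - b) * c_repair)) / t

def best_interval(lam, k, c_insp, c_break, c_repair, t_max=120):
    """Grid search for the interval minimising the expected cost rate."""
    return min(range(1, t_max + 1),
               key=lambda t: cost_rate(t, lam, k, c_insp, c_break, c_repair))
```

Shorter intervals pay more for inspections; longer ones let more faults mature into breakdowns (b(T) grows with T), which is the balance the 29-day optimum strikes in the paper.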

Keywords: delay time modelling, unscheduled maintenance, reliability, maintainability, availability

Procedia PDF Downloads 110
19447 Seismic Microzonation Analysis for Damage Mapping of the 2006 Yogyakarta Earthquake, Indonesia

Authors: Fathul Mubin, Budi E. Nurcahya

Abstract:

In 2006, a large earthquake occurred in the province of Yogyakarta, causing considerable damage. This is the basis of the need to investigate the seismic vulnerability index around the earthquake zone; such research is called microzonation of earthquake hazard. The research was conducted at the site and surroundings of Prambanan Temple, including homes and civil buildings, because during the 2006 event there was damage to the temples of the Prambanan temple complex and its surroundings. Data collection was carried out for 60 minutes using three-component seismograph measurements at 165 points with a spacing of 1000 meters. The recorded time series were analyzed using the spectral ratio method, known as the Horizontal to Vertical Spectral Ratio (HVSR). The results of this analysis, the dominant frequency (Fg) and the maximum amplification factor (Ag), are used to obtain the seismic vulnerability index. The results showed dominant frequencies ranging from 0.5 to 30 Hz, amplification factors in the interval from 0.5 to 9, and seismic vulnerability indices from 0.1 to 50. The distribution map of the seismic vulnerability index showed good agreement with the observed building damage. For further research, surveys to the east (Klaten) and south (Bantul, DIY) are needed to determine a full distribution map of the seismic vulnerability index.
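
The HVSR workflow described above can be sketched as below: compute the horizontal-to-vertical spectral ratio, pick the peak (Fg, Ag), and form Nakamura's vulnerability index Kg = Ag²/Fg. The geometric-mean combination of the two horizontal components and the synthetic test signal are assumptions of this sketch, not details from the study:

```python
import numpy as np

def hvsr(ns, ew, ud, fs):
    """H/V spectral ratio: geometric mean of the two horizontal amplitude
    spectra divided by the vertical amplitude spectrum."""
    f = np.fft.rfftfreq(len(ud), d=1.0 / fs)
    h = np.sqrt(np.abs(np.fft.rfft(ns)) * np.abs(np.fft.rfft(ew)))
    v = np.abs(np.fft.rfft(ud))
    ratio = np.divide(h, v, out=np.zeros_like(h), where=v > 0)
    return f, ratio

def vulnerability_index(f, ratio, fmin=0.5, fmax=30.0):
    """Kg = Ag^2 / Fg, taken at the HVSR peak inside the band of interest."""
    band = (f >= fmin) & (f <= fmax)
    i = np.argmax(ratio[band])
    return ratio[band][i] ** 2 / f[band][i]
```

In practice the records would also be windowed and the spectra smoothed before taking the ratio.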

Keywords: amplification factor, dominant frequency, microzonation analysis, seismic vulnerability index

Procedia PDF Downloads 174
19446 A Double Ended AC Series Arc Fault Location Algorithm Based on Currents Estimation and a Fault Map Trace Generation

Authors: Edwin Calderon-Mendoza, Patrick Schweitzer, Serge Weber

Abstract:

Series arc faults appear frequently and unpredictably in low-voltage distribution systems. Many methods have been developed to detect this type of fault, and commercial protection devices such as the AFCI (arc fault circuit interrupter) have been used successfully in electrical networks to prevent damage and catastrophic incidents like fires. However, these devices do not allow series arc faults to be located on the line in operating mode. This paper presents a location algorithm for series arc faults in a low-voltage indoor power line in an AC 230 V-50 Hz home network. The method is validated through simulations using MATLAB software. The fault location method uses the electrical parameters (resistance, inductance, capacitance, and conductance) of a 49 m indoor power line. The mathematical model of a series arc fault is based on the analysis of the V-I characteristics of the arc and consists basically of two antiparallel diodes and DC voltage sources. In a first step, the arc fault model is inserted at different positions along the line, which is modeled using lumped parameters. At both ends of the line, currents and voltages are recorded for each arc fault generated at a different distance. In the second step, a fault map trace is created by using signature coefficients obtained from the Kirchhoff equations, which allow a virtual decoupling of the line’s mutual capacitance. Each signature coefficient, obtained from the subtraction of estimated currents, is calculated taking into account the Discrete Fast Fourier Transform of currents and voltages and also the fault distance value. These parameters are then substituted into the Kirchhoff equations. In a third step, the same procedure used to calculate the signature coefficients is employed, but this time considering hypothetical fault distances at which the fault could appear; in this step the fault distance is unknown.
The iterative calculation from the Kirchhoff equations under stepped variations of the fault distance yields a curve with a linear trend. Finally, the fault distance is estimated at the intersection of the two curves obtained in steps 2 and 3. The series arc fault model is validated by comparing currents from simulation with real recorded currents. The model of the complete circuit is obtained for a 49 m line with a resistive load, and 11 different arc fault positions are considered for the map trace generation. By carrying out the complete simulation, the performance of the method and the perspectives of the work will be presented.

Keywords: indoor power line, fault location, fault map trace, series arc fault

Procedia PDF Downloads 113
19445 Perceptual Organization within Temporal Displacement

Authors: Michele Sinico

Abstract:

The psychological present has an actual extension. When a sequence of instantaneous stimuli falls within this short interval of time, observers perceive a compresence of events in succession, and the temporal order depends on the qualitative relationships between the perceptual properties of the events. Two experiments were carried out to study the influence of perceptual grouping, with and without temporal displacement, on the perceived duration of auditory sequences. The psychophysical method of adjustment was adopted. The first experiment investigated the effect of the temporal displacement of a white noise on sequence duration. The second experiment investigated the effect of temporal displacement along the pitch dimension on the temporal shortening of the sequence. The results suggest that the temporal order of sounds, in the case of temporal displacement, is organized along the pitch dimension.

Keywords: time perception, perceptual present, temporal displacement, Gestalt laws of perceptual organization

Procedia PDF Downloads 228
19444 Improved Traveling Wave Method Based Fault Location Algorithm for Multi-Terminal Transmission System of Wind Farm with Grounding Transformer

Authors: Ke Zhang, Yongli Zhu

Abstract:

Due to rapid load growth in today’s highly electrified societies and the requirement for green energy sources, large-scale wind farm power transmission systems are constantly developing. Such a system is a typical multi-terminal power supply system, with a complex transmission line network topology. Moreover, it is located in complex terrain of mountains and grasslands, which increases the possibility of transmission line faults and makes the fault location difficult to find after a fault, resulting in the extremely serious phenomenon of wind curtailment. In order to solve these problems, a fault location method for multi-terminal transmission lines based on wind farm characteristics and an improved single-ended traveling wave positioning method is proposed. By studying the zero-sequence current characteristics of the grounding transformers (GT) in existing large-scale wind farms, a criterion for judging the fault interval of the multi-terminal transmission line is obtained. When a ground short-circuit fault occurs, zero-sequence current flows only on the path between the GT and the fault point. Therefore, the interval in which the fault point lies is obtained by determining the path of the zero-sequence current. After determining the fault interval, the location of the short-circuit fault point is calculated by the traveling wave method. However, this article uses an improved traveling wave method that achieves higher positioning accuracy by combining the single-ended traveling wave method with double-ended electrical data. Moreover, a method of calculating the traveling wave velocity is deduced from the above improvements (in theory it yields the actual wave velocity), which further improves the positioning accuracy.
Compared with the traditional positioning method, the average positioning error of this method is reduced by 30%. This method overcomes the shortcomings of the traditional method in the poor fault location of wind farm transmission lines. In addition, it is more accurate than the traditional fixed-wave-velocity method: it can calculate the wave velocity in real time according to the conditions on site, solving the problem that a fixed traveling wave velocity cannot be updated with the environment. The method is verified in PSCAD/EMTDC.
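
A textbook way to obtain both the fault distance and the actual wave velocity from combined single- and double-ended data uses the differential arrival time at the two ends together with one fault-reflected wave at one end, as sketched below; this illustrates the principle, not the authors' exact algorithm:

```python
def locate_fault(t_a, t_b, t_a2, line_len):
    """Fault distance from end A and the inferred wave velocity.

    t_a, t_b : first-wavefront arrival times at ends A and B (s)
    t_a2     : arrival at A of the wave reflected once at the fault (s)
    line_len : total line length (m)

    From t_a - t_b = (2x - L)/v and t_a2 - t_a = 2x/v it follows that
    v = L / ((t_a2 - t_a) - (t_a - t_b)) and x = v * (t_a2 - t_a) / 2.
    """
    d1 = t_a - t_b          # double-ended differential arrival time
    d2 = t_a2 - t_a         # single-ended round trip to the fault
    v = line_len / (d2 - d1)
    x = 0.5 * v * d2
    return x, v
```

Because v drops out of the measured data, no assumed (fixed) wave velocity is needed, which is the point of the improvement described above.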

Keywords: grounding transformer, multi-terminal transmission line, short circuit fault location, traveling wave velocity, wind farm

Procedia PDF Downloads 228
19443 The Usage of Bridge Estimator for Hegy Seasonal Unit Root Tests

Authors: Huseyin Guler, Cigdem Kosar

Abstract:

The aim of this study is to propose the Bridge estimator for seasonal unit root tests. Seasonality is an important factor for many economic time series. Some variables contain seasonal patterns, and forecasts that ignore important seasonal patterns have a high variance; therefore, it is very important to handle seasonality properly in seasonal macroeconomic data. There are several methods to eliminate the impacts of seasonality in time series. One of them is filtering the data. However, this method leads to undesired consequences in unit root tests, especially if the data are generated by a stochastic seasonal process. Another method is using seasonal dummy variables. Some seasonal patterns result from stationary seasonal processes, which can be modelled using seasonal dummies, but if the seasonal pattern varies and changes over time, so that the seasonal process is non-stationary, deterministic seasonal dummies are inadequate to capture it. It is not suitable to use seasonal dummies for modeling such seasonally non-stationary series; instead, it is necessary to take seasonal differences if there are seasonal unit roots in the series. Different methods have been proposed in the literature to test for seasonal unit roots, such as the Dickey, Hasza, Fuller (DHF) and Hylleberg, Engle, Granger, Yoo (HEGY) tests. The HEGY test can also be used to test for seasonal unit roots at different frequencies (monthly, quarterly, and semiannual). Another issue in unit root tests is lag selection. Lagged dependent variables are added to the model in seasonal unit root tests, as in ordinary unit root tests, to overcome the autocorrelation problem. In this case, it is necessary to choose the lag length and determine any deterministic components (i.e., a constant and trend) first, and then use the proper model to test for seasonal unit roots. However, this two-step procedure might lead to size distortions and a lack of power in seasonal unit root tests.
Recent studies show that Bridge estimators are good at selecting the optimal lag length while differentiating non-stationary from stationary models for non-seasonal data. The advantage of this estimator is the elimination of the two-step nature of conventional unit root tests, which leads to a gain in size and power. In this paper, the Bridge estimator is proposed to test seasonal unit roots in a HEGY model. A Monte Carlo experiment is carried out to determine the efficiency of this approach and to compare the size and power of this method with the HEGY test. Since the Bridge estimator performs well in model selection, our approach may lead to some gain in terms of size and power over the HEGY test.
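
For quarterly data, the HEGY auxiliary regression can be sketched as below (intercept only, no lag augmentation; choosing the augmentation lags is exactly what the proposed Bridge estimator is meant to improve). This illustrates the test regression itself, not the Bridge approach:

```python
import numpy as np

def hegy_quarterly(y):
    """HEGY auxiliary regression for quarterly data (intercept only,
    no lag augmentation). Returns the t-statistics of pi1..pi4."""
    y = np.asarray(y, float)
    n = len(y)
    # transformed regressors, row t built from observations t-1 .. t-4
    y1 = np.array([y[t-1] + y[t-2] + y[t-3] + y[t-4] for t in range(4, n)])
    y2 = np.array([-(y[t-1] - y[t-2] + y[t-3] - y[t-4]) for t in range(4, n)])
    y3a = np.array([-(y[t-1] - y[t-3]) for t in range(4, n)])   # y3 at t-1
    y3b = np.array([-(y[t-2] - y[t-4]) for t in range(4, n)])   # y3 at t-2
    dy = y[4:] - y[:-4]                                         # seasonal difference
    X = np.column_stack([y1, y2, y3b, y3a, np.ones_like(y1)])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - X.shape[1])
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return (beta / se)[:4]
```

pi1 tests the zero-frequency root, pi2 the semiannual root, and pi3/pi4 jointly the annual roots; for a white-noise series all unit roots should be rejected.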

Keywords: bridge estimators, HEGY test, model selection, seasonal unit root

Procedia PDF Downloads 302
19442 Products in Early Development Phases: Ecological Classification and Evaluation Using an Interval Arithmetic Based Calculation Approach

Authors: Helen L. Hein, Joachim Schwarte

Abstract:

As a pillar of sustainable development, ecology has become an important focus of the research community, especially due to global challenges like climate change. The ecological performance of products can be scientifically assessed with life cycle assessments. In the construction sector, significant amounts of CO2 emissions are attributable to the energy used for building heating. Sustainable construction materials for insulating purposes are therefore essential, and aerogels have been explored intensively in recent years due to their low thermal conductivity. The WALL-ACE project accordingly aims to develop an aerogel-based thermal insulating plaster that would achieve very low thermal conductivities. But as, in the early development phases, a lot of information is still missing or not yet accessible, the ecological performance of innovative products is increasingly based on uncertain data that can lead to significant deviations in the results. To be able to predict realistically how meaningful the results are and how viable the developed products may be with regard to their respective markets, these deviations have to be considered. Therefore, a classification method is presented in this study that may allow comparing the ecological performance of new products with already established and competitive materials. In order to achieve this, an alternative calculation method was used that allows computing with lower and upper bounds so as to consider all possible values in the absence of precise data. The life cycle analysis of the considered products was conducted with an interval arithmetic based calculation method. The results lead to the conclusion that the interval solutions describing the possible environmental impacts are so wide that the usability of the results is limited. Nevertheless, further optimization in reducing the environmental impacts of aerogels seems to be needed for them to become more competitive in the future.
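
The bounded calculation works by propagating lower and upper limits through every arithmetic operation, as in this minimal sketch; the inventory numbers in the example are hypothetical, not WALL-ACE data:

```python
class Interval:
    """Closed interval [lo, hi]; each operation returns the tightest
    interval enclosing all possible results."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # the extremes of a product lie among the four corner products
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"
```

For example, an uncertain material mass per m² times an uncertain emission factor gives a bounded impact, and summing such terms over the inventory yields the (possibly very wide) interval result described above.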

Keywords: aerogel-based, insulating material, early development phase, interval arithmetic

Procedia PDF Downloads 124
19441 Discrete Estimation of Spectral Density for Alpha Stable Signals Observed with an Additive Error

Authors: R. Sabre, W. Horrigue, J. C. Simon

Abstract:

This paper addresses two difficulties encountered in practice when observing a continuous-time process. The first is that we cannot observe a process over an entire time interval; we only take discrete observations. The second is that the process is frequently observed with a constant additive error. It is important to give an estimator of the spectral density of such a process that takes into account the additive observation error and the choice of the discrete observation times. In this work, we propose an estimator based on spectral smoothing of the periodogram by the polynomial Jackson kernel, reducing the additive error. In order to solve the aliasing phenomenon, this estimator is constructed from observations taken at well-chosen times, so as to restrict the estimator to the domain where the spectral density is non-zero. We show that the proposed estimator is asymptotically unbiased and consistent. Thus we obtain an estimate resolving the two difficulties concerning the choice of the observation instants of a continuous-time process and observations affected by a constant error.
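
The smoothing step can be illustrated as follows; for brevity a triangular (Fejér-type) window stands in for the polynomial Jackson kernel of the paper, and the regular sampling here ignores the well-chosen observation times that handle aliasing:

```python
import numpy as np

def periodogram(x):
    """Raw periodogram |FFT|^2 / n over the non-negative frequencies."""
    n = len(x)
    return np.abs(np.fft.rfft(x)) ** 2 / n

def smoothed(pgram, m):
    """Smooth the periodogram with a normalized triangular window of
    half-width m (a stand-in for the Jackson kernel's polynomial weights)."""
    w = np.array([m + 1 - abs(k) for k in range(-m, m + 1)], float)
    w /= w.sum()
    return np.convolve(pgram, w, mode="same")
```

Averaging neighboring frequencies reduces the variance of the raw periodogram (and the contribution of a flat additive-noise floor) while leaving the location of spectral peaks intact.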

Keywords: spectral density, stable processes, aliasing, periodogram

Procedia PDF Downloads 116
19440 Impact of the Time Interval in the Numerical Solution of Incompressible Flows

Authors: M. Salmanzadeh

Abstract:

In this paper, we deal with incompressible Couette flow, which represents an exact analytical solution of the Navier-Stokes equations. Couette flow is perhaps the simplest of all viscous flows, while at the same time retaining much of the physical character of a more complicated boundary-layer flow. The numerical technique that we employ for the solution of the Couette flow is the Crank-Nicolson implicit method. Parabolic partial differential equations lend themselves to a marching solution; in addition, the use of an implicit technique allows a much larger marching step size than would be the case for an explicit solution. Hence, in the present paper, we have the opportunity to explore some aspects of CFD different from those discussed in other papers.
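
A minimal Crank-Nicolson march for the impulsively started Couette flow (∂u/∂t = ν ∂²u/∂y², fixed lower wall, moving upper wall) might look like this; the grid, viscosity, and step sizes are illustrative choices, not those of the paper:

```python
import numpy as np

def couette_cn(n=21, nu=1.0, dt=0.05, steps=2000, u_wall=1.0):
    """Crank-Nicolson march of du/dt = nu * d2u/dy2 on y in [0, 1],
    with u(0) = 0 (fixed wall) and u(1) = u_wall (moving wall)."""
    dy = 1.0 / (n - 1)
    r = nu * dt / (2.0 * dy * dy)       # Crank-Nicolson diffusion number
    u = np.zeros(n)
    u[-1] = u_wall
    m = n - 2                           # interior unknowns
    A = (np.diag((1.0 + 2.0 * r) * np.ones(m))
         + np.diag(-r * np.ones(m - 1), 1)
         + np.diag(-r * np.ones(m - 1), -1))
    for _ in range(steps):
        # explicit half of the scheme on the right-hand side
        rhs = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        rhs[-1] += r * u_wall           # implicit contribution of the moving wall
        u[1:-1] = np.linalg.solve(A, rhs)
    return u
```

Being unconditionally stable, the scheme tolerates a diffusion number far above the explicit limit of 1/2 (here r = 10) and marches straight to the linear steady-state profile u(y) = u_wall · y. A production code would use the Thomas algorithm instead of a dense solve.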

Keywords: incompressible Couette flow, numerical method, partial differential equation, Crank-Nicolson implicit method

Procedia PDF Downloads 503
19439 Monte Carlo and Biophysics Analysis in a Criminal Trial

Authors: Luca Indovina, Carmela Coppola, Carlo Altucci, Riccardo Barberi, Rocco Romano

Abstract:

In this paper, a real court case, held in Italy at the Court of Nola, is considered, in which a correct physical description, conducted with both Monte Carlo and biophysical analyses, would have been sufficient to arrive at conclusions confirmed by documentary evidence. This is an example of how forensic physics can be useful in confirming documentary evidence in order to reach hardly questionable conclusions. This was a libel trial in which the defendant, Mr. DS (Defendant for Slander), had falsely accused one of his neighbors, Mr. OP (Offended Person), of having caused him damages. The damages would have been caused by an external plaster piece that would have detached from the neighbor’s property and hit Mr. DS while he was in his garden, more than a meter away from the facade of the building from which the plaster piece would have detached. In the trial, Mr. DS claimed to have suffered a scratch on his forehead, but he never showed the plaster that had hit him, nor was he able to tell from where the plaster had arrived. Furthermore, Mr. DS presented a medical certificate with a diagnosis of contusion of the cerebral cortex. On the contrary, the images of Mr. OP’s security cameras show no movement in the garden of Mr. DS in a long interval of time (about 2 hours) around the time of the alleged accident, nor any people entering or leaving the house of Mr. DS in the same interval. The biophysical analysis shows that both the diagnosis in the medical certificate and the wound declared by the defendant, already in conflict with each other, are not compatible with the fall of external plaster pieces too small to be found. The wind was at level 1 of the Beaufort scale, that is, unable to raise even dust (level 4 of the Beaufort scale).
Therefore, the motion of the plaster pieces can be described as projectile motion, whereas collisions with the building cornice can be treated using Newton’s law of restitution. Numerous Monte Carlo simulations show that the pieces of plaster could not have reached even the garden of Mr. DS, let alone a distance of over 1.30 meters. The results agree with the documentary evidence (the images of Mr. OP’s security cameras) that Mr. DS could not have been hit by plaster pieces coming from Mr. OP’s property.
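
The projectile part of the argument can be reproduced with a minimal Monte Carlo sketch; the facade heights (4-6 m) and initial fragment speeds (up to 0.5 m/s, consistent with a level-1 wind) are illustrative assumptions, and the cornice rebounds via the restitution coefficient are omitted here:

```python
import math
import random

def landing_distance(h, v0, angle_deg, g=9.81):
    """Horizontal range of a fragment leaving the facade at height h (m)
    with speed v0 (m/s), directed angle_deg below the horizontal."""
    a = math.radians(angle_deg)
    vx = v0 * math.cos(a)
    vy = -v0 * math.sin(a)  # downward component
    # fall time: positive root of h + vy*t - g*t^2/2 = 0
    t = (vy + math.sqrt(vy * vy + 2.0 * g * h)) / g
    return vx * t

def mc_max_distance(n=10000, seed=0):
    """Largest landing distance over n random detachment scenarios."""
    rng = random.Random(seed)
    return max(landing_distance(h=rng.uniform(4.0, 6.0),
                                v0=rng.uniform(0.0, 0.5),
                                angle_deg=rng.uniform(0.0, 90.0))
               for _ in range(n))
```

Under these assumptions even the most favorable draw lands well short of the 1.30 m distance claimed in the trial, which is the qualitative conclusion of the paper's simulations.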

Keywords: biophysics analysis, Monte Carlo simulations, Newton’s law of restitution, projectile motion

Procedia PDF Downloads 108
19438 Ultracapacitor State-of-Energy Monitoring System with On-Line Parameter Identification

Authors: N. Reichbach, A. Kuperman

Abstract:

The paper describes the design of a monitoring system for supercapacitor packs in propulsion systems, allowing determination of the instantaneous energy capacity under power loading. The system contains a real-time recursive-least-squares identification mechanism estimating the values of pack capacitance and equivalent series resistance. These values are required for accurate calculation of the state-of-energy.

Keywords: real-time monitoring, RLS identification algorithm, state-of-energy, super capacitor

Procedia PDF Downloads 508
19437 Modeling of Bed Level Changes in Larak Island

Authors: Saeed Zeinali, Nasser Talebbeydokhti, Mehdi Saeidian, Shahrad Vosough

Abstract:

In this article, bathymetry changes have been studied for Larak Island, located in the south of Iran, as a case study. The advanced 2D model Mike21 has been used for this purpose, with a simple procedure. First, the hydrodynamic (HD) module of Mike21 is used to obtain the required input for the sediment transport (ST) module. The ST module models the area for tidal currents only. Bed level changes result from a series of runs of both the HD and ST modules with a time step of three months. The final bathymetry of each time step is used as the initial bathymetry for the next time step. This consecutive procedure is continued until the bathymetry for the year 2020 is obtained.

Keywords: bed level changes, Larak Island, hydrodynamic, sediment transport

Procedia PDF Downloads 240
19436 A Ground Observation Based Climatology of Winter Fog: Study over the Indo-Gangetic Plains, India

Authors: Sanjay Kumar Srivastava, Anu Rani Sharma, Kamna Sachdeva

Abstract:

Every year, fog formation over the Indo-Gangetic Plains (IGPs) of the Indian region during the winter months of December and January creates numerous hazards, inconvenience, and economic loss for the inhabitants of this densely populated region of the Indian subcontinent. The aim of the paper is to analyze the spatial and temporal variability of winter fog over the IGPs. Long-term ground observations of visibility and other meteorological parameters (1971-2010) have been analyzed to understand the formation of the fog phenomenon and its relevance during the peak winter months of January and December over the IGP of India. In order to examine the temporal variability, time series and trend analyses were carried out using the Mann-Kendall statistical test. The trend analysis accepts the alternate hypothesis at the 95% confidence level, indicating that a trend exists. Kendall's tau statistic showed a positive correlation between time and fog frequency. Further, the Theil-Sen median slope estimate showed that the magnitude of the trend is positive, and higher in January than in December for the entire IGP, except over the western IGP, where it is high in December. Decade-wise time series analysis revealed a continuous increase in fog days, with a net overall increase of 99% observed over the IGP in the last four decades. Diurnal variability and average daily persistence were computed using descriptive statistical techniques, and a geo-statistical analysis was carried out to understand the spatial variability of fog. The geo-statistical analysis revealed that the IGP is a highly fog-prone zone, with fog occurring on more than 66% of days during the study period. The diurnal variability indicates that the peak occurrence of fog is between 06:00 and 10:00 local time, and the average daily fog persistence extends to 5 to 7 hours during the peak winter season.
The results offer a new perspective for taking proactive measures to reduce the irreparable damage that could be caused by the changing trends of fog.

Keywords: fog, climatology, Mann-Kendall test, trend analysis, spatial variability, temporal variability, visibility

Procedia PDF Downloads 216
19435 Comparison of Back-Projection with Non-Uniform Fast Fourier Transform for Real-Time Photoacoustic Tomography

Authors: Moung Young Lee, Chul Gyu Song

Abstract:

Photoacoustic imaging is an imaging technology that combines optical imaging and ultrasound, providing high contrast and high resolution due to the optical and ultrasound components, respectively. We developed a real-time photoacoustic tomography (PAT) system using a linear ultrasound transducer and a data acquisition (DAQ) board. There are two types of algorithms for reconstructing the photoacoustic signal: the back-projection algorithm and the FFT algorithm; in particular, we used the non-uniform FFT algorithm. To evaluate the performance of our system and algorithms, we imaged two wires standing at intervals of 2.89 mm and 0.87 mm and compared the images reconstructed by the two algorithms. Finally, we imaged two crossed hairs and again compared the algorithms.
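
The back-projection alternative mentioned above is, in its simplest form, delay-and-sum over a pixel grid; the array geometry, sound speed, and sampling rate below are hypothetical, not the parameters of the described system:

```python
import numpy as np

def backproject(signals, sensor_x, grid_x, grid_z, c, fs):
    """Delay-and-sum back-projection for a linear array at z = 0.

    signals        : (n_sensors, n_samples) photoacoustic records
    sensor_x       : sensor positions along the array (m)
    grid_x, grid_z : image grid coordinates (m)
    c              : speed of sound (m/s); fs : sampling rate (Hz)
    """
    img = np.zeros((len(grid_z), len(grid_x)))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            for s, xs in enumerate(sensor_x):
                r = np.hypot(x - xs, z)          # pixel-to-sensor distance
                k = int(round(r / c * fs))       # time-of-flight in samples
                if k < signals.shape[1]:
                    img[iz, ix] += signals[s, k]
    return img
```

The non-uniform FFT approach replaces this per-pixel summation with a frequency-domain resampling, which is what makes real-time reconstruction feasible.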

Keywords: back-projection, image comparison, non-uniform FFT, photoacoustic tomography

Procedia PDF Downloads 411
19434 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis

Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen

Abstract:

Hepatitis is one of the most common and dangerous diseases that affect humankind, exposing millions of people to serious health risks every year. The diagnosis of hepatitis has always been a challenge for physicians. This paper presents an effective method for the diagnosis of hepatitis based on interval Type-II fuzzy logic. The proposed system includes three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an “indirect approach” is used for fuzzy system modeling, implementing the exponential compactness and separation index to determine the number of rules in the fuzzy clustering approach. On this basis, we first proposed a Type-I fuzzy system that had an accuracy of approximately 90.9%. In the proposed system, the process of diagnosis faces vagueness and uncertainty in the final decision; thus, the imprecise knowledge was managed by using interval Type-II fuzzy logic. The results obtained show that interval Type-II fuzzy logic has the ability to diagnose hepatitis with an average accuracy of 93.94%, the highest classification accuracy reached thus far. This rate of accuracy demonstrates that the Type-II fuzzy system performs better than Type-I and indicates a higher capability of the Type-II fuzzy system for modeling uncertainty.

Keywords: hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection

Procedia PDF Downloads 280
19433 Effect of Microstructure on Transition Temperature of Austempered Ductile Iron (ADI)

Authors: A. Ozel

Abstract:

The ductile-to-brittle transition temperature is an important criterion in selecting materials for some applications, especially under low-temperature conditions. For that reason, this study investigates the transition temperature of as-cast and austempered unalloyed ductile iron over the temperature interval from -60 to +100 degrees C. The microstructures of the samples were examined by light microscopy. The impact energy values obtained from the experiments were found to depend on the austempering time and temperature.

Keywords: Austempered Ductile Iron (ADI), Charpy test, microstructure, transition temperature

Procedia PDF Downloads 480
19432 Autonomous Quantum Competitive Learning

Authors: Mohammed A. Zidan, Alaa Sagheer, Nasser Metwally

Abstract:

Real-time learning is an important goal that much artificial intelligence research tries to achieve. Many problems and applications require low-cost learning, such as teaching a robot to classify and recognize patterns in real time, or real-time recall. In this contribution, we propose a model of quantum competitive learning based on a series of quantum gates and an additional operator. The proposed model can recognize incomplete patterns, and the probability of recognizing the desired pattern can be increased at the expense of the undesired ones. Moreover, the undesired patterns can be utilized as new patterns for the system. The proposed model compares favorably with classical approaches and is more powerful than current quantum competitive learning approaches.
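The quantum gate construction is not detailed in the abstract; for contrast with the proposed quantum model, the classical winner-take-all competitive learning update it builds upon can be sketched as follows (the learning rate and Euclidean distance are conventional choices, not taken from the paper):

```python
import numpy as np

def winner_take_all_step(weights, x, lr=0.1):
    """One classical winner-take-all competitive learning update.

    The unit whose weight vector is closest to the input x wins,
    and only the winner's weights move toward x.
    """
    d = np.linalg.norm(weights - x, axis=1)  # distance of each unit to x
    win = int(np.argmin(d))                  # index of the winning unit
    weights[win] += lr * (x - weights[win])  # move only the winner
    return win, weights
```

Repeating this step over a stream of inputs clusters the weight vectors around pattern prototypes, which is the behavior the quantum model aims to reproduce with amplitudes instead of distances.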

Keywords: competitive learning, quantum gates, winner-take-all

Procedia PDF Downloads 442
19431 Solar Radiation Time Series Prediction

Authors: Cameron Hamilton, Walter Potter, Gerrit Hoogenboom, Ronald McClendon, Will Hobbs

Abstract:

A model was constructed to predict the amount of solar radiation that will reach the surface of the earth at a given location one hour into the future. The project was supported by the Southern Company to determine at which specific times during a given day of the year solar panels can be relied upon to produce energy in sufficient quantities. Owing to their ability as universal function approximators, artificial neural networks were used to estimate the nonlinear pattern of solar radiation, with measurements of weather conditions collected at the Griffin, Georgia weather station as inputs. A number of network configurations and training strategies were evaluated; a multilayer perceptron with a variety of hidden nodes trained with the resilient propagation algorithm consistently yielded the most accurate predictions. In addition, a modeled DNI field and adjacent weather station data were used to bolster prediction accuracy. In later trials, the solar radiation series was preprocessed with a discrete wavelet transform to remove noise from the measurements. The current model predicts solar radiation with a mean square error of 0.0042, and ongoing efforts aim to further improve its accuracy.
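The resilient propagation (Rprop) algorithm mentioned above adapts each weight's step size from the sign of its gradient alone, ignoring the gradient magnitude. A minimal Rprop- sketch on a generic gradient function; the hyperparameters are the commonly cited defaults, not necessarily those used in the study:

```python
def rprop_minimize(grad, w, n_iter=100, d0=0.1, eta_plus=1.2, eta_minus=0.5,
                   d_max=50.0, d_min=1e-6):
    """Minimal Rprop- sketch: per-weight step sizes adapt to sign changes only.

    grad: function mapping a weight list to its gradient list
    w: initial weights (modified in place and returned)
    """
    deltas = [d0] * len(w)     # per-weight step sizes
    prev_g = [0.0] * len(w)    # previous gradient, for sign comparison
    for _ in range(n_iter):
        g = grad(w)
        for i in range(len(w)):
            if prev_g[i] * g[i] > 0:        # same sign: grow the step
                deltas[i] = min(deltas[i] * eta_plus, d_max)
            elif prev_g[i] * g[i] < 0:      # sign flip: shrink the step
                deltas[i] = max(deltas[i] * eta_minus, d_min)
            if g[i] > 0:                    # step against the gradient sign
                w[i] -= deltas[i]
            elif g[i] < 0:
                w[i] += deltas[i]
        prev_g = g[:]
    return w
```

In the actual study, this update rule would drive the weights of a multilayer perceptron rather than the toy objective shown here; Rprop's insensitivity to gradient scale is what makes it attractive for such networks.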

Keywords: artificial neural networks, resilient propagation, solar radiation, time series forecasting

Procedia PDF Downloads 357
19430 The Relationships between Carbon Dioxide (CO2) Emissions, Energy Consumption, and GDP for Turkey: Time Series Analysis, 1980-2010

Authors: Jinhoa Lee

Abstract:

The relationships between environmental quality, energy use and economic output have attracted growing attention over the past decades among researchers and policy makers. Focusing on the empirical aspects of the role of CO2 emissions and energy use in affecting economic output, this paper is an effort to fill the gap with a comprehensive country-level case study using modern econometric techniques. To that end, this country-specific study examines the short-run and long-run relationships among energy consumption (using disaggregated energy sources: crude oil, coal, natural gas, electricity), carbon dioxide (CO2) emissions and gross domestic product (GDP) for Turkey, using time series analysis for the years 1980-2010. To investigate the relationships between the variables, the paper employs the Phillips-Perron (PP) test for stationarity, the Johansen maximum likelihood method for cointegration, and a Vector Error Correction Model (VECM) for both short- and long-run causality among the research variables. All the variables in this study show very strong significant effects on GDP in the long term. The long-run equilibrium in the VECM suggests negative long-run causalities from consumption of petroleum products and the direct combustion of crude oil, coal and natural gas to GDP. Conversely, positive impacts of CO2 emissions and electricity consumption on GDP are found to be significant in Turkey during the period. In the short run, there is a bidirectional relationship between electricity consumption and natural gas consumption: a positive unidirectional causality runs from electricity consumption to natural gas consumption, while a negative unidirectional causality runs from natural gas consumption to electricity consumption. Moreover, GDP has a negative short-run effect on electricity consumption in Turkey.
Overall, the results support the argument that environmental quality, energy use and economic output are related, but that the associations differ by energy source in the case of Turkey over the period 1980-2010.
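The Johansen maximum likelihood estimator used in the paper is fairly involved; the error-correction idea behind the VECM can be illustrated with the simpler two-step Engle-Granger procedure (a related but distinct method), sketched here for two variables:

```python
import numpy as np

def engle_granger_ecm(y, x):
    """Two-step error-correction sketch (Engle-Granger, not Johansen).

    Step 1: estimate the long-run relation y = a + beta*x by OLS.
    Step 2: regress dy_t on the lagged equilibrium error; a negative
    adjustment coefficient alpha signals correction toward equilibrium.
    """
    X = np.column_stack([np.ones_like(x), x])
    coef = np.linalg.lstsq(X, y, rcond=None)[0]        # [intercept, beta]
    ect = y - X @ coef                                 # equilibrium error
    dy = np.diff(y)
    Z = np.column_stack([np.ones(len(dy)), ect[:-1]])  # lagged error term
    alpha = np.linalg.lstsq(Z, dy, rcond=None)[0][1]   # adjustment speed
    return coef[1], alpha
```

A full VECM would also include lagged differences of all variables and be estimated jointly across the system (as Johansen's method does); this sketch only exposes the error-correction term that drives the long-run causality statements in the abstract.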

Keywords: CO2 emissions, energy consumption, GDP, Turkey, time series analysis

Procedia PDF Downloads 490
19429 Exceptional Cost and Time Optimization with Successful Leak Repair and Restoration of Oil Production: West Kuwait Case Study

Authors: Nasser Al-Azmi, Al-Sabea Salem, Abu-Eida Abdullah, Milan Patra, Mohamed Elyas, Daniel Freile, Larisa Tagarieva

Abstract:

Well intervention with production logging tools (PLT) was carried out to detect the sources of water and to check well integrity for two West Kuwait oil wells that had started to produce 100% water. For the first well, PLT was run to check the perforations: no production was observed from the bottom two perforation intervals, while an intake of water was observed at the topmost perforation. A decision was then taken to extend the PLT survey from the tag depth to the Y-tool. For the second well, the aim was to detect the source of water and to determine whether there was a leak in the 7’’ liner in front of the upper zones; data could not be recorded under flowing conditions because of casing deformation at almost 8300 ft. For the first well, interpretation of the PLT and well-integrity data showed a hole in the 9 5/8'' casing from 8468 ft to 8494 ft producing the majority of the water, 2478 bbl/d, while the upper perforation from 10812 ft to 10854 ft was taking 534 stb/d. For the second well, a hole in the 7’’ liner from 8303 ft MD to 8324 ft MD was producing 8334.0 stb/d of water, with an intake zone from 10322.9-10380.8 ft MD taking the whole fluid. To restore oil production, a workover (W/O) rig was mobilized to prevent dump flooding, and during the workover the leaking interval was confirmed for both wells. The leakage was cement squeezed and tested at 900-psi positive pressure and 500-psi drawdown pressure; the cement squeeze job was successful. After the workover, the wells were kept producing for cleanup, and the water cut eventually dropped to 0%. Regular PLT and well-integrity logs are required to study well performance and integrity issues; proper cement behind casing is essential to well longevity and integrity; and the presence of a Y-tool is valuable for monitoring well and ESP parameters and facilitating well-intervention tasks. Cost and time optimization in oil and gas operations, especially during rig operations, is crucial.
PLT data quality and the accuracy of the interpretations were key to identifying the leakage interval precisely, which in turn saved considerable time and reduced the repair cost by almost 35 to 45%. The added value lay mainly in the cost reduction and in effective, rapid decision making based on the economic environment.

Keywords: water leak, water shut-off, cement

Procedia PDF Downloads 95
19428 Enhancing Learners' Metacognitive, Cultural and Linguistic Proficiency through Egyptian Series

Authors: Hanan Eltayeb, Reem Al Refaie

Abstract:

To connect with and relate to shows spoken in a foreign language, advanced learners must understand not only linguistic inferences but also the cultural, metacognitive, and pragmatic connotations in colloquial Egyptian TV series. These connotations are needed to understand the different facets of the dramas, and they are themselves continually formed and reinforced through watching these shows. The inferences have become a staple of Egyptian colloquial culture over the years, making their way into day-to-day conversation as Egyptians use them to speak, relate, joke, and connect with one another, even without previous acquaintance. Advanced learners need to understand these inferences not only to watch these shows but also to converse with Egyptians at a level that goes beyond the formal or standard register. When faced with some of the more recent shows on Egyptian screens, learners struggled to understand the pragmatic, cultural, and religious background of the target language and consequently could not interact effectively with native speakers in real-life situations. This study aims to enhance the linguistic and cultural proficiency of learners through the study of two genres of colloquial Egyptian TV series. The study samples were drawn from two recent comedy and social-drama series ('The Seventh Neighbor' سابع جار, and 'Nelly and Sherihan' نيللي و شريهان). When learners watch such series, they typically have trouble understanding inferences that concern the social, religious, and political events addressed in the series. Using discourse analysis of the semantic, pragmatic, cultural, and linguistic characteristics of the target language, some major recurring patterns were identified in both series.
The paper concludes that teaching these series can help learners acquire and use effectively a wide range of lingual and para-lingual phrases, idioms, and proverbs. The strategies adopted in the study can be applied to other types of media, such as movies, TV shows, and even cartoons, to enhance student proficiency.

Keywords: Egyptian series, culture, linguistic competence, pragmatics, semantics, social

Procedia PDF Downloads 115
19427 Representation of Agamben's Concept of 'Homo Sacer': Interpretative Analysis in Turkish TV Series Based on Turkey's 1980 Military Coup

Authors: Oyku Yenen

Abstract:

The notion of biopolitics, as studied by intellectuals such as Foucault, Agamben, and Negri, is an important guide to comprehending the current understanding of politics. While Foucault evaluates biopolitics as a politics of survival, the Italian philosopher Giorgio Agamben identifies it with death. Agamben claims that we can all be considered homo sacer: abandoned by the law, left in the field of exception, and killable without punishment. He defines as 'homo sacer' the person who is tried by the public for committing a crime but may not be sacrificed, and whose killing is not considered a crime. This study analyzes how the concept of homo sacer is made visible in TV series such as Çemberimde Gül Oya (Cagan Irmak, 2005-2005), Hatırla Sevgili (Ummu Burhan, 2006-2008), and Bu Kalp Seni Unutur Mu? (Aydin Bulut, 2009-2010), all of which portray the period of Turkey's 1980 military coup, within the framework of Agamben's thought on biopolitics. When the main plots of these series, which constitute the universe of this study, are scrutinized closely, they lay out the understanding of politics that has existed throughout history and prevails today. Although there are many TV series on the coup of 1980, these three are the only major productions focused specifically on the event itself. The analysis reveals that the concepts of homo sacer, bare life, exception, and camp are embodied in different ways in the three series. In all of them, which treat similar subjects from differing perspectives, the dominant understanding of politics is clearly conveyed to the audience: the reigning power always decides on the exceptions, on those who will live, those who will die, and those who will be ignored by the law.
Characters such as Mehmet, Sinan, Yıldız, Deniz, and Defne, whom we encounter across these series, are tried as thought criminals and subjected to various forms of torture while isolated in an area where they are virtually deprived of law. Their citizenship rights are revoked, and all of them are left alone with their bare lives (zoe).

Keywords: bare life, biopolitics, homo sacer, sovereign power, state of exception

Procedia PDF Downloads 109
19426 A Quantitative Study of the Evolution of Open Source Software Communities

Authors: M. R. Martinez-Torres, S. L. Toral, M. Olmedilla

Abstract:

Typically, virtual communities exhibit the well-known phenomenon of participation inequality: only a small percentage of users is responsible for the majority of contributions. However, the sustainability of a community requires that the group of active users be continuously nurtured with new users who gain expertise through participation. This paper analyzes the time evolution of Open Source Software (OSS) communities, considering users who join and abandon the community over time, together with several topological properties of the community when modeled as a social network. More specifically, the paper analyzes the role of users who rejoin the community and their influence on the global characteristics of the network.
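Participation inequality of the kind described above is often quantified with a concentration index such as the Gini coefficient over per-user contribution counts. The abstract does not state which metrics the authors used, so this is an illustrative choice:

```python
def gini(counts):
    """Gini coefficient of per-user contribution counts.

    0 means every user contributes equally; values approaching 1 mean
    contributions are concentrated in a few users.
    """
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Closed form over the sorted values: G = 2*sum(i*x_i)/(n*total) - (n+1)/n
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * cum) / (n * total) - (n + 1.0) / n
```

Tracking this index over successive snapshots of a community gives a simple time series of how concentrated participation is as users join, leave, and rejoin.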

Keywords: open source communities, social network analysis, time series, virtual communities

Procedia PDF Downloads 502
19425 The Hurricane 'Bump': Measuring the Effects of Hurricanes on Wages in Southern Louisiana

Authors: Jasmine Latiolais

Abstract:

Much of the disaster-related literature finds a positive relationship between the impact of a natural disaster and the growth of wages, and panel datasets are often used to explore these effects. However, a natural disaster does not impact a single variable in the economy; it affects all facets of the economy simultaneously upon impact. It is difficult to control for every factor influenced by a disaster, which can lead to omitted variable bias in studies employing panel datasets. To address this issue, an interrupted time series analysis is used to test the short-run relationship between the impact of Hurricanes Katrina and Rita and parish wage levels in Southern Louisiana, inherently controlling for economic conditions. This study provides evidence that natural disasters do increase wages in the very short term (one quarter following the impact of the hurricane) but that these results do not persist in the longer term and are not robust. In addition, the significance of the coefficients varies by parish. Overall, this study finds that previous literature on this topic may not be robust when considered through a time-series lens.
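The abstract does not specify the exact model; a minimal interrupted time series design regresses the outcome on a time trend plus a post-interruption level-shift dummy, whose coefficient estimates the immediate effect. A sketch under that simple specification (real studies add slope changes, seasonality, and autocorrelation corrections):

```python
import numpy as np

def its_level_shift(y, t0):
    """Interrupted time series fit: y_t = b0 + b1*t + b2*D_t + e.

    D_t = 1 for t >= t0 (post-interruption); b2 estimates the
    immediate level change in the outcome at the interruption.
    """
    t = np.arange(len(y), dtype=float)
    D = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, D])
    coef = np.linalg.lstsq(X, y, rcond=None)[0]
    return coef  # [intercept, pre-existing trend, level shift]
```

Applied to quarterly parish wages with t0 at the hurricane quarter, the level-shift coefficient plays the role of the short-term wage "bump" the study tests for.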

Keywords: economic recovery, local economies, local wage growth, natural disasters

Procedia PDF Downloads 113
19424 Secure Authentication Scheme Based on Numerical Series Cryptography for Internet of Things

Authors: Maha Aladdin, Khaled Nagaty, Abeer Hamdy

Abstract:

The rapid advancement of cellular and wireless networks has laid a solid basis for the Internet of Things (IoT). IoT has evolved into a unique standard that allows diverse physical devices to collaborate with one another. A service provider offers a variety of services that may be accessed via smart apps anywhere, at any time, and from any location over the Internet. Because of the public nature of mobile communication and the Internet, these services are highly vulnerable to several malicious attacks, such as unauthorized disclosure by hostile attackers. As a result, the best option for overcoming these vulnerabilities is a strong authentication method. In this paper, a lightweight authentication scheme based on numerical series cryptography is proposed for IoT environments. It allows mutual authentication between IoT devices. A parametric study and formal proofs are used to show that the proposed approach is resistant to a variety of security threats.

Keywords: internet of things, authentication, cryptography, security protocol

Procedia PDF Downloads 85
19423 Dynamic Voltage Restorer Control Strategies: An Overview

Authors: Arvind Dhingra, Ashwani Kumar Sharma

Abstract:

Power quality is an important parameter for today’s consumers, and various custom power devices are in use to maintain it. The Dynamic Voltage Restorer (DVR) is one such custom power device: a static VAR device used for series compensation. It is a power electronic device that injects a voltage in series with, and in synchronism with, the supply to compensate for voltage sags. Inductive loads are a major source of power quality distortion, and the induction furnace is one such typical load. A typical induction furnace is used for melting scrap or iron; at the start of the melting process, the power quality is distorted to a large extent, especially through the injection of harmonics. The DVR is one approach to mitigating these harmonics. This paper is an attempt to overview the various control strategies followed for controlling power quality using a DVR, including an overview of harmonic control with the DVR.
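The voltage a DVR must inject follows from simple phasor arithmetic: under pre-sag compensation, the series injection is the difference between the desired load voltage and the sagged supply voltage. A toy sketch; the per-unit values and phase jump below are illustrative, not from the paper:

```python
import cmath

def dvr_injection(v_ref, v_sag):
    """Pre-sag compensation: the DVR injects the phasor difference so that
    the load sees v_sag + v_inj == v_ref (all quantities are phasors)."""
    return v_ref - v_sag

# Example: 1.0 pu reference; a 30% sag with a small phase jump
v_ref = 1.0 + 0.0j
v_sag = cmath.rect(0.7, -0.1)
v_inj = dvr_injection(v_ref, v_sag)
```

The magnitude of `v_inj` sizes the injection transformer and energy storage; other strategies (in-phase, minimal-energy) trade injection magnitude against phase restoration, which is where the control strategies surveyed in the paper differ.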

Keywords: DVR, power quality, harmonics, harmonic mitigation

Procedia PDF Downloads 348
19422 Effect of Serum Electrolytes on a QTc Interval and Mortality in Patients admitted to Coronary Care Unit

Authors: Thoetchai Peeraphatdit, Peter A. Brady, Suraj Kapa, Samuel J. Asirvatham, Niyada Naksuk

Abstract:

Background: Serum electrolyte abnormalities are a common cause of acquired long QT syndrome, especially in the coronary care unit (CCU) setting. Optimal electrolyte ranges among CCU patients have not been sufficiently investigated. Methods: We identified 8,498 consecutive patients admitted to the CCU at Mayo Clinic, Rochester, USA, from 2004 through 2013. The associations between first-measured serum electrolytes and baseline corrected QT intervals (QTc), as well as in-hospital mortality, were tested using multivariate linear regression and logistic regression, respectively. Serum potassium 4.0- < 4.5 mEq/L, ionized calcium (iCa) 4.6-4.8 mg/dL, and magnesium 2.0- < 2.2 mg/dL were used as the reference levels. Results: There was a modest level-dependent relationship between hypokalemia ( < 4.0 mEq/L), hypocalcemia ( < 4.4 mg/dL), and a prolonged QTc interval; serum magnesium did not affect the QTc interval. The associations between serum electrolytes and in-hospital mortality included a U-shaped relationship for serum potassium (adjusted odds ratio (OR) 1.53 and OR 1.91 for serum potassium 4.5- < 5.0 and ≥ 5.0 mEq/L, respectively) and an inverted J-shaped relationship for iCa (adjusted OR 2.79 and OR 2.03 for calcium < 4.4 and 4.4- < 4.6 mg/dL, respectively). For serum magnesium, mortality was greater only among patients with levels ≥ 2.4 mg/dL (adjusted OR 1.40), compared to the reference level. Findings were similar in sensitivity analyses examining the associations between mean serum electrolytes and mean QTc intervals, as well as in-hospital mortality. Conclusions: Serum potassium 4.0- < 4.5 mEq/L, iCa ≥ 4.6 mg/dL, and magnesium < 2.4 mg/dL had a neutral effect on QTc intervals and were associated with the lowest in-hospital mortality among CCU patients.
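The abstract does not say which heart-rate correction formula was applied to obtain QTc; the most common one, Bazett's formula, divides the measured QT interval by the square root of the RR interval expressed in seconds. A one-line sketch:

```python
def qtc_bazett(qt_ms, rr_ms):
    """Bazett-corrected QT interval: QTc = QT / sqrt(RR), with RR in seconds.

    qt_ms, rr_ms: measured QT and RR intervals in milliseconds.
    Returns QTc in milliseconds.
    """
    return qt_ms / ((rr_ms / 1000.0) ** 0.5)
```

At a heart rate of 60 bpm (RR = 1000 ms) the correction is neutral; at faster rates (shorter RR) it scales the QT upward, which is why QTc rather than raw QT is compared across patients.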

Keywords: calcium, electrocardiography, long-QT syndrome, magnesium, mortality, potassium

Procedia PDF Downloads 368