Search results for: Excavation method.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8112

5652 A Communication Signal Recognition Algorithm Based on Holder Coefficient Characteristics

Authors: Hui Zhang, Ye Tian, Fang Ye, Ziming Guo

Abstract:

Communication signal modulation recognition is one of the key technologies in modern information warfare. Automatic modulation recognition methods fall into two major categories: maximum likelihood hypothesis testing based on decision theory, and statistical pattern recognition based on feature extraction. The statistical pattern recognition approach, which comprises feature extraction and classifier design, is currently the most widely used. As the electromagnetic environment of communications becomes increasingly complex, effectively extracting signal features at low signal-to-noise ratio (SNR) has become a topic of active research worldwide. To address this problem, this paper proposes a feature extraction algorithm for communication signals based on improved Holder cloud features, and an extreme learning machine (ELM) is used to classify the extracted features, meeting the real-time requirements of modern warfare. The algorithm extracts the digital features of the improved cloud model without deterministic information in a low-SNR environment; the improved cloud model yields more stable Holder cloud features and thus improved performance. The algorithm overcomes the difficulty that a simple feature extraction algorithm based on the Holder coefficient alone has in recognizing signals at low SNR, and it achieves better recognition accuracy. Simulation results show that the proposed approach still classifies well at low SNR; even at an SNR of -15 dB, the recognition accuracy reaches 76%.

Keywords: Communication signal, feature extraction, Holder coefficient, improved cloud model.
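
A minimal numerical sketch of the kind of Hölder-coefficient feature the abstract builds on: the coefficient of a signal's normalized amplitude spectrum against rectangular and triangular reference sequences. The reference sequences, the exponent p, and the toy 2FSK-style test signal are illustrative assumptions; the improved cloud-model step and the ELM classifier described in the paper are not reproduced here.

```python
import numpy as np

def holder_coefficient(f1, f2, p=2.0):
    """Hölder coefficient of two non-negative sequences, with 1/p + 1/q = 1."""
    q = p / (p - 1.0)
    num = np.sum(f1 * f2)
    den = (np.sum(f1 ** p) ** (1.0 / p)) * (np.sum(f2 ** q) ** (1.0 / q))
    return num / den

def holder_features(signal):
    """Two Hölder-coefficient features of the signal's amplitude spectrum,
    computed against rectangular and triangular reference sequences (assumed choices)."""
    spec = np.abs(np.fft.rfft(signal))
    spec /= spec.max() + 1e-12                      # normalized amplitude spectrum
    n = spec.size
    rect = np.ones(n)                               # rectangular reference
    tri = 1.0 - np.abs(np.linspace(-1.0, 1.0, n))   # triangular reference
    return holder_coefficient(spec, rect), holder_coefficient(spec, tri)

# Toy example: a noisy 2FSK-like burst (illustrative only, not the paper's signal set)
fs = 8000.0
t = np.arange(0, 0.1, 1.0 / fs)
tone = np.where(t < 0.05, np.sin(2 * np.pi * 1000 * t), np.sin(2 * np.pi * 2000 * t))
noisy = tone + 0.5 * np.random.randn(t.size)
print(holder_features(noisy))
```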

5651 Design of Roller Compacting Concrete Pavement

Authors: O. Zarrin, M. Ramezan Shirazi

Abstract:

The quality of concrete is usually defined by its compressive strength, but in a pavement the flexural strength is the more important property and governs the mix design instead of compressive strength. The aggregates selected for pavements are therefore chosen with higher flexural strength in mind. Roller Compacting Concrete Pavement (RCCP) is not a new construction method. A further characteristic of this method is the absence of bleeding and the reduced shrinkage that result from the lower water content; a roller is needed for placing and compacting. The surface of RCCP is not smooth, so its most common use is in industrial zones with slow traffic that require a durable and tough pavement. A smoother surface can be achieved with an asphalt paver. RCCP also decreases the finishing cost, because no reinforcing bars or formwork are needed and less labor is required to place the concrete. In this paper, different aspects of RCCP, such as mix design, flexural strength and compressive strength, are investigated in detail.

Keywords: Flexural Strength, Compressive Strength, Pavement, Asphalt.

5650 The Effect of Choke on the Efficiency of Coaxial Antenna for Percutaneous Microwave Coagulation Therapy for Hepatic Tumor

Authors: Surita Maini

Abstract:

The many perceived advantages of microwave ablation have driven researchers to develop innovative antennas to effectively treat deep-seated, non-resectable hepatic tumors. In this paper, a coaxial antenna with a miniaturized sleeve choke is discussed for microwave interstitial ablation therapy, in order to reduce backward heating effects irrespective of the insertion depth into the tissue. A two-dimensional Finite Element Method (FEM) is used to simulate the miniaturized sleeve-choke antenna and evaluate the results. The paper emphasizes the factors that can affect simulation accuracy, which include mesh resolution, surface heating and reflection coefficient. The effectiveness of the quarter-wavelength choke is assessed by comparing it with an unchoked antenna of the same dimensions.

Keywords: Microwave ablation, tumor, Finite Element Method, Coaxial slot antenna, Coaxial dipole antenna.
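
As a small numerical aside to the quarter-wavelength choke discussed above, the sketch below estimates a choke length from the operating frequency and the relative permittivity of the insulation. The 2.45 GHz frequency and PTFE-like permittivity are generic illustrative assumptions, not the antenna design from the paper.

```python
import math

def quarter_wave_choke_length(freq_hz, eps_r):
    """Quarter of the wavelength in the dielectric surrounding the outer conductor."""
    c = 299_792_458.0                                  # speed of light, m/s
    wavelength = c / (freq_hz * math.sqrt(eps_r))
    return wavelength / 4.0

# Illustrative numbers only: 2.45 GHz ablation frequency, PTFE-like insulation (eps_r ~ 2.1)
length_mm = quarter_wave_choke_length(2.45e9, 2.1) * 1e3
print(f"choke length ~ {length_mm:.1f} mm")
```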

5649 Comparison of Different Solvents and Extraction Methods for Isolation of Phenolic Compounds from Horseradish Roots (Armoracia rusticana)

Authors: Lolita Tomsone, Zanda Kruma, Ruta Galoburda

Abstract:

Horseradish (Armoracia rusticana) is a perennial herb belonging to the Brassicaceae family and contains biologically active substances. The aim of the current research was to determine the best method for extracting phenolic compounds with high antiradical activity from horseradish roots. Three genotypes (No. 105, No. 106 and the variety ‘Turku’) of horseradish roots were extracted with eight different solvents: n-hexane, ethyl acetate, diethyl ether, 2-propanol, acetone, ethanol (95%), ethanol/water/acetic acid (80/20/1 v/v/v) and ethanol/water (80/20 v/v), using two extraction methods (conventional and Soxhlet). Ethanol and ethanol/water solutions proved to be the best solvents. Although the total phenolic content (TPC) of the Soxhlet extracts was higher, their DPPH˙ radical scavenging activity did not increase; Soxhlet extraction therefore appears to recover additional compounds that are not effective antioxidants.

Keywords: DPPH˙, extraction, solvent, Soxhlet, TPC

5648 Control and Simulation of FOPDT Food Processes with Constraints using PI Controller

Authors: M.Y. Pua, M.C. Tan, L.W. Tan, N. Ab.Aziz, F.S. Taip

Abstract:

The most common type of controller used in industry is the PI(D) controller, which has been in service since 1945 and is still widely used because of its efficiency and simplicity. In most cases, the PI(D) controller is tuned without taking actuator saturation into consideration. In real processes, the most common actuator, a valve, acts as a constraint and restricts the controller output. When the controller is not designed for saturation, the process may wind up, resulting in large oscillations or even instability. Usually an anti-windup compensator is added to the feedback control loop to reduce the deteriorating effect of integral windup. This research aims specifically at controlling processes with constraints. The proposed method was applied to two different food processes, blending and spray drying. Simulations were carried out in MATLAB and the performance of the proposed method was compared with that of conventional methods. The proposed technique was able to control the processes and avoid saturation, so that no anti-windup compensator is needed.

Keywords: constraints, food process control, first order plus dead time process, PI
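
A hedged sketch of the saturation problem the abstract describes: a discrete PI loop driving a first-order-plus-dead-time process through a clamped actuator, with conditional integration as a simple stand-in for anti-windup. The process parameters, gains, and setpoint are illustrative assumptions, not the paper's blending or spray-drying models or its proposed tuning.

```python
import numpy as np

# FOPDT process: gain K, time constant tau (s), dead time theta (s) -- illustrative values
K, tau, theta, dt = 1.0, 50.0, 10.0, 1.0
delay = int(theta / dt)

def simulate(kp, ki, setpoint=0.8, u_max=1.0, t_end=600.0, anti_windup=True):
    y, integral = 0.0, 0.0
    u_buffer = [0.0] * delay                        # transport delay on the actuator signal
    trace = []
    for _ in range(int(t_end / dt)):
        e = setpoint - y
        u_unsat = kp * e + ki * integral
        u = min(max(u_unsat, 0.0), u_max)           # valve constraint: output clamped to [0, u_max]
        if not (anti_windup and u != u_unsat):      # conditional integration: freeze I while saturated
            integral += e * dt
        u_buffer.append(u)
        y += dt / tau * (K * u_buffer.pop(0) - y)   # first-order process update
        trace.append(y)
    return np.array(trace)

with_aw = simulate(kp=2.0, ki=0.05)
without = simulate(kp=2.0, ki=0.05, anti_windup=False)
print("overshoot with conditional integration: %.3f" % (with_aw.max() - 0.8))
print("overshoot without anti-windup:          %.3f" % (without.max() - 0.8))
```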

5647 On-line Speech Enhancement by Time-Frequency Masking under Prior Knowledge of Source Location

Authors: Min Ah Kang, Sangbae Jeong, Minsoo Hahn

Abstract:

This paper presents a source extraction system that extracts only target signals, using constraints on source localization, in on-line systems. The proposed system enhances a target signal while suppressing interfering signals; its performance is superior to that of the compared methods, and the extraction of the target source is comparatively complete. The method is based on a beamforming concept and uses an improved time-frequency (TF) mask-based blind source separation (BSS) algorithm to separate a target signal from multiple noise sources. The target sources are assumed to be in front of the array, and test data were recorded in a reverberant room. The proposed method was evaluated by the PESQ scores of real-recorded sentences and showed a noticeable speech enhancement.

Keywords: Beam forming, Non-stationary noise reduction, Source separation, TF mask.
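
A minimal sketch of the time-frequency masking step itself: take the STFT of the mixture, zero the TF bins dominated by interference, and invert. For brevity the mask here is an oracle mask built from the clean components of a synthetic mixture; the system described above instead estimates the mask from beamforming/BSS cues under a known source direction, which is not reproduced.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 16000
t = np.arange(0, 1.0, 1 / fs)
target = np.sin(2 * np.pi * 440 * t)                  # stand-in "speech" component
noise = 0.8 * np.random.randn(t.size)                 # interference
mix = target + noise

f, frames, M = stft(mix, fs=fs, nperseg=512)
_, _, T = stft(target, fs=fs, nperseg=512)            # oracle components, for illustration only
_, _, N = stft(noise, fs=fs, nperseg=512)

mask = (np.abs(T) > np.abs(N)).astype(float)          # binary TF mask
_, enhanced = istft(M * mask, fs=fs, nperseg=512)

n = min(enhanced.size, target.size)
in_snr = 10 * np.log10(np.sum(target**2) / np.sum(noise**2))
out_snr = 10 * np.log10(np.sum(target[:n]**2) / np.sum((enhanced[:n] - target[:n])**2))
print(f"SNR before masking: {in_snr:.1f} dB, after masking: {out_snr:.1f} dB")
```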

5646 Boosting Method for Automated Feature Space Discovery in Supervised Quantum Machine Learning Models

Authors: Vladimir Rastunkov, Jae-Eun Park, Abhijit Mitra, Brian Quanz, Steve Wood, Christopher Codella, Heather Higgins, Joseph Broz

Abstract:

Quantum Support Vector Machines (QSVMs) have become an important tool in research on and applications of quantum kernel methods. In this work, we propose a boosting approach for building ensembles of QSVM models and assess the performance improvement across multiple datasets. The approach is derived from ensemble-building practices that have worked well in traditional machine learning and should therefore push the limits of quantum model performance even further. We find that in some cases a single QSVM model with tuned hyperparameters is sufficient to model the data, while in others an ensemble of QSVMs, forced to explore the feature space by the proposed method, is beneficial.

Keywords: QSVM, Quantum Support Vector Machines, quantum kernel, boosting, ensemble.
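
A purely classical, hedged analogue of the boosting idea: a hand-rolled AdaBoost-style loop over weak kernel-SVM base learners with scikit-learn. A QSVM ensemble would replace the RBF kernel with a quantum (fidelity) kernel evaluated on a quantum backend; that part, and the paper's datasets, are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
y_tr, y_te = 2 * y_tr - 1, 2 * y_te - 1               # labels in {-1, +1}

# Manual AdaBoost-style loop over weak kernel-SVM learners
w = np.full(len(y_tr), 1.0 / len(y_tr))               # per-sample weights
models, alphas = [], []
for _ in range(15):
    clf = SVC(kernel="rbf", C=0.1).fit(X_tr, y_tr, sample_weight=w)
    pred = clf.predict(X_tr)
    err = np.clip(np.sum(w * (pred != y_tr)), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)             # weight of this learner
    w *= np.exp(-alpha * y_tr * pred)                 # emphasize misclassified samples
    w /= w.sum()
    models.append(clf)
    alphas.append(alpha)

vote = np.sign(sum(a * m.predict(X_te) for a, m in zip(alphas, models)))
print("boosted ensemble accuracy:", np.mean(vote == y_te))
print("single tuned SVM accuracy:", SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr).score(X_te, y_te))
```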

5645 Segmental and Subsegmental Lung Vessel Segmentation in CTA Images

Authors: H. Özkan

Abstract:

In this paper, a novel and fast algorithm for segmental and subsegmental lung vessel segmentation in Computed Tomography Angiography (CTA) images is introduced. This process is particularly important for the detection of pulmonary embolism, lung nodules, and interstitial lung disease. The method is realized in five steps. First, lung segmentation is performed. Second, the images are thresholded and the differences between the images are detected. Third, the left and right lungs are combined with the differences obtained in the second step to produce the Exact Lung Image (ELI). Fourth, the image thresholded for vessels is combined with the ELI. Finally, the segmental and subsegmental lung vessels are identified and segmented from the image obtained in the fourth step. The performance of the method was judged to be very good by radiologists, and its results are medically adequate for surgical planning.

Keywords: Computed tomography angiography (CTA), Computer aided detection (CAD), Lung segmentation, Lung vessel segmentation
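
A hedged sketch of the final masking/thresholding composition in the pipeline above: an intensity threshold for contrast-filled vessels intersected with a lung mask. The lung mask (steps 1-3, the Exact Lung Image) is assumed to be given, and the Hounsfield threshold and toy slice are illustrative assumptions rather than the paper's procedure.

```python
import numpy as np

def vessel_mask(cta_slice, lung_mask, vessel_hu=180.0):
    """Intersect a vessel intensity threshold (step 4) with the lung region (step 5).

    cta_slice : 2-D array of Hounsfield units
    lung_mask : boolean lung mask from the earlier steps (assumed given)
    vessel_hu : illustrative threshold for contrast-enhanced vessels
    """
    return (cta_slice > vessel_hu) & lung_mask

# Toy example: an air-like slice with a bright tubular structure inside the lung region
img = np.full((128, 128), -800.0)
lungs = np.zeros_like(img, dtype=bool)
lungs[20:108, 20:108] = True
img[60:64, 20:108] = 300.0                             # synthetic "vessel"
print(vessel_mask(img, lungs).sum(), "vessel pixels found")
```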

5644 Mathematical Modeling of Human Cardiovascular System: A Lumped Parameter Approach and Simulation

Authors: Ketan Naik, P. H. Bhathawala

Abstract:

The purpose of this work is to develop a mathematical model of the human cardiovascular system using the lumped parameter method. The model is divided into three parts: the systemic circulation, the pulmonary circulation and the heart. The established mathematical model has been simulated in MATLAB. The novelty of this study lies in describing the system on the basis of vessel diameters and in simulating the mathematical equations with active electrical elements. The terminology of the human body and the required physical data, such as vessel radius and wall thickness, needed to calculate circuit parameters such as resistance, inductance and capacitance, are taken from well-known medical books. The developed model is useful for understanding the anatomy of the human cardiovascular system and related syndromes, and it describes vessel pressure and blood flow over time.

Keywords: Cardiovascular system, lumped parameter method, mathematical modeling, simulation.
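
The simplest instance of the lumped-parameter idea is a two-element Windkessel, one compliance and one resistance, driven by a pulsatile inflow; the sketch below integrates it with explicit Euler steps. The paper's MATLAB model is far more detailed (systemic and pulmonary circulations plus the heart, with parameters derived from vessel geometry), and the values here are only order-of-magnitude assumptions.

```python
import numpy as np

# Two-element Windkessel: C * dP/dt = Q_in(t) - P / R
R = 1.0       # peripheral resistance, mmHg*s/mL (illustrative)
C = 1.5       # arterial compliance, mL/mmHg (illustrative)
T = 0.8       # cardiac period, s

def q_in(t):
    """Pulsatile inflow: half-sine ejection during systole, zero in diastole."""
    tc = t % T
    return 300.0 * np.sin(np.pi * tc / 0.3) if tc < 0.3 else 0.0

dt, P = 1e-3, 80.0                        # explicit Euler from an initial 80 mmHg
pressures = []
for i in range(int(10 * T / dt)):         # simulate ten beats
    P += dt / C * (q_in(i * dt) - P / R)
    pressures.append(P)

last_beat = pressures[-int(T / dt):]
print("systolic ~%.0f mmHg, diastolic ~%.0f mmHg" % (max(last_beat), min(last_beat)))
```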

5643 Region-Based Image Fusion with Artificial Neural Network

Authors: Shuo-Li Hsu, Peng-Wei Gau, I-Lin Wu, Jyh-Horng Jeng

Abstract:

Most image fusion algorithms treat the pixels of an image separately and more or less independently; in addition, their parameters must be re-adjusted for different times of day or weather conditions. In this paper, we propose a region-based image fusion method that combines aspects of feature-level and pixel-level fusion to replace purely pixel-wise processing. The basic idea is to segment only the far-infrared image and to add the information of each region of the segmented image to the visual image. Different fusion parameters are then determined for each region. Finally, an artificial neural network is adopted to handle the variation with time and weather, because the relationship between the fusion parameters and the image features is nonlinear; this allows the fusion parameters to be produced automatically for different conditions. The experimental results show that the proposed method indeed has good adaptive capacity, with automatically determined fusion parameters, and the architecture can be used for many applications.

Keywords: Image fusion, Region-based fusion, Segmentation, Neural network, Multi-sensor.

5642 Simple Procedure for Probability Calculation of Tensile Crack Occurring in Rigid Pavement – Case Study

Authors: Aleš Florian, Lenka Ševelová, Jaroslav Žák

Abstract:

The formation of tensile cracks in the concrete slabs of rigid pavement can be (among other things) the initiation point of other, more serious failures, which can ultimately lead to complete degradation of the concrete slab and thus of the whole pavement. Two measures can be used for reliability assessment of this phenomenon: the probability of failure and/or the reliability index. Different methods can be used for their calculation; the simpler ones are moment methods and simulation techniques. Two methods, the FOSM method and the Simple Random Sampling method, are verified and compared. The influence of the information about the probability distributions and statistical parameters of the input variables, as well as of the limit state function, on the calculated reliability index and failure probability is studied at three points on the lower surface of the concrete slabs of an older type of rigid pavement formerly used in the Czech Republic.

Keywords: Failure, pavement, probability, reliability index, simulation, tensile crack.
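
A hedged sketch of the two reliability measures on a generic linear limit state g = R - S with independent normal variables (not the slab stress model studied in the paper): the reliability index from a first-order second-moment calculation, and the failure probability from simple random sampling. The means and standard deviations are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

# Limit state g = R - S (tensile strength minus tensile stress); failure when g < 0.
mu_R, sd_R = 4.5, 0.6          # illustrative strength statistics, MPa
mu_S, sd_S = 3.0, 0.7          # illustrative stress statistics, MPa

# FOSM for a linear limit state with independent normal variables
beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)     # reliability index
pf_fosm = norm.cdf(-beta)                       # corresponding failure probability

# Simple Random Sampling (crude Monte Carlo)
rng = np.random.default_rng(0)
n = 1_000_000
g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
pf_mc = np.mean(g < 0.0)

print(f"beta = {beta:.2f}, Pf(FOSM) = {pf_fosm:.4f}, Pf(SRS) = {pf_mc:.4f}")
```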

5641 Temperature Dependent Interaction Energies among X (=Ru, Rh) Impurities in Pd-Rich PdX Alloys

Authors: M. Asato, C. Liu, N. Fujima, T. Hoshino, Y. Chen, T. Mohri

Abstract:

We study the temperature dependence of the interaction energies (IEs) of X (=Ru, Rh) impurities in Pd, arising from the Fermi-Dirac (FD) distribution and from the thermal vibration effect treated with the Debye-Grüneisen model. The n-body (n=2-4) IEs among X impurities in Pd, which are used to calculate the internal energies entering the free energies of the Pd-rich PdX alloys, are determined uniquely and successively, from lower order to higher order, by the full-potential Korringa-Kohn-Rostoker Green’s function method (FPKKR) combined with the generalized gradient approximation of density functional theory. We find that the temperature dependence of the IEs due to the FD distribution, which is usually neglected, is very important for reproducing the X-concentration dependence of the observed solvus temperatures of the Pd-rich PdX (X=Ru, Rh) alloys.

Keywords: Full-potential KKR-Green’s function method, Fermi-Dirac distribution, GGA, phase diagram of Pd-rich PdX (X=Ru, Rh) alloys, thermal vibration effect.

5640 Comparison of Detrending Methods in Spectral Analysis of Heart Rate Variability

Authors: Liping Li, Changchun Liu, Ke Li, Chengyu Liu

Abstract:

A non-stationary trend in an R-R interval series is considered a main factor that can strongly influence spectral analysis, and it is recommended that trends be removed in order to obtain reliable results. In this study, three detrending methods, the smoothness priors approach, the wavelet method and empirical mode decomposition, were compared on artificial R-R interval series containing four types of simulated trends. The Lomb-Scargle periodogram was used for spectral analysis of the R-R interval series. The results indicate that the wavelet method shows a better overall performance than the other two methods and is also more time-saving. It was therefore selected for spectral analysis of real R-R interval series from thirty-seven healthy subjects, where significant decreases after detrending were found in the low-frequency band (19.94±5.87%) and in the ratio (18.97±5.78%; p<0.001). The wavelet method is thus recommended as the optimal choice.

Keywords: empirical mode decomposition, heart rate variability, signal detrending, smoothness priors, wavelet
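
A minimal sketch of the spectral-analysis step only: a Lomb-Scargle periodogram of an unevenly sampled R-R series with SciPy, integrated over the conventional LF and HF bands. The synthetic R-R series and band limits are assumptions for illustration; the detrending comparison (smoothness priors, wavelet, EMD) is not reproduced.

```python
import numpy as np
from scipy.signal import lombscargle

# Synthetic R-R series: 0.8 s mean with 0.1 Hz (LF) and 0.25 Hz (HF) modulations plus noise
rng = np.random.default_rng(1)
beat_times, t = [], 0.0
while t < 300.0:
    rr_i = (0.8 + 0.03 * np.sin(2 * np.pi * 0.10 * t)
                + 0.02 * np.sin(2 * np.pi * 0.25 * t)
                + 0.01 * rng.standard_normal())
    t += rr_i
    beat_times.append(t)
beat_times = np.array(beat_times)
rr = np.diff(beat_times)                          # unevenly spaced R-R intervals, s
tt = beat_times[1:]                               # their (uneven) time stamps

freqs = np.linspace(0.01, 0.5, 500)               # Hz
pgram = lombscargle(tt, rr - rr.mean(), 2 * np.pi * freqs)

df = freqs[1] - freqs[0]
lf = pgram[(freqs >= 0.04) & (freqs < 0.15)].sum() * df    # low-frequency power
hf = pgram[(freqs >= 0.15) & (freqs < 0.40)].sum() * df    # high-frequency power
print("LF/HF ratio ~", round(lf / hf, 2))
```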

5639 Development of Nondestructive Imaging Analysis Method Using Muonic X-Ray with a Double-Sided Silicon Strip Detector

Authors: I-Huan Chiu, Kazuhiko Ninomiya, Shin’ichiro Takeda, Meito Kajino, Miho Katsuragawa, Shunsaku Nagasawa, Atsushi Shinohara, Tadayuki Takahashi, Ryota Tomaru, Shin Watanabe, Goro Yabu

Abstract:

In recent years, a nondestructive elemental analysis method based on muonic X-ray measurements has been developed and applied to various samples. Muonic X-rays are emitted after the formation of a muonic atom, which occurs when a negatively charged muon is captured into a muonic orbit around the nucleus. Because muonic X-rays have a higher energy than electronic X-rays, owing to the muon mass, they can be measured without being absorbed by the material. Estimating the two-dimensional (2D) elemental distribution of a sample therefore becomes possible with an X-ray imaging detector. In this work, we report a nondestructive imaging experiment using muonic X-rays at the Japan Proton Accelerator Research Complex. The irradiated target consisted of a polypropylene material, and a double-sided silicon strip detector, originally developed as an imaging detector for astronomical observation, was employed. A peak corresponding to muonic X-rays from the carbon atoms in the target was clearly observed in the energy spectrum at 14 keV, and 2D visualizations were successfully reconstructed to reveal the projection image of the target. This result demonstrates the potential of nondestructive elemental imaging based on muonic X-ray measurement. To obtain a higher position resolution for imaging smaller targets, a new detector system will be developed, together with improved statistical analysis, in further research.

Keywords: DSSD, muon, muonic X-ray, imaging, non-destructive analysis

5638 Active Tendons for Seismic Control of Buildings

Authors: S. M. Nigdeli, M. H. Boduroglu

Abstract:

In this study, active tendons with Proportional-Integral-Derivative (PID) type controllers were applied to an SDOF and an MDOF building model. Physical models of the buildings were constructed with virtual springs, dampers and rigid masses, and the equations of motion of all degrees of freedom were obtained. MATLAB Simulink was used to build block diagrams for these equations of motion, and the controller parameters were found by trial and error. After earthquake acceleration records were applied to the systems, building responses such as displacements, velocities, accelerations and transfer functions were analyzed for all degrees of freedom. Comparisons of displacement, velocity and acceleration time histories and of transfer function magnitude (dB) versus frequency (Hz) were made for the uncontrolled and controlled buildings. The results show that the method is feasible.

Keywords: Active Tendons, Proportional Integral Derivative Type Controllers, SDOF, MDOF, Earthquake, Building.
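
A hedged single-degree-of-freedom sketch of the control idea: a PID-type force applied through an idealized tendon to a mass-spring-damper excited by a harmonic stand-in for ground acceleration. The structural parameters, trial gains, and excitation are illustrative assumptions, not the Simulink models or earthquake records used in the study.

```python
import numpy as np

# SDOF structure: m*x'' + c*x' + k*x = -m*a_g(t) + u(t), with u from the active tendon
m, k = 1000.0, 4.0e5                       # kg, N/m -> natural frequency ~3.2 Hz (illustrative)
c = 2 * 0.02 * np.sqrt(k * m)              # 2 % inherent damping
kp, kd, ki = 2.0e5, 1.0e4, 0.0             # trial PID gains (illustrative)

def a_g(t):
    return 2.0 * np.sin(2 * np.pi * 3.0 * t)   # stand-in ground acceleration, m/s^2

dt, n = 1e-3, 20000
x = v = integ = 0.0                        # controlled building state
xf = vf = 0.0                              # uncontrolled building state
peak_ctrl = peak_free = 0.0
for i in range(n):
    t = i * dt
    integ += x * dt
    u = -(kp * x + kd * v + ki * integ)    # tendon control force
    a = (-m * a_g(t) - c * v - k * x + u) / m
    v += a * dt
    x += v * dt                            # semi-implicit Euler step, controlled
    af = (-m * a_g(t) - c * vf - k * xf) / m
    vf += af * dt
    xf += vf * dt                          # same step, uncontrolled
    peak_ctrl = max(peak_ctrl, abs(x))
    peak_free = max(peak_free, abs(xf))

print(f"peak displacement: {peak_free*1e3:.1f} mm uncontrolled, {peak_ctrl*1e3:.1f} mm controlled")
```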

5637 Speed Sensorless Direct Torque Control of a PMSM Drive using Space Vector Modulation Based MRAS and Stator Resistance Estimator

Authors: A. Ameur, B. Mokhtari, N. Essounbouli, L. Mokrani

Abstract:

This paper presents a speed-sensorless direct torque control scheme using space vector modulation (DTC-SVM) for a permanent magnet synchronous motor (PMSM) drive, based on a Model Reference Adaptive System (MRAS) algorithm and a stator resistance estimator. The MRAS is used to estimate the speed and the stator resistance and to compensate the effects of parameter variation on the stator resistance, which makes the flux and torque estimation more accurate and insensitive to parameter variation. On the other hand, the use of the SVM method reduces the torque ripple while achieving a good dynamic response. Simulation results are presented and show the effectiveness of the proposed method.

Keywords: MRAS, PMSM, SVM, DTC, Speed and Resistance estimation, Sensorless drive

5636 A Wavelet-Based Watermarking Method Exploiting the Contrast Sensitivity Function

Authors: John N. Ellinas, Panagiotis Kenterlis

Abstract:

The efficiency of an image watermarking technique depends on the preservation of visually significant information, which is attained by embedding the watermark transparently with the maximum possible strength. The present paper describes an approach to still-image digital watermarking in which the embedding process employs the wavelet transform and incorporates Human Visual System (HVS) characteristics. The sensitivity of a human observer to contrast as a function of spatial frequency is described by the Contrast Sensitivity Function (CSF). The strength of the watermark within each decomposition subband, which occupies an interval of spatial frequencies, is adjusted according to this sensitivity. Moreover, the embedding is carried out over the subband coefficients that lie on edges, where distortions are less noticeable. The experimental evaluation of the proposed method shows very good results in terms of robustness and transparency.

Keywords: Image watermarking, wavelet transform, human visual system, contrast sensitivity function.
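
A hedged sketch of additive watermark embedding in the detail subbands of a wavelet decomposition with PyWavelets. A fixed per-level strength stands in for the CSF-derived weighting (higher strength in the less visually sensitive fine subbands), and the edge-selection step is omitted; the wavelet, strengths, and random image are illustrative assumptions.

```python
import numpy as np
import pywt

def embed_watermark(image, bits, strengths=(2.0, 4.0, 8.0), seed=0):
    """Additive spread-spectrum embedding in the detail subbands of a 3-level DWT.

    `strengths` is ordered coarse-to-fine and stands in for CSF-based weights:
    the fine (high-frequency) subbands tolerate a larger embedding strength.
    """
    coeffs = pywt.wavedec2(image.astype(float), "haar", level=3)
    rng = np.random.default_rng(seed)
    spread = 2.0 * bits - 1.0                            # bits {0,1} -> {-1,+1}
    for lvl, alpha in zip(range(1, 4), strengths):       # coeffs[1] = coarsest details
        cH, cV, cD = coeffs[lvl]
        pattern = rng.choice(spread, size=cH.shape)      # pseudo-random tiling of the bits
        coeffs[lvl] = (cH + alpha * pattern, cV, cD)
    return pywt.waverec2(coeffs, "haar")

img = np.random.default_rng(1).uniform(0, 255, (256, 256))
bits = np.random.default_rng(2).integers(0, 2, 128).astype(float)
marked = embed_watermark(img, bits)
psnr = 10 * np.log10(255.0**2 / np.mean((marked - img) ** 2))
print(f"embedding PSNR ~ {psnr:.1f} dB")
```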

5635 Retrieving Similar Segmented Objects Using Motion Descriptors

Authors: Konstantinos C. Kartsakalis, Angeliki Skoura, Vasileios Megalooikonomou

Abstract:

The fuzzy composition of objects depicted in images acquired through MR imaging or the use of bio-scanners has often been a point of controversy for field experts attempting to effectively delineate between the visualized objects. Modern approaches in medical image segmentation tend to consider fuzziness as a characteristic and inherent feature of the depicted object, instead of an undesirable trait. In this paper, a novel technique for efficient image retrieval in the context of images in which segmented objects are either crisp or fuzzily bounded is presented. Moreover, the proposed method is applied in the case of multiple, even conflicting, segmentations from field experts. Experimental results demonstrate the efficiency of the suggested method in retrieving similar objects from the aforementioned categories while taking into account the fuzzy nature of the depicted data.

Keywords: Fuzzy Object, Fuzzy Image Segmentation, Motion Descriptors, MRI Imaging, Object-Based Image Retrieval.

5634 An Approach of the Inverter Voltage Used for the Linear Machine with Multi Air-Gap Structure

Authors: Pierre Kenfack

Abstract:

In this paper we present a contribution to the modelling and control of the inverter voltage of a permanent magnet linear generator with a multi air-gap structure. The time-domain control method is based on the instantaneous comparison of reference signals, in the form of current or voltage, with the actual measured signals; the reference current or voltage must be kept close to the actual signal within a reasonable tolerance. In this work, the time-domain control method is used for signal tracking, and the performance evaluation concerns how well the reference signal is followed. Simulations confirm that the measured signals track the reference variables (current, voltage) very well. Everything is simulated in PSIM to show the performance and robustness of the proposed controller.

Keywords: Control, permanent magnet, linear machine, multi air-gap structure.
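
The "instant comparison within a reasonable tolerance" described above reads like a tolerance-band (hysteresis) rule; the sketch below applies such a rule to make an RL load current track a sinusoidal reference with a two-level inverter leg. The load, DC-link voltage, band width, and reference are illustrative assumptions, not the PSIM model of the multi air-gap machine.

```python
import numpy as np

# Tolerance-band tracking of a sinusoidal current reference with a two-level leg
R, L, Vdc, dt = 1.0, 5e-3, 200.0, 1e-6     # illustrative load and DC-link values
band = 0.2                                 # tolerance around the reference, A
i, state = 0.0, 1                          # load current and switch state (+1 / -1)
err_max = 0.0
for n in range(int(0.04 / dt)):            # two 50 Hz periods
    t = n * dt
    i_ref = 5.0 * np.sin(2 * np.pi * 50 * t)
    if i < i_ref - band:
        state = 1                          # apply +Vdc to push the current up
    elif i > i_ref + band:
        state = -1                         # apply -Vdc to pull it down
    i += dt / L * (state * Vdc - R * i)    # RL load update
    err_max = max(err_max, abs(i - i_ref))

print(f"worst tracking error: {err_max:.3f} A for a band of {band} A")
```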

5633 Solar Radiation Studies for Dubai and Sharjah, UAE

Authors: Muhammed A. Ahmed, Sidra A. Shaikh

Abstract:

Global Solar Radiation (H) for Dubai and Sharjah (latitude 25.25°N, longitude 55°E, and latitude 25.29°N, longitude 55°E, respectively) has been studied using sunshine-hour data (n) for the areas and several estimation methods. The calculated global solar radiation values are then compared with the measured values reported by NASA. Furthermore, the extraterrestrial (H0), diffuse (Hd) and beam (Hb) radiation are also calculated. The diffuse radiation is calculated using the methods proposed by Page and by Liu and Jordan (L-J); the diffuse radiation from the Page method is higher than that from the L-J method. Moreover, the clearness index (KT) indicates a clear sky almost all year round. Rainy days are few and limited to the months of December to March. The temperature remains between 25°C in winter and 44°C in summer, which is favorable for thermal applications of solar energy. From the estimated results, it appears that solar radiation can be utilized very efficiently throughout the year for photovoltaic and thermal applications.

Keywords: Dubai, Sharjah, Global Solar Radiation, Diffuse Radiation
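
Sunshine-hour methods of the kind compared above are typically built on the Angström-Prescott relation H = H0(a + b·n/N); the sketch below computes H0 and the day length N from standard solar-geometry formulas and applies generic regression constants. The values of a, b and the sunshine hours are illustrative assumptions, not the coefficients or data used in the paper.

```python
import math

def extraterrestrial_radiation(lat_deg, day_of_year):
    """Daily extraterrestrial radiation H0 (MJ/m2/day) and day length N (h)."""
    Gsc = 0.0820                                                        # solar constant, MJ/m2/min
    phi = math.radians(lat_deg)
    delta = 0.409 * math.sin(2 * math.pi * day_of_year / 365 - 1.39)    # solar declination
    dr = 1 + 0.033 * math.cos(2 * math.pi * day_of_year / 365)          # earth-sun distance factor
    ws = math.acos(-math.tan(phi) * math.tan(delta))                    # sunset hour angle
    H0 = (24 * 60 / math.pi) * Gsc * dr * (
        ws * math.sin(phi) * math.sin(delta)
        + math.cos(phi) * math.cos(delta) * math.sin(ws))
    N = 24 / math.pi * ws
    return H0, N

# Angstrom-Prescott: H = H0 * (a + b * n/N); a, b below are generic illustrative values
a, b = 0.25, 0.50
H0, N = extraterrestrial_radiation(25.25, day_of_year=172)   # Dubai latitude, late June
n = 11.0                                                     # assumed measured sunshine hours
H = H0 * (a + b * n / N)
print(f"H0 = {H0:.1f} MJ/m2/day, N = {N:.1f} h, estimated H = {H:.1f} MJ/m2/day")
```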

5632 PM10 Chemical Characteristics in a Background Site at the Universidad Libre Bogotá

Authors: Laura X. Martinez, Andrés F. Rodríguez, Ruth A. Catacoli

Abstract:

An important feature of air pollution in the study area is that PM10 concentrations maintain a fairly constant trend, except in some places where they frequently exceed the limits established by Colombian legislation. The community surrounding the Universidad Libre Bogotá includes a considerable number of students and workers, all of whom may be exposed to PM10 for long periods of time while on campus. The chemical characterization of PM10 in the ambient air at the Universidad Libre Bogotá was therefore identified as the problem to address. A Hi-Vol sampler and EPA Test Method 5 were used to assess whether the air quality is adequate for the human respiratory system, and quartz fiber filters were used for sampling. Samples were taken three days a week during a dry period in November and December 2015. The gravimetric method was used to determine PM10 concentrations. The chemical characterization includes non-conventional carcinogenic pollutants: metals were determined by atomic absorption spectrophotometry (AAS), and VOCs were analyzed by Fourier transform infrared spectroscopy (FTIR). PM10 concentrations ranging from 13 µg/m3 to 66 µg/m3 were obtained. These 24-hour values are lower than the limits established by Colombian law (Resolution 610 of 2010); however, when compared with the limits set by the World Health Organization (WHO), the concentrations could exceed permissible levels.

Keywords: Air quality, atomic absorption spectrophotometry, Fourier transform infrared spectroscopy, particulate matter.
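
The gravimetric determination mentioned above reduces to simple arithmetic: the filter's mass gain divided by the volume of air drawn through it. The flow rate, sampling time, and masses below are invented purely to illustrate the calculation, not the campaign's data.

```python
def pm10_concentration(mass_initial_g, mass_final_g, flow_m3_min, minutes):
    """PM10 mass concentration in ug/m3 from a gravimetric filter weighing."""
    mass_ug = (mass_final_g - mass_initial_g) * 1e6     # filter mass gain, ug
    volume_m3 = flow_m3_min * minutes                   # volume of air sampled
    return mass_ug / volume_m3

# Illustrative 24 h Hi-Vol run: 1.13 m3/min for 1440 min, filter gains 70 mg
conc = pm10_concentration(3.1000, 3.1700, flow_m3_min=1.13, minutes=1440)
print(f"PM10 ~ {conc:.0f} ug/m3")
```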

5631 Observations about the Principal Components Analysis and Data Clustering Techniques in the Study of Medical Data

Authors: Cristina G. Dascâlu, Corina Dima Cozma, Elena Carmen Cotrutz

Abstract:

The statistical analysis of medical data often requires special techniques because of the particular nature of these data. Principal components analysis and data clustering are two statistical methods for data mining that are very useful in the medical field: the first as a method to decrease the number of studied parameters, and the second as a method to analyze the connections between diagnosis and data about the patient's condition. In this paper we investigate the implications of a specific data-analysis technique: data clustering preceded by a selection of the most relevant parameters, made using principal components analysis. Our assumption was that using principal components analysis before data clustering, in order to select and classify only the most relevant parameters, would improve the accuracy of the clustering; the practical results, however, showed the opposite: the clustering accuracy decreases, by a percentage approximately equal to the percentage of information loss reported by the principal components analysis.

Keywords: Data clustering, medical data, principal components analysis.
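
A hedged sketch of the design being compared above: k-means clustering on all standardized parameters versus clustering on PCA scores that retain only part of the variance, with agreement against the known diagnosis as a stand-in for clustering accuracy. The dataset is a generic scikit-learn medical-style dataset, not the authors' data, and the 80% variance threshold is an assumption.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.metrics import adjusted_rand_score
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)       # stand-in medical-style dataset
Xs = StandardScaler().fit_transform(X)

# Clustering on all standardized parameters
labels_all = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)

# Clustering after PCA keeps only the components explaining 80% of the variance
pca = PCA(n_components=0.80)
Xp = pca.fit_transform(Xs)
labels_pca = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xp)

print("components kept:", pca.n_components_)
print("agreement with diagnosis, raw features:", adjusted_rand_score(y, labels_all))
print("agreement with diagnosis, PCA scores:  ", adjusted_rand_score(y, labels_pca))
```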

5630 Real Time Object Tracking in H.264/ AVC Using Polar Vector Median and Block Coding Modes

Authors: T. Kusuma, K. Ashwini

Abstract:

This paper presents a real-time video surveillance system capable of tracking multiple objects using the Polar Vector Median (PVM) and Block Coding Modes (BCM) with Global Motion Compensation (GMC). The method works in the compressed domain and uses the motion vectors and BCM from the compressed bit stream to perform real-time object tracking. We propose to do this on the basis of the neighboring Motion Vectors (MVs), using a method called PVM. Since global motion adds to an object's native motion, it is important for accurate tracking to remove it from the MV field prior to further processing. The proposed method is tested on a number of standard sequences, and the results show its advantages over some current methods.

Keywords: Block coding mode, global motion compensation, object tracking, polar vector median, video surveillance.

5629 Optimal Placement of DG in Distribution System to Mitigate Power Quality Disturbances

Authors: G.V.K Murthy, S. Sivanagaraju, S. Satyanarayana, B. Hanumantha Rao

Abstract:

Distributed Generation (DG) systems are considered an integral part of future distribution system planning. Appropriate sizing and placement of distributed generation plays a significant role in minimizing power losses in distribution systems; among the benefits of distributed generation is the reduction in active power losses, which can improve system performance, reliability and power quality. In this paper, an Artificial Bee Colony (ABC) algorithm is proposed to determine the optimal DG-unit size and location using a loss sensitivity index, in order to minimize the real power loss and total harmonic distortion (THD) and to improve the voltage sag index. A simulation study is conducted on the 69-bus radial test system to verify the efficacy of the proposed method.

Keywords: Distributed generation, artificial bee colony method, loss reduction, radial distribution network.

5628 Relationship between Codependency, Perceived Social Support, and Depression in Mothers of Children with Intellectual Disability

Authors: Sajed Yaghoubnezhad, Mina Karimi, Seyede Marjan Modirkhazeni

Abstract:

The goal of this research was to study the relationship between codependency, perceived social support and depression in mothers of children with intellectual disability (ID). A correlational method was used. The research population comprised mothers of educable children with ID, aged 25 to 61 years. From this population, a sample of 251 individuals was selected by multistage cluster sampling from educational districts in Tehran; they responded to the Spann-Fischer Codependency Scale (SFCDS), the Social Support Questionnaire and the Beck Depression Inventory (BDI). The findings indicate that, among mothers of children with ID, depression has a positive and significant correlation with codependency (P<0.01, r=0.4) and a negative and significant correlation with the total score of social support (P<0.01, r=-0.34). Moreover, stepwise multiple regression analysis showed that codependency accounts for a larger share of the variance in depression than social support does (R2=0.023).

Keywords: Codependency, social support, depression, mothers of children with ID.

5627 Different Approaches for the Design of IFIR Compaction Filter

Authors: Sheeba V.S, Elizabeth Elias

Abstract:

Optimization of filter banks based on knowledge of the input statistics has been of interest for a long time. Finite impulse response (FIR) compaction filters are used in the design of optimal signal-adapted orthonormal FIR filter banks. In this paper we discuss three different approaches to the design of interpolated finite impulse response (IFIR) compaction filters. In the first method, the magnitude-squared response satisfies the Nyquist constraint approximately; in the second and third methods the constraint is satisfied exactly. These methods yield FIR compaction filters whose response is comparable with that of existing designs, while IFIR filters offer a significant saving in the number of multipliers and can be implemented efficiently. Since an eigenfilter approach is used, the method has low complexity. The design of IFIR filters in the least-squares sense is presented.

Keywords: Principal Component Filter Bank, Interpolated Finite Impulse Response filter, Orthonormal Filter Bank, Eigen Filter.

5626 Multiple Sequence Alignment Using Optimization Algorithms

Authors: M. F. Omar, R. A. Salam, R. Abdullah, N. A. Rashid

Abstract:

Proteins or genes that have similar sequences are likely to perform the same function. One of the most widely used techniques for sequence comparison is sequence alignment, which allows mismatches and insertions/deletions that represent biological mutations. Sequence alignment is usually performed on two sequences; multiple sequence alignment is its natural extension, where the emphasis is on finding an optimal alignment for a group of sequences. Several applicable techniques were examined in this research, from traditional methods such as dynamic programming to widely used stochastic optimization methods such as Genetic Algorithms (GAs) and Simulated Annealing. A framework combining a Genetic Algorithm and Simulated Annealing is presented to solve the multiple sequence alignment problem: the Genetic Algorithm phase tries to find new regions of the solution space, while Simulated Annealing acts as an alignment improver for any near-optimal solution produced by the GA.

Keywords: Simulated annealing, genetic algorithm, sequence alignment, multiple sequence alignment.

5625 The Application of Fuzzy Set Theory to Mobile Internet Advertisement Fraud Detection

Authors: Jinming Ma, Tianbing Xia, Janusz R. Getta

Abstract:

This paper presents an application of fuzzy set theory to the implementation of mobile advertisement anti-fraud systems. Mobile anti-fraud aims to identify mobile advertisement fraudsters, and one of its main problems is the lack of evidence to prove that a user is a fraudster. In this paper, we implement an application using fuzzy set theory to demonstrate how cheaters can be detected. The advantage of our method is that it avoids the difficulty of detecting fraudsters from small data samples: each user is given a suspicious degree indicating how likely the user is to be cheating, and a group of users (such as all users of a certain app) is judged to be fraudulent according to the average suspicious degree. This makes the process more accurate, because the data of a single user are too limited to be predictive.

Keywords: Mobile internet, advertisement, anti-fraud, fuzzy set theory.
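
A minimal sketch of the group-level decision described above: each user receives a suspicious degree in [0, 1] from simple fuzzy membership functions over its behavior, and a whole group (e.g., all users of one app) is flagged when the average degree crosses a threshold. The features, triangular membership shapes, aggregation by max, and threshold are all assumptions for illustration, not the paper's rule base.

```python
def tri_membership(x, lo, peak, hi):
    """Triangular fuzzy membership of x on [lo, hi], peaking at `peak`."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

def suspicious_degree(clicks_per_min, conversion_rate):
    """Fuzzy degree that a user is a fraudster (features and shapes are assumptions)."""
    fast_clicking = tri_membership(clicks_per_min, 5, 30, 60)
    low_conversion = tri_membership(conversion_rate, -0.01, 0.0, 0.02)
    return max(fast_clicking, low_conversion)            # fuzzy OR via the max operator

def app_is_fraudulent(users, threshold=0.6):
    """Decide on the whole group from the average suspicious degree."""
    degrees = [suspicious_degree(c, r) for c, r in users]
    return sum(degrees) / len(degrees) >= threshold, degrees

users = [(28, 0.001), (33, 0.0), (2, 0.05), (41, 0.002)]  # (clicks/min, conversion rate)
flag, degrees = app_is_fraudulent(users)
print("suspicious degrees:", [round(d, 2) for d in degrees], "-> flag app:", flag)
```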

5624 A New Scheme for Improving the Quality of Service in Heterogeneous Wireless Network for Data Stream Sending

Authors: Ebadollah Zohrevandi, Rasoul Roustaei, Omid Moradtalab

Abstract:

In this paper, we first consider the quality-of-service problems that arise in heterogeneous wireless networks when sending video data, for which the real-time requirement is pronounced. We then present a method for ensuring end-to-end quality of service at the application layer for adaptive transmission of video data over heterogeneous wireless networks. To do this, mechanisms in different layers are used: a stop mechanism, an adaptation mechanism and graceful degradation at the application layer; a multi-level congestion feedback mechanism in the network layer; and a connection cut-off decision mechanism in the link layer. Finally, the presented method and the achieved improvement are simulated and evaluated in the NS-2 software.

Keywords: Congestion, Handoff, Heterogeneous wireless networks, Adaptation mechanism, Stop mechanism, Graceful degrade.

5623 Calculation of Inflation from Salaries Instead of Consumer Products: A Logical Exercise

Authors: E. Dahlen

Abstract:

Inflation can be calculated either from the prices of consumer products or from salaries. This paper presents a logical exercise showing that it is easier to calculate inflation from salaries than from consumer products. The prices of consumer products may change due to technological advancement, such as automation, which must be corrected for; salaries do not. If technological advancements are not accounted for in calculations based on consumer product prices, inflation can be confused with real wage changes, since both inflation and real wage changes affect the prices of consumer products. The method employed in this paper is a logical exercise: logical arguments are presented suggesting that many different feasible ways of determining inflation exist, followed by a short mathematical exercise showing that one of these methods, the one using salaries, contains the fewest unknown parameters and is therefore preferred, since the risk of mistakes is lower. From these results, it can be concluded that salaries, rather than consumer products, should be used to calculate inflation.

Keywords: Inflation, logic, math, real wages.
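
A tiny worked example of the arithmetic behind the argument, with invented figures: a wage-based index is a single ratio of average salaries, whereas a product-price index needs an extra, hard-to-observe quality (technology) correction before it can be read as inflation. The 3% wage growth and 5% quality adjustment are placeholders, not data.

```python
# Invented figures, purely to illustrate the parameter-counting argument above.
salary_last_year, salary_this_year = 40000.0, 41200.0
wage_based = salary_this_year / salary_last_year - 1.0
print(f"wage-based index: {wage_based:.1%}")             # one ratio, few corrections needed

# A product-price index must also correct for technological change in the product itself:
price_last_year, price_this_year = 1000.0, 1000.0
quality_gain = 0.05                                      # assumed hedonic adjustment (extra unknown)
price_based = price_this_year / (price_last_year * (1 + quality_gain)) - 1.0
print(f"quality-adjusted price index: {price_based:.1%}")
```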
