Search results for: Ambiguity Resolution

339 Novel NMR-Technology to Assess Food Quality and Safety

Authors: Markus Link, Manfred Spraul, Hartmut Schaefer, Fang Fang, Birk Schuetz

Abstract:

High Resolution NMR Spectroscopy offers unique screening capabilities for food quality and safety by combining non-targeted and targeted screening in one analysis.

The objective is to demonstrate that, owing to its extreme reproducibility, NMR can detect the smallest changes in the concentrations of many components in a mixture; such changes are best monitored by statistical evaluation, while the method also delivers reliable quantification results.

The methodology typically uses a 400 MHz high-resolution instrument under full automation, after minimal sample preparation.

For example, one fruit-juice analysis in push-button operation takes at most 15 minutes and delivers a multitude of results, which are automatically summarized in a PDF report.

The method has been proven on fruit juices, where previously unknown frauds could be detected. In addition, conventional targeted parameters are obtained in the same analysis. This technology has the advantage that NMR is completely quantitative, and concentration calibration only has to be done once for all compounds. Since NMR is so reproducible, it is also transferable between different instruments (of the same field strength) and laboratories. Based on strict SOPs, statistical models developed once can be used on multiple instruments, and strategies for compound identification and quantification are likewise applicable across labs.

Keywords: Automated solution, NMR, non-targeted screening, targeted screening.

338 Deep Learning Based 6D Pose Estimation for Bin-Picking Using 3D Point Clouds

Authors: Hesheng Wang, Haoyu Wang, Chungang Zhuang

Abstract:

Estimating the 6D pose of objects is a core step in robot bin-picking tasks. The problem is that, in real applications, various objects are usually randomly stacked with heavy occlusion. In this work, we propose a method to regress 6D poses by predicting three points for each object in the 3D point cloud through deep learning. To resolve the pose ambiguity of symmetric objects, we propose a labeling method that helps the network converge better. Based on the predicted pose, an iterative method is employed for pose optimization. In real-world experiments, our method outperforms the classical approach in both precision and recall.
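
The abstract leaves the pose-recovery step implicit. Assuming the three predicted points correspond to three known, non-collinear reference points in the object's model frame, a rigid pose follows from the orthogonal Procrustes (Kabsch) solution; a minimal Python sketch (function and variable names are illustrative, not the authors' code):

```python
import numpy as np

def pose_from_three_points(model_pts, pred_pts):
    """Recover rotation R and translation t with R @ p_model + t ~= p_pred,
    via the Kabsch / orthogonal Procrustes solution.
    model_pts, pred_pts: (3, 3) arrays, one non-collinear point per row."""
    mu_m = model_pts.mean(axis=0)
    mu_p = pred_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (pred_pts - mu_p)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_p - R @ mu_m
    return R, t
```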

Keywords: Pose estimation, deep learning, point cloud, bin-picking, 3D computer vision.

337 Analysis of Classifications of Unsolicited Bulk Emails

Authors: Jatinderkumar R. Saini, Apurva A. Desai

Abstract:

In recent times, the problem of Unsolicited Bulk Email (UBE), commonly known as Spam Email, has grown at a tremendous rate. We present a survey-based analysis of the classifications of UBE in various research works. There are many research instances of classification between spam and non-spam emails, but very few research instances are available for the classification of spam emails per se. This paper does not intend to assert that some UBE classification is better than the others, nor does it propose any new classification, but it bemoans the lack of harmony on the number and definition of categories proposed by different researchers. The paper also elaborates on factors like the intent of the spammer, the content of UBE, and the ambiguity in different categories as proposed in related research works on classifications of UBE.

Keywords: E-mail, Scams, Spam Email, Unsolicited Bulk Email (UBE)

336 Study on Performance of Wigner Ville Distribution for Linear FM and Transient Signal Analysis

Authors: Azeemsha Thacham Poyil, Nasimudeen KM

Abstract:

This research paper presents methods to assess the performance of the Wigner–Ville Distribution (WVD) for the time-frequency representation of non-stationary signals, in comparison with other representations such as the STFT and the spectrogram. The simultaneous time-frequency resolution of the WVD is one of the important properties that make it preferable for the analysis and detection of linear FM and transient signals. Two algorithms are proposed here to assess the resolution and to compare signal-detection performance. The first method is based on measuring the area under the time-frequency plot, for the case of linear FM signal analysis. The second method is based on instantaneous power calculation and is used for transient, non-stationary signals. The implementation of both methods is explained briefly with suitable diagrams. The accuracy of the measurements is validated to show the better performance of the WVD representation in comparison with the STFT and the spectrogram.
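
For readers who want to reproduce such time-frequency plots, a minimal discrete Wigner–Ville sketch in Python follows; the lag-product/FFT form below is one standard discretization, not the authors' implementation, and the analytic signal from `hilbert` suppresses aliasing:

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of an analytic signal x.
    Returns an (N, N) time-frequency array; a minimal sketch."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        kmax = min(n, N - 1 - n)                   # largest admissible lag
        k = np.arange(-kmax, kmax + 1)
        r = np.zeros(N, dtype=complex)
        r[k % N] = x[n + k] * np.conj(x[n - k])    # instantaneous autocorrelation
        W[n] = np.real(np.fft.fft(r))              # FFT over the lag variable
    return W

# Linear FM (chirp) test signal: energy concentrates along a straight line
t = np.arange(256) / 256.0
chirp = np.cos(2 * np.pi * (20 * t + 40 * t ** 2))
W = wigner_ville(hilbert(chirp))
```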

Keywords: WVD: Wigner–Ville Distribution, STFT: Short-Time Fourier Transform, FT: Fourier Transform, TFR: Time-Frequency Representation, FM: Frequency Modulation, LFM signal: Linear FM signal, JTFA: Joint Time-Frequency Analysis.

335 Local Curvelet Based Classification Using Linear Discriminant Analysis for Face Recognition

Authors: Mohammed Rziza, Mohamed El Aroussi, Mohammed El Hassouni, Sanaa Ghouzali, Driss Aboutajdine

Abstract:

In this paper, an efficient local appearance feature extraction method based on the multi-resolution Curvelet transform is proposed in order to further enhance the performance of the well-known Linear Discriminant Analysis (LDA) method when applied to face recognition. Each face is described by a subset of band-filtered images containing block-based Curvelet coefficients. These coefficients characterize the face texture, and a set of simple statistical measures allows us to form compact and meaningful feature vectors. The proposed method is compared with related feature extraction methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Independent Component Analysis (ICA). Two different multi-resolution transforms, Wavelet (DWT) and Contourlet, were also compared against the block-based Curvelet-LDA algorithm. Experimental results on the ORL, YALE, and FERET face databases convince us that the proposed method provides a better representation of the class information and obtains much higher recognition accuracies.

Keywords: Curvelet, Linear Discriminant Analysis (LDA), Contourlet, Discrete Wavelet Transform (DWT), block-based analysis, face recognition (FR).

334 Transmit Sub-aperture Optimization in MSTA Ultrasound Imaging Method

Authors: Yuriy Tasinkevych, Ihor Trots, Andrzej Nowicki, Marcin Lewandowski

Abstract:

The paper presents the optimization problem for the multi-element synthetic transmit aperture (MSTA) method in ultrasound imaging applications. The optimal choice of the transmit aperture size is performed as a trade-off between the lateral resolution, penetration depth, and frame rate. Results of the analysis obtained by the developed optimization algorithm are presented. Maximum penetration depth and the best lateral resolution at given depths are chosen as the optimization criteria. The optimization algorithm was tested using synthetic aperture data of point reflectors simulated with the Field II program for Matlab®, for the case of a 5 MHz 128-element linear transducer array with 0.48 mm pitch. Visualizations of experimentally obtained synthetic aperture data of a tissue-mimicking phantom and in vitro measurements of beef liver are also shown. The data were obtained using the SonixTOUCH Research system equipped with a 4 MHz 128-element linear transducer with 0.3 mm element pitch, 0.28 mm element width, and 70% fractional bandwidth, excited by a one-cycle sine burst at the transducer's center frequency.

Keywords: synthetic aperture method, ultrasound imaging, beamforming.

333 Speaker Identification Using Admissible Wavelet Packet Based Decomposition

Authors: Mangesh S. Deshpande, Raghunath S. Holambe

Abstract:

Mel Frequency Cepstral Coefficient (MFCC) features are widely used as acoustic features for speech recognition as well as speaker recognition. In the MFCC representation, the Mel frequency scale is used to obtain high resolution in the low-frequency region and low resolution in the high-frequency region. This kind of processing is good for obtaining stable phonetic information, but it is not suitable for speaker features, which are located in the high-frequency regions. Speaker-specific information, which is non-uniformly distributed in the high frequencies, is equally important for speaker recognition. Based on this fact, we propose an admissible wavelet packet based filter structure for speaker identification. The multiresolution capabilities of the wavelet packet transform are used to derive the new features. The proposed scheme differs from previous wavelet-based works mainly in the design of the filter structure: unlike the others, it does not follow the Mel scale. Closed-set speaker identification experiments performed on the TIMIT database show improved identification performance compared to other commonly used Mel-scale-based filter structures using wavelets.
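
As an illustration of wavelet-packet feature extraction (a plain full-tree decomposition, not the paper's admissible Mel-free filter structure), one can compute log subband energies with PyWavelets:

```python
import numpy as np
import pywt

def wp_energy_features(signal, wavelet="db4", level=3):
    """Log-energy of each terminal wavelet-packet subband; a minimal sketch
    of multiresolution feature extraction. A full tree gives 2**level
    equal-bandwidth subbands, unlike a Mel-scaled filter bank."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")      # subbands in frequency order
    energies = np.array([np.sum(np.square(n.data)) for n in nodes])
    return np.log(energies + 1e-12)                # log compresses dynamic range

# Example: 8 features from a 1 s segment sampled at 16 kHz
x = np.random.randn(16000)
feat = wp_energy_features(x)
```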

Keywords: Speaker identification, Wavelet transform, Feature extraction, MFCC, GMM.

332 Types of Epilepsies and EEG-LORETA Findings in Epilepsy

Authors: Leila Maleki, Ahmad Esmali Kooraneh, Hossein Taghi Derakhshi

Abstract:

Neural activity in the human brain starts from the early stages of prenatal development. This activity, or the signals generated by the brain, is electrical in nature and represents not only brain function but also the status of the whole body. At present, three methods can record functional and physiological changes within the brain with high temporal resolution of neuronal interactions at the network level: the electroencephalogram (EEG), the magnetoencephalogram (MEG), and functional magnetic resonance imaging (fMRI); each of these has advantages and shortcomings. EEG recording with a large number of electrodes is now feasible in clinical practice. Multichannel EEG recorded from the scalp surface provides very valuable but indirect information about the source distribution, whereas deep-electrode measurements yield more reliable information about the source locations. Intracranial recordings and scalp EEG are used with source imaging techniques to determine the locations and strengths of the epileptic activity. As a source localization method, Low Resolution Electromagnetic Tomography (LORETA) is solved for a realistic geometry based on both forward methods, the Boundary Element Method (BEM) and the Finite Difference Method (FDM). In this paper, we review EEG-LORETA findings in epilepsy.

Keywords: Epilepsy, EEG, EEG-LORETA, LORETA analysis.

331 An Approach to Solving a Permutation Problem of Frequency Domain Independent Component Analysis for Blind Source Separation of Speech Signals

Authors: Masaru Fujieda, Takahiro Murakami, Yoshihisa Ishida

Abstract:

Independent component analysis (ICA) in the frequency domain is used for solving the problem of blind source separation (BSS). However, this method has some problems: for example, a general ICA algorithm cannot determine the permutation of the separated signals across frequency bins, which is important in frequency-domain ICA. In this paper, we propose an approach to solving this permutation problem. The idea is to effectively combine two conventional approaches, improving the signal separation performance by exploiting the features of each. We show simulation results using artificial data.
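
One conventional approach of the kind the abstract alludes to matches amplitude envelopes of neighbouring frequency bins; a minimal sketch of that idea (the array layout and the greedy assignment are assumptions, not the authors' exact procedure):

```python
import numpy as np

def align_permutations(Y):
    """Resolve the frequency-domain ICA permutation ambiguity by matching
    amplitude-envelope correlations between neighbouring frequency bins.
    Y: (n_freq, n_src, n_frames) array of separated spectrogram bins."""
    n_freq, n_src, _ = Y.shape
    for f in range(1, n_freq):
        env_prev = np.abs(Y[f - 1])                # reference envelopes
        env_cur = np.abs(Y[f])
        # correlation between every (previous, current) source pairing
        C = np.corrcoef(np.vstack([env_prev, env_cur]))[:n_src, n_src:]
        # greedy assignment: best-correlated pairing first
        perm = np.full(n_src, -1)
        for _ in range(n_src):
            i, j = np.unravel_index(np.nanargmax(C), C.shape)
            perm[i] = j
            C[i, :] = np.nan
            C[:, j] = np.nan
        Y[f] = Y[f][perm]                          # reorder current bin
    return Y
```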

Keywords: Blind source separation, Independent Component Analysis, Frequency domain, Permutation ambiguity.

330 Proposal of Design Method in the Semi-Acausal System Model

Authors: Junji Kaneko, Shigeyuki Haruyama, Ken Kaminishi, Tadayuki Kyoutani, Siti Ruhana Omar, Oke Oktavianty

Abstract:

This study proposes a definition method for value and function in the manufacturing sector. In the ongoing discussion about the present state of modeling methods, the definition of 1D-CAE has so far remained ambiguous and non-conceptual. Across all physical fields, such methods are defined by the formulation of differential algebraic equations to which only time derivation and simulation are applied. Here we propose a semi-acausal modeling concept together with a differential algebraic equation method as a new modeling approach, whose efficiency has been verified through a comparison of numerical analysis results between the semi-acausal modeling calculation and an FEM reference calculation.

Keywords: System Model, Physical Models, Empirical Models, Conservation Law, Differential Algebraic Equation, Object-Oriented.

329 Comparison of Detached Eddy Simulations with Turbulence Modeling

Authors: Muhammad Amjad Sohail, Prof. Yan Chao, Mukkarum Husain

Abstract:

The flow field around hypersonic vehicles is very complex and difficult to simulate. The boundary layers are squeezed between the shock layer and the body surface, and resolving the boundary layer, the shock wave, and the turbulent regions where the flow field varies strongly is difficult. Detached eddy simulation (DES) is a modification of a RANS model in which the model switches to a subgrid-scale formulation in regions fine enough for LES calculations. Regions near solid body boundaries, and wherever the turbulent length scale is less than the maximum grid dimension, are assigned the RANS mode of solution; as the turbulent length scale exceeds the grid dimension, the regions are solved using the LES mode. Therefore, the grid resolution is not as demanding as for pure LES, thereby considerably cutting down the cost of the computation. In this research study, hypersonic flow is simulated at Mach 8 and different angles of attack to resolve the boundary layers and discontinuities properly, and the flow is also simulated in the long wake regions. The mesh differs slightly from that of the RANS simulations: it is made dense near the boundary layers and in the wake regions to resolve them properly. Hypersonic blunt cone-cylinder bodies with frusta at angles of 5° and 10° are simulated, and an aerodynamic study is performed to calculate the aerodynamic characteristics of the different geometries. The results are then compared with experimental data as well as with a turbulence model (the SA model). The results achieved with the DES simulation have very good resolution as well as excellent agreement with experimental and available data. Unsteady simulations are performed for the DES calculations by using the dual time stepping (implicit time stepping) method. The simulations are performed at Mach number 8 and angles of attack from 0° to 10° for all cases. The results and resolution of the DES model are found to be much better than those of the SA turbulence model.
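
The RANS/LES switch described above is, in the original SA-based DES formulation, a simple per-cell exchange of length scales; a minimal sketch (the constant 0.65 is the standard DES97 calibration; array names are illustrative):

```python
import numpy as np

def des_length_scale(d_wall, dx, dy, dz, c_des=0.65):
    """DES97 hybrid length scale: RANS near walls, LES where the grid is
    fine enough. d_wall: wall distance per cell; dx, dy, dz: cell sizes.
    The min() performs the mode switch the abstract describes."""
    delta = np.maximum(np.maximum(dx, dy), dz)   # largest cell dimension
    return np.minimum(d_wall, c_des * delta)
```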

Keywords: Detached eddy simulation, dual time stepping, hypersonic flow, turbulence modeling

328 Comparisons of Co-Seismic Gravity Changes between GRACE Observations and the Predictions from the Finite-Fault Models for the 2012 Mw = 8.6 Indian Ocean Earthquake Off-Sumatra

Authors: Armin Rahimi

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) has been a very successful project in determining mass redistribution within the Earth system. Large deformations caused by earthquakes lie in the high-frequency band, but GRACE is only capable of providing reliable estimates of gravitational changes in the low-to-medium frequency band. In this study, we computed the gravity changes after the 2012 Mw 8.6 Indian Ocean earthquake off-Sumatra using the GRACE Level-2 monthly spherical harmonic (SH) solutions released by the University of Texas Center for Space Research (UTCSR). Moreover, we calculated gravity changes using different fault models derived from teleseismic data. The model predictions showed non-negligible discrepancies in gravity changes. However, after removing high-frequency signals by Gaussian filtering with a 350 km radius, commensurate with the GRACE spatial resolution, the discrepancies vanished: the spatial patterns of total gravity changes predicted from all slip models became similar at the spatial resolution attainable by GRACE observations, and the predicted gravity changes were consistent with the GRACE-detected gravity changes. Nevertheless, fault models that give different slip amplitudes lead to proportionally different amplitudes in the predicted gravity changes.
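
The 350 km Gaussian filtering step can be reproduced with Jekeli's per-degree averaging weights, commonly used to smooth GRACE spherical-harmonic solutions; a minimal sketch (the three-term recursion becomes numerically unstable at high degree, so it should be truncated once the weights approach zero):

```python
import numpy as np

def gaussian_sh_weights(lmax, radius_km=350.0, earth_radius_km=6371.0):
    """Per-degree weights of the Gaussian averaging filter (Jekeli form)."""
    b = np.log(2.0) / (1.0 - np.cos(radius_km / earth_radius_km))
    w = np.zeros(lmax + 1)
    w[0] = 1.0
    w[1] = (1.0 + np.exp(-2.0 * b)) / (1.0 - np.exp(-2.0 * b)) - 1.0 / b
    for l in range(1, lmax):
        w[l + 1] = -(2 * l + 1) / b * w[l] + w[l - 1]  # three-term recursion
    return w

# Smoothing: multiply every degree-l Stokes coefficient change by w[l]
w = gaussian_sh_weights(lmax=60)
```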

Keywords: Undersea earthquake, GRACE observation, gravity change, dislocation model, slip distribution.

327 High-Accuracy Satellite Image Analysis and Rapid DSM Extraction for Urban Environment Evaluations (Tripoli-Libya)

Authors: Abdunaser Abduelmula, Maria Luisa M. Bastos, José A. Gonçalves

Abstract:

Modelling of the Earth's surface and evaluation of the urban environment with 3D models is an important research topic. New stereo capabilities of high-resolution optical satellite images, such as the tri-stereo mode of Pleiades, combined with new image matching algorithms, are now available and can be applied in urban area analysis. In addition, photogrammetry software packages have gained new, more efficient matching algorithms, such as SGM, as well as improved filters to deal with shadow areas, and can achieve denser and more precise results. This paper describes a comparison between 3D data extracted from tri-stereo and dual-stereo satellite images, combined with pixel-based matching and the Wallis filter. The aim was to improve the accuracy of 3D models, especially in urban areas, in order to assess whether satellite images are appropriate for a rapid evaluation of urban environments. The results showed that 3D models achieved with Pleiades tri-stereo outperformed, both in terms of accuracy and detail, the results obtained from a GeoEye pair. The assessment was made with reference digital surface models derived from high-resolution aerial photography. This suggests that tri-stereo images can be successfully used for the proposed urban change analyses.

Keywords: 3D Models, Environment, Matching, Pleiades.

326 An Interval Type-2 Dual Fuzzy Polynomial Equations and Ranking Method of Fuzzy Numbers

Authors: Nurhakimah Ab. Rahman, Lazim Abdullah

Abstract:

According to fuzzy arithmetic, dual fuzzy polynomials cannot be replaced by fuzzy polynomials; hence, the concept of a ranking method is used to find the real roots of dual fuzzy polynomial equations. In this study, we propose an interval type-2 dual fuzzy polynomial equation (IT2 DFPE) and use a ranking method to find its real roots (if they exist). We transform the IT2 DFPE into a system of crisp equations. This transformation is performed with a ranking method for fuzzy numbers based on three parameters, namely value, ambiguity, and fuzziness. Finally, we illustrate our approach with two numerical examples.
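
For a triangular fuzzy number (a, b, c) with the linear reducing function, the value and ambiguity parameters mentioned above have standard closed forms; a minimal sketch (the fuzziness measure varies between authors and is omitted here):

```python
def ranking_parameters(a, b, c):
    """Value and ambiguity of a triangular fuzzy number (a, b, c), using the
    linear reducing function s(r) = r. With left/right level cuts
    L(r) = a + (b - a) r and R(r) = c - (c - b) r, the integrals
    Value = int_0^1 r (L + R) dr and Ambiguity = int_0^1 r (R - L) dr
    reduce to the closed forms below."""
    value = (a + 4.0 * b + c) / 6.0
    ambiguity = (c - a) / 6.0
    return value, ambiguity

# Example: rank two triangular fuzzy numbers by value, then by ambiguity
print(ranking_parameters(1.0, 2.0, 4.0))   # (2.1666..., 0.5)
```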

Keywords: Dual fuzzy polynomial equations, Interval type-2, Ranking method, Value.

325 Resolving Dependency Ambiguity of Subordinate Clauses using Support Vector Machines

Authors: Sang-Soo Kim, Seong-Bae Park, Sang-Jo Lee

Abstract:

In this paper, we propose a method for resolving the dependency ambiguities of Korean subordinate clauses based on Support Vector Machines (SVMs). Dependency analysis of clauses is well known to be one of the most difficult tasks in parsing sentences, especially in Korean. To solve this problem, we assume that the dependency relation of Korean subordinate clauses is the dependency relation among the verb phrase, the verb, and the endings in the clauses. As a result, the problem is represented as a binary classification task. To apply SVMs to this problem, we selected two kinds of features: static and dynamic. Experimental results on the STEP 2000 corpus show that our system achieves an accuracy of 73.5%.
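
Cast as binary classification, each candidate clause-head pair becomes a feature vector labeled attach/no-attach; a minimal scikit-learn sketch (the toy feature values below are stand-ins for the paper's static and dynamic features):

```python
import numpy as np
from sklearn.svm import SVC

# Each row encodes one candidate (subordinate clause, head) pair;
# label 1 means the clause attaches to that head, 0 means it does not.
X = np.array([[1, 0, 3], [0, 1, 1], [1, 1, 2], [0, 0, 4]], dtype=float)
y = np.array([1, 0, 1, 0])

clf = SVC(kernel="rbf", C=1.0).fit(X, y)
print(clf.predict([[1.0, 0.0, 2.0]]))   # attach or not, for a new pair
```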

Keywords: Dependency analysis, subordinate clauses, binary classification, support vector machines.

324 High Accuracy ESPRIT-TLS Technique for Wind Turbine Fault Discrimination

Authors: Saad Chakkor, Mostafa Baghouri, Abderrahmane Hajraoui

Abstract:

The ESPRIT-TLS method appears to be a good choice for high-resolution fault detection in induction machines, with very high effectiveness in frequency and amplitude identification. However, it presents a high computational complexity, which hinders its implementation in real-time fault diagnosis. To avoid this problem, a Fast-ESPRIT algorithm combining the IIR band-pass filtering technique, the decimation technique, and the original ESPRIT-TLS method is employed to extract frequencies and their magnitudes accurately from the wind-turbine stator current at lower computational cost. The proposed algorithm addresses the wind-turbine need for online, fast, and proactive condition monitoring. This type of remote and periodic maintenance provides an acceptable machine lifetime, minimizes downtime, and maximizes productivity. The developed technique has been evaluated by computer simulations under many fault scenarios. The study results demonstrate the performance of Fast-ESPRIT, offering rapid, high-resolution harmonic recognition with minimal computation time and memory cost.
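
The Fast-ESPRIT idea, band-limit and decimate, then run the subspace estimator on the shorter record, can be sketched as follows (a least-squares rather than total-least-squares rotation solve, and SciPy's default decimator uses an IIR low-pass rather than the paper's band-pass; both are simplifications):

```python
import numpy as np
from scipy.signal import decimate

def esprit_freqs(x, n_exp, m=None):
    """Frequencies (cycles/sample) of n_exp complex exponentials in x via
    ESPRIT subspace rotation (LS variant; TLS differs slightly).
    Each real sinusoid contributes two complex exponentials."""
    x = np.asarray(x, dtype=float)
    m = m or len(x) // 2
    H = np.lib.stride_tricks.sliding_window_view(x, m).T  # data matrix
    U, _, _ = np.linalg.svd(H, full_matrices=False)
    Us = U[:, :n_exp]                                     # signal subspace
    phi = np.linalg.lstsq(Us[:-1], Us[1:], rcond=None)[0]
    w = np.angle(np.linalg.eigvals(phi))                  # rotation angles
    return np.sort(np.abs(w)) / (2 * np.pi)

# Band-limit + decimate by q, estimate, then rescale to the original rate
fs, q = 1000.0, 5
t = np.arange(2000) / fs
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 75 * t)
xd = decimate(x, q)                        # includes anti-alias IIR filtering
print(esprit_freqs(xd, n_exp=4) * fs / q)  # ~[50, 50, 75, 75] Hz
```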

Keywords: Spectral Estimation, ESPRIT-TLS, Real Time, Diagnosis, Wind Turbine Faults, Band-Pass Filtering, Decimation.

323 Banking Union: A New Step towards Completing the Economic and Monetary Union

Authors: Marijana Ivanov, Roman Šubić

Abstract:

This study analyzes the critical gaps in the architecture of European stability and the expected role of the banking union as the new important step towards completing the Economic and Monetary Union, which should enable the creation of a safe and sound financial sector for the euro-area market. The single rulebook, together with the Single Supervisory Mechanism and the Single Resolution Mechanism as the two main pillars of the banking union, should provide a consistent application of common rules and administrative standards for the supervision, recovery, and resolution of banks, with the final aim of replacing the former bail-out practice with a bail-in system, through which possible future bank failures would be resolved with banks' own funds, i.e., with minimal costs for taxpayers and the real economy. In this way, the vicious circle between banks and sovereigns would be broken. It would also reduce the financial fragmentation recorded in the years of crisis as the result of divergent behaviour in risk premia, lending activities, and interest rates between the core and the periphery. In addition, it should strengthen the effectiveness of monetary transmission channels, in particular the credit channels and the overflow of liquidity on the money market, which, due to the fragmentation of the common financial market, was significantly disabled during the crisis. However, contrary to all the positive expectations related to the future functioning of the banking union, the major findings of this study indicate that the characteristics of the economic system in which the banking union will operate should not be ignored. The euro area is an integration of strong and weak entities with large differences in economic development, wealth, banking-system assets, growth rates, and the accountability of fiscal policy. The analysis indicates that low and unbalanced economic growth remains a challenge for the maintenance of financial stability, and this problem cannot be resolved by single supervision alone. In many countries, bank assets exceed GDP by several times, and large banks are still a matter of concern because of their systemic importance for individual countries and the euro zone as a whole. The creation of the Single Supervisory Mechanism and the Single Resolution Mechanism is a response to the European crisis, which particularly affected peripheral countries and caused the associated loop between the banking crisis and the sovereign debt crisis, but also influenced banks' balance sheets in the core countries as the result of cross-border capital flows. The creation of the SSM and the SRM should prevent similar episodes from happening again and should also provide a new opportunity for strengthening the economic and financial systems of the peripheral countries. On the other hand, there is a potential threat that the future focus of the ECB, the resolution mechanism, and other relevant institutions will be oriented predominantly towards large and significant banks (half of which operate in the core and most important euro-area countries), and it therefore remains questionable to what extent the common resolution funds will be used for the rescue of less important institutions. Recent geopolitical developments will be the optimal indicator of whether the previously established mechanisms are sufficient to maintain adequate financial stability in the euro-area market.

Keywords: Banking Union, financial integration, single supervisory mechanism (SSM).

322 Order Partitioning in Hybrid MTS/MTO Contexts using Fuzzy ANP

Authors: H. Rafiei, M. Rabbani

Abstract:

The hybrid MTS/MTO production context is a novel concept to balance and trade off between make-to-stock and make-to-order. One of the most important decisions in the hybrid MTS/MTO environment is determining whether a product is manufactured to stock, to order, or under a hybrid MTS/MTO strategy. In this paper, a model based on the analytic network process is developed to tackle this decision. Since the decision deals with the uncertainty and ambiguity of data as well as experts' and managers' linguistic judgments, the proposed model is equipped with fuzzy sets theory. An important attribute of the model is its generality, due to the diverse decision factors, which are elicited from the literature and developed by the authors. Finally, the model is validated by application to a real case study, revealing how the proposed model can actually be implemented.

Keywords: Fuzzy analytic network process, Hybrid make-to-stock/make-to-order, Order partitioning, Production planning.

321 Texture Based Weed Detection Using Multi Resolution Combined Statistical and Spatial Frequency (MRCSF)

Authors: R.S.Sabeenian, V.Palanisamy

Abstract:

Texture classification is a prominent and appealing technology in the field of texture analysis. Textures, i.e., repeated patterns, have different frequency components along different orientations. Our work is based on texture classification and its applications, which are found in various fields such as medical image classification, computer vision, remote sensing, agriculture, and the textile industry. Weed control has a major effect on agriculture: a large amount of herbicide is used for controlling weeds in agricultural fields, lawns, golf courses, sports fields, etc. Random spraying of herbicides does not meet the exact requirements of the field, and certain areas in a field have more weed patches than estimated. We therefore need a visual system that can discriminate weeds from the field image, which would reduce or even eliminate the amount of herbicide used; this would allow farmers to use no herbicides or to apply them only where they are needed. A machine-vision precision automated weed control system could reduce the usage of chemicals in crop fields. In this paper, an intelligent system for an automatic weeding strategy, Multi Resolution Combined Statistical and Spatial Frequency (MRCSF), is used to discriminate weeds from crops and to classify them as narrow, little, or broad weeds.

Keywords: Crop-weed discrimination, MRCSF, MRFM, weed detection, spatial frequency.

320 Outlier Pulse Detection and Feature Extraction for Wrist Pulse Analysis

Authors: Bhaskar Thakker, Anoop Lal Vyas

Abstract:

Wrist pulse analysis for the identification of health status is found in ancient Indian as well as Chinese literature. Preprocessing of the wrist pulse is necessary to remove outlier pulses and fluctuations prior to the analysis of the pulse pressure signal. This paper discusses the identification of irregular pulses present in the pulse series and the intricacies associated with the extraction of time-domain pulse features. An approach based on Dynamic Time Warping (DTW) is utilized for the identification of outlier pulses in the wrist pulse series. The ambiguity present in the identification of pulse features is resolved with the help of the first derivative of the ensemble average of the wrist pulse series. An algorithm for detecting the tidal and dicrotic notch in individual wrist pulse segments is proposed.
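
A minimal DTW distance in Python, usable for the kind of outlier-pulse screening described above (the thresholding rule and test pulses are illustrative):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D pulse segments;
    a minimal O(len(a) * len(b)) sketch without windowing constraints."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Outlier screening: pulses whose DTW distance to a template exceeds a
# threshold (e.g., a multiple of the median distance) are flagged.
pulses = [np.sin(np.linspace(0, np.pi, 80 + k)) for k in (0, 2, 40)]
d = [dtw_distance(p, pulses[0]) for p in pulses]
```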

Keywords: Wrist Pulse Segment, Ensemble Average, Dynamic Time Warping (DTW), Pulse Similarity Vector.

319 Assessing the Theoretical Suitability of Sentinel-2 and WorldView-3 Data for Hydrocarbon Mapping of Spill Events, Using HYSS

Authors: K. Tunde Olagunju, C. Scott Allen, F.D. (Freek) van der Meer

Abstract:

Identification of hydrocarbon oil in remote sensing images is often the first step in monitoring oil during spill events. Most remote sensing methods adopt techniques for hydrocarbon identification to achieve detection, in order to plan an appropriate cleanup program. Identification on optical sensors allows not only detection but also characterization and quantification. Until recently, in optical remote sensing, quantification and characterization were only potentially possible using high-resolution laboratory and airborne imaging spectrometers (hyperspectral data). Unlike multispectral data, hyperspectral data are not freely available, as this data category is at present mainly obtained via airborne survey. In this research, two operational high-resolution multispectral satellites (WorldView-3 and Sentinel-2) are theoretically assessed for their suitability for hydrocarbon characterization, using the Hydrocarbon Spectra Slope model (HYSS). This method utilizes the two most persistent hydrocarbon diagnostic/absorption features, at 1.73 µm and 2.30 µm, for hydrocarbon mapping on multispectral data. Spectral measurements of seven different hydrocarbon oils (crude and refined) taken on 10 different substrates with a laboratory ASD FieldSpec were convolved to Sentinel-2 and WorldView-3 resolution using their full width at half maximum (FWHM) parameters. The resulting hydrocarbon slope values obtained from the studied samples enable clear qualitative discrimination of most hydrocarbons, despite the presence of different background substrates, particularly on WorldView-3. Due to the close conformity of central wavelengths and narrow bandwidths to the key hydrocarbon bands used in HYSS, the statistical significance for qualitative analysis on WorldView-3 returned with a 95% confidence level (P-value ˂ 0.01) for all studied hydrocarbon oils except Diesel. Using multifactor analysis of variance (MANOVA), the discriminating power of HYSS is statistically significant for most hydrocarbon-substrate combinations on Sentinel-2 and WorldView-3 FWHM, revealing the potential of these two operational multispectral sensors as rapid-response tools for hydrocarbon mapping. One notable exception is highly transmissive hydrocarbons on Sentinel-2 data, due to the non-conformity of its spectral bands with the key hydrocarbon absorptions and its relatively coarse bandwidth (> 100 nm).
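
Convolving a laboratory spectrum to a broad multispectral band, given the band's center and FWHM, reduces to weighting by a Gaussian spectral response; a minimal sketch (the band centers and FWHMs below are placeholders, not the two sensors' published values):

```python
import numpy as np

def convolve_to_band(wl_nm, refl, center_nm, fwhm_nm):
    """Resample a lab reflectance spectrum to one broad band by weighting
    with a Gaussian spectral response of the given center and FWHM."""
    sigma = fwhm_nm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    srf = np.exp(-0.5 * ((wl_nm - center_nm) / sigma) ** 2)
    return np.trapz(srf * refl, wl_nm) / np.trapz(srf, wl_nm)

# Hydrocarbon slope between the ~1730 nm and ~2300 nm diagnostic features
wl = np.arange(350, 2501, 1.0)
refl = np.full_like(wl, 0.3)            # stand-in spectrum
r1 = convolve_to_band(wl, refl, 1730.0, 40.0)   # placeholder band params
r2 = convolve_to_band(wl, refl, 2300.0, 45.0)
slope = (r2 - r1) / (2300.0 - 1730.0)
```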

Keywords: hydrocarbon, oil spill, remote sensing, hyperspectral, multispectral, hydrocarbon–substrate combination, Sentinel-2, WorldView-3

318 In-Flight Radiometric Performances Analysis of an Airborne Optical Payload

Authors: Caixia Gao, Chuanrong Li, Lingli Tang, Lingling Ma, Yaokai Liu, Xinhong Wang, Yongsheng Zhou

Abstract:

Performance analysis of a remote sensing sensor is required to pursue a range of scientific research and application objectives. Laboratory analysis of any remote sensing instrument is essential, but not sufficient to establish its validity in flight. In this study, with the aid of in situ measurements and the corresponding image of a three-grayscale permanent artificial target, the in-flight radiometric performance analyses (in-flight radiometric calibration, dynamic range and response linearity, signal-to-noise ratio (SNR), radiometric resolution) of a self-developed short-wave infrared (SWIR) camera are performed. To acquire the in-flight calibration coefficients of the SWIR camera, the at-sensor radiances (Li) for the artificial targets are first simulated from in situ measurements (atmospheric parameters and spectral reflectance of the target) and viewing geometries using the MODTRAN model. With these radiances and the corresponding digital numbers (DN) in the image, a straight line with the formulation L = G × DN + B is fitted by a minimization regression method, and the fitted coefficients G and B are the in-flight calibration coefficients. The high point (LH) and the low point (LL) of the dynamic range can then be described as LH = G × DNH + B and LL = B, respectively, where DNH is equal to 2^n − 1 (n being the quantization number of the payload). Meanwhile, the sensor's response linearity (δ) is described by the correlation coefficient of the regressed line. The results show that the calibration coefficients G and B are 0.0083 W·sr−1m−2µm−1 and −3.5 W·sr−1m−2µm−1; the low point of the dynamic range is −3.5 W·sr−1m−2µm−1 and the high point is 30.5 W·sr−1m−2µm−1; the response linearity is approximately 99%. Furthermore, an SNR normalization method is used to assess the sensor's SNR: the normalized SNR is about 59.6 when the mean value of radiance is equal to 11.0 W·sr−1m−2µm−1; subsequently, the radiometric resolution is calculated to be about 0.1845 W·sr−1m−2µm−1. Moreover, to validate the result, a comparison of the measured radiance with radiative-transfer-code predictions over four portable artificial targets with reflectances of 20%, 30%, 40%, and 50%, respectively, is performed. The relative error of the calibration is within 6.6%.
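
The calibration fit L = G × DN + B is an ordinary straight-line regression; a minimal sketch with illustrative numbers (not the paper's measurements; the quantization bit depth n is an assumption):

```python
import numpy as np

# Simulated at-sensor radiances (e.g., from MODTRAN) regressed against
# the image digital numbers of the three grayscale targets.
dn = np.array([410.0, 1190.0, 2630.0])        # illustrative DN values
L = np.array([0.9, 6.1, 18.3])                # illustrative radiances

G, B = np.polyfit(dn, L, 1)                   # calibration coefficients
linearity = np.corrcoef(dn, L)[0, 1]          # response linearity (delta)

n = 14                                        # assumed quantization bits
L_low, L_high = B, G * (2 ** n - 1) + B       # dynamic-range endpoints
```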

Keywords: Calibration, dynamic range, radiometric resolution, SNR.

317 Multivariate Analytical Insights into Spatial and Temporal Variation in Water Quality of a Major Drinking Water Reservoir

Authors: Azadeh Golshan, Craig Evans, Phillip Geary, Abigail Morrow, Zoe Rogers, Marcel Maeder

Abstract:

Twenty-two physicochemical variables were determined in water samples collected weekly from January to December 2013 at three sampling stations located within a major drinking water reservoir. Classical Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) analysis was used to investigate the environmental factors associated with the physicochemical variability of the water samples at each of the sampling stations. Matrix-augmentation MCR-ALS (MA-MCR-ALS) was also applied, and the two sets of results were compared for interpretative clarity. Links between these factors, reservoir inflows, and catchment land uses were investigated and interpreted in relation to the chemical composition of the water and the resolved geographical distribution profiles. The results suggested that the major factors affecting reservoir water quality were those associated with agricultural runoff, with evidence of influence on algal photosynthesis within the water column. Water quality variability within the reservoir was also found to be strongly linked to physical parameters such as water temperature and the occurrence of thermal stratification. The two methods applied (MCR-ALS and MA-MCR-ALS) led to similar conclusions; however, MA-MCR-ALS appeared to provide results more amenable to the interpretation of temporal and geographical variation than those obtained through classical MCR-ALS.
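
A bare-bones MCR-ALS iteration, and the row-wise matrix augmentation that distinguishes MA-MCR-ALS, can be sketched as follows (clipping is a crude stand-in for the non-negative least squares and closure constraints used in practice):

```python
import numpy as np

def mcr_als(D, n_comp, n_iter=200, seed=0):
    """Bare-bones MCR-ALS: factor D (samples x variables) into C @ S.T with
    non-negative concentration (C) and component (S) profiles."""
    rng = np.random.default_rng(seed)
    S = rng.random((D.shape[1], n_comp))         # initial component profiles
    for _ in range(n_iter):
        C = np.clip(D @ S @ np.linalg.pinv(S.T @ S), 0.0, None)
        S = np.clip(D.T @ C @ np.linalg.pinv(C.T @ C), 0.0, None)
    return C, S                                  # D is approximated by C @ S.T

# Matrix augmentation (MA-MCR-ALS): stack the three stations' data blocks
# row-wise so that all stations share one set of component profiles S.
D1, D2, D3 = (np.abs(np.random.randn(52, 22)) for _ in range(3))
C_aug, S = mcr_als(np.vstack([D1, D2, D3]), n_comp=3)
```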

Keywords: Catchment management, drinking water reservoir, multivariate curve resolution alternating least squares, thermal stratification, water quality.

316 Virtual Conciliation in Colombia: Evaluation of Maturity Level within the Framework of E-Government

Authors: Jenny Paola Forero Pachón, Sonia Cristina Gamboa Sarmiento, Luis Carlos Gómez Flórez

Abstract:

The Colombian government has defined an e-government strategy to take advantage of Information Technologies (IT) in order to contribute to building a more efficient, transparent, and participative State that provides better services to citizens and businesses. In this regard, the justice sector is one of the government sectors where IT has generated the most expectation, considering that the country has a backlog of judicial processes. This situation has led to the search for alternative forms of access to justice that speed up the process while keeping costs low for citizens. To this end, the Colombian government has authorized the use of Alternative Dispute Resolution (ADR) methods, a remedy whereby disputes can be resolved more quickly than through judicial processes while facilitating greater communication between the parties, without recourse to judicial authority. One of these methods is conciliation, which includes a special modality, known as virtual conciliation, that takes advantage of IT: the conciliation is supported by information systems, applications, or platforms, and communications are conducted through them. This paper evaluates the maturity level of the virtual conciliation service within the framework of this strategy. The evaluation is carried out using Shahkooh's five-phase model for e-government. As a result, it is evident that, in the context of conciliation, maturity does not reach the level in the model necessary for it to be considered virtual conciliation; therefore, it is necessary to define strategies to maximize the potential of IT in this context.

Keywords: Alternative dispute resolution, e-government, evaluation of maturity, Shahkooh model, virtual conciliation.

315 Portfolio Management: A Fuzzy Set Based Approach to Monitoring Size to Maximize Return and Minimize Risk

Authors: Margaret F. Shipley

Abstract:

Fuzzy logic can be used when knowledge is incomplete or when ambiguity of data exists. The purpose of this paper is to propose a proactive fuzzy set-based model for reacting to the risk inherent in investment activities relative to a complete view of portfolio management. Fuzzy rules are given where, depending on the antecedents, the portfolio size may be slightly or significantly decreased or increased. The decision maker considers acceptable bounds on the proportion of acceptable risk and return. The fuzzy controller model allows learning to be achieved as: 1) the firing strength of each rule is measured, 2) fuzzy output allows the rules to be updated, and 3) new actions are recommended as the system continues to loop. An extension to the fuzzy controller is given that evaluates potential financial loss before adjusting the portfolio. An application is presented that illustrates the algorithm and the extension developed in the paper.
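
One loop of such a rule base, measure firing strengths, then defuzzify into a size adjustment, can be sketched as follows (the membership parameters and consequent actions are illustrative assumptions, not the paper's rules):

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

risk, ret = 0.62, 0.35                    # current risk/return readings
rules = [
    # (firing strength of the antecedent, consequent size action)
    (min(tri(risk, 0.4, 0.7, 1.0), tri(ret, 0.0, 0.2, 0.5)), -0.10),  # high risk, low return -> decrease
    (min(tri(risk, 0.0, 0.3, 0.6), tri(ret, 0.3, 0.6, 1.0)), +0.10),  # low risk, high return -> increase
    (min(tri(risk, 0.2, 0.5, 0.8), tri(ret, 0.2, 0.5, 0.8)), 0.00),   # moderate -> hold
]
w = np.array([f for f, _ in rules])
a = np.array([act for _, act in rules])
size_change = float(np.sum(w * a) / np.sum(w))   # weighted-average defuzzification
```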

Keywords: Portfolio management, financial market monitoring, fuzzy controller, fuzzy logic.

314 Scale Time Offset Robust Modulation (STORM) in a Code Division Multiaccess Environment

Authors: David M. Jenkins Jr.

Abstract:

Scale Time Offset Robust Modulation (STORM) [1]–[3] is a high-bandwidth waveform design that adds time scale to embedded reference modulations using only time delay [4]. In an environment where each user has a specific delay and scale, identification of the user with the highest signal power, and of that user's phase, is facilitated by the STORM processor; both of these parameters are required in an efficient multiuser detection algorithm. In this paper, the STORM modulation approach is evaluated with a direct-sequence spread quadrature phase shift keying (DS-QPSK) system. A misconception about STORM time-scale modulation is that a fine temporal resolution is required at the receiver. STORM is applied to a QPSK code division multiaccess (CDMA) system by modifying the spreading codes: specifically, the in-phase code uses a typical spreading code, and the quadrature code uses a time-delayed and time-scaled version of the in-phase code. Consequently, the same temporal resolution is required in the receiver before and after the application of STORM. The bit error performance of STORM in a synchronous CDMA system is evaluated and compared to theory, and the bit error performance of STORM incorporated in a single-user WCDMA downlink is presented to demonstrate the applicability of STORM in a modern communication system.
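
The code construction described above, a quadrature code that is a delayed, scaled replica of the in-phase code, can be sketched as follows (the delay, scale, code length, and resampling scheme are illustrative, not the paper's parameters):

```python
import numpy as np

rng = np.random.default_rng(1)
n_chips = 256
c_i = rng.choice([-1.0, 1.0], size=n_chips)      # in-phase PN spreading code

delay, scale = 3.0, 1.02                          # chips, dimensionless
t = np.arange(n_chips)
c_q = np.interp((t - delay) / scale, t, c_i,      # time-delayed, time-scaled replica
                left=0.0, right=0.0)
c_q = np.where(c_q >= 0.0, 1.0, -1.0)             # re-quantize to chip values
# (edge samples outside the replica default to +1 here; a sketch choice)

# QPSK spreading of one complex data symbol d:
d = (1 + 1j) / np.sqrt(2)
baseband = d.real * c_i + 1j * d.imag * c_q
```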

Keywords: Pseudonoise coded communication, Cyclic codes, Code division multiaccess

313 Influence of Calcium Intake Level on Osteoporotic Vertebral Bone and Degenerated Disc in a Biomechanical Study

Authors: Dae Gon Woo, Ji Hyung Park, Chi Hoon Kim, Tae Woo Lee, Beob Yi Lee, Han Sung Kim

Abstract:

The aim of the present study is to analyze the development of osteoporotic vertebral bone induced by a lack of calcium during the growth period and to analyze its effects on disc degeneration, based on a biomechanical and histomorphometrical study. The mechanical and histomorphological characteristics of the lumbar vertebral bones and discs of rats on a calcium-free diet (CFD) were detected and tracked by using high-resolution in-vivo micro-computed tomography (in-vivo micro-CT), finite element (FE) analysis, and histological analysis. Twenty female Sprague-Dawley rats (6 weeks old, approximate weight 170 g) were randomly divided into two groups (CFD group: 10, NOR group: 10). The CFD group was maintained on a refined calcium-controlled semisynthetic diet without added calcium, to induce osteoporosis. All lumbar vertebrae (L1–L6) were scanned by in-vivo micro-CT at 35 µm resolution at 0, 4, and 8 weeks to track the effects of the CFD on the development of osteoporosis. The findings of the present study indicated that calcium insufficiency was the main factor in the development of osteoporosis and that it induced lumbar vertebral disc degeneration. This study is the first to evaluate osteoporotic vertebral bone and disc degeneration induced by a lack of calcium during the growth period from a biomechanical and histomorphometrical point of view.

Keywords: Calcium free diet, Disc degeneration, Osteoporosis, in-vivo micro-CT, Finite element analysis, Histology.

312 2D Validation of a High-Order Adaptive Cartesian-Grid Finite-Volume Characteristic-Flux Model with Embedded Boundaries

Authors: C. Leroy, G. Oger, D. Le Touzé, B. Alessandrini

Abstract:

A finite volume method based on characteristic fluxes for compressible fluids is developed. An explicit cell-centered resolution is adopted, where second- and third-order accuracy is provided by using two different MUSCL schemes with minmod, Sweby, or Superbee limiters for the hyperbolic part. A few different time integrators are used and are described in this paper. Resolution is performed on a generic unstructured Cartesian grid, where solid boundaries are handled by a cut-cell method. Interfaces are explicitly advected in a non-diffusive way, ensuring local mass conservation. An improved cell cutting has been developed to handle boundaries of arbitrary geometrical complexity: instead of using a polygon clipping algorithm, we use the voxel traversal algorithm coupled with a local flood-fill scanline to intersect 2D or 3D boundary surface meshes with the fixed Cartesian grid. The small-cell stability problem near the boundaries is solved using a fully conservative merging method. Inflow and outflow conditions are also implemented in the model. The solver is validated on 2D academic test cases, such as the flow past a cylinder; these test cases are performed both in the frame of the body and in a fixed frame where the body moves across the mesh. The adaptive Cartesian grid is provided by Paramesh, without complex geometries for the moment.
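
The minmod-limited MUSCL reconstruction named above can be sketched in one dimension as follows (uniform grid assumed; the paper's solver is 2D/3D and characteristic-flux based):

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: zero at extrema, the smaller slope otherwise."""
    return np.where(a * b <= 0.0, 0.0,
                    np.where(np.abs(a) < np.abs(b), a, b))

def muscl_faces(u):
    """Second-order MUSCL reconstruction at the faces of the interior cells
    of a 1-D array u of cell averages; a minimal sketch."""
    du_l = u[1:-1] - u[:-2]                  # backward differences
    du_r = u[2:] - u[1:-1]                   # forward differences
    slope = minmod(du_l, du_r)               # limited slope per cell
    u_right_face = u[1:-1] + 0.5 * slope     # state at each cell's right face
    u_left_face = u[1:-1] - 0.5 * slope      # state at each cell's left face
    return u_left_face, u_right_face

u = np.array([0.0, 0.0, 0.2, 0.9, 1.0, 1.0])
uL, uR = muscl_faces(u)                      # feed these to a Riemann solver
```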

Keywords: Finite volume method, cartesian grid, compressible solver, complex geometries, Paramesh.

311 Overcrowding and Adequate Housing: The Potential of Adaptability

Authors: Inês Ramalhete, Hugo Farias, Rui da Silva Pinto

Abstract:

Adequate housing has been a widely discussed theme in academic circles related to low-cost housing. Whereas its physical features are easy to deal with, overcrowding (related to social, cultural, and economic aspects) is still ambiguous, particularly regarding the set of indicators that can accurately reflect and measure it. This paper develops research on low-cost housing models for developing countries and on the best method to embed overcrowding as an important parameter for adaptability. A critical review of international overcrowding indicators and their application in two developing countries, Cape Verde and Angola, is presented. The several rationales for, and the constraints on, an accurate assessment of overcrowding are considered, namely baseline data (statistics), which can induce misjudgments, as well as social and cultural factors (such as the personal choices of residents). This paper proposes a way to tackle overcrowding through housing adaptability, considering factors such as physical flexibility, functional ambiguity, and incremental expansion schemes. Moreover, a case study is presented to establish a framework for the theoretical application of the proposed approach.

Keywords: Adaptive housing, low-cost housing, overcrowding.

310 Teacher Training Course: Conflict Resolution through Mediation

Authors: Csilla M. Szabó

Abstract:

In Hungary, society has changed a great deal over the past 25 years, and these changes can be detected in educational situations as well. The number and intensity of conflicts have increased in most fields of life, including schools, and teachers have difficulty handling school conflicts. What is more, the new net generation, Generation Z, has values and behavioural patterns different from those of the previous generation, which might generate more serious conflicts at school, especially with teachers who were socialized mainly in a traditional teacher–student relationship. In Hungary, Act CCIV of 2011 declared the foundation of Institutes of Teacher Training in higher education institutions. One of the tasks of these Institutes is to survey the competences and needs of teachers working in public education and to provide further training and services for them according to their needs and requirements; this work is supported by the Social Renewal Operative Program 4.1.2.B. The professors of a college administered a questionnaire and surveyed the needs and requirements of teachers working in the region. Based on the results, the professors of the Institute of Teacher Training decided to meet these requirements and to launch short further-training courses for teachers in spring 2015. One of the courses focuses on school conflict management through mediation. The aim of the pilot course is to provide conflict management techniques for teachers and to present different mediation techniques to them. The theoretical part of the course (5 hours) will enable participants to understand the main points and advantages of mediation, while the practical part (10 hours) will involve teachers in role plays to learn how to cope with conflict situations by applying mediation. We hope that if conflicts can be reduced, the school atmosphere will be influenced in a positive way, and the teaching-learning process will become more successful and effective.

Keywords: Conflict resolution, generation Z, mediation, teacher training.
