Search results for: Adult images filtering
899 Recursive Filter for Coastal Displacement Estimation
Authors: Efstratios Doukakis, Nikolaos Petrelis
Abstract:
All climate models agree that the temperature in Greece will increase by 1° to 2°C by the year 2030, and the mean sea level in the Mediterranean is expected to rise at a rate of 5 cm/decade. The aim of the present paper is the estimation of coastline displacement driven by climate change and sea level rise. To achieve this, all known statistical and non-statistical computational methods are employed on several Greek coastal areas. Furthermore, Kalman filtering techniques are introduced, formulated and tested for the first time. Based on all the above, shoreline change signals and noises are computed, and an inter-comparison between the different methods helps evaluate which method is most promising for retrieving the shoreline change rate.
Keywords: Climate Change, Coastal Displacement, Kalman Filter
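As a point of reference, a scalar constant-rate Kalman filter over yearly shoreline positions could be prototyped as below; this is a minimal sketch with made-up noise settings and synthetic observations, not the authors' formulation.

```python
import numpy as np

def kalman_shoreline(positions, dt=1.0, q=1e-3, r=4.0):
    """Minimal constant-rate Kalman filter for noisy shoreline positions.

    positions : observed cross-shore positions (m), one per survey epoch
    dt        : time step between surveys (years)
    q, r      : assumed process and measurement noise variances
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (position, rate)
    H = np.array([[1.0, 0.0]])                 # only the position is observed
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    R = np.array([[r]])

    x = np.array([positions[0], 0.0])          # initial state
    P = np.eye(2) * 10.0                       # initial covariance
    estimates = []
    for z in positions:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        y = z - H @ x                          # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)                 # columns: position, displacement rate

# Example: noisy observations of a coast retreating at 0.5 m/yr
obs = 100 - 0.5 * np.arange(20) + np.random.normal(0, 2, 20)
print(kalman_shoreline(obs)[-1])               # final [position, rate] estimate
```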
898 Use of Fuzzy Edge Image in Block Truncation Coding for Image Compression
Authors: Amarunnishad T.M., Govindan V.K., Abraham T. Mathew
Abstract:
An image compression method has been developed using a fuzzy edge image with the basic Block Truncation Coding (BTC) algorithm. The fuzzy edge image has been validated against classical edge detectors on the basis of the results of the well-known Canny edge detector prior to its use in the proposed method. The bit plane generated by the conventional BTC method is replaced with a fuzzy bit plane generated by the logical OR of the fuzzy edge image and the corresponding conventional BTC bit plane. The input image is encoded with the block mean, the block standard deviation and the fuzzy bit plane. The proposed method has been tested with 8 bits/pixel test images of size 512×512 and found to be superior, with better Peak Signal to Noise Ratio (PSNR), compared to the conventional BTC and adaptive bit plane selection BTC (ABTC) methods. The raggedness, jagged appearance and ringing artifacts at sharp edges are greatly reduced in images reconstructed by the proposed method with the fuzzy bit plane.
Keywords: Image compression, edge detection, ground truth image, peak signal to noise ratio
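The BTC encode/decode step that the abstract builds on can be sketched as follows; the edge bit plane is passed in as an optional argument to mimic the OR step, and the 4×4 block values are illustrative.

```python
import numpy as np

def btc_encode_block(block, edge_bits=None):
    """Encode one image block with BTC; optionally OR an edge bit plane in.

    block     : 2-D uint8 array (e.g. 4x4)
    edge_bits : optional boolean array of the same shape (edge bit plane)
    """
    mean, std = block.mean(), block.std()
    bits = block > mean                        # conventional BTC bit plane
    if edge_bits is not None:                  # edge/fuzzy bit plane, as in the paper's idea
        bits = bits | edge_bits
    return mean, std, bits

def btc_decode_block(mean, std, bits):
    """Reconstruct a block from its mean, standard deviation and bit plane."""
    n, q = bits.size, int(bits.sum())
    if q in (0, n):                            # flat block: every pixel gets the mean
        return np.full(bits.shape, mean)
    lo = mean - std * np.sqrt(q / (n - q))     # level assigned to 0-bits
    hi = mean + std * np.sqrt((n - q) / q)     # level assigned to 1-bits
    return np.where(bits, hi, lo)

block = np.array([[12, 200, 198, 11],
                  [14, 210, 205, 10],
                  [13, 201, 199, 12],
                  [15, 205, 202, 14]], dtype=np.uint8)
m, s, b = btc_encode_block(block)
print(np.round(btc_decode_block(m, s, b)))
```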
897 Methods of Geodesic Distance in Two-Dimensional Face Recognition
Authors: Rachid Ahdid, Said Safi, Bouzid Manaut
Abstract:
In this paper, we present a comparative study of three methods for 2D face recognition: Iso-Geodesic Curves (IGC), Geodesic Distance (GD) and Geodesic-Intensity Histogram (GIH). These approaches are based on computing geodesic distances between points of the facial surface and between facial curves. In this study, the gray-level image is represented as a 2D surface in a 3D space, with the third coordinate proportional to the pixel intensity values. In the classification step, we use Neural Networks (NN), K-Nearest Neighbor (KNN) and Support Vector Machines (SVM). The images used in our experiments come from two well-known face image databases, ORL and YaleB. The ORL database was used to evaluate the performance of the methods under varying pose and sample size, and the YaleB database was used to examine the performance of the systems under varying facial expressions and lighting.
Keywords: 2D face recognition, Geodesic distance, Iso-Geodesic Curves, Geodesic-Intensity Histogram, facial surface, Neural Networks, K-Nearest Neighbor, Support Vector Machines.
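A geodesic distance on the intensity surface, the quantity these descriptors are built from, can be approximated with Dijkstra's algorithm on the pixel grid; the sketch below treats the gray-level image as a surface z = scale·I(x, y) and is only an illustration of the idea, not the authors' implementation.

```python
import heapq
import numpy as np

def geodesic_distance(image, start, scale=1.0):
    """Approximate geodesic distances on the intensity surface of a grayscale image.

    The cost of moving to a 4-neighbour is the 3-D Euclidean step length
    sqrt(1 + (scale * dI)^2). Returns distances from `start` to every pixel.
    """
    h, w = image.shape
    dist = np.full((h, w), np.inf)
    dist[start] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                step = np.hypot(1.0, scale * (float(image[nr, nc]) - float(image[r, c])))
                nd = d + step
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return dist

face = np.random.randint(0, 256, (64, 64)).astype(np.float32)  # stand-in for a face image
d = geodesic_distance(face, (32, 32), scale=0.05)
print(d.max())
```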
896 Pre-Analysis of Printed Circuit Boards Based On Multispectral Imaging for Vision Based Recognition of Electronics Waste
Authors: Florian Kleber, Martin Kampel
Abstract:
The increasing demand for gallium, indium and rare-earth elements for the production of electronics, e.g. solid-state lighting, photovoltaics, integrated circuits, and liquid crystal displays, will exceed the worldwide supply according to current forecasts. Recycling systems to reclaim these materials are not yet in place, which challenges the sustainability of these technologies. This paper proposes a multispectral imaging system as the basis for a vision-based recognition system for valuable components of electronics waste. Multispectral images are intended to enhance the contrast of images of printed circuit boards (single components as well as labels) for further analysis, such as optical character recognition and entire printed circuit board recognition. The results show that a higher contrast is achieved in the near infrared compared to ultraviolet and visible light.
Keywords: Electronic Waste, Recycling, Multispectral Imaging, Printed Circuit Boards, Rare-Earth Elements.
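The band-selection idea, picking the spectral band with the best contrast for further analysis, might be prototyped as below; RMS contrast is used as a simple proxy measure, and the UV/VIS/NIR arrays are random stand-ins for registered band images of the same PCB scene.

```python
import numpy as np

def rms_contrast(band):
    """RMS contrast of one spectral band, normalised to [0, 1]."""
    b = band.astype(np.float64) / 255.0
    return b.std()

def best_band(bands, names):
    """Pick the spectral band with the highest contrast (simple proxy measure)."""
    scores = {name: rms_contrast(b) for name, b in zip(names, bands)}
    return max(scores, key=scores.get), scores

# Stand-in data: three bands of the same PCB scene (UV, visible, NIR)
rng = np.random.default_rng(0)
uv, vis, nir = (rng.integers(0, 256, (128, 128)) for _ in range(3))
print(best_band([uv, vis, nir], ["UV", "VIS", "NIR"]))
```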
895 The Statistical Properties of Filtered Signals
Authors: Ephraim Gower, Thato Tsalaile, Monageng Kgwadi, Malcolm Hawksford.
Abstract:
In this paper, the statistical properties of filtered or convolved signals are considered by deriving the resulting density functions as well as the exact mean and variance expressions, given prior knowledge of the statistics of the individual signals in the filtering or convolution process. It is shown that the density function after linear convolution is a mixture density, where the number of density components is equal to the number of observations of the shortest signal. For circular convolution, the observed samples are characterized by a single density function, which is a sum of products.
Keywords: Circular Convolution, linear Convolution, mixture density function.
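For independent, identically distributed input samples, a fully overlapped output sample of a linear convolution has mean μ·Σh and variance σ²·Σh²; the short Monte Carlo below verifies this standard result with made-up filter taps and input statistics (the paper's own derivations go further, to the full density functions).

```python
import numpy as np

rng = np.random.default_rng(1)
h = np.array([0.5, 0.3, 0.2])                 # filter impulse response
mu, sigma = 2.0, 1.5                          # statistics of the input samples

# Monte Carlo: convolve many independent realisations and look at one
# fully-overlapped output sample.
trials = 20000
x = rng.normal(mu, sigma, size=(trials, 50))
y = np.array([np.convolve(xi, h)[25] for xi in x])

print("empirical mean/var  :", y.mean(), y.var())
print("theoretical mean/var:", mu * h.sum(), sigma**2 * np.sum(h**2))
```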
894 Electric Load Forecasting Using Genetic Based Algorithm, Optimal Filter Estimator and Least Error Squares Technique: Comparative Study
Authors: Khaled M. EL-Naggar, Khaled A. AL-Rumaih
Abstract:
This paper presents a performance comparison of three estimation techniques used for peak load forecasting in power systems. The three optimal estimation techniques are genetic algorithms (GA), least error squares (LS) and least absolute value filtering (LAVF). The problem is formulated as an estimation problem, and different forecasting models are considered. Actual recorded data are used to perform the study, and the performance of the three optimal estimation techniques is examined. The advantages of each algorithm are reported and discussed.
Keywords: Forecasting, Least error squares, Least absolute Value, Genetic algorithms
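A least error squares fit of a simple linear growth model, one of the three techniques compared, could look like this; the load values are illustrative only.

```python
import numpy as np

# Historical weekly peak loads (MW) -- illustrative numbers only
load = np.array([610, 625, 618, 640, 655, 649, 670, 688, 680, 702, 715, 709])
t = np.arange(len(load))

# Least error squares fit of a linear growth model: load = a + b * t
A = np.column_stack([np.ones_like(t), t])
(a, b), *_ = np.linalg.lstsq(A, load, rcond=None)

t_next = len(load)
print(f"estimated growth: {b:.2f} MW/week, forecast next peak: {a + b * t_next:.1f} MW")
```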
893 New Mitigating Technique to Overcome DDOS Attack
Authors: V. Praveena, N. Kiruthika
Abstract:
In this paper, we explore a new scheme for filtering spoofed packets (DDOS attack) which combines the path fingerprint and client puzzle concepts. In this scheme, each IP packet has a unique fingerprint embedded that represents the route the packet has traversed. The server maintains a mapping table which contains the client IP address and its corresponding fingerprint. A client puzzle is placed at the ingress router: for each request, the puzzle issuer provides a puzzle which the source has to solve. Our design has the following advantages over prior approaches: 1) network traffic is reduced, since the client puzzle is placed at the ingress router; 2) the mapping table at the server is lightweight and moderate in size.
Keywords: Client puzzle, DDOS attack, Egress, Ingress, IP Spoofing, Spoofed Packet.
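The client-puzzle part of the scheme is essentially a proof-of-work exchange; a hash-based sketch (hypothetical helper names, SHA-256 with a leading-zero-bit target) is shown below.

```python
import hashlib
import os

def issue_puzzle(difficulty_bits=20):
    """Issue a fresh puzzle: a random challenge plus a difficulty level."""
    return os.urandom(16), difficulty_bits

def solve_puzzle(challenge, difficulty_bits):
    """Brute-force a nonce so that SHA-256(challenge || nonce) has the required
    number of leading zero bits -- the work the requesting client must do."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge, difficulty_bits, nonce):
    """Cheap check performed at the ingress router / server side."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

challenge, bits = issue_puzzle(16)       # 16 bits keeps the demo fast
nonce = solve_puzzle(challenge, bits)
print("puzzle solved:", verify(challenge, bits, nonce))
```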
892 Exploiting Global Self Similarity for Head-Shoulder Detection
Authors: Lae-Jeong Park, Jung-Ho Moon
Abstract:
People detection from images has a variety of applications, such as video surveillance and driver assistance systems, but it is still a challenging task and is more difficult in crowded environments such as shopping malls, in which occlusion of the lower parts of the human body often occurs. The lack of full-body information requires more effective features than common features such as HOG. In this paper, new features are introduced that exploit the global self-symmetry (GSS) characteristic of head-shoulder patterns. The features encode the similarity or difference of color histograms and oriented gradient histograms between two vertically symmetric blocks. The domain-specific features are rapid to compute from integral images in the Viola-Jones cascade-of-rejecters framework. The proposed features are evaluated on our own head-shoulder dataset that, in part, consists of the well-known INRIA pedestrian dataset. Experimental results show that the GSS features are effective in marginally reducing false alarms, and that the gradient GSS features are preferred more often than the color GSS ones in the feature selection.
Keywords: Pedestrian detection, cascade of rejecters, feature extraction, self-symmetry, HOG.
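One GSS-style feature, the similarity of histograms taken from two blocks mirrored about the window's vertical centre axis, might be computed as below; histogram intersection on a grayscale stand-in window is used for illustration.

```python
import numpy as np

def block_histogram(block, bins=16):
    """Normalised grayscale histogram of an image block."""
    hist, _ = np.histogram(block, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)

def gss_feature(window, x_left, y, size):
    """Similarity between a block and its mirror block about the window's
    vertical centre axis (histogram intersection, 1 = identical)."""
    h, w = window.shape
    x_right = w - x_left - size                     # mirrored block position
    left = window[y:y + size, x_left:x_left + size]
    right = window[y:y + size, x_right:x_right + size]
    right = right[:, ::-1]                          # mirror horizontally
    return np.minimum(block_histogram(left), block_histogram(right)).sum()

# Stand-in 64x64 detection window
win = np.random.randint(0, 256, (64, 64))
print(gss_feature(win, x_left=8, y=8, size=16))
```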
891 Degraded Document Analysis and Extraction of Original Text Document: An Approach without Optical Character Recognition
Authors: L. Hamsaveni, Navya Prakash, Suresha
Abstract:
Document Image Analysis recognizes text and graphics in documents acquired as images. An approach without Optical Character Recognition (OCR) for degraded document image analysis has been adopted in this paper. The technique involves document imaging methods such as image fusing and Speeded Up Robust Features (SURF) detection to identify and extract the degraded regions from a set of document images, in order to obtain an original document with complete information. If the captured degraded document image is skewed, it has to be straightened (deskewed) before further processing. A special image storage format known as YCbCr is used as a tool to convert the grayscale image to the RGB image format. The presented algorithm is tested on various types of degraded documents, such as printed documents, handwritten documents, old script documents and handwritten image sketches in documents. The purpose of this research is to obtain an original document for a given set of degraded documents of the same source.
Keywords: Grayscale image format, image fusing, SURF detection, YCbCr image format.
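Keypoint detection and matching between two scans, the role SURF plays in the pipeline, can be sketched with OpenCV; note that SURF itself lives in the non-free opencv-contrib module (cv2.xfeatures2d.SURF_create), so ORB is used here as a freely available stand-in, and the two "pages" are random placeholders.

```python
import cv2
import numpy as np

def match_document_regions(img_a, img_b, max_matches=50):
    """Detect and match keypoints between two images of the same document."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    return kp_a, kp_b, matches[:max_matches]

# Stand-in "degraded" and "reference" pages (random noise here, real scans in practice)
page_a = np.random.randint(0, 256, (400, 300), dtype=np.uint8)
page_b = cv2.GaussianBlur(page_a, (5, 5), 0)
_, _, good = match_document_regions(page_a, page_b)
print(len(good), "matches")
```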
890 Unscented Grid Filtering and Smoothing for Nonlinear Time Series Analysis
Authors: Nikolay Nikolaev, Evgueni Smirnov
Abstract:
This paper develops an unscented grid-based filter and smoother for accurate nonlinear modeling and analysis of time series. The filter uses unscented deterministic sampling during both the time and measurement updating phases to approximate directly the distributions of the latent state variable. A complementary grid smoother is also constructed to enable computation of the likelihood. This allows us to formulate an expectation maximisation algorithm for maximum likelihood estimation of the state noise and the observation noise. Empirical investigations show that the proposed unscented grid filter/smoother compares favourably to other similar filters on nonlinear estimation tasks.
889 RoboWeedSupport-Sub Millimeter Weed Image Acquisition in Cereal Crops with Speeds up till 50 Km/H
Authors: Morten Stigaard Laursen, Rasmus Nyholm Jørgensen, Mads Dyrmann, Robert Poulsen
Abstract:
For the past three years, the Danish project RoboWeedSupport has sought to bridge the gap between the potential herbicide savings offered by a decision support system and the required weed inspections. In order to automate the weed inspections, it is desirable to generate a map of the weed species present within the field; to generate the map, images must be captured with samples covering the field. This paper investigates the economic cost of performing this data collection based on a camera system mounted on an all-terrain vehicle (ATV) able to drive and collect data at up to 50 km/h while still maintaining an image quality sufficient for identifying newly emerged grass weeds. The economic estimates are based on approximately 100 hectares recorded at three different locations in Denmark. With an average image density of 99 images per hectare, the ATV had a capacity of 28 ha per hour, which is estimated to cost 6.6 EUR/ha. Alternatively, relying on a boom solution for an existing tractor, it was estimated that a cost of 2.4 EUR/ha is obtainable under equal conditions.
Keywords: Weed mapping, integrated weed management, weed recognition.
888 CBIR Using Multi-Resolution Transform for Brain Tumour Detection and Stages Identification
Authors: H. Benjamin Fredrick David, R. Balasubramanian, A. Anbarasa Pandian
Abstract:
Image retrieval is a widely used technique in today's digital world. CBIR, commonly expanded as Content Based Image Retrieval, is an image processing technique which identifies relevant images and retrieves them based on patterns extracted from the digital images. In this paper, two research works are presented using CBIR. The first provides an automated and interactive approach to the analysis of CBIR techniques. CBIR works on the principle of supervised machine learning, which involves feature selection followed by training and testing phases applied to a classifier in order to perform prediction. For feature extraction, image transforms such as Contourlet, Ridgelet and Shearlet are utilized to retrieve texture features from the images. The extracted features are used to train and build a classifier using classification algorithms such as Naïve Bayes, K-Nearest Neighbour and multi-class Support Vector Machine. The testing phase then predicts the class of a new input image using the trained classifier, labelling it as one of four classes: 1) normal brain, 2) benign tumour, 3) malignant tumour and 4) severe tumour. The second research work develops a tool for tumour stage identification using the best feature extraction and classifier identified in the first work. Finally, the tool is used to predict the tumour stage and provide suggestions based on the stage identified by the system. These two approaches are a contribution to the medical field, giving better retrieval performance and enabling tumour stage identification.
Keywords: Brain tumour detection, content based image retrieval, classification of tumours, image retrieval.
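The training/testing stage described above maps naturally onto a scikit-learn pipeline; the sketch below uses random placeholder feature vectors (so the reported accuracies are at chance level) with the four class labels from the paper, in place of real Contourlet/Ridgelet/Shearlet texture features.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Stand-in texture feature vectors and labels
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 32))
y = rng.integers(0, 4, size=400)        # 0: normal, 1: benign, 2: malignant, 3: severe

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))
```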
887 Characterization of a Pure Diamond-Like Carbon Film Deposited by Nanosecond Pulsed Laser Deposition
Authors: Camilla G. Goncalves, Benedito Christ, Walter Miyakawa, Antonio J. Abdalla
Abstract:
This work investigates the properties and microstructure of a diamond-like carbon film deposited by pulsed laser deposition through ablation of a graphite target in a vacuum chamber onto a steel substrate. The equipment was mounted to provide one laser beam. The high-purity graphite target and the steel substrate were polished. The mechanical and tribological properties of the film were characterized using Raman spectroscopy, nanoindentation tests, scratch tests, roughness profiles, a tribometer, optical microscopy and SEM images. It was concluded that the pulsed laser deposition (PLD) technique, associated with the low-pressure chamber and a graphite target, provides a good fraction of sp3 bonding; that process variables such as surface polishing and laser parameters have a great influence on the tribological properties and on the adherence test performance; and that the optical microscopy images are effective in identifying the metallurgical bond.
Keywords: Characterization, diamond-like carbon, DLC, mechanical properties, pulsed laser deposition.
886 Impulsive Noise-Resilient Subband Adaptive Filter
Authors: Young-Seok Choi
Abstract:
We present a new subband adaptive filter (R-SAF) which is robust against impulsive noise in system identification. To address the vulnerability of adaptive filters based on the L2-norm optimization criterion to impulsive noise, the R-SAF is derived from the L1-norm optimization criterion with a constraint on the energy of the weight update. Minimizing the L1-norm of the a posteriori error in each subband, with a constraint on minimum disturbance, gives rise to robustness against impulsive noise as well as favorable convergence performance. Experimental results clearly demonstrate that the proposed R-SAF outperforms classical adaptive filtering algorithms when impulsive noise as well as background noise is present.
Keywords: Subband adaptive filter, L1-norm, system identification, robustness, impulsive interference.
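The effect of an L1-norm criterion can be illustrated with a fullband sign-error LMS filter, a simplified relative of the R-SAF (no subband decomposition and no minimum-disturbance constraint); the unknown system and noise settings below are made up.

```python
import numpy as np

def sign_error_lms(x, d, num_taps=8, mu=0.01):
    """Fullband sign-error LMS: the weight update uses sign(e) instead of e,
    which bounds the influence of impulsive samples in the error signal."""
    w = np.zeros(num_taps)
    y_hat = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]    # regressor (most recent sample first)
        y_hat[n] = w @ u
        e = d[n] - y_hat[n]
        w += mu * np.sign(e) * u               # L1-norm-motivated update
    return w, y_hat

# System identification demo: unknown FIR system plus occasional impulses
rng = np.random.default_rng(0)
h_true = np.array([0.6, -0.4, 0.25, 0.1, 0.0, 0.0, 0.0, 0.0])
x = rng.normal(size=20000)
d = np.convolve(x, h_true)[:len(x)] + 0.01 * rng.normal(size=len(x))
impulses = rng.random(len(x)) < 0.01
d[impulses] += rng.normal(0, 10, impulses.sum())     # impulsive noise
w, _ = sign_error_lms(x, d)
print(np.round(w, 2))                                 # should approach h_true
```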
885 Optical Flow Technique for Supersonic Jet Measurements
Authors: H. D. Lim, Jie Wu, T. H. New, Shengxian Shi
Abstract:
This paper outlines the development of an experimental technique for quantifying supersonic jet flows, in an attempt to avoid the seeding particle problems frequently associated with particle-image velocimetry (PIV) techniques at high Mach numbers. Based on optical flow algorithms, the idea behind the technique is to use high-speed cameras to capture Schlieren images of the supersonic jet shear layers, before subjecting them to an adapted optical flow algorithm based on the Horn-Schunck method to determine the associated flow fields. The proposed method is capable of offering full-field unsteady flow information with potentially higher accuracy and resolution than existing point measurements or PIV techniques. A preliminary study via numerical simulations of a circular de Laval jet nozzle successfully reveals flow and shock structures typically associated with supersonic jet flows, which serve as useful data for subsequent validation of the optical flow based experimental results. For the experimental technique, a Z-type Schlieren setup is proposed with the supersonic jet operated in cold mode, a stagnation pressure of 4 bar and an exit Mach number of 1.5. High-speed single-frame or double-frame cameras are used to capture successive Schlieren images. As implementation of optical flow techniques for supersonic flows remains rare, the current focus revolves around methodology validation through synthetic images. The results of the validation tests offer valuable insight into how the optical flow algorithm can be further improved for robustness and accuracy. Despite these challenges, this supersonic flow measurement technique may offer a simpler way to identify and quantify the fine spatial structures within the shock shear layer.
Keywords: Schlieren, optical flow, supersonic jets, shock shear layer.
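A minimal Horn-Schunck implementation, the algorithm the authors adapt, is shown below on a synthetic one-pixel shift; the smoothing weight, iteration count and test pattern are illustrative.

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

def horn_schunck(im1, im2, alpha=1.0, n_iter=200):
    """Minimal Horn-Schunck optical flow between two grayscale frames."""
    im1 = im1.astype(np.float64)
    im2 = im2.astype(np.float64)
    Ix = (np.gradient(im1, axis=1) + np.gradient(im2, axis=1)) / 2.0
    Iy = (np.gradient(im1, axis=0) + np.gradient(im2, axis=0)) / 2.0
    It = im2 - im1
    # symmetric kernel for the neighbourhood averages of u and v
    avg = np.array([[1/12, 1/6, 1/12], [1/6, 0.0, 1/6], [1/12, 1/6, 1/12]])
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        u_bar = convolve(u, avg)
        v_bar = convolve(v, avg)
        common = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_bar - Ix * common
        v = v_bar - Iy * common
    return u, v

# Synthetic check: a smooth texture shifted one pixel to the right
frame1 = gaussian_filter(np.random.rand(64, 64), 3) * 255.0
frame2 = np.roll(frame1, 1, axis=1)
u, v = horn_schunck(frame1, frame2)
print("mean horizontal flow:", u.mean())   # expected positive, on the order of one pixel
```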
884 Journey on Image Clustering Based on Color Composition
Authors: Achmad Nizar Hidayanto, Elisabeth Martha Koeanan
Abstract:
Image clustering is a process of grouping images based on their similarity. Image clustering usually uses color, texture, edge or shape components, or a mixture of two components, etc. This research aims to explore image clustering using color composition. To complete this image clustering, three main components should be considered: the color space, the image representation (feature extraction), and the clustering method itself. We aim to explore which composition of these factors produces the best clustering results by combining various techniques from the three components. The color spaces are RGB, HSV, and L*a*b*. The image representations are the histogram and the Gaussian Mixture Model (GMM), whereas the clustering methods are K-Means and Agglomerative Hierarchical Clustering. The results of the experiments show that the GMM representation is better combined with the RGB and L*a*b* color spaces, whereas the histogram is better combined with HSV. The experiments also show that K-Means is better than Agglomerative Hierarchical Clustering for image clustering.
Keywords: Image clustering, feature extraction, RGB, HSV, L*a*b*, Gaussian Mixture Model (GMM), histogram, Agglomerative Hierarchical Clustering (AHC), K-Means, Expectation-Maximization (EM).
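One of the explored combinations, an HSV histogram representation clustered with K-Means, could be put together as follows; the reddish/bluish stand-in images only demonstrate that colour composition drives the grouping.

```python
import colorsys
import numpy as np
from sklearn.cluster import KMeans

def hsv_histogram(image_rgb, bins=(8, 4, 4)):
    """Normalised 3-D HSV histogram of an RGB image, flattened to a feature vector."""
    pixels = image_rgb.reshape(-1, 3) / 255.0
    hsv = np.array([colorsys.rgb_to_hsv(*p) for p in pixels])
    hist, _ = np.histogramdd(hsv, bins=bins, range=((0, 1), (0, 1), (0, 1)))
    return (hist / hist.sum()).ravel()

# Stand-in collection: reddish and bluish images should land in different clusters
rng = np.random.default_rng(0)
reds = [np.dstack([rng.integers(150, 256, (32, 32)),
                   rng.integers(0, 80, (32, 32)),
                   rng.integers(0, 80, (32, 32))]) for _ in range(5)]
blues = [np.dstack([rng.integers(0, 80, (32, 32)),
                    rng.integers(0, 80, (32, 32)),
                    rng.integers(150, 256, (32, 32))]) for _ in range(5)]
features = np.array([hsv_histogram(img) for img in reds + blues])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)   # first five vs. last five images should receive different labels
```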
883 Piecewise Interpolation Filter for Effective Processing of Large Signal Sets
Authors: Anatoli Torokhti, Stanley Miklavcic
Abstract:
Suppose KY and KX are large sets of observed and reference signals, respectively, each containing N signals. Is it possible to construct a filter F : KY → KX that requires a priori information only on a few signals, p ≪ N, from KX, but performs better than the known filters based on a priori information on every reference signal from KX? It is shown that the positive answer is achievable under quite unrestrictive assumptions. The device behind the proposed method is a special extension of the piecewise linear interpolation technique to the case of random signal sets. The proposed technique provides a single filter to process any signal from the arbitrarily large signal set. The filter is determined in terms of pseudo-inverse matrices so that it always exists.
Keywords: Wiener filter, filtering of stochastic signals.
882 SEM Image Classification Using CNN Architectures
Authors: G. Türkmen, Ö. Tekin, K. Kurtuluş, Y. Y. Yurtseven, M. Baran
Abstract:
A scanning electron microscope (SEM) is a type of electron microscope mainly used in the nanoscience and nanotechnology areas. Automatic image recognition and classification are among the general areas of application concerning SEM. In line with these usages, the present paper proposes a deep learning algorithm that classifies SEM images into nine categories by means of an online application to simplify the process. The NFFA-EUROPE 100% SEM data set, containing approximately 21,000 images, was used to train and test the algorithm with an 80%/20% split. Validation was carried out using a separate data set obtained from the Middle East Technical University (METU) in Turkey. To increase the accuracy of the results, the Inception ResNet-V2 model was used with a fine-tuning approach. Using a confusion matrix, it was observed that the coated-surface category has a negative effect on the accuracy of the results, since it overlaps with other categories in the data set, thereby confusing the model when detecting category-specific patterns. For this reason, the coated-surface category was removed from the training data set, raising accuracy up to 96.5%.
Keywords: Convolutional Neural Networks, deep learning, image classification, scanning electron microscope.
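A fine-tuning setup with Inception ResNet-V2, along the lines described, might look like this in Keras; `train_ds`/`val_ds` are hypothetical dataset objects, and the two-stage recipe is a common fine-tuning pattern rather than the authors' exact training schedule.

```python
import tensorflow as tf

NUM_CLASSES = 9          # SEM categories used in the paper
IMG_SIZE = (299, 299)    # InceptionResNetV2's native input resolution

# Backbone pre-trained on ImageNet, without its classification head
base = tf.keras.applications.InceptionResNetV2(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
base.trainable = False   # first stage: train only the new head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Second stage (fine-tuning): unfreeze the backbone and continue at a small learning rate
# model.fit(train_ds, validation_data=val_ds, epochs=5)      # train_ds/val_ds: hypothetical
# base.trainable = True
# model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
#               loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=5)
```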
881 Statistical Computational of Volatility in Financial Time Series Data
Authors: S. Al Wadi, Mohd Tahir Ismail, Samsul Ariffin Abdul Karim
Abstract:
With the developments in the economic sector and the financial crises occurring all over the world, volatility measurement has become the most important concept in financial time series analysis. In this paper we therefore discuss the volatility of the Amman stock market (Jordan) over a certain period of time. Since the wavelet transform is one of the most prominent filtering methods and has grown very quickly over the last decade, we compare this method with the traditional technique, the fast Fourier transform, to decide which method is best for analyzing volatility. The comparison is carried out on some of the statistical properties using Matlab.
Keywords: Fast Fourier transform, Haar wavelet transform, Matlab (wavelet tools), stock market, volatility.
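The two tools being compared can be exercised side by side on a return series; the sketch below uses PyWavelets for a Haar decomposition and NumPy's FFT (via Parseval's relation) on simulated prices, not the Amman market data, and it is written in Python rather than the Matlab environment used in the paper.

```python
import numpy as np
import pywt

# Stand-in daily closing prices (a real study would use Amman SE data)
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 512)))
returns = np.diff(np.log(prices))

# Haar wavelet decomposition: the variance of each detail subband gives a
# scale-by-scale view of volatility
coeffs = pywt.wavedec(returns, "haar", level=4)
for i, d in enumerate(coeffs[1:], start=1):
    print(f"detail subband {i} (coarse to fine): variance {d.var():.2e}")

# Fast Fourier transform alternative: total variance recovered from the spectrum
x = returns - returns.mean()
spectrum = np.abs(np.fft.fft(x))**2
print("variance of returns        :", returns.var())
print("variance via FFT (Parseval):", spectrum.sum() / len(x)**2)
```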
880 On-Road Text Detection Platform for Driver Assistance Systems
Authors: Guezouli Larbi, Belkacem Soundes
Abstract:
Automating the text detection process can help the driver in the driving task. It can be very useful for giving drivers more information about their environment by facilitating the reading of road signs such as directional signs, events, stores, etc. In this paper, a system consisting of two stages is proposed. In the first stage, pseudo-Zernike moments are used to pinpoint areas of the image that may contain text. The architecture of this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, we present a convolutional neural network architecture (On-Road Text Detection Network - ORTDN), which constitutes the classification phase. The results show that the proposed framework achieved ≈ 35 fps and an mAP of ≈ 90%, i.e. a low computational time with competitive accuracy.
Keywords: Text detection, CNN, PZM, deep learning.
879 Optimization of Solar Tracking Systems
Authors: A. Zaher, A. Traore, F. Thiéry, T. Talbert, B. Shaer
Abstract:
In this paper, an intelligent approach is proposed to optimize the orientation of continuous solar tracking systems on cloudy days. Under a clear sky, direct sunlight is more important than diffuse radiation, so the panel is always pointed towards the sun. Under an overcast sky, the solar beam is close to zero, and the panel is placed horizontally to receive the maximum of diffuse radiation. Under partly covered conditions, the panel must be pointed towards the source that emits the maximum of solar energy, which may be anywhere in the sky dome. The idea of our approach is therefore to analyze images captured by a ground-based sky camera system in order to detect the zone of the sky dome that is the optimal source of energy under cloudy conditions. The proposed approach is implemented using an experimental setup developed at the PROMES-CNRS laboratory in Perpignan (France). Under overcast conditions, the results were very satisfactory, and the intelligent approach provided efficiency gains of up to 9% relative to conventional continuous sun tracking systems.
Keywords: Clouds detection, fuzzy inference systems, images processing, sun trackers.
878 Denoising and Compression in Wavelet Domain via Projection onto Approximation Coefficients
Authors: Mario Mastriani
Abstract:
We describe a new filtering approach in the wavelet domain for image denoising and compression, based on the projection of the detail subband coefficients (resulting from the splitting procedure typical of the wavelet domain) onto the approximation subband coefficients (which are much less noisy). The new algorithm is called Projection Onto Approximation Coefficients (POAC). As a result of this approach, only the approximation subband coefficients and three scalars are stored and/or transmitted to the channel. Moreover, with the elimination of the detail subband coefficients, we obtain a higher compression rate. Experimental results demonstrate that our approach compares favorably to more typical methods of denoising and compression in the wavelet domain.
Keywords: Compression, denoising, projections, wavelets.
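A much-simplified sketch of the storage idea, keeping only the approximation subband before reconstruction, is shown below with PyWavelets; the actual POAC method projects the detail coefficients onto the approximation rather than simply discarding them, so this is only an illustration of what gets stored.

```python
import numpy as np
import pywt

def keep_approximation(image, wavelet="haar", level=2):
    """Decompose, zero out the (noisier) detail subbands, and reconstruct.

    Returns the reconstruction and the number of values that would be stored
    (approximation coefficients plus three scalars, as in the paper's scheme).
    """
    coeffs = pywt.wavedec2(image.astype(np.float64), wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    zeroed = [approx] + [tuple(np.zeros_like(d) for d in lvl) for lvl in details]
    stored = approx.size + 3
    return pywt.waverec2(zeroed, wavelet), stored

img = np.random.randint(0, 256, (128, 128)).astype(np.float64)
noisy = img + np.random.normal(0, 10, img.shape)
recon, stored = keep_approximation(noisy)
print("coefficients kept:", stored, "of", img.size)
```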
877 Compensated CIC-Hybrid Signed Digit Decimation Filter
Authors: Vishal Awasthi, Krishna Raj
Abstract:
In this paper, we first present the mathematical modeling of the finite impulse response (FIR) filter and the Cascaded Integrator Comb (CIC) filter for sampling rate reduction, and then an extension of a Canonical Signed Digit (CSD) based efficient structure is presented in a framework using hybrid signed digit (HSD) arithmetic. The CSD representation imposes the restriction that two non-zero CSD coefficient bits cannot occupy adjacent bit positions, and therefore the resulting structure is not economical in terms of speed, area and power consumption. The HSD based structure gives optimum performance in terms of area and speed, with 37.02% passband droop compensation.
Keywords: Multirate filtering, compensation theory, CIC filter, compensation filter, signed digit arithmetic, canonical signed digit.
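The CSD recoding the paper starts from (no two adjacent non-zero digits) can be computed with the standard non-adjacent-form routine below; it assumes a non-negative integer coefficient, e.g. a fixed-point filter tap.

```python
def csd(value, bits=12):
    """Canonical signed digit representation of a non-negative integer coefficient.

    Returns a list of digits in {-1, 0, +1}, LSB first, with the CSD property
    that no two adjacent digits are both non-zero."""
    digits = []
    while value != 0:
        if value & 1:                       # current bit is 1
            d = 2 - (value & 3)             # +1 if the next bit is 0, -1 if it is 1
            value -= d
        else:
            d = 0
        digits.append(d)
        value >>= 1
    return digits + [0] * (bits - len(digits))

# 23 = 32 - 8 - 1 -> digits [-1, 0, 0, -1, 0, 1] (LSB first), only 3 non-zero digits
print(csd(23, bits=6))
```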
876 Design and Fabrication of a Low Cost Heart Monitor using Reflectance Photoplethysmogram
Authors: Nur Ilyani Ramli, Mansour Youseffi, Peter Widdop
Abstract:
This paper presents a low-cost design for a heartbeat monitoring device using reflectance-mode photoplethysmography (PPG). PPG is known for its simple construction, ease of use and cost effectiveness, and can provide information about changes in cardiac activity as well as aid earlier non-invasive diagnostics. The proposed device operates in three phases. First, pulses are detected through the fingertip. The signal is then passed to the signal processing unit for amplification, filtering and digitizing. Finally, the heart rate is calculated and displayed on a computer via a parallel port interface. The paper concludes with prototyping of the device, followed by a verification procedure for the heartbeat signal obtained in a laboratory setting.
Keywords: Reflectance mode PPG, heart beat detection, circuit design, PCB design
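The filtering and rate-calculation phases map onto a small signal-processing chain; a software sketch is given below with illustrative cut-off frequencies and a synthetic 75 bpm trace, not the device's actual analog front end.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def heart_rate_from_ppg(ppg, fs=100.0):
    """Estimate heart rate from a raw reflectance PPG trace.

    A 0.5-5 Hz band-pass filter keeps the cardiac band, then peak spacing
    gives beats per minute. Parameter choices are illustrative only."""
    b, a = butter(2, [0.5, 5.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, ppg)
    peaks, _ = find_peaks(filtered, distance=fs * 0.4)   # at most 150 bpm
    if len(peaks) < 2:
        return None
    return 60.0 * fs / np.mean(np.diff(peaks))

# Synthetic 10 s PPG-like signal at 75 bpm plus baseline drift and noise
fs = 100.0
t = np.arange(0, 10, 1 / fs)
ppg = (np.sin(2 * np.pi * 1.25 * t) + 0.5 * np.sin(2 * np.pi * 0.1 * t)
       + 0.1 * np.random.randn(len(t)))
print(f"estimated heart rate: {heart_rate_from_ppg(ppg, fs):.1f} bpm")
```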
875 REDUCER – An Architectural Design Pattern for Reducing Large and Noisy Data Sets
Authors: Apkar Salatian
Abstract:
To relieve the burden of reasoning on a point to point basis, in many domains there is a need to reduce large and noisy data sets into trends for qualitative reasoning. In this paper we propose and describe a new architectural design pattern called REDUCER for reducing large and noisy data sets that can be tailored for particular situations. REDUCER consists of 2 consecutive processes: Filter which takes the original data and removes outliers, inconsistencies or noise; and Compression which takes the filtered data and derives trends in the data. In this seminal article we also show how REDUCER has successfully been applied to 3 different case studies.
Keywords: Design Pattern, filtering, compression.
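A toy rendition of the two REDUCER processes, a median Filter stage followed by a per-segment trend Compression stage, could look like this; window and segment sizes are illustrative and not drawn from the case studies.

```python
import numpy as np

def reducer(data, window=5, segment=20):
    """Tiny sketch of the REDUCER pattern: Filter (median smoothing to suppress
    outliers and noise) followed by Compression (one mean and slope per segment
    as the trend abstraction)."""
    data = np.asarray(data, dtype=float)
    # Filter stage: sliding-window median removes spikes and noise
    pad = window // 2
    padded = np.pad(data, pad, mode="edge")
    filtered = np.array([np.median(padded[i:i + window]) for i in range(len(data))])
    # Compression stage: per-segment trend (mean and slope)
    trends = []
    for start in range(0, len(filtered), segment):
        chunk = filtered[start:start + segment]
        t = np.arange(len(chunk))
        slope = np.polyfit(t, chunk, 1)[0] if len(chunk) > 1 else 0.0
        trends.append((chunk.mean(), slope))
    return trends

signal = np.sin(np.linspace(0, 3, 200)) + np.random.normal(0, 0.05, 200)
signal[50] = 8.0                     # an outlier the Filter stage should remove
print(reducer(signal)[:3])
```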
874 Overall Function and Symptom Impact of Self-Applied Myofascial Release in Adult Patients with Fibromyalgia: A Seven-Week Pilot Study
Authors: Domenica Tambasco, Riina Bray
Abstract:
Fibromyalgia is a chronic condition characterized by widespread musculoskeletal pain, fatigue, and reduced function. Management of symptoms includes medications, physical treatments and mindfulness therapies. Myofascial release is a modality that has been successfully applied in various musculoskeletal conditions; however, to the authors' best knowledge, it is not yet recognized as a self-management therapy option in fibromyalgia. In this study, we investigated whether self-applied myofascial release (SMR) is associated with overall improved function and symptoms in fibromyalgia. Eligible adult patients with a confirmed diagnosis of fibromyalgia at Women's College Hospital were recruited to SMR. Sessions ran for 1 hour once a week for 7 weeks, led by the same two physiotherapists knowledgeable in this physical treatment modality. The main outcome measure was an overall impact score for function and symptoms based on the validated assessment tool for fibromyalgia, the Revised Fibromyalgia Impact Questionnaire (FIQR), measured pre- and post-intervention. Both descriptive and analytical methods were applied and reported. We analyzed the results using a paired t-test to determine whether there was a statistically significant difference between the initial (pre-intervention) and final (post-intervention) mean FIQR scores. A clinically significant difference in FIQR was defined as a reduction in score by 10 or more points. Our pilot study showed that SMR appeared to be a safe and effective intervention for our fibromyalgia participants, with the overall improvement in function and symptoms occurring in only 7 weeks. Further studies with larger sample sizes comparing SMR to other physical treatment modalities (such as stretching) in a randomized controlled trial (RCT) are recommended.
Keywords: Fibromyalgia, myofascial release, fibromyalgia impact questionnaire, fibromyalgia assessment status.
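The paired t-test used as the analytical method can be reproduced in a few lines; the pre/post FIQR scores below are invented for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_rel

# Illustrative pre-/post-intervention FIQR scores (not the study's data)
pre = np.array([62, 71, 55, 68, 74, 59, 66, 70])
post = np.array([50, 60, 44, 55, 66, 49, 58, 61])

t_stat, p_value = ttest_rel(pre, post)
mean_change = (pre - post).mean()
print(f"mean FIQR reduction: {mean_change:.1f} points "
      f"(clinically significant if >= 10), t = {t_stat:.2f}, p = {p_value:.4f}")
```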
873 Digital filters for Hot-Mix Asphalt Complex Modulus Test Data Using Genetic Algorithm Strategies
Authors: Madhav V. Chitturi, Anshu Manik, Kasthurirangan Gopalakrishnan
Abstract:
The dynamic or complex modulus test is considered to be a mechanistically based laboratory test to reliably characterize the strength and load-resistance of Hot-Mix Asphalt (HMA) mixes used in the construction of roads. The most common observation is that the data collected from these tests are often noisy and somewhat non-sinusoidal, which hampers accurate analysis of the data to obtain engineering insight. The goal of the work presented in this paper is to develop and compare automated evolutionary computational techniques to filter test noise in the data collected for the HMA complex modulus test. The results showed that the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) approach is computationally efficient for filtering data obtained from the HMA complex modulus test.
Keywords: HMA, dynamic modulus, GA, evolutionary computation.
872 Changes in Behavior and Learning Ability of Rats Intoxicated with Lead
Authors: Amira, A. Goma, U. E. Mahrous
Abstract:
Measuring the effect of perinatal lead exposure on the learning ability of offspring is considered a sensitive and selective index, providing an early marker of the central nervous system damage produced by this toxic metal. A total of 35 adult Sprague-Dawley rats were used to investigate the effect of lead acetate toxicity on the behavioral patterns of adult female rats and the learning ability of their offspring. Rats were allotted into 4 groups: group one received 1 g/l lead acetate (n=10), group two received 1.5 g/l lead acetate (n=10), group three received 2 g/l lead acetate in drinking water (n=10), and the control group did not receive lead acetate (n=5), from the 8th day of pregnancy until weaning of the pups.
The results revealed a dose-dependent increase in feeding time, drinking frequency, licking frequency, scratching frequency, and licking litters, nest building and retrieving frequencies, while standing time increased significantly more in rats treated with 1.5 g/l lead acetate than in the other treated groups and the control; on the contrary, lying time decreased gradually in a dose-dependent manner. Moreover, movement activities were higher in rats treated with 1 g/l lead acetate than in the other treated groups and the control. Furthermore, time spent in closed arms was significantly lower in rats given 2 g/l lead acetate than in the other treated groups, while they spent significantly more time in open arms than the other treated groups, which could be attributed to the occurrence of adaptation. The number of entries into open arms was dose dependent. However, the ratio between open and closed arms revealed a significant decrease in rats treated with 2 g/l lead acetate compared with the control group.
Keywords: Lead toxicity, rats, learning ability, behavior.
871 Maximizer of the Posterior Marginal Estimate for Noise Reduction of JPEG-compressed Image
Authors: Yohei Saika, Yuji Haraguchi
Abstract:
We constructed a method of noise reduction for JPEG-compressed images based on Bayesian inference using the maximizer of the posterior marginal (MPM) estimate. In this method, we tried the MPM estimate using two kinds of likelihood, both applied to grayscale images degraded by lossy JPEG compression. One is a deterministic model of the likelihood and the other is a probabilistic one expressed by the Gaussian distribution. Then, using Monte Carlo simulation on grayscale images, such as the 256-grayscale standard image "Lena" with 256 × 256 pixels, we examined the performance of the MPM estimate using the mean square error as the performance measure. We clarified that the MPM estimate via the Gaussian probabilistic model of the likelihood is effective for reducing noise, such as the blocking artifacts and the mosquito noise, if the parameters are set appropriately. On the other hand, we found that the MPM estimate via the deterministic model of the likelihood is not effective for noise reduction due to the low acceptance ratio of the Metropolis algorithm.
Keywords: Noise reduction, JPEG-compressed image, Bayesian inference, maximizer of the posterior marginal estimate
870 Robust Statistics Based Algorithm to Remove Salt and Pepper Noise in Images
Authors: V. R. Vijaykumar, P. T. Vanathi, P. Kanagasabapathy, D. Ebenezer
Abstract:
In this paper, a robust statistics based filter to remove salt and pepper noise in digital images is presented. The algorithm first detects the corrupted pixels, since the impulse noise affects only certain pixels in the image while the remaining pixels are uncorrupted. The corrupted pixels are then replaced by a value estimated using the proposed robust statistics based filter. The proposed method performs well in removing low to medium density impulse noise, with detail preservation up to a noise density of 70%, compared to the standard median filter, weighted median filter, recursive weighted median filter, progressive switching median filter, signal dependent rank ordered mean filter, adaptive median filter and a recently proposed decision based algorithm. The visual and quantitative results show that the proposed algorithm outperforms these methods in restoring the original image, with superior preservation of edges and better suppression of impulse noise.
Keywords: Image denoising, Nonlinear filter, Robust Statistics, and Salt and Pepper Noise.
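A simple decision-based variant of the idea, detecting extreme-valued pixels and replacing them with the median of their uncorrupted neighbours, is sketched below; it illustrates the detection-then-estimation principle, not the authors' robust-statistics estimator.

```python
import numpy as np

def remove_salt_pepper(image):
    """Pixels at the extremes (0 or 255) are treated as corrupted and replaced by
    the median of the uncorrupted pixels in their 3x3 neighbourhood; all other
    pixels are left untouched."""
    img = image.astype(np.float64)
    out = img.copy()
    corrupted = (image == 0) | (image == 255)
    padded = np.pad(img, 1, mode="edge")
    pad_bad = np.pad(corrupted, 1, mode="edge")
    rows, cols = np.where(corrupted)
    for r, c in zip(rows, cols):
        window = padded[r:r + 3, c:c + 3]
        good = window[~pad_bad[r:r + 3, c:c + 3]]
        out[r, c] = np.median(good) if good.size else np.median(window)
    return out.astype(np.uint8)

# Demo: corrupt 20% of a gradient image with salt-and-pepper noise
img = np.tile(np.arange(1, 255, 2, dtype=np.uint8), (127, 1))
noisy = img.copy()
mask = np.random.rand(*img.shape) < 0.2
noisy[mask] = np.random.choice([0, 255], size=mask.sum())
print("MAE noisy   :", np.abs(noisy.astype(float) - img).mean())
print("MAE restored:", np.abs(remove_salt_pepper(noisy).astype(float) - img).mean())
```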