Search results for: band-pass filtering
307 Quality Control of Automotive Gearbox Based On Vibration Signal Analysis
Authors: Nilson Barbieri, Bruno Matos Martins, Gabriel de Sant'Anna Vitor Barbieri
Abstract:
In more complex systems, such as automotive gearboxes, a rigorous treatment of the data is necessary because there are several moving parts (gears, bearings, shafts, etc.) and therefore several possible sources of error as well as noise. The basic objective of this work is the detection of damage in automotive gearboxes. The detection methods used are the wavelet method, the bispectrum, advanced filtering techniques (selective filtering) of vibration signals, and mathematical morphology. Gearbox vibration tests were performed (gearboxes in good condition and with defects) on a production line of a large vehicle assembler. The vibration signals are obtained using five accelerometers in different positions on the sample. The results obtained using kurtosis, the bispectrum, the wavelet transform, and mathematical morphology showed that it is possible to identify the existence of defects in automotive gearboxes. Keywords: automotive gearbox, mathematical morphology, wavelet, bispectrum
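For illustration, a minimal sketch of the kurtosis indicator mentioned in this abstract (not the authors' pipeline; the signal, sampling rate and impulse pattern below are hypothetical):

```python
# Kurtosis as an impulsiveness indicator on a vibration signal (synthetic data).
import numpy as np
from scipy.stats import kurtosis

fs = 10_000                               # sampling rate [Hz], assumed
t = np.arange(0, 1.0, 1 / fs)
healthy = 0.5 * np.sin(2 * np.pi * 50 * t) + 0.05 * np.random.randn(t.size)
impacts = np.zeros_like(t)
impacts[::500] = 2.0                      # periodic impulses emulating a local gear defect
faulty = healthy + impacts

# Fisher's definition (normal distribution -> 0); a clear increase suggests impulsive damage.
print("kurtosis healthy:", kurtosis(healthy))
print("kurtosis faulty :", kurtosis(faulty))
```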
Procedia PDF Downloads 472
306 The Effect of Technology on Advanced Automotive Electronics
Authors: Abanob Nady Wasef Moawed
Abstract:
In more complicated systems, such as automotive gearboxes, a rigorous treatment of the data is essential because there are several moving parts (gears, bearings, shafts, and others), and in this way there are numerous possible sources of error and also noise. The fundamental goal of this work is the detection of damage in automotive gearboxes. The detection strategies used are the wavelet technique, the bispectrum, advanced filtering techniques (selective filtering) of vibration signals, and mathematical morphology. Gearbox vibration tests were performed (gearboxes in good condition and with defects) on a production line of a large vehicle assembler. The vibration signals are acquired using five accelerometers in different positions on the sample. The results obtained using kurtosis, the bispectrum, the wavelet transform, and mathematical morphology confirmed that it is possible to identify the existence of defects in automotive gearboxes. Keywords: 3D-shaped electronics, electronic components, thermoforming, component positioning, automotive gearbox, mathematical morphology, wavelet, bispectrum
Procedia PDF Downloads 23
305 Three-Dimensional Jet Refraction Simulation Using a Gradient Term Suppression and Filtering Method
Authors: Lican Wang, Rongqian Chen, Yancheng You, Ruofan Qiu
Abstract:
In applications such as jet engines, open-jet wind tunnels, and airframes, a shear layer widely exists, formed by the velocity and temperature gradients between the jet flow and the surrounding medium. The presence of the shear layer refracts and reflects the sound path, which consequently influences the measurement results in the far field. To investigate and evaluate the shear layer effect, a gradient term suppression and filtering method is adopted to simulate sound propagation through a steady sheared flow in three dimensions. Two typical configurations are considered: one is an incompressible and cold jet flow in a wind tunnel, and the other is a compressible and hot jet flow in a turbofan engine. A numerical linear microphone array is used to localize the position of a given sound source. The localization error is presented and linearly fitted. Keywords: aeroacoustic, linearized Euler equation, acoustic propagation, source localization
Procedia PDF Downloads 200
304 Online Prediction of Nonlinear Signal Processing Problems Based Kernel Adaptive Filtering
Authors: Hamza Nejib, Okba Taouali
Abstract:
This paper presents two of the best-known kernel adaptive filtering (KAF) approaches, the kernel least mean squares and the kernel recursive least squares, in order to predict new outputs in nonlinear signal processing problems. Both methods implement a nonlinear transfer function using kernel methods in a particular space named the reproducing kernel Hilbert space (RKHS), where the model is a linear combination of kernel functions applied to transform the observed data from the input space to a high-dimensional feature space of vectors, an idea known as the kernel trick. KAF thus consists of developing filters in the RKHS. We use two nonlinear signal processing problems, Mackey-Glass chaotic time series prediction and nonlinear channel equalization, to assess the performance of the presented approaches and, finally, to determine which of them is better suited. Keywords: online prediction, KAF, signal processing, RKHS, Kernel methods, KRLS, KLMS
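A minimal sketch of the KLMS idea described above (Gaussian kernel, toy time series and step size are assumptions; KRLS and the Mackey-Glass/channel-equalization experiments are not reproduced):

```python
# KLMS: prediction is a growing sum of kernel units; each error adds a new unit.
import numpy as np

def gauss(a, b, sigma=0.5):
    return np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

def klms(X, d, eta=0.2):
    centers, coeffs, preds = [], [], []
    for x, target in zip(X, d):
        y = sum(a * gauss(c, x) for a, c in zip(coeffs, centers))
        e = target - y                 # prediction error of the RKHS model
        centers.append(x)
        coeffs.append(eta * e)         # new kernel unit weighted by the error
        preds.append(y)
    return np.array(preds)

# Toy nonlinear time series: predict s[n] from the two previous samples.
s = np.sin(0.3 * np.arange(400)) ** 3
X = np.column_stack([s[1:-1], s[:-2]])
d = s[2:]
pred = klms(X, d)
print("last-100 MSE:", np.mean((d[-100:] - pred[-100:]) ** 2))
```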
Procedia PDF Downloads 397
303 Evaluation of Sensor Pattern Noise Estimators for Source Camera Identification
Authors: Benjamin Anderson-Sackaney, Amr Abdel-Dayem
Abstract:
This paper presents a comprehensive survey of recent source camera identification (SCI) systems. Then, the performance of various sensor pattern noise (SPN) estimators was experimentally assessed, under common photo response non-uniformity (PRNU) frameworks. The experiments used 1350 natural and 900 flat-field images, captured by 18 individual cameras. 12 different experiments, grouped into three sets, were conducted. The results were analyzed using the receiver operator characteristic (ROC) curves. The experimental results demonstrated that combining the basic SPN estimator with a wavelet-based filtering scheme provides promising results. However, the phase SPN estimator fits better with both patch-based (BM3D) and anisotropic diffusion (AD) filtering schemes.Keywords: sensor pattern noise, source camera identification, photo response non-uniformity, anisotropic diffusion, peak to correlation energy ratio
Procedia PDF Downloads 438
302 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework
Authors: Lutful Karim, Mohammed S. Al-kahtani
Abstract:
Sensors are being used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data. Thus, aggregating and filtering sensor data are significantly important for designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. Thus, this paper introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation scheme (PDDA) for the lowest layer, at the sensors. Simulation results show that the PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay. Keywords: big data, clustering, tree topology, data aggregation, sensor networks
Procedia PDF Downloads 343
301 Bipolar Impulse Noise Removal and Edge Preservation in Color Images and Video Using Improved Kuwahara Filter
Authors: Reji Thankachan, Varsha PS
Abstract:
Both image capturing devices and human visual systems are nonlinear. Hence, nonlinear filtering methods outperform their linear counterparts in many applications. Linear methods are unable to remove impulsive noise from images while preserving their edges and fine details. In addition, linear algorithms are unable to remove signal-dependent or multiplicative noise in images. This paper presents an approach to denoise and smooth images and videos corrupted by bipolar impulse noise using an improved Kuwahara filter. It involves a two-stage algorithm consisting of noise detection followed by filtering. Numerous simulations demonstrate that the proposed method outperforms the existing method by eliminating the painting-like flattening effect along the local feature direction while preserving edges, with improvements in PSNR and MSE. Keywords: bipolar impulse noise, Kuwahara, PSNR MSE, PDF
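For reference, a sketch of the classic Kuwahara filter on a grey-level image (this is not the authors' improved two-stage detector-plus-filter; the window radius and the box-filter implementation are assumptions, and borders are handled by wrap-around for brevity):

```python
# Classic Kuwahara: output the mean of the least-variance quadrant around each pixel.
import numpy as np
from scipy.ndimage import uniform_filter

def kuwahara(img, radius=2):
    """radius must be even; each quadrant is a (radius+1)x(radius+1) box."""
    img = img.astype(float)
    size = radius + 1
    mean = uniform_filter(img, size)                  # box means
    var = uniform_filter(img ** 2, size) - mean ** 2  # box variances
    s = radius // 2
    # Realign the four overlapping quadrants onto the central pixel.
    q_mean = np.stack([np.roll(mean, (dy, dx), axis=(0, 1))
                       for dy in (-s, s) for dx in (-s, s)])
    q_var = np.stack([np.roll(var, (dy, dx), axis=(0, 1))
                      for dy in (-s, s) for dx in (-s, s)])
    pick = np.argmin(q_var, axis=0)                   # least-variance quadrant
    return np.take_along_axis(q_mean, pick[None], axis=0)[0]

noisy = np.random.randint(0, 256, (64, 64)).astype(float)
print(kuwahara(noisy).shape)
```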
Procedia PDF Downloads 497
300 Noise Removal Techniques in Medical Images
Authors: Amhimmid Mohammed Saffour, Abdelkader Salama
Abstract:
Filtering is a part of image enhancement techniques; it is used to enhance certain details, such as edges in the image, that are relevant to the application. Additionally, filtering can even be used to eliminate unwanted components of noise. Medical images typically contain salt and pepper noise and Poisson noise. This noise appears as minute grey-scale variations within the image. In this paper, different filter techniques, namely Median, Wiener, Rank order 3, Rank order 5, and Average, were applied to CT medical images (brain and chest). We used all these filters to remove salt and pepper noise from these images. This type of noise consists of random pixels being set to black or white. Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE), and histograms were used to evaluate the quality of the filtered images. The results we have achieved show that these filters are useful and prove to be helpful for general medical practitioners to analyze patients' symptoms without difficulty. Keywords: CT imaging, median filter, adaptive filter and average filter, MATLAB
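A small sketch of the salt-and-pepper removal and PSNR/MSE scoring described here (the image is synthetic and the noise density and window size are assumed values, not the paper's CT data):

```python
# Median filtering of salt-and-pepper noise, scored with MSE and PSNR.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
clean = rng.integers(40, 200, (128, 128)).astype(float)   # stand-in for a CT slice
noisy = clean.copy()
mask = rng.random(clean.shape)
noisy[mask < 0.05] = 0        # pepper pixels
noisy[mask > 0.95] = 255      # salt pixels

restored = median_filter(noisy, size=3)

def mse(a, b):
    return np.mean((a - b) ** 2)

def psnr(a, b, peak=255.0):
    return 10 * np.log10(peak ** 2 / mse(a, b))

print(f"PSNR noisy      : {psnr(clean, noisy):.1f} dB")
print(f"PSNR median 3x3 : {psnr(clean, restored):.1f} dB")
```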
Procedia PDF Downloads 312
299 The Use of Image Processing Responses Tools Applied to Analysing Bouguer Gravity Anomaly Map (Tangier-Tetuan's Area-Morocco)
Authors: Saad Bakkali
Abstract:
Image processing is a powerful tool for the enhancement of edges in images used in the interpretation of geophysical potential field data. Aerial and terrestrial gravimetric surveys were carried out in the region of Tangier-Tetuan. From the observed and measured gravity data, a Bouguer gravity anomaly map was prepared. This paper reports the results and interpretations of the transformed Bouguer gravity anomaly maps of the Tangier-Tetuan area using image processing. Filtering analysis based on classical image processing was applied. Image processing operators such as logarithmic enhancement and gamma correction are used. This paper also presents the results obtained from this image processing analysis of the edge enhancement of the Bouguer gravity anomaly map of the Tangier-Tetuan zone. Keywords: bouguer, tangier, filtering, gamma correction, logarithmic enhancement edges
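To illustrate the two point operators named in this abstract, a short sketch applied to a synthetic, normalized anomaly grid (the grid values and the gamma value are placeholders, not the survey data):

```python
# Gamma correction and logarithmic enhancement of a normalized anomaly grid.
import numpy as np

anomaly = np.random.rand(100, 100) * 60 - 30                 # fake anomaly values [mGal]
norm = (anomaly - anomaly.min()) / (anomaly.max() - anomaly.min())  # rescale to [0, 1]

gamma = 0.5                                                  # assumed gamma value
gamma_corrected = norm ** gamma                              # brightens low amplitudes

c = 1.0 / np.log(2.0)
log_enhanced = c * np.log1p(norm)                            # compresses high amplitudes

print(gamma_corrected.max(), log_enhanced.max())             # both stay within [0, 1]
```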
Procedia PDF Downloads 420
298 A Background Subtraction Based Moving Object Detection Around the Host Vehicle
Authors: Hyojin Lim, Cuong Nguyen Khac, Ho-Youl Jung
Abstract:
In this paper, we propose a moving object detection method that helps the driver safely take his/her car out of a parking lot. When moving objects such as motorbikes, pedestrians, other cars, and obstacles are detected at the rear side of the host vehicle, the proposed algorithm can provide a warning to the driver. We assume that the host vehicle is about to depart. Gaussian Mixture Model (GMM) based background subtraction is applied as the basis. Pre-processing such as smoothing and post-processing such as morphological filtering are added. We examine which color space performs better for the detection of moving objects. Three color spaces, RGB, YCbCr, and Y, are applied and compared in terms of detection rate. Through simulation, we prove that RGB space is more suitable for moving object detection based on background subtraction. Keywords: gaussian mixture model, background subtraction, moving object detection, color space, morphological filtering
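A compact OpenCV sketch of a GMM background-subtraction pipeline with smoothing and morphological post-filtering (the video file name, history length, and warning threshold are hypothetical, and no color-space comparison is done here):

```python
# GMM (MOG2) foreground mask -> smoothing -> morphological opening -> warning rule.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

cap = cv2.VideoCapture("rear_view.avi")          # hypothetical rear-camera clip
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.GaussianBlur(frame, (5, 5), 0)   # pre-processing: smoothing
    mask = subtractor.apply(frame)               # GMM foreground mask
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # post: remove speckle
    if cv2.countNonZero(mask) > 500:             # crude warning rule
        print("warning: moving object behind the host vehicle")
cap.release()
```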
Procedia PDF Downloads 613
297 A Robust Digital Image Watermarking Against Geometrical Attack Based on Hybrid Scheme
Authors: M. Samadzadeh Mahabadi, J. Shanbehzadeh
Abstract:
This paper presents a hybrid digital image-watermarking scheme, which is robust against varieties of attacks and geometric distortions. The image content is represented by important feature points obtained by an image-texture-based adaptive Harris corner detector. These feature points are extracted from LL2 of 2-D discrete wavelet transform which are obtained by using the Harris-Laplacian detector. We calculate the Fourier transform of circular regions around these points. The amplitude of this transform is rotation invariant. The experimental results demonstrate the robustness of the proposed method against the geometric distortions and various common image processing operations such as JPEG compression, colour reduction, Gaussian filtering, median filtering, and rotation.Keywords: digital watermarking, geometric distortions, geometrical attack, Harris Laplace, important feature points, rotation, scale invariant feature
Procedia PDF Downloads 500
296 Filtering and Reconstruction System for Grey-Level Forensic Images
Authors: Ahd Aljarf, Saad Amin
Abstract:
Images are an important source of information used as evidence during any investigation process. Their clarity and accuracy are essential and of the utmost importance for any investigation. Images are vulnerable to losing blocks and having noise added to them, either after alteration or when the image was initially taken; therefore, a high-performance image processing system and its implementation are very important from a forensic point of view. This paper focuses on improving the quality of forensic images. For different reasons, packets that store data can be affected, damaged, or even lost because of noise. For example, sending the image through a wireless channel can cause loss of bits. These types of errors generally degrade the visual display quality of the forensic images. Two image problems, noise and block loss, are covered. However, information transmitted through any communication channel may be altered from its original state or even lose important data due to channel noise. Therefore, a developed system is introduced to improve the quality and clarity of the forensic images. Keywords: image filtering, image reconstruction, image processing, forensic images
Procedia PDF Downloads 361
295 Evaluation of Diagnosis Performance Based on Pairwise Model Construction and Filtered Data
Authors: Hyun-Woo Cho
Abstract:
It is quite important to utilize timely and intelligent production monitoring and diagnosis of industrial processes in terms of quality and safety issues. When compared with the monitoring task, fault diagnosis represents the task of finding the process variables responsible for causing a specific fault in the process. It can be helpful to process operators, who should investigate and eliminate root causes more effectively and efficiently. This work focused on the active use of combining a nonlinear statistical technique with a preprocessing method in order to implement practical real-time fault identification schemes for data-rich cases. To compare its performance to existing identification schemes, a case study on a benchmark process was performed in several scenarios. The results showed that the proposed fault identification scheme produced more reliable diagnosis results than linear methods. In addition, the use of the filtering step improved the identification results for complicated processes with massive data sets. Keywords: diagnosis, filtering, nonlinear statistical techniques, process monitoring
Procedia PDF Downloads 241
294 Semantic-Based Collaborative Filtering to Improve Visitor Cold Start in Recommender Systems
Authors: Baba Mbaye
Abstract:
In collaborative filtering recommendation systems, a user receives suggested items based on the opinions and evaluations of a community of users. This type of recommendation system uses only the information (ratings expressed as numerical values) contained in a usage matrix as input data. This matrix can be constructed based on users' behaviors or by inviting users to declare their opinions on the items they know. The cold start problem leads to very poor performance for new users. It is a phenomenon that occurs at the beginning of use, in the situation where the system lacks data to make recommendations. There are three types of cold start problems: cold start for a new item, a new system, and a new user. In this article, we are interested in the cold start for a new user. When the system welcomes a new user, the profile exists but does not have enough data, and its communities with other user profiles are still unknown. This leads to recommendations not adapted to the profile of the new user. In this paper, we propose an approach that improves cold start by using the notions of similarity and semantic proximity between user profiles during cold start. We use the available cold metadata (metadata extracted from the new user's data), which is useful in positioning the new user within a community. The aim is to look for similarities and semantic proximities with the old and current user profiles of the system. Proximity is represented by close concepts considered to belong to the same group, while similarity groups together elements that appear similar. Similarity and proximity are two close but not identical concepts. This leads us to a construction of similarity based on: a) the concepts (properties, terms, instances) independent of the ontology structure and b) the simultaneous representation of the two concepts (relations, presence of terms in a document, simultaneous presence of the authorities). We propose an ontology, OIVCSRS (Ontology of Improvement Visitor Cold Start in Recommender Systems), in order to structure the terms and concepts representing the meaning of an information field, whether by the metadata of a namespace or the elements of a knowledge domain. This approach allows us to automatically attach the new user to a user community, partially compensate for the data that was not initially provided, and ultimately associate a better first profile with the cold start. Thus, the aim of this paper is to propose an approach to improving cold start using semantic technologies. Keywords: visitor cold start, recommender systems, collaborative filtering, semantic filtering
Procedia PDF Downloads 217
293 Analyzing On-Line Process Data for Industrial Production Quality Control
Authors: Hyun-Woo Cho
Abstract:
The monitoring of industrial production quality has to be implemented to provide early warning of unusual operating conditions. Furthermore, the identification of their assignable causes is necessary for quality control purposes. For such tasks, many multivariate statistical techniques have been applied and shown to be quite effective tools. This work presents a process data-based monitoring scheme for production processes. For more reliable results, some additional steps of noise filtering and preprocessing are considered. This may lead to enhanced performance by eliminating unwanted variation in the data. The performance evaluation is executed using data sets from test processes. The proposed method is shown to provide reliable quality control results and is thus more effective in quality monitoring in the example. For practical implementation of the method, an on-line data system must be available to gather historical and on-line data. Recently, large amounts of data have been collected on-line in most processes, so implementation of the current scheme is feasible and does not impose additional burdens on users. Keywords: detection, filtering, monitoring, process data
Procedia PDF Downloads 557
292 Email Phishing Detection Using Natural Language Processing and Convolutional Neural Network
Abstract:
Phishing is one of the oldest and best known scams on the Internet. It can be defined as any type of telecommunications fraud that uses social engineering tricks to obtain confidential data from its victims. It’s a cybercrime aimed at stealing your sensitive information. Phishing is generally done via private email, so scammers impersonate large companies or other trusted entities to encourage victims to voluntarily provide information such as login credentials or, worse yet, credit card numbers. The COVID-19 theme is used by cybercriminals in multiple malicious campaigns like phishing. In this environment, messaging filtering solutions have become essential to protect devices that will now be used outside of the secure perimeter. Despite constantly updating methods to avoid these cyberattacks, the end result is currently insufficient. Many researchers are looking for optimal solutions to filter phishing emails, but we still need good results. In this work, we concentrated on solving the problem of detecting phishing emails using the different steps of NLP preprocessing, and we proposed and trained a model using one-dimensional CNN. Our study results show that our model obtained an accuracy of 99.99%, which demonstrates how well our model is working.Keywords: phishing, e-mail, NLP preprocessing, CNN, e-mail filtering
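A minimal Keras sketch of a one-dimensional CNN text classifier of the kind described here (vocabulary size, sequence length, layer sizes, and the random placeholder data are assumptions; the actual e-mail corpus and NLP preprocessing steps are not reproduced):

```python
# 1D CNN over token-id sequences for binary phishing/legitimate classification.
import numpy as np
import tensorflow as tf

vocab_size, max_len = 20000, 300

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.Conv1D(128, 5, activation="relu"),
    tf.keras.layers.GlobalMaxPooling1D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # phishing vs. legitimate
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Placeholder data: in practice, token ids come from the tokenized, padded e-mail bodies.
X = np.random.randint(0, vocab_size, (32, max_len))
y = np.random.randint(0, 2, 32)
model.fit(X, y, epochs=1, verbose=0)
```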
Procedia PDF Downloads 124
291 Frequency Decomposition Approach for Sub-Band Common Spatial Pattern Methods for Motor Imagery Based Brain-Computer Interface
Authors: Vitor M. Vilas Boas, Cleison D. Silva, Gustavo S. Mafra, Alexandre Trofino Neto
Abstract:
Motor imagery (MI) based brain-computer interfaces (BCI) use event-related (de)synchronization (ERS/ERD), typically recorded using electroencephalography (EEG), to translate brain electrical activity into control commands. To mitigate undesirable artifacts and noise in EEG measurements, methods based on band-pass filters defined over a specific frequency band (i.e., 8 – 30 Hz), such as Infinite Impulse Response (IIR) filters, are typically used. Spatial techniques, such as Common Spatial Patterns (CSP), are also used to estimate the variations of the filtered signal and extract features that define the imagined motion. The CSP effectiveness depends on the subject's discriminative frequency, and approaches based on the decomposition of the band of interest into sub-bands with smaller frequency ranges (SBCSP) have been suggested for EEG signal classification. However, despite providing good results, the SBCSP approach generally increases the computational cost of the filtering step in MI-based BCI systems. This paper proposes the use of the Fast Fourier Transform (FFT) algorithm in the MI-based BCI filtering stage that implements SBCSP. The goal is to apply the FFT algorithm to reduce the computational cost of the processing step of these systems and to make them more efficient without compromising classification accuracy. The proposal is based on the representation of EEG signals in a matrix of coefficients resulting from the frequency decomposition performed by the FFT, which is then submitted to the SBCSP process. The structure of the SBCSP involves dividing the band of interest, initially defined between 0 and 40 Hz, into a set of 33 sub-bands spanning specific frequency ranges, which are processed in parallel, each by a CSP filter and an LDA classifier. A Bayesian meta-classifier is then used to represent the LDA outputs of each sub-band as scores and organize them into a single vector, which is then used as a training vector for a global SVM classifier. Initially, the public EEG data set IIa of the BCI Competition IV is used to validate the approach. The first contribution of the proposed method is that, in addition to being more compact, because it has a 68% smaller dimension than the original signal, the resulting FFT matrix maintains the signal information relevant to class discrimination. In addition, the results showed an average reduction of 31.6% in the computational cost in relation to the application of filtering methods based on IIR filters, suggesting FFT efficiency when applied in the filtering step. Finally, the frequency decomposition approach improves the overall system classification rate significantly compared to the commonly used filtering, going from 73.7% using IIR to 84.2% using FFT. The accuracy improvement above 10% and the computational cost reduction denote the potential of FFT in EEG signal filtering applied to the context of MI-based BCI implementing SBCSP. Tests with other data sets are currently being performed to reinforce such conclusions. Keywords: brain-computer interfaces, fast Fourier transform algorithm, motor imagery, sub-band common spatial patterns
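A sketch of just the FFT-based frequency-decomposition step (the sampling rate, epoch length, and the exact layout of the 33 sub-bands over 0-40 Hz are assumptions; the CSP, LDA, Bayesian meta-classifier, and SVM stages are omitted):

```python
# Per-channel FFT of an EEG epoch, then selection of coefficient bins per sub-band.
import numpy as np

fs, n_channels, n_samples = 250, 22, 500          # e.g. 2-second epochs
epoch = np.random.randn(n_channels, n_samples)    # placeholder EEG epoch

spec = np.fft.rfft(epoch, axis=1)                 # complex coefficients per channel
freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)

# 33 overlapping sub-bands of width 4 Hz, stepped across 0-40 Hz (assumed layout).
starts = np.linspace(0, 36, 33)
sub_bands = []
for f0 in starts:
    sel = (freqs >= f0) & (freqs < f0 + 4.0)
    # Keep real/imag parts of the selected bins as the sub-band representation.
    coeffs = np.concatenate([spec[:, sel].real, spec[:, sel].imag], axis=1)
    sub_bands.append(coeffs)                      # input of one CSP+LDA branch

print(len(sub_bands), sub_bands[0].shape)
```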
Procedia PDF Downloads 128
290 Geomatic Techniques to Filter Vegetation from Point Clouds
Authors: M. Amparo Núñez-Andrés, Felipe Buill, Albert Prades
Abstract:
More and more frequently, geomatics techniques such as terrestrial laser scanning or digital photogrammetry, either terrestrial or from drones, are being used to obtain digital terrain models (DTM) for the monitoring of geological phenomena that cause natural disasters, such as landslides, rockfalls, and debris flows. One of the main multitemporal analyses developed from these models is the quantification of volume changes in the slopes and hillsides, caused either by erosion, fall, or land movement in the source area or by sedimentation in the deposition zone. To carry out this task, it is necessary to filter from the point clouds all those elements that do not belong to the slopes. Among these elements, vegetation stands out, as it is the one with the greatest presence and constant change, both seasonal and daily, being affected by factors such as wind. One of the best-known indices to detect vegetation in an image is the NDVI (Normalized Difference Vegetation Index), which is obtained from the combination of the infrared and red channels. Therefore, it is necessary to have a multispectral camera. These cameras are generally of lower resolution than conventional RGB cameras, while their cost is much higher. Therefore, we have to look for alternative indices based on RGB. In this communication, we present the results obtained in the Georisk project (PID2019‐103974RB‐I00/MCIN/AEI/10.13039/501100011033) by using the GLI (Green Leaf Index) and ExG (Excessive Greenness), as well as the change to the Hue-Saturation-Value (HSV) color space, where the H coordinate is the one that gives us the most information for vegetation filtering. These filters are applied both to the images, creating binary masks to be used when applying the SfM algorithms, and to the point cloud obtained directly by the photogrammetric process without any previous filter or the one obtained by TLS (Terrestrial Laser Scanning). In this last case, we have also tried to work with a Riegl VZ400i sensor that allows the reception, as in aerial LiDAR, of several returns of the signal, information to be used for the classification of the point cloud. After applying all the techniques in different locations, the results show that the color-based filters allow correct filtering in those areas where the presence of shadows is not excessive and there is a contrast between the color of the slope lithology and the vegetation. As noted above, in the case of the HSV color space, it is the H coordinate that responds best for this filtering. Finally, the use of the various returns of the TLS signal allows filtering with some limitations. Keywords: RGB index, TLS, photogrammetry, multispectral camera, point cloud
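A short sketch of the RGB-based greenness indices and the HSV hue channel mentioned in the text (the thresholds are illustrative, ExG is computed here on raw RGB rather than normalized chromaticities for brevity, and the masking of images/point clouds for SfM is not shown):

```python
# GLI, ExG, and HSV hue as simple vegetation cues on an RGB image in [0, 1].
import numpy as np
from matplotlib.colors import rgb_to_hsv

rgb = np.random.rand(200, 200, 3)                 # placeholder image
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

gli = (2 * g - r - b) / (2 * g + r + b + 1e-6)    # Green Leaf Index
exg = 2 * g - r - b                               # Excess Green index
hue = rgb_to_hsv(rgb)[..., 0] * 360.0             # H coordinate in degrees

vegetation_mask = (gli > 0.1) | ((hue > 70) & (hue < 160))   # assumed thresholds
print("vegetation fraction:", vegetation_mask.mean())
```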
Procedia PDF Downloads 151
289 Combline Cavity Bandpass Filter Design and Implementation Using EM Simulation Tool
Authors: Taha Ahmed Özbey, Sedat Nazlıbilek, Alparslan Çağrı Yapıcı
Abstract:
Combline cavity filters have gained significant attention in recent years due to their exceptional narrowband characteristics, high unloaded Q, remarkable out-of-band rejection, and versatile post-manufacturing tuning capabilities. These filters play a vital role in various wireless communication systems, radar applications, and other advanced technologies where stringent frequency selectivity and superior performance are required. This paper presents combline cavity filter design and implementation by coupling matrix synthesis. Limited filter length, 50 dB out-of-band rejection, and an agile design were targeted. To do so, CAD tools and intuitive methods were used. Keywords: cavity, band pass filter, cavity combline filter, coupling matrix synthesis
Procedia PDF Downloads 71
288 Power Quality Modeling Using Recognition Learning Methods for Waveform Disturbances
Authors: Sang-Keun Moon, Hong-Rok Lim, Jin-O Kim
Abstract:
This paper presents Power Quality (PQ) modeling and filtering processes for distribution system disturbances using recognition learning methods. Typical PQ waveforms with mathematical models and gathered field data are applied to the proposed models. The objective of this paper is to analyze PQ data with respect to monitoring, discriminating, and evaluating the waveforms of power disturbances in order to support preventive system failure protection and the estimation of complex system problems. The examined signal filtering techniques are used for field waveform noise removal and feature extraction. Using extraction and learning classification techniques, the efficiency of recognizing the PQ disturbances was verified, with a focus on interactive modeling methods. The waveforms of eight selected disturbances are modeled with randomized parameters within the IEEE 1159 PQ ranges. The ranges, parameters, and weights are updated according to the field waveforms obtained. Along with voltages, currents undergo the same process to obtain the waveform features, apart from some ratings and filters. Changing loads cause distortion in the voltage waveform because they draw different patterns of current variation. In conclusion, PQ disturbances in the voltage and current waveforms exhibit different patterns of variation, and a modified technique based on symmetrical components in the time domain is proposed in this paper for PQ disturbance detection and subsequent classification. Our method is based on the fact that the waveforms obtained from the suggested trigger conditions contain potential information for abnormality detection. The extracted features are sequentially applied to estimation and recognition learning modules for further studies. Keywords: power quality recognition, PQ modeling, waveform feature extraction, disturbance trigger condition, PQ signal filtering
Procedia PDF Downloads 186
287 Optimization of Multiplier Extraction Digital Filter On FPGA
Authors: Shiksha Jain, Ramesh Mishra
Abstract:
One of the most widely used complex signal processing operations is filtering. FIR digital filters are widely used in DSP to alter the spectrum according to given specifications. Power consumption and area complexity in Finite Impulse Response (FIR) filter algorithms are mainly caused by multipliers. Therefore, we present a multiplier-less technique, the Distributed Arithmetic (DA) technique. In this technique, precomputed values of the inner product are stored in a LUT, which are further added and shifted, with the number of iterations equal to the precision of the input sample. However, the exponential growth of the LUT with the order of the FIR filter, in this basic structure, makes it prohibitive for many applications. A significant area and power reduction over the traditional Distributed Arithmetic (DA) structure is presented in this paper, achieved by slicing the LUT to the desired length. An architecture of a 16-tap FIR filter is presented, with different LUT slice lengths. The result of the FIR filter implementation with the Xilinx ISE synthesis tool (XST) on a Virtex-4 FPGA using the proposed method shows an increase in the maximum frequency, a decrease in resource usage (area savings with a larger number of slices), and a reduction in dynamic power. Keywords: multiplier less technique, linear phase symmetric FIR filter, FPGA tool, look up table
Procedia PDF Downloads 389
286 User Modeling from the Perspective of Improvement in Search Results: A Survey of the State of the Art
Authors: Samira Karimi-Mansoub, Rahem Abri
Abstract:
Currently, users expect high-quality and personalized information from search results. To satisfy users' needs, personalized approaches to web search have been proposed. These approaches can provide the most appropriate answer for a user's needs by using the user context and incorporating information about the query, combining several search technologies. To carry out personalized web search, different techniques need to be applied across the whole user search process. There are a number of possible deployments of personalized approaches, such as personalized web search, personalized recommendation, personalized summarization, and filtering systems, but the common feature of all approaches in various domains is that user modeling is utilized to provide personalized information from the Web. So the most important work in personalized approaches is user model mining. User modeling applications and technologies can be used in various domains depending on how the collected user information may be extracted. In addition, the techniques used to create the user model differ in each of these applications. Since previous studies did not provide a complete survey of this field, our purpose is to present a survey of applications and techniques of user modeling from the viewpoint of improving search results, by considering the existing literature and research. Keywords: filtering systems, personalized web search, user modeling, user search behavior
Procedia PDF Downloads 277
285 A Near-Optimal Domain Independent Approach for Detecting Approximate Duplicates
Authors: Abdelaziz Fellah, Allaoua Maamir
Abstract:
We propose a domain-independent merging-cluster filter approach complemented with a set of algorithms for identifying approximate duplicate entities efficiently and accurately within a single and across multiple data sources. The near-optimal merging-cluster filter (MCF) approach is based on the Monge-Elkan well-tuned algorithm and extended with an affine variant of the Smith-Waterman similarity measure. Then we present constant, variable, and function threshold algorithms that work conceptually in a divide-merge filtering fashion for detecting near duplicates as hierarchical clusters along with their corresponding representatives. The algorithms take recursive refinement approaches in the spirit of filtering, merging, and updating, cluster representatives to detect approximate duplicates at each level of the cluster tree. Experiments show a high effectiveness and accuracy of the MCF approach in detecting approximate duplicates by outperforming the seminal Monge-Elkan’s algorithm on several real-world benchmarks and generated datasets.Keywords: data mining, data cleaning, approximate duplicates, near-duplicates detection, data mining applications and discovery
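To make the token-level matching idea concrete, a sketch of the Monge-Elkan similarity in which difflib's ratio stands in for the affine Smith-Waterman inner measure used by the authors:

```python
# Monge-Elkan: average, over tokens of s1, of the best inner similarity in s2.
from difflib import SequenceMatcher

def inner_sim(a: str, b: str) -> float:
    # Stand-in character-level similarity (the paper uses an affine Smith-Waterman variant).
    return SequenceMatcher(None, a, b).ratio()

def monge_elkan(s1: str, s2: str) -> float:
    tokens1, tokens2 = s1.lower().split(), s2.lower().split()
    if not tokens1 or not tokens2:
        return 0.0
    return sum(max(inner_sim(t1, t2) for t2 in tokens2) for t1 in tokens1) / len(tokens1)

print(monge_elkan("Jon A. Smith", "Smith, John"))        # near-duplicate names
print(monge_elkan("ACME Corp Ltd", "Acme Corporation"))
```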
Procedia PDF Downloads 385
284 An Improved Tracking Approach Using Particle Filter and Background Subtraction
Authors: Amir Mukhtar, Dr. Likun Xia
Abstract:
An improved, robust and efficient visual target tracking algorithm using particle filtering is proposed. Particle filtering has been proven very successful in estimating non-Gaussian and non-linear problems. In this paper, the particle filter is used with color feature to estimate the target state with time. Color distributions are applied as this feature is scale and rotational invariant, shows robustness to partial occlusion and computationally efficient. The performance is made more robust by choosing the different (YIQ) color scheme. Tracking is performed by comparison of chrominance histograms of target and candidate positions (particles). Color based particle filter tracking often leads to inaccurate results when light intensity changes during a video stream. Furthermore, background subtraction technique is used for size estimation of the target. The qualitative evaluation of proposed algorithm is performed on several real-world videos. The experimental results demonstrate that the improved algorithm can track the moving objects very well under illumination changes, occlusion and moving background.Keywords: tracking, particle filter, histogram, corner points, occlusion, illumination
Procedia PDF Downloads 378
283 A Novel RLS Based Adaptive Filtering Method for Speech Enhancement
Authors: Pogula Rakesh, T. Kishore Kumar
Abstract:
Speech enhancement is a long-standing problem with numerous applications like teleconferencing, VoIP, hearing aids, and speech recognition. The motivation behind this research work is to obtain a clean speech signal of higher quality by applying the optimal noise cancellation technique. Real-time adaptive filtering algorithms seem to be the best candidates among all categories of speech enhancement methods. In this paper, we propose a speech enhancement method based on Recursive Least Squares (RLS) adaptive filtering of speech signals. Experiments were performed on noisy data prepared by adding AWGN, babble, and pink noise to clean speech samples at -5dB, 0dB, 5dB, and 10dB SNR levels. We then compare the noise cancellation performance of the proposed RLS algorithm with the existing NLMS algorithm in terms of Mean Squared Error (MSE), Signal to Noise Ratio (SNR), and SNR loss. Based on the performance evaluation, the proposed RLS algorithm was found to be a better optimal noise cancellation technique for speech signals. Keywords: adaptive filter, adaptive noise canceller, mean squared error, noise reduction, NLMS, RLS, SNR, SNR loss
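A compact sketch of an RLS adaptive noise canceller in the spirit described here (filter order, forgetting factor, and the toy speech/noise signals are assumptions, not the authors' exact setup or evaluation):

```python
# RLS noise canceller: the reference noise is filtered to estimate the noise in
# the primary channel; the error signal is the enhanced speech.
import numpy as np

def rls_canceller(primary, reference, order=8, lam=0.999, delta=0.01):
    w = np.zeros(order)
    P = np.eye(order) / delta
    enhanced = np.zeros_like(primary)
    buf = np.zeros(order)
    for n in range(len(primary)):
        buf = np.roll(buf, 1)
        buf[0] = reference[n]                    # most recent reference samples
        k = P @ buf / (lam + buf @ P @ buf)      # RLS gain vector
        e = primary[n] - w @ buf                 # error = cleaned sample
        w = w + k * e
        P = (P - np.outer(k, buf @ P)) / lam
        enhanced[n] = e
    return enhanced

t = np.arange(8000) / 8000
speech = np.sin(2 * np.pi * 5 * t) * np.sin(2 * np.pi * 200 * t)   # toy "speech"
noise = np.random.randn(t.size)
primary = speech + np.convolve(noise, [0.6, 0.3, 0.1], mode="same")
out = rls_canceller(primary, noise)
print("residual noise power:", np.mean((out - speech)[4000:] ** 2))
```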
Procedia PDF Downloads 480
282 Static Application Security Testing Approach for Non-Standard Smart Contracts
Authors: Antonio Horta, Renato Marinho, Raimir Holanda
Abstract:
Considered an evolution of the blockchain, the Ethereum platform, besides allowing transactions of its cryptocurrency, named Ether, allows the programming of decentralised applications (DApps) and smart contracts. However, this functionality in blockchains has raised other types of threats, and the exploitation of smart contract vulnerabilities has led companies to experience big losses. This research intends to figure out the number of contracts that are at risk of being drained. Through a deep investigation, more than two hundred thousand smart contracts currently available on the Ethereum platform were scanned, and it was estimated how much money is at risk. The experiment was based on a query run on Google BigQuery in July 2022, which returned 50,707,133 contracts published on the Ethereum platform. After applying the filtering criteria, the experiment got 430,584 smart contracts to download and analyse. The filtering criteria consisted of filtering out ERC20 and ERC721 contracts, contracts without transactions, and contracts without balance. Of the 430,584 smart contracts selected, only 268,103 had source code published on Etherscan; however, using a hashing process, we discovered that there was contract duplication. Removing the duplicated contracts, the process ended up with 20,417 source codes, which were analysed using the open source SAST tool smartbugs with the oyente and securify algorithms. In the end, there was nearly $100,000 at risk of being drained from the potentially vulnerable smart contracts. It is important to note that the tools used in this study may generate false positives, which may interfere with the number of vulnerable contracts. To address this point, our next step in this research is to develop an application to test the contracts in a parallel environment to verify the vulnerabilities. Finally, this study aims to alert users and companies about the risk of not properly creating and analysing their smart contracts before publishing them on the platform. Like any other application, smart contracts are at risk of having vulnerabilities, which, in this case, may result in direct financial losses. Keywords: blockchain, reentrancy, static application security testing, smart contracts
Procedia PDF Downloads 86
281 Novel Recommender Systems Using Hybrid CF and Social Network Information
Authors: Kyoung-Jae Kim
Abstract:
Collaborative Filtering (CF) is a popular technique for personalization in the e-commerce domain to reduce information overload. In general, CF provides a list of recommended items based on similar users' preferences from the user-item matrix and uses them to predict the focal user's preference for particular items. Many real-world recommender systems use CF techniques because of their excellent accuracy and robustness. However, CF has some limitations, including sparsity problems and high dimensionality of the user-item matrix. In addition, traditional CF does not consider the emotional interaction between users. In this study, we propose recommender systems using social network information and singular value decomposition (SVD) to alleviate these limitations. The purpose of this study is to reduce the dimensionality of the data set using SVD and to improve the performance of CF by using emotional information from the focal user's social network data. In this study, we test the usability of the hybrid CF, SVD, and social network information model using real-world data. The experimental results show that the proposed model outperforms conventional CF models. Keywords: recommender systems, collaborative filtering, social network information, singular value decomposition
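A tiny numpy sketch of the SVD dimensionality-reduction step for CF (the rating matrix and rank are illustrative; the social-network emotional weighting of the proposed model is not included):

```python
# Low-rank reconstruction of a user-item matrix to predict unobserved ratings.
import numpy as np

R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [0, 1, 5, 4],
              [1, 0, 4, 5]], dtype=float)      # rows: users, cols: items, 0 = unknown

U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2                                          # number of latent dimensions kept
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # reduced-rank approximation

user, item = 0, 2                              # predict user 0's score for item 2
print("predicted preference:", round(R_hat[user, item], 2))
```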
Procedia PDF Downloads 288
280 Chemical Warfare Agent Simulant by Photocatalytic Filtering Reactor: Effect of Operating Parameters
Authors: Youcef Serhane, Abdelkrim Bouzaza, Dominique Wolbert, Aymen Amin Assadi
Abstract:
Throughout history, the use of chemical weapons has not been exclusive to combat between army corps; some of these weapons are also found in very targeted intelligence operations (political assassinations), organized crime, and terrorist organizations. To improve the speed of action, important technological devices have been developed in recent years, in particular in the field of protection and decontamination techniques, to better protect against and neutralize a chemical threat. In order to assess certain protective or decontaminating technologies, or to improve medical countermeasures, tests must be conducted. In view of the great toxicity of real chemical warfare agents, simulants can be used, chosen according to the desired application. Here, we present an investigation of the use of a photocatalytic filtering reactor (PFR) for highly contaminated environments containing diethyl sulfide (DES). This target pollutant is used as a simulant of a CWA, namely Yperite (mustard gas). The influence of the inlet concentration, up to high concentrations of DES (1200 ppmv, i.e., 5 g/m³ of air), has been studied. Also, the conversion rate was monitored under different relative humidities and different flow rates (respiratory flow; standards: ISO/DIS 8996 and NF EN 14387+A1). In order to understand the efficacy of pollutant neutralization by the PFR, a kinetic model based on the Langmuir–Hinshelwood (L–H) approach and taking into account the mass transfer step was developed. This allows us to determine the adsorption and kinetic degradation constants with no influence of mass transfer. The obtained results confirm that this compact reactor configuration is an extremely promising way to use photocatalysis to treat highly contaminated environments containing real chemical warfare agents. They can also give rise to an individual protection device (an autonomous cartridge for a gas mask). Keywords: photocatalysis, photocatalytic filtering reactor, diethylsulfide, chemical warfare agents
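For reference, the standard single-site Langmuir-Hinshelwood rate form coupled with an external mass-transfer step at steady state (a generic textbook formulation; the paper's fitted model may differ in detail):

```latex
% Generic Langmuir-Hinshelwood rate with external mass transfer (assumed form)
\begin{align}
  r &= \frac{k_{LH}\,K\,C_s}{1 + K\,C_s}   &&\text{surface reaction rate}\\
  k_m\,(C_b - C_s) &= r                    &&\text{steady-state mass transfer balance}
\end{align}
% C_b, C_s: bulk and surface DES concentrations; K: adsorption constant;
% k_LH: kinetic constant; k_m: mass-transfer coefficient.
```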
Procedia PDF Downloads 102
279 Automation of Process Waste-Free Air Filtration in Production of Concrete, Reinforced with Basalt Fiber
Authors: Stanislav Perepechko
Abstract:
Industrial companies are among the major sources of harmful substances emitted into the atmosphere. The main cause of pollution at concrete plants is cement dust emissions. To avoid this, all cement silos, pneumatic transport, and ventilation systems are equipped with filters. Today, many Russian companies have to decide on the replacement of obsolete and physically worn-out filters and often return to electrostatic filters as the usual equipment. The proposed method of waste-free air filtration differs in that the filtering medium of the filter is itself used in concrete manufacture. Basalt is a widespread and pollution-free material. In the course of cleaning, one part of the basalt fiber and cement goes immediately to the mixer through flow-control units for initial basalt fiber and cement. Another part of the basalt fiber goes to the filters that purify the air used in the air-lift systems and the ventilation emissions passing through them, and, together with the trapped particles, it also goes to the mixer through flow-control units for the basalt fiber spent in the filters. At the same time, the regulators are adjusted in such a way that the total supply of basalt fiber and cement into the mixer remains constant and corresponds to a given technological mode. Keywords: waste-free air filtration, concrete, basalt fiber, building automation
Procedia PDF Downloads 427
278 Analysis of Spamming Threats and Some Possible Solutions for Online Social Networking Sites (OSNS)
Authors: Dilip Singh Sisodia, Shrish Verma
Abstract:
Spamming is the most common issue seen nowadays on the Internet, especially on Online Social Networking Sites (such as Facebook, Twitter, and Google+). Spam messages keep wasting Internet bandwidth and the storage space of servers. On social network sites, spammers often disguise themselves by creating fake accounts and hijacking users' accounts for personal gain. They behave like normal users and continue to change their spamming strategies. To prevent this, most modern spam-filtering solutions are deployed on the receiver side; they are good at filtering spam for end users. In this paper, we present some spamming techniques, their behaviour, and possible solutions. We have analyzed how spammers enter online social networking sites (OSNSs), how they target them, and the techniques they use. The five spamming techniques discussed are clickjacking, socially engineered attacks, cross-site scripting, URL shortening, and drive-by download. We have used the elgg framework to demonstrate some of the spamming threats and the respective implementation of solutions. Keywords: online social networking sites, spam, attacks, internet, clickjacking / likejacking, drive-by-download, URL shortening, networking, socially engineered attacks, elgg framework
Procedia PDF Downloads 345