Search results for: 2D particle image velocimetry

3629 Numerical Simulation of Large-Scale Landslide-Generated Impulse Waves With a Soil‒Water Coupling Smooth Particle Hydrodynamics Model

Authors: Can Huang, Xiaoliang Wang, Qingquan Liu

Abstract:

Soil‒water coupling is an important process in landslide-generated impulse wave (LGIW) problems, accompanied by large deformation of soil, strong interface coupling, and three-dimensional effects. A meshless particle method, smoothed particle hydrodynamics (SPH), has great advantages in dealing with complex interface and multiphase coupling problems. This study presents an improved soil‒water coupled model to simulate LGIW problems based on the open-source code DualSPHysics (v4.0). To address the low efficiency of modeling real large-scale LGIW problems, graphics processing unit (GPU) acceleration technology is implemented in this code. An experimental example, subaerial landslide-generated water waves, is simulated to demonstrate the accuracy of the model. Then, the Huangtian LGIW, a real large-scale event, is modeled to reproduce the entire disaster chain, including landslide dynamics, fluid‒solid interaction, and surge wave generation. The convergence analysis shows that a particle spacing of 5.0 m provides a converged landslide deposit and surge wave for this example. Numerical simulation results are in good agreement with the limited field survey data. The Huangtian application provides a typical reference for large-scale LGIW assessments, yielding reliable information on landslide dynamics, interface coupling behavior, and surge wave characteristics.
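
For readers unfamiliar with SPH, the core operation that such a solver (and its GPU port) parallelizes is a kernel-weighted summation over neighbouring particles. The following minimal Python sketch is illustrative only: it is not DualSPHysics code, and the O(N²) neighbour loop stands in for the cell-list/GPU neighbour search a production solver uses.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard cubic spline SPH kernel in 3D (support radius 2h)."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)
    return sigma * np.where(q <= 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                            np.where(q <= 2.0, 0.25 * (2.0 - q)**3, 0.0))

def sph_density(positions, masses, h):
    """Density at each particle by kernel-weighted summation over
    neighbours. O(N^2) for clarity; real codes use cell lists and
    GPU kernels for this step."""
    rho = np.zeros(len(positions))
    for i in range(len(positions)):
        r = np.linalg.norm(positions - positions[i], axis=1)
        rho[i] = np.sum(masses * cubic_spline_kernel(r, h))
    return rho
```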

Keywords: soil‒water coupling, landslide-generated impulse wave, large-scale, SPH

Procedia PDF Downloads 63
3628 Automatic Moment-Based Texture Segmentation

Authors: Tudor Barbu

Abstract:

An automatic moment-based texture segmentation approach is proposed in this paper. First, we review related work in this computer vision domain. Our texture feature extraction, the first stage of the texture recognition process, produces a set of moment-based feature vectors: for each image pixel, a texture feature vector is computed as a sequence of area moments. Second, an automatic pixel classification approach is proposed. The feature vectors are clustered using an unsupervised classification algorithm, the optimal number of clusters being determined using a measure based on validation indexes. The desired texture regions of the image are then easily determined from the resulting pixel classes.
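
A minimal sketch of the described pipeline: per-pixel moment features clustered by an unsupervised algorithm, with the cluster count chosen by a validation index. The window size, the use of simple local intensity moments rather than the paper's area moments, and the silhouette index as the validation measure are all assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def moment_features(gray, w=9):
    """Per-pixel feature vector of local moments over a w x w window
    (approximate central moments, for illustration)."""
    m1 = uniform_filter(gray, w)                  # local mean
    m2 = uniform_filter(gray**2, w) - m1**2       # local variance
    m3 = uniform_filter((gray - m1)**3, w)        # local third moment
    return np.dstack([m1, m2, m3]).reshape(-1, 3)

def segment(gray, k_range=range(2, 6)):
    X = moment_features(gray.astype(float))
    best = None
    for k in k_range:   # pick k by a validation index (silhouette here)
        labels = KMeans(n_clusters=k, n_init=10).fit_predict(X)
        s = silhouette_score(X, labels, sample_size=min(2000, len(X)))
        if best is None or s > best[0]:
            best = (s, labels)
    return best[1].reshape(gray.shape)
```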

Keywords: image segmentation, moment-based, texture analysis, automatic classification, validation indexes

Procedia PDF Downloads 414
3627 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models

Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu

Abstract:

A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping, and disaster management. To express the Earth's surface as a mathematical model, an infinite number of point measurements would be needed. Since this is impossible, points at regular intervals are measured to characterize the Earth's surface, and a DTM is generated from them. Hitherto, classical measurement techniques and photogrammetry have been in widespread use for DTM construction. At present, RADAR, LiDAR, and stereo satellite images are also used. In recent years, especially because of its advantages, Airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications: a 3D point cloud is created with LiDAR technology by obtaining numerous point data. More recently, with developments in image mapping methods, the use of unmanned aerial vehicles (UAV) for photogrammetric data acquisition has increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors such as the data collection method, the distribution of elevation points, the point density, surface properties, and the interpolation method. In this study, the random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets using a random sampling algorithm, representing 75, 50, 25, and 5% of the original data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original point cloud is compared with DTMs interpolated from the reduced data sets by the kriging interpolation method. The results show that random data reduction can thin image-based point cloud datasets to the 50% density level while still maintaining DTM quality.
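
The random reduction step itself is straightforward; a sketch under the assumption of a plain x/y/z text point file (the kriging interpolation and DTM comparison steps are omitted).

```python
import numpy as np

def random_reduce(points, fraction, seed=0):
    """Randomly retain a given fraction of an image-based point cloud."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(points), size=int(len(points) * fraction),
                     replace=False)
    return points[idx]

# Subsets used in the study: 75, 50, 25 and 5 % of the original cloud
cloud = np.loadtxt("points.xyz")   # hypothetical x, y, z text file
subsets = {f: random_reduce(cloud, f) for f in (0.75, 0.50, 0.25, 0.05)}
```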

Keywords: DTM, Unmanned Aerial Vehicle (UAV), uniform, random, kriging

Procedia PDF Downloads 153
3626 Impact of Ship Traffic on PM2.5 and Particle Number Concentrations in Three Port-Cities of the Adriatic/Ionian Area

Authors: Daniele Contini, Antonio Donateo, Andrea Gambaro, Athanasios Argiriou, Dimitrios Melas, Daniela Cesari, Anastasia Poupkou, Athanasios Karagiannidis, Apostolos Tsakis, Eva Merico, Rita Cesari, Adelaide Dinoi

Abstract:

Emissions of atmospheric pollutants from ships and harbour activities are a growing concern at the international level, given their potential impacts on air quality and climate. These close-to-land emissions have potential impacts on local communities in terms of air quality and health. Recent studies show that the impact of maritime traffic on atmospheric particulate matter concentrations in several coastal urban areas is comparable to that of the road traffic of a medium-sized town. However, several different approaches have been used for these estimates, making a direct comparison of results difficult. In this work, an integrated approach based on emission inventories and dedicated measurement campaigns has been applied to give a comparable estimate of the impact of maritime traffic on PM2.5 and particle number concentrations in three major harbours of the Adriatic/Ionian Seas. The influences of local meteorology and of the logistic layout of the harbours are discussed.

Keywords: ship emissions, PM2.5, particle number concentrations, impact of shipping on atmospheric aerosol

Procedia PDF Downloads 751
3625 MSG Image Encryption Based on AES and RSA Algorithms "MSG Image Security"

Authors: Boukhatem Mohammed Belkaid, Lahdir Mourad

Abstract:

In this paper, we propose a new encryption system to secure meteorological images from Meteosat Second Generation (MSG), which generates 12 images every 15 minutes. The hybrid encryption scheme is based on the AES and RSA algorithms to provide three security services: authentication, integrity, and confidentiality. Confidentiality is ensured by AES, and authenticity by the RSA algorithm. Integrity is assured by a basic function of the correlation between adjacent pixels. Our system generates a unique password every 15 minutes that is used to encrypt each frame of the MSG meteorological data, strengthening and ensuring its security. Several metrics have been used in the various tests of our analysis. The integrity test shows the effectiveness of our system: the cryptographic digest changes at reception whenever the image has been altered in the transmission channel.
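
A generic hybrid AES/RSA sketch in Python (using the `cryptography` package) of the kind of scheme described: a fresh session key per 15-minute cycle, wrapped with RSA. The CBC mode, OAEP padding, and naive zero padding are assumptions, and the paper's pixel-correlation integrity check is not reproduced here.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_frame(frame_bytes, rsa_public_key):
    """Hybrid scheme: a fresh AES session key per cycle, wrapped with
    RSA-OAEP; AES-CBC encrypts the image payload."""
    session_key = os.urandom(32)        # new key each 15-minute cycle
    iv = os.urandom(16)
    padded = frame_bytes + b"\x00" * (-len(frame_bytes) % 16)  # naive pad
    enc = Cipher(algorithms.AES(session_key), modes.CBC(iv)).encryptor()
    ciphertext = enc.update(padded) + enc.finalize()
    wrapped_key = rsa_public_key.encrypt(
        session_key,
        padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    return wrapped_key, iv, ciphertext

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
wrapped, iv, ct = encrypt_frame(b"...MSG image bytes...", key.public_key())
```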

Keywords: AES, RSA, integrity, confidentiality, authentication, satellite MSG, encryption, decryption, key, correlation

Procedia PDF Downloads 380
3624 Development of Wide Bandgap Semiconductor Based Particle Detector

Authors: Rupa Jeena, Pankaj Chetry, Pradeep Sarin

Abstract:

The study of fundamental particles and the forces governing them has always remained an attractive field of theoretical study. With the advancement and development of new technologies and instruments, it is now possible to perform particle physics experiments on a large scale to validate theoretical predictions. These experiments are generally carried out in a highly intense beam environment. This, in turn, requires the development of a detector prototype possessing properties like radiation tolerance, thermal stability, and fast timing response. Semiconductors like silicon, germanium, diamond, and gallium nitride (GaN) have been widely used for particle detection applications. Silicon and germanium, being narrow-bandgap semiconductors, require pre-cooling to suppress the noise from thermally generated intrinsic charge carriers. The application of diamond in large-scale experiments is rare owing to its high fabrication cost, while GaN is one of the most extensively explored potential candidates. Considering all these requirements, we aim to introduce another wide-bandgap semiconductor into this active area of research: rutile titanium dioxide (TiO2), whose wide bandgap and other properties we exploit for particle detection purposes. The thermal evaporation-oxidation (in a PID furnace) technique is used for deposition of the film, and the metal-semiconductor-metal (MSM) electrical contacts are made using titanium+gold (Ti+Au) (20/80 nm). Characterization comprising X-ray diffraction (XRD), atomic force microscopy (AFM), ultraviolet (UV)-visible spectroscopy, and laser Raman spectroscopy (LRS) has been performed on the film to obtain detailed information about its surface morphology. In addition, electrical characterizations such as current-voltage (I-V) measurements in dark and light and tests with a laser were performed to better understand the working of the detector prototype. All these preliminary tests of the detector will be presented.

Keywords: particle detector, rutile titanium dioxide, thermal evaporation, wide bandgap semiconductors

Procedia PDF Downloads 78
3623 Cuckoo Search Optimization for Black Scholes Option Pricing

Authors: Manas Shah

Abstract:

The Black-Scholes option pricing model is one of the most important concepts in the modern world of computational finance. However, its practical use can be challenging because one of its input parameters must be estimated: the implied volatility of the underlying security. The more precisely this value is estimated, the more accurate the corresponding theoretical option prices will be. Here, we present a novel model based on Cuckoo Search Optimization (CS) which finds more precise estimates of implied volatility than Particle Swarm Optimization (PSO) and the Genetic Algorithm (GA).
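
A minimal sketch of the approach: the Black-Scholes call price as the forward model, and a bare-bones cuckoo search (Lévy flights via Mantegna's algorithm) minimizing the pricing error over sigma. Population size, step scale, and abandonment fraction are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from math import gamma
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def implied_vol_cs(market_price, S, K, T, r,
                   n_nests=15, iters=200, pa=0.25):
    """Minimal cuckoo search for sigma in (0.01, 2.0)."""
    rng = np.random.default_rng(1)
    lo, hi, beta = 0.01, 2.0, 1.5
    err = lambda s: abs(bs_call(S, K, T, r, s) - market_price)
    nests = rng.uniform(lo, hi, n_nests)
    fit = np.array([err(s) for s in nests])
    # Mantegna's scale factor for Levy flights
    sigma_u = (gamma(1 + beta) * np.sin(np.pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2**((beta - 1) / 2)))**(1 / beta)
    for _ in range(iters):
        step = (rng.normal(0, sigma_u, n_nests) /
                np.abs(rng.normal(0, 1, n_nests))**(1 / beta))
        best = nests[np.argmin(fit)]
        new = np.clip(nests + 0.01 * step * (nests - best), lo, hi)
        new_fit = np.array([err(s) for s in new])
        better = new_fit < fit
        nests[better], fit[better] = new[better], new_fit[better]
        worst = np.argsort(fit)[-int(pa * n_nests):]   # abandon worst nests
        nests[worst] = rng.uniform(lo, hi, len(worst))
        fit[worst] = [err(s) for s in nests[worst]]
    return nests[np.argmin(fit)]
```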

Keywords: black scholes model, cuckoo search optimization, particle swarm optimization, genetic algorithm

Procedia PDF Downloads 452
3622 Manufacturing Process and Cost Estimation through Process Detection by Applying Image Processing Technique

Authors: Chalakorn Chitsaart, Suchada Rianmora, Noppawat Vongpiyasatit

Abstract:

In order to reduce the transportation time and cost of direct interaction between customer and manufacturer, an image processing technique is introduced in this research with which designing a part and defining its manufacturing process can be performed quickly. A 3D virtual model is generated directly from a series of multi-view images of an object; it can then be modified and analyzed, and its structure or function improved, for further implementations such as computer-aided manufacturing (CAM). To estimate and quote the production cost, a user-friendly platform has been developed in this research in which the appropriate manufacturing parameters and processes are identified and planned by CAM simulation.

Keywords: image processing technique, feature detections, surface registrations, capturing multi-view images, production costs, manufacturing processes

Procedia PDF Downloads 249
3621 Advanced Hybrid Particle Swarm Optimization for Congestion and Power Loss Reduction in Distribution Networks with High Distributed Generation Penetration through Network Reconfiguration

Authors: C. Iraklis, G. Evmiridis, A. Iraklis

Abstract:

Renewable energy sources and distributed power generation units already play an important role in electrical power generation. A mixture of different technologies penetrating the electrical grid adds complexity to the management of distribution networks. High penetration of distributed power generation units creates node over-voltages, large power losses, unreliable power management, reverse power flow, and congestion. This paper presents an optimization algorithm capable of reducing congestion and power losses, both expressed in a weighted-sum objective function. Two factors that describe congestion are proposed. An upgraded selective particle swarm optimization (SPSO) algorithm is used as a solution tool, focusing on the technique of network reconfiguration. The upgrade of the SPSO algorithm is achieved by adding a heuristic algorithm specialized in the reduction of power losses, and several scenarios are tested. Results show significant improvement in the minimization of losses and congestion while achieving very short calculation times.

Keywords: congestion, distribution networks, loss reduction, particle swarm optimization, smart grid

Procedia PDF Downloads 443
3620 Image Denoising Using Spatial Adaptive Mask Filter for Medical Images

Authors: R. Sumalatha, M. V. Subramanyam

Abstract:

In medical image processing, the quality of an image is degraded in the presence of noise. In ultrasound and magnetic resonance imaging in particular, the data are corrupted by impulse noise known as salt-and-pepper noise. Removal of noise from medical images is a critical issue for researchers. In this paper, a new technique, the Adaptive Spatial Mask Filter (ASMF), is proposed. The proposed filter is used to increase the quality of MRI and ultrasound images. Experimental results show that the proposed filter outperforms implementations of the mean, median, and adaptive median filters in terms of MSE and PSNR.
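
The ASMF itself is not published here, but the classic adaptive median filter it is compared against can be sketched directly; this is the textbook algorithm, with the maximum window size as a tunable parameter.

```python
import numpy as np

def adaptive_median(img, s_max=7):
    """Classic adaptive median filter for salt-and-pepper noise: grow the
    window until the median is not an impulse, then replace only pixels
    that look like impulses themselves."""
    out = img.copy()
    pad = s_max // 2
    padded = np.pad(img, pad, mode="reflect")
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            for s in range(3, s_max + 1, 2):       # windows 3, 5, 7, ...
                half = s // 2
                win = padded[i + pad - half:i + pad + half + 1,
                             j + pad - half:j + pad + half + 1]
                zmin, zmed, zmax = win.min(), np.median(win), win.max()
                if zmin < zmed < zmax:             # median is not an impulse
                    if not (zmin < img[i, j] < zmax):
                        out[i, j] = zmed           # replace corrupted pixel
                    break
    return out
```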

Keywords: salt and pepper noise, ASMF, PSNR, MSE

Procedia PDF Downloads 434
3619 Mobile and Hot Spot Measurement with Optical Particle Counting Based Dust Monitor EDM264

Authors: V. Ziegler, F. Schneider, M. Pesch

Abstract:

With the EDM264, GRIMM offers a solution for mobile short- and long-term measurements in outdoor areas and at production sites, for research as well as permanent areal observations at near-reference quality. The EDM264 features a powerful and robust measuring cell based on the optical particle counting (OPC) principle, with all the advantages that users of GRIMM's portable aerosol spectrometers are used to. The system is embedded in a compact weather-protection housing with all-weather sampling, a heated inlet system, a data logger, and a meteorological sensor. With TSP, PM10, PM4, PM2.5, PM1, and PMcoarse, the EDM264 provides all fine dust fractions in real time, valid for outdoor applications and calculated with the proven GRIMM enviro-algorithm, as well as six additional dust mass fractions (pm10, pm2.5, pm1, inhalable, thoracic, and respirable) for IAQ and workplace measurements. This highly versatile instrument performs real-time monitoring of particle number and particle size and provides information on particle surface distribution as well as dust mass distribution. GRIMM's EDM264 has 31 equidistant size channels, which are PSL traceable. A high-end data logger enables data acquisition and wireless communication via LTE, WLAN, or wired Ethernet. Backup copies of the measurement data are stored directly in the device. The rinsing-air function, which protects the laser and detector in the optical cell, further increases the reliability and long-term stability of the EDM264 under different environmental and climatic conditions. The entire sample volume flow of 1.2 L/min is analyzed 100% in the optical cell, which assures excellent counting efficiency at low and high concentrations and complies with the ISO 21501-1 standard for OPCs. With all these features, the EDM264 is a world-leading dust monitor for precise monitoring of particulate matter and particle number concentration, and an indispensable tool for users who need to measure aerosol levels and air quality outdoors, on construction sites, or at production facilities.

Keywords: aerosol research, aerial observation, fence line monitoring, wild fire detection

Procedia PDF Downloads 148
3618 Detection of Latent Fingerprints Recovered from Arson Simulation by a Novel Fluorescent Method

Authors: Somayeh Khanjani, Samaneh Nabavi, Shirin Jalili, Afshin Khara

Abstract:

Fingerprints are a ubiquitous source of evidence and are consequential for establishing identity. The detection and subsequent development of fingerprints are thus essential in criminal investigations. This becomes a difficult task under certain extreme conditions, such as fire. A fire scene may be accidental or arson. Evidence subjected to fire is generally overlooked because of a misconception that it is damaged, a belief rooted in the destructive nature of fire. There are several scientific approaches to determine whether a fire was deliberate or not, and in such a scenario, fingerprints may be the most critical evidence linking the perpetrator to the crime. Fingerprints subjected to fire are exposed to high temperatures, soot deposition, electromagnetic radiation, and the subsequent force of water; these phenomena are believed to damage the fingerprint. A novel fluorescent reagent and a pre-existing small particle reagent were investigated for this purpose. A zinc carbonate-based fluorescent small particle reagent was capable of developing latent fingerprints exposed to a maximum temperature of 800 °C. Fluorescent SPR may prove very useful in such cases and is a potential method for developing fingerprints from arson sites. The method is cost-effective and non-hazardous, and this formulation is suitable for developing fingerprints exposed to fire or arson.

Keywords: fingerprint, small particle reagent (SPR), arson, novel fluorescent

Procedia PDF Downloads 469
3617 Enhancing the Bionic Eye: A Real-time Image Optimization Framework to Encode Color and Spatial Information Into Retinal Prostheses

Authors: William Huang

Abstract:

Retinal prostheses are currently limited to low-resolution grayscale images that lack color and spatial information. This study develops a novel real-time image optimization framework and tools to encode maximal information to prostheses that are constrained by the number of electrodes. One key idea is to localize the main objects in images while reducing unnecessary background noise through region-contrast saliency maps. A novel color depth mapping technique was developed through MiniBatchKMeans clustering and color space selection. The resulting image was downsampled using bicubic interpolation to reduce image size while preserving color quality. In comparison to current schemes, the proposed framework demonstrated better visual quality on the tested images. The use of the region-contrast saliency map showed improvements in efficacy of up to 30%. Finally, the computation time of this algorithm is less than 380 ms on the tested cases, making real-time retinal prostheses feasible.
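
A sketch of two of the stages named in the abstract, MiniBatchKMeans color quantization followed by bicubic downsampling, assuming OpenCV and scikit-learn. The electrode-grid size and cluster count are placeholders, and the saliency-map localization step is omitted.

```python
import cv2
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def encode_for_prosthesis(bgr, n_colors=8, out_size=(60, 60)):
    """Quantize colors with MiniBatchKMeans, then bicubic-downsample to
    a hypothetical electrode grid."""
    pixels = bgr.reshape(-1, 3).astype(np.float32)
    km = MiniBatchKMeans(n_clusters=n_colors, n_init=3).fit(pixels)
    quant = km.cluster_centers_[km.labels_].reshape(bgr.shape)
    quant = quant.astype(np.uint8)
    return cv2.resize(quant, out_size, interpolation=cv2.INTER_CUBIC)
```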

Keywords: retinal implants, virtual processing unit, computer vision, saliency maps, color quantization

Procedia PDF Downloads 151
3616 Static and Dynamic Hand Gesture Recognition Using Convolutional Neural Network Models

Authors: Keyi Wang

Abstract:

Similar to the touchscreen, hand-gesture-based human-computer interaction (HCI) is a technology that could allow people to perform a variety of tasks faster and more conveniently. This paper proposes a training method for an image- and video-based hand gesture recognition system using convolutional neural networks (CNNs). A dataset containing 6 hand gestures is used to train a 2D CNN model, achieving ~98% accuracy. Furthermore, a 3D CNN model is trained on a dataset containing 4 hand gesture video clips, resulting in ~83% accuracy. It is demonstrated that a Cozmo robot loaded with the pre-trained models is able to recognize static and dynamic hand gestures.
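
A small 2D CNN of the kind described can be sketched in Keras; the layer sizes, the 64x64 grayscale input, and the training call are illustrative assumptions rather than the authors' exact architecture (the 3D CNN for video clips would use Conv3D/MaxPooling3D counterparts).

```python
import tensorflow as tf

# Minimal 2D CNN classifier for 6 static gesture classes (assumed shapes).
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu",
                           input_shape=(64, 64, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(6, activation="softmax"),   # 6 gesture classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=10)  # x_train: (N, 64, 64, 1) images
```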

Keywords: deep learning, hand gesture recognition, computer vision, image processing

Procedia PDF Downloads 136
3615 Content-Based Color Image Retrieval Based on the 2-D Histogram and Statistical Moments

Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed

Abstract:

In this paper, we are interested in the problem of finding similar images in a large database. For this purpose, we propose a new algorithm based on a combination of 2-D histogram intersection in the HSV space and statistical moments. The proposed histogram is based on a 3x3 window and not only on the intensity of the pixel. This approach overcomes the drawback of the conventional 1-D histogram, which ignores the spatial distribution of pixels in the image, while the statistical moments are used to escape the color space discretization effects intrinsic to the use of histograms. We compare the performance of our new algorithm to various state-of-the-art methods and show that it has several advantages: it is fast, consumes little memory, and requires no learning. To validate our results, we apply this algorithm to search for similar images in different image databases.
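
A sketch of the two ingredients: a 2-D hue-saturation histogram compared by intersection, mixed with a moment-based distance. For simplicity this uses a standard per-pixel histogram rather than the paper's 3x3-window variant, and the mixing weight is an assumption.

```python
import cv2
import numpy as np

def hs_histogram(bgr, bins=[30, 32]):
    """2-D hue-saturation histogram in HSV space, normalized to sum 1."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    h = cv2.calcHist([hsv], [0, 1], None, bins, [0, 180, 0, 256])
    return h / h.sum()

def similarity(img_a, img_b, alpha=0.7):
    """Weighted mix of histogram intersection and a moment distance
    (per-channel mean and std as the statistical moments)."""
    inter = np.minimum(hs_histogram(img_a), hs_histogram(img_b)).sum()
    mom = lambda im: np.hstack([im.reshape(-1, 3).mean(0),
                                im.reshape(-1, 3).std(0)])
    mom_dist = np.linalg.norm(mom(img_a) - mom(img_b))
    return alpha * inter - (1 - alpha) * mom_dist   # higher = more similar
```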

Keywords: 2-D histogram, statistical moments, indexing, similarity distance, histograms intersection

Procedia PDF Downloads 454
3614 Inertial Particle Focusing Dynamics in Trapezoid Straight Microchannels: Application to Continuous Particle Filtration

Authors: Reza Moloudi, Steve Oh, Charles Chun Yang, Majid Ebrahimi Warkiani, May Win Naing

Abstract:

Inertial microfluidics has emerged recently as a promising tool for high-throughput manipulation of particles and cells for a wide range of flow cytometric tasks, including cell separation/filtration, cell counting, and mechanical phenotyping. Inertial focusing depends profoundly on the cross-sectional shape of the channel, which affects not only the shear field but also the wall-effect lift force near the wall region. Despite comprehensive experiments and numerical analyses of the lift forces in rectangular and non-rectangular microchannels (half-circular and triangular cross-sections), which all possess planes of symmetry, less effort has been devoted to the flow field structure of trapezoidal straight microchannels and its effects on inertial focusing; a rectilinear channel with a trapezoidal cross-section breaks all planes of symmetry. In this study, particle focusing dynamics inside trapezoidal straight microchannels were first studied systematically over a broad range of channel Reynolds numbers (20 < Re < 800). The altered axial velocity profile, and consequently the new shear force arrangement, led to a cross-lateral movement of the equilibrium position toward the longer side wall when the rectangular straight channel was changed to a trapezoid; however, as the channel Reynolds number increased further (Re > 50), the main lateral focusing position started to move back toward the middle and the shorter side wall, depending on the particle clogging ratio (K = a/Hmin, where a is the particle size), the channel aspect ratio (AR = W/Hmin, where W is the channel width and Hmin is the smaller channel height), and the slope of the slanted wall. Increasing the channel aspect ratio (AR) from 2 to 4 and the slope of the slanted wall up to Tan(α) ≈ 0.4 (Tan(α) = (Hlonger-sidewall - Hshorter-sidewall)/W) shifted the off-center lateral focusing position from the middle of the channel cross-section by up to ~20 percent of the channel width. It was found that focusing was spoiled near the slanted wall due to the asymmetry; particles mainly focused near the bottom wall or fluctuated between the channel center and the bottom wall, depending on the slanted wall and Re (Re < 100, channel aspect ratio 4:1). Finally, as a proof of principle, a trapezoidal straight microchannel with a bifurcation was designed and utilized for continuous filtration over a broader range of particle clogging ratios (0.3 < K < 1), exiting through the longer-wall outlet with ~99% efficiency (Re < 100), in comparison to rectangular straight microchannels (W > H, 0.3 ≤ K < 0.5).

Keywords: cell/particle sorting, filtration, inertial microfluidics, straight microchannel, trapezoid

Procedia PDF Downloads 223
3613 On the Optimality Assessment of Nano-Particle Size Spectrometry and Its Association to the Entropy Concept

Authors: A. Shaygani, R. Saifi, M. S. Saidi, M. Sani

Abstract:

Particle size distribution, the most important characteristic of aerosols, is obtained through electrical characterization techniques. The dynamics of charged nano-particles under the influence of the electric field in an electrical mobility spectrometer (EMS) reveals the size distribution of these particles. The accuracy of this measurement is influenced by flow conditions, geometry, the electric field, and the particle charging process, and therefore by the transfer function (transfer matrix) of the instrument. In this work, a wire-cylinder corona charger was designed, and the combined field-diffusion charging process of injected poly-disperse aerosol particles was numerically simulated as a prerequisite for the study of a multi-channel EMS. The result, a cloud of particles with a non-uniform charge distribution, was introduced to the EMS. The flow pattern and electric field in the EMS were simulated using computational fluid dynamics (CFD) to obtain particle trajectories in the device and thereby calculate the signal reported by each electrometer. Based on the output signals (resulting from the bombardment of particles and the transfer of their charges as currents), we proposed a modification to the size of the detecting rings (which are connected to the electrometers) in order to evaluate particle size distributions more accurately. Based on the capability of the system to convey information about the size distribution of the injected particles, we proposed a benchmark for assessing the optimality of the design. This method applies the concept of von Neumann entropy and borrows the definition of entropy from information theory (Shannon entropy) to measure optimality. Entropy, in the Shannon sense, is the 'average amount of information contained in an event, sample or character extracted from a data stream'. Evaluating the responses (signals) obtained via various configurations of detecting rings, the configuration that gave the best predictions of the size distributions of the injected particles was the modified configuration; it was also the one with the maximum amount of entropy. A reasonable consistency was also observed between the accuracy of the predictions and the entropy content of each configuration. In this method, entropy is extracted from the transfer matrix of the instrument for each configuration. Finally, various clouds of particles were introduced into the simulations, and the predicted size distributions were compared to the exact size distributions.
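
For concreteness, both entropy notions mentioned can be sketched on a transfer matrix T (rows: electrometer channels; columns: injected sizes); the normalization choices here are assumptions, not the paper's exact definitions.

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy (bits) of a non-negative vector, normalized to 1."""
    p = p / p.sum()
    return -np.sum(p * np.log2(p + eps))

def mean_column_entropy(T):
    """Average Shannon entropy of the instrument response, one column
    per injected particle size."""
    T = np.abs(T)
    return np.mean([shannon_entropy(T[:, j]) for j in range(T.shape[1])])

def von_neumann_entropy(T):
    """Von Neumann entropy of a density-matrix-like normalization of T."""
    rho = T @ T.T
    rho = rho / np.trace(rho)
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log2(w))
```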

Keywords: aerosol nano-particle, CFD, electrical mobility spectrometer, von neumann entropy

Procedia PDF Downloads 342
3612 Contrast Enhancement of Color Images with Color Morphing Approach

Authors: Javed Khan, Aamir Saeed Malik, Nidal Kamel, Sarat Chandra Dass, Azura Mohd Affandi

Abstract:

Low-contrast images can result from wrong image acquisition settings or poor illumination conditions. Such images may not be visually appealing and can make feature extraction difficult. Contrast enhancement of color images can be useful in the medical area for visual inspection. In this paper, a new technique is proposed to improve the contrast of color images. The RGB (red, green, blue) color image is transformed into normalized RGB color space. Adaptive histogram equalization is applied to each of the three channels of the normalized RGB color space. The corresponding channels in the original (low-contrast) image and the contrast-enhanced image produced by adaptive histogram equalization (AHE) are morphed together in proper proportions. The proposed technique is tested on seventy color images of acne patients. The results are analyzed using cumulative variance and contrast improvement factor measures, and are also compared with decorrelation stretch. Both subjective and quantitative analyses demonstrate that the proposed technique outperforms the other techniques.
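
A sketch of the described chain under assumed proportions: normalized-RGB channels, adaptive (here contrast-limited) histogram equalization per channel via OpenCV, then morphing with the original image. The morphing weight alpha is an assumption; the paper's exact proportions are not given.

```python
import cv2
import numpy as np

def enhance(bgr, alpha=0.6):
    """Normalized-RGB channels are CLAHE-equalized, then morphed with
    the original image in proportion alpha."""
    f = bgr.astype(np.float32)
    norm = f / (f.sum(axis=2, keepdims=True) + 1e-6)   # normalized RGB
    norm8 = (norm * 255).astype(np.uint8)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    eq = cv2.merge([clahe.apply(norm8[:, :, c]) for c in range(3)])
    return cv2.addWeighted(eq, alpha, bgr, 1 - alpha, 0)  # color morphing
```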

Keywords: contrast enhancement, normalized RGB, adaptive histogram equalization, cumulative variance

Procedia PDF Downloads 374
3611 Deep Learning for the Generation of Weights for Image Captioning Using a Part-of-Speech Approach

Authors: Tiago do Carmo Nogueira, Cássio Dener Noronha Vinhal, Gélson da Cruz Júnior, Matheus Rudolfo Diedrich Ullmann

Abstract:

Generating automatic image descriptions in natural language is a challenging task. Image captioning is the task of coherently describing an image by combining computer vision and natural language processing techniques. To accomplish this task, state-of-the-art models use encoder-decoder structures: convolutional neural networks (CNNs) extract the characteristics of the images, and recurrent neural networks (RNNs) generate the descriptive sentences. However, state-of-the-art approaches still suffer from generating incorrect captions and accumulating errors in the decoder. To solve this problem, we propose a model based on the encoder-decoder structure that introduces a module generating weights according to each word's importance in forming the sentence, using part-of-speech (PoS) information. The results demonstrate that our model surpasses state-of-the-art models.

Keywords: gated recurrent units, caption generation, convolutional neural network, part-of-speech

Procedia PDF Downloads 101
3610 A Neural Approach for Color-Textured Images Segmentation

Authors: Khalid Salhi, El Miloud Jaara, Mohammed Talibi Alaoui

Abstract:

In this paper, we present a neural approach for unsupervised segmentation of natural color-texture images based on both Kohonen maps and mathematical morphology, using a combination of the texture and color information of the image: fractal features based on the fractal dimension are selected to represent the texture information, and the color features are represented in RGB color space. These features are then used to train the Kohonen network, which is represented by the underlying probability density function; the segmentation of this map is made by the morphological watershed transformation. The performance of our color-texture segmentation approach is compared first to color-based or texture-based methods alone, and then to the k-means method.

Keywords: segmentation, color-texture, neural networks, fractal, watershed

Procedia PDF Downloads 344
3609 Development of Algorithms for the Study of the Image in Digital Form for Satellite Applications: Extraction of a Road Network and Its Nodes

Authors: Zineb Nougrara

Abstract:

In this paper, we propose a novel methodology for extracting a road network and its nodes from satellite images of Algeria. The technique developed here extends our previous research. It is founded on information theory and mathematical morphology, which are combined to extract and link the road segments that form a road network and its nodes. We therefore have to define objects as sets of pixels and to study the shape of these objects and the relations that exist between them. In this approach, geometric and radiometric features of roads are integrated through a cost function and a set of selected points on a crossing road. Performance was tested on satellite images of Algeria.

Keywords: satellite image, road network, nodes, image analysis and processing

Procedia PDF Downloads 272
3608 Optimal Tuning of Linear Quadratic Regulator Controller Using a Particle Swarm Optimization for Two-Rotor Aerodynamical System

Authors: Ayad Al-Mahturi, Herman Wahid

Abstract:

This paper presents an optimal state feedback controller based on the Linear Quadratic Regulator (LQR) for a two-rotor aero-dynamical system (TRAS). TRAS is a highly nonlinear multi-input multi-output (MIMO) system with two degrees of freedom and cross-coupling. Two parameters define the behavior of the LQR controller and influence its performance: the state weighting matrix and the control weighting matrix. Particle Swarm Optimization (PSO) is proposed to optimally tune the weighting matrices of the LQR. The main goal of the LQR controller is to stabilize the TRAS by making the beam move quickly and accurately, whether tracking a trajectory or reaching a desired altitude. The simulations were carried out in MATLAB/Simulink, with the system decoupled into two single-input single-output (SISO) systems. Compared with the optimized proportional-integral-derivative (PID) controller provided by INTECO, the results show that the LQR controller gives better performance in terms of both transient and steady-state response when PSO is used.
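
The LQR synthesis step and the cost a PSO particle would minimize can be sketched as follows. The forward-Euler simulation, the ITAE-style cost, and the diagonal Q parameterization are illustrative assumptions; the actual TRAS model matrices come from the INTECO documentation.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Continuous-time LQR: solve the Riccati equation, K = R^-1 B^T P."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

def fitness(q_diag, A, B, t, x0):
    """Cost a PSO particle would minimize: time-weighted absolute error
    (ITAE-style) of the closed-loop response for a candidate diag(Q)."""
    K = lqr_gain(A, B, np.diag(q_diag), np.eye(B.shape[1]))
    x, dt, cost = x0.copy(), t[1] - t[0], 0.0
    for ti in t:                       # simple forward-Euler simulation
        x = x + dt * (A - B @ K) @ x
        cost += ti * np.abs(x).sum() * dt
    return cost
```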

Keywords: LQR controller, optimal control, particle swarm optimization (PSO), two rotor aero-dynamical system (TRAS)

Procedia PDF Downloads 321
3607 A Visual Inspection System for Automotive Sheet Metal Chassis Parts Produced with the Cold-Forming Method

Authors: İmren Öztürk Yılmaz, Abdullah Yasin Bilici, Yasin Atalay Candemir

Abstract:

The system consists of four main elements: a motion system, an image acquisition system, image processing software, and a control interface. Parts coming off the production line enter the image processing system via the conveyor belt at the end of the line. A 3D scan of the produced part is performed with the laser scanning system integrated at the system's entry side. Through the 3D scan, the position and angle at which the parts enter the system are determined; from these data, parameters such as the part origin and conveyor speed are calculated by the designed software, and the robot is informed of the position from which it will pick up the part. The robot then takes the produced part from the belt conveyor and presents it to high-resolution cameras for quality control. Measurements are carried out with a maximum error of 20 microns, as determined by the experiments.

Keywords: quality control, industry 4.0, image processing, automated fault detection, digital visual inspection

Procedia PDF Downloads 111
3606 Multi-Stage Classification for Lung Lesion Detection on CT Scan Images Applying Medical Image Processing Technique

Authors: Behnaz Sohani, Sahand Shahalinezhad, Amir Rahmani, Aliyu Aliyu

Abstract:

Recently, medical imaging, and specifically medical image processing, has become one of the most dynamically developing areas of medical science, leading to new approaches in the prevention, diagnosis, and treatment of various diseases. In the diagnosis of lung cancer, medical professionals rely on computed tomography (CT) scans, in which failure to correctly identify masses can lead to incorrect diagnosis or sampling of lung tissue. The identification and demarcation of masses when detecting cancer within lung tissue are critical challenges in diagnosis. In this work, a segmentation system based on image processing techniques has been applied for detection purposes. In particular, the use and validation of a novel lung cancer detection algorithm are presented through simulation, performed on CT images using multilevel thresholding. The proposed technique consists of segmentation, feature extraction, feature selection, and classification; the features carrying useful information are selected after feature extraction. The output lung cancer image is obtained with 96.3% accuracy and 87.25%. The purpose of feature extraction in the proposed approach is to transform the raw data into a more usable form for subsequent statistical processing. Future steps will involve refining the current feature extraction method to achieve more accurate resulting images, making further details available to machine vision systems to recognise objects in lung CT scan images.
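
The multilevel-thresholding segmentation stage can be sketched with scikit-image's multi-Otsu implementation. Treating the brightest class as the candidate mass region is an assumption, and the paper's feature extraction/selection and classification stages are omitted.

```python
import numpy as np
from skimage.filters import threshold_multiotsu

def segment_lung_ct(ct_slice, classes=3):
    """Multilevel (multi-Otsu) thresholding of a CT slice into tissue
    classes; returns a mask of the highest-intensity class as the
    candidate mass region."""
    thresholds = threshold_multiotsu(ct_slice, classes=classes)
    regions = np.digitize(ct_slice, bins=thresholds)
    return regions == classes - 1
```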

Keywords: lung cancer detection, image segmentation, lung computed tomography (CT) images, medical image processing

Procedia PDF Downloads 99
3605 Structural, Optical and Electrical Thin-Film Characterization Using Graphite-Bioepoxy Composite Materials

Authors: Anika Zafiah M. Rus, Nur Munirah Abdullah, M. F. L. Abdullah

Abstract:

The fabrication and characterization of graphite-bioepoxy composite films are described. Free-standing thin films ~0.1 mm thick are prepared using simple solution mixing, with a mass proportion of 7/3 (bioepoxy/graphite), and drop casting at room temperature. Fourier transform infrared spectroscopy (FTIR) and ultraviolet-visible (UV-vis) spectrophotometry are performed to evaluate the changes in chemical structure and absorption spectra arising with increasing graphite weight loading (wt.%) in the biopolymer matrix. The morphological study shows homogeneous dispersion and strong particle bonding between the graphite and the bioepoxy, with a film conductivity of 10³ S/m, confirming the efficiency of the process.

Keywords: absorbance peak, biopolymer, graphite-bioepoxy composites, particle bonding

Procedia PDF Downloads 513
3604 A Pilot Study of Influences of Scan Speed on Image Quality for Digital Tomosynthesis

Authors: Li-Ting Huang, Yu-Hsiang Shen, Cing-Ciao Ke, Sheng-Pin Tseng, Fan-Pin Tseng, Yu-Ching Ni, Chia-Yu Lin

Abstract:

Chest radiography is the most common technique for the diagnosis and follow-up of pulmonary diseases. However, lesions superimposed on normal structures are difficult to detect in chest radiography. Chest tomosynthesis is a relatively new technique for obtaining 3D section images from a set of low-dose projections acquired over a limited angular range. However, tomosynthesis has some limitations: patients have to be able to hold their breath firmly for 10 seconds. A digital tomosynthesis system with an advanced reconstruction algorithm and a high-stability motion mechanism was developed by our research group, and the system is expected to perform a bidirectional chest scan within 10 seconds. The purpose of this study is to determine the influence of scan speed on image quality for our digital tomosynthesis system. The major factors that lead to image blurring are the motion of the X-ray source and the motion of the patient. For the former, a chest phantom was imaged at three different scan speeds (6 cm/s, 8 cm/s, and 15 cm/s) to understand the influence of scan speed on image quality. For the latter, a normal Sprague-Dawley (SD) rat was imaged both alive and sacrificed to assess the impact of breathing motion on image quality. In both experiments, the profiles of the ROIs (regions of interest) and the CNRs (contrast-to-noise ratios) of the ROIs relative to normal tissue in the reconstructed images were examined to quantify image quality degradation. The preliminary results show no obvious degradation of image quality with increasing scan speed, possibly due to the advanced hardware and software designs of the system. This implies that the proposed system achieves a higher speed (15 cm/s) than the commercial tomosynthesis system (12 cm/s), and a complete chest scan within 10 seconds is therefore expected.
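
The CNR metric used to compare the scan speeds is simple to state; a sketch, using one common definition with the background standard deviation as the noise term.

```python
import numpy as np

def cnr(image, roi_mask, bg_mask):
    """Contrast-to-noise ratio of an ROI against a normal-tissue
    background region."""
    roi, bg = image[roi_mask], image[bg_mask]
    return abs(roi.mean() - bg.mean()) / bg.std()
```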

Keywords: chest radiography, digital tomosynthesis, image quality, scan speed

Procedia PDF Downloads 330
3603 Transport of Inertial Finite-Size Floating Plastic Pollution by Ocean Surface Waves

Authors: Ross Calvert, Colin Whittaker, Alison Raby, Alistair G. L. Borthwick, Ton S. van den Bremer

Abstract:

Large concentrations of plastic have polluted the seas in the last half century, with harmful effects on marine wildlife and potentially on human health. Plastic pollution will have lasting effects because plastic is expected to take hundreds or thousands of years to decay in the ocean. The question arises of how waves transport plastic in the ocean. The predominant wave-induced motion creates elliptical orbits. However, these orbits do not close, resulting in a drift known as Stokes drift. If a particle is infinitesimally small and of the same density as water, it behaves exactly as the water does, i.e., as a purely Lagrangian tracer. However, as the particle grows in size or changes density, it behaves differently: the particle then has its own inertia; the fluid exerts drag on the particle, because there is a relative velocity; and it rises or sinks depending on its density and whether it is at the free surface. Previously, plastic pollution has been treated as purely Lagrangian. However, the steepness of waves in the ocean is small, normally about α = k₀a = 0.1 (where k₀ is the wavenumber and a is the wave amplitude), which means the mean drift flows are of the order of ten times smaller than the oscillatory velocities (Stokes drift is proportional to the steepness squared, whilst the oscillatory velocities are proportional to the steepness). Thus, the particle equation of motion must include the forces of the full motion, oscillatory and mean flow, as well as a dynamic buoyancy term to account for the free surface, in order to determine whether inertia is important. Tracking the motion of a floating inertial particle under wave action requires the fluid velocities, which form the forcing, and the full equations of motion of the particle to be solved. We start with the equation of motion of a sphere in unsteady flow with viscous drag; terms are then added to better model floating plastic: a dynamic buoyancy term to model a particle floating on the free surface, quadratic drag for larger particles, and a slope-sliding term. Using perturbation methods to order the equation of motion into sequentially solvable parts allows a parametric equation for the transport of inertial finite-size floating particles to be derived. This parametric equation has been validated using numerical solutions of the equation of motion and laboratory flume experiments; it shows an increase in Stokes drift for larger, less dense particles. The difference between the particle transport equation and a purely Lagrangian tracer is illustrated using world maps of the induced transport. This parametric transport equation would allow ocean-scale numerical models to include the inertial effects of floating plastic when predicting or tracing the transport of pollutants.
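
For reference, the leading-order deep-water Stokes drift of a perfect Lagrangian tracer, the baseline that the inertial correction modifies, can be written down directly; the numeric example below (steepness 0.1, 100 m wavelength) is illustrative.

```python
import numpy as np

def stokes_drift(a, k, z, g=9.81):
    """Deep-water Stokes drift profile u_s(z) = (ka)^2 * c * exp(2kz),
    with phase speed c = sqrt(g/k) and z <= 0 measured downward from
    the mean surface."""
    c = np.sqrt(g / k)
    return (k * a) ** 2 * c * np.exp(2 * k * z)

# Example: steepness ka = 0.1, wavelength 100 m -> k = 2*pi/100
k = 2 * np.pi / 100
u_surface = stokes_drift(a=0.1 / k, k=k, z=0.0)   # drift at the surface
```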

Keywords: perturbation methods, plastic pollution transport, Stokes drift, wave flume experiments, wave-induced mean flow

Procedia PDF Downloads 120
3602 Normalized P-Laplacian: From Stochastic Game to Image Processing

Authors: Abderrahim Elmoataz

Abstract:

More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs, where each vertex represents measured data and each edge represents a relationship (connectivity, or a certain affinity or interaction) between two vertices. Processing and analyzing these types of data is a major challenge for both the image and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools initially developed on usual Euclidean spaces, which have proven efficient for many inverse problems and applications on usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorial and graph theory. In recent years, there has been increasing interest in investigating one of the major mathematical tools for signal and image analysis, partial differential equation (PDE) and variational methods, on graphs. The normalized p-Laplacian operator has recently been introduced to model a stochastic game called tug-of-war with noise. Part of the interest of this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator, and the traditional Laplacian, which have been used extensively to model and solve problems in image processing. The purpose of this paper is to introduce and study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of p-harmonious functions, introduced as a discrete approximation for both the infinity Laplacian and p-Laplacian equations. Finally, we propose to use these operators as a framework for solving many inverse problems in image processing.
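
The p-harmonious mean underlying the tug-of-war-with-noise interpretation translates directly to a graph iteration; a sketch, where alpha and beta (alpha + beta = 1, tied to p) and the dictionary graph representation are assumptions.

```python
import numpy as np

def p_harmonious_step(u, neighbors, alpha, beta, boundary):
    """One value-iteration sweep of the p-harmonious mean on a graph:
        u(x) = alpha/2 * (max_y u(y) + min_y u(y)) + beta * mean_y u(y),
    where y ranges over the neighbors of x and alpha + beta = 1.
    Dirichlet (boundary) nodes stay fixed."""
    new = u.copy()
    for x, nbrs in neighbors.items():
        if x in boundary:
            continue
        vals = np.array([u[y] for y in nbrs])
        new[x] = 0.5 * alpha * (vals.max() + vals.min()) + beta * vals.mean()
    return new
```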

Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems

Procedia PDF Downloads 511
3601 The Brand Value of Cosmetics in the View of Customers in Thailand

Authors: Mananya Meenakorn

Abstract:

The purpose of this research is to study the relationship between customer perception and the brand value of cosmetics in the view of customers in Thailand. This is quantitative research using a questionnaire survey. Data were collected from female cosmetics consumers residing in Bangkok, aged between 25 and 55 years. The sample size of 400 was determined using the Taro Yamane technique. The study found that the image of the Shiseido brand as always accompanying new product innovation rated at a high level, with an average of 3.812; the perception that Shiseido uses innovation to produce its products followed, at 3.792; and the perception of Shiseido as a luxury brand averaged 3.707. For the Lancôme brand, the luxury image rated at a high level, with an average of 4.170, while seductive glamour was rated moderate, with an average of 3.822.

Keywords: brand image, international fashion dress, values, working women

Procedia PDF Downloads 219
3600 Bi-Component Particle Segregation Studies in a Spiral Concentrator Using Experimental and CFD Techniques

Authors: Prudhvinath Reddy Ankireddy, Narasimha Mangadoddy

Abstract:

Spiral concentrators are commonly used in various industries, including mineral and coal processing, to separate materials efficiently based on their density and size. In these concentrators, a mixture of solid particles and fluid (usually water) is introduced as feed at the top of a spiral channel. As the mixture flows down the spiral, centrifugal and gravitational forces act on the particles, causing them to stratify based on their density and size. Spiral flows exhibit complex fluid dynamics, and the process involves interactions between multiple phases and components; understanding the behavior of these phases within the spiral concentrator is crucial for achieving efficient separation. In this work, an experimental bi-component particle interaction study is conducted using magnetite (higher density) and silica (lower density) processed in the spiral concentrator in different proportions. The observed separation reveals that denser particles accumulate toward the inner region of the spiral trough, while a significant concentration of lighter particles is found close to the outer edge. The fifth turn of the spiral trough is partitioned into five zones to achieve a comprehensive distribution analysis of bi-component particle segregation. Samples are gathered from these individual streams using an in-house sample collector, and subsequent analysis is conducted to assess component segregation. Along the trough, there was a decline in the concentration of coarser particles, accompanied by an increase in the concentration of lighter particles. The segregation pattern indicates that the heavier coarse component accumulates in the inner zone, whereas the lighter fine component collects in the outer zone; the middle zone consists primarily of heavier fine particles and lighter coarse particles. The zone-wise results reveal that a significant fraction of the segregation occurs in the inner and middle zones, while finer magnetite and silica particles predominantly accumulate in the outer zones, which show the smallest fraction of segregation. Additionally, numerical simulations are carried out using a computational fluid dynamics (CFD) model based on the volume of fluid (VOF) approach, incorporating the RSM turbulence model. The discrete phase model (DPM) is employed for particle tracking, to understand the segregation of magnetite and silica along the spiral trough.

Keywords: spiral concentrator, bi-component particle segregation, computational fluid dynamics, discrete phase model

Procedia PDF Downloads 65